Case study

Fund Homecare Analytics

A cross-platform marketing analytics dashboard built to centralize performance insights and support better fundraising decisions.

Next.js · GA4 · APIs · Analytics · Product Thinking
Note: The source code is not publicly available because it’s owned by the organization. This case study focuses on decisions, structure, and outcomes.

Overview

Fund Homecare needed a clearer view of what was working across social channels and donation activity. Reporting was fragmented across platforms and hard to compare week to week. The goal was to unify key metrics into a single, calm interface that supports quick decisions: what to post, when to post, and which campaigns actually convert.

The problem

  • Metrics were split across platforms with inconsistent naming and definitions.
  • Reporting was manual and time-consuming, making it hard to maintain.
  • Campaign decisions were harder to justify without consistent comparisons.
  • It wasn’t obvious how engagement related to donations and outcomes.

My role

I served as Software Project Manager leading a team of four. I drove requirements gathering, defined what “success” meant for stakeholders, prioritized scope, and supported implementation across the frontend and data layer.

Responsibilities

  • Translate stakeholder questions into measurable KPIs.
  • Define dashboard sections and information hierarchy.
  • Coordinate tasks, timelines, and review cycles across the team.
  • Ensure the output stayed usable, not “feature-heavy.”

Solution

We designed a single dashboard that makes performance readable at a glance, with consistent metric definitions and a structure that supports weekly decision-making.

Unify the data story

Centralize campaign performance views so leadership doesn’t have to cross-reference multiple platforms to answer simple questions.
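Since the source code isn't public, here is a minimal sketch of the idea: each platform's raw payload is mapped through a small adapter into one shared metric shape, so the dashboard never deals with platform-specific field names. All names and fields below are illustrative, not the project's actual schema.

```typescript
// One shared shape for weekly campaign performance, regardless of source.
// Field names are hypothetical examples, not the real data model.
interface MetricPoint {
  platform: "instagram" | "facebook" | "ga4";
  campaign: string;
  weekStart: string;    // ISO date the reporting week begins on
  impressions: number;
  engagements: number;
  donations: number;    // completed donations attributed to this week
}

// A per-platform adapter maps that platform's raw row into MetricPoint.
// Downstream code (charts, tables, comparisons) only sees the shared shape.
function fromGa4(row: {
  campaignName: string;
  week: string;
  views: number;
  events: number;
  conversions: number;
}): MetricPoint {
  return {
    platform: "ga4",
    campaign: row.campaignName,
    weekStart: row.week,
    impressions: row.views,
    engagements: row.events,
    donations: row.conversions,
  };
}
```

The payoff of this adapter pattern is that adding a new platform means writing one new mapping function, not touching every dashboard view.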

Make metrics comparable

Use consistent naming, time ranges, and definitions so week-over-week trends are reliable and not misleading.
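As a hedged illustration of what "comparable" means in practice: every data point is first snapped to the same week boundary (a Monday start here, purely as an example), and week-over-week change is only computed against a real baseline. These helpers are a sketch of the principle, not the project's actual code.

```typescript
// Normalize any date to the Monday that starts its week (UTC), so data
// from every platform lands on identical week boundaries before comparison.
function weekStart(date: Date): string {
  const d = new Date(Date.UTC(date.getUTCFullYear(), date.getUTCMonth(), date.getUTCDate()));
  const day = d.getUTCDay();              // 0 = Sunday … 6 = Saturday
  const back = day === 0 ? 6 : day - 1;   // days to step back to Monday
  d.setUTCDate(d.getUTCDate() - back);
  return d.toISOString().slice(0, 10);
}

// Week-over-week change as a percentage. With no baseline, return null
// ("n/a" in the UI) rather than a misleading number.
function weekOverWeek(current: number, previous: number): number | null {
  if (previous === 0) return null;
  return ((current - previous) / previous) * 100;
}
```

Returning `null` instead of `Infinity` or `0` for a missing baseline is the kind of small definitional choice that keeps trend lines honest.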

Design for calm scanning

Prioritize readability: clear hierarchy, muted supporting text, and only the most useful charts and KPIs.

Outcomes

  • Reduced the effort required to prepare recurring performance updates.
  • Improved clarity around what content and timing performed best.
  • Created a shared “source of truth” for stakeholders and the team.

What I learned

The biggest lesson was that analytics products only work when the metrics match the decisions people actually need to make. Clean UI matters, but clarity in definitions and stakeholder alignment matters more. I also learned how quickly scope can grow in “dashboard” projects, and how important it is to protect usability.