Digital Marketing · 6 min read · April 4, 2025

How to Make Data-Driven Marketing Decisions Without a Data Science Degree

Data doesn't make decisions — people with frameworks do. Pierre Subeh's guide to building a data culture where marketing decisions are faster, smarter, and more defensible without drowning in dashboards.

Tags: Data-Driven Marketing · Analytics · Marketing Strategy · Digital Marketing

Pierre Subeh

Forbes 30 Under 30 · CEO, X Network · TEDx Speaker

The Data-Driven Fallacy

Everyone says they're data-driven. Most of the time, it means one of three things:

1. They have access to dashboards and check them periodically

2. They make decisions and then find data to support what they already wanted to do

3. They're genuinely overwhelmed by data and default to gut feeling anyway

None of these is actually data-driven decision-making. Data-driven marketing means using data to answer specific questions that inform specific decisions — and having frameworks for which data answers which questions.

I've built marketing programs for brands ranging from Apple Music and Häagen-Dazs down to small local businesses. Across that range, the teams that made genuinely better decisions with data weren't the ones with the most analytics tools — they were the ones with the clearest frameworks for what they were measuring and why.

The Framework Before the Tools

The mistake most teams make is buying analytics tools and then trying to figure out what questions to ask. The sequence should be reversed.

Start with the decisions you actually need to make:

  • Which channels should we increase investment in?
  • Which campaign creative should we scale?
  • Is this campaign driving revenue or just traffic?
  • Why did retention drop last quarter?
Each decision has a specific data requirement. The channel investment question requires cost-per-acquisition by channel, properly attributed. The creative scaling question requires conversion rate and quality metrics by creative variation. The revenue question requires attribution modeling that connects marketing activity to closed revenue. The retention question requires cohort analysis, not just aggregate churn.

    When you identify the decisions first, you build dashboards that answer actual questions rather than dashboards that display whatever data is available.

    The Metrics That Actually Matter

    Revenue attribution. The north star for marketing data is the connection between marketing activity and revenue. This requires more than last-click attribution — it requires understanding the contribution of each touchpoint across a customer journey that may span weeks or months.

    For most businesses, this is the hardest data problem, and imperfect attribution data that requires judgment is better than ignoring attribution entirely.

    Customer acquisition cost (CAC) by channel. Total channel investment divided by customers acquired from that channel. The challenge: "acquired" needs to be defined carefully (first touch? primary channel? last touch?), and the timeframe matters (monthly CAC is volatile; 90-day rolling average is more stable).
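The CAC calculation above can be sketched in a few lines. This is a minimal illustration, not a production pipeline: the channel names, spend figures, and acquisition counts are invented, and it assumes each acquisition has already been attributed to a single primary channel.

```python
# Sketch: CAC by channel over a 90-day rolling window.
# All channels, dates, and amounts below are illustrative.
from collections import defaultdict
from datetime import date, timedelta

# (channel, day, spend) and (channel, day, customers_acquired) records
spend = [
    ("paid_search", date(2025, 1, 10), 4_000.0),
    ("paid_search", date(2025, 2, 15), 5_500.0),
    ("paid_social", date(2025, 1, 20), 3_000.0),
    ("paid_social", date(2025, 3, 1), 2_500.0),
]
acquisitions = [
    ("paid_search", date(2025, 1, 12), 18),
    ("paid_search", date(2025, 2, 18), 22),
    ("paid_social", date(2025, 1, 25), 10),
    ("paid_social", date(2025, 3, 3), 8),
]

def rolling_cac(as_of, window_days=90):
    """Total spend / customers acquired per channel within the window."""
    start = as_of - timedelta(days=window_days)
    cost = defaultdict(float)
    customers = defaultdict(int)
    for channel, day, amount in spend:
        if start <= day <= as_of:
            cost[channel] += amount
    for channel, day, count in acquisitions:
        if start <= day <= as_of:
            customers[channel] += count
    return {ch: round(cost[ch] / customers[ch], 2)
            for ch in cost if customers[ch]}

print(rolling_cac(date(2025, 3, 15)))
# → {'paid_search': 237.5, 'paid_social': 305.56}
```

The rolling window is what dampens the month-to-month volatility the text warns about; a single bad month moves the 90-day figure far less than a monthly snapshot would.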

    Lifetime value (LTV) or revenue per customer. CAC only makes sense in relation to LTV. A $200 CAC is catastrophic if average customer value is $250. It's outstanding if average customer value is $2,000. LTV:CAC ratio (typically looking for 3:1 or better) is a more useful metric than either in isolation.
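The two scenarios in that paragraph make the point numerically. A tiny sketch, using the article's own $200 CAC and the common 3:1 health threshold:

```python
# Sketch: the same $200 CAC judged by LTV:CAC ratio rather than alone.
def ltv_cac_ratio(ltv, cac):
    """LTV divided by CAC; the text suggests 3:1 or better as healthy."""
    return ltv / cac

for ltv in (250, 2_000):
    ratio = ltv_cac_ratio(ltv, 200)
    verdict = "healthy (>= 3:1)" if ratio >= 3 else "unsustainable"
    print(f"CAC $200, LTV ${ltv}: {ratio:.2f}:1 -> {verdict}")
# CAC $200, LTV $250: 1.25:1 -> unsustainable
# CAC $200, LTV $2000: 10.00:1 -> healthy (>= 3:1)
```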

    Conversion rate by funnel stage. Where do users drop out of the purchase journey? The stage with the highest exit rate contains the highest-leverage optimization opportunity. Aggregate site conversion rate is too blunt — funnel stage conversion tells you where the problem is.
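Finding the highest-leverage stage is a simple scan over stage-to-stage advance rates. The stage names and counts here are hypothetical, but the mechanic is exactly what the paragraph describes:

```python
# Sketch: conversion rate by funnel stage; the stage with the worst
# advance rate marks the highest-leverage optimization opportunity.
# Stage names and visitor counts are illustrative.
funnel = [
    ("visit", 10_000),
    ("product_page", 4_200),
    ("add_to_cart", 900),
    ("checkout", 700),
    ("purchase", 560),
]

worst_stage, worst_rate = None, 1.0
for (stage, entered), (_, advanced) in zip(funnel, funnel[1:]):
    rate = advanced / entered
    print(f"{stage} -> next: {rate:.1%}")
    if rate < worst_rate:
        worst_stage, worst_rate = stage, rate

print(f"Highest-leverage stage: {worst_stage} ({worst_rate:.1%} advance)")
```

Note how the aggregate conversion rate (560 / 10,000 = 5.6%) hides that the product page is where most of the loss happens.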

    Engagement depth, not just engagement. Click rate, open rate, and follower counts are surface metrics. Time on site, pages per session, return visit rate, and content completion rate indicate whether engagement is substantive or superficial.

    The Questions Data Can't Answer

    Understanding what data can't tell you is as important as knowing what it can.

    Data can't tell you why. Analytics show you what happened; they can't explain the mechanism. Conversion rate dropped 30% — the data tells you that happened, not why. Understanding why requires qualitative research: user interviews, session recordings, exit surveys, customer conversations.

    Data can't tell you what to do next. It tells you what happened and, with careful analysis, what correlations exist. The interpretation of what to do next always involves human judgment about priorities, constraints, and strategy. Teams that "let the data decide" are often avoiding accountability for strategic decisions.

    Data in short windows lies. A landing page test run for three days doesn't have statistical significance. A channel that performed poorly during a holiday period looks bad in isolation. An account that grew quickly and then plateaued looks like a declining trend if you measure only the tail end. The timeframe of your analysis dramatically affects the conclusions you can draw.
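The three-day-test problem can be checked with a standard two-proportion z-test, sketched here in pure stdlib Python. The visitor and conversion counts are invented; the point is that an apparent 25% lift on a few days of traffic often fails to clear the p < 0.05 bar.

```python
# Sketch: two-proportion z-test for an A/B landing-page test.
# Counts below are illustrative.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) for rates conv/n."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Three days of traffic: a 4% vs. 5% split looks like a 25% lift,
# but the sample is too small to call it.
z, p = two_proportion_z(40, 1_000, 50, 1_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p well above 0.05 -> not significant
```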

    Standard attribution models under-credit brand. Last-click attribution, which Google Ads and most analytics tools default to, gives 100% credit for the conversion to the final touchpoint before purchase. A brand awareness campaign that ran six weeks before the conversion event gets zero credit. This systematically undercounts the ROI of brand investment and overcounts the ROI of retargeting and branded search.
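The brand-undercrediting effect is easy to see side by side. A toy journey (touchpoints and revenue invented) compared under last-click and simple linear attribution:

```python
# Sketch: last-click vs. linear attribution for one customer journey.
# The brand-awareness touch gets zero credit under last-click.
journey = ["brand_awareness", "organic_search", "retargeting", "branded_search"]
revenue = 1_200.0

def last_click(touches, rev):
    """All revenue credited to the final touchpoint."""
    credit = {t: 0.0 for t in touches}
    credit[touches[-1]] = rev
    return credit

def linear(touches, rev):
    """Revenue split equally across every touchpoint."""
    share = rev / len(touches)
    return {t: share for t in touches}

print("last-click:", last_click(journey, revenue))
print("linear:    ", linear(journey, revenue))
```

Linear attribution is itself a crude model, but even this crude alternative shows the directional bias: last-click moves every dollar of credit to retargeting and branded search.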

    Building a Reporting Cadence That Actually Gets Used

    The reports teams actually use are simpler than the reports analytics platforms make it easy to create. A practical reporting cadence:

    Weekly performance pulse (15 minutes). Core KPIs vs. prior period and vs. target: traffic, leads, revenue, CAC by primary channel. Purpose: catch anomalies early, identify what's working or broken.

    Monthly analysis (1-2 hours). Deeper dive into conversion funnel performance, channel contribution, campaign-level ROI, trend analysis. Purpose: assess what changed and why, inform the next month's prioritization.

    Quarterly strategic review. Attribution analysis, LTV cohort analysis, year-over-year comparison, competitive context. Purpose: validate or challenge strategic assumptions, resource allocation decisions.

    The discipline: only track metrics that inform decisions. If a metric appears in the dashboard but never influences a decision, it's noise. Remove it.

    The Judgment That Sits on Top of Data

    The best data-driven marketers I've worked with share a characteristic: they're skeptical of data that confirms what they expected, and curious about data that surprises them.

Confirmation bias in data interpretation is as common as confirmation bias in any other domain. The team that built a campaign tends to read its performance data looking for evidence that it worked. The more useful posture is looking for evidence that the underlying hypothesis was wrong — because that's where the real learning is.

    Data tells you what happened. Understanding why it happened, deciding what it means, and choosing what to do next are human judgment calls that data informs but doesn't replace. The goal of data-driven marketing isn't to replace judgment — it's to make judgment faster, more reliable, and more defensible.

    Key Takeaways

  • Start with decisions, not data: identify the decisions you need to make, then determine what data answers those questions
  • Revenue attribution is the north star metric — connection between marketing activity and revenue, however imperfectly measured
  • LTV:CAC ratio (typically 3:1 target) is more useful than CAC or LTV in isolation
  • Data explains what happened; understanding why requires qualitative research — session recordings, exit surveys, customer interviews
  • Short windows produce unreliable data — statistical significance requires adequate sample sizes and time horizons
  • Standard attribution models under-credit brand investment — last-click systematically overcounts performance channels and undercounts awareness
  • Weekly pulse → monthly analysis → quarterly strategic review is a reporting cadence that actually gets used
  • The goal is faster, more reliable human judgment — not replacement of judgment with data
