The Ecomm Analyst

Growing stores, one honest take at a time.

Asking Customers Where They Came From

There’s a class of marketing channels that pixels can’t see. Someone hears about you on a podcast. A friend texts them a link. They watch a YouTube review on a logged-out account, then come back a week later through a Google search and convert. By the time the order fires, all the attribution tool can see is “google / organic” and that’s the credit it gives.

Post-purchase surveys exist because the only person who actually knows where they came from is the customer.

The setup is simple. After checkout completes, before the thank-you page fully resolves, you show a one-question survey: “How did you hear about us?” with structured options (Instagram, Friend or family, Podcast, TikTok, Google, etc.) plus a free-text “Other” field. I run mine through ThoughtMetric, which keeps the survey response sitting next to the click and pixel data on the same customer record. That part matters more than I expected, and I’ll come back to it.
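To make the "same customer record" point concrete, here's a minimal sketch of what that data shape looks like. This is not ThoughtMetric's actual API, just a hypothetical illustration: the order, the click-path source the pixel saw, and the one-question answer all live on one record, with free text kept only when the customer picks "Other".

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical structured options; ThoughtMetric manages this for you in practice.
SURVEY_OPTIONS = ["Instagram", "Friend or family", "Podcast", "TikTok", "Google", "Other"]

@dataclass
class OrderRecord:
    order_id: str
    last_click_source: str            # what the pixel/UTM saw, e.g. "google / organic"
    survey_answer: Optional[str] = None
    survey_other_text: str = ""       # free text, only meaningful when answer is "Other"

def record_survey_response(order: OrderRecord, answer: str, other_text: str = "") -> OrderRecord:
    """Attach the post-purchase answer to the order it belongs to."""
    if answer not in SURVEY_OPTIONS:
        raise ValueError(f"unexpected option: {answer!r}")
    order.survey_answer = answer
    order.survey_other_text = other_text if answer == "Other" else ""
    return order
```

The point of keeping both fields on one record is that you can later compare what the customer said against what the pixel recorded, order by order, instead of reconciling two disconnected reports.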

Here’s why I read this data every week.

It catches the channels your pixel literally cannot see. Last quarter I watched podcast mentions become the third-largest source of new customers for a brand I work with. Not one of those people showed up in any attribution tool or on the channel-level dashboard; they all came in attributed to “direct” or “organic search,” because they Googled the brand name after hearing the spot. Without the survey, that channel doesn’t exist on paper, and the brand quietly stops sponsoring podcasts.

It validates or contradicts your click data. This is where having the survey response and the click-path on the same customer record actually matters. When ThoughtMetric tells me Meta drove 40% of new customer revenue and the survey shows 42% of customers saying they first heard about us on Instagram, I trust both numbers more. When the two disagree by a lot on a specific channel, that’s the flag to dig in. Usually the answer is that one is measuring “first heard about” and the other is measuring “last clicked,” and reconciling the two tells you something useful about how that customer journey actually unfolds.
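The comparison described above can be sketched in a few lines. This is my own illustration, not anything a tool ships: compute each channel's share of new-customer orders two ways (by survey answer and by last click), then flag channels where the two diverge by more than some threshold. It assumes channel names have already been normalized so "Instagram" means the same thing in both fields.

```python
from collections import Counter

def channel_shares(orders, key):
    """Share of orders per channel, by the given field accessor."""
    counts = Counter(key(o) for o in orders if key(o))
    total = sum(counts.values())
    return {ch: n / total for ch, n in counts.items()}

def flag_disagreements(orders, threshold=0.10):
    """Channels where survey share and click share diverge by more than threshold.

    Positive gap: customers name the channel more often than the pixel credits it
    (typical for "first heard about" channels like podcasts). Negative gap: the
    pixel over-credits it (typical for last-click catch-alls like Google).
    """
    survey = channel_shares(orders, lambda o: o["survey_answer"])
    clicks = channel_shares(orders, lambda o: o["last_click_source"])
    flags = {}
    for ch in set(survey) | set(clicks):
        gap = survey.get(ch, 0.0) - clicks.get(ch, 0.0)
        if abs(gap) > threshold:
            flags[ch] = round(gap, 3)
    return flags
```

A big positive gap on a channel is exactly the "first heard about" vs. "last clicked" mismatch described above, and it's the cue to dig into those customer journeys.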

It surfaces individual conversations. The free-text “Other” responses are where I find specific creators, specific Reddit threads, specific newsletters that are sending customers. You can’t optimize for what you can’t see, and pixel data will never tell you that one Reddit comment from eight months ago is still sending you two orders a week.
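When the volume of "Other" responses gets too big to skim line by line, a crude keyword tally is enough to surface the names that repeat. A minimal sketch, assuming the free-text answers have been exported as plain strings (the stopword list here is illustrative, not exhaustive):

```python
import re
from collections import Counter

def top_other_mentions(free_texts, n=10):
    """Crude keyword tally over free-text 'Other' answers to surface repeat sources."""
    stop = {"the", "and", "from", "heard", "about", "saw", "was", "you", "your", "it", "its"}
    counts = Counter()
    for text in free_texts:
        for word in re.findall(r"[a-z0-9']+", text.lower()):
            if word not in stop and len(word) > 2:
                counts[word] += 1
    return counts.most_common(n)
```

It won't catch everything, and it's no substitute for actually reading the responses, but a creator or subreddit name that shows up ten times in a week is hard to miss in the output.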

A few things I’ve learned not to do:

Don’t ask too many questions. The response rate craters past one or two. I’ve seen surveys with five fields where almost nobody finishes, and the data is biased toward the small subset of people willing to fill out forms.

Don’t trust any single response. Customers misremember. Customers click whatever’s at the top of the list. Customers who genuinely came from a friend will sometimes click “Google” because that’s how they technically reached the site. The signal is in the aggregate, not the individual answer.

Don’t ignore the “Other” field. The structured options give you scale, but the free text is where the surprises are. I’d rather skim 50 free-text responses on Monday morning than refresh the chart of structured options.

The best version of this is reading survey data alongside your attribution data, not instead of it. The pixel knows what people did. The survey knows what they remember doing. The truth lives somewhere between the two, and most operators only ever look at one side of it.



About

Six years in e-commerce. Three Shopify stores across different niches, one scaled past seven figures. I’ve tested hundreds of ad creatives, obsessed over email flows, and learned more from my failures than my wins.

Now I focus on conversion optimization, retention marketing, and the analytics behind it all. This blog is where I share what actually works, backed by real numbers. No fluff, no guru energy.