Building a TV Attribution Analytics Platform: A SaaS Use Case
The Scenario
In 2012, a media agency approached a digital marketing entrepreneur with a specific challenge: measure the effectiveness of a national TV media campaign. The founder — a veteran with more than 15 years of digital marketing experience — recognized the gap in the market and set out to build an analytics platform that could give advertisers and agencies next-day results on how their television advertisements were influencing website traffic and conversions.
The concept evolved from an in-house alpha prototype into a commercial Software-as-a-Service product: a TV attribution analytics platform designed to connect broadcast-side data with web-side behaviour.
By 2013, the organization had a working alpha but needed a development partner to move the product beyond its initial state. The constraints were demanding: an aggressive, fixed timeline tied to one of the firm's clients running a multi-million-dollar TV advertising campaign. The MVP had to be production-ready, enterprise-grade, and capable of handling real-world data volumes from day one.
The Approach
The platform's core job is deceptively difficult. TV advertising generates audience responses that are diffuse, delayed, and geographically distributed. Tying a spike in website visits back to a specific broadcast — on a specific network, at a specific time, in a specific region — requires both reliable data pipelines and attribution logic sophisticated enough to handle ambiguity at scale.
Three primary workstreams shaped the build.
1. Gathering TV Data
The most complex component was aggregating, attributing, and displaying television data spanning networks, programs, local broadcast and cable systems, air times, and geographic markets across the United States.
A combination of external API integrations and a custom-built application was developed to achieve automated, continuous tracking, archiving, and reporting of TV source data. Handling the variety of data formats and update cadences across these sources required a flexible ingestion layer that could be extended as the platform scaled.
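The case study does not describe the ingestion layer's internals, so the following is a hypothetical sketch of the adapter pattern such a layer might use: each TV data source implements a common interface that normalizes its records, so new sources can be added without touching the pipeline core. All class and field names here are illustrative, not taken from the original platform.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from datetime import datetime


@dataclass
class SpotRecord:
    """Normalized record for one TV ad airing (hypothetical schema)."""
    network: str
    program: str
    air_time_utc: datetime
    market: str  # e.g. a DMA or region identifier


class TVSourceAdapter(ABC):
    """Each broadcast data source implements this interface so the
    pipeline can ingest every source through the same entry point."""

    @abstractmethod
    def fetch(self) -> list[SpotRecord]:
        ...


class CSVFeedAdapter(TVSourceAdapter):
    """Example adapter for a source that publishes row-oriented logs."""

    def __init__(self, rows: list[dict]):
        self.rows = rows

    def fetch(self) -> list[SpotRecord]:
        # Map this source's field names onto the normalized schema.
        return [
            SpotRecord(
                network=r["network"],
                program=r["program"],
                air_time_utc=datetime.fromisoformat(r["air_time"]),
                market=r["market"],
            )
            for r in self.rows
        ]


def ingest(adapters: list[TVSourceAdapter]) -> list[SpotRecord]:
    """Merge records from all registered sources into one stream."""
    records: list[SpotRecord] = []
    for adapter in adapters:
        records.extend(adapter.fetch())
    return records
```

The benefit of this shape is extensibility: a source with a bespoke API, a scraped schedule grid, or a flat-file drop each becomes one more adapter class, which matches the article's point about a flexible ingestion layer that could be extended as the platform scaled.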
2. Attribution
Piwik — an open-source web analytics platform — was chosen as the foundation for data collection operations. Rather than build a collection stack from scratch, the team extended Piwik with custom plugins, producing a stable, high-performing, and scalable solution to the attribution challenges specific to TV measurement.
A key element was enriching the collected data with contextual layers — notably, time zone information covering the various U.S. time zones — so that broadcast times could be accurately matched against web visit timestamps regardless of where the viewer was located.
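The essence of that enrichment step can be sketched in a few lines: interpret each broadcast's local air time in its market's time zone, then convert everything to UTC so broadcast and visit timestamps compare on one axis. The market-to-zone mapping below is a hypothetical illustration, not the platform's actual data model.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical mapping from a broadcast market to its IANA time zone.
MARKET_TZ = {
    "new-york": "America/New_York",
    "chicago": "America/Chicago",
    "denver": "America/Denver",
    "los-angeles": "America/Los_Angeles",
}


def to_utc(local_ts: datetime, market: str) -> datetime:
    """Interpret a naive local broadcast timestamp in its market's
    time zone, then convert it to UTC for comparison with web visits."""
    tz = ZoneInfo(MARKET_TZ[market])
    return local_ts.replace(tzinfo=tz).astimezone(timezone.utc)


# An 8 p.m. airing in New York and a 5 p.m. airing in Los Angeles
# are the same instant — a typical prime-time simulcast.
east = to_utc(datetime(2014, 1, 15, 20, 0), "new-york")
west = to_utc(datetime(2014, 1, 15, 17, 0), "los-angeles")
assert east == west
```

Using IANA zones rather than fixed UTC offsets also handles daylight saving transitions, which is exactly the kind of edge case that makes time zone handling non-trivial for a platform spanning all U.S. markets.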
The platform incorporated the firm's own dual attribution algorithms to match TV source data with website behavioural data, identifying the relevant connections between broadcast events and downstream user actions. It also integrated a proprietary cookie-less tracking technology to capture web visits that would otherwise be invisible to standard cookie-based analytics — an important consideration for measuring audiences who block cookies or arrive via connected TV devices.
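The dual attribution algorithms themselves are proprietary and not described in the source, but the general family of techniques they belong to can be illustrated with a minimal, assumed sketch: count visits in a short window after each airing and compare against an equal-length baseline window before it. Window length and the lift formula here are placeholders, not the firm's actual logic.

```python
from datetime import datetime, timedelta


def visits_in_window(spot_time_utc: datetime,
                     visit_times_utc: list[datetime],
                     window_minutes: int = 10) -> list[datetime]:
    """Return web visits inside an attribution window after a broadcast
    (a simplified stand-in for the proprietary matching algorithms)."""
    end = spot_time_utc + timedelta(minutes=window_minutes)
    return [t for t in visit_times_utc if spot_time_utc <= t < end]


def lift(spot_time_utc: datetime,
         visit_times_utc: list[datetime],
         window_minutes: int = 10) -> int:
    """Estimate incremental visits by subtracting a pre-airing baseline
    window of the same length from the post-airing response window."""
    start = spot_time_utc - timedelta(minutes=window_minutes)
    baseline = [t for t in visit_times_utc if start <= t < spot_time_utc]
    response = visits_in_window(spot_time_utc, visit_times_utc, window_minutes)
    return len(response) - len(baseline)
```

A production system must go well beyond this: overlapping airings on multiple networks, delayed viewing, and regional variation all introduce the ambiguity the article describes, which is why the matching logic, not the counting, carries the analytical value.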
3. UX/UI Design and Data Display
The design process began with a clear principle: the end user should not need any prior web analytics knowledge to operate the platform. This ruled out the dense, configuration-heavy interfaces typical of enterprise analytics tools.
The UX/UI work focused on presenting tables, graphs, metrics, and reports in a visually accessible way — one that could function as a standalone product without requiring technical support. Simplicity served both the product goal and the go-to-market strategy, since the platform was intended for advertisers and agencies whose core competency is media buying, not analytics infrastructure.
Implementation Considerations
The technical architecture reflects several deliberate choices:
- Piwik as the analytics foundation: Using an established open-source platform reduced time-to-market and provided a proven data collection layer. Custom plugins extended it to meet the specific demands of TV attribution at scale.
- Dual attribution algorithms: The proprietary matching logic was integrated directly into the platform, enabling the system to correlate TV broadcast events with web visit data in a way that accounts for time-shifting, geographic distribution, and channel complexity.
- Cookie-less tracking: Proprietary tracking technology was embedded to supplement standard analytics, broadening coverage of the addressable audience.
- Time zone normalization: U.S. time zone data was built into the attribution pipeline, ensuring that broadcast timestamps and web visit timestamps were always compared on a consistent basis.
- External API + custom application hybrid: TV data ingestion combined third-party API feeds with a bespoke application to handle sources that lacked standardized programmatic access.
The platform processes hundreds of thousands of website visits per day — a volume that shaped decisions around scalability and plugin architecture throughout the build.
Timeline and Outcomes
| Milestone | Period |
|---|---|
| Alpha platform created in-house | 2012–2013 |
| Project commencement | Q3 2013 |
| MVP release and beta launch | Q1 2014 |
| Platform adjustments based on user feedback | Q1–Q2 2014 |
| Acquisition of high-profile clients | Q1 2014 onward |
| Series A investment round | August 2014 |
The MVP was delivered within the original fixed timeline and used by a paying client for a live multi-million-dollar TV campaign. Following launch, the platform was refined iteratively based on user feedback before expanding to a broader client base.
The resulting platform continuously collects, attributes, calculates, and aggregates data from TV sources, then surfaces it through a reporting interface accessible to non-technical users. The combination of next-day results and an approachable UX differentiated it from heavier enterprise measurement tools that required analyst intervention to extract insights.
Key Takeaways
For organizations building in the TV attribution space, this case illustrates several replicable patterns:
- Extend rather than rebuild: Leveraging an established analytics platform (Piwik in this case) as a foundation, and extending it with domain-specific plugins, is faster and less risky than building data collection infrastructure from scratch.
- Attribution logic is the moat: The platform's differentiation came not from the collection layer but from the proprietary algorithms that matched TV and web data. Investing in this layer — and integrating it tightly with the data pipeline — is where the analytical value lives.
- UX investment pays off for non-technical buyers: Simplicity in the interface was a strategic choice, not just a design preference. Platforms that require no prior analytics knowledge lower the barrier to adoption among media buyers and agency teams.
- Time zone handling is non-trivial: TV is inherently geographic and time-distributed. Any attribution platform operating across U.S. markets needs robust time zone normalization baked into its data model, not bolted on after the fact.
- Cookie-less tracking extends coverage: For TV-to-web attribution specifically, relying solely on cookie-based analytics underestimates audience response. Supplementary tracking methods improve the accuracy of conversion attribution.