Tracking Plan as a Product Contract: Why Analytics Starts Before Build
An explanation of why a tracking plan and data dictionary should be developed before the Build phase: to ensure that changes are measurable and to avoid after-the-fact arguments.
"We built the feature, metrics later" — this is one of the most expensive mistakes a product team can make. If analytics and instrumentation are added "after the fact," the Evaluate phase turns into guesswork, and the entire learning cycle becomes a fiction.
In the PTOS methodology, the Tracking Plan is not a technical task for an analyst or developer. It is a product contract that is created before the Build phase begins and ensures that any change will be measurable.
Why is "Tracking Later" a Mortal Sin for a Product?
- You won't be able to conduct an honest Evaluate. You simply won't have the data to answer the main question: "Did behavior change?"
- You will make decisions blindly. Without data, any "debrief" turns into an argument of opinions, where the loudest voice (HiPPO) wins.
- It creates technical debt. Adding analytics retrospectively is always more complex, more expensive, and prone to more errors.
What is a Tracking Plan?
A Tracking Plan is a living document that describes what user actions we track, how we name them, and most importantly, what business question each event helps answer. It is a "single source of truth" for the product team, engineers, and analysts.
As Amplitude notes, leading product teams formalize the tracking plan as a "living document" that defines what, why, and from where data is tracked, and create a taxonomy (events, properties, conventions).
Key Components of a Tracking Plan ("Data Dictionary")
A good tracking plan should answer three key questions:
- Are users using the feature?
- Where do users get stuck?
- What broke or worsened?
For this, a "dictionary" of 10–20 key events is compiled for each new feature. Each event in this dictionary must have a "passport":
- User Action: What exactly does the user do in the interface?
  - Example: Clicks the "Save Report" button.
- Event Name: How do we name this action in our analytics system? The name must be unambiguous and follow a consistent convention (e.g., `Object_Action`).
  - Example: `report_saved`.
- Properties: What additional context is important for our analysis? This allows for data segmentation.
  - Example: `report_type: 'standard'`, `user_segment: 'enterprise'`, `previous_screen: 'editor'`.
- Trigger: At what exact moment is the event sent?
  - Example: Immediately after a successful server response for saving.
- Business Question: What question does this event help answer? This is the most important point, linking technical implementation with product meaning.
  - Example: "How often do users from different segments save different types of reports?"
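To make the "passport" concrete, here is a minimal sketch of one tracking-plan entry as a plain Python structure, plus a small check for the `Object_Action` naming convention. The event and field names come from the examples above; the structure itself is an illustrative assumption, not a prescribed schema or any particular analytics tool's format.

```python
import re

# Illustrative sketch: one event "passport" from the tracking plan,
# keyed by event name. Field names mirror the list above.
TRACKING_PLAN = {
    "report_saved": {
        "user_action": "Clicks the 'Save Report' button",
        "trigger": "Immediately after a successful server response for saving",
        "properties": {
            "report_type": str,      # e.g. 'standard'
            "user_segment": str,     # e.g. 'enterprise'
            "previous_screen": str,  # e.g. 'editor'
        },
        "business_question": (
            "How often do users from different segments "
            "save different types of reports?"
        ),
    },
}

def validate_event_name(name: str) -> bool:
    """Check the Object_Action convention: lowercase words joined by underscores."""
    return re.fullmatch(r"[a-z]+(_[a-z]+)+", name) is not None

# Every event name in the plan should pass the convention check.
assert all(validate_event_name(name) for name in TRACKING_PLAN)
```

Keeping the plan in a machine-readable form like this makes it possible to lint event names and required properties in CI, rather than enforcing the convention by code review alone.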
Tracking Plan as Part of the Definition of Done for Build
In PTOS, the Build phase is considered complete not when "code is written," but when several conditions are met, and one of the key ones is readiness for measurement.
DoD for Build:
- Tracking Plan/event dictionary is ready.
- Events are implemented in the code.
- Data is actually coming in on staging or test environments (not just "should be coming in").
Only when these conditions are met can the team move to the Launch phase.
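The third condition, "data is actually coming in," can itself be checked automatically. Below is a minimal sketch, assuming a list of event payloads pulled from a staging environment; the payload shape and the `check_tracking_readiness` helper are hypothetical and not part of any specific analytics SDK.

```python
# Expected events and their required properties, taken from the tracking plan.
EXPECTED_EVENTS = {
    "report_saved": {"report_type", "user_segment", "previous_screen"},
}

def check_tracking_readiness(staging_events: list[dict]) -> list[str]:
    """Return a list of problems; an empty list means ready for Launch."""
    problems = []
    seen_names = {event["name"] for event in staging_events}
    for name, required_props in EXPECTED_EVENTS.items():
        if name not in seen_names:
            problems.append(f"event '{name}' never arrived on staging")
            continue
        for event in staging_events:
            if event["name"] != name:
                continue
            # dict.keys() supports set operations, so this finds missing props.
            missing = required_props - event.get("properties", {}).keys()
            if missing:
                problems.append(f"'{name}' is missing properties: {sorted(missing)}")
    return problems

# Example: one well-formed event observed on staging passes the check.
sample = [{
    "name": "report_saved",
    "properties": {
        "report_type": "standard",
        "user_segment": "enterprise",
        "previous_screen": "editor",
    },
}]
assert check_tracking_readiness(sample) == []
```

A check like this turns "should be coming in" into a pass/fail gate that can run against staging before every launch.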
Conclusion
Stop treating analytics as something that can be "bolted on later." A Tracking Plan is not a task for an analyst; it is a product artifact for which the product manager is responsible.
By creating it before development begins, you transform analytics from a tool for "post-mortem analysis" into a powerful mechanism for continuous learning and making strong, data-driven decisions.