# A/B Test Results
## Viewing results

Go to A/B Tests and click on an active or completed test. The detail page shows per-variant metrics.
## Metrics per variant

| Metric | Description |
|---|---|
| Clicks | Total link clicks attributed to this variant |
| Installs | App installs attributed to this variant |
| Store redirects | Users sent to the App Store or Play Store |
| Custom events | Count of the goal event (if using a custom event goal) |
Each metric is shown alongside the conversion rate (e.g. installs / clicks) so you can compare variants directly.
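The comparison works the same way whichever goal metric you use. A minimal sketch of the calculation, using hypothetical numbers (the variant names and counts here are illustrative, not real product data):

```python
# Hypothetical per-variant metrics as they would appear on the test detail page.
variants = {
    "A": {"clicks": 1200, "installs": 84},
    "B": {"clicks": 1180, "installs": 106},
}

for name, metrics in variants.items():
    # Conversion rate = installs / clicks, shown next to each metric.
    rate = metrics["installs"] / metrics["clicks"]
    print(f"Variant {name}: {rate:.1%}")
```

Here variant B converts at about 9.0% versus 7.0% for A, which is the kind of side-by-side comparison the detail page gives you directly.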
## When to declare a winner

There is no automatic winner declaration. You decide when you have enough data. As a general guideline:
- Minimum sample size: Each variant should have at least a few hundred events (for the goal metric) before drawing conclusions. Small samples lead to unreliable results.
- Run duration: Run the test for at least 7 days to account for day-of-week variation in traffic.
- Clear difference: If one variant has a noticeably higher conversion rate and the difference has held consistently over several days, it is likely a real improvement.
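One way to put numbers behind "a clear difference" is a two-proportion z-test. This is a standard statistical check, not a feature of the product, and the counts below are hypothetical:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test on conversion counts.

    conv_*: number of goal events (e.g. installs); n_*: number of trials (e.g. clicks).
    Returns the z statistic; |z| > 1.96 roughly corresponds to p < 0.05.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 84/1200 vs 106/1180 installs per click (hypothetical numbers).
z = two_proportion_z(84, 1200, 106, 1180)
print(f"z = {z:.2f}")  # → z = 1.78, suggestive but below the 1.96 threshold
```

This illustrates why the guidelines above matter: a difference that looks large can still fall short of significance at a few hundred events per variant.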
## Declaring a winner

When you are confident in the results:
1. Click Declare Winner on the test detail page.
2. Select the winning variant.
3. Confirm.
This action:
- Marks the test as completed.
- Stops traffic splitting. All visitors now see the winning variant.
- For landing page tests: the winning landing page is automatically assigned to the route.
- For banner, message, and bio page tests: you may want to manually deactivate the losing variants.
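The exact splitting mechanism is internal to the product, but a common approach (an assumption on my part) is deterministic hashing of a visitor identifier against variant weights, which also makes "stop splitting" just a change of weights. A sketch with hypothetical names:

```python
import hashlib

def assign_variant(visitor_id, weights):
    """Deterministically assign a visitor to a variant.

    weights: {variant_name: traffic_share}, shares summing to 1.0.
    The same visitor always lands in the same variant while the test runs.
    """
    digest = int(hashlib.sha256(visitor_id.encode()).hexdigest(), 16)
    bucket = (digest % 10_000) / 10_000  # uniform value in [0, 1)
    cumulative = 0.0
    for name, share in weights.items():
        cumulative += share
        if bucket < cumulative:
            return name
    return name  # guard against floating-point rounding at the top edge

# During the test: a 50/50 split.
print(assign_variant("visitor-42", {"A": 0.5, "B": 0.5}))
# After declaring B the winner: every visitor sees B.
print(assign_variant("visitor-42", {"B": 1.0}))
```

Using a stable hash rather than a random draw is what keeps an individual visitor's experience consistent for the duration of the test.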
## After declaring a winner

The test remains in your history for reference. You can view the final metrics at any time.
If you want to run a follow-up test (e.g. testing the winner against a new challenger), create a new test. Previous tests do not interfere with new ones.
## Best practices

- Test one thing at a time: Change a single element between variants (headline, image, CTA text, layout). If you change multiple things, you will not know which one caused the difference.
- Do not peek and stop: Checking results daily is fine, but do not end the test the moment one variant looks better. Wait for a sufficient sample.
- Document your tests: Keep notes on what you changed and why. Over time, you will build a library of what works for your audience.
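The "few hundred events" and "do not peek" advice can be made concrete with a standard sample-size estimate. This is my sketch using the usual normal-approximation formula (5% significance, 80% power), not a product feature:

```python
import math

def sample_size_per_variant(base_rate, min_lift, alpha_z=1.96, power_z=0.84):
    """Rough visitors needed per variant to detect a relative lift.

    base_rate: current conversion rate (e.g. 0.07 for 7%)
    min_lift:  smallest relative improvement worth detecting (e.g. 0.20 for +20%)
    """
    p1 = base_rate
    p2 = base_rate * (1 + min_lift)
    p_bar = (p1 + p2) / 2
    numerator = (alpha_z * math.sqrt(2 * p_bar * (1 - p_bar))
                 + power_z * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Detecting a +20% relative lift on a 7% base rate takes several
# thousand clicks per variant, far more than a "few hundred events".
print(sample_size_per_variant(0.07, 0.20))
```

Deciding on this number before the test starts, and not stopping until you reach it, is the practical defense against the peek-and-stop trap described above.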