To keep the functionality simple and easy to understand, we have decided to enable tests only for the homepage of your funnels.
Once this feature is more established, we will be able to develop A/B testing further, extending it to other pages and adding more features for your analysis.
Please feel free to use the software’s chat function to send us feedback on how we can improve the A/B testing feature.
What does significance mean?
In the context of A/B testing, statistical significance refers to the likelihood that the difference between the original version and the variant in your experiment is not due to chance or random error. For example, if you test with a significance level of 95%, you can be 95% confident that the observed difference reflects a real effect rather than random variation.
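As a rough illustration (not part of the software itself), the following Python sketch shows how such a significance check could be calculated by hand using a standard two-proportion z-test. The visitor and conversion numbers are made up for the example.

```python
# Minimal sketch: two-proportion z-test for an A/B conversion experiment.
# All numbers below are hypothetical illustration values.
from math import sqrt, erf

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Return the z-score and two-sided p-value for the difference
    between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return z, p_value

# Example: original converts 120 of 2,400 visitors, variant 160 of 2,500.
z, p = z_test_two_proportions(120, 2400, 160, 2500)
print(f"z = {z:.2f}, p = {p:.4f}")
print("significant at the 95% level" if p < 0.05 else "not significant at the 95% level")
```

If the resulting p-value is below 0.05, the difference is considered significant at the 95% level.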
To prevent premature conclusions, it is particularly important to ensure that testing is done correctly. A good illustration of how a test can be misinterpreted is the so-called A/A test: the original and the variant are identical, yet the measured conversion rates can still show an apparently significant difference, caused purely by random variation.
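To make this concrete, here is a hypothetical simulation (again, independent of the software) of many A/A tests. Even though both versions share the same true conversion rate, roughly 5% of the runs will look "significant" at the 95% level purely by chance.

```python
# Hypothetical A/A simulation: both "versions" share the same true rate,
# so any significant-looking difference is a false positive.
from math import sqrt, erf
import random

def p_value(conv_a, n_a, conv_b, n_b):
    # Two-sided p-value of a two-proportion z-test (same math as above).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = abs(conv_a / n_a - conv_b / n_b) / se
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

random.seed(1)
TRUE_RATE, VISITORS, RUNS = 0.05, 2000, 1000   # made-up illustration values
false_positives = 0
for _ in range(RUNS):
    conv_a = sum(random.random() < TRUE_RATE for _ in range(VISITORS))
    conv_b = sum(random.random() < TRUE_RATE for _ in range(VISITORS))
    if p_value(conv_a, VISITORS, conv_b, VISITORS) < 0.05:
        false_positives += 1

print(f"{false_positives / RUNS:.1%} of A/A runs looked 'significant' purely by chance")
```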
Factors other than the page variations can have a significant influence on your results, for example the visitors' persona types, traffic sources, the test duration, the total number of visitors, and many more.
If you would like to get more insights on this topic, we suggest reading the following highly informative article by Tomi Mester.
Find additional information on the following topics in our Help Center: