Confidence 'Level' vs Confidence 'Interval'


What is a Confidence 'Level'?

A confidence level is a statistical concept that quantifies the degree of certainty that a given prediction or hypothesis is correct. It's typically represented as a percentage, reflecting how strongly we believe our prediction is likely to be accurate.


For instance, a 95% confidence level indicates that one is 95% certain that the stated prediction or hypothesis is accurate. In other words, if the same situation were repeated 100 times, we would expect our prediction to be correct 95 times out of 100. Therefore, a confidence level provides a measure of trust in the validity of a statistical estimate or prediction.



What is a Confidence 'Interval'?

A confidence interval, in statistics, addresses the degree of uncertainty associated with an estimated value, typically an average, derived from a limited sample of a larger population. The concept underlines that an estimate computed from a sample, though inherently subject to uncertainty, can still provide meaningful insight into the overall population.


Consider the scenario of a marketing study evaluating the average time spent by consumers on a new website. After tracking a select group of users' browsing sessions, the duration of these visits can be compiled and an average calculated. However, as this calculated average only relies on a subset of users, it is accompanied by a degree of uncertainty.


In this context, a confidence interval becomes pertinent. It frames a range within which the actual average browsing time of all the site's users is likely to fall. In essence, a confidence interval tells you how confident you can be that the observed results reflect what you would have seen if it were feasible to collect data from every individual in the population. A broader confidence interval implies more uncertainty, while a narrower interval indicates a more precise estimate.
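To make this concrete, here is a minimal sketch of how a 95% confidence interval for an average browsing time could be computed from a sample, using only Python's standard library. The sample data, the sample size of 50 users, and the z-score of 1.96 are illustrative assumptions, not Mida's actual calculation.

```python
# Minimal sketch: 95% confidence interval for an average browsing time.
import math
import random
import statistics

random.seed(42)

# Hypothetical sample: browsing time (in seconds) for 50 tracked users.
sample = [random.gauss(180, 45) for _ in range(50)]

n = len(sample)
mean = statistics.mean(sample)
sd = statistics.stdev(sample)          # sample standard deviation
se = sd / math.sqrt(n)                 # standard error of the mean

z = 1.96                               # z-score for a 95% confidence level
lower, upper = mean - z * se, mean + z * se

print(f"Sample mean: {mean:.1f}s")
print(f"95% confidence interval: ({lower:.1f}s, {upper:.1f}s)")
```

In this sketch, collecting a larger sample shrinks the standard error, which narrows the interval without changing the confidence level.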



How to interpret Confidence 'Level' and Confidence 'Interval' in a Mida report?

You can choose your Confidence 'Level' in Mida when setting up an experiment (learn more here). The confidence level you choose usually ties back to the error rate you are willing to accept, often called the statistical significance level, or alpha.


For instance, if we set an alpha of 0.05 (meaning we are okay with being wrong 5% of the time), then our confidence level would be 95% because 100% - 5% = 95%. 
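As a quick illustration of that arithmetic, expressed as proportions rather than percentages (the values below are just the example from this article):

```python
# Minimal sketch of the alpha / confidence-level relationship described above.
alpha = 0.05                      # accepted error rate (statistical significance)
confidence_level = 1 - alpha      # 0.95, i.e. a 95% confidence level
print(f"alpha = {alpha}  ->  confidence level = {confidence_level:.0%}")
```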


Based on the following test result, we can assert with 99.71% certainty that our conversion rate (CR) will fall within our predicted range (Confidence Interval).
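For context on where such a range can come from, below is a minimal sketch of a confidence interval for a conversion rate using a normal approximation. The visitor and conversion counts and the 95% confidence level are hypothetical assumptions; this is not a description of Mida's internal statistics engine.

```python
# Minimal sketch: confidence interval for a conversion rate (CR)
# via a normal approximation.
import math
from statistics import NormalDist

visitors, conversions = 10_000, 520    # hypothetical experiment data
cr = conversions / visitors            # observed conversion rate

confidence_level = 0.95
z = NormalDist().inv_cdf(1 - (1 - confidence_level) / 2)   # ~1.96 for 95%

se = math.sqrt(cr * (1 - cr) / visitors)                   # standard error
lower, upper = cr - z * se, cr + z * se

print(f"CR: {cr:.2%}, {confidence_level:.0%} CI: ({lower:.2%}, {upper:.2%})")
```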




Summary: Confidence level vs. confidence interval

A confidence level expresses how certain you can be that an estimate or prediction is correct, and is chosen by deciding on an acceptable error rate (alpha): a 5% alpha gives a 95% confidence level. A confidence interval is the range of values within which the true population value, such as an average or a conversion rate, is likely to fall at that confidence level; the narrower the interval, the more precise the estimate.