## What is a Maximum Likelihood Estimate?

Aliases: *MLE*

A maximum likelihood estimate (MLE) is an estimate of the point at which the likelihood function reaches its maximum value. In other words, it is the parameter value with the highest plausibility given a statistical model and the observed data x₀. In A/B testing we are most often interested in differences between means (proportions being a special case), in which case the MLE is simply the observed difference of means.
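As a minimal sketch (with hypothetical conversion rates and sample sizes), one can verify numerically that for Bernoulli data the likelihood of each group is maximized at the sample proportion, so the MLE of the difference is the observed difference of means:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical A/B test data: 1 = conversion, 0 = no conversion
a = rng.binomial(1, 0.10, size=5000)
b = rng.binomial(1, 0.12, size=5000)

def log_likelihood(p, x):
    # Bernoulli log-likelihood: sum of log(p) for successes, log(1-p) for failures
    return np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))

# Maximize over a grid of candidate proportions
grid = np.linspace(0.001, 0.999, 999)
p_a = grid[np.argmax([log_likelihood(p, a) for p in grid])]
p_b = grid[np.argmax([log_likelihood(p, b) for p in grid])]

# The grid maximizer coincides with the sample mean (up to grid resolution),
# so the MLE of the difference equals the observed difference of means
print(p_a, a.mean())
print(p_b - p_a, b.mean() - a.mean())
```

The grid search is used purely for illustration; in this case the maximum can also be found in closed form by setting the derivative of the log-likelihood to zero, which yields the sample mean directly.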

The sample mean is no longer the MLE when it is not the only random variable of interest. For example, in sequential testing the stopping time is a second random variable that must be taken into account, resulting in a different maximum likelihood estimate.

In practice one often maximizes the natural logarithm of the likelihood function (the log-likelihood) instead, as it is more convenient to work with: it is easier to differentiate, and it avoids the numerical underflow that comes from multiplying many small probabilities. Because the logarithm is strictly increasing, the log-likelihood reaches its maximum at the same point as the likelihood.
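A small sketch of this equivalence, using simulated Bernoulli data (the sample size and true proportion are arbitrary choices): the raw likelihood and the log-likelihood peak at the same grid point, which is the sample mean.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.binomial(1, 0.3, size=200)  # simulated Bernoulli observations

grid = np.linspace(0.001, 0.999, 999)

def likelihood(p):
    # Product of per-observation probabilities (prone to underflow for large n)
    return np.prod(np.where(x == 1, p, 1 - p))

def log_likelihood(p):
    # Sum of per-observation log-probabilities (numerically stable)
    return np.sum(np.log(np.where(x == 1, p, 1 - p)))

lik = np.array([likelihood(p) for p in grid])
loglik = np.array([log_likelihood(p) for p in grid])

# Both curves are maximized at the same point: the sample mean
print(grid[np.argmax(lik)], grid[np.argmax(loglik)], x.mean())
```

With much larger samples the raw likelihood would underflow to zero in floating point, while the log-likelihood remains well-behaved, which is one practical reason the log form is preferred.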

Under standard regularity conditions the MLE has many of the desirable properties of an estimator: consistency, asymptotic efficiency, and dependence on the data only through a sufficient statistic when one exists. Note, however, that MLEs are not in general unbiased in finite samples.