The information contained in ''n'' independent [[Bernoulli trial]]s, each with probability of success ''θ'', may be calculated as follows. In what follows, ''a'' denotes the number of successes, ''b'' the number of failures, and ''n'' = ''a'' + ''b'' the total number of trials.
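A reconstruction of the derivation (written here from the step-by-step description that follows, using the binomial likelihood of the sufficient statistic ''a''; notation as in the surrounding text) is:

:<math>
\begin{align}
\mathcal{I}(\theta)
&= -\operatorname{E}\left[\frac{\partial^2}{\partial\theta^2} \log f(X;\theta)\right] \\
&= -\operatorname{E}\left[\frac{\partial^2}{\partial\theta^2} \log\left(\theta^a (1-\theta)^b \binom{n}{a}\right)\right] \\
&= -\operatorname{E}\left[\frac{\partial^2}{\partial\theta^2} \left(a\log\theta + b\log(1-\theta)\right)\right] \\
&= -\operatorname{E}\left[\frac{\partial}{\partial\theta} \left(\frac{a}{\theta} - \frac{b}{1-\theta}\right)\right] \\
&= \operatorname{E}\left[\frac{a}{\theta^2} + \frac{b}{(1-\theta)^2}\right] \\
&= \frac{n\theta}{\theta^2} + \frac{n(1-\theta)}{(1-\theta)^2} \\
&= \frac{n}{\theta(1-\theta)}
\end{align}
</math>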
The first line is just the definition of the information; the second uses the fact that the information contained in a sufficient statistic is the same as that of the sample itself; the third expands the [[logarithm|log]] term and drops a constant; the fourth and fifth differentiate with respect to ''θ''; the sixth replaces ''a'' and ''b'' with their expectations; and the seventh is algebraic manipulation.
The overall result, viz. <math>\mathcal{I}(\theta) = \frac{n}{\theta(1-\theta)},</math> is in accord with what one would expect, since it is the reciprocal of the variance of the sum of the ''n'' Bernoulli random variables.
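As an illustrative check (not part of the article's derivation), the formula can be verified numerically: the Fisher information equals the variance of the score, which can be estimated by Monte Carlo simulation. The sketch below, using Python/NumPy with assumed values ''n'' = 100 and ''θ'' = 0.3, compares that estimate with ''n''/(''θ''(1 − ''θ'')):

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta = 100, 0.3  # assumed example values, not from the article

# Closed-form Fisher information for n Bernoulli trials
closed_form = n / (theta * (1 - theta))

# Monte Carlo estimate: the information equals the variance of the score,
# where the score is d/dθ log L = a/θ − b/(1−θ)
a = rng.binomial(n, theta, size=200_000)  # number of successes per replication
b = n - a                                 # number of failures
score = a / theta - b / (1 - theta)
mc_estimate = score.var()

print(closed_form)   # 476.19...
print(mc_estimate)   # should be close to the closed form
```

The variance-of-the-score route is used here because it avoids numerical differentiation; a second-derivative (observed information) estimate would give the same limit.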
When the parameter ''θ'' is vector-valued, the information is a positive-definite matrix, which defines a metric on the parameter space; consequently [[differential geometry]] applies to this topic. See [[Fisher information metric]].
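As a sketch of the vector-valued case (an illustration, not from the article), the normal distribution ''N''(''μ'', ''σ''²) with parameter vector (''μ'', ''σ'') has Fisher information matrix diag(1/''σ''², 2/''σ''²). The Python/NumPy snippet below estimates this matrix as the covariance of the score vector and checks that it is positive definite; the specific values of ''μ'' and ''σ'' are assumptions for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 2.0, 1.5  # assumed example parameters

# Score vector (gradient of the log-density with respect to (mu, sigma))
# for the normal distribution N(mu, sigma^2)
x = rng.normal(mu, sigma, size=500_000)
score_mu = (x - mu) / sigma**2
score_sigma = ((x - mu)**2 - sigma**2) / sigma**3

# Fisher information matrix = covariance matrix of the score vector
info = np.cov(np.stack([score_mu, score_sigma]))

# Known closed form for this parametrization: diag(1/sigma^2, 2/sigma^2)
expected = np.diag([1 / sigma**2, 2 / sigma**2])

# Positive definiteness: all eigenvalues strictly positive
eigvals = np.linalg.eigvalsh(info)
print(info)
print(eigvals)
```

A positive-definite information matrix is exactly what allows it to serve as a Riemannian metric on the parameter space, as the [[Fisher information metric]] article develops.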