Informant (statistics)

In statistics, the score is the partial derivative, with respect to some parameter $\theta$, of the logarithm (commonly the natural logarithm) of the likelihood function. If the observation is $X$, then the score $V$ can be found through the chain rule:

$$V = \frac{\partial}{\partial\theta} \log L(\theta; X) = \frac{1}{L(\theta; X)} \frac{\partial L(\theta; X)}{\partial\theta} .$$

Note that $V$ is a function of $\theta$ and the observation $X$. The score $V$ is a sufficient statistic for $\theta$.
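
For instance, the chain-rule form of the score can be checked symbolically. The following is a minimal sketch in Python with sympy, assuming (purely for illustration) a single Poisson observation with rate parameter $\lambda$:

    import sympy as sp

    # Illustrative assumption: one Poisson observation x with rate parameter lam.
    x, lam = sp.symbols('x lambda', positive=True)
    L = lam**x * sp.exp(-lam) / sp.factorial(x)     # likelihood L(lambda; x)

    # Score via the chain rule: (1/L) * dL/dlambda ...
    score_chain = sp.simplify(sp.diff(L, lam) / L)
    # ... agrees with differentiating the log-likelihood directly.
    score_direct = sp.simplify(sp.diff(sp.log(L), lam))

    print(score_chain)                               # equivalent to x/lambda - 1
    print(sp.simplify(score_chain - score_direct))   # 0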

The expected value of $V$, written $E(V)$, is zero. To see this, rewrite the definition of expectation, using the fact that the probability density function is just $L(\theta; x)$, which is conventionally denoted by $f(x; \theta)$ (in which the dependence on $x$ is more explicit). With this change of notation, and writing $f'_\theta(x;\theta)$ for the partial derivative with respect to $\theta$,

$$E(V) = \int_{-\infty}^{+\infty} f(x;\theta)\,\frac{\partial}{\partial\theta}\log f(x;\theta)\, dx = \int_{-\infty}^{+\infty} \frac{f'_\theta(x;\theta)}{f(x;\theta)}\, f(x;\theta)\, dx = \int_{-\infty}^{+\infty} f'_\theta(x;\theta)\, dx ,$$

where the integral runs over the whole of the probability space of $X$ and a prime denotes partial differentiation with respect to $\theta$. If certain differentiability conditions are met, the integral may be rewritten as

$$\int_{-\infty}^{+\infty} f'_\theta(x;\theta)\, dx = \frac{\partial}{\partial\theta} \int_{-\infty}^{+\infty} f(x;\theta)\, dx = \frac{\partial}{\partial\theta}\, 1 = 0 .$$
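
This interchange of differentiation and integration can be checked in a concrete case. The sketch below (Python with sympy) assumes, purely as an illustration, an exponential density $f(x;\theta) = \theta e^{-\theta x}$ on $[0,\infty)$:

    import sympy as sp

    x, theta = sp.symbols('x theta', positive=True)
    f = theta * sp.exp(-theta * x)                # exponential density f(x; theta) on [0, oo)

    score = sp.diff(sp.log(f), theta)             # 1/theta - x
    EV = sp.integrate(score * f, (x, 0, sp.oo))   # E(V) = integral of f'_theta
    print(sp.simplify(EV))                        # 0

    # Equivalently, differentiating under the integral sign:
    print(sp.diff(sp.integrate(f, (x, 0, sp.oo)), theta))   # derivative of 1, i.e. 0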

It is worth restating the above result in words: the expected value of the score is zero. Thus, if one were to repeatedly sample from some distribution and repeatedly calculate the score at the true parameter value, then the mean value of the scores would tend to zero as the number of repeated samples approached infinity.
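
That statement can be illustrated with a small simulation. The sketch below (Python with numpy) assumes, only for the example, a normal distribution with known standard deviation, for which the score with respect to the mean $\mu$ is $(x-\mu)/\sigma^2$:

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma = 2.0, 1.5                        # true parameter values (example assumptions)
    x = rng.normal(mu, sigma, size=1_000_000)   # repeated samples from the distribution

    # Score of each observation with respect to mu, evaluated at the true mu:
    # d/dmu log f(x; mu) = (x - mu) / sigma**2
    scores = (x - mu) / sigma**2

    print(scores.mean())   # close to 0; it tends to 0 as the number of samples grows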

The variance of the score is known as the Fisher information and is written $\mathcal{I}(\theta)$. Because the expectation of the score is zero, this may be written as

$$\mathcal{I}(\theta) = E(V^{2}) = E\!\left[\left(\frac{\partial}{\partial\theta}\log L(\theta; X)\right)^{2}\right].$$

Note that the Fisher information, as defined above, is not a function of a particular observation, as the random variable $X$ has been averaged out. This concept of information is useful when comparing two methods of observation of some random process.
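
As a sketch of how this is used, the Fisher information can be estimated as the sample variance of the score over repeated observations and compared with the analytic value. The example below (Python with numpy) assumes a Poisson distribution, for which the score of one observation is $x/\lambda - 1$ and the Fisher information is $1/\lambda$:

    import numpy as np

    rng = np.random.default_rng(1)
    lam = 4.0                              # true rate parameter (example assumption)
    x = rng.poisson(lam, size=1_000_000)

    # Score of a single Poisson observation with respect to lambda:
    # d/dlambda log f(x; lambda) = x/lambda - 1
    scores = x / lam - 1.0

    print(scores.var())   # Monte Carlo estimate of the Fisher information
    print(1.0 / lam)      # analytic value 1/lambda = 0.25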

Example

Consider a Bernoulli process, with $A$ successes and $B$ failures; the probability of success is $\theta$.

Then the likelihood $L(\theta; A, B)$ is

$$L(\theta; A, B) = \frac{(A+B)!}{A!\,B!}\,\theta^{A}(1-\theta)^{B},$$

so the score $V$ is given by

$$V = \frac{\partial}{\partial\theta}\log L(\theta; A, B).$$

This is a standard calculus problem: $A$ and $B$ are treated as constants. Then

$$V = \frac{A}{\theta} - \frac{B}{1-\theta}.$$
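
This differentiation can be checked symbolically, for example with the following sketch in Python with sympy ($A$ and $B$ are kept as symbolic constants; the constant term of the log-likelihood is omitted because it does not depend on $\theta$):

    import sympy as sp

    theta, A, B = sp.symbols('theta A B', positive=True)
    logL = A * sp.log(theta) + B * sp.log(1 - theta)   # log-likelihood up to a constant

    V = sp.diff(logL, theta)
    print(V)                               # A/theta - B/(1 - theta)
    print(sp.solve(sp.Eq(V, 0), theta))    # [A/(A + B)], where the score vanishes
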
So if the score is zero, $\theta = A/(A+B)$. We can now verify that the expectation of the score is zero. Noting that the expectation of $A$ is $n\theta$ and the expectation of $B$ is $n(1-\theta)$, where $n = A + B$ is the number of trials, we can see that the expectation of $V$ is

$$E(V) = \frac{n\theta}{\theta} - \frac{n(1-\theta)}{1-\theta} = n - n = 0 .$$

We can also check the variance of $V$. We know that $A + B = n$ (so that $B = n - A$) and the variance of $A$ is $n\theta(1-\theta)$, so the variance of $V$ is

$$\operatorname{var}(V) = \operatorname{var}\!\left(\frac{A}{\theta} - \frac{n-A}{1-\theta}\right) = \left(\frac{1}{\theta} + \frac{1}{1-\theta}\right)^{2}\operatorname{var}(A) = \frac{n}{\theta(1-\theta)} ,$$

so the Fisher information in this example is $\mathcal{I}(\theta) = n/\bigl(\theta(1-\theta)\bigr)$.
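
Both properties can be confirmed numerically with a short simulation sketch (Python with numpy; the values of $n$ and $\theta$ below are assumed only for the example):

    import numpy as np

    rng = np.random.default_rng(2)
    n, theta = 50, 0.3                  # number of trials and true success probability (example values)

    # Repeat the Bernoulli process many times; A counts successes, B = n - A failures.
    A = rng.binomial(n, theta, size=1_000_000)
    B = n - A

    scores = A / theta - B / (1 - theta)

    print(scores.mean())                # close to 0
    print(scores.var())                 # close to n / (theta * (1 - theta))
    print(n / (theta * (1 - theta)))    # analytic value, about 238.1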