# Information geometry

In mathematics, and especially in statistical inference, information geometry is the study of probability and information by way of differential geometry. It reached maturity through the work of Shun'ichi Amari in the 1980s, whose book *Differential-Geometrical Methods in Statistics* remains the canonical reference.

Information geometry is based primarily on the Fisher information metric:

$g_{ij}=\int \frac{\partial \log p(x,\theta)}{\partial \theta_i} \frac{\partial \log p(x,\theta)}{\partial \theta_j} p(x,\theta)\, dx$
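The integral above can be evaluated directly for simple families. Below is a minimal numerical sketch for the one-parameter Bernoulli family (the integral becomes a sum over $x \in \{0, 1\}$); the function name and the finite-difference approach are illustrative choices, not part of any standard API. The known closed form for this family is $g(\theta) = 1/(\theta(1-\theta))$, which the numerical result should match.

```python
import numpy as np

def fisher_metric_bernoulli(theta, eps=1e-6):
    """Fisher information of the Bernoulli family p(x; theta),
    where p(1) = theta and p(0) = 1 - theta.

    g(theta) = sum_x (d log p / d theta)^2 * p(x; theta)
    """
    def logp(x, t):
        return np.log(t if x == 1 else 1 - t)

    g = 0.0
    for x in (0, 1):
        # score function d log p / d theta via central differences
        score = (logp(x, theta + eps) - logp(x, theta - eps)) / (2 * eps)
        g += score**2 * (theta if x == 1 else 1 - theta)
    return g

# Closed form for the Bernoulli family: g(theta) = 1 / (theta * (1 - theta))
print(fisher_metric_bernoulli(0.3))  # ≈ 1 / 0.21 ≈ 4.7619
```

For continuous families the sum becomes a quadrature or Monte Carlo estimate over $x$, but the structure is the same: average the squared score under $p(x, \theta)$.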

Substituting the information content $i(x,\theta) = -\log p(x,\theta)$ from information theory (the two sign changes cancel in the product), the formula becomes:

$g_{ij}=\int \frac{\partial i(x,\theta)}{\partial \theta_i} \frac{\partial i(x,\theta)}{\partial \theta_j} p(x,\theta)\, dx$

Intuitively, this says that the distance between two points on a statistical differential manifold is the amount of information between them, i.e. the informational difference between them.
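One way to make this intuition concrete: for nearby distributions, the Kullback–Leibler divergence is approximated by half the squared distance in the Fisher metric, $D_{KL}(p_\theta \| p_{\theta+\delta}) \approx \tfrac{1}{2} g(\theta)\,\delta^2$. The sketch below (again using the Bernoulli family as a hypothetical concrete example) checks this numerically.

```python
import numpy as np

def kl_bernoulli(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q)."""
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

theta, delta = 0.3, 1e-3
g = 1 / (theta * (1 - theta))          # Fisher metric for the Bernoulli family
kl = kl_bernoulli(theta, theta + delta)
quad = 0.5 * g * delta**2              # half the squared Fisher distance

print(kl, quad)  # the two values agree to leading order in delta
```

The agreement improves as $\delta \to 0$, which is exactly the statement that the Fisher metric is the second-order expansion of the KL divergence.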

Thus, if a point in information space represents the state of a system, then the trajectory of that point will, on average, be a random walk through information space; that is, it will diffuse according to Brownian motion.

With this in mind, the information space can be thought of as a fitness landscape, with a trajectory through this space being an "evolution". The Brownian motion of evolution trajectories thus represents the no-free-lunch phenomenon discussed by Stuart Kauffman.

An important concept in information geometry is the natural gradient. The concept and theory of the natural gradient suggest an adjustment to the energy function of a learning rule. This adjustment takes into account the curvature of the (prior) statistical differential manifold, by way of the Fisher information metric.

This concept has many important applications in blind signal separation, neural networks, artificial intelligence, and other engineering problems that deal with information. Experimental results have shown that application of the concept leads to substantial performance gains.

## References

• Shun'ichi Amari, *Differential-Geometrical Methods in Statistics*, Lecture Notes in Statistics, Springer-Verlag, Berlin, 1985.
• Shun'ichi Amari and Hiroshi Nagaoka, *Methods of Information Geometry*, Translations of Mathematical Monographs, vol. 191, American Mathematical Society, 2000.