
Information Geometry


John Baez

February 3, 2016

Information geometry is the study of 'stochastic manifolds', which are spaces where each point is a hypothesis about some state of affairs. This subject, usually considered a branch of statistics, has important applications to machine learning and somewhat unexpected connections to evolutionary biology. To learn this subject, I'm writing a series of articles on it. You can navigate forwards and back through these using the blue arrows. And by clicking the links that say "on Azimuth", you can see blog entries containing these articles. Those let you read comments about my articles—and also make comments or ask questions of your own!
  • Part 1 - the Fisher information metric from statistical mechanics.
  • Part 2 - connecting the statistical mechanics approach to the usual definition of the Fisher information metric.
  • Part 3 - the Fisher information metric on any manifold equipped with a map to the mixed states of some system.
  • Part 4 - the Fisher information metric as the real part of a complex-valued quantity whose imaginary part measures quantum uncertainty.
  • Part 5 - an example: the harmonic oscillator in a heat bath.
  • Part 6 - relative entropy.
  • Part 7 - the Fisher information metric as the matrix of second derivatives of relative entropy (checked numerically in the first sketch after this list).
  • Part 8 - information geometry and evolution: how natural selection resembles Bayesian inference, and how it's related to relative entropy.
  • Part 9 - information geometry and evolution: the replicator equation and the decline of entropy as a successful species takes over.
  • Part 10 - information geometry and evolution: how entropy changes under the replicator equation.
  • Part 11 - information geometry and evolution: the decline of relative information (simulated in the second sketch after this list).
  • Part 12 - information geometry and evolution: an introduction to evolutionary game theory.
  • Part 13 - information geometry and evolution: the decline of relative information as a population approaches an evolutionarily stable state.
  • Part 14 - how relative entropy changes in open Markov processes.
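
To make Parts 6 and 7 concrete, here is a minimal numerical sketch (Python with NumPy; the value p = 0.3 is just an arbitrary test point) on the simplest statistical manifold, the family of Bernoulli distributions. Differentiating the relative entropy S(p, q) twice in q and evaluating at q = p recovers the Fisher information 1/(p(1-p)):

```python
import numpy as np

def rel_entropy(p, q):
    """Relative entropy S(p, q) between two Bernoulli distributions with
    success probabilities p and q (the quantity discussed in Part 6)."""
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

def fisher_numeric(p, h=1e-4):
    """Fisher information at p, estimated by central differences as the
    second derivative of S(p, q) in q at q = p (the fact from Part 7)."""
    return (rel_entropy(p, p + h) - 2 * rel_entropy(p, p)
            + rel_entropy(p, p - h)) / h**2

p = 0.3                      # arbitrary test point in (0, 1)
print(fisher_numeric(p))     # ~ 4.7619
print(1 / (p * (1 - p)))     # exact Fisher information of the Bernoulli family
```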
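
In the same spirit, here is a small simulation of the phenomenon in Parts 9-13, with made-up fitness values: under the replicator equation dx_i/dt = x_i(f_i - <f>), the relative information I(q, x) of the eventually dominant state q with respect to the evolving population x declines steadily toward zero:

```python
import numpy as np

f = np.array([1.0, 2.0, 3.0])   # hypothetical constant fitnesses; species 3 is fittest
x = np.array([0.5, 0.3, 0.2])   # initial population fractions
q = np.array([0.0, 0.0, 1.0])   # the state the population approaches

def rel_info(q, x):
    """Relative information I(q, x) = sum_i q_i ln(q_i / x_i), with 0 ln 0 = 0."""
    m = q > 0
    return float(np.sum(q[m] * np.log(q[m] / x[m])))

dt = 0.01
for step in range(1001):
    if step % 200 == 0:
        print(f"t = {step * dt:5.2f}   x = {np.round(x, 3)}   I(q, x) = {rel_info(q, x):.4f}")
    x = x + dt * x * (f - x @ f)   # Euler step of the replicator equation
    x = x / x.sum()                # keep x on the probability simplex
```

In this toy case I(q, x) reduces to -ln x_3, so its decline simply tracks the growth of the winning species; Part 13 extends the statement to evolutionarily stable states in evolutionary game theory.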
The following papers are spinoffs of the above series of blog articles. You can also read blog articles summarizing these papers:
"You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage."

- John von Neumann, giving advice to Claude Shannon on what to name his discovery.

© 2016 John Baez
baez@math.removethis.ucr.andthis.edu


