Boltzmann Machine

A Boltzmann machine is a type of stochastic recurrent neural network in which each node makes binary (on/off) decisions probabilistically, influenced by a bias. Boltzmann machines can be stacked to build more sophisticated systems such as deep belief networks.
A Boltzmann machine is also known as a stochastic Hopfield network with hidden units.
Although the Boltzmann machine is named after the Austrian physicist Ludwig Boltzmann, who formulated the Boltzmann distribution in the 19th century, this type of network was actually developed by Geoffrey Hinton and Terry Sejnowski in the mid-1980s. It is closely related to the Hopfield network, introduced by John Hopfield in 1982, and relies on ideas from thermodynamics to drive the network toward desired states.
In the Boltzmann machine, the aim is to reach a “thermal equilibrium”, that is, to settle into a low-energy global configuration, where the temperature and energy of the system are not literal physical quantities but analogies borrowed from thermodynamics. In a process called simulated annealing, the Boltzmann machine is run while the temperature is gradually lowered, so that the initially noisy dynamics slowly give way to a stable, low-energy state. Boltzmann machines use stochastic binary units to sample from this equilibrium probability distribution, or in other words, to minimize energy.
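To make this concrete, the sketch below shows one way such dynamics might be simulated, assuming a toy fully connected network with symmetric weights W and biases b (the names, sizes, and cooling schedule are illustrative assumptions, not from the original text). Each unit switches on with a probability that depends on its energy gap and the current temperature, and the temperature is lowered over time as in simulated annealing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network: 6 binary units, symmetric weights, no self-connections.
n = 6
W = rng.normal(scale=0.5, size=(n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
b = rng.normal(scale=0.1, size=n)

def energy(s):
    # Global energy: E(s) = -1/2 * s^T W s - b^T s
    return -0.5 * s @ W @ s - b @ s

def gibbs_step(s, T):
    # Update each unit stochastically given the rest of the network.
    for i in range(n):
        # Energy gap between unit i being on and off.
        delta_e = W[i] @ s + b[i]
        p_on = 1.0 / (1.0 + np.exp(-delta_e / T))
        s[i] = 1.0 if rng.random() < p_on else 0.0
    return s

# Simulated annealing: start hot (very random), cool slowly so the
# network settles into a low-energy configuration.
s = rng.integers(0, 2, size=n).astype(float)
for T in np.geomspace(5.0, 0.1, num=200):
    s = gibbs_step(s, T)

print("final state:", s, "energy:", energy(s))
```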
Restricted Boltzmann machines are Boltzmann machines in which there are no intra-layer connections: units in the visible layer connect only to units in the hidden layer (and vice versa), so the network forms a bipartite graph.
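As a rough illustration of that restriction, the sketch below builds a toy RBM in which the only weights run between the visible and hidden layers; because there are no connections within a layer, each layer can be sampled in a single pass given the other. The layer sizes and variable names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy RBM: n_v visible units, n_h hidden units. Weights exist only
# *between* the two layers (a bipartite graph); no intra-layer links.
n_v, n_h = 4, 3
W = rng.normal(scale=0.1, size=(n_v, n_h))
b_v = np.zeros(n_v)   # visible biases
b_h = np.zeros(n_h)   # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_hidden(v):
    # Hidden units have no connections to each other, so they are
    # conditionally independent given the visible layer.
    p_h = sigmoid(v @ W + b_h)
    return (rng.random(n_h) < p_h).astype(float), p_h

def sample_visible(h):
    # Likewise, visible units are conditionally independent given
    # the hidden layer.
    p_v = sigmoid(W @ h + b_v)
    return (rng.random(n_v) < p_v).astype(float), p_v

v0 = np.array([1.0, 0.0, 1.0, 1.0])
h0, _ = sample_hidden(v0)
v1, _ = sample_visible(h0)
print("hidden sample:", h0)
print("reconstruction:", v1)
```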
