Gated Recurrent Unit (GRU)

A gated recurrent unit (GRU) is a gating mechanism used in recurrent neural networks, a class of models that process a sequence of inputs one step at a time and are suited to machine learning tasks that depend on memory, such as speech recognition. The gates regulate how much information carries over from one step of the sequence to the next, which helps mitigate the vanishing gradient problem, a common difficulty when training recurrent neural networks.


As a refinement of the general recurrent neural network structure, a gated recurrent unit has two gates: an update gate and a reset gate. Using these two vectors, the model controls the flow of information through the network: the reset gate decides how much of the previous state is used when forming a new candidate state, and the update gate decides how much of the previous state to keep versus replace. Like other recurrent architectures, networks built from gated recurrent units can retain information across time steps, which is why one of the simplest ways to describe them is as a "memory-centered" type of neural network. Feedforward networks, by contrast, carry no state between inputs, and plain recurrent networks, while stateful, struggle to preserve information over long sequences.
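To make the gating mechanism concrete, here is a minimal sketch of a single GRU step in Python with NumPy. The class name, weight names, and initialization scheme are illustrative assumptions rather than any particular library's API, and the exact gate convention varies slightly between formulations; the point is to show the reset gate shaping the candidate state and the update gate blending old and new state.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """A minimal, illustrative GRU cell (not a reference implementation)."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(hidden_size)
        shape = (hidden_size, hidden_size + input_size)
        # One weight matrix and bias per gate, acting on [h_prev, x].
        self.W_z = rng.uniform(-scale, scale, shape)  # update gate weights
        self.W_r = rng.uniform(-scale, scale, shape)  # reset gate weights
        self.W_h = rng.uniform(-scale, scale, shape)  # candidate-state weights
        self.b_z = np.zeros(hidden_size)
        self.b_r = np.zeros(hidden_size)
        self.b_h = np.zeros(hidden_size)

    def step(self, x, h_prev):
        concat = np.concatenate([h_prev, x])
        z = sigmoid(self.W_z @ concat + self.b_z)  # update gate
        r = sigmoid(self.W_r @ concat + self.b_r)  # reset gate
        # Reset gate scales how much of the old state feeds the candidate.
        h_cand = np.tanh(self.W_h @ np.concatenate([r * h_prev, x]) + self.b_h)
        # Update gate interpolates between the old state and the candidate.
        return (1.0 - z) * h_prev + z * h_cand
```

Running the cell over a toy sequence shows how the hidden state acts as the network's memory, accumulating information step by step:

```python
cell = GRUCell(input_size=4, hidden_size=8)
h = np.zeros(8)
for x in np.random.default_rng(1).normal(size=(10, 4)):  # ten random 4-dim inputs
    h = cell.step(x, h)  # h summarizes everything seen so far
```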
In addition to speech recognition, neural network models built on gated recurrent units are applied to research on the human genome, handwriting recognition, stock market analysis, and more. What these applications have in common is a reliance on the network's ability to retain information across a sequence.
