\section{L\+S\+TM$<$ Input\+Data\+Type, Output\+Data\+Type $>$ Class Template Reference}
\label{classmlpack_1_1ann_1_1LSTM}\index{L\+S\+T\+M$<$ Input\+Data\+Type, Output\+Data\+Type $>$@{L\+S\+T\+M$<$ Input\+Data\+Type, Output\+Data\+Type $>$}}


Implementation of the \doxyref{L\+S\+TM}{p.}{classmlpack_1_1ann_1_1LSTM} module class.  


\subsection*{Public Member Functions}
\begin{DoxyCompactItemize}
\item 
\textbf{ L\+S\+TM} ()
\begin{DoxyCompactList}\small\item\em Create the \doxyref{L\+S\+TM}{p.}{classmlpack_1_1ann_1_1LSTM} object. \end{DoxyCompactList}\item 
\textbf{ L\+S\+TM} (const size\+\_\+t in\+Size, const size\+\_\+t out\+Size, const size\+\_\+t rho=std\+::numeric\+\_\+limits$<$ size\+\_\+t $>$\+::max())
\begin{DoxyCompactList}\small\item\em Create the \doxyref{L\+S\+TM}{p.}{classmlpack_1_1ann_1_1LSTM} layer object using the specified parameters. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Input\+Type , typename Error\+Type , typename Gradient\+Type $>$ }\\void \textbf{ Backward} (const Input\+Type \&input, const Error\+Type \&gy, Gradient\+Type \&g)
\begin{DoxyCompactList}\small\item\em Ordinary feed-\/backward pass of a neural network, calculating the function f(x) by propagating x backward through f. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Delta} () const
\begin{DoxyCompactList}\small\item\em Get the delta. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Delta} ()
\begin{DoxyCompactList}\small\item\em Modify the delta. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Input\+Type , typename Output\+Type $>$ }\\void \textbf{ Forward} (const Input\+Type \&input, Output\+Type \&output)
\begin{DoxyCompactList}\small\item\em Ordinary feed-\/forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Input\+Type , typename Output\+Type $>$ }\\void \textbf{ Forward} (const Input\+Type \&input, Output\+Type \&output, Output\+Type \&cell\+State, bool use\+Cell\+State=false)
\begin{DoxyCompactList}\small\item\em Ordinary feed-\/forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Input\+Type , typename Error\+Type , typename Gradient\+Type $>$ }\\void \textbf{ Gradient} (const Input\+Type \&input, const Error\+Type \&error, Gradient\+Type \&gradient)
\item 
Output\+Data\+Type const  \& \textbf{ Gradient} () const
\begin{DoxyCompactList}\small\item\em Get the gradient. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Gradient} ()
\begin{DoxyCompactList}\small\item\em Modify the gradient. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Output\+Parameter} () const
\begin{DoxyCompactList}\small\item\em Get the output parameter. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Output\+Parameter} ()
\begin{DoxyCompactList}\small\item\em Modify the output parameter. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Parameters} () const
\begin{DoxyCompactList}\small\item\em Get the parameters. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Parameters} ()
\begin{DoxyCompactList}\small\item\em Modify the parameters. \end{DoxyCompactList}\item 
void \textbf{ Reset} ()
\item 
void \textbf{ Reset\+Cell} (const size\+\_\+t size)
\item 
size\+\_\+t \textbf{ Rho} () const
\begin{DoxyCompactList}\small\item\em Get the maximum number of steps to backpropagate through time (B\+P\+TT). \end{DoxyCompactList}\item 
size\+\_\+t \& \textbf{ Rho} ()
\begin{DoxyCompactList}\small\item\em Modify the maximum number of steps to backpropagate through time (B\+P\+TT). \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Archive $>$ }\\void \textbf{ serialize} (Archive \&ar, const unsigned int)
\begin{DoxyCompactList}\small\item\em Serialize the layer. \end{DoxyCompactList}\end{DoxyCompactItemize}


\subsection{Detailed Description}
\subsubsection*{template$<$typename Input\+Data\+Type = arma\+::mat, typename Output\+Data\+Type = arma\+::mat$>$\newline
class mlpack\+::ann\+::\+L\+S\+T\+M$<$ Input\+Data\+Type, Output\+Data\+Type $>$}

Implementation of the \doxyref{L\+S\+TM}{p.}{classmlpack_1_1ann_1_1LSTM} module class. 

The implementation corresponds to the following algorithm\+:

\begin{eqnarray} i &=& sigmoid(W_{xi} \cdot x + W_{hi} \cdot h + W_{ci} \cdot c + b_i) \\ f &=& sigmoid(W_{xf} \cdot x + W_{hf} \cdot h + W_{cf} \cdot c + b_f) \\ z &=& tanh(W_{xz} \cdot x + W_{hz} \cdot h + b_z) \\ c &=& f \odot c + i \odot z \\ o &=& sigmoid(W_{xo} \cdot x + W_{ho} \cdot h + W_{co} \cdot c + b_o) \\ h &=& o \odot tanh(c) \end{eqnarray}

Here each gate has its own input, recurrent, and peephole weights, with $\odot$ denoting the element-wise product.
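As a concrete illustration, one step of the cell update above can be sketched in plain C++ with {\ttfamily std::vector}. This is a minimal sketch, not the mlpack implementation: the weight and bias names are hypothetical, and the peephole terms ($W \cdot c$ inside the gates) are omitted for brevity.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// One LSTM cell step following the equations above, with plain
// std::vector.  All names (Wxi, Whi, ...) are illustrative, not
// mlpack API; peephole terms are omitted for brevity.
struct LstmCellSketch
{
  std::size_t inSize, outSize;
  // Flattened row-major weights: W[row * cols + col].
  std::vector<double> Wxi, Whi, bi;  // input gate
  std::vector<double> Wxf, Whf, bf;  // forget gate
  std::vector<double> Wxz, Whz, bz;  // block input
  std::vector<double> Wxo, Who, bo;  // output gate

  static double Sigmoid(double v) { return 1.0 / (1.0 + std::exp(-v)); }

  static double Dot(const std::vector<double>& w, std::size_t row,
                    const std::vector<double>& v)
  {
    double s = 0.0;
    for (std::size_t j = 0; j < v.size(); ++j)
      s += w[row * v.size() + j] * v[j];
    return s;
  }

  // Computes h from x and hPrev, updating the cell state c in place.
  void Step(const std::vector<double>& x, const std::vector<double>& hPrev,
            std::vector<double>& c, std::vector<double>& h) const
  {
    h.assign(outSize, 0.0);
    for (std::size_t k = 0; k < outSize; ++k)
    {
      const double i = Sigmoid(Dot(Wxi, k, x) + Dot(Whi, k, hPrev) + bi[k]);
      const double f = Sigmoid(Dot(Wxf, k, x) + Dot(Whf, k, hPrev) + bf[k]);
      const double z = std::tanh(Dot(Wxz, k, x) + Dot(Whz, k, hPrev) + bz[k]);
      c[k] = f * c[k] + i * z;                    // c = f (.) c + i (.) z
      const double o = Sigmoid(Dot(Wxo, k, x) + Dot(Who, k, hPrev) + bo[k]);
      h[k] = o * std::tanh(c[k]);                 // h = o (.) tanh(c)
    }
  }
};
```

With the input gate saturated open and the forget gate closed, the new cell state is just the block input, which makes the update easy to verify by hand.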

Note that if an \doxyref{L\+S\+TM}{p.}{classmlpack_1_1ann_1_1LSTM} layer is desired as the first layer of a neural network, an Identity\+Layer should be added to the network as the first layer, and then the \doxyref{L\+S\+TM}{p.}{classmlpack_1_1ann_1_1LSTM} layer should be added.

For more information, see the following.


\begin{DoxyCode}
@article\{Graves2013,
  author  = \{Alex Graves and Abdel\{-\}rahman Mohamed and Geoffrey E. Hinton\},
  title   = \{Speech Recognition with Deep Recurrent Neural Networks\},
  journal = \{CoRR\},
  year    = \{2013\},
  url     = \{http://arxiv.org/abs/1303.5778\},
\}
\end{DoxyCode}


\begin{DoxySeeAlso}{See also}
\doxyref{Fast\+L\+S\+TM}{p.}{classmlpack_1_1ann_1_1FastLSTM} for a faster \doxyref{L\+S\+TM}{p.}{classmlpack_1_1ann_1_1LSTM} version which combines the calculation of the input, forget, and output gates and the hidden state in a single step.
\end{DoxySeeAlso}

\begin{DoxyTemplParams}{Template Parameters}
{\em Input\+Data\+Type} & Type of the input data (arma\+::colvec, arma\+::mat, arma\+::sp\+\_\+mat or arma\+::cube). \\
\hline
{\em Output\+Data\+Type} & Type of the output data (arma\+::colvec, arma\+::mat, arma\+::sp\+\_\+mat or arma\+::cube). \\
\hline
\end{DoxyTemplParams}


Definition at line 67 of file layer\+\_\+types.\+hpp.



\subsection{Constructor \& Destructor Documentation}
\mbox{\label{classmlpack_1_1ann_1_1LSTM_a40761e9b624105b88d73d495a7814e54}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!L\+S\+TM@{L\+S\+TM}}
\index{L\+S\+TM@{L\+S\+TM}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{L\+S\+T\+M()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily \textbf{ L\+S\+TM} (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})}



Create the \doxyref{L\+S\+TM}{p.}{classmlpack_1_1ann_1_1LSTM} object. 

\mbox{\label{classmlpack_1_1ann_1_1LSTM_aa84d360c20446c2883e8a78993b90911}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!L\+S\+TM@{L\+S\+TM}}
\index{L\+S\+TM@{L\+S\+TM}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{L\+S\+T\+M()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily \textbf{ L\+S\+TM} (\begin{DoxyParamCaption}\item[{const size\+\_\+t}]{in\+Size,  }\item[{const size\+\_\+t}]{out\+Size,  }\item[{const size\+\_\+t}]{rho = {\ttfamily std\+:\+:numeric\+\_\+limits$<$~size\+\_\+t~$>$\+:\+:max()} }\end{DoxyParamCaption})}



Create the \doxyref{L\+S\+TM}{p.}{classmlpack_1_1ann_1_1LSTM} layer object using the specified parameters. 


\begin{DoxyParams}{Parameters}
{\em in\+Size} & The number of input units. \\
\hline
{\em out\+Size} & The number of output units. \\
\hline
{\em rho} & Maximum number of steps to backpropagate through time (B\+P\+TT). \\
\hline
\end{DoxyParams}
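The {\ttfamily rho} parameter bounds how many time steps the backward pass walks through (truncated B\+P\+TT). The sketch below illustrates the idea on a scalar R\+NN $h_t = tanh(w \cdot h_{t-1} + x_t)$ standing in for the L\+S\+TM; the function name and structure are illustrative, not mlpack API.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Truncated BPTT on a scalar RNN h_t = tanh(w * h_{t-1} + x_t),
// illustrating what rho bounds.  Illustrative only, not mlpack API.
double TruncatedGradient(const std::vector<double>& x, double w,
                         std::size_t rho)
{
  // Forward pass, recording every hidden state.
  std::vector<double> h(x.size() + 1, 0.0);
  for (std::size_t t = 0; t < x.size(); ++t)
    h[t + 1] = std::tanh(w * h[t] + x[t]);

  // Loss L = h_T; walk backward through at most rho steps.
  const std::size_t T = x.size();
  const std::size_t steps = std::min<std::size_t>(rho, T);
  double grad = 0.0, delta = 1.0;  // delta = dL/dh_t
  for (std::size_t s = 0; s < steps; ++s)
  {
    const std::size_t t = T - s;              // 1-based time index
    const double dtanh = 1.0 - h[t] * h[t];   // tanh'(pre-activation)
    grad += delta * dtanh * h[t - 1];         // contribution to dL/dw
    delta *= dtanh * w;                       // push error to h_{t-1}
  }
  return grad;
}
```

Once {\ttfamily rho} reaches the sequence length the truncation has no further effect, which is why the default of {\ttfamily std::numeric\_limits<size\_t>::max()} amounts to full backpropagation through time.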


\subsection{Member Function Documentation}
\mbox{\label{classmlpack_1_1ann_1_1LSTM_ab2f5417bbabbf195ffec6e34c6adcd52}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Backward@{Backward}}
\index{Backward@{Backward}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Backward()}
{\footnotesize\ttfamily void Backward (\begin{DoxyParamCaption}\item[{const Input\+Type \&}]{input,  }\item[{const Error\+Type \&}]{gy,  }\item[{Gradient\+Type \&}]{g }\end{DoxyParamCaption})}



Ordinary feed-\/backward pass of a neural network, calculating the function f(x) by propagating x backward through f. 

This pass uses the results cached by the feed-\/forward pass.


\begin{DoxyParams}{Parameters}
{\em input} & The propagated input activation. \\
\hline
{\em gy} & The backpropagated error. \\
\hline
{\em g} & The calculated gradient. \\
\hline
\end{DoxyParams}
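The contract documented here is generic: given the upstream error {\ttfamily gy} $= \partial L / \partial y$, the backward pass produces the downstream error {\ttfamily g} $= \partial L / \partial x$ using values cached during the forward pass. A minimal sketch of that contract, using a one-unit $y = tanh(w \cdot x)$ layer standing in for the L\+S\+TM (names are illustrative, not mlpack API):

```cpp
#include <cmath>

// A one-unit layer y = tanh(w * x) illustrating the Forward/Backward
// contract: Backward maps the upstream error gy = dL/dy to the
// downstream error dL/dx, reusing the activation cached by Forward.
struct TanhUnit
{
  double w;
  double y = 0.0;  // cached by Forward, reused by Backward

  double Forward(double x) { y = std::tanh(w * x); return y; }

  // dL/dx = dL/dy * dy/dx, with dy/dx = (1 - y^2) * w.
  double Backward(double gy) const { return gy * (1.0 - y * y) * w; }
};
```

A central finite difference of the forward map recovers the same value, which is a standard way to sanity-check a backward implementation.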
\mbox{\label{classmlpack_1_1ann_1_1LSTM_a797f7edb44dd081e5e2b3cc316eef6bd}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Delta@{Delta}}
\index{Delta@{Delta}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Delta()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Delta (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the delta. 



Definition at line 159 of file lstm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1LSTM_ad6601342d560219ce951d554e69e5e87}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Delta@{Delta}}
\index{Delta@{Delta}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Delta()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Delta (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the delta. 



Definition at line 161 of file lstm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1LSTM_a09440df0a90bdcc766e56e097d91205b}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Forward@{Forward}}
\index{Forward@{Forward}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Forward()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily void Forward (\begin{DoxyParamCaption}\item[{const Input\+Type \&}]{input,  }\item[{Output\+Type \&}]{output }\end{DoxyParamCaption})}



Ordinary feed-\/forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f. 


\begin{DoxyParams}{Parameters}
{\em input} & Input data used for evaluating the specified function. \\
\hline
{\em output} & Resulting output activation. \\
\hline
\end{DoxyParams}
\mbox{\label{classmlpack_1_1ann_1_1LSTM_aec4140d6aff36b440b2653f0c8002b9d}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Forward@{Forward}}
\index{Forward@{Forward}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Forward()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily void Forward (\begin{DoxyParamCaption}\item[{const Input\+Type \&}]{input,  }\item[{Output\+Type \&}]{output,  }\item[{Output\+Type \&}]{cell\+State,  }\item[{bool}]{use\+Cell\+State = {\ttfamily false} }\end{DoxyParamCaption})}



Ordinary feed-\/forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f. 


\begin{DoxyParams}{Parameters}
{\em input} & Input data used for evaluating the specified function. \\
\hline
{\em output} & Resulting output activation. \\
\hline
{\em cell\+State} & Cell state of the \doxyref{L\+S\+TM}{p.}{classmlpack_1_1ann_1_1LSTM}. \\
\hline
{\em use\+Cell\+State} & Whether to use the passed cell\+State as the cell state of the \doxyref{L\+S\+TM}{p.}{classmlpack_1_1ann_1_1LSTM}. \\
\hline
\end{DoxyParams}
\mbox{\label{classmlpack_1_1ann_1_1LSTM_a41f722e63794aa37f301ec3d8c2ce7aa}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Gradient@{Gradient}}
\index{Gradient@{Gradient}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Gradient()\hspace{0.1cm}{\footnotesize\ttfamily [1/3]}}
{\footnotesize\ttfamily void Gradient (\begin{DoxyParamCaption}\item[{const Input\+Type \&}]{input,  }\item[{const Error\+Type \&}]{error,  }\item[{Gradient\+Type \&}]{gradient }\end{DoxyParamCaption})}

\mbox{\label{classmlpack_1_1ann_1_1LSTM_a0f1f4e6d93472d83852731a96c8c3f59}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Gradient@{Gradient}}
\index{Gradient@{Gradient}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Gradient()\hspace{0.1cm}{\footnotesize\ttfamily [2/3]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Gradient (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the gradient. 



Definition at line 164 of file lstm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1LSTM_a19abce4739c3b0b658b612537e21956a}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Gradient@{Gradient}}
\index{Gradient@{Gradient}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Gradient()\hspace{0.1cm}{\footnotesize\ttfamily [3/3]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Gradient (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the gradient. 



Definition at line 166 of file lstm.\+hpp.



References L\+S\+T\+M$<$ Input\+Data\+Type, Output\+Data\+Type $>$\+::serialize().

\mbox{\label{classmlpack_1_1ann_1_1LSTM_a0ee21c2a36e5abad1e7a9d5dd00849f9}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Output\+Parameter@{Output\+Parameter}}
\index{Output\+Parameter@{Output\+Parameter}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Output\+Parameter()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Output\+Parameter (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the output parameter. 



Definition at line 154 of file lstm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1LSTM_a21d5f745f02c709625a4ee0907f004a5}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Output\+Parameter@{Output\+Parameter}}
\index{Output\+Parameter@{Output\+Parameter}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Output\+Parameter()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Output\+Parameter (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the output parameter. 



Definition at line 156 of file lstm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1LSTM_aa530552c7ef915c952fbacc77b965c90}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Parameters@{Parameters}}
\index{Parameters@{Parameters}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Parameters()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Parameters (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the parameters. 



Definition at line 149 of file lstm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1LSTM_a9c5c5900772a689d5a6b59778ec67120}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Parameters@{Parameters}}
\index{Parameters@{Parameters}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Parameters()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Parameters (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the parameters. 



Definition at line 151 of file lstm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1LSTM_a372de693ad40b3f42839c8ec6ac845f4}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Reset@{Reset}}
\index{Reset@{Reset}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Reset()}
{\footnotesize\ttfamily void Reset (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})}

\mbox{\label{classmlpack_1_1ann_1_1LSTM_ab92846e8253dc597239574d87d969ffb}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Reset\+Cell@{Reset\+Cell}}
\index{Reset\+Cell@{Reset\+Cell}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Reset\+Cell()}
{\footnotesize\ttfamily void Reset\+Cell (\begin{DoxyParamCaption}\item[{const size\+\_\+t}]{size }\end{DoxyParamCaption})}

\mbox{\label{classmlpack_1_1ann_1_1LSTM_a45280505858e9bda815a48c96e930f8d}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Rho@{Rho}}
\index{Rho@{Rho}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Rho()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily size\+\_\+t Rho (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the maximum number of steps to backpropagate through time (B\+P\+TT). 



Definition at line 144 of file lstm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1LSTM_aeb617af2894a3e4bbabcd7ebc30a35af}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!Rho@{Rho}}
\index{Rho@{Rho}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{Rho()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily size\+\_\+t\& Rho (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the maximum number of steps to backpropagate through time (B\+P\+TT). 



Definition at line 146 of file lstm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1LSTM_af0dd9205158ccf7bcfcd8ff81f79c927}} 
\index{mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}!serialize@{serialize}}
\index{serialize@{serialize}!mlpack\+::ann\+::\+L\+S\+TM@{mlpack\+::ann\+::\+L\+S\+TM}}
\subsubsection{serialize()}
{\footnotesize\ttfamily void serialize (\begin{DoxyParamCaption}\item[{Archive \&}]{ar,  }\item[{const unsigned}]{int }\end{DoxyParamCaption})}



Serialize the layer. 
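The layer follows the intrusive {\ttfamily serialize(Archive\&, version)} pattern (boost::serialization style), in which one member function both saves and loads by visiting the same members in the same order. A minimal sketch of the pattern with a toy archive (both types are illustrative stand-ins, not mlpack or Boost API):

```cpp
#include <cstddef>
#include <vector>

// Toy archive: copies doubles into and out of a flat buffer.
// Illustrative stand-in for a real serialization archive.
struct ToyArchive
{
  std::vector<double> buf;
  std::size_t pos = 0;
  bool saving = true;

  ToyArchive& operator&(double& v)
  {
    if (saving) buf.push_back(v);
    else        v = buf.at(pos++);
    return *this;
  }
};

struct ToyLayer
{
  double weight = 0.0;
  double bias = 0.0;

  // The same member set is visited when saving and when loading,
  // so a single function covers both directions.
  template<typename Archive>
  void serialize(Archive& ar, const unsigned int /* version */)
  {
    ar & weight;
    ar & bias;
  }
};
```

Because saving and loading share one code path, the member list cannot drift out of sync between the two directions, which is the main appeal of this pattern.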



Referenced by L\+S\+T\+M$<$ Input\+Data\+Type, Output\+Data\+Type $>$\+::\+Gradient().



The documentation for this class was generated from the following files\+:\begin{DoxyCompactItemize}
\item 
/var/www/mlpack.\+ratml.\+org/mlpack.\+org/\+\_\+src/mlpack-\/3.\+3.\+1/src/mlpack/methods/ann/layer/\textbf{ layer\+\_\+types.\+hpp}\item 
/var/www/mlpack.\+ratml.\+org/mlpack.\+org/\+\_\+src/mlpack-\/3.\+3.\+1/src/mlpack/methods/ann/layer/\textbf{ lstm.\+hpp}\end{DoxyCompactItemize}
