\section{Batch\+Norm$<$ Input\+Data\+Type, Output\+Data\+Type $>$ Class Template Reference}
\label{classmlpack_1_1ann_1_1BatchNorm}\index{Batch\+Norm$<$ Input\+Data\+Type, Output\+Data\+Type $>$@{Batch\+Norm$<$ Input\+Data\+Type, Output\+Data\+Type $>$}}


Declaration of the Batch Normalization layer class.  


\subsection*{Public Member Functions}
\begin{DoxyCompactItemize}
\item 
\textbf{ Batch\+Norm} ()
\begin{DoxyCompactList}\small\item\em Create the \doxyref{Batch\+Norm}{p.}{classmlpack_1_1ann_1_1BatchNorm} object. \end{DoxyCompactList}\item 
\textbf{ Batch\+Norm} (const size\+\_\+t size, const double eps=1e-\/8, const bool average=true, const double momentum=0.\+1)
\begin{DoxyCompactList}\small\item\em Create the \doxyref{Batch\+Norm}{p.}{classmlpack_1_1ann_1_1BatchNorm} layer object for a specified number of input units. \end{DoxyCompactList}\item 
bool \textbf{ Average} () const
\begin{DoxyCompactList}\small\item\em Get the average parameter. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename eT $>$ }\\void \textbf{ Backward} (const arma\+::\+Mat$<$ eT $>$ \&input, const arma\+::\+Mat$<$ eT $>$ \&gy, arma\+::\+Mat$<$ eT $>$ \&g)
\begin{DoxyCompactList}\small\item\em Backward pass through the layer. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Delta} () const
\begin{DoxyCompactList}\small\item\em Get the delta. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Delta} ()
\begin{DoxyCompactList}\small\item\em Modify the delta. \end{DoxyCompactList}\item 
bool \textbf{ Deterministic} () const
\begin{DoxyCompactList}\small\item\em Get the value of deterministic parameter. \end{DoxyCompactList}\item 
bool \& \textbf{ Deterministic} ()
\begin{DoxyCompactList}\small\item\em Modify the value of deterministic parameter. \end{DoxyCompactList}\item 
double \textbf{ Epsilon} () const
\begin{DoxyCompactList}\small\item\em Get the epsilon value. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename eT $>$ }\\void \textbf{ Forward} (const arma\+::\+Mat$<$ eT $>$ \&input, arma\+::\+Mat$<$ eT $>$ \&output)
\begin{DoxyCompactList}\small\item\em Forward pass of the Batch Normalization layer. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename eT $>$ }\\void \textbf{ Gradient} (const arma\+::\+Mat$<$ eT $>$ \&input, const arma\+::\+Mat$<$ eT $>$ \&error, arma\+::\+Mat$<$ eT $>$ \&gradient)
\begin{DoxyCompactList}\small\item\em Calculate the gradient using the output delta and the input activations. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Gradient} () const
\begin{DoxyCompactList}\small\item\em Get the gradient. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Gradient} ()
\begin{DoxyCompactList}\small\item\em Modify the gradient. \end{DoxyCompactList}\item 
size\+\_\+t \textbf{ Input\+Size} () const
\begin{DoxyCompactList}\small\item\em Get the number of input units / channels. \end{DoxyCompactList}\item 
double \textbf{ Momentum} () const
\begin{DoxyCompactList}\small\item\em Get the momentum value. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Output\+Parameter} () const
\begin{DoxyCompactList}\small\item\em Get the output parameter. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Output\+Parameter} ()
\begin{DoxyCompactList}\small\item\em Modify the output parameter. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Parameters} () const
\begin{DoxyCompactList}\small\item\em Get the parameters. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Parameters} ()
\begin{DoxyCompactList}\small\item\em Modify the parameters. \end{DoxyCompactList}\item 
void \textbf{ Reset} ()
\begin{DoxyCompactList}\small\item\em Reset the layer parameters. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Archive $>$ }\\void \textbf{ serialize} (Archive \&ar, const unsigned int)
\begin{DoxyCompactList}\small\item\em Serialize the layer. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Training\+Mean} () const
\begin{DoxyCompactList}\small\item\em Get the mean over the training data. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Training\+Mean} ()
\begin{DoxyCompactList}\small\item\em Modify the mean over the training data. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Training\+Variance} () const
\begin{DoxyCompactList}\small\item\em Get the variance over the training data. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Training\+Variance} ()
\begin{DoxyCompactList}\small\item\em Modify the variance over the training data. \end{DoxyCompactList}\end{DoxyCompactItemize}


\subsection{Detailed Description}
\subsubsection*{template$<$typename Input\+Data\+Type = arma\+::mat, typename Output\+Data\+Type = arma\+::mat$>$\newline
class mlpack\+::ann\+::\+Batch\+Norm$<$ Input\+Data\+Type, Output\+Data\+Type $>$}

Declaration of the Batch Normalization layer class. 

The layer transforms the input data to zero mean and unit variance, then scales and shifts the result by the parameters gamma and beta, respectively. These parameters are learned by the network.

If deterministic is false (training), the mean and variance are computed over the current batch and used to normalize the data. If it is set to true (testing), the mean and variance accumulated over the training set are used instead.
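The two modes can be illustrated with a minimal standalone sketch (an assumption for illustration, not the mlpack implementation; the function name \texttt{Normalize} is hypothetical) that normalizes a single feature either with batch statistics or with stored running statistics:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Sketch of batch-norm normalization for one feature across a batch.
// deterministic == false (training): use statistics of the batch itself.
// deterministic == true  (testing) : use the stored running statistics.
std::vector<double> Normalize(const std::vector<double>& x,
                              bool deterministic,
                              double runningMean,
                              double runningVar,
                              double eps = 1e-8)
{
  double mean = runningMean, var = runningVar;
  if (!deterministic)
  {
    // Batch mean.
    mean = 0.0;
    for (double v : x) mean += v;
    mean /= x.size();
    // Batch (biased) variance.
    var = 0.0;
    for (double v : x) var += (v - mean) * (v - mean);
    var /= x.size();
  }
  std::vector<double> out(x.size());
  for (size_t i = 0; i < x.size(); ++i)
    out[i] = (x[i] - mean) / std::sqrt(var + eps);  // epsilon for stability
  return out;
}
```

In training mode the output of this sketch has zero mean and unit variance regardless of the running statistics passed in; in deterministic mode the running statistics fully determine the transform.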

For more information, refer to the following paper:


\begin{DoxyCode}
@article\{Ioffe15,
  author    = \{Sergey Ioffe and
               Christian Szegedy\},
  title     = \{Batch Normalization: Accelerating Deep Network Training by
               Reducing Internal Covariate Shift\},
  journal   = \{CoRR\},
  volume    = \{abs/1502.03167\},
  year      = \{2015\},
  url       = \{http://arxiv.org/abs/1502.03167\},
  eprint    = \{1502.03167\},
\}
\end{DoxyCode}



\begin{DoxyTemplParams}{Template Parameters}
{\em Input\+Data\+Type} & Type of the input data (arma\+::colvec, arma\+::mat, arma\+::sp\+\_\+mat or arma\+::cube). \\
\hline
{\em Output\+Data\+Type} & Type of the output data (arma\+::colvec, arma\+::mat, arma\+::sp\+\_\+mat or arma\+::cube). \\
\hline
\end{DoxyTemplParams}


Definition at line 56 of file batch\+\_\+norm.\+hpp.



\subsection{Constructor \& Destructor Documentation}
\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a854f142e5c3785c754d8f063269add79}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Batch\+Norm@{Batch\+Norm}}
\index{Batch\+Norm@{Batch\+Norm}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Batch\+Norm()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily \textbf{ Batch\+Norm} (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})}



Create the \doxyref{Batch\+Norm}{p.}{classmlpack_1_1ann_1_1BatchNorm} object. 

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a47e2f9600ab95dbdc278161064a6fd63}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Batch\+Norm@{Batch\+Norm}}
\index{Batch\+Norm@{Batch\+Norm}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Batch\+Norm()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily \textbf{ Batch\+Norm} (\begin{DoxyParamCaption}\item[{const size\+\_\+t}]{size,  }\item[{const double}]{eps = {\ttfamily 1e-\/8},  }\item[{const bool}]{average = {\ttfamily true},  }\item[{const double}]{momentum = {\ttfamily 0.1} }\end{DoxyParamCaption})}



Create the \doxyref{Batch\+Norm}{p.}{classmlpack_1_1ann_1_1BatchNorm} layer object for a specified number of input units. 


\begin{DoxyParams}{Parameters}
{\em size} & The number of input units / channels. \\
\hline
{\em eps} & The epsilon added to variance to ensure numerical stability. \\
\hline
{\em average} & Boolean to determine whether the cumulative average or the momentum-based update is used for the running statistics. \\
\hline
{\em momentum} & Parameter used to update the running mean and variance. \\
\hline
\end{DoxyParams}
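The interplay of the {\ttfamily average} and {\ttfamily momentum} parameters can be sketched as follows. This is a hedged illustration of the two common update rules for running statistics (the function {\ttfamily UpdateRunningMean} is hypothetical and not part of mlpack's API):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>

// Sketch of the two ways a running mean can be updated after each batch:
//  - average == true : cumulative average over all batchCount batches seen.
//  - average == false: exponential moving average driven by momentum.
// The same rule would apply to the running variance.
double UpdateRunningMean(double runningMean,
                         double batchMean,
                         bool average,
                         std::size_t batchCount,
                         double momentum)
{
  if (average)
  {
    // Cumulative: running = (running * (n - 1) + batch) / n.
    return (runningMean * (batchCount - 1) + batchMean) / batchCount;
  }
  else
  {
    // Momentum: running = (1 - momentum) * running + momentum * batch.
    return (1.0 - momentum) * runningMean + momentum * batchMean;
  }
}
```

With a small momentum (e.g. the default 0.1), the momentum rule weights recent batches more heavily, while the cumulative rule weights all batches equally.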


\subsection{Member Function Documentation}
\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_aab4ef6131dc58825790fb04cc209faab}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Average@{Average}}
\index{Average@{Average}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Average()}
{\footnotesize\ttfamily bool Average (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the average parameter. 



Definition at line 161 of file batch\+\_\+norm.\+hpp.



References Batch\+Norm$<$ Input\+Data\+Type, Output\+Data\+Type $>$\+::serialize().

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a78dbad83871f43db1975e45a9a69c376}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Backward@{Backward}}
\index{Backward@{Backward}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Backward()}
{\footnotesize\ttfamily void Backward (\begin{DoxyParamCaption}\item[{const arma\+::\+Mat$<$ eT $>$ \&}]{input,  }\item[{const arma\+::\+Mat$<$ eT $>$ \&}]{gy,  }\item[{arma\+::\+Mat$<$ eT $>$ \&}]{g }\end{DoxyParamCaption})}



Backward pass through the layer. 


\begin{DoxyParams}{Parameters}
{\em input} & The input activations. \\
\hline
{\em gy} & The backpropagated error. \\
\hline
{\em g} & The calculated gradient. \\
\hline
\end{DoxyParams}
\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a797f7edb44dd081e5e2b3cc316eef6bd}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Delta@{Delta}}
\index{Delta@{Delta}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Delta()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Delta (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the delta. 



Definition at line 127 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_ad6601342d560219ce951d554e69e5e87}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Delta@{Delta}}
\index{Delta@{Delta}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Delta()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Delta (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the delta. 



Definition at line 129 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a9f4103707f4d199ce5594d239b60443e}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Deterministic@{Deterministic}}
\index{Deterministic@{Deterministic}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Deterministic()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily bool Deterministic (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the value of deterministic parameter. 



Definition at line 137 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a42d4ee3da432cff20d3a41b8b1ec801c}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Deterministic@{Deterministic}}
\index{Deterministic@{Deterministic}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Deterministic()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily bool\& Deterministic (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the value of deterministic parameter. 



Definition at line 139 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_af6d960193bb5db37e51416e12bf720de}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Epsilon@{Epsilon}}
\index{Epsilon@{Epsilon}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Epsilon()}
{\footnotesize\ttfamily double Epsilon (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the epsilon value. 



Definition at line 155 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a461f849bc638c15bec262dc9c3a58abe}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Forward@{Forward}}
\index{Forward@{Forward}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Forward()}
{\footnotesize\ttfamily void Forward (\begin{DoxyParamCaption}\item[{const arma\+::\+Mat$<$ eT $>$ \&}]{input,  }\item[{arma\+::\+Mat$<$ eT $>$ \&}]{output }\end{DoxyParamCaption})}



Forward pass of the Batch Normalization layer. 

Transforms the input data to zero mean and unit variance, then scales the result by gamma and shifts it by beta.


\begin{DoxyParams}{Parameters}
{\em input} & Input data for the layer. \\
\hline
{\em output} & Resulting output activations. \\
\hline
\end{DoxyParams}
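The full training-mode forward transform can be sketched for a single feature as a standalone C++ function (an illustration of the formula $y = \gamma \hat{x} + \beta$, not mlpack's implementation; the name {\ttfamily ForwardOneFeature} is hypothetical):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Sketch of the batch-norm forward pass for one feature in training mode:
// normalize with the batch statistics, then scale by gamma and shift by beta.
std::vector<double> ForwardOneFeature(const std::vector<double>& input,
                                      double gamma,
                                      double beta,
                                      double eps = 1e-8)
{
  // Batch mean.
  double mean = 0.0;
  for (double v : input) mean += v;
  mean /= input.size();
  // Batch (biased) variance.
  double var = 0.0;
  for (double v : input) var += (v - mean) * (v - mean);
  var /= input.size();

  // y_i = gamma * (x_i - mean) / sqrt(var + eps) + beta.
  std::vector<double> output(input.size());
  for (size_t i = 0; i < input.size(); ++i)
    output[i] = gamma * (input[i] - mean) / std::sqrt(var + eps) + beta;
  return output;
}
```

After this transform the output has mean beta and standard deviation (approximately) gamma, which is why the layer can recover the identity transform if that is what training favors.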
\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_aaf577db350e2130754490d8486fba215}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Gradient@{Gradient}}
\index{Gradient@{Gradient}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Gradient()\hspace{0.1cm}{\footnotesize\ttfamily [1/3]}}
{\footnotesize\ttfamily void Gradient (\begin{DoxyParamCaption}\item[{const arma\+::\+Mat$<$ eT $>$ \&}]{input,  }\item[{const arma\+::\+Mat$<$ eT $>$ \&}]{error,  }\item[{arma\+::\+Mat$<$ eT $>$ \&}]{gradient }\end{DoxyParamCaption})}



Calculate the gradient using the output delta and the input activations. 


\begin{DoxyParams}{Parameters}
{\em input} & The input activations. \\
\hline
{\em error} & The calculated error. \\
\hline
{\em gradient} & The calculated gradient. \\
\hline
\end{DoxyParams}
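The gradients with respect to the learnable parameters have a simple closed form: with $\hat{x}$ the normalized input and $\delta$ the backpropagated error, $\partial L / \partial \gamma = \sum_i \delta_i \hat{x}_i$ and $\partial L / \partial \beta = \sum_i \delta_i$. A minimal sketch (the helper {\ttfamily ParamGradients} is hypothetical, not mlpack's API):

```cpp
#include <cassert>
#include <cmath>
#include <utility>
#include <vector>

// Sketch of the batch-norm parameter gradients for one feature:
//   dL/dgamma = sum_i error_i * xhat_i   (xhat = normalized input)
//   dL/dbeta  = sum_i error_i
std::pair<double, double> ParamGradients(const std::vector<double>& xhat,
                                         const std::vector<double>& error)
{
  double dGamma = 0.0, dBeta = 0.0;
  for (size_t i = 0; i < xhat.size(); ++i)
  {
    dGamma += error[i] * xhat[i];
    dBeta  += error[i];
  }
  return {dGamma, dBeta};
}
```

Note that the gradient with respect to the input (computed by {\ttfamily Backward()}) also has to account for the dependence of the batch mean and variance on the input, so it is more involved than these two sums.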
\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a0f1f4e6d93472d83852731a96c8c3f59}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Gradient@{Gradient}}
\index{Gradient@{Gradient}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Gradient()\hspace{0.1cm}{\footnotesize\ttfamily [2/3]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Gradient (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the gradient. 



Definition at line 132 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a19abce4739c3b0b658b612537e21956a}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Gradient@{Gradient}}
\index{Gradient@{Gradient}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Gradient()\hspace{0.1cm}{\footnotesize\ttfamily [3/3]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Gradient (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the gradient. 



Definition at line 134 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a5a4c4984aa897a28d516e638e7ea5308}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Input\+Size@{Input\+Size}}
\index{Input\+Size@{Input\+Size}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Input\+Size()}
{\footnotesize\ttfamily size\+\_\+t Input\+Size (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the number of input units / channels. 



Definition at line 152 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a47bdc16d2d5d5514d9711eae8938fd35}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Momentum@{Momentum}}
\index{Momentum@{Momentum}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Momentum()}
{\footnotesize\ttfamily double Momentum (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the momentum value. 



Definition at line 158 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a0ee21c2a36e5abad1e7a9d5dd00849f9}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Output\+Parameter@{Output\+Parameter}}
\index{Output\+Parameter@{Output\+Parameter}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Output\+Parameter()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Output\+Parameter (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the output parameter. 



Definition at line 122 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a21d5f745f02c709625a4ee0907f004a5}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Output\+Parameter@{Output\+Parameter}}
\index{Output\+Parameter@{Output\+Parameter}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Output\+Parameter()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Output\+Parameter (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the output parameter. 



Definition at line 124 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_aa530552c7ef915c952fbacc77b965c90}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Parameters@{Parameters}}
\index{Parameters@{Parameters}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Parameters()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Parameters (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the parameters. 



Definition at line 117 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a9c5c5900772a689d5a6b59778ec67120}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Parameters@{Parameters}}
\index{Parameters@{Parameters}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Parameters()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Parameters (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the parameters. 



Definition at line 119 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a372de693ad40b3f42839c8ec6ac845f4}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Reset@{Reset}}
\index{Reset@{Reset}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Reset()}
{\footnotesize\ttfamily void Reset (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})}



Reset the layer parameters. 

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_af0dd9205158ccf7bcfcd8ff81f79c927}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!serialize@{serialize}}
\index{serialize@{serialize}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{serialize()}
{\footnotesize\ttfamily void serialize (\begin{DoxyParamCaption}\item[{Archive \&}]{ar,  }\item[{const unsigned}]{int }\end{DoxyParamCaption})}



Serialize the layer. 



Referenced by Batch\+Norm$<$ Input\+Data\+Type, Output\+Data\+Type $>$\+::\+Average().

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_ac2df8242145c0ce6b0715750985e2d10}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Training\+Mean@{Training\+Mean}}
\index{Training\+Mean@{Training\+Mean}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Training\+Mean()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Training\+Mean (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the mean over the training data. 



Definition at line 142 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_af537b166f2862d70c750c30cd6be5c9f}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Training\+Mean@{Training\+Mean}}
\index{Training\+Mean@{Training\+Mean}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Training\+Mean()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Training\+Mean (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the mean over the training data. 



Definition at line 144 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a3faae0b64ac1f68fd95872a3c1cafd11}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Training\+Variance@{Training\+Variance}}
\index{Training\+Variance@{Training\+Variance}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Training\+Variance()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Training\+Variance (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the variance over the training data. 



Definition at line 147 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a3a3209098696730697bbe8b9cf0dc30c}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Training\+Variance@{Training\+Variance}}
\index{Training\+Variance@{Training\+Variance}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Training\+Variance()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Training\+Variance (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the variance over the training data. 



Definition at line 149 of file batch\+\_\+norm.\+hpp.



The documentation for this class was generated from the following file\+:\begin{DoxyCompactItemize}
\item 
/var/www/mlpack.\+ratml.\+org/mlpack.\+org/\+\_\+src/mlpack-\/git/src/mlpack/methods/ann/layer/\textbf{ batch\+\_\+norm.\+hpp}\end{DoxyCompactItemize}
