\section{Batch\+Norm$<$ Input\+Data\+Type, Output\+Data\+Type $>$ Class Template Reference}
\label{classmlpack_1_1ann_1_1BatchNorm}\index{Batch\+Norm$<$ Input\+Data\+Type, Output\+Data\+Type $>$@{Batch\+Norm$<$ Input\+Data\+Type, Output\+Data\+Type $>$}}


Declaration of the Batch Normalization layer class.  


\subsection*{Public Member Functions}
\begin{DoxyCompactItemize}
\item 
\textbf{ Batch\+Norm} ()
\begin{DoxyCompactList}\small\item\em Create the \doxyref{Batch\+Norm}{p.}{classmlpack_1_1ann_1_1BatchNorm} object. \end{DoxyCompactList}\item 
\textbf{ Batch\+Norm} (const size\+\_\+t size, const double eps=1e-\/8)
\begin{DoxyCompactList}\small\item\em Create the \doxyref{Batch\+Norm}{p.}{classmlpack_1_1ann_1_1BatchNorm} layer object for a specified number of input units. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename eT $>$ }\\void \textbf{ Backward} (const arma\+::\+Mat$<$ eT $>$ \&input, const arma\+::\+Mat$<$ eT $>$ \&gy, arma\+::\+Mat$<$ eT $>$ \&g)
\begin{DoxyCompactList}\small\item\em Backward pass through the layer. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Delta} () const
\begin{DoxyCompactList}\small\item\em Get the delta. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Delta} ()
\begin{DoxyCompactList}\small\item\em Modify the delta. \end{DoxyCompactList}\item 
bool \textbf{ Deterministic} () const
\begin{DoxyCompactList}\small\item\em Get the value of the deterministic parameter. \end{DoxyCompactList}\item 
bool \& \textbf{ Deterministic} ()
\begin{DoxyCompactList}\small\item\em Modify the value of the deterministic parameter. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename eT $>$ }\\void \textbf{ Forward} (const arma\+::\+Mat$<$ eT $>$ \&input, arma\+::\+Mat$<$ eT $>$ \&output)
\begin{DoxyCompactList}\small\item\em Forward pass of the Batch Normalization layer. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename eT $>$ }\\void \textbf{ Gradient} (const arma\+::\+Mat$<$ eT $>$ \&input, const arma\+::\+Mat$<$ eT $>$ \&error, arma\+::\+Mat$<$ eT $>$ \&gradient)
\begin{DoxyCompactList}\small\item\em Calculate the gradient using the output delta and the input activations. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Gradient} () const
\begin{DoxyCompactList}\small\item\em Get the gradient. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Gradient} ()
\begin{DoxyCompactList}\small\item\em Modify the gradient. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Output\+Parameter} () const
\begin{DoxyCompactList}\small\item\em Get the output parameter. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Output\+Parameter} ()
\begin{DoxyCompactList}\small\item\em Modify the output parameter. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Parameters} () const
\begin{DoxyCompactList}\small\item\em Get the parameters. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Parameters} ()
\begin{DoxyCompactList}\small\item\em Modify the parameters. \end{DoxyCompactList}\item 
void \textbf{ Reset} ()
\begin{DoxyCompactList}\small\item\em Reset the layer parameters. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Archive $>$ }\\void \textbf{ serialize} (Archive \&ar, const unsigned int)
\begin{DoxyCompactList}\small\item\em Serialize the layer. \end{DoxyCompactList}\item 
Output\+Data\+Type \textbf{ Training\+Mean} ()
\begin{DoxyCompactList}\small\item\em Get the mean over the training data. \end{DoxyCompactList}\item 
Output\+Data\+Type \textbf{ Training\+Variance} ()
\begin{DoxyCompactList}\small\item\em Get the variance over the training data. \end{DoxyCompactList}\end{DoxyCompactItemize}


\subsection{Detailed Description}
\subsubsection*{template$<$typename Input\+Data\+Type = arma\+::mat, typename Output\+Data\+Type = arma\+::mat$>$\newline
class mlpack\+::ann\+::\+Batch\+Norm$<$ Input\+Data\+Type, Output\+Data\+Type $>$}

Declaration of the Batch Normalization layer class. 

The layer transforms the input data to have zero mean and unit variance, and then scales and shifts the result by the parameters gamma and beta, respectively. These parameters are learned by the network.

If deterministic is false (training), the mean and variance are computed over the current batch and used to normalize the data. If it is set to true (testing), the mean and variance accumulated over the training set are used instead.
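Concretely, for an input $x$ with mini-batch mean $\mu_B$ and variance $\sigma_B^2$, the layer computes the standard batch normalization transform, where the $\epsilon$ term corresponds to the {\ttfamily eps} constructor argument:
\[
  y = \gamma \, \frac{x - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}} + \beta
\]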

For more information, refer to the following paper:


\begin{DoxyCode}
@article\{Ioffe15,
  author    = \{Sergey Ioffe and
               Christian Szegedy\},
  title     = \{Batch Normalization: Accelerating Deep Network Training by
               Reducing Internal Covariate Shift\},
  journal   = \{CoRR\},
  volume    = \{abs/1502.03167\},
  year      = \{2015\},
  url       = \{http://arxiv.org/abs/1502.03167\},
  eprint    = \{1502.03167\},
\}
\end{DoxyCode}



\begin{DoxyTemplParams}{Template Parameters}
{\em Input\+Data\+Type} & Type of the input data (arma\+::colvec, arma\+::mat, arma\+::sp\+\_\+mat or arma\+::cube). \\
\hline
{\em Output\+Data\+Type} & Type of the output data (arma\+::colvec, arma\+::mat, arma\+::sp\+\_\+mat or arma\+::cube). \\
\hline
\end{DoxyTemplParams}


Definition at line 56 of file batch\+\_\+norm.\+hpp.



\subsection{Constructor \& Destructor Documentation}
\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a854f142e5c3785c754d8f063269add79}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Batch\+Norm@{Batch\+Norm}}
\index{Batch\+Norm@{Batch\+Norm}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Batch\+Norm()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily \textbf{ Batch\+Norm} (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})}



Create the \doxyref{Batch\+Norm}{p.}{classmlpack_1_1ann_1_1BatchNorm} object. 

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a33a0776ad6fc986074519b466f40cd7d}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Batch\+Norm@{Batch\+Norm}}
\index{Batch\+Norm@{Batch\+Norm}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Batch\+Norm()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily \textbf{ Batch\+Norm} (\begin{DoxyParamCaption}\item[{const size\+\_\+t}]{size,  }\item[{const double}]{eps = {\ttfamily 1e-\/8} }\end{DoxyParamCaption})}



Create the \doxyref{Batch\+Norm}{p.}{classmlpack_1_1ann_1_1BatchNorm} layer object for a specified number of input units. 


\begin{DoxyParams}{Parameters}
{\em size} & The number of input units. \\
\hline
{\em eps} & The epsilon value added to the variance to ensure numerical stability. \\
\hline
\end{DoxyParams}
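A minimal construction sketch (the layer size and epsilon value are illustrative; inside an {\ttfamily FFN} model, the network manages the layer parameters):

\begin{DoxyCode}
#include <mlpack/methods/ann/layer/batch_norm.hpp>

using namespace mlpack::ann;

// Batch normalization over 10 input units, with the default epsilon (1e-8).
BatchNorm<> bn(10);

// The epsilon can also be given explicitly (illustrative value).
BatchNorm<> bnCustom(10, 1e-5);
\end{DoxyCode}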


\subsection{Member Function Documentation}
\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a78dbad83871f43db1975e45a9a69c376}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Backward@{Backward}}
\index{Backward@{Backward}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Backward()}
{\footnotesize\ttfamily void Backward (\begin{DoxyParamCaption}\item[{const arma\+::\+Mat$<$ eT $>$ \&}]{input,  }\item[{const arma\+::\+Mat$<$ eT $>$ \&}]{gy,  }\item[{arma\+::\+Mat$<$ eT $>$ \&}]{g }\end{DoxyParamCaption})}



Backward pass through the layer. 


\begin{DoxyParams}{Parameters}
{\em input} & The input activations. \\
\hline
{\em gy} & The backpropagated error. \\
\hline
{\em g} & The calculated gradient. \\
\hline
\end{DoxyParams}
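A minimal sketch of a forward and backward pass pair (sizes, data, and the wrapper function are illustrative only; in training, {\ttfamily Forward()} is typically called before {\ttfamily Backward()}):

\begin{DoxyCode}
#include <mlpack/methods/ann/layer/batch_norm.hpp>

using namespace mlpack::ann;

void BackwardSketch()
\{
  arma::mat input = arma::randu<arma::mat>(10, 32); // 10 units, 32 samples.
  arma::mat gy = arma::ones<arma::mat>(10, 32);     // Error from the next layer.
  arma::mat output, g;

  BatchNorm<> bn(10);
  bn.Reset();                 // Reset the layer parameters.
  bn.Forward(input, output);  // Forward pass over the batch (training mode).
  bn.Backward(input, gy, g);  // g receives the calculated gradient to pass back.
\}
\end{DoxyCode}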
\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a797f7edb44dd081e5e2b3cc316eef6bd}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Delta@{Delta}}
\index{Delta@{Delta}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Delta()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Delta (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the delta. 



Definition at line 121 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_ad6601342d560219ce951d554e69e5e87}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Delta@{Delta}}
\index{Delta@{Delta}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Delta()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Delta (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the delta. 



Definition at line 123 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a9f4103707f4d199ce5594d239b60443e}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Deterministic@{Deterministic}}
\index{Deterministic@{Deterministic}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Deterministic()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily bool Deterministic (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the value of the deterministic parameter. 



Definition at line 131 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a42d4ee3da432cff20d3a41b8b1ec801c}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Deterministic@{Deterministic}}
\index{Deterministic@{Deterministic}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Deterministic()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily bool\& Deterministic (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the value of the deterministic parameter. 



Definition at line 133 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a461f849bc638c15bec262dc9c3a58abe}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Forward@{Forward}}
\index{Forward@{Forward}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Forward()}
{\footnotesize\ttfamily void Forward (\begin{DoxyParamCaption}\item[{const arma\+::\+Mat$<$ eT $>$ \&}]{input,  }\item[{arma\+::\+Mat$<$ eT $>$ \&}]{output }\end{DoxyParamCaption})}



Forward pass of the Batch Normalization layer. 

Transforms the input data to have zero mean and unit variance, then scales the result by gamma and shifts it by beta.


\begin{DoxyParams}{Parameters}
{\em input} & Input data for the layer. \\
\hline
{\em output} & Resulting output activations. \\
\hline
\end{DoxyParams}
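A short sketch of the forward pass in training and testing mode (data, sizes, and the wrapper function are illustrative; when the layer is part of an {\ttfamily FFN}, the deterministic flag is normally managed by the network):

\begin{DoxyCode}
#include <mlpack/methods/ann/layer/batch_norm.hpp>

using namespace mlpack::ann;

void ForwardSketch()
\{
  arma::mat input = arma::randu<arma::mat>(10, 32);
  arma::mat output;

  BatchNorm<> bn(10);
  bn.Reset();

  // Training mode: normalize with the statistics of the current batch.
  bn.Deterministic() = false;
  bn.Forward(input, output);

  // Testing mode: normalize with the mean and variance accumulated in training.
  bn.Deterministic() = true;
  bn.Forward(input, output);
\}
\end{DoxyCode}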
\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_aaf577db350e2130754490d8486fba215}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Gradient@{Gradient}}
\index{Gradient@{Gradient}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Gradient()\hspace{0.1cm}{\footnotesize\ttfamily [1/3]}}
{\footnotesize\ttfamily void Gradient (\begin{DoxyParamCaption}\item[{const arma\+::\+Mat$<$ eT $>$ \&}]{input,  }\item[{const arma\+::\+Mat$<$ eT $>$ \&}]{error,  }\item[{arma\+::\+Mat$<$ eT $>$ \&}]{gradient }\end{DoxyParamCaption})}



Calculate the gradient using the output delta and the input activations. 


\begin{DoxyParams}{Parameters}
{\em input} & The input activations. \\
\hline
{\em error} & The calculated error. \\
\hline
{\em gradient} & The calculated gradient. \\
\hline
\end{DoxyParams}
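Continuing the {\ttfamily Backward()} sketch above, the gradients for the trainable parameters (gamma and beta) can then be obtained as follows ({\ttfamily gradient} is an output matrix, named here for illustration):

\begin{DoxyCode}
arma::mat gradient;

// Gradient of the objective with respect to the layer parameters, given the
// input activations and the backpropagated error.
bn.Gradient(input, gy, gradient);
\end{DoxyCode}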
\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a0f1f4e6d93472d83852731a96c8c3f59}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Gradient@{Gradient}}
\index{Gradient@{Gradient}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Gradient()\hspace{0.1cm}{\footnotesize\ttfamily [2/3]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Gradient (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the gradient. 



Definition at line 126 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a19abce4739c3b0b658b612537e21956a}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Gradient@{Gradient}}
\index{Gradient@{Gradient}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Gradient()\hspace{0.1cm}{\footnotesize\ttfamily [3/3]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Gradient (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the gradient. 



Definition at line 128 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a0ee21c2a36e5abad1e7a9d5dd00849f9}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Output\+Parameter@{Output\+Parameter}}
\index{Output\+Parameter@{Output\+Parameter}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Output\+Parameter()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Output\+Parameter (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the output parameter. 



Definition at line 116 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a21d5f745f02c709625a4ee0907f004a5}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Output\+Parameter@{Output\+Parameter}}
\index{Output\+Parameter@{Output\+Parameter}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Output\+Parameter()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Output\+Parameter (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the output parameter. 



Definition at line 118 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_aa530552c7ef915c952fbacc77b965c90}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Parameters@{Parameters}}
\index{Parameters@{Parameters}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Parameters()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Parameters (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the parameters. 



Definition at line 111 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a9c5c5900772a689d5a6b59778ec67120}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Parameters@{Parameters}}
\index{Parameters@{Parameters}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Parameters()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Parameters (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the parameters. 



Definition at line 113 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a372de693ad40b3f42839c8ec6ac845f4}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Reset@{Reset}}
\index{Reset@{Reset}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Reset()}
{\footnotesize\ttfamily void Reset (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})}



Reset the layer parameters. 

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_af0dd9205158ccf7bcfcd8ff81f79c927}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!serialize@{serialize}}
\index{serialize@{serialize}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{serialize()}
{\footnotesize\ttfamily void serialize (\begin{DoxyParamCaption}\item[{Archive \&}]{ar,  }\item[{const unsigned}]{int }\end{DoxyParamCaption})}



Serialize the layer. 



Referenced by Batch\+Norm$<$ Input\+Data\+Type, Output\+Data\+Type $>$\+::\+Training\+Variance().
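A hedged sketch of saving the layer with Boost.Serialization (mlpack 3.x implements {\ttfamily serialize()} on top of Boost.Serialization; the archive type, file name, and wrapper function below are illustrative only):

\begin{DoxyCode}
#include <fstream>

#include <boost/archive/xml_oarchive.hpp>

#include <mlpack/core.hpp>
#include <mlpack/methods/ann/layer/batch_norm.hpp>

using namespace mlpack::ann;

void SerializeSketch()
\{
  BatchNorm<> bn(10);
  bn.Reset();

  // Write the layer to an XML archive.
  std::ofstream ofs("batch_norm.xml");
  boost::archive::xml_oarchive ar(ofs);
  ar << BOOST_SERIALIZATION_NVP(bn);
\}
\end{DoxyCode}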

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_ad08f88b9f31bd4444f34e8fb72b44f61}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Training\+Mean@{Training\+Mean}}
\index{Training\+Mean@{Training\+Mean}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Training\+Mean()}
{\footnotesize\ttfamily Output\+Data\+Type Training\+Mean (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Get the mean over the training data. 



Definition at line 136 of file batch\+\_\+norm.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1BatchNorm_a2d990a30d296c8608719307bea36b8da}} 
\index{mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}!Training\+Variance@{Training\+Variance}}
\index{Training\+Variance@{Training\+Variance}!mlpack\+::ann\+::\+Batch\+Norm@{mlpack\+::ann\+::\+Batch\+Norm}}
\subsubsection{Training\+Variance()}
{\footnotesize\ttfamily Output\+Data\+Type Training\+Variance (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Get the variance over the training data. 



Definition at line 139 of file batch\+\_\+norm.\+hpp.



References Batch\+Norm$<$ Input\+Data\+Type, Output\+Data\+Type $>$\+::serialize().



The documentation for this class was generated from the following file\+:\begin{DoxyCompactItemize}
\item 
/var/www/mlpack.\+ratml.\+org/mlpack.\+org/\+\_\+src/mlpack-\/3.\+3.\+0/src/mlpack/methods/ann/layer/\textbf{ batch\+\_\+norm.\+hpp}\end{DoxyCompactItemize}
