\section{K\+L\+Divergence$<$ Input\+Data\+Type, Output\+Data\+Type $>$ Class Template Reference}
\label{classmlpack_1_1ann_1_1KLDivergence}\index{K\+L\+Divergence$<$ Input\+Data\+Type, Output\+Data\+Type $>$@{K\+L\+Divergence$<$ Input\+Data\+Type, Output\+Data\+Type $>$}}


The Kullback–\+Leibler divergence is often used for continuous distributions (direct regression).  


\subsection*{Public Member Functions}
\begin{DoxyCompactItemize}
\item 
\textbf{ K\+L\+Divergence} (const bool take\+Mean=false)
\begin{DoxyCompactList}\small\item\em Create the Kullback–\+Leibler Divergence object with the specified parameters. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Input\+Type , typename Target\+Type , typename Output\+Type $>$ }\\void \textbf{ Backward} (const Input\+Type \&input, const Target\+Type \&target, Output\+Type \&output)
\begin{DoxyCompactList}\small\item\em Ordinary feed backward pass of a neural network. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Input\+Type , typename Target\+Type $>$ }\\Input\+Type\+::elem\+\_\+type \textbf{ Forward} (const Input\+Type \&input, const Target\+Type \&target)
\begin{DoxyCompactList}\small\item\em Computes the Kullback–\+Leibler divergence error function. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Output\+Parameter} () const
\begin{DoxyCompactList}\small\item\em Get the output parameter. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Output\+Parameter} ()
\begin{DoxyCompactList}\small\item\em Modify the output parameter. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Archive $>$ }\\void \textbf{ serialize} (Archive \&ar, const unsigned int)
\begin{DoxyCompactList}\small\item\em Serialize the loss function. \end{DoxyCompactList}\item 
bool \textbf{ Take\+Mean} () const
\begin{DoxyCompactList}\small\item\em Get the value of take\+Mean. \end{DoxyCompactList}\item 
bool \& \textbf{ Take\+Mean} ()
\begin{DoxyCompactList}\small\item\em Modify the value of take\+Mean. \end{DoxyCompactList}\end{DoxyCompactItemize}


\subsection{Detailed Description}
\subsubsection*{template$<$typename Input\+Data\+Type = arma\+::mat, typename Output\+Data\+Type = arma\+::mat$>$\newline
class mlpack\+::ann\+::\+K\+L\+Divergence$<$ Input\+Data\+Type, Output\+Data\+Type $>$}

The Kullback–\+Leibler divergence is often used for continuous distributions (direct regression). 

For more information, see the following paper.


\begin{DoxyCode}
@article\{Kullback1951,
  title   = \{On Information and Sufficiency\},
  author  = \{S. Kullback and R.A. Leibler\},
  journal = \{The Annals of Mathematical Statistics\},
  year    = \{1951\}
\}
\end{DoxyCode}



\begin{DoxyTemplParams}{Template Parameters}
{\em Input\+Data\+Type} & Type of the input data (arma\+::colvec, arma\+::mat, arma\+::sp\+\_\+mat or arma\+::cube). \\
\hline
{\em Output\+Data\+Type} & Type of the output data (arma\+::colvec, arma\+::mat, arma\+::sp\+\_\+mat or arma\+::cube). \\
\hline
\end{DoxyTemplParams}


Definition at line 45 of file kl\+\_\+divergence.\+hpp.
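
For reference, the quantity involved is the standard Kullback--Leibler divergence between two distributions $P$ and $Q$,
\[
D_{\mathrm{KL}}(P \,\|\, Q) \;=\; \sum_i p_i \log\frac{p_i}{q_i},
\]
where, by the usual convention for this loss, $p$ is taken from the network's input (prediction) and $q$ from the target; the exact argument order should be checked against kl\+\_\+divergence.\+hpp. When {\ttfamily takeMean} is true, the sum over elements is replaced by the mean.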



\subsection{Constructor \& Destructor Documentation}
\mbox{\label{classmlpack_1_1ann_1_1KLDivergence_a16755dd3b869553b03796619adcb8e52}} 
\index{mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}!K\+L\+Divergence@{K\+L\+Divergence}}
\index{K\+L\+Divergence@{K\+L\+Divergence}!mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}}
\subsubsection{K\+L\+Divergence()}
{\footnotesize\ttfamily \textbf{ K\+L\+Divergence} (\begin{DoxyParamCaption}\item[{const bool}]{take\+Mean = {\ttfamily false} }\end{DoxyParamCaption})}



Create the Kullback–\+Leibler Divergence object with the specified parameters. 


\begin{DoxyParams}{Parameters}
{\em take\+Mean} & If true, return the mean of the divergence over all elements instead of the sum. \\
\hline
\end{DoxyParams}


\subsection{Member Function Documentation}
\mbox{\label{classmlpack_1_1ann_1_1KLDivergence_a7a5e88245fe9cf5644f846902393e97a}} 
\index{mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}!Backward@{Backward}}
\index{Backward@{Backward}!mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}}
\subsubsection{Backward()}
{\footnotesize\ttfamily void Backward (\begin{DoxyParamCaption}\item[{const Input\+Type \&}]{input,  }\item[{const Target\+Type \&}]{target,  }\item[{Output\+Type \&}]{output }\end{DoxyParamCaption})}



Ordinary feed backward pass of a neural network. 


\begin{DoxyParams}{Parameters}
{\em input} & The propagated input activation. \\
\hline
{\em target} & The target vector. \\
\hline
{\em output} & The calculated error. \\
\hline
\end{DoxyParams}
\mbox{\label{classmlpack_1_1ann_1_1KLDivergence_aad9536a75d4ecfe220d313adc47f38fa}} 
\index{mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}!Forward@{Forward}}
\index{Forward@{Forward}!mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}}
\subsubsection{Forward()}
{\footnotesize\ttfamily Input\+Type\+::elem\+\_\+type Forward (\begin{DoxyParamCaption}\item[{const Input\+Type \&}]{input,  }\item[{const Target\+Type \&}]{target }\end{DoxyParamCaption})}



Computes the Kullback–\+Leibler divergence error function. 


\begin{DoxyParams}{Parameters}
{\em input} & Input data used for evaluating the specified function. \\
\hline
{\em target} & Target data to compare with. \\
\hline
\end{DoxyParams}
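The value returned by {\ttfamily Forward} can be sketched in plain C++ without any mlpack dependency; {\ttfamily KLForward} below is a hypothetical stand-in illustrating the sum formulation $\sum_i p_i(\log p_i - \log q_i)$, with {\ttfamily takeMean} mirroring the constructor flag. The exact reduction used by mlpack should be checked against kl\+\_\+divergence.\+hpp.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical stand-in for Forward(): the KL divergence of `input`
// from `target`, summed (or averaged, if takeMean) over all elements.
double KLForward(const std::vector<double>& input,
                 const std::vector<double>& target,
                 const bool takeMean = false)
{
  double sum = 0.0;
  for (std::size_t i = 0; i < input.size(); ++i)
    sum += input[i] * (std::log(input[i]) - std::log(target[i]));
  return takeMean ? sum / static_cast<double>(input.size()) : sum;
}
```

For identical distributions the divergence is zero, and for valid probability vectors it is non-negative; e.g.\ {\ttfamily KLForward(\{0.7, 0.3\}, \{0.5, 0.5\})} is strictly positive.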
\mbox{\label{classmlpack_1_1ann_1_1KLDivergence_a8bae962cc603d1cab8d80ec78f8d505d}} 
\index{mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}!Output\+Parameter@{Output\+Parameter}}
\index{Output\+Parameter@{Output\+Parameter}!mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}}
\subsubsection{Output\+Parameter()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Output\+Parameter (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the output parameter. 



Definition at line 79 of file kl\+\_\+divergence.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1KLDivergence_a21d5f745f02c709625a4ee0907f004a5}} 
\index{mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}!Output\+Parameter@{Output\+Parameter}}
\index{Output\+Parameter@{Output\+Parameter}!mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}}
\subsubsection{Output\+Parameter()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Output\+Parameter (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the output parameter. 



Definition at line 81 of file kl\+\_\+divergence.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1KLDivergence_af0dd9205158ccf7bcfcd8ff81f79c927}} 
\index{mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}!serialize@{serialize}}
\index{serialize@{serialize}!mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}}
\subsubsection{serialize()}
{\footnotesize\ttfamily void serialize (\begin{DoxyParamCaption}\item[{Archive \&}]{ar,  }\item[{const unsigned int}]{ }\end{DoxyParamCaption})}



Serialize the loss function. 



Referenced by K\+L\+Divergence$<$ Input\+Data\+Type, Output\+Data\+Type $>$\+::\+Take\+Mean().

\mbox{\label{classmlpack_1_1ann_1_1KLDivergence_ab1afafdad2b04d3378dce6f13c9968a2}} 
\index{mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}!Take\+Mean@{Take\+Mean}}
\index{Take\+Mean@{Take\+Mean}!mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}}
\subsubsection{Take\+Mean()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily bool Take\+Mean (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the value of take\+Mean. 



Definition at line 84 of file kl\+\_\+divergence.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1KLDivergence_a6523d960bcd088ba1e86fbe2e095a79b}} 
\index{mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}!Take\+Mean@{Take\+Mean}}
\index{Take\+Mean@{Take\+Mean}!mlpack\+::ann\+::\+K\+L\+Divergence@{mlpack\+::ann\+::\+K\+L\+Divergence}}
\subsubsection{Take\+Mean()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily bool\& Take\+Mean (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the value of take\+Mean. 



Definition at line 86 of file kl\+\_\+divergence.\+hpp.



References K\+L\+Divergence$<$ Input\+Data\+Type, Output\+Data\+Type $>$\+::serialize().



The documentation for this class was generated from the following file\+:\begin{DoxyCompactItemize}
\item 
src/mlpack/methods/ann/loss\+\_\+functions/\textbf{ kl\+\_\+divergence.\+hpp}\end{DoxyCompactItemize}
