\section{Cross\+Entropy\+Error$<$ Input\+Data\+Type, Output\+Data\+Type $>$ Class Template Reference}
\label{classmlpack_1_1ann_1_1CrossEntropyError}\index{Cross\+Entropy\+Error$<$ Input\+Data\+Type, Output\+Data\+Type $>$@{Cross\+Entropy\+Error$<$ Input\+Data\+Type, Output\+Data\+Type $>$}}


The cross-\/entropy performance function measures the network\textquotesingle{}s performance according to the cross-\/entropy between the input and target distributions.  


\subsection*{Public Member Functions}
\begin{DoxyCompactItemize}
\item 
\textbf{ Cross\+Entropy\+Error} (const double eps=1e-\/10)
\begin{DoxyCompactList}\small\item\em Create the \doxyref{Cross\+Entropy\+Error}{p.}{classmlpack_1_1ann_1_1CrossEntropyError} object. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Input\+Type , typename Target\+Type , typename Output\+Type $>$ }\\void \textbf{ Backward} (const Input\+Type \&input, const Target\+Type \&target, Output\+Type \&output)
\begin{DoxyCompactList}\small\item\em Ordinary feed backward pass of a neural network. \end{DoxyCompactList}\item 
double \textbf{ Eps} () const
\begin{DoxyCompactList}\small\item\em Get the epsilon. \end{DoxyCompactList}\item 
double \& \textbf{ Eps} ()
\begin{DoxyCompactList}\small\item\em Modify the epsilon. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Input\+Type , typename Target\+Type $>$ }\\Input\+Type\+::elem\+\_\+type \textbf{ Forward} (const Input\+Type \&input, const Target\+Type \&target)
\begin{DoxyCompactList}\small\item\em Computes the cross-\/entropy function. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Output\+Parameter} () const
\begin{DoxyCompactList}\small\item\em Get the output parameter. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Output\+Parameter} ()
\begin{DoxyCompactList}\small\item\em Modify the output parameter. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Archive $>$ }\\void \textbf{ serialize} (Archive \&ar, const unsigned int)
\begin{DoxyCompactList}\small\item\em Serialize the layer. \end{DoxyCompactList}\end{DoxyCompactItemize}


\subsection{Detailed Description}
\subsubsection*{template$<$typename Input\+Data\+Type = arma\+::mat, typename Output\+Data\+Type = arma\+::mat$>$\newline
class mlpack\+::ann\+::\+Cross\+Entropy\+Error$<$ Input\+Data\+Type, Output\+Data\+Type $>$}

The cross-\/entropy performance function measures the network\textquotesingle{}s performance according to the cross-\/entropy between the input and target distributions. 


\begin{DoxyTemplParams}{Template Parameters}
{\em Input\+Data\+Type} & Type of the input data (arma\+::colvec, arma\+::mat, arma\+::sp\+\_\+mat or arma\+::cube). \\
\hline
{\em Output\+Data\+Type} & Type of the output data (arma\+::colvec, arma\+::mat, arma\+::sp\+\_\+mat or arma\+::cube). \\
\hline
\end{DoxyTemplParams}


Definition at line 34 of file cross\+\_\+entropy\+\_\+error.\+hpp.
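Concretely, for an input (prediction) vector $\mathbf{x}$ and target vector $\mathbf{t}$, the loss implemented by this class is, per the mlpack 3.\+3.\+2 source, the summed elementwise binary cross-\/entropy, with the small constant $\epsilon$ (the constructor's {\ttfamily eps}) guarding the logarithms:
\[
L(\mathbf{x}, \mathbf{t}) = -\sum_i \left[\, t_i \log(x_i + \epsilon) + (1 - t_i) \log(1 - x_i + \epsilon) \,\right]
\]
\doxyref{Backward}{p.}{classmlpack_1_1ann_1_1CrossEntropyError_a7a5e88245fe9cf5644f846902393e97a} returns the elementwise derivative of this expression, $\partial L / \partial x_i = (1 - t_i)/(1 - x_i + \epsilon) - t_i/(x_i + \epsilon)$.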



\subsection{Constructor \& Destructor Documentation}
\mbox{\label{classmlpack_1_1ann_1_1CrossEntropyError_a45aac4ebfddcf693a90557538c477597}} 
\index{mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}!Cross\+Entropy\+Error@{Cross\+Entropy\+Error}}
\index{Cross\+Entropy\+Error@{Cross\+Entropy\+Error}!mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}}
\subsubsection{Cross\+Entropy\+Error()}
{\footnotesize\ttfamily \textbf{ Cross\+Entropy\+Error} (\begin{DoxyParamCaption}\item[{const double}]{eps = {\ttfamily 1e-\/10} }\end{DoxyParamCaption})}



Create the \doxyref{Cross\+Entropy\+Error}{p.}{classmlpack_1_1ann_1_1CrossEntropyError} object. 


\begin{DoxyParams}{Parameters}
{\em eps} & Small constant added inside logarithms and denominators so they remain numerically stable, avoiding $\log(0)$ and division by zero. \\
\hline
\end{DoxyParams}


\subsection{Member Function Documentation}
\mbox{\label{classmlpack_1_1ann_1_1CrossEntropyError_a7a5e88245fe9cf5644f846902393e97a}} 
\index{mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}!Backward@{Backward}}
\index{Backward@{Backward}!mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}}
\subsubsection{Backward()}
{\footnotesize\ttfamily void Backward (\begin{DoxyParamCaption}\item[{const Input\+Type \&}]{input,  }\item[{const Target\+Type \&}]{target,  }\item[{Output\+Type \&}]{output }\end{DoxyParamCaption})}



Ordinary feed backward pass of a neural network\+: computes the gradient of the cross-\/entropy loss with respect to the input. 


\begin{DoxyParams}{Parameters}
{\em input} & The propagated input activation. \\
\hline
{\em target} & The target vector. \\
\hline
{\em output} & The computed gradient of the loss with respect to the input. \\
\hline
\end{DoxyParams}
\mbox{\label{classmlpack_1_1ann_1_1CrossEntropyError_a6b1a203165d5e3a6a30534a95c5ea339}} 
\index{mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}!Eps@{Eps}}
\index{Eps@{Eps}!mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}}
\subsubsection{Eps()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily double Eps (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the epsilon. 



Definition at line 73 of file cross\+\_\+entropy\+\_\+error.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1CrossEntropyError_ab50f77742d49705ce1a2a0fa1feff24e}} 
\index{mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}!Eps@{Eps}}
\index{Eps@{Eps}!mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}}
\subsubsection{Eps()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily double\& Eps (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the epsilon. 



Definition at line 75 of file cross\+\_\+entropy\+\_\+error.\+hpp.



References Cross\+Entropy\+Error$<$ Input\+Data\+Type, Output\+Data\+Type $>$\+::serialize().

\mbox{\label{classmlpack_1_1ann_1_1CrossEntropyError_aad9536a75d4ecfe220d313adc47f38fa}} 
\index{mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}!Forward@{Forward}}
\index{Forward@{Forward}!mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}}
\subsubsection{Forward()}
{\footnotesize\ttfamily Input\+Type\+::elem\+\_\+type Forward (\begin{DoxyParamCaption}\item[{const Input\+Type \&}]{input,  }\item[{const Target\+Type \&}]{target }\end{DoxyParamCaption})}



Computes the cross-\/entropy function. 


\begin{DoxyParams}{Parameters}
{\em input} & Input data (the predicted values) used for evaluating the specified function. \\
\hline
{\em target} & The target vector. \\
\hline
\end{DoxyParams}
\begin{DoxyReturn}{Returns}
The computed cross-\/entropy error value.
\end{DoxyReturn}
\mbox{\label{classmlpack_1_1ann_1_1CrossEntropyError_a8bae962cc603d1cab8d80ec78f8d505d}} 
\index{mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}!Output\+Parameter@{Output\+Parameter}}
\index{Output\+Parameter@{Output\+Parameter}!mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}}
\subsubsection{Output\+Parameter()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Output\+Parameter (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the output parameter. 



Definition at line 68 of file cross\+\_\+entropy\+\_\+error.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1CrossEntropyError_a21d5f745f02c709625a4ee0907f004a5}} 
\index{mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}!Output\+Parameter@{Output\+Parameter}}
\index{Output\+Parameter@{Output\+Parameter}!mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}}
\subsubsection{Output\+Parameter()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Output\+Parameter (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the output parameter. 



Definition at line 70 of file cross\+\_\+entropy\+\_\+error.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1CrossEntropyError_af0dd9205158ccf7bcfcd8ff81f79c927}} 
\index{mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}!serialize@{serialize}}
\index{serialize@{serialize}!mlpack\+::ann\+::\+Cross\+Entropy\+Error@{mlpack\+::ann\+::\+Cross\+Entropy\+Error}}
\subsubsection{serialize()}
{\footnotesize\ttfamily void serialize (\begin{DoxyParamCaption}\item[{Archive \&}]{ar,  }\item[{const unsigned}]{int }\end{DoxyParamCaption})}



Serialize the layer. 



Referenced by Cross\+Entropy\+Error$<$ Input\+Data\+Type, Output\+Data\+Type $>$\+::\+Eps().



The documentation for this class was generated from the following file\+:\begin{DoxyCompactItemize}
\item 
/var/www/mlpack.\+ratml.\+org/mlpack.\+org/\+\_\+src/mlpack-\/3.\+3.\+2/src/mlpack/methods/ann/loss\+\_\+functions/\textbf{ cross\+\_\+entropy\+\_\+error.\+hpp}\end{DoxyCompactItemize}
