\section{Log\+Cosh\+Loss$<$ Input\+Data\+Type, Output\+Data\+Type $>$ Class Template Reference}
\label{classmlpack_1_1ann_1_1LogCoshLoss}\index{Log\+Cosh\+Loss$<$ Input\+Data\+Type, Output\+Data\+Type $>$@{Log\+Cosh\+Loss$<$ Input\+Data\+Type, Output\+Data\+Type $>$}}


The Log-\/\+Hyperbolic-\/\+Cosine loss function is often used to improve variational autoencoders.  


\subsection*{Public Member Functions}
\begin{DoxyCompactItemize}
\item 
\textbf{ Log\+Cosh\+Loss} (const double a=1.\+0)
\begin{DoxyCompactList}\small\item\em Create the Log-\/\+Hyperbolic-\/\+Cosine object with the specified parameters. \end{DoxyCompactList}\item 
double \textbf{ A} () const
\begin{DoxyCompactList}\small\item\em Get the value of hyperparameter a. \end{DoxyCompactList}\item 
double \& \textbf{ A} ()
\begin{DoxyCompactList}\small\item\em Modify the value of hyperparameter a. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Input\+Type , typename Target\+Type , typename Output\+Type $>$ }\\void \textbf{ Backward} (const Input\+Type \&input, const Target\+Type \&target, Output\+Type \&output)
\begin{DoxyCompactList}\small\item\em Ordinary feed backward pass of a neural network. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Input\+Type , typename Target\+Type $>$ }\\Input\+Type\+::elem\+\_\+type \textbf{ Forward} (const Input\+Type \&input, const Target\+Type \&target)
\begin{DoxyCompactList}\small\item\em Computes the Log-\/\+Hyperbolic-\/\+Cosine loss function. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Output\+Parameter} () const
\begin{DoxyCompactList}\small\item\em Get the output parameter. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Output\+Parameter} ()
\begin{DoxyCompactList}\small\item\em Modify the output parameter. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Archive $>$ }\\void \textbf{ serialize} (Archive \&ar, const unsigned int)
\begin{DoxyCompactList}\small\item\em Serialize the loss function. \end{DoxyCompactList}\end{DoxyCompactItemize}


\subsection{Detailed Description}
\subsubsection*{template$<$typename Input\+Data\+Type = arma\+::mat, typename Output\+Data\+Type = arma\+::mat$>$\newline
class mlpack\+::ann\+::\+Log\+Cosh\+Loss$<$ Input\+Data\+Type, Output\+Data\+Type $>$}

The Log-\/\+Hyperbolic-\/\+Cosine loss function is often used to improve variational autoencoders. 

This loss is the logarithm of the hyperbolic cosine of the difference between the true values and the predicted values.
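As a sketch of the mathematics (assuming the conventional log-cosh definition; the exact scaling used in log\+\_\+cosh\+\_\+loss.\+hpp may differ), for predictions $x$, targets $y$, and hyperparameter $a$, the loss summed over all elements is
\[
L(x, y) = \frac{1}{a} \sum_i \log\!\big(\cosh\!\big(a\,(y_i - x_i)\big)\big).
\]
For small errors this behaves approximately quadratically, $\tfrac{a}{2}(y_i - x_i)^2$, while for large errors it grows approximately linearly, $|y_i - x_i| - \log(2)/a$, which is why increasing $a$ sharpens the loss around the origin.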


\begin{DoxyTemplParams}{Template Parameters}
{\em Input\+Data\+Type} & Type of the input data (arma\+::colvec, arma\+::mat, arma\+::sp\+\_\+mat or arma\+::cube). \\
\hline
{\em Output\+Data\+Type} & Type of the output data (arma\+::colvec, arma\+::mat, arma\+::sp\+\_\+mat or arma\+::cube). \\
\hline
\end{DoxyTemplParams}


Definition at line 35 of file log\+\_\+cosh\+\_\+loss.\+hpp.



\subsection{Constructor \& Destructor Documentation}
\mbox{\label{classmlpack_1_1ann_1_1LogCoshLoss_aa786489fcdc270dd947b12eb8709c064}} 
\index{mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}!Log\+Cosh\+Loss@{Log\+Cosh\+Loss}}
\index{Log\+Cosh\+Loss@{Log\+Cosh\+Loss}!mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}}
\subsubsection{Log\+Cosh\+Loss()}
{\footnotesize\ttfamily \textbf{ Log\+Cosh\+Loss} (\begin{DoxyParamCaption}\item[{const double}]{a = {\ttfamily 1.0} }\end{DoxyParamCaption})}



Create the Log-\/\+Hyperbolic-\/\+Cosine object with the specified parameters. 


\begin{DoxyParams}{Parameters}
{\em a} & A positive real number that controls the smoothness of the loss function. The sharpness of the loss is directly proportional to a. It also acts as a scaling factor, making the loss function more sensitive to small errors around the origin. Default value = 1.\+0. \\
\hline
\end{DoxyParams}


\subsection{Member Function Documentation}
\mbox{\label{classmlpack_1_1ann_1_1LogCoshLoss_aab6c632054fc383ec1edf83231163bf7}} 
\index{mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}!A@{A}}
\index{A@{A}!mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}}
\subsubsection{A()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily double A (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the value of hyperparameter a. 



Definition at line 79 of file log\+\_\+cosh\+\_\+loss.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1LogCoshLoss_ad9111aa092ab0ee1b38e0369657d2bfa}} 
\index{mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}!A@{A}}
\index{A@{A}!mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}}
\subsubsection{A()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily double\& A (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the value of hyperparameter a. 



Definition at line 81 of file log\+\_\+cosh\+\_\+loss.\+hpp.



References Log\+Cosh\+Loss$<$ Input\+Data\+Type, Output\+Data\+Type $>$\+::serialize().

\mbox{\label{classmlpack_1_1ann_1_1LogCoshLoss_a7a5e88245fe9cf5644f846902393e97a}} 
\index{mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}!Backward@{Backward}}
\index{Backward@{Backward}!mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}}
\subsubsection{Backward()}
{\footnotesize\ttfamily void Backward (\begin{DoxyParamCaption}\item[{const Input\+Type \&}]{input,  }\item[{const Target\+Type \&}]{target,  }\item[{Output\+Type \&}]{output }\end{DoxyParamCaption})}



Ordinary feed backward pass of a neural network. 


\begin{DoxyParams}{Parameters}
{\em input} & The propagated input activation. \\
\hline
{\em target} & The target vector. \\
\hline
{\em output} & The calculated error. \\
\hline
\end{DoxyParams}
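As a hedged sketch of what this pass computes (assuming the conventional definition $L(x, y) = \tfrac{1}{a}\sum_i \log(\cosh(a\,(y_i - x_i)))$; the sign convention in the implementation may differ), the per-element derivative of the loss with respect to the input is
\[
\frac{\partial L}{\partial x_i} = -\tanh\!\big(a\,(y_i - x_i)\big),
\]
which stays bounded in $[-1, 1]$, making the gradient robust to outliers.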
\mbox{\label{classmlpack_1_1ann_1_1LogCoshLoss_aad9536a75d4ecfe220d313adc47f38fa}} 
\index{mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}!Forward@{Forward}}
\index{Forward@{Forward}!mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}}
\subsubsection{Forward()}
{\footnotesize\ttfamily Input\+Type\+::elem\+\_\+type Forward (\begin{DoxyParamCaption}\item[{const Input\+Type \&}]{input,  }\item[{const Target\+Type \&}]{target }\end{DoxyParamCaption})}



Computes the Log-\/\+Hyperbolic-\/\+Cosine loss function. 


\begin{DoxyParams}{Parameters}
{\em input} & Input data used for evaluating the specified function. \\
\hline
{\em target} & Target data to compare with. \\
\hline
\end{DoxyParams}
\mbox{\label{classmlpack_1_1ann_1_1LogCoshLoss_a8bae962cc603d1cab8d80ec78f8d505d}} 
\index{mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}!Output\+Parameter@{Output\+Parameter}}
\index{Output\+Parameter@{Output\+Parameter}!mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}}
\subsubsection{Output\+Parameter()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Output\+Parameter (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the output parameter. 



Definition at line 74 of file log\+\_\+cosh\+\_\+loss.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1LogCoshLoss_a21d5f745f02c709625a4ee0907f004a5}} 
\index{mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}!Output\+Parameter@{Output\+Parameter}}
\index{Output\+Parameter@{Output\+Parameter}!mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}}
\subsubsection{Output\+Parameter()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Output\+Parameter (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the output parameter. 



Definition at line 76 of file log\+\_\+cosh\+\_\+loss.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1LogCoshLoss_af0dd9205158ccf7bcfcd8ff81f79c927}} 
\index{mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}!serialize@{serialize}}
\index{serialize@{serialize}!mlpack\+::ann\+::\+Log\+Cosh\+Loss@{mlpack\+::ann\+::\+Log\+Cosh\+Loss}}
\subsubsection{serialize()}
{\footnotesize\ttfamily void serialize (\begin{DoxyParamCaption}\item[{Archive \&}]{ar,  }\item[{const unsigned}]{int }\end{DoxyParamCaption})}



Serialize the loss function. 



Referenced by Log\+Cosh\+Loss$<$ Input\+Data\+Type, Output\+Data\+Type $>$\+::\+A().



The documentation for this class was generated from the following file\+:\begin{DoxyCompactItemize}
\item 
/var/www/mlpack.\+ratml.\+org/mlpack.\+org/\+\_\+src/mlpack-\/3.\+3.\+0/src/mlpack/methods/ann/loss\+\_\+functions/\textbf{ log\+\_\+cosh\+\_\+loss.\+hpp}\end{DoxyCompactItemize}
