\section{Hard\+Shrink$<$ Input\+Data\+Type, Output\+Data\+Type $>$ Class Template Reference}
\label{classmlpack_1_1ann_1_1HardShrink}\index{Hard\+Shrink$<$ Input\+Data\+Type, Output\+Data\+Type $>$@{Hard\+Shrink$<$ Input\+Data\+Type, Output\+Data\+Type $>$}}


Hard Shrink operator is defined as, \begin{eqnarray*} f(x) &=& \left\{ \begin{array}{lr} x & : x > \lambda \\ x & : x < -\lambda \\ 0 & : \mbox{otherwise} \end{array} \right. \\ f'(x) &=& \left\{ \begin{array}{lr} 1 & : x > \lambda \\ 1 & : x < -\lambda \\ 0 & : \mbox{otherwise} \end{array} \right. \end{eqnarray*} $\lambda$ is set to 0.\+5 by default.


\subsection*{Public Member Functions}
\begin{DoxyCompactItemize}
\item 
\textbf{ Hard\+Shrink} (const double lambda=0.\+5)
\begin{DoxyCompactList}\small\item\em Create a \doxyref{Hard\+Shrink}{p.}{classmlpack_1_1ann_1_1HardShrink} object using the specified hyperparameter lambda. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Data\+Type $>$ }\\void \textbf{ Backward} (const Data\+Type \&input, Data\+Type \&gy, Data\+Type \&g)
\begin{DoxyCompactList}\small\item\em Ordinary feed backward pass of a neural network, calculating the derivative of the function f(x) and propagating the error backwards through f. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Delta} () const
\begin{DoxyCompactList}\small\item\em Get the delta. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Delta} ()
\begin{DoxyCompactList}\small\item\em Modify the delta. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Input\+Type , typename Output\+Type $>$ }\\void \textbf{ Forward} (const Input\+Type \&input, Output\+Type \&output)
\begin{DoxyCompactList}\small\item\em Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f. \end{DoxyCompactList}\item 
double const  \& \textbf{ Lambda} () const
\begin{DoxyCompactList}\small\item\em Get the hyperparameter lambda. \end{DoxyCompactList}\item 
double \& \textbf{ Lambda} ()
\begin{DoxyCompactList}\small\item\em Modify the hyperparameter lambda. \end{DoxyCompactList}\item 
Output\+Data\+Type const  \& \textbf{ Output\+Parameter} () const
\begin{DoxyCompactList}\small\item\em Get the output parameter. \end{DoxyCompactList}\item 
Output\+Data\+Type \& \textbf{ Output\+Parameter} ()
\begin{DoxyCompactList}\small\item\em Modify the output parameter. \end{DoxyCompactList}\item 
{\footnotesize template$<$typename Archive $>$ }\\void \textbf{ serialize} (Archive \&ar, const unsigned int)
\begin{DoxyCompactList}\small\item\em Serialize the layer. \end{DoxyCompactList}\end{DoxyCompactItemize}


\subsection{Detailed Description}
\subsubsection*{template$<$typename Input\+Data\+Type = arma\+::mat, typename Output\+Data\+Type = arma\+::mat$>$\newline
class mlpack\+::ann\+::\+Hard\+Shrink$<$ Input\+Data\+Type, Output\+Data\+Type $>$}

Hard Shrink operator is defined as, \begin{eqnarray*} f(x) &=& \left\{ \begin{array}{lr} x & : x > \lambda \\ x & : x < -\lambda \\ 0 & : \mbox{otherwise} \end{array} \right. \\ f'(x) &=& \left\{ \begin{array}{lr} 1 & : x > \lambda \\ 1 & : x < -\lambda \\ 0 & : \mbox{otherwise} \end{array} \right. \end{eqnarray*} $\lambda$ is set to 0.\+5 by default. 

Definition at line 48 of file hardshrink.\+hpp.



\subsection{Constructor \& Destructor Documentation}
\mbox{\label{classmlpack_1_1ann_1_1HardShrink_a498f1df42a9d937e5025db81f6e63783}} 
\index{mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}!Hard\+Shrink@{Hard\+Shrink}}
\index{Hard\+Shrink@{Hard\+Shrink}!mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}}
\subsubsection{Hard\+Shrink()}
{\footnotesize\ttfamily \textbf{ Hard\+Shrink} (\begin{DoxyParamCaption}\item[{const double}]{lambda = {\ttfamily 0.5} }\end{DoxyParamCaption})}



Create a \doxyref{Hard\+Shrink}{p.}{classmlpack_1_1ann_1_1HardShrink} object using the specified hyperparameter lambda. 


\begin{DoxyParams}{Parameters}
{\em lambda} & Hyperparameter that sets the shrinkage threshold; it can be computed by multiplying the noise level sigma of the input (noisy image) by a coefficient \textquotesingle{}a\textquotesingle{}, which is one of the training parameters. The default value of lambda is 0.\+5. \\
\hline
\end{DoxyParams}


\subsection{Member Function Documentation}
\mbox{\label{classmlpack_1_1ann_1_1HardShrink_a3ad74424be92ee20e633e1008e08004b}} 
\index{mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}!Backward@{Backward}}
\index{Backward@{Backward}!mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}}
\subsubsection{Backward()}
{\footnotesize\ttfamily void Backward (\begin{DoxyParamCaption}\item[{const Data\+Type \&}]{input,  }\item[{Data\+Type \&}]{gy,  }\item[{Data\+Type \&}]{g }\end{DoxyParamCaption})}



Ordinary feed backward pass of a neural network, calculating the derivative of the function f(x) and propagating the error backwards through f. 

Uses the results from the feed forward pass.


\begin{DoxyParams}{Parameters}
{\em input} & The propagated input activation f(x). \\
\hline
{\em gy} & The backpropagated error. \\
\hline
{\em g} & The calculated gradient. \\
\hline
\end{DoxyParams}
\mbox{\label{classmlpack_1_1ann_1_1HardShrink_a797f7edb44dd081e5e2b3cc316eef6bd}} 
\index{mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}!Delta@{Delta}}
\index{Delta@{Delta}!mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}}
\subsubsection{Delta()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Delta (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the delta. 



Definition at line 91 of file hardshrink.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1HardShrink_ad6601342d560219ce951d554e69e5e87}} 
\index{mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}!Delta@{Delta}}
\index{Delta@{Delta}!mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}}
\subsubsection{Delta()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Delta (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the delta. 



Definition at line 93 of file hardshrink.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1HardShrink_a09440df0a90bdcc766e56e097d91205b}} 
\index{mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}!Forward@{Forward}}
\index{Forward@{Forward}!mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}}
\subsubsection{Forward()}
{\footnotesize\ttfamily void Forward (\begin{DoxyParamCaption}\item[{const Input\+Type \&}]{input,  }\item[{Output\+Type \&}]{output }\end{DoxyParamCaption})}



Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f. 


\begin{DoxyParams}{Parameters}
{\em input} & Input data used for evaluating the Hard Shrink function. \\
\hline
{\em output} & Resulting output activation. \\
\hline
\end{DoxyParams}
\mbox{\label{classmlpack_1_1ann_1_1HardShrink_acb669457ad59e62d0fccc5bae3a6c35e}} 
\index{mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}!Lambda@{Lambda}}
\index{Lambda@{Lambda}!mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}}
\subsubsection{Lambda()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily double const\& Lambda (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the hyperparameter lambda. 



Definition at line 96 of file hardshrink.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1HardShrink_aaf66629b989a326453647f42443c6a0c}} 
\index{mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}!Lambda@{Lambda}}
\index{Lambda@{Lambda}!mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}}
\subsubsection{Lambda()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily double\& Lambda (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the hyperparameter lambda. 



Definition at line 98 of file hardshrink.\+hpp.



References Hard\+Shrink$<$ Input\+Data\+Type, Output\+Data\+Type $>$\+::serialize().

\mbox{\label{classmlpack_1_1ann_1_1HardShrink_a0ee21c2a36e5abad1e7a9d5dd00849f9}} 
\index{mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}!Output\+Parameter@{Output\+Parameter}}
\index{Output\+Parameter@{Output\+Parameter}!mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}}
\subsubsection{Output\+Parameter()\hspace{0.1cm}{\footnotesize\ttfamily [1/2]}}
{\footnotesize\ttfamily Output\+Data\+Type const\& Output\+Parameter (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption}) const\hspace{0.3cm}{\ttfamily [inline]}}



Get the output parameter. 



Definition at line 86 of file hardshrink.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1HardShrink_a21d5f745f02c709625a4ee0907f004a5}} 
\index{mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}!Output\+Parameter@{Output\+Parameter}}
\index{Output\+Parameter@{Output\+Parameter}!mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}}
\subsubsection{Output\+Parameter()\hspace{0.1cm}{\footnotesize\ttfamily [2/2]}}
{\footnotesize\ttfamily Output\+Data\+Type\& Output\+Parameter (\begin{DoxyParamCaption}{ }\end{DoxyParamCaption})\hspace{0.3cm}{\ttfamily [inline]}}



Modify the output parameter. 



Definition at line 88 of file hardshrink.\+hpp.

\mbox{\label{classmlpack_1_1ann_1_1HardShrink_af0dd9205158ccf7bcfcd8ff81f79c927}} 
\index{mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}!serialize@{serialize}}
\index{serialize@{serialize}!mlpack\+::ann\+::\+Hard\+Shrink@{mlpack\+::ann\+::\+Hard\+Shrink}}
\subsubsection{serialize()}
{\footnotesize\ttfamily void serialize (\begin{DoxyParamCaption}\item[{Archive \&}]{ar,  }\item[{const unsigned}]{int }\end{DoxyParamCaption})}



Serialize the layer. 



Referenced by Hard\+Shrink$<$ Input\+Data\+Type, Output\+Data\+Type $>$\+::\+Lambda().



The documentation for this class was generated from the following file\+:\begin{DoxyCompactItemize}
\item 
/var/www/mlpack.\+ratml.\+org/mlpack.\+org/\+\_\+src/mlpack-\/3.\+3.\+0/src/mlpack/methods/ann/layer/\textbf{ hardshrink.\+hpp}\end{DoxyCompactItemize}
