Documentation for 341ed50da8

This commit is contained in:
github-actions
2021-09-03 19:56:17 +00:00
parent 52d0c55145
commit 08e8eb037e
3432 changed files with 71392 additions and 72953 deletions


@@ -2,8 +2,8 @@
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/>
-<meta http-equiv="X-UA-Compatible" content="IE=9"/>
-<meta name="generator" content="Doxygen 1.9.1"/>
+<meta http-equiv="X-UA-Compatible" content="IE=11"/>
+<meta name="generator" content="Doxygen 1.9.2"/>
<meta name="viewport" content="width=device-width, initial-scale=1"/>
<title>Algorithms_in_C++: machine_learning/neural_network.cpp File Reference</title>
<link href="../../tabs.css" rel="stylesheet" type="text/css"/>
@@ -17,9 +17,9 @@
<script type="text/javascript" src="../../search/searchdata.js"></script>
<script type="text/javascript" src="../../search/search.js"></script>
<script type="text/x-mathjax-config">
-MathJax.Hub.Config({
-extensions: ["tex2jax.js", "TeX/AMSmath.js", "TeX/AMSsymbols.js"],
-jax: ["input/TeX","output/HTML-CSS"],
+MathJax.Hub.Config({
+extensions: ["tex2jax.js", "TeX/AMSmath.js", "TeX/AMSsymbols.js"],
+jax: ["input/TeX","output/HTML-CSS"],
});
</script>
<script type="text/javascript" async="async" src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.7/MathJax.js?config=TeX-MML-AM_CHTML/MathJax.js"></script>
@@ -32,8 +32,7 @@
<tbody>
<tr style="height: 56px;">
<td id="projectalign" style="padding-left: 0.5em;">
-<div id="projectname">Algorithms_in_C++
-&#160;<span id="projectnumber">1.0.0</span>
+<div id="projectname">Algorithms_in_C++<span id="projectnumber">&#160;1.0.0</span>
</div>
<div id="projectbrief">Set of algorithms implemented in C++.</div>
</td>
@@ -42,21 +41,22 @@
</table>
</div>
<!-- end header part -->
-<!-- Generated by Doxygen 1.9.1 -->
+<!-- Generated by Doxygen 1.9.2 -->
<script type="text/javascript">
-/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&amp;dn=gpl-2.0.txt GPL-v2 */
-var searchBox = new SearchBox("searchBox", "../../search",false,'Search','.html');
+/* @license magnet:?xt=urn:btih:d3d9a9a6595521f9666a5e94cc830dab83b65699&amp;dn=expat.txt MIT */
+var searchBox = new SearchBox("searchBox", "../../search",'Search','.html');
/* @license-end */
</script>
<script type="text/javascript" src="../../menudata.js"></script>
<script type="text/javascript" src="../../menu.js"></script>
<script type="text/javascript">
-/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&amp;dn=gpl-2.0.txt GPL-v2 */
+/* @license magnet:?xt=urn:btih:d3d9a9a6595521f9666a5e94cc830dab83b65699&amp;dn=expat.txt MIT */
$(function() {
initMenu('../../',true,false,'search.php','Search');
$(document).ready(function() { init_search(); });
});
-/* @license-end */</script>
+/* @license-end */
+</script>
<div id="main-nav"></div>
</div><!-- top -->
<div id="side-nav" class="ui-resizable side-nav-resizable">
@@ -70,7 +70,7 @@ $(function() {
</div>
</div>
<script type="text/javascript">
-/* @license magnet:?xt=urn:btih:cf05388f2679ee054f2beb29a391d25f4e673ac3&amp;dn=gpl-2.0.txt GPL-v2 */
+/* @license magnet:?xt=urn:btih:d3d9a9a6595521f9666a5e94cc830dab83b65699&amp;dn=expat.txt MIT */
$(document).ready(function(){initNavTree('d2/d58/neural__network_8cpp.html','../../'); initResizable(); });
/* @license-end */
</script>
@@ -94,12 +94,11 @@ $(document).ready(function(){initNavTree('d2/d58/neural__network_8cpp.html','../
<a href="#nested-classes">Classes</a> &#124;
<a href="#namespaces">Namespaces</a> &#124;
<a href="#func-members">Functions</a> </div>
-<div class="headertitle">
-<div class="title">neural_network.cpp File Reference</div> </div>
+<div class="headertitle"><div class="title">neural_network.cpp File Reference</div></div>
</div><!--header-->
<div class="contents">
-<p>Implementation of <a href="https://en.wikipedia.org/wiki/Multilayer_perceptron">Multilayer Perceptron</a>.
+<p>Implementation of <a href="https://en.wikipedia.org/wiki/Multilayer_perceptron" target="_blank">Multilayer Perceptron</a>.
<a href="#details">More...</a></p>
<div class="textblock"><code>#include &lt;algorithm&gt;</code><br />
<code>#include &lt;cassert&gt;</code><br />
@@ -119,32 +118,32 @@ Include dependency graph for neural_network.cpp:</div>
</div>
</div>
</div><table class="memberdecls">
-<tr class="heading"><td colspan="2"><h2 class="groupheader"><a name="nested-classes"></a>
+<tr class="heading"><td colspan="2"><h2 class="groupheader"><a id="nested-classes" name="nested-classes"></a>
Classes</h2></td></tr>
<tr class="memitem:"><td class="memItemLeft" align="right" valign="top">class &#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="../../dc/d93/classmachine__learning_1_1neural__network_1_1layers_1_1_dense_layer.html">machine_learning::neural_network::layers::DenseLayer</a></td></tr>
<tr class="separator:"><td class="memSeparator" colspan="2">&#160;</td></tr>
<tr class="memitem:"><td class="memItemLeft" align="right" valign="top">class &#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html">machine_learning::neural_network::NeuralNetwork</a></td></tr>
<tr class="separator:"><td class="memSeparator" colspan="2">&#160;</td></tr>
</table><table class="memberdecls">
-<tr class="heading"><td colspan="2"><h2 class="groupheader"><a name="namespaces"></a>
+<tr class="heading"><td colspan="2"><h2 class="groupheader"><a id="namespaces" name="namespaces"></a>
Namespaces</h2></td></tr>
-<tr class="memitem:d8/d77/namespacemachine__learning"><td class="memItemLeft" align="right" valign="top"> &#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="../../d8/d77/namespacemachine__learning.html">machine_learning</a></td></tr>
-<tr class="memdesc:d8/d77/namespacemachine__learning"><td class="mdescLeft">&#160;</td><td class="mdescRight"><a href="https://en.wikipedia.org/wiki/A*_search_algorithm">A* search algorithm</a> <br /></td></tr>
+<tr class="memitem:d8/d77/namespacemachine__learning"><td class="memItemLeft" align="right" valign="top">namespace &#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="../../d8/d77/namespacemachine__learning.html">machine_learning</a></td></tr>
+<tr class="memdesc:d8/d77/namespacemachine__learning"><td class="mdescLeft">&#160;</td><td class="mdescRight"><a href="https://en.wikipedia.org/wiki/A*_search_algorithm" target="_blank">A* search algorithm</a> <br /></td></tr>
<tr class="separator:"><td class="memSeparator" colspan="2">&#160;</td></tr>
-<tr class="memitem:d0/d2e/namespaceneural__network"><td class="memItemLeft" align="right" valign="top"> &#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="../../d0/d2e/namespaceneural__network.html">neural_network</a></td></tr>
+<tr class="memitem:d0/d2e/namespaceneural__network"><td class="memItemLeft" align="right" valign="top">namespace &#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="../../d0/d2e/namespaceneural__network.html">neural_network</a></td></tr>
<tr class="memdesc:d0/d2e/namespaceneural__network"><td class="mdescLeft">&#160;</td><td class="mdescRight">Neural Network or Multilayer Perceptron. <br /></td></tr>
<tr class="separator:"><td class="memSeparator" colspan="2">&#160;</td></tr>
-<tr class="memitem:d5/d39/namespaceactivations"><td class="memItemLeft" align="right" valign="top"> &#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="../../d5/d39/namespaceactivations.html">activations</a></td></tr>
+<tr class="memitem:d5/d39/namespaceactivations"><td class="memItemLeft" align="right" valign="top">namespace &#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="../../d5/d39/namespaceactivations.html">activations</a></td></tr>
<tr class="memdesc:d5/d39/namespaceactivations"><td class="mdescLeft">&#160;</td><td class="mdescRight">Various activation functions used in Neural network. <br /></td></tr>
<tr class="separator:"><td class="memSeparator" colspan="2">&#160;</td></tr>
-<tr class="memitem:d3/d17/namespaceutil__functions"><td class="memItemLeft" align="right" valign="top"> &#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="../../d3/d17/namespaceutil__functions.html">util_functions</a></td></tr>
+<tr class="memitem:d3/d17/namespaceutil__functions"><td class="memItemLeft" align="right" valign="top">namespace &#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="../../d3/d17/namespaceutil__functions.html">util_functions</a></td></tr>
<tr class="memdesc:d3/d17/namespaceutil__functions"><td class="mdescLeft">&#160;</td><td class="mdescRight">Various utility functions used in Neural network. <br /></td></tr>
<tr class="separator:"><td class="memSeparator" colspan="2">&#160;</td></tr>
-<tr class="memitem:d5/d2c/namespacelayers"><td class="memItemLeft" align="right" valign="top"> &#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="../../d5/d2c/namespacelayers.html">layers</a></td></tr>
+<tr class="memitem:d5/d2c/namespacelayers"><td class="memItemLeft" align="right" valign="top">namespace &#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="../../d5/d2c/namespacelayers.html">layers</a></td></tr>
<tr class="memdesc:d5/d2c/namespacelayers"><td class="mdescLeft">&#160;</td><td class="mdescRight">This namespace contains layers used in MLP. <br /></td></tr>
<tr class="separator:"><td class="memSeparator" colspan="2">&#160;</td></tr>
</table><table class="memberdecls">
-<tr class="heading"><td colspan="2"><h2 class="groupheader"><a name="func-members"></a>
+<tr class="heading"><td colspan="2"><h2 class="groupheader"><a id="func-members" name="func-members"></a>
Functions</h2></td></tr>
<tr class="memitem:a23aa9d32bcbcd65cfc85f0a41e2afadc"><td class="memItemLeft" align="right" valign="top">double&#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="../../d2/d58/neural__network_8cpp.html#a23aa9d32bcbcd65cfc85f0a41e2afadc">machine_learning::neural_network::activations::sigmoid</a> (const double &amp;x)</td></tr>
<tr class="separator:a23aa9d32bcbcd65cfc85f0a41e2afadc"><td class="memSeparator" colspan="2">&#160;</td></tr>
@@ -169,14 +168,14 @@ Functions</h2></td></tr>
<tr class="separator:ae66f6b31b5ad750f1fe042a706a4e3d4"><td class="memSeparator" colspan="2">&#160;</td></tr>
</table>
<a name="details" id="details"></a><h2 class="groupheader">Detailed Description</h2>
-<div class="textblock"><p>Implementation of <a href="https://en.wikipedia.org/wiki/Multilayer_perceptron">Multilayer Perceptron</a>. </p>
-<dl class="section author"><dt>Author</dt><dd><a href="https://github.com/imdeep2905">Deep Raval</a></dd></dl>
+<div class="textblock"><p >Implementation of <a href="https://en.wikipedia.org/wiki/Multilayer_perceptron" target="_blank">Multilayer Perceptron</a>. </p>
+<dl class="section author"><dt>Author</dt><dd><a href="https://github.com/imdeep2905" target="_blank">Deep Raval</a></dd></dl>
<p>A multilayer perceptron (MLP) is a class of feedforward artificial neural network (ANN). The term MLP is used ambiguously: sometimes loosely, to refer to any feedforward ANN, and sometimes strictly, to refer to networks composed of multiple layers of perceptrons (with threshold activation). Multilayer perceptrons are sometimes colloquially referred to as "vanilla" neural networks, especially when they have a single hidden layer.</p>
-<p>An MLP consists of at least three layers of nodes: an input layer, a hidden layer and an output layer. Except for the input nodes, each node is a neuron that uses a nonlinear activation function. MLP utilizes a supervised learning technique called backpropagation for training. Its multiple layers and non-linear activation distinguish MLP from a linear perceptron. It can distinguish data that is not linearly separable.</p>
-<p>See <a href="https://en.wikipedia.org/wiki/Backpropagation">Backpropagation</a> for training algorithm.</p>
+<p >An MLP consists of at least three layers of nodes: an input layer, a hidden layer and an output layer. Except for the input nodes, each node is a neuron that uses a nonlinear activation function. MLP utilizes a supervised learning technique called backpropagation for training. Its multiple layers and non-linear activation distinguish MLP from a linear perceptron. It can distinguish data that is not linearly separable.</p>
+<p >See <a href="https://en.wikipedia.org/wiki/Backpropagation" target="_blank">Backpropagation</a> for training algorithm.</p>
<dl class="section note"><dt>Note</dt><dd>This implementation uses mini-batch gradient descent as the optimizer and MSE as the loss function. Bias terms are not included. </dd></dl>
</div><h2 class="groupheader">Function Documentation</h2>
-<a id="aa69e95a34054d7989bf446f96b2ffaf9"></a>
+<a id="aa69e95a34054d7989bf446f96b2ffaf9" name="aa69e95a34054d7989bf446f96b2ffaf9"></a>
<h2 class="memtitle"><span class="permalink"><a href="#aa69e95a34054d7989bf446f96b2ffaf9">&#9670;&nbsp;</a></span>drelu()</h2>
<div class="memitem">
@@ -191,14 +190,14 @@ Functions</h2></td></tr>
</tr>
</table>
</div><div class="memdoc">
-<p>Derivative of relu function </p><dl class="params"><dt>Parameters</dt><dd>
+<p >Derivative of relu function </p><dl class="params"><dt>Parameters</dt><dd>
<table class="params">
<tr><td class="paramname">X</td><td>Value </td></tr>
</table>
</dd>
</dl>
<dl class="section return"><dt>Returns</dt><dd>derivative of relu(x) </dd></dl>
-<div class="fragment"><div class="line"><a name="l00081"></a><span class="lineno"> 81</span>&#160;{ <span class="keywordflow">return</span> x &gt;= 0.0 ? 1.0 : 0.0; }</div>
+<div class="fragment"><div class="line"><a id="l00081" name="l00081"></a><span class="lineno"> 81</span>{ <span class="keywordflow">return</span> x &gt;= 0.0 ? 1.0 : 0.0; }</div>
</div><!-- fragment --><div class="dynheader">
Here is the call graph for this function:</div>
<div class="dyncontent">
@@ -208,7 +207,7 @@ Here is the call graph for this function:</div>
</div>
</div>
-<a id="a76eb66212d577f948a457b6e29d87c46"></a>
+<a id="a76eb66212d577f948a457b6e29d87c46" name="a76eb66212d577f948a457b6e29d87c46"></a>
<h2 class="memtitle"><span class="permalink"><a href="#a76eb66212d577f948a457b6e29d87c46">&#9670;&nbsp;</a></span>dsigmoid()</h2>
<div class="memitem">
@@ -223,14 +222,14 @@ Here is the call graph for this function:</div>
</tr>
</table>
</div><div class="memdoc">
-<p>Derivative of sigmoid function </p><dl class="params"><dt>Parameters</dt><dd>
+<p >Derivative of sigmoid function </p><dl class="params"><dt>Parameters</dt><dd>
<table class="params">
<tr><td class="paramname">X</td><td>Value </td></tr>
</table>
</dd>
</dl>
<dl class="section return"><dt>Returns</dt><dd>Returns derivative of sigmoid(x) </dd></dl>
-<div class="fragment"><div class="line"><a name="l00067"></a><span class="lineno"> 67</span>&#160;{ <span class="keywordflow">return</span> x * (1 - x); }</div>
+<div class="fragment"><div class="line"><a id="l00067" name="l00067"></a><span class="lineno"> 67</span>{ <span class="keywordflow">return</span> x * (1 - x); }</div>
</div><!-- fragment --><div class="dynheader">
Here is the call graph for this function:</div>
<div class="dyncontent">
@@ -240,7 +239,7 @@ Here is the call graph for this function:</div>
</div>
</div>
-<a id="a2a5e874b9774aa5362dbcf288828b95c"></a>
+<a id="a2a5e874b9774aa5362dbcf288828b95c" name="a2a5e874b9774aa5362dbcf288828b95c"></a>
<h2 class="memtitle"><span class="permalink"><a href="#a2a5e874b9774aa5362dbcf288828b95c">&#9670;&nbsp;</a></span>dtanh()</h2>
<div class="memitem">
@@ -255,14 +254,14 @@ Here is the call graph for this function:</div>
</tr>
</table>
</div><div class="memdoc">
-<p>Derivative of tanh function </p><dl class="params"><dt>Parameters</dt><dd>
+<p >Derivative of tanh function </p><dl class="params"><dt>Parameters</dt><dd>
<table class="params">
<tr><td class="paramname">X</td><td>Value </td></tr>
</table>
</dd>
</dl>
<dl class="section return"><dt>Returns</dt><dd>Returns derivative of tanh(x) </dd></dl>
-<div class="fragment"><div class="line"><a name="l00095"></a><span class="lineno"> 95</span>&#160;{ <span class="keywordflow">return</span> 1 - x * x; }</div>
+<div class="fragment"><div class="line"><a id="l00095" name="l00095"></a><span class="lineno"> 95</span>{ <span class="keywordflow">return</span> 1 - x * x; }</div>
</div><!-- fragment --><div class="dynheader">
Here is the call graph for this function:</div>
<div class="dyncontent">
@@ -272,7 +271,7 @@ Here is the call graph for this function:</div>
</div>
</div>
-<a id="a32c00da08f2cf641dd336270f6e3c407"></a>
+<a id="a32c00da08f2cf641dd336270f6e3c407" name="a32c00da08f2cf641dd336270f6e3c407"></a>
<h2 class="memtitle"><span class="permalink"><a href="#a32c00da08f2cf641dd336270f6e3c407">&#9670;&nbsp;</a></span>identity_function()</h2>
<div class="memitem">
@@ -287,14 +286,14 @@ Here is the call graph for this function:</div>
</tr>
</table>
</div><div class="memdoc">
-<p>Identity function </p><dl class="params"><dt>Parameters</dt><dd>
+<p >Identity function </p><dl class="params"><dt>Parameters</dt><dd>
<table class="params">
<tr><td class="paramname">X</td><td>Value </td></tr>
</table>
</dd>
</dl>
<dl class="section return"><dt>Returns</dt><dd>Returns x </dd></dl>
-<div class="fragment"><div class="line"><a name="l00112"></a><span class="lineno"> 112</span>&#160;{ <span class="keywordflow">return</span> x; }</div>
+<div class="fragment"><div class="line"><a id="l00112" name="l00112"></a><span class="lineno"> 112</span>{ <span class="keywordflow">return</span> x; }</div>
</div><!-- fragment --><div class="dynheader">
Here is the call graph for this function:</div>
<div class="dyncontent">
@@ -304,7 +303,7 @@ Here is the call graph for this function:</div>
</div>
</div>
-<a id="ae66f6b31b5ad750f1fe042a706a4e3d4"></a>
+<a id="ae66f6b31b5ad750f1fe042a706a4e3d4" name="ae66f6b31b5ad750f1fe042a706a4e3d4"></a>
<h2 class="memtitle"><span class="permalink"><a href="#ae66f6b31b5ad750f1fe042a706a4e3d4">&#9670;&nbsp;</a></span>main()</h2>
<div class="memitem">
@@ -322,11 +321,11 @@ Here is the call graph for this function:</div>
<p>Main function. </p>
<dl class="section return"><dt>Returns</dt><dd>0 on exit </dd></dl>
-<div class="fragment"><div class="line"><a name="l00833"></a><span class="lineno"> 833</span>&#160; {</div>
-<div class="line"><a name="l00834"></a><span class="lineno"> 834</span>&#160; <span class="comment">// Testing</span></div>
-<div class="line"><a name="l00835"></a><span class="lineno"> 835</span>&#160; <a class="code" href="../../d2/d58/neural__network_8cpp.html#aa8dca7b867074164d5f45b0f3851269d">test</a>();</div>
-<div class="line"><a name="l00836"></a><span class="lineno"> 836</span>&#160; <span class="keywordflow">return</span> 0;</div>
-<div class="line"><a name="l00837"></a><span class="lineno"> 837</span>&#160;}</div>
+<div class="fragment"><div class="line"><a id="l00833" name="l00833"></a><span class="lineno"> 833</span> {</div>
+<div class="line"><a id="l00834" name="l00834"></a><span class="lineno"> 834</span> <span class="comment">// Testing</span></div>
+<div class="line"><a id="l00835" name="l00835"></a><span class="lineno"> 835</span> <a class="code hl_function" href="../../d2/d58/neural__network_8cpp.html#aa8dca7b867074164d5f45b0f3851269d">test</a>();</div>
+<div class="line"><a id="l00836" name="l00836"></a><span class="lineno"> 836</span> <span class="keywordflow">return</span> 0;</div>
+<div class="line"><a id="l00837" name="l00837"></a><span class="lineno"> 837</span>}</div>
<div class="ttc" id="aneural__network_8cpp_html_aa8dca7b867074164d5f45b0f3851269d"><div class="ttname"><a href="../../d2/d58/neural__network_8cpp.html#aa8dca7b867074164d5f45b0f3851269d">test</a></div><div class="ttdeci">static void test()</div><div class="ttdef"><b>Definition:</b> neural_network.cpp:805</div></div>
</div><!-- fragment --><div class="dynheader">
Here is the call graph for this function:</div>
@@ -337,7 +336,7 @@ Here is the call graph for this function:</div>
</div>
</div>
-<a id="af8f264600754602b6a9ea19cc690e50e"></a>
+<a id="af8f264600754602b6a9ea19cc690e50e" name="af8f264600754602b6a9ea19cc690e50e"></a>
<h2 class="memtitle"><span class="permalink"><a href="#af8f264600754602b6a9ea19cc690e50e">&#9670;&nbsp;</a></span>relu()</h2>
<div class="memitem">
@@ -352,14 +351,14 @@ Here is the call graph for this function:</div>
</tr>
</table>
</div><div class="memdoc">
-<p>Relu function </p><dl class="params"><dt>Parameters</dt><dd>
+<p >Relu function </p><dl class="params"><dt>Parameters</dt><dd>
<table class="params">
<tr><td class="paramname">X</td><td>Value </td></tr>
</table>
</dd>
</dl>
<dl class="section return"><dt>Returns</dt><dd>relu(x) </dd></dl>
-<div class="fragment"><div class="line"><a name="l00074"></a><span class="lineno"> 74</span>&#160;{ <span class="keywordflow">return</span> <a class="codeRef" target="_blank" href="http://en.cppreference.com/w/cpp/algorithm/max.html">std::max</a>(0.0, x); }</div>
+<div class="fragment"><div class="line"><a id="l00074" name="l00074"></a><span class="lineno"> 74</span>{ <span class="keywordflow">return</span> <a class="code hl_functionRef" target="_blank" href="http://en.cppreference.com/w/cpp/algorithm/max.html">std::max</a>(0.0, x); }</div>
<div class="ttc" id="amax_html"><div class="ttname"><a href="http://en.cppreference.com/w/cpp/algorithm/max.html">std::max</a></div><div class="ttdeci">T max(T... args)</div></div>
</div><!-- fragment --><div class="dynheader">
Here is the call graph for this function:</div>
@@ -370,7 +369,7 @@ Here is the call graph for this function:</div>
</div>
</div>
-<a id="a23aa9d32bcbcd65cfc85f0a41e2afadc"></a>
+<a id="a23aa9d32bcbcd65cfc85f0a41e2afadc" name="a23aa9d32bcbcd65cfc85f0a41e2afadc"></a>
<h2 class="memtitle"><span class="permalink"><a href="#a23aa9d32bcbcd65cfc85f0a41e2afadc">&#9670;&nbsp;</a></span>sigmoid()</h2>
<div class="memitem">
@@ -385,14 +384,14 @@ Here is the call graph for this function:</div>
</tr>
</table>
</div><div class="memdoc">
-<p>Sigmoid function </p><dl class="params"><dt>Parameters</dt><dd>
+<p >Sigmoid function </p><dl class="params"><dt>Parameters</dt><dd>
<table class="params">
<tr><td class="paramname">X</td><td>Value </td></tr>
</table>
</dd>
</dl>
<dl class="section return"><dt>Returns</dt><dd>Returns sigmoid(x) </dd></dl>
-<div class="fragment"><div class="line"><a name="l00060"></a><span class="lineno"> 60</span>&#160;{ <span class="keywordflow">return</span> 1.0 / (1.0 + <a class="codeRef" target="_blank" href="http://en.cppreference.com/w/cpp/numeric/math/exp.html">std::exp</a>(-x)); }</div>
+<div class="fragment"><div class="line"><a id="l00060" name="l00060"></a><span class="lineno"> 60</span>{ <span class="keywordflow">return</span> 1.0 / (1.0 + <a class="code hl_functionRef" target="_blank" href="http://en.cppreference.com/w/cpp/numeric/math/exp.html">std::exp</a>(-x)); }</div>
<div class="ttc" id="aexp_html"><div class="ttname"><a href="http://en.cppreference.com/w/cpp/numeric/math/exp.html">std::exp</a></div><div class="ttdeci">T exp(T... args)</div></div>
</div><!-- fragment --><div class="dynheader">
Here is the call graph for this function:</div>
@@ -403,7 +402,7 @@ Here is the call graph for this function:</div>
</div>
</div>
-<a id="a45d3e30406712ada3d9713ece3c1b153"></a>
+<a id="a45d3e30406712ada3d9713ece3c1b153" name="a45d3e30406712ada3d9713ece3c1b153"></a>
<h2 class="memtitle"><span class="permalink"><a href="#a45d3e30406712ada3d9713ece3c1b153">&#9670;&nbsp;</a></span>square()</h2>
<div class="memitem">
@@ -418,14 +417,14 @@ Here is the call graph for this function:</div>
</tr>
</table>
</div><div class="memdoc">
-<p>Square function </p><dl class="params"><dt>Parameters</dt><dd>
+<p >Square function </p><dl class="params"><dt>Parameters</dt><dd>
<table class="params">
<tr><td class="paramname">X</td><td>Value </td></tr>
</table>
</dd>
</dl>
<dl class="section return"><dt>Returns</dt><dd>Returns x * x </dd></dl>
-<div class="fragment"><div class="line"><a name="l00106"></a><span class="lineno"> 106</span>&#160;{ <span class="keywordflow">return</span> x * x; }</div>
+<div class="fragment"><div class="line"><a id="l00106" name="l00106"></a><span class="lineno"> 106</span>{ <span class="keywordflow">return</span> x * x; }</div>
</div><!-- fragment --><div class="dynheader">
Here is the call graph for this function:</div>
<div class="dyncontent">
@@ -435,7 +434,7 @@ Here is the call graph for this function:</div>
</div>
</div>
-<a id="a371aa7dd5d5add0143d1756bb0a1b32f"></a>
+<a id="a371aa7dd5d5add0143d1756bb0a1b32f" name="a371aa7dd5d5add0143d1756bb0a1b32f"></a>
<h2 class="memtitle"><span class="permalink"><a href="#a371aa7dd5d5add0143d1756bb0a1b32f">&#9670;&nbsp;</a></span>tanh()</h2>
<div class="memitem">
@@ -450,14 +449,14 @@ Here is the call graph for this function:</div>
</tr>
</table>
</div><div class="memdoc">
-<p>Tanh function </p><dl class="params"><dt>Parameters</dt><dd>
+<p >Tanh function </p><dl class="params"><dt>Parameters</dt><dd>
<table class="params">
<tr><td class="paramname">X</td><td>Value </td></tr>
</table>
</dd>
</dl>
<dl class="section return"><dt>Returns</dt><dd>Returns tanh(x) </dd></dl>
-<div class="fragment"><div class="line"><a name="l00088"></a><span class="lineno"> 88</span>&#160;{ <span class="keywordflow">return</span> 2 / (1 + <a class="codeRef" target="_blank" href="http://en.cppreference.com/w/cpp/numeric/math/exp.html">std::exp</a>(-2 * x)) - 1; }</div>
+<div class="fragment"><div class="line"><a id="l00088" name="l00088"></a><span class="lineno"> 88</span>{ <span class="keywordflow">return</span> 2 / (1 + <a class="code hl_functionRef" target="_blank" href="http://en.cppreference.com/w/cpp/numeric/math/exp.html">std::exp</a>(-2 * x)) - 1; }</div>
</div><!-- fragment --><div class="dynheader">
Here is the call graph for this function:</div>
<div class="dyncontent">
@@ -467,7 +466,7 @@ Here is the call graph for this function:</div>
</div>
</div>
-<a id="aa8dca7b867074164d5f45b0f3851269d"></a>
+<a id="aa8dca7b867074164d5f45b0f3851269d" name="aa8dca7b867074164d5f45b0f3851269d"></a>
<h2 class="memtitle"><span class="permalink"><a href="#aa8dca7b867074164d5f45b0f3851269d">&#9670;&nbsp;</a></span>test()</h2>
<div class="memitem">
@@ -489,35 +488,35 @@ Here is the call graph for this function:</div>
</tr>
</table>
</div><div class="memdoc">
-<p>Function to test neural network </p><dl class="section return"><dt>Returns</dt><dd>none </dd></dl>
-<div class="fragment"><div class="line"><a name="l00805"></a><span class="lineno"> 805</span>&#160; {</div>
-<div class="line"><a name="l00806"></a><span class="lineno"> 806</span>&#160; <span class="comment">// Creating network with 3 layers for &quot;iris.csv&quot;</span></div>
-<div class="line"><a name="l00807"></a><span class="lineno"> 807</span>&#160; <a class="code" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html">machine_learning::neural_network::NeuralNetwork</a> myNN =</div>
-<div class="line"><a name="l00808"></a><span class="lineno"> 808</span>&#160; <a class="code" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html">machine_learning::neural_network::NeuralNetwork</a>({</div>
-<div class="line"><a name="l00809"></a><span class="lineno"> 809</span>&#160; {4, <span class="stringliteral">&quot;none&quot;</span>}, <span class="comment">// First layer with 4 neurons and &quot;none&quot; as activation</span></div>
-<div class="line"><a name="l00810"></a><span class="lineno"> 810</span>&#160; {6,</div>
-<div class="line"><a name="l00811"></a><span class="lineno"> 811</span>&#160; <span class="stringliteral">&quot;relu&quot;</span>}, <span class="comment">// Second layer with 6 neurons and &quot;relu&quot; as activation</span></div>
-<div class="line"><a name="l00812"></a><span class="lineno"> 812</span>&#160; {3, <span class="stringliteral">&quot;sigmoid&quot;</span>} <span class="comment">// Third layer with 3 neurons and &quot;sigmoid&quot; as</span></div>
-<div class="line"><a name="l00813"></a><span class="lineno"> 813</span>&#160; <span class="comment">// activation</span></div>
-<div class="line"><a name="l00814"></a><span class="lineno"> 814</span>&#160; });</div>
-<div class="line"><a name="l00815"></a><span class="lineno"> 815</span>&#160; <span class="comment">// Printing summary of model</span></div>
-<div class="line"><a name="l00816"></a><span class="lineno"> 816</span>&#160; myNN.<a class="code" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html#a61d30113d13304c664057118b92a5931">summary</a>();</div>
-<div class="line"><a name="l00817"></a><span class="lineno"> 817</span>&#160; <span class="comment">// Training Model</span></div>
-<div class="line"><a name="l00818"></a><span class="lineno"> 818</span>&#160; myNN.<a class="code" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html#a5172a6791b9bd24f4232bab8d6b81fff">fit_from_csv</a>(<span class="stringliteral">&quot;iris.csv&quot;</span>, <span class="keyword">true</span>, 100, 0.3, <span class="keyword">false</span>, 2, 32, <span class="keyword">true</span>);</div>
-<div class="line"><a name="l00819"></a><span class="lineno"> 819</span>&#160; <span class="comment">// Testing predictions of model</span></div>
-<div class="line"><a name="l00820"></a><span class="lineno"> 820</span>&#160; assert(<a class="code" href="../../d8/d77/namespacemachine__learning.html#a1b42d24ad7bedbfa8e5b59fe96987a44">machine_learning::argmax</a>(</div>
-<div class="line"><a name="l00821"></a><span class="lineno"> 821</span>&#160; myNN.<a class="code" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html#ac130322a5abb1ff763b7c1a55405a35e">single_predict</a>({{5, 3.4, 1.6, 0.4}})) == 0);</div>
-<div class="line"><a name="l00822"></a><span class="lineno"> 822</span>&#160; assert(<a class="code" href="../../d8/d77/namespacemachine__learning.html#a1b42d24ad7bedbfa8e5b59fe96987a44">machine_learning::argmax</a>(</div>
-<div class="line"><a name="l00823"></a><span class="lineno"> 823</span>&#160; myNN.<a class="code" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html#ac130322a5abb1ff763b7c1a55405a35e">single_predict</a>({{6.4, 2.9, 4.3, 1.3}})) == 1);</div>
-<div class="line"><a name="l00824"></a><span class="lineno"> 824</span>&#160; assert(<a class="code" href="../../d8/d77/namespacemachine__learning.html#a1b42d24ad7bedbfa8e5b59fe96987a44">machine_learning::argmax</a>(</div>
-<div class="line"><a name="l00825"></a><span class="lineno"> 825</span>&#160; myNN.<a class="code" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html#ac130322a5abb1ff763b7c1a55405a35e">single_predict</a>({{6.2, 3.4, 5.4, 2.3}})) == 2);</div>
-<div class="line"><a name="l00826"></a><span class="lineno"> 826</span>&#160; <span class="keywordflow">return</span>;</div>
-<div class="line"><a name="l00827"></a><span class="lineno"> 827</span>&#160;}</div>
+<p >Function to test neural network </p><dl class="section return"><dt>Returns</dt><dd>none </dd></dl>
+<div class="fragment"><div class="line"><a id="l00805" name="l00805"></a><span class="lineno"> 805</span> {</div>
+<div class="line"><a id="l00806" name="l00806"></a><span class="lineno"> 806</span> <span class="comment">// Creating network with 3 layers for &quot;iris.csv&quot;</span></div>
+<div class="line"><a id="l00807" name="l00807"></a><span class="lineno"> 807</span> <a class="code hl_class" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html">machine_learning::neural_network::NeuralNetwork</a> myNN =</div>
+<div class="line"><a id="l00808" name="l00808"></a><span class="lineno"> 808</span> <a class="code hl_class" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html">machine_learning::neural_network::NeuralNetwork</a>({</div>
+<div class="line"><a id="l00809" name="l00809"></a><span class="lineno"> 809</span> {4, <span class="stringliteral">&quot;none&quot;</span>}, <span class="comment">// First layer with 4 neurons and &quot;none&quot; as activation</span></div>
+<div class="line"><a id="l00810" name="l00810"></a><span class="lineno"> 810</span> {6,</div>
+<div class="line"><a id="l00811" name="l00811"></a><span class="lineno"> 811</span> <span class="stringliteral">&quot;relu&quot;</span>}, <span class="comment">// Second layer with 6 neurons and &quot;relu&quot; as activation</span></div>
+<div class="line"><a id="l00812" name="l00812"></a><span class="lineno"> 812</span> {3, <span class="stringliteral">&quot;sigmoid&quot;</span>} <span class="comment">// Third layer with 3 neurons and &quot;sigmoid&quot; as</span></div>
+<div class="line"><a id="l00813" name="l00813"></a><span class="lineno"> 813</span> <span class="comment">// activation</span></div>
+<div class="line"><a id="l00814" name="l00814"></a><span class="lineno"> 814</span> });</div>
+<div class="line"><a id="l00815" name="l00815"></a><span class="lineno"> 815</span> <span class="comment">// Printing summary of model</span></div>
+<div class="line"><a id="l00816" name="l00816"></a><span class="lineno"> 816</span> myNN.<a class="code hl_function" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html#a61d30113d13304c664057118b92a5931">summary</a>();</div>
+<div class="line"><a id="l00817" name="l00817"></a><span class="lineno"> 817</span> <span class="comment">// Training Model</span></div>
+<div class="line"><a id="l00818" name="l00818"></a><span class="lineno"> 818</span> myNN.<a class="code hl_function" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html#a5172a6791b9bd24f4232bab8d6b81fff">fit_from_csv</a>(<span class="stringliteral">&quot;iris.csv&quot;</span>, <span class="keyword">true</span>, 100, 0.3, <span class="keyword">false</span>, 2, 32, <span class="keyword">true</span>);</div>
+<div class="line"><a id="l00819" name="l00819"></a><span class="lineno"> 819</span> <span class="comment">// Testing predictions of model</span></div>
+<div class="line"><a id="l00820" name="l00820"></a><span class="lineno"> 820</span> assert(<a class="code hl_function" href="../../d8/d77/namespacemachine__learning.html#a50480fccfb39de20ca47f1bf51ecb6ec">machine_learning::argmax</a>(</div>
<div class="line"><a id="l00821" name="l00821"></a><span class="lineno"> 821</span> myNN.<a class="code hl_function" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html#a3b9eac1824d365dce715fb17c33cb96f">single_predict</a>({{5, 3.4, 1.6, 0.4}})) == 0);</div>
<div class="line"><a id="l00822" name="l00822"></a><span class="lineno"> 822</span> assert(<a class="code hl_function" href="../../d8/d77/namespacemachine__learning.html#a50480fccfb39de20ca47f1bf51ecb6ec">machine_learning::argmax</a>(</div>
<div class="line"><a id="l00823" name="l00823"></a><span class="lineno"> 823</span> myNN.<a class="code hl_function" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html#a3b9eac1824d365dce715fb17c33cb96f">single_predict</a>({{6.4, 2.9, 4.3, 1.3}})) == 1);</div>
<div class="line"><a id="l00824" name="l00824"></a><span class="lineno"> 824</span> assert(<a class="code hl_function" href="../../d8/d77/namespacemachine__learning.html#a50480fccfb39de20ca47f1bf51ecb6ec">machine_learning::argmax</a>(</div>
<div class="line"><a id="l00825" name="l00825"></a><span class="lineno"> 825</span> myNN.<a class="code hl_function" href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html#a3b9eac1824d365dce715fb17c33cb96f">single_predict</a>({{6.2, 3.4, 5.4, 2.3}})) == 2);</div>
<div class="line"><a id="l00826" name="l00826"></a><span class="lineno"> 826</span> <span class="keywordflow">return</span>;</div>
<div class="line"><a id="l00827" name="l00827"></a><span class="lineno"> 827</span>}</div>
<div class="ttc" id="aclassmachine__learning_1_1neural__network_1_1_neural_network_html"><div class="ttname"><a href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html">machine_learning::neural_network::NeuralNetwork</a></div><div class="ttdef"><b>Definition:</b> neural_network.cpp:247</div></div>
<div class="ttc" id="aclassmachine__learning_1_1neural__network_1_1_neural_network_html_a3b9eac1824d365dce715fb17c33cb96f"><div class="ttname"><a href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html#a3b9eac1824d365dce715fb17c33cb96f">machine_learning::neural_network::NeuralNetwork::single_predict</a></div><div class="ttdeci">std::vector&lt; std::valarray&lt; double &gt; &gt; single_predict(const std::vector&lt; std::valarray&lt; double &gt; &gt; &amp;X)</div><div class="ttdef"><b>Definition:</b> neural_network.cpp:451</div></div>
<div class="ttc" id="aclassmachine__learning_1_1neural__network_1_1_neural_network_html_a5172a6791b9bd24f4232bab8d6b81fff"><div class="ttname"><a href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html#a5172a6791b9bd24f4232bab8d6b81fff">machine_learning::neural_network::NeuralNetwork::fit_from_csv</a></div><div class="ttdeci">void fit_from_csv(const std::string &amp;file_name, const bool &amp;last_label, const int &amp;epochs, const double &amp;learning_rate, const bool &amp;normalize, const int &amp;slip_lines=1, const size_t &amp;batch_size=32, const bool &amp;shuffle=true)</div><div class="ttdef"><b>Definition:</b> neural_network.cpp:587</div></div>
<div class="ttc" id="aclassmachine__learning_1_1neural__network_1_1_neural_network_html_a61d30113d13304c664057118b92a5931"><div class="ttname"><a href="../../d4/df4/classmachine__learning_1_1neural__network_1_1_neural_network.html#a61d30113d13304c664057118b92a5931">machine_learning::neural_network::NeuralNetwork::summary</a></div><div class="ttdeci">void summary()</div><div class="ttdef"><b>Definition:</b> neural_network.cpp:773</div></div>
<div class="ttc" id="anamespacemachine__learning_html_a50480fccfb39de20ca47f1bf51ecb6ec"><div class="ttname"><a href="../../d8/d77/namespacemachine__learning.html#a50480fccfb39de20ca47f1bf51ecb6ec">machine_learning::argmax</a></div><div class="ttdeci">size_t argmax(const std::vector&lt; std::valarray&lt; T &gt; &gt; &amp;A)</div><div class="ttdef"><b>Definition:</b> vector_ops.hpp:307</div></div>
</div><!-- fragment --><div class="dynheader">
Here is the call graph for this function:</div>
<div class="dyncontent">
<div id="nav-path" class="navpath"><!-- id is needed for treeview function! -->
<ul>
<li class="navelem"><a class="el" href="../../dir_3343723ae086de42ee4ca9774da3a13f.html">machine_learning</a></li><li class="navelem"><a class="el" href="../../d2/d58/neural__network_8cpp.html">neural_network.cpp</a></li>
<li class="footer">Generated by <a href="https://www.doxygen.org/index.html"><img class="footer" src="../../doxygen.svg" width="104" height="31" alt="doxygen"/></a> 1.9.2 </li>
</ul>
</div>
</body>