.. EpyNN documentation master file, created by
   sphinx-quickstart on Tue Jul 6 18:46:11 2021.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

.. toctree::

Flatten - Adapter
===============================

Source files in ``EpyNN/epynn/flatten/``.

See `Appendix - Notations`_ for mathematical conventions.

.. _Appendix - Notations: glossary.html#notations

Layer architecture
------------------------------

.. image:: _static/Flatten/flatten-01.svg
   :alt: Flatten

A *Flatten* - or reshaping - layer is an object which flattens data arrays from three or more dimensions down to two dimensions. It can be seen as an *adapter* layer because it is necessary, for instance, when the shape of the output of layer *k-1* is not compatible with the expected shape of the input for layer *k*.

.. autoclass:: epynn.flatten.models.Flatten
   :show-inheritance:

Shapes
~~~~~~~~~~~~~~~~~~~~~~~

.. automethod:: epynn.flatten.models.Flatten.compute_shapes

.. literalinclude:: ./../epynn/flatten/parameters.py
   :pyobject: flatten_compute_shapes
   :language: python

Within a *Flatten* layer, shapes of interest include:

* Input *X* of shape *(m, ...)* with *m* equal to the number of samples. The number of input dimensions is unknown *a priori*.
* The number of features *n* per sample can still be determined formally: it is equal to the size of the input *X* divided by the number of samples *m*.
* The output shape of the *Flatten* layer is equal to *(m, n)*.

Forward
~~~~~~~~~~~~~~~~~~~~~~~

.. automethod:: epynn.flatten.models.Flatten.forward

.. literalinclude:: ./../epynn/flatten/forward.py
   :pyobject: flatten_forward

The forward propagation function in a *Flatten* layer *k* includes:

* (1): Input *X* in current layer *k* is equal to the output *A* of previous layer *k-1*.
* (2): Output *A* of current layer *k* is equal to input *X* reshaped from *(m, ...)* to *(m, n)*.

Note that:

* The reshaping operation preserves the association between samples and corresponding features. The shape *(m, ...)* means there is one row per sample, regardless of the number of dimensions within each row. The reshaping operation is applied with respect to each row, thereby preserving data integrity.

.. math::

    \begin{alignat*}{2}
    & x^{k}_{m,d_1...d_n} &&= a^{\km}_{m,d_1...d_n} \tag{1} \\
    & a^{k}_{m,n} &&= f(x^{k}_{m,d_1...d_n}) \tag{2}
    \end{alignat*}

.. math::

    \begin{align}
    where~f~is~defined~as: \\
    f:\mathcal{M}_{m,d_1...d_n}(\mathbb{R}) & \to \mathcal{M}_{m,n}(\mathbb{R}) \\
    X & \to f(X) \\
    with~n \in \mathbb{N}^*
    \end{align}
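To make the reshape concrete, below is a minimal, standalone NumPy sketch of the two steps above. The input dimensions (*m = 4* samples, each of shape *5 × 3*) are hypothetical and chosen for illustration only; this is not code from the EpyNN source tree.

.. code-block:: python

    import numpy as np

    # Hypothetical input: m = 4 samples, each of shape (5, 3).
    m = 4
    X = np.random.standard_normal((m, 5, 3))

    # Number of features per sample: size of X divided by m.
    n = X.size // m                # 5 * 3 = 15

    # (2): reshape from (m, ...) to (m, n).
    A = np.reshape(X, (m, n))      # shape (4, 15)

    # Row i of A holds exactly the features of sample i, flattened:
    # the sample-to-features association is preserved.
    assert np.array_equal(A[0], X[0].ravel())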
Backward
~~~~~~~~~~~~~~~~~~~~~~~

.. automethod:: epynn.flatten.models.Flatten.backward

.. literalinclude:: ./../epynn/flatten/backward.py
   :pyobject: flatten_backward

The backward propagation function in a *Flatten* pass-through layer *k* includes:

* (1): *dA* is the gradient of the loss with respect to the output of the forward propagation *A* for the current layer *k*. It is equal to the gradient of the loss with respect to the input of the forward propagation for the next layer *k+1*.
* (2): The gradient of the loss *dX* with respect to the input of the forward propagation *X* for the current layer *k* is equal to the reverse of the reshaping operation applied on *dA*. Therefore, *dX* has the same shape as *X*, which is *(m, ...)*. A standalone sketch of this round-trip is given at the end of this page.

.. math::

    \begin{alignat*}{2}
    & \delta^{\kp}_{mn} &&= \frac{\partial \mathcal{L}}{\partial a^{k}_{mn}} = \frac{\partial \mathcal{L}}{\partial x^{\kp}_{mn}} \tag{1} \\
    & \delta^{k}_{m,d_1...d_n} &&= \frac{\partial \mathcal{L}}{\partial x^{k}_{m,d_1...d_n}} = \frac{\partial \mathcal{L}}{\partial a^{\km}_{m,d_1...d_n}} = f^{-1}\left(\frac{\partial \mathcal{L}}{\partial a^{k}_{mn}}\right) \tag{2}
    \end{alignat*}

Gradients
~~~~~~~~~~~~~~~~~~~~~~~

.. automethod:: epynn.flatten.models.Flatten.compute_gradients

A *Flatten* layer is not a *trainable* layer. It has no *trainable* parameters such as a weight *W* or bias *b*. Therefore, there are no parameters to update.

Live examples
------------------------------

* `Dummy time - Flatten-(Dense)n`_
* `Dummy time - Flatten-(Dense)n with Dropout`_
* `Dummy image - Flatten-(Dense)n with Dropout`_
* `Dummy image - Conv-Pool-Flatten-Dense`_
* `Dummy string - Flatten-Dense - Perceptron`_
* `Author music - Flatten-(Dense)n with Dropout`_
* `Author music - RNN(sequences=True)-Flatten-(Dense)n with Dropout`_
* `Author music - GRU(sequences=True)-Flatten-(Dense)n with Dropout`_
* `Protein Modification - LSTM(sequence=True)-Flatten-Dense`_
* `Protein Modification - LSTM(sequence=True)-Flatten-(Dense)n with Dropout`_
* `MNIST Database - Flatten-(Dense)n with Dropout`_
* `MNIST Database - Conv-MaxPool-Flatten-Dense`_

You may also like to browse all `Network training examples`_ provided with EpyNN.

.. _Network training examples: run_examples.html
.. _Protein Modification - LSTM(sequence\=True)-Flatten-Dense: epynnlive/ptm_protein/train.html#LSTM(sequence=True)-Flatten-Dense
.. _Protein Modification - LSTM(sequence\=True)-Flatten-(Dense)n with Dropout: epynnlive/ptm_protein/train.html#LSTM(sequence=True)-Flatten-(Dense)n-with-Dropout
.. _Author music - Flatten-(Dense)n with Dropout: epynnlive/author_music/train.html#Flatten-(Dense)n-with-Dropout
.. _Author music - GRU(sequences\=True)-Flatten-(Dense)n with Dropout: epynnlive/author_music/train.html#GRU(sequences=True)-Flatten-(Dense)n-with-Dropout
.. _Author music - RNN(sequences\=True)-Flatten-(Dense)n with Dropout: epynnlive/author_music/train.html#RNN(sequences=True)-Flatten-(Dense)n-with-Dropout
.. _Dummy string - Flatten-Dense - Perceptron: epynnlive/dummy_string/train.html#Flatten-Dense---Perceptron
.. _Dummy time - Flatten-(Dense)n: epynnlive/dummy_time/train.html#Flatten-(Dense)n
.. _Dummy time - Flatten-(Dense)n with Dropout: epynnlive/dummy_time/train.html#Flatten-(Dense)n-with-Dropout
.. _Dummy image - Flatten-(Dense)n with Dropout: epynnlive/dummy_image/train.html#Flatten-(Dense)n-with-Dropout
.. _Dummy image - Conv-Pool-Flatten-Dense: epynnlive/dummy_image/train.html#Conv-Pool-Flatten-Dense
.. _MNIST Database - Flatten-(Dense)n with Dropout: epynnlive/captcha_mnist/train.html#Flatten-(Dense)n-with-Dropout
.. _MNIST Database - Conv-MaxPool-Flatten-Dense: epynnlive/captcha_mnist/train.html#Conv-MaxPool-Flatten-Dense
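As referenced in the *Backward* section above, here is a minimal, standalone NumPy sketch of the full shape round-trip through a *Flatten* layer, including the inverse reshape used during backpropagation. The shapes (*m = 2* samples of shape *3 × 4*) are hypothetical and the variable names follow the notation of this page, not the EpyNN source.

.. code-block:: python

    import numpy as np

    # Hypothetical input: m = 2 samples, each of shape (3, 4).
    X = np.random.standard_normal((2, 3, 4))

    # Forward: (m, ...) -> (m, n), with n inferred by NumPy.
    A = np.reshape(X, (X.shape[0], -1))   # shape (2, 12)

    # Backward: dA has the shape of A; the inverse reshape recovers
    # the input shape so dX can be passed back to layer k-1.
    dA = np.ones_like(A)                  # placeholder gradient
    dX = np.reshape(dA, X.shape)          # shape (2, 3, 4)

    assert dX.shape == X.shape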