Embedding layer (Input)
===============================
Source files in ``EpyNN/epynn/embedding/``.
See `Appendix - Notations`_ for mathematical conventions.
.. _Appendix - Notations: glossary.html#notations
Layer architecture
------------------------------
.. image:: _static/embedding-01.svg
In EpyNN, the *Embedding* - or input - layer must be the first layer of every Neural Network. This layer is not *trainable* but binds the data to be forwarded through the network. Importantly, it contains specific procedures for data pre-processing in the :py:mod:`epynn.embedding.dataset` module.
.. autoclass:: epynn.embedding.models.Embedding
:show-inheritance:
Upon instantiation, the *Embedding* layer can be instructed to *one-hot encode* sample features and/or labels. It can also apply a global scaling of features within \[0, 1\]. The *batch_size* argument can be set to prepare training batches.
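The pre-processing options can be illustrated with a minimal, self-contained sketch. Note that these helpers are hypothetical stand-ins written here for illustration; the actual procedures live in the :py:mod:`epynn.embedding.dataset` module.

.. code-block:: python

    import numpy as np

    def one_hot_encode(labels, vocab):
        """Hypothetical sketch: one-hot encode labels against a vocabulary."""
        idx = {v: i for i, v in enumerate(vocab)}
        encoded = np.zeros((len(labels), len(vocab)))
        for row, label in enumerate(labels):
            encoded[row, idx[label]] = 1.0
        return encoded

    def min_max_scale(X):
        """Hypothetical sketch: global scaling of features within [0, 1]."""
        return (X - X.min()) / (X.max() - X.min())

    X = np.array([[2.0, 4.0], [6.0, 10.0]])
    X_scaled = min_max_scale(X)            # all values now lie within [0, 1]
    Y_encoded = one_hot_encode([1, 0], vocab=[0, 1])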
Shapes
~~~~~~~~~~~~~~~~~~~~
.. automethod:: epynn.embedding.models.Embedding.compute_shapes
.. literalinclude:: ./../epynn/embedding/parameters.py
:pyobject: embedding_compute_shapes
:language: python
Within an *Embedding* layer, shapes of interest include:
* Input *X* of shape *(m, ...)* with *m* equal to the number of samples. The number of input dimensions is unknown *a priori*.
* The number of features *n* per sample can still be determined formally: it is equal to the size of the input *X* divided by the number of samples *m*.
Note that:
* The *Embedding* layer is like a pass-through layer except that it is the first layer of the Network. Therefore, it does not receive an input from the previous layer because there is none.
* The *Embedding* layer is not *trainable* and does not transform the data during the training phase. The input dimensions do not need to be known by the layer.
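The shape logic above can be checked with a short NumPy sketch: whatever the number of trailing dimensions, the number of features *n* per sample is recovered from the total size of *X* divided by *m*.

.. code-block:: python

    import numpy as np

    # m = 8 samples; the trailing dimensions are arbitrary and
    # unknown a priori to the Embedding layer.
    X = np.random.rand(8, 4, 5, 3)

    m = X.shape[0]        # number of samples
    n = X.size // m       # features per sample: 4 * 5 * 3 = 60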
Forward
~~~~~~~~~~~~~~~~~~~~
.. automethod:: epynn.embedding.models.Embedding.forward
.. literalinclude:: ./../epynn/embedding/forward.py
:pyobject: embedding_forward
The forward propagation function in the *Embedding* layer *k* includes:
* (1): Input *X* in current layer *k* is equal to the user-provided sample features, either as a whole or in batches, depending on user choices and on whether the network is in training or prediction mode.
* (2): Output *A* of current layer *k* is equal to input *X*.
.. math::
\begin{alignat*}{2}
& x^{k}_{m,d_1...d_n} &&= X\_data \tag{1} \\
& a^{k}_{m,d_1...d_n} &&= x^{k}_{m,d_1...d_n} \tag{2}
\end{alignat*}
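Equations (1) and (2) amount to a pass-through, which can be sketched as follows (an illustrative stand-alone function, not the library's actual implementation):

.. code-block:: python

    import numpy as np

    def embedding_forward_sketch(X_data):
        X = X_data    # (1) bind user-provided samples (or a batch)
        A = X         # (2) output A equals input X
        return A

    batch = np.arange(6.0).reshape(2, 3)
    A = embedding_forward_sketch(batch)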
Backward
~~~~~~~~~~~~~~~~~~~~
.. automethod:: epynn.embedding.models.Embedding.backward
.. literalinclude:: ./../epynn/embedding/backward.py
:pyobject: embedding_backward
The backward propagation function in the *Embedding* layer *k* includes:
* (1): *dA*, the gradient of the loss with respect to the output of forward propagation *A* for current layer *k*, is equal to the gradient of the loss with respect to the input of forward propagation for next layer *k+1*.
* (2): The gradient of the loss *dX* with respect to the input of forward propagation *X* for current layer *k* is mathematically equal to *dA*. However, the *Embedding* layer returns *None* because there is no previous layer.
.. math::
\begin{alignat*}{2}
& \delta^{\kp}_{m,d_1...d_n} &&= \frac{\partial \mathcal{L}}{\partial a^{k}_{m,d_1...d_n}} = \frac{\partial \mathcal{L}}{\partial x^{\kp}_{m,d_1...d_n}} \tag{1} \\
& \delta^{k}_{m,d_1...d_n} &&= \frac{\partial \mathcal{L}}{\partial x^{k}_{m,d_1...d_n}} = \frac{\partial \mathcal{L}}{\partial a^{\km}_{m,d_1...d_n}} = \varnothing \tag{2}
\end{alignat*}
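The backward step can likewise be sketched in a few lines (again an illustrative stand-alone function): *dA* is received from the next layer *k+1*, but since there is no previous layer, *None* is returned in place of *dX*.

.. code-block:: python

    import numpy as np

    def embedding_backward_sketch(dA):
        # (1) dA is the gradient received from the next layer k+1.
        # (2) There is no previous layer, so nothing is propagated back.
        dX = None
        return dX

    dA = np.ones((2, 3))
    dX = embedding_backward_sketch(dA)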
Gradients
~~~~~~~~~~~~~~~~~~~~
.. automethod:: epynn.embedding.models.Embedding.compute_gradients
.. literalinclude:: ./../epynn/embedding/parameters.py
:pyobject: embedding_compute_gradients
The *Embedding* layer is not a *trainable* layer. It has no *trainable* parameters such as weight *W* or bias *b*. Therefore, there are no parameter gradients to compute.
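In other words, the gradient-computation step reduces to a no-op, as in this hypothetical sketch:

.. code-block:: python

    def embedding_compute_gradients_sketch(layer):
        # No trainable parameters (no W, no b): nothing to compute.
        return None

    result = embedding_compute_gradients_sketch(object())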
Live examples
------------------------------
`Practical description of the Embedding layer`_
.. _Practical description of the Embedding layer: epynnlive/dummy_boolean/train.html#The-Embedding-layer-object