There are many systems whose evolution, computation, or dynamics end up performing Principal Components Analysis; neural networks, for example. So, this time, I decided to play with autoencoders.
If $n$ is the dimension of the input layer, $m$ the dimension of the output layer and $p$ the dimension of the hidden one, then $m = n$ and $p < n$.
The output should be as close as possible to the input (in some sense; usually in the quadratic-error sense).
This is the dimensionality-reduction setup of the autoencoder, portraying the characteristic funnel architecture shown in figure 1b; it can be seen as a sequence of two affine maps between three vector spaces, $\mathbb{R}^n$, $\mathbb{R}^p$ and $\mathbb{R}^n$, as in figure 1a.
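As a concrete illustration (my own sketch, not code from the post): a linear autoencoder with a $p$-dimensional bottleneck, trained on the quadratic error with plain gradient descent in numpy. The toy data, learning rate and iteration count are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 points in R^3 that mostly vary along two directions,
# so a 2-dimensional bottleneck should capture almost all the variance.
n, p = 3, 2
X = rng.normal(size=(200, n)) * np.array([3.0, 1.0, 0.1])

# Two affine maps (biases omitted since the data is centered):
# encoder W1: R^n -> R^p, decoder W2: R^p -> R^n.
W1 = rng.normal(scale=0.1, size=(n, p))
W2 = rng.normal(scale=0.1, size=(p, n))

lr = 0.01
for _ in range(5000):
    H = X @ W1        # hidden codes
    Y = H @ W2        # reconstructions
    E = Y - X         # reconstruction error
    # Gradients of the mean quadratic loss (up to a constant factor)
    gW2 = H.T @ E / len(X)
    gW1 = X.T @ (E @ W2.T) / len(X)
    W1 -= lr * gW1
    W2 -= lr * gW2

loss = np.mean((X @ W1 @ W2 - X) ** 2)
```

After training, `W1 @ W2` approximates the orthogonal projection onto the span of the top two principal directions of the data, which is exactly the PCA connection: minimizing the quadratic reconstruction error through a linear bottleneck recovers the principal subspace (though not necessarily the principal axes themselves, only the subspace they span).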
When was the last time you needed an asymmetric (skewed) probability density function (pdf) with infinite support? Traditional skewed distributions like the gamma family suffer from a semi-infinite support, that is, $\operatorname{supp}(f) = [0, \infty)$. The support, if you are out of the loop, is the set of values $x$ in the domain of $f$ such that $f(x) \neq 0$. Why is this inconvenient? Well, I will give more details about the specific application later; meanwhile, let's say that the fact that its derivative is discontinuous at $x = 0$ is problematic. Even more, I need my function to be at least in $C^k$, that is, to have at least $k$ continuous derivatives!
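To see the problem concretely, here is a small Python sketch (mine, not from the post) of the gamma density near the boundary of its support. For shape $a = 1$ (the exponential distribution) the density itself jumps at $x = 0$; for $1 < a < 2$ it is continuous but its slope blows up there, so it is not even $C^1$ on all of $\mathbb{R}$.

```python
import math

def gamma_pdf(x, a, scale=1.0):
    """Gamma density; identically zero for x < 0 (semi-infinite support)."""
    if x < 0:
        return 0.0
    return x ** (a - 1) * math.exp(-x / scale) / (math.gamma(a) * scale ** a)

h = 1e-6

# Shape a = 1: the density jumps from 0 to 1 at x = 0.
left = gamma_pdf(-h, 1.0)    # 0.0, just left of the support
right = gamma_pdf(h, 1.0)    # ~1.0, just right of it

# Shape a = 1.5: continuous at 0, but the one-sided slope ~ x^{-1/2}
# diverges as x -> 0+, so the derivative is discontinuous there.
slope = (gamma_pdf(h, 1.5) - gamma_pdf(0.0, 1.5)) / h
```

Any construction that glues a density onto the constant zero at a finite point faces this kind of boundary non-smoothness, which is the motivation for wanting infinite support in the first place.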
This post constitutes a somewhat dirty solution, unwilling as I am to review any literature in depth (I might be reinventing the wheel, some wheel, but who cares; this is a blog!). Here we go.