
Biologically Inspired Sleep Algorithm for Variational Auto-Encoders

EasyChair Preprint no. 4310

14 pages · Date: October 1, 2020

Abstract

Variational auto-encoders (VAEs) are a class of likelihood-based generative models that approximate inference by introducing a latent variable and encoder/decoder components. However, the latent codes usually have no structure, are not informative, and are not interpretable. This problem is amplified when these models are used for auxiliary tasks or when different aspects of the generated samples need to be controlled or interpreted by humans. We address these issues by proposing a biologically realistic sleep algorithm for VAEs (VAE-sleep). The algorithm augments the normal training phase of the VAE with an unsupervised learning phase in an equivalent spiking VAE, modeled after how the human brain learns, using the Mirrored Spike Timing Dependent Plasticity learning rule. We hypothesize that the proposed unsupervised VAE-sleep phase creates more realistic feature representations, which in turn increase a VAE's robustness when reconstructing its input. We conduct quantitative and qualitative experiments, including comparisons with the state of the art, on three datasets: CelebA, MNIST, and Fashion-MNIST. We show that our model performs better than the standard VAE and variational sparse coding (VSC) on a benchmark classification task, demonstrating improved classification accuracy and significantly increased robustness to the number of latent dimensions. The experiments suggest that the proposed method improves on other widely used methods and performs favorably under the PSNR, SSIM, and LPIPS metrics. The quantitative evaluations also suggest that our model can generate more realistic images than the state of the art when tested on disturbed or noisy inputs.
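The abstract describes a two-part procedure: a standard "wake" phase that trains the VAE on its usual ELBO objective, followed by a "sleep" phase in which the learned weights drive a spiking pass and are adjusted with a Mirrored-STDP-style Hebbian update. The sketch below (PyTorch) is only an illustration of that structure, not the authors' implementation: the layer sizes, the Poisson rate coding, the spike threshold, and the exact form of the mirrored-STDP update are assumptions made for the example.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleVAE(nn.Module):
    """Minimal fully connected VAE used only to illustrate the training loop."""
    def __init__(self, x_dim=784, h_dim=256, z_dim=32):
        super().__init__()
        self.enc = nn.Linear(x_dim, h_dim)
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec_h = nn.Linear(z_dim, h_dim)
        self.dec_x = nn.Linear(h_dim, x_dim)

    def forward(self, x):
        h = torch.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        x_hat = torch.sigmoid(self.dec_x(torch.relu(self.dec_h(z))))
        return x_hat, mu, logvar

def elbo_loss(x, x_hat, mu, logvar):
    # Reconstruction term plus KL divergence to the standard normal prior.
    recon = F.binary_cross_entropy(x_hat, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

@torch.no_grad()
def sleep_phase(vae, x, steps=20, threshold=1.0, lr=1e-4):
    """Illustrative sleep phase (assumed, not from the paper): rate-code the input
    into Poisson-like spikes, propagate them through the encoder weights, and nudge
    the weights with a symmetric Hebbian term for co-active pre/post pairs."""
    W = vae.enc.weight  # shape (h_dim, x_dim)
    for _ in range(steps):
        pre = (torch.rand_like(x) < x).float()        # input spikes (pixels in [0, 1])
        post = ((pre @ W.t()) > threshold).float()    # post-synaptic spikes
        # Mirrored-STDP-style rule: potentiate co-active pairs, depress the rest.
        dw = post.t() @ pre - post.t() @ (1.0 - pre)
        W += lr * dw / x.shape[0]

def train_vae_sleep(vae, loader, epochs=10, device="cpu"):
    opt = torch.optim.Adam(vae.parameters(), lr=1e-3)
    for _ in range(epochs):
        for x, _ in loader:                            # loader yields (image, label)
            x = x.view(x.size(0), -1).to(device)
            x_hat, mu, logvar = vae(x)                 # wake phase: ELBO training
            loss = elbo_loss(x, x_hat, mu, logvar)
            opt.zero_grad(); loss.backward(); opt.step()
            sleep_phase(vae, x)                        # sleep phase: spiking, STDP-like update

Interleaving the sleep update with ordinary gradient steps, as above, is one plausible reading of "augments the normal training phase"; the paper may instead run the sleep phase as a separate stage after conventional training converges.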

Keyphrases: Sleep Algorithm, Spiking Neural Network, Variational Auto-Encoder

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@Booklet{EasyChair:4310,
  author       = {Sameerah Talafha and Banafsheh Rekabdar and Christos Mousas and Chinwe Ekenna},
  title        = {Biologically Inspired Sleep Algorithm for Variational Auto-Encoders},
  howpublished = {EasyChair Preprint no. 4310},
  year         = {EasyChair, 2020}
}