ENHANCING TRANSFER LEARNING IN VAES WITH EFFICIENTNET AND AMORTIZED STOCHASTIC VARIATIONAL INFERENCE FOR MOBILE ENVIRONMENTS
Abstract
Variational Autoencoders (VAEs) have become a cornerstone of generative modeling, providing a powerful framework for learning latent representations of data. Recent advances in neural architectures, such as EfficientNet, offer promising avenues for improving VAE performance while reducing resource consumption. This paper explores the integration of these advances to enhance transfer learning in VAEs for mobile and resource-constrained environments. The proposed model combines the Adam optimizer with Amortized Stochastic Variational Inference (ASVI), adaptive hyperparameter tuning, and targeted miniaturization techniques. The evidence lower bound (ELBO) is optimized to maximize the expected log-likelihood while minimizing the Kullback–Leibler (KL) divergence between the variational posterior and the prior over the latent variables. We evaluate the proposed model on three benchmark datasets: MNIST, CIFAR-10, and CelebA. Experimental results demonstrate significant gains in reconstruction quality, classification accuracy, and computational efficiency. The proposed model sets a new benchmark for transfer learning in resource-constrained settings, paving the way for further research in this direction.
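For concreteness, the ELBO objective referenced above takes the standard VAE form; this is a sketch in the usual notation (the paper's exact notation may differ), where $q_\phi(z \mid x)$ is the variational posterior, $p_\theta(x \mid z)$ the decoder likelihood, and $p(z)$ the prior over latent variables:

$$
\mathcal{L}_{\mathrm{ELBO}}(\theta, \phi; x) \;=\; \mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right] \;-\; D_{\mathrm{KL}}\!\left(q_\phi(z \mid x)\,\|\,p(z)\right)
$$

Maximizing this bound jointly trains the (amortized) inference network $\phi$ and the generative network $\theta$, matching the trade-off between reconstruction quality and KL regularization described in the abstract.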