In AI architecture, variational autoencoders (VAEs) learn a continuous latent space between the encoder and decoder — typically modeled with Gaussian distributions — and are trained by maximizing the ELBO (evidence lower bound).
[Figure: Gaussians in the VAE latent space; comparison of samples without vs. with a VAE]
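The ELBO above can be sketched numerically. This is a minimal illustration, not a full model: the closed-form KL divergence between a diagonal Gaussian posterior and a standard-normal prior, the reparameterization trick, and a squared-error reconstruction term (which corresponds to a Gaussian likelihood, up to constants). All function names here are illustrative, not from any particular library.

```python
import numpy as np

def gaussian_kl(mu, log_var):
    # Closed-form KL( N(mu, sigma^2) || N(0, I) ), summed over latent dims
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

def reparameterize(mu, log_var, rng):
    # z = mu + sigma * eps keeps the sample differentiable w.r.t. mu, sigma
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def elbo(x, x_recon, mu, log_var):
    # ELBO = E[log p(x|z)] - KL; a Gaussian likelihood yields a
    # squared-error reconstruction term (constants dropped)
    recon = -0.5 * np.sum((x - x_recon) ** 2)
    return recon - gaussian_kl(mu, log_var)

rng = np.random.default_rng(0)
mu, log_var = np.zeros(4), np.zeros(4)  # posterior exactly N(0, I)
z = reparameterize(mu, log_var, rng)
print(gaussian_kl(mu, log_var))  # 0.0: KL vanishes when posterior == prior
```

When the posterior matches the prior, the KL term is zero, so training trades off reconstruction accuracy against how far the posterior drifts from N(0, I).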
Vector quantized variational autoencoders (VQ-VAEs) instead use a discrete embedding space: each encoder output is snapped to the nearest vector in a learned codebook.
[Figure: VQ-VAE architecture]
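The nearest-neighbor quantization step can be sketched as follows. This is a simplified illustration with made-up shapes; a real VQ-VAE also needs the straight-through gradient estimator and codebook/commitment losses during training, which are omitted here.

```python
import numpy as np

def quantize(z_e, codebook):
    # Replace each encoder output vector with its nearest codebook entry.
    # dists[i, k] = distance from encoder output i to codebook vector k
    dists = np.linalg.norm(z_e[:, None, :] - codebook[None, :, :], axis=-1)
    idx = np.argmin(dists, axis=1)   # discrete code assigned to each output
    return codebook[idx], idx

rng = np.random.default_rng(1)
codebook = rng.standard_normal((16, 4))  # 16 discrete codes, 4 dims each
z_e = rng.standard_normal((3, 4))        # 3 continuous encoder outputs
z_q, idx = quantize(z_e, codebook)       # quantized latents + code indices
```

The decoder then sees only the quantized vectors `z_q`, so the latent representation is fully described by the integer indices `idx` — which is what makes the embedding space discrete.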