Generative AI – Latent Space

Exploring Latent Space in Generative Models

Introduction

Latent space is a fundamental concept in generative models such as GANs (Generative Adversarial Networks) and VAEs (Variational Autoencoders). It refers to the space of hidden variables that the model uses to generate new data instances.

What is Latent Space?

In the context of generative models, latent space is an abstract, lower-dimensional space where each point corresponds to a possible output of the model. The model learns to map these points to high-dimensional data, such as images or text.
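
To make the dimensionality gap concrete, here is a minimal, hypothetical sketch (not the VAE from the example below): a small decoder network maps a 2-dimensional latent point to a 784-dimensional vector, the size of a flattened 28×28 image. The layer sizes are illustrative assumptions, not a recommended architecture.

import torch
from torch import nn

# Hypothetical decoder: maps a 2-D latent point to a 784-D vector
# (the size of a flattened 28x28 image). Layer sizes are illustrative.
decoder = nn.Sequential(
    nn.Linear(2, 400),
    nn.ReLU(),
    nn.Linear(400, 784),
    nn.Sigmoid(),
)

z = torch.randn(1, 2)      # one point in the 2-D latent space
x = decoder(z)             # one 784-D "data" vector
print(z.shape, x.shape)    # torch.Size([1, 2]) torch.Size([1, 784])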

Navigating Latent Space

Latent space can be thought of as a compressed representation of the data distribution. By sampling points in latent space and feeding them through the model's decoder, we can generate new, realistic data instances.

Example Code: Sampling from Latent Space

Here’s an example using a simple VAE to explore latent space:

import torch
from torch import nn
from torchvision import datasets, transforms
from torch.utils.data import DataLoader

# Define the VAE model
class VAE(nn.Module):
    def __init__(self, latent_dim=2):
        super(VAE, self).__init__()
        self.fc1 = nn.Linear(784, 400)           # encoder hidden layer
        self.fc21 = nn.Linear(400, latent_dim)   # latent mean (mu)
        self.fc22 = nn.Linear(400, latent_dim)   # latent log-variance
        self.fc3 = nn.Linear(latent_dim, 400)    # decoder hidden layer
        self.fc4 = nn.Linear(400, 784)           # reconstruction layer

    def encode(self, x):
        h1 = torch.relu(self.fc1(x))
        return self.fc21(h1), self.fc22(h1)

    def reparameterize(self, mu, logvar):
        # Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I)
        std = torch.exp(0.5 * logvar)
        eps = torch.randn_like(std)
        return mu + eps * std

    def decode(self, z):
        h3 = torch.relu(self.fc3(z))
        return torch.sigmoid(self.fc4(h3))

    def forward(self, x):
        mu, logvar = self.encode(x.view(-1, 784))
        z = self.reparameterize(mu, logvar)
        return self.decode(z), mu, logvar

# Example usage (untrained model shown for brevity; train the VAE first for meaningful outputs)
model = VAE()
model.eval()

# Build a DataLoader for MNIST (the model flattens each image to 784 values)
transform = transforms.ToTensor()
mnist = datasets.MNIST(root="./data", train=True, download=True, transform=transform)
data_loader = DataLoader(mnist, batch_size=64, shuffle=True)

with torch.no_grad():
    # Reconstruct one batch of real images
    data, _ = next(iter(data_loader))
    recon_batch, mu, logvar = model(data)

    # Sample new points from the latent prior and decode them
    sample = torch.randn(64, 2)              # 64 samples, 2-dimensional latent space
    generated_images = model.decode(sample)  # shape: (64, 784)
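
A natural next step is to navigate latent space directly. The following is a minimal sketch, assuming the `model` from the example above (ideally after training): it linearly interpolates between two latent points and decodes each intermediate point. With a trained VAE, the decoded outputs morph smoothly from one image into another.

# Latent-space interpolation: a minimal sketch reusing `model` from above.
with torch.no_grad():
    z_start = torch.randn(1, 2)   # first latent point
    z_end = torch.randn(1, 2)     # second latent point
    steps = 8
    for i in range(steps + 1):
        alpha = i / steps
        z = (1 - alpha) * z_start + alpha * z_end   # linear interpolation
        image = model.decode(z)                     # shape: (1, 784)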

Latent space offers a powerful way to understand what a generative model has learned and to generate new data from it. By manipulating points in latent space, whether by sampling or interpolating, we can create diverse, realistic data samples, which makes it a crucial concept in AI and machine learning.

