New machine learning research from MIT presents a novel "Poisson Flow Generative Model" (PFGM) that maps any data distribution to a uniform distribution on a high-dimensional hemisphere.
Deep generative models (DGMs) are widely used to produce high-quality images, text, and audio samples, and to improve semi-supervised learning, domain generalization, and imitation learning. Existing approaches have well-known shortcomings: GAN training is unstable, while VAEs and normalizing flows tend to produce lower-quality samples. Recent diffusion- and score-based models achieve comparable sample quality without adversarial training, but their stochastic sampling procedure is slow.
Backward ODE samplers such as normalizing flows can accelerate sampling, but these approaches do not yet match their SDE counterparts in sample quality. The authors present a new "Poisson Flow" generative model (PFGM) that exploits a surprising physics fact, extended to N dimensions. They interpret N-dimensional data points x (say, images) as positive electric charges in the z = 0 plane of an N+1-dimensional space filled with a viscous fluid such as honey. As the figure below shows, motion through the viscous fluid transforms any charge distribution in the plane into a uniform angular distribution.
To reverse this forward process, the model generates a uniform distribution of negative charges on the hemisphere and then tracks their paths back to the z = 0 plane, where they end up distributed according to the data distribution.
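The reverse process can be sketched as a backward ODE: start from a point sampled uniformly on a large hemisphere and take small steps against the field direction until the trajectory reaches the z = 0 plane. The sketch below uses plain Euler integration and a hypothetical `reverse_ode` helper; the actual paper trains a neural network to approximate the field, whereas here the exact empirical field of a toy dataset is used.

```python
import numpy as np

def poisson_field(x, data, eps=1e-8):
    """Unit-norm direction of the empirical Poisson field at x (N+1 dims),
    sourced by charges at the rows of `data`, embedded in the z = 0 plane."""
    N = data.shape[1]
    y = np.hstack([data, np.zeros((len(data), 1))])
    diff = x - y
    dist = np.linalg.norm(diff, axis=1, keepdims=True) + eps
    field = (diff / dist ** (N + 1)).mean(axis=0)
    return field / (np.linalg.norm(field) + eps)

def sample_hemisphere(rng, n, dim, radius):
    """Uniform samples on the upper (z > 0) hemisphere of given radius."""
    v = rng.standard_normal((n, dim))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    v[:, -1] = np.abs(v[:, -1])            # fold the sphere into z > 0
    return radius * v

def reverse_ode(x0, data, step=0.05, max_steps=5000):
    """Euler integration of dx/ds = -E(x): follow the field backward
    from x0 on the hemisphere until the z = 0 plane is reached."""
    x = x0.copy()
    for _ in range(max_steps):
        if x[-1] <= 1e-3:                  # reached the data plane
            break
        x = x - step * poisson_field(x, data)
    return x[:-1]                          # drop z: a sample in data space

# Demo: negative charges released on a radius-20 hemisphere flow back
# toward the 1-D "dataset" {-1, +1} sitting in the z = 0 plane.
rng = np.random.default_rng(0)
data = np.array([[-1.0], [1.0]])
starts = sample_hemisphere(rng, 4, data.shape[1] + 1, radius=20.0)
samples = [reverse_ode(s, data) for s in starts]
```

Because the forward field lines map the data distribution onto the uniform hemisphere distribution, integrating them backward from uniform hemisphere samples recovers (approximately, with this crude fixed-step integrator) samples from the data distribution.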