Unlocking GenAI: How Diffusion Transformers are Revolutionizing OpenAI’s Sora

Saining Xie, a computer science professor at NYU, began the research project that spawned the diffusion transformer in June 2022. Diffusion models typically have a “backbone,” or engine of sorts, called a U-Net; the diffusion transformer replaces this U-Net with a transformer. As a result, larger and larger transformer models can be trained with significant but not unattainable increases in compute. The current process of training diffusion transformers potentially introduces some inefficiencies and performance loss, but Xie believes this can be addressed over the long horizon. “I’m interested in integrating the domains of content understanding and creation within the framework of diffusion transformers,” Xie said.
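To make the backbone swap concrete, here is a minimal, hedged sketch of the core idea: the noisy latent is cut into patches, each patch becomes a token, and a transformer block (rather than a convolutional U-Net) processes the tokens before they are reassembled. All dimensions, weights, and function names below are toy assumptions for illustration, not Sora's or the DiT paper's actual configuration, and the model is untrained.

```python
import numpy as np

rng = np.random.default_rng(0)

def patchify(x, p):
    """Split an (H, W, C) latent into (num_patches, p*p*C) tokens."""
    H, W, C = x.shape
    x = x.reshape(H // p, p, W // p, p, C).transpose(0, 2, 1, 3, 4)
    return x.reshape(-1, p * p * C)

def unpatchify(tokens, H, W, p, C):
    """Inverse of patchify: reassemble tokens into an (H, W, C) latent."""
    x = tokens.reshape(H // p, W // p, p, p, C).transpose(0, 2, 1, 3, 4)
    return x.reshape(H, W, C)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def transformer_block(tokens, Wq, Wk, Wv, Wo):
    """One self-attention block standing in for the U-Net backbone."""
    q, k, v = tokens @ Wq, tokens @ Wk, tokens @ Wv
    attn = softmax(q @ k.T / np.sqrt(q.shape[-1]))
    return tokens + (attn @ v) @ Wo  # residual connection

# Toy dimensions (assumed for the sketch).
H = W = 8; C = 4; p = 2; d = p * p * C  # token dimension = 16

noisy_latent = rng.standard_normal((H, W, C))
Wq, Wk, Wv, Wo = (rng.standard_normal((d, d)) * 0.1 for _ in range(4))

tokens = patchify(noisy_latent, p)                  # (16, 16): 16 tokens
tokens = transformer_block(tokens, Wq, Wk, Wv, Wo)  # scale by adding blocks
predicted_noise = unpatchify(tokens, H, W, p, C)    # same shape as input

print(predicted_noise.shape)  # (8, 8, 4)
```

The scaling argument in the text corresponds to the `transformer_block` line: capacity grows by stacking more blocks and widening `d`, the same recipe used for language models, rather than by redesigning a convolutional U-Net.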