Griffin Floto
I'm a research engineer who works on generative AI.
Most recently, I was a member of technical staff at Ideogram from 2024-2025, where I worked on distillation and inpainting models.
In 2023 I completed an MSc in Computer Science at the University of Toronto under the supervision of Scott Sanner. Most of my work focused on diffusion models and variational autoencoders.
From 2022-2024 I was a consultant at EthicalAI. Primarily, I trained vision transformers for PAVE and helped them secure a partnership with Amazon. We also worked with Aethon Aerial and Toronto Hydro.
Papers
I would like to publish outside the year 2023 at some point.
-
The Tilted Variational Autoencoder: Improving Out-Of-Distribution Detection
Griffin Floto, Stefan Kremer, Mihai Nica
ICLR, 2023
-
Diffusion on the Probability Simplex
Griffin Floto, Thorsteinn Jonsson, Mihai Nica, Scott Sanner, Eric Zhengyu Zhu
ICML Workshop on Sampling and Optimization in Discrete Space, 2023
-
DiffuDetox: A Mixed Diffusion Model for Text Detoxification
Griffin Floto, Mohammad Mahdi Abdollah Pour, Parsa Farinneya, Zhenwei Tang, Ali Pesaranghader, Manasa Bharadwaj, Scott Sanner
Findings of ACL, 2023
-
Bayesian Knowledge-driven Critiquing with Indirect Evidence
Armin Toroghi, Griffin Floto, Zhenwei Tang, Scott Sanner
SIGIR, 2023
-
LogicRec: Recommendation with Users' Logical Requirements
Zhenwei Tang, Griffin Floto, Armin Toroghi, Shichao Pei, Xiangliang Zhang, Scott Sanner
SIGIR, 2023
Presentation Slides
Below are the slides for some presentations I gave while completing my master's degree. I remain broadly interested in these topics.
-
Diffusion Models
In-depth presentation of diffusion models, primarily focusing on the DDPM paper. Some connections are made to the continuous SDE interpretation.
-
State Space Models
Introduction to state space models, covering how the Mamba model was developed and its relationship to the S4 and HiPPO papers.
-
Geometry and Topology in Deep Learning
Introduces the manifold hypothesis. Shows how to estimate intrinsic dimension and gives some examples. Discusses how data constraints and group symmetries can be used to shed light on the intrinsic structure of a dataset. Concludes by suggesting we could model the data manifold and density independently.
-
Differential Equations and Machine Learning
Discusses how differential equations relate to neural ODEs, diffusion models / continuous normalizing flows, and graph neural networks.
Projects
Some hobby projects I've enjoyed working on and hope to continue in my spare time.