Welcome
Session 1
Some thoughts on Floating Point
Jean-Michel Muller, CNRS
APyTypes: Flexible Data Types for Finite Word Length Simulations in Python
Theodor Lindberg, Linköping University
Vector-friendly floats with 64n-bit precision
Fredrik Johansson, Inria Bordeaux
Software-Defined Floating-Point Number Formats
Amir Sabbagh Molahosseini, Queen's University Belfast
Break
Session 2
Floating point for AI
Andrew Fitzgibbon, Graphcore
A Datapath Dialect for Representing Operations with Carry-Save Results
Sam Coward, University College London
Upstreaming Arm-Optimized Vector Math Routines
Pierre Blanchard, Arm
Scaling Arithmetic via Bitblasting in Probabilistic Programs
Poorva Garg, UCLA
Break
Session 3
Detecting Numerical Instability in ML Code via Soft Assertions
Wei Le, Iowa State University
Correctly Rounded Randomness
Nima Badizadegan, Anthropic
Accurate and Efficient Formats for LLMs
Jordan Dotzel, Cornell University
Experiments in Double-Double
Pavel Panchekha, University of Utah
Conclusion
For more information, please see the FPBench project and check out recordings from past FPTalks workshops.

The FPTalks Workshop Series is supported in part by the National Science Foundation Division of Computer and Network Systems under award CNS-2346394.