"Table 1: Details of FP8 Binary Formats" compares the two formats. For anyone complaining about alt text, please read the posts I made.
https://social.librem.one/system/media_attachments/files/019/025/323/original/35edd798cc7bf470.png
Screenshot from the PDF.
It appears that #E4M3 (the one without multiple NaNs, infinities, etc.) has less range, and cannot represent values as close to 0, but has smaller steps between adjacent numbers on average.
E4M3 also has NaN, just one type of NaN (nice).
I'm not thinking about #deepLearning here in particular, but I think I prefer E4M3 personally (for what it's worth).
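Out of curiosity, here is a minimal decoding sketch (plain Python; the function name decode_fp8 and its parameters are my own) of how I understand the two layouts: E4M3 uses bias 7 and reserves only S.1111.111 as NaN (no infinities, max finite 448), while E5M2 uses bias 15 and keeps the IEEE-style all-ones exponent for inf/NaN (max finite 57344).

def decode_fp8(byte, exp_bits, man_bits, bias, ieee_like):
    """Decode an 8-bit pattern as a Python float.
    E4M3: exp_bits=4, man_bits=3, bias=7,  ieee_like=False
          (only S.1111.111 is NaN, no infinities -> max finite 448)
    E5M2: exp_bits=5, man_bits=2, bias=15, ieee_like=True
          (IEEE-style: all-ones exponent is inf/NaN -> max finite 57344)
    """
    sign = -1.0 if (byte >> 7) & 1 else 1.0
    exp = (byte >> man_bits) & ((1 << exp_bits) - 1)
    man = byte & ((1 << man_bits) - 1)
    max_exp = (1 << exp_bits) - 1

    if ieee_like and exp == max_exp:
        # E5M2: mantissa 0 means infinity, anything else is NaN
        return sign * float("inf") if man == 0 else float("nan")
    if not ieee_like and exp == max_exp and man == (1 << man_bits) - 1:
        # E4M3: the single NaN pattern (per sign)
        return float("nan")

    if exp == 0:
        # subnormal: no implicit leading 1
        return sign * man * 2.0 ** (1 - bias - man_bits)
    return sign * (1 + man / (1 << man_bits)) * 2.0 ** (exp - bias)

print(decode_fp8(0b0_1111_110, 4, 3, 7, False))   # 448.0, largest finite E4M3
print(decode_fp8(0b0_11110_11, 5, 2, 15, True))   # 57344.0, largest finite E5M2
print(decode_fp8(0b0_0000_001, 4, 3, 7, False))   # 2**-9, smallest E4M3 subnormal
print(decode_fp8(0b0_00000_01, 5, 2, 15, True))   # 2**-16, smallest E5M2 subnormal

The last two lines show the trade-off I mean: E5M2 reaches much closer to 0 and much further from it, while E4M3 spends its extra mantissa bit on finer steps within its narrower range.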
Interesting development. I never thought float8s could have any use, and here we are in 2025 with potential uses for them.