Elon Musk @elonmusk
@samiralyateem @ID_AA_Carmack There are a lot of pointless zeroes in FP32 neural nets. You can chop off 16 of the 32 bits without losing meaningful precision. This works well with neural nets, but not with regular computing, which expects extreme precision. — PolitiTweet.org
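The claim above can be sketched numerically. A minimal illustration, assuming NumPy and weights drawn from a standard normal (typical for neural-net parameters): literally zeroing the low 16 bits of each FP32 value (which keeps the sign bit, all 8 exponent bits, and the top 7 mantissa bits, i.e. a bfloat16-style truncation) changes each weight by well under 1%, which is far below the noise level neural nets tolerate but far too coarse for general-purpose numerical computing.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal(10_000).astype(np.float32)  # stand-in for NN weights

# "Chop off" the low 16 of the 32 bits: reinterpret each float as a uint32,
# mask away the bottom 16 mantissa bits, and reinterpret back as a float.
chopped = (weights.view(np.uint32) & np.uint32(0xFFFF0000)).view(np.float32)

# Worst-case relative error is bounded by 2^-7 ~= 0.78%, since 7 explicit
# mantissa bits remain after truncation.
rel_err = float(np.max(np.abs(chopped - weights) / np.abs(weights)))
print(f"worst relative error after dropping 16 bits: {rel_err:.2%}")
```

Hardware bfloat16 rounds rather than truncates, so its error is roughly half of this upper bound, but the point stands either way: the discarded bits carry precision that most neural-net workloads never needed.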