CROSS Affiliate John Gustafson Speaks at UCSC Computer Science Department’s Distinguished Lecturer Series

September 25, 2017

Beating Floating Point at its Own Game: Posit Arithmetic

Start Time: Friday, October 6, 2017 – 11:00 AM
End Time: Friday, October 6, 2017 – 12:30 PM
Location: Engineering 2, Room 180

Abstract:

A new data type called a posit is designed as a direct drop-in replacement for IEEE Standard 754 floating-point numbers (floats). Unlike earlier forms of universal number (unum) arithmetic, posits do not require interval arithmetic or variable-size operands; like floats, they round if an answer is inexact. However, they provide compelling advantages over floats, including larger dynamic range, higher accuracy, better closure, bitwise identical results across systems, simpler hardware, and simpler exception handling. Posits never overflow to infinity or underflow to zero, and “Not-a-Number” (NaN) indicates an action instead of a bit pattern. A posit processing unit requires less circuitry than an IEEE float FPU. With lower power use and a smaller silicon footprint, the posit operations per second (POPS) supported by a chip can be significantly higher than the FLOPS achievable with similar hardware resources. GPU accelerators and deep learning processors, in particular, can do more per watt and per dollar with posits, yet deliver superior answer quality. A comprehensive series of benchmarks compares floats and posits on the number of decimals of accuracy produced at a set precision. Low-precision posits provide a better solution than “approximate computing” methods that try to tolerate decreased answer quality. High-precision posits provide more correct decimals than floats of the same size; in some cases, a 32-bit posit may safely replace a 64-bit float. In other words, posits beat floats at their own game.
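For readers who would like a concrete feel for the data type before the talk, the sketch below decodes a posit bit pattern into a real value using the sign/regime/exponent/fraction layout described in Gustafson and Yonemoto's published posit work. It is an illustrative Python sketch written for this announcement, not part of the talk materials; the function name, the default nbits/es values, and the choice to return a Python float (including returning infinity for the single exception pattern) are assumptions made here for clarity.

    # Illustrative sketch only: decode an nbits-wide posit with es exponent bits.
    # The layout assumed here (sign, regime run, exponent, fraction with a hidden
    # leading 1) follows the published posit format; the names and defaults below
    # are hypothetical, not a library API.
    def decode_posit(bits: int, nbits: int = 16, es: int = 1) -> float:
        mask = (1 << nbits) - 1
        bits &= mask

        # Special patterns: all zeros encodes 0; the sign bit alone encodes the
        # single exception value (±infinity in the 2017 formulation), returned
        # here as a Python infinity for simplicity.
        if bits == 0:
            return 0.0
        if bits == 1 << (nbits - 1):
            return float("inf")

        # Negative posits are stored as the two's complement of their magnitude.
        sign = -1.0 if bits >> (nbits - 1) else 1.0
        if sign < 0.0:
            bits = (-bits) & mask

        # Bits after the sign, viewed as a string for readability.
        body = format(bits & (mask >> 1), "0{}b".format(nbits - 1))

        # Regime: a run of identical bits ended by the opposite bit or by the end
        # of the posit. A run of m ones means k = m - 1; m zeros means k = -m.
        run_bit = body[0]
        m = len(body) - len(body.lstrip(run_bit))
        k = m - 1 if run_bit == "1" else -m

        # Exponent (up to es bits) and fraction occupy whatever bits remain after
        # the regime and its terminating bit; missing trailing bits read as zero.
        rest = body[m + 1:]
        exp_field = rest[:es].ljust(es, "0")
        exponent = int(exp_field, 2) if exp_field else 0
        frac_field = rest[es:]
        fraction = int(frac_field, 2) / (1 << len(frac_field)) if frac_field else 0.0

        useed = 2 ** (2 ** es)   # the regime scale factor
        return sign * useed ** k * 2 ** exponent * (1.0 + fraction)

With the assumed defaults (16 bits, es = 1, so useed = 4), decode_posit(0x4000) returns 1.0 and decode_posit(0x0001) returns the smallest positive posit, 4**-14 = 2**-28; the long regime run is what gives posits their large dynamic range, while bit patterns near 1.0 leave the most bits for the fraction and hence the most accuracy.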

Bio:
Dr. John L. Gustafson is an applied physicist and mathematician who is a Visiting Scientist at A*CRC and a Professor at NUS. He is a former Director at Intel Labs and a former Chief Product Architect at AMD. A pioneer in high-performance computing, he introduced cluster computing in 1985 and in 1988 gave the first demonstration of scalable massively parallel performance on real applications; the scaling argument behind that work is now known as Gustafson's Law, and the demonstration won the inaugural ACM Gordon Bell Prize. He is also a recipient of the IEEE Computer Society's Golden Core Award.