Found in 7 comments on Hacker News
daly · 2021-04-24 · Original thread
The field of software is exploding in many directions (e.g. machine learning, quantum programming, program proofs, dependent types, category theory, etc).

At the hardware level it is also exploding. There are a dozen new computer architectures and instruction sets. It is now possible to design your own CPU (yes, I am) and build it in an FPGA for a few dollars. That forces you to learn Verilog and processor design. It also forces you to learn electronics in order to breadboard your ideas.

The field is also converging. Intel now has a CPU that also has an FPGA. (This is only available to the FAANG players it seems, which is a source of frustration for me. Why, Intel, why?) Imagine being able to build your own instructions "on the fly". I want to implement Gustafson's UNUM arithmetic (see "The End of Error" https://www.amazon.com/End-Error-Computing-Chapman-Computati...).
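To give the flavor of what implementing it involves, here is a toy decoder for a Type I unum. This is my own sketch, not code from the book: it handles normal numbers only (no subnormals, infinities, or NaN), and the bit-string interface is just for illustration.

```python
from fractions import Fraction

def decode_unum(bits, esizesize, fsizesize):
    """Decode a Type I unum given as an MSB-first bit string.

    Layout, left to right: sign | exponent (es bits) | fraction (fs bits)
    | ubit | (es-1) field | (fs-1) field.  Simplified sketch: normal
    numbers only (no subnormals, infinities, or NaN handling).
    """
    # The "utag" at the right end says how wide the other fields are,
    # which is what makes the format self-describing and variable-width.
    fs = int(bits[-fsizesize:], 2) + 1
    es = int(bits[-(fsizesize + esizesize):-fsizesize], 2) + 1
    ubit = bits[-(fsizesize + esizesize + 1)]
    body = bits[:-(fsizesize + esizesize + 1)]
    sign, e = body[0], int(body[1:1 + es], 2)
    f = int(body[1 + es:1 + es + fs], 2)
    bias = 2 ** (es - 1) - 1
    value = (-1) ** int(sign) * Fraction(2) ** (e - bias) * (1 + Fraction(f, 2 ** fs))
    # ubit = 0 means exact; ubit = 1 means the open interval to the next unum.
    return value, ubit == '0'

# 10-bit unum for exactly 3.0: es = 2, fs = 2, e = 2, f = 2 (i.e. 1.5 * 2^1)
print(decode_unum('0101000101', 2, 2))   # (Fraction(3, 1), True)
```

The appeal for custom FPGA instructions is exactly that self-describing width: the hardware can spend bits only where the value needs them.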

It's not clear what the long-term shakeout will be but it is fun to try to learn what the leading edge is doing.

As the Red Queen in Through the Looking-Glass said:

'Now, here, you see, it takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!'

I'm doing all the running I can to ride the leading edge.

What could be more fun?

andrewl · 2019-07-13 · Original thread
I'd never heard of him. He's done a lot of interesting stuff, mostly in high performance computing. From his personal page:

"Gustafson has recently finished writing a book, The End of Error: Unum Computing, that presents a new approach to computer arithmetic: the unum. The universal number, or unum format, encompasses all IEEE floating-point formats as well as fixed-point and exact integer arithmetic. This approach obtains more accurate answers than floating-point arithmetic yet uses fewer bits in many cases, saving memory, bandwidth, energy, and power."

Has anybody read it?

https://www.amazon.com/dp/1482239868

daly · 2019-04-09 · Original thread
Are they planning to implement "The End of Error" (https://www.amazon.com/End-Error-Computing-Chapman-Computati...) algorithms?
_0w8t · 2016-04-02 · Original thread
The original unum proposal is in [1], which was a very interesting read. This newer version of the format replaces NaN with the notion of an empty set and represents 1/x exactly. Rather splendid!

[1] http://www.amazon.com/The-End-Error-Computing-Computational/...

An interview with Gustafson: http://insidehpc.com/2015/03/slidecast-john-gustafson-explai...

His book on Unums: http://www.amazon.com/End-Error-Computing-Chapman-Computatio...

IEEE floating point is an absolute disaster, so I hope this works out.

From the reddit thread: https://www.reddit.com/r/programming/comments/2ckk5u/unum_a_...

natosaichek · 2016-02-24 · Original thread
I look forward to the end of the float / double.

With variable-sized mantissa and exponent fields, along with error tracking, we'll get rid of that baloney for real.
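The error-tracking half of that can be sketched in a few lines. This is a hypothetical simplification of mine, not the real unum encoding: a closed interval of exact rationals standing in for a unum "ubound", contrasted with the silent rounding of IEEE doubles.

```python
from fractions import Fraction

# IEEE doubles silently round: the error in 0.1 + 0.2 is real but invisible.
print(0.1 + 0.2 == 0.3)           # False

class Ubound:
    """Toy stand-in for a unum ubound: a closed interval [lo, hi] of
    exact rationals.  Hypothetical sketch of the error-tracking idea,
    not the actual bit-level format from the book."""
    def __init__(self, lo, hi=None):
        self.lo = Fraction(lo)
        self.hi = Fraction(hi if hi is not None else lo)
    def __add__(self, other):
        # Interval addition: the true result is provably inside the bounds.
        return Ubound(self.lo + other.lo, self.hi + other.hi)
    def is_exact(self):
        return self.lo == self.hi

a = Ubound(Fraction(1, 10))       # exactly 1/10 -- no hidden rounding
b = Ubound(Fraction(2, 10))
c = a + b
print(c.is_exact(), c.lo == Fraction(3, 10))   # True True
```

The point is that inexactness becomes a visible property of the result instead of an invisible accumulation, which is what the float/double status quo can't give you.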

http://sites.ieee.org/scv-cs/files/2013/03/Right-SizingPreci...

http://www.amazon.com/The-End-Error-Computing-Computational/...

trsohmers · 2015-08-10 · Original thread
The End of Error (http://www.amazon.com/The-End-Error-Computing-Computational/...) is the book describing them in great detail... I'm working with John to put together a publicly accessible wiki, which I hope will be up in the next month or two. That being said, the book is worth having.
