Found in 4 comments on Hacker News
robomartin · 2022-10-10 · Original thread
The answer is a bit more complex when it comes to the transition to modern high-definition television. It's a long story with very interesting twists and turns involving politics, national security, the Pentagon, the US Congress, and none other than Donald Rumsfeld (White House Chief of Staff and, twice, Secretary of Defense).

The story is well chronicled in a book I read about twenty years ago:

https://www.amazon.com/Defining-Vision-Broadcasters-Governme...

The title is intriguing enough:

"Defining Vision: How Broadcasters Lured the Government into Inciting a Revolution in Television"

I can't possibly do it justice here. I'll just mention that one would not be wrong to call Donald Rumsfeld the father of high definition television. His approach to wrangling the ATSC and FCC into adopting a cornucopia of standards was, from a business perspective, nothing less than genius while, from a technical perspective, a complete mess. The fractional frame rates would have evaporated from this planet had it not been for this part of the story.
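As an aside of my own, not something from the book: a "fractional" rate like 29.97 fps is just the integer rate scaled by 1000/1001, a quirk inherited from analog NTSC that the digital standard carried forward. A quick sketch of the arithmetic in Python:

    # Back-of-the-envelope: where the "fractional" frame rates come from.
    # The nominal integer rates are scaled by 1000/1001 (an NTSC legacy).
    for nominal in (24, 30, 60):
        actual = nominal * 1000 / 1001
        print(f"{nominal} fps nominal -> {actual:.3f} fps actual")
    # 24 fps nominal -> 23.976 fps actual
    # 30 fps nominal -> 29.970 fps actual
    # 60 fps nominal -> 59.940 fps actual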

Well worth reading for anyone interested in the technology or working in associated industries. Your jaw will most definitely drop as you get deeper into the story.

joezydeco · 2020-04-30 · Original thread
If you're an insomniac, I recommend Joel Brinkley's book "Defining Vision: How Broadcasters Lured the Government into Inciting a Revolution in Television"

It's an account of the incredibly messy process that got us the ATSC standard in the USA. A little dry but a little interesting.

https://www.amazon.com/Defining-Vision-Broadcasters-Governme...

The upside is that the group took so long to define a standard that digital compression and broadcast technology matured enough for HDTV in the US to be fully digital, rather than analog like the system the Japanese had pioneered and proposed as a world standard.

joezydeco · 2018-08-26 · Original thread
If you want a more complete (although dry) overview of how we got from NHK's MUSE to the USA's ATSC 8VSB / MPEG-2 system, the book Defining Vision: How Broadcasters Lured the Government into Inciting a Revolution in Television is a good resource:

https://www.amazon.com/Defining-Vision-Broadcasters-Governme...

The TLDR is that the USA took so damn long to make up its mind about a standard (while cleverly keeping the spectrum tied up and away from others who wanted it) that digital coding and compression were able to develop and mature into a system that worked. MUSE was an analog standard that would never have fit into a single channel in the US system.
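To put rough numbers on why the delay mattered (my own back-of-the-envelope figures, not from the book): a US broadcast channel is 6 MHz, 8VSB modulation turns that into roughly 19 Mbit/s of payload, and uncompressed HD video runs to hundreds of Mbit/s, so an HD picture only fits once compression approaching 40:1 is practical. A quick sketch of the arithmetic:

    # Rough arithmetic (my numbers, not from the book): why HD in a single
    # 6 MHz US channel had to wait for digital compression to mature.
    width, height = 1920, 1080      # HD raster
    frames_per_second = 30          # 1080i, ~30 frames (60 fields) per second
    bits_per_pixel = 12             # 8-bit 4:2:0 chroma subsampling
    raw_mbps = width * height * frames_per_second * bits_per_pixel / 1e6
    channel_mbps = 19.39            # approximate ATSC 8VSB payload in 6 MHz
    print(f"uncompressed HD: ~{raw_mbps:.0f} Mbit/s")
    print(f"channel payload: ~{channel_mbps} Mbit/s")
    print(f"compression needed: ~{raw_mbps / channel_mbps:.0f}:1")
    # -> ~746 Mbit/s raw vs ~19.4 Mbit/s of channel capacity, i.e. ~38:1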

It's been a few years (OK, >10) since I spent some time at the Rochester Institute of Technology studying aspects of Color Science and later at UCLA studying image sensor design from the guys who designed and built nearly every image sensor that's gone into space.

The problem with the "we need better sensors" question is that, in reality "they" don't, "we" do.

By this I mean that the vast majority of the people on this planet are well served by a color system, from sensor to display, that provides the images we get today. These images are great for everything from selling you an iPhone, to entertaining you for a couple of hours with a movie, to printing stunning images in a Victoria's Secret catalog, to posting about your vacation in Maui with your kids on Facebook. There are well-understood color management approaches for making all of the above work very well.

In other words, from "their" perspective, there are no problems and "we" are all crazy.

Would people be amazed by the images one could produce with better sensors on matching display systems? Absolutely. Just as I was when I saw analog HDTV at least ten years before it got to consumer-land.

However, the issue really becomes one of economics. Consumer electronics isn't about excellence. It's about a simple question: "What's the next piece of shit we can get everyone hooked on?".

Famously: https://www.youtube.com/watch?v=8AyVh1_vWYQ

OK, that's a little harsh, but, yes, consumer electronics companies are always on the hunt for the next mass craze in the segment. Remember how everyone needed a 3D TV --not--, or how everyone needed a 240Hz TV --not--, and now everyone needs 4K --not--? Consumer electronics companies are constantly throwing stuff at the wall to see if anything will take off, or whether they can trigger a new "need" or "must have" through marketing and back-door content creation.

The reality has been that almost everything past the transition to HD and LCD TVs has failed to engage because, well, people don't need it. The transition from CRTs to LCDs, accelerated artificially due to RoHS [1], delivered a visible and measurable (in layman's terms) step improvement. People could derive satisfaction from spending the money, and they eventually fell in line and behaved like good little consumers. Yet the entire transition had to be engineered at a massive level. I'd recommend that anyone interested in the subject, and in particular in how we got HDTV, read a nice little book titled "Defining Vision":

http://www.amazon.com/Defining-Vision-Broadcasters-Governmen...

I'll just mention a tidbit that might send a bunch of readers off to buy it: we have to thank Donald Rumsfeld for it. Yes, that Donald Rumsfeld, former Secretary of Defense:

https://en.wikipedia.org/wiki/Donald_Rumsfeld

If you think we got HDTV on technical grounds...well, read the book.

That's a long way around to say we don't have better imaging systems because the segment of the population who might legitimately need them is minuscule and has virtually no market power. A better imaging system would be a set of very expensive laboratory instruments used for a range of what I'll term esoteric tasks. In the meantime, what we have today is beyond good enough for anyone watching the World Cup or an episode of Lucy.

[1] https://en.wikipedia.org/wiki/Restriction_of_Hazardous_Subst...
