0. a huge dilution of the concept of expertise (see my other point) and easy access to crap information mixed in with the good in huge piles,
1. a big waste of time for children/teens/infantilized adults (and a killer of socialization) that's already an "addiction",
2. something ho-hum (I was there in the 80s and 90s, before GPS became widespread. We could still walk around towns, find our way, and drive places),
3. OK-ish, but still an order of magnitude less helpful in saving lives than early low-hanging fruit like access to running water, antibiotics, hand-washing in hospitals, etc.,
4. Still a yawn atm.
5. A fad if I ever saw one, touted to "change the world" and already nearly forgotten except in enthusiast circles,
6. Something still marginally useful, and with a large potential for a dystopian future (large parts of the population living in slums as their work is not required, drones/robots used to police autocratic states, etc.).
>You aren't excited about having all of human knowledge in your pocket?
No. I'm more excited about the output (books, articles, etc.) from people pre-2000 (sometimes much earlier) who didn't have "all of human knowledge in [their] pocket" and had to study hard, be dedicated, and actually digest the information to consider themselves knowledgeable.
As opposed to "instant faux-experts" (people confident enough to chime in because they've read 2 paragraphs about a subject on Wikipedia - or worse, on something like an anti-vaxxing website), and: https://www.amazon.com/Shallows-What-Internet-Doing-Brains/d...
Access to "all of human knowledge" was hardly ever a problem after the invention of the printing press, and even less so in the 20th century with libraries, bookstores, media, and so on. Knowing what to read, how to value a piece of knowledge (which could be crap, like 90% of what's on the net is), and understanding what you've read have been problems forever.
I highly recommend you check out "The Shallows: What The Internet Is Doing To Our Brains" by Nicholas Carr.
 - https://www.amazon.ca/Shallows-What-Internet-Doing-Brains/dp...
My solution was to drastically cut back my internet use; most of the time I'd have spent online, I read books instead. I do what I need to do online for work, and I browse HN once a day to keep up with the latest and greatest.
I've found that my attention span returned relatively quickly, and reading books for hours at a time became easy again. The neat thing is that the increased attention span didn't just apply to books. I'm able to get more done at work, too, by staying locked on to whatever I'm doing and not getting distracted.
Carr—author of The Big Switch (2007) and the much-discussed Atlantic Monthly story “Is Google Making Us Stupid?”—is an astute critic of the information technology revolution. Here he looks to neurological science to gauge the organic impact of computers, citing fascinating experiments that contrast the neural pathways built by reading books versus those forged by surfing the hypnotic Internet, where portals lead us on from one text, image, or video to another while we’re being bombarded by messages, alerts, and feeds. This glimmering realm of interruption and distraction impedes the sort of comprehension and retention “deep reading” engenders, Carr explains. And not only are we reconfiguring our brains, we are also forging a “new intellectual ethic,” an arresting observation Carr expands on while discussing Google’s gargantuan book digitization project.
I've also read Nick Carr's "The Shallows" and other authors' writing about the web's effect on attention span, distractions, etc.
With all that said, I'm not convinced that people "should" read long form books. I read all those books because I personally enjoyed it. I just can't say with confidence that others should do the same or they will be "missing out" on some unquantifiable intellectual nirvana.
I also enjoy getting lost in Wikipedia articles and jumping around hyperlinks without fully finishing the wiki article I was reading. (Wiki articles are not ever "finished" anyway so there's no guilt trip in leaving the page to head down another rabbit hole.)
15 years ago, I read a dozen C++ books cover-to-cover. Can someone today get a similar level of knowledge jumping around quality blog posts and watching YouTube videos? I think so. I don't hold my traditional reading method for C++ to be superior; it's simply what I did before the internet was widely available, back in 1995. I certainly did not learn Golang by reading a book cover-to-cover.
Books certainly have benefits but I think they are overstated in relation to non-book forms of consuming words.
It does, actually. Read Nicholas Carr's 'The Shallows'; it's a pretty decent book on the subject. It also starts off by comparing our usage of the internet with the rise of reading - you know, books and the like. History lesson: humans needed to adapt their brains to be able to read attentively for longer periods of time. The book contrasts that with the ADD nature of the internet, and yet indicates how it's actually going back to where we were before. Or just a change similar to when books became publicly accessible.
tl;dr, yes there is a change, but I don't think it's necessarily good or bad; just different. And shocking to, and resisted by, the older generation, just as their parents were shocked by and resisted the Beatles and similar long-haired freaks. :p
I've noticed this before MOOCs, actually. Take web development tutorials, for instance. A few colleagues of mine loved video tutorials from (e.g) lynda.com; I found it utterly boring and inefficient.
Changes in technology have a fundamental impact on the way humans interact with the world (for better or worse); an interesting book called The Shallows highlights some of these points. Is the technology we're utilizing moving us in a direction that is long-term beneficial or harmful? People can access information more easily, but at the expense of what - a lack of focus? Problems with deep thought and long-term planning?
This is tangential, but I find this an interesting topic - your memory is not limited, at least not in the way most people think it is. To be accurate: a body of experiments has failed to turn up any evidence of old knowledge preventing the formation of new knowledge, or of new knowledge pushing out the old.
What limits the formation of new knowledge is time - you could learn A or B, but not both at the same time. What makes us forget piece A is not the piece B committed to memory after A, but the absence of repetition of A. If you repeat A every so often, you will remember it equally well regardless of whether you also repeat B during the same period.
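The repetition claim above can be sketched with a toy forgetting-curve model. This is a minimal illustration of my own making (the exponential-decay form and the `decay` rate are assumptions, not the model from the studies the book cites): retention of an item depends only on how long ago that item was last reviewed, so reviewing B has no effect on A.

```python
import math

def retention(review_days, now, decay=0.1):
    """Retention in [0, 1] of one item, modeled as exponential decay
    that resets each time the item is reviewed."""
    last_review = max(d for d in review_days if d <= now)
    return math.exp(-decay * (now - last_review))

# Item A is reviewed every 7 days; item B only once, on day 0.
a_reviews = list(range(0, 60, 7))
b_reviews = [0]

print(round(retention(a_reviews, 60), 2))  # A stays fresh (~0.67)
print(round(retention(b_reviews, 60), 2))  # B has decayed to ~0

# Note: A's retention is a function of A's own review schedule alone -
# adding or dropping reviews of B changes nothing for A.
```

The point of the sketch is structural, not quantitative: in this model there is no term through which one item's reviews interfere with another's, matching the claim that forgetting comes from lack of repetition rather than from new material crowding out the old.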
I get my information from this book, which in turn has references to all of the underlying studies: http://www.amazon.com/The-Shallows-Internet-Doing-Brains/dp/...
"Neuroscientists and psychologists have discovered that, even as adults, our brains are very plastic," Carr explains. "They're very malleable, they adapt at the cellular level to whatever we happen to be doing. And so the more time we spend surfing, and skimming, and scanning ... the more adept we become at that mode of thinking."
I highly recommend The Shallows. It's a look at the way the internet is changing our brains. It really might be a good idea to limit exposure to the internet. As a programmer and geek, it's worth spending some time thinking about these questions and at least being aware of the effects of the medium.
I'm reading it right now and, if nothing else, can totally relate to the "I can't concentrate on things very well anymore" feeling.
Neither of these are novel concepts: we've heard about abysmal internet security (FireSheep) and low attention spans (Nicholas Carr[0, 1] and Jonah Lehrer) repeatedly over the last couple of years.
This release may seem profound to you, but LulzSec proposes no solutions to the problems they're creating. They're too nihilistic to put on white hats, and they deserve none of your praise as a result.