The religion thread got me thinking about this book. I finished it last night.
I enjoyed Sapiens and bought this as a follow-up. I found it disturbing and depressing. I found a very good summary online:
Credit to Daniel Miessler:
Dataism declares that the universe consists of data flows, and the value of any phenomenon or entity is determined by its contribution to data processing.
~ Harari, Yuval Noah (2017-02-21). Homo Deus: A Brief History of Tomorrow (p. 367). HarperCollins. Kindle Edition.
He describes a few core concepts in the same book, which I’ll capture and talk about here.
Dataism is an alternative to liberalism in that it does not place humans and their feelings and decisions at the center of the universe. Instead, it's all about information flow.
The human species can be seen as a single data processing system, and the people are the chips in this computer metaphor.
On this view, competing religions and political systems are simply competing information-processing systems.
We improve the human system by: increasing the number of processors, increasing the variety of processors, increasing the connections between processors, or increasing the transfer of information between connections and processors.
The human processing system went through four main phases: the Cognitive Revolution, which put us all into a single data-processing network; the Agricultural Revolution, which continued until the invention of writing and money; the Writing & Money Revolution; and the Scientific Revolution, which began around 1492.
He believes the ultimate output of this data processing is the creation of an Internet-of-All-Things, which will ultimately make Homo sapiens obsolete.
A primary commandment of Dataism is to maximize the flow of information by connecting to more of everything in order to create and consume more content. Dataism also wants to link everything to itself, which is the bridge into the Internet of Things.
People subscribing to this are more likely to be young, and will be happy to give up their data to make this happen.
Proponents of this ideology believe that the system can come to know you better than you know yourself, just as it can with everyone else. Ultimately we'll rely on it more and more to make every decision, because it will do a better job.
The nature of AI algorithms means we won't know what the system is doing, or how it's so damn good at predicting and choosing things. It will become a black box, and in many cases it already is.
Humanism is all about the experiences within us. Dataists say that anything not shared is wasted, and that we cannot find meaning within ourselves. Supposedly the solution is to send our experiences into the system and have it tell us what we should do next.
Right at the front. Leading the way.