Radio Readers BookByte: Is An Unexamined Life Worth Living?


This is Nicole English coming to you from Fort Hays State University for HPPR's Book-Bytes.

This is a discussion of the book Homo Deus by Yuval Noah Harari. As with his earlier book, Sapiens, Harari attempts to give a perspective on human history and then subject that history to an ethical critique.

The book is divided into three parts:

The first part describes Harari's view of how humans became the dominant species in the world.

The second part describes how, in Harari's view, humans make meaning of the world through symbols, language, and mental and social constructs.

The third part of the book describes Harari's view of a possible apocalyptic future for humans as we embrace new technologies that may come to know us better than we know ourselves, and may dominate humans rather than serve them.

Although I find Harari's ideas very thought-provoking and interesting, I had to make certain adjustments in my thinking while reading the book, particularly concerning his terminology and his assumptions.

While I agreed with many of his conclusions about critically re-assessing technology's uses, I found many of his premises and assumptions, and thus his arguments and logic, to be flawed.

For example, Harari often uses the terms pleasure and happiness interchangeably, even though many of the authors he cites make a distinction between these two concepts (Epicurus, Buddha, etc.). 

However, he finally lands on the conclusion that slowing down the pursuit of pleasure is the preferable strategy for life satisfaction, a conclusion with which I would agree.

Also, what Harari calls "Liberalism" is more commonly referred to as "Neo-Liberalism," or the liberalism of business and consumerism. Understood this way, his arguments appear in a slightly different light.

Harari also has a rather specific interpretation of humanism, seeing it as a religion that puts humans in the place of gods, sending us on a quest for desire, immortality, pleasure, and power at the cost of our values, ethics, and morality. However, humanism could make the same argument about religions.

In other words, the fact that a humanist approach to social problems may not be bound by specific religious doctrine does not mean that humans abandon values, ethics, empathy, or morality. These are social constructs that are created, and become norms, through interactions with other humans.

Harari's understanding of Artificial Intelligence, and of how it works, seems somewhat limited, but then the term has been so misused in recent times as to render it confusing at best and meaningless at worst. Obviously, algorithms are written by humans, so they reflect the biases of their programmers.

Again, however, I do agree with his conclusion that we should not embrace technology, Big Data, and algorithms uncritically, or without question.  To paraphrase one of my favorite philosophers: an unexamined life might not be worth living.