Ep.28: Industrial Society and Its Future | Machine Intelligence, Encryption, and the Will to Power



In this episode of Hidden Forces, host Demetri Kofinas lays out his vision for a possible future driven by the emergent forces we have been covering in 2017. He reads passages from Ted Kaczynski’s “Industrial Society and its Future,” as well as from Bill Joy’s “Why the Future Doesn’t Need Us.” He plays clips from interviews with Barack Obama, Tim Cook, and Jamie Dimon, as he considers how power, privacy, and control, all factor into the emerging technological landscape.

What is the goal we seek to attain with our technologies? What are the benefits and the costs associated with allowing the technological, political, and economic forces of the modern age to continue unabated?

Producer & Host: Demetri Kofinas
Editor: Stylianos Nicolaou

Join the conversation on Facebook, Instagram, and Twitter at @hiddenforcespod

What Kind of World Are We Building? | Industrial Society and Its Future

I’ve been reading Tim O’Reilly’s new book on technology and our future, and it got me thinking about how to reaffirm and refine this evolving mission of Hidden Forces, which has always been founded on the idea that we can’t really know reality – the best we can do is approximate it through abstractions, models essentially – and we do that because we have an innate, biological need to understand the world better; to understand where we find ourselves and where we’re headed, collectively. Never has this been more relevant than today, when our environment – physical, social, technological – is changing so rapidly. The disillusionment of Western society with religious institutions over the centuries has left a gaping hole in our understanding, and it has, I think, exacerbated this intrinsic human need for answers; it has exposed our insecurities; it has intensified our need to find meaning. The exciting, but also unsettling, changes brought to us by the microprocessor – and the subsequent advancements in science and technology that we’ve managed to introduce into our lives so quickly – can feel very threatening. They threaten not only our social and cultural institutions, but our very sense of self, our notions of identity, even our biological preconceptions of what it means to be a human being. One of the reasons I’ve produced so many episodes on issues that circle the wagons of technological futurism is that they expose the arbitrary nature of social reality. The so-called “war on facts” is one of many examples, where we find that one part of society feels that the other is living in a completely different universe with a completely different set of facts. The truth, I think, is that our models no longer work. Our maps are outdated. Our myths have become less relevant, less applicable to lived experience.
It isn’t so much that we are losing touch with reality, it’s that we are losing touch with our shared illusion of it. People are fragmenting and breaking off into smaller and smaller subcultures – and in an increasingly digital world, those subcultures can differ immensely. Humanity may be more homogenous than ever before – we each may look more like each other – but our minds and our shared illusion of reality have never before exhibited such variety, at least not in the physical presence of other people. You could be sitting in a subway car with one hundred other passengers and each one of those individuals has a very different working model for explaining reality. The irony of our world is that two of the most dominant forces of the 20th century – the microprocessor and globalization – are bringing us closer together, physically, just as they are tearing us apart culturally and economically. And this can feel very scary because as exciting as these changes are, they’re also deeply unsettling. They threaten our livelihoods, as well as our political institutions, and I think there’s a real longing for order. If people can’t bring order and predictability to their lives, democratic society falls apart. The social contract is broken. And, the risk of that type of disorder is very frightening, particularly to the most powerful individuals in society, who have the most to lose from a changing status quo. And so, that touches on another force that we’ve spent time covering on the show, which is this increasing disparity in income and wealth.

The 400 richest people in America own wealth equal to nearly one-sixth of the country’s GDP. Forty percent of the country – nearly half – has a negative net worth; they’re operationally bankrupt. Our long-term government liabilities are rising as our birth rate falls and as we erect walls to prevent the immigration we need in order to keep our population from contracting, as it has in Japan, where more diapers are sold for adults than for children.

And so, I was thinking about all of these things as I was reading through Tim’s book, and then, he quoted John Gage, the former chief scientist at Sun Microsystems, and that got me to thinking about Bill Joy, the original chief scientist, the co-founder of Sun Microsystems, and his very prescient article in Wired Magazine from 2000 titled “Why the Future Doesn’t Need Us.” I can’t remember exactly when I first read his article, but I do remember finding it surreal and very, very remote – something that I would never need to worry about. Something that was for a generation that I would never meet. Re-reading it today, with what I know and with what the world looks like, feels very different. His message sounds surprisingly reasonable.

And there’s a passage in this 11,000-word article that sent me down a rabbit hole. It had to do with machine intelligence and automation, and the implications for society and the individual of Moore’s Law – the steady doubling in the number of transistors on our microchips, which has become a stand-in term for the exponential rate of change we’ve seen in all of the areas of our economy touched by technology.
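To get a feel for what that doubling actually implies, here is a minimal back-of-the-envelope sketch. The starting figure (~2,300 transistors on the Intel 4004 in 1971) is a commonly cited historical data point; the two-year doubling period is the rough rule of thumb, not a precise law.

```python
# Rough illustration of Moore's Law: transistor counts doubling
# roughly every two years, starting from the Intel 4004 (~2,300
# transistors, 1971). Illustrative arithmetic only.

def transistors(year, base_year=1971, base_count=2300, doubling_period=2):
    """Estimate transistor count assuming one doubling every `doubling_period` years."""
    doublings = (year - base_year) / doubling_period
    return base_count * 2 ** doublings

# Thirty years of biennial doubling is 2**15, a roughly 32,768-fold
# increase -- which is why exponential change feels so abrupt.
for year in (1971, 1981, 1991, 2001):
    print(year, round(transistors(year)))
```

The point of the sketch is the compounding: each decade multiplies the count by roughly 32, so the curve spends most of its history looking flat and then appears to explode.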

I want to read that passage for you, in its entirety:

First let us postulate that the computer scientists succeed in developing intelligent machines that can do all things better than human beings can do them. In that case presumably, all work will be done by vast, highly organized systems of machines and no human effort will be necessary. Either of two cases might occur. The machines might be permitted to make all of their own decisions without human oversight, or else human control over machines might be retained.

If the machines are permitted to make all their own decisions, we can’t make any conjectures as to the results, because it is impossible to guess how such machines might behave. We only point out that the fate of the human race would be at the mercy of the machines. It might be argued that the human race would never be foolish enough to hand over all the power to the machines. But we are suggesting neither that the human race would voluntarily turn power over to the machines nor that the machines would willfully seize power. What we do suggest is that the human race might easily permit itself to drift into a position of such dependence on the machines that it would have no practical choice but to accept all of the machines’ decisions. As society and the problems that face it become more and more complex and machines become more and more intelligent, people will let machines make more of their decisions for them, simply because machine-made decisions will bring better results than man-made ones. Eventually, a stage may be reached at which the decisions necessary to keep the system running will be so complex that human beings will be incapable of making them intelligently. At that stage, the machines will be in effective control. People won’t be able to just turn the machines off because they will be so dependent on them that turning them off would amount to suicide.

On the other hand, it is possible that human control over machines may be retained. In that case, the average man may have control over certain private machines of his own, such as his car or his personal computer, but control over large systems of machines will be in the hands of a tiny elite – just as it is today, but with two differences. Due to improved techniques, the elite will have greater control over the masses; and because human work will no longer be necessary the masses will be superfluous, a useless burden on the system. If the elite is ruthless they may simply decide to exterminate the mass of humanity. If they are humane they may use propaganda or other psychological or biological techniques to reduce the birth rate until the mass of humanity becomes extinct, leaving the world to the elite. Or, if the elite consists of soft-hearted liberals, they may decide to play the role of good shepherds to the rest of the human race. They will see to it that everyone’s physical needs are satisfied, that all children are raised under psychologically hygienic conditions, that everyone has a wholesome hobby to keep him busy, and that anyone who may become dissatisfied undergoes “treatment” to cure his “problem.” Of course, life will be so purposeless that people will have to be biologically or psychologically engineered either to remove their need for the power process or make them “sublimate” their drive for power into some harmless hobby. These engineered human beings may be happy in such a society, but they will most certainly not be free. They will have been reduced to the status of domestic animals.

If you are anything like me, that last sentence feels jolting and stands as a metaphor that is not altogether inapplicable. What Bill Joy doesn’t reveal until completing the passage is that he did not write it. It was lifted from a 1995 paper titled Industrial Society and Its Future by Theodore Kaczynski. If that name sounds familiar, it’s because it’s the same Ted Kaczynski currently sitting in federal prison in Colorado for killing or injuring over 20 people in the span of more than 15 years of bombings. America knows him popularly as the Unabomber. He is also a Harvard graduate with a Ph.D. in mathematics, who resigned his professorship at Berkeley and moved to the woods of Montana, only to realize that no place on earth could shield him from the technological and industrial forces engulfing the world around him.

I highly recommend the paper to anyone interested in the topic I’m laying out today, not only because of its prescience but also because the seemingly deranged man who wrote it concluded, quite lucidly, that “It would be better to dump the whole stinking system and take the consequences.” And that got me thinking: even if we wanted to dump the whole system, as Kaczynski writes, can we? Where are we in the future that he and Bill Joy lay out for us? And that jolted a memory from a movie that I have often proselytized for – The Matrix – which I think captures, more than any other piece of 20th-century artwork, the dilemma of the modern age. This particular scene takes place on the engineering level of Zion, the only remaining human city, buried deep inside the bowels of an earth inhabited entirely by machines, themselves running on the energy created by human bodies harvested precisely for this purpose. The head of the council of Zion is speaking with Neo, the messiah of this story, about the dilemma facing humanity as they wrestle to vanquish an enemy that they cannot live without:

What is control? That’s a question I’ve been asking myself a lot lately as I grapple with how I feel about the world we’re creating. I’m not confident that we have a good enough grip on it, nor am I convinced that the solution can be found in constraining the application of our imaginative energies or in blowing up the system, as Kaczynski attempted. Blowing up the system is akin to suicide, something that the Councilor alludes to as well. But what is the alternative? If Kaczynski represents an example of one extreme solution, I would contend that the famous futurist Ray Kurzweil represents an equally extreme and suicidal alternative. For those unfamiliar with Kurzweil, he is part of a growing clique of Silicon Valley thought leaders who also believe that humanity cannot survive the 21st century in its current form, though his solution is a step-function evolution – a merging of man with machine. I won’t bother enumerating all the ways in which I find his argument circular and unconvincing, but the short end of my disagreement rests on the questions: “What is it that we hope to accomplish? What is the goal we hope to achieve with our creations? What do we want to get out of all of this?”

I’m not sure what the answer to that question is, but I can tell you that I’m worried about what those in power might think it is. And, that raises another concern, which I’ve already touched on, and that is about where power resides in society, how it’s being unfettered, and how it’s being amplified in ways that I don’t think we’re fully appreciating. We don’t need to enter Kurzweil’s singularity, where machine intelligence surpasses human intelligence, in order to be threatened by the tools we’ve created. As power accumulates towards the top, a smaller number of people will have unlimited access to technologies that would otherwise be confused with magic. We’ve talked about some of them on this show – gene editing, narrow AI, autonomous weapons, mass surveillance technologies – and that carries with it a corollary, which is that, as the wealth and power in society consolidates at the top, the same tools that can be used by a small minority to control society can be used by the individual to destroy it. Ted Kaczynski wanted to blow up the system, but all he had at his disposal were pipe bombs. Far less intelligent, but equally motivated people will have increasingly more sophisticated and capable weapons at their disposal. Advances in synthetic biology and 3D printing make the current concerns about cybersecurity pale in comparison. What happens when one person can create a bug that could kill most of humanity from the comfort of his own home?

It’s something that should concern all of us, but which disproportionately threatens the world’s wealthiest and most powerful. They have the most to lose, and that becomes obvious when you consider the tradeoffs the rest of humanity is being asked to make – tradeoffs of privacy for security, or, in the case of the medical sciences, of ending death and disease in return for the re-engineering of the human race. For those with the wealth or position to avoid TSA lines and mass surveillance, the choice is not so difficult. More surveillance, more control, more advancements – these are the only ways to safeguard a system that works for a smaller and smaller number of people.

And this got me to thinking further about how liberal and progressive-minded people in America have reacted to the election of Donald Trump, where many see him as Hitlerian and as a danger to democracy. I think this reflects a tribal instinct that has blinded us from bearing witness to the expansion of executive power over the last 16 years, under both Republican and Democratic administrations.

I want to play a clip from an interview given by former President Barack Obama at South by Southwest less than one year ago, where he addressed the recent, and quite scary, showdown between the executive branch of the United States government and a private corporation – Apple – in the aftermath of the San Bernardino terrorist attack. The FBI wanted to brute-force the iPhone of a dead terrorist, but needed Apple to essentially write a new operating system that they could then install on the phone, one which would remove the limit on the number of bad attempts one could make in entering the phone’s passcode. Basically, they wanted Apple to compromise the security of all its customers – you and me – in order to satisfy the requests of our government. The clip is about a minute long.
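The arithmetic behind that dispute is worth spelling out, because it shows why the retry limit – not the passcode itself – was the real barrier. The ~80 ms per guess figure below is the hardware-enforced key-derivation delay Apple has described in its iOS security documentation; everything else is illustrative assumption.

```python
# Back-of-the-envelope arithmetic for why removing the retry limit
# mattered in the San Bernardino case. With the limit in place, the
# device wipes itself after ten bad guesses; without it, only the
# key-derivation delay (~80 ms per attempt) slows an attacker down.

SECONDS_PER_GUESS = 0.08  # ~80 ms per passcode attempt (hardware-bound)

def worst_case_hours(digits):
    """Hours to exhaust every numeric passcode of the given length, no retry limit."""
    combinations = 10 ** digits
    return combinations * SECONDS_PER_GUESS / 3600

print(f"4-digit:  {worst_case_hours(4):.2f} hours")   # well under an hour
print(f"6-digit:  {worst_case_hours(6):.1f} hours")   # about a day
print(f"10-digit: {worst_case_hours(10):,.0f} hours") # decades
```

A four-digit passcode has only 10,000 combinations, so once the attempt limit is gone, the entire space falls in minutes. That is why the FBI’s request amounted to asking Apple to defeat its own protection for every customer’s device.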

I wonder: how many of us would readily believe that the government today is more constrained in apprehending a child pornographer, or in solving a terror plot, or in enforcing tax policy? Does anyone remember what life was like before the 9/11 hijackers terrorized the nation by slamming passenger airliners into the two most iconic buildings in America? We granted the government immense powers in nothing short of a national panic, and we have largely added to that list in the years since. It makes me think of something I read once by Mark Danner, a war correspondent, where he talked about how war reveals the nature of power – where it resides, and what the intentions of those who wield it are. This isn’t a personal thing. How you feel about any particular president isn’t going to change the trajectory I’m describing. This is an institutional concern. The executive was never meant to hoard or exercise this level of power or control. Nor was it created in order to rummage through the bank accounts of its citizens. In that clip, Obama drew a direct line from terrorism to encryption, to tax evasion, to hidden bank accounts – and that got me thinking about money, and what it means that we still have some control over our money. We may not have much control left over our privacy, but in America, if you have enough money, you can buy privacy. And that’s instrumental to consider when thinking about how titans like Mark Zuckerberg and Eric Schmidt declare privacy dead – billionaires who’ve made their fortunes from selling our data and assaulting our privacy – who go out of their way to protect their own privacy by setting perimeters around their homes or, in the case of Facebook’s CEO, buying up the four houses adjacent to his own property. Again, this isn’t a personal thing. I doubt most of our world’s business and political elite are sitting around twirling their mustaches, hatching plots to turn us into teacup Chihuahuas.
But, the larger point I’m making is that when those in power become increasingly out of touch with the people over which they hold that power, bad things can happen.

And it explains this unreasonable willingness to compromise our privacy and chip away at our liberties for very marginal gains in security. It puts Jamie Dimon’s recent remarks about Bitcoin in context. Take a listen:

“If you’re a criminal.” That’s not just anyone saying that. It’s the head of the largest bank in the Western world. JP Morgan has $2.6 trillion worth of assets. That’s roughly the annual GDP of the fifth-largest economy in the world, just behind Germany. This helps you appreciate the power wielded by this one individual, who assumes that you are a “criminal” if you want to own and transact in an asset that sits outside the financial system from which he and a very small number of multinational corporations and governments uniquely benefit. And consider also the consternation expressed by the Obama administration when it realized that Apple’s encryption made it impossible to access one very small part of their subject’s communications. The NSA already has tremendous access to our data. Obama’s suggestion that the investigation and the country’s safety depended not only on having unbridled access to all of the network traffic, but also on being able to retroactively demand that a company literally write a special operating system in order to access a particular device, is revealing and alarming in its presumption.

Which brings us to our next three clips of Tim Cook responding to the government’s requests. The relevant parts are between (7:29 – 7:59) in the first clip, along with (0:26 – 0:55) and (1:51 – 2:05) in the second.

A few things strike me when listening to that. The first is the very obvious but often forgotten point that our smartphones are no longer just devices for sending text messages or making calls. They shouldn’t even be called phones. They carry and record more information about us than we can remember about ourselves. That’s not an exaggeration. The other thing that strikes me is the emphasis on encryption. It’s something that Obama, Jamie Dimon, and Tim Cook all touched on: any capacity to hide information from the government is deemed unacceptable. The suggestion is that it threatens our most basic notions of safety – be that from criminals, sex offenders, or terrorists – and that the government, working in conjunction with multinational corporations, is simply incapable of doing its job without claiming for itself the right to reach into the deepest and most intimate parts of our private lives.

Now, consider what all of this means in an increasingly automated future unfolding on servers that are in the physical possession of corporations working hand in glove with governments, and you start to see why encryption and publicly distributed databases are so threatening to the existing power structure. It isn’t that their concerns are illegitimate – that terrorists and criminals won’t try to use these technologies to wreak havoc upon society – but that we aren’t all playing by the same rules and so, we don’t all bear the same costs. We aren’t properly pricing the risks that we’re taking. In this scenario, the bulk of humanity is the externality in the economic equation of the planet’s elite. If forcing people to live like domesticated animals will slightly increase the likelihood of stopping a Ted Kaczynski, a Timothy McVeigh, or a Mohamed Atta, then it stands to reason that this is what will happen, because the future doesn’t just appear out of nowhere. It’s shaped by the forces of the present. And if you don’t act to change the angle of the vector driving those changes, or account for them in the planning of your own life, then you shouldn’t be surprised at the magnitude of the outcome.

Some of you may wonder where I come down on this. You might think that I oppose government intrusion on principle or that I believe privacy is more important than security. I’m not so sure. The more I stare at the future, the more uncertain everything appears. I think we have a legitimate shot at making it safely through this century, but I think it’s going to take more than striking a balance between privacy and security. I think it’s going to require that we form new bonds with one another and that we find a deeper, more empathic means of communicating with each other. Because the more we segregate and subjugate parts of society, the more likely it is that a Kaczynski is going to succeed in breaking the system – and that may well be because the system he is breaking is failing us. This show, and the work that we are doing together by learning more about the forces that are shaping our lives and our future, is part of the solution. But ultimately, what matters is what we choose to do with that understanding.

And with that, I wish you all a happy start to your New Year and I look forward to all the amazing conversations that we will have together in 2018.