
Jaron Lanier: Stop Online Human Exploitation

Published: Thu 3 Apr 2014 10:54 AM
by Suzan Mazur

"Hopefully -- and it's hard for man to make that jump -- hopefully at the end [of the film Insect Gods] you don't care about the fate of the few. You don't care about the fate of man. You see that civilization has advanced in another way. That it's the roaches that have inherited the Earth. That they have become the gods." -- Saturday Night Live comic Michael O'Donoghue in conversation with Suzan Mazur, 1979
Great minds think alike -- that is, the late Saturday Night Live comedic genius Michael O'Donoghue and Jaron Lanier, the "father of virtual reality" and author of the book Who Owns the Future?
In the late Seventies, O'Donoghue was in pre-production on an end-of-the-world film about the ascendancy of the New York cockroach into a diaphanous creature that would replace humans. The film was an homage to Roger Corman, and O'Donoghue told me it HAD to be shot in black and white and that he had rejected funding for color. Sadly, we lost Michael, and Insect Gods never got made.
Two decades later, futurist Jaron Lanier et al. wrote to the New York Times that they had a way to preserve archives for a thousand years that would survive various disaster scenarios, claiming that with a budget of $75,000 they could implant the robust New York cockroach with a "time capsule" of back copies of the Times Magazine and then release a certain volume of the archival insects (eight cubic feet) to breed all over Manhattan. Unlike O'Donoghue's scheme, Lanier's was not intended as black humor.
However, more recent experimenters have demonstrated that commands can be successfully sent from a mobile phone or PC to live cockroaches. So descendants of the Lanier-implanted roaches could have been sabotaged, posing an even greater challenge to New York City and BEYOND, possibly unleashing O'Donoghue's vision.
The rest is history. Lanier, a vulnerable man with a Hitchcock-like profile, Rasta hair and generous eyes, has become an international media darling as virtual reality pioneer, musician and visual artist. He is cited as one of the most interesting thinkers alive -- once a goat herder -- with no formal college education, only honorary PhDs. He is also not a physicist, yet is the lead endorsement on physicist Lee Smolin's recent provocative book, Time Reborn. (Curiously, the other lead endorsement on the book is not from a physicist either. Is Brockman, the book agent, the common thread here?)
In Who Owns the Future?, Lanier describes himself as a "humanist softie," and he seems to really enjoy being a father. As we spoke by phone, for instance, Lanier playfully cautioned his young daughter just back from outdoors that the pollen in her hair might sprout. In fact, Lanier has dedicated the book to his daughter and "[t]o everyone my daughter will know as she grows up," saying further, "I hope she will be able to invent her place in a world in which it's normal to find success and fulfillment."
Indeed, Lanier's concerns and pronouncements in Who Owns the Future?, now in paperback, about the consolidation of power enabled by the digital world are chilling. In our recent phone conversation he told me that "maximum openness actually turns out to be maximum closedness," referring to the five big tightly-controlled platforms.
We also discussed my interest in "Who Owns Origin of Life", touching on the secretive, privately-funded world of protocell development.
I find the most compelling part of Lanier's book to be his hopeful exploration of the idea of people being financially compensated for their online contributions instead of being "exploited" (the caveat being RAMPED-UP TRACKING):
"The existence of advanced networking creates the option of directly compensating people for the value they bring to the information space instead of having a giant bureaucracy in the middle, which could only implement an extremely crude and distorting approximation of fairness."
I also adore Lanier's attention to the real nitty-gritty, "the end of laundry and never having to wear the same dress twice" -- something I once discussed with fashion designer Betsey Johnson. Lanier predicts a countertrend to this technology as well, i.e., greater reverence for vintage and handmade clothing. Bill Blass would have agreed.
Excerpts of my conversation with Jaron Lanier about the future and who owns it follow:
Suzan Mazur: You've written a book titled Who Owns the Future? Do you have concerns about a few people, privately funded, acting as intelligent designers of more or less a second genesis on Earth? Also, do you have concerns about what they will create?
Jaron Lanier: I have what I hope is a somewhat nuanced position on that. I know some of the people in the protocell world and think for the most part they're good eggs. That world is somewhat more ethically aware than the computer science world, for instance, which is spying on people, taking advantage of people, doing damage to the economy.
I am, of course, concerned -- as you are -- with the small number of people making protocells, the extreme control, and that it's being facilitated by a few rich people instead of being publicly funded.
Suzan Mazur: Have you been inside some of the protocell labs?
Jaron Lanier: Sure.
Suzan Mazur: Of these half dozen protocell labs, you've been to one or two of them? You've been to Szostak's lab and to Gerry Joyce's?
Jaron Lanier: I've been to three labs.
Suzan Mazur: Did you have to sign a confidentiality agreement?
Jaron Lanier: I have not signed confidentiality agreements. There might be some that are implied. I'm of the tech and science world.
Suzan Mazur: They trust you.
Jaron Lanier: This notion of very tight control and a very narrow super elite at work is not exclusive to the protocell labs. The whole world is like that right now. The computer cloud that works with Artificial Intelligence and works with machine learning with giant databases is also restricted to the five big platforms, which are pretty tightly controlled ultimately. The whole world has become one of tiny elites and the rest who are kind of left out. It's a negative trend all around with this being one good example.
It blows my mind that in a very twisted way what seems to people like maximum openness actually turns out to be maximum closedness.
Suzan Mazur: How soon do you think we'll have a protocell?
Jaron Lanier: I don't really know. I'm kind of more interested in trying to figure out some way to develop a better empirical technique so we can really know what we've done. I think there's a tremendous fallacy in the field to think that you can see finality, like in computer programming -- that you think you know what can happen.
Suzan Mazur: Thinking that computer chemistry can replace bench chemistry, for instance?
Jaron Lanier: Using simulations for virtual possibilities is perfectly worthwhile… But we can't pretend that the simulation is reliable before really doing the very, very hard work.
We're all familiar with weather prediction. It has gotten better over time. And biology's surely going to be harder than weather to predict. But because it's so much harder to gather data on microbiology, we're free to fantasize that we have more predictive power than we really do. That illusion is the one that really scares me.
Suzan Mazur: Do you see life as algorithmic or nonalgorithmic?
Jaron Lanier: I think it's a misleading question because it depends on what you mean by algorithmic. In terms of formal algorithms we study in computer science or mathematics -- these things do not ever really exist in physical reality at all. The very idea of a computer that can be described by an algorithm is a bit of a fantasy. What we do is we create an artificial zone where entropy [information] is suspended and where there's this perfect determinism for a period of time.
It can't last forever. That's what we think of as a computer. And algorithms very quickly correspond to what we can achieve if we do that, but even then it's not perfect. Things will always break down, after a while a cosmic ray will zap them, etc.
Suzan Mazur: And so you don't see life as algorithmic.
Jaron Lanier: If by algorithmic you mean the thing that we study in computer science -- it doesn't even exist in reality… If by algorithmic you just mean causal, then of course everything is algorithmic. The problem is you're caught between two extreme definitions of algorithmic, and I don't think there's any middle one.
Suzan Mazur: You describe yourself in your book as a "humanist softie" and have also said that we humans may have taken a wrong turn -- implying our turn into the digital world. But that we did it for the right reasons. Other thinkers see it differently. Piet Hut, for instance, an astrophysicist at the Institute for Advanced Study, has said that we may have taken a wrong turn to the objective pole with our very focus on science -- which Hut thinks can't be purely objective anyway. Hut says there are other ways of knowing, that science has been only 1% of our human history and that the other 99% -- the subjective pole -- needs further exploration. Would you comment?
Jaron Lanier: I don't think you can really do experiments with consciousness unless you do experiments with aspects of reality. I mean I understand where Piet Hut's coming from certainly. I'm sympathetic with it but as a practical matter, I don't know what more you do with consciousness other than enjoy it.
Suzan Mazur: David Orban, founder and director of the Singularity Institute for Artificial Intelligence - Europe, told me at a robotics conference in Bergamo a few months ago that people who don't embrace robotics in the future will not be able to survive. Do you agree?
Jaron Lanier: First of all I think it's the stupidest institute ever. It's purely about this religious fantasy of superiority. The whole basis of it is repulsive. Yet the people there are great friends of mine. I admire them. We have fun together. And I tell them all this to their faces. I've also given talks at Singularity about how ridiculous I think it is.
Here's the problem. They say people won't be able to survive if we don't have robotics. Well, how is that different from saying: Oh, if we don't like the way people are, we'll kill them. What is the difference ultimately?
There's a way in which the new sort of vaguely Asperger-like digital technocrat is absolutely lacking in any self-awareness of ethics or morality. It astounds me, again and again. They're my friends, and we like each other, but I do think it's astonishing.
Suzan Mazur: The European Union is heavily investing in industrial robotics and will retrain workers for jobs lost to robotics. But can it work in the US, which is not a social democracy, where job retraining will be left to the unemployed to shoulder the cost?
Jaron Lanier: I don't even know if it can work in Europe. It can work in the early phase in Europe, perhaps, but you can't have a situation where you pretend that all the people aren't needed for anything and robots do the work. This gets to the illusion of Big Data. The truth is that the only way to make machine learning algorithms -- or robotics, or autonomous systems -- work is with what we call Big Data, which means massive contributions from massive numbers of people. Without people creating examples, modifying them and reacting to them and all that, the machines can't work. We're pretending that people are less needed.
Now in the early phase you can train people to work with the robots. If you adhere to the Artificial Intelligence ideology, then gradually you'll find a way to convince yourself that people aren't needed. But you can't have a 100% welfare state. So even Europe will break eventually. But the US will break first certainly, as we'll find out.
Suzan Mazur: Is job application via computer one of the reasons so many people in the US are unemployed? Are the robots looking for catchphrases in CVs failing to recognize the human ability and capability at the other end of the computer? Do we need to return to a more human system of hiring?
Jaron Lanier: It's absolutely true that mechanization of human resources has been cruel and ridiculous. That's very, very true. A lot of that was driven by liability avoidance where if the machine did it, then a company couldn't be sued for discrimination. We're pretending that we're doing things that are really needed but aren't actually needed. It's a giant illusion. I think the reason there are so few jobs for qualified people is that we're exploiting those people online.
Suzan Mazur: How far along are we in the digital world with the concept of rewarding people for their real contribution?
Jaron Lanier: The problem still is that online there's a way for you to pay for stuff but no way for you to earn your real value. People are adding value to all the cloud algorithms that allow Artificial Intelligence to operate, but there is as yet no way for those same people to benefit.
*************

Suzan Mazur is the author of The Altenberg 16: An Exposé of the Evolution Industry. Her reports have appeared in the Financial Times, The Economist, Forbes, Newsday, Philadelphia Inquirer, Archaeology, Connoisseur, Omni and others, as well as on PBS, CBC and MBC. She has been a guest on McLaughlin, Charlie Rose and various Fox Television News programs. Email: sznmzr@aol.com
