From Ray Bradbury’s mechanical hounds in Fahrenheit 451 and Philip K. Dick’s replicants and robot spiders in Blade Runner and Minority Report, down to the Terminator, Star Wars and Matrix franchises, machine armies have ruled science fiction. But now that the Pentagon has sent out a contract for a “Multi-Robot Pursuit System” to hunt down “non-cooperative humans,” sci-fi has taken a backseat to the real world. I covered the intersections and fluctuations of this next-generation phenomenon for AlterNet.
The Pentagon’s Small Business Innovation Research (SBIR) program recently sent out a call for contractors to design a pack of robots whose main purpose would be to track down what the SBIR ominously referred to as “noncooperative human subject[s].”
How does the robot pack decide which human is cooperative and which is not? Welcome to the wonderful, dystopian world of defense pork.
The call immediately raised red flags, and sent philosophical and moral chills, from one end of humankind to the other. Not surprisingly, it was quickly removed from the public Web site before its cyborg spark could evolve into full-fledged paranoia over machine armies and murderous artificial intelligence, the likes of which were previously known only in seminal science-fiction exercises as old-school as Ray Bradbury’s Fahrenheit 451 and Philip K. Dick’s stories “Minority Report” and “Do Androids Dream of Electric Sheep?” and as new-school as the Star Wars, Terminator and Matrix franchises.
According to the SBIR offer, the “Multi-Robot Pursuit System” would need “a software and sensor package to enable a team of robots to search for and detect human presence in an indoor environment.” The robots would be led by a human commander using “semiautonomous map-based control.” For good measure, the offer added that there “has also been significant research in the game-theory community involving pursuit/evasion scenarios.” The robots should weigh a little over 200 pounds apiece, the offer specifies, and three to five of them would be assigned to their human overlord.
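For readers curious what the offer’s nod to “pursuit/evasion scenarios” looks like in practice, the sketch below is a purely illustrative toy, not anything drawn from the solicitation or from real robot software: three simulated pursuers on a grid “map” greedily converging on a randomly wandering target, with perfect information rather than the partial-information games the game-theory literature actually studies. The grid size, the robot count and the chase policy are all assumptions invented for illustration.

```python
# Toy pursuit/evasion sketch (Python). Purely illustrative and hypothetical:
# nothing here comes from the SBIR offer or any real robot software.
import random

GRID = 20                                        # 20x20 indoor "map"
robots = [(0, 0), (0, GRID - 1), (GRID - 1, 0)]  # three pursuers (the offer asks for three to five)
evader = (GRID // 2, GRID // 2)                  # the "noncooperative human subject"

def step_toward(pos, target):
    """Move one cell in whichever direction most reduces distance to the target."""
    x, y = pos
    tx, ty = target
    if abs(tx - x) >= abs(ty - y):
        x += (tx > x) - (tx < x)
    else:
        y += (ty > y) - (ty < y)
    return (x, y)

def evade(pos):
    """The evader drifts one random cell per tick, staying on the map."""
    x, y = pos
    dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    return (max(0, min(GRID - 1, x + dx)), max(0, min(GRID - 1, y + dy)))

for tick in range(500):
    robots = [step_toward(r, evader) for r in robots]  # pursuers close in
    if evader in robots:                               # caught: a pursuer shares the evader's cell
        print(f"Evader cornered at {evader} on tick {tick}")
        break
    evader = evade(evader)
else:
    print("Evader escaped the simulation horizon")
```

Even a toy this crude, where three greedy pursuers usually corner the random walker within a few dozen ticks, illustrates the force-multiplication idea Sharkey warns about later in this piece.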
The superficial logic at work in this curious merger of machine and flesh dictates that this speculative pack of robots would greatly reduce the human danger inherent in hunting down armed or violent persons hiding indoors. After all, robots are already used to detect and detonate incendiary devices; in fact, those very robots have been evolved, armed and deployed to Baghdad, where they are currently awaiting orders to fire. The Pentagon’s future robot pack is just the next inevitable step in that machine evolution: an armed machine given game-theory programming in predation and differentiation. The reductive logic is slightly convincing: If a robot is smart enough to detect bombs, it’s smart enough to hunt down enemies. Give it a gun and count the saved lives on reality TV.
But slightly convincing is also akin to slightly terrifying, in this case, and not because of what it means for machines. Rather, it’s terrifying because of what it says about their masters.
I, Dehumanizer
“It’s not technology we have to worry about, it’s the humans,” argue Arthur and Marilouise Kroker, editors of the academic technology and culture journal CTheory, which counts among its contributors famed theorists Bruce Sterling, Jean Baudrillard, Paul Virilio and DJ Spooky. “Why blame technology? It generally does what it is coded to do. It’s the human sentient understanding of how to take cruel advantage of human weakness that’s the problem. If the image of lethally armed robots can give rise to futurist dystopian visions, it’s probably because that future has already happened with a military command that specializes in dehumanization.”
It’s not just the military: From entertainment spectacles like Heroes to the New York Police Department and all the way down to the torture porn of Rube Goldberg films like Saw and games like Manhunter, American culture is blitzkrieged by mechanized violence.
One of our currently mediated weapons of choice is the Taser, which has been seen on several Heroes episodes in the hands of mortal secret agents looking to hunt and take down renegade and innocent superbeings. In late September, a mentally disturbed Brooklyn man named Iman Morales was Tasered by an NYPD officer, against departmental rules and protocol, and fell immobilized to his death. Michael Pigott, the lieutenant who ordered the illegal Tasering of Morales, was found a couple of weeks later with a bullet in his head and a suicide note at his side.
We have created more ways to kill and die in our heroic narratives, it seems, than to coexist and compromise. Check any of the hyperviolent installments of culturally charged phenomena like Grand Theft Auto, Hostel and Fear Factor, or just watch the rerun in Iraq, and you get the point quickly. Suddenly, armed-robot pursuit seems perfectly normal.
“If robotics and artificial intelligence advance to this, the question will not be about technology but control being used to concentrate power,” explains Jay Stanley, public education director of the American Civil Liberties Union’s Technology and Liberty Project. “We need to get our house in order, institutionally. The underlying problem isn’t the technology, but this large national security establishment grabbing more power and subject to no checks and balances. The National Security Agency has something like 60,000 employees, and who is overseeing them? Congress and its staff of hundreds? We need to appoint privacy commissioners like every other modern industrial country. During the Cold War, we built a massive security establishment; during the War on Terror, we turned the lenses on ourselves, and it was done rapidly.”
Like the grinning, gorgeous greenhorns of Paul Verhoeven’s criminally underrated film adaptation of Robert Heinlein’s sci-fi classic, Starship Troopers, American society has sleepwalked through an intense, expensive militarization that looks like must-see TV. The reality-television phenomenon supplanted real-world privacy invasions and covert torture, replacing those civil liberties violations with wide-screen automatons posing as humans in any number of soap-operatic exercises. Bradbury imagined this world in his foundational novel, Fahrenheit 451, which extrapolated television onto entire walls of mundane programming while, yes, packs of robot hounds hunted down noncooperative human subjects clinging to their books. Which is to say, their human history.
Don’t Do the Precrime If You Can’t Travel Time
“This technology may well come back into the civilian world, if required,” says Noel Sharkey, professor of artificial intelligence and robotics at the University of Sheffield. “A number of U.S. police forces and SWAT teams are already using robots regularly for dangerous situations, and iRobot, the makers of the military packbot, have been working with Taser International to arm the packbot for civilian use.
“Sending a pack of robots into a building for clearance would obviously be useful in some police operations. I can easily imagine them being used for policing riots or demonstrations. Who knows where it will lead with society developing so many laws under the cloak of terrorist prevention?”
Those laws have been fearsome and abused in equal measure. Take the RNC 8, for example, mild-mannered Twin Cities political activists who were pre-emptively arrested, in Philip K. Dick “precrime” fashion, before they had a chance to protest the 2008 Republican National Convention and are now facing charges of — what else? — “furthering terrorism.” That may sound like science fiction, but it’s worse: It’s an apotheosis of Minnesota’s enforcement of the Patriot Act.
“Do robots have to look like sci-fi cyborgs? Or something else?” asks Arthur Kroker, who is professor of political science and director of the Pacific Center for Technology and Culture at the University of Victoria. “How about lethal hunting packs of computer-generated financial markets, configured by robo-traders, running and crashing on automatic, and taking most of the world down with them? Maybe there’s nothing more dystopian than the present.”
Fighting the Future
“This is a clear step towards one of the main goals of the Future Combat Systems project of making a single soldier the nexus for a large-scale robot attack,” Sharkey says of the Multi-Robot Pursuit System. “Force multiplication of this sort can only be achieved through group robot autonomy. It is also another slide down the ramp toward autonomous fighting weapons. Independently, ground and aerial robots have been tested together, and once the bits are joined, there will be a robot force under command of a single soldier with potentially dire consequences for innocents around the corner.”
Or benefits for innocents, the Pentagon might argue. Using robots saves lives, goes the aphorism. Of course, that argument could easily be countered by the military’s current record on murdering innocents in wartime: So far, the occupation of Iraq has erased hundreds of thousands of civilians from the face of the Earth at an economic cost running into the trillions, with no victory defined and no end in sight. In fact, when it comes to war, the United States is a money pit. Just like the Multi-Robot Pursuit System.
“I suspect that these contracts in the short term may not come to much,” cautions the ACLU’s Stanley. “But taxpayer money is best spent on research and education. That’s the best long-term investment in our nation’s future, broader than these narrow military purposes. This is not to say that technological advances can’t be useful, and there certainly is the potential for this project to save lives. But it’s mostly military guys playing with high-tech toys, as they have been doing for decades. And robot armies are a lot scarier than the illegally wiretapped intelligence sitting on a rack in a server farm. But a robot pack’s time is better spent cleaning up litter on Interstate 66. This controversy is telling us more about the present than the future.”
True enough, but sci-fi has always mutated the present and engineered the future. From cell phones and satellites to invisibility cloaks and nanotechnology, it’s only a dream until it becomes a reality. And it usually becomes a reality, one way or another. So I have no problem predicting that robots will replace humans on the battlefield, and I’ll join some esteemed company in doing so. Eventually, we will have to tease out our totalitarian impulses and funnel them into our mechanized progeny and let them have at it while we sit on Olympus and hope they stay down there and fight. Some of us want to know just what the hell we’re going to do if they decide to come home to mommy and daddy.
“As a means to express the darker aspects of our id, technology has worked to devastating effect,” says Mark Pauline, whose robot armies from Survival Research Laboratories have literally gone to war with each other in punishing artistic spectacles. “However, it has been burdened with one serious and unresolved shortcoming: It’s just too impersonal. What could be more impersonal than staring down a machine whose sole purpose is to kill you? As the engineering roadblocks to this type of interaction melt away, it will soon be impossible to say ‘It was just a movie,’ or ‘I had a strange dream.’ When that happens, we will finally have created the one truly worthy bogeyman that has so far eluded us. We will have met the enemy, and he will not be us.”
This article appeared at AlterNet.