I find the idea of an autonomous robot capable of traversing great distances, seeking out and destroying the enemy, be they military or civilian personnel, quite exciting. Besides, when it comes down to it, there's really no such thing as a civilian; the non-military personnel work in the factories that produce the weapons and equipment used to wage war, and the children simply grow up either to build more weapons or to become soldiers.
I'm pretty sure that after a few beta-test wars we could get the software debugged and working in a manner that would satisfy the LibeCrats: the robots would go and kill 100% of the military personnel while not damaging the environment, then capture as many "civilians" as possible and bring them back so the LibeCrats could put them on welfare and give them low-interest business loans.
All kidding aside, if they don't want us to make our combat robots, they should march their dope-smoking, AIDS-infected, same-sex-having, tree-hugging, welfare-recipient freak show of an individual down to the recruiter's office and join the military. If by some miracle one of them actually makes it through boot camp, the rest of us can sit back and laugh at the news when Johnny LibeCrat is KIA by an 8-year-old so-called non-combatant who stuffed a grenade (manufactured by his whacked-out fanatic parents) into his sleeping bag while he was preoccupied thinking about how to get away with using my tax dollars to revoke more of my constitutional rights.
Whilst the language is emotive and I got lost in all the numbers towards the end, the basic thrust of the article is correct: future wars will be a lot more automated than they are now. I think that most military technology in the foreseeable future will be teleoperated, though, rather than fully autonomous. I doubt that the megalomaniacs of the future will be entirely willing to remove humans from the loop altogether.
The author's last paragraph, a terrible cut-and-paste ending, implies that because the U.S. will use robots, the only target available to the enemy will be U.S. (and ally) civilians, not that robots will be directly killing civilians. His examples, of course, are all the recent terrorist events that occurred even though we were not using the war automation described in the article, reducing the solid, main thrust of the article to liberal whimpering.
What the article fails to point out is that the government still lacks the autonomous ability to wage urban warfare, which in my mind would be the ultimate "killer app" for mobile robotics.
After reading those last paragraphs again, I suppose you could take his comment about civilians either way. He says:
Our military may indeed be able to kill at enormous distances with its Frankenstein killing machines. But all that means is that civilians, not the military, become targets.

I disagree with either interpretation, however, so it probably doesn't matter. It seems reasonably clear that modern advances in weapons systems have been designed to make them more accurate, resulting in fewer casualties among non-combatants. On the other hand, if he's saying that retaliating against terrorists with high-tech weapons is the cause of terrorism, I think he has his "cause and effect" backwards. We retaliate against terrorists because they attack civilians. Why terrorists attack civilians in the first place is open to debate, but it seems to be because they're irrational, violent, and uneducated. They hate anyone who has different ideals and religious views (much the same way that Mr. Hallinan seems to hate conservatives because they have different views than his own - hmmm...)
The sad thing is that I think there really are ethical and moral issues to be considered in giving autonomous robots the power to kill but Mr. Hallinan manages to completely miss most of these issues in his attempt to stir up emotional outrage at his imagined conservative conspiracy.
Most definitely, we could have a juicy and lengthy debate regarding the ethics of using autonomous robotic weapons. When you think about it, pushing the "KILL" button on your attack robot is just like pushing the fire button to launch a missile; when it comes down to it, a missile is a self-destructing autonomous robot. You tell the missile where to go to kill the enemy in much the same manner you might tell the killer robot, so I really don't see the difference between missiles and attack robots. When's the last time you read an article debating the ethics of using missiles?