
NEWS AND VIEWS THAT IMPACT LIMITED CONSTITUTIONAL GOVERNMENT

"There is danger from all men. The only maxim of a free government ought to be to trust no man living with
power to endanger the public liberty." - - - - John Adams

Friday, July 17, 2015

Killer Robots: The Soldiers That Never Sleep



Your Tax Dollars at Work

  • It is a neck-and-neck race to human extermination.  Who will win out?  Government-funded Terminators?  A 12 Monkeys-style Black Death?  Or nukes fired by the "best and the brightest" in government?
  • About the best a man can do is go full Jeremiah Johnson: head for the backwoods and hope to ride out the coming apocalypse.


(BBC News)  -  The Super aEgis II, South Korea’s best-selling automated turret, will not fire without first receiving an OK from a human. The human operator must first enter a password into the computer system to unlock the turret’s firing ability. Then they must give the manual input that permits the turret to shoot. “It wasn’t initially designed this way,” explains Jungsuk Park, a senior research engineer for DoDAAM, the turret’s manufacturer. Park works in the Robotic Surveillance Division of the company, which is based in the Yuseong tech district of Daejeon. 

It employs 150 staff, most of whom, like Park, are also engineers. “Our original version had an auto-firing system,” he explains. “But all of our customers asked for safeguards to be implemented. Technologically it wasn’t a problem for us. But they were concerned the gun might make a mistake.”
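To make the two-step safeguard described above concrete, here is a minimal, hypothetical sketch of a human-in-the-loop firing gate: the operator first unlocks the system with a password, then must still give an explicit manual confirmation before anything fires. The class and function names, and the password check itself, are illustrative assumptions only; this is not DoDAAM's actual control software.

import hashlib
import hmac

# Assumed, example-only credential; a real system would not hard-code this.
OPERATOR_PASSWORD_HASH = hashlib.sha256(b"example-only").hexdigest()

class TurretController:
    def __init__(self):
        self.unlocked = False  # firing stays disabled until an operator unlocks it

    def unlock(self, password):
        """Step 1: the operator enters a password to unlock the firing ability."""
        supplied = hashlib.sha256(password.encode()).hexdigest()
        self.unlocked = hmac.compare_digest(supplied, OPERATOR_PASSWORD_HASH)
        return self.unlocked

    def fire(self, operator_confirmed):
        """Step 2: even when unlocked, firing requires an explicit manual input."""
        if not self.unlocked:
            return "refused: turret is locked"
        if not operator_confirmed:
            return "refused: no manual confirmation from operator"
        return "firing authorised"

# Usage: both steps must succeed before the turret will shoot.
turret = TurretController()
turret.unlock("example-only")
print(turret.fire(operator_confirmed=True))   # firing authorised
print(TurretController().fire(True))          # refused: turret is locked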

The Super aEgis II, first revealed in 2010, is one of a new breed of automated weapon, able to identify, track and destroy a moving target from a great distance, theoretically without human intervention. The machine has proved popular and profitable. DoDAAM claims to have sold more than 30 units since launch, each one as part of integrated defence systems costing more than $40m (£28m) apiece. 


The turret is currently in active use in numerous locations in the Middle East, including three airbases in the United Arab Emirates (Al Dhafra, Al Safran and Al Minad), the Royal Palace in Abu Dhabi, an armoury in Qatar and numerous other unspecified airports, power plants, pipelines and military airbases elsewhere in the world.

In 2000, US Congress ordered that one-third of military ground vehicles and deep-strike aircraft should be replaced by robotic vehicles. Six years later, hundreds of PackBot Tactical Mobile Robots were deployed in Iraq and Afghanistan to open doors in urban combat, lay optical fibre, defuse bombs and perform other hazardous duties that would have otherwise been carried out by humans.


"Our weapons don’t sleep, like humans must. They can see in the dark, like humans can’t. Our technology therefore plugs the gaps in human capability.”


Read More . . . .

Future Military Robots





ATK Palletized Autonomous Weapon System (PAWS)

Uh-oh, a robot just passed the self-awareness test

Giving robots the ability to think. 
What could possibly go wrong?


(Techradar)  -  Roboticists at the Rensselaer Polytechnic Institute in New York have built a trio of robots that were put through the classic 'wise men puzzle' test of self-awareness - and one of them passed.

In the puzzle, a fictional king is choosing a new advisor and gathers the three wisest people in the land. He promises the contest will be fair, then puts either a blue or white hat on each of their heads and tells them all that the first person to stand up and correctly deduce the colour of their own hat will become his new advisor.

Selmer Bringsjord set up a similar situation for the three robots - two were prevented from talking, then all three were asked which one was still able to speak. All attempt to say "I don't know", but only one succeeds - and when it hears its own voice, it understands that it was not silenced, saying "Sorry, I know now!"

It might sound a pretty simple task for a human, but it's not for a robot - the bot must listen to and understand the question, then hear its own voice saying "I don't know" and recognise it as distinct from another robot's voice, then connect that with the original question to conclude that it hadn't been silenced.
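A minimal sketch of that reasoning chain is below: two of three robot agents are "silenced", each tries to say "I don't know", and the one that hears its own voice concludes it was not silenced. The robot names and the simple voice-matching check are assumptions for illustration; this is not Bringsjord's actual implementation.

from typing import Optional

class Robot:
    def __init__(self, name, silenced):
        self.name = name
        self.silenced = silenced

    def try_to_answer(self) -> Optional[str]:
        """Attempt to say 'I don't know'; a silenced robot produces no sound."""
        return None if self.silenced else f"{self.name}: I don't know"

    def listen(self, utterance: Optional[str]):
        """If the robot recognises its own voice, it concludes it was not silenced."""
        if utterance and utterance.startswith(self.name):
            print(f"{self.name}: Sorry, I know now! I was not silenced.")

robots = [Robot("R1", silenced=True),
          Robot("R2", silenced=True),
          Robot("R3", silenced=False)]

# Ask all three which of them can still speak; each tries to answer,
# then every robot listens to whatever sound was actually produced.
for speaker in robots:
    spoken = speaker.try_to_answer()
    for listener in robots:
        listener.listen(spoken)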

Logical puzzles requiring an element of self-awareness like this are essential in building robots that can understand their role in society. By passing many tests of this type, it's hoped that robots will be able to build up a group of human-like abilities that become useful when combined.

Read More . . . .


Kyle Reese:  There was a nuclear war. A few years from now, all this, this whole place, everything, it's gone. Just gone. There were survivors. Here, there. Nobody even knew who started it. It was the machines, Sarah.
Sarah Connor:  I don't understand.
Kyle Reese:  Defense network computers. New... powerful... hooked into everything, trusted to run it all. They say it got smart, a new order of intelligence. Then it saw all people as a threat, not just the ones on the other side. Decided our fate in a microsecond: extermination.

