House Price Crash Forum


This topic is now archived and is closed to further replies.

inflating

Killer Robots - U N Seeks Debate

Recommended Posts

Thanks for the link.

This issue has been troubling my thoughts a lot recently.

As long as people are given 20 seconds to comply, everything'll be peachy.


A robot is an automaton, i.e. it is unthinking.

We have had robot killers since time immemorial.

Let's look at automata used for killing:

Traps in the form of snares, hidden pits, spring traps sprung by tripwires.

Landmines.

All automata...


A robot is an automaton, i.e. it is unthinking.

We have had robot killers since time immemorial.

Let's look at automata used for killing:

Traps in the form of snares, hidden pits, spring traps sprung by tripwires.

Landmines.

All automata...

All essentially passive, defensive instruments.

The new batch will be a whole lot more proactive.


All essentially passive, defensive instruments.

The new batch will be a whole lot more proactive.

Let's hope they are powered windows.


Looks like the stuff of nightmares, and they're already available.

(video) http://www.bbc.co.uk.../world-22724583

The article didn't say they were already available... it said many nations are developing them.

The issue with these things is that they are limited, in that they will run out of ammo or power, or simply fall over.

They should be banned, and whoever places such devices should be subject to war crimes charges.

The article also said robots can't commit war crimes... neither can a crossbow... It's the user that commits the crime, so IF one is placed in a community and it massacres people surrendering (not sure who they would be surrendering to) or civvies, then the USER would be the war criminal for placing it.


Let's hope they are powered windows.

A reading from the Gospel of Dick Jones

I had a guaranteed military sale with ED-209. Renovation program. Spare parts for 25 years. Who cares if it worked or not?

edit:

A whimsical spoof on autonomous drones and 'ethical' algorithms that I already posted a while back. Probably a little dry for most people, but well done imho...


A reading from the Gospel of Dick Jones

edit:

A whimsical spoof on autonomous drones and 'ethical' algorithms that I already posted a while back. Probably a little dry for most people, but well done imho...

I meant powered BY Windows... apparently it breaks down a lot. :(


I meant powered BY Windows... apparently it breaks down a lot. :(

As Dick says, if you've got a contract, who cares if it works or not?

No need for a full-on UN debate. Just sit all the delegates down in front of a VCR playing Robocop.


As Dick says, if you've got a contract, who cares if it works or not?

No need for a full-on UN debate. Just sit all the delegates down in front of a VCR playing Robocop.

Excellent point... are cluster bombs illegal?


One dreadful scenario is where the weapons bit works fine but the decision-making bit goes wrong, as shown in the Robocop films.

Or the decision-making bit is fundamentally flawed, but still sold to regimes that are persuaded it works fine.

Remember, there is no scientific evidence that a polygraph works as a lie detector, yet they are used all over the world. Now military & security services are developing systems that can purportedly recognise faces, analyse people's behaviour from how they walk in front of a CCTV camera, or predict their intentions from, say, the text of emails. They don't even have to work; the buyer merely has to be persuaded they work.

Iraqi defence forces were persuaded that re-badged dowsing golf-ball finders were reliable detectors of explosives, FFS. How easy would it be to sell dodgy automated analytics systems to a government like that?

It is a small step to a manufacturer claiming infallibility for such systems, and to them then being deployed.

Would anyone place misguided trust in automated decision-making systems? Of course not... oh, wait, what was the Flash Crash?

And I am not saying that even a reliable automated decision-making process would be OK to roll out.


I think that, as a starting point, the first one or two of Isaac Asimov's three Laws of Robotics should be given the true force of international law, in the sense that they would be binding on whosoever builds, activates, operates or deploys automated equipment, even for war. (The first law is effectively already in place in the civilian context, due to rules regarding criminal liability)

First Law: A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
Second Law: A robot must obey orders given it by human beings, except where such orders would conflict with the First Law.
Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

I think that the third law doesn't address any important ethical issue, until such time as a robot can be a sentient being, possibly entitled to its own rights.


What's to say?

People need to be killed to further political purposes.

Why put your own people in physical harm's way on the battlefield, or in psychological harm's way in the aftermath?

Just type a name into a computer and be done with it.

War is the human condition above all others.


(The first law is effectively already in place in the civilian context, due to rules regarding criminal liability)

As I read the First Law, there's no 'state-sanctioned violence is OK' opt-out. Robots are not to injure people, full stop, be that criminal harm or, ahem, lawful harm.

These things will be rolled out as soon as is practicable, regardless of whether there is legitimate cause for their implementation. They'll be rolled out because of the money that will be made manufacturing and maintaining them, and because they offer the people who run things the tantalising prospect of automata that will enforce the will of the managerial class without question.

The only downside for TPTB that I can see is that these things will remove the protective layer of scapegoats (sub-educated West Virginia National Guardsmen, that sort of thing) between TPTB and the consequences of their orders. Plausible deniability goes right out of the window. Drones really get super-interesting in a political environment where the leadership doesn't even have to pretend it is accountable to anyone.


As I read the First Law, there's no 'state-sanctioned violence is OK' opt-out. Robots are not to injure people, full stop, be that criminal harm or, ahem, lawful harm.

The problem is that the first law contradicts itself.

Say a robot were to witness a shooting spree; its choices are to respond with force, in breach of the first part of the law, or to do nothing, in breach of the second part.
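
For what it's worth, here's a minimal sketch of that dead end as a toy Python check. The scenario and all the names in it are my own, invented for the illustration, not anything from Asimov or from any real system; it just encodes the two clauses of the First Law as a single test and shows that neither available choice passes it.

from dataclasses import dataclass

@dataclass
class Action:
    name: str
    injures_human: bool          # clause 1: "may not injure a human being"
    prevents_ongoing_harm: bool  # clause 2: "...through inaction, allow a human being to come to harm"

def first_law_ok(action: Action) -> bool:
    # Compliant only if it injures nobody AND does not stand by while harm continues.
    return (not action.injures_human) and action.prevents_ongoing_harm

# The shooting-spree scenario: the robot's two realistic options.
use_force = Action("shoot the gunman", injures_human=True, prevents_ongoing_harm=True)
do_nothing = Action("stand by", injures_human=False, prevents_ongoing_harm=False)

for choice in (use_force, do_nothing):
    print(choice.name, "-> complies with the First Law:", first_law_ok(choice))
# Both print False: every option open to the robot breaches one clause or the other.

Either way the robot has breached the law, which is the point: the 'through inaction' clause means there isn't always a compliant option.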


War is the human condition above all others.

Not so. There are cultures that have no concept of war.

All cultures have religion, on the other hand.

(That's not an endorsement of religion, btw.)


Not so. There are cultures that have no concept of war.

Hmm. You do realise that Britain is the most warring nation in the history of humanity, right?

Name one 'culture'... Name one at any time in human history that has been neither a participant in nor affected by acts of war in its tenure.

You are likely referring to select liberals who've chosen to censor history, or you've lived your life in an institution of some sort or another.


Yeah, they should rein in automatic killing machines to stop corrupt authority figures eventually taking over with an invincible army of killer robots.


The problem is that the first law contradicts itself.

Say a robot were to witness a shooting spree; its choices are to respond with force, in breach of the first part of the law, or to do nothing, in breach of the second part.

Yup.

That 'omission of action' qualifier opens the way for all sorts of genuine conflicts and deliberate fudging.

Laws are like that. Especially when you start writing in exceptions.


As I read the First Law, there's no 'state-sanctioned violence is OK' opt-out. Robots are not to injure people, full stop, be that criminal harm or, ahem, lawful harm.

All three laws are just proposals by Asimov.

My point is, in a civilian context, if a machine harms a human, the owner or operator of that machine is potentially criminally liable under existing laws (it will vary from state to state).

As Asimov's first law is not expressly enacted in any jurisdiction, however, there is currently no law forbidding an automated machine from killing in an act of war.

It would be possible to outlaw such weapons, in principle, in the same way that, say, chemical weapons are outlawed. Their use to kill would then be a war crime.

It has rightly been pointed out that, in a simple sense, things such as land mines are automated killers. Thankfully, these are gradually being outlawed across the globe (with resistance from certain fascist states). As machines become more 'intelligent', there is the unhealthy prospect of them being used to make ever more complex 'decisions', thus opening the prospect of even more mistakes.

The US routinely lies about gunship and drone attacks, claiming that targets are always positively identified with proper safeguards. I don't think they would hesitate to lie if a robot made a mistake.

On stock exchanges, they put in 'circuit breakers' that stop everything if HFT platforms get out of hand. Anyone like to gamble on circuit breakers being put in place in a war? I wouldn't.
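
For the curious, the exchange version is roughly this simple: a threshold check on how far the price has moved, and trading halts when it trips. A toy sketch in Python; the threshold and numbers are invented for the illustration, not any exchange's actual rules.

# Toy market-wide circuit breaker: halt trading when the fall from a
# reference price exceeds a set percentage. Values are illustrative only.
REFERENCE_PRICE = 100.0
HALT_THRESHOLD_PCT = 7.0

def should_halt(last_price: float) -> bool:
    # Percentage drop from the reference level.
    drop_pct = (REFERENCE_PRICE - last_price) / REFERENCE_PRICE * 100.0
    return drop_pct >= HALT_THRESHOLD_PCT

for price in (99.0, 95.0, 92.5):
    print(price, "-> halt trading" if should_halt(price) else "-> keep trading")

That's the whole mechanism, which is rather the point of the comparison: the check is trivial; agreeing on who gets to pull the plug is not.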
