
Answer Overview

Response rates from 662 Yorkshire and The Humber voters.

Yes: 39%
No: 61%

Historical Support

Trend of support over time for each answer from 662 Yorkshire and The Humber voters.


Historical Importance

Trend of how important this issue is for 662 Yorkshire and The Humber voters.


Other Popular Answers

Unique answers from Yorkshire and The Humber voters whose views went beyond the provided options.

@9MV4FB5 answered 6 months ago

Yes, but only so long as it is under ultimate human control, the AI is not autonomous in deciding when to intervene, it is used solely to ensure efficiency and accuracy, and there is an absolute failsafe that can stop the AI from going rogue.

@9QB522R answered 5 months ago

Yes, as long as it does not include weapons of mass destruction and is pre-authorised by multiple layers of human agreement.

@9ZCBQKS answered 2 weeks ago

Only if they have been thoroughly tested; then it is up to the military themselves whether they want to use AI.

@9XFDYGT answered 3 weeks ago

AI is liable to tactical errors and accidental attacks on civilians, and can also be hacked by an enemy force and rendered redundant.

@9Q5SXJ6 answered 5 months ago

AI should be utilised only to allow faster decision-making, but the ultimate choice should always require human discretion.

@9R8FL9W answered 4 months ago

Until it can be effectively controlled, AI should not control military weapons for the foreseeable future.

@9QJWJ69 answered 5 months ago

Absolutely not; AI does not have empathy, which is needed for the often ad hoc decisions that must be made when using military weapons.

@9QJDCLD answered 5 months ago

Yes, but the military is still responsible for casualties caused by the weapons, and there must be a human deactivation control point.