
Answer Overview

Response rates from 160 Leeds voters.

Yes: 36%
No: 64%

Historical Support

Trend of support over time for each answer from 160 Leeds voters.


Historical Importance

Trend of how important this issue is for 160 Leeds voters.


Other Popular Answers

Unique answers from Leeds voters whose views went beyond the provided options.

@9MV4FB5 answered 7 months ago

Yes, but only so long as it is under ultimate human control, the AI is not autonomous in deciding when to intervene, it is used solely to ensure efficiency and accuracy, and there is an absolute failsafe that can stop the AI from going rogue.

@9QB522R answered 5 months ago

Yes, as long as it does not include weapons of mass destruction and is pre-authorised by multiple layers of human agreement.

@9ZCBQKS answered 1 month ago

Only if they have been thoroughly tested; then it is up to the military themselves whether they want to use AI.

@9XFDYGT answered 1 month ago

AI can be liable to tactical errors and accidental attacks on civilians, and can also be hacked by an enemy force and rendered redundant.

@9Q5SXJ6 answered 5 months ago

AI should be utilised only to enable faster decision-making, but the ultimate choice should always require human discretion.

@9R8FL9W answered 5 months ago

Until it can be effectively controlled, AI should not control military weapons for the foreseeable future.

@9QJWJ69 answered 5 months ago

Absolutely not; AI does not have empathy, which is needed for the often ad hoc decisions that must be made when using military weapons.

@9QJDCLD answered 5 months ago

Yes, but the military is still responsible for casualties caused by the weapons, and there must be a human deactivation control point.