Read “Amazon’s next big thing may redefine big” (BBC News)

Amazon doesn’t feel it has a responsibility to make sure its groundbreaking technology is always used ethically.

“Civil rights groups have called it “perhaps the most dangerous surveillance technology ever developed”, and called for Amazon to stop selling it to government agencies, particularly police forces.”

“Mr Vogels doesn’t feel it’s Amazon’s responsibility to make sure Rekognition is used accurately or ethically.

“That’s not my decision to make,” he tells me.”

Murky AF. I guess this kind of moral self-absolution is a necessity if you’re in charge of Amazon.

“He likens ML and AI to steel mills. Sometimes steel is used to make incubators for babies, he says, but sometimes steel is used to make guns.”

Amazon’s ML/AI is not a raw material. It’s shaped (and sold) by a cadre of people at Amazon.

Do they build any accountability mechanisms into their algorithms?

They’re making a loaded technology. They’re making the guns, and he’s saying “hey – it’s not our responsibility to add safety catches.”

