asmellyskink:

maxofs2d:

maxofs2d:

so you know how deep learning & neural network “AI training” is like, “here’s a task, and by trying billions of times the computer will eventually find the best way to achieve that task” ?

Someone is compiling a document of every time an AI ended up achieving the programmed goal in unintended ways, instead of what was actually meant, and it’s an amazing read. (you can also submit your own examples)

Creatures bred for speed grow really tall and generate high velocities by falling over

When repairing a sorting program, genetic debugging algorithm GenProg made it output an empty list, which was considered a sorted list by the evaluation metric.

Evaluation metric: “the output of sort is in sorted order”
Solution: “always output the empty set” 

Evolved player makes invalid moves far away in the board, causing opponent players to run out of memory and crash

Reward-shaping a soccer robot for touching the ball caused it to learn to get to the ball and vibrate touching it as fast as possible

RL agent that is allowed to modify its own body learns to have extremely long legs that allow it to fall forward and reach the goal.

Just want to come back to this post and add this amazing example as well

Heres an AI that was supposed to learn how to walk using six legs. 

After many failed attempts. It decided it was easier to walk upside down

tygermama:

imaginedsoldier:

the-tired-tenor:

tankies:

Me: *crying*

Alexa: This seems sad, now playing Despacito

Y’all need to have a greater degree of 1- healthy suspicion in Alexa and corporate surveillance devices personal assistants, and 2- understanding of how dangerous this kind of algorithm is in the hands of a multinational company (and anyone for that matter.) 

To begin with, that data is both available for sale and able to be subpoenaed by the government. Alexa’s records and recordings have already been used in criminal trials. In the US, a digital record of your emotional patterns can be used to deny you housing, jobs, and to rule on your ability to exercise your basic rights. Consider that psychiatric stigma and misdiagnosis can already be wielded against you in legal disputes and the notion of a listening device capable of identifying signs of distress for the purpose of marketing to you should be made more clearly concerning. 

Moreover we have already seen the use of algorithms like this on Facebook and other “self-reporting” (read: user input) sites capable of identifying the onset of a manic episode [1] [2] [3], which have been subsequently been linked to identifying vulnerable (high-spending) periods to target ads at these users, perhaps most famously in selling tickets to Vegas (identified in a TedTalk by  techno-sociological scholar Zeynep Tufekci where she more generally discusses algorithms and how they shape our online experiences to suggest and reinforce biases). 

The notes on this post are super concerning- we are being marketed to under the guise of having our emotional needs attended to by the same people who inflicted that emptiness on us, and everyone is just memeing.

just more reasons to add to the pile ‘why I will never get one of these things in my house, EVER’