Forwarded from FUCK LAWNS (🦌 SovereignCervine 🦌)
Psychosis den🔞
https://www.tumblr.com/jingerpi/790198981975228416
This also goes for so many other places in the world. Think of the worst, shittiest, most "dangerous" place you can think of, and people still manage to LIVE there, not just survive.
I have more than a few friends who have lived internationally, and most will say the same.
Forwarded from USER WAS DIAGNOSED FOR THIS POST
Update on the visa situation. More calls and email botting are needed.
Forwarded from Blockchain & GenAI are Stupid
Mary Gillis (@marygillis.bsky.social)
Every. Single. Story. about women not using AI as much as men portray it as bad for their future career prospects (without a shred of evidence) and a result of women being fearful little ladies rather than looking at AI's output and saying "this sucks, it's…
Forwarded from Mnémosyne's Echo Chamber
UK Plans AI Experiment on Children Seeking Asylum | Human Rights Watch
"The United Kingdom’s announcement on July 22 that it would use AI face-scanning technology to evaluate whether an asylum seeker is under age 18 threatens to harm children seeking refuge. The asylum minister, Angela Eagle, said that the decision was because this experimental technology is likely to be the cheapest option.
[...]
AI face scans were never designed for children seeking asylum, and risk producing disastrous, life-changing errors. Algorithms identify patterns in the distance between nostrils and the texture of skin; they cannot account for children who have aged prematurely from trauma and violence. They cannot grasp how malnutrition, dehydration, sleep deprivation, and exposure to salt water during a dangerous sea crossing might profoundly alter a child’s face.
These AI systems’ inability to explain or reproduce results further erodes a child’s right to redress and remedy following wrongful assessment, while creating new privacy and non-discrimination risks.
The UK government has repeatedly and illegally subjected children seeking asylum to abusive conditions by wrongly classifying them as adults."
"The United Kingdom’s announcement on July 22 that it would use AI face-scanning technology to evaluate whether an asylum seeker is under age 18 threatens to harm children seeking refuge. The asylum minister, Angela Eagle, said that the decision was because this experimental technology is likely to be the cheapest option.
[...]
AI face scans were never designed for children seeking asylum, and risk producing disastrous, life-changing errors. Algorithms identify patterns in the distance between nostrils and the texture of skin; they cannot account for children who have aged prematurely from trauma and violence. They cannot grasp how malnutrition, dehydration, sleep deprivation, and exposure to salt water during a dangerous sea crossing might profoundly alter a child’s face.
These AI systems’ inability to explain or reproduce results further erodes a child’s right to redress and remedy following wrongful assessment, while creating new privacy and non-discrimination risks.
The UK government has repeatedly and illegally subjected children seeking asylum to abusive conditions by wrongly classifying them as adults."
Human Rights Watch
UK Plans AI Experiment on Children Seeking Asylum
The United Kingdom’s announcement on July 22 that it would use AI face-scanning technology to evaluate whether an asylum seeker is under age 18 threatens to harm children seeking refuge.
Forwarded from Military_History