Observations of racist excesses, The future has arrived early
Averell
posted 29/05/2018 10:56
Message #1


https://www.artsy.net/article/artsy-editori...gence-white/amp

QUOTE
It’s no secret by now that artificial intelligence has a white guy problem. One could say the same of almost any industry, but the tech world is singular in rapidly shaping the future. As has been widely publicized, the unconscious biases of white developers proliferate on the internet, mapping our social structures and behaviors onto code and repeating the imbalances and injustices that exist in the real world.

There was the case of black people being classified as gorillas; the computer system that rejected an Asian man’s passport photo because it read his eyes as being closed; and the controversy surrounding the predictive policing algorithms that have been deployed in cities like Chicago and New Orleans, enabling police officers to pinpoint individuals it deems to be predisposed to crime—giving rise to accusations of profiling. Earlier this year, the release of Google’s Arts and Culture App, which allows users to match their faces with a historical painting, produced less than nuanced results for Asians, as well as African-Americans. Additionally, a new book, Algorithms of Oppression: How Search Engines Reinforce Racism, argues that search engines themselves are inherently discriminatory.

“Data sets reflect the hierarchy in which we live,” said Kate Crawford, an artificial intelligence expert at NYU, at a recent research and development salon at the Museum of Modern Art. These biased algorithms and skewed data sets “reify what are fluid social categories,” she said, making the white man the online default, the norm. (Type “CEO” into Google image search and you’ll see an endless mosaic of suited white guys.)
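Crawford’s point about skewed data sets can be made concrete before any debate about causes: the simplest audit is counting who appears in the labels. A minimal sketch, with hypothetical labels standing in for a real dataset:

```python
from collections import Counter

def audit_labels(labels):
    """Count how often each label appears and report its share of the
    dataset, so a skew like the 'CEO' image-search example shows up
    as a number rather than an anecdote."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {label: count / total for label, count in counts.items()}

# Hypothetical toy labels standing in for who appears in a training set.
sample = ["white_man"] * 80 + ["white_woman"] * 12 + ["black_woman"] * 8
shares = audit_labels(sample)
print(shares)  # the majority group dominates: {'white_man': 0.8, ...}
```

Nothing here fixes the imbalance; it only makes it visible, which is the precondition Crawford is pointing at.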

Given all of this, it’s no wonder that when artist Stephanie Dinkins came across a brown-skinned robot, she was surprised. Five years ago, she stumbled upon a YouTube video of an animatronic human bust, named Bina48, shown moving and conversing. Its developer, Martine Rothblatt, had modeled the robot after her wife, a woman of color named Bina Rothblatt. Dinkins recalls her astounded initial reaction: “What the heck is that?” she wondered. “How did a black woman become the most advanced of the technologies at the time?” That encounter led Dinkins to Lincoln, Vermont—where Bina48 lives at Rothblatt’s Terasem Movement Foundation—in order to engage in a series of conversations with the robot about race, intimacy, and the nature of being.

Dinkins has since gone down what she calls a “rabbit-hole” of investigations into the way that culture—particularly the experiences of race and gender—is codified in technology. She has become a strong voice in the effort to sound the alarm about the dangers of minority populations being absent from the creation of the computer algorithms that now mold our lives. Her research into these imbalances has taken her on a head-spinning tour of tech companies, conferences, and residencies over the past few years. Dinkins is currently in residence at the tech programs of both Pioneer Works and Eyebeam, nonprofit art centers based in Brooklyn. She regularly leads community workshops where she educates people broadly about the impact of AI on our lives, aiming to cultivate an attitude toward technology that sees it not as an intimidating monolith—the haunting specter of computers gone awry that we see so often in Black Mirror or, most iconically, in the cunning and calculating character of HAL in Stanley Kubrick’s 1968 film 2001: A Space Odyssey—but as an approachable tool that is, for better or worse, very human.

“We live in a world where we have to always be learning and willing to take on new information, and to do the work to get there, otherwise we’re sunk,” she says. “How do you move forwards in this super fast technological world?” She operates under the principle that if she can get people to think about the future in increments, it’s not quite so daunting. “In five years, what’s my world going to look like? What do I need to be doing now to start dealing with that world?”

Dinkins tries to find accessible avenues into what can seem like brain-scrambling concepts by speaking the language of her target group. In recent workshops at the New York nonprofit space Recess, she worked with kids who’d been diverted from the criminal justice system. She began by inviting them to wrap their heads around what an algorithm is, exactly, finding analog comparisons in “basic things you can do without thinking,” like brushing your teeth, or behavioral tendencies, like those that shape encounters with police. She helped them to see the way in which they are hardwired to react in such moments of conflict. They worked on Venn diagrams to visualize these interactions from their point of view and that of a cop: What were those two people thinking in this shared moment? Where are the overlaps? “Some of [the kids] can be very reactionary, which makes the situation worse,” says Dinkins. “That’s where the algorithm has to change.”

From that familiar territory, Dinkins works her way into talking about online systems and chat bots—pieces of software that emulate the conversational style of humans, and evolve as people enter into dialogue with them—as well as the larger goal of training AI to use language and ideas that relate to a more diverse range of worldviews. Participants in her workshops will often have a go at setting the intentions of a bot, then implanting it with data. One group created a bot whose sole purpose was to tell jokes. The input? Yo mama jokes. “I thought that was just amazing,” says Dinkins. “It’s the idea of taking one’s own culture and putting it into the machine, and using that to figure out how the machine is making decisions.”
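The workshop pattern of “setting the intentions of a bot, then implanting it with data” can be approximated very simply. Below is a toy, hypothetical sketch—a retrieval-style bot that answers with whichever stored line shares the most words with the prompt—offered as an illustration of the idea, not as the software used in the workshops:

```python
def build_bot(intent, examples):
    """Return a toy chat function: given a prompt, reply with the stored
    line sharing the most words with it. 'intent' is only a description
    here; real systems condition behavior on it far more deeply."""
    corpus = [[w.strip(".,!?") for w in line.lower().split()]
              for line in examples]
    def reply(prompt):
        words = {w.strip(".,!?") for w in prompt.lower().split()}
        best = max(range(len(examples)),
                   key=lambda i: len(words & set(corpus[i])))
        return examples[best]
    return reply

# Hypothetical data in the spirit of the workshop's joke bot.
joke_bot = build_bot("tell jokes", [
    "Yo mama so kind she waves at her own reflection.",
    "Yo mama so tall she high-fives the moon.",
])
print(joke_bot("something about the moon"))
# → "Yo mama so tall she high-fives the moon."
```

The point of the exercise survives even at this scale: whatever culture goes into `examples` is exactly what comes back out.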

Dinkins herself is busy turning the experiences of three generations of her family into a bot, an oral-history-cum-memoir of a black family in America in the form of a computer algorithm. She, her aunt, and her niece have been interviewing one another intensively for the past several months, using stock questions intended to get at the fundamentals of their values, ethics, experiences, and the history of their family. The raw interviews are put into a machine learning system that digests them and generates an amalgam of the three family members—a voice in the machine whose manner of speech is a reflection of the family’s language and concerns.
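The “amalgam voice” can be sketched with the crudest possible stand-in for a learned model: a first-order word chain built from several speakers’ text, so generated lines mix the voices. The transcripts below are invented placeholders, and a word chain is far simpler than whatever system Dinkins actually uses:

```python
import random
from collections import defaultdict

def train(transcripts):
    """Build a first-order word-chain from several speakers' text, so
    generated lines blend all voices—loosely echoing how a learned
    model amalgamates the three family members."""
    chain = defaultdict(list)
    for text in transcripts:
        words = text.split()
        for a, b in zip(words, words[1:]):
            chain[a].append(b)
    return chain

def speak(chain, start, length=8, seed=0):
    """Walk the chain from a starting word, stopping early at a word
    with no recorded successor."""
    random.seed(seed)
    out = [start]
    for _ in range(length):
        nxt = chain.get(out[-1])
        if not nxt:
            break
        out.append(random.choice(nxt))
    return " ".join(out)

# Hypothetical stand-ins for the family's interview transcripts.
voices = [
    "family is love and love is work",
    "love is patience and family is first",
]
chain = train(voices)
print(speak(chain, "family"))
```

Even this toy shows the effect she describes: feed it two voices and its output is recognizably neither one alone.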

“I can already see that when you ask [the bot] stuff, it sounds sort of like my family,” says Dinkins. “I know that love is going to come out, that family is going to come out. We haven’t even fed it that much data yet, but it sounds like us. It’s kind of magical. I’ve been thinking about making another bot that’s all about telling people they’re loved. But I realize that’s just my family coming out in another way.”

Dinkins sees in this algorithmic memoir something of a proof of concept: the potential to illustrate how different AI could look when it reflects the experiences and values of a more diverse set of people, and is divorced from market values. “It’s amazing that you put in a certain ethos and ethics and it comes back out,” she says. “What does that mean when it’s detached from commercial imperatives? Because I think that’s super important too. If we’re all after the next buck, we know what we get already. It could have value commercially, but it isn’t about commercial value.”

Dinkins has settled into the reality that advocating for greater representation and human values in code will probably be her life’s work. “The project keeps growing, which is both excellent and crazy,” she says. “People are listening to me, so I’m talking about something that needs to be said, clearly. There’s an urgency about it.”

Taking on the biases of a vast, multinational web of artificial intelligence technologies is no small task. Fortunately, Dinkins is part of a small but growing community of academics, technologists, multidisciplinary professionals, and organizations—like Black Girls Code and Black in AI—who recognize the threat at hand. “It’s a monster,” she says of the scale of the problem, “but I don’t think we can afford to have an adversarial relationship with technology. The work to try to get there is really worth the effort.”

That goal may require radically breaking with received code. It brings to mind the HBO series Westworld, in which we see a fantasy universe populated by robots—and created by white men. Thandie Newton, a black British actor, plays the bot Maeve Millay, the manager of a brothel who gradually unravels the true nature of her reality: that her every action is the result of computer algorithms written by men. “All my life I’ve prided myself on being a survivor. But surviving is just another loop,” she says in one scene of the first season. In another: “Time to write my own fucking story.”



The future




