Observations of racist excesses, The future has arrived early
29/05/2018 10:56 | Message #2401

#TeamZemmour
Group: Members | Posts: 49,981 | Joined: 25/02/2009 | Location: Berlin, Zion | Member no. 813 | Stand: Other club
https://www.artsy.net/article/artsy-editori...gence-white/amp
QUOTE It’s no secret by now that artificial intelligence has a white guy problem. One could say the same of almost any industry, but the tech world is singular in rapidly shaping the future. As has been widely publicized, the unconscious biases of white developers proliferate on the internet, mapping our social structures and behaviors onto code and repeating the imbalances and injustices that exist in the real world.

There was the case of black people being classified as gorillas; the computer system that rejected an Asian man’s passport photo because it read his eyes as being closed; and the controversy surrounding the predictive policing algorithms that have been deployed in cities like Chicago and New Orleans, enabling police officers to pinpoint individuals it deems to be predisposed to crime—giving rise to accusations of profiling. Earlier this year, the release of Google’s Arts and Culture App, which allows users to match their faces with a historical painting, produced less than nuanced results for Asians, as well as African-Americans. Additionally, a new book, Algorithms of Oppression: How Search Engines Reinforce Racism, argues that search engines themselves are inherently discriminatory.

“Data sets reflect the hierarchy in which we live,” said Kate Crawford, an artificial intelligence expert at NYU, at a recent research and development salon at the Museum of Modern Art. These biased algorithms and skewed data sets “reify what are fluid social categories,” she said, making the white man the online default, the norm. (Type “CEO” into Google image search and you’ll see an endless mosaic of suited white guys.)

Given all of this, it’s no wonder that when artist Stephanie Dinkins came across a brown-skinned robot, she was surprised. Five years ago, she stumbled upon a YouTube video of an animatronic human bust, named Bina48, shown moving and conversing.
Its developer, Martine Rothblatt, had modeled the robot after her wife, a woman of color named Bina Rothblatt. Dinkins recalls her astounded initial reaction: “What the heck is that?” she wondered. “How did a black woman become the most advanced of the technologies at the time?”

That encounter led Dinkins to Lincoln, Vermont—where Bina48 lives at Rothblatt’s Terasem Movement Foundation—in order to engage in a series of conversations with the robot about race, intimacy, and the nature of being. Dinkins has since gone down what she calls a “rabbit-hole” of investigations into the way that culture—particularly the experiences of race and gender—is codified in technology. She has become a strong voice in the effort to sound the alarm about the dangers of minority populations being absent from the creation of the computer algorithms that now mold our lives.

Her research into these imbalances has taken her on a head-spinning tour of tech companies, conferences, and residencies over the past few years. Dinkins is currently in residence at the tech programs of both Pioneer Works and Eyebeam, nonprofit art centers based in Brooklyn. She regularly leads community workshops where she educates people broadly about the impact of AI on our lives, aiming to cultivate an attitude toward technology that sees it not as an intimidating monolith—the haunting specter of computers gone awry that we see so often in Black Mirror or, most iconically, in the cunning and calculating character of HAL in Stanley Kubrick’s 1968 film 2001: A Space Odyssey—but an approachable tool that is, for better or worse, very human.

“We live in a world where we have to always be learning and willing to take on new information, and to do the work to get there, otherwise we’re sunk,” she says. “How do you move forwards in this super fast technological world?” She operates under the principle that if she can get people to think about the future in increments, it’s not quite so daunting.
“In five years, what’s my world going to look like? What do I need to be doing now to start dealing with that world?” Dinkins tries to find accessible avenues into what can seem like brain-scrambling concepts by speaking the language of her target group. In recent workshops at the New York nonprofit space Recess, she worked with kids who’d been diverted from the criminal justice system. She began by inviting them to wrap their heads around what an algorithm is, exactly, finding analog comparisons in “basic things you can do without thinking,” like brushing your teeth, or behavioral tendencies, like those that shape encounters with police. She helped them to see the way in which they are hardwired to react in such moments of conflict. They worked on Venn diagrams to visualize these interactions from their point of view and that of a cop: What were those two people thinking in this shared moment? Where are the overlaps? “Some of [the kids] can be very reactionary, which makes the situation worse,” says Dinkins. “That’s where the algorithm has to change.”

From that familiar territory, Dinkins works her way into talking about online systems and chat bots—pieces of software that emulate the conversational style of humans, and evolve as people enter into dialogue with them—as well as the larger goal of training AI to use language and ideas that relate to a more diverse range of worldviews. Participants in her workshops will often have a go at setting the intentions of a bot, then implanting it with data. One group created a bot whose sole purpose was to tell jokes. The input? Yo mama jokes. “I thought that was just amazing,” says Dinkins. “It’s the idea of taking one’s own culture and putting it into the machine, and using that to figure out how the machine is making decisions.”

Dinkins herself is busy turning the experiences of three generations of her family into a bot, an oral-history-cum-memoir of a black family in America in the form of a computer algorithm.
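The workshop exercise described above (first set a bot's intention, then implant it with data) can be sketched in a few lines. This is a hypothetical illustration, not the workshop's actual software: the `IntentBot` class and the sample jokes are invented here.

```python
import random

class IntentBot:
    """A tiny chatbot: an intention (its purpose) plus the data it is fed."""

    def __init__(self, intention):
        self.intention = intention   # what the bot is for, e.g. "tell jokes"
        self.corpus = []             # the culture/data implanted into it

    def feed(self, lines):
        """Implant data: the bot can only ever say what it has been given."""
        self.corpus.extend(lines)

    def respond(self, _prompt=""):
        """Reply by drawing from the implanted corpus, whatever the prompt."""
        if not self.corpus:
            return "I have no data yet."
        return random.choice(self.corpus)

# The joke bot from the article, fed invented stand-in jokes
bot = IntentBot(intention="tell jokes")
bot.feed([
    "Yo mama is so nice, even the bot smiled.",
    "Yo mama told me to stay in school, so I became a chatbot.",
])
print(bot.respond("tell me a joke"))
```

The point the workshop makes survives even in this toy: the bot's entire "personality" is the data it was fed, which is why who supplies that data matters.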
She, her aunt, and her niece have been interviewing one another intensively for the past several months, using stock questions intended to get at the fundamentals of their values, ethics, experiences, and the history of their family. The raw interviews are put into a machine learning system that digests them and generates an amalgam of the three family members—a voice in the machine whose manner of speech is a reflection of the family’s language and concerns. “I can already see that when you ask [the bot] stuff, it sounds sort of like my family,” says Dinkins. “I know that love is going to come out, that family is going to come out. We haven’t even fed it that much data yet, but it sounds like us. It’s kind of magical. I’ve been thinking about making another bot that’s all about telling people they’re loved. But I realize that’s just my family coming out in another way.”

Dinkins sees in this algorithmic memoir something of a proof of concept: the potential to illustrate how different AI could look when it reflects the experiences and values of a more diverse set of people, and is divorced from market values. “It’s amazing that you put in a certain ethos and ethics and it comes back out,” she says. “What does that mean when it’s detached from commercial imperatives? Because I think that’s super important too. If we’re all after the next buck, we know what we get already. It could have value commercially, but it isn’t about commercial value.”

Dinkins has settled into the reality that advocating for greater representation and human values in code will probably be her life’s work. “The project keeps growing, which is both excellent and crazy,” she says. “People are listening to me, so I’m talking about something that needs to be said, clearly. There’s an urgency about it.” Taking on the biases of a vast, multinational web of artificial intelligence technologies is no small task.
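The article leaves the machine learning system behind the family bot unspecified. Purely as an illustration of how an "amalgam" voice can emerge from mixed transcripts, here is a word-level Markov chain sketch; the `build_chain` and `generate` functions and the sample interview lines are invented, and the real project presumably uses a far more capable model.

```python
import random
from collections import defaultdict

def build_chain(sentences):
    """Word-level Markov chain: map each word to the words seen after it."""
    chain = defaultdict(list)
    for sentence in sentences:
        words = sentence.split()
        for current, nxt in zip(words, words[1:]):
            chain[current].append(nxt)
    return chain

def generate(chain, start, max_words=12):
    """Walk the chain from a start word, producing an amalgam utterance."""
    words = [start]
    while len(words) < max_words and chain.get(words[-1]):
        words.append(random.choice(chain[words[-1]]))
    return " ".join(words)

# Invented stand-ins for transcripts from the three family members
interviews = [
    "family means love and showing up for each other",
    "love is what holds our family together",
    "showing up every day is how we show love",
]
chain = build_chain(interviews)
print(generate(chain, "family"))
```

Because the chain pools all three speakers' transitions, generated sentences blend their phrasings, which is a crude analogue of the "sounds sort of like my family" effect Dinkins describes.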
Fortunately, Dinkins is part of a small but growing community of academics, technologists, multidisciplinary professionals, and organizations—like Black Girls Code and Black in AI—who recognize the threat at hand. “It’s a monster,” she says of the scale of the problem, “but I don’t think we can afford to have an adversarial relationship with technology. The work to try to get there is really worth the effort.”

That goal may require radically breaking with received code. It brings to mind the HBO series Westworld, in which we see a fantasy universe populated by robots—and created by white men. Thandie Newton, a black British actor, plays the bot Maeve Millay, the manager of a brothel who gradually unravels the true nature of her reality: that her every action is the result of computer algorithms written by men. “All my life I’ve prided myself on being a survivor. But surviving is just another loop,” she says in one scene of the first season. In another: “Time to write my own fucking story.”
The future

--------------------
I'm at the same point. Except that I can't manage 2 days in a row :ph34r: 1970 - 2010