Observations of racist excesses, The future arrived early
29/05/2018 10:56
Message #2501
#TeamZemmour | Group: Members | Posts: 49,981 | Joined: 25/02/2009 | Location: Berlin, Zion | Member no. 813 | Stand: Other club
https://www.artsy.net/article/artsy-editori...gence-white/amp
QUOTE
It’s no secret by now that artificial intelligence has a white guy problem. One could say the same of almost any industry, but the tech world is singular in rapidly shaping the future. As has been widely publicized, the unconscious biases of white developers proliferate on the internet, mapping our social structures and behaviors onto code and repeating the imbalances and injustices that exist in the real world. There was the case of black people being classified as gorillas; the computer system that rejected an Asian man’s passport photo because it read his eyes as being closed; and the controversy surrounding the predictive policing algorithms that have been deployed in cities like Chicago and New Orleans, enabling police officers to pinpoint individuals it deems to be predisposed to crime—giving rise to accusations of profiling. Earlier this year, the release of Google’s Arts and Culture App, which allows users to match their faces with a historical painting, produced less than nuanced results for Asians, as well as African-Americans. Additionally, a new book, Algorithms of Oppression: How Search Engines Reinforce Racism, argues that search engines themselves are inherently discriminatory.

“Data sets reflect the hierarchy in which we live,” said Kate Crawford, an artificial intelligence expert at NYU, at a recent research and development salon at the Museum of Modern Art. These biased algorithms and skewed data sets “reify what are fluid social categories,” she said, making the white man the online default, the norm. (Type “CEO” into Google image search and you’ll see an endless mosaic of suited white guys.)

Given all of this, it’s no wonder that when artist Stephanie Dinkins came across a brown-skinned robot, she was surprised. Five years ago, she stumbled upon a YouTube video of an animatronic human bust, named Bina48, shown moving and conversing.
Its developer, Martine Rothblatt, had modeled the robot after her wife, a woman of color named Bina Rothblatt. Dinkins recalls her astounded initial reaction: “What the heck is that?” she wondered. “How did a black woman become the most advanced of the technologies at the time?” That encounter led Dinkins to Lincoln, Vermont—where Bina48 lives at Rothblatt’s Terasem Movement Foundation—in order to engage in a series of conversations with the robot about race, intimacy, and the nature of being. Dinkins has since gone down what she calls a “rabbit-hole” of investigations into the way that culture—particularly the experiences of race and gender—is codified in technology. She has become a strong voice in the effort to sound the alarm about the dangers of minority populations being absent from creations of the computer algorithms that now mold our lives. Her research into these imbalances has taken her on a head-spinning tour of tech companies, conferences, and residencies over the past few years. Dinkins is currently in residence at the tech programs of both Pioneer Works and Eyebeam, nonprofit art centers based in Brooklyn. She regularly leads community workshops where she educates people broadly about the impact of AI on our lives, aiming to cultivate an attitude toward technology that sees it not as an intimidating monolith—the haunting specter of computers gone awry that we see so often in Black Mirror or, most iconically, in the cunning and calculating character of HAL in Stanley Kubrick’s 1968 film 2001: A Space Odyssey—but an approachable tool that is, for better or worse, very human. “We live in a world where we have to always be learning and willing to take on new information, and to do the work to get there, otherwise we’re sunk,” she says. “How do you move forwards in this super fast technological world?” She operates under the principle that if she can get people to think about the future in increments, it’s not quite so daunting. 
“In five years, what’s my world going to look like? What do I need to be doing now to start dealing with that world?” Dinkins tries to find accessible avenues into what can seem like brain-scrambling concepts by speaking the language of her target group. In recent workshops at the New York nonprofit space Recess, she worked with kids who’d been diverted from the criminal justice system. She began by inviting them to wrap their heads around what an algorithm is, exactly, finding analog comparisons in “basic things you can do without thinking,” like brushing your teeth, or behavioral tendencies, like those that shape encounters with police. She helped them to see the way in which they are hardwired to react in such moments of conflict. They worked on Venn diagrams to visualize these interactions from their point of view and that of a cop: What were those two people thinking in this shared moment? Where are the overlaps? “Some of [the kids] can be very reactionary, which makes the situation worse,” says Dinkins. “That’s where the algorithm has to change.” From that familiar territory, Dinkins works her way into talking about online systems and chat bots—pieces of software that emulate the conversational style of humans, and evolve as people enter into dialogue with them—as well as the larger goal of training AI to use language and ideas that relate to a more diverse range of worldviews. Participants in her workshops will often have a go at setting the intentions of a bot, then implanting it with data. One group created a bot whose sole purpose was to tell jokes. The input? Yo mama jokes. “I thought that was just amazing,” says Dinkins. “It’s the idea of taking one’s own culture and putting it into the machine, and using that to figure out how the machine is making decisions.” Dinkins herself is busy turning the experiences of three generations of her family into a bot, an oral-history-cum-memoir of a black family in America in the form of a computer algorithm. 
She, her aunt, and her niece have been interviewing one another intensively for the past several months, using stock questions intended to get at the fundamentals of their values, ethics, experiences, and the history of their family. The raw interviews are put into a machine learning system that digests them and generates an amalgam of the three family members—a voice in the machine whose manner of speech is a reflection of the family’s language and concerns. “I can already see that when you ask [the bot] stuff, it sounds sort of like my family,” says Dinkins. “I know that love is going to come out, that family is going to come out. We haven’t even fed it that much data yet, but it sounds like us. It’s kind of magical. I’ve been thinking about making another bot that’s all about telling people they’re loved. But I realize that’s just my family coming out in another way.” Dinkins sees in this algorithmic memoir something of a proof of concept: the potential to illustrate how different AI could look when it reflects the experiences and values of a more diverse set of people, and is divorced from market values. “It’s amazing that you put in a certain ethos and ethics and it comes back out,” she says. “What does that mean when it’s detached from commercial imperatives? Because I think that’s super important too. If we’re all after the next buck, we know what we get already. It could have value commercially, but it isn’t about commercial value.” Dinkins has settled into the reality that advocating for greater representation and human values in code will probably be her life’s work. “The project keeps growing, which is both excellent and crazy,” she says. “People are listening to me, so I’m talking about something that needs to be said, clearly. There’s an urgency about it.” Taking on the biases of a vast, multinational web of artificial intelligence technologies is no small task. 
Fortunately, Dinkins is part of a small but growing community of academics, technologists, multidisciplinary professionals, and organizations—like Black Girls Code and Black in AI—who recognize the threat at hand. “It’s a monster,” she says of the scale of the problem, “but I don’t think we can afford to have an adversarial relationship with technology. The work to try to get there is really worth the effort.” That goal may require radically breaking with received code. It brings to mind the HBO series Westworld, in which we see a fantasy universe populated by robots—and created by white men. Thandie Newton, a black British actor, plays the bot Maeve Millay, the manager of a brothel who gradually unravels the true nature of her reality: that her every action is the result of computer algorithms written by men. “All my life I’ve prided myself on being a survivor. But surviving is just another loop,” she says in one scene of the first season. In another: “Time to write my own fucking story.”

The future

--------------------
I'm at the same level. Except I can't manage 2 days in a row :ph34r: 1970 - 2010