
AI Summary
→ WHAT IT COVERS Dr. Joy Buolamwini, MIT researcher and founder of the Algorithmic Justice League, discusses her groundbreaking work exposing racial and gender bias in facial recognition AI systems, combining technical research with poetry to advocate for algorithmic justice. → KEY INSIGHTS - **Gender Shades Research:** Buolamwini tested commercial facial recognition systems from IBM, Microsoft, and Amazon, finding Microsoft achieved 100% accuracy on lighter-skinned males but only 80% on darker-skinned females, while some systems performed near coin-toss levels at 68% accuracy for darker women. - **Evocative Audits:** Combining algorithmic testing with performance art creates accessible demonstrations of AI bias. Buolamwini's poem "AI, Ain't I a Woman" showed IBM mislabeling Serena Williams as male and Microsoft describing Ida B. Wells as a small boy, making technical failures visceral and undeniable. - **Creative Rights Framework:** Artists and writers whose work trains generative AI systems deserve four protections: consent before use, compensation for contributions, control over how work is deployed, and credit for outputs. This framework addresses unauthorized use of copyrighted books and creative works in AI training datasets. - **Institutional and Narrative Power:** Buolamwini strategically uses MIT credentials for institutional access to policymakers while wielding narrative power through poetry and storytelling. This dual approach enabled her to present research to EU defense ministers, US Congress, and Davos while empowering Brooklyn tenants resisting facial recognition systems. - **Black Feminist Epistemology in Tech:** Lived experience constitutes valid knowledge in AI research. Buolamwini builds on work by Dr. Latanya Sweeney, who exposed racial bias in search engine arrest ads, and Dr. Safiya Noble, who documented racist image results, demonstrating how personal experience drives critical technical scholarship. 
→ NOTABLE MOMENT

Buolamwini discovered facial recognition bias when software failed to detect her dark-skinned face during an MIT project but immediately recognized a white mask held over her face, sparking her career investigating how AI systems perpetuate discrimination against marginalized communities.

💼 SPONSORS

- Odoo (odoo.com)

🏷️ AI Bias, Facial Recognition, Algorithmic Justice, Tech Ethics, Creative Rights