Description
We have the power to stand up against the oppression that exists in our software. Data discrimination and biases in algorithms are becoming a serious issue. You'll learn about the biases we build, about software that is bettering us, and about what you, as a developer, can do to truly make a difference.

By Kinsey Ann Durham (https://twitter.com/@KinseyAnnDurham)

Kinsey Ann Durham is an engineer at DigitalOcean, working remotely from Denver, CO. She teaches students from around the globe how to write code through a program called Bloc. In 2013 she co-founded Kubmo, a non-profit that teaches and builds technology curriculum for women's empowerment programs around the world. She also helps run the Scholar and Guide Program for Ruby Central conferences. In her free time, she enjoys fly fishing and adventuring in the Colorado outdoors with her dog, Harleigh.

https://rubyonice.com/speakers/kinsey_ann_durham
Summary
In her presentation "Breaking the Chains of Oppressive Software," Kinsey Ann Durham discusses the critical issue of biases embedded within algorithms and their broader implications for society. She begins by asserting the necessity of recognizing how these biases manifest in software that influences significant life decisions. Her key points include:

- **Understanding Bias**: Bias is defined broadly as an inclination or prejudice toward particular groups or ideas. Durham emphasizes that the focus should be on biases in algorithms, not solely on unconscious human biases inside tech companies.
- **Impact of Algorithms**: Algorithms are increasingly pivotal across sectors such as education, justice, and social services. They can both enhance convenience and exert control over people's lives, so they must be scrutinized for potential bias.
- **Examples of Bias**:
  - **Search Engine Bias**: Durham shares a video highlighting sexist and racist results from Google search algorithms, arguing that these biases are systematic rather than incidental.
  - **Image Recognition Flaws**: She cites the incident in which Google Photos mis-tagged photos of people of color because of biases in its image recognition algorithms.
  - **COMPAS Algorithm**: This algorithm predicts recidivism risk for defendants, but its black-box nature raises due process concerns, since its bias can dramatically affect sentencing.
  - **Recruitment Tools**: Amazon's scrapped AI recruiting tool, which favored male candidates, illustrates the dangers of algorithmic bias in hiring.
- **Root Causes of Bias**: Durham argues that bias begins at the data collection stage, and that the historical reliance on binary classifications in programming carries these biases forward into modern AI systems.
- **Call to Action**: To tackle these issues, she encourages developers to engage in discussions about ethical algorithm design, advocate for algorithmic accountability, promote diversity in tech, and be aware of their own biases. She suggests practical steps such as taking an implicit bias test and applying bias reduction techniques in code reviews.

Durham concludes by asserting that developers have the power to create a more equitable tech landscape, and she emphasizes the importance of addressing the moral implications of their work. The session is a powerful reminder of the responsibility developers hold to ensure the ethical use of technology, and an invitation to leverage their skills toward building inclusive and just systems.
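To make the call to action concrete, here is a minimal sketch (not from the talk) of one bias check a team could run on a model's outputs: comparing positive-outcome rates across demographic groups. The group labels, the sample data, and the 0.8 threshold (the common "four-fifths rule" from employment law) are illustrative assumptions, not anything Durham prescribes.

```python
# Sketch: a simple disparate-impact check on model predictions.
# All data below is hypothetical, for illustration only.
from collections import defaultdict

def selection_rates(predictions, groups):
    """Positive-prediction rate per demographic group."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        if pred == 1:
            positives[group] += 1
    return {g: positives[g] / totals[g] for g in totals}

def disparate_impact_ratio(predictions, groups):
    """Ratio of the lowest group selection rate to the highest.

    Values below ~0.8 are a common red flag (the "four-fifths rule").
    """
    rates = selection_rates(predictions, groups)
    return min(rates.values()) / max(rates.values())

# Hypothetical model outputs for two groups, A and B.
preds = [1, 1, 0, 1, 1, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

# Group A is selected at 4/5 = 0.8, group B at 2/5 = 0.4,
# so the ratio is 0.5 -- below the 0.8 threshold.
print(disparate_impact_ratio(preds, groups))  # -> 0.5
```

A check like this is deliberately crude; it only surfaces a disparity, and deciding whether that disparity reflects unfair bias still requires the human judgment and diverse perspectives the talk advocates.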