Showing 4 results for: algorithm

How Clubhouse Led L. Michelle McCray To Become The Navigational Voice On Snoop Dogg's New Album

Voiceover artist L. Michelle McCray’s introduction to the arts began at five years old, after she sang “When You Wish Upon a Star” in a Disney play. Her first official gig, in radio, came during her adolescent years. Tapping into her passion early on set the tone for what was to come. “I had been playing little pranks, imitating, and my mother would read me stories. And, I was so in love with what my mother did that I started imitating her, and then it turned into me imitating people in the news and practicing all these inflections, sounds, and tones,” L. Michelle told AfroTech. Her background in radio, theatre, music, and media production steered her career as an artist, particularly toward voice acting. Now, with over 25 years in the game under her belt, L. Michelle has rightfully crowned herself the “voice acting master.” In 2020, her journey as an artist came to a brief halt with the arrival of COVID-19. The pandemic hit after she moved to Boston to attend Berklee...

Ngozi Nwanji

Dec 14, 2021

Uber, Lyft Respond to Study Showing Higher Rates For Trips to Non-White Areas

Tech companies such as Uber recently took a public stand against systemic racism and called out police brutality through new features in their apps, TODAY reports. However, the company’s values, along with those of fellow ride-hailing company Lyft, are being called into question after a new study showed the two companies’ algorithms charge higher rates for customers traveling to non-white neighborhoods, Salon reports. The study analyzed a data set of over 100 million trips taken in Chicago, IL, between November 2018 and December 2019 and found that while “demand and speed” correlate most strongly with ride fares, Complex reports, various forms of social bias are also present for riders traveling to and from certain neighborhoods. The study — conducted by researchers Aylin Caliskan and Akshat Pandey at George Washington University in Washington, D.C. — showed that the ride-hailing companies charged a higher price per mile for trips where the destination or...
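The core measurement behind a study like this is simple to sketch: group trips by some attribute of the destination neighborhood and compare average fare per mile across groups. The following is a toy illustration only, not the researchers’ actual methodology; the field names, group labels, and numbers are all invented.

```python
# Toy sketch of the fare-per-mile comparison described above.
# All data and field names are hypothetical, for illustration only.

def fare_per_mile_by_group(trips):
    """Average fare per mile for each destination-neighborhood group."""
    totals = {}  # group -> [total fare, total miles]
    for trip in trips:
        fare, miles = totals.setdefault(trip["dest_group"], [0.0, 0.0])
        totals[trip["dest_group"]] = [fare + trip["fare"], miles + trip["miles"]]
    return {g: fare / miles for g, (fare, miles) in totals.items()}

# Invented trips: same distances, but group "B" destinations are billed more.
trips = [
    {"dest_group": "A", "fare": 12.0, "miles": 5.0},
    {"dest_group": "A", "fare": 10.0, "miles": 4.0},
    {"dest_group": "B", "fare": 15.0, "miles": 5.0},
    {"dest_group": "B", "fare": 13.0, "miles": 4.0},
]

rates = fare_per_mile_by_group(trips)
```

On real data, the hard part is the step this sketch skips: controlling for demand, speed, and trip length before attributing any remaining per-mile gap to neighborhood demographics, which is what the study’s correlation analysis attempts.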

Njera Perkins

Jun 23, 2020

Cory Booker and Others Introduce Bill That Could Make Tech Companies Check Their AI For Biases

Over the past few years, big tech companies like Facebook and Amazon have come under fire for discriminatory artificial intelligence. Now, U.S. lawmakers are presenting a bill that would require tech companies to check their algorithms for biases. Drafted by Sens. Cory Booker and Ron Wyden, the Algorithmic Accountability Act of 2019 calls for the Federal Trade Commission to require companies that collect and share data for algorithmic purposes to conduct impact assessments on their privacy and AI tools. The bill notes that algorithms can contribute to and amplify “unfair, biased, or discriminatory decisions” that impact consumers. For now, the bill is aimed at big tech companies and data brokers: it would only apply to companies valued at more than $50 million or with access to more than 1 million consumers’ data. “Computers are increasingly involved in the most important decisions affecting Americans’ lives — whether or not someone can buy a home, get a job or even go to...

Arriana McLymore

Apr 11, 2019

Civil Rights Groups Say Using Algorithms In Risk-Based Bail Assessments Is A Problem

The American Civil Liberties Union (ACLU), the National Association for the Advancement of Colored People (NAACP), Color of Change, MoveOn and 115 other groups signed a statement of concern about the use of pretrial “risk assessment” algorithms to determine bail. A growing number of state and local governments are turning to these algorithms to estimate pretrial flight and criminal recidivism risks. The algorithms were intended to remove bias from pretrial hearings and reduce jail overcrowding, among other goals. But in the statement, the groups explain that our justice system still disproportionately impacts communities of color, so the data fed into these tools reflects those flaws. This turns what might be intended as objective or neutral into biased tools that continue to add to the disparities in the justice system. The organizations are also calling for the government to follow principles, including the following: If in use, a pretrial risk assessment instrument must be designed and...
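The groups’ core argument, that flawed data makes a “neutral” tool biased, can be shown with a tiny made-up example: if one group is policed more heavily, identical behavior produces more arrest records for that group, and any score built on those records inherits the skew. The records and rates below are invented for illustration and have no connection to any real assessment instrument.

```python
# Toy illustration of bias flowing from historical data into a "neutral" score.
# All records and groups are invented.

def base_rates(records):
    """Share of records flagged as re-arrested, per group."""
    counts = {}  # group -> [flagged, total]
    for group, rearrested in records:
        flagged, total = counts.setdefault(group, [0, 0])
        counts[group] = [flagged + rearrested, total + 1]
    return {g: flagged / total for g, (flagged, total) in counts.items()}

# Suppose group "X" is policed twice as heavily, so the same underlying
# behavior shows up as twice as many re-arrests in the historical record.
records = [("X", 1)] * 20 + [("X", 0)] * 30 + [("Y", 1)] * 10 + [("Y", 0)] * 40

rates = base_rates(records)
# A risk tool built on these rates scores group X higher even when true
# behavior is identical: the bias lives in the data, not the arithmetic.
```

This is exactly why the statement argues that simply using an algorithm does not remove bias from pretrial decisions.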