Showing 4 results for: bias
Revea Secures $6M In Funding To Launch A Precise Mobile Skincare Experience

Revea has raised $6 million in funding to remove bias from skincare product development. According to a press release, the Seed II round was led by Alpha Edison, with participation from Ulta Beauty, WaldenCast Ventures, GISEV, Verlinvest, Kathaka, and Stanford Co-Chair Musculoskeletal Imaging. “Ulta Beauty was founded to disrupt the status quo and today, we remain focused on doing just that across every touchpoint – with greater personalization than ever before,” said Prama Bhatt, chief digital officer at Ulta Beauty, in the release. “We are thrilled to support Revea as they continue to disrupt and deliver unique, personalized skincare solutions.”

Apr 21, 2022

Edward Jones To Pay Black Employees $34M To Settle Race Discrimination Lawsuit

St. Louis-based financial services firm Edward Jones has been ordered by a federal judge to pay $34 million to settle bias claims brought by Black financial advisors, Reuters first reported. The firm is known for serving individual investors and employs more than 19,000 financial advisors. The settlement follows a class-action lawsuit filed by Stowell & Friedman in 2018 on behalf of roughly 800 Black financial advisors. The official complaint alleged race discrimination, according to a court filing that names Wayne Bland as the lead plaintiff in the class-action case. The Black financial advisors accused Edward Jones of assigning them less lucrative work, denying them access to certain high-level client accounts, and depriving them of advancement opportunities. Edward Jones, represented by Dowd Bennett, denied wrongdoing and went as far as to try to get the case dismissed, arguing that the named plaintiffs failed to provide more than speculative claims in the...

May 11, 2021

Airbnb Bans White Supremacist Users Linked to Iron March From its Site

In response to a data leak that revealed the identities and messaging history of white supremacists, Airbnb has decided to ban such users from its site. Airbnb’s security systems recently identified more than 60 account holders from Iron March, a far-right, white supremacist site. Their accounts were deleted in accordance with Airbnb’s zero-tolerance policy. A spokesperson for the short-term rental giant called the step a “no-brainer,” reiterating Airbnb’s commitment to “continuously seeking to proactively identify those who could put our hosts and guests at risk.” Airbnb CEO Brian Chesky has previously stood in staunch opposition to white supremacy. “The violence, racism and hatred demonstrated by neo-Nazis, the alt-right, and white supremacists should have no place in this world,” Chesky said in a 2017 statement, according to The Verge. Airbnb’s Community Commitment — a nondiscrimination policy that promotes inclusiveness — was updated in 2016 to ensure fairness to all users.

Dec 16, 2019

AI Fails More At Recognizing Objects From Lower-Income Countries. Here’s Why That Matters

Many tech companies — including Microsoft, Google, and Amazon — have produced object recognition algorithms. This form of artificial intelligence is meant to do exactly what its name says: recognize objects. It sounds like something that can’t be messed up, but a recent study found that object recognition systems are worse at identifying items from lower-income countries. The study was conducted by researchers Terrance DeVries, Ishan Misra, Changhan Wang, and Laurens van der Maaten of Facebook’s AI Lab. The team analyzed five popular object recognition services: Microsoft Azure, Clarifai, Google Cloud Vision, Amazon Rekognition, and IBM Watson. The global dataset included 117 categories of common household items, like shoes and soap, drawn from a range of household incomes and geographic locations. The difference in accuracy was striking: the object recognition algorithms had an error rate roughly 10 percent higher when...
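The study’s core comparison can be illustrated with a small sketch: group a classifier’s predictions by the household-income bracket of the image source, then compare per-group error rates. This is a minimal, hypothetical example — the function name and the sample records are invented for illustration and are not data from the study.

```python
def error_rate_by_group(records):
    """Compute error rate per group.

    records: iterable of (group, predicted_label, true_label) tuples.
    Returns a dict mapping each group to its fraction of misclassifications.
    """
    totals, errors = {}, {}
    for group, predicted, actual in records:
        totals[group] = totals.get(group, 0) + 1
        if predicted != actual:
            errors[group] = errors.get(group, 0) + 1
    return {g: errors.get(g, 0) / totals[g] for g in totals}

# Illustrative records only: a bar of soap photographed in households
# from two income brackets, with made-up classifier outputs.
sample = [
    ("low-income", "bar of soap", "bar of soap"),
    ("low-income", "food", "bar of soap"),             # misclassified
    ("low-income", "hand", "bar of soap"),             # misclassified
    ("high-income", "bar of soap", "bar of soap"),
    ("high-income", "bar of soap", "bar of soap"),
    ("high-income", "soap dispenser", "bar of soap"),  # misclassified
]

rates = error_rate_by_group(sample)
# The disparity the study measured is the gap between the two rates.
gap = rates["low-income"] - rates["high-income"]
```

A real evaluation would feed thousands of geographically diverse images through each commercial API and compute the same per-group gap.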

Jun 12, 2019