Ride-share drivers for Uber, Lyft, and Via continue to demonstrate racial and LGBT bias, despite efforts to eliminate or reduce discrimination, researchers report.
In response to drivers’ biased behavior, ride-sharing companies removed information that could indicate a rider’s gender and race from initial ride requests.
However, the researchers found that bias against underrepresented groups and against riders who signal support for the LGBT community persists after a driver accepts a ride request, once the rider's picture is displayed.
In other words, the companies' changes shifted some of the biased behavior to the period after the rider receives a confirmation, where it shows up as higher cancellation rates.
It's important for ride-sharing companies to understand whether bias remains as they compete not only against each other but also against traditional transportation options, researchers say.
“Our results confirm that bias at the ride request stage has been removed. However, after ride acceptance, racial and LGBT biases are persistent, while we found no evidence of gender biases,” says Jorge Mejia, assistant professor of operations and decision technologies at Indiana University.
“We show that signaling support for a social cause—in our case, the lesbian, gay, bisexual, and transgender community—can also impact service provision. Riders who show support for the LGBT community, regardless of race or gender, also experience significantly higher cancellation rates.”
Mejia and coauthor Chris Parker, assistant professor in the information technology and analytics department at American University, believe they are the first to use support for social causes as a bias-enabling characteristic. The study appears in Management Science.
The researchers performed a field experiment on a ride-sharing platform in fall 2018 in Washington, DC. They randomly varied rider names, using names traditionally perceived as white or Black, along with profile pictures, and observed drivers' patterns of accepting and canceling rides. To signal support for LGBT rights, they overlaid a rainbow filter on the rider's profile picture.
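The article does not reproduce the study's exact protocol, but the manipulation it describes amounts to randomly assigning a race signal (via the name), a gender signal, and an LGBT-support signal (the rainbow filter) to each experimental rider account. A minimal sketch of that kind of assignment is below; the name pools, labels, and function names are illustrative assumptions, not taken from the study.

```python
import random

# Illustrative name pools; the study's actual name lists are not reproduced here.
NAMES = {
    ("white", "male"): ["Brad", "Todd"],
    ("white", "female"): ["Allison", "Meredith"],
    ("black", "male"): ["DeShawn", "Tyrone"],
    ("black", "female"): ["Lakisha", "Tanisha"],
}

def make_rider_profile(rng: random.Random) -> dict:
    """Randomly assign the manipulated attributes for one experimental rider account."""
    race = rng.choice(["white", "black"])       # race signaled via the name
    gender = rng.choice(["male", "female"])     # gender signaled via the name
    lgbt_support = rng.choice([True, False])    # rainbow filter overlaid on the photo
    return {
        "name": rng.choice(NAMES[(race, gender)]),
        "race_signal": race,
        "gender_signal": gender,
        "rainbow_filter": lgbt_support,
    }

if __name__ == "__main__":
    rng = random.Random(42)
    for _ in range(3):
        print(make_rider_profile(rng))
```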
“We found that underrepresented minorities are more than twice as likely as Caucasians to have a ride canceled; that’s about 8% versus 3%,” Mejia says. “There was no evidence of gender bias.”
Mejia and Parker also varied the timing of ride requests to study whether peak-price periods affected bias. They found that the higher prices associated with peak times alleviated some of the bias against underrepresented riders, but not the bias against riders who signaled support for the LGBT community.
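The article does not give the paper's econometric specification. As a rough illustration only, assuming a simple log of ride requests with the hypothetical field names used below, one could compare cancellation rates by group within peak and off-peak periods and look at how the gap changes:

```python
from collections import defaultdict

def cancellation_rates(rides):
    """Cancellation rate by (peak period, race signal) for a list of ride records.

    Each record is assumed to be a dict with boolean fields 'peak' and
    'canceled' and a string field 'race_signal'; the field names are
    illustrative, not taken from the study's data.
    """
    totals = defaultdict(lambda: [0, 0])  # (peak, race) -> [canceled count, total]
    for r in rides:
        key = (r["peak"], r["race_signal"])
        totals[key][0] += int(r["canceled"])
        totals[key][1] += 1
    return {k: canceled / n for k, (canceled, n) in totals.items() if n}

# Toy example: compare the Black-white cancellation gap off-peak vs. at peak.
rides = [
    {"peak": False, "race_signal": "black", "canceled": True},
    {"peak": False, "race_signal": "white", "canceled": False},
    {"peak": True, "race_signal": "black", "canceled": False},
    {"peak": True, "race_signal": "white", "canceled": False},
]
rates = cancellation_rates(rides)
gap_off_peak = rates[(False, "black")] - rates[(False, "white")]
gap_peak = rates[(True, "black")] - rates[(True, "white")]
print(gap_off_peak, gap_peak)
```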
The researchers argue that companies should adopt data-driven remedies: record rider characteristics when a driver cancels and penalize drivers who show a pattern of biased behavior. One option is to move such drivers down the dispatch priority list so they receive fewer ride requests; a less punitive alternative is to award “badges” to drivers with especially low cancellation rates for minority riders.
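The article does not describe how a platform would actually implement such a penalty. One minimal sketch, assuming the platform already tracks each driver's cancellation rates for minority and non-minority riders, is below; the thresholds, weights, and parameter names are hypothetical, not the platforms' or the authors' actual logic.

```python
def dispatch_priority(base_priority: float,
                      minority_cancel_rate: float,
                      other_cancel_rate: float,
                      gap_threshold: float = 0.03,
                      penalty: float = 0.5,
                      badge_bonus: float = 0.1) -> float:
    """Adjust a driver's dispatch priority based on cancellation disparities.

    All parameters and values here are hypothetical; they illustrate the
    remedies described in the article (demote drivers with biased
    cancellation patterns, reward low cancellation rates toward minority
    riders), not any real platform's logic.
    """
    gap = minority_cancel_rate - other_cancel_rate
    if gap > gap_threshold:
        # Biased cancellation pattern: push the driver down the priority list.
        return base_priority * penalty
    if minority_cancel_rate < 0.01:
        # "Badge"-style reward for consistently low minority-rider cancellations.
        return base_priority + badge_bonus
    return base_priority

# Example: a driver who cancels 9% of minority requests vs. 3% of others.
print(dispatch_priority(base_priority=1.0,
                        minority_cancel_rate=0.09,
                        other_cancel_rate=0.03))  # -> 0.5
```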
But, ultimately, policymakers may need to intervene, Mejia says.
“Investments in reducing bias may not occur organically, as ride-sharing platforms are trying to maximize the number of participants in the platform—they want to attract both riders and drivers,” he says.
“As a result, it may be necessary for policymakers to mandate what information can be provided to a driver to ensure an unbiased experience, while maintaining the safety of everyone involved, or to create policies that require ride-sharing platforms to monitor and remove drivers based on biased behavior.
“Careful attention should be paid to these policies both before and after implementation, as unintended consequences are almost sure to follow any simple fix.”
Source: Indiana University