Two brothers who drive for Uber recently conducted an experiment. They opened their Uber apps while sitting in the same room and tested which brother could earn more money for the same work.
In a video published on The Rideshare Guy YouTube channel, the brothers recorded themselves looking for rides on the app. They found that Uber showed them nearly identical jobs, but offered to pay one of them a little better. The siblings could only guess why. Had Uber's algorithm somehow calculated their worth differently?
University of California College of the Law, San Francisco professor Veena Dubal says that's exactly what's going on. In a recent paper, she argues that rideshare apps practice "algorithmic wage discrimination" by personalizing wages for each driver based on data gathered from their work. Because the algorithms are proprietary, workers have no way of knowing how their data is being used, Dubal says.
"The app is their boss," Dubal told Morning Edition's A Martinez. "But unlike a human boss who you can negotiate with or withhold information from, the algorithms know so much about these workers."
Uber says it does not use drivers' personal data to set their pay rates, though drivers of electric vehicles do get a $1 bonus per ride. "Uber does not personalize fares to individual drivers, and a driver's race, ethnicity, acceptance rate, total earnings, or prior trip history are not considered when calculating fares," a spokesperson says in a statement. A representative for Lyft calls Dubal's paper "biased," saying it relies on cherry-picked data and debunked anecdotal information.
Personalized digitized pay is already the new normal in some workplaces, according to Dubal, and it's begun to attract attention from regulators.
This conversation has been lightly edited for clarity and length.
Interview highlights
On how algorithmic wage discrimination works
Rideshare drivers say the app is their boss. And unlike a human boss who you can negotiate with or withhold information from, the algorithms know so much about these workers. They know how much a worker is willing to accept for a particular ride. They know how much workers try to earn on any given day. They can really personalize how much that worker makes in order to influence their behavior in particular ways.
I look specifically at ride-hailing firms to discuss the phenomenon of digitalized variable pay, but it's happening across the on-demand economy and maybe even beyond it. Basically, these firms, because they treat their workers as independent contractors, cannot tell them what to do and where to go. Instead, they use these pay mechanisms to influence their behavior. They learn everything that they can about particular workers and use that knowledge to shape how workers get paid.
On how this differs from other forms of unequal pay
In a more familiar employment setting where workers make different amounts of money [to do similar jobs], we still have a legal norm: equal pay for equal work. In those contexts, there is often some logic to why people are earning more, whether it's seniority or experience or skill, and that is often transparent. There are also laws that ensure that companies check their own practices to make sure people are earning roughly the same amounts. What's complicated about algorithmic pay is that there is no logic. Instead, it might be that the person who works for a really long time, works really hard, and has the most experience is earning the least, and we just can't know why. The logic is all hidden behind black box algorithms.
On how algorithms can reproduce discrimination
Some of these firms have, in their own research, found that these practices can lead to women earning less than men. They ascribe these differences to the algorithms. But if these algorithms are recreating traditional wage differences that are illegal under employment laws, then something is deeply wrong.
On whether workers can access how algorithms calculate their pay
In Europe, under GDPR [General Data Protection Regulation], some workers have, after litigation, won the right to have some access to what data companies are extracting from their work to determine particular prices. None of that has been revealed yet. In the U.S., there are similar privacy laws, but none of this has been litigated yet, and attempts by regulators to get at it have largely been met with resistance. Companies maintain that this is, oddly, about privacy, that they don't want to unveil their practices because it might lead to information about workers being leaked. They also maintain that these systems they've developed are their intellectual property.
On whether variable digitized pay is illegal
It might be illegal under antitrust laws. There is the potential to say that some of this is price fixing, if all of these workers are independent contractors. That is being litigated in California courts right now. But absent a finding on an antitrust violation, this isn't necessarily illegal. It's a brave new world.
On whether regulators could intervene
The Federal Trade Commission is looking into this. They're very interested in whether or not this violates antitrust laws. And I think that any number of lawmakers who are generally interested in economic equality – from Sen. Elizabeth Warren to Sen. Bernie Sanders – are likely very interested in what amounts to a dystopic system of work.
Transcript
A MARTÍNEZ, HOST:
Uber and Lyft drivers say those apps promote wage discrimination. They say algorithms use drivers' personal data to shape how much they earn, resulting in unequal pay. Uber denies using data in this way. And Lyft says workers can reject jobs that don't pay enough. I spoke to University of California College of the Law, San Francisco professor Veena Dubal, who interviewed rideshare drivers about this.
VEENA DUBAL: These firms, because they treat their workers as independent contractors, cannot tell them what to do and where to go. So instead, they use these pay mechanisms to influence their behavior. So they learn everything that they can about particular workers and then use that knowledge to shape how workers get paid. And the problem in large part is not just that these workers are getting different pay for the same work, but also that they don't know why they're getting paid in particular ways.
MARTÍNEZ: Is it too much of a leap if I say that in some ways, the algorithm becomes their boss?
DUBAL: No, that's exactly how drivers talk about it. They talk about how the app is their boss. And unlike a human boss who you can kind of negotiate with, who you can withhold information from, the algorithm knows so much about these workers. They know how much a worker is willing to accept for particular rides. They know how much they try and earn on any given day. They might know other aspects about the worker, such that they can really personalize how much that worker makes in order to influence their behavior in particular ways.
MARTÍNEZ: So professor, how is this, then, any different from, say, another kind of employment setting, either in an office or a factory, where there are two employees that are hired at different pay rates to essentially do the same job?
DUBAL: In a more familiar employment setting, we still have a legal norm. And that legal norm is equal pay for equal work. And so in those contexts, there is often some logic to why people are earning more, whether it's seniority or experience or skill. And that is often transparent. And there are checks and balances. There are laws that ensure that companies check their own practices to make sure that people are earning roughly the same amounts. What's complicated about this context is that there is no logic. Instead, it might be that the person who works for a really long time and works really hard and the longest hours and has the most experience is earning the least. And we just can't know. The logic is all hidden behind black box algorithms.
MARTÍNEZ: And do the workers have any rights to ask the tech companies why this pay is so inconsistent? Or is it because they're independent contractors, they don't have that ability?
DUBAL: In Europe, workers have recently, after litigation, won the right to have some access to what data companies are extracting from their work and to determine particular prices. That - none of that has been revealed yet. In the U.S., there are similar privacy laws. But none of this has been litigated yet. And attempts to get at it from regulators has largely been met with resistance.
MARTÍNEZ: Is any of this actually illegal?
DUBAL: It might be illegal under antitrust laws. There is the potential to say that some of this is price fixing if all of these workers are independent contractors. That is being litigated in California courts right now. But absent a finding on an antitrust violation, this isn't necessarily illegal. There's nothing that says that independent contractors have to get paid the same amount, unlike in the employment context, where workers have to get a wage floor. And there has to be equal pay for equal work as it pertains to protected categories of people. There's sort of nothing like that. It's a brave new world.
MARTÍNEZ: Veena Dubal is a professor at the University of California College of the Law, San Francisco. Professor, thanks.
DUBAL: Thank you for having me.
(SOUNDBITE OF AAESPO'S "9")
Transcript provided by NPR, Copyright NPR.