How Gig Platforms Built a Legal Shield Around Algorithmic Firing

By Sathi

“I have a wife and three children. They deactivated my account for no reason since 2018. I tried contacting them so many times, but they didn’t activate me.”

That is what one driver told researchers during a Zoom hearing for a new report by the Asian American Legal Defense and Education Fund (AALDEF), released in October 2025.

Stories like his are everywhere once you start looking. And what’s striking isn’t just that they happen; it’s that nothing seems to change.

When Human Rights Watch dug into this last May, they surveyed 127 gig workers in Texas. Forty of them had been deactivated at some point. And here’s the number that stuck with me: nearly half of those deactivated workers were later cleared of any wrongdoing.

Let’s get this straight. We’re discussing a termination system with a roughly 50% error rate. And these aren’t struggling startups; Uber pulled in $43.9 billion in revenue last year with $9.8 billion in net income.

Bar chart illustrating Uber’s 2024 revenue, net income, and driver wages.

They could absolutely afford to have humans review these cases. So why don’t they?

The more I looked into this, the more I realized we’re not dealing with a glitchy algorithm or bad customer service. What these companies have built is something more deliberate: a legal architecture where algorithmic firing and forced arbitration work together to make accountability almost impossible.

What Deactivation Actually Looks Like

Another driver at that AALDEF hearing, A.G. Egwim, got deactivated because he canceled a ride to go to a doctor’s appointment. His question to the researchers was pretty raw: “Do the Uber and Lyft people see us as human beings at all, or just money-making machines for them?”

The AALDEF team surveyed nearly 350 deactivated NYC drivers, and the findings are bleak: 70% of Uber drivers and 76% of Lyft drivers got no notice before being cut off.

Not a warning, not a heads-up, just locked out one day!

Spend any time on driver forums, and you’ll see these stories pile up. “Over 500 rides with a perfect 5-star rating and I was deactivated with no warning or reason,” one driver wrote. “Was working to pay off my mom’s medical bills.”

And it’s not just about losing gig income. Futurism reported back in March that labor organizers are calling this a “deactivation crisis.” According to an ACRE survey they cited, close to two-thirds of deactivated drivers fell behind on bills, and 14% faced losing their homes.

Here’s the kicker: almost all the drivers AALDEF surveyed tried to appeal. The vast majority stayed locked out anyway.

So what’s going on? Why can’t they fight back?

The Fine Print That Changes Everything

When drivers sign up, they click through Uber’s Platform Access Agreement without reading it. (Let’s be honest, who reads these things?) But buried in there is language that basically determines everything:

“IF YOU DO NOT OPT OUT OF THIS ARBITRATION PROVISION… YOU ARE AGREEING IN ADVANCE… THAT YOU WILL NOT PARTICIPATE IN… ANY SUCH CLASS, COLLECTIVE OR REPRESENTATIVE LAWSUIT.”

Translation: no class actions, no jury trials, no public record of what happened.

Just you versus a company worth $169 billion, in a private process where the outcome never gets reported and sets no precedent for anyone else.

One driver on UberPeople.net nailed it: “It’s still the Wild West with these rideshare companies and the legalities that go along with deactivations. There are a lot of legal grey areas and you can bet they have an army of lawyers that make sure they’re protected.”

And the data backs him up. According to the Economic Policy Institute, over 60 million American workers are covered by mandatory arbitration clauses. How many actually file claims each year? About 5,758. That’s 1 in 10,400 workers. If people filed at the rate they do in regular courts, we’d see hundreds of thousands of claims annually.
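To see how lopsided that is, here’s a quick back-of-envelope sketch in Python (the 60 million and 5,758 figures come straight from the EPI data above; the rest is just division):

    # Back-of-envelope check of the EPI arbitration figures cited above.
    covered_workers = 60_000_000  # workers bound by mandatory arbitration (EPI)
    annual_claims = 5_758         # employment arbitration claims filed per year

    odds = covered_workers / annual_claims
    print(f"Roughly 1 in {round(odds):,} covered workers files a claim each year")
    # -> Roughly 1 in 10,421 covered workers files a claim each year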

Illustration: roughly 1 in 10,400 covered workers files an arbitration claim each year.

For the few who do file? The American Association for Justice found that in 2020, just 82 employees won monetary awards in forced arbitration. Out of 60 million workers. As the group put it, more people summit Mount Everest each year, and with better odds of success.

So that’s the first part of the shield: an arbitration system so effective at suppressing claims that companies can get away with a 50% error rate in terminations.

But there’s a second part that makes the first one work.

Why Nobody Can Coordinate

Here’s the thing about arbitration clauses: they have a weakness. In 2019, a law firm decided to call DoorDash’s bluff. They filed arbitration demands for over 5,000 drivers at once. Under DoorDash’s own agreement, the company had to pay $1,900 per case in filing fees. That’s millions of dollars due upfront, before anything was even decided.
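The fee arithmetic alone shows why this worked. Here’s a rough floor, as a sketch (the filing covered “over 5,000” drivers, so treat 5,000 as a minimum):

    # Rough floor on DoorDash's upfront exposure from the mass filing.
    demands = 5_000       # "over 5,000 drivers" filed demands at once
    fee_per_case = 1_900  # per-case filing fee under DoorDash's own agreement

    total_fees = demands * fee_per_case
    print(f"At least ${total_fees:,} due upfront, before anything was decided")
    # -> At least $9,500,000 due upfront, before anything was decided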

DoorDash’s response was almost funny. They went to court and asked to handle everything as a class action instead, which is exactly what their arbitration clause was designed to prevent.

The judge wasn’t having it: “No doubt, DoorDash never expected that so many would actually seek arbitration. This hypocrisy will not be blessed.”

Mass arbitration actually works. Uber ended up paying $146-170 million to settle 60,000 claims. Amazon dropped its arbitration clause entirely after 75,000 Echo users filed demands. When workers band together, the shield starts to crack.

Which brings us to why the algorithmic system works the way it does.

Mass arbitration needs two things: workers need to know they’ve been wronged, and they need to find others who were wronged the same way. Algorithmic opacity prevents both.

When drivers get deactivated, they’re not told which ride triggered it, what the passenger claimed, or what evidence exists.

One driver described it this way: “They’ll take whatever the rider says as gospel. Yet, they won’t even tell us what ride it was, nor give us adequate consideration to defend ourselves.”

Think about what that means. If you don’t know why you were fired, you can’t find others who were fired for the same reason. If you can’t find them, you can’t coordinate. And if you can’t coordinate, mass arbitration becomes impossible.

The opacity isn’t a bug. It’s what protects the arbitration clause from the one thing that can actually break it.

Proof That This Is By Design

If I’m right that this is all by design, we should see totally different outcomes when the shield gets broken. And we do.

Seattle created America’s first deactivation rights ordinance for rideshare drivers back in 2021. They set up a resolution center where Uber and Lyft drivers could challenge terminations in front of a neutral arbitrator, with actual union representation.

The University of Washington studied what happened. They looked at 1,420 cases and found that 80% of terminated drivers got reinstated when they had real due process. Compare that to the 10% success rate AALDEF found in Uber’s own internal appeals in NYC.

80% versus 10%. That’s not a minor difference; it tells you that most of these deactivations can’t survive actual scrutiny.

The study also uncovered something else: racial bias baked into the system. Drivers of color and white drivers got deactivated at similar rates, but drivers of color got reinstated at higher rates, which means they were being wrongly terminated more often. As Cornell researchers noted, this points to “more incidents of discriminatory customer complaints” getting encoded into the algorithm. That discrimination was invisible until transparency exposed it.

The Independent Drivers Guild in New York and New Jersey reports similar numbers: they’ve helped 20,000 drivers fight deactivations over five years, more than 4,000 just in 2024, with a 90% success rate. When workers actually get representation, the system’s errors become obvious.

This is what the legal shield prevents. Not fair outcomes, just visibility.

Now They’re Claiming Transparency Violates Free Speech

Seattle passed a new ordinance that kicked in on January 1, 2025. It requires gig companies to publish their deactivation policies, give 14 days’ notice before terminating workers, and provide real appeal procedures.

Uber and Instacart are suing to block it. And their argument is… something else. They’re claiming the law violates their First Amendment rights because it forces them to explain their deactivation policies. Transparency, they say, is compelled speech.

Read that again. They’re not saying algorithmic flexibility is needed for safety. They’re not saying human review costs too much. They’re saying the Constitution protects their right to fire people without telling them why.

A federal judge shot this down and let the law take effect: “The Court finds that the Ordinance’s requirements do little more than regulate conduct without any significant impact on speech or expression.”

It’s now in the Ninth Circuit, where the judges seemed pretty skeptical during July arguments. Judge Susan Graber pointed out the obvious: “The policy regulates conduct. That is, when to deactivate, when you’re not allowed to deactivate.”

Here’s the thing: if these companies actually believed their termination criteria were fair and defensible, transparency would help them. The fact that they’re fighting it this hard tells you everything about what those criteria would look like under scrutiny.

The Opt-Out Game

There’s technically a 30-day window to opt out of Uber’s arbitration clause. Courts have ruled that this window makes the clause enforceable. But here’s what they don’t emphasize: you have to opt out again every time Uber updates its terms of service.

Diagram of the arbitration process, highlighting the jury-trial waiver and confidential, private outcomes.

One driver posted a warning about this: “ONCE YOU AGREE TO THIS YOU MUST OPT OUT AGAIN. You have 30 days to do so.”

That same driver shared what opting out was worth: they got $9,500 from the O’Connor v. Uber settlement after opting out back in 2014. The settlement created a $20 million fund, but only for drivers who’d kept their right to sue. Everyone who stayed in arbitration? Nothing.

The opt-out exists so courts will enforce the clause. Its structure makes sure almost nobody actually uses it. You’d need to be constantly vigilant against a company that controls when the clock resets.

How This Plays Out Globally

Looking at how this works in other countries actually helps explain why the American version is so effective.

In Amsterdam, courts ruled in 2021 that Uber’s automated deactivations violated GDPR protections against purely automated decision-making. Uber had to compensate workers. European privacy law created an opening that American law simply doesn’t have.

In India, the shield works differently, through labor oversupply and weak enforcement rather than arbitration. Researchers estimate that 60-70% of blue-collar gig workers there have been deactivated at some point.

Dr. Anjana Karumathil of IIM Kozhikode put it well: “When earnings fluctuate wildly, social security is absent, and bargaining power is weak, it’s not flexibility—it’s precarity.”

A study from Australia found something even darker: international students, limited by visas to 24 hours of work per week, get pushed by algorithmic incentives to exceed those limits. Deactivation becomes the threat enforcing compliance, with deportation as the ultimate consequence. The researchers called it “a form of social harm caused by the interaction between technology and discriminatory migration policies.”

The pattern holds: where legal systems give workers real recourse, platforms face limits. Where they don’t, the system runs without friction. American forced arbitration is the most effective friction-elimination tool out there.

What Congress Is (Not) Doing

Congress has noticed. The Algorithmic Accountability Act of 2025, introduced last September by Rep. Yvette Clarke and Sen. Ron Wyden, would require companies to run impact assessments of AI systems used in employment decisions, with transparency requirements, bias testing, and reports to the FTC.

It’s sitting in committee. GovTrack gives it a 3% chance of getting past committee and a 1% chance of becoming law. So, you know, don’t hold your breath.

California’s AB 1340 looks more promising; it gives rideshare drivers collective bargaining rights starting this year, a first in the nation. But there’s a catch: Uber actually supported it. Why?

Because the bill was part of a deal that cut their liability insurance requirements. Lorena Gonzalez, president of the California Federation of Labor Unions, was pretty direct about what happened: “Look, Uber needed something. They wanted a reduction of their insurance… And in exchange, we’ll agree to this process.”

When platforms embrace worker protections, always check what they got in return.

What Would Actually Fix This

The shield has two parts, and you have to go after both. Ban arbitration but keep algorithmic opacity, and companies will find new ways to suppress claims. Require transparency but leave arbitration in place, and workers still can’t coordinate.

What would actually work: a federal ban on forced arbitration for employment disputes. Mandatory human review before termination. A legal right to know why you were fired—the specific incident, the evidence, the reasoning. Public reporting on deactivation rates and outcomes by demographic group.

Seattle’s ordinance comes closest to this, which is exactly why Uber and Instacart are fighting it with constitutional arguments. If it survives the Ninth Circuit, it becomes a template for everyone else.

What This Is Really About

Go back to that driver from the AALDEF hearing, the father of three, deactivated in 2018, still locked out seven years later. His situation isn’t a glitch or an edge case. It’s the system working exactly as intended.

Uber could implement human review tomorrow. The cost would be a rounding error on $9.8 billion in annual profit. The technology for fair termination systems exists. They’ve just chosen not to use it, because the legal architecture makes unfairness the cheaper option.

Arbitration suppresses individual claims. Algorithmic opacity prevents collective action. Together, they make it so that wrongly firing half your terminated workers carries almost no consequences. The shield doesn’t just protect against bad outcomes—it makes bad outcomes the rational business choice.

One driver, looking at the whole landscape, put it this way: “If only Uber showed the same amount of trust and respect to their drivers as they do to passengers, even ones who give fake responses. One day, they are going to realize that drivers are what make their business possible.”

They already realize it. That’s exactly why they built the shield.

Sources

  1. Human Rights Watch, “The Gig Trap” (May 2025): https://www.hrw.org/report/2025/05/12/the-gig-trap/algorithmic-wage-and-labor-exploitation-in-platform-work-in-the-us
  2. AALDEF, “Uber and Lyft Deactivate NYC Drivers With No Notice” (October 2025): https://www.aaldef.org/press-release/aaldef-report-uber-and-lyft-deactivate-nyc-drivers-with-no-notice-no-due-process-and-no/
  3. Economic Policy Institute, “The growing use of mandatory arbitration”: https://www.epi.org/publication/the-growing-use-of-mandatory-arbitration-access-to-the-courts-is-now-barred-for-more-than-60-million-american-workers/
  4. American Association for Justice, “Forced Arbitration in a Pandemic”: https://www.justice.org/resources/research/forced-arbitration-in-a-pandemic
  5. Uber Platform Access Agreement: https://tb-static.uber.com/prod/reddog/country/UnitedStates/licensed/73d82f4c-c374-41af-9057-8f371c38d32f.pdf
  6. Courthouse News, DoorDash mass arbitration ruling (2020): https://www.courthousenews.com/doordash-ordered-to-pay-12m-to-arbitrate-5000-labor-disputes/
  7. Bryan Schwartz Law, Uber arbitration settlement: https://www.bryanschwartzlaw.com/02-12-20/
  8. Seattle Times, UW racial bias study (August 2023): https://www.seattletimes.com/seattle-news/uw-study-finds-racial-bias-in-rideshare-driver-deactivations/
  9. Courthouse News, Ninth Circuit challenge (July 2025): https://www.courthousenews.com/uber-instacart-challenge-seattle-gig-worker-law-at-ninth-circuit/
  10. Seattle Times, deactivation law ruling (January 2025): https://www.seattletimes.com/seattle-news/law-justice/judge-allows-seattle-driver-deactivation-law-to-go-into-effect/
  11. Seattle App-Based Worker Deactivation Rights Ordinance: https://www.seattle.gov/laborstandards/ordinances/app-based-worker-ordinances/app-based-worker-deactivation-rights-ordinance
  12. Independent Drivers Guild: https://driversguild.org/deactivation/
  13. Cornell ILR, just cause protections analysis: https://www.ilr.cornell.edu/carow/carow-policy/just-cause-nyc-gig-workers-provides-human-review-algorithmic-firings
  14. UberPeople.net, arbitration opt-out thread: https://www.uberpeople.net/threads/arbitration-template-to-opt-out-do-it-again-for-the-2022-agreement-edit-2024.457335/
  15. UberPeople.net, deactivation experiences: https://www.uberpeople.net/threads/uber-wrongfully-deactivated-my-account-seeking-advice-on-next-steps.502738/
  16. O’Connor v. Uber settlement: https://www.uberlitigation.com/
  17. Futurism, deactivation crisis reporting (March 2025): https://futurism.com/uber-rideshare-drivers-deactivation
  18. TravelingTed, personal deactivation account: https://travelingted.com/2021/01/31/why-uber-sucks/
  19. Open Rights Group, GDPR and gig workers: https://www.openrightsgroup.org/blog/automated-hiring-and-firing-how-the-data-act-will-harm-gig-workers/
  20. BusinessToday India, gig economy power gap (January 2026): https://www.businesstoday.in/india/story/scale-doesnt-excuse-shifting-all-risk-to-workers-hidden-power-gap-in-indias-gig-economy-509531-2026-01-05
  21. Griffith Law Review, algorithmic control and deportation (2024): https://www.tandfonline.com/doi/full/10.1080/10383441.2024.2433403
  22. Algorithmic Accountability Act of 2025: https://www.congress.gov/bill/119th-congress/senate-bill/2164/text
  23. GovTrack bill analysis: https://www.govtrack.us/congress/bills/119/s2164
  24. Cap Radio, California union rights (December 2025): https://www.capradio.org/articles/2025/12/22/california-gig-drivers-gear-up-for-union-rights-in-2026/
