Workers at delivery company Shipt “found that their paychecks had become…unpredictable,” according to an article in IEEE Spectrum. “They were doing the same work they’d always done, yet their paychecks were often less than they expected. And they didn’t know why….” The article notes that “Companies whose business models rely on gig workers have an interest in keeping their algorithms opaque.” But “The workers showed that it’s possible to fight back against the opaque authority of algorithms, creating transparency despite a corporation’s wishes.” On Facebook and Reddit, workers compared notes.

Previously, they’d known what to expect from their pay because Shipt had a formula: It gave workers a base pay of $5 per delivery plus 7.5 percent of the total amount of the customer’s order through the app. That formula allowed workers to look at order amounts and choose jobs that were worth their time. But Shipt had changed the payment rules without alerting workers. When the company finally issued a press release about the change, it revealed only that the new pay algorithm paid workers based on “effort,” which included factors like the order amount, the estimated amount of time required for shopping, and the mileage driven. The company claimed this new approach was fairer to workers and that it better matched the pay to the labor required for an order. Many workers, however, just saw their paychecks dwindling. And since Shipt didn’t release detailed information about the algorithm, it was essentially a black box that the workers couldn’t see inside.

The workers could have quietly accepted their fate, or sought employment elsewhere. Instead, they banded together, gathering data and forming partnerships with researchers and organizations to help them make sense of their pay data.
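The old formula described above was simple enough for workers to check by hand. A minimal sketch in Python (the function name and structure are illustrative, not Shipt's actual code):

```python
def old_shipt_pay(order_total: float) -> float:
    """Payout for one delivery under Shipt's original formula:
    $5 base pay plus 7.5% of the customer's order total."""
    BASE_PAY = 5.00          # flat base pay per delivery
    COMMISSION_RATE = 0.075  # 7.5 percent of the order amount
    return BASE_PAY + COMMISSION_RATE * order_total

# A worker could look at an order amount and decide if it was worth the trip:
print(old_shipt_pay(100.00))  # a $100 order pays 5 + 7.50 = $12.50
```

Because the payout was a known function of a visible number, workers could price their own time; the “effort”-based replacement removed exactly that predictability.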
The article’s author, a data scientist, writes: “I was drawn into the campaign in the summer of 2020, and I proceeded to build an SMS-based tool — the Shopper Transparency Calculator [written in Python, using optical character recognition and Twilio, and running on a home server] — to collect and analyze the data. With the help of that tool, the organized workers and their supporters essentially audited the algorithm and found that it had given 40 percent of workers substantial pay cuts… This ‘information asymmetry’ helps companies better control their workforces — they set the terms without divulging details, and workers’ only choice is whether or not to accept those terms… There’s no technical reason why these algorithms need to be black boxes; the real reason is to maintain the power structure… In a fairer world where workers have basic data rights and regulations require companies to disclose information about the AI systems they use in the workplace, this transparency would be available to workers by default.”

Drawn to the idea of helping a community “control and leverage their own data,” the author ultimately received more than 5,600 screenshots from over 200 workers. The analysis showed that 40 percent of those workers were earning at least 10 percent less under the new algorithm — and about a third were earning less than their state’s minimum wage.

Sharing data about their work was technically against the company’s terms of service; as the article notes, “astoundingly, workers — including gig workers who are classified as ‘independent contractors’ — often don’t have rights to their own data… [O]ur experiment served as an example for other gig workers who want to use data to organize, and it raised awareness about the downsides of algorithmic management. What’s needed is wholesale changes to platforms’ business models… The battles that gig workers are fighting are the leading front in the larger war for workplace rights, which will affect all of us.
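The core of the audit was a simple aggregation: compare each worker's per-order pay before and after the algorithm change and count how many took a substantial cut. A hedged sketch of that idea (this is illustrative, not the actual Shopper Transparency Calculator; the data shapes and threshold are assumptions):

```python
def share_with_pay_cut(old_pay: dict, new_pay: dict, threshold: float = 0.10) -> float:
    """Fraction of workers whose average pay per order dropped by at
    least `threshold` (e.g. 0.10 = a 10 percent cut).

    old_pay / new_pay: hypothetical dicts mapping worker id to average
    pay per order observed before and after the change."""
    workers = old_pay.keys() & new_pay.keys()  # only workers seen in both periods
    cut = sum(1 for w in workers
              if new_pay[w] <= old_pay[w] * (1 - threshold))
    return cut / len(workers)

# Toy data: two of three workers dropped by 10 percent or more.
before = {"w1": 12.50, "w2": 15.00, "w3": 9.00}
after = {"w1": 10.00, "w2": 14.80, "w3": 7.50}
print(share_with_pay_cut(before, after))
```

In the real campaign the per-order figures came from OCR on worker-submitted screenshots rather than tidy dictionaries, but the comparison step reduces to something like this.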
The time to define the terms of our relationship with algorithms is right now.” Thanks to long-time Slashdot reader mspohr for sharing the article.
Shipt’s Pay Algorithm Squeezed Gig Workers. They Fought Back