Prompted by a frustrated (and viral) Twitter thread about sexism underlying Apple Card’s credit limit decisions, the New York State Department of Financial Services (NYSDFS) is opening a probe into Goldman Sachs’ credit card practices.
The @AppleCard is such a fucking sexist program. My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does. No appeals work.
— DHH (@dhh) November 7, 2019
Apple Card, launched earlier this year, is a joint venture between Apple and Goldman Sachs. Goldman Sachs is responsible for all credit decisions on the card.
What happened?
David Heinemeier Hansson, a Danish tech entrepreneur, tweeted his ire on November 7 at how, despite having a worse credit score than his wife, he had been given a credit limit 20 times higher than hers. Highlighting the problem of black-box algorithms that replicate real-world biases, he also pointed out that customer service could not do much in this case either, since “it’s just the algorithm”.
She spoke to two Apple reps. Both very nice, courteous people representing an utterly broken and reprehensible system. The first person was like “I don’t know why, but I swear we’re not discriminating, IT’S JUST THE ALGORITHM”. I shit you not. “IT’S JUST THE ALGORITHM!”.
— DHH (@dhh) November 8, 2019
According to Hansson, six different representatives across Apple and Goldman Sachs had no visibility into this black-box algorithm. As the thread gained traction on social media and turned into a PR disaster for the companies, Apple raised Hansson’s wife’s credit limit to match his. But such a “VIP bump” doesn’t resolve the underlying issue of algorithmic bias.
I wasn’t even pessimistic to expect this outcome, but here we are: @AppleCard just gave my wife the VIP bump to match my credit limit, but continued to be an utter fucking failure of a customer service experience. Let me explain…
— DHH (@dhh) November 8, 2019
Hansson also pointed out that since this discrimination is the result of a proprietary algorithm, “there will be no recourse, no accountability because any review of our biases and processes is an invasion of our business privileges”. Steve Wozniak, Apple’s co-founder, also said that he was given 10 times the credit limit his wife was.
The same thing happened to us. I got 10x the credit limit. We have no separate bank or credit card accounts or any separate assets. Hard to get to a human for a correction though. It’s big tech in 2019.
— Steve Wozniak (@stevewoz) November 10, 2019
New York State regulator orders probe
Linda A. Lacewell, the superintendent of NYSDFS, announced the probe in a Medium post on November 11. She cited New York law, which prohibits discrimination against protected classes of individuals, even when that discrimination is carried out by an algorithm. She also cited another DFS investigation in which an algorithm sold by a UnitedHealth Group subsidiary allegedly led to less comprehensive care for black patients than for white patients.
In response to the online maelstrom, Carey Halio, CEO of Goldman Sachs Bank USA, issued a statement earlier this morning in which she said that during the Apple Card application process, the company is not aware of an applicant’s gender or marital status.
‘Need equality, justice, transparency and fairness,’ says Jamie
Jamie Heinemeier Hansson, Hansson’s wife, wrote a blog post on her husband’s website “lest [she] be cast as a meek housewife who cannot speak for herself”. Shedding light on her financial status, she wrote that she had a higher credit score than her husband and a successful career, and that, despite identifying as a “homemaker” on her tax returns, she was still a millionaire in her own right “who contributes greatly to my household and pays off credit in full each month”.
Jamie also acknowledged her own privilege in the fact that an Apple manager, on learning of David’s viral tweets, raised her credit limit without resolving the underlying issue. “This is not merely a story about sexism and credit algorithm blackboxes, but about how rich people nearly always get their way,” she wrote.
Looking at algorithmic bias
Black-box algorithms, both digital and analog, have long been used to determine people’s creditworthiness and, in a Black Mirror-esque turn, their social worthiness. The datasets underpinning many of these algorithms replicate and perpetuate the prejudices of the real world. As digital algorithms grow more complex, fewer people fully understand the causal relationship between the datasets, the algorithm, and the consequences of its operation. As a result, the people most affected are those who are already discriminated against.
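To make that mechanism concrete, here is a minimal, entirely hypothetical sketch (it has nothing to do with Apple’s or Goldman Sachs’ actual model): even a model that never sees a protected attribute such as gender can reproduce a disparity if it weights a proxy feature that historical bias has correlated with that attribute.

```python
import random

random.seed(0)

def make_applicant(group):
    """Simulate an applicant. Both groups have identical creditworthiness."""
    score = random.gauss(740, 30)  # same credit-score distribution for A and B
    # Historical bias: group B members were far less often the "primary"
    # account holder. This proxy correlates with group membership even
    # though the group label itself is never fed to the model.
    primary_years = random.gauss(15 if group == "A" else 3, 2)
    return {"score": score, "primary_years": primary_years, "group": group}

def credit_limit(applicant):
    """A 'blind' limit formula: it never looks at the group label..."""
    base = 50 * applicant["score"]
    # ...but weighting the historically biased proxy reintroduces
    # the disparity the blindness was supposed to prevent.
    return base + 1500 * applicant["primary_years"]

applicants = ([make_applicant("A") for _ in range(1000)]
              + [make_applicant("B") for _ in range(1000)])

avg = {}
for g in ("A", "B"):
    limits = [credit_limit(a) for a in applicants if a["group"] == g]
    avg[g] = sum(limits) / len(limits)

print(f"Group A average limit: {avg['A']:,.0f}")
print(f"Group B average limit: {avg['B']:,.0f}")
print(f"Disparity: {avg['A'] / avg['B']:.2f}x")
```

Despite equal credit scores by construction, group A ends up with a substantially higher average limit. This is why “the algorithm doesn’t know your gender” is not, by itself, evidence of fairness.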
US lawmakers have sponsored a bill, the Algorithmic Accountability Act, that would require large companies to assess their automated decision systems for bias.
If you want to read more about algorithmic bias, you can look at this Twitter thread.