
New York regulator orders probe into Goldman Sachs’ credit card practices over Apple Card and sexism


Prompted by a frustrated (and viral) Twitter thread about sexism underlying Apple Card’s credit limit allowances, the New York State Department of Financial Services (NYSDFS) is opening a probe into Goldman Sachs’ credit card practices.

Apple Card is a joint venture between Apple and Goldman Sachs that launched earlier this year. Goldman Sachs is responsible for all credit decisions on the card.

What happened?

David Heinemeier Hansson, a Danish tech entrepreneur, tweeted his ire on November 8 at being given a credit limit 20 times higher than his wife's, despite having a worse credit score than her. Highlighting the problem of black-boxed algorithms that replicate real-world biases, he also pointed out that customer service couldn't do much in this case since "it's just the algorithm".

Six different representatives across Apple and Goldman Sachs had no visibility into this black-boxed algorithm. As the thread gained traction on social media and turned into a PR disaster for both companies, Apple raised Hansson's wife's credit limit to match his. But such a "VIP bump" doesn't resolve the underlying issue of algorithmic bias.

Hansson also pointed out that since this discrimination is the result of a proprietary algorithm, "there will be no recourse, no accountability because any review of our biases and processes is an invasion of our business privileges". Steve Wozniak, Apple's co-founder, also said that he got 10 times the credit limit his wife did.

New York State regulator orders probe

Linda A. Lacewell, the superintendent of NYSDFS, announced the probe in a Medium post on November 11. She cited New York law, which prohibits discrimination against protected classes of individuals, even when it is done via an algorithm. She also cited another DFS investigation, wherein an algorithm sold by a UnitedHealth Group subsidiary allegedly led to less comprehensive care for black patients than for white patients.

In response to the online maelstrom, Goldman Sachs Bank USA CEO Carey Halio issued a statement earlier this morning, saying that during the Apple Card application process, the company is not aware of an applicant's gender or marital status.

‘Need equality, justice, transparency and fairness,’ says Jamie

Jamie Heinemeier Hansson, Hansson's wife, wrote a blog post on her husband's website "lest [she] be cast as a meek housewife who cannot speak for herself". Shedding light on her financial status, she wrote that she had a higher credit score than her husband, had a successful career, and, despite identifying as a "homemaker" on her tax returns, was a millionaire in her own right "who contributes greatly to my household and pays off credit in full each month".

Jamie also recognised her own privilege in the fact that the Apple manager, on learning about David’s viral tweets, raised her credit limit without resolving the underlying issue. “This is not merely a story about sexism and credit algorithm blackboxes, but about how rich people nearly always get their way,” she wrote.

Looking at algorithmic bias

Black-boxed algorithms, both digital and analog, have long been used to determine people's creditworthiness and, in a Black Mirror-esque situation, their social worthiness. The datasets that buttress many of these algorithms themselves replicate and perpetuate the prejudices of the real world. As digital algorithms get more complex, fewer people completely understand the causal relationship between the datasets, the algorithm, and the consequences of its operation. As a result, the people most affected are the ones who are already discriminated against.
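To make the mechanism concrete: a model can discriminate by gender without ever seeing gender as an input, because other features in historically biased data act as proxies for it. The following is a minimal, hypothetical sketch (the data and the scoring rule are invented for illustration, not Goldman Sachs' actual model): a trivial model "trained" on past limits keyed to years of full-time employment will reproduce a gender gap if, historically, women were recorded with fewer full-time years due to career breaks.

```python
# Hypothetical sketch of proxy bias: the model never sees gender,
# but a feature correlated with gender carries the bias through.

# Synthetic "historical" data: credit limits set by past (biased)
# human decisions. In this toy history, applicants with career
# breaks -- disproportionately women -- got far lower limits.
history = [
    {"full_time_years": 10, "limit": 20000},  # no career breaks
    {"full_time_years": 2,  "limit": 2000},   # career breaks
]

# "Train" a trivial linear rule: average limit granted per
# full-time year in the historical data.
rate = sum(h["limit"] / h["full_time_years"] for h in history) / len(history)

def predicted_limit(full_time_years):
    # Uses only the proxy feature -- gender is never an input.
    return rate * full_time_years

# A couple with identical income and credit scores, differing only
# in recorded full-time years:
print(predicted_limit(10))  # 15000.0 -- partner without career breaks
print(predicted_limit(2))   # 3000.0  -- partner with career breaks
```

The model is facially gender-blind, yet it hands one partner five times the limit of the other, which is why "we don't look at gender" does not by itself rule out discriminatory outcomes.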

American lawmakers have sponsored a bill, the Algorithmic Accountability Act, that seeks to assess the algorithms that power large companies for bias.

If you want to read more about algorithmic bias, you can look at this Twitter thread.



MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2018 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
