Apple and Goldman Sachs will be defending themselves against a probe by the New York State Department of Financial Services after a series of tweets by a couple of tech-savvy men accused the Apple Card’s “black box algorithm” of discriminating against women.
“The investigation follows a series of viral tweets by entrepreneur and web developer David Heinemeier Hansson about algorithms used for the Apple Card, which Goldman Sachs manages in partnership with Apple. Hansson said the card offered him a credit limit 20 times greater than it gave to his wife, even though she has a higher credit score. He called the algorithm a sexist program,” according to the Associated Press.
“In several tweets that were often liked thousands of times and frequently retweeted, Hansson didn’t disclose his or his wife’s income, but wrote that they have been married a long time, file joint tax returns and live in a community-property state. He tweeted that appeals when she got a far lower credit limit fell on deaf ears,” the AP continues.
But not on Twitter.
“Hansson’s tweets caught the attention of more than just his 350,000 followers. They struck a nerve with New York State regulators, who announced on Saturday that they would investigate the algorithm used by Apple Card to determine the creditworthiness of applicants,” writes Neil Vigdor for The New York Times.
“Algorithms are codes or a set of instructions used by computers, search engines and smartphone applications to perform tasks, from ordering food delivery to hailing a ride -- and yes, applying for credit,” Vigdor continues.
Hansson is the creator of the web development tool Ruby on Rails. His charge was backed up by Apple co-founder Steve Wozniak.
“On Saturday, Wozniak chimed in with a similar experience, saying he got 10 times more credit on the card, compared with his wife,” reports Reuters’ Subrat Patnaik.
“We have no separate bank or credit card accounts or any separate assets,” Wozniak said on Twitter, in reply to Hansson’s original tweet. “Hard to get to a human for a correction though. It’s big tech in 2019.”
“The situation throws a shadow over the Apple Card, which launched with much fanfare in August as a partnership between the tech giant's Apple Pay program and a new retail consumer-focused effort at Goldman Sachs,” writes Clare Duffy for CNN Business.
“The companies had boasted that the card would be available to consumers that might otherwise struggle to access credit, including those with no credit history or below-average credit scores. But these allegations highlight the challenges inherent in letting artificial intelligence, which has been shown in a number of contexts to be biased, make decisions like how much credit to extend to a user,” Duffy continues.
Linda A. Lacewell, the superintendent of New York’s Department of Financial Services, took to Medium Saturday to respond to Hansson, Wozniak and “numerous” others who “[described] similar instances where men received higher credit limits than women.”
Vowing to investigate, Lacewell writes:
“New York law prohibits discrimination against protected classes of individuals, which means an algorithm, as with any other method of determining creditworthiness, cannot result in disparate treatment for individuals based on age, creed, race, color, sex, sexual orientation, national origin, or other protected characteristics.
“We know the question of discrimination in algorithmic decisioning also extends to other areas of financial services. Just last week, DFS opened an investigation after reports that an algorithm sold by a UnitedHealth Group subsidiary allegedly resulted in black patients receiving less comprehensive care than white patients,” Lacewell adds.
Apple has yet to comment.
“Please know that Goldman Sachs will never consider sex/gender or any other prohibited bases when making credit decisions,” the bank replied to Hansson on Twitter, asking him to “please direct message us.”
“A spokesman for the bank told Bloomberg, ‘Our credit decisions are based on a customer’s creditworthiness and not on factors like gender, race, age, sexual orientation or any other basis prohibited by law,’” writes Annabelle Timsit for Quartz.
“Hansson said Goldman’s response doesn’t explain what happened after he started airing his issues on social media,” report Bloomberg’s Sridhar Natarajan and Shahien Nasiripour.
“As soon as this became a PR issue, they immediately bumped up her credit limit without asking for any additional documentation,” Hansson tells the reporters. “My belief isn’t there was some nefarious person wanting to discriminate. But that doesn’t matter. How do you know there isn’t an issue with the machine-learning algo when no one can explain how this decision was made?”
Fighting with officious customer-service reps is one thing; battling with algos clearly takes the fight to another level.