The Apple Card Is the Most High-Profile Case of AI Bias Yet

Apple Card users have alleged that its credit decision algorithm discriminates against women.

Chris Wiltz

November 13, 2019

7 Min Read
Algorithmic bias is already affecting us. As more and more AI algorithms are implemented into decision-making processes in everything from real estate to healthcare, it is important for developers to be aware of the inherent biases within the datasets they use to train AI. Apple's Apple Card recently came under fire from customers, including Apple co-founder Steve Wozniak, over allegations that its approval system was assigning lower credit limits to female customers. Experts agree it will likely be impossible to completely safeguard systems against bias, but steps can be taken to mitigate its impact.

The Apple Card is backed by Goldman Sachs and is meant to be used with Apple devices. It requires no security code, signature, or card number. Apple says it is more secure than any other physical credit card. (Image source: Apple)

The algorithm responsible for credit decisions on the Apple Card gives women lower credit limits than equally qualified men. Those are the allegations that began spreading as consumers took to social media with complaints about Apple's credit card, which is designed to work with Apple Pay and on various Apple devices.

The controversy began on November 7 when entrepreneur David Heinemeier Hansson, the creator of the Ruby on Rails programming tool, posted a lengthy, and angry, thread to Twitter complaining of his wife's experience with the Apple Card.

“The @AppleCard is such a [expletive] sexist program. My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x [sic] the credit limit she does. No appeals work,” Hansson tweeted. “It gets even worse. Even when she pays off her ridiculously low limit in full, the card won’t approve any spending until the next billing period. Women apparently aren’t good credit risks even when they pay off the [expletive] balance in advance and in full.”

Hansson went on to describe his experience dealing with Apple Card's customer support regarding the issue. He said customer service reps assured him there was no discrimination involved and that the outcomes he and his wife were seeing were due to the algorithm.

“So let’s recap here: Apple offers a credit card that bases its credit assessment on a black-box algorithm that [six] different reps across Apple and [Goldman Sachs] have no visibility into. Even several layers of management. An internal investigation. IT’S JUST THE ALGORITHM!” Hansson wrote (emphasis his). “...So nobody understands THE ALGORITHM. Nobody has the power to examine or check THE ALGORITHM. Yet everyone we’ve talked to from both Apple and [Goldman Sachs] are SO SURE that THE ALGORITHM isn’t biased and discriminating in any way. That’s some grade-A management of cognitive dissonance.”

David Heinemeier Hansson tweeted a lengthy statement outlining his frustration with Apple Card. (Tweet edited for language). 

Hansson's tweets prompted others to share similar experiences, most notably Apple co-founder Steve Wozniak. “The same thing happened to us,” Wozniak tweeted. “I got 10x [sic] the credit limit. We have no separate bank or credit card accounts or any separate assets. Hard to get to a human for a correction though. It's big tech in 2019.”

Filmmaker Lexi Alexander said she and a group of her friends applied for an Apple Card to see if the allegations were true. What they found confirmed the accounts made by Hansson and Wozniak. “A bunch of us applied [for] this card today. It takes 5 sec on your iPhone [and] it doesn’t show up on your credit history (I’ve been told). Apple Card then makes you a credit limit [and] APR offer which you can accept or deny. I’m currently trying to recover from the sexist slap in my face,” Alexander tweeted. “Like it’s really really bad. Male friends with bad credit score and irregular income got way better offers than women with perfect credit and high incomes. There were 12 of us, 6 women 6 men. We just wanted to see what’s up and it was not pretty.”
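Alexander's experiment amounts to an informal outcome audit: hold credit profiles roughly constant across two groups and compare the offers. A toy sketch of that comparison is below; all figures are invented for illustration and are not her group's actual data.

```python
# A toy version of the informal audit Alexander described: compare
# offers across two groups with comparable credit profiles.
# All numbers are hypothetical, not the group's real data.

from statistics import mean

# (credit_score, annual_income, offered_limit) per applicant
women = [(790, 160_000, 1_000), (780, 150_000, 1_500)]
men = [(640, 90_000, 8_000), (650, 95_000, 9_000)]

def average_offer(group):
    """Average credit limit offered to a group of applicants."""
    return mean(offer for _, _, offer in group)

gap = average_offer(men) - average_offer(women)
print(f"average limit gap: ${gap:,.0f}")
# A large gap favoring the group with weaker scores and lower incomes
# is exactly the red flag the Twitter thread describes.
```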

As complaints about the Apple Card went viral, Goldman Sachs, the New York-based bank that backs the Apple Card, issued a statement on November 10. In the statement, Goldman Sachs said the issue stems from the fact that credit decisions regarding the Apple Card are based on individual credit lines and histories, not those shared with family members.

“As with any other individual credit card, your application is evaluated independently,” the Goldman Sachs statement said. “We look at an individual’s income and an individual’s creditworthiness, which includes factors like personal credit scores, how much personal debt you have, and how that debt has been managed. Based on these factors, it is possible for two family members to receive significantly different credit decisions...In all cases, we have not and will not make decisions based on factors like gender.”

The contributing factors cited by Goldman Sachs would seem to contradict the accounts offered by people such as Hansson and Wozniak, both of whom say they share all assets and accounts with their wives.

CNBC reported that the discrimination allegations have spurred the New York Department of Financial Services (DFS) to launch an official investigation into Goldman Sachs’ credit card practices. “DFS is troubled to learn of potential discriminatory treatment in regards to credit limit decisions reportedly made by an algorithm of Apple Card, issued by Goldman Sachs,” Linda Lacewell, superintendent for the DFS, told CNBC. “The Department will be conducting an investigation to determine whether New York law was violated and ensure all consumers are treated equally regardless of sex.”

According to CNBC, Goldman Sachs was aware of the potential bias when the Apple Card rolled out in August. But the bank opted to have credit decisions made on an individual basis to avoid the complexity that comes with dealing with co-signers and other shared accounts.

The Black Box Problem

While these reports of bias related to the Apple Card are surely drawing attention due to the high-profile names attached, it's far from the first case of a widely used AI algorithm exhibiting bias. Incidents of algorithmic bias in healthcare, lending, and even criminal justice applications have been discovered in recent years. And experts at many major technology companies and research institutions are working diligently to address bias in AI.

“Part of the problem here is that, as with many AI and machine learning algorithms, the Apple Card’s is a black box; meaning, there is no framework in place to trace the algorithm’s training and decision-making,” Irina Farooq, chief product officer at data analytics company Kinetica, told Design News in a prepared statement. “For corporations, this is a significant legal and PR risk. For society, this is even more serious. If we cede our decision-making to AI, whether for ride-sharing refunds, insurance billing, or mortgage rates, we risk subjecting ourselves to judgment with no appeal, to a monarchy of machines where all the world is a data set, and all the men and women, merely data.”
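Farooq's “no framework to trace” critique points at a concrete engineering choice. One alternative is a model whose score decomposes into per-feature contributions that can be logged with every decision. The sketch below shows the simplest version, a linear score; the feature names, weights, and values are invented, and nothing here reflects Goldman Sachs' actual system.

```python
# A minimal sketch of a traceable credit score: a linear model whose
# per-feature contributions are recorded alongside each decision.
# Weights and features are invented for illustration only.

# Weights over standardized (z-scored) features, chosen arbitrarily here
WEIGHTS = {"credit_score": 0.6, "debt_to_income": -0.3, "income": 0.1}

def score_with_trace(features: dict) -> tuple[float, dict]:
    """Return the overall score and a per-feature contribution record."""
    contributions = {name: WEIGHTS[name] * value
                     for name, value in features.items()}
    return sum(contributions.values()), contributions

# Hypothetical applicant, features already standardized
score, trace = score_with_trace(
    {"credit_score": 1.2, "debt_to_income": -0.4, "income": 0.5}
)
print(score)  # roughly 0.89
print(trace)  # the record a support rep or regulator could inspect
```

With a record like `trace` attached to each decision, a customer service rep would have something more to offer than “it's just the algorithm.”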

Farooq echoed the statements of many concerned with bias in AI by stating that the algorithms we employ are only as fair as the data they are trained with. “The parameters of what the algorithm should take into account when analyzing a data set are still set by people. And the developers and data scientists doing this work may not be aware of the unconscious biases the parameters they’ve put in place contain,” she said. “We don’t know what the parameters were for the Apple Card’s credit determinations, but if factors included annual income without considering joint property ownership and tax filings, women, who in America still make 80.7 [cents] for every man’s dollar, would be at an inherent disadvantage.”
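Farooq's income example can be made concrete. The toy model below scores only individually reported income, ignoring joint property and joint filings; it contains no gender feature at all, yet it splits spouses who share identical finances. This is purely hypothetical and is not the Apple Card's actual logic, which is not public.

```python
# Farooq's point in miniature: a model that scores only *individual*
# income will split spouses with identical shared finances.
# The policy below is a toy rule invented for illustration.

def offered_limit(individual_income: float) -> float:
    """Toy policy: credit limit is 20% of individually reported income."""
    return 0.2 * individual_income

# One household, joint finances, but income reported under separate names
spouse_a, spouse_b = 200_000, 50_000  # hypothetical individual incomes

print(offered_limit(spouse_a))  # 40000.0
print(offered_limit(spouse_b))  # 10000.0
# A 4x gap appears with no gender feature anywhere in the model;
# the disparity rides in on the income proxy, and because incomes
# differ by gender in aggregate, so will the credit limits.
```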

On November 11, following the announcement of the New York DFS investigation, Carey Halio, CEO of Goldman Sachs Bank USA, released another statement on behalf of the bank, pledging to ensure its algorithms do not exhibit bias and asking any customers who feel they may have been affected to reach out.

“We have not and never will make decisions based on factors like gender. In fact, we do not know your gender or marital status during the Apple Card application process,” Halio wrote. “We are committed to ensuring our credit decision process is fair. Together with a third party, we reviewed our credit decisioning process to guard against unintended biases and outcomes.”

Chris Wiltz is a Senior Editor at Design News covering emerging technologies including AI, VR/AR, blockchain, and robotics.

