The Apple Card is the most high-profile case of AI bias to date

The Apple Card is backed by Goldman Sachs and is intended for use with Apple devices. It does not require any security code, signature or card number. Apple says it’s more secure than any other physical credit card. (Image source: Apple)

The algorithm that makes credit decisions for the Apple Card gives women lower credit limits than equally creditworthy men. Those are the allegations that began to spread when consumers took to social media to complain about Apple’s credit card, which is designed to work with Apple Pay and on various Apple devices.

The controversy began on November 7, when entrepreneur David Heinemeier Hansson, creator of the Ruby on Rails web development framework, posted a long, angry thread on Twitter complaining about his wife’s experience with the Apple Card.

“The @AppleCard is a [expletive] sexist program. My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x [sic] the credit limit she does. No appeals work,” Hansson tweeted. “It gets even worse. Even when she pays off her ridiculously low limit in full, the card won’t approve any spending until the next billing period. Women apparently aren’t good credit risks even when they pay off the [expletive] balance in advance and in full.”

Hansson went on to describe his experience with Apple Card customer support regarding the issue. He said customer service reps assured him there was no discrimination and that the results he and his wife were seeing were simply due to the algorithm.

“So let’s recap here: Apple offers a credit card that bases its credit assessment on a black-box algorithm that [six] different representatives across Apple and [Goldman Sachs] have no visibility into. Even several layers of management. An internal investigation. IT’S JUST THE ALGORITHM!” Hansson wrote (emphasis added). “… So nobody understands THE ALGORITHM. Nobody has the power to examine or check THE ALGORITHM. Yet everyone we have spoken to at both Apple and [Goldman Sachs] is SO SURE that THE ALGORITHM isn’t biased and discriminating in any way. That’s grade-A management of cognitive dissonance.”

David Heinemeier Hansson tweeted a lengthy statement describing his frustration with the Apple Card. (Tweet edited for language.)

Hansson’s tweets prompted others to share similar experiences, including Apple co-founder Steve Wozniak. “The same thing happened to us,” Wozniak tweeted. “I got 10x [sic] the credit limit. We do not have separate bank accounts or credit cards or separate assets. Hard to reach a human for a fix, though. It’s big tech in 2019.”

Filmmaker Lexi Alexander said she and a group of her friends applied for the Apple Card to see if the claims were true. What they found matched the accounts given by Hansson and Wozniak. “A group of us applied [for] this card today. It takes 5 seconds on your iPhone [and] it doesn’t show up on your credit history (I was told). Apple Card then gives you a credit limit [and] APR offer you can accept or decline. I’m currently trying to get over the sexist slap in the face,” Alexander tweeted. “Like it was really, really bad. Male friends with bad credit and irregular incomes got much better deals than women with perfect credit and high incomes. There were 12 of us, 6 women 6 men. We just wanted to see what was going on and it wasn’t pretty.”

As complaints about the Apple Card went viral, Goldman Sachs, the New York-based bank backing the card, issued a statement on November 10. In it, Goldman Sachs said the problem stems from the fact that credit decisions for the Apple Card are based on individual credit lines and credit histories, not ones shared with family members.

“As with any other individual credit card, your application is evaluated independently,” the Goldman Sachs statement said. “We look at an individual’s income and an individual’s creditworthiness, which includes factors like personal credit scores, how much personal debt you have, and how that debt has been managed. Based on these factors, it is possible for two family members to receive significantly different credit decisions … In all cases, we have not and will not make decisions based on factors like gender.”

The factors cited by Goldman Sachs appear to be at odds with the experiences described by Hansson, Wozniak, and others.

CNBC reported that the allegations of discrimination prompted the New York Department of Financial Services (DFS) to launch a formal investigation into Goldman Sachs’ credit card practices. “The DFS is disturbed to learn of potential discriminatory treatment with regard to credit limit decisions allegedly made by an Apple Card algorithm, issued by Goldman Sachs,” DFS Superintendent Linda Lacewell told CNBC. “The department will investigate whether New York law was violated and ensure that all consumers are treated equally, regardless of gender.”

According to CNBC, Goldman Sachs was aware of the potential bias when the Apple Card launched in August, but the bank chose to have credit decisions made on an individual basis to avoid the complexity of dealing with co-signers and other shared accounts.

The black box problem

While these reports of Apple Card bias are surely grabbing extra attention because of the high-profile names attached to them, this is far from the first case of a widely used AI algorithm exhibiting bias. Incidents of algorithmic bias in healthcare, lending, and even criminal justice applications have been uncovered in recent years, and experts at many large tech companies and research institutions are working diligently to correct bias in AI.

“Part of the problem here is that, like many AI and machine learning algorithms, the Apple Card is a black box, meaning there is no framework in place to track the algorithm’s training and decision making,” Irina Farooq, product manager at Kinetica, a data analytics company, told Design News in a prepared statement. “For businesses, this is a significant legal and public relations risk. For society, it is even more serious. If we cede our decision making to AI, whether it’s ridesharing reimbursements, insurance billing, or mortgage rates, we risk submitting to final judgment, to a machine monarchy where everyone is a set of data, and all men and women just data.”
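
Farooq’s point about tracking is concrete: without a record of what went into each decision, there is nothing to audit after the fact. Below is a minimal sketch, in Python, of what such decision logging could look like. The model, the field names, and the `score_applicant` function are hypothetical illustrations for this article, not anything from Goldman Sachs’ actual system.

```python
import json
import time

# Hypothetical model identifier (illustrative only).
MODEL_VERSION = "credit-limit-v1"

def score_applicant(features: dict) -> float:
    """Stand-in for a black-box model's output (purely illustrative)."""
    return 0.2 * features["income"] - 0.5 * features["debt"]

def decide_with_audit(applicant_id: str, features: dict,
                      log_path: str = "decisions.jsonl") -> float:
    """Score an applicant and append an audit record of the decision.

    Recording the exact inputs, model version, and output for every
    decision is the kind of traceability a black box lacks.
    """
    limit = score_applicant(features)
    record = {
        "timestamp": time.time(),
        "applicant_id": applicant_id,
        "model_version": MODEL_VERSION,
        "inputs": features,       # exactly what the model saw
        "credit_limit": limit,    # the decision that was made
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return limit
```

With a log like this, a regulator or an internal reviewer could at least reconstruct which inputs produced which limits, rather than being told “it’s just the algorithm.”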

Farooq echoed the sentiments of many concerned with bias in AI, noting that the algorithms we use are only as fair as the data they are trained on. “The parameters of what the algorithm should take into account when analyzing a data set are always defined by people. And the developers and data scientists doing this work may not be aware of the unconscious biases in the parameters they’ve put in place,” she said. “We don’t know what the parameters were for the Apple Card’s credit determinations, but if the factors included annual income without considering joint assets and joint tax returns, women, who in America still earn 80.7 cents for every man’s dollar, would be at an inherent disadvantage.”
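
To see how such a parameter choice plays out, here is a deliberately simplified, hypothetical illustration, not Goldman Sachs’ actual formula: if a credit limit is computed from individual income alone, the 80.7-cent pay gap Farooq cites flows straight through to the limit, even for spouses who share every asset.

```python
def credit_limit(individual_income: float) -> float:
    """Hypothetical rule: the limit is a fixed multiple of individual income.

    It ignores joint assets and joint tax returns, which is exactly
    the parameter choice Farooq warns about.
    """
    return 0.2 * individual_income

# A married couple who share all assets, with the U.S. pay gap applied:
his_income = 100_000.0
her_income = 80_700.0   # 80.7 cents for every dollar he earns

print(credit_limit(his_income))   # 20000.0
print(credit_limit(her_income))   # 16140.0 -- the pay gap passes straight through
```

No field in this toy model ever mentions gender, yet the outputs differ by gender anyway, which is why “we don’t use gender as an input” does not by itself rule out a discriminatory outcome.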

On November 11, following the announcement of the New York DFS investigation, Carey Halio, CEO of Goldman Sachs Bank USA, released another statement on behalf of the bank, pledging to work to ensure its algorithms are free of bias and asking any customers who feel they have been affected to reach out.

“We have not and never will make decisions based on factors like gender. In fact, we do not know your gender or marital status during the Apple Card application process,” Halio wrote. “We are committed to ensuring our credit decision process is fair. Working with a third party, we have reviewed our credit decision process to guard against bias and unintended outcomes.”

Chris Wiltz is Editor-in-Chief at Design News, covering emerging technologies including AI, VR/AR, blockchain, and robotics.
