Incident 92: Apple Card's Credit Assessment Algorithm Allegedly Discriminated against Women

Entities

Alleged: Apple developed an AI system deployed by Goldman Sachs, which harmed Apple Card female users and Apple Card female credit applicants.

Incident Stats

Incident ID

92

Report Count

6

Incident Date

2019-11-11

Editors

Sean McGregor, Khoa Lam

Applied Taxonomies

CSETv0, CSETv1_Annotator-1, CSETv1_Annotator-2, CSETv1, GMF

CSETv0 Taxonomy Classifications

Taxonomy Details

Problem Nature

Indicates which, if any, of the following types of AI failure describe the incident: "Specification," i.e. the system's behavior did not align with the true intentions of its designer, operator, etc; "Robustness," i.e. the system operated unsafely because of features or changes in its environment, or in the inputs the system received; "Assurance," i.e. the system could not be adequately monitored or controlled during operation.

Specification

Physical System

Where relevant, indicates whether the AI system(s) was embedded into or tightly associated with specific types of hardware.

Software only

Level of Autonomy

The degree to which the AI system(s) functions independently from human intervention. "High" means there is no human involved in the system action execution; "Medium" means the system generates a decision and a human oversees the resulting action; "Low" means the system generates decision-support output and a human makes a decision and executes an action.

High

Nature of End User

"Expert" if users with special training or technical expertise were the ones meant to benefit from the AI system(s)’ operation; "Amateur" if the AI systems were primarily meant to benefit the general public or untrained users.

Amateur

Public Sector Deployment

"Yes" if the AI system(s) involved in the incident were being used by the public sector or for the administration of public goods (for example, public transportation); "No" if the system(s) were being used in the private sector or for commercial purposes (for example, a ride-sharing company).

No

Data Inputs

A brief description of the data that the AI system(s) used or were trained on.

credit score, credit report, reported income
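As a purely hypothetical illustration of how inputs like these might feed a limit decision, the sketch below invents a scoring rule from scratch. The function name, weights, and thresholds are assumptions for illustration only; Goldman Sachs never disclosed its actual model.

```python
# Hypothetical sketch of a credit-limit rule built from the three data
# inputs listed above. All numbers here are invented for illustration.
def assign_credit_limit(credit_score: int,
                        derogatory_marks: int,
                        reported_income: float) -> float:
    """Toy scoring rule: scale a base limit by creditworthiness signals."""
    base = reported_income * 0.2          # start from a fraction of income
    if credit_score >= 750:
        base *= 1.5                       # reward strong credit scores
    elif credit_score < 620:
        base *= 0.5                       # penalize weak credit scores
    base /= (1 + derogatory_marks)        # discount for derogatory report entries
    return round(base, 2)
```

Even a rule this simple shows why opacity matters: the disparity alleged in this incident concerned applicants whose values for all such inputs were comparable, which is what made the divergent limits hard to explain.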

CSETv1 Taxonomy Classifications

Taxonomy Details

Incident Number

The number of the incident in the AI Incident Database.

92

AI Tangible Harm Level Notes

Notes about the AI tangible harm level assessment

There was a gender bias in the rates and credit limits offered by the Apple Card. This resulted in financial harm based on gender.

Special Interest Intangible Harm

An assessment of whether a special interest intangible harm occurred. This assessment does not consider the context of the intangible harm, if an AI was involved, or if there is characterizable class or subgroup of harmed entities. It is also not assessing if an intangible harm occurred. It is only asking if a special interest intangible harm occurred.

Yes

Date of Incident Year

The year in which the incident occurred. If there are multiple harms or occurrences of the incident, list the earliest. If a precise date is unavailable, but the available sources provide a basis for estimating the year, estimate. Otherwise, leave blank. Enter in the format YYYY.

2019

Date of Incident Month

The month in which the incident occurred. If there are multiple harms or occurrences of the incident, list the earliest. If a precise date is unavailable, but the available sources provide a basis for estimating the month, estimate. Otherwise, leave blank. Enter in the format MM.

11

Estimated Date

“Yes” if the data was estimated. “No” otherwise.

No

CSETv1_Annotator-1 Taxonomy Classifications

Taxonomy Details

Incident Number

The number of the incident in the AI Incident Database.

92

AI Tangible Harm Level Notes

Notes about the AI tangible harm level assessment

3.2 - Goldman Sachs, which developed the card, never explicitly said whether the algorithm was AI. The general media consensus is that machine learning was very likely involved.

Notes (special interest intangible harm)

Input any notes that may help explain your answers.

Women with similar financial backgrounds, credit scores, and other personal details as male counterparts were assigned much lower credit limits.
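The kind of audit this note implies, comparing the limits granted to otherwise-similar applicants across groups, can be sketched as follows. This is a minimal hypothetical helper, not the method used in the actual regulatory investigation, and the function name and record format are assumptions.

```python
from statistics import mean

def limit_disparity(records):
    """Compare mean credit limits granted across groups of applicants.

    records: iterable of (group, limit) pairs, e.g. ("F", 5000.0).
    Returns the ratio of the lowest group mean to the highest; values
    well below 1.0 flag a disparity worth investigating further.
    """
    by_group = {}
    for group, limit in records:
        by_group.setdefault(group, []).append(limit)
    means = sorted(mean(limits) for limits in by_group.values())
    return means[0] / means[-1]
```

On the widely reported anecdote of a 20x difference between spouses, such a ratio would come out at 0.05; a real audit would additionally control for the financial variables the note mentions (credit score, income, and so on) before attributing the gap to gender.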

Special Interest Intangible Harm

An assessment of whether a special interest intangible harm occurred. This assessment does not consider the context of the intangible harm, if an AI was involved, or if there is characterizable class or subgroup of harmed entities. It is also not assessing if an intangible harm occurred. It is only asking if a special interest intangible harm occurred.

Yes

CSETv1_Annotator-2 Taxonomy Classifications

Taxonomy Details

Incident Number

The number of the incident in the AI Incident Database.

92

AI Tangible Harm Level Notes

Notes about the AI tangible harm level assessment

There was a gender bias in the rates and credit limits offered by the Apple Card. This resulted in financial harm based on gender.

Special Interest Intangible Harm

An assessment of whether a special interest intangible harm occurred. This assessment does not consider the context of the intangible harm, if an AI was involved, or if there is characterizable class or subgroup of harmed entities. It is also not assessing if an intangible harm occurred. It is only asking if a special interest intangible harm occurred.

Yes

Date of Incident Year

The year in which the incident occurred. If there are multiple harms or occurrences of the incident, list the earliest. If a precise date is unavailable, but the available sources provide a basis for estimating the year, estimate. Otherwise, leave blank. Enter in the format YYYY.

2019

Estimated Date

“Yes” if the data was estimated. “No” otherwise.

No

Multiple AI Interaction

“Yes” if two or more independently operating AI systems were involved. “No” otherwise.

No

washingtonpost.com · 2019

What started with a viral Twitter thread metastasized into a regulatory investigation of Goldman Sachs’ credit card practices after a prominent software developer called attention to differences in Apple Card credit lines for male and femal…

designnews.com · 2019

The algorithm responsible for credit decisions for the Apple Card is giving females lower credit limits than equally qualified males. Those are the allegations that began spreading as consumers took to social media with complaints about App…

cnbc.com · 2019

When tech entrepreneur David Heinemeier Hansson recently took to Twitter saying the Apple Card gave him a credit limit that was 20 times higher than his wife's, despite the fact that she had a higher credit score, it may have been the first …

qz.com · 2019

US regulators are investigating whether Apple’s credit card, launched in August, is biased against women. Software engineer David Heinemeier Hansson reported on social media that Apple had offered him a spending limit 20 times higher than h…

hbswk.hbs.edu · 2019

The possibility that Apple Card applicants were subject to gender bias opens a new frontier for the financial services sector in which regulators are largely absent, argues Karen Mills.

In late August, the Apple Card debuted with a minimali…

techcrunch.com · 2021

Advocates of algorithmic justice have begun to see their proverbial “days in court” with legal investigations of enterprises like UHG and Apple Card. The Apple Card case is a strong example of how current anti-discrimination laws fall short…

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents (by textual similarity)

Twitter’s Image Cropping Tool Allegedly Showed Gender and Racial Bias · 5 reports

Facebook’s Hate Speech Detection Algorithms Allegedly Disproportionately Failed to Remove Racist Content towards Minority Groups · 2 reports

Amazon’s Robotic Fulfillment Centers Have Higher Serious Injury Rates · 3 reports
