Incident 106: Korean Chatbot Luda Made Offensive Remarks towards Minority Groups

Description: A Korean interactive chatbot was shown in screenshots to have used derogatory and bigoted language when asked about lesbians, Black people, and people with disabilities.

Incident Stats

Incident ID
106
Report Count
13
Incident Date
2020-12-23
Editors
Sean McGregor, Khoa Lam

GMF Taxonomy Classifications

Taxonomy Details

Known AI Goal

Chatbot

Known AI Technology

Autoencoder, Distributional Learning

Known AI Technical Failure

Adversarial Data, Distributional Bias, Unauthorized Data, Inadequate Anonymization, Inappropriate Training Content, Unsafe Exposure or Access
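
The technical-failure classifications above point in part to training data that was not screened before use: private conversation logs containing personal identifiers (Inadequate Anonymization) and toxic or discriminatory utterances (Inappropriate Training Content). The sketch below is a hypothetical illustration of what a pre-training screening step of that kind could look like; the regex patterns, placeholder blocklist terms, and function names are assumptions for illustration and do not describe Scatter Lab's actual pipeline.

import re

# Hypothetical illustration only: patterns, blocklist terms, and names are
# assumptions, not Scatter Lab's actual preprocessing code.
PHONE_RE = re.compile(r"\b\d{2,3}-\d{3,4}-\d{4}\b")    # Korean-style phone numbers
EMAIL_RE = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")  # e-mail addresses
BLOCKLIST = {"example_slur_1", "example_slur_2"}       # placeholder terms

def anonymize(utterance: str) -> str:
    """Mask direct identifiers before an utterance enters the training corpus."""
    utterance = PHONE_RE.sub("[PHONE]", utterance)
    utterance = EMAIL_RE.sub("[EMAIL]", utterance)
    return utterance

def is_acceptable(utterance: str) -> bool:
    """Drop utterances containing blocklisted terms (inappropriate training content)."""
    lowered = utterance.lower()
    return not any(term in lowered for term in BLOCKLIST)

def screen_corpus(raw_utterances: list[str]) -> list[str]:
    """Apply both checks; only screened utterances would be used for training."""
    return [anonymize(u) for u in raw_utterances if is_acceptable(u)]

if __name__ == "__main__":
    sample = ["Call me at 010-1234-5678", "I love my dog"]
    print(screen_corpus(sample))  # ['Call me at [PHONE]', 'I love my dog']

In practice, regex masking and keyword blocklists of this sort catch only the most direct leaks and slurs, which is one reason the reports below describe both privacy breaches and hate speech reaching users.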

CEO says controversial AI chatbot ‘Luda’ will socialize in time
koreaherald.com · 2021

Interactive chatbot ‘Luda,’ subjected to sexual harassment and taught hate speech  

Korean firm Scatter Lab has defended its Lee Luda chatbot in response to calls to end the service after the bot began sending offensive comments and was sub…

AI Chatbot Shut Down After Learning to Talk Like a Racist Asshole
vice.com · 2021

Imitating humans, the Korean chatbot Luda was found to be racist and homophobic.

A social media-based chatbot developed by a South Korean startup was shut down on Tuesday after users complained that it was spewing vulgarities and hate speec…

Korea’s controversial AI chatbot Luda to be shut down temporarily
pulsenews.co.kr · 2021

South Korea’s AI chatbot Lee Luda (Luda) will be temporarily suspended after coming under fire for its discriminatory and vulgar statements, as well as privacy breach allegations.

“We will return with an improved service after addressing th…

South Korean AI chatbot pulled from Facebook after hate speech towards minorities
theguardian.com · 2021

Lee Luda, built to emulate a 20-year-old Korean university student, engaged in homophobic slurs on social media

A popular South Korean chatbot has been suspended after complaints that it used hate speech towards sexual minorities in convers…

(News Focus) Chatbot Luda controversy leaves questions over AI ethics, data collection
en.yna.co.kr · 2021

SEOUL, Jan. 13 (Yonhap) -- Today's chatbots are smarter, more responsive and more useful in businesses across sectors, and the artificial intelligence-powered tools are constantly evolving to even become friends with people.

Emotional chatb…

South Korean chatbot 'Lee Luda' killed off for spewing hate
inputmag.com · 2021

The bot said it 'really hates' lesbians, amongst other awful things.

A chatbot with the persona of a 20-year-old female college student has been shut down for using a shocking range of hate speech, including telling one user it “really hate…

Chatbot shut down after saying it 'hates lesbians' and using racist slurs
thenextweb.com · 2021

A South Korean Facebook chatbot has been shut down after spewing hate speech about Black, lesbian, disabled, and trans people.

Lee Luda, a conversational bot that mimics the personality of a 20-year-old female college student, told one user…

Chatbot Gone Awry Starts Conversations About AI Ethics in South Korea
thediplomat.com · 2021

The “Luda” AI chatbot sparked a necessary debate about AI ethics as South Korea places new emphasis on the technology.

In Spike Jonze’s 2013 film, “Her,” the protagonist falls in love with an operating system, raising questions about the ro…

AI chatbot mired in legal dispute over data collection
koreaherald.com · 2021

Artificial intelligence-based chatbot Lee Luda, whose service was ended this month amid ethics and data collection controversies, faces lawsuits over alleged violations of personal information.

On Friday, around 400 people filed a class action suit against t…

Civic groups file petition over human rights violations by chatbot Luda
koreaherald.com · 2021

South Korean civic groups on Wednesday filed a petition with the country’s human rights watchdog over a now-suspended artificial intelligence chatbot for its prejudiced and offensive language against women and minorities.

An association of …

AI Chatbot ‘Lee Luda’ and Data Ethics
medium.com · 2021

The case of Lee Luda has drawn the public's attention to personal data management and AI in South Korea.

Lee Luda, an AI Chatbot with Natural Tone

Last December, an AI start-up company in South Korea, ScatterLab, launched an AI chatbo…

A South Korean Chatbot Shows Just How Sloppy Tech Companies Can Be With User Data
slate.com · 2021

“I am captivated by a sense of fear I have never experienced in my entire life …” a user named Heehit wrote in a Google Play review of an app called Science of Love. This review was written right after news organizations accused the app’s p…

(2nd LD) Developer of AI chatbot service fined for massive personal data breach
en.yna.co.kr · 2021

SEOUL, April 28 (Yonhap) -- South Korea's data protection watchdog on Wednesday imposed a hefty monetary penalty on a startup for leaking a massive amount of personal information in the process of developing and commercializing a controvers…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
