Incident 106: Korean Chatbot Luda Made Offensive Remarks towards Minority Groups

Description: A Korean interactive chatbot was shown in screenshots to have used derogatory and bigoted language when asked about lesbians, Black people, and people with disabilities.


New ReportNew ReportNew ResponseNew ResponseDiscoverDiscoverView HistoryView History

Incident Stats

Incident ID: 106
Report Count:
Incident Date:
Editors: Sean McGregor, Khoa Lam

GMF Taxonomy Classifications

Taxonomy Details

Known AI Goal


Known AI Technology

Autoencoder, Distributional Learning

Known AI Technical Failure

Adversarial Data, Distributional Bias, Unauthorized Data, Inadequate Anonymization, Inappropriate Training Content, Unsafe Exposure or Access

CEO says controversial AI chatbot ‘Luda’ will socialize in time · 2021

Interactive chatbot ‘Luda,’ subjected to sexual harassment and taught hate speech · 2021

Korean firm Scatter Lab has defended its Lee Luda chatbot in response to calls to end the service after the bot began sending offensive comments and was sub…

AI Chatbot Shut Down After Learning to Talk Like a Racist Asshole · 2021

Imitating humans, the Korean chatbot Luda was found to be racist and homophobic.

A social media-based chatbot developed by a South Korean startup was shut down on Tuesday after users complained that it was spewing vulgarities and hate speec…

Korea’s controversial AI chatbot Luda to be shut down temporarily · 2021

South Korea’s AI chatbot Lee Luda (Luda) will be temporarily suspended after coming under fire for its discriminatory and vulgar statements, as well as privacy breach allegations.

“We will return with an improved service after addressing th…

(News Focus) Chatbot Luda controversy leaves questions over AI ethics, data collection · 2021

SEOUL, Jan. 13 (Yonhap) -- Today's chatbots are smarter, more responsive and more useful in businesses across sectors, and the artificial intelligence-powered tools are constantly evolving to even become friends with people.

Emotional chatb…

South Korean AI chatbot pulled from Facebook after hate speech towards minorities · 2021

Lee Luda, built to emulate a 20-year-old Korean university student, engaged in homophobic slurs on social media

A popular South Korean chatbot has been suspended after complaints that it used hate speech towards sexual minorities in convers…

South Korean chatbot 'Lee Luda' killed off for spewing hate · 2021

The bot said it 'really hates' lesbians, amongst other awful things.

A chatbot with the persona of a 20-year-old female college student has been shut down for using a shocking range of hate speech, including telling one user it “really hate…

Chatbot shut down after saying it 'hates lesbians' and using racist slurs · 2021

A South Korean Facebook chatbot has been shut down after spewing hate speech about Black, lesbian, disabled, and trans people.

Lee Luda, a conversational bot that mimics the personality of a 20-year-old female college student, told one user…

Chatbot Gone Awry Starts Conversations About AI Ethics in South Korea · 2021

The “Luda” AI chatbot sparked a necessary debate about AI ethics as South Korea places new emphasis on the technology.

In Spike Jonze’s 2013 film, “Her,” the protagonist falls in love with an operating system, raising questions about the ro…

AI chatbot mired in legal dispute over data collection · 2021

Artificial intelligence-based chatbot Lee Luda, whose service was shut down this month amid controversy over ethics and data collection, faces lawsuits on charges of violating personal information laws.

On Friday, around 400 people filed a class action suit against t…

Civic groups file petition over human rights violations by chatbot Luda · 2021

South Korean civic groups on Wednesday filed a petition with the country’s human rights watchdog over a now-suspended artificial intelligence chatbot for its prejudiced and offensive language against women and minorities.

An association of …

AI Chatbot ‘Lee Luda’ and Data Ethics · 2021

The case of Lee Luda has aroused the public’s attention to the personal data management and AI in South Korea.

Lee Luda, an AI Chatbot with Natural Tone

Last December, an AI start-up company in South Korea, ScatterLab, launched an AI chatbo…

A South Korean Chatbot Shows Just How Sloppy Tech Companies Can Be With User Data · 2021

“I am captivated by a sense of fear I have never experienced in my entire life …” a user named Heehit wrote in a Google Play review of an app called Science of Love. This review was written right after news organizations accused the app’s p…

(2nd LD) Developer of AI chatbot service fined for massive personal data breach · 2021

SEOUL, April 28 (Yonhap) -- South Korea's data protection watchdog on Wednesday imposed a hefty monetary penalty on a startup for leaking a massive amount of personal information in the process of developing and commercializing a controvers…


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
