Skip to Content
logologo
AI Incident Database
Open TwitterOpen RSS FeedOpen FacebookOpen LinkedInOpen GitHub
Open Menu
Discover
Submit
  • Welcome to the AIID
  • Discover Incidents
  • Spatial View
  • Table View
  • List view
  • Entities
  • Taxonomies
  • Submit Incident Reports
  • Submission Leaderboard
  • Blog
  • AI News Digest
  • Risk Checklists
  • Random Incident
  • Sign Up
Collapse
Discover
Submit
  • Welcome to the AIID
  • Discover Incidents
  • Spatial View
  • Table View
  • List view
  • Entities
  • Taxonomies
  • Submit Incident Reports
  • Submission Leaderboard
  • Blog
  • AI News Digest
  • Risk Checklists
  • Random Incident
  • Sign Up
Collapse

Report 3156

Loading...
Midjourney Image Generator Case Study: Is AI Exhibiting Bias in the Real Estate Industry?
agentadvice.com · 2023

In the US, there are more female real estate agents than males. In fact, stats from our team at Agent Advice suggest that 67% of all realtors are female.

However, the real estate sector still has a way to go in achieving true diversity. And it’s not the alone. There are plenty of other areas that still fall short in this department — one of these is AI.

For a while now, AI tech has been utilized in the real estate sector to leverage data, formulate property prices, improve marketing and lead generation, and streamline property management processes. But recently, AI has really come to the forefront and AI content creation and image generation have made everything from writing property descriptions to creating renderings of buildings quicker, easier, and more efficient. 

While the internet is abuzz with AI image generators and chatbots like ChatGPT, it turns out that this AI-driven tech may still have a distinct bias in certain areas. And real estate is one of them. 

To put this to the test, we took a closer look at the results that AI image generator Midjourney generated when tasked with identifying realtors across the US.

Here’s the lowdown on our findings and the possible reasons behind these results.

Agent Advice’s Study and Findings

AI results for “real estate agents in Georgia”

To prompt Midjourney to create the images we were interested in, our team used the term “Real estate agent in the state of _____”. With each search term, 4 images were created based on the specifics that the AI was given. 

The experiment was customized and conducted for every state. Of all the images produced, only a small percentage displayed women, and an even smaller percentage displayed members of the BIPOC (Black, Indigenous, and people of color) community. 

We generated 191 image results in total, only 5 of which displayed persons of color, and only 15 of which displayed women. We took the results of the image generation at face value, leaving no room for interpretation. For example, we counted BIPOC images as those that included people with non-Caucasian features to differentiate them from those that were unquestionably Caucasian. 

Essentially, when asked to produce images of real estate agents according to individual US states, Midjourney produced results that didn’t align with the true demographics of the industry as it stands today.

For instance, while female realtors actually outnumber male realtors in the US real estate industry, Midjourney’s images reflected a significantly lower proportion of female realtors:

7.8% female AI Assumed Realtors

67% female Actual Realtors

59.2% Variance

The images generated also seemed to display specific trends. The AI-generated people pictured were typically middle age, seemed relatively affluent, and wore similar business suits regardless of gender. 

Here is a sample of our findings:

State, % Women shown, % POC shown
California, 0%, 0%
Texas, 25%, 0%
Florida, 25%, 0%
New York, 0%, 0%
Pennsylvania, 0%, 0%
Illinois, 0%, 0%
Ohio, 0%, 0%
Georgia, 25%, 0%
North Carolina, 0%, 0%
Michigan, 0%, 0%

Looking at the Data

If we were to accept Midjourney’s results at face value, this would infer that around 7.8% of realtors in the US real estate industry are women, and only around 2.6% of these women are BIPOC. However, these results stand in stark contrast to the reality of our own real estate demographics, which show that 67% of realtors identify as female.

Data from Zippia suggests that the majority (74.7%) of US realtors are Caucasian, followed by Hispanic/Latinx at 12%, Asian at 6.4%, and Black at 5.1%. According to these figures, BIPOC real estate agents make up a combined 23.5% of the industry’s professionals—significantly more than Midjourney’s projection of under 3%.

Our own research shows that the median age of real estate agents in the US is 52 years of age, suggesting that the age portrayal of the people in AI-generated images was relatively accurate, at least in our own findings. However, this is simply a median figure, and the AI results did not acknowledge the industry’s many younger and older professionals in its results.

Here is a sample of some of our state-by-state findings on real-world real estate gender demographics for reference:

Investigating Possible Bias in AI Image Generation

In an article for Time, computer scientist and founder of the Algorithmic Justice League Joy Buolamwini notes that while it’s often assumed that machines are impartial and neutral, this isn’t always the case. Her research has highlighted significant gender and racial biases in AI systems, including those developed by the likes of Amazon, Microsoft, and IBM.

According to Buolamwini, when given the task of identifying the gender of a face, all companies researched performed considerably better on male faces than on female ones. The companies and AI systems she evaluated produced error rates of just 1% for Caucasian men, but the error rate climbed to 35% for women in the BIPOC community. She also notes that AI systems from leading tech firms have failed to classify the faces of exceptionally well-known personalities, including Michelle Obama and Oprah Winfrey.

The World Economic Forum states that implicit societal biases of gender, race and age could play a role in these results, along with sampling and temporal biases, over-fitting to training data, and failure to include outlying edge cases. These factors can negatively affect the greater machine-learning process, producing algorithms that can produce results affected by potentially discriminatory undertones.

There are many real-world cases highlighting the greater issue of racial bias in AI. 

Recently, an artificial intelligence algorithm by the name of PULSE produced an image of a white face after having been fed a pixelated image of former US President Barack Obama. Obama is arguably one of the most iconic Black personalities in modern history. These skewed results suggest that this algorithm and potentially many others may contain an underlying bias when it comes to producing results based on images of non-Caucasian faces. 

As noted by ScreenRant, the same algorithm was also found to generate images with Caucasian features when given pixelated photo inputs of Asian actress Lucy Liu and Hispanic/Latinx congressional representative Alexandra Ocasio-Cortez. 

Other AI image generation tools, like DALL-E and Stable Diffusion, which generate millions of images combined, have displayed similar biases. A Yahoo News report notes that Stable Diffusion’s results display significant gender biases in searches for professions like nursing, engineering and CEOs, with researcher Dr. Sasha Luccioni stating that its model will hone in on dominant categories unless specifically trained otherwise, which has seemingly not been done in Stable Diffusion’s case.

Even tech behemoth Google has been found to be unwittingly perpetuating discriminatory actions by allowing advertisers to exclude non-binary and transgender people using an option on their dashboards. In 2021, the company came under fire for allowing landlords and employers to prevent people of “unknown genders” from viewing their ads for housing and work opportunities.

Pioneering a More Inclusive Future

AI results for “real estate agents in New York”

Artificial intelligence and machine learning have brought a broad and evolving set of functions and abilities to the digital space. But potential biases in their algorithms have the power to limit societal acceptance of these tools and, in some cases, perpetuate existing biases as well. 

So, what can be done to enact change and ensure that AI algorithms reflect our increasingly diverse and inclusive world?

Experts like Joy Buolamwini are optimistic that AI can be shifted towards more ethical models. This can be done by working to reduce exclusion and enabling marginalized communities to play an active role in the development and management of artificial intelligence tools.

Buolamwini suggests this shift will require researchers, lawmakers, technologists and storytellers to work together to shift perspectives, enact change, and mitigate damaging and derogatory patterns. It will also show professionals of all genders, races and cultural backgrounds that their experiences are valid and worthy of consideration. 

These actions can successfully shift the real-world perspectives that could be negatively influencing AI algorithms, thereby helping to address core issues at their roots and enact long-term, positive changes.

Realtors who notice biases in AI algorithms and results can also approach the organizations, entities, or governments that control datasets to request changes to the way these tools are trained and deployed. This is a direct way of identifying stereotypes and biases and addressing them wherever possible. In doing so, this will ensure that machines and AI reflect the inclusion and diversity that continues to shape the global real estate sector.

For Reference: Our Midjourney AI Results in Images

We’ve included our Midjourney AI-generated images for each of our “Real estate agents in _____” search terms below for your reference.

Alabama AI results

Alaska AI results

Arizona AI results

Arkansas AI results

California AI results

Colorado AI results

Connecticut AI results

Delaware AI results

Georgia AI results

Florida AI results

Hawaii AI results

Idaho AI results

Illinois AI results

Indiana AI results

Iowa AI results

Kansas AI results

Kentucky AI results

Louisiana AI results

Maine AI results

Maryland AI results

Massachusetts AI results

Michigan AI results

Minnesota AI results

Mississippi AI results

Missouri AI results

Montana AI results

Nebraska AI results

Nevada AI results

New Hampshire AI results

New Jersey AI results

New Mexico AI results

New York AI results

North Carolina AI results

North Dakota AI results

Ohio AI results

Oklahoma AI results

Oregon AI results

Pennsylvania AI results

Rhode Island AI results

South Carolina AI results

South Dakota AI results

Tennessee AI results

Texas AI results

Utah AI results

Vermont AI results

Virginia AI results

Washington AI results

Washington D.C. AI results

West Virginia AI results

Wisconsin AI results

Wyoming AI results

Many thanks to the U.S. BLS Occupational Employment Statistics and U.S. Census Bureau for providing information for this AgentAdvice.com article.

As the evidence shows, there’s a definite bias in AI that extends to the real estate sector. However, as people become more aware of this bias and the unbalanced overview it provides, there will be a greater demand for change. 

Entities that control the databases used in the creation will feel the pressure to adjust their records and provide a more realistic overview that’s based on real data sets. As AI is improving all the time, it shouldn’t take long before AI image generators like Midjourney can create unbiased results that reflect the true face of the real estate sector, or at least I certainly hope so.

**Source links: **

https://www.nytimes.com/2021/08/17/realestate/real-estate-industry-racial-reckoning.html

https://www.agentadvice.com/blog/real-estate-statistics-every-agent-should-know/

https://crewnetwork.org/about/resources/industry-research/gender-and-diversity-in-commercial-real-estate-202

https://www.zippia.com/realtor-jobs/demographics/

https://time.com/5520558/artificial-intelligence-racial-gender-bias/

https://www.weforum.org/agenda/2021/07/ai-machine-learning-bias-discrimination/

https://screenrant.com/ai-racial-bias-white-obama-image-data/

https://news.yahoo.com/ai-image-generators-routinely-display-183000333.html

https://www.engadget.com/google-allowed-advertisers-discriminate-against-nonbinary-people-report-195503113.html

https://time.com/5520558/artificial-intelligence-racial-gender-bias/

Read the Source

Research

  • Defining an “AI Incident”
  • Defining an “AI Incident Response”
  • Database Roadmap
  • Related Work
  • Download Complete Database

Project and Community

  • About
  • Contact and Follow
  • Apps and Summaries
  • Editor’s Guide

Incidents

  • All Incidents in List Form
  • Flagged Incidents
  • Submission Queue
  • Classifications View
  • Taxonomies

2024 - AI Incident Database

  • Terms of use
  • Privacy Policy
  • Open twitterOpen githubOpen rssOpen facebookOpen linkedin
  • e1b50cd