Incident 43: Racist AI behaviour is not a new problem

Description: From 1982 to 1986, St George's Hospital Medical School used a program to automate a portion of their admissions process that resulted in discrimination against women and members of ethnic minorities.
Alleged: Dr. Geoffrey Franglen developed an AI system deployed by St George's Hospital Medical School, which harmed Women and Minority Groups.

Suggested citation format

Yampolskiy, Roman. (1998-03-05) Incident Number 43. In McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.


CSET Taxonomy Classifications

Taxonomy Details

Full Description

From 1982 to 1986, St George's Hospital Medical School used a program to autonomously select candidates for admissions interviews. The system, designed by staff member Dr. Geoffrey Franglen, used past admission data to select potential students based on their standardized university applications. After the program achieved 90-95% match with the admission panel’s selection of interview candidates, it was entrusted as the primary method to conduct initial applicant screening. In 1986, lecturers at the school recognized that the system was biased against women and members of ethnic minorities and reported the issue to Britain’s Commission for Racial Equality.

Short Description

From 1982 to 1986, St George's Hospital Medical School used a program to automate a portion of their admissions process that resulted in discrimination against women and members of ethnic minorities.



Harm Distribution Basis

Race, Sex

Harm Type

Harm to civil liberties

AI System Description

A custom-designed statistical analysis program that used data from past admissions decisions to select which university applicants would be given admissions interviews.

System Developer

Dr. Geoffrey Franglen

Sector of Deployment

Human health and social work activities

Relevant AI functions


AI Techniques

Machine learning

AI Applications

decision support


Location

London, England

Named Entities

St George’s Hospital Medical School, Dr. Geoffrey Franglen, University Central Council for Admission, Commission for Racial Equality

Technology Purveyor

St George’s Hospital Medical School, Dr. Geoffrey Franglen

Beginning Date


Ending Date


Near Miss

Harm caused



Lives Lost


Laws Implicated

United Kingdom's Race Relations Act

Data Inputs

Standardized university admission form, Previous admission and rejection decisions

Incidents Reports

A Blot on the Profession

Discrimination in medicine against women and members of ethnic minorities has long been suspected, but it has now been proved. St George's Hospital Medical School has been found guilty by the Commission for Racial Equality of practising racial and sexual discrimination in its admissions policy. The commission decided not to serve a nondiscrimination notice on the school, which it is empowered to do by the Race Relations Act, but as many as 60 applicants each year among 2000 may have been refused an interview purely because of their sex or racial origin. This is a sad finding not only for St George's Hospital Medical School but for the whole profession. It is now important not only that discrimination is swept out of St George's and the profession but also that it is seen to be swept out.

The story began in December 1986 when the commission was informed by Dr A Burke and Dr J Collier, both senior lecturers at St George's, that a computer program used in the initial screening of applicants for places at the school unfairly discriminated against women and people with non-European sounding names. The program had been developed by Dr Franglen, a member of staff, to reduce the work of selecting candidates for interview. It was also hoped that it would eliminate any inconsistencies in the way the admissions staff carried out their duties. The program was written after careful analysis of the way in which the staff were making these choices and was modified until by 1979 it was giving a 90-95% correlation with the gradings of the selection panel. This point is important: the program was not introducing new bias but merely reflecting that already in the system. By 1982 all the initial selection was being done by computer. Details of each candidate were obtained from his or her University Central Council for Admission (UCCA) form, but since this contains no reference to race this was deduced from the surname and place of birth. The computer used this information to generate a score which was used to decide which applicants should be interviewed. Women and those from racial minorities had a reduced chance of being interviewed independent of academic considerations.

Ironically St George's has a better record on racial matters than most of the other London medical schools and admits a higher than average proportion of students from ethnic minorities. For example, 12% of the students there had non-European sounding names compared with only 5% at the Westminster Medical School. This is more worrying than reassuring as it raises the question of what is happening in the other schools.

The commission has made recommendations not just about this particular episode but also about how other schools can avoid similar difficulties. It is emphasised that where a computer program is used as part of the selection process all members of staff taking part have a responsibility to find out what it contains. A major criticism of the staff at St George's was that many had no idea of the contents of the program and those who did failed to report the bias. All staff participating in selection should be trained so that they are aware of the risk of discrimination and try to eliminate it. No one person should have sole responsibility for any stage of the process. The commission recommends that a question on racial origin be included in the UCCA form. The percentage of non-European students in a medical school provides little information unless the proportion among applicants for places is known. At present this information is unobtainable, and it is ironic that protection of the interests of minority groups should necessitate their identification on application forms. If this information is collected the ratio of students from ethnic minorities accepted will have to be monitored; perhaps this should be a job for the General Medical Council.

Many doctors, medical students, and lay people believe that discrimination on grounds of sex or race is widespread in allocations of places at medical schools and later at appointments to jobs. Gradually statistical evidence supporting this is becoming available. What factors encourage this to continue? St George's receives about 12 applicants for each of its 150 places each year. About a quarter are interviewed and roughly 70% of these are offered places. The competition for jobs after qualification is even greater and worsens as one moves up the career ladder. This is strikingly illustrated in the letter from Professor J R Salaman on p 717. Appointments committees need to weed out the applicants somehow, and at the early stages this can be a fairly random process. Exceptional candidates will be selected but what happens to the many suitable people remaining?

It is easy to see why women might be discriminated against: there is more risk of them wanting time off work because of family commitments. Likewise, some overseas doctors do not have a sufficient command of English to practise medicine, because understanding the colloquial language is as important as grasping the technical terms. These are unpopular but valid points. The difficulty arises in the attempts which have been made to deal with them. The way to cope with the family commitments of women doctors is not to refuse to appoint women. As nearly half of medical school entrants are now women the National Health Service cannot afford such a policy. It would be far better to look at ways of providing suitable creche facilities, which would have the extra benefits of allowing nurses to be more flexible in the shifts they could work and improving the running of many of our hospitals. Similarly, discriminating against all those who have foreign names or black faces is an inefficient way of excluding those with a poor command of English. If the Professional and Linguistic Assessment Board examination is not sufficiently helpful better ways of testing language must be devised and more facilities provided to help those who need to improve.

Discrimination is wrong, but it is not enough to identify it; the reasons for it must be sought and solutions found. Although discrimination may arise as an inappropriate response to a genuine problem, all too often there is no explanation other than historical and cultural traditions. The attitudes at St George's cannot be excused. Only candidates applying on UCCA forms were involved, and they would all have had a good command of English. A study of doctors who obtained the membership of the Royal College of Psychiatrists in November 1981 or April 1982 showed that four times as many overseas as British graduates were still in registrar posts by 1984. This discrimination cannot be explained as all the doctors had obtained higher qualifications and must have been competent in English and psychiatry.

St George's cooperated fully with the inquiry and has taken steps to avoid a recurrence. Attempts are being made to contact people who may have suffered, and three previously unsuccessful applicants have been offered places at the school. Other medical schools and appointments committees must ensure that any discrimination in their methods is identified and removed. A further incident like this may lead to a prosecution under the Race Relations Act. More importantly, medicine needs graduates from ethnic minorities and has an outstanding international tradition; it should be leading the way in assessing fairly all who want to enter its ranks.

A Blot on the Profession

As AI spreads, this will become an increasingly important and controversial issue:

For one British university, what began as a time-saving exercise ended in disgrace when a computer model set up to streamline its admissions process exposed – and then exacerbated – gender and racial discrimination.

As detailed here in the British Medical Journal, staff at St George's Hospital Medical School decided to write an algorithm that would automate the first round of its admissions process. The formula used historical patterns in the characteristics of candidates whose applications had traditionally been rejected to filter out new candidates whose profiles matched those of the least successful applicants.

By 1979 the list of candidates selected by the algorithm was a 90-95% match for those chosen by the selection panel, and in 1982 it was decided that the whole initial stage of the admissions process would be handled by the model. Candidates were assigned a score without their applications passing before a single pair of human eyes, and this score was used to determine whether or not they would be interviewed.

Quite aside from the obvious concerns a student would have upon finding out that a computer was rejecting their application, a more disturbing discovery was made. The admissions data used to define the model's outputs showed bias against women and people with non-European-sounding names.

The truth was discovered by two professors at St George’s, and the university co-operated fully with an inquiry by the Commission for Racial Equality, both taking steps to ensure the same would not happen again and contacting applicants who had been unfairly screened out, in some cases even offering them a place.

Computers which magnify our prejudices

Professor Margaret Boden, an AI and cognitive science researcher, took the time to speak to me in 2010 about computers, AI, morality and the future. One of the stories she told me comes back to me every now and then, most recently with Microsoft's failure to anticipate the result of releasing its chatbot, Tay, into the world, only to see it become racist in less than 24 hours.

In the early '80s St George's Hospital Medical School in London decided to automate parts of its admissions workflow. The existing process was time consuming and expensive. Before applicants were invited for interviews, their initial suitability was assessed based on their grades, classes taken and other criteria. It was that stage that was going to be automated.

A statistical system for weighting those different criteria was devised. The program would select the highest ranking candidates to invite for an interview. After the initial design was complete, the program was trained to adjust the weighting it gave to different criteria based on previous years’ admissions datasets. It was taught to achieve results consistent with the way human staff would select potential students.

The program was used between 1982 and 1986, until in 1986 two members of staff complained to the Commission for Racial Equality. The program had learnt to discriminate against non-white and female applicants. Those with postcodes betraying their working-class background were also given lower priority in the selection process.
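The mechanism is easy to reproduce. Below is a minimal sketch, with entirely hypothetical data and feature names (nothing here comes from the actual St George's program): a simple logistic-regression scorer fitted to past panel decisions learns a negative weight on any proxy feature that correlated with rejection, and then penalizes new applicants who carry that proxy even when their marks are identical.

```python
import math

def fit_weights(rows, labels, lr=0.5, epochs=1000):
    """Plain logistic regression by stochastic gradient descent."""
    w = [0.0] * (len(rows[0]) + 1)          # last entry is the bias term
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + w[-1]
            p = 1.0 / (1.0 + math.exp(-z))  # predicted P(interview)
            for i, xi in enumerate(x):
                w[i] += lr * (y - p) * xi
            w[-1] += lr * (y - p)
    return w

def score(w, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + w[-1]
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical history: (features, panel decision). Features are
# [exam_mark in 0-1, name_flag in {0, 1}]; the flag stands in for a
# proxy attribute the historical panel implicitly penalized.
past = [([0.9, 0], 1), ([0.9, 1], 0), ([0.8, 0], 1), ([0.8, 1], 0),
        ([0.7, 0], 1), ([0.7, 1], 0), ([0.4, 0], 0), ([0.4, 1], 0)]
w = fit_weights([x for x, _ in past], [y for _, y in past])

# Two new applicants with identical marks, differing only on the proxy:
plain, flagged = score(w, [0.85, 0]), score(w, [0.85, 1])
print(plain > flagged)   # True: the fitted weights carry the old bias
```

Nothing in the fitting procedure is malicious; the model is simply optimized to agree with past decisions, so any bias in those decisions becomes a learned weight.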

The story making the news: The Age, 26th February 1988.

The quote from the spokesperson for the Commission for Racial Equality at the time was damning:

St George’s computer program merely replicated the discrimination that was already being practised by the selectors, and there is no reason to believe that the selectors at St George’s were any more discriminatory than selectors elsewhere.

— The Age, 26th February 1988, (emphasis mine)

That was in the ’80s. Plus ça change.

Racist AI behaviour is not a new problem

Companies and governments need to pay attention to the unconscious and institutional biases that seep into their algorithms, argues cybersecurity expert Megan Garcia. Distorted data can skew results in web searches, home loan decisions, or photo recognition software. But the combination of increased attention to the inputs, greater clarity about the properties of the code itself, and the use of crowd-level monitoring could contribute to a more equitable online world. Without careful consideration, Garcia writes, our technology will be just as racist, sexist, and xenophobic as we are.

Racist in the Machine
