AI Incident Database
Report 6772

Heber City Police Department test-pilots AI software
parkrecord.com · 2025

Heber City Police Chief Parker Sever appeared before the City Council in late October and informed them an officer had shape-shifted into a frog.

At least, that's what he heard from his son-in-law, who works for the West Jordan Police Department. That department had recently implemented artificial intelligence software that generates police reports using body camera footage. The details aren't always 100% accurate --- which is why officers review them.

"I read the report, and I'm like, 'Man, this really looks like an officer wrote it,'" Sever recalled. "But when it got to one part, it said, 'And then the officer turned into a frog, and a magic book appeared and began granting wishes.' ... It was because they had, like, 'Harry Potter' on in the background. So it picked up the noise from the TV and added it to the report."

The Heber City Police Department is test-piloting similar software.

The candidates are Code Four, a startup launched by two 19-year-old MIT dropouts earlier this year, and law enforcement technology juggernaut Axon's Draft One. Select weekday officers have access to Draft One, while weekend officers use Code Four. 

The two-month free trial periods for these programs expire in January.

The selling point of AI software that creates police reports is time savings.

Axon and Code Four cite similar statistics that police officers spend up to 40% of their time writing police reports.

 "I think most of the public wants our officers out on the street, not in the office," Sever said.

That 'brand new car'

Josh Weishar, a Police Department sergeant who had exclusively used Draft One, testified to the time savings on Nov. 10, about a week into the pilot program.

Weishar thought the AI-generated police reports were typically more detailed than what he would have written.

"An officer's going to go with the basic information," he said. "I think you'll get more info, or at least, definitely more detailed info, than you would from just typing it out on your own because you're just going to remember the key facts instead of all the detail that comes from that AI's portion."

That was one of the reasons the software appealed to Heber City Police --- an officer might not be as detailed if they're "down 10 reports," Sever said, but AI can always include the same level of detail.

Both Draft One and Code Four allow the user to customize the level of detail the AI software includes in the report, from basic synopses to in-depth play-by-plays, and can generate reports from interactions in English and Spanish.

The major difference between the two programs is that Draft One creates a report based on bodycam audio, while Code Four uses both bodycam audio and video.

End-of-shift reports, which provide an overview of incidents over an entire shift, will continue to be written exclusively by humans.

"Most of our officers are kind of awestruck with (Draft One) because it's such a new, innovative thing for us," Weishar said. "It's like that brand new car that's got all the features to it. For us, it's crazy that you can just press a button and it'll tell you everything about the case that you were on and give you a pretty decent police report to edit."

How Draft One works

In a demonstration of Draft One, Weishar showed that the draft report is littered with prompts that ask the officer to provide additional detail about certain elements, like what a suspect looked like or the state of the crime scene. The prompts, which have to be deleted or edited to continue, are intended to force the officer to review the report with a fine-tooth comb. 

Once they've completed their edits and approved the report, the officer copies and pastes the report into the Police Department's database and submits it as they would any other police report.

The original report generated by Draft One is not retained. However, the software creates an "unalterable digital audit trail" that captures who used the tool, when they used it and what evidence was involved, according to Axon's website.

The report ends with a disclosure that is turned on by default: "I acknowledge this report was generated using Draft One by Axon. I further acknowledge that I have reviewed the report in detail, made any necessary edits, and believe it to be an accurate representation of my recollection of the reported events. If needed, I am willing to testify to the accuracy of this report."

The disclosure can be customized or turned off.

Comparison shopping

If the Police Department chooses to continue with Draft One, it would cost about $30,000 of taxpayer dollars per year, according to Sever. 

Code Four co-founder George Cheng criticized Axon's pricing, citing add-ons such as extra video storage and processing. He estimated those add-ons can push the cost to $400 per officer per month. Code Four is significantly cheaper at $30 per officer per month; Sever estimated it would cost Heber City between $6,000 and $8,000 annually.

Cheng visited the Taylorsville and Heber City police departments before Thanksgiving to demonstrate how the software works. He's made similar house calls at the Spanish Fork, South Jordan, Kaysville and Draper police departments.

Sever said he's only able to authorize expenditures up to a certain amount without Heber City Council approval. Purchasing Draft One for around $30,000 per year would require such permission. Buying Code Four at around $6,000 to $8,000 per year would not.

As of December, Heber City Police were leaning toward Code Four because of its lower cost and ability to write more comprehensive reports.

Another case

The Heber City Police Department isn't the only law enforcement agency in the Wasatch Back test-piloting AI software.

The Summit County Sheriff's Office recently completed a 90-day pilot program of Draft One among six patrol deputies. The Park City Police Department and Wasatch County Sheriff's Office have not implemented any AI software practices.

Sheriff's Sgt. Skyler Talbot said his outlook on AI in policing is a balance of caution and optimism.

"It's important that we recognize the capabilities of AI and the strong potential to increase operational efficiency," he said. "I think it's important that we stay on the cutting edge of that as it develops. But with that ... I don't want to say we should be skeptical, but we definitely need to be cautious."

Talbot said Draft One was not used to draft reports for "major incidents or any incident that might result in felony level charges" during the pilot program.

On Nov. 25, Talbot informed The Park Record that the Summit County Sheriff's Office had chosen not to continue with Draft One past the free trial.

He explained that Draft One has an administrative setting that forces the AI software to insert obvious errors into every report. 

"What that did is prompted and really forced the deputy to go through, read the report, check it for accuracy and make those corrections before it would allow them to continue," Talbot said. Essentially, the time savings weren't as impressive.

Talbot explained that enabling the setting was a necessity if the Summit County Sheriff's Office were to implement Draft One, ensuring deputies read each draft report carefully and checked it for accuracy before submitting it.

Meanwhile, the Heber City Police Department does not have this setting turned on. Sever said he was not aware of the setting and would have to look into its value.

Talbot said it wasn't "fiscally responsible" for the Summit County Sheriff's Office to use taxpayer funding on the program, though he clarified that the stance could change in a matter of years or months as the field evolves.

"Again, this is useful. This is great technology for our department size, for our call volume," he said. "It just makes more sense, at this point anyway, just to do things the way we were doing them before."

Ethical concerns

Not all are optimistic about AI-drafted police reports. 

David Ferguson, executive director of the Utah Association of Criminal Defense Lawyers, detailed some of the potential pitfalls of such AI software. He said that police reports are vital to defense attorneys because they can reveal whether the writer has any biases. 

But because generative AI can "hallucinate," getting details wrong or making them up entirely, it would be "unethical" for a defense attorney to take an AI-drafted police report at face value, Ferguson said. That would lead attorneys to review bodycam footage to verify basic details in more and more cases, increasing the workload for defense attorneys, particularly overworked public defenders.

Prosecutors, on the other hand, typically rely on the evidence given to them by the police, including police reports.

A report generated by AI software also has the potential to modify an officer's memories of an incident. If an officer who used generative AI to draft a report testified in court, they could end up testifying based on the AI's observations, not their own.

"When we compare notes, we tend to pull another person's observations into our own memory," Ferguson explained. The same could be said for the observations of an AI software.

"If the full narrative is whatever the body camera has picked up, that becomes the proposed truth of whatever happened. ... At some point, why do we have officers testify at all?" Ferguson said. "It's deeply offensive, in some sense, to think about a system in which humans are being prosecuted by cameras and whatever a software determines is relevant from what the camera picks up."

This can be especially problematic if the transcription struggles with an accent or the meaning of slang terms. Further problems could emerge with AI video analysis. Ferguson explained that AI can have "difficulty with interpreting and understanding faces of minorities," and that AI-attributed observations about a person's actions, body language, behavior and facial expressions could be misunderstood --- especially among minority communities.

According to Axon, studies for racial bias in Draft One "could not detect a statistically significant difference in Completeness, Consistency or Word Choice Severity between races." The studies evaluated whether AI-generated transcripts differed depending on race.

In April 2024, the company said a more detailed report about the studies would be released. However, The Park Record was unable to find additional information about the racial bias studies.

Lack of transparency was another of Ferguson's concerns, since private corporations aren't subject to the same transparency laws, like the Freedom of Information Act and the Privacy Act, as a government agency is. That could make it difficult for defense attorneys or the public to dig into the software to better understand how it works.

He was also skeptical that AI-drafted police reports would be properly reviewed by officers before they hit approval. 

"The problem isn't an officer who's able to condense an hour's worth of work into 10 minutes. The problem is the officer who says, 'I can do even better. I can condense this into 30 seconds.' ... You have to have your head in the sand to think that that's not going to become normal among some officers, and even culturally accepted in some departments," Ferguson said.

But Sever believes any faulty reports will be caught in the administrative process.

"Once (a police officer) signs off on (a police report), it goes to a supervisor. That supervisor is required to look at that report for completeness and accuracy, and then sign off on it," he explained. "Then our records department gets the report, ... and lots of times, they're the ones that also kick back reports for something that they see."

An officer who continually failed to review AI-drafted reports before submitting them would be disciplined and potentially terminated.

Looking forward

Sever said the only reason the department may not immediately implement one of these programs is if it doesn't fit into the current budget cycle. 

"An officer in the station does not affect slowing people down on the road. They're not available to respond to a call in a timely manner," Sever said. "When we can get them out of the station, on the road, is when the public is going to see them, and that's when it changes people's behaviors for the better."

Feedback from officers seems to have been positive as well.

"One of my sergeants, he's not a big tech guy, but he's super excited about this," Sever said. "He goes, 'I came back to the station. This report would have taken me an hour, and it took me eight minutes.'"
