Incident 261: Robot Deployed by Animal Shelter to Patrol Sidewalks outside Its Office, Warding off Homeless People in San Francisco

Description: The San Francisco Society for the Prevention of Cruelty to Animals (SPCA) deployed a Knightscope robot to autonomously patrol the area outside its office and ward off homeless people. Residents criticized the robot as a tool of intimidation, and the city of San Francisco ordered the SPCA to stop operating it on a public right-of-way.

Suggested citation format

Perkins, Kate. (2017-11-15) Incident Number 261. In Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 261
Report Count: 8
Incident Date: 2017-11-15
Editor: Khoa Lam



Incident Reports

San Francisco residents continue to rage against the machines.

While the city's Board of Supervisors moves toward finalizing limits on robots that roam the sidewalks to deliver food and goods, it must also find a way to handle security robots that patrol public sidewalks.

The S.F. SPCA in the Mission District started using a security robot about a month ago in its parking lot and on the sidewalks around its campus, which takes up a whole city block at Florida and 16th Streets. Last week, the city ordered the SPCA to keep its robot off the sidewalks or face a penalty of up to $1,000 per day for operating in the public right-of-way without a permit.

The security robot is just the latest in a growing list of uses for robots around the city, from rental agents to food couriers. The robot surge could draw local government into more questions about its role in regulating the machines, especially if they operate in the public right-of-way.

For the SPCA, the security robot, which they've dubbed K9, was a way to try dealing with the growing number of needles, car break-ins and crime that seemed to emanate from nearby tent encampments of homeless people along the sidewalks.

“We weren’t able to use the sidewalks at all when there’s needles and tents and bikes, so from a walking standpoint I find the robot much easier to navigate than an encampment,” Jennifer Scarlett, the S.F. SPCA’s president, told the Business Times.

Once the SPCA started using the robot on the sidewalks around its campus in early November, Scarlett said, there were no more homeless encampments. There were also fewer break-ins to cars in the campus parking lot. It’s not clear that the robot was the cause of the decreases, Scarlett added, but they were correlated.

The people in the encampments showed their displeasure with the robot’s presence at least once. Within about a week of the robot starting its automated route along the sidewalks, some people setting up a camp “put a tarp over it, knocked it over and put barbecue sauce on all the sensors,” Scarlett said.

The robot upset local resident Fran Taylor, too. Last month, the robot approached Taylor while she walked her dog near the SPCA campus. Her dog started lunging and barking, she said, and Taylor yelled for the robot to stop. It finally came to a halt about 10 feet away, she said.

The encounter struck Taylor as an “unbelievable” coincidence since she had been working with pedestrian advocacy group Walk San Francisco in asking the city to limit sidewalk delivery robots. That legislation is expected to receive final approval soon but doesn’t apply to security robots like K9.

Taylor said she’s concerned about robots bumping into people on the sidewalks. She knows robots are often equipped with sensors so they don’t do that, she added, but “I don’t really trust that.”

She wrote an email to the SPCA the day of her encounter and copied several San Francisco government officials, including Mayor Ed Lee and members of the Board of Supervisors. The SPCA team responded and cited security concerns as the motivation for starting to use the robot.

On Dec. 1, the Department of Public Works sent the SPCA an email saying that the robot is operating in the public right-of-way “without a proper approval.” The SPCA would have to stop using the robot on sidewalks or request a proper permit, according to the DPW email reviewed by the Business Times.

Scarlett said the SPCA stopped using the robot on the sidewalks and handed the issue over to the robot’s maker, Mountain View-based Knightscope, for further discussion with the city. Knightscope didn't respond to a request for comment about the status of those talks.

The robot is a K5 unit and has a top speed of three miles per hour, according to Knightscope’s website. The units are more than five feet tall and weigh 400 pounds. They are equipped with four cameras, “each capable of reading up to 300 license plates per minute” and sending alerts when trespassers or people on a “blacklist” are in an area.

In addition to her concerns about sidewalk safety, Taylor said the robot’s route and cameras seemed “like an obvious attack on the very people in San Francisco who are already having such a hard time surviving in this expensive city.”

Having humans replace the robot’s 24/7 shift would be “cost prohibitive,” though, Scarlett said. The robot costs about $6 per hour to rent, she said. The minimum wage in San Francisco is $14 per hour.

The SPCA also employs two security guards this time of year because its staff brings back animals at night from displays in the Macy’s store holiday window.

“I can understand being scared about a new technology on the street, and we should be asking questions about it, but we should probably be a little bit angry that a nonprofit has to spend so much on security at the same time,” Scarlett said. Ultimately, the S.F. SPCA wants to see a resolution of “the complicated issues around homelessness,” she added.

But she doesn’t see the robot trend going away, either.

“In five years we will look back on this and think, ‘We used to take selfies with these because they were so new,’” Scarlett said.

Security robot that deterred homeless encampments in the Mission gets rebuke from the city

The San Francisco branch of the Society for the Prevention of Cruelty to Animals (SPCA) has been ordered by the city to stop using a robot to patrol the sidewalks outside its office, the San Francisco Business Times reported Dec. 8.

The robot, produced by Silicon Valley startup Knightscope, was used to ensure that homeless people didn’t set up camps outside of the nonprofit’s office. It autonomously patrols a set area using a combination of Lidar and other sensors, and can alert security services of potentially criminal activity.

These robots have had a string of mishaps in the past. One fell into a pond in Washington, DC, in July. Another ran over a child’s foot in California in 2016. And Uber, which is no stranger to the ethical quandaries of what it means to be gainfully employed by a company, has used the robots in San Francisco.

Knightscope’s business model, according to Popular Science, is to rent the robots to customers for $7 an hour, which is about $3 less than minimum wage in California. The company has apparently raised over $15 million from thousands of small investors.

In a particularly dystopian move, it seems that the San Francisco SPCA adorned the robot it was renting with stickers of cute kittens and puppies, according to Business Insider, as it was used to shoo away the homeless from near its office.

San Francisco recently voted to cut down on the number of robots that roam the streets of the city, which has seen an influx of small delivery robots in recent years. The city said it would issue the SPCA a fine of $1,000 per day for illegally operating on a public right-of-way if it continued to use the security robot outside its premises, the San Francisco Business Times said.

“Contrary to sensationalized reports, Knightscope was not brought in to clear the area around the SF SPCA of homeless individuals. Knightscope was deployed, however, to serve and protect the SPCA,” a spokesperson for Knightscope told Quartz. “The SCPA has the right to protect its property, employees and visitors, and Knightscope is dedicated to helping them achieve this goal. The SPCA has reported fewer car break-ins and overall improved safety and quality of the surrounding area.”

Update (Dec. 13): This post has been updated to include comments from Knightscope.

Robots are being used to shoo away homeless people in San Francisco

An animal shelter in San Francisco has been criticized for using a robot security guard to scare off homeless people.

The San Francisco branch of the SPCA (the Society for the Prevention of Cruelty to Animals) hired a K5 robot built by Knightscope to patrol the sidewalks outside its facilities. According to a report from the San Francisco Business Times, the robot was deployed as a “way to try dealing with the growing number of needles, car break-ins and crime that seemed to emanate from nearby tent encampments of homeless people.”

Jennifer Scarlett, president of the SF SPCA told the Business Times last week: “We weren’t able to use the sidewalks at all when there’s needles and tents and bikes, so from a walking standpoint I find the robot much easier to navigate than an encampment.”

The robot in question is equipped with four cameras, moves at a pace of three miles per hour, and is cheaper than a human security guard — costing around $6 an hour to rent. Knightscope’s bots are some of the most popular robot guards around and have popped up in the news in the past. The same model of robot previously knocked over a toddler in a mall and fell into a fountain in DC. Knightscope says its robots are intended as deterrents, and for providing mobile surveillance.

Reaction to the news on social media has been overwhelmingly negative, with people shaming the SPCA for deploying the machine, and encouraging others to vandalize or destroy it.

Capitalism: instead of providing homes for homeless people, spend exorbitant sums of money creating robots that will prevent homeless people from making homes for themselves

— Ben Norton (@BenjaminNorton) December 13, 2017

According to the SPCA, attacks have already taken place, with Scarlett telling the Business Times that within a week of the robot starting its duties, some people “put a tarp over it, knocked it over and put barbecue sauce on all the sensors.” One Twitter user reported seeing the robot with feces smeared on it.

At the time of writing, the SF SPCA had not responded to a request for comment from The Verge, although the robot’s creators, Knightscope, denied the framing of the news.

“Contrary to sensationalized reports, Knightscope was not brought in to clear the area around the SF SPCA of homeless individuals,” a spokesperson told The Verge. “Knightscope was deployed, however, to serve and protect the SPCA. The SCPA has the right to protect its property, employees and visitors, and Knightscope is dedicated to helping them achieve this goal. The SPCA has reported fewer car break-ins and overall improved safety and quality of the surrounding area.”

Knightscope’s response raises interesting questions about how society will respond to robots like these in the future. Although the company denies that the bot was being used to deter the homeless, comments from the SPCA show that this definitely was one of the outcomes — whether intentional or not. Because the robot is semi-autonomous, Knightscope (and, potentially, the SPCA) can shift the blame for its actions. They only rented the robot; they didn’t tell it to do anything.

In any case, the SPCA K5 might have a limited shelf life in San Francisco. The city recently passed new legislation limiting the use of robots in city streets. Although the rules were aimed primarily at delivery bots, the SPCA has been ordered to keep the K5 off sidewalks or face a $1,000 daily fine. Knightscope is currently negotiating with the city over future deployments.

Animal shelter faces backlash after using robot to scare off homeless people

A robot patrolling a street in San Francisco to ward off homeless people has been removed after complaints from locals, who also knocked it over and smeared it with feces.

The Knightscope K5 security robot was deployed by the San Francisco branch of the Society for the Prevention of Cruelty to Animals (SPCA) to deter homeless people from sleeping and loitering near its building.

But the SPCA was forced to remove the 400-pound machine, which was operating in the public realm without a permit, under threat of a $1,000-a-day (£745) fine.

The K5's presence also angered the local community, who took to social media to complain.

I can’t help but feel bad for the SPCA robot outside that someone smeared their poo on. Is this a conspiracy to make me (us) a sympathizer to our new robot overlords... will they be plastered in cute dog decals??

— Tyson Kallberg (@TysonKallberg) November 9, 2017

Reports claimed that a group doused its sensors with barbecue sauce, knocked it over and veiled it with a tarp. One Twitter user claimed they saw feces smeared on its shell, while another described the robot's use as "shameful".

"The money that was spent on these robots could have gone towards homeless shelters," said another tweet.

The shelter said it deployed the robot, nicknamed K9, to patrol the pavements around its centre in the Mission District, which had become a camp for the city's homeless population.

"We weren't able to use the sidewalks at all when there's needles and tents, and bikes, so from a walking standpoint I find the robot much easier to navigate than an encampment," the SPCA's president Jennifer Scarlett told the Business Times.

Responding to Dezeen, the shelter said that it only hoped to improve the safety of its employees, following an influx of crime in the surrounding area, and that it is "extremely sensitive" to the issue of homelessness.

"In the last year we've experienced a great deal of car break-ins, theft, and vandalism that has made us concerned about the security and safety of the people on our campus," the SPCA's media relations manager Krista Maloney told Dezeen.

"The security robot that we've been using on a pilot basis has been very effective at deterring these criminal incidents. The device helps us prevent crime; it doesn't attempt to remove homeless people from the sidewalk."

The K5 is equipped with four cameras that monitor its surroundings, and moves on wheels at speeds of up to three miles per hour. It measures 1.5 metres tall and nearly one metre wide at its base, creating a sizeable obstacle on the pavement.

San Francisco is tightening restrictions on autonomous machines on the streets – particularly delivery robots – with growing concerns over public safety.

Knightscope's K5 model has already been embroiled in other controversies elsewhere, including knocking a toddler over in Silicon Valley, and falling into a pond in Washington DC after missing a set of stairs.

Security robot bullied and forced off the street in San Francisco

Is it worse if a robot instead of a human is used to deter the homeless from setting up camp outside places of business?

One such bot cop recently took over the outside of the San Francisco SPCA, an animal advocacy and pet adoption clinic in the city’s Mission district, to deter homeless people from hanging out there — causing some people to get very upset.

Silicon Valley game developer and Congressional candidate Brianna Wu yesterday tweeted her dismay at the move, saying, “I’m sorry for being so frank but this absolutely disgusts me as someone that experienced homelessness.”

I’m sorry for being so frank, but this absolutely disgusts me as someone that experienced homelessness.

Every time I travel to San Fran my heart breaks from seeing all the homelessness in a city with so much wealth and privilege.


— Brianna Wu (@Spacekatgal) December 12, 2017

The homelessness issue in S.F. is thorny and complicated. One could get whiplash at seeing the excess of wealth and privilege juxtaposed with the dire circumstances just steps outside Twitter headquarters on Market Street.

However, the city’s homeless are also associated with higher rates of crime, violence and sometimes episodes of psychosis, leading to safety issues that many feel San Francisco has not had an adequate handle on.

The S.F. SPCA rolled out the use of a robot unit dubbed K9 from security startup Knightscope a month ago, citing these same safety concerns.

“Over the summer our shelter was broken into twice. The inside was vandalized and property and cash donations were stolen,” S.F. SPCA spokesperson Krista Maloney told TechCrunch. “Furthermore, many staff members and volunteers have filed complaints about damage to cars and harassment they experienced in our parking lot when leaving work after dark. We currently employ security guards, but we have a large campus and they can only be in one area at a time.”

The K9 units are also cheaper than humans: one robot costs $6 an hour to rent, versus the average $16 an hour paid to a human security guard.

“Unfortunately, in the last year we’ve been forced to spend a significant amount of money to ensure the security and safety of the people on our campus as well as the animals in our care,” Maloney said.

And, according to both the S.F. SPCA and Knightscope, crime dropped after deploying the bot.

However, the K9 unit had its own share of hardships. The SPCA told TechCrunch it had been knocked over, and someone had smeared BBQ sauce on it in early November. Another tweeter mentioned seeing poo smeared on the vehicle, though an SPCA spokesperson could not confirm that happened.

It’s worth mentioning that many robots have been targeted by humans in the past. A K5 unit of the same type was brutally attacked by a drunken man, who toppled the 400-pound robot as it patrolled a Silicon Valley parking lot earlier this year.

Another issue looming over the bot cop’s employment at the SPCA was the use of a public sidewalk. The K9 unit was patrolling several areas around the shop, including the sidewalk where humans walk, drawing the ire of pedestrians and advocacy group Walk SF, which previously introduced a bill to ban food delivery robots throughout the city.

“We’re seeing more types of robots on sidewalks and want to see the city getting ahead of this,” said Cathy DeLuca, Walk SF policy and program director.

Last week the city ordered the S.F. SPCA to stop using these security robots altogether or face a fine of $1,000 per day for operating in a public right of way without a permit.

The S.F. SPCA says it has since removed the robot and is working through a permitting process. It has already seen “two acts of vandalism” since the robot’s removal.

But putting permits and public use of sidewalks aside, it seems the robot could do more than just discourage homeless camps. It could keep an eye on the surrounding area and report crimes, yes, but it could also possibly be used to alert police and social workers to areas where homelessness seems to have increased or look for anyone who may be facing violence or a psychotic episode and in need of intervention.

The Knightscope bots are equipped with four cameras able to read more than 300 license plates per minute. They can move about and keep tabs on an area, noting anyone on a list of those who shouldn’t be there.

Already the S.F. SPCA said it has experienced a drop in crime when using the bot cop. The same might be said if it had increased the use of human security guards but humans, as mentioned above, cost more. They also can’t monitor 24/7 or immediately upload what they see to the cloud.

Further, robots aren’t going away. While it isn’t clear what solution San Francisco’s Board of Supervisors will come up with to handle the increase of these types of bots on our sidewalks in the future, it’s inevitable we’re going to see more of them.

It’s an age-old human vs. machine argument. But machines usually win.

Security robots are being used to ward off San Francisco’s homeless population

Like so many classic Western anti-heroes before him, he rolled (literally) into town with a singular goal in mind: cleaning up the streets, which had become a gritty hotbed of harassment, vandalism, break-ins and grift.

The only difference was that he was a slow-moving, 400-pound robot with a penchant for snapping hundreds of photos a minute without people’s permission, and this was San Francisco’s Mission District in 2017.

What could go wrong? Quite a bit, as it turns out.

In the past month, his first on the job, “K-9” — a 5-foot-tall, 3-foot-wide K5 Autonomous Data Machine that can be rented for $6 an hour from Silicon Valley start-up Knightscope — was battered with barbecue sauce, allegedly smeared with feces, covered by a tarp and nearly toppled by an attacker.

As if those incidents weren’t bad enough, K-9 was also accused of discriminating against homeless people who had taken up refuge on the sidewalks he was assigned to patrol. It was those troubling allegations, which went viral this week, that sparked public outrage and prompted K-9’s employers — the San Francisco chapter of the animal rescue group SPCA — to pull the plug on their newly minted robot security pilot program.

“Effective immediately, the San Francisco SPCA has suspended its security robot pilot program,” Jennifer Scarlett, the organization’s president, wrote in a statement emailed to The Washington Post on Thursday. “We piloted the robot program in an effort to improve the security around our campus and to create a safe atmosphere for staff, volunteers, clients and animals. Clearly, it backfired.”

SPCA officials said the robot was hired to patrol the parking lot and sidewalk outside the animal shelter after the building had been broken into twice and employees had become fed up with harassment and catcalls. The robot, they said, would be able to snap photos, record security footage, and then notify shelter employees or police during an emergency.

The backlash began after an animal shelter spokeswoman, in an interview with the San Francisco Business Times this week, seemed to suggest that the robot was an effective tool for eliminating the homeless encampments outside the SPCA, leading to a sudden reduction in crime. SPCA officials now say they didn’t mean to imply that they wanted to be rid of the homeless and have pointed out that they partner with several local organizations to provide veterinary care for homeless pet owners.

Nevertheless, a public outcry, complete with calls for the robot’s destruction, quickly ensued. A flurry of attention-grabbing headlines implied that the robot was specifically employed to target the homeless.

“Robot wages war on the homeless,” a particularly inflammatory Newsweek headline read.

Capitalism: instead of providing homes for homeless people, spend exorbitant sums of money creating robots that will prevent homeless people from making homes for themselves

— Ben Norton (@BenjaminNorton) December 13, 2017

In recent days, SPCA officials said, they’ve received hundreds of messages encouraging people to seek retribution against the animal shelter through violence and vandalism. So far, officials said, the facility has experienced two acts of vandalism.

“The SF SPCA was exploring the use of a robot to prevent additional burglaries at our facility and to deter other crimes that frequently occur on our campus — like car break-ins, harassment, vandalism, and graffiti — not to disrupt homeless people,” Scarlett’s statement said. “We regret that our words were ill-chosen. They did not properly convey the pilot program’s intent and they inaccurately reflected our values.”

“We are a nonprofit that is extremely sensitive to the issues of homelessness,” the statement added.

In a statement emailed to The Post, Knightscope referred to accusations that its robot was hired to target homeless people as “sensationalized reports.”

“The SCPA has the right to protect its property, employees and visitors, and Knightscope is dedicated to helping them achieve this goal,” the statement said. “The SPCA has reported fewer car break-ins and overall improved safety and quality of the surrounding area.”

K-9 is not the first Knightscope machine to have a short-lived security career. In July, a K5 robot patrolling Washington Harbour ended up in a fountain, its cone-shaped body halfway submerged in a scene reminiscent of a violent crime.

Images of that robot circulated widely on social media, and, eventually, a memorial with flowers and letters was set up to mourn the short-lived career of “Steve,” as the machine came to be known.

Knightscope called Steve’s demise “an isolated event” before delivering his replacement, an identical K5 known as “Rosie.”

Crime-fighting robot retired after launching alleged ‘war on the homeless’

To some homeless people, San Francisco’s latest security robot was a rolling friend on five wheels that they called “R2-D2 Two”. To others living in tents within the droid’s radius, it was the “anti-homeless robot”.

For a month, the 400lb, bullet-shaped bot patrolled outside the not-for-profit San Francisco SPCA animal shelter, rolling around the organization’s parking lots and sidewalks, capturing security video and reading up to 300 license plates per minute. Homeless people who pitched their tents in an alleyway nearby complained they felt the beeping, whirring droid’s job was to run them off.

“We called it the anti-homeless robot,” said John Alvarado, who was one of numerous people camping next to the animal shelter when the robot arrived. He said he quickly decided to move his tent half a block away: “I guess that was the reason for the robot.”

Officials of both the SF SPCA and Knightscope, who rented the robot to the shelter, denied that the intention was to dislodge homeless encampments.

“The SPCA has the right to protect its property, employees and visitors, and Knightscope is dedicated to helping them achieve this goal,” Knightscope said in a statement.

SF SPCA staff members said the facility had been plagued with break-ins, staff members had been harassed as they went to the parking lot and sidewalks were littered with hypodermic needles. Jennifer Scarlett, the SF SPCA president, said in a release that her organization “was exploring the use of a robot to prevent additional burglaries at our facility and to deter other crimes that frequently occur on our campus – like car break-ins, harassment, vandalism, and graffiti – not to disrupt homeless people”.

But after complaints about the program were shared widely on social media, the organization quickly admitted it had made a mistake in its choice of security guards – and fired the robot.

“Since this story has gone viral, we’ve received hundreds of messages inciting violence and vandalism against our facility, and encouraging people to take retribution,” said Scarlett, noting that their campus had since been vandalized twice. “We are taking this opportunity to reflect on the ‘teachable moment’.”

Some of the homeless people who crossed paths with the white security robot, which bore images of dogs and cats as it patrolled outside the San Francisco SPCA this month, thought it was a cute and positive addition to the area.

TJ Thornton, whose tent is still pitched across the street from the shelter’s parking lot, nicknamed the bot “R2-D2 Two”. He liked how the machine made little whistling sounds as it moved along the sidewalk and how it would even say “hello” if you walked past it.

Thornton said he thought the bot had a positive influence on the neighborhood and relieved the pressure on local homeless people to always keep an eye on cars parked nearby. “People living on the streets actually watch out for the cars. If anyone does anything stupid, like breaking into cars, it reflects on us.”

Others saw the robot as Big Brother, surveilling their every move with video cameras. “That SPCA robot was the bane of our existence,” said Lexi Evans, 26, who has been living on San Francisco’s streets for 13 years. “It was driving us crazy.”

She said her group of friends had a tent encampment behind the SPCA. When they first saw the robot looking at them, they found it creepy. Then they noticed its white light flashing and thought it was recording their every move on video. Later they observed police officers coming to interact with the robot and wondered whether it was feeding information to law enforcement.

“We started feeling like this thing was surveilling us for the police,” said Evans, whose whole tent encampment has now moved around the block outside another business. “That’s officially invasion of privacy. That’s uncool.”

Evans said that once, someone became so angry with the thing that they knocked it over. The robot made a “whee-ooh wah” sound.

In another instance, somebody “put a tarp over it, knocked it over and put barbecue sauce on all the sensors”, Scarlett, the SPCA president, told the San Francisco Business Times.

Trouble really started for the robot last week, when the city issued an order for it to stay off the public sidewalk or face a daily penalty of up to $1,000 for operating in the public right of way without a permit. Then the story hit the internet, with Scarlett telling the Business Times that “from a walking standpoint, I find the robot much easier to navigate than an encampment”.

But by Friday, SF SPCA was apologizing for having brought in the machine.

“We regret that our words were ill-chosen. They did not properly convey the pilot program’s intent and they inaccurately reflected our values,” said Scarlett. “We are a nonprofit that is extremely sensitive to the issues of homelessness.”

Knightscope’s robots have gotten into trouble in other cities. Last year, a similar robot allegedly ran over a 16-month-old toddler at the Stanford Shopping Center in the town of Palo Alto, causing minor injuries. Another Knightscope security robot became famous on social media for drowning itself in the fountain of the Washington DC office complex it was policing.

“I already miss it,” said Danica Dito, who works in the SPCA administrative offices. “Just the fact that it rolled around discouraged crime.”

Big Brother on wheels? Fired security robot divides local homeless people

In November, the San Francisco SPCA deployed a 5-foot-tall, 400-pound robot to patrol its campus. Not for muscle, mind you, but for surveillance. The SPCA, a large complex nestled in the northeast corner of the city's Mission neighborhood, has long dealt with vandalism, break-ins, and discarded needles in its surrounding parking lots. Fearing for the safety of its staff, the SPCA figured the robot could work as a deterrent, a sort of deputy for its human security team.

The robot came from a Silicon Valley startup called Knightscope, whose growing family of security machines works as slower, more disciplinarian versions of self-driving cars. The SPCA used its K5 robot, which is built for outdoor use. Its scaled-down cousin, the K3, is meant for the indoors, while the K1 is a stationary pillar that will soon monitor things like building entrances. And the K7, a four-wheeled robot meant for patrolling the perimeters of airports and such, is going beta next year. The company is on a mission to take a bite out of crime by augmenting human security guards with machines. The path there, though, is fraught with ethical pitfalls.

The K5, along with almost 50 other Knightscope robots across 13 states, sees its world by coating it with lasers, autonomously patrolling its domain while taking 360-degree video. In an on-site control room, a human security guard monitors this feed for anomalies. Knightscope says K5 can read 1,200 license plates a minute to, say, pick out cars that have been parked for an inordinate amount of time. If you get in the robot’s way, it says excuse me. In the event of an emergency, the security guard can speak through the robot to alert nearby humans. The SPCA's robot patrolled both its campus and the surrounding sidewalks while emitting a futuristic whine, working as a mobile camera to theoretically deter crime.

None of these machines are equipped with tasers or flamethrowers or anything like that. “This is not for enforcement,” says William Santana Li, chairman and CEO of Knightscope. “It's for monitoring and giving an understanding of the situation for those humans to do their jobs much more effectively.” Again, the SPCA’s robot wasn’t meant to replace humans, but supplement them.

“Very simply,” Li adds, “if I put a marked law enforcement vehicle in front of your home or your office, criminal behavior changes.”

So does other behavior, it turns out. After the SPCA's Knightscope was set out on its route, homeless residents took it to task. A group of people setting up camp allegedly threw a tarp over the robot, knocked it over, and smeared barbecue sauce on its sensors.

By this point, most of us don't recoil at the sight of a security camera, much less throw rocks at it; for better or worse, we're all under surveillance in public. But the K5 just feels different, and it elicits different reactions. In a shopping mall, the robot seems unassuming, even vaguely endearing. Kids run up and hug it. Outdoors, though, it's a roaming embodiment of surveillance, recording video of everything around it. That is particularly unsettling to people who make the outdoors their home.

“Keep in mind, this concept of privacy in a public area is a little bit odd,” says Li. “You have no expectation of privacy in a public area where all these machines are operating.”

Still, a camera on a wall is one thing. A giant camera that roams the streets of San Francisco is another. “When you’re living outdoors, the lack of privacy is really dehumanizing after a while, where the public’s eyes are always on you,” says Jennifer Friedenbach, executive director of San Francisco’s Coalition on Homelessness. “It’s really kind of a relief when nighttime comes, when you can just be without a lot of people around. And then there’s this robot cruising around recording you.”

After the San Francisco Business Times published a piece on the SPCA’s foray into security robotics, public outcry grew that the organization was using the robot to roam the sidewalks around its facility to discourage homeless people from settling. The SF SPCA denies its intent was anti-homeless. “The SF SPCA was exploring the use of a robot to prevent additional burglaries at our facility and to deter other crimes that frequently occur on our campus—like car break-ins, harassment, vandalism, and graffiti—not to disrupt homeless people,” said the group’s president, Jennifer Scarlett, in a statement.

Nevertheless, the group discontinued its pilot program with Knightscope last week. Deploying robots in a mall is fairly innocuous, but in a more sensitive setting like this one, the ethical conundrums of human-robot interaction clearly got out of hand quickly.

If you think the ethics of security robots are murky now, just you wait. Knightscope wants to keep humans in the loop with its robots, but it's not hard to imagine a day when someone gets the bright idea to give security machines far more autonomy: AI-powered robots that recognize faces and look for patterns in crime, patrolling a particular area preferentially at a certain time of day, for instance, because a supposedly suspicious group of people tends to come around.

Algorithms are already forming biases. In 2016, an investigation by ProPublica revealed that software used to assess criminal risk was biased against black defendants. Now imagine a security robot loaded with algorithms that profile people. It's especially troubling considering that the engineers developing artificial intelligence don't necessarily know how their algorithms are learning. “There should be not only a human at the end of the loop, but a human at the beginning, when you're learning the data,” says computer scientist Michael Anderson of the Machine Ethics program.

Really, what robot makers will need are ethicists working alongside engineers as they develop these kinds of systems. “Engineers aren't necessarily able to see the ramifications of what they're doing,” says ethicist Susan Anderson, also of Machine Ethics. “They're so focused on how it can do this, it can do that.”

Could a robot at some point help an organization like SPCA? Yeah, maybe. These are early days of human-robot interaction, after all, and humans have as much to learn from the robots as the robots have to learn from us. Maybe there are ways to go about it without rolling over somebody’s toes.

The Tricky Ethics of Knightscope's Crime-Fighting Robots