When David Brennan first began cycling on Scotland’s roads, he felt unsafe. “It felt lawless,” he recalls, with motorists leaving little space for his bike while overtaking at terrifying speeds. Seeing little in the way of meaningful police presence on Scottish roads, Brennan began fixing cameras to his helmet and his bike frame. At least if anything happened to him out on those lonely roads, he thought, he’d have incontrovertible proof.

Something did happen, in October 2019. Taking offence at Brennan weaving through queued traffic, a motorist tried to drive him off the road before getting out of his car and assaulting him. Having captured the incident from multiple angles, the cyclist filed a complaint with Police Scotland. Brennan expected an open-and-shut case; instead, he triggered a lengthy, bureaucratic investigation that dragged on for months and seemed to ignore the footage he’d captured. He was even fined for swearing during the assault.

The situation was all the more bizarre given that most police forces in England and Wales had already embraced the role dashcams and GoPro cameras could play in improving road safety. July 2018 saw the creation of the National Dash Cam Safety Portal, which provided a direct channel for road users to upload footage of dangerous driving and have it forwarded to participating forces. According to one estimate, the portal has saved law enforcement 68,474 hours of work that would otherwise have been spent manually processing videos and cross-checking witness statements.

It would take a concerted campaign led by Cycling UK before Police Scotland announced the creation of its own portal earlier this year (Brennan also received an apology from officers after he filed an official complaint). The episode is indicative of a broader embrace of personal recording devices by road users, explains Cycling UK’s Scottish campaigns manager Jim Densham. “We know from our members that there’s a real feeling of empowerment if you’ve got that camera footage,” says Densham. “At least if something happens, you know that you’ve got some way of saying, ‘This happened to me and here’s the evidence for it’.”

Police departments on both sides of the Atlantic are increasingly alert to this kind of hue and cry – and not just from road users. Recent years have seen a flowering of community-focused apps and Wi-Fi-connected door cameras that provide law enforcement with an unprecedented insight into criminal activities at a hyper-local level. It’s an opportunity that UK and US police forces have embraced, partnering with tech platforms including Citizen and Ring to create and streamline new reporting channels for crime.

It has also led to relationships between police and tech companies that make privacy advocates nervous. Community-based apps have been accused of playing into stereotypes about what drives criminal activity, as well as deepening racial divisions. The close relationships between police departments and tech platforms have also prompted critics to question where the former’s loyalties truly lie.

New frontiers for police surveillance

Such concerns are not nearly as pronounced for dashcams. National codes of conduct for driving on roads in the UK, US and elsewhere set clear boundaries on what is permissible or illegal. As such, it’s hard to see what societal divisions are being deepened when a cyclist uploads footage of being forced off the road by a lorry, or a motorist volunteers a recording of a dangerous overtaking manoeuvre on a motorway.

What it doesn’t do, explains Densham, is protect vulnerable road users in the moment. “It’s not a secret forcefield around your bike as you’re cycling along,” he says. Rather, the utility of the camera lies in the hope that action will be taken by the police to prosecute the dangerous driver. It isn’t, says Densham, a “substitute for more policing on the roads”.

Attempts to release tools that make road users more than passive observers of illegality, however, have received a mixed reception. In April, a Guardian interview with the developers behind Speedcam Anywhere – an app which claims to use AI to measure the speed of passing cars from camera footage – revealed that they had been subjected to a slew of vitriol from motorists. This was despite the fact that the team’s goal was simply to give the police enough data to identify speeding hotspots; the app does not meet the legal threshold to act as a speed camera.

The general public’s reaction to the spread of door cameras has been muted by comparison. Sales of Wi-Fi-enabled doorbells have ballooned in recent years, with the top five manufacturers alone selling 7.9 million devices in 2020. While most of these devices let users answer their door from anywhere via their phone, another key selling point has been their use as a means of identifying package thieves and burglars.

“Fear sells,” says Matthew Guariglia, a policy analyst at the Electronic Frontier Foundation. “The more people are concerned about crime, and the safety of their home and their packages, the more likely they’ll shell out money for something they think might, if not protect them, give them more awareness.”

Leading this market is Amazon subsidiary Ring, whose devices also allow users to share footage more easily with law enforcement or via its associated ‘Neighbors’ app. Police departments in the US and UK have been especially keen on handing out free Ring cameras to neighbourhoods blighted by theft and other forms of petty crime. In the US alone, approximately 2,000 partnerships have been struck between the company and local police departments, with similar agreements in place with forces in the UK.

“The more cameras there are, the less police have to gain [from] the cooperation of the community, because cameras are the new witnesses,” says Guariglia. Little wonder that, according to an investigation by privacy researcher Lauren Bridges, approximately 22,000 requests for footage from Ring cameras were made by US police departments in the year up to April 2021.

However, the sense of objectivity that comes from viewing camera footage, as opposed to collecting witness statements, can be deceptive. While Guariglia argues that some users might be more inclined to send raw video of wrongdoing to a police department than to call 911 and risk the resulting confrontation escalating into violence, the very act of sharing footage still has the potential to amplify existing racial and class stereotypes.

This can already be seen on local community forums, where sites such as Nextdoor have seen users racially profiling outsiders and contributing to what one Guardian writer described as a tone of ‘urban embattlement’ (the firm has since tightened its moderation regime).

This criticism has been more pronounced against platforms explicitly premised on fighting crime. Take Citizen, a mobile app that alerts its users to criminal incidents taking place in their local neighbourhood. Originally called ‘Vigilante’, the app works by mining local news reports and monitoring 911 calls via radio antennae in major cities. While use of the app has led to crimes being solved – most notably the successful recovery of a missing four-year-old boy in New York – it has also been linked to several cases of mistaken identity.

“One of the more nefarious ones [involved] a person suspected of starting a brush fire in Los Angeles, who turned out to be an unhoused person,” says Guariglia. “The Citizen app broadcast that person’s picture and actually offered a cash reward for information leading to their arrest. But it turns out they had broadcast the wrong person’s picture.”


When Big Tech meets the police

Just how closely police departments have partnered with apps like Citizen remains unclear, although a recent investigation by Vice into data-sharing between the app and the LAPD suggests that such relationships are more common than previously assumed.

The picture is less fuzzy when it comes to partnerships with doorcam manufacturers, some of which have been accused of crossing ethical lines. Another investigation by Vice identified at least 14 US cities and one in the UK where police departments were contractually obliged to promote the sale of Ring cameras in exchange for free units. In Gwinnett County, Georgia, the company even went as far as to edit a press release about a camera distribution program, prompting questions from critics as to whether police are ultimately serving the interests of citizens or private enterprise in distributing doorcams.

The contribution of the technology toward fighting crime is also debatable. An investigation by NBC News into the use of Ring camera footage by 40 police departments found that officers were spending more time sifting through endless video of raccoons and neighbours arguing than catching criminals in the act.

The doorbells have also proved vulnerable to misuse and abuse, both by hackers exploiting weak security protocols – in one case subjecting a Black family to racial abuse via their camera’s speaker – and by the police themselves. In 2020, the LAPD was criticised by a civil liberties organisation for having requested Ring camera footage in a bid to identify Black Lives Matter protestors (‘The SAFE LA Task Force used several methods in an attempt to identify those involved in criminal behaviour,’ the LAPD later told the EFF).

Professor Andrew Ferguson, author of The Rise of Big Data Policing, has seen little evidence that door cameras have helped reduce or solve crimes. Indeed, the aggressive marketing of such devices by police departments and tech platforms and their comparatively tepid contribution toward making streets safer reminds him of the numerous failed attempts to apply predictive policing technologies toward fighting crime. In that case, argues Ferguson, “there was never any evidence that [it] actually worked”.

Ring has since responded to the public backlash over its police partnerships and the security of its devices, introducing encryption and requiring police to request footage publicly through its associated Neighbors app rather than emailing users privately in bulk. “They responded in a positive way that, I think, has improved things,” says Ferguson.

That these changes were made only in response to significant political pressure, however, reveals a growing burden of responsibility on tech platforms and users themselves for how such footage is handled. In the UK, this has already resulted in at least one individual being fined for inadvertently breaking data protection laws. US law, meanwhile, places few impediments on the operation of such devices. “In many ways, we have abdicated our responsibility to legislate some of these surveillance threats to our privacy,” says Ferguson. “We just don’t have laws on the books about them.”

This prompts the question of who really benefits from having a networked camera on every door. It’s not residents or the police, Ferguson argues, but technology firms. Not only do they profit from the inflated sense of security that cameras bring to users, but they could also gain in the future from the vast volume of footage these devices capture, insofar as it proves useful as training data for future algorithms. Weave that into a wider web of virtual assistants and crawler bots, and there may come a time when a single company has an intimate acquaintance not only with the timbre of our voices and our stranger shopping habits, but with the rhythms and flows of entire communities.

That’s great business for a company like Google or Amazon, argues Ferguson, “but where it gets shady is when you’re using the public safety validation that’s supposed to be not commercial.”

There are, of course, many other reasons why someone might choose to buy a door camera. Ferguson does wonder, though, what their growing popularity says about society at large. It is, in the end, too easy to blame platforms and law enforcement alone for using technology to widen societal divisions and stoke hysteria: it is us, after all, who are using the devices. “We’re the centre of this rise of self-surveillance,” says Ferguson. “And only we can stop it.”

This article originally appeared on Tech Monitor.
