This week, the New South Wales Police announced upgrades to their Insights policing platform. The new technology is designed to give frontline officers faster access to critical information as they work to identify persons of interest and criminal activity across the state.
Powered by Microsoft Azure cognitive technologies, the platform's machine learning and deep learning capabilities were fully deployed in February 2021, with the goal of reducing the police labour hours spent on manual data processing tasks such as reviewing video feeds.
One example of how the AI systems will be used comes from a murder and assault investigation in which NSW Police collected 14,000 pieces of CCTV footage, a volume that would previously have taken detectives months to analyse.
Microsoft claims the AI/ML infused Insights platform ingested this huge volume of information in five hours and prepared it for analysis by NSW Police Force investigators, a process which would otherwise have taken many weeks to months.
Microsoft said the new Azure services will be used to manage all the data from operations, including triple zero calls, arrests and charges, firearms, criminal investigations and forensics, and then make that data easily accessible to police officers.
Trace The Face
While the benefits appear obvious, there have been issues with similar types of technology in the past, particularly when used by law enforcement.
One of the key applications of AI/ML technology in policing has been facial recognition. Police have used this technology to compare suspects’ photos to mugshots and driver’s licence images stored in extensive law enforcement networks.
However, despite widespread adoption, facial recognition software was banned this time last year from use by police and local agencies in several US cities, including Boston and San Francisco.
In addition to the privacy concerns, experts cite facial recognition as the least accurate of the key biometric indicators, when compared to other identifiers such as fingerprint, iris, palm and voice.
Seen But Not Believed
There has also been a growing body of research identifying the inherent bias built into the algorithms driving these systems. In 2018, facial recognition systems were shown to be fundamentally flawed when 28 members of the US Congress were falsely matched with criminal mugshots.
Humans recognise faces of their own race more accurately than those of other races, and developers are no different.
In June last year, the world’s largest scientific computing society, the Association for Computing Machinery in New York City, urged a suspension of private and government use of facial-recognition technology, because of “clear bias based on ethnic, racial, gender, and other human characteristics”.
Earlier this year, Amnesty International launched a campaign to ban the use of facial recognition technology as “a form of mass surveillance that amplifies racist policing and threatens the right to protest.”
Australia, and NSW in particular, is not the US, but the technology was developed there, and First Nations people in this country are imprisoned at disproportionately high rates, much as African Americans are in the United States.
Tens of thousands of NSW residents won a last-minute reprieve last June to protest publicly in Sydney against the deaths of Indigenous people in police custody, an issue that has persisted since the Royal Commission into Aboriginal Deaths in Custody was convened more than 30 years ago.
Won’t Get Bias
In its announcement, Microsoft assured that “the system has been designed with ethics front and centre, and in consultation with privacy experts with a particular focus on avoiding bias.”
The announcement goes on to say, “Since establishing its AI and Ethics review board (AETHER) process in 2017, Microsoft has undertaken peer reviews and ethics reviews of many AI/ML solutions. These reviews are conducted by a globally diverse group of individuals and examine the key aspects of any AI based solution including transparency, accountability and the potential impact to human rights.”
Raj Bhaskaran, Lead Architect on Insights for the NSW Police, confirmed: “We’ve engaged with stakeholders in this space to ensure we are building and deploying this leading-edge technology right.”
Will It Work?
NSW has the largest police force in Australia, with more than 22,000 members, including 18,000+ police officers serving 8 million residents. Since 2017/18, the NSW Police, under a new Digital IT Strategy and new leadership, has digitised many of its interactions and channels with NSW citizens.
With crime rates on the rise since the onset of the pandemic, particularly in domestic violence situations, it remains to be seen whether these digital applications have made any noticeable impact in curbing criminal activity in Australia.
While it’s too early to tell how the application of the Microsoft Azure programs will influence police effectiveness and response times, we watch with interest how this technology will be deployed across the state.