AI is Being Used by Police to Predict and Deter Domestic Violence and Yes, It’s a Bad Idea

TW/CW: This article mentions domestic violence and abuse.

Police in Queensland have revealed that they will begin to use artificial intelligence to predict and prevent domestic violence.

The technology will be trialled in the Sunshine State to identify high-risk domestic violence offenders, and the results will be used to door-knock people police suspect are likely to commit an offence.

In domestic and family violence scenarios, behaviour often follows predictable patterns of escalating violence, which police believe they can use to identify those who are at risk and intervene before there is a serious problem.

It’s often cited as a systemic failure of the criminal justice and social services systems that those in charge frequently fail to identify problematic individuals until it’s too late. This is where AI technology could be useful in alerting police to the need to intervene.

However, while some domestic violence campaigners have stated that the development could be of some use, most are simply concerned that the technology will be used inappropriately and further entrench problems within the system set up to protect people.

The development is even more troubling given that it is being deployed by a police force with a documented history of domestic violence perpetrators within its own ranks.

Here’s what you need to know.

How is This Going to Work?

The technology uses data from the Queensland Police Records and Information Management Exchange, or QPRIME, system to assess the risk of potential family and domestic violence perpetrators.

It’s an algorithm that uses ‘actuarial’ risk analysis to identify those who are most likely to commit violent acts. This is a type of risk calculation that uses statistics and measurable variables as predictors of future behaviour, unlike clinical risk assessment, which relies on the judgement of a professional.

Information normally used in actuarial risk assessment includes things like age, gender, previous criminal history, income, geographical location, and other quantifiable data to build a picture of an individual. Each variable is weighted to predict how likely someone is to commit a violent act based on historical trends.
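To make that idea concrete, here is a minimal, purely illustrative sketch of how a weighted actuarial score might be computed. The variable names, weights and threshold below are invented for explanation only and are not drawn from the QPS model, which has not been published.

```python
# Illustrative sketch only: a generic actuarial-style risk score built from
# weighted, quantifiable variables. Feature names, weights and the cut-off
# are hypothetical and bear no relation to the actual QPS tool.

def actuarial_risk_score(person: dict) -> float:
    """Combine measurable variables into a single weighted score."""
    # Hypothetical weights standing in for values fitted to historical data.
    weights = {
        "prior_dv_incidents": 0.40,      # count of recorded prior incidents
        "prior_violent_offences": 0.30,  # count of other violent offences
        "breach_of_orders": 0.20,        # e.g. protection-order breaches
        "age_under_30": 0.10,            # binary flag
    }
    return sum(weights[k] * float(person.get(k, 0)) for k in weights)


if __name__ == "__main__":
    example = {
        "prior_dv_incidents": 2,
        "prior_violent_offences": 1,
        "breach_of_orders": 1,
        "age_under_30": 1,
    }
    score = actuarial_risk_score(example)
    # A fixed cut-off flags individuals for follow-up, such as a door knock.
    print(f"score={score:.2f}", "high-risk" if score > 1.0 else "lower-risk")
```

The point of the sketch is simply that the output is a statistical score derived from past data, not a professional’s clinical judgement about a specific person.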

The algorithm in question has been in development for about three years using data from the police database but it is not designed to replace policing work, nor could it be used in some kind of Minority Report-style policing where individuals are charged before a crime is committed.

Instead, police hope that by identifying individuals who may be more likely to commit offences, they can use the system as a tool to assist police work by knocking on doors and performing wellness checks.

Queensland Acting Superintendent Ben Martain said that the system is a shift towards ‘preventative policing’.

“With these perpetrators, we will not wait for a triple-zero phone call and for a domestic and family violence incident to reach the point of crisis,” Martain said.

“Rather, with this cohort of perpetrators, who our predictive analytical tools tell us are most likely to escalate into further DFV offending, we are proactively knocking on doors without any call for service.”

Why Is This a Terrible Idea?

Where to begin? Setting aside for the moment that predictive policing strategies have little merit, the Queensland police force does not have a good record when it comes to using data systems and private information.

In 2019, Queensland police commissioner Katarina Carroll admitted that her organisation had been brought into disrepute when current and former members of the police force were caught using the QPRIME system to access private information that was not related to their duties at the time.

This breach of public trust extended as far as one officer providing a woman’s abusive former partner with information on her whereabouts while she was fleeing domestic violence. The woman was forced to go into hiding when her ex came after her.

The tribunal at the time found that police could not be held accountable for that breach of privacy as there was no system in place to regulate how officers accessed police records.

It’s also worth mentioning, as noted above, that a senior Queensland police officer has stated there was a “concerning increase” in the number of officers on the force accused of domestic violence, and that police were “grappling” with how to handle the situation.

Assistant Commissioner Brian Codd told The Guardian in May of this year that he could not “100% guarantee” that women seeking help from police would not encounter abusers in uniform or officers with problematic attitudes.

While all of the above represent serious but broad systemic issues with the Queensland police force, it must be noted that domestic violence has seen a shocking and tragic increase during the pandemic and that more ought to be done to prevent it.

Using artificial intelligence to identify potential domestic violence perpetrators is not a new idea, but thus far few police forces in the world have actually gone on to implement such practices.

One of the biggest issues with artificial intelligence (often a catch-all term for assistive technology) is that it has been shown time and time again to reproduce the biases and faults of either those who create it or the data it is trained on.

Take the example of Tay, Microsoft’s AI chatbot, which was unleashed on Twitter in 2016 to learn how to interact with people online. After absorbing all of the terrible things that happen on that site, Tay went from tweeting about how cool humans are to declaring that “Hitler was right” in the space of just 12 hours.

Tay makes for a good joke, but when AI is used in predictive policing, the same tendency to absorb bias can have serious consequences.

Australia’s National Research Organisation for Women’s Safety has found that “racism, poor relationships with local communities, misogyny, and the patriarchal culture of the police service” were ongoing concerns.

Prof Lyria Bennett Moses, the director of the Allens Hub for Technology, Law and Innovation at the University of NSW, has also said that similar “knock on doors” approaches to predictive policing in NSW resulted in Indigenous youths being disproportionately targeted.

Bennett Moses added that any AI model used in this way “must be transparent” and subject to independent evaluation.

The police, for their part, have said that they are “acutely aware” of the potential for bias in the system and have taken steps to ensure the technology will not simply entrench existing issues further.

“[The] QPS considered the lessons learnt in other jurisdictions and have developed a model monitoring tool that aims to regularly monitor and address bias within the model,” Martain said.

“For the pilot, QPS removed raw data that had the direct attributes of ethnicity and geographic location before training the model.”
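As a rough illustration of what that pre-processing step might look like, the sketch below drops direct sensitive columns from a training table before fitting a model. The column names, the label, and the model choice are hypothetical, not the actual QPS pipeline, and, as critics of such systems note, removing direct attributes does not remove proxy variables that may still correlate with them.

```python
# Minimal sketch, assuming a tabular training set in a pandas DataFrame.
# Column names ("ethnicity", "postcode", "reoffended") and the model are
# invented for illustration; this is not the QPS implementation.
import pandas as pd
from sklearn.linear_model import LogisticRegression


def train_without_direct_attributes(df: pd.DataFrame) -> LogisticRegression:
    sensitive = ["ethnicity", "postcode"]  # hypothetical direct attributes
    # Drop the sensitive columns and the label before training.
    features = df.drop(columns=sensitive + ["reoffended"], errors="ignore")
    labels = df["reoffended"]
    # Note: proxy variables (e.g. prior police contact) can still carry the
    # same bias even after direct attributes are removed.
    model = LogisticRegression(max_iter=1000)
    model.fit(features, labels)
    return model


if __name__ == "__main__":
    demo = pd.DataFrame({
        "ethnicity": ["a", "b", "a", "b"],
        "postcode": [4000, 4001, 4002, 4003],
        "prior_incidents": [0, 3, 1, 4],
        "breach_of_orders": [0, 1, 0, 1],
        "reoffended": [0, 1, 0, 1],
    })
    model = train_without_direct_attributes(demo)
    print(model.coef_)
```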

Women’s Legal Service Queensland CEO Angela Lynch has said that while the system does need to improve to identify problematic individuals, these steps need to be taken with caution and care.

“The system does need to be better able to identify and respond to high-risk offenders, especially those who go from relationship to relationship,” Lynch said.

“We do need innovative responses in terms of how we deal with them, but it does have to be done so safely.

“You’d have to be really careful and you’d want to look at what the impacts are, particularly on groups that may be particularly vulnerable.”
