The intersection between AI and policing

-- By GorbeaF - 29 Feb 2024

Society's evolution

Society has drastically evolved in recent years. It has grown a new programmable limb that captures and stores every action we take, transforming it into data, learning from it, and evolving in secret. These past years are akin to the once-distant printing press revolution, when a single product overwhelmed society with new business models, opportunities, and threats. The analogy between a fifteenth-century revolution and software may seem far-fetched; yet the proliferation of ideas the press once made possible now faces heightened control and surveillance by the limb and by those who leverage it. This limb has become the most crucial constituent of our digital society. As such, everyone and everything must evolve with it and help it grow and expand. Our digital society’s evolution consequently involves a cultural shift from privacy concern to privacy ignorance. Should that same shift govern how we evaluate policing in this new society?

It is curious how we scrutinize policing agencies that “infringe” on our privacy and then turn a blind eye to international conglomerates that exploit that same privacy. Setting aside the different purposes, scope, and regulations, do we trust Meta more than the police? Do we know for sure whether the PRISM program ended or only changed its designation? Suppose the following hypothetical were true. A new iPhone user must agree to specific T&Cs to be able to use the device. Upon starting it, an option appears: agree to T&Cs under which everything is shared with company “X,” or request a refund. Why would I click decline when I need the device to live comfortably in this day and age? Now suppose the same language appears, but “X” is replaced with the NSA: do we request a refund? The core problem is identical; nothing has changed between the two hypotheticals. Yet the human mind registers the police surveillance as the negative one; why?

A new challenge

Police agencies have leveraged, and will continue to leverage, the limb and every alteration to it for their needs, and they rightfully should. As software evolves, police should too. However, our evaluation of their actions has not evolved: we distrust them. The newly available software capabilities will be a game changer for the police; Cellebrite Pathfinder and Lema are just the tip of the iceberg. However, the underlying problems (bias, discrimination, unaccountability, and lack of transparency) persist. Nothing will make us trust them without completely rooting out the problems that have festered for decades. The system is broken. Rather than simply allowing agencies to leverage the limb, we should push for a complete overhaul of all the surveillance agencies and thus allow the digital society to complete another phase of its cultural evolution, one in which the agencies evolve correctly.

Continuing on the current path keeps us at odds with our agencies. The underlying issue we keep referring to will fester in new dimensions once we train our models on existing police encounters and surveillance tactics: the models will learn the biases embedded in that record, and the snowball will keep growing. PredPol, whose predictions were trained on historical crime reports, exemplifies this perfectly.
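
To see why the snowball grows rather than melts, consider a deliberately crude sketch. It is not PredPol's actual algorithm; the neighborhoods, numbers, and allocation rule are all invented for illustration. Two areas are equally dangerous, but one begins with more recorded incidents, and patrols are assigned in proportion to the record.

<verbatim>
# Illustrative only: a toy feedback loop, not PredPol's actual algorithm.
# Neighborhoods, counts, and the allocation rule are invented for this sketch.
import random

random.seed(0)
TRUE_RATE = 0.3                      # both areas share the same underlying crime rate
recorded = {"A": 10, "B": 5}         # but A starts out over-represented in the data

for year in range(5):
    total = sum(recorded.values())
    # Allocate 100 patrols in proportion to recorded incidents.
    patrols = {n: round(100 * c / total) for n, c in recorded.items()}
    for n in recorded:
        # Crime is recorded only where patrols are present to observe it.
        recorded[n] += sum(random.random() < TRUE_RATE for _ in range(patrols[n]))
    print(year, patrols, recorded)

# A keeps receiving roughly twice B's patrols, and the gap in recorded
# incidents keeps widening, even though both areas are equally dangerous:
# the model learns only from what it was sent out to observe.
</verbatim>

The initial disparity never corrects itself: the over-policed area keeps generating more records precisely because more officers are sent to observe it.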

The next chapter

The digital society has evolved to a stage where we do not care what we accept or how it affects us. The Social Dilemma by Jeff Orlowski portrays a largely true story; nonetheless, we do not care. This digital society will only keep evolving exponentially. Fortunately, there are still humans alive who were not shaped by smartphones. The rest of us find it annoying and time-wasting to scroll and click through anything involving privacy or T&Cs. Privacy is effectively nonexistent, a few notable hardware exceptions aside; we might as well accept that PRISM, or whatever succeeded it, still has a backdoor into our Macs and Windows machines, and that the paperwork we impose on surveillance agencies only wastes their time and resources.

Instead of devising ways to leverage the limb into safe, reliable, and transparent police agencies, we should focus our energy and resources on a peer-to-peer police agency system that eliminates the underlying issues.

Nothing is perfect, and neither would this be. The limb should be empowered with complete access to everything on the net (assuming it does not already have it). We should create different limbs, each trained on a different country’s distinct and individualized culture. They should be connected in a peer-to-peer network where they monitor one another to ensure adherence to whatever regulatory and ethical standards we decide to subject them to. Each can detect faults in the others and potentially address them. Decentralized decision-making would ultimately root out corruption of the limb. Heightened efficiency, transparency, and accountability would serve our digital society well. Police interactions would be reviewed and analyzed by the corresponding software in each jurisdiction, and misconduct reprimanded accordingly.
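
A deliberately minimal sketch of how that mutual review might work follows. Every element of it is hypothetical: the warrant rule stands in for whatever standards a jurisdiction actually adopts, and the majority vote stands in for whatever decentralized decision rule the network settles on.

<verbatim>
# Hypothetical sketch of peer-to-peer review; no real system is modeled.
from dataclasses import dataclass

@dataclass
class Incident:
    jurisdiction: str
    used_force: bool
    had_warrant: bool

@dataclass
class Peer:
    """One jurisdiction's limb, trained on its own culture and rules."""
    name: str
    requires_warrant: bool = True    # stand-in for that jurisdiction's standard

    def review(self, incident: Incident) -> bool:
        """True if the incident complies with this peer's standard."""
        if incident.used_force and not incident.had_warrant:
            return not self.requires_warrant
        return True

def network_verdict(incident: Incident, peers: list[Peer]) -> bool:
    """Decentralized decision: every peer audits the same incident, and the
    verdict stands only if a majority agree, so no single limb can clear
    its own jurisdiction by itself."""
    votes = [peer.review(incident) for peer in peers]
    return sum(votes) > len(votes) / 2

peers = [Peer("A"), Peer("B"), Peer("C", requires_warrant=False)]
incident = Incident(jurisdiction="C", used_force=True, had_warrant=False)
print(network_verdict(incident, peers))   # False: C's laxer standard is outvoted
</verbatim>

The point of the design is not the vote itself but that oversight is distributed: a corrupted or biased limb is checked by its peers rather than by itself.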

This surveillance state could be a near-future reality. The limb keeps expanding every day into new sectors and industries. Unfortunately, I believe our actions will be reactive rather than proactive. We need to care about our privacy, who controls it, and how it is used before it is too late. Accepting those T&Cs may be one of the most consequential decisions we make.

Proactiveness

The peer-to-peer limb could dispel the public’s mistrust of police agencies and secure its trust. Providing security services that hold law enforcement accountable could be the solution to our underlying-issue dilemma. If we place our trust in international conglomerates, why could we not shift that same trust to an AI entity? How would it be different? Acceptance would face barriers of misunderstanding that only extensive education could overcome.

The evolution of our digital society directly relates to the evolution of our core values. Once one begins to evolve, a domino effect will force the evolution of the rest, including police agencies’ core values. There is no preventing it, so let us be proactive rather than reactive.

It seems to me that the route to improvement is making a clearer point of your own. "Artificial intelligence" here is really a synonym for "software." Your point is that when software is the most important constituent of society, rather than petrochemicals or steel, policing is also different because all of society is different. The evaluations we apply to policing (accuracy, fairness, avoidance of unnecessary intrusion or use of force, effectiveness at maintaining public security and order) will be affected by the transformations policing goes through, and the changes in societal values that will also happen.

All this is basic. The current draft uses many more words to say as much. Now we want a specific relevant idea of your own, which you can describe, analytically evaluate, and enable the reader to put to her own use. You have much room to reduce the introductory material. Levels of technical description are only necessary where the specifics rather than the general importance of software and data analytics are directly involved.

