At least 14 UK police forces have used or planned to use crime-prediction software, according to Liberty. The human rights group said it had lodged 90 Freedom of Information requests to discover which forces used the technology. It believes the programs can produce discriminatory policing strategies that unfairly target ethnic minorities and lower-income communities, and it said there had been a "severe lack of transparency" about the matter.
Proponents of the technology say it can provide new insights into gun and knife crime, sex trafficking and other potentially life-threatening offences at a time when police budgets are under pressure.
A representative said: "We make every effort to prevent bias in data models. For this reason, the data… does not include ethnicity, gender, address location or demographics."
But Liberty said the technologies lacked proper oversight, and that there was no clear evidence they had made communities safer.
The report claims: "These opaque computer programs use algorithms to analyse hordes of biased police data, identifying patterns and embedding an approach to policing which relies on discriminatory profiling." It says the programs entrench pre-existing inequalities while being presented as cost-effective innovations.
Liberty's report examines two kinds of software, which are sometimes used together.
The first is "predictive mapping", in which crime "hotspots" are mapped, leading to more patrols in those areas. The second, "individual risk assessment", aims to forecast how likely an individual is to commit an offence or to become a victim of crime.
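To make the "predictive mapping" idea concrete: at its simplest, hotspot mapping amounts to bucketing past incident locations into geographic grid cells and ranking the cells by incident count. The following is a toy, hypothetical sketch of that idea only; the grid scheme, function name, and coordinates are invented for illustration, and the commercial products named in this article are proprietary and far more sophisticated.

```python
# Toy illustration of grid-based hotspot counting.
# NOT any vendor's actual algorithm; purely a conceptual sketch.
import math
from collections import Counter

def hotspot_cells(incidents, cell_size=0.01, top_n=3):
    """Bucket (lat, lon) incident coordinates into square grid cells
    of side `cell_size` degrees and return the top_n busiest cells
    as ((cell_row, cell_col), count) pairs."""
    counts = Counter(
        (math.floor(lat / cell_size), math.floor(lon / cell_size))
        for lat, lon in incidents
    )
    return counts.most_common(top_n)

# Hypothetical incident coordinates, invented for demonstration.
incidents = [
    (51.5012, -0.1417),
    (51.5023, -0.1425),
    (51.5018, -0.1409),
    (51.5305, -0.1203),
]
print(hotspot_cells(incidents, top_n=1))
```

Critics' core objection applies even to this toy version: the "hotspots" it finds simply reflect where incidents were recorded in the past, so if historical data over-represents certain neighbourhoods, the model directs more patrols there and reinforces the pattern.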
Several companies, including IBM, Microsoft, PredPol, and Palantir, develop applications of this kind.