New Framework Aims to Mitigate Racial Bias in Predictive Policing

Published on April 22, 2026

In recent years, predictive policing has transformed crime prevention strategies, enabling law enforcement to allocate resources more efficiently based on anticipated crime patterns. Traditional systems, however, often inadvertently reinforce racial disparities through biased data. The introduction of fairness-aware methodologies is now critical to address these prevailing inequities.

Researchers have unveiled FASE, a Fairness-Aware Spatiotemporal Event Graph framework designed to enhance predictive policing. The system integrates crime predictions with fairness constraints to optimize patrol allocations, marking a necessary shift from the status quo. Drawing on municipal crime data from 2017 to 2019, FASE harnesses advanced machine learning techniques to reflect community needs.

FASE operates on a graph comprising 25 ZIP Code Tabulation Areas, analyzing nearly 140,000 crime incidents to establish a robust predictive model. The framework combines a graph neural network with a multivariate Hawkes process to capture spatial and temporal crime dynamics. While the results demonstrate strong predictive accuracy, the model also reveals a discrepancy in crime detection rates between minority and non-minority areas, indicating that challenges remain.
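To give a sense of the temporal component described above, a multivariate Hawkes process models how each recorded incident temporarily raises the expected rate of future incidents in its own area and in connected areas. The sketch below uses an exponential excitation kernel; the three-area setup and all parameter values are purely illustrative assumptions, not taken from the FASE paper.

```python
import numpy as np

def hawkes_intensity(t, events, mu, alpha, beta):
    """Conditional intensity of a multivariate Hawkes process at time t.

    mu:     (K,) baseline incident rates, one per area
    alpha:  (K, K) excitation matrix; alpha[i, j] scales how much an
            event in area j raises the rate in area i
    beta:   decay rate of the exponential kernel
    events: list of (event_time, area_index) pairs with event_time < t
    """
    lam = mu.copy()
    for t_j, j in events:
        # Each past event adds an exponentially decaying contribution.
        lam += alpha[:, j] * beta * np.exp(-beta * (t - t_j))
    return lam

# Toy example with 3 areas (hypothetical parameters).
mu = np.array([0.20, 0.10, 0.15])
alpha = np.array([[0.30, 0.10, 0.00],
                  [0.10, 0.20, 0.10],
                  [0.00, 0.10, 0.25]])
events = [(0.5, 0), (1.2, 1)]  # two past incidents in areas 0 and 1
lam = hawkes_intensity(2.0, events, mu, alpha, beta=1.0)
```

Because the excitation terms are nonnegative and decay over time, the intensity sits above the baseline right after events and relaxes back toward it, which is the self-exciting behavior that makes Hawkes processes a common choice for crime-event modeling.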

The implementation of FASE shows promising results in balancing resource distribution while maintaining a demographic impact ratio within narrow bounds. Nevertheless, a detection rate gap of approximately 3.5 percentage points highlights persistent bias issues stemming from feedback-driven data. This outcome underscores the need for comprehensive fairness interventions across all stages of the predictive modeling pipeline, ensuring that technological advancements do not inadvertently exacerbate existing biases.
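The two fairness measures mentioned above can be made concrete with a minimal sketch. The definitions below are plausible readings, not the paper's exact formulas: the impact ratio is taken as the min/max ratio of per-capita patrol hours across area groups, and the gap is the difference in detection rates expressed in percentage points. All counts in the example are hypothetical.

```python
import numpy as np

def demographic_impact_ratio(patrol_hours, population):
    """Min/max ratio of per-capita patrol hours across area groups.

    Values near 1.0 mean allocation is roughly proportional to population.
    Assumed definition; the FASE paper's exact metric may differ.
    """
    per_capita = np.asarray(patrol_hours, dtype=float) / np.asarray(population, dtype=float)
    return per_capita.min() / per_capita.max()

def detection_gap_pp(detected_minority, crimes_minority,
                     detected_other, crimes_other):
    """Detection-rate gap, in percentage points, between two area groups."""
    rate_minority = detected_minority / crimes_minority
    rate_other = detected_other / crimes_other
    return 100.0 * (rate_other - rate_minority)

# Hypothetical counts: 765 of 1,000 crimes detected in minority areas
# versus 800 of 1,000 in non-minority areas yields a 3.5-point gap.
gap = detection_gap_pp(765, 1000, 800, 1000)
ratio = demographic_impact_ratio([100, 90], [10_000, 9_500])
```

A gap like this can persist even when patrol hours are near-proportional, which is why the article's point about feedback-driven data matters: detection rates depend on where patrols have historically looked, not only on where they are sent now.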
