Policing: the Most Dismal Science

“Wherever the law is, crime can be found.” – Aleksandr Solzhenitsyn, ‘The Gulag Archipelago’

Our society is predicated on the assumption that law, as we experience it, and statistics, as we utilize them, are impartial and objective by design. Yet law in a democratic society is derived from its constitutional regime, and analytics are only as useful as the data they elect to analyze. In both instances these decisions are unavoidably political: the law exists to mediate power, and the constituent data serves as its “rational basis.” Thus all data relies on the same power structures it seeks to inform.

An analytical model is as much a reflection of the society that created it as any set of data it seeks to quantify. No set of sociological data can be truly “objective” or “neutral”; methodology demands a context of distinct priorities for that data to have value. Public policy analyst Ingrid Burrington has referred to this false promise of neutrality as an “ideology of data.” This is not to argue that all analytic models are inherently worthless, but to serve as a reminder that this necessary bias should not be discounted.

This helps explain why COMPSTAT, short for COMParative STATistics and sold by its architects as a “data-driven management model” for modern policing, is also referred to as a “management philosophy” by an approving academia and a fawning press. It is not only a philosophy of management but an indicator of the philosophy of its managers. COMPSTAT represents an ahistorical and antisocial approach to policing modern urban spaces, unreliable both in design and in implementation.

Assessments of the modern police state adhere to one of two frameworks: either episodes of police brutality and extrajudicial state murder are tragic aberrations, the product of human frailty, or they are the logical extension of a policy of overpolicing as a means of enforcing “law and order” in densely populated urban areas.

COMPSTAT is a policy designed to assimilate crime data across space and time in order to raise both the “situational awareness” and the “standards of accountability” of police officers. Its forefather was NYPD Lieutenant Jack Maple, whose heavily publicized 1980s initiative, “The Charts of the Future,” used 55 feet of maps to visualize violent crime per district across the five boroughs. Frank Zimring, COMPSTAT’s chief advocate among academics, describes it as essential to every aspect of modern police work: “CompStat information and planning systems pervade all of the strategic changes in the NYPD and are an indivisible part of everything the department does.”

Retired NYPD inspector turned consultant Arthur Storch describes COMPSTAT as “not the answer; it is the forum for finding and demanding the answers. COMPSTAT is really a process, a forum and a system of accountability.” The public misunderstands COMPSTAT as merely a piece of software or a series of computer networks; this belies its true purpose as a comprehensive approach to law enforcement. COMPSTAT places an emphasis on visualizing crime data: a digital map depicts the specific arrests made by personnel assigned to police particular crimes in particular areas. Its advocates emphasize that its central managerial principle is accountability.

To define the values of “law and order” we must first situate our understanding of “crime and disorder” in the historical context of urban policing. When discussing crime, we must separate the concepts of crime as a legal abstraction, crime as statistical “reality” and crime as a behavior and analyze each as a distinct entity.

In response to the Supreme Court’s 1948 decision in Shelley v. Kraemer, which held that courts could not enforce the “restrictive covenants” barring minorities from purchasing property on the basis of race, white America began a concerted effort to construct “safe” urban spaces. With overtly discriminatory housing policies now essentially banned, white America turned to new legal and financial means of preserving its legacy of segregation.

African-American culture has long been considered backward and deviant; circumstances now demanded that the behavior of African-Americans be criminalized. Race was no longer the stated factor in discriminatory practices; it was the variables said to constitute their culture that became cause for concern. The War on Drugs and the “broken windows” theory of policing, the idea that minor offenses encourage a culture of criminality and contribute to a greater sense of social decay, exacerbated these practices of racialized policing. Conditions of poverty were misattributed to race, willfully ignoring that the poverty and violence endemic to poor urban communities reflected not a moral failure but a systemic one.

As African-American culture was, and continues to be, seen as aberrant, private lending institutions could deny mortgages on the basis that these minority groups were seen as “high-risk” investments. Private enterprise sought to occupy the vacuum left by the state; institutionalized racism had now become a demand of the free market. This process of systemic mortgage discrimination, known as “redlining,” ensured that African-Americans would continue to experience discrimination, wage theft and state abuse in physical isolation from white America.

The idea that criminality is inherent to black Americans, and that this justifies their isolation, still prevails. An African-American purchasing a home in a predominantly white community is seen to devalue not just their own property but that of the entire neighborhood. Black mobility, and the fear of criminality associated with it, contributed to the phenomenon of “white flight,” further compounding the problem of depreciating capital. Today, no other factor has had a greater impact on property values than the presence of crime.

In a capitalist framework, the state’s presence in impoverished urban centers is focused less on ensuring public safety and more on regulating the means by which income, and more significantly debt, are generated. Police see the public not as citizens to protect but as data points to be manipulated and, through the levying of fines and fees, as potential sources of revenue for the state.

This process is often mischaracterized as scientific. The scientific method does not seek out data to justify its assumptions; it derives conclusions from the data collected. Predictive policing models, by contrast, operate under certain moral assumptions, some broad (“criminality” is wrong), others oddly specific (“demographic trends forecast future crimes”).

In 1976, methodologist Donald Campbell, in examining the role of standardized testing in educational reform, asserted, “The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor”. This concept is now referred to as Campbell’s Law.

“Conceptualizing something as a science gives it an appearance of neutrality.” – Adam Curtis, “All Watched Over by Machines of Loving Grace”

In 2013 The Nation interviewed an NYPD veteran, under the condition of anonymity, about the pressure administrators place on the NYPD’s rank and file to “generate numbers.” “I had this captain who walked into the precinct and gave a speech about harassing the public. His words were, ‘We’re going to go out there and violate some rights.’ We hear it from the captain down, ‘We want 250s.’” A 250 is the designation for the form NYPD officers file after conducting a stop and frisk. Officers who do not comply with the administration’s demands are deemed derelict in their duty and are subject to disciplinary or punitive measures; officers have been demoted, removed from duty or assigned to hostile jurisdictions without a partner for their inability or unwillingness to issue 250s. Conversely, officers who meet their monthly quota are made eligible for promotion.

This familiar authoritarian logic breeds perverse incentives for law enforcement; police begin to seek out crime where none would otherwise exist. Police make arrests and issue tickets on technicalities, exaggerating minor infractions in pursuit of “serious crimes.” “Quality of life” policing and arrests for petty infractions, the continual criminalization of black spaces, constitute an overwhelming share of these arrests. The selling of untaxed cigarettes, for which Eric Garner was detained and murdered by police in July of last year, is an example of such “quality of life” offenses. Police antagonize those they have detained, often without probable cause, under the policy of stop and frisk, in hopes of provoking a reaction, thereby inflating arrest numbers and meeting quotas. In the interview with The Nation, the same officer confirms, “A lot of police officers try to set civilians off. And once they start talking, start cursing, they can lock them up for anything.”

Jennifer Lynch, senior staff attorney at the Electronic Frontier Foundation, notes the danger inherent to COMPSTAT: “What starts to happen is people think the results that come out of that must be accurate because there’s technology involved. Data may have been collected in certain communities more than other communities. The problem is technology legitimizes somehow the problematic policing that was the origination of the data to begin with.” The purportedly scientific approach of statistics-driven models grants racist policing the veneer of objectivity. The constant police presence in black urban spaces relies on a conception of crime as persistent and “omnipresent” to justify its existence. For the sake of maintaining “public safety,” these increased arrests further the systemic poverty within these spaces; mass incarceration deprives these communities of both capital and labor. Overpolicing affirms the idea of the permanence of crime in urban spaces, degrading property values and further depriving these communities of capital.
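Lynch’s point can be made concrete with a minimal simulation, sketched below with invented neighborhoods, rates and patrol rules (none of it drawn from any department’s actual data). If patrols are allocated in proportion to previously recorded crime, and recorded crime is largely a function of where officers are looking, an initial disparity in policing reproduces itself indefinitely, whatever the underlying behavior.

```python
# A hypothetical sketch (invented numbers, not real data) of the feedback
# loop Lynch describes: patrols follow previously recorded crime, and
# recorded crime follows the patrols.

TRUE_OFFENSES_PER_DAY = 100   # identical in both neighborhoods
DISCOVERY_PER_PATROL = 5      # offenses one patrol can observe per shift
TOTAL_PATROLS = 20

# Historical arrest counts: neighborhood "A" starts out overpoliced.
recorded = {"A": 10, "B": 1}

for day in range(50):
    total = sum(recorded.values())
    for hood in list(recorded):
        # "Data-driven" allocation: patrols proportional to recorded crime.
        patrols = round(TOTAL_PATROLS * recorded[hood] / total)
        # What gets recorded is capped by how many officers are looking,
        # not by how much offending actually occurs.
        observed = min(patrols * DISCOVERY_PER_PATROL, TRUE_OFFENSES_PER_DAY)
        recorded[hood] += observed

print(recorded)
# "A" ends up with roughly nine times the recorded crime of "B", even
# though the underlying behavior in the two neighborhoods is identical.
```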

Former NYPD officer turned whistleblower Adrian Schoolcraft revealed, in a series of 2010 interviews with the Village Voice, how the NYPD rank and file find themselves in the unique position of having to both fabricate and underreport crimes. Bedford-Stuyvesant’s 81st Precinct recorded felonies as misdemeanors and refused to take complaints from victims. For the sake of optics, rapes were downgraded to criminal trespass. Officers endeavored to find the lowest online price for objects reported stolen, since “felony theft” is defined as the theft of any object exceeding $1,000 in value; within COMPSTAT, felony theft is a significant “index crime,” whereas petty larceny is a negligible offense, a misdemeanor that goes unrecorded. This reflects the internal bureaucratic pressures created by quotas, producing a paradox: a falling reported crime rate while actual crime grows.
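A toy example makes the incentive plain. The $1,000 threshold is the figure cited above; the stolen item and the price quotes below are hypothetical.

```python
# Hypothetical illustration of the valuation game described above. The
# $1,000 felony-theft threshold is the figure cited in the text; the item
# and the prices are invented.

FELONY_THRESHOLD = 1_000  # dollars

def classify(value: float) -> str:
    """Classify a theft by the value assigned to the stolen property."""
    return ("grand larceny (COMPSTAT index crime)"
            if value > FELONY_THRESHOLD
            else "petty larceny (misdemeanor, absent from the index)")

# Two prices an officer might attach to the same stolen laptop.
victim_receipt = 1_199.00      # what the complainant actually paid
lowest_online_quote = 949.99   # the cheapest listing the officer can find

print(classify(victim_receipt))       # grand larceny (COMPSTAT index crime)
print(classify(lowest_online_quote))  # petty larceny, and the crime vanishes
```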

These departments use euphemisms to promote their agenda, referring to quotas as “performance goals” or “performance initiatives.” Within this context the NYPD is much more concerned with perception than any “statistical reality.” In this regard law enforcement functions like a brand, adhering to the philosophy of brand management. This poses a significant dilemma to the modern police state: at what point does a police presence no longer serve as a deterrent to crime? At what point do police become so inefficient they themselves act as an impediment to public safety?

The apparent success of proactive policing has lent credibility to the new field of predictive policing, which relies on “emergent technologies” and “proprietary algorithms” to “aggregate and analyze existing public safety data streams in real time” in order to anticipate and prevent crimes. Advocates of this technology at the RAND Corporation, in the policy proposal “Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations,” base the case for predictive policing on the success that similar “analytical tools” have had in the private sector forecasting consumer behavior. They draw a comparison between predictive policing and the manner in which “businesses use predictive analytics to determine sales strategies”.

Their proposal treats Walmart’s analysis of weather patterns, used to determine how many strawberry Pop-Tarts to stock prior to major weather events, as analogous to the statistical models used in “predictive policing”. The strawberry Pop-Tarts, they continue, “represent a non-obvious relationship. These relationships are uncovered through statistical analysis of previous customer purchases during similar major weather events. Many similar relationships in law enforcement can be explored with predictive policing”. This leads to the “development of strategies that foster more efficient and effective policing”.

That policy analysts would not hesitate to compare the consumption patterns of strawberry Pop-Tarts at big-box stores with assessments of crime speaks to the corporate mindset that dictates so much of our intellectual discourse, and to the ascension of the neoliberal technocrat as an instrument of public policy. This is a free-market approach to fighting crime; the utopian ideal of predictive policing is to approach communities as singular, continuous and easily replicable entities. As Kade Crockford, director of the American Civil Liberties Union of Massachusetts’ Technology for Liberty Project, notes, “People are excited about technological solutions for vexing social problems. This is technophilia that’s taken over. People assume if computers are involved, then it’s smarter, more efficient”.

Jeffrey Brantingham, the UCLA professor of anthropology who developed the PredPol system and who on his personal website describes “crime as ubiquitous,” insists that “because PredPol only concerns itself with the spatial and temporal aspects of crime, it wouldn’t be skewed by social factors”. In his insistence on PredPol’s neutrality, Brantingham highlights the fundamental flaw of his approach: algorithms, by their nature, discriminate, and it is incumbent upon academics and policymakers to determine the manner of that discrimination.
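The research published by PredPol’s developers describes a self-exciting point process: predicted risk at a place and time is a background rate plus a contribution from every previously recorded incident, decaying with distance and elapsed time. The sketch below is a minimal, illustrative version of that idea, with invented parameters and events, not PredPol’s actual code; it shows why “only spatial and temporal” is no defense, since the model’s only input is the record of where and when crimes were logged.

```python
import math

# Illustrative self-exciting point process: risk = background rate plus a
# kernel-weighted sum over prior *recorded* incidents. All parameters and
# events below are made up for the sake of the sketch.

MU = 0.05        # background rate
THETA = 0.8      # weight of each prior incident
OMEGA = 0.1      # temporal decay (per day)
SIGMA = 200.0    # spatial bandwidth (metres)

def intensity(x, y, t, past_events):
    """Predicted risk at location (x, y) on day t, in arbitrary units.

    `past_events` is a list of (xi, yi, ti) tuples of recorded incidents;
    nothing else enters the model, which is the point: if the record is
    skewed, the forecast is skewed.
    """
    risk = MU
    for xi, yi, ti in past_events:
        if ti >= t:
            continue  # only earlier incidents contribute
        dt = t - ti
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        temporal = OMEGA * math.exp(-OMEGA * dt)          # decays with time
        spatial = math.exp(-d2 / (2 * SIGMA ** 2))        # decays with distance
        risk += THETA * temporal * spatial
    return risk

# Hypothetical recorded incidents: (x metres, y metres, day)
events = [(100, 100, 1), (120, 90, 2), (2000, 2000, 2)]
print(intensity(110, 100, 3, events))   # elevated near the recorded cluster
print(intensity(5000, 5000, 3, events)) # near the background rate elsewhere
```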

The problematic approaches of modern policing are embodied in the work of men like Brantingham and in the academic field of criminology, the “scientific study” of crime. Criminology purports to examine the cause, effect, control and prevention of criminal behavior. As it exists today, criminology is a fundamentally ahistorical approach to understanding criminality, treating poverty as a constant rather than as a structural variable or compounding factor. It forces sociological data to conform to a model which justifies the state’s continual presence and use of force. This intellectually bankrupt discipline is the sole academic field informing our approach to policing.

This framework will invariably be overused and abused. False assurances of fairness undermine the stated goal of equality, reinforcing belief in a “scientifically accurate justice” when, in reality, no such thing exists. Statistical models threaten to fundamentally change how we define and perceive reality. At what point do deliberate lies become self-fulfilling truths? At what point can deceit be considered a quantifiable statistical reality?

The methodologies of modern police departments do not, in their entirety, constitute a system so much as an understanding: an unprecedented, dangerous and pervasive understanding that lacks any formalized structure or legal means of accountability. Science and technology now exist in a regulatory vacuum, motivated primarily by profit and unencumbered by ethics. Technology now sets our ideological paradigms and the law follows. This approach sees cities as monolithic entities; it signifies law enforcement’s insistence upon automated solutions to dynamic human problems. This superficially objective, post-hierarchical approach ignores the power structures responsible for creating and maintaining these machines.

The pathology of productivism relies on the assurances of these statistical models. It is a system that aspires to stability over safety, prioritizing the manipulation of constituent factors over a comprehensive community approach: fundamentally different objectives that yield fundamentally different results.
