The Measure of Our Success

Good intentions are not enough when we make decisions for criminal justice.


Two of the strangest things I have noticed over the last 30 years are the failure to measure the outcomes of decisions made in law enforcement and the failure to set direction based on those outcomes. It used to be that when we chose an action to deal with a specific problem, we expected to see a result within a certain timeframe. Then we assessed the result and modified the action as needed. Unfortunately, nowadays, regardless of the results of our actions, we choose to ignore feedback and just "stick with the program," effective or not.

Illustration: Sequoia Blankenship

Take, for example, drug education in our schools.

After assessing whether or not this was a challenge law enforcement could resolve (and at least there was an assessment, a step that is also skipped these days), it was decided to implement D.A.R.E. nationwide. "Drug Abuse Resistance Education" was a program started by the Los Angeles PD to nip drug use in the bud at the elementary school level. Additionally, the contact between the kids and a local police officer was meant to be a positive example of caring professionals interacting with the children they were trying to keep safe.

The key problem with this decision was the lack of two critical elements of planning: First, no timeframe was set within which we would measure the results of the program, positive or negative. Second, we did not identify what those expected results should actually be and test to see if they were achieved. From its beginning in 1983 until its rewriting in 2009, D.A.R.E. trained millions of kids and spent billions of dollars without any performance measurement being applied. It was a popular program that kids and cops seemed to like, and therefore we assumed it was working.

My wife, the Sarge, ran her agency's D.A.R.E. program and will attest to the positive feedback each officer received when helping the kids deal with the real-world problem of drugs. Cool seizure vehicles, treasured T-shirts, and signature teddy bears became the symbols of the program. Every American school kid knew their "D.A.R.E. Officer," and these cops were minor celebrities in the grade schools in their towns. D.A.R.E., it was assumed, would be the final charge in the war on drugs, and negative feedback was not sought or accepted.

The problem was, the program was not actually making a difference in the use of drugs among the youth who had completed the curriculum. Repeatedly, researchers found the program was not doing what it was designed to do. And yet, year in and year out, funding, grants, and personnel went to the schools as they dutifully tried their best—and they truly tried—to stem the tide of drug use. If we are honest, we must admit that the problem of drugs is not truly a law enforcement issue, as much as it is a challenge for families, schools, neighborhoods, and social institutions. Yet these entities all too often step aside, wrongly thinking that crime fighters can take the reins and steer society away from dependency.

And now we have a new plethora of "recommendations" demanding that we "change the culture of law enforcement" without clearly (1) defining the problem, (2) collecting the data required to assess the problem, (3) identifying meaningful alternatives based on the data, (4) setting an expected time by which these drastic changes should work, or (5) measuring whether the changes were successful. Good intentions are not enough when we make decisions for criminal justice, since unintended consequences can lead to increased officer deaths and civilian casualties. For example, the "Ferguson Effect" may have led to passive policing, increased crime, and training that has never been evaluated for efficacy; its only "evaluation" is how it "looks" to politicians and the media.

The blueprint for planning and change that I was taught was based on the textbook Criminal Justice Planning, by O'Neill, Bykowski, and Blair (1976). It uses a systems-based approach to planning and executing change in an organization. Essentially, it is the good old Aristotelian scientific method applied to planning: Identify the Problem; Gather Data; Develop Possible Alternatives; Choose One; Develop an Implementation Plan; and finally, Conduct an Evaluation. That last step may lead to changing the original plan or choosing a different alternative. It is called feedback, and it works!

This model will look familiar to you crime-fighters who are proponents of the OODA Loop. Created by USAF Colonel John Boyd, OODA stands for Observe, Orient, Decide, Act. The "secret sauce" is that it's a Loop; after Act, we Observe the results of the Act, and then run through the loop again; and again; and again—forever. This system has proven effective in a broad range of applications, and yet we neglect this process in high-level decision-making for law enforcement.

Too often, we get a demand to train our crime fighters in bias, de-escalation, or some other "cultural change" in this strange non-loop way. We have no way to determine what other options we have, what it looks like when it works, or what data exist to support the training or expenditure, and no expectation that we will refine the process over time to increase its effectiveness. It's time we stop genuflecting to every social justice warrior with a PhD or every academic idea that grows out of the teachers' lounge, and get back to effective planning that achieves the true mission of law enforcement: freedom and security.

Dave Smith is an internationally recognized law enforcement trainer and is the creator of "JD Buck Savage." You can follow Buck on Twitter at @thebucksavage.
