Opinion

Early education programs need results – and good data

“It is too easy to measure something simple – like attendance – when what we really want is learning certain skills and actually becoming ready for school entry,” writes Christopher Sanford.

In all the discussion of universal early education – preparing every at-risk child in Durham, starting at birth, to be ready to enter school – I’ve heard little mention of a crucial component: monitoring and evaluation. Yet this is the answer to the major unspoken objection to such a program. If you have doubts about spending millions of extra dollars on small children, perhaps you suspect that this too would be a waste, frittering away tax dollars with no results. That is a valid concern.

Many of us lived through – and, in my case, worked in – the poverty programs of the 1960s and ’70s. We saw expensive programs, poorly planned, poorly implemented, and scarcely evaluated, until outside investigators reviewed the facts and the history and concluded that little had been accomplished. (I have to interject here that there were a few good programs, programs that helped poor people get training, get good jobs, and escape poverty. I’m afraid they were the exception.) The programs certainly helped poverty staffers, if not poor people. The saying was, “Go into poverty – that’s where the money is.”

Let me tell you about two programs. Back in the early ’70s I was part of the MDC team that developed HRD, a job-training program (still thriving, I think, on many community college campuses). We thought about how to measure the success of the different centers – which meant also figuring out how to divvy up next year’s funding. We developed the Earnback Index (now the “Efficiency Index”), which said: Look at each trainee’s income before and after training, a full year in each case, subtract, and compare that to the cost of the training. If a program’s average trainee gained $10,000, and the training cost was $2,000, then the index number was 5. Easy to compare program costs and outcomes.
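For the arithmetic-minded, here is the index in miniature – a sketch of my own, not MDC’s code. The function name and the before-and-after incomes are invented for illustration; only the $10,000 gain and the $2,000 training cost come from the example above.

    def efficiency_index(income_before, income_after, training_cost):
        # One full year of income gain, divided by what the training cost.
        return (income_after - income_before) / training_cost

    # The example above: an average gain of $10,000 on $2,000 of training.
    print(efficiency_index(15_000, 25_000, 2_000))  # -> 5.0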

And then I worked in a federal program, run through the N.C. Department of Labor. After trainees left the program, we followed up. I visited Trainee “Joe” every week until, one time, I found him working at Hardee’s. (After welder training? Oh well.) I filled out the form, and the program calculated an annual income based on his hourly pay. (No matter that he had only worked a week.) The program reported to Congress that all our trainees got jobs, so we were a success. And a boondoggle. So be skeptical.
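To see how misleading that annualizing was, consider a rough sketch. All the figures here are hypothetical – the program’s actual form and wages are long gone – but the arithmetic is the trick itself: one week of work, reported as a year.

    # Hypothetical figures, chosen only to illustrate the annualizing trick.
    hourly_pay = 4.50
    hours_per_week = 40
    weeks_per_year = 52

    # What the form reported: a full year at that hourly wage.
    reported_annual_income = hourly_pay * hours_per_week * weeks_per_year
    print(f"Reported annual income: ${reported_annual_income:,.2f}")  # $9,360.00

    # What Joe actually earned before the job ended after one week.
    actual_income = hourly_pay * hours_per_week * 1
    print(f"Actual earnings: ${actual_income:,.2f}")                  # $180.00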

And many Durham residents can point to local programs over the last 20 or 30 years – I’m thinking of one or two in low-income housing, and I know there were others – where the city or county poured money into an agency or group and discovered several years later that the money was gone and little had been done.

It doesn’t need to happen that way. It must not. There is an oft-forgotten guideline that in return for that pile of money, any program must first spell out its plans and expected results, and then must provide interim progress reports, document problem fixes, and show final results. Sounds simple enough. The challenge is to decide on goals and on how to measure success or failure.

It is too easy to measure something simple – like attendance (maybe important but not really the goal) – when what we really want is learning certain skills and actually becoming ready for school entry. (What should we measure along the way? The final measure, of course, will come several years down the road when children actually enter school and we observe whether they are really ready.)

It will be up to the agency that provides the universal early education to be very clear about the goals, the measures, the monitoring, the fixes (when needed), and the expected outcomes. This will call for a very good data-management system and a hands-on staff that is constantly watching for problems. I was glad to see this issue listed as one of the points in the State of Durham County’s Young Children report, issued by the task force last spring, and I was glad to see County Commissioner Ellen Reckhow emphasize it in her op-ed on Sunday.

Christopher B. Sanford is an education advocate and gadfly. He welcomes your comments at sanfordchristopher1636@gmail.com.
