On mentoring and wheat

by Jean Rhodes

A call to action in youth mentoring

Ron Haskins, the co-director of the Center on Children and Families at the Brookings Institution, recently wrote a New York Times op-ed in which he made a strong case for the use of evaluation and evidence in social programs. As he notes, “Despite decades of efforts and trillions of dollars in spending, rigorous evaluations typically find that around 75 percent of programs or practices that are intended to help people do better at school or at work have little or no effect….[yet] A growing body of evidence shows that a few model social programs — home visits to vulnerable families, K-12 education, pregnancy prevention, community college and employment training — produce solid impacts that can last for many years.”

As Haskins notes, the evidence-based movement currently being championed by the Obama administration has the potential to “separate the wheat from the chaff. If this movement spreads to more federal programs, especially the big education, employment and health programs supported by formula-based grants, we can expect consternation and flailing as many program operators discover that their programs are part of the chaff.”

The need for evidence-based practice

How can youth mentoring, as a field, ensure that it remains part of the wheat in the year ahead? An important step will be for programs to adhere to the best practices that have emerged from the growing body of mentoring program research and evaluation. To facilitate this, MENTOR will soon release the 4th edition of the Elements of Effective Practice (EEPM4). With its long history and bright red cover, the EEP has taken an almost iconic place in the field, easily the most recognizable resource that MENTOR provides. As with the EEPM3, the development and writing of the EEPM4 was led by me, along with Janis Kupersmidt and Rebecca Stelter of iRT. Our core group also included the expert assistance of Michael Garringer and Stella Kanchewa, along with the collective wisdom of a core set of research and practice leaders in the field.

There are four major components of the EEPM4: Standards, Benchmarks, Enhancements, and the Research Justifications for each of the standards. In contrast to previous versions, the EEPM3 and EEPM4 are grounded entirely in research from the fields of mentoring and child development and beyond. For example, the EEPM4 includes an extensive new section on mentor recruitment, drawn from the fields of volunteerism and marketing. Practitioners have thus far responded extremely positively to this clear, parsimonious, and well-justified set of guidelines, and the EEPM4 will continue to move the field toward more evidence-based practice. In the years ahead, there will be many opportunities for practitioners and researchers to extend the Elements with specific tools that help programs adhere to evidence-based standards that, when rigorously applied, will yield much larger effects.

Along these lines, the EEPM3/4 team has developed and evaluated a dynamic set of web-based tools that correspond with the EEPM3 (and will incorporate the EEPM4). The training, hosted on Mentoring Central, contains two five-lesson courses covering prematch topics that align with research on effective mentoring relationships: “Building the Foundation” (e.g., mentors’ motivations/expectations, roles, behaviors) and “Building and Maintaining the Relationship” (e.g., ethics and safety, boundaries, closure). Results of an experimental evaluation of the training have been extremely promising and, as far as we know, it represents the only evidence-based training for youth mentoring currently available. Additional program derivatives and post-match specialized training lessons are underway, and a companion in-person training protocol focusing on behavioral skills training has been developed to complement the pre-match web-based training course.

For investments in mentoring to grow, these and other tools will play a vital role. They will ensure more consistent adherence to the practices outlined in the EEPM4 and facilitate the use of research to guide program improvement and innovation. This can be accomplished by fostering strong collaborations between practitioners and researchers in the design, implementation, evaluation, and ongoing refinement of evidence-based tools and practices.