A Message from APEP Head Honcho, David Devlin-Foltz
As many of our Washington, DC colleagues already know, Lisa Molinaro is leaving the Advocacy Planning and Evaluation Program on February 8th to pursue an exciting combination of travel and teaching yoga around the world, followed by business school in September and a career that will no doubt combine her array of skills and passions.
Lisa joined us in November 2008 as an intern and steadily took on more responsibility over her three-plus years with the program. She took the lead in developing our “champion scorecard” tool for assessing changes in the support officials offer for policy change. Last year she led our development of the Advocacy Progress Planner version 2.0 while giving a new look to our web and printed materials. Over the years, Lisa has brought the same enthusiasm and creativity to fem*ex, the newly incorporated nonprofit women’s empowerment group that began in her living room with a few friends, and of course to her trapeze and Acro-Yoga practice as well.
A New Advocacy Evaluation Toolkit
In October of last year, Mathematica Policy Research released a toolkit drawn from its evaluation of Consumer Voices for Coverage, a coalition of organizations funded by the Robert Wood Johnson Foundation to advocate for greater health insurance coverage across multiple states. The toolkit describes a wide range of useful evaluation strategies, such as assessing advocacy capacity and evaluating policymakers’ views on advocacy. It’s a must-read for advocacy evaluators!
Only as Good as Your Data
This week we learned that Claremont McKenna, a small liberal arts college in California, fudged its SAT statistics to climb in the ubiquitous U.S. News & World Report rankings. Unsurprisingly, this is not an isolated case of data tampering: Iona College and Baylor University are just two recent examples cited by The New York Times. The takeaway for evaluators? Your results are only as good as your data.