What's in your evaluation toolbox? A recap of the CESBC Evaluation Conference

Dec. 14, 2018 in Articles

Two members of our evaluation team visited Vancouver in November to attend the Canadian Evaluation Society BC Chapter’s 2018 Provincial Conference. With the theme What’s in Your Evaluation Toolbox?, the conference offered attendees a variety of workshops focusing on building evaluative capacity and adding new tools to our evaluation toolboxes. Using some of the top tweets from conference attendees, we've put together ten key takeaways from the conference.

 

  1. Who gets to decide what is defined as good, for whom, when? (Dr. Sarah Schulman, Keynote Speaker)

Dr. Schulman, social impact lead and founder of InWithForward, discussed the importance of Measuring What Matters in her keynote address. She raised thought-provoking questions, such as “What matters most, and who gets to decide?” and “What comes to be defined as good, for whom, when?” These are questions that evaluators and public sector staff grapple with in our day-to-day work.

 

  2. Evaluators need more than just evaluation skills (Diana Tindall, What Do I Need in my Evaluation Toolbox?)

Diana Tindall, an evaluator with the Rick Hansen Foundation, examined more than 40 job postings for evaluation positions posted between 2017 and 2018. Comparing these postings, she found that evaluators need more than just evaluation experience and post-secondary education. Evaluators also need:

  • great communication skills
  • the ability to collaborate with stakeholders
  • data collection, analysis and visualization experience
  • professional/interpersonal skills

 

  3. Our strength as evaluators is in our ability to navigate challenges (Elaina Mack & Kim Walker, Our Developmental Evaluation Journey: Navigating Supports for Newcomer Refugees)

 

Elaina Mack, engagement and evaluation director with the International Institute for Child Rights, and Kim Walker, principal of community and environment, presented on their experience implementing a developmental evaluation approach for a newcomer refugee support program. A key takeaway from this presentation: our clients do not expect perfection from us as evaluators, but they do expect us to persevere and navigate the challenging, fast-paced and ever-changing world of evaluation.

 

  4. Developmental evaluation: A challenging but worthwhile approach (Elaina Mack & Kim Walker, Our Developmental Evaluation Journey: Navigating Supports for Newcomer Refugees)

 

Developmental Evaluation (DE) is an evaluation approach that helps social innovators develop social change initiatives in complex or uncertain environments. While the DE process can be challenging to navigate, the approach is creative and emergent, well-suited to complex, innovative programs, and provides real-time (or near-real-time) feedback to program developers and staff, creating a continuous feedback loop for program development.

 

  5. Choosing what to measure reveals our values (Dr. Sarah Schulman, Keynote Speaker)

In her keynote address, Dr. Schulman reminded us that what gets measured becomes the focus and reflects our values, so choose carefully. She emphasized the importance of including evaluation as part of the project process, not just as an extra step at the end. Schulman also stressed the need to balance what is important to the end-user/individual with what is important to the system, prioritizing relationship-building, conversation, observation and imagination as part of the evaluation process.

 

  6. When it comes to indicator selection, be ruthless! (Penny Cooper, Help! Our Logic Model Contains 147 Potential Indicators)

Penny Cooper, principal of Penny Cooper and Associates, reflected on her experience of working with a client who ended up with 147 potential indicators in their logic model. The key piece of advice that Cooper passed along to conference participants? Be ruthless and get critical when selecting indicators. Focus on “what are we always being asked?” and ensure your indicators reflect—and can answer—that question.

 

  7. Evaluation doesn’t have to be expensive. There is a plethora of free (or low-cost) tools available to extend evaluation practice. (Marla Steinberg, 10+ Free or Low-cost Web-based Tools to Extend Evaluation Practice)

Marla Steinberg, credentialed evaluator and independent consultant, spoke to a packed room about free or low-cost web-based tools that evaluators can use in their practice. The standing-room-only session confirmed that there is a need for low-cost tools in evaluation, especially for student and emerging evaluators.

 

  8. Arts-based approaches to evaluation are accessible and participatory! (Jennica Nichols, Data Collection Tools that are Accessible to Everyone—Using Arts-based Methods in Evaluation)

Jennica Nichols, PhD candidate and research manager at the University of British Columbia, introduced conference participants to the exciting world of arts-based evaluation and research. An arts-based approach does not require the practitioner to be an artist; instead, it's an accessible and participatory approach that can break down barriers and address power dynamics between evaluators and participants. Nichols presented conference attendees with three arts-based methods: photo elicitation, drawing, and building. The BC Healthy Communities evaluation team is excited to explore how arts-based evaluation methods can be incorporated into our evaluation and research approach.

 

  9. Evaluation is more than WHAT we measure; the HOW and WHERE are equally important. (Dr. Sarah Schulman, Keynote Speaker)

In her keynote address, Dr. Schulman highlighted that data is deeply personal, meaning how we measure, and the environment we measure in, matter just as much as what we measure. Evaluation, as an experience, can engender honesty, a two-way exchange of dialogue and feedback, and a spirit of continuous inquiry. Dr. Schulman emphasized the importance of conversation, observation, imagination and relationship-building throughout the evaluation process.

 

  10. It’s okay to fail. Evaluation is about learning and creativity. (Dr. Sarah Schulman, Keynote Speaker)

When it comes to evaluation and project design, failure is inevitable, so embrace it. Failure removes uncertainty: fail early and fail often. Learn from your failures and be creative in how you use these lessons to improve your practice.

 

BONUS: 11. Evaluation is an exciting, growing field. It can be creative, participatory, and engaging!

Gone are the days when evaluation consisted solely of surveys, monotonous reports and numbers. Evaluation approaches are constantly emerging and evolving; alongside quantitative data, qualitative data is increasingly being used so that stories can be told. A growing number of evaluators are embracing creative, participatory approaches that are inclusive and accessible. How will you embrace this change and incorporate participatory evaluation methods into your practice?

For more highlights from the conference, check out the hashtag #EvalBC18 on Twitter.

Have questions about evaluation or what you read here? Connect with a member of our evaluation team:

Diana Gresku, Research and Impact Specialist

diana@bchealthycommunities.ca

Cassidy Paxton, Researcher

cassidy@bchealthycommunities.ca

About the Author

Cassidy Paxton, BA

Cassidy graduated from the University of Victoria with a Bachelor of Arts in Health and Community Services and a minor in public administration. She is passionate about creating healthier communities through policy, community organizing, and empowering young people to be leaders in their communities.