Two members of our evaluation team visited Vancouver in November to attend the Canadian Evaluation Society BC Chapter’s 2018 Provincial Conference. With the theme What’s in Your Evaluation Toolbox?, the conference offered attendees a variety of workshops focusing on building evaluative capacity and adding new tools to our evaluation toolboxes. Using some of the top tweets from conference attendees, we've put together ten key takeaways from the conference.
Dr. Schulman, social impact lead and founder of InWithForward, discussed the importance of Measuring What Matters in her keynote address. Dr. Schulman raised thought-provoking questions, such as “What matters most, and who gets to decide?” and “What comes to be defined as good, for whom, when?” These are questions that evaluators and public sector staff grapple with in our day-to-day work.
Diana Tindall, an evaluator with the Rick Hansen Foundation, examined over 40 evaluation job postings from 2017 and 2018. In comparing these postings, she noted that evaluators need more than just evaluation experience and post-secondary education. Evaluators also need:
Elaina Mack, engagement and evaluation director with the International Institute for Child Rights, and Kim Walker, principal of community and environment, presented on their experiences implementing a developmental evaluation approach for a newcomer refugee support program. A key takeaway from this presentation was that as evaluators, our clients do not expect perfection—but they do expect us to persevere and navigate the challenging, fast-paced and ever-changing world of evaluation.
Developmental Evaluation (DE) is an evaluation approach that can help social innovators develop social change initiatives in complex or uncertain environments. While the DE process can be challenging to navigate, the approach is creative, emergent and well-suited to complex, innovative programs. It provides real-time (or close to real-time) feedback to program developers and staff, facilitating a continuous feedback loop for program development.
In her keynote address, Dr. Schulman reminded us that what gets measured becomes the focus and reflects our values, so choose carefully. She emphasized the importance of including evaluation as part of the project process, not just an extra step at the end. Schulman also stressed the need to balance what is important to the end-user/individual with what is important to the system, prioritizing relationship-building, conversation, observation and imagination as part of the evaluation process.
Resetting perspectives from those of funders and policies to those of the people experiencing challenges can make for a significant departure in the focus of evaluation. -Dr. Sarah Schulman #evalbc18— Sarah Farina (@Broadleafc) November 30, 2018
Penny Cooper, principal of Penny Cooper and Associates, reflected on her experience of working with a client who ended up with 147 potential indicators in their logic model. The key piece of advice that Cooper passed along to conference participants? Be ruthless and get critical when selecting indicators. Focus on “what are we always being asked?” and ensure your indicators reflect—and can answer—that question.
Great presentation from Penny Cooper on indicator development!— Cassidy Paxton (@itspaxxton) November 30, 2018
• Be realistic & consider burden on person resp. for data collection
• Critically review your indicators to narrow them down
• Qualitative data is okay & helps tell the story #EvalBC18 @EvaluationBC pic.twitter.com/iuPr5d0NXG
Marla Steinberg, credentialed evaluator and independent consultant, spoke to a packed room about free and low-cost web-based tools that evaluators can use in their practice. The standing-room-only crowd confirmed that there is a real need for low-cost tools in evaluation, especially for student and emerging evaluators.
Lots of resources from Marla Steinberg's presentation today. Hot tip: Use padlet to have stakeholders vote on eval questions. #Evalbc18 #evaluation #Free99 https://t.co/UC5nHrV7t3 pic.twitter.com/ewYGedFhAn— Kanyevu wa Matano (@dataandmuffins) November 30, 2018
Jennica Nichols, PhD candidate and research manager at the University of British Columbia, introduced conference participants to the exciting world of arts-based evaluation and research. An arts-based approach does not require the practitioner to be an artist; instead, it's an accessible and participatory approach that can break down barriers and address power dynamics between evaluators and participants. Nichols presented conference attendees with three arts-based methods: photo elicitation, drawing, and building. The BC Healthy Communities evaluation team is excited to explore how arts-based evaluation methods can be incorporated into our evaluation and research approach.
In her keynote address, Dr. Schulman highlighted the fact that data is deeply personal, meaning how we measure—and the environment we measure in—are equally important. Evaluation, as an experience, can engender honesty, a two-way exchange of dialogue and feedback, and a spirit of continuous inquiry. Dr. Schulman emphasized the importance of conversation, observation, imagination and relationship-building in the evaluation process.
Data is deeply personal. This means that how we measure is just as important as what we measure.— Sarah Farina (@Broadleafc) December 1, 2018
Evaluation as experience means we engender a sense of honesty and delight, a two-way exchange, spirit of constant inquiry.
-Dr. Sarah Schulman#evalbc18
When it comes to evaluation and project design, failure is inevitable—embrace it. Failure removes uncertainty, so fail early and fail often. Learn from your failures and be creative in how you use these lessons to improve your practice.
Evaluation can be a tool for creativity and learning. Failing quickly can reduce risk. -Dr. Sarah Schulman #evalbc18— Sarah Farina (@Broadleafc) November 30, 2018
Gone are the days when evaluation consisted solely of surveys, monotonous reports and numbers. Evaluation approaches are constantly emerging and evolving; evaluators are increasingly complementing quantitative data with qualitative data so that stories can be told. A growing number of evaluators are embracing creative, participatory approaches that are inclusive and accessible. How will you embrace this change and incorporate participatory evaluation methods into your practice?
What was my key takeaway from @EvaluationBC conference? Eval. doesn’t have to be boring! You can be creative & engaging by making it participatory! You can use art, role playing, theatre, & discussion. #Evaluation doesn’t have to just be surveys & monotonous reports. #EvalBC18— Cassidy Paxton (@itspaxxton) November 30, 2018
For more highlights from the conference, check out the hashtag #EvalBC18 on Twitter.
Have questions about evaluation or what you read here? Connect with a member of our evaluation team:
Cassidy graduated from the University of Victoria with a Bachelor of Arts in Health and Community Services and a minor in public administration. She is passionate about creating healthier communities through policy, community organizing, and empowering young people to be leaders in their community.