A new month means a new focus!
This month SAGE MethodSpace will focus on evaluation and other types of applied research. Posts in the series and relevant posts from the archives will be found using this link.
We are also introducing a new MentorSpace feature: Mentors-in-Residence. We will be featuring SAGE authors and experts each month, and giving you the opportunity to ask them about how to understand and use the ideas they present.
Going forward, MethodSpace will continue to offer a wide range of posts and resources that answer what and why questions about conducting, teaching, and writing about research. Under the MentorSpace tab you will find posts and resources that answer how questions. You can receive all new posts to your inbox by logging in and subscribing to the site.
Welcome October Mentors-in-Residence, Drs. Bernadette Wright and Steve Wallis!
Bernadette and Steve are the co-authors of a new book from SAGE, Practical Mapping for Applied Research and Evaluation. You can find a sample of Chapter 1 and online resources. Enter the code SAGE2019 for a discount on your book purchase.
I asked Bernadette and Steve a few questions, so they can introduce themselves and their work.
JS. Tell us a little about the journey that resulted in this book. How did you come to write this book together? How did you build on your respective backgrounds and experiences in this collaborative effort?
Steve: We first “met” by reading each other’s work online. We shared a frustration with the status quo and a desire to find new ways to conduct more effective evaluations. We collaborated on a number of smaller publications (white papers, academic articles, and so on) and learned to respect each other’s creativity, hard work, dedication, and knowledge.
Bernadette: I came across Steve’s website, and I liked what he was saying about how we need ways of advancing better theories for solving social problems. So I joined his email list. One day, he posted that he was looking for someone to work with him to finish a paper for publication in a journal. I was intrigued, so I volunteered, and that led to writing that paper and then more research, writing, and facilitating workshops together.
Steve: Bernadette brought a great depth of experience from the world of evaluation, while I brought new insights from the world of organization development. Together, we developed new methods of knowledge mapping that could lead to more useful evaluation results for understanding and improving programs.
JS. Given your definition of evaluation research as “research conducted to provide information for shaping effective programs, policies, and other actions,” what characteristics distinguish evaluation from other types of social science research?
Steve: An interesting question, Janet. Some researchers in various fields might argue with us, but there is actually quite a lot of overlap between evaluation research and other kinds of research in the social sciences. The main difference is the focus or topic. Evaluation research is very much focused on providing organizations with information to help them run their programs, policies, and initiatives more effectively, and to demonstrate their effectiveness to funders. The flip side of that focus is the need to understand “how the world works” — that is, what is going on in the surrounding community. That’s where other research comes in.
Bernadette: Interesting that you ask, because I recently wrote a short article about this on LinkedIn. Both social research and evaluation studies vary a great deal in their purpose and the types of questions they explore. So, we might think more in terms of a 2×2 grid than two categories.
- One dimension is the research purpose. Social research can be conducted for generalized knowledge or for a specific practical application. An example study for generalized knowledge would be a psychological study to test a theory about how people make decisions. An example study for practical application would be research that a membership organization might conduct to get information they need for planning their future services. Evaluation can be focused on practical application (such as an evaluation that provides feedback for developing and enhancing a program) or on measuring impact on pre-determined goals.
- Another dimension is the research topic. Evaluations focus on measuring specific programs, policies, and other activities by a specific organization or group. Other social research can focus on a wider set of topics, such as understanding public perceptions of an issue and understanding how people and social groups function.
All of these types of research and evaluation are valuable, depending on the specific situation. So, I like to focus less on what we call it and more on how to do research or evaluation that will help answer the questions that need answering.
JS. Your definition also frames evaluation as “to purposefully bring about social or environmental change.” Can you say more about that purpose? What, if any, characteristics distinguish evaluation conducted with a purpose for change from action research?
Steve: Evaluation and action research are both examples of research that is “applied,” rather than conducted for generalized knowledge. Action research would fit very neatly into the process we present in our book.
Bernadette: It depends on how the specific evaluation is done. Action research can be, and has been, used as an approach to evaluation. One example we cite in our book can be found here: http://grantcraft.org/wp-content/uploads/sites/2/2018/12/par.pdf Many evaluations use methods similar to action research — involving people who participate in the program in all aspects of the evaluation — without using that term. Evaluations can also be conducted in a non-participatory manner that is not focused on usefulness for action.
JS. With increasing demands for accountability and proof that funds are used appropriately and accomplish the identified goals, it would seem that the need for skilled evaluators must also be growing. What specific knowledge or skills are essential for professionals in the field of evaluation? How can students or career changers develop appropriate research knowledge to prepare for work in this field?
Steve: Great question Janet. Yes, there is a growing demand for good evaluators. In fact, it is something of a (true, yet painful) joke that some people are referred to as “Monday morning evaluators” because their boss walks in with a surprise assignment!
Those who want more preparation time for their career would do well to start by learning the basics. Luckily, many of the same research skills are found in a wide variety of fields (psychology, sociology, business, and so on). So, people entering the field may have already learned about a range of methods. Another helpful approach is to work with a seasoned researcher for your first few projects. You will want to know about the general context of your organization — what kind of work it is doing. And, getting down to brass tacks, what is the range of research methods open to you, and which methods might you use in which situations? Writing skills are a plus, along with strong ethical standards, project management (because every research project is… yes… a project), and the ability to think clearly and logically. Then there are interview skills and, more generally, communication skills. We could go on… but that should give people a taste. Anyway, they are all in the book.
Bernadette: For evaluators in the U.S., the American Evaluation Association recently published a set of evaluator competencies. Some of the evaluator professional associations in other countries also have their own sets of competencies. I’d advise new evaluators to develop their skills in those areas. Plenty of options are available for new and experienced evaluators alike to enhance their skills. Join your national and local evaluation professional association and attend their conferences and other learning events. Take some graduate courses, professional development workshops, or webinars in evaluation. Read evaluation books, journals, and blogs.
Steve: And, of course, our book.
Bernadette: Yes! [Laughs] The opportunities for learning more about evaluation are almost endless — much like the opportunities for improving our organizations and our planet.