In December we explored research and practice, and in January we are looking at the various roles researchers play, and the skills and mindsets needed to play them successfully. This interview with Julie Reeder, a researcher in a government setting, touches on both topics: she discusses the roles and skills needed to conduct research outside of academia.
JS: Please introduce yourself.
JR: Julie Reeder, PhD, MPH, MS, CHES. I am the Senior Research Analyst with the State of Oregon WIC Program, which is part of the Oregon Health Authority. WIC is a prevention-focused, public health nutrition program that serves lower-income pregnant and postpartum women, and children up to the age of five. Nationally, WIC serves just under half of all pregnant women each year and about 1 in 3 children before their fifth birthday. In my position, I conduct internal program evaluation as well as externally funded research projects.
JS: What research or evaluation methods do you use in your agency?
JR: The vast majority of my work uses mixed methods. Although we occasionally only need to quantify how many individuals have a certain nutritional risk or live in a particular zip code, providing services that align with the beliefs and lived experiences of our program population requires a creative mix of methods.
JS: What kinds of data collection do you use?
JR: Oregon WIC has its own live database in which participant visit information is captured. This allows us to pull basic surveillance data, such as the percentage of children at risk for overweight or the number of women with low hemoglobin, and to use these data to better understand the drivers of health disparities. As part of the State of Oregon, we may also access (with additional permissions) data from the birth certificate files and the Integrated Client Database. We can then link our WIC data with both of these additional data sets to gain a much more comprehensive understanding of the individuals who access our services, as well as those who are WIC eligible but do not participate in the program. For example, by linking to the Integrated Client Database, I was able to determine how many WIC participants were involved with the Department of Corrections during their pregnancy. This has led me to reach out to state and county corrections to improve coordination between our systems.
I also conduct many focus groups, key informant interviews, and in-the-field interviews with WIC participants and front-line staff. Because WIC has a supplemental food component in addition to its preventive health services, I also interview grocery store managers and cashiers, as well as farmers and farmers' market managers who supply the foods that are part of the WIC program. I use multiple qualitative approaches, including phenomenology, grounded theory, and feminist theory, as appropriate.
JS: How do you analyze the data you collect?
JR: The analysis plan depends on the outcomes of interest from the original evaluation or research design and on the type of data collected. As a program-based researcher, I try to balance the right amount of methodological rigor with ease of understanding. For example, I ask myself whether a "fancier" level of quantitative analysis will really provide us with extra certainty or greater clarity, or just create confusion for those who will translate findings into changes in practice. If I log-transform certain data, how will that be understood by those who need to interpret it? For qualitative data, how do I provide a sufficiently rich description while still honoring the confidentiality of those who shared their lived experiences? In short, I analyze the data I collect thoughtfully, with precision, and with pre-planned outcomes clearly in mind. I do not use my data for fishing expeditions.
JS: How do you report findings? What form(s) do your reports take?
JR: While I do write peer-reviewed journal articles, I also write a lot of reports for internal program use. In this case, less is more. Short, clear graphical displays of data are essential. Any complicated analytics require description in layman's terms. Policy or practice implications need to be specified and visibly called out. Limitations of the work need to be clearly identified.
JS: Are the findings discussed with partner agencies or stakeholders, or used for program improvement?
JR: Yes. We use them internally and also share them with our WIC Advisory Board, health care systems, and early learning partners.
JS: What research skills are needed for professionals in agencies or other non-academic settings?
JR: As I mentioned earlier, the key is to balance the methodological rigor taught in academia with the realities of program-based work. Much of what is researched in academia is driven by current trends in grant funding, the need to produce peer-reviewed publications, and the desire to grow findings into further funding opportunities. Program-based work centers on the immediate needs of the program, the requirement for quality improvement, and near-term changes in policies and practices. Translating research findings for program use requires brief, easily understandable, and timely displays of findings and implications.
JS: What do you recommend to students or career changers interested in working in this field?
JR: I would encourage anyone interested in program-based evaluation and research to look for internship opportunities to get hands-on experience. For those who are in academia and would like to partner with programs like WIC on future research projects, I would encourage them to approach thoughtfully and with an open mind about research topics. Too often I am approached by academic researchers who want a letter of support for a grant (and need it by tomorrow) for a project that is of no practical interest to our program, represents a substantial burden on our staff or participants, and shows a clear lack of understanding of WIC program basics. Unfortunately, I have to decline those requests. Taking the time to build a relationship with the program in which you are interested, understanding its key research and evaluation questions and how it likes to apply study findings, and identifying its unique organizational cultural values are key to building a true research collaboration. Although this takes much more time than sending a short email asking WIC staff to administer your lengthy survey, it is infinitely more likely to yield a positive response.
Image credit: “Inova” by Studio Ianus is licensed under CC BY-NC-ND 4.0