Part One: The Need for Equity Approaches in Quantitative Analysis

by Lois Joy, Ph.D., Research Director for Jobs for the Future, and member of the CTE Research Network Equity Working Group.

See: Part Two: Equity Approaches in Quantitative Analysis 


To support equity in CTE research, the Equity Working Group of the Career and Technical Education Research Network (CTERN) authored an Equity Framework for CTE Research, which provides principles and practices for researchers on equity questions, designs, and implications throughout the research process. While the guide was written with a specific focus on Career and Technical Education (CTE) research, the ideas within reflect the expertise of a wide range of education researchers, and researchers across the social sciences could implement most, if not all, of its recommendations. This is part one of a two-part blog post on quantitative data analysis: part one discusses the need for equity considerations and approaches in quantitative data analysis, and part two recommends practices and approaches that support equity. 

For the quantitative analysis portion of the CTE Equity Framework, we drew on the emerging fields of quantitative criticalism, QuantCrit, and critical quantitative analysis to explore the need for, and approaches to, equity in quantitative data analysis. These approaches emerged in recent years in a body of scholarship that examines the impacts of structural racism and sexism on quantitative data analysis and policy recommendations in education and other fields, including public health, sociology, and economics. New scholarship has begun to explore the nuances and differences among these approaches and their philosophical lineages, which include critical race theory, Black feminism, conflict theory, class struggle, and feminist economics.1 Our approach in the CTE Framework was to draw high-level insights from this body of work to inform equity in CTE data analysis that can apply to groups of people who may face systemic barriers to CTE participation, including, for example, Black and Hispanic learners, people who identify as women or nonbinary, and people with disabilities. 

Critical quantitative theories have drawn attention to the ways that bias becomes embedded in quantitative analysis. Bias here refers to preconceived assumptions about people of different races, ethnicities, and genders that, made consciously or unconsciously, become embedded in data analysis, leading to findings and policies that perpetuate rather than alleviate structural barriers to access, advocacy, and the resources needed for success or advancement. These biases creep into all aspects of data analysis, from coding and cleaning the data to estimating models and interpreting the findings. The goal of an equity lens is to build awareness of these biases and to develop tools and practices that reduce them. A first step for researchers who want to reduce social, educational, and economic injustice rather than reproduce structural inequalities is to uncover the sources of these biases in every aspect of data analysis. 

At its root, quantitative data analysis consists of categorizing inputs and outputs into relevant groupings and drawing connections between them. None of this is self-evident from the data itself; it is subject to the assumptions and hypotheses that individual researchers bring to the analysis. The data don’t objectively “speak for themselves” but are shaped, merged, and connected by researchers who make many decisions along the way about how to do so. Systematically interrogating the assumptions in our data analysis helps us uncover our own blind spots.

A key insight of critical quantitative theories and related scholarship is that categories of gender, race/ethnicity, and other socially constructed demographic groupings are not only social identities; they also capture the effects of structural systems of oppression on outcomes, effects that are hard to observe and tease out of limited datasets. More specifically, coefficients on gender, race/ethnicity, and other demographic categories represent the impact of sexism, racism, and other “isms” that emerge from social, educational, and economic processes and power dynamics.3 “Controlling for” differences that are associated with demographic variation may, in fact, mean modeling away (and thus losing sight of) the impacts of those “isms.” 
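To make this point concrete, here is a minimal sketch, not an example from the Framework, using synthetic data in Python with pandas and statsmodels. The variable names (outcome, female, prior_access) are hypothetical. It shows how the coefficient on a demographic indicator can shrink toward zero when the model “controls for” a resource that structural barriers have already made unequally available, so the disparity is absorbed by the control rather than explained.

```python
# Illustrative sketch only: synthetic data showing how "controlling for" a
# variable that is itself shaped by structural barriers can model away a gap.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical variables: 'female' is a demographic indicator; 'prior_access'
# stands in for a resource (e.g., earlier program access) that structural
# barriers make less available to one group.
female = rng.integers(0, 2, size=n)
prior_access = rng.normal(loc=-0.5 * female, scale=1.0, size=n)

# The outcome depends only on prior_access; the demographic gap in the outcome
# is produced entirely through the structurally constrained resource.
outcome = 2.0 * prior_access + rng.normal(scale=1.0, size=n)

df = pd.DataFrame({"outcome": outcome, "female": female,
                   "prior_access": prior_access})

# Model 1: the coefficient on 'female' captures the total disparity (about -1.0).
m1 = smf.ols("outcome ~ female", data=df).fit()

# Model 2: adding the structurally shaped control shrinks the 'female'
# coefficient toward zero -- the disparity has not disappeared; it has been
# absorbed by the control variable.
m2 = smf.ols("outcome ~ female + prior_access", data=df).fit()

print(m1.params["female"], m2.params["female"])
```

Neither regression is mechanically wrong; the equity question is interpretive. Reading the second model as evidence of “no gap” would model away the structural barrier that produced the gap in the first place.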

Making the structures of oppression visible is the challenge for quantitative researchers (and the subject of part two of this blog). When this is not possible, researchers must be transparent about the limitations of the analysis. Making structural barriers more visible in the analysis will also help address what has historically been a bias toward “deficit approaches” in education research, which blame learners for gaps in educational outcomes while leaving the impacts of systemic barriers unexamined.4 To more accurately capture the lived experiences of individuals affected by sexism, racism, and other forms of discrimination, researchers should use strategies that minimize these biases and clearly articulate the limitations of quantitative analysis. Mixed-methods approaches, which add qualitative data on these lived experiences, are a key strategy for overcoming those limitations. 

This is the fifth in an eight-part blog series on the Equity Framework for Career and Technical Education Research authored collaboratively by the CTE Research Network’s Equity Working Group and previously published by the American Institutes for Research. 


More Methodspace Posts about Equity and Research

Previous

Part Two: Equity Approaches in Quantitative Analysis 

Next

Teaching and learning quantitative research methods in the social sciences