A crucial moment: Why we need to develop data-based decision-making skills in adult educators, and why we need to do it now.

In my previous essay I made the case that adult learning is different from child and adolescent learning. But is there anything in these differences that suggests we should take a different approach to researching and evaluating the field’s practices and policies? If one looks at the historical trends of research in adult education over the past 30 years, one might begin to think so. A tidal wave of qualitative research (interviews, focus groups, storytelling, narrative analysis) has inundated the field, displacing from graduate programs and key journals the once-dominant quantitative research paradigm (inferential/descriptive statistics, experimental/quasi-experimental designs, surveys) (Daley, Martin, & Roessger, 2018). In two recent reviews of published studies in adult education journals from Great Britain, Australia, and the U.S., a common conclusion emerged: the field has moved almost entirely toward qualitative research (Boeren, 2018; Fejes & Nylander, 2015).

But other fields that study cognitive and social phenomena involving adults have not followed suit. Psychology, behavioral economics, educational policy, neuroscience, sociology, and the learning sciences all remain primarily scientific fields grounded in quantitative research methodologies. The continued use of quantitative methodologies in disciplines that study adult phenomena suggests there is nothing inherent in the phenomenon of adult learning that requires our field to adopt a qualitative approach to research, unless, of course, we want to proffer the very tenuous claim that either adult learning shares no common ground with these fields or these fields just have it plain wrong. Both seem unlikely to me. What is likely is that over the past 30 years (about the length of an academic career) the culture surrounding adult education research has changed because some very influential scholars published some very influential theoretical texts espousing interpretivist, emancipatory, and relativist views of learning (mostly normative accounts lacking empirical support). A new generation of researchers was trained under these epistemological and methodological umbrellas, and those researchers in turn developed a next generation of scholars who share these ideas. To borrow a metaphor, what started out as a snowball has turned into an avalanche!

Adult education is creating its own island by digging a moat, and its graduate programs are handing the next generation of researchers and practitioners shovels.

The problem is that such a one-dimensional qualitative approach to research is now out of touch with how contemporary organizations make decisions and how other fields are advancing our understanding of human cognition. In a sense, adult education is creating its own island by digging a moat, and its graduate programs are handing the next generation of researchers and practitioners shovels to dig a wider and deeper moat. In an era of data-based decision making, interdisciplinary research, and team-based science it’s hard to see how this ensures the continued success of the field.

Consider, for instance, the results of a recent survey conducted by the National Association of Colleges and Employers (NACE, 2017). In all, 201 organizations that hire college graduates described the key attributes they sought on an applicant’s resume. The results are telling. Problem-solving skills ranked at the top of the list: 82.9% of respondents stated this was something they looked for. And just five spots down the list were analytic/quantitative skills, cited by 67.5% of respondents. It seems to me that in addition to attributes like work ethic, cooperation, leadership, and communication, employers are looking for people who can make data-based decisions to solve problems. Unfortunately, our field is no longer preparing its practitioners to do this, placing in jeopardy its historic relationship with industry as a developer of skilled training and development specialists, evaluators, recruiters, and instructional designers.

Others have done an excellent job detailing reasons why quantitative reasoning has become such a rarity in the field (see Boeren, 2018; Daley, Martin, & Roessger, 2018), so I won’t do so here. Instead, I’d rather highlight what the shift away from data-based decision making and quantitative reasoning is doing to the field. To illustrate this, let’s first consider the status of the field as one that seeks to improve practice and policy through research, an identity repeatedly championed by the field’s adopted phrase “research to practice.” Contrast this identity with that of an applied field (i.e., a trade) that aims to convey time-tested principles and practices (e.g., cosmetology, woodworking, funeral services). These disciplines are typically found in technical and community colleges, not universities. “Research to practice” disciplines, however, have a strong presence in universities, the idea being that the research and practice arms of the field inform one another. Problems of practice direct research, and research directs practice. That’s the idea, anyway.

We must adopt pragmatic strategies and research aims that are valued by practitioners and policy makers. In today’s world of big data, this involves quantitative reasoning and data-based decision making.

But I’m not convinced this is happening anymore in our field. An analysis of the field’s research, practice, and policy writings illustrated a complete disconnect between the language of research and the language of practice and policy (Roessger, 2017). What researchers in our field talk about is different from what practitioners and policy makers talk about. In a research-to-practice field, this is a problem. And when the relationship between research and practice breaks down, so does the perceived value, and thereby the presence, of the field’s research arm.

One needn’t scrutinize the list of Research-1 universities in the U.S. too closely to notice how few have adult education programs. Research-1 (R1) universities are doctoral degree-granting institutions with the highest level of research activity among universities. They hire faculty with strong research agendas and expect them to obtain grant funding to support their research. In the U.S., many are land-grant universities with an explicit mission to do the kind of research that improves the lived experiences of those in that university’s home state. By my count, there are 115 R1 universities in the U.S., and only 15 of them have programs that identify specifically as adult education or adult learning. Notable programs have closed over the years, such as those at the University of Wisconsin-Madison, the University of Michigan, Syracuse, and the University of Texas. And none, to my knowledge, has been created at an R1 during this time. Let’s also not forget R2 universities (universities with high research activity) and their recent program closings: the University of Wyoming, Northern Illinois University, and National Louis University. Contrast this trend for a moment with the closely related field of K-12 education, which has some semblance of a program in every R1 university. Even more alarming, when we look at the top 25 ranked colleges of education in the country (all of which are R1 universities), only three have programs in adult education. By any measure, this doesn’t bode well for a field that seeks to improve practice and policy through research.

So why is this happening? The answer is complicated and likely involves many factors. But I do think a principal factor is the field’s evolution into a primarily qualitative field concerned with learner perceptions and small-scale studies. Consider what happens in research-driven universities when a discipline uniformly adopts a mode of inquiry that is largely absent from more established, and better funded, disciplines: it becomes siloed. In an era of funded research, this is frankly not a healthy long-term strategy. Large funding organizations (e.g., the Institute of Education Sciences, the National Science Foundation, and the U.S. Department of Education) typically seek to fund interdisciplinary and team-based research. The more expertise listed on a grant application, the less risk the funder takes, because more expertise means more “skin in the game”: more checks and balances to ensure the project gets done and pursues attainable goals. But because most established social science disciplines are composed primarily of quantitative researchers, and most grants involve large-scale projects, adult education struggles to get involved (even though its reach extends to most disciplines and areas of social inquiry). Its researchers struggle to speak the research language of these other disciplines, and they often lack the skills and knowledge to work with large datasets or to design studies that minimize threats to internal and external validity, two concepts not acknowledged in the qualitative tradition. Gone are the days when Kellogg Foundation grants were handed out exclusively to adult education researchers. Now we must compete with everyone else. And to do so, we must adopt pragmatic strategies and research aims that are valued by practitioners and policy makers. In today’s world of big data, this involves quantitative reasoning and data-based decision making.

Modern scientific approaches to human research are calling for large-N studies to ensure researchers have the statistical power necessary to draw valid conclusions. They are calling for pre-registration of studies to ensure that researchers state their research questions and hypotheses before they go digging around in their data and potentially p-hack their results. They are calling for replications of findings to ensure that results aren’t a fluke or, worse, the product of dishonest research practices. These are the criteria commonly imposed on funded research proposals and on manuscripts submitted to influential journals with the ability to affect practice and policy. Yet these are all things we don’t train adult education researchers and practitioners to do.
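To make the large-N point concrete, here is a minimal sketch of an a priori power analysis, the kind of calculation funders and journals increasingly expect. It uses the standard normal-approximation formula for comparing two group means (the function name and the example effect sizes are my own illustration, not drawn from any particular study):

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate participants needed per group to detect a
    standardized mean difference (Cohen's d) in a two-sided,
    two-sample comparison of means (normal approximation)."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)  # critical value for the two-sided test
    z_beta = z(power)           # quantile corresponding to target power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Smaller expected effects demand dramatically larger samples:
print(n_per_group(0.8))  # large effect:  25 per group
print(n_per_group(0.5))  # medium effect: 63 per group
print(n_per_group(0.2))  # small effect:  393 per group
```

(The exact t-distribution calculation yields slightly larger numbers, but the pattern is the same: credibly detecting the modest effects typical of educational interventions requires large-N designs, not the small-scale studies our field favors.)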

It’s time to change course and adopt a more pluralistic approach to research in our field (Daley, Martin, & Roessger, 2018). The future success of our field depends upon developing more practitioners with quantitative reasoning and data-based decision making skills.


Boeren, E. (2018). The methodological underdog: A review of quantitative research in key adult education journals. Adult Education Quarterly, 68(1), 63-79.

Daley, B. D., Martin, L., & Roessger, K. M. (2018). A call for methodological plurality: Reconsidering research approaches in adult education. Adult Education Quarterly, 68(2), 157-169.

Fejes, A., & Nylander, E. (2015). How pluralistic is the research field on adult education? Dominating bibliometrical trends, 2005-2012. European Journal for Research on the Education and Learning of Adults, 6(2), 103-123.

NACE. (2017). The key attributes employers seek on students’ resumes. Retrieved from https://www.naceweb.org/about-us/press/2017/the-key-attributes-employers-seek-on-students-resumes/

Roessger, K. M. (2017). From theory to practice: A quantitative content analysis of adult education’s language on meaning making. Adult Education Quarterly, 67(3), 209-227.


Author: Kevin M. Roessger

I am a researcher and educator interested in advancing the adult learning sciences. I blog about evidence-based practices and policies for adult and lifelong learning.
