Discussions of method don’t often sound all that sexy, but I love them anyway. My first two sessions of the ICLS 2008 conference have been about method (maybe process is a better term). First, Yrjo Engestrom talked about formative interventions, an activity theory-style approach to research, and then Ilya Zitter described her process for using Educational Design Research in her doctoral work.
Not surprisingly, Engestrom railed against the “gold standard” of randomized controlled trials as the best and only way to properly conduct research. He mixed in a couple of jabs at the U.S. — one for emphasizing such studies and one for making unpopular interventions. I’m with him on both. Randomized controlled trials (RCT) shouldn’t be the gold standard for all kinds of research, and the U.S. shouldn’t have intervened in Iraq. At least not the way we did. But, I digress. I was talking about method.
In contrast to the positivist RCT program, Engestrom recommends a different process entirely. His process, we’ll call it formative interventions (that was on his slides), engages the research site as a participant in the project rather than as a passive recipient of a designed intervention. It differs from RCT (and even from Design Research — an approach gaining popularity in education research) in three main ways:
- starting point,
- process, and
- outcomes.
The starting point for formative interventions is poorly understood objects. RCT and design research start with some goal in mind. Having a goal presupposes that the goal is desirable. I dislike the arrogance behind starting a project from, “I know how it should be,” and so it’s no surprise that I like formative interventions’ starting point.
Engestrom calls formative interventions’ process “double stimulation.” That term doesn’t really work for me. I think what he means is that the research introduces and recognizes changes in the research environment over time. Whether those changes are planned by the researchers or not is not terribly important. The process of studying a changing phenomenon differs dramatically from the “execute, refine, repeat” approach RCT takes.
Lastly, the outcomes of the two methodological approaches differ. For Engestrom, the outcome should be “new activity concepts,” and for RCT, it’s “solutions.” I’m often frustrated by “solution” terminology — because I’m uncomfortable labeling social phenomena as broken, because I’ve seen too many “solutions” that don’t have clear “problems,” and because I just don’t see the world in such black-and-white terms.
So now we have an outline of Engestrom’s preferred methodological approach. I like it. It’s engaged, it’s rigorous, and it embraces the ongoing and changing nature of social situations. Trouble is, it’s hard to sell, especially in the U.S., and even harder to do.
Enter Ilya Zitter. Ilya is a PhD student at Utrecht University, and she uses a method she calls “Educational Design Research” in her doctoral work. Basically, she uses research, design, and practice approaches to study undergrads in a projects course. Hooray for higher education at ICLS! It’s almost as satisfying for me to engage with as adults’ informal and workplace learning. Anyway, Ilya gave a short talk in a firehose session where she described how she conducted her research. This is exactly the kind of talk I like to attend at conferences. I can read papers, but papers about how the research was conducted are hard to come by. Sure, papers include methods sections, but those don’t often tell you the nitty-gritty details. Ilya talked about her struggle to balance research, design, and practice in her work. This is a struggle I get to avoid in my dissertation but which is central to my life at Microsoft Research.
At MSR, we’re engaged in a formative intervention study of sorts. We’re working with HR and managers to adjust social and technological tools used in onboarding at Microsoft. I’m often uncomfortable in the “design” and “intervene” portions of such studies. I much prefer to be a fly on the wall. That’s not immediately useful (or publishable) though. I, like Ilya, am struggling to find balance and to negotiate relationships among researchers and practitioners all while gathering and analyzing data. It’s hard, but at least I’m not alone.