
Qualitative Data Analysis Using Office

It seems every time I embark on the qualitative data analysis (QDA) stage of a project, I get frustrated by QDA software. My approach to data analysis is iterative and collaborative. Existing QDA software makes both of those approaches obscenely difficult (if possible at all). Hence my frustration.

I want to be able to edit transcripts even after I’ve loaded them into the software (e.g., if I notice an error in the transcription). I also need to be able to share my work with colleagues who use a variety of devices, including PCs, Macs, iPads, and Android phones.

Andy Begel and I relied on OneNote, and that seemed to work better than Atlas.ti, NVivo, or HyperRESEARCH, and it came with my version of Office. Trouble is, now I’m working on a Mac, and there’s no OneNote for Mac. I liked the idea of doing all my analysis in Office so that it would be easy for me to share with my collaborators regardless of the software they had available (since Google Docs can handle the basic functions of Office). I also liked that Office would give me direct control over the links between codes and data so that I could change either on the fly.

Daniel Meyer and Leanne Avery published an article in Field Methods a couple of years ago about using Excel for QDA. Sadly, I didn’t notice until today that Dr. Meyer is a colleague of mine at IIT. Anyway, starting with their article, and the assurance of some friends that yes, I could do this all in Excel, I set off to DIY myself a QDA software solution.

I’m dealing with interview data, and I had all the interviews transcribed by Scribie. Scribie transcripts contain information in both the header and footer, have a cover page, and end with a rating page. None of that matters for my analysis. What I want instead is a file of the form “timestamp TAB speaker TAB talk” that I can copy and paste into a corresponding Excel sheet with three columns. Here are the steps for using Word to prepare a Scribie transcript for use in Excel (if you’d rather script it, a rough sketch follows the list):

  1. Save DOCX file as TXT
  2. Remove the header, footer, Scribie cover, and Scribie rating sheet
  3. Replace “S1:” and “S2:” with correct names – use “^tNAME^t” to add tabs before and after
  4. Replace Scribie line breaks (“^l^l”) with paragraph breaks (“^p”)
  5. Remove line breaks between the end of a paragraph and Scribie notes such as “[laughter]” (Find: “^p[“, Replace: “[“)
  6. Select all, copy, paste into Excel
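
If you’d rather script this cleanup than click through Find and Replace, here is a rough Python sketch that does roughly the same thing (you would still trim the cover page and rating sheet by hand). It assumes the exported TXT has one turn per line in the form “timestamp S1: talk” and that stray lines, such as Scribie’s “[laughter]” notes, belong to the previous turn; the filename and speaker names are placeholders.

    # Rough scripted alternative to the Word find-and-replace steps above.
    # Assumes each turn starts with a timestamp and a speaker tag, e.g.
    # "00:05:12 S1: Yeah, so..."; adjust the pattern to match your export.
    import csv
    import re

    SPEAKERS = {"S1": "Interviewer", "S2": "Participant"}  # placeholder names

    TURN = re.compile(r"^(\d{1,2}:\d{2}(?::\d{2})?)\s+(S\d+):\s*(.*)$")

    rows = []
    with open("interview01.txt", encoding="utf-8") as f:    # placeholder file
        for line in f:
            line = line.strip()
            if not line:
                continue
            m = TURN.match(line)
            if m:
                time, tag, talk = m.groups()
                rows.append([time, SPEAKERS.get(tag, tag), talk])
            elif rows:
                # stray lines (Scribie notes, wrapped text) get glued onto
                # the previous turn, like step 5 above
                rows[-1][2] += " " + line

    with open("interview01.tsv", "w", newline="", encoding="utf-8") as f:
        csv.writer(f, delimiter="\t").writerows(rows)       # paste into Excel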

You should now have three full columns in Excel – time, speaker, and comment. This approach treats each turn Scribie identifies as a single speech event. Sometimes that produces a long paragraph, sometimes just a single word. At this stage in my coding, I do not need to code line by line, so this approach works for me. If you want to code each line, you can do some Find and Replace operations to make “time TAB speaker TAB sentence” lines based on the sentence’s punctuation. For instance, replacing “.” with “.^p” would put each sentence on a new line. Getting the first two columns onto that same line is the tricky part, though; one way to handle it is sketched below.
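
Here is a rough sketch of that line-by-line split, assuming you already have the tab-separated file from the step above. The sentence boundaries are naive (periods, question marks, and exclamation points), so abbreviations like “Dr.” will trip it up, and the filenames are placeholders.

    # Split each turn into sentences and repeat the time and speaker columns
    # on every resulting row. Naive sentence boundaries; adjust as needed.
    import csv
    import re

    SENTENCE_END = re.compile(r"(?<=[.?!])\s+")

    with open("interview01.tsv", encoding="utf-8") as src, \
         open("interview01_lines.tsv", "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst, delimiter="\t")
        for row in csv.reader(src, delimiter="\t"):
            if len(row) != 3:
                continue                  # skip anything that isn't time/speaker/talk
            time, speaker, talk = row
            for sentence in SENTENCE_END.split(talk):
                if sentence.strip():
                    writer.writerow([time, speaker, sentence.strip()])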

Now that I have all the comments in Excel, I can easily attach codes to specific sections of speech. To do this, I have created a separate sheet for my Codebook where column 1 is the code’s label, and column 2 is the code’s definition. In my main sheet, with the transcript data, I add three columns for “Primary,” “Secondary,” and “Tertiary” codes. Excel’s Data Validation feature enables me to select codes from a drop-down in each of those columns. As I add codes and definitions, the drop-down list updates to include them.
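
I set the drop-downs up by hand (Data Validation set to a List whose source is the Codebook column), but if you wanted to script that step, say to apply the same codebook to a stack of transcript workbooks, an openpyxl sketch like the one below would do it. The sheet names, column letters, and row counts are assumptions to adjust for your own workbook.

    # Sketch of scripting the drop-down setup with openpyxl. The workbook,
    # sheet names, and column layout below are assumptions, not a recipe.
    from openpyxl import load_workbook
    from openpyxl.worksheet.datavalidation import DataValidation

    wb = load_workbook("interview01.xlsx")      # placeholder workbook
    ws = wb["Transcript"]                       # main sheet with the turns

    # one list validation pointed at the code labels on the Codebook sheet
    # (older Excel versions may want a named range instead of a sheet reference)
    dv = DataValidation(type="list",
                        formula1="Codebook!$A$2:$A$200",
                        allow_blank=True)
    ws.add_data_validation(dv)

    # apply to the Primary, Secondary, and Tertiary code columns
    # (assumed here to be columns D, E, and F)
    for col in ("D", "E", "F"):
        dv.add(f"{col}2:{col}1000")

    wb.save("interview01.xlsx")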

Once I have all my transcripts and codes in Excel, I can go merrily along, choosing codes from the drop-down list without having to navigate to a new window. If I need to edit a comment (as often happens when interviewees mumble), I can do so without affecting my codes. I can easily go back to the recording, listen for the part that Scribie missed, update it in Excel, and continue on my merry way. The QDA software tools all require that you do your data cleaning and editing before you start coding, and that’s just not the way I work. I code and clean, clean and code, as I go along.

This Excel approach lacks some of QDA software’s more advanced features, but I find that thoughtful Pivot Table use can make up for most of them. The upside is that I already know how to work with Pivot Tables, so I don’t have to sit through those webcasts on how to use NVivo and HyperRESEARCH. Data analysis is hard enough without software getting in the way. If you were looking for a straightforward DIY solution, I hope this post helps.
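
As a postscript, here is one example of the kind of summary a Pivot Table gives you: a count of how often each primary code was applied, broken out by speaker. The pandas sketch below reproduces the same count for anyone who ends up outside Excel; the filename, sheet name, and column names are placeholders matching the layout described above.

    # Postscript sketch: a code-frequency count, the kind of summary a
    # Pivot Table produces, reproduced with pandas. Placeholder names.
    import pandas as pd

    df = pd.read_excel("interview01.xlsx", sheet_name="Transcript")

    # how often each primary code was applied, broken out by speaker
    counts = df.pivot_table(index="Primary", columns="Speaker",
                            aggfunc="size", fill_value=0)
    print(counts)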

