It’s been about a year and a half since my last post about my dissertation. Two weeks ago, I defended my dissertation, “Non-Response Bias on Web-Based Surveys as Influenced by the Digital Divide and Participation Gap.” I’ve included the abstract below if you’re interested in its content, but I’ll focus here on some of the process.
I originally intended to write a lot more in this blog about my dissertation-writing process, but my posts eventually petered out as I got further and further behind schedule. After a while, I refused to write about it, not only because I had nothing new or interesting to say but, more importantly, because I was simply ashamed to even bring up the topic. It took me about three years longer to finish this than it should have, and I can’t help but wonder how different my life and career would be if I had finished in a timely manner. I’m not sure why I avoided working on this for so long, but I know that all of the obstacles were internal and emotional. And I can’t tell you that I had any miraculous breakthroughs that let me finally finish, except for the fact that I was almost out of time.
The defense itself turned out almost exactly as expected. My committee requested only very, very minor edits that required the addition of just a few sentences. I had set aside the two days immediately following my defense for edits and the final submission, but I needed only a few hours to make those edits and a second round of minor typographical edits requested by my graduate school. They’ve accepted the document and forwarded it on to ProQuest for permanent archival, so I think I’m just waiting for a few random bits of paperwork to work their way through the systems before everything is completely, totally, and finally done.
I’m not sure what my next steps will be. I worry that the data are too old – Internet access and use data collected in 2010 – to be publishable. Of course, I have many ideas about how to conduct further data analysis and push this particular set of ideas further but I don’t think that anyone can be surprised that I’m a little bit burnt out on these specific ideas right now. I’ll be sure to write more here if I do any further work with this study.
I think that my colleagues, family, and friends are surprised that I’m not more celebratory about finishing my doctorate. The dissertation itself – conceptualization, data collection, analysis, and writing – was pretty easy for me, and it doesn’t feel much different from other studies I’ve completed. But the emotional drain of living with this immense, self-imposed, and emotionally puzzling weight for so long was so soul-sucking that I’m more relieved than happy or excited to finally be done. I’ll try to learn to celebrate later, but for now I’m enjoying just living without the shame and embarrassment I’ve hidden from everyone for several years.
Now that I’m done, I can begin to chip away at my large backlog of video games. I’ve tackled the problem of non-response bias on a Web-based survey but now I’m going to save humanity from aliens.
Abstract
Higher education scholars, policy makers, and administrators know little about the experiences of undergraduate students who matriculate with minimal experience with technology. It is often assumed that all students, particularly traditionally aged students, have significant experience with, knowledge of, and comfort with technology. Although that assumption is correct for many students, it is false for others. Despite the enormous increase in the use of Web-based assessment surveys and the increasing importance of accurate assessment and accountability data, those efforts may not be collecting adequate and accurate data about and from all students.
This study explores the non-response bias of first-year undergraduate students on a self-administered Web-based survey. First, data were collected with a supplemental survey added to the Beginning College Survey of Student Engagement (BCSSE). K-means clustering was used with this newly constructed Internet Access and Use survey to classify students according to their Internet access and use experiences. Second, demographic data from BCSSE and the Internet access and use data were included in a logistic regression predicting response to the subsequent National Survey of Student Engagement (NSSE).
The Internet Access and Use instrument proved to be a viable way to classify students along lines of their previous Internet access and use experiences. However, that classification played no meaningful role in predicting whether students had completed NSSE. Indeed, despite its statistical significance, the final logistic regression model provided little meaningful predictive power.
Generalizing the results of this study to all Web-based surveys of undergraduate college students with random or census sampling indicates that those surveys may not introduce significant non-response bias against students who have had less access to the Internet. This is particularly important since that population is already vulnerable in many ways, being disproportionately composed of first-generation students, underrepresented minority students, and students with lower socioeconomic status. This reassures assessment professionals and all higher education stakeholders that cost- and labor-efficient Web-based surveys are capable of collecting data that do not omit the voices of these students.
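If you’re curious what that two-stage analysis looks like in practice, here is a minimal sketch using Python and scikit-learn. Everything here is illustrative: the file name, column names, item prefix, and cluster count are placeholders I made up for the example, not the actual BCSSE/NSSE variables or the cluster solution from the dissertation.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Hypothetical data: one row per student, with Internet Access and Use
# survey items, a couple of BCSSE-style demographics, and a 0/1 flag
# indicating whether the student later responded to NSSE.
df = pd.read_csv("students.csv")  # placeholder file name

# Stage 1: cluster students by their Internet access and use experiences.
iau_items = [c for c in df.columns if c.startswith("iau_")]  # assumed item prefix
X_iau = StandardScaler().fit_transform(df[iau_items])
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)  # cluster count is illustrative
df["iau_cluster"] = kmeans.fit_predict(X_iau)

# Stage 2: logistic regression predicting NSSE response from demographics
# plus the cluster assignment (one-hot encoded). first_generation is
# assumed to already be a 0/1 indicator.
predictors = pd.get_dummies(
    df[["gender", "first_generation", "iau_cluster"]],  # placeholder demographics
    columns=["gender", "iau_cluster"],
    drop_first=True,
)
model = LogisticRegression(max_iter=1000).fit(predictors, df["responded_to_nsse"])
print(model.score(predictors, df["responded_to_nsse"]))  # crude in-sample fit check
```

The idea behind the two-stage design is that the cluster assignment stands in for a student’s position on the digital divide, so its coefficients in the logistic regression directly test whether that position predicts survey non-response.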