Dissertation Journal: Reframing the First Two Chapters

In class today, John Bean explained the structure of a quantitative dissertation and reframed the first two chapters in an interesting way.  The way I’ve thought of them, and the way they’ve been explained to me in the past, is that the first chapter is the introduction and the argument for the importance of the study while the second chapter is the literature review.  John suggests that for a quantitative study, the first chapter argues for the importance of the dependent variable and the second chapter argues for the importance and inclusion of the independent variables.

Exactly how I write and organize my dissertation will be determined by my committee, but this is a very interesting way of thinking about the first two chapters.

Dissertation Journal: Dissertation Synopsis

A few weeks ago, I wrote a brief synopsis of my current dissertation topic.  With the help of some of my wonderful friends, I have continued to revise this synopsis.  Here is the current version, the version I am sending to my advisor for him to use in writing my second-day question on my qualifying exam.

Higher education scholars, policy makers, and administrators know little about the experiences of undergraduate students – traditional and non-traditional – who matriculate with minimal experience with or knowledge of technology.  It is easy to assume that all students, particularly traditionally-aged students, have significant experience with, knowledge of, and comfort with technology.  For many students, that assumption is correct.  But that assumption is false for some students, and it is likely that those students have different and possibly difficult experiences, especially during their first year.

Although little is known about these students, there are tantalizing glimpses.  National surveys of institutions or students indicate that a significant number of students do not own a computer.  National surveys of students have reported different numbers of students without computers, from 1.2% of respondents (Smith, Salaway, & Caruso, 2009) to 2.7% of respondents (Junco & Mastrodicasa, 2007).  EDUCAUSE member institutions report that between 10% and 20% of their students do not own computers (EDUCAUSE, 2009).   Similarly, studies have revealed differences in how college students and youths use computers, differences that are significantly influenced by economic and cultural factors such as how easily and often they can use Internet-connected computers (Palfrey & Gasser, 2008; Ito et al., 2010; Watkins, 2009).  So it is clear that there are some students who neither own computers nor use them in ways that most of their peers use them.

But no one knows how many of these students are on American college campuses.  Little is known about who they are.  And very little is known about their experiences and how their technological aptitude is shaping their academic and social experiences.  Moreover, no one knows if our current methods of assessment – methods that often rely exclusively on web-based surveys advertised via e-mail – are gathering adequate information from these students and adequately representing their experiences, opinions, and needs.

This study will explore the response rates of first-year undergraduate students to a self-administered web-based survey.  Specifically, this study will examine the impact of those students’ previous computer ownership, access, and use on their response rate.  The specific research questions guiding this dissertation are:

RQ1: In this sample of first-year students at American institutions of higher education, how many students have matriculated from environments in which they had substantially different patterns of Internet-connected computer ownership, access, or use?

RQ2: Do those students exhibit a significant non-response to a Web-based survey advertised primarily through e-mail?

To answer the first question, I will construct a brief survey of previous computer ownership and use to be administered to students participating in the on-campus, paper administration of the Beginning College Survey of Student Engagement (BCSSE).  I will answer the second question using data from the same institutions that participate in the web-based version of the National Survey of Student Engagement (NSSE).  Using the data obtained from BCSSE, a linked-records approach (Porter & Whitcomb, 2005), I will be able to see whether students with less exposure to technology respond in proportional numbers.
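To make the linked-records idea concrete, here is a minimal sketch of the kind of check I have in mind, written in Python with pandas.  The file and column names (student_id, low_tech_exposure, and so on) are placeholders I made up for illustration, not the actual BCSSE or NSSE fields, and the real analysis will depend on what the survey administrators can share.

```python
# Minimal sketch of the linked-records check, assuming hypothetical
# file and column names; the real BCSSE/NSSE fields will differ.
import pandas as pd
from scipy.stats import chi2_contingency

# BCSSE paper administration: one row per student, with a flag derived
# from the added items on prior computer ownership, access, and use.
bcsse = pd.read_csv("bcsse_with_tech_items.csv")   # student_id, low_tech_exposure (0/1)

# NSSE web administration at the same institutions: IDs of respondents.
nsse = pd.read_csv("nsse_respondents.csv")         # student_id

# Link the records: a student "responded" if their ID appears in the NSSE file.
bcsse["responded"] = bcsse["student_id"].isin(nsse["student_id"]).astype(int)

# Compare response rates for students with and without prior technology exposure.
print(bcsse.groupby("low_tech_exposure")["responded"].agg(["mean", "count"]))

# Simple test of whether the two groups respond in proportional numbers.
table = pd.crosstab(bcsse["low_tech_exposure"], bcsse["responded"])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```

This is only the crudest version of the comparison; the actual analysis will need to account for the demographic and institutional differences that are already known to shape survey response.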

If the population is sufficiently diverse, I expect to find a significant number of students who have had less experience with technology than the majority of their peers.  I also expect that those students will be disproportionately from lower socioeconomic backgrounds and racial/ethnic minority groups.  Finally, I expect to find a small but significant non-response bias in the web-based version of NSSE, a finding that may be generalizable to other web-based self-administered surveys.

References

EDUCAUSE. (2009). EDUCAUSE Core Data Service Fiscal Year 2008 summary report.  Boulder, CO: Author.

Ito, M., et al. (2010). Hanging out, messing around, and geeking out. Cambridge, MA: MIT Press.

Junco, R., & Mastrodicasa, J. (2007). Connecting to the net.generation. Washington, D.C.: NASPA.

Palfrey, J., & Gasser, U. (2008). Born digital: Understanding the first generation of digital natives. New York, NY: Basic Books.

Porter, S. R., & Whitcomb, M. E. (2005). Non-response in student surveys: The role of demographics, engagement and personality. Research in Higher Education, 46(2), 127–152.

Smith, S. D., Salaway, G., & Caruso, J. B. (2009). The ECAR study of undergraduate students and information technology, 2009.  Boulder, CO: EDUCAUSE.

Watkins, S. C. (2009). The young & the digital: What the migration to social-network sites, games, and anytime, anywhere media means for our future. Boston, MA: Beacon Press.

Dissertation Journal: One More Construct

The current draft of my lit review has two major constructs: survey non-response bias and the digital divide (which includes the participation gap as a sub-construct). I need to add socio-technical systems as another construct since it’s the fundamental lens through which I am viewing and approaching this work. I intended to add this to the first draft but didn’t have time.

A week-and-a-half ago, I realized that I need to add another construct: youths’ use of technology. The work that has been done – particularly the qualitative work, much of it funded by the MacArthur Foundation in the past 5 or so years – is also fundamental to this study. But unlike socio-technical systems, I had not considered formally discussing this work as part of my dissertation. Given its central role in shaping my thinking and understanding, I have to add it.

One or both of these new constructs may end up in the first chapter or even the fifth chapter, but it’s clear that they need to be added. I guess I’m definitely still at the stage where I keep finding more and more things to do or add. I just don’t know yet when I’ll be able to do this given my course load and qualifying exam next month. And that’s one of the main reasons for keeping this journal – so I know what to do when I find time to do it.

Dissertation Journal: Lots of Work To Do

In the past week, I’ve met with my chair-to-be and the project manager for one of the surveys on which I will be piggybacking my (first?) survey instrument.  Things are still looking good, but I need to get working on my survey instrument.  I need to create the instrument, send it off to some experts for feedback (content validity), formulate my plan to pilot the instrument, clear that plan with the IRB, and execute it so I can revise the instrument as necessary.  All in the next month or two.

Like many faculty in my department, my chair wants me to have a solid draft of each of my first three chapters (introduction, literature review, and methodology) before I can defend my proposal.  I have some thoughts on how to approach my first chapter since some of what I found in conducting my initial review of the literature will end up there instead.  I have a draft of chapter two, but I need to add more material and work on the structure and organization.  I already knew that I need to add material related to my theoretical lens (socio-technical systems), although I may end up having to move that to chapter one.  But the other day I realized that I also need to add material related to youths’ use of technology since the qualitative work that has been done in the past few years forms much of the foundation for what I’m doing, especially my survey instrument.  The third chapter should work itself out easily since survey non-response is well studied and I have some good paths to follow.

And I have to work on much of this as I finish up my last three classes and take my qualifying exam next month.  It’s going to be a busy semester…

Dissertation Journal: The Basic Idea

This post will serve two purposes.  First, it will be useful to have one post that explains the basic, big ideas of my dissertation.  Second, our qualifying exam is structured such that our advisor writes the second question, and my advisor has asked me to summarize my dissertation ideas and thoughts so he can use them to write that question.  So this post will be a good first draft of that summary.

I haven’t yet figured out how to work the big theoretical constructs that inform this research – socio-technical systems, the digital divide, and the participation gap – into this brief, non-technical summary.  I’m not even sure that I *should* work those ideas into such a short, non-technical summary although they play key roles in my research.

I am interested in the experiences of undergraduate students – traditional and non-traditional – who come to our campuses with little experience with or knowledge of technology.  We too often assume that all students, particularly traditionally-aged students, have significant experience with, knowledge of, and comfort with technology.  For many students, that assumption is correct.  But that assumption is likely false for some students, and those students probably have different and possibly difficult experiences, especially during their first year.

Although little is known about these students, there are tantalizing glimpses.  The 927 institutions surveyed by EDUCAUSE in 2008 for the Core Data Service reported that between 80% and 90% of their students own their own computers, indicating that between 10% and 20% do not (EDUCAUSE, 2009).  In its most recent study, ECAR reports that 98.8% of the 30,616 students at 115 colleges and universities who participated in its survey (ECAR researchers also conducted focus groups) reported owning a computer (Smith, Salaway, & Caruso, 2009).  In their 2007 Net Generation survey of 7,705 undergraduate students at seven institutions, Junco and Mastrodicasa reported results similar to ECAR’s, with 97.3% of their respondents indicating that they own a computer.  It’s worth noting that both of those studies relied on web-based surveys, possibly inflating their computer-ownership results.  Studies of Facebook usage among college and university students have yielded similar results (Ellison, 2007).  So it is clear that there are some students who neither own computers nor use them in ways that most of their peers use them.

But we don’t know how many of these students are on our campuses.  We don’t know who they are.  And we don’t seem to know anything about their experiences and how their technological aptitude is shaping their academic and social experiences.  We don’t even know if our current methods of assessment – methods that often rely exclusively on web-based surveys advertised via e-mail – are gathering adequate information from these students.

To keep this study manageable, give it direction, and capitalize on resources I have at hand, I will focus on two specific questions:

RQ1: Are there an appreciable number of students in (this sample of) American institutions of higher education who have matriculated from environments in which they had little or no access to the Internet?

RQ2: If “yes” to RQ1, do those students exhibit a significant non-response (after controlling for confounding variables) to a Web-based survey advertised primarily through e-mail?

To answer the first question, I will construct a brief survey of previous computer ownership and use to be administered to some students participating in the Beginning College Survey of Student Engagement (BCSSE).  The precise nature of the survey instrument and the number of students and institutions that will be invited to participate are both undetermined right now.  The second question will be answered using data from the same institutions that participate in the web-based version of the National Survey of Student Engagement (NSSE).  Using the data obtained from BCSSE, I will be able to see whether students with less exposure to technology respond in numbers proportionally similar to other students, a linked-records approach (Porter & Whitcomb, 2005).
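As a rough illustration of what “controlling for confounding variables” in the second research question might look like, here is a hedged sketch of a logistic regression of NSSE response on the technology-exposure flag plus a few demographic controls.  The variable names and the control set are placeholders of my own, not the actual BCSSE or NSSE fields, and the final model specification will depend on what data the survey administrators and institutions can provide.

```python
# Hypothetical sketch only: variable names and controls are placeholders,
# not the actual BCSSE/NSSE fields or the final model specification.
import pandas as pd
import statsmodels.formula.api as smf

# One row per BCSSE participant, with responded = 1 if the student later
# completed the web-based NSSE and 0 otherwise.
linked = pd.read_csv("bcsse_nsse_linked.csv")

# Does low prior technology exposure predict non-response to the web-based
# NSSE after adjusting for a few demographic characteristics?
model = smf.logit(
    "responded ~ low_tech_exposure + C(race_ethnicity) + female + first_generation",
    data=linked,
).fit()
print(model.summary())
```

A significant negative coefficient on low_tech_exposure would be the kind of evidence of non-response bias I expect to find.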

If my population is sufficiently diverse, I expect to find a significant number of students who have had less experience with technology than the majority of their peers.  I also expect that those students will be disproportionately from lower socioeconomic backgrounds and racial/ethnic minority groups.  Finally, I expect to find a small but significant non-response bias in the web-based version of NSSE.

(If one wanted to be crass, which I want to be on occasion, this study could be summed up as another “Are we ignoring or screwing poor students?” study.  Of course, I can’t write that in my proposal or my dissertation although it is accurate.  Another way of describing the study, especially the second research question, is to compare our web surveys with surveys administered via telephone, surveys which obviously don’t capture information from people without phones.)

References

EDUCAUSE. (2009). EDUCAUSE Core Data Service Fiscal Year 2008 summary report.  Boulder, CO: Author.

Ellison, N. (2007). Facebook use on campus: A social capital perspective on social network sites.  Presentation at the Sixth Annual ECAR Symposium, Boca Raton, FL.

Junco, R., & Mastrodicasa, J. (2007). Connecting to the net.generation. Washington, D.C.: NASPA.

Porter, S. R., & Whitcomb, M. E. (2005). Non-response in student surveys: The role of demographics, engagement and personality. Research in Higher Education, 46(2), 127–152.

Smith, S. D., Salaway, G., & Caruso, J. B. (2009). The ECAR study of undergraduate students and information technology, 2009.  Boulder, CO: EDUCAUSE.

Dissertation Journal: First Entry

I’m going to start keeping a public dissertation journal.  In addition to some of the obvious benefits (publicly associating my name with my ideas, obtaining early feedback, etc.), I hope that this will add some element of accountability if I take public stances on ideas and deadlines.  I’ll begin the title of every entry with “Dissertation Journal:” and place each entry in the “Dissertation Journal” category so you can easily skip these posts if you’re not interested.

I am about to begin my final semester of coursework in my Ph.D. program.  In my program, we have two mandatory classes that directly relate to the dissertation: literature review and proposal preparation.  I completed the lit review class last semester and have a good start on what will become Chapter 2 of my dissertation.  I still have a lot of work to do with the lit review, not only to take into account the comments I received last semester but also to add new material and craft the lit review to meet the expectations of my specific committee.  I am taking the proposal class this semester and I hope to come out of that class with a solid draft of Chapter 1, the chapter that introduces the study and justifies it by explaining why it’s important and interesting.

I have commitments from four people who have agreed to serve on my committee, including a chair.  I will not be able to formally put together the committee until I’m done with my coursework and nearly ready to defend my proposal, but it’s reassuring to know that the people on board are not only exceptionally knowledgeable but also excited about my research.  I’m also planning to take my qualifying exam next month (yes, in the middle of a semester in which I am also taking three courses; perhaps not the best way to do things, but I want to get it done quickly so I can move on).

By the end of this semester I should have: completed my coursework, taken and defended my qualifying exam, and completed solid drafts of Chapters 1 and 2.  Chapter 3 – methodology – should be easy to write after all of that work.  Further progress – really good versions of Chapters 1-3 and formal defense of my proposal – is possible depending on my committee and available time and energy.