Dissertation Journal: Final Instrument Designed and Printed

My survey instrument has been through its final design pass and I’ve printed the first set.  That final pass moved things around a bit to make the instrument easier to scan and added a blank space for the survey ID.

The final instrument was not delivered to me in the format I expected.  I thought it would be in a format that would allow me to easily perform a mail merge to insert the survey IDs for each survey.  Since my instrument is physically a separate sheet of paper, I have to be able to match it to the main BCSSE survey instrument (because I’m using demographic information from BCSSE).  To do that matching, I am printing survey IDs on each of my surveys that correspond to the survey IDs already printed on the BCSSE surveys.  I am then physically inserting my survey into the BCSSE survey (which is an 8.5×17 sheet folded in half).  Having the same ID on both instruments will ensure that we can match the responses even if the two instruments are separated after the students complete them.

The final instrument as I received it is exactly what is linked above.  It has a blank for the survey ID, but how do I insert the ID?  More specifically, how do I easily insert IDs for hundreds of surveys?  Here is what I’m doing:

  1. Create an Excel spreadsheet with the survey IDs I need to insert (each participating BCSSE institution is pre-assigned a range of survey IDs so I know them in advance).
  2. Create a mail merge document in Word.  This document has only one thing in it: a mail merge field in the bottom right corner.  This document uses the previously-created spreadsheet as its data source and the field is placed on the document such that it lines up with the blank Survey ID space on the pdf of the survey instrument.
  3. Complete the mail merge, generating a Word document that consists of a bunch of blank pages with survey IDs in the bottom right corner.
  4. Print the mail merged document to pdf.
  5. Open the new pdf and add the final survey instrument as the background of each page.

This is not the most straightforward way to create a mail-merged document, but given my situation it’s not bad.  Although it’s a tiny bit convoluted, it works reasonably well.  It might not work if the placement had to be more precise (i.e., inserting new text into a block of existing text), but since my target is a large empty box I don’t have to worry about precision.

(I was initially guided to this process, specifically the idea of using the instrument as a background image on another pdf, by this two-year-old thread on MacRumors.com.)

I’ve used this process to print the first set of 375 surveys to be mailed later today or tomorrow to an institution in New England.  It should work well for the other 3885 surveys I will be mailing to nine other institutions in the coming months.
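
If I ever want to skip the Word and Acrobat round trip for the later batches, the same overlay could probably be scripted.  Below is a minimal sketch in Python using the pypdf and reportlab libraries.  The file names, the “survey_id” column, and the coordinates of the Survey ID box are placeholders I made up, so everything would need to be checked against the real instrument before trusting the output.

    # Rough sketch, not my actual workflow: stamp each pre-assigned survey ID
    # onto a fresh copy of the one-page instrument pdf.
    import csv
    import io

    from pypdf import PdfReader, PdfWriter          # pip install pypdf
    from reportlab.lib.pagesizes import letter
    from reportlab.pdfgen import canvas             # pip install reportlab

    INSTRUMENT = "final_instrument.pdf"   # hypothetical name for the instrument pdf
    ID_FILE = "survey_ids.csv"            # hypothetical file with a "survey_id" column
    OUTPUT = "numbered_surveys.pdf"

    # Read the pre-assigned survey IDs (the same list I keep in the spreadsheet).
    with open(ID_FILE, newline="") as f:
        ids = [row["survey_id"] for row in csv.DictReader(f)]

    # Build an in-memory pdf with one page per ID, each ID drawn roughly where
    # the blank Survey ID box sits (coordinates are points from the lower left).
    buf = io.BytesIO()
    c = canvas.Canvas(buf, pagesize=letter)
    for sid in ids:
        c.setFont("Helvetica", 12)
        c.drawString(470, 40, sid)        # guessed bottom-right position
        c.showPage()
    c.save()
    id_pages = PdfReader(io.BytesIO(buf.getvalue())).pages

    # Merge each ID page onto a fresh copy of the instrument page, mirroring
    # steps 3-5 above (mail merge output layered with the instrument).
    writer = PdfWriter()
    for id_page in id_pages:
        base = PdfReader(INSTRUMENT).pages[0]   # re-read so each copy starts clean
        base.merge_page(id_page)
        writer.add_page(base)

    with open(OUTPUT, "wb") as f:
        writer.write(f)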

Dissertation Journal: Participating Institutions

I sent the invitations to participate to the first group of institutions (I hope to invite more as more institutions register for BCSSE) about thirty minutes ago. And I’ve already received three “yes” responses!

Additionally, it occurred to me that I may end up with three groups of participating institutions:

  • Web-only participant in NSSE
  • Paper or mixed-mode participant in NSSE
  • Non-participant in NSSE

The first group is my primary target group as they will allow me to easily calculate non-response as it relates to Internet access and use. But if I end up with institutions in the second or third group (and I won’t know for quite a while since NSSE registration isn’t open yet and won’t close until mid-late Fall) then I might be able to use them to do interesting “troubleshooting” since they will provide additional data. For example, institutions that participate in the paper mode of NSSE would offer me the chance to see what happens when respondents have the choice between paper and Web modes and how Internet access and use influences that choice.

Dissertation Journal: Final IRB Approval for First Data Collection

Last week, I ceased piloting my survey instrument and finalized it.  I submitted it to IRB as an amendment to BCSSE at the same time that I sent it to my colleagues in IU’s Center for Survey Research, who are formatting it for their scanners.  IRB approved the amendment and I will immediately begin soliciting institutions to participate.

The pilot did not go nearly as well as planned.  I was hoping to conduct 5 cognitive interviews and a pilot administration with about 50 students.  I fell far short of both of those goals.  I think that the biggest factor contributing to this failure was the timing: I was so late in soliciting participants that we were nearly in finals, and that is a bad time to get students to do anything.  I also limited my recruiting to one (large) residence hall in the belief that my requests were so simple and easy that I wouldn’t have to recruit in other places.  Finally, I think the advertisements were too plain and too prominently focused on the Internet, perhaps causing students to believe that I was recruiting only computer experts or geeks.

Despite the dismal participation in my interviews and pilot, I am pressing ahead.  First, my timeline does not have any slack time.  Second, the data I was able to collect were all very, very positive in terms of the construction of my survey instrument.  Finally, even if this stage in designing this new survey instrument did not go as well as hoped, all of the preceding stages went very well and were performed very thoroughly.  I take some reassurance from the idea that all of the stages that were within my control were done well and thoroughly (I hear my dad’s voice echoing in my head: “Control the controllables”), in part because there are some stages that I cannot control.  I will, however, conduct more cognitive interviews this summer even as my instrument is in the field; it would be too late to make changes if I were to discover any problems, but (a) I do not expect to discover any problems and (b) if there are remaining problems then I need to be aware of them.

So now my focus turns to recruiting institutions to participate.  Ideally, all participating institutions would meet several criteria:

  1. Registered for the paper version of BCSSE
  2. Administering BCSSE in on-campus events (orientation sessions, FYE classes, etc.)
  3. Intending to participate in NSSE next spring
  4. Possessing a diverse student body

The first criterion is non-negotiable as it is critical to my research design.  More than any other criterion, this one immediately and dramatically narrows the pool of potential participating institutions.  The second criterion is desirable because the institutions meeting it should be the ones with the highest response rates.  The third criterion is also non-negotiable as it is critical to my research design.  The fourth criterion is important because if I have a homogeneous sample (especially an affluent one) then I may not have enough variation to perform some of the statistical tests I would like to perform (in statistical terms, I would not have enough “power” for operations such as logistic regression).
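
To make the “power” worry concrete, the kind of model I have in mind regresses NSSE response (yes or no) on prior Internet access and use plus demographics.  Here is a bare-bones sketch using Python’s statsmodels; every variable and file name is a hypothetical stand-in for the real BCSSE and NSSE fields, not my actual analysis code.

    # Sketch of the logistic regression the "power" concern refers to:
    # did the student respond to the web-based NSSE (1/0), predicted by
    # prior Internet access/use and demographics.  All names are made up.
    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per sampled student, already linked to NSSE response status by survey ID.
    df = pd.read_csv("merged_students.csv")

    model = smf.logit(
        "responded_nsse ~ home_internet + hours_online_per_week + C(race) + C(ses_quartile)",
        data=df,
    ).fit()
    print(model.summary())

    # With a homogeneous sample (e.g., uniformly affluent and well-connected),
    # predictors like home_internet barely vary, standard errors balloon, and
    # the model cannot estimate those effects with any confidence; that is the
    # power problem.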

Getting all of these criteria to align (including the unlisted logistical one of “we haven’t mailed their BCSSE surveys yet”) is challenging.  In fact, I worry about the fourth one considerably.  I can’t change the BCSSE registrants to make them meet my criteria, but I am placing a small safeguard in my study regarding the fourth criterion: I will be recruiting a few institutions that have diverse student bodies even though they may not be participating in NSSE next spring.  I will not be able to analyze NSSE non-response data from those institutions, but by collecting demographic and Internet access information I will at least be able to compare their incoming student bodies with those of the other institutions in my sample.  If there are no important differences then I know that I’ll have nothing to worry about with respect to diverse student populations.  If there are important differences then I will either explore them in my study or note them for future studies.

Since I am pursuing an opt-in strategy in recruiting institutions (i.e. I am asking if they want to participate and they must say “yes”), there is still potential for this study to fall apart if institutions ignore my requests or say “no.”  I was hoping to use an opt-out strategy (i.e. telling institutions that they will be participating unless they say “no”) but some of my colleagues are uncomfortable with that strategy.  In any case, this recruitment is another critical link in the chain that is this study.  From my viewpoint, in the middle of this mess and uncertain how or if it will all turn out, it seems like yet another fragile link in a long chain of fragile links.

Dissertation Journal: IRB Approval and Beginning Pilot Administration

The Human Subjects Committee at Indiana University-Bloomington has approved my pilot study.  I’ve delivered my first set of flyers (for cognitive interviews) to my colleagues in Residential Programs and Services so they can post them.  Now I’m waiting for the phone calls and the e-mails from students so I can start setting appointments.  So far, so good…

Dissertation Journal: Draft Advertisements and Revisions to Instrument

I’ve received feedback from nearly everyone to whom I sent my original draft instrument.  I’ve incorporated their feedback and both my synopsis and instrument are much improved.  In my synopsis, I was careful to document the changes I made to the original draft and why I made those changes.  I don’t know if that documentation will make it into the final dissertation, but it’s very helpful for me.  It’s the kind of documentation that I imagine is unneeded much of the time, but when it is needed it’s invaluable.

I also finished the draft advertisements for my pilot study.  I first plan to conduct five cognitive interviews with undergraduate students here at Indiana University.  These interviews are a way to understand how respondents understand (or misunderstand) the questions and responses on the survey instrument.  In these interviews, the participants are asked to (a) read aloud the questions and responses, (b) verbalize any questions or thoughts they have as they read the instrument, and (c) think aloud as they answer the questions.  Asking the participants to think aloud – the “cognitive” part of the cognitive interview – gives us an insight into how they understand and answer the questions.  The advertisements for the cognitive interviews are rather simple and I will be compensating participants $10.

After conducting the cognitive interviews and making any necessary changes to the survey instrument, I will then pilot the instrument by administering it to 60 Indiana University undergraduate students.  As you can see in the advertisement (here is a different view), I hope to print out large (11×17) flyers, place pockets on them, and insert survey instruments into the pockets.  I will only be advertising in this one building so it should be very easy for students to grab a survey, fill it out, and return it for their $3 compensation.

I’m sure that both IU Residential Programs and Services (RPS) and IRB will want changes made to these advertisements.  But I’ve got good drafts and I am hopeful that they will not be changed much!  RPS is getting first crack at them because I am not quite sure if they’ll let me put surveys in the flyers, so if they don’t like that then I’ll have to (slightly) change how I distribute the instruments.  Once I get the thumbs up from RPS, I’m off to IRB!

Dissertation Journal: Draft Survey Instrument

I just sent my draft synopsis and survey instrument to several colleagues to solicit feedback.  I am particularly interested in feedback on the instrument as I am also in the process of completing the IRB documentation I’ll need to pilot the instrument. I have sent the materials to both content experts (student technology professionals) and process experts (survey researchers).  I also sent it to my committee chair-to-be and the project manager of the survey on which I will be piggybacking my instrument.  Importantly (I think), I included some information in the document specifically discussing how the survey instrument was constructed.  I wrote this section both to force me to document the process and to help these reviewers understand the choices I made.

To establish the validity and reliability of the survey before launching it full-scale, I plan to conduct cognitive interviews with five Indiana University undergraduate students before conducting a pilot administration with 60 students.  So in addition to working on IRB documents, I am also waiting to hear back from our residence life folks regarding how and whether I will be able to advertise my interviews and survey in IU residence halls.  I hope to be able to advertise in only one building.  I work in a large building that is half residence hall and half office space, so if I can limit my advertising to this building it will be very easy for students to participate: they will only have to walk a few hundred feet to either sit for an interview or drop off a survey.  I hope to make it even easier for survey participants by including the surveys with the flyers.

This feels like a really big step as this is the first time I’ve sent out all of this information to so many people.  I feel like I’ve made a commitment now that I have asked so many people to take time to review these materials whereas before this was just all supposition and hollow talk.  It could still fall apart if IRB doesn’t like things, I have difficulty recruiting participants (I am paying them to alleviate this possibility), the interviews or pilots indicate monstrous problems with the instrument, or if this all just takes too long.  But I think I have a fighting chance, especially with all of these awesome people supporting me and offering feedback and assistance.

Dissertation Journal: First “setback” – Postponing Quals

It’s not directly related to my dissertation, but I’m postponing my qualifying exam until the summer.  It is indirectly related in that one of the main reasons I’m doing this is so that I can concentrate on my initial survey instrument and hit my data collection window this summer.  If I miss that window then I’ll either have to wait another year for it to re-open or figure out a different method or topic entirely.

I’m very disappointed to have to do this and I haven’t yet figured out how this might potentially impact my schedule this summer.  But I have a lot going on in my personal and professional lives so this is a smart move even though it stings my pride and disrupts my summer a little bit.

Dissertation Journal: New Support and Quals

This morning, I met with my advisor to discuss the second-day question for my upcoming qualifying exam.  As I understand it, most advisors write the second-day question with a specific focus on their student’s dissertation topic.  Since my advisor is not my chair (nor is he on my committee-to-be; he’s a great guy whom I respect, but my topic is far from his research interests and experience), I needed to explain my topic to him and answer any questions he had.  I sent him my two-page dissertation synopsis about a week ago and met with him today to do exactly that.  I now have an idea of what he might ask me, and it should be very helpful as I continue working on this, honing my thoughts, and eventually writing more drafts of my first three chapters.

On Thursday, I met with my chair-to-be and two other students whose dissertations he is either chairing or has agreed to chair.  My chair proposed this meeting and is encouraging us to meet monthly – with or without him – so we can support one another through the entire dissertation process.  We set a date for our March meeting and we also set concrete goals.  My goal: To draft and begin piloting my initial survey instrument.  If I don’t get that done soon then I’ll be in a very bad position to continue along this path.

Dissertation Journal: Reframing the First Two Chapters

In class today, John Bean explained the structure of a quantitative dissertation and he reframed the first two chapters in an interesting way.  The way I’ve thought of them and the way they’ve been explained to me in the past is that the first chapter is the introduction and argument for the importance of the study and the second chapter is the literature review.  John suggests that for a quantitative study the first chapter argues for the importance of the dependent variable and the second chapter argues for the importance and inclusion of the independent variables.

Exactly how I write and organize my dissertation will be determined by my committee but this is a very interesting way of thinking about the first two chapters.

Dissertation Journal: Dissertation Synopsis

A few weeks ago, I wrote a brief synopsis of my current dissertation topic.  With the help of some of my wonderful friends, I have continued to revise this synopsis.  Here is the current version, the version I am sending to my advisor for him to use in writing my second-day question on my qualifying exam.

Higher education scholars, policy makers, and administrators know little about the experiences of undergraduate students – traditional and non-traditional – who matriculate with minimal experiences with or knowledge of technology.  It is easy to assume that all students, particularly traditionally-aged students, have significant experience with, knowledge of, and comfort with technology.  For many students, that assumption is correct.  But that assumption is false for some students and it is likely that those students have different and possibly difficult experiences, especially during their first year.

Although little is known about these students, there are tantalizing glimpses.  National surveys of institutions or students indicate that a significant number of students do not own a computer.  National surveys of students have reported different numbers of students without computers, from 1.2% of respondents (Smith, Salaway, & Caruso, 2009) to 2.7% of respondents (Junco & Mastrodicasa, 2007).  EDUCAUSE member institutions report that between 10% and 20% of their students do not own computers (EDUCAUSE, 2009).   Similarly, studies have revealed differences in how college students and youths use computers, differences that are significantly influenced by economic and cultural factors such as how easily and often they can use Internet-connected computers (Palfrey & Gasser, 2008; Ito et al., 2010; Watkins, 2009).  So it is clear that there are some students who neither own computers nor use them in ways that most of their peers use them.

But no one knows how many of these students are on American college campuses.  Little is known about who they are.  And very little is known about their experiences and how their technological aptitude is shaping their academic and social lives.  Moreover, no one knows if our current methods of assessment, methods that often rely exclusively on web-based surveys advertised via e-mail, are gathering adequate information from these students and adequately representing their experiences, opinions, and needs.

This study will explore the response rates of first-year undergraduate students to a self-administered web-based survey.  Specifically, this study will examine the impact of those students’ previous computer ownership, access, and use on their response rate.  The specific research questions guiding this dissertation are:

RQ1: In this sample of first-year students at American institutions of higher education, how many students have matriculated from environments in which they had substantially different patterns of Internet-connected computer ownership, access, or use?

RQ2: Do those students exhibit a significant non-response to a Web-based survey advertised primarily through e-mail?

To answer the first question, I will construct a brief survey of previous computer ownership and use to be administered to students participating in the on-campus, paper administration of the Beginning College Survey of Student Engagement (BCSSE).  I will answer the second question using data from those same institutions when they participate in the web-based version of the National Survey of Student Engagement (NSSE).  Using the data obtained from BCSSE (a linked-records approach; Porter & Whitcomb, 2005), I will be able to see whether students with less exposure to technology respond in proportional numbers.

If the population is sufficiently diverse, I expect to find a significant number of students who have had less experience with technology than the majority of their peers.  I also expect that those students will come disproportionately from lower socioeconomic backgrounds and from racial/ethnic minority groups.  Finally, I expect to find a small but significant non-response bias in the web-based version of NSSE, a finding that may be generalizable to other web-based self-administered surveys.

References

EDUCAUSE. (2009). EDUCAUSE Core Data Service Fiscal Year 2008 summary report.  Boulder, CO: Author.

Ito, M., et al. (2010). Hanging out, messing around, and geeking out. Cambridge, MA: MIT Press.

Junco, R., & Mastrodicasa, J. (2007). Connecting to the net.generation. Washington, D.C.: NASPA.

Palfrey, J., & Gasser, U. (2008). Born digital: Understanding the first generation of digital natives. New York, NY: Basic Books.

Porter, S. R., & Whitcomb, M. E. (2005). Non-response in student surveys: The role of demographics, engagement and personality. Research in Higher Education, 46(2), 127–152.

Smith, S. D., Salaway, G., & Caruso, J. B. (2009). The ECAR study of undergraduate students and information technology, 2009.  Boulder, CO: EDUCAUSE.

Watkins, S. C. (2009). The young & the digital: What the migration to social-network sites, games, and anytime, anywhere media means for our future. Boston, MA: Beacon Press.
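
(A note to myself, outside the synopsis: here is roughly how the linked-records comparison behind RQ1 and RQ2 might look in code once the BCSSE and NSSE files come back.  Every file and column name below is a placeholder, not the actual data layout.)

    # Sketch of the linked-records analysis: merge my BCSSE paper supplement
    # with the NSSE web respondent list by survey ID, then compare response
    # rates for students with and without substantial prior Internet access.
    import pandas as pd

    # Hypothetical files: my supplement keyed by the survey ID printed on each
    # BCSSE booklet, and the list of survey IDs that responded to the web NSSE.
    bcsse = pd.read_csv("bcsse_supplement.csv")   # survey_id, owned_computer, home_internet, ...
    nsse = pd.read_csv("nsse_respondents.csv")    # survey_id of each NSSE respondent

    merged = bcsse.merge(nsse[["survey_id"]].assign(responded=1),
                         on="survey_id", how="left")
    merged["responded"] = merged["responded"].fillna(0).astype(int)

    # RQ1: how many students report substantially limited prior access or use?
    limited = (merged["owned_computer"] == 0) | (merged["home_internet"] == 0)
    print("Students with limited prior access:", int(limited.sum()), "of", len(merged))

    # RQ2: do those students respond to the web-based NSSE at a lower rate?
    print(merged.groupby(limited)["responded"].mean())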