My first set of surveys is printed, packaged, and awaiting pickup by the FedEx man. In other words: HOLY SHIT I’M REALLY STARTING MY DISSERTATION!
My survey instrument has been through its final design pass and I’ve printed the first set. That pass rearranged a few items to make the instrument easier to scan and added a blank space for the survey ID.
The final instrument wasn’t delivered in the format I expected: one that would let me easily perform a mail merge to insert the survey IDs for each survey. Since this instrument is physically a separate sheet of paper, I have to be able to match it to the main BCSSE survey instrument (because I’m using demographic information from BCSSE). To do that matching, I am printing survey IDs on each of my surveys that correspond to the survey IDs already printed on the BCSSE surveys. I then physically insert my survey into the BCSSE survey (an 8.5×17 sheet folded in half). Having the same ID on both instruments ensures that we can match the responses even if the two instruments are separated after the students complete them.
The final instrument as I received it is exactly what is linked above. It has a blank for the survey ID but how do I insert the ID? More specifically, how do I easily insert the ID for hundreds of surveys? Here is what I’m doing:
- Create an Excel spreadsheet with the survey IDs I need to insert (each participating BCSSE institution is pre-assigned a range of survey IDs so I know them in advance).
- Create a mail merge document in Word. This document has only one thing in it: a mail merge field in the bottom right corner. It uses the previously-created spreadsheet as its data source, and the field is placed so that it lines up with the blank Survey ID space on the PDF of the survey instrument.
- Complete the mail merge, generating a Word document that consists of a bunch of blank pages with survey IDs in the bottom right corner.
- Print the mail-merged document to PDF.
- Open the new PDF and add the final survey instrument as the background of each page.
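The spreadsheet step above can also be scripted instead of built by hand. Here’s a minimal Python sketch that generates one institution’s block of survey IDs and writes them as a CSV usable as the Word mail-merge data source (the starting ID, block size, and column name here are all hypothetical — BCSSE pre-assigns the real ranges):

```python
import csv
import io

def survey_id_rows(start, count, width=6):
    """Return zero-padded survey IDs for one institution's pre-assigned block."""
    return [f"{i:0{width}d}" for i in range(start, start + count)]

def write_merge_source(ids, fileobj):
    """Write the IDs as a one-column CSV; the header row becomes the merge field name."""
    writer = csv.writer(fileobj)
    writer.writerow(["SurveyID"])
    for sid in ids:
        writer.writerow([sid])

# Hypothetical block: 375 surveys starting at ID 120001
ids = survey_id_rows(120001, 375)
buf = io.StringIO()  # stand-in for a real file opened with newline=""
write_merge_source(ids, buf)
```

In Word you would then point the mail merge at this CSV and drop the SurveyID field over the blank ID box.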
This process is not the most straightforward way to create a mail-merged document, but given my situation it’s not bad. Although it’s a tiny bit convoluted, it works reasonably well. It might not work if it had to be more precise (i.e. inserting new text into a block of existing text), but since my target is a large empty box I don’t have to worry about precision.
(I was initially guided to this process – specifically the idea to use the instrument as a background image on another PDF – by this two-year-old thread on MacRumors.com.)
I’ve used this process to print the first set of 375 surveys to be mailed later today or tomorrow to an institution in New England. It should work well for the other 3885 surveys I will be mailing to nine other institutions in the coming months.
(I started to write an e-mail to some colleagues outlining my current and upcoming projects and the e-mail was getting a bit long. So I’m writing it all out here as perhaps some of you will be interested in one or more of these projects.)
Here are my current and upcoming projects, listed in no particular order…
- Continue editing and submit for publication (EDUCAUSE Quarterly?) the paper (A Comparison of Student and Faculty Academic Technology Use Across Disciplines) I just presented with Allison BrckaLorenz at the AIR Forum.
- Finish preparing for my ResNet 2010 assessment preconference session.
- Continue working with the ResNet 2010 hosts to schedule and conduct attendee focus groups to supplement the survey data we recently collected regarding the current state and future direction of the ResNet organization.
- Two potential AERA proposals:
- Discourse analysis of #sachat. I wrote a solid paper for the discourse analysis class I took in the spring but Rey Junco will be helping me to redo some of the analysis and edit the paper.
- Historical analysis of student affairs and technology. I have a solid draft of this paper already done (another class paper) but it’s very long and needs to be edited down to a more manageable, readable length. Additionally, I’ve recently discovered that we have in the library stacks at Indiana University proceedings from NASPA and ACPA meetings held during the first half of the twentieth century. I need to spend time in the library with those proceedings as I haven’t yet incorporated them into my study (I didn’t know where I could find them; I certainly didn’t expect to find them at my home institution!).
- Begin a new project analyzing the demographics of student affairs professionals. I wanted to use these data in my Twitter research but no one has done this work in 15 years so I’ll have to do it (I hope that I’m wrong and that I simply haven’t found a current or recent source!).
- Wait to hear back from ASHE to know if our Wikipedia proposal has been accepted. If so, then we need to do more work on it to update it and get it into shape for the conference later this year.
Of course, I have other things going on and coming up: quals in 2 months, ongoing projects at work, and beginning data collection for my dissertation. I thought that summer – especially the summer after you finish coursework – was supposed to be quiet and relaxing?
I sent the invitations to participate to the first group of institutions (I hope to invite more as more institutions register for BCSSE) about thirty minutes ago. And I’ve already received three “yes” responses!
Additionally, it occurred to me that I may end up with three groups of participating institutions:
- Web-only participant in NSSE
- Paper or mixed-mode participant in NSSE
- Non-participant in NSSE
The first group is my primary target group as they will allow me to easily calculate non-response as it relates to Internet access and use. But if I end up with institutions in the second or third group (and I won’t know for quite a while since NSSE registration isn’t open yet and won’t close until mid-to-late fall) then I might be able to use them to do interesting “troubleshooting” since they will provide additional data. For example, institutions that participate in the paper mode of NSSE would offer me the chance to see what happens when respondents have the choice between paper and Web modes and how Internet access and use influences that choice.
Last week, I ceased piloting my survey instrument and finalized it. I submitted it to IRB as an amendment to BCSSE at the same time as sending it to my colleagues in IU’s Center for Survey Research who are formatting it for their scanners. IRB approved the amendment and I will immediately begin soliciting institutions to participate.
The pilot did not go nearly as well as planned. I was hoping to conduct five cognitive interviews and a pilot administration with about 50 students; I fell far short of both goals. I think the biggest factor in this failure was timing: I was so late in soliciting participants that we were nearly in finals, and that is a bad time to get students to do anything. I also limited my recruiting to one (large) residence hall in the belief that my requests were so simple and easy that I wouldn’t have to recruit anywhere else. Finally, I think the advertisements were too plain and too prominently focused on the Internet, perhaps causing students to believe that I was recruiting only computer experts or geeks.
Despite the dismal participation in my interviews and pilot, I am pressing ahead. First, my timeline does not have any slack time. Second, the data I was able to collect were all very, very positive in terms of the construction of my survey instrument. Finally, even if this stage in designing this new survey instrument did not go as well as hoped, all of the preceding stages went very well and were performed very thoroughly. I take some reassurance from the idea that all of the stages that were within my control were done well and thoroughly (I hear my dad’s voice echoing in my head: “Control the controllables”), in part because there are some stages that I cannot control. I will, however, conduct more cognitive interviews this summer even as my instrument is in the field; it would be too late to make changes if I were to discover any problems but (a) I do not expect to discover any problems and (b) if there are remaining problems then I need to be aware of them.
So now my focus turns to recruiting institutions to participate. Ideally, all participating institutions would meet several criteria:
- Registered for the paper version of BCSSE
- Administering BCSSE in on-campus events (orientation sessions, FYE classes, etc.)
- Intending to participate in NSSE next spring
- Possessing a diverse student body
The first criterion is non-negotiable as it is critical to my research design. More than any other, this criterion immediately and dramatically narrows the pool of potential participating institutions. The second criterion is desirable, as the institutions meeting it should be those with the highest response rates. The third criterion is also non-negotiable, as it too is critical to my research design. The fourth criterion is important because if I have a homogeneous sample – especially an affluent one – then I may not have enough variation to perform some of the statistical tests I would like to perform (in statistical terms, I would not have enough “power” for operations such as logistic regression).
Getting all of these criteria to align – including the unlisted logistical one of “we haven’t mailed their BCSSE surveys yet” – is challenging. In fact, I worry about the fourth one considerably. I can’t change the BCSSE registrants to make them meet my criteria, but I am placing a small safeguard in my study regarding the fourth criterion: I will be recruiting a few institutions with diverse student bodies even though they may not be participating in NSSE next spring. I will not be able to analyze NSSE non-response data from those institutions, but by collecting demographic and Internet access information I will at least be able to compare their incoming student bodies with those of the other institutions in my sample. If there are no important differences then I’ll know I have nothing to worry about with respect to diverse student populations. If there are important differences then I will either be able to explore them in my study or note them for future studies.
Since I am pursuing an opt-in strategy in recruiting institutions (i.e. I am asking if they want to participate and they must say “yes”), there is still potential for this study to fall apart if institutions ignore my requests or say “no.” I was hoping to use an opt-out strategy (i.e. telling institutions that they will be participating unless they say “no”) but some of my colleagues are uncomfortable with that strategy. In any case, this recruitment is another critical link in the chain that is this study. From my viewpoint, in the middle of this mess and uncertain how or if it will all turn out, it seems like yet another fragile link in a long chain of fragile links.
The Human Subjects Committee at Indiana University-Bloomington has approved my pilot study. I’ve delivered my first set of flyers (for cognitive interviews) to my colleagues in Residential Programs and Services so they can post them. Now I’m waiting for the phone calls and the e-mails from students so I can start setting appointments. So far, so good…
I’ve received feedback from nearly everyone to whom I sent my original draft instrument. I’ve incorporated their feedback and both my synopsis and instrument are much improved. In my synopsis, I was careful to document the changes I made to the original draft and why I made those changes. I don’t know if that documentation will make it into the final dissertation but it’s very helpful for me. It’s the kind of documentation that I imagine is unneeded much of the time but when it is needed it’s invaluable.
I also finished the draft advertisements for my pilot study. I first plan to conduct five cognitive interviews with undergraduate students here at Indiana University. These interviews are a way to understand how respondents understand (or misunderstand) the questions and responses on the survey instrument. In these interviews, the participants are asked to (a) read aloud the questions and responses, (b) verbalize any questions or thoughts they have as they read the instrument, and (c) think aloud as they answer the questions. Asking the participants to think aloud – the “cognitive” part of the cognitive interview – gives us an insight into how they understand and answer the questions. The advertisements for the cognitive interviews are rather simple and I will be compensating participants $10.
After conducting the cognitive interviews and making any necessary changes to the survey instrument, I will then pilot the instrument by administering it to 60 Indiana University undergraduate students. As you can see in the advertisement (here is a different view), I hope to print out large (11×17) flyers, place pockets on them, and insert survey instruments into the pockets. I will only be advertising in this one building so it should be very easy for students to grab a survey, fill it out, and return it for their $3 compensation.
I’m sure that both IU Residential Programs and Services (RPS) and IRB will want changes made to these advertisements. But I’ve got good drafts and I am hopeful that they will not be changed much! RPS is getting first crack at them because I am not quite sure if they’ll let me put surveys in the flyers so if they don’t like that then I’ll have to (slightly) change how I distribute the instruments. Once I get the thumbs up from RPS, I’m off to IRB!
I just sent my draft synopsis and survey instrument to several colleagues to solicit feedback. I am particularly interested in feedback on the instrument as I am also in the process of completing the IRB documentation I’ll need to pilot the instrument. I have sent the materials to both content experts (student technology professionals) and process experts (survey researchers). I also sent it to my committee chair-to-be and the project manager of the survey on which I will be piggybacking my instrument. Importantly (I think), I included some information in the document specifically discussing how the survey instrument was constructed. I wrote this section both to force me to document the process and to help these reviewers understand the choices I made.
To establish the validity and reliability of the survey before launching it full-scale, I plan to conduct cognitive interviews with five Indiana University undergraduate students before conducting a pilot administration to 60 students. So in addition to working on IRB documents I am also waiting to hear back from our residence life folks regarding how and whether I will be able to advertise my interviews and survey in IU residence halls. I hope to be able to advertise only in one building. I work in a large building that is half residence hall and half office space, so if I can limit my advertising to this building it will make it very easy for students to participate since they will only have to walk a few hundred feet to either participate in an interview or drop off a survey. I hope to make it even easier for survey participants by including the surveys with the flyers.
This feels like a really big step as this is the first time I’ve sent out all of this information to so many people. I feel like I’ve made a commitment now that I have asked so many people to take time to review these materials whereas before this was just all supposition and hollow talk. It could still fall apart if IRB doesn’t like things, I have difficulty recruiting participants (I am paying them to alleviate this possibility), the interviews or pilots indicate monstrous problems with the instrument, or if this all just takes too long. But I think I have a fighting chance, especially with all of these awesome people supporting me and offering feedback and assistance.
It’s not directly related to my dissertation but I’m postponing my qualifying exam until the summer. It is indirectly related in that one of the main reasons I’m doing this is so that I concentrate on my initial survey instrument so I can hit my data collection window this summer. If I miss that window then I’ll either have to wait another year for it to re-open or figure out a different method or topic entirely.
I’m very disappointed to have to do this and I haven’t yet figured out how this might potentially impact my schedule this summer. But I have a lot going on in my personal and professional lives so this is a smart move even though it stings my pride and disrupts my summer a little bit.
This morning, I met with my advisor to discuss the second-day question for my upcoming qualifying exam. As I understand it, most advisors write the second day question with a specific focus on their student’s dissertation topic. Since my advisor is not my chair (nor is he on my committee-to-be; he’s a great guy who I respect but my topic is far away from his research interests and experiences), I had to meet with him to explain my topic and answer any questions he had. I sent him my two-page dissertation synopsis about a week ago and today I met with him to answer his questions. I now have an idea what he might ask me and it should be very helpful as I continue working on this, honing my thoughts, and eventually writing more drafts of my first three chapters.
On Thursday, I met with my chair-to-be and two other students whose dissertations he is either chairing or has agreed to chair. My chair proposed this meeting and is encouraging us to meet monthly – with or without him – so we can support one another through the entire dissertation process. We set a date for our March meeting and we also set concrete goals. My goal: To draft and begin piloting my initial survey instrument. If I don’t get that done soon then I’ll be in a very bad position to continue along this path.