EFF Publishes a Bit of ResNet History

The EFF, one of my favorite organizations, has announced a report describing a security vulnerability in Impulse Point’s SafeConnect product. I don’t have any new insight to add regarding the security flaw or SafeConnect. But the announcement is a quick read with a nice little history of Network Access Control (NAC) technology and its important role in managing residential computer networks.

(Off-topic reminiscing: In 2003, college and university campuses experienced massive problems on their student computer networks thanks to the Blaster and SoBig worms. In response, colleges and universities rapidly adopted NAC and similar technologies to curtail those problems. Around that time, a few people from a brand new company visited the campus where I worked to pitch their product; the company was located in Florida and they were visiting nearby colleges and universities to collect feedback and gauge interest. They had a nice product but it didn’t address our needs. If I remember correctly, the product hijacked downloads of copyrighted material – music, movies, etc. – and redirected students to vendors selling the material legally. Again, it was a neat product but one in which we had no interest. Instead, we told them how badly we needed a good NAC, especially after getting our asses kicked by Blaster and SoBig so badly that we shut down the network for several days until we could get a handle on things. Importantly, their product shared a lot with NAC products so our recommendation to develop a NAC was realistic. The nice people from Impulse Point left and when I next heard of them it was about the success of their SafeConnect NAC product. Maybe my memory is faulty or maybe I’m just silly and arrogant but I like to think that I played a teeny tiny role in the success of this company and their popular product. You’re welcome!)

Part-time Students Are Not (Yet) The Majority

Right after I posted a screed about how some recent research about Twitter’s relationship with students’ grades has been misunderstood, along came another study that is being mischaracterized. I’m not looking for these things; I don’t want to be some kind of education research watchdog or bully. But this is important and I must speak up.

The demographics of U.S. college students are changing and too many of us are not changing our practices to match. Complete College America recently released a report on these changing demographics, focusing on part-time students and the continued growth of non-traditional students. It addresses some very important and often overlooked topics, and it should elicit discussion and prompt action.

Frustratingly, several of the media reports are misreporting what is in this study, particularly in their headlines and summaries. The study very explicitly says that “4 of every 10 public college students are able to attend only part-time” on its second page. So why are some reporters and commentators summarizing the report with headlines proclaiming that part-time students are the new majority? I can understand a relatively small shop making this mistake, particularly if they’re in a rush to try to get the word out about this important study and happy to make corrections. But why is the Washington Post getting it wrong and letting the error persist for days? And why are higher education professionals passing along this report with incorrect information, blindly repeating headlines and summaries that get it wrong?

(Not everyone is getting this wrong. For example, The Atlantic gets it just right.)

This is so frustrating to me because the topics discussed in this report are so important. Non-traditional students do make up the majority of students. The federal government does a poor job collecting information about these students by often focusing exclusively on first-time, first-year students (which, coincidentally, was an issue I wrote about in my qualifying exam). Too many of us have tunnel vision and only focus on the students on our campus or – more accurately – the students we think are on our campus. In the context of student affairs, I worry particularly about the next generation of professionals and whether these demographic changes are being addressed in their coursework. My impression is that they are not; I hope I am wrong!

Limitations and Lost Nuance: Twitter Does Not Improve Grades

I’ve watched with interest over the last several months as media outlets and individuals have discussed, blogged, and tweeted a study conducted by Junco, Heiberger, and Loken. Their study reported that a group of students who used Twitter as part of a class earned higher grades than classmates in sections of the class that did not use Twitter. It’s a nice study that is clearly described and methodologically sound. Like all studies, it has significant limitations that are concisely and honestly discussed in the paper, but those limitations have been ignored by too many people who have made the study into something it’s not.

The study concluded that “Twitter can be used to engage students in ways that are important for their academic and psychosocial development” (p. 10). But is that what has been reported and discussed by others? No, of course not; if it were then I wouldn’t be writing this sanctimonious blog post! Mashable, a very widely-read and influential technology blog, reported on the study using the headline “Twitter Increases Student Engagement [STUDY].” A recently-created infographic proclaims that “Students in classes that use Twitter to increase engagement have been found to average .5 grade points higher than those in normal classes.” Another infographic proclaims that “[Students get] grades up half a gradepoint in classes that use Twitter.”

I get that pithy headlines and concise summaries are necessary to grab attention. But by overlooking or ignoring the details of this study, those headlines and summaries get this all wrong. Let’s return to the original study to understand why.

In the study, the researchers assigned some sections of a class to use Twitter. While the entire class used Ning, these sections also used Twitter to complete additional assignments. They also received guidance and encouragement to use Twitter to communicate not only with one another but also with instructors. At the end of the semester, these students had earned higher grades than their non-Twittering classmates.

If I understand the study’s methodology (Rey, please correct me if I got anything wrong!), it seems that this study does not show that “Twitter improves grades.” It shows us that students who do more work and spend more time concentrating on class materials can earn higher grades. It shows us that students who have additional opportunities to communicate and collaborate with one another can earn higher grades. It also shows us that students who have greater access to instructors can earn higher grades. It shows us that Twitter can be a viable medium for students to communicate and coordinate with one another and instructors. And, yes, it shows that Twitter can be an effective educational tool when skillfully incorporated into a class with appropriate support and structure. In a critique of one of the infographics, Junco specifically mentions this: “Yes, that’s our study about Twitter and grades. Unfortunately, what’s missing is that we used Twitter in specific, educationally-relevant ways—in other words, examining what students are doing on the platform is more important than a binary user/nonuser variable.”

This illustrates the challenge with testing the efficacy of educational tools and techniques: It’s really, really hard to isolate just the impact of the tool or technique. To test the tool or technique, you almost always have to make other changes and it’s usually impossible to tell if those changes changed the results of your study more than the tool or technique you intended to study. It’s a limitation of nearly every study focusing on the effect of particular media on education and it may be an inherent limitation for this kind of work. (Richard Clark has been pointing this out for decades; look into his writings for more detailed discussions. He’s also been wonderful in creating dialog with his detractors so there are well-documented and substantive discussions between many different scholars with different opinions.)

Hence my frustration with how this study has been summarized and passed around: By ignoring the limitations and nuance of this study, these summaries miss the boat and draw a grandiose conclusion that the authors of the study never attempt to draw themselves. That’s a shame because this is a nice study that is interesting and informative. But like most research, it’s a small step forward and not a giant, earthshaking leap. Summarizing this study by proclaiming that Twitter is a magic ingredient that can be added to classes to increase grades is irresponsible and misleading.

Update 1: Thanks for the clarification about Ning, Liz!

Update 2: Another example of how headlines can distort or misrepresent research has just popped up. Before correcting the headline, Colorlines reported that the majority of college students are part-time students (full headline before being corrected: “Study: Majority of College Students are Part-Timers, Less Likely to Graduate”). But the actual report doesn’t say that. Instead, it says that “4 of every 10 public college students are able to attend only part-time” (p. 2). It’s a shame that the research was initially reported incorrectly because the changing demographics of college students are incredibly important and widely misunderstood and overlooked. I know there is a lot of nuance in discussions of demographics – race, ethnicity, socioeconomic status, privilege, etc. – but if we cover up or ignore the details then we haven’t made any progress.

To their credit, Colorlines corrected their headline once I pointed this out to them. They made a mistake in their initial headline and it’s great that they’re willing to correct their public mistake!

Quick Update: NSSE/EDUCAUSE Partnership

(I’m working on a longer post but I keep getting interrupted by life so this short post will have to do for now.)

I’m super excited that I’m going to the 2011 EDUCAUSE Annual Conference next month in Philadelphia to work with EDUCAUSE staff and members to develop potential questions for the next version of NSSE! I’ve always been a huge fan of EDUCAUSE and the work they do so I’m very hopeful that this collaboration will be fruitful and help us figure out the right kinds of questions to ask about technology. Over the past four years I’ve been involved in several efforts to address technology in NSSE and it’s very difficult so I’m really excited that we’ll be able to tap into the experience and expertise of technology experts.

I’m also a bit trepidatious about this collaboration. It’s young and in many ways undefined. I am hopeful that it bears fruit but it may fizzle out or even backfire since there is so much ground we have yet to cover and these are two large, complex organizations. Like many such efforts, it also feels like it is very dependent on a small number of people. While we’re all very talented and dedicated, we’re also incredibly busy and it may turn out that our interests are incompatible.

I’m also very thankful that this collaboration has even made it this far. It’s very gratifying that my colleagues are still willing to take risks on public ventures like this even as we continue to experience sharp public criticism. It’s more incredible for me to know that my supervisors have been supportive of this effort even though it has largely been championed by one graduate student. Of course, I haven’t done this or anything here by myself; I’ve had wonderful support from many people in nearly everything I’ve done here, especially from my current supervisor Allison BrckaLorenz who has been an enthusiastic supporter and wonderfully capable advisor from day one. Despite all of her other important responsibilities, Allison is neck deep in this EDUCAUSE/technology-thing with me and I’m so happy that she is involved!

So even though I’m a little fearful that this particular effort could fizzle out or even publicly blow up (which seems extraordinarily unlikely but I’m always a bit paranoid), I go into this knowing I’m not alone and I’m working with and for people as supportive as they are brilliant. I really want this collaboration between two of my favorite organizations to work. If this all works out well – and it will be a couple of years before we really know – it could be very powerful in helping U.S. higher education better understand and use technology to teach and communicate with undergraduates. I know that’s a very lofty aspiration but these two organizations are more than capable of fulfilling it.

More #sachat analysis: One Illuminating Figure

Laura Pasquini and I are working on analyzing #sachat data, a follow-up to work I’ve done previously but did not formally publish. Part of our work involves looking at a few other student affairs-related hashtags to help us understand #sachat in context. This figure shows the number of Twitter messages posted with particular hashtags – #highered, #sachat, #sadoc, #sagrad, #sajobs, and #studentaffairs – during the week of June 27, 2011. The #sachat session really stands out here both in the number of messages posted and in how it interrupts an otherwise regular daily and weekly pattern. This isn’t a profound discovery but it’s an easy way of illustrating that #sachat sessions stand out as distinctive and prominent uses of Twitter among some users.
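For readers curious about the mechanics, the counting behind a figure like this is straightforward. Here’s a minimal sketch using a few hypothetical inline tweets (not the actual #sachat dataset), bucketing messages by hashtag and hour:

```python
from collections import Counter
from datetime import datetime

# Hypothetical sample of timestamped tweets; the real analysis would read a
# full week's export of messages tagged with the hashtags of interest.
tweets = [
    ("2011-06-30T19:05:00", "#sachat"),
    ("2011-06-30T19:10:00", "#sachat"),
    ("2011-06-30T19:12:00", "#highered"),
    ("2011-07-01T09:30:00", "#sajobs"),
]

# Bucket messages by (hashtag, hour); plotting these counts over a week
# reveals the daily/weekly pattern and the #sachat session spike.
counts = Counter()
for ts, tag in tweets:
    hour = datetime.fromisoformat(ts).replace(minute=0, second=0, microsecond=0)
    counts[(tag, hour)] += 1

for key, n in sorted(counts.items()):
    print(key, n)
```

The same hourly buckets, charted per hashtag, would reproduce the kind of figure described above.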

Habits of Successful Higher Ed Doctoral Students

I recently moved to a new apartment and as I was unpacking I came across my notes from last year’s NASPA Doctoral Seminar in Chicago. One page of notes is from a panel discussion where faculty discussed habits and traits of successful doctoral students. Carney Strange moderated the session but I don’t remember, and didn’t write down, the names of all of the faculty on the panel. I know Deborah Liddell was on the panel because I specifically noted a quote from her. I think George Kuh was also on the panel but I only remember that because he was a faculty member at my institution and the director of the research center at which I worked.

The most successful Higher Education doctoral students…

  • Read
  • Write
  • Read others’ dissertations
  • Keep a writing journal or log
  • Treat their education like a job, including scheduling reading and writing (this tip was aimed particularly at part-time students)
  • Know their motivation(s)
  • Do the damn thing, e.g., don’t read about writing a dissertation, just sit down and write it
  • Know that doctoral study is not about their capabilities; everyone admitted to a doctoral program is capable of completing it
  • Remember that “it’s just a place to develop habits”
  • Ask questions
  • Know that “it’s about how you lean into life” and life still goes on outside and beyond their studies
  • Are willing to stick their feet in the water without knowing what will happen, e.g., take risks and display trust
  • Know that their dissertation is not their life’s work
  • “Wrestle [their] perfection to the ground” – Deborah Liddell, University of Iowa

Personal Update: It’s Already August???

I’ve been quite disconnected for the last couple of months. I think that I am almost reconnected and settled into my new apartment so I will soon be back to myself with some updates for this blog. Quick thoughts are listed below in no particular order; please let me know if you’d like me to elaborate on any of them.

  • My time at the Oxford Internet Institute’s Summer Doctoral Programme was amazing. This was the third young scholar/advanced doctoral student program I have attended and it was by far the best one. The program was fantastic, the location amazing, and the faculty and participants are as kind as they are intelligent.
  • It took nearly a week for my cable company to get my Internet connection working in my new apartment. I’ve never been more isolated or more productive. Coincidence? I think not.
  • Perhaps as a result of my concentrated time in Oxford focusing on Internet studies and my time in a research center, I am feeling more and more disconnected from the student affairs profession. I continue to wonder about the priorities of the profession and the academy-at-large, especially as we continue to adjust to a new reality of limited funds and increased public scrutiny. Although I agree that most of the services provided by student affairs units are good and useful I don’t know if adults should be forced to fund those services. At a more basic level, I don’t know if all of these services should be performed by colleges and universities even if they do contribute to enrollment, retention, and overall well-being.
  • I am immensely saddened and angered by the continuing slap fight between ACPA and NASPA. It’s unprofessional, wasteful, and embarrassing. I have already decided to let my NASPA membership expire and I am edging toward allowing my ACPA membership expire, too.
  • I don’t think we’re quite ready to make a public announcement but I’m extremely excited about a partnership between my current employer and another of my favorite organizations. I’m smack in the middle of it all and so happy to be there!
  • I said “no” today and I’m very proud of myself because it’s not something I do as often as I should. I really wanted to work on the project, too, so I’m a tiny bit sad about the timing. But my time is limited and I must develop and maintain focus.

Upcoming Student Affairs Technology Conferences

After many years of being a side-discussion at other conferences, technology in student affairs is finally taking center stage at two upcoming conferences:

  1. The Student Affairs Technology Unconference is being held on July 29, 2011, at Boston University. Although the agenda is not set (and will not be set until the attendees set the agenda at the conference itself, one of the defining features of an unconference), this unconference is aimed at student affairs and higher education administrators who use technology and want to connect with others who share their interest. The conference is being set up by several persons who frequent the #satech Twitter hashtag; use #satechBOS to follow them. The event is free with participants responsible for their own lunch.
  2. The #NASPAtech: Student Affairs Technology Conference is scheduled for October 27-29, 2011, in Newport, Rhode Island. In contrast with the Boston event, this will be more of a traditional conference with a set schedule, invited speakers, and a call for programs that closes on July 22. The schedule has as many timeslots dedicated to unconference sessions as it does traditional concurrent sessions. This event seems to be aimed at the same general population as the Boston unconference: Student affairs technology users and enthusiasts.

I’m disappointed that I won’t be attending either of these events. I will be returning from England just a few days before the Boston event and the turnaround is just too quick for my comfort. I won’t attend the NASPA conference because (a) I will be attending and presenting at another conference and (b) I don’t care to support NASPA right now. (And while I’m being a Debbie Downer, I’ll also note that these conferences both seem to be aimed at the same audience, with those who (a) build and support and (b) study the technologies and their users still lacking homes of their own.)

It’s exciting to see these events on the horizon! I hope this is a sign of more good things to come for those who work with technology in student affairs and other areas of higher education.

Confessions of an Uninvolved Student

This is a further development of thoughts that occurred to me as I read and responded to John Gardner’s latest post.

I have worked in student affairs and I have a Master’s degree in that field. I am a PhD Candidate in one of the world’s best higher education programs. I work at the National Survey of Student Engagement. These experiences and education have firmly drilled into me the benefits of being engaged and active in campus groups, events, and activities. I see and hear from my colleagues and my students the incredible impact of these activities, especially the acquisition of lifelong friends.

Here’s my secret confession: I was involved in virtually nothing as an undergraduate and a Master’s student. I can only name two fellow students from my undergraduate alma mater; I’ve scarcely exchanged Facebook messages with them and haven’t spoken to them since I graduated from the University of Tennessee (with a 2.48 GPA; “C is for cookie, that’s good enough for me!”). The story isn’t much different for my Master’s classmates; with one exception, I only keep in touch with them through coincidental attendance at professional conferences.

My lack of campus involvement was my choice, for good or ill. It’s part of who I am and I can’t envision my life any differently. And I don’t think anyone could have convinced me to act differently or be different.

I make these confessions because I know there are many other students who are making the same decisions and I don’t think those students and their decisions are understood by or respected by many of my colleagues, especially those in student affairs. I get the impression that sometimes those students are viewed with pity and even scorn because they choose not to engage in our favored activities in our chosen environment. And that saddens me, especially because we preach the benefits of diversity and choice. Many of us believe those students need to be “saved” but that seems very disrespectful of those students and their choices.

Could Graduate Student Members Have Changed the NASPA Vote?

I’ve stated that the NASPA vote against consolidating with ACPA would have been different if graduate student members could have voted. This is largely influenced by my emotions, intuition, and experiences. But how much truth is there in this belief? Can we find evidence to support or refute this assertion by examining the results of the vote and membership data? Would it really have made a difference if graduate students could have voted?

Data and Assumptions

I’m having trouble finding membership data but I have found enough to make some rough calculations. According to the latest Executive Director’s Report to the Board of Directors, in January NASPA had 12,388 members. At about the same time, the co-chairs of the New Professionals & Graduate Students Knowledge Community (NPGS KC) wrote that there were over 3,000 graduate student members in NASPA. Finally, NASPA is reporting that 42% of eligible voters participated in this vote and 62% of them voted for consolidation.

Let’s assume that all of the numbers above are correct or at least close enough for some rough calculations. Let’s also assume that one-quarter of the NASPA membership was ineligible to vote. Graduate student members make up almost one-quarter of the NASPA membership so this assumption is conservative because associate affiliate members could not vote, either. We’ll make this assumption even more conservative and assume there are 3,000 graduate student members. Finally, let’s assume that the straw poll conducted by the NPGS KC is predictive of how graduate student members would have voted and 82% of them would have supported consolidation.

Let’s apply these assumptions. If one-quarter of the 12,388 members were ineligible, roughly 9,291 members could vote. If 42% of those eligible voters participated, about 3,902 members voted. Sixty-two percent of those voters – roughly 2,419 members – voted for consolidation. Two-thirds of them would have had to vote for consolidation for the motion to carry.

Calculations and Results

Let’s explore two scenarios. In the first scenario, let’s assume that the same proportion of graduate student members participate in the vote as the rest of the membership. In other words, let’s assume that 42% of the graduate student members – 1,260 members – participate in the vote. And as stated above, we’ll assume that the NPGS KC straw poll is predictive of student membership voting preferences. Do these additional 1,033 votes for consolidation push the consolidation share to the required two-thirds? Yes, but barely: that would have resulted in 66.9% of the voters in favor of consolidation, just over the two-thirds required.

In our second scenario, let’s assume that only those student members who responded to the NPGS KC straw poll participate in the consolidation vote. In that case, only 547 new votes are added to the total. The additional 447 votes for consolidation increase the consolidation share only to 64.4%, so these additional votes do not change the final result.
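Under the stated assumptions, the arithmetic for both scenarios can be sketched as follows (all figures are the rough approximations used above, not official NASPA numbers):

```python
# Rough vote arithmetic under the post's assumptions (all figures approximate).
TOTAL_MEMBERS = 12388
ELIGIBLE = TOTAL_MEMBERS * 3 // 4      # assume one-quarter of members ineligible
TURNOUT = 0.42                         # reported participation rate
YES_SHARE = 0.62                       # reported share voting for consolidation
GRAD_STUDENTS = 3000
GRAD_YES_SHARE = 0.82                  # NPGS KC straw poll result
THRESHOLD = 2 / 3                      # margin required for the motion to carry

base_votes = round(ELIGIBLE * TURNOUT)       # ballots actually cast (~3,902)
base_yes = round(base_votes * YES_SHARE)     # votes for consolidation (~2,419)

def with_students(extra_votes, extra_yes):
    """Recompute the consolidation share after adding student ballots."""
    return (base_yes + extra_yes) / (base_votes + extra_votes)

# Scenario 1: students turn out at the same 42% rate as everyone else.
s1_votes = round(GRAD_STUDENTS * TURNOUT)        # 1,260 student ballots
s1_yes = round(s1_votes * GRAD_YES_SHARE)        # ~1,033 yes votes
share1 = with_students(s1_votes, s1_yes)

# Scenario 2: only the 547 straw-poll respondents vote, 447 in favor.
share2 = with_students(547, 447)

print(f"Scenario 1: {share1:.1%} (passes: {share1 >= THRESHOLD})")
print(f"Scenario 2: {share2:.1%} (passes: {share2 >= THRESHOLD})")
```

Running this reproduces the figures above: scenario 1 just clears the two-thirds bar at about 66.9%, while scenario 2 falls short at about 64.4%.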


As demonstrated by the two scenarios described above, graduate student members would have had to participate at a rate at least matching the general membership and vote for consolidation by a huge margin to have changed this vote. We have good reason to believe that student members would have overwhelmingly supported consolidation. We don’t know how many of them would have voted and whether enough of them would have voted to change the results; it’s very plausible.

In this brief exercise, the weakest assumption seems to be that the NPGS KC straw poll would have been predictive of the behavior of the entire student membership of NASPA. This is primarily because the straw poll was only administered to the members of the NPGS KC and we don’t know how representative those members are of the larger student membership of NASPA. Personally, I believe that student members would have overwhelmingly voted for consolidation in line with the straw poll. But I don’t know how many students would have voted and I am not confident that the straw poll can tell us much about that.


It is plausible but not inevitable that graduate student members would have changed the result of this vote. The evidence seems to show that graduate student members were solidly in favor of consolidation, but we don’t know whether enough of them would have voted to tip the final result.

Of course, this exercise in arithmetic ignores all social, cultural, and historic issues (and many others). If support for consolidation was indeed extremely high among graduate student members, what emotional impact does it have for their votes to have not been counted, even if they would not have changed the final result? Will graduate students feel more aligned with ACPA, which allowed student members to vote and whose general membership seemed to favor consolidation by the same margin as graduate student members of NASPA? Finally, how much does it matter whether the graduate student vote could have changed the final result? How much weight is carried by the symbolic act of inclusion or exclusion?

(And what about the associate affiliate members who were also denied the right to vote?  If they could have voted, how many would have voted and for which result?)