Data Analysis MOOC Week 3: I’m a Dropout

Despite my best intentions, I have become another MOOC dropout.  Why am I not continuing to participate in this free course?

  1. The format isn’t compelling.  The course is built primarily around four components: video lectures and notes, weekly quizzes, a discussion board, and two peer-graded assignments.  The lectures are fine, and although there are many other online R resources it’s nice to have concise descriptions of R procedures specifically linked to data analysis.  The discussion board is also helpful, but there are many other places to find help with R.  As discussed in my previous post, the weekly quizzes are very disappointing: they are the primary means by which students in this course practice what they learn, yet they offer very, very little feedback.  My biggest regret is that I won’t experience the peer-graded assignments.  While the idea of requiring students to grade one another’s work is likely driven largely by the logistics of a MOOC, peer-graded assignments can be very powerful and worthwhile even in small classes.  It’s disappointing that these two assignments are the only non-quiz activities in the course.  Although it will be helpful that each student should receive feedback from several classmates (if it’s possible, I might provide feedback on some of my classmates’ reports even though I won’t be writing my own), it often takes more than two attempts for students to learn and begin to master new skills.
  2. Except for the peer-graded reports, there seems to be little reason for this course to be on a lockstep eight-week schedule.  I might be able to stay with it if the timing were more flexible.  Even in the first three weeks of the course I had trouble consistently making time to view all of the videos.  I had planned to do this at work, as my supervisor supports it as important and valuable professional development, but it’s sometimes difficult to carve out the time and I feel guilty watching online videos at work for a non-credit course when I feel I should be doing something more (visibly and authentically) productive.
  3. I can’t convince myself to participate in the two peer-graded reports, the only meaningful assignments in this course.  This is linked directly to the material of this specific course and is not a criticism of the course itself. I simply can’t muster the will to conduct additional data analysis and write additional reports for this course when those are two of my primary job duties.  It’s not that I don’t think that I could learn from the activities, develop new skills, and become a better data analyst and writer.  I just can’t bring myself to spend so much time analyzing data and writing reports unrelated to either my job or my research.  I am disappointed as I was looking forward to these substantive activities, especially being able to receive feedback from others and seeing how others approached the same activities.

Although I’m disappointed to have decided to not continue with the activities of this MOOC, I am happy to have enrolled and tried it out.  I will continue to download the course materials so I can reference them when I am ready to put them into practice in meaningful ways.

I have very mixed feelings about the broader concept of MOOCs.  It would take an extraordinary effort for an online course, especially a MOOC, to match the quality of the best face-to-face courses.  But the reality is that few face-to-face courses are “the best.”  Although the dominant MOOC model seems to mimic the worst lecture courses in traditional universities, even the worst course is sometimes good enough, especially when the alternative to a crappy, frustrating, and largely self-driven education is no education at all.

Data Analysis MOOC Week 2: Muddling Through Frustration

I have watched the online videos and successfully completed the quiz for week 2 of the data analysis MOOC in which I am enrolled.  I struggled quite a bit with some of the R syntax, which made the quiz a very frustrating experience.  I have two observations about the course’s format based on what I learned this week.

First, I am disappointed that so far the only opportunity for students to practice what is being taught and receive feedback is the weekly quiz.  I was able to muddle through things enough to get answers that matched the response options for this week’s multiple-choice quiz, but despite answering all questions correctly I’m still very unsure of much of the content – I just know that I happened to somehow end up with answers that matched some of the ones included in the quiz.  Some of this is simply due to my lack of experience with R and its steep learning curve.  But much of it is due to the fact that the multiple-choice quiz was the only opportunity to practice with any semblance of feedback, and that feedback was restricted to an anemic “correct” or “incorrect” for each question.

Yes, I can practice some of the skills taught in this class on my own.  This is certainly the case if I want to focus solely on learning how to use R – syntax, configuration, functionality, etc. – as the language provides immediate feedback with error messages or output.  But if that is the focus and if that’s sufficient to learn the skills, then why do we need an organized course instead of just a course packet or a list of recommended self-guided topics and exercises?
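
For example, here is the kind of immediate – but unexplained – feedback R provides (a trivial illustration with made-up values):

    > x <- c(1, 5, NA, 9)
    > mean(x)
    [1] NA
    > mean(x, na.rm = TRUE)
    [1] 5

The interpreter instantly signals that something is off, but it never explains how R handles missing values or why na.rm fixes it; that explanatory gap is exactly what an instructor’s feedback would fill.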

What distinguishes an organized, well-taught class from a self-taught topic is that a class has an expert who not only makes their thinking explicit but also offers targeted feedback to students as they practice the skills they are learning.  It’s conceivable that some skills could be taught using sophisticated, automated tools if we have a deep enough understanding of how people typically learn those skills that we can programmatically recognize mistakes and misunderstandings and provide appropriate, specific feedback.  Sometimes this can be done to a (very) limited degree with appropriately designed multiple-choice instruments where the incorrect responses are diagnostic, i.e., wrong answers aren’t merely incorrect but are designed to identify particular kinds of mistakes or misunderstandings.  That seems to be the case for some of the questions and answers in this MOOC, but we’re not provided with any of the related feedback to help us understand what common mistake we may have made, how we might be misunderstanding the issue, and how we can work to correct our thinking.

Second, the size of the course requires innovative ways to provide support for students and this course seems to rely heavily on the course discussion board.  This is an observation, not a criticism. I’m quite comfortable using that medium as I’ve been using online discussion boards since the early 1990s when they were one of the primary draws for dial-up bulletin board systems (with the other major draw being online “door” games).  I don’t know how well this works for other students, however, as I don’t want to make assumptions about their experiences, skills, and cultures.  It’s probably not a big deal; my concern here is very minor and more of a curiosity about how other students experience and use (or don’t use) the discussion board. (In other situations I would be concerned about those who have poor or no Internet access or those who have little comfort and experience with the Internet but it’s reasonable to expect students who enroll in an online course to have sufficient Internet access and skills. I’m not suggesting that everyone has the access and skills to enroll in an online course, merely that those who are already enrolled in one presumably have the required access and skills.)

Students Believe Their Computer Skills Are Below Average

Our colleagues at the Higher Education Research Institute (HERI) at UCLA have publicly released some information from their annual survey of first-year students.  There are already several media reports on the topic and we can expect many more over the next few days.  What caught my eye is that they shared some of their data with The Chronicle of Higher Education, which created an interactive graphic showing how student responses have (or have not) changed over time.

Several of the questions on the survey ask students to compare themselves “with the average person your age” and “rate yourself above average or better in terms of __.”  For nearly all of the questions of this form, students have consistently rated themselves as above average: ability to see the world from someone else’s perspective, tolerance of others with different beliefs, openness to having their own views challenged, ability to discuss and negotiate controversial issues, ability to work cooperatively with diverse people, academic ability, emotional health, and physical health.

So for what topics do respondents believe they are below average?  Computer skills, spirituality, and writing ability.  I don’t care to comment on spirituality (a commenter on the Chronicle’s website asks a good question: “What on earth does [that question] mean?”).  I’m puzzled that first-year college students believe they are below average in writing ability but I’m not an expert on writing so I’ll leave that puzzle to others.

What does it mean that 35% of the respondents to the survey rate themselves below average in computer skills?  And what does it mean that students have consistently responded like this since the question was first asked in 1999?  Well, to know for sure we’d have to ask them.  I would want to know how they interpret “computer skills.”  What do they consider to be computer skills?  How are they measuring their computer skills?  And to whom are they comparing themselves?  Heck, given the proliferation of smart phones and tablets it would be a good idea to ask students (and ourselves!) just what they think of as a “computer.”

One possible factor in all of this may be the gender imbalance in undergraduate education in the U.S.  More women than men are enrolled in U.S. colleges and universities: according to the most recent data published by the National Student Clearinghouse Research Center, 56% of the students enrolled in the fall of 2012 were women.  Why is this important?  We know that women typically underestimate their computer skills whereas men typically overestimate theirs.  If the data reported by the Chronicle are unweighted then this effect may be amplified, because women typically respond to surveys in higher proportions.

(Aside: The National Student Clearinghouse Research Center is doing some incredibly cool and vital research these days. They have a huge warehouse of data about college enrollment and it’s great to see them putting it all to use! Check out what they’re doing – it’s good stuff.)

In any case, it’s interesting that most undergraduates at 4-year institutions believe their computer skills are below average.  I doubt that it’s actually true but I would certainly agree that they are nowhere near as proficient as some of the common assumptions (e.g., “digital natives”) make them out to be.  Is this a problem?  Should we be worried or looking for a solution?  That’s a different and more complex discussion but I think it’s safe to say that first-year college students are precisely as proficient as they have needed to be given how they use computers in their daily lives – just like everyone else.  They don’t typically use their computers to perform complicated or deeply technical tasks so why would we expect them to be profoundly tech savvy?

Data Analysis MOOC Week 1: I’m Going to Hate This

This semester, I have signed up for a data analysis class taught on Coursera, a massive open online course (MOOC).  I’m tech savvy and well educated, but it seems like the most responsible way for me to really learn about MOOCs is to gain some firsthand experience.  I also hope to learn some new data analysis techniques and ideas in this course.  The course will use R to analyze data, so it will also be good to expand my (very limited) skills and knowledge with that powerful tool.

Going into this, I am very skeptical of what I understand the typical MOOC model to be: instruction occurring primarily through pre-recorded videos and quizzes, with a discussion board as the main means of communication between students and faculty.  I hope I’m wrong either about the model of instruction or about its effectiveness.  As an educator, I believe (and am supported by significant evidence) that the best learning occurs when experts make their thinking explicit through demonstration and give learners multiple opportunities for focused practice and feedback.  So my skepticism about the effectiveness of videos and quizzes as learning and teaching tools can best be summed up as: “Telling is not teaching.”  (Note that this applies just as forcefully to passive lecturing in physical classrooms!)

I’ve just started to get into the material for this course and so far it looks like my low expectations are going to be met: the course is built heavily around pre-recorded videos as the faculty’s primary means of teaching, with weekly online quizzes and two peer-graded assignments as the only opportunities for us to “practice” what we are “learning.”  I hope I’m wrong and this proves to be much more enjoyable and rewarding than I think it will be!

Venues for Publishing Student Affairs Technology Research

One of my colleagues recently made an offhand remark about the timeliness of an article in the current issue of The Journal of College Student Development.  Rather than focus on the comment or the specific article, however, it seems more productive to explore venues that can publish similar work in a more timely manner.

The problem?  Much of the research that we conduct about technology must be shared and disseminated quickly to keep up with the rapid pace with which technologies and their uses change.  Many of the traditional venues for publication and dissemination of research have huge lag times, sometimes a few years long; this is particularly problematic for technology-related research, which grows out-of-date much more quickly than many other bodies of information.  I have conducted research that grew out-of-date before I could get it published in peer-reviewed journals, e.g., work with my colleague Chris Medrano examining content in Wikipedia articles about U.S. colleges and universities.  I have had data – really good data about interesting stuff! – grow stale over the course of a very busy year-and-a-half such that I could no longer justify working with it (it was such cool stuff that I’m sure it would have been published somewhere, but I would have felt horrible and a little bit ashamed publishing it!).

Although I have moved out of student affairs, I continue to do work about student and faculty use of technology so this is still an issue that is important to me.  I’d like your help in thinking about how we get our work out there.  Here are some of my thoughts:

  • Does the publication or release need to be through a traditional, peer-reviewed venue?  Even for those of us who believe ourselves to be locked into the traditional academic world where peer-reviewed publications remain the gold standard, I think the answer is “no.”  It might be acceptable to blog about your findings or present them at non-traditional conferences, especially if those venues allow you to better reach your intended audience (e.g., how many full-time student affairs professionals regularly pore over peer-reviewed journals?).
  • For those who do believe in the necessity or value of publishing or presenting in traditional venues, which ones allow us to disseminate our findings in a timely manner?  My initial reaction to the comment that began this entire line of questioning is that JCSD is a fine venue but it moves too slowly to publish much of the technology-related research I have conducted.  In fact, most of the peer-reviewed journals in higher education move too slowly for me to consider them viable venues for publication of timely technology-related research.

Maybe it would be helpful to compile a list of good venues for student affairs technology research.  (Although I’m mostly out of that field now, my experience is significant enough that I think I can help.)  My suggestions, in no particular order:

  • First Monday: Online, peer-reviewed journal that focuses on Internet studies.  They have published higher education-specific work in the past so they seem open to the topic.  It’s also a respected venue for scholarly work.  Very importantly, I understand that they review submitted articles very quickly.
  • Journal of Computer-Mediated Communication (JCMC): Peer-reviewed journal with an obvious focus.  Like First Monday, they have published work in our field.  It’s also the most respected venue that is usually on my radar screen for timely publication of relevant work.
  • The Journal of Technology in Student Affairs: Another peer-reviewed journal with an obvious focus.  Although this is a viable venue, it’s probably not one that I would submit to as my first choice.  It’s a fine publication but it simply doesn’t have a strong, high-profile reputation.  That may sound very crass but the reality of scholarly publishing is that it’s important to publish in the most highly regarded journals possible.
  • EDUCAUSE Review Online (ERO): Although ERO publishes some peer-reviewed work, it largely exists outside the traditional world of scholarly research because the publication is aimed at higher education IT practitioners.  With that said, it has historically been a very good venue for work that is intended for that audience although I haven’t published in it since they changed their format (EDUCAUSE used to have a monthly magazine and a quarterly peer-reviewed journal; they’ve been merged into one publication, ERO).

Outside of formal publications, several conferences are good venues to present and discuss this kind of work. I personally like EDUCAUSE events quite a bit but the audience that is interested in student affairs-specific work is pretty small.  The EDUCAUSE Learning Initiative (ELI), the arm of EDUCAUSE that focuses on teaching and learning, also puts on really nice conferences with wonderful participants if your work is more oriented towards teaching and learning.  I have also presented at other higher education conferences such as the annual conferences for ASHE, AERA, and AIR.  They are large conferences and quite frankly I don’t care for them very much because (a) they lack focus and (b) I have difficulty believing that anything that happens at them impacts the world beyond being another line on my CV.  AIR is a bit better, though, because it does have some focus and much of the work discussed there has real-world implications and impact largely because of the strong presence of institutional research professionals.

The student affairs conferences are certainly viable venues, particularly the recent ones that focus specifically on technology, e.g., #NASPATech and #satechBOS.  I have drifted away from student affairs conferences over the past several years, though, so I will let others with more recent experience offer their opinions and evaluations.

If you find this kind of brainstorming helpful or interesting, feel free to add your thoughts below.  If enough people are interested, this would make for a good shared project to throw into a publicly accessible editing environment like a Google Doc.

Inserting Unique Survey IDs into Multipage Paper Surveys

I still believe in paper surveys.  I believe that their immediacy and accessibility make them very well-suited for some situations.  Although I value technology-based surveys (e.g., web-based, tablet-based), I definitely believe that there are times when paper surveys are superior.

You can imagine that I was very happy when my new employer approved the purchase of (a) a printer with an automatic duplex scanner and (b) an installation of Remark Office OMR 8.  These two tools together will allow us to conduct paper surveys with some level of ease, automation, and accuracy.  I’m particularly happy that this will allow us to break free from the tyranny of Scantron by allowing us to create customized survey instruments that don’t rely on generic Scantron answer forms.

Now that I am learning how to use Remark Office OMR 8, I am figuring out all of those little things that I was previously able to count on other people to do, often without even knowing they were being done.  Most recently, I had to figure out how to add unique survey IDs to a multipage survey.  Let me break it down for you:

I have a survey that is six pages long.  On each page, I have the page number and I can tell Remark Office where that page number is so I don’t have to worry about keeping pages in order.  But I also need some way to link all of those pages together when I am scanning multiple surveys so the correct six pages are grouped together in the resulting data file.  Hence I need to add a unique survey ID to each page of each survey.  Adding page numbers is easy but how do I add survey IDs?

I had to do something similar for my dissertation instrument, but that was a one-page instrument so the process was simpler.  The multipage process took me a few hours to figure out and here is what I have settled on for now:

  1. Create the survey instrument.  I did this in Microsoft Publisher because it was the desktop publishing tool I had at hand.  I suppose you could use Word or something similar but it won’t give you nearly as much control over the layout.
  2. Print or save the survey as a pdf.
  3. Use that pdf to create another pdf with multiple copies of the survey instrument.  Right now, this is the clunkiest part of the process as I haven’t yet figured out how to directly print multiple copies of the instrument as a pdf.  Instead, I have to save multiple copies and merge them together.  It’s not entirely horrible: each merge doubles the number of copies, so it quickly becomes easy to make a single pdf file with many, many copies of the survey instrument.  (pdftk, the tool used in step 7 below, may offer a shortcut; see the first sketch after this list.)
  4. Create a simple Excel spreadsheet with the sequence of survey IDs.  My survey instrument has six pages so I end up with one column of numbers where each number is repeated six times before being incremented to the next one.  This spreadsheet is used in a mail merge, so I suppose this could just as easily be a comma-separated file or the output of some other program (see the second sketch after this list).  It’s important that the number of distinct survey IDs matches the number of surveys in your pdf (and thus that the row count matches the page count).
  5. Create a simple Word document whose only text is a merge field that will insert the survey IDs into the document.
  6. Merge the Word document and save or print the resulting file as another pdf.  You now have two pdf files with the same number of pages; one has survey instruments and the other has survey IDs.
  7. Use pdftk to add the survey ID pdf as a background to the survey instrument pdf.  pdftk is a simple command line tool that lets you manipulate pdfs.  It’s freely available for many platforms, including Windows.  I used the “multibackground” parameter to essentially merge these two pdfs into one, adding the survey IDs to the survey instruments (the exact command is the third sketch after this list).  I got lucky in that my survey IDs were well-aligned with my survey instrument, but you might have to modify one or both of your documents to get the survey ID to end up where you want it.
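
A possible shortcut for step 3: pdftk’s “cat” operation accepts the same input handle more than once, so a single command can concatenate repeated copies of a pdf.  I haven’t tried folding this into my workflow yet, and the filenames here are illustrative:

    pdftk A=survey.pdf cat A A output survey_x2.pdf
    pdftk A=survey_x2.pdf cat A A output survey_x4.pdf

Each doubling can be repeated until the file holds as many copies as needed.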
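
For step 4, generating the repeated ID column is a one-liner in R (the language I’ve been learning elsewhere on this blog); the counts and column name here are illustrative for 200 six-page surveys:

    # One row per printed page: each survey ID repeats six times before incrementing
    ids <- data.frame(SurveyID = rep(1:200, each = 6))
    write.csv(ids, "survey_ids.csv", row.names = FALSE)

The resulting comma-separated file can feed the mail merge in place of an Excel spreadsheet.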
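
And the step 7 command itself (filenames illustrative): “multibackground” stamps page one of the ID pdf behind page one of the instrument pdf, page two behind page two, and so on:

    pdftk instruments.pdf multibackground survey_ids.pdf output numbered_surveys.pdf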

Now that I have unique survey IDs for each survey and page numbers on each page, I can feed the surveys into the scanner in any order I want and everything will work!  I just have to ensure that they’re all right-side up because I don’t know how well Remark Office OMR 8 can detect and correct for upside-down instruments.  (It’s a feature of the software but I’ll have to test it; if this were a real concern I’d look into solutions such as cutting off or rounding one of the corners, but I’ll be working with small enough batches that it will be easier just to flip through the completed instruments.)

Dissertation Journal: Less Time and More Pressure Makes Kevin a Productive Boy

Although I have not finished my dissertation, I began a full-time job a little over a month ago.  I know that this is a dangerous move and that many people who leave school before completing their dissertation never complete it.  I also know that even in the best circumstances this will delay my progress.  This is a move motivated by the reality of five years of graduate student pay and loans, however, not by academic concerns.

So far this is working out well.  For over a year, I was stalled and made no progress at all.  I was paralyzed by indecision and fear and always eager to find other interesting and worthwhile projects.  I was also very good at dodging or redirecting questions from friends and colleagues.  But I knew that I wouldn’t be able to dodge questions from potential employers so I had to buckle down and get back on task – I didn’t have a choice.  Backing myself into a corner seems to have been the right choice as it forced me into action.

As I entered the job market, I began writing again so I could honestly tell interviewers that I was making substantive progress.  Even then I wasn’t writing as much or as often as I should have been.  Once I had a job offer, however, I knew that my days as a full-time student with lots of discretionary time were quickly coming to an end.  I finally got off my ass and wrote with the effort and work ethic that I should have employed a year ago so I could finish my first three chapters and submit them to my chair.  I knew that for about two months I wouldn’t have any time to devote to my dissertation so I did as much as I could before moving and starting a new job.  I finished new drafts of my chapters and submitted them to my chair the day before I began packing up and moving to Delaware.  It was a huge relief to have made substantial progress so I could move with a clear conscience and start a new job without this looming over me.

As I have settled into my new job, I have learned that I have been extraordinarily lucky to land a job where my supervisor, director, and colleagues are extremely supportive of me completing this terminal degree.  When I was offered this job, I wanted to negotiate a pay raise contingent on completion of my dissertation to incentivize it.  That wasn’t possible, as my supervisor had negotiated the highest pay she could get for me regardless of my doctorate or lack thereof.  But my supervisor wants me to finish my doctorate for my own benefit; when we discussed my goals for the year, she asked me to place this at the top of the list.  Today, she asked if I would like to carve some time out of my work schedule on a regular basis to work on my dissertation as a form of professional development.  I couldn’t ask for more and I now feel a responsibility to justify the support I have been given.

I was also very fortunate that one of my faculty members reached out to me to offer advice about completing the dissertation while working full-time.  I will share that advice in a separate post because it may be more interesting to a larger audience than news about my personal journey.

Plagiarism of ResNet Research

This does not represent the views or opinions of anyone other than myself.   Specifically but not exclusively, this does not represent the views or opinions of anyone with whom I have worked in the past, my employer, or anyone associated with ResNet, Inc.

I am very, very sad to have to write and publish this entry.  I have always thought very highly of ACUTA, the U.S. higher education professional organization that focuses on networking and telephony. They have produced high quality reports and conferences, including conferences and webinars at which colleagues and I have presented.  They were also very gracious in allowing me to visit their headquarters in Lexington, Kentucky, a few years ago to comb through some of their historical archives as I performed historical research.

Six months ago, on April 6, I contacted ACUTA to draw attention to material in the then-recently released ACUTA ResNet Survey that is identical to material in previous research conducted by me and other colleagues loosely associated with the ResNet Symposium (now the ResNet Student Technology Conference).  Although ACUTA initially claimed that any similarities were “inadvertent,” they later admitted that at least 15 of the 45 questions on their survey – one-third – are virtually identical to older questions copied without attribution.  Despite this admission, ACUTA has only partially and reluctantly acknowledged in public the previous work from which a substantial portion of their current survey was copied.  In particular, (a) the summary report and infographic associated with ACUTA’s survey make no mention whatsoever of the previous work upon which they are substantially built, and (b) the ACUTA website was only edited in the past few days, presumably in response to an e-mail I sent on September 28 allowing them one more week to make edits before making this issue public.

This is not a legal issue.  Although I am one of the copyright holders of the original 2005 and 2008 survey instruments and reports and I could pursue legal action against ACUTA and their contractor Forward Analytics, it is highly unlikely that I will do so.  I have no interest in making money from my original work or the work performed by ACUTA and Forward Analytics.  I’m not very interested in stopping ACUTA from conducting their surveys and publishing results; in fact, I’m quite pleased that the work is being continued and I am flattered that they believe that the survey instrument I helped create is of sufficient quality that they are reusing and building on it.

This is an ethical issue.  In academia, we respect the work that others have done by clearly drawing attention to it when we build on their work.  It is right to give people credit for what they have done, especially when we are benefiting from that work.  Moreover, it is essential that we give readers a clear idea of the provenance of our ideas so they can perform due diligence to assure themselves of the quality and rigor of our work.

It is not necessary to ask permission to build on the ideas of another; as far as I am concerned, ACUTA is welcome to use, modify, and adapt questions from the survey instruments I helped to develop.  But it is necessary to give us credit, both to acknowledge the work that my colleagues and I did and to allow others to know where some of the content in the ACUTA survey originated.  I don’t think I am asking very much in asking ACUTA to play by the same rules as everyone else in academia.  I am perplexed and saddened that, half a year after I first contacted them, ACUTA still has not taken the few minutes needed to add a sentence or a footnote to their documents acknowledging the work on which theirs is built.


[Image: Page one of draft 7 of the 2005 ResNet Survey. Note (a) the date in the bottom right corner – January 10, 2005 – and (b) the note at the very top identifying the previous research most influential on this instrument, an internal note that was later expanded when we solicited responses and published results of the survey.]

Plagiarism is a very serious charge.  ACUTA has acknowledged in private e-mail messages that many questions were copied from the 2005 and 2008 survey instruments.  I am not quite comfortable publicly publishing the contents of private e-mail messages but here are some examples of the evidence that originally led me to be concerned about this:

1. Based on ACUTA’s report, their survey instrument asked “Is your institution’s residential network separate from the rest of the campus network(s)?” with the response options of (a) Yes, only physically, (b) Yes, only logically, (c) Yes, both physically and logically, and (d) No.  In 2005, my colleagues and I asked “Is your residential computer network separate from the rest of the campus network(s)?” with the response options of (a) Yes, our residential computer network is physically separate, (b) Yes, our residential computer network is logically separate, (c) Yes, our residential computer network is both physically and logically separate, and (d) No.

2. Based on ACUTA’s report, their survey instrument asked “How many staff members (FTE) provide direct support to your campus residential computer network and its users?”  In 2008, my colleagues and I asked “How many full-time equivalent (FTE) staff provide direct support to your campus residential computer network and its users?”

3. ACUTA’s report states that “50% of IT Departments pay for bandwidth supplied to the residential networks but do not recover the cost.”  In 2005, my colleagues and I asked “Who pays for the bandwidth available to the residential computer network and are the costs recovered? (Check all that apply)” with the response options of (a) An outside vendor supplies the bandwidth and recovers some or all of the cost through a charge to the university, (b) An outside vendor supplies the bandwidth and recovers some or all of the cost through resident fees, (c) Central IT pays for it and recovers some or all of the cost through fees to residents or interdepartmental charges to Housing, (d) Central IT pays for it and does not recover the cost, (e) The Housing department pays a non-university ISP and recovers some or all of the cost through rent or other fees, and (f) Other (please specify) [emphasis added].

4. ACUTA’s report states that respondents were asked “What organization on your campus is primarily responsible for maintaining the infrastructure of your residential computer network?” with two pie charts displaying the responses, one pie chart for the Logical Infrastructure and the other pie chart for the Physical Infrastructure.  In 2005, my colleagues and I asked “What organization on your campus is primarily responsible for maintaining the physical infrastructure of the computer network for your on-campus housing facilities? Examples of this responsibility may include physical installation and maintenance of wiring, network switches, and installing and repairing data ports. (Check all that apply)” and “What organization on your campus is primarily responsible for managing the logical infrastructure of the computer network for your on-campus housing facilities? Examples of this responsibility may include configuring switches and routers, monitoring network traffic, administering servers (DHCP, DNS, etc.), and shaping/filtering network traffic. (Check all that apply)”

5. ACUTA’s report states that “About 9 % of higher education institutions report thet [sic] they are currently outsourcing all or significant portions of their residential network. Another 4% of survey respondants [sic] indicate they are currently considering oursourcing [sic], while 15% of institutions have considered outsourcing their residential network but have yet to pursue such an option.”  In 2005, my colleagues and I asked “Has your institution considered outsourcing any significant portion of the residential computer network, including its support or maintenance, to an outside entity not affiliated with your institution?” with the response options of (a) Yes, we have outsourced significant portions to a non-university organization, (b) Yes, we have considered outsourcing to a non-university organization but not pursued it, (c) We are considering outsourcing to a non-university organization right now, (d) No, we have not seriously considered outsourcing to a non-university organization, and (e) Other (please specify).

New Job: Hello Assessment, Goodbye Student Affairs

Three weeks ago, I started a new job: Senior Research Analyst in the Center for Teaching and Assessment of Learning at the University of Delaware.  I have not updated this blog, responded to blog comments, or even looked at Twitter and some e-mail messages for the past month-and-a-half as I’ve been busy and focused on moving halfway across the country and starting a new job.  That should change as I settle into things and regain my focus.

My new job focuses on assessment of student learning, particularly general education goals.  Some of that will involve analyzing existing assessment data and helping faculty and administrators understand the results, including providing them with concrete recommendations.  Some of that will involve working with others to create or modify plans to assess student learning.  I already know that I will work some with our ePortfolio program as our FIPSE ePortfolio grant pays for part of my salary.  Similarly, I am already working with our Howard Hughes Medical Institute Undergraduate Science Education Program grant as that grant also funds a small part of my salary.  I am also very pleased to already be involved in consulting with faculty on research design and assisting my colleagues with teaching and learning workshops.

My new job, however, does not focus on or often interact with student affairs programs and staff.  I have already applied the skills and knowledge I gained working with student affairs programs and earning a student affairs graduate degree, so this is not a complete disconnect.  But I will not be working in the culture that has been most familiar to me throughout the first decade of my professional life, and that is a little bit daunting and sad.  On the whole, however, I am ready to move on: I am eager for new challenges and very happy to work in assessment and faculty development.  I confess that a tiny bit of that is related to my experiences in my job search, especially the dearth of appropriate jobs in student affairs for someone with my deep and broad knowledge of technology.  I am also sad to be leaving many of the professional communities that have been so important to me, particularly the less formal but more spontaneous ones like #satech and #sachat.  But ultimately I am very happy to have found a new home that will allow me to stretch my wings and apply my analytical skills in a job whose fundamental function is to ask and answer very important questions.

As I move on, I will be working to tie up loose ends and bring some projects to closure.  In particular, I still plan to complete my research into student affairs professionals’ historical views and uses of technology; I am not sure what form that will take (journal article(s)? a series of lengthy blog posts? an interactive timeline?) but I already know that I will not be presenting at #NASPAtech next month as originally planned.  Of course, I also have to complete my dissertation but that deserves a separate post entirely as it’s more complicated and not tied to student affairs.

I don’t think that this new job will dramatically or instantaneously change many of my broad interests or the topics of this blog aside from the obvious shift away from student affairs, a shift that has been underway for quite some time now anyway.  The impact and use of technology in higher education is still one of my primary interests and I hope that my new job will provide me with new insights and spark new questions.  For example, I am sure that my work with ePortfolios will inform my thinking.  Additionally, I still have connections with other researchers who actively work on interesting questions, and I plan to continue working with some of them.  And I’m already beginning to work with faculty here who share these interests, including one who is beginning to explore possible predictors of student success in hybrid courses (compared to face-to-face courses), so the future is bright!

Dorm vs. Residence Hall: A Silly Debate Nearly 100 Years Old

In most professions, there are certain words or phrases that are used to mark oneself as a member, someone who is “in.”  Many student affairs professionals doggedly avoid referring to on-campus housing units as “dorms,” even going so far as to take offense at the term and to correct those who use the hated word.  The preferred term is “residence hall,” a phrase that is used because “dorm” is perceived by some as being too cold and distant to describe someone’s home.  This is an issue on which a significant amount of energy is spent – just google “dorm vs residence hall” and you’ll immediately be thrown into the battlefield.

Personally, I think the debate – one which sometimes becomes inexplicably heated and emotional – is very silly and usually a waste of time and energy better spent on substantive issues.  But my point here isn’t to convince you that I’m right.  I only want to share a surprising finding from the historical documents I’m currently reviewing: this debate has been raging for nearly 100 years!

The conference proceedings for the 1941 meeting of the National Association of Advisers and Deans of Men (NADAM), the organization that later changed its name to NASPA, includes a talk given by Mr. R. B. Stewart, Controller of Purdue University (no, I don’t know what that title means, either), on the topic of “Institutional Housing Policies.”  In describing the student housing at Purdue, he noted:

Our approach to the student housing program began in 1927, when we received funds and borrowed money for the erection of our first Residence Hall for men.  At that time, our Trustees eliminated from Purdue terminology the use of the word “dormitory”, and since that date we refer to our housing units as “residence halls,” intending to convey the fact that our units are something more than places to sleep and for one’s being.

Whoa!  I knew that this battle against the word dorm had begun before my time in higher education but I had no idea that it was this old!
