Many New Ideas are Quite Old

Now that I’ve finished my dissertation, I finally feel free to turn my attention to other scholarly pursuits.  I feel an obligation to bring closure to the historical work I began a few years ago so I will be spending the next several months working with primary sources and reworking old drafts into publishable articles.  More than feeling an obligation to finish this work, I genuinely enjoy conducting historical research because I find it interesting and comforting to continually discover that many of today’s challenges and issues have been with us for decades or even centuries.

Below, I share some quotes from early- to late-twentieth-century sources that would be at home in an article, book, or blog post written in 2014.  After reading each quote, try to guess the year it was written before you continue reading.


 

 

It's very common for us to worry about the effect of technology on our personal lives and psyches.  We've worried whether Google is making us stupid and we've worried whether Facebook is diminishing the value and meaning of friendship.  Faculty wonder if their jobs are being increasingly outsourced to MOOCs and learning analytics funded by the Gates Foundation.  Parents and teachers question the rise of standardized tests and their primacy in education.  Of course, the broad threads of these worries are quite old.  But how old?  When do you think this was written?

“Today we have so surrounded ourselves with mechanical records that we may have ceased being personalities and have become machines…. In the present day of statistics and correlations, tests are given for everything except the things worth while.”

It comes from remarks given in 1929 by Dean Emeritus Stanley Coulter of Purdue University as recorded in the Secretarial Notes of the Tenth Annual Conferences of Deans and Advisers of Men.  This quote reflects a recurring theme in national student affairs conferences throughout the twentieth century: that education had become too mechanistic and that we focus only on what standardized tests measure.  The same idea appears decades later in the student protests at Berkeley, with some students wearing punch cards and t-shirts emblazoned with "Do not fold, spindle, or mutilate" as they protested the alleged depersonalization of higher education (among many other grievances).


 

The creation myth of the student affairs profession is that the profession came into being as faculty became increasingly uninterested in student life outside of the classroom.  So it's no surprise that student affairs professionals have long felt locked out of the central scholarly processes of the academy.  This judgment and the related frustration are aptly expressed in this quote:

“The residence halls, the extra-curriculum, the sports programs, the publications, all should be an integral part of the educative process — but they are only a part, and let’s face it, the second part. The classroom remains the core of our enterprise. The college could go on without the extra-curriculum. The curriculum is indispensable.

“The educational values of the extra-curriculum cannot be realized unless we understand, and are closely allied with the curriculum itself — unless the force of our work is felt and favorably received by the members of the academic community who are solely academic in their interests and pursuits.”

I could slip this into a student affairs article tomorrow and it would fit right in.  Who originally said it and when?  NASPA President Robert M. Strozier from the University of Chicago included this in his Presidential address at the 1954 NASPA national conference.   Even outside of student affairs, I echo these ideas on a regular basis as I work to bridge the curriculum and co-curriculum from my vantage point in faculty development.


 

I've just finished reading a historical overview of undergraduate student culture in the U.S. in the eighteenth, nineteenth, and twentieth centuries.  The book itself is on the fringes of becoming a historical artifact since it's more than a quarter-century old.  One of the most interesting parts of the book is the final chapter, in which the author tries to apply the historical material to form an understanding of the author's students at the time of writing.  In this synthesis, the author describes the students who "call the shots [and] provide the dominant model of how to be an undergraduate" (p. 288) and who reverse the judgment of previous generations of students who held grades to be nearly meaningless. Instead, grades are

“the ultimate value [that] do not reflect innate differences in intelligence; rather they result from figuring out what their professors want, spending long hours in study, and currying favor with their instructors…. In the classroom, they accept all the terms that the professor sets. Privately they may grumble or criticize faculty eccentricities, but their words sound like the grousing of a monarch’s subjects, an indirect means of confirming his or her power” (p. 269).

In her 1987 book Campus Life: Undergraduate Cultures from the End of the Eighteenth Century to the Present, Helen Lefkowitz Horowitz laments that "today's" (1987's) students are entirely focused on grades without any real interest in knowledge or critical thinking.  That's an incredibly familiar complaint among today's faculty!


 

Finally, I return to technology and indeed to the core idea that motivated my selection of “MistakenGoal.com” as the URL of this website.  For several years, I included a Stanley Katz quote in the header of this webpage: “…technology is not something that happens to us. It is something we create. We must not confuse a tool with a goal. We must, therefore, be sure that technology serves the fundamental purposes of higher education.”  That quote comes from a 2001 Chronicle of Higher Education article but the thought has been expressed by many people.  One of my favorite formulations:

“Except in a very few disciplines, technology is not an end in and of itself – it is the means to achieve some other scholarly aim. Technology, however, has an allure and a seductiveness that occasionally catches all of us, and we forget the original goal as we become captivated with the process.” (p. 11)

This quote predates Katz’s article by 12 years and appears in Brian Hawkins’s introduction to the 1989 book Organizing and Managing Information Resources on Campus.  This is a timeless warning to which I continually return.  It’s as familiar an idea as the other thoughts that are expressed in these quotes and a reminder that many of today’s problems have always been with us.  These problems sometimes seem to be too big to conquer because they have deep roots in our culture and society.  Some people might be dispirited by that idea but I take comfort that we’re not alone and we stand alongside those who went before us as we fight these good fights.

They Said it Better Than I Can

I'm a bit ashamed and embarrassed that I haven't written anything here in so long!  The fall semester was very, very busy, but another reason I haven't written is that so many eloquent, informed people have already said the things I want to say, and much better than I could have.  Here are some of the blogs I follow that regularly impress me:

  • e-Literate: Led by Michael Feldstein, this group of authors routinely posts insightful and detailed information about technology and U.S. higher education (e.g., What Faculty Should Know About Adaptive Learning, State of the Anglosphere's Higher Education LMS Market: 2013 Edition).  Some of their posts are a little bit hyperbolic and occasionally shrill (presumably to attract more readers), particularly this post.  Despite the occasional over-the-top writing, this blog was an excellent source of information about the recent kerfuffle over Purdue University's learning analytics software.
  • Culture Digitally: This is another group blog, one that describes itself as "a gathering point around which scholars who study cultural production and information technologies can think together."  This blog doesn't focus on higher education but it has posts from some wonderful researchers on the cutting edge of culture and technology.  I particularly like this recent post discussing "big data" and its potential shortcomings.
  • The Young and the Digital: This website is a companion to S. Craig Watkins's 2010 book of the same name.  It's a great book and it's very nice to be able to follow the author as he continues to develop and share his thoughts.  This post is a great example of the good thoughts that are shared on this website.
  • Microsoft Social Media Collective Research Blog: The title of this blog tells you almost all you need to know.  This is a group of exceptional researchers who appear to have significant freedom to conduct ethical research without being unduly influenced by their employer.  This post listing some researchers' picks for the most influential journal article has a year's worth of reading for anyone interested in social media.
  • Josie Ahlquist’s blog: A colleague – Joe Sabado, who has a nice blog of his own! – recently turned me on to Josie’s website.  She’s an EdD student who is beginning a dissertation focusing on “social media communication tools in higher education, focusing on college student use and educational methods to equip students to be positive productive citizens on emerging technologies.”  She is very well-informed and is doing a wonderful job of sharing and synthesizing the information she is discovering as she is completing her literature review. I’m very excited to follow her as she begins her research!

Self-regulated Learning and Age in a Hybrid Course

Earlier this spring, I worked with a wonderful faculty member to conduct research into a new hybrid version of an introductory Spanish course at our university.  He changed some sections of a 4-credit course that typically meets four days each week so that they met only two days each week, with a substantial increase in online activity.  I presented a paper on this research at the recent AIR conference addressing two basic questions: (a) Did students learn more or less in these hybrid sections? and (b) Did students who were more motivated or who exhibited better study skills – measured using the Motivated Strategies for Learning Questionnaire (MSLQ) – learn more?

The full details are in the paper but it appears that the answers to our questions are:

  1. Students didn’t learn any more or less in the hybrid sections.  This is consistent with the larger body of research that has found “no significant difference” between courses taught using different media.  In fact, this is good news in some ways since we can implement more hybrid sections and courses with some confidence that student learning won’t be negatively impacted.  This is particularly beneficial for us as these small four-credit courses require a lot of classroom space.
  2. The impact of self-regulated learning is unclear.  MSLQ scores were only partially related to two of the three outcome measures included in this study.  This is contrary to our expectations, as it seems reasonable that students who are motivated and use better study skills would learn more.

To me, the most interesting part of this study is the role of age in predicting student learning. We created several multiple regression models, and age was a negative predictor of student grades but a positive predictor of improved proficiency in reading Spanish. In other words, after we accounted for things such as race/ethnicity and gender, older students tended to earn lower grades but they also seemed to learn more about reading Spanish (but not about listening to Spanish).  So older students have learned how to study more effectively and are more motivated to learn, right?  No, at least not according to the MSLQ: age was not significantly correlated with MSLQ scores.
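
For readers who like to see the shape of such models, here is a minimal sketch in Python (pandas and statsmodels) of the kind of multiple regression described above.  The file name and column names are hypothetical placeholders, not the actual variables or data from our study.

    # Minimal sketch of the kind of regression described above.
    # The file and column names below are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per student in the hybrid sections (hypothetical file).
    df = pd.read_csv("spanish_hybrid_students.csv")

    # Course grade as the outcome, with age and demographic controls.
    grade_model = smf.ols(
        "final_grade ~ age + C(gender) + C(race_ethnicity)", data=df
    ).fit()

    # Gain in Spanish reading proficiency as the outcome, same predictors.
    reading_model = smf.ols(
        "reading_gain ~ age + C(gender) + C(race_ethnicity)", data=df
    ).fit()

    # In our results, the age coefficient was negative for grades but
    # positive for reading gains.
    print(grade_model.summary())
    print(reading_model.summary())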

In addition to the quantitative measures used in this study, we also interviewed several students.  At the same time, we repeatedly interviewed students in some math courses that were also being modified – "flipped" – during the same semester.  We were consistently impressed with the older students in our interview sessions and very much enjoyed their maturity and self-reflection.  That suggests an interesting hypothesis: Were the older students in this study simply less concerned with grades and more concerned about learning?

Student Engagement Infographic


Infographic of University of Delaware senior student participation in selected high-impact practices from NSSE 2011. This image links to the full-size graphic.

Last week, my colleagues and I presented the final UD First Friday Roundtable on Teaching of this semester.  We focused on “student engagement,” specifically naming the session “What Does an Engaged UD Student Look Like?”  It was a good session with lots of great discussion but right now I want to narrowly and briefly focus on two graphics that we whipped up for our supporting materials.

The first image in which you might be interested is a simple infographic showing University of Delaware student participation in some high-impact practices as extrapolated from NSSE 2011 responses.  The image to the right shows one part of the entire image to give you an idea of what it looks like. This extract from the full-size image links to a larger version of the full infographic; some of the text is too small to read even in that large image, so you can also download the full-size PDF. It worked out quite well as a full-size poster and I also modified it to work as a handout for attendees.  It's not bad for the amount of time I had to put into it, although I would have liked to do a lot more and a lot better.

For each of the six selected high-impact practices, I included not only the overall percentage of senior students who reported participating in them but also the subgroups for which there were significant (p ≤ .05) differences.  I looked at differences between students of different genders, white and non-white students, students in STEM and non-STEM disciplines, and first-generation and non-first-generation students.  If I had had more time, I would have loved to create another set of graphics illustrating the impact of these practices, or of some broader measure of student engagement, on self-reported GPA and gains, especially if our data showed what the national data tend to show: that these activities, and engagement overall, sometimes have more impact on some kinds of students than on others.
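
As an illustration of the kind of subgroup comparison behind those significance flags – not necessarily the exact test we ran – here is a minimal sketch of a two-proportion z-test on participation in a single high-impact practice, using invented counts.

    # Sketch of one subgroup comparison; all counts are invented for illustration.
    from statsmodels.stats.proportion import proportions_ztest

    # Seniors reporting participation in one practice (e.g., undergraduate research),
    # split by first-generation status: [first-generation, non-first-generation].
    participated = [120, 310]
    respondents = [400, 800]

    z_stat, p_value = proportions_ztest(count=participated, nobs=respondents)
    print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
    # A p-value at or below .05 would flag this gap on the infographic.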

2013-05-03 Engagement Roundtable word cloud

The second image is a simple word cloud we used on some of our materials, such as the agenda and signs.  I know that word clouds are passé (or maybe I'm the only one who thinks so) but this was a really simple and quick image for us to create.  Just as important, it was closely tied to the topic of the event: we used the text of one of the primary resources we drew on to develop and think about the event – Kuh's 2008 AAC&U high-impact practices publication* – as the input for the word cloud, which was generated using Tagxedo.
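
If you would rather script this step than use a web tool like Tagxedo, here is a minimal sketch using the third-party Python wordcloud package; the input file name is a hypothetical stand-in for whatever source text you choose.

    # Sketch of generating a word cloud from a text file; the file name is hypothetical.
    from wordcloud import WordCloud

    with open("kuh_2008_high_impact_practices.txt", encoding="utf-8") as f:
        text = f.read()

    cloud = WordCloud(width=1200, height=800, background_color="white").generate(text)
    cloud.to_file("engagement_roundtable_wordcloud.png")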

 

* Kuh, G. D. (2008). High-impact educational practices: What they are, who has access to them, and why they matter. Washington, DC: AAC&U.

Are High Impact Practices Available Online?

I am still wrestling with my unease with MOOCs and I think I’ve finally figured out why: High impact educational practices, as we understand them today, are unlikely at best and impossible at worst in MOOCs and other similar online environments.

First, it's helpful to understand that "high impact practice" (HIP) is a term of art.  Although the phrase itself is common enough, in the past ten years or so it has taken on special significance in U.S. higher education.  Popularized by George Kuh and emerging partly from research using data from the National Survey of Student Engagement (NSSE), the phrase has come to mean a particular set of activities that many higher education researchers believe are especially effective in promoting important and lasting changes in undergraduate students: First-Year Seminars and Experiences, Common Intellectual Experiences (i.e., core curricula), Learning Communities, Writing-Intensive Courses, Collaborative Assignments and Projects, Undergraduate Research, Diversity/Global Learning, Service Learning and Community-Based Learning, Internships, and Capstone Courses and Projects.

Unfortunately, we sometimes place too much focus on these particular activities without understanding why these activities have a high impact.  As originally described by Kuh in 2007, these practices share six characteristics:

  1. HIPs "demand that students devote considerable amounts of time and effort to purposeful tasks" (p. 7)
  2. HIPs place students in circumstances that require they "interact with faculty and peers about substantive matter" (p. 7)
  3. HIPs greatly increase the likelihood that students will interact with people who are different from themselves
  4. HIPs provide students with very frequent – sometimes continuous – feedback from faculty and peers
  5. HIPs require students to operate in intellectually complex ways by connecting knowledge in different courses and applying it in different contexts (e.g., confronting complex real-world issues, investigating unfamiliar research problems)
  6. HIPs occur in the context of a "coherent, academically challenging curriculum" (p. 8)

I am particularly interested in these characteristics of high impact practices because I will be helping lead a discussion on my campus next month focused on student engagement.  Most of the participants will be faculty and much of our focus will be on activities that faculty are using or can use in their curricula to promote student engagement.  Given that focus, I don't think it would be helpful to dwell on the specific activities identified as HIPs, as those are often beyond the resources and purview of an individual faculty member.  Instead, we will focus on why those activities have a high impact so we can apply those principles to the activities within the power and resources of individual faculty members.

That was at the forefront of my mind when I "attended" an EDUCAUSE Learning Initiative (ELI) online conference last week that focused on MOOCs.  The conference included some very active discussions among participants, and as I took part in them it occurred to me that one of the primary reasons I am uncomfortable with MOOCs is that it is difficult or impossible to apply much of what we know about good teaching in that environment.

Look back up at those six principles of high impact practices.  How do we apply those principles in a MOOC?  More pointedly, can we apply those principles in a MOOC?  I despair that the answer is mostly "no."  I pray that this is a simple lack of imagination on my part, a misunderstanding of what we can do in a MOOC, or a fatal flaw of the dominant MOOC model that others will quickly recognize and either fix or use as a reason to abandon that model.  I also confess that I don't completely understand the discussions about "xMOOCs" and "cMOOCs" on anything but a very theoretical and abstract level, and I have a sneaking suspicion that I'm missing something very important in how cMOOCs address some of these principles.

There is another interesting and hopeful way to think about this.  Another ELI conference attendee – I'm sorry that I don't remember who – suggested that there may be other paradigms of effective educational practices that MOOCs might better fit.  Although I am a little bit skeptical that our understanding of effective education is going to be radically upended, this recommendation not to be too constrained by our current thinking is a very good one.  In fact, that is one important reason why I will be trying to steer our discussion here on my campus next month away from the specific activities and toward the broader principles so we can compare our thinking about student engagement with that of others.  The idea isn't to impose the model on my campus but to use it as a common starting point that must be adapted to our unique needs and resources.

That, of course, is what we’ll need to do with MOOCs: Use our best understanding of effective teaching and shape it to this unique environment with unique affordances.  I don’t know how to do that and I don’t know if that is what is being done.  I am wary that much of what is being done is not methodical and not built on what we know about how people learn.  I am especially skeptical that we can provide the kind of demanding and socially and intellectually connected experiences that we know provide some of the best learning.  I hope that people smarter than I are figuring this out, though, and working out how MOOCs can provide high impact educational practices.

Students Believe Their Computer Skills Are Below Average

Our colleagues at the Higher Education Research Institute (HERI) at UCLA have publicly released some information from their annual survey of first-year students.  There are already several media reports on the topic and we can expect many more to come out over the next few days.  What caught my eye is that they shared some of their data with The Chronicle of Higher Education, which created an interactive graphic showing how student responses have (or have not) changed over time.

Several of the questions on the survey ask students to compare themselves “with the average person your age” and “rate yourself above average or better in terms of __.”  For nearly all of the questions of this form, students have consistently rated themselves as above average: ability to see the world from someone else’s perspective, tolerance of others with different beliefs, openness to having their own views challenged, ability to discuss and negotiate controversial issues, ability to work cooperatively with diverse people, academic ability, emotional health, and physical health.

So for what topics do respondents believe they are below average?  Computer skills, spirituality, and writing ability.  I don’t care to comment on spirituality (a commenter on the Chronicle’s website asks a good question: “What on earth does [that question] mean?”).  I’m puzzled that first-year college students believe they are below average in writing ability but I’m not an expert on writing so I’ll leave that puzzle to others.

What does it mean that 35% of the respondents to the survey rate themselves below average in computer skills?  And what does it mean that students have consistently responded like this since the question was first asked in 1999?  Well, to know for sure we’d have to ask them.  I would want to know how they interpret “computer skills.”  What do they consider to be computer skills?  How are they measuring their computer skills?  And to whom are they comparing themselves?  Heck, given the proliferation of smart phones and tablets it would be a good idea to ask students (and ourselves!) just what they think of as a “computer.”

One possible factor in all of this may be related to the gender imbalance in undergraduate education in the U.S.  More women than men are enrolled in U.S. colleges and universities.  According to the most recent data published by the National Student Clearinghouse Research Center, 56% of the students enrolled in the fall of 2012 were women.  Why is this important?  We know that women typically underestimate their computer skills whereas men typically overestimate theirs.  If the data reported by the Chronicle are unweighted, this may have an even larger effect because women typically respond to surveys at higher rates.
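
To see why weighting could matter, here is a back-of-the-envelope sketch with invented numbers (HERI has not, to my knowledge, released these breakdowns): if women are over-represented among respondents and rate themselves lower than men do, the unweighted aggregate will sit higher than an enrollment-weighted one.

    # Back-of-the-envelope sketch; every number below is invented for illustration.
    female_share_respondents = 0.65   # assumed over-representation among respondents
    female_share_enrollment = 0.56    # enrollment share cited above
    below_avg_women = 0.42            # invented subgroup rates of "below average"
    below_avg_men = 0.22

    unweighted = (female_share_respondents * below_avg_women
                  + (1 - female_share_respondents) * below_avg_men)
    weighted = (female_share_enrollment * below_avg_women
                + (1 - female_share_enrollment) * below_avg_men)

    print(f"unweighted: {unweighted:.1%}, weighted to enrollment: {weighted:.1%}")
    # With these invented rates: roughly 35% unweighted vs. about 33% weighted.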

(Aside: The National Student Clearinghouse Research Center is doing some incredibly cool and vital research these days. They have a huge warehouse of data about college enrollment and it's great to see them putting it all to use! Check out what they're doing – it's good stuff.)

In any case, it's interesting that so many first-year students at 4-year institutions believe their computer skills are below average.  I doubt that it's actually true but I would certainly agree that they are nowhere near as proficient as some of the common assumptions (e.g., "digital natives") make them out to be.  Is this a problem?  Should we be worried or looking for a solution?  That's a different and more complex discussion, but I think it's safe to say that first-year college students are precisely as proficient as they have needed to be given how they use computers in their daily lives – just like everyone else.  They don't typically use their computers to perform complicated or deeply technical tasks, so why would we expect them to be profoundly tech savvy?

Venues for Publishing Student Affairs Technology Research

One of my colleagues recently made an offhand remark about the timeliness of an article in the current issue of The Journal of College Student Development.  Rather than focus on the comment or the specific article, however, it seems more productive to explore venues that would allow us to publish similar work in a more timely manner.

The problem?  Much of the research that we conduct about technology must be shared and disseminated quickly to keep up with the rapid pace at which technologies and their uses change.  Many of the traditional venues for publication and dissemination of research have huge lag times, sometimes a few years long; this is particularly problematic for technology-related research, which grows out-of-date much more quickly than many other bodies of information.  I have conducted research that grew out-of-date before I could get it published in peer-reviewed journals, e.g., work conducted with my colleague Chris Medrano examining content in Wikipedia articles about U.S. colleges and universities.  I have also had data – really good data about interesting stuff! – grow stale over the course of a very busy year-and-a-half such that I could no longer work with it.  (I could have worked with it, and it was such cool stuff that I'm sure it would have been published somewhere, but I would have felt horrible and a little bit ashamed about it!)

Although I have moved out of student affairs, I continue to do work about student and faculty use of technology so this is still an issue that is important to me.  I’d like your help in thinking about how we get our work out there.  Here are some of my thoughts:

  • Does the publication or release need to be through a traditional, peer-reviewed venue?  Even for those of us who believe ourselves to be locked into the traditional academic world where peer-reviewed publications remain the gold standard, I think the answer is "no."  It might be acceptable to blog about your findings or present them at non-traditional conferences, especially if those venues allow you to better reach your intended audience (e.g., how many full-time student affairs professionals regularly pore over peer-reviewed journals?).
  • For those who do believe in the necessity or value of publishing or presenting in traditional venues, which ones allow us to disseminate our findings in a timely manner?  My initial reaction to the comment that began this entire line of questioning is that JCSD is a fine venue but it moves too slowly to publish much of the technology-related research I have conducted.  In fact, most of the peer-reviewed journals in higher education move too slowly for me to consider them viable venues for publication of timely technology-related research.

Maybe it would be helpful if we compiled a list of good venues for student affairs technology research.  (Although I'm mostly out of that field now, I still do some work in it and my experiences are significant enough that I think I can help.)  My suggestions, in no particular order:

  • First Monday: Online, peer-reviewed journal that focuses on Internet studies.  They have published higher education-specific work in the past so they seem open to the topic.  It’s also a respected venue for scholarly work.  Very importantly, I understand that they review submitted articles very quickly.
  • Journal of Computer-Mediated Communication (JCMC): Peer-reviewed journal with an obvious focus.  Like First Monday, they have published work in our field.  It’s also the most respected venue that is usually on my radar screen for timely publication of relevant work.
  • The Journal of Technology in Student Affairs: Another peer-reviewed journal with an obvious focus.  Although this is a viable venue, it’s probably not one that I would submit to as my first choice.  It’s a fine publication but it simply doesn’t have a strong, high-profile reputation.  That may sound very crass but the reality of scholarly publishing is that it’s important to publish in the most highly regarded journals possible.
  • EDUCAUSE Review Online (ERO): Although ERO publishes some peer-reviewed work, it largely exists outside the traditional world of scholarly research because the publication is aimed at higher education IT practitioners.  With that said, it has historically been a very good venue for work that is intended for that audience although I haven’t published in it since they changed their format (EDUCAUSE used to have a monthly magazine and a quarterly peer-reviewed journal; they’ve been merged into one publication, ERO).

Outside of formal publications, several conferences are good venues to present and discuss this kind of work. I personally like EDUCAUSE events quite a bit but the audience that is interested in student affairs-specific work is pretty small.  The EDUCAUSE Learning Initiative (ELI), the arm of EDUCAUSE that focuses on teaching and learning, also puts on really nice conferences with wonderful participants if your work is more oriented towards teaching and learning.  I have also presented at other higher education conferences such as the annual conferences for ASHE, AERA, and AIR.  They are large conferences and quite frankly I don’t care for them very much because (a) they lack focus and (b) I have difficulty believing that anything that happens at them impacts the world beyond being another line on my CV.  AIR is a bit better, though, because it does have some focus and much of the work discussed there has real-world implications and impact largely because of the strong presence of institutional research professionals.

The student affairs conferences are certainly viable venues, particularly the recent ones that have begun cropping up that focus specifically on technology (e.g., #NASPATech, #satechBOS).  I have drifted away from student affairs conferences over the past several years, though, so I will let others with more recent experience offer their opinions and evaluations.

If you find this kind of brainstorming helpful or interesting, feel free to add your thoughts below.  If enough people are interested, this would make for a good shared project to throw into a publicly-accessible editing environment like a Google doc.

Plagiarism of ResNet Research

This does not represent the views or opinions of anyone other than myself.   Specifically but not exclusively, this does not represent the views or opinions of anyone with whom I have worked in the past, my employer, or anyone associated with ResNet, Inc.

I am very, very sad to have to write and publish this entry.  I have always thought very highly of ACUTA, the U.S. higher education professional organization that focuses on networking and telephony. They have produced high quality reports and conferences, including conferences and webinars at which colleagues and I have presented.  They were also very gracious in allowing me to visit their headquarters in Lexington, Kentucky, a few years ago to comb through some of their historical archives as I performed historical research.

Six months ago, on April 6, I contacted ACUTA to draw attention to material in the then-recently released ACUTA ResNet Survey that is identical to material in previous research conducted by me and other colleagues loosely associated with the ResNet Symposium (now the ResNet Student Technology Conference). Although ACUTA initially claimed that any similarities were "inadvertent," they later admitted that at least 15 of the 45 questions – one-third – on their survey are virtually identical to older questions copied without attribution.  Despite this admission, ACUTA has only partially and reluctantly acknowledged in public the previous work from which a substantial portion of their current survey was copied. In particular, (a) the summary report and infographic associated with ACUTA's survey make no mention whatsoever of the previous work upon which they are substantially built, and (b) the ACUTA website was edited only in the past few days, presumably in response to an e-mail I sent on September 28 allowing them one more week to make edits before I made this issue public.

This is not a legal issue.  Although I am one of the copyright holders of the original 2005 and 2008 survey instruments and reports and I could pursue legal action against ACUTA and their contractor Forward Analytics, it is highly unlikely that I will do so.  I have no interest in making money from my original work or the work performed by ACUTA and Forward Analytics.  I’m not very interested in stopping ACUTA from conducting their surveys and publishing results; in fact, I’m quite pleased that the work is being continued and I am flattered that they believe that the survey instrument I helped create is of sufficient quality that they are reusing and building on it.

This is an ethical issue.  In academia, we respect the work that others have done by clearly drawing attention to it when we build on their work.  It is right to give people credit for what they have done, especially when we are benefiting from that work.  Moreover, it is essential that we give readers a clear idea of the provenance of our ideas so they can perform due diligence to assure themselves of the quality and rigor of our work.

It is not necessary to ask permission to build on the ideas of another; as far as I am concerned, ACUTA is welcome to use, modify, and adapt questions from the survey instruments I helped to develop. But it is necessary to give us credit, both to acknowledge the work that my colleagues and I did and to allow others to know where some of the content in the ACUTA survey originated.  I don't think it's asking very much for ACUTA to play by the same rules as everyone else in academia.  I am perplexed and saddened that, in the half year since I first contacted ACUTA, they have not taken a few minutes to add a sentence or a footnote to their documents acknowledging the work on which theirs is built.


Page from draft of 2005 ResNet survey

Page one of draft 7 of the 2005 ResNet Survey. Note (a) the date in the bottom right corner, January 10, 2005, and (b) the note at the very top identifying the previous research most influential on this instrument, an internal note that was later expanded when we solicited responses and published the results of the survey.

Plagiarism is a very serious charge.  ACUTA has acknowledged in private e-mail messages that many questions were copied from the 2005 and 2008 survey instruments.  I am not quite comfortable publicly publishing the contents of private e-mail messages but here are some examples of the evidence that originally led me to be concerned about this:

1. Based on ACUTA's report, their survey instrument asked "Is your institution's residential network separate from the rest of the campus network(s)?" with the response options of (a) Yes, only physically, (b) Yes, only logically, (c) Yes, both physically and logically, and (d) No.  In 2005, my colleagues and I asked "Is your residential computer network separate from the rest of the campus network(s)?" with the response options of (a) Yes, our residential computer network is physically separate, (b) Yes, our residential computer network is logically separate, (c) Yes, our residential computer network is both physically and logically separate, and (d) No.

2. Based on ACUTA’s report, their survey instrument asked “How many staff members (FTE) provide direct support to your campus residential computer network and its users?”  In 2008, my colleagues and I asked “How many full-time equivalent (FTE) staff provide direct support to your campus residential computer network and its users?”

3. ACUTA’s report states that “50% of IT Departments pay for bandwidth supplied to the residential networks but do not recover the cost.”  In 2005, my colleagues and I asked “Who pays for the bandwidth available to the residential computer network and are the costs recovered? (Check all that apply)” with the response options of (a) An outside vendor supplies the bandwidth and recovers some or all of the cost through a charge to the university, (b) An outside vendor supplies the bandwidth and recovers some or all of the cost through resident fees, (c) Central IT pays for it and recovers some or all of the cost through fees to residents or interdepartmental charges to Housing, (d) Central IT pays for it and does not recover the cost, (e) The Housing department pays a non-university ISP and recovers some or all of the cost through rent or other fees, and (f) Other (please specify) [emphasis added].

4. ACUTA's report states that respondents were asked "What organization on your campus is primarily responsible for maintaining the infrastructure of your residential computer network?" with two pie charts displaying the responses, one pie chart for the Logical Infrastructure and the other pie chart for the Physical Infrastructure.  In 2005, my colleagues and I asked "What organization on your campus is primarily responsible for maintaining the physical infrastructure of the computer network for your on-campus housing facilities? Examples of this responsibility may include physical installation and maintenance of wiring, network switches, and installing and repairing data ports. (Check all that apply)" and "What organization on your campus is primarily responsible for managing the logical infrastructure of the computer network for your on-campus housing facilities? Examples of this responsibility may include configuring switches and routers, monitoring network traffic, administering servers (DHCP, DNS, etc.), and shaping/filtering network traffic. (Check all that apply)"

5. ACUTA's report states that "About 9 % of higher education institutions report thet [sic] they are currently outsourcing all or significant portions of their residential network. Another 4% of survey respondants [sic] indicate they are currently considering oursourcing [sic], while 15% of institutions have considered outsourcing their residential network but have yet to pursue such an option."  In 2005, my colleagues and I asked "Has your institution considered outsourcing any significant portion of the residential computer network, including its support or maintenance, to an outside entity not affiliated with your institution?" with the response options of (a) Yes, we have outsourced significant portions to a non-university organization, (b) Yes, we have considered outsourcing to a non-university organization but not pursued it, (c) We are considering outsourcing to a non-university organization right now, (d) No, we have not seriously considered outsourcing to a non-university organization, and (e) Other (please specify).

Dorm vs. Residence Hall: A Silly Debate Nearly 100 Years Old

In most professions, there are certain words or phrases that are used to mark oneself as a member, someone who is “in.”  Many student affairs professionals doggedly avoid referring to on-campus housing units as “dorms,” even going so far as to take offense at the term and trying to correct those who use the hated word.  The preferred term is “residence hall,” a phrase that is used because dorm is perceived by some as being too cold and distant to describe someone’s home.  This is an issue on which a significant amount of energy is spent – just google “dorm vs residence hall” and you’ll immediately be thrown into the battlefield.

Personally, I think the debate – one which sometimes becomes inexplicably heated and emotional – is very silly and is usually a waste of time and energy better spent on substantive issues.  But my point here isn't to convince you that I'm right.  I only want to share a surprising finding from the historical documents I'm currently reviewing: This debate has been raging for nearly 100 years!

The conference proceedings for the 1941 meeting of the National Association of Advisers and Deans of Men (NADAM), the organization that later changed its name to NASPA, includes a talk given by Mr. R. B. Stewart, Controller of Purdue University (no, I don’t know what that title means, either), on the topic of “Institutional Housing Policies.”  In describing the student housing at Purdue, he noted:

Our approach to the student housing program began in 1927, when we received funds and borrowed money for the erection of our first Residence Hall for men.  At that time, our Trustees eliminated from Purdue terminology the use of the word "dormitory", and since that date we refer to our housing units as "residence halls," intending to convey the fact that our units are something more than places to sleep and for one's being.

Whoa!  I knew that this battle against the word dorm had begun before my time in higher education but I had no idea that it was this old!

Ongoing Research Into Student Affairs Technology History

Covers from old ACPA and NASPA conference proceedings. From upper-left, clockwise: NASPA 1930, NASPA 1950, ACPA 1942, ACPA 1932

I’ve written a few times about historical research I’ve done looking into how U.S. student affairs professionals have used and viewed technology throughout the 20th century.  Although I don’t know where my current job search will take me, I feel a responsibility to bring some closure to this research and then ensure it is somehow published or shared.

Much of my previous work was based on documents held at the National Student Affairs Archives at Bowling Green State University, especially the conference proceedings and programs for ACPA and NASPA.  My work is incomplete, however, because those (wonderful!) archives did not have most of the conference proceedings from the first half of the century.  However, another scholar told me that my own institution, Indiana University, has many of these proceedings.  Since I will probably be leaving Bloomington soon, I finally followed up on this tip and requested all of the conference proceedings in the IU library.  The two collections – IU and BGSU – complement each other very nicely, almost as if a single collection of all of the proceedings were divided evenly between the two libraries.  It would probably be a bibliographic faux pas to ask one of these libraries to donate their materials to the other one but it sure would be nice to have a nearly complete collection in one place.  At least the two universities are only a few hours apart so it’s not terribly burdensome for scholars who want to consult these materials.

I’ve only started reading through these documents and I’m already very glad that I requested them!   In just the handful of proceedings that I’ve read so far I’ve found interesting things such as:

  • Discussion of the negative effects of “mechanical devices” on education in 1928
  • A demonstration of IBM equipment for Deans of Men in 1950
  • A new program at the 1950 NASPA conference using audio recorders to collect and then distribute the distilled wisdom of its members.  In the opening session, NADAM President L. K. Neidlinger described this new program to attendees:

    You can also improve your mind and learn how to be a dean by going to the Recording Room, just off the lobby, at any time that suits your convenience, and asking the attendants there to hook you up to one of the tape recordings that we have been busy making last night and this morning. We are conducting there an interesting new experiment in convention technique. On each of several topics we have had a team of five deans record their experience and advice — all on the same tape. Anyone interested in these topics can pull up a chair, light a cigar, and listen at leisure to the advice of five colleagues who could not otherwise be interviewed so conveniently. He can then add his own comments by flipping a switch and talking. Furthermore, six months from now when you may have to educate a faculty committee on the facts of life about one of these topics, you will be able to write Fred Turner for the recording, borrow a machine, and bring these expert witnesses into your committee room.

  • A demonstration of the new Polaroid camera, with specific mention of its possible use in creating photographs for student IDs, in 1951

Even though I’ve just begun reading through these proceedings, I already have examples of (a) worry about the effects of technology on education and students, (b) discussions of the potential benefits of technology in student affairs administration, especially record keeping and processing, (c) demonstrations of new technology by vendors and pioneering institutions, and (d) innovative uses of technology initiated by members of the professional organizations themselves.  A history of regular and continued use of technology, including original innovations and cutting-edge uses, doesn’t seem to be part of the mainstream historical narrative of the student affairs profession but that seems to be the story I’m finding in the historical artifacts.

(Off-topic: Holy crap, are these proceedings products of their times!  I knew that the history of these two professional organizations was very gendered given their historical roots but I didn't expect the volume of casual sexism documented in these proceedings!  I did, however, expect some degree of racism and a large amount of homophobia and – sadly – my expectations have been met.  I'm not even looking for these things but they often come screaming out of the pages. I'm reminded of a moment in this story where a college student asks during a discussion about the Founding Fathers: "If the Founders loved the humanities so much, how come they treated the natives so badly?" It's mentally and spiritually jarring to read pages and pages of passionate discussion about the importance of each student and their intellectual and moral development followed by a casual dismissal of the competence of deans of women or a reminder of the psychological and moral depravity of homosexuality. The incongruity and dissonance make me wonder what normal, accepted practices and beliefs we hold today will cause these "Holy crap!" moments for future generations when they read our e-mails and watch our videos.)