Now that I’ve finished my dissertation, I finally feel free to turn my attention to other scholarly pursuits. I feel an obligation to bring closure to the historical work I began a few years ago, so I will be spending the next several months working with primary sources and reworking old drafts into publishable articles. Beyond that obligation, I genuinely enjoy conducting historical research; I find it interesting and comforting to continually discover that many of today’s challenges and issues have been with us for decades or even centuries.
Below, I share some quotes from early- to late-20th-century sources that would be at home in an article, book, or blog post written in 2014. After reading each quote, try to guess the year it was written before you continue reading.
I’m a bit ashamed and embarrassed that I haven’t written anything here in so long! The fall semester was very, very busy, but one reason I haven’t written anything is that so many eloquent, informed people have already said the things I want to say, and much better than I could have. Here are some of the blogs I follow that regularly impress me:
Culture Digitally: This is another group blog, one that describes itself as “a gathering point around which scholars who study cultural production and information technologies can think together.” This blog doesn’t focus on higher education, but it has posts from some wonderful researchers on the cutting edge of culture and technology. I particularly like this recent post discussing “big data” and its potential shortcomings.
The Young and the Digital: This website is a companion to S. Craig Watkins’s 2010 book of the same name. It’s a great book, and it’s very nice to be able to follow the author as he continues to develop and share his thoughts. This post is a great example of the good thinking shared on this website.
Microsoft Social Media Collective Research Blog: The title of this blog tells you almost all you need to know. This is a group of exceptional researchers who appear to have significant freedom to conduct ethical research without being unduly influenced by their employer. This post, listing some researchers’ picks for the most influential journal articles, has a year’s worth of reading for anyone interested in social media.
Josie Ahlquist’s blog: A colleague – Joe Sabado, who has a nice blog of his own! – recently turned me on to Josie’s website. She’s an EdD student who is beginning a dissertation focusing on “social media communication tools in higher education, focusing on college student use and educational methods to equip students to be positive productive citizens on emerging technologies.” She is very well-informed and is doing a wonderful job of sharing and synthesizing the information she is discovering as she is completing her literature review. I’m very excited to follow her as she begins her research!
Earlier this spring, I worked with a wonderful faculty member to conduct research into a new hybrid version of an introductory Spanish course at our university. He changed some sections of a 4-credit course that typically meets four days each week so that they met only two days each week, with a substantial increase in online activity. I presented a paper on this research at the recent AIR conference addressing two basic questions: (a) Did students learn more or less in these hybrid sections? and (b) Did students who were more motivated or exhibited better study skills – measured using the Motivated Strategies for Learning Questionnaire (MSLQ) – learn more?
The full details are in the paper but it appears that the answers to our questions are:
Students didn’t learn any more or less in the hybrid sections. This is consistent with the larger body of research that has found “no significant difference” between courses taught using different media. In fact, this is good news in some ways since we can implement more hybrid sections and courses with some confidence that student learning won’t be negatively impacted. This is particularly beneficial for us as these small four-credit courses require a lot of classroom space.
The impact of self-regulated learning is unclear. Of the three outcome measures included in this study, performance on the MSLQ was only partially related to two outcomes. This is contrary to our expectations, as it seems reasonable that students who are motivated and use better study skills would learn more.
To me, the most interesting part of this study is the role of age in predicting student learning. We created several multiple regression models and age was a negative predictor of student grades but a positive predictor of improved proficiency in reading Spanish. In other words, after we accounted for things such as race/ethnicity and gender, older students tended to earn lower grades but they also seemed to learn more about reading Spanish (but not about listening to Spanish). So older students have learned how to study more effectively and are more motivated to learn, right? No, at least not according to the MSLQ results: Age was not significantly correlated with the MSLQ results.
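The pattern described above, one predictor carrying opposite signs on two different outcomes once other variables are controlled, can be sketched with ordinary least squares. Everything below – variable names, sample size, and coefficients – is illustrative synthetic data, not the study’s actual data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200

# Illustrative predictors (not the study's actual variables)
age = rng.uniform(18, 45, n)
female = rng.integers(0, 2, n).astype(float)

# Simulate the reported pattern: age negatively related to course grade
# but positively related to gains in reading proficiency
grade = 3.5 - 0.02 * (age - 18) + 0.10 * female + rng.normal(0, 0.3, n)
reading_gain = 0.5 + 0.03 * (age - 18) + rng.normal(0, 0.3, n)

# Multiple regression via ordinary least squares: intercept + age + gender
X = np.column_stack([np.ones(n), age, female])
b_grade, *_ = np.linalg.lstsq(X, grade, rcond=None)
b_read, *_ = np.linalg.lstsq(X, reading_gain, rcond=None)

print(f"age coefficient on grade:        {b_grade[1]:+.3f}")  # negative
print(f"age coefficient on reading gain: {b_read[1]:+.3f}")   # positive
```

The point of the sketch is simply that a single control variable can legitimately predict one outcome negatively and another positively in the same sample; there is no contradiction in the study’s findings.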
In addition to the quantitative measures used in this study, we also interviewed several students. At the same time, we repeatedly interviewed students in some math courses that were also being modified – “flipped” – during the same semester. We were consistently impressed with the older students in our interview sessions and very much enjoyed their maturity and self-reflection. That suggests an interesting hypothesis: Were the older students in this study simply less concerned with grades and more concerned about learning?
Last week, my colleagues and I presented the final UD First Friday Roundtable on Teaching of this semester. We focused on “student engagement,” specifically naming the session “What Does an Engaged UD Student Look Like?” It was a good session with lots of great discussion but right now I want to narrowly and briefly focus on two graphics that we whipped up for our supporting materials.
The first image in which you might be interested is a simple infographic showing University of Delaware student participation in some high-impact practices as extrapolated from NSSE 2011 responses. The image to the right shows one part of the entire image to give you an idea what it looks like. This extract from the full-size image links to a larger version of the full infographic; some of the text is too small to read even in that large image, so you can also download the full-size PDF. It worked out quite well as a full-size poster and I also modified it to work as a handout for attendees. It’s not bad for the amount of time I had to put into it, although I would have liked to have done a lot more and a lot better.
For each of the six selected high-impact practices, I included not only the overall percentage of senior students who reported participating in them but also the subgroups for which there were significant (p ≤ .05) differences. I looked at differences between students of different genders, white and non-white students, students in STEM and non-STEM disciplines, and first-generation and non-first-generation students. If I had more time, I would have loved to have created another set of graphics illustrating the impact of these practices, or of some broader measure of student engagement, on self-reported GPA and gains – especially if these data showed what the national data tend to show: that these activities, and engagement overall, sometimes have more impact on some kinds of students than on others.
I am still wrestling with my unease with MOOCs and I think I’ve finally figured out why: High impact educational practices, as we understand them today, are unlikely at best and impossible at worst in MOOCs and other similar online environments.
First, it’s helpful to understand that “high impact practice” (HIP) is a term of art. Although the phrase sounds generic, in the past ten years or so the term has taken on special significance in U.S. higher education. Popularized by George Kuh and emerging partly from research using data from the National Survey of Student Engagement (NSSE), the phrase has come to mean a particular set of activities that many higher education researchers believe are especially effective in promoting important and lasting changes in undergraduate students: First-Year Seminars and Experiences, Common Intellectual Experiences (i.e., core curricula), Learning Communities, Writing-Intensive Courses, Collaborative Assignments and Projects, Undergraduate Research, Diversity/Global Learning, Service Learning, Community-Based Learning, Internships, and Capstone Courses and Projects.
Unfortunately, we sometimes place too much focus on these particular activities without understanding why these activities have a high impact. As originally described by Kuh in 2007, these practices share six characteristics:
HIPs “demand that students devote considerable amounts of time and effort to purposeful tasks” (p. 7)
HIPs place students in circumstances that require they “interact with faculty and peers about substantive matter” (p. 7)
HIPs greatly increase the likelihood that students will interact with people who are different from themselves
HIPs provide students with very frequent – sometimes continuous – feedback from faculty and peers
HIPs require students to operate in intellectually complex ways by connecting knowledge in different courses and applying it in different contexts (e.g., confronting complex real-world issues, investigating unfamiliar research problems)
HIPs occur in the context of a “coherent, academically challenging curriculum” (p. 8)
I am particularly interested in these characteristics of high impact practices because I will be helping lead a discussion on my campus next month focused on student engagement. Most of the participants will be faculty, and much of our focus will be on activities that faculty are using or can use in their curricula to promote student engagement. Given that focus, I don’t think it would be helpful to dwell on the specific activities identified as HIPs, as those are often beyond the resources and purview of an individual faculty member. Instead, we will focus on why those activities have a high impact so we can apply those principles to the activities within the power and resources of individual faculty.
That is what was at the forefront of my mind when I “attended” an EDUCAUSE Learning Initiative (ELI) online conference last week that focused on MOOCs. The conference had some very active discussions among participants, and as I participated in those discussions it occurred to me that one of the primary reasons I am uncomfortable with MOOCs is that it is difficult or impossible to apply much of what we know about good teaching in that environment.
Look back up at those six principles of high impact practices. How do we apply those principles in a MOOC? More pointedly, can we apply those principles in a MOOC? I despair that the answer is mostly “no.” I pray that it is a simple lack of imagination on my part, a misunderstanding of what we can do in a MOOC, or that this is a fatal flaw of the dominant MOOC model that others will quickly recognize and either fix or use to abandon that model. I also confess that I don’t completely understand all of the discussions about “xMOOCs” and “cMOOCs” on anything but a very theoretical and abstract level, and I have a sneaking suspicion that I’m missing something very important in how cMOOCs address some of these principles.
There is another interesting and hopeful way to think about this. Another ELI conference attendee – I’m sorry that I don’t remember who – suggested that there may be other paradigms of effective educational practices that MOOCs might better fit. Although I am a little bit skeptical that our understanding of effective education is going to be radically upended, this recommendation to not be too constrained by our current thinking is a very good one. In fact, that is one important reason why I will be trying to steer our discussion here on my campus next month away from the specific activities and toward the broader principles so we can compare our thinking about student engagement with that of others. The idea isn’t to impose the model on my campus but to use it as a common starting point that must be adapted to our unique needs and resources.
That, of course, is what we’ll need to do with MOOCs: Use our best understanding of effective teaching and shape it to this unique environment with unique affordances. I don’t know how to do that and I don’t know if that is what is being done. I am wary that much of what is being done is not methodical and not built on what we know about how people learn. I am especially skeptical that we can provide the kind of demanding and socially and intellectually connected experiences that we know provide some of the best learning. I hope that people smarter than I are figuring this out, though, and working out how MOOCs can provide high impact educational practices.
Our colleagues at the Higher Education Research Institute (HERI) at UCLA have publicly released some information from their annual survey of first-year students. There are already several media reports on the topic and we can expect many more to come out over the next few days. What caught my eye is that they shared some of their data with The Chronicle of Higher Education, which created an interactive graphic showing how student responses have (or have not) changed over time.
Several of the questions on the survey ask students to compare themselves “with the average person your age” and “rate yourself above average or better in terms of __.” For nearly all of the questions of this form, students have consistently rated themselves as above average: ability to see the world from someone else’s perspective, tolerance of others with different beliefs, openness to having their own views challenged, ability to discuss and negotiate controversial issues, ability to work cooperatively with diverse people, academic ability, emotional health, and physical health.
So for what topics do respondents believe they are below average? Computer skills, spirituality, and writing ability. I don’t care to comment on spirituality (a commenter on the Chronicle’s website asks a good question: “What on earth does [that question] mean?”). I’m puzzled that first-year college students believe they are below average in writing ability but I’m not an expert on writing so I’ll leave that puzzle to others.
What does it mean that 35% of the respondents to the survey rate themselves below average in computer skills? And what does it mean that students have consistently responded like this since the question was first asked in 1999? Well, to know for sure we’d have to ask them. I would want to know how they interpret “computer skills.” What do they consider to be computer skills? How are they measuring their computer skills? And to whom are they comparing themselves? Heck, given the proliferation of smartphones and tablets it would be a good idea to ask students (and ourselves!) just what they think of as a “computer.”
One possible factor in all of this may be related to the gender imbalance in undergraduate education in the U.S. More women than men are enrolled in U.S. colleges and universities. According to the most recent data published by the National Student Clearinghouse Research Center, 56% of the students enrolled in the fall of 2012 were women. Why is this important? We know that women typically underestimate their computer skills whereas men typically overestimate their skills. If the data reported by the Chronicle are unweighted then this may have an even larger impact on the data because women typically respond to surveys in higher proportions.
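A quick back-of-the-envelope calculation shows how an unweighted sample can skew this kind of result when one group both responds at a higher rate and rates itself lower. All of the numbers below are made up for illustration; only the 56%/44% enrollment split comes from the paragraph above:

```python
# Hypothetical respondent counts: women over-represented relative to
# the 56%/44% enrollment mix because they respond at higher rates
resp_women, resp_men = 650, 350

# Hypothetical shares rating their computer skills "below average"
below_women, below_men = 0.42, 0.25

# Unweighted estimate: every respondent counts equally
unweighted = (resp_women * below_women + resp_men * below_men) / (resp_women + resp_men)

# Estimate weighted to the actual enrollment mix (56% women, 44% men)
weighted = 0.56 * below_women + 0.44 * below_men

print(f"unweighted estimate: {unweighted:.1%}")
print(f"weighted estimate:   {weighted:.1%}")
```

With these made-up numbers the unweighted figure overstates the weighted one, which is exactly the direction of bias that over-representation of the lower-rating group would produce.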
(Aside: The National Student Clearinghouse Research Center is doing some incredibly cool and vital research these days. They have a huge warehouse of data about college enrollment and it’s great to see them putting it all to use! Check out what they’re doing – it’s good stuff.)
In any case, it’s interesting that so many undergraduates at 4-year institutions believe their computer skills are below average. I doubt that it’s actually true but I would certainly agree that they are nowhere near as proficient as some of the common assumptions (e.g., “digital natives”) make them out to be. Is this a problem? Should we be worried or looking for a solution? That’s a different and more complex discussion but I think it’s safe to say that first-year college students are precisely as proficient as they have needed to be given how they use computers in their daily lives – just like everyone else. They don’t typically use their computers to perform complicated or deeply technical tasks so why would we expect them to be profoundly tech savvy?
One of my colleagues recently made an offhand remark about the timeliness of an article in the current issue of The Journal of College Student Development. Rather than focus on the comment or the specific article, however, it seems more productive to explore appropriate venues for publishing similar work in a more timely manner.
The problem? Much of the research that we conduct about technology must be shared and disseminated quickly to keep up with the rapid pace with which technologies and their uses change. Many of the traditional venues for publication and dissemination of research have huge lag times, sometimes a few years long; this is particularly problematic for technology-related research, which grows out-of-date much more quickly than many other bodies of information. I have conducted research that grew out-of-date before I could get it published in peer-reviewed journals (e.g., work conducted with my colleague Chris Medrano examining content in Wikipedia articles about U.S. colleges and universities). I have had data – really good data about interesting stuff! – grow stale over the course of a very busy year-and-a-half such that I could no longer work with it (I could have worked with it, and it was such cool stuff that I’m sure it would have been published somewhere, but I would have felt horrible and a little bit ashamed about it!).
Although I have moved out of student affairs, I continue to do work about student and faculty use of technology so this is still an issue that is important to me. I’d like your help in thinking about how we get our work out there. Here are some of my thoughts:
Does the publication or release need to be through a traditional, peer-reviewed venue? Even for those of us who believe ourselves to be locked into the traditional academic world where peer-reviewed publications remain the gold standard, I think the answer is “no.” It might be acceptable to blog about your findings or present them at non-traditional conferences, especially if those venues allow you to better reach your intended audience (e.g., how many full-time student affairs professionals regularly pore over peer-reviewed journals?).
For those who do believe in the necessity or value of publishing or presenting in traditional venues, which ones allow us to disseminate our findings in a timely manner? My initial reaction to the comment that began this entire line of questioning is that JCSD is a fine venue but it moves too slowly to publish much of the technology-related research I have conducted. In fact, most of the peer-reviewed journals in higher education move too slowly for me to consider them viable venues for publication of timely technology-related research.
Maybe it would be helpful to compile a list of good venues for student affairs technology research. (Although I’m mostly out of that field now, I still do some work in it and my experiences are significant enough that I think I can help.) My suggestions, in no particular order:
First Monday: An online, peer-reviewed journal that focuses on Internet studies. They have published higher education-specific work in the past, so they seem open to the topic. It’s also a respected venue for scholarly work. Very importantly, I understand that they review submitted articles very quickly.
The Journal of Technology in Student Affairs: Another peer-reviewed journal with an obvious focus. Although this is a viable venue, it’s probably not one that I would submit to as my first choice. It’s a fine publication but it simply doesn’t have a strong, high-profile reputation. That may sound very crass but the reality of scholarly publishing is that it’s important to publish in the most highly regarded journals possible.
EDUCAUSE Review Online (ERO): Although ERO publishes some peer-reviewed work, it largely exists outside the traditional world of scholarly research because the publication is aimed at higher education IT practitioners. With that said, it has historically been a very good venue for work that is intended for that audience although I haven’t published in it since they changed their format (EDUCAUSE used to have a monthly magazine and a quarterly peer-reviewed journal; they’ve been merged into one publication, ERO).
Outside of formal publications, several conferences are good venues to present and discuss this kind of work. I personally like EDUCAUSE events quite a bit but the audience that is interested in student affairs-specific work is pretty small. The EDUCAUSE Learning Initiative (ELI), the arm of EDUCAUSE that focuses on teaching and learning, also puts on really nice conferences with wonderful participants if your work is more oriented towards teaching and learning. I have also presented at other higher education conferences such as the annual conferences for ASHE, AERA, and AIR. They are large conferences and quite frankly I don’t care for them very much because (a) they lack focus and (b) I have difficulty believing that anything that happens at them impacts the world beyond being another line on my CV. AIR is a bit better, though, because it does have some focus and much of the work discussed there has real-world implications and impact largely because of the strong presence of institutional research professionals.
The student affairs conferences are certainly viable venues, particularly the recent ones that have begun cropping up that focus specifically on technology (e.g., #NASPATech, #satechBOS). I have drifted away from student affairs conferences over the past several years, though, so I will let others with more recent experience offer their opinions and evaluations.
If you find this kind of brainstorming helpful or interesting, feel free to add your thoughts below. If enough people are interested, this would make for a good shared project to throw into a publicly-accessible editing environment like a Google doc.
This does not represent the views or opinions of anyone other than myself. Specifically but not exclusively, this does not represent the views or opinions of anyone with whom I have worked in the past, my employer, or anyone associated with ResNet, Inc.
I am very, very sad to have to write and publish this entry. I have always thought very highly of ACUTA, the U.S. higher education professional organization that focuses on networking and telephony. They have produced high quality reports and conferences, including conferences and webinars at which colleagues and I have presented. They were also very gracious in allowing me to visit their headquarters in Lexington, Kentucky, a few years ago to comb through some of their historical archives as I performed historical research.
Six months ago, on April 6, I contacted ACUTA to draw attention to the material in the then-recently released ACUTA ResNet Survey that is identical to material in previous research conducted by me and other colleagues loosely associated with the ResNet Symposium (now the ResNet Student Technology Conference). Although ACUTA initially claimed that any similarities were “inadvertent,” they later admitted that at least 15 of the 45 questions – one-third – on their survey are virtually identical to older questions copied without attribution. Despite this admission, ACUTA has only partially and reluctantly publicly acknowledged the previous work from which a substantial portion of their current survey was copied. In particular, (a) the summary report and infographic associated with ACUTA’s survey make no mention whatsoever of the previous work upon which those works are substantially built, and (b) the ACUTA website was only edited in the past few days, presumably in response to an e-mail I sent on September 28 allowing them one more week to make edits before making this issue public.
This is not a legal issue. Although I am one of the copyright holders of the original 2005 and 2008 survey instruments and reports and I could pursue legal action against ACUTA and their contractor Forward Analytics, it is highly unlikely that I will do so. I have no interest in making money from my original work or the work performed by ACUTA and Forward Analytics. I’m not very interested in stopping ACUTA from conducting their surveys and publishing results; in fact, I’m quite pleased that the work is being continued and I am flattered that they believe that the survey instrument I helped create is of sufficient quality that they are reusing and building on it.
This is an ethical issue. In academia, we respect the work that others have done by clearly drawing attention to it when we build on their work. It is right to give people credit for what they have done, especially when we are benefiting from that work. Moreover, it is essential that we give readers a clear idea of the provenance of our ideas so they can perform due diligence to assure themselves of the quality and rigor of our work.
It is not necessary to ask permission to build on the ideas of another; as far as I am concerned, ACUTA is welcome to use, modify, and adapt questions from the survey instruments I helped to develop. But it is necessary to give us credit, both to acknowledge the work that my colleagues and I did and to allow others to know where some of the content in the ACUTA survey originated. I don’t think I am asking very much when I ask ACUTA to play by the same rules as everyone else in academia. I am perplexed and saddened that, half a year after I first contacted them, ACUTA still has not taken the few minutes needed to add a sentence or a footnote to their documents acknowledging the work on which theirs is built.
Plagiarism is a very serious charge. ACUTA has acknowledged in private e-mail messages that many questions were copied from the 2005 and 2008 survey instruments. I am not quite comfortable publicly publishing the contents of private e-mail messages but here are some examples of the evidence that originally led me to be concerned about this:
1. Based on ACUTA’s report, their survey instrument asked “Is your institution’s residential network separate from the rest of the campus network(s)?” with the response options of (a) Yes, only physically, (b) Yes, only logically, (c) Yes, both physically and logically, and (d) No. In 2005, my colleagues and I asked “Is your residential computer network separate from the rest of the campus network(s)?” with the response options of (a) Yes, our residential computer network is physically separate, (b) Yes, our residential computer network is logically separate, (c) Yes, our residential computer network is both physically and logically separate, and (d) No.
2. Based on ACUTA’s report, their survey instrument asked “How many staff members (FTE) provide direct support to your campus residential computer network and its users?” In 2008, my colleagues and I asked “How many full-time equivalent (FTE) staff provide direct support to your campus residential computer network and its users?”
3. ACUTA’s report states that “50% of IT Departments pay for bandwidth supplied to the residential networks but do not recover the cost.” In 2005, my colleagues and I asked “Who pays for the bandwidth available to the residential computer network and are the costs recovered? (Check all that apply)” with the response options of (a) An outside vendor supplies the bandwidth and recovers some or all of the cost through a charge to the university, (b) An outside vendor supplies the bandwidth and recovers some or all of the cost through resident fees, (c) Central IT pays for it and recovers some or all of the cost through fees to residents or interdepartmental charges to Housing, (d) Central IT pays for it and does not recover the cost, (e) The Housing department pays a non-university ISP and recovers some or all of the cost through rent or other fees, and (f) Other (please specify) [emphasis added].
4. ACUTA’s report states that respondents were asked “What organization on your campus is primarily responsible for maintaining the infrastructure of your residential computer network?” with two pie charts displaying the responses, one pie chart for the Logical Infrastructure and the other pie chart for the Physical Infrastructure. In 2005, my colleagues and I asked “What organization on your campus is primarily responsible for maintaining the physical infrastructure of the computer network for your on-campus housing facilities? Examples of this responsibility may include physical installation and maintenance of wiring, network switches, and installing and repairing data ports. (Check all that apply)” and “What organization on your campus is primarily responsible for managing the logical infrastructure of the computer network for your on-campus housing facilities? Examples of this responsibility may include configuring switches and routers, monitoring network traffic, administering servers (DHCP, DNS, etc.), and shaping/filtering network traffic. (Check all that apply)”
5. ACUTA’s report states that “About 9 % of higher education institutions report thet [sic] they are currently outsourcing all or significant portions of their residential network. Another 4% of survey respondants [sic] indicate they are currently considering oursourcing [sic], while 15% of institutions have considered outsourcing their residential network but have yet to pursue such an option.” In 2005, my colleagues and I asked “Has your institution considered outsourcing any significant portion of the residential computer network, including its support or maintenance, to an outside entity not affiliated with your institution?” with the response options of (a) Yes, we have outsourced significant portions to a non-university organization, (b) Yes, we have considered outsourcing to a non-university organization but not pursued it, (c) We are considering outsourcing to a non-university organization right now, (d) No, we have not seriously considered outsourcing to a non-university organization, and (e) Other (please specify).
In most professions, there are certain words or phrases that are used to mark oneself as a member, someone who is “in.” Many student affairs professionals doggedly avoid referring to on-campus housing units as “dorms,” even going so far as to take offense at the term and trying to correct those who use the hated word. The preferred term is “residence hall,” a phrase that is used because dorm is perceived by some as being too cold and distant to describe someone’s home. This is an issue on which a significant amount of energy is spent – just google “dorm vs residence hall” and you’ll immediately be thrown into the battlefield.
Personally, I think the debate – one which sometimes becomes inexplicably heated and emotional – is very silly and is usually a waste of time and energy better spent on substantive issues. But my point here isn’t to convince you that I’m right. I only want to share a surprising finding from the historical documents I’m currently reviewing: This debate has been raging for nearly 100 years!
The conference proceedings for the 1941 meeting of the National Association of Advisers and Deans of Men (NADAM), the organization that later changed its name to NASPA, includes a talk given by Mr. R. B. Stewart, Controller of Purdue University (no, I don’t know what that title means, either), on the topic of “Institutional Housing Policies.” In describing the student housing at Purdue, he noted:
Our approach to the student housing program began in 1927, when we received funds and borrowed money for the erection of our first Residence Hall for men. At that time, our Trustees eliminated from Purdue terminology the use of the word “dormitory”, and since that date we refer to our housing units as “residence halls,” intending to convey the fact that our units are something more that places to sleep and for one’s being.
Whoa! I knew that this battle against the word dorm had begun before my time in higher education but I had no idea that it was this old!
I’ve written a few times about historical research I’ve done looking into how U.S. student affairs professionals have used and viewed technology throughout the 20th century. Although I don’t know where my current job search will take me, I feel a responsibility to bring some closure to this research and then ensure it is somehow published or shared.
Much of my previous work was based on documents held at the National Student Affairs Archives at Bowling Green State University, especially the conference proceedings and programs for ACPA and NASPA. My work is incomplete, however, because those (wonderful!) archives did not have most of the conference proceedings from the first half of the century. Fortunately, another scholar told me that my own institution, Indiana University, has many of these proceedings. Since I will probably be leaving Bloomington soon, I finally followed up on this tip and requested all of the conference proceedings in the IU library. The two collections – IU and BGSU – complement each other very nicely, almost as if a single collection of all of the proceedings were divided evenly between the two libraries. It would probably be a bibliographic faux pas to ask one of these libraries to donate their materials to the other one but it sure would be nice to have a nearly complete collection in one place. At least the two universities are only a few hours apart so it’s not terribly burdensome for scholars who want to consult these materials.
I’ve only started reading through these documents and I’m already very glad that I requested them! In just the handful of proceedings that I’ve read so far I’ve found interesting things such as:
Discussion of the negative effects of “mechanical devices” on education in 1928
A demonstration of IBM equipment for Deans of Men in 1950
A new program at the 1950 NASPA conference using audio recorders to collect and then distribute the distilled wisdom of its members. In the opening session, NADAM President L. K. Neidlinger described this new program to attendees:
You can also improve your mind and learn how to be a dean by going to the Recording Room, just off the lobby, at any time that suits your convenience, and asking the attendants there to hook you up to one of the tape recordings that we have been busy making last night and this morning. We are conducting there an interesting new experiment in convention technique. On each of several topics we have had a team of five deans record their experience and advice — all on the same tape. Anyone interested in these topics can pull up a chair, light a cigar, and listen at leisure to the advice of five colleagues who could not otherwise be interviewed so conveniently. He can then add his own comments by flipping a switch and talking. Furthermore, six months from now when you may have to educate a faculty committee on the facts of life about one of these topics, you will be able to write Fred Turner for the recording, borrow a machine, and bring these expert witnesses into your committee room.
A demonstration of the new Polaroid camera, with specific mention of its possible use in creating photographs for student IDs, in 1951
Even though I’ve just begun reading through these proceedings, I already have examples of (a) worry about the effects of technology on education and students, (b) discussions of the potential benefits of technology in student affairs administration, especially record keeping and processing, (c) demonstrations of new technology by vendors and pioneering institutions, and (d) innovative uses of technology initiated by members of the professional organizations themselves. A history of regular and continued use of technology, including original innovations and cutting-edge uses, doesn’t seem to be part of the mainstream historical narrative of the student affairs profession, but that is the story I’m finding in the historical artifacts.
(Off-topic: Holy crap are these proceedings products of their times! I knew that the history of these two professional organizations was very gendered given their historical roots but I didn’t expect the volume of casual sexism documented in these proceedings! I did, however, expect some degree of racism and a large amount of homophobia and – sadly – my expectations have been met. I’m not even looking for these things but they often come screaming out of the pages. I’m reminded of a moment in this story where a college student asks during a discussion about the Founding Fathers: “If the Founders loved the humanities so much, how come they treated the natives so badly?” It’s mentally and spiritually jarring to read pages and pages of passionate discussion about the importance of each student and their intellectual and moral development followed by a casual dismissal of the competence of deans of women or a reminder of the psychological and moral depravity of homosexuality. The incongruity and dissonance make me wonder what normal, accepted practices and beliefs we hold today will cause these “Holy crap!” moments for future generations when they read our e-mails and watch our videos.)