I’m Not A Programmer But Programming Skills Are Still Extremely Useful

I don’t work in IT, software development, or anything even closely related to those fields so I’m often surprised at how much programming I do in my daily work life.  At times I write scripts or light programs (e.g., this set of Excel macros), usually to save time and ensure accurate, well-documented, and reproducible results.  More often, I directly use some of the skills of programming, especially flow control and abstraction, to make tasks easier, more elegant, or simply possible.
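As a hypothetical illustration of the kind of light scripting I mean (sketched in Python rather than Excel macros, purely so the example is self-contained; the data and department names are invented), a few lines of flow control and abstraction can turn a repetitive, error-prone manual task into a documented, reproducible one:

```python
# Hypothetical sketch: tallying survey responses by department.
# Doing this by hand invites transcription errors; a short script
# documents every step and gives the same result on every run.

from collections import Counter

def tally_departments(rows):
    """Normalize free-text department names and count responses."""
    # Abstraction: one place defines how messy names map to canonical ones.
    aliases = {"engl": "English", "english": "English",
               "bio": "Biology", "biology": "Biology"}
    counts = Counter()
    for raw in rows:
        name = raw.strip().lower()  # flow control + normalization
        counts[aliases.get(name, raw.strip().title())] += 1
    return dict(counts)

responses = ["English", " engl ", "BIO", "biology", "History"]
print(tally_departments(responses))
# → {'English': 2, 'Biology': 2, 'History': 1}
```

The point isn’t the specific task; it’s that once the cleanup rules live in code, the results are accurate, documented, and reproducible by anyone who reruns the script.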

Continue reading I’m Not A Programmer But Programming Skills Are Still Extremely Useful

The Pseudo-curriculum

I know this will be provocative for some of you, but lately when I’ve heard people use the phrase “co-curriculum” I’ve silently translated it in my head to “pseudo-curriculum.” I’ll explain more below, but understand that I am not devaluing out-of-class activities; I am expressing frustration that we don’t really value them.

My frustration here has been long simmering but two strands of experience and thought are mingling and bringing things into focus for me.

Continue reading The Pseudo-curriculum

Are High Impact Practices Available Online?

I am still wrestling with my unease with MOOCs and I think I’ve finally figured out why: High impact educational practices, as we understand them today, are unlikely at best and impossible at worst in MOOCs and other similar online environments.

First, it’s helpful to understand that “high impact practice” (HIP) is a term of art.  Although the words themselves sound commonplace, in the past ten years or so the term has taken on special significance in U.S. higher education.  Popularized by George Kuh and emerging partly from research using data from the National Survey of Student Engagement (NSSE), the phrase has come to mean a particular set of activities that many higher education researchers believe are especially effective in promoting important and lasting changes in undergraduate students: First-Year Seminars and Experiences, Common Intellectual Experiences (i.e., core curricula), Learning Communities, Writing-Intensive Courses, Collaborative Assignments and Projects, Undergraduate Research, Diversity/Global Learning, Service Learning, Community-Based Learning, Internships, and Capstone Courses and Projects.

Unfortunately, we sometimes place too much focus on these particular activities without understanding why these activities have a high impact.  As originally described by Kuh in 2007, these practices share six characteristics:

  1. HIPs “demand that students devote considerable amounts of time and effort to purposeful tasks” (p. 7)
  2. HIPs place students in circumstances that require they “interact with faculty and peers about substantive matter” (p. 7)
  3. HIPs greatly increase the likelihood that students will interact with people who are different from themselves
  4. HIPs provide students with very frequent – sometimes continuous – feedback from faculty and peers
  5. HIPs require students to operate in intellectually complex ways by connecting knowledge in different courses and applying it in different contexts (e.g., confronting complex real-world issues, investigating unfamiliar research problems)
  6. HIPs occur in the context of a “coherent, academically challenging curriculum” (p. 8)

I am particularly interested in these characteristics of high impact practices because I will be helping lead a discussion on my campus next month focused on student engagement.  Most of the participants will be faculty and much of our attention will be on activities that faculty are using or can use in their curricula to promote student engagement.  Given that audience, I don’t think it would be helpful to dwell on the specific activities identified as HIPs as those are often beyond the resources and purview of an individual faculty member.  Instead, we will focus on why those activities have a high impact so we can apply those principles to the activities within the power and resources of individual faculty.

That is what was on the forefront of my mind when I “attended” an EDUCAUSE Learning Initiative (ELI) online conference last week that focused on MOOCs.  The conference had some very active discussions among participants and as I participated in those discussions it occurred to me that one of the primary reasons I am uncomfortable with MOOCs is that it is difficult or impossible to apply much of what we know about good teaching in that environment.

Look back up at those six principles of high impact practices.  How do we apply those principles in a MOOC?  More pointedly, can we apply those principles in a MOOC?  I despair that the answer is mostly “no.”  I pray that it is a simple lack of imagination on my part, a misunderstanding of what we can do in a MOOC, or a fatal flaw of the dominant MOOC model that others will quickly recognize and either fix or use to abandon that model.  I also confess that I don’t completely understand the discussions about “xMOOCs” and “cMOOCs” on anything but a very theoretical and abstract level and I have a sneaking suspicion that I’m missing something very important in how cMOOCs address some of these principles.

There is another interesting and hopeful way to think about this.  Another ELI conference attendee – I’m sorry that I don’t remember who – suggested that there may be other paradigms of effective educational practices that MOOCs might better fit.  Although I am a little bit skeptical that our understanding of effective education is going to be radically upended, this recommendation not to be too constrained by our current thinking is a very good one.  In fact, that is one important reason why I will be trying to steer our discussion on my campus next month away from the specific activities and toward the broader principles so we can compare our thinking about student engagement with that of others.  The idea isn’t to impose the model on my campus but to use it as a common starting point that must be adapted to our unique needs and resources.

That, of course, is what we’ll need to do with MOOCs: Use our best understanding of effective teaching and shape it to this unique environment with unique affordances.  I don’t know how to do that and I don’t know if that is what is being done.  I am wary that much of what is being done is not methodical and not built on what we know about how people learn.  I am especially skeptical that we can provide the kind of demanding and socially and intellectually connected experiences that we know provide some of the best learning.  I hope that people smarter than I are figuring this out, though, and working out how MOOCs can provide high impact educational practices.

Advantages and Challenges of Tech Skills in a Non-Tech Job

Posterboard showing a draft sitemap for part of a new website
The initial planning stages of our new website.

When I was on the job market last year, I was not interested in a job in IT.  Although I’m a little bit rusty in some areas after spending five years in a PhD program, I have the skills and background to work in IT as I’ve done in previous jobs.  But I was interested primarily in faculty development and assessment jobs.  For many of those positions, I deliberately downplayed my technology background because I wanted to be interviewed and hired for my research and assessment skills.  I seem to have been successful because I’ve found a wonderful research analyst position in a teaching and assessment shop where none of my job responsibilities involve technology maintenance, development, management, or support.

That has made the last couple of weeks a little bit strange as I’ve taken on significant responsibility in planning and supporting my center’s website as we finish merging our old websites and move them to a new content management system. This is the kind of project I was trying to avoid because it’s completely outside my current job description and I fear being typecast in this role. But you know what? It’s working out just fine. I am able to greatly help my colleagues and they are supportive and grateful but I don’t fear being relegated to being the unit’s tech support.

This is possible because of two things. First, I have wonderful coworkers who are very supportive. They are supportive of my growth as a higher education scholar, researcher, and faculty developer. They are also very respectful about how my time is used and the kinds of tasks they ask me to take on. Second, I have been very open about setting boundaries. I am always happy to help my colleagues with technical issues but they are also okay with going through regular channels for larger, more complex issues even though I could spend time solving many of those, too. Because I have been very open with my colleagues about how I would like to use my experiences and knowledge and because my colleagues are wonderful, they have even shielded me a bit to ensure that others don’t try to take advantage of me. For example, despite our official tech support warming up to me very quickly (e.g., it only took a few minutes of me asking the right questions to be granted local admin access to my computer, something that apparently is a rarity here), my colleagues have intentionally made someone else our unit’s official point of contact with tech support to ensure they would not rely on me to solve problems or do extra work.

This seems to be working out well. My background gives me the knowledge to ask many of the right questions when we’re dealing with technology (e.g., I can “geek out” with a colleague in IT to more effectively and efficiently probe for information as we figure out which content management system to use for our new website). My skills let me solve little problems in our office very quickly or recognize when problems are out of our control or even unsolvable. My experience guides me to help my colleagues make wise decisions that will be maintainable into the future even when I no longer work here (e.g., we’re going to move to Qualtrics as our event registration system instead of using custom-built Google Forms that require significant technical skill to effectively use and maintain).

You can take the computer geek out of IT but apparently you can’t take the IT out of the computer geek. And I’m becoming okay with that.

Data Analysis MOOC Week 3: I’m a Dropout

Despite my best intentions, I have become another MOOC dropout.  Why am I not continuing to participate in this free course?

  1. The format isn’t compelling.  The course is primarily built around four components: video lectures and notes, weekly quizzes, a discussion board, and two peer-graded assignments. The lectures are alright and although there are many other online R resources it’s nice to have concise descriptions of R procedures specifically linked to data analysis. The discussion board is also helpful but there are many other places to find help with R. As discussed in my previous post, the weekly quizzes are very disappointing as they are the primary means by which students in this course practice what they learn but they offer very, very little feedback.  My biggest regret is that I won’t experience the peer-graded assignments. While the idea of requiring students to grade one another’s work is likely driven largely by the logistics of a MOOC, peer-graded assignments can be very powerful and worthwhile even in small classes.  That these assignments are the only non-quiz activities in the course is disappointing, especially since there are only two of them.  Although it will be helpful that each student should receive feedback from several classmates (if it’s possible, I might provide feedback on the reports for some of my classmates even though I won’t be writing my own), it often takes more than two attempts for students to learn and begin to master new skills.
  2. Except for the peer-graded reports, there seems to be little reason for this course to be on a lockstep eight-week schedule. I might be able to stay with it if the timing were more flexible.  Even in the first three weeks of the course I’m having some trouble consistently making time to view all of the videos. I had planned to do this all at work since my supervisor supports it as important and valuable professional development, but it’s sometimes difficult to carve out the time and I feel guilty watching online videos at work for a non-credit course when I feel like I should be doing something more (visibly and authentically) productive.
  3. I can’t convince myself to participate in the two peer-graded reports, the only meaningful assignments in this course.  This is linked directly to the material of this specific course and is not a criticism of the course itself. I simply can’t muster the will to conduct additional data analysis and write additional reports for this course when those are two of my primary job duties.  It’s not that I don’t think that I could learn from the activities, develop new skills, and become a better data analyst and writer.  I just can’t bring myself to spend so much time analyzing data and writing reports unrelated to either my job or my research.  I am disappointed as I was looking forward to these substantive activities, especially being able to receive feedback from others and seeing how others approached the same activities.

Although I’m disappointed to have decided to not continue with the activities of this MOOC, I am happy to have enrolled and tried it out.  I will continue to download the course materials so I can reference them when I am ready to put them into practice in meaningful ways.

I have very mixed feelings about the broader concept of MOOCs.  It would take an extraordinary effort for an online course, especially a MOOC, to match the quality of the best face-to-face courses.  But the reality is that few face-to-face courses are “the best.”  Although the dominant MOOC model seems to mimic much of the worst lecture courses in traditional universities, even the worst course is sometimes good enough, especially when the alternative to a crappy, frustrating, and largely self-driven education is no education at all.

Data Analysis MOOC Week 2: Muddling Through Frustration

I have watched the online videos and successfully completed the quiz for week 2 of the data analysis MOOC in which I am enrolled. I struggled quite a bit with some of the R syntax and that made the quiz a very frustrating experience. I have two observations to share about what I learned this week about the format of the course.

First, I am disappointed that so far the only opportunity for students to practice what is being taught and receive feedback is the weekly quiz.  I was able to muddle through things enough to get answers that matched the response options for this week’s multiple-choice quiz but despite answering all questions correctly I’m still very unsure of much of the content – I just know that I happened to somehow end up with answers that matched some of the ones included in the quiz.  Some of this is simply due to my lack of experience with R and its steep learning curve.  But much of it is due to the fact that the multiple-choice quiz was the only opportunity to practice with any semblance of feedback and that feedback was restricted to an anemic “correct” or “incorrect” for each question.

Yes, I can practice on my own some of the skills taught in this class.  This is certainly the case if I want to focus solely on learning how to use R – syntax, configuration, functionality, etc. – as the language provides immediate feedback with error messages or output.  But if that is the focus and if that’s sufficient to learn the skills then why do we need an organized course instead of just a course packet or a list of recommended self-guided topics and exercises?

What distinguishes an organized, well-taught class from a self-taught topic is that a class has an expert who not only makes their thinking explicit but also offers targeted feedback for students as they practice the skills they are learning.  It’s conceivable that some skills could be taught using sophisticated, automated tools if we have a deep enough understanding of how people typically learn those skills that we can programmatically recognize mistakes and misunderstandings and provide appropriate, specific feedback.  Sometimes this can be done to a (very) limited degree with appropriately designed multiple-choice instruments where the incorrect responses are designed to be diagnostic (i.e., wrong answers aren’t merely incorrect but are designed to identify particular kinds of mistakes or misunderstandings).  That seems to be the case for some of the questions and answers in this MOOC but we’re not provided with any of the related feedback to help us understand what common mistake we may have made, how we might be misunderstanding the issue, and how we can work to correct our thinking.
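To make that concrete, here is a minimal sketch of what diagnostic distractors could look like, written in Python with an invented R-related question and invented feedback messages (nothing here is taken from the actual course):

```python
# Hypothetical sketch: each distractor in a multiple-choice question is
# tied to the specific misunderstanding it is designed to detect, so a
# wrong answer triggers targeted feedback rather than a bare "incorrect".

QUESTION = "What does mean(c(1, 2, NA)) return in R by default?"

# Each option pairs the answer text with feedback aimed at the likely
# misunderstanding behind choosing it.
FEEDBACK = {
    "a": ("NA", "Correct: by default R propagates missing values."),
    "b": ("1.5", "You may be assuming NA values are dropped automatically; "
                 "that requires na.rm=TRUE."),
    "c": ("1.0", "You may be treating NA as zero; NA means 'unknown', not 0."),
}

def grade(choice):
    """Return the targeted feedback message for a student's answer."""
    _answer_text, message = FEEDBACK[choice]
    return message

print(grade("b"))
# → You may be assuming NA values are dropped automatically; that requires na.rm=TRUE.
```

Writing distractors this way is more work for the instructor, which is probably why the anemic “correct”/“incorrect” approach is so common; but it is exactly the kind of targeted feedback a quiz could deliver automatically.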

Second, the size of the course requires innovative ways to provide support for students and this course seems to rely heavily on the course discussion board.  This is an observation, not a criticism. I’m quite comfortable using that medium as I’ve been using online discussion boards since the early 1990s when they were one of the primary draws for dial-up bulletin board systems (with the other major draw being online “door” games).  I don’t know how well this works for other students, however, as I don’t want to make assumptions about their experiences, skills, and cultures.  It’s probably not a big deal; my concern here is very minor and more of a curiosity about how other students experience and use (or don’t use) the discussion board. (In other situations I would be concerned about those who have poor or no Internet access or those who have little comfort and experience with the Internet but it’s reasonable to expect students who enroll in an online course to have sufficient Internet access and skills. I’m not suggesting that everyone has the access and skills to enroll in an online course, merely that those who are already enrolled in one presumably have the required access and skills.)

Data Analysis MOOC Week 1: I’m Going to Hate This

This semester, I have signed up for a data analysis class being taught on Coursera.  It is a massive open online course (MOOC).  I’m tech savvy and well educated but it seems like the most responsible way for me to really learn about MOOCs is to gain some firsthand experience.  I also hope to learn some new data analysis techniques and ideas in this course.  The course will use R to analyze data so it will also be good to expand my (very limited) skills and knowledge with that powerful tool.

Going into this, I am very skeptical about what I understand the typical MOOC model to be with instruction primarily occurring using pre-recorded videos and quizzes with a discussion board as the primary means of communication between students and faculty.  I hope I’m wrong either about the model of instruction or about its effectiveness.  As an educator, I believe (and am supported by significant evidence) that the best learning occurs when experts make their thinking explicit through demonstration and give learners multiple opportunities for focused practice and feedback.  So my skepticism about the effectiveness of videos and quizzes as learning and teaching tools can best be summed up as: “Telling is not teaching.”  (Note that this applies just as forcefully to passive lecturing in physical classrooms!)

I’ve just started to get into the material for this course and so far it looks like my low expectations are going to be met: the course is built heavily around pre-recorded videos as the way for the faculty to teach students, with weekly online quizzes and two peer-graded assignments as the only opportunities for us to “practice” what we are “learning.”  I hope I’m wrong and this proves to be much more enjoyable and rewarding than I think it will be!

New NSSE Survey and Technology Questions

I’m super excited that my colleagues have finally made the new version of the National Survey of Student Engagement (NSSE) publicly available!  We’ve spent a lot of time working on this over the past 3-4 years, including focus groups, interviews, two pilot administrations, tons of literature review and data analysis, (seemingly) thousands of meetings, and many other events and undertakings.  I’ve been incredibly lucky to have been part of this process from nearly the beginning as I’ve learned a lot about survey development and project management.  I’m leaving NSSE at the end of the month so although I won’t be here when the new survey is administered next spring, I’m still happy to be here to see the final version.

I’m particularly excited that the technology module (an optional set of questions) has made it through all of our testing and will be part of the new survey.  There are other cool modules but this one has been my baby for over two years.  My colleagues here at NSSE – Allison and Heather – and my colleagues at EDUCAUSE – Eden and Pam – have been wonderful collaborators and I hope that they have had half as much fun and fulfillment working on these questions as I did.  It’s poignant to have spent so much time on this project only to hand it off to others just as it sees the light of day, but I know it’s in good hands.  I am very hopeful that a significant number of institutions will choose to use this module so we can continue to add to what we know about the role and impact of technology in U.S. and Canadian higher education.

Throughout all of this, I’ve remained especially thankful to have been so involved in the development of this new survey as a graduate student. Although I worked half as many hours as the full-time doctorate-possessing research analysts, they were very open about allowing me to be involved and never shied away from adding me to projects and giving me significant responsibilities.  I was never treated as “just a grad student” or a junior colleague – just one who worked fewer hours and had some different responsibilities.  Consequently, I had genuine responsibilities and made significant, meaningful contributions; I can honestly point to the survey and see my own fingerprints on some parts of it!  When I speak about meaningful educational experiences in the future, I’ll certainly think of this one as an excellent example.  And I will work to ensure that my students and colleagues can have similar experiences that allow them to learn, grow, and meaningfully contribute by performing important work with trust and support.

Media Spin and Attention Grabbing Headlines

The Washington Post published a story yesterday describing some research that says that college students today study less than college students in the past.   The story is largely based on a tiny bit of NSSE data that we first published several months ago describing self-reported time spent studying as it differs across majors.  At the moment, I’m less interested in the data and more interested in how it’s being reported and described.

First, I’m a bit amused that this is suddenly a hot topic given that the information was released 6 months ago.  In fact, it was covered very prominently in November by little-known websites like the New York Times, USA Today, and Chronicle of Higher Education.  I don’t know why the Post decided to write a story about this now (I suspect it has to do with an upcoming conference of higher education researchers, a conference heavily attended by my NSSE colleagues and one at which we frequently present new research).  But it’s amusing and informative that one story written by the Washington Post has set off a flood of blog posts and “news stories” about something that is old news.  Yes, I know that it’s still interesting and pertinent information but this seems to reinforce the sad fact that many blogs and “news sites” are very dependent on traditional media for content, even when that content has been available for months.

Second, I’m amused and saddened by the headlines that people are using to describe this research.  I know that many of the websites listed below are second- or third-rate and use headlines like these just to get attention (which drives up traffic and ad revenue – and which makes me a bit ashamed to be adding to their traffic and ad revenue!) but it still makes me sad.  Some examples:

  1. “Is college too easy? As study time falls, debate rises.”  This is the original Washington Post article.  It has a fairly well balanced headline.  It’s not over-the-top and it even notes that the issue is not settled as people debate it.
  2. “Is College Hard? Students Are Studying Less, Says Survey”  The Huffington Post’s headline isn’t too far from the one used by the Washington Post.  Although I loathe the Huffington Post and how the vast majority of its content is blatantly derivative and unoriginal, this is a decent little summary of the Washington Post article and an alright headline.
  3. “Laid-Back Higher Ed”  This is how The Innovation Files describes the Washington Post article and the research it describes.  Not horrible but not very good either.  At least it’s not as bad as…
  4. “Fun Time Is Replacing Study Time in College”  I don’t know anything about FlaglerLive.com but based on this ridiculous and inaccurate headline and blog post I won’t be spending any time there.  I’m particularly impressed by the figure that they copied directly out of the NSSE 2011 Annual Results that they claim is “© FlaglerLive.”  Classy.


Please Step Away From the Infographic!

I’ve tried very hard to be nice but I can’t bite my tongue any longer: Please, stop it with the infographics.  Most of them are bad.  If I were still a bratty 15-year-old, I would dryly say that “I feel dumber for having read that” after seeing most infographics.  But I’ll be more professional and offer some specific criticisms.

Most infographics:

  1. Obliterate nuance and ignore subtleties and differences by carelessly aggregating many different sources of information.  By no means am I opposed to integrating knowledge and synthesizing data from multiple sources!  But it must be done carefully because it’s rare that different studies or sources of data align well.  When it’s done carelessly we can draw false conclusions.  These problems compound as more sources are thoughtlessly tossed together until we’re saying things that we simply don’t know are true.
  2. Don’t tell us where the data come from.  Sure, many infographics have a list of sources at the bottom.  But most of the time that’s all we get: An unordered list that doesn’t tell us which bits of information came from which sources.  I guess that kind of list is better than nothing, but not by much.  This is quite puzzling and frustrating because it seems like such an easy thing to fix.  Infographics designers, please look up “footnotes” and “endnotes” because this is a problem we solved a long time ago.
  3. Don’t need to exist in the first place because the “graphics” add nothing to the “information” being conveyed.  I know that infographics are the hip, new thing (I know they’re neither hip nor new – play along because many people still believe that!) but if your message can be better communicated through a different medium then you’re hurting yourself and impeding your message by forcing it into an unhelpful series of “graphics.”

Of course, I’m not the first one to whine about the infographic plague.  For example, Megan McArdle is spot on when she notes that most infographics are created by hacks who haven’t done any research or produced anything useful but want to convince you that they’re experts so you’ll hire them or buy something from them.  I’m also sure that someone has eviscerated the banal characteristics of the infographic genre (e.g. color palette lifted straight from the early-mid 2000s Web 2.0 explosion, percentage values liberally scattered about in large fonts).

A great (?) example of a terrible infographic is this one recently published by Mashable.  It meets all three of the criteria listed above.  Sadly, most infographics I’ve seen meet at least two if not all three of those criteria.

But not all infographics are terrible.  It’s very simple but this one recently published by Bloomberg is effective and informative.   The infographic that is displayed when you click on the “Cost to students & school” button on the left is ok.  But the bar graphs displayed when you click on the “Conference comparison” button are very informative and useful.

Before you make your next infographic or start passing around a link to an infographic, please consider whether the infographic avoids the three pitfalls listed above.  If it doesn’t, please step away from the infographic!