Look Ma – I’m a Thought Leader!

Researchers at Elon University and the Pew Internet & American Life Project have released a report describing opinions about the future of the Internet and its impact on society, particularly on younger generations. One of its themes is that educational practices must change to address information literacy. Although I agree with some of its content, I don’t know if this report will have any impact on practice or policy; I am becoming a bit jaded and burnt out by academics discussing how education should be reformed when so little seems to actually happen (which doesn’t stop us from writing books, publishing articles, and giving talks – we must feed our egos and sustain the systems that have grown rich on this fare so they in turn will continue to sustain us!).

I am a little bit amused by The Atlantic’s description of the sampling strategy of the survey that underlies this report: “[The report is] based on surveys with more than 1,000 thought leaders.” I was asked to participate in this survey. That means that I must be a thought leader! I don’t know what that means, but it sounds suspiciously like “someone who doesn’t actually do anything but is really good at it!” Now if you’ll excuse me, I need to figure out where “Thought Leader” goes on my CV…

Can All Faculty Really Conduct Research Related to Their Teaching?

Although I will focus on faculty and SoTL in this post, I think the same arguments could be applied in many different contexts where people are asked to perform rigorous social science research that is generalizable beyond their specific context. I will also be vague about the background events that have informed my thinking to avoid embarrassing any of the wonderful faculty with whom I have worked.

One of the ideas central to the scholarship of teaching and learning (SoTL) is that faculty members are all capable of conducting research related to their teaching. Experiences over the past couple of years consulting with individual university faculty members and groups of faculty have convinced me that this idea is misleadingly simplistic and, in some instances, wrong. I haven’t yet convinced my colleagues, and I’m not even sure these are original thoughts, but I’ve begun to think of SoTL research as having three distinct levels with different outcomes and required skills.

I believe that all faculty are capable of performing research intended to enhance their own teaching. When the research is focused solely on the work of an individual faculty member, he or she is the sole judge of its quality. It may even be debatable whether research not intended to be generalized counts as SoTL work, or even as research; I’m not very interested in that debate because this work is still incredibly valuable and important.

When faculty are interested in engaging in research related to their teaching with the hope that the research can be easily used by other faculty members, the situation becomes more complicated.  If the research is intended to be generalized primarily to courses taught by faculty in the same discipline, the standards that will likely be applied to the research are those inherent in the discipline and familiar to the faculty researcher.

But if the research is intended to be generalized beyond the faculty researcher’s discipline, then the standards that will be used to judge it are much higher and are likely to come from outside his or her discipline. Since learning is a social endeavor, the standards that should be applied are those of the social sciences, such as educational assessment and measurement. The table below summarizes these three levels:

Focus                          | Evaluation standards
------------------------------ | -----------------------------------------
Individual faculty teaching    | Individual faculty researcher’s standards
Courses within the discipline  | Disciplinary standards
Courses across disciplines     | Social science standards

This is not a purely theoretical idea. Nor is it a craven attempt to justify my education and experience and ensure that faculty developers, assessment experts, and measurement researchers will continue to have jobs. This framework has helped me understand why some of the faculty with whom I have worked have struggled to conduct research related to their teaching: they have tried to work at a level above their expertise, experience, and even vocabulary. Because we were still operating under the belief that all faculty are capable of conducting all kinds of research related to their teaching, we were not providing them with the right kinds of support.

As a social scientist and an educational researcher, I believe that although all faculty can perform some research related to their teaching, only some faculty can perform research related to everyone’s teaching. Although all faculty are capable of figuring out for themselves what constitutes acceptable evidence, it takes more experience to understand discipline-wide standards and particular skills to understand standards that are acceptable across many (or all) disciplines. So when we work with faculty – graduate teaching assistants, adjuncts, instructors, or tenure-track faculty – who want (or are being required) to conduct SoTL research, we must ensure that they have the skills and experience appropriate for the kind of research they are trying to conduct. Otherwise they can become frustrated, overwhelmed, and distrustful of SoTL work altogether.

Growing Beyond an Established Digital Identity

I have run into an unexpected and interesting issue.  Although I am not as far along with my dissertation as I would like to be, I have decided to hit the job market.  Some of the jobs to which I am applying are not directly related to student affairs and technology, the primary topic of this blog and the tagline of this website.  That makes me a bit nervous.

It’s natural for people to want to make career changes, large and small.  But I never considered how to handle making such a career change when I have a strongly established digital identity that is not directly aligned with the desired career.  This is particularly tricky because I have a diverse skillset and I am applying to a diverse set of jobs from faculty development to student affairs assessment.  How will potential employers handle an apparent disconnect between my established digital identity – the topics I’ve regularly discussed and the areas in which I have publicly proclaimed expertise – and the jobs to which I am applying?

I am not misrepresenting myself in my application materials.  There are many skills I have acquired and interests I have developed that I simply haven’t discussed here, especially some that don’t seem to be on-topic.  But will potential employers take my claims of competence and experience seriously when they weigh these “new” and undiscussed skills and interests against those I have repeatedly and publicly discussed?

I don’t have answers to these questions right now. But I will soon, because this is not a theoretical issue; it is one I am actively confronting.

What can I do?

  • Scour my materials to ensure that anything relevant I already have online is accurately tagged, perhaps even highlighting those collections of materials somehow.
  • Quickly begin to build up a (larger and more visible) body of blog posts related to these other topics (e.g. Scholarship of Teaching and Learning, faculty development, assessment).
  • Tweak the tagline of this website so it’s aligned with a broader set of my professional interests.
  • Create alternative expressions or evidence of competence and experience with these other topics (e.g. e-portfolios).

If I were always completely open and transparent about all of my interests and experiences, I wouldn’t have this problem because these facets of my identity would already be visible.  But I think it’s healthy and even necessary to consciously practice some level of self-censorship and selection, at least for me.  I just need to figure out how to present multiple facets of my identity with integrity now that it has become necessary for me to do so.  And hope that others can perceive that I am acting with integrity and understand what has happened.

Little Things DO Matter

I’ve never liked the trite phrase “don’t sweat the little things.” I have no argument with the general idea that you should spend most of your time on the large, important things. But I reject the implication that the little things aren’t important or worth spending time on. It offends my passion for detail and my belief that details matter. More important, and more defensible, is the idea that “little” is relative; what is little to one person is large to another.

Let me offer an example.

One of the projects at my research shop, the Law School Survey of Student Engagement (LSSSE), focuses on law schools and law students in the U.S. and Canada. I don’t have any formal responsibility to work with the project and its staff beyond general collegiality and professionalism. However, I work on LSSSE projects when they need assistance and my schedule permits because (a) the work they do is important and interesting and (b) I love working with the LSSSE staff. A few months ago, the LSSSE folks needed some help preparing their latest Annual Results and I was very happy to help. They surprised me a few weeks ago by letting me know that in return for my assistance they gave me “top billing” in the Annual Results by including me in the LSSSE staff listing on page 1 of the report.

In many ways, this was literally a little thing. It cost the LSSSE staff virtually nothing to do. It’s less than half a line of text that few people will ever read (even if you’re interested enough to read the LSSSE Annual Results, I doubt that you’ll read through the staff listing, too!). And it only took them a few seconds to include my name in the document.

But to me, it’s not so little. How wonderful that the LSSSE staff thought enough of me to claim me as one of their own! What a kind and unexpected gesture of thanks!

That is why I think it’s important to spend a little bit of time “sweat[ing] the small stuff”: you never really know what is small. So spend some time working on the little things because they may unexpectedly grow into big things.

I Don’t Trust This Article – And Here’s Why

On Friday, a colleague pointed out a new article on Mashable that is titled “Why Tablet Publishing Is Poised To Revolutionize Higher Education.” I don’t trust the claims made in this article. I’m going to explain why I don’t trust the claims, not to convince you that my opinion is correct but to give you an understanding of how I evaluate claims like the ones made in the article. I’ll lay out my thoughts in chronological order.

  1. The article is published at Mashable. I removed Mashable from my RSS reader over a year ago because I got tired of their poorly written articles that make ridiculously overwrought and unprovable claims. This isn’t enough for me to condemn this particular article, but it certainly makes me cautious right from the beginning.
  2. The title makes a very bold claim. Many people have attempted to “revolutionize” education; few have succeeded. And even fewer have been able to explicitly predict revolutions before they occur or even recognize them as they are occurring. The author has a helluva case to make, and he had better bring remarkable evidence to support his claim(s).
  3. After reading the title, a quick glance through the article indicates that it’s a utopian piece largely based on the idea of technological determinism. In other words, it’s not only wildly optimistic but it also relies on the idea that we can predict and control how people use technologies by the way in which those technologies are designed. Both of these ideas – utopianism and technological determinism – have a bit of history in the field of social informatics. That history is mostly negative; these ideas simply don’t work most of the time. So my skepticism continues to increase.
  4. The author of the article is an executive at Adobe. In fact, he’s the “director of worldwide education.” That doesn’t mean that his opinions are necessarily biased but it’s another reason for me to be skeptical.
  5. The article claims that “[There are] better study habits and performance with tablets.” Only one study is cited to support these sweeping claims: a Pearson Foundation “Survey on Students and Tablets.” For example, the author states that “86% of college students who own a tablet say the device helps them study more efficiently, and 76% report that tablets help them perform better in their classes,” among a few other claims. Even if this study were flawless, the author needs a whole lot more evidence to support such a broad claim.
    1. To their credit, Pearson offers to share methodological details about and data from their survey if you just ask them; I haven’t asked so I don’t have any more detail than what is provided in that 2-page overview. But we do know that the survey was conducted online. Given that about 20% of people in the U.S. do not have access to the Internet (the Department of Education estimates 18.6% and the Pew Internet & American Life Project estimates 21%), it seems unlikely that an online survey can produce data that are representative of the entire population. It seems particularly problematic to omit non-Internet users when asking about technology since the results will almost certainly be skewed. (A minimal numeric sketch of this coverage bias appears after this list.)
    2. Even if we accept that the Pearson numbers are accurate or in the right ballpark, I’m still not sure if they’re very informative. I guess it’s interesting that many young people think that tablets will help them study more efficiently and that they will replace textbooks in the next five years. I just don’t think that we can use these data to make any predictions.
    3. Let’s ignore the validity issues with some of Pearson’s data (e.g. people are notoriously bad at distinguishing between “what I like” and “what is most efficient/effective”) so we can move on.
  6. The author correctly asserts that digital textbooks can include more features than printed textbooks, including “video, audio, animation, interactive simulations and even 360-degree rotations and panoramas.” However, the author does not say how we’ll produce all of that additional material. I don’t expect the author to solve every challenge associated with his predicted revolution but it would be nice to at least acknowledge them instead of glossing over them or ignoring them entirely.
  7. In the next section of the article, the author claims that “interactive learning leads to better retention.” The only evidence cited is this news article about a study of elementary and high school students using 3D technology in science and math classes. Of course, since I’m an academic snob I think it would be much better to cite a primary source, preferably one that has been peer-reviewed, than to rely on a popular press article. Once again, even if we accept that this study is perfect it’s not even close to being enough to support such a broad claim.
  8. Next, the author claims that digital publishing can help us better “[understand] learning effectiveness” using “integrated analytical tools.” I have no issue with this as a broad theoretical claim. But it seems to completely bypass the fact that U.S. higher education is in complete disarray in terms of settling on even broad learning objectives, much less specific objectives and associated assessment tools or indicators. (Look into the “tuning project,” especially the “Tuning USA” project, to get an accurate view of these issues.)
  9. The next claim the author makes is that “digital publishing makes knowledge more accessible.”
    1. The author must be using “accessible” in a different way than I commonly use it because it’s hard to take that claim seriously given (a) the lingering digital divide, participation gap, and similar inequities in the U.S. and (b) the immense resistance many digital publishers have exhibited to making their content accessible to the visually impaired.
    2. Once again, the author focuses solely on a possibility offered by the technology without giving any thought to the cultures in which the technology is embedded. He writes that “digital publishing allows professors or subject matter experts to self-publish their own educational materials or research findings and distribute the information on tablet devices” without offering even the barest hint of how this will occur, given the systems that would have to be adjusted or overturned to support it. In other words, why would faculty do this? What is the incentive?
    3. Similarly, the author claims that “by harnessing interactive technologies, educators can explain even the most complex scholarly or scientific concepts in compelling and intelligible ways.” Once again, I accept this broad claim (ignoring the “even most complex” qualifier because it’s just silly) in theory but balk at it in practice. It takes complex skills to create effective interactive content, skills that are different from those possessed and valued by faculty in many disciplines.
  10. At this point I’m just tired of reading these grand claims supported by flimsy or no evidence…
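
To make the coverage-bias worry from point 5 concrete, here is a minimal back-of-the-envelope sketch. Only the 86% agreement figure and the roughly 20% offline share come from the discussion above; the agreement rate among the excluded, offline group is an entirely hypothetical number I made up for illustration.

```python
# Back-of-the-envelope sketch of coverage bias in an online-only survey.
# From the discussion above: ~86% agreement among surveyed (online) respondents
# and a roughly 20% offline share of the U.S. population.
# The agreement rate among offline respondents is a made-up assumption.

online_share = 0.80     # share of the population an online survey can reach
offline_share = 1.0 - online_share

agree_online = 0.86     # reported: 86% say tablets help them study more efficiently
agree_offline = 0.40    # hypothetical: the excluded group may answer very differently

reported = agree_online
population_estimate = online_share * agree_online + offline_share * agree_offline

print(f"Online-only estimate:     {reported:.0%}")             # 86%
print(f"Population-wide estimate: {population_estimate:.0%}")  # ~77%
print(f"Overstatement:            {reported - population_estimate:.1%}")  # ~9.2 points
```

The exact gap depends entirely on the made-up offline number; the point is simply that omitting one-fifth of the population can move a headline figure by several points in either direction.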

I’m not a Debbie Downer or a Luddite. I agree with the broad proposition that digital publishing has the potential to make a huge impact on U.S. higher education. And I agree that tablets are super cool and very useful in some circumstances; I purchased an ASUS Transformer a few months ago to replace an ailing netbook and I’m very happy with my purchase! Fundamentally, I distrust the claims made in this article because the author fails to support them. Even when the author provides cherry-picked examples and studies, they are often of poor quality and always insufficient to support those claims. This is quite disappointing since the author could have easily drawn upon the large and rapidly growing body of evidence in this area. I expect very little from an article published by Mashable, and this article delivered.

NASPA Expands Voting Rights to All Members

I have been extremely critical of NASPA’s disenfranchisement of graduate student members, especially since that effectively negated the membership’s desire to merge with ACPA. So I was very happy to receive the following message in an e-mail from NASPA:

After a month-long voting period, the NASPA voting delegates overwhelmingly approved a proposal to revise the voting structure of the association to allow associate affiliates, graduate student affiliates, and emeritus affiliates the opportunity to vote in elections for the chair of the NASPA Board of Directors (previously NASPA President) and Regional Directors (previously Regional Vice Presidents).

“As a result of member feedback, the Board of Directors voted unanimously in May to submit this Bylaw Amendment to NASPA’s Voting Delegates,” said NASPA President Patricia Telles-Irvin. “I feel strongly that this was the right thing to do at this point in time, and I am so gratified that the Voting Delegates agreed and voted so overwhelmingly in favor of the change.”

“Graduate students, in particular, have been increasingly active within NASPA and have been its fastest growing membership type over the past year,” said NASPA Executive Director Gwendolyn Jordan Dungy. “I am particularly pleased to see the governance structure adapted to better recognize the contributions [of] our members along the full spectrum of the student affairs career trajectory.”

The expanded voting rights will go into effect immediately with January’s ballots.

That this was necessary, and that the organization denied full voting rights to over a quarter of its membership, will remain stains on NASPA’s history. But it’s wonderful that the voting delegates have voted to remedy this injustice as we move forward. Well done, NASPA!

Thumbs Down for CBS News NSSE Article

There are many different angles one could take in reporting on the 2011 NSSE Annual Results; it’s a dense 50-page report. I know that every group has its own agenda and every reporter has his or her own personal interests, but it’s very disappointing that CBS News chose the snide headline “Business majors: College’s worst slackers?” for their article. In an ordered list, something must be last. In this case, some major must rank last in the number of hours students typically study each week. But to label that group of students “slackers” simply because they fall at the bottom of the list is unnecessarily mean and unprofessional.

Fun Time of Year: NSSE Annual Results Released

The 2011 NSSE Annual Results were released today. I don’t want to focus on the content of the report in this blog post. Instead, I am briefly noting how fun it is to work on a project with a large impact that regularly receives attention from the press (even if some of that attention is negative, which is a very interesting experience in itself). It’s gotten more fun each year as I’ve become more involved in much of what we do; this year I directly contributed by writing part of the report itself. Yes, it’s ego-boosting to see my work in print, but more importantly it helps address a very serious and difficult problem that vexes many researchers and administrators in higher education: it’s hard to explain to others, especially our parents and extended families, what we do. Instead of trying to convince them that I really have graduated (several times!) and am not wasting my whole life in college, I can send them the report and articles from the New York Times and USA Today and say, “Look – this is what I do!”

Now I get to watch media reports and subsequent discussions to see how they play out and what they will emphasize. This process is unpredictable and it has surprised me in previous years when relatively small bits of information have caught on to the exclusion of other interesting and important information. As The Chronicle of Higher Education notes, this year may be a bit different given recent events but who knows how things will play out.

Two Quick Observations Regarding Online Community

I’m buried in work and research, but I have two thoughts dancing on my mind, and they’re both related to online community:

  • I hate it when websites or tools list reader comments in reverse chronological order, i.e. newest messages first. I finally figured out why I hate that: it makes it very difficult to view the messages as a coherent discussion within a pre-existing social context. Because new participants are not immersed in the context of the ongoing discussion, they can easily view the opportunity to comment merely as a way to shout messages without any responsibility to engage with or form a community. Mediated communication is difficult enough without us actively encouraging antisocial behaviors and views. (A minimal sketch of the chronological alternative appears after this list.)
  • Our obsession with tools and technologies leads us to underestimate or ignore the social effects and communities that build up around them. I see this happen all the time on Wikipedia when new editors leap into articles without any understanding of the cultural norms of the immense community of users who have used Wikipedia for years. It’s sadly naive to believe that such an immense collection of resources doesn’t have a correspondingly large and complex community with cultural and social norms and expectations.
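
As a small illustration of the first point, here is a minimal sketch of the chronological alternative. The comment structure and field names are hypothetical rather than any particular platform’s API; the point is simply that sorting oldest-first presents each message in the context of what preceded it.

```python
# Minimal sketch: presenting comments oldest-first so each message is read
# in the context of what preceded it. The comment structure and field names
# are hypothetical, not any particular platform's API.

from datetime import datetime

comments = [
    {"author": "carol", "posted": datetime(2011, 11, 3, 9, 15), "text": "Agreed, with one caveat..."},
    {"author": "alice", "posted": datetime(2011, 11, 1, 8, 30), "text": "Great post!"},
    {"author": "bob",   "posted": datetime(2011, 11, 2, 14, 5), "text": "Replying to alice's point..."},
]

# Chronological (oldest first), not reverse chronological:
for comment in sorted(comments, key=lambda c: c["posted"]):
    print(f"{comment['posted']:%Y-%m-%d %H:%M}  {comment['author']}: {comment['text']}")
```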

New Research from EDUCAUSE & Statistics Canada

[Infographic: 2011 ECAR National Study of Undergraduate Students and Information Technology]

Results from three research studies were released late last week. Two of them come from EDUCAUSE; I’m going to their annual conference this week and I’m really looking forward to attending presentations related to these studies.

  • EDUCAUSE – or more accurately its research arm, ECAR – released results from the 2011 ECAR National Study of Undergraduate Students and Information Technology. As always, the researchers at ECAR have done a great job summarizing the results and making actionable recommendations for colleges and universities. The survey is undergoing changes and two different versions were administered this year. Although the report’s methodology is not described in as much detail as I would like, it seems that moving to third-party administration has addressed some of my consistent concerns about non-response bias in this survey and the generalizability of its results. EDUCAUSE also commissioned an infographic to summarize some of the results.
  • EDUCAUSE also released some data from the Core Data Service (CDS), their annual survey of member institutions. They have not yet released the summary report but they have released other reports, including new “almanacs” that summarize data for large aggregations of Carnegie Classifications. The CDS has also been redesigned and several of these reports are new. However, I am puzzled that they continue to use the outdated 2000 Carnegie Classifications. Not only are the categories outdated and no longer used, but the data on which they are based are well over a decade old.
  • Statistics Canada, a government agency roughly analogous to the U.S. Census Bureau, released results from the 2010 Canadian Internet Use Survey. Comparative data are often useful and interesting to me, especially data from Canada and other countries culturally and economically similar to the United States. Unfortunately, only a few summary tables are available; you have to pay for other data. Hopefully Canadians can access these data for free and I am only being quoted a price because I am connecting to the Statistics Canada website from a U.S. IP address.