But as someone who makes as much as a quarter of his income from teaching college classes in any given year, and who also spends a good amount of time speaking at conferences trying to help professors incorporate technology and social media into their curricula, I can tell you the view from the trenches is very different from what the iPad-in-every-backpack proponents would have you believe.
This is not to say that tech isn’t changing the way we teach and the way students learn: it most certainly is. But probably not as fast as some people outside of higher ed think it is.
People who say we’re at the dawn of a new way of learning at the college level are overlooking some rather significant economic and cultural hurdles. At the same time, academic freedom means professors can choose to incorporate technology into their curriculum a lot, a little or not at all. And incorporating it “a lot” isn’t always a good thing, particularly if it isn’t used in a way that boosts learning outcomes.
We (Don’t) Have The Technology
If you were to visit the library on the campus where I teach, you would see students waiting to use outdated desktops in the computer labs, particularly around midterms and finals week. That seems odd at first, considering the school has a laptop requirement for all undergraduates: you have to have a laptop computer when you enroll, and presumably, as an instructor, I can require my students to bring theirs to any class.
But here’s the reality: laptops break, and students can’t afford replacements.
The mainstream media has sold us a myth of college still being the place for the ultra-elite, for kids who start compiling “brag sheets” in the fourth grade and have parents that shell out five figures to hire a college admissions coach.
But in practice, most college students these days are like the ones I teach at a four-year state college: they are, by and large, the first in their families to attend college. Almost all of my students work, and many work full-time or multiple part-time jobs. Some are parents. An increasing number are so-called nontraditional students, enrolling after an extended break from education. These students often support families and, in many cases, have college-aged children who need their own laptops.
Now factor in that the fastest-growing segment of higher education is community colleges, which by and large draw students from working-class backgrounds or cater to people who have been laid off and are trying to retrain for a new career.
For a lot of students, replacing a broken laptop is a choice between skipping a rent payment and sucking it up and waiting in those long lines at the computer lab. Asking them to shell out for an iPad on top of the laptop just isn’t feasible, and that means it’s going to take longer to get everyone on board with the tech revolution in higher ed.
Tenure Doesn’t Equal Tech Savvy
One of the concerns among students on the campus where I teach is that the university employs an alert system that sends them text and email messages if there is a life-threatening emergency on campus (think Virginia Tech in 2007). But what are they supposed to do, these students ask, if they’re in a class where the teacher bans them from using smartphones and laptops?
Academic freedom means professors get to run their classrooms in the way they want, and that includes choosing the tools they use to teach. Having sat in meetings where faculty members have threatened to file union complaints because email means students can – GASP! – contact them at any time, I think we’re a ways off from blanket incorporation of social media and tablet textbooks across the curriculum.
These same professors, many of whom predate the Internet era in higher ed, never concede that email also means fewer student visits during office hours for simple questions, which means more time to get actual work done. This isn’t meant as a knock on them, but there are varying degrees of enthusiasm for incorporating tech into teaching and, unlike high schools, tech enthusiasm can’t be mandated by a curriculum committee.
High School’s Chilling Effects
Career academics are not, however, the only ones to blame. A lot of students come to college with backward views of what social media is, what it can accomplish and, most importantly, what is and isn’t acceptable on it.
And why shouldn’t they? They come from schools where teachers can be reprimanded or even fired for connecting with students on social networks. Several schools across the country are implementing bans on teachers friending not only current students but former students on Facebook.
There’s no easy fix for overcoming these preexisting biases. Step one, as a professor, is to make sure you don’t use Facebook for classwork: even though it’s the default social network for so many of us, there’s still too much of a creep factor in crossing that student-professor line (and, frankly, with Facebook’s ever-shifting privacy policies, even if you think you’re protected, you may end up seeing things about your students you’d be better off not knowing).
But that leaves us to decide which social network we should use with our students. Dedicated social networks like the one being rolled out for students by Microsoft seem like a good idea, but my own experience is that a site students check for reasons other than school tends to produce more frequent check-ins and a more organic discussion about classwork, which is exactly what I want to accomplish with social media in my classes.
I tried using Google+ last September, only to be thwarted in a freshman writing class where some of the students were not yet 18. Google has since relaxed its age restrictions, but the network is still too new for students to gravitate toward it. In my experiment, students found it confusing, or at least less intuitive than Facebook, and most would use it only if I mandated it.
I’ve had the best luck with Twitter, including using it in a film class so we can discuss each film as we screen it (for a sample, see this Storify of tweets from the class discussion of The Shawshank Redemption). But, again, only about half of my students will use it if I don’t require it. And of the students who start using it because I require it in my class, fewer than 10% will continue to use it when the semester ends.
Hope On The Horizon: The Kindle Effect
The people I thought would be most resistant to adopting technology in their classrooms have, in many cases, been the most willing to change. I now see plenty of those seemingly stodgy old English professors walking around campus with a Kindle tucked under their arm.
“I still love all of my books,” one told me recently. “But this is so much easier when you have three or four going.”
Science departments have always been earlier adopters than the humanities, but as colleges increasingly compete with one another for students, and as students increasingly look at college as job training, the colleges that at least acknowledge that social media and tech are an integral part of the workplace and the world are going to do a better job of recruiting top-tier students.
When I started teaching five years ago, a colleague called me “tech savvy” because I would use YouTube videos to supplement class lectures. Now that same colleague tweets regularly with students, has given up PowerPoint for Prezi and is having students use Google Earth to map the plot of a book they are reading for a class assignment.
Progress in adopting technology in academia is probably slower than the people who make those technologies would hope, but it is happening.