Now that faculty are dealing with the digital divide, there is only one choice left: to reconcile both sides of the debate. And the way to pursue that choice is to remember that faculty are perpetual learners.

One and a half lines in an e-mail started me on a journey that may be one of the most vexing—and important—in my teaching career. The e-mail was directed toward the 13 faculty members at my university who had enrolled in an upcoming course-design workshop, and it ended with this last-minute suggestion: “If you use Twitter, Flickr, Delicious, etc., or have a blog, the tag for the institute is GuelphCrDI, so capture it! Tag it! And share it!”

The recipients of that message were, no doubt, divided into two camps: those who could skim easily across that sentence, untroubled by semantics or phrasing, and those who hesitated, stumbled, re-read—and still did not grasp its meaning beyond recognizing references to newer forms of digital technology. I fell into the latter group. Membership in this group is not comfortable these days, because it suggests ignorance about a topic—emerging technologies—that some claim should be the foundation of post-secondary pedagogy. This claim, however, is countered by those who argue that a pedagogical approach centred on digital technology panders to students’ desire to be entertained, reinforces already weak attention spans, and erodes through neglect students’ higher-level cognitive skills.

Arguments on both sides have merit, but the language used to advance these arguments is often divisive and emotion-laden. And always lurking just below the surface is the element of ageism: those who don’t champion the technology fear being cast as aging Baby Boomers, covered in chalk dust and fearful of what they don’t understand. Those who do champion the technology contend with speculation that they themselves are its unwitting victims, whose compulsion to text surreptitiously during faculty meetings and student exams calls into question their ability to provide a balanced, mature perspective.

Because gross generalizations are seldom a precursor to effective pedagogy, this article attempts to present objectively the arguments of both camps and to offer an approach to connecting the two groups on either side of this Digital Divide.

Arguments for the Increased Use of Technology-Based Pedagogy

1. Learner-centredness

Central to this debate is the current generation of post-secondary students, born between 1980 and 1994, and their particular learning style. Marc Prensky, an avowed—and frequently cited—proponent of more educational technology, describes the most salient feature of this generation as their lifelong immersion in digital technology. Computers, cell phones, video games, and digital music players have been their constant and, therefore, indispensable toys and tools since birth—hence the term Digital Native to describe these students and hence the claim that by the time they arrive on campus, they’ve spent more than 10,000 hours playing video games.

This prolonged exposure to digital technology, together with the notion of brain plasticity, has led to another, more scientifically based claim: Digital Natives have developed brains that are different from those of their parents—and those of their professors. Neuroscientist Gary Small explains this difference between young and old as a “brain gap”, whereby daily, lifelong exposure to digital technology (computers, video games, and search engines such as Google and Yahoo) has literally shaped the way our students think: certain neural pathways have become strengthened through habitual use, while others have weakened through infrequent use. The result has been likened to brains that are “hard-wired” to prefer speed, multi-tasking, and non-linear access to information—and to have a low tolerance for lectures, lengthy text, and passive forms of acquiring information. If that’s the case, the bored faces in the lecture hall and the persistent web surfing throughout the tutorial session aren’t necessarily examples of discourteous student behaviour. Instead, they may be evidence of the ill fit between traditional teaching approaches and Digital Natives’ learning styles.

The June 2009 report from Demos, a UK think tank, echoes this view. The 90-page report addresses the issue posed by its title, Why Higher Education Must Embrace Technology, in part by arguing that emerging technology’s visuals, sense of immediacy, and capacity for simultaneous communication all target the Digital Native’s learning profile. In doing so, this technology also narrows the gap between the student’s in-school and out-of-school worlds. The result, presumably, is a smoother transition from home life to school life, as the burden of having to move from a technology-rich world at home to a technology-limited (or perhaps even banned) world at school has been lifted. For these reasons, the British report encourages universities to make greater use of tools such as Twitter and online forums and to recognize, for their teaching and leadership, those faculty members who advocate for technology.

2. A strategy to strengthen the university’s competitiveness

But Demos has another agenda besides enhanced student learning when it advocates increased technology in the classroom. Increasingly, university administrators are realizing just how useful technology can be in terms of supporting their institution’s competitiveness, perhaps even its survival, in the global marketplace. Technology is presented as a way for British universities to cope with reduced public funding, vigorous competition, increased demand fuelled by high unemployment, and increased student diversity—all challenges that certainly resonate here in Canada as well. Indeed, a well-thought-out, more strategic use of technology would allow any university to become more flexible and accessible, opening its virtual doors to a bigger student body, including the sprawling international market. And underlying these advantages, of course, is the financial incentive: virtual classrooms are relatively cheap to build and even cheaper to maintain.

3. Workplace literacy

The need for digital literacy in the workplace is so apparent that it hardly needs reference. Harvard Business School’s Andrew McAfee blogs (appropriately) on one essential theme: the engine behind American business competition is information technology (see his blog at http://andrewmcafee.org/blog/). Jack Welch, former CEO of General Electric, famously signaled its importance over a decade ago when he encouraged older managers to learn about the Internet from young employees. Today, this process is known as “reverse mentoring”, and it has spread to encompass knowledge transfer in the workplace about iTunes, text messaging, wikis, blogs, and social networking sites.

And digital literacy is not just needed on the job. Increasingly, it’s needed to get the job. An estimated 68 per cent of employers in the United States use social media such as Facebook and Twitter for recruiting purposes.

In other words, to succeed in a work world that’s increasingly based on digital technology, students need—and expect—to be immersed in this world throughout their post-secondary education.

4. High-level cognitive skills

In 2008, the National Council of Teachers of English (NCTE), a 60,000-member American organization, released its definition of 21st century literacies, along with six related learning objectives. Noting that “technology has increased the intensity and complexity of literate environments”, the NCTE document identified as first among its learning objectives the need to “develop proficiency with the tools of technology”. The remaining objectives implicitly refer to the potential learning outcomes of using these interactive tools of technology: collaborative, cross-cultural problem-solving; construction of knowledge to be shared globally; analysis and synthesis of multiple streams of simultaneous information; creation and evaluation of multi-media texts; and attention to the ethical responsibilities required by these complex environments. These learning objectives all fall within the top tier of Bloom’s taxonomy—and together they illustrate an important argument for technology-based pedagogy: its potential to allow learners not just to consume knowledge but also to create it.

A recurring theme throughout Don Tapscott’s Grown Up Digital is the power of technology to elevate its youthful users to “become smarter than their parents ever could be”. For example, the argument goes, hours spent playing video games lead to heightened skills in visual processing; spatial coordination; discovery through trial and error and hypothesis testing; cooperation with opponents; creative problem-solving; and strategizing. Consequently, by virtue of their lifelong digital immersion, many students are said to be arriving at the post-secondary classroom with impressive proficiency in these higher-level skills. Given these capabilities, a focus on the lower-level cognitive skills of acquiring, knowing, or memorizing factual knowledge bores students and wastes their time. The Internet, after all, provides quick access to any facts the students will likely ever need. Post-secondary educators, therefore, should focus mostly, if not solely, on furthering the students’ higher-level skills using the technologies that fostered these skills in the first place. Technology proponents argue that to do otherwise represents a step backwards (or, more accurately, a refusal to move at all), ignoring the students’ current capabilities and the educational potential of emerging technologies to take these capabilities even further.

Skepticism About the Increased Use of Technology-Based Pedagogy

The counterparts to the ostensibly quick-thinking, parallel-processing—but definitely young—Digital Natives are the Digital Immigrants, those who were born before 1980 and who, therefore, did not experience digital immersion from birth. Because of this lesser exposure, Digital Immigrants are less likely to possess the technological ease and fluency of the Digital Natives. Their numbers include most post-secondary faculty. And they are perceived by some as a significant obstacle to the educational progress represented by increased technology-based pedagogy: “I think the problem is the faculty—their average age is 57 and … (their) model of learning is pre-Gutenberg. We’ve got a bunch of professors reading from handwritten notes, writing on blackboards”.

Not surprisingly, this kind of rhetoric—the kind that suggests an inverse relationship between faculty age and teaching ability—leads to some impassioned responses from the other side of the Digital Divide. A summary of those responses follows.

1. The use of digital technology can promote “intellectual laziness”

By encouraging “horizontal modeling”

Professor Mark Bauerlein’s thesis is contained in his book’s title, The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future (2008). He cites study after study suggesting that Digital Natives in the United States—and presumably Canada—know little about politics, history, literature, science and, except for celebrity gossip, current events.

Digital immersion for this generation means hours spent on the Internet socializing with peers and following pop culture. A multi-year, comprehensive, ethnographic study based on interviews with more than 800 American youth and their parents, The Digital Youth Project, confirms this point. It uses the terms “always on communication” and “hypersocial” to describe young people’s use of social networks, instant messaging, and mobile phones. However, time spent on Facebook and MySpace, two websites popular among students, means less time spent on more intellectually valuable pursuits, such as reading books and forging relationships with their elders, the people who in earlier days acted as mentors and role models. In short, teenagers who spend more time with their peers than with anyone else may lack role models who will set high standards and enforce the discipline needed to achieve them.

By undermining “deep reading” and therefore analytical skill

The following scenarios are frequent topics of conversation—and commiseration—in faculty lounges. Students come to class not having read the assigned text. In fact, in anticipation of not reading, they may not even have purchased the text. Students ask questions throughout the semester that indicate they haven’t read the syllabus, at least not in its entirety. They skim; they scan. If a text is online, they click through to the next link, seldom returning to that unfinished page.

But students aren’t the only ones doing this. In an article entitled “Is Google Making Us Stupid?” (2008), Nicholas Carr, former executive editor of the Harvard Business Review, describes his newfound impatience with reading books or lengthy articles and wonders if the Internet is to blame:

What the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.

Dismissing the above as a baseless complaint, an anecdotal effort to foist the blame for one’s own intellectual laziness on modern technology, would be easier if the work of Herbert Simon, Nobel Prize recipient, had not presaged this dilemma almost 40 years ago. The relationship between hours spent on the Internet and a reluctance—or inability—to read lengthy text may be explained by Simon’s description of the attention economy, in which, he warns, “a wealth of information creates a poverty of attention …”. In other words, we risk losing the ability to pay attention when too much vies for our attention. Trying to attend to the overabundance of digital information that confronts everyone—but particularly, given their high usage, Digital Natives—means that we’re probably spreading our attention too thinly.

And all of that puts us at risk of “turning into ‘pancake people’” who have sacrificed the analytical depth of thought that reading fosters for a superficial breadth of knowledge garnered from “zipping along the surface” of online information.

By encouraging the “myth” of multi-tasking

Some technology proponents would disagree with the above sentiment by arguing that Digital Natives know how to handle “an overabundance of information” through multitasking. In a Salon.com interview, social critic and author Hal Niedzviecki describes this generation as “constantly on the move, instantly Twittering, Twitpic-ing … they’ll be doing that at the same time they’re blogging and updating their status and making little movies and sending them to YouTube”. And all this, a faculty member might add, while they’re sitting in the front row of a lecture.

Scientific evidence, however, suggests that multi-tasking is a myth. When people think they are multi-tasking, they are actually engaged in “continuous partial attention”, switching back and forth between competing activities. The resulting performance will likely be inefficient and error-prone, particularly if the task at hand is a challenging one. There’s a reason, in other words, for legislation banning texting while driving.

But perhaps the most obvious—and visceral—evidence that people tend not to handle multi-tasking well comes from London, England, in 2008 “when Brick Lane, the fashionable east London street, announced that it was henceforth padding its lampposts as a preventive measure against the growth of ’talk and text’ injuries that were maiming thousands of the young hipsters who amble along it”.

2. Digital immersion can lead to constant yet weak connectivity

Problems of errors, inefficiencies, and bruised limbs are all relatively benign, though, compared to a more insidious threat faced by digital technology users: the need to always be in the information loop—even though the information is almost always banal and superficial. BlackBerry users who set up their devices to alert them immediately to each incoming e-mail and students who repeatedly check for Facebook updates may be exhibiting the same syndrome as the terrier who barks incessantly once you’ve left for work: separation anxiety.

This anxiety exacts a high cost. Being in a state of constant alert for any new contact or bit of news places a strain on one’s physical and emotional health. Compromised health is a steep price to pay, particularly when the relationships fostered through this constant connectivity tend to be weak, resulting in a pseudo-cyber community (e.g., Facebook “friends” whom the user doesn’t actually know; myriad blogs that no one reads). Indeed, the founders of Twitter attribute their technology’s appeal to the fact that “it’s connection with low expectations” of any real commitment.

And perhaps low expectations of any real achievement. A constant digital presence—and the pseudo-community it links the user with—may discourage a state of being often cited as the hallmark of great achievers; namely, sustained (and solitary) reflection and contemplation. Marx had the Reading Room of the British Museum, Thoreau had Walden Pond, and Einstein, the private world of his own imagination. From this perspective, the path to achievement—and often to personal well-being—is not through multi-tasking and constant contact with others, states often associated with Digital Natives and their use of technology, but through the opposite: focus, mindfulness, and even meditation.

3. Not all post-secondary students are Digital Natives

Recent studies of first-year university students in Australia confirm what most faculty have observed in their classrooms: apart from Google, cell phones, and e-mail, students vary considerably in their technological proficiency and preferences. For example, most of the 2,000 students surveyed in one study had never created a website, kept a blog, participated in a web conference, used RSS feeds, or contributed to a wiki. A pedagogical approach that argues for enhanced use of digital technology based solely on student age therefore rests on a thin foundation. And students who don’t neatly fit the Digital Native profile risk falling through its cracks.

An Approach to Reconciling the Two Sides
I must lie down where all the ladders start,
In the foul rag and bone shop of the heart.
W.B. Yeats

The arguments offered by camps on either side of the Digital Divide point to the major challenge facing post-secondary educators: both sides of the argument make sense. Both sides present compelling and logical (albeit often anecdotal) evidence that suggests we should support, and yet also be skeptical of, the increased use of technology-based pedagogy.

We are, therefore, left with only one choice: to search for a way to reconcile both sides. This may be an onerous task, but it’s not a new one. The economist E.F. Schumacher framed our challenge years ago when he said, “the true problems in living … involve reconciling opposites”. And, fortunately, F. Scott Fitzgerald provided the motivation to pursue this problem when he pointed out that, “the test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function”.

If our goal as educators is to help our students acquire a “first-rate intelligence”, then surely our fundamental task is to model the process by which this goal can be achieved. We have to embrace the sound evidence and the good sense presented by both sides of the Digital Divide, no matter how opposing or contradictory those sides may appear, especially at first glance. To do so is not just our challenge; it is our obligation as educators.

Fortunately, it’s also our heartfelt desire, one that resonates with who we really are: perpetual learners. Most of us, after all, never stopped being students. A love of pursuing our own studies led to a faculty appointment that involved, almost incidentally, teaching. But our heart’s desire is to learn. And understanding the Digital Divide, especially in light of the technology’s rapid proliferation, offers many ongoing learning opportunities. There is an opportunity to learn more about technology-based pedagogy from the camps on both sides, and an opportunity to learn from the one side that matters most in this debate (and the one most likely to have hands-on expertise in the subject matter): the students.

Universities haven’t traditionally given much credibility to decisions based on the heart. But that’s where we’ll find the priority that unites us all: our love of learning. Remembering this priority will help us accept—and figure out how to apply—the sound arguments both for and against technology-based pedagogy. And in doing so, we’ll bridge the technological divide that separates us from our teaching colleagues and from our students.

Joan Flaherty is an assistant professor in the School of Hospitality and Tourism at the University of Guelph. Her research interest is the scholarship of teaching and learning.