Cut and Paste Research

[Image: research icons surrounded by cut lines]

Taxiing through the orchestral lull of the internet, with its many Flash-powered ads and as many sub-screen distractions crammed into one, I know there are a few students with a paw on their laptop trackpads, maneuvering in a desperate sweat to finish the term paper and grab-dragging all they can from Wikipedia.

We know the fate of such pilfered and clipboarded items. These bits and bobs of “information” will be unceremoniously dumped into an open Word file, only later to be tortured for anything remotely usable or flagrantly rewritten in the hideous idiom only students of the worst kind can master. The bibliography, I know, will be very thin. If it has any books at all, that would suggest the unthinkable: that the student had mustered up the courage to break the ogle marathon with the machines of convenience and make a trip to that foreboding Dracula’s castle known as the library (insert spooky music, thunder, and a cloud of bats pinging across a bloody moon here).

Some students, in their gross self-entitlement and intransigence about what constitutes proper research, will no more make the pilgrimage to that grand archive of collected knowledge than a non-Muslim would elect on a whim to circle the Ka’ba. Instant, freeze-dried knowledge in 140-character bursts is all they want and, in their view, all they need. The requirement that proper research be done is an alien concept to many of the 21st-century student cohort, who simply cannot fathom what purpose it could serve to develop a good research question, form a good thesis, and substantiate it with carefully selected sources. Sleuthing through a dusty old building and poring over dusty old books seems almost repugnant to this bunch, who may actually believe that if it isn’t on the internet, it simply doesn’t exist.

To my own discredit, I’ve seen my own research shift more to the online marketplace, pulled there by the immense magnetic attraction of instantaneity and my own indolence. Yet there upon my walls stands mighty shelving, holding a mosaic of variegated book spines, each one promising to be either a trove of wisdom or a compendium of irrelevance. As a committed bibliophile who spends so lavishly—and recklessly—on books, I could be pardoned for getting a bit shirty about students who manage to sail through their entire educational lives with only a cursory rub of the nose in the occasional book, and usually only under some grade-related coercion.

I harp on this far too often, to the point that it exasperates even those who share my opinion. In what must be the early onset of old age, I am becoming quite crusty and crotchety about the young. The fact that I have been able to make such an us-and-them division speaks volumes—volumes the young will never read unless they are abridged in digital format—about how outmoded I have become, relying as I do on the archly divisive binaries I usually rail against. The equation seems to make perfect sense, even if the outcome is ironic: mass literacy has resulted in increasing illiteracy. Of course, I anticipate the immediate objections of the more sober-minded and inclusive media theorists of the day: the definition of literacy must be expanded to include digital literacy. Yes, of course, and my spam folder is constantly refreshed with the new avant-garde poetry and prose of some of our new digital literates: eloquently Ciceronian appeals to provide my banking information, worldly yet quaintly misspelled sonnets to male member extension, erudite notices of winning foreign lotteries, and vigorous quatrains on how my PayPal account has been compromised. I even developed a course on digital literature, partly to appease the younger generation’s appetite for all things virtual and partly as a means of trying to find solace in the new literary medium. It is not as though I am staunchly opposed to the web and its many possibilities, but I am foolish enough to believe that the internet should be a supplement to, and not a replacement for, books.

My current compromise is a faint whisper compared to the thumping bibliophile railing of my extempore sermons whenever the subject comes up. I ask that at least half of the bibliography contain physical specimens of text, i.e., books and journal articles. The other half can be the usual litany of web citations with their download dates and URL gobbledygook. Try as I might, when opportunity allows, to extol all the secret delights and technological superiority of the book, I can tell that the students wish I were a window that could be minimized. My saturnine, Mencken-esque reflections receive their rebuffs in the blank stare of an audience that finds what I have to say quaint, perplexing, and arrant nonsense.

How could anyone keep a straight face to hear yet another codger giving vent to his woeful lament about the current state of education and the depths to which the once noble book has fallen? A trip through history—available even on that biblioclastic medium of the internet—charts the long and fascinating evolution of the book: from the monastic middle ages, when a vellum book of 400 pages took about eight cows to manufacture, to the revolutionary Gutenberg press circa 1450, to the creation of the first public lending libraries during the populist educational reforms of the 19th century, right up to the cheap mass-trade paperbacks that can only be read once before their spines split and that usually end their lives in church bazaars, Salvation Army thrift stores, and estate sales. Books have been responsible for the dissemination of ideas and the flaring of controversies. They have been a faithful companion to the rise of science, changing the world with revolutionary ideas that precipitated actual revolutions. Arguably, the internet has taken on that role and has made the book largely redundant. It offers not even a small acknowledgement or dirge for the printed word, giving the impression—perhaps now espoused by the young—that books never held any importance beyond being quaint objects consigned to sit among other useless antiques like pomade, butter churners, and Betamax machines.

My main contention is not to praise or bury books and their illustrious legacy but to signal the icy toboggan descent of what passes for student scholarship in research. I do not mean all students, either, but rather a growing ersatz majority of them. Their alarming click-cut-paste numbers continue to swell and press themselves against the doors of higher education, eventually making it through the tuition turnstile by sheer force. Research and shopping seem to converge ever more, and I cannot but entertain the terrifying thought of students going to their machines to do “research” at the web’s many info-malls. Here, one can find anything one wants, especially all the free facile renderings, chop-shop book reports, false reportage, and bogus opinions one needs. Worse still is the contemporary student’s tendency toward extreme literalism in research. A student at the portal of the info-mall needs to become orientated and so will plug a search term such as “abortion” into the engine. The server crunches and grinds the results and splatters them in an algorithm-based hierarchical list in a matter of milliseconds, returning 22.5 million entries for “abortion.” The gullible info-shopper will assume that the first entry in the list, being the most “relevant,” must also be the most credible research source. Hence, I may see the following in a student paper:

Abortion rates have skyrocketed in recent years, prompting Paula Pundit to declare this a state of emergency. According to Paula Pundit’s unbiased research on KillTheBabyKillers.com, “abortion is vile murder, period. Abortionists will all be roasted on Lucifer’s spit!”

Indeed. Credible and unbiased, the two most popular flavours of the internet’s worldwide warren of critical thought. Perhaps I am being a bit of a Luddite prude. It seems to me the modern student does all that is possible to dodge traditional research as if it were something very messy and unpalatable. And, indeed, this has expanded to several areas of our lives: joining the military is fine if it can be done via a video game, and being an activist is swell if clicking on the occasional “agree” button on a petition counts as being politically informed. Research is messy business, a kind of altercation with the past via a dust-up with challenging language in difficult-to-click books. There are no rollovers in the book, no hyperlinking to summaries. The book sits there, an inert, immovable text that has no Flash capabilities and no comment pit where anonymous screen-named persons can flex their opinionated witticisms and fan flame wars, trading off in the narcissist game of “quip pro quo.” No, comparatively, the book is an arid terrain of words and concepts with no helpful search toolbar on the heading of every one of its pages. Unless one has recently indulged in illicit substances, one will wait a very long time indeed for a chat window to pop up at the bottom of the page where “2Much4U” is dinging you with his “wazzup?” What a book demands is attention and patience—commodities as difficult for the modern psyche to acquire as a box of condoms in the Sahara, and as seemingly practical as a solar-powered submarine.

Pardon the segue, but I cannot leave this aside, given how much it irritates me. I may be considered an unreasonable tyrant, but I forbid emoticons in student essays. I have made similar moves to ban the exclamation point, as well as a growing list of words such as “totally” and “like,” and phrases such as “in my experience” and “I believe….” Any instance of “like” is usually a clumsy attempt at a simile or metaphor that cannot find its appropriate term. “Like” is a filler word, like, y’know, like, yeah.

I’ve tossed around the option of banning semicolons as well, since most people—young and old—have no clue how to use them. There is a popular yet wrong-headed idea that peppering an essay at random with the little barbed things makes the writing more “academic.” Incessant semicolon usage, especially when not one of them is even by chance employed correctly, only serves to make a research paper look like a punctuation fire-fight. The young perhaps use them far more frequently in those besotted emoticons, the standardized typographical currency of all human emotion. In such cases, I tell them that to be winked at while reading is rather distracting, if not lewd. There is positively no web-speak allowed, which includes the now worn-smooth and meaningless LOL, LMAO, ROTFLMAO, and any other acronym that vaguely resembles a sub-committee invented by the Soviet Gosplan or an awkwardly named probe launched by NASA. If anything, these prohibitions on what can be included in the research essay allow me a clear and unobstructed view of their vicious crimes against spelling and grammar and the brutal atrocities they commit against the rules of reasoning.

Truth is the first casualty of war—thus spake Wikipedia (after Aeschylus). In the age of internet research, substance is the first casualty of info-mining. Info-mining leads to info-dumping. The factoid and the tidbit are the sickly dauphins of knowledge and argumentation, and the personal anecdote is the usurper of vetted sources. What we are left with is a vast, disconnected constellation of orphaned quotations in the sequence that Google has arranged them, making a research paper look one part riddle and one part trivia. These web-harvests contain content connected only by virtue of sharing a few keywords. Nothing new is developed—just the perpetual act of reshuffling the same things in a different order. Of course, I should not be so harsh: there may be plenty of respected academics out there who have made their names on just this practice alone. Reorganization is important, too.

Whither depth and engagement with the research enterprise? I do not mean to paint a completely bleak portrait of today’s student, for I still have exceptionally talented and equally old-fashioned, research-savvy students coming through my courses. And, when Google completes its own Borgesian task of digitizing every book ever known, my complaints about web research will seem quaint and irrelevant. However, you will still find my obnoxious finger wagging when the grand optimists speak of the new virtual utopia made of text. Does digitizing everything really mean people will read more, or will we continue in our lazy habit of reading only what contains the explicit keyword to copy-paste? If it were preached to me that the web will make us better thinkers, it would strain my credulity. What such blind optimists really mean—or hope—is that universal access to information will precipitate a digital renaissance. But that reasoning is sloppy and wishful at best: access to healthier foods has not made us all magically thinner, nor did the opening of the first lending libraries make us all proficient in mathematics or connoisseurs of Latin poetry. The web is ungovernable, despite what the Chinese government believes, and the amount of frivolous garbage poured into it has made it an unthinkably vast psychological landfill. Sure, there is plenty of knowledge if one knows where to sift for it, but it is so much easier to let the search engine decide what has value.

There is no doubt that I will continue to be served student essays that read more like a random cull of websites. This will be exacerbated in proportion to what I call the second law of info-dynamics; namely, the entropic effects of “information” in its raw and vulgar import. Lacking any framework or context, research will come to resemble Dadaist vers libre, leaving it to the reader to do all the work of connecting isolated, orphaned fragments. These banal aphorisms masquerading as sustained research will be as perplexing and interpretively broad as the pre-Socratic fragments so sparingly bequeathed to history. These exploded bits, yanked from sources credible and dubious alike, show the modern web-based essayist to be the intrepid garbologist of the virtual, eager to reorganize content that has already been repeated and reorganized according to several arrangement schemes. This is not to say that there are no legitimate works on the internet worthy of inclusion in a research paper, but the method of acquisition (keyword searches), together with the absence of any concerted program for teaching students to discriminate valid web content from dross, hobbles these efforts. Books, too, are more legion than ever, with a new one published every 20 seconds.

Perhaps I have it all wrong. The days of poring over stacks of books and compiling meticulous records and synthesizing this information into a coherent articulation based on a well-formed research question, with an appeal to credible sources, belong to a slower, more patient time that was not being buzzed, dinged, or jolted by devices of distraction.

Kane X. Faucher is a professor in the Faculty of Information and Media Studies at the University of Western Ontario.