Cut and Paste Research

[Image: research icons surrounded by cut lines]

Taxiing through the orchestral lull of the internet, with its many flash-powered ads and as many sub-screen distractions crammed into one window, I know there are a few students, paws on their laptop trackpads, maneuvering in a desperate sweat to finish the term paper and grab-dragging all they can from Wikipedia.

We know the fate of such pilfered and clipboarded items. These bits and bobs of “information” will be unceremoniously dumped into an open Word file only later to be tortured for anything remotely usable or flagrantly rewritten in the hideous idiom only students of the worst kind can master. The bibliography, I know, will be very thin. If it has any books at all, that would suggest the unthinkable: that the student had mustered up courage to break the ogle marathon with the machines of convenience to make a trip to that foreboding Dracula’s castle known as the library (insert spooky music, thunder, and a cloud of bats pinging across a bloody moon here).

Some students, in their gross self-entitlement and intransigence on what constitutes proper research, will not make the pilgrimage to that grand archive of collected knowledge any more than a non-Muslim would elect on a whim to circle the Ka’ba. Instant, freeze-dried knowledge in 140-character bursts is all they want and, in their view, all they need. The requirement that proper research be done is an alien concept to many of the 21st-century student cohort, and they simply cannot fathom what purpose it could serve to develop a good research question, form a good thesis, and substantiate it with carefully selected sources. Sleuthing through a dusty old building and poring over dusty old books seems almost repugnant to this bunch, who may actually believe that if it isn’t on the internet, it simply doesn’t exist.

To my own discredit, I’ve seen my own research shift more to the online marketplace, pulled there by the immense magnetic attraction of instantaneity and my own indolence. Yet, there upon my walls stands mighty shelving, holding a mosaic of variegated book spines, each one the promise of being either a trove of wisdom or a compendium of irrelevance. As a committed bibliophile who spends so lavishly—and recklessly—on books, I could be pardoned for getting a bit shirty about students who manage to sail through their entire educational lives with only a cursory rub of the nose in the occasional book, usually only by some grade-related coercion.

I harp on this far too often, to the point that it exasperates even those who share my opinion. In what must be the early onset of old age, I am becoming quite crusty and crotchety about the young. The fact that I have been able to make such an us-and-them division speaks volumes—volumes the young will never read unless they are abridged in digital format—about how outmoded I have become, relying as I do on the archly divisive binaries I usually rail against. The equation seems to make perfect sense, even if the outcome is ironic: mass literacy has resulted in increasing illiteracy. Of course, I anticipate the immediate objections of the more sober-minded and inclusive media theorists of the day: the definition of literacy must be expanded to include digital literacy. Yes, of course, and my spam folder is constantly refreshed with the new avant-garde poetry and prose of some of our new digital literates: eloquently Ciceronian appeals to provide my banking information, worldly yet quaintly misspelled sonnets to male member extension, erudite notices of winning foreign lotteries, and vigorous quatrains on how my PayPal account has been compromised. I even developed a course on digital literature, partially to appease the younger generation’s appetite for all things virtual and partially as a means of trying to find solace in the new literary medium. It is not as though I am staunchly opposed to the web and its many possibilities, but I am foolish enough to believe that the internet should be a supplement to books, not a replacement for them.

My current compromise is a faint whisper compared to the thumping bibliophile railing in my extempore sermons whenever the subject comes up. I ask that at least half of the bibliography contain physical specimens of text, i.e., books and journal articles. The other half can be the usual litany of web citations with their download dates and URL gobbledygook. Try as I might, when opportunity allows, to extol all the secret delights and technological superiority of the book, I can tell that the students wish I were a window that could be minimized. My saturnine Mencken-esque reflections are rebuffed with the blank stare of an audience that finds what I have to say quaint, perplexing, and arrant nonsense.

How could anyone keep a straight face to hear yet another codger giving vent to his woeful lament about the current state of education and the depths to which the once noble book has fallen? A trip through history—available even on that biblioclastic medium of the internet—charts the long and fascinating history of the book from the monastic middle ages, when a vellum book of 400 pages took about eight cows to manufacture, to the revolutionary Gutenberg press circa 1450, to the creation of the first public lending libraries during populist educational reforms in the 19th century, right up to the cheap mass-trade paperbacks that can only be read once before their spines split and that usually end their lives in church bazaars, Salvation Army thrift stores, and estate sales. Books have been responsible for the dissemination of ideas and the flaring of controversies. They have been faithful companions to the rise of science, changing the world with revolutionary ideas that precipitate actual revolutions. Arguably, the internet has taken on that role and has made the book largely redundant. There is not even a small acknowledgement or dirge for the printed word, giving the impression—perhaps now espoused by the young—that books never held any importance beyond being quaint objects consigned to sit among other useless antiques like pomade, butter churners, and Betamax machines.

My main contention is not to praise or bury books and their illustrious legacy but to signal the icy toboggan descent of what passes for student scholarship in research. I do not mean all students, either, but rather a growing ersatz majority of them. Their alarming click-cut-paste numbers continue to swell and press themselves against the doors of higher education, eventually making it through the tuition turnstile by sheer force. Research and shopping seem to converge ever more, and I cannot help but entertain the terrifying thought of students going to their machines to do “research” at the web’s many info-malls. Here, one can find anything one wants, especially all the free facile renderings, chop-shop book reports, false reportage, and bogus opinions one needs. Worse still is the contemporary student’s tendency toward extreme literalism in research. A student at the portal of the info-mall needs to become orientated and so will plug a search term such as “abortion” into the engine. The server crunches and grinds the results and splatters them into an algorithm-based hierarchical list in a matter of milliseconds, returning 22.5 million entries for “abortion.” The gullible info-shopper will assume that the first entry in the list, ranked most relevant by the algorithm, must also be the most credible research source. Hence, I may see the following in a student paper:

Abortion rates have skyrocketed in recent years, prompting Paula Pundit to declare this a state of emergency. According to Paula Pundit’s unbiased research on KillTheBabyKillers.com, “abortion is vile murder, period. Abortionists will all be roasted on Lucifer’s spit!”

Indeed. Credible and unbiased, the two most popular flavours of the internet’s worldwide warren of critical thought. Perhaps I am being a bit of a Luddite prude. It seems to me the modern student does all that is possible to dodge traditional research as if it were something very messy and unpalatable. And, indeed, this has expanded to several areas of our lives; joining the military is fine if it can be done via a video game, and being an activist is swell if clicking on the occasional “agree” button on a petition counts as being politically informed. Research is messy business, a kind of altercation with the past via a dust-up with challenging language in difficult-to-click books. There are no rollovers in the book, no hyperlinking to summaries. The book sits there, an inert, immovable text that has no Flash capabilities and no comment pit where anonymous screen-named persons can flex their opinionated witticisms and fan flame wars, trading off in the narcissistic game of “quip pro quo.” No, comparatively, the book is an arid terrain of words and concepts with no helpful search toolbar on the heading of every one of its pages. Unless one has recently indulged in illicit substances, one will wait a very long time indeed for a chat window to pop up at the bottom of the page where “2Much4U” is dinging you with his “wazzup?” What a book demands is attention and patience—difficult-to-acquire commodities for the modern psyche, much like trying to track down a box of condoms in the Sahara and as seemingly practical as a solar-powered submarine.

Pardon the segue, but I cannot leave this aside for how much it irritates me. I may be considered an unreasonable tyrant, but I forbid emoticons in student essays. I have made similar moves to ban the exclamation point, as well as a growing list of words such as “totally”, “like”, and phrases such as “in my experience” and “I believe….” Any instance of “like” is usually a clumsy attempt at a simile or metaphor that cannot find its appropriate term. “Like” is a filler word, like, y’know, like, yeah.

I’ve tossed around the option of banning semicolons as well, since most people—young and old—have no clue how to use them. There is a popular yet wrong-headed idea that peppering an essay at random with the little barbed things makes the writing more “academic.” Incessant semicolon usage, especially when not one of them is even by chance employed correctly, only serves to make a research paper look like a punctuation fire-fight. The young do use them, perhaps far more frequently, in those besotted emoticons, the standardized typographical currency of all human emotion. In such cases, I tell them that to be winked at while reading is rather distracting, if not lewd. There is positively no web-speak allowed, which includes the now worn-smooth and meaningless LOL, LMAO, ROTFLMAO, and any other acronym that vaguely resembles sub-committees invented by the Soviet Gosplan or awkwardly named probes launched by NASA. If anything, these prohibitions allow me a clear and unobstructed view of their vicious crimes against spelling and grammar and the brutal atrocities they commit against the rules of reasoning.

Truth is the first casualty of war—thus spake Wikipedia (after Aeschylus). In the age of internet research, substance is the first casualty of info-mining. Info-mining leads to info-dumping. The factoid and the tidbit are the sickly dauphins of knowledge and argumentation, and the personal anecdote is the usurper of vetted sources. What we are left with is a vast, disconnected constellation of orphaned quotations in whatever sequence Google has arranged them, making a research paper look one part riddle and one part trivia. These web-harvests contain content connected only by virtue of sharing a few keywords. Nothing new is developed—just the perpetual act of reshuffling the same things in a different order. Of course, I should not be so harsh: there may be plenty of respected academics out there who have made their names on just this practice alone. Reorganization is important, too.

Whither depth and engagement with the research enterprise? I do not mean to paint a completely bleak portrait of today’s student, for I still have exceptionally talented and equally old-fashioned research-savvy students coming through my courses. And, when Google completes its own Borgesian task of digitizing every book ever known, my complaints about web research will seem quaint and irrelevant. However, you will still find my obnoxious finger wagging when the grand optimists speak of the new virtual utopia made of text. Does digitizing everything really mean people will read more, or will we continue in our lazy habit of reading only what contains the explicit keyword to copy-paste? If it were preached to me that the web will make us better thinkers, it would be a strain on my credulity. Really what such blind optimists mean—or hope—is that universal access to information will precipitate a digital renaissance. But that reasoning is sloppy and wishful at best: access to healthier foods has not made us all magically thinner; nor did the opening of the first lending libraries make us all proficient in mathematics or connoisseurs of Latin poetry. The web is ungovernable, despite what the Chinese government believes, and the amount of frivolous garbage poured into it has made it an unthinkably vast psychological landfill. Sure, there’s plenty of knowledge if one knows where to sift for it, but it is so much easier to let the search engine decide what has value.

There is no doubt that I will continue to be served with student essays that read more like a random cull of websites. This will be exacerbated in proportion to what I call the second law of info-dynamics; namely, the entropic effects of “information” in its raw and vulgar import. Lacking any framework or context, research will more resemble Dadaist vers libre, leaving it to the reader to do all the work of connecting isolated, orphaned fragments. These banal aphorisms masquerading as sustained research will be as perplexing and interpretively broad as the pre-Socratic fragments so sparingly bequeathed to history. These exploded bits, yanked from sources critical and dubious, show the modern web-based essayist to be the intrepid garbologist of the virtual, eager to reorganize content that has been repeated and reorganized according to several arrangement schemes already. This is not to say that there are no legitimate works on the internet worthy of inclusion in a research paper, but the method of acquisition (keyword searches) and the absence of any concerted program for teaching students to discriminate valid web content from dross hobble these efforts. Books, too, are more legion than ever, with a new book being published every 20 seconds.

Perhaps I have it all wrong. The days of poring over stacks of books and compiling meticulous records and synthesizing this information into a coherent articulation based on a well-formed research question, with an appeal to credible sources, belong to a slower, more patient time that was not being buzzed, dinged, or jolted by devices of distraction.

Kane X. Faucher is a professor in the Faculty of Information and Media Studies at the University of Western Ontario.

2 Responses to “Cut and Paste Research”

  •  by Bill Will

    Thrillingly, acridly ironic that these thoughts have to be presented online in order to achieve an audience! There is certainly something of the Luddite lurking in the author – would he insist that students submit papers written in pencil, for instance? Throwing the semi-colon out is clearly a necessity, but not merely because students don’t know how to use it; it is simply no longer coherent in the modern wash of shorter sentences and paragraphs. But is there a relationship between punctuation litter and info-dumping? I think not. Long before the Internet, students played Bibliomancy with the encyclopedia. One cannot vilify the web as if it had introduced something new. Rather, let’s examine the surge of university students, the desperate need to fill seats and process humans towards the ultimately destructive goal of BA-ifying everything to the point that the MA is the new BA, and the PhD has become the hiding place for those who use distinguishing phrases like “the real world” (which makes it impossible for them to recognize that their labors towards a professorial job they have a good chance of never achieving stink of the Real, and in the most vulgar way). No, the power of this article is in identifying the lack of the well-formed research question. Beyond that, I think the author needs to recognize that being on the changed side of the guard even at such a young age does not evidence for a good/bad binary make. The real problem is that a) we have students write at all and b) we do not encourage new ways of expression. Why? The only real reason why students need to read and write is because laws are still written down. As soon as the first legal precedent is drummed up through YouTube, YouTube will become the depository of the law (and here I am perhaps already behind the times). That humankind may soon shed writing as we know it is in fact a positive thing. Why do we snarl at these kids for their inability to do tasks that are becoming increasingly meaningless to them and their futures? Why not have them demonstrate the ability to organize thought by having them make documentaries with their iPhones? They’ll quickly learn the need for writing, and likely more people will learn more about writing by letting them play with their toys and letting them cite wildly through the technological means they are being raised with. With a well-formed research question, the medium of the answer is a secondary consideration. Imagine a world in which Michelangelo had been forced to express his thoughts about David in the form of an essay. Unshackle the students of the world from these dead models of intellectual expression (or at least skirt them in through the back door) and let slip the rush of the future. Bring back the paid lecturer and refuse the scholar the use of books. Obviously, Google’s “all-online” policy will never be the answer, but at the same time, we have to recognize that an omega point may well come in which all writing is simply done exclusively online. Finally, it should be said in this tangle of response that citing online is not strictly the problem. The MLA, APA and other style guides are the problem. I suggest scrapping the whole project of citations. Let us assume AdamEveo me fecit. As long as we piddle around with citations, we are continuing the myth of the individual, and that is ultimately the key concern here, replete with the distasteful matters of religion which are the consequence of all individuality (even for atheists, antitheists and the like).
Dividualization is nigh! Dare to dive!

  •  by Craig Butosi

    I wonder to what degree the author can be considered a Luddite, as you suggest, Bill Will. Steven E. Jones reminds us that the Luddites of 19th century England were never just “anti-technology” but that popular discourses about them, circulated by such Romantics as Keats, Lord Byron, and Blake (those “dark Satanic mills”), tended to pigeonhole this loose collective of activist craft-workers as simply that – against technology. Luddites never positioned themselves solely as an antagonistic force against technology per se; instead, they smashed machines in order to prevent what Marx would describe seventy years later as the estrangement of industrial labourers from their labour, i.e., alienated labour. Luddites, therefore, were struggling over the control of their craft and conservation of their skill, over the means of production. A battle between dead labour and living labour. As with a terrorist attack, the primary target is always of secondary importance. The machine, for Luddites, was incidental, symbolic (yet very real). For them, the battle was about control, not scientific progress itself; their motivation was survival, not necessarily an appeal to yesteryear. The implication against the author that he “insist that students submit papers written in pencil” speaks to the popular and somewhat misinformed notion that Luddites yearned to return to a “simpler time,” that they subscribed to a sort of Paganistic regression and scepticism and, feeling alienated from nature due to “progress”, ultimately interpreted progress as synonymous with “being left behind”. The historical record seems to suggest otherwise. Thus, I see optimism in the author’s piece: a manifesto for one to own (smash?) the machine, not the other way around. Perhaps we could say that the kind of student the author describes is indeed alienated from their ability to fully perform as a student precisely because of the ways in which they are introduced to, and perceive, the Web before they even become students of higher learning (i.e., predominantly as a space of consumption, convenience, and communication, not necessarily a site of rigorous, quality research activity). After all, the author writes, “Sure, there’s plenty of knowledge if one knows where to sift for it, but it is so much easier to let the search engine decide what has value.” Aside from the industrial connotation of the engine, what speaks to me here is that we do indeed let the search engine drive our productive capabilities when researching, perhaps relinquishing a degree of control to an algorithm in terms of the research process. Could we say that, because of this perception (ideology?), some students are hard-pressed, even more so today, to develop the qualities they need as students (digital literacy, patience, research rigour, information literacy, critical reasoning over opining, etc.) and, therefore, are unable to produce something of quality – similar to the (de-skilled) Luddite who had to work a machine that ultimately produced low-grade textiles? For whatever reason, be it the Web, the culture of convenience, progress, etc., perhaps this piece is a wake-up call to remember the value of simpler technology. If anything, it is a call to keep people in control of their productive activity (that is, their critical faculties) so that they may perform not only as students but also as critical thinkers who are able to have an extended conversation with a well-crafted, extended argument and to engage with ideas at the scholarly level.
The beauty of the book is in its non-multi-modality, that is, its one-dimensionality as solely a carrier of text. This is not to say, however, that nothing can be gained from the multi-modal experience of the Web and mobile devices. Indeed, we have seen the power of such technological ingenuity. So, perhaps you are correct, Bill Will, that the author is a (neo)Luddite, but I believe it is more for the reasons I have mentioned above. Though I do agree with you that perhaps making a documentary with one’s iPhone might be the answer, we cannot deny the value of prior forms of technology. If anything, the synthesis or concomitant use of these technologies new and old can only broaden one’s techne. When, in the name of progress, we begin supplanting the old with the new, the threat of the loss of control is always looming (pun intended!). Maybe it is our job to remind students who need such reminding that it is time to smash (that is, to critically engage with) the machine and remind the peddlers of “information revolution” discourse forwarded by the Tofflerites and the Bellites that the technological determinism to which they subscribe has not brought about a brave new world, but instead only an increase in the (neoliberal) belief that technology as an autonomous force (standing over and above the political, social, cultural, and economic practices in which technology is always embedded) is the key to the world’s woes, when, in fact, it is not and has never been. The author’s piece, I believe, is a demonstration of how such blind faith in these discourses about technology so often works against the critical faculties of individuals. How about giving control back to the students?

