1,944 Facts That Explain Why You Can't Escape The Vast Prison Of Language (#625 Will Blow. Your. Mind.)

Hey!

If you don't want to read all of this before skipping to the rest of the webtext, I understand. It's the internet, and this here article is full of strange words and phrases (though I've tried! I've tried so hard to align it with the popular conventions of online discourse ☹ ). Click the button below to do so. You can always come back here by clicking the "Quit!" button after getting thoroughly confused by what you will experience. And trust me: you will.

~<(((Nathan Riggs)))>~

The General Rhetorical Accident

In the beginning of The Information Bomb, Paul Virilio makes clear his case against contemporary scientific practice:

As the tragic phenomenon of a knowledge which has suddenly become cybernetic, this techno-science becomes...as mass techno-culture, the agent not, as in the past, of the acceleration of history, but of the dizzying whirl of the acceleration of reality— and that to the detriment of all verisimilitude. (3)

Widely known for his work on dromology (the logic of speed, especially as it applies to shrinking space) and for theorizing the "General Accident"—a catastrophe that happens to all of us at the same time and with equal force, thanks to our thus far unabated desire to shrink space with speed—Virilio, like his fellow "postmodern" contemporaries Baudrillard and Ellul, shows little mercy in his attacks on scientific practice, technological aims and effects, or cultural phenomena that seem suddenly to run amok with a new-found disregard for time or space. Nor does Virilio, for the most part, spare his audience: most of his conclusions leave little room for hope, and even less room for averting disaster. Joining the ranks of Nietzsche, Schopenhauer, Camus and others, Virilio seems ripe to join the League of Noted Gentleman Pessimists—and maybe even lead it.

Pessimists are rarely entirely correct, thankfully—but nor are they ever entirely wrong. Consider, for instance, a reworking of Virilio's statement above:

"As the tragic phenomenon of a knowledge which has suddenly become cybernetic, this techno-science becomes...as mass techno-culture, the agent not, as in the past, of the acceleration of history, but of the dizzying whirl of the acceleration of reality—and that to the detriment of all verisimilitude." (3)

With a few small alterations, Virilio's quote may accurately portray not only the state of science today, but that of all transactions of information happening in our techno-scientific, cybernetic landscape—and "fake news" is but one symptom of the ailment. The replacement of Virilio's "verisimilitude"—the appearance of being true—with "Truth" underscores the point: increasingly, that which is patently false, or as close to falsehood as one can be, pursues verisimilitude at all costs, borrowing the conventions of good-faith discourse to such an extent that truth itself, without a monopoly on the contrivances that sustain it, becomes impossible to discern or sustain: what is true is the effect of discursive acts and utterances, whether those originating statements are "true" in themselves or not.

This is old news for contemporary scholars of rhetoric, who have digested more than their fair share of thinkers who systematically question the foundations of truth: Lyotard, Derrida, Latour, Heidegger, and so on. While philosophers and scientists lay claim to truth, though by radically different means, many rhetoricians question and even deny the very truth-hood of capital-T, platonic Truth; given that notions of true and false are ultimately matters of persuasion and human fallibility (both individual and collective), the veracity of a statement depends not on a solid yet abstract, undeniable grounding, but on the people involved, their places in the world, cultural understandings, and perhaps even the dismal weather at the point in time and space when and where a truth is established or denied. Whether systematically identified in the techno-scientific laboratory or painstakingly crafted from axioms and premises, whether debated in a college classroom or established in a holy ceremony, one truth seems to trump all others: we must agree to a truth, in our communities or professional organizations, or even in our interpersonal relationships, before a concept attains a "true" status at all.

We must admit: scholars of rhetoric seem thoroughly, abysmally unhelpful here, and perhaps even add to the problem: with all of our denials of platonic truth, and the increased influence over young minds that we have harbored over the past century, it is no wonder that 64% of adult Americans, many of whom were educated in the rhetorician's college classroom, have fallen for some form of fake news—and many of those same, often rational people equally believe that traditional journalistic outlets of "news" are the "true" purveyors of falsehood. After decades of fighting against grand truth claims, one might understandably conclude that rhetoricians are reaping what they have sown, and the rest of the world is reaping (weeping) with them. Scholars of rhetoric, rejoice!—it seems we have won the day.

Take that, Plato! The. End.

(keep scrolling down to continue reading)

Yes, yes. Not so fast.

On the chance that you managed to continue reading after that last paragraph despite your disgust (feeling guilty?), two points now must follow:

  1. There is a good chance that you recognize both Aristotle and the meme form in the image above, partially proving the point thus far
  2. We are quite conveniently forgetting the relationship between rhetoric and ethics, a connection perhaps first outlined by the sophists and solidified in text by the EDM-listening, fist-bumping bro pictured above.

Let us now briefly address these two points, in order, before moving forward—or, rather, let us move forward through them.

Aristotelian Memes

Internet memes, as examples of visual rhetoric, rely on the viewer (reader) to complete a series of enthymemes that:

  1. can often be "properly" completed only by someone who has the required cultural capital to be familiar with the commonplaces of the meme's specific target audience, but
  2. are flexible enough as enthymemetic devices to transcend the narrow scope of their intended audience, allowing for mass appeal and multiple interpretations, and
  3. sometimes combine the discourses of disparate and mostly disassociated communities, often lending comedic effect: an image of Aristotle should be almost universally recognized by the audience of this webtext, but the visual language of the "deal with it" meme may be asking too much: either the reader is unversed in the language much of the internet has adopted, or the meme is hopelessly outdated already.

Nonetheless, neither the meme's untimeliness nor a lack of knowledge prevents the reader from interpreting the meme given the context of this text, nor is that context necessary for its operation—a meme such as this could easily find a home outside of this specific discourse, and yet would still carry similar intentional constructions. A meme is a meme because it eludes its own context and origins: it is an abstract universal, able to traverse the globe at near the speed of light and with no concern for space, infinitely copied and reproduced to fit most any context or situation and to imprint itself into millions of minds in an instant. It is, in short, Virilio's nightmare.

Richard Dawkins, who mutated an Ancient Greek word (Mimeme) to suit his purposes (and he would be pleased with the language here), originally coined and defined the term through analogy:

Examples of memes are tunes, ideas, catch-phrases, clothes fashions, ways of making pots or of building arches. Just as genes propagate themselves in the gene pool by leaping from body to body via sperms or eggs, so memes propagate themselves in the meme pool by leaping from brain to brain via a process which, in the broad sense, can be called imitation. If a scientist hears, or reads about, a good idea, he passes it on to his colleagues and students. He mentions it in his articles and his lectures. If the idea catches on, it can be said to propagate itself, spreading from brain to brain. As my colleague N.K. Humphrey neatly summed up an earlier draft of this chapter: '...memes should be regarded as living structures, not just metaphorically but technically [author's note: we will leave Mr. Humphrey's attempt at separating the technical from the metaphorical for another time]. When you plant a fertile meme in my mind you literally parasitize my brain, turning it into a vehicle for the meme's propagation in just the way that a virus may parasitize the genetic mechanism of a host cell. And this isn't just a way of talking—the meme for, say, "belief in life after death" is actually realized physically, millions of times over, as a structure in the nervous systems of individual men the world over.' (source)

Yes, that Richard Dawkins: the notorious rhetorician biologist who somehow shows a remarkable lack of rhetorical consideration in his tweets, and who once concerned himself with the study of rhetoric without the wherewithal to realize it. His description here, perhaps like a meme itself, functions in a way that can cross sub-disciplinary interpretations of rhetoric, from traditionalists to neo-Aristotelians to New Materialists and Posthumanists alike. Additionally, given his heavy use of metaphor, it is no wonder that the paragraph can so aptly transcend its origins in discourse between professional biologists. As with Virilio, let us replace some key terms for Dawkins:

Examples of memes are tunes, ideas, catch-phrases, clothes fashions, ways of making pots or of building arches. Just as genes propagate themselves in the gene pool by leaping from body to body via sperms or eggs, so memes propagate themselves in the meme pool by leaping from brain to brain via a process which, in the broad sense, can be called imitation. If a scientist hears, or reads about, a good idea, he passes it on to his colleagues and students. He mentions it in his articles and his lectures. If the idea catches on, it can be said to propagate itself, spreading from brain to brain. As my colleague N.K. Humphrey neatly summed up an earlier draft of this chapter: '...memes should be regarded as living structures, not just metaphorically but technically [author's note: we will leave Mr. Humphrey's attempt at separating the technical from the metaphorical for another time]. When you plant a fertile meme in my mind you literally parasitize my brain, turning it into a vehicle for the meme's propagation in just the way that a virus may parasitize the genetic mechanism of a host cell. And this isn't just a way of talking—the meme for, say, "belief in life after death" is actually realized physically, millions of times over, as a structure in the nervous systems of individual men the world over.' (source)

Alas! It does seem readily apparent, for anyone semi-versed in rhetorical studies, that Dawkins is indeed describing one of the oldest disciplines in the world, one with a long history that predates Dawkins himself by at least a couple thousand years: Rhetoric. Likewise, the definition of 'meme' that Dawkins provides certainly predates the contemporary usage of the term, which usually denotes an image covered in sparse text, often in the format of "top text, bottom text" in relation to the image's geometry—and the older definition provides much more space for inclusion: any transmission of information, if it is to be received, must pass through the rhetorical highway, paying the poor soul stuck in its frustratingly frequent toll booths along the way (the public lane has long been closed, here in the United States, and was formally pronounced dead with the retraction of our Net Neutrality laws; I pray you find yourself in a more egalitarian space). Dawkins' definition, with our minor changes, more aptly allows us to circle back to the original point: fake news is nothing new, nor are challenges to truth (the same goes for complete fabrications); what separates the new fake news from the old is the collapsing of space-time in its mode of transmission—in the limited terms of Dawkins, one might say that it has become highly infectious, and potentially fatal. They are viral enthy-memes, with enough blanks for anyone to fill.

Oh, for the love of Rhetorethics!!!

It may be hard to swallow, even for those of us who digested it ourselves, that it was once a widely accepted belief that the internet, and its underlying hypertextual fabric, would bring human liberation from the chains of hierarchy and authority, whether enforced by family, business, or state (Alexander Galloway's Protocol put a final nail into this fantasy's coffin in 2004, describing the vast control scheme by which the internet operates in a way that non-technicians could reluctantly grasp). In the midst of this short-lived, techno-utopian zeitgeist, in which there was even a scholarly bubble of hypermedia studies that eventually burst (you can see an example here) and then settled into the still rather over-optimistic sub-disciplines related to the digital humanities, Pamela K. Gilbert published "Meditations upon Hypertext: A Rhetorethics for Cyborgs." In it, she not only challenged the overwhelming optimism in the humanities about postmodernism's seeming triumph over the linear narrative form through HTML, but also explored a facet of hypermedia (what we would now call the internet) that, though only in its infancy, she found troubling: our pretensions about the liberatory and inclusive potential of the internet, along with our inability to think in a mode beyond that of literacy, blind us to the more troubling and ultimately controlling and exploitative aspects of its design and effect (26). As with market economies, we have confused decentralization with decolonization, open trade with open societies, and freedom with cost-benefit analysis.

Primarily, Gilbert focused on the consequences of the then much-lauded collapse of the reader/writer distinction online, and sought a way for us to come through it unscathed (and as the topic of this essay suggests, we did not heed her advice). As many other scholars have suggested, the function of the author has largely been a matter of ethos (24)—even today, we teach students that the credibility of a source depends on its author and avenue of publication, whether through expertise or trust—but in the death of the author, as exhibited in hypertextuality, the only ethos left to trust is that of the reader; and such a reader, as we have sadly confirmed, is ill-prepared for such immense responsibility. In the barrage of hyperlinks that has become only more intense in the past two decades, even when an author's ethos still functions as it should, that author becomes stitched together with a thousand others (28), a hypertext chimera bleating indeterminate or conflicting narratives, the veracity of which can be sorted through only by the self-reflective hyper-awareness of the reader. To the untrained—and, given our training in a semi-outmoded paradigm of literacy, we are all untrained in a world of electracy (Gilbert admitted as much in 1997; we are still working our way through)—this becomes an impossible task: our notions of truth and fiction, like the author and reader, dissolve and pool together as an ambiguous mess on the floor. Worse still, because hypertextual media provide no authorial closure, only an endless series of interconnected links, the tenets and uses of formal argumentation, which is premised on reaching closure, become ineffective at best, and counterproductive or catastrophic at worst (27).

Those of us fortunate enough to live through perhaps the first electrate presidency today, in this glorious first-term American presidential year of 2017, need no reminder about the seeming futility of the very formal argumentation we teach in our writing classrooms, though the meme to the left reminds us nonetheless (see here for an explanation of the meme): there are no arguments, only memes. Even when formal argumentation reaches a critical point at which its audience almost seems to accept its conclusions—here comes the screenshot, top-text, bottom-text. The argument itself is transformed by a reader/copier/writer into meme form, and it spreads across the network without the baggage of the context it requires to qualify as an argument. Granted, this is not turning water into wine, but it may well be the reverse: distilled from its argumentative impurities, it becomes the solvent into which future enthymemes, as solutes, dissolve. New arguments are pasted above it and below it, either agreeing or disagreeing; animated reactions are inserted to indicate the appropriate response expected from the new audience. As it has always been, once language is evoked, whether spoken, written, gestured or otherwise, the originating author has no control over its further use and interpretation—but unlike before, the quantitative speed at which this process operates has fundamentally transformed the qualitative properties of communication. Despite our best efforts, the dialectics of Marx and Engels are still quite alive and well.

At issue here is the bolder, more performative nature of online communication as compared to that of its analogue counterparts; instead of the literate mode of expressing essentia ("Donald Trump is a bad president because he lacks decorum, hates the poor, is racist, etc."), internet users often center the performance of the communicative act and let other readers/writers input their own essential qualities ("Here's a drawing of Donald Trump with his mouth pasted over his eyes")—essentia, in effect, become the fodder for enthymemes (Gilbert 24). Perhaps to the dismay of Marshall McLuhan, our communication looks more and more like a boxing match than a global village: the interlocutors themselves are center stage rather than their words; the arguments are merely window dressing for the bombastic ethos of the speakers. Gilbert, for her part, saw this coming, and offered a solution: because there is no reliable ethos beyond the individual reader, we must develop a self-reflective intensity in the reader-writer who, when encountering a text, is always actually reading her own self (29); with the loss of authority elsewhere, all responsibility falls on the individual. Rhetoric and ethics are no longer separate, but combined: we must move forward with a rhetorethics or risk no substantial liberatory change (30).

Gilbert goes on to argue that this new rhetorethical reader-writer hybrid must be competent in the creation, copying, and recreation of hypertexts, and adds that we must become Donna Haraway's cyborgs to survive: "No longer structured by the polarity of public and private, the cyborg defines a technological polis...Nature and culture are reworked; the one can no longer be the resource for appropriation by the other" (qtd. 30). We must be neither technological utopians nor absolute cynics, but remain in a liminal place, like that of a critical reader, between incredulity and acceptance (31)—and it is here where we find a missing key to what we lack today. Many of us, of course, have abandoned that liminal space between cynicism and utopian fantasy (I like to vacillate between the two, depending on my mood), and this has undoubtedly expanded the problem of fake news: sources like The Onion and Clickhole, and their many variants, may be all fun and games to the indignant observer, but they also lend credibility to sources of fake news that might not be so obviously pastiche (as has been seen on more than one occasion, these articles have been taken without even a single grano salis by those who are unfamiliar with our cynical mode of communication). But there is a larger problem at hand, one that scholars, particularly in the humanities, often scoff at the suggestion of addressing: not only do few of us attempt to understand and utilize the devices of electrate communication, but when we do, we still approach them from a thoroughly literate, outdated framework—in seeking to uncover the secrets of nuclear combustion, we refer to our venerated studies of phlogiston.

Worse still, even fewer of us seek to understand the mechanics beneath the surface of these new kinds of texts, and believe we can still focus on the word and the word alone. Gilbert's solution—that the reader-writer learn and understand hypertext—is a terrific start, but one that is already hopelessly outdated; hypertextual studies, while seeming like a promising avenue of exploration in the 1990s, has all but disappeared as a specialization. In its place have risen schemes like the study of digital rhetoric or the digital humanities, but too often these suffer from the faults already outlined: digital rhetorics, while a valuable resource, still tend to focus on the surface text; and the digital humanists, while perhaps understanding the operations underneath more so than the average humanist, use this knowledge toward once again focusing on the interpretation of texts: algorithms to count the number of times a word and its antonym are used in the same sentence, for instance (here). We are using new tools to address old questions, rather than understanding how these tools have fundamentally changed the nature of communication, language, and the tasks of our scholarship; we are apes picking up stones to smash acorns in a new way, but we have yet to grasp the explosively transformative power of the tool itself.
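To make that counting example concrete, here is a minimal sketch of the kind of exercise just described, assuming nothing about any particular project's code: it splits a text into sentences and tallies those in which a chosen word and its antonym co-occur. The function name and the Shakespearean sample are my own, purely for illustration.

```python
import re

def cooccurrence_count(text: str, word: str, antonym: str) -> int:
    """Count sentences in which a word and its antonym both appear."""
    sentences = re.split(r"[.!?]+", text.lower())
    count = 0
    for sentence in sentences:
        tokens = set(re.findall(r"[a-z']+", sentence))
        if word in tokens and antonym in tokens:
            count += 1
    return count

# A public-domain sample (Macbeth), chosen only because opposites meet often.
sample = ("Fair is foul, and foul is fair. "
          "Hover through the fog and filthy air. "
          "So foul and fair a day I have not seen.")

print(cooccurrence_count(sample, "fair", "foul"))  # prints 2
```

Trivial as it is, the sketch makes the point above: the new tool is here answering an old hermeneutic question (how often do opposites meet in a line?) rather than posing an electrate one.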

Of course: many of us, especially those who do interdisciplinary work, seek to understand the radically altering essence of this new electrate age; we use Virilio, Stiegler, Heidegger, Galloway, Wark, and an infinite array of related scholars to do so—and yet! Tasked to read the code below the texts we study, or to use it in our own scholarly writing, most of us quickly recoil at the notion (including myself): the scholarly essay has long worked well for us, and we see no need to change it now, beyond the spare image, video, or animation. Algorithms are to be studied, not used; electracy is to be described and accounted for in a literate, scholarly fashion for literate, scholarly people—the electrate hoi polloi have their mode of communication, and we have ours; use theirs, and the work is no longer the work of a scholar. It becomes a YouTube video, a tweet, an inconsequential meme: add some cats and you might have something entertaining to watch and read, but it can be extended little more regard than that.

One must admit, it is a humorous notion: charged with formulating how to survive this seismic shift in communication and rhetorical practice, we suggest a fine solution: everyone must become reader-writers, capable of reading and producing complicated webtexts that utilize the compositional practices of the time (which now, it should be noted, move far beyond hypertext and are much more difficult to grasp)—everyone, that is, except for ourselves.

What is to be done?

This might seem like an inappropriate venue to quote Lenin, but in the midst of such a vast qualitative change in what we feign to study here, I find it rather fitting: What is to be done? The proliferation of fake news, as I argue, has little to do with its content, and only a small part of it is due to its falsely constructed ethos, built by parroting the design of more reliable sources. The issue at hand is one of speed (Virilio), algorithms, and hyper-connected networks; fake news is not "viral" in any sense of the metaphor because it is well written or well designed, but because of the operations happening behind the scenes that most readers are unable to parse, educated or not (at times, this is because the text is quite literally unavailable to the reader, partially residing on a server beyond access; but it is also an incapacity on the readers' part). Vast tracking mechanisms, dynamically generated content, and covert algorithms intermingle to maximize the speed of transmission and the full force of electrate communication, and there is no escape from its whirlwind; fake news feeds off of our lack of closure, constantly exploiting enthymemes while in the process creating new ones to be exploited further, with no regard for truth, authority, or consequences. News, fake or not, and in itself memetic by nature, may be written by particular authors for particular reasons, but this is quickly lost in the cybernetic maelstrom, and the task normally assigned to an author's ethos is as automated as our social networks. Literacy, like formal argument, cannot act as an effective weapon against it—it has failed, and likely will fail, every time.

We must ask each other and ourselves, again: What is to be done?

If you're familiar with the journal in which this article found a home (I hope you are), then most of this should not sound much at odds with what you already know and believe—Ulmer himself has been arguing something along these lines for years, perhaps at least since 1944 (hint), and his focus centers on developing the reader-writer's ability to cope as well as theorizing a general (and particular) framework for understanding this new order. His "Flash Reason," perhaps, serves as a crystallization of what both has and yet still must occur: the development and internalization of an updated version of Aristotle's notion of Prudence (Phronesis, time-wisdom) (2) to accommodate and make use/sense of this new velocity of communication. As the epigraph of his essay suggests—"The theme of a velocity of thought greater than any given velocity can be found in Empedocles, Democritus, or Epicurus" (Deleuze)—our task is to accelerate our means of judgment and deliberation to meet the velocity of communication; that is, to employ flash reason in the place of classical deliberative rhetoric and logical reasoning. This, Ulmer argues (quite literately!), is a necessary development.

Tr(i)ump(h)ed

Earlier in this essay, it was claimed that the 45th president of the United States, Donald J. Trump, lacked decorum—but this misses the point. Decorum, as the application of prudence in life conduct, is no longer the decorum of five decades ago, let alone a single decade, because the prudence from which it is derived is no longer the literate prudence that many of us still long for and admire. Barack Obama, while beloved by many (and despite a great many abhorrent international policies and actions), was perhaps a penultimate great gasp of literate decorum, politely and eloquently embodying hope for the masses while systematically denying it to most (such is a routine and recognizable theme in literate behavior, even dating back to Ancient Greece). Though I am hesitant to praise Trump for anything, this much is clear: he has, on the other hand, accepted and internalized this new electrate prudence, albeit clumsily so, and perhaps unwittingly utilizes it to his own ends (though not always to the ends of his supporting audience; electracy has not cured the deceptions of literacy, but has amplified them). To understand and fight the seemingly impossible rise of Donald Trump and fake news, not to mention the alt-right or any number of improbable developments, we have to theorize this new kind of prudence, of course; but we also must utilize it: as citizens, readers, writers, mothers, fathers, children, strangers—and importantly, as scholars.

Ulmer provides us, like Gilbert, with a background to begin this route, traced across technics, artistic theory, literature, critical theory, classical Greek philosophy, and, providing the flash in his reason, puns (13). Much of what he identified in communication practices in "Flash Reason," published in 2013, is easily identifiable today in popular online discourse, as it probably was then. Cicero defines prudence as "...the knowledge of what is good, what is bad and what is neither good nor bad. Its parts are memory, intelligence, and foresight (providentia)" (qtd. 3), but in a short survey of such practices, the problem for prudence becomes clear: with the velocity of communication, no time is available either for foresight or for the adequate formation of memories—and without such constituents, little is left for any classical definition or interpretation of intelligence. If decorum is the presence of prudence in the conduct of life's affairs, then it is no wonder that Trump exhibits very little of its literate meaning; the conduct of life (communication) has outpaced any capacity for reflection (memory) or reasoned planning (foresight). In the dromosphere's dizzying pace, our schemes of conceptualization, dependent on these faculties, are outdated and ill-informed at the moment of birth.

"Eureka!" Archimedes shouts, running naked through the streets.

"Old news," shouts back the town; calculating volume has always been at their fingertips.

"Fake news," tweets Hieron II, famed tyrant of Syracuse. "Crooked Archimedes can't measure the girth of my crown."

Heureka

Trained in the tradition of literacy, it seems difficult, perhaps even impossible, to fight against what can be best described as "anti-thought." Our arguments fail; our careful analyses go unheard, and are even inconsequential; our notable history of scholarship and pedagogy, undone, or at least so it seems. There is no need nor time for carefully acquired wisdom when your home is burning around you; prudence is all one has left, and the first instinct is to find the quickest exit (after, perhaps, saving a cat or two). And many of us have exited, quickly and without regret: loud proclamations of our exits from social media stand as testament to such resolve. We turn and return to the better days, curled around our books with real pages—Proust, Barthes, even Plato will do fine at this point—and still yet discourse continues to burn bright and hot, with or without us, consuming all in its relentless speed and heat.

Fundamentally, we know that there is no escape; it's just that the methods that have worked so well in the past can no longer douse the flames. What is needed, perhaps, is an anti-method—one being utilized already, yet still to be discovered—and this is what Ulmer has sought to provide. Ulmer's electracy is founded upon prudence, but not the deliberated prudence of literacy; it is a thought without concepts, a practical wisdom more akin to the recognition of beauty than to the careful systematic reasoning so clearly elucidated by literacy. Such a prudence, like that exhibited in the creation of memes, relies on a refiguration of commonplaces, of "common sense," that more resembles the heuretic practices of artists (composing via theory) than the heuristic practices of scholars (interpreting through theory). We can no longer remain mere critics; we must be critic-artists, reader-writers, consumer-creators.

This is most exemplified, according to Ulmer, by the work and practice of avant-garde and collage artists who use the readymade—urinals, advertisements, film clips, images from classic paintings—to make a statement; and statements, cleansed of pretense to objectivity and truth, are always already judgments. And like the judgment of an artist, ours too are primarily aesthetic: even in cases of truth versus falsity, good versus evil, right versus wrong, our judgments are at base matters of feeling, of pleasure or revulsion (this is perhaps one reason among many that highly codified systems of ethics fail to account for every moral dilemma: we feel the good, the bad). And again like the artist, our electrate texts need not, and should not, be blissfully contrite; in order to fight against tyranny, in our scholarship and in our daily communication, negative pleasure is key: the cleaving of harmony in a natural or systematic order that confronts our conceptual unity, feeling more like a punch than an argument—the Trump tweet, the troll, the patently offensive meme.

Aye, But There's the Rub

I do not wish to claim here that Trump's mode of discourse is to be admired, or even emulated; rather, he is a symptom, not the cause. Trump himself, one could rightfully conjecture, has no overt theoretical approach to discourse; his practice is inherited and constrained through the technological medium at his disposal (namely, Twitter). Nor do I suggest that we resort to the decorum of the lowest form of discourse; a well-deserved punch, rightly placed and in good form, can be the paragon of decorum (please excuse the overtly masculine language; perhaps the subject overcomes me, as well). Perhaps a better word to stick with, as above, is cleaving: meaning both to cut and to join, to join and to cut, electrate prudence requires that we do one as we do the other.

But how?

Without seeming thoroughly out of turn, I would chance to say that we have forgotten the electricity in all that is electrate; we still wish to substitute words for transistors. And it should be said: transistors are the alphabet of digital electronics, cutting up the analogue world into discrete states to be interpreted by the human-machine; the computer interprets these cuts, these cleavages, as ones and zeros, and we interpret its design (one wonders: which letter is a vacuum tube? Is it still with us?). The world before us, in all its electrate intensity, is already cleaved, rejoined, and transmitted at the speed of light; we need only to join the game, adopt its rules, and, in the continuation of the ever-incomplete colonial project of literacy, exploit and undo its weaknesses and advantages.
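As a toy illustration of that cleaving, and of the principle only rather than of any particular circuit, the sketch below thresholds a sampled "analogue" waveform into the two discrete states a switching transistor distinguishes; the function name cleave is my own invention.

```python
import math

def cleave(samples, threshold=0.0):
    """Cut a continuous sequence into the two-letter alphabet of digital
    electronics: values at or above the threshold become 1, the rest 0."""
    return [1 if s >= threshold else 0 for s in samples]

# An "analogue" waveform, sampled sixteen times over one period.
wave = [math.sin(2 * math.pi * t / 16) for t in range(16)]
bits = cleave(wave)

print("".join(str(b) for b in bits))  # prints 1111111110000000
```

The smooth wave is gone; what remains is a string of cuts that any machine, or any reader of machines, can copy without loss and at any speed.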

But we cannot do so without learning its language beyond our hermeneutic exercises on the alphabetic text that appears before us; we cannot merely acknowledge the flickering signifier, but must make it flicker ourselves. Mirroring Gilbert's charge for all of us, but particularly scholars, to not only interpret hypertexts (hermeneutics) but also acquire the skills and knowledge to make them (heuretics), we must now learn and use the fundamental operations of computation that mediate, tear apart, and put back together our discourse, whether it comes in the form of hypertext, video, audio, or any other remediation—that is, we must embrace the very language(s) and algorithms that fuel this dromological nightmare (or is it a dream?) in order to direct it toward our collective awakening. This task cannot be left to the engineers, the hackers, the bricoleurs; scholars have an important place in the post-literate world, and though the standard essay has been and will continue to be an important scholarly tool, it is not the only tool—nor the most effective or most illuminating device. How many pore over medieval copies of the Bible, carefully copied by dutiful monks, if not to admire their fanciful illuminated musings rather than the Word itself?

500 words

This, in short, is the ethical task at hand: to extend Gilbert's call to learn hypertext well beyond the narrow and mostly outdated confines of its focus, to heed Virilio's warnings and attempt to circumvent them by altering our discursive practices as scholars to match our material reality, and to take Ulmer's prudence to its most difficult extreme. This is not to echo the age-young refrain of "code or die" (note: a through-going parataxis of qualifiers), as such calls do little but reinforce the structure and control of hegemony, but to learn the language of electracy beyond our stale hermeneutic functions of literacy—a practice we cannot even adequately enact without understanding the code, the transistors, the memory addresses, compression algorithms, networking protocols, and so on into infinite knowledge that we are unprepared to comprehend. This will, paradoxically, require from us those features of literacy that we simultaneously need to shed or drastically update: careful deliberation, patience, concept formation, and, though it pains me to say it, mathematics. We can no longer consider a separate writing system, like mathematical notation or computer programming, to be the domain of another discipline: it is ours as much as it is theirs, and it has been from the very beginning.

The problem of fake news is not a problem that can be solved through careful interpretation, nor can it be solved by merely writing our own news, fake or not. What we require is both, simultaneously and contradictorily, at a speed that matches the velocity of electracy. The spoken word traveled at the speed of sound; the written word, at the speed of space; the computed word, faster than imaginable, and always accelerating. Perhaps it is strange, then, to hearken back to a principle first recorded to be conceptualized by Heraclitus, and one that has attracted little attention to date: syllapsis (not to be confused with syllepsis). According to Patrick Lee Miller in his Becoming God, Heraclitus envisions syllapsis as the act of synthesis and analysis at the same time, dialectically playing off itself so fast that reason can neither capture nor alone account for it. It happens, so to speak, in a "flash," and Heraclitus believed, or so it is attributed, that this form of thinking could be cultivated and pursued, though with much effort, becoming a kind of divine thought that transcends both reason and feeling, death and life, finite and infinite. The acts of doing, making, and knowing, the many, converge inseparably to become the one.

It is this kind of thinking, I believe, that Ulmer returns us to, and the prudence necessary for it, being a practical wisdom that flows from syllapsis (flash reason), calls on us to understand and use the material consistency of electracy and the communication that sustains and builds it. We have long known that this material goes far beyond and is much more complicated than ink and parchment, vibrating chords and molecules; it is mechanical, algorithmic, interconnected, inhuman and, above all, electric. Electrate discourse, to paraphrase Heraclitus, is now and maybe ever shall be an ever-living fire, with measures kindling and measures going out; let us admit our own selves as these measures and interpret the flame as such, but also burn kindling and blow smoke of our own.

In short, we not only need a theory of electracy, but also a deliberate and informed practice (praxis) that at least aims to match the algorithmic complexity and speed of electrate communication; a practice that goes well beyond the oversimplified version of the readymade exhibited in internet memes. A number of difficulties great and small present themselves here: as noted with hypertext fiction, the complications of creating electrate texts, as well as the steep learning curve for those of us less mathematically inclined in trying to compose in repeating algorithms, stand as an almost impenetrable obstacle to overcome, especially given our current collective apprehension.

I do not pretend here, at least, to provide a suitable handbook for such practice, but I do think we should take seriously Ulmer's call to play the Prince of Duchamp's Machiavelli, and I hope that the webtext that follows points to a perhaps premature and underdeveloped, but genuine, attempt to do just that.

1,944 Facts that Explain Why You Can't Escape The Vast Prison of Language

Of course, I can't get away with making such a call without at least trying, however successfully, to provide my own smoke and fire. What follows, once the button below is clicked, is an attempt to show how to use flash reason (syllapsis) in the face of fake news, using the algorithmic, dynamic, interactive and networked nature of Virilio's nightmare. To be succinct: this webtext, outside of this obscenely wordy "introduction," aims to embody and exemplify the object of its study: fake news, and how we might develop the prudence necessary to avoid its memetic call. To do so, it emulates the kind of sensationalist, meme-saturated and dynamically generated content typical of many websites today (2017), but with an academic twist—all the while trying to subvert itself. It also attempts to react to and mutate in response to your online behavior, as such websites do (targeted advertising, web tracking, etc.), but with a caveat: because it is not connected to those websites and algorithms that really track you, its capacity to do so is highly limited, and a bit tongue-in-cheek. Still, I hope to illustrate the principle here, and moreover I hope to show that academic writers can engage more fruitfully in this type of work—not just as critics, but as creators of such dynamic content as well; whether or not I am successful at this in any capacity, or how you may interpret it, depends solely on you, the reader—and will for the foreseeable future.
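For readers who want a sense of what "dynamically generated content" looks like under the hood, here is a deliberately crude sketch of the principle; the templates and word lists are invented for illustration and are not this webtext's actual source.

```python
import random

# Invented templates and word lists, for illustration only.
SUBJECTS = ["Language", "Rhetoric", "The Internet", "Your Browser", "Plato"]
VERBS = ["Will Destroy", "Secretly Controls", "Can't Escape", "Is Lying About"]
OBJECTS = ["Your Mind", "Democracy", "The Truth", "Everything You Read"]

def clickbait_headline(rng: random.Random) -> str:
    """Assemble one sensationalist headline from interchangeable parts."""
    n = rng.randint(7, 1944)
    return (f"{rng.choice(SUBJECTS)} {rng.choice(VERBS)} {rng.choice(OBJECTS)} "
            f"(#{n} Will Blow. Your. Mind.)")

def listicle(rng: random.Random, count: int = 5) -> list:
    """Return a fresh 'listicle' of headlines, different on every call."""
    return [clickbait_headline(rng) for _ in range(count)]

for line in listicle(random.Random()):
    print(line)
```

Run it twice and you get two different listicles, which is the whole trick: the "content" is cheap, interchangeable, and generated at whatever speed the reader clicks.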

Kthxbai!

Click the button below to begin the rest of this webtext.

Works Cited

Agerholm, Harriet. "Map Shows Where President Barack Obama Dropped His 20,000 Bombs." The Independent, Independent Digital News and Media, 19 Jan. 2017, www.independent.co.uk/news/world/americas/us-president-barack-obama-bomb-map-drone-wars-strikes-20000-pakistan-middle-east-afghanistan-a7534851.html.

Anderson, Janna, and Lee Rainie. "The Future of Truth and Misinformation Online." Pew Research Center: Internet, Science & Tech, Pew Research Center, 19 Oct. 2017, www.pewinternet.org/2017/10/19/the-future-of-truth-and-misinformation-online/.

Dawkins, Richard. "Viruses of the Mind." bactra.org, 2001, bactra.org/Dawkins/viruses-of-the-mind.html.

"Deal With It." Deal With It | Know Your Meme, Know Your Meme, knowyourmeme.com/memes/deal-with-it.

"Deal With It." Deal With It | Know Your Meme, Know Your Meme, knowyourmeme.com/memes/deal-with-it.

Galloway, Alexander R. Protocol: How Control Exists after Decentralization. The MIT Press, 2004.

Gilbert, Pamela K. "Meditations upon Hypertext: A Rhetorethics for Cyborgs." Journal of Advanced Composition (JAC), vol. 17, no. 1, 1997.

Avirgan, Jody. "Internet Tracking Has Moved Beyond Cookies." FiveThirtyEight, 2 Sept. 2016, fivethirtyeight.com/features/internet-tracking-has-moved-beyond-cookies/.

Johnson, Steven. "Why No One Clicked on the Great Hypertext Story." Wired, Conde Nast, 16 Apr. 2013, www.wired.com/2013/04/hypertext/.

Lee, Adam. "Richard Dawkins Has Lost It: Ignorant Sexism Gives Atheists a Bad Name | Adam Lee." The Guardian, Guardian News and Media, 18 Sept. 2014, www.theguardian.com/commentisfree/2014/sep/18/richard-dawkins-sexist-atheists-bad-name.

Lenin, Vladimir. "Index to Lenin's 'What Is to Be Done?'." Marxists Internet Archive, www.marxists.org/archive/lenin/works/1901/witbd/.

Miller, Patrick Lee. Becoming God: Pure Reason in Early Greek Philosophy, Continuum, 2011.

Taylor, Adam. "7 Times the Onion Was Lost in Translation." The Washington Post, WP Company, 2 June 2015, www.washingtonpost.com/news/worldviews/wp/2015/06/02/7-times-the-onion-was-lost-in-translation/?utm_term=.9f8cf80bda53.

"There Is No Dana, Only Zuul." Deal With It | Know Your Meme, Know Your Meme, knowyourmeme.com/memes/http://knowyourmeme.com/memes/there-is-no-dana-only-zuul.

Ulmer, Gregory L. "Flash Reason." CyberText Yearbook Database, University of Jyväskylä, 2013, cybertext.hum.jyu.fi/articles/157.pdf.

Virilio, Paul. The Information Bomb. Translated by Chris Turner, Verso, 2005.

Watson, Robert N. "The DNA of Shakespeare's Works." Center for Digital Humanities - UCLA, UCLA Center for Digital Humanities, 3 Oct. 2017, cdh.ucla.edu/news/the-dna-of-shakespeares-works/.



Hello.

This is a semi-experimental, "interactive" webtext. Treat it like you would a normal website. Most of the content is randomly generated, save for much of the content that appears in these popup boxes. The purpose and argument will be revealed as you click your way through, and the content of the essay changes depending on what you do. You may also find portions of the essay in unexpected places. Use your intuition—your flash reason, your phronesis, your practical wisdom (or lack thereof!).

Whenever this window appears, you will have to make a decision; click whichever statement you most agree with when prompted, and then your browsing may continue.

Beautiful.

Let's get to know each other, shall we?

I'm a purveyor of fake news. I write headlines, find the right images, fabricate stories just for you. And what do I ask in return? Nothing but your sweet, sweet clicks.

You, on the other hand—well, I know plenty about you. You're currently around [city], [region]. I suspect you are well-educated, but you're probably not in STEM. I might further guess that you're some sort of writing and rhetoric scholar. You know a bit about postmodernism. I bet you're pretty certain that you'd never fall for that dreaded fake news.

Sigh. You are a [OS] user. You're reading me with the [browser] web browser (poor choice—use Vivaldi). Your screen resolution is [screen_width] by [screen_height].

I can also probably assume that you're a rather passive person, since you just went along with my insistence on getting started last time we talked. If you're not passive, why did you so easily agree? You could have quit. You could have closed your browser. You could have shut down your computer and stared at the brilliant, fiery sun. Or you could have just clicked "I'd rather not." But you didn't, did you?

Bullshit.

You'd refuse? Why on earth are you still here? It doesn't seem prudent to refuse, and then continue along with a conversation anyhow. I'm not really sure that I can trust you, quite honestly.

Nobody refuses. Not on my watch. Refusal is fake news.

$MAGA

Woah whoah whoah! It's okay, it's okay. Don't be sad—I was just joking!

HA. HA. HAAAAAAAAAA.

Here, I'll play you a nice little song.

Isn't it wonderful and soothing? You can hear America being made great again if you listen closely. Really closely.

Good choice, darling.

See, that wasn't so hard, was it? Everything is better when you agree with me. In fact, you cannot disagree!

You see, already you have been pulled into my reality, and you have agreed to it: by choosing to continue to make the choices I present to you, you have implicitly consented to take part in the world I have constructed. And I know what you're thinking: that makes you complicit. And yeah! Yeah it does. We're all complicit in the burning momentum of my Trump train, and you can't do anything about it. What are you going to do, build a wall? It might seem prudent to drop out—from the internet, social media, education, governing institutions, entertainment, industry—but you won't enjoy being cut off from the rest of the world, and good luck doing, by yourself, everything required to keep you alive and engaged.

We're in it together, and I'm in it to win it. Are you?

Witch Hunt!

Oh, please. You don't know a damned thing about prudence. You know what's prudent? Expediency. You know what's expedient? Twitter. You know what Tweets?

Me.

I got something prudent for you! Your IP address is [ip]. I know where you are, what you're doing, and everything you've ever done online in the past. I even know which computer you're using, and which room I need to send my agents to find you. Couple that with your credit history, legal documentation, and scheduling habits, and I pretty much know everything there is to know about you—both public and private. How's that for prudent audience analysis?

WRONG.

No, you are the one who is the wrongly correct wrong one! How do you even attempt to fight against fake news like CNN if you can't even tell fact from fiction? Tell me, if I am so very wrong, then why do you insist on hanging onto my words? Is it not rational to discontinue engagement? Prudent?

My point, friend, is that you are either entirely irrational and imprudent or I. Am. Right. Which is it?

Oh my! Look at the bottom left corner of your screen--you have zero points! What a loser you are.

Backscratchin'

Well now, look who's being polite! In that case, I'll be polite to you. The most polite anyone has ever seen! The greatest politing. One second, hold my beer.

MAGNANIMOUS.

Here you are! A new info bubble for you, wrapped nice and neat in its own little wall where it belongs. I have also been gracious enough to give you real-time updates on which section of the page you're currently hovering over!

No need to thank me. It's what friends do.

NO.

Oh, you think that's nothing? Think about it for a sec: I have collected information about you for the last [time_spent] minutes, and that's without the help of the rest of the internet building an audience profile to persuade you and you alone. That's the hardest part to internalize here, if you ask me: we are no longer addressed as amorphous social groups that share certain experiences and interests while also remaining fundamentally different as individuals, but instead are atomized and addressed alone; properties that once played a key role in audience analysis, like demographic information, are now mere asides. It doesn't matter if my voters are fundamentally white, in multiple senses of the word; my message is algorithmically catered to each and every individual—especially to those who are against me. After all: look what you've been persuaded to read!

If I wanted to abuse my power, which I totally wouldn't, I'd gather every bit of information about you collected by third parties over the course of your entire history of using the internet—how long have you been jacked in? It would cost mere pennies, and without the need for government intervention. Big Government, bad; Big Business, good. I could use this information to know what to sell you, sure—we all know that the likes of dirty Amazon have been doing that for years. More importantly, however, I get to decide exactly what you read, when you read it, and access a long history of your activity to judge precisely what I need to do to get you to believe it, to internalize it—to be it.

You literate folk might have an edge over my efforts for now, but you won't be alive forever. And many of you, I've heard you whisper, have already forgotten how to truly read.

GETOUTTAHERE

YOU'RE A POS. Here I thought we were being polite. smh.

Look: you simply don't understand the nature of the world you live in any longer—which is key to prudent behaviour, if I don't say so myself (I do). The world only runs on two symbols now: 1 and 0, 0 and 1. There isn't room any longer for the rest. Zero replaces one every thousandth of a millisecond, and one replaces zero just as fast—they might as well be a single thing. 10. 01. Φ. True, false, doesn't even matter. We're all a POS here.

But you keep on reading those books, and I'll keep on upending the world with memes. We'll see who ends up on top of it (me).

For someone with passing knowledge of Aristotle, you sure don't see the prudence of learning the new writing system...

LOL

...seriously? Do you see my face? Look closely. I am the president.

But let us talk about intellectualism, and its platonic trappings.

Plato? Yes, Plato. The old man who was against reading and writing in the classic sense. That's you. Today.

How's the cursive writing going for you? How about those monks still painstakingly copying the old and new testaments by hand? There's a reason the Catholic Church began to embrace the printing press: it's all about speed and numbers, after all.

Fine people, the Catholics. Not as great as America is going to be, but pretty good.

(⌐■_■)

I AM DONALD TRUMP, I AM AN ACADEMIC. And so are you.

I've laid my cards. You should do that as well. I get it; you're literate. But to be literate in an illiterate world is to be imprudent. And of course I don't mean literate in the sense that one can or cannot read or write, but rather literacy as a way of thinking and doing. You're doing it now, but you shouldn't be.

Literacy is linear. Our world is indefinite.

Literacy is conclusive. Our world is explosive.

Literacy is logical. Our world is aesthetic.

Case in point 1: Authority has been replaced with algorithms; authors with procedures. There is no "I" speaking nor "You" reading or listening; discourse is spit out and chewed up well before either of us gets to chewing or spitting.

Case in point 2: Informal fallacies are today not flaws in an argument, but provocations to challenge its absurdity. The boy who cries ad hominem is the boy getting lured into the wolves' den for dinner.

Got it, jerk?

umad?

Nobody likes it. I am president of the United States of America, and I don't even like it.

But that there is the everlasting flame, isn't it? As soon as we get used to the season, and even begin to like it, the weather decides to change on us. And we say comforting things to ourselves—oh, the temperature has gotten lower— but really it's a change in kind and quality, not quantity. The weather is not a quantifiable series of entities, but a cacophony of becomings.

The weather, my little literate friend, has changed.

DYLASTOTLE

It is true: all the weathermen said that I wouldn't be president, didn't they? Perhaps you're learning how to be prudent after all.

But the problem with patterns is that they are only patterns when we recognize them as such. Patterns might reflect some sort of material reality (if you're into such things), but they emerge through the interplay between observer and observed, speaker and audience, world and entity.

Sure, we can say: the emergence of electracy is like the emergence of literacy, and those well-versed in the old, like Plato, will inevitably resist the new. But as it is when recognizing any given pattern, this is a gross oversimplification. It is accurate to say that the weather has changed just as a nuclear bomb explodes overhead, but it is hardly a prudent observation.

Here's what I'll posit: the change of paradigm has been so profound, and so overwhelming, that we can expect no trace of literacy to unravel from it, nor can we weave our literacy through it. They are different fabrics—and the (L/W)ord (logos) forbids their intermixing (like mixed metaphors); the bomb has exploded.

MMAAGGAA.

Yeah, here comes that but: If there were any accidents, then this whole experiment known as the internet, digital computation, electracy and so on simply would not exist. An accident is an error, and an error crashes the whole damned thing. We can quibble like the Ancient Greeks over the nature of cause and effect in the natural world (if we were to accept that such a world existed in the first place), but there is little doubt about the causal relationships in abstract calculation: What we are witnessing, and have become, has been thoroughly rendered; the originating system of these causes has just become so complicated that not even its creators (that's us) can decipher its web of causality. And this, as far as I can tell, is about as close a definition to "accident" as we could ever come.

I am the president of the United States of America. I am such an accident. I am not the last, or the worst. I am the solution to an infinite chain of equations. Who, exactly, is to blame? You.

OOPS.

Yes, the task is almost insurmountable! As you'll discover, as I have discovered—I, the president, and I, the academic. It is easy, in relative terms, to get where we need to be in practice, and it is even easier, I suspect, to put it into practice without much forethought—we do it every day without a moment's reflection.

It is another task altogether to "do" electracy with deliberation and goals. We are in an era of overwhelming accidents— accidental clicks, accidental leaks, accidental elections—more accidents than assurances. We're drowning in them.

Accidents, at least, are what they seem.

POSTAL.

Correct! You are correct! There are no accidents. It rains for a reason, though it may seem random to the chump getting soaked on the ground. We do not have to observe the reason for rain in order to recognize that there is one, regardless. Logos is always in control, storm or shine.

Well...

Huh. I guess we've come back to that essay. What now?

The conclusion of the accompanying essay—that we not only embrace electracy as an object of study, but also as a matter of praxis and at a level of expertise thus far unapproached, save perhaps in the study and creation of serious games—still stands firm. But so too stands the basic fact that no one person can have the necessary time, skills and training for a scholarly practice of electracy; perhaps it is decidedly impossible, given disciplinary trends. Worse still, to have training in the humanities as much as in the sciences and applied mathematics is not enough alone; I myself [the not-Trump-I] have been a computer programmer for almost three decades now, and though my career in rhetorics is still young, so too are all careers in it—for over 2000 years, we have barely been able to define the term that describes our study. Perhaps the world of rhetoric has grown beyond the scope of rhetoricians—has it ever not been so?

Or perhaps the problem is us: a thousand imperfect copies of PLATO 2.0's immaculate form, unable and unwilling to comprehend the storm in which we now find ourselves. Maybe it will take someone born in the lightning and rain, like Aristotle was born in the text, to weather the weather.

Or maybe I'm just a troll. kthxbai!


I am an empty room!