The Importance of the Hypertextualization of Literacy


Prognostication is a dangerous vocation, especially if you are not a prophet.

Every new technology invariably calls forth both detractors and advocates, and it never ceases to amaze me how wrong the forecasts concerning new technologies have historically been. As Eli Noam, director of the Columbia University Institute for Tele-Information, recently wrote, "The Cassandra industry is out in full force, and an avalanche of neo-Luddite literature is rolling in, lapped up by traditional media." (Noam, Eli. "Will Books Become the Dumb Medium?" Educom Review, March/April 1998.) The technological landscape today is shaped by considerably more complex issues than textile mills and employment.

First, let me insist that hypertext is not a technology. Hypertext is one possible logic that derives from the technologies of computer networking and telecommunications. Its nonlinearity is, in and of itself, nothing new: it is entirely possible to skip from topic to topic within the confines of the print medium, or to cross-reference ideas and contexts within a set of texts. Its so-called power to fragment literary meaning is entirely contingent upon the intent (and the ability, or lack of it) of the author. Critical appropriation of texts formatted in this way demands the exercise of cognitive intuition as well as critical thinking skills. Coherence is achieved by the end user's ability to assimilate and judge the material that is read. This is in fact no different from what the textual literacy of the print medium requires.

The confusion over whether hypertext is a technology, a new medium, or merely a substructure of an existing technology often leads to treating the relationship between technology and culture as a problem of content, rather than to a perspective that considers technology and culture as two components of a single process. I would like to frame my thesis within the broader landscape of E-Literacy (which includes Web conferencing, e-chat, and interactive and collaborative learning environments) and the effects that the merging of computer and telecommunications technologies in general could have.

The most fundamental effects of this new literacy will involve a human element. Ingram rightfully places the human impact at the center of his lectures; pundits who fail to make this central to their thesis should be approached with care. George Landow, in the introduction to his Hyper/Text/Theory, demonstrates his vast knowledge of hypermedia spaces. However, I have yet to find a single place where he departs from his evangelization of hypertext's possibilities to acknowledge that the interactions in this virtual realm are at some point attached to real human beings, or to address the issues involved in how people access these environments. I think this constitutes one of the larger issues in how hypertexts will affect the individuals in the world in which I have been placed.

As with any library one walks into, an abundance of texts without contexts can be appropriated. It is up to the judgment and research skills of the user to sort through the competing views available, form an opinion, and release information from the chains of pretext! The larger issue is not that technologies have failed to endue their subscribers with critical thinking skills; that is the failure of the educational systems at large. Neil Postman has said that "the whole of language is to provide a world of intellectual and emotional continuity and predictability." Strictly speaking, however, this is culturally determined, and the transmission of meaning is neither advanced nor denied by the fact that a text includes hyperlinks. A newspaper can provide just as much incoherence in its categories as a hypertext document can. For an example of a well-organized hypertext treatment, see Nancy Kaplan's Politexts, Hypertexts, and Other Cultural Formations in the Late Age of Print.

When the media speak of hypertexts in particular, the comments are generally tied to the state of the art, which is in its infancy and gives a poor account of the potential. I have heard some say that they can still find specific information in their reference library more efficiently than someone using such technology. This fails to account for that user's degree of competency, which is being contrasted with the speaker's own long-term mastery of a particular text. It also ignores the availability, at the click of a mouse, of concordances, commentaries, language helps, dictionaries, and endless reference materials, all accessible instantaneously. One falls into the trap of focusing on a mode of operation, a competency based on one's own experience.

Using hypertexts to communicate cogently and coherently requires a new complement of composition skills (or basic, old ones faithfully applied). Many fall into creating documents that are, in the main, referential, and are best characterized as strung-together chunks of information. This is a danger many writers on this subject posit as a potential snare: our culture's propensity for jigsawing information. But the converse snare is just as flimsy, and many have fallen into it. One recent book I read had the author citing 569 footnotes in fewer than 200 pages; perhaps he should have called himself editor rather than author. He has created a document that cries out to be hypertextualized in and of itself. Reading original thought offers a great contrast: Heidegger, who wrote much on the effects of technology, rarely uses a footnote in his Basic Writings. Footnotes in such sheer volume make me wonder whether the author has digested the material or merely referenced it; if the latter, I would posit that at least some of the references are out of context.

The most significant feature of a hypertext environment is its capacity for inclusion: a sea of works in progress with no gatekeeper to verify accuracy or edit texts. These "works in progress" are coming to be considered representative of the cutting edge of any given discipline. As it currently exists, this is nonlinearity, but with boundaries. Hypertext is still a textual domain, and as such it is governed by the conventions of human language and cognition. Noisy interjections, distracting comments, and distortions created by conflicting information must still be contended with within the limits of the end user's cognitive abilities. This is an area that postmodern deconstructionists have been quick to appropriate in their quest to put the metanarrative to death.

 

The Challenge to Educators

 

The challenge to educators everywhere is not only to help students through the vast and daily-growing forest of information, but to create a learning environment where the achievement of insight is possible. Where hypertexts can be one mile wide and one inch deep, they can conversely, when handled properly, provide the discerning user with the means for more intense analysis. Computer and telecommunications technologies have provided a construct whereby information can be disseminated democratically (on the Internet, nobody knows you're a dog). Freedom of the press is only viable if you have access to a press. Educators are in a position to make a difference by structuring learning environments and curricula in which a hit-and-run mentality is thwarted, and educational institutions are poised to become the great purveyors of access. Neil Postman concludes that education is to lead the resistance against the Technopoly, but he fails to provide viable solutions to the problems he presents (Technopoly, p. 191). Changing the curriculum will not suffice.

Problems that occur when left unchecked include students becoming overwhelmed by the amount of information available to them and blindly accepting texts as truth (Postman rightly sounds the alarm on this on page 115). Here the educator must clear up the confusion of the collection of information with the acquisition of learning (and wisdom). It must be noted that educators are very aware that the issues are complex, and that teaching strategies, curriculum changes, and educational initiatives sometimes produce no discernible effect. However, many of the advances made in measurable learning cognition over the past forty years point to the possibility of exorcising the supposed demons from the technologies and taming them into useful means to a good end. Postman lives in a reified world in which he uses buzzwords to embody complex interactions. From this construct arises a behemoth that "imposes on human beings certain ways of thinking, feeling and behaving" (Postman, p. 161). This description of the effect on human perception and understanding fails to account for the basic fact that humans are the agents of inquiry and change. Even being able to critique the situation gives us far more power than Postman's deterministic scenario implies.

Some modes of approach that I uphold are analytical argumentation and rhetorical debate, asynchronous and synchronous interpersonal communication, and structures in which students do not become disembodied or disappear as individuals. Analysis through argumentation uses a combination of appeals and strategies. A good argumentative analysis puts an issue into its historical perspective, takes a clear position and defends it, explains other positions and where they stem from, gives reasons why those positions are faulty, and argues the advantages of its own. Collaborative writing environments are a related and concomitant feature of the technology that supports hypertext.

 

The Death of Print?

 

The promise (and, some would say, danger) of the hypertext revolution is its impact upon print literacy in general. Jay David Bolter, author of Writing Space, argues that "print will no longer define the organization and presentation of knowledge..." Walter Ong says that "the computer is the technology by which literacy will be carried into a new age," speaks of an "electronic literacy," and proposes that it will replace "the logic of print."

Those who attack print literacy reject "the status of texts as higher or more logical expressions of symbolic knowledge, texts as the embodiment of history, philosophy, literature, science, and other ways of understanding the world supported by the traditions, often the prejudices, of the group" (Myron Tuman, Word Perfect: Literacy in the Computer Age, Pittsburgh: University of Pittsburgh Press, 1992, p. 43).

Richard Lanham argues that electronic texts will free us from the illusion that language is referential and real. The logic and stability of the printed word (a paper-based logic structure) have formed the foundation for a stable cultural identity over the last 500 years in cultures that adopted this form of communication. Oral traditions have passed away except in countries considered backward. What an oral tradition does provide is the necessity of interpersonal contact and interaction; the writing labs of the future guarantee no such interaction.

The prophesied takeover of textual literacies by electronic literacies demands a new emphasis on critical appropriation, a return to rationalism and critical thinking skills: cognitive weight training, so to speak. God does not dance to a linear tune, I like to say, and we humans need not think that the linear acquisition of information is essentially better than hypertextual models.

Much of the literature involves debates between neo-Luddites and techno-optimists, or bewildered observers venturing theories from their chosen ground. Most of these arguments rest on an impact model, a typology too simplistic to impose on much broader and more complex issues. A dualistic paradigm ignores both the role of the unforeseen and the role and responsibility humans have in creating the artifacts of technology. Postman falls at times into a deterministic camp that endows technology with a life of its own.

For Postman, computers in the classroom have this effect:

"In introducing the personal computer to the classroom, we shall be breaking a four-hundred year-old truce between the gregariousness and openness fostered by orality and the introspection and isolation fostered by the printed word. Orality stresses group learning, cooperation, and a sense of social responsibility.... Print stresses individualized learning, competition, and personal autonomy. Over four centuries, teachers, while emphasizing print, have allowed orality its place in the classroom, and have therefore achieved a kind of pedagogical peace between these two forms of learning, so that what is valuable in each can be maximized. Now comes the computer, carrying a new the banner of private learning and individual problem-solving. Will the widespread use of computers in the classroom defeat once and for all the claims of communal speech?" (Technopoly, p. 17)

Myron Tuman, in his recent book Word Perfect: Literacy in the Computer Age, echoes Postman's sentiments up to a point. For Tuman, the logic, the stability, and the authority of the printed word guaranteed a culture characterized by "a serious, introspective, relentlessly psychological ... hermeneutic tradition of interpretation."

However, Postman and Tuman focus on vastly different outcomes of this destruction of print literacies: the one on the threat from a new or secondary orality, the other on the threat from an ever more individualized, intensive, introspective mode of instruction. Technological differences themselves tell only part of the story of technological and cultural change: by ignoring the social, both Tuman and Postman fail to articulate some crucial relations between technologies and cultures.

Sven Birkerts, author of The Gutenberg Elegies: The Fate of Reading in an Electronic Age (reflections on the decline of "real" reading), argues that we are rapidly becoming incapable of comprehending any "sense of the deep and natural connectedness of things... we don't venture a claim to that kind of understanding. Indeed we tend to act embarrassed around those once-freighted terms: truth, meaning, soul, destiny..." (p. 74).

This points to the greatest challenge Christian intellectuals must face in the coming years: countering the "hermeneutics of suspicion." Deconstructionists maintain that there are no norms for meaning and language. The zeitgeist is being shaped by so-called academicians who believe that language is an instrument of oppression and that words do not really mean anything. Derrida, Lyotard, and their ilk accuse rationalists of playing language games that are all part of Nietzsche's idea of the "will to power." Thus the possibility of a context in which our hermeneutical tradition can survive is undermined in an all-out epistemological attack. This question of authority can only be resolved by the commitment of Christians who are willing to think and act biblically in the new culture that is emerging. Ultimately the truth is known by the enlightenment of the Holy Spirit acting in accord with the living witness of the Christ life we exhibit in praxis.

 

The Challenge to Christians

 

But we must examine ourselves as well. Print literacy in and of itself provides no defense against such an attack. It has been weakened by its proponents' dependence upon it, along with straight lecture (thoroughly documented as one of the least effective learning environments), for the transmission of knowledge. Examples of this include the binge-and-purge pedagogy that most institutions of learning employ, which constitutes a pale caricature of the Socratic method. I am afraid I have experienced this as well. Comprehension can be advanced by forums for discussion, with open interchange among students, yet the syllabus for the very class for which I originally wrote this essay forbids interaction or discussion among students. I offer this as evidence that new technologies are accompanied by new issues for educators to deal with.

However, I am convinced that a Christian/Augustinian understanding of language will win out, as it is the only option that adheres to the realities that even the pomo dystopians of the world must contend with daily. Any contention the deconstructionist makes assumes that argumentation is viable and meaningful. As the darkness deepens, the light will shine more brightly on the one true authority, which can offer the hope, peace, truth, and REAL experience that modernism failed to provide. Postmodernism is a theology of despair in a world where men are longing for relevance and meaning.

I hold the hope that, just as the Logos of God holds the integrity of the universe together (Col. 1:17), as in heaven, so on earth, this will be typified by the power of language to bind humans bearing the Imago Dei in a similar fashion. Postmodern thought, at its logical terminus, is neither coherent nor viable for humans to live by. As Groothuis states, "...how do you navigate 'infinity' with integrity, unless you are infinite yourself, or possess a map from the Infinite?" (Groothuis, p. 73).

 

 

Extracurricular Sources Quoted / Recommended Reading:

 

Birkerts, Sven. The Gutenberg Elegies: The Fate of Reading in an Electronic Age. Boston: Faber and Faber, 1994.

Bolter, Jay David. Writing Space: The Computer, Hypertext, and Writing. Lawrence Erlbaum Associates, 1991.

Boyle, Frank. "IBM, Talking Heads, and Our Classrooms." Educom Review, vol. 5, no. 3, 1994.

Landow, George P., ed. Hyper/Text/Theory. Baltimore: The Johns Hopkins University Press, 1994.

Lanham, Richard A. The Electronic Word: Democracy, Technology, and the Arts. Chicago: University of Chicago Press, 1995.

Lewis, C. S. "Membership." In The Weight of Glory. Touchstone, 1966.

Murray, Janet H. Hamlet on the Holodeck: The Future of Narrative in Cyberspace. Boston: MIT Press, 1997.

Noam, Eli. "Will Books Become the Dumb Medium?" Educom Review, March/April 1998.

Ong, Walter J. Orality and Literacy: The Technologizing of the Word. New Accents. Routledge, 1988.

