At a recent conference on "Shakespeare: Text and Deconstruction"[1] I suggested it was no accident that the current "revisionist" textual view of certain Shakespeare plays[2] had occurred during a period of post-structuralist unease with the fixed, determinate text of literary criticism, or, similarly, that the hegemony of New Criticism—despite its ostensible rejection of intention —had corresponded with the domination of the single, eclectic text reflecting auctorial intentionality.[3] I was not supposing that textual and literary critics had been in conscious emulation of each other, but rather that a specific intellectual climate made some critical and textual assumptions more likely or plausible at some times than at others.[4] In other words, that particular critical and textual practices were promoted and sustained by a general theoretical disposition.

Like it or not, we live in a period of theory. Courses taught in graduate schools, books published by young scholars, sessions held at professional conferences—all reflect the literary concentration on theory as something distinct from (although perhaps dependent on) the empirical, evaluative, or historical criticism of earlier decades. Inevitably, there is resistance to this movement—from literary critics and textual critics alike. Among the literary folk are those "humanists" who regard structuralism, post-structuralism, Marxism and the rest as arid, if not immoral,[5] and among the textuists are Shakespearians who wish to retain the securities of a single text, mediaevalists who seek the one Chaucer among the many, modernists who want their Joyce clear, not synoptic. And, equally inevitably, the quiet business of "traditional" literary criticism still goes on, as, of course, does the business of textual criticism and editing.

However, the textual-critical business has in recent years confronted
some of the issues raised by literary theory, beginning with the pioneering work of Bowers in his Textual and Literary Criticism (1959).[6] To cite just two exemplary cases: Tanselle's 1976 article on final intention used Wimsatt, Beardsley, Hancher, Hirsch, and T. M. Gang in its analysis of the theoretical problem of intentionality,[7] and James McLaverty's 1984 article on intention[8] employed evidence drawn from literary theorists and critics (Hirsch), behavioural psychologists (Skinner), structuralist linguists (Saussure), and philosophers (Collingwood). More recent studies by, for example, Peter Shillingsburg, Hershel Parker, Jerome McGann, Louis Hay, and Hans Gabler[9] have confirmed that practising textual critics are prepared to engage the literary theorists and to make use of some of their concepts. And sessions at textual conventions (indeed, entire conferences)[10] have investigated the interplay between literary and textual dispensations.

As we might have anticipated, the literary theorists have, in general, not returned the favour. Some "theoretical" journals have published articles by textual scholars,[11] and the more adventurous literary critics have, on occasion, included textual problems in their consideration of theory[12] or have taken part in the public debate. But for the most part, the literary theorists have continued their work as if there had been no advances in textual-critical theory in the last few decades,[13] and—on our side of the fence—the editing of texts has sometimes continued without a full or articulated investigation of the theoretical choices involved in each separate editorial task.[14]

But despite the general lack of territorial engagement, it is clear that textuists need a theory (or theories) of textuality as a medium for dialogue—with each other or with those from different disciplines. A purely empirical approach—a recital of the specific circumstances of specific texts and the story of the editorial resolution of the problems they engender—can perhaps have a useful role in the accountability of editors for what they have done, and will, of course, be cited in the textual introductions for any responsible textual edition. But some synthesis of various individual and exemplary experiences is necessary to make them comprehensive and comprehensible, and Tanselle's occasional encyclopaedic surveys of the field in Studies in Bibliography in part fulfill this function,[15] although they also illuminate textual argument at large and frequently tie this argument to the theoretical postures employed in other disciplines (as in the "Final Intentions" article). But local empiricism ("this is what we did and why we did it") has, because of its concentration upon experience, little to offer those who have not yet had, and are perhaps unlikely to have, the same or a similar experience.[16] What theory does offer is (in the words of W.J.T.
Mitchell in the introductory essay to Against Theory) "reflection, fundamental principles, models, schemes, systems, large-scale guesswork, metaphysics, speculation, intuition, and abstract thought" in parallel series to an "empiricist" list of "immediate perception, surface phenomena, things in themselves, small-scale certainty, physics, traditional wisdom, discursive reading, and concrete experience."[17]

It is not, of course, that theory is better than empiricism (or vice versa), nor that empiricism (particularly in such items on the list as "small-scale certainty" and "traditional wisdom") is not necessary to the editorial task (most of the qualities in Mitchell's "empiricist" list are indeed justifiably prized by editors)—but rather that theory provides a matrix for the plotting of the "certainties", small or otherwise, since it delineates a schema for the measurement of editorial attitudes and "reflections".

And, of course, textual criticism has not been shy of theory. From Alexandrian "analogy" through to Lachmannian stemmatics, Greg-Bowers intentionalism, and McGannian social textual criticism, theories of how authors work (even of who or what authors are), how texts are generated and transmitted, and how they should be represented to an audience, have been used to define, defend, and proselytise a theoretical view of the nature of composition, production and transmission—an ontology of the text, if you will. Thus, when Lachmann declares that the archetype of the Lucretius can be reconstituted and shows how this is to be done by the charting of "truth" and "error" in a genealogical table of witnesses, extant and inferred, he is inevitably privileging that archetype as the major desideratum of the textual scholar, and incidentally but forcefully invalidating the significance of the codices descripti lower down the family tree. Within the matrix of possible privileged positions, he is endorsing the relative chronological "superiority" of the archetype (although, note, not the fair copy, which remains unplottable and therefore without privilege) against inferior "copy", scribal reinscription, etc. This seems obvious enough—and quite proper—to most classically-trained textual scholars, but it needs saying for two reasons: first, because no dictum should be implicitly and permanently accepted without continual demonstration of its validity (and what might have worked as a model in the transmission of classical texts need not be immutably pertinent in other periods), and second, because the theoretical grounds for an empirical assertion should be understood as a part of its evidentiary status. In other words, there is no "natural" or "self-evident" ontology of the text, but rather a series of alternative "metaphysics" displaying "fundamental principles" (to return to Mitchell's terminology) which will involve some degree of "speculation", some "intuition" and
even, as most textual critics are willing to admit, some "large-scale guesswork". As Tanselle notes, all editing requires a measure of critical judgement (or, in Mitchell's words, speculative "guesswork") which might have "large-scale" implications.[18]

Why belabour all of this if it ought to be obvious to the practitioner? Well, the major reason for raising the issue now, for an audience of bibliographers old and new, is that the matrix I spoke of has been very largely redrawn by our neighbours in literary criticism, in history, philosophy, even sociology and mathematics, and if textual criticism is to remain one of the major intellectual disciplines of our culture, it must at the very least be aware of this redrawing, especially at those parts of the matrix that bisect the accepted or acceptable notions of "text" and "author". Developments in, say, structuralist linguistics and anthropology, or in the new science of "chaos",[19] no longer keep to their neat disciplinary boundaries, but, on the contrary, create new disciplines in the gaps left by the retreating older ones. This tendency (noted in the very recent history of chaos in particular) has broad institutional implications. As the work of legal scholars like Rawls or anthropologists like Clifford Geertz shows, there has been a movement towards finding the centre of the humanist and social science ethic in the "textual variance" of the "texts" studied.[20] Philosophers like Richard Rorty (and literary critics like Robert Scholes) have even suggested that the typical research university will eventually reformulate itself to contain "textual departments" (rather than departments of English, history, philosophy, sociology, etc.).[21] Such a possible institutional redefinition—if it ever happens (and Scholes, we should note, heads not an English department but a Center for Culture and Media at Brown)—will be a direct product of the redrawing of the map of the text and its author and reader, and on this new map textual scholarship—as we have traditionally understood that term—must find a place, indeed a central, not a marginal, place. It would be a lost opportunity, and a major intellectual tragedy, if textual scholarship were not to seek a role in future "departments of texts", but it cannot achieve this status if its practitioners remain resolutely unaware of, or even hostile to, the disciplinary and institutional changes which have caused the map to be redrawn.

As I have argued elsewhere, literary critics have all too often assumed that in the new textualism (or even in the old evaluative criticism) "any text will do".[22] No reader of Studies in Bibliography would accept such a dismissive retreat from textual responsibility, and it is thus our job to know where we stand, almost literally, in the redrawing of the terrain. The rest of this article, after this somewhat polemical introduction to the problem, addresses the question of the new matrix and the new
drawing. It attempts to show where textual editors do indeed stand, by their work and their theories, in the interstitial choices that are now available. It is genuinely a prolegomenon, for it offers only the outlines of how our editorial and textual practices share certain natural affiliations with the positions of textuists of a different stamp. It does not prescribe anything, for I doubt that immediate or local editorial decisions will change as a result of the plotting I suggest; but it may give a local habitation and a name (and thereby another level of coherence and identity) to our textual enterprise. One final methodological caveat: in order to keep the basic outlines of the new matrix clear, much of the supporting—or conflicting—argument is embedded marginally in explanatory notes, where the curious reader can follow up particular aspects of the critical or literary theories under discussion.

I begin with a (mis)quotation, which can serve as a brief exercise in critical attribution.

"I start then with the postulate that what the [critic] is concerned with is pieces of paper or parchment covered with certain written or printed signs. With these signs he is concerned merely as arbitrary marks; their meaning is no business of his." (emphases mine)

This sounds like one of the "hermeneutical mafia" pontificating again—perhaps Eco or Culler (given the concentration on signs), Derrida, de Man, Hillis Miller, or Hartman—we may all have our favourite candidates. But the misquotation is instructive in this case. The first sentence should read: "the bibliographer [not critic] is concerned with . . . signs", etc. And the author of this espousal of the arbitrariness of signs and the impropriety of meaning? Not a refugee from the École Normale Supérieure nor even from Geneva, Konstanz, or New Haven, but that stalwart of Anglo-American "strict" bibliography, W. W. Greg. If bibliographers disdain mere meaning, what chance for those toiling in both literary and textual fields? But the apparent coincidence of view (culled, I admit, from Greg's more polemical writings in defence of bibliography as a "science" of "forms")[23] can be valuable, as I have suggested: literary and textual critics and theorists may not have spoken to each other directly in the last half-century or so, but there may be parallel conceptual or methodological issues at stake in their attitudes to that mysterious immanence—the "text". The following brief survey attempts to construct a few possible models where such parallels may be observed in operation. We may find some strange bedfellows, and some of the supposed paradigms may look strained on first acquaintance, but I would hope that a general loosening of the strict territorial imperatives could be of benefit to both parties.

I would like to use a very familiar structure: the writer-, text- and reader-based theories of both literary and textual dispensations. The familiarity of this tripartite division of the textual spoils may modify the apparent heterodoxy of my other suggestions by framing them in a system that offers comparatively little contention.

From a critical point of view, one would expect to encounter, for example, intentionalist theories, phenomenological theories, historical-critical "objectivist" theories in the first (writer-based) division; formalist, New-Critical, textual-analytical, structuralist theories in the second (text-based) division; and reception, deconstructive, jouissance (or "readerly-play") theories in the third (reader-based) division. It is obviously an over-simplification, but it will do to give a rough orientation to the textual and critical dispensations to be covered.

Let us first admit that some of the possible theoretical filiations are more honestly (or perhaps more directly) stated than others. For example, Steven Mailloux's suggested revision of the Hancher-Tanselle line on intention in his Interpretive Conventions (1982)[24] acknowledges the presence of Stanley Fish in his title, his method, and his documentation. (Ultimately, I think his argument responds more to Poulet and a phenomenological reading of intention than to a Fishian one, but that is another question.)[25] On the other hand, Jerome McGann's assault—in his Critique of Modern Textual Criticism [26]—upon the Greg-Bowers definition of (and apparent need for) intention makes no such attempt to place itself in the general inheritance of critical speculation, and therefore has appeared more contentious (and revolutionary) to other textual critics than it really is. The Geneva and Konstanz schools, Fishian affective stylistics and interpretive communities, even the good old textus receptus—one of the hoariest of textual données—may all lie behind McGann's position in the Critique, but they are not an informing part of his argument as they have been in some of his other historical and critical works. And this is particularly important, given the sweeping political arguments that underlie McGann's book. Some textual critics would simply consign McGann's work to the demesne of "literary criticism" and therefore ignore it (interestingly, the Critique was reviewed in TLS [27] under the rubric of "literary theory", not "Textual Criticism" or "Bibliography"), but as the very rationale of this survey suggests, I believe that all textual or literary arguments, even the least valuable in practice, rest upon certain theoretical assumptions which must be questioned and made to give an account of themselves.

But on to the first stage: writer-based theories. As already suggested, the dominant phase of an intentionalist textual theoretical school in this last half-century has clearly been the Greg-Bowers-Tanselle promotion
of "original" intentions for form (accidentals) and "final" intentions for content (substantives). This distinction between form and content is obviously not perfect or absolute, (and would not be accepted as such by the major proponents of the theory)[28] but it shows the relative direction of the historical values inherent in the theory (i.e., the "dual" or "divided" authority of two different manifestations of intention, often at two or more different historical moments). The theory is compounded or reinforced by—and draws much of its evidence from—an admixture of history of technology usually shown in a reliance on data drawn from analytical bibliography, with which the school is also associated. As McLaverty has already demonstrated, this general ideology is most closely allied with Hirsch's definition of an auctorially intended "meaning"[29]—an historically determinable objective context which is yet another resuscitation of supposedly moribund historical criticism. In fact, in his recent survey of Bowers' contributions to textual criticism (PBSA [30], on the occasion of Bowers' eightieth birthday celebrations), Tanselle makes much of this historical rationale for the Greg-Bowers jurisdiction. (Whether Greg could have foreseen that his essay on copy-text would have led to such wide-ranging contention in textual criticism is obviously beside the point: both his disciples and his apostates seem to agree on the basic terms of the debate, and disagree primarily on their specific applicability to fields beyond Renaissance drama.) Thus, I think it was no accident that the related school of New Bibliography was pertinently so called as an historical antidote to the New Criticism (as well as in reaction to the old, belletristic, bibliography), for the New Criticism had an avowed ahistorical bias. Parker's and Higgins' "New Scholarship", short-lived as a critical and political term, might have been trying to make a similarly reactive and polemical point, but since the term was withdrawn soon after its coinage, it never achieved a coherent body of demonstration.[31] The more significant observation for our paradigmatic purposes is that such an historical/intentionalist emphasis lies not only within Hirschian auspices but also within phenomenological (and even some aspects of hermeneutical) as well. Thus, Husserl's "intentional" theory of consciousness, whereby the text is seen as an embodiment of auctorial consciousness,[32] Gadamer's partial—and early—espousal of the varying relevance of "meaning" to auctorial intention,[33] and most persuasively, Hans Robert Jauss' defining of the literary work within its individual historical "horizon",[34] leading to the concept of the cultural and chronological "alterity" or "otherness" of the work[35]—all of these share a reliance upon historical intention for their definitions and methodologies. Now, there are obvious refinements to be made which mar the paradigms to some extent—so, for example, Gadamer's insistence on
the hermeneutical "relativity" of meaning, supportive as it might be initially of historical criticism (and therefore intention), also allows by extension the continuity of meaning through time which McGann endorses in the Critique.[36] But, despite such reservations, I think the basic model holds up clearly enough. One might argue that any intentionalist school is ultimately a product of the old Germanic philological dispensation of Altertumswissenschaft; in textual theory, for example, most of the dissenters from the Greg-Bowers principles of copy-text (one thinks of Thorpe and Gaskell[37] as prime instances) would probably still regard themselves as practising a form of intentionalist, historical criticism—it is merely that the historical focus is placed elsewhere, say, on printed editions rather than on auctorial manuscripts.

In this speculative tour of paradigms, we move next to text-based theories, where the mid-century influence of Formalist/New-Critical decontextualisation of the text is well-attested. The orthodox Formalist concentration on defamiliarisation[38] (of which more anon) shows a predisposition to respond to particular types of inter- (or perhaps, more correctly, intra-) textual relationships (particularly multi-layered ironies), and this intra-textual layering (albeit under objective bibliographical principles) can be observed in the synoptic text of Gabler's Ulysses (and perhaps in any "texte génétique" as well—although that's a more problematical question).[39] I am not convinced that genetic editors—despite their generally phenomenological assertions—belong automatically in the intentionalist division; it depends on the use made of the genetically-derived material. Perhaps paradoxically, Gabler's "clear-text" reading page could be seen as a "New-Critical" resolution of the structuralist ironies present in the synoptic text (i.e., as the critic/editor's selection of readings which remove or explain or fulfil the layers of meaning in the text, in the manner of a formalist's objective codification of the linguistic tensions in the work); or the clear text might represent auctorial "final" intention as well as, or in place of, a merely critically-resolved final structure. Stated bluntly, the problem in any joint synoptic/clear text edition is how far the latter stage corresponds to final intention, insofar as that can be delineated in any single, eclectic text. But with or without clear text, a synoptic text—where multiple authority exists, of course—is a sort of "scrambled" (but presumably decodable) version of the Lachmannian filiative system, except that the synoptic text may eschew the hierarchical format of variants on which the Lachmann method depends. (The distinction is not entirely apt, as I recognise, for even a synoptic text must have a "base" text on which the diacritics can map the dynamic of textual growth, but the formal arrangement of a synoptic text is not inevitably genealogical or stemmatic,
as the Lachmann system always is.)[40] This Lachmann system McGann (mistakenly, I believe) regards as the unfortunate progenitor of modern intentionalism[41]—as a part of his general case against the intentionalist inheritance of the Greg-Bowers school. On the contrary, I would hold that Lachmann was primarily a sort of proto-structuralist, for even the potential embarrassment of the circular logic represented by the definition of "error" does not fundamentally detract from the Lachmannians' insistence on the structuralist descriptiveness of filiation (i.e., the stage identified by recensio, not by emendatio or divinatio). And, of course, the structuralist system of bipolar oppositions[42] (on/off, night/day, good/bad) is seen most tellingly, and used most compellingly, in the Lachmannian insistence upon the determination of vertical transmission by the opposition of "truth" and "error", a dualism which also surfaced in the bipartite stemmata for which Bédier had such scorn.[43] In fact, it was this very putative (and in his view spurious) structuralist "objectivity" which so enraged Housman and led him to claim that the Germans had confused textual criticism with mathematics![44] Similarly, it is the social and literary structuralists' reduction of society and literature to a series of positive or negative equations and their resultant denial of subjective evaluation which in these later days has so enraged the humanist critics.[45]

While McGann does acknowledge that there is an intellectual disjunct between Lachmannian stemmatics and twentieth-century intentionalism, it is, I think, a mischaracterisation of the history of textual theory to place the Lachmannian method and its aims (as he does) in a linear relationship with the Greg-Bowers school. The problem with McGann's "schematic history" is that, despite the noted disjunct, he fails properly to recognise the very limited status of the archetype in the Lachmannian system—an acknowledgedly corrupt state of textual transmission which does not respond to intention. Housman saw this weakness in the Lachmannian argument, when he accused the school of relying upon hope rather than judgement in their acquiescence to what amounted to a "best-text" theory,[46] although it was not so called. (There are, of course, several ironies in the terms of this conflict.)

A separate, and much fuller, study would be required to argue the problem of whether a filiative theory of textual criticism is analogous to the sort of geneticism practised by so many contemporary European textuists—Hay, Lebrave, Zeller—or whether Soviet textology, with its emphasis on the "unintentional", "non-authorial" remaniement, is similarly structuralist.[47] As my general tone would indicate, I believe they probably are.

There is one possibly valid methodological distinction which might
be raised, however. If the concentration is on the process of creation as an indication of intention (e.g., Lebrave, or Gabler in the assumed relations between his synoptic and clear-text phases),[48] then the textual theory and practice may be deemed phenomenological, as Mailloux has already implicitly recognised.[49] If, on the other hand, the concentration is seen primarily as a vehicle for the mere mapping of alternatives (auctorial and non-auctorial)—i.e., a critical variorum of variant "states"—then the theory and practice are primarily structuralist.[50] It depends on whether the analogy is with what Frye claims to have done for genre in the Anatomy [51] (descriptive, non-evaluative criticism) or with what Barthes does for Balzac[52] and advertisements (descriptive, analytical, and reader-defined). A charting of the particles of a text (Slavic textology and perhaps Zeller and Hay) will be polysemic almost malgré lui (and therefore semiotic and therefore structuralist) rather than primarily intentionalist. The difference in emphasis may be significant, for an intentionalist, writer-based theory would attach no inherent value to these later polysemic structures, except insofar as they could be shown to represent "coded" or "embedded" auctorial intention—at presumably a "post-textual" (or at any rate a "post-auctorial") stage of transmission. But a structuralist or semiotic text-based or reader-based theory would obviously find the major interest in the variety of structures, whether or no this represented the intentionality of a single consciousness.[53] It would be either the interaction of these structures (their intertextuality, if you like), or the reader's play on their polysemic array, which would be the main focus of the activities of such critics.

I would also hold that the earliest formal structuralism (even though occasionally intentionalist as well) is third-century BC Alexandrian analogy, whereby an analysis of remaniement structures could be used to determine the nature and content of the phenomenological "gaps" in the documentation of intention. (If the reader will forgive the play, a very convenient modern analogy for analogy would be the non-analogue digital method of a CD player, which can be programmed to eradicate transmissional "errors" and to leap over "gaps" in the surface of the CD.) The irony of this ancient Alexandrian system is that it begins to sound rather like the phenomenologist Ingarden's schemata,[54] used to fill "gaps" in the contextual "frame of reference" of the work, and similar to Wolfgang Iser's "strategies" or "repertoires" of themes and codes which again form phenomenological structures for the resolution of intentional cruxes.[55] The Alexandrians' promotion of an ideology of the "Homeric" (or "non-Homeric") line could lead either to a subjective play reminiscent of Barthean jouissance (under the "creative" textual emendations practised by Zenodotus of Ephesus) or (under the more austere
"coding" of variants without implicit status practised by Aristarchus of Samothrace),[56] to a conservative reticence reminiscent of Zeller or the Slavic textologists. The principle is the same; it's the practice that varies. And this principle, made notorious by Bentley's infamous edition of Paradise Lost, is very much alive and well in what I believe to be the equally post-structuralist jeu of Kane/Donaldson's Piers Plowman, where the editors, under the guise of intentionality—of constructing what Langland wrote, or ought to have written—playfully (and successfully) fabricate a writerly (scriptible) text which responds to the needs of the reader (and the editor) for "perfectability" in the alliterative line, and not, so David Fowler argues, to the cumulative documentary evidence.[57]

Concluding the triad, we encounter reader-based theories (which, as I have implicitly suggested, can derive very conveniently from apparently text-based theories, as the Kane/Donaldson edition demonstrates). The clearest statement in recent textual theory is, of course, in McGann's Critique—the endorsement, in nineteenth-century editing at least, of the so-called "social" school of textual criticism.[58] Intentionality evaporates in the historical continuum of interpretive communities, for, as in later Fish, there is a shifting of focus from the nature of auctorial consciousness through the nature of the text to the nature of the reading and reconstruction of books. As I have already noted, this position is related to the ancient doctrine of the textus receptus, the cumulative history of the text beyond auctorial control, and is nothing terribly new, even on the "textual" front. On the "critical" front, it is even less startling, for Heidegger's insistence on meanings as "situational" (i.e., fluent and relativistic)[59] and Gadamer's and Jauss' charting of the passage of meaning from one cultural context to another[60] can both be seen to anticipate McGann's position. Ironically, so could Hirsch's acceptance of the fluctuating "significance" of the work,[61] although I recognise that Hirsch is really talking about something other than the formal features of the text in this case. Furthermore, I would hazard that Bakhtin's concept of the linguistic community as a battleground over meaning (where "ideological contention" may achieve resolution through such processional devices as the "carnival" of language)[62] might also be observed behind McGann's "new" position. Competing views over "continuity" versus "determinism" have been seen recently in our Attorney General's endorsement of "a jurisprudence of original intention", which Justice Brennan regards as the unfortunate result of minds having "no familiarity with the historical record."[63] Brennan's "relativistic" view of interpretation appears in, for example, Bruce Ackerman's affective theory of constitutionalism, which is frankly based on a Fishian method.[64] "Relativism"
versus "Originalism" was, of course, the focus of the debate over the nomination to the Supreme Court of the historical conservative Robert Bork.

The most notorious demonstration of the post-textual game is the Barthean and Derridean jeu,[65] the "play" on the elements of the text, the purpose of which is the gradual exposure of the inherent aporia, the "central knot of indeterminacy", which is the death-knell not only of auctorial intention and of structuralist system but of codifiable or historically consistent reading as well. Since both reading and writing are temporal media (processional, indeed), we should not be surprised that such indeterminacies are rampant, and some major authors (and critics) have been justifiably brought to book for them. Was Eve created together with Adam or from his rib? What sex was Robinson Crusoe's goat? Did Beowulf have a misspent youth or didn't he? Such textual indeterminacy may be a product of post-auctorial intervention (see, for example, Steven Urkowitz's and others' claims about the history of the conflated editing of Lear,[66] or the sectarian editing of the New Testament) or, it may result from a combination of internal and external (i.e., auctorial and editorial) "de-construction". Hershel Parker's recent textual career[67] has been dominated by a concern for such self-contradictions in American fiction of the nineteenth and twentieth centuries, where the argument is that these texts cannot be read as if they were single creative entities. However, Parker's "knot of indeterminacy" could presumably be avoided with enough auctorial and editorial care and control, whereas (according to Hillis Miller at least),[68] the deconstructor merely demonstrates the native aporia of the text—he does not create it, and it is, in fact, inevitable. But Parker's and Derrida's concerns, while differently motivated, are not dissimilar in their application: both tend to take the specific image, scene, or fragmentary moment, and to work the implications and ambiguities of this moment in its reflective qualities throughout the text (back and forth), to allow it to accomplish the dismantling of the logic of the text as a whole.[69] Both are interested in the "traces" of meaning left in a text, and both are concerned with différance, the reader's continually having to "defer" a conclusive, closed interpretation of a text. To both critics, texts are embarrassments, embarrassments of narrative and of logic.

Where does this leave us? I had a neat and superficially plausible conclusion for this survey, full of tropes of balance and dialectic, historicism and relativism, but since this prolegomenon really offers only a descriptive challenge rather than a synthetic resolution, such a conclusion would probably be unjustifiable at this stage of the debate. So, instead, I have a specific, rather than a general assertion: it is not exactly
an example, for both Hillis Miller and Tanselle have proclaimed, from their very different textual positions,[70] that good theories are not necessarily proven by the weight of demonstration, but it is an illustration, maybe even an illumination.

The Formalist Shklovsky proclaimed that "defamiliarisation" (ostranenie) was an essential function of literary language, and the implication of this assertion is that the intention of a truly "literary" author is to evoke a sense of the defamiliar.[71] As has often been observed, this emphasis on the defamiliarisation of language tends to promote a criticism of texts for their ironic qualities, and therefore tends to favour texts with linguistic or metaphorical tensions which may appear inaccessible or paradoxical but which can be resolved by the formalist critic's elucidation. Thus, the power of formalist criticism is to describe the auctorial defamiliarisation of language while ultimately rendering it accessible. I would hold that the "classical" textual theory of lectio difficilior probior/potior est (the more difficult reading is the more "correct" or "moral" or "powerful")[72] operates under the same assumption that auctorial intention will be embedded in the least "familiar" (to scribe and reader) of the variants available. In fact, the textual dogma may go further than this, and in the hands of conjecturalist critics it might suggest that a "difficult" reading representative of auctorial will must be created where documentary evidence yields only an accessible variant or variants. Used as a technical device in charting stemmatic filiations (i.e., the direction of "error"), the lectio difficilior is, like ostranenie, an endorsement of an ideology of literariness that shows itself primarily in its disjunctive nature (i.e., disjunctive from the norm, the expected, and the derivative). Both theories emphasise originality of mind and linguistic usage as the primary means of the recovery of auctorial intention, and both rest upon a definition of literariness that is assumed rather than tested. And both (ironically) claim to be purely formal, objective, analytical methods of approaching the text while in fact depending upon highly problematical evaluative positions.

My contention for this occasion is simply that the conceptual and methodological premises of such ideological pairings of literary and textual theory could benefit from a simultaneous investigation. Much work needs to be done to suggest how differing theoretical perceptions would result in differing texts, and I have elsewhere sketched some of the evidence that might be used to determine whether an edition is "formalist" or "structuralist" or "deconstructive".[73] But, as the various and several appearances in this survey of such complex editions as Gabler's Ulysses and Kane/Donaldson's Piers Plowman demonstrate, two or more competing theoretical dispensations might lie behind the practice of
editors: there might (as in Kane/Donaldson) be a divergence between announced editorial purposes and the "performance" of the editors on the textual page, or (as in Gabler) a potential theoretical disjunct between the two facing pages of the text. These delicacies—and contradictions—need a subtler articulation than is possible here, and I am now engaged on such a large-scale study; but for the moment my aim is more modest—descriptive and paradigmatic rather than practical or empirical. I do believe that an awareness of the theoretical assumptions behind an edition—even when these assumptions might seem multiple and perhaps contradictory—can be helpful to an understanding and evaluation of the results of the editing, but I do not yet suppose that this initial fragile matrix will hold the entire history of textual scholarship and the debates thereon.

And Greg? Are we to place him among the prescient post-structuralists rather than among the strict and pure bibliographers? Probably not, for his insistence on the inaccessibility of inherent meaning can be accounted for historically by a dogmatic bibliographical reliance on the technology of textual analysis, where intended meaning as an aesthetic predisposition on the part of the textual critic (see Bentley and Kane/Donaldson) might indeed be seen as a liability to the practising historian of technology or to the "objective" textual critic. Greg provides the provocation for this paper, but I do not regard him as a forerunner of Jacques Derrida.