At a recent conference on "Shakespeare: Text and Deconstruction"[1] I suggested it was no accident that the current "revisionist" textual view of certain Shakespeare plays[2] had occurred during a period of post-structuralist unease with the fixed, determinate text of literary criticism, or, similarly, that the hegemony of New Criticism—despite its ostensible rejection of intention—had corresponded with the domination of the single, eclectic text reflecting auctorial intentionality.[3] I was not supposing that textual and literary critics had been in conscious emulation of each other, but rather that a specific intellectual climate made some critical and textual assumptions more likely or plausible at some times than at others.[4] In other words, that particular critical and textual practices were promoted and sustained by a general theoretical disposition.
Like it or not, we live in a period of theory. Courses taught in graduate schools, books published by young scholars, sessions held at professional conferences—all reflect the literary concentration on theory as something distinct from (although perhaps dependent on) the empirical, evaluative, or historical criticism of earlier decades. Inevitably, there is resistance to this movement—from both literary critics and textual critics. Among the literary folk are those "humanists" who regard structuralism, post-structuralism, marxism and the rest as arid, if not immoral,[5] and among the textuists are Shakespearians who wish to retain the securities of a single text, mediaevalists who seek the one Chaucer among the many, modernists who want their Joyce clear not synoptic. And, equally inevitably, the quiet business of "traditional" literary criticism still goes on, as, of course, does the business of textual criticism and editing.
However, the textual-critical business has in recent years confronted
As we might have anticipated, the literary theorists have, in general, not returned the favour. Some "theoretical" journals have published articles by textual scholars,[11] and the more adventurous literary critics have, on occasion, included textual problems in their consideration of theory[12] or have taken part in the public debate. But for the most part, the literary theorists have continued their work as if there had been no advances in textual-critical theory in the last few decades,[13] and—on our side of the fence—the editing of texts has sometimes continued without a full or articulated investigation of the theoretical choices involved in each separate editorial task.[14]
But despite the general lack of territorial engagement, it is clear that textuists need a theory (or theories) of textuality as a medium for dialogue—with each other or with those from different disciplines. A purely empirical approach—a recital of the specific circumstances of specific texts and the story of the editorial resolution of the problems they engender—can perhaps have a useful role in the accountability of editors for what they have done, and will, of course, be cited in the textual introductions for any responsible textual edition. But some synthesis of various individual and exemplary experiences is necessary to make them comprehensive and comprehensible, and Tanselle's occasional encyclopaedic surveys of the field in Studies in Bibliography in part fulfil this function,[15] although they also illuminate textual argument at large and frequently tie this argument to the theoretical postures employed in other disciplines (as in the "Final Intentions" article). But local empiricism ("this is what we did and why we did it") has, because of its concentration upon experience, little to offer those who have not yet had, and are perhaps unlikely to have, the same or a similar experience.[16] What theory does offer is (in the words of W.J.T.
It is not, of course, that theory is better than empiricism (or vice versa), nor that empiricism (particularly in such items on the list as "small-scale certainty" and "traditional wisdom") is not necessary to the editorial task (most of the qualities in Mitchell's "empiricist" list are indeed justifiably prized by editors)—but rather that theory provides a matrix for the plotting of the "certainties", small or otherwise, since it delineates a schema for the measurement of editorial attitudes and "reflections".
And, of course, textual criticism has not been shy of theory. From Alexandrian "analogy" through to Lachmannian stemmatics, Greg-Bowers intentionalism, and McGann social textual criticism, theories of how authors work, (even of who or what authors are), how texts are generated and transmitted, and how they should be represented to an audience, have been used to define, defend, and proselytise a theoretical view of the nature of composition, production and transmission—an ontology of the text, if you will. Thus, when Lachmann declares that the archetype of the Lucretius can be reconstituted and shows how this is to be done by the charting of "truth" and "error" in a genealogical table of witnesses, extant and inferred, he is inevitably privileging that archetype as the major desideratum of the textual scholar, and incidentally but forcefully invalidating the significance of the codices descripti lower down the family tree. Within the matrix of possible privileged positions, he is endorsing the relative chronological "superiority" of the archetype (although, note, not the fair copy, which remains unplottable and therefore without privilege) against inferior "copy", scribal reinscription etc. This seems obvious enough—and quite proper—to most classically-trained textual scholars, but it needs saying for two reasons: first, because no dictum should be implicitly and permanently accepted without continual demonstration of its validity (and what might have worked as a model in the transmission of classical texts need not be immutably pertinent in other periods), and second, because the theoretical grounds for an empirical assertion should be understood as a part of its evidentiary status. In other words, there is no "natural" or "self-evident" ontology of the text, but rather a series of alternative "metaphysics" displaying "fundamental principles" (to return to Mitchell's terminology) which will involve some degree of "speculation", some "intuition" and
Why belabour all of this if it ought to be obvious to the practitioner? Well, the major reason for raising the issue now, for an audience of bibliographers old and new, is that the matrix I spoke of has been very largely redrawn by our neighbours in literary criticism, in history, philosophy, even sociology and mathematics, and if textual criticism is to remain one of the major intellectual disciplines of our culture, it must at the very least be aware of this redrawing, particularly at those parts of the matrix that intersect the accepted or acceptable notions of "text" and "author". Developments in, say, structuralist linguistics and anthropology, or in the new science of "chaos",[19] no longer keep to their neat disciplinary boundaries, but on the contrary create new disciplines in the gaps left by the retreating older ones. This tendency (noted in the very recent history of chaos in particular) has broad institutional implications. As the work of legal scholars like Rawls or anthropologists like Clifford Geertz shows, there has been a movement towards finding the centre of the humanist and social science ethic in the "textual variance" of the "texts" studied.[20] Philosophers like Richard Rorty (and literary critics like Robert Scholes) have even suggested that the typical research university will eventually reformulate itself to contain "textual departments" (rather than departments of English, history, philosophy, sociology etc.).[21] Such a possible institutional redefinition—if it ever happens (and Scholes, we should note, heads not an English department but a Center for Culture and Media at Brown)—will be a direct product of the redrawing of the map of the text and its author and reader, and on this new map textual scholarship—as we have traditionally understood that term—must find a place, indeed a central not a marginal place.
It would be a lost opportunity, and a major intellectual tragedy, if textual scholarship were not to seek a role in future "departments of texts", but it cannot achieve this status if its practitioners remain resolutely unaware of, or even hostile to, the disciplinary and institutional changes which have caused the map to be redrawn.
As I have argued elsewhere, literary critics have all too often assumed that in the new textualism (or even in the old evaluative criticism) "any text will do".[22] No reader of Studies in Bibliography would accept such a dismissive retreat from textual responsibility, and it is thus our job to know where we stand, almost literally, in the redrawing of the terrain. The rest of this article, after this somewhat polemical introduction to the problem, addresses the question of the new matrix and the new
I begin with a (mis)quotation, which can be a brief exercise in critical attribution.
"I start then with the postulate that what the [critic] is concerned with is pieces of paper or parchment covered with certain written or printed signs. With these signs he is concerned merely as arbitrary marks; their meaning is no business of his." (emphases mine)
This sounds like one of the "hermeneutical mafia" pontificating again—perhaps Eco or Culler (given the concentration on signs), Derrida, de Man, Hillis Miller, or Hartman—we may all have our favourite candidates. But the misquotation is instructive in this case. The first sentence should read "the bibliographer" [not "critic"] "is concerned with . . . signs" etc. And the author of this espousal of the arbitrariness of signs and the impropriety of meaning? Not a refugee from the École Normale Supérieure nor even from Geneva, Konstanz, or New Haven, but that stalwart of Anglo-American "strict" bibliography, W. W. Greg. If bibliographers disdain mere meaning, what chance for those toiling in both literary and textual fields? But the apparent coincidence of view (culled, I admit, from Greg's more polemical writings in defence of bibliography as a "science" of "forms")[23] can be valuable, as I have suggested: literary and textual critics and theorists may not have spoken to each other directly in the last half-century or so, but there may be parallel conceptual or methodological issues at stake in their attitudes to that mysterious immanence—the "text". The following brief survey attempts to construct a few possible models where such parallels may be observed in operation. We may find some strange bedfellows, and some of the supposed paradigms may look strained on first acquaintance, but I would hope that a general loosening of the strict territorial imperatives could be of benefit to both parties.
I would like to use a very familiar structure: the writer-, text- and reader-based theories of both literary and textual dispensations. The familiarity of this tripartite division of the textual spoils may modify the apparent heterodoxy of my other suggestions by framing them in a system that offers comparatively little contention.
From a critical point of view, one would expect to encounter, for example, intentionalist theories, phenomenological theories, historical-critical "objectivist" theories in the first (writer-based) division; formalist, New-Critical, textual-analytical, structuralist theories in the second (text-based) division; and reception, deconstructive, jouissance (or "readerly-play") theories in the third (reader-based) division. It is obviously an over-simplification, but it will do to give a rough orientation to the textual and critical dispensations to be covered.
Let us first admit that some of the possible theoretical filiations are more honestly (or perhaps more directly) stated than others. For example, Steven Mailloux's suggested revision of the Hancher-Tanselle line on intention in his Interpretive Conventions (1982)[24] acknowledges the presence of Stanley Fish in his title, his method, and his documentation. (Ultimately, I think his argument responds more to Poulet and a phenomenological reading of intention than to a Fishian one, but that is another question.)[25] On the other hand, Jerome McGann's assault—in his Critique of Modern Textual Criticism[26]—upon the Greg-Bowers definition of (and apparent need for) intention makes no such attempt to place itself in the general inheritance of critical speculation, and therefore has appeared more contentious (and revolutionary) to other textual critics than it really is. The Geneva and Konstanz schools, Fishian affective stylistics and interpretive communities, even the good old textus receptus—one of the hoariest of textual données—may all lie behind McGann's position in the Critique, but they are not an informing part of his argument as they have been in some of his other historical and critical works. And this is particularly important, given the sweeping political arguments that underlie McGann's book. Some textual critics would simply consign McGann's work to the demesne of "literary criticism" and therefore ignore it (interestingly, the Critique was reviewed in TLS[27] under the rubric of "literary theory", not "Textual Criticism" or "Bibliography"), but as the very rationale of this survey suggests, I believe that all textual or literary arguments, even the least valuable in practice, rest upon certain theoretical assumptions which must be questioned and made to give an account of themselves.
But on to the first stage: writer-based theories. As already suggested, the dominant phase of an intentionalist textual theoretical school in this last half-century has clearly been the Greg-Bowers-Tanselle promotion
In this speculative tour of paradigms, we move next to text-based theories, where the mid-century influence of Formalist/New-Critical decontextualisation of the text is well-attested. The orthodox Formalist concentration on defamiliarisation[38] (of which more anon) shows a predisposition to respond to particular types of inter- (or perhaps, more correctly, intra-) textual relationships (particularly multi-layered ironies), and this intra-textual layering (albeit under objective bibliographical principles) can be observed in the synoptic text of Gabler's Ulysses (and perhaps in any "texte génétique" as well—although that's a more problematical question).[39] I am not convinced that genetic editors—despite their generally phenomenological assertions—belong automatically in the intentionalist division; it depends on the use made of the genetically-derived material. Perhaps paradoxically, Gabler's "clear-text" reading page could be seen as a "New-Critical" resolution of the structuralist ironies present in the synoptic text (i.e., as the critic/editor's selection of readings which remove or explain or fulfil the layers of meaning in the text, in the manner of a formalist's objective codification of the linguistic tensions in the work); or, the clear text might represent auctorial "final" intention as well as, or in place of, a merely critically-resolved final structure. Stated bluntly, the problem in any joint synoptic/clear text edition is how far the latter stage corresponds to final intention, insofar as that can be delineated in any single, eclectic text. But with or without clear text, a synoptic text—where multiple authority exists, of course—is a sort of "scrambled" (but presumably decodable) version of the Lachmannian filiative system, except that the synoptic text may eschew the hierarchical format of variants on which the Lachmann method depends.
(The distinction is not entirely apt, as I recognise, for even a synoptic text must have a "base" text on which the diacritics can map the dynamic of textual growth, but the formal arrangement of a synoptic text is not inevitably genealogical or stemmatic,
While McGann does acknowledge that there is an intellectual disjunct between Lachmannian stemmatics and twentieth-century intentionalism, it is, I think, a mischaracterisation of the history of textual theory to place the Lachmannian method and its aims (as he does) in a linear relationship with the Greg-Bowers school. The problem with McGann's "schematic history" is that, despite the noted disjunct, he fails properly to recognise the very limited status of the archetype in the Lachmannian system—an acknowledgedly corrupt state of textual transmission which does not respond to intention. Housman saw this weakness in the Lachmannian argument, when he accused the school of relying upon hope rather than judgement in their acquiescence to what amounted to a "best-text" theory,[46] although it was not so called. (There are, of course, several ironies in the terms of this conflict.)
A separate, and much fuller, study would be required to argue the problem of whether a filiative theory of textual criticism is analogous to the sort of geneticism practised by so many contemporary European textuists—Hay, Lebrave, Zeller—or whether Soviet textology, with its emphasis on the "unintentional", "non-authorial" remaniement, is similarly structuralist.[47] As my general tone would indicate, I believe they probably are.
There is one possibly valid methodological distinction which might
I would also hold that the earliest formal structuralism (even though occasionally intentionalist as well) is third-century BC Alexandrian analogy, whereby an analysis of remaniement structures could be used to determine the nature and content of the phenomenological "gaps" in the documentation of intention. (If the reader will forgive the play, a very convenient modern analogy for analogy would be the non-analog digital method of a CD player, which can be programmed to eradicate transmissional "errors" and to leap over "gaps" in the surface of the CD.) The irony of this ancient Alexandrian system is that it begins to sound rather like the phenomenologist Ingarden's schemata,[54] used to fill "gaps" in the contextual "frame of reference" of the work, and similar to Wolfgang Iser's "strategies" or "repertoires" of themes and codes which again form phenomenological structures for the resolution of intentional cruxes.[55] The Alexandrians' promotion of an ideology of the "Homeric" (or "non-Homeric") line could lead either to a subjective play reminiscent of Barthean jouissance (under the "creative" textual emendations practised by Zenodotus of Ephesus) or (under the more austere
Concluding the triad, we encounter reader-based theories (which, as I have implicitly suggested, can derive very conveniently from apparently text-based theories, as the Kane/Donaldson edition demonstrates). The clearest statement in recent textual theory is, of course, in McGann's Critique—the endorsement, in nineteenth-century editing at least, of the so-called "social" school of textual criticism.[58] Intentionality evaporates in the historical continuum of interpretive communities, for, as in later Fish, there is a shifting of focus from the nature of auctorial consciousness through the nature of the text to the nature of the reading and reconstruction of books. As I have already noted, this position is related to the ancient doctrine of the textus receptus, the cumulative history of the text beyond auctorial control, and is nothing terribly new, even on the "textual" front. On the "critical" front, it is even less startling, for Heidegger's insistence on meanings as "situational" (i.e., fluent and relativistic)[59] and Gadamer's and Jauss' charting of the passage of meaning from one cultural context to another[60] can both be seen to anticipate McGann's position. Ironically, so could Hirsch's acceptance of the fluctuating "significance" of the work,[61] although I recognise that Hirsch is really talking about other than the formal features of the text in this case. Furthermore, I would hazard that Bakhtin's concept of the linguistic community as a battleground over meaning (where "ideological contention" may achieve resolution through such processional devices as the "carnival" of language),[62] might also be observed behind McGann's "new" position. 
Competing views over "continuity" versus "determinism" have been seen recently in our Attorney General's endorsement of "a jurisprudence of original intention", which Justice Brennan regards as the unfortunate result of minds having "no familiarity with the historical record."[63] Brennan's "relativistic" view of interpretation appears in, for example, Bruce Ackerman's affective theory of constitutionalism, which is frankly based on a Fishian method.[64] "Relativism"
The most notorious demonstration of the post-textual game is the Barthean and Derridean jeu,[65] the "play" on the elements of the text, the purpose of which is the gradual exposure of the inherent aporia, the "central knot of indeterminacy", which is the death-knell not only of auctorial intention and of structuralist system but of codifiable or historically consistent reading as well. Since both reading and writing are temporal media (processional, indeed), we should not be surprised that such indeterminacies are rampant, and some major authors (and critics) have been justifiably brought to book for them. Was Eve created together with Adam or from his rib? What sex was Robinson Crusoe's goat? Did Beowulf have a misspent youth or didn't he? Such textual indeterminacy may be a product of post-auctorial intervention (see, for example, Steven Urkowitz's and others' claims about the history of the conflated editing of Lear,[66] or the sectarian editing of the New Testament) or, it may result from a combination of internal and external (i.e., auctorial and editorial) "de-construction". Hershel Parker's recent textual career[67] has been dominated by a concern for such self-contradictions in American fiction of the nineteenth and twentieth centuries, where the argument is that these texts cannot be read as if they were single creative entities. However, Parker's "knot of indeterminacy" could presumably be avoided with enough auctorial and editorial care and control, whereas (according to Hillis Miller at least),[68] the deconstructor merely demonstrates the native aporia of the text—he does not create it, and it is, in fact, inevitable. 
But Parker's and Derrida's concerns, while differently motivated, are not dissimilar in their application: both tend to take the specific image, scene, or fragmentary moment, and to work the implications and ambiguities of this moment in its reflective qualities throughout the text (back and forth), to allow it to accomplish the dismantling of the logic of the text as a whole.[69] Both are interested in the "traces" of meaning left in a text, and both are concerned with différance, the reader's continually having to "defer" a conclusive, closed interpretation of a text. To both critics, texts are embarrassments, embarrassments of narrative and of logic.
Where does this leave us? I had a neat and superficially plausible conclusion for this survey, full of tropes of balance and dialectic, historicism and relativism, but since this prolegomenon really offers only a descriptive challenge rather than a synthetic resolution, such a conclusion would probably be unjustifiable at this stage of the debate. So, instead, I have a specific, rather than a general assertion: it is not exactly
The Formalist Shklovsky proclaimed that "defamiliarisation" (ostranenie) was an essential function of literary language, and the implication of this assertion is that the intention of a truly "literary" author is to evoke a sense of the defamiliar.[71] As has often been observed, this emphasis on the defamiliarisation of language tends to promote a criticism of texts for their ironic qualities, and therefore tends to favour texts with linguistic or metaphorical tensions which may appear inaccessible or paradoxical but which can be resolved by the formalist critic's elucidation. Thus, the power of formalist criticism is to describe the auctorial defamiliarisation of language while ultimately rendering it accessible. I would hold that the "classical" textual theory of lectio difficilior probior/potior est (the more difficult reading is the more "correct" or "moral" or "powerful")[72] operates under the same assumption that auctorial intention will be embedded in the least "familiar" (to scribe and reader) of the variants available. In fact, the textual dogma may go further than this, and in the hands of conjecturalist critics it might suggest that a "difficult" reading representative of auctorial will must be created where documentary evidence yields only an accessible variant or variants. Used as a technical device in charting stemmatic filiations (i.e., the direction of "error"), the lectio difficilior is, like ostranenie, an endorsement of an ideology of literariness that shows itself primarily in its disjunctive nature (i.e., disjunctive from the norm, the expected, and the derivative). Both theories emphasise originality of mind and linguistic usage as the primary means of the recovery of auctorial intention, and both rest upon a definition of literariness that is assumed rather than tested. And both (ironically) claim to be purely formal, objective, analytical methods of approaching the text while in fact depending upon highly problematical evaluative positions.
My contention for this occasion is simply that the conceptual and methodological premises of such ideological pairings of literary and textual theory could benefit from a simultaneous investigation. Much work needs to be done to suggest how differing theoretical perceptions would result in differing texts, and I have elsewhere sketched some of the evidence that might be used to determine whether an edition is "formalist" or "structuralist" or "deconstructive".[73] But, as the various and several appearances in this survey of such complex editions as Gabler's Ulysses and Kane/Donaldson's Piers Plowman demonstrate, two or more competing theoretical dispensations might lie behind the practice of
And Greg? Are we to place him among the prescient post-structuralists rather than among the strict and pure bibliographers? Probably not, for his insistence on the inaccessibility of inherent meaning can be accounted for historically by a dogmatic bibliographical reliance on the technology of textual analysis, where intended meaning as an aesthetic predisposition on the part of the textual critic (see Bentley and Kane/Donaldson) might indeed be seen as a liability to the practising historian of technology or to the "objective" textual critic. Greg provides the provocation for this paper, but I do not regard him as a forerunner of Jacques Derrida.