The Ranking of Variants in the Analysis of Moderately Contaminated Manuscript Traditions
by
Harold Love

Editors of post-mediaeval English literature are not very often required, in the absence of an authorial holograph or authoritative edition, to reconstruct texts from a profusion of non-authorial copyings. When they do, as in the instances of Donne, Rochester and the authors of late seventeenth-century libertine verse and "state poems," they will naturally try to adapt the methods developed by classical and biblical scholars to deal with the problems that arise, but will not always find them as clarified or thought-through as would be desirable.[1] One reason for this may well be that editors of classical texts, dealing as a rule with thoroughly contaminated textual traditions which may, in any case, lead only to the conjectures of some Byzantine or Hellenistic editor or to an archetype that was already seriously deficient, have rightly placed their main emphasis on the reconstruction of the author's text through philological and historical means —reserving the genealogical method for situations where its conclusions would be more or less open and shut.[2] Biblical scholarship faces much
the same problem multiplied several hundredfold due to the enormous number of surviving sources and the complexity of their interrelationships, and, as a result, is reduced to dealing with broad groups or families of manuscripts without much fine tracing of individual lines of descent. In dealing with traditions such as that of Rochester, however, the editors have every right to feel that they should be able to make sense of the genealogical relationships—even when faced with groups of twenty or more manuscripts. Far from extending over centuries or millennia, the copying of the sources was probably concentrated within a decade or two. It was performed by scribes and compositors who spoke a common dialect of the English tongue, into which the editor can acquire more insight than would be the case with texts in a dead language.[3] There is a good deal of incidental information surviving about the circumstances of copying.[4] Finally, these were not sacred texts or high literature but topical poems written out currente calamo for immediate consumption, with the consequences that variation abounds and that the tradition is less likely to have been corrupted by the drawing of readings from more than one exemplar, though there may have been a fairly high incidence of memorial contamination.

Yet the problems clearly remain considerable. David Vieth, the first scholar to attempt to create a text of Rochester from the full range of the surviving sources (which he himself had been responsible for charting)[5], was, according to his own account, able to make only limited use of the genealogical method and relied in the establishment of his text on an adaptation of the "best manuscript" theory.[6] The present article
grew from a wish to discover whether, despite Vieth's scepticism, it might be possible to present plausible stemmata for Rochester's longer poems.[7] The innovations in method proposed have been developed solely with that end in view and will not necessarily apply in other areas of textual scholarship, though it is hoped they will be found to have some wider applicability.

The approach in general involves an insistence on the historicity of the act of transmission which has been unfashionable in recent years, and which departs, perhaps unwisely, from the principle argued for by Vinton Dearing that the concern of the editor should be with the text considered as a logical construct rather than with the manuscript as carrier of the readings.[8] The distinction in itself is an important one which may be grasped most easily if we consider a manuscript which is a copy of a source, A, being corrected from a manuscript descended from a source, B, belonging to a different branch of the textual tradition, so as to remove all the substantive readings which were characteristic of A. If we assume that the deleted readings are rendered illegible, it is fair to say that the text of that manuscript, in so far as this is defined by substantives alone,[9] now belongs to the branch of the tradition represented by B, although it remains a historical fact that the majority of the words inscribed on its pages—those in which it did not vary from either A or B—had descended by a physical act of copying from A. When editors construct a stemma, they will normally see their aim as that of producing a diagrammatic representation of the logical relationships of the groupings of variant readings that characterise the sources, not a family tree of copyings which would show the paths by which the majority of readings in each instance, whether variant or invariant, were actually transmitted from page to page. So, to return to the earlier
example, our manuscript would have to be included in a logical stemma as a descendant of B, even if the editor had firm external or non-substantive evidence for its having been initially copied from A.

There is, of course, no objection to this procedure as long as there is no confusion in the minds of editors or their readers over what is taking place. Indeed, it is often much more serviceable to the inquiry in hand than a strict family tree of sources would be. But dangers can nonetheless arise if editors allow themselves to lose sight completely of the nexus between the text and its carrier.[10] If the study of variation is, from one point of view, an exercise in the manipulation of symbols according to the principles of mathematical logic, it is equally the study of a certain kind of human behaviour—that of the scribe, professional or amateur, in the act of copying. The importance of habituating oneself to think in terms of the genealogies of manuscripts as well as of the class-relationships of texts is simply that the significance as evidence of a variant in any kind of textual inquiry must lie ultimately in the processes of thought (or non-thought) which have led to its coming into existence, its further transmission and possibly its disappearance. Modern English-language editorial practice admits the validity of such criteria, but insists that they should be reserved until the stage when the editor has established the primary pattern of relationships between his texts and begins the search for evidence of the direction of variation in order to locate the position of the ancestor.[11] My own contention is that the consideration of scribal motivation is relevant to every stage of textual reasoning and that to ignore it is to assent to the proposition that all cases of variation are of equal value as evidence for the relationship of texts—a proposition which may, indeed, be an unavoidable rule of
procedure in certain specialised studies of textual data, but which is in itself an absurdity.

II

The editor encountering a tradition of the kind described must begin by finding solutions to two pressing problems of technique. In the first place, although the logic of textual analysis, as presented in handy form in Greg's The Calculus of Variants,[12] is crystal clear, the task of applying that logic to a manuscript tradition of any complexity, and containing even a moderate number of irregular agreements, can involve a daunting amount of calculation and record-keeping. Dearing, Froger and others have devised computer-based methods for performing the purely mechanical part of the task—though the problem still remains of ensuring that the data to be analysed is put into machine-readable form without distortion.[13] The second problem is that, as we have just seen, the variant readings which are the raw materials of logical analysis will not themselves necessarily progress through the tradition in logical ways. An error which would otherwise have been characteristic of a particular branch of the tradition might be corrected or (less fatally) miscorrected by a scribe. The same variant may arise independently in two different parts of the tradition. Readings characteristic of one branch may be imported into another as the result of memorial contamination or, in extreme cases, of an editing process in which readings from exemplars representing two or more separate lines of descent are combined eclectically into a single "conflated" text. Any of these happenings is likely to lead to an incorrect judgement about the place of manuscripts within the tradition as a whole or the number of intermediaries intervening between one manuscript and another. Seeing that a significant percentage of anomalous agreements seems to be present in every "real life" textual situation, the editor must be prepared to find this the most testing and frustrating stage of his labours.
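
The purely mechanical part of the task that Greg's calculus formalises—partitioning the sigla at each point of variation into agreement groups—can be sketched briefly in code. The following Python fragment uses an invented locus and a hypothetical function name, `variation_groups`; it shows only how a collation at a single locus yields a grouping in Greg's colon-separated notation, not how anomalies are to be reconciled:

```python
from collections import defaultdict

def variation_groups(readings):
    """Partition witness sigla into groups sharing the same reading
    at a single point of variation (a 'locus')."""
    groups = defaultdict(list)
    for siglum, reading in sorted(readings.items()):
        groups[reading].append(siglum)
    # Largest group first, in the manner of Greg's notation.
    return sorted(groups.values(), key=len, reverse=True)

# An invented locus collated across seven witnesses A-G:
locus = {"A": "fair", "B": "fair", "C": "faire state", "D": "fair",
         "E": "fair", "F": "faire state", "G": "faire state"}
print(":".join("".join(g) for g in variation_groups(locus)))
# ABDE:CFG
```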

Whereas problems of scale and volume can be dealt with satisfactorily by the computer, it has yet to be shown conclusively (which is to say empirically) that purely quantitative methods can deal with the problem of irregular agreement.[14] Scholars faced with large and intractable textual traditions have naturally wished to devise methods by which vast bodies of data could be presented simply as a list of groupings of sigla and the task of reconciling these with each other and ascertaining the status of any anomalies performed entirely by a computer program. An algorithm described by Dearing in his Principles and Practice of Textual Analysis is the most elaborate such attempt yet to appear in print, though others are known to have been attempted privately. While it is easy to scoff at such ambitions as a twentieth-century counterpart of squaring the circle (and there are certainly enormous temptations to recursive and tautological reasoning), the fact remains that it is only by such means that any hopes can be entertained of reaching a solution to the textual histories of the New Testament, the major church fathers, and such widely copied poets as Dante, Virgil and Chaucer. The drawback of such enterprises, however, is that the editor has to work with one arm tied behind his back. Large-scale computer-based data sorts allow no opportunity for the detailed ranking and precoding of the data. Moreover, once a variation has been reduced to a string of symbols divided by group delimiters it has to be regarded as possessing exactly the same evidential value as any other variation, with the result that cases of irregular agreement must be reconciled by quantitative, not qualitative, means. Quantitative means should not be sneered at simply because they are quantitative.
The powerful software packages used by social scientists for the statistical analysis of questionnaire and interview data have a considerable, and so far largely untested, value for textual scholars. But they cannot give better than a statistically plausible answer to the central questions—what is the location of the ancestor and how are the other texts derived from it—and there are many situations in which this will be a disastrously wrong answer.
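
A minimal sketch of the kind of measure such quantitative methods rest on—here a simple pairwise agreement tally, with invented data and the hypothetical name `agreement_counts`—makes the limitation concrete: the tally records how often two witnesses agree, but nothing about why they agree:

```python
from itertools import combinations
from collections import Counter

def agreement_counts(collation):
    """Count, for every pair of witnesses, the loci at which they carry
    the same reading. `collation` is a list of dicts, one per locus,
    each mapping siglum -> reading."""
    counts = Counter()
    for locus in collation:
        for a, b in combinations(sorted(locus), 2):
            if locus[a] == locus[b]:
                counts[(a, b)] += 1
    return counts

# Three invented loci in three witnesses:
collation = [
    {"A": "x", "B": "x", "C": "y"},
    {"A": "p", "B": "q", "C": "q"},
    {"A": "m", "B": "m", "C": "m"},
]
counts = agreement_counts(collation)
# A-B and B-C each agree at two loci, A-C at one: a purely
# quantitative view cannot say which agreements are genetic.
```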

Older approaches to textual criticism, partly from necessity but partly also from intellectual preference, took a view that was exactly the other way round. Accepting that variations differ greatly in their reliability as evidence for the family relationships of sources, they exercised
editorial judgement to select those variations most likely to indicate the true state of affairs, rejected the others, and constructed their stemmas on the basis of what they learned from the favoured few.[15] It is true that the method was sometimes grossly abused by lazy or careless scholars, but there is still much to be said for it, provided that two conditions are met. The first of these is that adequate, reasoned and consistent criteria for selection are employed. The second is that decisions made on the basis of a selection of the evidence are found to hold good in a general way when referred back to the whole body of data, or at least to a random sample of it. It is my aim in what follows to outline one form in which such a method can be applied and to suggest that at the present stage of our experience it still offers many advantages over purely quantitative methods for dealing with problems of irregular agreement in moderately contaminated traditions.

It will be necessary before proceeding to present a more detailed summary of the circumstances under which variation may arise during the copying of texts. Least under the control of the scribe is physical damage to the exemplar that removes words or makes them illegible. Variation may also arise from inadequacies in the script of the exemplar by which one word may be read as another or provoke a desperate emendation. The extent to which this takes place will be influenced by the scribe's ability as a decipherer of difficult hands and, perhaps more vitally, by his preparedness to devote time and care to the analysis of letter forms. A third class of changes may be described as misreadings due to incorrect perception on the part of the scribe or a failure to proofread the transcript against the exemplar. This would include eyeskips between two occurrences of a word, dittography, reversals of order, and errors arising from incorrect anticipation—all these categories of error having been carefully anatomised by classical scholars.[16] Of a slightly different nature are substitutions, whether voluntary or involuntary, of the scribe's habits of grammar and expression for those of the original.[17] These would not always be seen by the scribe as comprising errors:
in some cases they might even be regarded as part of his professional duty. Next come what can be described as editing activities: the emendation of real or imagined errors, the incorporation of readings from other, supposedly better sources, and the replacement of the unfamiliar by the familiar, even if the unfamiliar is perfectly satisfactory—for textual transmission has its own perverse kind of Gresham's Law by which the majority reading will inexorably extend its empire simply by possessing the "rightness" of familiarity. Scribal "improvement" of the original is another matter still—this may have an aesthetic justification or a moral one and covers the suppression or alteration of readings which to the best of the scribe's knowledge were intended by the author, as well as outright interpolation.

Bearing these things in mind, let us consider the position of an editor faced with a tradition containing seven manuscripts, cited under the sigla A to G, who encounters a variation in the form ABDE:CFG, or as it would be written in Greg's notation—Σ:CFG. These two groups should ideally represent (1) an independent and self-contained sub-family within the overall tradition, having its own exclusive common ancestor, and (2) a residual body of texts descended without variation from the archetype—which is which need not for the moment concern us. This ancestor of the sub-family, in turn, may either be one of the manuscripts listed or a lost intermediary (further patterns of variation would need to be consulted in order to determine this). And yet the assumption that the two groups of sigla are genetically significant is not one that the editor is able to make on the basis of a single instance, or even, in some cases, on the basis of a repeated occurrence of the same variation. Before he can judge the value of the variation and the kind of weight that can be placed on it as evidence, he will need to consider two further questions: whether each of the groups indicated is likely to be a true group in genealogical terms and whether it is likely to be a whole group.[18] It will be self-evident that such questions cannot be answered from a consideration of symbols alone: they require a close consideration of the actual variant in its context of meaning, supplemented by a sound understanding of the habits of scribes.
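
The shape of a variation, by contrast, can be read off mechanically from its sigla groups. The sketch below (hypothetical function name, following the definitions summarised in this article) distinguishes type-1 variations, the type-2 variations that are the main concern here, and the complex cases answering to Dearing's types 3 and 4:

```python
def greg_type(groups):
    """Classify a variation by the shape of its sigla groups:
    type-1 sets a single witness against the rest; type-2 divides
    the tradition into two groups of at least two members each;
    more than two groups makes a complex variation."""
    if len(groups) < 2:
        return "invariant"   # no variation at this locus
    if len(groups) > 2:
        return "complex"
    if min(len(g) for g in groups) == 1:
        return "type-1"
    return "type-2"

print(greg_type([["A", "C", "D", "E", "F", "G"], ["B"]]))  # type-1
print(greg_type([["A", "B", "D", "E"], ["C", "F", "G"]]))  # type-2
```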

The idea of a "whole" group is necessary to allow for the fact that some variants are inherently less stable than others. Restricting ourselves
for the moment to what Greg called type-2 variations, involving two groups of texts only, each of which contains at least two members, let us consider the situation of a scribe encountering some glaringly obvious mistake in grammar or metre in his exemplar. While it is a sad fact of life that scribes will often apply superb calligraphical skills to the transcription of arrant nonsense, such a reading would be at constant risk of being altered into something more plausible. Assuming that, in the present case, our scribe succeeded in restoring the reading of the original, his transcript, and those of its descendants which managed to transmit the reading without any further alteration, would, as far as this particular variation was concerned, appear to be prior rather than posterior to their ancestor containing the error. At the same time, the sources that preserved the error would cease to indicate the full membership of the body of manuscripts descended from that ancestor —in other words they would no longer indicate a whole group. The same kind of risk is run by any unique or minority reading—however plausible—occupying a prominent position in a much-copied poem. Here our Gresham's Law would apply again—and a scribe who had encountered the more widely represented reading might substitute it for the alternative simply because, being familiar, it seemed "better."

In discussing the related notion of a "true group" it is necessary to remind ourselves again that our concern for the moment is only with individual variants considered in isolation. Whether or not any given agreement, say CFG, indicates a whole group, that group will be "true" as long as each of its members genuinely belongs to the same sub-family as defined by succession of copying from the source in which the variant originated or from the archetype. However, in cases where an identical variant was introduced spontaneously and independently by scribes working from different exemplars or had arisen from contamination, the receiver group would be a false one in that one of its members was not a descendant of the text in which the variant originated. In this last case the variation would present a false group that contained a whole group, though it is easily possible to imagine circumstances that would lead to the genesis of groups that were false without containing a whole group.

A third distinction that needs to be made is that between variants that are enabled by their nature to extend beyond their genealogical bounds and those which are not enabled. If a scribe should feel dissatisfaction with a reading in his exemplar and consult either another manuscript or his memory for an alternative, he will obviously be ready to accept any good, or at least plausible, alternative he might encounter. On the other hand, if the alternative was itself of a dubious nature,
there would be little point in making a substitution.[19] Moreover, if he knows of no alternative, but attempts a conjectural emendation, his guess is much more likely to coincide with an alternative good reading than an alternative bad one (remembering that we are not at present considering variations with more than one alternative). Plausible readings must therefore be regarded as potentially mobile while bad readings will normally remain stationary. Familiarity may have a similar effect. As has already been indicated, the scribe will tend to regard the variant with which he is personally familiar as the "right" one, and, statistically speaking, this is more likely to be the variant that is better represented among the surviving sources.[20] If the variant concerned was both familiar and obviously more plausible than that with which the scribe was confronted in his exemplar, the pressure to make the substitution would be almost overwhelming, even in cases when it was the less plausible minority reading, as lectio difficilior, which was authorial or offered the best evidence for the authorial intention. If both variants were equally plausible, and there was in consequence no incentive for the scribe encountering either one separately to search for an alternative in a second source, the less common reading would only be likely to supplant the more common if it happened fortuitously to be that with which the scribe was already familiar. A consequence of these distinctions is that the agreement in an obviously wrong, and therefore "stationary," variant of a group of manuscripts when the alternative is obviously better is a very powerful argument—perhaps the most powerful of all—for them forming a true group. 
It would be unlikely, however, for them to form a whole group, both because of the existence of a better alternative and because a scribe unaware of the alternative might still arrive at it through conjectural emendation.[21] The importance of these considerations for the editor is that they suggest ways in which certain kinds of variation, or even certain kinds of variants taken in isolation, might be regarded as more likely or less likely to produce reliable evidence about the trueness or wholeness of groups.

The following list offers a ranking of classes of variant according to the likelihood of their providing reliable evidence of "bibliographical" descent. It must be understood that some of the assumptions made rest on no firmer foundation than common sense, and that common sense is sometimes contradicted by what emerges from the patient analysis of large bodies of empirical data. It is to be hoped therefore that the validity of these assumptions can be tested for each of the major historical manuscript traditions by a statistical study of the causes of variation in situations where it is still possible to compare a copy with its exemplar, and equally that the activity of transcription can be studied under laboratory conditions using the methodology of present-day experimental psychology. A further limitation is that it assumes that scribes, whether characterised as "alert" or "careless," will behave in a reasonably consistent way and come to reasonably rational conclusions about the problems posed by their exemplars, an assumption which is hard to reconcile with the many peculiar situations encountered in the sources. Nonetheless, the criteria listed are offered as the guides which in the present state of our knowledge, and on the basis of my own experience as an editor of seventeenth-century texts, seem most likely to prove fruitful. As earlier, only Greg's type-2 variations are admitted for formal consideration though the method obviously has implications for the evaluation both of type-1 variations and of complex variations (Dearing's types 3 and 4).[22]

The basic distinctions that require to be made are as follows:

(1) whether a reading encountered by a scribe in his exemplar would appear to him to be plausible (P), suspicious (S) or an obvious error (E)

(2) whether, in the event of a scribe having knowledge of the alternative, a variant would be likely to be regarded as good by comparison (G), indifferent (I), or obviously inferior, i.e. bad (B)

(3) whether a variant can be described as obtrusive (O), in the sense of being prominent and likely to draw attention to itself, or unobtrusive (U)

(4) whether a variant might be considered vulnerable to reversal or further alteration under the influence of scribal linguistic and spelling preferences or palaeographical factors (V).
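
These four distinctions compose mechanically into the class labels used in the list that follows. A brief sketch (hypothetical function name; the precedence given to E and V over the finer distinctions is my reading of the scheme, since the article treats those two as classes in their own right):

```python
def variant_class(plausibility, comparison, obtrusive, vulnerable):
    """Compose a class label from the four distinctions:
    plausibility in {'P', 'S', 'E'}; comparison in {'G', 'I', 'B'};
    obtrusive and vulnerable are booleans. V and E subsume the
    finer distinctions."""
    if vulnerable:
        return "V"
    if plausibility == "E":
        return "E"
    return plausibility + comparison + ("O" if obtrusive else "U")

print(variant_class("P", "I", obtrusive=False, vulnerable=False))  # PIU
print(variant_class("S", "B", obtrusive=True, vulnerable=False))   # SBO
```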

Using this terminology, we can describe the following classes of variant and/or variation, each possessing a different degree of trustworthiness as an indicator of genetic relationships.

PIU

A clear substantive variant which is plausible in its context and to all appearances could have been intended by the author, which is indifferent by comparison with the alternative reading, yet which is not obtrusive enough to encourage memorial contamination from the alternative in cases where that was better represented and therefore more familiar. A variant of this kind is likely to indicate both a whole group and a true group.

PIO

A clear plausible substantive variant not noticeably better than the alternative, but one that is obtrusive enough in the poem to have run a definite risk of memorial replacement. Criteria of obtrusiveness would need to be derived afresh for every new work or tradition studied. In a religious text, for instance, it might be the theologically charged word that required to be so defined whereas in verse the position of the word in the stanza might be relevant.[23] Consideration would also have to be given to the likelihood of the scribe having encountered the work being copied on a previous occasion or of having access to a second source: with the Rochester manuscripts, I believe that possibility to have been quite a high one, especially where the professional writers of the large miscellanies were concerned. Variants of this kind are still likely to indicate both a true and a whole group, but not so reliably as those of the previous class. If the variant is relatively infrequent by comparison with the alternative, and there is no reason for believing that this was not the case in the tradition as a whole (including non-surviving as well as surviving sources), the likelihood of its representing a whole group would be correspondingly diminished but the likelihood of its representing a true group correspondingly enhanced.

PGU, PGO

A clear plausible substantive variant but clearly better than the alternative. Readings so classified—and especially the obtrusive subclass—would be likely to travel into texts belonging genealogically to
the alternative group in cases where the scribe was aware of both alternatives. In such instances, the texts containing the good reading would constitute a false group which, however, contained a whole group, since the bad reading would not, under normal circumstances, ever be substituted for the good one.

SIU, SIO, SBU, SBO

Situations where the variant reading is suspicious but not an obvious error. A careless scribe would simply copy the reading, but an alert one might be tempted to emend or to consult another source. Such variants would be quite likely to define a true group, as they would be unlikely to travel to other branches of the tradition, but could not be relied on to define a whole group, especially when the alternative was obviously better.

E

Glaring and obvious errors that only a careless or desperate scribe would be prepared to let pass and that would never be called on to repair a suspect reading in another text. Where there is no danger of independent error, such variants would probably be the most reliable of all as indicators of a true group but would have little reliability as indicators of a whole group.[24]

V

A variant which, while still characterisable as substantive, would be vulnerable to change either as the result of an individual copyist's linguistic or spelling preferences or for palaeographical reasons. Variants in the first category would be likely to lie behind many forms of purely dialectical variation, many variations in grammatical function words, most variations in which alterations to meaning are brought about by variations in punctuation or small changes to spelling, and most variations in the wording of material that might be regarded as ancillary to the text proper such as titles, inscriptions and stage directions. The second category would include all variations involving words such as "the," "that," "which," and "when," which were frequently written as contractions, and most variations between singular and plural forms, many of which appear to arise from the misreading
of final "s" as a decorative flourish or of a decorative flourish as final "s." (This and other points of ambiguity might well be particularly critical in the handwriting of particular scribes.) The problem with such readings is not simply that they are highly prone to further variation but that they are, more specifically, highly prone to reversal—often without any conscious intention on the part of the copyists. The vulnerability would therefore apply to both alternatives in the variation. Variations of this kind are the least likely to indicate either a true group or a whole group. On the other hand, if one has been able to establish a stemma on the basis of more reliable classes of variants, one would expect the V variations to show a predominant conformity to the genetic patterns.[25]

Accidental variants, i.e. those which are purely matters of spelling or punctuation, are so unpredictable in the pre-1800 period as to have little value as evidence for the descent of texts. Editors, as a rule, do not even bother to record them. It should not be overlooked, however, that variant spellings of proper names may sometimes be of value (a repeated glaring error in the spelling of a common name could be admissible as evidence of the constitution of a true group) and that a conformity in the distribution of variant spellings of the same words throughout a manuscript can provide persuasive evidence of relationship and is, indeed, one of the few reliable methods by which we can trace the descent of substantively invariant readings within a tradition and thus distinguish genealogical from purely textual affinities.[26]
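
To make the ranking usable in a sorting or filtering routine, each class can be given rough numeric reliability scores. The figures below are invented for illustration only—nothing in the article quantifies the ranking—but they preserve its ordering: E variants are the strongest evidence for a true group and among the weakest for a whole group, while V variants are the weakest for both:

```python
# Invented, illustrative scores on a 0-1 scale:
# (true-group reliability, whole-group reliability).
RELIABILITY = {
    "PIU": (0.90, 0.90),  # stable, indifferent, unobtrusive
    "PIO": (0.80, 0.60),  # obtrusive: at risk of memorial replacement
    "PGU": (0.50, 0.85),  # good readings travel into other branches
    "PGO": (0.40, 0.85),
    "SIU": (0.75, 0.50),  # suspicious readings invite emendation
    "SIO": (0.70, 0.45),
    "SBU": (0.70, 0.35),
    "SBO": (0.65, 0.30),
    "E":   (0.95, 0.20),  # obvious errors: strong true-group evidence
    "V":   (0.15, 0.15),  # vulnerable to unconscious reversal
}

def select_variants(classified, threshold=0.7):
    """Keep only variants whose class makes them tolerably reliable
    evidence of a true group; `classified` maps locus id -> class."""
    return [loc for loc, cls in classified.items()
            if RELIABILITY[cls][0] >= threshold]

print(select_variants({1: "E", 2: "V", 3: "PIU", 4: "PGO"}))
# [1, 3]
```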

III

The difficulties confronting the "bibliographical" editor who insists on trying to determine the relationship of manuscripts as opposed to texts have been eloquently summarised by Dearing:

Finally, the bibliographer, seeking to relate means (transmitters) genealogically, can never fully answer the antinomian objection that he may be wrong by rule. A record whose immediate independent descendants do not record different states of its text can vanish without a trace. Agreements can result from normal copying, from conflation, from emendation, from independent errors (as for instance making the same eyeskip twice), or just by accident. Who is to say what combination of circumstances produced the agreements or failed to produce the differences the bibliographer analyzes?
What check is there on his estimate of the most probable circumstances? The best reply he can make is that his procedure is rational and that the objection is fundamentally antirational and subversive of good order.[27]
There is much weight to these objections, as there was in the advice given to Columbus on the quayside that he must expect to endure hunger, thirst, hardship, mutiny, shipwreck, disorientation and an ever present risk of toppling over the edge of the world. But Columbus, being unfashionable enough to believe that truth might be sought through the study of the physically and historically actual as well as from speculative symmetries, proceeded with his voyage, and editors are, in my view, best equipped to deal with problems arising from irregular agreement when they consider them not merely as logical anomalies (though this is also their duty) but as a challenge to understand the operations of a particular human mind engaged in the act of copying at a particular moment in history. For, as Vieth has pointedly stated, "the bases of textual criticism are just sufficiently mathematical to tempt the critic to set up for a scientist, and just sufficiently nonmathematical to make this temptation an insidious trap."[28] The method of ranking variants which has just been proposed is designed to admit a bibliographical perspective into the preparatory stage of stemma-building by proposing criteria for selecting those variants most likely to have remained stable in transmission and for eliminating those which are likely to be unreliable as evidence for the descent of manuscripts. The criteria in their present form are admittedly crude and may in some instances still produce misleading data—which would be all the more misleading for its having been selected "by rule." They certainly do not relieve the editor of the need for an unceasing alertness towards all the possibilities inherent in all his evidence and the obligation to test all available techniques for making sense of it. 
It must also be conceded that the method can only be effective when variants are relatively abundant: there would be no point in restricting one's attention to a selection of the more "reliable" variations when this would not provide sufficient evidence to permit the full determination of the stemma.

The first step towards the construction of a stemma is to distinguish between those variants which stand at the conclusion of lines of relationship, and which are called terminal, and those which lie on lines or at the intersection of lines and which are known as intermediaries. Terminal texts are revealed by their possessing unique readings, which show in the form of what Greg called type-1 variations, e.g. ACDEF:B. The final determination of intermediary status rests on more subtle tests, but a text which contains no unique readings is at least a possible intermediary. There are, however, two situations in which the editor may be misled. The first is where a text is in fact terminal but does not reveal this through the possession of a unique reading. The second is where a text which is, genealogically speaking, an intermediary possesses unique readings which were not transmitted to its descendant or descendants and which therefore show as type-1 variations. A simple situation in which this might happen would be when a carelessly written manuscript introduced errors of an obvious kind, some of which were removed conjecturally by the scribe of a copy. It is therefore clear that readings of a kind that would be prone to such reversal (and especially those in categories E and V) are of lower value as evidence for either terminal or intermediary status than more inherently stable readings. The issues are simpler in this case than in the later stages of stemma construction, in that the editor is only concerned with the question of whether the scribe of one of the manuscripts containing the majority reading could have had the odd-manuscript-out in front of him as his exemplar but felt impelled to emend, either because of familiarity with the alternative reading or because the reading as it stood was obviously unsatisfactory. When anything in the nature of the minority variant suggests that such a reaction might have been possible (obtrusiveness in the first case, suspiciousness or outright badness in the second), the editor should be cautious about relying on its unsupported evidence in determining terminal status.
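The scan for type-1 variations is mechanical enough to sketch in code. The following is an illustrative formalisation only: the function name and the collation format (one dictionary per place of variation, mapping each witness siglum to its reading) are my own assumptions, not anything specified in the discussion above.

```python
from collections import Counter

def type1_witnesses(variations):
    """Return the sigla of witnesses carrying unique readings that
    show as type-1 variations (e.g. ACDEF:B), and which are
    therefore candidate terminal texts.  `variations` is a list of
    dicts, each mapping witness sigla to their readings at one
    place of variation."""
    candidates = set()
    for place in variations:
        counts = Counter(place.values())
        for siglum, reading in place.items():
            # a type-1 variation: this witness stands alone while
            # all the remaining witnesses agree among themselves
            if len(place) > 2 and counts[reading] == 1:
                others = {r for s, r in place.items() if s != siglum}
                if len(others) == 1:
                    candidates.add(siglum)
    return candidates
```

Texts flagged by such a scan remain subject to the two cautions just noted: a genuinely terminal text may lack unique readings, and an intermediary may show them.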

A simple way of handling this problem when considering a large number of sources is to give each unique reading a reliability rating on a scale from nought to two. Here 2 indicates a variant which, in the light of the criteria suggested earlier, would seem to be exceptionally reliable as an indication of terminal status; 1 a variant which is reasonably plausible and not too obtrusive, but which, like all unique or at least rare readings (for it is possible that the singleton variant may have existed in a number of now lost manuscripts), must have been in some measure of danger from memorial contamination; and 0 a variant that must be considered to have run a real danger of reversal at the hands of an alert scribe. The precise setting of these values will need to be a matter for the editor concerned; however, the aim should be a situation in which a single 2-class variant with support from one other of lower status, three 1-class variants, or five or more 0-class variants should provide acceptable evidence for terminal status, and any text with less support be reserved for reconsideration when evidence is available from the larger groupings.
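The suggested thresholds lend themselves to a simple mechanical check. The sketch below is a hypothetical formalisation only (the function name and data format are my own, not anything proposed above): each text's unique readings are supplied as a list of 0, 1 and 2 ratings, and the function reports whether they meet one of the three suggested standards of evidence for terminal status.

```python
def terminal_support(ratings):
    """Decide whether a text's unique readings give acceptable
    evidence of terminal status.  `ratings` holds one reliability
    rating (0, 1 or 2) per unique reading in the text.  The
    thresholds follow the scheme suggested above: a 2-class variant
    supported by at least one other reading, three 1-class
    variants, or five or more 0-class variants."""
    if ratings.count(2) >= 1 and len(ratings) >= 2:
        return True
    if ratings.count(1) >= 3:
        return True
    return ratings.count(0) >= 5
```

A text failing the test would, as suggested above, be held over for reconsideration once the larger groupings have been established.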

The next and most challenging stage in stemma-building is the determination of the distributional relationships implied by the type-2 and complex variants. Here the problem is to ensure that the variants chosen as the basis for analysis are, as far as can be determined by purely notional methods, those most likely to give an accurate indication of the underlying genetic groups.[29] To this end, the following procedure is suggested. To begin with, the editor works through each of his variant readings, studies it carefully in relationship to its context and then assigns it to one or another of the suggested classifications (PIU, SBU, E etc.). In doing so he is in the position of a bidder submitting sealed bids in advance of an auction at which he is also to be the auctioneer, and there may at a later stage be a strong temptation to change the terms of the bid in order to accommodate an emerging pattern of agreement.[30] This, however, would be to defeat one major advantage of the method, which is to remove the danger of rationalisation after the event by demanding an assessment of variants at a stage before an overall pattern of relationship has begun to suggest itself. So that the method can be used with as much integrity as possible, and so that evidence can be assembled to permit an assessment of its validity as a method, it is suggested that the judgments should be made first independently and then in consultation by at least two scholars, and that later reassessments should only be made with the agreement of both. It is, of course, possible at this stage that agreement over the precise status of a variant might not be reached. In particular, there are situations where a PIU-type variant, which would theoretically have the highest reliability ranking, may be hard to distinguish from a slightly firmer-than-normal V-type variant.
In this case, the variant could be given a double ranking and classified initially either at the higher or the lower ranking depending on whether variants were in short supply or not. If it was necessary to use it for the first stage of calculation, it should be watched carefully, and could be reclassified if it was observed to be setting up otherwise insoluble patterns.

In other cases, however, editors should be prepared to stand by their preliminary evaluation except in situations when a reclassification would be justified by clear considerations which had previously been overlooked, or when a process of reasoning back from an emerging stemma to prior classifications could be justified in Humphrey Palmer's terms as being spiral rather than merely circular.[31] Fundamental to the use of qualitative methods in textual criticism is the recognition of the point at which, in each particular case, they cease to yield valid answers. To attempt to apply them beyond that point is to devalue not only the editorial enterprise but also the methods. It must also be recognised that the use of such methods is not a substitute for the disciplined and intricate logical procedures described by theorists such as Palmer and Dearing (assuming, of course, that the editor is convinced of their validity as procedures) but a way of supplementing, sharpening, and, where necessary, correcting them.

The selection of which classes of variants to use in the construction of the hypothetical non-directional stemma will depend on the numbers of variations available within each class. The ideal situation would be one in which only PIU variants required to be considered; but, as it is also vital that the variants selected should permit as full a determination of the stemma as possible, it will often be necessary to content oneself with eliminating those classes most likely to contain positively misleading evidence about the composition of genetic whole groups, namely E and V (though E-variations will still, as explained earlier, provide a valuable control on assessments of the truth of groups). If a shortage of higher-ranking variants means that all except the lowest-ranking groups have to be included, the number of inconsistent agreements requiring to be located by formal means may be dauntingly high. However, if some formal technique of resolving inconsistencies, such as Dearing's 'ring-breaking' routine, is employed, it should be regarded as absolutely essential that its determinations should be checked at every stage against the qualitative judgements established during the initial process of classification, and, in cases where there is any conflict between the two, a fresh assessment of the contextual evidence undertaken. As an additional guide, I would recommend that every textual scholar should adorn his study with two historical illustrations: the first of the Charge of the Light Brigade, as a reminder of the fatal consequences of entry into the wrong branch of a multi-branched fork, and the second of the last stand of the Old Guard at Waterloo, as a testimony to the inadvisability of breaking rings without the very best of reasons. At the third stage of stemma-building, when directional evidence is required, it will of course be sought through the entire body of variations, bearing in mind that those classified as indifferent under the present terminology may still be directional.

The danger in eliminating variants from consideration because they are of a type that may lead to irregular agreement is that one can easily at the same time suppress variations containing valid and unique evidence of some particular aspect of the genetic relationship. In order to compensate for this, the establishment of a hypothetical stemma should be followed by a testing of its power to make sense of the whole body of variations, including those rejected from the original investigation because of their possible unreliability. A precise methodology for this cannot at present be proposed, but the general rule that should be borne in mind is that, if we have determined the actual pattern of descent, the predominant body of agreements should be in accordance with it and those that are not should be explicable by means of one or other of the principles considered earlier. The presence of a certain level of completely unaccountable variation in a "living" text (as Quentin has used the term)[32] has perhaps to be allowed for, at least among the lower-grade variants; but for a stemma proposed for a tradition of any complexity to receive assent, it must, at the very least, explain.
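The general rule of this final test can be given a rough formal shape. The sketch below is my own formalisation, offered under stated assumptions rather than as anything proposed above: the hypothetical stemma is treated as an unrooted (non-directional) tree over extant witnesses and inferential intermediaries, and a variational group is counted as in accordance with it when the subtree connecting the group's members does not overlap the subtree connecting the remaining extant witnesses.

```python
from itertools import combinations

def connecting_subtree(tree, nodes):
    """Node set of the minimal subtree of `tree` (an adjacency
    dict over an undirected tree) connecting all of `nodes`."""
    def path(a, b):
        # breadth-first search for the unique path from a to b
        prev, queue = {a: None}, [a]
        while queue:
            n = queue.pop(0)
            for m in tree[n]:
                if m not in prev:
                    prev[m] = n
                    queue.append(m)
        out, n = set(), b
        while n is not None:
            out.add(n)
            n = prev[n]
        return out

    result = set(nodes)
    for a, b in combinations(nodes, 2):
        result |= path(a, b)
    return result

def accords_with_stemma(tree, group, extant):
    """True when the variational `group` could be a genetic group
    under the proposed stemma: its connecting subtree must not
    overlap that of the remaining extant witnesses."""
    group = set(group)
    rest = set(extant) - group
    return not (connecting_subtree(tree, group)
                & connecting_subtree(tree, rest))
```

On a stemma in which A and B descend from an inferred intermediary X, and C and D from another intermediary Y joined to X, the group AB accords with the tree while the "crossed" group AC does not; agreements of the latter kind would have to be explained by one or other of the principles considered earlier.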

The assumption behind the present study has been that in dealing with moderately contaminated traditions it should be possible to determine bibliographical genealogies by applying Greg's methods of analysis to variants selected on a prima facie basis as unlikely by their nature to be involved in irregular agreements. The criteria proposed are in some ways a refinement on Greg's own words of advice for dealing with what he terms "correctional conflation":

What usually happens is that collation and "correction" are confined to some of the more striking variants. This will show itself on analysis either by the sporadic appearance of anomalous groupings, or by those involving the more important variants consistently pointing in one direction, and those involving the minor variants consistently in another. . . . It may be added that, where conflation is suspected, the value of variants as an indication of ancestry is in inverse proportion to their intrinsic importance. To the herd of dull commonplace readings we must look for the genetic source of the text, to the more interesting and striking for the source of the contamination (p. 57).
It is necessary to bear in mind, however, that the efficacy of the method will decline as the number of irregular agreements rises and that, for the radically conflated traditions with which Dearing and Okken are principally concerned, such a stemma, even if determinable, might not be of much help in restoring the readings of the archetype.

Notes

 
[1]

Examples of critical editions of seventeenth-century English poets drawing on extensive collations of manuscript sources are the Oxford English Texts editions of Donne, edited by Helen Gardner and Wesley Milgate, and of Suckling, edited by Thomas Clayton and L. A. Beaurline, and Poems on Affairs of State: Augustan Satirical Verse, 1660-1714, gen. ed. George de Forest Lord, 7 vols. (1963-75). The Complete Poems of John Wilmot Earl of Rochester, ed. David M. Vieth (1968), though compiled on critical principles, is without an apparatus criticus. For a detailed account with supporting evidence of an experienced editor's conclusions about the problems of one such text, see L. A. Beaurline, "An Editorial Experiment: Suckling's A Sessions of the Poets," Studies in Bibliography, 16 (1963), 43-60.

[2]

For the circumstances of transmission of classical literature, see L. D. Reynolds and N. G. Wilson, Scribes and Scholars: A Guide to the Transmission of Greek and Latin Literature (2nd ed., 1974) and E. J. Kenny, The Classical Text (1974). Lambertus Okken in his 1970 Rijksuniversiteit te Utrecht doctoral thesis, Ein Beitrag zur Entwirrung einer kontaminierten Manuskripttradition: Studien zur Überlieferung von Hartmanns von Aue "Iwein," p. 8, points out that Karl Lachmann, the founder of modern textual criticism, never in fact published a stemma and inquires "Hatte der grosse Philologe im Umgang mit kontaminierten Traditionen, etwa mit den Überlieferungen von Wolframs von Eschenbach Werken, gelernt, daß die Kenntnis der Handschriftengenealogie gewöhnlich keinen praktischen Wert hat?" (Had the great philologist learned, in dealing with contaminated traditions, such as the transmissions of Wolfram von Eschenbach's works, that knowledge of manuscript genealogy usually has no practical value?)

[3]

An exception to the common dialect would be the University of Edinburgh Library MS. DC. 1.3 of poems by Rochester and his contemporaries, the text of which contains a detectable infusion of Scotticisms.

[4]

Much information on this topic will be found scattered through David M. Vieth, Attribution in Restoration Poetry: A Study of Rochester's "Poems" of 1680 (1963). For an exceptionally revealing account of the genesis of a particular group of manuscripts, see W. J. Cameron's "A Late Seventeenth-Century Scriptorium," Renaissance and Modern Studies, 7 (1963), 25-52 and "Transmission of the Texts" in Poems on Affairs of State, V, 528-540.

[5]

In his Attribution in Restoration Poetry, his edition, with Bror Danielsson, of The Gyldenstolpe Miscellany of Poems by John Wilmot, Earl of Rochester, and other Restoration Authors (Stockholm: Almqvist and Wiksell, 1967), and his edition of the Complete Poems.

[6]

See Complete Poems, pp. xlvi-lii. Vieth prefaces the account of his procedures with the dry remark: "Some aspects of textual criticism raise surprisingly philosophical questions, in this instance whether the universe (not to mention the human mind) is fundamentally rational" (p. xlvii). The genealogical, historical and philological methods are employed to create a "tentative reconstructed text" of the poem. The early text "having the least departures from the tentative text" is then chosen as copy-text and its readings accepted "unless there is substantial reason to substitute a reading from other texts" (pp. l-li). In practice, Vieth appears to have let a number of minor substantive readings stand which are unlikely to have been those of the archetype.

[7]

As this enterprise is still in progress, I can not guarantee that my conclusions will not be identical with Vieth's.

[8]

See Vinton A. Dearing, Principles and Practice of Textual Analysis (1974), pp. 14-20 and passim. Dearing holds that the establishment of the genealogy of "transmitters" (e.g. manuscripts) is the province of bibliography, not textual analysis which is described as "a logic engine, like a computer" (p. 58). "Carrier" is to be preferred to Dearing's "transmitter" as terminal texts are receivers only and do not transmit.

[9]

It is possible in some instances to establish relationships between texts on the basis of accidental features, such as the distribution of variant spellings, or physical features such as line-lengths or the number of lines per page. For a summary of Cameron's important discoveries concerning scribal accidentals, see Poems on Affairs of State, V, 529.

[10]

Some of these are spelled out by Humphrey Palmer in The Logic of Gospel Criticism (1968), pp. 93-94. The additional point should perhaps be made that, even in a strictly "bibliographical" stemma, the inferential intermediaries remain inferential, that is to say logical abstractions, and cannot be relied upon to correspond in their assumed readings to any single historically existing manuscript.

[11]

The editorial heritage of Lachmann encouraged editors to fuse the two processes of the assessment of direction and the linking-up of the stemma; however editors of English literary texts since the time of Greg have agreed in not proceeding to consider directional evidence until the linkages between texts have been established on a purely distributional, non-directional basis. The present study, while arguing that contextual as well as purely formal arguments should be admissible at every stage of textual reasoning, accepts the validity of the two-stage model of stemma-building. For objections to the Lachmannian method, which is still accepted on the authority of Maas by many present-day classicists and mediaevalists, see Dearing, pp. 5-11, 15-16 and 54-56 and Palmer, pp. 76-80. Palmer concludes (p. 92) that Lachmann's method "is quicker—when it works!" but that it depends "on finding errors certain to the critic and incorrigible by copyists"—a consummation more often devoutly wished than practically experienced. For a defence of the method against its critics, see Kenny, p. 137.

[12]

Walter W. Greg (1927). Dearing's much more detailed study refines in many valuable particulars on Greg's techniques, definitions and terminologies but should not be used without reference to Michael Weitzman's review in Vetus Testamentum, 27 (1977), 225-235. I would like to thank Dr. Weitzman for pointing out an error in the reasoning of an earlier version of the present paper.

[13]

See Dearing, pp. 215-236 and Dom J. Froger, La Critique des Textes et son Automatisation (Paris: Dunod, 1968). Earlier work by Dearing is discussed briefly in my "Computers and Literary Editing: Achievements and Prospects" in The Computer in Literary and Linguistic Research, ed. R. A. Wisbey (1971), pp. 47-56. Gunter Kochendörfer and Bernd Schirok, Maschinelle Textrekonstruction, Göppinger Arbeiten zur Germanistik nr. 185 (Göppingen: Alfred Kümmerle, 1976) includes a valuable bibliography of work in a number of languages relating to this field (pp. 176-179).

[14]

As the aim of textual analysis is to restore the readings of the archetype, formal systems and algorithms claiming to do this should be tested on texts chosen at random from a tradition created by supervised copying in as close as devisable an approximation to the conditions experienced by professional scribes of the past and with the solution withheld from the experimenter until after he has submitted his conclusions. If such a system can not produce a correct solution to texts whose descent can be verified, it can hardly be trusted to produce correct results for historical traditions where the conclusions can not be verified.

[15]

In the Lachmannian tradition, the "favoured few" are readings regarded as having clear directional implications but not, in the judgment of the editor, being readily corrigible by copyists of the period concerned. These are in turn divided into (1) separative errors, which indicate that the manuscript containing them can not be the ancestor of one in which they are correct (cf. Maas, pp. 42-47), and (2) conjunctive errors, defined by Palmer as "common to two manuscripts but not a third, and such that the reading of the third could not be due to correction by the copyist" (p. 243).

[16]

A summary of the more common kinds of scribal error will be found in Reynolds and Wilson, pp. 200-212. For a more systematic treatment, see James Willis, Latin Textual Criticism (1972), pp. 47-188. Weitzman (p. 226) draws attention to important material to be found in J. Stoll, "Zur Psychologie der Schreibfehler," Fortschritte der Psychologie, 2 (1913), pp. 1-133.

[17]

Cameron's experience of this kind of variation in his "scriptorium" manuscripts led him to suggest that "all texts are in fact conflated texts—a conflation of the exemplar and a structure of linguistic expectations that is present in the mind of the scribe" (Poems on Affairs of State, V, 529).

[18]

By a "whole group" I understand a whole "true" group as defined below. Greg's use of "true" to describe variational groups containing more than one member seems to proceed from nothing more than an aesthetic disinclination to speak of a "group" with only one member (cf. Dearing, p. 10) and should be disregarded.

[19]

Cf. Maas, p. 8: "obvious corruptions, particularly lacunae, may easily be transmitted in the direct line but are hardly ever transferred by contamination." Against this, however, must be placed Greg's principle (p. 20 n.) that "the easier it is to explain how an error arose, the less valid the assumption that it only arose once."

[20]

Assuming, of course, that the distribution of a variant among surviving sources corresponds in a general way with the distribution of the variant in the whole body of sources available at the time the scribe made the copy. This may not always have been the case: a Bowdlerised version of a satire may, for instance, have stood a better chance of survival than an obscene one. If a text was available in printed form, each copy in circulation would need to be counted as a separate source for purposes of such a calculation.

[21]

The editor should consider the possibility whether the overlapping of two or more possibly incomplete groups defined by gross and easily corrigible errors might not indicate a fractured whole group.

[22]

Dearing, p. 57, revising Greg's terminology. Constantine Kooznetzoff, on the other hand, though he does not assert this as a general principle, finds in his "A Genealogical Analysis of the 'Tristan' Fragment, Ms. 2280," AUMLA, 54 (November 1980), 194, that the type-2 variants "providing they are consistent, afford all the evidence necessary for deducing manuscript relationships." The complex variants are declared "genetically non-evidential" on the grounds that they contain "conscious scribal emendation" (p. 197).

[23]

A test for obtrusive readings in a short poem would be to find a pretext to make someone transcribe it several times, and then, after a few days interval, ask him to write down as much as he could still remember. The readings correctly given would be the "obtrusive" ones.

[24]

The only at all likely situations where such a variant would not indicate a true group would be (1) when it had been called on to fill a lacuna in a majority-group manuscript by a scribe with too imperfect a knowledge of the language or content of what he was copying to realise its uselessness or who felt that any reading was better than none (2) when the identical gross error was made during separate copyings by the same scribe (a quite conceivable happening in a scriptorium situation such as that described by Cameron) or by separate scribes (rather less likely unless some aspect of the original reading or a particular exemplar had created an exceptionally high risk of error).

[25]

These points are discussed in greater detail in my "The Text of 'Timon. A Satyr'," Bulletin of the Bibliographical Society of Australia and New Zealand, 6 (1982), 113-140.

[26]

For an example of the application of this technique, see my "The Texts of Southerne's The Spartan Dame," Bulletin of the Bibliographical Society of Australia and New Zealand, 1 (1970-1), 54-59.

[27]

Dearing, p. 19.

[28]

Review of Dearing's Manual of Textual Analysis, JEGP, 59 (1960), 555. Housman in "The Application of Thought to Textual Criticism" sees the subject matter of the discipline as "not rigid and constant, like lines and numbers, but fluid and variable; namely the frailties and aberrations of the human mind, and of its insubordinate servants, the human fingers" (Selected Prose, ed. John Carter [1961], p. 132.) Dearing, p. ix contains a brief retort to Housman.

[29]

Cf. Greg, p. 13.

[30]

Dearing warns rightly (p. 55) that if the textual analyst "lets the form of his family tree influence his analysis of the directional variation, he reasons in a circle."

[31]

Palmer, pp. 51-52, 80. The problems of circular reasoning in textual criticism are similar in their nature to that of the "hermeneutic circle" in critical interpretation as discussed by a number of contributors to the Autumn 1978 issue (X. i) of New Literary History. Palmer's metaphor with its attractively Yeatsian overtones hardly offers a procedural solution, but indicates that the methodological plight of historical scholarship is not as desperate as it is sometimes made out to be.

[32]

Henri Quentin, Essais de Critique Textuelle (Paris, 1926), p. 43, quoted in translation by Palmer, p. 80.