The Ranking of Variants in the Analysis of Moderately Contaminated Manuscript Traditions
by
Harold Love
Editors of post-mediaeval English literature are not very often required, in the absence of an authorial holograph or authoritative edition, to reconstruct texts from a profusion of non-authorial copyings. When they do, as in the instances of Donne, Rochester and the authors of late seventeenth-century libertine verse and "state poems," they will naturally try to adapt the methods developed by classical and biblical scholars to deal with the problems that arise, but will not always find them as clarified or thought-through as would be desirable.[1] One reason for this may well be that editors of classical texts, dealing as a rule with thoroughly contaminated textual traditions which may, in any case, lead only to the conjectures of some Byzantine or Hellenistic editor or to an archetype that was already seriously deficient, have rightly placed their main emphasis on the reconstruction of the author's text through philological and historical means—reserving the genealogical method for situations where its conclusions would be more or less open and shut.[2] Biblical scholarship faces much
Yet the problems clearly remain considerable. David Vieth, the first scholar to attempt to create a text of Rochester from the full range of the surviving sources (which he himself had been responsible for charting)[5], was, according to his own account, able to make only limited use of the genealogical method and relied in the establishment of his text on an adaptation of the "best manuscript" theory.[6] The present article
The approach in general involves an insistence on the historicity of the act of transmission which has been unfashionable in recent years, and which departs, perhaps unwisely, from the principle argued for by Vinton Dearing that the concern of the editor should be with the text considered as a logical construct rather than with the manuscript as carrier of the readings.[8] The distinction in itself is an important one which may be grasped most easily if we consider a manuscript which is a copy of a source, A, being corrected from a manuscript descended from a source, B, belonging to a different branch of the textual tradition, so as to remove all the substantive readings which were characteristic of A. If we assume that the deleted readings are rendered illegible, it is fair to say that the text of that manuscript, in so far as this is defined by substantives alone,[9] now belongs to the branch of the tradition represented by B, although it remains a historical fact that the majority of the words inscribed on its pages—those in which it did not vary from either A or B—had descended by a physical act of copying from A. When editors construct a stemma, they will normally see their aim as that of producing a diagrammatic representation of the logical relationships of the groupings of variant readings that characterise the sources, not a family tree of copyings which would show the paths by which the majority of readings in each instance, whether variant or invariant, were actually transmitted from page to page. So, to return to the earlier
There is, of course, no objection to this procedure as long as there is no confusion in the minds of editors or their readers over what is taking place. Indeed, it is often much more serviceable to the inquiry in hand than a strict family tree of sources would be. But dangers can nonetheless arise if editors allow themselves to lose sight completely of the nexus between the text and its carrier.[10] If the study of variation is, from one point of view, an exercise in the manipulation of symbols according to the principles of mathematical logic, it is equally the study of a certain kind of human behaviour—that of the scribe, professional or amateur, in the act of copying. The importance of habituating oneself to think in terms of the genealogies of manuscripts as well as of the class-relationships of texts is simply that the significance as evidence of a variant in any kind of textual inquiry must lie ultimately in the processes of thought (or non-thought) which have led to its coming into existence, its further transmission and possibly its disappearance. Modern English-language editorial practice admits the validity of such criteria, but insists that they should be reserved until the stage when the editor has established the primary pattern of relationships between his texts and begins the search for evidence of the direction of variation in order to locate the position of the ancestor.[11] My own contention is that the consideration of scribal motivation is relevant to every stage of textual reasoning and that to ignore it is to assent to the proposition that all cases of variation are of equal value as evidence for the relationship of texts—a proposition which may, indeed, be an unavoidable rule of
II
The editor encountering a tradition of the kind described must begin by finding solutions to two pressing problems of technique. In the first place, although the logic of textual analysis, as presented in handy form in Greg's The Calculus of Variants,[12] is crystal clear, the task of applying that logic to a manuscript tradition of any complexity, and containing even a moderate number of irregular agreements, can involve a daunting amount of calculation and record-keeping. Dearing, Froger and others have devised computer-based methods for performing the purely mechanical part of the task—though the problem still remains of ensuring that the data to be analysed is put into machine-readable form without distortion.[13] The second problem is that, as we have just seen, the variant readings which are the raw materials of logical analysis will not themselves necessarily progress through the tradition in logical ways. An error which would otherwise have been characteristic of a particular branch of the tradition might be corrected or (less fatally) miscorrected by a scribe. The same variant may arise independently in two different parts of the tradition. Readings characteristic of one branch may be imported into another as the result of memorial contamination or, in extreme cases, of an editing process in which readings from exemplars representing two or more separate lines of descent are combined eclectically into a single "conflated" text. Any of these happenings is likely to lead to an incorrect judgement about the place of manuscripts within the tradition as a whole or the number of intermediaries intervening between one manuscript and another. Seeing that a significant percentage of anomalous agreements seems to be present in every "real life" textual situation, the editor must be prepared to find this the most testing and frustrating stage of his labours.
Whereas problems of scale and volume can be dealt with satisfactorily by the computer, it has yet to be shown conclusively (which is to say empirically) that purely quantitative methods can deal with the problem of irregular agreement.[14] Scholars faced with large and intractable textual traditions have naturally wished to devise methods by which vast bodies of data could be presented simply as a list of groupings of sigla and the task of reconciling these with each other and ascertaining the status of any anomalies performed entirely by a computer program. An algorithm described by Dearing in his Principles and Practice of Textual Analysis is the most elaborate attempt of this kind yet to appear in print, though others are known to have been attempted privately. While it is easy to scoff at such ambitions as a twentieth-century counterpart of squaring the circle (and there are certainly enormous temptations to recursive and tautological reasoning) the fact remains that it is only by such means that any hopes can be entertained of reaching a solution to the textual histories of the New Testament, the major church fathers, and such widely copied poets as Dante, Virgil and Chaucer. The drawback of such enterprises, however, is that the editor has to work with one arm tied behind his back. Large-scale computer-based data sorts allow no opportunity for the detailed ranking and precoding of the data. Moreover, once a variation has been reduced to a string of symbols divided by group delimiters it has to be regarded as possessing exactly the same evidential value as any other variation with the result that cases of irregular agreement must be reconciled by quantitative not qualitative means. Quantitative means should not be sneered at simply because they are quantitative.
The powerful software packages used by social scientists for the statistical analysis of questionnaire and interview data have a considerable, and so far largely untested, value for textual scholars. But they cannot give better than a statistically plausible answer to the central questions—what is the location of the ancestor and how are the other texts derived from it—and there are many situations in which this will be a disastrously wrong answer.
Older approaches to textual criticism, partly from necessity but partly also from intellectual preference, took a view that was exactly the other way round. Accepting that variations differ greatly in their reliability as evidence for the family relationships of sources, they exercised
It will be necessary before proceeding to present a more detailed summary of the circumstances under which variation may arise during the copying of texts. Least under the control of the scribe is physical damage to the exemplar that removes words or makes them illegible. Variation may also arise from inadequacies in the script of the exemplar by which one word may be read as another or provoke a desperate emendation. The extent to which this takes place will be influenced by the scribe's ability as a decipherer of difficult hands and, perhaps more vitally, by his preparedness to devote time and care to the analysis of letter forms. A third class of changes may be described as misreadings due to incorrect perception on the part of the scribe or a failure to proofread the transcript against the exemplar. This would include eyeskips between two occurrences of a word, dittography, reversals of order, and errors arising from incorrect anticipation—all these categories of error having been carefully anatomised by classical scholars.[16] Of a slightly different nature are substitutions, whether voluntary or involuntary, of the scribe's habits of grammar and expression for those of the original.[17] These would not always be seen by the scribe as comprising errors:
Bearing these things in mind, let us consider the position of an editor faced with a tradition containing seven manuscripts, cited under the sigla A to G, who encounters a variation in the form ABDE:CFG, or as it would be written in Greg's notation—Σ:CFG. These two groups should ideally represent (1) an independent and self-contained subfamily within the overall tradition, having its own exclusive common ancestor, and (2) a residual body of texts descended without variation from the archetype—which is which need not for the moment concern us. This ancestor of the sub-family, in turn, may either be one of the manuscripts listed or a lost intermediary (further patterns of variation would need to be consulted in order to determine this). And yet the assumption that the two groups of sigla are genetically significant is not one that the editor is able to make on the basis of a single instance, or even, in some cases, on the basis of a repeated occurrence of the same variation. Before he can judge the value of the variation and the kind of weight that can be placed on it as evidence, he will need to consider two further questions: whether each of the groups indicated is likely to be a true group in genealogical terms and whether it is likely to be a whole group.[18] It will be self-evident that such questions cannot be answered from a consideration of symbols alone: they require a close consideration of the actual variant in its context of meaning, supplemented by a sound understanding of the habits of scribes.
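Purely as an illustration of how such a variation might be recorded for mechanical sorting, the partition just described can be rendered in Greg's notation by a small routine (the function name and representation are hypothetical, not part of any existing program):

```python
def format_variation(all_sigla, variant_group):
    """Render a type-2 variation in Greg's notation, e.g. 'Σ:CFG'."""
    residual = [s for s in all_sigla if s not in variant_group]
    minority = "".join(sorted(variant_group))
    # By convention Σ (sigma) abbreviates the larger, residual group.
    if len(residual) > len(variant_group):
        return "Σ:" + minority
    return minority + ":Σ"

sigla = list("ABCDEFG")
print(format_variation(sigla, {"C", "F", "G"}))  # Σ:CFG
```

A representation of this kind keeps the full membership of both groups available to the program while presenting the editor with the compact notation used in the discussion above.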
The idea of a "whole" group is necessary to allow for the fact that some variants are inherently less stable than others. Restricting ourselves
In discussing the related notion of a "true group" it is necessary to remind ourselves again that our concern for the moment is only with individual variants considered in isolation. Whether or not any given agreement, say CFG, indicates a whole group, that group will be "true" as long as each of its members genuinely belongs to the same sub-family as defined by succession of copying from the source in which the variant originated or from the archetype. However, in cases where an identical variant was introduced spontaneously and independently by scribes working from different exemplars or had arisen from contamination, the receiver group would be a false one in that one of its members was not a descendant of the text in which the variant originated. In this last case the variation would be a false group that contained a whole group though it is easily possible to imagine circumstances that would lead to the genesis of groups that were false without containing a whole group.
A third distinction that needs to be made is that between variants that are enabled by their nature to extend beyond their genealogical bounds and those which are not enabled. If a scribe should feel dissatisfaction with a reading in his exemplar and consult either another manuscript or his memory for an alternative, he will obviously be ready to accept any good, or at least plausible, alternative he might encounter. On the other hand, if the alternative was itself of a dubious nature,
The following list offers a ranking of classes of variant according to the likelihood of their providing reliable evidence of "bibliographical" descent. It must be understood that some of the assumptions made rest on no firmer foundation than common sense, and that common sense is sometimes contradicted by what emerges from the patient analysis of large bodies of empirical data. It is to be hoped therefore that the validity of these assumptions can be tested for each of the major historical manuscript traditions by a statistical study of the causes of variation in situations where it is still possible to compare a copy with its exemplar, and equally that the activity of transcription can be studied under laboratory conditions using the methodology of present-day experimental psychology. A further limitation is that it assumes that scribes, whether characterised as "alert" or "careless," will behave in a reasonably consistent way and come to reasonably rational conclusions about the problems posed by their exemplars, an assumption which is hard to reconcile with the many peculiar situations encountered in the sources. Nonetheless, the criteria listed are offered as the guides which in the present state of our knowledge, and on the basis of my own experience as an editor of seventeenth-century texts, seem most likely to prove fruitful. As earlier, only Greg's type-2 variations are admitted for formal consideration though the method obviously has implications for the evaluation both of type-1 variations and of complex variations (Dearing's types 3 and 4).[22]
The basic distinctions that require to be made are as follows:
(1) whether a reading encountered by a scribe in his exemplar would appear to him to be plausible (P), suspicious (S) or an obvious error (E);
(2) whether, in the event of a scribe having knowledge of the alternative, a variant would be likely to be regarded as good by comparison (G), indifferent (I), or obviously inferior, i.e. bad (B);
(3) whether a variant can be described as obtrusive (O), in the sense of being prominent and likely to draw attention to itself, or unobtrusive (U);
(4) whether a variant might be considered vulnerable to reversal or further alteration under the influence of scribal linguistic and spelling preferences or palaeographical factors (V).
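The four distinctions above could be encoded, again purely as an illustrative sketch with hypothetical names, as a composite class label of the kind employed below:

```python
from dataclasses import dataclass

@dataclass
class VariantClass:
    plausibility: str       # "P", "S" or "E"
    comparison: str = ""    # "G", "I" or "B" (not recorded for E or V classes)
    obtrusiveness: str = "" # "O" or "U"
    vulnerable: bool = False

    def label(self) -> str:
        """Compose the class label, e.g. 'PIU', 'SBO', 'E' or 'V'."""
        if self.vulnerable:
            return "V"
        if self.plausibility == "E":
            return "E"
        return self.plausibility + self.comparison + self.obtrusiveness

print(VariantClass("P", "I", "U").label())  # PIU
print(VariantClass("E").label())            # E
```

Precoding each variant as such a label would allow the later sorting stages to filter or weight variations mechanically while preserving the editor's qualitative judgement.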
Using this terminology, we can describe the following classes of variant and/or variation, each possessing a different degree of trustworthiness as an indicator of genetic relationships.
PIU
A clear substantive variant which is plausible in its context and to all appearances could have been intended by the author, which is indifferent by comparison with the alternative reading, yet which is not obtrusive enough to encourage memorial contamination from the alternative in cases where that was better represented and therefore more familiar. A variant of this kind is likely to indicate both a whole group and a true group.
PIO
A clear plausible substantive variant not noticeably better than the alternative, but one that is obtrusive enough in the poem to have run a definite risk of memorial replacement. Criteria of obtrusiveness would need to be derived afresh for every new work or tradition studied. In a religious text, for instance, it might be the theologically charged word that required to be so defined whereas in verse the position of the word in the stanza might be relevant.[23] Consideration would also have to be given to the likelihood of the scribe having encountered the work being copied on a previous occasion or of having access to a second source: with the Rochester manuscripts, I believe that possibility to have been quite a high one, especially where the professional writers of the large miscellanies were concerned. Variants of this kind are still likely to indicate both a true and a whole group, but not so reliably as those of the previous class. If the variant is relatively infrequent by comparison with the alternative, and there is no reason for believing that this was not the case in the tradition as a whole (including non-surviving as well as surviving sources), the likelihood of its representing a whole group would be correspondingly diminished but the likelihood of its representing a true group correspondingly enhanced.
PGU, PGO
A clear plausible substantive variant but clearly better than the alternative. Readings so classified—and especially the obtrusive subclass—would be likely to travel into texts belonging genealogically to
SIU, SIO, SBU, SBO
Situations where the variant reading is suspicious but not an obvious error. A careless scribe would simply copy the reading, but an alert one might be tempted to emend or to consult another source. Such variants would be quite likely to define a true group, as they would be unlikely to travel to other branches of the tradition, but could not be relied on to define a whole group, especially when the alternative was obviously better.
E
Glaring and obvious errors that only a careless or desperate scribe would be prepared to let pass and that would never be called on to repair a suspect reading in another text. Where there is no danger of independent error, such variants would probably be the most reliable of all as indicators of a true group but would have little reliability as indicators of a whole group.[24]
V
A variant which, while still characterisable as substantive, would be vulnerable to change either as the result of an individual copyist's linguistic or spelling preferences or for palaeographical reasons. Variants in the first category would be likely to lie behind many forms of purely dialectical variation, many variations in grammatical function words, most variations in which alterations to meaning are brought about by variations in punctuation or small changes to spelling, and most variations in the wording of material that might be regarded as ancillary to the text proper such as titles, inscriptions and stage directions. The second category would include all variations involving words such as "the," "that," "which," and "when," which were frequently written as contractions, and most variations between singular and plural forms, many of which appear to arise from the misreading
Accidental variants, i.e. those which are purely matters of spelling or punctuation, are so unpredictable in the pre-1800 period as to have little value as evidence for the descent of texts. Editors, as a rule, do not even bother to record them. It should not be overlooked, however, that variant spellings of proper names may sometimes be of value (a repeated glaring error in the spelling of a common name could be admissible as evidence of the constitution of a true group) and that a conformity in the distribution of variant spellings of the same words throughout a manuscript can provide persuasive evidence of relationship and is, indeed, one of the few reliable methods by which we can trace the descent of substantively invariant readings within a tradition and thus distinguish genealogical from purely textual affinities.[26]
III
The difficulties confronting the "bibliographical" editor who insists on trying to determine the relationship of manuscripts as opposed to texts have been eloquently summarised by Dearing:
The first step towards the construction of a stemma is to distinguish between those variants which stand at the conclusion of lines of relationship, and which are called terminal, and those which lie on lines or at
A simple way of handling this problem when considering a large number of sources is to give each unique reading a reliability rating on a scale from nought to two, where 2 indicates a variant which, in the light of the criteria suggested earlier, would seem to be exceptionally reliable as an indication of terminal status, 1 a variant which is reasonably plausible and not too obtrusive, but no doubt, like all unique or at least rare readings (for it is possible that the singleton variant may have existed in a number of now lost manuscripts), must have been in some measure of danger from memorial contamination, and 0 a variant that must be considered to have run a real danger of reversal at the hands of an alert scribe. The precise setting of these values will need to be a matter for the editor concerned; however the aim should be a situation in which a single 2-class variant with support from one other of lower status, three 1-class variants or five or more 0-class variants should provide acceptable evidence for terminal status and any text
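The suggested threshold rule might be sketched as follows (the exact settings being, as stated above, a matter for the individual editor; the function and its thresholds are offered only as one possible reading of the rule):

```python
def is_terminal(ratings):
    """Acceptable evidence of terminal status under the suggested rule.

    ratings: the 0/1/2 reliability scores assigned to the unique
    readings of a single witness.
    """
    twos = ratings.count(2)
    ones = ratings.count(1)
    zeros = ratings.count(0)
    # A single 2-class variant with support from at least one other
    # variant of lower (or equal) status.
    if twos >= 1 and (ones + zeros >= 1 or twos >= 2):
        return True
    # Otherwise: three 1-class variants, or five or more 0-class variants.
    return ones >= 3 or zeros >= 5

print(is_terminal([2, 0]))           # True
print(is_terminal([2]))              # False: a lone 2-class variant lacks support
print(is_terminal([0, 0, 0, 0, 0]))  # True
```

Note that an unsupported 2-class variant is treated here as insufficient, in keeping with the requirement of "support from one other of lower status."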
The next and most challenging stage in stemma-building is the determination of the distributional relationships implied by the type-2 and complex variants. Here the problem is to ensure that the variants chosen as the basis for analysis are, as far as can be determined by purely notional methods, those most likely to give an accurate indication of the underlying genetic groups.[29] In order to ensure this, the following procedure is suggested. To begin with, the editor works through each of his variant readings, studies it carefully in relationship to its context and then assigns it to one or another of the suggested classifications (PIO, SBU, E etc.). In doing so he is in the position of a bidder submitting sealed bids in advance of an auction at which he is also to be the auctioneer, and there may at a later stage be a strong temptation to change the terms of the bid in order to accommodate an emerging pattern of agreement.[30] This, however, would be to defeat one major advantage of the method which is to remove the danger of rationalisation after the event by demanding an assessment of variants at a stage before an overall pattern of relationship has begun to suggest itself. So that the method can be used with as much integrity as possible, and so that evidence can be assembled to permit an assessment of its validity as a method, it is suggested that the judgments should be made first independently and then in consultation by at least two scholars, and that later reassessments should only be made with the agreement of both. It is, of course, possible at this stage that agreement over the precise status of a variant might not be possible. In particular, there are situations where a PIU-type variant, which would theoretically have the highest reliability ranking, may be hard to distinguish from a slightly firmer-than-normal V-type variant. 
In this case, the variant could be given a double ranking and classified initially either at the higher or the lower ranking depending on whether variants were in short supply or not. If it was necessary to use it for the first stage of calculation, it should be watched carefully, and could be reclassified if it was observed to be setting up otherwise insoluble patterns.
In other cases, however, editors should be prepared to stand by their preliminary evaluation except in situations when a reclassification would be justified by clear considerations which had previously been overlooked, or when a process of reasoning back from an emerging stemma to prior classifications could be justified in Humphrey Palmer's
The selection of which classes of variants to use in the construction of the hypothetical non-directional stemma will depend on the numbers of variations available within each class. The ideal situation would be one in which PIU variants only required to be considered; but, as it is also vital that the variants selected should permit as full as possible a determination of the stemma, it will often be necessary to content oneself with eliminating those classes most likely to contain positively misleading evidence about the composition of genetic whole groups, namely E and V (though E-variations will still, as explained earlier, provide a valuable control on assessments of the truth of groups). If a shortage of higher-ranking variants means that all except the lowest-ranking groups have to be included, the number of inconsistent agreements requiring to be located by formal means may be dauntingly high. However, if some formal technique of resolving inconsistencies, such as Dearing's 'ring-breaking' routine, is employed, it should be regarded as absolutely essential that its determinations should be checked at every stage against the qualitative judgements established during the initial process of classification, and, in cases where there was any conflict between the two, a fresh assessment of the contextual evidence undertaken. As an additional guide, I would recommend that every textual scholar should adorn his study with two historical illustrations—the first of the Charge of the Light Brigade as a reminder of the fatal consequences of entry into the wrong branch of a multi-branched fork, and the second of the last stand of the Old Guard at Waterloo as a testimony to the inadvisability of breaking rings without the very best of reasons. At the third stage of stemma-building, when directional evidence is required, it will of course be sought for through the entire body
The danger in eliminating variants from consideration because they are of a type that may lead to irregular agreement is that one can easily at the same time suppress variations containing valid and unique evidence of some particular aspect of the genetic relationship. In order to compensate for this, the establishment of a hypothetical stemma should be followed by a testing of its power to make sense of the whole body of variations, including those rejected from the original investigation because of their possible unreliability. A precise methodology for this cannot at present be proposed, but the general rule that should be borne in mind is that, if we have determined the actual pattern of descent, the predominant body of agreements should be in accordance with it and those that are not should be explicable by means of one or other of the principles considered earlier. The presence of a certain level of completely unaccountable variation in a "living" text (as Quentin has used the term)[32] has perhaps to be allowed for, at least among the lower-grade variants; but for a stemma proposed for a tradition of any complexity to receive assent, it must, at the very least, explain.
The assumption behind the present study has been that in dealing with moderately contaminated traditions it should be possible to determine bibliographical genealogies by applying Greg's methods of analysis to variants selected on a prima facie basis as unlikely by their nature to be involved in irregular agreements. The criteria proposed are in some ways a refinement on Greg's own words of advice for dealing with what he terms "correctional conflation":
Notes
Examples of critical editions of seventeenth-century English poets drawing on extensive collations of manuscript sources are the Oxford English Texts editions of Donne, edited by Helen Gardner and Wesley Milgate, and of Suckling, edited by Thomas Clayton and L. A. Beaurline, and Poems on Affairs of State: Augustan Satirical Verse, 1660-1714, gen. ed. George de Forest Lord, 7 vols. (1963-75). The Complete Poems of John Wilmot Earl of Rochester, ed. David M. Vieth (1968), though compiled on critical principles, is without an apparatus criticus. For a detailed account with supporting evidence of an experienced editor's conclusions about the problems of one such text, see L. A. Beaurline, "An Editorial Experiment: Suckling's A Sessions of the Poets," Studies in Bibliography, 16 (1963), 43-60.
For the circumstances of transmission of classical literature, see L. D. Reynolds and N. G. Wilson, Scribes and Scholars: A Guide to the Transmission of Greek and Latin Literature (2nd ed., 1974) and E. J. Kenny, The Classical Text (1974). Lambertus Okken in his 1970 Rijksuniversiteit te Utrecht doctoral thesis, Ein Beitrag zur Entwirrung einer kontaminierten Manuskripttradition: Studien zur Überlieferung von Hartmanns von Aue "Iwein," p. 8, points out that Karl Lachmann, the founder of modern textual criticism, never in fact published a stemma and inquires "Hatte der grosse Philologe im Umgang mit kontaminierten Traditionen, etwa mit den Überlieferungen von Wolframs von Eschenbach Werken, gelernt, das die Kenntnis der Handschriftengenealogie gewöhnlich keinen praktischen Wert hat?" ("Had the great philologist learned, in dealing with contaminated traditions, such as the transmission of Wolfram von Eschenbach's works, that knowledge of manuscript genealogy usually has no practical value?")
An exception to the common dialect would be the University of Edinburgh Library MS. DC. 1.3 of poems by Rochester and his contemporaries, the text of which contains a detectable infusion of Scotticisms.
Much information on this topic will be found scattered through David M. Vieth, Attribution in Restoration Poetry: A Study of Rochester's "Poems" of 1680 (1963). For an exceptionally revealing account of the genesis of a particular group of manuscripts, see W. J. Cameron's "A Late Seventeenth-Century Scriptorium," Renaissance and Modern Studies, 7 (1963), 25-52 and "Transmission of the Texts" in Poems on Affairs of State, V, 528-540.
In his Attribution in Restoration Poetry, his edition, with Bror Danielsson, of The Gyldenstolpe Miscellany of Poems by John Wilmot, Earl of Rochester, and other Restoration Authors (Stockholm: Almqvist and Wiksell, 1967), and his edition of the Complete Poems.
See Complete Poems, pp. xlvi-lii. Vieth prefaces the account of his procedures with the dry remark: "Some aspects of textual criticism raise surprisingly philosophical questions, in this instance whether the universe (not to mention the human mind) is fundamentally rational" (p. xlvii). The genealogical, historical and philological methods are employed to create a "tentative reconstructed text" of the poem. The early text "having the least departures from the tentative text" is then chosen as copy-text and its readings accepted "unless there is substantial reason to substitute a reading from other texts" (pp. l-li). In practice, Vieth appears to have let a number of minor substantive readings stand which are unlikely to have been those of the archetype.
As this enterprise is still in progress, I cannot guarantee that my conclusions will not be identical with Vieth's.
See Vinton A. Dearing, Principles and Practice of Textual Analysis (1974), pp. 14-20 and passim. Dearing holds that the establishment of the genealogy of "transmitters" (e.g. manuscripts) is the province of bibliography, not textual analysis which is described as "a logic engine, like a computer" (p. 58). "Carrier" is to be preferred to Dearing's "transmitter" as terminal texts are receivers only and do not transmit.
It is possible in some instances to establish relationships between texts on the basis of accidental features, such as the distribution of variant spellings, or physical features such as line-lengths or the number of lines per page. For a summary of Cameron's important discoveries concerning scribal accidentals, see Poems on Affairs of State, V, 529.
Some of these are spelled out by Humphrey Palmer in The Logic of Gospel Criticism (1968), pp. 93-94. The additional point should perhaps be made that, even in a strictly "bibliographical" stemma, the inferential intermediaries remain inferential, that is to say logical abstractions, and cannot be relied upon to correspond in their assumed readings to any single historically existing manuscript.
The editorial heritage of Lachmann encouraged editors to fuse the two processes of the assessment of direction and the linking-up of the stemma; however, editors of English literary texts since the time of Greg have agreed in not proceeding to consider directional evidence until the linkages between texts have been established on a purely distributional, non-directional basis. The present study, while arguing that contextual as well as purely formal arguments should be admissible at every stage of textual reasoning, accepts the validity of the two-stage model of stemma-building. For objections to the Lachmannian method, which is still accepted on the authority of Maas by many present-day classicists and mediaevalists, see Dearing, pp. 5-11, 15-16 and 54-56, and Palmer, pp. 76-80. Palmer concludes (p. 92) that Lachmann's method "is quicker—when it works!" but that it depends "on finding errors certain to the critic and incorrigible by copyists"—a consummation more often devoutly wished than practically experienced. For a defence of the method against its critics, see Kenny, p. 137.
Walter W. Greg, The Calculus of Variants (1927). Dearing's much more detailed study refines in many valuable particulars on Greg's techniques, definitions and terminologies but should not be used without reference to Michael Weitzman's review in Vetus Testamentum, 27 (1977), 225-235. I would like to thank Dr. Weitzman for pointing out an error in the reasoning of an earlier version of the present paper.
See Dearing, pp. 215-236 and Dom J. Froger, La Critique des Textes et son Automatisation (Paris: Dunod, 1968). Earlier work by Dearing is discussed briefly in my "Computers and Literary Editing: Achievements and Prospects" in The Computer in Literary and Linguistic Research, ed. R. A. Wisbey (1971), pp. 47-56. Gunter Kochendörfer and Bernd Schirok, Maschinelle Textrekonstruktion, Göppinger Arbeiten zur Germanistik Nr. 185 (Göppingen: Alfred Kümmerle, 1976) includes a valuable bibliography of work in a number of languages relating to this field (pp. 176-179).
As the aim of textual analysis is to restore the readings of the archetype, formal systems and algorithms claiming to do this should be tested on texts chosen at random from a tradition created by supervised copying, in as close an approximation as can be devised to the conditions experienced by professional scribes of the past, with the solution withheld from the experimenter until after he has submitted his conclusions. If such a system cannot produce a correct solution for texts whose descent can be verified, it can hardly be trusted to produce correct results for historical traditions where the conclusions cannot be verified.
In the Lachmannian tradition, the "favoured few" are readings regarded as having clear directional implications but not, in the judgment of the editor, being readily corrigible by copyists of the period concerned. These are in turn divided into (1) separative errors, which indicate that the manuscript containing them cannot be the ancestor of one in which they are correct (cf. Maas, pp. 42-47), and (2) conjunctive errors, defined by Palmer as "common to two manuscripts but not a third, and such that the reading of the third could not be due to correction by the copyist" (p. 243).
A summary of the more common kinds of scribal error will be found in Reynolds and Wilson, pp. 200-212. For a more systematic treatment, see James Willis, Latin Textual Criticism (1972), pp. 47-188. Weitzman (p. 226) draws attention to important material to be found in J. Stoll, "Zur Psychologie der Schreibfehler," Fortschritte der Psychologie, 2 (1913), pp. 1-133.
Cameron's experience of this kind of variation in his "scriptorium" manuscripts led him to suggest that "all texts are in fact conflated texts—a conflation of the exemplar and a structure of linguistic expectations that is present in the mind of the scribe" (Poems on Affairs of State, V, 529).
By a "whole group" I understand a whole "true" group as defined below. Greg's use of "true" to describe variational groups containing more than one member seems to proceed from nothing more than an aesthetic disinclination to speak of a "group" with only one member (cf. Dearing, p. 10) and should be disregarded.
Cf. Maas, p. 8: "obvious corruptions, particularly lacunae, may easily be transmitted in the direct line but are hardly ever transferred by contamination." Against this, however, must be placed Greg's principle (p. 20 n.) that "the easier it is to explain how an error arose, the less valid the assumption that it only arose once."
Assuming, of course, that the distribution of a variant among surviving sources corresponds in a general way with the distribution of the variant in the whole body of sources available at the time the scribe made the copy. This may not always have been the case: a Bowdlerised version of a satire may, for instance, have stood a better chance of survival than an obscene one. If a text was available in printed form, each copy in circulation would need to be counted as a separate source for purposes of such a calculation.
The editor should consider whether the overlapping of two or more possibly incomplete groups defined by gross and easily corrigible errors might not indicate a fractured whole group.
Dearing, p. 57, revising Greg's terminology. Constantine Kooznetzoff, on the other hand, though he does not assert this as a general principle, finds in his "A Genealogical Analysis of the 'Tristan' Fragment, Ms. 2280," AUMLA, 54 (November 1980), 194, that the type-2 variants "providing they are consistent, afford all the evidence necessary for deducing manuscript relationships." The complex variants are declared "genetically non-evidential" on the grounds that they contain "conscious scribal emendation" (p. 197).
A test for obtrusive readings in a short poem would be to find a pretext to make someone transcribe it several times, and then, after a few days' interval, ask him to write down as much as he could still remember. The readings correctly given would be the "obtrusive" ones.
The only situations at all likely in which such a variant would not indicate a true group would be (1) when it had been called on to fill a lacuna in a majority-group manuscript by a scribe with too imperfect a knowledge of the language or content of what he was copying to realise its uselessness, or who felt that any reading was better than none; or (2) when the identical gross error was made during separate copyings by the same scribe (a quite conceivable happening in a scriptorium situation such as that described by Cameron) or by separate scribes (rather less likely, unless some aspect of the original reading or of a particular exemplar had created an exceptionally high risk of error).
These points are discussed in greater detail in my "The Text of 'Timon. A Satyr'," Bulletin of the Bibliographical Society of Australia and New Zealand, 6 (1982), 113-140.
For an example of the application of this technique, see my "The Texts of Southerne's The Spartan Dame," Bulletin of the Bibliographical Society of Australia and New Zealand, 1 (1970-1), 54-59.
Review of Dearing's Manual of Textual Analysis, JEGP, 59 (1960), 555. Housman in "The Application of Thought to Textual Criticism" sees the subject matter of the discipline as "not rigid and constant, like lines and numbers, but fluid and variable; namely the frailties and aberrations of the human mind, and of its insubordinate servants, the human fingers" (Selected Prose, ed. John Carter [1961], p. 132). Dearing, p. ix, contains a brief retort to Housman.
Dearing warns rightly (p. 55) that if the textual analyst "lets the form of his family tree influence his analysis of the directional variation, he reasons in a circle."
Palmer, pp. 51-52, 80. The problems of circular reasoning in textual criticism are similar in their nature to that of the "hermeneutic circle" in critical interpretation as discussed by a number of contributors to the Autumn 1978 issue (X. i) of New Literary History. Palmer's metaphor, with its attractively Yeatsian overtones, hardly offers a procedural solution, but it indicates that the methodological plight of historical scholarship is not as desperate as it is sometimes made out to be.