If the Text-Critics Went to Lunch and Didn’t Come Back

Introduction

An important practice in the business world is determining the viability and impact of a project before investing resources into it. It seems wise to apply the same analysis to evangelical text-criticism.

 For which of you, intending to build a tower, sitteth not down first, and counteth the cost, whether he have sufficient to finish it? Lest haply, after he hath laid the foundation, and is not able to finish it, all that behold it begin to mock him, Saying, This man began to build, and was not able to finish.

The Holy Bible: King James Version, Electronic Edition of the 1900 Authorized Version. (Bellingham, WA: Logos Research Systems, Inc., 2009), Lk 14:28–30.

Christians should now act like wise investors. The church has been patient, but it is time to analyze the project afresh. The evangelical text-critics have determined that while we will never have the original text God inspired, what we have is close enough. A valuable analytical process is to determine the impact of ending an ongoing project. According to the careful analysis and hard work of the evangelical textual scholars, the church has all it needs from the manuscripts to get by. No doctrine has been affected in nearly 200 years of textual criticism; the church has what it needs. So what is the impact on the church if all of the text-critics went out for lunch and never came back?

Seven Benefits to Ending the Effort of Modern Evangelical Text-Criticism

First, Greek Bibles would stop changing. No new additions or subtractions would be made to God’s Word. The only changes to God’s Word would have to be made by translation committees. 

Second, the text of the modern Bibles would be stable. Christians could buy a translation and keep it their whole lives without it expiring. 

Third, the work of men like Bart Ehrman would be irrelevant to the church, because evangelical scholars wouldn’t be working with him, for him, and under him any longer. 

Fourth, Christian textual scholars could spend more time doing exegesis for the church and pastoring, rather than scraping through manuscripts and counting words. Many of these men have a Master of Divinity from well-reputed seminaries; they could apply their education to shepherding the flock. 

Fifth, seminaries could remove Bruce Metzger and Bart Ehrman’s textbook from the standard curriculum. We have what we need in our Greek texts; there is no need to continue giving Ehrman a platform. 

Sixth, the heroic apologists of the Christian faith could spend more time defending the teachings of the Word of God, rather than trying to discover what it says. 

Seventh, resources spent on text-criticism could go to planting churches, supporting struggling churches, and training pastors. 

Conclusion

If the best and the brightest text critics say that they haven’t found the original text in a time when we have “the best data,” and have determined that “we have what we need,” there is no point in carrying on. “No doctrine has been affected,” so it seems the church is equipped to press on. The church does not need to support a project that has already reached the necessary conclusions. Instead, it should support those evangelical textual scholars in putting their MDivs to use pastoring churches and feeding God’s people. Let the secular academy continue its quest for the “historical Jesus” and free up the men of God to do work for the Kingdom! 

A good question to answer to determine the impact of ending such a project is, “What would happen if evangelicals stopped making Greek New Testaments?” The answer is nothing. Nothing would happen. The church would carry on without a hiccup. Pastors would preach, seminaries would train, and the Gospel would still go forth to all the nations. The average Christian would be none the wiser. The Bible has been preserved, after all; there is no need to keep working on a finished product. 

What We Believe About Holy Scripture

Recently, I wrote an article entitled “Yes, The Bible Teaches Preservation” to address the reality that modern evangelical scholars have abandoned the historic Protestant doctrine that we have the Bible today in its original form. This abandonment is enabled by the Chicago Statement on Biblical Inerrancy, which speaks to inerrancy only in the original autographs of the Scriptures. In this blog, I have set forth that the most faithful position on the Holy Scriptures is that of providential preservation, not inerrancy. The modern doctrine of inerrancy affirms only that the Scriptures we have today can be ascertained with “great accuracy” according to what the modern text-critical scholars determine. An article from Ligonier puts it this way:

“In sum, the Bible is entirely truthful and has no errors at all in the original manuscripts that the prophets and Apostles actually wrote. We do not today possess these manuscripts, but through the process of textual criticism, we can recover the original wording of the manuscripts with a high degree of certainty.”

So then, the inerrancy of the Bibles we have today in our possession is entirely dependent on the text-criticism of modern scholars, who uniformly say, 

“We do not have now – in our critical Greek texts or any of our translations, exactly what the authors of the New Testament wrote. Even if we did, we would not know it” 

(Gurry & Hixson, Myths and Mistakes in New Testament Textual Criticism, xii)

The important part of that statement is the last sentence: “Even if we did, we would not know it.” This is an honest admission, and it is completely accurate, if the method of authentication is the text-critical principles employed to make modern critical Greek texts. Since the doctrine of inerrancy sets forth that the Bible’s accuracy is determined by textual criticism, it is really saying that “greatly accurate” means “we’re not actually sure how accurate it is.” I reject this model of authentication, as it is not Scriptural. The methods of text-criticism are entirely bound to the extant manuscript data, which does not date back to the time of the Apostles. It assumes that the only evidence that matters is what has survived, even though the stationery the Biblical writers used, in most cases, had a maximum shelf life of 500 years. It further assumes that previous generations were not given the “best” data to receive the Scriptures from the generation before, which puts the modern church in a terrible predicament.

Even though we do have 2nd and 3rd century manuscripts, none of these are complete enough to make an entire Greek New Testament. The most complete New Testament manuscripts come from the fourth century and later, and so there is no way to determine, according to text-critical principles, what the text looked like prior to that point. There is no way to tell which verses were added, removed, and changed in the two or three hundred year gap between the Apostles and the earliest complete copies. In fact, nearly all of the evangelical scholars say that the text evolved due to Christian tampering. 

Further, the earliest copies look quite different from later copies, so any chance of knowing what the Bible originally said is lost, according to modern critical principles. Text-critics could reconstruct a Bible that is completely original and have no idea that they’ve done so, because there is nothing to compare their work against. Critics could just as easily call an original reading a “later interpolation” as they could call a later copyist’s insertion an “original reading.” The scholars themselves admit as much:

“It is therefore inadvisable to assume without qualification that earlier is always better, more accurate, or less likely to contain “corruptions” when one of the earliest manuscripts of 1-2 Peter and Jude looks as though it was written by a copyist who changed the text in places to make a stronger case that Jesus is God”

(Gurry & Hixson, Myths and Mistakes in New Testament Textual Criticism, 92).

In short, the mechanism that gives inerrancy its value to the modern reader of the Bible says nothing meaningful, because it cannot responsibly say that it has delivered the reconstructed Scriptures to the world with “great accuracy.” All it can say is that it has delivered a later version of the Scriptures with great accuracy. Whether or not that version represents the original, nobody can say, if the methods of authentication are the critical principles of men. Scholars may assert that they know some of the places where well-meaning Christians “corrupted” the Bible to make it more Christian, but they will never know all of the places. The Bible they have reconstructed could just as easily be a Gnostic or Unitarian version of the Scriptures produced during the time when “the whole world groaned, and was astonished to find itself Arian” (Jerome). When somebody says, “We have what we need,” they are really saying, “I feel that I have all that I need, and you should too.” 

More importantly, does God, the author of the Scriptures, set forth that this is how the Scriptures are to be authenticated? Is the modern articulation of quasi-preservation Biblical? Are we to believe that the Scriptures were corrupted over time by people trying to make them seem more Christian? In the first place, providence declares this not to be the case. The modern critical methods have been employed for almost 200 years now, and the only fruit to show for it is hundreds of new Bibles, none of which are said to be original, and more uncertainty in the text than the orthodox Christian church has ever seen in its 2,000-year history. The theological battle over Scripture is really not all that different from that of the 16th century, only instead of the church claiming to give the Scriptures weight, conservative Christians now claim that text criticism gives the Scriptures weight. The difference is that the textual scholars are not saying they can give the Scriptures the necessary weight, whereas the Roman magisterium did. John Calvin’s words ring especially true today, 

“As if the eternal and inviolable truth of God depended upon the decision of men! For they mock the Holy Spirit when they ask: Who can convince us that these writings came from God? Who can assure us that Scripture has come down whole and intact even to our very day?

Yet, if this is so, what will happen to miserable consciences seeking firm assurance of eternal life if all promises of it consist in and depend solely upon the judgment of men? Will they cease to vacillate and tremble when they receive such an answer? Again, to what mockeries of the impious is our faith subjected, into what suspicion has it fallen among all men, if we believe that it has a precarious authority dependent solely upon the good pleasure of men!”

John Calvin, Institutes of the Christian Religion, ed. John T. McNeill, trans. Ford Lewis Battles, vol. 1, The Library of Christian Classics (Louisville, KY: Westminster John Knox Press, 2011), 75.

More important than what textual scholars say about Holy Scripture is what God says about Holy Scripture. Here is a list of truths from Scripture, about Scripture:

  1. God is the author of His Word, which was written by men (2 Peter 1:19-21; 2 Tim. 3:16)
  2. It is the way He speaks to His people now (Heb. 1:1; Isa. 54:13; John 6:45)
  3. It is the means by which men are saved and sanctified (John 5:39; 2 Tim. 3:15-17; Rom. 10:17)
  4. It is to be received by men as truth, over and above the witness of men (1 Thess 2:13; 1 John 5:9)
  5. It is what the church is built upon (Eph. 2:20; Acts 15:15)
  6. God’s Word is pure and perfect (Ps. 12:6; 19:7)
  7. God’s Word will not fall away so as long as He is fulfilling His purpose for this world (Matt. 5:18, 24:35; Rom. 3:2)
  8. Man’s inability to understand more difficult teachings of Scripture does not make it less pure (2 Peter 3:16)
  9. God’s people hear God’s voice through the Scriptures by the power of the Holy Spirit (John 10:27; 1 Cor. 2:10-12)

Nowhere in Scripture do we find a warrant to believe that God’s words are only “greatly accurate,” or that they would fall away and need to be reconstructed. Nowhere do we find that God would speak perfectly only in the original texts and then let His Word be played with by His people to amplify what He said. God’s Word is intimately connected with His covenant purpose to save a people unto Himself, and what we say about His Word is what we say about His purpose, work, and character. What we say about the preservation of the Scriptures is what we say about His continued work in history, because the Scriptures are how He accomplishes that work. What we say about the Scriptures, we say about God Himself, because the Scriptures are how He has spoken. Many Christians have adopted these perspectives without considering the implications. The fact is, if you are an average Christian, unfamiliar with this conversation, you likely are not comfortable acknowledging what the scholars accept as cold, hard truth. You read your Bible as you should, with certainty that God is speaking to you in His preserved Word.

If we say that God has only preserved “some of His Word,” well, then perhaps He has only preserved some of His people. It is completely reasonable to believe, if we take the methods of the modern scholars as true, that the whole idea of Jesus returning on the Last Day is a later invention. If God did not continuously preserve His Word, even the scribes of our earliest manuscripts could have added these details. There is nothing that Christians can possibly say to this, if our hope is placed on the evaluation of manuscripts by textual scholars. The fact is, modern evangelical scholars, pastors, and theologians fundamentally agree with Bart Ehrman on the text of Holy Scripture. The only difference is their conclusion: that “it really doesn’t matter that the Scriptures are corrupt.” In other words, Christians would rather have faith that the Scriptures are still powerful to “get the job done,” despite being corrupted, than believe that they have been kept pure in all ages.

Why is it the case that Christians believe God is big enough to preserve the orbit of the planets but not His Word? Rather than assuming on behalf of God that He is not under any obligation to preserve the Scriptures (Jongkind, An Introduction to the Greek New Testament, 90), Christians should believe that He has lovingly and graciously given His people an infallible rule of faith! If you say that God simply didn’t want to preserve the Scriptures – the means He uses to make men wise unto salvation (2 Tim. 3:15) – you should be just as comfortable saying that God simply didn’t want to save man. Christians act as though rejecting the preservation of the Holy Scriptures is some benign theological opinion. I have heard, on countless occasions, that this is simply not a fight worth fighting because there are other “more important issues.” What could possibly be more important than fighting for the truth that God has given His church an infallible rule to be saved by? What despair do we subject the people of God to for the sake of having a few star pupils in the lion’s den? Universities and churches invite men like Bart Ehrman into the sanctuary to evangelize this dangerous doctrine, and act as though it is honorable to do so.

If the Scriptures have fallen away, what exactly are we doing here, Christian? What does it matter that we fight tooth and nail against liberal Christianity if the standard we use to rule doctrines “liberal” is just a fourth-century iteration of Christianity that cannot be shown to represent the Apostolic iteration of Christianity? If the text of Holy Scripture fell away, even in part, who is to say that what we consider the “fundamentals” of Christianity were not the machinations of some early Christians trying to “emphasize the deity of Jesus” (Myths and Mistakes in New Testament Textual Criticism, 91)? What right do we have to sanctimoniously stand on “God’s inerrant word” if we believe that it was only inerrant in the originals, which we do not have and cannot know? The answer is: none. We have no basis on which to judge any other version of Christianity, because we have simply selected the version that we like the best. If it is our job to “reconstruct” the New Testament, then there is nothing wrong with others reconstructing Christianity. 

Conclusion

Modern Christians suffer from serious amnesia when it comes to the Reformation. They forget what the Roman Catholic church was saying, and the Protestant response. If Christians are to have any claim to an absolute standard of truth, that standard must be self-authenticating. The Scriptures were not developed according to the fancies of Christian faith communities over 2,000 years, as the “lower critics” assert. They were faithfully transmitted by the people of God by the sure hand of God’s providence. The historic Christian belief is that they were “kept pure in all ages.” Rejecting the purity of the Scriptures is one of the gravest theological errors of the modern period, because it upsets the whole of the Gospel. How can one say, “This is the message that ye heard from the beginning,” if we do not know what that beginning message said? It is completely useless to say that the message from the beginning was perfect if we do not have that message now. I am afraid that our need to be apologetically relevant to the atheists, higher critics, and Muslims has caused Christians to reject the only sound standard of truth that can stand against the gates of hell. 

Calvinists love to appeal to the doctrines of the Reformation, especially Sola Scriptura, while inconsistently affirming the theological axioms of the modern critical text. The two are at odds with each other. The rise of historical criticism and neo-orthodoxy sent the world spinning, and instead of fighting the same fight as the Reformers, theologians of the 19th and 20th centuries reinterpreted the Westminster Confession and retreated to the doctrine of inerrancy – a doctrine which stands and falls on the determinations of textual scholars. And the methods of textual scholars include “lower” critical theories such as the “expansion of piety” and the notion that the text evolved according to Christian faith communities. The culture of celebrity pastors and theologians has made it such that the average Christian cannot even have an opinion on the matter. “My favorite pastor believes this; are you saying you have better insight than he does? Are you saying you have perfect discernment?” Apparently one must be omniscient to know that this is not Scriptural. While Christians sit around exalting their favorite theologians, the people of God are “destroyed for lack of knowledge.” 

In all of my conversations on this topic with the average Christian, 99% of them do not know what the scholars are saying. When I quote them directly, they point me to a James White video, wherein he sets forth the same principles as the scholars, with more mention of bike riding, travel destinations, and debates. Ultimately it comes down to two major theological positions:

  1. The Old Testament in Hebrew, and the New Testament in Greek, being immediately inspired by God, and by His singular care and providence kept pure in all ages, are therefore authentic.
  2. The Bible was entirely truthful and had no errors in the original manuscripts, but we do not today possess those manuscripts, and we cannot determine what they originally said. Even if we could, we would not know it. 

The conversation of “Which text did God keep pure?” is completely irrelevant until Christians actually believe that He has kept the texts pure and that they do not need reconstruction. Discussions regarding textual variants are meaningless if the method that authenticates a variant has nothing to say about its originality. The Bible version you read is irrelevant if you do not believe that any of them is the inspired Word of God handed down through the ages. The common belief in the modern Christian church is that “no Bible is perfect.” If this is the case, what exactly must we do to access God’s inerrant Word? What exactly are we reading when we open our Bibles? Christians must first believe that God has inspired His Word, preserved it, and delivered it. Only then can a meaningful conversation take place over “text type” and translation.

Absolute Certainty, The Received Text, and Matthew 23:13-14

Introduction

Recently, Reverend Christopher Myers of Phoenix Reformed Presbyterian Church (RPCNA) tagged me in a Facebook post to address the topic of absolute certainty and the Received Text. Dr. Peter Gurry playfully chimed in with a test passage (Matthew 23:13-14). In this article I will be interacting with Dr. Gurry’s article. Any disagreements I have with his article do not represent what I think about him as a person. He is a brother in Christ, and I have no reason to think otherwise.

The question that must be answered is, “How can one have absolute certainty that the Scriptures they read are the Divine Original?” What must first be defined is the operational definition of “absolute” as it pertains to certainty. Of course, I would never argue for a definition of “absolute certainty” that means “omniscience.” Humans are creatures, and therefore do not know things absolutely in that sense. Yet, in a different, practical, experiential sense, Christians can be absolutely certain that God exists, that He has saved them, and that He has spoken, by virtue of His own operation. So the certainty we do have as Christians is not by virtue of our self-perceived omniscience, but by virtue of God’s power in us. This is the clear testimony of Scripture.  

“The holy scriptures, which are able to make thee wise unto salvation.”

(2 Timothy 3:15)


“All scripture is given by inspiration of God, and is profitable”

(2 Timothy 3:16)

“When he, the Spirit of truth, is come, he will guide you into all truth…He shall glorify me: for he shall receive of mine, and shall shew it unto you.”

(John 16:13,14)

“My sheep hear my voice”

(John 10:27)

That is to say, certainty in the Scriptures comes not from man, but from God. Of ourselves, we can never have certainty in the Scriptures, or in any spiritual thing for that matter.


“But ye believe not, because ye are not of my sheep”

(John 10:26)

People do not believe that the Scriptures are the Word of God because of manuscript evidence, they believe the Scriptures are the Word of God because:

“our full persuasion and assurance of the infallible truth, and divine authority thereof, is from the inward work of the Holy Spirit bearing witness by and with the Word in our hearts”

(LBCF, WCF 1.5)

It is firmly the Protestant position that men can have “full persuasion and assurance” in the Scriptures not by virtue of their own knowledge, but because of the “inward work of the Holy Spirit,” which bears witness to that “infallible truth, and divine authority,” the Scriptures, in the regenerated heart of the believer. That being said, the matter of certainty is not properly a text-critical category; it is a faith category. “Ye believe not, because ye are not of my sheep.” No matter which text one reads, it is definitely the case that text-critical evidence is not the reason for certainty, because God says that it is He who gives certainty. Even if every single manuscript were to read the same exact way in every single verse, this would still be true. That is why I continue to advocate that the text we receive should be derived from a method of faith, not science.

“For they that are after the flesh do mind the things of the flesh”

(Romans 8:5)

For a moment, let’s set aside the idea that there is any warrant to believe that text-critical evidence is the reason we believe a verse to be Holy Scripture, because the Scriptures teach that this is not the case. The Scriptures give abundant cause for experiential certainty by virtue of the inner working of the Holy Spirit. 

Examining the Test Case 

Since we are talking about certainty here, let us first examine the two models proposed: methodologies which evaluate textual evidence, and the inner working of the Holy Spirit in the individual and the church catholic throughout the ages. Models which evaluate textual evidence are quite fragile. For example, in the article posted for examination, Dr. Gurry appeals to the NA27 and the Byzantine tradition to question the passage as it is found in the KJV. He also notes that the passage occurs differently within the TR corpus. What is interesting is that his major point is that the passage is not a majority reading, and that is why it allegedly should be rejected, though he does not make a case either way. If a reading should be accepted or rejected based on the criteria provided in the article, I’d love to see an NA29 without any doubt cast upon Mark 16:9-20. The article does not really make a significant point at all regarding the text itself, just that Erasmus made a textual decision using his “limited resources.” Note that Gurry doesn’t make any statement at all regarding the authenticity of the reading, or inform the reader of what he thinks of the passage. Such is the modus operandi of textual scholars. Between the lines of the article is an obvious attempt to cast doubt on the authenticity of the Traditional reading, but on what grounds does he do so? I could identify three:

  1. It’s not the majority reading
  2. Erasmus had limited resources
  3. We don’t know where Erasmus got the reading

I suspect that is why he didn’t reach an actual conclusion in his article: the reasons he gives aren’t exactly arguments for or against the text itself. If they are, I fail to see how. There is only one text-critical camp that takes reason one as a valid text-critical criterion, and neither I nor Peter Gurry hold to that position. Erasmus may have had “limited” resources, but how many more “resources” were used to make the general shape of the modern critical text in 1881? Aleph, B, and a smattering of readings from several other choice manuscripts? The shape of the NA27 is not leaps and bounds different from Hort’s text, despite its editors having access to the Papyri and more uncials, minuscules, and lectionaries.

“None of the popular hand-editions of the Greek NT takes us beyond Westcott-Hort in any substantive way as far as textual character is concerned”

Eldon J. Epp, “The Twentieth Century Interlude in New Testament Textual Criticism” (1974). Aland cites 558 variants between the 1881 Westcott-Hort text and the 25th edition of the Nestle-Aland text (NA25, 1963); the text of the NA27 is not significantly different from that of the NA25.

The sheer volume of additional data is not anything to be astounded by, because what actually matters is how that data has influenced the text. It doesn’t matter if we enter 10,000 new manuscripts into evidence today, if that evidence introduces no new readings and only supports the readings we already have proportionately. Further, it especially doesn’t matter how much data we have if we only look at a small subset of that data.

Point three doesn’t actually matter, because the reading ended up in his edition, and there are Greek manuscripts containing that reading which were available in Erasmus’s time. Dr. Gurry even lists them in his article. So unless we want to say that Erasmus made up the reading and it happened to match a Greek manuscript, I fail to see what the point is here. 

The interesting thing this article has shown is that the standard Dr. Gurry sets forth to evaluate the TR is a standard that he probably wouldn’t apply to his NA27. There are many minority readings within that text. Further, do we know where the readings of Aleph and B came from? If we take Erasmus’s opinion of Codex B, he alleges the same thing about it that Gurry alleges about Erasmus’s text – that parts of it follow the Latin. It would be quite strange for Erasmus, having such a strong opinion against the Vulgate, to follow Latin readings so often! The difference between Gurry’s claim and Erasmus’s is that Erasmus’s text is supported by Greek witnesses, while many, many readings of Codex B are supported by virtually no other Greek manuscript.

This brings me to my final question – what sort of grounds does one stand on to evaluate a text from a modern critical perspective? The modern critical methodology cannot say much about the original text of Scripture with any kind of authority. It is a text that is based on a localized smattering of idiosyncratic manuscripts that have no pedigree and that disappear from the history of textual transmission. I understand why a majority text appeal is made, but a majority text appeal from a modern critical text perspective is more confusing than anything, because there are many majority readings that those in the modern critical text camp reject. It is an interesting article, but the article mostly just demonstrates that modern critical text advocates like going after Erasmus as if that defeats the validity of the Greek Received Text.

Now to the Question of Certainty at Matthew 23:13-14

Now that we have seen that Dr. Gurry didn’t actually make an argument against the reading at Matthew 23:13-14 within the TR tradition, I think it will be helpful to explain why Christians should have certainty that the underlying Greek text of the KJV is the original reading. 

  1. It is the reading that was used, commented on, translated, and received by the people of God in the age of the printing press
  2. It fits in the passage and is theologically correct
  3. It exists in Greek manuscripts (even Byzantine ones)
  4. It was translated into ancient versions
  5. John Chrysostom preached it (Homily LXXIII)
  6. Calvin commented on it (Commentary on a Harmony of the Evangelist, Matthew 23:13-15; Mark 12:40; Luke 11:42, 20:47)
  7. It does not contradict other Biblical accounts

I have absolutely no reason to doubt that this verse should be there. The only reason I would have for questioning its authenticity is if I were trying to find errors in God’s Word. A reading being omitted by what Metzger calls, in his textual commentary, the “earliest and best authorities” is not exactly a strange occurrence. If I recall, these “earliest and best authorities” are known for such qualities. What is more likely: that a scribe made a mistake in a verse that starts exactly the same as the verses above and below it, or that somebody intentionally harmonized the text with another gospel before the time of Chrysostom (4th century)?

“Scribes typically copy their sources with fidelity so that ancestors and descendants are closely related”

(Wasserman & Gurry, A New Approach to Textual Criticism, 98)

If we’re after the simplest solution, what is stopping us from believing that a scribe made a common slip-of-the-eye error and that many faithful scribes followed in his steps? Are we going to believe the meddling-scribes theory or the faithful-scribes theory? At what point are we going to admit that we are more interested in scrutinizing the text than believing it? 

Yet, despite all of the good evidential reasons to believe that the TR reading at Matthew 23:13-14 is the original reading, that is not why I believe it to be God’s Word. I believe it to be God’s Word because the Holy Spirit bears witness to it in my heart. I know, not very text critical of me. 

Conclusion

Matthew 23:13-14 is a great test case for examining the various doctrines of Scripture available in today’s conservative church. On one hand, there is the critical camp, which rejects that we can be certain of the text of Holy Scripture and relies upon critical analysis of evidence to derive varying levels of confidence. On the other hand, there is the Received Text camp, which recognizes God’s providence as a meaningful metric for recognizing the text of Scripture. Instead of assuming that we have lost the text of Holy Scripture, Christians should believe that He has preserved it, and receive the text He preserved. We shouldn’t be looking for reasons to prove the text of the Protestant Reformation wrong. If the final textual product of the Protestant Reformation is woefully corrupt, then it doesn’t seem that providence had anything to do with the transmission of the text of the New Testament. Further, if the text of the Reformation is corrupt, then we do not have now, and have never had, a stable text of Holy Scripture.

Christians can have certainty in the text of the Holy Scriptures, because God says He provides that certainty. Certainty isn’t derived from our acquisition of knowledge, but rather from the internal witness of the Holy Spirit with the Word of God. No amount of text-critical analysis can offer certainty in God’s Word, because nothing in text-critical methodology is capable of providing it. Take, for example, DC Parker, an authority in the discipline and the team lead for the Gospel of John in the ECM:

“The text is changing. Every time that I make an edition of the Greek New Testament, or anybody does, we change the wording. We are maybe trying to get back to the oldest possible form but, paradoxically, we are creating a new one. Every translation is different, every reading is different, and although there’s been a tradition in parts of Protestant Christianity to say there is a definitive single form of the text, the fact is you can never find it. There is never ever a final form of the text.”

Certainty is a category of faith, not knowledge. If we examine the fruit of the modern critical text machine on the doctrine of Scripture, this is plainly the case. Text-critical methods have only produced doubt. So we can talk about Erasmus all we want, but that’s not going to make the New Testament autographs appear. Christians must hold fast to the Scriptures, and derive their certainty from the only infallible hope, our God and Savior Jesus Christ, by the power of the Holy Spirit. There is an objective standard Christians can look to in order to prove this: God’s providential preservation in time.


“We do not have now – in our critical Greek texts or any translations – exactly what the authors of the New Testament wrote. Even if we did, we would not know it.” – Dan Wallace

(Gurry & Hixson, Myths and Mistakes in New Testament Textual Criticism, xii)

Is the CBGM God’s Gift to the Church?

Introduction

It is stated by some that the Coherence Based Genealogical Method is a blessing to the church, even gifted to the church by way of God’s providence. I thought it would be helpful to examine this claim. Unfortunately, those who have made such statements regarding the Editio Critica Maior (ECM) and the CBGM have not seemed to provide an answer as to why this is the case. This is often a challenge in the textual discussion. Assertions and claims can be helpful for understanding what somebody believes, but oftentimes fall short in explaining why they believe something to be true. The closest explanation that I have heard as to why the CBGM is a blessing to the church is the claim that it can detail the exact form of the Bible as it existed around 125AD. Again, this is simply an assertion, and needs to be demonstrated. I have detailed in this article why I believe that claim is not true.

In this article, I thought it would be helpful to provide a simple explanation of what the CBGM is, how it is being used, and the impact that the CBGM will have on Bibles going forward. The discerning reader can then decide for themselves if it is a blessing to the church. If there is enough interest in this article, perhaps I can write more at length later. I will be using Tommy Wasserman and Peter Gurry’s book, A New Approach to Textual Criticism: An Introduction to the Coherence Based Genealogical Method, as a guide for this article.

Some Insights Into the CBGM from the Source Material 

New Testament textual criticism has a direct impact on preaching, theology, commentaries, and how people read their Bible. The stated goal of the CBGM is to help pastors, scholars, and laypeople alike determine, “Which text should be read? Which should be applied?…For the New Testament, this means trying to determine, at each place where our copies disagree, what the author most likely wrote, or failing this, at least what the earliest text might have been” (1, emphasis mine). Note that one of the stated objectives of the CBGM is to find what the author most likely wrote, and when that cannot be determined, what the earliest text might have been.

Here is a brief definition of the CBGM as provided by Dr. Gurry and Dr. Wasserman:

“The CBGM is a method that (1) uses a set of computer tools (2) based in a new way of relating manuscript texts that is (3) designed to help us understand the origin and history of the New Testament text” (3). 

The way that this method relates manuscript texts is an adaptation of Karl Lachmann’s common error method, as opposed to manuscript families and text types. This is in part due to the fact that “A text of a manuscript may, of course, be much older than the parchment and ink that preserve it” (3). The CBGM is primarily concerned with developing genealogies of readings and how variants relate to each other, rather than manuscripts as a whole. This is done by using pregenealogical coherence (algorithmic analysis) and genealogical coherence (editorial analysis). The method examines places where manuscripts agree and disagree to gain insight into which readings are earliest. Where two manuscripts disagree at the same place, the new method can help in determining one of two things:

  1. One variant gave birth to another, therefore one is earlier
  2. The relationship between two variants is uncertain

It is important to keep in mind that the CBGM is not simply a pure computer system. It requires user input and editorial judgement. “This means that the CBGM uses a unique combination of both objective and subjective data to relate texts to each other…the CBGM requires the user to make his or her own decisions about how variant readings relate to each other” (4,5). That means that determining which variant came first “is determined by the user of the method, not by the computer” (5). The CBGM is not a purely objective method. People still determine which data to examine using the computer tools, and ultimately what ends up in the printed text will be the decisions of the editorial team.
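To picture the “objective” half of that combination: at bottom, the computer side of the method (pregenealogical coherence) amounts to counting agreements between witnesses across places of variation and ranking witnesses by closeness. The sketch below is my own simplified illustration, not the actual software used by the editors; the witness names and readings are invented placeholders.

```python
# Simplified illustration of pregenealogical coherence:
# the percentage of variation units at which two witnesses agree.

# Invented data: readings of three hypothetical witnesses
# at five places of variation (variation units).
witnesses = {
    "W1": ["a", "b", "a", "c", "a"],
    "W2": ["a", "b", "b", "c", "a"],
    "W3": ["b", "a", "b", "a", "b"],
}

def agreement(w1, w2):
    """Percentage of variation units at which two witnesses agree."""
    pairs = list(zip(witnesses[w1], witnesses[w2]))
    agree = sum(1 for r1, r2 in pairs if r1 == r2)
    return 100 * agree / len(pairs)

# Rank every other witness by closeness to W1.
ranked = sorted(
    (w for w in witnesses if w != "W1"),
    key=lambda w: agreement("W1", w),
    reverse=True,
)
print(ranked)                  # W2 is W1's closest potential relative
print(agreement("W1", "W2"))   # 80.0
```

Everything beyond this counting step is the subjective half: the editors, not the computer, decide at each disagreement which reading gave rise to the other.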

The average Bible reader should know that the CBGM “has ushered in a number of changes to the most popular editions of the Greek New Testament and to the practice of New Testament textual criticism itself…Clearly, these changes will affect not only modern Bible translations and commentaries but possibly even theology and preaching” (5). Currently, the CBGM has been partially applied to the data in the Catholic Epistles and Acts, and DC Parker and his team are now working on the Gospel of John. The initial inquiry of this article was to examine the CBGM to determine if it is indeed a “blessing to the church”. For this to be the case, one would expect that the new method would introduce more certainty for Bible readers with regard to variants. Unfortunately, the opposite seems to be true.

“Along with the changes to the text just mentioned, there has also been a slight increase in the ECM editors’ uncertainty about the text, an uncertainty that has been de facto adopted by the editors of the NA/UBS…their uncertainty is such that they refuse to offer any indication as to which reading they prefer” (6,7). 

“In all, there were in the Catholic Letters thirty-two uses of brackets compared to forty-three uses of the diamond and in Acts seventy-eight cases of brackets compared to 155 diamonds. This means that there has been an increase in both the number of places marked as uncertain and an increase in the level of uncertainty being marked. Overall, then, this reflects a slightly greater uncertainty about the earliest text on the part of the editors” (7).   

This uncertainty has led “the editors to abandon the concept of text-types traditionally used to group and evaluate manuscripts” (7). What this practically means is that the Alexandrian texts, which were formerly called a text-type, are no longer considered as such. The editors of the ECM “still recognize the Byzantine text as a distinct text form in its own right. This is due to the remarkable agreement that one finds in our late Byzantine manuscripts. Their agreement is such that it is hard to deny that they should be grouped…when the CBGM was first used on the Catholic Letters, the editors found that a number of Byzantine witnesses were surprisingly similar to their own reconstructed text” (9,10).

Along with abandoning the notion that the Alexandrian manuscripts represent a text type, another significant shift has occurred. Rather than pursuing what has historically been called the Divine Original or the Original Text, the editors of the ECM are now after what is called the Initial Text (Ausgangstext). There are various ways this term is defined, and opinions are split among the editors of the ECM. For example, DC Parker, who is leading the team using the CBGM in the Gospel of John, has stated along with others that there is no good reason to believe that the Initial Text and the Original Text are the same. Others are more optimistic, but the 198 diamonds in Acts and the Catholic Letters may serve as an indication as to whether this optimism is warranted based on the data. The diamonds indicate a place where the reading is uncertain in the ECM.

The computer based component of the CBGM is often sold as a conclusive means to determine the earliest, or even original reading. This is not true. “At best, pregenealogical coherence [computer] only tells us how likely it is that a variant had multiple sources of origin rather than just one…pregenealogical coherence is only one piece of the text-critical puzzle. The other pieces – knowledge of scribal tendencies, the date and quality of manuscripts, versions, and patristic citations, and the author’s theology and style are still required…As with so much textual criticism, there are no absolute rules here, and experience serves as the best guide” (56, 57. Emphasis added).

In the past it has been said that textual criticism was trying to build a 10,000 piece puzzle with 10,100 pieces. This perspective has changed greatly since the introduction of the CBGM: “we are trying to piece together a puzzle with only some of the pieces” (112). Not only does the CBGM not have all the data that has ever existed, it is only using “about one-third of our extant Greek manuscripts…The significance of this selectivity of our evidence means that our textual flow diagrams and the global stemma do not give us a picture of exactly what happened” (113). Further, the CBGM is not omniscient. It will never know how much of the more complex corruption entered into the manuscripts, or the backgrounds and theology of the scribes, or even the purpose for which a manuscript was created. “There are still cases where contamination can go undetected in the CBGM, with the result that proper ancestor-descendant relationships are inverted” (115). That means it is likely that there will be readings produced by the CBGM that were not original or earliest, but will be mistakenly treated as such. “We do not want to give the impression that the CBGM has solved the problem of contamination once and for all. The CBGM still faces certain problematic scenarios, and the loss of witnesses plagues all methods at some point” (115).

One of the impending realities that the CBGM has created is that there may be a push for individual users, Bible readers, to learn how to use and implement the CBGM in their own daily devotions. “Providing a customizable option would mean creating a version that allows each user to have his or her own editable database” (119,120). There will likely be a time in the near future when the average Bible-reading Christian will be encouraged to understand and use this methodology, or at least pastors and seminarians will be. For somebody who does not have the time or ability to do this, this could be extremely burdensome. Further, the concept of a “build your own Bible” tool seems like a slippery slope, though it is a slope we are already sliding down with those who make their own judgements on texts in isolation from the general consent of the believing people of God.

Conclusion

Since the CBGM has not been fully implemented, I suppose there is no way to say with absolute confidence whether or not it is a “blessing to the church”. I will say, however, that I believe the church should be the one to decide on this matter, not scholars. It seems that the places where the CBGM has already been implemented have spoken rather loudly on the matter in at least 198 places. Hopefully this article has been insightful, and perhaps has shed light on the claims that many are parroting which say that the CBGM is a “blessing to the church” or an “act of God’s providence”. If anything, the increasing amount of uncertainty that the CBGM has introduced to the previous efforts of modern textual criticism should give us pause, because the Bibles that most people use are based on methodologies that modern scholarship has abandoned.

Helpful Terms

Coherence: The foundation for the CBGM, coherence is synonymous with agreement or similarity between texts. Within the CBGM the two most important types are pregenealogical coherence and genealogical coherence. The former is defined merely by agreements and disagreements; the latter also includes the editors’ textual decisions in the disagreements (133).   

ECM: The Editio Critica Maior, or Major Critical Edition, was conceived by Kurt Aland as a replacement for Constantin von Tischendorf’s well-known Editio octava critica maior. The aim of the ECM is to present extensive data from the first one thousand years of transmission, including Greek manuscripts, versions, and patristics. Currently, editions for Acts and the Catholic Letters have been published, with more volumes in various stages of completion (135).

Stemma: A stemma is simply a set of relationships either of manuscripts, texts, or their variants. The CBGM operates with three types that show the relationship of readings (local stemmata), the relationship of a single witness to its stemmatic ancestors (substemma), and the relationships of all the witnesses to each other (global stemmata) (138). 
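For readers who think in concrete terms, a local stemma can be pictured as a tiny directed graph of readings. The sketch below is only my own illustrative data structure, with invented placeholder readings; it is not drawn from the ECM itself.

```python
# Illustrative only: a local stemma modeled as a directed graph,
# mapping each reading to the readings judged to derive from it.
# The readings "a", "b", and "c" are invented placeholders.
local_stemma = {
    "a": ["b", "c"],  # editors judge "a" prior; "b" and "c" arose from it
    "b": [],
    "c": [],
}

def initial_reading(stemma):
    """Return the reading with no ancestor (the editors' initial reading),
    or None if no single root exists (i.e., the relationship is uncertain)."""
    derived = {child for children in stemma.values() for child in children}
    roots = [r for r in stemma if r not in derived]
    return roots[0] if len(roots) == 1 else None

print(initial_reading(local_stemma))  # prints: a
```

The uncertain case, where no single root can be identified, corresponds to the places the ECM marks with a diamond; that is why the function above returns None rather than guessing.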

Common Sense Arguments Against the Modern Critical Text

Introduction

It is easy to get bogged down in conversations about textual variants, manuscripts, and elusive terminology when it comes to any talk about Textual Criticism. These types of conversations prevent the average Christian from entering into the discussion, and so it is common to just side with a favorite pastor or scholar. Fortunately, the conversation is not as complicated as many make it seem. It is true that in order to analyze a variant or read a manuscript, an understanding of the Greek language and a general knowledge of textual scholarship is required. This should cause the average Christian to pause and consider that reality. Should every Christian need to learn Greek and study textual criticism in order to read their Bible? Does that sound like something that God would require for His people to read His Word? Does God require papal or scholarly authority for His people to know which verses are authentic? 

Those who advocate for this have made a serious error in their understanding of the availability of the Scriptures. They have imposed a burdensome standard upon the Holy Scriptures which puts a barrier between the average Christian and New Testament scholarship. This cumbersome gatekeeping tool has informed Christians everywhere that unless they have a PhD in text-critical studies and know Greek, they are simply unequipped to determine which Bible they should read, or which variants within those Bibles can be trusted. This common idea has introduced a neo-papacy within the Protestant church, which tells Christians that they must wait for scholars or pastors or apologists to speak ex cathedra before trusting any verse in their Bible.

Is it Really That Complicated? 

Not really, no. The direction of modern textual criticism has refuted itself by readily admitting that it cannot find the original text of the New Testament. In other words, its methods have failed. In order to obfuscate this reality, scholars have shifted the effort to finding the Initial Text, which is really just a presuppositional effort to produce a hypothetical (non-existent) archetype from the smattering of Alexandrian manuscripts. This is the first common sense argument against the Modern Critical Text – it doesn’t claim to be the original text, and the methodologies being employed cannot and do not make any certain claims to producing the original text. So for any Christian who wants to “know what Paul wrote,” the modern methods aren’t claiming to provide that kind of certainty. That kind of certainty is only provided if a scholar or somebody else speaks authoritatively over a text for the people of God. This being the case, Christians need to pick a pope to decide for them whether Luke 23:34 really is original, because the popes disagree. If the Protestant religion is truly a religion of Sola Scriptura, this simply does not work. It is the same argument the Papists make, only the pope is exchanged for a scholar. If a Christian is okay with maybe knowing what Paul wrote, I present a second common sense argument against the Modern Critical Text.

If you are fond of the argument that claims that the New Testament is the best attested piece of literature in antiquity, boasting thousands of manuscripts compared to other works such as the Iliad, then the Modern Critical Text fails that criterion. The only text platforms that can use this argument are texts that represent the vast majority of manuscripts, such as a Byzantine priority or Traditional Text based Bible. The Modern Critical Text is based primarily on two manuscripts, which means that the apologetic which says that we have thousands of manuscripts isn’t true for the Modern Critical Text. One would have to say that the New Testament is supported by fewer than fifty manuscripts, which would make it one of the least attested books in antiquity. The narrative of transmission presented by the modern critical scholars says that the rest of the thousands of manuscripts were byproducts of scribal smoothing and orthodox revision. In supporting these modern texts, one has to accept the fact that the vast majority of the 6,000 manuscripts we have were the product of scribal revision and orthodox tampering, and do not testify to a preserved Bible. In fact, this is the common opinion of the men and women engaged in actual textual scholarship. This reality transitions quite nicely to the third common sense argument against the Modern Critical Text.

Christians should be confident that the thousands of manuscripts testify to the authentic New Testament when compared and edited together. The fact that these manuscripts were copied so much and used so heavily throughout time should tell a story that is often brushed over by modern scholarship. The story is that these manuscripts, or a comparison of these manuscripts, were always treated as authentic throughout time. In fact, the manuscripts used by Erasmus represent the majority of manuscripts far more closely than the Modern Critical Text does. While I don’t believe that simply counting manuscript readings produces an original text without any further consideration, it is a good place to start in rejecting the few spurious texts that the Modern Critical Text is based on.

A common sense methodology would also admit that not every manuscript survives today, and that the testimony of the people of God throughout time should also be considered so that not one word is lost from the Holy Scriptures. In terms of data analysis, the small number of data points that the Modern Critical Text represents should be considered an outlier. So is it the case that a few manuscripts which did not survive in the manuscript tradition are original? Or is it more likely that the vast majority of manuscripts, when compared, represent the original? In order to responsibly represent the case for the Modern Critical Text, one has to tell a tale that the New Testament evolved over time, and became so corrupt that nobody alive today really knows what the original said. Thus the modern effort is focused on producing a hypothetical archetype for these outlier texts. The modern method assumes that the thousands of manuscripts are corrupt evolutions of the original text. That leads us to a fourth common sense argument against the Modern Critical Text.

It technically could be true that the handful of early surviving manuscripts represent the original text of the New Testament. Simply counting readings does not necessarily prove originality. There are a handful of readings that the people of God have considered original throughout time that are no longer available in the majority of manuscripts. That is not proof, however, that these now-minority readings were not the majority at one point in time, or that they were not considered authentic despite not being the majority. God never promised to preserve the majority text in every case; He simply promised that He would preserve His Word until the Last Day. The majority text simply testifies to a different text than the “earliest and best”, and the opinions of the people of God in history should serve as a way to understand which readings were considered authentic throughout time. The first time this was ever done on a large scale was during the 16th century, when the printing press was made available to theologians and scholars.

So the work during the 16th century was taking place while manuscripts were still being used and copied in churches. The common sense argument is that those people had better access to the manuscripts that were circulating and considered authentic than we do today. After the Bible shifted from hand-copied codices to printed editions, the hand-copied manuscripts were used less, and began being submitted to museums and libraries rather than being used in churches. The texts that the people of God used were no longer in manuscript form, but printed editions of those collated manuscripts. The simple reality is that in the modern period, the manuscripts are artifacts of a time before the printing press. Almost nobody has used a manuscript in a church for centuries, so the evaluation of those manuscripts is difficult without the testimony of the people who actually used them. Thus, the final common sense argument recognizes that the earliest surviving manuscripts are not a standard that anybody would use from the perspective of God preserving His Word.

The final common sense argument is that the manuscripts used in the first effort of textual criticism do represent the best form of the New Testament as it was preserved in the manuscript tradition. Compare this to the opinion that a smattering of heavily corrected manuscripts, barely copied past the fourth century, are “earliest and best”. Until the printing press, these handwritten codices were actually used in churches by the people of God, so at the time of the first printed editions, the textual scholars of the day had the best insight into the manuscripts that were actually being used, regardless of whether they were majority or minority texts. In order to reject the text-critical efforts of the 16th century, one has to believe that texts were chosen which nobody was using or had ever used. This stands in opposition to history, however, as Erasmus was heavily influenced by readings that would be received by all. Popular opinion often influenced Erasmus in his text-critical decisions. That is the real story behind his inclusion of 1 John 5:7 in his third edition of the Novum Testamentum. He did not lose a bet; he feared that people wouldn’t use his Greek New Testament if he didn’t include it.

Conclusion

Based on common sense arguments, what makes more sense? Did the textual scholars who were doing text-critical work when manuscripts were actually being used have better insights into what the best manuscripts are?  Or do modern textual scholars who only have access to manuscripts in museums and libraries know which texts are the best? Is it more likely that God hid away His Word for a thousand years in a handful of manuscripts? Or did He preserve His Word in the manuscripts that were actually being used by the people of God? These are all questions that any layperson should be able to answer. It does not take a PhD in textual studies to determine that the Modern Critical Text starts in the wrong place, with the wrong manuscripts. 

The common sense conclusion is that the texts used in the first production of printed editions represent the best form of the manuscript tradition that has ever existed. After this point in time, manuscripts were sent to libraries and museums, and the printed form of the Greek New Testament was the form that the people of God used. These printed forms were translated into various common languages and used with little to no contest for the next 300 years, until modern theories of scribal tampering caused people to throw out the work of the 16th century. The claim that “we have more data” really does not mean a whole lot, considering we have less perspective on the value of said data. At the end of the conversation, one has to ask, “How valuable is the data that was hidden in caves and barrels?” Is the data that was not being used more important, or is the data that was being used more important? Modern scholars consent to the former, and the scholars of the 16th century consented to the latter.

In order to conclude that modern scholars have a better perspective on the data, one must write off the perspective of Augustine, who said, “Certain persons of little faith, or rather enemies of the true faith, fearing, I suppose, lest their wives should be given impunity in sinning, removed from their manuscripts the Lord’s act of forgiveness to the adulteress, as if he who had said, ‘Sin no more,’ had granted permission to sin.” One must claim that Calvin and Beza were either liars, or confused and mistaken. One must declare that Turretin would have upheld the readings he rejected if “he simply had access to the data we have today”. It takes an effort of revisionist history to believe that the believing people of God would adopt the Modern Critical Text. The simple common sense conclusion is to read these theologians and scholars as though they weren’t fools, and determine that they simply disagreed with modern conclusions. Erasmus, Beza, Stephanus, Calvin, Turretin, Gill, and Dabney did not think anything of the Vatican Codex and manuscripts like it. In fact, they considered them a grotesque corruption of God’s Word. Based on the testimony of the people of God in time, which side is spinning tales and mythology? Is it the people who say that the Word of God evolved and became corrupted beyond repair? I heartily disagree, and affirm with the theological giants of the past that God has preserved His Word in the Received Text.

Has the CBGM Gotten Us to 125AD?

Introduction

So it has been said that the CBGM has been able to “get us to 125AD” as it pertains to the New Testament manuscripts with its analysis – or at least in Luke 23:34. Anybody who makes such a claim clearly has no working understanding of the Münster method, or at least is choosing to use an invisible rod to bash people over the head. In any case, I thought it would be helpful to examine some potential weaknesses in the methodology in a series of articles. To begin, I thought I would discuss the reality that the CBGM is still in need of critical analysis. Dr. Peter Gurry, in his work, A Critical Examination of the Coherence Based Genealogical Method, part of the Brill academic series New Testament Tools and Studies, writes, “Despite the excitement about the CBGM and its adoption by such prominent editions, there has been no sustained attempt to critically test its principles and procedures” (2).

So my advice to any who believe such a bold claim that the CBGM can “get us to 125AD” is to put on their discernment ears and wait until 2032, when the effort can be accurately examined in full. If its use in analyzing the Catholic Epistles is any indication of the kind of certainty it will provide, I direct the reader to open their Nestle-Aland 28th Edition, if they own one, and examine the readings marked with a black diamond. It should be loudly noted that the methodology of the CBGM has not been fully examined, and I agree with Dr. Gurry when he writes, “If the method is fundamentally flawed, it matters little how well they used it” (4).

The CBGM and the Initial Text

Before the Christian church preemptively buys into this method wholesale, it is important to first recognize that there is not uniform agreement, even in the early implementation process of the CBGM, that this methodology will result in establishing what is being called the Initial Text. Bengt Alexanderson, in his work, Problems in the New Testament: Old Manuscripts and Papyri, the New Coherence-Based-Genealogical Method (CBGM) and the Editio Critica Maior (ECM), writes, “I do not think the method is of any value for establishing the text of the New Testament” (117). What should be noted loudly, for those who are falling asleep, is that a significant shift has occurred under the noses of laypeople in the effort of textual scholarship as it pertains to the New Testament text.

That shift is the abandonment of the search for the Original or Authorial Text in pursuit of what is being called the Initial Text. Dr. Gurry writes, “These two terms [authorial or original text] have often been used interchangeably and their definition more often assumed than explained. Moreover, that this text was the goal of the discipline remained generally undisputed until the end of the twentieth century. It was then that some scholars began to question whether the original text could or should be the only goal or even any goal at all” (90, bracketed material added). Regardless of whether this is the method one decides to advocate for, let it be said that this is indeed a shift, for better or for worse. Dr. Gurry continues, “Rather than clarify or resolve this debate, the advent of the CBGM has only complicated the matter by introducing an apparently new goal and a new term to go with it: Ausgangstext, or its English equivalent ‘initial text’” (90-91). The problem of defining terms will always gray the bridge between academia and the people, so hopefully this article helps color in the gap.

While the debate rages on between the scholars as to how the Initial Text should be defined, I will start by presenting what might be considered the conservative understanding of it and work from there. Gerd Mink, who is credited with the first use of the term Ausgangstext, employs the term to mean “progenitor” or the “hypothetical, so-called original text” (92). That is to say, the goal of the CBGM in theory is to produce the text from which the rest of the manuscripts flowed. This sounds great, in theory, but there remains a great distance to cover between saying that the CBGM should produce this Initial Text and saying that the CBGM has produced this Initial Text. In any case, the terminology “Original Text” is not employed in the same way it was historically, and there is much deliberation as to whether Mink’s proposed definition will win out over those that wish to define it more loosely.

Based on my experience with systems, a definition of the term as “the text from which the extant tradition originates” (93) is much more precise and descriptive of what the method is capable of achieving. Any talk of whether or not the Initial Text represents the Original Text is merely speculation at this point, and I argue it will remain speculation when the effort is complete. This of course requires a more humble assessment of the capabilities of the CBGM, in that an empirical method is only good for analysis of that which it has in its possession. Which is to say that, methodologically speaking, there is still a gray area of about 300 years or more between the time that the earliest extant manuscripts are dated and the time that the original manuscripts were penned, depending on how early one dates the earliest complete manuscripts. This is what I have been calling the “Gray Area between the authorial and initial text”, or “The Gray Area” for short. Dr. Gurry has defined it as the historical gap (100). I suspect that this gray area will be the focus of all discussion pertaining to the actual value of the ECM by the time 2032 arrives.

The Gray Area Between the Authorial and Initial Text

Any critique of the CBGM is incomplete without a sincere handling of the Gray Area between the Original and Initial Text. Until that conversation has happened, it is premature to draw any conclusions such as, “The CBGM can get us to about AD 125”. Dr. Gurry writes, “The reason is that there is a methodological gap between the start of the textual tradition as we have it and the text of the autograph itself. Any developments between these two points are outside the remit of textual criticism proper. Where there is ‘no trace [of the original text] in the manuscript tradition’ the text critic must, on Mink’s terms, remain silent” (93).

This is understandably a weakness of the methodology itself, if one expects the methodology to produce a meaningful text. Dr. Gurry continues, “Mink’s statement that the initial text ‘should not necessarily be equated with any actual historical reality’ is best read as a way to underscore this point” (93). I propose that it is of the greatest importance that Christians begin discussing the Gray Area between the Original Text and the Initial Text now, as it lies outside the interest of the text-critic proper. Yes, this discussion is most certainly a theological one, as much as some who have buried their heads in the sand regarding the weaknesses of the CBGM might be pained to admit.

It is important to note that in this conversation over the methodology of the CBGM, there is certainly not uniform agreement on the capabilities of this relatively new method. It is my hope that bringing this discussion into a more public space, with its terminology of Original and Initial Text and the space between these two points in the transmission of the New Testament, will foster an important conversation as it pertains to the orthodox doctrinal standards of inspiration and preservation. Dr. Gerd Mink indirectly proposes one possible method of analyzing the Gray Area, which would be to demonstrate that there is a significant break between the Original and Initial Text. Perhaps some ambitious doctoral student might take it upon himself to conduct this work, though I wonder if it is even possible to analyze data that does not exist. That is to say, determining the quality and authenticity of the Initial Text might as well be impossible, and any conclusions regarding this text will be assumptive, unless some new component is added to the CBGM which allows such analysis to be done.

The ontological limitations of the CBGM give cause for the discerning onlooker to side with the assessments of D. C. Parker and Eldon J. Epp. Dr. Epp writes, “Many of us would feel that Initial Text – if inadequately defined and therefore open to be understood as the First Text or Starting Text in an absolute sense – suggests greater certainty than our knowledge of transmission warrants” (Epp, Which Text?, 70). Until those who have a more optimistic understanding of the Initial Text produce a methodology that is adequate for testing its veracity, I see no reason why anybody should blindly trust that the Initial Text can be said to be the same as the Original Text. And that is assuming that the ECM will reveal one Ausgangstext. It is likely, if not inevitable, that multiple initial texts will burst forth from the machine. A general understanding of the quality of the earliest extant texts certainly warrants such a thought, at least.

Conclusion

The purpose of this article is to 1) make a wider audience aware of the difference between the Initial Text and the Original Text and 2) begin the conversation about the Gray Area between the Initial Text and the Original Text. It is best that the church begin discussing this now, rather than in 13 years when the ECM is completed. There are many Christians who may be caught completely off guard when they discover that the somewhat spurious claim that the CBGM has “gotten us to AD 125” is, in fact, not true. The fact stands that nobody has the capability of making such a precise claim at this point, and nobody will be able to make such a claim in 2032 either. It is best, then, that people allow the scholars to finish the work prior to making claims that the scholars themselves are still in dialogue about.

The Divine Original and the Initial Text

“At the most demanding level, I believe that we still await a truly critical edition of the New Testament…Each new discovery made the old critical apparatuses ever more out of date, and, even more worryingly, cast doubt on the quality of existing critical texts…The Nestle-Aland edition is a fine tool, and one could not imagine being without it. But it is a stopgap, awaiting the completion of the Editio critica maior… We begin to see that, great as the achievements of previous editors were, they were working with partial and arbitrarily selected materials which led to theories of the text and its history which were themselves partial, and thus almost bound to be mistaken.” – David C. Parker, Textual Scholarship and the Making of the New Testament, 105-114

Introduction

The current and most advanced effort of New Testament textual scholarship is in progress as I write this article. By New Testament textual scholarship I mean what is commonly referred to as “Textual Criticism”, though the latter name may be inadequate to describe the breadth of the ongoing effort. In order to understand what the “modern critical text” is, it is important to understand that the various printed editions (NA28, UBS5, THGNT, etc.) of the Greek New Testament are just one facet of the work. There is no one “modern critical text”. The effort of textual scholars creating editions of the Greek New Testament is just the practical implementation of that work. So when I speak of “Modern Textual Criticism” on this blog, I am not referring exclusively to the work of creating printed editions of the Greek New Testament, but rather to the larger effort as a whole. Within the umbrella of New Testament scholarship there is a wide array of projects being pursued, and the creation of printed Greek texts is just a part of that work. Simply reducing the conversation to printed editions when discussing modern textual scholarship neglects those researching New Testament texts in art, history, and commentaries, and, of course, the major effort of modern textual scholarship: the Editio critica maior.

The reason I say that the effort of those producing editions of the Greek New Testament is just a part of the work is not to be dismissive. Rather, it is an attempt to 1) accurately describe the scope of the work and 2) highlight the importance of the work that will impact all future printed editions of the Greek New Testament. Recently, I have noticed that there is a discussion over what it means for textual scholars to be searching for the original. In this article, I will briefly address what is called the Editio critica maior, as well as comment on the various uses of the word “original” as it pertains to the New Testament text.

The ECM and the Initial Text

The Editio critica maior (ECM) is, as D. C. Parker describes it, “the narrative of the history of the [New Testament] text” (Parker, 128). In a more tangible sense, it is the largest collection of New Testament data ever compiled (and it is still being created). It contains a critical text and a critical apparatus, and it provides the editors’ justification for the methodology and conclusions (Parker, 112). It is being used in its incomplete form now in printed editions of the Greek New Testament, and it will most likely be the standard by the time it is completed around 2032. Despite the tremendous advance in New Testament data the ECM will provide, it is still not a definitive text; it is a data set representing the available evidence, which does not go back to the time of the Apostles. Parker makes it clear that “A critical edition is not a reconstruction of an authorial text. It is a reconstruction of the oldest recoverable text, the Initial Text” (122). Parker is not alone in his conclusions regarding the current effort of textual scholarship, though some do stand in opposition to him. One of the most controversial claims that I have made is that “no scholar is trying to find the original”, and Dr. Peter Gurry has taken me to task to clarify what I mean by that. Admittedly, it is probably not fair to make such a sweeping statement without clarification. Dr. Gurry has been quite charitable and has pointed me to many valuable resources, which I hope to use accurately. There are in fact many scholars who believe that the initial text might as well be the authorial text, though they do seem to be in the minority, depending on how “initial text” is defined.

Dr. Gurry argues that this confusion is due to widespread disagreement on the use of the term “initial text”, or even its misuse. Many mean by “initial text” the earliest text available in the extant manuscript tradition, which is how Parker employs the term. Yet its original definition by Gerd Mink goes beyond how it is commonly employed. Mink defined the term to refer to the hypothetical archetype of the earliest extant manuscript tradition. This effectively places the initial text earlier in the transmission history than the oldest surviving manuscripts. With this definition, it is more reasonable to believe that the initial text and the authorial text are much closer to each other than the authorial text is to, let us say, Vaticanus. In this regard, Mink and Parker stand in opposition to one another.

Based on the limitations of the methods employed by the CBGM, I agree with Parker’s conclusions on the practical understanding of the initial text over the idealistic definition offered originally by Mink. While Mink’s assumption is that the initial text is a hypothesis for the authorial text, there does not seem to be a good reason for believing this with a high degree of certainty. That is the point of contention between myself and Dr. Gurry: I believe the Scriptures set forth the standard of certainty (Mat. 5:18; 24:35), and that anything less than certainty leads to having no text at all. And since the ECM itself declares that “Apart from the fact that a reconstruction cannot achieve the same degree of certainty at each variant passage, this does not mean that a reconstruction of the authorial text is possible in each case. Moreover, it does not mean that any reconstructed text can claim to be absolutely identical with the authorial text” (30), there will always be somewhat of a gray area between the authorial text and the initial text, even if that gray area is believed to be inconsequential by some.

In any case, it is in fact a matter of nuance as to whether or not textual scholars are trying to find the authorial or original text. If by “original” is meant the hypothetical initial text, then I am defining “original” differently, and some textual scholars are indeed trying to find the “original” as they define it. If by initial text is meant the “earliest form of the extant text”, then the original is not being discussed at all. In both definitions of the initial text, the way “original” is being defined is different from the way it is being discussed on this forum. By original I mean “the text that the Holy Spirit inspired”, down to the word, as defined by the Reformation and Post-Reformation divines. The Puritan John Owen says this: “the Scriptures of the Old and New Testament [which] were immediately and entirely given out by God himself … [are] by his good and merciful providential dispensation … preserved unto us entire in the original languages.” (Works, 16, pp. 351–352)

So it seems that it is a matter of disagreement over how “original” is being defined. In the sense that the theological definition of the word “original” is employed, there are no scholars trying to find the original. When it is framed in this light, the discussion becomes a theological and exegetical discussion as to what the Scriptures say about the doctrine of inspiration and preservation and what “original” means, not a discussion of how the evidence is interpreted. A major focus of this blog is to demonstrate that the discussion of textual scholarship should be framed from a theological starting point, not a historical-critical one. I have already received the critique from some that since I do not have a PhD in the area of textual scholarship, I do not have the right to speak on this issue. While I understand the nature of this argument, my understanding of the Scriptures is that they are sufficient to speak on matters of faith and practice (2 Tim. 3:16). This is most certainly one of those areas, though I can understand if somebody wishes to stop reading the article at this point based on my lack of credentials.

The Pursuit of the Divine Initial Text 

The reality is that the methods being employed to construct the ECM do not offer the degree of certainty that the theological giants of the past had in the Holy Scriptures. Thomas Watson says this: “We may know the Scripture to be the Word of God by its miraculous preservation in all ages … Nor has the church of God, in all revolutions and changes, kept the Scripture that it should not be lost only, but that it should not be depraved. The letter of Scripture has been preserved, without any corruption, in the original tongue.” (Body of Divinity, 19). It is clear that the methods being employed simply cannot ever produce this level of certainty. So regardless of whether some may believe that the Initial Text, as defined by Mink, represents the authorial text, it can never be said with absolute certainty that this is true using the methodology itself.

The problem is a matter of methodology, not a matter of interpretation. Thus my critique is not of those who believe the initial text represents the authorial text; it is of the methodology used to arrive at such conclusions. Parker agrees with my understanding of the Münster Method (CBGM), though I disagree with his view of the text vehemently. “I say again that the user who treats the text of James in the Editio critica maior as identical to a letter written several hundred years before the oldest extant manuscript was copied has made a serious methodological error” (Parker, 122). Regardless of Parker’s opinion, those who believe that the initial text represents the authorial text will take the same data as Parker and come to the opposite conclusion from him.

While Parker’s conclusion, and thus my conclusion, might be considered inflammatory by some, an examination of the method demonstrates that it is simply a cold truth regarding the methodology. The Münster Method (CBGM) itself can never prove that it has produced an original text, in any sense of the word, that recreates exactly what Paul wrote. The text that Paul wrote might be considered a highly likely original reading, yet scholars might relegate it to the apparatus due to the limitations of the methodology and the data used for analysis. It is the interpretations of scholars that will ultimately determine which version(s) of the initial text represent the authorial text. So in a very real sense, the interpreters’ theology of preservation and inspiration, along with other suppositions, is being applied retroactively to the work done by the methods being employed, and the flawed decisions of men are the final authority over which texts are considered “original”.

This shifts the authority of the Holy Scriptures from the object to the subject. Because the authority lies in the subject, and the subject is not omniscient, it is not only likely but inevitable that a legitimately original reading will be rejected for some other reading determined “earliest and best” by a scholar. It does not matter how earnest a particular scholar is in saying, “I want what Paul wrote!”; the fact remains that the methodology does not allow that desire to be realized in any meaningful way. The final authority will always rest on the determinations of scholars and their theological suppositions. At the end of the day, the modern textual scholar must employ faith in believing that they have chosen God’s Word correctly. This is part of the reason why the historical doctrine of the Scriptures as self-authenticating is held by those in the Confessional Text camp. A return to the 16th century is most necessary, for both practical and theological reasons. The authority of the Scriptures does not rest in the determinations of men, but in the providential work of God. This is the fundamental difference between those in the modern camp and those in the Confessional camp, which is why I continue to press theologically on the issue and not evidentially.

Conclusion

I have taken some time to demonstrate the nuance in the discussion of what “original” means. Historically, as I have shown by quotations from the Reformed and Puritan divines, the word “original” meant the words penned by the prophets and apostles. In the modern period, scholars prefer the term “initial text”, and the definition of that term is debated. To some, the initial text is the hypothetical archetype from which all texts flowed, and to others, it is the text that represents the earliest extant form of the New Testament. In all three cases, three different things are being discussed. Thus, using the definition provided by the framers of the 17th-century confessions, I do say confidently that there are no scholars in mainstream New Testament textual scholarship in pursuit of the original as defined by the Reformed. Therefore it is especially appropriate that the view of the text of Scripture presented and defended here on this blog be called “The Confessional Text”, as it represents not only a physical form of the text, but also a distinct theological foundation with specific definitions of terms that have evolved in the modern period.

Many scholars have attempted to reinterpret Francis Turretin, James Ussher, and others to fit the modern definitions of “original”, “preserved”, “kept pure”, and so forth, but the fact remains that these theologians did indeed mean what they said plainly. It is simply more accurate to say that the modern view of the text of Holy Scripture is different from the view presented by the Westminster Divines and their contemporaries. In recognizing this difference, I believe it is possible to have a fruitful discussion on the theological differences underpinning each position. The modern method is, to many, hidden in a black box, and as it becomes more developed, it will come into plain sight for all. When this time comes, the Reformed must be prepared to stand on the truth that the Scriptures are self-authenticating.

“The marvelous preservation of the Scriptures [demonstrates this]. Though none in time be so ancient, nor none so much impugned; yet God hath still by his providence preserved them, and every part of them” (James Ussher, Body of Divinity, 8).