If I could identify the most significant disconnect between those who advocate for modern critical methods and those who advocate for the Received Text, it is the difference in how evidence is handled. From a modern critical perspective, it is baffling that the manuscript evidence produced for or against a reading is rejected. From a Received Text perspective, evidence should be used to support the God-given text, not to reconstruct a new text. Despite the frustration this might cause on both sides, there are good reasons that those who advocate for the preservation of the Received Text are not swayed by the evidence-based presentations of the academy and those who follow in that tradition. Instead of simply shrugging off these reasons, I believe it is wise to consider the perspective on evidence presented by those in the Received Text camp. The concerns raised about evidence-based critical methods cannot be answered by simply talking about textual variants ad nauseam. In this article, I will present four reasons why evidence should not be considered such an authoritative source for textual criticism, and then give a positive presentation of how we can know what the Scriptures say.
Evidence Requires Interpretation
The single most impactful cause of changes in modern Bibles is the reinterpretation of data. That is because modern textual criticism views evidence as the source material for reconstructing a text that has been lost. One generation, the church may deem a manuscript or reading of little value, and the next, the most valuable text available. We have seen this practically implemented by the introduction of various brackets, footnotes, and omissions in modern Greek texts and the translations made from them. This is inevitable with evidence-based approaches, because the shape of the text is driven by whichever theory for evaluating the evidence is in vogue. While the transmission of the New Testament was guarded from such significant changes by virtue of churches using these handwritten manuscripts and the lack of technology for mass distribution, the modern church is not guarded by such a mechanism. The lack of church oversight in the creation of modern texts also contributes to the Bible's ability to shift year by year at such a quick rate. If a change is made in one place of Scripture today, it can be distributed in thousands of copies, essentially overnight, without consulting a single pastor. This should concern the people of God, but it has unfortunately become standard practice, advertised as a necessary reality.
Within the various modern printed Greek editions, the perspective of the editors defines which verses are included, omitted, bracketed, and footnoted. Since evidence-based methods are always driven by the perspective of the person interpreting the data, the shape of the New Testament is intimately connected with the methodologies of the scholars themselves. Yet even the scholar's analysis is subject to interpretation. This is clearly demonstrated when an editor prints a reading in the main text of a Greek New Testament, and the user of that text selects a reading from the footnotes over and above the reading chosen by the editor. Even within the modern critical text community, there are disagreements over how the agreed-upon source data should be interpreted, which is evidenced by the various printed editions produced by modern text critics.
It is often said that the cause of change in modern Bibles is "new" data, yet this doesn't seem to be the case. The texts which modern Bibles are based on were not created in the 19th century; they were published in the 19th century. And modern Bibles are mostly the same as Westcott and Hort's text at a macro level. At one point, the people of God had those manuscripts, and their interpretation of them shaped the way that Greek manuscripts were copied going forward. The difference is that modern interpreters have valued that data more heavily than it was weighted historically, and those in the Received Text camp view that historical weighting as an act of God's providential guidance of the text as it was passed along. In short, the interpretation of the new (to us) data is really a greater factor than the data itself.
Evidence-Based Methods Are Weak Because Manuscripts Cannot Talk
In modern critical methods, extant manuscripts dated prior to AD 1000 are considered the most valuable or relevant New Testament witnesses. Though data after this time is consulted and considered, it is not given the same weight as earlier manuscripts. This becomes problematic because most of our New Testament manuscripts come from after the data window selected by scholars. Additionally, many of these early manuscripts are without a pedigree, meaning that we do not know who made them, why, or for whom. For example, Frederic Kenyon proposed that the papyri fragments were not used in churches for reading, but were actually personal texts that a Christian could carry around and read privately. That means that they were more likely to be paraphrastic, contain errors, or be used outside the mainstream of orthodoxy. Yet this is just a theory. We simply do not know why they were made, for whom, or how they were used. Such is the case for nearly all of our early manuscripts. Why is this problematic?
During the time period from which our earliest extant manuscripts come, some of the most heated theological debates were going on regarding the humanity and divinity of Christ. We also have testimony from that time concerning the tampering of manuscripts. Since we do not know anything about the source of these manuscripts, it is nearly impossible to know whether a manuscript was used in an orthodox church, or an Arian church, or wasn't used in a church at all. What is more important from a data analysis perspective, then, is what we do know of the context in which those manuscripts were created. Since theological context is not considered in modern critical approaches, we could very well have a reading in our Greek text that was introduced by Marcion himself and be none the wiser. And even if we do print a reading in our modern Bibles that has been historically questioned as gnostic or Arian, this is not taken into consideration by modern methodology.
That means that while early New Testament manuscript evidence serves a powerful apologetic purpose against claims that Christianity was "invented" at some point around the fourth century, it simply does not have the same kind of weight when it comes to constructing accurate Greek texts. We may reproduce the exact hypothetical archetype of Codex B, for example, and still not know who used it, when that archetype was created, or where it came from. One can say that the archetype of Codex B reaches back to the first century, when in reality it may have been created a month before the codex itself. We simply cannot know. I will continue to argue on my blog that this is a good thing for the church, as it takes the authority of the Scriptures out of the hands of men. By God's singular care and providence, He has kept His Word pure, and it does not need to be reconstructed.
Evidence-Based Methods Are the Weakest They Have Ever Been
In the 21st century, we are farther removed from the creation of the manuscripts used to make modern Bibles than any other generation. That means that we have the least perspective on the data we do have, regardless of how much of it we have. The only people who had clear insight into Codex Vaticanus, for example, were the people who created it, used it, or had access to since-lost commentary on it, if such commentary ever existed. If we ignore the insights of the scholars and theologians of the Reformation on this text, which modern scholars typically do, we essentially know nothing about it except what can be ascertained from its readings. And if we do not assume any text as a standard base text, it is impossible to discern anything from the readings of that text at all, other than that somebody used it at some point. How can we know whether a reading is orthodox or not if there is no standard to compare it against? It is difficult, even impossible, to know much about a manuscript if the theological context of the time it was created is not considered.
That puts us at a unique time in history, different from even the Reformation. During the 16th century, manuscripts were still being used in churches and in liturgies. In every generation of the church this was the case until the printing press. Rather than assuming “we know more,” it is wiser to assume that we actually know less. Here is a metaphor that may be helpful in understanding my point.
Let's say that 1,000 years from now, somebody finds a gas-powered lawn mower disassembled into hundreds of pieces in a junkyard, in a pile of other disassembled equipment. The person knows it's a lawn mower, but doesn't know what kind of lawn mower it is or what it originally looked like. If this person wanted to reconstruct that lawn mower, he would have to know which parts belong to it. He may have another lawn mower which looks somewhat like the one he wants to reconstruct, but it's not the same make and model, and it is from a different time period. Here's the problem: the person doesn't know which parts go to the lawn mower he wants to repair, and even if he did, he wouldn't know exactly how to repair it without an instruction manual, because nobody has used lawn mowers in 400 years. In order to repair the mower, he needs to find somebody who knows how, or a preserved model to use as a guide. He could spend his whole life trying to reconstruct the mower, and even if he got it to work, he wouldn't know whether the parts he used were from another piece of equipment that shared parts with the mower, another mower altogether, or the original mower. The reconstructed mower might even have parts that work but aren't the right parts. Only a person who knew what the lawn mower originally looked like could tell him whether he reconstructed it correctly with the right parts. A person trying to reconstruct the mower while other mowers of the same kind were still in production would have no problem with the same task, and would achieve more accurate results. The fact is, the person will simply never know whether all the parts he used even belonged to the mower to begin with, because he doesn't have the original mower. You wouldn't call that lawn mower preserved, in any case, even if all the parts were in the junkyard somewhere.
The metaphor works quite nicely with manuscript evidence. During the time a manuscript was created, people knew what that manuscript was for, who made it, and who used it. They would even know where it departs from the rest of the manuscripts circulating at the time. These are insights we simply cannot recover, unless some extant commentary on the manuscript informs us of these things. And oftentimes when we do have this kind of commentary, it is ignored and labeled fraudulent or "out of context." The point is, the further we get from the creation of a manuscript, the less we can know about it based on the manuscript itself. This, I argue, is yet another function of providence. We do not need to reconstruct a text, because the Bible has been kept pure in all ages. We need to receive the text as it has been passed down, not recreate a version of the New Testament that looks like a text (Codex B) that was produced by, according to Scrivener, "more or less a Western unitarian" (Royse, Scribal Habits in Early Greek New Testament Papyri, 3).
Evidence-Based Methods Are Weak Because They Give False Confidence That We Have the Right Reading
While the scholars working in the field are vocal about not having absolute confidence in the evidence, by the time a text gets to the pew, this doubt has dissipated through popular-level presentations on textual criticism. A scholar can print a reading in a Greek text while having doubts about its place in the transmission history, and a Christian will use that reading as if it were the Divine Original itself. A scholar might even have great confidence that the reading printed, or not printed, in the text is original, and be wrong. Due to the nature of genealogical methods of textual criticism, a reading can be erroneously placed earlier or later in the textual transmission history, and the scholar would never know it.
The problem with basing the form of our Bibles on extant evidence is not a problem with all evidence; it is a problem of which evidence. If scholars are wrong in their theories, the church has a Bible that is based on the wrong manuscripts, and nobody is the wiser for it. In other words, scholars, like the person who set out to reconstruct the lawn mower, do not know enough about the manuscripts they have selected to say that their reconstruction looks anything like the original. Sure, it's a form of the New Testament, but is it the New Testament? Just as the reconstructed lawn mower might look like the original, the person will never know where he used the wrong part. Since we do not have this metadata on our earliest extant manuscripts, the reconstructed product does not say much other than that it looks like a version of the New Testament that existed at some point in history. That text may have existed, but can we know who used it, and why we needed to reconstruct it? What text is this that has fallen away, if God's Word has been preserved? Reconstructionist textual criticism introduces far more problems than it solves, pointing again to the reality that the Bible never needed to be reconstructed based on the evidence that was published in the 19th and 20th centuries. That yet again points to the reality that God did not desire for His people to engage in this effort, but to receive the text He had already given.
Why Those In the Received Text Camp Do Not Base Their Bible Primarily on Evidence
Simply put, because it is not wise to do so, for the reasons listed above. That is why the primary principle of the Received Text position is providence. According to the Scriptures, God has preserved His Word. The question for most people is: how did He do it? Those in the Received Text camp say that in every generation of copying New Testament manuscripts, the Christians who copied and used those manuscripts had the best perspective on them. Those in the modern critical text camp typically say that we, in modernity, have the best perspective on the original languages and the extant manuscripts. Rather than assuming that "we know better," it is wise to avoid that kind of thinking and instead look to providence. Recognizing God's providence is a matter of receiving the product that existed in continued use throughout the ages. Simply because a manuscript survived does not mean that it was in the category of "in continued use." In fact, a manuscript surviving from 1,700 years ago seemingly points in the opposite direction. I've worn out high-quality printed Bibles in a year. If somebody finds my Bible intact in 1,700 years and can still read it, that would say a lot about how little I used it! Since we don't know much about the earliest manuscripts, they are not helpful in determining such a text. If we can't say where a manuscript came from or how it was used, it is not wise to assume we know that information when we simply don't. Further, the early data sample is not broad enough to know what else existed at the time and was being used. There is a reason the majority of our manuscripts do not look like those called "earliest and best," and instead of assuming that the people of God got it wrong for hundreds of years, it is more reasonable to assume that the people of God had more information and more insight on these manuscripts than we do now.
That is why the Reformation is such a pivotal reference point for the transmission history of the New Testament text. It is a time for which we, in the 21st century, have the best insight into what actually happened, and the most commentary on the manuscripts and readings that were agreed upon as authentic by the people of God. During that time, and even in the early church, the concept of an "authentic" manuscript was a driving force in identifying texts that should be used. No such function exists today in modern textual criticism. Manuscripts are weighted according to text-critical principles, not evaluated by their authenticity or by the way the church viewed them historically. Even with all we know of the Reformation, there is still much we will never know about that process. If we do not even know every manuscript used in the creation of the Received Text, how much more absurd is it to try to figure out the origins of hand-copied manuscripts from the fourth century and earlier?
One of the things that needs to be recognized is that we will never know exactly how the New Testament was transmitted; we simply know that it was. That is why textual scholars have been developing theories for the last 200 years: there is no definitive trail through time, derivable from extant data, leading back to the start. It is important to note that just because we do not know does not mean that Christians throughout the ages did not know. Actually, it seems that this generation is the only one that doesn't know. That speaks volumes about the methods of modernity. We cannot say that every New Testament in the fourth century looked like Codex B, and even from an evidence-based approach, that conclusion stands at odds with the data and with reason. We can only reasonably approach the issue with what we do know: that God preserved His Word, and that by the time it was mass produced, it looked a certain way. If we want to approach the text like any other book, and say that the New Testament evolved and was not preserved, we will spend our whole lifetime watching the text of the New Testament change form with the ebbs and flows of the academy's theories. Oftentimes evangelical textual scholars say, "I do not think God was obligated to give us the original. We should be grateful for what we do have." Yet I say that that conclusion is based on theories about manuscripts we simply do not know enough about to draw such a conclusion. The conclusion first assumes that the Bible was destroyed, like the mower, and needs to be reconstructed. Yet it is clear, based on the extant evidence, that if the goal is to reproduce the original, this is an unwise errand. It seems especially off base if we are trying to maintain the doctrine of preservation in any meaningful way.
A simple realization that causes people to return to the historical Protestant text is oftentimes the reality that we do not know enough about the transmission history of the New Testament up to the time of the Reformation to responsibly say that we can reconstruct it to the specifications of the original. Like the person who reconstructed the lawn mower, we will never know if we included all the parts, or even included parts that came from sources other than the original. That is why the modern effort of textual criticism is more confident in saying what "isn't" Scripture than what is Scripture, even though those conclusions are based on evidence we really don't know a whole lot about. What those in the Received Text camp set forth is that it is not primarily a matter of extant evidence which gives us our New Testament; it is a matter of which evidence we know the people of God used in time. There is no reason to assume that orthodox Christians even used our "earliest and best" manuscripts. That is assuming modern scholars even factor in that metric, which does not exist in the axioms of the critical methods. If our view of the transmission of the New Testament is based on the belief that God preserved His Word, it is difficult to propose that He did so by first destroying it so that it had to be reconstructed. The belief that God requires His Word to be reconstructed only came about due to the reevaluation of manuscripts which we know virtually nothing about. We do not know who created them, who used them, or even whether they were used outside of a single church or area. That is not a wise foundation, from both a data perspective and a common sense perspective.
Simply because the early evidence is not uniform and has no pedigree does not mean that God did not preserve His Word. In fact, it seemingly demonstrates that God, in His providence, would make it quite difficult for Christians to responsibly place their faith in any process that lodges the authority of the Scriptures in the axioms of modern scholarship. When the early evidence is viewed in light of what we know about it, its value as a source for reconstruction fades to a dim glimmer. What it does demonstrate is that the New Testament existed as early as it says it did, and that it was transmitted all the way down to today. Identifying its original form was never meant to be a matter of reconstruction, but of reception.
If we are not to reconstruct the text, but to receive it as a preserved whole, then providence is a much better guide than reconstruction by way of extant evidence about which we have little information. By recognizing God's providence, we recognize that the people of God in every generation had the best insight on the manuscripts that were extant to them, used by them, and copied by them. This allows us to at least recognize the general form of the New Testament, what Theodore Letis called the "Macro Text." In other words, the general copying process of the text of Holy Scripture naturally corrected significant variants, either by producing another copy or by correcting that copy in the margin, according to the best manuscripts available in every age. By the time technology increased with the printing press, many manuscripts containing variants existed, yes; almost every single one of the significant variants recognized today existed then. Yet, unlike this generation, the scholars and theologians of the time had a better perspective on that data because it was still being used. They had more insight on the text that had been handed down as "authentic" than we ever will.
We will never know what was contained in every manuscript that was destroyed after that time. In fact, there are hundreds of manuscripts that we know exist today that we simply do not have access to examine. Readings that we consider "minority" today could easily have been majority readings in AD 800. There are readings considered "minority" today that could have been the majority during the time after the Reformation, and we would never know. The assumption that extant data is the best data is simply not in line with how much we know about manuscripts being destroyed throughout time. How could it be that, for the first time in church history, God finally allowed His church to "get it right" concerning the text of Holy Scripture? How could it be that now, even knowing how many thousands of manuscripts were destroyed, is the time when we have the most of them? It may be true that we have the most access to all of the manuscripts due to technological advances, but it is important to remind ourselves that we have the least insight on the ones we do have. Additionally, what value is all of this data if modern scholars are only looking at a subset of it? The very subset that we know the least about, no less!
The point of this blog is to give people confidence that the people of God in previous eras of the church had that insight, and that by God's providence, He preserved His Word. Rather than believing that we need to reconstruct the text, we should receive the text handed down to us. What we do know of the text of the Reformation is that the people of God used it, translated it, and commented on it. It was so agreed upon that people have called it the "default" text. Does that not sound providential? That the text was so agreed upon it was "default"? The reality is, we do not have the justification, based on evidence at least, to unseat a text so agreed upon. We have no reason to doubt that God has providentially preserved His Word by handing it down through the people of God who used it. We should cherish the fact that God does desire for His sheep to hear His voice, and has given us His Word to make that possible.
The alternative, as I have seen it and demonstrated on my blog, is not such a view. It is a view which says that we don't know exactly what God spoke by the prophets and apostles, that we need to reconstruct a lost text which has evolved over time. It is a view which says that God didn't desire to give us all of His Word, just enough of it to get by. It is a view which says that even if God did preserve His Word, we would have no way to know that we have it. It is an honest evaluation of what the Bible is, if we assume that the early, choice evidence preferred by the academy is the only way we have to determine what God's Word is. It makes perfect sense that such scholars would come to these conclusions, if we consider the limitations of evidence-based critical methods. This article hopefully demonstrates that. If anything, God's providential work in time has shown us that it is folly to try to reconstruct a text that never fell away. It seems that the real text that has fallen away is the one the scholars are trying to reconstruct.
For more on the Providentially Preserved Text: https://youngtextlessreformed.com/2019/11/06/a-summary-of-the-confessional-text-position/