Archaeology dating accuracy
The date of the transition from Iron Age I to Iron Age IIa is important because it determines whether David and Solomon reigned in the archaeologically poor and poorly documented Iron I or in the comparatively rich and richly documented Iron IIa. Proponents of a high Bible chronology put the transition around 1000 to 980 B.C.E.; proponents of a low chronology, often called minimalists, claim it occurred around 920 to 900 B.C.E.; other opinions place it somewhere between the two, in about 950 B.C.E. When it comes to Bible chronology, then, the difference between a "high" and a "low" chronology is a matter of mere decades, not centuries.

Some scholars have asked whether radiocarbon dating accuracy will help settle the question. The question I would like to raise is whether radiocarbon dating is really more precise, objective and reliable than the traditional way of dating when applied to the problem of the date of the transition from Iron I to Iron IIa. In practice, radiocarbon dating accuracy for calculating Iron Age dates, and consequently Bible chronology, has varied from researcher to researcher. Since "long-term" samples such as wood may introduce the "old wood" effect, any calculation of precise absolute dates based on them is unreliable and may easily lead to errors of up to several decades or even more; for this reason, researchers prefer to use "short-life" samples, such as seeds, grain or olive pits. In many studies, particular radiocarbon dates are not considered valid because they do not match the majority of dated samples from the site in question. The imposing Judahite fortress of Khirbet Qeiyafa, for example, has been securely dated by pottery and radiocarbon analysis to the early tenth century B.C.E. The hope of many scholars that this science-based radiocarbon research will bring the debate to its longed-for solution is, in my view, difficult to adopt.
Radioactive carbon-14 is used to analyze an organic material, such as wood, seeds, or bones, to determine the date of the material's growth. Did David and Solomon live in the archaeological period known as Iron Age I, which is archaeologically poorly documented, or in Iron Age IIa, for which more evidence is available?
The Radiocarbon Revolution

Since its development by Willard Libby in the 1940s, radiocarbon (14C) dating has become one of the most essential tools in archaeology. Radiocarbon dating was the first chronometric technique widely available to archaeologists, and it was especially useful because it allowed researchers to directly date the panoply of organic remains often found at archaeological sites, including artifacts made from bone, shell, wood, and other carbon-based materials. However, as with any dating technique, there are limits to the kinds of things that can be satisfactorily dated, levels of precision and accuracy, age range constraints, and different levels of susceptibility to contamination. A number of other factors can also affect the amount of carbon present in a sample and how that information is interpreted by archaeologists.
Thus a great deal of care is taken in securing and processing samples, and multiple samples are often required if we want to be confident about assigning a date to a site, feature, or artifact.
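Why do multiple samples sharpen confidence? One standard piece of arithmetic is the inverse-variance weighted mean used to pool several radiocarbon measurements of the same event: each measurement counts in proportion to how precise it is, and the pooled error is tighter than any single measurement's. The sketch below is illustrative only; the function name and the sample values are my own.

```python
import math

def pooled_date(dates):
    """Combine several radiocarbon measurements of the same event.

    `dates` is a list of (age, sigma) pairs in radiocarbon years.
    Returns the inverse-variance weighted mean and its pooled error:
    measurements with smaller errors are weighted more heavily.
    """
    weights = [1.0 / sigma**2 for _, sigma in dates]
    mean = sum(age * w for (age, _), w in zip(dates, weights)) / sum(weights)
    pooled_sigma = 1.0 / math.sqrt(sum(weights))
    return mean, pooled_sigma

# Two hypothetical measurements of the same olive-pit horizon:
mean, sigma = pooled_date([(3000, 30), (3050, 40)])
# mean = 3018.0, sigma = 24.0 -- tighter than either input alone
```

Note that pooling like this is only legitimate when the samples really do date the same event; an "old wood" sample mixed into the pool would silently drag the mean off by decades, which is one reason short-life samples are preferred.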
Radiocarbon dating is especially good for determining the age of sites occupied within the last 26,000 years or so (though it has the potential to date sites more than 50,000 years old), can be used on carbon-based materials (organic or inorganic), and can be accurate to within ±30–50 years.
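The arithmetic behind a raw radiocarbon age is simple exponential decay: by convention, a "conventional radiocarbon age" is computed from the measured fraction of modern carbon using a mean-life of 8033 years (derived from the 5568-year Libby half-life). A minimal sketch of that calculation, with an illustrative function name:

```python
import math

LIBBY_MEAN_LIFE = 8033  # years: conventional constant, 5568 / ln(2)

def radiocarbon_age(fraction_modern):
    """Conventional (uncalibrated) radiocarbon age in years BP.

    `fraction_modern` is the sample's measured 14C activity relative
    to the modern standard. When half the original 14C remains, one
    Libby half-life (~5568 years) has elapsed.
    """
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# A sample retaining half its original 14C:
age = radiocarbon_age(0.5)  # ~5568 radiocarbon years BP
```

Such an age is still a radiocarbon age, not a calendar date; converting it to calendar years requires calibration against independently dated records such as tree rings, which is part of the interpretive work alluded to above and contributes to the ±30–50-year range just mentioned.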