We are collecting cases of discontinuous technological progress, to inform our understanding of whether artificial intelligence research is likely to undergo such a discontinuity. This page details our investigation.

Details

Motivations

We are interested in whether artificial intelligence research is likely to undergo discontinuous progress in the lead-up to human-level capabilities, or whether it will get there via incremental steps. If the former, we are interested in the nature of such discontinuities.

Why are we interested in this? If discontinuity is likely, a transition to AI is more likely to be abrupt, more likely to be soon, and more likely to be disruptive. Also, if we think a discontinuity is likely, then our research should investigate questions such as how to prepare or be warned, and not questions like when the present trajectories of AI progress will reach human-level capabilities. As well as being decision relevant and important, this question appears to attract substantial disagreement, making it particularly important to resolve.

This project aims to shed light on the potential for discontinuities in AI by investigating the degree and nature of discontinuities in other technologies. This seems an informative baseline for our expectations about AI, especially if we have no strong reason to expect artificial intelligence to be radically unusual in this regard.

We are interested in several specific questions, such as:

  • How common is abrupt progress in technology?
  • Where there are discontinuities, how much progress do they represent, relative to previous rates of progress?
  • What predicts such discontinuities, if anything?

We are also interested in overall distributions of size of progress increments, but searching specifically for the very largest increments bears on this in a less straightforward way, so we are likely to investigate it by other means later.

Methods

We have collected around fifty instances of technological change which are contenders for being discontinuous. Many of these are suggestions offered to us in response to a Facebook question, a Quora question, and personal communications. We obtained others by searching Google Images for graphs showing abrupt progress, and noting their subject matter.

We are taking these cases one by one, and assessing whether each involved discontinuous progress on plausible and interesting metrics. For instance, if we were told that fishing hooks became radically stronger in 1997, we might investigate the strength of fishing hooks over time—if we could find the data—and also their cost and how many fish could be caught, because these are measures of more natural interest which we might expect to be related.

We generally count progress in an area as ‘discontinuous’ if the improvement between two measurements is far larger than what one would normally expect over the same time period. This definition is open to revision, as we gain a better understanding of the landscape.
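
As a rough illustration of how we apply this definition, the toy sketch below (the function, the linear-extrapolation assumption, and the numbers are all our own invention for illustration, not a fixed procedure we follow) expresses a single jump in a metric as the number of years it would have taken at the average prior rate of progress:

```python
def years_of_progress_at_once(times, values, i):
    """Estimate how many years of progress, at the average rate seen
    before measurement i, the jump from measurement i-1 to i represents.

    Assumes roughly linear past progress; for metrics that improve
    exponentially, the same idea can be applied to the log of the values.
    """
    past_rate = (values[i - 1] - values[0]) / (times[i - 1] - times[0])
    jump = values[i] - values[i - 1]
    return jump / past_rate


# Toy example: a metric that usually gains about one unit per year,
# then gains thirty units in a single year.
times = [1900, 1910, 1920, 1930, 1931]
values = [0, 10, 20, 30, 60]
print(years_of_progress_at_once(times, values, 4))  # ~30 years of progress at once
```

Under the key below, a jump worth more than ten such years counts as a moderate discontinuity, and more than a hundred as a large one.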

List of cases we have evaluated

This is a list of areas of technological progress which we have tentatively determined either to involve discontinuous technological progress or not. Note that we only investigated cases that looked likely to be discontinuous.

Key:

  • Large discontinuity (>100 years of progress at once)
  • Moderate discontinuity (>10 years of progress at once)
  • No sign of substantial discontinuities
  • Investigation begun but still in progress

The Haber Process 

The Haber process was the first energy efficient method of producing ammonia, which is key to making fertilizer. The reason to expect that the Haber process might represent discontinuous technological progress is that previous processes were barely affordable, while the Haber process was hugely valuable—it is credited with fixing much of the nitrogen now in human bodies—and has been used on an industrial scale since 1913.

A likely place to look for discontinuities, then, is in the energy cost of fixing nitrogen. Table 4 in Grünewald’s Chemistry for the Future suggests that the invention of the Haber process reduced the energy expense per unit of nitrogen bonded by around 60%, relative to a method developed eight years earlier. The previous step, however, appears to have represented at least a 50% improvement over the process of two years earlier (though the figure is hard to read). Later improvements to the Haber process appear to have been comparable. Thus it seems the Haber process was not an unusually large improvement in energy efficiency, but was probably instead the improvement that happened to take the process into the range of affordability.

Since it appears that energy was an important expense, and the Haber process was especially notable for being energy efficient, and yet did not represent a particular discontinuity in energy efficiency progress, it seems unlikely that the Haber process involved a discontinuity. Furthermore, it appears that the world moved from other sources of fertilizer to the Haber process gradually, suggesting there was not a massive price differential, nor any sharp practical change as a result of the adoption of the process. In the 1920s the US imported much of its nitrogen from Chile, and the alternative nitrogen source calcium cyanamide reached peak production in 1945, thirty years after the Haber process reached industrial-scale production.

The amount of synthetic nitrogen fertilizer applied hasn’t abruptly changed since 1860 (see p24). Neither has the amount of food produced, for a few foods at least.

In sum, it seems the Haber process has had a large effect, but that effect was produced by a moderate change in efficiency, and manifested over a long period.

Penicillin on syphilis 

Penicillin was introduced to clinical use in 1941, and quickly became the preferred treatment for syphilis. At around that time, there began a steep decline in the prevalence of syphilis, which appears to be generally attributed to penicillin. Cases of syphilis declined by around 80% over fifteen years, as shown in figure 1. Between 1940 and 1975, deaths from syphilis declined by over 98%, from 14 deaths per hundred thousand to 0.2, as shown in figure 2.

It is possible from our perspective that this decline is not entirely from penicillin. US Surgeon General Thomas Parran launched a national syphilis control campaign in 1938. Wikipedia also attributes some of the syphilis decline over the 19th and 20th centuries to decreasing virulence of the spirochete. Nonetheless, penicillin is likely responsible for most of it.

Figure 1: Historic syphilis infection rates in the US (Wikipedia).

Figure 2: Syphilis mortality rates in the US during the 20th century, from Armstrong et al.

Either way, the decrease in deaths from syphilis appears to have been rapid, but not abrupt: syphilis cases and deaths gradually came down over around fifteen years. In figure 2, the annual reductions during the fastest decline are not much larger than the characteristic difference between years before the decline.

Even if penicillin’s effect on the national death rate from syphilis was gradual, we might expect this to be due to frictions like institutional inertia, rather than ongoing technological improvements. Thus it could still be the case that penicillin was a radically better drug than its predecessors, when applied.

Recent predecessors to penicillin included arsenic and bismuth compounds, and intentionally contracting malaria. On casual investigation, it appears that penicillin was successful about 85% of the time soon after its development, while a previous treatment—arsenic and bismuth—was successful around 90% of the time, though it is unclear whether the same kind of success is being measured. However, the success figures (for the latter at least) include only people who completed the treatment, and it appears that perhaps only a quarter of patients tended to receive a ‘minimum curative dose’ of arsenic and bismuth therapy before ‘defaulting’, seemingly due to the prolonged nature of the treatment and its unpleasant side effects (though the death rate for untreated syphilis is apparently 4%-54%, so it is somewhat surprising to us that so many people would default). For an early version of penicillin, almost all patients could receive a minimum curative dose, a difference that might represent a large improvement in syphilis treatment. A back-of-envelope comparison using these figures is sketched below.
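
The following sketch combines the success and completion figures above into an expected cure rate per patient who begins treatment. The completion rates in particular are rough readings of the sources, used only for illustration:

```python
# Rough comparison of expected cure rates per patient who begins treatment,
# using the approximate figures quoted above. The completion rates in
# particular are uncertain, and are used here only for illustration.
penicillin_success, penicillin_completion = 0.85, 1.00  # "almost all" completed
arsenical_success, arsenical_completion = 0.90, 0.25    # "perhaps a quarter" completed

penicillin_effective = penicillin_success * penicillin_completion  # ~0.85
arsenical_effective = arsenical_success * arsenical_completion     # ~0.23

print(penicillin_effective / arsenical_effective)  # roughly 3.8
```

On these assumptions, penicillin would cure roughly four times as many of the patients who began treatment, driven mostly by the difference in completion rates.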

If penicillin made an abrupt difference to syphilis treatment then, it seems it is likely to have been in terms of costs, broadly construed (which were partly reflected in willingness to be treated). The time required for treatment fell from more than 20 weeks to 8 days for the first penicillin patients. The side effects qualitatively reduced from horrible and sometimes deadly to bearable (see above). Evaluating these costs quantitatively remains beyond the scope of this investigation at present.

Even if penicillin was in fact a large improvement over its predecessors in absolute terms, it was less obviously unusual relative to characteristic progress in syphilis treatments. Arsphenamine, released in 1910, was sold as ‘Salvarsan’, known as the ‘magic bullet’, and its discoverer Paul Ehrlich won a Nobel prize. A physician at the time described it:

“Arsenobenzol, designated “606,” whatever the future may bring to justify the present enthusiasm, is now actually a more or less incredible advance in the treatment of syphilis and in many ways is superior to the old mercury – as valuable as this will continue to be – because of its eminently powerful and eminently rapid spirochaeticidal property.”

In sum, penicillin probably made quick but not abrupt progress in reducing syphilis and syphilis mortality. It is unclear whether penicillin was much more likely to cure a patient than earlier treatments, conditional on the treatment being carried out, but it appears penicillin treatment was around four times more likely to be carried out, due to lower costs. Qualitatively, it appears that penicillin represented an important reduction in costs, but it is hard to evaluate this precisely or compare it with longer-term progress. It appears that as recently as 1910, another syphilis drug also represented qualitatively impressive progress in treatment.

Nuclear weapons 

Main article: Discontinuity from nuclear weapons

Nuclear weapons represented abrupt progress in explosive power, but probably not in cost-effectiveness. They represented progress in relative explosive efficiency which would have taken over six thousand years at previous rates.

High temperature superconductors 

Main article: Cases of discontinuous technological progress

High temperature superconductors represented abrupt progress in the temperature at which superconductivity could take place. In a brief period, progress took place which would previously have taken at least a hundred years.

Jet-propelled vehicles 

Main article: Cases of discontinuous technological progress

Jet-propelled vehicles produced a moderate discontinuity—about thirty years of progress at previous rates—in the land speed record.

Fairey Delta 2 and Lockheed YF-12A 

Main article: Cases of discontinuous technological progress

Fairey Delta 2 and Lockheed YF-12A were planes which increased the air speed record by relatively large factors. They represented 11-17 years and 7-8 years of progress respectively, at previous rates.

The printing press 

The printing press is generally credited with massively increasing the availability of the printed word, starting in around 1450. For instance, some estimate that the number of books in Europe climbed from 30,000 to 10,000,000 in the fifty years following its introduction.

We have not looked into this in depth yet, and there is some ambiguity around the relevance of other printing methods. For instance, according to Wikipedia, in the mid-fifteenth century block-printed books were cheaper than those printed on a printing press.

Aluminium 

It is often claimed that the discovery of the Hall–Héroult process in the 1880s brought the price of aluminium down precipitously. We found several smidgeons of quantitative data about this, but they conflict seriously. By far the most rigorous-looking is a report from Patricia Plunkert at the US Geological Survey, from which we get the following data. Be warned, however, that some of these figures may be off by orders of magnitude, if other sources are to be trusted.

Plunkert provides a table of historic aluminium prices, according to which the nominal price fell from $8 per pound to $0.58 per pound sometime between 1887 and 1895 (for most of which period no records are available). This period probably captures the innovation of interest, as the Hall–Héroult process was patented in 1886 according to Plunkert, and the price dropped by only $1 per pound during the preceding fifteen years according to her table. Plunkert also says that the price was held artificially low to encourage consumers in the early 1900s, suggesting the same may have been true earlier; however, this seems likely to be a small correction.
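
As a crude indication of scale, the nominal figures in Plunkert’s table alone imply that this drop was very large relative to the preceding rate of decline. The calculation below ignores inflation, the gap in the records, and the artificial-pricing caveat:

```python
# Back-of-envelope: how large was the 1887-1895 price drop relative to the
# preceding rate of decline? Nominal prices from Plunkert's table; this
# ignores inflation and the artificially-low-pricing caveat noted above.
prior_rate = 1.00 / 15      # roughly $1/lb of decline over the preceding 15 years
drop = 8.00 - 0.58          # $/lb fall sometime between 1887 and 1895
print(drop / prior_rate)    # ~111 years' worth of decline at the prior rate
```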

The sewing machine 

Early sewing machines apparently brought the time to produce clothing down by an order of magnitude (from 14 hours to 75 minutes for a man’s dress shirt, by one estimate). However, it appears that the technology itself progressed more gradually, and was taken up by the public later, probably once it became cost-effective, at which point adopters may have experienced a rapid reduction in sewing time (presumably at some expense). These impressions are from a very casual perusal of the evidence.

Video compression 

Blogger John McGowan claims that video compression performance was constant at a ratio of around 250 for about seven years prior to 2003, then jumped to around 900.

Figure 3: Video compression performance in the past two decades.

Information storage volume 

According to the Performance Curves Database (PCDB), ‘information storage volume’ for both handwriting and printing has grown by a factor of three in recent years, after less than doubling in the preceding hundred years. However, it is unclear exactly what is being measured here.

Undersea cable price 

The bandwidth per cable length available for a dollar apparently grew by more than 1000 times in around 1880.

Chess AI 

There was a notable discontinuity in chess AI according to the SSDF ratings. However, it appears to represent less than ten years of progress at previous rates. Also, part of this jump appears to have been caused by the introduction of new hardware in the contest.1

Figure 4: Elo ratings of the best program on SSDF at the end of each year. Data from Wikipedia.

Infrared detector sensitivity 

Infrared detector sensitivity is measured in terms of ‘noise equivalent power’ (NEP): the amount of power (energy per unit time) that needs to hit the sensor for the sensor’s output to have a signal-to-noise ratio of one. We investigated progress in infrared detection technology because, according to Academic Press (1974), the helium-cooled germanium bolometer represented a four order of magnitude improvement in sensitivity over uncooled detectors.2 However, our own investigation suggests there were other innovations between uncooled detectors and the bolometer in question, and thus no abrupt improvement.
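
To make this definition concrete, here is a minimal sketch using invented numbers (the responsivity and noise figures below are assumptions for illustration, not taken from the detectors discussed): if a detector converts incident power into an output signal with some responsivity, then the incident power at which the output signal equals the output noise is the NEP, and a lower NEP means a more sensitive detector.

```python
# Illustrative numbers only (not taken from the sources above). NEP is the
# incident power at which the detector's output signal equals its noise.
responsivity = 1e4       # V/W: output volts per watt of incident power (assumed)
output_noise = 2e-8      # V: output noise in the measurement bandwidth (assumed)

nep = output_noise / responsivity  # W: power giving a signal-to-noise ratio of 1
print(nep)                         # 2e-12 W

# A four order of magnitude improvement in sensitivity corresponds to an
# NEP roughly 10,000 times smaller.
improved_nep = nep / 1e4           # 2e-16 W
```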

We list advances we know of here, and summarize them in Figure 5. The 1947 point is uncooled. The 1969 point is nearly four orders of magnitude better. However we know of at least four other detectors with intermediate levels of sensitivity, and these are spread fairly evenly between the uncooled device and the most efficient cooled one listed.

We have not checked whether the progress between the uncooled detector and the first cooled detector was discontinuous, given previous rates. This is because we have no strong reason to suspect it is.

Figure 5: Sensitivity of infrared detectors during the transition to liquid-helium-cooled devices.

Genome sequencing 

This appears to have seen at least a moderate discontinuity. An investigation is in progress.


 

  1. ‘The jump perfectly corresponds to moving from all programs running on an Arena 256 MB Athlon 1200 MHz to some programs running on a 2 GB Q6600 2.4 GHz computer, suggesting the change in hardware accounts for the observed improvement. However, it also corresponds perfectly to Deep Rybka 3 overtaking Rybka 2.3.1. This latter event corresponds to huge jumps in the CCRL and CEGT records at around that time, and they did not change hardware then. The average program in the SSDF list gained 120 points at that time (Karlsson 2008), which is roughly the difference between the size of the jump in the SSDF records and the jump in records from other rating systems. So it appears that the SSDF introduced Rybka and new hardware at the same time, and both produced large jumps.’ – Grace 2013, p19
  2. ‘Following Johnson’s work at shorter wavelengths, photometric systems were established at the University of Arizona for each of the infrared windows from 1 to 25μm. At 5, 10, and 22μm, the helium-cooled germanium bolometer was used. This detector provided four orders of magnitude improvement in sensitivity over uncooled detectors and was utilized at wavelengths out to 1000μm.’ – Academic Press, 1974