If knowledge is power, and power is money, we need to keep investing in more and better knowledge
Technical knowledge gives us the power to live better lives. Compare Europe today with the Roman Empire 2,000 years ago. We have industry, mass consumer goods, air transport and the tools for essentially universal education and healthcare. The Romans had a limited range of handmade goods, their fastest transport was the horse, only a minority had basic literacy and scientific medicine barely existed. Our education and cosmopolitanism have brought liberal democracy; they lived under a capricious dictatorship.
In the long run, the source of progress and economic growth is science and technology. In our modern economy, firms realise that being first to use a new technology can yield large revenues, so they invest in research and development. The problem is that, because the benefits of fundamental research are long-term and spread across many people, firms lack the incentive to invest enough in it.
Some governments have realised that investing in science, beyond what private donors and businesses are willing to pay, can give them a military edge. The most enlightened have realised that it also boosts civilian living standards. Their experience shows why it is important to spend on science.
Investment to improve maritime navigation provides an early example of effective science policy. From the 16th to the 19th centuries, maritime competition between European states was intense. The Spanish had conquered much of Latin America from 1492, while Portugal had taken outposts in Brazil, Africa and Asia.
Subsequently, the Dutch, British and French got in on the act, taking colonies in North America, Africa and Asia. For mariners of the time, latitude – distance from the equator – was relatively straightforward. They looked to stars that stay fixed near the celestial poles – the pole star in the northern hemisphere and the Southern Cross constellation in the southern hemisphere – and used their elevation above the horizon as a measure of distance from the equator. Longitude was more problematic: since the Earth rotates, there can be no fixed astronomical markers to tell how far east or west you are.
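The latitude half of the problem rests on a single fact: the celestial pole stands directly above the Earth's axis, so its altitude above the horizon equals the observer's latitude. A minimal sketch of that rule of thumb – the function name and correction argument are illustrative, not historical navigation practice:

```python
def latitude_from_pole_star(observed_altitude_deg: float,
                            pole_offset_deg: float = 0.0) -> float:
    """Estimate latitude (in degrees) from the altitude of the pole star.

    The celestial pole sits directly above the Earth's axis of rotation,
    so its altitude above the horizon equals the observer's latitude.
    Polaris lies slightly off the true pole, so navigators applied a small
    correction; pole_offset_deg stands in for that correction here and is
    purely illustrative.
    """
    return observed_altitude_deg - pole_offset_deg

# Example: a pole star sighted 51.5 degrees above the horizon places the
# ship at roughly the latitude of London.
print(latitude_from_pole_star(51.5))  # -> 51.5
```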
Any power that could crack this challenge would be able to grow faster, by ferrying goods more rapidly and reliably to and from its colonies, and would be at an advantage in exploration and naval combat. In 1714, the British government passed an Act of Parliament to create the Longitude Prize, to be administered by the Board of Longitude. Fixed prizes of £10,000 to £20,000 were offered for methods of measuring longitude to specified accuracies. That is more than £100,000 to £200,000 in today's money, and would have been a substantial sum in the much poorer 18th century.
The formal prizes were never awarded, but the board had discretion to make awards for steps towards the ultimate goal. More than £100,000 (over £1m in today's money) was paid out over the next 50 years. £23,000 went to John Harrison for the development of chronometers that could keep time reliably over long ocean voyages. A chronometer could be set to a fixed reference time, such as Greenwich Mean Time; by comparing that reference with local time – read from the sun's position, with the help of a naval almanac prepared by astronomers – a navigator could work out the ship's longitude, because the Earth turns through a fixed angle every hour.
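The arithmetic behind the chronometer method is simple: the Earth turns 360 degrees in 24 hours, so each hour of difference between local solar time and Greenwich time corresponds to 15 degrees of longitude. A minimal sketch with illustrative names, not real navigation code:

```python
DEGREES_PER_HOUR = 360 / 24  # the Earth turns 15 degrees of longitude per hour

def longitude_from_times(local_solar_hours: float, greenwich_hours: float) -> float:
    """Estimate longitude from the gap between local solar time and Greenwich time.

    local_solar_hours -- local time read from the sun (12.0 at local noon)
    greenwich_hours   -- time shown by a chronometer keeping Greenwich Mean Time
    Returns degrees east of Greenwich; negative values are west.
    """
    return (local_solar_hours - greenwich_hours) * DEGREES_PER_HOUR

# Example: the sun reaches its highest point (local noon) while the
# chronometer reads 15:00 GMT, so the ship is three hours behind
# Greenwich -- 45 degrees west, out in the mid-Atlantic.
print(longitude_from_times(12.0, 15.0))  # -> -45.0
```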
The benefits of reliable timepieces were immeasurable. Maritime trade became more efficient. Accurate timekeeping was crucial in the industrial revolution; regimented factory life would have been impossible without it and processes that require accurate timing – such as those in the chemical industry – would have worked less well. Modern science, requiring precise timing of experimental steps, reliable measurement of rates of reaction and so on, depends on accurate chronometers. And, not least, skills acquired in engineering intricate components of timepieces went on to be adapted and used in other industries, from the finer parts of internal combustion engines, to tiny electronic switches. Accurate timekeeping and its knock-on effects would probably have developed anyway, but the Longitude Prize undoubtedly accelerated matters.
The American century
The years after the Second World War were a golden age for investment in science and technology. The US had invested in advanced technology – especially the Manhattan Project to build the atomic bomb – in order to win the war, and kept this up in its cold war competition with the Soviets. This intensified with the space race in the late 1950s and 1960s. Federal government spending on R&D went from essentially zero before the war to 0.5 percent of GDP in the 1950s, to a peak of 2.2 percent in 1964. Today it stands at one percent. Explicit R&D spending is augmented by defence spending that stimulates corporate R&D.
Spending on science allowed the US to retain a military edge even as the Soviets tried to keep up. American tanks, planes, ships and submarines were each worth several of their Soviet counterparts. But the real benefit was a gain in civilian living standards, which in turn made the Western system more attractive than communism.
Major American research programmes included national laboratories, defence projects, the space programme and the National Institutes of Health (NIH). The Lawrence Livermore National Laboratory was established in 1952. Its main focus in the early years was nuclear warheads, but it made contributions to many other areas. These included civilian nuclear power, lasers (used in fibre optics and DVDs today), advanced computing techniques such as parallel processing, extreme ultraviolet lithography (used for making computer chips), nanotechnology, and the climate models used by the Intergovernmental Panel on Climate Change.
Texas Instruments developed the first integrated circuit in 1958, to support science and missile programmes; its work laid the foundations for compact mass computing. Further advances in computing were financed as part of missile and space research in the 1960s and 1970s. By the 1960s, firms in financial services and other areas were using computers to handle tasks requiring the processing of large amounts of data, creating a private market for computing. Apple, IBM and other firms leveraged this technology to create compact, cheap, user-friendly computers for home and office use in the 1970s and 1980s – after which the story of personal computing is familiar to everyone.
Spin-offs from space research include communications satellites, GPS, solar panels, fuel cells, memory foam and a host of other innovations. These have given mankind new ways of communicating, coordinating business and trade, and generating energy, spawning whole new industries in the process.
In 1967, the Federal government increased spending on the NIH, an R&D organisation with roots going back to 1887. Spending increased further with the National Cancer Act of 1971. Today the NIH finances 28 percent of healthcare R&D in the US, mostly in the form of grants to researchers in universities and other external institutions. Discoveries in molecular biology from earlier decades – such as the structure of DNA, worked out in 1953 by James Watson and Francis Crick using X-ray data gathered by Rosalind Franklin – were developed towards clinical applications. This spurred the development of genetic engineering (first achieved in 1972) and monoclonal antibodies (first produced in 1975).
These biomedical technologies were rapidly commercialised. In 1982, Genentech gained approval to produce, in genetically engineered bacteria, human insulin for treating diabetes. In 1986, approval was given to the first monoclonal antibody drug, designed to block part of the immune system involved in transplant rejection. Today there are more than 250 biotech drugs in use, and the industry has revenues of approximately $200bn globally.
Europe also supported new technology after the Second World War, but the focus was different from that of the US. European governments worked closely with corporations to adopt a backlog of technologies whose spread had been delayed by the world wars, particularly industrial mass production. They financed universal education and healthcare, providing a stimulus to biomedical technologies such as vaccines. And they intervened in banking and other areas to direct capital towards strategic sectors, including high-tech aerospace and defence. Under these circumstances, industrial innovation boomed in Germany, northern Italy and the UK.
Japan in the 1950s followed an economic strategy of forming mutually supportive clusters of industrial corporates, banks and government agencies. These protected arrangements facilitated a host of innovations in mass production, particularly in cars and electronic goods. It was a strategy of indirect support for technology, comparable to that practised in Europe.
Ireland in the 1990s is a recent example of the benefits of adopting new technology. Until the 1980s, Ireland had a GDP per capita around two-thirds that of its British neighbour. From the 1960s, Irish governments realised that education was crucial to catching up; they invested in universal secondary education and subsidised universities, with a focus on science and engineering courses.
By the 1980s, the first graduates from this new system were entering the labour market. At the same time, the country was receiving European subsidies for infrastructure and was pushing through difficult reforms in response to the economic challenges of the 1980s. Corporation tax was cut to 12.5 percent, with personal taxes filling the gap; other targeted incentives were established and collective bargaining was enhanced to bring stability and fairness in wage deals.
The combination of well-educated English-speaking graduates, low corporation taxes, wage restraint, subsidised infrastructure and a favourable location between Britain and America brought US corporates in high-value industries to Ireland. Initially the focus was on production, rather than R&D. By the year 2000, half of all software used in Europe was made in Ireland and every major pharmaceutical company had a production facility there.
R&D followed, though its intensity as a proportion of GDP remains slightly below the EU average. In the 2000s, Ireland suffered a property bubble and subsequent sharp recession, but its high-tech industries are providing a source of exports that other troubled nations lack.
The size of the prize
The macroeconomic effects of new technology are clear. From 1950 to 1973, the US economy grew at an average rate of 3.2 percent per year, compared to 2.1 percent thereafter. Western European economies grew at an average of 4.5 percent per year from 1950 to 1973, compared to 2.1 percent thereafter. Japan grew at an average of nine percent in the 1960s, 4.5 percent in the 1970s, 3.8 percent in the 1980s and 1.2 percent thereafter. Ireland grew at an average of nearly seven percent in the 1990s. Across all of the US, Europe and Japan we can see growth slowing as innovation itself slowed.
Other factors were at play in every case, such as favourable demographics after the Second World War. But many of these factors – like post-war rebuilding, and workers moving from agriculture to industry as farming became more advanced – were themselves technological at root. The lesson is clear: invest in science, and rapid economic growth will follow.
The superior economic performance of the US compared to major Western European economies in the 1990s and early 2000s is often attributed to lower taxes and less regulation. The reality is that World Bank data for dozens of countries going back to 1960 shows no relationship between the size of government and rates of economic growth. A more likely explanation is that the US spends roughly twice as much of its GDP on R&D as European countries do.
To see what not to do, consider the case of agricultural research internationally. Today, government-funded research in this area is less than a third of what it was in the early 1980s, in real terms. In consequence, for the first time since the early 20th century, crop yields are growing more slowly than demand, even though population growth has slowed markedly.
Today, Western economies face an uphill battle. We are struggling out of a global financial crisis, while adverse demographics will squeeze growth rates, public finances and financial markets for decades. At the same time, we need to stop emitting carbon dioxide if we are to avoid environmental and economic disaster.
The obvious solution is to invest heavily in science and technology, especially sustainable technology.
In past decades, increasing investment in science by one percent of GDP seems to have boosted GDP growth rates by more than two percentage points. Even applying a more conservative boost of one point, we could lift long-run growth projections for Europe and the US from an anaemic 1.8 percent to a respectable 2.8 percent. These are rough approximations, but it is clear that spending on science would ultimately make our pensions seem a lot more affordable.
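A back-of-the-envelope compounding calculation shows what that one-point difference means over time; the 40-year horizon below is an assumption chosen to match a working lifetime, not a figure from the text:

```python
# Compound the two projected growth rates over a 40-year horizon (an
# illustrative assumption) to compare the size of the resulting economy.
years = 40
low_growth, high_growth = 0.018, 0.028  # 1.8% vs 2.8% a year

gdp_low = (1 + low_growth) ** years
gdp_high = (1 + high_growth) ** years

print(f"Economy after {years} years at 1.8%: {gdp_low:.2f}x today's size")   # ~2.04x
print(f"Economy after {years} years at 2.8%: {gdp_high:.2f}x today's size")  # ~3.02x
print(f"Extra output from the faster path: {gdp_high / gdp_low - 1:.0%}")    # ~48% larger
```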