Sunday, March 28, 2010

This century and the last one: A report card for the first 10 years

When we look back at the sweep of history, the 20th century stands out. It stands out as a time of immense progress in our knowledge, a time of great carnage, and the time when the great debate about socialism and the market economy ended. I think it was Arthur C. Clarke who said that one of two things will come next: either we will look back at the 20th century as the most amazing time when everything happened, or the pace of change will further accelerate thus making the 21st century even more incredible than the one that went by. (Does someone know the exact quote?)

Economists have been arguing that the creation of knowledge responds to the inputs going into it. And there is no question that the number of people engaged in knowledge professions today is greater than ever before in human history. Information technology has added strength to this pursuit, amplifying what a puny unaided human mind could do on its own. Earlier, the West dominated the production of knowledge; now we have phenomena like R&D labs in India giving a new kind of low cost production of knowledge, and increased opportunities for risk-taking in research. These factors should increase the pace of progress in creating knowledge. It should take us closer to the scenario where the 21st century will be even more exciting than its predecessor in terms of creating new knowledge.

But I find myself in 2010, nervously looking around, and wondering if we are actually doing better.

From 1900 to 1910, here are a few of the great things that happened:
  • In 1900, Max Planck proposed quantum theory, Hilbert posed his 23 problems, and Louis Bachelier wrote the first thesis in financial economics.
  • In 1901, Marconi made the first wireless transatlantic transmission.
  • In 1903, the first car drive from San Francisco to New York took place, and the Wright brothers flew the first powered plane.
  • In 1904, American construction of the Panama canal began.
  • In 1905, Einstein wrote four papers.
  • In 1906, Mahatma Gandhi coined the term satyagraha, and the first 'vitamins' were discovered.
  • In 1908, the first oil was extracted from the Middle East, and Henry Ford sold the first Model T.
I'm sure there were many other interesting things going on, but these were the big things of that period that mean a lot to me. When I look back at the decade from 2000 to 2010, what cool things can we remember that would change the world? The CPUs got faster. What else do we have to show?

Or is it that all sorts of wonderful things have been going on, and it is just my lack of knowledge? E.g., if I had lived in 1905, I might not have heard about Einstein's four papers.

If it's not just me, and the pace of progress has slackened: Why did we not get amazing progress from 2000 to 2010, despite the expansion of inputs into the systematic quest for new knowledge? Are we hitting diminishing returns; are we in the sad stage of adding decimal places to fundamental constants? Is our production function faulty?

In economics, we seem to be finished with a first cut of a conceptual framework, and now the real challenge lies in how the rubber hits the road: in taking the framework to the messy reality and seeing what works and what doesn't. Nobody believes a theory paper, except the guy who did it; everyone believes an empirical paper, except the guy who did it. Today, when I face a paper, I first jump ahead to the empirical material to see whether the data and the estimation strategy persuade me, and then, if it makes sense, I curiously look at the conceptual framework and theory. It is a time to work in the trenches, dealing with messy reality, and not a time for elegant conceptual frameworks.

I remember noticing the May-June 1973 issue of the Journal of Political Economy, where a pair of papers appears back to back: Risk, Return, and Equilibrium: Empirical Tests by Fama and MacBeth, followed by The Pricing of Options and Corporate Liabilities by Black and Scholes. One has already won a Nobel prize, and I suppose the other will get one too. The JPE today does not seem so grand.
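As an aside, for readers who have not seen the Black and Scholes paper: its central result is a closed-form price for a European call option. A minimal sketch in Python (my own illustration with made-up parameter values, not the paper's notation):

```python
# Black-Scholes (1973) price for a European call option.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    """C = S*N(d1) - K*exp(-rT)*N(d2), for spot S, strike K,
    risk-free rate r, volatility sigma, and time to expiry T (years)."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Example: at-the-money call, 5% rate, 20% volatility, one year.
price = black_scholes_call(S=100, K=100, r=0.05, sigma=0.2, T=1.0)
print(round(price, 2))  # roughly 10.45
```

The remarkable thing, and the reason the paper mattered, is that the option price depends on observable quantities and the volatility, but not on the expected return of the stock.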


  1. I don't quite agree that the pace has slackened. I would propose that a report card of the past decade include at least some of the following research areas. I don't have a comparative list of events, but it should be possible to put one together from these areas:

    1) Simulation of the brain/Cognitive Computing/Understanding the brain (important milestones and papers in 2000-2010):
    Cognitive Computing Lab at IBM
    Henry Markram
    The neuroscientist's angle: Just one example: VS Ramachandran

    2) BioTech - Human Genome Project:
    A sequencing firm has a machine (costing $750K) which can do a human genome in a week. A Chinese firm recently bought 128 of those machines. Ray McCauley of the company predicts that by 2013 the cost of sequencing your personal genome will be around $100, and in the future the cost will be as little as $1.
    Celera is another company in this space.
    DIY Bio

    3) Quantum Computing:
    Events in this field
    Google is a participant here. The well-known papers were published prior to 2000, but maybe there are papers/milestones in this decade which are not fully appreciated yet; I don't know.

    4) Regenerative Medicine:
    Growing organs: Anthony Atala at TEDMED 2009

    5) Miscellaneous: something could get big/bigger in future with roots in this decade, say from nanotech, alt energy (solar, fusion, electric cars), or Terminator-style robotic warfare - drones were used in Af-Pak.
    MIT Tech Review list of annual emerging tech

    I'm not sure whether any of the top 4 areas above is being worked on in any R&D lab in India. I still find R&D dominated by the West. The list of top 100 universities is still dominated by the West.

    Moore's law is close to its limit (so yes, we are hitting diminishing returns in at least this one respect), and I feel that IT/computer science should be boring from now on (relatively speaking... there will still be lots of incremental stuff: web 3.0, probabilistic databases, machine learning and cognitive computing, innovations in how we interact with computers, etc.).

    However, the massively path-breaking stuff will be something else. A common thread runs through three of the four areas above: they all have to do with the human body/biology.

  2. On a more serious (and hopefully less sarcastic) note, those last lines of lukkha (idle talk) remind me of one of our professors, who prophesied that if the 20th century was the century of Physics, then the 21st will be the century of Biology. He offered two reasons for it: one because he was a philosopher, or at least he thought he was one; the other as a rationalist.
    The philosophizing reason was: man has gone away from himself, from his basic needs and happiness (because Physics deals with outer-world phenomena), so he will feel a tug to return to his roots and understand himself better.

    And don't snigger, like we did. He was sincere.

    But the rationalist reason he mentioned was solid enough. He said that the bulk of medicinal/anatomical research was done prior to the 1900s. Yes, some good work came in from 1900 to 2000, but it "underperformed" the growth in Physics. And even after 1,500-odd years of civilization, we don't know enough about how our body works, let alone how to work on it.

    On a very personal level, because I am more interested in Physics, I would like to cite two inventions:

    1. The discovery of negative refractive index with the help of carbon nanotubes ~ Scientific American

    2. And at the device electronics level, the invention of the memristor, which filled out the quartet of basic devices known to man. Resistor, inductor, and capacitor were the first three; the memristor filled the fourth quadrant and, from a beauty perspective, made device electronics much more pleasing. From an application point of view, imagine your computer booting up the moment you switch it on, with zero delay.

  3. Dr Shah, either that was good intentional humour on your part (in response to the first comment), or my sarcasm was too tangential.

  4. "It is a time to work in the trenches, dealing with messy reality, and not a time for elegant conceptual frameworks."

    Maybe. But what if there are logical inconsistencies in the theory? Even then, it might show a very good fit with the data. Or what if the assumptions are very unrealistic or even untrue?

    Elegance of concepts is not something social scientists should look for. I guess such elegance is a by-product of the obvious spill-overs, and of notions borrowed, from the physical sciences.

  5. Alex, I said: In economics, we seem to be finished with a first cut of a conceptual framework, and now the real challenge lies in how the rubber hits the road, in taking the framework to the messy reality and seeing what works and what doesn't.

    I'm not claiming we've figured it out. I'm claiming that there is a first cut of a framework (markets, optimisation, learning how to do econometrics). In the 1970s the papers getting done were laying the foundations of the field (e.g. Lucas critique, Black/Scholes, Lemons model, Spence 1974, etc). Those kinds of papers aren't the focus today. Today the challenge is that of taking the theory to the data. The best and most important economics today is all about confronting the messy detail of the real world, finding out what works and what doesn't work, so as to form a scientific picture of how the world works.

  6. Ajay, a more philosophical utilitarian analysis of this might be a useful exercise. In the early part of the 20th century, you had grand ideas being proposed but these were highly specialized or localized to their domains. They did have a profound impact on their fields but were there a lot of other things happening? What was the sum total of the utility of these discoveries/inventions?

    Instead, with the diffusion of information, larger populations being involved in R&D, etc., maybe the problems being tackled are incremental, but in many more domains and at much greater speed than ever before. Thus we may not have a few great things to talk about, but we have so many more little things that are gradually adding to our knowledge. Is the sum total of the utility of these incremental improvements more or less? We don't know, and probably there is no easy way of knowing this, but why does one assume that only grand ideas are great? Maybe a utilitarian perspective would show incremental improvements in a better light.

  7. Mr. Shah,
    The impact of certain events is felt only after a time lag. Bachelier's work, for example, wasn't exactly well received back then, and one had to wait a while to discover the impact of his work. If somebody had written a similar article in 1910, his work wouldn't even have featured as a subject of importance. Today, we label him the first 'Financial Economist' :)

  8. I would second what the commenter above me has said.

    The events were very momentous, granted. But they were not thought to be momentous in 1910.

    Einstein's theory did not gain ground until the 1920s. Mahatma Gandhi's satyagraha did not bear fruit until independence in the 1940s.

    As for Mahatma Gandhi, you can say that he was perfecting his techniques for the rest of his life, and he can be counted in any decade from the 1890s to the 1940s. Likewise, many of the other inventions mentioned can be counted as multi-decade inventions.

  9. There is a reason that the Hindu God Vishnu is revered more than Brahma or Shiva.

    Development is much more important than discovery or fruition. And maybe a long developmental phase has to take place before spurts of new discovery happen.
