Ammo for Michael

January 23, 2011 at 1:09 am (By Randy)

After reading this and other perceptive comments that Michael wrote on Dave Schuler’s blog, I ran across this post by noted economist Tyler Cowen today:

Charles I. Jones, an economist at Stanford University, has “disassembled” American economic growth into component parts, such as increases in capital investment, increases in work hours, increases in research and development, and other factors. Looking at 1950–1993, he found that 80 percent of the growth from that period came from the application of previously discovered ideas, combined with heavy additional investment in education and research, in a manner that cannot be easily repeated for the future. In other words, we’ve been riding off the past. Even more worryingly, he finds that now that we are done exhausting this accumulated stock of benefits, we are discovering new ideas at a speed that will drive a future growth rate of less than one-third of a percent (that’s a rough estimate, not an exact one, but it is consistent with the basic message here). It could be worse yet if the idea-generating countries continue to lose population, as we are seeing in Western Europe and Japan.

(The quote is from Cowen’s soon-to-be-released eBook, The Great Stagnation.)
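To get a feel for what that “less than one-third of a percent” figure would mean, here is a quick compounding sketch. (This is my illustration, not from the book; the 2% postwar benchmark is my own rough assumption for comparison.)

```python
# Illustration: cumulative effect of compounding a growth rate over 50 years.
# 0.33%/yr is Cowen's "less than one-third of a percent" estimate;
# 2%/yr is an assumed rough postwar per-capita benchmark for contrast.
def cumulative_growth(rate, years):
    """Total growth factor from compounding `rate` annually for `years` years."""
    return (1 + rate) ** years

low = cumulative_growth(0.0033, 50)
postwar = cumulative_growth(0.02, 50)

print(f"0.33%/yr for 50 years: +{low - 1:.0%}")   # roughly +18% total
print(f"2.00%/yr for 50 years: +{postwar - 1:.0%}")  # roughly +169% total
```

Over a working lifetime, the difference between those two rates is the difference between incomes roughly doubling-and-a-half and barely budging.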


  1. mockturtle said,

    I find it difficult to believe there was more than an associative link between economic growth and research and development expenditures. R&D is usually the first thing to be cut back in a recession and built up only during flush economic times. The same case could probably be made for education. It seems to be common nowadays to infer cause and effect where merely an association exists.

  2. wj said,

    Well, if you look at it, new technologies come from R&D. And, at least for the past couple of centuries, economic growth has come overwhelmingly from industries using new (at the time) technologies. No R&D ==> minimal economic growth.

  3. Dave Schuler said,

    I posted on this very subject some time ago. Contra the Singularity folks, there have been very, very few major breakthroughs in recent years. Lots of elaborations on old ideas, though.

  4. Dave Schuler said,

    See here, for example. Links to a couple of other interesting blogs in that post.

  5. wj said,

    The other thing that is worth remembering, when you are weighing the worth of R&D, is that it comes in two sorts. One is the application of known ideas to new products (or services). The other is what happens first: coming up with those new ideas.

    Let me take an example from the computer field, just because I happen to know a little about that. For there to be a PC, first there has to be an electronic computer. And there has to be solid state circuitry. There were electronic computers before there were transistors, let alone solid state circuits. Computers were built with tubes. (If you are too young to remember tubes, time to go read some history.)

    All of that stuff (transistors, then solid state circuits, computer programming languages) came from basic research (the R in R&D). A few companies actually spend money on this. Think of Bell Labs, Xerox PARC, IBM. Fewer manage to move stuff reliably from basic research to product development, which is why the stuff invented at Xerox PARC didn’t make Xerox a power in the computer field.

    So I wouldn’t worry too much that there have been “very, very few major breakthroughs in recent years.” Nobody knows what initially obscure bit of new knowledge will, a decade or three down the road, become the critical step towards a lot of new technology. The recent new breakthroughs are, at a guess, most likely to be seen (in retrospect, a decade or two from now) to be in biology and biochemistry rather than in physics and electronics. And, if you stop and think about it, there has been a lot of progress there. In fact, I would say that we are now well past the “initial breakthrough” step and into the early stages of product development.

  6. mockturtle said,

    Not all companies are IT-related and, hard as it may be to believe, man does not live by information technology alone.

  7. michael reynolds said,

    I was re-reading The Big Sleep — because Chandler is a god and the first paragraph of that book is the best first paragraph in all of mystery.

    Setting that aside, it was in 1939 that Chandler invented Philip Marlowe. 72 years ago. But if Marlowe was going through your apartment — excuse me, bungalow — he would recognize everything except the computer. Obviously the modern refrigerator is better, but he’d know that’s what it was. Likewise the phone, the toilet, the car, even the TV — assuming he read the occasional Popular Mechanics.

    But flip the calendar back by 72 years from 1939. That’s 1867. A detective from 1867 would be lost and amazed in a 1939 home.

    I think the notion that increased R&D automatically yields significant advances smacks of magical thinking.

    My larger point, though, was that most of us just don’t lack much. We don’t lack enough that we’re going to ignite some huge hiring increase in order to feed our need for more stuff. I think we’re living through an aesthetic/philosophical shift. I think we’re burned out on “stuff.”

    I went through a period when I had more money than I knew what to do with. The diminishing returns lesson was learned a bit too slowly to save me a lot of that money, but it did eventually penetrate. I get the feeling the younger generations get it.

    So if old boomers and younger alphabet generationals agree that stuff has kind of lost its allure then we’re going to have to figure out something else to do about jobs. The stuff-making jobs may not be coming back.

  8. mockturtle said,

    “So if old boomers and younger alphabet generationals agree that stuff has kind of lost its allure then we’re going to have to figure out something else to do about jobs. The stuff-making jobs may not be coming back.”

    I couldn’t agree more. And I think–in the long run–it’s a good thing.

  9. wj said,

    Flush toilets in most homes in 1939 would, perhaps, startle someone from 1867. But the thing itself was patented in the 1850s, so it isn’t “new technology.” Telephones? Just a small enhancement of the telegraph. Refrigerator? An ice box was rare, but not unheard of, in 1867 (introduced in the 1830s, IIRC). Even the automobile: just a combination of the wagon and an engine, both known already. All that differed in any of those was how common (and affordable) they had become.

    Actually, the big technology change from 1867 wasn’t something you’d see in the home in 1939 (or today). It was mass production, which allowed factories to churn out enough copies of stuff cheaply enough that it became ubiquitous rather than rare. Plus things which would be too small to see (vaccination for lots of diseases, antibiotics, etc.), just like all the solid state circuits and computer chips in the modern car aren’t visible to the naked eye.

  10. mockturtle said,

    Man’s glory years were the Renaissance. He has been on a steady decline in thinking and creativity ever since. JMHO, of course! ;-)

  11. Ron said,

    wj, mass production had started by 1867…for guns! That’s how Colt got the first defense contract…and the Civil War had hyped up mass production of rifles for the Union….for domestic goods, yeah, it had just started though…

  12. wj said,

    Ron, thanks for that. (One of those “I should have known, if I’d only thought a bit further” things.) And it reinforces the thought that radically new stuff is, most often, a matter of combining things which are already known in principle or in isolation into new configurations. Sometimes not even all that visibly different configurations, at least until you get beyond the surface.

    All of which makes me less concerned about the notional current/recent rate of innovation than some others are. I think there is a lot going on that may not look radically new in its present form, but that, in its future configuration, will change our lives as radically as mass-produced automobiles or routine vaccination for childhood diseases did.

  13. karen said,

    How about pharma and genetics– do they count?

    The fact that we can artificially inseminate a cow (or pretty much any animal, even women, or men, if you count the girl that became a guy, which i do not count): that’s R&D? Laser operations, fetal operations, transplants…

    GMOs, chemical fertilizers- robotic milkers– 30+ thousand cow dairies? Pigs that glow? Cloning– ethanol, VTYankee? My son’s Xbox.

    R&D stands for Research & Development, right? And i just read that the gov’t isn’t impressed w/ the advancements being made in pharma and is considering investing, creating its own (via Insty, i think). Just as it’s gotten into the business of auto production, and banking, and healthcare.

    Oy. It’s late.

  14. karen said,

    And Space exploration?

  15. wj said,

    If I may presume, I suspect Michael would argue that Space exploration is just rockets (known for centuries), or at least just liquid-fueled rockets (technology available since 1939 – although admittedly scaled up). Personally, I wouldn’t buy that, but….

    Besides, the specific example he was working from was stuff found in the home. Which is a bit restrictive, but….

  16. karen said,

    I was listening to NPR- and the CEO of Dow was talking.

    R&D may stand for Retail and Distribution, which makes more sense. It was interesting to hear from a real businessman’s POV (not to say that no one here is a ~real~ businessman): we just don’t get paid like he does, i’m assuming.

    As to being found at home… we have a semen tank :0). Being organic means no chemicals, hormones, etc., & no $$$ for a robotic milker.
