A Timeline of Resources
In every great era of civilization, a new technology has emerged that gave humans new capabilities. Each technology was associated with a resource that was initially extremely valuable while humans were learning to harness its power, but that turned into a commodity as the technology became commonplace. The invention of a newer technology inevitably accelerated this transition.
The Stone Age
During this period, weapons and implements were made from various kinds of stone. As more durable materials were discovered, they displaced the older ones. Humans were able to do more things and do them better.
The Stone Age lasted 3.4 million years! Its last incarnation, the Neolithic period, stretched from around 9000 BC to around 3000 BC. Toward the end of this time, metalworking was invented. One can easily imagine how stone became less valuable and stonecutters lost their jobs to metalworkers.
The Bronze Age
While bronze was discovered through the accidental mixing of copper and tin, its replacement of stone for tools and weapons was deliberate. For about 4,000 years, the technology followed the same trajectory that stone had, and so did the value that humans attached to bronze.
The Iron Age
Inevitably, bronze was surpassed by a stronger, more durable metal: iron, and its carbon alloy, steel. This was more an incremental step in technology, but it came a lot faster than the previous transition.
The Iron Age lasted about 500 years; clearly the pace of innovation was accelerating. But as an object lesson to those of us who believe in the power of technology to always improve the lot of humankind, the next period was one of the worst in human history: the Dark Ages. The sky became dark, probably from the ash of a large volcanic eruption near Indonesia. Crops failed, and people struggled for their daily existence. Knowledge that was commonplace during the Bronze and Iron Ages was forgotten. It took 600 years before the world began to recover.
The Renaissance then sparked in Europe. While it is written about as a golden age, much of it was spent rediscovering lost technology. For instance, it took Florentine architects almost 150 years to build Il Duomo di Firenze, whose dome barely exceeded the span of the Pantheon's, built by the Romans more than a millennium earlier.
The Industrial Revolution
Without a doubt, the Renaissance led directly to the Industrial Revolution. Not surprisingly, it happened in Europe. Between 1700 and 1900, the rate at which knowledge was discovered and technology was developed based on that knowledge was breathtaking.
The resources that were the underpinning of the Industrial Revolution were fossil fuels: coal, gas and oil. As the technology matured, the price of these resources continued to drop until supply was constrained, both for geopolitical reasons and because the cheap stuff ran out. See the chart of the price of oil below.
The Computing/Communication Age
The twentieth century was the age of Computing and Communication. The most important resources were communication bandwidth and computing power. The costs of both have dropped exponentially, and both have become commodities. The companies supplying these commodities are typically late in reacting to the realities of their commoditized markets and resort to hasty restructurings of various kinds in an attempt to survive.
This brings us to the Information Age, where the technologies are machine learning, artificial intelligence and the automation of routine tasks done by the “knowledge workers” about whom Peter Drucker wrote so insightfully. The resource that drives the Information Age is Data! If we extrapolate from history (and provided we don’t muddle our way into another dark age), this age should last about 50 years.
Era | Period | Duration
Industrial Revolution | 1700 AD - 1900 AD | 200 years
Computing, Communication Age | 1900 AD - 2000 AD | 100 years
Information Age | 2000 AD - 2050 AD | 50 years
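The pattern in the timeline above, with each era lasting half as long as the one before it, can be checked in a few lines. This is a toy sketch of the extrapolation behind the ~50-year estimate, not a forecast:

```python
# Era boundaries from the timeline above (start year, end year).
eras = {
    "Industrial Revolution": (1700, 1900),
    "Computing/Communication Age": (1900, 2000),
    "Information Age": (2000, 2050),
}

durations = [end - start for start, end in eras.values()]
print(durations)  # [200, 100, 50]

# Each era lasts half as long as the previous one; extrapolating
# that halving gives the Information Age roughly 50 years.
for prev, curr in zip(durations, durations[1:]):
    assert curr == prev / 2
```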
How Valuable is Data?
The graph below represents the research of Serafin Lion Engel and Karen LaRocque, who parsed the UN Comtrade Database of global trade data (here). It shows the declining importance, relative to data, of the industrial-age resources: copper, oil and natural gas (LNG). Copper and natural gas show a steady increase in absolute value, driven by the growth of the global economy, while oil shows a steady decline driven by its substitution with cleaner forms of energy. These trends are easy to accept. The exponential growth in the value of data, to $25 trillion by 2030, is more difficult to believe, so I'll explain the assumptions:
- The starting point is already high: more than $150B, the amount reported here to have been spent on personal data.
- The growth rate of over 30% is estimated from a forecast by EMC and from the growth rates of publicly listed data brokers: Acxiom, Equifax, and Alliance Data Systems.
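These two assumptions can be compounded forward as a sanity check. A minimal sketch: the ~$150B base and 30% rate come from the bullets above, while the 2016 start year is my assumption:

```python
def project(value_b: float, rate: float, years: int) -> float:
    """Compound an annual growth rate forward from a base value (in $B)."""
    return value_b * (1 + rate) ** years

# Assumed: ~$150B base in 2016, growing at 30% per year.
v2030 = project(150, 0.30, 2030 - 2016)
print(f"${v2030 / 1000:.1f}T by 2030")  # $5.9T by 2030

# The rate actually needed to reach $25T from $150B in 14 years:
implied = (25_000 / 150) ** (1 / 14) - 1
print(f"implied CAGR: {implied:.0%}")  # implied CAGR: 44%
```

Under these assumptions, a flat 30% compounds to roughly $6T; hitting $25T by 2030 implies a rate closer to 44%, so the projection is sensitive to both the start year and how far "over 30%" the growth rate really is.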
Forecasting the size of markets that are growing at exponential rates today, over a period of decades, is usually a perilous activity. Yet it is not too difficult to conclude that the trends are directionally correct and that the market for data will likely be bigger than anything the world has seen so far. We need only look at the growth rates of Google (>10%) and Facebook (>30%) at their already massive revenues of >$50B and >$20B respectively. By comparison, the giants of the Computing/Communication Age (Microsoft, IBM, Intel and Cisco) slowed to single-digit growth rates at similar sizes.
What does this mean for companies?
The torch has already been passed to the next generation of leading companies. Companies that harvest the earth's resources, build hardware and create software will continue to be important, as long as we spend most of our lives in the physical world. Google and Facebook have great software; they even build their own hardware. But both are incidental to their true business: harvesting data about their users and the world they live in. To be relevant in the Information Age, companies will have to devise ways to collect data and derive insights from it.
For some companies, like Apple, that move between the worlds of hardware, software and data with relative ease, the progression will be natural. Their journey in music is an excellent example: first the iPod (hardware), then iTunes (software) and finally Apple Music (data). Adobe has made an equally impressive transformation, from shrink-wrapped software to software as a service and now to data. In contrast, companies more rooted in their hardware past (Intel, Cisco) or software legacy (IBM, Microsoft) have struggled to adapt.
What does this mean for individuals?
Not so long ago, it was widely accepted that privacy was a fundamental right. After a fair amount of early teeth-gnashing, most people made a Faustian bargain with companies: give me free services that I love, and I'll let you use my data if you anonymize it appropriately. Your permission is usually buried in the epic EULA that you scroll through with aplomb when signing up for their services. This bargain has made several companies richer and more influential than any in history. Facebook, for instance, has a market cap of $300B while making only $3B in profit over the last twelve months (full disclosure: I own their stock). If nothing were to change, it would take me 100 years to make my money back. Clearly, it is the future value of the data that Facebook holds that justifies such a high valuation.
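The payback arithmetic above is just a price-to-earnings calculation. A sketch using the article's round numbers ($300B market cap, $3B trailing profit), with a 30% profit growth rate added as an illustrative assumption of my own:

```python
def payback_years_flat(market_cap_b: float, profit_b: float) -> float:
    """Years for cumulative profits to repay the purchase price,
    assuming profits stay flat."""
    return market_cap_b / profit_b

def payback_years_growing(market_cap_b: float, profit_b: float, g: float) -> int:
    """Same payback, but with profits compounding at rate g per year."""
    total, years = 0.0, 0
    while total < market_cap_b:
        years += 1
        total += profit_b * (1 + g) ** years
    return years

print(payback_years_flat(300, 3))           # 100.0
print(payback_years_growing(300, 3, 0.30))  # 13
```

With flat profits the price is repaid in 100 years; if profits instead compound at 30%, the payback falls to about 13 years. That is the sense in which the future value of the data underwrites the valuation.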
A similar unspoken bargain has been struck with government: keep me safe, and I'll let you snoop on my communications. As a result, many democratically elected governments are more powerful than ever before.
Today, data is collected mostly in silos. Facebook knows a lot about the news articles you read; Spotify knows a lot about your music choices. The new generation of companies collecting data about you intends to combine these disparate sources, making themselves even more capable of predicting your behavior. Accel Ventures has written a good article on the increased value of these converged data sources for the enterprise (here). There is no doubt in my mind that this will happen in our near future, in the consumer as well as the enterprise space.
Clearly, we don’t live in the dystopian world of George Orwell’s 1984, but the pendulum has swung decisively in that direction. Should we shrug off this rebalancing of power between individuals and institutions and learn to live with it? Or should we become neo-Luddites and renounce the use of technology that enables our data to be used by others?
I believe that individuals will very likely wake up to the fact that their data is valuable and stop giving it away. They will push their governments to take action. Already, the EU has amended its privacy laws (here). The US will most likely follow.
As I pondered this problem, I wondered if there was an alternative: actually owning your data and monetizing it (or not!) as you choose. There are a few pioneering efforts in this direction. Datawallet and Datacoup are two that I know of (full disclosure: I’m an investor in Datawallet). Each takes a slightly different approach to helping someone who wants to take control of his or her own data. There are efforts in academia as well; here is one from MIT. These efforts have the potential to restore the balance of power between individuals and institutions.
On which side of this argument do you land? Open up your private data and monetize it selectively? More privacy, less sharing? Or do you think the status quo serves us well enough and should be left alone?