
Data – When Is Enough, Enough?

In his 2005 book The Singularity Is Near: When Humans Transcend Biology, futurist Ray Kurzweil imagines a time when intelligence becomes nonbiological and far more powerful than ever before. Little more than a decade later, the continuing exponential progress of information technology has brought us to the point where we are indeed beginning to transcend biological limitations, stimulating creativity and enabling new paradigms of work, thought, and even life.

To understand the exponential progress of information technology, think for a moment about the accelerating timeline of technological progress over the past 52 years, since Gordon Moore – then of Fairchild Semiconductor and later an Intel co-founder – plotted a handful of data points tracking the increasing number of components that could be placed on a single integrated circuit.

Based on just five data points dating back to 1959, Moore found that it was taking approximately a year for engineers to double the number of computing elements per chip. As additional data was gathered over time, the time period for doubling was revised to two years, establishing Moore’s Law.

When Intel launched its first microprocessor, the 4004 in 1971, 2,300 transistors were jam-packed onto a 12-square-millimeter chip. Compare that with the 3.2 billion transistors on Intel’s 10-core Core i7 Broadwell-E processor today, and you begin to understand the scale of exponential acceleration.
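
As a rough sanity check on those two data points – and assuming the Broadwell-E figure dates to about 2016 – a few lines of Python show the doubling period they imply:

```python
import math

# Two transistor counts quoted above; the 2016 date for Broadwell-E is an assumption.
transistors_4004 = 2_300          # Intel 4004, 1971
transistors_broadwell_e = 3.2e9   # 10-core Core i7 Broadwell-E, ~2016
years_elapsed = 2016 - 1971

doublings = math.log2(transistors_broadwell_e / transistors_4004)
print(f"doublings: {doublings:.1f}")                            # ~20.4
print(f"years per doubling: {years_elapsed / doublings:.1f}")   # ~2.2
```

That works out to roughly one doubling every two years – right in line with Moore’s revised estimate.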

It seems incredible that Moore’s Law has held true over the decades, but it has. Time after time, as the limitations of processor design and manufacturing technology seemed likely to derail it, new advances put his theory right back on track.

How did this happen? In a word: technology. As engineers shifted design from slide rules and paper to electronic calculators to CAD systems, the chips in those devices enabled breakthroughs in subsequent chip designs. Likewise, those advanced and advancing processors enabled manufacturing technologies to produce ever-smaller circuits on silicon.

Will Moore’s Law continue to hold? That’s the subject of some debate in our industry right now. But if history is any guide, it’s likely that technology will continue to enable exponential growth – in ways that are today still unimaginable.

Whether processing power continues to double on a regular schedule may be a moot point, however. From a utility standpoint, we already have a vast amount of computing power at our disposal. How we use it is becoming far more interesting than how much of it we will have in the future.

My, how you have grown.

An interesting way to think about exponential growth is to contrast it with linear growth.

Let’s say you have a stride of 3 feet, and you take a walk. Ten strides in, you’re 30 feet from your starting point, and it’s easy to predict where you’d be after 10 more strides.

Now imagine that you could double the distance of your stride with each step. That would mean that after 10 strides, you’d be over 3,000 feet from your starting point.
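
To make the arithmetic concrete, here’s a short Python sketch of both walks, assuming the first exponential stride is the same 3 feet and each stride after that doubles:

```python
STRIDE_FEET = 3
STEPS = 10

# Linear walk: every stride covers the same 3 feet.
linear_distance = STRIDE_FEET * STEPS                                   # 30 feet

# Exponential walk: the first stride is 3 feet, and each one after that doubles.
exponential_distance = sum(STRIDE_FEET * 2**i for i in range(STEPS))    # 3 * (2**10 - 1) = 3,069 feet

print(linear_distance, exponential_distance)
```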

That’s the almost unimaginable change that exponential growth can deliver – and what makes it so difficult to imagine exactly where we’re headed. We simply tend not to think in exponential terms in most aspects of our lives.

Today, data flows through our lives at an incredible rate.

Our cars, once marvels of mechanical engineering, have been transformed into computers on wheels. And that smartphone in your pocket? It’s got more processing power than the Apollo command modules that took people to the moon and back.

Right now in data centers and labs all around the world, the ability to crunch mountains of data is enabling exponential growth in 3D printing, virtual reality, automation, robotics, artificial intelligence (AI), and genetic science – all of which stand to dramatically affect the quality and longevity of life itself.

The power of accelerating technology to shape the future is staggering. Consider that the first human genome was sequenced in 2004 at a cost of approximately $2.7 billion – less than the projected $3 billion, and two years ahead of the schedule established in 1990.

While truly groundbreaking at the time, technological and scientific advancements in the interim mean that today the same feat can be accomplished in 26 hours at a cost of about $1,000. Researchers are exploiting those massive time and cost reductions to identify specific mutations and develop highly personalized treatments for cancer patients.

As you can see, exponential technological evolution is not limited to processors. Every related technology is riding a wave of astonishing progress. Storage capacity and data transmission rates – both wired and wireless – continue to grow exponentially as well. And the ability of evolving technology to speed the development of next-generation – or entirely new – technologies ensures that it will continue. The result is an ever-increasing availability of more data to more people in more places, and at lower cost, fueling further innovation in every field.

In 2001, Kurzweil wrote that the overall rate of progress was itself doubling: “We won’t experience 100 years of progress in the 21st century – it will be more like 20,000 years of progress (at today’s rate).” And it would be wise to heed the forecasting abilities of a man who predicted in 1990 that a computer would beat a master chess player by 1998. After all, it happened on May 11, 1997, when IBM’s Deep Blue beat grandmaster Garry Kasparov in a six-game match. Fourteen years later, in 2011, Deep Blue’s descendant Watson took on two of the game show Jeopardy!’s greatest champions and beat them both over three nights.

Taking a turn for the better.

Perhaps our native disinclination to think in terms of exponential growth is why we hear so much about – and why we experience – “disruption” in so many aspects of business today.

A business model like Uber’s simply could not exist before the smartphone, so there’s really no way anyone could have anticipated its utility or popularity in 2007, when Apple introduced the iPhone. It’s not fair to say that taxi and limo operators were caught napping, because Uber was truly revolutionary – not evolutionary – in leveraging newly available geolocation and cloud technologies to transform an industry.

It’s no wonder the company has become the poster child for disruption.

Today, we are all beneficiaries of the technologies that enable such disruption. The costs of computing power, storage, and networking have fallen almost as quickly as their capacities have increased.

Think about the early PCs: they had no onboard storage. Instead, you inserted a (truly) floppy disk holding a few hundred kilobytes and went to work. Apple’s first hard drive, the ProFile, offered 5MB of storage for $3,499. That’s $700 per MB. Today, you can run to your local Best Buy and pick up a 512GB memory card – more than 500,000MB – for the princely sum of $500, or about $0.001 per MB.
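
The per-megabyte arithmetic behind those figures is simple enough to check (treating 512GB as roughly 512,000MB):

```python
# Cost per megabyte, then and now, using the figures quoted above.
profile_cost, profile_mb = 3_499, 5        # Apple ProFile: 5 MB for $3,499
card_cost, card_mb = 500, 512 * 1_000      # 512 GB card, roughly 512,000 MB

print(f"ProFile: ${profile_cost / profile_mb:,.0f} per MB")    # ~$700 per MB
print(f"512GB card: ${card_cost / card_mb:.4f} per MB")        # ~$0.001 per MB
```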

The same price-for-performance declines hold true across the technology landscape. In 2017, we think nothing of the computing power we hold in our hands, deploy in our data centers, or access in the cloud.

We can expect increasingly easy access to data that will drive change in our business and personal lives, too.

Advances in miniaturization and high-speed wireless networking mean the mobile revolution is just beginning. It’s not hard to imagine that devices the size of an Apple Watch will soon have the utility of a home office. And advances in voice recognition mean we won’t need to interact with that tiny screen; we’ll just speak and listen.

Cybersecurity experts, too, are embracing machine learning in a game of leapfrog that pits the black hats’ algorithms against those of the good guys.

Big data – and the information derived from it – is yet another result of the shrinking cost and growing availability of massive computing power and storage capacity. For example, in medicine, machines can now scan and compare thousands of images to spot anomalies, leading to faster diagnoses. Police departments can use big data to identify patterns of activity and deploy resources more quickly to respond to, or even deter, criminal activity. And businesses can sift through vast amounts of customer and sales data to identify triggers and opportunities in the marketplace.

Cheap data is also driving the rapid evolution of machine learning. Every day we interact with algorithms that determine what we see online – from friend recommendations on Facebook to purchase suggestions on Amazon to the ads that pop up in almost all our browsers. These machine learning systems create models based on existing information and then make predictions and decisions based on new data. In other words, they “learn.”
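
As a minimal sketch of that learn-then-predict pattern – the library (scikit-learn) and the toy purchase-suggestion framing are illustrative assumptions, not anything named in the article – it might look like this:

```python
from sklearn.linear_model import LogisticRegression

# Existing information: [items_viewed, minutes_on_site] for past visitors,
# and whether each one made a purchase.
past_visits = [[1, 2], [2, 1], [8, 15], [10, 12], [3, 4], [9, 20]]
bought = [0, 0, 1, 1, 0, 1]

# The model "learns" a pattern from that history...
model = LogisticRegression().fit(past_visits, bought)

# ...then makes a prediction about a visitor it has never seen before.
new_visit = [[7, 18]]
print(model.predict(new_visit))         # predicted decision
print(model.predict_proba(new_visit))   # and the confidence behind it
```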

It’s likely that soon, big data, machine learning, and nanotechnology – the manipulation of individual atoms and molecules – will come together to address the subtitle of Kurzweil’s book, When Humans Transcend Biology.

In the not too distant future, we can expect microscopic smart machines to help us fight disease, and allow us to experience the exponential growth of technology well into old age.


This article originally appeared in the Summer 2017 edition of Solutions by Zones magazine.
