Friday, January 15, 2010

Unstructured data to grow three times as fast as RDB data over five years: IDC forecast

Unstructured data, in short, refers to data whose structure makes it a poor fit for an RDB. Email, blogs, and much of the rest of the data that exists on the Web today are said to be of this unstructured kind. IDC predicts that over the next five years this type of data will show an astonishing 60% growth rate, while structured databases are forecast to grow about 20%.
The article argues that the cloud should be positioned, and adopted, as the way to manage this massive increase in data.

What Big Data Will Mean to IT

Applications designed to take advantage of new computing capabilities and respond to the needs of huge data

We’ve all heard and read about the enormous pace of growth of the cloud – being experienced now and expected to continue into the future.

And I think that what will drive that growth most powerfully will be the ever-expanding need for data storage.

A recent piece in the San Francisco Gate cited a study of enterprise storage needs done last year by the analyst firm IDC.

The study said that, over the next five years, structured data (the traditional row-and-column information contained in relational databases) will grow more than 20%. Meanwhile, unstructured data will rise at an astounding 60% compounded rate. That means structured data storage requirements will double, while unstructured data storage requirements will increase seven times. In short, application scale is growing dramatically.
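As a side note on the arithmetic (my own back-of-the-envelope sketch, not part of the IDC study), this is roughly how an annual growth rate compounds into a multi-year storage multiple, assuming the quoted percentages are compound annual rates; the exact multiples depend on how IDC defines and compounds them.

```python
# Back-of-the-envelope sketch (not from the IDC study): how a compound
# annual growth rate maps to a total storage multiple over several years.
def storage_multiple(annual_rate: float, years: int = 5) -> float:
    """Capacity multiple after `years` of growth at `annual_rate` per year."""
    return (1.0 + annual_rate) ** years

if __name__ == "__main__":
    for rate in (0.20, 0.40, 0.60):
        print(f"{rate:.0%}/yr compounded over 5 years -> "
              f"{storage_multiple(rate):.1f}x the starting capacity")
```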

So what does this mean for the future of IT? Basically, a key skill for IT folks will be staying on top of load variation. “The ability to respond to dynamic app load by rapidly altering application topology will be a fundamental IT skill,” says the article, written by Bernard Golden, CEO of the consulting firm HyperStratus.

This is called dynamic scaling, and the demand for it will outstrip what most IT organizations do today – keeping application environments stable and occasionally modifying topology through manual intervention by sys administrators.

Cloud providers offer orchestration (defining computing capacity in a single transaction, with the orchestration software obtaining the necessary individual resources from the underlying infrastructure to produce that capacity). But what will be needed for the volume of tomorrow’s applications is dynamism – the ability to rapidly, seamlessly, and transparently adjust resource consumption, all without human intervention, notes the story.
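To make that distinction concrete, here is a minimal, purely illustrative sketch of dynamism in the sense the article describes: application topology adjusted from measured load with no human in the loop. The `FakeCloud` class, its methods, and the thresholds are hypothetical stand-ins, not any real provider’s API.

```python
import random
import time

# Purely illustrative autoscaling loop; FakeCloud simulates whatever
# management API a real cloud platform would expose.
SCALE_UP_LOAD = 0.75    # add capacity above this average utilization
SCALE_DOWN_LOAD = 0.25  # release capacity below this average utilization

class FakeCloud:
    """In-memory stand-in for a cloud provider's management API."""
    def __init__(self, instances: int = 2) -> None:
        self.instances = instances

    def average_load(self) -> float:
        # Pretend each instance absorbs an equal share of a fluctuating demand.
        demand = random.uniform(0.5, 3.0)
        return min(demand / self.instances, 1.0)

    def add_instance(self) -> None:
        self.instances += 1

    def remove_instance(self) -> None:
        self.instances = max(1, self.instances - 1)

def autoscale(cloud: FakeCloud, cycles: int = 10, pause: float = 0.0) -> None:
    """Adjust topology from measured load, with no sysadmin in the loop."""
    for _ in range(cycles):
        load = cloud.average_load()
        if load > SCALE_UP_LOAD:
            cloud.add_instance()
        elif load < SCALE_DOWN_LOAD:
            cloud.remove_instance()
        print(f"load={load:.2f} instances={cloud.instances}")
        time.sleep(pause)

if __name__ == "__main__":
    autoscale(FakeCloud())
```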

Without it, you’ve got “buggy whip processes in a motorized world.” There is no way, in my mind, that I can see masses of IT admins on the job 24/7 adjusting computing resource levels.

I think the cloud is the answer here, and so does the author: “There’s no question that cloud computing, whether a public or private/internal variant, is the future of computing.” But there are and will be challenges in dealing with the demands of infinite scalability and highly variable demand.

Golden says that what’s expected for this new world of big data and ever-increasing computing demand is:

1. A need for highly-automated operations tools that require little initial configuration and subsequent “tuning” because they operate on AI-based rules

2. A huge amount of turmoil as sunk investment in outmoded tools must be written off in favor of new offerings better suited to the new computing environment

3. A change in the necessary skill sets of operations personnel from individual system management to automated system monitoring and inventory management

4. A new generation of applications designed to take advantage of new computing capabilities and respond to the needs of huge data.

As we transform the way companies compute and change the infrastructure that will be needed for the era of big data, I believe it will be more essential than ever for companies to get help measuring database capacity usage. Indeed, that’s why we’re seeing such explosive growth in cloud monitoring services that can measure usage and alert companies when they’re approaching their thresholds.
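At its core, that kind of monitoring boils down to comparing measured usage against provisioned capacity and raising an alert before the limit is reached. The sketch below is hypothetical (the 80% threshold and the names are mine, not any particular vendor’s):

```python
# Hypothetical sketch of the core of a capacity-monitoring check; the
# threshold and the alerting mechanism are placeholders, not a vendor product.
WARN_FRACTION = 0.80  # alert when usage reaches 80% of provisioned capacity

def check_capacity(name: str, used_gb: float, capacity_gb: float) -> None:
    fraction = used_gb / capacity_gb
    if fraction >= WARN_FRACTION:
        # A real service would page, email, or trigger automated scaling here.
        print(f"ALERT: {name} at {fraction:.0%} of capacity "
              f"({used_gb:.0f} GB of {capacity_gb:.0f} GB)")
    else:
        print(f"OK: {name} at {fraction:.0%} of capacity")

if __name__ == "__main__":
    check_capacity("orders-db", used_gb=850, capacity_gb=1000)
    check_capacity("logs-store", used_gb=300, capacity_gb=1000)
```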
