Friday, July 31, 2009

Rackspace Leverages VMware With Private Cloud Offering

Rackspace Hosting has launched its Private Cloud business.


Rackspace Hosting today announced its new Private Cloud offering, which allows customers to run the centrally managed VMware virtualization platform on private dedicated hardware environments. Rackspace recognizes the demand from enterprises for a more flexible and scalable hosting solution. Although multi-tenant cloud solutions are very flexible and cost-effective, they are not always right for every segment. The Rackspace Private Cloud's single-tenant architecture offers increased control and security, while still maintaining the scalability, flexibility and resource optimization that make shared cloud offerings so compelling.

Rackspace Private Cloud is an evolution of its popular dedicated virtual server (DVS) offering within the managed hosting business unit. In the last year, revenue from virtualization solutions has grown substantially, driven mainly by the increased flexibility, improved asset utilization and lower capital and operating costs that VMware's virtualization provides.

The new Rackspace Private Cloud offering provides new pricing and configuration options that further extend the value for customers. The savings can be substantial, and further cost reductions can be achieved by quickly provisioning, deploying, and tearing down virtualized server instances, using only what is needed. As with all Rackspace products, Private Cloud is backed by Rackspace's Fanatical Support.

Virtualization.com / Thu, 30 Jul 2009 13:54:29 GMT


Thursday, July 30, 2009

Rackspace pitches pricey, private cloud

Rackspace has launched a Private Cloud offering for its customers.
At roughly $6K per month, the price is steep, and the added value is debatable.

Hosting firm Rackspace has unveiled a "Private Cloud" service that allows users to rent a set of dedicated, physical servers and manage a VMware deployment through the company's MyRackspace.com portal. The company calls it an "evolution" of its dedicated virtual server (DVS) line.

Running only VMware virtualization software, the offering is a 180-degree turn from Rackspace's public cloud line, Cloud Servers and Cloud Files, which are open source, self-service and available to the public.

Early press releases from Rackspace said a minimum installation of eight virtual servers, which Rackspace says is covered by the same SLAs as regular physical hosting, starts at $6,000 per month, or $54,000 per year. That may seem like a steep premium when a VMware license could cost $3,500 per processor and a server capable of running eight VMs can be had for around $5,000.
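
A quick back-of-the-envelope check of that comparison, using only the figures above and assuming (purely for illustration) a two-processor server, suggests the buy-it-yourself option pays for itself in about two months, before counting power, space, bandwidth, and staff:

```python
# Rough buy-vs-rent arithmetic from the figures quoted above.
# The two-processor count is an assumption for illustration.
rackspace_monthly = 6000      # USD/month, quoted starting price
server_capex = 5000           # USD, server able to run ~8 VMs
vmware_per_cpu = 3500         # USD per processor for a VMware license
processors = 2                # assumed two-socket server

own_it = server_capex + vmware_per_cpu * processors
print(f"One-time cost to own: ${own_it:,}")                              # $12,000
print(f"Breakeven vs. renting: {own_it / rackspace_monthly:.1f} months")  # 2.0
# Ignores power, space, bandwidth and admin staff, which is
# exactly the gap managed hosting sells into.
```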

If an enterprise can own a server and run VMware on it for the cost of a few months of the same service at Rackspace, then where's the advantage? Rackspace's CTO John Engates said that the cost comparison is the same as for traditional hosting.

"This is not a way to lower costs across the board," said Engates. "This is for real businesses with real needs," in contrast to public cloud options like Amazon EC2 or Rackspace Cloud Servers that are often touted as economical and are dribbled out by the hour. Rackspace is hoping to straddle the divide for enterprises between the attractive features of cloud computing, like self-service and automation, and the comforts of security and reliability that come with owning hardware.

"What we hear from a lot of enterprise customers is that they don't want to be 100% virtualized" and if they are going to host, they don't want to share, said Engate. Security and compliance concerns make multi-tenanted public clouds, where data storage is shared with others, a non-starter for many companies.

James Staten, principal analyst at Forrester Research, says there isn't much new in the technology, only the packaging. Staten said that Rackspace's success with its public cloud services was the real impetus behind the new product line.

"They realize that the value of the halo of 'cloud' that they've achieved" can be spread to other parts of their hosting lineup without too much investment, he said. Staten said the new Rackspace lineup was more of a new twist on existing managed hosting products rather than cloud computing. He said it was really a way for enterprises to get comfortable with the concept through Rackspace's web GUI and VMware automation tools.

"It may be more easily consumed by enterprises facing political challenges internally," he said, while preserving familiar models of IT investment in hosting. Staten said that larger organizations don't really want to turn their IT infrastructure on its ear, but they do want to experiment with the touted advantages of cloud, like flexibility and ease of use.

Rackspace's move reflects a trend among all hosting providers, driven by the ease and widespread availability of virtualization technology. Staten noted that if the prices don't change and you are still renting servers, with or without virtualization, the "cloud" aspect is in name only, but it is a step toward cloud computing for gun-shy enterprise customers.

Tuesday, July 28, 2009

Cloud Brokers: The Next Big Opportunity?

As cloud computing solutions proliferate, a new business model called the "cloud broker" is beginning to attract attention. It can be seen as a business model that emerged from thinking centered on enterprise use cases.

As enterprises struggle to sort out the array of cloud computing options and services, analysts see a growing opportunity for "cloud brokers" to serve as intermediaries between end users and cloud providers. These cloud middlemen will help companies choose the right platform, deploy apps across multiple clouds, and perhaps even provide cloud arbitrage services that allow end users to shift between platforms to capture the best pricing.   
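
In code terms, the arbitrage role described above reduces to something like the sketch below: a broker holds current quotes from several providers and routes a workload to the cheapest one that meets its constraints. All provider names and rates here are invented for illustration.

```python
# Hypothetical cloud-arbitrage sketch: route a workload to the
# cheapest provider meeting its constraints. Rates are made up.
quotes = [
    {"provider": "cloud-a", "usd_per_vm_hour": 0.12, "windows": True},
    {"provider": "cloud-b", "usd_per_vm_hour": 0.10, "windows": False},
    {"provider": "cloud-c", "usd_per_vm_hour": 0.15, "windows": True},
]

def cheapest(quotes, needs_windows=False):
    eligible = [q for q in quotes if q["windows"] or not needs_windows]
    return min(eligible, key=lambda q: q["usd_per_vm_hour"])

print(cheapest(quotes)["provider"])                      # cloud-b
print(cheapest(quotes, needs_windows=True)["provider"])  # cloud-a
```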

"The future of cloud computing will be permeated with the notion of brokers negotiating relationships between providers of cloud services and the service customers," said Frank Kenney, research director at Gartner. "Enhancement will include managing access to these services, providing greater security or even creating completely new services." 

Gartner has seen a role for cloud brokers for some time. Last November Gartner analyst Tom Bittman predicted the emergence of thousands of clouds, prompting enterprises to either assemble in-house teams to manage specialized cloud service providers or look to third-party cloud brokers.

Wednesday, July 22, 2009

4 1/2 Ways to Deal With Data During Cloudbursts




Cloudbursting is an approach to handling spikes in demand that overwhelm enterprise computing resources by acquiring additional resources from a cloud services provider. It's a little like having unexpected houseguests and not enough beds for them to sleep in; some of them will have to be put up in a hotel. While such "peaking through the clouds" promises to maximize agility while minimizing cost, there's the nagging question of what exactly to do about the data such distributed applications require or generate. There are several strategies for dealing with cloudbursts, each of which has different implications for cost, performance, and architecture. One of them may fit both your application's unique requirements and your enterprise's overall business model.

1) Independent Clusters: In this strategy, there are minimal communication and data-sharing requirements between the application instances running in the enterprise and cloud data centers. Global load balancers direct requests to either location, but the application instances running in the cloud do not need to communicate (much) with the ones in the enterprise data center. Since these load balancers are probably already in place, there is no significant marginal cost of infrastructure to enable cloudbursting, just a requirement to keep contextual information such as resource allocation current. Applications that involve data coming to and from users that doesn't need to be saved between sessions — such as generating downloadable videos from uploaded photos — may not require much of a connection between the enterprise and the cloud.
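
A minimal sketch of that routing decision, assuming an illustrative capacity figure: the balancer keeps requests in the enterprise pool until it saturates, then spills the excess to the cloud. No shared state between the pools is needed, which is the point of this strategy.

```python
# Overflow ("cloudburst") routing for independent clusters.
ENTERPRISE_CAPACITY = 100     # concurrent requests; illustrative only
enterprise_in_flight = 0      # decremented when a request completes

def route_request():
    """Return which pool should serve the next incoming request."""
    global enterprise_in_flight
    if enterprise_in_flight < ENTERPRISE_CAPACITY:
        enterprise_in_flight += 1
        return "enterprise-pool"
    return "cloud-pool"       # overflow traffic during the demand spike
```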


This architecture provides the best economics, but doesn't cover all situations, since there may be data in the enterprise data center that needs to be accessed by the cloud-resident application, or new data may be acquired or produced as the cloud-based instances run, which must then be consolidated with what’s in the enterprise data center.

2) Remote Access to Consolidated Data: The easiest approach to accessing and updating enterprise data may be for application instances running in the cloud to access a single-instance data store. The viability of this approach depends on the pattern and intensity of reads and writes from the cloud data center to the enterprise, and on the bandwidth, latency, and protocol support of the data-networking or storage-networking approach used to connect the cloud app to the enterprise-based data — whether it be block-oriented, network-attached, content-addressed, or simply a database server.
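
Whether remote access is viable is largely a question of round-trip latency times access intensity. A rough feasibility check, with both numbers assumed for illustration:

```python
# Strategy 2 feasibility sketch: serial, dependent reads over a
# WAN each pay the full round trip. Both figures are assumptions.
rtt_seconds = 0.040            # 40 ms cloud-to-enterprise round trip
reads_per_transaction = 25     # dependent data accesses per user action

added = rtt_seconds * reads_per_transaction
print(f"WAN adds {added * 1000:.0f} ms per transaction")   # 1000 ms
# A full second of added latency rules out chatty applications;
# read-mostly workloads with good caching may still be fine.
```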


3) On-Demand Data Placement: Placing cloud data centers on a global network backbone can improve performance and reduce latency, but if I/O intensity and/or network latency are too high for remote access, then any needed data that isn't already in the cloud must be placed there at the beginning of the cloudburst, and any changes must be consolidated in the enterprise store at the end of the cloudburst. The question is: "How much data needs to get where, and how quickly?"

A large data set may be required, either because all the data is needed for computation (such as with seismic or protein-folding analysis), or because the pattern of reads is unpredictable, so the data needs to be present "just in case." If so, even with fast file transfer techniques, either there will be delays in beginning the cloudburst (from trying to pass a lot of data through a small pipe or by using physical disk delivery); or a large bandwidth pipe must be pre-positioned to quickly migrate the data, impacting cost; or the industry will need to move to more of an on-demand, pay-per-use approach for network capacity. Although progress is being made in this last model, the traditional challenge has been the high capital cost of high-bandwidth, last-mile access, which may involve digging trenches, laying fiber, deploying optical transport equipment, and paying for rights of way and rights of entry, costs that can run into the millions of dollars.
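
The "how much data, how quickly" question is simple arithmetic, and it explains why physical disk delivery and pre-provisioned big pipes keep coming up. For an assumed 10 TB working set:

```python
# Time to pre-position a data set at the start of a cloudburst.
def transfer_hours(terabytes, gbps):
    bits = terabytes * 8 * 10**12            # decimal terabytes to bits
    return bits / (gbps * 10**9) / 3600

for gbps in (0.1, 1.0):                      # 100 Mbps vs. 1 Gbps links
    print(f"10 TB over {gbps:g} Gbps: {transfer_hours(10, gbps):.0f} hours")
# 10 TB over 0.1 Gbps: 222 hours; over 1 Gbps: 22 hours.
```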


4) Pre-positioned Data Placement: Pre-positioning the data in the cloud to support application/server cloudbursting can be effective from a performance perspective, but it adds cost, since a full secondary storage environment and a metro or wide-area network must be deployed. This shifts the breakeven point for cloudbursting: a simple rule such as "the utility premium must be lower than the peak-to-average ratio" no longer holds once these additional fixed costs are factored in.
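
A sketch of how those fixed costs move the breakeven, under the simple rule above (all quantities illustrative): without fixed costs, bursting wins whenever the utility premium is below the peak-to-average ratio; a standing storage and network bill can tip the same workload back to owning.

```python
# Cloudburst breakeven sketch. Owning for the peak costs peak *
# unit_cost per month; pure utility costs avg * unit_cost * premium,
# plus any fixed monthly cost for pre-positioned storage and WAN.
def utility_is_cheaper(peak, avg, premium, fixed_monthly=0.0, unit_cost=1.0):
    return avg * unit_cost * premium + fixed_monthly < peak * unit_cost

# Premium 1.5 < peak-to-average ratio 2.0, so utility wins...
print(utility_is_cheaper(peak=200, avg=100, premium=1.5))                    # True
# ...until a fixed storage/network cost is added to the same workload.
print(utility_is_cheaper(peak=200, avg=100, premium=1.5, fixed_monthly=60))  # False
```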


4.5) BC/DR Plus Cloudbursting: If the cloud location doubles as the data mirroring or replication site for Business Continuity/Disaster Recovery, then support for cloudbursting can come along essentially for free. However, this may imply a requirement for bi-directional primary/secondary volumes: for example, data written at the enterprise site is replicated to the cloud, while data written in the cloud is replicated to the enterprise. And the primary/secondary volume designation must be fungible, or some sort of distributed data management, and possibly distributed record locking, strategy must be implemented. Technology to do this is still evolving.

Such an approach also changes the dynamics and business models associated with cloud environments for the enterprise. For example, a number of cloud service providers have been building mega-data centers. However, if cloud data centers must be within BC/DR distance of enterprise data centers, then a greater number of smaller cloud centers may be preferable to fewer larger ones. This in turn impacts statistical multiplexing effects such as peak smoothing and utilization maximization, limiting advantages such as preferential access to cheap power and cooling, etc. If not, longer distances must be traversed, potentially impacting application performance for highly transactional environments. Additional variations and combinations of the above are possible as well. For example, a master image of the application may be frozen and replicated to the cloud as data via a virtualization layer/container.

In any event, understanding storage options is key for two reasons. First, the business case for cloudbursting may change. Saving a couple of bucks on virtual server hours looks less attractive if wide-area storage networking and costly enterprise arrays are required. On the other hand, an optimal architecture can kill multiple birds with one stone — those of agility, business continuity and cost-minimization — while meeting transaction throughput and latency requirements through distributed, on-net processing. Secondly, different scenarios require different network interconnects between the enterprise and the cloud. The Internet alone may be fine if clusters are independent, but for most scenarios, other mechanisms such as VPNs or optical transport are likely to be required.

Joe Weinman is Strategy and Business Development VP for AT&T Business Solutions.



My Clippings / Sun, 19 Jul 2009 13:00:54 GMT


Understanding the Role of the Channel in Cloud Computing

This article argues that the biggest impact of cloud computing is that the channel, which has traditionally driven IT sales, loses its reason for existing.
The actual author is 6fusion, which describes itself as the only channel-focused business in the cloud industry.

Larry Walsh, of Channel Insider, recently tackled the sensitive subject of the role of the IT service Channel in the burgeoning market for cloud computing services and then subsequently posted a follow up. Since 6fusion is the only 100% Channel focused cloud computing provider in the market that I know of, I naturally read Larry's piece with an extra amount of attention.

Walsh makes a couple of very pointed observations about what is happening in the market regarding cloud computing and the impact it has on the thousands of IT intermediaries we often call Resellers or Managed Service Providers (MSPs), but I think Walsh's perspective on the role of the Channel is somewhat uninviting.

The first point he makes is that the majority of cloud service offerings are aimed at cutting the Channel out of the business equation. He couldn't be more right about this. Maybe I've been in this business for too long, but everything you hear and read about regarding the Channel and cloud computing stinks like some Michael Dell "how to screw the Channel" guide from the 1990s. Make no mistake: there is no room for the Channel in the cloud business plans of Microsoft, Salesforce.com, Google Apps (as I recently wrote about) or any of the other hosting providers that have jumped into cloud computing.

What I would add to Larry's analysis is that the threat of disintermediation is like none other we've seen in the industry. I know we've all heard the displacement theory before, but it's not like the old days. Cloud computing is very much a paradigm shift. It is not about a more efficient way to package, sell and ship the same commodity hardware and software. Cloud computing is a business model rooted in the fundamentals of how we consume technology. It's much bigger than most IT service providers can imagine and it's about control over the very elements that keep IT service providers in business.

The second point that Larry makes is that the Channel would be ill-advised to build a mini-cloud and hope for a measure of insulation from the threat. He points to the tough slogging MSPs had when they built out big NOCs for the rising tide of support subscriptions, but here is the true reality: cloud economics is about sheer volume. This is why Google, Microsoft, hosting shops, big telcos and the hardware vendors are leading the charge. An IT service provider thinking about dropping a couple hundred grand on some kit and virtualization software to 'take on the Man' had better think again. This is a mistake of epic proportions. It would be like bringing a hundred dollar bill to a high-stakes poker room.

So what is a Channel company to do in this situation? Walsh says MSPs and VARs should adopt an 'agency' approach, acting as an advisor to customers trying to sort out the morass of application integration, SLAs and contract matters.

6fusion is taking a much different approach with the Channel.

My experience leads me to disagree with Walsh: I believe whole-heartedly that VARs and MSPs can and should build cloud services into their portfolios without surrendering their client base to the likes of Google and Microsoft, and without picking up the tab for a rack full of servers to get into the game. We are helping the Channel go to market faster and with fewer financial resources every day. And if you speak with our growing number of Channel customers about it, they will tell you they are beginning to make more money than they could ever make peddling someone else's SaaS or yielding the infrastructure market to others. This in spite of the fact that the world is telling them they no longer matter. Again.

Washington to Have Its Own Cloud Store

News that the U.S. government has begun building a cloud store for its internal agencies, a system designed to make cloud computing services easy to procure.
 

Agencies can shop for cloud computing infrastructure, web applications and services paid for by credit card or requisition

The federal government is going to open its own online cloud store where agencies can shop for cloud computing infrastructure, web applications and services paid for by credit card or requisition.

Federal CIO Vivek Kundra anticipates the wares will be segmented into software-as-a-service, infrastructure-as-a-service, platforms-as-a-service and citizen engagement services.

When this novelty will be available is unclear.

NSA Using Cloud Model For Intelligence Sharing

The NSA (National Security Agency) has announced that it will use Hadoop to build a cloud computing system providing integrated access to its data in its many different formats.

New system will run the Hadoop file system on commodity servers and include search, discovery, and analysis tools.

The National Security Agency is taking a cloud computing approach in developing a new collaborative intelligence gathering system that will link disparate intelligence databases.

The system, currently in testing, will be geographically distributed in data centers around the country, and it will hold "essentially every kind of data there is," said Randy Garrett, director of technology for NSA's integrated intelligence program, at a cloud computing symposium last week at the National Defense University's Information Resources Management College.

The system will house streaming data, unstructured text, large files, and other forms of intelligence data. Analysts will be able to add metadata and tags that, among other things, designate how securely information is to be handled and how widely it gets disseminated. For end users, the system will come with search, discovery, collaboration, correlation, and analysis tools.

The intelligence agency is using Hadoop, whose distributed file system and MapReduce engine are open-source implementations of Google's parallel processing designs, because it makes it easier to "rapidly reconfigure data" and because of Hadoop's ability to scale.
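
For a feel of what "reconfiguring" an analysis means in Hadoop terms: a job is just a mapper and a reducer run over whatever sits in the file system. The hypothetical Hadoop Streaming pair below counts analyst-applied tags across records; the tab-separated record format is invented for illustration.

```python
#!/usr/bin/env python
# mapper.py -- emit (tag, 1) for every tag on a record.
# Assumed input format per line: "doc_id<TAB>tag1,tag2,..."
import sys

for line in sys.stdin:
    parts = line.rstrip("\n").split("\t")
    if len(parts) < 2:
        continue
    for tag in parts[1].split(","):
        print(f"{tag}\t1")
```

```python
#!/usr/bin/env python
# reducer.py -- sum counts per tag (input arrives key-sorted).
import sys

current, total = None, 0
for line in sys.stdin:
    tag, count = line.rsplit("\t", 1)
    if tag != current:
        if current is not None:
            print(f"{current}\t{total}")
        current, total = tag, 0
    total += int(count)
if current is not None:
    print(f"{current}\t{total}")
```

Swapping in a different analysis means replacing these two small scripts rather than restructuring the data store, which is the flexibility Garrett is pointing at.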

The NSA's decision to use cloud computing technologies wasn't about cutting costs or seeking innovation for innovation's sake; rather, cloud computing was seen as a way to enable new scenarios and unprecedented scalability, Garrett said. "The object is to do things that were essentially impossible before," he said.

NSA's challenge has been to provide vast amounts of real-time data gathered from intelligence agencies, military branches, and other sources of intelligence to authorized users based on different access privileges. Federal agencies have their own systems for sharing information, but many remain disconnected, while community-wide systems like Intellipedia require significant user input to be helpful.

The NSA effort is part of Intelligence Community Directive 501, an effort to overhaul intelligence sharing proposed under the Bush administration. Current director of national intelligence Dennis Blair has promised that intelligence sharing will remain a priority.

"The legacy systems must be modernized and consolidated to allow for data to actually be shared across an enterprise, and the organizations that collect intelligence must be trained and incentivized to distribute it widely," he said in response to questions from the Senate prior to his confirmation.

The new system will run on commodity hardware and "largely" on commercial software, Garrett said. The NSA will manage the arrayed servers as a pool of resources rather than as individual machines.

Outage for Amazon Web Services

Amazon Web Services has now been interrupted twice this month, following earlier lightning-strike damage.
The frequency of these outages is heightening concerns about the QoS level of cloud computing.
 

Amazon's cloud computing services have experienced performance problems this afternoon, with multiple services affected. There are also numerous reports of users briefly being unable to access the main Amazon.com retail site. 

Amazon's Service Health Dashboard showed problems on the EC2 computing cloud in the US. "We detected a period of elevated packet loss from 12:31 PM PDT to 12:46 PM PDT in a single Availability Zone," Amazon reported. "We are continuing to monitor the situation." The dashboard also showed elevated error rates on Amazon's CloudFront CDN, SimpleDB database service and Mechanical Turk freelance marketplace. The downtime was also confirmed by monitoring services CloudStatus and enStratus, which both show the Amazon services available again as of 2 pm Pacific.

The outage is the second in a month for Amazon Web Services, following a June 11 incident in which a lightning strike damaged power equipment at one of the company's data centers, disrupting service for some AWS customers. Today's problems come at a time of growing scrutiny of the reliability of cloud computing providers. EC2 previously experienced extended outages in February 2008 and October 2007.

eyeOS Takes the Operating System Into the Cloud

eyeOS has launched a service delivering an OS plus applications in a cloud computing environment.

Several vendors offer similar technology, including iCloud.com and Cloudo.com, and it appears to be gaining popularity as a desktop virtualization solution in the education and SMB markets.



With the rising popularity of cloud computing, entire operating systems designed to work in the cloud should come as no surprise. Indeed, virtual computing environments are becoming so prevalent that some suspect Google's development of the Chrome browser -- and more recently the Chrome operating system -- is nothing more than cloud computing in disguise.

eyeOS is an open source cloud computing operating system that easily installs on a Web server and is designed for personal or collaborative use. It's a great option for schools, small businesses, and public Internet access points like libraries that need a straightforward operating system with a familiar feel.

eyeOS is also available in a hosted option, with all the same bells and whistles as the version you install on your own hardware. I signed up for an account to take it for a test-drive and discovered it makes a very good alternative to an operating system localized on an individual machine.

Registering a new account is a super-quick process: select an alphanumeric username, choose a password, and you're in. eyeOS immediately takes you to a customizable desktop pre-populated with shortcuts to a calendar, address book, and more. Drop-down menus give you access to the rest of the apps available in eyeOS.

[Screenshot: eyeOS public desktop]

A widget with quick links to four actions pops up automatically at login, but can be easily disabled.

[Screenshot: eyeOS quick-links widget]

eyeOS offers a typical collection of office tools like a word processor, address book, calendar, and spreadsheet app. There's also a whole slew of additional apps that, other than taking the naming convention a little too far at times (eyeMp3, eyeVideo, eyeChess, eyeCalc), most users will find handy or entertaining to have around.

[Screenshot: eyeOS applications]

Naturally it's not possible to tinker under the hood too much with the hosted version of eyeOS, but you can kill processes and track application behavior with the Process Manager.

[Screenshot: eyeOS Process Manager]

The email client is bare-bones but gets the job done. It easily resolved the IMAP settings for Gmail and Yahoo, and adding accounts is a snap if you're of a mind to manage your email through the eyeOS hosted service.

[Screenshot: eyeOS email client]
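
"Resolving the IMAP settings" mostly means knowing the providers' well-known SSL endpoints and speaking standard IMAP, roughly as in this generic Python sketch (not eyeOS code; the host names are the providers' published IMAP servers):

```python
# What auto-detecting IMAP settings boils down to: map the mail
# domain to its well-known IMAP-over-SSL host and log in.
import imaplib

KNOWN_IMAP_HOSTS = {
    "gmail.com": "imap.gmail.com",
    "yahoo.com": "imap.mail.yahoo.com",
}

def inbox_message_count(address, password):
    host = KNOWN_IMAP_HOSTS[address.split("@", 1)[1]]
    conn = imaplib.IMAP4_SSL(host, 993)        # standard IMAPS port
    conn.login(address, password)
    status, data = conn.select("INBOX", readonly=True)
    conn.logout()
    return int(data[0])                        # count reported by SELECT
```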

 

3PAR Introduces Cloud-Agile Program Aimed at Enterprise Cloud Computing Leaders

3PAR licenses 3PAR Utility Storage, platform software that lets cloud computing providers deliver cloud storage services. According to IDC, the storage cloud business is forecast to grow substantially.
3PAR's technology has been adopted by major cloud storage service providers including Attenda, Terremark, Verizon Business, and Datapipe.

3PAR, the leading global provider of utility storage, announced today the 3PAR Cloud-Agile program, a new partnership initiative to promote the adoption of cloud computing and cloud-based services offered by leading providers with infrastructures powered by 3PAR Utility Storage. The 3PAR Cloud-Agile program gives partners the opportunity to promote differentiated virtual private array (VPA) and disaster recovery (DR) offerings under the respective names 3PAR Cloud-Agile: SECURED and 3PAR Cloud-Agile: ASSURED.

3PAR created the Cloud-Agile program in response to an increase in demand for enterprise IT delivered as a utility service and a corresponding uptake in 3PAR Utility Storage adoption among leading IT hosting providers. Seven of the world's top 10 revenue-generating managed service providers as identified by the "Winter 2009 Managed Hosting Report" issued by Tier1 Research use 3PAR Utility Storage as the storage foundation for their cloud-based service offerings.

According to analyst firm IDC, worldwide IT cloud services spending is projected to grow by a factor of three -- from $16.2 billion in 2008 to $42.3 billion in 2012 -- as enterprise IT shifts away from the use of dedicated infrastructures toward a shared, virtualized utility service model. Within this market, storage-related cloud services are predicted to account for 5 percent of 2008 spending -- a figure projected to grow to 13 percent of overall IT cloud services spending by 2012 -- representing anticipated growth of nearly sevenfold (Source: IDC eXchange, "IT Cloud Services Forecast - 2008, 2012: A Key Driver of New Growth," blogs.idc.com/ie/?p=224, Oct 2008).
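
The "nearly sevenfold" figure follows directly from those percentages, as a quick check shows:

```python
# Checking IDC's "nearly sevenfold" storage-cloud growth claim.
total_2008, total_2012 = 16.2, 42.3       # $B, from the forecast above
storage_2008 = total_2008 * 0.05          # 5% of 2008 cloud spending
storage_2012 = total_2012 * 0.13          # 13% of 2012 cloud spending
growth = storage_2012 / storage_2008
print(f"${storage_2008:.2f}B -> ${storage_2012:.2f}B ({growth:.1f}x)")
# $0.81B -> $5.50B (6.8x), i.e. "nearly sevenfold"
```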

"We see that interest in cloud computing services has risen as capital budgets have come under pressure and companies have looked to move enterprise IT costs to a variable basis that aligns more closely with actual business performance," said Benjamin S. Woo, IDC Vice President, Enterprise Storage Systems. "Enterprises are definitely realizing that augmenting core IT infrastructure with cloud services can provide a competitive advantage. Given 3PAR's focus and success with service providers, this corresponds well with the introduction of 3PAR's Cloud-Agile program."

The Cloud-Agile program is designed to raise awareness for the cost and agility benefits of cloud computing services and encourage the development of a mutually beneficial, robust cloud computing ecosystem. Program participation is limited to leading hosting providers using 3PAR Utility Storage as a strategic element of their cloud-based service offerings.

"As a charter member of the Cloud-Agile program and an early 3PAR customer, we are able to add enterprise-class utility storage capabilities quickly and efficiently to our cloud-based offering, Computing as a Service," said Michael Marcellin, Vice President of Verizon Business Global Managed Solutions. "This agility is particularly important for Verizon Business since it enables us to more flexibly serve our enterprise customers as they seek alternatives to traditional IT models."

3PAR Cloud-Agile program participants such as Attenda, DataPipe, Terremark, and Verizon Business say their utility storage infrastructures -- built on 3PAR InServ® Storage Servers -- have enabled them to distinguish themselves with an agility that is lacking from others in their market. These companies share a significant investment in 3PAR Utility Storage, which has provided a strategic advantage by enabling them to handle unpredictable growth, control costs, accelerate book-to-bill cycles, assure service levels, and efficiently deliver value-added services. With 3PAR, these companies have reduced service request response times to minutes, versus days, weeks, or even months previously.

"3PAR Utility Storage has been a foundational technology for Terremark," said Jason Lochhead, CTO Hosting Services at Terremark. "The agility and cost-efficiency that 3PAR Utility Storage has given our infrastructure has been strategic to our service offerings. As a charter member of the Cloud-Agile program we look forward to working closely with 3PAR to further promote and enhance our portfolio of value-added cloud and managed storage services to better meet the changing needs of our customers."

For 3PAR Cloud-Agile partners, 3PAR Utility Storage has also contributed to savings on facilities and infrastructure costs while significantly reducing storage administration time. This has meant improved time-to-revenue, lower storage total cost of ownership (TCO), and tighter alignment between costs and revenues. For example, in a survey of 3PAR's cloud computing service provider customers, 69% of IT organizations estimated that, by using 3PAR Utility Storage to power their cloud computing services, they have saved 40% or more in total capacity as compared to traditional storage alternatives (Source: TechValidate).

"We chose 3PAR for its industry-leading flexibility and agility," said Simon Hansford, VP of Products and Marketing at Attenda. "The decision to invest in the 3PAR platform has cut our capacity and array purchase requirements by at least 60% as compared to other systems."

"Our clients want the benefits of cloud computing but they don't want the risk associated with unproven technologies," said Mark Cravotta, Vice President for Worldwide Sales at DataPipe. "3PAR Utility Storage delivers the agility necessary for rapid implementation of secure and standards-compliant solutions. 3PAR Utility Storage is particularly strategic for our business as it allows our clients to reduce storage CAPEX and enables DataPipe to deliver fully managed, highly efficient storage with superior performance."

Under the terms of the program, participants receive expert technical training to enhance the quality of the cloud-based services that these providers are able to offer to end users. The program also offers a combination of lead sharing, sales training, and joint marketing to increase selling opportunities and grow awareness of the benefits of cloud computing. In addition, under the program, participating providers have the opportunity to offer enhanced services built on 3PAR Utility Storage under the names 3PAR Cloud-Agile: SECURED and 3PAR Cloud-Agile: ASSURED.

Oracle (finally) talks to the Virtual Iron customers, discloses the integration roadmap

Oracle has disclosed how it will migrate customers following its acquisition of Virtual Iron.


Last month virtualization.info reported how Oracle killed the Virtual Iron brand immediately after the acquisition, laying off all but 10 employees, terminating the reseller program and severely limiting existing customers' ability to buy new licenses or upgrade existing ones.

The move was so quick and brutal that Oracle gave the impression of completely disregarding the loss of 1,000-3,000 SMB customers. This represented an opportunity for VMware, which launched a discount program to attract those customers to vSphere.

It's possible that pressure from competitors (Microsoft also jumped in recently) had a positive impact on Oracle's strategy: the company finally decided to talk to Virtual Iron customers through a semi-private webcast held today by Wim Coekaerts, Vice President of Linux and Virtualization Engineering.

CohesiveFT Adds Ubuntu 9.04 SE and Debian GNU/Linux 5.0 to its Elastic Server Platform

CohesiveFT, an open-source-focused private cloud vendor, has announced support for Ubuntu 9.04 and Debian GNU/Linux 5.0. Its AWS EC2 integration solution will now be offered on these platforms as well.


CohesiveFT today announced the addition of both Ubuntu 9.04 Server Edition (Jaunty Jackalope) and Debian GNU/Linux 5.0 (Lenny) operating systems to its Elastic Server platform, the company's web-based factory for real-time virtual and cloud server assembly. The Elastic Server platform lets users assemble custom virtual and cloud servers using a point-and-click, self-service interface. The addition of the most recent stable versions of Ubuntu and Debian as operating system options gives users the ability to assemble and deploy their custom Ubuntu or Debian Elastic Servers to numerous virtual and cloud environments including Amazon's Elastic Compute Cloud (EC2).

Ubuntu is a widely popular operating system based on GNU/Linux and distributed freely by Canonical. Ubuntu is available under the GNU GPL but is commercially supported by Canonical. Debian is also an extremely popular operating system based on GNU/Linux. Debian is not supported by a commercial enterprise but by an independent, decentralized organization of developers.

The Elastic Server platform is a complement to virtualization and cloud offerings. Users assemble custom servers by choosing from a library of popular components. Once assembled, these custom application stacks can be configured to a variety of virtualization and cloud-ready formats, downloaded and deployed in real-time. There are over three thousand users of the service who have assembled more than seven thousand Elastic Server images for public and private use. The addition of Ubuntu Jaunty Jackalope and Debian Lenny operating systems highlights CohesiveFT's platform momentum following recently introduced support for Eucalyptus, Fedora Core 10, ElasticHosts, and KVM.

 

The Single Biggest Reason Public Clouds Will Dominate the Next Era of IT

An analysis of why cloud computing is the next-generation IT solution.
It makes the case with a very simple argument.


In the past year, I've had hundreds of conversations with clients and the press about the emerging cloud services model and its impact on the IT industry. As you might imagine, more than a few folks question whether the cloud services model will really be as pervasive and transforming as its proponents argue. The skeptics point, legitimately, to the many remaining challenges of cloud services adoption, particularly around security, availability, performance, limited customization, lack of standards, etc.

My response to the skeptics is very simple: within the next several years, none of those challenges will make a bit of difference to the vast majority of customers. They will still choose, in large numbers, public IT cloud services as core elements of their IT services delivery portfolios. They will do so for one big reason: the public cloud is where the best and richest variety of business solutions will increasingly be found. (You could certainly argue that this is already the case in the consumer IT solutions world.)

The shift of the latest and greatest business solutions to the Web is happening because the cloud is winning the war for developers: a rapidly growing number of developers see the Web as the most attractive "platform" on which to quickly and affordably deploy their solutions. It's not a mystery: the cloud dramatically reduces the barriers to customer adoption (and upgrade) and dramatically expands the market reach for solution developers. Can you imagine a developer of a hot new solution choosing not to deploy in a Cloud/SaaS mode? Hard to imagine. They might not do so exclusively - they may continue to also develop for the big on-premise platforms, and many will also deploy their public cloud solution as a software appliance in a private cloud. But it's easy to see that the public cloud will be the number one deployment target for a large majority of solutions.

If this pattern sounds familiar, it should. We've seen this movie before: in the 1980s, people debated the relative benefits of PCs vs. mainframes (or minicomputers). The PC proponents pointed to dramatically lower "cost per MIPS", and the PC opponents cited lower reliability or the lack of legacy tools in the PC world. In the end, the real battle was not about any of these things - it was about the migration of an enormous amount of developer energy and solutions to the PC (and Wintel server) platforms. The new platform (PC, Client/Server) was the place you'd find the best and newest solutions. (This is why the battle among the "Platform-as-a-Service (PaaS)" players is so strategic - they're all vying to repeat Microsoft's 1980s/90s Windows story, by attracting the richest solution ecosystems to their Cloud platforms.)

In the PC and Client/Server era, customers followed the solutions, and money flowed into the industry to develop solutions to the limitations of the new platforms.  We'll see the same pattern this time - today's public cloud challenges will not magically disappear or become unimportant to users.  But as more leading solutions pull more customers to the cloud, there will be more incentive for the industry to invest in developing creative solutions to these challenges.

IDC eXchange / Tue, 30 Jun 2009 14:20:10 GMT


Users See Cloud As an Alternative Financing Strategy

Why cloud computing interests not just the technologists but CFOs as well.


Everyone knows that one of the top cloud services model benefits, according to users, is the ability to stream payments out over the offering's useful life, rather than paying the entire cost up-front. But I still found it intriguing when IDC colleague Jennifer Koppy recently presented additional data points that support the strong economic appeal of the cloud model:

[Survey chart: user interest in acquisition alternatives to traditional IT leasing]

This survey finding, from the IDC Leasing and Financing Survey Results (IDC#218599, June 2009) report, shows user interest in a number of acquisition options, as alternatives to leasing.  Two things stand out:

  • Users rated cloud computing as the top alternative to traditional IT leasing. Cloud computing garnered the highest average rating (2.7 out of a maximum 4), as well as the highest percentage of respondents (27%) indicating an interest level of 4 ("very interested").  It's notable that the third highest-rated alternative was "utility-type computing", which is synonymous with cloud computing.
  • Cloud computing edged out outsourcing as a leasing/financing alternative. In tough times, as CIOs are squeezed, they've traditionally looked to outsourcing as a method for lowering costs and spreading them out. No doubt this finding will be particularly interesting - and challenging - to outsourcing services providers, most of whom are currently trying to determine just how serious they should be about adding cloud services options to their services portfolios.

It's clear that cloud computing is of growing interest, not just to the technologists, but to the money people - the CFOs, CEOs, Procurement VPs, as well as senior IT execs - who think about the capital and cost implications of IT. And from these ratings, it looks like their initial impressions are positive. The implications for the IT leasing and finance players, as well as traditional IT outsourcers, are obvious: they need to quickly determine how they will get their share of the growing cloud opportunity.

IDC eXchange / Tue, 30 Jun 2009 18:58:10 GMT


HP Buys IBRIX to Keep Up With Storage Trends

HP has acquired IBRIX, a company with storage cloud technology.


HP  said today that it has agreed to buy IBRIX, a Billerica, Mass.-based maker of software that allows customers to build out scalable storage clouds. Terms of the deal, which will augment HP's sales to businesses requiring high-performance computing, were not disclosed. Like Caringo and Parascale, IBRIX offers customers a way to create large storage clouds inside their data centers quickly using cheap, off-the-shelf gear.

It's unlikely that HP would use IBRIX to build a hosted storage cloud of its own; rather, it will offer the company's tools to its customers, which can then use the IBRIX storage cloud to keep tens of petabytes of data at the ready. IBRIX has been an HP partner for the last three years; it also has partnerships with Dell and EMC. As detailed in this story from The Register, HP's servers were used in conjunction with IBRIX's software in the creation of "Monsters vs. Aliens," allowing animators to keep terabytes of data accessible for manipulation. From the story:

This HP-IBRIX architecture was a key technology element in the studio's pioneering stereoscopic 3D animated film format used in Monsters vs. Aliens. This scale-out network-attached storage (NAS) file software had to provide access to millions of files, with, for example, more than 17 million in a single rendering working set. In the event, the parallelised file serving software enabled DreamWorks' artists to do things up to five times faster.

The animation industry is a big user of high-performance computing, and a leading indicator of the types of performance needs we'll see regular businesses adapt within the next decade. If big customers don't want to use HP's existing storage gear, buying IBRIX will still keep HP in the storage game.

 

RE: Is the Data Domain Saga Over?

The fight between EMC and NetApp over the acquisition of Data Domain appears to have been settled, with EMC ultimately making the acquisition.

 

On Monday EMC Corp. raised its all-cash offer for deduplication specialist Data Domain (DDUP) to $33.50 a share, or $2.1 billion, topping a $30 a share cash and stock offer from NetApp (NTAP). EMC, the world's largest data storage firm, also said the Federal Trade Commission has granted regulatory approval of the deal. The takeover battle began when NetApp offered $25 a share for Data Domain back in May, which prompted a $30 a share offer from EMC.

The ball is now in NetApp's court. "In response to EMC's revised, unsolicited offer, the NetApp Board of Directors will carefully weigh its options, keeping in mind both its fiduciary duty to its stockholders and its disciplined acquisition strategy," said Dan Warmenhoven, chairman and CEO of NetApp, in a statement. "We will provide an update shortly."

But Data Domain's largest shareholder appears to believe the battle is over. Tech Trader Daily reports that Artis Capital Management President Stuart Peterson told Dow Jones in an e-mail that it is "unlikely" that NetApp will ratchet up its previous bid. Artis held 10.6% of DDUP as of June 4, but Peterson now says the firm has "chosen to reduce our position considerably."



Nirvanix Completes SAS 70 Type II Audit

Nirvanix, a cloud storage provider, has announced that it passed the SAS 70 audit.
SAS 70 is a data-management standard for online service businesses that is increasingly required for compliance regimes such as Sarbanes-Oxley.
 

(WEB HOST INDUSTRY REVIEW) -- Cloud storage service provider Nirvanix (www.nirvanix.com) announced on Wednesday it has completed the Statement on Auditing Standards No. 70 Type II audit that verifies that its control processes have been formally evaluated and tested by an independent accounting and auditing firm.

In addition, Nirvanix says its security and control processes have been audited and approved by multiple Fortune 50 organizations.

As an auditing standard for service organizations developed by the American Institute of Certified Public Accountants, SAS 70 is a widely recognized assessment of a company's strengths of internal controls based upon identified objectives.

Type II certification includes both a description of these objectives as well as tests validating the effectiveness of the organization's controls.

Verizon Business, Switch and Data, and CRG West all recently announced they have completed the SAS 70 Type II audit.

Nirvanix was found to continually maintain its information security policy as well as monitor and test its network.

"Enterprises looking to make a move to the cloud have a great resource in SAS 70 Type II certification that confirms a solution provider's ability to fulfill its business requirements as promised, consistently and reliably," says Jim Zierick, president and CEO of Nirvanix. "Third-party validation of security and control processes of cloud storage providers helps ensure that information remains safe and secure. SAS 70 Type II certification is a standard that all enterprises should look for in any cloud storage provider."

The Nirvanix Storage Delivery Network offers a fully-managed, highly secure cloud storage service designed for enterprises.

The company recently began providing Samba Tech with content storage and delivery services for its SaaS-based rich-media management platform, Liquid Platform.

Comparing infrastructure as a service providers: Amazon, Rackspace emerge

A comparison of public cloud vendors' pricing.
 

Amazon Web Services has the best pricing based on dollar per virtual machine, but Rackspace fills in a few gaps—especially with persistent instance data, according to a report from JMP Securities.

In the report, JMP analyst Patrick Walravens surveys infrastructure-as-a-service providers. The report, which admittedly isn't exhaustive, looks at a slice of cloud computing providers to highlight the available options. Walravens looked at Amazon Web Services, Rackspace, GoGrid, Joyent and AppNexus. Walravens' bottom line:

We found that Amazon Web Services offers the most competitive pricing on a dollar per virtual machine basis. Large companies have noticed and are using its Web Services cloud computing platform to host their public websites. Of the top 500,000 websites, 1,422 are hosted on Amazon's EC2 service.

The big emerging player in the JMP Securities report is Rackspace. Walravens notes that Rackspace has an advantage over EC2 courtesy of its persistent instance data.

The Amazon vs. Rackspace comparison is a good one. In fact, Rackspace is likely to become more competitive: Walravens noted Rackspace's disadvantages, and two of them—no API access and no Windows support—are being resolved.

The rest of the report highlights the key players–Amazon, GoGrid, Rackspace, Joyent and AppNexus. The overall message here is to read the fine print of these providers—features and pricing are all over the map.
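
Reading the fine print amounts to normalizing each quote to the same unit before comparing. A sketch of that normalization follows; every rate in it is a placeholder, since real 2009 pricing varied by instance size, region, and commitment.

```python
# Normalize IaaS quotes to a comparable monthly figure.
# All rates are placeholders, not any provider's real prices.
HOURS_PER_MONTH = 730

def monthly_cost(vm_per_hour, gb_stored, gb_out,
                 storage_per_gb, transfer_per_gb):
    return (vm_per_hour * HOURS_PER_MONTH
            + gb_stored * storage_per_gb
            + gb_out * transfer_per_gb)

# Cheaper VM, pricier storage/bandwidth...
print(monthly_cost(0.10, 100, 500, storage_per_gb=0.15, transfer_per_gb=0.17))  # 173.0
# ...can cost more per month than a dearer VM with bundled extras.
print(monthly_cost(0.12, 100, 500, storage_per_gb=0.00, transfer_per_gb=0.10))  # 137.6
```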

[Chart: provider pricing at a glance]

Add it up and pricing is all over the map. Walravens notes that it's difficult for buyers to compare pricing. He recommends reading the fine print to get a valid cost analysis. Here are a few questions to ponder.

BMC to link up with Amazon Web Services for hybrid cloud deployments

BMC has announced a partnership with AWS to launch an operations-management application service linking enterprise data centers with Amazon Web Services.

Concretely, BMC's existing data center management tools will gain a portal for consuming AWS resources.
 

 

BMC Software on Wednesday will detail plans to collaborate with Amazon Web Services to ease hybrid cloud infrastructure deployments.

Despite a lot of chatter about cloud computing, most enterprises will take a hybrid approach to their data centers and link so-called private clouds—also known as your data center—with outside resources. BMC, a leading data center management software company, is the latest enterprise player to partner with Amazon Web Services.

BMC said that its customers will be able to extend their data centers to Amazon's Elastic Compute Cloud (EC2) via its Business Service Management software.

In a nutshell, enterprise customers can request computing resources through a BMC portal and provision as needed. Asset tracking on EC2 will be integrated with BMC's management database. BMC offers a bevy of data center management, optimization and automation applications.
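
BMC hasn't published the integration's internals, but the EC2 side of "request resources through a portal and provision as needed" reduces to calls like the following sketch, using the boto library of that era; the AMI ID and key pair name are placeholders.

```python
# Sketch of the EC2 provisioning call a management portal would
# wrap. AMI ID and key name are placeholders, not real values.
import boto

ec2 = boto.connect_ec2()          # credentials come from the environment
reservation = ec2.run_instances(
    "ami-12345678",               # placeholder machine image
    min_count=1, max_count=1,
    instance_type="m1.small",
    key_name="ops-keypair",       # placeholder key pair
)
instance = reservation.instances[0]
print(instance.id, instance.state)  # asset data a CMDB could ingest
```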

[Screenshot: the BMC dashboard]

Salesforce.com Opens Singapore Facility

Details on Salesforce.com's newly opened Singapore data center.
It was built and is operated by Equinix.
 

 

Salesforce.com (CRM) has brought its new Singapore data center online ahead of schedule, a move to accommodate strong adoption of Salesforce CRM applications and the Force.com platform across the Asia-Pacific (APAC) region. The company said today that its APAC customer base now tops 5,000 customers (out of 59,300 total).

The Singapore expansion, which was announced last year, was originally scheduled to go live before the end of fiscal 2010. The Singapore project includes a new Network Operations Centre (NOC) that enables 24×7, follow-the-sun monitoring of the company's data centres in North America and Singapore.

Salesforce.com's operations run on about 1,000 servers, most of which are concentrated in a single West Coast data center, with customer data mirrored at an East Coast site. Both sites are hosted within data centers operated by Equinix, according to Salesforce.com, which described its data center operations in an SEC filing earlier this year.

Groups Seek Cloud Computing Standards

A new initiative aimed at standardizing cloud computing.
The organization, called the Cloud Standards Coordination Working Group, will apparently work in collaboration with several related industry bodies, including OASIS, OMG, and the Cloud Security Alliance.
 

A number of bodies jointly seek standardization in areas including security, management frameworks, data exchange formats, and cloud taxonomies and reference models.

A group of standards bodies and industry groups has joined forces to collaborate on a strategy behind future cloud computing standardization efforts, the Object Management Group announced this week.

"Rather than one-by-one agreements and developing hundreds of standards that overlap, we're working together," Richard Soley, chairman of and CEO of the Object Management Group said Wednesday during a panel discussion at the National Defense University Information Resources Management College's Cloud Computing Symposium.

The group of standards bodies, called "the Cloud Standards Coordination Working Group," includes the Organization for the Advancement of Structured Information Standards, Object Management Group, Distributed Management Task Force, Storage Networking Industry Association, Open Grid Forum, Cloud Security Alliance, Open Cloud Consortium, Cloud Computing Interoperability Forum and the TM Forum.

The body is looking at standardization in a number of specific areas, including security, interfaces to infrastructure-as-a-service, information about deployment such as resource and component descriptions, management frameworks, data exchange formats and cloud taxonomies and reference models.

The form and scope of those standards is to be determined, and Soley said the groups are looking for much more input from both users and industry. "Standards don't work without heavy participation by prospective end users of those standards," he said. To help facilitate that process, the bodies have set up a wiki to allow community and customer participation in determining the best paths for standards development.

Community participation, deliberate action, and planning must be a vital part of any successful standards process, Gartner VP David Cearley said during the same panel conversation. Otherwise, he said, cloud standards efforts could fail miserably.

"Standards is one of those things that could absolutely strangle and kill everything we want to do in cloud computing if we do it wrong," he said. "We need to make sure that as were approaching standards, we're approaching standards more as they were approached in the broader internet, just in time."

Earlier this year, a group of companies and organizations created the Cloud Computing Manifesto, an effort that quickly became an object lesson in the potential pitfalls of standards initiatives, as several key companies, including Microsoft (NSDQ: MSFT), Google (NSDQ: GOOG), and Amazon (NSDQ: AMZN), decided not to participate.

Enterprise Cloud Computing Momentum in Asia Pacific

Salesforce.com's revenue in the Asia-Pacific region is growing strongly.
It already has more than 5,000 customers there, and its Singapore data center serving the region has been built and is now in operation.
 

Salesforce.com (NYSE: CRM), the enterprise cloud computing company, today announced that customers are adopting Salesforce CRM applications and the Force.com platform in record numbers across the Asia-Pacific (APAC) region.

The company surpassed 5,000 customers in APAC in its first fiscal quarter of 2010, including industry leaders Crocs, Pacnet, Ricoh and more.

  • Salesforce.com cloud computing applications, including the Sales Cloud and the Service Cloud, as well as the Force.com platform for developing and deploying custom cloud applications, allow customers to focus on managing their businesses - rather than managing the cost and complexity associated with software and hardware infrastructure.
  • To further accelerate international expansion and adoption of enterprise cloud computing, salesforce.com's first international data center is now live in Singapore.
  • Along with its two North American data centres, the new Singapore facility allows the company to meet the service demands of its rapidly growing international customer base as well as extend the capacity, redundancy and scalability of its infrastructure.
  • Supporting the data centre infrastructure will be a new Network Operations Centre (NOC) headquartered in Singapore. The NOC enables 24x7, follow-the-sun monitoring of the company's data centres in North America and Singapore.
  • "Asia-Pacific is our fastest growing market, and there has never been a better time for enterprise cloud computing," said Marc Benioff, chairman and CEO, salesforce.com. "Our new Singapore data center represents continued investment in our global real-time infrastructure to accelerate customer success with cloud computing worldwide."

"We welcome the decision of salesforce.com to locate its first data centre outside the U.S. in Singapore. This significant development will add to the richness of Singapore's infocomm landscape, increasing the confidence of SMEs in using SaaS for business productivity and contributing to our Cloud Computing eco-system," said Mr. Andrew Khaw, Senior Director and Group Head, Industry Development, of the Infocomm Development Authority of Singapore. "As an established trusted hub, with excellent infocomm connectivity, Singapore is in a strong position to support fast-growing international infocomm companies such as salesforce.com in delivering innovative web-based services to customers here and globally."

"We congratulate salesforce.com on the launch of its first international data centre in Singapore, just three years after establishing its Asia-Pacific headquarters here. Salesforce.com's rapid growth in the region and decision to host its enterprise cloud computing applications and information here attests to our reputation of being a stable, secure and trusted hub for businesses," said Mr. Manohar Khiatani, Deputy Managing Director, Economic Development Board of Singapore.

Enterprise Cloud Computing Momentum in Asia Pacific
Springboard Research forecast that the Asia Pacific Software-as-a-Service (SaaS) market will reach US$1.16B by 2010, based on a compound annual growth rate (CAGR) of 66 percent. By then, SaaS will comprise 15 percent of the enterprise software applications market in Asia Pacific.

"SaaS is moving beyond its roots in customer relationship management (CRM) into every area of the enterprise, including platforms for application development," said Dane Anderson, CEO of Springboard Research. "With all segments of the cloud computing market growing, we are seeing applications and platforms being adopted across the board by enterprises in the region and, perhaps most importantly, the enterprises we interview are very satisfied."

Asia Pacific Industry Leaders Adopting the Cloud
In its earnings report for the first quarter of fiscal 2010, salesforce.com noted 36 percent revenue growth in Asia Pacific compared to the same quarter a year prior, as customers continue to adopt enterprise cloud computing to run their businesses.

More than 5,000 customers across the Asia-Pacific region are now using enterprise cloud computing from salesforce.com to run their businesses, including AAPT, Acer, Amcor, CGU Insurance, Challenger Financial Services, Crocs, Datacraft, Flight Centre, Hang Seng Bank, Mizuho Private Wealth Management, Ottagi, Pacnet, Ramco, Ricoh, SPH Search, VSNL and Telecom New Zealand.

"With salesforce.com and the cloud computing model we were able to get up and running in record time compared to the on-premise CRM alternatives," said Mr. Richard Carden, Managing Director (Asia) of Pacnet. "We were able to quickly customize Salesforce CRM to the specific needs of our business, which in turn has driven strong user adoption with our employees."

"CIOs and IT departments at Asia-Pacific enterprises recognize the innovation, time-to-value and ease-of-use that salesforce.com delivers with cloud computing," said Lindsey Armstrong, executive vice president of international enterprise sales, salesforce.com. " Enterprises are realizing that this is the era of cloud computing, and salesforce.com gives businesses the ability to harness the power of cloud computing to better their companies."

Customers named in this release are part of the 59,300 customers of all sizes, industries and geographies that comprised the salesforce.com customer base as of April 30, 2009.

The Curious Case of CloudSwitch - The Troposphere

A new cloud computing vendor called Cloudswitch is attracting attention.

Although it has not yet announced a product, it has received an $8M Series B investment from Commonwealth Capital Ventures and is developing a business model called a Cloud Broker Service, to be delivered as an appliance.

A Cloud Broker Service is a service that can move data and workloads bidirectionally between multiple public clouds and an enterprise's private cloud; pilot deployments are reportedly already under way at several companies.

Security, control/management and integration are the key themes, but the details of the solution have not yet been disclosed.

A few days ago I had a call with Ellen Rubin, one of the co-founders of a new cloud startup called Cloudswitch.  Cloudswitch recently closed an $8M Series B round from Commonwealth Capital Ventures.  The interesting thing is that they are still in stealth mode and have not yet released a product, yet they have generated an enormous amount of buzz and attracted a great deal of money.  Is the cloud really this hot, or is there more to this story?  I decided to tell their story in pure David Fincher style: I will tell the curious case of Cloudswitch backwards.

  • I am given the green light to talk about Cloudswitch, a new kind of cloud service that is described as a cloud broker service.
  • After almost a year of ongoing discussions with Ellen, I finally get why they call it a switch.  They see themselves moving workloads back and forth between the enterprise and the cloud, as opposed to a cloudburst, which implies a unidirectional flow (a rough sketch of this broker pattern appears after this list).
  • Cloudswitch acquires new office space in Burlington, MA.  They now have a solid team of developers and managers, along with good funding, to focus on getting the product ready, and they are spending time with early customers and partners.
  • In June 2009 they closed an $8M Series B led by Commonwealth Capital Ventures, with existing investors Matrix Partners and Atlas Venture also participating.
  • They spend a lot of time working with enterprise customers and successfully complete the pilot phase of development.  They are now gearing up for a beta later this year.
  • The arrival of the new CEO, John, prompted a number of venture firms who know him to express interest in a preemptive Series B.  Although they had not planned to raise additional funding until 2010, they decided this was a great opportunity.
  • They build a core team and are fortunate to bring in John McEleney as their CEO.  John was formerly the CEO of SolidWorks and ComputerVision; he grew SolidWorks to over $350M in revenue and into a market leader in the CAD space, and he has a great track record of scaling companies.
  • Ellen pings me again in February 2009 to get me up to speed on their progress.  I am very excited about what they are doing.
  • They raised $7.4M in a Series A – the first part in July 2008, with Atlas Venture joining in the second part in December 2008.
  • They focus on solving some of the main issues that will enable enterprises to use cloud computing: security, control and integration with the enterprise data center.  Their product will be delivered as a software appliance.
  • Cloudswitch is founded by Ellen Rubin and John Considine in the spring of 2008, and they incubate the company at Matrix Partners.  They do a ton of research, asking people what they think of the idea.
  • I am contacted by Ellen Rubin, formerly head of marketing at Netezza, in May of 2008.  Ellen asks me what I think about the idea of a cloud broker appliance startup.  I am under no restriction against discussing it other than my word, and I decide not to divulge anything until Ellen gives me the green light.
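
Since the product is still under wraps, nothing below reflects CloudSwitch's actual design. It is a minimal, purely hypothetical Python sketch of the cloud-broker pattern the bullets describe, with every class and method name invented, showing why a bidirectional "switch" differs from one-way cloudbursting.

    # Hypothetical sketch of the "cloud broker" pattern described above.
    # CloudSwitch has not released its product, so none of this reflects
    # its actual design; every name here is invented for illustration.

    from dataclasses import dataclass

    @dataclass
    class Workload:
        name: str
        encrypted: bool = False  # security is one of the stated focus areas

    class CloudEndpoint:
        """One side of the switch: a private data center or a public cloud."""
        def __init__(self, name: str):
            self.name = name
            self.workloads: dict[str, Workload] = {}

        def deploy(self, w: Workload) -> None:
            self.workloads[w.name] = w

        def withdraw(self, name: str) -> Workload:
            return self.workloads.pop(name)

    class CloudBroker:
        """Moves workloads in either direction between endpoints,
        unlike one-way 'cloudbursting'."""
        def move(self, w_name: str, src: CloudEndpoint, dst: CloudEndpoint) -> None:
            w = src.withdraw(w_name)
            w.encrypted = True  # e.g., encrypt data before it leaves the source
            dst.deploy(w)
            print(f"moved {w_name}: {src.name} -> {dst.name}")

    # Usage: burst a workload out to a public cloud, then bring it back.
    private = CloudEndpoint("enterprise-dc")
    public = CloudEndpoint("public-cloud")
    private.deploy(Workload("payroll-batch"))

    broker = CloudBroker()
    broker.move("payroll-batch", private, public)   # outbound
    broker.move("payroll-batch", public, private)   # and back again

The only point of the sketch is the symmetry of move(): the same operation that pushes a workload out can bring it home, which is what distinguishes a "switch" from a unidirectional burst.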

Legal Technology - Cloud Computing Brings New Legal Challenges

An overview of the legal issues surrounding cloud computing.
The article notes that businesses need to review the various laws governing data security, and to confirm how cloud computing providers should respond to them.

In the early days of personal computing, users depended on "local" drives and stored their data on floppy disks kept in containers on desktops or in drawers. Applications from software manufacturers permitted users to create, manage and manipulate their business and personal information.

But in short order, software became more and more sophisticated and floppy disks were replaced by hard drives. Operating systems became faster, hard drives were developed with even more capacity and programs grew in size and scope.

Eventually the advent of networks allowed ever bigger programs to be shared among multiple users accessing ever-growing data banks. Nevertheless, networks remained largely tethered to the location of the users, who, at least theoretically, maintained both physical possession and control over the data.

The trend today is toward something different: While companies may still prefer their employees to be in geographic proximity to urban centers of business and government, the cost of prime real estate and the availability of fast online connectivity in many locations that would otherwise be considered remote make cloud computing a viable and cost-effective alternative. Accordingly, data and data applications that are kept in a cloud may be physically located on one or more remote servers but are nevertheless transparently available to company users.[FOOTNOTE 1]

Data kept in a cloud often is, or may be, shared among, or usable by, multiple parties. It can include information ranging from word processing documents and business presentations to employee or patient health information and tax or accounting records, to schedules, calendars and contacts. The key to cloud computing is the speed with which the data and applications can be accessed, rather than the capacity and speed of a personal computer's hard drive, as was crucially important in the past.

Even individual users are increasingly likely to be participants in the cloud computing phenomenon. E-mail services such as Google's Gmail, which stores users' e-mail on Google's own servers, are a perfect example of this growing development.

Given the explosive growth of cloud computing, it should be no surprise that it presents numerous legal issues for businesses. Two of the most significant are privacy concerns and the implications of cloud computing for pretrial discovery.

As with other forms of "outsourcing," businesses' duties to protect private or confidential data do not end with their transfer of the data to third-party vendors for storage or processing. A recent report from the World Privacy Forum, "Cloud Computing and Privacy," highlights a number of important privacy issues raised by cloud computing that corporate users of cloud computing should keep in mind.[FOOTNOTE 2]

For example, although the Gramm-Leach-Bliley Act[FOOTNOTE 3] permits financial institutions to disclose confidential consumer information to a third party such as a cloud computing service provider, the terms of any agreement between the financial institution and the provider must be carefully considered.

In addition, the Privacy Rule enacted by the U.S. Department of Health and Human Services under the Health Insurance Portability and Accountability Act[FOOTNOTE 4] requires that covered health plans, health care clearinghouses and health care providers enter into "business associate agreements" with cloud providers (and, of course, other third parties) before turning over so-called protected health information.

There may also be risks in using cloud computing providers to store confidential corporate information, such as trade secrets, without appropriate and specially negotiated agreements.

What undoubtedly can complicate the privacy issues in these and other situations is that the governing law might change depending on the cloud provider's physical location. Different rules can apply if storage is in a European Union country, arguably subject to the EU's Data Protection Directive,[FOOTNOTE 5] in multiple states within the United States, or in multiple locations around the world. Accordingly, it is essential that contracts for cloud computing services be negotiated with an eye to the type of data to be stored, the location of the servers and the particular legal obligations of the business whose data it is.

While a business might be able to make a claim against a cloud provider for the escape of private data, the business may not be able to insulate itself from liability by arguing that the privacy breach was the result of the provider's acts.

PRETRIAL DISCOVERY

An issue raised by cloud computing that may be even more difficult to parse than privacy is its implications for pretrial discovery in general and for electronic discovery in particular.

Generally speaking, pretrial discovery may be had of relevant documents that are in the "possession, custody or control" of a party.[FOOTNOTE 6] That means that a party is obliged to produce documents in its control, even if those documents are not literally in the party's possession when the demand is made.[FOOTNOTE 7]

Documents are under a party's control when it has the right, authority or practical ability to obtain them from a non-party.[FOOTNOTE 8] When a corporation relies on a cloud computing provider (or multiple providers), are those documents under its control? Even if they are, how can those documents be authenticated and proven to be reliable?

In Shcherbakovskiy v. Da Capo Al Fine, Ltd.,[FOOTNOTE 9] the 2nd Circuit U.S. Court of Appeals adopted the view that a party may be required to produce documents that it has the practical ability to obtain.

The circuit stated as follows:

Turning to the legal issues first, a party is not obliged to produce, at the risk of sanctions, documents that it does not possess or cannot obtain. See Fed. R. Civ. P. 34(a) ("Any party may serve on any other party a request … to produce … documents … which are in the possession, custody or control of the party upon whom the request is served …"); E.E.O.C. v. Carrols Corp., 215 F.R.D. 46, 52 (N.D.N.Y. 2003); see also Societe Internationale Pour Participations Industrielles Et Commerciales, S.A. v. Rogers, 357 U.S. 197, 204, 78 S.Ct. 1087, 2 L.Ed.2d 1255 (1958) (acknowledging that Rule 34 requires inquiry into whether party has control over documents); Fisher v. U.S. Fidelity & Guar. Co., 246 F.2d 344, 350 (7th Cir. 1957). We also think it fairly obvious that a party also need not seek such documents from third parties if compulsory process against the third parties is available to the party seeking the documents. However, if a party has access and the practical ability to possess documents not available to the party seeking them, production may be required. In re NASDAQ Market-Makers Antitrust Litig., 169 F.R.D. 493, 530 (S.D.N.Y. 1996).

Shcherbakovskiy did not define what established a "practical ability" to obtain documents, but courts have determined that the legal right to obtain documents or information from another may arise by contract[FOOTNOTE 10] or as a result of an agency relationship.[FOOTNOTE 11]

The Cloud Security Alliance, a not-for-profit association of cloud computing professionals, observed in a recent report, "Security Guidance for Critical Areas of Focus in Cloud Computing,"[FOOTNOTE 12] that cloud providers "have become custodians of primary data assets for which customers have legal responsibilities to preserve and make available in legal proceedings (electronic discovery), even if the customer is not in direct possession or control."

The report pointed out that cloud computing "challenges the presumption" that corporations and other businesses actually are in control of information or data for which they remain legally responsible.

Given the general principles governing pretrial discovery, and the Shcherbakovskiy ruling, cloud users should make certain that the contracts they enter into with cloud providers clearly explain the providers' responsibilities with respect to discovery and other litigation subjects.

Moreover, companies that face the prospect or likelihood of litigation should make certain that they choose cloud providers that are able to ensure the authenticity and reliability of the data they are maintaining, including metadata. Certainly, any "litigation hold" extended by a company as a result of anticipated or pending litigation must include company resources that are stored in cloud servers.

CONCLUSION

As cloud computing becomes better understood and more widely utilized, counsel will focus on both privacy and discovery issues to a greater extent than they do today, which will lead to negotiated resolutions of these issues and, on occasion, to litigation and court decisions.

As with many issues of technology, counsel will need to understand not just the legal precedent concerning cloud servers, but also the particular facts concerning their business' use of cloud servers, the type of data that is stored in the cloud, and the location and document retention practices of the service provider.

Shari Claire Lewis, a partner at Rivkin Radler, specializes in litigation in the areas of Internet, domain name and computer law. She can be reached at shari.lewis@rivkin.com.

:::: FOOTNOTES ::::


FN 1. For a detailed explanation of cloud computing, see, e.g., Lamia Youseff et al., "Toward a Unified Ontology of Cloud Computing," available at http://www.cs.ucsb.edu/~lyouseff/CCOntology/CloudOntology.pdf.

FN 2. The report is available at http://www.worldprivacyforum.org/cloudprivacy.html. For additional discussion of privacy issues in the cloud computing context, see, e.g., Randal C. Picker, "Competition and Privacy in Web 2.0 and the Cloud," 103 Nw. U. L. Rev. Colloquy 1 (July 2008).

FN 3. 15 U.S.C. §6802.

FN 4. See http://www.hhs.gov/ocr/privacy/hipaa/understanding/index.html.

FN 5. See "Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data," available at http://ec.europa.eu/justice_home/fsj/privacy/docs/95-46-ce/dir1995-46_part1_en.pdf, and http://ec.europa.eu/justice_home/fsj/privacy/docs/95-46-ce/dir1995-46_part2_en.pdf.

FN 6. See Fed. R. Civ. P. 26(b)(1) & 34(a)(1).

FN 7. See Fed. R. Civ. P. 34(a)(1).

FN 8. See, e.g., Babaev v. Grossman, CV03-5076 (DLI)(WDW), 2008 U.S. Dist. LEXIS 77731 (E.D.N.Y. Sept. 8, 2008).

FN 9. 490 F.3d 130 (2d Cir. 2007).

FN 10. See, e.g., Anderson v. Cryovac Inc., 862 F.2d 910 (1st Cir. 1988) (requiring production where seller of real property had control of report prepared for purchaser and maintained in purchaser's possession by virtue of provision in sales contract requiring purchaser to make records available to seller).

FN 11. See, e.g., JPMorgan Chase Bank v. Winnick, 228 F.R.D. 505 (S.D.N.Y. 2005) (holding that administrative agent suing on behalf of holders of debt was obligated to produce documents and information in possession of holders to the same extent as if the holders had brought the suit).

FN 12. "Security Guidance for Critical Areas of Focus in Cloud Computing" (April 2009), available at http://www.cloudsecurityalliance.org/guidance/csaguide.pdf.