Friday, August 28, 2009

Amazon Launches Virtual Private Cloud

The details of the Virtual Private Cloud announced by Amazon Web Services, and the industry's reaction.

 
A diagram of Amazon Virtual Private Cloud and how it connects cloud-based resources to existing private networks.


Amazon Web Services has introduced Amazon Virtual Private Cloud (VPC), which allows companies to connect a set of Amazon EC2 instances with a corporate data center using a virtual private network (VPN) connection over the IPsec protocol. This offers a "cloudbursting" capability that allows enterprises to quickly expand the capacity of in-house applications while buying the extra capacity on a "pay as you go" usage-based model. The Amazon installation serves as an extension of the private network, as the EC2 instances within the VPC have no Internet-facing IP addresses. Amazon VPC is in limited beta and accepting applications.
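For a concrete sense of the moving parts (an isolated address space plus an IPsec tunnel back to the corporate network), here is a minimal sketch using the modern boto3 SDK. It post-dates the 2009 limited beta, which was driven through the EC2 API and a sign-up process rather than this code, and the CIDR blocks, gateway address and ASN below are placeholder values.

```python
# Hypothetical sketch: carving out an isolated VPC and wiring it to a
# corporate network over an IPsec VPN, using the modern boto3 SDK.
# CIDR blocks, the customer gateway IP, and the BGP ASN are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# 1. Create the private address space; instances here get no public IPs.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]
subnet = ec2.create_subnet(VpcId=vpc["VpcId"], CidrBlock="10.0.1.0/24")["Subnet"]

# 2. Describe the corporate side of the tunnel (the on-premises VPN device).
cgw = ec2.create_customer_gateway(
    Type="ipsec.1", PublicIp="203.0.113.12", BgpAsn=65000
)["CustomerGateway"]

# 3. Create and attach the Amazon side of the tunnel.
vgw = ec2.create_vpn_gateway(Type="ipsec.1")["VpnGateway"]
ec2.attach_vpn_gateway(VpnGatewayId=vgw["VpnGatewayId"], VpcId=vpc["VpcId"])

# 4. Establish the IPsec VPN connection between the two gateways.
vpn = ec2.create_vpn_connection(
    Type="ipsec.1",
    CustomerGatewayId=cgw["CustomerGatewayId"],
    VpnGatewayId=vgw["VpnGatewayId"],
)["VpnConnection"]
print("VPN connection id:", vpn["VpnConnectionId"])
```

Instances launched into the subnet above would then be reachable only over the VPN, matching the "no Internet-facing IP addresses" model described in the article.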

Here's a roundup of information and commentary about the new Amazon Virtual Private Cloud:

  • "This new offering lets you take advantage of the low cost and flexibility of AWS while leveraging the investment you have already made in your IT infrastructure," writes Amazon Web Services tech evangelist Jeff Barr, who provides a step-by-step guide toi deploying resources to Amazon VPC.
  • Amazon CTO Werner Vogels says the Amazon VPC offering was developed to meet the needs of CIOs frustrated with many private cloud offerings. "These CIOs know that what is sometimes dubbed 'private cloud' does not meet their goal as it does not give them the benefits of the cloud: true elasticity and capex elimination," Werner writes. "Virtualization and increased automation may give them some improvements in utilization, but they would still be holding the capital, and the operational cost would still be significantly higher … I define the cloud by its benefits, as those are very clear. What are called private clouds have little of these benefits and as such, I don't think of them as true clouds."
  • Looking beyond Amazon, the must-read post is Christofer Hoff's colorfully titled analysis: Calling All Private Cloud Haters: Amazon Just Peed On Your Fire Hydrant. "It should be noted that now that the 800lb Gorilla has staked a flag, this will bring up all sorts of additional auditing and compliance questions, as any sort of broad connectivity into and out of security zones and asset groupings always do," he writes.
  • Sam Charrington at Appistry counters Hoff with his analysis, Amazon VPC Pees in Pool, Not Just on Fire Hydrant. "With this announcement, Amazon is attempting, intentionally or not, to co-opt the notion of private clouds by adopting confusing and misleading terminology," Charrington writes. "By claiming 'isolation' and naming the service VPC, the offering at best contributes to industry confusion around private clouds. At worst it may be outright misleading."

Terremark Offers SAP Cloud Hosting

Terremark has begun offering SAP solutions as part of its cloud computing service menu.

Terremark Worldwide (TMRK) will offer hosting services for SAP solutions on its Enterprise Cloud infrastructure platform, the company said today. The agreement offers expanded options for customers looking for a cloud environment to run business applications from SAP, as the German software giant has been slow in deploying its own cloud platform.

"Terremark's hosting services will offer several benefits to both small and large SAP customers," said Michael Ressemann, global head of BPO Solution Delivery and Partner Enablement at SAP. "Our partnership with Terremark will provide customers the ability to experience significant cost savings, while not having to sacrifice the reliability of their hosted SAP applications."

The announcement continues the recent momentum for the Enterprise Cloud, which is hosting several of the federal government's cloud-hosted sites and also backing cloud offerings from partners like CSC.

The Enterprise Cloud is a managed infrastructure service running on Terremark's Infinistructure utility computing platform. The service seeks to marry the advantages of Amazon's utility computing platform - especially scalability and rapid deployment - with enterprise-ready reliability to support mission-critical applications.

Salesforce opens cloud to resellers, expanding AppExchange

Salesforce.com has begun rolling out a reseller channel for its Force.com service.

Salesforce is taking cloud computing on the road, opening its force.com platform to third-party channel re-sellers - everyone from the local one-man IT shop to the big systems integrators.

Today, the company will announce its new Value Added Reseller (VAR) program as a way of helping outside vendors who often help local, small-to-medium sized businesses manage their hardware, software and infrastructure needs.

By launching this reseller program, Salesforce is providing some answers to the questions that economy-bruised small businesses are probably asking about how to save money in the cloud. Many small business owners, Salesforce executives said, are still leery of the cloud but do trust the local support technicians and technology consultants who advise them on how to operate efficiently and save money. In an interview, Mark Trang, senior director for Global Partner Marketing at Salesforce, said:

Some people have counted the channel out when it comes to the cloud. I think it's a vital piece of the chain. They have built relationships with clients over the years, have trained their own teams, made educational investments and built out their domains. How do they translate all of this to the cloud? Should they move to the cloud? What should their strategy be?

The VAR program, Trang said, is designed to give consulting partners a way to make money in the cloud and maintain an ongoing revenue stream by adding apps and features to the force.com platform that can be sold to their customers - and renewed - with the channel partner, not Salesforce, owning the customer relationship and contract renewal.

Pricing for the VAR program is $7.50 per user, per month.

In addition, the company said it plans to add services listings to the company's AppExchange, an online marketplace where developers and companies buy and sell applications for the salesforce platform. Not only does it provide the VARs with a forum for finding new apps to offer their customers but also gives developers an opportunity to interact with them - possibly building relationships that could generate custom apps or extra business down the road.

The VAR program launches Wednesday. The changes to the AppExchange will occur next month.

Rackspace Offers Partner-Developed Cloud Tools

Rackspace has launched Cloud Tools, a site for tools that work with its cloud computing services.

Web hosting provider Rackspace Hosting (www.rackspace.com) announced on Tuesday it has launched Cloud Tools (tools.rackspace.com), an online service for sharing tools, applications and services developed by the company's strategic partners and independent developers for The Rackspace Cloud.

The new tools further expand on Rackspace's previous control panel and provisioning system for The Hosting Cloud, which was launched last year.

The new site is designed to match customers with companies and independent developers building services on The Rackspace Cloud platform that expand the functionality and openness of cloud computing.

The Cloud Tools site displays tools and applications developed on The Rackspace Cloud in four categories: Monitoring & Reporting, Development Tools, System Management and Client Software.

On the site, visitors can browse the latest tools, search for features, rate the tools and post comments.

For specific tools, users can view screenshots and user ratings, review user feedback and view demo videos before being directed to the strategic partner's website for download.

The site also has a "From the Community" section offering independent developers a way to publish their tools built using The Rackspace Cloud APIs and offer them to customers.

Cloud Tools features offerings such as RightScale's cloud management platform, Mixpanel's real-time user interaction analytics, and Cyberduck's open source file browser for Mac OS X, which supports Rackspace Cloud Files.

There are also tools from 15 other partners and developers including Beanstalk, Cloud Mobile, Cloudkick, Django Cumulus, Elastic Rack, enStratus, jclouds, Jungle Disk, LibCloud, Olark Live Website Chat, rPath, SOASTA, Sonian, Vanilla, and Zeus. Other companies and developers have also said that they are planning to make their tools, applications and services available in the coming weeks.

Some of these tools are free while others can be purchased for a fee established by the developer.

"At Rackspace, it is a real priority to constantly seek out new opportunities to connect our customers with quality services," says Jim Curry, vice president of corporate development at Rackspace Hosting. "This tool-sharing service, along with our commitments to collaborate with the community to build an open cloud, lays the foundation for a large ecosystem. With the recent release of our APIs, we've seen an influx of interest in developing applications for our cloud computing offerings and we're pleased to announce that such applications can be featured in our new Cloud Tools site."

PrimaCloud Taps Xsigo I/O Director For Cloud Strategy

A case study of how PrimaCloud, a cloud computing service provider, has adopted Xsigo's I/O Director to improve the automation and efficiency of its cloud computing services.

PrimaCloud Taps Xsigo I/O Director For Cloud Strategy



PrimaCloud announced today that it has deployed the Xsigo I/O Director as the foundation of its data center interconnect strategy.

The Xsigo I/O Director enables PrimaCloud to break the barriers of I/O throughput seen in existing cloud computing offerings, allowing end customers to experience application performance levels that would previously have been achievable only in purpose-built private datacenters. Additionally, Xsigo's virtual I/O infrastructure allows PrimaCloud to automatically provision cost-effective virtual private datacenters for its customers within minutes.

With three years of experience providing cloud computing under the ENKI name, PrimaCloud management was seeing an increasing number of enterprise clients running database and transaction-intensive applications — such as Oracle — which required multi-gigabit connections to the vLAN and NAS storage to avoid I/O contention. In these applications the bandwidth required per virtual machine exceeded 3Gb/sec, which meant that a single cloud server running ten virtual machines required over 30Gb/sec of total I/O bandwidth. Only Xsigo virtual I/O, with dual redundant 20Gb/sec I/O connections per server, provided the required performance. The Xsigo I/O Director's low-latency bandwidth also serves to take maximum advantage of PrimaCloud's SSD-cached NAS systems to deliver outstanding application throughput.

PrimaCloud's automatically managed, hypervisor-agnostic cloud architecture requires a high level of automation to create and manage virtual private datacenters (VPDCs). The Xsigo I/O Director is able to automatically provision and manage virtual I/O and VLANs associated with virtual instances running VMWare ESX, Citrix, HyperV, and 3Tera's AppLogic, under the control of PrimaCloud's implementation of Enigmatec's EMS, a cross-platform, policy-based automation engine. Using EMS to configure the I/O Director eliminates manual labor in managing VPDCs, as well as permitting automatic scaling of VPDCs to respond to changes in load, resulting in significant end-customer cost savings.
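Neither the Xsigo nor the Enigmatec EMS APIs are documented in the announcement, so the following is only an illustrative sketch of what policy-based VPDC scaling of this kind might look like; the vpdc client object and its methods are hypothetical.

```python
# Illustrative sketch only: the article does not document the Xsigo or
# Enigmatec EMS APIs, so "vpdc" below is a purely hypothetical client.
# It shows the general shape of policy-based scaling of a virtual
# private datacenter (VPDC): watch load, then grow or shrink capacity.
import time

SCALE_UP_THRESHOLD = 0.80    # average utilization above which we add a VM
SCALE_DOWN_THRESHOLD = 0.30  # average utilization below which we remove one

def autoscale(vpdc, min_vms=2, max_vms=10, interval_s=60):
    """Simple polling control loop; real engines such as EMS are event/policy driven."""
    while True:
        load = vpdc.average_utilization()   # hypothetical call
        count = vpdc.instance_count()       # hypothetical call
        if load > SCALE_UP_THRESHOLD and count < max_vms:
            # Adding a VM implies provisioning its virtual I/O and VLAN
            # membership as well, which the I/O Director automates.
            vpdc.add_instance()
        elif load < SCALE_DOWN_THRESHOLD and count > min_vms:
            vpdc.remove_instance()
        time.sleep(interval_s)
```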

Xsigo virtual I/O is a critical element of the reference datacenter architecture PrimaCloud uses to deliver on the promise of cloud computing: cost-effective, on-demand computing delivered on a pay-as-you-go-basis while meeting enterprise requirements for performance and uptime. The simplicity of deploying Xsigo Virtual I/O has enabled PrimaCloud's reference architecture to be deployed in any one of its 65 datacenters worldwide for public, or hosted private cloud computing, as well as at customer sites.

Virtualization.com / Tue, 25 Aug 2009 23:18:16 GMT


Will Amazon’s Virtual Private Cloud Be Private Enough?

Commentary on Amazon's launch of an enterprise-oriented offering called the Virtual Private Cloud service.
In the end it provides the same S3 and EC2 services as before, reached over a logical dedicated line protected by a VPN. Compared with the enterprise solutions that IBM and Microsoft are pursuing, Amazon remains focused on providing raw service (machine environments delivered as-is), and in keeping a clear line between itself and other cloud service companies, nothing has changed.

Will Amazon's Virtual Private Cloud Be Private Enough?

Amazon last night announced its Virtual Private Cloud service, essentially giving enterprise customers worried about security and control in the cloud a salve to get them to trust it. The offering provides access to Amazon's web services through a virtual private network, which is basically a secure tunnel through the Internet from a corporate network to Amazon's servers. It's like having a private line to Amazon's cloud as opposed to a party line.

The virtual part of this announcement is key. The Amazon offering isn't a pledge to put all of your data on a physically separate system — it's all secluded at the network level using the virtual private network. So the information in Amazon's cloud will still be shared with other companies' data on the actual servers. By doing this, Amazon is trying to preserve the benefits of sharing fully utilized servers in a true cloud that can scale, but still provide enterprise customers with the peace of mind that they can lock down some of that data, at least while it travels to the cloud. Amazon is trying to offer the economic benefits of cloud computing in a palatable format for businesses that are weighing whether or not they should try to build their own in-house cloud infrastructures. Amazon CTO Werner Vogels explains in his blog:

These CIOs know that what is sometimes dubbed "private cloud" does not meet their goal as it does not give them the benefits of the cloud: true elasticity and capex elimination. Virtualization and increased automation may give them some improvements in utilization, but they would still be holding the capital, and the operational cost would still be significantly higher.

With this announcement, Amazon is trying to get a jump on its competitors that are gunning for corporate customers. While many big businesses have used Amazon Web Services, most perceive it as being insufficiently secure for important or confidential data. Companies such as Microsoft, IBM and Rackspace are trying to find the right mix of scale and security for enterprise clients. Microsoft is building its own platform and infrastructure-as-a-service offering called Azure; IBM is creating several gradations of a private cloud, from something deployed inside a corporation's own data center to a service delivered from Big Blue's data center; and Rackspace is hoping security-minded customers use its dedicated hosting that can scale up to the Rackspace cloud.

Michael Crandell, CEO of RightScale, which provides cloud management software, tried to explain a bit more what Amazon is trying to do with the Virtual Private Cloud, which, by the way, costs an extra 5 cents an hour per VPN:

Something that initially puzzled me is what the benefits of a VPC are when all the marketing fluff dissipates. Here is what I've learned. First, instances in the VPC are separated from non-VPC instances at a deeper network level than instances in different security groups or belonging to different users. As is typical, Amazon doesn't say anything of substance about the nature of this isolation. Let's see how soon that will have to change to actually attract enterprises…Second, instances in the VPC can seamlessly integrate into a company's internal network routing. This is significant because it means that tools used to inventory, secure, audit, manage and access all servers in the IT infrastructure can now be brought to bear on instances in the cloud as well.

The Amazon offering is different from IBM and Microsoft's efforts in that it provides access to the raw infrastructure, rather than a service. Both Microsoft and IBM, especially Big Blue, are betting that enterprises will demand services, such as IBM's workload specific offerings delivered from a cloud, or Microsoft's SQL Azure, rather than access to the raw infrastructure. In the next few years, more enterprise computing jobs will shift to some of these companies, and by creating a Virtual Private Cloud offering, Amazon makes sure it stays relevant and can get a slice of the enterprise pie. The question will be whether or not businesses find Amazon's Virtual Private Cloud private enough.



GigaOM / Wed, 26 Aug 2009 14:55:00 GMT


Thursday, August 27, 2009

35+ Amazon S3 Tools and Plugins for Windows, Mac and Linux | About Online Tips

A wide range of vendors now offer tools for accessing Amazon Web Services' S3 service.
A major factor is how deeply AWS has penetrated the market, but it also shows that S3 use cases have become well defined, or rather, have settled into established patterns.
 

Amazon S3 provides a simple web services interface that can be used to store and retrieve any amount of data, at any time, from anywhere on the web. Amazon S3 is also a cheap and cost-effective way to handle a sudden increase in traffic (such as from Digg, Technorati or StumbleUpon). Many bloggers use Amazon S3 to host their images and other static media (such as mp3 files for their podcasts). Accessing and managing an S3 account, however, is not as simple as it seems. Below is a list of Amazon S3 tools for accessing and managing S3 accounts on Windows, Mac and Linux computers.
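As a point of reference for what that "simple web services interface" looks like in code, here is a minimal sketch using the modern boto3 SDK. It is one of many client libraries, not one of the tools listed below, and the bucket, key and file names are placeholders.

```python
# Minimal sketch of the S3 store/retrieve interface the tools below wrap,
# using the modern boto3 SDK. Bucket, key and file names are placeholders.
import boto3

s3 = boto3.client("s3")

# Store an object (e.g. a podcast mp3 or a blog image).
with open("episode-01.mp3", "rb") as f:
    s3.put_object(Bucket="example-bucket", Key="media/episode-01.mp3", Body=f)

# Retrieve it again from anywhere on the web (given the right credentials).
data = s3.get_object(Bucket="example-bucket", Key="media/episode-01.mp3")["Body"].read()
print(len(data), "bytes downloaded")
```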

 

  1. S3Sync – Easily transfers directories between a local directory and an S3 bucket. s3sync runs on Linux and Windows.
  2. Quillen S3 Backup – Quillen backs up your important files to Amazon S3 with minimal data transfer and storage. Supports a command line interface.
  3. WinS3fs – Allows access to Amazon S3 storage by implementing a local virtual SMB server. Supports the Windows filesystem.
  4. S3Tools – Open source tools to access Amazon S3 file storage. s3cmd is a Unix-like tool to manage stored files from the command line, s3browser allows you to view stored files in a browser, and s3fuse helps you mount the S3 storage locally.
  5. Js3Tream – Lets you easily back up files to Amazon S3 storage using Windows, Linux or OS X and allows you to transfer files to and from Amazon. Great for backups using TAR or ZIP.
  6. River Drive – Helps you store files and back them up to a safe, encrypted environment. River Drive is a Windows and Linux GUI and CLI interface to Amazon Simple Storage Service (S3).
  7. Bonkey – Tool for backing up files to multiple locations, including Amazon's S3. Runs on Windows, OS X and Linux.
  8. Memba Velodoc Outlook Add-In – This Amazon S3 plugin allows you to send large files from within Microsoft Outlook 2003 or above using various server platforms including Velodoc XP Edition, Velodoc Enterprise Edition, BITS servers, FTP servers, UNC file shares and Amazon S3.
  9. S3 Bucket Master – Allows you to create or delete Amazon S3 buckets, browse stored keys, upload or download encrypted or compressed files and helps you in managing Access Control Lists in amazon.com's Simple Storage Service (S3).
  10. S3-Util – Command line utility to manage data and buckets on an Amazon S3 account. It provides a getopt interface and portable xml configuration for easy usage and integration with scripts.
  11. Google Desktop File Storage  – Allows you to drag and drop files onto it for storage and retrieval. Files are stored in various backing store implementations, including Local File System, Amazon S3 etc. Supports Windows.
  12. S3Fish – It is an open source bucket explorer tool for Amazon S3.
  13. cloudbuddy – An open source, Windows-based bucket explorer tool for Amazon S3.
  14. S3cmd – Command line tool to upload, retrieve and manage data in Amazon S3.
  15. S3Fuse – Tool to mount S3 buckets as a local resource on Unix/Linux systems.
  16. S3Drive – Allows you to access Amazon S3 web storage space as a local network drive. S3Drive is a Windows application.
  17. S3Fox – A Firefox browser extension with which you can synchronize files between a local folder and Amazon S3 automatically. It resembles an FTP client, with two panes to operate. You can upload and download multiple files, manage S3 CloudFront distributions, synchronize folders with the local system, modify Access Control Policies, create time-limited URLs, and create, edit and use multiple S3 accounts. S3Fox supports Windows, Linux and Mac.
  18. Sync2S3 – Commercial tool that helps you to manage S3 buckets and resources through the browser. 
  19. S3Browse – You can use this website to manage your Amazon S3 account.
  20. Amazon S3 plugin for WordPress – Allows you to easily use Amazon S3 with your WordPress blog. The S3 bucket can be easily accessed from the WordPress admin screen.
  21. Flickr to S3 Backup Tool – This open source tool allows you to back up your Flickr photos to your Amazon S3 account.
  22. JetS3t – A Java toolkit for using Amazon S3 that comes with two tools, Synchronize and Cockpit. Synchronize syncs files between your S3 account and your local computer. Cockpit (GUI interface) lets you upload and download files as well as manage ACLs.
  23. Amazon S3Fox – This Firefox add-on provides an FTP-like interface (in the style of Windows Explorer) to upload and manage files on S3.
  24. S3 Backup – As the name suggests, it is a free, encrypted online backup solution for uploading or downloading files from S3 - just pick a file or a folder on the local hard drive to put it on S3.
  25. Transmit – It is a popular FTP client for Mac OS X that allows you to upload, download and manage your Amazon S3 storage like a Mac app.
  26. Jungle Disk – It provides a web interface to manage your files on S3 and is available for Mac, Windows and Linux. Jungle Disk helps you in taking backup of files and folders from your local hard drive onto Amazon S3 automatically. Excellent tool for slower bandwidth connections.
  27. WordPress Plugin – This WordPress plugin allows you to use Amazon's Simple Storage Service to host your media (files, images, video or audio files) for your WordPress powered blog. The plugin helps you browse Amazon S3 hosted files, upload new files, and create new folders without having to leave the WordPress edit screen.
  28. Bucket Explorer – Bucket Explorer is the best of all the "front ends" for S3. It helps you browse buckets and the files stored at Amazon S3 and schedule upload, download, and copy operations. Files and folders can easily be copied or moved within the same S3 account or to a different account. The utility also tracks account activity with bucket logging and a SimpleDB audit report. Public URLs and signed URLs can be created to share files (a minimal signed-URL sketch appears after this list). If you accidentally delete a file or a bucket from your S3 account you can recover it from Bucket Explorer's recycle bin, a unique feature not available elsewhere. This powerful S3 tool is not free and costs about US$49.99 for a single license.
  29. S3 Browser – A client that provides a simple interface for uploading and downloading files to and from Amazon S3. You can easily browse, create or delete Amazon S3 buckets. The utility allows you to edit custom headers and set access control on buckets and files easily. You may also keep the backed-up data in multiple data centers.
  30. Cross FTP – Tool to transfer files between Amazon S3 storage and your local computer, much like an FTP client. You can create buckets, set access control, schedule batch transfers, or even change the CloudFront distribution settings of individual files with Cross FTP.
  31. S3 for Live Writer – Plugin to upload images, documents and other files to your Amazon S3 account directly from Windows Live Writer. You can also browse the existing files on your Amazon S3 account. Handy tool for Bloggers.
  32. CloudBerry Explorer – With this tool you can manage files in Amazon S3 storage easily through a user interface for multiple Amazon S3 accounts, files, and buckets. You can set up bucket logging to track usage, control file permissions, and delete or temporarily disable CloudFront distributions. Full review here.
  33. Gladinet – The tool helps you in mounting Amazon S3 folders to Windows Explorer. You can then access S3 buckets and files to drag and drop files between Amazon S3 and Windows Explorer. Integrates well with SkyDrive, Google Docs, Google Picasa, ADrive and many more.
  34. S3Anywhere – An Amazon S3 file manager for Google Android handsets. It allows you to manage S3 buckets and download and upload files to Amazon S3. The file manager allows renaming, deleting, and copying files, and you can view permissions (ACL) on each file.
  35. OpenS3 – The utility allows you to synchronize your S3 files on iPhone / iPod Touch and edit office documents (Word, PowerPoint, Excel) via the integrated Zoho office suite. You can also view your PDFs instantly through its integration with iPaper, as well as view photos, share files, and collaborate using business groups easily.
  36. Bucket Upload – It is a free application for smart mobile devices like Nokia N Series that allows uploading files to an Amazon S3 bucket. BucketUpload provides a file browser to list folders from local file system, phone memory and SD card.
  37. S3 Solutions – It is a list of other S3 related tools on the Amazon Developer Connection.
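Several of the tools above (for example items 17 and 28) expose "time-limited" or "signed" URLs for sharing private objects. Here is a minimal sketch of the same idea using the modern boto3 SDK; the bucket and key names are placeholders, and the listed tools implement this through their own interfaces rather than this code.

```python
# Sketch of the "time-limited URL" feature several tools above expose,
# using boto3's presigned URLs. Bucket and key names are placeholders.
import boto3

s3 = boto3.client("s3")
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "example-bucket", "Key": "media/episode-01.mp3"},
    ExpiresIn=3600,  # the link stops working after one hour
)
print(url)
```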

Tuesday, August 25, 2009

F5 Study Shows Cloud Computing Gaining Critical Mass Among Large Enterprises : VMblog.com - Virtualization Technology News and Information for Everyone

F5 Networks has published a survey report examining how far cloud computing has penetrated enterprises and what challenges it faces.

Interestingly, while 66% of respondents have set aside IT budget for cloud computing, their definitions of cloud computing vary widely.

In addition, 90% of companies cited security concerns such as access control as a major challenge to adopting cloud computing.

F5 Networks, Inc., the global leader in Application Delivery Networking (ADN), today announced the results of a survey that shows how large enterprises are implementing cloud computing. The study reveals that among large enterprises, cloud computing is gaining critical mass, with more than 80 percent of respondents at least in trial stages for public and private cloud computing deployments. Additionally, despite the maturing rate of adoption of cloud computing among enterprises, the study shows that there is considerable confusion and concern around the definition of cloud computing.

"It's no surprise that large enterprises are attracted to cloud computing because of the promise of an agile, scalable IT infrastructure and reduced costs," said Jason Needham, Sr. Director of Product Management at F5. "However, this survey shows that despite interest in the cloud, widespread enterprise adoption of cloud computing is contingent upon solving access, security, and performance concerns. As organizations turn to the cloud to increase IT agility, it is important for them to understand the technical components of the cloud and how the cloud will affect the network before developing an implementation strategy."

The survey found that IT managers are aggressively adopting cloud computing. Half of respondents reported that they have already deployed a public cloud computing implementation. In addition, private cloud computing models are also enjoying broad acceptance in enterprises, with 45 percent of respondents currently using private clouds. Consequently, cloud computing is also meriting budgetary consideration, with 66 percent of respondents indicating that they have a dedicated budget for cloud computing initiatives.

Although cloud computing is gaining rapid adoption, respondents had little agreement on how to define the term. The survey tested six industry definitions of cloud computing and found that survey participants were unable to choose any of them as being "just right." As part of the study, F5 conducted a focus group with enterprise IT managers, network architects, and cloud service providers to determine a strong definition for cloud computing. The focus group arrived at the following definition:

 

         

Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service. Users need not have knowledge of, expertise in, or control over the technology infrastructure in the "cloud" that supports them. Furthermore, cloud computing employs a model for enabling available, convenient, and on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

         

Other findings from the survey include:

  • Cloud computing is more than SaaS – Although Software as a Service (SaaS) is an important component of cloud computing, respondents ranked SaaS behind Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) as the most important components of cloud computing.
  • Core cloud computing technologies – Enterprises employ a wide range of technologies in their cloud computing platforms. Access Control was the top concern, with 90 percent of respondents naming it as somewhat or very important for building the cloud. Network Security and Virtualization were also named as key technologies.
  • Cloud computing influencers – The people within the enterprise that influence cloud computing decisions go well beyond IT. Respondents named IT, application development, and line of business (LOB) stakeholders as the top influencers for cloud computing decisions.

F5 and Cloud Computing

F5 offers the enabling solution for enterprises and cloud providers to implement dynamic, scalable IT infrastructures by consolidating and virtualizing vital application resources. By ensuring the availability, security, and acceleration of application and data services within cloud environments, F5 solutions provide the visibility and control necessary to maximize the value of cloud computing investments.

About the Survey

The 2009 F5 cloud computing survey was commissioned by F5 Networks to highlight business trends regarding cloud computing. Conducted by independent market research firm Applied Research West during June–July 2009, the survey polled more than 250 IT managers in large enterprises throughout North America.

Friday, August 21, 2009

On Appirio’s Prediction: The Rise & Fall Of Private Clouds

Commentary related to Appirio's prediction concerning private clouds.
Many articles question the raison d'être of private clouds in comparison with the public cloud market, and this post argues back against that view.

A point that applies not just to Appirio but to private cloud providers in general is emphasized: the value of private clouds comes from combining them with public clouds.


On Appirio's Prediction: The Rise & Fall Of Private Clouds

I was invited to add my comments to Appirio's corporate blog in response to my opinions of their 2009 prediction "Rise and Fall of the Private Cloud," but as I mentioned in kind on Twitter, debating a corporate talking point on a company's blog is like watching two monkeys trying to screw a football; it's messy and nobody wins.

However, in light of the fact that I've been preaching about the realities of phased adoption of Cloud — with Private Cloud being a necessary step — I thought I'd add my $0.02.  Of course, I'm doing so while on vacation, sitting on an ancient lava flow with my feet in the ocean in Hawaii, so it's likely to be tropical in nature.

Short and sweet, here's Appirio's stance on Private Cloud:

Here's the rub: Private clouds are just an expensive data center with a fancy name. We predict that 2009 will represent the rise and fall of this over-hyped concept. Of course, virtualization, service-oriented architectures, and open standards are all great things for every company operating a data center to consider. But all this talk about "private clouds" is a distraction from the real news: the vast majority of companies shouldn't need to worry about operating any sort of data center anymore, cloud-like or not.

It's clear that we're talking about very different sets of companies. If we're referring to SME/SMB's, then I think it's fair to suggest the sentiment above is valid.

If we're talking about a large, heavily-regulated enterprise (pick your industry/vertical) with sunk costs and the desire/need to leverage the investment they've made in the consolidation, virtualization and enterprise modernization of their global datacenter footprints and take it to the next level, leveraging capabilities like automation, elasticity, and chargeback, it's poppycock.

Further, it's pretty clear that the hybrid model of Cloud will ultimately win in this space with the adoption of BOTH Public and Private Clouds where and when appropriate.

The idea that somehow companies can use "private cloud" technology to offer their employees web services similar to Google, Amazon, or salesforce.com will lead to massive disappointment.

So now the definition of "Cloud" is limited to "web services" and is defined by "Google, Amazon, or Salesforce.com?"

I call this MyopiCloud.  If this is the only measure of Cloud success, I'd be massively disappointed, also.

Onto the salient points:

Here's why:

  • Private clouds are sub-scale: There's a reason why most innovative cloud computing providers have their roots in powering consumer web technology—that's where the numbers are. Very few corporate data centers will see anything close to the type of volume seen by these vendors. And volume drives cost—the world has yet to see a truly "at scale" data center.

Interesting. If we hang the definition of "at scale" solely on Internet-based volume, I can see how this rings true.  However, large enterprises with LANs and WANs with multi-gigabit connectivity feeding server farms and client bases of internal constituents (not to mention extranet connections) need to be accounted for in that assessment, especially if we're going to be honest about volume.  Limiting connectivity to only the Internet is unreasonable.

Certainly most enterprises are not autonomically elastic (neither are most Cloud providers today) but that's why comparing apples to elephants is a bit silly, even with the benefits that virtualization is beginning to deliver in the compute, network and storage realms.

I know of an eCommerce provider who reports trafficking in (on average) 15 Gb/s of sustained HTTP traffic via its Internet feeds.  Want to guess what the internal traffic levels are inside what amounts to its Private Cloud at that level of ingress/egress?  Oh, did I just suggest that this "enterprise" is already running a "Private Cloud?"  Why yes, yes I did.  See James Watters' interesting blog on something similar titled "Not So Fast Public Cloud: Big Players Still Run Privately."

  • There's no secret sauce: There's no simple set of tricks that an operator of a data center can borrow from Amazon or Google. These companies make their living operating the world's largest data centers. They are constantly optimizing how they operate based on real-time performance feedback from millions of transactions. (check out this presentation from Jeff Barr and Peter Coffee at the Architecture and Integration Summit). Can other operators of data centers learn something from this experience? Of course. But the rate of innovation will never be the same—private data centers will always be many, many steps behind the cloud.

  • Really? So technology such as Eucalyptus or VMware's vCloud/Project Redwood doesn't play here?  Certainly leveraging the operational models and technology underpinnings (regardless of volume) should allow an enterprise to scale massively, even if it's not at the same levels, no?  The ability to scale to the needs of the business is important, even if you never do so at the scale of an AWS.  I don't really understand this point.  My bandwidth is bigger than your bandwidth?

  • You can't teach an old dog new tricks: What do you get when you move legacy applications as-is to a new and improved data center? Marginal improvements on your legacy applications. There's only so much you can achieve without truly re-platforming your applications to a cloud infrastructure… you can't teach an old dog new tricks. Now that's not entirely fair…. You can certainly teach an old dog to be better behaved. But it's still an old dog.
  • Woof! It's really silly to suggest that the only thing an enterprise will do is simply move "legacy applications as-is to a new and improved data center" without any enterprise modernization, any optimization or the ability to more efficiently migrate to new and improved applications as the agility, flexibility and mobility issues are tackled.  Talk about pissing on fire hydrants!

  • On-premise does not equal secure: the biggest driver towards private clouds has been fear, uncertainty, and doubt about security. For many, it just feels more secure to have your data in a data center that you control. But is it? Unless your company spends more money and energy thinking about security than Amazon, Google, and Salesforce, the answer is probably "no." (Read Craig Balding walk through "7 Technical Security Benefits of Cloud Computing")
  • I've got news for you, just as on-premise does "…not equal secure," neither does off-premise assure such.  I offer you this post as an example with all its related posts for color.

    Please show me empirically that Amazon, Google or Salesforce spends "…more money and energy thinking about security" than, say, a Fortune 100 company.  Better yet, please show me how I can be, say, PCI compliant using AWS?  Oh, right…Please see the aforementioned posts…especially the one that demonstrates how the most public security gaffes thus far in Cloud are related to the providers you cite in your example.

    May I suggest that being myopic and mixing metaphors broadly by combining the needs and business drivers of the SME/SMB and representing them as those of large enterprises is intellectually dishonest.

    Let's be real, Appirio is in the business of "Enabling enterprise adoption of on-demand for Salesforce.com and Google Enterprise" — two examples of externally hosted SaaS offerings that clearly aren't aimed at enterprises who would otherwise be thinking about Private Cloud.

    Oops, the luau drums are sounding.

    Aloha.

    Related posts:

    1. Private Clouds: Even A Blind Squirrel Finds A Nut Once In A While
    2. Mixing Metaphors: Private Clouds Aren't Defined By Their Location…
    3. Private Clouds: Your Definition Sucks

    Rational Survivability / Wed, 19 Aug 2009 05:24:50 GMT


    SaaS: Only Three Public Companies Are Consistently Profitable

    According to Gartner research, only three companies are consistently profitable in the SaaS business. The main reasons cited for the lack of profit are high marketing and customer acquisition costs.

    SaaS: Only Three Public Companies Are Consistently Profitable

    Here's a stunning observation about the SaaS (software as a service) market: Only three publicly traded pure play SaaS companies generate sustained GAAP profits, according to Gartner Inc. Does that mean SaaS isn't living up to expectations?

    That depends on your perspective. On the one hand, SaaS-centric companies continue to fetch lofty valuations on Wall Street. And SaaS revenues continue to grow quickly for many companies. But on the other hand, SaaS providers also face lofty marketing expenses and customer acquisition costs.

    Plus, you have to keep Gartner's statement in perspective. The statement only looks at:

    • Publicly held SaaS companies — leaving scores of privately held SaaS businesses out of the conversation.
    • Pure play SaaS companies — leaving hundreds of hybrid companies (particularly Intuit, Microsoft, SAP, Oracle, etc.) out of the SaaS conversation.

    As I look at our own SaaS 20 Stock Index, I see numerous companies that have generated profits in recent quarters — but Gartner's statement focuses on sustained profits and pure SaaS players. Plus, I must concede: I'm not sure which three SaaS companies Gartner is indirectly praising. (Anyone care to guess?)

    Still, it's one heck of a reality check from Gartner. Thanks to Ken Vanderweel over at Nimsoft for mentioning a series of Gartner stats to me — including the SaaS stat. It's a real eye-opener.


    MSPmentor / Thu, 20 Aug 2009 14:18:15 GMT


    Thursday, August 20, 2009

    A PCI-Compliant Cloud? Not at Amazon

    A report pointing out that Amazon Web Services does not comply with PCI DSS, the industry standard governing the security of online credit card transactions.

    Amazon has addressed the issue on its own site, stating that, strictly speaking, PCI Level 1 compliance is not possible on AWS, but Level 2 can be implemented.

    Amazon is hardly alone: many cloud computing vendors do not support security standards such as PCI. A few enterprise-oriented hosting providers such as Terremark and Savvis have announced support, but there are many security standards besides PCI, and covering them all is difficult in practice.

    There's an ongoing debate about the ability of cloud computing services to meet enterprise regulatory compliance requirements, including the Payment Card Industry Data Security Standard (PCI DSS) standard that is essential for e-commerce. Martin McKeay at the Network Security Blog recently highlighted the admission by one of the most popular cloud services, Amazon Web Services, that it does not support the highest levels of PCI compliance.

    "From a compliance and risk management perspective, we recommend that you do not store sensitive credit card payment information in our EC2/S3 system because it is not inherently PCI level 1 compliant," an Amazon representative told a customer in an exchange that was posted on an AWS web forum. A key issue is that PCI auditors are unable to inspect Amazon's data centers. (Read on for additional information from Amazon on this issue).

    McKeay's post has prompted a fresh round of discussion of cloud computing's ability to support PCI DSS, even as recent data breaches have raised questions about the value of PCI compliance.

    Wednesday, August 12, 2009

    VMware Getting into PaaS with SpringSource Acquisition

    An analysis of VMware's acquisition of SpringSource, with details on the PaaS business.

    VMware Getting into PaaS with SpringSource Acquisition

    Hot on the heels of SpringSource's recent acquisition of Hyperic, VMware today announced their intention to acquire SpringSource. At first glance this move may seem puzzling: why would VMware want to buy an open source enterprise application development platform? Could it be for Hyperic, an open source IT management platform? I doubt it. I'd say it's all about planning for the future, a future where the OS no longer matters, a future where all applications are built, deployed and consumed via the Internet. Yes folks, I'm talking about Platform as a Service.

    In his post, VMware CTO Steve Herrod states that since its founding 11 years ago, VMware has focused on simplifying IT. More to the point, he says that "VMware has traditionally treated the applications and operating systems running within our virtual machines (VMs) as black boxes with relatively little knowledge about what they were doing."

    Moreover, I too believe that the operating system seems to get in the way more than it helps. Add in overly complex hypervisors and you've got several too many layers of abstraction, when we all know the real work gets done in the application layer. Everything else just subtracts from the end goal -- building and deploying scalable applications, which at the end of the day is the only reason to have any sort of IT infrastructure anyway.

    VMware even has a nice picture to illustrate their new PaaS initiative:

    [Image: diagram of VMware's PaaS initiative]

    The announcement goes on to outline "common goals for developers to easily build their applications and move from coding to production execution as seamlessly as possible… regardless of whether they will be deployed to a small internal datacenter for limited use or to a completely external cloud provider for much larger scale audiences (and the hopes of achieving Facebook application stardom!). This end state has a lot in common with what is today referred to as "platform as a service" (abbreviated PaaS). Salesforce.com's Force.com and Google's AppEngine are two of the best known examples of PaaS today."

    I believe that this is a very smart move for VMware. I find it even more interesting because both Hyperic and SpringSource are open source platforms. Does this mean that VMware is about to become an open source company? Probably not. My read is: #1, Hyperic is about to get shut down, as it's an unneeded asset for VMware; and #2, SpringSource becomes a focal point in VMware's cloud strategy, a strategy that sees itself becoming the key point of interchange when deploying to the cloud, be it an infrastructure-focused offering or a platform offering. VMware wants to be in the middle and now they will be. (Cloud interoperability is now more important than ever.)

    A few weeks ago Tom Lounibos, CEO of SOASTA, summed up the opportunity when asked "What's the future for Cloud 'IaaS' vendors?"...he replied..."becoming 'PaaS' vendors". So true.
     

    Tuesday, August 11, 2009

    The Role of VARs in SaaS

    As the SaaS market expands, the business model for systems integrators is being debated. An article on the possibilities.

    The Role of VARs in SaaS

    I just got back from Las Vegas from having participated in a panel for CompTIA Breakaway 2009 where the focus of the panel was on the role that VARs can play in the SaaS space.

    First I need to get something off my chest that has been boiling inside of me since my first few conversations with some of the attendees and VARs at the conference: Hosted Exchange or managed IIS servers are NOT SaaS; call it Managed Services or whatever you want, but it certainly is NOT SaaS, and certainly not PaaS for that matter.

    Now with that out of the way ;) it doesn't mean that VARs don't have a role in SaaS and frankly the reason why most VARs use the term SaaS interchangeably with Managed Services is because they really can't be bothered with the details that actually define SaaS; they simply care that you use it as you go and it lives outside of your premises. SaaS just happens to be today's cool buzzword, so why not ride the wave, right?

    Anyway, back to the point I wanted to make: VARs and channel players are normally considered the Trusted Expert for their customers, and as such they can still provide expertise on SaaS applications, arguably much more easily than with traditional on-premises applications.

    There are many ways that VARs can make money in SaaS, and a lot of them are the same ways they make money today; here are just a few:

    Commissions: Many SaaS vendors have referral programs where you can make a decent commission by a simple referral. Even if some of the vendors don't have an official referral program, I'm sure that most would be happy to talk to you about making something like that work for you if you are bringing them real business that wasn't already part of their pipeline.

    Training & Support: Become real experts of the solutions that you are proposing and go to the extent to offer training & support. Even if the SaaS vendors offer support and training on their own, you have the upper hand of up-selling your existing customers who already trust you as their loyal advisor plus it will make you that much better selling the solution since you know it so well.

    Integrations: Native SaaS applications are inherently built to scale (the ones that are properly architected, at least; if yours isn't, check out SaaSGrid for some help!). This requires the applications to be built as service-oriented applications, so by default most properly written SaaS applications will expose APIs for extensibility and integration via web services. If you are a technical enough shop, you can provide value-added services by extending the SaaS application or bridging two solutions far more easily than you normally could with an on-premises application.
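    As a concrete illustration of that kind of bridging work, here is a small, purely hypothetical sketch: the endpoints, field names and tokens below are invented, but the pattern of pulling records from one SaaS web service API and pushing them into another over HTTPS/JSON is the sort of value-added integration described above.

```python
# Illustrative only: the endpoints and field names below are hypothetical,
# but the pattern of pulling records from one SaaS API and pushing them
# into another over plain HTTPS/JSON is the kind of bridging a VAR can sell.
import requests

CRM_API = "https://crm.example.com/api/v1"          # hypothetical SaaS #1
BILLING_API = "https://billing.example.com/api/v2"  # hypothetical SaaS #2

def sync_new_customers(crm_token, billing_token):
    # Pull newly created accounts from the CRM service.
    new_accounts = requests.get(
        f"{CRM_API}/accounts?status=new",
        headers={"Authorization": f"Bearer {crm_token}"},
        timeout=30,
    ).json()

    # Push each one into the billing service so invoicing can start.
    for account in new_accounts:
        requests.post(
            f"{BILLING_API}/customers",
            headers={"Authorization": f"Bearer {billing_token}"},
            json={"name": account["name"], "email": account["email"]},
            timeout=30,
        ).raise_for_status()
```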

    Consolidation: If you are offering your customers multiple services, you can actually consolidate the services for them and give them additional benefits like central billing and the like.

    Clearly these are only a few ways that VARs can play a role in the space, and some changes to your core business model might need to be made, but what I'm really interested to know is what you think. Are you a VAR making money with SaaS? If so, what are you doing, how is it different from your traditional on-premises offerings, and how is it affecting your bottom line? If not, why not?

    Join us and share your thoughts through the comments. Also, if you'd like to mingle with others in the SaaS space, the SaaSBlogs group on LinkedIn now has 2400+ members and is growing every day; make sure you're not missing out and join today!

    SaaS Blogs / Fri, 07 Aug 2009 14:24:07 GMT


    Bifurcating Clouds

    A market forecast predicting that the cloud computing services market will split into two major streams, one aimed at consumers and one aimed at enterprises.

    Consumer-oriented offerings tend to emphasize price, while enterprise-oriented offerings emphasize functionality.
    It further predicts that VMware will hold an important position in enterprise solutions.

    Bifurcating Clouds

    Spectrum of Cloud Computing Providers


    There will soon be two major paths for cloud computing providers: commodity and premium.  If you read my series, Cloud Futures, you'll know that I broke down cloud service providers into three major categories: service clouds, consumer clouds (previously 'commodity')[1], and focused clouds. In retrospect I realize now that there are possibly four, not three major categories. The missing category is premium enterprise clouds. Previously I had lumped these under focused clouds, but I now realize that, in fact, there are likely to be so many of these that they deserve their own category. I'll go even further and suggest that in terms of markets targeted, there will really only be two ends of a spectrum: enterprise and non-enterprise.

    Most clouds will fit toward one end of this spectrum or the other. In essence, you're targeting small businesses (startups, SaaS providers, and SMBs) or you are targeting larger businesses (SME or Fortune 2000). The former are extremely cost conscious while the latter may have a number of other equally important drivers, such as security (e.g. VPN access), high availability (HA), SLAs, application portability without modification[2] and similar. Clearly large enterprises will consume services at both ends of the spectrum, but they will have many use cases (mostly 'production') that can only be serviced by a premium service running VMware's forthcoming vCloud product.

    This means we will have a large bifurcation in the cloud computing space with two very different kinds of solutions. Clouds will either target commodity customers or premium customers.  Very few clouds will actually fit in the middle of this spectrum initially, although I expect providers on both sides will grow towards the middle.  In quite a few cases (AT&T and Rackspace come to mind) cloud providers will build two offerings at both ends of the spectrum, but we haven't seen this quite yet.

    Premium vs. Commodity

    Ultimately, commodity clouds will be forced via pricing pressure to continue to drive down capital expenses and operating costs.  As we can already see in the public cloud space, providers have largely standardized on the Xen open source hypervisor.  This is the de facto standard because it is free.  In contrast, premium enterprise clouds will necessarily spend more on their infrastructure to provide advanced features like HA.  Their pricing will reflect this, but it also means they will use VMware's products and hence have unique opportunities for integrating with internal clouds at large enterprises (more on this below).

    This table summarizes the differences.

                                 Commodity    Premium
    Focus                        Price        Value
    Hypervisor                   Xen          VMware ESX/vSphere
    Pricing                      $            $$$$
    "Enterprise" Features        No           Yes; lots
    Your App Needs Changing?     Yes          No

    Enterprise Clouds Are Already Here

    [Screenshot: slide show of Terremark's Enterprise Cloud]

    If you were paying close attention this year, you'll have noticed that both Savvis and Terremark are working on or have delivered enterprise cloud offerings.  There are many more on the way.  These providers are delivering VMware-based platforms specifically for enterprise customers and pricing reflects that[3].  Terremark even labels itself 'The Enterprise Cloud'.  I had hoped to release a full review of Terremark, but due to time constraints haven't been able to complete it.  If you click on the screenshot to the right it will take you to a set of Flickr photos that are an extensive tour of the Terremark Enterprise Cloud product.

    What's most interesting about this is that two major players have entered into this space and at the same time VMware's vCloud is unreleased.  Nor are there any other shrink-wrapped software packages for building a cloud based on VMware.

    VMware's Dominant Position for Building Internal Clouds

    But why VMware?  What's so important about it?  For those of you who may not be aware, VMware's enterprise-class hypervisor (ESX) is the de facto standard inside the enterprise, in much the same way the Cisco routers & switches are a standard.  This means that as enterprises move towards building internal clouds (an inevitability), they will be more likely to build clouds based on VMware's ESX, which they are already comfortable with.

    A-ha! Surely there is a startup or major player who has already delivered a software offering that allows enterprises to build their own internal clouds?

    No. There is no credible contender to VMware's crown.  Even though they did not see cloud computing coming, even though they are a large organization and slow to move, there is still not a single credible contender with a released product that manages the VMware ESX hypervisor and allows you to build a real self-service internal cloud.  Nada.  Zip.  Zilch.

    There are some prospects like Platform's ISF[4] that could be contenders, but by the time they are released in the wild, VMware's vCloud will also be released.  The window of opportunity for making significant inroads into the enterprise is closing quickly[5].  Once VMware's vCloud is released, who will risk averse IT managers and CIOs in enterprises go to?  A new player or someone already trusted and embedded like VMware?  There is no doubt.  They will largely select vCloud unless VMware fails to execute.

    Can VMware Fail to Execute?
    Is it possible for VMware to fail to execute in its sweet spot?  Its area of expertise?  Yes.  Is it likely?  No.  If you look at the DNA of the business, they already have the kinds of talent necessary for building a strong product in their acquisition of Akimbi, the folks on whose work the VMware Lab Manager product was built.  That team already knows how to build a self-service portal and a large-scale VM deployment system, including a scheduler, as these were integral to the Lab Manager product.

    In other words, the writing is on the wall.

    The Power of Internal + External Clouds
    For many smaller businesses, moving everything to the cloud will always be a very compelling solution, but for the enterprise it will never be acceptable.  For various reasons (regulatory, political, legal, and others) enterprises must maintain a certain amount of infrastructure.  Also, I've heard fairly compelling arguments that large enterprises have sufficient scale to build and operate their own clouds at a cost advantage to external clouds.  Regardless, some capacity will reside outside of the firewall.

    The usage of external clouds will largely be dictated by use case and in order for enterprises to derive maximum value from both internal and external clouds they will want a single internal portal that manages both.  They will want minimal friction for internal customers to be able to pick the best cloud for the job/cost.  It will also be important to allow some amount of portability (moving VMs and their workloads across the firewall).

    While this doesn't require a VMware hypervisor on both sides of the firewall, it will be greatly facilitated if that is the case.  Tools written against the vCloud API will likely work with vCloud-based external clouds without modification.  There is simply far too much synergy possible once both internal and external clouds are based on the same cloud platform.
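    To make the single-portal idea concrete, here is a deliberately hypothetical sketch; the class and method names are invented for illustration and are not the vCloud API. Internal and external clouds expose the same interface, and a simple policy places each workload on the cheapest cloud that satisfies its constraints.

```python
# Hypothetical sketch of a "single portal over internal + external clouds":
# both providers expose the same interface, and a simple policy picks the
# cheapest cloud that satisfies a workload's constraints. All names here
# are invented for illustration; this is not the vCloud API.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    must_stay_onsite: bool   # e.g. a regulatory or legal constraint
    cores: int

class Cloud:
    def __init__(self, name, internal, cost_per_core_hour):
        self.name = name
        self.internal = internal
        self.cost_per_core_hour = cost_per_core_hour

    def deploy(self, workload):
        print(f"deploying {workload.name} ({workload.cores} cores) to {self.name}")

def place(workload, clouds):
    # Constrained workloads may only use internal capacity; otherwise pick by cost.
    candidates = [c for c in clouds if c.internal or not workload.must_stay_onsite]
    best = min(candidates, key=lambda c: c.cost_per_core_hour)
    best.deploy(workload)
    return best

clouds = [Cloud("internal-vmware", True, 0.12), Cloud("external-vcloud", False, 0.08)]
place(Workload("batch-report", must_stay_onsite=False, cores=8), clouds)
place(Workload("payroll-db", must_stay_onsite=True, cores=4), clouds)
```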

    Summary
    There will be two paths for clouds: premium & commodity.  Premium clouds will focus on the enterprise and on delivering the value enterprises care about.  Commodity clouds will largely be forced to compete on pricing and on features irrelevant to the enterprise.  VMware's vCloud will be the dominant player behind the firewall because there is no credible contender.  The synergistic effects of internal & external clouds being based on the same vendor's software will provide powerful and compelling reasons for enterprises to adopt those external clouds.  Enterprises will use commodity clouds, but mostly for batch processing and non-production workloads that are price sensitive.  The bulk of enterprise cloud spending will be on vCloud-based public cloud providers.


    [1] In retrospect, I realize that I should have chosen a better name than 'commodity clouds'.  To avoid confusion in this article, I'm going to call them 'consumer' clouds.  Any suggestions?
    [2] This is still pretty much impossible for Amazon to do for many architectures.  When you go to the Amazon or Google 'clouds' you're making a choice to port your application.  Some clouds, like GoGrid with their CloudCenters, do make portability easier.
    [3] I didn't get it nailed down for this article, but if memory serves, Terremark's entry-point offering is about $2,150/month for 10 cores, 10GB RAM, and 100GB storage, divided up however you like across up to 10 servers.  You can add more of each incrementally, and there are volume discounts.
    [4] Platform has been delivering grid solutions, very similar to the technology that powers today's clouds, for many years and has great DNA to build a compelling offering.
    [5] Honestly, it's probably already a done deal.


    Cloudscaling / Thu, 06 Aug 2009 15:30:19 GMT


    Saturday, August 8, 2009

    Five Reasons Why Oracle Will Enter the Cloud Computing Business

    An analysis based on IDC research.  It lists five reasons why Oracle is steadily preparing to enter the cloud computing business.
    ________________________________

    July 17, 2009 - IDC Link

    By: Jean Bozman

    Cloud computing is an industry-wide hot topic for discussion, building on virtualization and a modular software stack — and aimed at providing cloud services to end-users on a pay-as-you-go basis. At last fall's Oracle OpenWorld conference in San Francisco, Oracle CEO Larry Ellison said that cloud computing was not an immediate focus for the company — at least at the time — and that the definitions of cloud computing were as variable as fashions in apparel. But things have changed in the industry: the focus in cloud computing has shifted from application development and tire-kicking to enterprise computing and private clouds within the walls of the IT datacenter.

    Now, Oracle executives are mentioning cloud computing more often, including on a recent quarterly call with Wall Street analysts. Importantly, the Oracle website now includes a reference page for cloud computing. The company is speaking more often about software as a service (SaaS), with enterprise apps delivered via the Internet, as components of enterprise cloud computing. Following up on some quick comments about on-demand SaaS offerings during Oracle's quarterly call with financial analysts on June 23, 2009, IDC believes that a number of factors show Oracle is well-positioned to compete in the cloud computing market, starting with on-demand application services for use in IT private clouds, and extending into provisioning of cloud computing on behalf of service provider partners (public clouds) — or directly, as a future set of Oracle cloud services.

    Here are five reasons why:

    • Oracle's rich application portfolio could feed a specific cloud computing need for pay-by-the-drink application use. Oracle already supports SaaS efforts, such as Oracle on Demand. Cloud computing would take this approach one step further, by providing the infrastructure elements that support cloud-enabled provisioning of application services, and doing so in a multi-tenant context. One example: Oracle recently announced 8 new Oracle CRM on Demand offerings.

    • The next wave of cloud computing will feature cloud-enabled storage services, building on the concept pioneered by Amazon Web Services' (AWS) S3 storage offering. Clearly, Oracle has already considered this type of cloud computing service, and last fall announced that Oracle DataGuard would be available to web developers building cloud-enabled archiving solutions on AWS' S3 cloud. IDC believes that the same concept could be developed more broadly, aimed at more types of end-users, in 2010.

    • Oracle's pending acquisition of Sun Microsystems will bring the company a portfolio of cloud computing building blocks. These include Java, Solaris, OpenSolaris, and the Sun xVM framework for managing virtualized workloads. Sun xVM supports multiple operating systems as guests — including Sun Solaris/OpenSolaris, Microsoft Windows and Linux. All of these would complement Oracle's Linux virtualization offerings: Oracle Virtual Machine (OVM) and Oracle Enterprise Linux (OEL) for virtualized environments. Oracle's OVM is designed to optimize Oracle products' performance on virtualized IT infrastructure.

    • The Cloud and High Availability (HA) will be a big theme for cloud computing, going forward, because data can be parked or archived on the cloud. Large companies such as Symantec, Microsoft and IBM have announced products and services that will tap into this emerging, but promising, market. When customers place their replicated data on the cloud, that frees them from concerns about disaster recovery — or data recovery — at their local sites, even in the face of natural disasters and power outages taking data offline. Retrieval of that data, once it is parked in the cloud, is not instantaneous, but it will still be the basis for a business continuity process of backup/recovery and restart of applications tapping that data — with security and availability. IDC believes Oracle's wealth of software to manage Oracle data on storage devices could be leveraged to support cloud computing HA initiatives — and that kind of service could be delivered by Oracle, by enterprise IT organizations (private cloud) or by service provider partners (public cloud) supporting Oracle technology in their infrastructure. In addition, Oracle Application Grid offers a foundation for customers to deploy applications within private clouds for scale-out capacity and flexibility.

    • Ability to scale up via cloud services. The ability to scale up services, on demand, is key to the cloud now — and will be key to the enterprise cloud. The combination of Oracle and Sun technologies will make it possible to do so on a variety of hardware/software platforms. In its initial discussion of the aims of the acquisition, Oracle talked about optimizing performance on storage or server appliances — both of which could be engines for cloud computing. Overall, IDC projects that IT spending on cloud infrastructure will be nearly 10% of total IT infrastructure spending in 2012, compared with less than 5% in 2008; however, IDC will continue to update that outlook as 2009 progresses.


    What does all of this mean for Oracle? Following the Sun acquisition, Oracle will be able to build vertical software stacks that extend from the hardware platform itself, to the operating system (e.g., Solaris, OpenSolaris, and Oracle Enterprise Linux, which is based on Red Hat Linux), to middleware (Oracle Fusion Middleware), and then to Oracle Applications and Oracle Database at the top of the stack. As the next wave of computing ramps up, accelerating the use of enterprise applications in cloud infrastructure, Oracle's applications and database products will be key workloads in the cloud — no matter who provides those cloud services (IT organizations or service providers). It appears that Oracle is in the process of assembling an important inventory of cloud computing technologies for enterprise computing, although that segment of the market will take years to mature. Still, the shape and exact details of future announcements surrounding Oracle's cloud offerings have yet to be put in place. As for how, where and when these technologies will get deployed into the marketplace, prior to any formal finalization of the Sun acquisition, we can only say: Stay tuned.

    Subscriptions Covered:

    Enterprise Servers: Technology Markets, Operating Environments, Enterprise Virtualization Software