


This site publishes blog-style coverage of technology and business trends in the U.S. cloud computing market. Posts draw on a range of American blogs and articles, with personal commentary added. We also accept projects for more detailed individual research; please contact us by email and we will respond promptly.
Authentication and credential management provider ActivIdentity announced Tuesday that it will launch new cloud computing services to help web-based security service providers defend against threats and identity fraud.
ActivIdentity recently acquired CoreStreet, an identity validation provider, and will integrate that firm's offerings into its cloud computing services. Company representatives said that the security initiative's subscription-based service model could be an attractive option for businesses that are already facing budget constraints.
"Our cloud computing strategy was inspired by feedback from our customers and service providers, indicating a growing desire to move from security solutions to security services," said ActivIdentity chairman and chief executive Grant Evans.
The new web-based security services could appeal to vertical markets in a number of different sectors where identity validation and protection are essential, including government and financial services, the company noted.
A number of analysts have remarked on the quality of security services offered in the cloud. Gartner found in one study that SaaS-based security services have the potential to offer faster scalability at lower costs than equivalent-capacity onsite solutions.
There had been speculation that Apple Inc. (NASDAQ: AAPL) would start a monthly subscription service for iTunes, rather than keeping the current $1.29-per-song business model.
Apple has put that speculation to rest by stating that its acquisition of Lala was not made to start an iTunes subscription service but to provide cloud computing services for iTunes users. Lala offers a personal music storage service, which Apple will use as the basis of those cloud services: Lala's setup process provides software to store a personal music library online, which users can then play from any web browser alongside the web songs Lala sells.
The next version of iTunes will allow users to upload their catalogs to the net and access them from any browser or net-connected iPod touch or tablet, an upgrade made possible by Lala's technology. Once it is in place, users will be able to navigate and play their music, videos, and playlists from a personal URL using a browser.
In this way, iTunes itself would become a cloud computing service.
Microsoft and Intuit Strike Cloud Computing Pact
The deal calls for Azure to be an Intuit preferred platform
JANUARY 24, 2010 07:00 PM EST
Microsoft and Intuit are going to join their clouds, Azure and the Intuit Partner Platform (IPP), so developers can deliver and market web applications to the 27 million QuickBooks-using small businesses through the Intuit App Center. The integration also means that small businesses can use Microsoft's cloud-based productivity applications via the Intuit App Center, presumably heading off some losses to Google Apps and Zoho.

The deal calls for Azure to be an Intuit preferred platform. A free Azure beta SDK that will federate applications developed on Azure with the go-to-market IPP is already available at http://developer.intuit.com/azure. Integration is based on an extension of the QuickBooks data model and will provide APIs for single sign-on, billing, data integration, and user management. The companies expect a flood of SaaS apps to follow, since together they have some 750,000 development firms and channel partners. Azure launched February 1.

Later this year, once the integration is polished and the widgetry is formally released, Microsoft will make its Business Productivity Online Suite, including Exchange Online, SharePoint Online, Office Live Meeting, and Office Communications Online, available for purchase in the Intuit App Center.
Who Do You Trust To Meter The Cloud?
So what has this subject got to do with Cloud Computing?
Tom Raftery at Greenmonk (the green shoot from Redmonk) has a great analysis of the disastrous use of smart meters by PG&E in Bakersfield, California. Bakersfield residents believe their new smart meters are malfunctioning because their bills are much higher than before. PG&E claims the higher bills are due to rate hikes, an unusually warm summer, and customers not shifting demand to off-peak times when rates are lower. In the same story on smartmeters.com, State Senator Dean Florez, the Majority Leader in California, is quoted as saying, "People think these meters are fraud meters. They feel they're being defrauded. They're getting no benefit from these things."

One of the advantages of a smart grid is that the two-way flow of information allows utilities to alert customers to real-time electricity pricing via an in-home display. PG&E has not rolled out in-home displays with its smart meters, presumably for cost reasons. If it loses the class-action lawsuit, that may turn out to have been an unwise decision.

There is a better way, however. PG&E should have a system where customers can see their electrical consumption in real time (on their phone, on their computer, on their in-home display, etc.). And, in the same way that credit card companies contact me when purchasing goes outside my normal pattern, PG&E should have a system in place to contact customers whose bills are going seriously out of kilter, preferably one that alerts people in real time, through their in-home display, via SMS, Twitter DM, or whatever, when they are consuming too much electricity while the price is high.

So what has this got to do with Cloud Computing? Quite a lot, actually. Customers of Cloud services right now depend on the "meters" being provided by the service providers themselves, just like the PG&E customers in Bakersfield. This means that they depend on the service provider itself to tell them about usage and pricing.
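The credit-card-style out-of-pattern alert described above, whether for kilowatt-hours or for cloud API usage, can be sketched as a simple statistical rule. This is a hypothetical illustration, not anything PG&E or a cloud provider actually ships; the function name and the two-sigma threshold are my own assumptions:

```python
from statistics import mean, stdev

def out_of_pattern(history, current, threshold_sigmas=2.0):
    """Flag a billing period whose usage deviates sharply from the
    customer's historical pattern (credit-card-style fraud alerting).

    history: list of past billing-period usages (e.g. kWh or GB).
    Returns True when current usage exceeds mean + k * stdev.
    """
    if len(history) < 2:
        return False  # not enough history to establish a pattern
    mu = mean(history)
    sigma = stdev(history)
    return current > mu + threshold_sigmas * sigma

# A customer averaging ~300 kWh suddenly billed for 600 kWh gets flagged:
print(out_of_pattern([300, 310, 290, 305], 600))  # True
print(out_of_pattern([300, 310, 290, 305], 315))  # False
```

The same check works unchanged whether the "meter" counts electricity or cloud service consumption, which is exactly why an independent measurement point matters.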
There isn't an independent audit trail of usage, and the meter also locks the customer into the service provider. Data transfer to cloud computing environments must be controlled to avoid unwarranted usage levels and unanticipated bills from overuse of cloud services. By metering cloud service usage locally, internal IT and finance teams apply their own controls to cloud computing. The Cloud Service Broker analyzes traffic and provides reports as well as an audit trail. Reports include usage information in real time, per hour, per day, and per service, broken down both by message counts and by data volume. Visibility is key, and it is all independent of any individual Cloud service provider. It is easy to imagine how useful this would be in conjunction with Amazon's spot pricing (see a great analysis of Amazon's spot pricing by James Urquhart here).
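The per-service, per-hour reporting described above amounts to aggregating locally captured traffic events into an audit trail the provider can't alter. A minimal sketch, assuming events are logged as (ISO timestamp, service name, bytes transferred) tuples; the function and event shape are illustrative, not any particular broker product's API:

```python
from collections import defaultdict
from datetime import datetime

def usage_report(events):
    """Build an independent audit trail from locally metered traffic:
    message counts and data volume per (service, hour) bucket.

    events: iterable of (timestamp_iso, service, bytes_transferred).
    """
    report = defaultdict(lambda: {"messages": 0, "bytes": 0})
    for ts, service, nbytes in events:
        hour = datetime.fromisoformat(ts).strftime("%Y-%m-%d %H:00")
        bucket = report[(service, hour)]
        bucket["messages"] += 1
        bucket["bytes"] += nbytes
    return dict(report)

events = [
    ("2010-01-24T10:05:00", "storage", 1000),
    ("2010-01-24T10:40:00", "storage", 500),
    ("2010-01-24T11:10:00", "compute", 200),
]
print(usage_report(events))
```

Because the aggregation runs on the customer's side of the wire, the resulting numbers can be reconciled against the provider's bill, which is the whole point of not trusting the provider's meter alone.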
What Big Data Will Mean to IT
Applications designed to take advantage of new computing capabilities and respond to the needs of huge data
We've all heard and read about the enormous pace of growth of the cloud, both now and into the future. And I think that what will drive growth most powerfully will be the ever-expanding need for data storage. A recent piece in the San Francisco Gate quoted a study done last year by analysts at IDC about enterprise storage needs. The study said that, over the next five years, structured data (the traditional row-and-column information contained in relational databases) will grow at more than 20%, while unstructured data will rise at an astounding 60% compounded rate. That means structured data storage requirements will double, while unstructured data storage requirements will increase seven times. Application scale, in other words, is growing dramatically.

So what does this mean for the future of IT? Basically, a big skill for IT folks will revolve around being able to stay on top of load variation. "The ability to respond to dynamic app load by rapidly altering application topology will be a fundamental IT skill," says the article, written by Bernard Golden, CEO of consulting firm HyperStratus.
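As a quick sanity check on those figures: a compound annual growth rate r accumulates to a factor of (1 + r)^n over n years. A minimal sketch (the helper name is my own):

```python
def cumulative_multiplier(annual_rate, years=5):
    """Cumulative growth factor for a compound annual growth rate."""
    return (1 + annual_rate) ** years

# 20% compounded over five years is roughly a 2.5x factor,
# consistent with storage needs at least doubling:
print(round(cumulative_multiplier(0.20), 2))  # 2.49

# A full 60% CAGR would compound to more than 10x over five years;
# a "seven times" increase corresponds to an effective compounded
# rate closer to 48% per year:
print(round(cumulative_multiplier(0.60), 2))  # 10.49
print(round(cumulative_multiplier(0.476), 2))
```

Either way, the arithmetic supports the article's core point: unstructured data growth outpaces structured data growth by a wide cumulative margin.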
Cloud providers offer orchestration: defining computing capacity in a single transaction, with the orchestration software's underlying infrastructure obtaining the individual resources needed to produce that capacity. But what the volume of tomorrow's applications will require is dynamism, the ability to rapidly, seamlessly, and transparently adjust resource consumption, all without human intervention, notes the story. Without it, you've got "buggy whip processes in a motorized world."

There is no way, in my mind, that I can see masses of IT admins on the job 24/7 adjusting computing resource levels. I think the cloud is the answer here, and so does the author: "There's no question that cloud computing, whether a public or private/internal variant, is the future of computing." But there are and will be challenges in dealing with the demands of infinite scalability and highly variable demand. "We can expect to see massive stress in IT operations as it grapples with how to respond to workloads that are orders of magnitude larger," according to the story. "And, just as many mourned the passing of the friendly, service-with-a-smile telephone operator, many people, both inside and outside of IT, will grieve for the old days of smart, hands-on sys admins."

Golden says that what's expected for this new world of big data and ever-increasing computing demand is:

1. A need for highly automated operations tools that require little initial configuration and subsequent "tuning" because they operate on AI-based rules
2. A huge amount of turmoil as sunk investment in outmoded tools must be written off in favor of new offerings better suited to the new computing environment
3. A change in the necessary skill sets of operations personnel, from individual system management to automated system monitoring and inventory management
4. A new generation of applications designed to take advantage of new computing capabilities and respond to the needs of huge data
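The no-human-in-the-loop dynamism described above can be sketched as a simple proportional scaling rule that sizes capacity to observed load. This is a hypothetical illustration, not any provider's actual autoscaling logic; the function name, the 60% target utilization, and the instance bounds are all assumptions:

```python
import math

def desired_instances(current, avg_utilization, target=0.6,
                      min_instances=1, max_instances=100):
    """Rule-based scaling decision with no human intervention.

    current: number of instances currently running.
    avg_utilization: observed average per-instance load (0.0 - 1.0+).
    Returns the instance count that brings per-instance load
    back toward the target, clamped to sane bounds.
    """
    total_load = current * avg_utilization
    ideal = math.ceil(total_load / target)
    return max(min_instances, min(max_instances, ideal))

# Four instances running hot at 90% grow to six:
print(desired_instances(4, 0.9))   # 6
# Ten instances idling at 30% shrink to five:
print(desired_instances(10, 0.3))  # 5
```

A monitoring loop would call a rule like this every few minutes and act on the result, which is exactly the "automated system monitoring" skill shift the article predicts for operations staff.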
As we transform the way companies compute and change the infrastructure that will be needed for the days of big data, I believe it will be more essential than ever for companies to employ assistance in measuring database capacity usage. Indeed, that's why we're seeing such an explosion of growth in cloud monitoring services that can measure usage and alert companies when they're approaching their thresholds.

Read the original blog entry...

Published January 9, 2010