Friday, October 30, 2009

An example of a cloud security solution vendor =>

A vendor called Catbird provides security solutions for complying with the regulations of various industries.
Catbird Launches First-Ever Comprehensive Security and Compliance Solution for Cloud Computing Providers

Catbird®, the pioneer in leading-edge security and compliance solutions for virtual infrastructure, today announced the launch and immediate availability of its new vSecurity Cloud Edition™, the first solution designed specifically to enable both public and private cloud computing providers to instantly deliver SOX, PCI, HIPAA, FISMA, COBIT, DIACAP and other regulatory-based security and compliance policy monitoring and enforcement. vSecurity Cloud Edition was previewed at last month's VMworld 2009, for which it was honored with a Best of Show Finalist award, the first security product to win in the Cloud Computing category.
vSecurity Cloud Edition gives service providers the competitive edge with the industry's only automated monitoring and enforcement solution that covers all seven critical control areas: auditing, inventory management, configuration management, change management, access control, vulnerability management and incident response, in an elegant solution specifically architected to meet the needs of cloud providers.

"Compliance workflow and reporting is the "money report" of cloud computing, and Catbird delivers," said Edmundo Costa, Catbird COO. "Everyday, cloud providers face customers who demand documented privacy, security and compliance before they'll commit to cloud-based deployments. vSecurity Cloud Edition delivers the information with a compliance workflow based on comprehensive security controls, empowering customers to embrace cloud-based infrastructure."

vSecurity Cloud Edition: Security Provider for Amazon Web Services Customers

Catbird is the only virtualization security product offered by Amazon Web Services' Elastic Compute Cloud (EC2) web service, which provides instantly-scalable cloud-based computing capacity to meet evolving enterprise needs. Customers of Amazon's EC2 now have the option of adding automated, continuous vulnerability monitoring to their Amazon Machine Image with Catbird's vSecurity Cloud Edition. Organizations regulated by PCI or other policies that demand compliance can take advantage of Amazon's state-of-the-art cloud while remaining compliant, enhancing Amazon's value proposition and accelerating growth in this increasingly popular space.

"Cloud computing is a natural extension of where data centers are headed with virtualization," notes Neil MacDonald, Vice President and Gartner Fellow. "We've come a long way towards implementing virtualization more securely, and virtualized security controls are a key element of this. Organizations should demand the same level of protection and compliance from cloud-based service providers as they would from their own virtualized or physical infrastructure."

Industry Leading Compliance Controls

vSecurity Cloud Edition delivers unprecedented comprehensive coverage to ensure cloud service providers meet their customer mandates, including:

  • SOX compliance measurement and reporting across 39 COBIT controls;
  • PCI compliance measurement and reporting across 96 test requirements impacted by virtualization;
  • HIPAA compliance measurement and reporting across 37 controls negatively impacted by virtualization;
  • DIACAP compliance measurement and reporting across all 26 controls affected by virtualization, including the 12 Mac1 controls;
  • COBIT compliance measurement across over 40 controls affected by virtualization, and
  • FISMA compliance and reporting across 51 controls.

Among numerous protection and enforcement features, vSecurity Cloud Edition provides:

  • 24x7 vulnerability management with a fully compliant scanner that is automatically correlated with other virtual machine attributes to provide an accurate assessment of known defects against a specific and customizable compliance framework.
  • NAC-based enforcement for continuous monitoring of the virtual machine population, real-time inventory management, and the most accurate real-time VM catalog and virtual machine sprawl prevention.
  • A multi-tenant management portal that provides compliance intelligence aggregation, management and reporting across physical, virtual, private and public clouds from a single dashboard, while ensuring the privacy of customer or departmental data.

Essential for cloud providers is customizable, real-time reporting on the compliance and security status of the customer's applications and systems, tailored for the appropriate audience ranging from executive-level managers to technical IT administrators.

Partners signing on quickly

Catbird vSecurity Cloud Edition has already been embraced by cloud providers and platform vendors whose customer base demands broad-based security and compliance.

"Once again, Catbird has risen to the occasion to build the most robust, feature-rich virtualization security solution on the market, with functionality designed specifically to meet the needs of the cloud provider," said Ali Davachi, CEO of ValueReseller.com (www.valuereseller.com), a leading provider of private label cloud hosting services that has recently added vSecurity Cloud Edition to its portfolio of reseller focused services. "We're committed to helping our resellers deliver the most secure environment for their mission-critical hosting and data storage needs. After evaluating several choices, Catbird was the clear winner for its ability to deliver comprehensive security and compliance."

"By integrating Catbird into our suite of products we are able to provide unparalleled security and policy controls to our customers, while also taking most of the complexity out of managing their cloud computing infrastructure," said Jaymes Davis, CEO of Halo FC (www.halofc.com), a cloud enablement software company that is an emerging hybrid cloud solution for the enterprise.

Ideal for both public and private clouds, vSecurity Cloud Edition features a Service-Oriented Architecture that incorporates stateless agents reporting to a separate command center. Full integration with ESX 3.X, vSphere 4.x and up and Citrix XenServer enables seamless deployment in cross-platform environments. Infinitely scalable, vSecurity Cloud Edition allows cloud providers to seamlessly add clouds to existing infrastructure, without additional investment in security and compliance. For public clouds, SLA enforcement provides value-added assurance for clients.

For more information about Catbird's line of visionary security and compliance solutions for virtualization and cloud computing, including the new vSecurity Cloud Edition, visit www.catbird.com

Google's smart grid strategy for now is to partner. And the partners are =>

vendors that support connectivity with smart grid infrastructure, such as smart meter manufacturers.
Naturally the logic is that the more partners supported the better, but aiming to become the de facto standard before standardization advances is also a powerful strategy in its own right.

Updated: The Obama administration has just funded the rollout of 18 million meters with stimulus funds, but here's another way to get access to energy data without one of those new digital meters: This morning UK energy management startup AlertMe says it has joined up with Google's energy management tool PowerMeter. AlertMe, which makes the monitoring device and is backed by venture capital firms like Index Ventures and VantagePoint Venture Partners, will use PowerMeter to help customers track their energy consumption online in the iGoogle format.

AlertMe's gadget is the second device partner that Google has announced in recent months and the first one in the UK. Earlier this month Google announced that the Energy Detective, made by Energy Inc, would be its first hardware partner, enabling U.S. customers that have bought a TED 5000 to monitor energy consumption on a PC. As Google's Tom Sly told us earlier this month, Google plans to keep adding to its list of device partners.

The more gadget partners for PowerMeter, the quicker home owners could have access to real time energy data. TED and AlertMe customers don't need a smart meter. AlertMe also says that it is the first gadget partner for Google that doesn't need an electrician to install it — users just clip the reader onto the old electricity meter and plug into their home broadband connection. For the TED, customers need an electrician.

Bypassing the meter also means that customers can get their energy data more quickly than through most utility-sanctioned in-home energy dashboards. Most of the utilities are sending the energy data from the smart meter to the utility backoffice to be displayed to the customer in a 24 hour period. But the data from AlertMe and TED can sync with your PC or mobile device via broadband in real time.

The 3-year-old AlertMe raised £8 million ($13.04 million) back in June in a Series B round from Good Energies, Index Ventures, SET Partners and VantagePoint Venture Partners. The company's home energy management product uses a Zigbee-based wireless network, sensors and smart plugs to monitor and manage energy consumption in homes. The kit, which costs £1469.00 plus £92.99 per month (Update: the company has changed its pricing; the figure previously listed was the old pricing), is one of the few energy management products that is actually available now.

AlertMe is also working with utilities, and has a trial going with a division of British Gas, one of the largest residential suppliers of gas and electricity in the UK. The trial with British Gas is focusing specifically on a heating system that can be controlled remotely, enabling home owners to turn on/off, up/down their home heat from any broadband-connected device, like a PC or cell phone. British Gas is offering AlertMe gear as a voluntary option, and customers will have to pay for the upfront hardware as well as a recurring subscription service fee.

Google is also very keen to work with utilities for PowerMeter. The search engine giant is already working with about 10 utilities, including San Diego Gas & Electric, TXU Energy, Wisconsin Public Service, White River Valley Electric Cooperative, JEA, Glasgow EPB, Reliance Energy (India), and Toronto Hydro–Electric System (Canada). And this morning Google also announced its first UK utility partner first:utility, a new independent utility based in Warwick, UK. Google says first:utility is "the only energy supplier in the United Kingdom to provide free smart meters to its customers."

On cloud analytics, and how it differs from existing analytics =>

The article explains the difference. In particular, some benefits are clearly visible, such as being able to take advantage of cloud-specific big-data management like Hadoop.


Cloud Analytics Checklist

What are enterprise users looking for from a cloud analytics solution?



In the previous article we looked at how realtime cloud analytics looks set to disrupt the $25B SQL/OLAP sector of the IT industry. What are users looking for from a next-generation post-SQL/OLAP enterprise analytics solution? Let's look at the requirements:

  • Realtime + Historical Data. In addition to analyzing (historical) data held in databases (Oracle, SQLServer, DB2, MySQL) or datastores (Hadoop, Amazon Elastic MapReduce), a next-gen analytics solution needs to be able to analyze, filter and transform live data streams in realtime, with low latency, and to be able to "push" just the right data, at the right time, to users throughout the enterprise. With SQL/OLAP or Hadoop/MapReduce, users "pull" historical data via queries or programs to find what they need, but for many analytics scenarios today what's needed instead, to handle information overload, is a continuous "realtime push" model where "the data finds the user" (a toy sketch contrasting the two models follows this list).
  • External + Internal Data. In the past it was so simple, an enterprise had only to deploy a few large specialized systems (ERP, CRM, Supply Chain, Web Analytics) to handle the internal data flowing through the organization. Today, in order to be able to operate with peak efficiency, a large enterprise will need to have a detailed realtime integrated awareness of all kinds of data sources that could impact the business, for example, information on: customers, partners, employees, competitors, marketing, advertising, pricing, web, news, markets, locations, gov data, communications, email, collaboration, social, IT, datacenters, networks, sensors.
  • Unstructured + Structured Data. SQL/OLAP analytics was built on the idea that data would be held in relational databases, and that the data would be highly structured. Today, this no longer applies. Much of the most valuable data to an enterprise today is either semi-structured or unstructured.
  • Easy-To-Use. SQL/OLAP has proved to be too complex for most enterprise users who need access to analytics for their work. Excel with its simple charting, visualization, sharing and collaboration features provides a much more attractive interface for most users. Other products and services such as Qlikview and GoodData also provide ease-of-use, but none of them (Excel included) offers the kind of realtime analytics, scalability and parallel processing required in analytics today. Despite its complexity and lack of mainstream adoption within the enterprise, a few companies have taken SQL/OLAP and made it even more complex by adding in features to support realtime stream processing. None of these StreamSQL solutions seem to have achieved any widespread adoption to date.
  • Cloud-Based, Pay-Per-Use. Every company looking to compete in the next-generation analytics market will have to have at least a public cloud offering, and most will also have virtual private cloud and private cloud offerings. Since enterprise data will often be held on more than one cloud, it will be increasingly important to have an "intercloud" capability, where analytics apps can be run across multiple (public and/or private) clouds, e.g. across Amazon AWS and Windows Azure.
  • Elastic Scalability, Parallel Processing, MapReduce. With exponentially growing data volumes it will be essential to offer the elastic scalability and parallel processing required to handle anything from one-off personal data analysis tasks up to the most demanding large-scale analytics apps required by the world's leading organizations in business, web, finance and government.
  • Seamless Integration With Standard Tools (Excel). With 40 Million analytics power users using Excel, this is a must for any analytics solution looking to achieve significant market adoption.
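
To make the "pull versus push" distinction above concrete, here is a toy sketch. It is my own illustration in plain Python, not Cloudscale's (or anyone else's) API, and every name in it is made up: the batch function answers only when asked, while the stream handler pushes a result to the user the moment a rule fires.

```python
from collections import defaultdict

# --- "Pull": the user runs a batch aggregation over historical records on demand ---
historical_orders = [
    {"region": "EU", "amount": 120.0},
    {"region": "US", "amount": 340.0},
    {"region": "EU", "amount": 95.0},
]

def pull_total_by_region(records):
    """Batch-style aggregation, analogous to a SQL GROUP BY the user must request."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["region"]] += rec["amount"]
    return dict(totals)

# --- "Push": a stream handler decides when the data should find the user ---
class ThresholdPusher:
    """Watches a live event stream and pushes an alert the moment a rule matches."""

    def __init__(self, threshold, notify):
        self.threshold = threshold
        self.notify = notify  # callback that delivers the result to the user

    def on_event(self, event):
        if event["amount"] >= self.threshold:
            self.notify("High-value order in %s: %.2f" % (event["region"], event["amount"]))

if __name__ == "__main__":
    print(pull_total_by_region(historical_orders))   # user asks, system answers
    pusher = ThresholdPusher(threshold=300.0, notify=print)
    for evt in historical_orders:                    # stand-in for a live stream
        pusher.on_event(evt)                         # system decides when to notify
```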

At Cloudscale, we've compiled a Cloud Analytics Checklist, showing how various analytics products/services measure up against this set of requirements. If you're thinking about cloud analytics and would like a copy of the Checklist then send a request with your email address via the Cloudscale website (no signup required) or by email to checklist@cloudscale.com, with the word Checklist in the Subject line.

Cisco's cloud strategy: the first company it has acquired is ScanSafe, and the next in line is =>

who, exactly, which is where the attention is focused. ScanSafe is a security company, but Cisco has already acquired IronPort, and this article stresses that the deal should be seen not as part of Cisco's security strategy but as the start of an acquisition strategy for its newly launched cloud business.
 
Cisco certainly has the cash, and it is expected to use it eventually on a large acquisition to strengthen its own cloud strategy. As for who that target might be, not even speculation has surfaced yet, so it will be interesting to see what predictions emerge.
 

Cisco Catches a Cloud

Cisco (Nasdaq: CSCO) has agreed to pay $183 million in cash to acquire ScanSafe, a digital security provider whose offerings live and work on the Web.

News headlines suggest that security is the story of this deal. They're wrong. Cisco has been acquiring pieces of a security portfolio for years, including an $830 million purchase of IronPort Systems in 2007. IronPort was and still is the major alternative to Secure Computing's gateway security offerings, which are now maintained by McAfee (NYSE: MFE).

What's striking here is that Cisco is buying a cloud-computing service. This isn't software that can be added to its routers or other networking equipment; it's distinct, an off-site suite more akin to what salesforce.com (NYSE: CRM), NetSuite (NYSE: N), and SuccessFactors (Nasdaq: SFSF) offer than anything else you'll find in Cisco's portfolio.

Cisco's take on cloud computing is just as interesting to me. Last week, one of the company's security bloggers, Seth Hanford, posted an interesting piece about Microsoft's (Nasdaq: MSFT) embarrassing data failure with Sidekick handhelds. The title: "Cloud Computing: Not a Security Panacea."

Hanford makes several good points in the piece. My favorite: "What remains to be seen is whether consumer expectation and assumptions about the nature of cloud computing can be satisfied by companies that only offer hosted services, or those that do not include cost-prohibitive design decisions."

Precisely. Expectations are difficult to satisfy under the very best conditions. With so much hype attached to cloud computing, providing satisfactory protection may very well be impossible -- a delicious irony.

Cloud computing may not be a security panacea, but Cisco will pay for it anyway. Why take a chance when you're the market leader, and blessed with tens of billions in cash and securities?

Thursday, October 29, 2009

SaaS storage cannot compete as a mere data archiving service: 3PAR's announcement =>

says it will provide the following capabilities:
  • Synchronous Long Distance Remote Copy
  • Persistent Cache
  • RAID Multi Parity
All of these are features that secure data integrity as a countermeasure against storage failures.
 
As I have tweeted before, storage SaaS is an extremely competitive market, and I see the battle to differentiate as a very tough one. Price competition is also fierce, so a business model that drives operating costs down as far as possible to secure margins is essential.
 

3PAR Storage Servers Enhance Resilience for Cloud Computing

New Utility Storage Software Delivers Multi-Site Disaster Recovery and Service Level Protection for Public and Private Cloud Environment

3PAR® (NYSE: PAR), the leading global provider of utility storage, announced today the introduction of several new software capabilities for the 3PAR InServ® Storage Server to give cloud and enterprise datacenters increased resilience and agility at a lower cost than traditional storage arrays. Together, these new software features from 3PAR—Synchronous Long Distance Remote Copy, Persistent Cache, and RAID MP (Multi-Parity)—help protect 3PAR Utility Storage customers against disasters, preserve service levels in the event of a component outage, and protect against double disk failures. These features are designed to work with the InServ platform to deliver the robust functionality demanded by Tier 1 datacenters, the cost structure attractiveness of Tier 2 environments, and the agility of a highly virtualized infrastructure.

"Cloud computing datacenters demand resilience, flexibility, and cost-effectiveness," said Steve Scully, Research Manager for IDC's Enterprise Storage Systems research group. "With the announcement of these new software features for the public and private cloud datacenter, 3PAR has delivered a confluence of high-end features, advanced virtualization, and attractive cost-of-ownership to the high-end storage market. What's more, 3PAR has brought this exciting combination to the midrange market as well."

Low RTO and RPO for Long Distance DR: Synchronous Long Distance Remote Copy
3PAR Remote Copy dramatically reduces the cost of remote data replication and disaster recovery by leveraging thin copy technology, permitting the combination of midrange and high-end arrays, and eliminating the requirement for professional services. Announced today, 3PAR Remote Copy support for Synchronous Long Distance replication gives 3PAR customers a new, affordable, multi-site alternative for achieving low Recovery Time Objectives (RTOs) and zero-data-loss Recovery Point Objectives (RPOs) with complete distance flexibility.

Synchronous Long Distance Remote Copy delivers the best of both worlds—offering the data integrity of synchronous mode disaster recovery and the extended distances (including cross-continental reach) traditionally only associated with asynchronous replication. Synchronous Long Distance Remote Copy delivers these benefits without the complexity or professional services required by the monolithic vendors that offer multi-target disaster recovery products, and at half the cost. In addition, 3PAR is the first storage vendor to support multi-site capability on midrange arrays, enabling cloud service providers and enterprise customers alike to reduce their equipment costs.

"In the past, we have looked at multi-site, multi-mode disaster recovery but never made the investment because we found the offerings complex, expensive, and difficult to configure," said Dwayne Sye, CIO at Cvent. "We were never able to justify the cost premium and significant up-front and ongoing professional services that the monolithic arrays required. In contrast, 3PAR Remote Copy provided us with far simpler, relatively affordable Synchronous Long Distance replication. This has allowed us to save considerable dollars by using a combination of InServ T-Class and F-Class Storage Servers in place of premium-priced monolithic arrays."

Maintaining Service Levels Under Failure: Persistent Cache
3PAR Persistent Cache is a resiliency feature built into the latest version of the 3PAR InForm® Operating System that allows "always on" environments to gracefully handle both planned and unplanned downtime. Persistent Cache eliminates the substantial performance penalties associated with "write-through" mode so that InServ arrays can maintain required service levels even in the event of a cache or controller node failure. "Write-through" mode suspends the use of write caching for data integrity reasons—an approach typically used in dual-controller array architectures when such failures occur. Even monolithic arrays experience "degraded mode" in these situations, which can impact required application performance levels to the point of being considered an outage.

3PAR Persistent Cache leverages the clustered InSpire® Architecture with its unique Mesh-Active design to preserve write-caching in the event of a controller node failure by rapidly re-mirroring cache to the other nodes in the cluster. Persistent Cache is supported on all quad-node and larger 3PAR arrays, including the InServ F400—making 3PAR the only vendor to incorporate this industry-leading service level protection capability into midrange as well as high-end arrays.
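
To see why falling back to write-through hurts, consider the toy sketch below. It is not 3PAR code, just a generic illustration of the two write policies with a made-up latency standing in for the backing disks: in write-through every write waits on the slow store before it is acknowledged, while write-back acknowledges from cache and defers the flush, which is the behavior Persistent Cache is designed to preserve after a node failure.

```python
import time

BACKING_STORE_LATENCY = 0.002   # pretend every backing-store write costs 2 ms

cache = {}
dirty = set()

def write_through(key, value):
    """Every write waits for the slow backing store before it is acknowledged."""
    cache[key] = value
    time.sleep(BACKING_STORE_LATENCY)

def write_back(key, value):
    """Acknowledge from cache immediately; the flush to disk happens later."""
    cache[key] = value
    dirty.add(key)

def flush():
    """Background destage of dirty blocks to the backing store."""
    for key in list(dirty):
        time.sleep(BACKING_STORE_LATENCY)
        dirty.discard(key)

if __name__ == "__main__":
    start = time.time()
    for i in range(100):
        write_through("blk%d" % i, i)
    print("write-through acks took %.3f s" % (time.time() - start))

    start = time.time()
    for i in range(100):
        write_back("blk%d" % i, i)
    print("write-back acks took    %.3f s (flush deferred)" % (time.time() - start))
    flush()
```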

Protection Against Double Disk Failures: 3PAR RAID MP
3PAR RAID MP (Multi-Parity) introduces Fast RAID 6 technology backed by the accelerated performance and Rapid RAID Rebuild capabilities of the 3PAR ASIC. RAID MP is designed to deliver extra RAID protection and to work particularly well with large disk drive configurations—for example, Serial ATA (SATA) drives above 1 TB in capacity. While the 3PAR architecture is capable of delivering up to quadruple parity protection, this first introduction of RAID MP technology supports double parity modes of 6+2 and ultra-efficient 14+2. 3PAR RAID MP offers protection against data loss due to double disk failures while delivering performance levels within 15% of RAID 10. In addition, 3PAR RAID MP capacity overheads are comparable to popular RAID 5 modes. RAID MP is supported on all InServ models.
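
The capacity claims above are easy to sanity-check with back-of-the-envelope arithmetic. The figures below are my own calculation of the parity overhead implied by the stated N+M layouts, not numbers supplied by 3PAR:

```python
def parity_overhead(data_disks, parity_disks):
    """Fraction of raw capacity consumed by parity in an N+M layout."""
    return float(parity_disks) / (data_disks + parity_disks)

print("RAID MP 6+2  overhead: %.1f%%" % (100 * parity_overhead(6, 2)))    # 25.0%
print("RAID MP 14+2 overhead: %.1f%%" % (100 * parity_overhead(14, 2)))   # 12.5%
print("RAID 5  3+1  overhead: %.1f%%" % (100 * parity_overhead(3, 1)))    # 25.0%
print("RAID 10 mirroring    : %.1f%%" % 50.0)  # half the raw capacity holds copies
```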

"The builders of public and private cloud computing solutions want to design resilient service architectures that offer both agility and cost-effectiveness," said David Scott, 3PAR President and CEO. "Traditionally, finding storage systems that facilitate this combination of service capabilities has been nearly impossible. The new resilience features announced today affirm 3PAR as a leader in providing the storage building blocks best suited to helping customers and service providers achieve their solution design goals."

Synchronous Long Distance replication is bundled in the latest version of 3PAR Remote Copy. Persistent Cache and RAID MP are enhancements to the latest version of the InForm Operating System. These versions of Remote Copy and the InForm Operating System are both orderable today.
 

About 3PAR:
3PAR® (NYSE: PAR) is the leading global provider of utility storage, a category of highly virtualized and dynamically tiered storage arrays built for public and private cloud computing. Our virtualized storage platform was built from the ground up to be agile and efficient to address the limitations of traditional storage arrays for utility infrastructures. As a pioneer of thin provisioning and other storage virtualization technologies, we design our products to reduce power consumption to help companies meet their green computing initiatives, and to cut storage total cost of ownership. 3PAR customers have used our self-managing, efficient, and adaptable utility storage systems to reduce administration time and provisioning complexity, to improve server and storage utilization, and to scale and adapt flexibly in response to continuous growth and changing business needs. For more information, visit the 3PAR Website at: www.3PAR.com.

NetSuite announces a SaaS ERP service for global enterprises. The product is called NetSuite OneWorld SRP, and its functionality covers =>

mainly core operations such as project management and financial management across global enterprises; components that global companies need, such as international tax management and revenue recognition, are supported on NetSuite's own cloud environment.

NetSuite Unveils End-to-End Cloud Business Management Suite

(WEB HOST INDUSTRY REVIEW) -- Cloud computing business management software vendor NetSuite (www.netsuite.com) has launched NetSuite OneWorld SRP, the world's first end-to-end cloud business management solution of its kind for global services businesses.

According to its Tuesday announcement, NetSuite OneWorld SRP gives global services businesses comprehensive real-time visibility, integrated financials, resource optimization and services management from corporate, to subsidiaries, down to the individual project level across geographies, currencies, and tax jurisdictions.

"Most services businesses today run on software designed before the World Wide Web existed," NetSuite chief executive officer Zach Nelson said in a statement. "NetSuite OneWorld SRP brings services companies into the Internet Age, and in so doing massively reduces cost while improving productivity. Instead of waiting three months to find out what happened yesterday with a given project or subsidiary entity, with NetSuite OneWorld SRP services-based businesses can finally manage their operation in real-time."

Global services businesses have lacked the integrated business management tools necessary to truly manage corporate performance on a global basis, according to NetSuite, until now. Services-based businesses face an array of complex and volatile business dynamics such as widely fluctuating local resource demand, global pricing pressure, and complex, locally negotiated service delivery contracts. In this context, making bad decisions can significantly impact profitability, cash flow and customer satisfaction.

NetSuite OneWorld SRP uses cloud computing to address the issues of global project and financial management. Its comprehensive services management functionality makes it ideal for professional services, legal, accounting, business process outsourcing management and media-agency organizations with international operations. 

NetSuite OneWorld SRP features automated multi-currency management, built-in support for international tax, compliance and sophisticated revenue recognition management, eliminating arduous manual processes and shortening the period-end close cycle. And being a native cloud solution, NetSuite OneWorld SRP lets global services businesses unlock these benefits with anytime, anywhere access and minimum capital expenditure, while ensuring fast deployment and lean requirements for ongoing management.

Trend Micro announces its cloud security strategy. The content =>

positions it as a solution covering both on-premise server security and servers in virtualized environments in the cloud, with the following points particularly strengthened:
  • Uniform coverage across all three of on-premise servers, the virtualization layer, and cloud environments
  • Lower security costs for virtualized and cloud environments
  • Assured compliance with various standards: PCI, SAS 70, FISMA, HIPAA, and so on

Trend Micro Introduces Advanced Server Security Strategy

Trend Micro is introducing an advanced server security strategy that encompasses protection for the cloud, as well as products and solutions that help corporations address the challenging data protection, security and compliance needs of today's datacenters that stretch across physical, virtualized and cloud-computing environments.

Trend Micro Deep Security, the flagship product for advanced server security at Trend Micro, introduces a new paradigm for server security where the entire server is protected, including the operating system, network and applications layers for superior and comprehensive security, regardless of computing environment, virtualization platform or storage location.

It emphasizes:

  • Preventing data breaches and disruptions by providing a layer of defense at the server itself, whether physical, virtual or cloud.
  • Lowering the cost of security management for virtual and cloud computing environments.
  • Helping to make possible compliance over a wide range of regulations and standards, including PCI, SAS 70, FISMA, HIPAA, and more.
  • Addressing immediate security concerns plaguing enterprises in physical and virtual worlds such as SQL injection and cross-site scripting attacks perpetrated by sophisticated, for-profit hackers.

Trend Micro Deep Security provides advanced protection for servers, from the operating system up to resident applications, with a modular architecture that includes: a deep packet inspection engine with intrusion detection/prevention (IDS/IPS), Web application protection and network-level application control; plus firewall, integrity monitoring and log inspection modules. This protection is available for both physical and virtual systems using server-based software agents and, coming soon with Deep Security 7.0, using virtual security appliances specifically designed for VMware VI3 and vSphere 4 environments. Trend Micro Deep Security 7.0, the latest version, is the world's first security software that coordinates VMsafe API-based security applied at the hypervisor with additional protection on virtual machines to protect VMware environments. This version also includes new features designed to improve management and simplify compliance for a lower total cost of ownership, such as event tagging to enable better workflow for security incident handling, and the ability to create a "reference system" or known good state to reduce false-positive alerts resulting from normal system updates such as patching. Other enhancements include integrity monitoring, log inspection, and SIEM integration capabilities.

Trend Micro Deep Security combines with Trend Micro ServerProtect and Core Protection for Virtual Machines, the company's anti-malware products designed for physical servers and VMware virtual servers respectively. This unique blend delivers the layered and comprehensive server security now vital to business continuity, and Deep Security's advanced anti-malware protection further adds to Trend Micro's ability to deliver that protection broadly.

Trend Micro Deep Security 7 will be available in November 2009 with two pricing models designed to deliver maximum value to organizations. Deep Security is available for traditional physical servers on a per server basis starting at $885 per server. A virtual server license is also available for VMware environments with unlimited agents per host machine starting at $2100 per socket.

Trend Micro advanced server security solutions are part of Trend Micro Enterprise Security – a tightly integrated offering of content security products, services and solutions which is powered by the Trend Micro Smart Protection Network. Trend Micro Enterprise Security delivers maximum protection from emerging threats while greatly reducing the cost and complexity of security management.

Amazon finally begins MySQL support, for once delivering a feature after Azure. On top of that, =>

it announced EC2 enhancements plus a price cut, the latter again lowered to match Microsoft's pricing.

With Azure's arrival, it appears Amazon intends to fight to keep users from switching mainly by leading with a pricing strategy.

Amazon launches relational database service: Think MySQL in the cloud

Posted by Larry Dignan @ 2:15 am

Amazon on Tuesday launched a public beta of a service dubbed the Amazon Relational Database Service (RDS). The main appeal: Allow customers to operate and scale database clusters while leaving pesky tasks like patching and administration to Amazon Web Services.

Adam Selipsky, vice president of Amazon Web Services (AWS), said the goal was to make it easy to scale MySQL clusters. He noted that "MySQL code and developer tools today will work with RDS."
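
If MySQL code and tools really do work unchanged against RDS, then pointing an ordinary client driver at the service should be all that is required. The sketch below assumes exactly that; the endpoint, credentials, schema and table are hypothetical, and pymysql is used simply as a stand-in for whatever MySQL driver an application already uses.

```python
import pymysql  # any standard MySQL client driver should work the same way

conn = pymysql.connect(
    host="mydb.abc123xyz.us-east-1.rds.amazonaws.com",  # hypothetical RDS endpoint
    user="admin",
    password="secret",
    database="shop",
    port=3306,
)
try:
    with conn.cursor() as cur:
        cur.execute("SELECT COUNT(*) FROM orders WHERE created_at >= %s",
                    ("2009-10-01",))
        print(cur.fetchone())
finally:
    conn.close()
```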

The RDS will round out Amazon's SimpleDB service and other plans where you bring your own database to the company's cloud (AWS blog, Amazon CTO Werner Vogels). What remains to be seen is the level of data that gets pumped into Amazon's RDS. Selipsky said the service is "suited for anything you'd put into a MySQL database" and that Amazon has "taken great pains to make sure it is highly secure."

Selipsky said that RDS came about because Amazon's SimpleDB is optimized for index and query functions not relational functions. Most enterprises mix and match these database types.  Adobe is one of the customers taking RDS for a spin. Nevertheless, enterprise customers are likely to take their time moving sensitive data into Amazon's RDS effort. "We're highly confident that RDS can hold a wide range of data sensitive or otherwise," said Selipsky.

To ride shotgun with the RDS rollout, Amazon also unveiled a new family of Elastic Compute Cloud (EC2) services. The latest family is for high CPU and memory usage. Things like running demanding databases, rendering and caching will operate better with a high-memory EC2 service. Selipsky noted that EC2 now has three instance families in multiple sizes.

And finally, Amazon is dropping prices across all of its EC2 instances. For instance, the smallest Linux-based EC2 instance runs 10 cents an hour, but will now go for 8.5 cents. In general, the price cut is 15 percent across Linux instances. Microsoft is also dropping the price of Windows EC2 instances.
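
The quoted figures are at least internally consistent; a one-line check:

```python
old_rate = 0.10                   # smallest Linux EC2 instance, dollars per hour
new_rate = old_rate * (1 - 0.15)  # a 15 percent reduction
print(new_rate)                   # 0.085 -> 8.5 cents an hour, as quoted
```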

Cisco acquires SaaS security vendor ScanSafe: Cisco's cloud strategy is =>

presumably to protect data comprehensively, both on corporate networks and in cloud environments, by combining ScanSafe with its earlier acquisition IronPort (security software that protects on-premise content).

The purchase price is $183M, and ScanSafe's product is planned to be integrated into Cisco's AnyConnect VPN Client.

Cisco Boosts Cloud Security with ScanSafe Acquisition
By Jennifer LeClaire
October 27, 2009 8:52AM

 
Cisco Systems has enhanced its ability to offer cloud-computing security with the purchase of ScanSafe for $183 million. Cisco expects the web security market to grow and ScanSafe will be combined with Cisco's recent acquisition of content security provider IronPort. Cisco plans to integrate ScanSafe with Cisco AnyConnect VPN Client.  

Cisco Systems announced Tuesday yet another acquisition. The networking giant is scooping up privately held software-as-a-service (SaaS) web security firm ScanSafe.

Cisco will pay about $183 million in cash and retention-based incentives for the firm, which is based in London and San Francisco and serves both global enterprises and small businesses. The acquisition is expected to close in the second quarter of Cisco's fiscal year 2010.

"With the acquisition of ScanSafe, Cisco is executing on our vision to build a borderless network security architecture that combines network and cloud Relevant Products/Services-based services for advanced security enforcement," said Tom Gillis, vice president and general manager of Cisco's Security Technology Business Unit. "Cisco will provide customers the flexibility to choose the deployment model that best suits their organization and deliver anytime, anywhere protection against web-based threats."

Hybrid-Hosted Web Security

Web security is a large and expanding market that Cisco expects to grow to $2.3 billion by 2012. By acquiring ScanSafe, Cisco said it's building on its successful acquisition of on-premise content security provider IronPort.

The acquisition brings together the Cisco IronPort high-performance web security appliance and ScanSafe's SaaS security service. Cisco said the combination will expand its security portfolio to offer superior on-premise, hosted and hybrid-hosted web security solutions.

"ScanSafe pioneered the market for SaaS web security and continues as a leader in this rapidly growing market," said ScanSafe CEO Eldar Tuvey. "At a time when enterprises are increasingly focused on a flexible and mobile workplace, the need for hybrid-hosted web security solutions is greater than ever. By joining the Cisco team we will be able to offer even better and more flexible protection to our customers."

Cisco's Cloud Strategy

ScanSafe's service will be integrated with Cisco AnyConnect VPN Client to provide a secure mobility solution. Cisco said ScanSafe's global network of carrier-grade data centers and multi-tenant architecture will enhance its ability to provide new cloud-security services for customers anywhere in the world.

"The cloud is a big part of Cisco's strategy. The biggest inhibitor to people using the cloud more is security. ScanSafe is a security platform for cloud-based opportunities. It's a good acquisition for them," said Zeus Kerravala, a vice president at the Yankee Group. "Cisco has been in security for a long time, but it's primarily been premised-based deployments. ScanSafe will help Cisco with its cloud strategy."

Is Cisco leveraging a down economy to scoop up smaller companies at bargain-basement prices? As Kerravala sees it, it's a buyer's market and Cisco has the cash to make acquisitions of companies that can't get the same multiples they expected in an up market.

"For companies with cash, like Cisco, there is a great opportunity to add to their portfolio at a relatively moderate price," Kerravala said. "For what Cisco has in the bank, $183 million is pocket change."

Cloud standardization efforts: an introduction to OCCI. Frankly speaking, standardization efforts =>

are plentiful. OCCI is one of them, an effort in which industry vendors band together to create a common specification jointly.
 
The questions are whether the industry will really follow along, and whether standardization is truly what the user community wants in the first place.
 
My own sense is that the market is still in its infancy, and that what users need before standardization is for services that genuinely contribute to their business to appear on the market.
 

Dion Hinchcliffe's Next-Generation Enterprises

Dion Hinchcliffe

As Cloud Computing Grows, Where Are The Standards?


A new report released this week by IT consulting and advisory firm Avanade claims that there was an impressive 320% jump in actual cloud computing service sign-ups by enterprises since the beginning of this year. Based on surveys of 500 companies in over 17 countries, it's the most recent in a long list of data points showing that enterprises are seriously considering their cloud options.

One of the key lessons from this is almost certainly that the recession has been a major impetus for cloud computing as businesses consider both on-premises and hosted services as a means to drive down their IT costs. And there's good news from those that have taken the plunge: over 90 percent of those surveyed consider their cloud computing implementation a success.

However there are still troubling signs of cloud computing's immaturity: downtime. Almost a third of the surveyed enterprises said they've experienced an unplanned service outage that cost them a day's lost business productivity. The recent EC2 blackout has also increased concerns around both the security and reliability of major cloud services.

A Leading Cloud Standards Contender:  Open Cloud Computing Interface (OCCI)

This leads directly to the fundamental question of choice. Choice in cloud providers and choice in cloud technologies. If enterprises had the ready ability to switch as easily as they'd like between external and internal clouds and amongst external providers, it would alleviate one of the major stumbling blocks to real adoption of cloud computing -- and all its attendant benefits -- for core IT in most organizations. But to do this requires interoperability that does not exist today.

The classic mantra: Standards drive choice. Choice drives the market.

Right now, the cloud computing market is in its early, formative stages and is still a landscape of proprietary products and approaches. Standards are largely de facto, though progress has been made in some quarters, as we'll see.

Today, organizations that commit to a particular cloud provider or vendor often risk exposure to all the classic issues around lock-in of their infrastructure, software, and data. Worse, the cloud computing platform wars have begun as major players enter the scene with competing solutions that don't work together, forcing you to choose on a basis of anything but interoperability. To be fair, the cloud standards aren't mature enough yet for providers to commit, but it's still a genuine issue.

Since the major brouhaha around the Cloud Computing Manifesto this last March about keeping the cloud as open and interoperable as possible, there has not been much noise about standards. It's worth looking at one of the best attempts to describe the open cloud playing field to perhaps understand why.

The goals of the manifesto were ambitious and egalitarian, namely:

1. User centric systems enrich the lives of individuals, education, communication, collaboration, business, entertainment and society as a whole; the end user is the primary stakeholder in cloud computing.
2. Philanthropic initiatives can greatly increase the well-being of mankind; they should be enabled or enhanced by cloud computing where possible.
3. Openness of standards, systems and software empowers and protects users; existing standards should be adopted where possible for the benefit of all stakeholders.
4. Transparency fosters trust and accountability; decisions should be open to public collaboration and scrutiny and never be made "behind closed doors".
5. Interoperability ensures effectiveness of cloud computing as a public resource; systems must be interoperable over a minimal set of community defined standards and vendor lock-in must be avoided.
6. Representation of all stakeholders is essential; interoperability and standards efforts should not be dominated by vendor(s).
7. Discrimination against any party for any reason is unacceptable; barriers to entry must be minimised.
8. Evolution is an ongoing process in an immature market; standards may take some time to develop and coalesce but activities should be coordinated and collaborative.
9. Balance of commercial and consumer interests is paramount; if in doubt consumer interests prevail.
10. Security is fundamental, not optional.

While some of the notions in the manifesto are indeed high minded and its release was somewhat controversial, the focus is correct in my opinion on topics such as net neutrality, equal access (with a slight preference for consumers over companies), active discouragement of vendor lock-in, and maximum interoperability. If followed, these can give us real choice as well as actual business agility. Since the release of the manifesto, there have been a number of initiatives to create standards, enable choice, and ensure the openness of the cloud to the extent that is possible with today's provider landscape. They've just been largely under the radar.

It's also worth noting that the lack of standards always tends to favor the incumbents and some of the biggest cloud computing players are currently and consistently absent from cloud standards and interoperability initiatives. For now, buyers must be beware and take their own steps to abstract themselves from unwanted dependencies while the situation sorts itself out.

The cloud computing standards are coming

Fortunately, with some recent research I've been doing, it's now clear that the apparent lack of buzz and news about cloud computing standards often just reflects that hard work is being done, not the opposite. One of the most encouraging and consistently developing stories today is the standards work being done on the Open Cloud Computing Interface, which is creating a set of REST-based interfaces for the management of cloud resources including computing, storage, and bandwidth. One of my favorite aspects of OCCI is that it tries hard to be a minimal specification that is simple and straightforward; there's little or none of the WS-I megastandards here.
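
As a flavor of what such a minimal REST interface looks like in practice, here is a purely illustrative sketch. The host, paths, headers and attribute names are placeholders of my own, not the normative OCCI rendering; consult the working group's specification drafts for the real wire format.

```python
import http.client
import json

API_HOST = "cloud.example.com"  # hypothetical OCCI-speaking provider
AUTH = {"Authorization": "Bearer EXAMPLE-TOKEN"}

conn = http.client.HTTPConnection(API_HOST)

# Ask the provider to create a small compute resource.
body = json.dumps({"cores": 2, "memory_gb": 4, "image": "ubuntu-9.10"})
conn.request("POST", "/compute/", body=body,
             headers=dict(AUTH, **{"Content-Type": "application/json"}))
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))  # e.g. 201 and the new resource URL
resp.read()  # drain the body so the connection can be reused

# List the compute resources we own.
conn.request("GET", "/compute/", headers=AUTH)
print(conn.getresponse().read())
conn.close()
```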

OCCI has been undergoing frequent and steady revision (read the latest iteration, version 5, released at the end of last month) and is coming together as a capable standard that is actively supported by Cisco, Sun Microsystems, Eucalyptus, Rackspace, GoGrid, and many other members. OCCI currently has my vote as the first major cloud computing standard that you're most likely going to see in a real-world cloud service near you in the future. What's missing? Support from the major vendors such as Amazon, Google, Microsoft, and Salesforce.

But OCCI is just one of many cloud computing standards. You can view a larger list of the current standards in development at Cloud-standards.org that includes efforts (or some cases, just involvement) from many of the usual suspects including DMTF, ETSI, NIST, OMG, SNIA, OASIS, The Open Group, and the Open Cloud Consortium.

When you combine the Open Virtualization Format along with OCCI you start to get a complete way to describe, deploy, and manage a cloud computing environment and begin to make it easier and practical to switch between providers that support enough of the base set of standards.

In an upcoming post, I'll take a look at the two key questions that will drive the interoperability and openness questions for the near future. These questions are 1) what is the absolute minimum set of standards required to have full open cloud computing portability and 2) what kind of cloud management efforts are emerging, either standards, products, or just practical techniques, that enable cloud interoperability for enterprises today.

How important are cloud computing standards for your cloud efforts? Please comment below.

オープンソース系のクラウドベンダー、Joyent社が中国で事業開始。どうも日本を飛ばして中国進出がトレンドに=>

なりつつあるが、クラウド市場でも明らかにその傾向が高まってきています。 
 
経済成長が著しい中国を最優先市場にする、というビジネス上の理由は明確であるが、その前に閉じた日本市場の課題について今一度見直すべきではないか、と常々考えています。
 

Joyent Launches Commercial Cloud Computing Platform in China

Joyent, Inc., a provider of Enterprise-Class Cloud Computing, today announced the expansion of its Cloud Computing business to China. Joyent becomes China's first Cloud Computing vendor at a time when computing infrastructure is in very high demand. The company's data center is located in the Qinhuangdao Economic and Technological Development Zone (QETDZ), Hebei Province, China.

"China is the world's fastest growing economy and Joyent is there with the country's first local Cloud Computing offering", said David Young, CEO and Founder of Joyent. "This is definitely a very exciting and positive move for Joyent. We would especially like to thank our partners, QETDZ and Intel for their support in making this expansion possible. We are looking forward to providing the Chinese developer community and China's enterprises with world-leading Cloud Computing technology."

"Intel is pleased to announce its strategic technology collaboration with Joyent at the launch of its Cloud Computing platform in China," said Jason Waxman, Intel's General Manager for High Density Computing. "Joyent's innovative infrastructure, based on the Intel Xeon processor 5500 series, will bring both the efficiency and performance that's needed to meet the growing demand for cloud computing in China."

In 1984, the State Council approved the QETDZ as one of China's first state-level economic and technological development zones. It is currently the only development zone in the Hebei Province. During 2008, QETDZ endeavored to develop the data industry and coined the name "Data Valley" for the region.

Joyent is immediately launching its base Public Cloud product, the Joyent Accelerator, and plans to expand its product line over the next two quarters. To find out more go to www.joyent.com.cn

Tuesday, October 27, 2009

Microsoft plans to build a cloud-dedicated data center in Taiwan, targeting Taiwan's PC-related manufacturers =>

with the aim of offering them a cloud environment. The center will reportedly be built jointly with Chunghwa Telecom.
 
It feels like the moat around Japan is gradually being filled in, doesn't it?
Let's hope the data center and cloud worlds don't end up going the Galapagos route as well.

Microsoft to build cloud-computing technology center in Taiwan

Oct. 27, 2009 (China Knowledge) - Microsoft Corporation will sign a memorandum of understanding with the local government in Taiwan on cloud-computing cooperation on Nov. 4, said Woody T.J. Duh, an official of the Industrial Development Bureau, sources reported.

According to the MOU, Microsoft has plans to build a cloud-computing technology center on the island to provide a platform for local hardware and software service suppliers to develop relevant services and technologies.

The move is in line with Microsoft's idea of consolidating terminal computing capabilities and services.

The software giant also hopes that setting up in Taiwan, which has the largest hardware industry supply chain in the world, will help it succeed in the field of cloud-computing as in the field of PCs.

Reportedly, Chunghwa Telecom Co Ltd, the largest telecom and Internet service provider in Taiwan, will also cooperate with Microsoft on cloud computing technologies.



Is this finally the arrival of a Japanese-style cloud? Or is it something a bit different? =>

IIJ announced that it is partnering with Citrix to launch a cloud service.
 
If anyone has detailed information on this, could you please share it?

 

IIJ Group and Citrix Form Partnership for Cloud Services

Mon Oct 26, 2009 1:01am EDT
TOKYO, Oct. 26, 2009 (GLOBE NEWSWIRE) -- Internet Initiative Japan Inc. (IIJ) (Nasdaq:IIJI) (TSE1:3774), one of Japan's leading Internet access and comprehensive network solutions providers, and IIJ Technology Inc. (IIJ-Tech) announced that they will be partnering with Citrix Systems Inc. (Citrix) in order to provide cloud services. IIJ-Tech has concluded a partnership contract through the new Citrix Service Provider (CSP) Program, which was created to offer Citrix software products to IIJ's customers, such as service providers, as well as a Designated Distributor contract. IIJ will offer the IIJ GIO Virtual Desktop Service starting in February 2010, delivering Citrix's virtual solutions on IIJ's cloud service platform, IIJ GIO. This partnership has the following objectives.

1. To provide corporate users with the IIJ GIO Virtual Desktop Service as DaaS
IIJ GIO is a cloud service that runs on the cloud platform built at IIJ data centers that are connected directly to IIJ's broadband backbone network. IIJ GIO enables customers to select freely from a menu of services, such as dedicated or virtual servers, to fit their usage level and to construct the most efficient cloud environment. Combining this with a selection of Citrix virtual solutions, such as Citrix(R) XenApp(TM) and XenDesktop(TM), we will deliver the IIJ GIO Virtual Desktop Service as DaaS (Desktop as a Service), which enables customers to provide virtual desktop environments and virtual applications and to easily outsource all aspects of their virtual desktop environment, from construction to operation.

2. To provide PaaS on IIJ GIO using Xen software licenses as a CSP Program Partner
Under the CSP Program, IIJ-Tech will use the Xen software licenses to provide services using a monthly billing system. Using this program, the IIJ Group will offer systems integrators and value added resellers (VARs) the IIJ GIO platform and Xen software licenses as PaaS.

3. To provide service providers with Xen software licenses as a Designated Distributor under the CSP Program
Designated Distributors under the CSP Program can offer Xen software licenses to service providers using a monthly billing system. As a CSP Program Designated Distributor, IIJ-Tech will offer Xen software licenses to service providers.

The IIJ Group will regularly expand the lineup of IIJ GIO services and provide our customers with the highest quality cloud services to support their business infrastructure.

Citrix(R) XenApp(TM) and XenDesktop(TM) are the registered trademarks and service marks of Citrix Systems, Inc. and its subsidiaries.

About IIJ

Founded in 1992, Internet Initiative Japan Inc. (IIJ) (Nasdaq:IIJI) (TSE1:3774) is one of Japan's leading Internet-access and comprehensive network solutions providers. IIJ and its group of companies provide total network solutions that mainly cater to high-end corporate customers. The company's services include high-quality systems integration and security services, Internet access, hosting/housing, and content design. Moreover, the company has built one of the largest Internet backbone networks in Japan, and between Japan and the United States. IIJ was listed on NASDAQ in 1999 and on the First Section of the Tokyo Stock Exchange in 2006. For more information about IIJ, visit the IIJ Web site at http://www.iij.ad.jp/en/. The Internet Initiative Japan Inc. logo is available at http://www.globenewswire.com/newsroom/prs/?pkgid=4613

About IIJ Technology Inc.

IIJ Technology Inc. (IIJ-Tech) was established in November 1996 to extend IIJ's superior network technology and forward vision by providing system building and operation services that meet customers' expectations. IIJ-Tech builds Internet business systems and corporate information systems, and builds and operates many systems for government and financial institutions, as well as systems for e-commerce, financial transactions, ISP/ICP, extranets, intranets, and ASPs. More information about IIJ-Tech is available at http://www.iij-tech.co.jp/.

The statements within this release contain forward-looking statements about our future plans that involve risk and uncertainty. These statements may differ materially from actual future events or results. Readers are referred to the documents furnished by Internet Initiative Japan Inc. with the SEC, specifically the most recent reports on Forms 20-F and 6-K, which identify important risk factors that could cause actual results to differ from those contained in the forward-looking statements.

CONTACT: Internet Initiative Japan Inc.
         IIJ Corporate Communications
         +81-3-5259-6310
         press@iij.ad.jp
         http://www.iij.ad.jp/en/

One meeting point between cloud and smart grid: the wireless communication devices being developed for =>

the smart grid can be adopted for the various sensor technologies inside the data centers that support the cloud, according to this article. In a large data center, collecting temperature, humidity, airflow, pressure, and similar readings from points throughout the facility and using them to keep the cooling running efficiently has become a major challenge. As the number of sensors grows, however, so does the amount of wiring, and costs rise accordingly. The point of the article is to adopt wireless technology to cut that cost. As it happens, the smart grid industry has become very good at collecting and managing readings wirelessly from enormous numbers of power meters, and the article proposes applying the technology cultivated in that market.

Basically I find this convincing. Whether it is standardized enough to be adopted in practice, though, is something that needs to be watched carefully.

Wireless sensors drive green data centres

Better metrics mean more power efficient servers

Enterprise efforts to consolidate data centres and install virtualisation software are taking a big bite out of the number of power hungry application and storage servers required to support enterprise data. But after taking this critical first step, what else can you do to boost efficiency?

You can move from hatchet to scalpel (to borrow a metaphor from President Obama). In this instance, the reference means that once you've minimised your number of power sucking devices, it's time to precisely monitor and measure data centre environmental metrics, down to the nitty-gritty rack level, so that you know exactly what adjustments are needed to optimise efficiency.

These metrics, of course, are electrical power, heat, airflow, cooling, temperature, humidity and pressure levels. Having visibility into them on a device-by-device basis, a scarce capability today, reveals the degree to which they are in sync with each equipment manufacturer's recommended specifications for optimal operation.
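
As a rough illustration only (not from the article): the sketch below uses invented per-rack readings and invented "recommended" ranges to show what checking device-level metrics against a manufacturer's specifications could look like.

```python
# Hypothetical sketch: flag rack-level readings that fall outside a
# manufacturer's recommended operating ranges. The ranges and readings
# below are invented for illustration, not taken from the article.

RECOMMENDED_RANGES = {          # metric -> (min, max); assumed values
    "temperature_c": (18.0, 27.0),
    "humidity_pct": (40.0, 60.0),
    "airflow_cfm": (400.0, 900.0),
}

def out_of_spec(readings: dict) -> dict:
    """Return the metrics whose values drift outside the recommended band."""
    problems = {}
    for metric, value in readings.items():
        low, high = RECOMMENDED_RANGES.get(metric, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            problems[metric] = (value, (low, high))
    return problems

# Example: one wireless sensor report for a single rack (invented numbers).
rack_a7 = {"temperature_c": 29.5, "humidity_pct": 45.0, "airflow_cfm": 410.0}
print(out_of_spec(rack_a7))   # {'temperature_c': (29.5, (18.0, 27.0))}
```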

"It's hard to improve power and cooling efficiency if you don't know where the waste is in the first place," says Nik Simpson, senior analyst in Burton Group's data centre strategies practice.

Let's face it: it's far easier and less expensive to mount wireless sensors than wired ones. Not needing cabling lets sensors live in many more places, so you can see a more complete and fine-grained lay of the land and make precise, appropriate adjustments. Wireless data centre sensors, sensor networks and associated monitoring and management applications, available from companies such as SynapSense and Arch Rock, are starting to enable these capabilities and could kick off an evolved approach to data centre energy management.

"The smart grid is moving into the data centre, and it is wireless instruments making this possible," asserts Peter Van Deventer, CEO at SynapSense. He estimates that the cost of a wireless sensor is 10 to 20 times less than that of a wired sensor once you figure in the installation cost.

Because of cabling complexities, costs and the need for pricey data centre 10Gbps ports for sensor communication, wired sensors tend to be installed in very few locations. In fact, sometimes sensors are only in the computer room air handler (CRAH). Though some helpful tabulations and assumptions can be made from this data, they don't show the entire efficiency picture.

There are also useful sensor capabilities built directly into some equipment, such as Cisco's EnergyWise solution for monitoring the power levels of Cisco network-connected devices.

One drawback with embedded sensors, though, is that they usually feed measurement data into each manufacturer's own management system, making it complex to correlate, Simpson notes.

The emergence of wireless sensor applications aimed specifically at gathering real-time statistics in many places throughout the data centre should ease the task, though. The applications help maintain compliance with industry standards for Power Usage Effectiveness (PUE) and Data Center Infrastructure Efficiency (DCiE). SynapSense also automates some adjustments for optimisation.
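
For reference, PUE and DCiE are simple ratios over aggregated power readings; here is a minimal sketch of the arithmetic, with invented wattage figures rather than vendor data.

```python
# Minimal sketch of the PUE / DCiE arithmetic referenced above.
# PUE  = total facility power / IT equipment power   (lower is better, >= 1.0)
# DCiE = IT equipment power / total facility power   (expressed as a percentage)
# The kilowatt figures below are invented for illustration.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

def dcie(total_facility_kw: float, it_equipment_kw: float) -> float:
    return 100.0 * it_equipment_kw / total_facility_kw

total_kw = 1200.0   # IT load plus cooling, lighting and power-distribution losses
it_kw = 750.0       # servers, storage and network gear

print(f"PUE  = {pue(total_kw, it_kw):.2f}")    # PUE  = 1.60
print(f"DCiE = {dcie(total_kw, it_kw):.1f}%")  # DCiE = 62.5%
```

The finer-grained the sensor coverage, the closer these ratios reflect what is actually happening rack by rack rather than a single facility-level estimate.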