Posts Tagged ‘ Hosting-as-a-Service ’

Confidence in Cloud Computing Expected to Spur Economic Growth

The dynamic and flexible nature of cloud computing, software-as-a-service and platform-as-a-service may help organizations in their recovery from the current economic downturn, according to more than two-thirds of the IT decision makers who participated in a recent annual study by Vanson Bourne, an international research firm. Vanson Bourne surveyed over 600 IT and business decision makers across the United States, United Kingdom and Singapore. Of the countries sampled, Singapore is leading the shift to the cloud, with 76 percent of responding enterprises using some form of cloud computing. The U.S. follows with 66 percent, and the U.K. with 57 percent.

This two-year study of cloud computing reveals that IT decision makers are very confident in cloud computing’s ability to deliver within budget and offer CapEx savings. Commercial and public sector respondents also predict cloud use will help decrease overall IT budgets by an average of 15 percent, with others expecting savings of as much as 40 percent.

“Scalability, interoperability and pay-as-you-go elasticity are moving many of our clients toward cloud computing,” said Chad Collins, CEO at Nubifer Inc., a strategic Cloud and SaaS consulting firm. “However, it’s important, primarily for our enterprise clients, to work with a Cloud provider that not only delivers cost savings, but also effectively integrates technologies, applications and infrastructure on a global scale.”

A lack of access to IT capacity is clearly labeled as an obstacle to business progress, with 76 percent of business decision makers reporting they have been prevented from developing or piloting projects due to the cost or constraints within IT. For 55 percent of respondents, this remains an issue.

Confidence in cloud continues to trend upward — 96 percent of IT decision makers are as confident or more confident in cloud computing being enterprise ready now than they were in 2009. In addition, 70 percent of IT decision makers are using or plan to be using an enterprise-grade cloud solution within the next two years.

The ability to scale resources up and down in order to manage fluctuating business demand was the most cited benefit influencing cloud adoption in the U.S. (30 percent) and Singapore (42 percent). The top factor driving U.K. adoption is lower total cost of ownership (41 percent).

Security concerns remain a key barrier to cloud adoption, with 52 percent of respondents who do not leverage a cloud solution citing security of sensitive data as a concern. Yet 73 percent of all respondents want cloud providers to fully manage security or to fully manage security while allowing configuration change requests from the client.

Seventy-nine percent of IT decision makers see cloud as a straightforward way to integrate with corporate systems. For more information on how to leverage a cloud solution inside your environment, contact a Nubifer.com representative today.


Four Key Categories for Cloud Computing

When it comes to cloud computing, concerns about control and security have dominated recent discussions. While it was once assumed that all computing resources could be obtained from outside, the vision is now shifting toward a data center transformed for easy connections to both internal and external IT resources.

According to IDC’s Cloud Services Overview report, sales of cloud-related technology are growing at 26 percent per year, six times the rate of IT spending as a whole, although they comprised only about 5 percent of total IT revenue this year. While the report points out that defining what constitutes cloud-related spending is complicated, it estimates that global spending of $17.5 billion on cloud technologies in 2009 will grow to $44.2 billion by 2013. IDC predicts that hybrid or internal clouds will be the norm; even in 2013, only an estimated 10 percent of that spending will go specifically to public clouds.

According to Chris Wolf, analyst at The Burton Group, hybrid cloud infrastructure isn’t that different from existing data-center best practices. The difference is that all of the pieces are meant to fit together using Internet-age interoperability standards as opposed to homegrown kludges.

The following are four items to consider for your “shopping list” when preparing your IT budget for private or public cloud services:

1. Application Integration

Software integration isn’t the first thing most companies consider when building a cloud, although Bernard Golden, CEO at cloud consulting firm HyperStratus and CIO.com blogger, says it is the most important one.

Tom Fisher, vice president of cloud computing at SuccessFactors.com, a business-application SaaS provider in San Mateo, California, says that integration is a whole lot more than simply batch-processing chunks of data traded between applications once or twice per day, as was done in the mainframe era.

Fisher continues to explain that it is critical for companies to be able to provision and manage user identities from a single location across a range of applications, especially when it comes to companies that are new in the software-providing business and do not view their IT as a primary product.

“What you’re looking for is to take your schema and map it to PeopleSoft or another application so you can get more functional integration. You’re passing messages back and forth to each other with proper error-handling agreement so you can be more responsive. It’s still not real time integration, but in most cases you don’t really need that,” says Fisher.
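The schema mapping and error-handling agreement Fisher describes can be sketched in a few lines. This is a minimal, hypothetical illustration: the field names (a PeopleSoft-style layout) and the transport are stand-ins, not a real integration API.

```python
# Map a local user record onto a target application's schema, then hand it
# to a sender with an agreed error-handling contract. Field names are
# illustrative assumptions, not an actual PeopleSoft interface.

FIELD_MAP = {
    "user_id": "EMPLID",
    "full_name": "NAME",
    "department": "DEPTID",
}

def map_to_target_schema(record: dict) -> dict:
    """Translate a local record into the target application's schema."""
    missing = [k for k in FIELD_MAP if k not in record]
    if missing:
        # The "proper error-handling agreement": reject bad data early with
        # a structured error instead of passing it downstream.
        raise ValueError(f"record missing required fields: {missing}")
    return {target: record[source] for source, target in FIELD_MAP.items()}

def send_with_retries(message: dict, transport, max_attempts: int = 3):
    """Deliver a message, retrying transient failures a bounded number of times."""
    for attempt in range(1, max_attempts + 1):
        try:
            return transport(message)
        except ConnectionError:
            if attempt == max_attempts:
                raise

# A fake transport standing in for the remote application's endpoint.
outbox = []
def fake_transport(msg):
    outbox.append(msg)
    return "ack"

payload = map_to_target_schema(
    {"user_id": "1001", "full_name": "A. Smith", "department": "FIN"}
)
status = send_with_retries(payload, fake_transport)
```

As Fisher notes, this is message passing rather than true real-time integration, but for most provisioning and identity-management scenarios it is responsive enough.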

2. Security

The ability to federate—securely connect without completely merging—two networks is a critical factor in building a useful cloud, according to Golden.

According to Nick Popp, VP of product development at Verisign (VRSN), that requires layers of security, including multifactor authentication, identity brokers, access management and sometimes an external service provider who can provide that high a level of administrative control. Verisign is considering adding a cloud-based security service.

Wolf states that it requires technology that doesn’t yet exist: an Information Authority that can act as a central repository for security data and control of applications, data and platforms within the cloud. It is possible to assemble parts of that function out of the components Popp mentions today, yet Wolf maintains that no one technology can span all the platforms necessary to provide real control of even an internally hosted cloud environment.

3. Virtual I/O

One IT manager at a large digital mapping firm states that if you have to squeeze data for a dozen VMs through a few NICs, the scaling of your VM cluster to cloud proportions will be inhibited.

“When you’re in the dev/test stage, having eight or 10 [Gigabit Ethernet] cables per box is an incredible labeling issue; beyond that, forget it. Moving to virtual I/O is a concept shift—you can’t touch most of the connections anymore—but you’re moving stuff across a high-bandwidth backplane and you can reconfigure the SAN connections or the LANs without having to change cables,” says the IT manager.

Virtual I/O servers (like the Xsigo I/O Director servers used by the IT manager’s company) can run 20Gbit/sec through a single cord, with as many as 64 cords connecting to a single server and a backplane carrying a total of 1,560Gbit/sec of bandwidth. The IT manager states that concentrating such a large amount of bandwidth in one device saves space, power and cabling, keeps network performance high and saves money on network gear in the long run.

Speaking about the Xsigo servers, which start at approximately $28,000 through resellers like Dell (DELL), the manager says, “It becomes cost effective pretty quickly. You end up getting three, four times the bandwidth at a quarter the price.”
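The consolidation arithmetic behind those claims is easy to sketch. A minimal illustration using the per-cord figure quoted above (20 Gbit/sec); the 1 Gbit/sec NIC and per-VM bandwidth numbers are assumptions for comparison, not figures from the article:

```python
# Back-of-the-envelope cabling math for virtual I/O consolidation.

VIRTUAL_CORD_GBPS = 20   # one virtual I/O cable (figure quoted above)
GBE_NIC_GBPS = 1         # a single Gigabit Ethernet cable (assumed baseline)

def cables_needed(target_gbps: int, per_cable_gbps: int) -> int:
    """How many physical cables a given bandwidth target requires."""
    return -(-target_gbps // per_cable_gbps)   # ceiling division

# A dozen VMs at an assumed ~2 Gbit/s each: 24 GbE cables, but only 2 virtual cords.
target = 12 * 2
gbe_cables = cables_needed(target, GBE_NIC_GBPS)
virtual_cords = cables_needed(target, VIRTUAL_CORD_GBPS)
```

The labeling and cabling problem the manager describes scales with the first number, not the second, which is the core of the virtual I/O argument.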

4. Storage

Storage remains the weak point of the virtualization and cloud-computing worlds, and the place where the most money is spent.

“Storage is going to continue to be one of the big costs of virtualization. Even if you turn 90 percent of your servers into images, you still have to store them somewhere,” says Golden in summary. Visit Nubifer.com for more information.

Zuora Releases Z-Commerce

The first external service (SaaS) that actually understands the complex billing models of cloud providers (which account for monthly subscription fees as well as automated metering, pricing and billing for products, bundles and highly individualized configurations) arrived in mid-June in the form of Zuora’s Z-Commerce. An upgrade to Zuora’s billing and payment service built for cloud providers, Z-Commerce is a major development. With Z-Commerce, a storage-as-a-service provider is able to charge for terabytes of storage used, IP address usage or data transfer. Cloud providers can also structure a per-CPU-instance charge or a per-application-use charge, and the service can take complexities like peak usage into account. Zuora has provided 20 pre-configured templates for the billing and payment models that cloud providers use.
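The core of this kind of metered, subscription-plus-usage billing can be sketched simply. The meter names and rates below are hypothetical illustrations, not Zuora’s actual templates or pricing:

```python
# A flat monthly subscription plus usage charges rated per unit, the basic
# shape of cloud-provider billing. Rates and meter names are invented.

RATE_CARD = {
    "storage_gb": 0.15,          # per GB-month of storage
    "transfer_gb": 0.10,         # per GB transferred
    "cpu_instance_hours": 0.08,  # per CPU-instance hour
}
MONTHLY_SUBSCRIPTION = 49.00

def monthly_invoice(usage: dict) -> float:
    """Rate each metered quantity against the rate card and add the base fee."""
    metered = sum(RATE_CARD[meter] * qty for meter, qty in usage.items())
    return round(MONTHLY_SUBSCRIPTION + metered, 2)

total = monthly_invoice(
    {"storage_gb": 500, "transfer_gb": 120, "cpu_instance_hours": 1000}
)
# 49 + 75 + 12 + 80 = 216.00
```

Real rating engines add tiering, proration and peak-usage rules on top of this, which is precisely the complexity the pre-configured templates are meant to absorb.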

What makes this development so interesting is that Zuora is citing what they call the “subscription economy” as the underlying rationale for their success: 125 customers, 75 employees and profitability.

Tien Tzuo, the CEO of Zuora (and former Chief Strategy Officer of Salesforce.com), described the subscription economy below:

“The business model of the 21st century is a fundamentally different business model.

The 21st century world needs a whole new set of operational systems — ones that match the customer centric business model that is now necessary to succeed.

The business model of the 20th century was built around manufacturing.  You built products at the lowest possible cost, and you found buyers for those products.

The key metrics were all around inventory, cost of goods sold, product life cycles, etc. But over the last 30 years, we’ve been moving away from a manufacturing economy to a services economy. Away from an economy based on tangible goods, to an economy based on intangible ideas and experiences.

What is important now is the customer — understanding customer needs, and building services & experiences that fulfill those customer needs.  Hence the rise of CRM.

But our financial and operational systems have not yet evolved!  What we need today are operational systems built around the customer, and around the services you offer to your customers.

You need systems that allow you to design different services, offered under different price plans that customers can choose from based on their specific needs.  So the phone companies have 450 minute plans, prepaid plans, unlimited plans, family plans, and more.  Salesforce has Professional Edition, and Enterprise Edition, and Group Edition, and PRM Edition, and more.  Amazon has Amazon Prime.  ZipCar has their Occasional Driving Plan and their Extra Value Plans.

You need systems that track customer lifecycles — things such as monthly customer value, customer lifetime value, customer churn, customer share of wallet, conversion rates, up sell rates, adoption levels.

You need systems that measure how much of your service your customers are consuming.  By the minute?  By the gigabyte?  By the mile?  By the user?  By the view?  And you need to establish an ongoing, recurring billing relationship with your customers, that maps to your ongoing service relationship, that allows you to monetize your customer interactions based on the relationship that the customer opted into.

The 21st century world needs a whole new set of operational systems — ones that match the customer centric business model that is now necessary to succeed.”

To summarize, what he is saying is that the model for future business isn’t the one-time purchase of goods and services, but rather a price paid by the customer for an ongoing relationship with the company. Under this model, the customer is able to structure the relationship in a way that provides them with what they need to accomplish the job(s) the company can help them with (which can be a variety of services, products, tools and structured experiences).

This is also interesting because your business measures the customer’s commitments to you, and yours to them, in operational terms, even as the business model shifts to more interactions than ever before. Traditional CRM metrics like CLV, churn, share of wallet and adoption rates still apply to a business model that has continued to evolve away from pure transactions; Tien is saying that payment/billing is, to him, the financial infrastructure for this new customer-centered economic model (i.e. the subscription model).
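Several of the lifecycle metrics Tzuo lists fall straight out of billing data. A minimal sketch using a common simplification (lifetime value as monthly revenue per customer divided by monthly churn); all figures are illustrative:

```python
# Subscription metrics from billing data. The 1/churn lifetime approximation
# is a standard simplification, not a universal formula.

def monthly_churn_rate(customers_start: int, customers_lost: int) -> float:
    """Fraction of customers lost during the month."""
    return customers_lost / customers_start

def customer_lifetime_value(monthly_revenue_per_customer: float,
                            churn_rate: float) -> float:
    """Expected lifetime (~1/churn months) times monthly revenue."""
    return monthly_revenue_per_customer / churn_rate

# Illustrative figures: 1,000 customers, 20 lost this month, $50/month each.
churn = monthly_churn_rate(customers_start=1000, customers_lost=20)
clv = customer_lifetime_value(monthly_revenue_per_customer=50.0,
                              churn_rate=churn)
# churn is 2% a month, so each customer is worth roughly $2,500 over their lifetime
```

The point of the subscription economy argument is that these numbers, rather than inventory and cost of goods sold, become the operational dashboard.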

Denis Pombriant of Beagle Research Group, LLC commented on this on his blog recently, pointing out that a subscription model does not guarantee a business will be successful. What does have significant bearing on the success or failure of a business is how well the business manages the model or has it managed (i.e. by Zuora).

This can be applied to the subscription economy. Zuora is highlighting what they have predicted: that companies are increasingly moving their business models to subscription-based pricing. This is the same model that supports free software and hardware by charging customers monthly. How it is managed is another can of worms, but for now Zuora has done a service by recognizing that customer-driven companies are realizing that customers are willing to pay for the aggregate capabilities of the company in an ongoing way, as long as the company continues to support the customer’s needs in solving problems that arise. To learn more about cloud computing and the subscription model, contact a Nubifer.com representative.

Don’t Underestimate a Small Start in Cloud Computing

Although many predict that cloud computing will forever alter the economics and strategic direction of corporate IT, it is likely that the impact of the cloud will continue to come largely from small projects. Some users and analysts say that these small projects, which do not require complex, enterprise-class, computing-on-demand services, are the ones to watch.

David Tapper, outsourcing and offshoring analyst for IDC, says, “What we’re seeing is a lot of companies using Google (GOOG) Apps, Salesforce and other SaaS apps, and sometimes platform-as-a-service providers, to support specific applications. A lot of those services are aimed at consumers, but they’re just as relevant in business environments, and they’re starting to make it obvious that a lot of IT functions are generic enough that you don’t need to build them yourself.” New enterprise offerings from Microsoft, such as Microsoft BPOS, have also shown up on the scene with powerful SaaS features to offer businesses.

According to Tapper, the largest representation of mini-cloud computing is small- and mid-sized businesses using commercial versions of Google Mail, Google Apps and similar ad hoc or low-cost cloud-based applications. With that said, larger companies are doing the exact same thing. “Large companies will have users whose data are confidential or who need certain functions, but for most of them, Google Apps is secure enough. We do hear about some very large cloud contracts, so there is serious work going on. They’re not the rule though,” says Tapper.

First Steps into the Cloud

A poll conducted by the Pew Research Center’s Internet & American Life Project found that 71 percent of the “technology stakeholders and critics” believe that most people will do their work from a range of computing devices using Internet-based applications as their primary tools by 2020.

Respondents were picked from technology and analyst companies for their technical savvy and as a whole believe cloud computing will dominate information transactions by the end of the decade. The June report states that cloud computing will be adopted because of its ability to provide new functions quickly, cheaply and from anywhere the user wishes to work.

Chris Wolf, analyst at Gartner, Inc.’s Burton Group, thinks that while this isn’t unreasonable, it may be a little too optimistic. Wolf says that even fairly large companies sometimes use commercial versions of Google Mail or instant messaging, but it is a different story when it comes to applications requiring more fine tuning, porting, communications middleware or other heavy work to run on public clouds, or data that has to be protected and documented.

Says Wolf, “We see a lot of things going to clouds that aren’t particularly sensitive–training workloads, dev and test environments, SaaS apps; we’re starting to hear complaints about things that fall outside of IT completely, like rogue projects on cloud services. Until there are some standards for security and compliance, most enterprises will continue to move pretty slowly putting critical workloads in those environments. Right now all the security providers are rolling their own and it’s up to the security auditors to say if you’re in compliance with whatever rules govern that data.”

Small, focused projects using cloud technologies are becoming more common, in addition to the use of commercial cloud-based services, says Tapper.

For example, Beth Israel Deaconess Hospital in Boston elevated a set of VMware (VMW) physical and virtual servers into a cloud-like environment to create an interface to its patient-records and accounting systems, enabling hundreds of IT-starved physician offices to link up with just a browser.

New York’s Museum of Modern Art started using workgroup-on-demand computing systems from CloudSoft Corp. last year. This allowed the museum to create online workspaces for short-term projects that would otherwise have required real or virtual servers and storage on-site.

In about a decade, cloud computing will make it clear to both IT and business management that some IT functions are just as generic when they’re homegrown as when they’re rented. Says Tapper, “Productivity apps are the same for the people at the top as the people at the bottom. Why buy it and make IT spend 80 percent of its time maintaining essentially generic technology?” Contact Nubifer.com to learn more…

Cloud Computing in 2010

A recent research study by the Pew Internet & American Life Project released on June 11 found that most people expect to “access software applications online and share and access information through the use of remote server networks, rather than depending primarily on tools and information housed on their individual, personal computers” by 2020. This means that the term “cloud computing” will likely be referred to as simply “computing” ten years down the line.

The report points out that we are currently on that path when it comes to social networking, thanks to sites like Twitter and Facebook. We also communicate in the cloud using services like Yahoo Mail and Gmail, shop in the cloud on sites like Amazon and eBay, listen to music in the cloud on Pandora, share pictures in the cloud on Flickr and watch videos on cloud sites like Hulu and YouTube.

The more advanced among us are even using services like Google Docs, Scribd or Docs.com to create, share or store documents in the cloud. With that said, it will be some time before desktop computing falls away completely.

The report says: “Some respondents observed that putting all or most of faith in remotely accessible tools and data puts a lot of trust in the humans and devices controlling the clouds and exercising gatekeeping functions over access to that data. They expressed concerns that cloud dominance by a small number of large firms may constrict the Internet’s openness and its capability to inspire innovation—that people are giving up some degree of choice and control in exchange for streamlined simplicity. A number of people said cloud computing presents difficult security problems and further exposes private information to governments, corporations, thieves, opportunists, and human and machine error.”

For more information on the current state of Cloud Computing, contact Nubifer today.

The Impact of Leveraging a Cloud Delivery Model

In a recent discussion about the positive shift in the cloud computing discourse toward actionable steps as opposed to philosophical rants about definitions, .NET Developer’s Journal issued a list of five things not to do. The first mistake on the list (which also included: 2. assuming server virtualization is enough; 3. not understanding service dependencies; 4. leveraging traditional monitoring; 5. not understanding internal/external costs) was not understanding the business value. Failing to understand the business impact of leveraging a cloud delivery model for a given application or service is a crucial mistake, but it can be avoided.

When evaluating a Cloud delivery option, it is important to first define the service. Consider: is it new to you, or are you considering porting an existing service? If new, there is a lower financial bar to justify a cloud model; the downside is a lack of historical perspective on consumption trends to aid in evaluating financial considerations and performance.

Assuming you choose a new service, the next step is to address why you are looking at Cloud, which may require some to be honest about their reasons. Possible reasons for looking at cloud include: your business requires a highly scalable solution; your data center is out of capacity; you anticipate this to be a short-lived service; you need to collaborate with a business partner on neutral territory; your business has capital constraints.

All of the previously listed reasons are good reasons to consider a Cloud option. Yet if you are considering this option because it takes weeks, or even months, to get a new server into production; your operations team lacks credibility when it comes to maintaining a highly available service; or your internal cost allocation models are appalling—you may need to reconsider. In these cases, there may be some in-house improvements to make before exploring a Cloud option.

An important lesson to consider is that just because you can do something doesn’t mean you necessarily should, and this is easily applicable in this situation. Many firms have had disastrous results in the past when they exposed legacy internal applications to the Internet. The following questions must be answered when thinking about moving applications/services to the Cloud:

· Does the application consume or generate data with jurisdictional requirements?

· Will your company face fines or a public relations scandal if there is a security breach/data loss?

· What part of your business value chain is exposed if the service runs poorly? (And are there critical systems that rely on it?)

· What if the application/service doesn’t run at all? (Will you be left stranded, or are there alternatives that will allow the business to remain functioning?)

Embracing Cloud services—public or private—comes with tremendous benefits, yet a constant dialogue about the business value of the service in question is required to reap the rewards. To discuss the benefits of adopting a hybrid On-Prem/Cloud solution contact Nubifer today.

What Cloud APIs Reveal about the Budding Cloud Market

Although Cloud Computing remains hard to define, one of its essential characteristics is programmatic access to virtually unlimited network, compute and storage resources. The foundation of a cloud is a solid Application Programming Interface (API), despite the fact that many users access cloud computing through consoles and third-party applications.

CloudSwitch works with several cloud providers and thus is able to interact with a variety of cloud APIs (both active and about-to-be-released versions). CloudSwitch has come up with some impressions after working with both the APIs and those implementing them.

First, clouds remain different in spite of constant discussion about standards. Cloud APIs have to cover more than start/stop/delete a server, and once the API crosses into provisioning the infrastructure (network ranges, storage capacity, geography, accounts, etc.), it all starts to get interesting.

Second, a very strong infrastructure is required for a cloud to function as it should; in the case of public clouds, the infrastructure must be good enough to sell to others. Key elements of the cloud API can tell you about the infrastructure, the tradeoffs the cloud provider has made and their impact on end users, if you know what to look for.

Third, APIs are evolving fast, like cloud capabilities themselves. New API calls and expansions of existing functions arrive as cloud providers add new capabilities and features. Often we are discussing on-the-horizon services with cloud providers and what form their APIs are poised to take. This is a perfect opportunity to leverage the experience and work of companies like CloudSwitch as a means to integrate these new capabilities into a coherent data model.

When you look at the functions beyond simple virtual machine control, an API can give you an indication of what is happening in the cloud. Some like to take a peek at the network and storage APIs in order to understand how the cloud is built. Take Amazon, for example. In Amazon, the base network design is that each virtual server receives both a public and private IP address. These addresses are assigned from a pool based on the location of the machine within the infrastructure. Even though there are two IP addresses, the public one is simply routed (NAT’ed) to the private address. With Amazon you have only a single network interface on your server, which is a simple and scalable architecture for the cloud provider to support. This design will cause problems for applications requiring at least two NICs, such as some cluster applications.
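The Amazon-style addressing scheme can be captured in a toy model: each new server draws a private address from a shared pool, receives a public address that is simply NAT’ed to it, and exposes exactly one interface. This is purely illustrative Python, not the EC2 API; the address ranges are documentation/example ranges.

```python
import itertools

class AmazonStyleCloud:
    """Toy model: pooled addressing, public-to-private NAT, one NIC per server."""

    def __init__(self):
        self._private = (f"10.0.0.{i}" for i in itertools.count(1))
        self._public = (f"203.0.113.{i}" for i in itertools.count(1))
        self.nat = {}   # public address -> private address

    def launch_server(self):
        private_ip = next(self._private)
        public_ip = next(self._public)
        self.nat[public_ip] = private_ip
        # Exactly one NIC: apps needing two (e.g. some clusters) break here.
        return {"nics": [private_ip], "public_ip": public_ip}

cloud = AmazonStyleCloud()
server = cloud.launch_server()
# The server sees only its private address; the public one is a NAT entry.
```

The single-NIC constraint falls directly out of this design, which is why it surfaces in the API rather than as a configurable option.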

Terremark’s cloud offering is in stark contrast to Amazon’s. IP addresses are defined by the provider so they can route traffic to your servers, as with Amazon, but Terremark allocates a range for your use when you first sign up (while Amazon draws from a generic pool of addresses). This can be seen as a positive, because you have better control over the assignment of network addresses, but the flip side is potential scaling issues, because you have only a limited number of addresses to work with. Additionally, you can assign up to four NICs to each server in Terremark’s Enterprise cloud, which allows you to create more complex network topologies and support applications requiring multiple networks for proper operation.

One important thing to consider is that with the Terremark model, servers have only internal addresses. There is no default public NAT address for each server, as with Amazon. Instead, Terremark provides a front-end load balancer that connects a public IP address to a specified set of servers by protocol and port. You must first create an “Internal Service” (in Terremark’s terminology) that defines a public IP/port/protocol combination. Next, assign a server and port to the Service, which creates a connection. Since this is a load balancer, you can add more than one server to each public IP/port/protocol group. Amazon has a load balancer function as well; although it isn’t required to connect public addresses to your cloud servers, it does support connecting multiple servers to a single public IP address.
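The same toy-model treatment shows the Terremark-style contrast: a finite address range allocated at sign-up, internal-only server addresses, up to four NICs, and an “Internal Service” tying a public IP/port/protocol to one or more servers. Again, this is an illustrative sketch, not Terremark’s actual API; the addresses are example ranges.

```python
class TerremarkStyleCloud:
    """Toy model: sign-up address range, multi-NIC servers, LB-fronted services."""

    def __init__(self, allocated_range):
        self._pool = list(allocated_range)   # finite range allocated at sign-up
        self.services = {}                   # (public_ip, port, proto) -> servers

    def launch_server(self, nic_count=1):
        if nic_count > 4:
            raise ValueError("at most four NICs per server")
        if len(self._pool) < nic_count:
            raise RuntimeError("allocated address range exhausted")
        # Internal addresses only; no default public NAT address.
        return {"nics": [self._pool.pop(0) for _ in range(nic_count)]}

    def create_internal_service(self, public_ip, port, proto):
        """Define a public IP/port/protocol endpoint on the load balancer."""
        self.services[(public_ip, port, proto)] = []

    def add_server_to_service(self, public_ip, port, proto, server):
        # Load balancer: multiple servers may back one public endpoint.
        self.services[(public_ip, port, proto)].append(server)

cloud = TerremarkStyleCloud(f"192.168.1.{i}" for i in range(10, 14))
web1 = cloud.launch_server(nic_count=2)           # two internal addresses
cloud.create_internal_service("198.51.100.7", 443, "tcp")
cloud.add_server_to_service("198.51.100.7", 443, "tcp", web1)
```

Note how the finite pool makes the scaling tradeoff concrete: once the allocated range is exhausted, no more NICs can be provisioned without requesting more addresses.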

When it comes down to it, the APIs and the feature sets they define tell a lot about the capabilities and design of a cloud infrastructure. The end user features, flexibility and scalability of the whole service will be impacted by decisions made at the infrastructure level (such as network address allocation, virtual device support and load balancers). It is important to look down to the API level when considering what cloud environment you want because it helps you to better understand how the cloud providers’ infrastructure decisions will impact your deployments.

Although building a cloud is complicated, it can provide a powerful resource when implemented correctly. Clouds with different “sweet spots” emerge as cloud providers choose key components and a base architecture for their service. You can span these different clouds and put the right application in the right environment with CloudSwitch. To schedule a time to discuss how Cloud Computing can help your enterprise, contact Nubifer today.