Posts Tagged ‘Hybrid Cloud and On-Premise’

Emerging Trends in Cloud Computing

Due to its reputation as a game-changing technology set, Cloud Computing is a hot topic when discussing emerging technology trends. The National Institute of Standards and Technology (NIST) defines Cloud Computing as “a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.”

IT optimization has largely been the reason for the early adoption of Cloud Computing in “Global 2000” enterprises, with the early drivers being cost savings and faster infrastructure provisioning. A December 2009 Forrester report indicated that over 70% of IT budgets are spent on maintaining current IT infrastructure rather than adding new capabilities. Because of this, organizations are seeking to adopt a Cloud Computing model for their enterprise applications in order to better utilize their infrastructure investments.

Several such organizations currently have data center consolidation and virtualization initiatives underway and look to Cloud Computing as a natural progression of those initiatives. Enterprise private cloud solutions add capabilities such as self-service, automation and chargeback on top of the virtualized infrastructure, making infrastructure provisioning quicker and helping to improve overall utilization. Additionally, some of these organizations have begun trying public cloud solutions as a new infrastructure sourcing option.

IT spending at “Global 2000” enterprises makes up less than 5% of their revenues, so optimizing IT alone won’t significantly move their top or bottom line. In the current economic climate, IT optimization is a good reason for these large enterprises to begin looking at Cloud Computing. So what is the true “disruptive” potential of Cloud Computing? It lies in the way it will aid these large enterprises in reinventing themselves and their business models in order to rise to the challenge of an evolving business landscape.

Social Networking Clouds and e-Commerce

Worldwide e-Commerce transactions will be worth over $16 trillion by 2013, and by 2012 over 50% of all adult Internet users in the U.S. will be using social networks. Currently, 49% of web users make a purchase based on a recommendation gleaned from social media. This increased adoption of social media makes it easier for consumers to remain connected and get opinions on products and services. Essentially, the consumer has already made up their mind about a product before even getting to the website or store. This is causing major changes in consumer marketing and B2C business models. The relationship used to be between the enterprise and the consumer, but it has now deepened to encompass the consumer’s community.

Large enterprises can’t afford to have mere “websites” or “brick-and-mortar stores” any longer if they want to remain relevant and ensure customer loyalty—they need to provide cloud-hosted online platforms that constantly engage consumers along with their social communities, so that the enterprise’s business services become part of consumers’ day-to-day lives. When Gen Y consumers reach the market, for example, “community driven” social commerce may well replace traditional “website based” e-commerce. Enterprises need to begin building such next-generation, industry-specific service platforms for the domains they operate in, in anticipation of this.

Computing’s Pervasiveness

One half of the world’s population—roughly 3.3 billion people—has active mobile devices, and the increased use of these handheld devices is altering consumers’ expectations about the availability of services. Consumers expect products and services to be available whenever they need them, wherever they are, through innovative applications, the kinds of applications that can be better delivered through the cloud model.

The number of smart devices is expected to reach one trillion by 2011, due to increasing adoption of technologies like wireless sensors, wearable computing, RFIDs and more. This will lead to significant changes in the way consumers use technology, as future consumers will be used to (and be expecting) more intelligent products and services such as intelligent buildings that conserve energy and intelligent transportation systems that can make decisions based on real-time traffic information. An entirely new set of innovative products and services based on such pervasive computing will need to be created for the future generation.

Service providers will look to increase customer loyalty by providing more offerings, better services and deeper relationships as products and services become commoditized. Several industry leaders are increasingly adopting open innovation models, thereby creating business clouds supported by an ecosystem of partners, in order to broaden their portfolio of offerings and innovate faster. A new generation of applications must be created as Cloud Computing becomes more pervasive with the increased adoption of smart devices.

To gain a competitive edge, reduce CAPEX on infrastructure and maintenance, and take advantage of powerful SaaS technologies offered in the cloud, companies need to build next-generation business cloud platforms to better manage the scale of information.

To learn more about Cloud Computing and how companies can adopt and interoperate with the cloud, visit Nubifer.com.

Understanding the Cloud with Nubifer Inc. CTO, Henry Chan

The overwhelming majority of cloud computing platforms consist of dependable services relayed via data centers built on servers with varying tiers of virtualization capabilities. These services are accessible from anywhere with access to the networking platform. Clouds often appear as single points of access for all of a subscriber’s enterprise computing needs. Commercial cloud platform offerings generally adhere to customers’ quality of service (QoS) requirements and typically offer service level agreements. Open standards are crucial to the expansion and acceptance of cloud computing, and open source software has laid the groundwork for many cloud platform implementations.

The following is what Nubifer Inc. CTO Henry Chan recently described as his summarized view of what cloud computing means, its benefits and where it is heading in the future:

Cloud computing explained:

The “cloud” in cloud computing refers to your network’s Internet connection. Cloud computing is essentially using the Internet to perform tasks such as email hosting, data storage and document sharing that were traditionally hosted on-premise.

Understanding the benefits of cloud computing:

Cloud computing’s myriad benefits depend on your organizational infrastructure needs. If your enterprise shares a large number of applications between several office locations, it would be beneficial to store those apps on a virtual server. Web-based application hosting can also save time for traveling employees who can’t connect back to the office, because they can access everything over a virtual private network (VPN).

Examples of cloud computing:

Hosted email (such as Gmail or Hotmail), online data backup, online data storage, any Software-as-a-Service (SaaS) application (such as a cloud-hosted CRM from vendors like Salesforce, Zoho or Microsoft Dynamics), and accounting applications are all examples of applications that can be hosted in the cloud. By hosting these applications in the cloud, your business can benefit from the interoperability and scalability that cloud computing and SaaS services offer.

Safety in the cloud:

Although there are some concerns over the safety of cloud computing, the reality is that data stored in the cloud can be just as secure as the vast majority of data stored on your internal servers. The key is to implement the necessary solutions to ensure that the proper level of encryption is applied to your data while traveling to and from your cloud storage container, as well as while being stored. When designed properly, this can be as safe as any solution you could implement locally. The leading cloud vendors all currently maintain compliance with Sarbanes-Oxley, SAS 70, FISMA and HIPAA.
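For the in-transit half of that encryption, for example, a client can refuse to talk to a storage endpoint without certificate-verified TLS. A minimal Python sketch of that idea (the storage hostname is hypothetical, and at-rest encryption is a separate step handled by you or your vendor):

```python
import ssl

# Require certificate-verified TLS for any connection to cloud storage,
# so data is encrypted in transit and the endpoint's identity is checked.
context = ssl.create_default_context()
context.check_hostname = True            # reject certificates whose name doesn't match
context.verify_mode = ssl.CERT_REQUIRED  # reject peers without a valid certificate

# The context would then be handed to your HTTP client, for example:
# conn = http.client.HTTPSConnection("storage.example.com", context=context)
```

A context configured this way simply refuses the connection if the transport cannot be secured, rather than silently falling back to plaintext.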

Cloud computing for your enterprise:

To determine which layer of cloud computing is optimally suited for your organization, it is important to thoroughly evaluate your organizational goals as they relate to your IT ecosystem. Examine how you currently use technology, your current challenges with technology, how your organization will evolve technologically in the years to come, and what scalability and interoperability will be required going forward. After a careful gap analysis of these determinants, you can decide which types of cloud-based solutions are optimally suited for your organizational architecture.

Cloud computing, a hybrid solution:

The overwhelming trend in 2010 and 2011 is to move non-sensitive data and applications into the cloud while keeping trade secrets behind your enterprise firewall, as many organizations are not comfortable hosting all their applications and hardware in the cloud. The trick to making cloud computing work for your business is to understand which applications should be kept local and which would benefit most from leveraging the scalability and interoperability of the cloud ecosystem.

Will data be shared with other companies if it is hosted in the cloud?

Short answer: NO! Reputable SaaS and cloud vendors will make sure that your data is properly segmented according to the requirements of your industry.

Costs of cloud computing:

Leading cloud-based solutions charge a monthly fee for application usage and data storage, but you may be paying comparable amounts already, primarily in the form of hardware maintenance and software licensing fees—some of which could be eliminated by moving to the cloud.

Cloud computing makes it easy for your company’s Human Resources software, payroll and CRM to commingle with your existing financial data, supply chain management and operations installation, while simultaneously reducing your capital requirements for these systems. Contact a Nubifer representative today to discover how leveraging the power of cloud computing can help your business excel.

Confidence in Cloud Computing Expected to Spur Economic Growth

The dynamic and flexible nature of cloud computing, Software-as-a-Service and Platform-as-a-Service may help organizations in their recovery from the current economic downturn, according to more than two-thirds of the IT decision makers who participated in a recent annual study by Vanson Bourne, an international research firm. Vanson Bourne surveyed over 600 IT and business decision makers across the United States, United Kingdom and Singapore. Of the countries sampled, Singapore is leading the shift to the cloud, with 76 percent of responding enterprises using some form of cloud computing. The U.S. follows with 66 percent, with the U.K. at 57 percent.

This two-year study reveals that IT decision makers are very confident in cloud computing’s ability to deliver within budget and offer CapEx savings. Commercial and public sector respondents also predict cloud use will help decrease overall IT budgets by an average of 15 percent, with some expecting savings of as much as 40 percent.

“Scalability, interoperability and pay-as-you-go elasticity are moving many of our clients toward cloud computing,” said Chad Collins, CEO at Nubifer Inc., a strategic Cloud and SaaS consulting firm. “However, it’s important, primarily for our enterprise clients, to work with a Cloud provider that not only delivers cost savings, but also effectively integrates technologies, applications and infrastructure on a global scale.”

A lack of access to IT capacity is clearly labeled as an obstacle to business progress, with 76 percent of business decision makers reporting they have been prevented from developing or piloting projects due to the cost or constraints within IT. For 55 percent of respondents, this remains an issue.

Confidence in cloud continues to trend upward — 96 percent of IT decision makers are as confident or more confident in cloud computing being enterprise ready now than they were in 2009. In addition, 70 percent of IT decision makers are using or plan to be using an enterprise-grade cloud solution within the next two years.

The ability to scale resources up and down in order to manage fluctuating business demand was the most cited benefit influencing cloud adoption in the U.S. (30 percent) and Singapore (42 percent). The top factor driving U.K. adoption is lower cost of total ownership (41 percent).

Security concerns remain a key barrier to cloud adoption, with 52 percent of respondents who do not leverage a cloud solution citing security of sensitive data as a concern. Yet 73 percent of all respondents want cloud providers to fully manage security or to fully manage security while allowing configuration change requests from the client.

Seventy-nine percent of IT decision makers see cloud as a straightforward way to integrate with corporate systems. For more information on how to leverage a cloud solution inside your environment, contact a Nubifer.com representative today.

Four Key Categories for Cloud Computing

When it comes to cloud computing, concerns about control and security have dominated recent discussions. Where it was once assumed that all computing resources could be sourced from outside, the vision now is of a data center transformed for easy connections to both internal and external IT resources.

According to IDC’s Cloud Services Overview report, sales of cloud-related technology are growing at 26 percent per year. That is six times the rate of IT spending as a whole, although cloud comprised only about 5 percent of total IT revenue this year. While the report points out that defining what constitutes cloud-related spending is complicated, it estimates that global spending of $17.5 billion on cloud technologies in 2009 will grow to $44.2 billion by 2013. IDC predicts that hybrid or internal clouds will be the norm; even in 2013, only an estimated 10 percent of that spending will go specifically to public clouds.
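As a sanity check, the 26 percent growth figure is consistent with IDC’s dollar estimates. A quick back-of-the-envelope calculation in Python, using only the numbers quoted above:

```python
# Check the compound annual growth rate (CAGR) implied by IDC's estimates:
# $17.5B of cloud spending in 2009 growing to $44.2B in 2013.
def implied_cagr(start: float, end: float, years: int) -> float:
    """Return the compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

growth = implied_cagr(17.5, 44.2, 2013 - 2009)
print(f"Implied CAGR: {growth:.1%}")  # roughly 26%, matching the report
```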

According to Chris Wolf, analyst at The Burton Group, hybrid cloud infrastructure isn’t that different from existing data-center best practices. The difference is that all of the pieces are meant to fit together using Internet-age interoperability standards rather than homegrown kludges.

The following are four items to consider for your “shopping list” when preparing your IT budget for private or public cloud services:

1. Application Integration

Software integration isn’t the first thing most companies consider when building a cloud, although Bernard Golden, CEO at cloud consulting firm HyperStratus and CIO.com blogger, says it is the most important one.

Tom Fisher, vice president of cloud computing at SuccessFactors.com, a business-application SaaS provider in San Mateo, California, says that integration is a whole lot more than simply batch-processing chunks of data traded between applications once or twice per day, as was done in the mainframe era.

Fisher explains that it is critical for companies to be able to provision and manage user identities from a single location across a range of applications, especially for companies that are new to the software-providing business and do not view IT as a primary product.

“What you’re looking for is to take your schema and map it to PeopleSoft or another application so you can get more functional integration. You’re passing messages back and forth to each other with proper error-handling agreement so you can be more responsive. It’s still not real time integration, but in most cases you don’t really need that,” says Fisher.
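The message-passing-with-error-handling pattern Fisher describes can be sketched in a few lines. This is an illustrative stand-in, not SuccessFactors’ actual integration: the `deliver` function, the flaky endpoint and the retry policy are all hypothetical.

```python
import time

# Minimal sketch of message-based integration with an error-handling
# agreement: the sender retries transient failures a bounded number of
# times, then surfaces the error. `send` is a stand-in for whatever API
# call or message bus the applications actually use.
def deliver(message, send, retries=3, backoff=0.05):
    """Attempt delivery, retrying transient errors with linear backoff."""
    for attempt in range(1, retries + 1):
        try:
            return send(message)
        except ConnectionError:
            if attempt == retries:
                raise  # per the agreement: give up after `retries` tries
            time.sleep(backoff * attempt)

# Simulated flaky endpoint: fails twice, then acknowledges.
calls = {"count": 0}
def flaky_endpoint(message):
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("transient network error")
    return "ack"

print(deliver({"order_id": 42}, flaky_endpoint))  # "ack", after two retries
```

The point is the agreement, not the transport: both sides know how many retries to expect and what happens when they are exhausted, which is what makes the integration “more responsive” than overnight batch exchange.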

2. Security

The ability to federate—securely connect without completely merging—two networks is a critical factor in building a useful cloud, according to Golden.

According to Nick Popp, VP of product development at Verisign (VRSN), that requires layers of security, including multifactor authentication, identity brokers, access management and sometimes an external service provider who can provide that high a level of administrative control. Verisign is considering adding a cloud-based security service.

Wolf states that this requires technology that doesn’t yet exist: an Information Authority that can act as a central repository for security data and control of applications, data and platforms within the cloud. It is possible to assemble that function today out of some of the pieces Popp mentions, yet Wolf maintains that no one technology can span all the platforms necessary to provide real control of even an internally hosted cloud environment.

3. Virtual I/O

One IT manager at a large digital mapping firm states that if you have to squeeze data for a dozen VMs through a few NICs, the scaling of your VM cluster to cloud proportions will be inhibited.

“When you’re in the dev/test stage, having eight or 10 [Gigabit Ethernet] cables per box is an incredible labeling issue; beyond that, forget it. Moving to virtual I/O is a concept shift—you can’t touch most of the connections anymore—but you’re moving stuff across a high-bandwidth backplane and you can reconfigure the SAN connections or the LANs without having to change cables,” says the IT manager.

Virtual I/O servers (like the Xsigo I/O Director servers used by the IT manager’s company) can run 20Gbit/sec through a single cord and as many as 64 cords to a single server, connecting to a backplane with a total of 1,560Gbit/sec of bandwidth. The IT manager states that concentrating such a large amount of bandwidth in one device saves space, power and cabling, keeps network performance high, and saves money on network gear in the long run.

Speaking about the Xsigo servers, which start at approximately $28,000 through resellers like Dell (DELL), the manager says, “It becomes cost effective pretty quickly. You end up getting three, four times the bandwidth at a quarter the price.”

4. Storage

Storage remains the weak point of the virtualization and cloud-computing worlds, and the place where the most money is spent.

“Storage is going to continue to be one of the big costs of virtualization. Even if you turn 90 percent of your servers into images, you still have to store them somewhere,” says Golden in summary. Visit Nubifer.com for more information.

The Future of Enterprise Software in the Cloud

Although there is currently a lot of discussion regarding the impact that cloud computing and Software-as-a-Service will have on enterprise software, it comes mainly from a financial standpoint. It is now time to begin understanding how enterprise software as we know it will evolve across a federated set of private and public cloud services.

The strategic direction being taken by Epicor is a prime example of where enterprise software is heading. A provider of ERP software for the mid-market, Epicor is taking a sophisticated approach by allowing customers to host some components of the Epicor suite on-premise rather than focusing solely on hosting software in the cloud. Other components are delivered as a service.

Epicor is a Microsoft software partner that subscribes to the Software Plus Services mantra and as such is moving to offer some elements of its software, like the Web server and SQL server components, as an optional service. Customers would be able to invoke this on the Microsoft Azure cloud computing platform.

Basically, Epicor is going to let customers deploy software components where they make the most sense, based on the needs of customers on an individual basis. This is in contrast to proclaiming that one model of software delivery is better than another model.

Eventually, every customer is going to require a mixed environment, even those that prefer on-premise software, because they will discover that hosting some applications locally and others in the cloud makes it easier to run a global operation 24 hours a day, 7 days a week.

Much of the argument over how software is delivered in the enterprise will melt away as customers begin to view the cloud as merely an extension of their internal IT operations. To learn more about how the future of software in the cloud can aid your enterprise, schedule a discussion with a Nubifer consultant today.

App Engine and VMware Plans Show Google’s Enterprise Focus

Google opened its Google I/O developer conference in San Francisco on May 19 with the announcement of its new version of the Google App Engine, Google App Engine for Business. This was a strategic announcement, as it shows Google is focused on demonstrating its enterprise chops. Google also highlighted its partnership with VMware to bring enterprise Java developers to the cloud.

Vic Gundotra, vice president of engineering at Google said via a blog post: “… we’re announcing Google App Engine for Business, which offers new features that enable companies to build internal applications on the same reliable, scalable and secure infrastructure that we at Google use for our own apps. For greater cloud portability, we’re also teaming up with VMware to make it easier for companies to build rich web apps and deploy them to the cloud of their choice or on-premise. In just one click, users of the new versions of SpringSource Tool Suite and Google Web Toolkit can deploy their application to Google App Engine for Business, a VMware environment or other infrastructure, such as Amazon EC2.”

Enterprise organizations can build and maintain their own applications on the same scalable infrastructure that powers Google Applications with Google App Engine for Business. Additionally, Google App Engine for Business has added management and support features tailored for each unique enterprise. New capabilities with this platform include: the ability to manage all the apps in an organization in one place; premium developer support; simple pricing based on users and applications; a 99.9 percent uptime service-level agreement (SLA); and access to premium features such as cloud-based SQL and SSL (coming later this year).

Kevin Gibbs, technical lead and manager of the Google App Engine project said during the May 18 Google I/O keynote that “managing all the apps at your company” is a prevalent issue for enterprise Web developers. Google sought to address this concern through its Google App Engine hosting platform but discovered it needed to shore it up to support enterprises. Said Gibbs, “Google App Engine for Business is built from the ground up around solving the problems that enterprises face.”

Eric Tholome, product management director for developer technology at Google, told eWEEK that Google App Engine for Business allows developers to use standards-based technology, such as Java, the Eclipse IDE, the Google Web Toolkit (GWT) and Python, to create applications that run on the platform. Google App Engine for Business also delivers dynamic scaling, flat-rate pricing and consistent availability to users.

Gibbs revealed that Google will be doling out the features in Google App Engine for Business throughout the rest of 2010, with Google’s May 19 announcement acting as a preview of the platform. The platform includes an Enterprise Administration Console, a company-based console which allows users to see, manage and set security policies for all applications in their domain. The company’s road map states that features like support, the SLA, billing, hosted SQL and custom domain SSL will come at a later date.

Gibbs said that pricing for Google App Engine for Business will be $8 per month per user for each application with the maximum being $1,000 per application per month.
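Based on the figures Gibbs quoted, a per-application monthly bill can be sketched as follows. This is a hypothetical calculation for illustration; actual billing terms would be defined by Google:

```python
def app_engine_monthly_cost(users: int, rate: float = 8.0,
                            cap: float = 1000.0) -> float:
    """Per-application monthly cost: $8 per user, capped at $1,000 per app."""
    return min(users * rate, cap)

print(app_engine_monthly_cost(50))   # 400.0: 50 users at $8, below the cap
print(app_engine_monthly_cost(500))  # 1000.0: the per-application cap applies
```

In other words, the cap means that any application with more than 125 users effectively costs a flat $1,000 per month.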

Google also announced a series of technology collaborations with VMware. The goal of these is to deliver solutions that make enterprise software developers more efficient at building, deploying and managing applications within all types of cloud environments.

President and CEO of VMware Paul Maritz said, “Companies are actively looking to move toward cloud computing. They are certainly attracted by the economic advantages associated with cloud, but increasingly are focused on the business agility and innovation promised by cloud computing. VMware and Google are aligning to reassure our mutual customers that choice and portability are of utmost importance to both companies. We will work to ensure that modern applications can run smoothly within the firewalls of a company’s data center or out in the public cloud environment.”

Google is essentially trying to pick up speed in the enterprise with Java developers using the popular Spring Framework (stemming from VMware’s SpringSource division). Recently, VMware formed a similar partnership with Salesforce.com.

Maritz continued to say to the audience at Google I/O, “More than half of the new lines of Java code written are written in the context of Spring. We’re providing the back-end to add to what Google provides on the front end. We have integrated the Spring Framework with Google Web Toolkit to offer an end-to-end environment.”

Google and VMware are teaming up in multiple ways to make cloud applications more productive, portable and flexible. These collaborations will enable Java developers to build rich Web applications, use Google and VMware performance tools on cloud apps and subsequently deploy Spring Java applications on Google App Engine.

Google’s Gundotra explained, “Developers are looking for faster ways to build and run great Web applications, and businesses want platforms that are open and flexible. By working with VMware to bring cloud portability to the enterprise, we are making it easy for developers to deploy rich Java applications in the environments of their choice.”

Google’s support for Spring Java apps on Google App Engine is part of a shared vision to make building, running and managing cloud applications easier, in a way that renders the applications portable across clouds. Developers can build applications with the Eclipse-based SpringSource Tool Suite and have the flexibility to deploy them in their current private VMware vSphere environment, in VMware vCloud partner clouds or directly to Google App Engine.

Google and VMware are also collaborating to combine the speed of development of Spring Roo–a next-generation rapid application development tool–with the power of the Google Web Toolkit to create rich browser apps. These GWT-powered applications can create a compelling end-user experience on computers and smartphones by leveraging modern browser technologies like HTML5 and AJAX.

With the goal of enabling end-to-end performance visibility of cloud applications built using Spring and Google Web Toolkit, the companies are collaborating to more tightly integrate VMware’s Spring Insight performance tracing technology within the SpringSource tc Server application server with Google’s Speed Tracer technology.

Speaking about the Google/VMware partnership, vice president at Nucleus Research Rebecca Wettemann told eWEEK, “In short, this is a necessary step for Google to stay relevant in the enterprise cloud space. One concern we have heard from those who have been slow to adopt the cloud is being ‘trapped on a proprietary platform.’ This enables developers to use existing skills to build and deploy cloud apps and then take advantage of the economies of the cloud. Obviously, this is similar to Salesforce.com’s recent announcement about its partnership with VMware–we’ll be watching to see how enterprises adopt both. To date, Salesforce.com has been better at getting enterprise developers to develop business apps for its cloud platform.”

For his part, Frank Gillett, an analyst with Forrester Research, describes the Google/VMware partnership as “revolutionary” and the Salesforce.com/VMware partnership to create VMforce as “evolutionary.”

“Java developers now have a full Platform-as-a-Service [PaaS] place to go rather than have to provide that platform for themselves,” said Gillett of the new Google/VMware partnership. He added, however, “What’s interesting is that IBM, Oracle and SAP have not come out with their own Java cloud platforms. I think we’ll see VMware make another deal or two with other service providers. And we’ll see more enterprise-application-focused offerings from Oracle, SAP and IBM.”

Google’s recent enterprise moves show that the company is set on gaining more of the enterprise market by enabling enterprise organizations to buy applications from others through the Google Apps Marketplace (and the recently announced Chrome Web Store), buy from Google with Google Apps for Business or build their own enterprise applications with Google App Engine for Business. Nubifer Inc. is a leading research and consulting firm specializing in Cloud Computing and Software-as-a-Service.

Transforming Into a Service-Centric IT Organization By Using the Cloud

While IT executives typically approach cloud services from the perspective of how they are delivered, this model neglects what cloud services are and how they are consumed. These two facets can have a large impact on the overall IT organization, points out eWeek Knowledge Center contributor Keith Jahn. Jahn maintains that it is very important for IT executives to move away from the current delivery-only focus and create a world-class supply chain for managing the supply and demand of cloud services.

Using the popular fable The Sky Is Falling, known lovingly as Chicken Little, Jahn explains a possible future scenario that IT organizations may face due to cloud computing. As the fable goes, Chicken Little embarks on a life-threatening journey to warn the king that the sky is falling and on this journey she gathers friends who join her on her quest. Eventually, the group encounters a sly fox who tricks them into thinking that he has a better path to help them reach the king. The tale can end one of two ways: the fox eats the gullible animals (thus communicating the lesson “Don’t believe everything you hear”) or the king’s hunting dogs can save the day (thus teaching a lesson about courage and perseverance).

So what does this have to do with cloud computing? Cloud computing has the capacity to bring on a scenario that will force IT organizations to change, or possibly be eliminated altogether. The entire technology supply chain will be severely impacted if IT organizations are wiped out. Traditionally, cloud is viewed as a technology disruption and assessed from a delivery orientation, posing questions like: how can this new technology deliver solutions cheaper, better and faster? An equally important yet often ignored aspect of this equation is how cloud services are consumed. Cloud services are ready to run, self-sourced, available wherever you are, and pay-as-you-go or subscription based.

New capabilities will emerge as cloud services grow and mature and organizations must be able to solve new problems as they arise. Organizations will also be able to solve old problems cheaper, better and faster. New business models will be ushered in by cloud services and these new business models will force IT to reinvent itself in order to remain relevant. Essentially, IT must move away from its focus on the delivery and management of assets and move toward the creation of a world-class supply chain for managing supply and demand of business services.

Cloud services become a forcing function in this scenario because they are forcing IT to transform. CIOs that choose to ignore this and neglect to take transformative measures will likely see their role shift from innovation leader to CMO (Chief Maintenance Officer), in charge of maintaining legacy systems and services sourced by the business.

Analyzing the Cloud to Pinpoint Patterns

The cloud really began in what IT folks now refer to as the “Internet era,” when people were talking about what was being hosted “in the cloud.” This was the first generation of the cloud, Cloud 1.0 if you will—an enabler that originated in the enterprise. Supply Chain Management (SCM) processes were revolutionized by commercial use of the Internet as a trusted platform and eventually the IT architectural landscape was forever altered.

This model evolved and produced thousands of consumer-class services, which used next-generation Internet technologies on the front end and massive scale architectures on the back end to deliver low-cost services to economic buyers. Enter Cloud 2.0, a more advanced generation of the cloud.

Beyond Cloud 2.0

Cloud 2.0 is driven by the consumer experiences that emerged out of Cloud 1.0. A new economic model and new technologies have surfaced since then, due to Internet-based shopping, search and other services. Services can be self-sourced from anywhere and from any device—and delivered immediately—while infrastructure and applications can be sourced as services in an on-demand manner.

Currently, most of the attention when it comes to cloud services remains focused on the new techniques and sourcing alternatives for IT capabilities, aka IT-as-a-Service. IT can drive higher degrees of automation and consolidation using standardized, highly virtualized infrastructure and applications. This results in a reduction in the cost of maintaining existing solutions and delivering new solutions.

Many companies are struggling with the transition from Cloud 1.0 to Cloud 2.0 due to the technology transitions required to make the move. As this occurs, the volume of services in the commercial cloud marketplace is increasing, propagation of data into the cloud is taking place and Web 3.0/semantic Web technology is maturing. The next generation of the cloud, Cloud 3.0 is beginning to materialize because of these factors.

Cloud 3.0 is significantly different because it will enable access to information through services set in the context of the consumer experience. This means that processes can be broken into smaller pieces and subsequently automated through a collection of services, which are woven together with massive amounts of data able to be accessed. With Cloud 3.0, the need for large-scale, complex applications built around monolithic processes is eliminated. Changes will be able to be made by refactoring service models and integration achieved by subscribing to new data feeds. New connections, new capabilities and new innovations—all of which surpass the current model—will be created.
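The refactoring idea above can be sketched in a few lines: instead of one monolithic application step, small services subscribe to a shared data feed, and integration is changed by rewiring subscriptions. This is a hypothetical Python sketch; all names are invented for illustration:

```python
# Toy publish/subscribe feed: the "integration achieved by subscribing to
# new data feeds" idea, with two small services replacing one monolith step.
class Feed:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, fn):
        self.subscribers.append(fn)

    def publish(self, item):
        for fn in self.subscribers:
            fn(item)

orders = Feed()
results = []

# Each service is a small, independent piece of the former monolithic process.
orders.subscribe(lambda o: results.append(("billed", o)))
orders.subscribe(lambda o: results.append(("shipped", o)))

orders.publish("order-42")
print(results)   # [('billed', 'order-42'), ('shipped', 'order-42')]
```

Adding a new capability is then a matter of subscribing another service to the feed, rather than rebuilding the application.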

The Necessary Reinvention of IT

IT is typically organized around the various technology domains, taking in new work via project requests and moving it through a Plan-Build-Run cycle. Herein lies the problem. This delivery-oriented, technology-centric approach has inherent latency built in. That latency has created increasing tension with the business IT serves, which is why IT must reinvent itself.

IT must be reinvented so that it becomes the central service-sourcing control point for the enterprise, or realize that the business will source services on its own. By becoming the central service-sourcing control point for the enterprise, IT can maintain the required service levels and integrations. Changes to behavior, cultural norms and organizational models are required to achieve this.

IT Must Become Service-Centric in the Cloud

IT must evolve from a technology-centric organization into a service-centric organization in order to survive, as service-centric represents an advanced state of maturity for the IT function. Service-centric allows IT to operate as a business function—a service provider—created around a set of products which customers value and are in turn willing to pay for.

As part of the business strategy, these services are organized into a service portfolio. This model differs from the capability-centric model because the deliverable is the service that is procured as a unit through a catalog and for which the components—and sources of components—are irrelevant to the buyer. With the capability-centric model, the deliverables are usually a collection of technology assets which are often visible to the economic buyer and delivered through a project-oriented life cycle.
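As a rough illustration of the difference: in the service-centric model, a catalog entry exposes only what the economic buyer cares about (name, price, service level) and hides the component assets entirely. This is a hypothetical Python sketch, not any particular vendor's catalog format:

```python
from dataclasses import dataclass, field

# Hypothetical catalog entry: the buyer sees the service, never the assets.
@dataclass
class CatalogService:
    name: str
    monthly_price: float
    sla_uptime_pct: float
    # Component assets are internal -- repr=False keeps them out of the
    # buyer-facing view, mirroring the service-centric model above.
    _components: list = field(default_factory=list, repr=False)

catalog = [
    CatalogService("Hosted Email", 3.0, 99.9, _components=["mail-vm", "spam-filter"]),
    CatalogService("Team Site", 5.0, 99.5, _components=["web-vm", "db-vm"]),
]

for svc in catalog:
    print(f"{svc.name}: ${svc.monthly_price}/user/month at {svc.sla_uptime_pct}% uptime")
```

In the capability-centric model, by contrast, the buyer would be handed the `_components` list itself, delivered through a project life cycle.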

With the service-centric model, some existing roles within the IT organization will be eliminated and some new ones will be created. The result is a more agile IT organization which is able to rapidly respond to changing business needs and compete with commercial providers in the cloud service marketplace.

Cloud 3.0: A Business Enabler

Cloud 3.0 enables business users to source services that meet their needs quickly, cost-effectively and at a good service level—and on their own, without the help of an IT organization. Cloud 3.0 will usher in breakthroughs and innovations at an unforeseen pace and scope and will introduce new threats to existing markets for companies while opening new markets for others. In this way, it can be said that cloud is more of a business revolution than a technology one.

Rather than focusing on positioning themselves to adopt and implement cloud technology, a more effective strategy for IT organizations would be to focus on transforming the IT organization into a service-centric model that is able to source, integrate and manage services with high efficiency.

Back to the story and its two possible endings:

The first scenario suggests that IT will choose to ignore that its role is being threatened and continue to focus on the delivery aspects of the cloud. Under the second scenario, IT is rescued by transforming into the service-centric organization model and becoming the single sourcing control point for services in the enterprise. This will effectively place IT in control of fostering business innovation by embracing the next wave of cloud. For more information please visit Nubifer.com.

New Cloud-Focused Linux Flavor: Peppermint

A new cloud-focused Linux flavor is in town: Peppermint. The Peppermint OS is currently a small, private beta which will open up to more testers in early to late May. Aimed at the cloud, the Peppermint OS is described on its home page as: “Cloud/Web application-centric, sleek, user friendly and insanely fast! Peppermint was designed for enhanced mobility, efficiency and ease of use. While other operating systems are taking 10 minutes to load, you are already connected, communicating and getting things done. And, unlike other operating systems, Peppermint is ready to use out of the box.”

The Peppermint team announced the closed beta of the new operating system in a blog post on April 14, saying that the operating system is “designed specifically for mobility.” The description of the technology on Launchpad describes Peppermint as “a fork of Lubuntu with an emphasis on cloud apps and using many configuration files sourced from Linux Mint. Peppermint uses Mozilla Prism to create single site browsers for easily accessing many popular Web applications outside of the primary browser. Peppermint uses the LXDE desktop environment and focuses on being easy for new Linux users to find their way around in.”

Lubuntu is described by the Lubuntu project as a lighter, faster and energy-saving modification of Ubuntu using LXDE (the Lightweight X11 Desktop Environment). Kendall Weaver and Shane Remington, a pair of developers in North Carolina, make up the core Peppermint team. Weaver is the maintainer for the Linux Mint Fluxbox and LXDE editions as well as the lead software developer for Astral IX Media in Asheville, NC and the director of operations for Western Carolina Produce in Hendersonville, NC. Based in Asheville, NC, Remington is the project manager and lead Web developer for Astral IX Media and, according to the Peppermint site, “provides the Peppermint OS project support with Web development, marketing, social network integration and product development.” For more information please visit Nubifer.com.

Using Business Service Management to Manage Private Clouds

Cloud computing promises an entirely new level of flexibility through pay-as-you-go, readily accessible, infinitely scalable IT services, and executives in companies of all sizes are embracing the model. At the same time, they are also posing questions about the risks associated with moving mission-critical workloads and sensitive data into the cloud. eWEEK’s Knowledge Center contributor Richard Whitehead has four suggestions for managing private clouds using service-level agreements and business service management technologies.

“Private clouds” are what the industry is calling hybrid cloud computing models which offer some of the benefits of cloud computing without some of the drawbacks that have been highlighted. These private clouds host all of the company’s internal data and applications while giving the user more flexibility over how service is rendered. The transition to private clouds is part of the larger evolution of the data center, which makes the move from a basic warehouse of information to a more agile, smarter deliverer of services. While virtualization helps companies save on everything from real estate to power and cooling costs, it does pose the challenge of managing all of the physical and virtual servers—or virtual sprawl. Basically, it is harder to manage entities when you cannot physically see and touch them.

A more practical move into the cloud can be facilitated through technology, with private clouds being managed through the use of service-level agreements (SLAs) and business service management (BSM) technologies. The following guide is a continuous methodology to bring new capabilities into an IT department within a private cloud network. Its four steps will give IT the tools and knowledge to overcome common cloud concerns and experience the benefits that a private cloud provides.

Step 1: Prepare

Before looking at alternative computing processes, an IT department must first logically evaluate its current computing assets and ask the following questions. What is the mixture of physical and virtual assets? (The word asset is used because this process should examine the business value delivered by IT.) How are those assets currently performing?

Rather than thinking in terms of server space and bandwidth, IT departments should ask: will this private cloud migration increase sales or streamline distribution? This approach positions IT as a resource rather than as a line item within an organization. Your private cloud migration will never take off if your resources aren’t presented in terms of assets and ROI.

Step 2: Package

Package refers to resources and requires a new set of measurement tools. In the virtualized world, IT shops are beginning to think in terms of packaging “workloads” rather than running applications on physical servers. Workloads are portable, self-contained units of work or services built through the integration of the JeOS (“just enough” operating system), middleware and the application. They can be moved across environments ranging from physical and virtual to cloud and heterogeneous.

A business service is a group of workloads, and this marks a fundamental shift from managing physical servers and applications to managing business services composed of portable workloads that can be mixed and matched in the way that will best serve the business. Managing IT to business services (aka the service-driven data center) is becoming a business best practice and allows the IT department to price and validate its private cloud plan as such.
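The shift from servers to workloads can be sketched as a data structure: each workload bundles just enough OS, middleware and application into a self-contained unit, and a business service is simply a group of them. A minimal Python sketch with invented names:

```python
from dataclasses import dataclass

# Hypothetical workload: JeOS + middleware + application as one portable unit.
@dataclass(frozen=True)
class Workload:
    jeos: str         # "just enough" operating system image
    middleware: str   # e.g. application server or database engine
    application: str  # the business application itself

    def deploy(self, environment: str) -> str:
        # Portability: the same unit can target physical, virtual, or cloud.
        return f"{self.application} -> {environment}"

# A business service is just a group of workloads, managed as a whole.
order_entry_service = [
    Workload("jeos-min", "tomcat", "order-web"),
    Workload("jeos-min", "mysql", "order-db"),
]

print([wl.deploy("cloud") for wl in order_entry_service])
```

The point of the sketch is that management operations apply to the service (the list) rather than to any physical server underneath it.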

Step 3: Price

A valuation must be assigned to each IT unit after you’ve packaged up your IT processes into workloads and services. How much does it cost to run the service? How much will it cost if the service goes offline? The analysis should be presented around how these costs affect the business owner, because the cost assessments are driven by the business need.
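One way to make the two pricing questions concrete is to combine the monthly run cost with the expected cost of downtime implied by the service's SLA. This is a hypothetical valuation sketch, not a method prescribed in the article:

```python
# Hypothetical Step 3 valuation: run cost plus expected downtime cost.
def service_price(run_cost_month: float, downtime_cost_hour: float,
                  sla_uptime_pct: float) -> float:
    hours_month = 730.0  # average hours in a month
    expected_downtime = hours_month * (1.0 - sla_uptime_pct / 100.0)
    return run_cost_month + expected_downtime * downtime_cost_hour

# A 99.9% SLA allows ~0.73 hours of downtime per month.
price = service_price(run_cost_month=500.0, downtime_cost_hour=2000.0,
                      sla_uptime_pct=99.9)
print(round(price, 2))  # 500 + 0.73 * 2000 = 1960.0
```

Framing the number this way keeps the assessment tied to business impact (cost of an outage) rather than to server space and bandwidth.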

One of the major advantages of a service-driven data center is that business services are able to be dynamically managed to SLAs and moved around appropriately. This allows companies to attach processes to services by connecting workloads to virtual services and, for the first time, connects a business process to the hardware implementing that business process.

Business services can be managed independently of the hardware because they aren’t tied to a physical server and can thus be moved around on an as-needed basis.

Price depends on the criticality of the service, what resources it will consume and whether it warrants backup and/or disaster recovery support. This level of transparency is not usually offered by IT, and transparency in a cloud migration plan is a crucial part of demonstrating the value the cloud provides in a cost-effective way.

Step 4: Present

After you have an IT service package, you must present a unified catalog to the consumers of those services. This catalog must be visible to all relevant stakeholders within the organization and can be considered an IT storefront or showcase featuring various options and directions for your private cloud to demonstrate value to the company.

This presentation allows your organization the flexibility to balance IT and business needs for a private cloud architecture that works for all parties; the transparency gives customers a way to interact directly with IT.

Summary

Although cloud computing remains an intimidating and abstract concept for many companies, enterprises can still start taking steps towards extending their enterprise into the cloud with the adoption of private clouds. An organization can achieve a private cloud that is virtualized, workload-based and managed in terms of business services with the service-driven data center. Workloads are managed in a dynamic manner in order to meet business SLAs. The progression from physical server to virtualization to the workload to business service to business service management is clear and logical.

In order to ensure that your private cloud is managed effectively—thus providing optimum visibility to the cloud’s business value—it is important to evaluate and present your cloud migration in this way. Cloud investment can seem less daunting when viewed as a continuous process, and the transition can be made in small steps, which makes the value a private cloud can provide to a business more easily recognizable to stakeholders. For more information, visit Nubifer.com.

Microsoft’s CEO Says Company is Playing All Its Hands in the Cloud

During a recent speech at the University of Washington, Microsoft CEO Steve Ballmer spoke about his company’s future plans: and they primarily take place in the cloud! Citing services and platforms like Windows® Phone 7 Series and Xbox Live, Ballmer spoke about cloud-centric objectives. While Microsoft faces competition from Google and others when it comes to cloud-based initiatives, everyone is wondering how Microsoft will alter its desktop-centered products like the Windows franchise to remain ahead of the pack.

During his March 4 speech at the University of Washington, Ballmer stated that Microsoft’s primary focus in the future will be in the cloud and applications derived from the cloud. This may come as somewhat of a surprise, as Microsoft’s fortune largely comes from desktop-based software like Microsoft® Windows and Microsoft® Office, but Ballmer said, “We shipped Windows 7, which had a lot that’s not cloud-based. Our inspiration now starts with the cloud: Windows Phone, Xbox, Windows Azure and SQL Azure … this is the best bet for our company.”

While speaking in front of a screen displaying a large cloud logo with the words “We’re all in,” Ballmer continued, “Companies like ours, can they move and dial in and focus and embrace? That’s where we’re programmed. You shouldn’t get into this industry if you don’t want things to change. The field of endeavor keeps moving forward.”

When discussing Microsoft’s cloud initiatives, Ballmer spoke about the creation of a cloud-based Office that would allow workers to collaborate and communicate. He also referenced cloud-ported entertainment (via Xbox Live) and the creation of something he dubbed “smarter services” which would be capable of quickly integrating new hard- and software that could interact with the cloud smoothly. Ballmer spoke about Microsoft’s cloud-based development platform, Microsoft® Azure, and mentioned Azure Ocean, a University of Washington project which reportedly collects the world’s oceanographic data.

Microsoft’s most recent smartphone operating system, Windows® Phone 7 Series, was cited by Ballmer as one of the company’s cloud-centric smarter devices. “Earlier [Microsoft] phones were designed for voice and legacy [applications],” said the Microsoft CEO before adding that Windows® Phone 7 Series was created to “put people, places, content, commerce all front and center for the users with a different point of view than some other phones.”

Citing the reciprocal need of search and Bing Maps to draw in information from users in order to “learn” and define their actions, Ballmer placed the cloud on an even playing field. While Bing Maps has started integrating Flickr images into its Streetside feature—thus presenting an eye-level view of an environment—Microsoft is experimenting with putting Streetside cameras on bikes and pedestrians instead of on the roofs of cars to offer even more views to users. Search engines like Bing take history information ported to them by users and gauge user intent. Ballmer suggested that the “ability of the cloud to learn from all of the data that’s out there, and learn from me about what I’m interested in” is one of the cloud’s most basic and important dimensions.

When it comes to competition in the cloud, Microsoft faces the most in consumer applications. Ballmer praised Apple’s App Store, calling it “a very nice job,” but knows that Microsoft has a ways to go in terms of catching up to Apple’s cloud-based monetization of intellectual property like movies and music. As for Google, the company has a lead in the search engine market in the U.S., and its cloud-based Google Apps productivity suite has been making inroads with businesses and government. Google recently announced plans for a dedicated federal cloud computing system sometime later in 2010. This announcement likely propelled Microsoft’s February 24 announcement of Business Productivity Online Suite Federal. The online-services cloud for the U.S. government comes equipped with strict security reinforcements.

Overall, Ballmer’s speech at the University of Washington furthered the notion that Microsoft is poised to focus its competitive energies in the cloud more and more. The industry will be waiting to see what this will mean for the traditionally desktop-centric Windows franchise, Microsoft’s flagship product, especially since news recently surfaced suggesting Microsoft is currently developing Windows 8. For more information on Windows Azure please visit Nubifer.com.


Microsoft and IBM Compete for Space in the Cloud as Google Apps Turns 3

Google may have been celebrating the third birthday of Google Apps Premier Edition on February 22, but Microsoft and IBM want a piece of the cake, errr cloud, too. EWeek.com reports that Google is trying to dislodge legacy on-premises installations from Microsoft and IBM while simultaneously fending off SaaS solutions from said companies. In addition, Google has to fend off offerings from Cisco Systems and startups like Zoho and MindTouch, to name a few. Despite the up-and-comers, Google, Microsoft and IBM are the three main companies competing for pre-eminence in the market for cloud collaborative software.

Three years ago, Google launched its Google Apps Premier Edition, marking a bold gamble on the future of collaborative software. Back then, and perhaps even still, the collaborative software market was controlled by Microsoft and IBM. Microsoft and IBM have over 650 million customers for their Microsoft® Office, SharePoint and IBM Lotus suites combined. These suites are licensed as “on-premises” software which customers install and maintain on their own servers.

When Google launched Google Apps Premier Edition (GAPE), it served as a departure from this on-premises model by offering collaboration software hosted on Google’s servers and delivered via the Web. We now know this method as cloud computing.

Until the introduction of GAPE, Google Apps was available in a free standard edition (which included Gmail, Google Docs word processing, spreadsheet and presentation software), but with GAPE Google meant to make a profit. For just $50 per user per year, companies could provide their knowledge workers with GAPE, which featured the aforementioned apps as well as additional storage, security and, most importantly, 24/7 support.

Google Apps now has over two million business customers–of all shapes and sizes–and is designed to appeal both to small companies that want low-cost collaboration software but lack the resources to manage it, and to large enterprises that want to eliminate the cost of managing collaboration applications on their own. At the time, Microsoft and IBM were not aggressively exploring this new cloud approach.

Fast-forward to 2009. Microsoft and IBM had released hosted collaboration solutions (Microsoft ® Business Productivity Office Suite and LotusLive respectively) to keep Google Apps from being lonely in the cloud.

On the third birthday of GAPE, Google has its work cut out for it. Google is trying to dislodge legacy on-premises installations from Microsoft and IBM while fending off SaaS solutions from Microsoft, IBM, Zoho, MindTouch and the list goes on.

Dave Girouard, Google Enterprise President, states that while Google spent 2007 and 2008 debating the benefits of the cloud, the release of Microsoft and IBM products validated the market. EWeek.com quotes Girouard as saying, “We now have all major competitors in our industry in full agreement that the cloud is worth going to. We view this as a good thing. If you have all of the major vendors suggesting you look at the cloud, the consideration of our solutions is going to rise dramatically.”

For his part, Ron Markezich, corporate vice president of Microsoft Online Services, thinks that there is room for everyone in the cloud because customer needs vary by perspective. Said Markezich to EWeek.com, “Customers are all in different situations. Whether a customer wants to go 100 percent to the cloud or if they want to go to the cloud in a measured approach in a period of years, we want to make sure we can bet on Microsoft to serve their needs. No one else has credible services that are adopted by some of the larger companies in the world.”

Microsoft’s counter to Google Apps is the Microsoft® Business Productivity Online Suite (BPOS). It includes Microsoft® Exchange Online with Microsoft® Exchange Hosted Filtering, Microsoft® SharePoint Online, Microsoft® Office Communications Online and Microsoft® Office Live Meeting. Microsoft also offers the Business Productivity Online Deskless Worker Suite (which includes Exchange Online Deskless Worker for email, calendars and global address lists, antivirus and anti-spam filters) and Microsoft® Outlook Web Access Light (for access to company email) for companies with either tighter budgets or those in need of lower cost email and collaboration software. SharePoint Online Deskless Worker provides easy access to SharePoint portals, team sites and search functionality.

The standard version of BPOS costs $10 per user per month or $120 per user per year, while the BPOS Deskless Worker Suite is $3 per user per month or $36 per user per year. Users may also license single apps as stand-alone services from $2 to $5 per user per month, which serves as a departure from Google’s one-price-for-the-year GAPE package.
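The per-user arithmetic can be checked quickly; the sketch below assumes the widely quoted rates of $10 per user per month ($120 per year) for standard BPOS, $36 per year for the Deskless Worker Suite and $50 per year for GAPE:

```python
# Sanity-check the per-user licensing figures (assumed rates, in USD).
bpos_monthly = 10.0      # standard BPOS, per month
bpos_annual = 120.0      # standard BPOS, per year
deskless_annual = 36.0   # Deskless Worker Suite, per year
gape_annual = 50.0       # Google Apps Premier Edition, flat per year

assert bpos_monthly * 12 == bpos_annual
print(f"Deskless Worker works out to ${deskless_annual / 12:.2f}/user/month")
print(f"GAPE undercuts standard BPOS by ${bpos_annual - gape_annual:.0f}/user/year")
```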

Microsoft uses the same code base for its BPOS package and its on-premises versions of Exchange and SharePoint, thus making legacy customers’ transition into the cloud easier should they decide to migrate to BPOS. Microsoft thinks that this increases the likelihood that customers will remain with Microsoft rather than switching to Google Apps or IBM Lotus.

At Lotusphere 2008, IBM offered a hint at its cloud computing goals with Bluehouse, a SaaS extranet targeted toward small- to mid-size businesses. The product evolved into LotusLive Engage, a general business collaboration solution with social networking capabilities from IBM’s LotusLive Connections suite, at Lotusphere 2009. In the latter half of 2009, the company filled the void left by the absence of email by introducing its hosted email solution, LotusLive iNotes. iNotes costs $3 per user per month, or $36 per user per year. Additionally, IBM offers LotusLive Connections, a hosted social networking solution, as well as the aforementioned LotusLive Engage.

Sean Poulley, IBM’s vice president of online collaboration, told EWeek.com that IBM is banking on companies using its hosted email to adopt its social networking services, saying, “It’s unusual that they just buy one of the services.” Currently over 18 million paid seats use hosted versions of IBM’s Lotus software.

IBM’s efforts in the cloud began to really get attention when the company scored Panasonic as a customer late last year. In its first year of implementing LotusLive iNotes, the consumer electronics maker plans on migrating over 100,000 users from Lotus Notes, Exchange and Panasonic’s proprietary email solution to LotusLive.

When it comes down to it, customers have different reasons for choosing Google, Microsoft or IBM. All three companies have major plans for 2010, and each company has a competitive edge. For more information regarding Cloud Computing please visit Nubifer.com.

The Main Infrastructure Components of Cloud Computing

Cloud computing is perhaps the most-used buzz word in the tech world right now, but to understand cloud computing is to be able to point out its main infrastructure components in comparison to older models.

So what is cloud computing? It is an emerging computing model that allows users to gain access to their applications from virtually anywhere by using any connected device they have access to. The cloud infrastructure supporting the applications is made transparent to users by a user-centric interface. Applications live in massively scalable data centers where computational resources are able to be dynamically provisioned and shared in order to achieve significant economies of scale. The management costs of bringing more IT resources into the cloud can be significantly decreased due to a strong service management platform.

Cloud computing can be viewed simultaneously as a business delivery model and an infrastructure management methodology. As a business delivery model, it provides a user experience through which hardware, software and network resources are optimally leveraged in order to provide innovative services on the web. Servers are provisioned in adherence with the logical requirements of the service using advanced, automated tools. The cloud enables program administrators and service creators to use these services via a web-based interface that abstracts away the complex nature of the underlying dynamic infrastructure.

IT organizations can manage large numbers of highly virtualized resources as a single large resource thanks to the infrastructure management methodology. Additionally, it allows IT organizations to greatly increase their data center resources without ramping up the number of people typically required to maintain that increase. A cloud will thus enable organizations currently using traditional infrastructures to consume IT resources in the data center in new, exciting, and previously-unavailable ways.

Companies with traditional data center management practices know that it can be time-intensive to make IT resources available to an end user because of the many steps involved: procuring hardware; locating raised floor space with sufficient power and cooling; allocating administrators to install operating systems, middleware and software; provisioning the network; and securing the environment. Companies have discovered that this process can take two to three months, if not more, while IT organizations re-provisioning existing hardware resources find that it takes weeks to finish.

This problem is solved by the cloud—as the cloud implements automation, business workflows and resource abstraction that permits a user to look at a catalog of IT services, add them to a shopping cart and subsequently submit the order. Once the order is approved by an administrator, the cloud handles the rest. In this way, the process cuts down on the time usually required to make those resources available to the customer from long months to mere minutes.
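The catalog-cart-approve-provision flow described above can be sketched as a toy workflow; all class names, catalog items and timings here are invented for illustration:

```python
from dataclasses import dataclass

# Hypothetical self-service flow: browse catalog, submit order, approve,
# then let automation provision everything in minutes rather than months.
@dataclass
class Order:
    items: list
    status: str = "submitted"

class Cloud:
    catalog = {"linux-vm": 2, "db-server": 10}  # provisioning time, minutes

    def submit(self, cart: list) -> Order:
        return Order(items=cart)

    def approve(self, order: Order) -> Order:
        order.status = "approved"
        return order

    def provision(self, order: Order) -> int:
        # Automation replaces the manual procure/rack/install steps.
        order.status = "provisioned"
        return sum(self.catalog[i] for i in order.items)

cloud = Cloud()
order = cloud.approve(cloud.submit(["linux-vm", "db-server"]))
minutes = cloud.provision(order)
print(order.status, f"in {minutes} minutes")   # provisioned in 12 minutes
```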

Additionally, the cloud provides a user interface that allows the user and the IT administrator to very easily manage the provisioned resources through the life cycle of the service request. Once a user’s resources have been delivered by the cloud, the user can track the order (which usually consists of a variable number of servers and software); view the health of those resources; add additional servers; change the installed software; remove servers; increase or decrease the allocated processing power, storage or memory; and start, stop and restart servers. Yes, really. These self-service functions can be performed 24 hours a day and take just minutes to complete. This is in stark contrast to a non-cloud environment, in which it would take hours or even days to have hardware or software configurations changed or to have a server restarted. For more information regarding infrastructure components for a Cloud ecosystem please visit Nubifer.com.

Nubifer Cloud:Portal

Reducing capital expenditure for hardware supporting your software is a no-brainer, and Nubifer Cloud:Portal allows you to leverage the computing power and scalability of the top-tier cloud platforms. A powerful suite of core portal technologies, interfaces, database schematics and service-oriented architecture libraries, Cloud:Portal comes in several configuration options and you are sure to find the right fit for your enterprise.

Nubifer understands that certain clients requiring custom on-premise and cloud-hosted portals may also require different application layer and data layer configurations. For this reason, Nubifer leverages RAD development techniques to create robust, scalable programming code in ASP.NET (C#), ASP, PHP, Java Servlets, JSP, ColdFusion and Perl. Nubifer also supports a myriad of data formats, database platforms and cloud SOA architectures, such as SQL Server (and Express), Microsoft® Access, MySQL, Oracle and more.

Nubifer Cloud:Portal Provides Enterprise Grade Solutions

Your new Nubifer Cloud:Portal is created by Nubifer’s professional services team through customizing and enhancing one or more of these configurations. In addition, a wide range of cloud modules are compatible and can be added as “plug-in” modules to extend your portal system.

The following Options in Portal types are available:

· Online Store

· Task Management System

· Employee Directory

· Bug / Task Tracker

· Forum / Message Board

· Wizard Driven Registration Forms

· Time Sheet Manager

· Blog / RSS Engine Manager

· Calendar Management System

· Events Management

· Custom Modules to Match Business Needs

At its most basic, the cloud is a nebulous infrastructure owned and operated by an outside party that accepts and runs workloads created by customers. Nubifer Cloud:Portal is compatible with cloud platforms and APIs like Google APIs for Google Applications and Windows® Azure, and also runs on standard hosting platforms.

Cloud:Portal boasts several attractive portal management features. Multi-level Administrative User Account Management lets you manage accounts securely, search by account and create and edit all accounts. The Public Links and Articles Manager allows you to create, edit or archive articles, search indexed content, and includes the Dynamic Links manager. Through “My Account” User Management, users can manage their own account and upload and submit custom files and information. The Advanced Security feature enables session-based authentication and customized logic.

There are other notable features associated with Nubifer Cloud:Portal. Calendar and Events lets you add and edit calendars; calendars can be user-specific or group-specific, and events can be tied to them. The system features dynamic styles, supporting custom style sheets triggered dynamically by user choice or by configuration settings, which is useful for co-branding or a multi-host look and feel. Web Service XML APIs for third-party integration follow an SOA architecture and are interoperable with the top-tier cloud computing platforms by exposing and consuming XML APIs. Lastly, submission forms with email and database submission are another important feature: submission forms trigger send-mail functionality and are manageable by portal admins.
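As an illustration of consuming an XML API of the kind described above, the snippet below parses a made-up XML response such as a portal might receive from a third-party service. The payload, element names and helper function are invented for illustration; real endpoints and schemas will differ:

```python
# Sketch: consume an XML web-service response and extract order statuses.
# The sample document below is fabricated for demonstration purposes.
import xml.etree.ElementTree as ET

SAMPLE_RESPONSE = """
<orders>
  <order id="1001"><status>active</status></order>
  <order id="1002"><status>archived</status></order>
</orders>
"""

def parse_order_statuses(xml_text):
    """Map order id -> status from an XML API response."""
    root = ET.fromstring(xml_text)
    return {o.get("id"): o.findtext("status") for o in root.findall("order")}
```

Exposing the inverse direction (serializing portal data back out as XML for other consumers) follows the same pattern with `ET.Element` and `ET.tostring`.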

Cloud:Portal provides R.I.A. reporting, such as User Reports, Search by Category Reports, Transaction Details Reports, Simple Reports and Timesheet Reports, through Flex and Flash.

Companies using Cloud:Portal are delivered a “version release” code base for their independent endeavors. Companies leveraging Nubifer’s professional portal service have access, ownership and full rights to the “code instance” delivered as the final release version of their customized cloud portal. This type of licensing gives companies a competitive edge as the sole proprietor of their licensed copy of the cloud portal.

Enterprise companies leverage the rapid, rich offering delivered by our portal code models and methodologies. As a result, companies enjoy the value of rapid prototyping and application enhancement, with faster-to-market functionality in their portals.

Nubifer Cloud:Portal technology is designed to facilitate and support your business model today and in the future, by expanding as your company evolves. Within our process for portal development, we define and design the architecture, develop and enhance the portal code and deliver and deploy to your public or private environment. Please visit nubifer.com to learn more about our proprietary offering, Cloud:Portal.

Survey Reveals Developers Concentrating on Hybrid Cloud in 2010

According to a survey of application developers conducted by Evans Data, over 60 percent of IT shops polled plan to adopt a hybrid cloud model in 2010. The results of the poll, released on January 12, 2010, indicate that 61 percent of the over 400 participating developers stated that some portion of their companies’ IT resources will transition into the public cloud within the next year.

The hybrid cloud is set to dominate the IT landscape in 2010: over 87 percent of the developers surveyed said that half or less of their resources will move. A statement obtained by eWeek.com quotes Evans Data CEO Janel Garvin as saying, “The hybrid Cloud presents a very reasonable model, which is easy to assimilate and provides a gateway to Cloud computing without the need to commit all resources or surrender all control and security to an outside vendor. Security and government compliance are primary obstacles to public cloud adoption, but a hybrid model allows for selective implementation so these barriers can be avoided.”

Evans Data conducted its survey in November and December 2009 as a way to examine timelines for public and private cloud adoption, ways to collaborate and develop within the cloud, obstacles and benefits of cloud development, architectures and tools for cloud development, virtualization in the private data center and other aspects of cloud computing. The survey also concluded that 64 percent of developers surveyed expect their cloud apps to venture onto mobile devices in the near future as well.

The poll also revealed that the preferred database for use in the public cloud is MySQL, preferred by over 55 percent of developers, and that VMware, followed by Microsoft and IBM, is the preferred hypervisor vendor for use in a virtualized private cloud. To learn more, please visit nubifer.com.

Maximizing Effectiveness in the Cloud

At its most basic, the cloud is a nebulous infrastructure owned and operated by an outside party that accepts and runs workloads created by customers. When thinking about the cloud in this way, the basic question concerning cloud computing becomes, “Can I run all of my applications in the cloud?” If you answer “no” to that question, then ask yourself, “What portions of my data can safely be run in the cloud?” When assessing how to include cloud computing in your architecture, one way to maximize your effectiveness is to see how the cloud can complement your existing architectures.

The current cloud tools strive to manage provisioning and a level of mobility management, with security and audit capabilities on the horizon, in addition to the ability to move the same virtual machine in and out of the cloud. This is where virtualization comes into play: it creates a new kind of data center that poses a range of challenges for traditional data center management tools. Identity, mobility and data separation are a few obvious issues for virtualization.

1. Identity

Server identity becomes crucial when you can make 20 identical copies of an existing server and distribute them around the environment with a click of a mouse. In this situation, traditional identity based on physicality doesn’t measure up.

2. Mobility

While physical servers are stationary, VMs are designed to be mobile, and tracking and tracing them throughout their life cycles is an important part of maintaining and proving control and compliance.

3. Data separation

Resources are shared between host servers and the virtual servers running on them; portions of the host’s hardware (like the processor and memory) are allocated to each virtual server. There have not yet been any breaches of isolation between virtual servers, but this may not last.
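The identity and mobility concerns above can be sketched in a few lines: each clone of a “golden” virtual machine receives its own logical identity (rather than one tied to physical hardware), along with a life-cycle event trail for tracking it as it moves. All names here are illustrative, not any vendor's API:

```python
# Sketch: logical VM identity independent of physical hardware, plus a
# life-cycle trail so mobile VMs can be tracked for control and compliance.
import uuid

class VirtualMachine:
    def __init__(self, template_name):
        self.template = template_name
        self.vm_id = str(uuid.uuid4())   # identity not based on physicality
        self.events = []                 # life-cycle audit trail

    def record(self, event, location):
        self.events.append((event, location))

def clone(vm, count):
    """Make `count` identical copies, each with a distinct identity."""
    copies = []
    for _ in range(count):
        c = VirtualMachine(vm.template)
        c.record("cloned-from:" + vm.vm_id, "datacenter-A")
        copies.append(c)
    return copies
```

Twenty clones of one server are byte-for-byte identical, yet each remains individually trackable through its `vm_id` and event trail as it moves through its life cycle.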

These challenges are highlighted by cloud governance. In the cloud, these three issues are managed and controlled by someone outside of the IT department, and additional challenges specific to the cloud now exist, including life cycle management, access control, integrity and cloud-created VMs.

1. Life cycle management

How is a workload’s life cycle managed once it has been transferred to the cloud?

2. Access control

Who was given access to the application and its data while it was in the cloud?

3. Integrity

Did its integrity remain while it was in the cloud, or was it altered?

4. Cloud-created VMs

Clouds generate their own workloads and subsequently transfer them into the data center. These so-called “virtual appliances” are being downloaded into data centers every day, and their identity, integrity and configuration need to be managed and controlled there.
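The integrity question above can be answered mechanically by fingerprinting a workload's image before it leaves for the cloud and comparing the digest when it returns. The minimal sketch below uses hypothetical helper names; a production system would also sign the digest rather than merely store it:

```python
# Sketch: detect whether a workload was altered while in the cloud by
# comparing cryptographic fingerprints taken before and after.
import hashlib

def fingerprint(payload: bytes) -> str:
    """SHA-256 digest of a workload image."""
    return hashlib.sha256(payload).hexdigest()

def verify_integrity(original_digest: str, returned_payload: bytes) -> bool:
    """True if the workload came back from the cloud unaltered."""
    return fingerprint(returned_payload) == original_digest
```

The same mechanism applies in reverse to cloud-created virtual appliances: verifying a published digest before a downloaded appliance is admitted to the data center.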

Cloud computing has the potential to increase the flexibility and responsiveness of your IT organization and there are things you can do to be pragmatic about the evolution of cloud computing. They include understanding what is needed in the cloud, gaining experience with “internal clouds” and testing external clouds.

1. Understanding what is needed to play in the cloud

The term “internal clouds” has resulted from the use of virtualization in the data center. It is important to discuss with auditors how virtualization is impacting their requirements; new policies may subsequently be added to your internal audit checklists.

2. Gaining experience with “internal clouds”

It is important to be able to efficiently implement and enforce these policies with the right automation and control systems. Once you have established what you need internally, it becomes easier to practice it in the cloud.

3. Testing external clouds

Using low-priority workloads helps provide a better understanding of what is needed for life cycle management, as well as establishing what role external cloud infrastructures may play in your overall business architecture.

Essentially, you must be able to manage, control and audit your own internal virtual environment in order to do so with an external cloud environment. Please visit nubifer.com to learn more about maximizing effectiveness in the cloud.

Answers to Your Questions on Cloud Connectors

Jeffrey Schwartz and Michael Desmond, both editors of Redmond Developer News, recently sat down with corporate vice president of Microsoft’s Connected Systems Division, Robert Wahbe, at the recent Microsoft Professional Developers Conference (PDC) to talk about Microsoft Azure and its potential impact on the developer ecosystem at Microsoft. Responsible for managing Microsoft’s engineering teams that deliver the company’s Web services and modeling platforms, Wahbe is a major advocate of the Azure Services Platform and offers insight into how to build applications that exist within the world of Software-as-a-Service, or as Microsoft calls it, Software plus Services (S + S).

When asked how much of Windows Azure is based on Hyper-V and how much is an entirely new set of technologies, Wahbe answered, “Windows Azure is a natural evolution of our platform. We think it’s going to have a long-term radical impact with customers, partners and developers, but it’s a natural evolution.” Wahbe continued to explain how Azure brings current technologies (i.e. the server, desktop, etc.) into the cloud and is fundamentally built out of Windows Server 2008 and .NET Framework.

Wahbe also referenced the PDC keynote of Microsoft’s chief software architect, Ray Ozzie, in which Ozzie discussed how most applications are not initially created with the idea of scale-out. Explained Wahbe, expanding upon Ozzie’s points, “The notion of stateless front-ends being able to scale out, both across the data center and across data centers requires that you make sure you have the right architectural base. Microsoft will be trying hard to make sure we have the patterns and practices available to developers to get those models [so that they] can be brought onto the premises.”

As an example, Wahbe described a hypothetical situation in which Visual Studio and the .NET Framework are used to build an ASP.NET app, which in turn can be deployed either locally or to Windows Azure. The only extra step when deploying to Windows Azure is specifying additional metadata, such as what kind of SLA you are looking for or how many instances you are going to run on. As explained by Wahbe, the metadata is an XML file, an example of an executable model that Microsoft can readily understand. “You can write those models in ‘Oslo’ using the DSL written in ‘M,’ targeting Windows Azure in those models,” concluded Wahbe.
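The deployment metadata Wahbe describes took the form of an XML service configuration file in early Windows Azure. The fragment below is an illustrative approximation only (the service name, role name and setting are invented), showing where the instance count (“how many instances you are going to run on”) would be declared:

```xml
<!-- Illustrative approximation of an early Windows Azure service
     configuration; names and values here are invented for this example. -->
<ServiceConfiguration serviceName="MyAspNetApp"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole1">
    <!-- Declares how many instances the role runs on -->
    <Instances count="3" />
    <ConfigurationSettings>
      <Setting name="ExampleSetting" value="example" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
```

Because the scale decision lives in metadata rather than code, the same ASP.NET application can be deployed locally or to Azure without modification; only this file changes.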

Wahbe answered a firm “yes” when asked if there is a natural fit for applications developed in Oslo, saying that it works because Oslo is “about helping you write applications more productively,” and adding that you can write any kind of application, including cloud applications. Although new challenges undoubtedly face development shops, the basic process of writing and deploying code remains the same. According to Wahbe, Microsoft Azure simply provides a new deployment target at a basic level.

As for the differences, developers are going to need to learn a new set of services. An example used by Wahbe is two businesses connecting through a business-to-business messaging app; technology like Windows Communication Foundation can make this an easy process. With the integration of Microsoft Azure, questions about the pros and cons of using the Azure platform and the Service Bus (which is part of .NET Services) will have to be evaluated. Azure “provides you with an out-of-the-box, Internet-scale, pub-sub solution that traverses firewalls,” according to Wahbe. And what could be bad about that?

When asked if developers should expect new development interfaces or plug-ins to Visual Studio, Wahbe answered, “You’re going to see some very natural extensions of what’s in Visual Studio today. For example, you’ll see new project types. I wouldn’t call that a new tool … I’d call it a fairly natural extension to the existing tools.” Additionally, Wahbe expressed Microsoft’s desire to deliver tools to developers as soon as possible. “We want to get a CTP [community technology preview] out early and engage in that conversation. Now we can get this thing out broadly, get the feedback, and I think for me, that’s the most powerful way to develop a platform,” explained Wahbe of the importance of developers’ using and subsequently critiquing Azure.

When asked about the possibility of competitors like Amazon and Google gaining early share due to the ambiguous time frame of Azure, Wahbe responded serenely, “The place to start with Amazon is [that] they’re a partner. So they’ve licensed Windows, they’ve licensed SQL, and we have shared partners. What Amazon is doing, like traditional hosters, is they’re taking a lot of the complexity out for our mutual customers around hardware. The heavy lifting that a developer has to do to take that and then build a scale-out service in the cloud and across data centers—that’s left to the developer.” Wahbe detailed how Microsoft has base computing and base storage—the foundation of Windows Azure—as well as higher-level services such as the database in the cloud. According to Wahbe, developers no longer have to build an Internet-scale pub-sub system, find a new way to do social networking and contacts, or create reporting services themselves.

In discussing the impact that cloud connecting will have on the cost of development and the management of development processes, Wahbe said, “We think we’re removing complexities out of all layers of the stack by doing this in the cloud for you … we’ll automatically do all of the configuration so you can get load-balancing across all of your instances. We’ll make sure that the data is replicated both for efficiency and also for reliability, both across an individual data center and across multiple data centers. So we think that by doing that, you can now focus much more on what your app is and less on all that application infrastructure.” Wahbe predicts that it will be simpler for developers to build applications with the adoption of Microsoft Azure. For more information on Cloud Connectors, contact a Nubifer representative today.

Welcome to Nubifer Cloud Computing blogs

In this location, we share blogs, research, tutorials and opinions about the ever-changing and emerging arena of cloud computing: software-as-a-service, platform-as-a-service, hosting-as-a-service and user-interface-as-a-service. We also share key concepts focused on interoperability, always maintaining an agnostic viewpoint of the technologies and services offered by the top cloud platform providers. For more information, please visit Nubifer.com.