Archive for March, 2011

IBM’s Tivoli Live

IBM recently announced a new addition to its SaaS portfolio, IBM Tivoli Live – Service Manager, which provides integrated service management capabilities as a monthly subscription on IBM’s cloud platform. Along with IBM Tivoli Live – Monitoring Services, Tivoli Live solutions allow organizations to quickly adopt and deploy key ITIL processes and combine them with performance and availability monitoring, all under a common subscription and delivery model. There is no need to purchase hardware, software licenses or installation services. 

Both solutions are based on a common platform and architecture that many IBM clients use today as on-premise software. Customers are not locked into a single consumption model and in fact can choose from an array of flexible delivery options including on-premise software, SaaS, appliances and managed help desk services. Now, organizations large and small can take advantage of enterprise-class software and easily migrate from one model to another based on their business needs.

For small and medium-sized businesses without large IT departments, this service provides a quick and practical path toward improving IT performance. For larger organizations, this service can complement existing IT management infrastructure, helping them better manage costs and standardize IT operations.

Tivoli Live – Service Manager offers a comprehensive set of capabilities for implementing problem, incident, change, release and asset management processes, leveraging a common data model and a robust change management database. Customers have the flexibility to purchase any of these capabilities through IBM’s role-based user pricing.

Tivoli Live – Monitoring Services delivers Tivoli Monitoring and Tivoli Composite Application Management software over the Web, allowing customers to manage the health and performance of their data center’s resources, including operating systems, virtualized servers, middleware and applications.

For more information on IBM’s Cloud Services, visit Nubifer.com.

Cloud Computing’s Popularity with SMBs

There is no simple answer as to whether or not 2010 was the year small business IT finally adopted cloud computing once and for all. On behalf of Microsoft, 7th Sense Research recently conducted a study on cloud computing in small business computing environments and found that 29% of SMBs view the cloud as an opportunity for small business IT to be more strategic. The study also found that 27% of SMBs have bought into cloud computing because it integrates with existing technology investments, while 12% of SMBs have used the cloud to start a new business.

Despite those figures, overall, small businesses are largely unfamiliar with cloud computing. Josh Waldo, director of SMB Marketing at Microsoft, reveals, “Roughly 20 percent of SMBs claim to know what cloud technology is.”

The numbers just don’t match up, but Waldo points out that just because people may not identify with the term cloud computing doesn’t mean they aren’t using the technology. Take Gmail or Hotmail, for example: They are both prime examples of the Software-as-a-Service (SaaS) form of cloud computing and are extremely popular—without their many users even realizing they are using cloud technology when checking their inbox.

“People might not understand what cloud is. But they are using it. They’re using it in their private life. In some cases they’re using it in their work life. But they might not necessarily identify it with the term cloud,” says Waldo.

He believes that the lack of familiarity SMBs have with cloud computing can be an opportunity for Microsoft, Zoho and other providers of small business technology. Says Waldo, “For Microsoft, what that means is that this gives us a big opportunity to really educate SMBs about cloud technologies and how they can benefit their business. Our goal is really going to be to help SMBs evolve how they think about technology.”

According to Waldo, the benefits for small businesses that embrace the cloud are potentially huge: “First, SMBs can get enterprise-class technology at a fraction of the price, where you’re not purchasing on-premises technology that’s going to cost you an enormous amount upfront. Second, it really allows companies, whether you’re a development shop and you’re building software, or you’re an end customer—like a financial or insurance firm—to focus on your business rather than your IT requirements.”

By outsourcing data-center needs, for example, small business IT can avoid building out capacity to handle potential spikes in data or transaction processing, because they buy the processing power they need when they need it. This leads to two other key benefits of cloud computing: elasticity and the expectation of mobility. Waldo defines elasticity as the capability to scale up or down rapidly, based on need. While that includes processing power, it also means being able to add new users from a seasonal workforce without having to deal with the per-seat licensing associated with traditional desktop software.
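
To make that elasticity concrete, here is a minimal, hypothetical Python sketch of the scale-up/scale-down logic a cloud platform applies on a customer’s behalf; the CloudPool class, thresholds and load figures are invented for illustration and do not represent any particular provider’s API.

# Hypothetical elasticity sketch: grow or shrink a pool of workers with demand.
class CloudPool(object):
    def __init__(self, min_workers=2, max_workers=20):
        self.min_workers = min_workers
        self.max_workers = max_workers
        self.workers = min_workers

    def rebalance(self, avg_cpu_percent):
        # Scale out when the pool runs hot, scale back in when it sits idle.
        if avg_cpu_percent > 75 and self.workers < self.max_workers:
            self.workers += 1
        elif avg_cpu_percent < 25 and self.workers > self.min_workers:
            self.workers -= 1
        return self.workers

pool = CloudPool()
for load in (80, 85, 90, 30, 10):   # simulated utilization readings
    print("load %d%% -> %d workers" % (load, pool.rebalance(load)))

The same pattern extends to seasonal staff: accounts are added when the busy season starts and removed when it ends, with charges following actual usage rather than per-seat licenses.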

When it comes to the expectation of mobility, Waldo says that today’s notebook, smartphone and tablet-toting employees want to make their work more flexible by making it mobile. SMBs can let employees access the information and applications they need while on the go by exposing core applications as SaaS via the cloud.

Embracing Cloud Computing
Waldo recommends that SMBs that have decided to embrace the cloud by adding cloud computing to their small business technology portfolio seek expert advice. “We really think it’s important that SMBs choose carefully. And if they’re uncertain, they should work with a third party or a consultant or a value added reseller or some type of agent who understands the various elements of cloud technology and [who] can advise clients,” he says.

According to Chad Collins, CEO of Nubifer.com, a provider of turn-key cloud automation solutions, the first thing a small business should consider is which problem it is trying to solve: “The most important thing is that the cloud really isn’t just about infrastructure. It’s about solving problems. It should be about scalability, elasticity and economies of scale.” Collins adds, “What our enterprise clients are asking for is the ability to create virtual environments, run applications without code changes or rewrites and, most importantly, to be able to collaborate and share using a single sign-on interface.”

Collins says that the person responsible for small business IT should ask a range of questions when considering a cloud services provider. Among the most important is: Does the cloud provider allow you to run existing applications without any code rewrites or changes to code? Microsoft’s research reveals that 27% of SMBs have already bought into cloud services because they integrate with existing technology, while another 36% would be encouraged to buy into the cloud because of that fact. “Being able to migrate custom applications over to the cloud without rewrites is not only a huge cost saver but also a huge time saver for SMBs,” says Collins.

Another important question is whether the cloud provider offers granular user access and user-based permissions based on roles. Can you measure value on a per user basis? Can you auto-suspend resources by setting parameters on usage to avoid overuse of the cloud? The latter is important because although cloud services can result in immense cost savings, their pay-as-you-go nature can yield a large tab if used inefficiently.
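
As a rough illustration of that auto-suspend idea, the hypothetical Python sketch below checks month-to-date spend against a budget cap and suspends resources once the cap is hit; the DemoServer class, its suspend() method and the dollar figures are invented for this example and are not any real provider’s API.

# Hypothetical usage-cap sketch: suspend resources once a monthly budget is reached.
MONTHLY_BUDGET_USD = 500.0

class DemoServer(object):
    def __init__(self, name):
        self.name = name

    def suspend(self):
        print("suspending %s" % self.name)

def enforce_budget(resources, month_to_date_spend):
    # If spending has reached the cap, suspend everything so the meter stops running.
    if month_to_date_spend >= MONTHLY_BUDGET_USD:
        for resource in resources:
            resource.suspend()
        return "suspended"
    return "running"

print(enforce_budget([DemoServer("web-1"), DemoServer("db-1")], 512.75))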

Collins recommends paying special attention to the level of responsive support offered by a cloud provider. “I think for SMBs it’s really important. Having to log a Web form and then wait 24 to 48 hours for support can be really frustrating,” he says, adding that the provider should guarantee that a support team would respond in a matter of hours. Agreeing with Collins, Waldo points out that a service-level agreement with high availability and 24-hour support is key.

To discover how the power of cloud computing can benefit your SMB, please visit Nubifer.com.

Microsoft Outlines Plans for Integration-as-a-Service on Windows Azure

Although Microsoft officials have been discussing plans for the successor to the company’s BizTalk Server 2010 product for some time, the cloud angle of Microsoft’s plans for its BizTalk integration server didn’t become clear until late October 2010, at the Professional Developers Conference (PDC). 

When looking at a BizTalk Server Team blog, it appears as if Microsoft is thinking about BizTalk vNext transforming into something akin to Windows Azure and SQL Azure—at least in concept—a “BizTalk Azure.”

An excerpt from the blog says, “Our plans to deliver a true Integration service—a multi-tenant, highly scalable cloud service built on AppFabric and running on Windows Azure—will be an important and game changing step for BizTalk Server, giving customers a way to consume integration easily without having to deploy extensive infrastructure and systems integration.”

The latest news from Microsoft reveals that there will be an on-premise version of BizTalk vNext as well—and the final version is slated to arrive in 2012. A Microsoft spokesperson said, “We will deliver new cloud-based integration capabilities both on Windows Azure (as outlined in the blog) as well as continuing to deliver the same capability on-premises. This leverages our AppFabric strategy of providing a consistent underlying architecture foundation across both services and server. This will be available to customers in the 2 year cadence that is consistent with previous major releases of BizTalk Server and other Microsoft enterprise server products.”

In September 2010, Microsoft released the latest on-premises software version of BizTalk (BizTalk Server 2010), which is a minor release of Microsoft’s integration server that supports Visual Studio 2010, SQL Server 2008 R2, Windows Server AppFabric and Windows Server 2008 R2.

There are currently over 10,000 BizTalk Server customers, paying a hefty price for the product, and thus Microsoft officials are being careful in their positioning of BizTalk Azure. Microsoft will ensure that existing customers are able to move to the Azure version “only at their own pace and on their own terms.” Microsoft plans on providing side-by-side support for BizTalk Server 2010 and BizTalk Azure to make sure apps don’t break and will also offer “enhanced integration between BizTalk and AppFabric (both Windows Server AppFabric and Windows Azure AppFabric).”

Microsoft recently rolled out the first CTP (Community Technology Preview) of the Patterns and Practices Composite Application Guidance for using BizTalk Server 2010, Windows Server AppFabric and Windows Azure AppFabric together as part of an overall composite application solution. Additionally, Microsoft previewed a number of future enhancements to Windows Azure AppFabric.

For more information regarding BizTalk on Azure, contact a Nubifer representative today.

DoD Business Applications and the Cloud

Current cloud spending is less than 5% of total IT spending, but with an optimistic 25% growth rate, cloud computing is poised to become one of the dominant models for organizing information systems. That is why it is important for the Department of Defense Business Mission to begin charting a path to cloud operations in order to migrate from its current low-performance, high-cost environment.

The DoD Fiscal Year (FY) 2010 IT cost of the Business Mission, excluding payroll costs for uniformed and civilian personnel, is $5.2 billion; one third of the cost of the communications and computing infrastructure tacks on an additional $5.4 billion to the total.

The scope of DoD Business Applications exceeds the average IT budgets of the largest US corporations by a multiple of three. As a result, DoD Business Operations needs to think about its future IT direction as operating a secure, private cloud managed organically by the DoD Business Mission in order to squeeze the cost benefits out of the cloud.

There are many forms of cloud computing, ranging from Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS) to Software-as-a-Service (SaaS), but when it comes to the Department of Defense, only offerings that can support more than 2,000 applications need apply. Business Operations cannot be linked to “public” clouds that are proprietary.

The DoD, for example, can’t rely on the largest cloud service, Amazon’s Elastic Compute Cloud (EC2), which offers computing capacity completely managed by the customer and is thus a “public cloud.” Because compute processing is purchased on demand, Amazon’s offering is an IaaS service. Once your applications are placed in the proprietary Amazon cloud, however, it is difficult to transfer the workload into a different environment.
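
To show what “purchased on demand” and “managed by the customer” look like in practice, here is a minimal sketch using the boto3 Python SDK for AWS, shown purely for illustration; the AMI ID is a placeholder and error handling is omitted.

# Minimal on-demand IaaS sketch with the boto3 AWS SDK (pip install boto3).
# Credentials and region are taken from the environment; the AMI ID is a placeholder.
import boto3

ec2 = boto3.client("ec2")

# Rent a single small server on demand...
reservation = ec2.run_instances(
    ImageId="ami-xxxxxxxx",   # placeholder image
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = reservation["Instances"][0]["InstanceId"]
print("launched", instance_id)

# ...and terminate it when the workload is done, so the hourly meter stops.
ec2.terminate_instances(InstanceIds=[instance_id])

The calls themselves are simple; the portability concern raised above comes from the machine images and supporting services they depend on, which are specific to Amazon’s environment.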

Google, however, offers a PaaS service as a public cloud (read: accessible to all) via the Google App Engine. Google allows developers to build, host and run web applications on Google’s mature infrastructure with its own operating system; Google only provides a few Google-managed applications.
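
By contrast, on a PaaS like App Engine the developer uploads only application code and Google runs and scales it. A minimal request handler in the Python runtime of that period looked roughly like the sketch below, paired with an app.yaml descriptor; treat it as an illustrative sketch, since details vary by SDK version.

# Minimal Google App Engine (Python runtime) request handler of this era.
# Google manages the servers, operating system, scaling and patching.
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class MainPage(webapp.RequestHandler):
    def get(self):
        self.response.out.write("Hello from the App Engine PaaS")

application = webapp.WSGIApplication([("/", MainPage)], debug=True)

def main():
    run_wsgi_app(application)

if __name__ == "__main__":
    main()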

Salesforce.com’s enterprise-level computing currently operates at a $1.4 billion annual revenue rate, with 2 million subscribers signed up for SaaS application services running in a proprietary PaaS environment. Because Salesforce offers only proprietary solutions, it can’t be considered by the DoD, although Salesforce’s recent partnership with VMware might change that.

Other cloud providers offer IaaS services, but they all leave it to customers to manage their own applications; they would qualify for DoD applications provided they meet open source and security criteria.

Open Platform and Open Source
Microsoft’s Windows Azure platform offers a PaaS environment in which developers create cloud applications and services that run in Microsoft’s data centers on a proprietary .Net environment. These predominantly .Net applications are integrated into a Microsoft-controlled software environment, which can be described as a “closed” platform.

Currently, DoD Business Mission applications run largely in a Microsoft .Net environment. What remains to be seen is whether the DoD will pursue cloud migration into a multi-vendor “open platform” and “open source” programming environment or stick with a restrictive Microsoft .Net.

The largest share of the DoD IT budget goes to the Defense Information Systems Agency (DISA), which in April 2009 advocated the adoption of the open source SourceForge library for unclassified programs. DISA’s Forge.mil program enables collaborative software development and cross-program sharing of software, system components and services in support of network-centric operations and warfare. Forge.mil is modeled on concepts proven in open-source software development, represents a collection of screened software components and is used by thousands of developers. It takes advantage of a large library of tested software projects, and its components are continuously evaluated by thousands of contributors (including some from firms like IBM, Oracle and HP, although not from Microsoft, which controls its own library of code).

A DoD Memorandum of October 16, 2009 from the Acting DoD Chief Information Officer, “Clarifying Guidance Regarding Open Source Software (OSS),” defines OSS as software for which the human-readable source code is available for use, study, reuse, modification, enhancement and redistribution by the users of that software. OSS meets the definition of “commercial computer software” and will thus be given preference in building systems. With the announcement of Forge.mil, the DoD has begun the process of adopting open source computer code.

Implications
As business applications migrate, a reorientation of systems development toward running on “private clouds,” while taking advantage of “open source” techniques, is necessary in order to achieve the greatest savings. The technologies currently offered for the construction of “private” clouds will help to achieve the complete separation of the platforms on which applications run from the applications themselves. The simplification that can be achieved through the sharing of “open” source code from the Forge.mil library makes delivering cloud solutions cheaper, quicker and more readily available.

For more information regarding the DoD and open source cloud platforms, please visit nubifer.com today.

Feds to Unveil Cloud Security Guidelines

Late in 2010, the federal government issued draft plans for the voluntary Federal Risk and Authorization Management Program, dubbed FedRAMP. FedRAMP is expected to be operational by April 2011 and would ensure cloud services meet federal cyber-security guidelines, which will likely allay remaining government concerns about cloud security and ramp up adoption of cloud technologies.

Developed with cross-government and industry support over the past 18 months, the voluntary program would put cloud services through a standardized security accreditation and certification process. Any authorization could subsequently be leveraged by other agencies. Federal CIO Vivek Kundra said in a statement, “By simplifying how agencies procure cloud computing solutions, we are paving the way for more cost-effective and energy-efficient service delivery for the public, while reducing the federal government’s data center footprint.”

The adoption of cloud computing has been promoted by the Obama Administration as a way to help save the government money, and Kundra and other top officials have championed the technology and instituted policies like data center consolidation requirements, which could bring about a shift to the cloud. Federal IT managers, however, have consistently raised security concerns as the biggest barrier to adoption.

The government’s security concerns arise partly because cloud computing is a relatively new paradigm that has to be adapted to the security requirements of regulations like the Federal Information Security Management Act (FISMA), which governs federal cyber-security for most government agencies. By mapping out the baseline required security controls for cloud systems, FedRAMP creates a consistent security baseline for cloud computing.

FedRAMP will seek to eliminate a duplicative, costly process to certify and accredit applications. Each agency used to take apps and services through its own accreditation process, but in the shared-infrastructure environment of the cloud, this process is redundant.

The FedRAMP draft is comprised of three major components: a set of cloud computing security baseline requirements; a process to continuously monitor cloud security; and a description of proposed operational approaches to authorizing and assessing cloud-based systems.

FedRAMP will be used for both private and public cloud services, and possibly for non-cloud computing information technologies and products. For example, two agencies have informed IBM of their intent to sponsor certification of their new Federal Community Cloud services.

Commercial vendors will not be able to directly request FedRAMP authorization, but rather will have to rely on the sponsorship of a federal agency that plans to use their cloud services. Guidance on the CIO Council’s website suggests that FedRAMP “may not have the resources to accommodate all requests initially,” and that GSA will focus on systems with potentially larger user bases or cross-government interest, suggesting that the government anticipates a large amount of interest.

FedRAMP will remain an inter-agency effort under federal CIO Kundra’s authority and will be managed by GSA. The new Joint Authorization Board, which includes representatives from GSA and the Department of Defense, will authorize the systems that go through the process along with the sponsoring agency.

Although FedRAMP provides a base accreditation, most agencies have security requirements that go beyond FISMA and thus may have to do more work on top of the FedRAMP certification to make sure the cloud services they are looking to deploy meet individual agency requirements.

For more information regarding the Federal adoption of cloud technologies, visit Nubifer.com.

Cloud Computing’s Varying Forms of Functionality

Although everyone associated with the industry is likely familiar with the term cloud computing, what remains ambiguous are its offerings, both now and in the future. The benefits of cloud computing can essentially be classified into as many as five categories, the majority of which are discussed in the paragraphs to follow.

The Internet allows you to market your brand internationally, whether you are an SMB or a multi-national organization. It also enables organizations to reach out and offer their products and services to an international audience, and combining data and applications with remote computing resources creates exciting new opportunities.

Take the latest and greatest mobile app, for example. This new application has the ability to travel anywhere the user is, whether they are surfing on their TV, phone, or laptop. A tremendous amount of information has to be transferred online and shared with several services in order for that application to operate seamlessly, while guaranteeing privacy and security.

Cloud computing offers more than storing data off-site and allowing access through a browser. It also has the ability to adapt and scale its services to fit each user’s needs through intelligent algorithms. Basic usage of the cloud results in a more personalized experience, as the platform acquires greater familiarity with the intents of the user. In turn, this allows users to use smart services effectively and acquire better information so they can take action wherever they happen to be.

We as human beings are social entities. We naturally and instinctively interact with those around us. In the past, communication was done by telegraph, letters, telephone and faxes, but it is now largely through the Internet. The Internet has created a plethora of communication opportunities, such as instant messaging, Internet telephony and social media. Cloud computing expands on this concept and offers the opportunity to make it possible to incorporate interaction and collaboration capabilities into areas that were seemingly beyond our reach previously.

Due to this progression of the commonplace, our expectations become higher and higher over time. At some point in our past it was unthinkable for a cellular phone to be able to surf the net and provide driving directions. But today, not only do we expect our mobile phone to give us the Internet at our fingertips, we also expect it to guide us where we need to go.

Because of these expanding expectations, the cloud must be intelligent as well. There will be corresponding pressure for devices to catch up to cloud computing as it becomes increasingly intelligent and more intuitive.

Hand-held devices are great examples of this. Smartphones have a multitude of functions in addition to communications, such as GPS, voice recorder, camera, game device, calculator, and the list goes on. If a phone is paired with an operating system like Microsoft’s Windows Phone 7, it becomes a smart device capable of using cloud services to their full capabilities.

Because the cloud is built upon the capabilities of servers, it is appropriate to imagine large data centers when thinking of cloud computing. This means that server technology must advance as the cloud does, but there is a catch: cloud services will only become more powerful as server software does. In this way, server and cloud improvements mutually drive each other, and the user greatly benefits, whether the user is an individual, an organization or a company.

Once we tap into cloud computing fully, web sites will no longer crash because of surges in traffic; the cloud will accommodate peaks in computing activity accordingly.

For more information about the form and functionality of the cloud, visit Nubifer.com.