Archive for May, 2010

McAfee Makes a Cloud Computing Security Play with McAfee Cloud Secure

A new service from McAfee targeting Software-as-a-Service providers combines vulnerability scanning and security certification for cloud infrastructures. The service—called the McAfee Cloud Secure program—is designed to complement the annual audits of security and process controls that most cloud vendors undergo for certification. McAfee officials say that with McAfee Cloud Secure they will team up with certification providers to offer an additional layer of security: a daily scan of application, network perimeter and infrastructure vulnerabilities. Providers that pass will be rewarded with a “McAfee SECURE” seal of approval.

Earlier this month at the RSA security conference, securing cloud environments was a major topic of discussion. A survey by IDC on attitudes toward the cloud revealed that 87.5 percent of participants cited security concerns as the most significant obstacle to cloud adoption. IDC analyst Christian Christiansen said in a statement, “SaaS vendors have a difficult time convincing prospects that their services are secure and safe.” According to Christiansen, though, McAfee’s new offering is a step in the right direction toward increased security in the cloud.

McAfee and other vendors have discussed providing security from the cloud in the past, but this announcement reflects the industry’s increasing focus on solutions that secure cloud environments themselves.

Marc Olesen, senior vice president and general manager of McAfee’s Software-as-a-Service business, said in an interview with eWEEK, “McAfee looks at the cloud really from three different angles, which is security from the cloud, in the cloud and for the cloud. What’s really been out there today are (annual) process certification audits … that address the process controls and security controls that cloud providers have in place. This has typically been an ISO-27001 certification or an SAS-70 certification that cloud providers are using, and we feel that that’s very important, but it’s just a start.” For more information please contact a Nubifer representative today.

Cloud-Optimized Infrastructure and New Services on the Horizon for Dell

Over the past three years, Dell has gained cloud experience through its Data Center Solutions (DCS) group, which has designed customized offerings for cloud and hyperscale IT environments. The company is now putting that experience to use, releasing several new hardware, software and service offerings optimized for cloud computing environments. Dell officials launched the new offerings—which include a new partner program, new servers optimized for cloud computing and new services designed to help businesses migrate to the cloud—at a San Francisco event on March 24.

The new offerings, based on work the Dell Data Center Solutions group has completed over the past three years, were outlined by Valeria Knafo, senior manager of business development and business marketing for the DCS unit. According to Knafo, DCS has built customized computing infrastructures for large cloud service providers and hyperscale data centers and is now working to make those solutions available to enterprises. Said Knafo, “We’ve taken that experience and brought it to a new set of users.”

Dell officials revealed that they have been working with Microsoft on its Windows Azure cloud platform and that the software giant will work with Dell to create joint cloud-based solutions. Dell and Microsoft will continue to collaborate around Windows Azure (including offering services) and Microsoft will continue buying Dell hardware for its Azure platform as well. Turnkey cloud solutions—including pre-tested and pre-assembled hardware, software and services packages that businesses can use to deploy and run their cloud infrastructures quickly—are among the new offerings.

A cloud solution for Web applications will be the first Platform-as-a-Service made available. The offering will combine Dell servers and services with Web application software from Joyent, and it is meant to address the challenges that come with Web applications, Dell officials caution, like unpredictable traffic and migrating apps from development to production. Dell is also offering a new Cloud Partner Program, which officials say will broaden options for customers seeking to move into private or public clouds. Dell announced three new software companies as partners as well: Aster Data, Greenplum and Canonical.

Also on the horizon for Dell is its PowerEdge C-Series server family, designed to be energy efficient and to offer the features—such as large memory capacity and high performance—that are vital to hyperscale environments like HPC (high-performance computing), social networking, gaming, cloud computing and Web 2.0 workloads. The family comprises three servers: the C1100 (designed for clustered computing environments), the C2100 (for data analytics, cloud computing and cloud storage) and the C6100 (a four-node cloud and cluster system offering a shared infrastructure).

In unveiling the PowerEdge C-Series, Dell is joining an industry trend of offering new systems optimized for cloud computing. For example, on March 17 Fujitsu unveiled the Primergy CX1000, a rack server created to deliver the high performance such environments need while lowering costs and power consumption. The Primergy CX1000 can also save data center space through a design that pushes hot air from the system through the top of the enclosure rather than the back.

Last, but certainly not least, are Dell’s Integrated Solution Services, which offer complete cloud lifecycle management and include workshops to assess a company’s readiness to move to the cloud. Knafo said that the services combine capabilities Dell gained with the acquisition of Perot Systems and those it already had. “There’s a great interest in the cloud, and a lot of questions on how to get to the cloud. They want a path and a roadmap identifying what the cloud can bring,” said Knafo.

Mike Wilmington, a planner and strategist for Dell’s DCS group, said the services will reduce the confusion many enterprises have about the cloud. Said Wilmington, “Clouds are what the customer wants them to be,” meaning that while cloud computing may offer essentially the same benefits to all enterprises (cost reductions, flexibility, improved management and greater energy efficiency), it will look different for every enterprise. For more information please visit Nubifer.com.

Cisco, Verizon and Novell Make Announcements about Plans to Secure the Cloud

Cisco Systems, Verizon Business and Novell have announced plans to launch offerings designed to heighten security in the cloud.

On April 28, Cisco announced email and Web security services that are part of the company’s cloud protection push and its Secure Borderless Network architecture, which seeks to give users secure access to their corporate resources on any device, anywhere, at any time.

Cisco’s IronPort Email Data Loss Prevention and Encryption, and ScanSafe Web Intelligence Reporting are designed to work with Cisco’s other web security solutions to grant companies more flexibility when it comes to their security offerings while streamlining management requirements, increasing visibility and lowering costs.

Verizon and Novell announced on April 28 that they plan to collaborate on an on-demand identity and access management service called Secure Access Services from Verizon, designed to enable enterprises to decide and manage who is granted access to cloud-based resources. According to the companies, the identity-as-a-service solution is the first of what will be a host of joint offerings between Verizon and Novell.

According to eWEEK, studies continually indicate that businesses are likely to keep trending toward cloud computing environments. With that said, security and access control remain key concerns. Officials from Cisco, Verizon and Novell say that the new services will allow businesses to feel more at ease while planning their cloud computing strategies.

“The cloud is a critical component of Cisco’s architectural approach, including its Secure Borderless Network architecture,” said vice president and general manager of Cisco’s Security technology business unit Tom Gillis in a statement. “Securing the cloud is highly challenging. But it is one of the top challenges that the industry must rise to meet as enterprises increasingly demand the flexibility, accessibility and ease of management that cloud-based applications offer for their mobile and distributed workforces.”

Cisco purchased ScanSafe in December 2009, and the result is Cisco’s ScanSafe Web Intelligence Reporting platform. The platform is designed to give users a better idea of how their Internet resources are being used, with the objective of ensuring that business-critical workloads aren’t encumbered by non-business-related traffic. Cisco’s ScanSafe Web Intelligence Reporting platform can report on user-level data and Web communications activity within seconds, and offers over 80 predefined reports.

Designed to protect outbound email in the cloud, the IronPort email protection solution is aimed at enterprises that don’t want to manage their email in-house. Cisco officials say that it provides hosted mailboxes (while letting companies keep control of email policies) and also offers the option of integrated encryption.

Officials say Cisco operates over 30 data centers around the globe and that its security offerings handle large volumes of activity each day—including 2.8 billion reputation look-ups, 2.5 billion Web requests and the detection of more than 250 billion spam messages—and that these services are the latest in the company’s expanding portfolio of cloud security offerings.

Verizon and Novell’s collaboration—the Secure Access Services—is designed to enable enterprises to move away from the cost and complexity of traditional premises-based identity and access management software for securing applications. The new services offer centralized management of Web access to applications and networks, in addition to identity federation and Web single sign-on.

Novell CEO Ron Hovsepian released a statement saying, “Security and identity management are critical to accelerating cloud computing adoption and by teaming with Verizon we can deliver these important solutions.” While Verizon brings the security expertise, infrastructure, management capabilities and portal to the service, Novell provides the identity and security software. For more information contact a Nubifer representative today.

Cloud Interoperability Brought to Earth by Microsoft

Executives at Microsoft say that an interoperable cloud could help companies trying to lower costs and governments trying to connect constituents. Cloud services are increasingly seen as a way for businesses and governments to scale IT systems for the future, consolidate IT infrastructure, and enable innovative services not possible until now.

Technology vendors are seeking to identify and solve the issues created by operating in mixed IT environments in order to help organizations fully realize the benefits of cloud services. Additionally, vendors are collaborating to make sure that their products work well together. The industry may still be in the beginning stages of collaborating on cloud interoperability, but has already made great strides.

So what exactly is cloud interoperability, and how can it benefit companies now? Cloud interoperability concerns one cloud solution working with other platforms and applications—not just other clouds. Customers want to be able to run applications locally or in the cloud, or on a combination of both. Microsoft is collaborating with others in the industry to make sure that the promise of cloud interoperability becomes a reality.

General managers Craig Shank and Jean Paoli are spearheading Microsoft’s interoperability efforts. Shank helms the company’s interoperability work on public policy and global standards, while Paoli collaborates with the company’s product teams to tailor product strategies to the needs of customers. According to Shank, one of the main attractions of the cloud is the amount of flexibility and control it gives customers. “There’s a tremendous level of creative energy around cloud services right now—and the industry is exploring new ideas and scenarios all the time. Our goal is to preserve that flexibility through an open approach to cloud interoperability,” says Shank.

Paoli chimes in to say, “This means continuing to create software that’s more open from the ground up, building products that support technologies such as PHP and Java, and ensuring that our existing products work with the cloud.” Both Shank and Paoli are confident that welcoming competition and choice will make Microsoft more successful down the road. “This may seem surprising,” says Paoli, “but it creates more opportunities for its customers, partners and developers.”

Shank reveals that amid the buzz about the cloud, some forget the ultimate goal: “To be clear, cloud computing has enormous potential to stimulate economic growth and enable governments to reduce costs and expand services to citizens.” One example of the real-world benefits of cloud interoperability is the public sector, where Microsoft is showing results through solutions like its Eye on Earth project. Microsoft is helping the European Environment Agency simplify the collection and processing of environmental information for use by the general public and government officials. Eye on Earth obtains data from 22,000 water monitoring points and 1,000 stations that monitor air quality by employing Microsoft Windows Azure, Microsoft SQL Azure and existing Linux technologies. Eye on Earth then helps synthesize the information and makes it accessible to people in 24 different languages in real time.

Product developments like this emerged out of feedback channels the company developed with its partners, customers and other vendors. In 2006, for example, Microsoft created the Interoperability Executive Customer (IEC) Council, composed of 35 chief technology officers and chief information officers from a variety of organizations across the globe. The group meets twice a year in Redmond to discuss interoperability issues and provide feedback to Microsoft executives.

Additionally, Microsoft recently published a progress report which—for the first time—revealed operational details and results achieved by the Council across six work streams (or priority areas). The Council recently commissioned a seventh work stream for cloud interoperability, geared toward developing cloud-related standards that address topics like data portability, privacy, security and service policies.

Developers are an important part of cloud interoperability, and Microsoft is part of Simple Cloud, an effort the company co-founded with Zend Technologies, IBM and Rackspace. Simple Cloud was created to help developers write basic cloud applications that work on all major cloud platforms.

Microsoft is further engaging in the collaborative work of building technical “bridges” between the company and non-Microsoft technologies. Examples include the recently released Windows Azure Software Development Kits (SDKs) for PHP and Java, tools for Eclipse (version 1.0), the Windows Azure platform AppFabric SDKs for Java, PHP and Ruby, the SQL CRUD Application Wizard for PHP and the Bing 404 Web Page Error Toolkit for PHP. These examples show the dedication of Microsoft’s Interoperability team.

Despite the infancy of the industry’s collaboration on cloud interoperability issues, much progress has already been made. This progress has had a major positive impact on the way even average users work and live, even if they don’t realize it yet. Cloud interoperability requires a wide perspective and a creative, collaborative approach to problem-solving. In the future, Microsoft will continue to support conversation within the industry in order to define cloud principles and make sure all points of view are incorporated. For more information please contact a Nubifer representative today.

Amazon Sets the Record Straight About the Top Five Myths Surrounding Cloud Computing

On April 19, the 5th International Cloud Computing Conference & Expo (Cloud Expo) opened in New York City, and Amazon Web Services (AWS) used the event as a platform to address some of what the company sees as the lingering myths about cloud computing.

AWS officials said that the company continues to grapple with questions about features of the cloud—ranging from reliability and security to cost and elasticity—despite being one of the first companies to successfully and profitably implement cloud computing solutions. Adam Selipsky, vice president of AWS, recently spoke about the persisting myths of cloud computing from Amazon’s Seattle headquarters, specifically addressing five that linger in the face of increased industry adoption of the cloud and continued successful cloud deployments. “We’ve seen a lot of misperceptions about what cloud computing is,” said Selipsky before debunking five common myths.

Myth 1: The Cloud Isn’t Reliable

Chief information officers (CIOs) in enterprise organizations have difficult jobs and are usually responsible for thousands of applications, explains Selipsky in his opening argument, adding that they feel personally responsible for the performance and security of those applications. When problems with the applications arise, CIOs are used to approaching their own people for answers, and they take some comfort in having a way to take control of the situation.

Selipsky says that customers need to consider a few things when adopting the cloud, one of which is that AWS’ operational performance is good. Selipsky reminded users that they own the data, they choose which location to store the data in (and it doesn’t move unless the customer decides to move it) and that, regardless of whether customers choose to encrypt, AWS never looks at the data.

“We have very strong data durability—we’ve designed Amazon S3 (Simple Storage Service) for eleven 9’s of durability. We store multiple copies of each object across multiple locations,” said Selipsky. He added that AWS has a “Versioning” feature which allows customers to revert to the last version of any object they lose due to application failure or an unintentional deletion. Customers can also build more fault-tolerant applications by deploying them across multiple Availability Zones or by using AWS’ load balancing and Auto Scaling features.
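To make the Versioning feature concrete, here is a minimal sketch of enabling it on a bucket and inspecting an object’s stored versions. It uses the modern boto3 Python SDK (which postdates this article) and a hypothetical bucket and object name, so treat it as an illustration of the concept rather than AWS’ exact tooling at the time.

```python
import boto3

s3 = boto3.client("s3")

# Enable versioning so overwritten or deleted objects can be recovered later.
s3.put_bucket_versioning(
    Bucket="example-bucket",  # hypothetical bucket name
    VersioningConfiguration={"Status": "Enabled"},
)

# After an application failure or unintentional deletion, list the stored
# versions of an object to find one to revert to.
versions = s3.list_object_versions(Bucket="example-bucket", Prefix="report.csv")
for v in versions.get("Versions", []):
    print(v["Key"], v["VersionId"], v["IsLatest"])
```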

“And, all that comes with no capex [capital expenditures] for companies, a low per-unit cost where you only pay for what you consume, and the ability to focus your engineers on unique incremental value for your business,” said Selipsky, adding that the reliability claims stem merely from an illusion of control, not actual control. “People think if they can control it they have more say in how things go. It’s like being in a car versus an airplane, but you’re much safer in a plane,” he explained.

Myth 2: The Cloud Provides Inadequate Security and Privacy

When it comes to security, Selipsky notes that it is an end-to-end process, so companies need to build security at every level of the stack. Amazon’s cloud employs the same security isolations as a traditional data center—including physical data center security, separation of the network, isolation of the server hardware and isolation of storage. On the physical side, data centers had already become frequently shared infrastructure before Amazon launched its cloud services; companies realized, Selipsky added, that they could benefit by renting space in a data facility instead of building their own.

When speaking about security fundamentals, Selipsky noted that security could be maintained by providing badge-controlled access, guard stations, monitored security cameras, alarms, separate cages and strictly audited procedures and processes. Not only does Amazon Web Services’ data center security match the best practices employed in private data facilities, there is an added physical security advantage in the fact that customers don’t need access to the servers and networking gear inside. Access to the data center is thus controlled more strictly than in traditional rented facilities. Selipsky also added that, at the physical level, the Amazon cloud has equal or better isolation than could be expected from dedicated infrastructure.

In his argument, Selipsky pointed out that networks ceased to be isolated physical islands a long time ago: as companies increasingly needed to connect to other companies—and then the Internet—their networks became connected with public infrastructure. Firewalls, switch configurations and other special network functionality were used to prevent bad network traffic from getting in, or conversely from leaking out. As their traffic increasingly passed over public infrastructure, companies began using additional isolation techniques, such as Multi-protocol Label Switching (MPLS) and encryption, to make sure every packet on (or leaving) their network remained secure.

Amazon takes a similar approach to networking in its cloud, maintaining packet-level isolation of network traffic and supporting industry-standard encryption. Amazon Web Services’ Virtual Private Cloud allows a customer to establish their own IP address space, so customers can use the same tools and software infrastructure they are already familiar with to monitor and control their cloud networks. Amazon’s scale also allows for more investment in security policing and countermeasures than nearly any large corporation could afford. Maintains Selipsky, “Our security is strong and dug in at the DNA level.”

Amazon Web Services also invests significantly in testing and validating the security of its virtual server and storage environment. When discussing the investments made on the hardware side, Selipsky lists:

After customers release these resources, the server and storage are wiped clean so no important data can be left behind.

Intrusion from other running instances is prevented because each instance has its own customer firewall.

Those in need of more network isolation can use Amazon VPC, which allows you to carry your own IP address space with you into the cloud; your instances are accessible only through those IP addresses, which only you know.

Those desiring to run on their own boxes—where no other instances are running—can purchase extra-large instances, in which case only that XL instance runs on that server.

According to Selipsky, Amazon’s scale allows for more investment in security policing and countermeasures: “In fact, we often find that we can improve companies’ security posture when they use AWS. Take the example lots of CIOs worry about—the rogue server under a developer’s desk running something destructive or that the CIO doesn’t want running. Today, it’s really hard (if not impossible) for CIOs to know how many orphans there are and where they might be. With AWS, CIOs can make a single API call and see every system running in their VPC [Virtual Private Cloud]. No more hidden servers under the desk or anonymously placed servers in a rack and plugged into the corporate network. Finally, AWS is SAS-70 certified; ISO 27001 and NIST are in process.”
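As an illustration of that “single API call,” the sketch below lists every instance in a VPC with EC2’s DescribeInstances operation via the boto3 Python SDK (which postdates this article); the VPC ID is a placeholder.

```python
import boto3

ec2 = boto3.client("ec2")

# One DescribeInstances call, filtered by VPC, enumerates every running
# system -- no more hidden servers under a developer's desk.
response = ec2.describe_instances(
    Filters=[{"Name": "vpc-id", "Values": ["vpc-0123456789abcdef0"]}]  # placeholder ID
)
for reservation in response["Reservations"]:
    for instance in reservation["Instances"]:
        print(instance["InstanceId"], instance["State"]["Name"])
```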

Myth 3: Creating My Own In-House Cloud or Private Cloud Will Allow Me to Reap the Same Benefits of the Cloud

According to Selipsky, “There’s a lot of marketing going on about the concept of the ‘private cloud.’ We think there’s a bit of a misnomer here.” Selipsky continued, “We often see companies struggling to accurately measure the cost of infrastructure. Scale and utilization are big advantages for AWS. In our opinion, a cloud has five key characteristics: it eliminates capex; allows you to pay for what you use; provides true elastic capacity to scale up and down; allows you to move very quickly and provision servers in minutes; and allows you to offload the undifferentiated heavy lifting of infrastructure so your engineers work on differentiating problems.”

Selipsky also pointed out the drawbacks of private clouds: you still own the capex (and it is expensive); you do not pay only for what you use; you do not have true elasticity; and you still manage the undifferentiated heavy lifting. “With a private cloud you have to manage capacity very carefully … or you or your private cloud vendor will end up over-provisioning. So you’re going to have to either get very good at capacity management or you’re going to wind up overpaying,” said Selipsky, before challenging the elasticity of the private cloud: “The cloud is shapeless. But if it has a tight box around it, it no longer feels very cloud-like.”

One of AWS’ key offerings is Amazon’s ability to save customers money while also driving efficiency. “In virtually every case we’ve seen, we’ve been able to save people a significant amount of money,” said Selipsky. This is in part because AWS’ business has greatly expanded over the last four years and Amazon has achieved enough scale to secure very low costs. AWS has been able to aggregate hundreds of thousands of customers to have a high utilization of its infrastructure. Said Selipsky, “In our conversations with customers we see that really good enterprises are in the 20-30 percent range on utilization—and that’s when they’re good … many are not that strong. The cloud allows us to have several times that utilization. Finally, it’s worth looking at Amazon’s heritage and AWS’ history. We’re a company that works hard to lower its costs so that we can pass savings back to our customers. If you look at the history of AWS, that’s exactly what we’ve done (lowering price on EC2, S3, CloudFront, and AWS bandwidth multiple times already without any competitive pressure to do so).”

Myth 4: The Cloud Isn’t Ideal Because I Can’t Move Everything at Once

Selipsky debunks this myth by saying, “We believe this is nearly impossible and ill-advised. We recommend picking a few apps to gain experience and comfort, then building a migration plan. This is what we most often see companies doing. Companies will be operating in hybrid environments for years to come. We see some companies putting some stuff on AWS and then keeping some stuff in-house. And I think that’s fine. It’s a perfectly prudent and legitimate way of proceeding.”

Myth 5: The Biggest Driver of Cloud Adoption is Cost

In busting the final myth, Selipsky said, “There is a big savings in capex and cost but what we find is that one of the main drivers of adoption is that time-to-market for ideas is much faster in the cloud because it lets you focus your engineering resources on what differentiates your business.”

Summary

Speaking about all of the myths surrounding the cloud, Selipsky concludes that “a lot of this revolves around psychology and fear of change, and human beings needing to gain comfort with new things. Years ago people swore they would never put their credit card information online. But that’s no longer the case. We’re seeing great momentum. We’re seeing, more and more, over time these barriers [to cloud adoption] are moving.” For additional debunked myths regarding Cloud Computing visit Nubifer.com.

IBM Elevates Its Cloud Offerings with Purchase of Cast Iron Systems

IBM Senior Vice President and Group Executive for IBM Software Group Steve Mills announced the acquisition of cloud integration specialist Cast Iron Systems at the IBM Impact 2010 conference in Las Vegas on May 3. The privately held Cast Iron is based in Mountain View, California and delivers cloud integration software, appliances and services; the acquisition thus broadens the delivery of cloud computing services for IBM’s clients. IBM’s business process and integration software portfolio grew over 20 percent during the first quarter, and the company sees this deal as a way to expand it further. The financial terms of the acquisition were not disclosed, although Cast Iron Systems’ 75 employees will be integrated into IBM.

According to IBM officials, Big Blue expects the worldwide cloud computing market to grow at a compound annual rate of 28 percent, from $47 billion in 2008 to a projected $126 billion by 2012. The acquisition of Cast Iron Systems reflects IBM’s expansion of its software business around higher-value capabilities that help clients run their companies more effectively.

IBM has transformed its business model over the past ten years to focus on higher-value, high-margin capabilities through organic and acquisitive growth, and the company’s software business has been a key catalyst in this shift. IBM’s software revenue grew 11 percent year-over-year during the first quarter, and the company generated $8 billion in software group profits in 2008 (up from $2.8 billion in 2000).

Since 2003, the IBM Software Group has acquired over 55 companies, and the acquisition of Cast Iron Systems continues that strategy. Cast Iron Systems’ clients include Allianz, Peet’s Coffee & Tea, NEC, Dow Jones, Schumacher Group, ShoreTel, Time Warner, Westmont University and Sports Authority, and the cloud integration specialist has completed thousands of cloud integrations around the globe for retail organizations, financial institutions, and media and entertainment companies.

IBM’s acquisition comes at a time when one of the major challenges facing businesses adopting cloud delivery models is integrating the disparate systems running in their data centers with new cloud-based applications—historically time-consuming work that drained resources. With the acquisition of Cast Iron Systems, IBM gains the ability to help businesses rapidly integrate their cloud-based applications and on-premises systems. Additionally, the acquisition advances IBM’s capabilities for a hybrid cloud model, which allows enterprises to blend data from on-premises applications with public and private cloud systems.

IBM, which is known for offering application integration capabilities for on-premises and business-to-business applications, will now be able to offer clients a complete platform to integrate cloud applications from providers like Amazon, Salesforce.com, NetSuite and ADP with on-premises applications like SAP and JD Edwards. Relationships between IBM and Amazon and Salesforce.com will likely become friendlier because of this acquisition.

IBM said that it can use Cast Iron Systems’ hundreds of prebuilt templates and services expertise to eliminate expensive coding, allowing cloud integrations to be completed in mere days (rather than weeks, or even longer). These results can be achieved using a physical appliance, a virtual appliance or a cloud service.

Craig Hayman, general manager for IBM WebSphere said in a statement, “The integration challenges Cast Iron Systems is tackling are crucial to clients who are looking to adopt alternative delivery models to manage their businesses. The combination of IBM and Cast Iron Systems will make it easy for clients to integrate business applications, no matter where those applications reside. This will give clients greater agility and, as a result, better business outcomes.”

IBM cited Cast Iron Systems helping pharmaceutical distributor AmerisourceBergen Specialty Group connect Salesforce CRM with its on-premises corporate data warehouse as an example. The company has since been able to give its customer service associates access to the accurate, real-time information they need to deliver a positive customer experience while realizing $250,000 in annual cost savings.

Cast Iron Systems also helped a division of global corporate insurance leader Allianz integrate Salesforce CRM with its on-premises underwriting applications to offer real-time visibility into contract renewals for its sales team and key performance indicators for sales management. IBM said that Allianz beat its own 30-day integration project deadline by replacing labor-intensive custom code with Cast Iron Systems’ integration solution.

President and chief executive officer of Cast Iron Systems Ken Comee said, “Through IBM, we can bring Cast Iron Systems’ capabilities as the world’s leading provider of cloud integration software and services to a global customer set. Companies around the world will now gain access to our technologies through IBM’s global reach and its vast network of partners. As part of IBM, we will be able to offer clients a broader set of software, services and hardware to support their cloud and other IT initiatives.”

IBM will remain consistent with its software strategy by supporting and enhancing Cast Iron Systems’ technologies and clients while simultaneously allowing them to utilize the broader IBM portfolio. For more information, visit Nubifer.com.

Transforming Into a Service-Centric IT Organization By Using the Cloud

While IT executives typically approach cloud services from the perspective of how they are delivered, this model neglects what cloud services are and how they are consumed. These two facets can have a large impact on the overall IT organization, points out eWEEK Knowledge Center contributor Keith Jahn. Jahn maintains that it is very important for IT executives to move beyond the current delivery-only focus by creating a world-class supply chain for managing the supply and demand of cloud services.

Using the popular fable The Sky Is Falling, known lovingly as Chicken Little, Jahn explains a possible future scenario that IT organizations may face due to cloud computing. As the fable goes, Chicken Little embarks on a life-threatening journey to warn the king that the sky is falling and on this journey she gathers friends who join her on her quest. Eventually, the group encounters a sly fox who tricks them into thinking that he has a better path to help them reach the king. The tale can end one of two ways: the fox eats the gullible animals (thus communicating the lesson “Don’t believe everything you hear”) or the king’s hunting dogs can save the day (thus teaching a lesson about courage and perseverance).

So what does this have to do with cloud computing? Cloud computing has the capacity to bring about a scenario that will force IT organizations to change, or possibly be eliminated altogether. The entire technology supply chain will be severely impacted if IT organizations are wiped out. Traditionally, cloud is viewed as a technology disruption and is assessed from a delivery orientation, posing questions like: how can this new technology deliver solutions cheaper, better and faster? An equally important yet often ignored aspect of this equation is how cloud services are consumed. Cloud services are ready to run, self-sourced, available wherever you are, and pay-as-you-go or subscription-based.

New capabilities will emerge as cloud services grow and mature, and organizations will be able to solve new problems as they arise, as well as solve old problems cheaper, better and faster. New business models will be ushered in by cloud services, and these models will force IT to reinvent itself in order to remain relevant. Essentially, IT must move away from its focus on the delivery and management of assets and toward the creation of a world-class supply chain for managing supply and demand of business services.

Cloud services become a forcing function in this scenario because they compel IT to transform. CIOs who ignore this and fail to take transformative measures will likely see their role shift from innovation leader to CMO (Chief Maintenance Officer), in charge of maintaining legacy systems and services sourced by the business.

Analyzing the Cloud to Pinpoint Patterns

The cloud really began in what IT folks now refer to as the "Internet era," when people were talking about what was being hosted "in the cloud." This was the first generation of the cloud, Cloud 1.0 if you will: an enabler that originated in the enterprise. Commercial use of the Internet as a trusted platform revolutionized Supply Chain Management (SCM) processes and eventually altered the IT architectural landscape forever.

This model evolved and produced thousands of consumer-class services, which used next-generation Internet technologies on the front end and massive scale architectures on the back end to deliver low-cost services to economic buyers. Enter Cloud 2.0, a more advanced generation of the cloud.

Beyond Cloud 2.0

Cloud 2.0 is driven by the consumer experiences that emerged out of Cloud 1.0. A new economic model and new technologies have surfaced since then, due to Internet-based shopping, search and other services. Services can be self-sourced from anywhere and from any device—and delivered immediately—while infrastructure and applications can be sourced as services in an on-demand manner.

Currently, most of the attention when it comes to cloud services remains focused on the new techniques and sourcing alternatives for IT capabilities, aka IT-as-a-Service. IT can drive higher degrees of automation and consolidation using standardized, highly virtualized infrastructure and applications. This results in a reduction in the cost of maintaining existing solutions and delivering new solutions.

Many companies are struggling with the transition from Cloud 1.0 to Cloud 2.0 due to the technology transitions required to make the move. As this occurs, the volume of services in the commercial cloud marketplace is increasing, data is propagating into the cloud and Web 3.0/semantic Web technology is maturing. The next generation of the cloud, Cloud 3.0, is beginning to materialize because of these factors.

Cloud 3.0 is significantly different because it will enable access to information through services set in the context of the consumer experience. This means that processes can be broken into smaller pieces and automated through a collection of services, woven together with access to massive amounts of data. With Cloud 3.0, the need for large-scale, complex applications built around monolithic processes is eliminated: changes can be made by refactoring service models, and integration can be achieved by subscribing to new data feeds, as the sketch below illustrates. New connections, new capabilities and new innovations—all of which surpass the current model—will be created.
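As a purely illustrative sketch of that idea (every service name and URL below is hypothetical), a business process in Cloud 3.0 terms might be a thin composition of subscribed services rather than one monolithic application; swapping a provider or adding a data feed means repointing an endpoint, not rewriting the system.

```python
import requests

def enrich_order(order_id: str) -> dict:
    """Compose one business step from three independently sourced services."""
    # Each call below is a separate subscribed service feed (hypothetical URLs).
    order = requests.get(f"https://orders.example.com/v1/orders/{order_id}").json()
    customer = requests.get(
        f"https://crm.example.com/v1/customers/{order['customer_id']}"
    ).json()
    risk = requests.post("https://risk.example.com/v1/score", json=order).json()
    return {"order": order, "customer": customer, "risk_score": risk["score"]}
```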

The Necessary Reinvention of IT

IT is typically organized around various technology domains, taking in new work via project requests and moving it through a Plan-Build-Run cycle. Herein lies the problem: this delivery-oriented, technology-centric approach has latency built in, and that latency has created increasing tension with the business IT serves. This is why IT must reinvent itself.

IT must be reinvented so that it becomes the central service-sourcing control point for the enterprise, or else the business will source services on its own. By becoming the central service-sourcing control point for the enterprise, IT can maintain the required service levels and integrations. Changes to behavior, cultural norms and organizational models are required to achieve this.

IT Must Become Service-Centric in the Cloud

IT must evolve from a technology-centric organization into a service-centric organization in order to survive, as service-centric represents an advanced state of maturity for the IT function. Service-centric allows IT to operate as a business function—a service provider—created around a set of products which customers value and are in turn willing to pay for.

As part of the business strategy, these services are organized into a service portfolio. This model differs from the capability-centric model in that the deliverable is a service procured as a unit through a catalog, and the components—and sources of components—are irrelevant to the buyer. In the capability-centric model, by contrast, the deliverables are usually a collection of technology assets, often visible to the economic buyer and delivered through a project-oriented life cycle.
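A minimal, hypothetical sketch of what such a catalog entry might capture in the service-centric model: the buyer sees the service, its terms and its price, while the component sources stay behind the curtain.

```python
from dataclasses import dataclass

@dataclass
class CatalogEntry:
    name: str           # what the buyer orders, e.g. "Managed Database"
    service_level: str  # the terms the consumer sees and pays against
    unit_price: float   # pay-per-use price per unit consumed
    source: str         # internal team or external provider; invisible to the buyer

catalog = [
    CatalogEntry("Managed Database", "99.9% monthly uptime", 0.12, "internal"),
    CatalogEntry("Email Archiving", "7-year retention", 0.02, "external SaaS"),
]
print([entry.name for entry in catalog])
```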

With the service-centric model, some existing roles within the IT organization will be eliminated and some new ones will be created. The result is a more agile IT organization which is able to rapidly respond to changing business needs and compete with commercial providers in the cloud service marketplace.

Cloud 3.0: A Business Enabler

Cloud 3.0 enables business users to source services that meet their needs quickly, cost-effectively and at a good service level—and on their own, without the help of an IT organization. Cloud 3.0 will usher in breakthroughs and innovations at an unforeseen pace and scope and will introduce new threats to existing markets for companies while opening new markets for others. In this way, it can be said that cloud is more of a business revolution than a technology one.

Rather than focusing on positioning themselves to adopt and implement cloud technology, a more effective strategy for IT organizations would be to focus on transforming the IT organization into a service-centric model that is able to source, integrate and manage services with high efficiency.

Back to the story and its two possible endings:

The first scenario suggests that IT will choose to ignore that its role is being threatened and continue to focus on the delivery aspects of the cloud. Under the second scenario, IT is rescued by transforming into the service-centric organization model and becoming the single sourcing control point for services in the enterprise. This will effectively place IT in control of fostering business innovation by embracing the next wave of cloud. For more information please visit Nubifer.com.