Archive for May, 2010

McAfee Makes a Cloud Computing Security Play with McAfee Cloud Secure

A new McAfee service targeting Software-as-a-Service providers combines vulnerability scanning and security certification for cloud infrastructures. The service, called the McAfee Cloud Secure program, is designed to complement the annual audits of security and process controls that most cloud vendors undergo for certification. McAfee officials say that with McAfee Cloud Secure they will team up with certification providers to offer an additional level of security through a daily scan of application, network perimeter and infrastructure vulnerabilities. Those that pass will be rewarded with a “McAfee SECURE” seal of approval.

Earlier this month at the RSA security conference, securing cloud environments was a major topic of discussion. An IDC survey on attitudes toward the cloud found that 87.5 percent of participants named security concerns as the most significant obstacle to cloud adoption. IDC analyst Christian Christiansen said in a statement, “SaaS vendors have a difficult time convincing prospects that their services are secure and safe.” According to Christiansen, though, McAfee’s new offering is a step in the right direction toward increased security in the cloud.

McAfee and other vendors have discussed providing security from the cloud in the past, but this announcement reflects the industry’s increasing focus on solutions that secure cloud environments themselves.

Marc Olesen, senior vice president and general manager of McAfee’s Software-as-a-Service business, said in an interview with eWEEK, “McAfee looks at the cloud really from three different angles, which is security from the cloud, in the cloud and for the cloud. What’s really been out there today are (annual) process certification audits … that address the process controls and security controls that cloud providers have in place. This has typically been an ISO-27001 certification or an SAS-70 certification that cloud providers are using, and we feel that that’s very important, but it’s just a start.” For more information please contact a Nubifer representative today.

Cloud-Optimized Infrastructure and New Services on the Horizon for Dell

Over the past three years, Dell has gained experience in the cloud through its Data Center Solutions group, which has designed customized offerings for cloud and hyperscale IT environments. The company is now putting that experience to use, releasing several new hardware, software and service offerings optimized for cloud computing environments. Dell officials launched the new offerings, which include a new partner program, new servers optimized for cloud computing and new services designed to help businesses migrate to the cloud, at a San Francisco event on March 24.

The new offerings are based on work the Dell Data Center Solutions (DCS) group has completed over the past three years and were outlined by Valeria Knafo, senior manager of business development and business marketing for the DCS unit. According to Knafo, DCS has built customized computing infrastructures for large cloud service providers and hyperscale data centers and is now trying to make those solutions available to enterprises. Said Knafo, “We’ve taken that experience and brought it to a new set of users.”

Dell officials revealed that they have been working with Microsoft on its Windows Azure cloud platform and that the software giant will work with Dell to create joint cloud-based solutions. Dell and Microsoft will continue to collaborate around Windows Azure (including offering services) and Microsoft will continue buying Dell hardware for its Azure platform as well. Turnkey cloud solutions—including pre-tested and pre-assembled hardware, software and services packages that businesses can use to deploy and run their cloud infrastructures quickly—are among the new offerings.

A cloud solution for Web applications will be the first Platform-as-a-Service made available. The offering will combine Dell servers and services with Web application software from Joyent and, Dell officials caution, will come with challenges such as unpredictable traffic and migrating apps from development to production. Dell is also offering a new Cloud Partner Program, which officials say will broaden options for customers seeking to move into private or public clouds. Dell also announced three new software companies as partners: Aster Data, Greenplum and Canonical.

Also on the horizon for Dell are its PowerEdge C-Series servers, which are designed to be energy efficient and to offer the features, such as memory capacity and high performance, that are vital to hyperscale environments: HPC (high-performance computing), social networking, gaming, cloud computing and Web 2.0 workloads. The family is made up of three servers: the C1100 (designed for clustered computing environments), the C2100 (for data analytics, cloud computing and cloud storage) and the C6100 (a four-node cloud and cluster system that offers a shared infrastructure).

In unveiling the PowerEdge C-Series, Dell is joining an industry trend of offering new systems optimized for cloud computing. For example, on March 17 Fujitsu unveiled the Primergy CX1000, a rack server created to deliver the high performance such environments need while lowering costs and power consumption. The Primergy CX1000 can also save on data center space thanks to a design that pushes hot air from the system through the top of the enclosure instead of the back.

Last, but certainly not least, are Dell’s Integrated Solution Services. They offer complete cloud lifecycle management and include workshops to assess a company’s readiness to move to the cloud. Knafo said that the services combine what Dell gained with the acquisition of Perot Systems and what it already had. “There’s a great interest in the cloud, and a lot of questions on how to get to the cloud. They want a path and a roadmap identifying what the cloud can bring,” said Knafo.

Mike Wilmington, a planner and strategist for Dell’s DCS group, claimed the services will decrease confusion many enterprises may have about the cloud. Said Wilmington, “Clouds are what the customer wants them to be,” meaning that while cloud computing may offer essentially the same benefits to all enterprises (cost reductions, flexibility, improved management and greater energy efficiency) it will look different for every enterprise. For more information please visit Nubifer.com.

Cisco, Verizon and Novell Make Announcements about Plans to Secure the Cloud

Cisco Systems, Verizon Business and Novell announce plans to launch offerings designed to heighten security in the cloud.

On April 28, Cisco announced security services based around email and the Web that are part of the company’s cloud protection push and its Secure Borderless Network architecture; the architecture seeks to give users secure access to their corporate resources on any device, anywhere, at any time.

Cisco’s IronPort Email Data Loss Prevention and Encryption, and ScanSafe Web Intelligence Reporting are designed to work with Cisco’s other web security solutions to grant companies more flexibility when it comes to their security offerings while streamlining management requirements, increasing visibility and lowering costs.

Verizon and Novell announced on April 28 that they plan to collaborate on an on-demand identity and access management service called Secure Access Services from Verizon. The service is designed to enable enterprises to decide and manage who is granted access to cloud-based resources. According to the companies, the identity-as-a-service solution is the first of what will be a host of joint offerings between Verizon and Novell.

According to eWeek, studies continuously indicate that businesses are likely to continue trending toward a cloud-computing environment. With that said, issues concerning security and access control remain key concerns. Officials from Cisco, Verizon and Novell say that the new services will allow businesses to feel more at ease while planning their cloud computing strategies.

“The cloud is a critical component of Cisco’s architectural approach, including its Secure Borderless Network architecture,” said vice president and general manager of Cisco’s Security technology business unit Tom Gillis in a statement. “Securing the cloud is highly challenging. But it is one of the top challenges that the industry must rise to meet as enterprises increasingly demand the flexibility, accessibility and ease of management that cloud-based applications offer for their mobile and distributed workforces.”

Cisco purchased ScanSafe in December 2009, and the result is Cisco’s ScanSafe Web Intelligence Reporting platform. The platform is designed to give users a better idea of how their Internet resources are being used, with the objective of ensuring that business-critical workloads aren’t being encumbered by non-business-related traffic. It can report on user-level data and Web communications activity within seconds, and offers over 80 predefined reports.

Designed to protect outbound email in the cloud, the IronPort email protection solution is well suited to enterprises that don’t want to manage their email. Cisco officials say that it provides hosted mailboxes (while letting customers keep control of email policies) and also offers the option of integrated encryption.

Officials say Cisco operates over 30 data centers around the globe and that its security offerings handle large quantities of activity each day—including 2.8 billion reputation look-ups, 2.5 billion web requests and the detection of more than 250 billion spam messages—and that the new services are the latest additions to the company’s expanding portfolio of cloud security offerings.

Verizon and Novell’s collaboration—the Secure Access Services—is designed to enable enterprises to move away from the cost and complexity of using traditional premises-based identity and access management software to secure applications. The new services offer centralized management of Web access to applications and networks, in addition to identity federation and Web single sign-on.

Novell CEO Ron Hovsepian released a statement saying, “Security and identity management are critical to accelerating cloud computing adoption and by teaming with Verizon we can deliver these important solutions.” While Verizon brings the security expertise, infrastructure, management capabilities and portal to the service, Novell provides the identity and security software. For more information contact a Nubifer representative today.

Cloud Interoperability Brought to Earth by Microsoft

Executives at Microsoft say that an interoperable cloud could help companies trying to lower costs and governments trying to connect constituents. Cloud services are increasingly seen as a way for businesses and governments to scale IT systems for the future, consolidate IT infrastructure, and enable innovative services not possible until now.

Technology vendors are seeking to identify and solve the issues created by operating in mixed IT environments in order to help organizations fully realize the benefits of cloud services. Additionally, vendors are collaborating to make sure that their products work well together. The industry may still be in the beginning stages of collaborating on cloud interoperability, but has already made great strides.

So what exactly is cloud interoperability and how can it benefit companies now? Cloud interoperability concerns one cloud solution working with other platforms and applications—not just other clouds. Customers want to be able to run applications locally, in the cloud, or on a combination of both. Microsoft is currently collaborating with others in the industry to make sure that the promise of cloud interoperability becomes a reality.

Microsoft’s general managers Craig Shank and Jean Paoli are spearheading Microsoft’s interoperability efforts. Shank helms the company’s interoperability work on public policy and global standards and Paoli collaborates with the company’s product teams to cater product strategies to the needs of customers. According to Shank, one of the main attractions of the cloud is the amount of flexibility and control it gives customers. “There’s a tremendous level of creative energy around cloud services right now—and the industry is exploring new ideas and scenarios all the time. Our goal is to preserve that flexibility through an open approach to cloud interoperability,” says Shank.

Paoli adds, “This means continuing to create software that’s more open from the ground up, building products that support technologies such as PHP and Java, and ensuring that our existing products work with the cloud.” Both Shank and Paoli are confident that welcoming competition and choice will make Microsoft more successful down the road. “This may seem surprising,” says Paoli, “but it creates more opportunities for its customers, partners and developers.”

Shank notes that amid the buzz about the cloud, some forget the ultimate goal: “To be clear, cloud computing has enormous potential to stimulate economic growth and enable governments to reduce costs and expand services to citizens.” One example of the real-world benefits of cloud interoperability is the public sector, where Microsoft is showing results through projects like Eye on Earth. Microsoft is helping the European Environment Agency simplify the collection and processing of environmental information for use by the general public and government officials. Eye on Earth obtains data from 22,000 water monitoring points and 1,000 air-quality monitoring stations using Microsoft Windows Azure, Microsoft SQL Azure and existing Linux technologies, then helps synthesize the information and makes it accessible to people in 24 languages in real time.

Product developments like this emerged out of feedback channels the company developed with its partners, customers and other vendors. In 2006, for example, Microsoft created the Interoperability Executive Customer (IEC) Council, which is comprised of 35 chief technology officers and chief information officers from a variety of organizations across the globe. The group meets twice a year in Redmond to discuss issues concerning interoperability and provide feedback to Microsoft executives.

Additionally, Microsoft recently published a progress report which, for the first time, revealed operational details and results achieved by the Council across six work streams (or priority areas). The Council recently commissioned a seventh work stream for cloud interoperability, geared toward developing cloud-related standards that address topics like data portability, privacy, security and service policies.

Developers are an important part of cloud interoperability, and Microsoft is part of an effort the company co-founded with Zend Technologies, IBM and Rackspace called Simple Cloud. Simple Cloud was created to help developers write basic cloud applications that work on all major cloud platforms.

Microsoft is further engaging in the collaborative work of building technical “bridges” between the company and non-Microsoft technologies, like the recently released Microsoft Windows Azure Software Development Kits (SDKs) for PHP and Java, tools for the new Windows Azure platform AppFabric SDKs for Java, PHP and Ruby (Eclipse version 1.0), the SQL CRUD Application Wizard for PHP and the Bing 404 Web Page Error Toolkit for PHP. These examples show the dedication of Microsoft’s Interoperability team.

Despite the infancy of the industry’s collaboration on cloud interoperability issues, much progress has already been made. This progress has had a major positive impact on the way even average users work and live, even if they don’t realize it yet. A wide perspective and a creative and collaborative approach to problem-solving are required for cloud interoperability. In the future, Microsoft will continue to support more conversation within the industry in order to define cloud principles and make sure all points of view are incorporated. For more information please contact a Nubifer representative today.

Amazon Sets the Record Straight About the Top Five Myths Surrounding Cloud Computing

On April 19, the 5th International Cloud Computing Conference & Expo (Cloud Expo) opened in New York City, and Amazon Web Services (AWS) used the event as a platform to address some of what the company sees as the lingering myths about cloud computing.

AWS officials said that the company continues to grapple with questions about features of the cloud, ranging from reliability and security to cost and elasticity, despite being one of the first companies to successfully and profitably implement cloud computing solutions. Adam Selipsky, vice president of AWS, recently spoke about the persisting myths of cloud computing from Amazon’s Seattle headquarters, specifically addressing five that linger in the face of increased industry adoption of the cloud and continued successful cloud deployments. “We’ve seen a lot of misperceptions about what cloud computing is,” said Selipsky before debunking five common myths.

Myth 1: The Cloud Isn’t Reliable

Chief information officers (CIOs) in enterprise organizations have difficult jobs and are usually responsible for thousands of applications, explains Selipsky in his opening argument, adding that they feel like they are responsible for the performance and security of these applications. When problems with the applications arise, CIOs are used to approaching their own people for answers and take some comfort that there is a way to take control of the situation.

Selipsky says that customers need to consider a few things when adopting the cloud, one of which is that AWS’ operational performance is good. Selipsky reminded users that they own their data, that they choose which location to store the data in (and it doesn’t move unless the customer decides to move it) and that, regardless of whether customers choose to encrypt, AWS never looks at the data.

“We have very strong data durability—we’ve designed Amazon S3 (Simple Storage Service) for eleven 9’s of durability. We store multiple copies of each object across multiple locations,” said Selipsky. He added that AWS has a “Versioning” feature which allows customers to revert to the last version of any object they lose to application failure or unintentional deletion. Customers can also make their applications more fault tolerant by deploying them across multiple Availability Zones or by using AWS’ load balancing and auto scaling features.
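As a concrete illustration of the Versioning feature Selipsky mentions, here is a minimal sketch using the boto3 Python SDK (a newer AWS SDK than those available when this piece was written); the bucket name is a hypothetical placeholder.

```python
import boto3

s3 = boto3.client("s3")
bucket = "example-sensitive-data-bucket"  # hypothetical bucket name

# Turn on versioning so objects that are overwritten or deleted by mistake
# can be recovered by reverting to a prior version.
s3.put_bucket_versioning(
    Bucket=bucket,
    VersioningConfiguration={"Status": "Enabled"},
)

# Confirm the bucket now keeps prior versions of every object.
print(s3.get_bucket_versioning(Bucket=bucket).get("Status"))  # expected: "Enabled"
```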

“And, all that comes with no capex [capital expenditures] for companies, a low per-unit cost where you only pay for what you consume, and the ability to focus engineers on unique incremental value for your business,” said Selipsky, adding that the reliability concerns stem merely from an illusion of control, not actual control. “People think if they can control it they have more say in how things go. It’s like being in a car versus an airplane, but you’re much safer in a plane,” he explained.

Myth 2: The Cloud Provides Inadequate Security and Privacy

When it comes to security, Selipsky notes that it is an end-to-end process, and thus companies need to build security at every level of the stack. Amazon’s cloud employs the same security isolations as a traditional data center—including physical data center security, separation of the network, isolation of the server hardware and isolation of storage. On the physical side, data centers had already become frequently shared infrastructure before Amazon launched its cloud services; Selipsky added that companies realized they could benefit by renting space in a data facility rather than building their own.

When speaking about security fundamentals, Selipsky noted that security is maintained with badge-controlled access, guard stations, monitored security cameras, alarms, separate cages and strictly audited procedures and processes. Not only do Amazon Web Services’ data centers match the best practices employed in private data facilities, there is an added physical security advantage in the fact that customers don’t need access to the servers and networking gear inside. Access to the data center is thus controlled more strictly than in traditional rented facilities. Selipsky also added that the Amazon cloud has equal or better isolation, at the physical level, than could be expected from dedicated infrastructure.

In his argument, Selipsky pointed out that networks ceased to be isolated physical islands a long time ago: as companies increasingly needed to connect to other companies—and then the Internet—their networks became connected with public infrastructure. Firewalls, switch configurations and other special network functionality were used to prevent bad network traffic from getting in, or conversely from leaking out. As their network traffic increasingly passed over public infrastructure, companies began using additional isolation techniques, such as Multi-protocol Label Switching (MPLS) and encryption, to make sure that every packet on (or leaving) their network remained secure.

Amazon took a similar approach to networking in its cloud by maintaining packet-level isolation of network traffic and supporting industry-standard encryption. Amazon Web Services’ Virtual Private Cloud allows a customer to establish their own IP address space, so customers can use the same tools and software infrastructure they are already familiar with to monitor and control their cloud networks. Amazon’s scale also allows for more investment in security policing and countermeasures than nearly any large corporation could afford. Maintains Selipsky, “Our security is strong and dug in at the DNA level.”
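To make the VPC point concrete, here is a brief sketch, again using boto3, of a customer carving out their own IP address space in the cloud; the CIDR blocks are illustrative assumptions, not recommendations.

```python
import boto3

ec2 = boto3.client("ec2")

# The customer chooses the address space, so existing network tooling and
# conventions carry over into the cloud largely unchanged.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

# Subnets subdivide that space just as they would on premises.
ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")
print("Created VPC", vpc_id)
```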

Amazon Web Services also invests significantly in testing and validating the security of its virtual server and storage environment. Discussing the investments made on the hardware side, Selipsky lists the following:

After customers release these resources, the server and storage are wiped clean so no important data can be left behind.

Intrusion from other running instances is prevented because each instance has its own customer firewall.

Those in need of more network isolation can use Amazon VPC, which allows you to carry your own IP address space with you into the cloud; your instances are accessible only through IP addresses that only you know.

Those desiring to run on their own boxes—where no other instances are running—can purchase extra large instances where only that XL instance runs on that server.

According to Selipsky, Amazon’s scale allows for more investment in security policing and countermeasures: “In fact, we often find that we can improve companies’ security posture when they use AWS. Take the example lots of CIOs worry about—the rogue server under a developer’s desk running something destructive or that the CIO doesn’t want running. Today, it’s really hard (if not impossible) for CIOs to know how many orphans there are and where they might be. With AWS, CIOs can make a single API call and see every system running in their VPC [Virtual Private Cloud]. No more hidden servers under the desk or anonymously placed servers in a rack and plugged into the corporate network. Finally, AWS is SAS-70 certified; ISO 27001 and NIST are in process.”
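The “single API call” Selipsky refers to might look something like the following boto3 sketch, which lists every instance in a given VPC; the VPC ID is a placeholder.

```python
import boto3

ec2 = boto3.client("ec2")

# Enumerate every instance running inside one VPC, so nothing is hidden
# under a desk or quietly racked outside IT's view.
reservations = ec2.describe_instances(
    Filters=[{"Name": "vpc-id", "Values": ["vpc-0123456789abcdef0"]}]
)["Reservations"]

for reservation in reservations:
    for instance in reservation["Instances"]:
        print(instance["InstanceId"], instance["State"]["Name"])
```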

Myth 3: Creating My Own In-House Cloud or Private Cloud Will Allow Me to Reap the Same Benefits of the Cloud

According to Selipsky, “There’s a lot of marketing going on about the concept of the ‘private cloud.’ We think there’s a bit of a misnomer here.” Selipsky went on to explain: “We often see companies struggling to accurately measure the cost of infrastructure. Scale and utilization are big advantages for AWS. In our opinion, a cloud has five key characteristics: it eliminates capex; allows you to pay for what you use; provides true elastic capacity to scale up and down; allows you to move very quickly and provision servers in minutes; and allows you to offload the undifferentiated heavy lifting of infrastructure so your engineers work on differentiating problems.”

Selipsky also pointed out the drawbacks of private clouds: you still own the capex (and it is expensive), you do not pay only for what you use, you do not have true elasticity, and you still manage the undifferentiated heavy lifting. “With a private cloud you have to manage capacity very carefully … or you or your private cloud vendor will end up over-provisioning. So you’re going to have to either get very good at capacity management or you’re going to wind up overpaying,” said Selipsky before challenging the elasticity of the private cloud: “The cloud is shapeless. But if it has a tight box around it, it no longer feels very cloud-like.”

One of AWS’ key offerings is Amazon’s ability to save customers money while also driving efficiency. “In virtually every case we’ve seen, we’ve been able to save people a significant amount of money,” said Selipsky. This is in part because AWS’ business has greatly expanded over the last four years and Amazon has achieved enough scale to secure very low costs. AWS has been able to aggregate hundreds of thousands of customers to have a high utilization of its infrastructure. Said Selipsky, “In our conversations with customers we see that really good enterprises are in the 20-30 percent range on utilization—and that’s when they’re good … many are not that strong. The cloud allows us to have several times that utilization. Finally, it’s worth looking at Amazon’s heritage and AWS’ history. We’re a company that works hard to lower its costs so that we can pass savings back to our customers. If you look at the history of AWS, that’s exactly what we’ve done (lowering price on EC2, S3, CloudFront, and AWS bandwidth multiple times already without any competitive pressure to do so).”
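A back-of-the-envelope calculation shows why utilization dominates the comparison Selipsky is making; the hourly rate and utilization figures below are assumptions for illustration, not AWS numbers.

```python
# Effective cost of one *useful* server-hour at different utilization levels.
server_cost_per_hour = 0.50  # assumed fully loaded cost of a server-hour

for utilization in (0.25, 0.70):
    effective_cost = server_cost_per_hour / utilization
    print(f"At {utilization:.0%} utilization, each useful server-hour costs ${effective_cost:.2f}")

# At 25% utilization every useful hour effectively costs four times the raw
# rate; aggregating many customers to run at higher utilization is what lets
# a large provider charge less and still lower prices over time.
```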

Myth 4: The Cloud Isn’t Ideal Because I Can’t Move Everything at Once

Selipsky debunks this myth by saying, “We believe this is nearly impossible and ill-advised. We recommend picking a few apps to gain experience and comfort then build a migration plan. This is what we most often see companies doing. Companies will be operating in hybrid environments for years to come. We see some companies putting some stuff on AWS and then keeping some stuff in-house. And I think that’s fine. It’s a perfectly prudent and legitimate way of proceeding.”

Myth 5: The Biggest Driver of Cloud Adoption is Cost

In busting the final myth, Selipsky said, “There is a big savings in capex and cost but what we find is that one of the main drivers of adoption is that time-to-market for ideas is much faster in the cloud because it lets you focus your engineering resources on what differentiates your business.”

Summary

Speaking about all of the myths surrounding the cloud, Selipsky concludes that “a lot of this revolves around psychology and fear of change, and human beings needing to gain comfort with new things. Years ago people swore they would never put their credit card information online. But that’s no longer the case. We’re seeing great momentum. We’re seeing, more and more, over time these barriers [to cloud adoption] are moving.” For additional debunked myths regarding Cloud Computing visit Nubifer.com.

IBM Elevates Its Cloud Offerings with Purchase of Cast Iron Systems

IBM Senior Vice President and Group Executive for IBM Software Group Steve Mills announced the acquisition of cloud integration specialist Cast Iron Systems at the IBM Impact 2010 conference in Las Vegas on May 3. The privately held Cast Iron is based in Mountain View, California and delivers cloud integration software, appliances and services, so the acquisition broadens the delivery of cloud computing services for IBM’s clients. IBM’s business process and integration software portfolio grew over 20 percent during the first quarter, and the company sees this deal as a way to expand it further. The financial terms of the acquisition were not disclosed, although Cast Iron Systems’ 75 employees will be integrated into IBM.

According to IBM officials, Big Blue anticipates the worldwide cloud computing market will grow at a compounded annual rate of 28 percent, from $47 billion in 2008 to a projected $126 billion by 2012. The acquisition of Cast Iron Systems reflects IBM’s expansion of its software business around higher-value capabilities that help clients run companies more effectively.

IBM has transformed its business model to focus on higher value, high-margin capabilities through organic and acquisitive growth in the past ten years–and the company’s software business has been a key catalyst in this shift. IBM’s software revenue grew at 11 percent year-to-year during the first quarter and the company generated $8 billion in software group profits in 2008 (up from $2.8 billion in 2000).

Since 2003, the IBM Software Group has acquired over 55 companies, and the acquisition of Cast Iron Systems is part of that. Cast Iron Systems’ clients include Allianz, Peet’s Coffee & Tea, NEC, Dow Jones, Schumacher Group, ShoreTel, Time Warner, Westmont University and Sports Authority and the cloud integration specialist has completed thousands of cloud integrations around the globe for retail organizations, financial institutions and media and entertainment companies.

IBM’s acquisition comes at a time when one of the major challenges facing businesses adopting cloud delivery models is integrating the disparate systems running in their data centers with new cloud-based applications, work that has traditionally been time-consuming and resource-draining. With the acquisition of Cast Iron Systems, IBM gains the ability to help businesses rapidly integrate their cloud-based applications and on-premises systems. Additionally, the acquisition advances IBM’s capabilities for a hybrid cloud model, which allows enterprises to blend data from on-premises applications with public and private cloud systems.

IBM, which is known for offering application integration capabilities for on-premises and business-to-business applications, will now be able to offer clients a complete platform to integrate cloud applications from providers like Amazon, Salesforce.com, NetSuite and ADP with on-premises applications like SAP and JD Edwards. Relationships between IBM and Amazon and Salesforce.com will essentially become friendlier as a result of this acquisition.

IBM said that it can use Cast Iron Systems’ hundreds of prebuilt templates and services expertise to eliminate expensive coding, thus allowing cloud integrations to be completed in mere days (rather than weeks, or even longer). These results can be achieved through using a physical appliance, a virtual appliance or a cloud service.

Craig Hayman, general manager for IBM WebSphere said in a statement, “The integration challenges Cast Iron Systems is tackling are crucial to clients who are looking to adopt alternative delivery models to manage their businesses. The combination of IBM and Cast Iron Systems will make it easy for clients to integrate business applications, no matter where those applications reside. This will give clients greater agility and, as a result, better business outcomes.”

IBM cited Cast Iron Systems helping pharmaceutical distributor AmerisourceBergen Specialty Group connect Salesforce CRM with its on-premises corporate data warehouse as an example. The company has since been able to give its customer service associates access to the accurate, real-time information they need to deliver a positive customer experience while realizing $250,000 in annual cost savings.

Cast Iron Systems additionally helped a division of global corporate insurance leader Allianz integrate Salesforce CRM with its on-premises underwriting applications to offer real-time visibility into contract renewals for its sales team and key performance indicators for sales management. IBM said that Allianz beat its own 30-day integration project deadline by replacing labor-intensive custom code with Cast Iron Systems’ integration solution.

President and chief executive officer of Cast Iron Systems Ken Comee said, “Through IBM, we can bring Cast Iron Systems’ capabilities as the world’s leading provider of cloud integration software and services to a global customer set. Companies around the world will now gain access to our technologies through IBM’s global reach and its vast network of partners. As part of IBM, we will be able to offer clients a broader set of software, services and hardware to support their cloud and other IT initiatives.”

IBM will remain consistent with its software strategy by supporting and enhancing Cast Iron Systems’ technologies and clients while simultaneously allowing them to utilize the broader IBM portfolio. For more information, visit Nubifer.com.

Transforming Into a Service-Centric IT Organization By Using the Cloud

While IT executives typically approach cloud services from the perspective of how they are delivered, this model neglects what cloud services are and how they are consumed. These two facets can have a large impact on the overall IT organization, points out eWeek Knowledge Center contributor Keith Jahn. Jahn maintains that it is very important for IT executives to move beyond the current delivery-only focus by creating a world-class supply chain for managing the supply and demand of cloud services.

Using the popular fable The Sky Is Falling, known lovingly as Chicken Little, Jahn explains a possible future scenario that IT organizations may face due to cloud computing. As the fable goes, Chicken Little embarks on a life-threatening journey to warn the king that the sky is falling and on this journey she gathers friends who join her on her quest. Eventually, the group encounters a sly fox who tricks them into thinking that he has a better path to help them reach the king. The tale can end one of two ways: the fox eats the gullible animals (thus communicating the lesson “Don’t believe everything you hear”) or the king’s hunting dogs can save the day (thus teaching a lesson about courage and perseverance).

So what does this have to do with cloud computing? Cloud computing has the capacity to bring on a scenario that will force IT organizations to change, or possibly be eliminated altogether. The entire technology supply chain will be severely impacted if IT organizations are wiped out. Traditionally, cloud is viewed as a technology disruption and assessed from a delivery orientation, posing questions like: how can this new technology deliver solutions cheaper, better and faster? An equally important yet often ignored aspect of this equation is how cloud services are consumed. Cloud services are ready to run, self-sourced, available wherever you are, and pay-as-you-go or subscription based.

New capabilities will emerge as cloud services grow and mature and organizations must be able to solve new problems as they arise. Organizations will also be able to solve old problems cheaper, better and faster. New business models will be ushered in by cloud services and these new business models will force IT to reinvent itself in order to remain relevant. Essentially, IT must move away from its focus on the delivery and management of assets and move toward the creation of a world-class supply chain for managing supply and demand of business services.

Cloud services become a forcing function in this scenario because they force IT to transform. CIOs that choose to ignore this and neglect to take transformative measures will likely see their role shift from innovation leader to CMO (Chief Maintenance Officer), in charge of maintaining legacy systems and services sourced by the business.

Analyzing the Cloud to Pinpoint Patterns

The cloud really began in what IT folks now refer to as the “Internet era,” when people were talking about what was being hosted “in the cloud.” This was the first generation of the cloud, Cloud 1.0 if you will—an enabler that originated in the enterprise. Supply Chain Management (SCM) processes were revolutionized by commercial use of the Internet as a trusted platform and eventually the IT architectural landscape was forever altered.

This model evolved and produced thousands of consumer-class services, which used next-generation Internet technologies on the front end and massive scale architectures on the back end to deliver low-cost services to economic buyers. Enter Cloud 2.0, a more advanced generation of the cloud.

Beyond Cloud 2.0

Cloud 2.0 is driven by the consumer experiences that emerged out of Cloud 1.0. A new economic model and new technologies have surfaced since then, due to Internet-based shopping, search and other services. Services can be self-sourced from anywhere and from any device—and delivered immediately—while infrastructure and applications can be sourced as services in an on-demand manner.

Currently, most of the attention when it comes to cloud services remains focused on the new techniques and sourcing alternatives for IT capabilities, aka IT-as-a-Service. IT can drive higher degrees of automation and consolidation using standardized, highly virtualized infrastructure and applications. This results in a reduction in the cost of maintaining existing solutions and delivering new solutions.

Many companies are struggling with the transition from Cloud 1.0 to Cloud 2.0 due to the technology transitions required to make the move. As this occurs, the volume of services in the commercial cloud marketplace is increasing, propagation of data into the cloud is taking place and Web 3.0/semantic Web technology is maturing. The next generation of the cloud, Cloud 3.0 is beginning to materialize because of these factors.

Cloud 3.0 is significantly different because it will enable access to information through services set in the context of the consumer experience. This means that processes can be broken into smaller pieces and subsequently automated through a collection of services, which are woven together with massive amounts of data able to be accessed. With Cloud 3.0, the need for large-scale, complex applications built around monolithic processes is eliminated. Changes will be able to be made by refactoring service models and integration achieved by subscribing to new data feeds. New connections, new capabilities and new innovations—all of which surpass the current model—will be created.

The Necessary Reinvention of IT

IT is typically organized around the various technology domains, taking in new work via project requests and moving it through a Plan-Build-Run cycle. Herein lies the problem: this delivery-oriented, technology-centric approach has latency built in, and that inherent latency has created increasing tension with the business IT serves, which is why IT must reinvent itself.

IT must be reinvented so that it becomes the central service-sourcing control point for the enterprise, or it must accept that the business will source services on its own. By becoming the central service-sourcing control point for the enterprise, IT can maintain the required service levels and integrations. Changes to behavior, cultural norms and organizational models are required to achieve this.

IT Must Become Service-Centric in the Cloud

IT must evolve from a technology-centric organization into a service-centric organization in order to survive, as service-centric represents an advanced state of maturity for the IT function. Service-centric allows IT to operate as a business function—a service provider—created around a set of products which customers value and are in turn willing to pay for.

As part of the business strategy, these services are organized into a service portfolio. This model differs from the capability-centric model because the deliverable is the service that is procured as a unit through a catalog and for which the components—and sources of components—are irrelevant to the buyer. With the capability-centric model, the deliverables are usually a collection of technology assets which are often visible to the economic buyer and delivered through a project-oriented life cycle.

With the service-centric model, some existing roles within the IT organization will be eliminated and some new ones will be created. The result is a more agile IT organization which is able to rapidly respond to changing business needs and compete with commercial providers in the cloud service marketplace.

Cloud 3.0: A Business Enabler

Cloud 3.0 enables business users to source services that meet their needs quickly, cost-effectively and at a good service level—and on their own, without the help of an IT organization. Cloud 3.0 will usher in breakthroughs and innovations at an unforeseen pace and scope and will introduce new threats to existing markets for companies while opening new markets for others. In this way, it can be said that cloud is more of a business revolution than a technology one.

Rather than focusing on positioning themselves to adopt and implement cloud technology, a more effective strategy for IT organizations would be to focus on transforming the IT organization into a service-centric model that is able to source, integrate and manage services with high efficiency.

Back to the story and its two possible endings:

The first scenario suggests that IT will choose to ignore that its role is being threatened and continue to focus on the delivery aspects of the cloud. Under the second scenario, IT is rescued by transforming into the service-centric organization model and becoming the single sourcing control point for services in the enterprise. This will effectively place IT in control of fostering business innovation by embracing the next wave of cloud. For more information please visit Nubifer.com.

A Guide to Securing Sensitive Data in Cloud Environments

Due to the outsourced nature of the cloud and its innate loss of control, it is important to make sure that sensitive data is constantly and carefully monitored for protection. That task is easier said than done, which is why the following questions arise: How do you monitor a database server when its underlying hardware moves every day—sometimes multiple times a day and sometimes without your knowledge? How do you ensure that your cloud computing vendor’s database administrators and system administrators are not copying or viewing confidential records inappropriately or otherwise abusing their privileges?

When deploying a secure database platform in a cloud computing environment, these obstacles and many more are bound to arise, and an enterprise needs to be able to overcome them; these barriers may be enough to prevent some enterprises from moving beyond their on-premises approach. There are three critical architectural concerns to consider when transferring applications with sensitive data to the cloud.

Issue 1: Monitoring an Ever-changing Environment

Cloud computing grants you the ability to move servers and add or remove resources in order to maximize the use of your systems and reduce expense. This increased flexibility and efficiency often means that the database servers housing your sensitive data are constantly being provisioned and deprovisioned. Each of these scenarios represents a potential target for hackers, which is an important point to consider.

Monitoring data access becomes more difficult due to the dynamic nature of a cloud infrastructure. If the information in those applications is subject to regulations like the Payment Card Industry Data Security Standard (PCI DSS) or the Health Insurance Portability and Accountability Act (HIPAA), it is vital to make sure that it is secure.

When thinking about solutions to monitor activity on these dynamic database servers, it is essential to find a methodology that can be deployed on new database servers without management involvement. This requires a distributed model in which each instance in the cloud has a sensor or agent running locally, and this software must be provisioned automatically along with the database software, without requiring intrusive system management.

In a multitenancy environment such as this, it won’t always be possible to reboot whenever it is necessary to install, upgrade or update the agents, and the cloud vendor may even place limitations on installing software that requires certain privileges. With the right architecture in place, you will be able to see where your databases are hosted at any point in time and will be able to centrally log all activity and flag suspicious events across all servers, wherever they are.
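The kind of locally running sensor described above might look, in miniature, like the following sketch: it tails a database query log on the host itself and flags policy violations without shipping every statement across the network. The log path, policy patterns and alerting are all hypothetical, not taken from any particular monitoring product.

```python
import re
import time

# Hypothetical policy: flag bulk reads of a sensitive table and attempts to
# create views, procedures or triggers that could hide data access.
SUSPICIOUS_PATTERNS = [
    re.compile(r"select\s+\*\s+from\s+cardholder_data", re.IGNORECASE),
    re.compile(r"create\s+(view|procedure|trigger)", re.IGNORECASE),
]

def watch_query_log(path: str) -> None:
    with open(path, "r") as log:
        log.seek(0, 2)  # start at the end of the log, like `tail -f`
        while True:
            line = log.readline()
            if not line:
                time.sleep(0.5)
                continue
            if any(pattern.search(line) for pattern in SUSPICIOUS_PATTERNS):
                # A real sensor would raise a central alert; here we just
                # report locally to keep the sketch self-contained.
                print("ALERT:", line.strip())

# watch_query_log("/var/log/db/query.log")  # hypothetical log location
```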

Issue 2: Working in a WAN

Currently, database activity monitoring solutions utilize a network-sniffing model to identify malicious queries, but this approach isn’t feasible in the cloud environment because the network encompasses the entire Internet. Another method that doesn’t work in the cloud is adding a local agent which sends all traffic to a remote server.

The solution is something designed for distributed processing, where the local sensor is able to analyze traffic autonomously. Another thing to consider is that the cloud computing resources procured are likely to be on a WAN, where network bandwidth and network latency make off-host processing inefficient. With cloud computing, you are likely unable to colocate a server close to your databases. This means that the time and resources spent sending every transaction to a remote server for analysis will stunt network performance and also hinder timely interruption of malicious activity.

So when securing databases in cloud computing, a better approach is to utilize a distributed monitoring solution that is based on “smart” agents. That way, once a security policy for a monitored database is in place, that agent or sensor is able to implement protection and alerting locally and thus prevent the network from turning into the gating factor for performance.

It is also necessary to test the WAN capabilities of your chosen software for remote management of distributed data centers. It should be able to encrypt all traffic between the management console and sensors to restrict exposure of sensitive data. Various compression techniques can also enhance performance so that alerts and policy updates are transmitted efficiently.
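One way a sensor might meet both requirements, encryption and efficient transmission, is sketched below using only the Python standard library; the console host, port and alert fields are hypothetical, and the management console is assumed to terminate TLS.

```python
import json
import socket
import ssl
import zlib

def send_alert(alert: dict, host: str, port: int) -> None:
    # Compress the payload so alerts and policy updates stay small on the WAN.
    payload = zlib.compress(json.dumps(alert).encode("utf-8"))
    # Encrypt the connection so sensitive query details are never exposed in transit.
    context = ssl.create_default_context()
    with socket.create_connection((host, port)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            tls_sock.sendall(payload)

# send_alert({"db": "orders", "rule": "bulk-export"}, "console.example.com", 8443)
```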

Issue 3: Know Who Has Privileged Access to Your Data

The activity of privileged users is one of the most difficult elements to monitor in any database implementation. It is important to remember that DBAs and system administrators know how to stealthily access and copy sensitive information (and cover their tracks afterward). In cloud computing environments there are unknown personnel at unknown sites with these access privileges. Additionally, you cannot personally conduct background checks on third parties the way you would for your own staff. Looking at all of these factors, it is easy to see why protecting against insider threats is important yet difficult to do.

So how do you resolve this issue? One way is to separate duties to ensure that the activities of privileged third parties are monitored by your own staff, and that the pieces of the solution on the cloud side of the network cannot be defeated without alerts going off. It is also necessary to closely monitor individual data assets regardless of the method used to access them.

Seek out a system that knows when data is being accessed in violation of policy without relying on query analytics alone. Sophisticated users with privileges can create new views, insert stored procedures into a database or generate triggers that compromise information without the SQL command arousing suspicion.

Summary

Some may wrongly conclude that the complexity of monitoring databases in a cloud architecture isn’t worth the change from dedicated systems, or at least not just yet. With that said, most enterprises will decide that deploying applications with sensitive data on one of these models is inevitable. Leading organizations have begun to make the change, and as a result tools are now meeting the requirements driven by the issues raised in this article.

Essentially, security should not prevent you from moving forward with deploying databases in the cloud if you think your enterprise would benefit from doing so. By looking before you leap–ensuring your security methodologies adequately address these unique cases–you can make the transition safely.  For more information please visit Nubifer.com.

New Cloud-Focused Linux Flavor: Peppermint

A new cloud-focused Linux flavor is in town: Peppermint. The Peppermint OS is currently in a small, private beta which will open up to more testers in early to late May. Aimed at the cloud, the Peppermint OS is described on its home page as: “Cloud/Web application-centric, sleek, user friendly and insanely fast! Peppermint was designed for enhanced mobility, efficiency and ease of use. While other operating systems are taking 10 minutes to load, you are already connected, communicating and getting things done. And, unlike other operating systems, Peppermint is ready to use out of the box.”

The Peppermint team announced the closed beta of the new operating system in a blog post on April 14, saying that the operating system is “designed specifically for mobility.” The description of the technology on Launchpad describes Peppermint as “a fork of Lubuntu with an emphasis on cloud apps, using many configuration files sourced from Linux Mint. Peppermint uses Mozilla Prism to create single site browsers for easily accessing many popular Web applications outside of the primary browser. Peppermint uses the LXDE desktop environment and focuses on being easy for new Linux users to find their way around in.”

Lubuntu is described by the Lubuntu project as a lighter, faster and energy-saving modification of Ubuntu using LXDE (the Lightweight X11 Desktop Environment). Kendall Weaver and Shane Remington, a pair of developers in North Carolina, make up the core Peppermint team. Weaver is the maintainer of the Linux Mint Fluxbox and LXDE editions as well as the lead software developer for Astral IX Media in Asheville, NC and the director of operations for Western Carolina Produce in Hendersonville, NC. Based in Asheville, NC, Remington is the project manager and lead Web developer for Astral IX Media and, according to the Peppermint site, “provides the Peppermint OS project support with Web development, marketing, social network integration and product development.” For more information please visit Nubifer.com.

Using Business Service Management to Manage Private Clouds

Cloud computing promises an entirely new level of flexibility through pay-as-you-go, readily accessible, infinitely scalable IT services, and executives in companies of all sizes are embracing the model. At the same time, they are also posing questions about the risks associated with moving mission-critical workloads and sensitive data into the cloud. eWEEK’s Knowledge Center contributor Richard Whitehead has four suggestions for managing private clouds using service-level agreements and business service management technologies.

“Private clouds” are what the industry is calling hybrid cloud computing models which offer some of the benefits of cloud computing without some of the drawbacks that have been highlighted. These private clouds host all of the company’s internal data and applications while giving the user more flexibility over how service is rendered. The transition to private clouds is part of the larger evolution of the data center, which makes the move from a basic warehouse of information to a more agile, smarter deliverer of services. While virtualization helps companies save on everything from real estate to power and cooling costs, it does pose the challenge of managing all of the physical and virtual servers—or virtual sprawl. Basically, it is harder to manage entities when you cannot physically see and touch them.

A more practical move into the cloud can be facilitated through technology, with private clouds being managed through the use of service-level agreements (SLAs) and business service management (BSM) technologies. The following guide is a continuous methodology to bring new capabilities into an IT department within a private cloud network. Its four steps will give IT the tools and knowledge to overcome common cloud concerns and experience the benefits that a private cloud provides.

Step 1: Prepare

Before looking at alternative computing processes, an IT department must first logically evaluate its current computing assets and ask the following questions. What is the mixture of physical and virtual assets? (The word asset is used because this process should examine the business value delivered by IT.) How are those assets currently performing?

Rather than thinking in terms of server space and bandwidth, IT departments should ask: will this private cloud migration increase sales or streamline distribution? This approach positions IT as a resource rather than as a line item within an organization. Your private cloud migration will never take off if your resources aren’t presented in terms of assets and ROI.

Step 2: Package

Package refers to resources and requires a new set of measurement tools. In the virtualized world, IT shops are beginning to think in terms of packaging “workloads” as opposed to running applications on physical servers. Workloads are portable, self-contained units of work or services built through the integration of a JeOS (“just enough” operating system), middleware and the application, and they can be moved across environments ranging from physical and virtual to cloud and heterogeneous.

A business service is a group of workloads, and this represents a fundamental shift from managing physical servers and applications to managing business services composed of portable workloads that can be mixed and matched in the way that will best serve the business. Managing IT to business services (aka the service-driven data center) is becoming a business best practice and allows the IT department to price and validate its private cloud plan as such.

Step 3: Price

A valuation must be assigned to each IT unit after you’ve packaged up your IT processes into workloads and services. How much does it cost to run the service? How much will it cost if the service goes offline? The analysis should be presented around how these costs affect the business owner, because the cost assessments are driven by business need.

One of the major advantages of a service-driven data center is that business services can be dynamically managed to SLAs and moved around appropriately. This allows companies to attach processes to services by connecting workloads to virtual services and, for the first time, connects a business process to the hardware implementing that business process.

Business services can be managed independently of the hardware because they aren’t tied to a particular server and can thus be moved around on an as-needed basis.

Price depends on the criticality of the service, what resources it will consume and whether it is worthy of backup and/or disaster recovery support. This is a more transparent approach than IT usually takes, and transparency in a cloud migration plan is a crucial part of demonstrating, cost-effectively, the value the cloud provides.
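As a toy illustration of this pricing step, the sketch below attaches a run cost and a downtime cost to each packaged service and uses them to decide whether disaster recovery support is worth paying for; every figure and the threshold are assumptions made up for the example.

```python
# Hypothetical service catalog with monthly run cost and cost per hour of downtime.
services = {
    "order-entry":     {"run_per_month": 4200.0, "downtime_per_hour": 15000.0},
    "hr-self-service": {"run_per_month": 900.0,  "downtime_per_hour": 500.0},
}

for name, costs in services.items():
    # Arbitrary rule of thumb for the example: recommend disaster recovery
    # when one hour of downtime costs more than a month of running the service.
    needs_dr = costs["downtime_per_hour"] > costs["run_per_month"]
    print(f"{name}: run ${costs['run_per_month']:.0f}/mo, "
          f"downtime ${costs['downtime_per_hour']:.0f}/hr, DR recommended: {needs_dr}")
```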

Step 4: Present

After you have an IT service package, you must present a unified catalog to the consumers of those services. This catalog must be visible to all relevant stakeholders within the organization and can be considered an IT storefront or showcase featuring various options and directions for your private cloud to demonstrate value to the company.

This presentation allows your organization the flexibility to balance IT and business needs for a private cloud architecture that works for all parties; the transparency gives customers a way to interact directly with IT.

Summary

Although cloud computing remains an intimidating and abstract concept for many companies, enterprises can still start taking steps towards extending their enterprise into the cloud with the adoption of private clouds. An organization can achieve a private cloud that is virtualized, workload-based and managed in terms of business services with the service-driven data center. Workloads are managed in a dynamic manner in order to meet business SLAs. The progression from physical server to virtualization to the workload to business service to business service management is clear and logical.

In order to ensure that your private cloud is managed effectively—thus providing optimum visibility into the cloud’s business value—it is important to evaluate and present your cloud migration in this way. Cloud investment can seem less daunting when viewed as a continuous process, and the transition can be made in small steps, which makes the value a private cloud can provide to a business more easily recognizable to stakeholders. For more information, visit Nubifer.com.

Microsoft and Intuit Pair Up to Push Cloud Apps

Despite being competitors, Microsoft and Intuit announced plans to pair up to encourage small businesses to develop cloud apps for the Windows Azure platform in early January 2010.

Intuit is offering a free, beta software development kit (SDK) for Azure and citing Azure as a “preferred platform” for cloud app deployment on the Intuit Partner Platform as part of its collaboration with Microsoft. This marriage opens up the Microsoft partner network to Intuit’s platform and also grants developers on the Intuit cloud platform access to Azure and its tool kit.

As a result of this collaboration, developers will be encouraged to use Azure to make software applications that integrate with Intuit’s massively popular bookkeeping program, QuickBooks. The companies announced that the tools will be made available to Intuit partners via the Intuit App Center.

Microsoft will make parts of its Business Productivity Online Suite (such as Exchange Online, SharePoint Online, Office Live Meeting and Office Communications Online) available for purchase via the Intuit App Center as well.

The agreement occurred just weeks before Microsoft began monetizing the Windows Azure platform (on February 1)—when developers who had been using the Azure beta free of charge began paying for use of the platform.

According to a spokesperson for Microsoft, the Intuit beta Azure SDK will remain free, with the timing for stripping the beta tag “unclear.”

Designed to automatically manage and scale applications hosted on Microsoft’s public cloud, Azure is Microsoft’s latest Platform-as-a-Service. Azure will serve as a competitor for similar offerings like Force.com and Google App Engine. Contact a Nubifer representative to see how the Intuit – Microsoft partnership can work for your business.