Posts Tagged ‘Cloud Security’

Guidelines for Cloud Consumers and Providers

Business users are drawn to the cloud. That’s not surprising, considering they tend to see mostly benefits: self-service freedom, scalability, availability, flexibility, and the pleasure of avoiding various nasty hardware and software headaches. IT leaders, though, are a different story; they are not always as ecstatic. They voice uneasiness about cloud security and have legitimate concerns that unauthorized users could get their hands on their applications and data. Moreover, retaining a level of influence and control is a must for them. Can both “sides” meet halfway? Is it attainable to provide the freedom that users want while maintaining the control that IT leaders need?
Simply put, yes. Doing so, however, will entail a collaborative effort. Both business users and IT leaders have to assume a few key responsibilities. In addition, you will have to make certain that your cloud provider is doing its part as well.


Your 5 Responsibilities

Here are a few things you need to be held accountable for:
1. Define the business need. Identify the root problem you want to solve with cloud technology. Is it a perpetually recurring concern, or one that happens irregularly? Did you need an answer “last week,” or do you have time to construct a solution?

Important note: Not all clouds are created equal. Some can run your applications unchanged, with instant access, while others require a little tweaking. Recognizing your needs and differentiating cloud technologies will help you determine the correct strategy for the particular business problem at hand.

2. Identify your application and process requirements. Once you have accurately defined your business needs, it is time to select the application best suited to meet those needs. Be clear and precise about the nature of the application, the development process you want to adopt, and the roles and access permissions for each user.

Your teams no longer have to struggle through traditional, slow, linear development processes. Instead, the cloud can give them access to best practices that are fluid and agile. Many self-service solutions can even empower them to run copies of the same environment in parallel.

Simply put, the cloud can lead to breakthrough productivity when used properly. Used incorrectly, however, it can also lead to enormous amounts of wasted resources. So take the time to do your research and choose wisely.

3. Determine your timetable. Contrary to popular belief, cloud projects are not short sprints. They are better understood as long journeys, so plan accordingly.

Because cloud technology is transformative, Nubifer recommends defining your early experiments on a quarterly basis. Learn from the first quarter, take notes, make the necessary adjustments, and then move on to the next. The objective is to build a learning organization that increases control over time and progresses based on data and experience.

4. Establish success factors. Define what success means for you. Do you want to improve the agility of the development process? Increase the availability of your applications? Enhance remote collaboration? Define achievement, and have a tool to measure progress as well. Identifying metrics and establishing realistic goals will help you arrive at a solution that meets not only your needs but also your budget and payback time frame.

5. Define data and application security. Companies overlook this critical responsibility more often than they realize. Do your due diligence and carefully determine whom you can trust with your cloud applications; then empower them. The following questions need unambiguous answers: What specific roles will team members take in the cloud model? Does everyone fully understand the nature of the application and data they are planning to bring to the cloud? Does everyone know how to protect your data? Do they understand your password policies? Dealing with these security factors early on creates a solid foundation for cloud success, along with peace of mind about the issue.

Your Provider’s 5 Responsibilities

Meanwhile, make sure your cloud provider offers the following to attain better cloud control:
1. Self-service solutions. Time equals money, so waiting equals wasted time and money. Look for cloud applications that are ready from the get-go. Determine whether the solution you are considering can run the applications and business processes you have in mind immediately, or whether the provider requires you to rewrite the application or change the process entirely.

You also need to determine whether users will require training, or whether they are already equipped to handle a self-service Web interface. Answers to these questions can determine whether adoption will be rapid and smooth, or slow and bumpy.

2. Scale and speed. A well-constructed cloud solution provides a unique combination of scale and speed: access to resources at the scale you need, with on-demand responsiveness. This combination will empower your team to run several instances in parallel; snapshot, suspend/resume, publish and collaborate; and accelerate the business cycle.

3. Reliability and availability. As articulated in the Service Level Agreement (SLA), it is the responsibility of the cloud provider to keep the system reliable and available. The provider should set clear and precise operational expectations with you, the consumer, such as 99.9 percent availability.
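
To put a figure like that in context, it helps to translate the percentage into a downtime budget. Here is a minimal sketch in Python (the SLA values are just illustrations):

```python
# Convert an SLA availability percentage into an annual downtime budget.
HOURS_PER_YEAR = 24 * 365  # 8,760 hours, ignoring leap years

def downtime_budget_hours(availability_pct: float) -> float:
    """Hours per year the service may be down and still meet the SLA."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for sla in (99.0, 99.9, 99.99):
    print(f"{sla}% availability allows "
          f"{downtime_budget_hours(sla):.2f} hours of downtime per year")
```

At 99.9 percent, the provider may still be down for roughly 8.76 hours per year, so ask how maintenance windows count against that budget.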

4. Security. Ask for a comprehensive review of your cloud provider’s security technology and processes. Specifically, ask about the following:

  • Application and data transportability. Can your provider give you the ability to export existing applications, data and processes into the cloud with ease? And can you import them back just as easily?
  • Data center physical, access and operations security. How does the provider protect its physical data centers? Are they SAS 70 Type II audited facilities? Are trained and skilled data center operators on site?
  • Virtual data center security. Your provider must be clear about how access to physical machines is controlled. How are these machines managed? And who is able to access them?
  • Cloud architecture. In terms of scale and speed, most cloud efficiency derives from how the cloud is architected. Be sure to understand how the individual pieces (the compute nodes, network nodes, storage nodes, etc.) are architected, and how they are secured and integrated.

  • Application and data security. To implement your policies, the cloud solution must permit you to define groups and roles with granular role-based access control, enforce proper password policies, and encrypt data both in transit and at rest.
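
To make that requirement concrete, here is a minimal sketch of granular role-based access control. The role names and permission strings are hypothetical placeholders, not any particular provider’s API:

```python
# Minimal role-based access control sketch: users map to roles,
# and roles carry granular permissions.
ROLE_PERMISSIONS = {
    "developer": {"app:deploy", "app:read", "data:read"},
    "analyst":   {"data:read"},
    "admin":     {"app:deploy", "app:read", "data:read", "data:write", "user:manage"},
}

USER_ROLES = {"alice": "admin", "bob": "developer", "carol": "analyst"}

def is_allowed(user: str, permission: str) -> bool:
    """Check whether a user's role grants a specific permission."""
    role = USER_ROLES.get(user)
    return role is not None and permission in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("alice", "data:write")          # admins may write data
assert not is_allowed("carol", "app:deploy")      # analysts cannot deploy
```

A real cloud solution layers password policy enforcement and encryption on top of a permission check like this one.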

5. Cost efficiencies. Cloud solutions should require no upfront commitments, letting your success drive further success. Unlike a managed service or a hosting solution, a cloud solution uses technology to automate the back-end systems, and therefore can operate large resource pools without immense human costs. That efficiency translates into real cost savings for you.

Although business leaders recognize the benefits of cloud computing technologies, more than a handful still have questions about cloud security and control. That is understandable. By adopting a collaborative approach and aligning their responsibilities with those of the cloud provider, however, these leaders can find solutions that offer the best of both worlds: the visibility and control they want and need, while giving their teams access to the huge performance gains only the cloud can provide.

Contact Nubifer for a free, no-obligation Cloud Migration consultation.

Has Your Organization Adopted a Cloud Migration Strategy?

While a growing body of research indicates that many organizations will move to the cloud in the short term, there isn’t much information detailing who is using it now and what they are using it for.

A published study by CDW reported that a number of enterprises are unaware that they are already using cloud applications and have, at best, a limited cloud adoption strategy.

It must be noted, though, that this does not mean these enterprises have no intention of moving to the cloud. It just means that they have not yet approached cloud computing strategically and have not implemented an organization-wide adoption strategy.

Cloud Computing Strategies

Another interesting note from the CDW report is the percentage of companies claiming to have an enterprise policy on the adoption of cloud computing: only 38%. This comes as a surprise, as the report also concludes that 84% of organizations have already installed, at minimum, one cloud application.

In March 2011, more than 1,200 IT professionals were surveyed for the CDW 2011 Cloud Computing Tracking Poll, which drew some interesting conclusions. It discovered that these enterprises are uneasy with public clouds and would rather use private clouds.

Cloud Application Usage

However, it is necessary to examine these statistics with some caution. As mentioned above, 84% of these organizations report having, at the bare minimum, one cloud application, yet they still do not consider themselves cloud users.

The reason behind this discrepancy has yet to be determined. In other words, organizations are still unclear as to whether and how the cloud can integrate with their current enterprise architecture.

This is underscored by the fact that only 42% of those surveyed are convinced that their operations and services can run efficiently in the cloud. Statistics show that the applications operated in the cloud most frequently are the following:

  • Commodity applications such as email (50% of cloud users)
  • File storage (39%)
  • Web and video conferencing (36% and 32%)
  • Online learning (34%)

Developing a Cloud Strategy

Eight industries were surveyed as part of the CDW Cloud Computing Tracking Poll in March 2011: small businesses, medium businesses, large businesses, the Federal government, State and Local governments, healthcare, higher education and K-12 public schools. The poll drew conclusions specific to each of the eight industries, and included 150 individuals from each industry who identified themselves as familiar with the current uses and future plans for cloud applications within their respective organizations.

Although there are various hurdles to consider prior to adoption, they can be divided into four main segments:

1. Adoption Strategy

Despite a number as high as 84% of organizations using at least one cloud-based application, only 25% of them recognize themselves as cloud users, and just over a third (38%) have a formal, organization-wide plan for cloud adoption.

2. ROI Considerations

Approximately 75% of cloud users reported cost reductions after migrating applications to a cloud platform.

3. Security

Security is the primary obstacle, or at least among the primary obstacles, holding both current and potential users back. However, quite a number of users, including those currently using cloud applications, have yet to realize the full potential of the security tools available.

4. Future spending

It is necessary for organizations to discover what future hardware and software acquisitions can be migrated into a cloud ecosystem.

Cloud Computing Now

A lot can happen in five years—this is especially true for the cloud industry. Currently, this study does not discuss in depth the difference between cloud computing and SaaS. However, it is likely that SaaS could be included in the study as it did define cloud computing as a “model for enabling convenient, on-demand access to a shared pool of configurable computing resources.”

With this in mind, along with the recent Forrester research on IT spending, it is highly likely that the data CDW has outlined will be significantly different five years from now.

According to Forrester, a record number of organizations will be investing in SaaS technologies, which, broadly speaking, are a subset of cloud computing. The data includes a finding that 25% of enterprises examined have adopted a new cloud technology this year, with 14% using IaaS, 8% using PaaS, and 6% using business-process-as-a-service.

Does Your Organization Have a Cloud Migration Strategy?

In the end, the research provides some thought-provoking data. It shows that many companies are already leveraging the cloud without even knowing it.

Regardless of the potential ROI and efficiency gains offered by cloud computing, a significant number of companies have yet to seize the opportunity to leverage the scalability and efficiency of modern cloud applications.

Aside from this, according to the research, many companies find themselves without a coherent, company-wide strategy for cloud adoption. This is important because it is no secret that a lack of planning can lead to disastrous results, and results like these take significant financial and organizational effort to fix.

If your organization is one of those lacking a coherent and comprehensive cloud adoption strategy, contact the Cloud accelerator experts at Nubifer to help guide the way. Nubifer partners with the leading vendors in order to provide unbiased cloud application architecture diagrams, white papers, security and compliance risk analysis and migration consulting services.


Strategies for Cloud Security

Security and compliance concerns continue to be the primary barrier to cloud adoption. Despite important security concerns, cloud computing is gaining traction. The question now is not “will my organization move to the cloud?” but rather “when?” In this article, Nubifer’s Research Team explores the requirements for an intelligent cloud security strategy. What are the minimum requirements? How do you combine traditional security protocols with advanced technologies like data loss prevention and risk management?
Security Concerns Slowing Cloud Adoption

A recent Cloud Trends Report for 2011 found that the number of organizations imminently planning a move to the cloud almost doubled from 2009 (24%) to 2010 (44%). The study also found that cloud security issues are the primary obstacle to migration. In the published report, more than a quarter of those surveyed cited security as their number one concern, and almost 60% included security in their top three.

CA Technologies recently published a study concluding that, despite industry concerns about cloud security, roughly half of those leveraging the cloud do not effectively review vendors for security issues before deployment. The study, ‘Security of Cloud Computing Users: A Study of Practitioners in the US & Europe’, discovered that IT personnel disagree about who is responsible for securing sensitive data and how to go about doing it.

Constructing a Cloud Security Plan

Despite the ability of many organizations to analyze their own security protocols, there remain many valid cloud security fears. Shifting the burden of protecting important data to an outside vendor is nerve-racking, especially in a vertical that has to abide by regulations such as HIPAA, SOX or PCI DSS.

Risks involving cloud security still have many unknowns, so an overarching cloud strategy is a requirement. If your organization does not have a game plan in place, are you ready to adapt and change as requirements evolve?

Your CFO or a related executive is your organization’s largest risk for a financial application breach and data loss. The HR director needs to be effectively trained and managed so that ‘lost’ personnel files don’t come back to bite you. Most importantly, the largest risk of all is the CEO.

Hackers realize this, which is why chief executives are consistently the victims of “whaling attacks,” such as the well-known ‘CEO subpoena phishing scam’.

A robust strategy to protect the most privileged users has the additional benefit of giving your organization a generalized cloud security road-map. Are mobile device risks a concern? Your most senior users want remote and mobile access. What about data loss? Your most senior users have access to the most data.

When your organization moves from analyzing itself to evaluating potential cloud applications and platforms, do not neglect to look into how prevalent cloud services have already become in your IT infrastructure. Are you using Salesforce.com? Basecamp? Taleo? Google Apps?

Super-brand cloud/SaaS/PaaS providers such as Microsoft, Salesforce.com and Google all have tremendous reputations, so aligning projects that leverage these brands with your security protocols should not be time-consuming. You’ll want to analyze other vendors to ensure they are legitimate providers that take the time to properly secure their IT environments.

Lastly, as software licenses expire and product upgrades come due, you’ll be in a position to begin effectively analyzing the cloud vendors you will want to leverage for your mission-critical operations.

Following that advice will get you started. For more information on formulating a Cloud Security strategy visit Nubifer.com.


5 Recommendations to Keep your Personal Data Secure in the Cloud

Apple’s iCloud offering is additional evidence of the unabated flow of data to the cloud. Despite the latest security breaches at various organizations, including the issues that affected many Sony customers, more and more of us are moving personal and business assets to the cloud.

Yet many of us remain uneducated about the steps we should take to keep our online data safe. Adhering to these five guidelines will go a long way toward helping the average person keep online threats at bay.

1. Don’t Take Security for Granted
There are two routes to your online data. One is through the cloud provider’s environment; the second is even more potent, and it’s much closer to home. The easiest and most available way for an intruder to get to your online records is through your login credentials. Of course you want the provider to be secure, but don’t let that make you complacent about your personal login credentials.

2. Use Strong, Memorable Passwords
The problem with complicated passwords is that they are usually hard to remember. The key is to start with something memorable and then merge it into a strong password, mixing numbers, letters, lower and upper case, and symbols. Start with an address, a car license number, a telephone number or a date of birth. Don’t use your own; use ones you know: friends, kids, parents, partners, previous addresses, or cars you drove a decade ago. Choose something that can’t be linked to your online persona, and always mix it up: half an area code, a name with half of a zip code, parts of an old address. Then add in a $, an !, or an @ sign to mix it up even more.
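
If you would rather not do the mixing by hand, a short script can generate this kind of password for you. The sketch below uses Python’s standard secrets module; treat it as an illustration rather than a policy:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Build a random password mixing upper/lower case, digits and symbols."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    while True:
        candidate = "".join(secrets.choice(alphabet) for _ in range(length))
        # Keep only candidates that actually contain every character class.
        if (any(c.islower() for c in candidate)
                and any(c.isupper() for c in candidate)
                and any(c.isdigit() for c in candidate)
                and any(c in "!@#$%^&*" for c in candidate)):
            return candidate

print(generate_password())
```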

3. Guard your Inbox
You are going to recycle passwords, mostly for sites where you are not keeping important information like your credit card numbers, DOB, address or SSN. But there’s one place where you should never neglect to use a unique password: your email inbox. This is the primary location all your other logins come back to when you reset a password, making it the portal to all your other online personas.

Although it’s a bit of a hassle, you should opt to double-protect your inbox with two-factor authentication, which means you have to present a second credential in order to gain access. This is especially crucial if you have a habit of visiting risky websites, you don’t keep your anti-malware software up to date, or you tend to miss phishing emails.
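
For the curious, the rotating codes behind most two-factor systems are based on a simple, open standard. The sketch below implements the time-based one-time password (TOTP) algorithm from RFC 6238 using only the Python standard library; in practice you should rely on a vetted authenticator app rather than hand-rolled code (the secret shown is a made-up example):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Compute the current RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period           # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                     # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# The same secret is shared between the service and the user's device,
# so both sides can compute and compare the 6-digit code.
print(totp("JBSWY3DPEHPK3PXP"))
```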

4. Don’t Leave the Password Recovery Backdoor Open
Quite often, users take many precautions to protect their personal information but make it very easy to reset their password through the password recovery service. If your user ID is simple to guess (it’s often your email address), then do not use something equally easy to figure out for your password reset, such as your DOB, your wife’s maiden name or some other easily accessible piece of personal information.

5. Have an Alternate to Fall Back on
Security is mostly about risk avoidance, and however careful your execution, you can’t eliminate all risk. So give yourself a fallback option. Don’t put all your money in one account, keep a separate emergency email address, and make sure you’ve got a local coffee shop with WiFi you can resort to if your main Internet connection disappears. Knowing that you’ve got a second option if something bad happens helps you remain calm in an emergency, which gives you a better chance of surviving a crisis.

For more information regarding the security of your online data, visit Nubifer.com.

DoD Business Applications and the Cloud

Current cloud spending is less than 5% of total IT spending, but with an optimistic 25% growth rate, cloud computing is poised to become one of the dominant models for organizing information systems. That is why it is important for the Department of Defense Business Mission to begin charting the path to cloud operations and migrate from its current low-performance, high-cost environment.

The DoD Fiscal Year (FY) 2010 IT cost of the Business Mission, excluding payroll costs for uniformed and civilian personnel, is $5.2 billion; one-third of the cost of the communications and computing infrastructure tacks an additional $5.4 billion onto the total.

The scope of DoD Business Applications exceeds the average IT budget of the largest US corporations by a multiple of three. As a result, DoD Business Operations needs to think of its future IT direction as a secure, private cloud managed organically by the DoD Business Mission in order to squeeze the cost benefits out of the cloud.

There are many forms of cloud computing, ranging from Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS) to Software-as-a-Service (SaaS), but when it comes to the Department of Defense, only offerings that can support over 2,000 applications need apply. Business Operations cannot be locked into “public” clouds that are proprietary.

The DoD, for example, can’t rely on the largest cloud service, Amazon’s Elastic Compute Cloud, which offers computing capacity completely managed by the customer and is thus a “public cloud.” Because compute processing is purchased on demand, Amazon is an IaaS service. Once your applications are placed in the proprietary Amazon cloud, however, it is difficult to transfer the workload to a different environment.

Google, however, offers a PaaS service as a public cloud (read: accessible to all) via the Google App Engine. Google allows developers to build, host and run web applications on Google’s mature infrastructure with its own operating system; Google only provides a few Google-managed applications.

Salesforce.com’s enterprise-level computing currently operates at a $1.4 billion annual revenue rate, with 2 million subscribers signed up for SaaS application services running in a proprietary PaaS environment. Because Salesforce offers only proprietary solutions, it can’t be considered by DoD, although Salesforce’s recent partnership with VMware might change all that.

Other cloud providers offer IaaS services, but they all leave it to customers to manage their own applications; they qualify for DoD applications provided they meet open source and security criteria.

Open Platform and Open Source
Microsoft’s Windows Azure platform offers a PaaS environment for developers to create cloud applications, with services running in Microsoft’s data centers on a proprietary .Net environment. These preferentially .Net applications are integrated into a Microsoft-controlled software environment, which can fairly be described as a “closed” platform.

Currently, DoD Business Mission applications run largely in a Microsoft .Net environment. What remains to be seen is whether DoD will pursue cloud migration into a multi-vendor “open platform” and “open source” programming environment or continue sticking with a restrictive Microsoft .Net.

The largest share of the DoD IT budget goes to the Defense Information Systems Agency (DISA), which in April 2009 advocated the adoption of the open source SourceForge library for unclassified programs. DISA’s Forge.mil program enables collaborative software development and cross-program sharing of software, system components and services in support of network-centric operations and warfare. Modeled on concepts proven in open-source software development, Forge.mil represents a collection of screened software components and is used by thousands of developers. It takes advantage of a large library of tested software projects, and its components are continuously evaluated by thousands of contributors (including some from firms like IBM, Oracle and HP, although not from Microsoft, which controls its own library of code).

A DoD Memorandum of October 16, 2009 from the Acting DoD Chief Information Officer, “Clarifying Guidance Regarding Open Source Software (OSS),” defines OSS as software for which the human-readable source code is available for use, study, reuse, modification, enhancement and redistribution by the users of that software. OSS meets the definition of “commercial computer software” and will thus be given preference in building systems. With the announcement of Forge.mil, DoD has begun the process of adopting open source code.

Implications
Given this migration of business applications, a reorientation of systems development in favor of “private clouds”, while taking advantage of “open source” techniques, is necessary to achieve the greatest savings. The technologies currently offered for the construction of “private” clouds help achieve the complete separation of the platforms on which applications run from the applications themselves. The simplification achieved by sharing “open” source code from the Forge.mil library makes delivering cloud solutions cheaper, quicker and more readily available.

For more information regarding the DoD and open source cloud platforms, please visit nubifer.com today.

Feds to Unveil Cloud Security Guidelines

Late in 2010, the federal government issued draft plans for the voluntary Federal Risk and Authorization Management Program, dubbed FedRAMP. FedRAMP is expected to be operational by April 2011 and would ensure cloud services meet federal cyber-security guidelines, which should allay remaining government concerns about cloud security and ramp up adoption of cloud technologies.

Developed with cross-government and industry support over the past 18 months, the voluntary program would put cloud services through a standardized security accreditation and certification process. Any authorization could subsequently be leveraged by other agencies. Federal CIO Vivek Kundra said in a statement, “By simplifying how agencies procure cloud computing solutions, we are paving the way for more cost-effective and energy-efficient service delivery for the public, while reducing the federal government’s data center footprint.”

The Obama Administration has promoted the adoption of cloud computing as a way to help save the government money, and Kundra and other top officials have championed the technology and instituted policies, such as data center consolidation requirements, that could bring about a shift to the cloud. Federal IT managers, however, have consistently raised security concerns as the biggest barrier to adoption.

The government’s security concerns arise partly because cloud computing is a relatively new paradigm that has to be adapted to the security requirements of regulations like the Federal Information Security Management Act (FISMA), which governs federal cyber-security for most government agencies. By mapping out the baseline required security controls for cloud systems, FedRAMP creates a consistent set of security baselines for cloud computing.

FedRAMP seeks to eliminate a duplicative, costly certification and accreditation process. Each agency used to take apps and services through its own accreditation process; in the shared-infrastructure environment of the cloud, this process is redundant.

The FedRAMP draft is comprised of three major components: a set of cloud computing security baseline requirements; a process to continuously monitor cloud security; and a description of proposed operational approaches to authorizing and assessing cloud-based systems.

FedRAMP will be used for both private and public cloud services, and possibly for non-cloud information technologies and products. For example, two agencies have informed IBM of their intent to sponsor certification of its new Federal Community Cloud services.

Commercial vendors will not be able to directly request FedRAMP authorization; rather, they will have to rely on the sponsorship of a federal agency that plans to use their cloud services. Guidance on the CIO Council’s website suggests that FedRAMP “may not have the resources to accommodate all requests initially,” and that GSA will focus on systems with potentially larger user bases or cross-government interest, suggesting that the government anticipates a large amount of interest.

FedRAMP will remain an inter-agency effort under federal CIO Kundra’s authority and will be managed by GSA. The new Joint Authorization Board, which includes representatives from GSA, the Department of Defense and the Department of Homeland Security, will authorize the systems that go through the process with the sponsoring agency.

Although FedRAMP provides a base accreditation, most agencies have security requirements that go beyond FISMA and thus may have to do more work on top of the FedRAMP certification to make sure the cloud services they are looking to deploy meet individual agency requirements.

For more information regarding the Federal adoption of cloud technologies, visit Nubifer.com.

The Public Sector Cloud Model

With technological innovations in security, storage, network and connectivity making cloud infrastructure increasingly cost effective, cloud computing is becoming increasingly prevalent in enterprise IT environments. Cloud service brokers are quickly adopting these new technologies and are looking to deliver reliable, scalable, cost efficient options to their customers.

The concept of ‘shared compute resources’ has existed for a while; the industry has long been full of ideas for eliminating desktop and computer sprawl in data centers, with these concepts centering on hosted applications. Hosted applications can be accessed from any place using an Internet-connected device, but recently a new paradigm of hosted computing has come forth: create compute power in the cloud and make it available to anyone, while hiding all of the complexity of managing it.

Cloud computing can be a vehicle for quicker service deployment and delivery not only for enterprises but for governments as well, because the combined scale, sprawl and complexity of government-sector IT demands a simpler solution. Governments commonly reach out to widely dispersed geographies, people and sectors with different agendas, levels of Internet connectivity, scale requirements, and applications of varying complexity.

Because of this, governments have been maintaining IT environments of their own, and their ability to reach people and deploy applications is limited by their capacity to build more data centers.

A cloud platform may be an effective option for the public sector because it provides a scalable way of building and facilitating computing infrastructure. The cloud’s increased availability can extend the government’s reach to people on a broader scale, while also simplifying the maintenance requirements of in-house IT environments.

Compute Resource Distribution
In order to guarantee that compute resources are readily available to various departments, governments usually require large, geo-located deployments of IT infrastructure. In the past, this was accomplished by allocating IT budgets within departmental silos, making it difficult for governments to track and control the expenditures various departments make in their disparate IT ecosystems.

Lower investment in IT means lower automation of processes and subsequently lower quality of service, but this can be changed by provisioning IT infrastructure on a public cloud platform. Cloud infrastructures can help ensure that the IT needs of each department are met in the form of computing capacity rather than budgets.

Provisioning
A user’s scale of usage dictates deeper discounts on platform pricing, and governments are essentially buying IT solutions in bulk, which is why cloud computing can solve the provisioning challenge of governments’ IT needs. Governments should readily consider centralized cloud deployments with quick provisioning of computing power.

In anticipation of providing better access to information and services, most government entities aim to distribute compute resources to as many sectors of the country as possible. The time to deliver a service currently depends on factors like bottlenecks, availability and processes, but cloud computing can shift governments’ focus to extending the reach of IT applications and information.

Standards in Regulation
It is necessary for governments to ensure that complex regulatory frameworks are implemented and followed in their IT environments. A large portion of these regulatory needs are followed through by IT departments today, and regulatory controls are executed through IT policies. Most often, security and governance are dependent on individual or standardized procedural controls—and the cloud can facilitate the shift from procedural controls to standards.

Managing Information Availability
Governments focus on dispersing meaningful information to their citizens and their various departments, and cloud computing can help facilitate this. With a renewed focus on information dissemination, governments will be able to scale to new heights.

Essentially, shifting the priority from managing infrastructure to managing information can drive social change, and the cloud is positioned to make this a reality for government organizations.

For more information regarding cloud computing’s role in the public sector, visit Nubifer.com.

Start Me Up… Cloud Tools Help Companies Accelerate the Adoption of Cloud Computing

Article reposted from HPC in the Cloud Online Magazine. Article originally posted on Nov. 29, 2010:

For decision makers looking to maximize their impact on the business, cloud computing offers a myriad of benefits. At a time when cloud computing is still being defined, companies are actively researching how to take advantage of these new technology innovations for business automation, infrastructure reduction, and strategic utility based software solutions.

When leveraging “the cloud”, organizations can have on-demand access to a pool of computing resources that can instantly scale as demands change. This means IT — or even business users — can start new projects with minimal effort or interaction and only pay for the amount of IT resources they end up using.

The most basic division in cloud computing is between private and public clouds. Private clouds operate either within an organization’s DMZ or as managed compute resources operated for the client’s sole use by a third-party platform provider. Public clouds let multiple users segment resources from a collection of data-centers in order to satisfy their business needs. Resources readily available from the Cloud include:

● Software-as-a-Service (SaaS): Provides users with business applications run off-site by an application provider. Security patches, upgrades and performance enhancements are the application provider’s responsibility.

● Platform-as-a-Service (PaaS): Platform providers offer a development environment with tools to aid programmers in creating new or updated applications, without having to own the software or servers.

● Infrastructure-as-a-Service (IaaS): Offers processing power, storage and bandwidth as utility services, similar to an electric utility model. The advantage is greater flexibility, scalability and interoperability with an organization’s legacy systems.

Many Platforms and Services to Choose From:

Cloud computing is still in its infancy, with a host of platform and application providers serving up a plethora of Internet-based services, ranging from scalable on-demand applications to data storage to spam filtering. In the current IT environment, organizations’ technology ecosystems have to operate cloud-based services individually, but cloud integration specialists and ISVs (independent software vendors) are becoming more prevalent and readily available to build on top of these emerging, powerful platforms.

Mashing together services provided by the world’s largest and best-funded companies, such as Microsoft, Google, Salesforce.com, Rackspace, Oracle, IBM and HP, gives companies an opportunity to innovate and build a competitive, cost-saving cloud of their own on the backs of these software giants’ evolving cloud platforms.

Cloud computing comes into focus only when you think about what IT always needs: a way to increase capacity or add capabilities on the fly without investing in new infrastructure, training new personnel, or licensing and maintaining new software. Cloud computing covers any subscription-based or pay-for-what-you-use service that extends your IT environment’s existing capabilities.

Before deciding whether an application is destined for the cloud, analyze your current cost of ownership. Examine more than just the original licenses; factor in ongoing expenses for maintenance, power, personnel and facilities. To start, many organizations build an internal private cloud for application development and testing, and decide from there whether it is cost-effective to scale fully into a public cloud environment.

“Bridging the Whitespace” between Cloud Applications

One company, Nubifer.com (whose name, from the Latin, translates to ‘bringing the clouds’), simplifies the move to the Cloud for its enterprise clients by leveraging a proprietary set of Cloud tools named Nubifer Cloud:Portal, Cloud:Connector and Cloud:Link. Nubifer’s approach with Cloud:Portal enables the rapid development of “enterprise cloud mash-ups”, providing rich dashboards for authentication, single sign-on and identity management. This functionality offers simple administration of accounts spanning multiple SaaS systems, and the ability to augment and quickly integrate popular cloud applications. Cloud:Connector seamlessly integrates data management and data sync services, and enables highly available data interchange between platforms and applications. And Cloud:Link provides rich dashboards of analytics and monitoring metrics, improving system governance and audit trails for various SLAs (Service Level Agreements).

As a Cloud computing accelerator, Nubifer focuses on aiding enterprise companies in the adoption of emerging SaaS and PaaS platforms. Our recommended approach to an initial Cloud migration is to institute a “pilot program” tailored around your platform(s) of choice in order to fully iron out any integration issues that may arise prior to a complete roll-out.

Nubifer’s set of Cloud Tools can be hosted on Windows Azure, Amazon EC2 or Google App Engine. The scalability offered by these Cloud platforms promotes an increased level of interoperability and availability, and a significantly lower financial barrier to entry than has historically been seen with on-prem application platforms.

Cloud computing’s many flavors of services and offerings can be daunting at first review, but if you take a close look at the top providers’ offerings, you will see an ever-increasing road map for on-boarding your existing or new applications to “the cloud”. Taking the first step is easy, and companies like Nubifer, which provide the platform services and the partner networks to aid your goals, are resourced and eager to support your efforts.

Department of Defense And Cloud Security Management

Migrating Department of Defense applications to public cloud platforms operated outside of the Department of Defense DMZ typically raises concerns about the efficacy of security protocols. Currently, DoD data-centers rely on fire-walled barriers designed to prohibit interactions with those outside the perimeter. The effectiveness of these safeguards can be argued on a number of levels; the DoD contracts out the management of much of its data, meaning those in charge of that data are neither military nor civilian employees.

Regardless of this outsourcing, the transfer of compute resources to third-party platform providers will be subject to stringent security guidelines. What might be viewed as a minor security incident could result in a revocation of the cloud service provider’s security certification.

High-level DoD executives realize that cloud computing offers a significant opportunity for cost savings and scalability, as well as fail-safe features that offer advantages over the current DISA data-centers. Decision makers are now asking whether the externalization of the DoD workload to a public cloud would cause a degradation in network security, and whether governmental auditors would reject a public cloud because they cannot fully guarantee security. But the fact is that many public cloud offerings provide the same level of data security, obfuscation and redundancy that is offered in the DoD’s internal data-centers.

DoD data-centers lock up server farms as well as associated power inside a physical structure in order to gain security. Additional controls installed include:

– Perimeter firewalls
– Demilitarized zones (DMZ) for isolating incoming transactions
– Network segmentation
– Intrusion detection devices and software for monitoring compliance with security protocols

Currently, a plethora of companies sell hardware devices and software packages claiming to increase data-center security. But as security threats rise, data-center management teams keep adding disparate security management devices, increasing not only operating costs but also the delays incurred as transactions work their way through multiple security barriers.

The accumulation of these disparate security features only increases the vulnerability of systems and adds potential security loop-holes. Each data-center ultimately has security measures unique to its individual situation, and is therefore not amenable to coordinated and standardized oversight.

Cloud platform providers gain from the benefits of virtualization. Virtual machines from multiple tenants are co-hosted on physical resources without any cross-referencing that could jeopardize security. This makes virtualization the key technology enabling the migration of applications into a cloud environment, where security is provided via the hypervisor that controls each separate virtual machine. A standardized third-party security appliance can be connected to this hypervisor, allowing consistent security services to be delivered to every virtual machine even if they run on different operating systems.

Users must stop viewing protection at the data center or server level as the basis for achieving security. Instead, each individual virtual computer, with its own operating system and its own application, should be viewed as fully equipped to benefit from standardized security services.

A data-center may encompass thousands of virtual machines. Cloud security will be achieved by protecting virtual computers through the hypervisor on which they operate. This way, every virtual machine can be assigned a set of security protocols that carries its protection safeguards as well as its security criteria. When a virtual machine moves from a DISA data-center to the cloud, for example, its security will not be compromised. Multi-tenancy of diverse applications from varied sources is now feasible, since the cloud can run diverse applications in separate security enclosures, each with its own customized security policies.
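
As a toy illustration of that idea, the sketch below models a virtual machine whose security policy set is part of its definition, so the policies travel with it when it migrates. All names are invented for illustration and do not correspond to any specific hypervisor product:

```python
# Sketch: security policies attached to a virtual machine travel with it.
FIREWALL, IDS, DISK_ENCRYPTION, AUDIT_LOG = (
    "firewall", "ids", "disk-encryption", "audit-log")

class VirtualMachine:
    def __init__(self, name: str, policies: set):
        self.name = name
        self.policies = policies  # enforced via the hypervisor's security appliance

    def migrate(self, destination: str) -> None:
        # The policy set is part of the VM definition, so migrating
        # to another cloud does not weaken its protections.
        print(f"{self.name} -> {destination}, enforcing {sorted(self.policies)}")

payroll = VirtualMachine("payroll-vm", {FIREWALL, IDS, DISK_ENCRYPTION, AUDIT_LOG})
payroll.migrate("commercial-cloud-east")
```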

In a cloud environment, the addition of a new application is simplified. Integration with security measures can be instant and seamless because the hypervisor already supports your current security protocols. And if a virtual machine can port its own security measures when migrating from one cloud to another, these integration efforts can be reduced even further.

In Summation
Security services for a cloud environment can now be pooled and standardized to support a large number of virtual machines. Such pooled services can be managed to give DoD data-centers vastly improved shared security awareness.

The overall management and monitoring of enterprise-wide security will remain an intensive task. Compared with the current diversity in security methods, however, the transfer of applications onto a cloud platform will further reduce costs and simplify the administration of security.

Whether the Department of Defense can efficiently implement its own private cloud, or whether it will have to rely on commercial cloud providers, remains to be seen. The DoD could rely on commercial firms for most cloud computing services while retaining direct oversight over security. This could be accomplished by managing all security appliances and policies from DoD Network Control Centers staffed by internal DoD personnel.

For more information regarding security of Cloud platforms and how the government is approaching Cloud Computing and Software-as-a-Service, visit Nubifer.com.

Protecting Data in the Cloud

When it comes to cloud computing, one of the major concerns is protecting the data being stored in the cloud. IT departments often lack the knowledge necessary to make informed decisions regarding the identification of sensitive data—which can cost an enterprise millions of dollars in legal costs and lost revenue.

The battle between encryption and tokenization was explored in a recent technology report, and the merits of both are being considered as securing data in the cloud becomes more and more important. Although the debate over which solution is best continues, it is ultimately good news that protection in cloud computing is available in the first place.

In the current business climate, it is essential that data is secure in storage and in transit (both inherent in cloud computing); the protection is necessary whether you are dealing with retail processing, accessing personal medical records or managing government information and financial activity. Implementing the correct security measures to protect sensitive information is a necessity.

So what is tokenization? Tokenization is the process in which sensitive data is segmented into one or more pieces and replaced with non-sensitive values, or tokens, while the original data is stored encrypted elsewhere. When clients need access to the sensitive data, they typically provide the token along with authentication credentials to a service that validates the credentials, decrypts the secure data, and provides it back to the client. Even though encryption is used, the client is never involved in either the encryption or decryption process, so encryption keys are never exchanged outside the token service. Tokens thus protect information like medical records, social security numbers and financial transactions from unauthorized access.
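
A minimal sketch of that flow might look like the following, using the Python cryptography package for the encryption step. The in-memory dictionary stands in for the token service’s secured data store:

```python
import secrets
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()   # held only by the token service
vault = {}                    # token -> encrypted original value

def tokenize(sensitive: str) -> str:
    """Replace a sensitive value with a random token; store it encrypted."""
    token = secrets.token_urlsafe(16)
    vault[token] = Fernet(key).encrypt(sensitive.encode())
    return token

def detokenize(token: str) -> str:
    """Resolve a token back to the original value (after authenticating)."""
    return Fernet(key).decrypt(vault[token]).decode()

card_token = tokenize("4111-1111-1111-1111")
print(card_token)               # safe to pass around and store
print(detokenize(card_token))   # only the token service can do this
```

The client only ever handles the token; the encryption key never leaves the token service, which is exactly the property described above.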

Encryption, on the other hand, is the process of transforming information using an algorithm to ensure it is unreadable to anyone except those who possess a key or special knowledge. The military and government have been using this method for some time to make sure that their sensitive information remains in the hands of the right people and organizations.

Tokenization and encryption can both be applied in cloud computing to protect the information used there. Organizations seeking to determine which method is the better fit should ask questions about the security of each method and whether one has more advantages than the other. It is also necessary to clearly define the objectives of the business process.

A clear method of protecting information is essential if cloud computing is to deliver benefits for the enterprise; conversely, the lack of one can be an obstacle to launching a cloud computing strategy. Gartner reports that 85 percent of participants cited security as a key factor that could prevent them from launching cloud-based apps.

In conclusion, there is no clear winner in the debate over tokenization versus encryption. Rather, it depends on the goals of the business and how the company plans to manage the security of its sensitive information. The data needs to be protected in a way that is easily manageable when launching a cloud computing strategy; only then can cloud computing be both successful and secure. For more information regarding securing data in the cloud via tokenization, contact a Nubifer representative today.

Google Apps Receives Federal Certification for Cloud Computing

On July 26, Google released a version of its hosted suite of applications that meets the primary federal IT security certification, a major leap forward in its push to drive cloud computing in the government. Nearly one year in the making, the new edition of Google Apps is the first portfolio of cloud applications to receive certification under the Federal Information Security Management Act (FISMA).

The government version of Google Apps has the same pricing and services as the premier edition, including Gmail, the Docs productivity site and the Talk instant-messaging application.

Google Business Development Executive David Mihalchik said to reporters, “We see the FISMA certification in the federal government environment as really the green light for federal agencies to move forward with the adoption of cloud computing for Google Apps.”

Federal CIO Vivek Kundra announced a broad initiative last September to embrace the cloud across the federal government as a way to reduce both the costs and inefficiencies of redundant and underused IT deployments. The launch of that campaign was accompanied by the launch of Apps.gov, an online storefront for vendors to showcase their cloud-based services to federal IT managers, which was revealed at an event at NASA’s Ames Research Center attended by Google co-founder Sergey Brin. At the same time, Google announced plans to develop a version of its popular cloud-based services that would meet the federal government’s security requirements.

Mike Bradshaw, director of Google’s Federal Division, said, “We’re excited about this announcement and the benefits that cloud computing can bring to this market.” Bradshaw continued to say that “the President’s budget has identified the adoption of cloud computing in the federal government as a way to more efficiently use the billions of dollars spent on IT annually.” Bradshaw added that the government spends $45 million in electrical costs alone to run its data-centers and servers.

Security concerns are consistently cited by proponents of modernizing the federal IT apparatus as the largest barrier to the adoption of cloud computing. Google is including extra security features to put federal IT buyers at agencies with more stringent security requirements at ease. These extra security features are in addition to the 1,500 pages of documentation that came with Google’s FISMA certification.

Google will store government cloud accounts on dedicated servers within its data centers that will be segregated from its equipment that houses consumer and business data. Additionally, Google has committed to only use servers located in the continental U.S. for government cloud accounts. Google’s premier edition commercial customers have their data stored on servers in both the U.S. and European Union.

Mihalchik explained that security was the leading priority from the get-go in developing Google Apps for Government: “We set out to send a signal to government customers that the cloud is ready for government.” He added, “today we’ve done that with the FISMA certification, and also going beyond FISMA to meet some of the other specific security requirements of government customers.”

Thus far, Google has won government customers at the state and local levels, such as the cities of Los Angeles, California and Orlando, Florida. Mihalchik said that over a dozen federal agencies are in various stages of trialing or deploying elements of Google Apps. Several agencies are using Google anti-spam and anti-virus products to filter their email; others, like the Department of Energy, are running pilot programs to evaluate the full suite of Google Apps against competitors’ offerings.

Find out more about cloud security and FISMA certification of Google Apps by talking to a Nubifer Consultant today.

Understanding the Cloud with Nubifer Inc. CTO, Henry Chan

The overwhelming majority of cloud computing platforms consist of dependable services relayed via data centers, built on servers with varying tiers of virtualization capabilities. These services are available anywhere that allows access to the networking platform. Clouds often appear as single points of access for all subscribers’ enterprise computing needs. Commercial cloud platform offerings are expected to adhere to customers’ quality of service (QoS) requirements and typically offer service level agreements. Open standards are crucial to the expansion and acceptance of cloud computing, and open source software has laid the groundwork for many cloud platform implementations.

What follows is Nubifer Inc. CTO Henry Chan’s summarized view of what cloud computing means, its benefits and where it’s heading in the future:

Cloud computing explained:

The “cloud” in cloud computing refers to your network’s Internet connection. Cloud computing is essentially using the Internet to perform tasks, like email hosting, data storage and document sharing, that were traditionally hosted on-premise.

Understanding the benefits of cloud computing:

Cloud computing’s myriad benefits depend on your organizational infrastructure needs. If your enterprise shares a large number of applications between multiple office locations, it would be beneficial to store the apps on a virtual server. Web-based application hosting also saves time for people traveling without the ability to connect back to the office, because they can access everything over a shared virtual private network (VPN).

Examples of cloud computing:

Hosted email (such as Gmail or Hotmail), online data back-up, online data storage, and any Software-as-a-Service (SaaS) application (such as a cloud-hosted CRM from vendors like Salesforce, Zoho or Microsoft Dynamics, or an accounting application) are examples of applications that can be hosted in the cloud. By hosting these applications in the cloud, your business can benefit from the interoperability and scalability that cloud computing and SaaS services offer.

Safety in the cloud:

Although there are some concerns over the safety of cloud computing, the reality is that data stored in the cloud can be just as secure as the vast majority of data stored on your internal servers. The key is to implement solutions that ensure the proper level of encryption is applied to your data while it travels to and from your cloud storage container, as well as while it is being stored. Designed properly, this can be as safe as any solution you could implement locally. The leading cloud vendors all currently maintain compliance with Sarbanes-Oxley, SAS 70, FISMA and HIPAA.
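
As one illustration of encrypting data both in transit and at rest, the sketch below uploads a file to Amazon S3 using boto3, which talks to HTTPS endpoints by default and can ask S3 to encrypt the stored object. The bucket, key and file names are placeholders:

```python
import boto3  # pip install boto3; assumes AWS credentials are configured

# boto3 uses HTTPS endpoints by default, so the payload is encrypted in transit.
s3 = boto3.client("s3")

with open("quarterly-report.xlsx", "rb") as f:
    s3.put_object(
        Bucket="example-company-archive",    # placeholder bucket name
        Key="reports/quarterly-report.xlsx",
        Body=f.read(),
        ServerSideEncryption="AES256",       # encrypt the object at rest
    )
```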

Cloud computing for your enterprise:

To determine which layer of cloud computing is optimally suited for your organization, it is important to thoroughly evaluate your organizational goals as they relate to your IT ecosystem. Examine how you currently use technology, your current challenges with technology, how your organization will evolve technologically in the years to come, and what scalability and interoperability will be required going forward. After a careful gap analysis of these determinants, you can decide which types of cloud-based solutions will be optimally suited to your organizational architecture.

Cloud computing, a hybrid solution:

The overwhelming trend in 2010 and 2011 is to move non-sensitive data and applications into the cloud while keeping trade secrets behind your enterprise firewall, as many organizations are not comfortable hosting all their applications and hardware in the cloud. The trick to making cloud computing work for your business is to understand which applications should be kept local and which would benefit most from leveraging the scalability and interoperability of the cloud ecosystem.

Will data be shared with other companies if it is hosted in the cloud?

Short answer: NO! Reputable SaaS and cloud vendors will make sure that your data is properly segmented according to the requirements of your industry.

Costs of cloud computing:

Leading cloud-based solutions charge a monthly fee for application usage and data storage, but you may already be spending comparable amounts, primarily in the form of hardware maintenance and software licensing fees—some of which could be eliminated by moving to the cloud.

Cloud computing makes it easy for your company’s Human Resources software, payroll and CRM to co-mingle with your existing financial data, supply chain management and operations systems, while simultaneously reducing your capital requirements for these systems. Contact a Nubifer representative today to discover how leveraging the power of cloud computing can help your business excel.

Confidence in Cloud Computing Expected to Surge Economic Growth

The dynamic and flexible nature of cloud computing, software-as-a-service and platform-as-a-service may help organizations in their recovery from the current economic downturn, according to more than two-thirds of the IT and business decision makers who participated in a recent annual study by Vanson Bourne, an international research firm. Vanson Bourne surveyed over 600 IT and business decision makers across the United States, United Kingdom and Singapore. Of the countries sampled, Singapore is leading the shift to the cloud, with 76 percent of responding enterprises using some form of cloud computing. The U.S. follows with 66 percent, with the U.K. at 57 percent.

This two-year study of cloud computing reveals that IT decision makers are very confident in cloud computing’s ability to deliver within budget and offer CapEx savings. Commercial and public sector respondents also predict cloud use will help decrease overall IT budgets by an average of 15 percent, with some expecting savings of as much as 40 percent.

“Scalability, interoperability and pay-as-you-go elasticity are moving many of our clients toward cloud computing,” said Chad Collins, CEO at Nubifer Inc., a strategic Cloud and SaaS consulting firm. “However, it’s important, primarily for our enterprise clients, to work with a Cloud provider that not only delivers cost savings, but also effectively integrates technologies, applications and infrastructure on a global scale.”

A lack of access to IT capacity is clearly labeled as an obstacle to business progress, with 76 percent of business decision makers reporting they have been prevented from developing or piloting projects due to the cost or constraints within IT. For 55 percent of respondents, this remains an issue.

Confidence in cloud continues to trend upward — 96 percent of IT decision makers are as confident or more confident in cloud computing being enterprise ready now than they were in 2009. In addition, 70 percent of IT decision makers are using or plan to be using an enterprise-grade cloud solution within the next two years.

The ability to scale resources up and down in order to manage fluctuating business demand was the most cited benefit influencing cloud adoption in the U.S. (30 percent) and Singapore (42 percent). The top factor driving U.K. adoption is lower cost of total ownership (41 percent).

Security concerns remain a key barrier to cloud adoption, with 52 percent of respondents who do not leverage a cloud solution citing security of sensitive data as a concern. Yet 73 percent of all respondents want cloud providers to fully manage security or to fully manage security while allowing configuration change requests from the client.

Seventy-nine percent of IT decision makers see cloud as a straightforward way to integrate with corporate systems. For more information on how to leverage a cloud solution inside your environment, contact a Nubifer.com representative today.

Do You Still Need to Worry About Cloud Security?

The answer to the question posed above is … maybe, but definitely not as much as before! A few recent studies in a handful of technologically conservative industries suggest that people and businesses are becoming increasingly comfortable with storing and managing their data in the cloud.

Markets like health care, finance and government, which are typically technology risk-averse, are quickly adopting (and even advocating) disruptive cloud technologies.

Those that have yet to adopt Software-as-a-Service continue to raise two fears when considering the move into the cloud: Who is in control of my data? Is it safe to store my data somewhere other than the office? These concerns are valid and must be understood by those making the move, but the notion that data must be stored under one’s own roof is shifting.

One expert from Accenture was recently quoted in an article on InformationWeek.com as saying, “Healthcare firms are beginning to realize that cloud providers actually may offer more robust security than is available in-house.” The same story cited a recent study stating that about one-third of the health care industry currently uses cloud apps and that over 70% of respondents plan to shift more and more to SaaS and cloud apps. While these estimates are interesting in any field, the intrigue is heightened when it comes to health care, where HIPAA compliance rules are notoriously strict.

The finance world is seeing similar shifts. For example, a recent study conducted by SIFMA explained how cloud computing is enabling the financial industry to move forward with technology in spite of budget restraints. “The [finance] industry is showing a larger appetite for disruptive technologies such as cloud computing to force business model change,” said the study.

Even the federal government is showing traces of similar trends, with federal CIO Vivek Kundra singing the praises of cloud computing even more than Marc Benioff! “Far too long we’ve been thinking very much vertically and making sure things are separated. Now we have an opportunity to lead with solutions that by nature encourage collaboration both horizontally and vertically.”

Cloud security remains an important issue that vendors take seriously, but there is definitely a shifting mood toward acceptance of cloud security. In a recent blog post, John Soat summarized the current mood, saying, “It’s not that security in the cloud isn’t still a concern for both [health care and finance] industries, but it’s a known, and perhaps better understood factor … So while security is still a legitimate concern, it doesn’t seem to be the show stopper it used to be …”

Four Key Categories for Cloud Computing

When it comes to cloud computing, concerns about control and security have dominated recent discussions. While it was once assumed that all computing resources could be obtained from outside, the industry is now moving toward a vision of the data center magically transformed for easy connections to internal and external IT resources.

According to IDC’s Cloud Services Overview report, sales of cloud-related technology are growing at 26 percent per year. That is six times the rate of IT spending as a whole, although cloud technologies comprised only about 5 percent of total IT revenue this year. While the report points out that defining what constitutes cloud-related spending is complicated, it estimates that global spending of $17.5 billion on cloud technologies in 2009 will grow to $44.2 billion by 2013. IDC predicts that hybrid or internal clouds will be the norm; even in 2013, only an estimated 10 percent of that spending will go specifically to public clouds.

According to Chris Wolf, analyst at The Burton Group, hybrid cloud infrastructure isn’t that different from existing data-center best practices. The difference is that all of the pieces are meant to fit together using Internet-age interoperability standards as opposed to homegrown kludge.

The following are four items to consider for your “shopping list” when preparing your IT budget for private or public cloud services:

1. Application Integration

Software integration isn’t the first thing most companies consider when building a cloud, although Bernard Golden, CEO of cloud consulting firm HyperStratus and a CIO.com blogger, says it is the most important one.

Tom Fisher, vice president of cloud computing at SuccessFactors.com, a business-application SaaS provider in San Mateo, California, says that integration is a whole lot more than simply batch-processing chunks of data traded between applications once or twice per day, as was done in the mainframe era.

Fisher continues to explain that it is critical for companies to be able to provision and manage user identities from a single location across a range of applications, especially when it comes to companies that are new in the software-providing business and do not view their IT as a primary product.

“What you’re looking for is to take your schema and map it to PeopleSoft or another application so you can get more functional integration. You’re passing messages back and forth to each other with proper error-handling agreement so you can be more responsive. It’s still not real time integration, but in most cases you don’t really need that,” says Fisher.

2. Security

The ability to federate—securely connect without completely merging—two networks, is a critical factor in building a useful cloud, according to Golden.

According to Nick Popp, VP of product development at Verisign (VRSN), that requires layers of security, including multifactor authentication, identity brokers, access management and sometimes an external service provider who can deliver that high a level of administrative control. Verisign is considering adding a cloud-based security service.

Wolf states that it requires technology that doesn’t yet exist: an Information Authority that can act as a central repository for security data and control of applications, data and platforms within the cloud. It is possible to assemble parts of that function from the elements Popp mentions today, yet Wolf maintains that no single technology can span all the platforms necessary to provide real control of even an internally hosted cloud environment.

3. Virtual I/O

One IT manager at a large digital mapping firm states that if you have to squeeze data for a dozen VMs through a few NICs, the scaling of your VM cluster to cloud proportions will be inhibited.

“When you’re in the dev/test stage, having eight or 10 [Gigabit Ethernet] cables per box is an incredible labeling issue; beyond that, forget it. Moving to virtual I/O is a concept shift—you can’t touch most of the connections anymore—but you’re moving stuff across a high-bandwidth backplane and you can reconfigure the SAN connections or the LANs without having to change cables,” says the IT manager.

Virtual I/O servers (like the Xsigo I/O Director servers used by the IT manager’s company) can run 20Gbit/sec through a single cord and as many as 64 cords to a single server, connecting to a backplane with a total of 1,560Gbit/sec of bandwidth. The IT manager states that concentrating such a large amount of bandwidth in one device saves space, power and cabling, keeps network performance high, and saves money on network gear in the long run.

Speaking about the Xsigo servers, which start at approximately $28,000 through resellers like Dell (DELL), the manager says, “It becomes cost effective pretty quickly. You end up getting three, four times the bandwidth at a quarter the price.”

4. Storage

Storage remains the weak point of the virtualization and cloud-computing worlds, and the place where the most money is spent.

“Storage is going to continue to be one of the big costs of virtualization. Even if you turn 90 percent of your servers into images, you still have to store them somewhere,” says Golden in summary. Visit Nubifer.com for more information.

Microsoft Releases Security Guidelines for Windows Azure

Industry analysts have praised Microsoft for doing a respectable job at ensuring the security of its Business Productivity Online Services, Windows and SQL Azure. With that said, deploying applications to the cloud requires additional considerations to ensure that data remains in the correct hands.

As a result of these concerns, Microsoft released a new version of its Security Development Lifecycle in early June. Microsoft’s Security Development Lifecycle, a statement of best practices for those building Windows and .NET applications, has been updated over the years to ensure the security of those apps; this release focuses on how to build security into Windows Azure applications.

Michael Howard, principal security program manager of Microsoft’s Security Development Lifecycle team, warns that those practices were not, however, designed for the cloud. Speaking in a pre-recorded video statement embedded in a blog entry, Howard says, “Many corporations want to move their applications to the cloud but that changes the threats, the threat scenarios change substantially.”

Titled “Security Best Practices for Developing Windows Azure Applications,” the 26-page white paper is divided into three sections: the first describes the security technologies that are part of Windows Azure (including the Windows Identity Foundation, Windows Azure AppFabric Access Control Service and Active Directory Federation Services 2.0—a core component for providing common logins to Windows Server and Azure); the second explains how developers can apply the various SDL practices to build more secure Windows Azure applications, outlining threats like namespace configuration issues and recommending data security practices such as generating shared-access signatures and using HTTPS in request URLs; and the third is a matrix that identifies various threats and how to address them.
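
To make the shared-access-signature idea concrete, here is a generic Python sketch of an expiring, HMAC-signed URL served over HTTPS. It mirrors the spirit of the practice the white paper describes, but it is not Azure’s actual SAS format; the account key, host name and expiry window are invented for illustration.

# Generic sketch of an expiring signed URL, in the spirit of shared-access
# signatures; Azure's real SAS format differs. Key and host are hypothetical.
import base64, hashlib, hmac, time
from urllib.parse import urlencode

ACCOUNT_KEY = base64.b64decode("aHlwb3RoZXRpY2FsLWtleQ==")  # placeholder key

def signed_url(resource: str, ttl_seconds: int = 3600) -> str:
    expiry = int(time.time()) + ttl_seconds
    string_to_sign = f"{resource}\n{expiry}"
    signature = base64.b64encode(
        hmac.new(ACCOUNT_KEY, string_to_sign.encode(), hashlib.sha256).digest()
    ).decode()
    # Serving only over HTTPS keeps the signature from being sniffed in transit.
    return f"https://example.blob.core.windows.net{resource}?" + urlencode(
        {"se": expiry, "sig": signature}
    )

print(signed_url("/container/report.pdf"))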

Says Howard, “Some of those threat mitigations can be technologies you use from Windows Azure and some of them are threat mitigations that you must be aware of and build into your application.”

Security is a major concern, and Microsoft has addressed many key issues concerning security in the cloud. Phil Lieberman, president of Lieberman Software Corp., a Microsoft Gold Certified Partner specializing in enterprise security, says, “By Microsoft providing extensive training and guidance on how to properly and securely use its cloud platform, it can overcome customer resistance at all levels and achieve revenue growth as well as dominance in this new area. This strategy can ultimately provide significant growth for Microsoft.”

Agreeing with Lieberman, Scott Matsumoto, a principal consultant with the Washington, D.C.-based consultancy firm Cigital Inc., which specializes in security, says, “I especially like the fact that they discuss what the platform does and what’s still the responsibility of the application developer. I think that it could be [wrongly] dismissed as a rehash of other information or incomplete—that would be unfair.” To find more research on Cloud Security, please visit Nubifer.com.

Microsoft Makes Strides for a More Secure and Trustworthy Cloud

Cloud computing currently holds court in the IT industry with vendors, service providers, press, analysts and customers all evaluating and discussing the opportunities presented by the cloud.

Security is a very important piece of the puzzle, and nearly every day a new press article or analyst report indicates that cloud security and privacy are a top concern for customers as the benefits of cloud computing continue to unfold. For example, a recent Microsoft survey revealed that although 86% of senior business leaders are thrilled about cloud computing, over 75% remain concerned about the security, access and privacy of data in the cloud.

Customers are correct in asking how cloud vendors are working to ensure the security of cloud applications, the privacy of individuals and protection of data. In March, Microsoft CEO Steve Ballmer told an audience at the University of Washington that, “This is a dimension of the cloud, and it’s a dimension of the cloud that needs all of our best work.”

Microsoft is seeking to address security-related concerns and help customers understand which questions they need to ask as part of Microsoft’s Trustworthy Computing efforts. The company is trying to become more transparent than competitors concerning how they help enable an increasingly secure cloud.

Server and Tools Business president Bob Muglia approached the issue in his recent keynote at Microsoft’s TechEd North America conference, saying, “The data that you have in your organization is yours. We’re not confused about that, that it’s incumbent on us to help you protect that information for you. Microsoft’s strategy is to deliver software, services and tools that enable customers to realize the benefits of a cloud-based model with the reliability and security of on-premise software.”

The Microsoft Global Foundations Services (GFS) site is a resource for users to learn about Microsoft’s cloud security efforts, with the white papers “Securing Microsoft’s Cloud Infrastructure” and “Microsoft’s Compliance Framework for Online Services” being very informative.

GFS drives a comprehensive, centralized Information Security Program for all Microsoft cloud data centers and the 200+ consumer and commercial services they deliver, all built using the Microsoft Security Development Lifecycle. The program covers everything from physical security to compliance: a Risk Management Process, incident response and work with law enforcement; defense-in-depth security controls across the physical, network, identity and access, host, application and data layers; a comprehensive Compliance Framework to address standards and regulations such as PCI, SOX, HIPAA and the Media Ratings Council; and third-party auditing, validation and certification (ISO 27001, SAS 70).

Muglia also pointed out Microsoft’s focus on identity, saying, “As you move to cloud services you will have a number of vendors, and you will need a common identity system.” In general, identity is the cornerstone of security, especially cloud security. Microsoft currently provides technologies with Windows Server and cloud offerings which customers can use to extend existing investments in identity infrastructure (like Active Directory) for easier and more secure access to cloud services.

Microsoft is not alone in working on cloud security, as noted by Microsoft’s chief privacy strategist Peter Cullen. “These truly are issues that no one company, industry or sector can tackle in isolation. So it is important to start these dialogs in earnest and include a diverse range of stakeholders from every corner of the globe,” Cullen said in his keynote at the Computers, Freedom and Privacy (CFP) conference. Microsoft is working with customers, governments, law enforcement, partners and industry organizations (like the Cloud Security Alliance) to ensure more secure and trustworthy cloud computing through strategies and technologies. To receive additional information on cloud security, contact a Nubifer.com representative today.

Facebook Security and Privacy: Ten Reminders to Live By

Facebook is arguably the largest social network on the globe, and because of that there are security and privacy issues that users need to remember. Here is a list of ten reminders to consider.

A reminder of why users need to be on guard when using Facebook arose during the week of May 3, when users of the social network discovered that they were being permitted to view their friends’ private chat conversations. The loophole was quickly fixed by the folks over at Facebook, but users’ concerns about privacy issues remain.

A few months prior to the May 3 incident, some Facebook users received private messages that were meant for other users. Facebook acted similarly in this case, swiftly addressing the problem, but once again privacy advocates began to question whether Facebook was taking enough measures to protect data.

Facebook has maintained that these minor glitches are fixed quickly, and users must remember that it is nearly impossible for a social network service with over 400 million active users to deliver absolute data security 100 percent of the time. When joining Internet social networks, users need to expect their personal data to be vulnerable to a certain degree and make it their duty to maintain personal privacy and security on a social network.

Ten reminders to live by:

1. Privacy Concerns

There are legitimate privacy concerns that users need to be aware of in order to understand the issues that may arise when using Facebook. As soon as you acknowledge that Facebook isn’t without flaws, you can begin to safeguard your data. Once you have a better understanding of privacy on the Web, you can alter the way in which you use social networks.

2. Holes

The ways in which hackers target Facebook’s users multiply as the site becomes more and more popular. One common tactic is a phishing scam that asks users to input their credentials into a faux Facebook look-alike. Once a user does so, hackers have access to that person’s log-in information and can alter the profile and send that information to others.

3. Only Offer What You Want Others to See

Third parties can only see the information that you put on the social network. This seems simple, but it is an important thing to remember. Facebook is a place where users can communicate with friends, and some use it as a platform to reveal things that they should not. It is important for users to remember that what they intend to share with a smaller group may eventually be accessible to others.

4. Facebook is Meant for Adults

Facebook originated as an online space for college students, but as the social network expanded it began to include generations above and below the collegiate level … meaning kids. It is important to remember that the Web remains a dangerous place for kids and that if adults are concerned about privacy then it isn’t a safe place for children.

5. Use the Facebook Privacy Settings

It is important to change your privacy settings before using Facebook. Even critics find Facebook’s privacy settings to be robust in the world of social networking. Users can decide which people are permitted to see the content in their profiles within a few minutes of reviewing the site’s settings. Facebook highlights the importance of privacy and equips users with the tools to feel comfortable on the social network.

6. Be Wary of Sharing Sensitive Information on the Web

The Web may have been a bastion of anonymity years ago, but that era is over. Users share more and more information on sites like Facebook and as a result the desire for anonymity has gradually diminished. Users need to remember that the Internet isn’t the place to disclose sensitive information and consequently only share what they are comfortable with all Web users seeing.

7. Is Privacy Best for a Social Network?

Facebook’s default settings make certain information available to others, and it isn’t in a social network’s business interest for users to lock down every single privacy setting. Users need to be diligent, because the more information they share on a social network, the more likely people are to want to use it. Facebook, MySpace and Google already know this; users need to know it too and begin fighting back.

8. Alternatives Aren’t Immune to Security Issues

Facebook alternatives aren’t any better in terms of privacy and security issues. Google Buzz, for example, has been a target of privacy advocates since its launch, with critics wondering why Google didn’t implement the right policies from the beginning. Facebook comes out on top when comparing privacy across the major social networks and consequently is probably the best choice for users concerned with privacy.

9. Some Privacy Is Lost and Gone

As users continue to reveal their true identities, the days of anonymity on the Web are numbered (if not gone completely). While some are uncomfortable with this, many users are growing more comfortable with it. Web users can expect their names and maybe even a picture to be available on the Web when signing up for social networks. Information such as their hometown and college is also freely available. Absolute privacy is a thing of the past, and users need to accept this fact.

10. Blame Can Be Placed on Facebook and Users Alike

While Facebook is an easy scapegoat for privacy woes, a large part of the blame can be placed on users. Facebook relies on users sharing information with others as its basic business model, and while it does attempt to maintain privacy, it is up to the users to control what information they choose to divulge. Additionally, it is incumbent upon users to educate themselves about the risks that could affect them if they don’t brush up on privacy and social networks. To learn more please visit Nubifer.com.

Cloud Computing Security Play Made by McAfee with McAfee Cloud Secure

A new service from McAfee targeting Software-as-a-Service providers combines vulnerability scanning and security certification for cloud infrastructures. The service—called the McAfee Cloud Secure program—is designed to complement the annual audits of security and process controls that most cloud vendors undergo for certification. McAfee officials say that with McAfee Cloud Secure they will team up with certification providers to offer an additional level of assurance through a daily scan of application, network perimeter and infrastructure vulnerabilities. Those that pass will be rewarded with a “McAfee SECURE” seal of approval.

Earlier this month at the RSA security conference, securing cloud environments was a major topic up for discussion. A survey by IDC on attitudes towards the cloud revealed that 87.5 percent of participants said the most significant obstacles to cloud adoption were security concerns. IDC analyst Christian Christiansen said in a statement, “SaaS vendors have a difficult time convincing prospects that their services are secure and safe.” According to Christiansen, though, McAfee’s new offering is a step in the right direction toward increased security in the cloud.

McAfee and other vendors have discussed providing security from the cloud in the past, but this announcement shows the increasing focus on providing solutions to secure cloud environments themselves in the industry.

Marc Olesen, senior vice president and general manager of McAfee’s Software-as-a-Service business, said in an interview with eWEEK, “McAfee looks at the cloud really from three different angles, which is security from the cloud, in the cloud and for the cloud. What’s really been out there today are (annual) process certification audits … that address the process controls and security controls that cloud providers have in place. This has typically been an ISO-27001 certification or an SAS-70 certification that cloud providers are using, and we feel that that’s very important, but it’s just a start.” For more information please contact a Nubifer representative today.

Cloud Interoperability Brought to Earth by Microsoft

Executives at Microsoft say that an interoperable cloud could help companies trying to lower costs and governments trying to connect constituents. Cloud services are increasingly seen as a way for businesses and governments to scale IT systems for the future, consolidate IT infrastructure, and enable innovative services not possible until now.

Technology vendors are seeking to identify and solve the issues created by operating in mixed IT environments in order to help organizations fully realize the benefits of cloud services. Additionally, vendors are collaborating to make sure that their products work well together. The industry may still be in the beginning stages of collaborating on cloud interoperability, but has already made great strides.

So what exactly is cloud interoperability and how can it benefit companies now? Cloud interoperability specifically concerns one cloud solution working with other platforms and applications—not just other clouds. Customers want to be able to run applications locally or in the cloud, or even on a combination of both. Currently, Microsoft is collaborating with others in the industry and is working to make sure that the premise of cloud interoperability becomes an actuality.

Microsoft’s general managers Craig Shank and Jean Paoli are spearheading Microsoft’s interoperability efforts. Shank helms the company’s interoperability work on public policy and global standards and Paoli collaborates with the company’s product teams to cater product strategies to the needs of customers. According to Shank, one of the main attractions of the cloud is the amount of flexibility and control it gives customers. “There’s a tremendous level of creative energy around cloud services right now—and the industry is exploring new ideas and scenarios all the time. Our goal is to preserve that flexibility through an open approach to cloud interoperability,” says Shank.

Paoli chimes in to say, “This means continuing to create software that’s more open from the ground up, building products that support technologies such as PHP and Java, and ensuring that our existing products work with the cloud.” Both Shank and Paoli are confident that welcoming competition and choice will allow Microsoft to become more successful down the road. “This may seem surprising,” says Paoli, before adding, “but it creates more opportunities for its customers, partners and developers.”

Shank reveals that due to the buzz about the cloud, some forget the ultimate goal: “To be clear, cloud computing has enormous potential to stimulate economic growth and enable governments to reduce costs and expand services to citizens.” One example of the real-world benefits of cloud interoperability is the public sector, where Microsoft is currently showing results via solutions like its Eye on Earth project. Microsoft is helping the European Environment Agency simplify the collection and processing of environmental information for use by the general public and government officials. Eye on Earth obtains data from 22,000 water monitoring points and 1,000 stations that monitor air quality, employing Microsoft Windows Azure, Microsoft SQL Azure and existing Linux technologies. Eye on Earth then helps synthesize the information and makes it accessible to people in 24 different languages in real time.

Product developments like this emerged out of feedback channels the company developed with its partners, customers and other vendors. In 2006, for example, Microsoft created the Interoperability Executive Customer (IEC) Council, which comprises 35 chief technology officers and chief information officers from a variety of organizations across the globe. The group meets twice a year in Redmond to discuss interoperability issues and provide feedback to Microsoft executives.

Additionally, Microsoft recently published a progress report which, for the first time, revealed operational details and results achieved by the Council across six work streams (or priority areas). The Council recently commissioned a seventh work stream for cloud interoperability, geared toward developing cloud-related standards that address topics like data portability, privacy, security and service policies.

Developers are an important part of cloud interoperability, and Microsoft is part of an effort the company co-founded with Zend Technologies, IBM and Rackspace called Simple Cloud. Simple Cloud was created to help developers write basic cloud applications that work on all major cloud platforms.

Microsoft is further engaging in the collaborative work of building technical “bridges” between the company and non-Microsoft technologies, like the recently released Microsoft Windows Azure Software Development Kits (SDKs) for PHP and Java, tools for the Windows Azure platform AppFabric SDKs for Java, PHP and Ruby (Eclipse version 1.0), the SQL CRUD Application Wizard for PHP and the Bing 404 Web Page Error Toolkit for PHP. These examples show the dedication of the Microsoft Interoperability team.

Despite the infancy of the industry’s collaboration on cloud interoperability issues, much progress has already been made. This progress has had a major positive impact on the way even average users work and live, even if they don’t realize it yet. A wide perspective and a creative and collaborative approach to problem-solving are required for cloud interoperability. In the future, Microsoft will continue to support more conversation within the industry in order to define cloud principles and make sure all points of view are incorporated. For more information please contact a Nubifer representative today.

Amazon Sets the Record Straight About the Top Five Myths Surrounding Cloud Computing

On April 19, the 5th International Cloud Computing Conference & Expo (Cloud Expo) opened in New York City, and Amazon Web Services (AWS) used the event as a platform to address some of what the company sees as the lingering myths about cloud computing.

AWS officials said that the company continues to grapple with questions about features of the cloud, ranging from reliability and security to cost and elasticity, despite being one of the first companies to successfully and profitably implement cloud computing solutions. Adam Selipsky, vice president of AWS, recently spoke about the persisting myths of cloud computing from Amazon’s Seattle headquarters, specifically addressing five that linger in the face of increased industry adoption of the cloud and continued successful cloud deployments. “We’ve seen a lot of misperceptions about what cloud computing is,” said Selipsky before debunking five common myths.

Myth 1: The Cloud Isn’t Reliable

Chief information officers (CIOs) in enterprise organizations have difficult jobs and are usually responsible for thousands of applications, explains Selipsky in his opening argument, adding that they feel like they are responsible for the performance and security of these applications. When problems with the applications arise, CIOs are used to approaching their own people for answers and take some comfort that there is a way to take control of the situation.

Selipsky says that customers need to consider a few things when adopting the cloud, one of which is that AWS’ operational performance is strong. Selipsky reminded users that they own the data, they choose where to store the data (and it doesn’t move unless the customer decides to move it) and that regardless of whether customers choose to encrypt or not, AWS never looks at the data.

“We have very strong data durability—we’ve designed Amazon S3 (Simple Storage Service) for eleven 9s of durability. We store multiple copies of each object across multiple locations,” said Selipsky. He added that AWS has a “Versioning” feature which allows customers to revert to the last version of any object they somehow lose due to application failure or an unintentional deletion. Customers can also build more fault-tolerant applications by deploying them across multiple Availability Zones or using AWS’ load balancing and Auto Scaling features.
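
As a brief, hedged illustration of the Versioning feature Selipsky mentions, the following snippet uses boto3 (the AWS SDK for Python) to enable versioning on a bucket and list the recoverable versions of its objects; the bucket name and prefix are placeholders.

# Hedged sketch: turn on S3 Versioning and inspect recoverable versions.
# Bucket name and prefix are hypothetical; requires AWS credentials.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_versioning(
    Bucket="example-bucket",
    VersioningConfiguration={"Status": "Enabled"},
)

# After an accidental overwrite or delete, earlier versions remain listable:
response = s3.list_object_versions(Bucket="example-bucket", Prefix="data/")
for version in response.get("Versions", []):
    print(version["Key"], version["VersionId"], version["IsLatest"])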

“And, all that comes with no capex [capital expenditures] for companies, a low per-unit cost where you only pay for what you consume, and the ability to focus your engineers on unique incremental value for your business,” said Selipsky, before adding that the reliability claims come merely from an illusion of control, not actual control. “People think if they can control it they have more say in how things go. It’s like being in a car versus an airplane, but you’re much safer in a plane,” he explained.

Myth 2: The Cloud Provides Inadequate Security and Privacy

When it comes to security, Selipsky notes that it is an end-to-end process and thus companies need to build security at every level of the stack. Taking a look at Amazon’s cloud, it is easy to note that the same security isolations are employed as with a traditional data center—including physical data center security, separation of the network, isolation of the server hardware and isolation of storage. Data centers had already become a frequently-shared infrastructure on the physical data center side before Amazon launched its cloud services. Selipsky added that companies realized that they could benefit by renting space in a data facility as opposed to building it.

When speaking about security fundamentals, Selipsky noted that security could be maintained by providing badge-controlled access, guard stations, monitored security cameras, alarms, separate cages and strictly audited procedures and processes. Not only do Amazon Web Services’ data centers match the best practices employed in private data facilities, there is an added physical security advantage in the fact that customers don’t need access to the servers and networking gear inside. Access to the data center is thus controlled more strictly than in traditional rented facilities. Selipsky also added that the Amazon cloud has equal or better isolation than could be expected from dedicated infrastructure at the physical level.

In his argument, Selipsky pointed out that networks ceased to be isolated physical islands a long time ago because, as companies increasingly began to need to connect to other companies—and then the Internet—their networks became connected with public infrastructure. Firewalls and switch configurations and other special network functionality were used to prevent bad network traffic from getting in, or conversely from leaking out. Companies began using additional isolation techniques as their network traffic increasingly passed over public infrastructure to make sure that the security of every packet on (or leaving) their network remained secure. These techniques include Multi-protocol Label Switching (MPLS) and encryption.

Amazon used a similar approach to networking in its cloud by maintaining packet-level isolation of network traffic and supporting industry-standard encryption. Amazon Web Services’ Virtual Private Cloud allows a customer to establish their own IP address space, and because of that customers can use the same tools and software infrastructure they are already familiar with to monitor and control their cloud networks. Amazon’s scale also allows for more investment in security policing and countermeasures than nearly any large corporation could afford. Maintains Selipsky, “Our security is strong and dug in at the DNA level.”
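
A short, hedged boto3 sketch of the Virtual Private Cloud idea described above: a customer-chosen, isolated IP address space carved into subnets. The CIDR blocks are examples only.

# Illustrative sketch: create a VPC with your own address space via boto3.
# CIDR blocks are examples; requires AWS credentials to actually run.
import boto3

ec2 = boto3.client("ec2")

vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")   # address space you choose
vpc_id = vpc["Vpc"]["VpcId"]

# Carve a subnet out of that space; instances in it use addresses you control.
ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")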

Amazon Web Services also invests significantly in testing and validating the security of its virtual server and storage environment. Discussing the investments made on the hardware side, Selipsky lists the following:

After customers release these resources, the server and storage are wiped clean so no important data can be left behind.

Intrusion from other running instances is prevented because each instance has its own customer firewall.

Those in need of more network isolation can use Amazon VPC, which allows you to carry your own IP address space with you into the cloud; your instances are accessible only through IP addresses that you know.

Those desiring to run on their own boxes—where no other instances are running—can purchase extra large instances where only that XL instance runs on that server.

According to Selipsky, Amazon’s scale allows for more investment in security policing and countermeasures: “In fact, we often find that we can improve companies’ security posture when they use AWS. Take the example lots of CIOs worry about—the rogue server under a developer’s desk running something destructive or that the CIO doesn’t want running. Today, it’s really hard (if not impossible) for CIOs to know how many orphans there are and where they might be. With AWS, CIOs can make a single API call and see every system running in their VPC [Virtual Private Cloud]. No more hidden servers under the desk or anonymously placed servers in a rack and plugged into the corporate network. Finally, AWS is SAS-70 certified; ISO 27001 and NIST are in process.”
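
That “single API call” maps naturally onto EC2’s DescribeInstances operation. The hedged boto3 sketch below lists every running instance in one VPC; the VPC ID is hypothetical.

# Sketch of the inventory check Selipsky describes: enumerate every running
# instance in a given VPC. The VPC ID is a placeholder.
import boto3

ec2 = boto3.client("ec2")
pages = ec2.get_paginator("describe_instances").paginate(
    Filters=[
        {"Name": "vpc-id", "Values": ["vpc-0123456789abcdef0"]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)
for page in pages:
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            print(instance["InstanceId"], instance["PrivateIpAddress"])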

Myth 3: Creating My Own In-House Cloud or Private Cloud Will Allow Me to Reap the Same Benefits of the Cloud

According to Selipsky, “There’s a lot of marketing going on about the concept of the ‘private cloud.’ We think there’s a bit of a misnomer here.” Selipsky continued to explain that, generally, “we often see companies struggling to accurately measure the cost of infrastructure. Scale and utilization are big advantages for AWS. In our opinion, a cloud has five key characteristics: it eliminates capex; allows you to pay for what you use; provides true elastic capacity to scale up and down; allows you to move very quickly and provision servers in minutes; and allows you to offload the undifferentiated heavy lifting of infrastructure so your engineers work on differentiating problems.”

Selipsky also pointed out the drawbacks of private clouds: you still own the capex (and it is expensive!); you do not pay only for what you use; you do not get true elasticity; and you still manage the undifferentiated heavy lifting. “With a private cloud you have to manage capacity very carefully … or you or your private cloud vendor will end up over-provisioning. So you’re going to have to either get very good at capacity management or you’re going to wind up overpaying,” said Selipsky, before challenging the elasticity of the private cloud: “The cloud is shapeless. But if it has a tight box around it, it no longer feels very cloud-like.”

One of AWS’ key offerings is Amazon’s ability to save customers money while also driving efficiency. “In virtually every case we’ve seen, we’ve been able to save people a significant amount of money,” said Selipsky. This is in part because AWS’ business has greatly expanded over the last four years and Amazon has achieved enough scale to secure very low costs. AWS has been able to aggregate hundreds of thousands of customers to have a high utilization of its infrastructure. Said Selipsky, “In our conversations with customers we see that really good enterprises are in the 20-30 percent range on utilization—and that’s when they’re good … many are not that strong. The cloud allows us to have several times that utilization. Finally, it’s worth looking at Amazon’s heritage and AWS’ history. We’re a company that works hard to lower its costs so that we can pass savings back to our customers. If you look at the history of AWS, that’s exactly what we’ve done (lowering price on EC2, S3, CloudFront, and AWS bandwidth multiple times already without any competitive pressure to do so).”

Myth 4: The Cloud Isn’t Ideal Because I Can’t Move Everything at Once

Selipsky debunks this myth by saying, “We believe this is nearly impossible and ill-advised. We recommend picking a few apps to gain experience and comfort then build a migration plan. This is what we most often see companies doing. Companies will be operating in hybrid environments for years to come. We see some companies putting some stuff on AWS and then keeping some stuff in-house. And I think that’s fine. It’s a perfectly prudent and legitimate way of proceeding.”

Myth 5: The Biggest Driver of Cloud Adoption is Cost

In busting the final myth, Selipsky said, “There is a big savings in capex and cost but what we find is that one of the main drivers of adoption is that time-to-market for ideas is much faster in the cloud because it lets you focus your engineering resources on what differentiates your business.”

Summary

Speaking about all of the myths surrounding the cloud, Selipsky concludes that “a lot of this revolves around psychology and fear of change, and human beings needing to gain comfort with new things. Years ago people swore they would never put their credit card information online. But that’s no longer the case. We’re seeing great momentum. We’re seeing, more and more, over time these barriers [to cloud adoption] are moving.” For additional debunked myths regarding cloud computing visit Nubifer.com.

A Guide to Securing Sensitive Data in Cloud Environments

Due to the outsourced nature of the cloud and its inherent loss of control, it is important to make sure that sensitive data is constantly and carefully monitored for protection. That task is easier said than done, which is why the following questions arise: How do you monitor a database server when its underlying hardware moves every day—sometimes even multiple times a day and sometimes without your knowledge? How do you ensure that your cloud computing vendor’s database administrators and system administrators are not copying or viewing confidential records inappropriately or abusing their privileges in other ways?

When deploying a secure database platform in a cloud computing environment, these obstacles and many more are bound to arise, and an enterprise needs to be able to overcome them, as these barriers may be enough to deter some enterprises from moving away from their on-premises approach. There are three critical architectural concerns to consider when transferring applications with sensitive data to the cloud.

Issue 1: Monitoring an Ever-changing Environment

Cloud computing grants you the ability to move servers and add or remove resources in order to maximize the use of your systems and reduce expense. This increased flexibility and efficiency often means that the database servers housing your sensitive data are constantly being provisioned and deprovisioned. Each of these scenarios represents a potential target for hackers, which is an important point to consider.

Monitoring data access becomes more difficult due to the dynamic nature of a cloud infrastructure. If the information in those applications is subject to regulations like the Payment Card Industry Data Security Standard (PCI DSS) or the Health Insurance Portability and Accountability Act (HIPAA), it is vital to make sure that it is secure.

When evaluating solutions to monitor activity on these dynamic database servers, it is essential to find a methodology that can be deployed on new servers without manual management involvement. This requires a distributed model in which each instance in the cloud has a sensor or agent running locally, and this software must be able to be provisioned automatically along with the database software without requiring intrusive system management.

It won’t always be possible to reboot whenever it is necessary to install, upgrade or update the agents in a multitenancy environment such as this, and the cloud vendor may even place limitations on installing software that requires certain privileges. With the right architecture in place, you will be able to see where your databases are hosted at any point in time and will be able to centrally log all activity and flag suspicious events across all servers, wherever they are.

Issue 2: Working in a WAN

Currently, database activity monitoring solutions utilize a network-sniffing model to identify malicious queries, but this approach isn’t feasible in the cloud environment because the network encompasses the entire Internet. Another method that doesn’t work in the cloud is adding a local agent which sends all traffic to a remote server.

The solution is something designed for distributed processing, where the local sensor is able to analyze traffic autonomously. Another thing to consider is that the cloud computing resources you procure are likely to be on a WAN, where network bandwidth and latency make off-host processing inefficient. With cloud computing, you are likely unable to colocate a server close to your databases. This means that the time and resources spent sending every transaction to a remote server for analysis will stunt network performance and also hinder timely interruption of malicious activity.

So when securing databases in cloud computing, a better approach is to utilize a distributed monitoring solution that is based on “smart” agents. That way, once a security policy for a monitored database is in place, that agent or sensor is able to implement protection and alerting locally and thus prevent the network from turning into the gating factor for performance.
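
A minimal sketch of that “smart” agent pattern appears below: the sensor evaluates each query locally against its policy and emits an alert only on a violation, so routine traffic never crosses the WAN. The policy rules here are invented purely for illustration.

# Minimal "smart agent" sketch: evaluate queries locally, alert on violations.
# The policy rules and the queries are hypothetical examples.
import json
import re

POLICY = [
    {"name": "bulk-export",
     "pattern": re.compile(r"select\s+\*\s+from\s+cardholder", re.I)},
    {"name": "schema-change",
     "pattern": re.compile(r"\b(create|alter)\s+(trigger|view|procedure)\b", re.I)},
]

def inspect(query: str) -> None:
    for rule in POLICY:
        if rule["pattern"].search(query):
            # In a real deployment, only this alert would cross the WAN to the
            # central console; normal traffic is analyzed and dropped locally.
            print(json.dumps({"rule": rule["name"], "query": query}))

inspect("SELECT * FROM cardholder WHERE 1=1")   # triggers the bulk-export rule
inspect("SELECT name FROM employees")           # passes silently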

It is also necessary to test the WAN capabilities of your chosen software for remote management of distributed data centers. It should be able to encrypt all traffic between the management console and sensors to restrict exposure of sensitive data. There are also various compression techniques that can enhance performance so that alerts and policy updates are transmitted efficiently.
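
As a rough illustration of that transport concern, the sketch below compresses an alert and sends it over TLS so nothing sensitive crosses the WAN in the clear. The console host, port and certificate arrangements are assumptions made for the example.

# Hedged sketch: compress an alert, then ship it over a TLS connection.
# Host and port are placeholders; a real console would need a valid certificate.
import json
import socket
import ssl
import zlib

def ship_alert(alert: dict, host: str = "console.example.com", port: int = 8443) -> None:
    payload = zlib.compress(json.dumps(alert).encode())  # shrink WAN traffic
    context = ssl.create_default_context()               # verifies the server cert
    with socket.create_connection((host, port)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall(payload)

# Example call (assumes a listening console):
# ship_alert({"rule": "bulk-export", "severity": "high"})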

Issue 3: Know Who Has Privileged Access to Your Data

The activity of privileged users is one of the most difficult elements to monitor in any database implementation. It is important to remember that DBAs and system administrators know how to stealthily access and copy sensitive information (and cover their tracks afterward). In cloud computing environments, there are unknown personnel at unknown sites with these access privileges. Additionally, you cannot personally conduct background checks on third parties as you would for your own staff. Looking at all of these factors, it is easy to see why protecting against insider threats is important yet difficult to do.

So how do you resolve this issue? One way is to separate duties to ensure that the activities of privileged third parties are monitored by your own staff and also that the pieces of the solution on the cloud side of the network are unable to be defeated without alerts going off. It is also necessary to be able to closely monitor individual data assets regardless of the method used to access it.

Seek out a system that knows when data is being accessed in violation of the policy, without relying on query analytics alone. Sophisticated users with privileges can create new views, insert stored procedures into a database or generate triggers that compromise information without the SQL command arousing suspicion.

Summary

Some may wrongly conclude that the complexity of monitoring databases in a cloud architecture isn’t worth the change from dedicated systems, at least not just yet. Even so, most enterprises will decide that deploying applications with sensitive data on one of these models is inevitable. Leading organizations have already begun to make the shift, and as a result tools are now meeting the requirements driven by the issues raised in this article.

Essentially, security should not prevent you from moving forward with deploying databases in the cloud if you think your enterprise would benefit from doing so. By looking before you leap, and ensuring your security methodologies adequately address these unique cases, you can make the transition safely. For more information please visit Nubifer.com.

Public vs. Private Options in the Cloud

The demand for cloud computing is perpetually increasing, which means that business and technology managers need to clear up any questions they have about the differences between public and private clouds—and quickly at that.

The St. Louis-based United Seating and Mobility is one company that faced the common dilemma of choosing between a public or private cloud. The company—which sells specialized wheelchairs at 30 locations in 12 states—initially used phones and email to stay up to date on vendor contracts and other matters before monitoring these developments with off-the-shelf applications on its own servers. Finally, United Seating and Mobility decided to move to the public cloud.

United Seating and Mobility’s director of operations Michael DeHart tells Baseline Magazine of the move, “The off-the-shelf applications didn’t collaborate. You’d log on to all of the apps and try to remember which one needed which password.” Staffers across the nation now share the information seamlessly via the enhanced tools available in the public cloud.

Another example illustrating the difference between the public and private cloud is the Cleveland Cavaliers. The NBA team uses a private cloud to run its arena’s website. Going private allowed for increased one-on-one interaction with the cloud provider partner while simultaneously giving the franchise more resources to handle increased traffic to the site. Traffic on the arena site has been known to spike when, for example, the team makes the playoffs or a major artist is coming to the venue. “When you’ve booked Miley Cyrus you’d better be ready,” says the Cleveland Cavaliers’ director of web services, Jeff Lillibridge.

Despite choosing different versions of the cloud, both United Seating and Mobility and the Cleveland Cavaliers illustrate that few enterprise managers will be able to avoid the topic of private versus public clouds. According to research firm IDC, worldwide cloud services revenue will reach $44.2 billion in 2013, compared to $17.4 billion last year.

Business and technology professionals remain stumped about what private and public clouds are despite the increased demand for worldwide cloud services. Examples of public clouds include Google AppEngine, IBM’s Blue Cloud, LotusLive Engage and Amazon’s Elastic Compute Cloud (EC2). A public cloud is a shared technology resource used on an as-needed basis and available via the Internet while a private cloud is created specifically for the use of one organization.

Enhanced by virtualization technologies, both concepts are making way for an “evergreen” approach to IT in which enterprises can obtain technologies when they need them without purchasing and maintaining a host of in-house services.

Bob Zukis, national leader of IT strategy for PricewaterhouseCoopers (PwC) says, “It all stems from the legacy model of ‘build it and forget about it.’ Changes taking place in the industry are making it much more efficient and effective to provision what IT needs. So ‘build it and forget about it’ no longer meets the needs of the business. Whether you’re going with a public or private cloud, you’re pursuing a way to increase your technological resources in a more efficient flexible way.”

In addition to being evergreen, this movement is also green-friendly. Says Frost and Sullivan’s Vanessa Alvarez, “Cloud computing allows for [sharing] resources and paying only for what they use. When an application is not utilizing resources, those resources can be moved to another application that needs them, enabling maximum resource efficiencies. If additional capacity or resources are no longer needed, virtual servers can be powered down or shut off.”

Organizations continue to struggle to choose between private and public clouds. On one hand, private clouds offer security and increased flexibility compared to traditional legacy systems, but they have a higher barrier to entry than public clouds. Moreover, private cloud services require that an enterprise IT manager handle technology standardization, virtualization and operations automation in addition to operations support and business support systems.

“With public clouds, you provision your organization very quickly, by increasing service, storage and other computing needs,” says Zukis. “A private cloud takes a lot more time because you’re essentially rearchitecting your legacy environment.” Although public clouds don’t require this organizational shift and are thus faster and more convenient, they fail to provide the same amount of transparency as private clouds. Says Zukis, “It’s not always clear what you’re buying off the shelf with public clouds.”

Assessing the Value of Security

Another major issue in the cloud debate is security. All organizations value security, but each has to strike a balance between cost and convenience, on one hand, and data security, on the other. Some organizations have a higher tolerance for potential violations than others and can therefore pursue a need-for-speed strategy.

Head of strategic sales and marketing at NIIT Technologies Aninda Bose, who has analyzed both cloud structures through her job and in her position with the nonprofit research organization Project Management Institute, states that the public cloud is the better option for an enterprise dealing with high-transaction, low-security workloads or low-value data. A local government office that simply needs to tell citizens the renewal date for their car registrations is a perfect candidate for public cloud hosting.

Examples better suited for the private cloud model due to the sensitivity of their data include a federal agency, financial institution or health care provider. Mark White, principal with Deloitte Consulting, explains, “Accounting treatments and taxation applications are not yet fully tested for public cloud services. So enterprises with significant risk from information exposure may want to focus on the private cloud approach. This caution is most relevant for systems that process, manage and report key customer, financial or intelligence information. It’s less important for ‘edge’ systems, such as salesforce automation and Web order-entry applications.”

Sioux Falls, South Dakota-based medical-practice company The Orthopedic Institute is very data-dependent and concluded that the private cloud structure best fit its needs—specifically because the company must comply with strict rules for protecting patient information laid out by HIPAA (Health Insurance Portability and Accountability Act).

IT Director David Vrooman explains that The Orthopedic Institute was seeking to change its domain name from Orth-I.com, but after exploring possibilities with MaxMD, the exclusive provider of .md domains, it determined that MaxMD could also provide private cloud services for highly secured, encrypted email transmissions. Moreover, the cost of entry was less than doing it in-house. “We didn’t want to use one of our servers for this because it would have amounted to a $20,000 startup cost. By going with a private cloud option, we launched this at one-fifth of that expense—and it only took an afternoon to get it started,” says Vrooman. “It would have taken at least a week for my staff and me to get this done. And because MaxMD has taken over the email encryption, I’m not getting up at 3am to find out what’s wrong with the server.”

Some industry experts warn that traditional views about security and cloud computing may be changing, however, and that includes organizations dependent on highly secured data. CPA2Biz, the New York-based business-resources subsidiary of the American Institute of Certified Public Accountants, wanted to provide the institute’s 350,000 members with access to the latest software tools. CPA2Biz worked with Intacct to create a public cloud model for its CPA members. Since the program launched in April, concerns about security have been addressed, and hundreds of firms are supporting approximately 2,000 clients through the public cloud services offered through CPA2Biz.

“Only those in the largest of member organizations would be able to consider a private cloud system. Plus, we don’t believe there are security advantages to a private cloud system,” says vice president of corporate alliances at CPA2Biz Michael Cerami. “We’ve selected partners who operate highly secure public cloud environments. This allows us to provide our members with great collaborative tools that enable them to work proactively with their clients in real time.”

The Choice

Going back to United Seating and Mobility, the organization was interested in the public cloud structure because it isn’t dependent on high-volume, automated sales. The company uses IBM’s LotusLive Engage for online meetings, file-sharing and project-management tasks.

DeHart estimates that doing this in-house would have taken up a server and a half, saying, “Being on the public cloud allows us to avoid this entirely. It’s a leasing-versus-owning concept—an operational expense versus a capital one. And the Software-as-a-Service offerings are better than what we could get off the shelf. We certainly can’t use this cloud to work with any sensitive health data. But we can run much of our business operations on it, freeing up our IT people to focus on email, uptime and cell phone services.”

Now, take the Cleveland Cavaliers. They opted for private cloud services to support the website for their venue, Quicken Loans Arena, aka “the Q.” Fans can search for information about upcoming events on TheQArena.com and are directed to a business called Veritix if they want to buy tickets. Because the arena site acts as a traffic conduit for Veritix, a private cloud was the best option, and the team partnered with Hosted Solutions. Since the current NBA season began last fall, the site’s page views and visits have increased by over 60 percent, and the number of unique visitors has increased by 55 percent. By employing Hosted Solutions, the team avoids uncertainty about who is minding the data.

The private cloud also enables the team to manage site traffic that can jump significantly in the case of a last-second, playoff-determining shot, for example. “The need to scale was significant but we didn’t want to oversee our own dedicated hosting,” says Lillibridge. “It would have been more expensive, and we would have had the headache of managing our own servers. We needed dedicated services that would avoid this, while allowing our capacity to increase during peak times and decrease when we don’t have a lot of traffic.”

There is no clear-cut answer as to whether the private or public cloud is better; rather, companies need to assess their own individual requirements for speed, security, resources and scalability. To learn more about which Cloud option is right for your enterprise, contact a Nubifer representative today.

Legal Risks for Companies to Consider Before Embracing the Cloud

Along with its never-ending stream of possibilities in revolutionizing the invention, development, deployment, scaling, updating, maintenance and payment for data and applications, cloud computing brings a variety of legal risks to the table, and companies must consider these before entering a highly optimized public cloud.

Risk arises from what Baselinemag.com calls the “nationless state” of the public cloud: uncertainty over where sensitive data and applications physically dwell. Among these risks are jurisdictions where laws governing the protection and availability of data are very different from what companies are used to. Information in the cloud can also be widely distributed across various legal and international jurisdictions (each with different laws concerning security, privacy, data theft, data loss and intellectual property) due to the virtual and dynamic nature of cloud computing architecture.

Furthermore, when operating in the cloud, issues concerning privacy, data ownership and access to data cause many questions to arise. National or international legal precedents for cloud computing may be few and far between, but companies nonetheless must ensure that they can immediately access their information and that their service provider has appropriate backup and data-retrieval procedures in place.

A new paradigm of licensing—in which traditional software license agreements are replaced with cloud service agreements—will emerge as a result of the legal framework of cloud computing. Lawyers representing cloud service providers will subsequently try to reduce the liability of their clients by proposing contracts with the service provided “as is,” without a warranty. Under this new paradigm, the service is provided without any assurance or promise of a specific level of performance. This added risk must be evaluated within the context of the benefits derived from the cloud as well as the data proposed for storage in the cloud.

Cloud computing also causes issues for companies that have to meet increasingly stringent compliance and reporting requirements for the management of their data. These issues pose major risks in protecting companies’ sensitive data and the information assets their customers have entrusted them to watch over.

In summary, enterprises must make sure that their cloud service providers specify where their data dwells, the legal framework within those specific jurisdictions, and the security, backup, anti-hacking and anti-viral processes the service provider has set up. Despite these risks, cloud computing has enormous benefits, and companies should be eager to take advantage of the optimization, scalability and cost savings it provides. While embracing the cloud, companies must simply conduct a more detailed legal analysis and assessment of risks, much as they would with traditional IT services. For more information on security relating to Cloud Computing, please visit Nubifer.com.

Heightening Cloud Security in Your Enterprise

The responsibility of securing corporate information in the cloud falls upon the enterprise, and enterprises, as cloud consumers, can greatly improve cloud security. Currently, if there is a breach in security, the enterprise is responsible. eWeek Knowledge Center contributor Matthew Gardiner reveals six ways in which enterprises can improve cloud security, essentially by thinking like a cloud provider. Once an enterprise has improved security within its cloud computing model, it can fully reap the benefits of the cloud.

Cloud security is a shared responsibility between cloud providers and enterprises, although the dividing line between the two is currently, well, cloudy. Where that line falls depends on the type of cloud model–ranging from Software-as-a-Service (SaaS) to Platform-as-a-Service (PaaS) to Infrastructure-as-a-Service (IaaS).

SaaS approaches what can be thought of as a security black box, in which application security activities are largely invisible to the enterprise. IaaS, in which an enterprise is principally responsible for the security of the application, data and other levels of the infrastructure stack, sits at the other end of the spectrum.

The following six steps outline what enterprises can do to improve security in a cloud computing model and thus reap the full benefits from the cloud:

1. Learn from your current internal private clouds and the security systems and processes constructed around them

Medium to large enterprises have been setting up internal clouds for the past ten years; while many of them didn’t call these clouds, most enterprises already have them. These internal clouds were often referred to as shared services, such as authentication services, database services, provisioning services or enterprise data centers.

2. Assess the importance and risk of your multiple IT-enabled business processes

Although the potential cost savings resulting from a transition into the cloud can be calculated rather easily, conducting a “risk vs. reward” calculation is difficult without a basic understanding of the risk side of the equation. Because risk is entirely dependent on the context of the business process, cloud providers cannot conduct this analysis for enterprises. The obvious first candidates for the cloud are low Service-Level Agreement (SLA) applications with relatively high cost. Potential regulatory impacts need to be considered as well, because regulators do not allow some data and services to move off-site or out of the state or country.
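
As a back-of-the-envelope illustration of such a screen, the Python sketch below scores hypothetical business processes as early cloud candidates by weighing run cost against SLA criticality and excluding regulated data outright. The processes, figures and weighting are invented assumptions, not a Nubifer methodology.

```python
# Hypothetical "risk vs. reward" screen for early cloud candidates.
# All processes, costs and thresholds below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class BusinessProcess:
    name: str
    annual_run_cost: float  # current in-house cost (USD)
    sla_criticality: int    # 1 (low) .. 5 (mission-critical)
    regulated_data: bool    # subject to data-residency or regulatory limits

def cloud_candidate_score(p: BusinessProcess) -> float:
    """Higher score = better early candidate (high cost, low SLA risk)."""
    if p.regulated_data:
        return 0.0  # regulators may forbid moving this off-site at all
    return p.annual_run_cost / p.sla_criticality

processes = [
    BusinessProcess("Marketing microsites", 120_000, 1, False),
    BusinessProcess("Core accounting ledger", 300_000, 5, True),
    BusinessProcess("Internal wiki/collaboration", 80_000, 2, False),
]

for p in sorted(processes, key=cloud_candidate_score, reverse=True):
    print(f"{p.name}: score {cloud_candidate_score(p):,.0f}")
```

Even a crude screen like this makes the “low SLA, high cost” candidates visible before any conversation with a provider begins.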

3. Analyze different cloud models and categories

There are general differences between different cloud models (public, private, hybrid) and cloud categories (SaaS, PaaS, IaaS) that directly relate to security control and responsibility; thus, enterprises need to analyze both.

Enterprises must have both an opinion and a policy for these cloud approaches within the context of their organizations and the risk profile of their own businesses.

4. Apply your Service-Oriented Architecture (SOA) design and security principles to the cloud

The cloud can be seen as an expansion of SOA, as most organizations have been using SOA principles in their application development organizations for several years. In this way, the cloud is service orientation taken to its next logical step. The SOA security principles of highly distributed security enforcement, combined with centralized security policy administration and decision making, apply directly to the cloud. These principles can simply be transferred to the cloud rather than reinventing the system when switching your focus from SOA to the cloud.
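
As a rough sketch of that principle (centralized decisions, distributed enforcement), the following Python example models a single Policy Decision Point consulted by Policy Enforcement Points embedded in individual services. The roles, actions and policy entries are invented for illustration and are not drawn from any particular product.

```python
# Minimal sketch of centralized policy decisions with distributed enforcement:
# a Policy Decision Point (PDP) and per-service Policy Enforcement Points (PEPs).
# All roles, actions and resources below are illustrative assumptions.

POLICIES = {
    # (role, action, resource) -> allowed?
    ("developer", "deploy", "test-environment"): True,
    ("developer", "deploy", "production"): False,
    ("release-manager", "deploy", "production"): True,
}

def pdp_decide(role: str, action: str, resource: str) -> bool:
    """Centralized decision: every enforcement point defers to this function."""
    return POLICIES.get((role, action, resource), False)  # default deny

def enforcement_point(service: str, role: str, action: str, resource: str) -> None:
    """A PEP embedded in each service (or cloud endpoint) enforces the decision."""
    if pdp_decide(role, action, resource):
        print(f"[{service}] {role} may {action} {resource}")
    else:
        print(f"[{service}] DENIED: {role} may not {action} {resource}")

enforcement_point("ci-service", "developer", "deploy", "test-environment")
enforcement_point("ci-service", "developer", "deploy", "production")
```

The design point is that policy lives in one administered place while enforcement is pushed out to every service, which is exactly the posture that transfers cleanly from SOA to cloud deployments.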

5. Think like a cloud provider

Rather than thinking of your enterprise only as a cloud consumer, think as a cloud provider. Your organization is part of a value chain in which you supply services to your customers and partners. If you can calibrate the risk/reward balance so that you profitably consume cloud services, you can apply that way of thinking to guide your entry as a cloud provider within your ecosystem. This will in turn help your organization better comprehend what is happening within the realm of cloud providers.

6. Get to know and start using Web security standards sooner rather than later

The Web security industry has been working on securing and managing cross-domain systems for quite some time, and useful security standards for securing cloud services have emerged as a result. These standards–which include Security Assertion Markup Language (SAML), Service Provisioning Markup Language (SPML), Extensible Access Control Markup Language (XACML) and Web Services Security (WS-Security)–must be adopted for security systems to be effective in the increasingly cloud-connected world.
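
To make one of these standards concrete, here is a minimal, illustrative Python sketch that reads the issuer and subject out of a simplified, unsigned SAML 2.0 assertion. The assertion XML, names and addresses are invented for illustration; a real deployment would verify XML signatures and validity conditions with a dedicated SAML library, which this sketch omits.

```python
# Illustrative only: reading the subject out of a simplified, UNSIGNED SAML 2.0
# assertion with Python's standard library. Signature and condition checks,
# mandatory in production, are deliberately omitted here.

import xml.etree.ElementTree as ET

SAML_NS = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}

assertion_xml = """
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion"
                ID="_example" IssueInstant="2010-06-01T12:00:00Z" Version="2.0">
  <saml:Issuer>https://idp.example.com</saml:Issuer>
  <saml:Subject>
    <saml:NameID>jane.doe@example.com</saml:NameID>
  </saml:Subject>
</saml:Assertion>
"""

root = ET.fromstring(assertion_xml)
issuer = root.find("saml:Issuer", SAML_NS)
name_id = root.find("saml:Subject/saml:NameID", SAML_NS)
print("Issuer:", issuer.text)    # who vouches for the user
print("Subject:", name_id.text)  # the cross-domain identity being asserted
```

The value of standards like SAML is visible even in this toy: the asserting party and the asserted identity travel in a format any compliant provider can consume, which is what makes cross-domain (and therefore cloud) security manageable.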

Ensuring that security professionals are viewed as rational advocates of the cloud is an important requirement for enterprises when it comes to improving the security of cloud services. When properly balanced and business-driven, technologists can serve as positive forces in the risk/reward dialogue and help improve cloud security for their enterprise. To learn more about Cloud Security please visit Nubifer.com.

Security in the Cloud

One major concern has loomed over companies considering a transition into the cloud: security. The “S” word has affected the cloud more than other types of hosted environments, but most concerns about security are not based on reality.

Three factors about cloud security:

1. Cloud security is almost identical to internal security, and the security tools used to protect your data in the cloud are the same ones you use each day. The only difference is that the cloud is a multi-tenant environment, with multiple companies sharing the same cloud service provider.

2. Security issues within the cloud can be addressed with the very same security tools you currently have in place. While security tools are important, they should not be perceived as a hindrance when making the transition into the cloud. Over time, the commodity nature of IT will require that you transition your technologies to the cloud in order to remain financially competitive. This is why it is important to start addressing security measures now in order to prepare for the future.

3. As long as you choose a quality cloud provider, your security within the cloud will be as good as—perhaps even better than—your current security. The level of security within the cloud is designed for the riskiest client in the cloud, and thus you receive that same level of security whatever your level of risk.

Internal or External IT?

Prior to asking questions about security within the cloud, you need to ask what exactly should move into the cloud in the first place; commodity technologies are the natural candidates. Back when companies first began taking advantage of IT, the initial businesses to computerize their processes had significant gains over competitors. As the IT field grew, however, the initial competitive benefits of computerization began to wane, and computerization became a requirement simply to remain relevant. As such, an increasing amount of IT operates as a commodity.

Cloud computing essentially allows businesses to offload commodity technologies, freeing up resources and time to concentrate on the core business. For example, a company manufacturing paper products requires a certain amount of IT to run its business and make it competitive. The company also runs a large quantity of commodity IT; this commodity technology takes time, money, energy and people away from the company’s business of producing paper products at a price that rivals competitors. This is where cloud computing comes in.

A commodity IT analysis helps you determine which parts of your IT can be moved externally: list all of the functions your IT organization performs and decide whether you think of each activity as a commodity or not, as in the sketch below.
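
A minimal version of such an analysis might look like the following Python fragment; the functions listed and their commodity-versus-core labels are invented for illustration, not a prescribed inventory.

```python
# Sketch of a commodity IT analysis: list each function the IT organization
# performs and flag whether it differentiates the business. The example
# functions and labels below are illustrative assumptions.

it_functions = {
    "Email and calendaring": "commodity",
    "Payroll processing": "commodity",
    "Proprietary paper-mill scheduling system": "core",
    "File sharing and online meetings": "commodity",
    "Customer pricing engine": "core",
}

cloud_candidates = [f for f, kind in it_functions.items() if kind == "commodity"]
keep_in_house = [f for f, kind in it_functions.items() if kind == "core"]

print("Candidates to move to the cloud:", cloud_candidates)
print("Keep in-house (differentiating):", keep_in_house)
```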

Internal IT Security

Some think that internal IT no longer helps businesses set themselves apart from other businesses. This devaluing of IT leads many companies to underfund the budgets required to operate a first-class IT infrastructure. In addition, an increasing number of security mandates from external and internal sources means that IT can’t always fund and operate security as required.

Another problem involves specialization and its effect on business function, as businesses exist as specialized entities. IT faces a problem when it comes to funding and maintaining a non-core part of the business. For example, an automotive maker avoids starting a food production company, even though it could feed its employees that way, because that is not its core business. Likewise, it is unlikely that the automotive manufacturer’s IT department will be as successful as its manufacturing business. On balance, a business with IT as its only product line or service should be more successful at providing IT. So if the automotive maker isn’t going to operate as a best-in-class IT business, why would its security be expected to be best-in-class? A company with IT as its business is the best choice for securing your data, because the quality of its product and its market success depend on its security being effective.

Factors to consider when picking a cloud provider:

Like internal IT, cloud providers face internal and external threats that can be accepted or mitigated, and these challenges are all manageable:

Security assessment: Most organizations relax their level of security over time; to combat this, the cloud provider must perform regular security assessments. The resulting security report must be given to each client immediately after the assessment is performed so the client knows the current state of their security in the cloud.

Multi-tenancy: The cloud provider should design its security to ensure that it meets the needs of its higher-risk clients, and in turn all clients will reap the rewards of this.

Shared Risk: In many instances the cloud service provider will not be the cloud operator; the cloud service provider may instead offer a value-added service on top of another cloud provider’s service. Take a Software-as-a-Service provider, for example. The SaaS provider needs infrastructure, and it may make more sense to get that infrastructure from an Infrastructure-as-a-Service provider than to build it on its own. Within this kind of multi-tier arrangement, the risk of security issues is shared because it affects all parties involved at various layers. The architecture used by the underlying cloud provider must be addressed and that information taken into account when assessing the client’s total risk mitigation plan.

Distributed Data Centers: Because providers can offer a geographically distributed environment, a cloud computing environment should be less prone to disasters, at least in theory. In reality, many organizations sign up for cloud computing services that are not geographically distributed, so they should require that their provider have a working and regularly tested disaster recovery plan (including SLAs).

Staff Security Screening: As with other types of organizations, contractors are often hired to work for cloud providers, and these contractors should be subject to a full background investigation.

Physical Security: When choosing a cloud provider, physical external threats should be analyzed carefully. Some important questions to ask are: Do all of the cloud provider’s facilities have the same levels of security? Is your organization being shown the most secure facility with no guarantee that your data will actually reside there?

Policies: Cloud providers are not immune to data leaks or security incidents, which is why they need to have incident response policies and procedures for each client that feed into their overall incident response plan.

Data Leakage: From a security standpoint, data leakage is one of the greatest organizational risks. As such, the cloud provider must be able to map its policies to the security mandates you must comply with and discuss the issues at hand.

Coding: The in-house software used by all cloud providers may contain application bugs. For this reason, each client should make sure that the cloud provider follows secure coding practices. All code should additionally be written using a standard methodology that is documented and can be demonstrated to the customer.

In conclusion, security remains a major concern, but it is important to understand that the technology used to secure your organization within the cloud isn’t untested or new. Security within the cloud represents the logical progression of outsourcing commodity services to some of the same IT providers you have been confidently using for years. Moving IT elements into the cloud is simply a natural step in the overall IT evolution. Visit nubifer.com for more information regarding the ever-changing environment of Cloud security.

Welcome to Nubifer Cloud Computing blogs

Here, we share blogs, research, tutorials and opinions about the ever-changing and emerging arena of cloud computing, software-as-a-service, platform-as-a-service, hosting-as-a-service, and user-interface-as-a-service. We also share key concepts focused on interoperability while always maintaining an agnostic viewpoint of technologies and services offered by the top cloud platform providers. For more information, please visit Nubifer.com.