Archive for the ‘Uncategorized’ Category

2012 in review

The WordPress.com stats helper monkeys prepared a 2012 annual report for this blog.

Here’s an excerpt:

600 people reached the top of Mt. Everest in 2012. This blog got about 4,000 views in 2012. If every person who reached the top of Mt. Everest viewed this blog, it would have taken 7 years to get that many views.

Click here to see the complete report.


Cloud Computing in 2012 (continued) – Automation of Cloud Resources

With the emergence of cloud computing, the ability to spin up new resources via an API and quickly deploy new virtual machine instances has become one of the more popular paradigms. Resource automation is now an important part of many companies’ business initiatives. Cloud-hosted applications can be deployed, provisioned and scaled with new instances on an on-demand basis. These computing resources can be created and utilized in very short order, usually in a matter of minutes and in some cases a matter of seconds.
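As a minimal sketch of this kind of API-driven provisioning, the example below uses the AWS SDK for Python (boto3), one of several possible tool kits, to launch a single virtual machine instance. The AMI ID, key pair name and region are placeholders you would substitute with your own, and credentials are assumed to already be configured locally.

```python
# Minimal sketch: launch a virtual machine instance via an API call.
# The AMI ID, key pair name and region below are placeholders.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

instances = ec2.create_instances(
    ImageId="ami-00000000",      # placeholder AMI
    InstanceType="t2.micro",
    KeyName="my-key-pair",       # placeholder key pair
    MinCount=1,
    MaxCount=1,
)

print("Launched instance:", instances[0].id)
```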

High Availability, Configurability and Programmability make Cloud Automation a key driver for adoption

The dynamic nature of cloud automation enables companies to provision and de-provision cloud resources on the fly as threshold metrics are met. Those metrics are based on application and environment variables such as increased traffic load, additional users, environment changes on the systems running queries, additional strain on the application platform during data mining or migration processes, and heavy peak-load, data-crunching scenarios imposed at run time.

An important point to note is that the rapid adoption of and move toward multicore servers strengthens the impact of dynamic, automated virtualization. Overall, the fulcrum points of cloud automation are increased billing efficiency and the ease of provisioning the highly available computing power required during peak loads on business applications hosted in the cloud.

Benefits of not having to purchase and provision new servers

By embracing cloud technologies, companies are presented with opportunities to scale their offerings to their clients and serve their business initiatives, all while gaining the ability to scale dynamically. The competitive advantages can be seen in hundreds of articles and white papers written by companies that were either start-ups building a new application or enterprises migrating an existing data center application to the cloud, and that were able to get to market more quickly and expand or contract their platform hardware requirements in real time as the demand on their systems changed.

Amazon and Windows Azure® Cloud Automation Features

There are numerous cloud providers to compare and contrast for cloud automation, so we have put together some key points about Amazon Web Services and Windows Azure. Amazon offers Auto Scaling, which the company describes here: http://aws.amazon.com/autoscaling/. Windows Azure offers Autoscaling Application Block information here: http://www.windowsazure.com/en-us/develop/net/how-to-guides/autoscaling/

Amazon CloudWatch and Auto Scaling (the following content was copied from the links above)

Amazon CloudWatch provides monitoring for AWS cloud resources and the applications customers run on AWS. Developers and system administrators can use it to collect and track metrics, gain insight, and react immediately to keep their applications running smoothly. Amazon CloudWatch monitors AWS resources such as Amazon EC2 and Amazon RDS DB instances, and can also monitor custom metrics generated by a customer’s applications and services. With Amazon CloudWatch, you gain system-wide visibility into resource utilization, application performance, and operational health.

Amazon CloudWatch provides a reliable, scalable, and flexible monitoring solution that you can start using within minutes. You no longer need to set up, manage, or scale your own monitoring systems and infrastructure. Using Amazon CloudWatch, you can easily monitor as much or as little metric data as you need. Amazon CloudWatch lets you programmatically retrieve your monitoring data, view graphs, and set alarms to help you troubleshoot, spot trends, and take automated action based on the state of your cloud environment.
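As a small illustration of that programmatic access, the sketch below uses the AWS SDK for Python (boto3) to read an EC2 instance’s CPU metric and set an alarm on it. The instance ID, threshold values and SNS topic ARN are placeholders, not recommendations.

```python
# Sketch: read a CloudWatch metric and create a CPU alarm.
# Instance ID, thresholds and the SNS topic ARN are placeholders.
import boto3
from datetime import datetime, timedelta

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Retrieve average CPU utilization for one instance over the last hour.
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    StartTime=datetime.utcnow() - timedelta(hours=1),
    EndTime=datetime.utcnow(),
    Period=300,
    Statistics=["Average"],
)
for point in stats["Datapoints"]:
    print(point["Timestamp"], point["Average"])

# Alarm when average CPU stays above 80% for two 5-minute periods.
cloudwatch.put_metric_alarm(
    AlarmName="high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # placeholder topic
)
```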

Amazon Auto Scaling

Auto Scaling allows you to scale your Amazon EC2 capacity up or down automatically according to conditions you define. With Auto Scaling, you can ensure that the number of Amazon EC2 instances you’re using increases seamlessly during demand spikes to maintain performance, and decreases automatically during demand lulls to minimize costs. Auto Scaling is particularly well suited for applications that experience hourly, daily, or weekly variability in usage. Auto Scaling is enabled by Amazon CloudWatch and available at no additional charge beyond Amazon CloudWatch fees.

Features of Amazon Auto Scaling

  • Scale out Amazon EC2 instances seamlessly and automatically when demand increases.
  • Shed unneeded Amazon EC2 instances automatically and save money when demand subsides.
  • Scale dynamically based on your Amazon CloudWatch metrics, or predictably according to a schedule that you define.
  • Receive notifications via Amazon Simple Notification Service (SNS) to be alerted when you use Amazon CloudWatch alarms to initiate Auto Scaling actions, or when Auto Scaling completes an action.
  • Run On-Demand or Spot instances, including those inside your Virtual Private Cloud (VPC) or High Performance Computing (HPC) Clusters.
  • If you’re signed up for the Amazon EC2 service, you’re already registered to use Auto Scaling and can begin using the feature via the Auto Scaling APIs or Command Line Tools.
  • Auto Scaling is enabled by Amazon CloudWatch and carries no additional fees.
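As a rough sketch of how these pieces fit together, the example below uses the AWS SDK for Python (boto3) to create an Auto Scaling group, attach a scale-out policy, and trigger that policy from a CloudWatch CPU alarm. The group name, sizes, thresholds and AMI are placeholders chosen for illustration only.

```python
# Sketch: an Auto Scaling group that grows when average CPU is high.
# Names, AMI ID, availability zone and thresholds are placeholders.
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")
cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Launch configuration describing the instances the group will start.
autoscaling.create_launch_configuration(
    LaunchConfigurationName="web-tier-lc",
    ImageId="ami-00000000",          # placeholder AMI
    InstanceType="t2.micro",
)

# A group that keeps between 2 and 10 instances running.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="web-tier",
    LaunchConfigurationName="web-tier-lc",
    MinSize=2,
    MaxSize=10,
    DesiredCapacity=2,
    AvailabilityZones=["us-east-1a"],
)

# A policy that adds two instances when triggered.
policy = autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-tier",
    PolicyName="scale-out-on-cpu",
    AdjustmentType="ChangeInCapacity",
    ScalingAdjustment=2,
    Cooldown=300,
)

# Wire the policy to a CloudWatch alarm on average group CPU.
cloudwatch.put_metric_alarm(
    AlarmName="web-tier-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "AutoScalingGroupName", "Value": "web-tier"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=75.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=[policy["PolicyARN"]],
)
```

When the alarm fires, Auto Scaling adds capacity; a mirror-image alarm and policy (with a negative adjustment) would shed instances again when load subsides.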

Windows Azure® Autoscaling Application Block (the following content was copied from the links above)

The Autoscaling Application Block can automatically scale your Windows Azure application based on rules that you define specifically for your application. You can use these rules to help your Windows Azure application maintain its throughput in response to changes in its workload, while at the same time control the costs associated with hosting your application in Windows Azure.

Along with scaling by increasing or decreasing the number of role instances in your application, the block also enables you to use other scaling actions such as throttling certain functionality within your application or using custom-defined actions.

You can choose to host the block in a Windows Azure role or in an on-premises application.
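The Autoscaling Application Block itself is a .NET library configured with XML rule definitions, so the snippet below is only a conceptual sketch, written in Python for consistency with the other examples here, of what a reactive scaling rule does on each evaluation cycle. The thresholds and instance counts are made-up values, not part of the block’s actual API.

```python
# Conceptual sketch only: not the Autoscaling Application Block API,
# just the idea of a reactive rule bounded by a constraint rule.

def reactive_rule(metric_value, upper=0.8, lower=0.3,
                  current_instances=2, min_instances=2, max_instances=8):
    """Return the new role-instance count for one evaluation cycle."""
    if metric_value > upper and current_instances < max_instances:
        return current_instances + 1      # scale out
    if metric_value < lower and current_instances > min_instances:
        return current_instances - 1      # scale in
    return current_instances              # stay within constraint bounds

# Example: CPU at 85% of capacity with 2 instances -> scale to 3.
print(reactive_rule(0.85, current_instances=2))
```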

A multitude of Cloud Platform offerings brings compelling opportunity

It is clear that the cloud computing models of deployment, management and ongoing maintenance and support make a very compelling case for adopting the cloud. The largest and most innovative companies today are openly embracing this paradigm, and the return on investment is proving itself every day.

To learn more about Cloud Computing, Windows Azure®, Amazon Cloud, and other fantastic Cloud and SaaS platforms for business, contact our strategic cloud advisers by visiting http://www.nubifer.com

Compliance in the Cloud

Cloud computing seems like a simple idea, and ease of operation, deployment and licensing are its most desirable qualities. But when it comes to compliance, once you go beneath the surface you’ll discover more questions than you originally thought of.

Compliance covers a lot of issues, from government regulations to industry regulations such as PCI DSS and HIPAA. Your organization probably has internal guidelines in place, but migrating to a public cloud, a cloud application suite or something similar will mean handing the reins to the cloud vendor.

That’s a position many auditors and C-level officials find themselves in today. They want to learn how to adopt the cloud in a fashion that maintains their good standing with compliance. Here are a few tips for keeping an eye on compliance in the cloud.

Challenges to your Workload

When you survey cloud vendors, start by asking about sound practices and methods for identity and access management, data protection and incident response times. These are basic compliance requirements. Then, as you map your compliance requirements to each prospective cloud vendor’s controls, you’ll probably encounter a few cloud-specific challenges.

Multi-tenancy and de-provisioning also pose challenges. Public clouds use multi-tenancy to better provision server workloads and keep costs low. But multi-tenancy means you’re sharing server space with other organizations, so you should know what safeguards your cloud provider has in place to prevent any compromise. Depending on how critical your data is, you may also want to use encryption. HIPAA, for example, requires that all user data, both in transit and at rest, be encrypted.

User de-provisioning is an issue that will become more challenging as password-authentication methods grow in complexity and volume. Federated identity management schemes will make it easier for users to log on to multiple clouds, and that will make de-provisioning much trickier.

Ever-Changing Standards

Like it or not, you’re an early adopter. Your decisions about what applications to move to the cloud and when to move them will benefit from an understanding of new and/or modified standards that are now evolving for cloud computing.

Today you can look for SAS 70 Type II and ISO 27001 certifications for general compliance with controls for financial and information security typically required by government and industry regulations, but these don’t guarantee that your company’s processes will comply.

Bringing visibility to users is a major goal of the Cloud Security Alliance, a three-year-old organization fast gaining popularity among users, auditors and service providers. A major goal of the CSA is development of standardized auditing frameworks to facilitate communication between users and cloud vendors.

Well underway, for example, is a governance, risk and compliance (GRC) standards suite, or stack, with four main elements: the Cloud Trust Protocol, Cloud Audit, Consensus Assessments Initiative and the Cloud Controls Matrix. The Cloud Controls Matrix includes a spreadsheet that maps basic requirements for major standards to their IT control areas, such as “Human Resources Employment Termination,” while the Consensus Assessments Initiative offers a detailed questionnaire that maps those control areas to specific questions that users and auditors can ask cloud vendors.

Efforts of the CSA and other alliances, plus those of industry groups and government agencies, are bound to produce a wealth of standards in the next several years. The CSA has formal alliances with ISO, ITU and NIST, so that its developments can be used by those groups as contributions to standards they’re working on. And a 2010 Forrester Research report counted 48 industry groups working on security-related standards in late 2010.

Importance of an SLA

Regardless of your company’s size or status, don’t assume your cloud vendor’s standard terms and conditions will fit your requirements. Start your due diligence by examining the vendor’s contract.

Your company’s size can give you leverage to negotiate, but a smaller business can find leverage, too, if it represents a new industry for a cloud vendor that wants to expand its market. In any case, don’t be afraid to negotiate.

Security

To best understand your potential risk, as well as your benefits, you should bring your security team into the conversation at the earliest possible opportunity, says Forrester.

Moving to the cloud may offer an opportunity to align security with corporate goals in a more permanent way by formalizing the risk-assessment function in a security committee. The committee can help assess risk and make budget proposals to fit your business strategy.

You should also pay attention to the security innovations coming from the numerous security services and vendor partnerships now growing up around the cloud.

For more information regarding compliance and security in the Cloud, contact a Nubifer representative today.

Cisco, Verizon and Novell Make Announcements about Plans to Secure the Cloud

Cisco Systems, Verizon Business and Novell have announced plans to launch offerings designed to heighten security in the cloud.

On April 28, Cisco announced security services based around email and the Internet that are part of the company’s cloud protection push and its Secure Borderless Network architecture, which seeks to give users secure access to their corporate resources on any device, anywhere, at any time.

Cisco’s IronPort Email Data Loss Prevention and Encryption, and ScanSafe Web Intelligence Reporting are designed to work with Cisco’s other web security solutions to grant companies more flexibility when it comes to their security offerings while streamlining management requirements, increasing visibility and lowering costs.

Verizon and Novell made an announcement on April 28 about their plans to collaborate on an on-demand identity and access management service called Secure Access Services from Verizon. The service is designed to enable enterprises to decide and manage who is granted access to cloud-based resources. According to the companies, the identity-as-a-service solution is the first of what will be a host of joint offerings between Verizon and Novell.

According to eWeek, studies continuously indicate that businesses are likely to continue trending toward a cloud-computing environment. With that said, issues concerning security and access control remain key concerns. Officials from Cisco, Verizon and Novell say that the new services will allow businesses to feel more at ease while planning their cloud computing strategies.

“The cloud is a critical component of Cisco’s architectural approach, including its Secure Borderless Network architecture,” said vice president and general manager of Cisco’s Security technology business unit Tom Gillis in a statement. “Securing the cloud is highly challenging. But it is one of the top challenges that the industry must rise to meet as enterprises increasingly demand the flexibility, accessibility and ease of management that cloud-based applications offer for their mobile and distributed workforces.”

Cisco purchased ScanSafe in December 2009, and the result is Cisco’s ScanSafe Web Intelligence Reporting platform. The platform is designed to give users a better idea of how their Internet resources are being used, with the objective of ensuring that business-critical workloads aren’t being encumbered by non-business-related traffic. It can report on user-level data and Web communications activity within seconds, and offers over 80 predefined reports.

Designed to protect outbound email in the cloud, the IronPort email protection solution is well suited to enterprises that don’t want to manage their own email infrastructure. Cisco officials say that it provides hosted mailboxes (while letting enterprises keep control of email policies) and also offers the option of integrated encryption.

Officials say Cisco operates more than 30 data centers around the globe and that its security offerings handle large quantities of activity each day, including 2.8 billion reputation look-ups, 2.5 billion web requests and the detection of more than 250 billion spam messages. These services are the latest in the company’s expanding portfolio of cloud security offerings.

Verizon and Novell’s collaboration, the Secure Access Services, is designed to enable enterprises to move away from the cost and complexity associated with using traditional premises-based identity and access management software for securing applications. These new services offer centralized management of web access to applications and networks in addition to identity federation and web single sign-on.

Novell CEO Ron Hovsepian released a statement saying, “Security and identity management are critical to accelerating cloud computing adoption and by teaming with Verizon we can deliver these important solutions.” While Verizon brings the security expertise, infrastructure, management capabilities and portal to the service, Novell provides the identity and security software. For more information contact a Nubifer representative today.

New Cloud-Focused Linux Flavor: Peppermint

A new cloud-focused Linux flavor is in town: Peppermint. The Peppermint OS is currently a small, private beta which will open up to more testers in early to late May. Aimed at the cloud, the Peppermint OS is described on its home page as: “Cloud/Web application-centric, sleek, user friendly and insanely fast! Peppermint was designed for enhanced mobility, efficiency and ease of use. While other operating systems are taking 10 minutes to load, you are already connected, communicating and getting things done. And, unlike other operating systems, Peppermint is ready to use out of the box.”

The Peppermint team announced the closed beta of the new operating system in a blog post on April 14, saying that the operating system is “designed specifically for mobility.” The description of the technology on Launchpad describes Peppermint as “a fork of Lubuntu with an emphasis on cloud apps and using many configuration files sourced from Linux Mint. Peppermint uses Mozilla Prism to create single site browsers for easily accessing many popular Web applications outside of the primary browser. Peppermint uses the LXDE desktop environment and focuses on being easy for new Linux users to find their way around in.”

Lubuntu is described by the Lubuntu project as a lighter, faster and energy-saving modification of Ubuntu using LXDE (the Lightweight X11 Desktop Environment). Kendall Weaver and Shane Remington, a pair of developers in North Carolina, make up the core Peppermint team. Weaver is the maintainer for the Linux Mint Fluxbox and LXDE editions as well as the lead software developer for Astral IX Media in Asheville, NC and the director of operations for Western Carolina Produce in Hendersonville, NC. Based in Asheville, NC, Remington is the project manager and lead Web developer for Astral IX Media and, according to the Peppermint site, “provides the Peppermint OS project support with Web development, marketing, social network integration and product development.” For more information please visit Nubifer.com.

Using Business Service Management to Manage Private Clouds

Cloud computing promises an entirely new level of flexibility through pay-as-you-go, readily accessible, infinitely scalable IT services, and executives in companies of all sizes are embracing the model. At the same time, they are also posing questions about the risks associated with moving mission-critical workloads and sensitive data into the cloud. eWEEK’s Knowledge Center contributor Richard Whitehead has four suggestions for managing private clouds using service-level agreements and business service management technologies.

“Private clouds” are what the industry is calling hybrid cloud computing models which offer some of the benefits of cloud computing without some of the drawbacks that have been highlighted. These private clouds host all of the company’s internal data and applications while giving the user more flexibility over how service is rendered. The transition to private clouds is part of the larger evolution of the data center, which makes the move from a basic warehouse of information to a more agile, smarter deliverer of services. While virtualization helps companies save on everything from real estate to power and cooling costs, it does pose the challenge of managing all of the physical and virtual servers—or virtual sprawl. Basically, it is harder to manage entities when you cannot physically see and touch them.

A more practical move into the cloud can be facilitated through technology, with private clouds being managed through the use of service-level agreements (SLAs) and business service management (BSM) technologies. The following guide outlines a continuous methodology for bringing new capabilities into an IT department within a private cloud network. Its four steps will give IT the tools and knowledge to overcome common cloud concerns and experience the benefits that a private cloud provides.

Step 1: Prepare

Before looking at alternative computing processes, an IT department must first logically evaluate its current computing assets and ask the following questions. What is the mixture of physical and virtual assets? (The word asset is used because this process should examine the business value delivered by IT.) How are those assets currently performing?

Rather than thinking in terms of server space and bandwidth, IT departments should ask: will this private cloud migration increase sales or streamline distribution? This approach positions IT as a resource rather than as a line item within an organization. Your private cloud migration will never take off if your resources aren’t presented in terms of assets and ROI.

Step 2: Package

Package refers to resources and requires a new set of measurement tools. IT shops are beginning to think in terms of packaging “workloads” in the virtualized world as opposed to running applications on physical servers. Workloads are self-contained units of work or service built through the integration of a JeOS (“just enough” operating system), middleware and the application. They are portable and can be moved across environments ranging from physical and virtual to cloud and heterogeneous.

A business service is a group of workloads, and this represents a fundamental shift from managing physical servers and applications to managing business services composed of portable workloads that can be mixed and matched in whatever way best serves the business. Managing IT to business services (aka the service-driven data center) is becoming a business best practice and allows the IT department to price and validate its private cloud plan as such.
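To make the vocabulary concrete, here is a minimal, hypothetical sketch of a business service modeled as a group of portable workloads. The class names, fields and figures are illustrative only and do not correspond to any particular BSM product.

```python
# Illustrative data model: portable "workloads" grouped into a
# "business service". All names and numbers are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Workload:
    name: str
    environment: str      # "physical", "virtual" or "cloud"
    monthly_cost: float

@dataclass
class BusinessService:
    name: str
    sla_uptime: float                     # e.g. 0.999 for "three nines"
    workloads: list = field(default_factory=list)

    def monthly_cost(self) -> float:
        return sum(w.monthly_cost for w in self.workloads)

# An order-processing service built from three portable workloads.
orders = BusinessService("order-processing", sla_uptime=0.999, workloads=[
    Workload("web-frontend", "cloud", 420.0),
    Workload("app-middleware", "virtual", 310.0),
    Workload("orders-db", "physical", 880.0),
])
print(orders.monthly_cost())   # 1610.0
```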

Step 3: Price

A valuation must be assigned to each IT unit after you’ve packaged up your IT processes into workloads and services. How much does it cost to run the service? How much will it cost if the service goes offline? The analysis should be framed around how these costs affect the business owner, because the cost assessments are driven by business need.
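As a hypothetical illustration of this kind of valuation, the short calculation below compares the monthly cost of running a service with the estimated cost of its downtime; every figure is invented for the example.

```python
# Hypothetical figures for illustration only.
run_cost_per_month = 1610.0        # infrastructure and licensing for the service
revenue_per_hour = 2500.0          # sales processed through the service
downtime_hours_per_month = 1.5     # expected outage time without redundancy

downtime_cost = revenue_per_hour * downtime_hours_per_month
print(f"Run cost:      ${run_cost_per_month:,.2f}/month")
print(f"Downtime cost: ${downtime_cost:,.2f}/month")
# If downtime cost rivals the run cost, the service likely justifies
# backup and disaster recovery support.
```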

One of the major advantages of a service-driven data center is that business services can be dynamically managed to SLAs and moved around appropriately. This allows companies to attach processes to services by connecting workloads to virtual services and, for the first time, connects a business process to the hardware implementing that business process.

Business services can be managed independently of the hardware because they aren’t tied to a particular physical server and can thus be moved around on an as-needed basis.

Price depends on the criticality of the service, what resources it will consume and whether it warrants backup and/or disaster recovery support. This is a level of transparency not usually disclosed by IT, and transparency in a cloud migration plan is a crucial part of demonstrating, cost-effectively, the value the cloud provides.

Step 4: Present

After you have an IT service package, you must present a unified catalog to the consumers of those services. This catalog must be visible to all relevant stakeholders within the organization and can be considered an IT storefront or showcase featuring various options and directions for your private cloud to demonstrate value to the company.

This presentation allows your organization the flexibility to balance IT and business needs for a private cloud architecture that works for all parties; the transparency gives customers a way to interact directly with IT.

Summary

Although cloud computing remains an intimidating and abstract concept for many companies, enterprises can still start taking steps toward extending into the cloud by adopting private clouds. With a service-driven data center, an organization can achieve a private cloud that is virtualized, workload-based and managed in terms of business services. Workloads are managed dynamically in order to meet business SLAs. The progression from physical server to virtualization to workload to business service to business service management is clear and logical.

In order to ensure that your private cloud is managed effectively, thus providing optimum visibility into the cloud’s business value, it is important to evaluate and present your cloud migration in this way. Cloud investment can seem less daunting when viewed as a continuous process, and the transition can be made in small steps, which makes the value a private cloud provides to a business more easily recognizable to stakeholders. For more information, visit Nubifer.com.

Microsoft and Intuit Pair Up to Push Cloud Apps

Despite being competitors, Microsoft and Intuit announced plans to pair up to encourage small businesses to develop cloud apps for the Windows Azure platform in early January 2010.

Intuit is offering a free, beta software development kit (SDK) for Azure and citing Azure as a “preferred platform” for cloud app deployment on the Intuit Partner Platform as part of its collaboration with Microsoft. This marriage opens up the Microsoft partner network to Intuit’s platform and also grants developers on the Intuit cloud platform access to Azure and its tool kit.

As a result of this collaboration, developers will be encouraged to use Azure to make software applications that integrate with Intuit’s massively popular bookkeeping program, QuickBooks. The companies announced that the tools will be made available to Intuit partners via the Intuit App Center.

Microsoft will make parts of its Business Productivity Online Suite (such as Exchange Online, SharePoint Online, Office Live Meeting and Office Communications Online) available for purchase via the Intuit App Center as well.

The agreement occurred just weeks before Microsoft began monetizing the Windows Azure platform (on February 1)—when developers who had been using the Azure beta free of charge began paying for use of the platform.

According to a spokesperson for Microsoft, the Intuit beta Azure SDK will remain free, with the timing for stripping the beta tag “unclear.”

Designed to automatically manage and scale applications hosted on Microsoft’s public cloud, Azure is Microsoft’s latest platform-as-a-service offering and will compete with similar offerings such as Force.com and Google App Engine. Contact a Nubifer representative to see how the Intuit-Microsoft partnership can work for your business.