Author Archive

2012 in review

The WordPress.com stats helper monkeys prepared a 2012 annual report for this blog.

Here’s an excerpt:

600 people reached the top of Mt. Everest in 2012. This blog got about 4,000 views in 2012. If every person who reached the top of Mt. Everest viewed this blog, it would have taken 7 years to get that many views.

Click here to see the complete report.

Cloud Computing in 2012 (continued) – Automation of Cloud Resources

With the emergence of cloud computing, the ability to spin up new resources via an API and quickly deploy new virtual machine instances has become one of the more popular paradigms. Resource automation is an important part of any company’s business initiatives. Cloud-hosted applications can be created, provisioned and deployed with new instances on an on-demand basis. These computing resources can be created and utilized in very short order, usually in a matter of minutes and in some cases a matter of seconds.
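
To make "spinning up a resource via an API" concrete, here is a minimal sketch using the AWS SDK for Python (boto3). The AMI ID, instance type and region are placeholders, not recommendations, and the snippet assumes AWS credentials are already configured locally.

```python
# Minimal sketch: launch one virtual machine on demand via the AWS API.
# The AMI ID and instance type below are placeholders, not recommendations.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Requested instance: {instance_id}")

# When the capacity is no longer needed, release it just as quickly.
ec2.terminate_instances(InstanceIds=[instance_id])
```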

High availability, configurability and programmability make cloud automation a key driver for adoption

The dynamic nature of cloud automation enables companies to provision and de-provision cloud resources on the fly as they hit threshold metrics based on application environment variables, such as increased traffic load, additional users, environment changes on the systems running queries, additional strain on the application platform during data-mining or migration processes, and heavy peak-load traffic from data-crunching scenarios at run time.

An important point to note is that the rapid adoption of and move toward multicore servers strengthens the impact of dynamic, automated virtualization. Overall, the fulcrum points of cloud automation are increased efficiency as it pertains to billing, and the ease of implementing the highly available computing power required during peak loads of your business applications hosted across the clouds.

Benefits of not having to purchase and provision new servers

By embracing cloud technologies, companies are presented with opportunities to scale their offerings to their clients and serve their business initiatives, all while gaining the ability to scale dynamically. The competitive advantages can be seen in hundreds of articles and white papers written by companies that, whether starting up, building a new application, or migrating an existing data center application to the cloud, were enabled to get to market quicker and to expand or contract their platform hardware requirements in real time as the demands on their systems changed.

Amazon and Windows Azure® Cloud Automation Features

There are numerous cloud providers to compare and contrast for cloud automation, so we have put together some key points about Amazon Web Services and Windows Azure. Amazon offers Auto Scaling, which the company describes here: http://aws.amazon.com/autoscaling/. Windows Azure offers information on the Autoscaling Application Block here: http://www.windowsazure.com/en-us/develop/net/how-to-guides/autoscaling/

Amazon CloudWatch and Auto Scaling (the following content was copied from the links above)

Amazon CloudWatch provides monitoring for AWS cloud resources and the applications customers run on AWS. Developers and system administrators can use it to collect and track metrics, gain insight, and react immediately to keep their applications running smoothly. Amazon CloudWatch monitors AWS resources such as Amazon EC2 and Amazon RDS DB instances, and can also monitor custom metrics generated by a customer’s applications and services. With Amazon CloudWatch, you gain system-wide visibility into resource utilization, application performance, and operational health.

Amazon CloudWatch provides a reliable, scalable, and flexible monitoring solution that you can start using within minutes. You no longer need to set up, manage, or scale your own monitoring systems and infrastructure. Using Amazon CloudWatch, you can easily monitor as much or as little metric data as you need. Amazon CloudWatch lets you programmatically retrieve your monitoring data, view graphs, and set alarms to help you troubleshoot, spot trends, and take automated action based on the state of your cloud environment.
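
To illustrate the custom-metrics capability described above, here is a brief sketch using the AWS SDK for Python (boto3) that publishes an application-defined metric and then reads back its recent average. The namespace and metric name are illustrative assumptions only.

```python
# Sketch: publish a custom application metric to CloudWatch and read it back.
# The namespace "MyApp" and metric "QueueDepth" are hypothetical examples.
import datetime
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Push one data point for a custom metric.
cloudwatch.put_metric_data(
    Namespace="MyApp",
    MetricData=[{"MetricName": "QueueDepth", "Value": 42.0, "Unit": "Count"}],
)

# Retrieve the average over the last hour, in 5-minute periods.
now = datetime.datetime.utcnow()
stats = cloudwatch.get_metric_statistics(
    Namespace="MyApp",
    MetricName="QueueDepth",
    StartTime=now - datetime.timedelta(hours=1),
    EndTime=now,
    Period=300,
    Statistics=["Average"],
)
for point in stats["Datapoints"]:
    print(point["Timestamp"], point["Average"])
```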

Amazon Auto Scaling

Auto Scaling allows you to scale your Amazon EC2 capacity up or down automatically according to conditions you define. With Auto Scaling, you can ensure that the number of Amazon EC2 instances you’re using increases seamlessly during demand spikes to maintain performance, and decreases automatically during demand lulls to minimize costs. Auto Scaling is particularly well suited for applications that experience hourly, daily, or weekly variability in usage. Auto Scaling is enabled by Amazon CloudWatch and available at no additional charge beyond Amazon CloudWatch fees.

Features of Amazon Auto Scaling

  • Scale out Amazon EC2 instances seamlessly and automatically when demand increases.
  • Shed unneeded Amazon EC2 instances automatically and save money when demand subsides.
  • Scale dynamically based on your Amazon CloudWatch metrics, or predictably according to a schedule that you define.
  • Receive notifications via Amazon Simple Notification Service (SNS) to be alerted when you use Amazon CloudWatch alarms to initiate Auto Scaling actions, or when Auto Scaling completes an action.
  • Run On-Demand or Spot instances, including those inside your Virtual Private Cloud (VPC) or High Performance Computing (HPC) Clusters.
  • If you’re signed up for the Amazon EC2 service, you’re already registered to use Auto Scaling and can begin using the feature via the Auto Scaling APIs or Command Line Tools.
  • Auto Scaling is enabled by Amazon CloudWatch and carries no additional fees.
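
The sketch below shows, in hedged form, how the pieces described above fit together using the AWS SDK for Python (boto3): a launch configuration, an Auto Scaling group, a scale-out policy, and a CloudWatch alarm that triggers it. All names, sizes and thresholds are illustrative assumptions, not recommendations.

```python
# Sketch: wire CloudWatch and Auto Scaling together so capacity grows
# automatically under load. All identifiers and thresholds are hypothetical.
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")
cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Template describing the instances the group should launch.
autoscaling.create_launch_configuration(
    LaunchConfigurationName="web-lc",
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="t2.micro",
)

# The elastic pool: never fewer than 2, never more than 10 instances.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="web-asg",
    LaunchConfigurationName="web-lc",
    MinSize=2,
    MaxSize=10,
    AvailabilityZones=["us-east-1a", "us-east-1b"],
)

# Policy: add two instances whenever the policy is triggered.
policy = autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-asg",
    PolicyName="scale-out-on-cpu",
    AdjustmentType="ChangeInCapacity",
    ScalingAdjustment=2,
)

# Alarm: fire the policy when average CPU stays above 70% for 10 minutes.
cloudwatch.put_metric_alarm(
    AlarmName="web-asg-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "AutoScalingGroupName", "Value": "web-asg"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=70.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=[policy["PolicyARN"]],
)
```

A matching scale-in policy and a low-CPU alarm would complete the loop, shedding instances when demand subsides.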

Windows Azure® Autoscaling Application Block (the following content was copied from the links above)

The Autoscaling Application Block can automatically scale your Windows Azure application based on rules that you define specifically for your application. You can use these rules to help your Windows Azure application maintain its throughput in response to changes in its workload, while at the same time control the costs associated with hosting your application in Windows Azure.

Along with scaling by increasing or decreasing the number of role instances in your application, the block also enables you to use other scaling actions such as throttling certain functionality within your application or using custom-defined actions.

You can choose to host the block in a Windows Azure role or in an on-premises application.

A multitude of cloud platform offerings brings compelling opportunity

It is clear that the cloud computing models of deployment, management and ongoing maintenance and support make a very compelling argument for “why adopt the cloud”. The largest and most innovative companies today are openly embracing this paradigm of technology thought leadership, and the ROI is proving itself every day.

To learn more about Cloud Computing, Windows Azure®, Amazon Cloud, and other fantastic Cloud and SaaS platforms for business, contact our strategic cloud advisers by visiting http://www.nubifer.com

Cloud Computing in 2012 (continued) – On-Demand Elasticity

Cloud computing, at its core, offers a large set of resources that enable a concept known as elasticity. Elasticity is part of the core feature set that comprises cloud computing. The concept behind elasticity is so integral to cloud computing that Amazon Web Services decided to name the major offering in their cloud Amazon EC2 (Elastic Compute Cloud).

Elasticity is sometimes described as dynamic scaling. The ability to dynamically change resource requirements or consumption in direct response to runtime demands makes this an integral part of the cloud computing model. Most applications require a standard level of resources when operating under normal, steady-state conditions, but require additional computing resources during peak usage.

Before the advent of the cloud model, companies were required to pre-build, pre-purchase and configure sufficient capacity not just to operate properly under standard load, but also to handle extensive peak-load situations with sufficient performance. In the self-hosted model, past and present, this means companies have to over-provision and purchase additional hardware and software for their given application requirements, and it further requires engineers to try to accurately predict customer or end-user usage in peak-load scenarios.

With managed hosting, it is possible to start with a small subset of computing resources and hardware and continue to grow the resources as the application’s requirements grow. But in the managed hosting model, provisioning new hardware and software dedicated to the application’s needs can take weeks or, at larger companies, even months.

Cloud computing offers hundreds or thousands of virtualized computing resources that can be leveraged, provisioned, and released on demand in step with the application’s peak-load requirements, which makes the elastic cloud model the most powerful and convenient paradigm available to business. When businesses incorporate automation via dynamic scaling, also known as elasticity, the service-level offerings to end users increase substantially.

Our next blog will focus on virtualization in cloud computing. Please check back often, or subscribe to our blog to stay up-to-date on the latest posts and perspectives and news about cloud computing. For more information about Nubifer Cloud Computing visit www.NUBIFER.com

Cloud Computing in 2012 (continued) – Shared Resources in the Cloud

A primary characteristic of cloud computing is that the platform leverages pooled or shared assets. These computing resources can be bought, controlled externally, and used for public or private usage. As we look further into the validity of these shared computing resources, one can easily see that they are an integral component to any public or private cloud platform.

Take, for example, a business website. We begin to see standard options commonly available in today’s market. Shared hosting is one of the choices companies have had for quite some time now. The shared approach frees them from managing their own data center and, in turn, leverages a third party. Most of the time, managed hosting services lease their customers a dedicated server which is not shared with other users.

Based solely on this, cloud computing looks a lot like the shared hosting model of managed services, because the cloud platform provider is the third party that manages, operates and owns the physical computing hardware and software resources, which are distributed and shared. This, however, is where the similarities between shared or dedicated hosting and cloud computing end.

With cloud computing set aside for a moment, the move away from IT departments using self-hosted resources and toward outsourced IT services has been evolving for years. This change has substantial economic impacts, chiefly on CAPEX and OPEX. It creates an opportunity to reduce the OPEX associated with operating the hardware and software infrastructure, and the shift from CAPEX toward OPEX lowers the barrier to entry when starting a new project.

With self-hosting, companies are required to allocate funding up front for licenses and hardware purchases; operating under fixed costs, it is an out-of-pocket expense at the beginning of the project. When leveraging an outsourced offering (a.k.a. managed hosting), the upfront fees are typically equal to about one month’s operational cost, plus possibly a setup fee. Analyzed from a financial perspective, the annual cost is close to the same as, or just a little lower than, the CAPEX expense for an equivalent project, and it can be offset by the reduction in OPEX needed to manage and care for the infrastructure.

In stark comparison, when analyzing the cloud model, it is standard to see no up-front fees. With closer examination, a subscriber to cloud services can register, purchase, and be leveraging the services in much less time than it takes to read this blog.
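
To make the difference in cost structure tangible, here is a back-of-the-envelope sketch in Python. Every figure in it is an assumed, illustrative number rather than real pricing; a real comparison should use your own vendor quotes.

```python
# Back-of-the-envelope comparison of three hosting cost structures over 3 years.
# Every number below is a made-up illustration, not a real price quote.
YEARS = 3

# Self-hosted: large up-front CAPEX plus ongoing OPEX to run the gear.
self_hosted = 120_000 + 40_000 * YEARS           # hardware/licenses + ops per year

# Managed hosting: setup fee plus a fixed monthly lease.
managed = 3_000 + 3_500 * 12 * YEARS             # setup + monthly fee

# Cloud: no up-front fee; pay only for capacity actually consumed.
avg_instances = 6                                 # average fleet size, varies with load
hourly_rate = 0.25                                # assumed per-instance hourly price
cloud = avg_instances * hourly_rate * 24 * 365 * YEARS

for label, cost in [("self-hosted", self_hosted), ("managed", managed), ("cloud", cloud)]:
    print(f"{label:12s} ~${cost:,.0f} over {YEARS} years")
```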

The dramatic differences in financial expenditure you might see between these hosting models and the cloud model exist because the cost structures of cloud infrastructures are drastically more attractive than the earlier models offered to IT. On further investigation, it’s clear the economies of scale are multi-faceted and driven by the economics of volume: the largest cloud platform providers can offer IT consumers a better price point because they purchase in bulk and can therefore offer better goods and services, which in this paradigm are capacity, power, data storage, and compute processing power.

And so continues our 2012 blog series dedicated to understanding the core layers of cloud computing. Our next blog will focus on elasticity in cloud computing. Please check back often, or subscribe to our blog to stay up-to-date on the latest posts and perspectives and news about cloud computing. For more information about Nubifer Cloud Computing visit www.NUBIFER.com

Cloud Computing in 2012 – The Evolution Continued

The term ‘Cloud Computing’ is now mainstream and well known in most business sectors. To gain an understanding of how this came to be, and what all the interest and hype is about, you must recall the recent and growing beliefs among vendors and analysts that have helped to popularize and define cloud computing as the pinnacle of computing services offered by third parties, offering inexpensive computing infrastructure and software services.

Resources available for use when needed are described as “on demand” and can be scaled dynamically in direct response to the needs of the users of the software and platforms. Simply put, cloud computing marks a departure from the past of developing, maintaining, operating and managing IT infrastructure systems, bringing businesses an easier way to focus on what they do best in their own vertical.

Economic Advantage
When you look at cloud computing from an economic perspective, its adoption has the potential to provide economic benefits for businesses of all sizes, providing greater flexibility and agility in day-to-day operations. As cloud providers and industry leaders continue to refine, evolve and define cloud computing, our understanding of its costs, its value and its ongoing benefits grows every day.

Some of the areas we’ll cover in our online blogs are the main principles of cloud computing. We will also discuss the benefits of moving from traditional data-center-driven software applications and migrating to the cloud, and how evolving IT has brought us to what we now call cloud computing.

Media and News about the Cloud
Nowadays, online journals, blogs, conferences, technical books and a myriad of other information sources continue to define and disseminate information about cloud computing. However, even large mainstream technology companies and technical information websites are still learning, and educating the masses on, what ultimately brought about the “Cloud”.

In some respects, cloud computing’s entry to the World Wide Web is not new; what is new is the access that companies and people have. Furthermore, it is clear that cloud computing may also win the award for the most overhyped category, outdoing service-oriented architectures (SOA), application service providers, business intelligence and other evolving computing terms, just to name a few.

Because our blog discusses the very large-scale topic of cloud computing, we need to dive in and discuss the facets of the cloud in the greatest detail possible. It is our goal at Nubifer to cut through all the hype and share with you practical applications, frameworks, business thought leaders and approaches to leveraging the cloud for your own endeavors.

Analyze to Understand, Practice to Gain Experience

Many analysts, business users, subscribers and pundits ask themselves, “How did this new paradigm of cloud computing, and its surging popularity, come to be?” It’s easy to step back and call cloud computing a marketing approach or another series of vendors trying to play up their offerings. But with all the hype put aside, there is a large body of legitimate information and technology advancement fueling the cloud and all of the excitement behind it. The expectations and hype surrounding cloud computing are based on sound information and real opportunities aimed at improving business efficiency and profitability.

Software developers and SMBs were the primary users of the cloud in the first 24 months it was available for public use. Amazon attracted over 1 million customers when it first opened its offering to public consumption. Amazon’s own website shows that the bandwidth consumed by large companies leveraging the cloud has even surpassed that of its own online store, Amazon.com. Clearly, something is driving the rapid adoption of the cloud.

The cloud has taken on marketing popularity similar to previous paradigm shifts and offerings that evolved on the World Wide Web. Just as computing moved from traditional mainframes to client-server, and from client-server to the Internet, the cloud computing model has, and will continue to have, major implications for the future of business IT.

Principles that Help Define the Many Layers of Cloud Computing

  1. subscription-based services and resources available via pooled computing resources
  2. hardware utilization maximized through virtualized computing resources
  3. elastic software approaches that scale on demand
  4. automated virtual machine management for creating and deleting instances
  5. metered billing for the resources actually used

It is our perspective at Nubifer, that these layers of cloud computing are the main key areas of interest, and are the components necessary for something to be defined as Cloud Computing.

Our upcoming blog series will cover

  • shared or pooled resources
    available via a subscription model
  • elasticity
    on-demand dynamic scaling without capital expenditure
  • virtualization
    maximized utilization of hardware assets
  • automation
    complete provisioning, deployment, configuration, build-outs and moves, all without manual involvement
  • metered billing
    a pay-for-what-you-use, usage-based business model

And so begins our 2012 blog series dedicated to understanding the core layers of cloud computing. Please check back often, or subscribe to our blog to stay up-to-date on the latest posts and perspectives and news about cloud computing. For more information about Nubifer Cloud Computing visit http://www.NUBIFER.com

Guidelines for Cloud Consumers and Providers

Business users are drawn to the cloud. That’s not surprising, considering they tend to see mostly benefits: self-service freedom, scalability, availability, flexibility, and the pleasure of avoiding various nasty hardware and software headaches. IT leaders, though, are a different story; they are not always as ecstatic. They indicate uneasiness about cloud security and have legitimate concerns that unauthorized users could get their hands on their applications and data. Moreover, retaining a level of influence and control is a must for them. Can both “sides” meet halfway? Is it attainable to provide the freedom that users want while having the control that IT leaders need?

Simply put, yes. However, doing so will entail a collaborative effort. Both business users and IT leaders have to assume a few key responsibilities. In addition, you will have to make certain that your cloud provider is doing its part as well.

Your 5 Responsibilities

Here are a few things you need to be held accountable for:

1. Define the business need. Identify the root problem you want to solve with a cloud technology. Is it a perpetually recurring concern, or one that happens irregularly? Did you need an answer “last week,” or do you have time to construct a solution?

Important note: not all clouds are created equal. Some can run your applications unchanged, with instant access, while others require a little tweaking. Recognizing your needs and differentiating cloud technologies will help you determine the correct strategy for handling the particular business problem that needs attention.

2. Identify your application and process requirements. Once you have accurately defined your business needs, it is time to select the application best suited to meet them. Be clear and precise about the nature of the application, the development process you want to adopt, and the roles and access permissions for each user.

Your teams no longer have to struggle through traditional linear and slow development processes. Instead, the cloud can give them access to the best practices that are fluid and agile. Many self-service solutions can even empower them to run copies of the same environment in parallel.

Simply put, the cloud may lead to breakthrough productivity when used properly. However, if used incorrectly it can also lead to enormous amounts of wasted resources. Having said this, take your time to do your research and choose wisely.

3. Determine your timetable. Contrary to popular belief, cloud projects are not short sprints. They are better described as long journeys over time. Please plan accordingly.

Nubifer recommends defining your early experiments on a quarterly basis because cloud technology is transformative. Learn from the first quarter, take notes, execute the necessary adjustments, and then move on to the next. The objective is to create a learning organization that increases control over time and progresses based on data and experience.

4. Establish success factors. Define what success is for you. Do you want to improve the agility of the development process? Maybe you want to increase the availability of your applications? Or perhaps you want to enhance remote collaboration? Define achievement, and have a tool to measure progress as well. Identifying metrics and establishing realistic goals will help you achieve a solution that meets not only your needs, but also your budget and payback time frame.

5. Define data and application security. Companies overlook this critical responsibility more often than they realize. Make sure to do your due diligence and carefully determine whom you can trust with a cloud application. After that, empower them. The following are questions that need unambiguous answers: What specific roles will team members take in the cloud model? Does everyone fully comprehend the nature of the application and data they are planning to bring to the cloud? Does everyone know how to protect your data? Do they understand your password policies? Dealing with these security factors early on enables you to create a solid foundation for cloud success while giving you peace of mind about this issue.

Your Provider’s 5 Responsibilities

Meanwhile, make sure your cloud provider offers the following to attain better cloud control:

1. Self-service solutions. Time equals money, so waiting equals wasted time and money. Search for cloud applications that are ready from the get-go. Determine whether the solution you are considering can implement the applications and business process you have in mind immediately, or whether the provider requires you to rewrite the application or change the process entirely.

There is also a need to determine whether users will require training, or whether they are already equipped to handle a self-service Web interface. Answers to these questions can determine whether adoption will be rapid and smooth, or slow and bumpy.

2. Scale and speed. A well-constructed cloud solution provides the unique combination of scale and speed. It gives you access to the resources at a scale that you need with on-demand responsiveness. This combination will empower your team to run several instances in parallel, snapshot, suspend/resume, publish, collaborate, and accelerate the business cycle.

3. Reliability and availability. As articulated in the Service Level Agreements (SLAs), it is the responsibility of the cloud provider to make the system reliable and available. The provider should set clear and precise operational expectations, such as 99.9 percent availability, with you, the consumer.
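
As a quick illustration of what an availability figure in an SLA actually means, the short sketch below converts a few common availability percentages into the downtime they permit; the tiers listed are generic examples, not any particular provider’s terms.

```python
# Convert SLA availability percentages into permitted downtime.
# The tiers listed are common examples, not any provider's actual terms.
HOURS_PER_YEAR = 24 * 365

for availability in (99.0, 99.9, 99.99):
    downtime_hours = HOURS_PER_YEAR * (1 - availability / 100)
    print(f"{availability:5.2f}% availability -> "
          f"{downtime_hours:6.2f} h/year "
          f"({downtime_hours * 60 / 12:5.1f} min/month)")
```

For example, 99.9 percent availability still allows roughly 44 minutes of downtime per month, which is worth keeping in mind when reviewing a provider’s SLA.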

4. Security. Ask for a comprehensive review of your cloud provider’s security technology and processes. Specifically, ask about the following:

  • Application and data transportability. Can your provider give you the ability to export existing applications, data and processes into the cloud with ease? And can you import them back just as hassle-free?
  • Data center physical security.
  • Access and operations security. How does the provider protect its physical data centers? Are these SAS 70 Type II data centers? Are there trained and skilled data center operators at those locations?
  • Virtual data center security. Your provider must be clear about how access to the physical machines is controlled. How are these machines managed? And who is able to access these machines?
  • In terms of scale and speed, most cloud efficiency derives from how the cloud is architected. Be sure to understand how the individual pieces, the compute nodes, network nodes, storage nodes, etc., are architected and how they are secured and integrated.

  • Application and data security. In order to implement your policies, the cloud solution must permit you to define groups and roles with granular role-based access control, proper password policies, and data encryption, both in transit and at rest.
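
For readers unfamiliar with the term, the toy sketch below illustrates the idea behind granular role-based access control: permissions attach to roles, users attach to roles, and every action is checked against that mapping. It is a conceptual illustration only, not any vendor’s actual mechanism.

```python
# Toy illustration of role-based access control: users get roles,
# roles get permissions, and every action is checked against the mapping.
ROLE_PERMISSIONS = {
    "admin":     {"read", "write", "delete", "manage_users"},
    "developer": {"read", "write"},
    "auditor":   {"read"},
}

USER_ROLES = {
    "alice": {"admin"},
    "bob":   {"developer"},
    "carol": {"auditor"},
}

def is_allowed(user: str, permission: str) -> bool:
    """Return True if any of the user's roles grants the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(is_allowed("bob", "write"))     # True
print(is_allowed("carol", "delete"))  # False
```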

5. Cost efficiencies. Without any commitments upfront, cloud solutions should enable your success to drive further success. Unlike a managed service or a hosting solution, a cloud solution uses technology to automate the back-end systems and can therefore operate large resource pools without immense human costs. This translates into real cost savings for you.

Despite business leaders recognizing the benefits of cloud computing technologies, more than a handful still have questions about cloud security and control. Indeed, that is understandable. However, by adopting a collaborative approach and aligning their responsibilities with those of the cloud provider, these leaders can find solutions that offer the best of both worlds. They get the visibility and control they want and need, while giving their teams access to the huge performance gains only the cloud can provide.

Contact Nubifer for a free, no-obligation Cloud Migration consultation.

Has Your Organization Adopted a Cloud Migration Strategy?

While there has been an increasing amount of research lately indicating that many organizations will move to the cloud in the short term, there isn’t a lot of information detailing who is using it now and what they are using it for.

A published study by CDW reported that a number of enterprises are actually unaware that they are already using cloud applications and have a limited cloud adoption strategy.

It must be noted, though, that this does not mean these enterprises have no intention of moving to the cloud. It just means that these enterprises have not yet approached cloud computing strategically and have not implemented an organization-wide adoption strategy.

Cloud Computing Strategies

Another interesting note from the CDW report is the percentage of companies claiming to have an enterprise policy on the adoption of cloud computing — only 38%. This comes as a surprise, as the report also concludes that 84% of organizations have already installed, at a minimum, one cloud application.

In March 2011, more than 1,200 IT professionals were surveyed for the CDW 2011 Cloud Computing Tracking Poll, which drew some interesting conclusions. It was discovered that these enterprises are uneasy with using public clouds and would rather use private clouds.

Cloud Application Usage

However, it is necessary to examine these statistics again with more caution. As mentioned above, more than 84% of these organizations claim that they have, at a bare minimum, one cloud application, yet they still do not consider themselves cloud users.

The reason behind this discrepancy has yet to be determined. In other words, organizations are still unclear as to whether and how cloud computing can integrate with their current enterprise architecture.

This is underscored by the fact that only 42% of those surveyed are convinced that their operations and amenities have the ability to operate efficiently in the cloud. Statistics show that the applications operated in the cloud most frequently are the following:

  • Commodity applications such as email (50% of cloud users)
  • File storage (39%)
  • Web and video conferencing (36% and 32%)
  • Online learning (34%)

Developing a Cloud Strategy

Eight industries were surveyed as part of the CDW Cloud Computing Tracking Poll back in March 2011: small businesses, medium businesses, large businesses, the Federal government, State and Local governments, healthcare, higher education and K-12 public schools. The poll drew conclusions specific to each of the eight industries, and it included 150 individuals from each industry who identified themselves as knowledgeable about the current uses and future plans for cloud applications within their respective organizations.

Although there are various hurdles to consider prior to adoption, they can be divided primarily into four segments:

1. Adoption Strategy

Despite as many as 84% of organizations using at least one cloud-based application, only 25% of them have an organization-wide adoption strategy and recognize themselves as cloud users. Just over a third have a formal plan for cloud adoption.

2. ROI Considerations

Approximately 75% of respondents reported cost reductions after migrating applications to a cloud platform.

3. Security

One of the primary obstacles, if not the primary obstacle, holding both current and potential users back is security. However, quite a number of users, including those who are currently using cloud applications, have yet to realize the full potential of the security capabilities available.

4. Future spending

It is necessary for organizations to discover what future hardware and software acquisitions can be migrated into a cloud ecosystem.

Cloud Computing Now

A lot can happen in five years—this is especially true for the cloud industry. Currently, this study does not discuss in depth the difference between cloud computing and SaaS. However, it is likely that SaaS could be included in the study as it did define cloud computing as a “model for enabling convenient, on-demand access to a shared pool of configurable computing resources.”

With this in mind, along with the recent Forrester research on IT spending, it is highly likely that the data CDW has outlined will be significantly different five years from now.

According to Forrester, a record number of organizations will be investing in SaaS technologies, which, broadly, are a subset of cloud computing. The data includes a finding that 25% of the enterprises examined have adopted a new cloud technology this year, with 14% using IaaS, 8% using PaaS, and 6% using business-process-as-a-service.

Does Your Organization Have a Cloud Migration Strategy?

In the end, the research was able to provide some thought provoking data. It was able to show that many companies are already leveraging the cloud without even knowing it.

Regardless of the potential ROI and efficiency gains offered by cloud computing, a significant number of companies have yet to seize the opportunity to leverage the scalability and efficiency of modern cloud applications.

Aside from this, according to the research, many companies find themselves without a coherent company-wide strategy for dealing with cloud adoption. This is important to note because it is no secret that a lack of planning can lead to disastrous results, results that require significant financial and organizational effort to fix.

If your organization is one of those lacking a coherent and comprehensive cloud adoption strategy, contact the Cloud accelerator experts at Nubifer to help guide the way. Nubifer partners with the leading vendors in order to provide unbiased cloud application architecture diagrams, white papers, security and compliance risk analysis and migration consulting services.


Cloud Appliances for Private Clouds

Cloud computing technologies have the ability to deliver a vast array of important benefits, including the option to leverage compute and storage resources on-demand. Public clouds are the most visible form of this. But, some organizations need important applications and workloads to be operated behind their firewall.

The size of modern data sets makes it difficult to send them over the Internet to a public cloud data center. Management most likely has security concerns about data being stored in a facility outside of IT’s control. Oftentimes there are specific hardware, software, or storage requirements that cannot be met in public cloud ecosystems. In response, many organizations are leveraging private clouds.

There are two basic approaches to deployment of a private cloud environment: Build your own or purchase an appliance.

Build Your Own Private Cloud

For organizations operating their own compute, storage and network resources, one option to look into is redeploying these existing assets into a private cloud. Thanks to the trend of server consolidation, many of these machines may already be running a virtualization layer. Beginning from this point, deploying private cloud infrastructure (IBM, VMware, etc.) is a logical next step.

Building a private cloud takes more than piling software layers on top of existing resources. Unfortunately, many enterprises may not have the internal resources and expertise to take on this integration workload. This is where a consulting firm like Nubifer can play an integral role in solving these vexing problems.

The Open Source Alternative

With proprietary and trademarked technology comes the issue of being locked in to a specific vendor. In response, open-source options have evolved. Rackspace CEO Lew Moorman said his company opted to open-source the software behind its cloud computing stack as OpenStack “because we believe a widely adopted, open platform will drive standards.” In the past six months, more than 50 companies have joined the community.

Opposition to adopting open source does exist. For example, the OpenStack code base is still very immature, and features such as support for VMware hypervisors and live migration of instances are still in development.

Also, IT staff need to download the releases and integrate them with the existing compute, storage and networking infrastructure. This brings up another potential deal breaker: do you burden your internal IT staff with these modifications? Nubifer is here to help.

Cloud Appliances

An evolving method of deploying a private cloud is to leverage a cloud appliance. A cloud appliance is a rack of computing resources delivered tested and ready to go, with the software versioned and configured. Once the appliance is plugged into power and the network, you’re up and running.

For example, Nubifer partner IBM sells a private cloud appliance. This appliance blends standard hardware components and x86-based servers. By deploying an integrated cloud appliance, IT is spared the time it would take to build its own, freeing up the organization to focus on delivering business value rather than building IT componentry.

IBM’s private cloud offering is an integrated solution combining self-service, orchestration, and automation for heterogeneous resource pools.

Cloud appliances have drawbacks, though. For example, new equipment is bought as part of the appliance, versus redeploying existing components. Because of this, an organization would probably consider an appliance during a hardware refresh cycle. In addition, there are a limited number of pre-configured models, leading to a one-size-does-not-fit-all situation.

Organizations are attempting to focus more on primary business functions, which for most does not include constructing IT infrastructure, all while public clouds are leveraging standardization to lower costs and offer greater levels of agility.

However, many workload requirements inhibit moving data sets to public cloud environments, spawning the deployment of private clouds. Yet when an enterprise considers building a private cloud, it is back in the business of building out IT infrastructure.

Cloud appliances offer a potential solution. With all components pre-integrated, IT simply plugs in and turns the power on. After all, when buying a new car you would prefer to turn the key and go, rather than spending hour upon hour reading the user manual. Why shouldn’t your private cloud deliver a similar experience?

For more information on private cloud implementation contact a Nubifer representative.

Compliance in the Cloud

Cloud computing seems like a simple idea, and ease of operation, deployment and licensing are its most desirable qualities. But when it comes to issues of compliance, once you go beneath the surface you’ll discover more questions than you thought of originally.

Compliance covers a lot of issues, from government regulations to industry regulations such as PCI DSS and HIPAA. Your organization probably has internal guidelines in place, but migrating to a public cloud, a cloud application suite or something similar will mean handing the reins to the cloud vendor.

That’s a position many auditors, and C-level officials, find themselves in today. They want to discover how to adopt the cloud in a fashion that maintains their good standing with compliance. Here are a few tips for keeping an eye on compliance in the cloud.

Challenges to your Workload

When you survey cloud vendors, start by asking about sound practices and methods for identity and access management, data protection and incident response times. These are basic compliance requirements. Then, as you map your various compliance requirements to a prospective cloud vendor’s controls, you’ll probably encounter a few cloud-specific challenges.

Multi-tenancy and de-provisioning also pose challenges. Public clouds use multi-tenancy to better provision server workloads and keep costs low. But multi-tenancy means you’re sharing server space with other organizations, so you should know what safeguards your cloud provider has in place to prevent any compromise. Depending on how critical your data is, you may also want to use encryption. HIPAA, for example, requires that all user data, both in transit and at rest, be encrypted.
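
As one hedged illustration of encrypting data at rest before it ever reaches a multi-tenant environment, the sketch below uses the Python `cryptography` library’s Fernet recipe. It shows the mechanics only; on its own it is not sufficient for HIPAA or any other compliance regime, and key management remains the hard part.

```python
# Sketch: encrypt a record client-side before uploading it to shared storage.
# Uses the "cryptography" package's Fernet recipe (symmetric, authenticated).
# Key management is the hard part and is out of scope for this illustration.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # store this securely, outside the cloud
fernet = Fernet(key)

record = b"patient_id=123; diagnosis=..."
ciphertext = fernet.encrypt(record)  # safe to hand to the provider

# Later, after downloading the object back from the cloud:
plaintext = fernet.decrypt(ciphertext)
assert plaintext == record
```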

User de-provisioning is an issue that will become more challenging as password-authentication methods grow in complexity and volume. Federated identity management schemes will make it easier for users to log on to multiple clouds, and that will make de-provisioning much trickier.

Ever-Changing Standards

Like it or not, you’re an early adopter. Your decisions about what applications to move to the cloud and when to move them will benefit from an understanding of new and/or modified standards that are now evolving for cloud computing.

Today you can look for SAS 70 Type II and ISO 27001 certifications for general compliance with controls for financial and information security typically required by government and industry regulations, but these don’t guarantee that your company’s processes will comply.

Bringing visibility to users is a major goal of the Cloud Security Alliance, a three-year-old organization fast gaining popularity among users, auditors and service providers. A major goal of the CSA is development of standardized auditing frameworks to facilitate communication between users and cloud vendors.

Well underway, for example, is a governance, risk and compliance (GRC) standards suite, or stack, with four main elements: the Cloud Trust Protocol, Cloud Audit, Consensus Assessments Initiative and the Cloud Controls Matrix. The Cloud Controls Matrix includes a spreadsheet that maps basic requirements for major standards to their IT control areas, such as “Human Resources - Employment Termination,” while the Consensus Assessments Initiative offers a detailed questionnaire that maps those control areas to specific questions that users and auditors can ask cloud vendors.

Efforts of the CSA and other alliances, plus those of industry groups and government agencies, are bound to produce a wealth of standards in the next several years. The CSA has formal alliances with ISO, ITU and NIST, so that its developments can be used by those groups as contributions to standards they’re working on. And a 2010 Forrester Research report counted 48 industry groups working on security-related standards in late 2010.

Importance of an SLA

Regardless of your company’s size or status, don’t assume your cloud vendor’s standard terms and conditions will fit your requirements. Start your due diligence by examining the vendor’s contract.

Your company’s size can give you leverage to negotiate, but a smaller business can find leverage, too, if it represents a new industry for a cloud vendor that wants to expand its market. In any case, don’t be afraid to negotiate.

Security

To best understand your potential risk, as well as your benefits, you should bring your security team into the conversation at the earliest possible opportunity, says Forrester.

Moving to the cloud may offer an opportunity to align security with corporate goals in a more permanent way by formalizing the risk-assessment function in a security committee. The committee can help assess risk and make budget proposals to fit your business strategy.

You should also pay attention to the security innovations coming from the numerous security services and vendor partnerships now growing up around the cloud.

For more information regarding compliance and security in the Cloud, contact a Nubifer representative today.

Kentico Portal, a CMS for the Cloud

Cloud computing has been gaining momentum for the last few years, and has recently become a required ingredient in every robust enterprise IT environment. Leading CMS vendor, and Nubifer partner, Kentico Software took a step forward recently when it announced that its CMS portals are now supported on the leading cloud platforms. This means that you can now decide to deploy Kentico either on premise in your own IT landscape, on a public cloud platform (such as Amazon or Windows Azure), or in a hybrid model (with the database behind a firewall and the front end in the cloud).

Kentico Software sees cloud computing as an important step for its customers. The recent release of Kentico CMS “…removes barriers for our customers who are looking at their enterprise cloud computing strategy. Regardless of whether it’s on-premise or in the cloud, Kentico CMS is ready,” says Kentico Software CEO Petr Palas.

Based on the influence of cloud, mobile devices and social media, the online needs of users and customers have changed significantly in recent years. The days of simple brochure-esque websites targeting traditional browser devices with one-way communication are quickly coming to an end. The web has evolved to become a much more sophisticated medium. A business website is no longer a destination; rather, it is a central nexus for commercial engagement. Nubifer realizes that a business site today needs to cover the gamut: it needs to be visually appealing, it needs an intuitive information architecture, it needs to deliver dynamic, rich, compelling content, it needs mechanisms for visitor interaction, it needs to be optimized for speed and responsiveness, it needs to be highly scalable, and it needs to deliver an excellent experience to traditional browser devices like desktops and laptops.

Kentico identified that in order to deal with the huge demand for web content from the social and mobile Internet, business websites need to be built with scalability at the forefront of the engineers’ minds. This is where the cloud and Kentico CMS meet: elastic infrastructure which can be optimized to adapt to the growing needs of your business. Whether this is Infrastructure-as-a-Service (IaaS) or Platform-as-a-Service (PaaS), Kentico CMS provides turn-key solutions for the various options available, which will allow your organization’s web properties to scale efficiently and economically. Kentico’s cloud-optimized CMS platform enables organizations to deploy their portal in minutes and easily create a fully configured, fault-tolerant and load-balanced cluster. Kentico’s cloud-ready portal deployments automatically scale to meet the needs of customers, which can vary widely depending on the number of projects, the number of people working on each project and users’ geographic locations.

By automatically and dynamically growing and reducing the number of servers in the cloud, those leveraging a Kentico CMS solution can reduce costs, paying only for system usage as needed, while maintaining optimum system performance. “Kentico Software shares our vision of driving the expansion and delivery of new capabilities in the cloud,” said Chad Collins, Nubifer CEO. “The Kentico CMS brings automation, increased IT control and visibility to users, who understand the advantages of creating and deploying scalable portal solutions in the cloud.”
About Kentico CMS
Kentico CMS is an affordable Web content management system providing a complete set of features for building websites, community sites, intranets and on-line stores on the Microsoft ASP.NET platform. It supports WYSIWYG editing, workflows, multiple languages, full-text search, SEO, on-line forms, image galleries, forums, groups, blogs, polls, media libraries and is shipped with 250+ configurable Web parts. It’s currently used by more than 6,000 websites in 84 countries.

Kentico Software clients include Microsoft, McDonald’s, Vodafone, O2, Orange, Brussels Airlines, Mazda, Ford, Subaru, Isuzu, Samsung, Gibson, ESPN, Guinness, DKNY, Abbott Labs, Medibank, Ireland.ie and others.

About Kentico Software
Kentico Software (www.kentico.com) helps clients create professional websites, online stores, community sites and intranets using Kentico CMS for ASP.NET. It is committed to delivering a full-featured, enterprise-class, stable and scalable Web Content Management solution on the Microsoft .NET platform. Founded in 2004, Kentico is headquartered in the Czech Republic and has a U.S. office in Nashua, NH. Since its inception, Kentico has continued to rapidly expand the Kentico CMS user base worldwide. Kentico Software is a Microsoft Gold Certified Partner. In 2010, Kentico was named the fastest growing technology company in the Czech Republic in the Deloitte Technology FAST 50 awards. For more information about Kentico’s CMS offerings, and how they can add value to your web properties, contact Nubifer today.

Strategies for Cloud Security

Security and compliance concerns continue to be the primary barrier to cloud adoption. Despite these concerns, cloud computing is gaining traction. The issue now is not “will my organization move to the cloud?” Rather, it is “when?” In this article, Nubifer’s Research Team explores the requirements for an intelligent cloud security strategy. What are the minimum requirements? How do you coalesce traditional security protocols with advanced technologies like data loss prevention and risk management?
Security Concerns Slowing Cloud Adoption

A recent Cloud Trends Report for 2011 discovered that the number of organizations imminently planning a move to the cloud almost doubled from 2009 (24%) to 2010 (44%). The study also discovered that issues relating to cloud security are the primary obstacle to migration. In the published report, more than a quarter of those surveyed cited security as their number one concern, with almost 60% including security in their top three.

CA Technologies recently published a study concluding that, despite industry concerns about cloud security, roughly half of those leveraging the cloud do not effectively review vendors for security issues before deployments. The study, ‘Security of Cloud Computing Users: A Study of Practitioners in the US & Europe’, discovered that IT personnel vary in their determination of who is in charge of securing sensitive data and how to go about doing it.

Constructing a Cloud Security Plan

Despite the ability of many organizations to analyze their own security protocols, there remain many valid cloud security fears. Shifting the burden of protecting important data to an outside vendor is nerve-racking, especially in a vertical that has to abide by regulations such as HIPAA, SOX or PCI DSS.

Risks involving cloud security still have many unknowns, so developing an over-arching cloud strategy is a requirement. If your organization does not have a game plan in place, are you ready to adapt and change as requirements evolve?

Your CFO or related executive is your organization’s largest risk for a financial application breach and data loss. The HR director needs to be effectively trained and managed so that ‘lost’ personnel files don’t come back to bite you. Most importantly, the largest risk of all is the CEO.

Hackers realize this, which is why chief executives are consistently victims of “whaling attacks,” such as the well-known ‘CEO subpoena’ phishing scam.

A robust strategy to protect the most privileged users has the additional benefit of giving your organization a generalized cloud security road-map. Are mobile device risks a concern? Your most senior users want remote and mobile access. What about data loss? Your most senior users have access to more data points.

When your organization moves from analyzing itself to evaluating potential cloud application and platforms, do not neglect to look into how prevalent cloud services have already become in your IT infrastructure. Are you using Salesforce.com? Basecamp? Taleo? Google Apps?

Super-brand cloud/SaaS/PaaS providers Microsoft, Salesforce.com and Google all have tremendous reputations, so aligning projects that leverage these brands with your security protocols should not be time consuming. You’ll want to analyze others to ensure they are legitimate providers that spend the time to properly secure their IT environments.

Lastly, as software licenses run out and as product upgrades come due, you’ll be in position to effectively begin analyzing the cloud vendors you will want to leverage for your mission-critical operations.

Following that advice will get you started. For more information on formulating a Cloud Security strategy visit Nubifer.com.


Organizations Leveraging the Cloud

A recent poll by CDW found that nearly 28% of all US-based companies are leveraging the cloud, while almost 75% said that their first access to the cloud was through a simple cloud application.

The Cloud Computing Tracking Poll was conducted as a review of the current and future use of the cloud by business organizations and government offices, based on a survey of nearly 1,200 IT professionals.

About 84% of the organizations said that they have deployed at least one cloud application, while others are not even aware that they are among the users already in the cloud.

“Many organizations are carefully – and selectively – moving into cloud computing, as well they should, because it represents a significant shift in how computing resources are provided and managed,” said David Cottingham, senior director, CDW. “With thoughtful planning, organizations can realize benefits that align directly with their organizational goals: consolidated IT infrastructure, reduced IT energy and capital costs, and ‘anywhere’ access to documents and applications.”

CDW noted that the applications most frequently run in the cloud are service applications, such as email or documents, used by about half of cloud users; file storage is used by 39% of users; web and video conferencing by 36% and 32% respectively; and 34% of respondents conduct online training programs in the cloud.

Among those currently leveraging the cloud, almost 85% said they cut application costs by moving to the cloud. On average, users said they save 21% annually on the applications they migrated to a cloud platform.

“The potential to cut costs while maintaining or even enhancing computing capabilities for end users presents a compelling case for investment in cloud computing,” Cottingham said. “The fact that even current cloud users anticipate spending just a third of their IT budget on cloud computing within five years suggests that before wide-scale implementation, IT managers are taking a hard look at their IT governance, architecture, security and other prerequisites for cloud computing, in order to ensure that their implementations are successful.”

This survey included 150 individuals from each industry who considered themselves familiar with their organization’s use of, or plans for, cloud computing, a report on the CDW website said.

To learn more about how your organization can leverage cloud applications, visit Nubifer.com.

5 Recommendations to Keep your Personal Data Secure in the Cloud

Apple’s iCloud offering is additional evidence of the unmitigated flow of data to the cloud. Despite the latest breaches of security at various organizations, including the issues that have affected many Sony customers, more and more of us are casting personal or business assets to the cloud.

Yet many of us remain uneducated about the steps we should take to keep our online data safe. Adhering to these five guidelines will go a long way toward helping the average person keep online threats at a distance.

1. Don’t Take Security for Granted
There are two routes to your online data. One is through the cloud provider’s environment; the second is even more potent, and it’s much closer to home. The easiest and most available way for an intruder to get to your online records is through your login credentials. Of course you want the provider to be secure, but don’t let that make you lax about your personal log-in credentials.

2. Use Strong, Memorable Passwords
The problem with complicated passwords is that they are usually hard to remember. The key is to start with something memorable and then merge it into a strong password, mixing numbers, letters, lower and upper case, and symbols as well. Start with an address, a car license number, a telephone number, or a date of birth. Don’t use your own; use ones you know: friends, kids, parents, partners, previous addresses you lived at or cars you drove a decade ago. Choose something that can’t be linked to your online persona, but always mix it up: half an area code, a name with half of a zip code, parts of an old address. Then add in a $, an !, or an @ sign to mix it up even more.
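
If you prefer to let software do some of the mixing, the following small sketch uses Python’s `secrets` module to combine a memorable base with random characters. The base string is of course just an example and, per the advice above, should not be something tied to you personally.

```python
# Sketch: strengthen a memorable base phrase with random characters.
# The base below is only an example; pick one meaningful to you, not about you.
import secrets
import string

def strengthen(base: str, extra: int = 6) -> str:
    pool = string.ascii_letters + string.digits + "!@$%&*"
    suffix = "".join(secrets.choice(pool) for _ in range(extra))
    # Mix case inside the base as well, so it is not a plain dictionary word.
    mixed = "".join(c.upper() if i % 3 == 0 else c for i, c in enumerate(base))
    return mixed + suffix

print(strengthen("elmstreet1987"))   # e.g. 'ElmStrEet1987x$7Qp!'
```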

3. Guard your Inbox
You are going to recycle passwords, mostly for sites where you are not keeping important information like your credit card numbers, DOB, address or SSN. But there is one place where you should never neglect to use a unique password: your email inbox. This is the primary location where all your other logins come back to when you reset a password, making it the portal to all your other online personas.

Although it’s a bit of a hassle, you should opt for double-protecting your inbox with two-factor authentication, which means you have to enter a second credential in order to gain access. This is especially crucial if you have a habit of going to risky websites, you don’t keep your anti-malware software up to date, or you tend to miss phishing emails.

4. Don’t Leave the Password Recovery Backdoor Open
Quite often, users take many precautions to protect their personal information but make it very easy to reset their password through the password recovery service. If your user ID is simple to guess (it’s often your email) then do not use something easy to figure out for your password reset, such as your DOB, wife’s maiden name or some other easily accessible piece of personal information.

5. Have an Alternate to Fall Back on
Security is mostly about risk avoidance, and however careful your execution, you can’t eliminate all risk. So give yourself a fallback option. Don’t put all your money in one account, keep a separate emergency email address, and make sure you know a local coffee shop with WiFi you can resort to if your main Internet connection disappears. Knowing that you’ve got a second option if something bad happens helps you remain calm in an emergency, which gives you a better chance of surviving a crisis.

For more information regarding the security of your online data, visit Nubifer.com.

Fujitsu to Deliver First Windows Azure Appliance This Summer

The “private cloud” Windows Azure appliances that Microsoft announced a year ago are almost here, with the first of them slated to ship in August 2011.

Fujitsu, one of three OEMs that announced initial support for the Azure Appliance concept, will deliver its first Azure Appliance in August 2011, Fujitsu and Microsoft announced on June 7. Fujitsu’s offering is known as the Fujitsu Global Cloud Platform, FGCP/A5, and will run in Fujitsu’s datacenter in Japan. Fujitsu has been running a trial of the service with 20 companies since April 21, 2011, according to the press release.

Microsoft officials had no further updates on the whereabouts of appliances from Dell or Hewlett-Packard. Originally, Microsoft told customers to expect Azure Appliances to be in production and available for sale by the end of 2010.

Windows Azure Appliances, as initially described, were designed to be pre-configured containers holding hundreds to thousands of servers running the Windows Azure platform. These containers will be housed, at first, in Dell’s, HP’s and Fujitsu’s datacenters, with Microsoft providing the Azure infrastructure and services for them.

In the longer term, Microsoft officials said they expected some large enterprises, like eBay, to house the containers in their own data-centers on site — in other words, to run their own “customer-hosted clouds.” Over time, smaller service providers also will be authorized to make Azure Appliances available to their customers as well.

Fujitsu’s goal with the new Azure-based offering is to sign up 400 enterprise companies, plus 5,000 small/medium enterprise customers and ISVs, in the five-year period following launch, a recent Fujitsu press release noted.

For more information regarding the Azure Appliances, and how they can provide you with a turn-key private cloud solution, visit Nubifer.com/azure.

Intriguing Cloud Computing Statistics

If you remain skeptical about cloud computing, the following stats may put any lingering confusion to rest:

  • CRN predicts that by 2014 small business spending on cloud computing will reach nearly $100 billion.
  • IDC approximates that the market for public cloud products and services was at $16B in 2010 and will expand to $56B by 2014.
  • An estimation by Gartner places the cloud market at $150B by 2013, while Merrill Lynch places it at $160B by 2012.
  • SandHill recently conducted a survey of 500 IT decision-makers and when asked to name their primary reason for adopting cloud applications, 50% of the respondents cited business agility.
  • Enterprise applications will need to adapt to the rapid growth of mobile and social computing, which is unprecedented in the history of technology.
  • According to a Gartner estimate, the rapid growth of virtualization will mean that 60% of server workloads will be virtualized.
  • Although public cloud infrastructure, applications and platforms are growing at 25%, IDC estimates that the market for enterprise servers will increase two-fold by 2013.
  • A recent survey revealed that every enterprise surveyed was using a SaaS application, while fewer than 25% of IT departments were aware that they were using one.
To learn more about cloud computing and how it can benefit your organization, contact Nubifer today.

How Cloud Computing Could Change the Role of the CIO

Cloud computing is at the top of conference agendas and a common buzz word online, so it should come as no surprise that it is also on the minds of many IT executives. And as more and more enterprise IT departments move to the cloud, many are beginning to wonder how this will affect the traditional role of the CIO.

The role of the CIO will change if the IT department shifts from a service provider to a utility model with usage-based metering. Core tasks will shift away from developing applications and user interfaces toward a new set of tasks: defining service-level agreements, selecting cloud management tools and understanding customer service. The role of the CIO could become more like that of an independent business manager running a public service.

Usage-Based
The CIO used to be involved in strategic technology planning for the organization and was likely making strategic decisions, such as when to upgrade Microsoft Office and Windows and which strategic vendor to use for hardware. But this changes when an organization implements a cloud architecture, as new tasks and skills come into play. Some of the traditional roles of the CIO remain, while the CIO is also required to play a new role as a cloud manager. This requires providing the tools and computing power to meet the changing needs of users in a quicker, more efficient manner. This may also include setting up a private cloud, in which users have access to consistent, repeatable services from a services portal available via standard Internet protocols.

Earlier this year, an InformationWeek article revealed that in a survey of IT executives, cutting costs was nearly as important to respondents as faster response to end users among the top reasons for moving to cloud computing. The same survey found that although 58 percent of respondents were making the move to the cloud, most were taking a slow approach to doing so.

The Future Lies in the Cloud
With that said, a Mashable post citing a different cloud computing survey predicts that by 2011, a vast majority of computing will take place in the cloud. Although this survey seemed to focus more on the consumer side of things, most IT executives see a future in the cloud. The CIO job will adapt and change as this transition occurs, and will function more as a logistical manager.

As cloud services move outside the firewall, understanding how the vendor provides the services your company needs will become increasingly important, as will verifying that your company’s information is safe and secure wherever it is stored.

To learn more about the cloud, and how it can help your organization, contact Nubifer today.

Cloud Computing: A Guide for Small Businesses

Cloud computing is all the rage these days, generally described as a computing model in which services and storage are provided online. When small business owners or new software companies refer to cloud computing, they most often mean an application that runs over the Internet, as opposed to operating from a desktop that is connected to the Internet: Software as a Service (SaaS).

Everything from phone services to marketing operations has a cloud based solution. Oftentimes, you are using SaaS without even realizing it. For example, your email provider is likely delivering service from the cloud, without on-premise hardware and software.

The following is a guide of different factors to consider when deciding to adopt a cloud solution for your business.

The growth of cloud computing is astonishing.
The worldwide cloud computing market is estimated at $8 billion, with the U.S. market accounting for $3.2 billion of that sum, or 40%. Gartner’s 2011 predictions place Cloud Computing at the top of their list of Top Strategic Technologies. Additionally, Gartner predicts that the SaaS market will reach $14 billion in 2013.

Says Gartner, “Cloud computing services exist along a spectrum from open public to closed private. The next three years will see the delivery of a range of cloud service approaches that fall between these two extremes. Vendors will offer packaged private cloud implementations that deliver the vendor’s public cloud service technologies (software and/or hardware) and methodologies (i.e., best practices to build and run the service) in a form that can be implemented inside the consumer’s enterprise. Many will also offer management services to remotely manage the cloud service implementation.”

A recent study conducted by AMI-Partners revealed: “Small and medium business (SMB) spending in the U.S. on software-as-a-service (SaaS) will increase exponentially over the next five years, eclipsing growth in investments in on-premise software by a significant margin. AMI forecasts a 25% CAGR in hosted business application services spending through 2014. This will come against a modest 5% uptick for all other categories of on-premise software combined. However, this growth will not be uniformly spread across all hosted applications. Mature applications such as ERP, SCM, procurement, finance, and core HR will turn over more slowly than those that are less saturated and have lower switching costs.”

Cloud computing software solutions vs. desktop applications.
Small businesses choose cloud computing solutions over desktop applications because they are less expensive: you pay a small monthly amount rather than a one-time fee as with traditional desktop software.

Another reason small businesses choose cloud computing solutions is that the SaaS application is often a simplified version of the software currently installed on your machine. The developers of many cloud computing apps have created just the basics required to get the job done.

One of the market leaders in the cloud computing industry, Salesforce.com, had over 52,000 customers in 2009, while hosting provider Rackspace has over 1,000 SaaS apps in its new AppMatcher.com service.

Cloud computing solutions are available whenever you want, wherever you are.
For many small business users operating virtual offices or working remotely on different machines depending on location, the application needs to be accessible from a web browser. Cloud computing is available wherever you have access to a computer and a browser, and that is one of its biggest advantages.

If you aren’t connected and are operating your laptop offline, many apps have either a mobile app or a widget that you can download to run a lighter version of the software. Some Google Apps, for example, offer offline access through Google Gears, which will sync your data when you’re back online. Google Apps has over two million businesses and 25 million users in its cloud computing marketplace.

Simple, focused cloud computing solutions can often get the job done.
If you don’t use all of the features of your desktop software, a cloud computing application might offer a “forever free” plan, which will allow you to do the same work as a desktop application, but limited in some way. A billing solution might let you create an unlimited number of invoices, for example, but only for two separate clients.

With that said, not all apps that live in the cloud are more basic than their desktop equivalents; rather, they offer a pared-down basic package that can help you complete the task at hand when you don’t require the feature-rich version. Zoho, for example, offers a simple bookkeeping app that is free. You can also integrate it with other financial SaaS apps to do more, or purchase the more feature-rich SaaS version.

Pay attention to the security of your data.
It is important to remember that you are still responsible for making sure data is where it needs to be, whether onsite or in a cloud. Your cloud computing vendor isn’t ultimately responsible for your data, security or data privacy. They may promise certain aspects of security, but if you are a financial institution, for example, you are the one responsible if regulators come calling. It is important to make sure you aren’t violating any compliance requirements and that your data is safe.

A May 2010 ‘USA Today’ article told the story of a small business owner whose store was robbed and eight desktops were stolen. They purchased eight new computers and were back in business in no time thanks to cloud services like Salesforce.com, Microsoft Office 365 and QuickBooks Online.

Choose a stable and reliable cloud computing vendor.
It is important to ask questions like, What type of Service Level Agreement (SLA) do they have? How long have they been in business? Can you talk to users directly? How many customers do they have? It is often possible to read testimonials and get good information, and if the testimonials are real, they will often link to the person who made the comment. You can also do a search on Twitter, Facebook or LinkedIn.

Consider the uptime of your cloud computing applications.
Uptime refers to a hosted application’s availability record, and most vendors are in the range of 98-99.9%, which acknowledges that servers go down for maintenance or unexpected issues. Make sure to read SLAs carefully and talk about changing terms with the vendor if you have to.
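To put those percentages in perspective, the quick back-of-the-envelope calculation below (a minimal sketch, not tied to any particular vendor’s SLA) shows how much downtime each level actually allows:

```python
# Convert an uptime percentage into the downtime it allows per month and per year.
HOURS_PER_YEAR = 365 * 24            # 8,760 hours
HOURS_PER_MONTH = HOURS_PER_YEAR / 12

for uptime in (98.0, 99.0, 99.9):
    down = 1 - uptime / 100
    print(f"{uptime}% uptime allows "
          f"{down * HOURS_PER_MONTH:.1f} hours/month, "
          f"{down * HOURS_PER_YEAR:.1f} hours/year of downtime")
# 98.0% works out to roughly 14.6 hours/month; 99.9% to roughly 0.7 hours/month.
```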

Pay attention to customer support.
Be sure to check whether there is an extra charge for support and maintenance or whether it is included in your monthly subscription fee. While it is often included, it is important to read the fine print, and also to check whether you have access to a customer support team via phone, email or social media.

Choose a flexible cloud computing vendor.
Your monthly fees usually depend on how many users you have, and you can add and subtract users as needed. Your capital outlay to “purchase” cloud based apps is often lower than for traditional on-premise or desktop apps. Cloud computing is one streamlined way to scale with your needs.

Evaluate your requirement for software upgrades.
Cloud computing apps are regularly improved and upgraded, and you benefit from each and every improvement without additional direct cost and without the effort and time of downloading and configuring upgrades. Enhancements often happen more quickly and in shorter development cycles, often based on customer requests.

Make sure your cloud solutions integrate well.
Cloud computing might just be for you if your need involves some type of integration, as many current cloud based apps offer an API (application programming interface) which other synergistic apps can leverage. You might find an accounting package, for example, that ties into a CRM package. To do this with your current desktop applications you would have to pay someone to customize both apps for you; a web-based app can save you time and money and might have already built the integration for you.
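As a hedged sketch of what such an API tie-in can look like, the snippet below copies a customer record from a hypothetical accounting service into a hypothetical CRM; the URLs, field names and token are placeholders, not any specific vendor’s API.

```python
import requests

# Hypothetical endpoints and credentials; replace with your vendors' documented APIs.
ACCOUNTING_API = "https://accounting.example.com/api/customers"
CRM_API = "https://crm.example.com/api/contacts"
HEADERS = {"Authorization": "Bearer <api-token>"}

def copy_customer_to_crm(customer_id):
    # Read the customer from the accounting app ...
    resp = requests.get(f"{ACCOUNTING_API}/{customer_id}", headers=HEADERS, timeout=10)
    resp.raise_for_status()
    customer = resp.json()

    # ... and create a matching contact in the CRM.
    contact = {"name": customer["name"], "email": customer["email"]}
    requests.post(CRM_API, json=contact, headers=HEADERS, timeout=10).raise_for_status()

copy_customer_to_crm("12345")
```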

You can look into an offering like CastIron (recently acquired by IBM) if integration is your concern, as it “pre-configures a number of apps” so that you can connect to the solutions you are using already.

Cloud computing offers a distinct advantage if rapid deployment is integral to your project, as many cloud computing projects are up and running in hours, sometimes in minutes. Although you may not get every feature configured to your needs, you can often start working right out of the gate. If the provider you evaluate has an API connected to another application you need, it may offer advantages over a desktop application, which will require more money to customize later.

Cloud computing isn’t always the least expensive solution.
Cloud computing may be the perfect option if cash flow is an issue. While on-premise software purchases often involve high upfront licensing costs, cloud computing apps typically require no large up-front licensing fees that would need department or board approval, and there are usually no annual maintenance fees either.

On a vendor’s website pricing page, SaaS pricing is often clear. If a cloud computing app vendor requires a demo or doesn’t reveal its pricing, it usually means the solution is more complicated and demands an installation process or customization that will cost more upfront.

Pay attention to how quickly your software needs to change.
When an application package requires an upgrade, users are often forced to choose between a) upgrading at a high cost, experiencing delays while the new features are evaluated and plans for adoption are formulated, and hiring or enlisting local IT talent to develop, test, debug, deploy and train personnel on the new application, or b) continuing to use the older version of the software and forgoing the advantages of the upgraded version.

In both cases you are left waiting for software changes to be made by the software company, but with the cloud computing model you will see those upgrades sooner than with a desktop application. The vendor applies upgrades at the data center, the upgrades are made available to users immediately via their online connection with only minor delays, and they come at no additional cost to the user.

Remember: your monthly fee covers upgrades, so factor this in when comparing costs. If you would otherwise upgrade each year, then the monthly outlay may be lower from a total-cost perspective over time, while with a desktop application you are waiting until the next, often annual, release.

Many goals can be accomplished without all of the bells and whistles.
Because they are often focused on a particular area or business niche, cloud based applications can be less robust. It can be argued that this forces you to run your business, from a software perspective, on the Pareto Principle, in which 80% of the effects come from 20% of the features; in practice this is less of a drawback than it sounds, because most desktop users routinely admit that they don’t use all of the features of a desktop application anyway. This partly explains how many cloud based applications get developed: they target core problems rather than a large feature set that most users won’t even try.

If the cloud computing application lacks some of the features you need, you can add features via customization or premium levels of vendor service. Each application provider is different, but most offer extensive interface capability, usually via Web services that integrate both internal and hosted systems.

A common myth is that cloud computing software doesn’t play well with legacy applications and data sources. In fact there are two primary methods of integrating cloud computing apps: batch synchronization (in which you initially export/import your data into the cloud based application, after which your data is incrementally synchronized on a scheduled basis) and real-time integration via Web services (a neutral middle layer through which your application talks to the cloud provider).
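A minimal sketch of the batch-synchronization pattern, assuming the hosted application exposes a “modified since” query (which not every vendor does), might look like the following; the endpoint, parameters and state file are illustrative only.

```python
import json
from datetime import datetime, timezone

import requests

# Hypothetical hosted-app endpoint that supports incremental ("modified since") queries.
CLOUD_API = "https://app.example.com/api/records"
HEADERS = {"Authorization": "Bearer <api-token>"}
STATE_FILE = "last_sync.json"

def load_last_sync():
    try:
        with open(STATE_FILE) as f:
            return json.load(f)["last_sync"]
    except FileNotFoundError:
        return "1970-01-01T00:00:00+00:00"  # first run: pull everything

def batch_sync(save_locally):
    """Pull records changed since the last run and hand them to a local writer."""
    since = load_last_sync()
    resp = requests.get(CLOUD_API, params={"modified_since": since},
                        headers=HEADERS, timeout=30)
    resp.raise_for_status()
    for record in resp.json():
        save_locally(record)  # e.g. upsert into the on-premise database
    with open(STATE_FILE, "w") as f:
        json.dump({"last_sync": datetime.now(timezone.utc).isoformat()}, f)

# Run on a schedule (cron, Task Scheduler) to keep both systems aligned.
batch_sync(save_locally=print)
```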

It is important to note that you have to evaluate the implications and limitations of cloud computing software for your needs. Some gaps remain for complex end-to-end processes that require complicated workflows or business processes.

Engage your technical team.
It is important to keep lead technical people in the loop for security and integration issues for a number of reasons. If you are a business owner and are unsure about what information you are sharing, you could be sending information out that onsite applications need or you could even be putting corporate information at risk.

Applications and services are now easily accessible to end users, who can acquire SaaS capabilities without input from their IT or data management teams, which is a major challenge with cloud computing. Related issues like data replication, outages and the complications of outsourced data storage can further complicate cloud integrations. And if your tech team isn’t aware that you are running certain cloud based apps, you could create challenges in multiple functional areas.

Good cloud computing companies have built their web apps on a Web-services based architecture because it is less proprietary and easier for these apps to share data with one another. These standards make it easier for companies to integrate services, but they can inadvertently create security problems by making a hacker’s job easy if the proper security isn’t in place.

Internal training is still required.
Most SaaS vendors provide online video tutorials in addition to robust user communities and forums where you can get your questions answered. This makes cloud applications easier to use with less training involved, and direct access to these resources means less of a burden on your own internal technical teams.

Conclusion
Cloud computing is drastically reshaping the small business market, and if you are trying to grow your business while limited by cash flow, cloud computing is an attractive option. The rise of smartphones and other mobile technologies (mobile computing) makes for a dwindling audience for on-premise applications. The 16 considerations listed above will give you a new outlook on how to get work done and solve problems.

For more information on how cloud computing can help your small business contact a Nubifer representative today.

IBM’s Tivoli Live

IBM recently announced a new addition to its SaaS portfolio, IBM Tivoli Live – Service Manager, which provides integrated service management capabilities as a monthly subscription on IBM’s cloud platform. Along with IBM Tivoli Live – Monitoring Services, Tivoli Live solutions allow organizations to quickly adopt and deploy key ITIL processes and combine them with performance and availability monitoring, all under a common subscription and delivery model. There is no need to purchase hardware, software licenses or installation services. 

Both solutions are based on a common platform and architecture that many IBM clients use today as on-premise software. Customers are not locked into a single consumption model and in fact can choose from an array of flexible delivery options including on-premise software, SaaS, appliances and managed help desk services. Now, organizations large and small can take advantage of enterprise-class software and easily migrate from one model to another based on their business needs.

For small and medium-sized businesses without large IT departments, this service provides a quick and practical path towards improving IT performance. For larger organizations, this service can complement existing IT management infrastructure, helping organizations better manage their costs and standardize IT operations.

Tivoli Live – Service Manager offers a comprehensive set of capabilities for implementing problem, incident, change, release and asset management processes, leveraging a common data model and a robust change management database. Customers have the flexibility to purchase any of these capabilities through IBM’s role-based user pricing.

Tivoli Live – Monitoring Services delivers Tivoli Monitoring and Tivoli Composite Application Management software over the Web, which allow customers to manage the health and performance of their data center’s resources – including operating systems, virtualized servers, middleware and applications.

For more information on IBM’s Cloud Services, visit Nubifer.com.

Cloud Computing’s Popularity with SMBs

There is no simple answer as to whether or not 2010 was the year small business IT finally adopted cloud computing once and for all. On behalf of Microsoft, 7th Sense Research recently conducted a study on cloud computing in small business computing environments and found that 29% of SMBs view the cloud as an opportunity for small business IT to be more strategic. The study also found that 27% of SMBs have bought into cloud computing because it integrates with existing technology investments, while 12% of SMBs have used the cloud to start a new business.

Despite those figures, small businesses overall remain largely unfamiliar with cloud computing. Josh Waldo, director of SMB Marketing at Microsoft, reveals, “Roughly 20 percent of SMBs claim to know what cloud technology is.”

The numbers just don’t match up, but Waldo points out that just because people may not identify with the term cloud computing doesn’t mean they aren’t using the technology. Take Gmail or Hotmail, for example: They are both prime examples of the Software-as-a-Service (SaaS) form of cloud computing and are extremely popular—without their many users even realizing they are using cloud technology when checking their inbox.

“People might not understand what cloud is. But they are using it. They’re using it in their private life. In some cases they’re using it in their work life. But they might not necessarily identify it with the term cloud,” says Waldo.

He believes that the lack of familiarity SMBs have with cloud computing can be an opportunity for Microsoft, Zoho and other providers of small business technology. Says Waldo, “For Microsoft, what that means is that this gives us a big opportunity to really educate SMB’s about cloud technologies and how they can benefit their business. Our goal is really going to be to help SMB’s evolve how they think about technology.”

According to Waldo, the benefits for small businesses that embrace the cloud are potentially huge: “First, SMBs can get enterprise-class technology at a fraction of the price, where you’re not purchasing on-premises technology that’s going to cost you an enormous amount upfront. Second, it really allows companies, whether you’re a development shop and you’re building software, or you’re an end customer—like a financial or insurance firm—to focus on your business rather than your IT requirements.”

By outsourcing data-center needs, for example, small business IT can avoid building out capacity to handle potential spikes in data or transaction processing, because they buy the processing power they need when they need it. This leads to another key benefit of cloud computing: elasticity and the expectation of mobility. Waldo defines elasticity as the capability to scale up or down rapidly, based on need. That includes processing power, but it also means being able to add new users from a seasonal workforce without having to deal with the per-seat licensing associated with traditional desktop software.
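The elasticity Waldo describes usually comes down to a simple feedback loop: measure load, compare it with thresholds, and add or remove capacity. The sketch below is a generic, provider-agnostic illustration of that loop, not any particular vendor’s auto-scaling API.

```python
# Generic threshold-based scaling decision; illustrative and provider-agnostic.
def desired_instance_count(current, avg_cpu_percent,
                           scale_up_at=75, scale_down_at=25,
                           minimum=1, maximum=20):
    """Return how many instances should be running given the average CPU load."""
    if avg_cpu_percent > scale_up_at and current < maximum:
        return current + 1   # add capacity under heavy load
    if avg_cpu_percent < scale_down_at and current > minimum:
        return current - 1   # release capacity (and cost) when load drops
    return current           # within the comfortable band: do nothing

# Example: 4 instances running at 82% average CPU should scale up to 5.
print(desired_instance_count(current=4, avg_cpu_percent=82))
```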

When it comes to the expectation of mobility, Waldo says that today’s notebook-, smartphone- and tablet-toting employees want to make their work more flexible by making it mobile. SMBs can let employees access the information and applications they need while on the go by exposing core applications as SaaS via the cloud.

Embracing Cloud Computing
Waldo recommends that SMBs that have decided to embrace the cloud by adding cloud computing to their small business technology portfolio seek expert advice. “We really think it’s important that SMB’s choose carefully. And if they’re uncertain, they should work with a third party or a consultant or a value added reseller or some type of agent who understands the various elements of cloud technology and [who] can advise clients,” he says.

According to Chad Collins, CEO of Nubifer.com, a provider of turn-key cloud automation solutions, the first thing a small business should consider is which problem it is trying to solve: “The most important thing is that the cloud really isn’t just about infrastructure. It’s about solving problems. It should be about scalability, elasticity and economies of scale.” Collins adds, “What our enterprise clients are asking for is the ability to create virtual environments, run applications without code changes or rewrites and, most importantly, to be able to collaborate and share using single sign-on interface.”

Collins says that the person responsible for small business IT should ask a range of questions when considering a cloud services provider. Among the most important is: Does the cloud provider allow you to run existing applications without any code rewrites or changes to code? Microsoft’s research reveals that 27% of SMBs have already bought into cloud services because they integrate with existing technology, while another 36% would be encouraged to buy into the cloud because of that fact. “Being able to migrate custom applications over to the cloud without rewrites is not only a huge cost saver but also a huge time saver for SMBs,” says Collins.

Another important question is whether the cloud provider offers granular user access and user-based permissions based on roles. Can you measure value on a per user basis? Can you auto-suspend resources by setting parameters on usage to avoid overuse of the cloud? The latter is important because although cloud services can result in immense cost savings, their pay-as-you-go nature can yield a large tab if used inefficiently.

Collins recommends paying special attention to the level of responsive support offered by a cloud provider. “I think for SMBs it’s really important. Having to log a Web form and then wait 24 to 48 hours for support can be really frustrating,” he says, adding that the provider should guarantee that a support team will respond within a few hours. Agreeing with Collins, Waldo points out that a service-level agreement with high availability and 24-hour support is key.

To discover how the power of cloud computing can benefit your SMB, please visit Nubifer.com.

Microsoft Outlines Plans for Integration-as-a-Service on Windows Azure

Although Microsoft officials have been discussing plans for the successor to the company’s BizTalk Server 2010 product for some time, the cloud angle of Microsoft’s plans for its BizTalk integration server didn’t become clear until late October 2010, at the Professional Developers Conference (PDC). 

When looking at a BizTalk Server Team blog, it appears as if Microsoft is thinking about BizTalk vNext transforming into something akin to Windows Azure and SQL Azure—at least in concept—a “BizTalk Azure.”

An excerpt from the blog says, “Our plans to deliver a true Integration service—a multi-tenant, highly scalable cloud service built on AppFabric and running on Windows Azure—will be an important and game changing step for BizTalk Server, giving customers a way to consume integration easily without having to deploy extensive infrastructure and systems integration.”

The latest news from Microsoft reveals that there will be an on-premise version of BizTalk vNext as well—and the final version is slated to arrive in 2012. A Microsoft spokesperson said, “We will deliver new cloud-based integration capabilities both on Windows Azure (as outlined in the blog) as well as continuing to deliver the same capability on-premises. This leverages our AppFabric strategy of providing a consistent underlying architecture foundation across both services and server. This will be available to customers in the 2 year cadence that is consistent with previous major releases of BizTalk Server and other Microsoft enterprise server products.”

In September 2010, Microsoft released the latest on-premises software version of BizTalk (BizTalk Server 2010), which is a minor release of Microsoft’s integration server that supports Visual Studio 2010, SQL Server 2008 R2, Windows Server AppFabric and Windows Server 2008 R2.

There are currently over 10,000 BizTalk Server customers paying a hefty price for the product, and thus Microsoft officials are being careful in their positioning of BizTalk Azure. Microsoft will ensure that existing customers are able to move to the Azure version “only at their own pace and on their own terms.” Microsoft plans to provide side-by-side support for BizTalk Server 2010 and BizTalk Azure to make sure apps don’t break, and will also offer “enhanced integration between BizTalk and AppFabric (both Windows Server AppFabric and Windows Azure AppFabric).”

Microsoft recently rolled out the First CTP (Community Technology Preview) of the Patterns and Practices Composite Application Guidance for using BizTalk Server 2010, Windows Server AppFabric and Windows Azure AppFabric together as part of an overall composite application solution. Additionally, Microsoft previewed a number of future enhancements to Windows Azure AppFabric.

For more information regarding BizTalk on Azure, contact a Nubifer representative today.

DoD Business Applications and the Cloud

Current cloud spending is less than 5% of total IT spending, but with an optimistic 25% growth rate, cloud computing is poised to become one of the dominant models for organizing information systems, which is why it is important for the Department of Defense Business Mission to begin charting the path to cloud operations and migrate from its current low-performance, high-cost environment.

The DoD Fiscal Year (FY) 2010 IT cost of the Business Mission, excluding payroll costs for uniformed and civilian personnel, is $5.2 billion, with one-third of the cost of the communications and computing infrastructure adding another $5.4 billion to the total.

The scope of DoD Business Applications exceeds the average IT budget of the largest US corporations by a factor of three. As a result, DoD Business Operations needs to think of its future IT direction as operating a secure, private cloud managed organically by the DoD Business Mission in order to squeeze the cost benefits out of the cloud.

There are many forms of cloud computing, ranging from Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS) to Software-as-a-Service (SaaS), but when it comes to the Department of Defense, only offerings that can support over 2,000 applications need apply. Business Operations cannot be tied to “public” clouds that are proprietary.

The DoD, for example, can’t rely on the largest cloud service, the Amazon Elastic Compute Cloud, which offers computing capacity completely managed by the customer and is thus a “public cloud.” Because compute processing is purchased on demand, Amazon is an IaaS service. Once your applications are placed in the proprietary Amazon cloud, however, it is difficult to transfer the workload to a different environment.

Google, however, offers a PaaS service as a public cloud (read: accessible to all) via the Google App Engine. Google allows developers to build, host and run web applications on Google’s mature infrastructure with its own operating system; Google only provides a few Google-managed applications.

Salesforce.com’s enterprise-level computing currently operates at a $1.4 billion annual revenue rate, with 2 million subscribers signed up for SaaS application services running in a proprietary PaaS environment. Because Salesforce offers only proprietary solutions, it can’t be considered by DoD, although Salesforce’s recent partnership with VMware might change that.

Other cloud providers offer IaaS services, but they all leave it to customers to manage their own applications; they qualify for DoD applications provided they meet open source and security criteria.

Open Platform and Open Source
Microsoft’s Windows Azure platform offers a PaaS environment for developers to create cloud applications, with services running in Microsoft’s data centers on a proprietary .Net environment. These predominantly .Net applications are integrated into a Microsoft-controlled software environment that can be described as a “closed” platform.

Currently, DoD Business Mission applications run largely in a Microsoft .Net environment. What remains to be seen is whether DoD will pursue cloud migration into a multi-vendor “open platform” and “open source” programming environment or continue to stick with a more restrictive Microsoft .Net.

The largest share of the DoD IT budget goes to the Defense Information Systems Agency (DISA), which advocated the adoption of the open source SourceForge library in April 2009 for unclassified programs. DISA’s Forge.mil program enables collaborative software development and cross-program sharing of software, system components and services in support of network-centric operations and warfare. Forge.mil is modeled on concepts proven in open-source software development, represents a collection of screened software components and is used by thousands of developers. Forge.mil takes advantage of a large library of tested software projects, and its components are continuously evaluated by thousands of contributors (including some from firms like IBM, Oracle and HP, although not from Microsoft, which controls its own library of code).

A DoD Memorandum of October 16, 2009 from the Acting DoD Chief Information Officer, “Clarifying Guidance Regarding Open Source Software (OSS),” defines OSS as software for which the human-readable source code is available for use, study, reuse, modification, enhancement and redistribution by the users of that software. OSS meets the definition of “commercial computer software” and will thus be given preference in building systems. DoD has begun the process of adopting open source computer code with the announcement of Forge.mil.

Implications
Due to this migration of business applications, a reorientation of systems development technologies in favor of running on “private clouds,” while taking advantage of “open source” techniques, is necessary in order to realize the greatest savings. The technologies currently offered for the construction of “private” clouds will help achieve the complete separation of the platforms on which applications run from the applications themselves. The simplification that can be achieved through the sharing of “open” source code from the Forge.mil library makes delivering cloud solutions cheaper, quicker and more readily available.

For more information regarding the DoD and open source cloud platforms, please visit nubifer.com today.

Feds to Unveil Cloud Security Guidelines

Late in 2010, the federal government issued draft plans for the voluntary Federal Risk and Authorization Management Program, dubbed FedRAMP. FedRAMP is expected to be operational by April 2011 and would ensure cloud services meet federal cyber-security guidelines, which will likely allay remaining government concerns about cloud security and accelerate adoption of cloud technologies.

Developed with cross-government and industry support over the past 18 months, the voluntary program would put cloud services through a standardized security accreditation and certification process. Any authorization could subsequently be leveraged by other agencies. Federal CIO Vivek Kundra said in a statement, “By simplifying how agencies procure cloud computing solutions, we are paving the way for more cost-effective and energy-efficient service delivery for the public, while reducing the federal government’s data center footprint.”

The adoption of cloud computing has been promoted by the Obama Administration as a way to help save the government money, and Kundra and other top officials have championed the technology and instituted policies, like data center consolidation requirements, that could bring about a shift to the cloud. Federal IT managers, however, have consistently raised security concerns as the biggest barrier to adoption.

The government’s security concerns arise partly because cloud computing is a relatively new paradigm that has to be adapted to the security requirements of regulations like the Federal Information Security Management Act (FISMA), which governs federal cyber-security for most government agencies. By mapping out the baseline required security controls for cloud systems, FedRAMP creates a consistent set of security outlines for cloud computing.

FedRAMP will seek to eliminate a duplicative, costly process to certify and accredit applications. Each agency used to take apps and services through their own accreditation process, but in the shared-infrastructure environment of the cloud, this process is redundant.

The FedRAMP draft is comprised of three major components: a set of cloud computing security baseline requirements; a process to continuously monitor cloud security; and a description of proposed operational approaches to authorizing and assessing cloud-based systems.

FedRAMP will be used for both private and public cloud services, and possibly for non-cloud computing information technologies and products. For example, two agencies have informed IBM of their intent to sponsor certification of their new Federal Community Cloud services.

Commercial vendors will not be able to directly request FedRAMP authorization, but rather will have to rely on the sponsorship of a federal agency that plans to use their cloud services. Guidance on the CIO Council’s website suggests that FedRAMP “may not have the resources to accommodate all requests initially,” and that GSA will focus on systems with potentially larger user bases or cross-government interest, suggesting that the government predicts a large amount of interest.

FedRAMP will remain an inter-agency effort under federal CIO Kundra’s authority and will be managed by GSA. The new Joint Authorization Board, which includes representatives from agencies such as GSA and the Department of Defense, will authorize the systems that go through the process with the sponsoring agency.

Although FedRAMP provides a base accreditation, most agencies have security requirements that go beyond FISMA and thus may have to do more work on top of the FedRAMP certification to make sure the cloud services they are looking to deploy meet individual agency requirements.

For more information regarding the Federal adoption of cloud technologies, visit Nubifer.com.

Cloud Computing’s Varying Forms of Functionality

Although everyone associated with the industry is likely familiar with the term cloud computing, what remains ambiguous are its offerings, both now and in the future. The benefits of cloud computing can essentially be classified into as many as five categories, the majority of which are discussed in the paragraphs to follow.

The Internet allows you to market your brand internationally, whether you are an SMB or a multi-national organization. It also enables organizations to reach out and offer their products and services to an international audience, and the ability to combine data and applications with remote computing resources creates exciting new opportunities.

Take the latest and greatest mobile app, for example. This new application has the ability to travel anywhere the user is, whether they are surfing on their TV, phone, or laptop. A tremendous amount of information has to be transferred online and shared with several services in order for that application to operate seamlessly, while guaranteeing privacy and security.

Cloud computing offers more than storing data off-site and allowing access through a browser. The cloud can also adapt and scale its services to fit each user’s needs through intelligent algorithms. Basic usage of the cloud results in a more personalized experience as the platform acquires greater familiarity with the intents of the user, which in turn lets users make effective use of smart services and acquire better information so they can take action wherever they happen to be.

We as human beings are social entities. We naturally and instinctively interact with those around us. In the past, communication was done by telegraph, letters, telephone and faxes, but it is now largely through the Internet. The Internet has created a plethora of communication opportunities, such as instant messaging, Internet telephony and social media. Cloud computing expands on this concept and offers the opportunity to make it possible to incorporate interaction and collaboration capabilities into areas that were seemingly beyond our reach previously.

Due to this progression of the commonplace, our expectations become higher and higher over time. At some point in our past it was unthinkable for a cellular phone to be able to surf the net and provide driving directions. But today, not only do we expect our mobile phone to give us the Internet at our fingertips, we also expect it to guide us where we need to go.

Because of these expanding expectations, the cloud must be intelligent as well. There will be corresponding pressure for devices to catch up to cloud computing as it becomes increasingly intelligent and more intuitive.

Hand-held devices are great examples of this. Smartphones have a multitude of functions in addition to communications, such as GPS, voice recorder, camera, game device, calculator and more. If a phone is paired with an operating system like Microsoft’s Windows Phone 7, it becomes a smart device capable of using cloud services to their full capabilities.

Because the cloud is built upon the capabilities of servers, it is appropriate to imagine large data centers when thinking of cloud computing. This means that server technology must advance as the cloud does, and there is a catch: cloud services will only become more powerful as server software does. In this way, server and cloud improvements mutually drive each other, and the user, whether an individual, organization or company, benefits greatly.

Once we tap into cloud computing fully, web sites will no longer crash because of surges in traffic; the cloud will accommodate peaks in computing activity accordingly.

For more information about the form and functionality of the cloud, visit Nubifer.com.

A Review of IBM’s Cloud Services

With more than 20,000 members and more than 200,000 processes currently modeled and documented, IBM’s new Blueworks Live offering unites process documentation and social community elements. IBM’s new cloud service provides a new ability to structure and automate ad-hoc processes quickly, effectively and efficiently.

IBM’s latest business process management cloud offering, Blueworks Live, offers businesses an effective way to acquire and use IT, backed by IBM’s reputation for security, reliability and integration. IDC predicts that public cloud services will grow at over five times the rate of traditional IT products. In 2009 worldwide revenue from public IT cloud services exceeded $16 billion, and that figure is expected to reach $55.5 billion by 2014 (representing a compound annual growth rate of 27.4%).

With IBM’s new business process management cloud offering, employees will be able to quickly improve simple processes like new marketing promotional campaigns, employee on-boarding, and sales quote approvals, gaining increased visibility, understanding, insight and control. With Blueworks Live, business users interact with their departmental colleagues and collaborate through a private and secure company work stream, choosing to easily follow any updates to roles, processes, etc. These are updated in a stream view similar to that of the popular social networks enabling managers and team members to instantly see the status of work in progress via built-in dashboards and reports.

Blueworks Live offers intuitive discovery and documentation capabilities for even the most complex processes. For example, one client, PRC, is using Blueworks Live’s capabilities as part of its integrated call center operations.

IBM WebSphere Decision Server
Automating decisions to streamline process design and execution and subsequently make better, quicker decisions is a key to improving business processes. IBM is adding a new technology to its market leadership in Business Rules Management Systems – WebSphere Decision Server. Among this decision management software’s capabilities is the ability to deliver more dynamic marketing promotions and pricing, better fraud detection and prevention and more refined risk assessments.

Joining SPSS Decision Management, this product builds on IBM’s growing decision management portfolio, allowing business users to set up industry-specific data for fast, efficient modeling, providing predictive analytics to business users.

By combining Business Rules Management technologies with WebSphere Decision Server, IBM enables organizations to detect and react to defined data patterns as they occur and to provide the appropriate decision response based on factors such as business policies, best practices and regulatory requirements.

IBM WebSphere Lombardi Edition
An easy-to-use business process management (BPM) offering for building and managing process applications with less time, money and risk in a unified platform, WebSphere Lombardi Edition’s graphical design makes it easy for process owners and business users to implement and improve their business processes.

Clients can gain the visibility to understand process bottlenecks and inefficiencies so they can be streamlined, with built-in real-time monitoring and analytics. WebSphere Lombardi Edition is tailored to business processes requiring a high degree of usage and collaboration, and its shared model architecture ensures collaboration between business and IT departments.

Squeezing the Most Out of Gmail

If you have moved from server based email systems and are using Gmail, it is important to make sure you are making the most out of Gmail.

Use Priority Inbox to Save Time
Do you know how much time you spend checking your email? Likely a lot! Gmail’s Priority Inbox helps you prioritize your email by identifying the messages that require your immediate attention, saving you a lot of time. Using a variety of signals to predict which messages are important, Gmail discovers which people you email most and which messages you open and reply to. Once you turn on and manage Priority Inbox in your mail Settings, the service will continue to get better and better the more you use it.

Seamless Chat, Video and Calling
Gmail knows that you work with people in multiple ways, and makes it easy to choose the most effective means of communication, whether it may be email, chat, text messaging, video chats or phone calls—which are all available from your inbox. Voice and video chat, for example, lets you have an actual conversation with someone or meet face-to-face in high resolution. Google also added the ability to call phones in Gmail, making it possible to make phone calls from your computer to any landline or mobile phone number.

Become More Attached to Your Email
Attachments in other email systems take up space, can be difficult to find and often make you open up another program to take action—slowing you down. Gmail alleviates this cumbersome burden by letting you quickly view attachments without needing to open or download them on client-side software. Google’s Docs Viewer lets you view .doc, .pdf, .ppt and other attachments in a new browser tab by clicking the “view” link at the bottom of a Gmail message. And what if you want to edit the file? Simply click “edit online” to open it in Google Docs or download it to your desktop.

Gmail also features a Google Docs preview, which lets you read the entire contents of a Google document, spreadsheet or presentation right in Gmail. (Your administrator needs to have enabled Labs for you to access it.)

Put Email in Context
With contextual gadgets, you can update a sales lead without even leaving your inbox. Contextual gadgets display information from social networks, business services, web applications and other systems, and allow you to interact with that data right within Gmail. Via the Google Apps Marketplace, your administrator or any third-party developer can build and distribute Gmail contextual gadgets to the domain with a few easy clicks.

Productivity Keys
Google built keyboard shortcuts into Gmail to help you sort through your email quickly and efficiently. After enabling this feature in Settings, you can archive (e), reply (r), compose (c), delete (#) or complete other actions with one key or a short combination. You can also print out the list of shortcuts and post it at your desk.

Experiment in Google Labs
Gmail Labs gives you, the user, features to customize Gmail in whatever way you want. Some Labs accommodate preferences (like adding a “Send & Archive” button), while others help you communicate (like the Google Voice player and SMS in Chat) or stay organized (like the Google Docs and Calendar gadgets).

For more information regarding Google Apps, and its efficiencies, contact a Nubifer representative today.

Zoho Corp. Adding an SMB Accounting Application: Zoho Books

Zoho Corp., a leader in Software as a Service business applications, announced Wednesday, January 19th, that it is adding an accounting application to its portfolio: Zoho Books. Over the past few years, more than 300,000 apps have been created on Zoho’s platform, and as Zoho evolves as a leading work-flow engine, it is introducing application integration with online payment gateways like Paypal, Google Checkout and Authorize.net.

Zoho offers SaaS applications and provides a wide, integrated portfolio of rich online applications for businesses. With 26 different applications spanning collaboration, business and productivity, Zoho helps businesses and organizations get work done. Zoho’s applications are delivered via the Internet, requiring nothing but a browser, so organizations can focus on their business while Zoho maintains the servers and keeps data safe.

Detailing Zoho Books

Zoho Books is an online accounting application that gives organizations complete visibility into their finances and helps manage the cash moving in and out of the business. Zoho describes its Books application as “accounting for the rest of us”; a primary selling point is that users need not be accountants to manage their business and make informed financial decisions.

Those interested can view Zoho’s Youtube video describing Zoho Books here.

Features of Zoho Books:

Money In
Get a clear picture of how much cash-flow your business is generating. Manage your customers and invoice them either online or by direct mail. Automate recurring invoices, payment reminders and payment thank-you notes.

Money Out
Manage and control expenses and cash flow. Record invoices and commitments for purchases, services and even reimbursable expenses, such as client travel. Keep track of outstanding balances with vendors.

Banking and Credit Cards
Record and monitor your bank and credit card transactions such as deposits, fund transfers, checks, expenses, credits and refunds.

Go Global
Transact globally with multi-currency capabilities. Record foreign currency invoices and  expenses.

Collaborate
Share accounting duties with anyone in your organization, but set different permissions for the employees you give access to.

Stay on Top of Your Business
Glance through the dashboard to know what’s going well with your business and what’s not. Make smart, quick business decisions with the help of Zoho’s insightful, available-anywhere reports.

Zoho Books integrates seamlessly with other Zoho applications. For example, users can import their contacts from Zoho CRM, view data from various modules in Zoho Sheet, etc. In particular, Zoho Invoice customers will be able to seamlessly migrate from Zoho Invoice to Zoho Books and go beyond invoicing to full-blown accounting without having to start over.

Zoho Books is also immediately available for Google Apps users through the Google Apps Marketplace.

Zoho Books is priced at $24/month (or $240/year with a 2 month discount). This includes access for 2 users. If you’d like to provide access for additional users, it’ll be an additional $5/user/month.
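For a rough sense of how that pricing scales, the small calculation below is based only on the figures quoted above (actual Zoho pricing and terms may differ):

```python
# Monthly Zoho Books cost using the figures quoted above:
# $24/month includes 2 users; each additional user is $5/user/month.
def monthly_cost(users, base=24, included_users=2, per_extra_user=5):
    extra_users = max(0, users - included_users)
    return base + extra_users * per_extra_user

for team_size in (2, 5, 10):
    print(f"{team_size} users: ${monthly_cost(team_size)}/month")
# 2 users: $24, 5 users: $39, 10 users: $64
```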

For more information on Zoho Books or any other Zoho application contact a Nubifer representative today.

Cloud’s Little Helpers: 12 Companies to Watch in 2011

Article reposted from HPC in the Cloud Online Magazine; originally posted on Dec. 14th, 2010:

2010 has been an incredible year for cloud computing in general and an even more exciting year for HPC and cloud. This is due, in part, to an increasing number of offerings designed to make high-performance computing applications perform better, flow with more streamlined management and make better use of the elastic resources that have become available.

As the end of the year approaches, it seemed like a great time to look back on some companies that shaped the HPC cloud ecosystem as a whole, as well as to give a holiday “heads up” on some companies to keep an eye on in the coming year. There’s no way to put together a list that encompasses everything, but here are a few honorable mentions.

Amazon EC2

This year Amazon took the world by storm with the announcement of services focused on HPC clusters. Cluster Compute and Cluster GPU instances have been specifically engineered to provide high-performance network capability, allowing applications to get the low-latency network performance required for tightly coupled, node-to-node communication. Finally, it seems that affordable, flexible and elastic services have arrived for the HPC community.

Adaptive Computing

Computing, and in particular cloud computing, is really all about the software and how to make the cloud work for you and not against you as a user. Adaptive has been around since the mid-1990s (formerly known as Cluster Resources) and provides intelligent automation software for data center, cloud, and high-performance computing environments. The company’s software solutions, powered by Moab, deliver policy-based governance that allows customers to consolidate and virtualize resources, allocate and manage applications, optimize service levels, and reduce operational costs. These services have allowed many users to get the most out of their cloud infrastructure.

Nubifer

Here’s a name that might be new to some of you. Nubifer’s mission revolves around making (and keeping) the cloud simple, with a series of cloud programs and services that enable users to easily configure and create cloud-based services. One aspect of the company’s appeal is its personalized, tailored architecture accessible from any web-enabled device, reflecting its technology-agnostic approach.

Clustercorp

Clustercorp has an impressive sound bite: “Over 10,000 datacenters are powered by Rocks worldwide.”

Rocks+ is a complete cluster and cloud operating environment. Rocks+ can be used with Amazon’s EC2 to power large-scale enterprise data and HPC workloads. Rocks+ creates a single computing resource from multiple clustered systems, and removing that complexity drives down costs.

Whamcloud

First, what a great name; it’s not easy to forget. Whamcloud is basically picking up Lustre where Sun left off. The company provides vendor-neutral solutions for Lustre 1.6 and beyond, with years of experience developing Lustre features for high-performance computing. Roughly 50% of the TOP500 fastest computers are powered by Lustre.

Cloud.com

Yet another great name that’s certainly not easy to forget….

Cloud.com’s approach to cloud computing is to help organizations quickly and easily build, manage, and deploy private and public clouds. Extending beyond individual virtual machine images running on commodity hardware, the Cloud.com CloudStack provides an integrated software solution for delivering virtual data centers as a service.

With the CloudStack’s secure cloud architecture, administrators can ensure that the memory, CPU, network and storage allocated to individual virtual datacenter deployments are isolated from one end user to another. That certainly addresses one of cloud computing’s big challenges: security.

Microsoft

With many of the traditional big vendors reducing or even eliminating their spending in HPC markets, Microsoft seems to be increasing its own. By pushing Azure and Azure services, Microsoft’s cloud services vision starts to become a reality as the company continues to tout its proclaimed devotion to bringing high-performance computing to the masses.

Platform Computing

It is all about software management services here, and many from traditional HPC have at least heard the name. After all, it’s the software that makes the hardware work. The good news is that the world is recognizing that software and software management have been a missing link in the evolution of cloud computing. Platform has a rich set of cluster workload management software, has clearly targeted the HPC community, and will likely continue building on its long legacy in HPC this year with more advancements for HPC cloud users.

Mellanox

With a broad array of system interconnects, Mellanox provides the fabric, or glue, that connects all the pieces together: Ethernet to InfiniBand, CPUs to storage, adapter cards to switches. Mellanox has what can only be described as a “veritable smorgasbord” of interconnect products for high-performance computing.

Rightscale

Rightscale offers a pay-as-you-go cloud computing model that is very attractive to small- and mid-size businesses as well as HPC users, for the simple reason that it reduces capital expenditures and provides economies of scale not possible with the traditional datacenter model. Rightscale also provides a simple way to leverage Amazon’s EC2 platform, which is the top IaaS choice for many scientific and large-scale enterprise applications.

BlueArc

In 2009 the amount of digital content created and stored grew by 62 percent over the previous year, which had already been higher than any year on record. By the end of this decade the amount of data to be stored and created will be 44 times bigger than it was in 2009. This explosive growth in digital content, particularly unstructured content, has changed the rules of the game for businesses of all types. HPC is a huge creator and consumer of data, and more and more of it is unstructured. With BlueArc, not only do you get support for both structured and unstructured data, but you also get high availability, manageability and high performance.

Virident Systems

Is it conceivable that the HPC user community is ready for solid-state storage solutions? The answer is yes. Solid state has been around for 30 or so years in the HPC/supercomputing community, offered by vendors such as Cray Research in the first half of the 1980s. Now SSD, based on NAND Flash memory, is back with a vengeance in several form factors, either as HDD replacements or, more impressively, as storage in the PCIe form factor. tachIOn from Virident provides a Tier 0 solution for high-performance computing workloads; the goal is to eliminate the all-too-common I/O bottleneck.

The Public Sector Cloud Model

With technological innovations in security, storage, networking and connectivity making cloud infrastructure increasingly cost effective, cloud computing is becoming more prevalent in enterprise IT environments. Cloud service brokers are quickly adopting these new technologies and are looking to deliver reliable, scalable, cost-efficient options to their customers.

The concept of ‘shared compute resources’ has existed for a while, with the industry full of ideas to eliminate the need for the desktop and for computer sprawl in data centers, most of them centering on hosted applications. Hosted applications can be accessed from any place using an Internet-connected device, but recently a new paradigm of similar hosted computing has come forth: create compute power in the cloud and make it available to anyone, while simultaneously hiding all of the complexity of managing it.

Cloud computing can be used not only as a vehicle for quicker service deployment and delivery in enterprises, but can aid governments as well, because the combined scale, sprawl and complexity of government-sector IT requires a simpler solution. Governments commonly reach out to widely dispersed geographies, people and sectors, which have different agendas, varying Internet connectivity, different scale requirements, applications of different complexity and other variables.

Because of this, governments have been maintaining IT environments of their own, and their ability to reach people and deploy applications is limited by their capacity to build more data-centers.

A cloud platform may be an effective option for the public sector because it can provide a scalable way of building and operating the computing infrastructure governments need. The cloud’s increased availability makes it possible for a government to reach people on a broader scale, while also simplifying the maintenance requirements of its own in-house IT environments.

Compute Resource Distribution
In order to guarantee that compute resources are readily available for various departments, governments usually require large geo-located deployments of IT infrastructure. In the past, this was accomplished by distributing and allocating IT budgets within siloed departments, making it difficult for governments to track and control the expenditures various departments make in their disparate IT ecosystems.

Lower investment in IT means lower automation of processes and subsequently lower quality of service, but this can be changed by provisioning IT infrastructure on a public cloud platform. Cloud infrastructures can help ensure that the IT needs of each department are met in the form of computing capacity rather than budgets.

Provisioning
A user’s scale of usage dictates deeper discounts on platform pricing, though not necessarily greater efficiency in provisioning. Governments are essentially buying IT solutions in bulk—which is why cloud computing is able to address the provisioning challenge of governments’ IT needs. Governments should readily consider centralized cloud deployments with quick provisioning of computing power.

In anticipation of providing better access to information and services for their people, most government entities are aiming to distribute compute resources to as many sectors of the country as possible. The time to deliver a service currently depends on factors like bottlenecks, availability and processes, but cloud computing can shift governments’ focus to extending the reach of IT applications and information.

Standards in Regulation
It is necessary for governments to ensure that complex regulatory frameworks are implemented and followed in their IT environments. A large portion of these regulatory needs are followed through by IT departments today, and regulatory controls are executed through IT policies. Most often, security and governance are dependent on individual or standardized procedural controls—and the cloud can facilitate the shift from procedural controls to standards.

Managing Information Availability
Governments’ focus is on dispersing meaningful information to their citizens and their various departments, and cloud computing can help facilitate this focus. Governments will be able to scale to unforeseen new heights with a renewed focus on information dissemination.

Essentially, shifting the priority from managing infrastructure to managing information can drive social change, and the cloud is positioned to make this a reality for government organizations.

For more information regarding the Cloud Computing’s role in the public sector, visit Nubifer.com.

Start Me Up….Cloud Tools Help Companies Accelerate the Adoption of Cloud Computing

Article reposted from HPC in the Cloud Online Magazine. Article originally posted on Nov. 29, 2010:

For decision makers looking to maximize their impact on the business, cloud computing offers a myriad of benefits. At a time when cloud computing is still being defined, companies are actively researching how to take advantage of these new technology innovations for business automation, infrastructure reduction, and strategic utility based software solutions.

When leveraging “the cloud”, organizations can have on-demand access to a pool of computing resources that can instantly scale as demands change. This means IT — or even business users — can start new projects with minimal effort or interaction and only pay for the amount of IT resources they end up using.

The most basic division in cloud computing is between private and public clouds. Private clouds operate either within an organization’s DMZ or as managed compute resources operated for the client’s sole use by a third-party platform provider. Public clouds let multiple users segment resources from a collection of data-centers in order to satisfy their business needs. Resources readily available from the Cloud include:

● Software-as-a-Service (SaaS): Provides users with business applications run off-site by an application provider. Security patches, upgrades and performance enhancements are the application provider’s responsibility.

● Platform-as-a-Service (PaaS): Platform providers offer a development environment with tools to aid programmers in creating new or updated applications, without having to own the software or servers.

● Infrastructure-as-a-Service (IaaS): Offers processing power, storage and bandwidth as utility services, similar to an electric utility model. The advantage is greater flexibility, scalability and interoperability with an organization’s legacy systems.

Many Platforms and Services to Choose From:

Cloud computing is still in its infancy, with a host of platform and application providers serving up a plethora of Internet-based services ranging from scalable on-demand applications to data storage services to spam filtering. In the current IT environment, organizations’ technology ecosystems have to operate cloud-based services individually, but cloud integration specialists and ISVs (independent software vendors) are becoming more prevalent and readily available to build on top of these emerging and powerful platforms.

Mashing together services provided by the world’s largest and best-funded companies like Microsoft, Google, Salesforce.com, Rackspace, Oracle, IBM, HP and many others gives companies an opportunity to take hold and innovate, and to build a competitive, cost-saving cloud of their own on the backs of these software giants’ evolving view of the cloud.

Cloud computing comes into focus only when you think about what IT always needs: a way to increase capacity or add capabilities on the fly without investing in new infrastructure, training new personnel, or licensing and maintaining new software. Cloud computing encompasses any subscription-based or pay-for-what-you-use service that extends your IT environment’s existing capabilities.

Before deciding whether an application is destined for the cloud, analyze your current cost of ownership. Examine more than just the original licenses; factor in ongoing expenses for maintenance, power, personnel and facilities. To start, many organizations build an internal private cloud for application development and testing, and decide from there whether it is cost-effective to scale fully into a public cloud environment.
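As a rough illustration of that kind of analysis, the sketch below compares a simple three-year total cost of ownership for an on-premise deployment against a pay-for-what-you-use cloud subscription. Every figure and category name is a hypothetical placeholder rather than a benchmark; the point is only that ongoing expenses (maintenance, power, personnel, facilities) belong in the comparison alongside the original licenses.

    # Hypothetical three-year TCO comparison: on-premise vs. cloud subscription.
    # Every figure below is a made-up placeholder; substitute your own estimates.
    YEARS = 3

    on_prem = {
        "server_hardware": 40_000,             # up-front capital expense
        "software_licenses": 25_000,           # up-front licenses
        "annual_maintenance": 8_000,           # support contracts, patching
        "annual_power_and_facilities": 6_000,
        "annual_personnel": 30_000,            # share of admin staff time
    }

    cloud = {
        "annual_subscription": 36_000,         # pay-for-what-you-use estimate
        "annual_personnel": 10_000,            # reduced admin overhead
        "migration_one_time": 15_000,          # pilot program and data migration
    }

    def total_cost(costs, years):
        """Sum one-time items once and 'annual_' items once per year."""
        total = 0
        for name, amount in costs.items():
            total += amount * years if name.startswith("annual_") else amount
        return total

    if __name__ == "__main__":
        op, cl = total_cost(on_prem, YEARS), total_cost(cloud, YEARS)
        print(f"{YEARS}-year on-premise TCO: ${op:,}")
        print(f"{YEARS}-year cloud TCO:      ${cl:,}")
        winner = "cloud" if cl < op else "on-premise"
        print(f"The {winner} option saves ${abs(op - cl):,} over {YEARS} years")

Run against real numbers gathered during a pilot program, a comparison like this makes the go/no-go decision far less abstract.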

“Bridging the Whitespace” between Cloud Applications

One company, Nubifer.com (which in Latin translates to ‘bringing the clouds’), approaches simplifying the move to the Cloud for its enterprise clients by leveraging a proprietary set of Cloud tools named Nubifer Cloud:Portal, Cloud:Connector and Cloud:Link. Nubifer’s approach with Cloud:Portal enables the rapid development of “enterprise cloud mash-ups”, providing rich dashboards for authentication, single sign-on and identity management. This increased functionality offers simple administration of accounts spanning multiple SaaS systems, and the ability to augment and quickly integrate popular cloud applications. Cloud:Connector seamlessly integrates data management and data sync services, and enables highly available data interchange between platforms and applications. And Cloud:Link provides rich dashboards for analytics and monitoring metrics, improving system governance and audit trails of various SLAs (Service Level Agreements).

As a Cloud computing accelerator, Nubifer focuses on aiding enterprise companies in the adoption of emerging SaaS and PaaS platforms. Our recommended approach to an initial Cloud migration is to institute a “pilot program” tailored around your platform(s) of choice in order to fully iron out any integration issues that may arise prior to a complete roll-out.

Nubifer’s set of Cloud Tools can be hosted on Windows Azure, Amazon EC2 or Google AppEngine. The scalability offered by these Cloud platforms promotes an increased level of interoperability and availability, and a significantly lower financial barrier to entry than historically seen with on-prem application platforms.

Cloud computing’s many flavors of services and offerings can be daunting at first review, but if you take a close look at the top providers’ offerings, you will see an ever-increasing road map for on-boarding your existing or new applications to “the cloud”. Taking the first step is easy, and companies like Nubifer that provide the platform services and the partner networks to aid your goals are resourced and very eager to support your efforts.

10 Compelling Reasons to Choose Microsoft Dynamics CRM 2011

The beta version of Microsoft Dynamics CRM 2011 was launched earlier this fall, generating buzz among industry analysts due to its major enhancements. Among the new features are a next-generation Microsoft Outlook client, Microsoft Office contextual CRM Ribbon for Office navigation and user experience, user personalization and role-tailored design.

Specifically architected for both cloud and on-prem deployments, the new software is Microsoft’s most robust attempt yet at gaining traction in the Customer Relationship Management (CRM) market.

Following are some improved features of Microsoft Dynamics CRM 2011:

1.     Advanced User Personalization Capabilities
Users can now configure their workplaces to meet their unique roles and informational needs. Personalizing a workspace means that users can set the default pane and tab that display when they open Microsoft Dynamics CRM Online. You can now customize which links appear in the workplace view, how many records appear in lists, how numbers and dates display, and language settings. Users can also combine these personalized features with new dashboards, creating personalized dashboards for default viewing.

2.     Integration with SharePoint and Microsoft Dynamics NAV
This latest version of Microsoft Dynamics illustrates Microsoft’s desire to offer out-of-the-box integration with two of its key products, SharePoint and Microsoft Dynamics NAV. Microsoft Dynamics CRM integrates with SharePoint Server’s document management through contextual document repositories and will also integrate with Microsoft Dynamics NAV 2009 R2 (set to arrive in the next few months). In conjunction with each other, these two products will increase productivity by enhancing interaction between front and back office applications.

3.     Business Intelligence Functionality
Microsoft Dynamics CRM 2011’s new real-time dashboards offer advanced business intelligence functionality that is more intuitive. Users are able to speedily configure multiple dashboards to monitor business performance, for example, and can set up dashboards for individual or shared use. The dashboards can include in-line charts with drill-down intelligence to visually navigate data, identify trends and uncover new insights.

4.     Seamless Integration with Microsoft Office
With Microsoft Dynamics CRM 2011 comes a new Office 2010 contextual ribbon for Microsoft Dynamics CRM Online and Microsoft Dynamics CRM browser clients, which delivers a consistent, familiar navigation and user experience. This allows Dynamics CRM users to take advantage of native Outlook functionality such as previews and conditional formatting. With the new release, users can highlight and flag CRM records (like with an Outlook email) and the reading pane grants readers an instant view of a record without having to open up a new screen.

5.     Interactive Process Dialogs
Dialogs aid users in the collection and processing of information using step-by-step scripts. Companies can use dialogs to increase performance and versatility by incorporating advanced work-flow logic, which calls automated tasks using the responses a customer or user makes during a dialog script.

6.     Improved Configuration Capabilities
Key features include custom activities and communications, data auditing, field-level security, tailored form experience and improved knowledge base.

7.      Cloud Development and Deployment
With Microsoft Dynamics CRM 2011, developers can take advantage of Windows Azure to develop and deploy custom code for Microsoft Dynamics CRM Online using tools like Visual Studio. Developers can incorporate Microsoft Silverlight, Windows Communication Foundation and .NET Language Integrated Query (LINQ) into their cloud solutions using Microsoft .NET Framework 4.0.

8.     Role-Based Forms and Views
Forms and views in Microsoft Dynamics CRM are based on user roles; this role-tailored design ensures that users have speedy access to the relevant information they need, while simultaneously preventing users from accessing data that they aren’t authorized to view.

9.     Microsoft Dynamics Marketplace
The Microsoft Dynamics Marketplace is an online solutions catalog that helps developers accelerate and extend their Microsoft Dynamics CRM Online implementations. It is fully integrated with Microsoft Dynamics CRM, enabling customers to access the Marketplace from within Microsoft Dynamics CRM to search for applications and connect with Microsoft Registered Partners.

10.  Customizing and Sharing
With Dynamics CRM, Microsoft introduces what the vendor calls “solutions”: ways to save customizations and share them with others. Users can create a solution or import an app created by a developer outside the organization; a managed solution can only be edited by a specific user while an un-managed solution can be edited by any user with an appropriate role. A solution is able to have version numbering, relationships with entities and other components and security features based on user roles.

Microsoft Dynamics CRM 2011 offers a powerful suite of Business Intelligence capabilities that will help any organization streamline its contact management processes.

For more information about CRM consulting services offered by Nubifer, visit www.nubifer.com

BPOS to be Enhanced with Office Web Apps

Although the software giant has yet to reveal a specific timeline for the integration, Microsoft announced plans in October to add Office Web Apps to its hosted Business Productivity Online Suite (BPOS). This integration will give Microsoft a much-needed edge and keep BPOS ahead of rivals like Google Apps, which offers office productivity applications as part of its broader cloud-based collaboration and communication suite.

Described by Microsoft officials as “online companions” to Word, Excel, PowerPoint and OneNote, Office Web Apps offers hosted versions of Microsoft’s Word, Excel, PowerPoint and OneNote that feature the usability found in the on-premise Microsoft Office suite. The software company says it is aiming to let users “access, view and edit” documents via the Internet.

With about 20 million users, Office Web Apps is currently available free for individual consumers as part of the Windows Live online services. Office Web Apps is also a component of the free Live@EDU collaboration and communication suite for educational institutions. Office Web Apps can also be accessed by organizations that own the on-premise versions of Office 2010 and SharePoint 2010.

It has been widely reported that the absence of Office Web Apps from BPOS has not hindered the adoption of that collaboration and communication suite for businesses (which features Exchange Online, Office SharePoint Online and Microsoft Office Live Meeting).

According to industry analysts, BPOS licenses have more than tripled since the start of 2010, but it is unknown how many BPOS seats have been sold overall. Microsoft stated recently that there are 40 million paid seats of Microsoft Online Services, of which BPOS is a part. In October, Microsoft announced a number of big customer wins for BPOS, such as DuPont (58,000 end users), Volvo (18,000 end users), Australia’s Spotless Group, Godiva and Sunoco.

Industry analysts have observed that the familiarity of Microsoft’s software interfaces and tools (because it is present in many enterprises), as well as the links between Microsoft’s cloud and on-premise software, will be an advantage for the company.

Gartner explains, “I’d expect to see a growing opportunity for companies looking to move to a more cost-effective collaboration environment to consider Microsoft in the mix because of its experience in delivering enterprise collaboration.”

Analysts have also seen that Microsoft’s sales teams are being aggressive about spreading the word about BPOS and promoting it as part of enterprise contract renewals. A Gartner analyst has been quoted as saying, “Microsoft has tapped a deep root of demand for cloud services with BPOS.”

Additionally, Microsoft announced new customers, including several California State University schools, the University of Montana, Northern Kentucky University, the College of DuPage, Washington University in St. Louis and Aston University in the U.K., for Live@EDU. Live@EDU now features more than 10,000 academic institutions with over 11 million end users. Live@EDU includes Office Web Apps, Windows Live SkyDrive and Outlook Live.

For more information regarding BPOS, contact a Nubifer representative today. Nubifer is a Microsoft Certified Partner.

Zoho CRM Adds QuickBooks and Telephony Integration

Zoho Corp., a leader in Software as a Service business applications, announced Wednesday, December 8th, that its ‘Zoho CRM’ offering now allows users to leverage QuickBooks software and telephony integration. Over the past few years, Zoho has had over 300,000 apps created on its platform, and as Zoho CRM evolves into a leading work-flow engine, the company is introducing two key modules to Zoho CRM: QuickBooks and Telephony integration.

Zoho offers SaaS applications and provides a wide, integrated portfolio of rich online applications for businesses. With more than 20 different applications spanning Collaboration, Business and Productivity, Zoho helps businesses and organizations get work done. Zoho’s applications are delivered via the Internet, requiring nothing but a browser, enabling organizations to focus on their business while leveraging Zoho in order to maintain the servers and keep data safe.

Zoho CRM for QuickBooks

Zoho’s most recent CRM integration will help users sync information between their Zoho CRM and QuickBooks applications. This update enables a business’s customer and inventory data to be synced between these two leading-edge officing and productivity systems. As QuickBooks doesn’t offer a per-user model, this add-on can be licensed for the entire organization for $25/month.

Key features of the QuickBooks update for Zoho CRM include:

  • Zoho CRM for QuickBooks syncs the Contacts, Vendors, Products, Quotes, Invoices and Sales Orders modules.
  • Users can choose to import data from both systems or sync data automatically
  • Users have the option to choose which system gets priority when there is conflicting data (see the sketch after this list)
  • Users have options to map data fields between Zoho CRM & QuickBooks
  • Zoho CRM integrates with on-premise versions of QuickBooks Premier from 2008 to 2010 and also Simple Start 2008
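To make the conflict-priority and field-mapping options above concrete, here is a minimal sketch of how a two-way contact sync might resolve them. This is not Zoho’s or Intuit’s actual API; the field names, record shapes and the priority setting are hypothetical, and a real synchronization runs through each product’s own connectors.

    # Hypothetical two-way contact sync with a configurable conflict priority.
    # Field names and the PRIORITY setting are illustrative only.

    PRIORITY = "crm"   # which system "wins" when the same field differs on both sides

    # Hypothetical mapping between CRM field names and QuickBooks field names.
    FIELD_MAP = {
        "contact_name": "DisplayName",
        "email": "PrimaryEmailAddr",
        "phone": "PrimaryPhone",
    }

    def merge(crm_record, qb_record):
        """Return a merged record, resolving conflicts by the configured priority."""
        merged = {}
        for crm_field, qb_field in FIELD_MAP.items():
            crm_value = crm_record.get(crm_field)
            qb_value = qb_record.get(qb_field)
            if crm_value == qb_value or qb_value is None:
                merged[crm_field] = crm_value
            elif crm_value is None:
                merged[crm_field] = qb_value
            else:  # genuine conflict: defer to the priority system
                merged[crm_field] = crm_value if PRIORITY == "crm" else qb_value
        return merged

    if __name__ == "__main__":
        crm = {"contact_name": "Acme Corp", "email": "sales@acme.example", "phone": None}
        qb = {"DisplayName": "Acme Corporation",
              "PrimaryEmailAddr": "sales@acme.example",
              "PrimaryPhone": "555-0100"}
        print(merge(crm, qb))
        # -> {'contact_name': 'Acme Corp', 'email': 'sales@acme.example', 'phone': '555-0100'}

The design point is simply that missing values flow in from whichever system has them, while true conflicts are settled once, by policy, instead of by whichever sync ran last.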

Zoho PhoneBridge – Telephony Integration

As the name suggests, PhoneBridge connects your telephone system (PBX) with Zoho CRM and allows you to interact with your CRM account during all your inbound and outbound calls. This add-on connects data from Zoho CRM with the telephony system. For incoming and outgoing calls on your telephone, Zoho CRM can pull up the caller’s information, if it is available in the CRM system, and display it in the app, allowing you to log information for the contact.

Where is it used? Consider the case of a call center. Cold calling, telemarketing, telesales, customer care and customer support are the typical operations of call centers, and all of them can leverage data from the CRM system during a call.
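As a rough sketch of what such a screen pop involves, the snippet below matches an incoming caller ID against a contact store and surfaces whatever the CRM knows about the caller. This is not the PhoneBridge API itself; the in-memory contact dictionary and function names are hypothetical stand-ins for the PBX event and the CRM lookup the add-on performs.

    # Hypothetical screen-pop: look up an incoming caller in a CRM-style contact store.
    # The in-memory dictionary stands in for the real CRM lookup the add-on performs.

    CONTACTS = {
        "+15550100": {"name": "Jane Doe", "account": "Acme Corp", "open_tickets": 2},
        "+15550101": {"name": "John Roe", "account": "Globex", "open_tickets": 0},
    }

    def normalize(number):
        """Keep only digits with a leading '+', so '555-0100' and '+1 (555) 0100' compare equal."""
        digits = "".join(ch for ch in number if ch.isdigit())
        return "+" + digits

    def on_incoming_call(caller_id):
        """Return the caller's CRM details, or a stub record for an unknown caller."""
        contact = CONTACTS.get(normalize(caller_id))
        if contact is None:
            return {"name": "Unknown caller", "caller_id": caller_id}
        return contact

    if __name__ == "__main__":
        print(on_incoming_call("+1 (555) 0100"))   # known contact -> Jane Doe's record
        print(on_incoming_call("555-9999"))        # unknown -> stub record for logging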

This feature is available immediately. The module is priced at $6/user/month after a 15-day trial.

Contact a Nubifer representative to discover how Zoho CRM can work for your business.


Gartner Discovers 10% of IT Budgets Devoted to Cloud Computing

A recent survey conducted by Gartner reports that companies spend approximately 10 percent of their budget for external IT services on cloud computing research, migrations and implementations.

Gartner conducted the survey from April to July 2010, surveying CIOs across 40 countries, discovering that nearly 40% of respondents allocated IT budget to cloud computing. Almost 45% of the CIOs and other senior IT decision makers questioned about general IT spending trends provided answers pertaining to cloud computing and its increased adoption rates.

Among the questions asked were how organizations’ budgets for cloud computing were distributed. Detailing the results, a Research Director at Gartner noted that, “One-third of the spending on cloud computing is a continuation from the previous budget year, a further third is incremental spending that is new to the budget, and 14 percent is spending that was diverted from a different budget category in the previous year.”

Organizations polled in Europe, Asia, the Middle East, Africa and North America spent between 40 and 50 percent of their cloud budget on cloud services from external providers. The survey also discovered that almost half of respondents with a cloud computing budget planned to ramp up the use of cloud services from external platform providers.

According to Gartner analysts, the survey results demonstrated a “shift towards the ‘utility’ approach for non-core services, and increased investment in core functionality, often closely aligned with competitive differentiation.”

Additionally, more than 40% of respondents anticipated an increase in spending in private cloud implementations designed for internal or restricted use of the enterprise, compared to a third of those polled seeking to implement public clouds.

Gartner called the investment trends for cloud computing “healthy” as a whole. Said Gartner, “This is yet another trend that indicates a shift in spending from traditional IT assets such as the data-center assets and a move towards assets that are accessed in the cloud. It is bad news for technology providers and IT service firms that are not investing and gearing up to deliver these new services seeing an increased demand by buyers.”

Discussing the findings, Chad Collins, CEO of Nubifer Cloud Computing said, “This survey supports what we are seeing at ground zero when working with our enterprise clients. Company executives are asking themselves why they should continue with business as usual, doling out up-front cap/ex investments while supporting all the risks associated with large scale IT implementations.” Collins elaborates, “Cloud platforms allow these organizations to eliminate risks and upfront investments while gaining greater interoperability and scalability features.”

Collins went on to add, “Forward-thinking organizations realize that by using external providers and cloud computing models, they can gain more flexibility in the cost and management of their application base, while also getting the elasticity and scalability needed for growth.”

To learn more about adopting a cloud platform and how your organization can realize the benefits of cloud computing technologies, contact a Nubifer representative today.

Zoho Launches “Zoho Support” for Cloud Based Customer Support

The Zoho Family welcomed a new addition on November 10: Zoho Support. The web-based help desk software helps organizations to easily manage and respond to perpetually-increasing customer support inquiries arriving via a variety of channels, from phone and email to website and self-service portals. In our current era of real-time communication and instant gratification, customers have come to expect punctual answers to their questions.

Zoho Support provides this with its innovative web interface, which allows technicians to prioritize customer tickets, locate the correct response, and promptly respond to the customer.

Often the fastest way for customers to get support is via self-service, and Zoho Support boasts features that allow members of the support organization to publish solutions to known issues in a customer portal. This enables customers to solve their problem before contacting your staff. Customers can also submit and track tickets in the customer portal.

Zoho Support is comprised of the following modules, or tabs:

  • Requests: The Requests tab aids technicians in prioritizing the support requests coming in from customers. Support requests can be viewed in a variety of ways, for example, requests that are assigned to me or requests that are unread.
    A support manager may want to see which requests are overdue and must be addressed immediately. It is often hard to determine programmatically (i.e. through an email or a form) whether a request is high-priority or who it should go to, so the Requests module helps organizations triage support requests and direct them to the appropriate team.
  • Accounts & Contacts: This tab provides a view into the support operation through a customer perspective and, most importantly, lets organizations define, track and enforce the specific SLAs they may have agreed to with their customers. All support requests are unique; two that look identical may have different priorities depending on the SLA promised to each individual customer (see the sketch after this list).
  • Reports & Dashboards: This tab provides a quick view into how your support organization is doing, so you can quickly obtain the data needed to make the best decisions for both your business and your customers. Which products are getting the most support requests? Which incidents occur most often? Which support group (or rep) has the slowest response time?
  • Tasks: This tab serves as a simplified to-do list for a support rep—covering either external activities, like responding to a customer with a particular solution or diagnosis, or internal ones, like research or trying a new approach. This helps everyone on the support team remain organized and gives the support manager a simplified view of what everyone is working on.
  • Solutions Database: This module allows support reps to create and publish (internally or externally) solutions to the most common customer issues. When published externally, a solution is automatically available in the customer portal, where it can be easily located by customers.
  • Workflow, API and more: This includes automating tasks, assignments and alerts with workflow rules; integrating with other systems via the API; maintaining details of your catalogue (SKUs, release dates, support windows); and more.
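To make the SLA point above concrete, here is a small sketch showing how two requests opened at the same moment can end up with different priorities once each customer’s promised response window is taken into account. The SLA table and ticket fields are hypothetical illustrations, not Zoho Support internals.

    # Hypothetical SLA-aware triage: order open requests by how close each one is
    # to breaching the first-response time promised to that customer.
    from datetime import datetime, timedelta

    # Promised first-response windows, per customer (illustrative values).
    SLA_HOURS = {"Acme Corp": 4, "Globex": 24}
    DEFAULT_SLA_HOURS = 48

    def time_to_breach(ticket, now):
        """How long until this ticket breaches its customer's SLA (negative = overdue)."""
        window = timedelta(hours=SLA_HOURS.get(ticket["customer"], DEFAULT_SLA_HOURS))
        return ticket["opened_at"] + window - now

    if __name__ == "__main__":
        now = datetime(2010, 12, 1, 12, 0)
        tickets = [
            {"id": 101, "customer": "Globex", "opened_at": datetime(2010, 12, 1, 9, 0)},
            {"id": 102, "customer": "Acme Corp", "opened_at": datetime(2010, 12, 1, 9, 0)},
        ]
        # Two tickets opened at the same time sort differently because of their SLAs.
        for t in sorted(tickets, key=lambda t: time_to_breach(t, now)):
            print(f"Ticket {t['id']} ({t['customer']}): {time_to_breach(t, now)} until SLA breach")

In this toy run the Acme Corp ticket jumps to the top of the queue even though both tickets arrived at the same time, which is exactly the behavior the per-customer SLA tracking is meant to enforce.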

Zoho Support is already integrated with Zoho CRM, so you can smoothly transition from selling to supporting. Zoho Chat is also integrated, so your support agents can easily find what they are looking for not only in the solutions database but from their colleagues over IM while they’re on the phone with a customer.

Zoho Support is available immediately via paid plans starting at $12 per agent per month for enterprises and large support organizations. Unlimited-user plans for smaller companies that process up to 200 tickets per day are also available, and Zoho Support additionally offers a free plan.

For more information regarding Zoho’s suite of officing applications visit www.nubifer.com.

Emerging Trends in Cloud Computing

Due to its reputation as a game-changing technology set, Cloud Computing is a hot topic when discussing emerging technology trends. Cloud Computing is defined by the National Institute of Standards and Technology (NIST) “as a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.”

IT optimization has largely been the reason for the early adoption of Cloud Computing in “Global 2000” enterprises, with the early drivers being cost savings and faster infrastructure provisioning. A December 2009 Forrester Report indicated that over 70% of IT budget is spent on maintaining current IT infrastructure rather than adding new capabilities. Because of this, organizations are seeking to adopt a Cloud Computing model for their enterprise applications in order to better utilize the infrastructure investments.

Several such organizations currently have data center consolidation and virtualization initiatives underway and look to Cloud Computing as a natural progression of those initiatives. Enterprise private cloud solutions add capabilities such as self-service, automation and charge back over the virtualized infrastructure and thus make infrastructure provisioning quicker, helping to improve over-all utilization. Additionally, some of these organizations have begun to try public cloud solutions as a new infrastructure sourcing option.

IT spending of “Global 2000” enterprises makes up less than 5% of their revenues, thus optimizing IT isn’t going to impact their top or bottom line. In the current economic state, IT optimization is a good reason for these large enterprises to begin looking at Cloud Computing. So what is the true “disruptive” potential of Cloud Computing? It lies in the way it is going to aid these large enterprises in reinventing themselves and their business models in order to rise to the challenge of an evolving business landscape.

Social Networking Clouds and e-Commerce

Worldwide e-Commerce transactions will be worth over $16 trillion by 2013, and by 2012 over 50% of all adult Internet users in the U.S. will be using social networks. Currently, 49% of web users make a purchase based on a recommendation gleaned from social media. This increased adoption of social media makes it easier for consumers to remain connected and get opinions on products and services. Essentially, the consumer has already made up their mind about a product before even getting to the website or store. This is causing major changes in consumer marketing and B2C business models. The relationship used to be between the enterprise and the consumer, but it has now deepened to encompass the consumer’s entire community.

Large enterprises can no longer afford to rely on “websites” or “brick-and-mortar stores” alone if they want to remain relevant and ensure customer loyalty—they need to provide online, cloud-hosted platforms that constantly engage consumers along with their social communities. That way, consumers incorporate the enterprise’s business services into their day-to-day lives. When Gen Y consumers reach the market, for example, “community driven” social commerce may well replace traditional “website based” e-commerce. Enterprises need to begin building such next-generation, industry-specific service platforms for the domains they operate in, in anticipation of this.

Computing’s Pervasiveness

One half of the world population—roughly 3.3 billion—have active mobile devices, and the increased use of these hand held devices is altering the expectations of consumers when it comes to the availability of services. Consumers expect that the products and services should be available to them whenever they need the service, wherever they are, through innovative applications, the kinds of applications that can be better delivered through the cloud model.

The number of smart devices is expected to reach one trillion by 2011, due to increasing adoption of technologies like wireless sensors, wearable computing, RFIDs and more. This will lead to significant changes in the way consumers use technology, as future consumers will be used to (and be expecting) more intelligent products and services such as intelligent buildings that conserve energy and intelligent transportation systems that can make decisions based on real-time traffic information. An entirely new set of innovative products and services based on such pervasive computing will need to be created for the future generation.

Service providers will look to increase customer loyalty by providing more offerings, better services and deeper relationships as products and services become commoditized. Several industry leaders are increasingly adopting open innovation models, thereby creating business clouds supported by an ecosystem of partners, in order to increase their portfolio of offerings and innovate faster. A new generation of applications must be created as Cloud Computing becomes more pervasive with the increased adoption of smart devices.

To gain a competitive edge, reduce CAPEX on infrastructure and maintenance, and take advantage of powerful SaaS technologies offered in the Cloud, companies need to build their next-generation business cloud platforms in order to better manage the scale of information.

To learn more about Cloud Computing and how companies can adopt and interoperate with the cloud, visit Nubifer.com.

Department of Defense And Cloud Security Management

Migrating Department of Defense applications to public cloud platforms operated outside of the Department of Defense DMZ typically raises concerns about the efficacy of security protocols. Currently, DoD data-centers rely on fire-walled barriers that are designed to prohibit interactions with those outside of the perimeter. The effectiveness of these safeguards can be argued on a number of levels. The DoD contracts out the management of much of its data, meaning those in charge of that data are neither military nor civilian employees.

Regardless of this outsourcing, the transference of compute resources to third party platform providers will be subjected to stringent security guidelines. What may be viewed as a minor security incident could result in a revocation of security certification for the cloud services provider.

High-level DoD executives realize that cloud computing offers a significant opportunity for cost savings and scalability, as well as fail-safe features that offer advantages over the current DISA data-centers. Decision makers are now asking whether the externalization of the DoD workload to a public cloud would cause a degradation in network security, and whether governmental auditors would reject a public cloud because they cannot fully guarantee security. The fact is that many public cloud offerings provide the same level of data security, obfuscation and redundancy that is offered in the DoD’s internal data-centers.

DoD data-centers lock up server farms as well as associated power inside a physical structure in order to gain security. Additional controls installed include:

– Perimeter firewalls
– Demilitarized zones (DMZ) for isolating incoming transactions
– Network segmentation
– Intrusion detection devices and software for monitoring compliance with security protocols

Currently, there are a plethora of companies selling hardware devices and software packages claiming to increase data-center security. But as security threats rise, data-center management teams keep adding disparate security management devices, increasing not only operating costs but also the delays incurred as transactions make their way through multiple security barriers.

The accumulation of these disparate security features only increases the vulnerability of systems and adds potential security loopholes. Each data-center will ultimately have security measures that are unique to its individual situation, and they are therefore not amenable to coordinated and standardized oversight.

Cloud platform providers gain from the benefits of virtualization. Virtual machines from multiple tenants are co-hosted on physical resources without any cross-referencing that could jeopardize security. This makes virtualization the key technology enabling the migration of applications into a cloud environment, where security is provided by the hypervisor that controls each separate virtual machine. A standardized third-party security appliance can be connected to this hypervisor, allowing consistent security services to be delivered to every virtual machine even if they run on differing operating systems.

Users must stop viewing protection of applications at the data-center or server level as the basis for achieving security. Instead, we have to view each individual virtual computer, with its own operating system and its own application, as fully equipped to benefit from standardized security services.

A data-center may encompass thousands of virtual machines. Cloud security will be achieved by protecting virtual computers through the hypervisor on which they operate. This way, every virtual machine can be assigned a subset of security protocols that carries its protection safeguards as well as its security criteria. Take, for example, moving a virtual machine from a DISA data-center to the cloud: the security of the relocated virtual machine will not be compromised. Multi-tenancy of diverse applications from varied sources is now feasible, since the cloud can run diverse applications in separate security enclosures, each with its own customized security policies.
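A minimal sketch of that idea, assuming hypothetical policy names and hosts: each virtual machine carries its own security profile, and migrating the VM to another host or cloud carries the profile along unchanged, so the enclosure’s rules do not depend on which data-center happens to run it. This is an illustration of the concept, not a real hypervisor or security-appliance API.

    # Hypothetical per-VM security profiles that travel with the virtual machine.
    # Policy names, hosts and the VM class are illustrative, not a real hypervisor API.
    from dataclasses import dataclass

    @dataclass
    class SecurityProfile:
        firewall_rules: list
        encryption_at_rest: bool
        audit_logging: bool

    @dataclass
    class VirtualMachine:
        name: str
        host: str
        profile: SecurityProfile

        def migrate(self, new_host):
            """Move the VM; its security profile moves with it unchanged."""
            print(f"{self.name}: {self.host} -> {new_host} (profile unchanged)")
            self.host = new_host

    if __name__ == "__main__":
        vm = VirtualMachine(
            name="logistics-app",
            host="disa-datacenter-01",
            profile=SecurityProfile(
                firewall_rules=["allow-https-from-dod", "deny-all-other-inbound"],
                encryption_at_rest=True,
                audit_logging=True,
            ),
        )
        vm.migrate("commercial-cloud-east")   # same enclosure rules apply on the new host
        print(vm.profile)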

In a cloud environment the addition of a new application is simplified. Integration with security measures can be instant and seamless because a hypervisor already supports your current security protocols. And if a virtual machine can port its own security measures when migrating from one cloud to another, these integration efforts can be further reduced.

In Summation
Security services for a cloud environment can now be pooled and standardized to support a large number of virtual machines. Such pooled services can be managed to give DoD data-centers vastly improved shared security awareness.

The overall management and monitoring of enterprise-wide security will still remain an intensive task. However, compared with the current diversity in security methods, the transfer of applications onto a cloud platform will further reduce costs and simplify the administration of security.

Whether the Department of Defense can efficiently implement its own private cloud, or whether it will have to rely on commercial cloud providers, remains to be seen. The DoD could rely on commercial firms for most cloud computing services while retaining direct oversight over security. This could be accomplished by managing all security appliances and policies from DoD Network Control Centers staffed by internal DoD personnel.

For more information regarding security of Cloud platforms and how the government is approaching Cloud Computing and Software-as-a-Service, visit Nubifer.com.

Microsoft Announces Office 365

Announced October 19th, 2010, Office 365 is the software giant’s next cloud productivity offering, bringing together Microsoft Office, SharePoint Online, Exchange Online and Lync Online in an “always-on” software- and platform-as-a-service. Office 365 makes it simpler for organizations to get and use Microsoft’s highly acclaimed business productivity solutions via the cloud.

With the Office 365 cloud offering, users can now work from anywhere on any device with Internet connectivity, while collaborating with others inside and outside their enterprise in a secure and interoperable fashion. As part of the launch announcement, the Redmond-based software company is opening a pilot beta program for Office 365 in 13 regions and countries.

Microsoft relied on years of experience when architecting Office 365, having delivered industry-acclaimed enterprise cloud services ranging from the first browser-based e-mail to today’s Business Productivity Online Suite, Microsoft Office Live Small Business and Live@edu. Adopting the Office 365 cloud platform means Microsoft users don’t have to alter the way they work, because Office 365 works with the most prevalent browsers, smartphone handsets and desktop applications people use today.

Office 365 developers worked in close association with existing customers to develop this cloud offering, resulting in a platform that is designed to meet a wide array of user needs:

“Office 365 is the best of everything we know about productivity, all in a single cloud service,” said Kurt DelBene, president of the Office Division at Microsoft. “With Office 365, your local bakery can get enterprise-caliber software and services for the first time, while a multinational pharmaceutical company can reduce costs and more easily stay current with the latest innovations. People can focus on their business, while we and our partners take care of the technology.”

With Office 365 for small businesses, professionals and small companies with fewer than 25 employees can be up and running with Office Web Apps, Exchange Online, SharePoint Online, Lync Online and an external website in just 15 minutes, for $6 per user, per month.

Microsoft Office 365 for the enterprise introduces a wide range of choices for mid-size and large organizations, as well as for governmental entities, starting at $2 per user, per month for basic e-mail. Office 365 for the enterprise also includes the option to receive Microsoft Office Professional Plus on a pay-as-you-use basis. For less than $25 per user, per month, organizations can get Office Professional Plus along with webmail, voicemail, business social networking, instant messaging, Web portals, extranets, voice-conferencing, video-conferencing, web-conferencing, 24×7 phone support, on-premises licenses, and more.

Office 365 is creating new growth opportunities for Microsoft and its partners by reaching more customers and types of users and meeting more IT needs — all while reducing the financial burden for its customers.

Product Availability

Office 365 will be available worldwide in 2011. Starting today, Microsoft will begin testing Office 365 with a few thousand organizations in 13 countries and regions, with the beta expanding to include more organizations as the platform matures. Office 365 will be generally available in over 40 countries and regions next year.

Towards the end of next year, Microsoft Office 365 will offer Dynamics CRM Online in order to provide its complete business productivity experience to organizations of all varieties and scales. Additionally, Office 365 for education will debut later next year, giving students, faculty and school employees powerful technology tailored specifically to their needs.

On October 19th at noon PDT, Microsoft will launch http://www.Office365.com. Customers and partners can sign up for the Office 365 beta and learn more at that site, or follow Office 365 on Twitter (@Office365), Facebook (Office 365), or the new Office 365 blog at http://community.office365.com to get the latest information.

Nubifer is a Microsoft Registered Partner with expertise in Office, Windows 7, BPOS and Windows Azure. Contact a representative today to learn how the Office 365 cloud platform can streamline your business processes, or visit www.nubifer.com and fill out our online questionnaire.

Protecting Data in the Cloud

When it comes to cloud computing, one of the major concerns is protecting the data being stored in the cloud. IT departments often lack the knowledge necessary to make informed decisions regarding the identification of sensitive data—which can cost an enterprise millions of dollars in legal costs and lost revenue.

The battle between encryption and tokenization was explored in a recent technology report, and the merits of both are being considered as securing data in the cloud becomes more and more important. Although the debate over which solution is best continues, it is ultimately good news that protection in cloud computing is available in the first place.

In the current business climate it is essential that data is secure both in storage and in transit (both inherent in cloud computing); the protection is necessary whether dealing with retail processing, accessing personal medical records or managing government information and financial activity. It is necessary to implement the correct security measures to protect sensitive information.

So what is tokenization? Tokenization is the process in which sensitive data is segmented into one or more pieces and replaced with non-sensitive values, or tokens, while the original data is stored encrypted elsewhere. When clients need access to the sensitive data, they typically provide the token along with authentication credentials to a service that then validates the credentials, decrypts the secure data, and provides it back to the client. Even though encryption is used, the client is never involved in either the encryption or decryption process, so encryption keys are never exchanged outside the token service. Tokens protect information like medical records, social security numbers and financial transactions, preventing unauthorized access.
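The sketch below illustrates this flow with a toy token vault: sensitive values are swapped for random tokens, the originals are kept encrypted on the service side, and detokenization requires both the token and a valid credential. The credential, key handling and the simple XOR-style cipher are deliberately minimal placeholders so the example stays self-contained; a production token service would use a vetted encryption library and proper key management.

    # Toy token vault illustrating tokenization: swap sensitive values for opaque
    # tokens and keep the originals encrypted on the service side only.
    # The XOR "cipher" below is a deliberately simple placeholder for a real
    # encryption library; do not use it to protect real data.
    import hashlib
    import secrets

    MASTER_KEY = b"demo-master-key"          # in practice: managed key, never hard-coded
    VALID_CREDENTIALS = {"api-key-123"}      # hypothetical client credential

    _vault = {}                              # token -> encrypted original value

    def _keystream_xor(data, key):
        """Placeholder cipher: XOR with a SHA-256-derived keystream (not production crypto)."""
        stream = b""
        counter = 0
        while len(stream) < len(data):
            stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
            counter += 1
        return bytes(a ^ b for a, b in zip(data, stream))

    def tokenize(sensitive_value):
        """Store the encrypted value and hand back a random, meaningless token."""
        token = secrets.token_urlsafe(16)
        _vault[token] = _keystream_xor(sensitive_value.encode(), MASTER_KEY)
        return token

    def detokenize(token, credential):
        """Return the original value only for a valid credential and a known token."""
        if credential not in VALID_CREDENTIALS:
            raise PermissionError("invalid credential")
        return _keystream_xor(_vault[token], MASTER_KEY).decode()

    if __name__ == "__main__":
        token = tokenize("123-45-6789")      # e.g. a social security number
        print("stored token:", token)        # safe to keep in the application database
        print("recovered:", detokenize(token, "api-key-123"))

The token itself carries no exploitable information, which is why it can circulate through applications and reports while the sensitive value stays locked inside the token service.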

Encryption, on the other hand, is the process of transforming information using an algorithm to ensure it is unreadable to anyone except those who possess a key or special knowledge. The military and government have been using this method for some time to make sure that their sensitive information remains in the hands of the right people and organizations.

Tokenization and encryption can both be applied to protect information used in the cloud. For organizations seeking to determine which method is a better fit for them, it is necessary to ask questions about the security of each method and whether one has more pros than the other. It is also necessary to clearly define the objectives of the business process.

A clear method of protecting information is essential if cloud computing is to deliver benefits for the enterprise; conversely, its absence can be an obstacle to launching a cloud computing strategy. Gartner reports that 85 percent of participants cited security as a key factor that could prevent them from launching cloud-based apps.

In conclusion, there is no clear winner in the debate over tokenization versus encryption. Rather, it depends on the goals of the business and how the company plans to manage the security of its sensitive information. The data needs to be protected in a way that is easily manageable when launching a cloud computing strategy—and it is only at this point that cloud computing can be both successful and secure. For more information regarding securing data in the cloud via tokenization, contact a Nubifer representative today.

Zoho Creator Adds Reporting & Scheduler Modules

Zoho Corp., a leader in Software as a Service business applications, announced Wednesday, October 6th, that its ‘Zoho Creator’ offering now allows users to create situational applications. Over the past few years, more than 300,000 apps have been created on this platform, and as Zoho Creator evolves into a leading work-flow engine, Zoho is introducing two key modules to Zoho Creator: Reports and Schedules.

Zoho offers SaaS applications and provides a wide, integrated portfolio of rich online applications for businesses. With more than 20 different applications spanning Collaboration, Business and Productivity, Zoho helps businesses and organizations get work done. Zoho’s applications are delivered via the Internet, requiring nothing but a browser, enabling organizations to focus on their business while leveraging Zoho in order to maintain the servers and keep data safe.

Reports Module
Zoho is introducing a powerful business intelligence module in Zoho Reports that lets users create different types of reports and pivot tables. This Reporting module is now integrated into Zoho Creator, allowing users to analyze the data they have in their application. Users are now able to:

  • Create dynamic reports based on the data contained in their Creator app
  • Generate Pivot Tables (including multi-level pivots) with a range of options
  • Filter & Sort data with a report builder interface
  • Embed and share reports with team members or on a website

Scheduler Module
The newly introduced scheduler module lets users create and schedule automated tasks. These tasks can be triggered by user input or at pre-set times and/or dates. There are three general schedule types:

  • Form Schedules let users configure actions to be executed based on any date/date-time field in a form
  • Report Schedules let users schedule periodic reports of data that has been added to their application
  • Custom Schedules give users the power to create and execute their own scripts

Reports Pricing

  • Two reports are available for free users and paid users with ‘Basic’ and ‘Standard’ plans.
  • Unlimited Reports are available for Paid users (Professional plans and above)

Scheduler Pricing

  • Scheduler module is available for all paid users. It includes 31 schedulers.
  • A 15 day trial version is available for free users.

These two modules are available for use now at http://www.zoho.com. For more information on Zoho’s suite of SaaS applications and migration best practices, please contact a Nubifer representative today. www.Nubifer.com -or- (800) 293 4496.

IBM’s Goals for Cloud Computing

With a reported $14 billion left over from its 2009 fiscal year, what is IBM going to do with the money? Industry experts suggest the company is expanding its ability to deliver enterprise cloud computing solutions and services.

Cloud services are expected to add $3 billion in net revenue by 2015, says IBM’s Vice President of Cloud Computing Walt Braeger. Although IBM won’t be able to do so without acquisitions, the company’s excess cash won’t be spent on a single acquisition, as it was when IBM purchased Cognos for $5 billion in 2007.

It’s been well documented that IBM has already spent a great deal on significant cloud computing acquisitions, most of which were made for less than $1 billion, although a few stray from that norm. IBM spent $1.4 billion on Sterling Commerce (formerly a unit of AT&T, focused on software that helps companies manage their channel relationships) earlier this year, for example. IBM also acquired Cast Iron Systems, whose software helps connect its cloud services to traditional and legacy software systems, for an undisclosed amount in May.

Cloud Computing and its Inner-workings
Because the business aspects of cloud computing are increasingly local, IBM will need to have a physical presence in many of the nations in which it hopes to build a customer base for its cloud services.

Braeger cited that IBM already has a multi-billion dollar investment in its service delivery centers, located across the globe. Because they require massive amounts of reliable bandwidth, cloud computing centers have to be located next to major Internet points of presence. IBM has not encountered bandwidth constraints as it grows its cloud computing business thus far, but that may change as focus turns to developing markets—where solutions will need to incorporate low-bandwidth mobile devices.

Organizations will utilize IBM’s cloud appliance service model, CloudBurst, in some cases. CloudBurst is a physical device that delivers a cloud, and is one of IBM’s many answers to the security fears of its customers.

CloudBurst, when initially conceptualized, was aimed at developers. Developers drive so much business value that the typical enterprise devotes 30 to 50 percent of its entire technology infrastructure to development and testing, yet all but ten percent of that infrastructure sits idle, IBM said, making the case for a scalable, flexible, interoperable cloud infrastructure.

Developers and a Cloud Infrastructure
According to Braeger, cloud projects in enterprise testing have delivered on all the hype surrounding the cloud, with an average Return On Investment (ROI) in just four months, making these projects extremely attractive to organizations that are constricting IT budgets.

These types of projects require a multitude of components to be in place before they can begin delivering: self-service access to resources, a detailed service catalog, and infrastructure ready and available for instant provisioning. Those have always proven to be pain points for developers and enterprise IT in the past. For example, developers dealt with bureaucracy when requesting a system on which they could test a new application. Now though, says Braeger, the situation has changed due to a cloud-based infrastructure.

Changing the model so drastically requires IT to revamp other functions beyond deploying resources. Organizations need to develop the capacity to charge other departments for use, for example, meaning departments must be willing to be metered.

These are unlikely to be major obstacles, though, at least according to Braeger. In many large enterprises, IT departments already have standardized accounting and auditing functions. Because modules that work well in a particular auditing system can be reused across a wide spectrum of customers, IBM’s job becomes easier.

Cloud Computing Living Up to the Hype
There is bound to be internal push-back, regardless of the early evidence of a quick return on investment, with so many major changes altering the daily work-flow. During his company’s annual meeting this year, IBM CEO Samuel Palmisano told shareholders that despite the turmoil, cloud technologies are slated to revolutionize how IT functions within the enterprise.

Braeger did admit that hype surrounding cloud computing is currently at what industry researcher Gartner would call a “peak of inflated expectations”, which is inevitably followed by a “trough of disillusionment”, but he highlighted the cloud’s demonstrated success in delivering a quick ROI and more efficient IT services as a reason not to be concerned.

Zoho CRM, Invoice & Projects now Integrate with Gmail

Zoho announced today that Zoho CRM, Invoice & Projects now integrate with Gmail through Contextual Gadgets. Gmail Contextual Gadgets are a way for users to integrate third-party applications into Gmail. When a user installs a contextual gadget in Gmail, the gadget shows up when that individual opens an email. The gadget can contain information pulled in from various third-party systems (e.g., Zoho CRM, Invoice and Projects) and display it contextually within that email.

Google announced this earlier this year, but Zoho unveiled today that they have created contextual gadgets for their CRM, Invoice & Projects applications.

Following are a few examples of tasks that can be accomplished leveraging Zoho’s Contextual Gadgets.

Zoho CRM

  • With a click of the mouse, users can search to see if the sender exists in their CRM system.
  • From within an email, users can add the sender to Zoho CRM as a Contact or a Lead without having to leave Gmail.
  • Users can add a Potential to a Contact and update it directly within the email message.
  • If the sender exists within the user’s CRM database, all details relating to the contact can be viewed within the email.
  • Users can add and view Tasks and Notes for the sender within the email.

Zoho Invoice

  • If the sender exists within the system, their information is pulled from Zoho Invoice and displayed within the email message.
  • All emails sent from Zoho Invoice to the specified user are listed within the gadget, creating an email history for each contact within the user’s database.
  • Users can view the sender’s invoices, with those marked ‘Unpaid’ or ‘Open’ displayed prominently.
  • Gadget users are now able to view payments received from the sender within the email.

Zoho Projects

  • Users can now create a new Project directly from the email, and share it with co-workers.
  • It’s now possible to transform an email into an actionable task in Zoho Projects and share it with the appropriate agent within your organization.
  • Users are now able to redirect the relevant contents of an email and make it an open forum post available for discussion.
  • Contextual Gadgets now make it simple to assign a task to any of your team members from within the email message.

If you would like more information regarding Zoho Projects Contextual Gadgets, visit Nubifer.com.

Zoho CRM, Invoice & Projects are already part of the Google Apps Marketplace, and are currently being leveraged by thousands of businesses using Google Apps.

A Closer Look at Microsoft’s Cloud Service Offerings

Although the Business Productivity Online Suite (BPOS) is a primary component of Microsoft’s cloud services, BPOS is not an all-encompassing definition of the cloud service suite; it is simply one compelling offering available.

A Closer Look at Microsoft’s Cloud Services

It is a common assumption that Microsoft is relatively new to offering cloud services, but Microsoft has been on a journey leading up to this point for 15 years, beginning back with Windows Live and Hotmail.

During that time, their services and offerings delivered online have continued to expand. Currently, a number of cloud-based solutions are available, enabling businesses and organizations to become more efficient and scalable. Here is an outline of Microsoft’s cloud offerings, and brief descriptions of their capabilities:

Windows Azure:
A flexible, familiar environment for creating applications and services in the cloud, which can shorten time to market and help you adapt to growing demand.

Windows Live ID:
Identity and authentication system provided by Windows Live. Lets you create universal sign-in credentials across diverse applications.

Microsoft SQL Azure:
Provides a highly scalable, multi-tenant database that doesn’t require installation, setup, patches or routine management.

Windows Intune:
Streamlines how businesses manage and secure PCs using Windows cloud services and Windows 7.

Microsoft Office Web Apps:
Offers online companions to Word, Excel, PowerPoint and OneNote, granting freedom to access, edit and share Microsoft Office documents from anywhere.

Microsoft Exchange Online:
Highly secure hosted email for your employees. Offers “anywhere access” and starts at just $4 per user per month.

Microsoft Office Live Meeting:
Provides real-time Web-hosted conferencing, enabling you to connect with colleagues and engage clients from wherever you’re located.

Microsoft Forefront Online Protection for Exchange:
Helps protect businesses’ inbound and outbound email from viruses, spam, phishing scams and email policy violations.

Microsoft SharePoint Online:
Gives your business a highly secure, central location in which employees can collaborate and share documents.

Microsoft Office Communication Online:
Delivers robust messaging functionality for real-time communication via text, voice and video.

Microsoft Dynamics CRM Online:
Helps you find, keep and grow business relationships by centralizing customer information and streamlining processes with a system that adapts to new demands quickly.

Microsoft Business Productivity Online Suite (BPOS):
Unites online versions of Microsoft’s messaging and collaboration solutions, including Exchange Online, SharePoint Online, Office Live Meeting and Office Communications Online.

Opting for Microsoft Online Services allows you to combine the power of rich desktop-based applications with the flexibility of fully hosted Internet services. This approach gives users an all-in-one integrated experience on the same applications they already know, with a consistent look and feel from any device, in any location.

To summarize, the opportunity that Microsoft’s cloud services offer is exciting, whether you are a partner or a business. It is important to utilize the resources outlined above to begin or continue your journey into what Microsoft’s cloud and online services can offer your enterprise.

For more information about Microsoft’s Cloud Solutions, contact a Nubifer representative today, or visit Nubifer.com.

Interoperability, Cap Ex and Cloud Computing

At a time when organizations are coming under increased pressure to cut operational costs, especially when it comes to technology budgets, cloud computing offers companies interoperability and a robust technology environment, which in turn can improve cost savings. Because cloud solutions, apps and platforms are Internet-based and share compute resources, delivering software and information to computers on demand lessens the financial burden traditional IT ecosystems place on the enterprise.

A global IT research firm recently completed their second global research project on IT outsourcing practice and attitudes. The researchers polled IT and business decision makers in the U.S., U.K. and Singapore and discovered that more than two-thirds of business decision makers say upgrading infrastructure with a lighter budget is the main challenge facing them this year. Additionally, more than three quarters of decision makers regard cost savings as their main strategic priority for the fiscal year.

The pressure is not lost on IT managers either, with nearly half of respondents citing a more cost-effective IT infrastructure as their main priority. Effectively managing and prioritizing IT demand comes next, followed by delivering faster data access across their organizations.

Reducing Your In-House IT Burden
Currently, public sector IT managers claim that only 7 percent of their infrastructure is outsourced, despite highlighting the need to reduce IT infrastructure costs as their key concern. The figure changes significantly in the private sector, where IT managers say almost 20 percent of their infrastructure is outsourced.

Owning and operating your own IT infrastructure has previously been viewed as something that differentiated large from medium businesses, as it could allow for greater flexibility in service design once hardware and equipment was provisioned with enough capacity. This is changing with the availability of high-powered virtual servers and virtualization technologies, in addition to the increased standardization of platforms and the ability of service providers to deliver leading-edge cloud services.

Only an exceptionally efficient enterprise can justify building and operating an IT infrastructure after a full audit of costs and impacts is taken—in both monetary and environmental terms. The ROI of maintaining infrastructure in-house rarely materializes, as replacement of machines outpaces cost efficiencies.

The Changing IT Environment
IT is heading down a path where a large-scale shift toward outsourced systems is occurring, and this shift is expected to accelerate into 2011 and 2012. Within the next 10 years, public sector respondents expect to have 64 percent of their organizational infrastructure under third-party management, approaching the assumed private sector level of nearly 67 percent.

The Adoption of Cloud Solutions
A large portion of this shift will result in applications moving to cloud environments where shared resources are provided to computers and other devices on demand. Currently, 59 percent of government and public sector IT teams are using or expect to use cloud for enterprise applications within five years, with the overwhelming majority expecting significant savings by switching to cloud-based infrastructure.

The primary differences between the cloud platforms are in the levels of service delivered. Both public clouds (in which resources are available for purchase by any organization) and private clouds (in which an organization shares a pool of resources amongst its divisions and partners) offer the same capabilities when it comes to rapid deployment, interoperability and scalability.

Although private clouds allow for simplified compliance auditing when the highest levels of security are required, only the modern generation of enterprise cloud services are able to offer the full range of security, application stack choice and service-level capabilities required for critical compute services.

Cloud computing offers the opportunity to do more by buying less, so the choice between cuts or the cloud is no longer an issue. IT teams and service managers need vision and the courage to drive the required changes by working differently.

Decision makers in public entities currently lag behind their private sector counterparts in their cloud adoption ratios, and this proves to be a key barrier to a more over-arching support of cloud platforms. Currently only 8 percent of public service managers claim they understand what cloud computing is and what benefits it can offer. This shows a clear need for more dialogue between the IT teams and the wider organizations on what is now possible with cloud computing.

On the other hand, service managers do see the potential for developing new ways of working based on a more flexible IT infrastructure. In fact, two-thirds of respondents agree they could change the way they plan for IT enhancements if they could reduce the cost of IT infrastructure.

This vision must apply to the regulatory environment, with virtualization being rapidly accredited for government use as an approved technology. It is important to learn from high security environments in the commercial world, in which virtualization is already accredited.

In order to benefit from the corresponding economic efficiencies, efforts must be accelerated as quickly as possible. Resources will be freed to deliver the services organizations need in a flexible, scalable and more sustainable way.

For more information about how we can implement an interoperable and cost-effective cloud solution, please visit nubifer.com.

Predicting, Building Toward and Defining the Future of Cloud Automation

Cloud computing is an outcome of efficient IT automation, and is a model that is only possible by standardizing core elements of computing and the automation of their operation. The cloud cannot be a self-sustaining platform without automation, nor can it scale to very large numbers of customers or systems.

As the modern IT landscape becomes more concentrated, new computing complexities begin to surface. Although this has occurred in the past with evolving programming languages, computer networks, software design architectures and system virtualization, IT automation has raised the bar on that concept more than any other.

By most industry estimations, we are only at an early stage in the grand scheme of operations automation, just the second or third of several unavoidable evolutionary stages in the growing capability of systems to stand on their own in a global IT ecosystem.

Organizational Automation
Organizational automation of server deployment is the first stage of automation. When the server is the unit of deployment, server automation is a natural fit. Each server hosts a single operating system, so building that OS image, formatted with the apps it should include, is an ideal way to streamline the operations of a single server.

The difficulty of this method is that it is hard to execute efficiently at large scale, because the system administrator is still burdened with making operational decisions on behalf of the application. How many servers is the ideal amount to deploy? Which types of servers should you add instances to in order to meet peak loads, and what should the time-frame be for doing that? The result is a significantly cumbersome, manual operations environment; most organizations Nubifer has consulted with at this stage and scale implement strategic capacity planning and erect a system sized for expected peak.

Application Implementation
The implementation of a sectioned distributed application, where the different data-sets of the application are targeted at a deployment location, is a significant upgrade over single server deployment. This type of automation essentially ensures that each set of data ends up where it’s supposed to be stored and that it’s configured correctly.

Standards in Source Code
We have also noticed that standardized operations code adds important functionality to simple distributed deployment automation by shifting capacity consumption based on application needs in real time. This type of scaling automation ensures that your organization pays only for what it uses.

Implementing Cloud Automation
Nubifer has noticed that modern scaling automation has one primary limitation: the way the health of the application is determined has to be built into the application and its OS images ahead of time. The developer has to determine what conditions to examine, what state requires an adjustment to scale, and which layers of the application are scaled in response. All of this has to be effectively architected before the application is deployed into your organization’s IT environment.
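As a concrete illustration of what must be decided up front, here is a minimal sketch in Python of the kind of scale rules a developer has to architect before deployment; the metric names, thresholds and application tiers are hypothetical.

from dataclasses import dataclass

# Each rule names the metric to watch, the threshold that triggers a change,
# and which application tier gains or loses instances. All values are
# placeholders chosen before the application ever ships.
@dataclass
class ScaleRule:
    metric: str        # e.g. "cpu_percent" or "queue_depth"
    threshold: float   # value that triggers the action
    direction: str     # "above" or "below"
    tier: str          # which application layer to scale
    delta: int         # instances to add (positive) or remove (negative)

RULES = [
    ScaleRule("cpu_percent", 75.0, "above", "web", +2),
    ScaleRule("cpu_percent", 20.0, "below", "web", -1),
    ScaleRule("queue_depth", 500,  "above", "worker", +1),
]

def evaluate(rules, metrics):
    # Return the instance-count changes implied by the current metric readings.
    changes = {}
    for r in rules:
        value = metrics.get(r.metric)
        if value is None:
            continue
        triggered = value > r.threshold if r.direction == "above" else value < r.threshold
        if triggered:
            changes[r.tier] = changes.get(r.tier, 0) + r.delta
    return changes

print(evaluate(RULES, {"cpu_percent": 82.0, "queue_depth": 120}))  # {'web': 2}

Everything in that rule table is fixed before the application ships, which is precisely the limitation the adaptive, behavior-learning approach described below aims to remove.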

Interop and Identity Management
The next logical step is leveraging the interoperability and intelligence of behavior-learning algorithms that enable cloud systems to receive a wide variety of monitoring data, then picking through that data to distinguish normal from abnormal behavior and to determine appropriate ways to react to any anomalies. This kind of learned behavior turns the application system into an adaptive system that becomes increasingly better at making efficient choices the longer the application is in production.

Even though the issue discussed above is a complicated one, getting it right will be exceedingly important, as successful adopters will continuously evolve their strategies for dealing with app performance, security and cost management.

Why Give Up Control?
You may be wondering why you would want to give up control over the operations of your key apps to an automation system. The reasoning lies in the same motivation for turning over your operating systems to virtual machines, your phone systems to managed service providers, or your compute resources to cloud environments: agility, interoperability, scalability and cost.

The Take-Aways
Companies that adopt one or more cloud models for a large percentage of their workloads will see key advantages over those that don’t, and cloud providers that adopt the best infrastructure and service automation systems will improve their chances in the marketplace. Visit Nubifer.com to learn more about the past, present and future states of cloud computing and to gain insights and key research into the field of cloud computing, software-as-a-service, platform and infrastructure-as-a-service.

Developing Cloud Applications: Pattern Usage and Workload Modeling

For enterprise companies today, the process of determining one or more common application usage profiles for use in cloud platform performance testing is known as ‘application workload modeling’. Cloud application workload modeling can be accomplished in a myriad of ways, and is a critical piece of properly planning, developing and implementing successful cloud solution technologies.

Some General Best Practices when Developing Cloud Applications.

  • Understand your application usage patterns. New business processes are prime candidates for building out such apps. Siloed departmental initiatives often evolve into organizational best practices that get adopted by the entire enterprise, and because most of these programs are developed organically from the ground up, they can leverage the interoperability of the cloud and be scaled depending on demand. This also allows the app to be discontinued with minimal cost if the initiative isn’t deemed efficient or necessary to the organization.

  • Develop and Deploy Your Application. Creating a plan and a sequence of key metric drivers helps you keep your cloud deployment efforts on track. Start small, grow fast is a common mantra of many start-ups (including ours), the overwhelming majority of which are intimidated by the significant cost of on-premise infrastructure.
  1. Define and Identify the objectives
  2. Document and Identify primary usage scenarios
  3. Develop and Determine navigation paths for key scenarios
  4. Design and Determine individual user data and variances
  5. Determine the likelihood of such scenarios
  6. Identify peak target load levels
  7. Prepare and Deploy the new cloud solution
  • Monitor Spiked Usage Patterns for “Common Utility Apps”. Within every organization, large or small, there’s at least one program or application that receives spiked usage during a certain time of the year, quarter or month. One example of this pattern is corporate tax software, which is lightly used for many months but becomes a highly leveraged application during the end-of-fiscal-year tax calculation process. Another example is Human Resource Information Systems (HRIS) and the periodic need for employees to subscribe to new company health plans, insurance plans, etc. Other examples include e-commerce websites like eBay and Buy.com, which experience this “peak load” requirement during holiday or special sales seasons.

The common thread across all of these types of “on-demand” cloud apps is that their usage rate is relatively standard or predictable most of the time, but they periodically become the most heavily demanded resources. Utilizing a scalable cloud solution in this manner enables greater cost savings and ensures high availability of your enterprise business systems.
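As a rough illustration of the workload-modeling steps listed above (identify scenarios, their likelihood, and a peak target load level), the short Python sketch below estimates the request rate a deployment must absorb; every figure in it is a hypothetical placeholder.

# Rough workload model: combine usage scenarios, the share of users exercising
# each one, and an assumed peak-to-average multiplier to estimate the request
# rate the cloud deployment must absorb. All numbers are placeholders.
scenarios = {
    # name: (requests per user per hour, fraction of users exercising it)
    "browse_reports":  (12, 0.70),
    "run_payroll":     (40, 0.05),
    "update_benefits": (6,  0.25),
}

active_users = 2_000      # expected concurrent users in steady state
peak_multiplier = 4       # e.g. fiscal year-end or open-enrollment spike

avg_rps = sum(rate * share for rate, share in scenarios.values()) * active_users / 3600
peak_rps = avg_rps * peak_multiplier

print(f"average load: {avg_rps:.0f} req/s, target peak: {peak_rps:.0f} req/s")

An estimate like this drives both capacity planning for the steady state and the auto-scaling thresholds discussed in the sections that follow.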

Application Load and Scalability, and Dynamically Reacting to Peak Load

As is most often the case with consumer-facing web apps, unpredictable load occurs when an inordinate amount of traffic is directed toward your site and the app is subsequently unable to meet the demand, causing the entire website to return a load error. Nubifer has noticed sudden spikes in traffic when organizations launch fresh marketing campaigns or receive extensive back-linking from prominent authority sites. Apps and sites eminently susceptible to these load spikes are ideal candidates for the cloud, and the most prominent advantage of this methodology is the auto-scale, or on-demand, capability.

Monitoring, a Primary Key to Any Successful Cloud Deployment

Your cloud platform monitors the patterns of Internet traffic and the utilization of the infrastructure, adding server resources if the traffic crosses your preset threshold. The extra servers can be safely deactivated once the traffic subsides and the environment isn’t so demanding. This creates an extremely cost-efficient use case for leveraging a cloud platform for app and site hosting.
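A minimal sketch of that monitor-and-scale loop follows, in Python; get_request_rate, add_server and remove_server are hypothetical placeholders for a provider's monitoring and provisioning calls. The cooldown keeps freshly added servers from being released the moment traffic dips.

import time

# Threshold-driven monitoring loop: add capacity when traffic crosses a preset
# threshold, release it once traffic subsides. Runs indefinitely, polling the
# monitoring metric once a minute. All thresholds are hypothetical.
SCALE_UP_AT = 800       # requests/sec that triggers a new server
SCALE_DOWN_AT = 300     # requests/sec below which servers can be released
MIN_SERVERS = 2
COOLDOWN_SECONDS = 300  # avoid flapping right after a scaling action

def autoscale(get_request_rate, add_server, remove_server):
    servers = MIN_SERVERS
    last_action = 0.0
    while True:
        rate = get_request_rate()
        now = time.time()
        if now - last_action >= COOLDOWN_SECONDS:
            if rate > SCALE_UP_AT:
                add_server()
                servers += 1
                last_action = now
            elif rate < SCALE_DOWN_AT and servers > MIN_SERVERS:
                remove_server()
                servers -= 1
                last_action = now
        time.sleep(60)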

In contrast to unpredictable load occurrences, e-commerce sites commonly experience predictable spikes in traffic. For instance, when Amazon launches pre-ordering for the next novel in Oprah’s book club, it prepares its infrastructure to handle these peak loads. Organizations of this size typically have a ballpark budget figure for the infrastructure cost because of its inherent predictability. There are many occurrences in the public sector that experience predictable bursts as well, such as electoral results and the examination of the latest census reports.

Understanding Application Usage Pattern Trends

Within your business, these patterns manifest during a virtual company meeting or the initiation of compulsory online training for all employees; the primary difference between this pattern of usage and the first is that there may not be a periodic recurrence of this particular pattern or spike in resource demand.

It’s paramount that your IT personnel remain cognizant of these peak load times, whether they are predictable or not, as this is a key element for effectively leveraging a cloud solution that offers support and business intelligence data regarding peak load and latency issues.

How We Have Evolved to Solve for Peak Load and Usage Monitoring

Nubifer has solved these business scenarios by developing a robust set of tools and monitoring applications for private and public clouds, named Nubifer Cloud:Link. To learn more about Cloud:Link and Nubifer’s approach to enterprise cloud monitoring, visit CloudLink.pro.

Google Apps Receives Federal Certification for Cloud Computing

On July 26, Google released a version of its hosted suite of applications that meets the primary federal IT security certification, a major leap forward in its push to drive cloud computing in the government. Nearly one year in the making, this new edition of Google Apps is the first portfolio of cloud applications to have received certification under the Federal Information Security Management Act (FISMA).

The government version of Google Apps has the same pricing and services as the premier edition, including Gmail, the Docs productivity site and the Talk instant-messaging application.

Google Business Development Executive David Mihalchik said to reporters, “We see the FISMA certification in the federal government environment as really the green light for federal agencies to move forward with the adoption of cloud computing for Google Apps.”

Federal CIO Vivek Kundra announced a broad initiative to embrace the cloud across the federal government last September, as a way to reduce both costs and inefficiencies of redundant and underused IT deployments. The launch of that campaign was accompanied by the launch of Apps.gov, an online storefront for vendors to showcase their cloud-based services to federal IT managers. Apps.gov was revealed at an event at NASA’s Ames Research Center attended by Google co-founder Sergey Brin. At the same time, Google announced plans to develop a version of its popular cloud-based services that would meet the federal government’s security requirements.

Mike Bradshaw, director of Google’s Federal Division, said, “We’re excited about this announcement and the benefits that cloud computing can bring to this market.” Bradshaw continued to say that “the President’s budget has identified the adoption of cloud computing in the federal government as a way to more efficiently use the billions of dollars spent on IT annually.” Bradshaw added that the government spends $45 million in electrical costs alone to run its data-centers and servers.

Security concerns are consistently cited by proponents of modernizing the federal IT apparatus as the largest barrier to the adoption of cloud computing. Google is including extra security features to make federal IT buyers at agencies with more stringent security requirements feel more at ease. These extra security features are in addition to the 1,500 pages of documentation that came with Google’s FISMA certification.

Google will store government cloud accounts on dedicated servers within its data centers that will be segregated from its equipment that houses consumer and business data. Additionally, Google has committed to only use servers located in the continental U.S. for government cloud accounts. Google’s premier edition commercial customers have their data stored on servers in both the U.S. and European Union.

Mihalchik explained that security was the leading priority from the get-go in developing Google Apps for Government saying, “We set out to send a signal to government customers that the cloud is ready for government.” Adding, “today we’ve done that with the FISMA certification, and also going beyond FISMA to meet some of the other specific security requirements of government customers.”

Thus far, Google has won government customers at the state and local levels, such as the cities of Los Angeles, California and Orlando, Florida. Mihalchik said that over one dozen federal agencies are in various stages of trialing or deploying elements of Google Apps, and that several agencies are using Google anti-spam and anti-virus products to filter their email. Others, like the Department of Energy, are running pilot programs to evaluate the full suite of Google Apps in comparison with competitors’ offerings.

Find out more about cloud security and FISMA certification of Google Apps by talking to a Nubifer Consultant today.

Zoho Sheet 2.0 launches on August 31st 2010, with support for Million Cell Spreadsheets

Zoho, an industry leader in cloud hosted officing software, announced today the launch of Zoho Sheet 2.0. Among the many added features of Zoho Sheet 2.0 is support for million-cell spreadsheets.

When a user logs in to Zoho Sheet 2.0, they will not notice much change visually, but there have been many performance improvements on the back-end. Frequent users of Zoho’s increasingly popular spreadsheet app will notice the performance and interoperability improvements instantly. Zoho significantly enhanced the back-end engine, allowing users of Zoho Sheet 2.0 to load large and complex spreadsheets with near-instant response times.

Zoho Sheet’s One Million Cell Spreadsheet

At Nubifer Inc., we are constantly working with extensive spreadsheets, and were all too familiar with constant freezes and over-consumption of local compute resources. This is no longer an issue for our teams, as Zoho Sheet is completely online, with all the heavy lifting done on the server side, keeping our client side agile and nimble.

With Zoho’s latest product update, subscribers can now create a million-cell spreadsheet. Zoho Sheet 2.0 supports up to 65,536 rows and 256 columns per worksheet, and up to one million cells per spreadsheet project. Supporting a million cells is an important feature, but maintaining efficient load times with large spreadsheets was the primary goal of Zoho Sheet 2.0. Waiting as long as five minutes for a very large spreadsheet to load is no longer an issue; it now opens almost instantly within your web browser. We at Nubifer encourage you to give it a test drive and witness for yourself how agile and responsive Zoho Sheet 2.0 is.

Here is an example embedded spreadsheet with 25,000 rows. The performance on the return is quite impressive.


In addition to the improved performance metrics, here are some other great features designed to aid functionality and work flow.

Chrome & Safari Browser Support

Zoho Sheet now officially supports Chrome 4+, Safari 4+, Firefox 2+ and IE 6+.

Some Additionally Impressive Improvements

  • Users can now directly input Chinese, Japanese & Korean characters without having to double-click on a cell.
  • Improved ‘Find’ functionality. Control+F will now bring up the ‘Find’ panel at the bottom of the spreadsheet with options to search within the row, column or sheet.
  • The ‘Undo’ and ‘Redo’ actions now work across the spreadsheet and are maintained on a per-user basis while collaborating with other users.
  • You can now set formats and styles on column, row, and sheet tiers.

Are you an existing user? If so, you probably won’t see many changes visually, but you will notice these enhancements when working with Zoho Sheet 2.0.

Zoho is tirelessly working on updates to their cloud-hosted officing applications. Some updates are cosmetic, adjusting look and feel, while others are performance based. The overwhelming majority of Zoho’s updates go under the hood; users may not notice anything visually, but these updates are significant and lay the groundwork for things to come.

For more information about Zoho Sheet, or other Zoho officing applications please visit Nubifer.com.

Understanding the Cloud with Nubifer Inc. CTO, Henry Chan

The overwhelming majority of cloud computing platforms consist of dependable services relayed via data centers and built on servers with varying tiers of virtualization capabilities. These services are available anywhere that allows access to the networking platform. Clouds often appear as single points of access for all of a subscriber’s enterprise computing needs. Commercial cloud platform offerings are expected to adhere to customers’ quality of service (QoS) requirements and typically offer service level agreements. Open standards are crucial to the expansion and acceptance of cloud computing, and open source software has laid the groundwork for many cloud platform implementations.

The article that follows is Nubifer Inc. CTO Henry Chan’s summarized view of what cloud computing means, its benefits and where it’s heading in the future:

Cloud computing explained:

The “cloud” in cloud computing refers to your network’s Internet connection. Cloud computing is essentially using the Internet to perform tasks like email hosting, data storage and document sharing which were traditionally hosted on premise.

Understanding the benefits of cloud computing:

Cloud computing’s myriad benefits depend on your organizational infrastructure needs. If your enterprise shares a large number of applications across multiple office locations, it would be beneficial to store those apps on a virtual server. Web-based application hosting can also save time for traveling employees who cannot connect back to the office, because they can access everything over a shared virtual private network (VPN).

Examples of cloud computing:

Hosted email (such as Gmail or Hotmail), online data back-up, online data storage, and any Software-as-a-Service (SaaS) application (such as a cloud hosted CRM from vendors like Salesforce, Zoho or Microsoft Dynamics) or accounting application are examples of applications that can be hosted in the cloud. By hosting these applications in the cloud, your business can benefit from the interoperability and scalability that cloud computing and SaaS services offer.

Safety in the cloud:

Although there are some concerns over the safety of cloud computing, the reality is that data stored in the cloud can be just as secure as the vast majority of data stored on your internal servers. The key is to implement the necessary solutions to ensure that the proper level of encryption is applied to your data while traveling to and from your cloud storage container, as well as when it is being stored. When designed properly, this can be as safe as any other solution you could implement locally. The leading cloud vendors all currently maintain compliance with Sarbanes-Oxley, SAS 70, FISMA and HIPAA.

Cloud computing for your enterprise:

To determine which layer of cloud computing is optimally suited for your organization, it is important to thoroughly evaluate your organizational goals as it relates to your IT ecosystem. Examine how you currently use technology, current challenges with technology, how your organization will evolve technologically in the years to come, and what scalability and interoperability will be required going forward. After a careful gap analysis of these determinants, you can decide what types of cloud-based solutions will be optimally suited for your organizational architecture.

Cloud computing, a hybrid solution:

The overwhelming trend in 2010 and 2011 is to move non-sensitive data and applications into the cloud while keeping trade secrets behind your enterprise firewall, as many organizations are not comfortable hosting all their applications and hardware in the cloud. The trick to making cloud computing work for your business is to understand which applications should be kept local and which would benefit most from leveraging the scalability and interoperability of the cloud ecosystem.

Will data be shared with other companies if it is hosted in the cloud:

Short answer: NO! Reputable SaaS and cloud vendors will make sure that your data is properly segmented according to the requirements of your industry.

Costs of cloud computing:

Leading cloud-based solutions charge a monthly fee for application usage and data storage, but you may already be making a comparable outlay, primarily in the form of hardware maintenance and software fees, some of which could be eliminated by moving to the cloud.

Cloud computing makes it easy for your company’s Human Resource software, payroll and CRM to co-mingle with your existing financial data, supply chain management and operations installation, while simultaneously reducing your capital requirements for these systems. Contact a Nubifer representative today to discover how leveraging the power of cloud computing can help your business excel.

Confidence in Cloud Computing Expected to Surge Economic Growth

The dynamic and flexible nature of cloud computing, software-as-a-service and platform-as-a-service may help organizations in their recovery from the current economic downturn, according to more than two thirds of IT decision makers who participated in a recent annual study by Vanson Bourne, an international research firm. Vanson Bourne surveyed over 600 IT and business decision makers across the United States, United Kingdom and Singapore. Of the countries sampled, Singapore is leading the shift to the cloud, with 76 percent of responding enterprises using some form of cloud computing. The U.S. follows with 66 percent, and the U.K. with 57 percent.

This two-year study of cloud computing reveals that IT decision makers are very confident in cloud computing’s ability to deliver within budget and offer CapEx savings. Commercial and public sector respondents also predict cloud use will help decrease overall IT budgets by an average of 15 percent, with some expecting savings of as much as 40 percent.

“Scalability, interoperability and pay-as-you-go elasticity are moving many of our clients toward cloud computing,” said Chad Collins, CEO at Nubifer Inc., a strategic Cloud and SaaS consulting firm. “However, it’s important, primarily for our enterprise clients, to work with a Cloud provider that not only delivers cost savings, but also effectively integrates technologies, applications and infrastructure on a global scale.”

A lack of access to IT capacity is clearly labeled as an obstacle to business progress, with 76 percent of business decision makers reporting they have been prevented from developing or piloting projects due to the cost or constraints within IT. For 55 percent of respondents, this remains an issue.

Confidence in the cloud continues to trend upward: 96 percent of IT decision makers are as confident or more confident in cloud computing being enterprise-ready now than they were in 2009. In addition, 70 percent of IT decision makers are using, or plan to be using, an enterprise-grade cloud solution within the next two years.

The ability to scale resources up and down in order to manage fluctuating business demand was the most cited benefit influencing cloud adoption in the U.S. (30 percent) and Singapore (42 percent). The top factor driving U.K. adoption is lower cost of total ownership (41 percent).

Security concerns remain a key barrier to cloud adoption, with 52 percent of respondents who do not leverage a cloud solution citing security of sensitive data as a concern. Yet 73 percent of all respondents want cloud providers to fully manage security or to fully manage security while allowing configuration change requests from the client.

Seventy-nine percent of IT decision makers see the cloud as a straightforward way to integrate with corporate systems. For more information on how to leverage a cloud solution inside your environment, contact a Nubifer.com representative today.

Taking a Closer Look at the Power of Microsoft Windows Azure AppFabric

Microsoft’s Windows Azure runs Windows applications and stores advanced applications, services and data in the cloud. This baseline understanding of Windows Azure, coupled with the practicality of using computers in the cloud, makes leveraging the acres of Internet-accessible servers on offer today an obvious choice, especially when the alternative of buying and maintaining your own space in data centers, and the hardware deployed to those data centers, can quickly become costly. For some applications, both code and data might live in the cloud, where the systems they use are managed and maintained by someone else. On-premise applications, which run inside an organization, might store data in the cloud or rely on other cloud infrastructure services. Ultimately, making use of the cloud’s capabilities provides a variety of advantages.

Windows Azure applications and on-premises applications can access the Windows Azure storage service using a RESTful approach. The storage service allows storing binary large objects (blobs), provides queues for communication between components of a Windows Azure application, and also offers a form of tables with a simple query language. The Windows Azure platform also provides SQL Azure for applications that need traditional relational storage. An application using the Windows Azure platform is free to use any combination of these storage options.
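Because the storage service is exposed over plain HTTP(S), any platform can reach it. The sketch below uses Python's standard library to write a block blob, assuming you already hold a pre-authorized (shared access signature) URL so that request signing stays out of scope; the account, container and blob names are placeholders.

import urllib.request

# Minimal sketch of writing a blob through the storage service's REST
# interface. The SAS query string below is a placeholder; in practice it is
# issued by the storage account owner.
sas_url = ("https://myaccount.blob.core.windows.net/"
           "mycontainer/hello.txt?sv=...&sig=...")

data = b"Hello from an on-premises application"
req = urllib.request.Request(sas_url, data=data, method="PUT")
req.add_header("x-ms-blob-type", "BlockBlob")   # create a block blob
req.add_header("Content-Type", "text/plain")

with urllib.request.urlopen(req) as resp:
    print(resp.status)   # 201 Created on success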

One obvious need between applications hosted in the cloud and hosted on-premise is communication between applications. Windows Azure AppFabric provides a Service Bus for bi-directional application connectivity and Access Control for federated claims-based access control.

Service Bus for Azure AppFabric

The primary feature of the Service Bus is message “relaying” to and from the Windows Azure cloud to your software running on-premise, bypassing any firewalls, network address translation (NAT) or other network obstacles. The Service Bus can also help negotiate direct connections between applications. Meanwhile, the Access Control feature provides a claims-based access control mechanism for applications, making federation easier to tackle and allowing your applications to trust identities provided by other systems.

A .NET developer SDK is available that simplifies integrating these services into your on-premises .NET applications. The SDK integrates seamlessly with Windows Communication Foundation (WCF) and other Microsoft technologies to build on pre-existing skill sets as much as possible. These SDKs have been designed to provide a first-class .NET developer experience, but it is important to point out that they each provide interfaces based on industry-standard protocols, making it possible for applications running on any platform to integrate with them through REST, SOAP and WS-* protocols.

SDKs for Java and Ruby are currently available for download. Combining them with the underlying Windows Azure platform service produces a powerful, cloud-based environment for developers.

Access Control for the Azure AppFabric

Over the last decade, the industry has been moving toward an identity solution based on claims. A claims-based identity model allows the common features of authentication and authorization to be factored out of your code, at which point such logic can then be centralized into external services that are written and maintained by subject matter experts in security and identity. This is beneficial to all parties involved.

Access Control is a cloud-based service that does exactly that. Rather than writing your own custom user account and role database, customers can let AC orchestrate the authentication and most of the user authorization. With a single code base in your application, customers can authorize access to both enterprise clients and simple clients. Enterprise clients can leverage ADFS V2 to allow users to authenticate using their Active Directory logon credentials, while simple clients can establish a shared secret with AC to authenticate directly with AC.

The extensibility of Access Control allows for easy integration of authentication and authorization through many identity providers without the need for refactoring code. As Access Control evolves, support for authentication against Facebook Connect, Google Accounts, and Windows Live ID can be quickly added to an application. To reiterate: over time, it will be easy to authorize access to more and more users without having to change the code base.

When using AC, the user must obtain a security token from AC in order to log in; this token is similar to a signed email message from AC to your service with a set of claims about the user’s identity. AC doesn’t issue a token unless the user first provides his or her identity by either authenticating with AC directly or by presenting a security token from another trusted issuer (such as ADFS) that has authenticated that user. So by the time the user presents a token to the service, assuming it is validated, it is safe to trust the claims in the token and begin processing the user’s request.

Single sign-on is easier to achieve under this model, so a customer’s service is no longer responsible for:

• Authenticating users
• Storing user accounts and passwords
• Calling to enterprise directories to look up user identity details
• Integrating with identity systems from other platforms or companies
• Delegation of authentication (a.k.a. federation) with other security realms

Under this model, a customer’s service can make identity-related decisions based on claims about the user made by a trusted issuer like AC. This could be anything from simple service personalization with the user’s first name, to authorizing the user to access higher-valued features and resources in the customer’s service.
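The pattern is easiest to see in miniature. The Python sketch below (standard library only; the claim names and token layout are illustrative, not AC's actual wire format) shows an issuer signing a set of claims with a shared secret and a service refusing to trust those claims unless the signature checks out.

import base64
import hashlib
import hmac
import json

# General pattern behind a claims-based token: the issuer signs a set of
# claims with a secret established in advance, and the service checks the
# signature before trusting them. Illustrative format only.
SHARED_SECRET = b"secret-established-with-the-issuer"

def issue_token(claims: dict) -> str:
    body = base64.urlsafe_b64encode(json.dumps(claims).encode())
    sig = hmac.new(SHARED_SECRET, body, hashlib.sha256).hexdigest()
    return body.decode() + "." + sig

def validate_token(token: str) -> dict:
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SHARED_SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("signature check failed; do not trust these claims")
    return json.loads(base64.urlsafe_b64decode(body))

token = issue_token({"name": "alice", "role": "premium-user"})
claims = validate_token(token)
if claims["role"] == "premium-user":
    print("authorize access to higher-valued features for", claims["name"])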

Standards

Because single sign-on and claims-based identity have been evolving since 2000, there are myriad ways of doing it. There are competing standards for token formats, as well as competing standards for the protocols used to request those tokens and send them to services. This is what makes AC so useful: over time, as it evolves to support a broader range of these standards, your service will benefit from broader access to clients without having to know the details of those standards, much less worry about trying to implement them correctly.

Security Assertion Markup Language (SAML) was the first standard. SAML specified an XML format for tokens (SAML tokens) in addition to protocols for performing Web app/service single sign-on (the SAML protocol suite is sometimes referred to inside Microsoft as SAMLP). WS-Federation and the related WS-* specifications also define a set of protocols for Web app/service single sign-on, but they do not restrict the token format to SAML, although that is in practice the most common format used today.

To Summarize

The Service Bus and Access Control constituents of the Windows Azure platform provide key building-block services that are vital for building cloud-based or cloud-aware applications. Service Bus enables customers to connect existing on-premises applications with new investments being built for the cloud. Those cloud assets will be able to easily communicate with on-premises services through the network-traversal capabilities provided by the Service Bus relay.

Overall, the Windows Azure platform represents a comprehensive Microsoft strategy designed to make it easy for Microsoft developers to realize the opportunities inherent to cloud computing. The Service Bus and Access Control offer a key component of the platform strategy, designed specifically to aid .NET developers in making the transition to the cloud. These services provide cloud-centric building blocks and infrastructure in the areas of secure application connectivity and federated access control.

For more information on the Service Bus & Access Control, please contact a Nubifer representative or visit these Microsoft sponsored links:

• An Introduction to Windows Azure platform AppFabric for Developers
o http://go.microsoft.com/fwlink/?LinkID=150833

• A Developer’s Guide to Service Bus in Windows Azure platform AppFabric
o http://go.microsoft.com/fwlink/?LinkID=150834

• A Developer’s Guide to Access Control in Windows Azure platform AppFabric
o http://go.microsoft.com/fwlink/?LinkID=150835

• Windows Azure platform
o http://www.microsoft.com/windowsazure/

• Service Bus and Access Control portal
o http://netservices.azure.com/

Two Kinds of Cloud Agility

CIO.com’s Bernard Golden defines cloud agility and provides examples of how cloud computing fosters business agility in the following article.

Although agility is commonly described as a key benefit of cloud computing, there are two types of agility that are real, but one of them packs more of a punch.

First, however, it is important to define cloud agility. Cloud agility is tied to the rapid provisioning of computer resources. In typical IT shops, provisioning new compute instances or storage can take weeks (or even months), but the same process takes just minutes in cloud environments.

Work is able to commence at a rapid pace due to the dramatic shortening of the provisioning timeframe. For example, in a cloud environment there is no submitting a request for computing resources and then waiting anxiously for a fulfillment response via email. Agility can be defined as “the power of moving quickly and easily; nimbleness,” and in this light it is clear why rapid provisioning is commonly described as advancing agility.

It is at this point that the definition of agility becomes confusing, as people often conflate both engineering resource availability and business response to changing conditions or opportunity under agility.

While both types of agility are useful, business response to changing conditions or opportunity will prove to be the more compelling type of agility. It will also come to be seen as the real agility associated with cloud computing.

The issue with the first type of agility (engineering resource availability), however, is that it is a local optimization: it makes a portion of internal IT processes more agile, but it doesn’t necessarily shorten the overall application supply chain, which extends from initial prototype to production rollout.

It is, in fact, very common for cloud agility to enable developers and QA to begin their work more quickly, but for the overall delivery time to stay the same, stretched by slow handover to operations, extended shakedown time in the new production environment and poor coordination with release to the business units.

Additionally, if cloud computing comes to be seen as an internal IT optimization, with little effect on the timeliness of compute capability rolling out into mainline business processes, IT may never receive the business unit support it requires to fund the shift to cloud computing. What may happen is that cloud computing will end up like virtualization, which in many organizations remains at 20 or 30 percent penetration, unable to gather the funding necessary to support wider implementation. Necessary funding will probably never materialize if the move to cloud computing is presented as something that “helps our programmers program faster.”

Now, for the second type of agility, which affects how quickly business units can roll out new offerings. This type of agility does not suffer the same problems that the first one does. Funding will not be an issue if business units can see a direct correlation between cloud computing and stealing a march on the competition. Funding is never an issue when the business benefit is clear.

The following three examples show the kind of business agility fostered by cloud computing in the world of journalism:

1. The Daily Telegraph broke a story about a scandal regarding Members of Parliament expenses, a huge cause celebre featuring examples of MPs seeking reimbursement for building a duck house and other equally outrageous claims. As can be imagined, the number of expense forms was huge and overtaxed the resources the Telegraph had available to review and analyze them. The Telegraph loaded the documents into Google Docs and allowed readers to browse them at will. Toby Wright, CIO of the Telegraph Media Group, used this example during a presentation at the Cloud Computing World Forum and pointed out how interesting it was to see several hundred people clicking through the spreadsheets at once.

2. The Daily Telegraph’s competitor, the Guardian, of course featured its own response to the expenses scandal. The Guardian quickly wrote an application to let people examine individual claims and identify ones that should be examined more closely. As a result, more questionable claims surfaced more quickly and allowed the situation to heat up. Simon Willison of the Guardian said of the agility that cloud computing offers, “I am working at the Guardian because I am interested in the opportunity to build rapid prototypes that go live: apps that live for two or three days.” Essentially, the agility of cloud computing enables quick rollout of short-lived applications to support the Guardian’s core business: delivery of news and insight.

3. Now, for an example from the United States. The Washington Post took static PDF files of former First Lady Hillary Clinton’s schedule and used Amazon Web Services to transform them into a searchable document format. The Washington Post then placed the documents into a database and put a simple graphic interface in place to allow members of the public to click through them as well, once again crowd-sourcing the analysis of documents to accelerate it.

It can be argued that these examples don’t prove the overall point of how cloud computing improves business agility–they are media businesses, after all, not “real” businesses that deal with physical objects and can’t be satisfied with a centralized publication site. This point doesn’t take into account that modern economies are shifting to become more IT-infused and digital data is becoming a key part of every business offering. The ability to turn out applications associated with the foundation business offering will be a critical differentiator in the future economy.

Customers get more value and the vendor gains competitive advantage from this ability to surround a physical product or service with supporting applications. In order to win in the future, it is important to know how to take advantage of cloud computing to speed delivery of complementary applications into the marketplace. As companies battle it out, those that fail to optimize the application delivery supply chain will be at a disadvantage.

It is a mistake to view cloud computing as a technology that helps IT do its job quicker, and internal IT agility is necessary but not sufficient for the future. It will be more important to link the application of cloud computing to business agility, speeding business innovation to the marketplace. In summary, both types of agility are good but the latter should be the aim of cloud computing efforts.

A Guide to Choosing CRM Software

Customer Relationship Management (CRM) software lets you effectively manage your business, but choosing the right software is often a daunting process. This nubifer.com blog is aimed at alleviating some of the more challenging decision making processes.

CRMs offer several levels of organization to help strengthen and deepen customer relationships, ranging from basic contact management to tracking and managing sales activity and even tweets on Twitter. The Return on Investment (ROI) usually comes as an increase in sales, and should also translate to better customer service. The following guide will help you through the process, from pinpointing your customer relationship needs to ultimately selecting a CRM software application.

Choosing CRM Software: Why Invest in a CRM?

CRM is a term used to describe methodologies, software and Internet capabilities designed to help businesses effectively manage customer relationships. Traditionally, CRMs have been seen as an automated way to track and maintain client contact information, but the CRMs of today are faster, smarter and highlight the most current computing technologies available.

In this way, the CRM can be used as a tool to set and measure sales goals and to devise, deliver and track email marketing campaigns, up to and including interfacing with social media accounts. The importance of CRMs in the marketplace has grown as well; with sales, marketing and customer service all on the same playing field, an enterprise can match customer needs with company offerings, thus becoming more efficient and profitable.

Raju Vegesna, Executive Evangelist for Zoho, an online CRM company based in Pleasanton, California, adds that beyond managing customer relations, “A CRM system comes in handy in such situations as it helps you aggregate all customer related information in a single place,” which is crucial for a small business owner trying to keep track of contracts, invoices and emails.

Vegesna added that if small business owners frequently personalize and email customers manually–or if they are unaware of the status of each customer in the pipeline–they will likely need a CRM system.

Chad Collins, CEO of Nubifer Inc., a Cloud, SaaS and CRM strategic advisory company based in San Diego, California, says that, essentially, CRMs offer “business functionality at your fingertips that will save a ton of time for front-line personnel by streamlining your varied sales processes.”

Collins suggests a top-down approach, in which management sets the example by using the tool, as a way to encourage employee buy-in. Collins also suggests having a designated go-to employee (someone who is not the boss) who really knows the ins and outs of the system, called the “CRM Evangelist.” He also suggests offering rewards and incentives to help employees approach the new system without fear.

The cost is the next major challenge to CRM success. According to Collins, it can cost anywhere from $300 to $2,000 per user per year to implement a CRM. “The CEO needs to understand the cost of CRM goes beyond simple licensing, rather it encompasses the license, training, and whatever business process changes they need to make,” says Collins.

According to Chad Collins of Nubifer Inc., there are three main areas to consider when evaluating the pros and cons of a CRM: the platform, how easy the CRM is to implement, and vendor strengths and weaknesses.

Platform

  • How much flexibility is there in the software/product so the company can create their own process?
  • How easy is it to configure the software or to get started with on-demand (Internet-based) solutions?
  • How easy is it to integrate data from other sources into the software or on-demand solution?
  • How scalable is the software or on-demand solution?
  • Will it deliver what you need it to deliver in terms of performance?
  • Will it offer portals or front end screens to help you and your colleagues to collaborate with one another?

Ease of Implementation

  • Are you looking for on-demand, SaaS, cloud, and Internet-based solutions?
  • Thin or thick clients: Will you have the software on your machine when you travel or do you need to dial up using a browser?
  • How much mobility do you want? Can it be done on a laptop or can it be done using mobile phones?

Vendor Strength and Weakness

  • How long has the company been around?
  • Where have they gone in terms of their vertical thrust – do they specialize in just one sector?
  • What computing platform are they using, and is it compatible with your system?
  • What’s their domain expertise in terms of your particular business area?
  • What professional services do they offer to help you get up and running?
  • What partnerships do they have with products like Microsoft Outlook to integrate with your CRM?

It will be easier to determine what technology is the best fit for your company once these questions are answered.

Choosing CRM Software: Social CRMs

The latest trend to emerge in CRM is social networking, but industry executives are still trying to figure out whether or not small businesses need their CRM to track their social networking. Collins of Nubifer Inc. says that the advantages of social CRM—for those that are ready to embrace it—are three-fold:

  1. The ability to connect with people using free (or very cheap) means.
  2. The ability to find those that you want to do business with on social networks and learn what’s important to them using monitoring tools.
  3. The ability to create a message that responds directly to what customer challenges are right then and there.

Collins added, “What’s [also] really important today is leveraging the web and creating opportunities to engage people. Traditional CRMs weren’t built for that. Now with online social networks you can create content that works for you 24/7 and builds leads for you. People can find what you’re talking about and ask you questions. You can create more online relationships than you can face to face.”

An example is given by Collins: “If you have a large group of people on Twitter talking about a specific problem they are trying to solve, you want to be able to grab those Tweets or Facebook posts and route them to the appropriate person in your company so the customer can get the answer they require directly from the source.”

When you are ready to take the leap, there is a CRM available to fit your needs, whether you simply need to organize contact information or require robust assistance in meeting and tracking your sales goals. For more information regarding choosing the right CRM for your business, contact a Nubifer Consultant.

Jabber Now Supported on Zoho Chat

Launched on Wednesday, August 4th, support for the ever-popular Jabber protocol is now available in Zoho Chat. This enables users to log in with their personal Zoho credentials and chat with colleagues and personnel if the enterprise network contains a Jabber client. This latest Zoho update interoperates with a multitude of Jabber clients, including desktop, web and mobile clients.

HIGHLIGHTS

  • Zoho Chat now supports Jabber. Users can connect to Zoho Chat from any desktop/web/mobile clients
  • Zoho Chat is a multi-protocol IM App that is integrated across all Zoho Apps
  • Zoho Chat can also be used for support when embedded on websites
  • Supports notifications on the desktop clients (for document sharing, missed messages)

In Zoho’s previous release, Jabber on the client side was supported, thus permitting users to connect to other Jabber networks from the Zoho Chat client. With this most recent update, Zoho Chat supports Jabber protocol on the server side allowing you to connect to Zoho from any chat client (encrypted connections only), creating many interesting business use case scenarios.

If your business environment is anything like ours here at Nubifer.com, you need to remain constantly connected to your partners, clients and colleagues. This newest release from Zoho allows users to log in on their mobile device and run the application in the background. While on Jabber clients, Zoho Chat users can view the status of other connected members, view their profile photos, receive ‘Typing’ notifications, set a user’s current status and much more. Users will also be notified whenever a connection tries to establish a chat (if the mobile app supports push notifications).

‘Idle Detection’ is also supported with this newest Zoho Chat release. A primary feature in the Zoho Chat Jabber Support release is the ability to retrieve Zoho Groups (Personal groups) from a user’s account and initiate a group chat from the subscriber’s preferred desktop client.

Site Support and Notifications

A highly sought-after feature from us here at Nubifer, as well as from other Zoho users, was the ability to support customers’ chat requests from a desktop client. With this recent release, Zoho Chat can now be embedded on a subscriber’s website to receive support requests. With this update, users can receive notifications from their website visitors in the subscriber’s preferred desktop client. Once these invitations to chat are received, a user can accept the invitation and initiate a chat session with the website visitor.

Available on users’ desktop clients, Zoho Chat now contains a notification system which alerts a subscriber when a document is shared, when someone responds to a topic in Zoho Discussions, or when a chat is missed. Please contact a Nubifer.com representative to learn more about Zoho’s multitude of cloud-hosted office applications.

Here is what you need to try Zoho Chat on your favorite chat client (a minimal connection sketch follows the list below).

  • Protocol: XMPP/Jabber
  • Username: Zoho username
  • Password: Your Zoho Password
  • Domain: zoho.com
  • Jabber ID: username@zoho.com
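
For readers who would rather script against these settings than use a desktop client, the sketch below shows a bare-bones XMPP connection. It is a hypothetical illustration only, not an official Zoho sample: it assumes the open-source slixmpp Python library, and the JID and password are placeholders you would replace with your own Zoho credentials.

    # Minimal XMPP echo client (illustrative sketch, not an official Zoho sample).
    # Assumes the open-source slixmpp library: pip install slixmpp
    import slixmpp

    class ZohoChatBot(slixmpp.ClientXMPP):
        def __init__(self, jid, password):
            super().__init__(jid, password)
            self.add_event_handler("session_start", self.on_start)
            self.add_event_handler("message", self.on_message)

        async def on_start(self, event):
            # Announce availability and fetch the roster (contact list).
            self.send_presence()
            await self.get_roster()

        def on_message(self, msg):
            # Echo direct chats back to the sender.
            if msg["type"] in ("chat", "normal"):
                msg.reply("Received: %s" % msg["body"]).send()

    if __name__ == "__main__":
        # Placeholder credentials; substitute your own Zoho username and password.
        bot = ZohoChatBot("username@zoho.com", "your-zoho-password")
        bot.connect()                 # slixmpp negotiates TLS, matching Zoho's
        bot.process(forever=True)     # encrypted-connections-only requirement

Any client or library that speaks standard XMPP over an encrypted connection should work the same way against the zoho.com domain listed above.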

For more information about Zoho Apps, please visit nubifer.com

Rackspace Announces Plans to Collaborate with NASA and Other Industry Leaders on OpenStack Project

On July 19, Rackspace Hosting, a specialist in the hosting and cloud computing industry, announced the launch of OpenStack™, an open-source cloud platform designed to advance the emergence of technology standards and cloud interoperability. Rackspace is donating the code that fuels its Cloud Files and Cloud Servers public-cloud offerings to the OpenStack project, which will additionally incorporate technology that powers the NASA Nebula Cloud Platform. NASA and Rackspace plan on collaborating on joint technology development and leveraging the efforts of open-source software developers on a global scale.

NASA’s Chief Technology Officer for IT Chris C. Kemp said of the announcement, “Modern scientific computation requires ever increasing storage and processing power delivered on-demand. To serve this demand, we built Nebula, an infrastructure cloud platform designed to meet the needs of our scientific and engineering community. NASA and Rackspace are uniquely positioned to drive this initiative based on our experience in building large scale cloud platforms and our desire to embrace open source.”

OpenStack is poised to feature several cloud infrastructure components, including a fully distributed object store based on Rackspace Cloud Files (currently available at OpenStack.org). A scalable compute-provisioning engine based on NASA Nebula cloud technology and Rackspace Cloud Servers technology is the next component planned for release, anticipated to be available in late 2010. Organizations using these components would be able to turn physical hardware into scalable and extensible cloud environments using the same code currently in production serving large government projects and tens of thousands of customers.

“We are founding the OpenStack initiative to help drive industry standards, prevent vendor lock-in and generally increase the velocity of innovation in cloud technologies. We are proud to have NASA’s support in this effort. Its Nebula Cloud Platform is a tremendous boost to the OpenStack community. We expect ongoing collaboration with NASA and the rest of the community to drive more-rapid cloud adoption and innovation, in the private and public spheres,” Lew Moorman, President and CSO at Rackspace, said at the time of the announcement.

Both organizations have committed to use OpenStack to power their cloud platforms, while Rackspace will dedicate open-source developers and resources to support adoption of OpenStack among service providers and enterprises. Rackspace hosted an OpenStack Design Summit in Austin, Texas from July 13 to 16, in which over 100 technical advisors, developers and founding members teamed up to validate the code and ratify the project roadmap. Among the more than 25 companies represented at the Design Summit were Autonomic Resources, AMD, Cloud.com, Citrix,  Dell, FathomDB, Intel, Limelight, Zuora, Zenoss, Riptano and Spiceworks.

“OpenStack provides a solid foundation for promoting the emergence of cloud standards and interoperability. As a longtime technology partner with Rackspace, Citrix will collaborate closely with the community to provide full support for the XenServer platform and our other cloud-enabling products,” said Peter Levine, SVP and GM, Datacenter and Cloud Division, Citrix Systems.

Forrest Norrod, Vice President and General manager of Server Platforms, Dell, added, “We believe in offering customers choice in cloud computing that helps them improve efficiency. OpenStack on Dell is a great option to create open source enterprise cloud solutions.”

Updated User Policy Management for Google Apps

Google has released a series of new features granting administrators more controls to manage Google Apps within their organizations, including new data migration tools, SSL enforcement capabilities, multi-domain support and the ability to tailor Google Apps with over 100 applications from the recently-introduced Google Apps Marketplace. On July 20 Google announced one of the most-requested features from administrators: User Policy Management.

With User Policy Management, administrators can segment their users into organizational units and control which applications are enabled or disabled for each group. Take a manufacturing firm, for example: the company might want to give its office workers access to Google Talk, but not its production line employees, and this is possible with User Policy Management.
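
To make the idea concrete, here is a rough sketch of how such per-organizational-unit toggles can be modeled. It is purely hypothetical: the org unit paths and service names are invented, and this is not the Google Apps control panel or Provisioning API, just a plain-Python illustration of the concept.

    # Hypothetical model of per-organizational-unit service toggles.
    # Org unit paths and service names are invented for illustration.
    ORG_UNIT_SERVICES = {
        "/office-staff": {"mail", "docs", "sites", "talk"},
        "/production-line": {"mail", "docs"},
    }

    def is_service_enabled(org_unit: str, service: str) -> bool:
        """Return True if the service is switched on for the given org unit."""
        return service in ORG_UNIT_SERVICES.get(org_unit, set())

    # Office workers get Google Talk; production-line employees do not.
    assert is_service_enabled("/office-staff", "talk")
    assert not is_service_enabled("/production-line", "talk")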

Additionally, organizations can use this functionality to test applications with pilot users before making them available on a larger scale. Associate Vice President for Computer Services at Temple University Sheri Stahler says, “Using the new User Policy Management feature in Google Apps, we’re able to test out new applications like Google Wave with a subset of users to decide how we should roll our new functionality more broadly.”

Customers can transition to Google Apps from on-premises environments with User Policy Management, as it grants them the ability to toggle services on or off for groups of users. For example, a business can enable just the collaboration tools like Google Docs and Google Sites for users who have yet to move off old on-premises messaging solutions.

These settings can be managed by administrators on the ‘Organizations & Users’ tab in the ‘Next Generation’ control panel. In addition, organizations can mirror their existing LDAP organizational schema using Google Apps Directory Sync or programmatically assign users to organizational units using the Google Apps Provisioning API.

Premier and Education edition users can begin using User Policy Management for Google Apps at no additional charge.

Dell and Microsoft Partner Up with the Windows Azure Platform Appliance

At Microsoft’s Worldwide Partner Conference on July 12, Dell and Microsoft announced a strategic partnership in which Dell will adopt the Windows Azure platform appliance as part of its Dell Services Cloud to develop and deliver next-generation cloud services. With the Windows Azure platform, Dell will be able to deliver private and public cloud services for its enterprise, public, small and medium-sized business customers. Additionally, Dell will develop a Dell-powered Windows Azure platform appliance for enterprise organizations to run in their data centers.

So what does this mean exactly? By implementing the limited production release of the Windows Azure platform appliance to host public and private clouds for its customers, Dell will leverage its vertical industry expertise in offering solutions for the speedy delivery of flexible application hosting and IT operations. In addition, Dell Services will offer application migration, advisory, integration and implementation services.

Microsoft and Dell will work together to develop a Windows Azure platform appliance for large enterprise, public and hosting customers to deploy to their own data centers. The resulting appliance will leverage infrastructure from Dell combined with the Windows Azure platform.

This partnership shows that both Dell and Microsoft recognize that more organizations can reap the benefits of the flexibility and efficiency of the Windows Azure platform. Both companies understand that cloud computing allows IT to increase responsiveness to business needs and also delivers significant efficiencies in infrastructure costs. The result will be an appliance to power a Dell Platform-as-a-Service (PaaS) Cloud.

The announcement with Dell occurred on the same day that Microsoft announced the limited production release of the Windows Azure platform appliance, a turnkey cloud platform for large service providers and enterprises to run in their own data centers. Initial partners (like Dell) and customers using the appliance in their data centers will have the scale-out application platform and data center efficiency of Windows Azure and SQL Azure that Microsoft currently provides.

Since the launch of the Windows Azure platform, Dell Data Center Solutions (DCS) has been working with Microsoft to build out and power the platform. Dell will use the insight gained as a primary infrastructure partner for the Windows Azure platform to make certain that the Dell-powered Windows Azure platform appliance is optimized for power and space, saving ongoing operating costs and sustaining the performance of large-scale cloud services.

A top provider of cloud computing infrastructure, Dell boasts a client roster that includes 20 of the 25 most heavily-trafficked Internet sites and four of the top global search engines. The company has been custom-designing infrastructure solutions for the top global cloud service providers and hyperscale data center operations for the past three years, and in that time has developed expertise in the specific needs of organizations in hosting, HPC, Web 2.0, gaming, energy, social networking, SaaS, plus public and private cloud builders.

Speaking about the partnership with Microsoft, president of Dell Services Peter Altabef said, “Organizations are looking for innovative ways to use IT to increase their responsiveness to business needs and drive greater efficiency. With the Microsoft partnership and the Windows Azure platform appliance, Dell is expanding its cloud services capabilities to help customers reduce their total costs and increase their ability to succeed. The addition of the Dell-powered Windows Azure platform appliance marks an important expansion of Dell’s leadership as a top provider of cloud computing infrastructure.”

Dell Services delivers vertically-focused cloud solutions with the combined experience of Dell and Perot Systems. Currently, Dell Services delivers managed and Software-as-a-Service support to over 10,000 customers across the globe. Additionally, Dell boasts a comprehensive suite of services designed to help customers leverage public and private cloud models. With the new Dell PaaS powered by the Windows Azure platform appliance, Dell will be able to offer customers an expanded suite of services including transformational services to help organizations move applications into the cloud and cloud-based hosting.

Summarizing the goal of the partnership with Dell, Bob Muglia, president of Microsoft Server and Tools said at the Microsoft Windows Partner Conference on July 12, “Microsoft and Dell have been building, implementing and operating massive cloud operations for years. Now we are extending our longstanding partnership to help usher in the new era of cloud computing, by giving customers and partners the ability to deploy Windows Azure platform in their datacenters.”

Do You Still Need to Worry About Cloud Security?

The answer to the question posed above is … maybe, but definitely not as much as before! A few recent studies in a handful of technologically conservative industries suggest that people and businesses are becoming increasingly comfortable with storing and managing their data in the cloud.

Markets like health care, finance and government, which are typically technology risk-averse, are quickly adopting (and even advocating) disruptive cloud technologies.

Those that have yet to adopt Software-as-a-Service continue to raise two fears when considering making the move into the cloud: Who is in control of my data? Is it safe to store my data somewhere other than the office? These concerns are valid and must be understood by those making the move to the cloud, but the idea that data must be stored under one’s own roof is shifting.

One expert from Accenture was recently quoted in an article on InformationWeek.com as saying, “Healthcare firms are beginning to realize that cloud providers actually may offer more robust security than is available in-house.” That same story cited a recent study which found that about one-third of the health care industry currently uses cloud apps and that over 70% of respondents plan to shift more work to SaaS and cloud apps. While these estimates are interesting in any field, the intrigue is heightened when it comes to health care, where HIPAA compliance rules are notoriously strict.

The finance world is seeing similar shifts. For example, a recent study conducted by SIFMA explained how cloud computing is enabling the financial industry to move forward with technology in spite of budget restraints. “The [finance] industry is showing a larger appetite for disruptive technologies such as cloud computing to force business model change,” said the study.

Even the federal government is showing traces of similar trends, with federal CIO Vivek Kundra singing the praises of cloud computing even more than Marc Benioff! “Far too long we’ve been thinking very much vertically and making sure things are separated. Now we have an opportunity to lead with solutions that by nature encourage collaboration both horizontally and vertically.”

Cloud security remains an important issue that vendors take seriously, but there is definitely a shift towards acceptance of cloud security. In a recent blog post, John Soat summarized the current mood, saying, “It’s not that security in the cloud isn’t still a concern for both [health care and finance] industries, but it’s a known, and perhaps better understood factor … So while security is still a legitimate concern, it doesn’t seem to be the show stopper it used to be …”

Evaluating Zoho CRM

Although Salesforce.com may be the name most commonly associated with SaaS CRM, Zoho CRM is picking up speed as an inexpensive option for small businesses or large companies with only a few people using the service. While much attention has been paid to Google Apps, Zoho has been quietly creating a portfolio of online applications that is worth recognition. Now many are wondering if Zoho CRM will have as large an impact on Salesforce as Salesforce did on SAP.

About Zoho

Part of AdventNet, Zoho has been producing SaaS Office-like applications since 2006. One of Zoho’s chief architects, Raju Vegesna, joined AdventNet upon graduating in 2000 and moving from India to the United States. Among Vegesna’s chief responsibilities is getting Zoho on the map.

Zoho initially offered spreadsheet and writing applications although the company, which targets smaller businesses with 10 to 100 employees, now has a complete range of productivity applications such as email, a database, project management, invoicing, HR, document management, planning and last but not least, CRM.

Zoho CRM

Aimed at businesses seeking to manage customer relations and transform leads into profitable relationships, Zoho CRM begins with lead generation. From there, there are tabs for lead conversion, account setup, contacts, potential mapping and campaigns. One of Zoho CRM’s best features is its layout: full reporting facilities with formatting, graphical layouts and dashboards, forecasting and other management tools are neatly displayed and optimized.

Zoho CRM is fully email-enabled, and updates can be sent to any user who has been set up, along with full contact administration. Timelines ensure that leads are never forgotten and campaigns never slip. Like Zimbra and ProjectPlace, Zoho CRM offers brand alignment, which means users can change layout colors and add their own logo branding. Another key feature is Zoho’s comprehensive help section, which is constantly updated with comments and posts from other users online. Zoho CRM can import contact details from a standard comma-separated value (.csv) file exported from a user’s email system or spreadsheet application (such as Excel, StarOffice or OpenOffice), and users can export CRM data in the same format as well.
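
As a simple illustration of preparing such a file, the snippet below writes a handful of contacts to a standard .csv with Python; the column headers are hypothetical and would be mapped to the CRM’s own fields during import.

    # Write a small contacts file in standard CSV format for import into a CRM.
    # Column headers are hypothetical; map them to the CRM's fields on import.
    import csv

    contacts = [
        {"First Name": "Ada", "Last Name": "Lovelace", "Email": "ada@example.com"},
        {"First Name": "Alan", "Last Name": "Turing", "Email": "alan@example.com"},
    ]

    with open("contacts.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["First Name", "Last Name", "Email"])
        writer.writeheader()
        writer.writerows(contacts)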

The cost of Zoho CRM is surprisingly low. Zoho CRM offers storage for 100,000 records in the Free Edition and unlimited data storage in the Professional and Enterprise Editions. In the Free Edition, users can import up to 1,500 records per batch, in contrast to 20,000 records in the Enterprise Edition.

Four Key Categories for Cloud Computing

When it comes to cloud computing, concerns about control and security have dominated recent discussions. While it was once assumed that all computing resources could be obtained from outside, the vision is now shifting toward a data center transformed for easy connections to internal and external IT resources.

According to IDC’s Cloud Services Overview report, sales of cloud-related technology are growing at 26 percent per year. That is six times the rate of IT spending as a whole, although cloud technologies comprised only about 5 percent of total IT revenue this year. While the report points out that defining what constitutes cloud-related spending is complicated, it estimates that global spending of $17.5 billion on cloud technologies in 2009 will grow to $44.2 billion by 2013. IDC predicts that hybrid or internal clouds will be the norm, although even in 2013 only an estimated 10 percent of that spending will go specifically to public clouds.
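
As a quick sanity check on those figures, compounding 26 percent annual growth over the four years from 2009 to 2013 lands almost exactly on IDC’s projection; the short calculation below simply restates the report’s own numbers.

    # Back-of-the-envelope check of the IDC figures quoted above.
    spend_2009 = 17.5          # billions of dollars
    growth_rate = 0.26         # 26 percent per year
    years = 4                  # 2009 through 2013

    projected_2013 = spend_2009 * (1 + growth_rate) ** years
    print(round(projected_2013, 1))   # ~44.1, in line with IDC's $44.2 billion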

According to Chris Wolf, analyst at The Burton Group, hybrid cloud infrastructure isn’t that different from existing data-center best practices. The difference is that all of the pieces are meant to fit together using Internet-age interoperability standards as opposed to homegrown kludge.

The following are four items to put on the “shopping list” when preparing your IT budget for use of private or public cloud services:

1. Application Integration

Software integration isn’t the first thing most companies consider when building a cloud, although Bernard Golden, CEO at cloud consulting firm HyperStratus, and CIO.com blogger, says it is the most important one.

Tom Fisher, vice president of cloud computing at SuccessFactors.com, a business-application SaaS provider in San Mateo, California, says that integration is a whole lot more than simply batch-processing chunks of data being traded between applications once or twice per day like it was done in mainframes.

Fisher continues to explain that it is critical for companies to be able to provision and manage user identities from a single location across a range of applications, especially when it comes to companies that are new in the software-providing business and do not view their IT as a primary product.

“What you’re looking for is to take your schema and map it to PeopleSoft or another application so you can get more functional integration. You’re passing messages back and forth to each other with proper error-handling agreement so you can be more responsive. It’s still not real time integration, but in most cases you don’t really need that,” says Fisher.

2. Security

The ability to federate—securely connect without completely merging—two networks, is a critical factor in building a useful cloud, according to Golden.

According to Nick Popp, VP of product development at Verisign (VRSN), that requires layers of security, including multifactor authentication, identity brokers, access management and sometimes an external service provider who can provide that high a level of administrative control. Verisign is considering adding a cloud-based security service.

Wolf states that it requires technology that doesn’t yet exist: according to Wolf, an Information Authority that can act as a central repository for security data and control of applications, data and platforms within the cloud. It is possible to assemble that function out of some of the aspects Popp mentions today, yet Wolf maintains that no one technology is able to span all the platforms necessary to provide real control of even an internally hosted cloud environment.

3. Virtual I/O

One IT manager at a large digital mapping firm states that if you have to squeeze data for a dozen VMs through a few NICs, the scaling of your VM cluster to cloud proportions will be inhibited.

“When you’re in the dev/test stage, having eight or 10 [Gigabit Ethernet] cables per box is an incredible labeling issue; beyond that, forget it. Moving to virtual I/O is a concept shift—you can’t touch most of the connections anymore—but you’re moving stuff across a high-bandwidth backplane and you can reconfigure the SAN connections or the LANs without having to change cables,” says the IT manager.

Virtual I/O servers (like the Xsigo I/O Director servers used by the IT manager’s company) can run 20Gbit/sec through a single cord and as many as 64 cords to a single server–connecting to a backplane with a total of 1,560Gbit/sec of bandwidth. The IT manager states that concentrating such a large amount of bandwidth in one device saves space, power and cabling, keeps network performance high, and saves money on network gear in the long run.

Speaking about the Xsigo servers, which start at approximately $28,000 through resellers like Dell (DELL), the manager says, “It becomes cost effective pretty quickly. You end up getting three, four times the bandwidth at a quarter the price.”

4. Storage

Storage remains the weak point of the virtualization and cloud-computing worlds, and the place where the most money is spent.

“Storage is going to continue to be one of the big costs of virtualization. Even if you turn 90 percent of your servers into images, you still have to store them somewhere,” says Golden in summary. Visit Nubifer.com for more information.

Zuora Releases Z-Commerce

The first external service (SaaS) that actually understands the complex billing models of cloud providers (which account for monthly subscription fees as well as automated metering, pricing and billing for products, bundles and highly individualized/specific configurations) arrived in mid-June in the form of Zuora’s Z-Commerce. An upgrade to Zuora’s billing and payment service that is built for cloud providers, Z-Commerce is a major development. With Z-Commerce, a storage-as-a-service provider is able to charge for terabytes of storage used, IP address usage, or data transfer. Cloud providers can also structure a per-CPU-instance charge or per-application-use charge, and the service can take complexities like peak usage into account. Zuora has provided 20 pre-configured templates for the billing and payment models that cloud providers use.
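
To make the metering model concrete, here is a minimal rating sketch; the rates, usage figures and line items are invented for illustration and are not Zuora’s pre-configured templates or actual pricing.

    # Illustrative usage-based rating in the spirit of metered cloud billing.
    # Rates and usage quantities are invented; not Zuora's templates or prices.
    RATES = {
        "storage_tb": 120.00,       # per terabyte-month
        "data_transfer_gb": 0.15,   # per gigabyte transferred
        "cpu_instance_hr": 0.40,    # per CPU-instance hour
    }

    monthly_subscription = 99.00    # flat recurring fee

    usage = {
        "storage_tb": 3.2,
        "data_transfer_gb": 850,
        "cpu_instance_hr": 1440,
    }

    metered_charges = sum(RATES[item] * qty for item, qty in usage.items())
    invoice_total = monthly_subscription + metered_charges
    print(f"Invoice total: ${invoice_total:,.2f}")   # subscription + metered usage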

What makes this development so interesting is that Zuora is using what it calls the “subscription economy” as the underlying rationale for its success: 125 customers, 75 employees and profitability.

Tien Tzuo, the CEO of Zuora (and former Chief Strategy Officer of Salesforce.com), described the subscription economy as follows:

“The business model of the 21st century is a fundamentally different business model.

The 21st century world needs a whole new set of operational systems — ones that match the customer centric business model that is now necessary to succeed.

The business model of the 20th century was built around manufacturing.  You built products at the lowest possible cost, and you find buyers for that product.

The key metrics were all around inventory, cost of goods sold, product life cycles, etc. But over the last 30 years, we’ve been moving away from a manufacturing economy to a services economy. Away from an economy based on tangible goods, to an economy based on intangible ideas and experiences.

What is important now is the customer — of understanding customer needs, and building services & experiences that fulfill those customer needs.  Hence the rise of CRM.

But our financial and operational systems have not yet evolved!  What we need today are operational systems built around the customer, and around the services you offer to your customers.

You need systems that allow you to design different services, offered under different price plans that customers can choose from based on their specific needs.  So the phone companies have 450 minute plans, prepaid plans, unlimited plans, family plans, and more.  Salesforce has Professional Edition, and Enterprise Edition, and Group Edition, and PRM Edition, and more.  Amazon has Amazon Prime.  ZipCar has their Occasional Driving Plan and their Extra Value Plans.

You need systems that track customer lifecycles — things such as monthly customer value, customer lifetime value, customer churn, customer share of wallet, conversion rates, up sell rates, adoption levels.

You need systems that measure how much of your service your customers are consuming.  By the minute?  By the gigabyte?  By the mile?  By the user?  By the view?  And you need to establish an ongoing, recurring billing relationship with your customers, that maps to your ongoing service relationship, that allows you to monetize your customer interactions based on the relationship that the customer opted into.

The 21st century world needs a whole new set of operational systems — ones that match the customer centric business model that is now necessary to succeed.”

To summarize, what he is saying is that the model for future business isn’t the purchase of goods and services, but rather a price provided to a customer for an ongoing relationship with the company. Under this model, the customer is able to structure the relationship in a way which provides them with what they need to accomplish the job(s) the company can help them with (which can be a variety of services, products, tools and structured experiences).

This is also interesting because your business is measuring the customer’s commitments to you, and vice versa, in operational terms, even as the business model is shifting to more interactions than ever before. If you look at traditional CRM metrics like CLV, churn, share of wallet and adoption rates as they apply to a business model that has continued to evolve away from pure transactions, Tien is saying that payment and billing is, to him, the financial infrastructure for this new customer-centered economic model (i.e. the subscription model).

Denis Pombriant of Beagle Research Group, LLC commented on this on his blog recently, pointing out that a subscription model does not guarantee a business will be successful. What does have significant bearing on the success or failure of a business is how well the business manages it or has it managed (i.e. by Zuora).

This can be applied to the subscription economy. Zuora is highlighting what it predicted: that companies are increasingly moving their business models to subscription-based pricing. This is the same model that supports free software and hardware by charging customers by the month. How it is managed is another can of worms, but for now Zuora has done a service by recognizing that customer-driven companies are realizing their customers are willing to pay for the aggregate capabilities of the company in an ongoing way—as long as the company continues to support the customer’s needs in solving problems as they arise. To learn more about cloud computing and the subscription model, contact a Nubifer.com representative.

Microsoft Releases Security Guidelines for Windows Azure

Industry analysts have praised Microsoft for doing a respectable job at ensuring the security of its Business Productivity Online Services, Windows and SQL Azure. With that said, deploying applications to the cloud requires additional considerations to ensure that data remains in the correct hands.

Microsoft released a new version of its Security Development Lifecycle guidance in early June as a result of these concerns. Microsoft’s Security Development Lifecycle, a statement of best practices for those building Windows and .NET applications, has been updated over the years to ensure the security of those apps, and this version focuses on how to build security into Windows Azure applications.

Principal security program manager of Microsoft’s Security Development Lifecycle team Michael Howard warns that those practices were not, however, designed for the cloud. Speaking in a pre-recorded video statement embedded in a blog entry, Howard says, “Many corporations want to move their applications to the cloud but that changes the threats, the threat scenarios change substantially.”

Titled “Security Best Practices for Developing Windows Azure Applications,” the 26-page white paper is divided into three sections: the first describes the security technologies that are part of Windows Azure (including the Windows Identity Foundation, Windows Azure AppFabric Access Control Service and Active Directory Federation Services 2.0—a core component for providing common logins to Windows Server and Azure); the second explains how developers can apply the various SDL practices to build more secure Windows Azure applications, outlining various threats like namespace configuration issues and recommending data security practices such as how to generate shared-access signatures and use HTTPS in the request URL; and the third is a matrix that identifies various threats and how to address them.
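
The white paper itself is the authoritative reference for the exact shared-access-signature format; the sketch below only illustrates the general signed-URL pattern behind it (an HMAC over the resource and an expiry time, appended as a query parameter), with a placeholder account key and container path.

    # Generic signed-URL illustration: HMAC over a resource path and expiry.
    # This is NOT the exact Windows Azure shared-access-signature format.
    import base64
    import hashlib
    import hmac
    import urllib.parse

    def sign_resource_url(base_url: str, expiry_iso: str, key_b64: str) -> str:
        string_to_sign = "\n".join([base_url, expiry_iso])
        key = base64.b64decode(key_b64)
        digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
        signature = base64.b64encode(digest).decode("utf-8")
        query = urllib.parse.urlencode({"se": expiry_iso, "sig": signature})
        return f"{base_url}?{query}"   # always request the URL over HTTPS

    # Placeholder values for illustration only.
    url = sign_resource_url(
        "https://example.blob.core.windows.net/reports/q2.pdf",
        "2010-12-31T00:00:00Z",
        base64.b64encode(b"placeholder-account-key").decode("utf-8"),
    )
    print(url)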

Says Howard, “Some of those threat mitigations can be technologies you use from Windows Azure and some of them are threat mitigations that you must be aware of and build into your application.”

Security is a major concern, and Microsoft has addressed many key issues concerning security in the cloud. Phil Lieberman, president of Lieberman Software Corp., a Microsoft Gold Certified Partner specializing in enterprise security, says, “By Microsoft providing extensive training and guidance on how to properly and securely use its cloud platform, it can overcome customer resistance at all levels and achieve revenue growth as well as dominance in this new area. This strategy can ultimately provide significant growth for Microsoft.”

Agreeing with Lieberman, Scott Matsumoto, a principal consultant with the Washington, D.C.-based consultancy firm Cigital Inc., which specializes in security, says, “I especially like the fact that they discuss what the platform does and what’s still the responsibility of the application developer. I think that it could be [wrongly] dismissed as a rehash of other information or incomplete—that would be unfair.” To find more research on Cloud Security, please visit Nubifer.com.

Five Best Practices for Private Cloud Computing

Industry experts state that private cloud computing enables enterprise IT executives to maximize their organization’s resources and align IT services with business needs while they wait for public cloud computing standards to become defined.

Even for enterprises that like to manage infrastructure and application in-house, building a private cloud is good practice. Frank Gens, senior vice president and chief analyst at IDC, a research firm in Framingham, Massachusetts, says, “With virtualization and the private cloud, CIOs are much closer to that goal of efficient and dynamic IT service delivery and capability.”

Automation minimizes the IT staff’s involvement once the cloud is up and running and is thus a key goal. “The end user is the constituent who is going to leverage the workload for productive work,” says Brian Wilson, vice president for services and support at Surgient Inc., an Infrastructure-as-a-Service provider in Austin, Texas that has deployed 150 private clouds for enterprises in the Fortune 500.

According to Wilson, the most important aspect of a private cloud is self-service. With that said, “a self-service portal does not guarantee self-service. Self-service needs to be layered on top of automation services.” CIOs need to consider the service’s design, definition, library and life-cycle. Additionally, the service should integrate applications which report usage for charge-back (preferably with an administrative dashboard and event broadcasting).

A private cloud doesn’t mean a less complex cloud, and as more enterprises launch their private clouds, best practices are beginning to emerge. Here is a list of five best practices for private cloud computing, according to Wilson:

1. Assess

  • Evaluate current and planned hardware, hypervisors, network architecture and storage.
  • Understand corporate security standards and existing vendor relationships, and know where your vendors are going (so you don’t buy into dead-end technology).
  • Begin with a defined project and plan for scale, heterogeneity and change. Plan for and document your deployment plans using client-specific use cases and success criteria.

2. Deploy

  • Microsoft CEO Steve Ballmer compares the usage curve for cloud computing to a hockey stick, so be prepared for the uptick by establishing a deployment schedule.
  • Ensure that essential content is available in a centralized library.
  • Introduce critical members of the team, finalize use cases and confirm the schedule from the beginning.
  • Dynamically manage IT policies by automating self-service provisioning of applications while remaining flexible and understanding of change.
  • Plan for on-site training.

3. Analyze

  • Review usage trends, resource consumption trends, server use and administration overhead–a step that is often skipped, according to Wilson.
  • Understand the metrics for ROI and TCO and gain executive buy-in with formal ROI evaluations monthly and quarterly.
  • Continue to evaluate your processes, as the cloud is a fundamental shift from traditional processes. Ask yourself if there is a better way to do this throughout the process.

4. Create Reusable Code

  • Plan your service catalog wisely by creating reusable building blocks of virtual machines and services.
  • Take the time to understand your users’ needs and plan for their experience, as your content is critical.
  • Take the centralized view that is possible with a private cloud; avoid discrete stacks and multiple operating systems.

5. Don’t Forget to Charge Back

  • According to Wilson, very few organizations actually charge back, even though one of the pillars of the cloud is its ability to meter services on an as-needed-basis.
  • Saint Luke’s Health System, for example, operates 11 hospitals and clinics in the Kansas City, Missouri metropolitan area. CIO Debe Gash opted for public cloud computing because of the speed with which it enabled her organization to comply with new HIPAA regulations and says charge-back helps keep IT costs down and prove its mettle.
  • “The bill of IT for each entity is valuable. They can see what they’re using. The visibility into what something actually costs is very helpful to them,” says Gash. The charge-back also shows which systems are driving IT costs, thus Gash can “validate that we’re spending money on what’s strategic to the organization.”

To receive more information regarding best practices for private cloud computing contact a Nubifer.com representative today.

Microsoft Makes Strides for a More Secure and Trustworthy Cloud

Cloud computing currently holds court in the IT industry with vendors, service providers, press, analysts and customers all evaluating and discussing the opportunities presented by the cloud.

Security is a very important piece of the puzzle, and nearly every day a new press article or analyst report indicates that cloud security and privacy are a top concern for customers as the benefits of cloud computing continue to unfold. For example, a recent Microsoft survey revealed that although 86% of senior business leaders are thrilled about cloud computing, over 75% remain concerned about the security, access and privacy of data in the cloud.

Customers are correct in asking how cloud vendors are working to ensure the security of cloud applications, the privacy of individuals and protection of data. In March, Microsoft CEO Steve Ballmer told an audience at the University of Washington that, “This is a dimension of the cloud, and it’s a dimension of the cloud that needs all of our best work.”

Microsoft is seeking to address security-related concerns and help customers understand which questions they need to ask as part of Microsoft’s Trustworthy Computing efforts. The company is trying to become more transparent than competitors concerning how they help enable an increasingly secure cloud.

Server and Tools Business president Bob Muglia approached the issue in his recent keynote at Microsoft’s TechEd North America conference, saying, “The data that you have in your organization is yours. We’re not confused about that, that it’s incumbent on us to help you protect that information for you. Microsoft’s strategy is to deliver software, services and tools that enable customers to realize the benefits of a cloud-based model with the reliability and security of on-premise software.”

The Microsoft Global Foundations Services (GFS) site is a resource for users to learn about Microsoft’s cloud security efforts, with the white papers “Securing Microsoft’s Cloud Infrastructure” and “Microsoft’s Compliance Framework for Online Services” being very informative.

Driving a comprehensive, centralized Information Security Program for all Microsoft cloud data centers and the 200+ consumer and commercial services they deliver–all built using the Microsoft Security Development Lifecycle–GFS covers everything from physical security to compliance, including a Risk Management Process, incident response, and work with law enforcement; defense-in-depth security controls across the physical, network, identity and access, host, application and data layers; a comprehensive Compliance Framework to address standards and regulations such as PCI, SOX, HIPAA, and the Media Ratings Council; and third-party auditing, validation and certification (ISO 27001, SAS 70).

Muglia also pointed out Microsoft’s focus on identity, saying, “As you move to cloud services you will have a number of vendors, and you will need a common identity system.” In general, identity is the cornerstone of security, especially cloud security. Microsoft currently provides technologies with Windows Server and cloud offerings which customers can use to extend existing investments in identity infrastructure (like Active Directory) for easier and more secure access to cloud services.

Microsoft is not alone in working on cloud security, as noted by Microsoft’s chief privacy strategist Peter Cullen. “These truly are issues that no one company, industry or sector can tackle in isolation. So it is important to start these dialogs in earnest and include a diverse range of stakeholders from every corner of the globe,” Cullen said in his keynote at the Computers, Freedom and Privacy (CFP) conference. Microsoft is working with customers, governments, law enforcement, partners and industry organizations (like the Cloud Security Alliance) to ensure more secure and trustworthy cloud computing through strategies and technologies. To receive additional information on cloud security, contact a Nubifer.com representative today.

Don’t Underestimate a Small Start in Cloud Computing

Although many predict that cloud computing will forever alter the economics and strategic direction of corporate IT, it is likely that the impact of the cloud will continue to come largely from small projects. Some users and analysts say that these small projects, rather than complex, enterprise-class, computing-on-demand services, are what to look out for.

David Tapper, outsourcing and offshoring analyst for IDC says, “What we’re seeing is a lot of companies using Google (GOOG) Apps, Salesforce and other SaaS apps, and sometimes platform-as-a-service providers, to support specific applications. A lot of those services are aimed at consumers, but they’re just as relevant in business environments, and they’re starting to make it obvious that a lot of IT functions are generic enough that you don’t need to build them yourself.” New enterprise offerings from Microsoft, such as Microsoft BPOS, have also shown up on the scene with powerful SaaS features to offer businesses.

According to Tapper, the largest representation of mini-cloud computing is small- and mid-sized businesses using commercial versions of Google Mail, Google Apps and similar ad hoc or low-cost cloud-based applications. With that said, larger companies are doing the exact same thing. “Large companies will have users whose data are confidential or who need certain functions, but for most of them, Google Apps is secure enough. We do hear about some very large cloud contracts, so there is serious work going on. They’re not the rule though,” says Tapper.

First Steps into the Cloud

A poll conducted by the Pew Research Center’s Internet & American Life Project found that 71 percent of the “technology stakeholders and critics” surveyed believe that most people will do their work from a range of computing devices using Internet-based applications as their primary tools by 2020.

Respondents were picked from technology and analyst companies for their technical savvy and as a whole believe cloud computing will dominate information transactions by the end of the decade. The June report states that cloud computing will be adopted because of its ability to provide new functions quickly, cheaply and from anywhere the user wishes to work.

Chris Wolf, analyst at Gartner, Inc.’s Burton Group, thinks that while this isn’t unreasonable, it may be a little too optimistic. Wolf says that even fairly large companies sometimes use commercial versions of Google Mail or instant messaging, but it is a different story when it comes to applications requiring more fine tuning, porting, communications middleware or other heavy work to run on public clouds, or data that has to be protected and documented.

Says Wolf, “We see a lot of things going to clouds that aren’t particularly sensitive–training workloads, dev and test environments, SaaS apps; we’re starting to hear complaints about things that fall outside of IT completely, like rogue projects on cloud services. Until there are some standards for security and compliance, most enterprises will continue to move pretty slowly putting critical workloads in those environments. Right now all the security providers are rolling their own and it’s up to the security auditors to say if you’re in compliance with whatever rules govern that data.”

Small, focused projects using cloud technologies are becoming more common, in addition to the use of commercial cloud-based services, says Tapper.

For example, Beth Israel Deaconess Hospital in Boston elevated a set of VMware (VMW) physical and virtual servers into a cloud-like environment to create an interface to its patient-records and accounting systems, enabling hundreds of IT-starved physician offices to link up with the use of just one browser.

New York’s Museum of Modern Art started using workgroup-on-demand computing systems from CloudSoft Corp. last year. This allowed the museum to create online workspaces for short-term projects that would otherwise have required real or virtual servers and storage on-site.

In about a decade or so, cloud computing will make it clear to both IT and business management that some IT functions are just as generic when they’re homegrown as when they’re rented. Says Tapper, “Productivity apps are the same for the people at the top as the people at the bottom. Why buy it and make IT spend 80 percent of its time maintaining essentially generic technology?” Contact Nubifer.com to learn more…

Nubifer Cloud:Link Mobile and Why Windows Phone 7 is Worth the Wait

Sure, Android devices become more cutting-edge with each near-monthly release and Apple recently unveiled its new iPhone, but some industry experts suggest that Windows Phone 7 is worth the wait. Additionally, businesses may benefit from waiting until Windows Phone 7 arrives to properly compare the benefits and drawbacks of all three platforms before making a decision.

Everyone is buzzing about the next-generation iPhone and smartphones like the HTC Incredible and HTC EVO 4G, but iPhone and Android aren’t even the top smart phone platforms. With more market share than second place Apple and third place Microsoft combined, RIM remains the number one smartphone platform. Despite significant gains since its launch, Android is in fourth place, with only 60 percent as much market share as Microsoft.

So what gives? In two words: the business market. While the iPhone was revolutionary for blurring the line between consumer gadget and business tool, RIM has established itself as synonymous with mobile business communications. Apple and Google don’t provide infrastructure integration or management tools comparable to those available with the BlackBerry Enterprise Server (BES).

The continued divide between consumer and business is highlighted by the fact that Microsoft is still in third place with 15 percent market share. Apple and Google continue to leapfrog one another while RIM and Microsoft are waiting to make their move.

The long delay in new smartphone technology from Microsoft is the result of leadership shakeups and the fact that Microsoft completely reinvented its mobile strategy, starting from scratch. Windows Phone 7 isn’t merely an incremental evolution of Windows Mobile 6.5. Rather, Microsoft went back to the drawing board to create an entirely new OS platform that recognizes the difference between a desktop PC and a smartphone as opposed to assuming that the smartphone is a scaled-down Windows PC.

Slated to arrive later this year, Windows Phone 7 smartphones promise an attractive combination of the intuitive touch interface and experience found in the iPhone and Android, as well as the integration and native apps to tie in with the Microsoft server infrastructure that comprises the backbone of most customers’ network and communications architecture.

With that said, the Windows Phone 7 platform won’t be without its own set of issues. Like Apple’s iPhone, Windows Phone 7 is expected to lack true multitasking and the copy and paste functionality from the get-go. Additionally, Microsoft is also locking down the environment with hardware and software restrictions that limit how smartphone manufacturers can customize the devices, and doing away with all backward compatibility with existing Windows Mobile hardware and apps.

Cloud computing today touches many devices and endpoints, from application servers to desktops and, of course, the burgeoning ecosystem of smartphone devices. When studying the landscape of cell phone operating systems and the technology capabilities of smartphones, you start to see a whole new and exciting layer of technology for consumers and business people alike.

Given the rich capabilities of Windows Phone 7, which offers Silverlight and XNA technology, we at Nubifer are engineering upgrades to our cloud services to interoperate with the powerful new technologies it will offer. At Nubifer, we plan to support many popular smartphones and handset devices by linking them to our Nubifer Cloud:Link technology, and by offering extended functionality delivered by Nubifer Cloud:Connector and Cloud:Portal. These enable enterprise companies to gain a deeper view into the analytics and human-computer interaction of end users and subscribers of various owned and leased software systems hosted entirely in the cloud or through a hybrid model.

It makes sense for companies that don’t need to replace their smartphones at once to wait for Windows Phone 7 to arrive, at which point all three platforms can be compared and contrasted. May the best smartphone win!

Cloud Computing in 2010

A recent research study by the Pew Internet & American Life Project released on June 11 found that most people expect to “access software applications online and share and access information through the use of remote server networks, rather than depending primarily on tools and information housed on their individual, personal computers” by 2020. This means that the term “cloud computing” will likely be referred to as simply “computing” ten years down the line.

The report points out that we are currently on that path when it comes to social networking, thanks to sites like Twitter and Facebook. We also communicate in the cloud using services like Yahoo Mail and Gmail, shop in the cloud on sites like Amazon and eBay, listen to music in the cloud on Pandora, share pictures in the cloud on Flickr and watch videos on cloud sites like Hulu and YouTube.

The more advanced among us are even using services like Google Docs, Scribd or Docs.com to create, share or store documents in the cloud. With that said, it will be some time before desktop computing falls away completely.

The report says: “Some respondents observed that putting all or most of their faith in remotely accessible tools and data puts a lot of trust in the humans and devices controlling the clouds and exercising gatekeeping functions over access to that data. They expressed concerns that cloud dominance by a small number of large firms may constrict the Internet’s openness and its capability to inspire innovation—that people are giving up some degree of choice and control in exchange for streamlined simplicity. A number of people said cloud computing presents difficult security problems and further exposes private information to governments, corporations, thieves, opportunists, and human and machine error.”

For more information on the current state of Cloud Computing, contact Nubifer today.

The Impact of Leveraging a Cloud Delivery Model

In a recent discussion about the positive shift in the Cloud Computing discourse towards actionable steps rather than philosophical rants and definitions, .NET Developer’s Journal issued a list of five things not to do. The first mistake on the list (which also included #2, assuming server virtualization is enough; #3, not understanding service dependencies; #4, leveraging traditional monitoring; and #5, not understanding internal/external costs) was not understanding the business value. Failing to understand the business impact of leveraging a Cloud delivery model for a given application or service is a crucial mistake, but it can be avoided.

When evaluating a Cloud delivery option, it is important to first define the service. Consider: is it new to you, or are you considering porting an existing service? If it is new, there is a lower financial bar to justify a cloud model, but the downside is a lack of historical perspective on consumption trends to aid in evaluating financial considerations or performance.

Assuming you choose a new service, the next step is to address why you are looking at Cloud, which may require some to be honest about their reasons. Possible reasons for looking at cloud include: your business requires a highly scalable solution; your data center is out of capacity; you anticipate this to be a short-lived service; you need to collaborate with a business partner on neutral territory; your business has capital constraints.

All of the previously listed reasons are good reasons to consider a Cloud option. Yet if you are considering this option because it takes weeks, or even months, to get a new server into production; because your Operations team lacks credibility when it comes to maintaining a highly available service; or because your internal cost allocation models are appalling—you may need to reconsider. In these cases, there may be some in-house improvements that need to be made before exploring a Cloud option.

An important lesson to consider is that just because you can do something doesn’t mean you necessarily should, and this is easily applicable in this situation. Many firms have had disastrous results in the past when they exposed legacy internal applications to the Internet. The following questions must be answered when thinking about moving applications/services to the Cloud:

·         Does the application consume or generate data with jurisdictional requirements?

·         Will your company face fines or a public relations scandal if there is a security breach/data loss?

·         What part of your business value chain is exposed if the service runs poorly? (And are there critical systems that rely on it?)

·         What if the application/service doesn’t run at all? (Will you be left stranded or are there alternatives that will allow the business to remain functioning?)

Embracing Cloud services—public or private—comes with tremendous benefits, yet a constant dialogue about the business value of the service in question is required to reap the rewards. To discuss the benefits of adopting a hybrid On-Prem/Cloud solution contact Nubifer today.

Asigra Introduces Cloud Backup Plan

Cloud backup and recovery software provider Asigra announced the launch of Cloud Backup v10 on June 8. Available through the Asigra partner network, the latest edition extends the scope and performance of the Asigra platform, including protection for laptops, desktops, servers, data centers and cloud computing environments with tiered recovery options to meet Recovery Time Objectives (RTOs). Organizations can select an Asigra service provider for offsite backup, choose to deploy the software directly onsite, or both. Pricing begins at $50 per month through cloud backup service providers.

V10 expands the tiers of backup and recovery (Local-Only Backup, plus Backup Lifecycle Manager (BLM) for cloud storage) and also allows the backup of laptops in the field and other environments, enabling businesses to back up and recover their data to and from physical servers, virtual servers or both. Among the features are DS-Mobile support to back up laptops in the field, FIPS 140-2 NIST-certified security with encryption of data in-flight and at-rest, and new backup sets for comprehensive protection of enterprise applications, including MS Exchange, MS SharePoint, MS SQL, Windows Hyper-V, Oracle SBT, Sybase and Local-Only backup.

Senior analyst at the Enterprise Strategy Group Lauren Whitehouse said, “The local backup option is a powerful benefit for managed service providers (MSPs) as they can now offer more pricing granularity for customers on three levels—local, new and aging data. With more pricing flexibility, they can offer a more reliable and affordable backup service package to attract more business customers and free them from the pain of tape backup.”

At least two-thirds of companies in North America and Europe have already implemented server virtualization, according to Forrester Research. Asigra added enhancements to the virtualization support in v10 as a response to the major server virtualization vendors embracing the cloud as the strategic deliverable of a virtualized infrastructure. The company has offered support for virtual machine backups at the host level; Cloud Backup v10 can also be deployed as a virtual appliance within virtual infrastructures. The company said that the current version now supports Hyper-V, VMware and XenServer.

“The availability of Asigra Cloud Backup v10 has reset the playing field for Asigra with end-to-end data protection from the laptop to the data center to the public cloud. With advanced features that differentiate Asigra both technologically and economically from comparable solutions, the platform can adapt to the changing nature of today’s IT environments, providing unmatched backup efficiency and security as well as the ability to respond to dynamic business challenges,” said executive vice president for Asigra Eran Farajun. To discover how a Cloud back-up system can benefit your enterprise, contact Nubifer Inc.

The Future of Enterprise Software in the Cloud

Although there is currently a lot of discussion regarding the impact that cloud computing and Software-as-a-Service will have on enterprise software, it comes mainly from a financial standpoint. It is now time to begin understanding how enterprise software as we know it will evolve across a federated set of private and public cloud services.

The strategic direction being taken by Epicor is a prime example of the direction that enterprise software is taking. A provider of ERP software for the mid-market, Epicor is taking a sophisticated approach by allowing customers to host some components of the Epicor suite on premises rather than focusing on hosting the entire suite in the cloud. Other components are delivered as a service.

Epicor is a Microsoft software partner that subscribes to the Software Plus Services mantra and as such is moving to offer some elements of its software, like the Web server and SQL server components, as an optional service. Customers would be able to invoke this on the Microsoft Azure cloud computing platform.

Basically, Epicor is going to let customers deploy software components where they make the most sense, based on the needs of customers on an individual basis. This is in contrast to proclaiming that one model of software delivery is better than another model.

Eventually, every customer is going to require a mixed environment, even those that prefer on-premise software, because they will discover that hosting some applications locally and in the cloud simultaneously will allow them to run a global operation 24 hours a day, 7 days a week more easily.

Much of the argument over how software is delivered in the enterprise will melt away as customers begin to view the cloud as merely an extension of their internal IT operations. To learn more about how the future of Software in the Cloud can aid your enterprise, schedule a discussion time with a Nubifer Consultant today.

What Cloud APIs Reveal about the Budding Cloud Market

Although Cloud Computing remains hard to define, one of its essential characteristics is programmatic access to virtually unlimited network, compute and storage resources. The foundation of a cloud is a solid Application Programming Interface (API), despite the fact that many users access cloud computing through consoles and third-party applications.

CloudSwitch works with several cloud providers and thus is able to interact with a variety of cloud APIs (both active and about-to-be-released versions). CloudSwitch has come up with some impressions after working with both the APIs and those implementing them.

First, clouds remain different in spite of constant discussion about standards. Cloud APIs have to cover more than start/stop/delete a server, and once the API crosses into provisioning the infrastructure (network ranges, storage capacity, geography, accounts, etc.), it all starts to get interesting.
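To make the distinction concrete, here is a minimal sketch (in Python with the boto3 AWS SDK, a later library than the APIs reviewed in this article, and with a placeholder instance ID) of the basic start/stop/delete calls that nearly every cloud API covers; everything beyond these lifecycle calls is where providers begin to diverge.

    # Basic server lifecycle calls -- the baseline every cloud API covers.
    # boto3 and the instance ID below are illustrative assumptions, not
    # the specific APIs reviewed by CloudSwitch in this article.
    import boto3

    ec2 = boto3.client("ec2")
    instance_id = "i-0123456789abcdef0"  # placeholder

    ec2.start_instances(InstanceIds=[instance_id])
    ec2.stop_instances(InstanceIds=[instance_id])
    ec2.terminate_instances(InstanceIds=[instance_id])  # the "delete" step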

Second, a very strong infrastructure is required for a cloud to function as it should. In the case of public clouds, the infrastructure must be good enough to sell to others. Key elements of the cloud API can tell you about the infrastructure, what tradeoffs the cloud provider has made and the impact on end users, if you know what to look for.

Third, APIs are evolving fast, like cloud capabilities themselves. New API calls and expansions of existing functions are now a reality as cloud providers add new capabilities and features. We are regularly discussing on-the-horizon services with cloud providers and what form their APIs are poised to take. This is a perfect opportunity to leverage the experience and work of companies like CloudSwitch as a means to integrate these new capabilities into a coherent data model.

When you look at the functions beyond simple virtual machine control, an API can give you an indication of what is happening in the cloud. Some like to take a peek at the network and storage APIs in order to understand how the cloud is built. Take Amazon, for example. In Amazon, the base network design is that each virtual server receives both a public and a private IP address. These addresses are assigned from a pool based on the location of the machine within the infrastructure. Even though there are two IP addresses, the public one is simply routed (NAT’ed) to the private address. With Amazon you have only a single network interface on your server—a simple and scalable architecture for the cloud provider to support. This will cause problems for applications requiring at least two NICs, such as some cluster applications.
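As a rough illustration of this addressing model, the sketch below (Python with the boto3 SDK, an assumption on our part rather than the API version discussed above) lists each instance alongside the private address it sees internally and the public address that is NAT’ed to it.

    # List each server's private address and the public address NAT'ed to it.
    # boto3 is a later AWS SDK used here purely for illustration.
    import boto3

    ec2 = boto3.client("ec2")
    for reservation in ec2.describe_instances()["Reservations"]:
        for instance in reservation["Instances"]:
            print(instance["InstanceId"],
                  "private:", instance.get("PrivateIpAddress"),
                  "public:", instance.get("PublicIpAddress", "none assigned"))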

Terremark’s cloud offering is in stark contrast to Amazon’s. IP addresses are still defined by the provider so they can route traffic to your servers, but Terremark allocates a range for your use when you first sign up (while Amazon uses a generic pool of addresses). This can be seen as a positive because you have better control over the assignment of network addresses, but the flip side is potential scaling issues because you only have a limited number of addresses to work with. Additionally, you can assign up to four NICs to each server in Terremark’s Enterprise cloud (which allows you to create more complex network topologies and support applications requiring multiple networks for proper operation).

One important thing to consider is that with the Terremark model, servers only have internal addresses. There is no default public NAT address for each server, as with Amazon. Instead, Terremark has created a front-end load balancer that can be used to connect a public IP address to a specified set of servers by protocol and port. You must first create an “Internal Service” (in the language of Terremark) that defines a public IP/Port/Protocol combination. Next, assign a server and port to the Service, which creates the connection. Since this is a load balancer, you can add more than one server to each public IP/Port/Protocol group. Amazon has a load balancer function as well; although it isn’t required to connect public addresses to your cloud servers, it does support connecting multiple servers to a single public IP address.
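For the Amazon side of this comparison, a hedged sketch of the load-balancer pattern looks roughly like the following (Python with boto3 and the classic Elastic Load Balancing API; the load balancer name, availability zone and instance IDs are placeholders): one public endpoint is created for a protocol and port, and more than one server is then attached behind it.

    # One public HTTP endpoint fronting several servers -- illustrative only.
    import boto3

    elb = boto3.client("elb")

    # Accept public traffic on port 80 and forward it to port 80 on instances.
    elb.create_load_balancer(
        LoadBalancerName="example-web-lb",
        Listeners=[{"Protocol": "HTTP", "LoadBalancerPort": 80,
                    "InstanceProtocol": "HTTP", "InstancePort": 80}],
        AvailabilityZones=["us-east-1a"],
    )

    # Register more than one server behind the same public address.
    elb.register_instances_with_load_balancer(
        LoadBalancerName="example-web-lb",
        Instances=[{"InstanceId": "i-0123456789abcdef0"},
                   {"InstanceId": "i-0fedcba9876543210"}],
    )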

When it comes down to it, the APIs and the feature sets they define tell a lot about the capabilities and design of a cloud infrastructure. The end user features, flexibility and scalability of the whole service will be impacted by decisions made at the infrastructure level (such as network address allocation, virtual device support and load balancers). It is important to look down to the API level when considering what cloud environment you want because it helps you to better understand how the cloud providers’ infrastructure decisions will impact your deployments.

Although building a cloud is complicated, it can provide a powerful resource when implemented correctly. Clouds with different “sweet spots” emerge when cloud providers choose key components and a base architecture for their service. With CloudSwitch, you can span these different clouds and put the right application in the right environment. To schedule a time to discuss how Cloud Computing can help your enterprise, contact Nubifer today.

App Engine and VMware Plans Show Google’s Enterprise Focus

Google opened its Google I/O developer conference in San Francisco on May 19 with the announcement of its new version of the Google App Engine, Google App Engine for Business. This was a strategic announcement, as it shows Google is focused on demonstrating its enterprise chops. Google also highlighted its partnership with VMware to bring enterprise Java developers to the cloud.

Vic Gundotra, vice president of engineering at Google said via a blog post: “… we’re announcing Google App Engine for Business, which offers new features that enable companies to build internal applications on the same reliable, scalable and secure infrastructure that we at Google use for our own apps. For greater cloud portability, we’re also teaming up with VMware to make it easier for companies to build rich web apps and deploy them to the cloud of their choice or on-premise. In just one click, users of the new versions of SpringSource Tool Suite and Google Web Toolkit can deploy their application to Google App Engine for Business, a VMware environment or other infrastructure, such as Amazon EC2.”

Enterprise organizations can build and maintain their own applications on the same scalable infrastructure that powers Google Applications with Google App Engine for Business. Additionally, Google App Engine for Business has added management and support features that are tailored for each unique enterprise. New capabilities with this platform include: the ability to manage all the apps in an organization in one place; premium developer support; simple pricing based on users and applications; a 99.9 percent uptime service-level agreement (SLA); and access to premium features such as cloud-based SQL and SSL (coming later this year).

Kevin Gibbs, technical lead and manager of the Google App Engine project said during the May 18 Google I/O keynote that “managing all the apps at your company” is a prevalent issue for enterprise Web developers. Google sought to address this concern through its Google App Engine hosting platform but discovered it needed to shore it up to support enterprises. Said Gibbs, “Google App Engine for Business is built from the ground up around solving the problems that enterprises face.”

Product management director for developer technology at Google Eric Tholome told eWEEK that Google App Engine for Business allows developers to use standards-based technology (like Java, the Eclipse IDE, Google Web Toolkit (GWT) and Python) to create applications that run on the platform. Google App Engine for Business also delivers dynamic scaling, flat-rate pricing and consistent availability to users.
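To give a sense of how little is required to get an application running on the platform, here is a minimal sketch of a Python handler of the kind App Engine hosted at the time (using the webapp2 framework; the route and message are our own illustrative choices, not Google’s sample code).

    # A minimal App Engine-style Python request handler (webapp2).
    import webapp2

    class MainPage(webapp2.RequestHandler):
        def get(self):
            self.response.headers["Content-Type"] = "text/plain"
            self.response.write("Hello from App Engine")

    # App Engine serves the WSGI application defined here.
    app = webapp2.WSGIApplication([("/", MainPage)], debug=True)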

Gibbs revealed that Google will be doling out the features in Google App Engine for Business throughout the rest of 2010, with Google’s May 19 announcement acting as a preview of the platform. The platform includes an Enterprise Administration Console, a company-based console which allows users to see, manage and set security policies for all applications in their domain. The company’s road map states that features like support, the SLA, billing, hosted SQL and custom domain SSL will come at a later date.

Gibbs said that pricing for Google App Engine for Business will be $8 per month per user for each application with the maximum being $1,000 per application per month.
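Taken at face value, that pricing rule is easy to model; the sketch below is purely illustrative arithmetic based on the figures quoted above ($8 per user per application per month, capped at $1,000 per application per month) and is not an official billing calculator.

    # Illustrative arithmetic for the quoted App Engine for Business pricing.
    def monthly_cost_per_app(active_users, rate=8.0, cap=1000.0):
        # Return the monthly charge for one application, with the cap applied.
        return min(active_users * rate, cap)

    print(monthly_cost_per_app(50))   # 50 users  -> 400.0
    print(monthly_cost_per_app(500))  # 500 users -> capped at 1000.0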

Google also announced a series of technology collaborations with VMware. The goal of these is to deliver solutions that make enterprise software developers more efficient at building, deploying and managing applications within all types of cloud environments.

President and CEO of VMware Paul Maritz said, “Companies are actively looking to move toward cloud computing. They are certainly attracted by the economic advantages associated with cloud, but increasingly are focused on the business agility and innovation promised by cloud computing. VMware and Google are aligning to reassure our mutual customers that choice and portability are important to both companies. We will work to ensure that modern applications can run smoothly within the firewalls of a company’s data center or out in the public cloud environment.”

Google is essentially trying to pick up speed in the enterprise with Java developers using the popular Spring Framework (stemming from VMware’s SpringSource division). Recently, VMware entered into a similar partnership with Salesforce.com.

Maritz continued to say to the audience at Google I/O, “More than half of the new lines of Java code written are written in the context of Spring. We’re providing the back-end to add to what Google provides on the front end. We have integrated the Spring Framework with Google Web Toolkit to offer an end-to-end environment.”

Google and VMware are teaming up in multiple ways to make cloud applications more productive, portable and flexible. These collaborations will enable Java developers to build rich Web applications, use Google and VMware performance tools on cloud apps and subsequently deploy Spring Java applications on Google App Engine.

Google’s Gundotra explained, “Developers are looking for faster ways to build and run great Web applications, and businesses want platforms that are open and flexible. By working with VMware to bring cloud portability to the enterprise, we are making it easy for developers to deploy rich Java applications in the environments of their choice.”

Google’s support for Spring Java apps on Google App Engine is part of a shared vision to make building, running and managing applications for the cloud easier, and in a way that renders the applications portable across clouds. Developers can build applications using the Eclipse-based SpringSource Tool Suite and have the flexibility to choose to deploy them in their current private VMware vSphere environment, in VMware vCloud partner clouds or directly to Google App Engine.

Google and VMware are also collaborating to combine the speed of development of Spring Roo–a next-generation rapid application development tool–with the power of the Google Web Toolkit to create rich browser apps. These GWT-powered applications can create a compelling end-user experience on computers and smartphones by leveraging modern browser technologies like HTML5 and AJAX.

With the goal of enabling end-to-end performance visibility of cloud applications built using Spring and Google Web Toolkit, the companies are collaborating to more tightly integrate VMware’s Spring Insight performance tracing technology within the SpringSource tc Server application server with Google’s Speed Tracer technology.

Speaking about the Google/VMware partnership, vice president at Nucleus Research Rebecca Wettemann told eWEEK, “In short, this is a necessary step for Google to stay relevant in the enterprise cloud space. One concern we have heard from those who have been slow to adopt the cloud is being ‘trapped on a proprietary platform.’ This enables developers to use existing skills to build and deploy cloud apps and then take advantage of the economies of the cloud. Obviously, this is similar to Salesforce.com’s recent announcement about its partnership with VMware–we’ll be watching to see how enterprises adopt both. To date, Salesforce.com has been better at getting enterprise developers to develop business apps for its cloud platform.”

For his part, Frank Gillett, an analyst with Forrester Research, describes the Google/VMware partnership as more “revolutionary” and the Salesforce.com/VMware partnership to create VMforce as “evolutionary.”

“Java developers now have a full Platform-as-a-Service [PaaS] place to go rather than have to provide that platform for themselves,” said Gillett of the new Google/VMware partnership. He added, however, “What’s interesting is that IBM, Oracle and SAP have not come out with their own Java cloud platforms. I think we’ll see VMware make another deal or two with other service providers. And we’ll see more enterprises application-focused offerings from Oracle, SAP and IBM.”

Google’s recent enterprise moves show that the company is set on gaining more of the enterprise market by enabling enterprise organizations to buy applications from others through the Google Apps Marketplace (and the recently announced Chrome Web Store), buy from Google with Google Apps for Business, or build their own enterprise applications with Google App Engine for Business. Nubifer Inc. is a leading Research and Consulting firm specializing in Cloud Computing and Software as a Service.

Cloud Computing Business Models on the Horizon

Everyone is wondering what will follow SaaS, PaaS and IaaS, so here is a tutorial on some of the emerging cloud computing business models on the horizon.

Computing arbitrage:

Companies like broadband.com are buying bandwidth at a wholesale rate and reselling it to companies to meet their specific needs. Peekfon began buying data bandwidth in bulk and slicing it up to sell to its customers as a way to solve the problem of expensive roaming for customers in Europe. The company was able to negotiate with the operators to buy bandwidth in bulk because it intentionally decided to steer away from voice plans. It also uses heavy compression on its devices to optimize the bandwidth.

While elastic computing is an integral part of cloud computing, not all companies that want to leverage the cloud necessarily like it. Companies with unique cloud computing needs—like fixed long-term computing that grows at a relatively fixed low rate, with seasonal peaks—have a problem that can easily be solved via intermediaries. Since being a cloud provider requires high capital expenditure, there will be fewer and fewer cloud providers. Being a “cloud VAR” could be a good value proposition for vendors that are cloud systems integrators or that have a portfolio of cloud management offerings.

App-driven and content-driven clouds:

Now that the competition between private and public clouds is nearly over, it is time to think about the vertical cloud. Computing needs depend on what is being computed: the applications’ specific requirements, the nature and volume of the data being processed and the kind of content being delivered. In the current SaaS world, vendors are optimizing the cloud to match their application and content needs, and some are predicting that a few companies will help ISVs by delivering app-centric and content-centric clouds.

For advocates of net neutrality, the current cloud neutrality (clouds are application-agnostic) is positive, but innovation on top of raw clouds is still needed. Developers need fine-grained knobs for CPU compute, I/O, main-memory computing and the other varying needs of their applications. Extensions today are specific to a programming stack, like Heroku for Ruby, but the opportunity is here to provide custom vertical extensions for an existing cloud, or to build a cloud that is purpose-built for a specific class of applications with a range of stack options underneath (making it easy for developers to leverage the cloud natively). Nubifer Inc. provides Cloud and SaaS Consulting services to enterprise companies.

U.S. Government Moves to the Cloud

The U.S. Recovery, Accountability and Transparency Board recently announced the move of its Recovery.gov site to a cloud computing infrastructure. That cloud computing infrastructure is powered by Amazon.com’s Elastic Compute Cloud (EC2) and will grant the U.S. Recovery Accountability and Transparency Board more efficient computer operation, reduced costs and improved security.

Amazon Web Services’ (AWS) cloud technology was selected as the foundation for the move by Smartronix, which acted as the prime contractor on the migration made by the U.S. Recovery Accountability and Transparency Board. Also in the May 13 announcement, the board said Recovery.gov is now the first government-wide system to move into the cloud.

The U.S. government’s official Website that provides easy access to data related to Recovery Act spending, Recovery.gov allows for the reporting of potential fraud, waste and abuse. The American Recovery and Reinvestment Act of 2009 created the Recovery Accountability and Transparency Board with two goals in mind: to provide transparency related to the use of Recovery-related funds, and to prevent and detect fraud, waste and mismanagement.

CEO of Smartronix John Parris said of the announcement, “Smartronix is honored to have supported the Recovery Board’s historic achievement in taking Recovery.gov, the standard for open government, to the Amazon Elastic Compute Cloud (EC2). This is the first federal Website infrastructure to operate on the Amazon EC2 and was achieved due to the transparent and collaborative working relationship between Team Smartronix and our outstanding government client.”

The board anticipates that the move will save approximately $750,000 during its current budget cycle and result in long-term savings as well. For fiscal years 2010 and 2011, direct cost savings to the Recovery Board will be $334,800 and $420,000, respectively.

Aside from savings, the move to the cloud will free up resources and enable the board’s staff to focus on its core mission of providing Recovery.gov’s users with rich content without worrying about management of the Website’s underlying data center and related computer equipment.

In a statement released in conjunction with the announcement, vice president of Amazon Web Services Adam Selipsky said, “Recovery.gov is demonstrating how government agencies are leveraging the Amazon Web Services cloud computing platform to run their technology infrastructure at a fraction of the cost of owning and managing it themselves. Building on AWS enables Recovery.gov to reap the benefits of the cloud–including the ability to add or shed resources as needed, paying only for resources used and freeing up scarce engineering resources from running technology infrastructure–all without sacrificing operational performance, reliability, or security.”

The Board’s Chairman, Earl Devaney, said, “Cloud computing strikes me as a perfect tool to help achieve greater transparency and accountability. Moving to the cloud allows us to provide better service at lower costs. I hope this development will inspire other government entities to accelerate their own efforts. The American taxpayers would be the winners.”

Board officials also said that greater protection against network attacks and real-time detection of system tampering are among the security improvements from the move. Amazon’s computer security platform has essentially been added to the Board’s own security system (which will continue to be maintained and operated by the Board’s staff).

President of Environmental Systems Research Institute (ESRI) Jack Dangermond also released a statement after the announcement was made. “Recovery.gov broke new ground in citizen participation in government and is now a pioneer in moving to the cloud. Opening government and sharing data through GIS are strengthening democratic processes of the nation,” said Dangermond. “The Recovery Board had the foresight to see the added value of empowering citizens to look at stimulus spending on a map, to explore their own neighborhoods, and overlay spending information with other information. This is much more revealing than simply presenting lists and charts and raises the bar for other federal agencies.” For more information please visit Nubifer.com.

Facebook Security and Privacy: Ten Reminders to Live By

Facebook is arguably the largest social network on the globe, and because of that there are security and privacy issues that users need to remember. Here is a list of ten reminders to consider.

A reminder of why users need to be on guard when using Facebook arose during the week of May 3, when users of the social network discovered that they were being permitted to view their friends’ private chat conversations. The loophole was quickly fixed by the folks over at Facebook, but users’ concerns about privacy issues remain.

A few months prior to the May 3 incident, some Facebook users received private messages that were meant for other users. Facebook acted similarly in this case, swiftly addressing the problem, but once again privacy advocates began to question whether Facebook was taking enough measures to protect data.

Facebook has maintained that these minor glitches are fixed quickly, and users must remember that it is nearly impossible for a social network service with over 400 million active users to deliver absolute data security 100 percent of the time. When joining Internet social networks, users need to expect their personal data to be vulnerable to a certain degree and make it their duty to maintain personal privacy and security on a social network.

Ten reminders to live by:

1. Privacy Concerns

There are legitimate privacy concerns that users need to be aware of in order to understand the issues that may arise when using Facebook. As soon as you acknowledge that Facebook isn’t without flaws, you can begin to safeguard your data. Once you have a better understanding of privacy on the Web, you can alter the way in which you use social networks.

2. Holes

The number of ways hackers find to target Facebook’s users increases as the site becomes more and more popular. One of these malicious hackers’ tactics employs a phishing scam that asks users to input their credentials into a faux Facebook look-alike. Once a user does so, hackers have access to their log-in information and can alter that person’s profile and send that information to others.

3. Only Offer What You Want Others to See

Third parties can only see the information that you put on the social network. This seems simple, but it is an important thing to remember. Facebook is a place where users can communicate with friends, and some users use it as a platform to reveal things that they should not. It is important for users to remember that what they intend to share with a smaller group may eventually be accessed by others.

4. Facebook is Meant for Adults

Facebook originated as an online space for college students, but as the social network expanded it began to include generations above and below the collegiate level … meaning kids. It is important to remember that the Web remains a dangerous place for kids and that if adults are concerned about privacy then it isn’t a safe place for children.

5. Use the Facebook Privacy Settings

It is important to change your privacy settings before using Facebook. Even critics find Facebook’s privacy settings to be robust in the world of social networking. Users can decide which people are permitted to see the content in their profiles within a few minutes of reviewing the site’s settings. Facebook highlights the importance of privacy and equips users with the tools to feel comfortable on the social network.

6. Be Wary of Sharing Sensitive Information on the Web

The Web may have been a bastion of anonymity years ago, but that era is over. Users share more and more information on sites like Facebook and as a result the desire for anonymity has gradually diminished. Users need to remember that the Internet isn’t the place to disclose sensitive information and consequently only share what they are comfortable with all Web users seeing.

7. Is Privacy Best for a Social Network?

Facebook’s default settings make certain information available to others, thus it isn’t in a social network’s best interest for users to be able to use every single privacy setting. Users will need to be more diligent because the more information that they share on a social network, the more likely people are to want to use it. This fact is already known by Facebook, MySpace and Google and users need to know it too and begin fighting back.

8. Alternatives Aren’t Immune to Security Issues

Facebook alternatives aren’t any better in terms of privacy and security issues. Google Buzz, for example, has been a target of privacy advocates since its beginnings, with critics wondering why Google didn’t implement the right policies from the start. Facebook comes out on top when comparing privacy across the major social networks and consequently is probably the best choice for users concerned with privacy.

9. Some Privacy Is Lost and Gone

As users continue to reveal their true identities, the days of anonymity on the Web are numbered (if not gone completely). While some are uncomfortable with this, many users are becoming more accepting of this fact. Web users can expect their names and maybe even a picture to be available on the Web when signing up for social networks. Information such as their hometown and college is also freely available. Absolute privacy is a thing of the past and users need to accept this fact.

10. Blame Can Be Placed on Facebook and Users Alike

While Facebook is an easy scapegoat for privacy woes, a large part of the blame can be placed on users. Facebook relies on users sharing information with others as its basic business model, and while it does attempt to maintain privacy, it is up to the users to control what information they choose to divulge. Additionally, it is incumbent upon users to educate themselves about the risks that could affect them if they don’t brush up on privacy and social networks. To learn more please visit Nubifer.com.

EMC CEO Joe Tucci Predicts Many Clouds in the Future

EMC isn’t alone in focusing on cloud computing during the EMC World 2010 show, as IT vendors, analysts and the like are buzzing about the cloud. But according to EMC CEO Joe Tucci, the storage giant has a new prediction for the future of cloud computing. During his keynote speech on May 10, and a subsequent discussion with reporters and analysts, Tucci said that EMC’s vision of the future varies from others because it sees many private clouds. This exists in stark contrast with the vision of only a few vendors—like Google, Amazon and Microsoft—offering massive public clouds.

“There won’t be four, five or six giant cloud providers. At the end of the day, you’ll have tens of thousands of private clouds and hundreds of public clouds,” said Tucci.

EMC plans on taking on the role of helping businesses move to private cloud environments, where IT administrators have the ability to view multiple data centers as a single pool of resources. These enterprises with their private clouds will also work with public cloud environments, according to Tucci.

The increased complexity and costs of current data centers serve as a catalyst for the demand for cloud computing models. Tucci says that this explosion of data—which comes from multiple sources, including the growth of mobile device users, medical imaging advancements, increased access to broadband and smart devices—is poised to grow further. “Obviously, we need a new approach, because … infrastructures are too complex and too costly. Enter the cloud. This is the new approach,” Tucci said.

According to Tucci, clouds will be based mainly on x86 architectures, feature converged networks and federated resources and will be dynamic, secure, flexible, cost efficient and reliable. These clouds will also be accessible via multiple devices, a growing need due to the ever-increasing use of mobile devices.

EMC’s May 10 announcements were focused on the push for the private cloud, including the introduction of the VPlex appliances and an expanded networking strategy. Said Tucci, “Our mission is to be your guide and to help you on this journey to the private cloud.”

Tucci said that because of the high level of performance in x86 processors from Intel and Advanced Micro Devices, he isn’t predicting a long-term future for other architectures in cloud computing. As an example, Tucci pointed to Intel’s eight-core Xeon 7500 “Nehalem EX” processors, which can support up to 1 terabyte of memory, with systems OEMs prepping to unveil servers with as many as eight of the processors.

Speaking about the overall growth of x86 processor shipments and revenues, Tucci said that RISC architectures and mainframes will continue to slip: “What I’m saying is, we’re convinced, and everything that EMC does, and everything Cisco does, will be x86-based. Yes, we’re placing a bet on x86, and we’re going to an all-x86 world.” EMC is currently in the midst of a three-year process of migrating to a private cloud environment. This will include abandoning platforms like Solaris and moving to an all-x86 environment. For more information, please visit Nubifer.com.

Cloud Computing Security Play Made by McAfee with McAfee Cloud Secure

A new service targeting Software-as-a-Service providers from McAfee combines vulnerability scanning and security certification for cloud infrastructures. The service—called the McAfee Cloud Secure program—is designed to complement the annual audits of security and process controls that most cloud vendors undergo for the purpose of certification. McAfee officials say that with McAfee Cloud Secure they will team up with certification providers to offer an additional level of security by offering a daily scan of application, network perimeter and infrastructure vulnerabilities. Those that pass will be rewarded with a “McAfee SECURE” seal of approval.

Earlier this month at the RSA security conference, securing cloud environments was a major topic up for discussion. A survey by IDC on attitudes towards the cloud revealed that 87.5 percent of participants said the most significant obstacles to cloud adoption were security concerns. IDC analyst Christian Christiansen said in a statement, “SaaS vendors have a difficult time convincing prospects that their services are secure and safe.” According to Christiansen, though, McAfee’s new offering is a step in the right direction toward increased security in the cloud.

McAfee and other vendors have discussed providing security from the cloud in the past, but this announcement shows the increasing focus on providing solutions to secure cloud environments themselves in the industry.

Marc Olesen, senior vice president and general manager of McAfee’s Software-as-a-Service business, said in an interview with eWEEK, “McAfee looks at the cloud really from three different angles, which is security from the cloud, in the cloud and for the cloud. What’s really been out there today are (annual) process certification audits … that address the process controls and security controls that cloud providers have in place. This has typically been an ISO-27001 certification or an SAS-70 certification that cloud providers are pursuing, and we feel that that’s very important, but it’s just a start.” For more information please contact a Nubifer representative today.

Cloud-Optimized Infrastructure and New Services on the Horizon for Dell

Over the past three years, Dell has gained experience in the Cloud through its Data Center Solutions group, which designed customized offerings for cloud and hyperscale IT environments. The company is now putting that experience to use, releasing several new hardware, software and service offerings optimized for cloud computing environments. Dell officials launched the new offerings—which include a new partner program, new servers optimized for cloud computing and new services designed to help businesses migrate to the cloud—at a San Francisco event on March 24.

Based on work the Dell Data Center Solutions group has completed over the past three years, the new offerings were outlined by Valeria Knafo, senior manager of business development and business marketing for the DCS unit. According to Knafo, DCS has built customized computing infrastructures for large cloud service providers and hyperscale data centers and is now trying to make their solutions available to enterprises. Said Knafo, “We’ve taken that experience and brought it to a new set of users.”

Dell officials revealed that they have been working with Microsoft on its Windows Azure cloud platform and that the software giant will work with Dell to create joint cloud-based solutions. Dell and Microsoft will continue to collaborate around Windows Azure (including offering services) and Microsoft will continue buying Dell hardware for its Azure platform as well. Turnkey cloud solutions—including pre-tested and pre-assembled hardware, software and services packages that businesses can use to deploy and run their cloud infrastructures quickly—are among the new offerings.

A cloud solution for Web applications will be the first Platform-as-a-Service offering made available. It will combine Dell servers and services with Web application software from Joyent, and it will come with its own challenges, Dell officials caution, like unpredictable traffic and the migration of apps from development to production. Dell is also offering a new Cloud Partner Program. According to officials, it will broaden options for customers seeking to move into private or public clouds. Dell announced three new software companies as partners as well: Aster Data, Greenplum and Canonical.

Also on the horizon for Dell is its PowerEdge C-series of servers, which are designed to be energy efficient and to offer the memory capacity and high performance that are vital to hyperscale environments such as HPC (high-performance computing), social networking, gaming, cloud computing and Web 2.0 workloads. The C1100 (designed for clustered computing environments), the C2100 (for data analytics, cloud computing and cloud storage) and the C6100 (a four-node cloud and cluster system which offers a shared infrastructure) are the three servers that make up the family.

In unveiling the PowerEdge C-Series, Dell is joining the growing industry trend of offering new systems optimized for cloud computing. For example, on March 17 Fujitsu unveiled the Primergy CX1000, a rack server created to offer the high performance such environments need while lowering costs and power consumption. The Primergy CX1000 can also save on data center space through a design that pushes hot air from the system through the top of the enclosure as opposed to the back.

Last, but certainly not least, are Dell’s Integrated Solution Services. They offer complete cloud lifecycle management and include workshops to assess a company’s readiness to move to the cloud. Knafo said that the services are a combination of what Dell gained with the acquisition of Perot Systems and what it had already. “There’s a great interest in the cloud, and a lot of questions on how to get to the cloud. They want a path and a roadmap identifying what the cloud can bring,” said Knafo.

Mike Wilmington, a planner and strategist for Dell’s DCS group, claimed the services will decrease confusion many enterprises may have about the cloud. Said Wilmington, “Clouds are what the customer wants them to be,” meaning that while cloud computing may offer essentially the same benefits to all enterprises (cost reductions, flexibility, improved management and greater energy efficiency) it will look different for every enterprise. For more information please visit Nubifer.com.

Cisco, Verizon and Novell Make Announcements about Plans to Secure the Cloud

Cisco Systems, Verizon Business and Novell announce plans to launch offerings designed to heighten security in the cloud.

On April 28, Cisco announced security services based around email and the Internet that are part of the company’s cloud protection push and its Secure Borderless Network architecture; Cisco’s Secure Borderless Network architecture seeks to give users secure access to their corporate resources on any device, anywhere, at any time.

Cisco’s IronPort Email Data Loss Prevention and Encryption, and ScanSafe Web Intelligence Reporting are designed to work with Cisco’s other web security solutions to grant companies more flexibility when it comes to their security offerings while streamlining management requirements, increasing visibility and lowering costs.

Verizon and Novell made an announcement on April 28 about their plans to collaborate to create an on-demand identity and access management service called Secure Access Services from Verizon. Secure Access Services from Verizon is designed to enable enterprises to decide and manage who is granted access to cloud-based resources. According to the companies, the identity-as-a-service solution is the first of what will be a host of joint offerings between Verizon and Novell.

According to eWeek, studies continuously indicate that businesses are likely to continue trending toward a cloud-computing environment. With that said, issues concerning security and access control remain key concerns. Officials from Cisco, Verizon and Novell say that the new services will allow businesses to feel more at ease while planning their cloud computing strategies.

“The cloud is a critical component of Cisco’s architectural approach, including its Secure Borderless Network architecture,” said vice president and general manager of Cisco’s Security technology business unit Tom Gillis in a statement. “Securing the cloud is highly challenging. But it is one of the top challenges that the industry must rise to meet as enterprises increasingly demand the flexibility, accessibility and ease of management that cloud-based applications offer for their mobile and distributed workforces.”

Cisco purchased ScanSafe in December 2009 and the result is Cisco’s ScanSafe Web Intelligence Reporting platform. The platform is designed to give users a better idea of how their Internet resources are being used, and the objective is to ensure that business-critical workloads aren’t being encumbered by non-business-related traffic. Cisco’s ScanSafe Web Intelligence Reporting platform can report on user-level data and information on Web communications activities within seconds, and offers over 80 predefined reports.

Designed to protect outbound email in the cloud, the IronPort email protection solution is perfect for enterprises that don’t want to manage their email. Cisco officials say that it provides hosted mailboxes (while keeping control of email policies) and also offers the option of integrated encryption.

Officials say Cisco operates over 30 data centers around the globe and that its security offerings handle large quantities of activity each day—including 2.8 billion reputation look-ups, 2.5 billion web requests and the detection of more than 250 billion spam messages—and these are the latest in the company’s expanding portfolio of cloud security offerings.

Verizon and Novell’s collaboration—the Secure Access Services—is designed to enable enterprises to move away from the cost and complexity associated with using traditional premises-based identity and access management software for securing applications. These new services offer centralized management of web access to applications and networks in addition to identity federation and web single sign-on.

Novell CEO Ron Hovsepian released a statement saying, “Security and identity management are critical to accelerating cloud computing adoption and by teaming with Verizon we can deliver these important solutions.” While Verizon brings the security expertise, infrastructure, management capabilities and portal to the service, Novell provides the identity and security software. For more information contact a Nubifer representative today.

Cloud Interoperability Brought to Earth by Microsoft

Executives at Microsoft say that an interoperable cloud could help companies trying to lower costs and governments trying to connect constituents. Cloud services are increasingly seen as a way for businesses and governments to scale IT systems for the future, consolidate IT infrastructure, and enable innovative services not possible until now.

Technology vendors are seeking to identify and solve the issues created by operating in mixed IT environments in order to help organizations fully realize the benefits of cloud services. Additionally, vendors are collaborating to make sure that their products work well together. The industry may still be in the beginning stages of collaborating on cloud interoperability, but has already made great strides.

So what exactly is cloud interoperability and how can it benefit companies now? Cloud interoperability specifically concerns one cloud solution working with other platforms and applications—not just other clouds. Customers want to be able to run applications locally or in the cloud, or even on a combination of both. Currently, Microsoft is collaborating with others in the industry and is working to make sure that the premise of cloud interoperability becomes an actuality.

Microsoft’s general managers Craig Shank and Jean Paoli are spearheading Microsoft’s interoperability efforts. Shank helms the company’s interoperability work on public policy and global standards and Paoli collaborates with the company’s product teams to cater product strategies to the needs of customers. According to Shank, one of the main attractions of the cloud is the amount of flexibility and control it gives customers. “There’s a tremendous level of creative energy around cloud services right now—and the industry is exploring new ideas and scenarios all the time. Our goal is to preserve that flexibility through an open approach to cloud interoperability,” says Shank.

Paoli chimes in to say, “This means continuing to create software that’s more open from the ground up, building products that support technologies such as PHP and Java, and ensuring that our existing products work with the cloud.” Both Shank and Paoli are confident that welcoming competition and choice will allow Microsoft to become more successful down the road. “This may seem surprising,” says Paoli, before adding, “but it creates more opportunities for its customers, partners and developers.”

Shank reveals that due to the buzz about the cloud, some forget about the ultimate goal: “To be clear, cloud computing has enormous potential to stimulate economic growth and enable governments to reduce costs and expand services to citizens.” One example of the real-world benefits of cloud interoperability is the public sector. Microsoft is currently showing results in this area via solutions like its Eye on Earth project. Microsoft is helping the European Environment Agency simplify the collection and processing of environmental information for use by the general public and government officials. Eye on Earth obtains data from 22,000 water monitoring points and 1,000 stations that monitor air quality through employing Microsoft® Windows Azure, Microsoft® SQL Azure and already existing Linux technologies. Eye on Earth then helps synthesize the information and makes it accessible for people in 24 different languages in real time.

Product developments like this emerged out of feedback channels which the company developed with its partners, customers and other vendors. In 2006, for example, Microsoft created the Interoperability Executive Customer (IEC) Council, which is comprised of 35 chief technology officers and chief information officers from a variety of organizations across the globe. The group meets twice a year in Redmond to discuss issues concerning interoperability and provide feedback to Microsoft executives.

Additionally, Microsoft recently published a progress report which—for the first time—revealed operational details and results achieved by the Council across six work streams (or priority areas). The Council recently commissioned the creation of a seventh work stream for cloud interoperability geared towards developing standards related to the cloud which addressed topics like data portability, privacy, security and service policies.

Developers are an important part of cloud interoperability, and Microsoft is part of an effort the company co-founded with Zend Technologies, IBM and Rackspace called Simple Cloud. Simple Cloud was created to help developers write basic cloud applications that work on all major cloud platforms.

Microsoft is further engaging in the collaborative work of building technical “bridges” between the company and non-Microsoft technologies, like the recently released Microsoft® Windows Azure Software Development Kits (SDKs) for PHP and Java and tools for the new Windows® Azure platform AppFabric SDKs for Java, PHP and Ruby (Eclipse version 1.0), the SQL CRUD Application Wizard for PHP and the Bing 404 Web Page Error Toolkit for PHP. These examples show the dedication of the Microsoft Interoperability team.

Despite the infancy of the industry’s collaboration on cloud interoperability issues, much progress has already been made. This progress has had a major positive impact on the way even average users work and live, even if they don’t realize it yet. A wide perspective and a creative and collaborative approach to problem-solving are required for cloud interoperability. In the future, Microsoft will continue to support more conversation within the industry in order to define cloud principles and make sure all points of view are incorporated. For more information please contact a Nubifer representative today.

Amazon Sets the Record Straight About the Top Five Myths Surrounding Cloud Computing

On April 19, the 5th International Cloud Computing Conference & Expo (Cloud Expo) opened in New York City, and Amazon Web Services (AWS) used the event as a platform to address some of what the company sees as the lingering myths about cloud computing.

AWS officials said that the company continues to grapple with questions about features of the cloud, ranging from reliability and security to cost and elasticity, despite being one of the first companies to successfully and profitably implement cloud computing solutions. Adam Selipsky, vice president of AWS, recently spoke about the persisting myths of cloud computing from Amazon’s Seattle headquarters, specifically addressing five that linger in the face of increased industry adoption of the cloud and continued successful cloud deployments. “We’ve seen a lot of misperceptions about what cloud computing is,” said Selipsky before debunking five common myths.

Myth 1: The Cloud Isn’t Reliable

Chief information officers (CIOs) in enterprise organizations have difficult jobs and are usually responsible for thousands of applications, explains Selipsky in his opening argument, adding that they feel like they are responsible for the performance and security of these applications. When problems with the applications arise, CIOs are used to approaching their own people for answers and take some comfort that there is a way to take control of the situation.

Selipsky says that customers need to consider a few things when adopting the cloud, one of which is that AWS’ operational performance is strong. Selipsky reminded users that they own the data, they choose which location to store the data in (and it doesn’t move unless the customer decides to move it) and that regardless of whether customers choose to encrypt or not, AWS never looks at the data.

“We have very strong data durability—we’ve designed Amazon S3 (Simple Storage Service) for eleven 9’s of durability. We store multiple copies of each object across multiple locations,” said Selipsky. He added that AWS has a “Versioning” feature which allows customers to revert to the last version of any object they somehow lose due to application failure or an unintentional deletion. Customers can also make their applications more fault tolerant by deploying them across multiple Availability Zones or by using AWS’ load balancing and Auto Scaling features.
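As a rough illustration of the Versioning feature Selipsky mentions, here is a minimal sketch using the modern Python SDK for AWS (boto3), which postdates this article; the bucket name and object key are hypothetical, and error handling is omitted.

```python
import boto3

s3 = boto3.client("s3")

BUCKET = "example-records-bucket"   # hypothetical bucket name
KEY = "reports/quarterly.csv"       # hypothetical object key

# Turn on versioning so every overwrite or delete preserves the prior copy.
s3.put_bucket_versioning(
    Bucket=BUCKET,
    VersioningConfiguration={"Status": "Enabled"},
)

# After an accidental overwrite, list the object's versions (newest first)...
versions = s3.list_object_versions(Bucket=BUCKET, Prefix=KEY)["Versions"]
previous = next(v for v in versions if not v["IsLatest"])

# ...and copy the prior version back on top of the current one.
s3.copy_object(
    Bucket=BUCKET,
    Key=KEY,
    CopySource={"Bucket": BUCKET, "Key": KEY, "VersionId": previous["VersionId"]},
)
```

The same pattern applies whether the loss came from application failure or an accidental delete: the older version remains addressable by its version ID until it is explicitly removed.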

“And, all that comes with no capex [capital expenditures] for companies, a low per-unit cost where you only pay for what you consume, and the ability to focus your engineers on unique incremental value for your business,” said Selipsky, before adding that the reliability concerns stem merely from an illusion of control, not actual control. “People think if they can control it they have more say in how things go. It’s like being in a car versus an airplane, but you’re much safer in a plane,” he explained.

Myth 2: The Cloud Provides Inadequate Security and Privacy

When it comes to security, Selipsky notes that it is an end-to-end process and thus companies need to build security at every level of the stack. Taking a look at Amazon’s cloud, it is easy to note that the same security isolations are employed as with a traditional data center—including physical data center security, separation of the network, isolation of the server hardware and isolation of storage. Data centers had already become a frequently-shared infrastructure on the physical data center side before Amazon launched its cloud services. Selipsky added that companies realized that they could benefit by renting space in a data facility as opposed to building it.

When speaking about security fundamentals, Selipsky noted that security could be maintained by providing badge-controlled access, guard stations, monitored security cameras, alarms, separate cages and strictly audited procedures and processes. Not only do Amazon Web Services’ data centers follow the best practices employed in private data facilities, there is an added physical security advantage in the fact that customers don’t need access to the servers and networking gear inside. Access to the data center is thus controlled more strictly than in traditional rented facilities. Selipsky also added that the Amazon cloud has equal or better isolation than could be expected from dedicated infrastructure, at the physical level.

In his argument, Selipsky pointed out that networks ceased to be isolated physical islands a long time ago because, as companies increasingly began to need to connect to other companies—and then the Internet—their networks became connected with public infrastructure. Firewalls, switch configurations and other special network functionality were used to prevent bad network traffic from getting in, or conversely from leaking out. As their network traffic increasingly passed over public infrastructure, companies began using additional isolation techniques, such as Multi-protocol Label Switching (MPLS) and encryption, to make sure that every packet on (or leaving) their network remained secure.

Amazon used a similar approach to networking in its cloud by maintaining packet-level isolation of network traffic and supporting industry-standard encryption. Amazon Web Services’ Virtual Private Cloud allows customers to establish their own IP address space, so they can use the same tools and software infrastructure they are already familiar with to monitor and control their cloud networks. Amazon’s scale also allows for more investment in security policing and countermeasures than nearly any large corporation could afford. Maintains Selipsky, “Our security is strong and dug in at the DNA level.”

Amazon Web Services also invests significantly in testing and validating the security of its virtual server and storage environment. When discussing the investments made on the hardware side, Selipsky lists the following points:

After customers release these resources, the server and storage are wiped clean so no important data can be left behind.

Intrusion from other running instances is prevented because each instance has its own customer firewall.

Those in need of more network isolation can use Amazon VPC, which allows you to carry your own IP address space with you into the cloud; your instances are accessible only through those IP addresses, which only you know.

Those desiring to run on their own boxes—where no other instances are running—can purchase extra large instances where only that XL instance runs on that server.

According to Selipsky, Amazon’s scale allows for more investment in security policing and countermeasures: “In fact, we often find that we can improve companies’ security posture when they use AWS. Take the example lots of CIOs worry about—the rogue server under a developer’s desk running something destructive or that the CIO doesn’t want running. Today, it’s really hard (if not impossible) for CIOs to know how many orphans there are and where they might be. With AWS, CIOs can make a single API call and see every system running in their VPC [Virtual Private Cloud]. No more hidden servers under the desk or anonymously placed servers in a rack and plugged into the corporate network. Finally, AWS is SAS-70 certified; ISO 27001 and NIST are in process.”
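To make the “single API call” Selipsky describes concrete, here is a minimal sketch using the modern Python SDK for AWS (boto3), which did not exist when this piece was written; the region and VPC ID are hypothetical.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # hypothetical region

VPC_ID = "vpc-0abc1234"  # hypothetical VPC ID

# One (paginated) DescribeInstances call lists every instance in the VPC.
paginator = ec2.get_paginator("describe_instances")
pages = paginator.paginate(Filters=[{"Name": "vpc-id", "Values": [VPC_ID]}])

for page in pages:
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            print(
                instance["InstanceId"],
                instance["InstanceType"],
                instance["State"]["Name"],
            )
```

Because every instance must be launched through the same API, the inventory returned here is complete by construction, which is the point Selipsky is making about rogue servers.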

Myth 3: Creating My Own In-House Cloud or Private Cloud Will Allow Me to Reap the Same Benefits of the Cloud

According to Selipsky, “There’s a lot of marketing going on about the concept of the ‘private cloud.’ We think there’s a bit of a misnomer here.” Selipsky continued to explain that generally, “we often see companies struggling to accurately measure the cost of infrastructure. Scale and utilization are big advantages for AWS. In our opinion, a cloud has five key characteristics: it eliminates capex; allows you to pay for what you use; provides true elastic capacity to scale up and down; allows you to move very quickly and provision servers in minutes; and allows you to offload the undifferentiated heavy lifting of infrastructure so your engineers work on differentiating problems.”

Selipsky also pointed out the following drawbacks of private clouds: you still own the capex (and it is expensive); you don’t pay only for what you use; you don’t get true elasticity; and you still manage the undifferentiated heavy lifting. “With a private cloud you have to manage capacity very carefully … or you or your private cloud vendor will end up over-provisioning. So you’re going to have to either get very good at capacity management or you’re going to wind up overpaying,” said Selipsky before challenging the elasticity of the private cloud: “The cloud is shapeless. But if it has a tight box around it, it no longer feels very cloud-like.”

One of AWS’ key offerings is Amazon’s ability to save customers money while also driving efficiency. “In virtually every case we’ve seen, we’ve been able to save people a significant amount of money,” said Selipsky. This is in part because AWS’ business has greatly expanded over the last four years and Amazon has achieved enough scale to secure very low costs. AWS has been able to aggregate hundreds of thousands of customers to have a high utilization of its infrastructure. Said Selipsky, “In our conversations with customers we see that really good enterprises are in the 20-30 percent range on utilization—and that’s when they’re good … many are not that strong. The cloud allows us to have several times that utilization. Finally, it’s worth looking at Amazon’s heritage and AWS’ history. We’re a company that works hard to lower its costs so that we can pass savings back to our customers. If you look at the history of AWS, that’s exactly what we’ve done (lowering price on EC2, S3, CloudFront, and AWS bandwidth multiple times already without any competitive pressure to do so).”

Myth 4: The Cloud Isn’t Ideal Because I Can’t Move Everything at Once

Selipsky debunks this myth by saying, “We believe this is nearly impossible and ill-advised. We recommend picking a few apps to gain experience and comfort then build a migration plan. This is what we most often see companies doing. Companies will be operating in hybrid environments for years to come. We see some companies putting some stuff on AWS and then keeping some stuff in-house. And I think that’s fine. It’s a perfectly prudent and legitimate way of proceeding.”

Myth 5: The Biggest Driver of Cloud Adoption is Cost

In busting the final myth, Selipsky said, “There is a big savings in capex and cost but what we find is that one of the main drivers of adoption is that time-to-market for ideas is much faster in the cloud because it lets you focus your engineering resources on what differentiates your business.”

Summary

Speaking about all of the myths surrounding the cloud, Selipsky concludes that “a lot of this revolves around psychology and fear of change, and human beings needing to gain comfort with new things. Years ago people swore they would never put their credit card information online. But that’s no longer the case. We’re seeing great momentum. We’re seeing, more and more, over time these barriers [to cloud adoption] are moving.” For additional debunked myths regarding Cloud Computing visit Nubifer.com.

IBM Elevates Its Cloud Offerings with Purchase of Cast Iron Systems

IBM Senior Vice President and Group Executive for IBM Software Group Steve Mills announced the acquisition of cloud integration specialist Cast Iron Systems at the IBM Impact 2010 conference in Las Vegas on May 3. The privately held Cast Iron is based in Mountain View, California and delivers cloud integration software, appliances and services, so the acquisition broadens the delivery of cloud computing services for IBM’s clients. IBM’s business process and integration software portfolio grew over 20 percent during the first quarter, and the company sees this deal as a way to expand it further. The financial terms of the acquisition were not disclosed, although Cast Iron Systems’ 75 employees will be integrated into IBM.

According to IBM officials, Big Blue anticipates that the worldwide cloud computing market will grow at a compound annual rate of 28 percent, from $47 billion in 2008 to a projected $126 billion by 2012. The acquisition of Cast Iron Systems reflects IBM’s expansion of its software business around higher-value capabilities that help clients run companies more effectively.

IBM has transformed its business model to focus on higher-value, high-margin capabilities through organic and acquisitive growth over the past ten years, and the company’s software business has been a key catalyst in this shift. IBM’s software revenue grew 11 percent year over year during the first quarter, and the company generated $8 billion in software group profits in 2008 (up from $2.8 billion in 2000).

Since 2003, the IBM Software Group has acquired over 55 companies, and the acquisition of Cast Iron Systems is part of that strategy. Cast Iron Systems’ clients include Allianz, Peet’s Coffee & Tea, NEC, Dow Jones, Schumacher Group, ShoreTel, Time Warner, Westmont University and Sports Authority, and the cloud integration specialist has completed thousands of cloud integrations around the globe for retail organizations, financial institutions and media and entertainment companies.

IBM’s acquisition comes at a time when one of the major challenges facing businesses adopting cloud delivery models is integrating the disparate systems running in their data centers with new cloud-based applications, work that has traditionally been time-consuming and resource-draining. With the acquisition of Cast Iron Systems, IBM gains the ability to help businesses rapidly integrate their cloud-based applications and on-premises systems. Additionally, the acquisition advances IBM’s capabilities for a hybrid cloud model, which allows enterprises to blend data from on-premises applications with public and private cloud systems.

IBM, which is known for offering application integration capabilities for on-premises and business-to-business applications, will now be able to offer clients a complete platform to integrate cloud applications from providers like Amazon, Salesforce.com, NetSuite and ADP with on-premises applications like SAP and JD Edwards. Relationships between IBM and Amazon and Salesforce.com will essentially become friendlier due to this acquisition.

IBM said that it can use Cast Iron Systems’ hundreds of prebuilt templates and services expertise to eliminate expensive coding, thus allowing cloud integrations to be completed in mere days (rather than weeks, or even longer). These results can be achieved through using a physical appliance, a virtual appliance or a cloud service.

Craig Hayman, general manager for IBM WebSphere said in a statement, “The integration challenges Cast Iron Systems is tackling are crucial to clients who are looking to adopt alternative delivery models to manage their businesses. The combination of IBM and Cast Iron Systems will make it easy for clients to integrate business applications, no matter where those applications reside. This will give clients greater agility and, as a result, better business outcomes.”

IBM cited as an example Cast Iron Systems helping pharmaceutical distributor Amerisource Bergen Specialty Group connect Salesforce CRM with its on-premises corporate data warehouse. The company has since been able to give its customer service associates access to the accurate, real-time information they need to deliver a positive customer experience while realizing $250,000 in annual cost savings.

Cast Iron Systems additionally helped a division of global corporate insurance leader Allianz integrate Salesforce CRM with its on-premises underwriting applications to offer real-time visibility into contract renewals for its sales team and key performance indicators for sales management. IBM said that Allianz beat its own 30-day integration project deadline by replacing labor-intensive custom code with Cast Iron Systems’ integration solution.

President and chief executive officer of Cast Iron Systems Ken Comee said, “Through IBM, we can bring Cast Iron Systems’ capabilities as the world’s leading provider of cloud integration software and services to a global customer set. Companies around the world will now gain access to our technologies through IBM’s global reach and its vast network of partners. As part of IBM, we will be able to offer clients a broader set of software, services and hardware to support their cloud and other IT initiatives.”

IBM will remain consistent with its software strategy by supporting and enhancing Cast Iron Systems’ technologies and clients while simultaneously allowing them to utilize the broader IBM portfolio. For more information, visit Nubifer.com.

Transforming Into a Service-Centric IT Organization By Using the Cloud

While IT executives typically approach cloud services from the perspective of how they are delivered, this model neglects what cloud services are and how they are consumed. These two facets can have a large impact on the overall IT organization, points out eWeek Knowledge Center contributor Keith Jahn. Jahn maintains that it is very important for IT executives to move away from the current delivery-only focus by creating a world-class supply chain for managing the supply and demand of cloud services.

Using the popular fable The Sky Is Falling, known lovingly as Chicken Little, Jahn explains a possible future scenario that IT organizations may face due to cloud computing. As the fable goes, Chicken Little embarks on a life-threatening journey to warn the king that the sky is falling and on this journey she gathers friends who join her on her quest. Eventually, the group encounters a sly fox who tricks them into thinking that he has a better path to help them reach the king. The tale can end one of two ways: the fox eats the gullible animals (thus communicating the lesson “Don’t believe everything you hear”) or the king’s hunting dogs can save the day (thus teaching a lesson about courage and perseverance).

So what does this have to do with cloud computing? Cloud computing has the capacity to bring on a scenario that will force IT organizations to change, or possibly be eliminated altogether. The entire technology supply chain will be severely impacted if IT organizations are wiped out. Traditionally, cloud is viewed as a technology disruption and is assessed from a delivery orientation, posing questions like: how can this new technology deliver solutions cheaper, better and faster? An equally important yet often ignored aspect of this equation is how cloud services are consumed. Cloud services are ready to run, self-sourced, available wherever you are, and pay-as-you-go or subscription based.

New capabilities will emerge as cloud services grow and mature and organizations must be able to solve new problems as they arise. Organizations will also be able to solve old problems cheaper, better and faster. New business models will be ushered in by cloud services and these new business models will force IT to reinvent itself in order to remain relevant. Essentially, IT must move away from its focus on the delivery and management of assets and move toward the creation of a world-class supply chain for managing supply and demand of business services.

Cloud services become a forcing function in this scenario because they are forcing IT to transform. CIOs that choose to ignore this and neglect to make transformative measures will likely see their role shift from innovation leader to CMO (Chief Maintenance Officer), in charge of maintaining legacy systems and services sourced by the business.

Analyzing the Cloud to Pinpoint Patterns

The cloud really began in what IT folks now refer to as the “Internet era,” when people were talking about what was being hosted “in the cloud.” This was the first generation of the cloud, Cloud 1.0 if you will—an enabler that originated in the enterprise. Supply Chain Management (SCM) processes were revolutionized by commercial use of the Internet as a trusted platform and eventually the IT architectural landscape was forever altered.

This model evolved and produced thousands of consumer-class services, which used next-generation Internet technologies on the front end and massive scale architectures on the back end to deliver low-cost services to economic buyers. Enter Cloud 2.0, a more advanced generation of the cloud.

Beyond Cloud 2.0

Cloud 2.0 is driven by the consumer experiences that emerged out of Cloud 1.0. A new economic model and new technologies have surfaced since then, due to Internet-based shopping, search and other services. Services can be self-sourced from anywhere and from any device—and delivered immediately—while infrastructure and applications can be sourced as services in an on-demand manner.

Currently, most of the attention when it comes to cloud services remains focused on the new techniques and sourcing alternatives for IT capabilities, aka IT-as-a-Service. IT can drive higher degrees of automation and consolidation using standardized, highly virtualized infrastructure and applications. This results in a reduction in the cost of maintaining existing solutions and delivering new solutions.

Many companies are struggling with the transition from Cloud 1.0 to Cloud 2.0 due to the technology transitions required to make the move. As this occurs, the volume of services in the commercial cloud marketplace is increasing, propagation of data into the cloud is taking place and Web 3.0/semantic Web technology is maturing. The next generation of the cloud, Cloud 3.0, is beginning to materialize because of these factors.

Cloud 3.0 is significantly different because it will enable access to information through services set in the context of the consumer experience. This means that processes can be broken into smaller pieces and subsequently automated through a collection of services, which are woven together with massive amounts of data able to be accessed. With Cloud 3.0, the need for large-scale, complex applications built around monolithic processes is eliminated. Changes will be able to be made by refactoring service models and integration achieved by subscribing to new data feeds. New connections, new capabilities and new innovations—all of which surpass the current model—will be created.

The Necessary Reinvention of IT

IT is typically organized around the various technology domains, taking in new work via project requests and moving it through a Plan-Build-Run cycle. Herein lies the problem. This delivery-oriented, technology-centric approach has inherent latency built in, and that latency has created increasing tension with the business IT serves, which is why IT must reinvent itself.

IT must be reinvented so that it becomes the central service-sourcing control point for the enterprise, or it must accept that the business will source services on its own. By becoming the central service-sourcing control point for the enterprise, IT can maintain the required service levels and integrations. Changes to behavior, cultural norms and organizational models are required to achieve this.

IT Must Become Service-Centric in the Cloud

IT must evolve from a technology-centric organization into a service-centric organization in order to survive, as service-centric represents an advanced state of maturity for the IT function. Service-centric allows IT to operate as a business function—a service provider—created around a set of products which customers value and are in turn willing to pay for.

As part of the business strategy, these services are organized into a service portfolio. This model differs from the capability-centric model because the deliverable is the service that is procured as a unit through a catalog and for which the components—and sources of components—are irrelevant to the buyer. With the capability-centric model, the deliverables are usually a collection of technology assets which are often visible to the economic buyer and delivered through a project-oriented life cycle.

With the service-centric model, some existing roles within the IT organization will be eliminated and some new ones will be created. The result is a more agile IT organization which is able to rapidly respond to changing business needs and compete with commercial providers in the cloud service marketplace.

Cloud 3.0: A Business Enabler

Cloud 3.0 enables business users to source services that meet their needs quickly, cost-effectively and at a good service level—and on their own, without the help of an IT organization. Cloud 3.0 will usher in breakthroughs and innovations at an unforeseen pace and scope and will introduce new threats to existing markets for companies while opening new markets for others. In this way, it can be said that cloud is more of a business revolution than a technology one.

Rather than focusing on positioning themselves to adopt and implement cloud technology, a more effective strategy for IT organizations would be to focus on transforming the IT organization into a service-centric model that is able to source, integrate and manage services with high efficiency.

Back to the story and its two possible endings:

The first scenario suggests that IT will choose to ignore that its role is being threatened and continue to focus on the delivery aspects of the cloud. Under the second scenario, IT is rescued by transforming into the service-centric organization model and becoming the single sourcing control point for services in the enterprise. This will effectively place IT in control of fostering business innovation by embracing the next wave of cloud. For more information please visit Nubifer.com.

A Guide to Securing Sensitive Data in Cloud Environments

Due to the outsourced nature of the cloud and its innate loss of control, it is important to make sure that sensitive data is constantly and carefully monitored for protection. That task is easier said than done, which is why the following questions arise: How do you monitor a database server when its underlying hardware moves every day—sometimes even multiple times a day and sometimes without your knowledge? How do you ensure that your cloud computing vendor’s database administrators and system administrators are not copying or viewing confidential records inappropriately or abusing their privileges in another way?

When deploying a secure database platform in a cloud computing environment, these obstacles and many more are bound to arise, and an enterprise needs to be able to overcome them, as these barriers may be enough to prevent some enterprises from moving beyond their on-premises approach. There are three critical architectural concerns to consider when transferring applications with sensitive data to the cloud.

Issue 1: Monitoring an Ever-changing Environment

Cloud computing grants you the ability to move servers and add or remove resources in order to maximize the use of your systems and reduce expense. This increased flexibility and efficiency often means that the database servers housing your sensitive data are constantly being provisioned and deprovisioned. Each of these scenarios represents a potential target for hackers, which is an important point to consider.

Monitoring data access becomes more difficult due to the dynamic nature of a cloud infrastructure. If the information in those applications is subject to regulations like the Payment Card Industry Data Security Standard (PCI DSS) or the Health Insurance Portability and Accountability Act (HIPAA), it is vital to make sure that it is secure.

When thinking about solutions to monitor activity on these dynamic database servers, it is essential to find a methodology that can be deployed on new database servers without management involvement. This requires a distributed model in which each instance in the cloud has a sensor or agent running locally, and this software must be able to be provisioned automatically along with the database software, without requiring intrusive system management.

It won’t always be possible to reboot whenever it is necessary to install, upgrade or update the agents in a multitenancy environment such as this, and the cloud vendor may even place limitations on installation of software requiring certain privileges. With the right architecture in place, you will be able to see where your databases are hosted at any point in time and will be able to centrally log all activity and flag suspicious events across all servers, wherever they are.

Issue 2: Working in a WAN

Currently, database activity monitoring solutions utilize a network-sniffing model to identify malicious queries, but this approach isn’t feasible in the cloud environment because the network encompasses the entire Internet. Another method that doesn’t work in the cloud is adding a local agent which sends all traffic to a remote server.

The solution is something designed for distributed processing, where the local sensor is able to analyze traffic autonomously. Another thing to consider is that the cloud computing resources procured are likely to be on a WAN, so network bandwidth and network latency will make off-host processing inefficient. With cloud computing, you are likely unable to colocate a server close to your databases. This means that the time and resources spent sending every transaction to a remote server for analysis will stunt network performance and also hinder timely interruption of malicious activity.

So when securing databases in cloud computing, a better approach is to utilize a distributed monitoring solution that is based on “smart” agents. That way, once a security policy for a monitored database is in place, that agent or sensor is able to implement protection and alerting locally and thus prevent the network from turning into the gating factor for performance.
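As a minimal sketch of the “smart” agent idea (not any particular vendor’s product), the Python snippet below evaluates a locally cached policy on the database host and transmits only violations, so routine traffic never crosses the WAN; the policy rules and audit-event fields are hypothetical.

```python
import fnmatch

# Hypothetical policy cached locally on the database host.
POLICY = [
    {"object": "billing.cards", "allowed_users": {"billing_app"}},
    {"object": "hr.*", "allowed_users": {"hr_app", "payroll_app"}},
]

def violations(event):
    """Yield a reason for every policy rule the audit event breaks."""
    for rule in POLICY:
        if fnmatch.fnmatch(event["object"], rule["object"]) and \
                event["user"] not in rule["allowed_users"]:
            yield f"{event['user']} accessed {event['object']}"

def handle(event, send_alert):
    # Analysis happens locally; only alerts are sent to the central console.
    for reason in violations(event):
        send_alert({"host": event["host"], "reason": reason})

# Example with a stubbed transport: prints one alert for the violation below.
handle(
    {"host": "db-42", "user": "jsmith", "object": "billing.cards"},
    send_alert=print,
)
```

The design choice is simply to push policy evaluation to the edge: the only traffic that must traverse the WAN is the (small) stream of alerts and periodic policy updates.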

It is also necessary to test the WAN capabilities of your chosen software for remote management of distributed data centers. It should be able to encrypt all traffic between the management console and sensors to restrict exposure of sensitive data. There are also various compression techniques that can enhance performance so that alerts and policy updates are transmitted efficiently.

Issue 3: Know Who Has Privileged Access to Your Data

The activity of privileged users is one of the most difficult elements to monitor in any database implementation. It is important to remember that DBAs and system administrators know how to stealthily access and copy sensitive information (and cover their tracks afterward). In cloud computing environments, there are unknown personnel at unknown sites with these access privileges. Additionally, you cannot personally conduct background checks on third parties the way you would for your own staff. When looking at all of these factors, it is easy to see why protecting against insider threats is important yet difficult to do.

So how do you resolve this issue? One way is to separate duties to ensure that the activities of privileged third parties are monitored by your own staff, and also that the pieces of the solution on the cloud side of the network cannot be defeated without alerts going off. It is also necessary to be able to closely monitor individual data assets, regardless of the method used to access them.

Seek out a system that knows when data is being accessed in violation of policy–without relying on query analytics alone. Sophisticated users with privileges can create new views, insert stored procedures into a database or generate triggers that compromise information without the SQL command arousing suspicion.
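To illustrate why query text alone is not enough, the hypothetical sketch below resolves the base tables a statement actually reads, so a sensitive table hidden behind a view is still flagged; the catalog and table names are invented for the example.

```python
# Hypothetical catalog mapping each view to the base tables it reads.
VIEW_DEFINITIONS = {
    "reporting.customer_summary": {"crm.customers", "billing.cards"},
}
SENSITIVE_TABLES = {"billing.cards", "hr.salaries"}

def objects_touched(statement_objects):
    """Expand any views in the statement into the base tables they read."""
    expanded = set()
    for obj in statement_objects:
        expanded |= VIEW_DEFINITIONS.get(obj, {obj})
    return expanded

# "SELECT * FROM reporting.customer_summary" never mentions billing.cards,
# but object-level expansion still reveals the sensitive access.
touched = objects_touched({"reporting.customer_summary"})
print(touched & SENSITIVE_TABLES)  # {'billing.cards'}
```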

Summary

Some may wrongly conclude that the complexity of monitoring databases in a cloud architecture isn’t worth the change from dedicated systems–or at least not just yet. With that said, most enterprises will decide that deploying applications with sensitive data on one of these models is inevitable. Leading organizations have begun to make the change, and as a result tools are now meeting the requirements driven by the issues raised in this article.

Essentially, security should not prevent you from moving forward with deploying databases in the cloud if you think your enterprise would benefit from doing so. By looking before you leap–ensuring your security methodologies adequately address these unique cases–you can make the transition safely. For more information please visit Nubifer.com.

New Cloud-Focused Linux Flavor: Peppermint

A new cloud-focused Linux flavor is in town: Peppermint. The Peppermint OS is currently in a small, private beta which will open up to more testers in early to late May. Aimed at the cloud, the Peppermint OS is described on its home page as: “Cloud/Web application-centric, sleek, user friendly and insanely fast! Peppermint was designed for enhanced mobility, efficiency and ease of use. While other operating systems are taking 10 minutes to load, you are already connected, communicating and getting things done. And, unlike other operating systems, Peppermint is ready to use out of the box.”

The Peppermint team announced the closed beta of the new operating system in a blog post on April 14, saying that the operating system is “designed specifically for mobility.” The description of the technology on Launchpad describes Peppermint as “a fork of Lubuntu with an emphasis on cloud apps and using many configuration files sourced from Linux Mint. Peppermint uses Mozilla Prism to create single site browsers for easily accessing many popular Web applications outside of the primary browser. Peppermint uses the LXDE desktop environment and focuses on being easy for new Linux users to find their way around in.”

Lubuntu is described by the Lubuntu project as a lighter, faster and energy-saving modification of Ubuntu using LXDE (the Lightweight X11 Desktop Environment). Kendall Weaver and Shane Remington, a pair of developers in North Carolina, make up the core Peppermint team. Weaver is the maintainer for the Linux Mint Fluxbox and LXDE editions as well as the lead software developer for Astral IX Media in Asheville, NC and the director of operations for Western Carolina Produce in Hendersonville, NC. Based in Asheville, NC, Remington is the project manager and lead Web developer for Astral IX Media and, according to the Peppermint site, “provides the Peppermint OS project support with Web development, marketing, social network integration and product development.” For more information please visit Nubifer.com.

Using Business Service Management to Manage Private Clouds

Cloud computing promises an entirely new level of flexibility through pay-as-you-go, readily accessible, infinitely scalable IT services, and executives in companies of all sizes are embracing the model. At the same time, they are also posing questions about the risks associated with moving mission-critical workloads and sensitive data into the cloud. eWEEK’s Knowledge Center contributor Richard Whitehead has four suggestions for managing private clouds using service-level agreements and business service management technologies.

“Private clouds” are what the industry is calling hybrid cloud computing models which offer some of the benefits of cloud computing without some of the drawbacks that have been highlighted. These private clouds host all of the company’s internal data and applications while giving the user more flexibility over how service is rendered. The transition to private clouds is part of the larger evolution of the data center, which makes the move from a basic warehouse of information to a more agile, smarter deliverer of services. While virtualization helps companies save on everything from real estate to power and cooling costs, it does pose the challenge of managing all of the physical and virtual servers—or virtual sprawl. Basically, it is harder to manage entities when you cannot physically see and touch them.

A more practical move into the cloud can be facilitated through technology, with private clouds being managed through the use of service-level agreements (SLAs) and business service management (BSM) technologies. The following guide is a continuous methodology to bring new capabilities into an IT department within a private cloud network. Its four steps will give IT the tools and knowledge to overcome common cloud concerns and experience the benefits that a private cloud provides.

Step 1: Prepare

Before looking at alternative computing processes, an IT department must first logically evaluate its current computing assets and ask the following questions. What is the mixture of physical and virtual assets? (The word asset is used because this process should examine the business value delivered by IT.) How are those assets currently performing?

Rather than thinking in terms of server space and bandwidth, IT departments should ask: will this private cloud migration increase sales or streamline distribution? This approach positions IT as a resource rather than as a line item within an organization. Your private cloud migration will never take off if your resources aren’t presented in terms of assets and ROI.

Step 2: Package

Package refers to resources and requires a new set of measurement tools. IT shops are beginning to think in terms of packaging “workloads” in the virtualized world, as opposed to running applications on physical servers. Workloads are portable, self-contained units of work or services built through the integration of the JeOS (“just enough” operating system), middleware and the application. They can be moved across environments ranging from physical and virtual to cloud and heterogeneous.

A business service is a group of workloads, and this marks a fundamental shift from managing physical servers and applications to managing business services composed of portable workloads that can be mixed and matched in the way that will best serve the business. Managing IT to business services (aka the service-driven data center) is becoming a business best practice and allows the IT department to price and validate its private cloud plan as such.

Step 3: Price

A valuation must be assigned to each IT unit after you’ve packaged up your IT processes into workloads and services. How much does it cost to run the service? How much will it cost if the service goes offline? The analysis should be presented around how these costs affect the business owner, because the cost assessments are driven by the business need.

One of the major advantages of a service-driven data center is that business services are able to be dynamically managed to SLAs and moved around appropriately. This allows companies to attach processes to services by connecting workloads to virtual services and, for the first time, connects a business process to the hardware implementing that business process.

Business services can be managed independently of the hardware because they aren’t tied to a particular server and can thus be moved around on an as-needed basis.

Price is dependent on the criticality of the service, what resources it will consume and whether it is worthy of backup and/or disaster recovery support. This represents a degree of transparency not usually provided by IT, and transparency in a cloud migration plan is a crucial part of demonstrating the value the cloud provides in a cost-effective way.

Step 4: Present

After you have an IT service package, you must present a unified catalog to the consumers of those services. This catalog must be visible to all relevant stakeholders within the organization and can be considered an IT storefront or showcase featuring various options and directions for your private cloud to demonstrate value to the company.

This presentation allows your organization the flexibility to balance IT and business needs for a private cloud architecture that works for all parties; the transparency gives customers a way to interact directly with IT.

Summary

Although cloud computing remains an intimidating and abstract concept for many companies, enterprises can still start taking steps towards extending their enterprise into the cloud with the adoption of private clouds. An organization can achieve a private cloud that is virtualized, workload-based and managed in terms of business services with the service-driven data center. Workloads are managed in a dynamic manner in order to meet business SLAs. The progression from physical server to virtualization to the workload to business service to business service management is clear and logical.

In order to ensure that your private cloud is managed effectively—thus providing optimum visibility into the cloud’s business value—it is important to evaluate and present your cloud migration in this way. Cloud investment can seem less daunting when viewed as a continuous process, and the transition can be made in small steps, which makes the value a private cloud can provide to a business more easily recognizable to stakeholders. For more information, visit Nubifer.com.

Microsoft and Intuit Pair Up to Push Cloud Apps

Despite being competitors, Microsoft and Intuit announced plans to pair up to encourage small businesses to develop cloud apps for the Windows Azure platform in early January 2010.

Intuit is offering a free, beta software development kit (SDK) for Azure and citing Azure as a “preferred platform” for cloud app deployment on the Intuit Partner Platform as part of its collaboration with Microsoft. This marriage opens up the Microsoft partner network to Intuit’s platform and also grants developers on the Intuit cloud platform access to Azure and its tool kit.

As a result of this collaboration, developers will be encouraged to use Azure to make software applications that integrate with Intuit’s massively popular bookkeeping program, QuickBooks. The companies announced that the tools will be made available to Intuit partners via the Intuit App Center.

Microsoft will make parts of its Business Productivity Online Suite (such as Exchange Online, SharePoint Online, Office Live Meeting and Office Communications Online) available for purchase via the Intuit App Center as well.

The agreement occurred just weeks before Microsoft began monetizing the Windows Azure platform (on February 1)—when developers who had been using the Azure beta free of charge began paying for use of the platform.

According to a spokesperson for Microsoft, the Intuit beta Azure SDK will remain free, with the timing for stripping the beta tag “unclear.”

Designed to automatically manage and scale applications hosted on Microsoft’s public cloud, Azure is Microsoft’s latest Platform-as-a-Service. Azure will serve as a competitor for similar offerings like Force.com and Google App Engine. Contact a Nubifer representative to see how the Intuit – Microsoft partnership can work for your business.

Public vs. Private Options in the Cloud

The demand for cloud computing is perpetually increasing, which means that business and technology managers need to clear up any questions they have about the differences between public and private clouds—and quickly at that.

The St. Louis-based United Seating and Mobility is one company that faced the common dilemma of choosing between a public or private cloud. The company—which sells specialized wheelchairs at 30 locations in 12 states—initially used phones and email to stay up to date on vendor contracts and other matters before monitoring these developments with off-the-shelf applications on its own servers. Finally, United Seating and Mobility decided to move to the public cloud.

United Seating and Mobility’s director of operations Michael DeHart tells Baseline Magazine of the move, “The off-the-shelf applications didn’t collaborate. You’d log on to all of the apps and try to remember which one needed which password.” Staffers across the nation now share the information seamlessly via the enhanced tools available in the public cloud.

Another example illustrating the difference between the public and private cloud is the Cleveland Cavaliers. The NBA team uses a private cloud to run its arena’s website. Going private allowed for increased one-on-one interaction with the cloud provider partner while simultaneously giving the franchise more resources to handle increased traffic to the site. Traffic on the arena site has been known to spike when, for example, the team makes the playoffs or a major artist is coming to the venue. “When you’ve booked Miley Cyrus you’d better be ready,” says the Cleveland Cavaliers’ director of web services Jeff Lillibridge.

Despite choosing different versions of the cloud, both United Seating and Mobility and the Cleveland Cavaliers have noticed that few enterprise managers will be able to avoid the topic of private versus public clouds. According to research firm IDC, worldwide cloud services revenue will reach $44.2 billion in 2013, compared to $17.4 billion last year.

Business and technology professionals remain stumped about what private and public clouds are despite the increased demand for worldwide cloud services. Examples of public clouds include Google AppEngine, IBM’s Blue Cloud, LotusLive Engage and Amazon’s Elastic Compute Cloud (EC2). A public cloud is a shared technology resource used on an as-needed basis and available via the Internet while a private cloud is created specifically for the use of one organization.

Enhanced by virtualization technologies, both concepts are making way for an “evergreen” approach to IT in which enterprises can obtain technologies when they need them without purchasing and maintaining a host of in-house services.

Bob Zukis, national leader of IT strategy for PricewaterhouseCoopers (PwC) says, “It all stems from the legacy model of ‘build it and forget about it.’ Changes taking place in the industry are making it much more efficient and effective to provision what IT needs. So ‘build it and forget about it’ no longer meets the needs of the business. Whether you’re going with a public or private cloud, you’re pursuing a way to increase your technological resources in a more efficient flexible way.”

In addition to being evergreen, this movement is also green-friendly. Says Frost and Sullivan’s Vanessa Alvarez, “Cloud computing allows for [sharing] resources and paying only for what they use. When an application is not utilizing resources, those resources can be moved to another application that needs them, enabling maximum resource efficiencies. If additional capacity or resources are no longer needed, virtual servers can be powered down or shut off.”

Organizations continue to struggle to choose between private and public clouds. On one hand, private clouds offer security and increased flexibility compared to traditional legacy systems, but they have a higher barrier to entry than public clouds. Moreover, private cloud services require that an enterprise IT manager handle technology standardization, virtualization and operations automation in addition to operations support and business support systems.

“With public clouds, you provision your organization very quickly, by increasing service, storage and other computing needs, “says Zukis. “A private cloud takes a lot more time because you’re essentially rearchitecting your legacy environment.” Although public clouds don’t require this organizational shift and are thus faster and more convenient, they fail to provide the same amount of transparency as private clouds. Says Zukis, “It’s not always clear what you’re buying off the shelf with public clouds.”

Assessing the Value of Security

Another major issue in the cloud debate is security. All organizations value security, but each has to strike a balance between cost and convenience, on one hand, and data security, on the other. Some organizations might have a higher threshold for potential violations than others and thus require a need-for-speed strategy.

Head of strategic sales and marketing at NIIT Technologies Aninda Bose, who has analyzed both cloud structures through her job and also in her position with nonprofit research organization Project Management Institute, states that the public cloud is the better option for an enterprise dealing with high-transaction/low-security or low data value. An example illustrating this is a local government office, which needs to tell a citizen that their car registration is up for renewal and simply needs to give the citizen a renewal date—a perfect situation for public cloud hosting.

Examples better suited for the private cloud model due to the sensitivity of their data include a federal agency, financial institution or health care provider. Mark White, principal with Deloitte Consulting, explains, “Accounting treatments and taxation applications are not yet fully tested for public cloud services. So enterprises with significant risk from information exposure may want to focus on the private cloud approach. This caution is most relevant for systems that process, manage and report key customer, financial or intelligence information. It’s less important for ‘edge’ systems, such as salesforce automation and Web order-entry applications.”

Sioux Falls, South Dakota-based medical-practice company The Orthopedic Institute is very data-dependent and concluded that the private cloud structure best fit its needs—specifically because the company must comply with strict rules for protecting patient information laid out by HIPAA (Health Insurance Portability and Accountability Act).

IT Director David Vrooman explains that The Orthopedic Institute was seeking to change its domain name from Orth-I.com, but after exploring possibilities with MaxMD, the exclusive provider of .md domains, it determined that MaxMD could also provide private cloud services for highly secured, encrypted email transmissions. Moreover, the cost of entry was less than doing it in-house. “We didn’t want to use one of our servers for this because it would have amounted to a $20,000 startup cost. By going with a private cloud option, we launched this at one-fifth of that expense—and it only took an afternoon to get it started,” says Vrooman. “It would have taken at least a week for my staff and me to get this done. And because MaxMD has taken over the email encryption, I’m not getting up at 3am to find out what’s wrong with the server.”

Some industry experts warn that traditional views about security and cloud computing may be changing, however, and that includes organizations which are dependent on highly secured data. The American Institute of Certified Public Accountants wanted to provide its 350,000 members with access to the latest software tools through CPA2Biz, its New York-based business-resources subsidiary. CPA2Biz worked with Intacct to create a public cloud model for its CPA members. The program was launched in April, and since then concerns about security have been addressed; hundreds of firms are supporting approximately 2,000 clients through the public cloud services offered through CPA2Biz.

“Only those in the largest of member organizations would be able to consider a private cloud system. Plus, we don’t believe there are security advantages to a private cloud system,” says vice president of corporate alliances at CPA2Biz Michael Cerami. “We’ve selected partners who operate highly secure public cloud environments. This allows us to provide our members with great collaborative tools that enable them to work proactively with their clients in real time.”

The Choice

Going back to United Seating and Mobility, the organization was interested in the public cloud structure because it isn’t dependent on high-volume, automated sales. The company uses IBM’s LotusLive Engage for online meetings, file-sharing and project-management tasks.

DeHart estimates that it would have taken up a server and a half had the company done this in-house, saying, “Being on the public cloud allows us to avoid this entirely. It’s a leasing-versus-owning concept—an operational expense versus a capital one. And the Software-as-a-Service offerings are better than what we could get off the shelf. We certainly can’t use this cloud to work with any sensitive health data. But we can run much of our business operations on it, freeing up our IT people to focus on email, uptime and cell phone services.”

Now, take the Cleveland Cavaliers. They opted for private cloud services to support the website for their venue, Quicken Loans Arena, aka “the Q.” Fans can search for information about upcoming events on TheQArena.com and are directed to a business called Veritix if they want to buy tickets. The arena site acts as a traffic conduit for Veritix, so a private cloud was the best option, and the team partnered with Hosted Solutions. Since the current NBA season began last fall, the site’s page views and visits have increased by over 60 percent and the number of unique visitors has increased by 55 percent. The team avoids uncertainty about who is minding the data by employing Hosted Solutions.

The private cloud also enables the team to manage site traffic that can jump significantly in the case of a last-second, playoff-determining shot, for example. “The need to scale was significant but we didn’t want to oversee our own dedicated hosting,” says Lillibridge. “It would have been more expensive, and we would have had the headache of managing our own servers. We needed dedicated services that would avoid this, while allowing our capacity to increase during peak times and decrease when we don’t have a lot of traffic.”

There is no clear-cut answer for whether the private or public cloud is better; rather, companies need to assess their own individual requirements for speed, security, resources and scalability. To learn more about which Cloud option is right for your enterprise, contact a Nubifer representative today.

A Guide to Windows® Azure Platform Billing

Understanding billing for Windows® Azure Platform can be a bit daunting, so here is a brief guide, including useful definitions and explanations.

The Microsoft ® Online Customer Service Portal (MOCP) limits each MOCP account to one Account Owner Windows Live ID (WLID), and the Account Owner has the ability to create and manage subscriptions, view billing and usage data and specify the Service Administrator for each subscription. While this is convenient for smaller companies, large corporations may need to create multiple subscriptions in order to design an effective account structure that will be able to support and also reflect their market strategy. Although the Service Administrator (Service Admin WLID) manages deployments, they cannot create subscriptions.

The Account Administrator can create one or more subscriptions for each individual MOCP account, and for each subscription the Account Administrator can specify a different WLID as the Service Administrator. It is also important to note that the Service Administrator WLID can be the same as or different from the Account Owner and is the person actually using the Windows ® Azure Platform. Once a subscription is created in the Microsoft ® Online Customer Service Portal (MOCP), a Project appears in the Windows ® Azure portal.

The relationship between these components is outlined below:

Projects:

Up to twenty Services can be allocated within one Project. Resources in the Project are shared between all of the Services created, and the resources are divided into Compute Instances/Cores and Storage accounts.

By default, the Project will have 20 Small Compute Instances that you can utilize. These can be consumed as a variety of combinations of VM sizes, as long as the total number of Cores across all deployed services within the Project doesn’t exceed 20.

To increase the number of Cores, simply contact Microsoft ® Online Services customer support to verify the billing account and request the additional Small Compute Instances/Cores (subject to a possible credit check). You also have the ability to decide how you want the Cores allocated, although by default the available resources are counted as a number of Small Compute Instances. See the Compute Instance conversion below:

Compute Instance Size | CPU | Memory | Instance Storage
Small | 1.6 GHz | 1.75 GB | 225 GB
Medium | 2 x 1.6 GHz | 3.5 GB | 490 GB
Large | 4 x 1.6 GHz | 7 GB | 1,000 GB
Extra Large | 8 x 1.6 GHz | 14 GB | 2,040 GB

Table 1: Compute Instances Comparison

The Compute Instances are shared between all the running services in the Project—including Production and Staging environments. This allows you to have multiple Services with different numbers of Compute Instances (up to the maximum number available for that Project).

Five Storage accounts are available per Project, although you can request an increase of up to 20 Storage accounts per Project by contacting Microsoft ® Online Services customer support. You will need to purchase a new subscription if you need more than 20 Storage accounts.

Services:

A total of 20 Services per project are permitted. Services are where applications are deployed; each Service provides two environments: Production and Staging. This is visible when you create a service in the Windows ® Azure portal.

A maximum of five roles per application is permitted within a Service; this can be any combination of different web and worker roles in the same configuration file, up to a maximum of 5. Each role can have any number of VMs, as shown below:

The Service in this example has two roles, each with a specific function: the Web Role (web tier) handles the Web interface, while the Worker Role (business tier) handles the business logic. Each role can have any number of VMs/Cores, up to the maximum available on the Project.

If this Service is deployed, the following resources will be used, from the Azure® resources perspective:

1 x Service

–       Web Role = 3 Small Compute Nodes (3 x Small VMs)

–       Worker Role = 4 Small Compute Nodes (2 x Medium VMs)

–       2 Roles used

Total resources left on the Project (see the accounting sketch after this list):

–       Services (20 -1) = 19

–       Small Compute Nodes (20 – 7) = 13 small compute instances

–       Storage accounts = 5
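
To make this accounting concrete, here is a minimal, illustrative Python sketch (not an Azure API; the data layout and function names are hypothetical) that totals the Cores consumed by each role's VM size, per Table 1, and reports what remains of a Project's default quota of 20 Services, 20 Cores and 5 Storage accounts:

# Cores consumed by each Windows Azure VM size (per Table 1 above).
CORES_PER_SIZE = {"Small": 1, "Medium": 2, "Large": 4, "Extra large": 8}

# Default Project quotas described in this article.
MAX_SERVICES, MAX_CORES, MAX_STORAGE_ACCOUNTS = 20, 20, 5

def project_usage(services):
    """services maps a Service name to {role name: (vm_size, vm_count)} (hypothetical layout)."""
    cores = sum(CORES_PER_SIZE[size] * count
                for roles in services.values()
                for size, count in roles.values())
    return {
        "services_left": MAX_SERVICES - len(services),
        "cores_used": cores,
        "cores_left": MAX_CORES - cores,
        "storage_accounts_left": MAX_STORAGE_ACCOUNTS,  # none consumed in this example
    }

# The example Service above: 3 Small VMs in the Web Role, 2 Medium VMs in the Worker Role.
example = {"MyService": {"WebRole": ("Small", 3), "WorkerRole": ("Medium", 2)}}
print(project_usage(example))
# {'services_left': 19, 'cores_used': 7, 'cores_left': 13, 'storage_accounts_left': 5}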

For more information regarding the Windows Azure pricing model, please contact a Nubifer representative.

Amazon’s Elastic Compute Cloud Platform EC2 Gets Windows Server Customers from Microsoft

Amazon has launched an initiative for Microsoft customers to bring their Windows Server licenses to Amazon's Elastic Compute Cloud (EC2) platform. This initiative runs in tandem with a brand-new Microsoft pilot program which allows Windows Server customers with an EA (Enterprise Agreement) with Microsoft to bring their licenses to Amazon EC2. Peter DeSantis, general manager of EC2 at Amazon, said in a recent interview with eWEEK that these customers will pay Amazon's Linux On-Demand or Reserved Instance rates and thus save between 35 and 50 percent, depending on the type of customer and instance.

Also in his interview with eWEEK, DeSantis said that Amazon customers have sought support for Windows Server, and Amazon has delivered support for Windows Server 2003 and Windows Server 2008. Customers with EA agreements with Microsoft began to ask if those agreements could be applied to EC2 instances, hence the new pilot program. Amazon announced the new initiative on March 24 and began enrolling customers immediately. According to DeSantis, enrollment will continue through September 12, 2010.

Amazon sent out a notice announcing the program and stated the following criteria as requirements laid out by Microsoft to participate in the pilot: your company must be based or have a legal entity in the United States; your company must have an existing Microsoft Enterprise Agreement that doesn't expire within 12 months of your entry into the Pilot; you must already have purchased Software Assurance from Microsoft for your EA Windows Server licenses; and you must be an Enterprise customer (this does not include Academic or Government institutions).

eWEEK revealed some of the fine print for the pilot released by Amazon:

“Once enrolled, you can move your Enterprise Agreement Windows Server Standard, Windows Server Enterprise, or Windows Server Datacenter edition licenses to Amazon EC2 for 1 year. Each of your Windows Server Standard licenses will let you launch one EC2 instance. Each of your Windows Server Enterprise or Windows Server Datacenter licenses will let you launch up to four EC2 instances. In either case, you can use any of the EC2 instance types. The licenses you bring to EC2 can only be moved between EC2 and your on-premises machines every 90 days. You can use your licenses in the US East (Northern Virginia) or US West (Northern California) Regions. You will still be responsible for maintaining your Client Access Licenses and External Connector licenses appropriately.” To learn more about Microsoft’s and Amazon’s Cloud offerings visit Nubifer.com.

Legal Risks for Companies to Consider Before Embracing the Cloud

Along with its never-ending stream of possibilities for revolutionizing how data and applications are invented, developed, deployed, scaled, updated, maintained and paid for, cloud computing brings a variety of legal risks to the table, and companies must consider these before entering a highly optimized public cloud.

Risk from uncertainty over where sensitive data and applications physically dwell arises from what Baselinemag.com calls the “nationless state” of the public cloud. Among these risks are jurisdictions where laws governing the protection and availability of data are very different from what companies are used to. Information in the cloud can also be widely distributed across various legal and international jurisdictions (each of which has different laws concerning security, privacy, data theft, data loss and intellectual property) due to the virtual and dynamic nature of cloud computing architecture.

Furthermore, when operating in the cloud, issues concerning privacy, data ownership and access to data cause many questions to arise. National or international legal precedents for cloud computing may be few and far between, but companies nonetheless must ensure that they can immediately access their information and that their service provider has appropriate backup and data-retrieval procedures in place.

A new paradigm of licensing—in which traditional software license agreements are replaced with cloud service agreements—will emerge as a result of the legal framework of cloud computing. Lawyers representing cloud service providers will subsequently try to reduce the liability of their clients by proposing contracts in which the service is provided “as is,” without a warranty. Under this new paradigm, the service is provided without any assurance or promise of a specific level of performance. This added risk must be evaluated within the context of the benefits derived from the cloud as well as the data proposed to be stored in the cloud.

Cloud computing also causes issues for companies that have to meet increasingly stringent compliance and reporting requirements for the management of their data. These issues pose major risks in protecting companies’ sensitive data and the information assets their customers have entrusted them to watch over.

In summary, enterprises must make sure that their cloud service providers specify where their data dwells, the legal framework within those specific jurisdictions, and the security, backup, anti-hacking and anti-viral processes the service provider has set up. Despite these risks, the optimization, scalability and cost savings that cloud computing provides should make companies eager to take advantage of it. While embracing the cloud, companies must simply conduct a more detailed legal analysis and assessment of risks, much like they would with traditional IT services. For more information on security relating to Cloud Computing, please visit Nubifer.com.

Microsoft Not Willing to Get Left in the Dust by the Cloud Services Business

Microsoft may be the largest software company on the globe, but that hasn't stopped it from being left in the dust by other companies more than once, and eWEEK reports that when it comes to cloud services, Microsoft is not willing to make the same mistake.

Although Microsoft was initially wary of the cloud, the company is now singing a different tune and trying to move further into the data center. Microsoft had its first booth dedicated solely to business cloud services at SaaSCon 2010, held at the Santa Clara Convention Center April 6 and 7. Microsoft is positioning Exchange Online (email), SharePoint Online (collaboration), Dynamics CRM Online (business apps), SQL Azure (structured storage) and AD/Live ID (Active Directory access) as its lead services for business. All of these services are designed to run on Windows Server 2008 in the data center and sync up with the corresponding on-premises applications.

The services are designed to work hand-in-hand with standard Microsoft client software (including Windows 7, Windows Phone, Office and Office Mobile); the overarching strategy is set, and its cohesiveness will have to prove itself over time. Microsoft is also offering its own data centers and its own version of Infrastructure-as-a-Service for hosting client enterprises' apps and services. Microsoft is using Azure—a full online stack comprising Windows Azure, the SQL Azure database and additional Web services—as a Platform-as-a-Service for developers.

Featuring Business Productivity Online Suite, Exchange Hosted Services, Microsoft Dynamics CRM Online and MS Office Web Apps, Microsoft Online Services are up and running. In mid-March Microsoft launched a cloud backup service on the consumer side called SkyDrive, which is an online storage repository for files which users can access from anywhere via the Web. SkyDrive may be a very popular service, as it offers a neat (in both senses of the word) 25GB of online space for free (which is more than the 2GB offered as a motivator by other services).

SkyDrive simply requires a Windows Live account (also free) and shows that Microsoft really is taking the plunge. For more information on Microsoft’s Cloud offerings, please visit Nubifer.com.

ERP and CRM Integration Via Business Intelligence for the Cloud

The masterminds behind Crystal Reports are unveiling a new business intelligence cloud offering being sold through channel partners. Not only do solution providers get an ongoing annuity on the sale, but they can perform the integration work to link the cloud-based BI to the data source (whichever ERP/CRM solution it is, such as Oracle, Salesforce.com, SAP or something else).

Traditional VARs gauging the potential of the cloud business model may have a difficult time seeing how much money per user per month will be enough for a business to reap the benefits of the cloud. Indicee executives Mark Cunningham, CEO, and Craig Todd, director of partnerships, understand that businesses are accustomed to the big sale upfront and ongoing services after that sale. Cunningham and Todd were both part of the team that created the Crystal Reports business intelligence software–which sold to Seagate before becoming part of SAP–and decided to bring their technology expertise into the cloud.

Although Cunningham and Todd knew that business was moving into the cloud and that their expertise had revealed that channel partners are the ideal way to connect with end customers, they just didn’t know how to merge those two ideas. Said Todd to Channel Insider, “The biggest single difference in what SaaS is removes those boxes. It has initially been seen as a threat by some of our partners.”

“A lot of VARs are worried about being disintermediated. Their expertise in installing software is no longer required. But the ones we’ve been working with the last few months see it as an opportunity,” continued Todd.

Arxis Technology in Simi Valley, California, an ERP, CRM and BI specialist, is one such partner. The 25-person company has two offices in California as well as offices in Chicago and Phoenix. Director of sales and marketing Mark Severance told Channel Insider that whether the customer deploys on-premises or in-the-cloud solutions, the revenue comes out even. “The biggest thing people are having a hard time with is that you are used to the big upfront sale. But, honestly, from our perspective, if you have great products and do a great job taking care of the customer, then there’s a business model for what you do,” explains Severance.

Severance said that the annuity part of the business (in which Arxis receives a commission per user per month on an ongoing basis) will eventually make up for the lack of large upfront sale. Additionally, Arxis can offer the integration and implementation services which customers need, which means setting up the BI solution’s data sources, whether they may be Salesforce.com or an internal CRM or ERP solution.

Arxis continues to offer traditional on-premises CRM and ERP software sales and implementation; the biggest vendor Arxis works with currently is Sage. Arxis offers a BI solution from Business Objects in on-premises and cloud form and recently added Indicee’s cloud-based BI solution for a variety of reasons. One major reason is that some customers are unable to afford an on-premises-based BI solution and thus a cloud-based solution is more economically accessible.

Severance further pointed out that most of computing is making the transition into the cloud. While companies used to feel safe having their server in-house, they now want to be able to access their data whenever, wherever they are, from whichever device they are using.

Indicee’s Cunningham and Todd also pointed out that VARs can provide their end customers with training services as well as services like change management. Said Todd, “There’s an exciting opportunity here for traditional VARs. This creates a platform that allows partners to focus on the V and A in the VAR–the value add.”

Pricing at Indicee starts at $69 per user per month, with a five-user pack priced at $150 per month. The VAR cut generally is a 20 percent commission on sales of five packs or more, calculated monthly and paid out quarterly, but Todd noted that it is dependent on how much work the VAR is completing to get the customer.

Gartner predicts sales of $150 million by 2013. Cunningham notes that SaaS is poised for growth and that if solution providers are seeking to enter the cloud, business intelligence is a lucrative starting point, even with the integration work it requires. To learn more about CRM Applications in the Cloud, please visit Nubifer.com.

The Role of Multitenancy in the Cloud

The debate over whether or not multitenancy is a prerequisite for cloud computing rages on. While those pondering the use of cloud apps might think they are removed from this debate, they might want to think again, because multitenancy is the clearest path to getting more from a cloud app while spending less.

Those in the multitenancy camp, so to speak, point out that the only difference between two otherwise comparable subscription-based cloud apps may be that one is multitenant and the other is single-tenant. The multitenant option will offer more value over time while lowering a customer's costs, and the higher the degree of multitenancy—i.e., the more a cloud provider's infrastructure and resources are shared—the lower the customer's cost.

At the root of the debate are the revenue and cost economics of cloud services. Revenues for most cloud app providers come from selling monthly or annual per-seat subscriptions. These bring in just a portion of the annual revenue that would be generated by an on-premises software license with comparable functionality. The challenge of selling software subscriptions therefore comes down to reducing operating costs so the provider can manage with less. If this is not achieved, the provider may have to do more than an on-premises vendor does—run multiple infrastructures, maintain multiple versions, perform upgrades and maintain customer-specific code—with less money. The answer to this conundrum is multitenancy. Multitenancy spreads the cost of infrastructure and labor across the customer base, and having customers share resources all the way down to the database schema is ideal for scaling.
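
As a rough illustration of what sharing resources down to the database schema looks like in practice, here is a minimal, hypothetical Python sketch using the standard sqlite3 module; the table, column and tenant names are invented, and a real multitenant provider would layer far more isolation and access control on top of this pattern:

import sqlite3

# One shared schema for all customers: every row carries a tenant_id,
# so the infrastructure, schema and indexes are shared across tenants.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE invoices (tenant_id TEXT, invoice_no TEXT, amount REAL)")
db.executemany("INSERT INTO invoices VALUES (?, ?, ?)", [
    ("acme", "A-001", 120.0),
    ("acme", "A-002", 75.5),
    ("globex", "G-001", 310.0),
])

def invoices_for(tenant_id):
    # Every query is scoped to the calling tenant; isolation is enforced in the
    # data layer rather than by running a separate database for each customer.
    return db.execute(
        "SELECT invoice_no, amount FROM invoices WHERE tenant_id = ?",
        (tenant_id,),
    ).fetchall()

print(invoices_for("acme"))    # [('A-001', 120.0), ('A-002', 75.5)]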

As the provider adds customers, and those customers benefit from this scaling up, the economies of scale improve. The cloud app provider is able to grow and innovate more as costs decrease and, in turn, value increases. Over time customers can expect to see more value (for example, in the form of increased functionality), even if costs don't fall further. For more information on Multitenancy, visit Nubifer.com.

Microsoft and Citrix Come to a Desktop Virtualization Agreement

On March 18, Microsoft announced a partnership with Citrix Systems which seeks to promote the pair of companies’ end-to-end virtualization packages for businesses. One aspect of the broad-based partnership sees Microsoft and Citrix aggressively offering customers of rival VMware View the option of trading in 500 licenses with no additional cost. This highly aggressive facet of the recent alliance between Microsoft and Citrix highlights the perpetually increasing competitive nature of the entire virtualization industry.

Also during the company’s March 18 announcement, Microsoft put a number of changes in place in its virtualization policy. One such change which was instituted was making virtual desktop access rights a Windows Client Software Assurance benefit. Beginning on July 1, Software Assurance clients will no longer need to buy a separate license in order to access Windows in a virtual environment.

Windows Client Software Assurance and Virtual Desktop Access license customers will be able to access virtualized Windows and Office applications beginning on July 1 as well. These applications will be accessible through non-corporate network devices, like home PCs. Under Microsoft's agreement with Citrix, Windows XP Mode will no longer require hardware virtualization technology, and assets like Citrix XenDesktop's HDX technology will be able to be applied to the capabilities of the Microsoft RemoteFX platform.

In an interview with eWEEK one day before the March 18 announcement, Brad Anderson, corporate vice president of Microsoft's Management and Services Division, said, “What we’re bringing to the market together is this end-to-end experience with a simple and consistent interface for the end user. It’s comprehensive, and it leverages what customers already have. If you take a look at the assets that our companies already have in virtualization, it’s the most comprehensive group of assets on the market.”

Together, Microsoft and Citrix are trying to fire a broadside into rival VMware with the “rescue for VMware VDI” promotion. The promotion allows VMware View customers to trade in up to 500 licenses for no additional cost. New Microsoft-Citrix customers also receive about 50 percent off the estimated retail price for virtual desktop infrastructure through another promotion.

In its media portrayal, Microsoft emphasized the announcement as a value proposition. “Two infrastructures are more expensive than one infrastructure,” said Anderson before adding, “When customers see the chance to consolidate multiple infrastructures into one, it’s a chance to manage virtual and hardware desktop so it’s truly one infrastructure. It enables administrators to do everything through system center. And reducing infrastructure reduces cost.”

The partnership with Citrix comes on the heels of another Microsoft virtualization initiative, which arrived on February 22. Microsoft unveiled two business-focused virtualization applications, App-V 4.6 and MED-V 1.0 SP1 Release Candidate, designed to better integrate proprietary applications into businesses' evolving IT infrastructure. App-V 4.6 extends 64-bit support for Microsoft's application virtualization product to streaming applications. MED-V 1.0 SP1 RC allows applications which require Internet Explorer 6—or that otherwise cannot be supported on Windows 7—to run in a managed virtual desktop environment. For more information about Cloud Computing, please visit Nubifer.com.

Microsoft’s CEO Says Company is Playing All Its Hands in the Cloud

During a recent speech at the University of Washington, Microsoft CEO Steve Ballmer spoke about his company’s future plans: and they primarily take place in the cloud! Citing services and platforms like Windows® Phone 7 Series and Xbox Live, Ballmer spoke about cloud-centric objectives. While Microsoft faces competition from Google and others when it comes to cloud-based initiatives, everyone is wondering how Microsoft will alter its desktop-centered products like the Windows franchise to remain ahead of the pack.

During his March 4 speech at the University of Washington, Ballmer stated that Microsoft's primary focus in the future will be the cloud and applications derived from the cloud. This may come as somewhat of a surprise, as Microsoft's fortune largely comes from desktop-based software like Microsoft® Windows and Microsoft® Office, but Ballmer said, “We shipped Windows 7, which had a lot that’s not cloud-based. Our inspiration now starts with the cloud. Windows Phone, Xbox, Windows Azure and SQL Azure … this is the best bet for our company.”

While speaking in front of a screen displaying a large cloud logo with the words “We’re all in,” Ballmer continued, “Companies like ours, can they move and dial in and focus and embrace? That’s where we’re programmed. You shouldn’t get into this industry if you don’t want things to change. The field of endeavor keeps moving forward.”

When discussing Microsoft’s cloud initiatives, Ballmer spoke about the creation of a cloud-based Office that would allow workers to collaborate and communicate. He also referenced cloud-ported entertainment (via Xbox Live) and the creation of something he dubbed “smarter services” which would be capable of quickly integrating new hard- and software that could interact with the cloud smoothly. Ballmer spoke about Microsoft’s cloud-based development platform, Microsoft® Azure, and mentioned Azure Ocean, a University of Washington project which reportedly collects the world’s oceanographic data.

Microsoft's most recent smartphone operating system, Windows® Phone 7 Series, was cited by Ballmer as one of the company's cloud-centric smarter devices. “Earlier [Microsoft] phones were designed for voice and legacy [applications],” said the Microsoft CEO, before adding that Windows® Phone 7 Series was created to “put people, places, content, commerce all front and center for the users with a different point of view than some other phones.”

Citing the reciprocal need of search and Bing Maps to draw in information from users in order to “learn” and define their actions, Ballmer placed the cloud at an even playing field. While Bing Maps has started integrating Flickr images into its Streetside feature—thus presenting an eye-level view of an environment—Microsoft is experimenting with putting Streetside cameras on bikes and pedestrians instead of on the roofs of cars to offer even more views to users. Search engines like Bing take history information ported to them by users and gauge user intent. Ballmer suggested that the “ability of the cloud to learn from all of the data that’s out there, and learn from me about what I’m interested in” is one of the cloud’s most basic and important dimensions.

When it comes to competition in the cloud, Microsoft faces the most in consumer applications. Ballmer praised Apple's App Store, calling it “a very nice job,” but knows that Microsoft has a ways to go in terms of catching up to Apple's cloud-based monetization of intellectual property like movies and music. As for Google, the company has a lead in the search engine market in the U.S., and its Google Apps cloud-based productivity suite has been making inroads with businesses and government. Google recently announced plans for a dedicated federal cloud computing system sometime later in 2010. This announcement likely propelled Microsoft's February 24 announcement of Business Productivity Online Suite Federal, an online-services cloud for the U.S. government equipped with strict security reinforcements.

Overall, Ballmer’s speech at the University of Washington furthered the notion that Microsoft is poised to focus its competitive energies in the cloud more and more. The industry will be waiting to see what this will mean for the traditionally desktop-centric Windows franchise, Microsoft’s flagship product; especially since news recently surfaced suggesting Microsoft is currently developing Windows 8. For more information on Windows Azure please visit Nubifer.com.


CA Augments Cloud Business with Nimsoft Buy

CA has announced plans to purchase Nimsoft for $350 million, continuing to bolster its cloud computing capabilities. CA's series of cloud-related acquisitions already includes Cassatt, NetQoS, Oblicore and 3Tera.

On March 10, CA officials announced the $350 million, all-cash acquisition of Nimsoft, revealing that the deal is expected to close by the end of March. Nimsoft is the fifth cloud-centric company CA has purchased in the past year, showing CA's continued aggressive move to build up its cloud computing capabilities.

With the acquisition of Nimsoft, CA gains IT performance and availability monitoring solutions for highly virtualized data centers and cloud computing environments as well as greater traction in key areas like midmarket companies and emerging global markets. CA refers to midmarket companies as emerging enterprises: companies with revenues between $300 million and $2 billion.

CA CEO Bill McCracken said in a conference with analysts and journalists that the deal is about Nimsoft's technology and customers—of which the company has 800, scattered across more than 30 countries. “We want to reach new customers, and we want to reach them in a way we haven’t been able to do here at CA, even after a couple of tries,” said McCracken.

McCracken said that the emerging enterprise space will account for approximately a quarter of the software spending in CA's market. Much of the cloud computing for business is provided by MSPs (managed service providers), and McCracken said that the cloud is poised to play a major role in emerging economies.

Executive vice president for CA’s Cloud Products and Solutions Business Line Chris O’Malley said via a conference call, “We are looking to build up that off-shore revenue.”

In addition to a variety of public cloud computing environments, Nimsoft’s monitoring and reporting products are used with on-demand offerings like Google Apps for Business, Amazon Web Services, Amazon EC2 (Elastic Compute Cloud), the Rackspace Cloud and Salesforce.com. CA also reports that Nimsoft’s monitoring and reporting products are used by customers for internal applications, databases, and physical and virtual data centers.

MSPs are granted high visibility into customers’ business applications in internal and external infrastructures with Nimsoft’s Unified Monitoring Solution. Nimsoft president and CEO Gary Read and McCracken said that Nimsoft’s technology is created with a high level of automation in order to make it easy to use for MSPs.

Read will become senior vice president and general manager of CA's Nimsoft business unit when the deal is finalized. Read said that combining his company—which is 12 years old—with CA makes sense. Although Nimsoft had done well, Read worried that the company would struggle to keep up with market changes on its own. As part of CA, Nimsoft will be able to continue innovating while scaling its products more easily. Most Nimsoft employees are expected to remain with the company once the deal with CA is complete.

CA has acquired Cassatt, NetQoS and Oblicore in less than a year and is in the midst of purchasing 3Tera. Each company pushed CA further into the cloud, and Nimsoft will add to CA's capabilities in the cloud. In McCracken's words, acquisitions like the current purchase of Nimsoft serve to “accelerate CA’s market leadership.” To learn more about Cloud Computing, please visit Nubifer.com.

Apple iPad Tests the Limits of Google’s Chrome Running on Cloud Computing Devices

With the recent release of its iPad, Apple is poised to challenge Google in the current cloud computing crusade, say Gartner analysts. Apple’s iPad is expected to offer the most compelling mobile Internet experience to date, but later on in 2010 Google is predicted to introduce its own version for mobile Web consumption in the form of netbooks built on its Chrome Operating System.

If Apple's tablet PC catches on like the company hopes it will, then it could serve as a foil for Google's cloud computing plans. Apple CEO Steve Jobs has already proclaimed that holding the iPad is like “holding the Internet in your hand.” The 9.7-inch IPS screen on the device displays high-def video and other content, like e-mail, e-books and games, to be consumed from the cloud.

Author Nicholas Carr, an avid follower of cloud happenings, explains the intentions of Apple in introducing the iPad by saying, “It wants to deliver the killer device to the cloud era, a machine that will define computing’s new age in the way that the Windows PC defined the old age. The iPad is, as Jobs said today, ‘something in the middle,’ a multipurpose gadget aimed at the sweet spot between the tiny smartphone and the traditional laptop. If it succeeds, we’ll all be using iPads to play iTunes, read iBooks, watch iShows, and engage in iChats. It will be an iWorld.”

An iWorld? Not if Google has its say! Later on in 2010 Google is expected to unveil its very own version of the Internet able to be held in users' hands: netbooks based on Chrome. Companies like Acer and Asustek Computer are also building a range of Android-based tablets and netbooks, while Dell CEO Michael Dell was recently seen showcasing the Android-based Dell Mini 5 tablet at the World Economic Forum in Davos, Switzerland. It sounds like Apple may have more competition than just Google!

The iPad will undoubtedly be a challenge to Google's plans for cloud computing, which include making Google search and Google apps reachable from any device connected to the Web. According to Gartner analyst Ray Valdes, Apple and Google are bound to face off with similar machines. Said Valdes to eWeek, “You could look and say that iPad is being targeted to the broad market of casual users rather than, say, the road warrior who needs to run Outlook and Excel and the people who are going to surf the Net on the couch. One could say that a netbook based on Chrome OS would have an identical use case.”

Consumers will eventually have to choose between shelling out around $499 for an iPad (that is just a base price, mind you) or a similar fee (or possibly lower) for a Chrome netbook. Valdes thinks that there are two types of users: a parent figure consuming Internet content on a Chrome OS netbook and a teenager playing games purchased on Apple’s App Store on an iPad. Stay tuned to see what happens when Apple and Google collide with similar machines later on in 2010.

Looking Back at the Changing Face of the Software Industry from 2004 and Beyond

Bill Gates may have made a whole lot of predictions about the future of software in the first edition of his 1995 book The Road Ahead, but even the founder of Microsoft couldn't imagine the magnitude of the impact of the Internet.

Within a few years, the Web altered everything. As old software companies faded away—unable to adjust to the new paradigm—new ones cropped up in their place. Although many of these new companies weren’t able to survive the dot-com bust, they did make an impact on the software industry as a whole. The way in which companies coped with the industry in flux back then can be easily applied to the way companies are adopting the cloud computing model in 2010.

Driven by emerging business needs, new customer demands and market forces, the way software was developed and the vendors that deliver it were greatly altered in the mid-2000s. Said Microsoft’s platform strategy general manager Charles Fitzgerald in 2004, “There’s an argument that almost every company is in the software business in one way or another.” Fitzgerald added that although American Express and eBay aren’t commonly thought of as being in the software business, they are. “If you participate in the information economy, you will be a software company. If you’re in a customer-facing business, software is the way you’re going to differentiate yourself,” he explained.

The fact of the matter is that the industry that provided much of the software in 2004-05 was poised to change dramatically in the years that followed. The industry continues to move through periodic waves of consolidation and expansion, and the consensus at the time was that it would remain in consolidation mode for the next couple of years. Larry Ellison, CEO of Oracle, predicted that within a few years the software market would be dominated by just a few companies: Oracle, Microsoft, Salesforce.com, Adobe and SAP.

Ellison wasn’t alone in his predictions, as some software buyers, like Mani Shabrang, head of technology deployment and research and development in Dow Chemical Co.’s business-intelligence center, agreed with him. “The number of software vendors will definitely get smaller and smaller,” said Shabrang in 2004. Another variable to consider, brought up by Shabrang, was that vendors of new types of software would emerge as vendors of mature software categories (like enterprise resource planning) consolidate. Shabrang predicted that a new generation of tools for visualizing data and intelligent software that recognizes the tone and meaning of written prose (in addition to mining text) would pop up as well.

Another group believed that there would be just as many software vendors in the future as there were back then. Danny Sabbah, chief technology officer of IBM's software group, said that new companies would develop higher-level applications, thus leaving the markets for infrastructure software, middleware and even core applications such as ERP to a few major companies.

CEO of business-intelligence software vendor Information Builders Inc. Gerald Cohen said, “Roughly every two or three years, new software categories appear. As long as there’s a venture-capital industry, there will be new categories of software.”

So what would the next application be? No one knew, although emerging service-oriented architecture technology was poised to lay the foundation for a new generation of software applications. The software of the future was predicted to be made up of components, many of which would be developed in-house by the businesses requiring them. This is in contrast to the prevailing model back in 2004, in which vendors developed ever-larger applications that often took months to install.

According to Sabbah, software would likely switch from integrating business processes within a company to integrating these processes between companies. For example, applications might link ordering, invoicing, and inventory-management tasks up and down a supply chain within an industry in the not-so-distant future.

Another looming question was what the predominant operating system underlying new applications would be. Microsoft® Windows and Linux distributions would continue to compete, that much was sure, and the battle only got fiercer when Microsoft unveiled its next-generation Longhorn client and server in 2006 and 2007, respectively.

Even in 2004, industry prognosticators knew that larger and more-complex systems weren't going anywhere. The question was, how would the process of developing software be managed, especially as geographically dispersed programmers and offshore developers were doing an increasing amount of development work? The challenges awaiting users of the complex applications they create also needed to be addressed.

IBM’s Sabbah had this to say about the future of software, “The real challenge of our industry is to build software that is [easy to use] and simple to deploy but not simplistic.”

As shown by the growth of companies which provide software on a hosted basis, like Salesforce.com, it became increasingly important to pay attention to changes in vendor-buyer relationships and how software functionality was delivered.

Co-founder and CEO of business-intelligence and data-analysis software vendor SAS Institute Inc. Jim Goodnight wasn't worried by these potential changes, instead placing his focus on the new opportunities awaiting him and his company. In 2004 Goodnight said, “The IT industry needs to keep a fairly shortened horizon. Our horizon is about two years. We make it a practice not to have these big five-year plans. If you do, you’re going to get about halfway through, and the world is going to change.” In 2010 Goodnight's words still ring true. For more information regarding the changing Software landscape, please visit Nubifer.com.

Microsoft and IBM Compete for Space in the Cloud as Google Apps Turns 3

Google may have been celebrating the third birthday of Google Apps Premier Edition on February 22, but Microsoft and IBM want a piece of the cake, errr cloud, too. EWeek.com reports that Google is trying to dislodge legacy on-premises installations from Microsoft and IBM while simultaneously fending off SaaS solutions from those same companies. In addition, Google has to fend off offerings from Cisco Systems and startups like Zoho and MindTouch, to name a few. Despite the up-and-comers, Google, Microsoft and IBM are the main three companies competing for pre-eminence in the market for cloud collaborative software.

Three years ago, Google launched its Google Apps Premier Edition, marking a bold gamble on the future of collaborative software. Back then, and perhaps even still, the collaborative software market was controlled by Microsoft and IBM. Microsoft and IBM have over 650 million customers for their Microsoft® Office, SharePoint and IBM Lotus suites combined. These suites are licensed as “on-premises” software which customers install and maintain on their own servers.

When Google launched Google Apps Premier Edition (GAPE), it served as a departure from this on-premises model by offering collaboration software hosted on Google’s servers and delivered via the Web. We now know this method as cloud computing.

Until the introduction of GAPE, Google Apps was available in a free standard edition (which included Gmail, Google Docs word processing, spreadsheet and presentation software), but with GAPE Google meant to make a profit. For just $50 per user per year, companies could provide their knowledge workers with GAPE, which featured the aforementioned apps as well as additional storage, security and, most importantly, 24/7 support.

Google Apps now has over two million business customers–of all shapes and sizes–and is designed to appeal both to small companies that want low-cost collaboration software but lack the resources to manage it, and to large enterprises that want to eliminate the cost of managing collaboration applications on their own. At the time, Microsoft and IBM were not aggressively exploring this new cloud approach.

Fast-forward to 2009. Microsoft and IBM had released hosted collaboration solutions (Microsoft® Business Productivity Online Suite and LotusLive, respectively) to keep Google Apps from being lonely in the cloud.

On the third birthday of GAPE, Google has its work cut out for it. Google is trying to dislodge legacy on-premises installations from Microsoft and IBM while fending off SaaS solutions from Microsoft, IBM, Zoho, MindTouch and others.

Dave Girouard, Google Enterprise President, states that while Google spent 2007 and 2008 debating the benefits of the cloud, the release of Microsoft and IBM products validated the market. EWeek.com quotes Girouard as saying, “We now have all major competitors in our industry in full agreement that the cloud is worth going to. We view this as a good thing. If you have all of the major vendors suggesting you look at the cloud, the consideration of our solutions is going to rise dramatically.”

For his part, Ron Markezich, corporate vice president of Microsoft Online Services, thinks that there is room for everyone in the cloud because customer needs vary by perspective. Said Markezich to EWeek.com, “Customers are all in different situations. Whether a customer wants to go 100 percent to the cloud or if they want to go to the cloud in a measured approach in a period of years, we want to make sure we can bet on Microsoft to serve their needs. No one else has credible services that are adopted by some of the larger companies in the world.”

Microsoft's counter to Google Apps is the Microsoft® Business Productivity Online Suite (BPOS). It includes Microsoft® Exchange Online with Microsoft® Exchange Hosted Filtering, Microsoft® SharePoint Online, Microsoft® Office Communications Online and Microsoft® Office Live Meeting. Microsoft also offers the Business Productivity Online Deskless Worker Suite (which includes Exchange Online Deskless Worker for email, calendars and global address lists, plus antivirus and anti-spam filters) and Microsoft® Outlook Web Access Light (for access to company email) for companies with tighter budgets or those in need of lower-cost email and collaboration software. SharePoint Online Deskless Worker provides easy access to SharePoint portals, team sites and search functionality.

The standard version of BPOS costs $10 per user per month, or $120 per user per year, while the BPOS Deskless Worker Suite is $3 per user per month, or $36 per user per year. Users may also license single apps as stand-alone services, from $2 to $5 per user per month, which is a departure from Google's one-price-for-the-year GAPE package.

The same code base is used by Microsoft for its BPOS package and the on-premises versions of Exchange and SharePoint, thus making legacy customers' transition into the cloud easier should they decide to migrate to BPOS. Microsoft thinks that this increases the likelihood that customers will remain with Microsoft rather than switching to Google Apps or IBM Lotus.

At Lotusphere 2008, IBM offered a hint at its cloud computing goals with Bluehouse, a SaaS extranet targeted toward small- to mid-size businesses. The product evolved into LotusLive Engage, a general business collaboration solution with social networking capabilities from IBM's LotusLive Connections suite, at Lotusphere 2009. In the latter half of 2009, the company sought to fill the void left by the absence of email by introducing its hosted email solution, LotusLive iNotes. iNotes costs $3 per user per month, or $36 per user per year. Additionally, IBM offers LotusLive Connections, a hosted social networking solution, as well as the aforementioned LotusLive Engage.

Vice president of online collaboration for IBM Sean Poulley told EWeek.com that IBM is banking on companies that use its hosted email to also adopt its social networking services, saying, “It’s unusual that they just buy one of the services.” Currently over 18 million paid seats use hosted versions of IBM's Lotus software.

IBM’s efforts in the cloud began to really get attention when the company scored Panasonic as a customer late last year. In its first year of implementing LotusLive iNotes, the consumer electronics maker plans on migrating over 100,000 users from Lotus Notes, Exchange and Panasonic’s proprietary email solution to LotusLive.

When it comes down to it, customers have different reasons for choosing Google, Microsoft or IBM. All three companies have major plans for 2010, and each company has a competitive edge. For more information regarding Cloud Computing please visit Nubifer.com.

The Main Infrastructure Components of Cloud Computing

Cloud computing is perhaps the most-used buzz word in the tech world right now, but to understand cloud computing is to be able to point out its main infrastructure components in comparison to older models.

So what is cloud computing? It is an emerging computing model that allows users to gain access to their applications from virtually anywhere by using any connected device they have access to. The cloud infrastructure supporting the applications is made transparent to users by a user-centric interface. Applications live in massively scalable data centers where computational resources are able to be dynamically provisioned and shared in order to achieve significant economies of scale. The management costs of bringing more IT resources into the cloud can be significantly decreased due to a strong service management platform.

Cloud computing can be viewed simultaneously as a business delivery model and an infrastructure management methodology. As a business delivery model, it provides a user experience through which hardware, software and network resources are optimally leveraged in order to provide innovative services on the web. Servers are provisioned in adherence with the logical requirements of the service using advanced, automated tools. The cloud enables program administrators and service creators to use these services via a web-based interface that abstracts away the complex nature of the underlying dynamic infrastructure.

IT organizations can manage large numbers of highly virtualized resources as a single large resource thanks to the infrastructure management methodology. Additionally, it allows IT organizations to greatly increase their data center resources without ramping up the number of people typically required to maintain that increase. A cloud will thus enable organizations currently using traditional infrastructures to consume IT resources in the data center in new, exciting, and previously-unavailable ways.

Companies with traditional data center management practices know that it can be time-intensive to make IT resources available to an end user because of the many steps involved. These include procuring hardware; locating raised floor space with sufficient power and cooling; allocating administrators to install operating systems, middleware and software; provisioning the network; and securing the environment. Companies have discovered that this process can take two to three months, if not more, while IT organizations re-provisioning existing hardware resources find that it takes weeks to finish.

This problem is solved by the cloud—as the cloud implements automation, business workflows and resource abstraction that permits a user to look at a catalog of IT services, add them to a shopping cart and subsequently submit the order. Once the order is approved by an administrator, the cloud handles the rest. In this way, the process cuts down on the time usually required to make those resources available to the customer from long months to mere minutes.
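
As one concrete, hedged example of this kind of self-service provisioning, the sketch below uses the AWS boto3 SDK, which is just one of many possible interfaces; the AMI ID and instance type are placeholders, not recommendations. It shows how a catalog-style request becomes a running server in minutes rather than months:

import boto3

ec2 = boto3.resource("ec2")

# Submit the "order": ask the cloud for one virtual server.
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder machine image
    InstanceType="t2.micro",           # placeholder size
    MinCount=1,
    MaxCount=1,
)

instance = instances[0]
instance.wait_until_running()          # typically a matter of minutes
instance.reload()                      # refresh cached attributes
print(instance.id, instance.state["Name"])

# The same API supports the self-service lifecycle operations described below.
instance.stop()
instance.wait_until_stopped()
instance.terminate()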

Additionally, the cloud provides a user interface that allows the user and the IT administrator to manage the provisioned resources through the life cycle of the service request very easily. Once a user's resources have been delivered by the cloud, the user can track the order (which usually consists of a variable number of servers and software); view the health of those resources; add additional servers; change the installed software; remove servers; increase or decrease the allocated processing power, storage or memory; and start, stop and restart servers. Yes, really. These self-service functions can be performed 24 hours a day and take just minutes to complete. This is in stark contrast to a non-cloud environment, in which it would take hours or even days to have hardware or software configurations changed or a server restarted. For more information regarding Infrastructure components for a Cloud ecosystem please visit Nubifer.com.

Heightening Cloud Security in Your Enterprise

The responsibility of securing corporate information in the cloud falls upon the enterprise, and enterprises, as cloud consumers, can greatly improve cloud security. Currently, if there is a breach in security, the enterprise is responsible. eWeek Knowledge Center contributor Matthew Gardiner reveals six ways in which enterprises can improve cloud security essentially by thinking as a cloud provider. Once an enterprise has improved security within their cloud computing model, it can fully reap the benefits from the cloud.

Cloud security is a shared responsibility between cloud providers and enterprises, although the dividing line between the two is currently, well, cloudy. The dividing line between cloud providers and enterprises is dependent on the type of cloud model–ranging from Software-as-a-Service (SaaS) to Platform-as-a-Service (PaaS) to Infrastructure-as-a-Service (IaaS).

SaaS approaches what can be thought of as a security black box, in which application security activities are largely invisible to the enterprise. IaaS, in which the enterprise is principally responsible for the security of the application, data and other levels of the infrastructure stack, sits at the other end of the spectrum.

The following six steps outline what enterprises can do to improve security in a cloud computing model and thus reap the full benefits from the cloud:

1. Learn from your current internal private clouds and the security systems and processes constructed around them

Medium to large enterprises have been setting up internal clouds for the past ten years, so while many of them didn’t refer to them as clouds, most enterprises have internal clouds already. These clouds were often referred to as shared services, like authentication services, database services, provisioning services or enterprise data centers.

2. Assess the importance and risk of your multiple IT-enabled business processes

Although the potential cost savings resulting from a transition into the cloud can be calculated rather easily, conducting a “risk vs. reward” calculation is difficult without having a basic understanding of the risk side of the equation. Because this is entirely dependent on the business context of the business process, the cloud providers cannot conduct this analysis for enterprises. The obvious first candidates for the cloud are low Service-Level Agreement (SLA) applications with relatively high cost. The potential regulatory impacts need to be considered as well, because some data and services aren’t allowed by regulators to move off-site or out of the state or country.

3. Analyze different cloud models and categories

There are general differences between different cloud models (public, private, hybrid) and cloud categories (SaaS, PaaS, IaaS) that directly relate to security control and responsibility, thus enterprises need to analyze both.

Enterprises must have both an opinion and policy for these cloud approaches within the context of their organizations and the risk profile of their own businesses.

4. Apply your Service-Oriented Architecture (SOA) design and security principles to the cloud

The cloud can be seen as an expansion of SOA, as most organizations have been applying SOA principles in their application development organizations for several years; in this way, the cloud is service orientation taken to its next logical step. Combined with centralized security policy administration and decision making, the SOA security principle of highly distributed security enforcement applies directly to the cloud. These principles can simply be transferred to the cloud rather than reinventing the system when switching your focus from SOA to the cloud.
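
To illustrate the principle of centralized policy decisions with distributed enforcement (the pattern that standards such as XACML formalize), here is a deliberately simplified, hypothetical Python sketch; it is not an implementation of any standard, and every rule, role and resource name is invented:

# Centralized policy decision point (PDP): one place holds the access rules.
POLICIES = [
    {"role": "finance", "resource": "invoices", "action": "read",   "effect": "Permit"},
    {"role": "finance", "resource": "invoices", "action": "delete", "effect": "Deny"},
]

def decide(role, resource, action):
    """Return Permit or Deny for a request, defaulting to Deny."""
    for rule in POLICIES:
        if (rule["role"], rule["resource"], rule["action"]) == (role, resource, action):
            return rule["effect"]
    return "Deny"

# Distributed policy enforcement points (PEPs): each service, whether it runs
# on-premises or in the cloud, asks the same PDP before serving a request.
def handle_request(role, resource, action):
    if decide(role, resource, action) != "Permit":
        raise PermissionError(f"{action} on {resource} denied for role {role}")
    return f"{action} on {resource} permitted for role {role}"

print(handle_request("finance", "invoices", "read"))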

5. Think like a cloud provider

Rather than thinking of your enterprise only as a cloud consumer, think as a cloud provider. Your organization is part of a value chain in which you supply services to your customers and partners. If you are able to evaluate the risk/reward balance so that you profitably consume cloud services, you can apply that same way of thinking to guide your entry as a cloud provider within your ecosystem. This will in turn help your organization better comprehend what is happening within the realm of cloud providers.

6. Get to know and start using Web security standards sooner rather than later

The Web security industry has been working on securing and managing cross-domain systems for quite some time, and useful security standards to secure cloud services have emerged as a result. These standards–which include Security Assertion Markup Language (SAML), Service Provisioning Markup Language (SPML), Extensible Access Control Markup Language (XACML) and Web Services Security (WS-Security)–must be adopted for security systems to be effective in the increasingly cloud-connected world.

Ensuring that security professionals are viewed as rational advocates of the cloud is an important requirement for enterprises when it comes to improving the security of cloud services. When properly balanced and business-driven, technologists can serve as positive forces in the risk/reward dialogue and also help improve cloud security for their enterprise. To learn more about Cloud Security please visit Nubifer.com.

Media Streaming Added to Amazon CloudFront

Amazon Web Services LLC unveiled media streaming for its content delivery service, Amazon CloudFront, on December 16, 2009. The brand new feature enables streaming delivery of audio and video content, thus providing an alternative to progressive download where end users download a full media file.

According to Amazon officials, Amazon CloudFront streams content from a worldwide network of 14 edge locations, which ensures low latencies and also offers cost-effective delivery. Like all Amazon Web Services, Amazon CloudFront requires no up-front investment, minimum fees or long-term contracts and uses the pay-what-you-use model.

General manager of Amazon CloudFront Tal Saraf said in a statement released in conjunction with the company’s announcement, “Many customers have told us that an on-demand streaming media service with low latency, high performance and reliability has been out of reach—it was technically complex and required sales negotiations and up-front commitments. We’re excited to add streaming functionality to Amazon CloudFront that is so easy, customers of any size can start streaming content in minutes.”

Amazon reports that viewers literally watch the bytes as they are delivered because content is delivered to end users in real time. In addition to giving the end user more control over their viewing experience, streaming also lowers costs for content owners by reducing the amount of data transferred when end users fail to watch the whole video.

Users only need to store the original copy of their media objects in the Amazon Simple Storage Service (Amazon S3) in order to stream content with Amazon CloudFront, and then enable those files for distribution in Amazon CloudFront with a simple command using the AWS Management Console or the Amazon CloudFront API. Amazon officials said that end users requesting streaming content are automatically routed to the CloudFront edge location best suited to serve the stream, thus end users can get the highest bit rate, lowest latency and highest-quality stream possible. Due to multiple levels of redundancy built into Amazon CloudFront, customers’ streams are served reliably and with high quality.
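
As a rough sketch of what driving this through the Amazon CloudFront API can look like, the example below uses the later boto3 SDK (which did not exist when this article was written; AWS has since retired RTMP streaming distributions altogether); the bucket name is a placeholder:

import time
import boto3

cloudfront = boto3.client("cloudfront")

# Register an on-demand streaming distribution whose origin is the S3 bucket
# holding the source media files. The bucket name below is a placeholder.
response = cloudfront.create_streaming_distribution(
    StreamingDistributionConfig={
        "CallerReference": str(time.time()),   # any unique string
        "S3Origin": {
            "DomainName": "my-media-bucket.s3.amazonaws.com",
            "OriginAccessIdentity": "",        # public origin for this sketch
        },
        "Comment": "On-demand streaming for an example media library",
        "TrustedSigners": {"Enabled": False, "Quantity": 0},
        "Enabled": True,
    }
)

# Players such as JW Player point at this domain to stream the content via RTMP.
print(response["StreamingDistribution"]["DomainName"])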

Daniel Rhodes of video sharing website Vidly said in a statement, “In the five minutes it took us to implement Amazon CloudFront’s streaming service, Vidly was able to both cut costs and offer additional features that significantly improved the in-video experience for our worldwide audience. Without any upfront capital, we are able to side-step the purchase and administration of streaming servers while still getting all the same benefits. Amazon CloudFront brings all the benefits together in such a great tightly integrated way with Amazon’s other services we use and is reliably distributed worldwide, all with barely any work on our part.”

LongTail Video has added support for Amazon CloudFront streaming to its popular open source video player, JW Player. “There was a great fit between the JW player and Amazon CloudFront streaming: both focus on making it as easy as possible for anyone to incorporate high quality video into Websites,” said LongTail Video co-founder Jeroen “JW” Wijering.

Using Adobe's Flash Media Server 3.5.3 (FMS), Amazon CloudFront lets developers take advantage of many features of FMS. Customers can decide to deliver their content via the Flash-standard Real Time Messaging Protocol (RTMP) or using its encrypted version, RTMPE (for added security). Customers can also use advanced features like dynamic bit rate streaming (which automatically adjusts the bit rate of the stream played to the end user based on the quality of the user's connection). Amazon CloudFront streaming currently supports on-demand media; support for live events is slated for 2010. For more information regarding Cloud Hosting options please visit Nubifer.com.

The Effects of Platform-as-a-Service (PaaS) on ISVs

Over the past decade, the ascent of Software-as-a-Service (SaaS) has allowed Independent Software Vendors (ISVs) to develop new applications hosted and delivered on the Web. Until recently, however, any ISV creating a SaaS offering has been required to create its own hosting and service delivery infrastructure. With the rise of Platform-as-a-Service (PaaS) over the past two years, this has all changed. As the online equivalent of conventional computing platforms, PaaS provides an immediate infrastructure on which an ISV can quickly build and deliver a SaaS application.

Many ISVs are hesitant to bind their fate to an emerging platform provider, yet those that have taken a leap of faith and adopted PaaS early on have reaped the benefits, seeing dramatic reductions in development costs and timescales. PaaS supercharges SaaS by lowering barriers to entry and shortening time-to-market, thus quickening the pace of innovation and intensifying competition.

The nature of ISVs will forever be altered by the advent of PaaS, affecting not only ISVs who choose to introduce SaaS offerings but also those who remain tethered to conventionally licensed, customer-operated software products. PaaS alters the competitive landscape across a variety of parameters:

Dramatically quicker cycles of innovation

By implementing the iterative, continuous improvement upgrade model of SaaS, PaaS allows developers to monitor and subsequently respond to customer usage and feedback and quickly incorporate the latest functionality into their own applications.

Lowered price points

The shared, pay-as-you-go, elastic infrastructure of PaaS cuts developers’ costs across multiple dimensions, greatly reducing both development and operations costs.

Multiplicity of players from reduced barriers to entry

Large numbers of market entrants are attracted to the low costs of starting on a PaaS provider’s infrastructure. These entrants, who would not otherwise be able to fund their own infrastructure, significantly increase innovation and competition.

New business models, propositions, partner channels and routes to market

New ways of offering products and bringing them to market, many of them highly disruptive to established models, are created by the “as-a-service” model.

It is important for ISVs to understand and evaluate how PaaS differs from other platforms in order for them to remain in control of their own destiny. PaaS is a new kind of platform, the dynamics of which are different from those of conventional software platforms. Developers need to be wary of assessing PaaS alternatives on the basis of criteria that are not valid when applied to PaaS. For more information on Platform as a Service please visit Nubifer.com.

Google’s Plans on Expanding Its Cloud Offerings for 2010

After a few years spent plugging away in the cloud computing market, hosting its Google Apps collaboration programs for business and consumers, Google is embracing cloud computing even more in 2010. According to Google’s vice president of product management Bradley Horowitz, the company plans on focusing on Google Voice and cloud computing this year. Industry prognosticators predict that the Gizmo5 assets will boost the Google Voice phone management application and Google will be competing with IBM, Microsoft and Cisco Systems for market share in hosted applications.

According to one Google executive, we haven’t seen anything yet when it comes to Google Voice. A phone management application which lets users route calls to all of their phones from one distinct number, Google Voice features tools like automatic voicemail transcription, conference calling, SMS support and low-cost international calling. Oh, and did we mention it’s free? That might explain why there are over 1.4 million users. While 1.4 million is a mere fraction of the 500 million people around the globe using Skype, that is about to change. Currently, Google Voice users are required to have a phone carrier to use the service, something not required by the popular VoIP application Skype, but in 2010 that is going to change.

In November 2009 Google acquired Gizmo5. The maker of so-called softphone software will allow Google Voice to operate similarly to Skype, by letting users place calls via the Internet from one PC to another or even from a PC to a mobile phone or landline. Although Horowitz, who jumped ship from Yahoo two years ago and currently oversees Gmail, Google Docs, Picasa and other Google Apps, has yet to outline specifics for how exactly Google will implement Gizmo5 with Google Voice, he appeared elated with the move during a recent interview with eWeek.com.

During the interview, Horowitz described the goal of the newly-improved Google Phone as a way to seamlessly combine telephony, which currently resides separate from users’ experience on the Web, with that Web experience. According to Horowitz, Google sees essentially all computing services, for work and for play, funneling through the Web in the future.

Although over two million businesses have signed up for Google Apps, there remains a sizable faction of businesses that are hesitant to embrace the cloud. Web-based social networks like Facebook and Twitter, with over 350 and 60 million users respectively, became more and more popular in 2009, which shows an increasing trend towards accepting the cloud. Essentially, worries associated with cloud computing began to dissipate in 2009, which means there is a lot to look forward to for cloud computing in 2010.

One way that Google made cloud computing more accessible last year was by showcasing the Data Liberation Front, which lets users export data created within Google Apps to apps outside of Google’s realm. Additionally, Google launched the Google Dashboard, which lets users see exactly how much data they are creating and hosting within Google. Horowitz believes that Google’s trust-building measures will pay off.

Google won’t be the only company moving deeper into cloud computing, as a whole batch of rival companies have plans to forge ahead and mark new territory in 2010. Customers and businesses will gain from the competition within cloud computing as the rivalry between companies will mean more choices for everyone. For more information on Google Apps Migration, please visit Nubifer.com.

Collaboration Transitioned to the Cloud

Cloud computing provides ample possibilities when enabling richer communication, whether inside or outside the firewall. Regardless of the location, area of specialization or the format of information, the Web offers an ideal forum for project stakeholders to share ideas. Collaboration can play a vital role in the discovery process when a browser is all that is required to interact.

There are many technical considerations that need to be addressed when moving collaboration into the cloud. The data involved in modern scientific research is vast and complex, and as such it isn’t possible to take legacy infrastructure that is firmly planted on the ground and move it into the cloud. There are simply too many transactional systems bundled around these data hubs to get to the core.

Moreover, too much latency would be introduced if thick-client technologies were installed at every site to transact on one or many data warehouses. Organizations should instead focus on enabling the integration, shared access and reporting of project-centric data via a cloud-based project data mart. This should be done rather than isolating information within disciplinary silos, and it requires a services-based information platform. The services-based information platform must be capable of extracting the most relevant scientific intelligence from diverse systems and formats.

Take a fictional pharmaceutical company, for example, that is working on a drug discovery project with a Contract Research Organization (CRO). Many scientific organizations actually install their legacy IT systems at the outsourcer’s site as a way to exchange and analyze data. This is costly and also inefficient because systems need to be maintained within the organization’s internal IT infrastructure and at the CRO site.

The redundancies multiply with each department, location and partner involved. With a cloud-based project data mart and reporting layer sitting on top of a services-based architecture, the workflows, critical information and transactions that need to be accessed by collaborators can instead be maintained globally with a lower support burden and seat cost. To learn more about Collaboration in the Cloud, please visit Nubifer.com.

Nubifer Cloud:Portal

Reducing capital expenditure for hardware supporting your software is a no-brainer, and Nubifer Cloud:Portal allows you to leverage the computing power and scalability of the top-tier cloud platforms. A powerful suite of core portal technologies, interfaces, database schematics and service-oriented architecture libraries, Cloud:Portal comes in several configuration options and you are sure to find the right fit for your enterprise.

Nubifer understands that certain clients requiring custom on-premise and cloud-hosted portals may also require different application layers and data layer configurations. For this reason, Nubifer leverages RAD development techniques to create robust, scalable programming code in ASP.NET (C#), ASP, PHP, Java Servlets, JSP, ColdFusion and Perl. Nubifer also supports a myriad of data formats, database platform types and cloud SOA architectures, including SQL Server (and Express), Microsoft® Access, MySQL, Oracle and more.

Nubifer Cloud:Portal Provides Enterprise Grade Solutions

Your new Nubifer Cloud:Portal is created by Nubifer’s professional services team through customizing and enhancing one or more combinations of these core technologies. In addition, a wide range of cloud modules are compatible and can be added as “plug-in” modules to extend your portal system.

The following Options in Portal types are available:

·         Online Store

·         Task Management System

·         Employee Directory

·         Bug / Task Tracker

·         Forum / Message Board

·         Wizard Driven Registration Forms

·         Time Sheet Manager

·         Blog / RSS Engine Manager

·         Calendar Management System

·         Events Management

·         Custom Modules to Match Business Needs

At its most basic, the cloud is a nebulous infrastructure owned and operated by an outside party that accepts and runs workloads created by customers. Nubifer Cloud:Portal is compatible with cloud platforms and APIs like Google APIs for Google Applications and Windows® Azure, and also runs on standard hosting platforms.

Cloud:Portal boasts several attractive portal management features. Multi-level Administrative User Account Management lets you manage accounts securely, search by account and create and edit all accounts. The Public Links and Articles Manager allows you to create, edit or archive articles, provides indexed search and features the Dynamic Links Manager. Through “My Account” User Management, users can manage their own account and upload and submit custom files and information. The Advanced Security feature enables session-based authentication and customized logic.

That’s not all! There are other great features associated with Nubifer Cloud:Portal. Calendar and Events lets you add and edit calendars, which can be user specific or group specific, and events can be tied to those calendars. The system features dynamic styles because it supports custom style sheets dynamically triggered by user choice or by configuration settings, which is great for co-branding or a multi-host look and feel. Web Service XML APIs for 3rd party integration follow an SOA architecture, are web service enabled and are interoperable with the top-tier cloud computing platforms by exposing and consuming XML APIs. Lastly, submission forms with email and database submission are another important feature. Submission forms trigger send-mail functionality and are manageable by Portal Admins.

Cloud:Portal employs R.I.A. reporting, such as User Reports, Search by Category Reports, Transaction Details Reports, Simple Reports and Timesheet Reports, delivered through Flex and Flash.

Companies using Cloud:Portal are delivered a “version release” code base for their independent endeavors. These companies leveraging Nubifer’s professional portal service have access, ownership and full rights to the “code instance” delivered as the final release version of their customized cloud portal. This type of licensing gives companies a competitive edge by making them the sole proprietor of their licensed copy of the cloud portal.

Enterprise companies leverage the rapid and rich offering delivered by our portal code models and methodologies. As a result, companies enjoy the value of rapid prototyping and application enhancement, with faster-to-market functionality in their portals.

Nubifer Cloud:Portal technology is designed to facilitate and support your business model today and in the future, by expanding as your company evolves. Within our process for portal development, we define and design the architecture, develop and enhance the portal code and deliver and deploy to your public or private environment. Please visit nubifer.com to learn more about our proprietary offering, Cloud:Portal.

Security in the Cloud

One major concern has loomed over companies considering a transition into the cloud: security. The “S” word has affected the cloud more than other types of hosted environments, but most concerns about security are not based on reality.

Three factors about cloud security:

1.       Cloud security is almost identical to internal security, and the security tools used to protect your data in the cloud are the same ones you use each day. The only difference is that the cloud is a multi-tenant environment with multiple companies sharing the same cloud service provider.

2.       Security issues within the cloud can be addressed with the very same security tools you currently have in place. While security tools are important, they should not be perceived as a hindrance when making the transition into the cloud. Over time, the commodity nature of IT will require that you transition your technologies to the cloud in order to remain financially competitive. This is why it is important to start addressing security measures now in order to prepare for the future.

3.       As long as you choose a quality cloud provider, your security within the cloud will be as good as, and perhaps even better than, your current security. The level of security within the cloud is designed for the most risky client in the cloud, and thus you will receive that same security whatever your level of risk.

Internal or External IT?

Prior to asking questions about security within the cloud, you need to ask what exactly should move into the cloud in the first place; commodity IT is the prime candidate. Back when companies first began taking advantage of IT, the initial businesses to computerize their organization’s processes had significant gains over competitors. As the IT field grew, however, the initial competitive benefits of computerization began to wane, and computerization thus became a requirement in order to simply remain relevant. As such, there is an increasing amount of IT operating as a commodity.

Cloud computing essentially allows businesses to offload commodity technologies and free up resources and time to concentrate on the core business. For example, a company manufacturing paper products requires a certain amount of IT to run its business and also make it competitive. The company also runs a large quantity of commodity IT; this commodity technology takes time, money, energy and people away from the company’s business of producing paper products at a price that rivals competitors. This is where cloud computing comes in.

The commodity IT analysis form helps you determine which parts of your IT can be moved externally by having you list all of the functions that your IT organization performs and decide whether you think of each activity as a commodity or not.

Internal IT Security

Some think that internal IT no longer helps businesses set themselves apart from other businesses. The devaluing of IT leads many companies to fail to adequately fund the budgets required to operate a first-class IT infrastructure. In addition, an increasing number of security mandates from external and internal sources means that IT can’t always fund and operate as required.

Another problem involves specialization and its effect on business function, as businesses exist as specialized entities. When looking at funding and maintaining a non-core part of the business, IT faces a problem. For example, an automotive maker avoids starting a food production company even though it could feed its employees that way, because that is not its core business. It is unlikely that the automotive manufacturer’s IT department will be as successful as its manufacturing business. By contrast, a business with IT as its only product line or service should be more successful at providing IT. Thus if the automotive maker isn’t going to operate as a best-in-class IT business, why would its security be expected to be best-in-class? A company with IT as its business is the best choice for securing your data because the quality of its product and its market success depend on its security being effective.

Factors to consider when picking a cloud provider:

Cloud providers have internal and external threats that can be accepted or mitigated, like internal IT, and these challenges are all manageable:

Security assessment: Most organizations relax their level of security over time, and as a way to combat this, the cloud provider must perform regular security assessments. The subsequent security report must be given to each client immediately after it is performed so the client knows the current state of their security in the cloud.

Multi-tenancy: The cloud provider should design its security to ensure that it meets the needs of its higher-risk clients, and in turn all clients will reap the rewards of this.

Shared Risk: The cloud service provider will not be the cloud operator in many instances, but the cloud service provider may nonetheless be providing a value-added service on top of another cloud provider’s service. Take a Software-as-a-Service provider, for example. The SaaS provider needs infrastructure, and it may make more sense to get that infrastructure from an Infrastructure-as-a-Service provider as opposed to building it on its own. Within this kind of multi-tier service provider, the risk of security issues is shared by each party because the risk affects all parties involved at various layers. The architecture used by the main cloud provider must be addressed and that information taken into account when assessing the client’s total risk mitigation plan.

Distributed Data Centers: Due to the fact that providers can offer an environment that is geographically distributed, a cloud computing environment should be less prone to disasters, at least in theory. In reality, many organizations sign up for cloud computing services that are not geographically distributed, thus they should require that their provider have a working and regularly-tested disaster recovery plan (including SLAs).

Staff Security Screening: As with other types of organizations, contractors are often hired to work for cloud providers, and these contractors should be subject to a full background investigation.

Physical Security: When choosing a cloud provider, physical external threats should be analyzed carefully. Some important questions to ask are: Do all of the cloud provider’s facilities have the same levels of security? Is your organization being offered the most secure facility with no guarantee that your data will actually reside there?

Policies: Cloud providers are not exempt from suffering from data leaks or security incidents, which is why cloud providers need to have incident response policies and procedures for each client that they feed into their overall incident response plan.

Data Leakage: One of the greatest organizational risks from a security standpoint is data leakage. As such, the cloud provider must have the ability to map its policies to the security mandates you must comply with and discuss the issues at hand.

Coding: In-house software used by all cloud providers may contain application bugs. For this reason, each client should make sure that the cloud provider follows secure coding practices. All code should additionally be written using a standard methodology that is documented and can also be demonstrated to the customer.

In conclusion, security remains a major concern, but it is important to understand that the technology used to secure your organization within the cloud isn’t untested or new. Security questions within the cloud represent the logical progression to outsourcing of commodity services to some of the same IT providers that you have been confidently using for years already. Moving IT elements into the cloud is simply a natural progression in the overall IT evolution. Visit nubifer.com for more information regarding the ever-changing environment of Cloud security.

Google’s Power Play

Seeking to keep its large data centers supplied with power, Google’s Google Energy subsidiary has asked the Federal Energy Regulatory Commission for the right to purchase and re-sell electricity to consumers. A vast amount of electricity is required for Google’s cloud computing model, which includes its Google Apps collaboration applications and its popular search engine, and by becoming a player in the energy game Google Energy feels it will be able to contain the cost of energy for Google at the very least.

Google is all too aware of its enormous consumption of power, as the leading search provider with the desire to expand its purview online via other Web services. Google Energy’s request to buy and resell electricity to consumers was made on December 23, 2009 and asked to be approved by February 23, 2010. eWeek.com obtained the subsidiary’s application to the Federal Energy Regulatory Commission (FERC). Google’s request is a common one among companies that consume a tremendous amount of power, such as Safeway grocery store chains and Wal-Mart retail, to name a few.

Google has thousands of inexpensive, thin rack-mount computers and other servers stashed in large facilities scattered across the globe. Working in parallel, these servers route search engine requests and queries for data from the company’s Google Apps to the next available computers and send the data back to consumers’ PCs and mobile devices. A large amount of energy, and thus a large sum of money, is required for the cloud computing model, and in its application to FERC Google stated that by playing the energy game it can “contain and manage the cost of energy for Google.”

In a statement to eWeek.com, a Google spokesperson said, “Google is interested in procuring more renewable energy as part of our carbon neutrality commitment, and the ability to buy and sell energy on the wholesale market could give us more flexibility in doing so. We made this filing so we can have more flexibility in producing power for Google’s own operations, including our data centers. This FERC authority would improve our ability to hedge our purchases of energy and incorporate renewables into our energy portfolio.”

Google Energy guru Bill Weihl described the company’s objective in layman’s terms during a January 7 interview with the New York Times. “One [motivation] is that we use a moderate amount of energy ourselves: we have a lot of servers, and we have 22,000 employees around the world with office buildings that consume a lot of energy. So we use energy and we care about the cost of that, we care about the environmental impact of it, and we care about the reliability of it,” said the Google Energy czar.

While some might argue that Google’s consumption of power is far more than “moderate,” due to its rather large cloud computing footprint, there are companies out there that consume more energy and are not taking measures to account for it. Also during his interview with the Times, Weihl described Google’s intentions to profit from alternative energy, saying, “We’d be delighted if some of this stuff actually made money, obviously; it is not our goal not to make money. All else being equal, we’d like to make as much money as we can, but the principal goal is to have a big impact for good.”

Google has invested about $45 million in alternative energy over the past few years, with some of that money going toward eSolar and BrightSource. (Both companies are building towers that capture sunlight to be used as a power source.) Thus while Google’s power plans can be deemed capitalistic, they are nonetheless altruistic as well. For more information on Google’s Cloud offerings, contact a Nubifer representative today.

Survey Reveals Developers Concentrating on Hybrid Cloud in 2010

According to a survey of application developers conducted by Evans Data, over 60 percent of IT shops polled have plans to adopt a hybrid cloud model in 2010. The results for the poll, released on January 12, 2010, indicate that 61 percent of over 400 participating developers stated that some portion of their companies’ IT resources will transition into the public cloud within the next year.

The hybrid cloud is set to dominate the IT landscape in 2010: of those surveyed, over 87 percent of the developers said that half or less of their resources will move to the public cloud. A statement obtained by eWeek.com quotes CEO of Evans Data Janel Garvin as saying, “The hybrid Cloud presents a very reasonable model, which is easy to assimilate and provides a gateway to Cloud computing without the need to commit all resources or surrender all control and security to an outside vendor. Security and government compliance are primary obstacles to public cloud adoption, but a hybrid model allows for selective implementation so these barriers can be avoided.”

Evans Data conducted its survey over November and December of last year as a way to examine timelines for public and private cloud adoption, ways in which to collaborate and develop within the cloud, obstacles and benefits of cloud development, architectures and tools for cloud development, virtualization in the private data center and other aspects of cloud computing. The survey also concluded that 64 percent of developers surveyed expect their cloud apps to venture into mobile devices in the near future as well.

Additional information about the future of cloud computing revealed by Evans Data’s poll includes the preferred database for use in the public cloud: MySQL, preferred by over 55 percent of developers. VMware was also revealed to be the preferred hypervisor vendor for use in a virtualized private cloud, followed by Microsoft and IBM. To learn more please visit nubifer.com.

Maximizing Effectiveness in the Cloud

At its most basic, the cloud is a nebulous infrastructure owned and operated by an outside party that accepts and runs workloads created by customers. When thinking about the cloud in this way, the basic question concerning cloud computing becomes, “Can I run all of my applications in the cloud?” If you answer “no” to that question, then ask yourself, “What divisions of my data can safely be run in the cloud?” When assessing how to include cloud computing in your architecture, one way to maximize your effectiveness in the cloud is to see how you can effectively complement your existing architectures.

The current cloud tools strive to manage provisioning and a level of mobility management, with security and audit capabilities on the horizon, in addition to the ability to move the same virtual machine in and out of the cloud. This is where virtualization comes into play, bringing with it a new kind of data center and a range of challenges for traditional data center management tools. Identity, mobility and data separation are a few obvious issues for virtualization.

1.       Identity

Server identity becomes crucial when you can make 20 identical copies of an existing server and then distribute them around the environment with just a click of a mouse. In this way, the traditional identity based on physicality doesn’t measure up.

2.       Mobility

While physical servers are stationary, VMs are designed to be mobile, and tracking and tracing them throughout their life cycles is an important part of maintaining and proving control and compliance.

3.       Data separation

Resources are shared between host servers and the virtual servers running on them, thus portions of the host’s hardware (like the processor and memory) are allocated to each virtual server. There have not been any breaches of isolation between virtual servers yet, but this may not last.

These challenges are highlighted by cloud governance. While these three issues are currently managed and controlled by someone outside of the IT department, additional challenges that are specific to the cloud now exist. Some of them include life cycle management, access control, integrity and cloud-created VMs.

1.       Life cycle management

How is a workload’s life cycle managed once it has been transferred to the cloud?

2.       Access control

Who was given access to the application and its data while it was in the cloud?

3.       Integrity

Did its integrity remain while it was in the cloud, or was it altered?

4.       Cloud-created VMs

Clouds generate their own workloads and subsequently transfer them into the data center. These so-called “virtual appliances” are being downloaded into data centers each day and identity, integrity and configuration need to be managed and controlled there.

Cloud computing has the potential to increase the flexibility and responsiveness of your IT organization and there are things you can do to be pragmatic about the evolution of cloud computing. They include understanding what is needed in the cloud, gaining experience with “internal clouds” and testing external clouds.

1.       Understanding what is needed to play in the cloud

The term “internal clouds” has resulted from the use of virtualization in the data center. It is important to discuss with auditors how virtualization is impacting their requirements; new requirements and policies may subsequently be added to your internal audit checklists.

2.       Gaining experience with “internal clouds”

It is important to be able to efficiently implement and enforce the policies with the right automation and control systems. It becomes easier to practice that in the cloud once you have established what you need internally.

3.       Testing external clouds

The use of low-priority workloads helps provide a better understanding of what is needed for life cycle management, as well as establish what role external cloud infrastructures may play in your overall business architecture.

Essentially, you must be able to manage, control and audit your own internal virtual environment in order to be able to do so with an external cloud environment. Please visit nubifer.com to learn more about maximizing effectiveness in the cloud.

The Arrival of Ubiquitous Computing

Among other things, one of the “ah ha” moments taken from this year’s CES (the world’s largest consumer technology tradeshow) was the arrival of ubiquitous computing. Formerly a purely academic concept, the convergence of data, voice, devices and displays is now more relevant than ever. This ubiquitous convergence of consumer technology with enterprise software is poised to impact those highly involved in the field of cloud computing as well as the average consumer in the near future.

Industry prognosticators are now predicting that consumers will begin to expect the ubiquitous experience in practically everything they use on a daily basis, from their car to small household items. Take those who grew up in the digital world and will soon be entering the workforce; they will expect instant gratification when it comes to work and play and everything in between. For example, Apple made the Smartphone popular and a “must-have” item for non-enterprise consumers with its iPhone. The consumer-driven mobile phone revolution will likely seep into other areas as well, with consumers increasingly expecting software to deliver an experience similar to that of the iPhone. Due to this trend, many enterprise software vendors are now making mobile a greater priority than before, and in turn staying ahead of the curve will mean anticipating more and more ubiquitous convergence.

What Does Ubiquitous Computing Mean for ISVs?

CES showcased a wide range of new interface and display technology, such as a multi-touch screen by 3M, a screen with haptic feedback, pico projector and the list goes on. A cheap projector and a camera can combine to make virtually any surface into an interface or display, which will allow consumers to interact with software in innovative, unimaginable and unanticipated ways, thus putting ISVs to the task of supporting these new interfaces and displays. This gives ISVs the opportunity to differentiate their offering by leveraging rather than submitting to this new trend in technology.

The Combination of Location-based Apps and Geotagging

Both Google’s Favorite Places and Nokia’s Point and Find seek to organize and essentially own the information about places and objects using QR codes. The QR codes are generally easy to generate and have flexible and extensible structure to hold useful information, while the QR code readers are the devices—such as a camera phone with a working data connection—that most of us own already. When geotagging is combined with augmented reality that is already propelling the innovation in location-based apps, there is the potential for ample innovation. Smarter supply chain, sustainable product life cycle management and efficient manufacturing are all possible outcomes from the combination of location-based applications and geotagging.
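As a small illustration of how easy such codes are to generate, the sketch below uses Python with the open source qrcode package (the choice of library and the encoded place URL are assumptions made for this example) to turn a geotagged link into a scannable image.

```python
# Minimal sketch: encode a hypothetical geotagged place URL into a QR code
# image that a camera phone with a QR reader can scan.
# Requires the open source "qrcode" package (pip install qrcode[pil]).
import qrcode

place_url = "http://example.com/places/golden-gate-bridge?lat=37.8199&lng=-122.4783"

img = qrcode.make(place_url)   # returns an image of the QR code
img.save("place_qr.png")       # print this code on signage, packaging, etc.
```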

The Evolution of 3D

While 3D simply adds a certain “cool” factor to playing video games or watching movies, 3D is poised to make the transition from merely a novelty into something useful. Although simply replicating the 3D analog world in the digital world won’t make software better, adding a third dimension could aid those looking at 2D representations. One way that 3D technology can be more effective is by using it in conjunction with complementing technology like multi-touch interfaces, to provide 3D affordances, and with location-based and mapping technology to manage objects in the 3D analog world.

Rendering Technology to Outpace Non-Graphics Computation Technology

As shown by Toshiba’s TV with cell processors and ATI and nVidia’s graphics cards, the investment in rendering hardware complements the innovation in display elements (like LED and energy-efficient technology). High-quality graphics at all form factors are being delivered via the combination of faster processors and sophisticated software. So far, enterprise software ISVs have been focusing on algorithmic computation of large volumes of data to design various solutions, and rendering computation technology lagged non-graphics data computation technology. Now rendering computation has caught up and will outpace non-graphics data computation in the near future. This will allow for the creation of software that can crunch large volumes of data and leverage high-quality graphics without any lag, delivering striking user experiences as well as real-time analytics and analysis. For more information, please visit www.nubifer.com.

Scaling Storage and Analysis of Data Using Distributed Data Grids

One of the most important new methods for overcoming performance bottlenecks for a large class of applications is data parallel programming on a distributed data grid. This method is predicted to have important applications in cloud computing over the next couple years, and eWeek Knowledge Center contributor William L. Bain describes ways in which a distributed data grid can be used to implement powerful, Java-based applications for parallel data analysis.

In the current Information Age, companies must store and analyze a large amount of business data. Companies that have the ability to efficiently search data for important patterns will have a competitive edge over others. An e-commerce Web site, for example, needs to be able to monitor online shopping carts in order to see which products are selling faster than others. Another example is a financial services company, which needs to hone its equity trading strategy as it optimizes its response to rapidly changing market conditions.

Businesses facing these challenges have turned to distributed data grids (also called distributed caches) in order to scale their ability to manage rapidly changing data and sort through data to identify patterns and trends that require a quick response. A few key advantages are offered by distributed data grids.

Distributed data grids store data in memory instead of on disk for quick access. Additionally, they run seamlessly across multiple servers to scale performance. Lastly, they provide a quick, easy-to-use platform for running “what if” analyses on the data they store. They can take performance to a level unable to be matched by stand-alone database servers by breaking the sequential bottleneck.

Three simple steps for building a fast, scalable data storage and analysis solution:

1. Store rapidly changing business data directly in a distributed data grid rather than on a database server

Distributed data grids are designed to plug directly into the business logic of today’s enterprise application and services. They match the in-memory view of data already used by business logic by storing data as collections of objects rather than relational database tables. Because of this, distributed data grids are easy to integrate into existing applications using simple APIs (which are available for most modern languages like Java, C# and C++).

Distributed data grids run on server farms, thus their storage capacity and throughput scale just by adding more grid servers. A distributed data grid’s ability to store and quickly access large quantities of data can expand beyond a stand-alone database server when hosted on a large server farm or in the cloud.
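To picture the object-oriented, put/get style of access described above, here is a minimal Python sketch; the grid client class is hypothetical and stands in for whatever client library a real distributed data grid product would provide.

```python
# Hypothetical, illustrative grid client -- not a real product's API. It shows
# the style of access a distributed data grid exposes: business objects are
# stored and fetched by key, while the grid spreads them across a farm of
# servers behind this simple interface.

class ShoppingCart:
    def __init__(self, customer_id):
        self.customer_id = customer_id
        self.items = []                  # list of (product_id, category, price)

class DataGridClient:
    """Stand-in for a real distributed cache client (assumed API)."""
    def __init__(self):
        self._store = {}                 # in a real grid this lives on many servers

    def put(self, key, obj):
        self._store[key] = obj

    def get(self, key):
        return self._store.get(key)

grid = DataGridClient()
cart = ShoppingCart("customer-42")
cart.items.append(("sku-123", "electronics", 199.00))
grid.put("cart:customer-42", cart)       # stored as an object, not a table row
print(grid.get("cart:customer-42").items)
```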

2. Integrate the distributed data grid with database servers in an overall storage strategy

Distributed data grids are used to complement, not replace, database servers, which are the authoritative repositories for transactional data and long-term storage. With an e-commerce Web site, for example, a distributed data grid would hold shopping carts to efficiently manage a large workload of online shopping traffic. A back-end database server would meanwhile store completed transactions, inventory and customer records.

Carefully separating application code used for business logic from other code used for data access is an important factor in integrating a distributed data grid into an enterprise application’s overall strategy. Distributed data grids naturally fit into business logic, which manages data as objects. This code is where rapid access to data is required and also where distributed data grids provide the greatest benefit. The data access layer, in contrast, usually focuses on converting objects into a relational form for storage in database servers (or vice versa).

A distributed data grid can be integrated with a database server so that it can automatically access data from the database server if it is missing from the distributed data grid. This is incredibly useful for certain types of data such as product or customer information (stored in the database server and retrieved when needed by the application). Most types of rapidly changing, business logic data, however, can be stored solely in a distributed data grid without ever being written out to a database server.
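A minimal sketch of that read-through pattern follows; both the grid client and the database lookup are simplified stand-ins for whatever client library and data access layer an application would actually use.

```python
# Read-through pattern (sketch): the grid is consulted first; on a miss, the
# authoritative database server is queried and the result is cached in the grid.

class DataGridClient:
    """Hypothetical grid client (same assumed put/get style as the earlier sketch)."""
    def __init__(self):
        self._store = {}

    def get(self, key):
        return self._store.get(key)

    def put(self, key, obj):
        self._store[key] = obj

def load_customer_from_db(customer_id):
    # Placeholder: a real implementation would query the back-end database server.
    return {"id": customer_id, "name": "Example Customer"}

def get_customer(grid, customer_id):
    key = "customer:%s" % customer_id
    customer = grid.get(key)
    if customer is None:                     # cache miss
        customer = load_customer_from_db(customer_id)
        grid.put(key, customer)              # populate the grid for next time
    return customer

grid = DataGridClient()
print(get_customer(grid, 42))                # first call hits the "database"
print(get_customer(grid, 42))                # second call is served from the grid
```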

3. Analyze grid-based data by using simple analysis codes as well as the MapReduce programming pattern

After a collection of objects, such as a Web site’s shopping carts, has been hosted in a distributed data grid, it is important to be able to scan this data for patterns and trends. Researchers have developed a two-step method called MapReduce for analyzing large volumes of data in parallel.

As the first step, each object in the collection is analyzed for a pattern of interest by writing and running a simple algorithm that assesses each object one at a time. This algorithm is run in parallel on all objects to analyze all of the data quickly. The results that were generated by running this algorithm are next combined to determine an overall result (which will hopefully identify an important trend).

Take an e-commerce developer, for example. The developer could write a simple code which analyzes each shopping cart to rate which product categories are generating the most interest. This code could be run on all shopping carts throughout the day in order to identify important shopping trends.
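In plain Python, the two steps look roughly like the sketch below; a real data grid would run the map step in parallel on every grid server holding carts and merge the partial results automatically, whereas here both steps simply run in one process.

```python
from collections import Counter

# Each cart is represented here simply as a list of (product_id, category, price)
# tuples; in a real grid each cart would be a full business object.
carts = [
    [("sku-123", "electronics", 199.00), ("sku-456", "books", 12.50)],
    [("sku-789", "electronics", 49.99)],
]

# Step 1 ("map"): analyze one cart in isolation -- count items per category.
def analyze_cart(cart):
    return Counter(category for (_sku, category, _price) in cart)

# Step 2 ("reduce"): merge the per-cart results into one overall result.
def combine(partials):
    totals = Counter()
    for partial in partials:
        totals += partial
    return totals

# A distributed data grid runs the map step in parallel on every grid server
# and merges the partial results; here both steps run locally and sequentially.
category_interest = combine(analyze_cart(c) for c in carts)
print(category_interest.most_common())   # e.g. [('electronics', 2), ('books', 1)]
```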

Using this MapReduce programming pattern, distributed data grids offer an ideal platform for analyzing data. Distributed data grids store data as memory-based objects, and thus the analysis code is easy to write and debug as a simple “in-memory” code. Programmers don’t need to learn parallel programming techniques nor understand how the grid works. Distributed data grids also provide the infrastructure needed to automatically run this analysis code on all grid servers in parallel and then combine the results. By using a distributed data grid, the net result is that the application developer can easily and quickly harness the full scalability of the grid to quickly discover data patterns and trends that are important to the success of an enterprise. For more information, please visit www.nubifer.com.

Answers to Your Questions on Cloud Connectors

Jeffrey Schwartz and Michael Desmond, both editors of Redmond Developer News, recently sat down with corporate vice president of Microsoft’s Connected Systems Division, Robert Wahbe, at the recent Microsoft Professional Developers Conference (PDC) to talk about Microsoft Azure and its potential impact on the developer ecosystem at Microsoft. Responsible for managing Microsoft’s engineering teams that deliver the company’s Web services and modeling platforms, Wahbe is a major advocate of the Azure Services Platform and offers insight into how to build applications that exist within the world of Software-as-a-Service, or as Microsoft calls it, Software plus Services (S + S).

When asked how much of Windows Azure is based on Hyper-V and how much is an entirely new set of technologies, Wahbe answered, “Windows Azure is a natural evolution of our platform. We think it’s going to have a long-term radical impact with customers, partners and developers, but it’s a natural evolution.” Wahbe continued to explain how Azure brings current technologies (i.e. the server, desktop, etc.) into the cloud and is fundamentally built out of Windows Server 2008 and .NET Framework.

Wahbe also referenced the PDC keynote of Microsoft’s chief software architect, Ray Ozzie, in which Ozzie discussed how most applications are not initially created with the idea of scale-out. Explained Wahbe, expanding upon Ozzie’s points, “The notion of stateless front-ends being able to scale out, both across the data center and across data centers requires that you make sure you have the right architectural base. Microsoft will be trying hard to make sure we have the patterns and practices available to developers to get those models [so that they] can be brought onto the premises.”

As an example, Wahbe created a hypothetical situation in which Visual Studio and .NET Framework are used to build an ASP.NET app, which in turn can either be deployed locally or to Windows Azure. The only extra step taken when deploying to Windows Azure is to specify additional metadata, such as what kind of SLA you are looking for or how many instances you are going to run on. As Wahbe explained, that metadata is an XML file, an example of an executable model that Microsoft can readily understand. “You can write those models in ‘Oslo’ using the DSL written in ‘M,’ targeting Windows Azure in those models,” concludes Wahbe.

Wahbe answered a firm “yes” when asked if there is a natural fit for applications developed in Oslo, saying that it works because Oslo is “about helping you write applications more productively,” also adding that you can write any kind of application, including cloud applications. Although new challenges undoubtedly face development shops, the basic process of writing and deploying code remains the same. According to Wahbe, Microsoft Azure simply provides a new deployment target at a basic level.

As for the differences, developers are going to need to learn a new set of services. An example used by Wahbe is two businesses connecting through a business-to-business messaging app; technology like Windows Communication Foundation can make this an easy process. With the integration of Microsoft Azure, questions about the pros and cons of using the Azure platform and the service bus (which is part of .NET Services) will have to be evaluated. Azure “provides you with an out-of-the-box, Internet-scale, pub-sub solution that traverses firewalls,” according to Wahbe. And what could be bad about that?

When asked if developers should expect new development interfaces or plug-ins to Visual Studio, Wahbe answered, “You’re going to see some very natural extensions of what’s in Visual Studio today. For example, you’ll see new project types. I wouldn’t call that a new tool … I’d call it a fairly natural extension to the existing tools.” Additionally, Wahbe expressed Microsoft’s desire to deliver tools to developers as soon as possible. “We want to get a CTP [community technology preview] out early and engage in that conversation. Now we can get this thing out broadly, get the feedback, and I think for me, that’s the most powerful way to develop a platform,” explained Wahbe of the importance of developers’ using and subsequently critiquing Azure.

When asked about the possibility of competitors like Amazon and Google gaining early share due to the ambiguous time frame of Azure, Wahbe responded serenely, “The place to start with Amazon is [that] they’re a partner. So they’ve licensed Windows, they’ve licensed SQL, and we have shared partners. What Amazon is doing, like traditional hosters, is they’re taking a lot of the complexity out for our mutual customers around hardware. The heavy lifting that a developer has to do to take that and then build a scale-out service in the cloud and across data centers, that’s left to the developer.” Wahbe detailed how Microsoft has base computing and base storage, the foundation of Windows Azure, as well as higher-level services such as the database in the cloud. According to Wahbe, developers no longer have to build an Internet-scale pub-sub system, find a new way to do social networking and contacts, or create reporting services themselves.

In discussing the impact that cloud connecting will have on the cost of development and the management of development processes, Wahbe said, “We think we’re removing complexities out of all layers of the stack by doing this in the cloud for you … we’ll automatically do all of the configuration so you can get load-balancing across all of your instances. We’ll make sure that the data is replicated both for efficiency and also for reliability, both across an individual data center and across multiple data centers. So we think that by doing that, you can now focus much more on what your app is and less on all that application infrastructure.” Wahbe predicts that it will be simpler for developers to build applications with the adoption of Microsoft Azure. For more information on Cloud Connectors, contact a Nubifer representative today.

Nubifer Cloud:Link

Nubifer Cloud:Link monitors your enterprise systems in real time and strengthens interoperability with disparate owned and leased SaaS systems. When building enterprise mash-ups, custom addresses and custom source code are created by engineers to bridge the white space, also known as the electronic hand-shakes, between the various enterprise applications within your organization. By utilizing Nubifer Cloud:Link, you gain a real-time and historic view of system-based interactions.

Cloud:Link is designed and configured via robust administrative tools to monitor custom enterprise mash-ups and deliver real-time notifications, warnings and performance metrics for your separated yet interconnected business systems. Cloud:Link offers the technology and functionality to help your company monitor and audit your enterprise system configurations.

ENTERPRISE MONITORING
Powerful components of Cloud:Link make managing enterprise grade mash-ups simple and easy.

  • Cloud:Link inter-operates with other analytic engines, including popular tracking engines (e.g., Google Analytics)
  • RIA (Rich Internet Applications): reporting, graphs and charts
  • WEB API handles secure key param calls
  • Verb- and Action-based scripting language powered by “Verbal Script”
  • XML Schema Reporting capabilities
  • Runs on-premise, as an installed solution, or in the cloud as a SaaS offering
  • Client-side recording technology tracks and stores ‘x’ and ‘y’ coordinate usage of enterprise screens for compliance, legal and regulatory play back
  • Graphical snapshots of hot maps show historical views of user interaction and image hit state selections
  • Creates a method for large systems to employ “data and session playback” technologies of system-generated and user-generated interaction sessions in a meaningful and reproducible way

USE CASE
Cloud:Link monitors and reports enterprise system handshakes, configurations, connections and latency reports in real time. Additionally, Cloud:Link rolls the data view up to your IT staff and system stakeholders via rich dashboards of charts and performance metrics. Cloud:Link also has a robust and scalable analytic data repository that keeps an eye on the connection points of enterprise applications, and audits things like “valid ssl cert warnings or pending expirations”, “mid to high latency warnings”, “ip logging”, “custom gateway SSO (Single Sign-On) landing page monitoring” among many other tracking features.

SUPPORTS POPULAR WEB ANALYTICS
Cloud:Link also leverages Google Analytics by way of the Cloud:Link extended API, which can make parallel calls to your Google Analytics account API and send data, logs, analytic summaries, and the physical click and interface points of end users to any third-party provider or data store for use in your own systems.

SERVER SIDE
On the server side, Cloud:Link is a server-based application you can install or subscribe to as a service. Data points and machine-to-machine interactions are tracked at every point during a system interaction. The Cloud:Link monitor can track remote systems without being embedded in or adopted by the networked system; however, if your company chooses to leverage the Cloud:Link API for URI mashup tracking, you can see even more detailed real-time reports of system interoperability and up-time.

CLIENT SIDE
On the client side, leverage Cloud:Link’s browser plug-in within your enterprise to extend your analytic reach into the interactions by your end-users. This approach is particularly powerful when tracking large systems being used by all types of users. Given the proper installation and setup, your company can leverage robust “Session Playback” of human interaction with your owned and leased corporate business systems.

ADMIN FUNCTIONALITY
Nubifer Inc. focuses on interoperability in the enterprise. Disparate applications operating in independent roles and duties need unified index management, Single Sign-On performance tracking, and application integration monitoring.

  • User Admin logs in and sees a dashboard with default reporting widgets configurable by the admin user
  • “My Reports” (saved Wizard-generated reports) can be set up to auto-send reports to key stakeholders in your IT or Operations group
  • Logs (Raw log review in Text Area, exportable to csv, or API post to remote FTP account)
  • Users (connecting known vs. unknown connecting IPs)
  • Systems (URI lists of SSO (Single Sign-On) paths to your SaaS and on-premise apps) – an Enterprise Schematic Map of your On-Premise and Cloud-Hosted Applications

At the core of Nubifer’s products are Nubifer Cloud:Portal, Nubifer Cloud:Link and Nubifer Cloud:Connector, which offer machine-to-machine real-time analytics, tracking and playback of machine-to-machine interaction for human viewers, using Rich Internet Application components viewed on customizable dashboards. Nubifer Cloud:Link enables large publicly traded or heavily regulated companies to follow compliance laws and regulations such as SOX, SAS 70 and HL7/HIPAA, and to mitigate the risk of not knowing how your systems are interacting on a day-to-day basis.

PUBLIC AND PRIVATE CLOUD PLATFORM SUPPORT
Currently Cloud:Link is hosted on, and compatible with:

  • Microsoft® Windows Azure™ Platform
  • Amazon® EC2
  • Google® App Engine
  • On-Premise Hosted

To learn more about Cloud:Link technology please contact cloudlink@Nubifer.com or visit nubifer.com/cloud:link to find out how you can begin using the various features offered by Nubifer Cloud:Link.

Breaking Down the Fundamentals of Cloud Computing

Following in the footsteps of industry buzz words like utility computing, clustering and virtualization, cloud computing is on the tips of everyone’s tongues lately. Cloud computing does have its own unique meaning, although it shares overlapping ideas with distributed, utility and grid computing. The reason for the conceptual intersections partly stems from the evolving technological usages, changes and implementations in recent years.

The waning general interest in grid, utility and distributed computing, coupled with marketing and service offerings from large corporations like Amazon, Google and IBM, has driven the increased interest in cloud computing in the past year. Google search trends confirm that the term cloud computing has only been in use for about one year. Some suggest that the term ‘cloud computing’ likely comes in part from the use of an image of a cloud to represent the Internet or a huge network. While what lies in the cloud remains somewhat ambiguous, the cloud is relied upon to send and receive data.

Allied with an abstract notion of the cloud, cloud computing replaces servers, routers and data pipes with services. While the fundamental hard- and software of networking remains in place, applications are built through higher level service capabilities, data and compute resources, with cloud computing. How the service is managed, implemented and what types of technology are used are not of importance to the user, as the access to the service and confidence in the reliability of meeting application requirements are the only things that matter.

At its core, cloud computing is distributed computing. Using the resources from multiple services (possibly from multiple locations as well), an application is built. Rather than relying on the cloud itself to locate resources, an endpoint to access the services is usually still required at this point; this model is also known as Software as a Service, or SaaS. A grid of computers typically lies behind the service interface, provides the resources and is usually hosted by one company, which makes it easier to support and maintain. Although definitions of a grid vary, it is commonly described as a uniform environment of hard- and software. Utility computing is in place when a user starts paying for the services and resources utilized.

The essence of cloud computing is accessing the services and resources required to perform functions with actively changing needs. Rather than requesting access from a specific named resource or endpoint, an application or service developer uses the cloud. The events taking place within the cloud manage multiple infrastructures across multiple organizations and include one or more frameworks covering and uniting the infrastructures. These frameworks serve as catalysts for self-monitoring, self-healing, automatic reconfiguration, resource agreement definitions and resource registration and discovery.

While people still maintain the underlying hardware, operating systems and networking, the cloud is self-managing and maintains the virtualization of resources; the user or application developer only references the cloud in the process of cloud computing. One example of a framework executing across a heterogeneous environment is a local area network: the Assimilator project offers a local cloud environment, and the addition of a network overlay to provide an infrastructure across the Internet, furthering the goal of cloud computing, is in the works.

Visit www.nubifer.com for more information about the future of Cloud Computing.

Get Your Java with Google App Engine

Google's App Engine service has embraced the Java programming language. The most requested feature for App Engine since its inception, Java support is currently in "testing mode," although Google eventually plans on bringing GAE's Java tools up to speed with its existing Python support.

As Google's service for hosting scalable and flexible web applications, App Engine is synonymous with cloud computing for Google. Java is one of the most frequently used languages for coding applications on the web, and by adding Java, Google is filling a major gap in its cloud services plan. Adding Java also helps Google catch up with one of its fiercest competitors in cloud computing, Amazon, whose Web Services platform has provided support for Java virtual machines for some time now.

In addition, Java support opens the possibility of making App Engine a means of running applications for Google's Android mobile platform. Although no plans for Android apps hosted on GAE have been outlined as of yet, it appears that Google is preparing an effortless and quick way to develop for Android, as Java is available on the device as well as the server.

With the addition of Java support to Google App Engine, other programming languages such as JavaScript, Ruby and perhaps Scala can run on Java virtual machines as well. The possibility of JRuby support, or support for other JVM languages, arriving any time in the near future is unlikely, however, given the experimental status of the Java runtime.

Those wishing to play around with Google App Engine’s new Java support can add their name to the list on the sign up page; the first 10,000 developers will be rewarded with a spot in the testing group.

Along with Java support, the latest update for Google App Engine includes support for cron jobs, which enables programmers to easily schedule recurring tasks such as weekly reports. The Secure Data Connector is another new feature, letting Google App Engine access data behind a firewall. Thirdly, there is a new database import tool, which makes it easier to move large amounts of data into App Engine.
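For the Java runtime, recurring tasks like these are declared in a cron.xml file deployed with the application; App Engine then calls the listed URL on the given schedule. The fragment below is only a sketch, and the /reports/weekly handler path is hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- WEB-INF/cron.xml: App Engine invokes each URL on its schedule -->
<cronentries>
  <cron>
    <url>/reports/weekly</url>  <!-- hypothetical servlet mapping in the app -->
    <description>Generate and email the weekly report</description>
    <schedule>every monday 09:00</schedule>
  </cron>
</cronentries>
```

The handler behind that URL is ordinary application code, so anything a servlet can do can be scheduled this way.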

In summary, by embracing the programming language of Java, Google is filling a gap in its cloud services plan and catching up with competitors like Amazon. For more information, please visit nubifer.com.

Thoughts on Google Chrome OS

As a leading cloud computing and SaaS provider, everyone at Nubifer is excited about Google’s new operating system, Chrome. Designed, in Google’s words, for “people who live on the web,” (like us!) Google’s Chrome browser launched in late 2008 and now an extension of Google Chrome—the Google Chrome Operating System—has arrived. Google demonstrated its open source PC operating system on Nov. 19 and revealed that its code will be open-sourced later this year, with netbooks running Google Chrome OS available for consumers as early as the second half of 2010.

Citing speed, simplicity and security as key features, Google Chrome OS is designed as a modified browser which allows netbooks to carry out everyday computing with web-based applications. Google Chrome OS basically urges consumers to abandon the computing experience that they are used to in favor of one that exists entirely in the cloud (albeit Google’s cloud), which, you have to admit, is a pretty enticing offer. The obvious benefits of the Google Chrome OS are saving money (cloud storage replaces pricey external hard-disc drives) and gaining security (thanks to Google’s monitoring for malware in Chrome OS apps).

While many comparisons have been made between Google Chrome OS and Android (admittedly they do overlap somewhat), Chrome is designed for those who spend the majority of their time on the web, and is thus being created to power computers of varying sizes, while Android was designed to work across devices ranging from netbooks to cell phones. Google Chrome OS will run on x86 and ARM chips, and Google is currently teaming up with several OEMs to offer multiple netbooks in 2010. The foundation of Google Chrome OS is this: it runs within a new windowing system on top of a Linux kernel. The web is the platform for application developers, with new applications written using already-in-place web technologies and existing web-based applications working automatically.

Five benefits of using Google Chrome OS are laid out by Wired.com: Cost, Speed, Compatibility, Portability and New Applications. While netbooks are inexpensive, users often fork out a sizable chunk of change for a Windows license, and Google's small, fast-booting platform allows that cost to be greatly reduced. Those with Linux versions of netbooks already know that they cost around $50 less on average, a difference owed to avoiding the Microsoft tax; because Chrome OS is based on Linux, it would most likely be free as well. As for speed, Chrome OS is built to run on low-powered Atom and ARM processors, with Google promising boot times measured in mere seconds.

Drivers have caused major problems for those using an OS other than Windows XP on a netbook, but there is a chance that Google may devise an OS able to be downloaded, loaded onto any machine and ready to use, all without being designed specifically for different netbook models. And now we come to portability: Chrome allows all of Google's services, from Gmail and Google Docs to Picasa, to be built in and available for offline access using Google Gears, so users won't have to worry about data being unavailable when they are not connected to the Internet. As for new applications, it remains unclear whether Google will buy open-source options like the Firefox-based Songbird music player (which can sync with an iPod and currently runs on some Linux flavors) or create its own.

Another company, Phoenix Technologies, is also offering an operating system, called HyperSpace. Instead of serving as a substitute for Windows, HyperSpace is an optional, complementary (notice it's spelled with an "e," not an "i") mini OS which is already featured on some netbooks. Running parallel to Windows as an instant-on environment, HyperSpace lets netbooks perform Internet-based functions such as browsing, e-mail and multimedia playback without booting into Windows. Phoenix Technologies' idea is similar to Google's, but Phoenix is a lesser-known company and is taking a different approach to offering its mini OS than Google is with Chrome OS.

Google’s eventual goal is to produce an OS that mirrors the streamlined, quick and easy characteristics of its individual web products. Google is the first to admit that it has its work cut out for it, but that doesn’t make the possibility of doing away with hard drives once and for all any less exciting for all of us. For more information please visit Nubifer.com.

Evaluating Zoho CRM

Although Salesforce may be the name most commonly associated with SaaS CRM, Zoho CRM is picking up speed as an inexpensive option for small businesses or for large companies with only a few people using the service. While much attention has been paid to Google Apps, Zoho has been quietly creating a portfolio of online applications that is worth recognition. Now many are wondering if Zoho CRM will have as large an impact on Salesforce as Salesforce did on SAP.

About Zoho

Part of AdventNet, Zoho has been producing SaaS Office-like applications since 2006. One of Zoho's chief architects, Raju Vegesna, joined AdventNet upon graduating in 2000 and moving from India to the United States. Among Vegesna's chief responsibilities is getting Zoho on the map.

Zoho initially offered spreadsheet and word-processing applications, although the company, which targets smaller businesses with 10 to 100 employees, now has a complete range of productivity applications covering email, a database, project management, invoicing, HR, document management, planning and, last but not least, CRM.

Zoho CRM

Aimed at businesses seeking to manage customer relations and transform leads into profitable relationships, Zoho CRM begins with lead generation. From there, tabs cover lead conversion, account setup, contacts, potential mapping and campaigns. One of Zoho CRM's best features is its layout: full reporting facilities with formatting, graphical layouts and dashboards, forecasting and other management tools are neatly displayed and optimized.

Zoho CRM is fully email enabled, and updates can be sent to any configured user along with full contact administration. Timelines ensure that leads are never forgotten and campaigns never slip. Like Zimbra and ProjectPlace, Zoho CRM offers brand alignment, which means users can change layout colors and add their own logo branding. Another key feature is Zoho's comprehensive help section, which is constantly updated with comments and posts from other users online. Zoho CRM can import contact details from a standard comma-separated value (.CSV) file exported from a user's email system or spreadsheet application (such as Excel, Star or Open Office), and users can export CRM data in the same format as well.
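To illustrate the import format, here is a purely hypothetical contacts file; the column names are examples only, and the import wizard lets you map each column to a CRM field:

```csv
First Name,Last Name,Email,Phone,Company
Jane,Doe,jane.doe@example.com,555-0101,Example Corp
Raj,Patel,raj.patel@example.com,555-0102,Acme Widgets
```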

The cost of Zoho CRM is surprisingly low. Zoho CRM offers up to three users (1,500 records) for free, a Professional version for $12 a month and an Enterprise version (20,000 records) for $25 a month. For more information about adopting Zoho's CRM, contact a Nubifer representative today.

How Microsoft Windows 7 Changed the Game for Cloud Computing … and Signaled a Wave of Competition Between Microsoft, Google and Others.

On October 22 Microsoft released the successor to Windows Vista, Windows 7, and while excitement for the operating system mounted prior to its release, many are suggesting that its arrival signals the end of computing on personal computers and the beginning of computing solely in the cloud. Existing cloud services like social networking, online games and web-based email are accessible through smart-phones, browsers or other client services, and because of the availability of these services, Windows 7 is Microsoft's first operating system to include fewer features than its predecessor.

Although Windows is not in danger of extinction, cloud computing makes its operating systems less important. Other companies are following in Microsoft's footsteps by launching products with even fewer features than Windows 7. In September, Microsoft opened a pair of data centers containing half a million servers between them and subsequently issued a new version of Windows for smart-phones. Perpetually ahead of the curve, Microsoft also launched a platform for developers, the highly publicized Azure, which allows them to write and run cloud services.

In addition to changing the game for Microsoft, the growth of cloud computing heightens competition across the computer industry. Until now, advancements in technology have pushed computing power away from central hubs (as seen in the shift from mainframes to minicomputers to PCs), while cheaper, more powerful processors and faster networks are now pulling power back toward the center in some ways. Basically, the cloud's data centers are outsized public mainframes. While this is occurring, the PC is being pushed aside by more compact, wireless devices like netbooks and smart-phones.

The lessened importance of the PC enables companies like Apple, Google and IBM to fill the gap left by Microsoft's former monopoly. There are currently hundreds of firms offering cloud services, with more by the day, but as The Economist points out, Microsoft, Google and Apple are in their own league. Each of the three companies has its own global network of data centers, plans to offer several services and seeks to dominate the new field by developing new software or devices. The battle sees each company trying to one-up the others. For example, Google's free PC operating system, Chrome OS, is Google's attempt to catch up with Microsoft, while Microsoft's recent operating system for smart-phones is its attempt to catch up with the Apple iPhone as well as Google's handset operating system, Android. Did you follow all of that?

Comparing Google, Microsoft and Apple

Professor Michael Cusamano of MIT’s Sloan School of Management recently told The Economist that while there are similarities between Google, Apple and Microsoft, they are each unique enough to carve out their own spot in the cloud because they approach the trend towards cloud computing in different ways.

Google is most well known for its search service and other web-based applications, and it has recently begun diversifying, launching Android for phones and Chrome OS. In this way, it can be said that Google has been a prototype for a cloud computing company since its inception in 1998. Google's main source of revenue is advertising, with the company controlling over 75% of search-related ads in the States (and even more on a global scale). Additionally, Google is seeking to make money from selling services to companies, announcing in October that all 35,000 employees at the pest-control-to-parcel-delivery group Rentokil Initial will be using Google's services.

While Microsoft is commonly associated with Microsoft Office and Windows, the company's relationship to cloud computing is not as distant as one might think. Microsoft's new search engine, Bing, shows the company's transition into the cloud, as does its web-based version of Office and the fact that Microsoft now offers much of its business software via online services. Microsoft smartly convinced Yahoo! to merge its search and a portion of its advertising business with Microsoft because consumers expect cloud services to be free, with everything paid for by ads.

As evidenced by the iPhone, the epitome of a have-to-have-it, innovative bundle of hardware and software, Apple is largely known for its services outside the cloud. Online offerings like the App Store, the iTunes store and MobileMe (a suite of online services), however, show that Apple's hunger for a piece of the cloud computing pie is growing by the day. Apple is also currently building what many have suggested is the world's largest data center (worth a whopping $1 billion) in North Carolina.

While Apple, IBM and Microsoft previously battled for the PC in the late 1980s and early 1990s, cloud computing is an entirely different game. Why? Well, for starters, much of the cloud is based on open standards, making it easier for users to switch providers. Antitrust authorities will play into the rivalry between the companies, and so will other possible contenders, such as Amazon and Facebook, the world's leading online retailer and social network, respectively (not to mention Zoho and a host of others). An interesting fact thrown into the debate on who will emerge victorious is that all of the current major contenders in the cloud computing race are American, with Asian and European firms not yet showing up in cloud computing in any major way (although Nokia's suite of online services, Ovi, is in its beginning stages). Visit Nubifer.com for more information.

Worldwide SaaS Revenue to Increase 18 Percent in 2009 According to Gartner

According to the folks over at Gartner, Inc., one of the leading information technology research and advisory companies, worldwide SaaS (Software as a Service) revenue is predicted to reach $7.5 billion in 2009. If Gartner's forecast is correct, this would represent a 17.7 percent increase over 2008, when SaaS revenue totaled $6.4 billion. Gartner also reports that the market will display significant and steady growth through 2013, at which point revenue is anticipated to extend past $14 billion for the enterprise application markets.

Research director Sharon Mertz said of the projections, “The adoption of SaaS continues to grow and evolve within the enterprise application markets. The composition of the worldwide SaaS landscape is evolving as vendors continue to extend regionally, increase penetration within existing accounts and ‘greenfield’ opportunities, and offer more-vertical-specific solutions as part of their service portfolio or through partners.” Mertz continued to explain how the on-demand deployment model has flourished because of the broadening of on-demand vendors’ services through partner offerings, alliances and (recently) by offering and promoting user-application development through PaaS (Platform as a Service) capabilities. Added Mertz, “Although usage and adoption is still evolving, deployment of SaaS still varies between the enterprise application markets and within specific market segments because of buyer demand and applicability of the solution.”

Across market segments, the largest amounts of SaaS revenue come from the CCC (content, communications and collaboration) and CRM (customer relationship management) markets. Gartner reports that in 2009 the CCC market is generating $2.6 billion and the CRM market $2.3 billion, up from $2.14 billion and $1.9 billion, respectively, in 2008. See Table 1 for figures.

Table 1: Worldwide SaaS revenue by market segment (billions of U.S. dollars)

Segment                                        2008      2009
Content, communications and collaboration      2.14      2.6
Customer relationship management               1.9       2.3

Growth in the CRM market continues to be driven by SaaS, a trend which began four years ago, as evidenced by the jump from less than $500 million, just over 8 percent of the CRM market, in 2005 to nearly $1.9 billion in revenue in 2008. Gartner anticipates this trend to continue, with SaaS representing nearly 24 percent of the CRM market's total software revenue in 2009. Says Gartner's Mertz in conclusion, highlighting the need in the marketplace filled by SaaS, "The market landscape for on-demand CRM continues to evolve as the availability and usage of SaaS solutions becomes more pervasive. The rapid adoption of SaaS and the marketplace success of salesforce.com have compelled vendors without an on-demand solution to either acquire smaller niche SaaS providers or develop the solution internally in response to increasing buyer demand." To receive more information contact Nubifer today.

Will Zoho Be the Surprise Winner in the Cloud Computing Race?

With all the talk of Microsoft, Google, Apple, IBM, Amazon and other major companies, it might be easy to forget about Zoho, but that would be a big mistake. The small, private company offers online email, spreadsheets and word processors, much like one of the giants in cloud computing, Google, and is steadily showing it shouldn't be discounted!

Based in Pleasanton, Calif., Zoho has never accepted bank loans or venture capital yet shows revenue of over $50 million a year. While Zoho has data center and networking management tools, its fastest-growing operation is its online productivity suite, according to Zoho’s chief executive, Sridhar Vembu. The company’s position suggests that there may be a spot for Zoho among online productivity application markets seemingly dominated by a few major companies. Vembu recently told the New York Times, “For now, the wholesale shift to the Web really creates opportunities for smaller companies like us.” And he may very well be right.

Zoho has 19 online productivity and collaboration applications (including invoicing, product management and customer relationship management), thus Zoho and Microsoft only overlap with five offerings. Zoho’s focus remains on the business market, with half of the company’s distribution through partners integrating Zoho’s products into their offerings. For example, Box.net, a service for storing, backing up and sharing documents, uses Zoho as an editing tool for uploaded documents. Most of Zoho’s partners are web-based services, showing that cheap, web-based software permits these business mash-ups to occur—while traditional software would make it nearly impossible. “Today, in the cloud model, this kind of integration is economical,” explains Vembu to the New York Times.

According to Vembu, most paying customers using Zoho's hosted applications from its website (with prices ranging from free to just $25 per month, varying by features and services) are small businesses with anywhere from 40 to 200 employees. As evidence of the transition into the cloud, Zoho's chief executive points to the Splashtop software created by the start-up DeviceVM. Dell, Asus and Hewlett-Packard reportedly plan on loading Splashtop, software installed directly into a PC's hardware (thus bypassing the operating system entirely), on some of their PCs. "It is tailor-made for us. You go right into the browser," says Vembu, clearly pleased at the evidence that smaller companies like Zoho are making headway in the field of cloud computing.

Microsoft Azure Uncovered

Everyone is talking about Microsoft Azure, which could leave some people left in the dust wondering what exactly Azure is, how much it costs and what it means for cloud computing and Microsoft as a whole. If you are among those who have unanswered questions about Microsoft Azure, look no further: here is your guide to all things Azure.

The Basics

When cloud computing first emerged, everyone wondered if and how Microsoft would make the transition into the cloud—and Microsoft Azure is the answer. Windows Azure is a cloud operating system that is essentially Microsoft’s first big step into the cloud. Developers can build using .NET, Python, Java, Ruby on Rails and other languages on Azure. According to Windows Azure GM Doug Hauger, Microsoft plans on eventually offering an admin model, which will permit developers to have access to the virtual machine (as with traditional Infrastructure-as-a-Service offerings like Amazon’s EC2, they will have to manually allocate hardware resources). SQL Azure is Microsoft’s relational database in the cloud while .NET Services is Microsoft’s Platform-as-a-Service built on the Azure OS.

The Cost

There are three different pricing models for Azure. The first is consumption-based, in which a customer pays for what they use. The second is subscription-based, in which those committing to six months of use receive discounts. Available as of July 2010, the third is volume licensing for enterprise customers desiring to take existing Microsoft licenses into the cloud.

Azure compute costs 12 cents per service hour, which is half a cent less than Amazon's Windows-based cloud, while Azure's storage service costs 15 cents per GB of data per month, with an additional cent for every 10,000 transactions (movements of data within the stored material). The .NET Services platform costs 15 cents for every 100,000 times an application built on .NET Services accesses a chunk of code or a tool. As for moving data, it costs 10 cents per GB of inbound data and 15 cents per GB of outbound data. SQL Azure costs $9.99 for up to a 1 GB relational database and $99.99 for up to a 10 GB relational database.
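To make those rates concrete, here is a rough back-of-the-envelope estimate; it is a sketch only, using the prices quoted above and a hypothetical small deployment (two always-on instances, 50 GB of storage, a million storage transactions and modest data transfer in a 30-day month):

```java
public class AzureCostEstimate {
    public static void main(String[] args) {
        // Rates as quoted in this article (assumptions, not a current price list).
        double computePerHour = 0.12;            // per service hour
        double storagePerGbMonth = 0.15;         // per GB stored per month
        double perTenThousandTransactions = 0.01;
        double inboundPerGb = 0.10;
        double outboundPerGb = 0.15;

        // Hypothetical small deployment for one 30-day month.
        double instanceHours = 2 * 24 * 30;      // two instances, always on
        double storedGb = 50;
        double storageTransactions = 1_000_000;
        double inboundGb = 20;
        double outboundGb = 40;

        double total = instanceHours * computePerHour
                + storedGb * storagePerGbMonth
                + (storageTransactions / 10_000) * perTenThousandTransactions
                + inboundGb * inboundPerGb
                + outboundGb * outboundPerGb;

        // Prints roughly $189.30 with the figures above.
        System.out.printf("Estimated monthly cost: $%.2f%n", total);
    }
}
```

With these assumed volumes the compute hours dominate the bill, which is worth keeping in mind when comparing providers on headline per-GB storage prices.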

The Impact on Microsoft and Cloud Computing

Although the introduction of Microsoft Windows Azure comes a bit late to the burgeoning field of cloud computing and to the Platform-as-a-Service party, Microsoft remains ahead of the enterprises it is hoping to attract as customers. In other words, by eyeing enterprises that remain skeptical of cloud computing, Microsoft may tap into customers not yet snatched up by more established cloud computing players. No enterprise data center runs solely on Microsoft software, which is likely why the company seems willing to support other programming languages and welcome heterogeneous environments in Azure. Additionally, the Azure platform also has a service-level agreement that offers 99.9 percent uptime on the storage side and 99.95 percent uptime on the compute side.

As many have pointed out, Microsoft may be behind Amazon and others for the time being, but there is room for an open platform directed at enterprises, which is Azure’s niche. For more Azure related information visit Nubifer.com.

Assessing Risks in the Cloud

There is no denying that cloud computing is one of the most exciting alternatives to traditional IT functions, as cloud services—from Software-as-a-Service to Platform-as-a-Service—offer augmented collaboration, scale, availability, agility and cost reductions. Cloud services can both simplify and accelerate compliance initiatives and offer greater security, but some have pointed out that outsourcing traditional business and IT functions to cloud service providers doesn’t guarantee that these services will be realized.

The risks of outsourcing such services—especially those involving highly-regulated information like constituent data—must be actively managed by organizations or those organizations might increase their business risks rather than transferring or mitigating them. When the processing and storage of constituent information is outsourced, it is not inherently more secure, which brings to mind the boundaries of cloud computing as related to privacy legislation.

By definition, the nature of cloud services lacks clear boundaries and raises valid concerns with privacy legislation. The requirement to protect your constituent information remains your responsibility regardless of what contractual obligations were negotiated with the provider and where the data is located, the cloud included. Some important questions to ask include: Does your service provider outsource any storage functions or data processing to third-parties? Do such third-parties have adequate security programs? Do you know if your service provider—and their service providers—have adequate security programs?

Independent security assessments, such as those performed as part of a SAS 70 or PCI audit, are point-in-time evaluations, which is better than nothing at all but still needs to be taken into consideration. Another thing to consider is that the scope of such assessments can be set at the provider's discretion, which means they may not provide accurate insight into the provider's ongoing security activities.

What all of this means is that many questions pertaining to Cloud Governance and Enterprise Risk still loom. For example, non-profit organizations looking to migrate fundraising activities and solutions to cloud services need to first look at their own practices, needs and restrictions to identify possible compliance requirements and legal barriers. Because security is a process rather than a product, the technical security of your constituent data is only as strong as your organization's weakest process. The security of the cloud computing environment is not mutually exclusive of your organization's internal policies, standards, procedures, processes and guidelines.

When making the decision to put sensitive constituent information into the cloud, it is important to conduct comprehensive initial and ongoing due diligence audits of your business practices and your provider’s practices. For answers to your questions on Cloud Security visit Nubifer.com.

Launch of Azure

After months of media and technology buzz, Microsoft announced that Microsoft Azure, often described as “Windows in the Cloud,” would be launched on January 1, 2010. The software giant’s Internet-based cloud computing service is likely to alter the entire face of the ever-expanding cloud computing field.

Ray Ozzie, Microsoft chief software architect, revealed the official launch date for Microsoft Azure at the recent Microsoft Professional Developers Conference, held in Los Angeles. Known as an industry leader in selling packaged software like Windows operating systems and Office work programs, Microsoft is joining in on the increasing trend towards cloud computing by unveiling a program hosted on the Internet—or in the cloud.

Cloud computing is an attractive avenue for enterprise companies as well as individuals, as it eliminates the cost and time of buying, installing, updating and maintaining software on workplace machines by letting users and companies basically rent text, spreadsheet, calendar and other programs in the cloud on an as-needed basis. According to industry tracker Gartner, revenue from cloud computing will surpass 14 billion dollars annually by the end of 2013.

Speaking at the recent Microsoft Professional Developers Conference, Ozzie said that the first month of Windows Azure will be free of charge, with users being billed from February on. Ozzie described Windows Azure as part of a "three screens and a cloud" future, in which software is delivered across personal computers, televisions and phones connected by cloud-based services.

“Customers want choice and flexibility in how they develop and deploy applications,” explained Ozzie before continuing to say, “We’re moving into an era of solutions that are experienced by users across PCs, phones and the Web, and that are delivered from data centers we refer to as private clouds and public clouds.”

Due to advancements in the cloud made by competitors like Amazon and Google, Microsoft has been under the microscope to make the transition into offering cloud services as of late. Google, for example, has long since established Internet-based applications like its popular Web-hosted email service, Gmail, while Internet retail giant Amazon currently offers an online application platform called the Elastic Compute Cloud (EC2).

With the launch of Microsoft Azure, competition within the cloud computing field continues to expand, while the transition into the cloud for companies becomes more achievable. To see how Adopting Windows Azure could help your organization, visit Nubifer.com.

Google’s Continued Innovation of Technology Evolution

Google has the uncanny ability to introduce non-core disruptive innovations while simultaneously defending and expanding its core, and an analysis of the concepts and framework in Clayton Christensen’s book Seeing What’s Next offers insight into how.

Recently, Google introduced free GPS on the Android phone through a strategy that can be described as "sword and shield." This latest disruptive innovation seeks to beat a current offering serving the "overshot customers," i.e. the ones who would stop paying for additional performance improvements that historically commanded a price premium. Google essentially entered the GPS market to serve those overshot customers by using a shield: asymmetric skills and motivation in the form of the Android OS, mapping data and a lack of direct revenue expectations. Subsequently, Google transformed its shield into a sword by disintermediating the map providers and using a revenue-share agreement to incentivize the carriers.

Examples of "incremental to radical" sustaining innovations, to use Christensen's terms, in which Google sought out the "undershot customers" are GMail and Google's core search technology. Frustrated with their products' limitations, these customers are willing to swap their current product for a better one, should it exist. Web-based email solutions and search engines existed before Google's, but the ones Google introduced solved problems that were frustrating users of other products. For example, users relished GMail's expansive email quota (compared to the limited quotas they faced before) and enjoyed the better indexing and relevancy algorithms of the Google search engine. Although Microsoft is blatantly targeting Google with Bing, Google appears unruffled and continues to steadily, if somewhat slowly, invest in its sustaining innovation (such as Caffeine, the next-generation search platform, Gmail Labs, social search, profiles, etc.) to maintain the revenue stream from its core business.

By spending money on lower-end disruptive innovations and not "cramming" sustaining innovation, Google has managed to thrive where most companies are practically destined to fail. The same strategy even resolved the tension between Google's sustaining and disruptive innovations: according to insiders at Google, Google Wave was built without involving the GMail team, reportedly without the GMail team even knowing about it. If Google had added Wave-like functionality to GMail, it would have been "cramming" sustaining innovation, while innovating outside of email can potentially serve a variety of both undershot and overshot customers.

So what does this mean for AT&T? Basically, AT&T needs to watch its back and keep an eye on Google! Smartphone revenue is predicted to surpass laptop revenue in 2012, after the number of smartphone units sold this year surpassed the number of laptops sold. The number of Comcast subscribers now exceeds 7 million, eight-fold what it used to be. And while Google pays a pricey phone bill for Google Voice, which has 1.4 million users (570,000 of whom use it seven days a week), Google is dedicated to making Google Voice work, and if it does, Google could potentially serve a new brand of overshot customer that wants to stay connected in real time but doesn't need or want a landline.

Although some argue that Chrome OS is more disruptive, disruptive innovation theory suggests that Chrome OS is created for the breed of overshot customer that is frustrated with other market solutions at the same level, not for the majority of customers. Should Google currently be scheming around Chrome OS, the business plan would be an expensive one, not to mention time-consuming and draining in its use of resources. For more information on Google's continued innovation efforts, please visit Nubifer.com.

Addressing Concerns for Networking in the Cloud

Many concerns arise when moving applications between internal data centers and public clouds. The considerations for networking once an application has been transferred to the cloud are addressed below.

In one respect clouds do not differ from the enterprise: they have unique networking infrastructures that support flexible and complex multi-tenant environments. Each enterprise likewise has an individual network infrastructure used for accessing servers and allowing applications to communicate between their various components. That infrastructure includes address services (like DHCP/DNS), specific addressing (subnets), identity/directory services (like LDAP), and firewall and routing rules.

It is important to remember that cloud providers have to control their own networking in order to route traffic within their infrastructure. The cloud provider's design differs from enterprise networking in architecture, design and addressing. While this poses no problem for something stand-alone in the cloud (it doesn't matter what the network structure is, as long as it can be accessed over the Internet), the discontinuities must be addressed when you want to extend existing networks and use existing applications.

In terms of addressing, the typical cloud provider assigns a block of addresses as part of the cloud account. Flexiscale and GoGrid, for example, give the user a block of addresses which can be attached to the servers created. In some cases these are external addresses (public addresses reachable from the Internet), in others internal. Either way, they are not assigned as part of the user's own addressing plan, which means that even if the resources can be connected to the data center, new routes will need to be built and services altered to allow these "foreign" addresses into the system.

Amazon took a different approach, providing a dynamic system in which an address is assigned each time a server is started. This makes it difficult to build multi-tier applications, because developers must create systems capable of passing changing address information between application components. The new VPC (Virtual Private Cloud) partially solves the problem of connecting to the Amazon cloud, although some key problems persist, and other cloud providers continue to look into similar networking capabilities.

Data protection is another key issue for networking in the cloud. Within the data center sits a secure perimeter defined and built by the IT organization, comprising firewalls, rules and systems that create a protected environment for internal applications. This matters because most applications need to communicate over ports and services that are not safe for general Internet access. It can be dangerous to move applications into the cloud unmodified, because they were developed for the protected environment of the data center. The application owner or developer usually has to build protection on a per-server basis and then enact corporate protection policies.

An additional implication of the loss of infrastructure control referenced earlier is that in most clouds the physical interface level cannot be controlled. MAC addresses are assigned in addition to IP addresses, and these can change each time a server is started, meaning the identity of a server cannot be based on this otherwise common attribute.

Whenever enterprise applications require the support of data center infrastructure, networking issues like identity and naming services and access to internal databases and other resources are involved. Cloud resources thus need a way to connect back to the data center, and the easiest is a VPN (Virtual Private Network). In creating this solution, it is essential to design the routing to the cloud and to provide a method for cloud applications to "reach back" to the applications and services running in the data center. Ideally this connection would allow Layer 2 connectivity, because a number of services require it in order to function properly.

In conclusion, networking is a very important part of IT infrastructure, and the cloud adds several new variables to the design and operation of the data center environment. A well-constructed architecture and a solid understanding of the limitations imposed by the cloud are needed if you want to integrate with the public cloud successfully. Currently, this can be a major barrier to cloud adoption, because enterprises are understandably reluctant to re-architect their network environments or to learn the complexities of each cloud provider's underlying infrastructure. In designing a cloud strategy, it is essential to choose a migration path that addresses these issues and protects you from expensive engineering projects as well as from cloud risks. Please visit Nubifer.com for more information.

Amazon Offers Private Clouds

While Amazon initially resisted offering a private cloud, and there are many advocates of the public cloud, Amazon recently introduced a new Virtual Private Cloud, or VPC. While many bloggers question whether Amazon's VPC is truly a "virtually" private cloud or a "virtual" private cloud, there are some who believe that the VPC may be a way to break down the difficulties that face customers seeking to adopt cloud computing, such as security, ownership and virtualization. The following paragraphs address each of these issues and how Amazon's VPC may alleviate them.

One of the key concerns facing customers adopting cloud computing is perceived security risk, and the VPC may act as a placebo that assuages it. The perceived risk stems from customers' past experience: these customers believe that any connection made using Amazon's VPN must be secure, even if it connects into a set of shared resources. Using Amazon's private cloud, customers deploy and consume applications in an environment they feel is safe and secure.

Amazon’s VPC provides a sense of ownership to customers without letting them actually own the computing. Customers may initially be skeptical about not owning the computing, thus it is up to Amazon’s marketing engine to provide ample information to alleviate that worry.

As long as the customers' business goals are fully realized with Amazon's VPC, they need not necessarily understand or care about the differences between virtualization and the cloud. In using the VPC, customers are able to use VPN and network virtualization, the existing technology stack that they are already comfortable with. In addition, the VPC allows partners to help customers bridge the gap between their on-premise systems and the cloud to create a hybrid virtualization environment that spans several resources.

Whether or not some favor the public cloud, the customer should be able to first choose to enter into cloud computing and later choose how to leverage the cloud on their own. For more information about Private Clouds, please visit Nubifer.com.

Answers to Your Questions on Cloud Connectors for Leading Platforms like Windows Azure Platform

Jeffrey Schwartz and Michael Desmond, both editors of Redmond Developer News, sat down with Robert Wahbe, corporate vice president of Microsoft's Connected Systems Division, at the recent Microsoft Professional Developers Conference (PDC) to talk about Microsoft Azure and its potential impact on the developer ecosystem at Microsoft. Responsible for managing the Microsoft engineering teams that deliver the company's Web services and modeling platforms, Wahbe is a major advocate of the Azure Services Platform and offers insight into how to build applications that exist within the world of Software-as-a-Service, or as Microsoft calls it, Software plus Services (S+S).

When asked how much of Windows Azure is based on Hyper-V and how much is an entirely new set of technologies, Wahbe answered, “Windows Azure is a natural evolution of our platform. We think it’s going to have a long-term radical impact with customers, partners and developers, but it’s a natural evolution.” Wahbe continued to explain how Azure brings current technologies (i.e. the server, desktop, etc.) into the cloud and is fundamentally built out of Windows Server 2008 and .NET Framework.

Wahbe also referenced the PDC keynote of Microsoft’s chief software architect, Ray Ozzie, in which Ozzie discussed how most applications are not initially created with the idea of scale-out. Explained Wahbe, expanding upon Ozzie’s points, “The notion of stateless front-ends being able to scale out, both across the data center and across data centers requires that you make sure you have the right architectural base. Microsoft will be trying hard to make sure we have the patterns and practices available to developers to get those models [so that they] can be brought onto the premises.”

As an example, Wahbe described a hypothetical situation in which Visual Studio and the .NET Framework are used to build an ASP.NET app, which in turn can be deployed either locally or to Windows Azure. The only extra step when deploying to Windows Azure is to specify additional metadata, such as what kind of SLA you are looking for or how many instances you are going to run. As explained by Wahbe, the metadata is an XML file, and because it is an executable model, Microsoft can easily understand it. "You can write those models in 'Oslo' using the DSL written in 'M,' targeting Windows Azure in those models," concludes Wahbe.
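For a sense of what that deployment metadata looks like in practice, here is a sketch of a classic Windows Azure service configuration file; the service name, role name and setting are hypothetical, and the instance count is the kind of value Wahbe describes specifying:

```xml
<!-- ServiceConfiguration.cscfg: deployment metadata read by the Azure fabric -->
<ServiceConfiguration serviceName="MyCloudService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole1">
    <!-- How many instances of this role the fabric should run -->
    <Instances count="2" />
    <ConfigurationSettings>
      <!-- Hypothetical storage connection setting -->
      <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
```

Scaling out then becomes a matter of changing the instance count rather than re-architecting the application.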

Wahbe answered a firm "yes" when asked if there is a natural fit for applications developed in Oslo, saying that it works because Oslo is "about helping you write applications more productively," and adding that you can write any kind of application, including cloud applications. Although new challenges undoubtedly face development shops, the basic process of writing and deploying code remains the same. According to Wahbe, Microsoft Azure simply provides a new deployment target at a basic level.

As for the differences, developers are going to need to learn a new set of services. An example used by Wahbe is two businesses connecting through a business-to-business messaging app; technology like Windows Communication Foundation can make this an easy process. With the introduction of Microsoft Azure, questions about the pros and cons of using the Azure platform and the Service Bus (which is part of .NET Services) will have to be evaluated. Azure "provides you with an out-of-the-box, Internet-scale, pub-sub solution that traverses firewalls," according to Wahbe. And what could be bad about that?

When asked if developers should expect new development interfaces or plug-ins to Visual Studio, Wahbe answered, “You’re going to see some very natural extensions of what’s in Visual Studio today. For example, you’ll see new project types. I wouldn’t call that a new tool … I’d call it a fairly natural extension to the existing tools.” Additionally, Wahbe expressed Microsoft’s desire to deliver tools to developers as soon as possible. “We want to get a CTP [community technology preview] out early and engage in that conversation. Now we can get this thing out broadly, get the feedback, and I think for me, that’s the most powerful way to develop a platform,” explained Wahbe of the importance of developers’ using and subsequently critiquing Azure.

When asked about the possibility of competitors like Amazon and Google gaining early share due to the ambiguous time frame for Azure, Wahbe responded serenely, "The place to start with Amazon is [that] they're a partner. So they've licensed Windows, they've licensed SQL, and we have shared partners. What Amazon is doing, like traditional hosters, is they're taking a lot of the complexity out for our mutual customers around hardware. The heavy lifting that a developer has to do to take that and then build a scale-out service in the cloud and across data centers—that's left to the developer." Wahbe detailed how Microsoft has base computing and base storage—the foundation of Windows Azure—as well as higher-level services such as the database in the cloud. According to Wahbe, developers no longer have to build an Internet-scale pub-sub system, find a new way to do social networking and contacts, or create reporting services themselves.

In discussing the impact that cloud connecting will have on the cost of development and the management of development processes, Wahbe said, "We think we're removing complexities out of all layers of the stack by doing this in the cloud for you ... we'll automatically do all of the configuration so you can get load-balancing across all of your instances. We'll make sure that the data is replicated both for efficiency and also for reliability, both across an individual data center and across multiple data centers. So we think that by doing that, you can now focus much more on what your app is and less on all that application infrastructure." Wahbe predicts that it will be simpler for developers to build applications with the adoption of Microsoft Azure. For more information regarding Windows Azure, please visit Nubifer.com.

Welcome to Nubifer Cloud Computing blogs

In this location, we share blogs, research, tutorials and opinions about the ever-changing and emerging arena of cloud computing, software-as-a-service, platform-as-a-service, hosting-as-a-service and user-interface-as-a-service. We also share key concepts focused on interoperability while always maintaining an agnostic viewpoint of the technologies and services offered by the top cloud platform providers. For more information, please visit Nubifer.com.