Archive for July, 2010

Updated User Policy Management for Google Apps

Google has released a series of new features granting administrators more controls to manage Google Apps within their organizations, including new data migration tools, SSL enforcement capabilities, multi-domain support and the ability to tailor Google Apps with over 100 applications from the recently-introduced Google Apps Marketplace. On July 20 Google announced one of the most-requested features from administrators: User Policy Management.

With User Policy Management, administrators can segment their users into organizational units and control which applications are enabled or disabled for each group. Take a manufacturing firm, for example: the company might want to give its office workers access to Google Talk but not its production line employees, and User Policy Management makes this possible.
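Conceptually, User Policy Management is a mapping from organizational units to per-service toggles. The sketch below models the manufacturing-firm example in plain Python; it is an illustration of the concept only, not the actual Google Apps administration interface or API.

```python
# Hypothetical in-memory model of User Policy Management: organizational
# units with per-service toggles. Names are illustrative only; the real
# feature is configured in the Google Apps control panel.

class OrgUnit:
    def __init__(self, name, services):
        self.name = name
        # Map of service name -> enabled/disabled for this unit
        self.services = dict(services)

    def is_enabled(self, service):
        # Services not explicitly enabled are treated as disabled
        return self.services.get(service, False)

# A manufacturing firm: office workers get Google Talk, the production line does not
office = OrgUnit("Office Workers", {"Google Talk": True, "Google Docs": True})
production = OrgUnit("Production Line", {"Google Talk": False, "Google Docs": True})

for unit in (office, production):
    for service in ("Google Talk", "Google Docs"):
        state = "enabled" if unit.is_enabled(service) else "disabled"
        print(f"{service} is {state} for {unit.name}")
```

The same structure also covers the pilot-group scenario: a "Google Wave pilot" unit simply enables a service the rest of the organization has switched off.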

Additionally, organizations can use this functionality to test applications with pilot users before making them available on a larger scale. Associate Vice President for Computer Services at Temple University Sheri Stahler says, “Using the new User Policy Management feature in Google Apps, we’re able to test out new applications like Google Wave with a subset of users to decide how we should roll out new functionality more broadly.”

User Policy Management also helps customers transition to Google Apps from on-premises environments, since it lets them toggle services on or off for groups of users. A business can, for example, enable just the collaboration tools like Google Docs and Google Sites for users who have yet to move off legacy on-premises messaging solutions.

These settings can be managed by administrators on the ‘Organizations & Users’ tab in the ‘Next Generation’ control panel. Additionally, organizations can mirror their existing LDAP organizational schema using Google Apps Directory Sync or programmatically assign users to organizational units using the Google Apps Provisioning API.

Premier and Education Edition customers can begin using User Policy Management for Google Apps at no additional charge.

Dell and Microsoft Partner Up with the Windows Azure Platform Appliance

At Microsoft’s Worldwide Partner Conference on July 12, Dell and Microsoft announced a strategic partnership in which Dell will adopt the Windows Azure platform appliance as part of its Dell Services Cloud to develop and deliver next-generation cloud services. With the Windows Azure platform, Dell will be able to deliver private and public cloud services for its enterprise, public, and small and medium-sized business customers. Additionally, Dell will develop a Dell-powered Windows Azure platform appliance for enterprise organizations to run in their own data centers.

So what does this mean, exactly? By implementing the limited production release of the Windows Azure platform appliance to host public and private clouds for its customers, Dell will leverage its vertical industry expertise to offer solutions for the speedy delivery of flexible application hosting and IT operations. In addition, Dell Services will provide application migration, advisory, and integration and implementation services.

Microsoft and Dell will work together to develop a Windows Azure platform appliance for large enterprise, public and hosting customers to deploy to their own data centers. The resulting appliance will leverage infrastructure from Dell combined with the Windows Azure platform.

This partnership shows that both Dell and Microsoft recognize that more organizations can reap the benefits of the flexibility and efficiency of the Windows Azure platform. Both companies understand that cloud computing allows IT to increase responsiveness to business needs and also delivers significant efficiencies in infrastructure costs. The result will be an appliance to power a Dell Platform-as-a-Service (PaaS) Cloud.

The announcement with Dell occurred on the same day that Microsoft announced the limited production release of the Windows Azure platform appliance, a turnkey cloud platform for large service providers and enterprises to run in their own data centers. Initial partners (like Dell) and customers using the appliance in their data centers will have the scale-out application platform and data center efficiency of Windows Azure and SQL Azure that Microsoft currently provides.

Since the launch of the Windows Azure platform, Dell Data Center Solutions (DCS) has been working with Microsoft to build out and power the platform. Dell will use the insight gained as a primary infrastructure partner for the Windows Azure platform to make certain that the Dell-powered Windows Azure platform appliance is optimized for power and space, reducing ongoing operating costs while sustaining the performance of large-scale cloud services.

A top provider of cloud computing infrastructure, Dell boasts a client roster that includes 20 of the 25 most heavily-trafficked Internet sites and four of the top global search engines. The company has been custom-designing infrastructure solutions for the top global cloud service providers and hyperscale data center operations for the past three years, and in that time has developed expertise in the specific needs of organizations in hosting, HPC, Web 2.0, gaming, energy, social networking and SaaS, plus public and private cloud builders.

Speaking about the partnership with Microsoft, president of Dell Services Peter Altabef said, “Organizations are looking for innovative ways to use IT to increase their responsiveness to business needs and drive greater efficiency. With the Microsoft partnership and the Windows Azure platform appliance, Dell is expanding its cloud services capabilities to help customers reduce their total costs and increase their ability to succeed. The addition of the Dell-powered Windows Azure platform appliance marks an important expansion of Dell’s leadership as a top provider of cloud computing infrastructure.”

Dell Services delivers vertically-focused cloud solutions with the combined experience of Dell and Perot Systems. Currently, Dell Services delivers managed and Software-as-a-Service support to over 10,000 customers across the globe. Additionally, Dell boasts a comprehensive suite of services designed to help customers leverage public and private cloud models. With the new Dell PaaS powered by the Windows Azure platform appliance, Dell will be able to offer customers an expanded suite of services including transformational services to help organizations move applications into the cloud and cloud-based hosting.

Summarizing the goal of the partnership with Dell, Bob Muglia, president of Microsoft Server and Tools, said at the Worldwide Partner Conference on July 12, “Microsoft and Dell have been building, implementing and operating massive cloud operations for years. Now we are extending our longstanding partnership to help usher in the new era of cloud computing, by giving customers and partners the ability to deploy Windows Azure platform in their datacenters.”

Do You Still Need to Worry About Cloud Security?

The answer to the question posed above is … maybe, but definitely not as much as before! A few recent studies in a handful of technologically conservative industries suggest that people and businesses are becoming increasingly comfortable with storing and managing their data in the cloud.

Markets like health care, finance and government, which are typically technology risk-averse, are quickly adopting (and even advocating) disruptive cloud technologies.

Those that have yet to adopt Software-as-a-Service continue to raise two fears when considering a move into the cloud: Who is in control of my data? Is it safe to store my data somewhere other than the office? These concerns are valid and must be understood by those making the move, but the notion that data must be stored under one’s own roof is shifting.

One expert from Accenture was recently quoted in an article on InformationWeek.com as saying, “Healthcare firms are beginning to realize that cloud providers actually may offer more robust security than is available in-house.” Within that same story, a recent study was cited stating that about one-third of the health care industry currently uses cloud apps and that over 70% of respondents plan to shift more and more to SaaS and cloud apps. While these estimates are interesting in any field, the intrigue is heightened when it comes to health care, where HIPAA compliance rules are notoriously strict.

The finance world is seeing similar shifts. For example, a recent study conducted by SIFMA explained how cloud computing is enabling the financial industry to move forward with technology in spite of budget restraints. “The [finance] industry is showing a larger appetite for disruptive technologies such as cloud computing to force business model change,” said the study.

Even the federal government is showing traces of similar trends, with federal CIO Vivek Kundra singing the praises of cloud computing even more than Marc Benioff! “Far too long we’ve been thinking very much vertically and making sure things are separated. Now we have an opportunity to lead with solutions that by nature encourage collaboration both horizontally and vertically.”

Cloud security remains an important issue that vendors take seriously, but the mood is definitely shifting toward acceptance. In a recent blog post, John Soat summarized the current mood, saying, “It’s not that security in the cloud isn’t still a concern for both [health care and finance] industries, but it’s a known, and perhaps better understood factor … So while security is still a legitimate concern, it doesn’t seem to be the show stopper it used to be …”

Evaluating Zoho CRM

Although Salesforce.com may be the name most commonly associated with SaaS CRM, Zoho CRM is picking up speed as an inexpensive option for small businesses, or for large companies with only a few people using the service. While much attention has been paid to Google Apps, Zoho has been quietly creating a portfolio of online applications that deserves recognition. Now many are wondering if Zoho CRM will have as large an impact on Salesforce as Salesforce did on SAP.

About Zoho

Part of AdventNet, Zoho has been producing SaaS Office-like applications since 2006. One of Zoho’s chief architects, Raju Vegesna, joined AdventNet upon graduating in 2000 and moving from India to the United States. Among Vegesna’s chief responsibilities is getting Zoho on the map.

Zoho initially offered spreadsheet and word-processing applications, although the company, which targets smaller businesses with 10 to 100 employees, now has a complete range of productivity applications including email, a database, project management, invoicing, HR, document management, planning and, last but not least, CRM.

Zoho CRM

Aimed at businesses seeking to manage customer relations and transform leads into profitable relationships, Zoho CRM begins with lead generation. From there, it moves through lead conversion, account setup, contacts, potential mapping and campaign tabs. One of Zoho CRM’s best features is its layout: full reporting facilities with formatting, graphical layouts and dashboards, forecasting and other management tools are neatly displayed and optimized.

Zoho CRM is fully email-enabled, and updates can be sent to any user who is set up, along with full contact administration. Timelines ensure that leads are never forgotten and campaigns never slip. Like Zimbra and ProjectPlace, Zoho CRM offers brand alignment, meaning users can change layout colors and add their own logo branding. Another key feature is Zoho’s comprehensive help section, which is constantly updated with comments and posts from other users online. Zoho CRM can import contact details from a standard comma-separated value (.csv) file exported from a user’s email system or spreadsheet application (such as Excel, StarOffice or OpenOffice), and users can export CRM data in the same format.
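Because the import format is plain CSV, preparing a contacts file is straightforward. Here is a minimal Python sketch; the column names are illustrative, as Zoho’s import screen defines the exact fields it expects.

```python
import csv
import io

# Build a .csv of contacts in memory, the kind of file Zoho CRM imports
# and exports. Column names here are illustrative only.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["First Name", "Last Name", "Email"])
writer.writeheader()
writer.writerow({"First Name": "Ada", "Last Name": "Lovelace",
                 "Email": "ada@example.com"})

# Read it back the same way a CRM importer would
buffer.seek(0)
rows = list(csv.DictReader(buffer))
print(rows[0]["Email"])  # ada@example.com
```

The same round-trip works against a real file on disk; any spreadsheet application that saves CSV produces input in this shape.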

The cost of Zoho CRM is surprisingly low. The Free Edition offers storage for 100,000 records, while the Professional and Enterprise Editions offer unlimited data storage. In the Free Edition, users can import up to 1,500 records per batch, compared with 20,000 records per batch in the Enterprise Edition.

Protected: Microsoft Azure® Platform-as-a-Service Breaks Away from the Pack

This content is password protected. To view it please enter your password below:

Four Key Categories for Cloud Computing

When it comes to cloud computing, concerns about control and security have dominated recent discussions. Where it was once assumed that all computing resources could be obtained from outside, the industry is now moving toward a vision of a data center transformed for easy connections to both internal and external IT resources.

According to IDC’s Cloud Services Overview report, sales of cloud-related technology are growing at 26 percent per year, six times the rate of IT spending as a whole, although they comprised only about 5 percent of total IT revenue this year. While the report points out that defining what constitutes cloud-related spending is complicated, it estimates that global spending of $17.5 billion on cloud technologies in 2009 will grow to $44.2 billion by 2013. IDC predicts that hybrid or internal clouds will be the norm; even in 2013, only an estimated 10 percent of that spending will go specifically to public clouds.
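IDC’s figures are easy to sanity-check: four years of 26 percent annual growth applied to the 2009 number should land close to the 2013 estimate.

```python
# Sanity-check IDC's projection: $17.5B in 2009 growing at 26% per year
# for four years should land near the $44.2B estimate for 2013.
spend_2009 = 17.5          # billions of dollars
growth = 1.26              # 26 percent annual growth
years = 2013 - 2009

spend_2013 = spend_2009 * growth ** years
print(round(spend_2013, 1))  # ~44.1, in line with IDC's $44.2B estimate
```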

According to Chris Wolf, analyst at The Burton Group, hybrid cloud infrastructure isn’t that different from existing data-center best practices. The difference is that all of the pieces are meant to fit together using Internet-age interoperability standards as opposed to homegrown kludge.

The following are four items to consider when preparing your IT budget for use of private or public cloud services:

1. Application Integration

Software integration isn’t the first thing most companies consider when building a cloud, although Bernard Golden, CEO of cloud consulting firm HyperStratus and a CIO.com blogger, says it is the most important one.

Tom Fisher, vice president of cloud computing at SuccessFactors.com, a business-application SaaS provider in San Mateo, California, says that integration is a whole lot more than simply batch-processing chunks of data traded between applications once or twice per day, as was done in the mainframe era.

Fisher explains that it is critical for companies to be able to provision and manage user identities from a single location across a range of applications, especially for companies that are not in the software-providing business and do not view IT as a primary product.

“What you’re looking for is to take your schema and map it to PeopleSoft or another application so you can get more functional integration. You’re passing messages back and forth to each other with proper error-handling agreement so you can be more responsive. It’s still not real time integration, but in most cases you don’t really need that,” says Fisher.
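The kind of integration Fisher describes, mapping a local schema onto another application’s fields with an agreed error-handling contract, can be sketched in a few lines. All field names below are hypothetical, not an actual PeopleSoft schema.

```python
# A minimal sketch of functional integration: translate a local record
# into a target application's schema, with structured error handling
# instead of silently dropping bad data. Field names are hypothetical.

FIELD_MAP = {"employee_id": "EMPLID", "full_name": "NAME", "dept": "DEPTID"}

def to_target_schema(record):
    """Map a local record onto the target application's field names."""
    missing = [k for k in FIELD_MAP if k not in record]
    if missing:
        # The error-handling agreement: reject with a structured message
        return {"status": "error", "missing_fields": missing}
    return {"status": "ok",
            "payload": {FIELD_MAP[k]: record[k] for k in FIELD_MAP}}

msg = to_target_schema({"employee_id": 42, "full_name": "J. Smith", "dept": "MFG"})
print(msg["payload"]["EMPLID"])  # 42
```

As Fisher notes, this message-passing style is not real-time integration, but for most use cases it does not need to be.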

2. Security

The ability to federate two networks (securely connect them without completely merging them) is a critical factor in building a useful cloud, according to Golden.

According to Nick Popp, VP of product development at Verisign (VRSN), that requires layers of security, including multifactor authentication, identity brokers, access management and sometimes an external service provider who can provide that high a level of administrative control. Verisign is considering adding a cloud-based security service.

Wolf states that this requires technology that doesn’t yet exist: an information authority that can act as a central repository for security data and a point of control for applications, data and platforms within the cloud. It is possible to assemble parts of that function today out of some of the elements Popp mentions, yet Wolf maintains that no single technology can span all the platforms necessary to provide real control of even an internally hosted cloud environment.

3. Virtual I/O

One IT manager at a large digital mapping firm states that squeezing data for a dozen VMs through a few NICs will inhibit scaling your VM cluster to cloud proportions.

“When you’re in the dev/test stage, having eight or 10 [Gigabit Ethernet] cables per box is an incredible labeling issue; beyond that, forget it. Moving to virtual I/O is a concept shift—you can’t touch most of the connections anymore—but you’re moving stuff across a high-bandwidth backplane and you can reconfigure the SAN connections or the LANs without having to change cables,” says the IT manager.

Virtual I/O servers (like the Xsigo I/O Director servers used by the IT manager’s company) can run 20Gbit/sec through a single cord, with as many as 64 cords to a single server, connecting to a backplane with a total of 1,560Gbit/sec of bandwidth. The IT manager states that concentrating such a large amount of bandwidth in one device saves space, power and cabling, keeps network performance high, and saves money on network gear in the long run.

Speaking about the Xsigo servers, which start at approximately $28,000 through resellers like Dell (DELL), the manager says, “It becomes cost effective pretty quickly. You end up getting three, four times the bandwidth at a quarter the price.”

4. Storage

Storage remains the weak point of the virtualization and cloud-computing worlds, and the place where the most money is spent.

“Storage is going to continue to be one of the big costs of virtualization. Even if you turn 90 percent of your servers into images, you still have to store them somewhere,” says Golden in summary. Visit Nubifer.com for more information.

Zuora Releases Z-Commerce

The first external service (SaaS) that actually understands the complex billing models of cloud providers (models that account for monthly subscription fees as well as automated metering, pricing and billing for products, bundles and highly individualized configurations) arrived in mid-June in the form of Zuora’s Z-Commerce. An upgrade to Zuora’s billing and payment service built for cloud providers, Z-Commerce is a major development. With Z-Commerce, a storage-as-a-service provider is able to charge for terabytes of storage used, IP address usage or data transfer. Cloud providers can also structure a per-CPU-instance charge or per-application-use charge, and the service can take complexities like peak usage into account. Zuora has provided 20 pre-configured templates for the billing and payment models that cloud providers use.
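As a hypothetical sketch of the metered billing Z-Commerce automates, the snippet below combines a monthly subscription fee with per-unit charges for storage and CPU instance hours. All rates and item names are invented for illustration; they are not Zuora pricing or API constructs.

```python
# Toy metered-billing calculation: a base subscription plus per-unit
# charges for whatever the provider meters. Rates are invented.

RATES = {"storage_tb": 50.0, "cpu_instance_hours": 0.12}  # dollars per unit
MONTHLY_FEE = 99.0

def monthly_invoice(usage):
    """Sum the base subscription plus all metered charges for the month."""
    metered = sum(RATES[item] * qty for item, qty in usage.items())
    return round(MONTHLY_FEE + metered, 2)

total = monthly_invoice({"storage_tb": 3, "cpu_instance_hours": 500})
print(total)  # 99 + 150 + 60 = 309.0
```

Real systems layer tiering, proration and peak-usage rules on top of this core loop, which is why pre-configured templates are valuable.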

What makes this development so interesting is that Zuora points to what it calls the “subscription economy” as the underlying rationale for its success: 125 customers, 75 employees and profitability.

Tien Tzuo, the CEO of Zuora (and the former Chief Strategy Officer of Salesforce.com), described the subscription economy as follows:

“The business model of the 21st century is a fundamentally different business model.

The 21st century world needs a whole new set of operational systems — ones that match the customer centric business model that is now necessary to succeed.

The business model of the 20th century was built around manufacturing.  You built products at the lowest possible cost, and you found buyers for those products.

The key metrics were all around inventory, cost of goods sold, product life cycles, etc. But over the last 30 years, we’ve been moving away from a manufacturing economy to a services economy. Away from an economy based on tangible goods, to an economy based on intangible ideas and experiences.

What is important now is the customer: understanding customer needs, and building services & experiences that fulfill those customer needs.  Hence the rise of CRM.

But our financial and operational systems have not yet evolved!  What we need today are operational systems built around the customer, and around the services you offer to your customers.

You need systems that allow you to design different services, offered under different price plans that customers can choose from based on their specific needs.  So the phone companies have 450 minute plans, prepaid plans, unlimited plans, family plans, and more.  Salesforce has Professional Edition, and Enterprise Edition, and Group Edition, and PRM Edition, and more.  Amazon has Amazon Prime.  ZipCar has their Occasional Driving Plan and their Extra Value Plans.

You need systems that track customer lifecycles — things such as monthly customer value, customer lifetime value, customer churn, customer share of wallet, conversion rates, up sell rates, adoption levels.

You need systems that measure how much of your service your customers are consuming.  By the minute?  By the gigabyte?  By the mile?  By the user?  By the view?  And you need to establish an ongoing, recurring billing relationship with your customers, that maps to your ongoing service relationship, that allows you to monetize your customer interactions based on the relationship that the customer opted into.

The 21st century world needs a whole new set of operational systems — ones that match the customer centric business model that is now necessary to succeed.”

To summarize, he is saying that the model for future business isn’t the purchase of goods and services, but rather a price paid to a company for an ongoing relationship with it. Under this model, the customer can structure the relationship in a way that provides what they need to accomplish the job(s) the company can help them with (which can span a variety of services, products, tools and structured experiences).

This is also interesting because the business measures the customer’s commitments to it, and vice versa, in operational terms, even as the business model shifts toward more interactions than ever before. If you look at traditional CRM metrics like CLV, churn, share of wallet and adoption rates as they apply to a business model that continues to evolve away from pure transactions, Tien is saying that payment and billing are, to him, the financial infrastructure for this new customer-centered economic model (i.e., the subscription model).

Denis Pombriant of Beagle Research Group, LLC commented on this on his blog recently, pointing out that a subscription model does not guarantee a business will be successful. What does have significant bearing on the success or failure of a business is how well the business manages the model, or has it managed (i.e., by Zuora).

Zuora is highlighting what it predicted: that companies are increasingly moving their business models to subscription-based pricing. This is the same model that supports free software and hardware by charging customers by the month. How the model is managed is another can of worms, but for now Zuora has done a service by recognizing that customer-driven companies are realizing that customers are willing to pay for the aggregate capabilities of the company in an ongoing way, as long as the company continues to support the customer’s needs in solving problems that arise. To learn more about cloud computing and the subscription model, contact a Nubifer.com representative.

Microsoft Releases Security Guidelines for Windows Azure

Industry analysts have praised Microsoft for doing a respectable job at ensuring the security of its Business Productivity Online Services, Windows and SQL Azure. With that said, deploying applications to the cloud requires additional considerations to ensure that data remains in the correct hands.

As a result of these concerns, Microsoft released a new version of its Security Development Lifecycle guidance in early June. The Security Development Lifecycle, a statement of best practices for those building Windows and .NET applications, has been updated over the years to ensure the security of those apps, and the new guidance focuses on how to build security into Windows Azure applications.

Michael Howard, principal security program manager of Microsoft’s Security Development Lifecycle team, warns that those practices were not, however, designed for the cloud. Speaking in a pre-recorded video statement embedded in a blog entry, Howard says, “Many corporations want to move their applications to the cloud but that changes the threats, the threat scenarios change substantially.”

Titled “Security Best Practices for Developing Windows Azure Applications,” the 26-page white paper is divided into three sections: the first describes the security technologies that are part of Windows Azure (including the Windows Identity Foundation, Windows Azure App Fabric Access Control Service and Active Directory Federation Services 2.0, a core component for providing common logins to Windows Server and Azure); the second explains how developers can apply various SDL practices to build more secure Windows Azure applications, outlining threats like namespace configuration issues and recommending data security practices such as how to generate shared-access signatures and the use of HTTPS in request URLs; and the third is a matrix that identifies various threats and how to address them.
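As a rough illustration of the shared-access-signature mechanism the white paper discusses, the sketch below signs a resource path and expiry time with a shared key using HMAC-SHA256. The string-to-sign format, hostnames and parameter names here are invented for illustration; the actual Windows Azure signature format is defined in the platform documentation.

```python
import base64
import hashlib
import hmac

# Generic shared-access-signature sketch: the service and the client share
# a secret key, and the client proves authorization by signing the resource
# path plus an expiry time. Format and names below are illustrative only.

def sign(key: bytes, resource: str, expiry: str) -> str:
    string_to_sign = f"{resource}\n{expiry}"
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("ascii")

key = b"shared-secret-key"
sig = sign(key, "/container/blob.txt", "2010-08-01T00:00:00Z")

# The signature travels in the request URL, which is why the white paper
# recommends HTTPS: the URL itself carries the proof of authorization.
url = f"https://account.blob.example/container/blob.txt?se=2010-08-01T00:00:00Z&sig={sig}"
```

Because the signature covers both the resource and the expiry, tampering with either invalidates it, and the grant lapses on its own when the expiry passes.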

Says Howard, “Some of those threat mitigations can be technologies you use from Windows Azure and some of them are threat mitigations that you must be aware of and build into your application.”

Security is a major concern, and Microsoft has addressed many key issues concerning security in the cloud. Phil Lieberman, president of Lieberman Software Corp., a Microsoft Gold Certified Partner specializing in enterprise security, says, “By Microsoft providing extensive training and guidance on how to properly and securely use its cloud platform, it can overcome customer resistance at all levels and achieve revenue growth as well as dominance in this new area. This strategy can ultimately provide significant growth for Microsoft.”

Agreeing with Lieberman, Scott Matsumoto, a principal consultant with the Washington, D.C.-based consultancy firm Cigital Inc., which specializes in security, says, “I especially like the fact that they discuss what the platform does and what’s still the responsibility of the application developer. I think that it could be [wrongly] dismissed as a rehash of other information or incomplete—that would be unfair.” To find more research on Cloud Security, please visit Nubifer.com.

Five Best Practices for Private Cloud Computing

Industry experts state that private cloud computing enables enterprise IT executives to maximize their organization’s resources and align IT services with business needs while they wait for public cloud computing standards to become defined.

Even for enterprises that like to manage infrastructure and applications in-house, building a private cloud is good practice. Frank Gens, senior vice president and chief analyst at IDC, a research firm in Framingham, Massachusetts, says, “With virtualization and the private cloud, CIOs are much closer to that goal of efficient and dynamic IT service delivery and capability.”

Automation is a key goal because it minimizes the IT staff’s involvement once the cloud is up and running. “The end user is the constituent who is going to leverage the workload for productive work,” says Brian Wilson, vice president for services and support at Surgient Inc. An Infrastructure-as-a-Service provider in Austin, Texas, Surgient has deployed 150 private clouds for Fortune 500 enterprises.

According to Wilson, the most important aspect of a private cloud is self-service. That said, “a self-service portal does not guarantee self-service. Self-service needs to be layered on top of automation services.” CIOs need to consider the service’s design, definition, library and lifecycle. Additionally, the service should integrate applications that report usage for charge-back (preferably with an administrative dashboard and event broadcasting).

A private cloud doesn’t mean a less complex cloud, and as more enterprises launch their private clouds, best practices are beginning to emerge. Here is a list of five best practices for private cloud computing, according to Wilson:

1. Assess

  • Evaluate current and planned hardware, hypervisors, network architecture and storage.
  • Understand corporate security standards and existing vendor relationships, and know where your vendors are going (so you don’t buy into dead-end technology).
  • Begin with a defined project and plan for scale, heterogeneity and change. Plan for and document your deployment plans using client-specific use cases and success criteria.

2. Deploy

  • Microsoft CEO Steve Ballmer compares the usage curve for cloud computing to a hockey stick, so be prepared for the uptick by establishing a deployment schedule.
  • Ensure that essential content is available in a centralized library.
  • Introduce critical members of the team, finalize use cases and confirm the schedule from the beginning.
  • Dynamically manage IT policies by automating self-service provisioning of applications while remaining flexible and understanding of change.
  • Plan for on-site training.

3. Analyze

  • Review usage trends, resource consumption trends, server use and administration overhead, a step that is often skipped, according to Wilson.
  • Understand the metrics for ROI and TCO and gain executive buy-in with formal ROI evaluations monthly and quarterly.
  • Continue to evaluate your processes, as the cloud is a fundamental shift from traditional processes. Ask yourself if there is a better way to do this throughout the process.

4. Create Reusable Code

  • Plan your service catalog wisely by creating reusable building blocks of virtual machines and services.
  • Take the time to understand your users’ needs and plan for their experience, as your content is critical.
  • Take the centralized view that is possible with a private cloud; avoid discrete stacks and multiple operating systems.

5. Don’t Forget to Charge Back

  • According to Wilson, very few organizations actually charge back, even though one of the pillars of the cloud is its ability to meter services on an as-needed basis.
  • Saint Luke’s Health System, for example, operates 11 hospitals and clinics in the Kansas City, Missouri metropolitan area. CIO Debe Gash opted for public cloud computing because of the speed with which it enabled her organization to comply with new HIPAA regulations and says charge-back helps keep IT costs down and prove its mettle.
  • “The bill of IT for each entity is valuable. They can see what they’re using. The visibility into what something actually costs is very helpful to them,” says Gash. The charge-back also shows which systems are driving IT costs, thus Gash can “validate that we’re spending money on what’s strategic to the organization.”
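A charge-back of the kind Wilson and Gash describe boils down to metering each group’s consumption and pricing it. Below is a toy sketch; the departments and the rate are invented for illustration.

```python
# Toy "bill of IT": meter each department's resource consumption and
# price it, giving visibility into which groups drive IT costs.
# Departments and the rate are invented for illustration.

RATE_PER_VM_HOUR = 0.25  # dollars

usage = {"Radiology": 1200, "Billing": 800, "Research": 400}  # VM-hours

bill = {dept: round(hours * RATE_PER_VM_HOUR, 2) for dept, hours in usage.items()}
for dept, cost in sorted(bill.items(), key=lambda kv: -kv[1]):
    print(f"{dept}: ${cost}")
```

Even this trivial report supports the point Gash makes: once each entity can see what it is using and what that costs, IT spending can be validated against what is strategic to the organization.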

To receive more information regarding best practices for private cloud computing contact a Nubifer.com representative today.

Microsoft Makes Strides for a More Secure and Trustworthy Cloud

Cloud computing currently holds court in the IT industry with vendors, service providers, press, analysts and customers all evaluating and discussing the opportunities presented by the cloud.

Security is a very important piece of the puzzle, and nearly every day a new press article or analyst report indicates that cloud security and privacy are top concerns for customers as the benefits of cloud computing continue to unfold. For example, a recent Microsoft survey revealed that although 86% of senior business leaders are thrilled about cloud computing, over 75% remain concerned about the security, access and privacy of data in the cloud.

Customers are correct in asking how cloud vendors are working to ensure the security of cloud applications, the privacy of individuals and protection of data. In March, Microsoft CEO Steve Ballmer told an audience at the University of Washington that, “This is a dimension of the cloud, and it’s a dimension of the cloud that needs all of our best work.”

Microsoft is seeking to address security-related concerns and help customers understand which questions they need to ask as part of its Trustworthy Computing efforts. The company is trying to be more transparent than its competitors about how it helps enable an increasingly secure cloud.

Server and Tools Business president Bob Muglia approached the issue in his recent keynote at Microsoft’s TechEd North America conference, saying, “The data that you have in your organization is yours. We’re not confused about that; it’s incumbent on us to help you protect that information for you.” Microsoft’s strategy is to deliver software, services and tools that enable customers to realize the benefits of a cloud-based model with the reliability and security of on-premise software.

The Microsoft Global Foundations Services (GFS) site is a resource for users to learn about Microsoft’s cloud security efforts, with the white papers “Securing Microsoft’s Cloud Infrastructure” and “Microsoft’s Compliance Framework for Online Services” being very informative.

GFS drives a comprehensive, centralized Information Security Program for all Microsoft cloud data centers and the 200+ consumer and commercial services they deliver, all built using the Microsoft Security Development Lifecycle. The program covers everything from physical security to compliance, including: a risk management process, incident response, and work with law enforcement; defense-in-depth security controls spanning the physical, network, identity and access, host, application and data layers; a comprehensive compliance framework addressing standards and regulations such as PCI, SOX, HIPAA and the Media Ratings Council; and third-party auditing, validation and certification (ISO 27001, SAS 70).

Muglia also pointed out Microsoft’s focus on identity, saying, “As you move to cloud services you will have a number of vendors, and you will need a common identity system.” In general, identity is the cornerstone of security, especially cloud security. Microsoft currently provides technologies with Windows Server and cloud offerings which customers can use to extend existing investments in identity infrastructure (like Active Directory) for easier and more secure access to cloud services.
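Muglia’s point about a common identity system can be illustrated with a minimal sketch: one identity provider issues signed tokens, and each cloud service verifies them instead of maintaining its own user store. This is a simplified stand-in (a shared-secret HMAC token) for the real federation protocols, such as SAML or WS-Federation, used with Active Directory; all names and the token format below are hypothetical, not any vendor’s API.

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical shared secret between the identity provider and trusting services.
SHARED_SECRET = b"demo-secret-shared-with-trusted-services"

def issue_token(user, issuer="idp.example.com", ttl=3600):
    """The identity provider signs a claim set for the user."""
    claims = {"sub": user, "iss": issuer, "exp": int(time.time()) + ttl}
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SHARED_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def verify_token(token):
    """Any relying service checks the signature and expiry; no local user store."""
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(SHARED_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered or signed by an untrusted issuer
    claims = json.loads(base64.urlsafe_b64decode(payload))
    return claims if claims["exp"] > time.time() else None

token = issue_token("alice@example.com")
print(verify_token(token)["sub"])  # the same identity is honored by every service
```

In a real deployment the shared secret would be replaced by public-key signatures and federation metadata, but the design choice is the same: services trust one identity authority rather than each managing credentials.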

Microsoft is not alone in working on cloud security, as noted by Microsoft’s chief privacy strategist Peter Cullen. “These truly are issues that no one company, industry or sector can tackle in isolation. So it is important to start these dialogs in earnest and include a diverse range of stakeholders from every corner of the globe,” Cullen said in his keynote at the Computers, Freedom and Privacy (CFP) conference. Microsoft is working with customers, governments, law enforcement, partners and industry organizers (like the Cloud Security Alliance) to ensure more secure and trustworthy cloud computing through strategies and technologies. To receive additional information on Cloud security contact a Nubifer.com representative today.

Don’t Underestimate a Small Start in Cloud Computing

Although many predict that cloud computing will forever alter the economics and strategic direction of corporate IT, it is likely that the impact of the cloud will continue to come largely from small projects. Some users and analysts say that these small projects, which do not involve complex, enterprise-class computing-on-demand services, are what to watch.

David Tapper, outsourcing and offshoring analyst for IDC says, “What we’re seeing is a lot of companies using Google (GOOG) Apps, Salesforce and other SaaS apps, and sometimes platform-as-a-service providers, to support specific applications. A lot of those services are aimed at consumers, but they’re just as relevant in business environments, and they’re starting to make it obvious that a lot of IT functions are generic enough that you don’t need to build them yourself.” New enterprise offerings from Microsoft, such as Microsoft BPOS, have also shown up on the scene with powerful SaaS features to offer businesses.

According to Tapper, the largest representation of mini-cloud computing is small- and mid-sized businesses using commercial versions of Google Mail, Google Apps and similar ad hoc or low-cost cloud-based applications. With that said, larger companies are doing the exact same thing. “Large companies will have users whose data are confidential or who need certain functions, but for most of them, Google Apps is secure enough. We do hear about some very large cloud contracts, so there is serious work going on. They’re not the rule though,” says Tapper.

First Steps into the Cloud

A poll conducted by the Pew Research Center’s Internet & American Life Project found that 71 percent of the “technology stakeholders and critics” surveyed believe that by 2020 most people will do their work from a range of computing devices using Internet-based applications as their primary tools.

Respondents were picked from technology and analyst companies for their technical savvy and as a whole believe cloud computing will dominate information transactions by the end of the decade. The June report states that cloud computing will be adopted because of its ability to provide new functions quickly, cheaply and from anywhere the user wishes to work.

Chris Wolf, analyst at Gartner, Inc.’s Burton Group, thinks that while this isn’t unreasonable, it may be a little too optimistic. Wolf says that even fairly large companies sometimes use commercial versions of Google Mail or instant messaging, but it is a different story when it comes to applications requiring more fine tuning, porting, communications middleware or other heavy work to run on public clouds, or data that has to be protected and documented.

Says Wolf, “We see a lot of things going to clouds that aren’t particularly sensitive–training workloads, dev and test environments, SaaS apps; we’re starting to hear complaints about things that fall outside of IT completely, like rogue projects on cloud services. Until there are some standards for security and compliance, most enterprises will continue to move pretty slowly putting critical workloads in those environments. Right now all the security providers are rolling their own and it’s up to the security auditors to say if you’re in compliance with whatever rules govern that data.”

Small, focused projects using cloud technologies are becoming more common, in addition to the use of commercial cloud-based services, says Tapper.

For example, Beth Israel Deaconess Hospital in Boston elevated a set of VMware (VMW) physical and virtual servers into a cloud-like environment to create an interface to its patient-records and accounting systems, enabling hundreds of IT-starved physician offices to link up using just a browser.

New York’s Museum of Modern Art started using workgroup-on-demand computing systems from CloudSoft Corp. last year. This allowed the museum to create online workspaces for short-term projects that would otherwise have required real or virtual servers and storage on-site.

In about a decade, cloud computing will make it clear to both IT and business management that some IT functions are just as generic when homegrown as when rented. Says Tapper, “Productivity apps are the same for the people at the top as the people at the bottom. Why buy it and make IT spend 80 percent of its time maintaining essentially generic technology?” Contact Nubifer.com to learn more…

Nubifer Cloud:Link Mobile and Why Windows Phone 7 is Worth the Wait

Sure, Android devices become more cutting-edge with each near-monthly release and Apple recently unveiled its new iPhone, but some industry experts suggest that Windows Phone 7 is worth the wait. Additionally, businesses may benefit from waiting until Windows Phone 7 arrives to properly compare the benefits and drawbacks of all three platforms before making a decision.

Everyone is buzzing about the next-generation iPhone and smartphones like the HTC Incredible and HTC EVO 4G, but iPhone and Android aren’t even the top smart phone platforms. With more market share than second place Apple and third place Microsoft combined, RIM remains the number one smartphone platform. Despite significant gains since its launch, Android is in fourth place, with only 60 percent as much market share as Microsoft.

So what gives? In two words: the business market. While the iPhone was revolutionary for blurring the line between consumer gadget and business tool, RIM has established itself as synonymous with mobile business communications. Apple and Google don’t provide infrastructure integration or management tools comparable to those available with the BlackBerry Enterprise Server (BES).

The continued divide between consumer and business is highlighted by the fact that Microsoft is still in third place with 15 percent market share. Apple and Google continue to leapfrog one another while RIM and Microsoft are waiting to make their move.

The long delay in new smartphone technology from Microsoft is the result of leadership shakeups and the fact that Microsoft completely reinvented its mobile strategy, starting from scratch. Windows Phone 7 isn’t merely an incremental evolution of Windows Mobile 6.5. Rather, Microsoft went back to the drawing board to create an entirely new OS platform that recognizes the difference between a desktop PC and a smartphone as opposed to assuming that the smartphone is a scaled-down Windows PC.

Slated to arrive later this year, Windows Phone 7 smartphones promise an attractive combination of the intuitive touch interface and experience found in the iPhone and Android, along with the integration and native apps to tie in with the Microsoft server infrastructure that forms the backbone of most customers’ network and communications architecture.

With that said, the Windows Phone 7 platform won’t be without its own set of issues. Like Apple’s iPhone, Windows Phone 7 is expected to lack true multitasking and the copy and paste functionality from the get-go. Additionally, Microsoft is also locking down the environment with hardware and software restrictions that limit how smartphone manufacturers can customize the devices, and doing away with all backward compatibility with existing Windows Mobile hardware and apps.

Cloud computing today touches many devices and endpoints, from application servers to desktops to the burgeoning ecosystem of smartphones. Surveying the landscape of mobile operating systems and the technical capabilities of today’s smartphones reveals a whole new and exciting layer of technology for consumers and business users alike.

Given the rich capabilities of Windows Phone 7, including Silverlight and XNA, we at Nubifer are engineering upgrades to our cloud services to interoperate with the powerful new technologies the platform will offer. We plan to support many popular smartphones and handset devices by linking them to our Nubifer Cloud:Link technology, with extended functionality delivered by Nubifer Cloud:Connector and Cloud:Portal. These tools enable enterprise companies to gain a deeper view into the analytics and human-computer interaction of end users and subscribers of various owned and leased software systems, whether hosted entirely in the cloud or in a hybrid model.

It makes sense for companies that don’t need to replace their smartphones immediately to wait for Windows Phone 7 to arrive, at which point all three platforms can be compared and contrasted. May the best smartphone win!