Posts Tagged ‘ Web Services ’

Get Your Java with Google App Engine

Google’s App Engine service has embraced the Java programming language. The most requested feature for App Engine since its inception, Java support is currently in “testing mode,” although Google eventually plans on bringing GAE’s Java tools up to speed with its current Python support.

As Google’s service for hosting scalable and flexible web applications, App Engine is synonymous with cloud computing for Google. Java is one of the most frequently used languages for coding applications on the web, and by adding Java, Google is filling a major gap in its cloud services plan. It is also catching up with one of its fiercest competitors in cloud computing, Amazon, whose Web Services platform has provided support for Java virtual machines for some time now.

In addition, Java support opens the possibility of making App Engine a means of running applications for Google’s Android mobile platform. Although no plans for Android apps on GAE have been outlined as of yet, it appears as if Google is preparing an effortless and quick way to develop for Android, as Java is available on the device as well as the server.

With the addition of Java support to Google App Engine, other programming languages such as JavaScript, Ruby and perhaps Scala can run on the Java virtual machine as well. The possibility of JRuby support or support for other JVM languages arriving any time in the near future, however, is unlikely due to the experimental status of Java support.

Those wishing to play around with Google App Engine’s new Java support can add their name to the list on the sign up page; the first 10,000 developers will be rewarded with a spot in the testing group.

Along with Java support, the latest update for Google App Engine includes support for cron jobs, which enables programmers to easily schedule recurring tasks such as weekly reports. The Secure Data Connector is another new feature, letting Google App Engine access data behind a firewall. Thirdly, there is a new database import tool, which makes it easier to move large amounts of data into App Engine.
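As a rough sketch of the cron feature, recurring tasks are declared in a configuration file deployed alongside the application; on App Engine’s Python runtime this is cron.yaml (the Java runtime uses an equivalent cron.xml). The URL and schedule below are illustrative assumptions, not details from the announcement:

```yaml
cron:
- description: weekly status report          # the "weekly reports" use case above
  url: /tasks/weekly-report                  # hypothetical handler in your app
  schedule: every monday 09:00               # App Engine's English-like schedule syntax
```

App Engine invokes the given URL on the stated schedule, so the report logic lives in an ordinary request handler.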

In summary, by embracing the programming language of Java, Google is filling a gap in its cloud services plan and catching up with competitors like Amazon. For more information, please visit nubifer.com.

Thoughts on Google Chrome OS

As a leading cloud computing and SaaS provider, everyone at Nubifer is excited about Google’s new operating system, Chrome. Designed, in Google’s words, for “people who live on the web” (like us!), Google’s Chrome browser launched in late 2008, and now an extension of it, the Google Chrome Operating System, has arrived. Google demonstrated its open source PC operating system on Nov. 19 and revealed that its code will be open-sourced later this year, with netbooks running Google Chrome OS available for consumers as early as the second half of 2010.

Citing speed, simplicity and security as key features, Google Chrome OS is designed as a modified browser which allows netbooks to carry out everyday computing with web-based applications. Google Chrome OS basically urges consumers to abandon the computing experience that they are used to in favor of one that exists entirely in the cloud (albeit Google’s cloud), which, you have to admit, is a pretty enticing offer. The obvious benefits of the Google Chrome OS are saving money (cloud storage replaces pricey external hard-disc drives) and gaining security (thanks to Google’s monitoring for malware in Chrome OS apps).

While many comparisons have been made between Google Chrome OS and Android (admittedly they do overlap somewhat), Chrome is designed for those who spend the majority of their time on the web, and is thus being created to power computers of varying sizes, while Android was designed to work across devices ranging from netbooks to cell phones. Google Chrome OS will run on x86 and ARM chips, and Google is currently teaming up with several OEMs to offer multiple netbooks in 2010. The foundation of Google Chrome OS is this: it runs a new windowing system on top of a Linux kernel. The web is the platform for application developers, with new applications able to be written using already-in-place web technologies and existing web-based applications able to work automatically.

Five benefits of using Google Chrome OS are laid out by Wired.com: Cost, Speed, Compatibility, Portability and New Applications. While netbooks are inexpensive, users often fork out a sizable chunk of change for a Windows license, and using Google’s small, fast-booting platform allows this cost to be greatly downsized. Those who have priced Linux versions of netbooks already know that the Windows license adds roughly $50 on average, a de facto Microsoft tax; because Chrome OS is based on Linux, it would most likely be free. As for speed, Chrome OS is created to run on low-powered Atom and ARM processors, with Google promising boot times measured in mere seconds.

Drivers have caused major problems for those using an OS other than Windows XP on a netbook, but there is a chance that Google may devise an OS able to be downloaded, loaded onto any machine and ready to use, all without being designed specifically for different netbook models. And now we come to portability, as Chrome allows all of Google’s services, from Gmail and Google Docs to Picasa, to be built-in and available for offline access using Google Gears. Thus users won’t have to worry about not having data available when not connected to the Internet. As for new applications, it remains unclear whether Google will buy open-source options like the Firefox-based Songbird music player (which has the ability to sync with an iPod and currently runs on some Linux flavors) or create its own.

Another company, Phoenix Technologies, is also offering an operating system, called HyperSpace. Instead of serving as a substitute for Windows, HyperSpace is an optional, complementary (notice it’s spelled with an “e,” not an “i”) mini OS which is already featured on some netbooks. Running parallel to Windows as an instant-on environment, HyperSpace allows netbooks to perform Internet-based functions, such as browsing, e-mail and multimedia playback, without booting into Windows. Phoenix Technologies’ idea is similar to Google’s, but Phoenix is a lesser-known company and is taking a different approach to offering the mini OS than Google is with its Chrome OS.

Google’s eventual goal is to produce an OS that mirrors the streamlined, quick and easy characteristics of its individual web products. Google is the first to admit that it has its work cut out for it, but that doesn’t make the possibility of doing away with hard drives once and for all any less exciting for all of us. For more information please visit Nubifer.com.

Evaluating Zoho CRM

Although Salesforce may be the name most commonly associated with SaaS CRM, Zoho CRM is picking up speed as a cheap option for small businesses or large companies with only a few people using the service. While much attention has been paid to Google Apps, Zoho has been quietly creating a portfolio of online applications that is worth recognition. Now many are wondering if Zoho CRM will have as large an impact on Salesforce as Salesforce did on SAP.

About Zoho

Part of AdventNet, Zoho has been producing SaaS Office-like applications since 2006. One of Zoho’s chief architects, Raju Vegesna, joined AdventNet upon graduating in 2000 and moving from India to the United States. Among Vegesna’s chief responsibilities is getting Zoho on the map.

Zoho initially offered spreadsheet and writing applications, although the company, which targets smaller businesses with 10 to 100 employees, now has a complete range of productivity applications, including email, a database, project management, invoicing, HR, document management, planning and, last but not least, CRM.

Zoho CRM

Aimed at businesses seeking to manage customer relations and transform leads into profitable relationships, Zoho CRM begins with lead generation. From there, tabs cover lead conversion, account setup, contacts, potential mapping and campaigns. One of Zoho CRM’s best features is its layout: full reporting facilities with formatting, graphical layouts and dashboards, forecasting and other management tools are neatly displayed and optimized.

Zoho CRM is fully email-enabled, and updates can be sent to any user set up, along with full contact administration. Timelines ensure that leads are never forgotten and campaigns never slip. Like Zimbra and ProjectPlace, Zoho CRM offers brand alignment, which means users can change layout colors and add their own logo branding. Another key feature is Zoho’s comprehensive help section, which is constantly updated with comments and posts from other users online. Zoho CRM can import contact details from a standard comma-separated-value (.CSV) file produced by a user’s email system or spreadsheet application (such as Excel, StarOffice or OpenOffice). Users can also export CRM data in the same format.
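As an illustrative sketch of that round-trip, the short Python script below writes a contact list to the kind of plain .CSV file described above. The column names and contacts are hypothetical; the actual headers should match the field names configured in your CRM account:

```python
import csv

# Hypothetical contacts exported from an email system or spreadsheet.
contacts = [
    {"First Name": "Ada", "Last Name": "Lovelace", "Email": "ada@example.com"},
    {"First Name": "Alan", "Last Name": "Turing", "Email": "alan@example.com"},
]

# Write a standard comma-separated-value file the CRM importer can read.
with open("contacts.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["First Name", "Last Name", "Email"])
    writer.writeheader()        # header row: the importer maps these to CRM fields
    writer.writerows(contacts)  # one line per contact
```

The same csv module can read a file exported from the CRM, since the export uses the identical format.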

The cost of Zoho CRM is surprisingly low. Zoho CRM offers up to three users (1,500 records) for free, a Professional version for $12 a month and an Enterprise version (20,000 records) for $25 a month. For more information about adopting Zoho’s CRM, contact a Nubifer representative today.

How Microsoft Windows 7 Changed the Game for Cloud Computing … and Signaled a Wave of Competition Between Microsoft, Google and Others.

On October 22 Microsoft released the successor to Windows Vista, Windows 7, and while excitement for the operating system mounted prior to its release, many are suggesting that its arrival is a sign of the end of computing on personal computers and the beginning of computing solely in the cloud. Existing cloud services like social networking, online games and web-based email are accessible through smart-phones, browsers or other client services, and because of the availability of these services, Windows 7 is Microsoft’s first operating system to include fewer features than its predecessor.

Although Windows is not in danger of extinction, cloud computing makes its operating systems less important. Other companies are following in Microsoft’s footsteps by launching products with fewer features than even Windows 7. In September, Microsoft opened a pair of data centers containing half a million servers between them and subsequently issued a new version of Windows for smart-phones. Perpetually ahead of the curve, Microsoft also launched a platform for developers, the highly publicized Azure, which allows them to write and run cloud services.

In addition to changing the game for Microsoft, the growth of cloud computing also heightens competition within the computer industry. Thus far, advancements in technology have pushed computing power away from central hubs (as seen in the shift from mainframes to minicomputers to PCs), while power is now shifting back to the center in some ways, thanks to less expensive and more powerful processors and faster networks. Basically, the cloud’s data centers are outsized public mainframes. While this is occurring, the PC is being pushed aside by more compact, wireless devices like netbooks and smart-phones.

The lessened importance of the PC enables companies like Apple, Google and IBM to fill the gap caused by Microsoft’s former monopoly. There are currently hundreds of firms offering cloud services, and more by the day, but as The Economist points out, Microsoft, Google and Apple are in their own league. Each of the three companies has its own global network of data centers and plans on offering several services while also seeking to dominate the new field by developing new software or devices. The battle between Microsoft, Google and Apple sees each company trying to one-up the others. For example, Google’s free PC operating system, Chrome OS, shows Google’s attempt to catch up to Microsoft, while Microsoft’s recent operating system for smart-phones shows Microsoft’s attempt to catch up with the Apple iPhone as well as Google’s handset operating system, Android. Did you follow all of that?

Comparing Google, Microsoft and Apple

Professor Michael Cusamano of MIT’s Sloan School of Management recently told The Economist that while there are similarities between Google, Apple and Microsoft, they are each unique enough to carve out their own spot in the cloud because they approach the trend towards cloud computing in different ways.

Google is most well known for its search service as well as other web-based applications, and has recently begun diversifying, launching Android for phones and Chrome OS. In this way, it can be said that Google has been a prototype for a cloud computing company since its inception in 1998. Google’s main source of revenue is advertising, with the company controlling over 75% of search-related ads in the States (and even more on a global scale). Additionally, Google is seeking to make money from selling services to companies, announcing in October that all 35,000 employees at the pest-control-to-parcel-delivery group Rentokil Initial will be using Google’s services.

While Microsoft is commonly associated with Microsoft Office and Windows, the company’s relationship to cloud computing is not as distant as one might think. Microsoft’s new search engine, Bing, shows the company’s transition into the cloud, as does its web-based version of Office and the fact that Microsoft now offers much of its business software via online services. Microsoft smartly convinced Yahoo! to merge its search and a portion of its advertising business with Microsoft because consumers expect cloud services to be free, with everything paid for by ads.

As evidenced by the iPhone, the epitome of have-to-have-it, innovative bundles of hard- and software, Apple is largely known for its services outside the cloud. Online offerings like the App Store, the iTunes store and MobileMe (a suite of online services), however, show that Apple’s hunger to get a piece of the cloud computing pie is growing by the day. Apple is also currently building what many have suggested is the world’s largest data center (worth a whopping $1 billion) in North Carolina.

While Apple, IBM and Microsoft previously battled over the PC in the late 1980s and early 1990s, cloud computing is an entirely different game. Why? Well, for starters, much of the cloud is based on open standards, making it easier for users to switch providers. Antitrust authorities will play into the rivalry between the companies, and so will other possible contenders, such as Amazon and Facebook, the world’s leading online retailer and social network, respectively (not to mention Zoho and a host of others). An interesting fact thrown into the debate on who will emerge victorious is that all current major contenders in the cloud computing race are American, with Asian and European firms not yet showing up in cloud computing in any major way (although Nokia’s suite of online services, Ovi, is in its beginning stages). Visit Nubifer.com for more information.

Worldwide SaaS Revenue to Increase 18 Percent in 2009 According to Gartner

According to the folks over at Gartner, Inc., one of the leading information technology research and advisory companies, worldwide SaaS (Software as a Service) revenue is predicted to reach $7.5 billion in 2009. If Gartner’s forecast is correct, this would be a 17.7 percent increase, as 2008 SaaS revenue totaled $6.4 billion. Gartner also reports that the market will display significant and steady growth through 2013, at which point revenue is anticipated to surpass $14 billion for the enterprise application markets.

Research director Sharon Mertz said of the projections, “The adoption of SaaS continues to grow and evolve within the enterprise application markets. The composition of the worldwide SaaS landscape is evolving as vendors continue to extend regionally, increase penetration within existing accounts and ‘greenfield’ opportunities, and offer more-vertical-specific solutions as part of their service portfolio or through partners.” Mertz continued to explain how the on-demand deployment model has flourished because of the broadening of on-demand vendors’ services through partner offerings, alliances and (recently) by offering and promoting user-application development through PaaS (Platform as a Service) capabilities. Added Mertz, “Although usage and adoption is still evolving, deployment of SaaS still varies between the enterprise application markets and within specific market segments because of buyer demand and applicability of the solution.”

Across market segments, the largest amount of SaaS revenue comes from the CCC (content, communications and collaboration) and CRM (customer relationship management) markets. Gartner reports that in 2009 the CCC market is generating $2.6 billion and the CRM market $2.3 billion. The CCC and CRM markets generated $2.14 billion and $1.9 billion in 2008, respectively. See Table 1 for figures.

[Insert graphic box here]

Growth in the CRM market continues to be driven by SaaS, a trend which began four years ago, as evidenced by the jump from less than $500 million and over 8 percent of the CRM market in 2005 to nearly $1.9 billion in revenue in 2008. Gartner anticipates this trend to continue, with SaaS representing nearly 24 percent of the CRM market’s total software revenue in 2009. Gartner’s Mertz concludes, highlighting the need in the marketplace filled by SaaS: “The market landscape for on-demand CRM continues to evolve as the availability and usage of SaaS solutions becomes more pervasive. The rapid adoption of SaaS and the marketplace success of salesforce.com have compelled vendors without an on-demand solution to either acquire smaller niche SaaS providers or develop the solution internally in response to increasing buyer demand.” To receive more information contact Nubifer today.

Will Zoho Be the Surprise Winner in the Cloud Computing Race?

With all the talk of Microsoft, Google, Apple, IBM, Amazon and other major companies, it might be easy to forget about Zoho—but that would be a big mistake. The small, private company offers online email, spreadsheets and word processors, much like one of the giants in cloud computing, Google, and is steadily showing it shouldn’t be discounted!

Based in Pleasanton, Calif., Zoho has never accepted bank loans or venture capital yet shows revenue of over $50 million a year. While Zoho has data center and networking management tools, its fastest-growing operation is its online productivity suite, according to Zoho’s chief executive, Sridhar Vembu. The company’s position suggests that there may be a spot for Zoho among online productivity application markets seemingly dominated by a few major companies. Vembu recently told the New York Times, “For now, the wholesale shift to the Web really creates opportunities for smaller companies like us.” And he may very well be right.

Zoho has 19 online productivity and collaboration applications (including invoicing, product management and customer relationship management), so Zoho and Microsoft overlap on only five offerings. Zoho’s focus remains on the business market, with half of the company’s distribution coming through partners who integrate Zoho’s products into their offerings. For example, Box.net, a service for storing, backing up and sharing documents, uses Zoho as an editing tool for uploaded documents. Most of Zoho’s partners are web-based services, showing that cheap, web-based software permits these business mash-ups to occur, while traditional software would make them nearly impossible. “Today, in the cloud model, this kind of integration is economical,” Vembu explains to the New York Times.

According to Vembu, most paying customers using Zoho’s hosted applications from its website (with prices ranging from free to just $25 per month, varying by features and services) are small businesses with anywhere from 40 to 200 employees. As evidence for the transition into the cloud, the chief executive of Zoho points to the Splashtop software created by DeviceVM, a start-up company. Dell, Asus and Hewlett-Packard reportedly plan on loading Splashtop, software able to be installed directly into a PC’s hardware (thus bypassing the operating system entirely), on some of their PCs. “It is tailor-made for us. You go right into the browser,” says Vembu, clearly pleased at the evidence that smaller companies like Zoho are making headway in the field of cloud computing.

Microsoft Azure Uncovered

Everyone is talking about Microsoft Azure, which could leave some people left in the dust wondering what exactly Azure is, how much it costs and what it means for cloud computing and Microsoft as a whole. If you are among those who have unanswered questions about Microsoft Azure, look no further: here is your guide to all things Azure.

The Basics

When cloud computing first emerged, everyone wondered if and how Microsoft would make the transition into the cloud—and Microsoft Azure is the answer. Windows Azure is a cloud operating system that is essentially Microsoft’s first big step into the cloud. Developers can build using .NET, Python, Java, Ruby on Rails and other languages on Azure. According to Windows Azure GM Doug Hauger, Microsoft plans on eventually offering an admin model, which will give developers access to the virtual machine itself (as with traditional Infrastructure-as-a-Service offerings like Amazon’s EC2, where they manually allocate hardware resources). SQL Azure is Microsoft’s relational database in the cloud, while .NET Services is Microsoft’s Platform-as-a-Service built on the Azure OS.

The Cost

There are three different pricing models for Azure. The first is consumption-based, in which a customer pays for what they use. The second is subscription-based, in which those committing to six months of use receive discounts. Available as of July 2010, the third is volume licensing for enterprise customers desiring to take existing Microsoft licenses into the cloud.

Azure compute costs 12 cents per service hour, which is half a cent less than Amazon’s Windows-based cloud, while Azure’s storage service costs 15 cents per GB of data per month, with an additional cent for every 10,000 transactions (movements of data within the stored material). The .NET Services platform costs 15 cents for every 100,000 times an application built on .NET Services accesses a chunk of code or tool. As for moving data, it costs 10 cents per GB of inbound data and 15 cents per GB of outbound data. SQL Azure costs $9.99 for up to a 1 GB relational database and $99.99 for up to a 10 GB relational database.
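To make those rates concrete, here is a rough monthly cost sketch in Python. Only the per-unit rates come from the figures above; the usage numbers (hours, GB stored, transactions, data transferred) are made-up assumptions for illustration:

```python
# Per-unit rates quoted in the article (all in US dollars).
COMPUTE_PER_HOUR = 0.12        # compute, per service hour
STORAGE_PER_GB_MONTH = 0.15    # storage, per GB per month
TRANSACTION_BATCH = 0.01       # per 10,000 storage transactions
INBOUND_PER_GB = 0.10          # inbound data transfer, per GB
OUTBOUND_PER_GB = 0.15         # outbound data transfer, per GB

def monthly_cost(hours, stored_gb, transactions, in_gb, out_gb):
    """Estimate a month's Azure bill (excluding .NET Services and SQL Azure)."""
    return (hours * COMPUTE_PER_HOUR
            + stored_gb * STORAGE_PER_GB_MONTH
            + (transactions / 10_000) * TRANSACTION_BATCH
            + in_gb * INBOUND_PER_GB
            + out_gb * OUTBOUND_PER_GB)

# Hypothetical workload: one instance all month (~730 hours), 50 GB stored,
# a million transactions, 20 GB of inbound and 30 GB of outbound data.
print(f"${monthly_cost(730, 50, 1_000_000, 20, 30):.2f}")  # prints $102.60
```

As the example shows, compute hours dominate a typical bill at these rates.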

The Impact on Microsoft and Cloud Computing

Although Microsoft Windows Azure arrives somewhat late to the burgeoning field of cloud computing as a Platform-as-a-Service offering, Microsoft remains ahead of the enterprises it is hoping to attract as customers. In other words, by eyeing enterprises that still remain skeptical of cloud computing, Microsoft may tap into customers not snatched up by other, more established cloud computing players. No enterprise data center runs solely on Microsoft software, which is likely why the company seems willing to support other programming languages and welcome heterogeneous environments in Azure. Additionally, the Azure platform has a service-level agreement that offers 99.9 percent uptime on the storage side and 99.95 percent uptime on the compute side.

As many have pointed out, Microsoft may be behind Amazon and others for the time being, but there is room for an open platform directed at enterprises, which is Azure’s niche. For more Azure related information visit Nubifer.com.

Assessing Risks in the Cloud

There is no denying that cloud computing is one of the most exciting alternatives to traditional IT functions, as cloud services—from Software-as-a-Service to Platform-as-a-Service—offer augmented collaboration, scale, availability, agility and cost reductions. Cloud services can both simplify and accelerate compliance initiatives and offer greater security, but some have pointed out that outsourcing traditional business and IT functions to cloud service providers doesn’t guarantee that these benefits will be realized.

The risks of outsourcing such services—especially those involving highly-regulated information like constituent data—must be actively managed by organizations or those organizations might increase their business risks rather than transferring or mitigating them. When the processing and storage of constituent information is outsourced, it is not inherently more secure, which brings to mind the boundaries of cloud computing as related to privacy legislation.

By definition, the nature of cloud services lacks clear boundaries and raises valid concerns with privacy legislation. The requirement to protect your constituent information remains your responsibility regardless of what contractual obligations were negotiated with the provider and where the data is located, the cloud included. Some important questions to ask include: Does your service provider outsource any storage functions or data processing to third-parties? Do such third-parties have adequate security programs? Do you know if your service provider—and their service providers—have adequate security programs?

Independent security assessments, such as those performed as part of a SAS 70 or PCI audit, are point-in-time evaluations; this is better than nothing at all, but it still needs to be a consideration. Also worth considering is that the scope of such assessments can be directed at the provider’s discretion, which means they may not provide accurate insight into the provider’s ongoing security activities.

What all of this means is basically that many questions pertaining to Cloud Governance and Enterprise Risk still loom. For example, non-profit organizations looking to possibly migrate fundraising activities and solutions to cloud services need to first look at their own practices, needs and restrictions to identify possible compliance requirements and legal barriers. Because security is a process rather than a product, the technical security of your constituent data is only as strong as your organization’s weakest process. The security of the cloud computing environment is not mutually exclusive to your organization’s internal policies, standards, procedures, processes and guidelines.

When making the decision to put sensitive constituent information into the cloud, it is important to conduct comprehensive initial and ongoing due diligence audits of your business practices and your provider’s practices. For answers to your questions on Cloud Security visit Nubifer.com.

Google’s Continued Innovation of Technology Evolution

Google has the uncanny ability to introduce non-core disruptive innovations while simultaneously defending and expanding its core, and an analysis of the concepts and framework in Clayton Christensen’s book Seeing What’s Next offers insight into how.

Recently, Google introduced free GPS on the Android phone through a strategy that can be described as “sword and shield.” This latest disruptive innovation seeks to beat a current offering serving the “overshot customers,” i.e. the ones who would stop paying for additional performance improvements that historically had commanded a price premium. Google essentially entered the “GPS market” to serve said overshot customers by using a shield: asymmetric skills and motivation in the form of the Android OS, mapping data and a lack of direct revenue expectations. Subsequently, Google transformed its “shield” into a “sword” by disintermediating the map providers and using a revenue-share agreement to incentivize the carriers.

Examples of “incremental to radical,” to use Christensen’s terms, sustaining innovations in which Google sought out the “undershot customers” are GMail and Google’s core search technology. Frustrated with a product’s limitations, these customers are willing to swap their current product for a better one, should it exist. Web-based email solutions and search engines existed before the ones Google introduced, but Google’s solved problems that were frustrating users of other products. For example, users relished GMail’s expansive email quota (compared to the limited quotas they faced before) and also enjoyed the better indexing and relevancy algorithms of the Google search engine. Although Microsoft is blatantly targeting Google with Bing, Google appears unruffled and continues to steadily, if somewhat slowly, invest in its sustaining innovation (such as Caffeine, the next-generation search platform, Gmail Labs, social searches, profiles, etc.) to maintain the revenue stream from its core business.

By spending money on lower-end disruptive innovations and not “cramming” sustaining innovation, Google has managed to thrive where most companies are practically destined to fail. Google even used this strategy to manage the tension between its sustaining and disruptive innovations: according to insiders at Google, Google Wave was created without involving the GMail team. If Google had added wave-like functionality to GMail, it would have been “cramming” sustaining innovation, while innovating outside of email can potentially serve a variety of both undershot and overshot customers.

So what does this mean for AT&T? Basically, AT&T needs to watch its back and keep an eye on Google! Smartphone revenue is predicted to surpass laptop revenue in 2012, after the number of smartphone units sold this year surpassed the number of laptops sold. The current number of Comcast subscribers exceeds 7 million (eight times what it used to be). While Google pays a pricey phone bill for Google Voice, which has 1.4 million users (570,000 of whom use it seven days a week), Google is dedicated to making Google Voice work, and if it does, Google could potentially serve a new breed of overshot customers who want to stay connected in real time but don’t need or want a landline.

Although some argue that Chrome OS is more disruptive, disruptive innovation theory suggests that Chrome OS is created for the breed of overshot customer frustrated with other market solutions at the same level, not for the majority of customers. Should Google be scheming around Chrome OS right now, the business plan would be an expensive, time-consuming one that drains resources. For more information on Google’s continued innovation efforts, please visit Nubifer.com.

Addressing Concerns for Networking in the Cloud

Many concerns arise when moving applications between internal data centers and public clouds. The key networking considerations once applications are transferred to the cloud are addressed below.

Clouds do not differ from the enterprise in that they, too, have unique networking infrastructures supporting flexible and complex multi-tenant environments. Each enterprise has an individual network infrastructure used for accessing servers and allowing applications to communicate between varying components. That unique infrastructure includes address services (like DHCP/DNS), specific addressing (sub-nets), identity/directory services (like LDAP) and firewalls and routing rules.

It is important to remember that the cloud providers have to control their networking in order to route traffic within their infrastructure. The cloud providers’ design is different from enterprise networking in architecture, design and addressing. While this does not pose a problem when doing something stand-alone in the cloud (because it doesn’t matter what the network structure is, as long as it can be accessed over the Internet), discontinuities must be addressed when desiring to extend existing networks and using existing applications.

In terms of addressing, the typical cloud provider assigns a block of addresses as part of the cloud account. Flexiscale and GoGrid, for example, give the user a block of addresses that can be attached to the servers created. In some cases these are external addresses (public addresses reachable from the Internet); in others they are internal. Either way, they are not assigned as part of the user's own addressing plan, which means that even if the resources can be connected to the data center, new routes will need to be built and services altered to allow these "foreign" addresses into the system.
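To make the "foreign address" problem concrete, here is a minimal Python sketch, using entirely hypothetical sub-nets and a hypothetical provider-assigned block, that checks whether a cloud-assigned address falls outside the data center's existing sub-nets (and would therefore require new routes and firewall changes):

```python
import ipaddress

# Hypothetical example values: the data center's existing sub-nets and a
# block of addresses assigned by a cloud provider.
DATA_CENTER_SUBNETS = [
    ipaddress.ip_network("10.1.0.0/16"),
    ipaddress.ip_network("10.2.0.0/16"),
]
CLOUD_ASSIGNED_BLOCK = ipaddress.ip_network("203.0.113.0/28")

def is_foreign(address: str) -> bool:
    """True if the address falls outside every data center sub-net,
    i.e. routing and firewall rules would need updating to admit it."""
    ip = ipaddress.ip_address(address)
    return not any(ip in net for net in DATA_CENTER_SUBNETS)

# Every usable address in the provider-assigned block is "foreign"
# to the data center's routing tables.
foreign = [str(ip) for ip in CLOUD_ASSIGNED_BLOCK.hosts() if is_foreign(str(ip))]
```

In practice the check would run against the real routing tables, but the principle is the same: provider-assigned blocks arrive from outside the enterprise's addressing plan.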

Amazon took a different approach, providing a dynamic system in which an address is assigned each time a server is started. This makes it difficult to build multi-tier applications, since developers must create systems capable of passing changing address information between application components. The new VPC (Virtual Private Cloud) partially solves the problem of connecting to the Amazon cloud, although some key problems persist; other cloud providers continue to look into similar networking capabilities.
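One common workaround for dynamic addressing, sketched below in Python with hypothetical roles and addresses, is a small service registry: each component registers its freshly assigned address at startup, and peers look each other up by role instead of hardcoding IPs:

```python
class AddressRegistry:
    """In-memory stand-in for a shared store (in a real deployment this
    would be a database, queue, or naming service reachable by all tiers)."""

    def __init__(self):
        self._addresses = {}

    def register(self, role: str, address: str) -> None:
        # Called by each component at startup with its freshly assigned IP.
        self._addresses[role] = address

    def lookup(self, role: str) -> str:
        # Called by peers that need to reach a component by role, not by IP.
        return self._addresses[role]

registry = AddressRegistry()
# Simulate server starts; the web tier gets a different IP on each boot.
registry.register("db", "10.250.14.7")
registry.register("web", "10.250.99.3")   # first boot
registry.register("web", "10.250.21.8")   # restarted: new dynamic address
```

The lookup always returns the most recently registered address, which is exactly the behavior a multi-tier application needs when the provider reassigns addresses on every restart.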

Data protection is another key issue concerning networking in the cloud. Within the data center sits a secure perimeter defined and maintained by the IT organization, composed of firewalls, rules and systems that create a protected environment for internal applications. This matters because most applications communicate over ports and services that are not safe for general Internet access. Moving applications into the cloud unmodified can be dangerous, because they were developed for the protected environment of the data center; the application owner or developer usually has to build protection on a per-server basis and then enact corporate protection policies.
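As a hedged illustration of what per-server protection can look like, the fragment below is a firewall policy in iptables-restore format, assuming (hypothetically) that the application listens on TCP 8080 and that administrative SSH is allowed only from a corporate 10.0.0.0/8 range:

```
*filter
:INPUT DROP [0:0]
:FORWARD DROP [0:0]
:OUTPUT ACCEPT [0:0]
# Allow loopback traffic and replies to connections the server initiated
-A INPUT -i lo -j ACCEPT
-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
# SSH only from the (hypothetical) corporate range
-A INPUT -p tcp -s 10.0.0.0/8 --dport 22 -j ACCEPT
# The application port exposed to the Internet
-A INPUT -p tcp --dport 8080 -j ACCEPT
COMMIT
```

Every other inbound port is dropped by default, which approximates on a single server what the data center perimeter provides for free.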

An additional implication of the loss of infrastructure control referenced earlier is that in most clouds the physical interface level cannot be controlled. MAC addresses, like IP addresses, are assigned by the provider and can change each time a server is started, meaning that the identity of the server cannot be based on this normally stable attribute.

Whenever enterprise applications require the support of data center infrastructure, networking issues such as identity and naming services and access to internal databases and other resources come into play. Cloud resources thus need a way to connect back to the data center, and the easiest is a VPN (Virtual Private Network). In creating this solution, it is essential to design the routing to the cloud and to provide a method for cloud applications to "reach back" to the applications and services running in the data center. Ideally, this connection would provide Layer-2 connectivity, since a number of services require it to function properly.
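As a hedged sketch, an OpenVPN-style client configuration for such a "reach back" link from a cloud server might look like the following (the hostname, address range and file names are hypothetical; `dev tap` provides Layer-2 connectivity, whereas `dev tun` would be Layer-3 only):

```
client
# tap = Layer-2 (Ethernet) tunnel; tun would give Layer-3 (IP) only
dev tap
proto udp
remote vpn.example.com 1194
# Credentials issued by the data center's IT organization
ca ca.crt
cert cloud-server.crt
key cloud-server.key
# Send traffic for the data center's internal range back over the tunnel
route 10.1.0.0 255.255.0.0
```

With a Layer-2 tunnel, the cloud server can be bridged onto the data center segment, so broadcast-dependent services (such as some discovery protocols) continue to work.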

In conclusion, networking is a very important part of IT infrastructure, and the cloud adds several new variables to the design and operation of the data center environment. Successfully integrating with the public cloud requires a well-constructed architecture and a solid understanding of the limitations the cloud imposes. Currently, this can be a major barrier to cloud adoption, because enterprises are understandably reluctant to re-architect their network environments or learn the complexities of each cloud provider's underlying infrastructure. In designing a cloud strategy, it is essential to choose a migration path that addresses these issues and protects against expensive engineering projects as well as cloud risks. Please visit Nubifer.com for more information.

Amazon Offers Private Clouds

While Amazon initially resisted offering a private cloud, and there are many advocates of the public cloud, Amazon recently introduced a new Virtual Private Cloud, or VPC. Many bloggers question whether Amazon's VPC is truly a "virtually private" cloud or a "virtual private" cloud, but some believe the VPC may be a way to break down the difficulties facing customers seeking to adopt cloud computing, such as security, ownership and virtualization. The following paragraphs address each of these issues and how Amazon's VPC would alleviate them.

One of the key concerns facing customers adopting cloud computing is the perceived security risk, and the placebo effect of a private cloud may assuage it. The perceived risk stems from customers' past experiences: these customers believe that any connection made using Amazon's VPN must be secure, even if it connects into a series of shared resources. Using Amazon's private cloud, customers deploy and consume applications in an environment they feel is safe and secure.

Amazon's VPC provides customers with a sense of ownership without their actually owning the computing infrastructure. Customers may initially be skeptical about not owning the computing, so it is up to Amazon's marketing engine to provide ample information to alleviate that worry.

As long as their business goals are fully realized with Amazon's VPC, customers need not understand or care about the differences between virtualization and the cloud. In using the VPC, customers can rely on VPN and network virtualization, the existing technology stack they are already comfortable with. In addition, the VPC allows partners to help customers bridge the gap between their on-premises systems and the cloud, creating a hybrid virtualization environment that spans several resources.

Whatever one's view of the public cloud, the customer should be able to choose first whether to enter into cloud computing, and later how to leverage the cloud on their own. For more information about Private Clouds, please visit Nubifer.com.

Get Your Java with Google App Engine

Finally! Google's App Engine service has embraced the Java programming language. The most requested feature for App Engine since its inception, Java support is currently in "testing mode," although Google eventually plans to bring GAE's Java tools up to speed with its current Python support.

As Google's service for hosting scalable and flexible web applications, App Engine is synonymous with cloud computing for Google. Java is one of the most frequently used languages for coding applications on the web, and by adding it Google is filling a major gap in its cloud services plan. Adding Java also lets Google catch up with one of its fiercest competitors in cloud computing, Amazon, whose Web Services platform has provided support for Java virtual machines for some time now.

In addition, Java support opens the possibility of making App Engine a means of running applications for Google's Android mobile platform. Although no plans for Android apps on GAE have been outlined as of yet, it appears Google is preparing an effortless and quick way to develop for Android, as Java is available on the device as well as the server.

With the addition of Java support to Google App Engine, other programming languages such as JavaScript, Ruby and maybe Scala can run on the Java virtual machine as well. Official support for JRuby or other JVM languages is unlikely to arrive in the near future, however, given the experimental status of Java support.

Those wishing to play around with Google App Engine’s new Java support can add their name to the list on the sign up page; the first 10,000 developers will be rewarded with a spot in the testing group.

Along with Java support, the latest update to Google App Engine includes support for cron jobs, which lets programmers easily schedule recurring tasks such as weekly reports. The Secure Data Connector is another new feature; it lets Google App Engine access data behind a firewall. Thirdly, there is a new database import tool, which makes it easier to move large amounts of data into App Engine.
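For the Java runtime, a scheduled task like the weekly report mentioned above would be declared in a WEB-INF/cron.xml file along these lines (the URL and description here are hypothetical; the schedule uses App Engine's English-like syntax):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<cronentries>
  <cron>
    <url>/tasks/weekly-report</url>
    <description>Generate and e-mail the weekly report</description>
    <schedule>every monday 09:00</schedule>
  </cron>
</cronentries>
```

At each scheduled time, App Engine issues an HTTP request to the listed URL, so the recurring task is just an ordinary request handler in the application.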

In summary, by embracing the Java programming language, Google is filling a gap in its cloud services plan and catching up with competitors like Amazon. For more information on Google Apps, please visit Nubifer.com.

Answers to Your Questions on Cloud Connectors for Leading Platforms like Windows Azure Platform

Jeffrey Schwartz and Michael Desmond, both editors of Redmond Developer News, sat down with Robert Wahbe, corporate vice president of Microsoft's Connected Systems Division, at the recent Microsoft Professional Developers Conference (PDC) to talk about Microsoft Azure and its potential impact on the developer ecosystem at Microsoft. Responsible for managing the engineering teams that deliver the company's Web services and modeling platforms, Wahbe is a major advocate of the Azure Services Platform and offers insight into how to build applications for the world of Software-as-a-Service, or as Microsoft calls it, Software plus Services (S+S).

When asked how much of Windows Azure is based on Hyper-V and how much is an entirely new set of technologies, Wahbe answered, “Windows Azure is a natural evolution of our platform. We think it’s going to have a long-term radical impact with customers, partners and developers, but it’s a natural evolution.” Wahbe continued to explain how Azure brings current technologies (i.e. the server, desktop, etc.) into the cloud and is fundamentally built out of Windows Server 2008 and .NET Framework.

Wahbe also referenced the PDC keynote of Microsoft’s chief software architect, Ray Ozzie, in which Ozzie discussed how most applications are not initially created with the idea of scale-out. Explained Wahbe, expanding upon Ozzie’s points, “The notion of stateless front-ends being able to scale out, both across the data center and across data centers requires that you make sure you have the right architectural base. Microsoft will be trying hard to make sure we have the patterns and practices available to developers to get those models [so that they] can be brought onto the premises.”

As an example, Wahbe described a hypothetical situation in which Visual Studio and the .NET Framework are used to build an ASP.NET app, which in turn can be deployed either locally or to Windows Azure. The only extra step when deploying to Windows Azure is to specify additional metadata, such as what kind of SLA you are looking for or how many instances you are going to run. As explained by Wahbe, the metadata is an XML file, an example of an executable model that Microsoft can readily understand. "You can write those models in 'Oslo' using the DSL written in 'M,' targeting Windows Azure in those models," concludes Wahbe.
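For instance, the instance-count metadata Wahbe describes ends up in the deployment's service configuration file. A hedged sketch of such a file is shown below (the service and role names are hypothetical, and the schema namespace is the one published for the early Azure CTP):

```xml
<ServiceConfiguration serviceName="MyAspNetApp"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole1">
    <!-- Declarative scale-out: the fabric runs three instances of the web role -->
    <Instances count="3" />
    <ConfigurationSettings />
  </Role>
</ServiceConfiguration>
```

The point of the example is the declarative style: the developer states how many instances to run, and the platform handles provisioning and load balancing.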

Wahbe answered a firm "yes" when asked if there is a natural fit for applications developed in Oslo, saying that it works because Oslo is "about helping you write applications more productively," and adding that you can write any kind of application, including cloud applications. Although new challenges undoubtedly face development shops, the basic process of writing and deploying code remains the same. According to Wahbe, Microsoft Azure simply provides a new deployment target at a basic level.

As for the differences, developers are going to need to learn a new set of services. An example used by Wahbe is two businesses connecting through a business-to-business messaging app; technology like Windows Communication Foundation can make this an easy process. With the integration of Microsoft Azure, the pros and cons of using the Azure platform and the service bus (which is part of .NET Services) will have to be weighed. Azure "provides you with an out-of-the-box, Internet-scale, pub-sub solution that traverses firewalls," according to Wahbe. And what could be bad about that?

When asked if developers should expect new development interfaces or plug-ins to Visual Studio, Wahbe answered, “You’re going to see some very natural extensions of what’s in Visual Studio today. For example, you’ll see new project types. I wouldn’t call that a new tool … I’d call it a fairly natural extension to the existing tools.” Additionally, Wahbe expressed Microsoft’s desire to deliver tools to developers as soon as possible. “We want to get a CTP [community technology preview] out early and engage in that conversation. Now we can get this thing out broadly, get the feedback, and I think for me, that’s the most powerful way to develop a platform,” explained Wahbe of the importance of developers’ using and subsequently critiquing Azure.

When asked about the possibility of competitors like Amazon and Google gaining early share due to the ambiguous time frame of Azure, Wahbe responded serenely, "The place to start with Amazon is [that] they're a partner. So they've licensed Windows, they've licensed SQL, and we have shared partners. What Amazon is doing, like traditional hosters, is they're taking a lot of the complexity out for our mutual customers around hardware. The heavy lifting that a developer has to do to take that and then build a scale-out service in the cloud and across data centers—that's left to the developer." Wahbe detailed how Microsoft offers base computing and base storage, the foundation of Windows Azure, as well as higher-level services such as the database in the cloud. According to Wahbe, developers no longer have to build an Internet-scale pub-sub system, find a new way to do social networking and contacts, or create reporting services themselves.

In discussing the impact that cloud connecting will have on the cost of development and the management of development processes, Wahbe said, "We think we're removing complexities out of all layers of the stack by doing this in the cloud for you … we'll automatically do all of the configuration so you can get load-balancing across all of your instances. We'll make sure that the data is replicated both for efficiency and also for reliability, both across an individual data center and across multiple data centers. So we think that by doing that, you can now focus much more on what your app is and less on all that application infrastructure." Wahbe predicts that it will be simpler for developers to build applications with the adoption of Microsoft Azure. For more information regarding Windows Azure, please visit Nubifer.com.