
Limitations and Challenges in Cloud Computing for Applications

I was supposed to take part in a discussion about cloud computing at Cloudcamp Bangalore, but due to other commitments I could not attend the event. I had prepared a short writeup about the limitations and challenges of application clouds; here is the full text of it.

Cloud computing is a way of providing dynamically scalable, available resources such as computation and storage as a service to users, who can use them to deploy their applications and data. Cloud computing can handle data in both the public and the private domain. But this seemingly harmless way of building applications has its own set of issues. I am primarily referring to application cloud providers, the kind where you deploy your applications, not storage and service clouds; Google App Engine is a good example of the kind of cloud I am describing. I note some of the issues here.

From the user's perspective:

1. A new, unstructured, non-standard programming paradigm: Each cloud has its own supported programming languages and its own syntax requirements, though most of these clouds expose the typical hashtable-based cache and datastore interfaces. There is an urgent need for standardization of these interfaces and of the methods of programming them. One of the reasons shared hosting environments work so well is that, as a programmer, I know I can move my PHP/Perl code to another server and it will work without too much fuss. Moving from one of the dozen-odd cloud providers to another requires considerable development effort, not to mention time (for businesses, this could spell doom). A look back at history shows languages like SQL and C being standardized to stop exactly this sort of undesirable proliferation.

2. Restrictions on the programming model: For cloud-based applications to be highly available, they must be easy to mirror dynamically across multiple machines. Once mirrored, they can be served on demand behind load-balancing servers, which keeps them highly available and spares users from delays. This is an old trick used by busy websites since the early days of web publishing, but those solutions were custom built for individual sites. Extending the concept to cloud platforms serving thousands of applications requires the platform provider to automate replication and mirroring, which is easier said than done. The process can be made seamless when a program stores as little state as possible. By state, I mean transactional variables, static variables, variables scoped to the entire application, and so on. Such state is almost a given in traditional programming environments but is very hard to come by in cloud-based ones. The unnatural workaround is to keep application state in the datastore or the cache. There are also many other restrictions, such as the lack of privileges to install third-party libraries and the absence of filesystem write access (which forces you to use the datastore, and pay for it).
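As a minimal sketch of what this restriction asks of developers, the snippet below keeps a request handler stateless by pushing its only piece of state into an external key-value store. A plain dict stands in for the provider's cache API here; in a real cloud you would call the provider's client library instead, and the function and key names are invented for illustration.

```python
# A plain dict standing in for the cloud's distributed key-value cache.
cache = {}

def incr_visit_count(user_id: str) -> int:
    """Count visits without keeping any state in the process itself.

    Because every read and write goes through the shared cache, any
    mirrored instance of the application can serve the next request.
    """
    key = f"visits:{user_id}"
    count = cache.get(key, 0) + 1
    cache[key] = count
    return count

# Any replica produces the same result, since no state lives in local
# or static variables:
incr_visit_count("alice")
incr_visit_count("alice")
```

The same pattern applies to session data, counters, and work queues: anything a traditional program would keep in memory moves into the datastore or cache, which is exactly the unnatural shift described above.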
3. A good local debugging experience: A good local development and debugging environment is a must for programming on the cloud, yet most cloud providers do not offer one. There is also a shortage of good IDEs that can help with writing and debugging programs for the cloud. The providers that do offer a local debugging experience do not simulate realistic cloud conditions. Both from my personal experience and from conversations with other developers, I have come to realize that most people run into problems when moving code from their local development servers to the actual cloud, purely because the local development environment behaves inconsistently with the cloud.

4. Appropriate metrics and documentation of programming best practices: On a cloud, a user pays for almost every CPU cycle, so appropriate metrics on processing time and memory usage must be presented to users. A profile of the application listing function names along with the time taken, memory used, and processing cycles consumed would definitely help developers tune their code to optimize processing power. The best solution is for cloud providers to abstract common code patterns into optimized libraries, so that users can be assured they are running the most efficient code for a given operation. An example of this is Apache Pig, which provides a scripting-like interface over Apache Hadoop's HDFS for data analysis. As it stands, most cloud providers do not offer enough statistics or profiling capabilities.

From the provider's perspective, here are the challenges cloud providers have to face:

1. Ensuring availability of the cloud: This is crucial because clouds host critical business applications for which downtime means monetary loss. Effective monitoring and load-balancing solutions have to be built. Most clouds employ virtualization technology to get the most out of every resource; in such setups, tools should be written to detect a resource hog early and move the offending application to a more powerful grid or machine, so that the other users get their share of the cloud without delays.
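The resource-hog detection just described can be sketched as a toy monitor. The threshold and the shape of the input are illustrative assumptions; a real provider would pull live per-tenant metrics from its virtualization layer rather than receive them as a dict.

```python
# Assumed fair share of a host's CPU per tenant (illustrative value).
CPU_SHARE_LIMIT = 0.25

def find_resource_hogs(samples: dict[str, float]) -> list[str]:
    """Given a mapping of app name -> recent CPU fraction, return the
    apps exceeding their fair share, worst offenders first. These are
    the candidates to migrate to a more powerful machine."""
    hogs = [app for app, cpu in samples.items() if cpu > CPU_SHARE_LIMIT]
    return sorted(hogs, key=lambda app: samples[app], reverse=True)

# A batch job eating 70% of the host gets flagged ahead of a shop
# slightly over its share; the quiet blog is left alone.
find_resource_hogs({"blog": 0.05, "batch-job": 0.70, "shop": 0.30})
```

The hard part in practice is not the check itself but acting on it: live-migrating the flagged application without downtime for its users.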
2. Ensuring consistency: Both data and code are replicated on the cloud, and maintaining data consistency is extremely crucial. This is why most transactional updates are not allowed on the cloud. For example, sequence objects, which are almost a given in traditional databases, are not provided, probably because maintaining their state across machines is non-trivial. Problems such as distributed updates, locking, partitioning, and sharding arise when dealing with data, and such constructs ought to be offered to users, since most of them are available in the non-cloud deployment space. Most datastores provided by cloud vendors (except those offering cloud-based database services) do not support relational models, which means all object relations have to be established programmatically. This can lead to bad code, unnecessary joins, cascading problems, and the host of other issues developers faced before relational datastores existed.
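To make the last point concrete, here is a minimal sketch of maintaining and "joining" a relation by hand when the datastore has no relational model. Plain dicts stand in for the datastore's entity kinds, and all entity and field names are invented for illustration.

```python
# Two "entity kinds" in a schemaless key-value datastore.
authors = {"a1": {"name": "Ada"}}
posts = {
    "p1": {"title": "On Engines", "author_id": "a1"},  # foreign key kept by hand
    "p2": {"title": "Notes", "author_id": "a1"},
}

def posts_by_author(author_id: str) -> list[str]:
    """Return post titles for one author.

    With no relational model, a full scan in application code replaces
    what SQL would do with a single indexed join. Nothing stops a post
    from referencing a deleted author: referential integrity is now the
    programmer's problem.
    """
    return [p["title"] for p in posts.values() if p["author_id"] == author_id]
```

Every such hand-rolled join is a place where the bad-code and cascading problems mentioned above can creep in.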
3. Program verification: One of the biggest worries about deploying applications on the cloud is the correctness of the program in execution. Erroneous conditions such as infinite loops can not only put the machine at risk of being overloaded and unavailable, but also cost the user a significant amount of money. Tools such as static analysis should be used on code uploaded to the cloud, checking for infinite loops, possible race conditions, null references, unreachable code, and so on. The uploaded code should also be optimized, or users should be given suggestions on how to optimize it to best utilize the available resources.

Conclusion: The cloud should become a completely non-restrictive platform for applications, with no restrictions on the constructs, functionality, and privileges available. It should also be dead simple to move everyday applications onto the cloud without too much rework. This could mean writing migration utilities, import/export options, and other artifacts that make the transition to a cloud much easier. This will prove essential: most live applications, at least currently, do not run on a cloud, and helping them migrate easily will mean more revenue and adoption.
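The static-analysis checks proposed under program verification can be illustrated with a toy example built on Python's standard `ast` module: a check that flags the most obvious kind of infinite loop, a `while True:` with no `break` inside. Real analyzers are far more elaborate; this only demonstrates the idea a provider could apply to uploaded code.

```python
import ast

def has_obvious_infinite_loop(source: str) -> bool:
    """Flag `while True:` loops that contain no `break` statement.

    This is deliberately naive: it misses loops that never terminate for
    subtler reasons, and a `break` in a nested loop would wrongly
    satisfy it. It only shows the shape of such a check.
    """
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.While):
            always_true = (
                isinstance(node.test, ast.Constant) and node.test.value is True
            )
            contains_break = any(
                isinstance(n, ast.Break) for n in ast.walk(node)
            )
            if always_true and not contains_break:
                return True
    return False
```

A provider running even shallow checks like this at upload time could reject or warn about code that would otherwise burn billable cycles forever.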

Cloud Computing: What are the limitations?


What complications and limitations exist for cloud computing? This could be for the technology in general or for end users.

Brielle Nikaido asked on Feb. 11, 2011



8 Answers


Benjamin Breeland Enterprise Management Consultant, ca technologies Posted on Feb. 13, 2011

I do not believe there are any limitations to cloud computing. The biggest challenge to cloud computing is trust. Every day, citizens of the US board planes, take taxis, and use services with little regard for what happens under the hood. When was the last time anyone delayed a plane because a customer needed to check the flaps? We go online, find the flight (the service), pay the price, show up, and expect to reach our destination on time. Now compare that to the cost of maintaining a private plane, pilot, and crew: I think there is no comparison. This is the promise of cloud computing. Once we get past the trust issue, the path is clear.

Chad Massaker Technologist, Blogger, Cheerleader & CEO, Carceron - Most Recommended IT Firm in Atlanta on Linkedin.com Posted on Feb. 13, 2011

Benjamin is correct for most businesses, so long as the needs are simple. If you perform a cost-benefit analysis of cloud vs. on-premise solutions for basic computing infrastructure (email, file sharing, etc.), the cloud wins every time, up until the business hits about 50 users (+/- 10). Beyond trust, the second most prevalent limitation is customization and flexibility. As a business grows, it will inevitably want its systems to communicate internally more. For example: a CRM that talks to a VoIP phone system so that users can dial a number directly from the CRM, or so that a customer's account pops up automatically from the CRM when that customer calls. If you have the servers on site, you can purchase off-the-shelf software, or have some middleware custom coded, to accomplish the integration. If these same systems are in the cloud, it may not be so easy. Many cloud systems are closed, and those that do have open APIs are usually still too limited to achieve everything someone wants. Ultimately, this lack of flexibility has the potential to affect one's ability to automate business processes.

Stanley Dakin IT Engineer, Pensionskassen DIP Posted on Feb. 11, 2011

Hi Brielle. Strict control of your sensitive data might be the biggest complication. Of course, you must make sure that your cloud vendor has a high level of security. But even then, you do not really know where in the world your data is actually stored. That, in essence, is the difference between the cloud and traditional hosting. I know of a case where a public office, handling personal information about school children, wanted to use a cloud-based document system. They checked out the security of the vendor and found no problems. However, the national board for data protection did not give permission, because there was no way to make sure the data was not hosted in blacklisted countries.

Dennis Morgan CEO/Consultant, DK Morgan Group Posted on Feb. 15, 2011

All excellent points



Rasib Hassan Khan Master's Thesis Researcher, Ericsson Nomadic Research Lab, Helsinki, Finland Posted on Feb. 15, 2011

When referring to "clouds", we should first clarify the scenario: is it a private or a public cloud we are talking about? Breaking the focus into these two separate scenarios, we see different limitations. For public clouds, as has been mentioned above, the main issue is trust. Moving corporate, private, sensitive data to a physical location that does not really belong to me requires the provider to establish a certain level of trust in the market. The loss of physical control over the data means that newcomers in the provider arena may have to come up with new market strategies, rather than simply waiting for time to pass and credibility to be established. Data lock-in, and clients' generalized fear of not knowing where their information resides, can also be traced directly back to trust. Security obfuscation can also be considered a primary issue for public clouds, since network isolation and inter-grid security are always a statement of trust between the provider and the client. In terms of service availability and portability, certain providers expose APIs for deploying services, but these still lack flexibility and portability from provider to provider.

In the case of private clouds, the limitations lie mainly in deployment-time complications. So far, open-source cloud platforms have not been able to achieve flawless operation, as can be seen from the development communities and their activity. However, dynamic relocation and resource sharing from private to public clouds is a very scalable and applicable solution for any organisation. Private clouds give the organisation better resource usage, but at the cost of requiring a fairly deft admin team for management. Taking public cloud computing as a rule-of-thumb solution for every organisation might not be entirely sound. It can be a good primary solution for startups and for small to medium-sized organisations. But as an organisation's operations and information base scale, and given the issues of trust, data lock-in, and security obfuscation, moving to a private cloud deployment might be the more positive approach.

Dennis Morgan CEO/Consultant, DK Morgan Group Posted on Feb. 15, 2011

Do corporations of any size today really know where all their data resides with on-premise equipment? Good IT departments have this well under control; the rest, I am not so sure about. So that begs the question about public clouds: who do you trust? Your onsite IT people, or the IT people managing the public cloud? I do not see the difference here. Maybe some of the other contributors can address this question?

JP Morgenthal Cloud Evangelist, Smartronix Posted on March 29, 2011

Cloud computing introduces issues of governance for the organization. In the case of private cloud, convergence of hardware brings with it a convergence of the skills needed to manage and operate the new platform. Storage is no longer allocated on an application-by-application basis, but assigned to virtual resource pools that need to grow and shrink over time. Moreover, there is greater opportunity to exhaust these resources quickly. For SaaS, governance should fall to the business user group responsible for selecting and using the tool. For example, salesforce.com would once have been managed as a CRM application by IT, but can now be directly administered by the sales department. PaaS governance will most likely fall under the domain of application/software engineering, and IaaS will still most likely rely on traditional data-center IT for ongoing management. Additionally, depending on the platform, there is a plethora of new skills to be learned. If using a public cloud service provider, it's important to understand the nuances of your applications running on their platforms. For private cloud, managing a virtual environment so that multi-tenancy works without wasting resources is a new skill compared to single-tenancy, machine-based environments.
