
Assignment - 2

Cloud Application Development

Submitted By

Astha Kumari
Roll No.: R110219027
SAP ID: 500075792
Semester VI
B. Tech. (Computer Sc. and Engineering)
Specialization in Cloud Computing and Virtualization

Submitted To

Mr. Harvinder Singh

School of Computer Science


UNIVERSITY OF PETROLEUM AND ENERGY STUDIES
Dehradun - 248007
2022-23
Q 1 – What do you mean by Task Computing? Discuss its characteristics, scenarios, and frameworks in detail.

Answer 1- The general trend of computing is progressing towards the vision of Pervasive Computing (or Ubiquitous Computing), in which technologies are seamlessly embedded everywhere and dissolve into the fabric of everyday life. However, the challenge of adding intelligent technologies to our lives is to “support our activities, complement our skills, and add to our pleasure, convenience, and accomplishments, but not to our stress”.
As spaces increasingly embedded with technologies become more complex and sophisticated, everyday users often find themselves spending more time and effort configuring and instructing these spaces. A smart environment is “a physical world that is richly and invisibly interwoven with sensors, actuators, displays, and computational elements, embedded seamlessly in the everyday objects of our lives, and connected through a continuous network”.
From the users’ perspective, in order to improve the acceptance and usability of pervasive systems, there should be a solution that relieves the complexity, invisibility, and inconsistency problems of pervasive interactive systems.
Two solutions have been proposed for these problems: context-aware computing and task-driven computing. Context-aware computing aims to enable computer systems to understand users’ situations and thus provide relevant information, services, and behaviors to the users. Task-driven computing shifts the focus to what users want to do (i.e., the tasks at hand) rather than to the specific means of doing those tasks.

Developers create distributed applications in terms of task instances, the collective execution of which describes a running application.
These tasks, together with all their required dependencies (data files and libraries), are grouped and managed through the AnekaApplication class, which is specialized to support the execution of tasks. Two other components, AnekaTask and TaskManager, constitute the client-side view of a task-based application.
The former is the runtime wrapper Aneka uses to represent a task within the middleware; the latter is the underlying component that interacts with Aneka, submits the tasks, monitors their execution, and collects the results.
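The division of labor described above can be illustrated with a small conceptual sketch. Note that Aneka's real client API is a .NET library; the Python names below (SquareTask, ToyTaskManager) are purely illustrative stand-ins for the task/manager roles, not Aneka's actual classes.

```python
from concurrent.futures import ThreadPoolExecutor

class SquareTask:
    """A self-contained unit of work (playing the role of a task instance)."""
    def __init__(self, x):
        self.x = x

    def execute(self):
        return self.x * self.x

class ToyTaskManager:
    """Submits tasks for execution, tracks them, and collects the results,
    loosely mirroring the role the TaskManager plays on the client side."""
    def __init__(self, workers=4):
        self.pool = ThreadPoolExecutor(max_workers=workers)
        self.futures = []

    def submit(self, task):
        self.futures.append(self.pool.submit(task.execute))

    def collect(self):
        return [f.result() for f in self.futures]

manager = ToyTaskManager()
for x in range(5):
    manager.submit(SquareTask(x))
print(manager.collect())  # [0, 1, 4, 9, 16]
```

The key idea is the separation of concerns: the task only knows how to execute itself, while the manager owns submission, monitoring, and result collection.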
Framework - Frameworks for task computing support the execution of task-based applications on distributed computing resources, including clouds.
Some popular software systems that support the task-computing framework are:
1. Condor - Condor is the most widely used and longest-lived middleware for managing clusters, idle workstations, and collections of clusters.

2. Globus Toolkit - The Globus Toolkit is a collection of technologies that enable grid computing.

3. Sun Grid Engine (SGE) - Sun Grid Engine (SGE), now Oracle Grid Engine, is middleware for
workload and distributed resource management.

4. BOINC - The Berkeley Open Infrastructure for Network Computing (BOINC) is a framework for volunteer and grid computing. It allows desktop machines to be turned into volunteer computing nodes that are leveraged to run jobs when those machines become idle.

5. Nimrod/G - A tool for the automated modeling and execution of parameter sweep applications over global computational grids.

Q 2 – What is the MapReduce programming model? Discuss the MapReduce computation workflow and its different components in detail.
Answer 2- MapReduce is a programming model, or pattern, within the Hadoop framework that is used to access big data stored in the Hadoop Distributed File System (HDFS). It is a core component, integral to the functioning of the Hadoop framework.

Applications deal with large amounts of data:

• Data needs to be efficiently stored, made accessible, indexed, and analyzed.
• Information accumulates and grows over time at increasing rates.
• Processing it requires distributed computing.

MapReduce is a programming platform for processing large quantities of data.
• It expresses the computation logic of an application in two simple functions:
– Map
– Reduce

• The model is expressed in the form of two functions, which are defined as follows:
– map(k1, v1) → list(k2, v2)
– reduce(k2, list(v2)) → list(v2)
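The two signatures above can be made concrete with the classic word-count example. This is a minimal in-process sketch of the model, not Hadoop code: the map function emits intermediate (word, 1) pairs, a shuffle step groups values by key, and the reduce function sums each key's list.

```python
from itertools import groupby
from operator import itemgetter

# map: (k1, v1) -> list((k2, v2))
def map_fn(doc_id, text):
    return [(word, 1) for word in text.split()]

# reduce: (k2, list(v2)) -> list(v2)
def reduce_fn(word, counts):
    return [sum(counts)]

documents = {1: "the quick brown fox", 2: "the lazy dog and the fox"}

# Map phase: emit intermediate (word, 1) pairs from every document.
intermediate = []
for k1, v1 in documents.items():
    intermediate.extend(map_fn(k1, v1))

# Shuffle phase: group all values belonging to the same key.
intermediate.sort(key=itemgetter(0))
grouped = {k: [v for _, v in g]
           for k, g in groupby(intermediate, key=itemgetter(0))}

# Reduce phase: aggregate each key's value list.
result = {k2: reduce_fn(k2, vs)[0] for k2, vs in grouped.items()}
print(result)  # {'and': 1, 'brown': 1, 'dog': 1, 'fox': 2, 'lazy': 1, 'quick': 1, 'the': 3}
```

In a real deployment the map and reduce phases run in parallel across many machines, and the shuffle moves intermediate data over the network; the logic, however, is exactly this.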

MapReduce workflow – The input data is split into chunks; each chunk is processed by a map task that emits intermediate key-value pairs; the pairs are shuffled and sorted by key; and reduce tasks aggregate each key's values to produce the final output.

Components of MapReduce programming model –

A typical MapReduce task has the following components:

Mapper: Processes the initial input and enables the user to emit the output into a dictionary to be used as input for the combiner or reducer.
Combiner Factory: Creates and manages combiners for each key emitted into the output by the mapper.

Combiner: Works as a local reducer on the node where the Mapper runs; the Mapper's output is combined locally to minimize traffic between the Mapper and the Reducer.

Reducer Factory: Creates and manages reducers for each key emitted into the output by the mapper or combiner.

Reducer: Processes all the intermediate key-value pairs generated by the Mapper or combined by the Combiner, aggregating, performing calculations, or applying different operations to produce the reduced output.

Key Filter: As the name indicates, the Key Filter allows the user to filter cache data based on its keys before it is sent to the Mapper. The Key Filter is called during the Mapper phase. If it returns true, the Map is executed on the key; if it returns false, the Mapper skips the key and moves to the next one from the cache.

Trackable Task: This component lets you keep track of the progress and status of the task as it executes, and lets you fetch and enumerate the output of the task.

Output: The output is stored in memory on the server side. It can be enumerated using the Trackable Task instance in the client application.
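How these components fit together can be sketched as a toy single-process pipeline. All names here are illustrative (not any framework's real API), with the key filter applied before the map and a per-node combiner pre-aggregating results before the reducer merges them.

```python
from collections import defaultdict

def key_filter(key):
    return not key.startswith("_")   # skip internal/hidden keys

def mapper(key, value):
    return [(len(value), 1)]         # emit (word length, 1) pairs

def combiner(key, values):
    return sum(values)               # local pre-aggregation per node

def reducer(key, values):
    return sum(values)               # global aggregation

# Two "nodes", each holding a slice of the cached key-value data.
nodes = [
    {"a": "cat", "b": "horse", "_tmp": "ignored"},
    {"c": "dog", "d": "zebra"},
]

# Each node maps its own data and combines locally before sending.
combined_per_node = []
for data in nodes:
    local = defaultdict(list)
    for k, v in data.items():
        if key_filter(k):            # Key Filter runs before the map
            for k2, v2 in mapper(k, v):
                local[k2].append(v2)
    combined_per_node.append({k2: combiner(k2, vs) for k2, vs in local.items()})

# The Reducer merges the already-combined partial results from all nodes.
merged = defaultdict(list)
for partial in combined_per_node:
    for k2, v in partial.items():
        merged[k2].append(v)
output = {k2: reducer(k2, vs) for k2, vs in merged.items()}
print(output)  # {3: 2, 5: 2}
```

Because the combiner runs on each node before anything crosses the network, each node ships one value per key instead of its full list of intermediate pairs, which is exactly the traffic reduction described above.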

Q 3 – How does the development of the Aneka cloud platform impact the growth of the healthcare sector in India? Justify your answer with real-world scenarios and use cases.

Answer 3- There are many kinds of information integration and informatics frameworks for cloud-based healthcare applications. The data collected by an Electronic Health Record system needs to be smart and connected, so informatics is used to connect data from different databases.

Traditional Electronic Health Record systems are based on different technologies, languages, and Electronic Health Record standards. An Electronic Health Record system stores data based on interactions between patient and provider.

With scalable cloud infrastructures and distributed, heterogeneous healthcare systems available, there is a need to develop advanced healthcare applications.
Medsphere provides a full suite of solutions for acute care and inpatient behavioral health hospitals. From enterprise EHR, RCM, and supply chain management to outsourced managed IT services and an emergency-department-specific EHR, all of their solutions feature easy subscription-model pricing with little or no upfront fees.
The integrated CareVue EHR system incorporates clinical, financial, and patient accounting solutions for acute care and inpatient behavioral health environments, as well as integrated delivery networks.
Physician practices choose ChartLogic EHR and Practice Management software, built for speed and efficiency, for its proprietary command-and-control methodology, which allows users to create notes quickly and efficiently.
