Q.1.a) What are the various legal issues faced when using the cloud model? Discuss in detail. 7 Marks
ANSWER:
● Cloud computing is inherently stateless, with servers located in different locations
and countries, which creates issues of conflict of laws, applicable law, and
jurisdiction.
● Cloud services usually involve multiple parties, which lets each shift onus and
liability onto another. The liability and responsibility of sub-contractors is often
limited or disclaimed entirely.
● There is often no contractual privity between the parties, which makes it difficult
for the client to hold a provider liable for a breach.
● The customer should be given the right to conduct due diligence and to
understand the model of delivery of services.
● Data from different users is usually stored on a single virtual server.
● Cloud services are usually provided under standard service level agreements
(SLAs), which are often non-negotiable.
● Even if negotiating the SLA is not possible, a higher degree of reporting should be
integrated into the agreement.
Audit Trail
● As data is continuously moving and flowing in cloud services, the client should
have the right to know where and by whom its data is stored, accessed, transferred,
and altered.
● Confirm whether or not the vendor provides audit trail rights.
Exit Issues
● In case a user has to change providers in the future, the options for portability and
interoperability are critical issues to be considered.
● In the event that the cloud vendor's system is hacked, does the owner of the data
have the right to move against the vendor to claim lost profits?
Jurisdictional Issues
● In cloud services the location of data is usually uncertain. The owner of the data is
not aware of the country where the data is stored. The physical location of the data
raises the question of which law governs and which courts have jurisdiction. It is
important to be aware of the prevailing law in that particular nation.
● If a dispute arises, what will be the place of jurisdiction? The owner of the data
should be aware of which country's court system will govern a conflict arising
between the parties.
● For example, suppose the owner is based in India and the cloud service provider is
based in the US. The vendor would prefer the jurisdiction of an American court, but
can the owner afford to contest the matter in an American court?
ANSWER:
Advantages:
Easy implementation: Cloud hosting allows businesses to retain the same applications
and business processes without having to deal with the backend technicalities.
Readily manageable over the Internet, a cloud infrastructure can be accessed by
enterprises easily and quickly.
Cost per head: Overhead technology costs are kept at a minimum with cloud hosting
services, enabling businesses to use the extra time and resources for improving the
company infrastructure.
Flexibility for growth: The cloud is easily scalable so companies can add or subtract
resources based on their needs. As companies grow, their system will grow with
them.
Efficient recovery: Cloud computing delivers faster and more accurate retrievals of
applications and data. With less downtime, it is the most efficient recovery plan.
Disadvantages:
No longer in control: When moving services to the cloud, you are handing over your
data and information. Companies that have an in-house IT staff will be unable to
handle all issues on their own, although some managed service providers offer 24/7
live help desks that can rectify problems quickly.
May not get all the features: Not all cloud services are the same. Some cloud
providers tend to offer limited versions and enable the most popular features only, so
you may not receive every feature or customization you want. Before signing up,
make sure you know what your cloud service provider offers.
Doesn't mean you should do away with servers: You may have fewer servers to
handle, which means less for your IT staff to manage, but that doesn't mean you can
let go of all your servers and staff. While it may seem costly to keep data centers
alongside a cloud infrastructure, redundancy is key for backup and recovery.
Bandwidth issues: For ideal performance, clients have to plan accordingly and not
pack large amounts of servers and storage devices into a small set of data centers.
ANSWER:
The main challenge to cloud computing is how it addresses the security and
privacy concerns of businesses thinking of adopting it. The fact that valuable
enterprise data will reside outside the corporate firewall raises serious concerns.
Hacking and various attacks on cloud infrastructure would affect multiple clients even
if only one site is attacked. These risks can be mitigated by using security applications,
encrypted file systems, data loss prevention software, and security hardware that
tracks unusual behavior across servers.
It is difficult to assess the costs involved due to the on-demand nature of the services.
Budgeting and assessment of the cost will be very difficult unless the provider has
some good and comparable benchmarks to offer. The service-level agreements (SLAs)
of the provider are not adequate to guarantee the availability and scalability.
Businesses will be reluctant to switch to cloud without a strong service quality
guarantee.
Businesses should have the leverage of migrating in and out of the cloud and
switching providers whenever they want, and there should be no lock-in period.
Cloud computing services should have the capability to integrate smoothly with the
on-premise IT.
Cloud providers still lack round-the-clock service; this results in frequent outages. It is
important to monitor the service being provided using internal or third-party tools. It
is vital to have plans to supervise usage, SLAs, performance, robustness, and business
dependency of these services.
Businesses can save money on hardware but they have to spend more for the
bandwidth. This can be a low cost for smaller applications but can be significantly
high for the data-intensive applications. Delivering intensive and complex data over
the network requires sufficient bandwidth. Because of this, many businesses are
waiting for a reduced cost before switching to the cloud.
ANSWER:
1. Gossip Protocol
● It is a communication protocol in which each node periodically passes what it
knows on to a few randomly chosen peers, so that information spreads through the
network the way gossip does (see the sketch after this list).
● It is used to multicast data to the nodes in a network via a router.
2. CLNP (Connectionless Network Protocol)
● It works much like IP; the basic difference is that a CLNP address is 20 bytes,
compared to IP's 4 bytes.
● It operates on the network layer, just like other management protocols such as
ICMP.
3. RIP (Routing Information Protocol)
● Routers communicate with each other, and a routing algorithm is used to choose
the path along which information is routed.
● RIP is used to exchange information about paths/routing.
4. SSH (Secure Shell)
● It was a replacement for Telnet, Rlogin, and RSH because it encrypts the login ID
and password at login time.
● The decryption algorithm is placed on a remote server for decryption of the user
ID and password.
5. Data-Link Layer Flow Control
● Packets are lost when new packets arrive via a switch faster than they can be
forwarded; flow control at the data-link layer is the solution to this.
● Advantages – it handles packet traffic at the data-link layer, lowers cost, and
buffers (stores) the packets.
6. XMPP (Extensible Messaging and Presence Protocol)
● Developed by the Jabber open-source community in 1999; it is a freeware protocol.
● Used for publish-subscribe systems and for video and file transfer in the cloud.
7. AMQP (Advanced Message Queuing Protocol)
● Used in the cloud and nowadays also used by Red Hat, Microsoft, Apache, etc.
8. MTP (Media Transfer Protocol)
● Transfers media files, audio files, and metadata to and from a portable device over
a cloud.
● Disadvantage – it is not meant to transmit video files, so AVTP is used as a solution
for that.
● It lets initiators identify the capability of a device with respect to file format and
functionality.
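As a rough illustration of how gossip dissemination works, here is a minimal C# simulation (the node count, random seed, and class name are arbitrary choices for the demo, not part of any standard):

using System;
using System.Collections.Generic;
using System.Linq;

// Toy gossip rounds: every node that already knows an update forwards it to
// one randomly chosen peer per round, until all nodes are informed.
class GossipDemo
{
    static void Main()
    {
        var rng = new Random(42);
        const int nodeCount = 16;
        var informed = new HashSet<int> { 0 };   // node 0 starts with the update

        int round = 0;
        while (informed.Count < nodeCount)
        {
            foreach (int node in informed.ToArray())   // snapshot before mutating
                informed.Add(rng.Next(nodeCount));     // gossip to a random peer
            Console.WriteLine($"Round {++round}: {informed.Count}/{nodeCount} nodes informed");
        }
    }
}

The number of informed nodes tends to grow quickly each round, which is why gossip-style protocols spread information efficiently through large networks.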
ANSWER:
Public Cloud
Public Cloud allows systems and services to be easily accessible to the general public.
IT giants such as Google, Amazon, and Microsoft offer cloud services via the
Internet.
Benefits
There are many benefits of deploying cloud as the public cloud model. Some of those
benefits are:
Cost Effective
Since the public cloud shares the same resources among a large number of customers,
it turns out to be inexpensive.
Reliability
The public cloud employs a large number of resources from different locations. If any
of the resources fails, the public cloud can employ another one.
Flexibility
The public cloud can smoothly integrate with private cloud, which gives customers a
flexible approach.
Location Independence
Public cloud services are delivered through the Internet, ensuring location
independence.
High Scalability
Cloud resources are made available on demand from a pool of resources, i.e., they can
be scaled up or down according to the requirement.
Disadvantages
Here are some disadvantages of public cloud model:
Low Security
In the public cloud model, data is hosted off-site and resources are shared publicly;
therefore it does not ensure a high level of security.
Less Customizable
It is comparatively less customizable than private cloud.
Private Cloud
Private Cloud allows systems and services to be accessible within an organization.
The private cloud is operated only within a single organization. However, it may be
managed internally by the organization itself or by a third party.
Benefits
There are many benefits of deploying cloud as the private cloud model. Some of those
benefits are:
More Control
An organization has more control over its resources and hardware in the private
cloud than in the public cloud, because the cloud is accessed only within the
organization.
Disadvantages
Here are the disadvantages of using private cloud model:
High Priced
Purchasing new hardware in order to fulfill the demand is a costly transaction.
Limited Scalability
The private cloud can be scaled only within the capacity of internally hosted resources.
Additional Skills
In order to maintain the cloud deployment, an organization requires skilled expertise.
Hybrid Cloud
Hybrid Cloud is a mixture of public and private cloud. Non-critical activities are
performed using the public cloud, while critical activities are performed using the
private cloud.
Benefits
There are many benefits of deploying cloud as the hybrid cloud model. Some of those
benefits are:
Scalability
It offers the features of both public cloud scalability and private cloud scalability.
Flexibility
It offers secure resources and scalable public resources.
Cost Efficiency
Public clouds are more cost effective than private ones. Therefore, hybrid clouds can
be cost saving.
Security
The private cloud in hybrid cloud ensures higher degree of security.
Disadvantages
Networking Issues
Networking becomes complex due to the presence of both private and public clouds.
Security Compliance
It is necessary to ensure that cloud services are compliant with security policies of
the organization.
Infrastructure Dependency
The hybrid cloud model is dependent on internal IT infrastructure; therefore it is
necessary to ensure redundancy across data centers.
Community Cloud
Community Cloud allows systems and services to be accessible by a group of
organizations that share the same infrastructure.
Benefits
Cost Effective
Community cloud offers the same advantages as a private cloud at low cost.
Security
The community cloud is comparatively more secure than the public cloud but less
secure than the private cloud.
Issues
➢ Since all data is located in one place, one must be careful when storing data in a
community cloud, because it might be accessible to others.
ANSWER:
Cloud computing architecture can be broadly divided into two parts, connected
through a network, usually the Internet:
Front End
The front end refers to the client part of the cloud computing system. It consists of
the interfaces and applications required to access the cloud computing platform,
e.g., a web browser.
Back End
The back end refers to the cloud itself. It consists of all the resources required to
provide cloud computing services. It comprises huge data storage, virtual
machines, security mechanisms, services, deployment models, servers, etc.
Hypervisor
A hypervisor is firmware or a low-level program that acts as a Virtual Machine
Manager. It allows a single physical instance of cloud resources to be shared
between several tenants.
Management Software
It helps to maintain and configure the infrastructure.
Deployment Software
It helps to deploy and integrate the application on the cloud.
Network
It is the key component of cloud infrastructure, connecting cloud services over the
Internet. It is also possible to deliver the network as a utility over the Internet,
which means the customer can customize the network route and protocol.
Server
The server supports resource sharing and offers other services such as resource
allocation and de-allocation, monitoring of resources, providing security,
etc.
Storage
Cloud keeps multiple replicas of storage. If one of the storage resources fails, then it
can be extracted from another one, which makes cloud computing more reliable.
Q.4.b) List and explain three service models of cloud computing. 6 Marks
ANSWER:
The three service models of cloud computing are:
1. Infrastructure as a Service (IaaS): provides access to fundamental computing
resources such as virtual machines, storage, and networks; the customer manages the
operating system and applications.
2. Platform as a Service (PaaS): provides a runtime environment for developing,
testing, and deploying applications without managing the underlying infrastructure.
3. Software as a Service (SaaS): provides complete software applications to end users
as a service over the Internet, e.g., web-based e-mail.
Q.5.a) What is the use and need of big data? Explain its characteristics. 7 Marks
ANSWER:
● When big data is effectively and efficiently captured, processed, and analyzed,
companies are able to gain a more complete understanding of their business,
customers, products, competitors, etc. which can lead to efficiency improvements,
increased sales, lower costs, better customer service, and/or improved products
and services.
● Perhaps more importantly, telemetry from products in use also reveals usage
patterns, failure rates, and other opportunities for product improvement that can
reduce development and assembly costs.
● The proliferation of smart phones and other GPS devices offers advertisers an
opportunity to target consumers when they are in close proximity to a store, a
coffee shop or a restaurant. This opens up new revenue for service providers and
offers many businesses a chance to target new customers.
● This can enable much more effective micro customer segmentation and targeted
marketing campaigns, as well as improve supply chain efficiencies.
● Other widely-cited examples of the effective use of big data exist in the following
areas:
■ Use of social media content in order to better and more quickly understand
customer sentiment about you/your customers, and improve products,
services, and customer interaction.
(i) Volume – The name 'Big Data' itself relates to a size which is enormous. The size of
data plays a very crucial role in determining its value. Whether particular data can
actually be considered Big Data or not also depends upon the volume of data. Hence,
'Volume' is one characteristic which needs to be considered while dealing with
'Big Data'.
(ii) Variety – Variety refers to heterogeneous sources and the nature of data, both
structured and unstructured. In earlier days, spreadsheets and databases were the
only sources of data considered by most applications. Nowadays, data in the form of
emails, photos, videos, monitoring devices, PDFs, audio, etc. is also considered in
analysis applications. This variety of unstructured data poses certain issues for
storing, mining, and analysing data.
(iii) Velocity – Big Data velocity deals with the speed at which data flows in from
sources like business processes, application logs, networks, social media sites,
sensors, mobile devices, etc. The flow of data is massive and continuous.
(iv) Variability – This refers to the inconsistency which can be shown by the data at
times, thus hampering the process of handling and managing the data
effectively.
Q.5.b) What is Hadoop? Why do we need it? How does it differ from an RDBMS? 7 Marks
ANSWER:
Hadoop is an open-source framework for storing very large datasets and processing
them in a distributed manner across clusters of commodity hardware.
Need of Hadoop
Hadoop makes data sharing easy with its high sharing ability:
Organizations use big data to improve the functionality of each and every
business unit, including research, design, development, marketing, advertising,
sales, and customer handling. Such data is difficult to share across different platforms.
Hadoop is used to create a data pond: a repository of data from various intrinsic
or extrinsic sources.
As compared to traditional tools, Hadoop provides more accurate facts and figures,
and supports advanced features like data visualization and predictive analytics.
Hadoop is considered affordable for both enterprises and small businesses, which
makes it an attractive solution with endless potential. With the passage of time,
companies and enterprises are getting closer to Hadoop. They are moving to
implement big data to support their marketing and other efforts and resources.
Difference from an RDBMS: an RDBMS works with structured data under a fixed
schema (schema-on-write) and is suited to low-latency transactional queries, whereas
Hadoop handles structured, semi-structured, and unstructured data (schema-on-read)
and scales out on commodity hardware for batch processing of very large volumes.
ANSWER:
What is MapReduce?
MapReduce is a processing technique and a programming model for distributed
computing based on Java. The MapReduce algorithm contains two important tasks,
namely Map and Reduce. Map takes a set of data and converts it into another set of
data, where individual elements are broken down into tuples (key/value pairs).
Secondly, reduce task, which takes the output from a map as an input and combines
those data tuples into a smaller set of tuples. As the sequence of the name
MapReduce implies, the reduce task is always performed after the map job.
The whole process therefore goes through two phases:
1. Map phase
2. Reduce phase
Input Splits:
The input to a MapReduce job is divided into fixed-size pieces called input splits. An
input split is a chunk of the input that is consumed by a single map.
Mapping
This is the very first phase in the execution of a map-reduce program. In this phase,
data in each split is passed to a mapping function to produce output values. In our
example, the job of the mapping phase is to count the number of occurrences of each
word from the input splits (described above) and prepare a list in the form of
<word, frequency>
Shuffling
This phase consumes the output of the Mapping phase. Its task is to consolidate the
relevant records from the Mapping phase output. In our example, the same words are
clubbed together along with their respective frequency.
Reducing
In this phase, output values from Shuffling phase are aggregated. This phase combines
values from Shuffling phase and returns a single output value. In short, this phase
summarizes the complete dataset.
Consider you have the following input data for your MapReduce program:
Welcome to Hadoop Class
Hadoop is good
Hadoop is bad
The final output of the MapReduce task is:
bad 1
Class 1
good 1
Hadoop 3
is 2
to 1
Welcome 1
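The phases can be imitated in miniature. Below is an in-memory C# sketch of this word-count example (illustrative only; real MapReduce jobs are typically written in Java against the Hadoop API):

using System;
using System.Linq;

class WordCount
{
    static void Main()
    {
        // Input splits: each line is consumed by one map call.
        string[] splits = { "Welcome to Hadoop Class", "Hadoop is good", "Hadoop is bad" };

        // Mapping: emit a <word, 1> pair for every word in a split.
        var mapped = splits.SelectMany(line => line.Split(' ')
                                                   .Select(word => (word, count: 1)));

        // Shuffling: club identical words together.
        var shuffled = mapped.GroupBy(pair => pair.word);

        // Reducing: sum the counts for each word and print <word, frequency>.
        foreach (var group in shuffled.OrderBy(g => g.Key))
            Console.WriteLine($"{group.Key} {group.Sum(p => p.count)}");
    }
}

Here GroupBy plays the role of the shuffle; on a real cluster that step moves intermediate records between machines.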
ANSWER:
1. Data Breaches
Cloud computing and services are relatively new, yet data breaches in all forms have
existed for years. The question remains: “With sensitive data being stored online
rather than on premise, is the cloud inherently less safe?”
A study conducted by the Ponemon Institute entitled “Man In Cloud Attack” reports
that over 50 percent of the IT and security professionals surveyed believed their
organization’s security measures to protect data on cloud services are low. This study
used nine scenarios, where a data breach had occurred, to determine if that belief was
founded in fact.
2. Hijacking of Accounts
The growth and implementation of the cloud in many organizations has opened a
whole new set of issues in account hijacking.
Attackers now have the ability to use your (or your employees’) login information to
remotely access sensitive data stored on the cloud; additionally, attackers can falsify
and manipulate information through hijacked credentials.
3. Insider Threat
An attack from inside your organization may seem unlikely, but the insider threat
does exist. Employees can use their authorized access to an organization’s
cloud-based services to misuse or access information such as customer accounts,
financial forms, and other sensitive information.
4. Malware Injection
Malware injections are scripts or code embedded into cloud services that act as “valid
instances” and run as SaaS to cloud servers. This means that malicious code can be
injected into cloud services and viewed as part of the software or service that is
running within the cloud servers themselves.
5. Abuse of Cloud Services
The expansion of cloud-based services has made it possible for both small and
enterprise-level organizations to host vast amounts of data easily. However, the same
capacity also allows unauthorized users and attackers to host and spread malware
and illegal content.
6. Insecure APIs
APIs can be a threat to cloud security because of their very nature: not only do they
give companies the ability to customize features of their cloud services to fit
business needs, but they also authenticate, provide access, and effect encryption.
Most of the issues we’ve looked at here are technical in nature, however this
particular security gap occurs when an organization does not have a clear plan for its
goals, resources, and policies for the cloud. In other words, it’s the people factor.
9. Shared Vulnerabilities
Cloud security is a shared responsibility between the provider and the client.
This partnership between client and provider requires the client to take preventative
actions to protect their data. While major providers like Box, Dropbox, Microsoft, and
Google do have standardized procedures to secure their side, fine grain control is up
to you, the client.
10. Data Loss
Data on cloud services can be lost through a malicious attack, natural disaster, or a
data wipe by the service provider. Losing vital information can be devastating to
businesses that do not have a recovery plan.
ANSWER:
Computer and network security is fundamentally about three goals/objectives:
-- confidentiality (C)
-- integrity (I), and
-- availability (A).
Integrity is a degree of confidence that the data in the cloud is what is supposed
to be there, and is protected against accidental or intentional alteration without
authorization. It also extends to the hurdles of synchronizing multiple databases.
Integrity is supported by well audited code, well-designed distributed systems, and
robust access control mechanisms.
In order to address the aforementioned challenges, Fujitsu Laboratories developed
new cloud information gateway technology that can flexibly control data, including
data content, transmitted from the inside of a company to a cloud and between
multiple clouds.
In addition to the option of blocking confidential data, the data gateway also includes
the following three features.
● Data masking has also been known by such names as data obfuscation,
de-identification, or depersonalization.
● Using masking technology, when data passes through the information gateway,
confidential parts of the data can be deleted or changed before the data is
transmitted to an external cloud (a sketch follows this list).
● For confidential data that cannot be released outside of the company, even in a
form obtained by concealing certain aspects of the data, the information gateway
can, simply by defining the security level of the data, transfer the cloud-based
application to the in-house sandbox for execution.
● The sandbox will block access to data or networks that have not been
pre-authorized, so even applications transferred from the cloud can be safely
executed.
● The information gateway tracks all information flowing into and out of the cloud,
so these flows and their content can be checked.
● Data traceability technology uses the logs obtained on data traffic as well as the
characteristics of the related text to make visible the data used in the cloud.
● Authentication of users takes several forms, but all are based on a combination of
authentication factors: something an individual knows (such as a password),
something they possess (such as a security token), or some measurable quality
that is intrinsic to them (such as a fingerprint).
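As a rough sketch of the masking idea described in the list above (the masking rules, field formats, and method names are hypothetical illustrations, not Fujitsu's actual gateway interface):

using System;
using System.Text.RegularExpressions;

// Confidential parts of a record are deleted or changed before the record
// leaves for an external cloud.
class MaskingDemo
{
    // Replace all but the last four digits of a 16-digit card-like number.
    static string MaskCardNumber(string text) =>
        Regex.Replace(text, @"\b\d{12}(\d{4})\b", "############$1");

    // Obscure the local part of an e-mail address.
    static string MaskEmail(string text) =>
        Regex.Replace(text, @"\b[\w.]+@", "****@");

    static void Main()
    {
        string record = "card=4111111111111111 contact=alice.k@example.com";
        Console.WriteLine(MaskEmail(MaskCardNumber(record)));
        // Output: card=############1111 contact=****@example.com
    }
}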
ANSWER:
7. "==" is used for comparison operation and "=" is used for assignment
operation.
2. MODERN
1. C# is based on the current trend and is very powerful and
simple for building interoperable, scalable, robust applications.
2. C# includes built-in support to turn any component into a web service that can
be invoked over the Internet from any application running on any platform.
3. OBJECT ORIENTED
int i = 1;
string a = i.ToString(); // conversion (or) boxing
4. TYPE SAFE
2. Value types (primitive types) are initialized to zeros and reference types (objects
and classes) are initialized to null by the compiler automatically.
3. Arrays are zero-base indexed and are bounds-checked.
5. INTEROPERABILITY
1. C# includes native support for COM and Windows-based applications.
3. Users no longer have to explicitly implement IUnknown and other COM
interfaces; those features are built in.
4. C# allows users to use pointers in unsafe code blocks to manipulate
old code.
5. Components from VB.NET and other managed-code languages can directly be
used in C#.
6. SCALABLE AND UPDATEABLE
1. .NET has introduced assemblies, which are self-describing by means of their
manifest. The manifest establishes the assembly identity, version, culture,
digital signature, etc. Assemblies need not be registered anywhere.
2. To scale our application, we delete the old files and update them with new
ones. No registering of dynamic link libraries is required.
7. Abstraction in C#
The word abstract means a concept or an idea not associated with any specific
instance. In programming we apply the same meaning of abstraction by making
classes not associated with any specific instance. Abstraction is used when we
need to only inherit from a certain class, but do not need to instantiate objects of that
class. In such a case the base class can be regarded as "incomplete". Such classes are
known as "abstract base classes".
8. Encapsulation in C#
Encapsulation means binding data and the methods that operate on that data into a
single unit (a class) and restricting direct access to an object's internal state;
controlled access is provided through methods and properties.
9. Inheritance in C#
In the language of C#, a class that is inherited is called a base class. The class that does
the inheriting is called the derived class. Therefore a derived class is a specialized
version of a base class. It inherits all of the variables, methods, properties, and
indexers defined by the base class and adds its own unique elements.
10. Polymorphism in C#
Polymorphism means the same operation may behave differently on different classes.
Method Overloading: A method with the same name but with different arguments is
called method overloading.
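A short illustrative sketch combining a base class, a derived class, and method overloading (all names are illustrative):

using System;

class Animal                       // base class
{
    public virtual void Speak() => Console.WriteLine("Some sound");
}

class Dog : Animal                 // derived class: a specialized Animal
{
    public override void Speak() => Console.WriteLine("Woof");

    // Method overloading: same name, different arguments.
    public void Fetch() => Console.WriteLine("Fetching a stick");
    public void Fetch(string item) => Console.WriteLine("Fetching " + item);
}

class InheritanceDemo
{
    static void Main()
    {
        Animal pet = new Dog();    // derived object through a base reference
        pet.Speak();               // prints "Woof" (runtime polymorphism)
        new Dog().Fetch("ball");   // picks the overload taking a string
    }
}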
C# provides the three keywords try, catch, and finally to do exception handling. The
try block encloses the statements that might throw an exception, whereas catch
handles an exception if one occurs. The finally block can be used for any clean-up
process.
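A minimal sketch:

using System;

class ExceptionDemo
{
    static void Main()
    {
        try
        {
            int[] numbers = { 1, 2, 3 };
            Console.WriteLine(numbers[5]);   // throws IndexOutOfRangeException
        }
        catch (IndexOutOfRangeException ex)
        {
            Console.WriteLine("Caught: " + ex.Message);
        }
        finally
        {
            Console.WriteLine("Clean-up runs whether or not an exception occurred.");
        }
    }
}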
ANSWER:
DataSet:
The dataset represents a subset of the database. It does not have a continuous
connection to the database. To update the database a reconnection is required. The
DataSet contains DataTable objects and DataRelation objects. The DataRelation
objects represent the relationship between two tables.
The DataSet, which is an in-memory cache of data retrieved from a data source, is a
major component of the ADO.NET architecture. The DataSet consists of a collection of
DataTable objects that you can relate to each other with DataRelation objects.
using System.Data;
DataSet objDS = new DataSet();
● A Dataset is mostly used to populate server controls like DataGrid, DataList,
DropDown, and DataRepeater. You can pass data in binary format, meaning
serialization is also possible using a Dataset.
● Before using a Dataset it must be connected to some source; once you have
connected to the source, you can populate any of the controls with the filled dataset.
Most Important Property of DataSet
Tables: The most important property of your DataSet; through it you can come to
know whether your queries returned something or not. It contains the collection of
tables in your dataset, like a one-dimensional array. You can assign a name to each
table of your dataset.
if (objDS.Tables[0].Rows.Count > 0)
{
//do some action
}
DataSetName: You can give a name to your dataset, which makes it easy to
remember the purpose of the dataset. You can do this in two different ways:
objDS.DataSetName = "EmployeeDS";
DataSet objDS = new DataSet("EmployeeDS");
Relation: It is used to define the relationship between different tables. It defines the
relationship on the basis of certain key columns.
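A minimal sketch with hypothetical table and column names:

using System;
using System.Data;

class RelationDemo
{
    static void Main()
    {
        DataSet ds = new DataSet("CompanyDS");

        DataTable dept = ds.Tables.Add("Departments");
        dept.Columns.Add("DeptId", typeof(int));

        DataTable emp = ds.Tables.Add("Employees");
        emp.Columns.Add("Name", typeof(string));
        emp.Columns.Add("DeptId", typeof(int));

        // The relation links the two tables on their DeptId key columns.
        ds.Relations.Add("DeptEmployees",
            dept.Columns["DeptId"], emp.Columns["DeptId"]);

        Console.WriteLine(ds.Relations["DeptEmployees"].RelationName);
    }
}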
Property – Description
Events – Gets the list of event handlers that are attached to this component.
Prefix – Gets or sets an XML prefix that aliases the namespace of the DataSet.
ANSWER:
using System;

class Program
{
    static void Main()
    {
        int num1;
        int num2;
        string operand;
        float answer;

        Console.Write("First number: ");
        num1 = Convert.ToInt32(Console.ReadLine());
        Console.Write("Operand (+, -, *, /): ");
        operand = Console.ReadLine();
        Console.Write("Second number: ");
        num2 = Convert.ToInt32(Console.ReadLine());

        switch (operand)
        {
            case "-": answer = num1 - num2; break;
            case "+": answer = num1 + num2; break;
            case "/": answer = (float)num1 / num2; break;
            case "*": answer = num1 * num2; break;
            default: answer = 0; break;
        }

        Console.WriteLine("{0} {1} {2} = {3}", num1, operand, num2, answer);
        Console.ReadLine();
    }
}
ANSWER:
ADO.NET provides a bridge between the front end controls and the back end
database. The ADO.NET objects encapsulate all the data access operations and the
controls interact with these objects to display data, thus hiding the details of
movement of data.
ADO.NET Architecture
System.Data namespace is the core of ADO.NET and it contains classes used by all data
providers. ADO.NET is designed to be easy to use, and Visual Studio provides several
wizards and other features that you can use to generate ADO.NET data access code.
The two key components of ADO.NET are Data Providers and DataSet. The Data
Provider classes are meant to work with different kinds of data sources. They are
used to perform all data-management operations on specific databases. DataSet class
provides mechanisms for managing data when it is disconnected from the data
source.
Data Providers
The .NET Framework includes mainly three Data Providers for ADO.NET: the
Microsoft SQL Server Data Provider, the OLEDB Data Provider, and the ODBC Data
Provider. SQL Server uses the SqlConnection object, OLEDB uses the OleDbConnection
object, and ODBC uses the OdbcConnection object, respectively.
Connection
The Connection object provides the physical connection to the data source. The
Connection object needs the necessary information to recognize the data source and
to log on to it properly; this information is provided through a connection string.
Command
The Command object uses the Connection object to execute SQL queries against the
data source. The queries can be in the form of inline text or stored procedures, and
may return data, modify data, or execute stored procedures.
DataReader
The DataReader object provides a forward-only, read-only, connected stream of rows
from the data source; it is the most efficient way to simply read query results.
DataAdapter
The DataAdapter object populates a DataSet object with results from a data source. It
is a special class whose purpose is to bridge the gap between the disconnected
DataSet objects and the physical data source.
DataSet
A DataSet contains rows, columns, primary keys, constraints, and relations with other
DataTable objects. It consists of a collection of DataTable objects that you can relate
to each other with DataRelation objects. The DataAdapter object provides a bridge
between the DataSet and the data source.
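A minimal end-to-end sketch using the SQL Server data provider (the connection string, database, and table names are hypothetical assumptions for illustration):

using System;
using System.Data;
using System.Data.SqlClient;

class AdoNetDemo
{
    static void Main()
    {
        // Hypothetical connection string; adjust for your environment.
        string connStr = "Data Source=.;Initial Catalog=TestDB;Integrated Security=True";

        using (SqlConnection conn = new SqlConnection(connStr))
        {
            // The DataAdapter bridges the data source and the disconnected DataSet.
            SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM Employees", conn);

            DataSet ds = new DataSet("EmployeeDS");
            adapter.Fill(ds, "Employees");   // opens and closes the connection itself

            foreach (DataRow row in ds.Tables["Employees"].Rows)
                Console.WriteLine(row[0]);
        }
    }
}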
ANSWER:
The objective of Windows Azure is to automate the service life cycle as much as
possible. Windows Azure service life cycle has five distinct phases and four different
roles.
Design and development: In this phase, the on-premise team plans, designs,
and develops a cloud service for Windows Azure. The design includes the quality
attribute requirements for the service and the solution to fulfill them. This
phase is conducted completely on-premise, unless there is some proof of
concept (POC) involved. The key roles involved in this phase are the on-premise
stakeholders; for the sake of simplicity, these on-site design roles can be combined
into a developer role.
Testing: In this phase, the quality attributes of the cloud service are tested.
This phase involves on-premise as well as Windows Azure cloud testing. The
tester role is in charge of this phase and tests end-to-end quality attributes of
the service deployed into cloud testing or staging environment.
ANSWER:
Maximizing data availability:
Azure provides failover clustering of data that is replicated three times, and keeps
hosted application instances running when a failure occurs.
Azure services can enable Transport Layer Security (TLS) to use the secure HTTP
protocol (HTTPS) for the transmission of encrypted requests to, and responses from,
production hosted services and storage accounts for web roles.