CLOUD COMPUTING LAB RAMAN (1601915012) HIET
PRACTICAL 1
Cloud computing
Cloud computing is a general term for the delivery of hosted services over the internet.
Cloud computing enables companies to consume a compute resource, such as a virtual
machine (VM), storage or an application, as a utility -- just like electricity -- rather than
having to build and maintain computing infrastructures in house.
Self-service provisioning: End users can spin up compute resources for almost
any type of workload on demand. This eliminates the traditional need for IT
administrators to provision and manage compute resources.
Elasticity: Companies can scale up as computing needs increase and scale down
again as demands decrease. This eliminates the need for massive investments in
local infrastructure, which may or may not remain active.
Pay per use: Compute resources are measured at a granular level, enabling users
to pay only for the resources and workloads they use.
Workload resilience: Cloud service providers often implement redundant
resources to ensure resilient storage and to keep users' important workloads
running -- often across multiple global regions.
Migration flexibility: Organizations can move certain workloads to or from the
cloud -- or to different cloud platforms -- as desired or automatically for better cost
savings or to use new services as they emerge.
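The pay-per-use model above can be made concrete with a small metering sketch. The rates and workload figures below are hypothetical, chosen only to illustrate granular billing, not any provider's actual pricing:

```java
public class PayPerUse {
    // Hypothetical rates, in cents: 5 cents per VM-hour,
    // 2 cents per GB-month of storage (illustration only).
    static long monthlyBillCents(int vmCount, long hoursPerVm, long storageGb) {
        return (long) vmCount * hoursPerVm * 5 + storageGb * 2;
    }

    public static void main(String[] args) {
        // 3 VMs running 200 hours each, plus 500 GB of storage:
        // 3*200*5 + 500*2 = 4000 cents = $40
        System.out.println(monthlyBillCents(3, 200, 500)); // prints 4000
    }
}
```

Because idle resources can be released, the bill tracks actual consumption rather than provisioned capacity.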
Private cloud services are delivered from a business's data center to internal users. This model
offers the versatility and convenience of the cloud, while preserving the management, control and
security common to local data centers. Internal users may or may not be billed for services
through IT chargeback.
Common private cloud technologies and vendors include VMware and OpenStack.
In the public cloud model, a third-party cloud service provider delivers the cloud service
over the internet. Public cloud services are sold on demand, typically by the minute or
hour, though long-term commitments are available for many services. Customers only pay
for the CPU cycles, storage or bandwidth they consume.
Leading public cloud service providers include Amazon Web Services (AWS),
Microsoft Azure, IBM and Google Cloud Platform.
The goal of a hybrid cloud is to create a unified, automated, scalable environment that
takes advantage of all that a public cloud infrastructure can provide, while still maintaining
control over mission-critical data.
PRACTICAL 2
Creating the foundation of a basic cloud app with Force.com requires just a few mouse
clicks. In this tutorial, you use the App Quick Start wizard to create an app that can help
you manage merchandise records in a warehouse.
From the Force.com Home page, click the big Add App button in the Getting Started
section. (If you're starting from somewhere else, click <your_name> | Setup to return to the
Force.com Home page).
Next, fill in the form with your app's details, then click Create.
Once the wizard finishes, click Go To My App, Start Tour, and follow along for a quick
overview of your app's user interface.
Tell Me More
The app you just created is very simple -- or is it? Look closely around the screen to see
all of the functionality available by default to your Warehouse app.
PRACTICAL 3
Apex Code is designed explicitly for expressing business logic and manipulating data,
rather than generically supporting other programming tasks such as user interfaces and
interaction. Apex Code is therefore conceptually closer to the stored procedure languages
common in traditional database environments, such as PL/SQL and Transact-SQL. But
unlike those languages, which due to their heritage can be terse and difficult to use, Apex
Code uses a Java-like syntax, making it straightforward for most developers to
understand. And like Java, Apex Code is strongly typed, meaning that the code is
compiled by the developer before it is executed, and that variables must be associated
with specific object types during this compile process. Control structures are also Java-
like, with for/while loops and iterators borrowing that syntax directly.
The complete trigger, which blocks both in-batch duplicates and duplicates of existing records by email address, is:
trigger leadDuplicatePreventer on Lead (before insert, before update) {
    Map<String, Lead> leadMap = new Map<String, Lead>();
    for (Lead lead : System.Trigger.new) {
        // Only consider leads that have an email and are newly inserted
        // or whose email address has changed
        if ((lead.Email != null) &&
            (System.Trigger.isInsert ||
             (lead.Email != System.Trigger.oldMap.get(lead.Id).Email))) {
            if (leadMap.containsKey(lead.Email)) {
                lead.Email.addError('Another new lead has the same email address.');
            } else {
                leadMap.put(lead.Email, lead);
            }
        }
    }
    // Check leads already in the database for the same email addresses
    for (Lead lead : [SELECT Email FROM Lead WHERE Email IN :leadMap.keySet()]) {
        Lead newLead = leadMap.get(lead.Email);
        newLead.Email.addError('A lead with this email address already exists.');
    }
}
In addition, no Apex Code trigger would be complete without test coverage. This is done using a class with a special test method; because the test code does not commit records to the database, it can be run over and over without modifying your data. Here is a sample test method to verify that the above code works.
public class testBlockDuplicatesLeadTrigger {
static testMethod void testDuplicateTrigger(){
Lead[] l1 =new Lead[]{
new Lead( Email='homer@fox.tv', LastName='Simpson', Company='fox' )
};
insert l1; // add a known lead
Lead[] l2 =new Lead[]{
new Lead( Email='homer@fox.tv', LastName='Simpson', Company='fox' )
};
// try to add a matching lead
try { insert l2; } catch ( System.DmlException e) {
system.assert(e.getMessage().contains('first error: FIELD_CUSTOM_VALIDATION_EXCEPTION, A lead with this email address already exists'), e.getMessage());
}
// test duplicates in the same batch
Lead[] l3 =new Lead[]{
new Lead( Email='marge@fox.tv', LastName='Simpson', Company='fox' ),
new Lead( Email='marge@fox.tv', LastName='Simpson', Company='fox' )
};
try { insert l3; } catch ( System.DmlException e) {
system.assert(e.getMessage().contains('first error: FIELD_CUSTOM_VALIDATION_EXCEPTION, Another new lead has the same email'), e.getMessage());
}
// test update also
Lead[] lup = new Lead[]{
new Lead( Email='marge@fox.tv', LastName='Simpson', Company='fox' )
};
insert lup;
Lead marge = [ select id,Email from lead where Email = 'marge@fox.tv' limit 1];
system.assert(marge!=null);
marge.Email = 'homer@fox.tv';
try { update marge; } catch ( System.DmlException e) {
system.assert(e.getMessage().contains('first error: FIELD_CUSTOM_VALIDATION_EXCEPTION, A lead with this email address already exists'), e.getMessage());
}
}
}
PRACTICAL 4
Choose ASP.NET Web Service as the template and name your project: Server.
Throughout this project, I’ll use C:\Project7 as my default folder.
Go to the Service.cs file and create the four needed methods by replacing:
[WebMethod]
public string HelloWorld() {
    return "Hello World";
}
with:
[WebMethod]
public int Add(int x, int y)
{
return x + y;
}
[WebMethod]
public int Subtract(int x, int y)
{
return x - y;
}
[WebMethod]
public int Multiply(int x, int y)
{
return x * y;
}
[WebMethod]
public int Division(int x, int y)
{
return x / y;
}
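As written, Division performs integer division and will fault if y is 0. A guarded variant is sketched below in Java for illustration only (the service itself is C#; the same check applies there):

```java
public class SafeCalculator {
    // Rejects a zero divisor up front instead of letting a raw
    // divide-by-zero exception surface to the web service client.
    static int division(int x, int y) {
        if (y == 0) {
            throw new IllegalArgumentException("Divisor must be non-zero");
        }
        return x / y; // integer division truncates: 7 / 2 == 3
    }

    public static void main(String[] args) {
        System.out.println(division(10, 2)); // prints 5
    }
}
```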
Note that the [WebMethod] attribute makes these methods accessible to external clients. Now
replace the code:
[WebService(Namespace = "http://tempuri.org/")]
with:
[WebService(Namespace="http://tempuri.org/",
Description="A Simple Web Calculator Service",
Name="CalculatorWebService")]
The Description attribute gives external clients a brief description of the service; the Name
attribute lets external clients refer to the service as CalculatorWebService rather than
Service.
Expand the tree Sites, then right click on Default Web Site, and choose Add Virtual
Directory.
Enter WebService in the Alias field, and C:\Project7\Server in the Physical path field.
Click on the WebService folder and then switch IIS to Content View in order to see the
Service.asmx and the web.config files.
Since we’ve manually created the Virtual Directory WebService without the help of Visual
Studio testing mode, you should right click the WebService folder and choose Convert to
Application followed by clicking OK.
Now let’s add our web service as a service reference to our project as follows:
Right Click the Project in the solution explorer / choose Add Service Reference / enter
http://localhost/WebService/Service.asmx in the address field /click Go to view the
imported functions / choose ServiceReference as your namespace / click OK.
Edit your Default.aspx.cs source to add the method GetResult that takes as an input two
number strings and an integer function which corresponds to the four basic calculator
operations we need.
private string GetResult(string firstNumber, string secondNumber, int function)
{
    ServiceReference.CalculatorWebServiceSoapClient client =
        new ServiceReference.CalculatorWebServiceSoapClient();
    int a, b;
    string result = null;
    erra.Text = "";
    errb.Text = "";
    // Validate both inputs before calling the remote service
    if (!int.TryParse(firstNumber, out a))
    {
        erra.Text = "Enter a valid number";
        return null;
    }
    if (!int.TryParse(secondNumber, out b))
    {
        errb.Text = "Enter a valid number";
        return null;
    }
    try
    {
        switch (function)
        {
            case 0:
                result = firstNumber + " + " + secondNumber + " = " + client.Add(a, b);
                break;
            case 1:
                result = firstNumber + " - " + secondNumber + " = " + client.Subtract(a, b);
                break;
            case 2:
                result = firstNumber + " * " + secondNumber + " = " + client.Multiply(a, b);
                break;
            case 3:
                result = firstNumber + " / " + secondNumber + " = " + client.Division(a, b);
                break;
        }
    }
    catch (Exception ex)
    {
        // Report service faults (e.g. division by zero) back to the caller
        result = ex.Message;
    }
    return result;
}
PRACTICAL 5
7) In the next step you need to specify a key or serial number for the operating system. If you are using a trial version, this part can be skipped.
8) Enter a name for the virtual machine and specify the path to the directory where you want to create it. Make sure the drive you select has sufficient free space.
9) Specify the amount of disk space you want to allocate to the virtual machine. Allocate disk space according to the size of the software you are going to install on it.
10) The next screen shows the configuration you selected for the virtual machine.
11) Hardware is allocated according to the default settings, but you can change this using the Customize Hardware button on that screen.
You can specify how much RAM and processor capacity to allocate to the virtual machine. Do not allocate all of the host's RAM or processors, but do not allocate too little either; leave the defaults, or allocate enough that your applications can run on the virtual machine. Otherwise the virtual machine will be slow.
12) Click the Finish button to create the virtual machine at the specified location with the specified resources.
If you specified a valid image file (.iso, .rar, .nrg) for the operating system, operating system setup on the virtual machine will complete in the usual time, after which it will be ready to use as your regular OS.
Notes:
If you didn't specify an operating system while creating the virtual machine, you can install one later, just as you would for a laptop or desktop machine. You can use CD/DVD or USB devices such as a pen drive, or even a setup file on disk, to install the operating system in the VM.
If your CD/DVD drive is not working, it is still simple to install the operating system. Go to VM -> Settings -> select CD/DVD -> in the right half, select the 'use ISO image from' radio button and specify the path on your hard disk where the .iso file is placed. This location will be treated as the CD/DVD drive of your machine.
Make sure the correct boot order is specified in the BIOS so that installation starts when the VM is powered on (in the case where no guest OS is installed yet).
Option 2. Using USB devices: when a USB device is plugged in, it is by default available to the host operating system and will not show up in the VM. To make it available in the VM, do:
VM -> Removable Devices -> hover over the USB device and click Connect (Disconnect from host). The USB device will now be available in the guest OS (VM) but not in the host machine. Do the reverse to make it available in the host machine again.
PRACTICAL 6
Re-login as hduser_
Select Stable
Once download is complete, navigate to the directory containing the tar file
Step 4) Modify ~/.bashrc file
Add following lines to end of file ~/.bashrc
#Set HADOOP_HOME
export HADOOP_HOME=<Installation Directory of Hadoop>
#Set JAVA_HOME
export JAVA_HOME=<Installation Directory of Java>
# Add bin/ directory of Hadoop to PATH
export PATH=$PATH:$HADOOP_HOME/bin
Next enter
sudo chmod +x /etc/profile.d/hadoop.sh
Open the mapred-site.xml file
sudo gedit $HADOOP_HOME/etc/hadoop/mapred-site.xml
Open $HADOOP_HOME/etc/hadoop/hdfs-site.xml as below,
sudo gedit $HADOOP_HOME/etc/hadoop/hdfs-site.xml
Step 7) Before we start Hadoop for the first time, format HDFS using the below command
$HADOOP_HOME/bin/hdfs namenode -format
Step 8) Start Hadoop by starting the DFS and YARN daemons
$HADOOP_HOME/sbin/start-dfs.sh
$HADOOP_HOME/sbin/start-yarn.sh
Using the 'jps' command, verify that all the Hadoop-related processes are running.
If Hadoop has started successfully then output of jps should show NameNode,
NodeManager, ResourceManager, SecondaryNameNode, DataNode.
To stop Hadoop, run
$HADOOP_HOME/sbin/stop-yarn.sh
$HADOOP_HOME/sbin/stop-dfs.sh
PRACTICAL 7
Steps-
Step 1. Open Eclipse> File > New > Java Project >( Name it – MRProgramsDemo) >
Finish
Step 2. Right Click > New > Package ( Name it - PackageDemo) > Finish
Step 3. Right Click on Package > New > Class (Name it - WordCount)
Step 4. Add the following reference libraries:
Right Click on Project > Build Path > Add External Archives
/usr/lib/hadoop-0.20/hadoop-core.jar
/usr/lib/hadoop-0.20/lib/commons-cli-1.2.jar
Step 5. Type following Program :
package PackageDemo;
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;
public class WordCount {
public static void main(String [] args) throws Exception
{
Configuration c=new Configuration();
String[] files=new GenericOptionsParser(c,args).getRemainingArgs();
Path input=new Path(files[0]);
Path output=new Path(files[1]);
Job j=new Job(c,"wordcount");
j.setJarByClass(WordCount.class);
j.setMapperClass(MapForWordCount.class);
j.setReducerClass(ReduceForWordCount.class);
j.setOutputKeyClass(Text.class);
j.setOutputValueClass(IntWritable.class);
FileInputFormat.addInputPath(j, input);
FileOutputFormat.setOutputPath(j, output);
System.exit(j.waitForCompletion(true)?0:1);
}
public static class MapForWordCount extends Mapper<LongWritable, Text, Text,
IntWritable>{
public void map(LongWritable key, Text value, Context con) throws IOException,
InterruptedException
{
String line = value.toString();
String[] words=line.split(",");
for(String word: words )
{
Text outputKey = new Text(word.toUpperCase().trim());
IntWritable outputValue = new IntWritable(1);
con.write(outputKey, outputValue);
}
}
}
public static class ReduceForWordCount extends Reducer<Text, IntWritable, Text,
IntWritable>
{
public void reduce(Text word, Iterable<IntWritable> values, Context con) throws
IOException, InterruptedException
{
int sum = 0;
for(IntWritable value : values)
{
sum += value.get();
}
con.write(word, new IntWritable(sum));
}
}
}
Explanation
The program consists of 3 classes:
Driver class (the public static void main entry point)
Map class, which extends Mapper<KEYIN,VALUEIN,KEYOUT,VALUEOUT> and implements the map function.
Reduce class, which extends Reducer<KEYIN,VALUEIN,KEYOUT,VALUEOUT> and implements the reduce function.
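The map and reduce steps described above can be simulated in plain Java without a cluster. This sketch mirrors the logic of MapForWordCount (split each line on commas, upper-case and trim each word, emit a count of 1) and ReduceForWordCount (sum the counts per word):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LocalWordCount {
    static Map<String, Integer> count(String[] lines) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (String line : lines) {
            for (String word : line.split(",")) {        // map: split on commas
                String key = word.toUpperCase().trim();  // map: normalize the key
                counts.merge(key, 1, Integer::sum);      // reduce: sum the 1s
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        String[] input = { "car,bus,train", "bus, car ,bus" };
        System.out.println(count(input)); // prints {CAR=2, BUS=3, TRAIN=1}
    }
}
```

On a real cluster the framework shuffles and groups the (word, 1) pairs between the map and reduce phases; the in-memory map plays that role here.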
Step 6. Make Jar File
Right Click on Project> Export> Select export destination as Jar File > next> Finish
Step 7. Move the input file into Hadoop; open the terminal and enter the following command:
[training@localhost ~]$ hadoop fs -put wordcountFile wordCountFile
Step 8. Run the Jar file
(hadoop jar jarfilename.jar packageName.ClassName PathToInputTextFile PathToOutputDirectory)
[training@localhost ~]$ hadoop jar MRProgramsDemo.jar PackageDemo.WordCount
wordCountFile MRDir1
Step 9. Open Result
[training@localhost ~]$ hadoop fs -ls MRDir1
Found 3 items
-rw-r--r-- 1 training supergroup 0 2016-02-23 03:36
/user/training/MRDir1/_SUCCESS
drwxr-xr-x - training supergroup 0 2016-02-23 03:36
/user/training/MRDir1/_logs
-rw-r--r-- 1 training supergroup 20 2016-02-23 03:36 /user/training/MRDir1/part-r-00000
[training@localhost ~]$ hadoop fs -cat MRDir1/part-r-00000
BUS 7
CAR 4
TRAIN 6
PRACTICAL 8
The case study showed that even investing a minimum of $1/day on Facebook Ads can give you a significant reach.
By consistently investing $1/day for 30 days, the experimenter was able to reach 120,000 people, or 4,000 people every day.
He is an active user of most advertising platforms, and this is what he found as the cost to reach 1,000 people using popular advertising channels:
Facebook Ads are not only far cheaper than legacy advertising solutions (newspaper, TV, etc.), but also beat their online competitors (AdWords and LinkedIn).
The objective of this case study or experiment was to show that even if you start with
a minimal budget, Facebook Ads can still prove beneficial.
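The reach figures above can be sanity-checked with simple arithmetic: $1/day for 30 days is a $30 spend, and reaching 120,000 people with it implies a cost of $0.25 per 1,000 people reached (CPM):

```java
public class AdReach {
    // Cost per 1,000 people reached (CPM), in dollars
    static double cpm(double totalSpend, long peopleReached) {
        return totalSpend * 1000.0 / peopleReached;
    }

    public static void main(String[] args) {
        double spend = 1.0 * 30;   // $1/day for 30 days
        long reached = 120_000;    // total reach from the case study
        System.out.println(reached / 30);        // prints 4000 (people per day)
        System.out.println(cpm(spend, reached)); // prints 0.25
    }
}
```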
PRACTICAL 9
Cloud Computing Case Study of Amazon Web Services for eCommerce Websites.
eCommerce enterprises are among the major beneficiaries of the advent of cloud computing. These businesses mostly attract visitors and sales online: more than 60% of the resources used before a sale are available online. Other aspects of eCommerce, such as sourcing products from vendors, delivering products to customers, and managing the supply chain, are offline. A typical eCommerce business relies heavily on virtual transactions.
These activities prompt the enterprise to build attractive, feature-rich websites with databases, high-end applications (both web and mobile), and high storage and computing capacity for fast performance, 24x7 availability, and accessibility on every mobile device.
Cloud computing is the advanced form of on-premises hosting and is designed to make up for the features eCommerce sites lack. There are typical problems for eCommerce businesses, such as website breakdown during peak hours or surges in traffic, and a sudden need for space due to growth in the product portfolio and built-in apps and ERP (such as inventory management or customer relationship management). Cloud computing has auto-scaling and load-balancing features which automatically adjust to sudden needs for increased computing and storage, thereby allowing smooth functioning of the online resources.
Amazon Web Services is the leading cloud computing company and offers computing services for eCommerce enterprises. We discuss one case below to understand the benefits of using cloud infrastructure and services.
Ice.com, an eCommerce enterprise retailing jewelry, migrated to the Amazon Web Services cloud. It is based out of New York, United States. ICE was growing fast, with an increasing need for IT hardware, and was hosted on two servers located in Montreal, Canada. It was bearing a monthly expense of $26,000 for managing the physical servers. Besides, it was running the risk of having no disaster recovery plan for its resources, and also had plans to introduce a new web store application, an enterprise resource planning (ERP) system, a content management system (CMS), and a business intelligence (BI) platform for the smooth functioning of the business. ICE hired an AWS partner for system development, remote administration, automation, deployment, and scalability of web applications on AWS infrastructure.
The AWS partner set up ICE's eCommerce sites, ERP, CMS, and BI in compliance with the PCI (Payment Card Industry) standard and cloud security best practices. ICE used core Amazon features such as Amazon EC2 (Elastic Compute Cloud), Amazon ELB (Elastic Load Balancing), and Amazon EBS (Elastic Block Store). It implemented Amazon VPC (Virtual Private Cloud) to host the ERP and BI platforms, and used Amazon S3 (Simple Storage Service) for storing HTML and static pages. AWS IAM (Identity and Access Management) allowed secure, authorized access for staff, and instances were available in multiple zones (US East and US West coasts) with a proper disaster recovery plan.
ICE reaped the benefit of much faster migration of new applications thanks to the cloud infrastructure, which would otherwise not have been possible. Staff overhead was cut in half: one database administrator and one IT professional replaced two administrators and three IT professionals. Moreover, migrating to the cloud infrastructure helped ICE save $250,000 annually.
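As a rough sanity check on the figures, assuming the $250,000 annual saving is measured against the old $26,000/month server bill (the case study does not state this explicitly), the implied annual cloud spend can be computed:

```java
public class SavingsCheck {
    // Annual on-prem cost minus the reported annual saving gives the
    // implied cloud spend; an inference, not a figure from the case study.
    static long impliedCloudSpend(long monthlyServerCost, long annualSaving) {
        return monthlyServerCost * 12 - annualSaving;
    }

    public static void main(String[] args) {
        System.out.println(26_000L * 12);                       // prints 312000
        System.out.println(impliedCloudSpend(26_000, 250_000)); // prints 62000
    }
}
```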