
Dhruval Patel IU2041230030 DPA CS-A

Practical 1
AIM: INTRODUCTION TO JUPYTER

Installation
You can use pip, the package installer that comes with Python, to install
Jupyter Notebook like this:
$ pip install jupyter
Jupyter also comes bundled with Anaconda, the next most popular distribution of Python.
Starting the Jupyter Notebook Server
Open your terminal application and navigate to a folder of your choice, then
run the following command:
$ jupyter notebook
This will start up Jupyter, and your default browser should open (or open
a new tab) at the following URL: http://localhost:8888/tree
A browser window should immediately pop up with the Jupyter Notebook interface;
otherwise, copy the address the server prints into your browser. That address
includes a unique token generated by the server, so only you can access your
notebooks. To stop the server and shut down the kernel from the terminal, press
Ctrl-C twice.
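
If the default port is busy or you do not want a browser tab to open
automatically, the server accepts a few common options, for example:
$ jupyter notebook --no-browser --port 8889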

Jupyter Interface

Now you are in the Jupyter Notebook interface, where you can see all of the files
in your current directory. Jupyter Notebooks are identifiable by the notebook icon
next to their names. If there is already a notebook in your current directory that
you want to view, find it in the file list and click it to open it.

Benefits

1. All in one place: Jupyter Notebook is an open-source, web-based interactive
environment that combines code, text, images, videos, mathematical equations,
plots, maps, graphical user interfaces, and widgets in a single document.
2. Easy to convert: Jupyter Notebook allows users to convert notebooks into
other formats such as HTML and PDF, for example with the nbconvert command
shown after this list. Online tools such as nbviewer also let you render a
publicly available notebook directly in the browser.
3. Easy to share: Jupyter Notebooks are saved as structured text files (JSON
format), which makes them easy to share.
4. Language independent: A Jupyter Notebook is language-independent because it is
stored in JSON (JavaScript Object Notation), a language-independent, text-based
file format. The notebook can also be processed by any programming language and
converted to other file formats such as Markdown, HTML, and PDF.
5. Interactive code: Jupyter Notebook uses the ipywidgets package, which provides
many common user-interface controls for exploring code and data interactively.
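
For example, a notebook can be converted from the command line with nbconvert,
which ships with Jupyter (PDF conversion additionally requires a LaTeX
installation); assuming a notebook named Notebook.ipynb in the current folder:
$ jupyter nbconvert --to html Notebook.ipynb
$ jupyter nbconvert --to pdf Notebook.ipynb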

Components of Jupyter Notebook

Jupyter Notebook has the following three components:



1. The notebook web application: an interactive web application for writing
and running code.

The notebook web application allows users to:

o Edit code in the browser with automatic syntax highlighting and indentation.
o Run code in the browser.
o See the results of computations with rich media representations, such as
HTML, LaTeX, PNG, PDF, etc.
o Create and use JavaScript widgets.
o Include mathematical equations using Markdown cells.

2. Kernels: Kernels are separate processes started by the notebook web
application; they run the user's code in a given language and return the
output to the notebook web application.

Jupyter kernels are available for languages including:

o Python
o Julia
o Ruby
o R
o Scala
o Node.js
o Go

3. Notebook documents: A notebook document contains a representation of all
the content visible in the notebook web application, including the inputs and
outputs of computations, text, mathematical equations, graphs, and images.
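
Because a notebook document is just a JSON text file, it can be inspected with
ordinary Python. The sketch below (assuming a notebook named example.ipynb
exists in the current folder) prints the type of each cell:

import json

# A .ipynb file is plain JSON with a top-level "cells" list
with open("example.ipynb", "r", encoding="utf-8") as f:
    nb = json.load(f)

# Each cell records its type ("code" or "markdown") and its source text
for cell in nb["cells"]:
    print(cell["cell_type"], "".join(cell["source"])[:40])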


Practical 2
Aim: IMPORT AND UTILIZE THE DATA FROM CSV FILE AND IMAGE
FROM DATASET REPOSITORY

(i) For CSV File
Code:
from google.colab import files
import io
import pandas as pd

# Upload the CSV file from the local machine
uploaded = files.upload()

# Read the uploaded bytes into a pandas DataFrame
df2 = pd.read_csv(io.BytesIO(uploaded['Book1.csv']))
df2

import csv

with open('Book1 (1).csv', 'r') as f:
    # Create a CSV reader object
    reader = csv.reader(f)
    # Read the header row
    header = next(reader)
    # Initialize an empty list to store the data
    data = []
    # Iterate over the remaining rows of the CSV file
    for row in reader:
        # Append the row to the data list
        data.append(row)


print(header)

print(data)
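
The rows read above can also be written back out with the csv module; a minimal
sketch (the file name output.csv is chosen here only for illustration):

# Write the header and data rows to a new CSV file
with open('output.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(header)
    writer.writerows(data)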

(ii) For Image File

Code:
from google.colab import files
import matplotlib.pyplot as plt

# Upload the image file from the local machine
uploaded = files.upload()

# Read the image into a NumPy array
im = plt.imread('Smiley.png')

# Display the image
plt.imshow(im)
plt.show()
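
Since plt.imread returns the image as a NumPy array, its dimensions and pixel
type can be checked directly:

# (height, width, channels) of the image and the pixel data type
print(im.shape, im.dtype)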


Practical 3
Aim: Apply Data Cleaning Functions and Plotting using matplotlib and
seaborn libraries.

Code:

import numpy as np

import pandas as pd

import seaborn as sns

import matplotlib.pyplot as plt

df=pd.read_csv("/content/drive/MyDrive/Colab Notebooks/titanic.csv")

df.head()

df.info()


df.describe()

df.isnull().sum()
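
As a cleaning step, the missing values reported above can be filled or dropped.
A minimal sketch, assuming the usual Titanic columns Age, Embarked and Cabin
(which typically contain nulls):

# Fill numeric nulls with the median and categorical nulls with the mode
df['Age'] = df['Age'].fillna(df['Age'].median())
df['Embarked'] = df['Embarked'].fillna(df['Embarked'].mode()[0])

# Drop a column that is mostly empty
df = df.drop(columns=['Cabin'])

df.isnull().sum()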


corr=df.corr()

print(corr)

mean=df.mean()

mode=df.mode()

median=df.median()

print("Mean:\n",mean,"\n\nMedian: \n",median,"\n\nMode:\n\n",mode)


sns.pairplot(df)


sns.pairplot(df,hue="Sex")

plt.legend()
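
The correlation matrix computed earlier can also be visualised with seaborn's
heatmap; a minimal sketch:

# Heatmap of the feature correlations with the values annotated
sns.heatmap(corr, annot=True, cmap='coolwarm')
plt.show()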


Practical 4
Aim: Applying Various Encoding Techniques on Categorical Data.

Code:

import numpy as np

import pandas as pd

import seaborn as sns

import matplotlib.pyplot as plt

df=pd.read_csv("/content/drive/MyDrive/Colab Notebooks/titanic.csv")

df.head()

from sklearn.preprocessing import LabelEncoder  # LABEL ENCODER

le = LabelEncoder()

# Label-encode every object (string) column of the DataFrame.
# Note: the same LabelEncoder instance is refitted for each column, so it only
# retains the mapping of the last column it transformed.
for x in df:
  if df[x].dtype == "object":
    df[x] = le.fit_transform(df[x])

df.head()


import category_encoders as ce  # ONE HOT ENCODING
import pandas as pd

data = pd.DataFrame({'City': ['Delhi', 'Mumbai', 'Hyderabad', 'Chennai', 'Bangalore',
                              'Delhi', 'Hyderabad', 'Bangalore', 'Delhi']})

# Create object for one-hot encoding
encoder = ce.OneHotEncoder(cols='City', handle_unknown='return_nan',
                           return_df=True, use_cat_names=True)

# Original Data
data

# Fit and transform Data
data_encoded = encoder.fit_transform(data)
data_encoded


# Create the dataframe (BASE N ENCODING)
data = pd.DataFrame({'City': ['Delhi', 'Mumbai', 'Hyderabad', 'Chennai', 'Bangalore',
                              'Delhi', 'Hyderabad', 'Mumbai', 'Agra']})

# Create an object for Base N Encoding
encoder = ce.BaseNEncoder(cols=['City'], return_df=True, base=5)

# Original Data
data

encoder.fit_transform(data)

# TARGET ENCODING

# Create the Dataframe
data = pd.DataFrame({'class': ['A', 'B', 'C', 'B', 'C', 'A', 'A', 'A'],
                     'Marks': [50, 30, 70, 80, 45, 97, 80, 68]})

# Create target encoding object
encoder = ce.TargetEncoder(cols='class')

data

# Fit and Transform Train Data
encoder.fit_transform(data['class'], data['Marks'])

# Create the Dataframe
data = pd.DataFrame({'City': ['Delhi', 'Mumbai', 'Hyderabad', 'Chennai', 'Bangalore',
                              'Delhi', 'Hyderabad', 'Mumbai', 'Agra']})

# Create object for binary encoding
encoder = ce.BinaryEncoder(cols=['City'], return_df=True)

# Original Data
data

encoder.fit_transform(data)

# HASH ENCODING

# Create the Dataframe
data = pd.DataFrame({'Month': ['January', 'April', 'March', 'April', 'February',
                               'June', 'July', 'June', 'September']})

# Create object for hash encoder
encoder = ce.HashingEncoder(cols='Month', n_components=6)

data

# Fit and Transform Data
encoder.fit_transform(data)



Practical 5
Aim: Applying various image augmentation methods using TensorFlow and Keras.

Code:
import pandas as pd

import numpy as np

import tensorflow as tf

from tensorflow import keras

import matplotlib.pyplot as plt

import tensorflow_datasets as tfds

from tensorflow.keras import layers

from google.colab import drive

drive.mount('/content/drive')

image = tf.io.read_file("/content/drive/MyDrive/DPA LAB/boy.png")

image = tf.image.decode_png(image)

image = image.numpy()

plt.imshow(image)

plt.show()


data_augmentation = keras.Sequential([

    keras.layers.experimental.preprocessing.Resizing(height=256, width=256),

    keras.layers.experimental.preprocessing.Rescaling(scale=1./255),

])

augmented_image = data_augmentation(tf.expand_dims(image, 0), training=False)

augmented_image = augmented_image[0].numpy()

plt.imshow(augmented_image)

plt.show()


data_augmentation = keras.Sequential([

    

        keras.layers.experimental.preprocessing.RandomFlip("horizontal"),

        keras.layers.experimental.preprocessing.Rescaling(scale=1./255),

        keras.layers.experimental.preprocessing.RandomRotation(0.1),

   

])

augmented_image = data_augmentation(tf.expand_dims(image, 0), training=True)

augmented_image = augmented_image[0].numpy()

plt.imshow(augmented_image)

plt.show()


data_augmentation = keras.Sequential([

    keras.layers.experimental.preprocessing.Rescaling(scale=1./255),

    keras.layers.experimental.preprocessing.RandomFlip("vertical"),

])

augmented_image = data_augmentation(tf.expand_dims(image, 0), training=True)

augmented_image = augmented_image[0].numpy()

plt.imshow(augmented_image)

plt.show()
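
Other random transformations from the same preprocessing module can be chained
in the same way; a minimal sketch using RandomZoom and RandomContrast (assuming
the same TensorFlow version as above):

data_augmentation = keras.Sequential([
    keras.layers.experimental.preprocessing.Rescaling(scale=1./255),
    # Randomly zoom by up to 20% and randomly adjust contrast by up to 20%
    keras.layers.experimental.preprocessing.RandomZoom(0.2),
    keras.layers.experimental.preprocessing.RandomContrast(0.2),
])

augmented_image = data_augmentation(tf.expand_dims(image, 0), training=True)
augmented_image = augmented_image[0].numpy()

# Clip to the valid [0, 1] range before displaying, since contrast changes can
# push values slightly outside it
plt.imshow(augmented_image.clip(0, 1))
plt.show()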

