
1. Complete installation prior to creating tasks.

● Use the following commands to install the required packages:

curl https://bootstrap.pypa.io/pip/3.6/get-pip.py -o get-pip.py
sudo apt-get install python3-distutils -y
python3 get-pip.py --force-reinstall
python3 -m pip install --user numpy scikit-learn flask flask-restful
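Before moving on, you can confirm the packages installed correctly. This is a small optional check, not part of the original steps:

```python
import importlib.util

# Check that each required package can be imported before moving on.
required = ["numpy", "sklearn", "flask", "flask_restful"]
missing = [name for name in required if importlib.util.find_spec(name) is None]
print("missing packages:", missing or "none")
```

If any package is reported missing, re-run the pip command above before continuing.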

2. Generating a Model

● Create a folder 'SampleProject' by using the mkdir command:


mkdir SampleProject

● Change the working directory to 'SampleProject' by using the cd command:


cd SampleProject

● Create an empty folder 'models' to store the generated models:


mkdir models

● Create a file 'model_generator.py' with the following content:


from sklearn import datasets
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
import pickle

iris = datasets.load_iris()
validation_size = 0.20
seed = 100

X_train, X_test, Y_train, Y_test = train_test_split(iris.data,
                                                    iris.target,
                                                    test_size=validation_size,
                                                    random_state=seed)

knn = KNeighborsClassifier()
knn.fit(X_train, Y_train)

with open('models/iris_classifier_model.pk', 'wb') as model_file:
    pickle.dump(knn, model_file)

● Run the 'model_generator.py' script. It creates the required model and stores
it in the 'models' folder.
python3 model_generator.py
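Before serving the model, you may want a quick sense of how well it generalizes. A sketch using sklearn's accuracy_score with the same split parameters as 'model_generator.py':

```python
from sklearn import datasets
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Recreate the same 80/20 split used in model_generator.py (seed = 100).
iris = datasets.load_iris()
X_train, X_test, Y_train, Y_test = train_test_split(
    iris.data, iris.target, test_size=0.20, random_state=100)

knn = KNeighborsClassifier()
knn.fit(X_train, Y_train)

# accuracy_score compares predictions on the held-out 20% with the true labels.
accuracy = accuracy_score(Y_test, knn.predict(X_test))
print('Held-out accuracy:', round(accuracy, 3))
```

This retrains in-memory rather than loading the pickle, so it can be run independently of the previous step.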

3. Developing a RESTful API using Flask

● Create the file 'iris_classifier.py' with the following API:


from flask import Flask, request
from flask_restful import Resource, Api
import pickle

app = Flask(__name__)
api = Api(app)

def classify(sepal_len, sepal_wd, petal_len, petal_wd):
    species = ['Iris-Setosa', 'Iris-Versicolour', 'Iris-Virginica']
    with open('models/iris_classifier_model.pk', 'rb') as model_file:
        model = pickle.load(model_file)
    species_class = int(model.predict([[sepal_len, sepal_wd,
                                        petal_len, petal_wd]])[0])
    return species[species_class]

class IrisPredict(Resource):
    def get(self):
        sl = float(request.args.get('sl'))
        sw = float(request.args.get('sw'))
        pl = float(request.args.get('pl'))
        pw = float(request.args.get('pw'))
        result = classify(sl, sw, pl, pw)
        return {'sepal_length': sl,
                'sepal_width': sw,
                'petal_length': pl,
                'petal_width': pw,
                'species': result}

api.add_resource(IrisPredict, '/classify/')
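The endpoint logic can be exercised without starting a server by using Flask's built-in test client. A minimal sketch in plain Flask, with a hypothetical echo handler standing in for IrisPredict (the real resource needs the pickled model on disk):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Simplified stand-in for the IrisPredict resource: it parses the four query
# parameters and echoes them back, so the request/response plumbing can be
# checked without the pickled model being present.
@app.route('/classify/')
def classify_route():
    return jsonify({name: float(request.args.get(name))
                    for name in ('sl', 'sw', 'pl', 'pw')})

# Flask's test client calls the route in-process, no server required.
with app.test_client() as client:
    resp = client.get('/classify/?sl=5.1&sw=3.5&pl=1.4&pw=0.3')
    data = resp.get_json()
    print(resp.status_code, data)
```

A 200 response with the parsed floats confirms the query-string handling works before the model is wired in.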

4. Running the Model as a Service

● Set the 'FLASK_APP' environment variable:


export FLASK_APP=iris_classifier.py

● Set the other required environment variables:


export LC_ALL=C.UTF-8

export LANG=C.UTF-8

● Start the server and expose the API as a service:


python3 -m flask run --host=0.0.0.0 --port=8000

● Open a new terminal and access the API via the 'classify' endpoint:
curl "http://0.0.0.0:8000/classify/?sl=5.1&sw=3.5&pl=1.4&pw=0.3"
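The same request can also be issued from Python. A small sketch using only the standard library to build the query URL (actually fetching it assumes the Flask server from the previous step is still running):

```python
import urllib.parse

# Build the same query string curl sends; urlencode handles the '&' and '='
# separators and any escaping.
params = urllib.parse.urlencode({'sl': 5.1, 'sw': 3.5, 'pl': 1.4, 'pw': 0.3})
url = 'http://0.0.0.0:8000/classify/?' + params
print(url)

# With the server running, the response could be fetched with:
# import json, urllib.request
# with urllib.request.urlopen(url) as resp:
#     print(json.load(resp))
```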

_________________________________________________________________________

Hands-on Scenario

Minikube installation
sudo apt install docker.io -y
sudo systemctl unmask docker
sudo service docker restart
curl -LO https://storage.googleapis.com/minikube/releases/latest/minikube_latest_amd64.deb
sudo dpkg -i minikube_latest_amd64.deb
rm -rf minikube_latest_amd64.deb

Follow these steps to complete the hands-on exercise:

1. Start the cluster: launch the minikube cluster with the command 'minikube start'.
2. Use the following yaml to deploy 'iris-classifier-site' to the cluster.
echo "apiVersion: apps/v1
kind: Deployment
metadata:
  name: iris-classifier-site
  labels:
    app: web
spec:
  replicas: 1
  selector:
    matchLabels:
      app: iris-classifier
  template:
    metadata:
      labels: { app: iris-classifier }
    spec:
      containers:
      - name: mlexample
        image: gpcplay/playimages:mlexample
        ports:
        - containerPort: 5000" > iris-classifier-deployment.yaml

Hint: Use kubectl on the 'iris-classifier-deployment.yaml' file and wait until the pods are
created.
3. Create a Service for the App: use the following yaml to expose port 30800 outside the
cluster. This service will be used by the iris-classifier-site app.
echo "apiVersion: v1
kind: Service
metadata:
  name: iris-classifier-svc
spec:
  selector:
    app: iris-classifier
  type: NodePort
  ports:
  - port: 5000
    nodePort: 30800
    targetPort: 5000" > iris-classifier-service.yaml

Hint: Use kubectl on the 'iris-classifier-service.yaml' file.


4. Verify if the application is running with the command 'kubectl get pods'
● List the service in the current cluster with the command 'kubectl get services'
● Create an environment variable 'NODE_PORT' by using the following
expression:
export NODE_PORT=$(kubectl get services/iris-classifier-svc -o go-template='{{(index .spec.ports 0).nodePort}}')

echo NODE_PORT=$NODE_PORT

● Test if the app is exposed outside the cluster by using curl:


curl "http://$(minikube ip):$NODE_PORT/classify/?sl=5.1&sw=3.5&pl=1.4&pw=0.3"
