Embedding ML Models

Embedding Machine Learning Models Into Web App with Flask

The rise of Artificial Intelligence and Machine Learning has changed the way we live. Many people are venturing into this field and mastering how to build machine learning (ML) models.

The benefits of building ML models are enormous, but we can do more than just build them. We have to make these fascinating ML models usable for the end user, who may know nothing about programming or machine learning algorithms.

Hence we have to not only know how to build these models but also learn how to productionize and deploy them so that everyone can benefit. This introduces the concept of embedding ML models into web applications. So let us learn how to embed our ML models in applications that make them user friendly.

The Basic Structure of Our ML Web Apps

ML web apps, like any web application, have two main parts:

  1. The Front End – what the end user sees
  2. The Back End – what runs the entire app, and what the developer or programmer works on

We can also add an API section, as well as metrics to track how our model is being used.

Building The Front End

The Front End can be built with HTML, CSS and JavaScript. In our case we will include a simple but stable CSS framework called Material Bootstrap, which is Bootstrap built with Material Design concepts.

For our front end, the most important things to know are:

  • How to receive input from the user and send it to the back end for processing
  • How to show the predicted result to the user

To receive inputs from the user we can use the standard HTML form tag and a POST/GET method to send our inputs to the back end. This is how the code will look:

<form method="POST" action="/predict">
	<input type="text" name="namequery" class="form-control">
	<button type="reset" class="btn btn-primary">Clear</button>
	<button type="submit" class="btn btn-primary">Predict</button>
</form>


To show our prediction from the back end we will use the Jinja templating engine. Jinja is a templating engine used by Flask, Django and several other web frameworks. It uses double curly braces to display results passed from the Python back end.
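As a quick standalone illustration of those double curly braces (using the jinja2 package directly, outside of Flask):

```python
from jinja2 import Template

# Jinja replaces anything inside {{ }} with the values passed to render()
tmpl = Template("Original Name: {{ name }} | Prediction: {{ prediction }}")
print(tmpl.render(name="ALICE", prediction="female"))
# → Original Name: ALICE | Prediction: female
```

In Flask, render_template does this same substitution for us against an HTML file in the templates folder.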

<div class="container">
	<div class="row">
		<div class="col-md-6">
			<div class="card mb-10 shadow-sm" id="custom_card">
				<h4>Original Name</h4>
				<p>{{ name }}</p>
			</div>
		</div>
		<div class="col-md-6">
			<div class="card mb-10 shadow-sm" id="custom_card2">
				<div class="alert alert-success" role="alert"><p>{{ prediction }}</p></div>
			</div>
		</div>
	</div>
</div>


Building The Back End with Flask

Now let us move on to our Back-End. We will be using Flask, a powerful web micro-framework.

We will build routes and include all our ML prediction logic in a file called app.py (you can name it whatever you want). There are two main ways of implementing the prediction logic:

  • Use of Serialized (Pickled) ML Models and Vectorizers
  • Use of the entire machine learning procedure in the back-end script
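For context, here is a minimal sketch of how such pickled artifacts could be produced in the first place. The toy names, labels and the exact model choice here are illustrative assumptions, not the article's actual dataset:

```python
import joblib
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Toy data standing in for the real names dataset
names = ["Alice", "Bob", "Chen", "Dana"]
labels = ["female", "male", "male", "female"]

# Character-level features suit short strings like names
cv = CountVectorizer(analyzer="char")
X = cv.fit_transform(names)

clf = MultinomialNB()
clf.fit(X, labels)

# Persist both artifacts so the web app can load them without retraining
joblib.dump(cv, "gender_vectorizer.pkl")
joblib.dump(clf, "naivebayesgendermodel.pkl")
```

With the vectorizer and model saved, the web app below only needs to load and use them.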

It is advisable to use serialized ML models to optimize the speed and performance of your web application. Let us see how the code will look.

from flask import Flask, render_template, url_for, request
import numpy as np

# ML Packages
import joblib  # sklearn.externals.joblib was removed; use the standalone joblib package

# Init App
app = Flask(__name__)

# Generic prediction helper: vectorize the raw input, then predict
def make_prediction(clf, vectorizer, data):
	vect = vectorizer.transform(data).toarray()
	return clf.predict(vect)

@app.route('/')
def index():
	return render_template('index.html')

@app.route('/gender')
def gender():
	return render_template('gender.html')

@app.route('/predict', methods=['POST'])
def predict():
	# Load our saved count vectorizer
	cv_nationality = joblib.load("models/nationality_vectorizer.pkl")

	# Load our saved ML model
	nationality_clf = joblib.load("models/nationality_nv_model.pkl")

	# Receive the input query from the form
	if request.method == 'POST':
		namequery = request.form['namequery']
		data = [namequery]

		vect = cv_nationality.transform(data).toarray()
		result = nationality_clf.predict(vect)
	return render_template('index.html', prediction=result, name=namequery.upper())

@app.route('/predict_gender', methods=['POST'])
def predict_gender():
	# Load our saved count vectorizer
	cv_gender = joblib.load("models/gender_vectorizer.pkl")

	# Load our saved ML model
	gender_clf = joblib.load("models/naivebayesgendermodel.pkl")

	# Receive the input query from the form
	if request.method == 'POST':
		namequery = request.form['namequery']
		data = [namequery]

		vect = cv_gender.transform(data).toarray()
		result = gender_clf.predict(vect)
	return render_template('gender.html', prediction=result, name=namequery.upper())

if __name__ == '__main__':
	app.run(debug=True)
So that will be the back end. In production you will have to set debug to False.
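Earlier we mentioned that an API section can also be added. Here is a minimal, self-contained sketch of what such a JSON endpoint could look like. The route name and payload shape are assumptions, and the placeholder result stands in for a real call to the loaded vectorizer and model:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/api/predict', methods=['POST'])
def api_predict():
    # Expect JSON like {"name": "Alice"} from the client
    payload = request.get_json(force=True)
    name = payload.get("name", "")
    # Placeholder prediction; the real app would use its model here
    return jsonify({"name": name.upper(), "prediction": "unknown"})
```

Such an endpoint can be exercised with curl or Flask's built-in test client without touching the HTML front end.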

Finally, to run our ML app locally we can use the command below, which will start a local server on a port that you can open in your browser.

python app.py

In conclusion, embedding ML models into web apps adds a new touch to your machine learning and data science skills. You can get the full code on GitHub.

Check out the entire video on our YouTube channel.



Thanks For Your Time

By Jesse E. Agbe (JCharis)

Jesus Saves
