Deploy a Deep Learning Forecast Model on Heroku by creating a web application using HTML/CSS, Flask & Python

After spending a lot of time training the best possible model, the one that yields the lowest prediction error, it is recommended that you deploy your model on a cloud platform with a frontend user interface. This completes your project, since it allows others to use your model and make predictions, which otherwise only you could do with the model sitting on your local machine. Heroku, GCP, and AWS are popular platforms on which many Data Scientists host web applications containing their trained machine learning models. This article goes over the process of productionizing your ML/Deep Learning model by deploying it on Heroku. We will create our frontend using HTML/CSS and use Flask to make API calls.

The code files, .txt files, .pkl files, and the folder structure that I have followed can be viewed at this Git link. The final product, i.e. the web app that I have created, can be viewed at this webapp link.

Let us begin!

This would require the following steps:

Step 1: Create a .py script to train your model and save it as a ‘pickle’ file (.pkl)

This script is where you follow the usual process of training the model and creating a pickle file out of it. In my case, I have trained a Deep Learning LSTM Neural Network model. LSTM stands for Long Short-Term Memory, a special type of Recurrent Neural Network that is well suited to multivariate time series forecasting problems. More details on LSTMs and how they work can be found here. The “model.py” script in my Git repository does this job for me. Once you are done training your best model, you need to save it as a pickle file so that the same model can be loaded later to make predictions. Import the pickle library for this purpose (import pickle). I am saving the model as a “model.pkl” file with a few lines of code, where “model_lstm” is my trained LSTM model.
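The save-and-reload step can be sketched as follows; a plain placeholder object stands in for the trained `model_lstm` so the snippet runs on its own:

```python
import pickle

# `model_lstm` stands in for the trained LSTM from model.py; a simple
# placeholder object is used here so the snippet is self-contained.
model_lstm = {"note": "placeholder for the trained LSTM model"}

# Serialize the trained model to disk ("wb" = binary write mode).
with open("model.pkl", "wb") as f:
    pickle.dump(model_lstm, f)

# Later, the same file can be read back to restore the model.
with open("model.pkl", "rb") as f:
    restored = pickle.load(f)
```

Note that unpickling a PyTorch model requires the defining class (here, the `LSTM_Model` class from model.py) to be importable wherever the file is loaded.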

Step 2: Create an HTML/CSS web page to serve as a frontend for your web application

You can write an HTML script and save it in a subfolder called “templates”, as I have done. How you design the frontend is completely up to you. You have full flexibility here to be as artistic as you want and make your UI as aesthetic as you like! :)

Be sure to place a form on your HTML page that allows users to input values or upload a suitable data file on which the predictions have to be made. In my case, I provide the option to upload a CSV file and submit it. The uploaded CSV is received under the form field named “file”, and the Flask app will refer to it by that name.
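A minimal sketch of such a form (the “/predict” action is my assumption; `enctype="multipart/form-data"` is required for file uploads):

```html
<!-- The action should match the Flask route that handles the POST -->
<form action="/predict" method="POST" enctype="multipart/form-data">
  <!-- The Flask app reads this upload via request.files["file"] -->
  <input type="file" name="file" accept=".csv" required>
  <button type="submit">Predict</button>
</form>
```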

Also, the output returned by our Flask API (the forecasted sales value in my case) needs to be displayed back on the HTML page. Here, “prediction_text” is the output variable returned by the Flask API.
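In the template, this can be as simple as a Jinja2 placeholder that Flask’s render_template fills in:

```html
<!-- Flask/Jinja2 substitutes the value passed as prediction_text -->
<p>{{ prediction_text }}</p>
```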

Step 3: Create .py script containing Flask App

In this script, we will read the pickle file containing the LSTM model and use it to make predictions on the data input by the user. In this case, the CSV file uploaded by the user is read into a pandas data frame and is used as input data on which the model prediction is made.
The “app.py” script in my Git repo contains the code for this.

Firstly, we import all the required packages such as Flask, torch, etc. We also need to import the “LSTM_Model” class, which has been defined in the model.py code using the PyTorch library. We then start off by creating a Flask app with the app = Flask(__name__) command.

Next, we read the pickle files created in step 1, which will be used for making predictions.

We then define the route ‘/’ and write a function to render the index.html template as the landing page of the application upon startup.

In the remaining part of the code, under the ‘POST’ method, we define a function that does the following:
a. process the uploaded CSV,
b. make the predictions using the model read from the pickle file, and
c. render the HTML template again by passing the prediction text.
Lastly, we run the Flask app with the app.run() command.
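Putting these pieces together, a minimal sketch of “app.py” might look like the following. The “/predict” route name is my assumption, and the model call is a placeholder: how the DataFrame is turned into LSTM inputs depends on the preprocessing used during training.

```python
import os
import pickle

import pandas as pd
from flask import Flask, render_template, request

app = Flask(__name__)

# Load the trained model saved in step 1 (guarded so the sketch also
# imports cleanly where model.pkl is absent).
model = None
if os.path.exists("model.pkl"):
    with open("model.pkl", "rb") as f:
        model = pickle.load(f)

@app.route("/")
def home():
    # Landing page: render the frontend from templates/index.html.
    return render_template("index.html")

@app.route("/predict", methods=["POST"])
def predict():
    # a. Read the uploaded CSV (form field name "file") into a DataFrame.
    df = pd.read_csv(request.files["file"])
    # b. Make predictions with the unpickled model. This call is a
    #    placeholder: adapt it to your model's preprocessing and forward pass.
    forecast = model(df)
    # c. Re-render the template, passing the prediction text back.
    return render_template(
        "index.html", prediction_text=f"Forecasted sales: {forecast}"
    )

# To test locally: app.run(); on Heroku, gunicorn starts the app instead.
```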

With the above 3 steps, the fully functional app is ready. The app can be viewed & tested on localhost by running this “app.py” script written in step 3.

But this does not complete our project. The model was always available to us on our local machine, and we could easily have written a few code snippets in a Jupyter notebook to read an input CSV file and make predictions.
The main purpose behind creating pickle files, building a Flask app, an HTML page, etc. was to make this model accessible to others and enable them to easily make predictions on their data with the click of a button. Hence, to truly make it accessible to everyone via the internet and easy to use, we need to deploy this application on a cloud platform.

Over the next few steps, we will work on deploying this application on Heroku.

Step 4: Create requirements.txt & Procfile

The requirements.txt file contains a list of all the libraries that the Heroku server needs to install for our app to run. Essentially, every library used by our code should be listed in this text file. Since the torch library is ~800 MB and Heroku’s free tier has a 500 MB slug size limit, I have used an alternative wheel link to install a basic CPU-only build of the torch library, which is much smaller. Hence, instead of specifying torch in my requirements file, I have specified the link https://download.pytorch.org/whl/cpu/torch-1.1.0-cp36-cp36m-linux_x86_64.whl, which installs a Torch build of ~100 MB.
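A requirements.txt along these lines would cover the libraries used in this walkthrough (the exact package list and versions depend on your own imports; numpy is my assumption):

```
flask
gunicorn
pandas
numpy
https://download.pytorch.org/whl/cpu/torch-1.1.0-cp36-cp36m-linux_x86_64.whl
```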

Next, we create a Procfile, a plain-text file named exactly “Procfile” with no extension, in the project root.

This Procfile instructs Heroku on how our application has to be run. It specifies the web server interface we will use (Gunicorn) and the name of our app module (app, from our app.py file name). I have enabled the ‘--preload’ option because of the memory constraint and slow app boot time; it loads the application code before the worker processes are forked.
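Under these assumptions, the Procfile is a single line (the first `app` is the module name, the second the Flask instance inside it):

```
web: gunicorn app:app --preload
```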

Step 5: Create Git Repository and upload all files to Git maintaining the same folder structure

With step 4, our folder structure and all the content are ready for deployment. In this step, we create a Git repository and upload all the files to it, maintaining the exact same folder structure, as this is our first step towards deployment.

Step 6: Deployment on Heroku

This brings us to the final step: deployment on Heroku. Firstly, visit this Heroku page to sign up for free and create a Heroku account. After the account has been created, create a New App with a unique name. Next, on the application dashboard, pick the Connect to GitHub option, log in to GitHub, grant permission, and choose the Git repository where you uploaded all your files.

Heroku then gets connected to our Git repository, making it ready for deployment.

Finally, click on “Deploy Branch” under the Manual deploy section. This starts the deployment process and lets us track its progress too. Once deployment is complete, the website link for the application will be ready and usable. Congratulations!

Debugging:

Sometimes, even after successful deployment, the app may not work as expected. This requires debugging, which can be done by viewing the application logs with the heroku logs command on the Heroku CLI (you will have to install the Heroku CLI first). In my case, “retail-sales-forecast-webapp” is the name of my Heroku app.
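With the app name from this walkthrough, tailing the logs looks like this (requires the Heroku CLI to be installed and logged in):

```
heroku logs --tail --app retail-sales-forecast-webapp
```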

Conclusion:
We have successfully productionized our Deep Learning Forecast Model on Heroku by creating a web application using HTML/CSS, Flask & Python. Thank you very much for reading!

You can connect with me on LinkedIn for any questions. I am also open to any feedback that you might have for me.

M.S. in Applied Data Science at Syracuse University | Data Science, Machine Learning, Deep Learning, Statistics, Business Intelligence, Data Analyst, Data Engg
