Deploying a Machine Learning Model as a Chatbot (Part 2)



Flask is a Python web framework that we use to build a lightweight, scalable webhook service for our chatbot. When the chatbot receives input from the user, it sends that input to the machine learning model through the Flask webhook to make a prediction.

Before creating our Flask webhook, we need to create a virtual environment to install Flask and its dependencies.

#Navigate to a folder on your computer
cd abdulquadri/oshoare/desktop/loanbot

#create the virtual environment (myflaskbot is my virtual environment name)
python -m venv myflaskbot
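Activate the environment before installing anything into it. The command below assumes a Unix-like shell; on Windows, run myflaskbot\Scripts\activate instead.

#activate the virtual environment
source myflaskbot/bin/activate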

Install the Dependencies

Using the requirement.txt file found on my GitHub page, install the dependencies.

pip install -r requirement.txt
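If you want to recreate the file yourself, the imports in app.py below suggest roughly this list (scikit-learn is an assumption, needed to unpickle the rfc.pkl model; pin the versions to whatever you trained and tested with):

flask
flask-cors
numpy
scikit-learn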

Create the app.py file

#import required libraries
import numpy as np
from flask import Flask, request, make_response, render_template, jsonify
import json
import pickle
from flask_cors import cross_origin

#instantiate flask
app = Flask(__name__)

#load the trained model
model = pickle.load(open('rfc.pkl', 'rb'))

@app.route('/')
def hello():
    return render_template('home.html')

if __name__ == '__main__':
    app.run()

After creating the app.py file, create a folder called templates in the same directory; Flask looks for HTML templates there, and render_template('home.html') needs it. A simple HTML file with an H1 tag saying “Hello World” is enough to see the output on the local server.
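A minimal templates/home.html along those lines could look like this (the page title is just a placeholder; only the H1 matters for the test):

<!-- templates/home.html: a minimal page for testing the local server -->
<html>
  <head>
    <title>Loan Bot</title>
  </head>
  <body>
    <h1>Hello World</h1>
  </body>
</html>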

Run the app.py file

On the terminal, run:

python app.py 

Verify that Flask has started on port 5000; the development server should report that it is running at http://127.0.0.1:5000/.

Create the predict & webhook methods

Flask uses the predict and webhook methods to return the prediction as JSON to the Dialogflow chatbot.

Copy the code below and paste it into your app.py file.

@app.route('/predict', methods=['POST'])
def predict():
    #expects a JSON list of feature rows and returns the model's predictions
    json_ = request.json
    query = json_
    prediction = model.predict(query)
    return jsonify({'prediction': list(prediction)})

#getting and sending the response to Dialogflow
@app.route('/webhook/', methods=['GET', 'POST'])
@cross_origin()
def webhook():
    req = request.get_json(silent=True, force=True)
    print(req)
    res = mlprediction(req)
    res = json.dumps(res, indent=4)
    #print(res)
    r = make_response(res)
    r.headers['Content-Type'] = 'application/json'
    return r
#processing the request from Dialogflow
def mlprediction(req):
    #sessionID = req.get('responseId')
    result = req.get("queryResult")
    #print(result)
    parameters = result.get("parameters")
    ApplicantIncome = parameters.get("Income")
    CoapplicantIncome = parameters.get("CoapplicantIncome")
    LoanAmount = parameters.get("LoanAmount")
    Credit_History = parameters.get("Credit_History")

    #map the yes/no credit history answer to the 1/0 value the model expects
    if str.lower(Credit_History) == 'yes':
        Credit_History = int(1)
    elif str.lower(Credit_History) == 'no':
        Credit_History = int(0)
    else:
        return {
            "fulfillmentText": "Error please start again and enter the correct information"
        }

    try:
        int_features = [ApplicantIncome, CoapplicantIncome, LoanAmount, Credit_History]
        final_features = [np.array(int_features)]
    except ValueError:
        return {
            "fulfillmentText": "Incorrect information supplied"
        }

    print(final_features)
    intent = result.get("intent").get('displayName')

    if intent == 'Default Welcome Intent - yes':
        prediction = model.predict(final_features)
        print(prediction)

        if prediction[0] == 'Y':
            status = 'Congratulations you are eligible for a loan 😀'
        else:
            status = 'We are sorry you are not eligible for a loan at the moment'

        fulfillmentText = status
        print(fulfillmentText)
        return {
            "fulfillmentText": fulfillmentText
        }

if __name__ == '__main__':
    app.run()
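To sanity-check the webhook locally without Dialogflow, you can post a request shaped like the one mlprediction expects. This is a minimal sketch: the field names mirror the code above, the sample values are made up, and it assumes the requests package is installed.

#test_webhook.py: send a Dialogflow-style request to the local webhook
import requests

sample_request = {
    "queryResult": {
        "parameters": {
            "Income": 5000,
            "CoapplicantIncome": 2000,
            "LoanAmount": 150,
            "Credit_History": "yes"
        },
        "intent": {"displayName": "Default Welcome Intent - yes"}
    }
}

response = requests.post("http://127.0.0.1:5000/webhook/", json=sample_request)
print(response.json())   #should print the fulfillmentText returned by the model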

Source: https://chatbotslife.com/deploying-a-machine-learning-model-as-a-chatbot-part-2-20038a9b39ef?source=rss—-a49517e4c30b—4
