Jupyter Notebook into PyCharm


PyCharm is the perfect choice to deploy your Jupyter Notebook chatbot as a web app.

Photo by Alex Knight on Unsplash

Motivation:

Jupyter Notebooks are useful for developing on your local machine. But how can other people access your chatbot if it only lives on your PC? In this post I am going to show you how to go live with your Jupyter Notebook chatbot using PyCharm.

Solution:

Our Jupyter chatbot’s job is to answer frequently asked questions. For this example, I used the Jupyter Notebook from Parul Pandey. I have only slightly amended that code to log unanswered user input. Let’s have a look:

import random
import string
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
import warnings
warnings.filterwarnings('ignore')
import nltk
from nltk.stem import WordNetLemmatizer
nltk.download('popular', quiet=True)  # for downloading packages
nltk.download('punkt')  # first-time use only
nltk.download('wordnet')  # first-time use only

Since this post focuses on how to convert a Jupyter Notebook into a PyCharm web app, I will not explain this code in much detail. All we need to know is that we have a simple chats.txt file for our question-answer mapping:

Every sentence is separated by a dot.
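The post shows the file only as a screenshot. Purely as a hypothetical illustration (these sentences are mine, not the author’s actual file), a chats.txt for an FAQ bot could look like this, with one answer per sentence:

We are open Monday to Friday from 9 am to 5 pm.
You can reach our support team by email around the clock.
We ship all orders within two business days.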

..so our bot can answer questions like:

If you ask a question the bot currently cannot answer:

..this input is logged in a simple text file, “userinputs”, so you can further train your bot to become smarter in the future:

Check your userinputs to further train your bot.

As you can see, this simple Jupyter Notebook chatbot works fine, even though it’s not very pretty inside the Notebook. Even worse, it’s difficult to share your bot with the world as long as it’s locked in a Jupyter Notebook. We’ll now make it more attractive using PyCharm.

Photo by SpaceX on Unsplash


For your convenience, you can download the PyCharm project from my post:

After you have downloaded that project, we will replace “views.py” with our Jupyter Notebook code from above. You can find the complete file on my GitHub.
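The downloaded project already wires the view to a URL, so you don’t have to touch the routing. Purely for orientation, here is a minimal sketch of what such a routing file could look like; the file name and route are my assumption, not taken from the post:

# app/urls.py (sketch, names assumed)
from django.urls import path
from . import views

urlpatterns = [
    path('', views.index, name='index'),  # serve the chatbot page at the app root
]

Now to the views.py itself: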

# PyCharm specific
from django.shortcuts import render
from django.http import HttpResponse
from django.template import loader

# Original Jupyter Notebook
import random
import string
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
import warnings
warnings.filterwarnings('ignore')
import nltk
from nltk.stem import WordNetLemmatizer
nltk.download('popular', quiet=True)  # for downloading packages
nltk.download('punkt')  # first-time use only
nltk.download('wordnet')  # first-time use only

Since we are going to have HTML input and output forms, we load the template and take care of the user input and the chat content:

# PyCharm specific
def index(request):
    template = loader.get_template('app.html')
    con = {}

    if request.method == 'POST':

        user_input = request.POST.get('input')
        chat = request.POST.get('content')
        print(user_input)
That’s how the Jupyter code looks in PyCharm

The rest remains our original Jupyter Notebook:

# Reading in the corpus
with open('app/chats.txt', 'r', encoding='utf8', errors='ignore') as fin:
    raw = fin.read().lower()

# Tokenisation
sent_tokens = nltk.sent_tokenize(raw)  # converts the corpus to a list of sentences
word_tokens = nltk.word_tokenize(raw)  # converts the corpus to a list of words

# Preprocessing: lemmatisation and punctuation removal
lemmer = WordNetLemmatizer()

def LemTokens(tokens):
    return [lemmer.lemmatize(token) for token in tokens]

remove_punct_dict = dict((ord(punct), None) for punct in string.punctuation)

def LemNormalize(text):
    return LemTokens(nltk.word_tokenize(text.lower().translate(remove_punct_dict)))

# Generating the response: build TF-IDF vectors over the corpus plus the user input
# and pick the most similar corpus sentence via cosine similarity
def response(user_response):
    robo_response = ''
    sent_tokens.append(user_response)
    TfidfVec = TfidfVectorizer(tokenizer=LemNormalize, stop_words='english')
    tfidf = TfidfVec.fit_transform(sent_tokens)
    vals = cosine_similarity(tfidf[-1], tfidf)
    idx = vals.argsort()[0][-2]
    flat = vals.flatten()
    flat.sort()
    req_tfidf = flat[-2]
    if req_tfidf == 0:
        robo_response = robo_response + " > I am sorry, I didn't catch that, but I am constantly learning.. Could you please try to say it in other words?"
        # log the unanswered input so the corpus can be extended later
        with open('app/userinputs.txt', 'a') as f:
            f.write(' > ' + user_response + '.' + '\n')
        return robo_response
    else:
        robo_response = robo_response + sent_tokens[idx]
        return robo_response

user_response = user_input.lower()
res = response(user_response)
print("FAQ Bot: ", end="")
print(res)
res = res.split(">", 1)
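The snippet stops at the split, and the post does not show the closing lines of the view. To make the round trip clear, here is a minimal sketch of how index() could hand the answer back to the template; the context key 'content' and the chat formatting are my assumptions, not the author’s code:

    # still inside index(), after res = res.split(">", 1) -- sketch, names assumed
    answer = res[-1].strip()                  # keep only the text after the keywords
    chat = (chat or '') + 'You: ' + user_input + '\n' + 'FAQ Bot: ' + answer + '\n'
    con = {'content': chat}                   # running chat displayed by app.html

    # for GET requests con stays empty and an empty chat window is rendered
    return HttpResponse(template.render(con, request))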

You might have noticed that we slightly changed our chats.txt file for PyCharm. It now looks like this:

Everything before the “>” consists of extra keywords that will not be displayed to the user.

The idea is just to add some extra keywords that help our bot give the right answers. These keywords are stripped before display (res.split(“>”, 1)), so the answers stay clean and tidy:

Access your bot via browser
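To illustrate what the keyword split does, here is a tiny standalone example (the sentence is made up for this sketch):

answer = "opening hours time open > We are open Monday to Friday from 9 am to 5 pm."
print(answer.split(">", 1)[-1].strip())
# prints: We are open Monday to Friday from 9 am to 5 pm.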

By the way, since some of you have asked me: instead of using localhost you can set your PC’s hostname and port in manage.py:

You can use YourPcName instead of localhost
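Another common way to make the Django development server reachable from other machines is to pass the host and port directly to runserver. A sketch (YourPcName is a placeholder for your machine’s hostname, and you may also need to add it to ALLOWED_HOSTS in settings.py):

python manage.py runserver YourPcName:8000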

.. so your colleagues can interact with your bot right away:

PyCharm and Jupyter Notebook going hand in hand (Photo by Andy Kelly on Unsplash)

Congratulations:

You have turned your Jupyter Notebook bot into a web app with PyCharm. If you want to learn more about the distance metrics used for chatbots you might want to read this post:

Many thanks for reading, I hope this was supportive! Any questions, please let me know. You can connect with me on LinkedIn or Twitter.

Originally published on my website DAR-Analytics.
