When you start reading about Machine Learning and Recurrent Neural networks there is a good chance that you will immediately want to write a bot. So, that's what I did!
I coded a Machine Learning model. But even after training it for a few days, I wasn't able to get results as cool as the bots I used to see on Twitter. So after digging into how the community was implementing these bots, I found that most of them were using Markovify.
After a few hours reading and coding I was able to get my own bot, called TelegramBotFriend, ready to run in a Docker container.
So what do we need to create our own bot?
- A data source.
- A text provider.
- A content generator.
- A platform-agnostic bot.
- A platform connector.
- An orchestrator.
The Data Source is the most important ingredient. If you don't have a rich body of text, your bot will sound quite dumb. By rich I mean over 10,000 lines of text; the more the better.
Where can you get that text from?
- In my case, I went to one of my friends' WhatsApp groups and exported the group history.
- Your Twitter account, and even your friend's Twitter accounts.
- Your blog.
You need to collect as much text as possible.
You might need to post-process that text because, as we all know, "garbage in, garbage out". You might want to remove URLs or garbage generated by the source — e.g. WhatsApp adds "Audio ignored" or "Image ignored" if you export the history without media — but you should totally leave emojis :).
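A cleanup pass like the one described above can be sketched in a few lines. The placeholder strings below are assumptions based on a typical WhatsApp export; adjust them to match whatever your own source produces.

```python
import re

def clean_line(line):
    """Strip URLs and export placeholders from one history line.

    "Audio ignored" / "Image ignored" are the kind of markers WhatsApp
    leaves behind when media is not exported (an assumption here —
    check your own export).
    """
    if "Audio ignored" in line or "Image ignored" in line or "<Media omitted>" in line:
        return ""
    # Drop URLs, collapse the leftover whitespace, keep everything else
    # (emojis included).
    line = re.sub(r"https?://\S+", "", line)
    return re.sub(r"\s+", " ", line).strip()

raw_lines = [
    "Check this out https://example.com so cool",
    "Audio ignored",
    "jajaja 😂",
]
cleaned = [c for c in (clean_line(l) for l in raw_lines) if c]
```

Running this over the three sample lines keeps only the two real messages, with the URL stripped out.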
The text provider is a class with the following signature:

```python
class TextProvider(object):
    def get_text(self):
        """Loads the data source and returns its content."""

    def add_text(self, text):
        """Adds new content to the data source."""
```
Why do we need a class for this simple task?
In my case I wanted to get that file from dropbox so the docker instance could be easily removed or recreated. You might want to use your local file system or Google Drive as long as you create a class that matches that signature.
I added an add_text function because we might want to feed our history file with new content we get from our platform.
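A concrete provider just has to honor that signature. Here is a minimal sketch backed by the local file system — a hypothetical stand-in for the Dropbox-backed one (the class name and constructor are my own, not from the project):

```python
class FileTextProvider(object):
    """TextProvider backed by a plain local file.

    Same get_text/add_text contract as the Dropbox version,
    just a different storage backend.
    """

    def __init__(self, path):
        self.path = path

    def get_text(self):
        # Loads the data source and returns its content.
        with open(self.path, encoding="utf-8") as f:
            return f.read()

    def add_text(self, text):
        # Appends new content so the bot can learn from fresh messages.
        with open(self.path, "a", encoding="utf-8") as f:
            f.write(text + "\n")
```

Swapping storage backends then becomes a one-line change in the orchestrator.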
Now we get to the fun part. The content generator will be responsible for learning from the source data and returning new content when called.
A content generator looks like this:

```python
class ContentGenerator(object):
    def load(self):
        """Sets up the instance using a TextProvider."""

    def get_message(self, text):
        """Expects a text and tries to return a message based on that input."""
```
As I mentioned in the introduction I implemented a content generator using Markovify, but it could be any other content generator, such as a trained RNN.
Most of these content generators will (optionally) accept a seed text and return a string based on it. You can pre-process that seed by removing stop words and then try to generate a meaningful reply.
I used NLTK to remove stop words, then I iterate over the remaining words trying to get valid content from Markovify; if that fails, I just ask for a text without a seed.
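The seed-then-fallback loop can be sketched without depending on either library. Here the tiny STOP_WORDS set stands in for NLTK's stopwords corpus, and `generate(seed)` stands in for Markovify's sentence generation (returning a sentence or None), so everything below is illustrative:

```python
# Stand-in for NLTK's stopwords list (the real bot loads it per language).
STOP_WORDS = {"the", "a", "an", "is", "to", "of"}

def get_message(generate, text):
    """Try each meaningful word of the input as a seed; fall back to no seed.

    `generate(seed)` is a placeholder for the Markovify call: it returns
    a generated sentence, or None when the seed produces nothing valid.
    """
    words = [w for w in text.lower().split() if w not in STOP_WORDS]
    for word in words:
        sentence = generate(word)
        if sentence:
            return sentence
    # No seed worked: ask for an unseeded sentence instead.
    return generate(None)
```

The same shape works whether `generate` wraps a Markov chain or a trained RNN.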
Though my bot is called TelegramBotFriend, after a few hours of coding I found out that it could work with any platform: I just needed to write all the logic in a platform-agnostic class and then code a platform-specific class that uses my abstract bot, which looks like this:
```python
class AbstractBot(object):
    def process_incoming_message(self, chat_id, text):
        """Expects a text and returns a reply using a ContentGenerator.

        chat_id can be set to 0 if the concept doesn't exist in the platform.
        """
```
The abstract bot will be called by the platform specific bot and:
- It will return a message if he was mentioned (yes, it's a he).
- It will join in and return a message after a random number of received messages; this is where the chat_id comes into play.
- It will, optionally, add the new message to the source data.
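The three behaviors above can be sketched as follows. This is a simplified version under my own assumptions — the real constructor also takes a meme provider and an auto_feed flag, so treat the argument names and the chattiness mechanism as illustrative:

```python
import random

class AbstractBot(object):
    """Sketch of the platform-agnostic bot logic."""

    def __init__(self, name, generator, text_provider=None, chattiness=10):
        self.name = name
        self.generator = generator
        self.text_provider = text_provider  # set this to auto-feed the source data
        self.chattiness = chattiness        # join in roughly every N messages
        self.counters = {}                  # messages seen per chat_id

    def process_incoming_message(self, chat_id, text):
        # Optionally grow the source data with every message we hear.
        if self.text_provider is not None:
            self.text_provider.add_text(text)
        # Always answer when mentioned.
        if self.name.lower() in text.lower():
            return self.generator.get_message(text)
        # Otherwise join in after a random number of messages per chat;
        # this is where chat_id comes into play.
        self.counters[chat_id] = self.counters.get(chat_id, 0) + 1
        if self.counters[chat_id] >= random.randint(1, self.chattiness):
            self.counters[chat_id] = 0
            return self.generator.get_message(text)
        return None
```

Keeping the per-chat counters in a dict keyed by chat_id is what lets one bot instance behave independently in several conversations.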
A platform-specific bot, such as a TelegramBot, a TwitterBot, etc., receives a platform-agnostic bot in its __init__ function and then calls the process_incoming_message method when needed.
My Telegram bot just follows this simple signature:
```python
class TelegramClient(object):
    def __init__(self, abstract_bot, token):
        ...
```
As you can see, the data source could be any data, the text provider any provider, a content generator any generator, and so on. Now we need something that grabs all the ingredients, sets them up, and connects them to each other.
In my implementation, a bot_friend.py script gets the arguments from the environment (super easy to set up in Docker) or from the command line (easy to debug) and then cooks the bot:
```python
text_provider = DropboxTextProvider(dropbox_access_token, dropbox_file)
provider = MarkovifyProvider(language, text_provider)
meme_text_provider = DropboxTextProvider(dropbox_access_token, meme_file)
meme_provider = MemeProvider(meme_text_provider)
abstract_bot = AbstractBot(name, provider, auto_feed, meme_provider)
TelegramClient(abstract_bot, token)
```
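The environment-or-command-line lookup can be done with a small helper like this one (variable names and argument positions here are hypothetical, not the project's actual ones):

```python
import os
import sys

def get_setting(name, position, default=None):
    """Read a setting from an environment variable (handy inside Docker),
    falling back to a positional command-line argument (handy when
    debugging locally)."""
    value = os.environ.get(name)
    if value is None and len(sys.argv) > position:
        value = sys.argv[position]
    return value if value is not None else default

# Illustrative usage — names are assumptions, not the project's real ones.
token = get_setting("TELEGRAM_TOKEN", 1)
language = get_setting("BOT_LANGUAGE", 2, default="en")
```

In Docker you'd pass these with `-e TELEGRAM_TOKEN=...`; locally you'd just append them as arguments.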
Yes, of course, I forgot to mention, I also have a MemeProvider. A bot is not a bot if it's not able to use memes.
You can use my bot as is by following these steps. First, create the Telegram bot itself — there are many tutorials on the web, but it's super easy:
- Open a chat with BotFather
- It will ask you for a name and it will give you the token
- Then you have to type /setprivacy and set it to disable in order to be able to read the incoming messages
Then, create the Dropbox app:
- Go to https://www.dropbox.com/developers/apps/create
- Create a new "Dropbox API" app
- It will give you a token
- Copy the chat history and the meme list file into the folder Dropbox created
You can set up a new Docker container using the instructions in the repo.
Note: The meme list file is a comma-separated file, something like this:
trigger word(s),meme url
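Parsing that format and matching triggers against incoming text takes only a few lines. This is a sketch under my own assumptions about how MemeProvider works, not the project's actual implementation:

```python
def load_memes(csv_text):
    """Parse the meme list: one 'trigger word(s),meme url' entry per line."""
    memes = []
    for line in csv_text.strip().splitlines():
        # Split on the first comma only, so the URL itself may contain commas.
        trigger, url = line.split(",", 1)
        memes.append((trigger.strip().lower(), url.strip()))
    return memes

def get_meme(memes, text):
    """Return the first meme whose trigger words all appear in the text."""
    lowered = text.lower()
    for trigger, url in memes:
        if all(word in lowered for word in trigger.split()):
            return url
    return None
```

Requiring every word of a multi-word trigger to appear keeps "hello there" from firing on a bare "hello".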
What did I like about this little project? First, although I'm not a Python developer, it helped me get into the language. If you're interested in the project, you're more than welcome to join on GitHub! I'd appreciate any comments, not only on the work itself, but also on code style and good practices.
Second, coding bots is fun. I mean, super fun!
And last but not least, having worked through the fundamentals, this project opens the door to learning new stuff. I can now go back to RNNs in TensorFlow and simply replace the MarkovifyContentProvider, or create a Slack bot by swapping the TelegramBot for a Slack one, or even have both running against the same AbstractBot instance.