Project Repository
Quick updates from Mustafa
Hello everyone! Since this is going to be the first of a long string of devblogs, I will introduce myself. My teammate Mustafa Tariq and I are working on a toolkit that allows game developers to easily integrate artificial intelligence into their games, letting players say essentially whatever they want and have the game react to what they say. We even plan to open-source it once we're done with most of what we want to do. If anyone is interested, we're building this as part of buildspace's nights&weekends s3. It will last 6 weeks, but we're more than willing to pursue this project even past that deadline!
The idea is simple: imagine yourself playing Skyrim, but whenever you talk to an NPC, you don't pick from a list of options; instead, you type whatever you want into a dialogue box, and the NPC reacts accordingly. We also want to create gameplay situations where you have to figure out how NPCs will react to the things you say. For example, say you need to enter Skyrim's city of Windhelm, but the guards won't let you pass. If you paid attention to the story, you would know that Windhelm is home to the Stormcloak Rebellion; so you could tell the guard (not through an option, you'll have to type it out) that you support the rebellion and are there to enlist. The guard would then sympathize with your cause and let you pass. You could also figure out how to make the guards like you based on their personalities, and maybe they'll let you, a friend, pass!
So, you may ask, how will we do it? We have one immediate milestone:
April 16th: MVP out. This is our hard deadline for nights&weekends; however, we're in the middle of exam season, so we'll have to scale the MVP down by quite a bit. The MVP will let a user load up an NLP model of their choice, create a character, and hold conversations with it, the context of which will be saved (like memory) and loaded back up whenever another conversation is triggered (see the rough sketch after these milestones). Devblog #1, which will go into technical details, will also be released then.
Beyond: User feedback and iteration. We will have a much clearer picture of the full timeline once we're done with the MVP and have gathered some feedback on it. So stick around to see what cool stuff we'll do!
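To make that MVP description a bit more concrete, here is a minimal sketch of what the flow could look like in Python, assuming a Hugging Face text-generation model. The Character class, its methods, and the memory file are purely hypothetical illustrations of the idea (load a model, create a character, save and reload conversation context), not the toolkit's actual API.

```python
# Hypothetical sketch of the MVP flow: load a model, create a character,
# hold a conversation, and persist its context between sessions.
# Names here are illustrative, not the toolkit's real API.
import json
from pathlib import Path

from transformers import pipeline  # any text-generation model works here


class Character:
    def __init__(self, name: str, persona: str, memory_file: str):
        self.name = name
        self.persona = persona
        self.memory_path = Path(memory_file)
        # Reload prior conversation context if this character has any saved.
        self.history = (
            json.loads(self.memory_path.read_text())
            if self.memory_path.exists()
            else []
        )

    def say(self, generator, player_line: str) -> str:
        # Build the prompt from the persona plus remembered dialogue turns.
        prompt = self.persona + "\n"
        for turn in self.history:
            prompt += f"Player: {turn['player']}\n{self.name}: {turn['npc']}\n"
        prompt += f"Player: {player_line}\n{self.name}:"

        out = generator(prompt, max_new_tokens=60, return_full_text=False)
        reply = out[0]["generated_text"].strip().split("\n")[0]

        # Save the new turn so the next conversation picks up where this left off.
        self.history.append({"player": player_line, "npc": reply})
        self.memory_path.write_text(json.dumps(self.history))
        return reply


# Usage: a Windhelm guard that remembers what the player told it last time.
generator = pipeline("text-generation", model="gpt2")  # placeholder model
guard = Character(
    name="Guard",
    persona="You are a Stormcloak guard at the gates of Windhelm.",
    memory_file="guard_memory.json",
)
print(guard.say(generator, "I support the rebellion and I'm here to enlist."))
```

The important part for the MVP is the memory file: each conversation appends to it, and the next conversation reloads it, so the character "remembers" earlier exchanges regardless of which model is plugged in.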
The project came about as we watched a YouTuber play a very old gem called Façade.
We thought that, with all the new technologies surrounding NLP, we could probably remake the game and expand on some of its ideas as a summer project (we're both still in university). But then we realized we could turn this into something bigger by creating a toolkit that lets just about anybody remake Façade in a short amount of time. Just imagine the number of Skyrim mods this would make possible! What we definitely want to see at the end is Telltale's The Walking Dead (or any Telltale game, for that matter) fully playable through player speech, with no text options.
We are totally aware of similar projects out there. However, our goals differ in that we are not trying to make a game per se; we're trying to make it easy for developers to make the games that we want to make! We're not exactly doing this because we're businessmen; we truly just want to work on something we really, really care about, and maybe it will turn into our careers after graduation. Because of this, our goal is to make this the best thing we have ever made, and to make sure everyone who uses it feels the same.
So, that's it for today! Stay tuned (maybe even subscribe) for April 16th to see what we've cooked up. From Devblog #1 onwards we will set up a feedback form for anyone who has tried it out, and maybe even a Discord channel if we garner enough interest. Also, if you want to follow the project more closely and see updates as we make them, follow Mustafa's Twitter account, where he posts all of them. Over and out.