Intro: Brainstorming
From the very start of the project, I knew that I wanted to create a software application with my new Samsung S9 phone as the target platform. It's a massive upgrade compared to my previous phone, but most importantly, it is AR capable. Augmented Reality mixes the real world with digital content to create unique experiences for users. After watching a few dog videos on YouTube, I started to think about owning a dog. I couldn't afford one at the time, so I decided the next best thing was to create an AR dog that I could take care of. I began taking notes on games I played growing up that featured a pet sim. Games like Nintendogs, Tamagotchi, and Pokémon Let's Go Pikachu and Eevee feature mechanics where the user interacts with a virtual pet.
The goal of this project was to make an interactive game where the user could care for a virtual pet on their phone and have it projected into the real world using AR tech. I figured Unity would be the best option. Unity is a game development engine that can build games for many different platforms, including Android. Google provides an SDK for Unity called ARCore, which makes a Unity project AR capable. The project was divided into four sprints: the first two for prototyping the idea and laying the foundation, one for solidifying mechanics and implementing UI elements, and the last mainly for bug fixing and writing this report. Each sprint except the last was two weeks long; the last was only one week, for a total of seven weeks of progress.
Sprint #1: First Phase of Prototyping
With the idea in mind and the goals set, I created a new project in Unity, named it Creature Care, and imported the Google ARCore SDK from the Google developer website. The SDK includes a sample scene named Hello AR. It has an AR controller so that, when the game runs, it maps out the environment with the phone's camera and places a 3D plane at the points the camera detects. From there, I added a script that spawns a dog house when the user touches the detected plane and, after a set amount of time, spawns the dog on the plane to give the effect of a virtual pet living in the real world. At this stage, the dog model was just a 3D sphere and the dog house a 3D box. I also gave the detected plane a grass texture. The next step was getting the dog to move on the plane. I used a raycast shot from the phone into the scene whenever a finger touch lined up with the detected plane, and the dog walks to the hit point. The last part of this sprint was creating a basic UI for the movement command and the laser pointer command, and building a low-poly dog for testing. The laser pointer command works exactly like the movement command except the dog moves to wherever the phone is pointing on the detected plane. The dog plays an animation whether it is moving or standing still. With the basic functionality of the app implemented, that wraps up the first phase of prototyping.
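To give an idea of how the placement works, here is a minimal sketch of the tap-to-place logic, following the touch-handling pattern from the Hello AR sample. The prefab fields and the spawn delay are placeholder values, not the exact ones used in the project.

```csharp
using GoogleARCore;
using UnityEngine;

// Sketch of the tap-to-place logic, based on the Hello AR sample's touch handling.
// Prefab references and the spawn delay are placeholders.
public class HousePlacer : MonoBehaviour
{
    public GameObject housePrefab;   // 3D box stand-in at this stage
    public GameObject dogPrefab;     // 3D sphere stand-in at this stage
    public float dogSpawnDelay = 3f; // seconds to wait before the dog appears

    private bool placed;

    void Update()
    {
        if (placed || Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the screen touch against the planes ARCore has detected.
        TrackableHit hit;
        if (Frame.Raycast(touch.position.x, touch.position.y,
                          TrackableHitFlags.PlaneWithinPolygon, out hit))
        {
            // Anchor the house to the plane so it stays put as tracking updates.
            Anchor anchor = hit.Trackable.CreateAnchor(hit.Pose);
            GameObject house = Instantiate(housePrefab, hit.Pose.position, hit.Pose.rotation);
            house.transform.parent = anchor.transform;

            // Spawn the dog on the same plane after a short delay.
            StartCoroutine(SpawnDog(hit.Pose.position));
            placed = true;
        }
    }

    private System.Collections.IEnumerator SpawnDog(Vector3 position)
    {
        yield return new WaitForSeconds(dogSpawnDelay);
        Instantiate(dogPrefab, position, Quaternion.identity);
    }
}
```

The movement and laser pointer commands reuse the same raycast idea: the hit point on the plane becomes the dog's destination instead of a spawn location.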
Sprint #2: Second Phase of Prototyping
The basic foundation was set in the last sprint, so the intent for this sprint was to create a dog with a better model so that more animations could be used and the dog would move more fluidly. The dog can now be petted if the user gets close enough and rubs the phone screen. The next addition was the ability to select the scale of the dog and the house. There are two scale modes in this game: desk mode, where the dog is shown at a small scale, and room mode, where the dog is shown at a large scale.
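A rough sketch of how the two scale modes can be handled is below; the actual scale values here are just placeholders.

```csharp
using UnityEngine;

// Minimal sketch of the two scale modes. The scale values are assumptions.
public class ScaleModeController : MonoBehaviour
{
    public enum Mode { Desk, Room }

    public Transform dog;
    public Transform house;
    public float deskScale = 0.2f; // small, fits on a desk
    public float roomScale = 1.0f; // large, roughly life size

    public void SetMode(Mode mode)
    {
        float scale = (mode == Mode.Desk) ? deskScale : roomScale;
        dog.localScale = Vector3.one * scale;
        house.localScale = Vector3.one * scale;
    }
}
```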
New interactions were added: a button that launches a ball in the direction the phone is pointed, which the dog picks up and carries to the dog house to return to the owner; a button that drops food for the dog to eat, at the phone's location in room mode or in front of the house in desk mode; and a button that makes the dog perform a flip. These buttons sit on a sidebar in the UI. With all these features implemented, the prototyping phase came to an end, and the focus shifted to completing the UI and implementing a health system for the dog.
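Below is a small sketch of what the ball-throw button handler can look like; the spawn offset and throw force are made-up values, and the ball prefab is assumed to have a Rigidbody.

```csharp
using UnityEngine;

// Rough sketch of the "throw ball" sidebar button handler.
public class BallLauncher : MonoBehaviour
{
    public Rigidbody ballPrefab; // assumed to carry a Rigidbody for physics
    public float throwForce = 4f;

    // Hooked up to the sidebar button's OnClick event.
    public void ThrowBall()
    {
        Transform cam = Camera.main.transform;

        // Spawn the ball a little in front of the phone and push it forward.
        Rigidbody ball = Instantiate(ballPrefab, cam.position + cam.forward * 0.2f, Quaternion.identity);
        ball.AddForce(cam.forward * throwForce, ForceMode.Impulse);
    }
}
```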
Sprint #3: Polish Time
In this sprint, the goal was to teach the player how to interact with the dog, so the focus was UI and presentation. After every move the dog makes, a UI ticker at the top of the screen tells the player what is going on. Whether the dog is moving, chasing the laser, fetching the ball, or eating food, the ticker informs the user of the action. Along with this, the UI received a much-needed facelift so the symbols are easier to understand. These additions make it clear to the user what to do and what's happening in the game.
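The ticker itself is simple; a minimal sketch is below, with illustrative messages rather than the exact wording used in the game.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the action ticker at the top of the screen.
public class ActionTicker : MonoBehaviour
{
    public Text tickerText; // the UI Text element at the top of the screen

    // The dog's controller calls this whenever its current action changes, e.g.:
    // ticker.ShowAction("Chasing the laser!");
    // ticker.ShowAction("Fetching the ball...");
    public void ShowAction(string action)
    {
        tickerText.text = action;
    }
}
```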
The dog also received a health system. As the dog moves around, it eventually gets tired and starts to cry. If the user doesn't pet the dog after a certain amount of time, it will also begin to cry. These are the two stats the player has to keep track of in order to maintain and care for the dog. A particle system of tears appears from the dog when it cries, and a particle system of hearts appears above the dog when it is petted. The user can also create a dog from the main menu: select its color, choose whether it has spots, and give it a name. I added sound effects and music in this sprint as well. With that completed, the rest of the project was mostly bug fixing and preparing for the end.
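A simplified sketch of how the two stats can be tracked is below. The thresholds, rates, and method names are assumptions rather than the exact values used in the game.

```csharp
using UnityEngine;

// Simplified sketch of the dog's needs: tiredness from moving and a petting timer.
public class DogNeeds : MonoBehaviour
{
    public ParticleSystem tears;
    public ParticleSystem hearts;
    public float tirednessPerMeter = 0.1f;
    public float maxTiredness = 10f;
    public float petInterval = 60f; // seconds before the dog misses attention

    private float tiredness;
    private float timeSincePet;
    private Vector3 lastPosition;

    void Start()
    {
        lastPosition = transform.position;
    }

    void Update()
    {
        // Walking builds up tiredness.
        tiredness += Vector3.Distance(transform.position, lastPosition) * tirednessPerMeter;
        lastPosition = transform.position;

        // Going too long without petting also upsets the dog.
        timeSincePet += Time.deltaTime;

        bool unhappy = tiredness >= maxTiredness || timeSincePet >= petInterval;
        if (unhappy && !tears.isPlaying) tears.Play();
        if (!unhappy && tears.isPlaying) tears.Stop();
    }

    // Called while the user rubs the screen close to the dog.
    public void Pet()
    {
        timeSincePet = 0f;
        tiredness = Mathf.Max(0f, tiredness - Time.deltaTime * 2f);
        if (!hearts.isPlaying) hearts.Play();
    }
}
```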
Sprint #4: The Final Sprint
This sprint consisted mainly of fixing bugs, getting feedback from people around campus, and preparing to launch the demo, which is being released on my Itch.io page, the Google Play Store, and GitHub. The feedback I received while walking around campus showing people the game was a huge boost. Watching people play and interact with the dog let me spot bugs like the dog getting stuck under the detected plane, the ball disappearing when thrown, and other small oddities. I got a lot of helpful feedback and was able to add useful additions such as a mute button for the music and a pet indicator that lets the player know when they are close enough to pet the dog. Also, when the game is in room mode, the player can activate the dog house and turn it into a large room that both the player and the dog can move around in. Using that feedback, I was able to complete the project on time and put together a list of things I want to add after the demo's release.
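For reference, the pet indicator boils down to a distance check between the phone and the dog; the sketch below uses a placeholder range value.

```csharp
using UnityEngine;

// Sketch of the pet indicator added after playtesting: show an icon when the
// phone is close enough to the dog to pet it.
public class PetIndicator : MonoBehaviour
{
    public Transform dog;
    public GameObject indicatorIcon; // UI element telling the player petting is possible
    public float petRange = 0.5f;    // meters; an assumed value

    void Update()
    {
        float distance = Vector3.Distance(Camera.main.transform.position, dog.position);
        indicatorIcon.SetActive(distance <= petRange);
    }
}
```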
Failures: What didn't go as planned
During this project, I ran into two major problems. The first was failing to set up the IBM Watson Speech to Text SDK in Unity. The goal was to capture what was said into the phone's microphone and, using the speech-to-text service, translate spoken commands into text. If the translated text contained a keyword like "move here", "flip", or "roll over", the dog would perform the given command. I got the SDK working in the Unity Editor but could never get it working on mobile, and since I couldn't find a fix, I had to cut the feature.
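For reference, the keyword matching I had planned would have looked something like the sketch below. The speech-to-text call itself is left out since the SDK never worked on mobile, and the dog methods here are stand-ins rather than the real controller.

```csharp
using UnityEngine;

// Stand-in for the real dog controller; method names are made up for illustration.
public class DogCommands : MonoBehaviour
{
    public void MoveToPointedLocation() { /* reuse the laser-pointer movement */ }
    public void Flip()                  { /* trigger the flip animation */ }
    public void RollOver()              { /* one of the tricks that was cut */ }
}

// Keyword matching intended to run on the transcript returned by the speech-to-text service.
public class VoiceCommands : MonoBehaviour
{
    public DogCommands dog;

    public void OnTranscript(string text)
    {
        string command = text.ToLowerInvariant();

        if (command.Contains("move here")) dog.MoveToPointedLocation();
        else if (command.Contains("flip")) dog.Flip();
        else if (command.Contains("roll over")) dog.RollOver();
    }
}
```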
The other major failure was the dog model. I made all the 3D models for this project in Blender, and the dog model I made doesn't have the best rigging or bone placement, so some of the tricks planned for the dog, such as roll over, play dead, and potty, had to be cut. Although it didn't make it into this demo release, a new dog model will be made in a future update. I also ran into problems testing the application on my phone while it was connected to the computer: the display on my phone would bug out, and I would have to reset both the phone and the computer to fix it. It was an annoying bug that bothered me throughout the creation of this app.
Future Update: What’s next for the game
Now that the demo is released, I can start thinking about new features I would like to see come to the game in the future. One goal I have in mind is adding more types of dogs and colors. Currently there is just one type of dog with several colors, and new breeds and colors will allow for more customization. A new dog model also means better animations and a better-rigged body, giving the dog more ways to convey feelings to the user. That will let me bring back ideas for the cut animations discussed in the failures section. I will also review the feedback given by testers and develop new goals going forward.