Forrester Terry

Building RoboPoll: A Journey with A.I., Quasar, and TypeScript

Introduction

I recently built a website using Quasar and Firebase and thought it would be fun to share, at a high level, what the process was like. The goal was to build an MVP version of the project, and to build it rapidly.

This was my journey with RoboPoll, a ‘simple’ website that combines fun with data. RoboPoll is designed to engage users with AI-generated summaries of news and a variety of topics.

About Me:
I have about 6 years of experience and am self-taught. I'm primarily familiar with JavaScript and Python.

The Requirements and Design

This is how I planned out my work:

1) Initial Concept and PoC: To play around with some ideas, I made a very simple PoC. I decided what I wanted to build and just slapped it together. As the days progressed, I noted down the flaws and considerations I found. This PoC wasn't about adding components or making anything super stable, but about shaping the core identity of the app and figuring out what I ultimately wanted to build.

2) Requirements: I began defining the app's requirements, carefully selecting features for the initial release (MVP) and planning subsequent versions. This approach prevented me from overwhelming myself with too many features at once. I cannot stress this enough: start with your requirements/design and really understand what you want and need to build before you seriously start putting down code.

3) Design, Workflow and Wireframes: Before diving into coding, I focused on the app's basic layout and feel. I drafted wireframes and detailed out workflows as step-by-step lists, ensuring a clear understanding of the functionalities needed. This stage was crucial in visualizing the end product and planning the implementation steps. It also helped me find anything I missed during requirements gathering.

4) High-Level Planning / Code Design: The app's components included a frontend website, AI automation running on a separate machine, and Firebase as the backend. I drew flow charts to map out how these components interacted, providing a clear picture of the app’s infrastructure. At this point, I also figured out what under-the-hood code I would likely need for both the backend and web application. This typically involved detailing out various modules and potential functions with examples of the data being processed or passed.
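For me, "detailing out various modules and potential functions" mostly meant writing type shapes and signatures before any real implementation. A rough, purely illustrative sketch of what those planning notes looked like (none of these names are the real ones):

```typescript
// Illustrative planning notes written as types and signatures rather than prose.
// They just capture the intended data flow:
// fetch news -> summarize with the LLM -> build a poll -> publish to Firestore.

interface NewsItem {
  title: string;
  source: string;
  url: string;
  publishedAt: string; // ISO date string from the news API
}

interface GeneratedPoll {
  question: string;
  options: string[];
  summary: string; // AI-generated summary of the source article
  sourceUrl: string;
}

declare function fetchNews(topic: string): Promise<NewsItem[]>;
declare function summarizeArticle(item: NewsItem): Promise<string>;
declare function buildPoll(item: NewsItem, summary: string): Promise<GeneratedPoll>;
declare function publishPoll(poll: GeneratedPoll): Promise<void>;
```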

Choosing the Tech Stack

My suggestions for picking a tech stack for a project:

  • make sure it works for what you need
  • make sure it works well in general and is supported
  • make sure you are comfortable with it.

If all three are true, then it is suitable.

My selected tech stack was:

  • TypeScript
  • Quasar Framework (VueJS)
  • NodeJS (for automation tools)
  • Ollama for LLM management and API
  • Docker (for containerizing my backend apps, in case I want to move them to the cloud)
  • Firebase (backend cloud infrastructure - Firestore, Hosting, Storage, Authentication, Algolia plugin)

Reasons for Some of These Tech Choices:

LLMs and Testing: The concept of AI generating content was intriguing. I wanted to explore how to integrate LLMs into a service effectively. My hardware setup, consisting of one RTX 3060 and one RTX 3050 GPU, was suitable for running inference with 7B and 13B parameter models. I used Ollama to manage the deployment of the LLM due to its simplicity and nice out-of-the-box configuration. Since the LLM does not need to be interacted with in real time, I chose something simple and off the shelf.
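For anyone curious what "off the shelf and simple" means in practice: Ollama exposes a local REST API, so the automation only has to make an HTTP call. Here's a minimal sketch, assuming Ollama's default port and a model that has already been pulled (the prompt and function name are just examples, not my exact setup):

```typescript
// Minimal sketch: requesting a summary from a local Ollama instance.
// Assumes Ollama is listening on its default port (11434) and the model
// below has already been pulled with `ollama pull`.
interface OllamaGenerateResponse {
  response: string;
  done: boolean;
}

async function summarize(article: string): Promise<string> {
  const res = await fetch('http://localhost:11434/api/generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'dolphin-mistral',
      prompt: `Summarize the following article in three neutral sentences:\n\n${article}`,
      stream: false, // return a single JSON object instead of a token stream
    }),
  });
  const data = (await res.json()) as OllamaGenerateResponse;
  return data.response.trim();
}
```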

Quasar Framework/VueJS: My familiarity with Vue and the availability of Quasar are why I chose Vue. Quasar offers cross-platform development ease, bringing together UI components and essential tools.

Ollama: It simplified LLM management, allowing me to focus more on application development rather than the nuances of model management.

Docker: Provided flexibility for backend automation, making it easy to shift operations between machines. Ollama is also offered as a Docker image, which made bundling my backend app and the Ollama runtime together easy.

Firebase: Offers a robust and scalable backend solution, especially valuable for its Cloud Functions. The free tier actually met most of my needs, but I ultimately went with a Pay-As-You-Go plan for the deployment.
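As a rough idea of what the Cloud Functions side looks like, here is a sketch of a callable function using the Firebase Functions v1 API. The `polls` collection and field names are placeholders for illustration, not the app's real schema:

```typescript
// Sketch of a callable Cloud Function that returns active polls.
// Collection and field names are illustrative.
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';

admin.initializeApp();

export const getActivePolls = functions.https.onCall(async (_data, context) => {
  // Only signed-in users may call this endpoint.
  if (!context.auth) {
    throw new functions.https.HttpsError('unauthenticated', 'Sign in first.');
  }

  const snapshot = await admin
    .firestore()
    .collection('polls')
    .where('status', '==', 'active')
    .limit(20)
    .get();

  return snapshot.docs.map((doc) => ({ id: doc.id, ...doc.data() }));
});
```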

TypeScript: For a project of this scale, TypeScript's ability to define objects and return types is invaluable. It enhances organization and bug detection. This became very beneficial during refactors of code.
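As a tiny, made-up example of why the typing paid off: once the core data shapes are interfaces, explicit return types turn most refactoring mistakes into compile errors instead of runtime surprises.

```typescript
// Illustrative only; not the app's real schema.
interface Poll {
  id: string;
  question: string;
  options: string[];
  votes: Record<string, number>; // option label -> vote count
}

// The explicit return type means any change to Poll that breaks this logic
// shows up immediately at compile time, and so does every stale call site.
function leadingOption(poll: Poll): { option: string; count: number } | null {
  let best: { option: string; count: number } | null = null;
  for (const option of poll.options) {
    const count = poll.votes[option] ?? 0;
    if (best === null || count > best.count) {
      best = { option, count };
    }
  }
  return best;
}
```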

Development: Challenges and Solutions

Going through this project, there were a few things that I had to deal with:

Development Fatigue and Keeping Organized: Managing workload and staying motivated was a challenge. Utilizing project management tools like Jira helped me maintain focus. Taking breaks when needed and tackling tasks one at a time kept the project manageable. When I would hit a brick wall, I would step away from the problem and return to it the next day.

LLM Accuracy and Speed: I employed multiple prompts to refine content quality. Testing different LLM models led me to choose dolphin-mistral for its performance. Regularly adjusting LLM settings and prompts was key to achieving the desired output quality. I also tracked the performance and quality of output across different configurations to find the best one.
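"Multiple prompts" here just means chaining passes: one call drafts the content, a second call checks it against the source and tightens it. A rough sketch of the idea (the prompts and helper are illustrative, not the exact ones RoboPoll uses):

```typescript
// Two-pass prompting sketch: draft, then self-check against the source article.
async function generate(model: string, prompt: string): Promise<string> {
  const res = await fetch('http://localhost:11434/api/generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model, prompt, stream: false }),
  });
  return ((await res.json()) as { response: string }).response;
}

async function refinedSummary(article: string): Promise<string> {
  const draft = await generate(
    'dolphin-mistral',
    `Summarize this article in three neutral sentences:\n\n${article}`
  );

  // Second pass: keep only claims that actually appear in the article.
  return generate(
    'dolphin-mistral',
    `Here is an article and a draft summary. Rewrite the summary so it only ` +
      `states facts found in the article, in three sentences.\n\n` +
      `Article:\n${article}\n\nDraft summary:\n${draft}`
  );
}
```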

App Complexity and Performance: Balancing performance and complexity was important. I used Firebase's atomic transactions to ensure accuracy and scalability in features like the voting system. TypeScript played a significant role in managing the complexity, keeping the code modular and organized enough that refactoring remained practical. For anything I was unsure about, I checked Google and used ChatGPT.
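A minimal sketch of what atomic transactions for the voting system can look like with the Firestore Admin SDK (the collection layout and field names here are assumptions for illustration, not RoboPoll's actual data model):

```typescript
// Sketch: record a vote and bump the counter atomically, rejecting double votes.
// Collection layout and field names are illustrative.
import * as admin from 'firebase-admin';

admin.initializeApp();
const db = admin.firestore();

async function castVote(pollId: string, option: string, userId: string): Promise<void> {
  const pollRef = db.collection('polls').doc(pollId);
  const voteRef = pollRef.collection('votes').doc(userId); // one vote doc per user

  await db.runTransaction(async (tx) => {
    // All reads must happen before any writes inside a transaction.
    const existingVote = await tx.get(voteRef);
    if (existingVote.exists) {
      throw new Error('User has already voted on this poll.');
    }

    tx.set(voteRef, {
      option,
      votedAt: admin.firestore.FieldValue.serverTimestamp(),
    });
    // A dotted field path increments a single counter inside the votes map.
    tx.update(pollRef, {
      [`votes.${option}`]: admin.firestore.FieldValue.increment(1),
    });
  });
}
```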

UX Focus: User experience was paramount. I conducted testing with friends and family, which was invaluable. Their feedback drove many refinements, particularly in navigation and intuitive design. The focus on key features that users cared most about was essential in making the app more engaging.

Security: I made sure to spend extra time focusing on secure implementations. Data sanitization and functionality abstraction via Cloud Functions were implemented. I also tried a tool called ZAP to do penetration and fuzz testing. Firestore and Firebase Storage rules are used to prevent unauthorized access to data.

Feature/Component Notes

Polls and News Sections: This was the heart of the app. The journey began here, with content populated through AI automation. Gradually, this section shaped the overall structure of the frontend.

User Authentication: Implemented smoothly with Firebase Authentication, enhancing user security and experience.
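For the client side, "implemented smoothly" really does come down to a handful of calls with the Firebase web SDK (v9 modular API). This is a generic sketch with Google sign-in as an example provider, not RoboPoll's exact setup:

```typescript
// Generic sketch of Firebase Authentication wiring on the client (web SDK v9).
import { initializeApp } from 'firebase/app';
import {
  getAuth,
  GoogleAuthProvider,
  signInWithPopup,
  signOut,
  onAuthStateChanged,
} from 'firebase/auth';

const app = initializeApp({
  // Your project's Firebase config object goes here.
});
const auth = getAuth(app);

export async function signIn(): Promise<void> {
  await signInWithPopup(auth, new GoogleAuthProvider());
}

export function logOut(): Promise<void> {
  return signOut(auth);
}

// React to sign-in/sign-out anywhere in the app (e.g. to toggle UI state).
onAuthStateChanged(auth, (user) => {
  console.log(user ? `Signed in as ${user.uid}` : 'Signed out');
});
```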

Comments, Save/Bookmark, Share Functionality: Building a robust commenting system was a complex task. It involved user association, scalable comment loading, and an intuitive UI. YouTube's comments were used as inspiration.
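"Scalable comment loading" mostly came down to cursor-based pagination. A sketch of the idea with the Firestore web SDK v9 (the collection path, field names, and page size are illustrative):

```typescript
// Sketch: load comments one page at a time using a query cursor.
import {
  getFirestore,
  collection,
  query,
  orderBy,
  startAfter,
  limit,
  getDocs,
  QueryDocumentSnapshot,
} from 'firebase/firestore';

const db = getFirestore();
const PAGE_SIZE = 20;

export async function loadComments(pollId: string, cursor?: QueryDocumentSnapshot) {
  const base = collection(db, 'polls', pollId, 'comments');
  const q = cursor
    ? query(base, orderBy('createdAt', 'desc'), startAfter(cursor), limit(PAGE_SIZE))
    : query(base, orderBy('createdAt', 'desc'), limit(PAGE_SIZE));

  const snapshot = await getDocs(q);
  return {
    comments: snapshot.docs.map((d) => ({ id: d.id, ...d.data() })),
    // Pass the last doc back in as the cursor to load the next page.
    nextCursor: snapshot.docs[snapshot.docs.length - 1],
  };
}
```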

Basic Analytics and SEO: Integration of Google Analytics provided insights into user behaviors, crucial for future enhancements.

Bug Reporting System: This feedback loop is vital for continuous improvement and user engagement.

Letting Users Add Polls to the Queue: A critical feature that underwent several iterations to ensure user engagement and interest.

Learning, Reflections, Next Steps

This project was a fun one. It taught me the importance of scalability, organization, and user-focused development. Docker and Firestore's advanced features were key learnings. Sharing the app and opening up to feedback, though initially daunting, proved to be incredibly valuable.

Additionally, using ChatGPT to brainstorm on implementation approaches and help with generating dummy data worked very well. For design inspiration, I looked at similar sites or sites with similar features to get a starting point.

I opted not to write unit tests for this app, but highly suggest carefully selected unit tests for anything really serious.

I'm now at a stage where user feedback will shape the app's future. My immediate goal is to gather insights from actual users and refine the app based on their experiences and suggestions. Things are very basic right now, and it would be great to learn more about what actual users care about.

Conclusion

Building the app has been filled with fun, challenges, and continuous learning. It's been rewarding to see it come together and to explore the potential of LLMs in a real-world application.

The site likely has a lot of bugs, and at this point in the process the next focus would be lots of end-to-end testing and bug fixes. The summaries and LLM automations still need to be fine-tuned as well.

I invite you to check out RoboPoll, though, and let me know if you have any questions or feedback.

Thank you for joining me on this adventure. Let me know your thoughts or opinions on finishing and structuring personal projects.

Some Quick Resources:

Ollama - used for running LLMs on my machine
OpenInterpreter - a neat project that lets you run a ChatGPT-like code environment on your own machine.
Quasar Framework - VueJS framework for making web, mobile, and desktop apps.
Wikipedia API - Used to give the LLM data
News API | NewsData.io - API used to collect news
