My Journey into Novel Creation Using Generative AI: Day 1

As someone passionate about exploring the capabilities of Generative AI, I recently embarked on a project to create literature using LLMs (Large Language Models). This is my first attempt at implementing a fully automated novel-writing pipeline, and I'm thrilled to share my experience so far. Here’s how it went on Day 1.

Choosing the Tools

For this project, I decided to use the Groq client because of the incredible speed of Groq's LPU (Language Processing Unit) hardware. While I could have run Ollama locally on my computer, Groq's efficiency was the clear winner for this task. Additionally, I brainstormed the overall implementation strategy with Microsoft Copilot, which proved invaluable in refining my ideas.
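
To give a concrete sense of the setup, here is a minimal sketch of what a Groq chat-completion call looks like from Python. The model name, temperature, and the `generate()` helper are placeholders for illustration, not necessarily the exact settings in my notebook.

```python
from groq import Groq

client = Groq(api_key="YOUR_GROQ_API_KEY")  # or rely on the GROQ_API_KEY env var

def generate(prompt: str, system: str = "You are a creative novelist.") -> str:
    """Send one chat-completion request to Groq and return the generated text."""
    response = client.chat.completions.create(
        model="llama3-70b-8192",  # placeholder model id; any model Groq serves works
        messages=[
            {"role": "system", "content": system},
            {"role": "user", "content": prompt},
        ],
        temperature=0.9,  # higher temperature for more creative prose
    )
    return response.choices[0].message.content
```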

To test my approach, I created a Jupyter Notebook and started building the foundation of the project. My long-term plan includes deploying the process with Streamlit for a more interactive experience.
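
For reference, the eventual Streamlit front end could be as simple as the sketch below; the `generate_novel` stub just stands in for the notebook pipeline and is not real code from it yet.

```python
import streamlit as st

def generate_novel(premise: str) -> str:
    """Stand-in for the notebook pipeline (outline -> chapters -> expansion)."""
    return f"*(Generated novel for premise: {premise})*"

st.title("One-Click Novel Generator")
premise = st.text_area("Story premise", "A lighthouse keeper finds a message in a bottle.")

if st.button("Generate novel"):
    with st.spinner("Writing..."):
        st.markdown(generate_novel(premise))
```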

Tackling the Challenges

Generating long, coherent, and engaging novels is no small feat, especially given the token limits of LLMs. My initial attempts fell short of the desired quality, so I refined the process as follows:

1. Outlining the Novel

The first step was generating a broad outline of the novel using the LLM. This outline served as the backbone for the entire story.
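
As a rough illustration, the outline step boils down to a single prompt like the one below, reusing the `generate()` helper from the earlier sketch. The wording, genre, and title are illustrative, not my exact prompt.

```python
OUTLINE_PROMPT = """Write a broad outline for a {genre} novel titled "{title}".
Cover the premise, the main characters, the central conflict, the climax,
and the resolution in roughly 300-500 words."""

outline = generate(OUTLINE_PROMPT.format(genre="mystery", title="The Cartographer's Secret"))
print(outline)
```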

2. Creating Chapters and Subplots

Next, I instructed the LLM to generate a list of chapters with detailed descriptions, including key scenes and subplots. This ensured the story had structure and direction.
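
One convenient way to do this is to ask for the chapter list as JSON so the rest of the pipeline can iterate over it. The sketch below is illustrative; real code needs error handling, since the model won't always return valid JSON.

```python
import json

CHAPTERS_PROMPT = """Based on the outline below, return a JSON array of chapters.
Each item must have "title", "description", "key_scenes" (list of strings),
and "subplots" (list of strings). Return only the JSON, no commentary.

Outline:
{outline}"""

chapters = json.loads(generate(CHAPTERS_PROMPT.format(outline=outline)))
for ch in chapters:
    print(ch["title"], "-", ch["description"][:80])
```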

3. Expanding Each Chapter

Each chapter was then developed into a comprehensive story. To maintain coherence, the LLM had access to the outline and previous chapters, ensuring continuity across the narrative.
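
Conceptually, the expansion loop looks something like this sketch, where each chapter prompt carries the outline plus a running summary of the story so far; the prompt wording is illustrative.

```python
CHAPTER_PROMPT = """You are writing chapter {num} of a novel: "{title}".

Novel outline:
{outline}

Story so far (summary):
{story_so_far}

Chapter description:
{description}

Write the full chapter in vivid, immersive prose."""

story_so_far = "Nothing yet; this is the first chapter."
full_chapters = []
for i, ch in enumerate(chapters, start=1):
    text = generate(CHAPTER_PROMPT.format(
        num=i,
        title=ch["title"],
        outline=outline,
        story_so_far=story_so_far,
        description=ch["description"],
    ))
    full_chapters.append(text)
    # Compress the growing context so later chapters still fit in the window.
    story_so_far = generate(
        f"Summarize the story so far in about 200 words:\n{story_so_far}\n\n{text}"
    )
```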

4. Overcoming Token Limitations

Despite the automation, token limitations occasionally interrupted the flow. To address this, I implemented summarization with overlapping chunking. This technique allowed the model to work within its constraints while retaining contextual integrity.
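
In essence, the text is split into windows that share an overlap, each window is summarized, and the partial summaries are merged. The sketch below shows the idea; the chunk sizes are illustrative, not the values from my notebook.

```python
def chunk_text(text: str, chunk_size: int = 4000, overlap: int = 500) -> list[str]:
    """Split text into character windows that overlap so context isn't lost at the seams."""
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

def summarize_long_text(text: str) -> str:
    """Summarize each overlapping chunk, then merge the partial summaries into one."""
    partials = [
        generate(f"Summarize this passage in about 100 words:\n{chunk}")
        for chunk in chunk_text(text)
    ]
    return generate(
        "Combine these partial summaries into one coherent summary:\n" + "\n".join(partials)
    )
```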

The Results: A Promising Start

The automation worked beautifully! With just a single click, the entire story was generated as the LLMs "conversed" among themselves. The resulting novel was creative, with:

  • A compelling plot and well-executed climax and twists.
  • Vivid descriptions that set immersive scenes.

However, there were areas for improvement:

  • Rushed Pacing: While the stage-setting was effective, transitions felt abrupt as the story moved to the next plot point.
  • Lack of Emotional Depth: The narrative occasionally felt mechanical, missing the nuanced emotions that make characters and events truly resonate.

Plans for Day 2

To address these issues, I’ll focus on:

  1. Better Prompt Engineering: Crafting prompts that encourage the LLM to add more emotional depth and smooth transitions.
  2. Refining Chapters: Feeding the chapters back into the LLM for iterative enhancements, focusing on making them more emotionally engaging and less rushed.
  3. Integrating Web Search and Databases: Exploring ways to incorporate real-world data into the pipeline for creating other forms of literature.

Conclusion

This is just the beginning of my journey using LLMs for creative writing. I'm excited about the potential of this technology and eager to see where it takes me. I'll be sure to share my progress and learnings along the way.
