
Run a Local Neuro Sama with Just 3 LOCs Using LivinGrimoire

Welcome to the coderpunk underground. We don’t wait for cloud APIs or corporate permission slips. We build our own AI companions—locally, modularly, and with full control. With the LivinGrimoire software design pattern, you can run a fully functional Neuro Sama on your machine using just 3 lines of code.

No subscriptions. No surveillance. Just raw Python and modular skill injection.


Once you've added the LivinGrimoirePacket and the main file to your project, inject the skills you want for your AI. For the Neuro Sama setup, the skills are registered in a file with DLC in its name:

def add_DLC_skills(brain: Brain):
    brain.add_skill(DiSTT(brain))  # speech to text skill
    brain.add_skill(DiLLMOver())   # local LLM skill
    brain.add_skill(DiTTS_narakeet())  # text to speech skill
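For context, the main file is where the famous 3 lines live: build a Brain, hand it the DLC skills, and keep feeding it input. Here is a rough sketch of what that file can look like. The import paths, the DLC file name, and the loop method name (do_it here) are assumptions drawn from the public DLC examples, so adjust them to match your copy of the LivinGrimoirePacket:

from LivinGrimoirePacket.LivinGrimoire import Brain  # assumed import path
from DLC_NeuroSama import add_DLC_skills  # hypothetical name for your DLC file

brain = Brain()  # 1: spin up the brain
add_DLC_skills(brain)  # 2: inject the skills (STT, LLM, TTS)
while True:
    brain.do_it(input(), "", "")  # 3: route input through the skills; method name is an assumption

Construct, inject, loop: that is the whole setup. Everything else lives in the skill files.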

You can even add skills without writing any code: just copy the skill files and their skill DLC files into the DLC directory of the Python project.

There are lots of skill DLC examples here:

LivinGrimoire DLC Examples

The example above only uses 3 skills, but you can keep stacking additional skills, inching your way from AI toward AGI.

For example, LLMs can't access the system clock, so what if you want her to tell the time?

Just add a time utility skill:

brain.add_skill(DiTime())

And poof: now she can tell the time, the day, the date, and even predict which day of the week it will be 20 days from now.
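If you are curious what a skill like DiTime wraps under the hood, it is ordinary datetime arithmetic. Here is a minimal, library-free sketch of that kind of logic; the real DiTime in the DLC examples may phrase its triggers and replies differently:

from datetime import datetime, timedelta

def time_reply(query: str) -> str:
    # plain-Python sketch of the answers a time utility skill can produce
    now = datetime.now()
    if "time" in query:
        return now.strftime("the time is %H:%M")
    if "date" in query:
        return now.strftime("today is %A, %B %d, %Y")
    if "20 days" in query:
        # day-of-week prediction, e.g. "in 20 days it will be a Friday"
        return (now + timedelta(days=20)).strftime("in 20 days it will be a %A")
    return ""  # let other skills handle everything else

The point of the pattern is that the LLM never has to know the clock exists; the utility skill answers those queries directly.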


Would you like to know more?

Here are the 24 LivinGrimoire wiki pages:

LivinGrimoire Wiki
