
Ollama - Custom Model - llama3.2

Import the ollama library.


import ollama


Create a class to configure custom models.

Methods:

  • __init__: Initializes the configuration with the base model, the custom name, the system prompt, and the temperature.
  • name_custom: Returns the custom name.
  • get_description: Builds the Modelfile text (the FROM, SYSTEM, and PARAMETER lines).

class ModelFile:
    def __init__(self, model: str, name_custom: str, system: str, temp: float = 0.1) -> None:
        self.__model = model
        self.__name_custom = name_custom
        self.__system = system
        self.__temp = temp

    @property
    def name_custom(self):
        return self.__name_custom

    def get_description(self):
        return (
            f"FROM {self.__model}\n"
            f"SYSTEM {self.__system}\n"
            f"PARAMETER temperature {self.__temp}\n"
        )

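For reference, here is a quick check of what get_description emits; the custom name and system prompt below are purely illustrative:

# Illustrative only: inspect the Modelfile text produced by the class above.
example = ModelFile(
    model='llama3.2',
    name_custom='demo_assistant',           # hypothetical custom name
    system='You are a concise assistant.',  # hypothetical system prompt
)
print(example.get_description())
# FROM llama3.2
# SYSTEM You are a concise assistant.
# PARAMETER temperature 0.1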
Create a function to list all available models. It returns the list of models registered in Ollama.

def ollama_list() -> list:
    response_ollama = ollama.list()
    return response_ollama['models']
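As a quick sanity check, you can print the names of the models that are already installed. This assumes each entry exposes a 'name' key, the same convention used in check_custom_model below; field names can differ between versions of the ollama client.

# Illustrative usage: show which models are currently registered in Ollama.
for model in ollama_list():
    print(model['name'])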

Create a function to build a custom model based on the passed configuration.


def ollama_build(custom_config: ModelFile) -> None:
    ollama.create(
        model=custom_config.name_custom,
        modelfile=custom_config.get_description()
    )
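A short usage sketch (the configuration values are illustrative). Ollama registers the result under the tag <name_custom>:latest, which is what the existence check below relies on:

# Illustrative usage: build a custom model from a ModelFile configuration.
config = ModelFile(
    model='llama3.2',
    name_custom='demo_assistant',  # hypothetical name; becomes demo_assistant:latest
    system='You are a concise assistant.',
)
ollama_build(config)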

Create a function to check if the custom model exists.


def check_custom_model(name_model) -> None:
    models = ollama_list()
    models_names = [model['name'] for model in models]
    if f'{name_model}:latest' in models_names:
        print('Exists')
    else:
        raise Exception('Model does not exist')

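If raising an exception feels too strict, a boolean variant works just as well; this sketch follows the same 'name' key and ':latest' tag convention as the function above:

# Illustrative variant: report existence instead of raising.
def custom_model_exists(name_model) -> bool:
    models_names = [model['name'] for model in ollama_list()]
    return f'{name_model}:latest' in models_names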

Create a function to generate a response based on the provided model and prompt.


def ollama_generate(name_model, prompt) -> None:
    response_ollama = ollama.generate(
        model=name_model,
        prompt=prompt
    )
    print(response_ollama['response'])

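The ollama client also supports streaming responses; if you prefer to print tokens as they arrive, a sketch along these lines should work (each streamed chunk carries the same 'response' key as the non-streaming call above):

# Illustrative variant: stream the answer chunk by chunk instead of waiting for the full text.
def ollama_generate_stream(name_model, prompt) -> None:
    for chunk in ollama.generate(model=name_model, prompt=prompt, stream=True):
        print(chunk['response'], end='', flush=True)
    print()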

Create a function to delete a model by name.


def ollama_delete(name_model) -> None:
    ollama.delete(name_model)

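Deleting a model that is not registered will typically fail, so it can be worth guarding the call; ollama.ResponseError is the exception the client raises for server-side errors. Treat this as a sketch:

# Illustrative variant: ignore the failure if the model is already gone.
def ollama_delete_safe(name_model) -> None:
    try:
        ollama.delete(name_model)
    except ollama.ResponseError as error:
        print(f'Could not delete {name_model}: {error}')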

Create a function to order the steps of building, verifying, and using the model.


def main(custom_config: ModelFile, prompt) -> None:
    ollama_build(custom_config)
    check_custom_model(custom_config.name_custom)
    ollama_generate(custom_config.name_custom, prompt)
    # ollama_delete(custom_config.name_custom)


Set the prompt and configure the ModelFile.

Input:

  • Model: llama3.2
  • Custom name: xeroxvaldo_sharopildo
  • System: Smart anime assistant.

Output: Runs the main function to create the model, check for its existence, and generate a response to the prompt.


if __name__ == "__main__":
    prompt: str = 'Who is Naruto Uzumaki ?'
    MF: ModelFile = ModelFile(
        model='llama3.2',
        name_custom='xeroxvaldo_sharopildo',
        system='You are very smart assistant who knows everything about Anime',
    )
    main(MF, prompt)


Output:

Naruto Uzumaki is the main protagonist of the popular Japanese manga and anime series "Naruto," created by Masashi Kishimoto. He is a young ninja from the Hidden Leaf Village, who dreams of becoming the Hokage, the leader of his village.

Naruto is known for his determination, bravery, and strong sense of justice. He is also famous for his unique ninja style, which involves using his Nine-Tails chakra (a powerful energy that he possesses) to enhance his physical abilities.

Throughout the series, Naruto faces numerous challenges and adversaries, including other ninjas from different villages, as well as powerful enemies like Akatsuki members and the Ten-Tails' jinchuriki. Despite facing many setbacks and failures, Naruto perseveres and grows stronger with each challenge he overcomes.

Naruto's character development is a central theme of the series, as he learns valuable lessons about friendship, sacrifice, and the true meaning of being a ninja. His relationships with his teammates, Sakura Haruno and Sasuke Uchiha, are particularly significant in shaping his personality and growth.

The Naruto series consists of two main arcs: the original "Naruto" arc (2002-2007) and the "Naruto Shippuden" arc (2007-2014). The latter is a continuation of the first arc, with Naruto now older and more powerful.

Overall, Naruto Uzumaki is an iconic anime character who has captured the hearts of millions worldwide. His inspiring story and memorable personality have made him one of the most beloved characters in anime history!
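Because temp defaults to 0.1, the answers tend to be fairly deterministic. If you want more varied responses, you can pass a higher temperature when building the configuration; the name and value below are just an example:

# Illustrative: a second configuration with a higher temperature for more creative answers.
creative = ModelFile(
    model='llama3.2',
    name_custom='xeroxvaldo_sharopildo_creative',  # hypothetical name
    system='You are very smart assistant who knows everything about Anime',
    temp=0.8,
)
main(creative, prompt)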


Here is the complete script:

import ollama


class ModelFile:
    def __init__(self, model: str, name_custom: str, system: str, temp: float = 0.1) -> None:
        self.__model = model
        self.__name_custom = name_custom
        self.__system = system
        self.__temp = temp

    @property
    def name_custom(self):
        return self.__name_custom

    def get_description(self):
        return (
            f"FROM {self.__model}\n"
            f"SYSTEM {self.__system}\n"
            f"PARAMETER temperature {self.__temp}\n"
        )


def ollama_list() -> list:
    response_ollama = ollama.list()
    return response_ollama['models']

def ollama_build(custom_config: ModelFile) -> None:
    ollama.create(
        model=custom_config.name_custom,
        modelfile=custom_config.get_description()
    )


def check_custom_model(name_model) -> None:
    models = ollama_list()
    models_names = [model['name'] for model in models]
    if f'{name_model}:latest' in models_names:
        print('Exists')
    else:
        raise Exception('Model does not exist')

def ollama_generate(name_model, prompt) -> None:
    response_ollama = ollama.generate(
        model=name_model,
        prompt=prompt
    )
    print(response_ollama['response'])

def ollama_delete(name_model) -> None:
    ollama.delete(name_model)

def main(custom_config: ModelFile, prompt) -> None:
    ollama_build(custom_config)
    check_custom_model(custom_config.name_custom)
    ollama_generate(custom_config.name_custom, prompt)
    #ollama_delete(custom_config.name_custom)

if __name__ == "__main__":
    prompt: str = 'Who is Naruto Uzumaki ?'
    MF: ModelFile = ModelFile(
        model='llama3.2',
        name_custom='xeroxvaldo_sharopildo',
        system='You are very smart assistant who knows everything about Anime',
    )
    main(MF, prompt)



Author's notes

Thank you very much for reading this far. If you liked the post, a like and a share would mean a lot. If you didn't, please let me know as well; without feedback I can't tell what worked, and it helps me see where I should improve my posts. Thank you.




About the author:

A little more about me...

I have a bachelor's degree in Information Systems, and during college I was exposed to a range of technologies. Along the way I took an Artificial Intelligence course, where I had my first contact with machine learning and Python, and learning about this area became a passion. Today I work with machine learning and deep learning, developing communication software. I also created a blog where I write about the subjects I am studying and share them to help other users.

I'm currently learning TensorFlow and Computer Vision

Fun fact: I love coffee.


