
Julien Simon

Originally published at julsimon.Medium

Arcee.ai Llama-3.1-SuperNova-Lite is officially the #1 8-billion parameter model on the Hugging Face LLM Leaderboard

The results came in 15 minutes ago. Arcee.ai Llama-3.1-SuperNova-Lite is officially the 🥇 8-billion parameter model on the Hugging Face LLM Leaderboard.

➡ Model page: https://huggingface.co/arcee-ai/Llama-3.1-SuperNova-Lite

➡ Notebook to deploy on SageMaker (GPU): https://github.com/arcee-ai/aws-samples/blob/main/model_notebooks/sample-notebook-llama-supernova-lite-on-sagemaker.ipynb

➡ Notebook to deploy on SageMaker (Inferentia2): https://github.com/arcee-ai/aws-samples/blob/main/model_notebooks/sample-notebook-llama-supernova-lite-on-sagemaker-inf2.ipynb
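The notebooks above walk through the full deployment. As a rough sketch of what calling such an endpoint looks like once it is up, here is a minimal example: it builds an OpenAI-style chat payload (the format commonly accepted by Hugging Face LLM serving containers) and shows a boto3 invocation. The endpoint name and parameter defaults are illustrative assumptions, not taken from the notebooks.

```python
import json


def build_payload(user_prompt: str, max_new_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat payload for an LLM endpoint.

    The field names follow the widely used chat-completions format;
    treat them as an assumption and check your container's docs.
    """
    return {
        "messages": [{"role": "user", "content": user_prompt}],
        "max_tokens": max_new_tokens,
    }


# Hypothetical invocation (requires AWS credentials and a deployed
# endpoint; the endpoint name below is a placeholder):
#
# import boto3
# client = boto3.client("sagemaker-runtime")
# response = client.invoke_endpoint(
#     EndpointName="supernova-lite-endpoint",  # placeholder name
#     ContentType="application/json",
#     Body=json.dumps(build_payload("Why is the sky blue?")),
# )
# print(json.loads(response["Body"].read()))
```

The same payload shape works whether the endpoint runs on GPU or Inferentia2; only the container image behind the endpoint differs.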

This model is a scaled-down version of our SuperNova Llama-3.1-70B, which we believe is the best 70B model available today.

➡ SuperNova blog post: https://blog.arcee.ai/meet-arcee-supernova-our-flagship-70b-model-alternative-to-openai/

➡ Deploy SuperNova from the AWS Marketplace: https://aws.amazon.com/marketplace/pp/prodview-sb2ndlhwmzbhi

#ai #slm #byebyeopenai
