
Julien Simon

Originally published at julsimon.Medium

Arcee.ai Llama-3.1-SuperNova-Lite is officially the #1 8-billion parameter model on the Hugging Face LLM Leaderboard

The results came in 15 minutes ago. Arcee.ai Llama-3.1-SuperNova-Lite is officially the 🥇 8-billion parameter model on the Hugging Face LLM Leaderboard.

➡ Model page: https://huggingface.co/arcee-ai/Llama-3.1-SuperNova-Lite

➡ Notebook to deploy on SageMaker (GPU): https://github.com/arcee-ai/aws-samples/blob/main/model_notebooks/sample-notebook-llama-supernova-lite-on-sagemaker.ipynb

➡ Notebook to deploy on SageMaker (Inferentia2): https://github.com/arcee-ai/aws-samples/blob/main/model_notebooks/sample-notebook-llama-supernova-lite-on-sagemaker-inf2.ipynb
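
If you just want a quick idea of what the GPU notebook does before opening it, here is a minimal deployment sketch using the SageMaker Python SDK and the Hugging Face TGI container. The instance type, container version, and generation parameters are illustrative assumptions; the notebooks above are the reference.

```python
import sagemaker
from sagemaker.huggingface import HuggingFaceModel, get_huggingface_llm_image_uri

# Assumes this runs in a SageMaker notebook with an execution role attached
role = sagemaker.get_execution_role()

# Hugging Face TGI serving container (version is an assumption, pick a current one)
image_uri = get_huggingface_llm_image_uri("huggingface", version="2.2.0")

model = HuggingFaceModel(
    image_uri=image_uri,
    role=role,
    env={
        "HF_MODEL_ID": "arcee-ai/Llama-3.1-SuperNova-Lite",  # model from the page above
        "SM_NUM_GPUS": "1",
        "MAX_INPUT_LENGTH": "4096",
        "MAX_TOTAL_TOKENS": "8192",
    },
)

# A single-GPU instance is enough for an 8B model (instance type is an assumption)
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",
)

response = predictor.predict({
    "inputs": "Give me a one-sentence summary of what a small language model is.",
    "parameters": {"max_new_tokens": 256, "temperature": 0.7},
})
print(response[0]["generated_text"])

# Delete the endpoint when you're done to avoid charges
predictor.delete_model()
predictor.delete_endpoint()
```

The Inferentia2 notebook follows the same general pattern, swapping in a Neuron serving image and an ml.inf2 instance type.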

This model is a scaled-down version of our SuperNova Llama-3.1-70B, which we believe is the best 70B model available today.

➡ SuperNova blog post: https://blog.arcee.ai/meet-arcee-supernova-our-flagship-70b-model-alternative-to-openai/

➡ Deploy SuperNova from the AWS Marketplace: https://aws.amazon.com/marketplace/pp/prodview-sb2ndlhwmzbhi

#ai #slm #byebyeopenai

