Chandler for TimeSurge Labs


How I Use Google's Gemini Pro with LangChain

Google's Gemini Pro is one of the newest LLMs publicly available, and to the surprise of some, it's relatively price-competitive.

Gemini Pro Pricing

It's effectively free while you're developing, and once your development is complete it's relatively cheap: around $0.00025 per 1K characters (characters, not tokens like OpenAI), which is slightly more expensive than GPT-3.5-Turbo, and $0.0025 per image, which is effectively the same as OpenAI's GPT-4.
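
To put the character-based billing in perspective, here's a rough back-of-the-envelope estimate. The helper below is just an illustration, not an official pricing calculator; the rate constant mirrors the $0.00025 per 1K characters figure above and would need updating if pricing changes.



PRICE_PER_1K_CHARS = 0.00025  # text rate at the time of writing

def estimate_text_cost(prompt: str, response: str) -> float:
    """Rough dollar cost of one text call under character-based pricing."""
    total_chars = len(prompt) + len(response)
    return total_chars / 1000 * PRICE_PER_1K_CHARS

# A 500-character prompt plus a 1,500-character response is 2,000 characters,
# or roughly 2 * $0.00025 = $0.0005 for the call.
print(estimate_text_cost("x" * 500, "y" * 1500))  # 0.0005
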

Okay, I get it, how do I use it?

Let's start fresh with a new project. Assuming you're using Python >= 3.10, let's initialize a new virtual environment.



python -m venv env
source env/bin/activate



And let's install LangChain and our dotenv file loader first.



pip install langchain python-dotenv



Once that's done, we can install Gemini Pro's libraries and its LangChain adapter.



pip install google-generativeai langchain-google-genai



Next, you need to acquire an API key, which you can do in Google MakerSuite. In the top left of the page, you should see a "Get API Key" button.

API Key Button

Click that button, then click "Create API Key in new project".

New Key Button

Copy the new API Key and save it to a .env file in your project directory.



GOOGLE_API_KEY=new_api_key_here



Now we can create a script that calls Gemini Pro via LangChain.



import os

from dotenv import load_dotenv
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# Load GOOGLE_API_KEY from the .env file into the environment
load_dotenv()

GOOGLE_API_KEY = os.getenv("GOOGLE_API_KEY")

# Chat model wrapper around Gemini Pro
llm = ChatGoogleGenerativeAI(model="gemini-pro", google_api_key=GOOGLE_API_KEY)

# Prompt with a single {topic} placeholder
tweet_prompt = PromptTemplate.from_template("You are a content creator. Write me a tweet about {topic}.")

# Chain the prompt into the model; verbose=True logs the formatted prompt
tweet_chain = LLMChain(llm=llm, prompt=tweet_prompt, verbose=True)

if __name__ == "__main__":
    topic = "how ai is really cool"
    resp = tweet_chain.run(topic=topic)
    print(resp)
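
Depending on your LangChain version, you can also skip the chain and call the model directly. Here's a minimal sketch, assuming the same llm object from the script above:



# Calling the chat model directly, without an LLMChain
resp = llm.invoke("Write me a tweet about how AI is really cool.")
print(resp.content)
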



And that's it! You've now integrated Gemini Pro with LangChain! If you're interested in learning more about LangChain and AI, follow us here on Dev.to as well as on X! I also post a lot of AI and developer stuff on my personal X account! We also have more articles on LangChain and AI on our Dev Page!

Happy coding!
