
Evan Lin for Google Developer Experts

Originally published at evanlin.com

[Gemini][Google Maps] Building Location-Aware AI Apps with the Google Maps Grounding API


Background

When developing a LINE Bot, I wanted to add a feature: after a user shares their location, the AI can intelligently recommend nearby restaurants, gas stations, or parking lots. The traditional approach is to call the Google Places API and handle the search logic and result ranking yourself. However, Google has since launched the Grounding with Google Maps feature, which lets Gemini models tap directly into Google Maps data covering more than 250 million places, so AI responses automatically come with geographical context!

This feature is provided through Vertex AI and lets Gemini models answer location-related questions in a "grounded" way: based on real map data rather than the model's imagination.

Problems Encountered During Development

When implementing maps_grounding.py, I initially used the Gemini Developer API with an API Key:

from google import genai
from google.genai.types import (
    GenerateContentConfig, GoogleMaps, HttpOptions, Tool, ToolConfig
)

# ❌ Incorrect approach: Gemini Developer API with an API key
client = genai.Client(
    api_key=api_key,
    http_options=HttpOptions(api_version="v1")
)

response = client.models.generate_content(
    model="gemini-2.0-flash-lite",  # does not support Maps Grounding
    contents=query,
    config=GenerateContentConfig(
        tools=[Tool(google_maps=GoogleMaps())],
        tool_config=ToolConfig(...)
    ),
)


The result was this error:

google.genai.errors.ClientError: 400 INVALID_ARGUMENT.
{'error': {'code': 400, 'message': 'Invalid JSON payload received.
Unknown name "tools": Cannot find field.
Invalid JSON payload received. Unknown name "toolConfig": Cannot find field.'}}


After reviewing the documentation, I discovered that Google Maps Grounding only supports Vertex AI and cannot be used with the Gemini Developer API!

Correct Solution

1. Understanding API Differences

Google provides two different Gemini API access methods:

| Feature                   | Gemini Developer API | Vertex AI API           |
| ------------------------- | -------------------- | ----------------------- |
| Authentication Method     | API Key              | ADC / Service Account   |
| Maps Grounding            | ❌ Not Supported      | ✅ Supported             |
| Enterprise-Level Features | Limited              | Complete                |
| Applicable Scenarios      | Rapid Prototyping    | Production Environment  |

2. Correcting the Code

Here's the correct implementation:

from google import genai
from google.genai import types

# ✅ Correct approach: Use Vertex AI
client = genai.Client(
    vertexai=True, # Enable Vertex AI mode
    project=project_id, # GCP Project ID
    location=location, # Recommended to use 'global'
    http_options=types.HttpOptions(api_version="v1")
)

# Use a model that supports Maps Grounding
response = client.models.generate_content(
    model="gemini-2.0-flash", # ✅ Supported Model
    contents=query,
    config=types.GenerateContentConfig(
        tools=[
            types.Tool(google_maps=types.GoogleMaps(
                enable_widget=False
            ))
        ],
        tool_config=types.ToolConfig(
            retrieval_config=types.RetrievalConfig(
                lat_lng=types.LatLng(
                    latitude=latitude,
                    longitude=longitude
                ),
                language_code="zh-TW", # Supports Traditional Chinese
            ),
        ),
    ),
)

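Once the call succeeds, the grounded answer is available on response.text. If you also want to see which Maps sources the model relied on, the candidate carries grounding metadata. A minimal sketch, assuming the grounding_metadata / grounding_chunks fields exposed by the google-genai SDK (verify the exact structure against your SDK version):

# Print the grounded answer
print(response.text)

# Inspect the grounding metadata attached to the first candidate
candidate = response.candidates[0]
if candidate.grounding_metadata and candidate.grounding_metadata.grounding_chunks:
    for chunk in candidate.grounding_metadata.grounding_chunks:
        # Each chunk references a source (e.g., a Maps place) the answer was grounded on
        print(chunk)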

3. Environment Setup

To use Maps Grounding, you need to set the following environment variables:

# Required environment variables
export GOOGLE_CLOUD_PROJECT="your-project-id"
export GOOGLE_CLOUD_LOCATION="global"
export GOOGLE_GENAI_USE_VERTEXAI="True"

# Authentication method (choose one)
# Method 1: Use ADC (Development Environment)
gcloud auth application-default login

# Method 2: Use Service Account (Production Environment)
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account-key.json"

# Enable Vertex AI API
gcloud services enable aiplatform.googleapis.com

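With these variables exported, the google-genai SDK can pick up the Vertex AI mode, project, and location from the environment, so the client construction can be simplified. A minimal sketch (the explicit vertexai/project/location arguments from the earlier example work just as well):

from google import genai

# Reads GOOGLE_GENAI_USE_VERTEXAI, GOOGLE_CLOUD_PROJECT and
# GOOGLE_CLOUD_LOCATION from the environment
client = genai.Client()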

Practical Application Example


The implemented feature is quite powerful: you can query nearby places using natural language:

async def search_nearby_places(
    latitude: float,
    longitude: float,
    place_type: str = "restaurant",
    custom_query: Optional[str] = None,
    language_code: str = "zh-TW"
) -> str:
    """
    Use the Google Maps Grounding API to search for nearby locations

    Example queries:
    - "Please find nearby gas stations for me, and list the names, distances, and addresses."
    - "Please find nearby highly-rated restaurants for me, and list the names, types, and addresses."
    """

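The body of this function is essentially the Vertex AI call shown above, wrapped so the LINE Bot can pass in the coordinates the user shared. A minimal sketch, assuming the project ID comes from the GOOGLE_CLOUD_PROJECT environment variable and using an illustrative default prompt (not the exact code from maps_grounding.py):

import os
from typing import Optional

from google import genai
from google.genai import types


async def search_nearby_places(
    latitude: float,
    longitude: float,
    place_type: str = "restaurant",
    custom_query: Optional[str] = None,
    language_code: str = "zh-TW",
) -> str:
    # Illustrative default prompt; the real bot builds this from user input
    query = custom_query or (
        f"Please find nearby {place_type}s for me, "
        "and list the names, distances, and addresses."
    )

    client = genai.Client(
        vertexai=True,
        project=os.environ["GOOGLE_CLOUD_PROJECT"],
        location="global",
    )

    # Async variant of the earlier generate_content call
    response = await client.aio.models.generate_content(
        model="gemini-2.0-flash",
        contents=query,
        config=types.GenerateContentConfig(
            tools=[types.Tool(google_maps=types.GoogleMaps())],
            tool_config=types.ToolConfig(
                retrieval_config=types.RetrievalConfig(
                    lat_lng=types.LatLng(latitude=latitude, longitude=longitude),
                    language_code=language_code,
                ),
            ),
        ),
    )
    return response.text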

Use Cases

  1. Conversational Assistant: "Find me a good espresso shop nearby" (see the example below)
  2. Personalized Recommendations: "What restaurants are suitable for families and within walking distance?"
  3. Area Summary: "What are the special features near this hotel?"

These application scenarios are particularly suitable for:

  • 🏠 Real Estate Platforms
  • ✈️ Travel Planning
  • 🚗 Mobility
  • 📱 Social Media
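
As a concrete example, the conversational-assistant case maps directly onto the search_nearby_places helper above (using the hypothetical body sketched earlier; the coordinates are just an example location in Taipei):

import asyncio

# A user near Taipei 101 asks for a good espresso shop nearby
answer = asyncio.run(search_nearby_places(
    latitude=25.0330,
    longitude=121.5654,
    custom_query="Find me a good espresso shop nearby",
))
print(answer)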

Supported Model List

Currently supported Gemini models for Google Maps Grounding:

  • ✅ Gemini 2.5 Pro
  • ✅ Gemini 2.5 Flash
  • ✅ Gemini 2.0 Flash
  • ✅ Gemini 2.5 Flash with Live API
  • ❌ Gemini 2.0 Flash-Lite (Not Supported)

Google Maps Platform Code Assist (MCP)

Code Assist Toolkit header

During development, I also discovered that Google launched the Google Maps Platform Code Assist toolkit, which is a tool based on the Model Context Protocol (MCP) that can:

  • 🔍 Real-time Document Retrieval: Search for the latest official documentation and code examples through RAG technology
  • 🤖 AI Assistant Integration: Supports Gemini CLI, Claude Code, Cursor, and other development environments
  • 📚 Rich Resources: Covers official documentation, tutorials, GitHub examples, and security resources

How to Use MCP

# Install globally with npm (requires Node.js)
npm install -g @googlemaps/code-assist-mcp

# Set up the MCP server in Claude Code or Cursor
# Then you can directly query the latest Google Maps documentation in the AI assistant

This tool is especially suitable for quickly querying API usage during development, without switching between the browser and the editor!

Things to Note

1. Must Use Vertex AI

The Maps Grounding feature does not support the general Gemini Developer API and must be accessed through Vertex AI.

2. Authentication Settings

  • Development Environment: Use gcloud auth application-default login
  • Production Environment: Use a Service Account and set GOOGLE_APPLICATION_CREDENTIALS

3. Supported Models

Make sure to use a supported model (e.g., gemini-2.0-flash) and avoid using the -lite version.

4. Region Selection

It is recommended to set GOOGLE_CLOUD_LOCATION to global for the best availability.

5. Cost Considerations

Vertex AI is billed differently from the Developer API, so it is worth reviewing the cost structure on the pricing page before going to production.

Development Experience

The biggest takeaway from this experience is: Not all Gemini features can be accessed through the Developer API. Enterprise-level features such as Maps Grounding, advanced security filters, etc., require Vertex AI.

Although setting up Vertex AI is more complex than simply using an API Key, the benefits are:

  • ✅ More powerful features (Maps Grounding, Search Grounding)
  • ✅ More complete enterprise-level support
  • ✅ More flexible deployment options
  • ✅ More granular access control

If you are developing an AI application that requires location awareness, Google Maps Grounding is definitely worth a try!
