🚀 Executive Summary
TL;DR: This guide addresses the inefficiency of manually syncing GitLab Merge Requests to Notion by providing a Python script. It automates fetching open MRs from GitLab and populating a Notion database, ensuring project managers and stakeholders are always updated without manual effort.
🎯 Key Takeaways
- The solution leverages a unique ‘GitLab ID’ property in Notion to prevent duplicate merge request entries on subsequent sync runs.
- API tokens and IDs are managed with `python-dotenv` and a `config.env` file, keeping sensitive credentials out of the script itself.
- Successful integration requires exact, case-sensitive matching between Notion database property names (e.g., ‘MR Title’, ‘GitLab ID’, ‘URL’) and the corresponding keys in the Python script’s API payload.
Syncing GitLab Merge Requests to a Notion Database
Hey everyone, Darian here. Let’s talk about a common time-sink: keeping project managers and non-technical stakeholders updated on development progress. I used to spend a good chunk of my Monday morning manually copy-pasting Merge Request (MR) links into our project board in Notion. It was tedious, error-prone, and frankly, a waste of valuable engineering time. That’s why I built this simple Python script to automate it. It saves me at least an hour a week and keeps everyone in sync without any manual effort. Let’s get you set up.
Prerequisites
Before we dive in, make sure you have the following ready:
- A GitLab account with permissions to create Personal Access Tokens for your project.
- A Notion account and a workspace where you can create integrations and databases.
- Python 3 installed on the machine where you’ll run the script.
- Familiarity with installing Python packages. You’ll need `requests` and `python-dotenv`.
The Guide: Step-by-Step
Step 1: Configure Your Notion Database & Integration
First, we need a destination for our GitLab data. Let’s prep Notion.
- Create a Notion Database: Make a new full-page database. I recommend the “Table” layout.
- Define Properties (Columns): Set up the following columns. The names must be exact for the script to work out-of-the-box.
  - MR Title (Type: Title) – This is the primary column.
  - GitLab ID (Type: Number) – Crucial for preventing duplicates.
  - Status (Type: Select) – Good options are “Open”, “Merged”, “Closed”. Our script will just set it to “Open”.
  - URL (Type: URL) – To link directly to the MR.
  - Author (Type: Text) – The name of the MR author.
- Create a Notion Integration: Go to notion.so/my-integrations, create a “New integration”, and give it a name like “GitLab Sync”. Keep the associated “Internal Integration Token” safe; we’ll need it shortly.
- Share the Database: Go back to your database, click the “…” menu in the top-right, and select “Add connections”. Find your new “GitLab Sync” integration and grant it access.
- Get the Database ID: The Database ID is in the URL of your database. It’s the long string between your workspace name and the `?v=`. Copy this ID.
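If you’d rather script that last step, a small helper can pull the 32-character ID out of a copied database URL. This is my own illustrative addition, not part of the sync script, and it assumes the usual Notion URL shape:

```python
from urllib.parse import urlparse

def extract_database_id(url):
    """Return the 32-character database ID from a Notion database URL.

    Notion database URLs look like
    https://www.notion.so/<workspace>/<id>?v=..., where <id> is 32 hex
    characters, sometimes preceded by a human-readable slug and a dash.
    """
    last_segment = urlparse(url).path.rstrip('/').split('/')[-1]
    # Drop any slug prefix, e.g. "My-Tasks-0123abcd..." -> "0123abcd..."
    candidate = last_segment.rsplit('-', 1)[-1]
    if len(candidate) == 32:
        return candidate
    raise ValueError(f"Could not find a database ID in {url!r}")
```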
Step 2: Get Your GitLab Credentials
Now, let’s get the keys to the GitLab kingdom.
- Find your Project ID: Go to your project’s main page in GitLab. The Project ID is listed right under the project name.
- Create a Personal Access Token (PAT): In GitLab, go to your User Settings > Access Tokens. Create a new token, give it a name, and select the `read_api` scope. This gives it read-only access to the API, which is all we need. Copy the generated token immediately—you won’t see it again.
Step 3: Set Up Your Python Environment
I’ll skip the standard virtualenv setup since you likely have your own workflow for that. The important part is to get the necessary libraries installed. In your project directory, you’ll want to install requests for making API calls and python-dotenv for managing our secrets.
Next, create a file named config.env in your project directory. This is much safer than hardcoding secrets in your script. Populate it with the credentials we just gathered:
```
NOTION_TOKEN="secret_..."
NOTION_DATABASE_ID="..."
GITLAB_TOKEN="..."
GITLAB_PROJECT_ID="..."
```
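It can also help to fail fast if any of these is missing, rather than debugging a cryptic API error later. The helper below is my own addition for illustration (the name `missing_config` is hypothetical, not from the script):

```python
import os

REQUIRED_VARS = (
    "NOTION_TOKEN",
    "NOTION_DATABASE_ID",
    "GITLAB_TOKEN",
    "GITLAB_PROJECT_ID",
)

def missing_config(env=os.environ):
    """Return the names of required settings that are unset or blank."""
    return [name for name in REQUIRED_VARS if not env.get(name)]
```

Calling something like this right after loading `config.env` and exiting on a non-empty result gives a much clearer error message than a failed API call.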
Step 4: The Python Script
Alright, let’s get to the core logic. Create a file named sync_script.py. I’ll break down the code into manageable parts and explain what each one does.
First, we import our libraries and load the environment variables from our config.env file.
```python
import os
import requests
from dotenv import load_dotenv

load_dotenv('config.env')

# Load credentials from environment
NOTION_TOKEN = os.getenv('NOTION_TOKEN')
NOTION_DATABASE_ID = os.getenv('NOTION_DATABASE_ID')
GITLAB_TOKEN = os.getenv('GITLAB_TOKEN')
GITLAB_PROJECT_ID = os.getenv('GITLAB_PROJECT_ID')

# API Headers
notion_headers = {
    "Authorization": f"Bearer {NOTION_TOKEN}",
    "Content-Type": "application/json",
    "Notion-Version": "2022-06-28",
}
gitlab_headers = {
    "PRIVATE-TOKEN": GITLAB_TOKEN
}
```
Next, a function to fetch the currently open merge requests from your GitLab project.
```python
def get_open_merge_requests():
    """Fetches open merge requests from the specified GitLab project."""
    # per_page=100 raises GitLab's default page size of 20
    url = (
        f"https://gitlab.com/api/v4/projects/{GITLAB_PROJECT_ID}"
        "/merge_requests?state=opened&per_page=100"
    )
    try:
        response = requests.get(url, headers=gitlab_headers)
        response.raise_for_status()  # Raises an HTTPError for bad responses (4xx or 5xx)
        print("Successfully fetched MRs from GitLab.")
        return response.json()
    except requests.exceptions.RequestException as e:
        print(f"Error fetching from GitLab: {e}")
        return []
```
Now, the most important part: a function to check which of these MRs already exist in Notion. This prevents us from creating duplicates every time the script runs. We use the unique GitLab ID we stored earlier.
```python
def get_existing_mr_ids_in_notion():
    """Queries Notion to get the GitLab IDs of all MRs already in the database."""
    url = f"https://api.notion.com/v1/databases/{NOTION_DATABASE_ID}/query"
    existing_ids = set()
    payload = {}  # An empty payload fetches pages without any filter
    try:
        while True:
            response = requests.post(url, headers=notion_headers, json=payload)
            response.raise_for_status()
            data = response.json()
            for item in data.get('results', []):
                gitlab_id_prop = item.get('properties', {}).get('GitLab ID', {})
                if gitlab_id_prop.get('number') is not None:
                    existing_ids.add(gitlab_id_prop['number'])
            # Notion returns at most 100 results per query; follow the cursor
            if not data.get('has_more'):
                break
            payload['start_cursor'] = data['next_cursor']
        print(f"Found {len(existing_ids)} existing MRs in Notion.")
        return existing_ids
    except requests.exceptions.RequestException as e:
        print(f"Error querying Notion: {e}")
        return set()
```
With our checks in place, here’s the function that creates a new page in Notion for a new MR.
Pro Tip: Notice how the payload structure below directly maps to the Notion properties we created? If you change a property name in Notion (e.g., “URL” to “Link”), you must update it here as well. This is a common point of failure.
```python
def add_mr_to_notion(mr):
    """Adds a single merge request as a new page in the Notion database."""
    url = "https://api.notion.com/v1/pages"
    payload = {
        "parent": {"database_id": NOTION_DATABASE_ID},
        "properties": {
            "MR Title": {
                "title": [{"text": {"content": mr['title']}}]
            },
            "GitLab ID": {
                "number": mr['id']
            },
            "Status": {
                "select": {"name": "Open"}
            },
            "URL": {
                "url": mr['web_url']
            },
            "Author": {
                "rich_text": [{"text": {"content": mr['author']['name']}}]
            }
        }
    }
    try:
        response = requests.post(url, headers=notion_headers, json=payload)
        response.raise_for_status()
        print(f"Successfully added MR #{mr['id']} to Notion.")
    except requests.exceptions.RequestException as e:
        # e.response is None for connection errors, so guard before reading it
        detail = e.response.text if e.response is not None else ""
        print(f"Error adding MR #{mr['id']} to Notion: {e} {detail}")
```
Finally, the main execution block to tie it all together.
```python
def main():
    """Main function to run the sync process."""
    print("Starting GitLab to Notion sync...")
    gitlab_mrs = get_open_merge_requests()
    if not gitlab_mrs:
        print("No open MRs found or error fetching. Exiting.")
        return

    notion_mr_ids = get_existing_mr_ids_in_notion()
    new_mrs_added = 0
    for mr in gitlab_mrs:
        if mr['id'] not in notion_mr_ids:
            add_mr_to_notion(mr)
            new_mrs_added += 1

    print(f"Sync complete. Added {new_mrs_added} new MRs.")

if __name__ == "__main__":
    main()
```
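Before the first real run, it can also help to confirm that your Notion column names line up with the payload keys. The helper below is entirely my own addition for illustration: `EXPECTED_PROPERTIES` mirrors the payload in `add_mr_to_notion`, and the input is the `properties` object returned by Notion’s “Retrieve a database” endpoint.

```python
EXPECTED_PROPERTIES = {"MR Title", "GitLab ID", "Status", "URL", "Author"}

def find_property_mismatches(database_properties):
    """Diff the property names in a Notion database schema against the
    names the sync payload expects. Both directions are reported, so a
    renamed column shows up as missing on one side and extra on the other."""
    actual = set(database_properties)
    return {
        "missing_in_notion": sorted(EXPECTED_PROPERTIES - actual),
        "not_used_by_script": sorted(actual - EXPECTED_PROPERTIES),
    }
```

For example, if you renamed “URL” to “Link” in Notion, the result would flag “URL” as missing and “Link” as unused, pointing you straight at the mismatch.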
Step 5: Automate the Sync
A script is only useful if you don’t have to run it manually. A simple cron job is perfect for this. On a Linux server, you could set it to run every hour, for example.
Just keep in mind that cron does not run from your project directory, so use an absolute path (or `cd` into the directory first) in the command. For example, this crontab entry runs the sync at 2 AM every Monday:

```
0 2 * * 1 cd /path/to/your/project && python3 sync_script.py
```
Alternatively, you could use a GitLab CI/CD Scheduled Pipeline, a GitHub Action, or a cloud function for a more serverless approach.
Common Pitfalls
Here are a few places where I’ve messed up in the past, so you can avoid them:
- Notion Integration Permissions: I’ve wasted hours debugging only to realize I forgot to “Share” the database with my integration. Always check that first.
- Token Scopes: Using a GitLab token without the `read_api` scope will result in a 403 Forbidden error. Double-check your PAT permissions.
- Property Name Mismatches: If you name a column “Link” in Notion but the script’s payload refers to “URL”, the API call will fail. The names must be an exact, case-sensitive match.
- API Rate Limits: If you have a massive project and run the script too frequently (e.g., every minute), you might hit rate limits from either GitLab or Notion. For most teams, running it once an hour is more than enough.
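If you ever do get rate-limited, both APIs return a 429 with a Retry-After header you can honor. Here is a minimal backoff calculation; the `retry_delay` helper and its defaults are my own hypothetical sketch, not part of either API client:

```python
def retry_delay(attempt, retry_after=None, base=1.0, cap=60.0):
    """Seconds to wait before retrying a rate-limited request.

    Honors a server-supplied Retry-After value when present, otherwise
    falls back to capped exponential backoff (1s, 2s, 4s, ... up to cap).
    """
    if retry_after is not None:
        return float(retry_after)
    return min(cap, base * (2 ** attempt))
```

Wrapping the `requests` calls in a retry loop that sleeps for `retry_delay(attempt, response.headers.get("Retry-After"))` is usually enough to ride out a burst.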
Conclusion
And that’s it! You now have a robust, automated workflow that bridges the communication gap between your development team’s work in GitLab and your project planning in Notion. It’s a “set-it-and-forget-it” solution that removes manual toil and keeps everyone on the same page. I hope this saves you as much time as it has saved me.
Happy building,
Darian Vance
👉 Read the original article on TechResolve.blog