DEV Community

Sapm Pub

How I built Radar-Marée, a tide website for the French coast in two weeks

I wanted a simple website to answer a very basic question:

“What are the tide times today near where I am?”

Most tide websites do the job, but they are usually centered on a few main harbors.

When you search for “tide + city name”, you often land on a generic page for the nearest big harbor rather than for the place people actually typed or are actually going to.

In two weeks, I built Radar-Marée, a website that shows tide times and coefficients for the French coast, by ZIP code and city.

👉 Live site: https://radar-maree.fr


The idea

The goal wasn’t to build just another tide website, but to:

  • start from ZIP codes and cities (what users actually type in search engines),
  • link them to existing tide reference data with a small custom algorithm,
  • generate a static website, fast and SEO-friendly.

This first version focuses on a clean grid of coastal cities / ZIP codes.

The algorithm is already there to attach each area to relevant reference points.


A Python pipeline that generates everything

Instead of building a big backend, I went for a very simple approach:

pre-compute everything in Python, then deploy static HTML.

The pipeline looks like this:

1. Coastal cities and ZIP codes inventory

I maintain a CSV file with:

  • cities and ZIP codes on or near the French coast,
  • which coastline they belong to (Channel, Atlantic, Mediterranean),
  • their coordinates (lat/lon).

This CSV is the backbone of the project.
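To make the structure concrete, here is a minimal sketch of loading such an inventory. The column names and sample rows are my own assumptions, not the project's actual schema:

```python
import csv
import io

# Hypothetical excerpt of the inventory CSV; the real columns may differ.
INVENTORY_CSV = """city,zip_code,coastline,lat,lon
Saint-Malo,35400,Channel,48.6493,-2.0257
Biarritz,64200,Atlantic,43.4832,-1.5586
Sète,34200,Mediterranean,43.4075,3.6935
"""

def load_inventory(text: str) -> list[dict]:
    """Parse the CSV inventory into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

rows = load_inventory(INVENTORY_CSV)
print(rows[0]["city"], rows[0]["coastline"])  # Saint-Malo Channel
```

A flat CSV like this is easy to diff, review, and feed into every downstream script.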

2. Attaching each city/ZIP to tide reference data

A Python script links each city/ZIP code to existing tide data via a small homemade algorithm.

For a given ZIP code, the script decides which reference points are relevant, then fetches the corresponding high tide / low tide times and coefficients.
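The article doesn't detail the attachment algorithm, but a simple baseline is a nearest-reference-point lookup using great-circle distance. The reference points below are placeholders, and the real algorithm may also take coastline membership into account:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical tide reference points (name, lat, lon).
REFERENCE_POINTS = [
    ("Saint-Malo", 48.6493, -2.0257),
    ("Brest", 48.3904, -4.4861),
    ("La Rochelle", 46.1591, -1.1511),
]

def nearest_reference(lat, lon, points=REFERENCE_POINTS):
    """Pick the closest reference point for a given city/ZIP coordinate."""
    return min(points, key=lambda p: haversine_km(lat, lon, p[1], p[2]))

print(nearest_reference(48.6, -2.0)[0])  # Saint-Malo
```

This runs once at build time, so even a brute-force scan over all reference points is fast enough.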

3. SEO enrichment

Another script enriches the inventory with SEO fields:

  • <title> for each page,
  • meta description,
  • a clean URL slug,
  • a canonical URL.

The idea is simple: each page should clearly answer a query like

"marée + nom de ville" (tide + city name).

4. Static HTML generation

Finally, a generator script takes:

  • the enriched CSV,
  • an HTML template,

and produces all the static pages (city pages, ZIP code pages, etc.).

Deployment is then just a matter of uploading static files behind a CDN.
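The generator step can be sketched with nothing more than the standard library. The template below is deliberately tiny and the field names are the same hypothetical ones as above, not the project's real template:

```python
from pathlib import Path
from string import Template

# Minimal HTML template; the real one is a full page with tide tables.
PAGE_TEMPLATE = Template(
    "<html><head><title>$title</title>"
    '<link rel="canonical" href="$canonical"></head>'
    "<body><h1>Marée à $city</h1></body></html>"
)

def generate_pages(rows, out_dir="dist"):
    """Render one static HTML page per enriched CSV row."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    for row in rows:
        html = PAGE_TEMPLATE.substitute(
            title=row["title"], canonical=row["canonical"], city=row["city"]
        )
        (out / f"{row['slug']}.html").write_text(html, encoding="utf-8")

pages = [{
    "city": "Biarritz",
    "slug": "maree-biarritz-64200",
    "title": "Marée Biarritz (64200)",
    "canonical": "https://radar-maree.fr/maree-biarritz-64200.html",
}]
generate_pages(pages)
```

The resulting `dist/` directory is exactly what gets uploaded behind the CDN.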


Why static instead of a big dynamic app?

A project like Radar-Marée doesn’t need to compute things on every request.

By precomputing:

  • I keep the infrastructure very simple,
  • I don’t need a production database,
  • I get fast responses thanks to static hosting + CDN,
  • I can focus on the algorithm and the quality of the pages.

When I want to update something, I just:

  1. run the Python scripts,
  2. regenerate the site,
  3. redeploy.

That’s it.
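The three-step rebuild can be wrapped in a tiny runner so a single command does everything in order. The step names here are placeholders standing in for the actual scripts:

```python
def run_pipeline(steps):
    """Run each pipeline step in order and return the names that completed."""
    done = []
    for name, fn in steps:
        fn()  # a real runner would call the corresponding script here
        done.append(name)
    return done

# Placeholder steps mirroring the update workflow above.
steps = [
    ("attach_tide_data", lambda: None),  # 1. run the Python scripts
    ("generate_site", lambda: None),     # 2. regenerate the site
    ("deploy", lambda: None),            # 3. redeploy
]
print(run_pipeline(steps))
```

Because every step is deterministic and file-based, a failed run can simply be re-run from the top.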


What’s next

Right now, Radar-Marée covers coastal cities and ZIP codes on the French shoreline.

Next steps:

  • refine the attachment algorithm,
  • improve descriptions by region / coastline,
  • gradually extend to ports / beaches / local spots where it really makes sense.

The idea is to start from a solid, already useful base, and add the “ultra local” layer step by step, without overpromising.


If you want to see the result or follow the evolution of the project:

👉 https://radar-maree.fr

I’m also happy to talk with people working on similar topics (data, geography, static site generation, SEO, or indie projects in general).
