<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Matthew McTighe</title>
    <description>The latest articles on DEV Community by Matthew McTighe (@matthewmctighe1).</description>
    <link>https://dev.to/matthewmctighe1</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3267702%2F7487ecbd-4942-4eec-9c07-378dda266962.jpg</url>
      <title>DEV Community: Matthew McTighe</title>
      <link>https://dev.to/matthewmctighe1</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/matthewmctighe1"/>
    <language>en</language>
    <item>
      <title>Building Intelligent IoT Applications with PostgreSQL: Time-Series + Semantic Search + Text-to-SQL</title>
      <dc:creator>Matthew McTighe</dc:creator>
      <pubDate>Tue, 01 Jul 2025 20:04:55 +0000</pubDate>
      <link>https://dev.to/matthewmctighe1/building-intelligent-iot-applications-with-postgresql-time-series-semantic-search-text-to-sql-4pbm</link>
      <guid>https://dev.to/matthewmctighe1/building-intelligent-iot-applications-with-postgresql-time-series-semantic-search-text-to-sql-4pbm</guid>
      <description>&lt;p&gt;&lt;strong&gt;TL;DR:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can now use PostgreSQL to handle a wide range of workloads—including time-series, vector search, and transactional queries. I built a lightweight app using TigerData's PostgreSQL cloud and OpenAI to enable &lt;strong&gt;Text-to-SQL&lt;/strong&gt; and &lt;strong&gt;Semantic Search&lt;/strong&gt; on real-time IoT data.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/TheSurfingCoder/edge-insights-iot-platform" rel="noopener noreferrer"&gt;GitHub Link&lt;/a&gt;&lt;br&gt;
&lt;a href="https://edge-insights-iot-platform.vercel.app/" rel="noopener noreferrer"&gt;Live Link&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why PostgreSQL?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;PostgreSQL has gained massive adoption in recent years thanks to its reliability, extensibility, and familiarity. It's used in some of the most mission-critical applications today—for good reason.&lt;/p&gt;

&lt;p&gt;To explore its versatility, I built an IoT-focused app that collects real-time data from simulated sensors and layers on AI-powered querying and search. The goal: show how far you can take a single database when you combine time-series, vector search, and LLMs—all from within PostgreSQL.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why TigerData?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;TigerData made sense for this project for two key reasons:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;IoT data is inherently time-series – I wanted to leverage the TimescaleDB extension to efficiently store and aggregate time-stamped sensor readings.&lt;/li&gt;
&lt;li&gt;AI-native PostgreSQL – TigerData has built cutting-edge AI features directly into PostgreSQL through their pgai suite, making it easier to embed and search data using OpenAI models.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Stack Highlights&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;TimescaleDB is a PostgreSQL extension that optimizes time-series data ingestion and analysis.&lt;/li&gt;
&lt;li&gt;pgai is a suite of tools that turns PostgreSQL into a production-ready engine for Retrieval-Augmented Generation (RAG) and agentic applications:
&lt;ul&gt;
&lt;li&gt;🔄 Automatically generates and syncs vector embeddings from table data or documents via pg_vectorizer&lt;/li&gt;
&lt;li&gt;🔍 Enables fast, native vector and semantic search via pgvector and pgvectorscale&lt;/li&gt;
&lt;/ul&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Together, TimescaleDB and pgai let you ingest high-volume time-series data and create real-time vector embeddings—all on the original source table.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;WebSocket connections for persistent, bidirectional communication, enabling real-time ingestion of IoT device data and live dashboard updates&lt;/li&gt;
&lt;li&gt;OpenAI for generating embeddings for both queries and stored records&lt;/li&gt;
&lt;li&gt;React &amp;amp; Next.js for the front end&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq2kjjp34odg1opt52yiq.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq2kjjp34odg1opt52yiq.jpg" alt="This is an application that uses Websocket to send data to TigerData Cloud in real-time. Once in TigerData, it uses TigerData's pg_vectorizer to create vector embeddings in real-time based on the payload. These embeddings can then be used for semantic search." width="800" height="495"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Technical Implementation Notes
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;PostgreSQL Table Definition and Log Feed&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The core of the system is a sensor_readings table that stores incoming IoT data. Each row represents a single reading from a device, complete with metadata such as location, value, and timestamp.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;-- Create the sensor_readings table with proper structure
CREATE TABLE IF NOT EXISTS sensor_readings (
    time TIMESTAMPTZ NOT NULL,
    device_id TEXT NOT NULL,
    device_type TEXT NOT NULL,
    location TEXT,
    raw_value NUMERIC,
    unit TEXT,
    log_type TEXT NOT NULL DEFAULT 'INFO',
    message TEXT NOT NULL,
    CONSTRAINT sensor_readings_pkey PRIMARY KEY (time, device_id)
);

SELECT create_hypertable('sensor_readings', 'time', if_not_exists =&amp;gt; TRUE);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To optimize for time-series performance, the table is converted into a hypertable using TimescaleDB. This enables efficient inserts, compression, and aggregations over time-based data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Automatic Embeddings with TigerData's Vectorizer&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once the table is in place, TigerData enables you to define a vectorizer—a serverless function under the hood—that automatically creates and maintains an embedding column.&lt;/p&gt;

&lt;p&gt;Every time new data is inserted or existing data is updated, the vectorizer generates or refreshes the corresponding vector embedding in real time using OpenAI’s text-embedding-3-small model:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;SELECT ai.create_vectorizer(
    'public.sensor_readings'::regclass,
    loading =&amp;gt; ai.loading_column('device_id'),
    chunking =&amp;gt; ai.chunking_none(),
    destination =&amp;gt; ai.destination_column('embedding'),
    embedding =&amp;gt; ai.embedding_openai('text-embedding-3-small', 1536, api_key_name=&amp;gt;'OPENAI_API_KEY'),
    formatting =&amp;gt; ai.formatting_python_template(
        'The device id is: $device_id and has a device type of: $device_type. The location of this device is: $location and it recorded a raw value of: $raw_value with a unit of: $unit. The log type is: $log_type and this was recorded at the following time: $time'
    )
);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Example Record After Vectorization&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once the vectorizer is in place, inserting a record like the one below will automatically produce a corresponding embedding: &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;time:        2025-07-01 17:14:41.456252+00
device_id:   device_004
device_type: camera
location:    parking_lot
raw_value:   1
unit:        boolean
log_type:    INFO
message:     Camera feed: Normal surveillance activity in parking_lot
embedding:   [0.010583181, -0.006653716, -0.011511296, ...]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Fast Aggregations with Continuous Aggregates&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Since the data is stored in a hypertable, TimescaleDB's continuous aggregates can be layered on top to support fast and efficient rollups—ideal for dashboards or powering text-to-SQL queries.&lt;/p&gt;

&lt;p&gt;Here’s how I set up 5-minute and hourly aggregates over the sensor_readings table:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;CREATE MATERIALIZED VIEW IF NOT EXISTS five_min_sensor_averages
WITH (timescaledb.continuous) AS
SELECT 
    time_bucket('5 minutes', time) AS five_min_bucket,
    device_type,
    location,
    avg(raw_value) as avg_value,
    min(raw_value) as min_value,
    max(raw_value) as max_value,
    count(*) as reading_count
FROM sensor_readings
WHERE raw_value IS NOT NULL
GROUP BY five_min_bucket, device_type, location;

-- Level 2: Hourly aggregates built on top of 5-minute aggregates
CREATE MATERIALIZED VIEW IF NOT EXISTS hourly_sensor_averages
WITH (timescaledb.continuous) AS
SELECT 
    time_bucket('1 hour', five_min_bucket) AS hour,
    device_type,
    location,
    avg(avg_value) as avg_value,
    min(min_value) as min_value,
    max(max_value) as max_value,
    sum(reading_count) as reading_count
FROM five_min_sensor_averages
GROUP BY hour, device_type, location;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Routing Between Text-to-SQL and Semantic Search&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;My application supports two different AI query modes:&lt;/p&gt;

&lt;p&gt;Text-to-SQL for structured, analytical queries like “What’s the average temperature in the last 24 hours?”&lt;/p&gt;

&lt;p&gt;Semantic Search for unstructured, investigative prompts like “Why did the camera in the parking lot report unusual activity?”&lt;/p&gt;

&lt;p&gt;To determine which path to take, I built a lightweight classifier that scans for keywords in the incoming query. Here's the core logic behind it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;func (s *AIService) determineQueryType(query string) string {
    queryLower := strings.ToLower(query)

    // Keywords that suggest specific data queries (use text-to-SQL)
    dataKeywords := []string{
        "show me", "what is", "how many", "average", "count", "temperature",
        "humidity", "motion", "camera", "controller", "device", "location",
        "last hour", "last 24 hours", "yesterday", "today", "this week",
        "above", "below", "between", "greater than", "less than",
        "raw_value", "unit", "time", "hour", "day", "week", "month",
    }

    // Keywords that suggest pattern discovery (use semantic search)
    patternKeywords := []string{
        "why", "how", "patterns", "similar", "unusual", "anomaly", "problem",
        "issue", "failure", "error", "warning", "critical", "security",
        "behavior", "trend", "insight", "analysis", "explain", "understand",
        "find logs", "search for", "discover", "investigate",
    }

    // Count matches in each keyword set and route to the engine with more
    // hits. (Simplified completion; the return labels are illustrative.)
    dataScore, patternScore := 0, 0
    for _, kw := range dataKeywords {
        if strings.Contains(queryLower, kw) {
            dataScore++
        }
    }
    for _, kw := range patternKeywords {
        if strings.Contains(queryLower, kw) {
            patternScore++
        }
    }
    if patternScore &amp;gt; dataScore {
        return "semantic_search"
    }
    return "text_to_sql"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If the query includes terms related to measurements, time ranges, or specific metrics, it’s routed to the text-to-SQL engine. If it contains exploratory or investigative language, it goes to semantic search.&lt;/p&gt;

&lt;p&gt;This design allows for a more natural and intuitive user experience—people can ask questions however they like, and the system adapts accordingly.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Summary&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This project shows how far you can go using just PostgreSQL as the foundation. By combining:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;TimescaleDB for efficient time-series ingestion and continuous aggregates&lt;/li&gt;
&lt;li&gt;pgai for vector embeddings and semantic search&lt;/li&gt;
&lt;li&gt;OpenAI for both embeddings and natural language SQL generation&lt;/li&gt;
&lt;li&gt;TigerData Cloud to tie it all together&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;you get time-series analytics, semantic search, and natural-language querying from a single database, with no separate search or analytics infrastructure required.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>How I built a 3D Drone Planning Software</title>
      <dc:creator>Matthew McTighe</dc:creator>
      <pubDate>Fri, 27 Jun 2025 00:46:12 +0000</pubDate>
      <link>https://dev.to/matthewmctighe1/how-i-built-a-3d-drone-planning-software-1npn</link>
      <guid>https://dev.to/matthewmctighe1/how-i-built-a-3d-drone-planning-software-1npn</guid>
      <description>&lt;h2&gt;
  
  
  🎉 I just finished building my full-stack drone mission planner!
&lt;/h2&gt;

&lt;p&gt;Live link: &lt;a href="https://drone-app-q23f.vercel.app" rel="noopener noreferrer"&gt;https://drone-app-q23f.vercel.app&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  ✈️ What does it do?
&lt;/h2&gt;

&lt;p&gt;Users can create autonomous flight paths for DJI drones through an interactive 2D and 3D interface. These missions can be saved to the cloud — and in the future, a mobile app (still in the works) will let users connect to their drone and trigger the mission in real time.&lt;/p&gt;

&lt;h2&gt;
  
  
  🧱 Tech Stack
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Frontend Hosting &amp;amp; CI/CD: Vercel&lt;/li&gt;
&lt;li&gt;Backend Hosting &amp;amp; CI/CD: Render&lt;/li&gt;
&lt;li&gt;Auth: Supabase (JWT-based)&lt;/li&gt;
&lt;li&gt;Database: MongoDB&lt;/li&gt;
&lt;li&gt;Backend Language: Go&lt;/li&gt;
&lt;li&gt;Frontend: React, Tailwind, Vite&lt;/li&gt;
&lt;li&gt;Maps: CesiumJS (3D), Leaflet (2D)&lt;/li&gt;
&lt;li&gt;IDE: Cursor&lt;/li&gt;
&lt;li&gt;UI/UX Design: Magic Patterns&lt;/li&gt;
&lt;li&gt;Timezone API: TimezoneDB (used lat/lng-based lookup)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  📸 Before &amp;amp; After
&lt;/h2&gt;

&lt;p&gt;Before:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F13dy9vsncjkvjb006ker.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F13dy9vsncjkvjb006ker.png" alt="This is a photo of my web application at the very start. It's very minimal and all the buttons are overlapping and messy" width="800" height="405"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgnioeaf230ms3a0k5meq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgnioeaf230ms3a0k5meq.png" alt="Here's a photo of my application once I was done building it. It looks a lot more modern and user friendly and I added three-dimensional features to it" width="800" height="411"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  🧠 Goals of the Project
&lt;/h2&gt;

&lt;p&gt;I set out to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Build something I thought was fascinating&lt;/li&gt;
&lt;li&gt;Design and develop a complete full-stack app&lt;/li&gt;
&lt;li&gt;Learn (and level up in) React, Tailwind, Go, and web authentication — areas I was pretty new to at the start&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  💡 Inspiration
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;I’ve always been fascinated by the intersection of physical systems and digital interfaces — and drones are a perfect example.&lt;/li&gt;
&lt;li&gt;While browsing full-stack engineering job postings, I noticed recurring patterns in the skills companies were hiring for… so I decided to build something that put those skills into practice.&lt;/li&gt;
&lt;li&gt;I also drew inspiration from existing tools like Litchi and wanted to take it a step further by incorporating modern UI and eventually mobile integration.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  🛠 Optimizations
&lt;/h2&gt;

&lt;p&gt;3D rendering engines (like Cesium) are powerful but resource-intensive. To reduce tile-loading cost, I adjust the globe's maximumScreenSpaceError based on camera height, requesting coarser terrain detail the higher the camera sits:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const optimizeTileLoading = () =&amp;gt; {
  const camera = viewer.camera;
  const height = camera.positionCartographic.height;

  if (height &amp;lt; 500) {
    viewer.scene.globe.maximumScreenSpaceError = 2;
  } else if (height &amp;lt; 2000) {
    viewer.scene.globe.maximumScreenSpaceError = 4;
  } else if (height &amp;lt; 10000) {
    viewer.scene.globe.maximumScreenSpaceError = 8;
  } else {
    viewer.scene.globe.maximumScreenSpaceError = 16;
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;For time zone lookups, I used TimezoneDB, which has API limits — so I added server-side caching using a hash map keyed on lat/lng.&lt;/li&gt;
&lt;/ul&gt;
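&lt;p&gt;The caching idea above can be sketched in Go (the backend language). The rounding precision and the fetch hook are assumptions for illustration, not the exact implementation:&lt;/p&gt;

```go
package main

import (
	"fmt"
	"sync"
)

// tzCache memoizes timezone lookups so repeated requests for nearby
// coordinates don't burn TimezoneDB's API quota.
type tzCache struct {
	mu    sync.Mutex
	data  map[string]string
	fetch func(lat, lng float64) string // calls TimezoneDB on a cache miss
}

// key rounds coordinates to 4 decimal places (~11 m) so near-identical
// points share a cache entry.
func key(lat, lng float64) string {
	return fmt.Sprintf("%.4f,%.4f", lat, lng)
}

// Lookup returns the cached timezone, fetching and storing it on a miss.
func (c *tzCache) Lookup(lat, lng float64) string {
	k := key(lat, lng)
	c.mu.Lock()
	defer c.mu.Unlock()
	if tz, ok := c.data[k]; ok {
		return tz
	}
	tz := c.fetch(lat, lng)
	c.data[k] = tz
	return tz
}

func main() {
	calls := 0
	c := &tzCache{
		data: map[string]string{},
		fetch: func(lat, lng float64) string {
			calls++
			return "America/Los_Angeles" // stand-in for the real API call
		},
	}
	c.Lookup(37.7749, -122.4194)
	c.Lookup(37.7749, -122.4194) // second call is served from the cache
	fmt.Println("API calls:", calls) // prints: API calls: 1
}
```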

&lt;h2&gt;
  
  
  ⚠️ Challenges
&lt;/h2&gt;

&lt;p&gt;Too many to list… but here are the big ones:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;I originally used @cesium/engine, the modularized version of Cesium, but ran into build issues — likely due to missing dependencies. After a week of debugging, I switched to the full cesium package.&lt;/li&gt;
&lt;li&gt;I didn’t know where this project was headed when I started. What began as a small 2D waypoint sim turned into a 3D planner for real drone missions. As the scope grew, I had to revisit and refactor large parts of the codebase — including a complete rethink of the data model and UX flow after reviewing DJI’s Mobile SDK. Let’s just say there were many rabbit holes.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;(Coding with AI — A Blessing and a Curse)&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Where it struggles:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Large outputs — If you let AI generate 800+ lines of code in one go, things get chaotic fast. It’s much harder to review, and you often end up with redundant, messy, or even broken logic buried in there. Stick to 100–200 lines at a time — it’s way easier to review and actually understand what’s happening.&lt;/li&gt;
&lt;li&gt;Niche technologies — For less mainstream tools like Cesium, AI often gives outdated or flat-out wrong answers. In some cases, it really struggled to generate usable examples or explanations.&lt;/li&gt;
&lt;li&gt;Refactoring — Don’t let it refactor too much at once. Small, incremental changes are the way to go.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Where it shines:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Fast iteration for isolated components or logic blocks&lt;/li&gt;
&lt;li&gt;Handling boilerplate or UI layout&lt;/li&gt;
&lt;li&gt;Getting unstuck when you just need a quick idea to move forward&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  📱 Future Goals
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Build the companion mobile app using DJI’s Mobile SDK v4 for iOS (data model is already structured for it)&lt;/li&gt;
&lt;li&gt;Continue optimizing Cesium’s performance&lt;/li&gt;
&lt;li&gt;Fix known bugs&lt;/li&gt;
&lt;li&gt;Improve mobile web UI (started but deprioritized for launch)&lt;/li&gt;
&lt;li&gt;Get user feedback — and maybe monetize it someday&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>webdev</category>
      <category>javascript</category>
      <category>career</category>
      <category>learning</category>
    </item>
  </channel>
</rss>
