Hi there! I'm Shrijith Venkatramana. Right now, I'm building LiveReview, an AI reviewer that transforms both code quality and team performance. LiveReview gives instant feedback and helps you ship cleaner code, faster.
Check it out, if that’s your kind of thing.
Dashboards are the heartbeat of data-driven apps, but slow queries can kill the vibe. Let’s build a fast, scalable dashboard using PostgreSQL caching and Golang scheduled routines to keep things snappy. This guide dives into practical steps, code, and tips to make your dashboard fly. We’ll cover setting up a PostgreSQL cache, scheduling data updates with Go, and wiring it all into a sleek dashboard. Ready to code?
Why Caching and Scheduling Matter for Dashboards
Dashboards often pull data from complex queries, and hitting the database every time a user refreshes is a recipe for lag. Caching stores precomputed results, slashing response times. Scheduled routines in Go keep that cache fresh without manual intervention. Together, they ensure your dashboard is fast and up-to-date. We'll use PostgreSQL's materialized views for caching and Go's `time.Ticker` for scheduling.
Key benefit: Users get instant data, and your database doesn’t cry under pressure.
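The pattern is worth seeing in miniature before wiring in PostgreSQL: a shared precomputed value that readers hit instantly while a background job replaces it. A minimal Go sketch of that idea (the `SalesCache` type here is purely illustrative, not part of the final code):

```go
package main

import (
	"fmt"
	"sync"
)

// SalesCache holds a precomputed result behind a mutex so that
// reads are instant while a background job swaps in fresh values.
type SalesCache struct {
	mu    sync.RWMutex
	total float64
}

// Get returns the cached value without touching the database.
func (c *SalesCache) Get() float64 {
	c.mu.RLock()
	defer c.mu.RUnlock()
	return c.total
}

// Refresh stores a freshly computed value; in the real app this is
// where the expensive query (or materialized-view refresh) runs.
func (c *SalesCache) Refresh(newTotal float64) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.total = newTotal
}

func main() {
	cache := &SalesCache{}
	cache.Refresh(295.50) // background job recomputes the total
	fmt.Println(cache.Get())
}
```

In this article the "cached value" lives in PostgreSQL itself (a materialized view), which buys us persistence and SQL access for free.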
Setting Up Your PostgreSQL Database
First, you need a PostgreSQL database. I’m assuming you’ve got PostgreSQL installed (if not, grab it from postgresql.org). Let’s create a sample database for a sales dashboard with orders data.
```sql
-- Create database and table
CREATE DATABASE sales_dashboard;
\c sales_dashboard

CREATE TABLE orders (
    id SERIAL PRIMARY KEY,
    customer_id INTEGER,
    amount DECIMAL(10, 2),
    order_date TIMESTAMP
);

-- Insert sample data
INSERT INTO orders (customer_id, amount, order_date) VALUES
    (1, 99.50, '2025-07-01 10:00:00'),
    (2, 150.75, '2025-07-02 12:30:00'),
    (1, 45.25, '2025-07-03 09:15:00');
```
Output: A `sales_dashboard` database with an `orders` table containing sample data.
This sets the stage for our cache and dashboard queries.
Creating a Materialized View for Caching
PostgreSQL’s materialized views are perfect for caching. Unlike regular views, they store query results physically, so you don’t rerun expensive queries. Let’s create a materialized view to summarize daily sales.
```sql
CREATE MATERIALIZED VIEW daily_sales_cache AS
SELECT
    DATE(order_date) AS sale_date,
    COUNT(*) AS order_count,
    SUM(amount) AS total_amount
FROM orders
GROUP BY DATE(order_date)
WITH DATA;

-- Refresh the view (run this to update cache)
REFRESH MATERIALIZED VIEW daily_sales_cache;
```
Output: A `daily_sales_cache` view with columns `sale_date`, `order_count`, and `total_amount`.
Why this rocks: Queries hit the cached view instead of scanning the entire `orders` table. Use `REFRESH MATERIALIZED VIEW` to update it when data changes. We'll automate this later with Go.
Building the Golang Backend
Now, let's set up a Golang backend to serve the dashboard and manage the cache. You'll need Go installed (get it at golang.org). We'll use the `net/http` package for the API and `database/sql` with the `pq` driver for PostgreSQL.
```go
package main

import (
	"database/sql"
	"encoding/json"
	"fmt"
	"log"
	"net/http"

	_ "github.com/lib/pq"
)

const (
	dbHost     = "localhost"
	dbPort     = 5432
	dbUser     = "your_user"
	dbPassword = "your_password"
	dbName     = "sales_dashboard"
)

func main() {
	// Connect to PostgreSQL. fmt.Sprintf formats the integer port
	// correctly; string(dbPort) would yield a rune, not "5432".
	connStr := fmt.Sprintf("host=%s port=%d user=%s password=%s dbname=%s sslmode=disable",
		dbHost, dbPort, dbUser, dbPassword, dbName)
	db, err := sql.Open("postgres", connStr)
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// API endpoint for dashboard data
	http.HandleFunc("/api/sales", func(w http.ResponseWriter, r *http.Request) {
		rows, err := db.Query("SELECT sale_date, order_count, total_amount FROM daily_sales_cache")
		if err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		defer rows.Close()

		type Sale struct {
			Date        string  `json:"date"`
			OrderCount  int     `json:"order_count"`
			TotalAmount float64 `json:"total_amount"`
		}
		var sales []Sale
		for rows.Next() {
			var s Sale
			if err := rows.Scan(&s.Date, &s.OrderCount, &s.TotalAmount); err != nil {
				http.Error(w, err.Error(), http.StatusInternalServerError)
				return
			}
			sales = append(sales, s)
		}

		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(sales)
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```
Output: A Go server running on `localhost:8080` with an `/api/sales` endpoint returning JSON from the `daily_sales_cache` view.
Pro tip: Replace `your_user` and `your_password` with your PostgreSQL credentials. Install the `pq` driver with `go get github.com/lib/pq`.
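A common pitfall when assembling the connection string is converting the integer port with `string(port)`, which produces a rune rather than digits. One way to avoid that is a small DSN builder using `fmt.Sprintf` (the `buildDSN` helper is my own naming, not a library function):

```go
package main

import "fmt"

// buildDSN assembles a lib/pq-style connection string, letting
// fmt.Sprintf's %d verb format the integer port correctly.
func buildDSN(host string, port int, user, password, dbname string) string {
	return fmt.Sprintf("host=%s port=%d user=%s password=%s dbname=%s sslmode=disable",
		host, port, user, password, dbname)
}

func main() {
	fmt.Println(buildDSN("localhost", 5432, "your_user", "your_password", "sales_dashboard"))
	// → host=localhost port=5432 user=your_user password=your_password dbname=sales_dashboard sslmode=disable
}
```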
Scheduling Cache Refreshes with Go
To keep the cache fresh, we'll use a Go routine with `time.Ticker` to refresh the materialized view periodically. This runs in the background, so your API stays responsive.
```go
package main

import (
	"database/sql"
	"log"
	"time"

	_ "github.com/lib/pq"
)

func refreshCache(db *sql.DB) {
	ticker := time.NewTicker(1 * time.Hour)
	defer ticker.Stop()
	for range ticker.C {
		_, err := db.Exec("REFRESH MATERIALIZED VIEW daily_sales_cache")
		if err != nil {
			log.Printf("Error refreshing cache: %v", err)
		} else {
			log.Println("Cache refreshed successfully")
		}
	}
}

func main() {
	connStr := "host=localhost port=5432 user=your_user password=your_password dbname=sales_dashboard sslmode=disable"
	db, err := sql.Open("postgres", connStr)
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Start cache refresh routine
	go refreshCache(db)

	// Rest of the server code (same as above)
	// ...
}
```
Output: Logs “Cache refreshed successfully” every hour, or an error if the refresh fails.
Why this works: The `go` keyword runs `refreshCache` in a separate goroutine, so it doesn't block the main server. Adjust the `time.Hour` interval as needed.
Wiring Up a Simple Dashboard Frontend
Let’s create a basic HTML dashboard to display the data. We’ll use plain JavaScript to fetch data from the Go API and render it in a table. Host this in the same Go server by adding a file server.
```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>Sales Dashboard</title>
  <style>
    table { border-collapse: collapse; width: 100%; }
    th, td { border: 1px solid #ddd; padding: 8px; text-align: left; }
    th { background-color: #f2f2f2; }
  </style>
</head>
<body>
  <h1>Sales Dashboard</h1>
  <table id="salesTable">
    <thead>
      <tr>
        <th>Date</th>
        <th>Order Count</th>
        <th>Total Amount</th>
      </tr>
    </thead>
    <tbody id="salesBody"></tbody>
  </table>
  <script>
    fetch('/api/sales')
      .then(response => response.json())
      .then(data => {
        const tbody = document.getElementById('salesBody');
        data.forEach(sale => {
          const row = document.createElement('tr');
          row.innerHTML = `
            <td>${sale.date}</td>
            <td>${sale.order_count}</td>
            <td>$${sale.total_amount.toFixed(2)}</td>
          `;
          tbody.appendChild(row);
        });
      })
      .catch(error => console.error('Error:', error));
  </script>
</body>
</html>
```
Output: A webpage at `localhost:8080` showing a table with sales data.
Add this to your Go server:

```go
http.Handle("/", http.FileServer(http.Dir("static")))
```

Save the HTML as `static/index.html`. The server now serves both the API and the dashboard.
Optimizing Cache Performance
Materialized views are great, but they can bloat as your data grows. Index your materialized view to speed up queries, and use CONCURRENTLY for refreshes to avoid locking the view. One catch: `REFRESH MATERIALIZED VIEW CONCURRENTLY` requires a unique index on the view, so create the index as `UNIQUE` (valid here because `GROUP BY DATE(order_date)` yields exactly one row per date).

```sql
-- Unique index on sale_date: required for CONCURRENTLY,
-- and it speeds up date filtering too
CREATE UNIQUE INDEX idx_daily_sales_date ON daily_sales_cache (sale_date);

-- Refresh without locking out readers
REFRESH MATERIALIZED VIEW CONCURRENTLY daily_sales_cache;
```
Update the Go refresh function to use `CONCURRENTLY`:

```go
_, err := db.Exec("REFRESH MATERIALIZED VIEW CONCURRENTLY daily_sales_cache")
```
Why this matters: `CONCURRENTLY` lets users query the view during refreshes, keeping the dashboard responsive. The index speeds up queries on `sale_date`.
| Optimization | Benefit |
| --- | --- |
| Index on `sale_date` | Faster filtering/sorting |
| `CONCURRENTLY` | No downtime during refresh |
Handling Errors and Monitoring
Things can go wrong—database outages, network issues, or query failures. Add error handling and logging to your Go code to catch issues early. Let’s enhance the refresh function with retries.
```go
func refreshCache(db *sql.DB) {
	ticker := time.NewTicker(1 * time.Hour)
	defer ticker.Stop()
	for range ticker.C {
		for retries := 3; retries > 0; retries-- {
			_, err := db.Exec("REFRESH MATERIALIZED VIEW CONCURRENTLY daily_sales_cache")
			if err == nil {
				log.Println("Cache refreshed successfully")
				break
			}
			log.Printf("Error refreshing cache (retries left: %d): %v", retries-1, err)
			time.Sleep(5 * time.Second)
		}
	}
}
```
Output: Logs success or errors with retry attempts.
Pro tip: Use a monitoring tool like Prometheus to track refresh durations and API response times; the Prometheus documentation covers setup.
Next Steps for Scaling Your Dashboard
You've got a solid dashboard, but there's room to grow. Add authentication to secure your API (try JWT). Scale the cache by partitioning the `orders` table for huge datasets. Enhance the frontend with a framework like React for interactivity. Finally, consider deploying to a cloud provider like AWS or Heroku, and use a connection pooler like PgBouncer for high traffic.
| Next Step | Why Do It |
| --- | --- |
| Add JWT | Secure API access |
| Partition tables | Handle massive data |
| Use React | Dynamic UI |
| Deploy with PgBouncer | Manage connections |
This setup gives you a fast, reliable dashboard that’s ready to scale. Keep tweaking based on your app’s needs, and you’ll have users raving about the speed.