In e-commerce and on-demand delivery, the moment a user clicks "Place Order" begins an anxiety cycle. Where is my order? When will it arrive? This uncertainty kills conversions.
Between November 2020 and July 2021, I worked as a Full-Stack Developer at Viaduct, where I engineered a real-time delivery tracking system for an on-demand platform serving 5,000+ active users. The platform connected customers with local vendors for same-day delivery. Before I joined, users were abandoning carts and cancelling orders at alarming rates because they had zero visibility into delivery status.
This article details how I designed and built the tracking system: Google Maps integration, WebSocket-powered live updates, a Vue.js reactive interface, and the Laravel backend tying it together. The result: a 20-percentage-point increase in checkout conversion.
The Problem: Delivery Opacity Was Killing Revenue
When I joined Viaduct, the delivery platform was functional but opaque. Users placed orders, received a confirmation email, and waited. The only updates were manual — a dispatcher marked orders "out for delivery" when a driver departed and "delivered" when reported back.
The consequences were measurable:
Cart abandonment: 32% of users abandoned at checkout. Exit surveys showed the top reason was "uncertain delivery time."
Post-order cancellations: 15% of completed checkouts were cancelled within 30 minutes, most citing "no delivery updates."
Support overload: 200+ daily calls asking "where is my order?" — each averaging 4 minutes, consuming 13 person-hours daily.
Driver accountability gaps: Without GPS tracking, there was no way to verify delivery times, identify route inefficiencies, or resolve disputes.
My Role: Full-Stack Architect of the Tracking System
I designed and built the entire real-time tracking system end-to-end: the Laravel backend infrastructure, Google Maps API integration, the WebSocket communication layer, and the Vue.js frontend tracking interface. I owned it from initial technical design through production deployment and post-launch optimisation.
Technical Deep Dive: Building the Real-Time Tracking Infrastructure
1. Driver Location Collection Architecture
The foundation of real-time tracking is reliable location data. I built a location reporting service on drivers' devices, transmitting GPS coordinates at configurable intervals.
```php
class DriverLocationController extends Controller
{
    public function updateLocation(UpdateLocationRequest $request): JsonResponse
    {
        $driver = $request->user();
        $validated = $request->validated();

        $location = DriverLocation::create([
            'driver_id' => $driver->id,
            'latitude' => $validated['latitude'],
            'longitude' => $validated['longitude'],
            'accuracy' => $validated['accuracy'],
            'speed' => $validated['speed'] ?? null,
            'heading' => $validated['heading'] ?? null,
            'recorded_at' => Carbon::createFromTimestamp($validated['timestamp']),
        ]);

        if ($activeDelivery = $driver->activeDelivery) {
            $this->broadcastLocationUpdate($activeDelivery, $location);
            $this->updateEstimatedArrival($activeDelivery, $location);
        }

        return response()->json(['status' => 'recorded'], 200);
    }

    private function broadcastLocationUpdate(Delivery $delivery, DriverLocation $location): void
    {
        Redis::publish("delivery:{$delivery->id}:location", json_encode([
            'delivery_id' => $delivery->id,
            'driver' => [
                'latitude' => $location->latitude,
                'longitude' => $location->longitude,
                'heading' => $location->heading,
            ],
            'timestamp' => $location->recorded_at->toIso8601String(),
        ]));
    }
}
```
Location updates are published to Redis channels, bridging the Laravel backend and the WebSocket server. This decoupled location ingestion (high-frequency HTTP POSTs from drivers) from location broadcasting (persistent WebSocket connections to customers), allowing each to scale independently.
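The driver-side reporter that POSTs to this endpoint isn't shown above. A minimal sketch of what it might look like follows; `LocationReporter`, `getPosition`, and `send` are hypothetical names, injected so the logic stays testable (in the real app they would wrap `navigator.geolocation` and a `fetch()` to the Laravel API):

```javascript
// Hypothetical sketch of a driver-side location reporter.
// `getPosition` and `send` are injected: in production they would wrap
// navigator.geolocation and fetch() against the updateLocation endpoint.
class LocationReporter {
  constructor({ getPosition, send, intervalMs = 10000 }) {
    this.getPosition = getPosition;
    this.send = send;
    this.intervalMs = intervalMs;
    this.timer = null;
  }

  start() {
    if (this.timer) return; // already running
    this.timer = setInterval(() => this.report(), this.intervalMs);
  }

  stop() {
    clearInterval(this.timer);
    this.timer = null;
  }

  report() {
    const pos = this.getPosition();
    this.send({
      latitude: pos.latitude,
      longitude: pos.longitude,
      accuracy: pos.accuracy,
      speed: pos.speed ?? null,     // optional fields default to null,
      heading: pos.heading ?? null, // matching the server-side validation
      timestamp: Math.floor(Date.now() / 1000),
    });
  }
}
```

Injecting the two callbacks keeps the configurable-interval behaviour unit-testable without a device or a network.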
2. Google Maps Integration for ETA Calculation
Accurate ETAs were critical. I integrated the Google Maps Directions API with live traffic data:
```php
class GoogleMapsService
{
    public static function getDirections(array $origin, array $destination): ?DirectionsResult
    {
        $response = Http::get('https://maps.googleapis.com/maps/api/directions/json', [
            'origin' => "{$origin['lat']},{$origin['lng']}",
            'destination' => "{$destination['lat']},{$destination['lng']}",
            'mode' => 'driving',
            'departure_time' => 'now',
            'traffic_model' => 'best_guess',
            'key' => config('services.google_maps.api_key'),
        ]);

        if (!$response->successful() || $response->json('status') !== 'OK') {
            return null;
        }

        $leg = $response->json('routes.0.legs.0');

        return new DirectionsResult(
            durationMinutes: (int) ceil($leg['duration_in_traffic']['value'] / 60),
            distanceKm: round($leg['distance']['value'] / 1000, 1),
            polyline: $response->json('routes.0.overview_polyline.points'),
        );
    }
}
```
Using `departure_time=now` and `traffic_model=best_guess` ensured ETAs reflected live traffic; during rush hours, a static estimate might show 10 minutes when the actual time was 25. However, calling Google Maps on every location update (every 10 seconds per driver) would be prohibitively expensive, so I implemented intelligent throttling using Haversine distance:
```php
class ETAService
{
    private const ETA_CACHE_TTL = 60;            // seconds
    private const MIN_DISTANCE_FOR_RECALC = 0.2; // km

    public function shouldRecalculateETA(Delivery $delivery, DriverLocation $newLocation): bool
    {
        if (!$delivery->eta_updated_at) {
            return true;
        }

        if (now()->diffInSeconds($delivery->eta_updated_at) < self::ETA_CACHE_TTL) {
            return false;
        }

        $lastCalcLocation = Cache::get("eta_calc_location:{$delivery->id}");

        if (!$lastCalcLocation) {
            return true;
        }

        return $this->haversineDistance(
            $lastCalcLocation['lat'], $lastCalcLocation['lng'],
            $newLocation->latitude, $newLocation->longitude
        ) >= self::MIN_DISTANCE_FOR_RECALC;
    }

    private function haversineDistance(float $lat1, float $lon1, float $lat2, float $lon2): float
    {
        $dLat = deg2rad($lat2 - $lat1);
        $dLon = deg2rad($lon2 - $lon1);
        $a = sin($dLat / 2) ** 2 + cos(deg2rad($lat1)) * cos(deg2rad($lat2)) * sin($dLon / 2) ** 2;

        return 6371 * 2 * atan2(sqrt($a), sqrt(1 - $a)); // Earth radius in km
    }
}
```
ETAs were recalculated only when the driver moved 200m+ and 60+ seconds had elapsed, reducing Google Maps API calls by ~80%.
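The Directions response above also carries an encoded overview polyline (`routes.0.overview_polyline.points`), which the frontend must decode to draw the route on the map. The encoding is Google's standard delta format at five decimal places; a decoder sketch (not the project's exact code) looks like this:

```javascript
// Decode a Google encoded polyline into [lat, lng] pairs.
// Each coordinate is a delta from the previous one, stored as 5-bit
// chunks offset by 63, with the low bit carrying the sign.
function decodePolyline(encoded) {
  const points = [];
  let index = 0;
  let lat = 0;
  let lng = 0;

  // Read one varint-style signed value from the string.
  function nextDelta() {
    let result = 0;
    let shift = 0;
    let b;
    do {
      b = encoded.charCodeAt(index++) - 63;
      result |= (b & 0x1f) << shift;
      shift += 5;
    } while (b >= 0x20); // continuation bit set while byte >= 0x20
    return (result & 1) ? ~(result >> 1) : (result >> 1);
  }

  while (index < encoded.length) {
    lat += nextDelta();
    lng += nextDelta();
    points.push([lat / 1e5, lng / 1e5]); // values are scaled by 1e5
  }
  return points;
}
```

Feeding the decoded pairs to a `google.maps.Polyline` renders the driver's planned route alongside the live marker.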
3. WebSocket Server for Live Customer Updates
The real-time frontend was powered by a Node.js WebSocket server subscribed to Redis channels:
```javascript
const WebSocket = require('ws');
const Redis = require('ioredis');

const wss = new WebSocket.Server({ port: 6001 });
const subscriber = new Redis(process.env.REDIS_URL);
const deliverySubscriptions = new Map();

wss.on('connection', (ws) => {
  ws.on('message', (raw) => {
    const msg = JSON.parse(raw);
    if (msg.type === 'subscribe_delivery') {
      const deliveryId = String(msg.deliveryId); // channel segments are strings
      ws.deliveryId = deliveryId;                // remembered for cleanup on close
      if (!deliverySubscriptions.has(deliveryId)) {
        deliverySubscriptions.set(deliveryId, new Set());
        subscriber.subscribe(
          `delivery:${deliveryId}:location`,
          `delivery:${deliveryId}:eta`,
          `delivery:${deliveryId}:status`
        );
      }
      deliverySubscriptions.get(deliveryId).add(ws);
    }
  });

  ws.on('close', () => {
    if (ws.deliveryId) {
      const subs = deliverySubscriptions.get(ws.deliveryId);
      if (subs) {
        subs.delete(ws);
        if (subs.size === 0) deliverySubscriptions.delete(ws.deliveryId);
      }
    }
  });
});

subscriber.on('message', (channel, message) => {
  const [, deliveryId, eventType] = channel.split(':');
  const subs = deliverySubscriptions.get(deliveryId);
  if (!subs) return;
  const payload = JSON.stringify({ type: `delivery_${eventType}`, data: JSON.parse(message) });
  for (const ws of subs) {
    if (ws.readyState === WebSocket.OPEN) ws.send(payload);
  }
});
```
The server was deliberately stateless — it bridged Redis pub/sub to WebSocket connections with no business logic, making it horizontally scalable.
4. Vue.js Real-Time Tracking Interface
The customer-facing tracking page used Vue.js Composition API. The core was a WebSocket composable with exponential backoff reconnection:
```javascript
import { ref } from 'vue';

export function useDeliveryWebSocket(deliveryId) {
  const driverLocation = ref(null);
  const eta = ref(null);
  const deliveryStatus = ref('confirmed');

  let ws = null;
  let reconnectAttempts = 0;
  let intentionalClose = false;

  function connect() {
    ws = new WebSocket(process.env.VUE_APP_WS_URL);

    ws.onopen = () => {
      reconnectAttempts = 0;
      ws.send(JSON.stringify({ type: 'subscribe_delivery', deliveryId }));
    };

    ws.onmessage = ({ data }) => {
      const msg = JSON.parse(data);
      if (msg.type === 'delivery_location') driverLocation.value = msg.data.driver;
      else if (msg.type === 'delivery_eta') eta.value = msg.data.eta_minutes;
      else if (msg.type === 'delivery_status') deliveryStatus.value = msg.data.status;
    };

    // Exponential backoff: 1s, 2s, 4s ... capped at 30s, up to 10 attempts.
    ws.onclose = () => {
      if (!intentionalClose && reconnectAttempts < 10) {
        setTimeout(connect, Math.min(1000 * 2 ** reconnectAttempts++, 30000));
      }
    };
  }

  function disconnect() {
    intentionalClose = true; // suppress the reconnect loop
    ws?.close();
  }

  return { driverLocation, eta, deliveryStatus, connect, disconnect };
}
```
The map component initialised Google Maps, rendered destination and driver markers, and animated the driver marker between positions using cubic easing. Without animation, the marker would "teleport" every 10 seconds — with smooth interpolation, users perceived continuous movement, dramatically improving perceived quality.
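A sketch of that interpolation, assuming a marker object with a `setPosition` method (as the Google Maps `Marker` API provides); the helper names are illustrative, not the project's exact code:

```javascript
// Cubic ease-in-out: slow start, fast middle, slow finish.
const easeInOutCubic = (t) =>
  t < 0.5 ? 4 * t * t * t : 1 - Math.pow(-2 * t + 2, 3) / 2;

// Blend two lat/lng positions; plain linear interpolation of the
// coordinates is fine at city-scale distances.
function interpolatePosition(from, to, t) {
  const e = easeInOutCubic(t);
  return {
    lat: from.lat + (to.lat - from.lat) * e,
    lng: from.lng + (to.lng - from.lng) * e,
  };
}

// Drive the animation with requestAnimationFrame (browser only).
function animateMarker(marker, from, to, durationMs = 1000) {
  const start = performance.now();
  function step(nowTs) {
    const t = Math.min((nowTs - start) / durationMs, 1);
    marker.setPosition(interpolatePosition(from, to, t));
    if (t < 1) requestAnimationFrame(step);
  }
  requestAnimationFrame(step);
}
```

Matching the animation duration to the 10-second reporting interval makes the marker appear to glide continuously between updates.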
5. Delivery Status Lifecycle and Performance
Each status transition was published to Redis for real-time updates, triggered a push notification, and recorded metrics:
```php
class Delivery extends Model
{
    public function markPickedUp(): void
    {
        $this->update(['status' => 'picked_up', 'picked_up_at' => now()]);

        Redis::publish("delivery:{$this->id}:status", json_encode([
            'delivery_id' => $this->id,
            'status' => 'picked_up',
            'driver' => ['name' => $this->driver->name, 'phone' => $this->driver->phone],
        ]));

        $this->order->customer->notify(new DeliveryPickedUpNotification($this));
    }

    public function markDelivered(): void
    {
        $this->update(['status' => 'delivered', 'delivered_at' => now()]);

        Redis::publish("delivery:{$this->id}:status", json_encode([
            'delivery_id' => $this->id,
            'status' => 'delivered',
        ]));

        $this->order->customer->notify(new DeliveryCompletedNotification($this));

        DeliveryMetric::create([
            'delivery_id' => $this->id,
            'driver_id' => $this->driver_id,
            'estimated_minutes' => $this->estimated_arrival_minutes,
            'actual_minutes' => $this->picked_up_at?->diffInMinutes($this->delivered_at),
        ]);
    }
}
```
The DeliveryMetric model enabled comparing estimated vs. actual delivery times, identifying underperforming drivers and slow routes.
For performance at scale, I buffered location writes in Redis and flushed to PostgreSQL in batches every 30 seconds, reducing database write load by 90% while maintaining real-time delivery through the pub/sub layer. Vendor discovery pages used PostGIS spatial queries (ST_DWithin, ST_Distance) with aggressive caching, returning results in under 50ms across thousands of records.
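The buffering idea can be sketched as a small size-or-time bounded buffer. This is a hypothetical `WriteBuffer`; in the project the buffer lived in Redis and the flush performed a batched insert into PostgreSQL:

```javascript
// Sketch of a size-or-time bounded write buffer (illustrative names).
// In the real system, flushFn performed one batched INSERT for the
// whole array instead of a row-per-write.
class WriteBuffer {
  constructor(flushFn, { maxItems = 500, flushMs = 30000 } = {}) {
    this.flushFn = flushFn;
    this.maxItems = maxItems;
    this.items = [];
    // Time-based flush bounds staleness even under low traffic.
    this.timer = setInterval(() => this.flush(), flushMs);
  }

  push(item) {
    this.items.push(item);
    if (this.items.length >= this.maxItems) this.flush(); // size-based flush
  }

  flush() {
    if (this.items.length === 0) return;
    const batch = this.items;
    this.items = [];
    this.flushFn(batch);
  }

  close() {
    clearInterval(this.timer);
    this.flush(); // drain remaining items on shutdown
  }
}
```

Because customers read from the Redis pub/sub stream rather than the database, the 30-second write lag is invisible to them.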
The Results: Measurable Business Impact
The system launched after eight weeks of development. Results over a 60-day observation period:
| Metric | Before | After | Change |
|---|---|---|---|
| Checkout conversion rate | 68% | 88% | +20 percentage points |
| Post-order cancellation rate | 15% | 4% | 73% reduction |
| "Where is my order?" support calls | 200+/day | ~35/day | 82% reduction |
| Customer satisfaction score | 3.4/5 | 4.3/5 | 26% improvement |
| Active platform users | ~3,200 | 5,000+ | 56% growth |
| Tracking page views per order | 0 | 4.2 | New engagement channel |
The support call reduction alone freed 11 person-hours daily — redirected to proactive customer outreach and vendor management.
Technical Lessons
Decouple ingestion from delivery. Location collection (Laravel) and broadcasting (Node.js WebSocket) had different performance profiles. Separate services connected by Redis allowed independent scaling.
Animate transitions, don't teleport. Users perceive smooth movement as real-time tracking and position jumps as a broken system.
Throttle expensive API calls intelligently. The Haversine check before ETA recalculation saved thousands of unnecessary Google Maps calls daily.
Buffer writes, stream reads. Batch-writing to the database while streaming through Redis provided both persistence and live performance.
Measure business metrics. WebSocket latency matters, but checkout conversion rate proved the system's value.