So! A plain Node.js server backend talking to the real-estate site TradeMe is working with real data :)
The next step is to store that data in a structured, modern way. Time to play with Supabase, the popular open-source backend platform built on PostgreSQL. I'm starting with one table for now.
Current State Analysis
- Express.js backend fetching open homes from TradeMe API
- Data is fetched on-demand from TradeMe
- Environment variables already configured via dotenv
- Two endpoints: `/api/open-homes` and `/api/open-homes/:id`
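The current state can be sketched as two plain handler functions. This is a reconstruction, not the real code: `fetchOpenHomes` and the sample fields are stand-ins for the actual TradeMe fetch.

```javascript
// Sketch of the two current endpoints as plain handler functions, so the
// TradeMe call can be stubbed. In index.js they would be mounted as:
//   app.get('/api/open-homes', listOpenHomes);
//   app.get('/api/open-homes/:id', getOpenHome);

async function fetchOpenHomes() {
  // Stand-in for the real TradeMe API request.
  return [{ listingId: '123', title: 'Sunny villa in Wellington' }];
}

async function listOpenHomes(req, res) {
  const homes = await fetchOpenHomes();
  res.json(homes);
}

async function getOpenHome(req, res) {
  const homes = await fetchOpenHomes();
  const home = homes.find((h) => h.listingId === req.params.id);
  if (!home) {
    return res.status(404).json({ error: 'Open home not found' });
  }
  res.json(home);
}
```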
Steps!
- Install `@supabase/supabase-js`
- Create Supabase project and get credentials
- Set up database schema in Supabase dashboard
- Create `config/supabase.js` for client initialization
- Create `lib/database.js` with CRUD functions
- Update `index.js` to integrate database calls
- Add sync logic (on-demand for now)
- Test endpoints with database integration
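The client-initialization step could look something like this. A minimal sketch assuming `@supabase/supabase-js` v2; the env var names `SUPABASE_URL` and `SUPABASE_SERVICE_KEY` are my own choice, not a fixed convention.

```javascript
// config/supabase.js — minimal client initialization sketch.
// Assumes two new entries in .env (names are assumptions):
//   SUPABASE_URL=...
//   SUPABASE_SERVICE_KEY=...
require('dotenv').config();
const { createClient } = require('@supabase/supabase-js');

const supabaseUrl = process.env.SUPABASE_URL;
const supabaseKey = process.env.SUPABASE_SERVICE_KEY;

if (!supabaseUrl || !supabaseKey) {
  throw new Error('Missing SUPABASE_URL or SUPABASE_SERVICE_KEY in .env');
}

// Export a single shared client for the rest of the backend.
module.exports = createClient(supabaseUrl, supabaseKey);
```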
Sketch new directory structure
```
househunt-backend/
├── config/
│   └── supabase.js      # Supabase client initialization
├── lib/
│   └── database.js      # Database helper functions
├── services/
│   └── syncService.js   # TradeMe sync logic
├── index.js             # Main Express app (updated)
├── package.json         # Updated with @supabase/supabase-js
└── .env                 # Updated with Supabase credentials
```
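The on-demand sync piece could be sketched with injected dependencies, so the TradeMe fetcher and the database layer can be stubbed in tests. The function names here (`fetchFromTradeMe`, `upsertOpenHome`) are assumptions, not an existing API.

```javascript
// services/syncService.js — sketch of the on-demand sync step.
// Both injected functions are hypothetical names for this sketch.
async function syncOpenHomes({ fetchFromTradeMe, upsertOpenHome }) {
  const listings = await fetchFromTradeMe();
  let synced = 0;
  const errors = [];
  for (const listing of listings) {
    try {
      await upsertOpenHome(listing); // write/refresh one row per listing
      synced += 1;
    } catch (err) {
      errors.push({ listing, message: err.message });
    }
  }
  return { synced, errors };
}

module.exports = { syncOpenHomes };
```

Since the plan is on-demand sync, `index.js` could call this at the top of the `/api/open-homes` handler before reading from the database.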
Table Structure
open_homes table:
- `id` (uuid, primary key)
- `listing_id` (text, unique) - TradeMe listing ID
- `title` (text)
- `location` (text)
- `bedrooms` (integer)
- `bathrooms` (integer)
- `open_home_time` (timestamp)
- `price` (text)
- `picture_href` (text)
- `trademe_data` (jsonb) - store full TradeMe response
- `created_at` (timestamp)
- `updated_at` (timestamp)
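For `lib/database.js`, each TradeMe listing needs to be mapped onto the columns above. A sketch, where the input field names (`ListingId`, `Title`, `Address`, `PriceDisplay`, `PictureHref`, `Bedrooms`, `Bathrooms`) are guesses at TradeMe's response shape and should be checked against the real payload:

```javascript
// Maps one TradeMe listing onto a row for the open_homes table.
// Input field names are assumptions based on TradeMe's usual PascalCase.
function toOpenHomeRow(listing, openHomeTime) {
  return {
    listing_id: String(listing.ListingId),
    title: listing.Title ?? null,
    location: listing.Address ?? null,
    bedrooms: listing.Bedrooms ?? null,
    bathrooms: listing.Bathrooms ?? null,
    open_home_time: openHomeTime ?? null,
    price: listing.PriceDisplay ?? null,
    picture_href: listing.PictureHref ?? null,
    trademe_data: listing, // full response goes into the jsonb column
    updated_at: new Date().toISOString(),
  };
}

module.exports = { toOpenHomeRow };
```

Because `listing_id` is unique, the write side can then be an upsert keyed on it, e.g. `supabase.from('open_homes').upsert(row, { onConflict: 'listing_id' })`.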
Benefits of Adding Supabase
- Faster responses (cached data)
- Reduced TradeMe API calls
- Historical data tracking
- User features (favorites, search history)
- Real-time updates
- Scalable PostgreSQL database
- Built-in authentication (if needed)
Coming back to do this tomorrow morning!