<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: rika</title>
    <description>The latest articles on DEV Community by rika (@w95).</description>
    <link>https://dev.to/w95</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F421026%2F9ee8228a-501e-4db4-907f-646bdc58e86a.jpeg</url>
      <title>DEV Community: rika</title>
      <link>https://dev.to/w95</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/w95"/>
    <language>en</language>
    <item>
      <title>Amass API - REST API Solution for Domain Reconnaissance</title>
      <dc:creator>rika</dc:creator>
      <pubDate>Tue, 14 Jan 2025 22:53:29 +0000</pubDate>
      <link>https://dev.to/w95/amass-api-rest-api-solution-for-domain-reconnaissance-4h1d</link>
      <guid>https://dev.to/w95/amass-api-rest-api-solution-for-domain-reconnaissance-4h1d</guid>
<description>&lt;p&gt;For a long time I searched for a solution like this and finally decided to build my own. This project is a Flask-based web application that wraps OWASP Amass to automate domain reconnaissance for security professionals. It significantly simplifies subdomain discovery during penetration testing, saving substantial time and effort compared to manual enumeration.&lt;/p&gt;

&lt;h3&gt;What is Amass?&lt;/h3&gt;

&lt;p&gt;OWASP Amass is a powerful tool for network infrastructure reconnaissance. It collects data from over 55 external sources to identify subdomains, IP addresses, and other network information of target systems. Amass combines passive and active reconnaissance techniques to provide security professionals with extensive and accurate data. By leveraging DNS data collection, SSL certificate analysis, and other techniques, it offers a detailed view of the target system's network landscape.&lt;/p&gt;

&lt;p&gt;This tool was created to solve a longstanding problem and aims to simplify the daily workflows of cybersecurity professionals.&lt;/p&gt;

&lt;h3&gt;Key Features&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Fully automated domain reconnaissance via Amass&lt;/li&gt;
&lt;li&gt;Recursive enumeration with a configurable minimum number of findings before recursing&lt;/li&gt;
&lt;li&gt;Subdomain discovery using brute-force methods&lt;/li&gt;
&lt;li&gt;API endpoint to retrieve results in JSON format&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;Installation Requirements&lt;/h3&gt;

&lt;p&gt;Docker and Docker Compose must be installed on the system. The application can be run using a pre-built image from Docker Hub:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker pull enrikenur/amass-api
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;Installation Steps&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Clone the repository:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git clone https://github.com/w95/amass-api
cd amass-api
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="2"&gt;
&lt;li&gt;Build and start the application using Docker Compose:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker-compose up --build
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="3"&gt;
&lt;li&gt;The application will be accessible at &lt;code&gt;http://localhost:5000&lt;/code&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;API Usage&lt;/h3&gt;

&lt;p&gt;To initiate domain reconnaissance, use the &lt;code&gt;/api/amass/enum&lt;/code&gt; endpoint with the POST method. Request parameters include the target domain (&lt;code&gt;domain&lt;/code&gt;), brute-force mode (&lt;code&gt;brute&lt;/code&gt;), and the minimum number of findings for recursion (&lt;code&gt;min_for_recursive&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;Sample Request:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "domain": "example.com",
  "brute": true,
  "min_for_recursive": 2
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
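
&lt;p&gt;For example, the request above can be sent with &lt;code&gt;curl&lt;/code&gt; once the container is running (a sketch; it assumes the API is listening on &lt;code&gt;localhost:5000&lt;/code&gt; as noted in the installation steps):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# JSON payload matching the sample request above
PAYLOAD='{"domain": "example.com", "brute": true, "min_for_recursive": 2}'

# POST it to the enumeration endpoint; prints JSON results on success
curl -s -X POST http://localhost:5000/api/amass/enum \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD" || echo "request failed - is the container running?"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;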



&lt;p&gt;Developers interested in contributing can fork the repository and propose changes. All contributions are welcome.&lt;/p&gt;

</description>
      <category>amass</category>
      <category>infosec</category>
      <category>api</category>
      <category>security</category>
    </item>
    <item>
      <title>Cloning PostgreSQL Databases: A Developer's Guide to Local Replication</title>
      <dc:creator>rika</dc:creator>
      <pubDate>Sat, 23 Nov 2024 14:21:00 +0000</pubDate>
      <link>https://dev.to/w95/cloning-postgresql-databases-a-developers-guide-to-local-replication-14a6</link>
      <guid>https://dev.to/w95/cloning-postgresql-databases-a-developers-guide-to-local-replication-14a6</guid>
      <description>&lt;p&gt;Like many developers, I frequently need to work with production-like data locally. Maybe you're debugging a tricky issue that only happens with specific data patterns, or you're testing a complex migration that could affect millions of rows. Whatever the reason, having a local copy of your production database can be incredibly useful.&lt;/p&gt;

&lt;p&gt;I've spent considerable time perfecting a reliable process for this at work, and I want to share my approach. This guide will walk you through creating an exact copy of a remote PostgreSQL database on your local machine.&lt;/p&gt;

&lt;h2&gt;Why Clone Databases Locally?&lt;/h2&gt;

&lt;p&gt;Before we dive in, let's talk about when you might want to do this. I typically clone databases when:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;I'm writing a complex migration and want to verify it won't blow up with real data&lt;/li&gt;
&lt;li&gt;I need to debug a production issue that I can't reproduce with test data&lt;/li&gt;
&lt;li&gt;I'm optimizing queries and need realistic data volumes for meaningful performance testing&lt;/li&gt;
&lt;li&gt;I want to experiment with schema changes without affecting anyone else&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;The Script&lt;/h2&gt;

&lt;p&gt;Here's the script I use. It's battle-tested and handles most edge cases I've encountered:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;

&lt;span class="c"&gt;# Remote database configuration&lt;/span&gt;
&lt;span class="nv"&gt;REMOTE_HOST&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"your-remote-host.database.azure.com"&lt;/span&gt;
&lt;span class="nv"&gt;REMOTE_PORT&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"5432"&lt;/span&gt;
&lt;span class="nv"&gt;REMOTE_USER&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"your_remote_user"&lt;/span&gt;
&lt;span class="nv"&gt;REMOTE_PASSWORD&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"your_remote_password"&lt;/span&gt;

&lt;span class="c"&gt;# Local database configuration&lt;/span&gt;
&lt;span class="nv"&gt;LOCAL_HOST&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"localhost"&lt;/span&gt;
&lt;span class="nv"&gt;LOCAL_PORT&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"5432"&lt;/span&gt;
&lt;span class="nv"&gt;LOCAL_USER&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"your_local_user"&lt;/span&gt;
&lt;span class="nv"&gt;LOCAL_PASSWORD&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"your_local_password"&lt;/span&gt;

&lt;span class="c"&gt;# Function to copy a database&lt;/span&gt;
copy_database&lt;span class="o"&gt;()&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
    &lt;span class="nb"&gt;local &lt;/span&gt;&lt;span class="nv"&gt;DB&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;$1&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Processing database: &lt;/span&gt;&lt;span class="nv"&gt;$DB&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;

    &lt;span class="c"&gt;# Create a timestamp for the backup file&lt;/span&gt;
    &lt;span class="nv"&gt;TIMESTAMP&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;date&lt;/span&gt; +%Y%m%d_%H%M%S&lt;span class="si"&gt;)&lt;/span&gt;
    &lt;span class="nv"&gt;BACKUP_FILE&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;DB&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;_backup_&lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;TIMESTAMP&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;.sql"&lt;/span&gt;

    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Starting database copy process for &lt;/span&gt;&lt;span class="nv"&gt;$DB&lt;/span&gt;&lt;span class="s2"&gt;..."&lt;/span&gt;

    &lt;span class="c"&gt;# Step 1: Create a dump of the remote database&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Creating dump from remote database..."&lt;/span&gt;
    &lt;span class="nv"&gt;PGPASSWORD&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;$REMOTE_PASSWORD&lt;/span&gt; pg_dump &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-h&lt;/span&gt; &lt;span class="nv"&gt;$REMOTE_HOST&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-p&lt;/span&gt; &lt;span class="nv"&gt;$REMOTE_PORT&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-U&lt;/span&gt; &lt;span class="nv"&gt;$REMOTE_USER&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="nv"&gt;$DB&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-F&lt;/span&gt; c &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-f&lt;/span&gt; &lt;span class="nv"&gt;$BACKUP_FILE&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt; &lt;span class="nv"&gt;$?&lt;/span&gt; &lt;span class="nt"&gt;-ne&lt;/span&gt; 0 &lt;span class="o"&gt;]&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
        &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Error: Failed to create database dump for &lt;/span&gt;&lt;span class="nv"&gt;$DB&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
        &lt;span class="k"&gt;return &lt;/span&gt;1
    &lt;span class="k"&gt;fi&lt;/span&gt;

    &lt;span class="c"&gt;# Step 2: Drop the local database if it exists&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Dropping local database if exists..."&lt;/span&gt;
    &lt;span class="nv"&gt;PGPASSWORD&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;$LOCAL_PASSWORD&lt;/span&gt; dropdb &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-h&lt;/span&gt; &lt;span class="nv"&gt;$LOCAL_HOST&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-p&lt;/span&gt; &lt;span class="nv"&gt;$LOCAL_PORT&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-U&lt;/span&gt; &lt;span class="nv"&gt;$LOCAL_USER&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;--if-exists&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nv"&gt;$DB&lt;/span&gt;

    &lt;span class="c"&gt;# Step 3: Create new local database&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Creating new local database..."&lt;/span&gt;
    &lt;span class="nv"&gt;PGPASSWORD&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;$LOCAL_PASSWORD&lt;/span&gt; createdb &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-h&lt;/span&gt; &lt;span class="nv"&gt;$LOCAL_HOST&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-p&lt;/span&gt; &lt;span class="nv"&gt;$LOCAL_PORT&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-U&lt;/span&gt; &lt;span class="nv"&gt;$LOCAL_USER&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nv"&gt;$DB&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt; &lt;span class="nv"&gt;$?&lt;/span&gt; &lt;span class="nt"&gt;-ne&lt;/span&gt; 0 &lt;span class="o"&gt;]&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
        &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Error: Failed to create local database for &lt;/span&gt;&lt;span class="nv"&gt;$DB&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
        &lt;span class="k"&gt;return &lt;/span&gt;1
    &lt;span class="k"&gt;fi&lt;/span&gt;

    &lt;span class="c"&gt;# Step 4: Restore the dump to the local database&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Restoring database locally..."&lt;/span&gt;
    &lt;span class="nv"&gt;PGPASSWORD&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;$LOCAL_PASSWORD&lt;/span&gt; pg_restore &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-h&lt;/span&gt; &lt;span class="nv"&gt;$LOCAL_HOST&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-p&lt;/span&gt; &lt;span class="nv"&gt;$LOCAL_PORT&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-U&lt;/span&gt; &lt;span class="nv"&gt;$LOCAL_USER&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="nv"&gt;$DB&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;--no-owner&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nt"&gt;--no-privileges&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
        &lt;span class="nv"&gt;$BACKUP_FILE&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt; &lt;span class="nv"&gt;$?&lt;/span&gt; &lt;span class="nt"&gt;-ne&lt;/span&gt; 0 &lt;span class="o"&gt;]&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
        &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Warning: Some errors occurred during restore of &lt;/span&gt;&lt;span class="nv"&gt;$DB&lt;/span&gt;&lt;span class="s2"&gt; (this might be normal)"&lt;/span&gt;
    &lt;span class="k"&gt;fi&lt;/span&gt;

    &lt;span class="c"&gt;# Step 5: Cleanup&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Cleaning up temporary files..."&lt;/span&gt;
    &lt;span class="nb"&gt;rm&lt;/span&gt; &lt;span class="nt"&gt;-f&lt;/span&gt; &lt;span class="nv"&gt;$BACKUP_FILE&lt;/span&gt;

    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Database copy process completed for &lt;/span&gt;&lt;span class="nv"&gt;$DB&lt;/span&gt;&lt;span class="s2"&gt;!"&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"----------------------------------------"&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;

&lt;span class="c"&gt;# Copy each database&lt;/span&gt;
copy_database &lt;span class="s2"&gt;"database1"&lt;/span&gt;
copy_database &lt;span class="s2"&gt;"database2"&lt;/span&gt;

&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"All databases have been copied!"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;How It Works&lt;/h2&gt;

&lt;p&gt;The script is pretty straightforward. Here's what's happening under the hood:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;First, it takes a snapshot of your remote database using &lt;code&gt;pg_dump&lt;/code&gt;. I use the custom format (&lt;code&gt;-F c&lt;/code&gt;) because it's compressed by default and more flexible than plain SQL dumps: it supports selective and parallel restores.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Then it wipes your local database (if it exists) and creates a fresh one. This helps avoid any weird state issues that might come from partial updates.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Finally, it restores the dump to your local database using &lt;code&gt;pg_restore&lt;/code&gt;. I've added the &lt;code&gt;--no-owner&lt;/code&gt; and &lt;code&gt;--no-privileges&lt;/code&gt; flags because you probably don't have the same users and permissions locally as you do in production.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;Things That Might Trip You Up&lt;/h2&gt;

&lt;h3&gt;Credentials&lt;/h3&gt;

&lt;p&gt;Don't store credentials in the script file. I usually set them as environment variables:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;export &lt;/span&gt;&lt;span class="nv"&gt;PG_REMOTE_PASSWORD&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"your_password"&lt;/span&gt;
&lt;span class="nb"&gt;export &lt;/span&gt;&lt;span class="nv"&gt;PG_LOCAL_PASSWORD&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"your_local_password"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then modify the script to use them:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;PGPASSWORD&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;$PG_REMOTE_PASSWORD&lt;/span&gt; pg_dump ...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;Large Databases&lt;/h3&gt;

&lt;p&gt;If you're working with a massive database, you can raise the compression level (the custom format is already compressed by default; &lt;code&gt;-Z&lt;/code&gt; tunes the level):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pg_dump &lt;span class="nt"&gt;-Z&lt;/span&gt; 9 ... &lt;span class="c"&gt;# Maximum compression&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Or use a parallel restore for faster imports (this works with custom-format dumps):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pg_restore &lt;span class="nt"&gt;-j&lt;/span&gt; 4 ... &lt;span class="c"&gt;# Use 4 parallel jobs&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;Network Issues&lt;/h3&gt;

&lt;p&gt;For unstable connections or very large dumps, I recommend using &lt;code&gt;screen&lt;/code&gt; or &lt;code&gt;tmux&lt;/code&gt;. There's nothing worse than losing a 2-hour transfer because your VPN hiccuped.&lt;/p&gt;
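
&lt;p&gt;If &lt;code&gt;tmux&lt;/code&gt; isn't an option, &lt;code&gt;nohup&lt;/code&gt; gets you most of the way (a sketch; &lt;code&gt;clone_databases.sh&lt;/code&gt; is a hypothetical name for the script above):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Keep the copy running after you log out, and capture all output in a log
nohup ./clone_databases.sh &gt; clone.log 2&gt;&amp;1 &amp;

# Follow progress from another terminal with: tail -f clone.log
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;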

&lt;h2&gt;Selective Copying&lt;/h2&gt;

&lt;p&gt;Sometimes you don't need the entire database. Here's how to grab specific parts:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Just one schema&lt;/span&gt;
pg_dump &lt;span class="nt"&gt;-n&lt;/span&gt; specific_schema ...

&lt;span class="c"&gt;# Specific tables&lt;/span&gt;
pg_dump &lt;span class="nt"&gt;-t&lt;/span&gt; schema.table_name ...

&lt;span class="c"&gt;# Just the structure, no data&lt;/span&gt;
pg_dump &lt;span class="nt"&gt;--schema-only&lt;/span&gt; ...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;Example&lt;/h2&gt;

&lt;p&gt;Here's a specific example from my work. We had a complex migration that needed to modify historical transaction data. The staging environment didn't have enough real-world edge cases, so I needed a local copy of production to test thoroughly.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Copy just the transactions schema&lt;/span&gt;
pg_dump &lt;span class="nt"&gt;-h&lt;/span&gt; prod-db.example.com &lt;span class="nt"&gt;-U&lt;/span&gt; prod_user &lt;span class="nt"&gt;-d&lt;/span&gt; prod_db &lt;span class="nt"&gt;-n&lt;/span&gt; transactions &lt;span class="nt"&gt;-F&lt;/span&gt; c &lt;span class="nt"&gt;-f&lt;/span&gt; transactions.dump

&lt;span class="c"&gt;# Create a new local database&lt;/span&gt;
createdb migration_test

&lt;span class="c"&gt;# Restore just that schema&lt;/span&gt;
pg_restore &lt;span class="nt"&gt;-d&lt;/span&gt; migration_test transactions.dump
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This let me test the migration multiple times without affecting anyone else's work.&lt;/p&gt;

&lt;h2&gt;Wrapping Up&lt;/h2&gt;

&lt;p&gt;Having a reliable way to clone databases has saved me countless hours of debugging and prevented numerous production issues. The script above has served me well, but feel free to modify it for your needs.&lt;/p&gt;

</description>
      <category>postgres</category>
      <category>bash</category>
      <category>database</category>
    </item>
  </channel>
</rss>
