Host Your Own Q&A Community Using Apache Answer (with Backups to S3)

Hi there! I'm Maneshwar. Right now, I’m building LiveAPI, a first-of-its-kind tool that helps you automatically index API endpoints across all your repositories. LiveAPI makes it easier to discover, understand, and interact with APIs in large infrastructures.


If you've ever wanted to run your own StackOverflow-style Q&A platform, Apache Answer is your plug-and-play solution.

In this guide, we'll go from scratch to production: a custom Docker image, auto-install via environment variables, a MariaDB database, and daily backups to S3 with Discord alerts.

1. Build a Custom Docker Image

You can build a custom Docker image that bundles the Discord notification plugin and sets the container's timezone.

# Dockerfile

# Stage 1: grab the stock answer binary from the official image
FROM apache/answer AS answer-builder

# Stage 2: rebuild answer with the Discord notification plugin
FROM golang:1.22-alpine AS golang-builder

COPY --from=answer-builder /usr/bin/answer /usr/bin/answer

RUN apk --no-cache add \
    build-base git bash nodejs npm go && \
    npm install -g pnpm@9.7.0

RUN answer build \
    --with github.com/HexmosTech/notification-discord@main \
    --output /usr/bin/new_answer

# Stage 3: minimal runtime image with the rebuilt binary
FROM alpine

ARG TIMEZONE
ENV TIMEZONE=${TIMEZONE:-"Asia/Kolkata"}

RUN apk update && apk --no-cache add \
    bash ca-certificates curl dumb-init gettext \
    openssh sqlite gnupg tzdata && \
    ln -sf /usr/share/zoneinfo/${TIMEZONE} /etc/localtime && \
    echo "${TIMEZONE}" > /etc/timezone

COPY --from=golang-builder /usr/bin/new_answer /usr/bin/answer
COPY --from=answer-builder /data /data
COPY --from=answer-builder /entrypoint.sh /entrypoint.sh

RUN chmod 755 /entrypoint.sh

VOLUME /data
EXPOSE 80
ENTRYPOINT ["/entrypoint.sh"]
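
With the Dockerfile saved, build and tag the image. The answer:latest tag matches the docker run command in step 4, and the TIMEZONE build arg is optional since it defaults to Asia/Kolkata:

docker build \
  --build-arg TIMEZONE="Asia/Kolkata" \
  -t answer:latest .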

2. Configure .env

AUTO_INSTALL=true

DB_TYPE=mysql
DB_USERNAME=answer
DB_PASSWORD=answer
# replace host_ip with the address of your MariaDB server
DB_HOST=host_ip:3306
DB_NAME=answer

LANGUAGE=en_US
SITE_NAME=Answer
SITE_URL=https://answer.domain.com
CONTACT_EMAIL=answer@gmail.com

ADMIN_NAME=answer
ADMIN_PASSWORD=answer
ADMIN_EMAIL=answer@gmail.com

EXTERNAL_CONTENT_DISPLAY=always_display
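
Since AUTO_INSTALL=true, Answer runs its installer on first boot using these values, so replace the sample credentials before exposing anything publicly. If you want to sanity-check how Docker parses the file, a throwaway container works (nothing Answer-specific here):

docker run --rm --env-file .env alpine env | grep -E 'DB_|SITE_|ADMIN_'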

3. Setup MariaDB

Create the DB and user manually:

-- WARNING: this drops any existing `answer` database
DROP DATABASE IF EXISTS `answer`;
CREATE DATABASE `answer`;

CREATE USER 'answer'@'%' IDENTIFIED BY 'answer';
GRANT ALL PRIVILEGES ON `answer`.* TO 'answer'@'%';
FLUSH PRIVILEGES;
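
One way to apply these statements, assuming you saved them as setup.sql and can reach the server as root (host_ip being the same placeholder as in .env):

mysql -u root -p -h host_ip < setup.sql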

4. Run the Server

docker run -d \
  --env-file .env \
  -v answer-data:/data \
  -p 9080:80 \
  --name answer \
  answer:latest

Your community is now live at http://your-ip:9080, or at the SITE_URL domain once a reverse proxy points there.
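
Before pointing DNS or a reverse proxy at it, you can confirm the container is healthy from the host itself:

docker logs -f answer                       # watch the auto-install finish
curl -sI http://localhost:9080 | head -n 1  # expect an HTTP status line back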

5. Backup to S3 (Daily Cron)

Create a script backup.sh:

#!/bin/bash

# === ENVIRONMENT VARIABLES ===
export AWS_ACCESS_KEY_ID="YOUR_AWS_KEY"
export AWS_SECRET_ACCESS_KEY="YOUR_AWS_SECRET"
export AWS_DEFAULT_REGION="us-west-2"

S3_BUCKET="answer-backup"
DISCORD_WEBHOOK="https://discord.com/api/webhooks/..."

DB_USER="answeanswer"
DB_PASS="answer"
DB_HOST="127.0.0.1"
DB_PORT="3306"
DB_NAME="answer"

# === BACKUP SETUP ===
TIMESTAMP=$(date +"%Y-%m-%d-%H.%M.%S")
ARCHIVE_NAME="answer_backup_$TIMESTAMP.tar.gz"
TMP_DIR="./tmp_backup"

# === CLEANUP OLD FILES ===
echo "🧹 Cleaning old .gz and .sql files..."
rm -f ./*.gz ./*.sql
rm -rf "$TMP_DIR"
mkdir -p "$TMP_DIR"

# === SQL BACKUP ===
echo "📦 Dumping MariaDB..."
mysqldump -u "$DB_USER" -p"$DB_PASS" -h "$DB_HOST" -P "$DB_PORT" "$DB_NAME" > "$TMP_DIR/db_${DB_NAME}.sql" || {
  echo "❌ MariaDB dump failed!"
  curl -s -X POST "$DISCORD_WEBHOOK" -H 'Content-Type: application/json' \
    --data-raw '{"content": "❌ MariaDB dump failed during Apache Answer backup."}'
  exit 1
}

# === Docker Volume Backup ===
echo "📁 Backing up Docker volume 'answer-data'..."
docker run --rm \
  -v answer-data:/volume \
  -v "$PWD/$TMP_DIR":/backup \
  alpine \
  tar czf /backup/answer-data.tar.gz -C /volume .

# === Combine ===
echo "🗜️ Compressing full backup..."
tar -czf "$ARCHIVE_NAME" -C "$TMP_DIR" .

# === Upload ===
echo "☁️ Uploading to S3..."
aws s3 cp "$ARCHIVE_NAME" "s3://$S3_BUCKET/" || {
  curl -s -X POST "$DISCORD_WEBHOOK" -H 'Content-Type: application/json' \
    --data-raw '{"content": "❌ Failed uploading backup to S3."}'
  exit 1
}

# === File Size Check ===
filesize=$(stat -c%s "$ARCHIVE_NAME")
minsize=524288   # 512 KB; anything smaller almost certainly means a failed dump
filesize_mb=$(bc <<< "scale=2; $filesize / 1024 / 1024")

if [ "$filesize" -lt "$minsize" ]; then
  curl -s -X POST "$DISCORD_WEBHOOK" -H 'Content-Type: application/json' \
    --data-raw "{\"content\": \"❌ Backup too small (probably failed): ${filesize_mb} MB\"}"
  exit 1
else
  curl -s -X POST "$DISCORD_WEBHOOK" -H 'Content-Type: application/json' \
    --data-raw "{\"content\": \"✅ Apache Answer backup uploaded (${filesize_mb} MB)\"}"
fi

rm -rf "$TMP_DIR"
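
Make the script executable and run it once by hand before handing it to cron. It assumes mysqldump, the AWS CLI, curl, and bc are installed on the host; on Ubuntu (which the /home/ubuntu paths below suggest) that's roughly:

sudo apt install -y mariadb-client awscli bc curl

chmod +x /home/ubuntu/backup-answer/backup.sh
/home/ubuntu/backup-answer/backup.sh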

6. Schedule it via Cron

Edit your crontab:

crontab -e

Add this line to run the backup every day at 18:00 server time:

0 18 * * * /home/ubuntu/backup-answer/backup.sh
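
Cron discards script output by default, so redirecting it to a log file makes the next failed-backup alert much easier to debug (the log path here is just a suggestion):

0 18 * * * /home/ubuntu/backup-answer/backup.sh >> /home/ubuntu/backup-answer/backup.log 2>&1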

Done

You now have a fully running, auto-installed Apache Answer instance, with daily backups to S3 and Discord alerts on both success and failure.

Let your users ask away. 🔥


LiveAPI helps you get all your backend APIs documented in a few minutes.

With LiveAPI, you can generate interactive API docs that allow users to search and execute endpoints directly from the browser.


If you're tired of updating Swagger manually or syncing Postman collections, give it a shot.
