Athreya aka Maneshwar

Host Your Own Q&A Community Using Apache Answer (with Backups to S3)

Hello, I'm Maneshwar. I'm building git-lrc, an AI code reviewer that runs on every commit. It's free, unlimited, and source-available on GitHub. Star us to help other devs discover the project, and do give it a try and share your feedback.

I'm also building LiveAPI, a first-of-its-kind tool that automatically indexes API endpoints across all your repositories, making it easier to discover, understand, and interact with APIs in large infrastructures.

If you've ever wanted to run your own StackOverflow-style Q&A platform, Apache Answer is your plug-and-play solution.

In this guide, we’ll go from scratch to production — including auto-setup, MySQL config, Docker, and scheduled backups to S3.

1. Build a Custom Docker Image

You can build a custom Docker image that includes Discord notifications and timezone config.

# Dockerfile

FROM apache/answer AS answer-builder

FROM golang:1.22-alpine AS golang-builder

COPY --from=answer-builder /usr/bin/answer /usr/bin/answer

# Go already ships with the golang base image; pnpm is needed for the web UI build
RUN apk --no-cache add \
    build-base git bash nodejs npm && \
    npm install -g pnpm@9.7.0

RUN answer build \
    --with github.com/HexmosTech/notification-discord@main \
    --output /usr/bin/new_answer

FROM alpine

ARG TIMEZONE
ENV TIMEZONE=${TIMEZONE:-"Asia/Kolkata"}

RUN apk update && apk --no-cache add \
    bash ca-certificates curl dumb-init gettext \
    openssh sqlite gnupg tzdata && \
    ln -sf /usr/share/zoneinfo/${TIMEZONE} /etc/localtime && \
    echo "${TIMEZONE}" > /etc/timezone

COPY --from=golang-builder /usr/bin/new_answer /usr/bin/answer
COPY --from=answer-builder /data /data
COPY --from=answer-builder /entrypoint.sh /entrypoint.sh

RUN chmod 755 /entrypoint.sh

VOLUME /data
EXPOSE 80
ENTRYPOINT ["/entrypoint.sh"]
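With the Dockerfile saved, build the image that step 4 will run. The tag `answer:latest` matches the run command later in this guide; the build-arg is optional and falls back to Asia/Kolkata:

```shell
# Build the custom image; TIMEZONE is optional (defaults to Asia/Kolkata)
docker build \
  --build-arg TIMEZONE="Asia/Kolkata" \
  -t answer:latest \
  .
```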

2. Configure .env

AUTO_INSTALL=true

DB_TYPE=mysql
DB_USERNAME=answer
DB_PASSWORD=answer
DB_HOST=host_ip:3306
DB_NAME=answer

LANGUAGE=en_US
SITE_NAME=Answer
SITE_URL=https://answer.domain.com
CONTACT_EMAIL=answer@gmail.com

ADMIN_NAME=answer
ADMIN_PASSWORD=answer
ADMIN_EMAIL=answer@gmail.com

EXTERNAL_CONTENT_DISPLAY=always_display
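Before starting the container, a quick preflight check can catch missing values. This is a convenience sketch, not part of Answer itself; the variable names mirror the .env above:

```shell
#!/bin/sh
# Preflight: verify the variables Answer's auto-install reads are non-empty.
check_env() {
  missing=0
  for key in DB_TYPE DB_USERNAME DB_PASSWORD DB_HOST DB_NAME; do
    eval "val=\${$key:-}"
    if [ -z "$val" ]; then
      echo "missing: $key"
      missing=1
    fi
  done
  return $missing
}

# Example: export the values (normally supplied via --env-file) and check
export DB_TYPE=mysql DB_USERNAME=answer DB_PASSWORD=answer \
       DB_HOST=host_ip:3306 DB_NAME=answer
check_env && echo "env looks complete"   # prints "env looks complete"
```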

3. Setup MariaDB

Create the DB and user manually:

-- WARNING: this destroys any existing `answer` database; skip the DROP on a live install
DROP DATABASE IF EXISTS `answer`;
CREATE DATABASE `answer`;

CREATE USER 'answer'@'%' IDENTIFIED BY 'answer';
GRANT ALL PRIVILEGES ON `answer`.* TO 'answer'@'%';
FLUSH PRIVILEGES;
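One way to apply those statements is to pipe them into the mysql client. The container name `mariadb`, the file name `setup_answer.sql`, and the root-password variable below are placeholders for your own setup:

```shell
# Save the statements above as setup_answer.sql, then feed them to MariaDB
# running in a container named "mariadb" (both names are placeholders)
docker exec -i mariadb mysql -u root -p"$MARIADB_ROOT_PASSWORD" < setup_answer.sql
```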

4. Run the Server

docker run -d \
  --env-file .env \
  -v answer-data:/data \
  -p 9080:80 \
  --name answer \
  answer:latest

Your community will now be live on http://your-ip:9080 or the configured domain.
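Auto-install takes a moment on first boot, so the site won't answer immediately. A small polling helper (a sketch; the host and port are the ones used above) lets scripts wait until it does:

```shell
#!/bin/sh
# wait_for_url URL [MAX_TRIES]: poll until the URL responds or tries run out.
wait_for_url() {
  url=$1
  tries=${2:-30}
  i=0
  while [ "$i" -lt "$tries" ]; do
    if curl -fsS -o /dev/null "$url"; then
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  return 1
}

# Example: block until the container from step 4 is serving
# wait_for_url http://localhost:9080 30 && echo "Answer is live"
```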

5. Backup to S3 (Daily Cron)

Create a script backup.sh:

#!/bin/bash

# === ENVIRONMENT VARIABLES ===
# (On EC2, prefer an IAM instance profile over hardcoding keys in the script.)
export AWS_ACCESS_KEY_ID="YOUR_AWS_KEY"
export AWS_SECRET_ACCESS_KEY="YOUR_AWS_SECRET"
export AWS_DEFAULT_REGION="us-west-2"

S3_BUCKET="answer-backup"
DISCORD_WEBHOOK="https://discord.com/api/webhooks/..."

DB_USER="answer"
DB_PASS="answer"
DB_HOST="127.0.0.1"
DB_PORT="3306"
DB_NAME="answer"

# === BACKUP SETUP ===
TIMESTAMP=$(date +"%Y-%m-%d-%H.%M.%S")
ARCHIVE_NAME="answer_backup_$TIMESTAMP.tar.gz"
TMP_DIR="./tmp_backup"

# === CLEANUP OLD FILES ===
echo "🧹 Cleaning old .gz and .sql files..."
rm -f ./*.gz ./*.sql
rm -rf "$TMP_DIR"
mkdir -p "$TMP_DIR"

# === SQL BACKUP ===
echo "📦 Dumping MariaDB..."
mysqldump -u "$DB_USER" -p"$DB_PASS" -h "$DB_HOST" -P "$DB_PORT" "$DB_NAME" > "$TMP_DIR/db_${DB_NAME}.sql" || {
  echo "❌ MariaDB dump failed!"
  curl -s -X POST "$DISCORD_WEBHOOK" -H 'Content-Type: application/json' \
    --data-raw '{"content": "❌ MariaDB dump failed during Apache Answer backup."}'
  exit 1
}

# === Docker Volume Backup ===
echo "📁 Backing up Docker volume 'answer-data'..."
docker run --rm \
  -v answer-data:/volume \
  -v "$PWD/$TMP_DIR":/backup \
  alpine \
  tar czf /backup/answer-data.tar.gz -C /volume .

# === Combine ===
echo "🗜️ Compressing full backup..."
tar -czf "$ARCHIVE_NAME" -C "$TMP_DIR" .

# === Upload ===
echo "☁️ Uploading to S3..."
aws s3 cp "$ARCHIVE_NAME" "s3://$S3_BUCKET/" || {
  curl -s -X POST "$DISCORD_WEBHOOK" -H 'Content-Type: application/json' \
    --data-raw '{"content": "❌ Failed uploading backup to S3."}'
  exit 1
}

# === File Size Check ===
filesize=$(stat -c%s "$ARCHIVE_NAME")
minsize=524288  # 512 KiB; anything smaller almost certainly means a failed backup
filesize_mb=$(bc <<< "scale=2; $filesize / 1024 / 1024")

if [ "$filesize" -lt "$minsize" ]; then
  curl -s -X POST "$DISCORD_WEBHOOK" -H 'Content-Type: application/json' \
    --data-raw "{\"content\": \"❌ Backup too small (probably failed): ${filesize_mb} MB\"}"
  exit 1
else
  curl -s -X POST "$DISCORD_WEBHOOK" -H 'Content-Type: application/json' \
    --data-raw "{\"content\": \"✅ Apache Answer backup uploaded (${filesize_mb} MB)\"}"
fi

rm -rf "$TMP_DIR"
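Backups are only useful if they restore. Here's a matching restore sketch, assuming the same names used in backup.sh; the archive key is a placeholder, so pick a real one from the bucket (test a restore before you need one):

```shell
#!/bin/bash
# Restore sketch for archives produced by backup.sh (names mirror the script above)
ARCHIVE="answer_backup_YYYY-MM-DD-HH.MM.SS.tar.gz"  # placeholder: use a real key

aws s3 cp "s3://answer-backup/$ARCHIVE" .
mkdir -p restore
tar -xzf "$ARCHIVE" -C restore

# Reload the SQL dump into the same database
mysql -u answer -p"answer" -h 127.0.0.1 -P 3306 answer < restore/db_answer.sql

# Repopulate the Docker volume from the volume tarball
docker run --rm -v answer-data:/volume -v "$PWD/restore":/backup alpine \
  sh -c "rm -rf /volume/* && tar xzf /backup/answer-data.tar.gz -C /volume"
```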

6. Schedule it via Cron

Edit your crontab:

crontab -e

Add:

0 18 * * * /home/ubuntu/backup-answer/backup.sh
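Cron discards output by default, so a failed run leaves no trace. Appending stdout and stderr to a log file (the path here is just an example) makes runs easy to diagnose:

```shell
0 18 * * * /home/ubuntu/backup-answer/backup.sh >> /var/log/answer-backup.log 2>&1
```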

Done

You now have a fully running, auto-installed Apache Answer instance + daily backups to S3 + Discord alerts.

Let your users ask away. 🔥



git-lrc

*AI agents write code fast. They also silently remove logic, change behavior, and introduce bugs, without telling you. You often find out in production.

git-lrc fixes this. It hooks into git commit and reviews every diff before it lands. 60-second setup. Completely free.*

Any feedback or contributors are welcome! It's online, source-available, and ready for anyone to use.

⭐ Star it on GitHub: HexmosTech/git-lrc

Top comments (2)

Dotallio

This kind of automation for both community Q&A and backups is exactly what makes self-hosting manageable. Have you tried tying LiveAPI and Answer together for integrated support or docs?

Nathan Tarbert

This is extremely impressive. Seeing all the steps broken down like that actually makes me want to try it myself. Do you think skipping managed platforms is worth it for that much control?