This post will explore the next generation of WordPress development, which centers around adding an application-level package manager and a modern WordPress tech stack. You could make the case that these are the two things that separate professionals from highly motivated enthusiasts.
Composer for PHP is the package manager that we are using today. You probably understand the value of controlling your dependencies if you've used npm for JavaScript or gem for Ruby on Rails. Knowing exactly which dependency has changed can be invaluable when troubleshooting a problem.
Bedrock upgrades the tech stack that WordPress comes with out of the box. It improves the directory structure by moving configuration out of the web root: wp-config.php becomes a thin loader, and the real settings live in a config directory one level up (see the sketch below). It also replaces the standard password hashing with the more secure wp-password-bcrypt.
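To make that concrete, a freshly created Bedrock project looks roughly like this (simplified):
bedrock/
├── composer.json          # dependencies, including WordPress core itself
├── .env                   # per-environment values: DB credentials, WP_ENV, salts
├── config/
│   ├── application.php    # shared settings, reads values from .env
│   └── environments/      # development.php, staging.php, production.php
└── web/                   # the web root you point the web server at
    ├── app/               # replaces wp-content (themes, plugins, uploads)
    ├── wp/                # WordPress core, managed by Composer
    └── wp-config.php      # thin loader that pulls in config/application.php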
A little-documented feature of Bedrock is that it strictly enforces a separation between development and production environments. If you deploy a development-configured environment to production, your SEO will tank, because Bedrock discourages search engines from indexing any environment where WP_ENV is not set to production.
Prerequisites
- AWS Account (works with free Account)
- VSCode on the local machine
- SSH Client
- SSH Public and Private Key (with a passphrase)
- Some experience installing WordPress
- Some experience with docker-compose
- Some experience launching AWS EC2 Instances
Launch AWS EC2
- Log in to the AWS Console
- Import the SSH Public Key as the EC2 key pair
- Provision an Amazon Linux 2 AMI
- Increase the storage to 30 GiB
- Choose the key pair you imported earlier
- Review and Launch
Set up the AWS EC2 Docker environment
Run the SSH command from your local machine to log in to the remote EC2 instance; you can find the public DNS address on the EC2 console. SSH should prompt you for your key's passphrase when you connect. Once you're logged in to the EC2 instance, enter the commands that follow the example below.
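A minimal example of the SSH command, assuming ec2-user as the login for the Amazon Linux 2 AMI and a private key at ~/.ssh/id_rsa (adjust both to match your setup):
ssh -i ~/.ssh/id_rsa ec2-user@<Public DNS Address>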
sudo yum update -y
sudo amazon-linux-extras install docker -y
sudo service docker start
sudo usermod -a -G docker ec2-user
exit
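Exiting and reconnecting makes the new docker group membership take effect. After you SSH back in with the same command as before, a quick sanity check is that Docker responds without sudo:
docker info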
Set up the WordPress Docker project root
All you need to do to create a new WordPress project is create a new directory. Your projects shouldn't interfere with each other except for potential host port conflicts, so feel free to use the following commands to develop multiple WordPress sites on the same EC2 instance.
pip3 install docker-compose
mkdir wp-composer-remote-demo
cd wp-composer-remote-demo
touch README.md
touch docker-compose.yml
Install Composer for PHP
Composer prep
Run the following commands to prepare for adding Composer to the system. This shared PHP install will only be used to install and run Composer.
sudo amazon-linux-extras enable php7.4
sudo yum clean metadata
sudo yum install php-cli php-pdo php-fpm php-json php-mysqlnd -y
sudo yum -y install php-xml
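If you want to confirm that the PHP CLI landed before moving on, check the version it reports:
php -v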
Note: You can find instructions on how to run Composer as a Docker service. However, I think that clutters up the docker-compose.yml file.
Follow Install Instructions on the Composer for PHP Website
Open a browser window to the Composer PHP download page. Composer is updated regularly, and the installer hash changes with each release. Per the maintainers' request, we won't reproduce the instructions here.
Set up the Bedrock directory structure
Run the following commands to set up the working directory for WordPress development. You could make the case that this is where you start to create a more professional WordPress website. So, run the commands and brace yourself.
composer create-project roots/bedrock
cd bedrock
composer update
cd ..
Note: Composer uses composer.json to manage dependencies. So, if you want to add or update a package, you make the change there and then run composer update. If the package is available on Packagist.org, Composer will install it for you. Check out Packagist.org for more complete instructions.
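For example, Bedrock's composer.json already registers WPackagist as a repository, so pulling in a plugin such as Akismet (used here purely as an illustration) looks something like this:
cd bedrock
composer require wpackagist-plugin/akismet
cd ..
composer require updates composer.json and composer.lock for you, so the exact version ends up recorded in git.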
Configure the AWS CLI
Run "aws configure" is something I like to do in preparation for installing and configuring git. The reason is because you really want to use the credential.helper so that you won't have to enter the user name and password every time you want to do a git push. You'll have to answer the following questions:
Answer the questions:
AWS Access Key ID [****************]:
AWS Secret Access Key [****************]:
Default region name [us-east-2]:
Default output format [json]:
Install Git
It's best to set up git and .gitignore as early as possible when starting a new project. There are two directories that you will most likely want to ignore right away: ./bedrock and ./mysql. I like to start by copying the .gitignore inside the bedrock directory. Eventually, you'll want to ignore the nginx/certs directory as well.
Once again, you should check the internet or YouTube for examples and more detailed instructions. You can also skip the git step altogether, though that isn't best practice.
sudo yum install git -y
git init
git config --global credential.helper '!aws codecommit credential-helper $@'
git config --global credential.UseHttpPath true
touch .gitignore
git add .
git commit -m "initial commit"
git checkout -b main
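As a starting point, a minimal .gitignore along the lines described above might contain (adjust to taste):
bedrock/
mysql/
nginx/certs/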
Let's switch to the terminal in VSCode and set up the docker-compose.yml to run NGINX. You can look at my previous article on remote development with VSCode if you are unsure. Technically, you don't need the mysql service to run NGINX; however, it's nice to have all of the Docker containers run without complaining.
Copy the following config into the docker-compose.yml file, then update the EC2 security group to permit HTTP traffic on port 80. Finally, run "docker-compose up" to start all of the containers. Press Control-C to stop the docker-compose process.
version: '3.9'
services:
  nginx:
    image: nginx:stable-alpine
    ports:
      - 80:80
  mysql:
    image: mysql:5.7.34
    environment:
      MYSQL_DATABASE: wp
      MYSQL_USER: wp
      MYSQL_PASSWORD: secret
      MYSQL_ROOT_PASSWORD: secret
  php:
    image: php:7.4-fpm-alpine
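Once the containers are up and the security group allows port 80, you can sanity-check NGINX from your local machine; you should get the stock NGINX welcome page headers back:
curl -I http://<Public DNS Address>/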
Create Self-Signed Cert
Eventually, you'll want to encrypt information in transit, so we'll set up SSL now and you won't have to fuss with the NGINX server again later.
Use OpenSSL to create a self-signed cert and key
Run the following commands from the project root. They create both a .crt file and a .key file in the nginx/certs directory. Because the certificate is self-signed, the browser will show the site as "Not Secure" even though the traffic is encrypted; you can fix that in production with a certificate from a trusted authority.
mkdir -p nginx/certs && cd nginx/certs
openssl req -newkey rsa:4096 -x509 -sha256 -days 365 -nodes -out <Public DNS Address>.crt -keyout <Public DNS Address>.key
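If you want to double-check what you just generated, OpenSSL can print the certificate's subject and validity dates:
openssl x509 -in <Public DNS Address>.crt -noout -subject -dates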
Configure Self-signed SSL Cert and key
Next, we are going to set up the nginx.dockerfile and nginx/default.conf.
cd <project root>
touch nginx.dockerfile
touch nginx/default.conf
Add the following configuration to the nginx.dockerfile. These changes layer the certs and the NGINX config on top of the base image.
FROM nginx:stable-alpine
ADD ./nginx/certs /etc/nginx/certs/self-signed
ADD ./nginx/default.conf /etc/nginx/conf.d/default.conf
Add the following configuration to the nginx/default.conf file. It changes the web root to /var/www/html/bedrock/web and configures the self-signed SSL cert and key.
upstream php {
    # server unix:/tmp/php-cgi.socket;  # not used here; php-fpm runs in its own container
    server php:9000;
}
server {
    listen 80;
    listen [::]:80;
    server_name <Public DNS Address> <Public IPv4 Address>;
    root /var/www/html/bedrock/web;
    index index.html index.php;
    location / {
        try_files $uri $uri/ /index.php?$args;
    }
    location ~ \.php$ {
        include fastcgi.conf;
        fastcgi_intercept_errors on;
        fastcgi_pass php;
    }
}
server {
    listen 443 ssl;
    listen [::]:443 ssl;
    server_name <Public DNS Address> <Public IPv4 Address>;
    root /var/www/html/bedrock/web;
    index index.html index.php;
    location / {
        try_files $uri $uri/ /index.php?$args;
    }
    location ~ \.php$ {
        include fastcgi.conf;
        fastcgi_intercept_errors on;
        fastcgi_pass php;
    }
    ssl_certificate /etc/nginx/certs/self-signed/<Public DNS Address>.crt;
    ssl_certificate_key /etc/nginx/certs/self-signed/<Public DNS Address>.key;
}
Next, we'll update the docker-compose.yml and the security group so that it will support SSL.
version: '3.9'
services:
  nginx:
    build:
      context: .
      dockerfile: nginx.dockerfile
    ports:
      - 80:80
      - 443:443
    volumes:
      - ./bedrock:/var/www/html/bedrock
  mysql:
    image: mysql:5.7.34
    environment:
      MYSQL_DATABASE: wp
      MYSQL_USER: wp
      MYSQL_PASSWORD: secret
      MYSQL_ROOT_PASSWORD: secret
  php:
    image: php:7.4-fpm-alpine
    volumes:
      - ./bedrock:/var/www/html/bedrock
Note: You'll see errors because the PHP service can't talk to MySQL yet. However, you'll see that the server is using SSL.
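With the containers running again, one quick way to confirm the TLS side from your local machine is curl; the -k flag tells it to accept the self-signed cert. Expect an error status from WordPress at this stage, but the HTTPS connection itself should succeed:
curl -kI https://<Public DNS Address>/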
Let's get WordPress working
We are going to do a couple of things with the next set of changes. First, we'll enable MySQL support. Then we'll add support for the wp-cli. Finally, we'll set a default user so that updating the WordPress files is easy.
So, "touch php.dockerfile" and add the following configuration.
FROM php:7.4-fpm-alpine
RUN docker-php-ext-install mysqli pdo pdo_mysql && docker-php-ext-enable pdo_mysql
RUN curl -O https://raw.githubusercontent.com/wp-cli/builds/gh-pages/phar/wp-cli.phar
RUN chmod +x wp-cli.phar
RUN mv wp-cli.phar /usr/local/bin/wp
# add docker user
ARG USER=docker
ARG UID=1000
ARG GID=1000
# Run as this user by default when entering the container
USER ${UID}:${GID}
WORKDIR /var/www/html/bedrock
Update the docker-compose.yml file to build the PHP service.
  php:
    build:
      context: .
      dockerfile: php.dockerfile
    volumes:
      - ./bedrock:/var/www/html/bedrock
  wp:
    build:
      context: .
      dockerfile: php.dockerfile
    volumes:
      - ./bedrock:/var/www/html/bedrock
    entrypoint: ['wp', '--allow-root']
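To check that the wp service is wired up correctly, you can ask WP-CLI for its version; this doesn't require a working WordPress install yet:
docker-compose run --rm wp cli version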
Update the bedrock/.env file with environment variables that match the docker-compose.yml file.
DB_NAME='wp'
DB_USER='wp'
DB_PASSWORD='secret'
# Optionally, you can use a data source name (DSN)
# When using a DSN, you can remove the DB_NAME, DB_USER, DB_PASSWORD, and DB_HOST variables
# DATABASE_URL='mysql://database_user:database_password@database_host:database_port/database_name'
# Optional database variables
DB_HOST='mysql'
# DB_PREFIX='wp_'
WP_ENV='development'
WP_HOME='https://<Public DNS Address>'
WP_SITEURL="${WP_HOME}/wp"
# Specify optional debug.log path
# WP_DEBUG_LOG='/path/to/debug.log'
# Generate your keys here: https://roots.io/salts.html
AUTH_KEY='generateme'
SECURE_AUTH_KEY='generateme'
LOGGED_IN_KEY='generateme'
NONCE_KEY='generateme'
AUTH_SALT='generateme'
SECURE_AUTH_SALT='generateme'
LOGGED_IN_SALT='generateme'
NONCE_SALT='generateme'
Run "docker-compose up --build" and answer the install questions as you normally would for WordPress.
Let's make the MySQL database persistent
The containers are destroyed every time the docker-compose process is stopped. That means any information contained within a container that isn't written to disk outside it is destroyed as well. So, we need to fix this so that data persists across each cycle of "docker-compose up" and Control-C.
Create mysql.dockerfile
Do a "control-c" to bring down the containers. Run the following commands.
mkdir mysql
touch mysql.dockerfile
Copy the following configuration into the mysql.dockerfile
FROM mysql:5.7.34
# add docker user
ARG USER=docker
ARG UID=1000
ARG GID=1000
ENV MYSQL_DATABASE=wp
ENV MYSQL_USER=wp
ENV MYSQL_PASSWORD=secret
ENV MYSQL_ROOT_PASSWORD=secret
# add docker user
#USER ${UID}:${GID}
#WORKDIR /var/lib/mysql
Update bedrock/config/environments/development.php by adding the following section. The development.php and staging.php files, and eventually production.php, work like wp-config.php; the only difference is that each define() statement is written as Config::define().
/** Use the 'direct' filesystem method so WordPress can update plugins and themes without FTP credentials */
Config::define('FS_METHOD','direct');
Update the docker-compose.yml to reflect the mysql.dockerfile build process.
  mysql:
    build:
      context: .
      dockerfile: mysql.dockerfile
    volumes:
      - ./mysql:/var/lib/mysql
Bring docker-compose down with Control-C and look at the mysql directory; it will show as owned by 999:ec2-user. Let's fix that: uncomment the "add docker user" section in mysql.dockerfile, then run the following commands.
sudo chown -R ec2-user:ec2-user mysql
docker-compose up --build
How to use the wp-cli functionality
First, I like to create a bedrock/src folder where you can place commercial third-party themes and plugins. Next, you can start running wp-cli commands, including installing and updating those themes and plugins.
One of the reasons to use the wp-cli is the file size of third-party commercial plugins and themes: uploading them through the admin screens means fussing with php.ini limits such as upload_max_filesize. Try the following commands just to get started:
docker-compose up -d --build
docker-compose run --rm wp plugin list
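For example, assuming you've copied a commercial theme zip into bedrock/src (some-theme.zip here is only a placeholder name), installing and activating it, and then updating all plugins, looks something like this:
docker-compose run --rm wp theme install src/some-theme.zip --activate
docker-compose run --rm wp plugin update --all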
Congratulations, you're ready to start developing WordPress using VSCode on remote EC2 instances. It's been a long journey, and I want to thank you for finishing it with me. The good news is that you now have a git repository that you can use to create new baselines.