Cover image for Goodbye Nginx, hello Caddy

Goodbye Nginx, hello Caddy

Hanna ・ 3 min read

I recently switched my personal website from Nginx to Caddy. I'd been planning this for quite a while, as I'd heard a lot about Caddy and its perks and wanted to give it a try. In this post I'm going to go over my journey and setup, the ups and downs I've had so far, and my verdict on it.

Obtaining the web server

The first step in any web server journey is installing and setting up the web server software itself. As my VPS runs Ubuntu, installing things like Nginx and Caddy is usually as simple as running

sudo apt install -y <package>

However, that wasn't the case this time: because my server sits behind Cloudflare, I had to build Caddy with a plugin that adds Cloudflare DNS support.

To do this I first had to install a recent version of Go, as Caddy is written in Go. The latest version in the Ubuntu repositories at the time was Go 1.13, but Caddy needed Go 1.14, so I ran the following command to get Go 1.14.2 and then added Go to my PATH.

wget -c https://dl.google.com/go/go1.14.2.linux-amd64.tar.gz -O - | sudo tar -xz -C /usr/local
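The post mentions adding Go to the PATH but doesn't show the command. Assuming the tarball was unpacked into /usr/local as above, the usual line (typically appended to ~/.profile so it persists) is:

```shell
# make the Go toolchain from /usr/local/go visible to the shell
export PATH=$PATH:/usr/local/go/bin
```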

After Go was installed, I installed Caddy's xcaddy tool so I could build the web server from source with the plugin. The tool was fairly easy to use and let me build the executable with a single command:

xcaddy build --with github.com/caddy-dns/cloudflare

After it finished compiling I just slapped the single executable into /usr/bin and now had Caddy installed.


Configuring the web server

After installing Caddy came configuration. This part was a little tricky at first, but after a while I understood what I needed to do. The first thing I had to do was create a scoped Cloudflare API token with two permissions; those can be read about here.

Writing the actual configuration for the web server was extremely simple compared to Nginx configuration files; below is my current configuration for my website:

<domain here> {
  root * /var/www/html
  # file_server is needed in Caddy v2 to actually serve the static files
  file_server

  tls <email here> {
    dns cloudflare {env.CLOUDFLARE_API_TOKEN}
  }
}
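The tls block reads the token from the environment, so it has to be set in whatever environment Caddy starts from; a minimal sketch (the token value is a placeholder, not a real token):

```shell
# {env.CLOUDFLARE_API_TOKEN} in the Caddyfile resolves to this variable;
# for a systemd-managed Caddy, use Environment= or EnvironmentFile= instead
export CLOUDFLARE_API_TOKEN="your-scoped-token-here"
```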

After that you can either point Caddy at the Caddyfile directly or convert it to JSON with a command and use that. I went ahead and did the latter with the command below:

caddy adapt --config Caddyfile > /etc/caddy/config.json

Then I was finally able to start the web server with a simple command:

sudo caddy start --config /etc/caddy/config.json

Systemd Configuration

As I wanted to run Caddy in the background and let systemd manage it, I created a simple service for it:

[Unit]
Description=Caddy Web Server
After=network.target

[Service]
ExecStart=/usr/bin/caddy run --config /etc/caddy/config.json
ExecReload=/usr/bin/caddy reload --config /etc/caddy/config.json
ExecStop=/usr/bin/caddy stop

[Install]
WantedBy=multi-user.target


Final Words

Installing Caddy was certainly a journey, but Caddy does have its upsides: automatic certificate renewal, automatic HTTPS in general, being built on newer technologies, and simpler configuration and plugin installation. I think it was worth it in the end. You can read more about Caddy here. Thanks for reading!


I'm a software engineer, system administrator, community moderator, and designer. I mainly make things that are useful to me personally, and most commonly make random things for fun, because why not, honestly.



"Built on newer technologies" in this case means Go vs C, which is rather a disadvantage.


I disagree, but you do you.


It's not a question of agreement or disagreement. Assuming both apps are properly designed (I'm pretty sure they are), the next thing that affects performance most is the quality of compiler optimization. Go is weak in this regard compared to modern C/C++ compilers (or any other compilers that use backends like LLVM, for example Rust). In addition, Go uses a garbage collector that is also quite slow compared to modern GCs available in, for example, JVM-based languages.

I completely agree. Caddy looks promising, but being built with Go is advantage only in the sense of "I know Go so I can hack on Caddy". Otherwise it's actually a disadvantage. It's certainly slower.

Rather than reducing the discussion to "is C faster than Go", it's worth looking at benchmarks of any server you're considering. I thought the Caddy benchmarks looked quite good, if maybe not as fast as Nginx or Lighttpd. So the question is: is that extra level of performance so important to you that it outweighs the ease-of-use benefits, built-in Let's Encrypt support, etc.? If yes, then of course stick with the one that suits your needs.

However, all of those servers are so ridiculously fast that it's very unlikely that they would be the main bottleneck your site or web service faces.

The discussion started from the reply about "Caddy ... built on newer technologies". I just pointed out that "newer technologies" does not necessarily mean "better".

Oh, I certainly meant benchmarks and experience of people that tried it.

That's a fair point. Actually I'm seeing a lot of cases recently where we seem to have forgotten about technical advances made in the past, and end up using systems today that are less robust in some ways.

Just for reference caddy vs nginx benchmarks:

I see, that's interesting. Though I do like how easy caddy was to setup, so I guess that's an upside in some cases, at least for smaller sites that don't need the best performance on the net.

I really don't understand why some people consider application performance irrelevant. Better performance under the same conditions means less energy is wasted; in other words, a better-performing app is more environmentally friendly. Those who don't care about the environment, or believe the impact is too small to worry about, might remind themselves that a better-performing app also reduces the cloud hosting bill.

If everything is just about performance, write your whole stack from scratch in assembly and see how that goes ;) The author's point is that some sites will come up plenty fast on Caddy. Use what's best for your scenario. I don't use it myself, but it seems like a good option.

For the last 15-20 years it has been very hard (if at all possible) for a human to beat compiler optimization on real-life tasks. So your advice makes little sense.

Sure, if you are just using it for your personal site, the performance difference shown in the benchmarks doesn't matter at all. If you are running a relatively popular web shop, it matters a lot.

"Amazon’s calculated that a page load slowdown of just one second could cost it $1.6 billion in sales each year. Google has calculated that by slowing its search results by just four tenths of a second they could lose 8 million searches per day–meaning they’d serve up many millions fewer online adverts."

Uh, it was supposed to be a joke about worrying about every single performance optimization for a web site that may have very little need for it. People still use WordPress and other "fat" CMSes etc. without a problem...

Well, we're draining the planet's resources without a problem as well. And affecting the climate without a problem too. Does that mean we shouldn't worry about these issues at all, as long as they don't cause any problems for us?

You're totally right that we should consider the efficiency (and therefore energy + cost) aspect. But it can't be the only thing we consider. If two options are equal in all respects except efficiency, there can be no question that we would choose the more efficient option.
However, if for example configuration is simpler, and managing SSL certs is very important, then it might make sense for someone to choose a less efficient option that has those other advantages. I'm sure you've run into configuration problems that wasted hours of your time -- that could be enough to cancel out the cost savings of the more efficient option.

Completely agree. I was just responding to an attempt to downplay the importance of efficiency.

Funnily enough Caddy handles medium load more efficiently than NGINX, with much lower times to first byte on average, and with much less CPU consumption. Matt (the project author) made the same findings and dropped them in a thread on the Caddy forums.

Performance is not the only factor to take into account when building good software, and whilst C and the like will outperform many languages at computing the Fibonacci sequence, they can't compete with the Go ecosystem when it comes to networking tools. Hell, Netflix uses Go for a huge part of its networking stack, and they boast one of the largest, most complex setups to date.

I'm all for questioning new or "hype" projects, but it's unfair to describe Caddy as such. NGINX is very dated, and whilst it performs fantastically in specific use cases, it's good to have a new alternative that's more forgiving in the many new challenges we face today.

Well, taking into account that the vast majority of networking code across all operating systems is written in C or C++, the Go ecosystem is more or less usable at best. And I know what I'm talking about, because at one of my previous jobs I wrote networking tools in Go.


Caddy is a great choice over Nginx in so many cases. I really enjoy it as a file and proxy server.


"in so many cases"


As mentioned, file server and proxy, due to the minimal configuration. Bootstrapping with Caddy is easier, since it's just fewer steps to get going. Automation is easier and more reliable.

Can you please clarify the "more reliable" statement? Do you have some stats that might prove that? Any benchmarks that highlight the benefits of Caddy vs Nginx?

Sure. For example, with Caddy the ACME challenge for Let's Encrypt is built in. No need to write shell or provisioner (Ansible, Chef, etc.) scripts to handle that part. Fewer parts to fail, thus more reliable when automating.


I'm using Caddy as the development server for PHP projects. No need to install anything or create virtual hosts for multiple projects. Just unpack the binary and copy the config file, and you're good to go. For Apache/Nginx, I have to create a separate vhost for every new project. With Caddy I just copy a five- or six-line config file to a new project and start working immediately. (I run Caddy from the terminal, of course.)
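The commenter's config isn't shown, but a minimal Caddyfile along those lines might look like this (the port, document root, and PHP-FPM address are assumptions, not the commenter's actual values):

```
localhost:8080 {
  root * public
  php_fastcgi 127.0.0.1:9000
  file_server
}
```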

However, for production hosting I will go with nginx any day.


Yeah I've really been loving it so far! It's quite refreshing to try it after using Nginx for such a long time.


That's very interesting,

I felt the same when I switched to nginx after years of apache httpd usage.


This is all great, except for one thing: I can't find a single reason to go through the hassle of setting up your own web server to serve a very simple personal website in the era of serverless, Jamstack, free static website hosting and such :P


Since Go is available on ARM, Caddy makes a great webserver for Raspberry Pi projects too. Caddy can run its own Certificate Authority for local HTTPS just as smoothly as automatic HTTPS via Let's Encrypt.
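As a sketch of the local-HTTPS setup described here, a site can opt into Caddy's internal CA with the tls directive (the hostname and document root are placeholders):

```
pi.local {
  root * /var/www/html
  file_server
  tls internal
}
```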


Hanna, considering your personal page is a static website and a simple splash page, you would do really well by serving it directly from Cloudflare's edge:

Cloudflare extending the workers platform

Cloudflare Static Page Edge Computing

Caddy over Nginx won't make much of a difference for your site.

Later you can take advantage of Cloudflare's edge compute:

Cloudflare - What is Edge Computing


I had a VPS with nothing on it at the time, and this gave me something to put on it, so :p But edge is pretty cool.


I've had a great experience using Caddy.

I recently decided to replace my older NGINX stack for my personal site since it was getting a bit long and unwieldy, and was able to discard a good dozen or so lines of purely SSL config.

Getting the systemd service set up was a little more difficult, but that was more on the fact I was also setting it up in a NixOS environment rather than it being Caddy's fault. I'd definitely like to use Caddy again in my other projects.


Nice post.
Can you update this article in a few days and tell us how Caddy is behaving? If you do, please remember to mention its characteristics and what kind of usage you've given it.



If it helps, I can tell you that I use Caddy to host landing pages (for paid traffic from Facebook and Google Ads). It has survived 30-50 requests per SECOND for two years without a single reboot. And all of that on an anemic VPS with a single core and 1GB of RAM. I still use the first version of Caddy. Don't touch it if it works, right?


Will try my best to remember


I love nginx and openresty and will probably still be using them in many years, but I do agree that setting up an nginx server can be a huge pain, involving googling lots of config options and writing lots of wrapper scripts to make things easier.

However, I did build myself a script to take care of the setup for me and since then I haven't had to deal with those problems once, except when trying to do some really weird setup.


Rock stable, bulletproof, faster, and heavily production-tested Nginx, or Caddy (the new kid on the block)? I'll go with Nginx every time. The DevOps perspective is missing here... it's not only about easier config (which is only one aspect).


I would definitely use nginx for production stuff, but I wanted to give Caddy a try as a friend told me about it.


I'd never heard of it. I think I'm out of the loop these days :)
Have you been running this for a while or will you follow this up and let us know how it is after a month or so of actual use?


Been using it for around a week with no problems!


I've been using it for years in production systems with more than 40 applications/services (frontend, backend and db), together with Docker, in different companies.

I would call it bulletproof ;)


I was really impressed by Caddy when it first hit the scene, and when I heard about its support for Let's Encrypt. I did hear that there might be some sort of licensing issue with it though -- is that a real issue or not?


I believe this discussion thread brings up some of the licensing stuff, but as far as I know they fixed most of it.


cloudflare is absolute garbage and a privacy nightmare.


Cloudflare is a business and has their own profit in mind above all else. They're kind of in the position Google had a few years ago, where some were saying "These guys be evil!" (me among them) but most still saw them as a kind of benevolent internet deity that kept everything together, aiming only for the common good.

Setting aside the messy details, Cloudflare does make the internet faster, at least parts of it. The privacy part is not exclusive to them either; it's just a problem you'll get with any CDN (which is why I prefer to keep files on my own servers).


Could you elaborate on the situation with Cloudflare?


Agree to disagree. It helps me with a few things, though it was a pain in the ass getting Caddy working with it.


Uh oh ... lots of people digging their heels in for Nginx. Should be a fascinating thread. Thanks, Hanna, for putting this out for view!


No problem! Just trying to post some good content lol.


sad Developer noises

Oh, Caddy?



Just updated my own site with this and it is definitely a little bit faster. I mostly just love how little configuration it needs!


It's definitely quite nice!


Hey Hanna,

How do you set up a subdomain in Caddy, like api.domain.com?


Same way as domains!

api.example.com {
  # subdomain site config goes here
}

Nice! I am planning on using Caddy to proxy to a few different apps running on my VPS as well. The Caddyfile seems much more intuitive to me.
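For proxying several apps, one site block per subdomain is enough; a sketch (the domains and upstream ports here are made up):

```
api.example.com {
  reverse_proxy 127.0.0.1:8080
}

app.example.com {
  reverse_proxy 127.0.0.1:3000
}
```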
