Craft CMS Deployment Methods
When I first started working in the web development industry the go-to method of moving code from one place to another was via FTP. Good old, trusty, insecure, breaks every 5th file FTP.
Luckily things have progressed a lot since then and the options we have available for deploying Craft 3 websites have both grown and improved.
Here's an overview of a few different techniques that you can use, along with some pros and cons and a few resources to get you started, whichever option you decide is best for your project.
Things to keep in mind
Craft is a relatively easy framework to deploy remotely. It has two main components that you need to consider: files and the database. The two are very tightly linked and updating one without the other can quickly lead to things exploding and unhappy clients.
Because of this, any successful deployment strategy needs to take both components into consideration. In this article we'll be looking at the two individually before figuring out how to combine them.
To begin, let's look at some options for deploying our file changes to a remote server...
SFTP
The spiritual successor to FTP. By design SFTP provides a very similar experience to FTP but with added security and stability. This is the go-to deployment option for many existing hosting services as it is well established, easy to understand and has many GUI tools available for all major platforms.
It does however suffer from some of the same issues as FTP. It works by uploading lists of files, one after the other. This has two drawbacks:
- If one file upload fails and the rest succeed your site could end up broken if those files rely on one another.
- Sometimes uploading large groups of files can take a significant amount of time. During this time your site might show errors as files that rely on each other have not been updated simultaneously.
SFTP also has no method of ensuring that only changed files are being uploaded. If you're deploying via SFTP you'll need to either keep a reference of which files need to be uploaded for every update or just upload the whole lot every time.
Finally SFTP is almost always restricted to uploading files to a single destination. If you need to deploy code updates to multiple servers simultaneously (such as in a load balanced environment) there are much better alternatives.
In the context of Craft CMS websites specifically, SFTP offers a simple and accessible way to move files from your local environment to a remote location. If you're not too worried about potential downtime due to long uploads or errors uploading individual files then it could work well for you with minimum fuss.
Pros:
- Easy to understand - just upload and download files
- Easy to use - many GUI clients available for free
Cons:
- No method of syncing - either keep a list of files to update or upload the whole lot every time
- Uploads can take a long time - each file is uploaded as an individual task
- Deployments are not atomic - if a single file fails to upload it could break your site until fixed
scp
'scp' stands for 'secure copy' and is a command line tool which can be used for transferring one or more files to and from remote systems. But that is where its abilities end - it can't be used for browsing the files on a system or changing their properties, just transferring. Because of this, scp is usually used in conjunction with other tools, or as part of a well defined automated process which relies on the same task being performed in the same way every time.
scp also suffers from the same drawbacks as SFTP related to uploading files sequentially - any errors or delays can result in a half-uploaded update resulting in errors. It also doesn't have any method of syncing files - just like SFTP.
At face value it would seem as though SFTP is the better option due to its additional functionality, however scp beats it on raw speed. The methods used to facilitate and confirm file transfers are vastly different in the two programs and scp's method results in much faster upload times.
Anecdotally, I have never had an scp transfer fail on my home internet connection and I believe this is partly due to the fact that it simply takes a lot less time and therefore decreases the chance of network based errors occurring.
When working with Craft CMS both scp and SFTP offer similar functionality, however they each have their place, and scp's place tends to be as part of a scripted deployment process. You will need to double-check that your remote server supports the use of scp for uploads before you consider it though - many hosting providers don't allow its use.
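As an illustration, a scripted scp deploy step might look something like this sketch (the user, host and paths are placeholders, not real values):

```shell
# Hypothetical server details - swap in your own.
# -C compresses data in transit, -r copies directories recursively.
scp -C -r ./my-craft-project deploy@example.com:/var/www/craft
```

Because it's a single command it drops neatly into a larger deploy script, which is where scp tends to shine.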
Pros:
- Simple - does one job and does it very well
- Fast - optimised to move files around as quickly as possible
- Easy to automate - can be integrated into larger deployment scripts with ease
Cons:
- Simple - doesn't provide any functionality other than uploading and downloading files, not even directory listing
- Command-line only - no sensible person would create a GUI for it
- Uploads aren't atomic - as per SFTP, if an error occurs during an upload it can cause problems
- Less support provided by common hosting providers
Rsync
Now we're getting a bit more interesting. Rsync is very similar to scp in that it is a command line tool which allows you to upload or download groups of files, however rsync's special ability is that it is able to transfer only the differences between the files.
It achieves this by comparing the size and hashed contents of the files on the two systems - any sizes or hashes that differ indicate a file that needs to be updated. It is even able to replace only the part of a file that has changed by hashing small blocks of it and only transferring mismatched blocks.
This ability makes pushing updates to your application super-quick. It also reduces the chance of errors occurring because only files which have changed will actually be transferred, even if you tell rsync to sync your entire project.
With this extra power also comes a slight jump in complexity. Rsync has a lot of configuration options which can change the way it transfers files or what file metadata it tries to maintain. You can choose whether to keep or discard file modified times and user permissions, whether to overwrite files which have recently been changed on the remote system, how to handle symlinks, and so on.
There's one significant drawback with rsync though: it has to be installed on both the local and remote systems. Installing locally shouldn't be a problem (even if you're using Windows), but if you don't have the ability to install software on your remote host this might not be a viable option for you.
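As a rough sketch, a typical rsync deployment command might look like this (the host and paths are placeholders, and the excluded folders are just examples of runtime state you wouldn't want to clobber):

```shell
# -a preserves permissions and timestamps, -z compresses in transit,
# --delete removes remote files that no longer exist locally.
rsync -az --delete \
      --exclude 'storage/' \
      --exclude '.env' \
      ./my-craft-project/ deploy@example.com:/var/www/craft/
```

Running it a second time immediately afterwards transfers almost nothing - that's the delta-transfer algorithm at work.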
Pros:
- Syncs files - only uploads the changes that have been made between deployments
- Fast - because it only uploads changes it is lightning fast, less chance of network errors occurring
- Configurable - you can change its behaviour in several ways to perfectly suit the project's requirements
Cons:
- Needs to be installed on both local and remote systems - might not be feasible with some hosting providers
- Some complexity - the configuration options need to be read and understood before it can be used effectively
Git
If you use git to version control your application, you can often use it to deploy too. As long as git is installed on the server to which you are deploying you will be able to clone the repo and pull updates using standard git commands.
This approach requires a thorough understanding of git's functionality to ensure you don't end up in a mess though. Ensuring that only the appropriate files are checked into the project repo, developing good working practices around branching and keeping an eye on filesystem usage are all musts.
Similarly, it can be a pain to store build assets in a git repo as they cause constant merge conflicts in multi-developer workflows. However, deploying with git requires these built files to be present.
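On the server itself, a deployment then boils down to a handful of standard git commands (the repo URL, paths and tag names here are placeholders):

```shell
# One-off setup on the remote server:
git clone git@github.com:example/craft-project.git /var/www/craft

# For each subsequent deployment, fetch and check out the latest release:
cd /var/www/craft
git fetch origin
git checkout --detach origin/master   # or a tagged release, e.g. v1.2.0

# Rolling back is just checking out the previous known-good tag or commit:
# git checkout --detach v1.1.0
```

Using `--detach` against a fetched ref avoids accidental local commits on the server drifting away from the central repo.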
Pros:
- Files are usually pulled from a central project repo. If multiple developers are working on a project this ensures all of their changes are included and you aren't deploying just one developer's version of the codebase
- Git only syncs file changes when pulling updates so updates tend to be quick
- Updates are atomic - if there are any errors the update will be aborted in its entirety
- File updates can be rolled back easily by 'checking out' a previous git commit
Cons:
- An understanding of git commands and workflows is required to do this safely
- The entire repo and its history will be stored on the remote server
- Requires build assets to be checked in to the git repo
- git needs to be installed on the remote server and you'll need command line access (SSH access) in order to run it
Capistrano
So far we've mainly been discussing tools which aren't necessarily designed to handle application deployments but which we can use to do so. Capistrano is much closer to a true 'deployment tool'.
Written in Ruby (so you'll need Ruby installed to use it), it allows commands and workflows to be automated and executed on remote servers. It does all this over plain SSH connections so no software needs to be installed on your remote server, but you will need to ensure you have SSH connectivity.
One of Capistrano's primary use cases is for deploying and updating applications. Because of this it has a few features which make it a great tool for doing so.
One of these features is automated atomic deployments. This is made possible by using a symlink to point to the root of your application. Any software installed on the server which helps to run your project will point towards this symlink. When you deploy an update with Capistrano it will upload all of your files to a separate location on the remote's filesystem. It is then able to perform any necessary checks to ensure the transfer was successful. Once confirmed the symlink (which all the server software is using to find your app) is re-pointed to the newly uploaded files. This allows application updates to be applied instantly even if it takes several minutes to upload all of the required files.
By uploading each version of your application to a separate location on the remote system Capistrano is also able to rollback your changes by simply re-pointing the symlink back to the previous version.
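The symlink dance at the heart of this approach can be sketched with nothing but coreutils (the release names here are made up for illustration):

```shell
# Each deploy lands in its own directory; "current" points at the live one.
mkdir -p releases/v1 releases/v2

# First deploy: point "current" at release v1.
ln -sfn releases/v1 current

# Next deploy: files are fully uploaded into releases/v2 first, then the
# symlink is swapped in one quick step - the app switches versions instantly.
ln -sfn releases/v2 current

# Rolling back is just re-pointing the symlink at the previous release:
# ln -sfn releases/v1 current
readlink current   # prints "releases/v2"
```

Your web server's document root points at `current`, so it never sees a half-uploaded release.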
Pros:
- Has specifically curated functionality for deploying applications, including atomic updates
- Simple rollback support
- Supports arbitrary pre and post update checks and can roll back automatically if these fail
- Can be used for other remote execution jobs
- Nothing needs to be installed on the remote system
Cons:
- Your entire app will be uploaded with each update (however there are rsync plugins to address this)
- You might need to learn some Ruby if you want to do any custom scripting
- Requires some careful management of folders on the remote system in order to maintain state on the filesystem (e.g. Craft's storage folder or user uploads)
- Command line only so you'll need to get comfortable configuring and using it there
Envoyer
Originally built for deploying Laravel apps, Envoyer is a SaaS tool specifically designed to deploy applications which are stored in a git repo.
Its deployment methodology is very similar to Capistrano's in that it will keep multiple versions of an application on the remote simultaneously and switch between them using a symlink.
As a web based service specifically designed for helping with deployments it offers a nice GUI with several useful extra bits of functionality including health checks and auditing. It also provides a simple interface for deploying to multiple servers, either in parallel or sequentially such as in a staging > production workflow.
There are a couple of things you need to be aware of before jumping into Envoyer though:
- Envoyer will be connecting directly to your server - you must trust them completely as you are handing them full access
- Your deployment methodology will be tightly bound to the Envoyer system - they hold all of the scripts and configuration needed to make it work
- Envoyer is a paid service and therefore if your credit card is declined you lose the ability to deploy app updates
In return for your trust (and money) you are provided with a deployment platform that, for the large part, Will Just Work™.
If you do go down this route just make sure that your boss/client is happy with the implications. Nobody likes getting sued.
Pros:
- Relatively simple if you don't mind playing with SSH keys - deployment complexity is all handled by someone else
- Multi-server, multi-tier deployments
- Nice web based GUI for all operations
- Pulls your code directly from your git repo
- Atomic deployments and rollbacks
- Lots of nice additional features like health checks and auditing
Cons:
- Costs money
- You're handing access to your server over to someone else
- If you stop paying you'll need to set up a new deployment mechanism - no exporting of deployment scripts
- Built assets (JS, CSS etc) need to be checked into your git repo
- Tailored to Laravel (but can be tweaked to work with most applications)
DeployBot
We're gradually building up the complexity, and next on the list comes DeployBot. This is a SaaS tool like Envoyer but a bit more grown up. It allows multiple deployment methods to be used (SFTP, scp etc), allows arbitrary commands to be executed on the remote server during deployment (if you have SSH access) and also provides a mechanism for building your static assets (CSS, JS etc) before they are deployed. This means you don't need to check built assets into your repo! 🎉 (This is a real pet peeve of mine.)
The deployment method is similar to that of Capistrano and Envoyer so we get all of the same atomic update and rollback functionality as those tools.
The asset building features can require a significant amount of additional code writing if you're not careful. It works by cloning your git repo into a docker container which has some basic software installed. It is then your responsibility to figure out what you need to install and execute your build scripts which will also need to be checked into your git repo. Once your build scripts have completed DeployBot will deploy all of your repo's files along with any other files which were generated as part of your build process.
It's a bit like having a simple continuous integration process which is only executed when you deploy a new version of your codebase. This process can be slow though, increasing the time between updates being committed to your repo and them arriving on your servers.
Finally, as per Envoyer, you will be handing access to your servers over to DeployBot so all the same caveats apply. You can mitigate this somewhat by using SFTP as the deployment mechanism but that'll increase your deploy times significantly.
Pros:
- Web interface for managing deployments and configuration
- Still relatively simple compared to what's coming up next
- Multi-server, multi-tier deployments
- Pulls code directly from git repo
- Allows assets to be compiled as part of the deployment process
- Atomic deploys and rollbacks
- Language and framework agnostic
- Send notifications to Slack, HipChat etc
Cons:
- Costs money
- You're handing access to your server over to someone else
- If you stop paying you'll need to set up a new deployment mechanism - no exporting of deployment scripts
- A bit more complex compared to Envoyer, you might need to add a few bits of config or scripts yourself
Docker
My weapon of choice. Docker is a completely different way of packaging and deploying applications which is being adopted by software development teams and hosting platforms very quickly.
Docker allows an application to be bundled into an 'image' along with all of the software required to allow it to execute. This image can then be run on any server which has docker installed. This has two main benefits:
- Docker becomes your only project dependency - no need to install nginx, PHP etc on your server, it's all bundled into your image. Even when different developers are working locally, they just need docker, nothing else.
- Your image is immutable and (somewhat) deterministic - it'll run the same everywhere. No more "Well it works on my computer...".
The general process for deploying with docker is to update your local development files, use those to build an image and then swap the currently running image on your remote servers with the new one.
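A bare-bones version of that loop might look like this (the registry, image name, tag and port mapping are all hypothetical, and the exact commands will vary with your registry and orchestration setup):

```shell
# Build an image containing the app and everything it needs to run:
docker build -t registry.example.com/my-craft-site:v1.2.0 .

# Push it to a registry your servers can reach:
docker push registry.example.com/my-craft-site:v1.2.0

# On each server: pull the new image and swap the running container.
docker pull registry.example.com/my-craft-site:v1.2.0
docker stop craft && docker rm craft
docker run -d --name craft -p 80:80 registry.example.com/my-craft-site:v1.2.0
```

In practice the stop/run swap is usually handled by an orchestrator or a small script rather than typed by hand.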
To read more about how this all works in the context of Craft specifically, feel free to read through my post on using Craft 3 with docker. I also encourage you to have a read through my entire Craft continuous integration and deployment process which covers setting up a docker based workflow from scratch.
For all the praises I've been singing, Docker does come with one major down-side: the learning curve is significant.
As it's a totally different way of packaging, deploying and running applications there will be a lot that you need to get your head around before you feel comfortable putting all of the pieces together. In my opinion the process is worth it though - and the world is moving toward containerisation as a standard solution for cloud-based deployments so you won't be wasting your time learning.
Pros:
- Application images bundle the app and its dependencies into a single, deployable unit
- The only required server dependency is docker
- When applications are executed they run in the same way no matter where they're run - they are isolated from their environment
- Atomic updates, rollbacks possible but might need a little scripting
- Build your images once, deploy multiple times - great for multi-server, multi-tier workflows
Cons:
- Steep learning curve
- File and folder permissions problems occur sometimes - make sure you understand unix permissions
- You need to be careful with state stored on the filesystem - it needs to be handled with host filesystem mounts so that it persists when a docker container exits
Well done for getting this far!
But so far we've only covered half the problem. It must be time for a pomodoro break, so go refill your coffee and let's delve into the ~~depressing~~ interesting world of database synchronisation...
We've covered a few options for syncing your project's files, but as I mentioned earlier Craft uses a combination of files and database structure in order to provide its flexibility.
Unlike your project's files, which can largely just be replaced for each update, the database holds both configuration and user generated content, intertwined in such a way that it's very difficult to perform a single blanket action to get your project updated.
To make this work well we need a method of updating specific records and structures in the database without removing any of the other content that it contains.
But let's start at the beginning and work our way up the complexity scale again.
Dump and Restore
In order to move our database updates from one environment to another, the simplest option is to export the database and then import the resulting SQL into our remote database. The export and import operations can be performed using a variety of tools but 'mysqldump' is probably the most popular.
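The round trip is only a couple of commands (the database names and user are placeholders):

```shell
# Export the local database to a plain SQL file:
mysqldump -u craft -p craft_local > dump.sql

# ...transfer dump.sql to the remote server, then import it there:
mysql -u craft -p craft_remote < dump.sql
```
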
This process is simple, quick and is supported by pretty much any hosting provider. However, importing the dumped database on our remote server will destroy any data that exists on that server and replace it with whatever we had in our local environment.
This obviously isn't great as all of our user generated content is in there - we'd lose all of the Entries which had been created in our live environment. 😖
Take Notes and Update Manually
A process that works ok for smaller sites is to perform all updates in a local environment and simply take notes of all changes to Sections, Fields, Categories etc which are made in the Craft control panel.
Once all of these changes have been made and your updated templates are ready to be deployed you can then re-play all of the control panel changes in your production environment and deploy your new template files.
There are two major downsides to this approach:
- Human error is almost guaranteed on anything non-trivial. Even small differences in the changes that are made in each environment can result in errors in your production environment
- Applying all of the updates manually can take a while. During this time your template files and database schema will be out of sync which could result in user-facing errors unless you put the site into maintenance mode for the duration of the update.
I use this process for less-complex sites which don't need to guarantee uptime - it keeps things simple and avoids unnecessary tooling. It is extremely important to take comprehensive notes of any changes you make to your data schema if you take this approach.
On the negative side - this approach doesn't scale well: the update process would need to be performed in any environment that exists. If you have a local development environment, staging and production you're going to be performing all of your schema changes three times, multiplying the chance of human error creeping in.
Designate A 'Master' Environment
Combining the previous two methods can provide some relief from both of their downsides. If we are working with a local > staging > production set of environments we could designate a single 'master' environment on which schema changes are performed and user content is created.
Following this methodology our update process would look like this:
- Perform schema and template updates locally using test Entry data. Make detailed notes of schema changes.
- Apply the schema changes to staging - which is also where the Client has been creating their real site content.
- Test in staging using the real site content.
- Export the staging database and import into production.
This allows us to perform a no-downtime deployment in production and removes any chance of human error whilst reapplying the schema updates. It also ensures we're able to check our update against production-ready Entries before we go live.
A significant problem with this approach however is that the Client will be creating all of their content in a staging environment. In order to get this live we'll need to perform an export from staging and import into production. We don't want to be doing this manually every time the client performs a content update so we'd need to script it and run it on a regular basis. We'd also need a method of syncing any Client uploaded files between staging and production so that they're available when the production database is updated.
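That scheduled sync might be sketched like this (the hosts, credentials, database names and paths are all placeholders, and you'd want to add error handling and backups before trusting it with real content):

```shell
#!/bin/sh
# Pull the content database down from staging...
ssh deploy@staging.example.com "mysqldump -u craft -p'SECRET' craft_db" > staging-dump.sql

# ...load it into production...
ssh deploy@production.example.com "mysql -u craft -p'SECRET' craft_db" < staging-dump.sql

# ...and have production pull any client-uploaded files directly from
# staging (assumes SSH keys are set up between the two servers).
ssh deploy@production.example.com \
    "rsync -az deploy@staging.example.com:/var/www/craft/web/uploads/ /var/www/craft/web/uploads/"
```

Run on a cron schedule this keeps production trailing staging by at most one interval, which may or may not be acceptable for your client.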
It'll work but the trade offs just feel a bit too much for me.
Content Migrations
Many PHP frameworks offer a method of updating the database called 'Migrations'. These are files which contain a set of commands which should be executed in order to migrate the database from one state to another.
Craft offers this functionality with Content Migrations. You can put whatever code you want into these migrations, including editing the structure and content of the database directly or using Craft's API to create Sections, Fields etc without having to touch the DB directly.
You can check these migrations into your project's version control and deploy them to all of your environments. It then only takes a single CLI command to run all of the migrations which haven't yet been executed in each environment.
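In Craft 3 terms the flow looks like this (the migration name is just an example):

```shell
# Generate an empty content migration class in ./migrations:
./craft migrate/create add_news_section

# ...fill in its safeUp() method, commit it, then in each environment run
# any migrations which haven't yet been applied:
./craft migrate/all
```
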
This sounds like the perfect solution for our problem as we'll be able to perform local development using Craft's control panel, then codify any schema changes we've made into migration files and check them into version control. Once they're there any other developers or environments will be able to grab the updated files and run any pending migrations in order to get everything in sync.
This workflow matches that used in many other popular frameworks such as Laravel, however there's one thing that you'll need to keep in mind before jumping into using this strategy in all of your projects: Craft migrations can get big very quickly.
Because Craft's data model is abstracted away from the database structure we aren't able to simply add columns to the DB in order to add properties to specific Entry types. Instead we have to create a Field, add it to a FieldGroup, then append it to a Tab which has been associated with the Entry type. All that searching, creating and associating takes a fair amount of code which will all need testing and debugging before it's ready to share.
A good example of what to expect can be found here.
Use A Plugin
It would be great if we could get something to create these migration files on our behalf. The closest we can get to that reality is plugins like The Architect. This plugin allows portions of Craft's schema to be exported as a JSON file and then imported in other environments.
The export process is manual, so you'll still need to keep track of all the schema changes you make during development in order to know what to export. However, applying schema changes becomes a single import action, and your schema JSON files can be checked into version control so that they're easily shareable with other developers and between environments.
One downside compared to migrations is that there's no automated application of pending updates so you'll need to keep track of which updates have been applied in which environments and only run the pending ones. You'll also need to be careful to run them in the correct order.
If you have multiple developers working on a project simultaneously it would be prudent to establish a workflow for sharing and merging branches which contain schema updates as it would be easy for developers to end up with broken DB schemas if they're applying the updates in the wrong order.
I haven't used this technique much in the past but it could certainly act as an improvement over fully manual methods as long as you have an export/import process which you follow very carefully.
A final point on this approach: Be careful if you're using any custom field types or fields from a plugin which The Architect doesn't support. It won't know how to export and import these custom types so you'll need to take care of these bits manually.
Craft 3.1 Project Config
As of the time of writing Craft 3.1 is yet to be released, but it contains an update which is intended to fix this schema migration problem.
The solution is called Project Config and it is simply a yaml file which contains a description of your project's schema along with any project specific settings which are normally stored in the database.
This file is updated whenever a change is made to the schema via the Craft control panel, by a plugin, or due to a bump in Craft's own version.
Craft also regularly checks this file to see if a newer version is available. If one is, Craft will compare its contents to the existing schema and apply any changes between the two automatically.
This provides us with a win-win situation: we can develop locally without worrying about what schema changes we're making - Craft will just ensure our Project Config yaml is kept up to date. When we're happy with our schema changes we can move this file to each of our environments and, when we do, Craft will automatically apply the changes we've made to the schema without destroying any user generated content.
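Based on the pre-release documentation, applying a deployed config file should be a single console command (the exact command name may still change before 3.1's final release):

```shell
# Compare config/project.yaml against the current schema and apply
# any differences, without touching user generated content:
./craft project-config/sync
```
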
For a more thorough write-up on how Project Config works you can check out my deeper dive into it.
Once stable, Project Config will become the standard method of keeping schema changes in sync across multiple environments and between developers. While we're waiting for that you now have a few options to choose from and I encourage you to try them out and see what works best for you, your team and your specific projects.
This all sounds like a lot of work!
Indeed. I've always believed that web developers should be able to create, without necessarily having to understand all of the ins and outs of how their creation needs to be deployed, scaled and maintained. These days the disciplines of Front End Development and Server Sysadmin are as far apart as Dentist and Brain Surgeon - both of them go to med school, but I wouldn't trust a dentist with my cranium!
So I've created Servd - a hosting platform for Craft CMS which doesn't assume everyone is a server guru. It gives you what you need, without you having to learn how.
Have a look and see if it can solve some of your challenges. I certainly hope it will. 😃