One of the most interesting parts of this journey for me has been seeing how much things have moved on since I last did serious web dev work 5-10 years ago. The first thing we put in place was Git for source code control, and it saves our bacon on a near-daily basis. It records every iteration of every file and makes it easy to move back and forward between versions: a refreshing change from copying the entire directory before a major change and deleting the copy once you're done. Much better.
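That back-and-forward between versions is worth seeing concretely. Here's a minimal sketch (the repository and file names are made up for illustration):

```shell
# Start tracking a project and record a first version
git init demo && cd demo
git config user.name "Demo"            # identity is needed for commits
git config user.email "demo@example.com"

echo "v1" > config.txt
git add config.txt
git commit -m "First version of config"

# Make a risky change and record that too
echo "v2" > config.txt
git commit -am "Riskier second version"

# Changed your mind? Jump the file straight back to the previous commit...
git checkout HEAD~1 -- config.txt      # config.txt contains "v1" again

# ...or forward again to the latest commit
git checkout HEAD -- config.txt        # config.txt is back to "v2"
```

Every version is kept forever, so there's nothing to clean up afterwards; the directory-copy ritual simply goes away.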
I knew we didn't want to transfer files to the server via FTP because it's insecure, and secure FTP is a bit slow and dated. For a good while we pushed to a Git repo on the server (via SSH), then used post-receive hooks to deploy the code for testing, but that's not great because:
- Git doesn't know or care about file ownership or permissions
- Git tracks files, not directories, so it won't transfer empty directories, and we need some of those ready for things like cache data.
- Checking out files from the repo, while quick, isn't instant, which leaves our application in an unknown state during the deployment as files are copied over.
We've since switched to Capifony, which handles all of this properly:
- It doesn't push straight to the live directory. It deploys to a 'releases' directory, then once everything has tested out successfully, it updates a single symbolic link to point at the new release. That switch is instantaneous.
- It keeps five (or however many you like) previous releases on the live server, and can roll the live site back to any of them at a moment's notice.
- Once it's done, it can run any combination of scripts to finish the deployment, just as if you were SSH'd into a terminal on the server.
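The releases-plus-symlink trick is easy to picture. A rough sketch (the directory names below are illustrative, not the tool's exact layout):

```shell
# Each deploy lands in its own timestamped directory...
mkdir -p releases/20120501120000 releases/20120508093000
echo "new code" > releases/20120508093000/index.php

# ...and 'current' (which the web server points at) is just a symlink.
# Flipping it to the new release is a single, effectively instant operation:
ln -sfn releases/20120508093000 current

# Rolling back is the same operation aimed at an older release:
#   ln -sfn releases/20120501120000 current

cat current/index.php   # prints "new code"
```

Because the symlink flip is atomic from the web server's point of view, visitors never see a half-copied release.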
That's a massive step forward. Previously I've tried to mirror our live setup on Windows 7 (no native SSH client!), Mac OS X (MAMP and not MAMP) and Ubuntu 11.10 (which isn't quite the same as Ubuntu 10.04). In every case you get it 'almost, but not quite' the same, and it's those 'not quite' bits that sap hours away from feature development until you realise 'ah, that's because X is subtly different on this machine'.
Vagrant makes that a thing of the past, and that's brilliant. Whatever machine I'm on, I transfer the 800MB dev server image and can get to work. That frees me from my home webserver, so I can now extend Cardsinthepost.com from your nearest Pret, woohoo! This did need a memory upgrade (to 4GB) for my MacBook, and I'm going to switch to an SSD too.
*My new Crucial 128GB SSD, awaiting a T-6 screwdriver for installation.*
It seems actual files on an actual server are becoming a thing of the past. We now have a series of incremental references to bits of files (the Git repo), inside a virtual machine running a virtual operating system (Vagrant), stored on a hard disk that's really memory, that publishes to a directory that's just a temporary reference (Capifony) on a server that doesn't really exist (Rackspace Cloud).
I'm already impressed with the huge library of AMIs from Amazon that provide server snapshots that you can import and fire up as your own whenever you like, like a giant server supermarket. Maybe one day there'll be an open standard for server images where you can run them locally or spin them up as 'live' via your favourite provider? That would be excellent.