As I was moving on, I decided to refactor my code to handle publishing my web site to a remote host through my JavaScript deploy script. I decided to use rsync, so I found an rsync wrapper for Node.js that lets me call rsync from JavaScript. It's called rsyncwrapper:

// Load the rsync module
var rsync = require("rsyncwrapper"); // Installed via $ npm install rsyncwrapper

This makes the rsync function available in my script. The function takes an options object, and those options are translated into arguments for the rsync command. To help, I created a simple factory to generate the most-used options that need to be passed to the rsync function:

var rsyncOpts = {
  // Default options for rsync.
  src: ".",
  dest: rsyncuser + "@" + rsyncserver + ":",
  ssh: true,
  port: rsyncport
};

// getOpts returns a copy of the rsync options with varying
// source and dest (and, optionally, a different ssh port).
rsyncOpts.getOpts = function(source, destPath, sshport) {
  return {
    src: source,
    dest: this.dest + destPath,
    ssh: this.ssh,
    port: sshport || this.port
  };
};
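
For example (the folder names here are hypothetical), asking the factory for options targeting a specific remote folder looks like this:

// Hypothetical example: options for copying the local "public"
// folder to "www/site" on the remote, using the default ssh port.
var opts = rsyncOpts.getOpts("public/", "www/site/");
// opts.src  === "public/"
// opts.dest === rsyncuser + "@" + rsyncserver + ":www/site/"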

I then modified my CopyFolder command to use rsync.

At first, I was walking the source folder (as in my last article) and calling rsync file-by-file. This was nonsensical (and lazy on my part): rsync is far more efficient copying a whole folder than copying file-by-file. Not only that, it was overloading the network connections and file copies were failing. I'm not going to show that version, because it was messy and overly complex. The other problem was that I had been using a regular expression to test each file, while rsync uses globbing patterns (*.ext), so I had to add a separate parameter for the globbing patterns.

Here is an outline of the updated CopyFolder function. It is still a work in progress:

var ds = path.sep; // Environment directory separator.

// Deploy the site either locally or to another server.
function CopyFolder(source, dest, rxFilter, useRsync, rsInclude)
{
    // Get the fully qualified source path.
    var sSrcRoot = path.resolve(source);
    // Copy using rsync options to recursively copy a folder.
    var dRsyncOpts = rsyncOpts.getOpts(sSrcRoot + ds, dest + ds);
    dRsyncOpts.args = ['-a'];
    dRsyncOpts.include = rsInclude;
    dRsyncOpts.exclude = ["*~", '-f"- .git/"'];
    dRsyncOpts.recursive = true;

    rsync(dRsyncOpts, function(err, stderr, out, cmd) {
      if (err) return console.error('Error copying folder ' + source + ': ' + 
        err + " || " + stderr + " || " + cmd);
      logger.log('Sent folder ' + cmd);
    }); 
...

The include/exclude rules above are a work in progress, and they currently copy more than I would like (anything not explicitly excluded comes along). I've worked with rsync before, but I haven't used this kind of include/exclude filtering combined with the -a option. In any case, using rsync can replace the fs-extra copy command and should be much more efficient. I still need to test it on Windows to make sure it works there (with rsync installed). Working on Linux provides lots of standard command-line functionality that helps when working with local and remote files.
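
For comparison, here is a rough sketch of the fs-extra based local copy that rsync is replacing. This is not my exact earlier code, just an outline assuming fs-extra is loaded as fse and the old rxFilter regular expression selects the files:

var fs = require("fs");
var fse = require("fs-extra");

// Sketch of the local copy fallback (used when useRsync is false).
// Directories are always allowed through so the copy can recurse;
// files must match the rxFilter regular expression.
function copyFolderLocal(source, dest, rxFilter) {
  fse.copy(source, dest, {
    filter: function(file) {
      return fs.statSync(file).isDirectory() || rxFilter.test(file);
    }
  }, function(err) {
    if (err) return console.error('Error copying folder ' + source + ': ' + err);
    logger.log('Copied folder ' + source + ' to ' + dest);
  });
}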

After doing this and testing a local deployment, I went to deploy to my site host. This is where I got a bunch of random errors, and then everything errored out. Why? Because the rsync calls were running in parallel, and the hosting provider thought it was a DoS attack!

My next step is to use promises to serialize the requests. Because rsync runs asynchronously and each CopyFolder call returns immediately, my current code kicks off all the copies at once:

CopyFolder(src1, dest1, /\.php$/i, isRemote, ["*.php"]);
CopyFolder(src2, dest2, /\.php$/i, isRemote, ["*.php"]);
...

The regular expression is for backward compatibility and will eventually be refactored out. Instead, the calls should read as below to ensure the rsyncs run serially:

CopyFolder(src1, dest1, isRemote, ["*.php"])
  .then(function() { return CopyFolder(src2, dest2, isRemote, ["*.php"]); })
  .then(function() { return CopyFolder(...); })
  .done(function() { console.log("Finished copying!"); });
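
For that chain to work, CopyFolder has to return a promise that resolves when its rsync finishes (.done comes from promise libraries like Q; with native Promises the last step would just be another .then). Here is a minimal sketch of that change, wrapping the existing rsync callback in a native Promise, dropping the legacy rxFilter parameter, and showing only the rsync branch:

// Sketch only: return a Promise so CopyFolder calls can be chained
// and the copies run one at a time instead of in parallel.
function CopyFolder(source, dest, useRsync, rsInclude) {
  var sSrcRoot = path.resolve(source);
  var dRsyncOpts = rsyncOpts.getOpts(sSrcRoot + ds, dest + ds);
  dRsyncOpts.args = ['-a'];
  dRsyncOpts.include = rsInclude;
  dRsyncOpts.exclude = ["*~", '-f"- .git/"'];
  dRsyncOpts.recursive = true;

  return new Promise(function(resolve, reject) {
    rsync(dRsyncOpts, function(err, stderr, out, cmd) {
      if (err) {
        console.error('Error copying folder ' + source + ': ' +
          err + " || " + stderr + " || " + cmd);
        return reject(err);
      }
      logger.log('Sent folder ' + cmd);
      resolve();
    });
  });
}

Each .then then waits for the previous promise to resolve before starting the next copy, so only one rsync hits the host at a time.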

It seems that using Node.js for deployments is not a common practice, but by building up simple tools and functions it becomes easy and leaves room for more functionality in the future. Next will be promises and calling the script from Jenkins!