How to do scripted backups? #16

Closed
klausagnoletti opened this issue Jul 13, 2020 · 2 comments

@klausagnoletti

Hi

Do you have any suggestions for how to implement this in a typical backup script that automatically runs a backup every x days, puts the output into a dated directory, deletes old backups, etc.?

Great work - very useful!

/klaus

@tlaanemaa
Owner

tlaanemaa commented Aug 6, 2020

Hey

Sorry for the super delayed response, work's been crazy lately

I've built myself a script that does exactly that, but it's not open-sourced since it's not general enough and contains some of my use-case-specific things. I'll try to look into making it into a package.

In general though, the process should be easy enough to do with, for example, a shell script. Mine consists of the following steps:

  • Create a new folder for the new backup
  • Create the new backup by running this package
  • Zip the backup and remove the original files created in the previous step (in my case these files are very compressible)
  • Delete old end-of-life backups
  • Run rsync to sync the local backup directory with a remote server

I also use a staggered ageing strategy, so it aims to keep backups of varied ages (1d, 2d, 3d, 5d, 10d, 15d, 30d, etc.)
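To make that concrete, here's a rough standalone illustration of the matching logic the cleanup step further down uses (the ages in this snippet are made up just to show the idea; it's not part of the actual script):

// retention-illustration.js (illustration only, not part of the script)
"use strict";

// Same strategy as in config.json, sorted descending like in cleanup()
const strategy = [0, 1, 2, 4, 7, 12, 20, 33, 58, 88].sort((a, b) => b - a);

// Hypothetical ages (in days) of the backups currently on disk, oldest first
const ages = [90, 60, 35, 21, 13, 8, 6, 4, 3, 2, 1, 0];

// For each strategy value, keep the oldest not-yet-kept backup that is young enough
const keep = new Set();
for (const maxAge of strategy) {
  const match = ages.find((age) => age <= maxAge && !keep.has(age));
  if (match !== undefined) keep.add(match);
}

console.log("kept:   ", ages.filter((age) => keep.has(age)));  // [ 60, 35, 21, 13, 8, 6, 4, 2, 1, 0 ]
console.log("deleted:", ages.filter((age) => !keep.has(age))); // [ 90, 3 ]

The 90-day backup gets dropped because it's older than the largest strategy value (88), and the 3-day one gets dropped because the 2- and 4-day backups already cover its neighbouring buckets.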

Below are the main parts of the backup script I've written for my own use:

// Shell.js
"use-strict";

const path = require("path");
const { spawn } = require("child_process");

/**
 * Small helper for running shell commands in a given working directory,
 * capturing their output and mirroring it to this process's stdout/stderr.
 */
class Shell {
  constructor(cwd = path.resolve(__dirname, "..")) {
    this.cwd = cwd;
  }

  /**
   * Runs the given command with the provided args and resolves with
   * { code, stdout, stderr } once the child process exits.
   */
  run(command, args = [], options = {}) {
    return new Promise((resolve, reject) => {
      const child = spawn(command, args, { cwd: this.cwd, ...options });
      const stdout = [];
      const stderr = [];

      child.stdout.on("data", (data) => {
        const dataString = data.toString();
        stdout.push(dataString);
      });

      child.stderr.on("data", (data) => {
        const dataString = data.toString();
        stderr.push(dataString);
      });

      child.on("close", (code) => {
        const outputData = {
          code,
          stdout: stdout.join(""),
          stderr: stderr.join(""),
        };

        resolve(outputData);
      });

      child.on("error", reject);
      child.stdout.pipe(process.stdout);
      child.stderr.pipe(process.stderr);
    });
  }
}

module.exports = Shell;
// Backup.js
"use-strict";

const path = require("path");
const fs = require("fs");
const util = require("util");
const Shell = require("./Shell");
const config = require("../config.json");

// Promisified methods
const writeFile = util.promisify(fs.writeFile);
const rimraf = util.promisify(require("rimraf"));

class Backup {
  constructor() {
    this.startDate = new Date();
    const name = this.startDate.toISOString().replace(/:/g, "_");
    this.baseDirectory = path.resolve(__dirname, "..", "backups");
    this.directory = path.resolve(this.baseDirectory, name);
  }

  /**
   * The main method to run the backup procedures.
   * Run this method to kick off the backup->zip->cleanup->sync process
   */
  async run() {
    // Only zip and cleanup if backup creation and zipping succeeded without errors
    if ((await this.create()) && (await this.zip())) {
      await this.cleanup();
    }

    await this.sync();
  }

  /**
   * Creates container backups.
   *
   * This runs `backup-docker backup` in a child process and logs its output
   */
  async create() {
    console.log("\n>> Creating container backups...");
    fs.mkdirSync(this.directory);
    const shell = new Shell(this.directory);
    const { stdout, code } = await shell.run("backup-docker", ["backup"]);
    await writeFile(path.resolve(this.directory, "backup.log"), stdout);
    return code === 0;
  }

  /**
   * Zips the created container backups
   */
  async zip() {
    console.log("\n>> Zipping...");
    const subdirectories = fs
      .readdirSync(this.directory)
      .filter((x) =>
        fs.lstatSync(path.resolve(this.directory, x)).isDirectory()
      );

    const outputFile = path.resolve(this.directory, "files");
    const shell = new Shell(this.directory);
    const { stdout, code } = await shell.run("zip", [
      "-rm9",
      outputFile,
      ...subdirectories,
    ]);
    await writeFile(path.resolve(this.directory, "zip.log"), stdout);
    return code === 0;
  }

  /**
   * Get all backup folders, parse their names to dates and order them by age, descending.
   */
  getCurrentBackups() {
    return fs
      .readdirSync(this.baseDirectory)
      .filter((x) =>
        fs.lstatSync(path.resolve(this.baseDirectory, x)).isDirectory()
      )
      .map((directory) => {
        const date = new Date(directory.replace(/_/g, ":"));
        return {
          directory,
          date,
          ageInDays: Math.round(
            (this.startDate - date) / (1000 * 60 * 60 * 24)
          ),
        };
      })
      .sort((a, b) => b.ageInDays - a.ageInDays);
  }

  /**
   * Performs the cleanup, removing backups that are no longer needed.
   * The decision which backups to remove is based on the `backupVersioningStrategy`
   */
  async cleanup() {
    console.log("\n>> Running cleanup...");
    const backupsToRemove = this.getCurrentBackups();

    /*
      For each element in the strategy (max backup age in days) we find a backup whose age is less than or equal to that strategy value.
      We then remove these backups from the array to preserve them. It is important that both the backup and strategy arrays are sorted
      in the same order, otherwise we'll get bad matches and might mistakenly remove backups we were meant to keep.
      In the end we'll have an array that only contains backups that were not matched to any strategy element.
      We'll then go ahead and remove those.
    */
    config.backupVersioningStrategy
      .sort((a, b) => b - a)
      .forEach((strategyValue) => {
        for (let i = 0; i < backupsToRemove.length; i++) {
          if (backupsToRemove[i].ageInDays <= strategyValue) {
            backupsToRemove.splice(i, 1);
            return;
          }
        }
      });

    // Remove backups left behind in the array (not matched to any strategy element)
    for (const backup of backupsToRemove) {
      console.log(`Deleting backup '${backup.directory}'`);
      await rimraf(path.resolve(this.baseDirectory, backup.directory));
    }

    // Write which backups will be removed into a log file
    await writeFile(
      path.resolve(this.directory, "cleanup.log"),
      `The following backups were removed by cleanup:\n${backupsToRemove
        .map((x) => x.directory)
        .join("\n")}`
    );
  }

  /**
   * Syncs the new backup state to a remote location with rsync
   */
  sync() {
    console.log("\n>> Syncing...");
    const shell = new Shell(this.baseDirectory);

    return Promise.all(
      config.remoteBackupTargets.map((target) =>
        shell.run("sshpass", [
          "-p",
          target.password,
          "rsync",
          "-rltDv",
          "--delete",
          "./",
          target.target,
        ])
      )
    );
  }
}

module.exports = Backup;
// config.json
{
  "backupVersioningStrategy": [0, 1, 2, 4, 7, 12, 20, 33, 58, 88],
  "remoteBackupTargets": [
    {
      "target": "backups@192.168.1.10:/volume/directory",
      "password": "passwordpasswordpassword"
    }
  ]
}
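
For completeness, a minimal entry point could look something like this (the file name run.js and the require path are just examples, adjust them to match wherever you keep the files), and you can then schedule it with cron or a systemd timer:

// run.js (example entry point, name and path are illustrative)
"use strict";

const Backup = require("./src/Backup"); // adjust to wherever Backup.js lives

new Backup()
  .run()
  .then(() => console.log("\n>> Backup run finished"))
  .catch((error) => {
    console.error(error);
    process.exit(1);
  });

// Example crontab entry to run it every night at 03:00:
// 0 3 * * * cd /path/to/backup-script && node run.js >> cron.log 2>&1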

I hope this helps

@tlaanemaa
Owner

Hey 👋
I'll close this since it seems inactive. Feel free to reopen if you've got more questions.

@tlaanemaa tlaanemaa pinned this issue Jul 12, 2021
@tlaanemaa tlaanemaa mentioned this issue Dec 16, 2021