Easy incremental backups to an external hard drive



























For a while I used Dirvish to do incremental backups of my machines, but it is slightly cumbersome to configure, and if you do not carry a copy of your configuration it can be hard to reproduce elsewhere.



I am looking for backup programs for Unix/Linux that can:




  • Incrementally update my backup

  • Create "mirror" trees like dirvish did using hardlinks (to save space)

  • Ideally with a decent UI










Tags: linux backup bsd






edited Sep 12 '10 at 16:05 by Michael Mrozek










asked Aug 17 '10 at 3:06 by miguel.de.icaza






















9 Answers






































          Try rsnapshot. It uses rsync and hardlinks and is incremental.
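For reference, a minimal rsnapshot setup looks something like this (paths and retention counts are illustrative; `/etc/rsnapshot.conf` ships with a fully annotated template, and note that rsnapshot requires tabs, not spaces, between fields):

```
# /etc/rsnapshot.conf (excerpt) -- fields are TAB-separated
snapshot_root   /mnt/externalhd/snapshots/
retain          daily   7
retain          weekly  4
backup          /home/  localhost/
```

Cron then drives it, e.g. `rsnapshot daily` once a night and `rsnapshot weekly` on Sundays; each run hard-links unchanged files against the previous snapshot.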






answered Aug 17 '10 at 3:23 by xenoterracide





















• I should mention that I have no idea what Dirvish is or how it works. – xenoterracide, Aug 17 '10 at 3:24

• I think it might be GUI-less, so it misses that bonus... but since you said 'Ideally'. – xenoterracide, Aug 17 '10 at 3:27

• A GUI does not a good UI make. – Eli Frey, Aug 17 '10 at 3:32

• I've been using rsnapshot for years. – cmcginty, Aug 17 '10 at 19:35

































This crude (but functional) script will back up everything under the sun to your external hard drive as a hard-link farm. Each directory name is a timestamp, and a symlink always points at the latest successful backup. Think of it as Time Machine sans the fancy GUI.



#!/bin/sh
# Take a timestamped snapshot of / on the external drive, hard-linking
# files that are unchanged since the previous snapshot via --link-dest.
DATE=$(/bin/date +%Y%m%d%H%M%S)
RSYNC=/usr/bin/rsync
BASE=/mnt/externalhd
TARGET=$BASE/daily
"$RSYNC" -av --exclude "$TARGET" --exclude-from=/etc/backup/rsync.exclude --link-dest="$TARGET/latest/" / "$TARGET/$DATE/"
touch "$TARGET/$DATE/"    # stamp the snapshot's completion time
rm -f "$TARGET/latest"    # -f so the very first run does not fail
ln -s "$TARGET/$DATE" "$TARGET/latest"


Set it up by creating an empty $TARGET and symlinking a dummy $TARGET/latest to it. Populate /etc/backup/rsync.exclude with lost+found, tmp, var/run, and everything else you need to skip during backup, or go for --include-from if it fits you better; man rsync is your friend.
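As a starting point, such an exclude file might look like this (illustrative; one rsync pattern per line, anchored at the transfer root `/`):

```
lost+found
/tmp/*
/var/run/*
/proc/*
/sys/*
/dev/*
/mnt/externalhd
```

Excluding the mount point of the external drive itself is important, or the backup will try to copy itself recursively.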



          Proper sanity checks, error control, remote backup and pretty GNOME GUI are left as an exercise to the reader ;-)






answered Aug 17 '10 at 14:16 by codehead





















• +1 I do something very similar to this. --link-dest for the win. – kbyrd, Aug 17 '10 at 19:37

































The backup-tool comparison over at the Ubuntu Stack Exchange is not really Ubuntu-specific; perhaps you will find some suggestions there.



          I recommend DAR - the Disk ARchive program. It does not come with a GUI, but its config is easy to reproduce. It has great incremental backup support. It does not use hardlink mirror trees, but it has a convenient shell for navigating the filesystem view of different snapshots.
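As a sketch of a full-plus-incremental cycle with dar (archive paths and names here are made up; check `man dar` for the details):

```sh
# Full backup of /home into the archive basename 'monday'
dar -R /home -c /mnt/externalhd/monday

# Incremental backup, using the full archive as the reference (-A)
dar -R /home -c /mnt/externalhd/tuesday -A /mnt/externalhd/monday

# Restore: extract the full archive first, then each increment in order
dar -R /restore -x /mnt/externalhd/monday
dar -R /restore -x /mnt/externalhd/tuesday -w
```

The `-w` flag suppresses the overwrite warning during the second extraction, which is exactly the "each increment overwrites the previous step" behaviour the comment below complains about.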






answered Sep 12 '10 at 8:58 by maxschlepzig


























• DAR has an inconvenient restoration procedure: each incremental backup physically overwrites files from the previous step. So if your file changed 7 times, it would be extracted 7 times, and 6 copies would be wasted, overwritten by the 7th. – ayvango, May 20 '17 at 4:28

































          I use backintime, which is primarily targeted towards Gnome/KDE desktops. However, it can work from the commandline as well.



          I describe backintime as a backup system with "poor man's deduplication".



          If you were to write your own backup script to use rsync and hardlinks, you would end up with something similar to backintime.




          • I use cron to kick off the backintime job once per night.

          • As the documentation says: the real magic is done by rsync (take snapshots and restore), diff (check if something changed) and cp (make hardlinks).

          • backintime can be configured with different schedules. I keep monthly backups for 1 year, weeklies for 1 month, and dailies for 1 week.

          • backintime uses hardlinks. I have 130GB worth of data, and I back this up nightly. It only uses 160GB worth of space on the second drive because of the magic of hardlinks.

          • Restoring data from the backup location is as simple as running cp /u1/backintime/20100818-000002/backup/etc/rsyslog.conf /etc/rsyslog.conf. You don't need to use the GUI.

          • On the second drive, the initial copy was expensive (since you can't do hardlinks between two different filesystems), but subsequent copies are fast.

          • I copy data from my primary filesystems to a second filesystem onto a second hot-swappable drive, and periodically rotate the secondary drive.
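The hard-link trick those bullets describe can be sketched in a few lines of Python. This is a toy illustration of the idea, not backintime's actual code (and it handles only a flat directory): files that are unchanged since the previous snapshot are hard-linked rather than copied, so each additional snapshot costs almost no extra space.

```python
import os
import shutil

def snapshot(source, prev_snap, new_snap):
    """Copy `source` into `new_snap`, hard-linking files that are
    unchanged since `prev_snap` instead of duplicating their data."""
    os.makedirs(new_snap, exist_ok=True)
    for name in os.listdir(source):
        src = os.path.join(source, name)
        dst = os.path.join(new_snap, name)
        old = os.path.join(prev_snap, name) if prev_snap else None
        # copy2 preserves mtime, so an unchanged file has the same
        # mtime in the previous snapshot as in the source.
        if old and os.path.exists(old) and \
                os.path.getmtime(old) >= os.path.getmtime(src):
            os.link(old, dst)      # unchanged: share the inode
        else:
            shutil.copy2(src, dst) # new or modified: real copy
```

This also shows why the initial copy is expensive (everything is a real copy) and why hard links only work within one filesystem, as noted above.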
































• Surely you want the initial copy to be expensive, otherwise you don't have a backup, just another link to a single file? Of course, it's also possible that I'm missing some crucial point which makes this comment pointless :-) – dr-jan, Aug 25 '10 at 13:05

• @dr-jan: I agree with you. However, I think some users expect the initial copy to be fast. – Stefan Lasiewski, Aug 26 '10 at 17:11

































rdiff-backup is really good: http://rdiff-backup.nongnu.org/

Note that it is abandoned; the latest stable and unstable releases are from 2009.
































• But currently unmaintained. – Faheem Mitha, Nov 28 '15 at 17:31

































I've had some success with RIBS (Rsync Incremental Backup System).

It uses rsync, so hardlinks are supported, and it can do incremental backups hourly, daily, weekly, and monthly.

However, it is only a PHP script. To set it up you need to edit the settings and then create the related cronjobs. It works, but it's not the most user-friendly, and it requires PHP.





















































I've been using epitome for about a year now for deduplicated backups of my personal data. It has a tar-like interface, so it's quite comfortable for a Unix user, and setup is a breeze, at least on OpenBSD. You can easily cron it to back up your directories on a daily basis, and it takes care of deduplicating your data. You are basically left with a meta-file that you can use to restore your snapshot at a later date. As I said, the interface is tar-like, so doing a backup is as easy as:




            # epitomize -cvRf 2010-08-16-home.md /home


Note that epitome is abandoned; only a partial copy of its website remains at https://web.archive.org/web/20140908075740/https://www.peereboom.us/epitome/.
































• It's currently experimental, but it works quite well. I've been able to do full restores from arbitrary meta files and recover information that I needed, and have had zero problems with it in ~1 year of use. – gabe., Aug 17 '10 at 5:01

































BackupPC sounds like it fits the bill. It manages a tree of hard links for deduplication and can back up many machines, or just the local machine.
































• +1 for BackupPC. I use it to back up a group of servers regularly. It also has a good web-based UI. – dr-jan, Aug 25 '10 at 13:00



































            Lars Wirzenius's obnam:




• It deduplicates as it backs up, so backups are likely to take little space — potentially saving a lot more than simply hardlinking files.

• Because the backups are deduplicated, every backup is "full"; there is no need for separate incremental backups. It simply detects that not much has changed and does only what is needed.

• Each backup is, effectively, a snapshot of your system, without the need to restore the last full backup and then each incremental backup in turn.

• Contrary to bup (another strong contender with deduplication), obnam is able to delete previous backups to reclaim the space used by unnecessary backups.

• It's retired.

• Besides the regular recovery methods of a backup program, there is a FUSE filesystem that presents obnam's backups as a plain filesystem and lets you choose which snapshot/backup/generation to mount. That is super handy as far as "user" interfaces go (given that we are on a Unix-related site, a flexible command-line interface is highly valued).

• It supports encryption as an integral part of the backups (and not as an afterthought).

• It was written with support for remote backups in mind.
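The FUSE view mentioned above works roughly like this (the repository path is illustrative; see obnam's own documentation for the exact options):

```sh
# Expose all backup generations as a read-only filesystem
obnam mount --repository /mnt/externalhd/repo --to ~/backup-view

ls ~/backup-view            # one directory per generation
# browse or cp files out of any generation with normal tools, then:
fusermount -u ~/backup-view
```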


In my opinion, a serious contender for World Backup Day (and not only that day).
































• "Because the backups are deduplicated, every backup is 'full', with no need for incremental backups. It simply detects that not much has changed and does only what is needed" — since it relies on previous backup versions to provide data, it IS an incremental backup. – Mateusz Konieczny, Apr 16 '16 at 15:37












            9 Answers
            9






            active

            oldest

            votes








            9 Answers
            9






            active

            oldest

            votes









            active

            oldest

            votes






            active

            oldest

            votes









            24














            Try rsnapshot. It uses rsync and hardlinks and is incremental.






            share|improve this answer





















            • 3





              I should mention that I have no idea what Dirvish is or how it works.

              – xenoterracide
              Aug 17 '10 at 3:24











            • I think it might be GUI-less so I miss that bonus... but since you said 'Ideally'

              – xenoterracide
              Aug 17 '10 at 3:27






            • 3





              A GUI does not a good UI make.

              – Eli Frey
              Aug 17 '10 at 3:32






            • 2





              i've been using rsnapshot for years

              – cmcginty
              Aug 17 '10 at 19:35
















            24














            Try rsnapshot. It uses rsync and hardlinks and is incremental.






            share|improve this answer





















            • 3





              I should mention that I have no idea what Dirvish is or how it works.

              – xenoterracide
              Aug 17 '10 at 3:24











            • I think it might be GUI-less so I miss that bonus... but since you said 'Ideally'

              – xenoterracide
              Aug 17 '10 at 3:27






            • 3





              A GUI does not a good UI make.

              – Eli Frey
              Aug 17 '10 at 3:32






            • 2





              i've been using rsnapshot for years

              – cmcginty
              Aug 17 '10 at 19:35














            24












            24








            24







            Try rsnapshot. It uses rsync and hardlinks and is incremental.






            share|improve this answer















            Try rsnapshot. It uses rsync and hardlinks and is incremental.







            share|improve this answer














            share|improve this answer



            share|improve this answer








            edited May 22 '11 at 10:10









            Tshepang

            26.3k72186264




            26.3k72186264










            answered Aug 17 '10 at 3:23









            xenoterracidexenoterracide

            25.9k53159222




            25.9k53159222








            • 3





              I should mention that I have no idea what Dirvish is or how it works.

              – xenoterracide
              Aug 17 '10 at 3:24











            • I think it might be GUI-less so I miss that bonus... but since you said 'Ideally'

              – xenoterracide
              Aug 17 '10 at 3:27






            • 3





              A GUI does not a good UI make.

              – Eli Frey
              Aug 17 '10 at 3:32






            • 2





              i've been using rsnapshot for years

              – cmcginty
              Aug 17 '10 at 19:35














            • 3





              I should mention that I have no idea what Dirvish is or how it works.

              – xenoterracide
              Aug 17 '10 at 3:24











            • I think it might be GUI-less so I miss that bonus... but since you said 'Ideally'

              – xenoterracide
              Aug 17 '10 at 3:27






            • 3





              A GUI does not a good UI make.

              – Eli Frey
              Aug 17 '10 at 3:32






            • 2





              i've been using rsnapshot for years

              – cmcginty
              Aug 17 '10 at 19:35








            3




            3





            I should mention that I have no idea what Dirvish is or how it works.

            – xenoterracide
            Aug 17 '10 at 3:24





            I should mention that I have no idea what Dirvish is or how it works.

            – xenoterracide
            Aug 17 '10 at 3:24













            I think it might be GUI-less so I miss that bonus... but since you said 'Ideally'

            – xenoterracide
            Aug 17 '10 at 3:27





            I think it might be GUI-less so I miss that bonus... but since you said 'Ideally'

            – xenoterracide
            Aug 17 '10 at 3:27




            3




            3





            A GUI does not a good UI make.

            – Eli Frey
            Aug 17 '10 at 3:32





            A GUI does not a good UI make.

            – Eli Frey
            Aug 17 '10 at 3:32




            2




            2





            i've been using rsnapshot for years

            – cmcginty
            Aug 17 '10 at 19:35





            i've been using rsnapshot for years

            – cmcginty
            Aug 17 '10 at 19:35













            21














            This crude -but functional- script will backup everything under the sun to your external hard drive under a hard link farm. The directory name is a timestamp, and it maintains a symlink to the latest sucessful backup. Think of it as a Time Machine sans the fancy GUI.



            #!/bin/sh
            DATE=`/bin/date +%Y%m%d%H%M%S`
            RSYNC=/usr/bin/rsync
            BASE=/mnt/externalhd
            TARGET=$BASE/daily
            $RSYNC -av --exclude $TARGET --exclude-from=/etc/backup/rsync.exclude --link-dest=$TARGET/latest/ / $TARGET/$DATE/
            touch $TARGET/$DATE/
            rm $TARGET/latest
            ln -s $TARGET/$DATE $TARGET/latest


            Set it up creating an empty $TARGET and symlink a dummy $TARGET/latest to it. Populate /etc/backup/rsync.exclude with lost+found, tmp, var/run and everything else you need to skip during backup, or go for --include-from if it fits you better; man rsync is your friend.



            Proper sanity checks, error control, remote backup and pretty GNOME GUI are left as an exercise to the reader ;-)






            share|improve this answer





















            • 1





              +1 I do something very similar to this. --link-dest for the win.

              – kbyrd
              Aug 17 '10 at 19:37
















            21














            This crude -but functional- script will backup everything under the sun to your external hard drive under a hard link farm. The directory name is a timestamp, and it maintains a symlink to the latest sucessful backup. Think of it as a Time Machine sans the fancy GUI.



            #!/bin/sh
            DATE=`/bin/date +%Y%m%d%H%M%S`
            RSYNC=/usr/bin/rsync
            BASE=/mnt/externalhd
            TARGET=$BASE/daily
            $RSYNC -av --exclude $TARGET --exclude-from=/etc/backup/rsync.exclude --link-dest=$TARGET/latest/ / $TARGET/$DATE/
            touch $TARGET/$DATE/
            rm $TARGET/latest
            ln -s $TARGET/$DATE $TARGET/latest


            Set it up creating an empty $TARGET and symlink a dummy $TARGET/latest to it. Populate /etc/backup/rsync.exclude with lost+found, tmp, var/run and everything else you need to skip during backup, or go for --include-from if it fits you better; man rsync is your friend.



            Proper sanity checks, error control, remote backup and pretty GNOME GUI are left as an exercise to the reader ;-)






            share|improve this answer





















            • 1





              +1 I do something very similar to this. --link-dest for the win.

              – kbyrd
              Aug 17 '10 at 19:37














            21












            21








            21







            This crude -but functional- script will backup everything under the sun to your external hard drive under a hard link farm. The directory name is a timestamp, and it maintains a symlink to the latest sucessful backup. Think of it as a Time Machine sans the fancy GUI.



            #!/bin/sh
            DATE=`/bin/date +%Y%m%d%H%M%S`
            RSYNC=/usr/bin/rsync
            BASE=/mnt/externalhd
            TARGET=$BASE/daily
            $RSYNC -av --exclude $TARGET --exclude-from=/etc/backup/rsync.exclude --link-dest=$TARGET/latest/ / $TARGET/$DATE/
            touch $TARGET/$DATE/
            rm $TARGET/latest
            ln -s $TARGET/$DATE $TARGET/latest


            Set it up creating an empty $TARGET and symlink a dummy $TARGET/latest to it. Populate /etc/backup/rsync.exclude with lost+found, tmp, var/run and everything else you need to skip during backup, or go for --include-from if it fits you better; man rsync is your friend.



            Proper sanity checks, error control, remote backup and pretty GNOME GUI are left as an exercise to the reader ;-)






            share|improve this answer















            This crude -but functional- script will backup everything under the sun to your external hard drive under a hard link farm. The directory name is a timestamp, and it maintains a symlink to the latest sucessful backup. Think of it as a Time Machine sans the fancy GUI.



            #!/bin/sh
            DATE=`/bin/date +%Y%m%d%H%M%S`
            RSYNC=/usr/bin/rsync
            BASE=/mnt/externalhd
            TARGET=$BASE/daily
            $RSYNC -av --exclude $TARGET --exclude-from=/etc/backup/rsync.exclude --link-dest=$TARGET/latest/ / $TARGET/$DATE/
            touch $TARGET/$DATE/
            rm $TARGET/latest
            ln -s $TARGET/$DATE $TARGET/latest


            Set it up creating an empty $TARGET and symlink a dummy $TARGET/latest to it. Populate /etc/backup/rsync.exclude with lost+found, tmp, var/run and everything else you need to skip during backup, or go for --include-from if it fits you better; man rsync is your friend.



            Proper sanity checks, error control, remote backup and pretty GNOME GUI are left as an exercise to the reader ;-)







            share|improve this answer














            share|improve this answer



            share|improve this answer








            edited May 5 '13 at 11:10









            Anthon

            61.3k17105168




            61.3k17105168










            answered Aug 17 '10 at 14:16









            codeheadcodehead

            3,0881138




            3,0881138








            • 1





              +1 I do something very similar to this. --link-dest for the win.

              – kbyrd
              Aug 17 '10 at 19:37














            • 1





              +1 I do something very similar to this. --link-dest for the win.

              – kbyrd
              Aug 17 '10 at 19:37








            1




            1





            +1 I do something very similar to this. --link-dest for the win.

            – kbyrd
            Aug 17 '10 at 19:37





            +1 I do something very similar to this. --link-dest for the win.

            – kbyrd
            Aug 17 '10 at 19:37











            9














            The Backup-Comparison of backup tools at the Ubuntu-Stackexchange is not really Ubuntu-specific. Perhaps you get some suggestions there.



            I recommend DAR - the Disk ARchive program. It does not come with a GUI, but its config is easy to reproduce. It has great incremental backup support. It does not use hardlink mirror trees, but it has a convenient shell for navigating the filesystem view of different snapshots.






            share|improve this answer


























            • DAR has inconvenient restoration procedure: each incremental backup physically overrides files from previous step. So, if your file changes 7 times, it would be extracted 7 times, and 6 copies would be wasted, overridden by the 7th.

              – ayvango
              May 20 '17 at 4:28
















            9














            The Backup-Comparison of backup tools at the Ubuntu-Stackexchange is not really Ubuntu-specific. Perhaps you get some suggestions there.



            I recommend DAR - the Disk ARchive program. It does not come with a GUI, but its config is easy to reproduce. It has great incremental backup support. It does not use hardlink mirror trees, but it has a convenient shell for navigating the filesystem view of different snapshots.






            share|improve this answer


























            • DAR has inconvenient restoration procedure: each incremental backup physically overrides files from previous step. So, if your file changes 7 times, it would be extracted 7 times, and 6 copies would be wasted, overridden by the 7th.

              – ayvango
              May 20 '17 at 4:28














            9












            9








            9







            The Backup-Comparison of backup tools at the Ubuntu-Stackexchange is not really Ubuntu-specific. Perhaps you get some suggestions there.



            I recommend DAR - the Disk ARchive program. It does not come with a GUI, but its config is easy to reproduce. It has great incremental backup support. It does not use hardlink mirror trees, but it has a convenient shell for navigating the filesystem view of different snapshots.






            share|improve this answer















            The Backup-Comparison of backup tools at the Ubuntu-Stackexchange is not really Ubuntu-specific. Perhaps you get some suggestions there.



            I recommend DAR - the Disk ARchive program. It does not come with a GUI, but its config is easy to reproduce. It has great incremental backup support. It does not use hardlink mirror trees, but it has a convenient shell for navigating the filesystem view of different snapshots.







            share|improve this answer














            share|improve this answer



            share|improve this answer








            edited Apr 12 '17 at 7:23









            Community

            1




            1










            answered Sep 12 '10 at 8:58









            maxschlepzigmaxschlepzig

            34.4k33139213




            34.4k33139213













            • DAR has inconvenient restoration procedure: each incremental backup physically overrides files from previous step. So, if your file changes 7 times, it would be extracted 7 times, and 6 copies would be wasted, overridden by the 7th.

              – ayvango
              May 20 '17 at 4:28



















            • DAR has inconvenient restoration procedure: each incremental backup physically overrides files from previous step. So, if your file changes 7 times, it would be extracted 7 times, and 6 copies would be wasted, overridden by the 7th.

              – ayvango
              May 20 '17 at 4:28

















            DAR has inconvenient restoration procedure: each incremental backup physically overrides files from previous step. So, if your file changes 7 times, it would be extracted 7 times, and 6 copies would be wasted, overridden by the 7th.

            – ayvango
            May 20 '17 at 4:28





            DAR has inconvenient restoration procedure: each incremental backup physically overrides files from previous step. So, if your file changes 7 times, it would be extracted 7 times, and 6 copies would be wasted, overridden by the 7th.

            – ayvango
            May 20 '17 at 4:28











            8














            I use backintime, which is primarily targeted towards Gnome/KDE desktops. However, it can work from the commandline as well.



            I describe backintime as a backup system with "poor man's deduplication".



            If you were to write your own backup script to use rsync and hardlinks, you would end up with something similar to backintime.




            • I use cron to kick off the backintime job once per night.

            • As the documentation says: The real magic is done by rsync (take snapshots and restore), diff (check if somethind changed) and cp (make hardlinks).

            • backintime can be configured with different schedules. I keep monthly backups for 1 year, weeklies for 1 month, and dailies for 1 week.

            • backintime uses hardlinks. I have 130GB worth of data, and I back this up nightly. It only uses 160GB worth of space on the second drive because of the magic of hardlinks.

            • Restoring data from the backup location is as simple as running cp /u1/backintime/20100818-000002/backup/etc/rsyslog.conf /etc/rsyslog.conf. You don't need to use the GUI.

            • On the second drive, the initial copy was expensive (since you can't do hardlinks between two different filesystems), but subsequent copies are fast.

            • I copy data from my primary filesystems to a second filesystem onto a second hot-swappable drive, and periodically rotate the secondary drive.






            share|improve this answer


























            • Surely you want the initial copy to be expensive, otherwise you don't have a backup, just another link to a single file? Of course, it's also possible that I'm missing some crucial point which makes this comment pointless :-)

              – dr-jan
              Aug 25 '10 at 13:05











            • @Dr-jan : I agree with you. However, I think some users expect the initial copy to be fast.

              – Stefan Lasiewski
              Aug 26 '10 at 17:11
















            8














I use backintime, which is primarily targeted towards GNOME/KDE desktops. However, it can work from the command line as well.

I describe backintime as a backup system with "poor man's deduplication".

If you were to write your own backup script to use rsync and hardlinks, you would end up with something similar to backintime.

• I use cron to kick off the backintime job once per night.
• As the documentation says: The real magic is done by rsync (take snapshots and restore), diff (check if something changed) and cp (make hardlinks).
• backintime can be configured with different schedules. I keep monthly backups for 1 year, weeklies for 1 month, and dailies for 1 week.
• backintime uses hardlinks. I have 130GB worth of data, and I back this up nightly. It only uses 160GB worth of space on the second drive because of the magic of hardlinks.
• Restoring data from the backup location is as simple as running cp /u1/backintime/20100818-000002/backup/etc/rsyslog.conf /etc/rsyslog.conf. You don't need to use the GUI.
• On the second drive, the initial copy was expensive (since you can't do hardlinks between two different filesystems), but subsequent copies are fast.
• I copy data from my primary filesystems to a second filesystem on a second hot-swappable drive, and periodically rotate the secondary drive.
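The rsync-and-hardlink trick these bullets describe can be sketched in a few lines of shell. This is a toy illustration of the snapshot scheme, not backintime's actual code (real tools use `rsync --link-dest`, which hardlinks unchanged files and copies changed ones in a single pass); GNU `cp` is assumed for `-al`:

```shell
#!/bin/sh
# Toy hardlink-snapshot scheme: every snapshot looks like a full tree,
# but unchanged files share one inode (and thus one copy on disk).
set -e
rm -rf src snap1 snap2
mkdir src
echo hello  > src/a.txt
echo stable > src/b.txt

cp -r src snap1            # first snapshot: a real, full copy

cp -al snap1 snap2         # second snapshot: hardlinks only, nearly free
echo changed > src/a.txt
rm -f snap2/a.txt          # break the link first (rsync achieves this by
cp src/a.txt snap2/a.txt   # renaming a temp file), so snap1 stays untouched

[ snap1/b.txt -ef snap2/b.txt ]   # unchanged file: same inode in both trees
cat snap1/a.txt                   # old snapshot still holds "hello"
cat snap2/a.txt                   # new snapshot holds "changed"
```

Because b.txt is shared between the snapshots, the second one costs only the changed file plus directory entries, which is how nightly backups of 130GB can fit in 160GB.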






            • Surely you want the initial copy to be expensive, otherwise you don't have a backup, just another link to a single file? Of course, it's also possible that I'm missing some crucial point which makes this comment pointless :-)

              – dr-jan
              Aug 25 '10 at 13:05











            • @Dr-jan : I agree with you. However, I think some users expect the initial copy to be fast.

              – Stefan Lasiewski
              Aug 26 '10 at 17:11














edited Apr 16 '16 at 15:56 – Mateusz Konieczny
answered Aug 18 '10 at 17:47 – Stefan Lasiewski













            4














Rdiff Backup is really good: http://rdiff-backup.nongnu.org/

Note that it is abandoned, with the latest stable and unstable releases from 2009.
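rdiff-backup's distinguishing design is that it stores a plain mirror of the newest state plus reverse deltas for older generations, so restoring the most recent backup is a straight copy. The scheme (an illustration of the idea, not rdiff-backup's actual on-disk format) can be sketched with diff and patch:

```shell
#!/bin/sh
# Reverse-increment sketch: keep only the newest copy as a mirror;
# older versions are reached by applying reverse deltas.
set -e
printf 'one\ntwo\n'        > file.v1
printf 'one\ntwo\nthree\n' > file.v2

diff -u file.v1 file.v2 > delta.1to2 || true  # diff exits 1 when files differ
cp file.v2 mirror                             # mirror always holds the latest

cp mirror restored
patch -R restored < delta.1to2                # undo the v1 -> v2 change
cmp restored file.v1                          # identical: v1 recovered
```

The payoff is that the common case (restore the latest version) needs no delta processing at all, at the cost of rewriting deltas when old generations expire.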






            • But currently unmaintained.

              – Faheem Mitha
              Nov 28 '15 at 17:31
















edited Apr 16 '16 at 15:56 – Mateusz Konieczny
answered Aug 17 '10 at 22:04 – Bauna













            3














I've had some success with RIBS (Rsync Incremental Backup System).

It uses rsync, so hardlinks are supported, and it can do incremental backups hourly, daily, weekly and monthly.

However, it is a PHP script only. To set it up you need to edit the settings and then set up the related cronjobs. It works, but it's not the most user-friendly and it requires PHP.
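The cron wiring for such a script looks something like the fragment below. The install path and the schedule arguments are hypothetical (RIBS's own invocation may differ); the point is one crontab line per backup interval:

```shell
# min hour dom mon dow  command            (illustrative path and arguments)
0    *    *   *   *     php /usr/local/ribs/ribs.php hourly
30   1    *   *   *     php /usr/local/ribs/ribs.php daily
45   2    *   *   0     php /usr/local/ribs/ribs.php weekly
0    4    1   *   *     php /usr/local/ribs/ribs.php monthly
```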






edited Apr 16 '16 at 15:56 – Mateusz Konieczny
answered Aug 17 '10 at 19:20 – mendicant























                    1














I've been using epitome for about a year now for deduplicated backups of my personal data. It has a tar-like interface, so it's quite comfortable for a Unix user, and setup is a breeze, at least on OpenBSD. You can easily cron it to back up your directories on a daily basis, and it takes care of the deduplication of your data. You are basically left with a meta-file that you can use to restore your snapshot at a later date. As I said, the interface is tar-like, so doing a backup is as easy as:

# epitomize -cvRf 2010-08-16-home.md /home

Note that epitome is abandoned; only a partial copy of the website at https://web.archive.org/web/20140908075740/https://www.peereboom.us/epitome/ remains.






• It's currently experimental, but it works quite well. I've been able to do full restores from arbitrary meta files and recover information that I needed, and have had 0 problems with it in ~1 year of use.

  – gabe.
  Aug 17 '10 at 5:01
















edited Apr 16 '16 at 15:57 – Mateusz Konieczny
answered Aug 17 '10 at 4:59 – gabe.













                    1














BackupPC sounds like it fits the bill. It manages a tree of hard links for dedupe and can back up many machines, or just the local machine.






• +1 for BackupPC. I use it to back up a group of servers regularly. It also has a good web-based UI.

  – dr-jan
  Aug 25 '10 at 13:00


















edited Apr 16 '16 at 15:58 – Mateusz Konieczny
answered Aug 17 '10 at 20:11 – TREE













                    1














Lars Wirzenius's obnam:

• Does deduplication when it backs up things, which means that backups are likely to take little space, potentially saving a lot more than simply hardlinking files.
• As the backups are deduplicated, every backup is "full", with no need for incremental backups. It simply detects that not much has changed and only does what is needed.
• Each backup is, effectively, a snapshot of your system, without the need to restore the last full backup and then each incremental backup in turn to get the system back.
• Contrary to bup (another strong contender with deduplication), obnam is able to delete previous backups to reclaim the space of ones you no longer need.
• It's retired.
• Besides the regular recovery methods of a backup program, there is a FUSE filesystem that presents obnam's backups as a plain filesystem and lets you choose which snapshot/backup/generation to mount, which is super handy as far as "user" interfaces go (given that we are on a Unix-related site, a flexible command-line interface is highly valued).
• It supports encryption as an integral part of the backups (and not as an afterthought).
• It was written with support for remote backups in mind.

In my opinion, one serious contender for World Backup Day (and not only that day).
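The first two bullets, deduplication making every generation a cheap "full" backup, can be illustrated with a toy content-addressed store in shell (nothing like obnam's real repository format; `sha256sum` from GNU coreutils is assumed):

```shell
#!/bin/sh
# Toy content-addressed store: each unique payload is stored once under
# its hash, so a new "full" snapshot only adds content it hasn't seen.
set -e
rm -rf store index.txt f1 f2 f3
mkdir store
printf 'same payload\n'  > f1
printf 'same payload\n'  > f2    # duplicate content, different file
printf 'other payload\n' > f3

for f in f1 f2 f3; do
    h=$(sha256sum "$f" | cut -d' ' -f1)
    [ -e "store/$h" ] || cp "$f" "store/$h"   # store new content only
    echo "$f $h" >> index.txt                 # per-snapshot metadata
done

ls store | wc -l    # 2 blobs back 3 files
```

A second "snapshot" of unchanged files would add only index entries, never blobs, which is why each generation is full yet cheap.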






• "As the backups are with deduplication, every backup is "full", with no need of having incremental backups. It simply detects that not many things have changed and only does what is needed": as it relies on previous backup versions to provide data, that means it IS an incremental backup.

  – Mateusz Konieczny
  Apr 16 '16 at 15:37
















edited 3 hours ago – Matthias Braun
answered Feb 27 '13 at 3:04 – rbrito


















