Home Network Backup and Sharing Disk Space

We have spoken about the configuration of hard disk partitions and how they can help in maintaining and upgrading a Linux system. Now we will turn our attention to how to back up a Linux system. Once we understand backup strategies, we will look at how to share directories across the network to simplify the backup of the workstations.

Backup of Server

Let's start by reviewing what we should back up and what there is no point in saving to a backup device. The short answer: anything which is part of the installation should not be backed up, and anything unique to this machine should be.

Let's start with a list of the common directories as defined in the Filesystem Hierarchy Standard, plus a couple of others which are common on most Linux installs. Here is the list:

  • /
  • /bin
  • /boot
  • /dev
  • /etc
  • /home
  • /lib
  • /mnt
  • /opt
  • /root
  • /sbin
  • /storage
  • /tmp
  • /usr
  • /usr/local
  • /var

The directories I added to the standard were:

  • /home, where the users' home directories live.
  • /root, the home directory of the root system user.
  • /usr/local, the place commonly used for programs added by hand which did not come from RPM packages.
  • /storage, a place to keep upgrade packages.

Given this strategy, what should not be backed up? By directory it would be /bin, /boot, /dev, /lib, /mnt, /sbin, /tmp, /usr, and /var. All of these directories contain files which were installed from the original CDs or by later upgrades. In Red Hat Linux the upgrades are performed using RPMs, so they are easy to redo. I did not list the root directory since it contains all the subdirectories.

Then what should be backed up, by directory: /home, /usr/local, /root, /etc, /opt, /storage, and /var/spool/mail. This combination of directories backs up all the user-created data along with the system configuration files and upgrade packages.
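The backup of this directory list can be sketched as a single tar command. The destination path below is an assumption for illustration; on a real server you would point it at your tape device (for example /dev/st0) or other backup media, and any directory that does not exist on your system can simply be dropped from the list.

```shell
# Sketch of a full backup of the directories listed above.
# DEST is a placeholder; substitute your tape device or backup disk.
DEST=/tmp/server-backup.tar.gz
# -c create, -z compress, -p preserve permissions, -f write to DEST.
# tar warns about directories it cannot read but still archives the rest.
tar -czpf "$DEST" /home /usr/local /root /etc /opt /storage /var/spool/mail 2>/dev/null
ls -l "$DEST"
```

Running the same command on a schedule (for example from cron) turns this one-liner into a nightly backup of everything unique to the machine.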

How to do a Restore

Given the backup tape defined above, how would you restore a system after, say, a hard disk crash? Assume you have had the hardware fixed or have a new computer.

The obvious place to start is by reinstalling the operating system from the initial CDs. This gives you a running installation from which we can go to the backup tape. When you restore the tape, you should put back the home directories and the sources for the package upgrades first: directories /home, /usr/local, /root, /opt and /storage. Next you should run the upgrades again. This is done by changing to the /storage directory and using the command rpm -U *.rpm. This will go through all the RPM package files in the directory and update everything. Once the updates are in place, run the tape restore again to replace the configuration files in /etc and the /var/spool/mail files. At this point you could simply restart all the services, but it is probably easier to reboot. Yes, reboot is useful in Unix, since it restarts all the services.
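Collected in one place, the restore sequence might look like the script below. The tape device /dev/st0 and the use of tar are assumptions; substitute whatever backup tool actually wrote your tape. Because the commands are destructive, the sketch only prints each step unless you set DO_IT.

```shell
# Dry-run sketch of the restore sequence. Set DO_IT=1 to really execute.
run() { if [ -n "$DO_IT" ]; then "$@"; else echo "would run: $*"; fi; }

# 1. Put back the home directories and the upgrade packages first.
run tar -xzpf /dev/st0 -C / home usr/local root opt storage
# 2. Re-apply all the RPM upgrades kept in /storage.
run sh -c 'cd /storage && rpm -U *.rpm'
# 3. Restore the configuration files and mail spools over the upgraded system.
run tar -xzpf /dev/st0 -C / etc var/spool/mail
# 4. Reboot so every service restarts with the restored configuration.
run reboot
```

The ordering matters: the RPM upgrades are applied before /etc is restored, so the restored configuration files are not overwritten by the packages' default configurations.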

Saving User data

On our home network we had both Linux and Windows workstations. Let's look at a strategy to save the user data from a Windows PC to the user's home directory on the Linux server. In Windows Explorer, open the Tools menu and select Map Network Drive. In the Drive box select a convenient drive letter. In the Folder box, browse to your Linux server and select the folder carrying the user's name. What you have just done is map the Linux server directory /home/user to a drive letter under Windows; let's call it D:. This allows the Windows machine to read and write the user's home directory on Linux. I would normally create a directory there named something like edit, and then configure the Windows programs to use a folder under edit to save their data. So, if you are setting up Mozilla, configure the folder d:\edit\mozilla as the place to save the address book and bookmarks. This way, when you back up the Linux server's /home directory, you will be backing up the Windows users' personal data as well.

So how do you do the same trick for the Linux workstations? This is a little trickier, but not all that hard. Here are the steps:

  1. On the Linux server create the configuration file /etc/exports. This file lists the directories available for NFS mounting, which hosts may access each one, and the permissions of the mount. So it might look like this (here workstation stands in for the hostname of the client machine):

           /home/frank   workstation(rw)

    This says: allow that computer to NFS mount the directory /home/frank for read/write access.

  2. Start the NFS service on your server with the command: service nfs start . This will start the service on the server. You can check that it is running with the command rpcinfo -p ; look for portmapper in the output.
  3. On the Linux workstation log in as root and go to the main user's home directory; let's assume it is /home/frank. Create a directory there called edit, and make sure it is owned by frank.
  4. Mount the NFS share from the server wizard onto the new directory with the command: mount wizard:/home/frank /home/frank/edit . This will attach the NFS share to the local directory.
  5. Edit the file /etc/fstab and add the following line:

           wizard:/home/frank   /home/frank/edit   nfs   defaults   0 0

    This will tell the computer to make the connection at boot time.
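The five steps above can be collected into one sketch. The hostnames wizard and workstation and the user frank follow the text; adjust them for your own network. Because the commands modify system files and need root, the sketch only prints each command unless you set DO_IT.

```shell
# Dry-run sketch of the NFS setup steps. Set DO_IT=1 to really execute.
run() { if [ -n "$DO_IT" ]; then "$@"; else echo "would run: $*"; fi; }

# --- on the server (wizard) ---
run sh -c "echo '/home/frank   workstation(rw)' >> /etc/exports"   # step 1
run service nfs start                                              # step 2
run rpcinfo -p                                                     # check for portmapper
# --- on the workstation (as root) ---
run mkdir /home/frank/edit                                         # step 3
run chown frank /home/frank/edit
run mount wizard:/home/frank /home/frank/edit                      # step 4
run sh -c "echo 'wizard:/home/frank /home/frank/edit nfs defaults 0 0' >> /etc/fstab"  # step 5
```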

That is all that is needed to let the Linux workstation keep its created data on the server. The advantage of this setup is that if the network goes down, the workstations keep running; they are just not connected.

You could make the Linux workstation use wizard:/home/frank in place of the local /home/frank. The advantage of this is that if the local Linux workstation is not working, the user can log into the Linux server console and work from there until his or her machine is fixed. The Windows user does not have this luxury, since his or her programs will not run under Linux. Of course, if they are using tools like Mozilla and OpenOffice, they could use the Linux server too.

To give an example of why this strategy works well: when one of my sons trashes his computer, he does his homework on another computer by telnetting into the server and running all the programs from there. This highlights one of Linux's strengths: since X Windows allows the display and the computer to be on different boxes, you can run Linux programs over the network. If you are really ambitious you can turn the workstations into thin clients and run all the applications from the server. But that is another discussion.


So we have learned how a backup strategy can save you time and effort. With some configuration changes, you can do all your backups from the server and still recover the information on the workstations. By centralizing the information this way, you simplify your work.

Written by John F. Moore

Last Revised: Wed Oct 18 11:01:34 EDT 2017

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.