Here are 7 simple rules to follow to guarantee the integrity of your data, without ever forgetting good sense.

Actually, it’s like convincing someone of the usefulness of warm water: every professional in a company should always have an updated and complete backup that allows an easy restore. Yet, unfortunately, every time I talk with an IT manager, consultant or sysadmin I discover, to my great surprise, that some basic concepts are misunderstood or unknown, if not ignored altogether.

Let’s list the 7 requirements of a good backup system.

1: “2 is better than 1”: an extra copy never hurts
Never assume that a single copy on an external hard drive, NAS or network share is enough. Failures are always around the corner, and you’ll face them at the worst possible moment. So duplicate your backups on at least two different media, even if the main one seems adequately reliable. You don’t have to copy everything: if you make, say, a complete backup of each computer, copying terabytes of data twice is probably a waste. Critical data (i.e. documents and email), however, must be backed up on at least two different media, both of them reliable.
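The duplication step can be sketched in a few lines of Python. This is a minimal sketch, not a full backup tool: the folder names (`critical_data`, `backup_disk_a`, `backup_disk_b`) are hypothetical placeholders, and in a real setup each destination would sit on a physically different medium.

```python
import shutil
from pathlib import Path

def duplicate_backup(source: Path, destinations: list[Path]) -> None:
    """Copy every file under `source` to each destination, so that
    critical data always exists on at least two different media."""
    for dest in destinations:
        for file in source.rglob("*"):
            if not file.is_file():
                continue
            target = dest / file.relative_to(source)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(file, target)  # copy2 also preserves timestamps
            # Quick sanity check: the copy must at least match in size.
            if target.stat().st_size != file.stat().st_size:
                raise IOError(f"size mismatch on {target}")

# Hypothetical paths: point these at your critical folder and two media.
# duplicate_backup(Path("critical_data"),
#                  [Path("backup_disk_a"), Path("backup_disk_b")])
```

The size check at the end is deliberately cheap; a content-level verification belongs in the restore test described in the next point.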

2: Test your backups
Most of the time the backup itself is fine; it’s when restoring it that problems arise, and only when you actually need it will you notice errors, missing files or other problems with the saved material. It is therefore of the utmost importance to test backups by restoring them in a test environment. Not all technologies require this check: for instance, if you are making a plain copy of a folder it’s unlikely that something will go wrong, at least if the software can alert you about read/write errors (e.g. multiple concurrent accesses to the same file).
Things get harder with software that processes data into a proprietary-format file. In this case the only way to test the backup is, indeed, to restore it. Some products offer a built-in verification feature, which is a good start, but only a real, complete restore will make you sure of the result.
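For plain file copies, the restore test can itself be automated: restore into a scratch folder, then compare the restored tree against the original. A minimal sketch of that comparison, using SHA-256 checksums:

```python
import hashlib
from pathlib import Path

def digest(path: Path) -> str:
    """SHA-256 of a file, read in chunks to keep memory use flat."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def check_restore(original: Path, restored: Path) -> list[str]:
    """Compare a restored tree against the original and return the
    problems found: files missing from the restore, or files whose
    content no longer matches."""
    problems = []
    for file in original.rglob("*"):
        if not file.is_file():
            continue
        twin = restored / file.relative_to(original)
        if not twin.is_file():
            problems.append(f"missing: {twin}")
        elif digest(twin) != digest(file):
            problems.append(f"corrupted: {twin}")
    return problems
```

An empty list means the restored copy is byte-for-byte identical to the source; anything else deserves attention before you ever need that backup for real.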


The chances of a backup being unusable grow further when it’s a “hot” backup, e.g. of a powered-on virtual machine running on a hypervisor, or a system image taken while Windows is running. A backup of this kind might not be consistent, particularly if you don’t use dedicated technologies such as Microsoft VSS (Windows) and don’t make the backup application-aware for products like databases, SharePoint, Exchange and similar. In these cases a periodic restore is fundamental. More sophisticated tools like Veeam and HPE VM Explorer allow you to restore VMs into a test virtual environment in a completely automated manner, to launch scripts at the end of the process and to capture screenshots that help you understand whether the restore was successful. It’s not overkill: it’s a best practice that gives certainty about the result.

3: Use the right technologies
As the previous point suggests, backing up the complete image of a PC or VM isn’t always the best choice. First, it takes a lot of space to store and a lot of time to restore; second, it doesn’t give all the needed guarantees that everything will work exactly as before.

In most cases, a copy of the database or of the critical files of the application you’re using saves plenty of space and lets you perform backups more frequently, giving you a greater number of versions. The best technology to use, however, varies according to what you are backing up.


If you are backing up the content of a file server, you will have fewer problems by adopting a periodic synchronization policy with versioning: you will then be able to retrieve a single file without using third-party tools to extract data.
If you are backing up a database, use a dump script compatible with the database server in use. You can’t “hot copy” the data files, as they can end up corrupted and unusable. As an alternative, use a commercial backup product that explicitly supports your database server and its version.
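As an illustration, a consistent dump of a MySQL/MariaDB database is normally taken with `mysqldump` and its `--single-transaction` option (for InnoDB tables) rather than by copying data files. A minimal sketch, assuming a local server and credentials handled outside the script (e.g. via an option file); the database and user names are placeholders:

```python
import gzip
import subprocess
from datetime import datetime
from pathlib import Path

def dump_command(database: str, user: str) -> list[str]:
    """Build a mysqldump invocation that takes a consistent snapshot
    of an InnoDB database without locking it (--single-transaction)."""
    return ["mysqldump", "--single-transaction", "--routines",
            "-u", user, database]

def backup_database(database: str, user: str, dest: Path) -> Path:
    """Dump `database` to a timestamped, gzip-compressed file in `dest`.
    A timestamp in the name means each run adds a version instead of
    overwriting the previous one."""
    dest.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    target = dest / f"{database}-{stamp}.sql.gz"
    result = subprocess.run(dump_command(database, user),
                            capture_output=True, check=True)
    with gzip.open(target, "wb") as f:
        f.write(result.stdout)
    return target
```

The same idea applies to other engines with their own dump tools (e.g. `pg_dump` for PostgreSQL); what matters is that the snapshot is taken by the database server itself, not by copying its files from underneath it.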
If you are backing up SharePoint, you can also use its integrated backup tool, which is reliable and can save the state of the whole system. This backup can easily be scripted and performed either incrementally or in full.
Naturally, alongside this script you can also back up the VMs the system runs on and, if possible, keep another backup made with dedicated software that offers an easy incremental restore.
If you’re using Exchange, then it’s time to move to an online email system. Jokes aside: if you do use it, purchase a commercial product for backup and granular restore, read the documentation very carefully and respect every best practice. Moreover, take full advantage of Exchange’s features for recovering deleted email, perhaps extending the retention period beyond 30 days.

4: Granularity matters
This point goes hand in hand with the previous one. When you are asked to retrieve some files from a backup, you will likely neither want nor need to restore the whole thing. Perhaps you just need a single email or file: plan a restore strategy according to how you manage backups and the technologies in use. What matters, in the end, is being prepared for every circumstance.
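If backups are kept as timestamped snapshot folders (as with the synchronization-with-versioning approach from point 3), granular restore becomes a simple lookup. A sketch, assuming one folder per backup run named so that lexicographic order matches chronological order (e.g. 2018-04-01):

```python
from pathlib import Path

def versions_of(backup_root: Path, relative_path: str) -> list[Path]:
    """Return every saved version of a single file across timestamped
    snapshot folders, newest first, so one document can be restored
    without touching the rest of the backup."""
    snapshots = sorted((d for d in backup_root.iterdir() if d.is_dir()),
                       reverse=True)
    return [s / relative_path for s in snapshots
            if (s / relative_path).is_file()]
```

Handing the user the newest version is then just `versions_of(root, "docs/report.xlsx")[0]`; with a proprietary backup format, the equivalent operation requires the vendor’s granular-restore tooling.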



5: Notifications: yes, but with proper clarity
Some backup programs can send you a daily email with the complete log of all the day’s backup activities. While this feature protects you against losing the logs stored on the server, it certainly won’t help you understand what’s going on, unless you have enough time every day to open the logs, extract the content and read every detail, taking care of each error.


A good backup program will send you a clear, readable report that lets you understand whether things are good or bad just by reading the subject line. The body should include the details of the files, VMs, databases, etc. that have errors.
This way you can verify what went wrong with minimal effort, and perhaps even postpone the check if the error concerns a low-importance file.
The report should also be sent when there are no files to back up, or when the backup software isn’t running for some reason. Another problem that can arise is the backup software silently ceasing to send emails because it has stopped working, because of a network problem, a configuration change, an expired SMTP password, or some other reason. You need a system that checks this doesn’t happen, otherwise you can’t monitor the state of your backups. If you receive a single email each day you might notice when it’s missing, but what if you receive tens of emails from all your clients? Chances are you won’t notice the missing reports.

6: The intelligent use of the Cloud
If you’ve thought about a Disaster Recovery plan, you’ll naturally have considered copying data outside your company, to another office or into the Cloud.

When copying files to a remote location you may be tempted to copy everything that is already being copied to a local NAS. That’s an approach we don’t advise, especially if connectivity is limited. It’s better to use a backup service that supports seeding the initial data transfer via physical hard disks, and only then activate the incremental backup feature.


7: Don’t forget about Ransomware
Ransomware can encrypt backups too, so every backup medium must be protected against a possible attack, otherwise the backups themselves lose their effectiveness.

Clearly, at least up to now, a Cloud destination is intrinsically protected, but it’s important to limit access to backups within the local network too, so as to lower the chances of acts of vandalism by disgruntled employees or collaborators. Keep these potential threats in mind when planning access permissions to files and folders.
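One mitigation worth sketching: make the backup repository refuse plain overwrites, so an infected or compromised client account can add new backup files but cannot silently replace existing ones. A minimal sketch using POSIX-style semantics; it will not stop malware running with administrative rights on the repository itself, which is why access restrictions and an off-site copy still matter:

```python
import os
import stat
from pathlib import Path

def write_once(data: bytes, target: Path) -> None:
    """Write a backup file that can't be silently replaced: creation
    fails if the name already exists (O_EXCL), and the result is made
    read-only afterwards."""
    # O_CREAT | O_EXCL atomically refuses to open a pre-existing file.
    fd = os.open(target, os.O_WRONLY | os.O_CREAT | os.O_EXCL, 0o644)
    try:
        os.write(fd, data)
    finally:
        os.close(fd)
    # Drop write permission so casual overwrites fail too.
    target.chmod(stat.S_IRUSR | stat.S_IRGRP)
```

Combined with timestamped file names (each run creates a new name), this turns the repository into an append-only store from the client’s point of view: ransomware on a workstation can encrypt the originals, but not the history already written.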

 

About the Author

Filippo Moriggia

After more than 10 years of experience in technical journalism with PC Professionale (the Italian version of PC Magazine) and other publications of the Mondadori group, Filippo Moriggia founded GURU advisor, the reference website for IT professionals, system integrators, cloud providers and MSPs. He has a Master of Science in Telecommunications Engineering and works as an independent consultant and contractor for several firms. His main focuses are software, virtualization, servers, cloud, networking and security. He is certified VMware VCA for Data Center Virtualization.
