You’re ready to move into the cloud, but before you can reap the benefits, you first have to get your data there. Below are some tips on how to optimize your first cloud backup deployment.
When you sign up for a cloud backup service, your managed service provider must complete an initial full backup before your data can be backed up incrementally. Depending on how much data needs to be backed up and the speed of your internet connection, this initial backup can take a long time to complete.
With the first backup taking so long, it is important to prioritize your data. You may want your organization’s operational documents (word processing files, spreadsheets, etc.) to be backed up first and uncommon file types backed up last. Depending on your managed service provider, you may be able to identify which files are used most often in your business and back those up first.
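One way to think about this prioritization is as a simple ordering over file extensions. The sketch below is purely illustrative — the priority map, the extensions chosen, and the `backup_order` helper are all hypothetical, not part of any provider’s actual software:

```python
from pathlib import Path

# Hypothetical priority map: common operational file types first,
# uncommon types last. Lower number = backed up sooner.
PRIORITY = {".docx": 0, ".xlsx": 0, ".pdf": 1, ".txt": 1}
DEFAULT_PRIORITY = 9  # anything unrecognized is backed up last

def backup_order(paths):
    """Sort files so high-priority operational documents go first."""
    return sorted(
        paths,
        key=lambda p: PRIORITY.get(Path(p).suffix.lower(), DEFAULT_PRIORITY),
    )

files = ["report.docx", "video.mkv", "budget.xlsx", "notes.txt"]
print(backup_order(files))
# → ['report.docx', 'budget.xlsx', 'notes.txt', 'video.mkv']
```

In practice your provider’s client would expose this as a folder or file-type ordering setting rather than code, but the effect is the same: business-critical documents reach the cloud first.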
Take Advantage of Bandwidth Throttling
Although your initial backup may take a long time, you don’t want it to saturate your network during working hours while people are trying to get their work done over the internet. During the day, throttle the backup so you strike a balance between making progress and leaving enough bandwidth for the workday. After business hours and on weekends, you can lift the throttle and dedicate more bandwidth to your backups.
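A throttling schedule like this boils down to a time-based rule. The sketch below is a hypothetical example — the rate limits, the 09:00–18:00 window, and the `upload_limit_kbps` function are assumptions for illustration, not settings from any particular backup product:

```python
from datetime import datetime

# Hypothetical schedule: cap backup upload speed during business
# hours, lift the cap evenings and weekends.
BUSINESS_HOURS = range(9, 18)   # 09:00-17:59, Monday-Friday
THROTTLED_KBPS = 512            # leave most bandwidth for the workday
UNTHROTTLED_KBPS = 8192         # use the full link after hours

def upload_limit_kbps(now: datetime) -> int:
    """Return the backup upload cap (kbit/s) for a given moment."""
    is_weekday = now.weekday() < 5  # Monday=0 .. Friday=4
    if is_weekday and now.hour in BUSINESS_HOURS:
        return THROTTLED_KBPS
    return UNTHROTTLED_KBPS

print(upload_limit_kbps(datetime(2024, 6, 4, 10)))  # Tuesday 10:00 → 512
print(upload_limit_kbps(datetime(2024, 6, 8, 10)))  # Saturday 10:00 → 8192
```

Most backup clients expose exactly this kind of schedule in their settings; the point is that the throttle is a policy you configure once, not something you adjust by hand every evening.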
Deduplication and Compression
It’s best to minimize the data sent over the wire to the cloud through deduplication and compression, especially if you’re paying for backups per gigabyte per month.
One way to decrease the amount of data being backed up (without sacrificing data protection) is deduplication. When evaluating cloud backup providers, this feature should be standard. How deduplication is performed, however, can vary from one managed service provider to the next.
Some providers perform file-level deduplication: each unique file is backed up once, and if the same file exists in multiple locations, pointers to the stored copy are created. Other providers offer block-level deduplication. Rather than just skipping duplicate files, the software behind the cloud backup service computes a checksum for each block being backed up and uses the checksum value to determine whether an identical block has already been stored.
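The block-level approach can be sketched in a few lines. This is a minimal illustration of the idea, not any vendor’s implementation — the fixed 4 KB block size, the `dedupe_blocks` helper, and the use of SHA-256 as the checksum are all assumptions:

```python
import hashlib

BLOCK_SIZE = 4096  # hypothetical fixed block size

def dedupe_blocks(data: bytes, seen: set) -> list:
    """Split data into fixed-size blocks; upload only blocks whose
    checksum hasn't been seen, otherwise record a pointer."""
    plan = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest in seen:
            plan.append(("pointer", digest))  # duplicate: reference existing block
        else:
            seen.add(digest)
            plan.append(("upload", digest))   # new block: send to the cloud
    return plan

seen = set()
# Third block repeats the first, so only two blocks need uploading.
data = b"A" * 4096 + b"B" * 4096 + b"A" * 4096
plan = dedupe_blocks(data, seen)
print([action for action, _ in plan])  # → ['upload', 'upload', 'pointer']
```

Because the `seen` set persists across backup runs, unchanged blocks in later incremental backups also resolve to pointers, which is what keeps the data sent over the wire small.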
Keep a Local Copy of Backup Files
It’s important that you continue to store backups on-premises – it will always be easier and faster to restore data from a local backup than over the internet. Local backups also allow you to further align the value of data with the cost of protecting it. Cloud backup lets you recover from any data-loss scenario, but a second, local backup is best for recovering accidentally deleted files or quickly restoring a single server on your network.
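The restore logic behind this hybrid setup is a simple "local first, cloud fallback" rule. The sketch below is hypothetical — the `restore` helper and the `cloud_fetch` callback are illustrative names, not part of any real backup client:

```python
import tempfile
from pathlib import Path

def restore(filename: str, local_dir, cloud_fetch) -> bytes:
    """Prefer the fast on-premises copy; fall back to the cloud."""
    local_copy = Path(local_dir) / filename
    if local_copy.exists():
        return local_copy.read_bytes()   # fast path: local backup
    return cloud_fetch(filename)         # slow path: pull from the cloud

with tempfile.TemporaryDirectory() as local_backup:
    (Path(local_backup) / "report.docx").write_bytes(b"local data")
    fetch = lambda name: b"cloud data"   # stand-in for a cloud download
    print(restore("report.docx", local_backup, fetch))  # → b'local data'
    print(restore("missing.txt", local_backup, fetch))  # → b'cloud data'
```

An accidentally deleted file comes back from the local copy in seconds, while the cloud copy remains the safety net for site-wide disasters.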
For more information or to request a demonstration please visit http://www.c24.co.uk