Even with the Latest Backup Technologies, Defragmentation is Vital
Since the earliest days of computing, backing up data has been crucial. Vista, the latest operating system from Microsoft, offers robust backup functionality. Vista’s backup and restore features, including automatic and scheduled options, make it easy for IT personnel to keep corporate data safe from user error, hardware failure, and other problems.
The necessity of backing up data has always brought a plethora of problems and solutions to the data center. Corporate and IT executives continually ask how often data should be backed up, how quickly backed-up data should be accessible, which backup technology would be most beneficial, and which backup medium is most cost-effective for a particular site, all in an effort to make sure company data is always secure and recoverable in an emergency.
A common hindrance to backups, however, despite ever-advancing technologies, is file fragmentation. Fragmentation already causes problems in day-to-day file access: a file split into hundreds or even thousands of fragments (more common than you might think) takes considerably longer to access. Multiply that by all the data files across an enterprise’s computer systems, every one of which a backup procedure must access, and it becomes clear what a nightmarish problem file fragmentation can be for backup times.
Numerous sites discovered some time ago that defragmenting disks prior to running a backup greatly reduced backup times. The same holds true today. Advances in defragmentation technology have made it possible to schedule defragmentation so that it completes before backups begin.
But since these technologies were developed, disks have gained incredibly high capacities and store more active files than ever. In addition, the advent of video and other technologies has meant much larger files. Not only must all of those files be backed up, but if they are to be backed up in a reasonable amount of time, they must be defragmented. Because of the sheer volume and size of files, a scheduled defragmentation may leave files still fragmented when its timed run ends; moreover, system resources and performance are impacted while the defragmenter is running, so the run time usually cannot be extended. The result: backup times are still impacted by fragmentation.
The true solution is to have disks constantly defragmented, with no impact on system resources, so that when backups occur they can run as fast as possible. Fortunately, defragmentation technology that meets these criteria is now arriving on the market.
One element of backup, however, remains vital no matter the methodology used: regular defragmentation. File fragmentation, especially left unchecked, can greatly slow backup procedures because of the multiple I/Os required to retrieve the data from each file fragment.
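The I/O cost described above can be illustrated with a back-of-the-envelope model: each fragment costs one extra disk seek before sequential transfer resumes. The figures below (seek latency, transfer rate, fragment count) are illustrative assumptions chosen for the sketch, not measurements from any particular system.

```python
# Rough model of how fragmentation inflates the time to read a file during backup.
# All constants are assumed, illustrative values -- not benchmarks.
SEEK_TIME_MS = 8.0        # assumed average seek + rotational latency per fragment
TRANSFER_MB_PER_S = 60.0  # assumed sequential transfer rate of the disk

def read_time_seconds(file_size_mb: float, fragments: int) -> float:
    """Estimate time to read one file: one seek per fragment, plus transfer time."""
    seek_s = fragments * SEEK_TIME_MS / 1000.0
    transfer_s = file_size_mb / TRANSFER_MB_PER_S
    return seek_s + transfer_s

# A 100 MB file stored contiguously vs. split into 2,000 fragments:
contiguous = read_time_seconds(100, 1)
fragmented = read_time_seconds(100, 2000)
print(f"contiguous: {contiguous:.1f}s, fragmented: {fragmented:.1f}s")
```

Under these assumptions the fragmented file takes roughly ten times longer to read, and that penalty is paid for every fragmented file the backup touches.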
http://www.microsoft.com/windows/products/windowsvista/features/details/backup.mspx