Hard drive manufacturers tend the earth’s wounds

Western Digital's new hard drive range, the GreenPower Drive, has achieved the remarkable feat of cutting power consumption by an estimated 40 percent. Not only does this amount to huge savings for large companies and corporations whose massive employee rosters all use computers, but it also means less CO₂ emitted into the atmosphere each year, an ideal selling point in a society increasingly hot under the collar about global warming. Western Digital will be including this new earth-friendly technology in its new terabyte-sized external hard drives, commercially available from this month, August 2007.

Other power-saving hard drives that are becoming, or have already become, available include the Kanguru External Eco Drive, which has three user-settable power modes; the Buffalo MiniStation and DriveStation, which offer scheduled power shutdown and an auto on/off feature; and the Hitachi Deskstar 7K1000, which boasts a patented feature whereby its drive heads operate on energy converted from the spinning of the disks (also protecting against data loss).

This progression is a testament to the will of manufacturers to provide a quality service to their customers and chuck in their two cents to help save the planet as well; it's good to see corporations' actions having positive effects on the well-being of nature for a change.

Software Corruption

Software corruption is generally estimated to be responsible for 13-14 percent of all known data loss. So it is no coincidence that so many people the world over encounter the same nonsense with their home computers: they install program after program, and the system grows slower and slower as more software is added to its inventory.

Various things can cause your software to go haywire. Your computer could become infected with a virus that found its way in through an insecure connection to the Internet or a local area network, or via an external disk such as a CD or DVD, or any other afflicted storage device.

Most people don’t know their bum from their elbow when it comes to what goes on behind the scenes with their personal PCs, which is certainly nothing to be ashamed about. So much is controlled automatically behind the curtain of information technology that it’s like going to the theatre: you are captivated by the on-stage performance yet couldn’t be bothered with whatever happens backstage, where all the preparation is actually going on. With that in mind, think of how easily an insecure system can fall by the wayside without anyone monitoring it. As you add more software to your PC, the processor and supporting components struggle to run everything at once, like one person trying to carry eight full shopping packets up the stairs, so the whole system gets slower and eventually crashes.

I’m sure you’ve heard of others who have had to, or have yourself had to, reinstall the entire operating system from scratch, losing all the accumulated programs in the process. And who keeps indefinitely, let alone knows how to reinstall, all the little programs they’ve gradually gathered? One way to prevent this from happening is to use a disk-imaging program to take an “image” of the hard drive before the software corruption takes place. In fact there are many ways this can be prevented, but such methods usually rely on the computer user having a certain amount of savvy and taking a certain amount of responsibility for backing up their data. For less confident users, however, calling in the professionals may be the better option. A professional data recovery company such as Data Detect can use technological wizardry (special equipment and skill) to successfully hunt down and restore a stricken computer’s data in most cases.
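The disk-imaging idea itself is simple enough to sketch in a few lines of Python. This is a minimal illustration rather than any particular product, and the file paths are hypothetical; it reads a source device or file block by block, writes a byte-for-byte image, and returns a checksum so the copy can later be verified:

```python
import hashlib

def image_disk(source_path, image_path, chunk_size=1024 * 1024):
    """Copy a raw device (or file) to an image file, block by block.

    Returns the SHA-256 of everything read, so the image can be
    verified against the source after the copy completes.
    """
    digest = hashlib.sha256()
    with open(source_path, "rb") as src, open(image_path, "wb") as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            digest.update(chunk)
            dst.write(chunk)  # exact byte-for-byte copy of the source
    return digest.hexdigest()
```

On a real system the source would be a raw device such as /dev/sda (which requires administrator rights); restoring after a corruption is simply the same copy run in reverse, from the image back onto the drive.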

Overwritten Data

Hard drive files are stored on areas of the disk called sectors, grouped into units called clusters. A file’s clusters can sit in a continuous run or be strewn across the hard disk. The Windows file systems, NTFS and the older FAT, logically order this data by mapping each file name to the locations on the disk’s magnetic surface that hold its contents; the file system’s tables record how to reorder and reassemble the scattered pieces into the complete file. Because the data may be located arbitrarily around the overall disk surface, reconstructing the puzzle of a complete file can be very difficult for a data recovery expert. Even if the file names and the on-disk addresses of the data can be recovered, there is no guarantee that all the necessary data can be restored to its full original condition. So some, but not all, scenarios have successful outcomes.
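How a FAT-style file system follows a fragmented file’s clusters can be shown with a toy model. Everything here, the names, the table layout, the tiny cluster size, is a deliberately simplified assumption for illustration, not the real on-disk format:

```python
# Toy model of a FAT-style file system: the file allocation table maps
# each cluster to the next one in the file, so data can be scattered
# anywhere on the disk yet still be reassembled in order.
END_OF_CHAIN = -1

def read_file(directory, fat, clusters, name):
    """Follow a file's cluster chain and reassemble its contents."""
    cluster = directory[name]            # directory entry: name -> first cluster
    parts = []
    while cluster != END_OF_CHAIN:
        parts.append(clusters[cluster])  # raw data held in that cluster
        cluster = fat[cluster]           # table entry: cluster -> next cluster
    return b"".join(parts)

# A file fragmented across non-adjacent clusters 0, 2, 3 and 4:
clusters = [b"Hel", b"???", b"lo,", b" wo", b"rld"]
fat = {0: 2, 2: 3, 3: 4, 4: END_OF_CHAIN}
directory = {"greeting.txt": 0}

print(read_file(directory, fat, clusters, "greeting.txt"))  # b'Hello, world'
```

If the directory entry or the table entries are lost, the chain can no longer be followed even though the raw cluster data is still sitting on the disk, which is exactly why reassembling the puzzle is so hard for a recovery expert.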

In a 1996 paper, Peter Gutmann, a computer scientist in New Zealand, proposed that data can be recovered from disk sectors that have been overwritten. He outlined how read/write disk heads never travel over exactly the same area of the disk twice, so specialised magnetic imaging (such as magnetic force microscopy) can reveal a “ghost track” of the old sector alongside the new data. This was based on the types of hard disk drive that preceded the IDE and ATA style drives commonly used today.

Now, with a massive increase in bit density per square inch and hard disk track widths shrunk to the nanometre scale, the possibility of recovering data in this way has greatly diminished.

Thomas Feher has claimed that if a disk drive is opened in a hermetically sealed, clean environment, specialised equipment can be used to read the residual magnetic traces that surround newly written tracks and extract old data, even if the data has been overwritten numerous additional times. He claims that one of the companies able to achieve this level of high-tech data recovery is Kuert Information Management.
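The multi-pass overwriting these claims concern can be sketched in Python. This is a simple file-level illustration with hypothetical names, and it comes with a caveat: on modern journaling file systems and flash media, an in-place overwrite is not even guaranteed to hit the original sectors:

```python
import os
import secrets

def overwrite_file(path, passes=3):
    """Overwrite a file's contents in place with random data, several times.

    Whether any residual magnetic trace of the old data survives such
    passes is precisely what the recovery claims above dispute.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # fresh random data each pass
            f.flush()
            os.fsync(f.fileno())                # force each pass out to the device
```

Gutmann’s original paper went much further, prescribing dozens of passes with specific bit patterns tuned to the encoding schemes of older drives; on today’s high-density drives, a small number of random passes is generally considered sufficient.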

Useful Flash Memory

Flash memory has fast become one of the most popular mediums of data storage throughout the world.

Flash drives typically use the popular Universal Serial Bus (USB) as their communication interface, and flash memory is used widely by ordinary people and businesses alike as a quick program-and-erase portable storage facility. It is a common storage medium for personal media devices such as MP3 players and personal digital assistants. To put the usage into context for less expert hardware users: the format is used in much the way floppy or stiffy disks were used throughout the last storage era, only with far bigger capacity for storage.

This electrically programmable and erasable memory does not need to be constantly powered by an energy source such as a battery, a leap ahead of the old technology. Its read time is almost as fast as the internal random access memory of a personal computer. A huge advantage is that the memory is far more resistant to shock and vibration, a far cry from the jumping and skipping that most compact disc players are prone to. The silent running of flash memory also challenges hard disk technology, which has long struggled to produce absolutely silent drives. On the downside, a megabyte of flash memory still costs more than the same amount of hard disk space. This is not a permanent state of affairs, however, since research into flash memory capacity, speed and cost efficiency continues apace to accommodate the high commercial demand. It is not yet clear whether flash memory will totally replace hard disk technology, but then nothing is impossible!

Data De-duplication

De-duplication is a marvellous technology that greatly cuts the cost of data backup by ridding backup data of redundant information. Storage size reductions of up to 50 to 1 can be achieved by the most recent virtual tape libraries. De-duplication suppliers such as Diligent Technologies, Sepaton, FalconStor Software Inc., Network Appliance Inc. (NetApp), Quantum, Symantec Corp. and NEC have all chipped in to the effort to lower storage costs in the world of data backup. There are a couple of negative points to the de-duplication idea, though. Power efficiency is a problem: some users have reverted to tape backup storage because of the high energy consumption associated with the de-duplication process. The question also remains whether de-duplication will work properly together with encryption and compression technologies in the first place.
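The core idea behind de-duplication can be shown with a short sketch: split the backup stream into chunks, fingerprint each chunk, and store each unique chunk only once. Real products use far more sophisticated (often variable-size) chunking, so treat the function names and fixed chunk size here as illustrative assumptions:

```python
import hashlib

def deduplicate(data, chunk_size=4096):
    """Split data into fixed-size chunks and store each unique chunk once.

    Returns (store, recipe): a chunk store keyed by SHA-256 fingerprint,
    plus the ordered list of fingerprints needed to rebuild the original.
    """
    store, recipe = {}, []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        key = hashlib.sha256(chunk).hexdigest()
        store.setdefault(key, chunk)  # identical chunks are stored only once
        recipe.append(key)
    return store, recipe

def restore(store, recipe):
    """Rebuild the original data by replaying the recipe of fingerprints."""
    return b"".join(store[key] for key in recipe)
```

A backup stream full of repeated chunks, such as nightly copies of largely unchanged files, shrinks dramatically under this scheme. It also shows why the encryption question above matters: encrypting data before de-duplicating it makes identical chunks look different, so the duplicates can no longer be found.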