SATA disk drives still have some way to go to reach perfection

Serial Advanced Technology Attachment (SATA) disk drives have worked their way through a range of performance and reliability problems to become a mainstay of data storage systems. Aside from their rapid data transfer rates, they make it possible for companies to increase their storage capacity (into the terabyte range) without too much damage to their IT budgets. But according to Jerome Wendt, SATA disk drives haven’t ironed out all the kinks just yet. In an article for Computerworld, he says that SATA disk drives have a known deficiency that makes it necessary for companies to be careful when deploying the drives across a range of systems.

The problem is not that serious for small systems, but once a system grows beyond the 10TB mark it risks pieces of data becoming unreadable, and systems of over 100TB are almost certain to encounter the problem. The crux of the matter lies in a pesky unrecoverable bit error rate of roughly one error in every 100 trillion bits read. According to Wendt, not even RAID technology, which normally protects storage systems against data loss, can detect unreadable bits on a SATA drive.
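The scale of the problem can be sanity-checked with some back-of-the-envelope arithmetic. The sketch below simply divides the number of bits in one full read by the one-error-per-100-trillion-bits rate quoted above; the figures are illustrative, not drawn from any particular drive's datasheet.

```python
# Rough estimate of unrecoverable read errors on SATA drives, assuming the
# one-error-per-100-trillion-bits (1e14) rate cited in the article.

def expected_read_errors(capacity_tb, bits_per_error=1e14):
    """Expected number of unreadable bits in one full read of capacity_tb."""
    bits = capacity_tb * 1e12 * 8  # terabytes -> bits
    return bits / bits_per_error

for tb in (1, 10, 100):
    print(f"{tb:>4} TB: ~{expected_read_errors(tb):.2f} expected errors per full read")
```

At 10TB a full read expects a little under one unreadable bit, while at 100TB it expects around eight, which matches the "risky above 10TB, near-certain above 100TB" framing.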

Wendt says that the problem is not really all that serious, provided companies don’t de-duplicate their data to increase their storage capacity. Even super-organised companies generate vast amounts of duplicate data as files and reports are sent around, recreated, edited, amended, saved and resaved in different locations. De-duplication gets rid of unnecessary duplicates, frees up space and helps improve the organisation and management of data. The bit error rate in SATA drives, however, means that if the system can’t read a particular bit of information and there are no duplicates to fall back on, companies risk a snowball effect: one unreadable bit of data can cause many other files to become unreadable, because the key to reconstructing those files lies in the one vital bit that the SATA drive missed.
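To see why one lost block can take many files down with it, consider a minimal de-duplicating store. This is a toy sketch of the general idea (content-addressed chunks shared between files), not any vendor's actual implementation.

```python
# Toy de-duplicating store: identical chunks are kept once and shared, so a
# single unreadable chunk corrupts every file that references it.
import hashlib

store = {}   # chunk hash -> chunk bytes (each unique chunk stored once)
files = {}   # filename -> ordered list of chunk hashes

def save(name, data, chunk_size=4):
    hashes = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        h = hashlib.sha256(chunk).hexdigest()
        store.setdefault(h, chunk)   # duplicate chunks are not stored again
        hashes.append(h)
    files[name] = hashes

save("report_v1.doc", b"ABCDEFGH")
save("report_v2.doc", b"ABCDWXYZ")   # shares the "ABCD" chunk with v1

# Simulate an unrecoverable read error on the one shared chunk:
del store[files["report_v1.doc"][0]]

unreadable = [n for n, hs in files.items() if any(h not in store for h in hs)]
print(unreadable)   # both files now depend on a chunk that cannot be read
```

With duplicates de-duplicated away, there is no second copy to fall back on, so the damage fans out to every file that shared the lost chunk.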

So while high-capacity SATA disk drives and storage systems have overcome numerous hurdles since their inception, and while they are valuable in solving the archiving and backup problems that many companies face, they are still not as infallible as users and vendors would like them to be.

SSDs not all they’re cracked up to be

Solid state drives (SSDs) are marketed primarily on the basis that they are more energy efficient than traditional magnetic hard drives. According to the theory, the fact that SSDs don’t have any moving parts means that they consume less energy than regular hard drives that have spinning platters and movable heads. But an article at Tom’s Hardware called “The SSD power consumption hoax” says that far from being energy efficient, SSDs actually use more power than other hard drives.

According to the article, SSDs draw a consistent level of power (the maximum) regardless of whether they are active or not. The level of power that conventional hard drives use, on the other hand, fluctuates according to the amount of activity, dropping considerably when they are inactive or activity levels are low. Tom’s Hardware tested a number of different SSDs from seven different vendors, and each one returned disappointing results.
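The practical impact of a flat power curve is easy to illustrate with a toy energy calculation over a mixed idle/active workload. All the wattages below are made-up assumptions chosen to mirror the article's claim, not measured figures from the tests.

```python
# Toy energy comparison under the article's claim that early SSDs drew near
# their maximum power even at idle. Wattages are illustrative assumptions.

def energy_wh(idle_w, active_w, idle_hours, active_hours):
    """Total energy in watt-hours over an idle period plus an active period."""
    return idle_w * idle_hours + active_w * active_hours

# A typical laptop session is mostly idle: 7 hours idle, 1 hour active.
hdd = energy_wh(idle_w=0.8, active_w=2.5, idle_hours=7, active_hours=1)
ssd = energy_wh(idle_w=2.0, active_w=2.0, idle_hours=7, active_hours=1)

print(f"HDD: {hdd:.1f} Wh, SSD: {ssd:.1f} Wh over an 8-hour session")
```

Because most drive time is spent idle, a drive that cannot throttle down at idle can lose the energy comparison even when its active draw is similar, which is essentially the pattern the tests reported.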

Understandably, SSD manufacturers have responded strongly to these new, potentially damning revelations. Dean Klein, vice-president of memory system development at Micron, said that only early-generation devices, which he freely admits were “quite power hungry”, were used in the tests. Patrick Wilkison, vice-president of marketing and business development at STEC, has also spoken out against the results. He says the tests were flawed because they used legacy drives that no PC manufacturer will actually ship. Wilkison adds that new versions of SSDs have intelligent power-management systems to combat excessive power use.

Intel, which wasn’t included in Tom’s Hardware’s SSD tests, has nevertheless added its two cents to the battle by saying that SSDs can be “architected (sic) to improve battery life”. The company is expected to release new SSDs with improved capacities (80GB to 160GB) later this year, all of which it claims are power efficient.

Throughout this entire furore, it appears that everyone has overlooked the fact that Tom’s Hardware believes in the value of SSDs and says they are the way of the future. The site merely wanted to point out the areas it believes need improving for SSDs to reach their full potential. What can’t be overlooked, however, is that it may have misled consumers by testing old, outdated SSDs that have already been replaced by improved models.

Beam me up, Scotty: Star Trek technology about to become reality

If you’re an avid Star Trek fan (as I am), and even if you’re not (poor, culturally deprived person), you’ll be able to appreciate the significance of the latest medical technology breakthrough. Doctors in Star Trek are never without their tricorders, which are very cool handheld devices that can diagnose everything from the common cold to a subdural haematoma. Soon, doctors in rural areas will be able to use their cell phones in a very tricorder-like manner.

A team from the University of California, Berkeley, has broken down bulky medical imagers into their component parts and placed the most complicated elements in one central location. Then, using off-the-shelf cell phone technology, they created a portable scanner that can be plugged into any cell phone capable of sending and receiving pictures. Scanned data is sent to the central location for analysis and diagnosis, and the results are sent back. Practical tests carried out by the researchers found that the entire process uses fewer kilobytes than a single-sentence email.

These portable scanners can be used for anything from detecting tumours to monitoring the progress of a child in the womb. Boris Rubinsky, professor of bioengineering at Berkeley, says that big, bulky imagers and scanners are often too expensive and impractical to run in most areas of developing countries. The new portable cell phone scanners will play an important role in helping hard-pressed doctors improve their diagnoses and treatment options in even the most trying circumstances. The best part is that it needs only a cell phone signal to work. And where, apart from deep within an underground cave, can you not get a cell phone signal?

It’s also quick and affordable: an ultrasound machine, for example, costs around $70,000 (AU$74,341), but a scanner coupled to a cell phone would cost only around $1,000 (AU$1,062). Another major benefit is that one central server would be capable of dealing with data from several portable devices, eliminating the need for many machines. As you can see, the savings are enormous, and exactly what developing countries need.

But Rubinsky doesn’t see them benefiting only developing countries; he hopes to see the devices in ambulances all around the developed world as well. Completing scans en route means that diagnosis and treatment could begin that much sooner, and could spell the difference between life and death for many critical patients.

We are fortunate enough to live in a world where science fiction is increasingly becoming science fact. If technology continues to advance at this pace, and scientists continue to take their inspiration from cult TV and books, it won’t be long before we’re traversing the universe with the help of someone who may or may not be named Scotty.

Sensors in cell phones provide more reasons to love mobile technology

There are many reasons to love mobile technology: it’s convenient, it’s accessible and it’s improving all the time. Nokia is only one of the companies that are hard at work perfecting innovative initiatives that they hope will revolutionise mobile technology, making it even more indispensable than it already is.

According to an article by Daniel Langendorf on Read Write Web, the latest thing to hit cell phones is the incorporation of billions of sensors that will make reporting on everything from traffic to the weather more interactive.

Bob Iannucci, chief technology officer at Nokia, has let the cat out of the bag regarding several of Nokia’s projects that use mobile sensor technology. One of these projects was carried out with the help of 150 students from the University of California, Berkeley. During the course of the project, Nokia placed 100 N95 smartphones in the students’ cars and used them to gauge real-time traffic. The idea behind the initiative is to use mobile sensors to collect data from thousands of motorists in any given area, which would then be analysed and interpreted, and the results sent back to the motorists. That way, your phone will be able to warn you of upcoming traffic problems specific to your route, and provide you with a viable, hassle-free alternative.
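The server-side aggregation step described above can be sketched in a few lines: group the speeds reported by individual phones by road segment, then flag segments whose average falls below a threshold. The segment names, speed figures and the 40km/h threshold are all invented for illustration.

```python
# Minimal sketch of aggregating phone-reported speeds into traffic warnings.
# Segment names, speeds and the congestion threshold are illustrative.
from collections import defaultdict
from statistics import mean

reports = [  # (road_segment, speed_kmh) as reported by each phone's sensors
    ("I-880 N @ exit 27", 18), ("I-880 N @ exit 27", 22),
    ("I-880 N @ exit 27", 15), ("CA-24 E @ tunnel", 95),
]

by_segment = defaultdict(list)
for segment, speed in reports:
    by_segment[segment].append(speed)

for segment, speeds in by_segment.items():
    status = "congested" if mean(speeds) < 40 else "clear"
    print(f"{segment}: avg {mean(speeds):.0f} km/h -> {status}")
```

In practice the interesting engineering is in scale and privacy rather than the arithmetic, but the collect/aggregate/broadcast loop is the core of the idea.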

Another initiative proposes to use cell phones with barometric sensors to help meteorologists gauge changing weather patterns, and to provide up-to-the-minute weather reports with pinpoint accuracy. The theory is similar to the traffic initiative, in that data will be collected from millions of cell phones around the world in order to provide a unique view of global climates. This would also have a profound effect on determining the state of the planet and could be instrumental in planning environmental interventions.

On a more surreal note, the MobileLab at the University of Texas at Dallas is looking into “the use of mobile devices in augmented reality”. According to Dean Terry, the director of the MobileLab, cell phones could soon be used to leave behind “virtual artefacts” for others to find. Imagine what it would be like to walk into a museum, art gallery or theatre and be able to view comments left behind by other patrons on your cell phone. If that doesn’t strike your fancy, perhaps you would prefer recommendations for restaurants you’ve always wondered about. These ethereal communications could take any form: pictures, video, audio or text.

Bob Iannucci sees a future where cell phones will be used in ways that extend far beyond simple communication. As he says, “The ability to move information changes societies and livelihoods.”

Forensic Recovery – Cache 24

The world of television and film has conditioned people to picture crime laboratories, where evidence is analysed, as clean white rooms full of microscopes and glass jars. Data recovery isn’t far from that image: data recovery companies do have lab-like clean rooms. These rooms exist purely so that data can be recovered without any environmental factors affecting the hardware.

There is another area where evidence labs and data recovery overlap, and that is forensic data analysis. Data recovery specialists are able to analyse the data on PCs and provide digital evidence of any activity. This has been of great assistance in crime fighting, especially in murder cases where the perpetrators acquired information off the Internet before committing the crime.

But before you panic about what your daily searches might reveal, you should know that the recovered information is highly detailed. Data specialists are able to tell the difference between an obscene pop-up ad and a deliberate search for crime information. For criminals, clearing the cache won’t help, either: data recovery specialists have mastered retrieving files from storage whether the cache has been cleared or not.

Forensic data analysis is becoming a common way to find evidence in murder cases. It was recently used when Indian PhD student Anurag Johri killed his wife with a baseball bat. Prosecutors were able to show that, the day before the killing, Johri had searched for terms such as “tips with killing with a baseball bat” and “how to murder someone and not get caught”.

Data Detect has more than 30 years’ combined experience in the data recovery industry. One of the services it provides is forensic recovery: it is able to secure evidence from an entire system, network or data storage device. From this information it can determine whether data has been erased or damaged.

Data Detect can search a system using keywords, numbers, file names or even phrases to find specific data. Once the digital information is found, they can verify whether there was unlawful use of proprietary information or whether a software licence agreement was breached.
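A keyword sweep of this kind can be sketched as a scan over a raw disk image, matching phrases even across read-buffer boundaries. This is an illustrative sketch of the general technique only; the function name and parameters are hypothetical, not Data Detect's actual tooling.

```python
# Illustrative forensic keyword sweep over a raw disk image. Chunks are read
# with an overlap so that phrases spanning a chunk boundary are still found.

def scan_image(path, keywords, chunk_size=1 << 20):
    """Return sorted (keyword, absolute_byte_offset) hits in a raw image."""
    overlap = max(len(k) for k in keywords) - 1
    hits, window_start, tail = set(), 0, b""
    with open(path, "rb") as img:
        while True:
            chunk = img.read(chunk_size)
            if not chunk:
                break
            window = tail + chunk  # carry bytes so boundary hits aren't missed
            for k in keywords:
                pos = window.find(k)
                while pos != -1:
                    hits.add((k, window_start + pos))
                    pos = window.find(k, pos + 1)
            tail = window[-overlap:] if overlap else b""
            window_start += len(window) - len(tail)
    return sorted(hits, key=lambda hit: hit[1])
```

Real forensic tools also walk filesystem metadata and unallocated space, but a raw byte-level sweep like this is why "deleted" phrases can still surface with an exact offset.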

In another murder case, a woman attempted to kill her husband by putting anti-freeze in his wine. Her husband survived the ordeal, but was left blind and deaf by the poisoning. The wife had hoped he would die so that she could use his life insurance to pay off mounting debts. According to reports, she had been researching ways to kill her husband of seven years on Google.

Data Detect is able to repair altered or damaged data through a variety of computer investigation methods. With the advantage of its clean rooms, it is able to control and fully document an assessment of the computer media.

Forensic data analysis isn’t a service used only in criminal investigations. Data Detect provides it to the public as well, recovering or repairing data for personal use. They can assist in finding lost files and emails that have gone missing due to error or physical damage.

Data Detect forensics: https://datadetect.com.au/forensics.php
Google murder search: http://antimisandry.com/cop_wife_googled_commit_murder_then_killed_her_husband-t4447.html?s=eaaddc20965096241af7d3aeb9eb3e26&s=22dc29b758db08b92a16a7fe97217101&
Anti-freeze poisoning: http://www.telegraph.co.uk/news/main.jhtml?xml=/news/2008/01/29/npoison229.xml