Approximately 90 percent of the world’s data was created within the past two years. If we continue to hoard data at the current rate, scientists and creators of data storage technologies will be scrambling constantly to keep up with increasing storage needs.
Inventors are responding. Several storage prototypes promise groundbreaking capacities for devices of their size. We’re talking hundreds of terabytes.
A revolutionary nanotechnology discovery patented by Michael E. Thomas, president of Colossal Storage Corporation, looks like one of the first promising steps.
According to Thomas, “Hard drives of today will be reaching their superparamagnetic limit – or maximum data storage capacity – in the next few years.” Seeing the opportunity, he devised a rewritable ferroelectric optical storage nanotechnology that can store remarkable amounts of data in a very small amount of physical space.
A single 3.5-inch disk in Thomas’s Atomic Holographic DVR drive can hold up to 100 terabytes of data – 1,000 times more than any current state-of-the-art hard disk technology.
Meanwhile, a team at the University of Southampton in England has applied nanotechnology to come up with another new type of data storage.
Their discovery revolves around five-dimensional optical data and femtosecond laser writing. Data is recorded in five parameters: the three spatial coordinates of each nano-dot, plus two optical properties that govern how the nanostructures diffract light passing through the silica glass disk. The result is 360 terabytes of data storage on a piece of glass the size of a standard CD.
So, does this mean data storage facilities can start to shrink too? Or, will the exponential growth of content continue to drive expansion? When is it time to stop storing such massive amounts of data?
Jeff Biggs, executive vice president of operations and technology at Peak 10, sheds some light.
“Although certain examples of extreme mass data storage, like the Library of Congress archiving every publicly available tweet, may seem absurd, there can be profound, and sometimes unintuitive, benefits to having access to huge amounts of information,” says Biggs.
The ability to store large amounts of data has made life easier for his department and increased tracking and trending possibilities to boot.
“Technicians in my department make rounds to check on the conditions of the various devices in Peak 10’s data centers,” he explains. “Using QR codes tagged on each piece of equipment, our technicians collect information about return air temperature, supply air temperature, humidity, the status of valves, and the status of compressors. When there’s an issue, technicians use photo, audio, and video files for documentation and to identify what’s going on.”
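Peak 10’s actual tooling isn’t described, but the kind of reading the technicians collect could be modeled as a simple record keyed by the equipment’s QR code. All field names here are hypothetical, not Peak 10’s schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class EquipmentReading:
    """Hypothetical record for one QR-code scan of a device."""
    equipment_id: str                # decoded from the QR code tag
    return_air_temp_c: float
    supply_air_temp_c: float
    humidity_pct: float
    valve_open: bool
    compressor_running: bool
    attachments: list = field(default_factory=list)  # photo/audio/video file paths
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# One scan of a (hypothetical) computer-room air conditioner
reading = EquipmentReading("CRAC-07", 24.1, 14.8, 43.0, True, True)
print(reading.equipment_id, reading.valve_open)  # CRAC-07 True
```

Attaching media files to each record is what drives the storage footprint Biggs describes next.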
Collectively, these files take up enormous amounts of storage. But the information is beneficial to have.
“All of the diagnostic data rolls up into trend reports that show valuable insights regarding uninterruptible power supply capacity, cooling capacity, and utility costs. These reports make business more efficient, which is better for Peak 10 customers,” he says.
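The roll-up step Biggs describes amounts to aggregating many raw readings into per-unit summaries. A minimal sketch, with made-up unit names and temperatures:

```python
from statistics import mean

# Hypothetical raw readings: (equipment_id, supply_air_temp_c)
readings = [
    ("CRAC-01", 14.2), ("CRAC-01", 14.9), ("CRAC-01", 15.4),
    ("CRAC-02", 13.8), ("CRAC-02", 14.0),
]

def trend_report(rows):
    """Average supply-air temperature per unit: the kind of summary
    that feeds cooling-capacity and utility-cost reports."""
    by_unit = {}
    for unit, temp in rows:
        by_unit.setdefault(unit, []).append(temp)
    return {unit: round(mean(temps), 2) for unit, temps in by_unit.items()}

print(trend_report(readings))  # {'CRAC-01': 14.83, 'CRAC-02': 13.9}
```

Once readings are summarized this way, the bulky raw files can be kept in slower, cheaper storage, which connects to the trade-off Biggs raises below.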
Even though he is a fan of mass data storage, Biggs warns that it is important to keep in mind the limitations of nanotechnology inventions.
“Using nanotechnology, you can store terabyte after terabyte, or thousands of gigs, in a very finite footprint. While that sounds great, sometimes writing that data out is slow, and then reading it back again can be equally slow.”
Because of this drawback, nanotechnology data storage solutions are typically reserved for large amounts of data that may be old or that do not need to be accessed often.
For now, the data continues to pile up, and the search for new and better storage technology goes on, as it always has. It won’t be long before today’s “revolutionary” nanotechnology storage is yesterday’s floppy disk.