
IBM wrestles with world's largest storage system

Storage Tank is a grid-based system that could change the future of storage. We look at how the project has been progressing.

Until last year, storage was arguably the least sexy market in IT. But with the Enron scandal and new laws on data retention on both sides of the Atlantic, it has suddenly jumped to the forefront of everyone's minds.

Storing the ever-increasing amount of data produced every day brings with it three problems: one, how to do it; two, what to store it all on; and three, how on earth to find it later on.

IBM was just one of the companies sniffing a hugely expanding market, and it soon started work on what it hopes will become the de facto technology for the future, named Storage Tank.

The idea straight off the bat was to create a technology that would form a vast Storage Area Network (SAN), working seamlessly with different companies' storage devices, and able to expand simply and easily. It was a tall order, and as well as running tests in its own labs, IBM publicly announced its plan to run an R&D project at Europe's nuclear research organisation, CERN.

Its Storage Tank work fits into an industry-sponsored R&D arm of CERN's IT department, called Openlab, which CERN hoped would produce some innovative technology that might make it into its final system.

That final system is designed to support the Large Hadron Collider (LHC) project, a next-generation particle accelerator and the biggest scientific instrument on the planet. The machine is currently being built in Geneva and is due to go live sometime in 2007. By analysing the 40 million particle collisions per second it will produce, scientists hope to find clues to the origins of the universe. That process, however, will also produce a massive, constant stream of data.

It is calculated to pump out 15 petabytes (PB) of data a year (or 34 terabytes a day) continuously. The problem will only get bigger over time, with CERN's head of IT, Wolfgang von Rüden, estimating that by 2010 the machine will be producing 100PB of data a year.

IBM's plan was to build Storage Tank from the ground up and, within two years, challenge the huge storage system that CERN was developing with various publicly funded research organisations. It didn't get off to a good start, however. Having started work on the storage system in April 2003, nearly a year later, in January 2004, CERN reported that Storage Tank had "never completed a successful test" and that it "hangs and crashes".

IBM continued to work at the complex technology, however, and last month proudly announced that Storage Tank had run more than 100 simultaneous SAN File System clients with its 28TB of storage distributed among 10 storage servers. It was a step in the right direction, but will most likely prove too late in the day to be included in the CERN project.

The 28TB system tested by IBM would be able to store just 20 hours of the data expected to be produced by the LHC in its first year. The test configuration would also have to be scaled up 100 times before CERN could risk using it on the LHC, although IBM claims that the product could handle a fraction of all LHC Computing Grid (LCG) needs and still be accepted as part of the overall solution. With von Rüden explaining in April that IBM hoped to expand the Storage Tank system to 100TB by the end of this year and hit a goal of 1,000TB by 2005, it looks unlikely that the technology will be selected when decisions on the final system build are made in March 2005.
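
As a rough, back-of-the-envelope check of those figures, using only the 34TB-a-day and 28TB numbers quoted above:

    # Back-of-the-envelope check of the storage figures quoted above.
    daily_rate_tb = 34        # LHC output, terabytes per day (figure quoted earlier)
    tested_capacity_tb = 28   # capacity of the Storage Tank configuration IBM tested

    hours_of_data = tested_capacity_tb / daily_rate_tb * 24
    print(f"28TB holds about {hours_of_data:.0f} hours of LHC output")
    # prints roughly 20 hours, matching the figure in the article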

IBM's claim that it will "extend Storage Tank's capabilities so it can manage and provide access from any location worldwide to the unprecedented torrent of data... when it goes online in 2007" may be no more than wishful thinking.

This does not mean the technology has failed, however, maintains Brian Truskowski, general manager of IBM's storage software division. "We have always stated that this is a research project," he said, adding that the work done in the past year had already produced successful commercial results. Truskowski is confident the technology will be able to scale up, saying that from the very beginning of the project his team made expansion a main consideration.

The reason the CERN project has only dealt with 28TB of storage so far, he told us, is more a case of CERN not wishing to spend a fortune on storage hardware that will be out of date by the time the project goes fully live in 2007, than of Storage Tank not being up to the job.

As a storage network becomes bigger, it gets harder to keep the file system out of the way: with more and more pieces added, more information is needed to record where a particular piece of data has gone. However, IBM is continuing to develop its caching technology, so the amount of querying a system has to do to find the right material is kept as small as possible. In this sense, it is similar to the DNS system on which the internet runs, something that has arguably proven its worth.
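
In general terms, and purely as an illustration rather than IBM's actual design, the caching idea looks something like the sketch below: a client keeps a local cache of file-location answers, so the central metadata servers are only consulted on a miss, much as a DNS resolver caches name lookups. All class and method names here are invented for the example.

    # Hypothetical sketch of client-side metadata caching in a SAN file system.
    # A central metadata server maps file paths to the storage node holding the
    # data; caching answers locally keeps the number of round-trips small.

    class InMemoryMetadataServer:
        """A toy stand-in for the central metadata servers."""
        def __init__(self, table):
            self.table = table          # path -> storage node

        def lookup(self, path):
            return self.table[path]     # in reality, an expensive network call

    class MetadataClient:
        def __init__(self, metadata_server):
            self.metadata_server = metadata_server
            self.cache = {}             # path -> storage node, filled on demand

        def locate(self, path):
            """Return the node holding 'path', asking the server only on a cache miss."""
            node = self.cache.get(path)
            if node is None:
                node = self.metadata_server.lookup(path)
                self.cache[path] = node  # remember the answer, DNS-style
            return node

    server = InMemoryMetadataServer({"/lhc/run1/event0001.dat": "storage-node-3"})
    client = MetadataClient(server)
    print(client.locate("/lhc/run1/event0001.dat"))  # first call goes to the server
    print(client.locate("/lhc/run1/event0001.dat"))  # second call is answered from the cache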

Truskowski also explains that the system performs better with large files than with large numbers of small files. "With lots of small files, you have to be more efficient," he said. In an IBM lab, Storage Tank has already scaled up to a quarter of a billion files. As testing moves to bigger and bigger storage networks, Truskowski expects there will be some bumps, but says none of the scientists on the team are currently scratching their heads.

Nevertheless, while Big Blue has argued Storage Tank "will play a pivotal role" and CERN has said it will be "providing key storage technology" in the grid technology at the research centre, it is not yet able to deal with the LHC's vast storage demands.

CERN spokesman Francois Grey admitted to Techworld: "Yes, at the moment [Storage Tank] is not able to meet the requirements [of the LCG]." But he pointed out that currently "no system is able to".

Not that it will impact the Grid project itself. "The Storage Tank results do not reflect directly on our ability to cope with the LHC data in 2007," he said. "The CERN Openlab partnership is about testing and validation of future solutions for Grid technology. In parallel we have a Grid deployment effort for 2007 which addresses the pressing here-and-now needs for 2007."

It may have got off to a slow start, but Storage Tank's clever way of tagging material - which can allow important information to be kept close to hand, while less important information is stored on cheaper storage devices - is proving increasingly useful, and commercially successful.
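
Purely as an illustration of that kind of policy-driven placement, and not Storage Tank's real policy engine, a tiering decision can be sketched along these lines (the tier names, tags and 30-day threshold are all assumptions for the example):

    from datetime import datetime, timedelta

    # Hypothetical illustration of policy-driven placement: tagged or recently
    # used data stays on fast (expensive) storage, everything else is moved to
    # cheap storage.

    FAST_TIER = "fibre-channel-array"
    CHEAP_TIER = "sata-archive"

    def choose_tier(tags, last_accessed, now=None):
        """Pick a storage tier from a file's tags and how recently it was used."""
        now = now or datetime.utcnow()
        recently_used = now - last_accessed < timedelta(days=30)
        if "important" in tags or recently_used:
            return FAST_TIER
        return CHEAP_TIER

    # A freshly written, tagged file stays on fast storage; an old, untagged
    # file is pushed down to the cheap tier.
    print(choose_tier({"important"}, datetime.utcnow()))                # fibre-channel-array
    print(choose_tier(set(), datetime.utcnow() - timedelta(days=400)))  # sata-archive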

And as for the LCG, CERN refused to rule it out altogether, with Grey remarking: "Storage Tank may be part of the picture two or three years down the line after 2007." It had better hurry, however, as the LHC itself will shut down in 2020.

