The LHC produces a petabyte of data every second, and because of that, researchers simply can't store it all. That means any data that isn't useful has to be thrown away.
A petabyte is a million gigabytes, or enough capacity to store 13.3 years of HDTV content. The LHC collects that much data every single second, and the facility simply doesn't have the storage to keep it all.
The LHC's detectors can capture 40 million snapshots of particle collisions in a single second. That petabyte of data is then processed by a sophisticated array of electronics that decides whether each snapshot is likely to be useful, paring the set down to about 100,000. This smaller group is sent to a large farm of in-house computers, which further narrows it down to 100 to 300 snapshots for analysis.
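The two-stage winnowing described above can be sketched as a simple pipeline. This is a hypothetical illustration, not the LHC's actual trigger code: the event "scores" are random stand-ins for real detector readouts, the function names are made up, and the event count is scaled down so the example runs quickly, but the keep ratios mirror the article's numbers (40 million snapshots to roughly 100,000, then to a few hundred).

```python
import random

def hardware_trigger(events, keep_ratio=100_000 / 40_000_000):
    """First pass: fast electronics keep only the most promising events.

    keep_ratio mirrors the article's ~100,000-out-of-40-million cut.
    """
    cutoff = 1.0 - keep_ratio
    return [e for e in events if e["score"] > cutoff]

def software_trigger(events, keep=300):
    """Second pass: the computing farm keeps the top few hundred."""
    return sorted(events, key=lambda e: e["score"], reverse=True)[:keep]

random.seed(0)
# Scaled-down stand-in: 1 million fake events instead of 40 million.
events = [{"id": i, "score": random.random()} for i in range(1_000_000)]

stage1 = hardware_trigger(events)
stage2 = software_trigger(stage1)
print(len(events), "->", len(stage1), "->", len(stage2))
```

The key design idea this models is that the first stage must be cheap and fast (a simple threshold), while the expensive ranking only ever runs on the small surviving fraction.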
That's a lot of data shedding, and maybe there's a lesson in it for the rest of us: data hoarding can be a bane.