Top Document: comp.arch.storage FAQ 2/2
Previous Document: [9.1] Video vs Datagrade tapes {brief, 5/94}
Next Document: [10] Benchmarking

From: Other

See the comp.compression FAQ, and don't believe everything a vendor tells you. A 2x ratio is the standard going rate for lossless compression of arbitrary data, though some vendors claim 2.5x or 3x. Your mileage will vary with your data type.

Compressing tape drives are common, but for disks and other block devices I don't know of anything being done. The unpredictability of the compression ratio generally makes compression inappropriate for devices that need fixed capacities and addresses.

Online compression of files can be accomplished by hand using utilities such as gzip and Unix compress. Some systems support compression in the file system software itself and will transparently compress and decompress files as needed. Stacker for PCs is one example; for Unix-like systems this seems to be a common research topic for object-oriented file systems (including the GNU Hurd), but I don't know of any production versions offhand (SHMO).

Compression may make your data more vulnerable to errors: a single error early in a compressed stream of data can render the entire data stream unreadable.

Send corrections/additions to the FAQ Maintainer:
rdv@alumni.caltech.edu (Rodney D. Van Meter)
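The error-vulnerability point above is easy to demonstrate. This is a minimal Python sketch using the standard gzip module; the byte offset 20 is an arbitrary position inside the compressed data (past the 10-byte gzip header), not anything from the FAQ itself:

```python
import gzip
import zlib

# Build a compressible payload and gzip it in memory.
data = b"the quick brown fox jumps over the lazy dog\n" * 1000
compressed = bytearray(gzip.compress(data))

# The intact stream round-trips cleanly.
assert gzip.decompress(bytes(compressed)) == data

# Flip a single bit early in the compressed stream, inside the
# deflate data (the gzip header occupies the first 10 bytes).
compressed[20] ^= 0x01

# The decompressor now fails -- either the deflate stream becomes
# invalid, or the decoded garbage fails the trailing CRC check --
# so everything from the damaged point on is lost.
try:
    gzip.decompress(bytes(compressed))
    survived = True
except (OSError, zlib.error, EOFError):
    survived = False
assert survived is False
```

An uncompressed copy of the same data would lose only the flipped byte; the compressed copy loses the whole stream, which is why compressed archives benefit from error-correcting media or separate checksums per block.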
Last Update March 27 2014 @ 02:11 PM