Data compression is the encoding of information using fewer bits than the original representation. The compressed data occupies substantially less disk space than the original, so much more content can be stored in the same amount of space. There are many different compression algorithms that work in different ways. Lossless algorithms remove only redundant bits, so when the data is uncompressed it is identical to the original, with no loss of quality. Lossy algorithms discard bits deemed unnecessary, so uncompressing the data at a later time yields reduced quality compared to the original.

Compressing and uncompressing content requires a significant amount of system resources, in particular CPU processing time, so any hosting platform that performs compression in real time needs sufficient processing power to support this feature. A simple example of how data can be compressed is run-length encoding: a binary sequence such as 111111 is replaced with 6x1, i.e. the number of consecutive 1s or 0s is "remembered" instead of storing the actual bits.