Big data requires a large amount of storage space, so cloud service providers must spend more money on memory; they therefore want to reduce this cost by reducing the storage space consumed. Many researchers have applied other technologies to reduce the size of big data files, but these approaches are not effective and also increase system complexity. Decentralized control is another problem in the existing system. Storing and retrieving big data must be secure and flexible, which the existing schemes cannot guarantee.

In big data scenarios, a cloud server needs far more memory than a normal client system, and it can be accessed by multiple users at the same time, so the cloud must serve each client efficiently and without delay. Memory management is therefore one of the important features of the cloud. We design a new system that reduces cost and memory usage and improves system performance by compressing the stored data; the compressed data can then be retrieved without any loss.
Star Student Project Madurai
In our proposed system, the big data is compressed before being stored on the cloud server. We use the Huffman coding algorithm as the compression technique, which reduces the storage space the big data occupies in the cloud. Huffman coding is a lossless compression technique, so no data is lost during the compression process.
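To illustrate how Huffman coding achieves lossless compression, the following is a minimal Python sketch, not the deployed implementation; the function names build_codes, encode, and decode are our own. It builds a code table from symbol frequencies, encodes the data as a bit string, and decodes it back exactly, demonstrating the lossless property:

```python
import heapq
from collections import Counter
from itertools import count

def build_codes(data):
    """Build a Huffman code table mapping each symbol to a bit string."""
    freq = Counter(data)
    # Edge case: a single distinct symbol still needs a 1-bit code.
    if len(freq) == 1:
        return {next(iter(freq)): "0"}
    # Min-heap of (frequency, tiebreaker, tree); a tree node is either
    # a symbol (leaf) or a (left, right) pair (internal node).
    tie = count()
    heap = [(f, next(tie), sym) for sym, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Repeatedly merge the two lowest-frequency trees.
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tie), (left, right)))
    # Walk the final tree: left edge emits "0", right edge emits "1".
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

def encode(data, codes):
    """Concatenate the code of every symbol into one bit string."""
    return "".join(codes[sym] for sym in data)

def decode(bits, codes):
    """Recover the original data; works because the codes are prefix-free."""
    inverse = {v: k for k, v in codes.items()}
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in inverse:  # first prefix match is the symbol boundary
            out.append(inverse[cur])
            cur = ""
    return "".join(out)
```

For example, compressing the string "storing and retrieving big data" with these functions and decoding the result reproduces the original string exactly, while the encoded bit string is shorter than the 8 bits per character of the uncompressed text, since frequent symbols receive shorter codes.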