

Data deduplication is one of the most important data compression techniques for eliminating duplicate copies of repeated data, and it has been widely used in cloud storage to reduce storage space and save bandwidth. To protect the confidentiality of sensitive data while supporting deduplication, the convergent encryption technique has been proposed to encrypt data before outsourcing. We propose a hashing technique that splits data into fragments and matches each fragment against previously stored data; a matched fragment is recorded as a reference to the earlier file, and only the new content is added as a chunk. We also present several new deduplication constructions supporting authorized duplicate checks in a hybrid cloud architecture. Security analysis demonstrates that our scheme is secure under the definitions specified in the proposed security model. We show that our duplicate-check scheme incurs minimal overhead compared to normal operations. The security requirements of data confidentiality and tag consistency are achieved by introducing a deterministic secret-sharing scheme in distributed storage systems, instead of using convergent encryption as in previous deduplication systems.
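The chunking and duplicate-check idea above can be sketched in a few lines of Python. This is a toy illustration, not the paper's implementation: the fixed `CHUNK_SIZE`, the `DedupStore` class, and the XOR keystream cipher are all assumptions made for demonstration. It shows the core property of convergent encryption (the key is derived from the data itself, so identical fragments encrypt to identical ciphertexts) and how a tag lookup lets the store keep only one copy of each chunk.

```python
import hashlib
from typing import Dict, List, Tuple

CHUNK_SIZE = 4096  # hypothetical fixed fragment size, chosen for illustration


def convergent_encrypt(chunk: bytes) -> Tuple[bytes, str]:
    # Convergent encryption: the key is the hash of the plaintext chunk,
    # so two users holding the same data derive the same ciphertext.
    key = hashlib.sha256(chunk).digest()
    # Toy XOR keystream cipher for illustration only -- NOT secure;
    # a real system would use a proper cipher keyed by `key`.
    stream = hashlib.sha256(key).digest()
    cipher = bytes(b ^ stream[i % len(stream)] for i, b in enumerate(chunk))
    # The tag over the ciphertext is what the duplicate check compares.
    tag = hashlib.sha256(cipher).hexdigest()
    return cipher, tag


class DedupStore:
    """Toy chunk-level deduplicating store keyed by ciphertext tags."""

    def __init__(self) -> None:
        self.chunks: Dict[str, bytes] = {}

    def put(self, data: bytes) -> List[str]:
        """Split data into fragments; store only fragments with unseen tags."""
        tags = []
        for off in range(0, len(data), CHUNK_SIZE):
            cipher, tag = convergent_encrypt(data[off:off + CHUNK_SIZE])
            # Duplicate check: if the tag already exists, reference it
            # instead of storing a second copy of the chunk.
            self.chunks.setdefault(tag, cipher)
            tags.append(tag)
        return tags
```

Uploading the same file twice yields the same tag list while the store keeps only one physical copy of each distinct chunk, which is the storage and bandwidth saving the abstract describes.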
