Secure storage and secure computation
Three main functions are required to protect digital data during its life cycle: secure storage, secure computation, and secure sharing. One of the most promising methods for secure computation is Fully Homomorphic Encryption (FHE), which preserves the privacy of encrypted data throughout the whole computing process. The most popular technology for data storage and sharing is cloud computing, which offers several benefits such as fast deployment, pay-per-use pricing, lower costs, scalability, rapid provisioning, greater resiliency, low-cost disaster recovery, and data storage solutions. After more than three decades of outsourced information storage and processing, cloud-based storage services have gained in popularity and can today be considered mainstream. They attract organizations and enterprises as well as end users who do not want, or cannot afford, the cost of a private cloud.
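To make the idea of computing on encrypted data more concrete, the toy sketch below uses the Paillier cryptosystem, which is only partially (additively) homomorphic rather than fully homomorphic; it is given purely to illustrate the principle behind FHE, and the hard-coded small primes are of course far too short to be secure.

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic) -- illustration only.
# FHE schemes additionally support multiplication on ciphertexts, but the
# principle of computing on data the server never sees in clear is the same.
p, q = 10007, 10009                                   # demo primes (insecure size)
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)     # lcm(p-1, q-1)
g = n + 1
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)           # inverse of L(g^lambda) mod n

def encrypt(m):
    while True:
        r = random.randrange(2, n)
        if math.gcd(r, n) == 1:                       # random r coprime with n
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

c1, c2 = encrypt(42), encrypt(58)
c_sum = (c1 * c2) % n2            # multiplying ciphertexts adds the plaintexts
assert decrypt(c_sum) == 100      # the server never saw 42 or 58 in clear
```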
The cloud offers all these advantages; however, they do not come for free: cloud computing requires moving application data or databases to large data centers, where the operation and management of the data and services are not necessarily trustworthy [122]. Hardening data protection with multiple methods, rather than 'just' encryption, is becoming of paramount importance in the face of continuous and powerful attacks that spy on, alter, or even destroy private and confidential information. Even though encryption is a powerful and rapidly progressing technology, encryption alone is not enough to settle this question, not to mention its high computational cost. In [2], the authors show how to compromise Diffie-Hellman key exchange (used in HTTPS sites) with 512-bit groups, and argue that 1024-bit groups could be cryptanalyzed with enough computational power. Cryptographers have never liked the idea that a cipher can be broken and the information read given sufficient computational resources [106]; this is nevertheless one of the central design tenets of a number of projects such as the POTSHARDS system [147]. Moreover, there remains the difficult question of managing the encryption key, which over time can become known to too many people, or be stolen or lost.

Our purpose and ultimate ambition is to address data protection and privacy from end to end by combining fragmentation, encryption, and dispersion. This means deriving general schemes and architectures that protect data during their entire life cycle, everywhere they go throughout a network of machines where they are processed, transmitted, and stored. It also means offering end users a choice among several well-understood, cost-effective levels of privacy and security, each coming with a predictable level of performance in terms of memory occupation and processing time. In this thesis, we aim to provide a secure data storage scheme for end users under the reasonable assumptions that they have a resource-limited personal environment and face an honest-but-curious third-party cloud storage provider, with cost effectiveness as an additional constraint.
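As a rough illustration of the fragment-encrypt-disperse idea (not the scheme developed in the later chapters, which fragments in the transform domain), the sketch below splits a buffer at the byte level, encrypts only the small protected fragment with AES-GCM, and stores the two fragments in different places. The cryptography package and the local_store/cloud_store containers are assumptions made for the example only.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

def fragment(data: bytes, ratio: float = 0.1):
    """Split data into a small 'protected' part and a large 'public' part.
    A naive byte-level split stands in for the DCT/DWT-based fragmentation
    used later in the thesis, where the public fragment reveals little."""
    cut = max(1, int(len(data) * ratio))
    return data[:cut], data[cut:]

key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)                               # never reuse with the same key

document = os.urandom(4096)                          # stand-in for private user data
protected, public = fragment(document)

# Only the small protected fragment is encrypted; the fragments are then
# dispersed to different, non-colluding locations (placeholder containers).
local_store = {"protected": AESGCM(key).encrypt(nonce, protected, None)}
cloud_store = {"public": public}                     # honest-but-curious provider

# Recovery: decrypt the protected fragment and reassemble.
recovered = AESGCM(key).decrypt(nonce, local_store["protected"], None) + cloud_store["public"]
assert recovered == document
```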
Traditional full encryption
Cryptography is the science of writing in secret code and is an ancient art; the first documented use of cryptography in writing dates back to circa 1900 B.C. when an Egyptian scribe used non-standard hieroglyphs in an inscription. Some experts argue that cryptography appeared spontaneously sometime after writing was invented, with applications ranging from diplomatic missives to war-time battle plans. It is no surprise, then, that new forms of cryptography came soon after the widespread development of computer communications. In data and telecommunications, cryptography is necessary when communicating over any untrusted medium, which includes just about any network, particularly the Internet. Within the context of any application-to-application communication, there are some specific security requirements, including:
• Authentication: The process of proving one's identity. (The primary forms of host-to-host authentication on the Internet today are name-based or address-based, both of which are notoriously weak.)
• Confidentiality: Ensuring that no one can read the message except the intended receiver.
• Integrity: Assuring the receiver that the received message has not been altered in any way from the original.
• Non-repudiation: A mechanism to prove that the sender really sent this message.

Encryption is one of the principal means to guarantee the privacy and confidentiality of information. Traditional encryption algorithms, which have been widely used in telecommunications and information security for decades, perform various substitutions and transformations on the plaintext (the original message before encryption) and transform it into ciphertext (the scrambled message after encryption). The goal of encryption is to make the information unreadable, invisible, or unintelligible so as to keep it secure from any unauthorized attacker.
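A minimal sketch of such conventional full encryption is given below, assuming the third-party Python cryptography package and AES in GCM mode; every byte of the payload is turned into ciphertext, and the GCM tag additionally provides integrity.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

def full_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt the entire payload: every byte becomes unintelligible ciphertext."""
    nonce = os.urandom(12)                    # 96-bit nonce, never reused with the same key
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def full_decrypt(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=128)     # AES-128 key
message = b"plaintext: original message before encryption"
blob = full_encrypt(message, key)
assert full_decrypt(blob, key) == message     # confidentiality plus integrity (GCM tag)
```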
Basic concept of selective encryption
Selective encryption (SE), used to protect data and especially multimedia data, has been introduced more recently. The basic idea is to go as fast as possible by reducing the overhead involved in securing the data. Although traditional encryption techniques such as the Advanced Encryption Standard (AES) [134] have become very popular, they have clear limitations for multimedia applications. The main problem is that most existing encryption standards such as DES and AES were developed for i.i.d. (independent and identically distributed) data sources [32]; multimedia data, however, are typically not i.i.d., which leads to poor encryption speed, as pointed out in Fig. 2.2 by Grangetto et al. [64]. This is because image and video data are strongly correlated and exhibit strong spatial/temporal redundancy, which makes them differ greatly from classical text data. Moreover, as pointed out by Lookabaugh in [92, 93], the relationship between plaintext statistics and ciphertext security was already highlighted by Shannon in [142]: a secure encryption scheme should remove all the redundancy in the plaintext; otherwise, the more redundant the source is, the less secure the ciphertext becomes [101]. From this viewpoint, naïve full encryption algorithms are not well suited to protecting multimedia content, and SE methods are designed to fill this need.
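The following hypothetical sketch illustrates the SE principle on a single 8-sample block (it is not the exact scheme of Chapters 4 and 5): the data are transformed with an orthonormal DCT-II, only the few low-frequency coefficients that concentrate most of the energy are encrypted with AES-GCM, and the remaining coefficients are left in clear as a public fragment. NumPy and the cryptography package are assumed.

```python
import os
import numpy as np
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

def dct_matrix(n: int) -> np.ndarray:
    """Orthonormal DCT-II matrix (rows are frequency basis vectors)."""
    k = np.arange(n).reshape(-1, 1)        # frequency index
    i = np.arange(n).reshape(1, -1)        # sample index
    m = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    m[0, :] = np.sqrt(1.0 / n)             # DC row uses the 1/sqrt(N) scale
    return m

block = np.arange(8, dtype=np.float64)     # stand-in for one row of pixel data
C = dct_matrix(8)
coeffs = C @ block                         # transform to the frequency domain

n_protected = 2                            # protect only the lowest frequencies
protected = coeffs[:n_protected].tobytes() # small, energy-carrying fragment
public = coeffs[n_protected:]              # bulk fragment, kept in clear

key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)
protected_ct = AESGCM(key).encrypt(nonce, protected, None)   # encrypt only ~2/8 of the coefficients

# An authorised user decrypts the small fragment and inverts the transform.
restored = np.frombuffer(AESGCM(key).decrypt(nonce, protected_ct, None))
full = np.concatenate([restored, public])
assert np.allclose(C.T @ full, block)      # original data recovered exactly
```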
Table of contents:
List of figures
List of tables
1 Introduction
1.1 Background
1.2 Motivation
1.2.1 Benchmark problem
1.2.2 Security analysis
2 Data protection methods
2.1 Secure storage and secure computation
2.2 Fully homomorphic encryption
2.2.1 What is FHE
2.2.2 Related work of FHE
2.2.3 Performance study
2.2.4 Discussion
2.3 Traditional full encryption
2.4 Selective encryption
2.4.1 Basic concept of selective encryption
2.4.2 Related work of SE
2.4.3 Our SE approach
2.4.4 Performance issue of SE
3 Hardware acceleration
3.1 Background of parallel computing
3.2 Development of modern GPGPU
3.2.1 Hardware development
3.2.2 From GPU to GPGPU
3.3 CUDA platform
3.3.1 CUDA cores
3.3.2 CUDA threads model
3.3.3 CUDA memory access management
3.4 Different hardware platforms
3.4.1 PC GPU platform
3.4.2 Mobile GPU platform
3.5 Discussion
4 DCT based selective encryption for bitmaps
4.1 DCT transformation and selective image protection
4.2 DCT acceleration on GPGPU
4.2.1 DCT implementation on CPU
4.2.2 DCT implementation on GPU
4.3 Design of SE for bitmaps based on DCT
4.3.1 First level protection
4.3.2 Second level protection
4.4 Storage space usage and numeric precision
4.4.1 Storage space design
4.4.2 Numeric precision analysis
4.5 Result analysis
4.5.1 Probability Density Function analysis
4.5.2 Coefficients analysis
4.6 Evaluations with different computer architecture
4.6.1 Allocation of calculation tasks for a moderately powerful GPU (laptop)
4.6.2 Allocation of calculation tasks for a powerful GPU (desktop)
4.7 Discussions
5 DWT for general purpose protection
5.1 Discrete wavelet transform and GPU acceleration
5.1.1 DWT
5.1.2 DWT acceleration based on GPGPU
5.2 Design of DWT based SE
5.2.1 Designs
5.2.2 Evaluation of the storage necessary for DWT
5.2.3 Storage space usage and numeric precision
5.3 Security analysis
5.3.1 Uniformity Analysis
5.3.2 Information Entropy Analysis
5.3.3 Test Correlation between Original and protected and public fragments
5.3.4 Difference Between input Data and the public and protected fragment
5.3.5 Sensitivity Test
5.3.6 Visual Degradation for images
5.3.7 Propagation of errors
5.3.8 Cryptanalysis Discussion: Resistance against well-known types of attacks
5.4 Benchmark with two computer architectures
5.5 Discussion for benchmark
5.6 Fragment transmission
6 Conclusion and future work
7 Résumé
7.1 Introduction
7.2 Motivations
7.2.1 Évaluation de performance et définition d’un benchmark
7.2.2 Analyse de sécurité
7.3 Contexte de l’informatique parallèle
7.3.1 Du GPU au GPGPU
7.4 Chiffrement sélectif basé sur DCT pour les bitmaps
7.4.1 Transformée DCT
7.4.2 Conception et implémentation
7.4.3 Analyse de résultats
7.4.4 Évaluations de référence
7.4.5 Discussion
7.5 DWT pour la protection d’usage général
7.5.1 DWT
7.5.2 Accélération de DWT basée sur GPGPU
7.5.3 Conception de SE basée sur DWT
7.5.4 Analyse de sécurité
7.5.5 Benchmark avec deux architectures logicielles
7.5.6 Discussions sur le benchmark
7.5.7 Transmission de fragments
7.6 Conclusion
References