Information security is among the most pressing problems of the digital era. Protecting the integrity and confidentiality of information is a perennial concern, and many techniques have been introduced to transmit and store data securely. The growth of computing power is increasing the rate of security breaches and attacks, putting a number of existing security systems at risk of being broken. This paper proposes an encryption technique called the Partial Deep-Learning Encryption Technique (PD-LET) to achieve data security. PD-LET encodes and decodes digital data in several main stages: data preprocessing, the convolutional layer of a standard deep learning algorithm, zigzag transformation, image partitioning, and an encryption key. The technique first converts digital data into the corresponding matrix and then applies the encryption stages to it; the implementation of the encryption stages is changed frequently. The combination of deep learning and zigzag transformation transfers the original data into a completely unrecognizable image, which makes the proposed technique efficient and secure. Moreover, because the implementation phases change continuously during encryption, breaking the technique requires full knowledge of its internal configuration, which makes it more resistant to future attacks. The security analysis of the obtained results shows that breaking the proposed technique is computationally impractical due to the large size and diversity of the keys, and that PD-LET achieves a reliable security system.

Over the past decades, the continued development of digital network communication technology has been essential for enhancing the transmission and security of electronic data. Data transmitted over open communication networks must be protected and kept secret, especially over untrusted networks. One solution is cryptography, which hides secret data from all but authorized persons [

To solve the problem of digital data security, researchers have proposed many cryptographic systems. Over the past forty years, numerous robust and effective encryption systems have been proposed based on the above-mentioned cryptography categories. Encryption is the process of changing clear data into an incomprehensible form and is used to provide data authentication and confidentiality.

A well-built encryption technique should have strong statistical properties and fulfill the requirements of confusion and diffusion [

As computing power and new technologies advance, the risks of breaching security systems and the probability of revealing secret data will increase significantly in the near future. The main contributions of this paper are: 1) the PD-LET encryption technique, which is hard to break and resists new-generation attacks; 2) a multi-tier encryption design that significantly improves the quality of encrypted data through multiple rounds of the block cipher; 3) a convolutional layer (CL) comprising a group of filters (kernels) whose sizes and values vary at every round of the proposed encryption technique; and 4) the combination of substitution, transposition, and key expansion based on CLs and the other stages, which creates a large amount of diffusion and confusion. To achieve a reliable security system, symmetric cryptography and an iterative process are used. The proposed technique is built on some standard deep learning processes, zigzag transformation, image partitioning, and keys. In this section, the basic definitions and some properties of previous encryption techniques are briefly summarized and discussed.

The rest of this manuscript is organized as follows. Section 2 is devoted to the construction of the proposed technique. Section 3 depicts the results of the proposed encryption and its comparison with the existing results. The last section concludes the discussion.

In general, people try to protect their sensitive information, especially private or top-secret data. The most important parameters that make security algorithms robust and unbreakable are computational and time complexity. As computing power increases, the threats of breaching security systems and the likelihood of unauthorized disclosure of secret data grow. To achieve a reliable security system, symmetric cryptography and an iterative process are proposed. The Partial Deep-Learning Encryption Technique (PD-LET) is built on some standard deep learning processes, zigzag transformation, image partitioning, and keys.

This technique encrypts any digital data type. The main steps of the proposed technique are preprocessing, image partitioning, part interchanging, adding value, convolution stage, zigzag transformation, and symmetric key encryption as shown in

Algorithm 1 represents the encoding processes of PD-LET to encrypt secret data. Algorithm 2 represents the decoding processes to reconstruct the original data from an encrypted image.

The main objective of this stage is to convert the secret data into a two-dimensional (2D) array of bytes. Zero padding is applied so that the dimension of the array is divisible by a chosen integer, in preparation for dividing the data into several equal parts as shown in
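The preprocessing step can be sketched as follows. This is a minimal illustration, not the paper's exact implementation; the parameter name `parts` and the square layout are assumptions based on the description above.

```python
import numpy as np

def to_padded_square(data: bytes, parts: int = 4) -> np.ndarray:
    """Convert a byte stream into a square 2D array of bytes, zero-padded so
    the side length is divisible by `parts` (hypothetical parameter name)."""
    side = int(np.ceil(np.sqrt(len(data))))
    side += (-side) % parts            # round side up to a multiple of `parts`
    padded = np.zeros(side * side, dtype=np.uint8)
    padded[:len(data)] = np.frombuffer(data, dtype=np.uint8)
    return padded.reshape(side, side)

m = to_padded_square(b"secret message", parts=4)
```

Padding with zeros keeps the inverse trivial: the decoder simply truncates the flattened array back to the original length.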

The secret square-shaped data is divided into several equal parts as shown in
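A simple way to realise the partitioning, assuming an n × n grid of equal square blocks (the grid size is an illustrative choice, not stated in the text):

```python
import numpy as np

def split_into_parts(img: np.ndarray, n: int = 2) -> list:
    """Split a square array into n*n equal square blocks, row-major order."""
    h = img.shape[0] // n
    return [img[i*h:(i+1)*h, j*h:(j+1)*h] for i in range(n) for j in range(n)]

blocks = split_into_parts(np.arange(16, dtype=np.uint8).reshape(4, 4), n=2)
```

The preprocessing padding guarantees the side length divides evenly, so every block has the same shape and the parts can later be interchanged.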

At this stage, a predefined value is added to and/or XORed with each byte of the secret data; the data is then normalized so that its values fall within the image scale range of 0 to 255.
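A sketch of this add/XOR stage and its inverse, assuming mod-256 arithmetic as the normalisation (the paper only states that values are kept in the 0-255 range; `add_val` and `xor_val` stand in for the secret key material):

```python
import numpy as np

def add_and_xor(block: np.ndarray, add_val: int, xor_val: int) -> np.ndarray:
    """Add a predefined value to each byte (mod 256 keeps the result in the
    0-255 image range), then XOR with a second predefined value."""
    return ((block.astype(np.uint16) + add_val) % 256).astype(np.uint8) ^ xor_val

def invert_add_and_xor(block: np.ndarray, add_val: int, xor_val: int) -> np.ndarray:
    """Decryption side: undo the XOR first, then subtract mod 256."""
    return (((block ^ xor_val).astype(np.int16) - add_val) % 256).astype(np.uint8)

b = np.array([[10, 250]], dtype=np.uint8)
enc = add_and_xor(b, add_val=100, xor_val=0x5A)
```

Because both addition mod 256 and XOR are invertible byte operations, the decoder can recover the exact original bytes.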

A convolutional layer (CL) is an important layer of a convolutional neural network (CNN) architecture. This layer comprises a group of filters (kernels) that convolve with the input data and extract its features. A filter is generally smaller than the input data. The objective of this stage is to change the position and value of the data.

The dimension of the filter may range from 2 × 2 up to half the size of the part (the secret square-shaped data). The kernel is used to scramble the input data by swapping values diagonally, horizontally, or vertically. The values of the kernel weights alter the output of the convolutional layer.

A linear operation is used to encrypt the data. A kernel is a small set of numbers applied across the input data. Every output byte of each CL is the sum of the products of each kernel element and its corresponding input value, with zero padding used to retain the dimensions.

The stride of the kernel is one. After the kernel finishes scanning the entire input, a normalization step maps the convolved output back to the range 0 to 255.
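The convolution stage described above can be sketched as follows. The mod-256 reduction is an assumption standing in for the unspecified normalisation step, and the example kernel values are arbitrary:

```python
import numpy as np

def conv2d_encrypt(block: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Slide the kernel over the block with stride 1 and zero padding so the
    output keeps the input dimensions; each output byte is the sum of
    element-wise products, reduced mod 256 to stay in the image range."""
    k = kernel.shape[0]
    pad = k // 2
    padded = np.pad(block.astype(np.int64), pad)   # zero padding on all sides
    out = np.zeros_like(block, dtype=np.int64)
    for i in range(block.shape[0]):
        for j in range(block.shape[1]):
            out[i, j] = np.sum(padded[i:i + k, j:j + k] * kernel)
    return (out % 256).astype(np.uint8)

enc = conv2d_encrypt(np.eye(4, dtype=np.uint8) * 200, np.array([[1, 2], [3, 4]]))
```

Each output byte mixes several neighbouring input bytes, which is what spreads (diffuses) a single plaintext change across the ciphertext.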

2D zigzag scanning is important in many applications such as graphic compression algorithms and medical imaging. It scrambles the data by changing their locations. There are several types of zigzag scanning depending on the starting point and direction [
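One common variant, the JPEG-style zigzag starting at the top-left corner, can be sketched as follows (the paper does not specify which variant it uses, so this is one illustrative choice):

```python
import numpy as np

def zigzag_indices(n: int):
    """Index order for a classic top-left zigzag scan of an n x n array:
    traverse anti-diagonals, alternating direction on each one."""
    return sorted(((i, j) for i in range(n) for j in range(n)),
                  key=lambda p: (p[0] + p[1],
                                 p[0] if (p[0] + p[1]) % 2 else p[1]))

def zigzag_scramble(block: np.ndarray) -> np.ndarray:
    """Read the block in zigzag order and write the result back row by row."""
    n = block.shape[0]
    flat = np.array([block[i, j] for i, j in zigzag_indices(n)])
    return flat.reshape(n, n)

z = zigzag_scramble(np.arange(9).reshape(3, 3))
```

The scan is a fixed permutation of positions, so descrambling only requires applying the inverse permutation.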

For implementation purposes, the proposed technique is designed and developed to protect data in a 2D format using MATLAB 2020A. The proposed system is executed on Windows 10 (64-bit) with an Intel Pentium G2020 dual-core 2.90 GHz processor and 8 GB of RAM. To test the efficiency of the proposed technique, several experiments were run using different image types and sizes. To illustrate the effects of the CL rounds and kernel weights on the encrypted output,

Security analysis is important for judging the quality of a cryptosystem, as it determines whether the cryptosystem is strong enough to resist any type of attack. The quality and strength of the proposed technique are verified through key space, visual testing, histogram analysis, differential analysis, information entropy, and correlation coefficient analysis. The restored data is an exact copy of the original.

The key space is defined as the number of attempts an attacker must make to read confidential data; the larger the key space, the smaller the chance of a successful brute-force attack. In the proposed technique, the key space comprises the weight values of the kernel filters in the convolutional layers, the added values, and the initial parameters of the chaotic function and keys. A small change in any of these values affects the output. A brute-force attack is practically impossible due to the large key size, and all other parameters of the proposed technique further increase the key space. The sequence of the phased implementation of the proposed technique and its repetition provide good resistance to brute-force attacks. Any slight change in the secret key value produces a completely different encryption and decryption output.

The efficiency of the encryption system is excellent if only an authorized person can read the secret data.

A histogram (intensity function), which represents the distribution of pixel intensity values in graphical form, is used to test the effectiveness of the encryption technique.

There is a strong mutual relationship between any two contiguous pixels in an ordinary image. Scatter plots in

The correlation coefficient ($r_{xy}$) between two sequences of adjacent pixel values $x$ and $y$ is computed as $r_{xy} = \operatorname{cov}(x, y) / (\sqrt{D(x)}\,\sqrt{D(y)})$, where $\operatorname{cov}$ is the covariance and $D$ the variance.

As the obtained correlation coefficient values lie in the range of −0.0034 to 0.0026, the attacker cannot extract any information about the original data from the encrypted data; the relationship between the original and the encrypted data is almost negligible.
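The adjacent-pixel correlation test can be reproduced with a short sketch (sample count and seed are arbitrary choices; the same test is usually repeated for vertical and diagonal neighbours):

```python
import numpy as np

def adjacent_correlation(img: np.ndarray, n_pairs: int = 2000, seed: int = 0) -> float:
    """Correlation coefficient of randomly sampled horizontally adjacent
    pixel pairs."""
    rng = np.random.default_rng(seed)
    rows = rng.integers(0, img.shape[0], n_pairs)
    cols = rng.integers(0, img.shape[1] - 1, n_pairs)
    x = img[rows, cols].astype(float)
    y = img[rows, cols + 1].astype(float)
    return float(np.corrcoef(x, y)[0, 1])

# A smooth image gives r close to 1; uniform noise gives r near 0.
smooth = np.tile(np.arange(256, dtype=np.uint8), (256, 1))
noise = np.random.default_rng(1).integers(0, 256, (256, 256), dtype=np.uint8)
```

A good cipher should drive this statistic towards zero, as the values reported above do.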

Entropy is defined as a degree of the randomness of data. Information entropy is defined by the following equation:

$$H(m) = -\sum_{i=0}^{255} p(m_i) \log_2 p(m_i)$$

where $p(m_i)$ is the probability of occurrence of the grey level $m_i$.

The chi-square value, which shows how the encrypted data values are distributed, is estimated using the following equation:

$$\chi^2 = \sum_{i=0}^{255} \frac{(o_i - e_i)^2}{e_i}$$

where $o_i$ is the observed frequency of grey level $i$ and $e_i = N/256$ is the expected frequency for an image of $N$ pixels.
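Both statistics are straightforward to compute from the grey-level histogram; a minimal sketch for 8-bit data:

```python
import numpy as np

def shannon_entropy(img: np.ndarray) -> float:
    """H(m) = -sum p(m_i) * log2 p(m_i) over the 256 grey levels."""
    counts = np.bincount(img.ravel(), minlength=256)
    p = counts[counts > 0] / img.size
    return float(-(p * np.log2(p)).sum())

def chi_square(img: np.ndarray) -> float:
    """Chi-square statistic against the uniform expectation e_i = N/256."""
    counts = np.bincount(img.ravel(), minlength=256)
    expected = img.size / 256
    return float(((counts - expected) ** 2 / expected).sum())

# Perfectly uniform 8-bit data: entropy is exactly 8 bits, chi-square is 0.
uniform = np.arange(256, dtype=np.uint8).repeat(256)
```

An ideal cipher output approaches the uniform case: entropy near 8 and a small chi-square value, as in the results reported below.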

| Technique | NPCR | UACI | Entropy | LS entropy | Chi-square test |
|---|---|---|---|---|---|
| Proposed (Lena 256 × 256) | 99.5903 | 34.8508 | 7.9991 | 7.9600 | 234.1314 |
| Proposed (Lake 512 × 512) | 99.6093 | 33.4635 | 7.9993 | 7.9746 | 236.8242 |
| Proposed (Pepper 512 × 512) | 99.6094 | 33.4635 | 7.9998 | 7.9777 | 233.2637 |
| Ref [ | 99.6216 | 33.4994 | 7.9972 | 7.9002 | 253.4844 |
| Ref [ | 99.6109 | 33.4783 | 7.9972 | 7.9024 | 233.1328 |

A rule of encryption schemes states that minor modifications to the plain data should make a big difference to the encryption output. To break an encryption scheme, an attacker slightly modifies the plain data and inspects the resulting encrypted data. Two measures track this level of alteration: NPCR (Number of Pixels Change Rate) and UACI (Unified Average Changing Intensity) [

In general, an encryption technique is good if its output is completely different when making slight modifications to the input (secret data). The ability to resist differential attacks is confirmed by NPCR/UACI tests [

UACI is estimated as the average difference between the encrypted data and the secret data, expressed as a percentage of the maximum value in the secret data (

Using these two scores to study the output of the proposed technique when changing a single byte in secret data and to verify the resistance of the proposed technique to differential attacks [

We randomly change only one value in the secret data, called "I1"; the result after the change is called "I2". These two images (I1 and I2) are then encrypted, and the outputs of the proposed technique are the encrypted images A and B, respectively.
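Given the two encrypted images A and B, the two scores can be computed as follows (a standard formulation for 8-bit images; the paper gives its own equations via the citations above):

```python
import numpy as np

def npcr(a: np.ndarray, b: np.ndarray) -> float:
    """Percentage of pixel positions whose values differ between A and B."""
    return 100.0 * float(np.mean(a != b))

def uaci(a: np.ndarray, b: np.ndarray) -> float:
    """Mean absolute pixel difference as a percentage of the maximum (255)."""
    return 100.0 * float(np.mean(np.abs(a.astype(int) - b.astype(int)))) / 255

# Extreme case: every pixel differs by the full range.
a = np.zeros((4, 4), dtype=np.uint8)
b = np.full((4, 4), 255, dtype=np.uint8)
```

For a strong cipher, NPCR should approach 99.6% and UACI about 33.46% (the expected values for two independent uniform images), matching the results in the table above.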

Infrequently, the encrypted data contains some blocks with very low entropy values [. The Local Shannon Entropy is therefore computed as the mean of the Shannon entropies of $k$ randomly selected non-overlapping blocks: $\overline{H_k}(m) = \sum_{i=1}^{k} H(m_i)/k$, where $H(m_i)$ is the entropy of block $m_i$.

The Structural Similarity Index (SSI) measures the degree of difference in an image's texture after processing. The SSI value usually lies in the range 0 to 1: if two images match exactly, the SSI value is one, and the smaller the value, the greater the difference between them [

The MATLAB ssim function is used to calculate the SSI. The SSI value between the original and the encrypted image is 0.049 on average; this small value indicates that there is no correlation between them.
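For reference, a single-window (global) SSIM can be sketched directly from the standard formula. This is a simplification: MATLAB's ssim additionally averages over local Gaussian windows, so its values differ slightly.

```python
import numpy as np

def global_ssim(x: np.ndarray, y: np.ndarray) -> float:
    """Global SSIM over the whole image, using the standard stabilising
    constants C1 = (0.01*255)^2 and C2 = (0.03*255)^2 for 8-bit data."""
    x, y = x.astype(float), y.astype(float)
    c1, c2 = (0.01 * 255) ** 2, (0.03 * 255) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()          # covariance
    return float((2 * mx * my + c1) * (2 * cxy + c2)
                 / ((mx**2 + my**2 + c1) * (vx + vy + c2)))

img = np.arange(64, dtype=np.uint8).reshape(8, 8)
```

An identical image pair yields exactly 1, while structurally unrelated images yield values near 0, as the 0.049 average above indicates.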

The proposed technique is tested with full-black and full-white images to verify its suitability and its ability to resist plain-text attacks.

The results of the proposed technique are compared with previous encryption systems based on statistical score metrics. From the results presented in

This paper proposes a strong multi-stage cryptographic system called the Partial Deep-Learning Encryption Technique (PD-LET). The first stage is preprocessing, which scrambles the data by reshaping and transposing it. Deep learning and zigzag transformation processes are responsible for increasing encryption efficiency, and random image partitioning and encryption keys further improve encryption quality. The results show that PD-LET resists different attacks thanks to its sequence of operations. There is no indication of the original data in the encrypted data, so the possibility of successful cryptanalysis is negligible. Furthermore, the size of the encrypted data frequently differs from the original, which gives additional protection against discovery and disclosure. Performance is evaluated by estimating entropy, plain-text attack analysis, the structural similarity index, Local Shannon Entropy, the chi-square test, histograms, and NPCR and UACI values. There is a 100% match between the original and reconstructed data. In future work, we will try to decrease the encrypted data size and computation cost, and data encryption systems based on full deep learning algorithms will be proposed, since quantum computers may crack most current encryption algorithms in the future.

The author expresses his appreciation to God almighty and to those who carry the torches of knowledge to light the way of mankind.
