Department of Computer Science, College of Science, University of Sulaimani, Sulaymaniyah, Iraq.
ABSTRACT
The progress of quantum computing poses serious threats to classical cryptographic systems, spurring extensive research into post-quantum cryptography (PQC). Among the schemes available in PQC, lattice-based cryptography (LBC) appears to be one of the strongest candidates, offering an ample security basis and good computational efficiency. However, practical implementation of LBC faces key-generation and optimization difficulties, mainly because of its large key sizes and computational overhead. This research proposes a novel approach in which genetic algorithms (GAs) are blended with LBC to improve key generation while guaranteeing security. Exploiting the evolutionary capacity of GAs, the proposed method optimizes lattice-based keys through selection, crossover, and mutation to ensure high entropy while remaining computationally feasible. Experimental results indicate that the GA-based method reduces memory requirements and computational complexity, making it favorable for resource-constrained environments such as the Internet of Things and embedded systems. The proposed method accelerates encryption and simultaneously strengthens the security of the optimized key structures. This study emphasizes evolutionary algorithms’ potential to facilitate PQC advancement and provides a scalable and efficient framework for cryptographic systems.
Index Terms: Cryptographic Security, Genetic Algorithm, Key Optimization, Lattice-Based Cryptography, Post-Quantum Cryptography
The launch of quantum computing has considerably challenged standard cryptographic setups [1]. Existing classical cryptographic systems, such as RSA and Elliptic Curve Cryptography (ECC) [2], rely on mathematical problems that are computationally hard for classical computers but can be solved efficiently by quantum algorithms, such as Shor’s algorithm [3]. This weakness has accelerated the search for post-quantum cryptography (PQC) solutions, aimed at designing cryptosystems that resist attack by quantum computers [4]. Lattice-based cryptography (LBC) [5], with its robust security guarantees and computational efficiency, has since attracted particular attention. In this spirit, the National Institute of Standards and Technology (NIST) [6] has led a global effort to standardize PQC algorithms, among which lattice-based schemes are under scrutiny to become future cryptographic standards.
Lattice-based cryptography (LBC) is based on the hardness of two computationally challenging lattice problems: the Shortest Vector Problem (SVP) and the Learning with Errors (LWE) problem [7]. This kind of cryptography is considered a potent building block for digital signatures, key agreement, and encryption security in the quantum age [8]. However, the practical deployment of such systems faces challenges in key generation and optimization. The large key sizes of lattice systems can impose troublesome memory requirements, while the computational cost may restrict their applicability in low-resource environments such as the Internet of Things (IoT) and embedded systems [9].
LBC derives its security from the computational intractability of certain fundamental lattice problems, primarily the SVP and the LWE problem [7]. The SVP involves finding the shortest non-zero vector in a high-dimensional lattice, which is known to be NP-hard under specific norms [10]. The LWE problem, on the other hand, is based on solving linear equations with small random errors, making it resistant to both classical and quantum attacks. These problems serve as the basis for constructing secure encryption schemes, digital signatures, and key exchange protocols [11]. Unlike factorization-based cryptosystems, which are vulnerable to Shor’s algorithm, the hardness of SVP and LWE remains intact even in the presence of quantum computers [12]. As a result, LBC has emerged as a leading candidate for PQC and is actively being standardized by NIST for future cryptographic applications.
To tackle these problems, this study uses genetic algorithms (GAs) to optimize lattice-based cryptographic keys. GAs [13], introduced in the 1970s and inspired by biological evolution, are an efficient and effective technique for solving optimization problems over complex search spaces. GAs exploit building blocks of the search space, described by hyperplane-defined functions [14], through selection, crossover, and mutation, which makes them well suited to structured, non-linear problems. Through a GA, lattice-based cryptographic keys can be refined by an evolutionary process of selection, crossover, and mutation. Combining the GA and LBC concepts enhances the key generation process by pruning infeasible key candidates that do not guarantee high security.
Major Contributions:
The paper introduces GAs into LBC key generation, stating a clear process for obtaining secure keys.
The proposed approach improves the efficiency of key generation, reducing computational resource waste while guaranteeing security.
We improve key optimization, memory efficiency, and encryption speed, and illustrate how the proposed method outperforms alternatives for this purpose.
We examine the technique within real-world cryptographic systems resistant to quantum computer attacks, focusing on resource-limited environments.
The remaining sections of this paper are as follows: Section 2 provides a comprehensive survey of PQC and lattice-based schemes, with explanatory notes on optimization. Section 3 shows how GAs are combined with lattice-based key generation and explains the research strategy used. Section 4 describes an experimental study of the proposed improvement, including the performance measurements and the results obtained. Finally, Section 5 concludes the paper and discusses prospective directions for future work.
The serious threat that quantum computers pose to traditional cryptographic schemes has spurred the development of PQC – a race to create cryptographic algorithms capable of withstanding quantum-level attacks. The quest for encryption methods robust to future quantum computing has led NIST to assume a leading role in standardizing such algorithms. This section gives an extensive account of recent and relevant PQC methodologies in LBC, its applications, performance improvements, and enhanced security against threats from quantum computers.
Quantum computers use quantum mechanical principles to perform certain tasks exponentially faster than classical ones. Notably, integer factorization could be broken through Shor’s algorithm, rendering the most widely used cryptographic systems, such as RSA and ECC, insecure. Although Shor’s algorithm has so far been demonstrated only in theory and at small scale, a sufficiently powerful quantum computer would make the attack practical for everyday key sizes. The emergence of PQC is therefore intended to keep data secure even in the quantum era. Standardization of quantum-resistant algorithms remains crucial, and candidates have been evaluated by NIST primarily based on complex mathematical problems, e.g., LBC [15].
The rapid advancements in quantum computing pose significant threats to traditional cryptographic systems, necessitating a transition toward PQC. UAVs, which rely on open wireless channels and have constrained computational resources, are particularly vulnerable to quantum-based attacks, making PQC integration crucial for securing their communications [16]. The NIST PQC standardization process has driven research toward hardware implementations, leading to the development of PQC hardware circuits and system design techniques to enhance security in various applications [17]. However, the migration of IP networks to PQC remains a complex challenge due to inconsistencies in migration steps, terminology, and security concerns, with present implementations being largely experimental [18]. A structured transition plan to PQC is essential, incorporating identification, protection, detection, and response phases to ensure a quantum-resilient cybersecurity infrastructure [19]. While hybrid cryptographic systems combining traditional and quantum-resistant techniques are being explored, significant challenges remain in practical deployment and regulatory alignment.
One of the candidates in the PQC contest, LBC has become an attractive choice. LBC entails solving lattice problems such as the closest vector problem, LWE, and ring-LWE, which are known to be hard both classically and quantumly [20]. LBC-based protocols are available for encryption, key exchange, digital signatures, and even homomorphic encryption, and they can be used to secure various kinds of data and communication channels. Even though LBC is safe in principle, real-world problems arise when implementing the scheme, where key and memory management structures may be inefficient, making it inconvenient in areas where resources are scarce, such as embedded systems and IoT equipment.
The global spread of IoT devices and the subsequent advancement of mobile networks to 5G have necessitated the development of cryptosystems that resist attacks facilitated by a quantum computer. IoT devices have limited computational power, which calls for lightweight cryptographic solutions that resist quantum attacks without being stretched in terms of performance [21]. A lightweight LBC instance is beneficial in such a scenario, as it provides almost the same level of security as ordinary LBC through approaches that decrease both memory usage and overall computing requirements [22]. In particular, a recent study provided efficient symmetric and asymmetric quantum-safe algorithms for an IoT environment, showing significant performance and memory cost improvements over the existing NIST-approved algorithms [21].
Because LBC involves an abundance of highly complex operations, considerable pressure has been placed on developers to improve its performance [23]. Various forms of vectorization have been instrumental in enhancing lattice-based schemes, notably key exchange protocols, reducing both time and energy consumption [24]. Attention has also focused on the problem of large keys in lattice-based systems. An example is NTRU, which can provide faster encryption and key generation than RSA but suffers from large key sizes, rendering it unsuitable for many users [25].
LBC has the potential to address challenging security issues, but great care is still required to protect such systems in the era of quantum computers. Hazards remain in lattice structures and their operations, irrespective of whether they are hardware or software components. For that reason, research in this area is necessary to improve the security of LBC and devise workable countermeasures against new threats, such as stealthy side-channel attacks or physical access [26]. Moreover, NIST’s PQC standardization has to weigh security requirements against the practicality of memory consumed, processor cycles utilized, and power drawn by devices, to ensure that candidate schemes satisfy the limits of different hardware equipment [27].
Lattice-based cryptographic systems have arisen as an answer to the problem of post-quantum vulnerability, with the security level determined by the specific algorithms used for encryption. Although the security model of such systems is rather strong, practical implementation is complex in environments where resources are limited, such as IoT and edge processing. Optimizing LBC implementations and incorporating techniques such as vectorization has dramatically increased the ability of LBC to adapt to different appliances. Moreover, security, performance, and other parameters should be evaluated at scale as this technology undergoes the changes that are inevitable on the path toward quantum computers. NIST’s efforts to standardize post-quantum cryptographic systems for the future are significant for establishing the needed safety standards. A comparison of the most relevant related works is provided in Table 1 to highlight the differences and advantages of the proposed method.
TABLE 1: Comparison of the related works
PQC [28] is a field of cryptographic research that aims to develop secure cryptographic algorithms resilient to attacks by quantum computers. Classical cryptographic schemes such as RSA and ECC rely on problems such as integer factorization and discrete logarithms, which are efficiently solvable using Shor’s algorithm on a sufficiently powerful quantum computer. PQC schemes are designed to counteract this vulnerability by relying on problems that remain computationally intractable even in the presence of quantum adversaries.
LBC has emerged as one of the most promising candidates for post-quantum secure cryptographic protocols. It relies on the computational hardness of lattice problems, such as the SVP and the LWE problem. Given a basis B = {b1, b2, …, bn} of an n-dimensional lattice Λ, the SVP requires finding the shortest non-zero vector v ∈ Λ, which is computationally difficult even for quantum computers. The LWE problem, introduced by Regev, is formulated as follows: given a set of linear equations perturbed by small noise, recovering the original secret vector s is assumed to be hard. Mathematically, LWE can be defined as in Eq. (1).
A⋅s + e = b mod q (1)
Where A is a randomly chosen matrix over Zq, s is the secret vector, and e is an error term sampled from a small noise distribution. Recovering s given (A, b) is believed to be computationally infeasible for sufficiently large parameters. LBC provides efficient and scalable constructions for encryption, digital signatures, and key exchange protocols.
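For concreteness, the following minimal Python sketch generates a toy LWE instance in the form of Eq. (1); the dimensions, modulus, and noise width are illustrative assumptions, not the parameters used in this study.

import numpy as np

def lwe_instance(n=8, m=16, q=97, sigma=1.0, seed=None):
    """Build a toy LWE instance (A, b) with hidden secret s: b = A.s + e mod q."""
    rng = np.random.default_rng(seed)
    A = rng.integers(0, q, size=(m, n))                      # uniform public matrix
    s = rng.integers(0, q, size=n)                           # secret vector
    e = np.rint(rng.normal(0.0, sigma, size=m)).astype(int)  # small Gaussian error
    b = (A @ s + e) % q
    return A, b, s

A, b, s = lwe_instance(seed=42)
# Without the error e, s could be recovered by Gaussian elimination;
# the noise is what makes recovering s from (A, b) hard.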
GA [13] is an evolutionary optimization technique inspired by natural selection and genetics principles. It is widely employed to solve complex optimization problems where traditional mathematical or heuristic approaches may be inefficient. GA operates on a population of candidate solutions, evolving them iteratively through selection, crossover, and mutation operations to enhance solution quality. The fitness function, denoted as f(x), evaluates the suitability of each candidate solution x within the search space.
The algorithm begins with an initial population P = {x1, x2, …, xn}, where each xi is a chromosome encoding a potential solution. Selection is performed to retain high-fitness individuals, commonly using roulette wheel selection, tournament selection, or rank-based selection. In roulette wheel selection, the probability of selecting an individual xi is proportional to its fitness:
P(xi) = f(xi) / Σj f(xj) (2)
Crossover, or recombination, involves exchanging genetic material between selected parents to create offspring. A common approach is single-point crossover, where a random crossover point c is chosen, and genes are swapped between two parents to produce new chromosomes:
Child1 = (Parent1[:c], Parent2[c:]) (3)
Child2 = (Parent2[:c], Parent1[c:]) (4)
Mutation introduces small random alterations to maintain genetic diversity and prevent premature convergence. If xi is a binary-encoded chromosome, a bit-flip mutation at position j can be expressed as:
x′ij = 1 − xij (5)
Where xij is the jth gene of chromosome xi. The algorithm iterates through these operations until a termination criterion is met, such as achieving a predefined fitness threshold or reaching a maximum number of generations.
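The three operators can be summarized in a short Python sketch implementing Eqs. (2)-(5); the list-based binary encoding and the mutation rate are illustrative choices rather than the exact settings of the proposed scheme.

import random

def roulette_select(population, fitness):
    """Select one individual with probability proportional to fitness, Eq. (2)."""
    total = sum(fitness)
    r, acc = random.uniform(0, total), 0.0
    for individual, f in zip(population, fitness):
        acc += f
        if acc >= r:
            return individual
    return population[-1]

def single_point_crossover(parent1, parent2):
    """Swap gene tails at a random cut point c, Eqs. (3)-(4)."""
    c = random.randint(1, len(parent1) - 1)
    return parent1[:c] + parent2[c:], parent2[:c] + parent1[c:]

def bit_flip_mutation(chromosome, rate=0.01):
    """Flip each bit with small probability to preserve diversity, Eq. (5)."""
    return [1 - gene if random.random() < rate else gene for gene in chromosome]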
This study presents a hybrid cryptographic framework that combines a GA with lattice-based encryption to enhance security and adaptability. The method consists of four main stages: key optimization, encryption, mutation, and decryption. Key optimization is performed using the GA to generate an invertible encryption key matrix K over Z256. The fitness function ensures the invertibility of K by verifying that its determinant satisfies gcd(det(K), 256) = 1, so that its modular inverse exists modulo 256. The fitness function is defined as:
f(K) = ΣK if gcd(det(K), 256) = 1, and f(K) = 0 otherwise (6)
Where ΣK represents the sum of all elements in the key matrix. The GA iteratively refines K using selection, crossover, and mutation to obtain an optimal encryption key. Encryption transforms the plaintext message P into its numerical representation and computes the ciphertext C using matrix multiplication in modular arithmetic:
C = P⋅K mod 256 (7)
Where padding is applied to P to align with the key matrix dimension. To enhance security, a GA-based mutation process introduces controlled randomness by flipping bits in C, further obfuscating the encrypted text and increasing resistance against attacks. Decryption is performed using the modular inverse of the key matrix, ensuring correct retrieval of the original message:
P’ = C⋅K−1 mod 256 (8)
Where P’ ideally reconstructs the original plaintext. The invertibility of K ensures accurate recovery of P without information loss. Integrating GA-based optimization with lattice cryptography provides a robust mechanism resistant to conventional attacks. The introduced noise through mutation adds a layer of obfuscation. The proposed scheme offers a potential post-quantum alternative, leveraging the hardness of modular matrix inversion while maintaining computational efficiency.
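A minimal Python sketch of the fitness, encryption, and decryption stages, Eqs. (6)-(8), is given below; it assumes a small 2×2 key for illustration and omits the GA search loop and the ciphertext-mutation step.

import numpy as np
from math import gcd
from sympy import Matrix

MOD = 256

def fitness(K):
    """Eq. (6): sum of entries when K is invertible mod 256, zero otherwise."""
    det = int(round(np.linalg.det(K))) % MOD
    return int(K.sum()) if gcd(det, MOD) == 1 else 0

def encrypt(plaintext, K):
    """Eq. (7): C = P.K mod 256, zero-padding P to the key dimension."""
    n = K.shape[0]
    data = list(plaintext.encode())
    data += [0] * (-len(data) % n)          # pad to a multiple of n
    P = np.array(data).reshape(-1, n)
    return (P @ K) % MOD

def decrypt(C, K):
    """Eq. (8): P' = C.K^-1 mod 256, using the modular inverse of K."""
    K_inv = np.array(Matrix(K.tolist()).inv_mod(MOD)).astype(int)
    P = (C @ K_inv) % MOD
    return bytes(P.flatten().tolist()).rstrip(b"\x00").decode()

K = np.array([[3, 2], [5, 7]])              # det = 11, coprime to 256
assert fitness(K) > 0
assert decrypt(encrypt("lattice", K), K) == "lattice"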
To address the potential security vulnerabilities associated with using GAs for cryptographic key generation, we acknowledge that while GAs offer efficiency improvements, they may introduce risks. Specifically, the optimization process might lead to patterns or predictable sequences in the generated keys, which could be exploited by attackers. This concern arises because GAs are inherently stochastic, and if not carefully designed, they may converge to weak or vulnerable keys, thereby reducing the security of the encryption system. In addition, if the search space is not sufficiently large or diverse, or if the genetic operators (such as selection, crossover, and mutation) are not effectively applied, the randomness required for secure cryptographic keys could be compromised, making them more susceptible to cryptanalysis techniques such as brute-force or statistical attacks.
To mitigate these risks, we have incorporated mechanisms within our GA-based key generation process to preserve randomness and avoid premature convergence. This includes ensuring that the search space remains large and diverse, and that proper randomization techniques are used. We also recommend conducting further security evaluations, such as testing against known cryptographic attacks (e.g., differential and linear cryptanalysis) and assessing the strength of GA-generated keys in realistic environments. While GAs can enhance the efficiency of cryptographic key generation, it is essential to carefully design and test the approach to prevent the introduction of vulnerabilities and ensure its applicability in secure cryptographic systems. The flowchart of the complete process is illustrated in Figure 1, providing a visual representation of the step-by-step workflow.
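As one concrete sanity check of the kind recommended above, a Shannon-entropy estimate over the bytes of generated keys can flag low-randomness outputs; the sketch below is a hypothetical diagnostic, not part of the proposed scheme itself.

import numpy as np
from collections import Counter

def byte_entropy(key_bytes):
    """Shannon entropy in bits per byte; values near 8.0 suggest near-uniform keys."""
    counts = Counter(key_bytes)
    p = np.array(list(counts.values()), dtype=float) / len(key_bytes)
    return float(-(p * np.log2(p)).sum())

# Example: reject GA-generated keys whose byte distribution is too skewed.
key = np.random.default_rng(1).integers(0, 256, size=1024).astype(np.uint8)
assert byte_entropy(key.tobytes()) > 7.5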
Fig. 1. Flowchart of the proposed method.
The computational complexity of the proposed method is evaluated by considering its key stages: key optimization using the GA, encryption, mutation, and decryption. In the key optimization stage, the GA iterates over populations of candidate solutions. Selection evaluates each matrix with the fitness function, which requires computing the determinant and verifying the invertibility condition, at a cost of O(n³). Crossover combines matrices with complexity O(n²), and mutation flips bits in the matrix, also requiring O(n²). Evaluating the fitness function involves determinant and modular-inverse calculations, leading to a total cost of O(n³ + n²) per generation.
Encryption involves matrix multiplication between the plaintext and the key matrix, with a complexity of O(n²). Similarly, the mutation step on the ciphertext flips bits and also takes O(n²). Decryption requires matrix multiplication with the inverse of the key matrix, along with the modular-inverse computation, resulting in a complexity of O(n³ + n²).
The overall time complexity is dominated by the GA key optimization and can be expressed as O(g·p·(n³ + n²)) + O(n³ + n²), where g is the number of generations, p is the population size, and n is the matrix dimension; the second term covers encryption, mutation, and decryption. While the GA introduces computational overhead due to its iterative nature, it provides a flexible approach to NP-hard optimization problems such as those underlying LBC, which resist quantum attacks. This hybrid approach offers strong security guarantees while maintaining efficiency within the constraints of the cryptographic operations.
The experimental evaluation of the proposed method was conducted with a population size of 50 and a maximum iteration count of 50. The method was implemented in Python and executed on a Core i7-13650HX processor with 24GB of RAM. To optimize the efficiency of the GA in cryptographic key generation, we used the face-centered central composite design method [29], which tests parameters such as population size, mutation rate, and number of generations at three levels: low, medium, and high. This systematic approach allows for a thorough exploration of the parameter space to identify optimal settings. The following tables and figures comprehensively assess the method’s performance in encryption, decryption, and mutation processes.
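For reference, the run matrix of a face-centered central composite design over the three GA parameters can be enumerated as below; the concrete low/high values are assumptions for illustration, since the study reports only the medium settings (population 50, 50 iterations).

from itertools import product

# Coded levels: -1 = low, 0 = medium, +1 = high (alpha = 1 for a face-centred CCD).
corners = list(product([-1, 1], repeat=3))                       # 2^3 = 8 corner runs
faces = [tuple(v if i == axis else 0 for i in range(3))
         for axis in range(3) for v in (-1, 1)]                  # 2*3 = 6 face runs
center = [(0, 0, 0)]                                             # 1 centre run
design = corners + faces + center                                # 15 runs in total

# Hypothetical mapping from coded levels to actual GA parameter values.
levels = {
    "population_size": {-1: 25, 0: 50, 1: 75},
    "mutation_rate":   {-1: 0.01, 0: 0.05, 1: 0.10},
    "generations":     {-1: 25, 0: 50, 1: 75},
}
runs = [{name: levels[name][code] for name, code in zip(levels, point)}
        for point in design]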
Table 2 presents the optimized key matrices derived from the proposed method. The optimized key structure is critical to encryption security, ensuring randomness and unpredictability while maintaining computational efficiency. The displayed key values highlight the capability of the GA in optimizing cryptographic parameters. The key matrix exhibits a well-distributed pattern, reinforcing the robustness of the encryption scheme. These optimized keys are instrumental in generating secure ciphertext, thereby preventing unauthorized decryption attempts. The structured yet dynamic nature of the key suggests that the GA is effective in producing high-entropy keys, which is crucial for strong encryption. The results confirm that the proposed approach generates optimized encryption keys that enhance security and balance complexity and computational feasibility. By ensuring a high degree of randomness, the key optimization process significantly improves the overall encryption robustness, making it resistant to cryptanalysis.
TABLE 2: Optimized Keys achieved by the proposed method
Fig. 2 illustrates the convergence behavior of the GA in optimizing the key generation process for lattice-based cryptographic schemes. The x-axis represents the number of generations, while the y-axis denotes the best fitness value observed during the optimization process. As the number of generations increases, the fitness value improves consistently, demonstrating the effectiveness of the GA in evolving optimal cryptographic keys. The initial fluctuations in the fitness values indicate the exploration phase of the algorithm, where diverse candidate solutions are evaluated. As the generations progress, the curve exhibits a steady upward trend, reflecting the exploitation phase where the algorithm refines solutions to achieve higher fitness values. The smooth yet incremental improvement in fitness confirms the stability of the GA and its ability to converge toward an optimized key structure. This result validates the efficacy of using GA-based optimization in post-quantum cryptographic key generation by reducing computational overhead while maintaining high-security standards.
Fig. 2. The convergence curve attained by the proposed method.
Table 3 presents the encryption and mutation results for short ciphertexts containing between 1 and 100 characters. The original plaintext and its corresponding decrypted output confirm the accuracy of the GA-based key recovery. The exact match between the decrypted text and the original message indicates that the proposed decryption approach successfully reconstructs the original input without introducing errors. Furthermore, the analysis of original and mutated ciphertexts reveals subtle alterations applied to the encoded text, a critical feature in enhancing security. These controlled mutations introduce diversity in the ciphertext representation, ensuring that even minor variations in the key can lead to significantly different encrypted outputs. This characteristic is essential in preventing pattern-based cryptanalysis while maintaining the ability to accurately decrypt the message when the correct key is used. The results validate the proposed approach’s effectiveness in ensuring secure communication with a robust decryption mechanism.
TABLE 3: Short ciphertext and mutation analysis 1–100 characters
Table 4 extends the ciphertext analysis to medium-length messages, ranging from 101 to 200 characters. Similar to the previous case, the decrypted text precisely matches the original, further establishing the reliability of the GA-based decryption mechanism. The analysis of the ciphertext mutations reveals systematic modifications applied to the encrypted message. These modifications serve the dual purpose of enhancing security while preserving the integrity of the decryption process. By introducing slight perturbations in the ciphertext structure, the mutation process ensures that even if an attacker gains partial knowledge of the encryption method, reconstructing the exact plaintext without the key remains computationally infeasible. The results highlight the robustness of the proposed method in handling larger messages while maintaining high decryption accuracy. This adaptability to varying message lengths is crucial to practical cryptographic implementations, making the method viable for real-world applications requiring secure communication.
TABLE 4: Medium-length ciphertext and mutation evolution 101–200 characters
Table 5 evaluates the performance of the proposed approach on long ciphertexts containing between 201 and 300 characters. The results reinforce the algorithm’s ability to decrypt lengthy messages accurately. The decrypted output remains consistent with the original, demonstrating the method’s scalability and reliability. The analysis of the mutated ciphertexts further substantiates the security-enhancing capabilities of the proposed approach. The systematic variations introduced in the encrypted text ensure that even slight modifications to the input lead to substantial changes in the ciphertext, a desirable property in cryptographic systems. These controlled mutations make it challenging for adversaries to exploit patterns or predict key variations, strengthening encryption security. The successful decryption of long messages while preserving security properties establishes the proposed method as a robust solution for high-volume data encryption scenarios, such as secure messaging and data storage.
TABLE 5: Long Ciphertext with GA-based decryption and mutations 201–300 characters
Table 6 presents another analysis of long ciphertexts, reaffirming the consistency of the proposed method in handling extensive encryption and decryption tasks. The decrypted output aligns perfectly with the original message, confirming the reliability of the GA-based approach. The mutations applied to the ciphertext exhibit controlled randomness, ensuring that each encryption process yields a unique representation of the plaintext. This diversity in ciphertext generation enhances security by preventing pattern recognition and statistical attacks. The ability to maintain decryption accuracy despite introducing systematic variations in the ciphertext underlines the robustness of the encryption framework. These findings further validate the scalability and efficiency of the proposed approach, making it a viable candidate for real-world cryptographic applications involving long-form textual data.
TABLE 6: Long ciphertext with GA-based decryption and mutations 201–300 characters
Table 7 extends the evaluation to an additional long ciphertext, reinforcing the established trends observed in previous analyses. The decrypted message matches the original text, highlighting the precision and effectiveness of the GA-based decryption mechanism. The mutation process continues to demonstrate its role in enhancing security by introducing structured variations in the encrypted output. These mutations contribute to the overall unpredictability of the ciphertext, making it increasingly resistant to cryptographic attacks. The results further support the applicability of the proposed method in diverse encryption scenarios where security and accuracy are paramount. The method’s ability to adapt to different message lengths while preserving encryption integrity ensures its practical utility in securing sensitive communications.
TABLE 7: Long ciphertext with GA-based decryption and mutations 201–300 characters
Table 8 examines the performance of the proposed approach on very long ciphertexts exceeding 301 characters. The results affirm the scalability and reliability of the encryption-decryption framework in handling extensive textual data. The decrypted output accurately reconstructs the original message, demonstrating the robustness of the GA-based key generation and recovery process. The mutations applied to the ciphertext introduce calculated variations, reinforcing encryption security. These systematic alterations ensure that even minor differences in the key result in significantly distinct ciphertext outputs, a property essential for preventing decryption by unauthorized entities. The findings confirm the proposed method is well-suited for high-security applications, where message integrity and confidentiality are critical. By successfully encrypting, decrypting, and mutating long messages, the technique establishes itself as a versatile solution for modern cryptographic challenges.
TABLE 8: Very long ciphertext with GA-based decryption and mutations of 301+ characters
Table 9 presents a comparative analysis of the encryption and decryption times for different instances using a genetic optimization-based algorithm and a particle swarm optimization-based algorithm. The results indicate that the particle swarm optimization-based algorithm consistently exhibits slightly higher encryption and decryption times across all instances. For instance, in the first case, the encryption time for the genetic optimization-based approach is 0.000108957 s, whereas for the particle swarm optimization-based approach, it increases to 0.000158957 s. Similarly, the decryption time follows the same trend, with 0.000183343 s for the genetic optimization-based algorithm and 0.000253343 s for the particle swarm-based approach. This pattern persists across all instances, with increasing encryption and decryption times as the instance number grows. The key generation time is also reported, with the genetic optimization-based algorithm achieving 0.453506 s, whereas the particle swarm optimization-based algorithm requires 0.553506 s, highlighting a notable computational overhead. These findings suggest that while both optimization techniques enable secure encryption and decryption, the genetic optimization-based method demonstrates slightly superior efficiency in terms of execution time. This difference in computational performance may have implications for real-time security applications, where minimizing encryption and decryption times is crucial for maintaining system responsiveness.
TABLE 9: Performance evaluation of encryption and decryption times for different instances
In conclusion, the results presented in the tables and figures provide a comprehensive evaluation of the proposed method’s performance in encryption, decryption, and mutation processes. The method demonstrates high accuracy in decryption, robust security enhancements through ciphertext mutations, and scalability across varying message lengths. The effectiveness of the GA in optimizing encryption keys ensures a high degree of security while maintaining computational efficiency. These findings establish the proposed approach as a reliable and practical cryptographic solution suitable for diverse applications requiring secure communication and data protection.
The rapid advancement of quantum computing necessitates the development of robust post-quantum cryptographic solutions to replace traditional cryptographic methods that are vulnerable to quantum attacks. LBC has emerged as a leading candidate due to its resistance to quantum algorithms such as Shor’s. However, the practical implementation of LBC is hindered by challenges related to key size, memory requirements, and computational complexity. In this study, we introduced a GA-based optimization framework that enhances the key generation process in lattice-based encryption. The results demonstrate that the proposed GA-driven approach optimizes key structures, reducing computational overhead while maintaining high-security standards. By refining the selection, crossover, and mutation mechanisms, the GA ensures the generation of highly secure and computationally efficient encryption keys.
The findings of this study indicate that integrating GAs into PQC methodologies can significantly improve performance and resource efficiency. The optimized key generation process enhances encryption speed and reduces memory footprint, making LBC more viable for real-world applications, particularly in IoT and embedded systems with limited computational resources. Moreover, the GA-based approach introduces an additional layer of randomness and robustness, ensuring that cryptographic keys maintain high entropy and security resilience. However, GAs have inherent limitations. As a heuristic approach, GAs do not guarantee global optimality and rely on extensive parameter tuning (e.g., population size and mutation rate) to achieve desirable results.
For future research, further refinements can be made by exploring hybrid optimization techniques that integrate GAs with other metaheuristic approaches, such as particle swarm optimization or simulated annealing. In addition, implementing deep learning-driven GAs may further enhance the adaptability and performance of the proposed framework. Expanding the scope of experimental evaluations to real-world cryptographic applications, including secure communication protocols and blockchain security, will provide deeper insights into the practicality and scalability of the method. Furthermore, investigating hardware acceleration techniques, such as FPGA or GPU-based implementations, may improve computational efficiency for large-scale cryptographic operations. The proposed GA-optimized LBC framework can be further refined by addressing these avenues to meet the evolving demands of secure communication in the post-quantum era.
[1] F. Bova, A. Goldfarb and R. G. Melko. “Commercial applications of quantum computing”. EPJ Quantum Technology, vol. 8, no. 1, 2, 2021.
[2] M. R. Khan, K. Upreti, M. I. Alam, H. Khan, S. T. Siddiqui, M. Haque and J. Parashar. “Analysis of elliptic curve cryptography & RSA”. Journal of ICT Standardization, vol. 11, no. 4, pp. 355-378, 2023.
[3] F. Kappel and A. V. Kuntsevich. “An implementation of Shor's r-algorithm”. Computational Optimization and Applications, vol. 15, pp. 193-205, 2000.
[4] T. N. A. Al Attar, M. A. Mohammed and R. N. Mohammed. “Exploring post-quantum cryptography: Evaluating algorithm resilience against global quantum threats”. UHD Journal of Science and Technology, vol. 9, no. 1, pp. 18-28, 2025.
[5] A. Khalid, S. McCarthy, M. O'Neil and W. Liu. “Lattice-based Cryptography for IoT in a Quantum World: Are We Ready?” In: 2019 IEEE 8th International Workshop on Advances in Sensors and Interfaces (IWASI). IEEE, 2019.
[6] National Institute of Standards and Technology (NIST). Post-Quantum Cryptography Standardization. Available: https://csrc.nist.gov/projects/post-quantum-cryptography. [Last accessed on 2025 Apr 12].
[7] X. Wang, G. Xu and Y. Yu. “Lattice-based cryptography: A survey”. Chinese Annals of Mathematics, Series B, vol. 44, no. 6, pp. 945-960, 2023.
[8] J. Yao and V. Zimmer. “Cryptography”. In: Building Secure Firmware: Armoring the Foundation of the Platform. Springer, Cham, pp. 767-823, 2020.
[9] R. Chaudhary, G. S. Aujla, N. Kumar and S. Zeadally. “Lattice-based public key cryptosystem for internet of things environment: Challenges and solutions”. IEEE Internet of Things Journal, vol. 6, no. 3, pp. 4897-4909, 2018.
[10] M. Yasuda. “A Survey of Solving SVP Algorithms and Recent Strategies for Solving the SVP Challenge”. In: International Symposium on Mathematics, Quantum Theory, and Cryptography: Proceedings of MQC 2019. Springer, Singapore, 2021.
[11] R. Lindner and C. Peikert. “Better key sizes (and attacks) for LWE-based encryption”. In: Topics in Cryptology - CT-RSA 2011: The Cryptographers' Track at the RSA Conference 2011, Proceedings. Springer, San Francisco, CA, USA, 2011.
[12] S. Farooq, A. Altaf, F. Iqbal, E. B. Thompson, D. L. R. Vargas, I. de la Torre Díez and I. Ashraf. “Resilience optimization of post-quantum cryptography key encapsulation algorithms”. Sensors, vol. 23, no. 12, 5379, 2023.
[13] J. H. Holland. “Genetic algorithms”. Scientific American, vol. 267, no. 1, pp. 66-73, 1992.
[14] J. H. Holland. “Building blocks, cohort genetic algorithms, and hyperplane-defined functions”. Evolutionary Computation, vol. 8, no. 4, pp. 373-391, 2000.
[15] S. Farooq, A. Altaf, F. Iqbal, E. B. Thompson, D. L. R. Vargas, I. de la Torre Díez and I. Ashraf. “Resilience optimization of post-quantum cryptography key encapsulation algorithms”. Sensors (Basel), vol. 23, no. 12, 5379, 2023.
[16] M. A. Khan, S. Javaid, S. A. H. Mohsan, M. Tanveer and I. Ullah. “Future-proofing security for UAVs with post-quantum cryptography:A review”. IEEE Open Journal of the Communications Society, vol. 5, pp. 6849-6871, 2024.
[17] J. Xie, W. Zhao, H. Lee, D. B. Roy and X. Zhang. “Hardware circuits and systems design for post-quantum cryptography - A tutorial brief”. IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 71, no. 3, pp. 1670-1676, 2024.
[18] C. Näther, D. Herzinger, S. L. Gazdag, J. P. Steghöfer, S. Daum and D. Loebenberger. “Migrating software systems towards post-quantum cryptography - A systematic literature review”. IEEE Access, vol. 12, pp. 132107-132126, 2024.
[19] A. Aydeger, E. Zeydan, A. K. Yadav, K. T. Hemachandra and M. Liyanage. “Towards a Quantum-resilient Future: Strategies for Transitioning to Post-quantum Cryptography”. In: 2024 15th International Conference on Network of the Future (NoF). IEEE, 2024.
[20] H. Nejatollahi, N. Dutt, S. Ray, F. Regazzoni, I. Banerjee and R. Cammarota. “Post-quantum lattice-based cryptography implementations:A survey”. ACM Computing Surveys, vol. 51, no. 6, pp. 1-41, 2019.
[21] A. Kaushik, L. S. S. Vadlamani, M. M. Hussain, M. Sahay, R. Singh, A. K. Singh, S. Indu, P. Goswami and N. G. V. Kousik. “Post quantum public and private key cryptography optimized for IoT security”. Wireless Personal Communications, vol. 129, no. 2, pp. 893-909, 2023.
[22] R. Asif. “Post-quantum cryptosystems for Internet-of-Things:A survey on lattice-based algorithms”. IoT, vol. 2, no. 1, pp. 71-91, 2021.
[23] S. Koteshwara, M. Kumar and P. Pattnaik. “Performance Optimization of Lattice Post-Quantum Cryptographic Algorithms on Many-core Processors”. In: 2020 IEEE International Symposium on Performance Analysis of Systems and Software (ISPASS). IEEE, 2020.
[24] E. Alkim, L. Ducas, T. Pöppelmann and P. Schwabe. “Post-quantum Key Exchange - A New Hope”. In: 25th USENIX Security Symposium (USENIX Security 16). 2016.
[25] A. Gagnidze, M. Iavich, and G. Iashvili. “Analysis of post quantum cryptography use in practice”. Bulletin of the Georgian National Academy of Sciences, vol. 11, no. 2, pp. 29-36, 2017.
[26] Z. Liu, K. K. R. Choo and J. Grossschadl. “Securing edge devices in the post-quantum internet of things using lattice-based cryptography”. IEEE Communications Magazine, vol. 56, no. 2, pp. 158-162, 2018.
[27] C. A. Roma, C. E. A. Tai and M. A. Hasan. “Energy efficiency analysis of post-quantum cryptographic algorithms”. IEEE Access, vol. 9, pp. 71295-71317, 2021.
[28] M. Kumar and P. Pattnaik. “Post Quantum Cryptography (PQC) - An Overview”. In: 2020 IEEE High Performance Extreme Computing Conference (HPEC). IEEE, 2020.
[29] M. Balachandran, S. Devanathan, R. Muraleekrishnan and S. S. Bhagawan. “Optimizing properties of nanoclay-nitrile rubber (NBR) composites using face centred central composite design”. Materials & Design, vol. 35, pp. 854-862, 2012.