Cryptography is the process of transforming messages to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key: a higher level of secure communication depends on it. To increase the level of security in any communication, both parties must have a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. Encrypting the key enhances the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to satisfy the purpose of information security by adding a new level of security to the Triple Data Encryption Standard algorithm using the Nth Degree Truncated Polynomial Ring Unit algorithm. This aim is achieved by adding two new key functions, Enckey() for encrypting and Deckey() for decrypting the Triple Data Encryption Standard key, to make the algorithm stronger. The obtained results also show good resistance against brute-force attack, which makes the system more effective, by applying the Nth Degree Truncated Polynomial Ring Unit algorithm to encrypt and decrypt the Triple Data Encryption Standard key. These modifications also enhance the degree of complexity, increase the key search space, and make the ciphered message difficult for an attacker to crack.
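As a rough illustration of this key-wrapping idea, the sketch below encrypts a 3DES session key before the message itself is encrypted. Since NTRU implementations are not standardized in common Python libraries, RSA-OAEP (PyCryptodome) stands in here for the NTRU layer, and the Enckey()/Deckey() names follow the abstract; this is a minimal sketch of the structure, not the paper's implementation.

```python
# Key-wrapping sketch: the 3DES session key is itself encrypted before
# storage/transmission. The paper uses NTRU for this layer; RSA-OAEP is
# used below purely as a stand-in to show the Enckey()/Deckey() structure.
from Crypto.Cipher import DES3, PKCS1_OAEP
from Crypto.PublicKey import RSA
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

rsa_key = RSA.generate(2048)          # stand-in for an NTRU key pair

def Enckey(des3_key: bytes) -> bytes:
    """Encrypt the 3DES key with the public key (NTRU in the paper)."""
    return PKCS1_OAEP.new(rsa_key.publickey()).encrypt(des3_key)

def Deckey(wrapped: bytes) -> bytes:
    """Recover the 3DES key with the private key."""
    return PKCS1_OAEP.new(rsa_key).decrypt(wrapped)

session_key = DES3.adjust_key_parity(get_random_bytes(24))  # 3DES key
wrapped_key = Enckey(session_key)                           # key ciphertext

# Normal 3DES encryption of the message itself
iv = get_random_bytes(8)
ct = DES3.new(session_key, DES3.MODE_CBC, iv).encrypt(pad(b"secret message", 8))

# Receiver side: unwrap the key, then decrypt the message
pt = unpad(DES3.new(Deckey(wrapped_key), DES3.MODE_CBC, iv).decrypt(ct), 8)
assert pt == b"secret message"
```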
This article deals with estimation of system reliability for one-component, two-component, and s-out-of-k stress-strength system models with non-identical component strengths subjected to a common stress, using the Exponentiated Exponential distribution with a common scale parameter. Based on simulation, comparison studies are made between the ML, PC, and LS estimators of these system reliabilities when the scale parameter is known.
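As a point of reference for the one-component model, the following minimal sketch (assumed parameter values, not the paper's ML/PC/LS estimators) estimates R = P(X > Y) by Monte Carlo for Exponentiated Exponential strength and stress with a common scale, where inverse-CDF sampling uses F(x) = (1 - e^(-lam*x))^alpha.

```python
# Monte Carlo sketch of the one-component stress-strength reliability
# R = P(X > Y) when strength X ~ EE(alpha1, lam) and stress Y ~
# EE(alpha2, lam) share a common scale lam.
import numpy as np

rng = np.random.default_rng(0)

def rvs_ee(alpha, lam, size):
    """Draw Exponentiated Exponential variates by inverting the CDF:
    x = -log(1 - u**(1/alpha)) / lam."""
    u = rng.random(size)
    return -np.log1p(-u ** (1.0 / alpha)) / lam

alpha_strength, alpha_stress, lam = 2.5, 1.0, 0.8
x = rvs_ee(alpha_strength, lam, 1_000_000)   # component strength
y = rvs_ee(alpha_stress, lam, 1_000_000)     # common stress

r_mc = np.mean(x > y)
# With a common scale, R has the closed form alpha1 / (alpha1 + alpha2),
# which the simulation should reproduce to a few decimal places.
r_exact = alpha_strength / (alpha_strength + alpha_stress)
print(f"Monte Carlo R = {r_mc:.4f}, exact R = {r_exact:.4f}")
```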
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding. Signal compression is based on selecting a small number of approximation coefficients produced by the wavelet decomposition (Haar and db4) at a suitably chosen level while ignoring the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and predictor error. The compressed files contain the LP coefficients and the previous sample, and are very small compared to the size of the original signals. The compression ratio is calculated from the size of th…
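A minimal sketch of this pipeline, assuming illustrative frame sizes, decomposition level, and LP order (the paper's exact settings are not reproduced here):

```python
# Compression pipeline sketch: wavelet approximation coefficients + LPC.
import numpy as np
import pywt

def levinson_durbin(r, order):
    """Levinson-Durbin recursion on autocorrelation r: returns the LP
    coefficients a (with a[0] = 1), the reflection coefficients k, and
    the final prediction error."""
    a = np.zeros(order + 1); a[0] = 1.0
    k = np.zeros(order)
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k[i - 1] = -acc / err
        a[1:i + 1] += k[i - 1] * np.concatenate((a[i - 1:0:-1], [1.0]))
        err *= 1.0 - k[i - 1] ** 2
    return a, k, err

speech = np.random.randn(4096)        # stand-in for a speech frame

# 1) Wavelet decomposition: keep only the approximation coefficients,
#    discard the detail coefficients (the lossy step).
approx = pywt.wavedec(speech, 'db4', level=3)[0]

# 2) Rectangular window (identity here) and autocorrelation of the frame.
order = 10
r = np.correlate(approx, approx, mode='full')[len(approx) - 1:]

# 3) LP analysis: these few coefficients are what gets stored.
lp, refl, pred_err = levinson_durbin(r, order)
print(f"kept {len(approx)} of {len(speech)} samples, {order} LP coefficients")
```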
There is no doubt that the project control function is very important for administration, as project management depends on it to monitor and control the project. Project control is integrated with planning, which is the basis of the administrative functions: planning, organizing, directing, and controlling. Without project control, fulfilment of the project plan within the budget and specified time cannot be ensured. Project management applies many control methods to achieve the project goals of cost, time, and required specifications. Earned Value Management is one of the control methods used in projects by international companies.
The Earned Value Method is used in the project o…
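For context, the standard Earned Value Management indicators can be computed as below; the figures are illustrative, not data from any project in the paper.

```python
# Standard EVM indicators: PV is the planned value, EV the earned value
# (budgeted cost of work performed), AC the actual cost, BAC the budget
# at completion.
def evm_indicators(pv, ev, ac, bac):
    return {
        "cost variance (CV = EV - AC)":      ev - ac,
        "schedule variance (SV = EV - PV)":  ev - pv,
        "cost performance index (CPI)":      ev / ac,
        "schedule performance index (SPI)":  ev / pv,
        "estimate at completion (BAC/CPI)":  bac / (ev / ac),
    }

# Example: against a 200,000 budget, 90,000 of work was planned so far,
# 80,000 was actually earned, and 100,000 was spent.
for name, value in evm_indicators(pv=90_000, ev=80_000, ac=100_000,
                                  bac=200_000).items():
    print(f"{name}: {value:,.2f}")
```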
The objective of all planning research is to plan for human comfort and safety, and one of the most significant natural dangers to which humans are exposed is earthquake risk; therefore, earthquake risks must be anticipated, and with the advancement of global technology it is possible to obtain information on earthquake hazards. GIS has been utilized extensively in environmental assessment research due to its high potential, and it is a crucial application in seismic risk assessment. This paper examines the methodologies used in recent GIS-based seismic risk studies, their primary environmental impacts on urban areas, and the complexity of the relationship between the applied methodological approaches and the resulting env…
Copper and its alloys and composites (copper being the matrix) are broadly used in electronic as well as bearing materials due to their excellent thermal and electrical conductivities.
In this study, the powder metallurgy technique was used to produce a copper-graphite composite with three volume percentages of graphite. The processing parameters selected were a sintering temperature of 900 °C and a holding time of 90 minutes for samples heated in an inert atmosphere (argon gas). Wear test results showed a pronounced improvement in wear resistance as the percentage of graphite increased, since graphite acts as a solid lubricant (the wear rate decreased by about 88% compared with pure Cu). Microhardness and…
Soil stabilization with stone powder is a good solution for the construction of subgrades for roadways and railway lines, especially under the platforms and mostly in transition zones between embankments and rigid structures, where the mechanical properties of the supporting soils are very influential. Stone powder often has a unique composition, which justifies research into the feasibility of using this type of stone powder for ground improvement applications. This paper presents results from a comprehensive laboratory study carried out to investigate the feasibility of using stone powder to improve the engineering properties of clays.
The stone powder contains bassanite (CaSO4·½H2O)…
Feature selection, a method of dimensionality reduction, is the collection of appropriate feature subsets from the total number of features. In this paper, a point-by-point review of feature selection and its appraisal techniques is presented. The discussion begins with a straightforward approach, considering the handling of features and selection issues based on meta-heuristic strategies; these techniques help in obtaining the best feature subsets. Thereafter, the paper discusses some system models derived naturally from the environment, and calculations are performed to take care of the prefe…
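As one concrete instance of this family of strategies, the sketch below runs a simple bit-flip local search over feature subsets (a basic hill climb; the paper surveys meta-heuristics generally, and this exact code is only an illustration), scoring each subset by cross-validated accuracy.

```python
# Bit-flip local search over feature subsets: flip one feature in or out,
# keep the flip if cross-validated accuracy does not decrease.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)

def score(mask):
    """Fitness of a feature subset = cross-validated accuracy."""
    if not mask.any():
        return 0.0
    return cross_val_score(KNeighborsClassifier(), X[:, mask], y, cv=3).mean()

mask = rng.random(X.shape[1]) < 0.5        # random initial subset
best = score(mask)
for _ in range(100):                       # local search: one flip at a time
    j = rng.integers(X.shape[1])
    mask[j] = ~mask[j]
    trial = score(mask)
    if trial >= best:
        best = trial                       # keep the improving flip
    else:
        mask[j] = ~mask[j]                 # revert the worsening flip

print(f"selected {mask.sum()} of {X.shape[1]} features, accuracy {best:.3f}")
```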
Cutaneous Leishmaniasis (CL) is an endemic disease and one of the major health problems in Iraq. Leishmania tropica is known as the causative agent of Cutaneous Leishmaniasis in Baghdad. The classical serological methods of diagnosing leishmaniasis have poor sensitivity, especially for the subgenus, and are time consuming. Here we investigated two primer pairs, one specific for Leishmania as a genus and one specific for the species L. tropica, for detection by polymerase chain reaction (PCR). Samples were collected from AL-karama Teaching Hospital, and whole genomic DNA was extracted from axenic promastigotes. The extracted DNA was amplified by PCR with two kDNA primer pairs, genus-specific (13A/13B) and (Lmj4/Uni21), to identify…
This research aimed to study the osmotic efficiency of draw solutions and the factors affecting the performance of the forward osmosis process. The draw solutions used were magnesium sulfate hydrate (MgSO4.7H2O), potassium chloride (KCl), calcium chloride (CaCl2), and ammonium bicarbonate (NH4HCO3). It was found that water flux increases with increasing draw solution concentration and feed solution flow rate, and decreases with increasing draw solution flow rate and feed solution concentration. It was also found that the efficiency of the draw solutions is in the following order:
CaCl2 > KCl > NH4HCO3 > MgSO4.7H2O
This paper focuses on reducing the time for text-processing operations by taking advantage of enumerating each string using a multi-hashing methodology. Text analysis is an important subject for any system that deals with strings (sequences of characters from an alphabet) and text processing (e.g., word processors, text editors, and other text manipulation systems). Many problems arise when dealing with string operations that involve an unfixed number of characters (e.g., the execution time); this is due to the overhead of embedded operations (such as symbol matching and conversion operations). The execution time largely depends on the string's characteristics, especially its length (i.e., the number of characters consisting…
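A minimal sketch of the enumeration idea, assuming a polynomial rolling hash with two independent moduli (the exact multi-hashing scheme in the paper may differ):

```python
# Each string is enumerated once into a pair of integers (two independent
# polynomial hashes, so accidental collisions are negligible), after which
# equality tests are O(1) integer comparisons instead of character-by-
# character matching.
MODS  = (1_000_000_007, 998_244_353)   # two large primes
BASES = (131, 137)

def enumerate_string(s: str) -> tuple[int, int]:
    """Fold the characters of s into two independent hash values."""
    h = [0, 0]
    for ch in s:
        for i in range(2):
            h[i] = (h[i] * BASES[i] + ord(ch)) % MODS[i]
    return tuple(h)

# Precompute the enumeration once per string...
words = ["transform", "transforms", "transform"]
codes = [enumerate_string(w) for w in words]

# ...then matching no longer depends on string length.
print(codes[0] == codes[1])   # False: different strings
print(codes[0] == codes[2])   # True:  identical strings
```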