This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying this method to the plain text (original message): the intelligible plain text is transformed into unintelligible ciphertext in order to secure information from unauthorized access and theft. The encryption scheme uses a pseudo-random encryption key generated by an algorithm, and all of this is done using the Pascal matrix. Encryption and decryption are implemented in MATLAB as the programming language, and Notepad++ is used to write the input text.
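The abstract names the Pascal matrix as the encryption key but gives no implementation details. A minimal Python sketch of one plausible reading follows; the block size, zero padding, and the choice of the lower-triangular Pascal matrix are assumptions for illustration, not the paper's specification (the paper itself uses MATLAB). The useful property is that the Pascal matrix has determinant 1, so its inverse is also an integer matrix and decryption is exact.

```python
from math import comb
import numpy as np

def pascal_matrix(n):
    # Lower-triangular Pascal matrix: P[i][j] = C(i, j).
    # det(P) = 1, so P has an exact integer inverse.
    return np.array([[comb(i, j) for j in range(n)] for i in range(n)],
                    dtype=np.int64)

def encrypt(text, n=4):
    codes = [ord(ch) for ch in text]        # works for Arabic or English code points
    codes += [0] * (-len(codes) % n)        # zero-pad to a multiple of n (assumed scheme)
    P = pascal_matrix(n)
    blocks = np.array(codes, dtype=np.int64).reshape(-1, n).T
    return P @ blocks                       # ciphertext as an integer matrix

def decrypt(cipher, n=4):
    P = pascal_matrix(n)
    Pinv = np.rint(np.linalg.inv(P)).astype(np.int64)  # exact because det(P) = 1
    codes = (Pinv @ cipher).T.flatten()
    return ''.join(chr(c) for c in codes if c != 0)    # drop the zero padding
```

Round-tripping any short message through `encrypt` and `decrypt` recovers the original text; the matrix multiplication mixes each character with its neighbors, which is what makes the ciphertext unintelligible.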
The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data are skewed, estimating the parameters and calculating the reliability function in the presence of skewness requires a distribution that is flexible enough to handle such data. This is the case for the data of Diyala Company for Electrical Industries, where positive skewness was observed in the data collected from the Power and Machinery Department. This required a distribution suited to those data, and a search for methods that accommodate this problem and lead to accurate estimates of the reliability function.
The regression analysis process is used to study and predict the surface response using design of experiments (DOE), as well as to calculate roughness by developing a mathematical model. In this study, response surface methodology and the particular solution technique are used. Design of experiments applies a series of structured statistical analytic approaches to investigate the relationship between some parameters and their responses. Surface roughness is one of the important parameters and plays an important role. It was also found that the cutting speed has only a small effect on surface roughness. This work focuses on all considerations needed to capture the interaction between the parameters (position of influenc
Background: Generally, genetic disorders are a leading cause of spontaneous abortion, neonatal death, and increased morbidity and mortality in children and adults alike. They are a significant health-care and psychosocial burden for the patient, the family, the health-care system, and the community as a whole. Chromosomal abnormalities occur much more frequently than is generally appreciated: it is estimated that approximately 1 in 200 newborn infants has some form of chromosomal abnormality. The figure is much higher in fetuses that do not survive to term; it is estimated that in 50% of first-trimester abortions, the fetus has a chromosomal abnormality. Aim of the study: This study aims to shed some light on the results of chromosomal studies per
Iraq has seen many changes at the social, economic, and political levels. These changes have caused many shifts in the structure of its society and imposed great challenges, reflected in the behavior and awareness of that society in general and of its youth in particular.
Those changes subjected Iraqi society to transformations in values and culture that formed a political awareness and produced cultural and political diversity within the family and society. A greater openness to the outside world, driven by the communication revolution the world has witnessed during the past two decades, helped bring about that change. Iraq had its share of media and political openness, which followed the US occupation in 2003. As a re
A simple, accurate, and sensitive spectrophotometric method for the determination of procaine penicillin (PP) is described. The method is based on the charge-transfer reaction of PP with metol (N-methyl-p-hydroxyaniline) in the presence of ferric sulphate to form a purple, water-soluble complex, which is stable and has a maximum absorption at 510 nm. A graph of absorbance versus concentration shows that Beer's law is obeyed over the concentration range of 3-80 µg/ml of PP (i.e., 3-80 ppm), with a molar absorptivity of 4.945 × 10³ L·mol⁻¹·cm⁻¹, a Sandell sensitivity of 0.1190 µg·cm⁻², a relative error of −1.57 to 2.79%, and a standard deviation of less than 0.59, depending on the concentration of PP. The optimum conditions for full co
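The quantitation underlying the abstract is Beer's law, A = ε·b·c, with the molar absorptivity ε = 4.945 × 10³ L·mol⁻¹·cm⁻¹ reported above. A small sketch of how an unknown concentration would be recovered from a measured absorbance; the 1 cm path length and the absorbance value 0.25 are assumed for illustration, not taken from the paper:

```python
# Beer's law: A = epsilon * b * c  =>  c = A / (epsilon * b)
epsilon = 4.945e3   # molar absorptivity, L mol^-1 cm^-1 (from the abstract)
b = 1.0             # optical path length in cm (assumed standard cuvette)
A = 0.25            # hypothetical measured absorbance at 510 nm

c_molar = A / (epsilon * b)   # concentration of the PP complex in mol/L
```

Converting `c_molar` to the µg/ml scale used in the abstract's calibration range would additionally require the molar mass of procaine penicillin, which the excerpt does not state.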
Perchloroethylene (PERC) is commonly used as a dry-cleaning solvent and is attributed many deleterious effects on biological systems. The study aimed to investigate the harmful effects associated with PERC exposure among dry-cleaning workers. The study was carried out on 58 adults in two groups: a PERC-exposed group of thirty-two male dry-cleaning workers using PERC as a dry-cleaning solvent, and twenty-six healthy non-exposed subjects. The history of PERC exposure, use of personal protective equipment (PPE), and safety measures of the exposed group were recorded. A blood sample was taken from each participant for measurement of hematological markers and liver and kidney function tests. The results showed that 28.1% of the workers were usin
This paper introduces some properties of separation axioms called α-feeble regular and α-feeble normal spaces (which are weaker than the usual axioms), defined using elements of a graph, which are the essential parts of the α-topological spaces studied here. It also presents some related concepts, studies their properties, and establishes some relationships between them.
The knowledge related to lexical items can be understood as including relations of meaning across words. Words that share a similarity of meaning are said to be synonymous, and words with contrary meanings are said to be antonymous. Both are universal linguistic phenomena that exist in the linguistic system of every language. The present study aims at finding out the areas of difficulty that Iraqi EFL learners encounter in the use of synonymy and antonymy, on both the recognition and production levels. It also tries to detect the main reasons behind such difficulties. A diagnostic test of two parts, namely recognition and production, is designed. The test is built to cover two linguistic phenomena: synony
Meloxicam (MLX) is a non-steroidal anti-inflammatory, poorly water-soluble, highly permeable drug, and the rate of its oral absorption is often controlled by its dissolution rate in the gastrointestinal tract. Solid dispersion (SD) is an effective technique for enhancing the solubility and dissolution rate of such a drug.
The present study aims to enhance the solubility and dissolution rate of MLX by the SD technique, using the solvent evaporation method with sodium alginate (SA), hyaluronic acid (HA), collagen, and xyloglucan (XG) as gastro-protective hydrophilic natural polymers.
Twelve formulas were prepared at different drug:polymer ratios and evaluated for their percentage yield, drug content, water so
Optimizing system performance in dynamic, heterogeneous environments and managing computational tasks efficiently are crucial. This paper therefore examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across various workloads obtained by varying the task-to-node ratio. The paper identifies finish time and deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behaviors of these algorithms across different workloads was carried out. Results from the experiment
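To make the comparison concrete, here is a minimal Python sketch of one of the five metaheuristics named above, Simulated Annealing, applied to task-to-node assignment with finish time (makespan) as the objective. The neighborhood move (reassigning one task), the linear cooling schedule, and all parameter values are illustrative assumptions, not the paper's experimental setup:

```python
import math
import random

def makespan(assignment, task_cost, num_nodes):
    # Finish time = completion time of the busiest node.
    loads = [0.0] * num_nodes
    for task, node in enumerate(assignment):
        loads[node] += task_cost[task]
    return max(loads)

def simulated_annealing(task_cost, num_nodes, steps=2000, t0=10.0, seed=0):
    rng = random.Random(seed)
    current = [rng.randrange(num_nodes) for _ in task_cost]
    cost = makespan(current, task_cost, num_nodes)
    best, best_cost = current[:], cost
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9          # linear cooling schedule
        cand = current[:]
        cand[rng.randrange(len(cand))] = rng.randrange(num_nodes)  # move one task
        cand_cost = makespan(cand, task_cost, num_nodes)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if cand_cost <= cost or rng.random() < math.exp((cost - cand_cost) / t):
            current, cost = cand, cand_cost
            if cost < best_cost:
                best, best_cost = current[:], cost
    return best, best_cost
```

For example, six tasks with costs [4, 3, 3, 2, 2, 2] on two nodes have a lower bound of 8 on the makespan (total work 16 split over 2 nodes), and the annealer typically reaches it. GA, PSO, ACO, and FA would plug different search operators into the same objective function.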