In this paper, a new method is investigated that uses evolutionary algorithms (EAs) to cryptanalyze one of the nonlinear stream cipher cryptosystems that depends on the Linear Feedback Shift Register (LFSR) unit, using a ciphertext-only attack. A Genetic Algorithm (GA) and Ant Colony Optimization (ACO) are used to attack a nonlinear cryptosystem called the "shrinking generator", with different lengths of ciphertext and different lengths of the combined LFSRs. GA and ACO showed good performance in finding the initial values of the combined LFSRs. This work can be considered a warning for stream cipher designers to avoid the weak points that may exist in a stream cipher and may be exploited by cryptanalysts. The method can find the optimal solution for texts as short as 20 characters, and 100 iterations were sufficient to recover the true initial values of the keystream.
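The GA side of such an attack can be illustrated with a minimal sketch. The register length, tap positions, and GA parameters below are illustrative assumptions, not the paper's configuration, and the sketch recovers the seed of a single LFSR from a known keystream segment rather than running the full ciphertext-only attack on the shrinking generator:

```python
import random

N = 8                  # register length (illustrative, not the paper's)
TAPS = (0, 2, 3, 4)    # feedback tap positions (assumed)

def lfsr_stream(seed, length):
    """Fibonacci LFSR: emit the low bit, shift, feed back the XOR of the taps."""
    state = [(seed >> i) & 1 for i in range(N)]
    out = []
    for _ in range(length):
        out.append(state[0])
        fb = 0
        for t in TAPS:
            fb ^= state[t]
        state = state[1:] + [fb]
    return out

def fitness(candidate, target):
    """Fraction of target keystream bits reproduced by a candidate seed."""
    ks = lfsr_stream(candidate, len(target))
    return sum(a == b for a, b in zip(ks, target)) / len(target)

def ga_search(target, pop_size=60, generations=100, mut_rate=0.3):
    """Simple GA: elitism, one-point crossover on seed bits, bit-flip mutation."""
    pop = [random.randrange(1, 2 ** N) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, target), reverse=True)
        if fitness(pop[0], target) == 1.0:
            break
        nxt = pop[:10]                                   # elitism
        while len(nxt) < pop_size - 10:
            a, b = random.sample(pop[:30], 2)            # parents from the fittest
            cut = random.randrange(1, N)
            mask = (1 << cut) - 1
            child = (a & mask) | (b & ~mask & (2 ** N - 1))
            if random.random() < mut_rate:
                child ^= 1 << random.randrange(N)        # bit-flip mutation
            nxt.append(child or 1)                       # avoid the all-zero seed
        nxt += [random.randrange(1, 2 ** N) for _ in range(10)]  # random immigrants
        pop = nxt
    return max(pop, key=lambda c: fitness(c, target))
```

The fitness function rewards seeds whose output prefix matches the recovered keystream, and elitism keeps the best candidate found so far across generations.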
Objectives: The strategies of tissue engineering have led to the development of living cell-based therapies to repair lost or damaged tissues, including the periodontal ligament, and to construct biohybrid implants. This work aimed to isolate human periodontal ligament stem cells (hPDLSCs) and implant them on fabricated polycaprolactone (PCL) for the regeneration of natural periodontal ligament (PDL) tissues. Methods: hPDLSCs were harvested from extracted human premolars, cultured, and expanded to obtain PDL cells. A PDL-specific marker (periostin) was detected using an immunofluorescent assay. Electrospinning was applied to fabricate PCL at three concentrations (13%, 16%, and 20% weight/volume) in two forms, which were examined through field emission
Objective: The aim of this study is to investigate and determine the effect of using two packing techniques (conventional and a new tension technique) on the hardness of two types of heat-cure acrylic resin (Ivoclar and Qual Dental).
Methodology: This study used two types of heat-cure acrylic (Ivoclar and Qual Dental), both used in the construction of complete dentures, packed with the two different packing techniques (conventional and new tension). A total of 40 specimens were prepared with dimensions of 2 mm thickness, 2 cm length, and 1 cm width. These specimens were sectioned and subdivided into four groups of 10 specimens each, then labeled as (A, A1, B
This work implements a face recognition system based on two stages: a feature extraction stage and a classification stage. The feature extraction stage consists of Self-Organizing Maps (SOMs) in a hierarchical format in conjunction with Gabor filters and local image sampling. Different types of SOMs were used, and the results from these SOMs were compared.
The next stage is the classification stage, which consists of a self-organizing map neural network; the goal of this stage is to find the image most similar to the input image. The proposed method was implemented using C++ packages; this work is a successful classifier for a face database consisting of 20
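The SOM training that underlies both stages can be sketched in a few lines. The grid size, decay schedules, and Gaussian neighborhood below are generic textbook choices for illustration, not the paper's configuration (which additionally uses Gabor filters, local image sampling, and a hierarchical arrangement):

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Train a 2-D self-organizing map on row-vector samples."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.standard_normal((h, w, data.shape[1])) * 0.1
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    steps, total = 0, epochs * len(data)
    for _ in range(epochs):
        for x in data[rng.permutation(len(data))]:
            frac = steps / total
            lr = lr0 * (1.0 - frac)                 # decaying learning rate
            sigma = sigma0 * (1.0 - frac) + 0.5     # shrinking neighborhood radius
            d = np.linalg.norm(weights - x, axis=-1)
            best = np.unravel_index(np.argmin(d), d.shape)   # best-matching unit
            g = np.exp(-np.sum((coords - best) ** 2, axis=-1) / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)     # pull BMU neighborhood
            steps += 1
    return weights

def bmu(weights, x):
    """Grid coordinates of the best-matching unit for a feature vector x."""
    d = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)
```

Classification then amounts to comparing the best-matching units (or quantized features) of a query face with those of the stored faces.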
Variable selection is an essential and necessary task in statistical modeling. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question a researcher needs to ask is which variables are most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined, and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulation datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage
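The core mechanism the method builds on, Gibbs sampling, draws each parameter in turn from its full conditional given the others. A minimal illustration (not the paper's variable-selection model) samples a standard bivariate normal with correlation rho from its two conditionals:

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples, burn_in=500, seed=42):
    """Gibbs sampler for a standard bivariate normal with correlation rho:
    alternately draw each coordinate from its full conditional given the other."""
    random.seed(seed)
    x, y = 0.0, 0.0
    sd = math.sqrt(1.0 - rho ** 2)      # conditional standard deviation
    samples = []
    for i in range(burn_in + n_samples):
        x = random.gauss(rho * y, sd)   # x | y ~ N(rho*y, 1 - rho^2)
        y = random.gauss(rho * x, sd)   # y | x ~ N(rho*x, 1 - rho^2)
        if i >= burn_in:                # discard the burn-in portion of the chain
            samples.append((x, y))
    return samples
```

After burn-in, the retained draws approximate the joint posterior; a variable-selection sampler applies the same alternation to coefficients and inclusion indicators.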
In this work, the emission spectra and atomic structure of an aluminum target have been studied theoretically using the Cowan code. The Cowan code was used to calculate electron transitions between atomic configurations, including configuration interaction, with the Hartree-Fock method. The aluminum target can give a good emission spectrum in the XUV region at 10 nm with an oscillator strength of 1.82.
The hydrodynamic properties of laser-produced plasma (LPP) were investigated for the purpose of creating a light source working in the EUV region. Such a light source is very important for lithography (semiconductor manufacturing). The improved MEDUSA (Med103) code can calculate the plasma hydrodynamic properties (velocity, electron density,
This paper describes a research effort that aims at developing solar housing models suitable for the Arabian region, since the Arabian Peninsula enjoys very high levels of solar radiation.
The current paper focuses on achieving energy efficiency through utilizing solar energy and conserving energy. This task can be accomplished by implementing the major elements related to energy efficiency in housing design, such as adopting an optimum photovoltaic system orientation to maximize the captured solar energy and produce solar electricity. All precautions were taken to minimize energy consumption while providing suitable air conditioning to the inhabitants of the solar house, in addition to the use of energy effici
The term "tight reservoir" commonly refers to reservoirs with low permeability. Tight oil reservoirs have caused concern owing to their considerable influence on oil output throughout the petroleum sector. Because of their low permeability, producing from tight reservoirs presents numerous challenges. The aim of this research is to perform a hydraulic fracturing treatment in a single vertical well in order to study the feasibility of fracking in the Saady reservoir. The Saady B reservoir in Iraq's Halfaya oil field is its most important tight reservoir. The diagnostic fracture injection test is determined for HF55 using GOHFER soft
In this work, the study of
A database is characterized as an arrangement of data that is organized and distributed in a way that allows the client to access the stored data in a simple and convenient manner. However, in the era of big data, traditional methods of data analytics may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. This approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
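The Map-Reduce pattern used here can be sketched in miniature: a mapper emits (channel, (partial_sum, count)) pairs for each chunk of samples, and a reducer merges the partial sums into per-channel averages. The channel names and values are invented for illustration; the actual work runs the pattern on a Hadoop server rather than in-process:

```python
from collections import defaultdict

def map_chunk(chunk):
    """Mapper: emit (channel, (partial_sum, count)) pairs for one chunk of samples."""
    partial = defaultdict(lambda: [0.0, 0])
    for channel, value in chunk:
        partial[channel][0] += value
        partial[channel][1] += 1
    return [(ch, (s, c)) for ch, (s, c) in partial.items()]

def reduce_pairs(mapped):
    """Reducer: merge partial (sum, count) pairs into per-channel averages."""
    totals = defaultdict(lambda: [0.0, 0])
    for pairs in mapped:
        for ch, (s, c) in pairs:
            totals[ch][0] += s
            totals[ch][1] += c
    return {ch: s / c for ch, (s, c) in totals.items()}

# Toy run: two chunks of (channel, sample) pairs, as they might be split across nodes.
chunks = [[("C3", 1.0), ("C4", 2.0)], [("C3", 3.0)]]
averages = reduce_pairs(map(map_chunk, chunks))
# averages == {"C3": 2.0, "C4": 2.0}
```

Because each mapper sees only its own chunk and the reducer only combines associative (sum, count) pairs, the same computation parallelizes across cluster nodes.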