Advanced Persistent Threats (APTs) remain a constant concern for information security. Many approaches have been used to detect APT attacks, such as change control, sandboxing, and network traffic analysis, yet none of them achieves complete detection. Current studies show that APTs adopt many complex techniques to evade every type of detection. This paper describes and analyzes the APT problem through the most common techniques, tools, and pathways used by attackers. In addition, it highlights the strengths and weaknesses of the existing security solutions deployed from the time the threat was identified in 2006 until 2019. Furthermore, this research proposes a new framework that can be used …
Modern ciphers are among the more difficult cipher systems to break because of their high security, high speed, and absence of error propagation. One of the most important weaknesses of a stream cipher is a match, or correlation, between the output key-stream and the output of its shift registers.
This work considers new investigation methods for the cryptanalysis of stream ciphers using a ciphertext-only attack based on Particle Swarm Optimization (PSO) for automatic extraction of the key. It also introduces a PSO-based cryptanalysis system and suggests enhancing the performance of PSO with Simulated Annealing (SA). Additionally, it presents a comparison of the cryptanalysis …
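As a rough illustration of the PSO-plus-SA idea, the Python sketch below minimises a placeholder fitness function with a standard particle swarm and adds a simulated-annealing acceptance rule for the swarm attractor; the fitness function, parameter values, and key encoding are assumptions and do not reproduce the paper's actual cryptanalysis objective.

```python
import math
import random

def pso_sa(fitness, dim, n_particles=30, iters=200,
           w=0.7, c1=1.5, c2=1.5, t0=1.0, cooling=0.95):
    """Minimise `fitness` over [0, 1]^dim with particle swarm optimisation.
    A simulated-annealing rule occasionally accepts a worse point as the
    swarm attractor, which helps the swarm escape local optima."""
    pos = [[random.random() for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    attractor, attractor_val = pbest[g][:], pbest_val[g]
    best, best_val = attractor[:], attractor_val   # best solution ever seen
    temp = t0
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (attractor[d] - pos[i][d]))
                pos[i][d] = min(1.0, max(0.0, pos[i][d] + vel[i][d]))
            val = fitness(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
            if val < best_val:
                best, best_val = pos[i][:], val
            # SA-style acceptance: a worse point may still become the attractor
            if val < attractor_val or random.random() < math.exp(-(val - attractor_val) / temp):
                attractor, attractor_val = pos[i][:], val
        temp *= cooling
    return best, best_val

# Placeholder fitness standing in for a keystream/ciphertext correlation score.
key, score = pso_sa(lambda x: sum((xi - 0.5) ** 2 for xi in x), dim=8)
print(score)
```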
Cipher security is an important step when transmitting sensitive information through networks, and cryptographic algorithms play a major role in providing security and resisting attacks. In this work, two hybrid cryptosystems are proposed that combine a modification of the symmetric Playfair cipher, called the modified Playfair cipher, with two modifications of the asymmetric RSA cryptosystem, called the square RSA technique and the square RSA with Chinese remainder theorem technique. The proposed hybrid cryptosystems have two layers of encryption and decryption: in the first layer, the plaintext is encrypted using the modified Playfair cipher, and the resulting ciphertext is then encrypted using the squared …
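As a minimal sketch of the asymmetric layer only, the toy Python code below shows textbook RSA encryption with Chinese-remainder-theorem decryption; the tiny primes are for illustration, and neither the modified Playfair layer nor the paper's "square RSA" variants are reproduced here.

```python
# Toy parameters only (hypothetical); a real deployment would use large primes.
p, q, e = 61, 53, 17
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)                 # private exponent (Python 3.8+ modular inverse)

def encrypt(m):
    """Asymmetric layer: encrypt a (Playfair-produced) integer symbol m < n."""
    return pow(m, e, n)

def decrypt_crt(c):
    """Decrypt with the Chinese remainder theorem: work mod p and mod q, then recombine."""
    dp, dq = d % (p - 1), d % (q - 1)
    q_inv = pow(q, -1, p)
    mp, mq = pow(c, dp, p), pow(c, dq, q)
    h = (q_inv * (mp - mq)) % p
    return mq + h * q

c = encrypt(65)                     # 65 stands in for one symbol of the first-layer output
assert decrypt_crt(c) == 65
```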
Diabetes is considered by the World Health Organization (WHO) a major global health problem. In recent years, the incidence of Type II diabetes mellitus has increased significantly due to metabolic disorders caused by malfunctioning insulin secretion. It can lead to various complications, such as kidney failure, stroke, heart attack, nerve damage, and retinal damage. Therefore, early diagnosis and classification of Type II diabetes is important to support physicians' assessments.
The proposed model is based on a multilayer neural network and uses a dataset of Iraqi diabetes patients obtained from the Specialized Center for Endocrine Glands and Diabetes Diseases. The investigation includes 282 samples …
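A hedged sketch of this kind of multilayer-network classifier, using scikit-learn on synthetic stand-in data; the features, labels, and architecture are assumptions and do not represent the Iraqi patient dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for the clinical data: 282 samples (as in the abstract),
# 8 random features and a rule-based label; none of this is the real dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(282, 8))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```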
Today the Genetic Algorithm (GA), which is based on the laws of nature, tops the standard algorithms in solving complex nonlinear equations. However, premature convergence is one of GA's most significant drawbacks, since it increases the number of iterations needed to reach a global optimum. To address this shortcoming, this paper proposes a new GA based on chaotic systems. Within the GA processes, the logistic map and a Linear Feedback Shift Register (LFSR) generate chaotic values that replace the random values otherwise required at each step. The Chaos Genetic Algorithm (CGA) avoids local convergence more often than the traditional GA because of its greater diversity. The concept is to use chaotic sequences with the LFSR to generate …
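The sketch below illustrates the two chaotic sources mentioned, a logistic map and a small Fibonacci LFSR, being drawn from in place of a standard random generator inside GA steps; the map parameters, register width, and tap positions are illustrative choices only.

```python
def logistic_map(x0=0.63, r=3.99):
    """Chaotic real-valued stream in (0, 1), used in place of random.random()."""
    x = x0
    while True:
        x = r * x * (1.0 - x)
        yield x

def lfsr_bits(seed=0b1011011, taps=(7, 6), width=7):
    """Bit stream from a Fibonacci LFSR (x^7 + x^6 + 1), e.g. for mutation masks."""
    state, mask = seed, (1 << width) - 1
    while True:
        feedback = 0
        for t in taps:
            feedback ^= (state >> (t - 1)) & 1
        state = ((state << 1) | feedback) & mask
        yield feedback

chaos, bits = logistic_map(), lfsr_bits()
crossover_point = int(next(chaos) * 8)   # replaces random.randint(0, 7)
mutate_this_gene = next(bits)            # replaces a random coin flip
```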
The exponential distribution is one of the most common distributions in scientific research, with wide application in reliability, engineering, and the analysis of survival functions; researchers have therefore studied its characteristics extensively.
In this research, the survival function of the truncated exponential distribution is estimated using the maximum likelihood method, the first and second Bayes methods, the least squares method, and the Jackknife method based first on the maximum likelihood estimator and then on the first Bayes estimator; the resulting estimators are then compared by simulation. To accomplish this task, the researcher adopted samples of different sizes …
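For orientation, the untruncated exponential survival function and its maximum-likelihood plug-in estimate take the familiar form below; the truncated-distribution, Bayes, least-squares, and Jackknife estimators compared in the research are not reproduced here.

```latex
% Illustrative only: exponential survival function and its ML plug-in estimate.
R(t) = P(T > t) = e^{-\lambda t}, \qquad t \ge 0,
\qquad
\hat{\lambda}_{\mathrm{MLE}} = \frac{n}{\sum_{i=1}^{n} t_i},
\qquad
\hat{R}_{\mathrm{MLE}}(t) = e^{-\hat{\lambda}_{\mathrm{MLE}}\, t} .
```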
Missing data is one of the problems that may occur in regression models. It is usually handled by the deletion mechanisms available in statistical software, but deletion weakens statistical inference because it reduces the sample size. In this paper, the Expectation-Maximization algorithm (EM), the Multicycle Expectation-Conditional Maximization algorithm (MC-ECM), Expectation-Conditional Maximization Either (ECME), and Recurrent Neural Networks (RNN) are used to estimate multiple regression models when the explanatory variables have some missing values. Experimental datasets were generated using the Visual Basic programming language, with missing values of the explanatory variables produced according to a missing-at-random mechanism in a general pattern and …
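A heavily simplified sketch of the E/M alternation for a single partially missing explanatory variable: the E-step imputes the missing values from a regression on the fully observed variable, and the M-step refits the regression of interest on the completed data. The data, missingness rate, and linear models are synthetic assumptions, and the MC-ECM, ECME, and RNN estimators are not shown.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)                              # fully observed regressor
x2 = 0.6 * x1 + rng.normal(scale=0.8, size=n)        # regressor with missing values
y = 2.0 + 1.5 * x1 - 0.7 * x2 + rng.normal(scale=0.5, size=n)
miss = rng.random(n) < 0.2                           # MCAR-style missingness
x2_obs = np.where(miss, np.nan, x2)

x2_hat = np.where(miss, np.nanmean(x2_obs), x2_obs)  # start from mean imputation
for _ in range(20):
    # E-step (simplified): impute missing x2 by its fitted value from x1
    A = np.column_stack([np.ones(n), x1])
    a = np.linalg.lstsq(A, x2_hat, rcond=None)[0]
    x2_hat = np.where(miss, A @ a, x2_obs)
    # M-step: ordinary least squares for the regression of interest
    X = np.column_stack([np.ones(n), x1, x2_hat])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]

print("estimated coefficients (intercept, x1, x2):", beta)
```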
In this study, we derive estimators for the reliability of the exponential distribution using the Bayesian approach, in which the parameter of the exponential distribution is treated as a random variable. We derive the posterior distribution of the parameter under four priors for the scale parameter: the inverse chi-square distribution, the inverted gamma distribution, an improper distribution, and a non-informative distribution. The reliability estimators are obtained using the two loss functions proposed in this study, which are based on the natural logarithm of the reliability function. We use a simulation technique to compare the …
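In generic form, the quantities involved are the exponential reliability function, its likelihood, and the Bayes estimator written as a posterior expectation (the squared-error rule is shown for illustration); the four specific priors and the logarithm-based loss functions proposed in the study are not reproduced here.

```latex
% Generic form only, with rate parameter \theta and data t_1, ..., t_n.
R(t) = e^{-\theta t},
\qquad
L(\theta \mid t_1,\dots,t_n) = \theta^{\,n} \exp\!\Big(-\theta \sum_{i=1}^{n} t_i\Big),
\qquad
\hat{R}_{\mathrm{Bayes}}(t) = E\big[e^{-\theta t} \mid \text{data}\big]
= \int_{0}^{\infty} e^{-\theta t}\, \pi(\theta \mid \text{data})\, d\theta .
```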
Image quality has been estimated and predicted using the signal-to-noise ratio (SNR). The purpose of this study is to investigate the relationship between body mass index (BMI) and SNR measurements in PET imaging, using studies of patients with liver cancer. Fifty-nine patients (24 males and 35 females) were divided into three groups according to BMI. After intravenous injection of 0.1 mCi of 18F-FDG per kilogram of body weight, PET emission scans were acquired for 1, 1.5, or 3 min per bed position according to the patient's weight. Because the liver is an organ of homogeneous metabolism, five regions of interest (ROIs) were placed at the same location on five successive slices of the PET/CT scans to determine the mean uptake (signal) values and their standard deviation …
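The ROI-based definition assumed here takes the SNR as the mean uptake over the liver ROI divided by its standard deviation, with u_i denoting the individual ROI uptake values:

```latex
\mathrm{SNR} = \frac{\bar{u}}{\sigma_{u}},
\qquad
\bar{u} = \frac{1}{N}\sum_{i=1}^{N} u_i,
\qquad
\sigma_{u} = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N} \big(u_i - \bar{u}\big)^{2}} .
```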
Natural bauxite (BXT) mineral clay was modified with the cationic surfactant hexadecyltrimethylammonium bromide (BXT-HDTMA) and characterized by several techniques: FTIR spectroscopy, X-ray powder diffraction (XRD), and scanning electron microscopy (SEM). The modified and natural bauxite (BXT) were used as adsorbents for the removal of 4-chlorophenol (4-CP) from aqueous solutions. The adsorption study was carried out under different conditions and parameters: contact time, pH value, adsorbent dosage, and ionic strength. The adsorption kinetics (described by pseudo-first-order and pseudo-second-order models), the equilibrium data (analyzed with the Langmuir, Freundlich, and Temkin isotherm models), and the thermodynamic parameters (change in …
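For reference, the standard linearized kinetic models and isotherm equations named above are given below, with q_t the amount adsorbed at time t, q_e the amount adsorbed at equilibrium, and C_e the equilibrium 4-CP concentration; the remaining symbols are model constants.

```latex
% Pseudo-first-order and pseudo-second-order kinetics:
\ln (q_e - q_t) = \ln q_e - k_1 t ,
\qquad
\frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e} .
% Langmuir, Freundlich and Temkin isotherms:
q_e = \frac{q_m K_L C_e}{1 + K_L C_e} ,
\qquad
q_e = K_F C_e^{1/n} ,
\qquad
q_e = \frac{RT}{b_T}\ln (A_T C_e) .
```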