In the modern era, in which data is routinely transmitted across networks, such data must be protected both in transit and in storage. Protection methods are developed to ensure data security, and new schemes have been proposed that merge cryptographic principles with other systems to strengthen information security. Chaotic maps are one such system of interest, combined with cryptography for better encryption performance, while biometrics is considered an effective element in many access-control systems. In this paper, two systems, fingerprint biometrics and the chaotic logistic map, are combined to encrypt a text message and produce a strong cipher that can withstand many types of attack. Histogram analysis of the ciphertext shows that the resulting cipher is robust: each character of the plaintext has a different representation in the ciphertext, even when characters are repeated throughout the message. The strength of the generated cipher was assessed against brute-force attackers, who were unable to deduce the key from known plaintext-ciphertext pairs, because each occurrence of a character in the message receives a different shift value and therefore a different ciphertext representation.
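The general idea can be sketched as follows. This is a minimal illustration only: the paper's actual fingerprint-based key derivation is not reproduced, so the seed `x0` and control parameter `r` below are placeholders standing in for values that would be derived from fingerprint features.

```python
# Minimal sketch of a logistic-map shift cipher (illustrative only; the
# paper's exact fingerprint-based key derivation is not shown here).
def logistic_keystream(x0, r, n):
    """Iterate x_{k+1} = r * x_k * (1 - x_k) and map each state to a byte shift."""
    shifts, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        shifts.append(int(x * 256) % 256)
    return shifts

def encrypt(plaintext, x0=0.3141, r=3.9999):
    # In the paper's scheme, x0/r would come from the fingerprint biometric.
    shifts = logistic_keystream(x0, r, len(plaintext))
    return bytes((b + s) % 256 for b, s in zip(plaintext.encode(), shifts))

def decrypt(ciphertext, x0=0.3141, r=3.9999):
    shifts = logistic_keystream(x0, r, len(ciphertext))
    return bytes((b - s) % 256 for b, s in zip(ciphertext, shifts)).decode()
```

Because every position gets its own shift from the chaotic orbit, repeated plaintext characters receive different ciphertext bytes, which is the property the histogram analysis above reflects.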
Active learning is a teaching method in which students actively participate in activities, exercises, and projects within a rich and diverse educational environment. The teacher encourages students to take responsibility for their own education under the teacher's scientific and pedagogical supervision, and motivates them toward ambitious educational goals that focus on developing an integrated personality for today's students and tomorrow's leaders. It is important to understand the impact of two proposed strategies based on active learning on the academic performance of first-grade intermediate students in computer subjects and on their social intelligence. The research sample was intentionally selected, consis
Humans, under the pressures of normal life, are exposed to several types of heart disease arising from different factors. Therefore, in order to determine whether a case results in death or not, the outcome is modeled using the binary logistic regression model.
This research uses one of the most important nonlinear regression models, widely applied in statistical modeling, to study heart disease: the binary logistic regression model. The parameters of this model are then estimated using statistical estimation methods; a further problem appears in estimating its parameters when the numbe
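As a minimal illustration of the estimation step, the sketch below fits a binary logistic model by Newton-Raphson (iteratively reweighted least squares) maximum likelihood. The data are synthetic, invented for illustration; they are not the heart-disease data studied above.

```python
import numpy as np

# Illustrative MLE fit of P(y=1|x) = 1/(1+exp(-(b0 + b1*x))) on synthetic data.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
true_b0, true_b1 = -0.5, 2.0
p = 1 / (1 + np.exp(-(true_b0 + true_b1 * x)))
y = rng.binomial(1, p)                     # binary outcome (e.g. death yes/no)

X = np.column_stack([np.ones(n), x])       # design matrix with intercept
beta = np.zeros(2)
for _ in range(25):                        # Newton-Raphson (IRLS) iterations
    mu = 1 / (1 + np.exp(-X @ beta))       # fitted probabilities
    W = mu * (1 - mu)                      # Bernoulli variance weights
    grad = X.T @ (y - mu)                  # score vector
    H = X.T @ (X * W[:, None])             # observed Fisher information
    beta += np.linalg.solve(H, grad)

print(beta)                                # estimates near the true (-0.5, 2.0)
```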
The use of modern scientific methods and techniques is considered an important topic for solving many of the problems that face various sectors, including industry, services, and health. The researcher always intends to use modern methods characterized by accuracy, clarity, and speed to reach the optimal solution, while remaining easy to understand and apply.
This research presents a comparison between two solution methods for linear fractional programming models: the linear transformation of Charnes & Cooper, and the denominator function restriction method, applied to the oil heaters and gas cookers plant, where it is shown after reac
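The Charnes & Cooper transformation mentioned above turns a linear fractional program into an ordinary LP by substituting y = t·x with t = 1/(dᵀx + β). The sketch below applies it to a small hypothetical problem (the numbers are invented, not the plant's data), solved with `scipy.optimize.linprog`.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical linear fractional program (made-up coefficients):
#   maximize (2*x1 + 3*x2 + 1) / (x1 + x2 + 2)
#   subject to x1 + x2 <= 4,  x1, x2 >= 0
c, alpha = np.array([2.0, 3.0]), 1.0       # numerator coefficients
d, beta = np.array([1.0, 1.0]), 2.0        # denominator (positive on feasible set)
A, b = np.array([[1.0, 1.0]]), np.array([4.0])

# Charnes & Cooper: with y = t*x and t = 1/(d.x + beta), the problem becomes
#   maximize c.y + alpha*t
#   s.t. A.y - b*t <= 0,  d.y + beta*t = 1,  y, t >= 0
obj = -np.concatenate([c, [alpha]])        # linprog minimizes, so negate
A_ub = np.hstack([A, -b[:, None]])
A_eq = np.concatenate([d, [beta]])[None, :]
res = linprog(obj, A_ub=A_ub, b_ub=[0.0], A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * 3)
t = res.x[-1]
x = res.x[:-1] / t                         # recover the original variables
ratio = (c @ x + alpha) / (d @ x + beta)
print(x, ratio)                            # optimum at x = (0, 4), ratio = 13/6
```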
A condensed study was conducted to compare ordinary estimators, in particular the maximum likelihood estimator and the robust estimator, for estimating the parameters of the mixed model of order one, namely the ARMA(1,1) model.
A simulation study was conducted for several varieties of the model, using small, moderate, and large sample sizes, and some new results were obtained. MAPE (mean absolute percentage error) was used as the statistical criterion for comparison.
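The comparison criterion can be stated concretely. Assuming the standard definition MAPE = (100/n) Σ |yᵢ − ŷᵢ| / |yᵢ|, a small helper (with invented toy numbers) looks like:

```python
import numpy as np

# Mean Absolute Percentage Error, the comparison criterion used above.
def mape(actual, predicted):
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

# Toy illustration: fitted values from two estimators against the same truth.
truth = [2.0, 4.0, 5.0]
print(mape(truth, [2.2, 3.8, 5.0]))  # about 5 percent error
print(mape(truth, [2.5, 3.5, 4.5]))  # larger error, so the worse estimator
```

The estimator with the smaller MAPE across replications is preferred.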
In regression testing, test case prioritization (TCP) is a technique for ordering all the available test cases. TCP techniques can improve fault-detection performance, which is measured by the Average Percentage of Faults Detected (APFD). History-based TCP is one family of TCP techniques, which considers past execution data when prioritizing test cases. The allocation of equal priority to different test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To resolve such ties in regression testing, most researchers resort to random ordering of the tied test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
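The APFD metric referenced above has the standard closed form APFD = 1 − (TF₁ + … + TFₘ)/(n·m) + 1/(2n), where n is the number of test cases in the prioritized order, m the number of faults, and TFᵢ the 1-based position of the first test that detects fault i. A direct implementation:

```python
# APFD for one prioritized ordering: n tests, m faults, and TF_i = 1-based
# position of the first test in the order that detects fault i.
def apfd(first_detect_positions, num_tests):
    m, n = len(first_detect_positions), num_tests
    return 1.0 - sum(first_detect_positions) / (n * m) + 1.0 / (2 * n)

# Hypothetical ordering of 5 tests where the three faults are first caught
# by the tests at positions 1, 2, and 3.
print(apfd([1, 2, 3], 5))  # -> 0.7
```

Higher APFD means faults are detected earlier in the ordering, which is exactly what a good prioritization should achieve.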
In hybrid solar cooling systems, a solar collector is used to convert solar energy into a heat source in order to superheat the refrigerant leaving the compressor. This process helps the refrigerant change from the gaseous state to the liquid state in the upper two-thirds of the condenser, instead of the lower two-thirds as in traditional air-conditioning systems, and thus reduces the energy needed to run the cooling process. In this research, two hybrid air-conditioning systems with an evacuated-tube solar collector were used; the refrigerant was R22 and the capacity of each was 2 tons. The tilt angle of the evacuated-tube solar collector was varied, and the collector fluid was replaced with oil instead of water. A comparison was i
Electrical Discharge Machining (EDM) is a widespread Nontraditional Machining (NTM) process for manufacturing parts with complicated geometry, or parts made of very hard metals that are difficult to machine by traditional operations. EDM is a material removal (MR) process characterized by electrical discharge erosion. This paper discusses the optimal EDM parameters for high-speed steel (HSS) AISI M2 as a workpiece, using copper and brass as electrodes. The input parameters used in the experimental work are current (10, 24, and 42 A), pulse-on time (100, 150, and 200 µs), and pulse-off time (4, 12, and 25 µs), which affect the material removal rate (MRR), electrode wear rate (EWR), and wear ratio (WR). A
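For readers unfamiliar with the three responses, the sketch below computes them under the common weight-loss definitions (MRR and EWR as volumetric loss per unit time, and WR as the ratio EWR/MRR). All numerical values are invented for illustration; they are not the paper's measurements.

```python
# Illustrative EDM response calculation (made-up numbers, common definitions).
def removal_rate(weight_loss_g, density_g_per_mm3, time_min):
    """Volumetric removal rate in mm^3/min from measured weight loss."""
    return weight_loss_g / (density_g_per_mm3 * time_min)

mrr = removal_rate(0.90, 0.00816, 10.0)  # HSS workpiece, density ~8.16e-3 g/mm^3
ewr = removal_rate(0.08, 0.00896, 10.0)  # copper electrode, density ~8.96e-3 g/mm^3
wr = ewr / mrr                           # wear ratio (electrode wear per unit MRR)
print(round(mrr, 2), round(ewr, 3), round(wr, 3))
```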
This research deals with a shrinkage method for principal components similar to the one used in multiple regression, "Least Absolute Shrinkage and Selection Operator: LASSO". The goal is to form uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from a multicollinearity problem, instead of taking the whole number, say (K), of them. The shrinkage forces some coefficients to equal zero by placing a restriction on them through a "tuning parameter", say (t), which balances the amounts of bias and variance on one side, without exceeding the acceptable percentage of explained variance of these components. This is shown by the MSE criterion in the regression case and the percent explained v
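A toy sketch of this idea, assuming a standard PCA-then-LASSO pipeline (the data, the collinearity structure, and the penalty value are all invented for illustration, and scikit-learn's `alpha` stands in for the tuning parameter t):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

# Synthetic predictors with an induced multicollinearity problem.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 6))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=100)    # column 3 nearly duplicates column 0
y = 2 * X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=100)

# Uncorrelated linear combinations via PCA, then L1 shrinkage on their
# regression coefficients: the penalty forces some of them exactly to zero,
# selecting a subset of components.
Z = PCA(n_components=6).fit_transform(X)
coef = Lasso(alpha=1.0).fit(Z, y).coef_
print(coef)  # some component coefficients are shrunk exactly to zero
```

Larger penalties zero out more components, trading extra bias for reduced variance, which is the balance the tuning parameter controls.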
Nano gamma alumina was prepared by a double hydrolysis process using aluminum nitrate nonahydrate and sodium aluminate as aluminum sources, with hydroxyl polyacid and CTAB (cetyltrimethylammonium bromide) as templates. Different crystallization temperatures (120, 140, 160, and 180 °C) and calcination temperatures (500, 550, 600, and 650 °C) were applied. All batches were prepared at a pH equal to 9. X-ray diffraction (XRD) and Fourier-transform infrared (FTIR) spectroscopy were used to investigate the phase formation and the optical properties of the nano gamma alumina. N2 adsorption-desorption (BET) was used to measure the surface area and pore volume of the prepared nano alumina, the particle size and the