Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as Back Propagation (BP) neural networks, decision trees, and Bayesian networks, and they have been used repeatedly in many fields. These algorithms suffer from the problem of imbalanced data, where some classes contain far more instances than others. Imbalanced data result in poor performance and a bias toward one class at the expense of the others. In this paper, we propose three techniques based on the Over-Sampling (O.S.) approach for processing an imbalanced dataset, redistributing it, and converting it into a balanced dataset. These techniques are the Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Borderline-SMOTE + Imbalanced Ratio (IR), and Adaptive Synthetic Sampling (ADASYN) + IR. Each technique generates synthetic samples for the minority class to achieve balance between the minority and majority classes and then calculates the IR between the minority and majority classes. Experimental results show that the Improved SMOTE algorithm outperforms the Borderline-SMOTE + IR and ADASYN + IR algorithms because it achieves a higher balance between the minority and majority classes.
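The two building blocks mentioned above, SMOTE-style synthetic-sample generation and the imbalance ratio (IR), can be sketched as follows. This is an illustrative sketch only, not the paper's Improved SMOTE; the function names, the neighbour count `k`, and the toy data are assumptions.

```python
import random

def imbalance_ratio(labels):
    """Imbalance Ratio (IR): size of the majority class divided by
    the size of the minority class (1.0 means perfectly balanced)."""
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return max(counts.values()) / min(counts.values())

def smote_like_oversample(minority, n_new, k=2, seed=0):
    """Generate n_new synthetic minority samples by interpolating
    between a random minority sample and one of its k nearest
    neighbours (the core idea behind SMOTE-family methods)."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        # k nearest neighbours by squared Euclidean distance, excluding base
        neighbours = sorted(
            (p for p in minority if p is not base),
            key=lambda p: sum((a - b) ** 2 for a, b in zip(base, p)),
        )[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # random point on the segment base -> neighbour
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(base, nb)))
    return synthetic
```

With 90 majority and 10 minority samples the IR is 9.0; generating 80 synthetic minority samples brings it down to 1.0, which is the balancing step the three techniques share.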
This research studies the linear regression model in which the random errors are autocorrelated and normally distributed. Linear regression analysis describes the relationship between variables, and through this relationship the value of one variable can be predicted from the values of the others. Four estimation methods (the least-squares method, the unweighted-average method, the Theil method, and the Laplace method) were compared using the mean square error (MSE) criterion in a simulation study covering four sample sizes (15, 30, 60, 100). The results showed that the least-squares method is best. The four methods were then applied to data on buckwheat production and cultivated area in the provinces of Iraq for the years 2010, 2011, and 2012.
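The kind of Monte-Carlo comparison described above can be sketched as follows. Only two of the four estimators are shown (least squares and Theil, the median of pairwise slopes); the sample size, replication count, and error model are illustrative assumptions, not the study's actual design.

```python
import random
from statistics import mean, median

def ols_slope(xs, ys):
    """Ordinary least-squares slope estimate."""
    xbar, ybar = mean(xs), mean(ys)
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    return num / den

def theil_slope(xs, ys):
    """Theil estimator: the median of all pairwise slopes."""
    slopes = [(ys[j] - ys[i]) / (xs[j] - xs[i])
              for i in range(len(xs)) for j in range(i + 1, len(xs))
              if xs[j] != xs[i]]
    return median(slopes)

def mse_of_estimator(estimator, true_slope=2.0, n=30, reps=200, seed=1):
    """Monte-Carlo MSE of a slope estimator under i.i.d. normal errors."""
    rng = random.Random(seed)
    errs = []
    for _ in range(reps):
        xs = [float(i) for i in range(n)]
        ys = [true_slope * x + rng.gauss(0.0, 1.0) for x in xs]
        errs.append((estimator(xs, ys) - true_slope) ** 2)
    return mean(errs)
```

Comparing `mse_of_estimator(ols_slope)` against `mse_of_estimator(theil_slope)` over the study's four sample sizes would reproduce the shape of the comparison, although here the errors are independent rather than autocorrelated.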
In the last few years, the literature has shown great interest in studying the feasibility of using memristive devices for computing. Memristive devices are important to the structure, dynamics, and functionality of artificial neural networks (ANNs) because the switching characteristics of their resistance resemble biological learning in synapses and neurons. A memristive architecture consists of a number of metastable switches (MSSs). Although the literature covers a variety of memristive applications for general-purpose computation, the effect of the low or high conductance of each MSS has remained unclear. This paper focuses on finding a potential criterion to calculate the conductance of each MSS rather t
The most significant task in oil exploration is determining the reservoir facies, which are based mostly on the primary features of rocks. Porosity, water saturation, and shale volume, as well as the sonic log and bulk density, are the input data used in the Interactive Petrophysics software to compute rock facies. These data are used to create 15 clusters and four groups of rock facies. Furthermore, accurate matching between core and well-log data is established by the neural network technique. In the current study, to evaluate the applicability of the cluster-analysis approach, the rock facies derived from cluster analysis of 29 wells were utilized to redistribute the petrophysical properties for six units of the Mishri
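As an illustration of the cluster-analysis idea only (not the Interactive Petrophysics workflow itself), a minimal k-means sketch that groups log responses into facies clusters might look like this; the deterministic initialization and the two-feature toy data are assumptions.

```python
def kmeans(points, k, iters=20):
    """Plain k-means: group log-response vectors (e.g. porosity and
    bulk density per depth sample) into k facies clusters."""
    centres = list(points[:k])  # deterministic init: first k points
    for _ in range(iters):
        # assign each point to its nearest centre
        groups = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k),
                      key=lambda c: sum((a - b) ** 2
                                        for a, b in zip(p, centres[c])))
            groups[idx].append(p)
        # move each centre to the mean of its assigned points
        centres = [tuple(sum(col) / len(g) for col in zip(*g)) if g
                   else centres[i]
                   for i, g in enumerate(groups)]
    return centres, groups
```

In the study's setting the input vectors would have five components (porosity, water saturation, shale volume, sonic, bulk density) and k would be 15, with the 15 clusters then merged into four facies groups.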
The demand for single-photon sources in quantum key distribution (QKD) systems has necessitated the use of weak coherent pulses (WCPs) characterized by a Poissonian photon-number distribution. Ensuring security against eavesdropping attacks requires keeping the mean photon number (µ) small and known to the legitimate partners. However, accurately determining µ poses challenges due to discrepancies between theoretical calculations and practical implementation. This paper introduces two experiments. The first involves theoretical calculation of µ using several filters to generate the WCPs. The second utilizes a variable attenuator to generate the WCPs, and the value of µ is estimated from the photons detected by the BB
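For Poissonian WCPs the mean photon number can be recovered from the empty-pulse fraction, since P(n=0) = e^(-µ) implies µ = -ln(P0). The sketch below illustrates that relationship numerically; it assumes an ideal detector (unit efficiency, no dark counts) and is not a model of the paper's experimental setup.

```python
import math
import random

def estimate_mu(photon_counts):
    """Estimate the mean photon number µ of weak coherent pulses from
    the fraction of empty pulses: P(n=0) = exp(-µ)  =>  µ = -ln(P0)."""
    pulses = len(photon_counts)
    empty = sum(1 for c in photon_counts if c == 0)
    return -math.log(empty / pulses)

def simulate_wcp(mu, pulses, seed=0):
    """Draw per-pulse photon numbers from a Poisson distribution via
    simple inverse-CDF sampling, modelling attenuated laser pulses."""
    rng = random.Random(seed)
    counts = []
    for _ in range(pulses):
        u, n = rng.random(), 0
        p = math.exp(-mu)   # P(n=0)
        cdf = p
        while u > cdf:      # walk up the Poisson CDF
            n += 1
            p *= mu / n     # P(n) = P(n-1) * µ / n
            cdf += p
        counts.append(n)
    return counts
```

For a typical QKD value such as µ = 0.1, about 90% of pulses are empty, and the estimator converges to µ as the number of pulses grows.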
In combinatorial testing development, the fabrication of covering arrays is the key challenge because of the multiple aspects that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining the greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic-based methods are used to deal with the tuples that may be left over after redundancy elimination by greedy strategies; the result is then assured to be near-optimal by the metaheuristic algorithm. As a result, using both greedy and HC algorithms in a single test-generation system is a good candidate if constructed correctly. T
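The HC step described above can be sketched for pairwise (2-way) coverage as follows. This is an illustrative toy: the mutation scheme, step budget, and array sizes are assumed rather than taken from the paper.

```python
import itertools
import random

def uncovered_pairs(tests, k, v):
    """Value-pair interactions (over k parameters, each with v values)
    not yet covered by the current test suite."""
    needed = {((i, a), (j, b))
              for i, j in itertools.combinations(range(k), 2)
              for a in range(v) for b in range(v)}
    for t in tests:
        for i, j in itertools.combinations(range(k), 2):
            needed.discard(((i, t[i]), (j, t[j])))
    return needed

def hill_climb(tests, k, v, steps=2000, seed=0):
    """Mutate one cell of the test array at a time, keeping any change
    that does not increase the number of uncovered pairs."""
    rng = random.Random(seed)
    best = len(uncovered_pairs(tests, k, v))
    for _ in range(steps):
        if best == 0:
            break
        r, c = rng.randrange(len(tests)), rng.randrange(k)
        old = tests[r][c]
        tests[r][c] = rng.randrange(v)
        score = len(uncovered_pairs(tests, k, v))
        if score <= best:
            best = score      # accept sideways or improving moves
        else:
            tests[r][c] = old  # revert worsening moves
    return tests, best
```

In a combined system, a greedy pass would first build most of the array and HC would then repair the remaining uncovered tuples, which is the division of labour the abstract describes.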
The aim of this paper is to present a new methodology for finding the private key of RSA. A new initial value, generated from a new equation, is selected to speed up the process. After this value is found, a brute-force attack is used to discover the private key. In addition, for the proposed equation, the multiplier of the Euler totient function used to find both the public key and the private key is assigned as 1. This implies that the equation estimating the new initial value is suitable for the small multiplier. The experimental results show that if all prime factors of the modulus are larger than 3 and the multiplier is 1, the distance between the initial value and the private key
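A toy version of the brute-force stage can be sketched as follows. The trial-division factoring and the scan-upward-from-an-initial-value loop are illustrative assumptions (the paper's equation for the initial value is not reproduced here), and this only works for toy moduli.

```python
def small_factor(n):
    """Trial-division factorisation of a toy RSA modulus n = p * q."""
    p = 2
    while p * p <= n:
        if n % p == 0:
            return p, n // p
        p += 1
    raise ValueError("n has no small factor")

def brute_force_d(e, n, start=1):
    """Scan candidate private exponents upward from `start` until
    e * d == 1 (mod phi(n)).  `start` plays the role of the paper's
    initial value: the closer it is to d, the fewer steps are needed;
    start=1 is a plain exhaustive search."""
    p, q = small_factor(n)
    phi = (p - 1) * (q - 1)   # Euler totient of n for n = p * q
    d = start
    while (e * d) % phi != 1:
        d += 1
    return d
```

With the classic textbook parameters p = 61, q = 53, e = 17 (so n = 3233, phi = 3120), the search finds d = 2753, and starting the scan near d instead of at 1 cuts the work proportionally.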
This paper proposes a new method for network self-fault management (NSFM) based on two technologies: an intelligent agent to automate fault-management tasks, and Windows Management Instrumentation (WMI) to identify a fault faster when resources are independent (different types of devices). The proposed network self-fault management reduces the load of network traffic by reducing the requests and responses between the server and the clients, which achieves less downtime for each node when a fault occurs in the client. The performance of the proposed system is measured by three metrics: efficiency, availability, and reliability. A high average efficiency is obtained, depending on the faults that occurred in the system, which reaches
A novel series of chitosan derivatives was synthesized via the reaction of chitosan with carbonyl compounds and grafting with different substituted amine compounds. The produced polymers were characterized by FTIR, ¹H- and ¹³C-NMR, XRD, DSC, and TGA analyses. Solubility in water as well as in many solvents was investigated, and the antibacterial activity of chitosan and its derivatives against two types of bacteria, E. coli and S. aureus, was also investigated. The results showed that some of the derivatives have better antibacterial activity against Escherichia coli (Gram-negative) than chitosan, whilst compound IX has better antibacterial activity against Staphylococcus aureus (Gram-positive). SEM analysis showed that the increase in surface roughness wi