This paper presents a hybrid approach to the null-values problem that combines rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model: a large set of complete data, called the learning data, is used to derive decision rule sets, which are then applied to the incomplete data. The swarm component performs feature selection, using the Bees Algorithm as a heuristic search combined with rough set theory as the evaluation function. A second feature selection algorithm, ID3, which is statistical rather than swarm-based, is also presented, and the two approaches are compared on null-value estimation through rough set theory. The results on most code sets show that the Bees Algorithm outperforms ID3, reducing the number of extracted rules without affecting accuracy and increasing the accuracy of null-value estimation, especially as the number of null values grows.
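The abstract above pairs a rough-set evaluation function with a Bees-Algorithm-style search for feature selection. A minimal sketch of that idea, assuming a simplified rough-set dependency measure as the fitness function and a toy scout-bee search (all function names and the dataset layout are illustrative, not the paper's actual implementation):

```python
import random
from collections import defaultdict

def dependency(rows, features, decision_idx):
    """Rough-set dependency degree: the fraction of rows whose
    feature-value equivalence class maps to a single decision value."""
    decisions = defaultdict(set)
    counts = defaultdict(int)
    for row in rows:
        key = tuple(row[f] for f in features)
        decisions[key].add(row[decision_idx])
        counts[key] += 1
    consistent = sum(c for k, c in counts.items() if len(decisions[k]) == 1)
    return consistent / len(rows)

def bees_feature_select(rows, n_features, decision_idx, iters=30, seed=0):
    """Toy bees-style search: scout bees sample random feature subsets and
    keep any smaller subset that preserves the full-set dependency degree."""
    rng = random.Random(seed)
    best = list(range(n_features))                 # start from the full set
    target = dependency(rows, best, decision_idx)  # dependency to preserve
    for _ in range(iters):
        cand = sorted(rng.sample(range(n_features), rng.randint(1, n_features)))
        if len(cand) < len(best) and dependency(rows, cand, decision_idx) >= target:
            best = cand
    return best

# Illustrative data: the decision (index 2) depends only on feature 0.
rows = [(0, 0, "a"), (0, 1, "a"), (1, 0, "b"), (1, 1, "b")]
subset = bees_feature_select(rows, 2, 2)
```

A real Bees Algorithm would also recruit follower bees for local neighbourhood search around the best sites; this sketch keeps only the scout phase to show how the rough-set dependency acts as the evaluation function.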
“Smart city” projects have become fully developed and are actively using video analytics. Our study looks at how video analytics from surveillance cameras can help manage urban areas, making the environment safer and residents happier. Every year, hundreds of people fall onto subway and railway tracks. The causes of these accidents include crowding, fights, and sudden health problems such as dizziness or heart attacks, as well as intentional jumps in front of trains. These accidents may not cause deaths, but they cause delays for tens of thousands of passengers. Sometimes passers-by have time to react to the event and try to prevent it, or to contact station personnel, but computers can react faster in such situations by using ethical
The expanded exponentiated power function (EEPF) distribution with four parameters is presented, obtained by applying the exponentiated-expansion method to the power function distribution. This method is characterized by yielding a new distribution belonging to the exponential family. We derive the survival and failure rate functions of this distribution and establish some of its mathematical properties. The parameters are then estimated with a developed least-squares method using a genetic algorithm, and a Monte Carlo simulation study is conducted to evaluate the performance of the estimates obtained with the genetic algorithm (GA).
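The estimation step above, minimising a least-squares criterion with a genetic algorithm, can be sketched in a few lines. This is a generic real-coded GA, not the paper's method: the power-law model, the data, and all parameter settings below are illustrative assumptions, since the EEPF distribution itself is not fully specified in the abstract.

```python
import random

def ga_least_squares(xs, ys, model, bounds, pop=40, gens=60, seed=1):
    """Tiny real-coded GA minimising the sum of squared errors.
    bounds: list of (low, high) per parameter."""
    rng = random.Random(seed)
    def sse(p):
        return sum((y - model(x, p)) ** 2 for x, y in zip(xs, ys))
    popl = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for _ in range(gens):
        popl.sort(key=sse)
        elite = popl[: pop // 4]                          # truncation selection
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # averaging crossover
            i = rng.randrange(len(child))                 # Gaussian mutation
            lo, hi = bounds[i]
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0, (hi - lo) * 0.05)))
            children.append(child)
        popl = elite + children
    return min(popl, key=sse)

# Hypothetical data from y = 2 * x ** 1.5 (model and data are illustrative).
def power_model(x, p):
    return p[0] * x ** p[1]

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0 * x ** 1.5 for x in xs]
best = ga_least_squares(xs, ys, power_model, bounds=[(0.1, 5.0), (0.1, 3.0)])
```

Elitism guarantees the objective value never worsens across generations, which is why a GA is a reasonable substitute for gradient-based least squares when the model's derivatives are awkward.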
Many images require a large amount of storage space. With the continued evolution of computer storage technology, there is a pressing need to reduce the storage space occupied by images by compressing them effectively using the wavelet transform method.
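To make the wavelet-compression idea concrete, here is a minimal sketch of one level of the Haar wavelet transform on a 1-D signal with thresholding of small detail coefficients. This is a generic illustration of the technique, not the abstract's actual algorithm; 2-D image compression applies the same step along rows and columns.

```python
def haar_1d(signal):
    """One level of the Haar wavelet transform: pairwise averages and details."""
    avg = [(a + b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    det = [(a - b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    return avg, det

def haar_inverse(avg, det):
    """Exact inverse of haar_1d."""
    out = []
    for a, d in zip(avg, det):
        out += [a + d, a - d]
    return out

def compress(signal, threshold):
    """Zero out small detail coefficients; the zeros are what make the
    transformed representation cheap to store."""
    avg, det = haar_1d(signal)
    det = [0.0 if abs(d) < threshold else d for d in det]
    return avg, det
```

Smooth regions of a signal produce near-zero details, so thresholding discards little visible information while leaving long runs of zeros that entropy coders store compactly.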
Flow-production systems, whose units are connected in series, often lack fixed maintenance scheduling procedures because failures occur at unpredictable times (e.g., electricity plants, cement plants, water desalination plants). Contemporary software and artificial intelligence (AI) technologies are used to fulfill the research objectives by developing a predictive maintenance program. The data of the fifth thermal unit of the Al Dora power station for electricity in Baghdad are used in this study. The research was conducted in three stages. First, missing data without temporal sequences were processed: the data were filled as a time series, hour after hour, and the times were recorded as system working hours, making the volume of the data relativel
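The first stage above, filling missing readings hour by hour along a time series, can be sketched as simple linear interpolation over an hourly index. The function name and data layout are illustrative assumptions, not the study's actual procedure:

```python
def fill_hourly(readings, n_hours):
    """readings: {hour_index: value} with gaps. Returns a full hour-by-hour
    list, linearly interpolating interior gaps and holding the nearest
    known value at the edges."""
    known = sorted(readings)
    filled = []
    for h in range(n_hours):
        if h in readings:
            filled.append(readings[h])
            continue
        before = [k for k in known if k < h]
        after = [k for k in known if k > h]
        if before and after:
            lo, hi = before[-1], after[0]
            w = (h - lo) / (hi - lo)          # position inside the gap
            filled.append(readings[lo] * (1 - w) + readings[hi] * w)
        else:                                 # edge gap: hold nearest value
            filled.append(readings[before[-1] if before else after[0]])
    return filled
```

In practice a library routine such as pandas' time-based interpolation would replace this loop, but the sketch shows the hour-after-hour filling logic explicitly.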
The research aims to estimate missing values for the response (dependent) variable, which represents the main character studied, using the Coons covariance analysis method in a type of multi-factor designed experiment called the split-block design (SBED), so as to increase the accuracy of the analysis results and of the statistical tests based on this type of design. The theoretical part addresses the split-block design and its statistical analysis, the analysis of variance for an SBED experiment, and the use of the Coons covariance analysis method according to two approaches for estimating the missing value. In the practical part, a field experiment was implemented on a wheat crop in
Potential-field data interpretation is significant for subsurface structure characterization. The current study is an attempt to explore the magnetic low lying between the Najaf and Diwaniyah cities in central Iraq. It aims to understand the subsurface structures that may produce this anomaly and to provide a better subsurface structural image of the region. The study area is situated in the transition zone known as the Abu Jir Fault Zone. This tectonic boundary is an inherited basement weak zone extending in the NW-SE direction. Gravity and magnetic data processing and enhancement techniques, including Total Horizontal Gradient, Tilt Angle, Fast Sigmoid Edge Detection, Improved Logistic, and Theta Map filters, highlight source boundaries and the
This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both; we show the result of applying the method to plain text (the original message) and how the intelligible plain text is transformed into unintelligible ciphertext in order to secure information from unauthorized access and theft. An encryption scheme usually uses a pseudo-random encryption key generated by an algorithm; here, all of this is done using the Pascal matrix. Encryption and decryption are implemented using MATLAB as the programming language, with Notepad++ used to write the input text.
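The Pascal-matrix idea above can be sketched with the lower-triangular Pascal matrix of binomial coefficients, whose inverse is also integer-valued (the same binomials with alternating signs), so decryption recovers the exact character codes. This is a minimal Python illustration of the principle, not the paper's MATLAB implementation, and it omits any pseudo-random key:

```python
from math import comb

def pascal(n):
    """Lower-triangular Pascal matrix: L[i][j] = C(i, j)."""
    return [[comb(i, j) for j in range(n)] for i in range(n)]

def pascal_inv(n):
    """Exact integer inverse: L_inv[i][j] = (-1)**(i+j) * C(i, j)."""
    return [[(-1) ** (i + j) * comb(i, j) for j in range(n)] for i in range(n)]

def matvec(M, v):
    return [sum(a * b for a, b in zip(row, v)) for row in M]

def encrypt(text):
    """Ciphertext is the Pascal matrix applied to the character codes."""
    return matvec(pascal(len(text)), [ord(c) for c in text])

def decrypt(cipher):
    """Apply the exact integer inverse to recover the character codes."""
    return "".join(chr(x) for x in matvec(pascal_inv(len(cipher)), cipher))
```

Because both the matrix and its inverse contain only integers, there is no floating-point round-off: `decrypt(encrypt(text))` returns the original text exactly.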
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such a
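Entropy discretization, mentioned above, chooses cut points on a numeric attribute that minimise the class-label entropy of the resulting intervals. A minimal single-cut sketch, assuming a plain in-memory list rather than the paper's multi-resolution summarization structure (function names are illustrative):

```python
from math import log2
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def best_split(values, labels):
    """Entropy-based binary discretization: try the midpoint between each
    pair of distinct adjacent values and return the cut with the lowest
    weighted entropy of the two sides."""
    pairs = sorted(zip(values, labels))
    best_cut, best_e = None, float("inf")
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue                          # no cut between equal values
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [l for v, l in pairs[:i]]
        right = [l for v, l in pairs[i:]]
        e = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        if e < best_e:
            best_cut, best_e = cut, e
    return best_cut, best_e
```

Recursive application of `best_split` to each side, with a stopping rule such as the MDL criterion, yields a full multi-interval discretization; over a summarized structure the same computation would run on counts per summary cell instead of raw rows.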