In recent years, remote sensing applications have attracted great interest because they offer many advantages and possibilities. Satellites are among the most important remote sensing platforms: they provide multispectral images that allow us to study problems such as changes in ecological cover and biodiversity on the Earth's surface, and to illustrate the biological diversity of the studied areas by presenting the different regions of a scene according to their characteristic wavelengths. Thresholding is a commonly used operation for image segmentation; it extracts a binary image from a gray-level image by segmenting the image into two regions (foreground and background) according to pixel intensity, reducing image distortion and separating the target area from the rest of the scene under study. In this paper we apply a number of thresholding techniques to clarify the importance of this concept in image processing, and we propose a new statistical thresholding technique that is compared against them. The results, obtained by applying the techniques to a multispectral satellite image of an area in western Iraq characterized by its environmental diversity, show the advantage of the proposed technique.
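The intensity-based foreground/background split described above can be sketched with Otsu's classic method, which picks the threshold maximizing between-class variance. The abstract's proposed statistical technique is not specified here, so this is only an illustrative baseline on an invented toy image:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the intensity threshold that maximizes between-class
    variance for an 8-bit gray-level image (Otsu's method)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()   # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * prob[:t]).sum() / w0  # class means
        mu1 = (levels[t:] * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Toy image: dark background (20) with a bright 4x4 square (200)
img = np.full((8, 8), 20, dtype=np.uint8)
img[2:6, 2:6] = 200
t = otsu_threshold(img)
binary = (img >= t).astype(np.uint8)  # 1 = foreground, 0 = background
```

On a multispectral scene the same operation would be applied per band (or to a derived index) to isolate the land-cover class of interest.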
The focus of this paper is the presentation of a new type of mapping, called projection Jungck zn-Suzuki generalized mapping, and the definition of new one-step and two-step algorithms (the projection Jungck-normal N algorithm, projection Jungck-Picard algorithm, projection Jungck-Krasnoselskii algorithm, and projection Jungck-Thianwan algorithm). The convergence of these algorithms was studied, and it was shown that they all converge to a fixed point. Furthermore, using the three conditions of the preceding lemma, we demonstrated that the difference between any two of the sequences tends to zero. The stability of these algorithms was demonstrated using projection Jungck Suzuki generalized mapping. In contrast, the rate of convergence …
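For background, the classical one-step schemes on which Jungck-type projection variants are built can be written as follows (these are the standard textbook forms, not the paper's exact definitions, which are not reproduced here):

```latex
% Picard iteration for a self-map T : X -> X
\[
x_{n+1} = T x_n .
\]
% Jungck iteration for a pair of maps S, T with T(X) \subseteq S(X):
\[
S x_{n+1} = T x_n ,
\]
% whose limit, when it exists, is a coincidence point of S and T.
```

The projection variants named in the abstract compose such schemes with a projection operator onto the underlying set.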
Dust is a common cause of health risks and also a driver of climate change, one of the most threatening problems facing humans. In the recent decade, climate change in Iraq, typified by increased droughts and desertification, has generated numerous environmental issues. This study forecasts dust in five central Iraqi districts using a supervised machine learning framework with five regression algorithms. It was assessed on a dataset from the Iraqi Meteorological Organization and Seismology (IMOS). Simulation results show that the gradient boosting regressor (GBR) has a mean square error of 8.345 and a total accuracy ratio of 91.65%. Moreover, the results show that the decision tree (DT), whose mean square error is 8.965, …
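The GBR workflow the abstract evaluates can be sketched with scikit-learn. The IMOS dataset is not available here, so synthetic features and targets stand in for the meteorological data, and the error values will not match the paper's reported 8.345:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in for meteorological features: 500 samples, 4 features,
# a mostly-linear target with mild nonlinearity and noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = X[:, 0] * 3.0 + np.sin(X[:, 1]) + rng.normal(scale=0.1, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
mse = mean_squared_error(y_te, model.predict(X_te))
```

Swapping `GradientBoostingRegressor` for `DecisionTreeRegressor` reproduces the DT comparison the study makes.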
The purpose of this paper is to solve the unbalanced transportation problem with stochastic demand using heuristic algorithms to obtain the optimum solution, minimizing the cost of transporting gasoline for the Oil Products Distribution Company of the Iraqi Ministry of Oil. The most important conclusion is that the random transportation problem with uncertain demand can be solved by a stochastic programming model. The most obvious finding to emerge from this work is that the genetic algorithm was able to address unbalanced transportation problems, and that the approved model can be applied by the oil products distribution company in the Iraqi Ministry of Oil to …
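An unbalanced transportation problem is one where total supply and total demand differ; the usual fix is a zero-cost dummy destination that absorbs the surplus, after which the balanced problem can be solved exactly. The sketch below uses a linear program (SciPy's `linprog`) rather than the paper's genetic algorithm, and all costs, supplies, and demands are invented for illustration:

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative unbalanced problem: 2 depots, 3 stations.
# Total supply (60) exceeds total demand (50), so a dummy
# destination with zero cost absorbs the surplus of 10.
cost = np.array([[4.0, 6.0, 9.0],
                 [5.0, 3.0, 7.0]])
supply = np.array([25.0, 35.0])          # sums to 60
demand = np.array([15.0, 20.0, 15.0])    # sums to 50

cost_b = np.hstack([cost, np.zeros((2, 1))])        # dummy column
demand_b = np.append(demand, supply.sum() - demand.sum())

m, n = cost_b.shape
A_eq, b_eq = [], []
for i in range(m):                       # each depot ships all its supply
    row = np.zeros(m * n); row[i * n:(i + 1) * n] = 1
    A_eq.append(row); b_eq.append(supply[i])
for j in range(n):                       # each station receives its demand
    row = np.zeros(m * n); row[j::n] = 1
    A_eq.append(row); b_eq.append(demand_b[j])

res = linprog(cost_b.ravel(), A_eq=np.array(A_eq), b_eq=b_eq,
              bounds=(0, None), method="highs")
plan = res.x.reshape(m, n)               # optimal shipping plan
```

A genetic algorithm, as in the study, would search the same feasible region heuristically; the LP gives the exact benchmark it should approach.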
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Twitter consists of short, unstructured, and messy text, which makes it difficult to find topics in tweets. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short text content like Twitter. Fortunately, Twitter has many features that represent the interaction between users, and tweets are rich in user-generated hashtags that act as keywords. In this paper, we exploit the hashtag feature to improve the topics learned …
Feature selection represents one of the critical processes in machine learning (ML). The fundamental aim of feature selection is to maintain predictive accuracy while reducing the dimension of the feature set. Different approaches have been created for classifying datasets. In a range of optimization problems, swarm techniques have produced better outcomes, and hybrid algorithms have recently received a great deal of attention for solving optimization problems. As a result, this study provides a thorough assessment of the literature on feature selection problems using hybrid swarm algorithms developed over time (2018-2021). Lastly, when compared with current feature selection procedures …
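The goal stated above (keep accuracy, shrink the feature set) can be illustrated with a simple univariate filter baseline; the survey itself covers hybrid swarm methods, which search the space of feature subsets rather than scoring features independently. All data here is synthetic:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# 20 features, only 5 of which carry class information.
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=0)

sel = SelectKBest(f_classif, k=5)        # keep the 5 highest-scoring features
X_red = sel.fit_transform(X, y)

clf = LogisticRegression(max_iter=1000)
acc_full = cross_val_score(clf, X, y, cv=5).mean()
acc_red = cross_val_score(clf, X_red, y, cv=5).mean()
```

A swarm-based wrapper would instead evaluate candidate subsets with the classifier itself, trading computation for subsets tuned to the model.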
The problem of the study is to monitor the content presented on Iraqi satellite channels, to identify the nature of the ideas contained in these advertisements, and to identify the values carried by the creative strategies and advertising campaigns used. Satellite broadcasting has been one of the most important technological developments in the field of communications since the nineties, and advertisements, in their various forms and functions, are among the contents provided by satellite channels. These channels are keen to direct their messages in particular, and their communication in general, to address a certain audience and to convince and influence it in order to achieve certain purposes of the source or the body from which they originate, especially those that result in an …
This research deals with the use of a number of statistical methods, such as the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results, evaluated against the RMSE and NCC measures, show that the spline method is the most accurate compared to the other statistical methods.
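Histogram-based contrast enhancement, one of the families of methods compared, can be sketched with plain NumPy: the cumulative histogram is used as a lookup table that spreads the occupied gray levels across the full 8-bit range. The RMSE and NCC metrics mentioned in the abstract are included; the toy image is invented:

```python
import numpy as np

def hist_equalize(gray):
    """Contrast enhancement by histogram equalization (8-bit image)."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum() / gray.size          # cumulative distribution
    lut = np.round(cdf * 255).astype(np.uint8)
    return lut[gray]                          # remap every pixel

def rmse(a, b):
    return float(np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2)))

def ncc(a, b):
    a = a.astype(float).ravel() - a.mean()
    b = b.astype(float).ravel() - b.mean()
    return float((a @ b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Low-contrast toy image confined to gray levels [100, 140]
rng = np.random.default_rng(1)
img = rng.integers(100, 141, size=(16, 16), dtype=np.uint8)
out = hist_equalize(img)
```

Because equalization is a monotone remapping, NCC between input and output stays high while the dynamic range expands; a spline-based method would fit a smooth intensity curve instead of the raw CDF.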
Statistical fluctuations of nuclear energy spectra for the isobar A = 68 were examined by means of random matrix theory together with the nuclear shell model. The A = 68 nuclei are taken to consist of an inert 56Ni core with 12 nucleons in the f5p space (2p3/2, 1f5/2, and 2p1/2 orbitals). The nuclear excitation energies required by this work were obtained by performing f5p-shell model calculations in the isospin formalism with the f5pvh interaction and realistic single-particle energies. All calculations of the present study were conducted using the OXBASH code. The calculated level densities were found to have a Gaussian shape. The level-spacing distributions P(s) and …
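For reference, the nearest-neighbor spacing distribution P(s) is conventionally compared against two limiting forms; these are the standard random-matrix-theory benchmarks, not results of the paper:

```latex
% Poisson distribution (uncorrelated levels, regular dynamics):
\[
P_{\mathrm{Poisson}}(s) = e^{-s} .
\]
% Wigner surmise for the Gaussian orthogonal ensemble
% (level repulsion, chaotic dynamics):
\[
P_{\mathrm{GOE}}(s) = \frac{\pi}{2}\, s\, e^{-\pi s^{2}/4} .
\]
```

Where the computed P(s) falls between these two curves indicates the degree of chaoticity in the shell-model spectra.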
Statistical learning theory serves as the foundational bedrock of machine learning (ML), which in turn represents the backbone of artificial intelligence, ushering in innovative solutions for real-world challenges. Its origins can be traced to the point where statistics and computing meet, evolving into a distinct scientific discipline. Machine learning can be distinguished by its fundamental branches: supervised learning, unsupervised learning, semi-supervised learning, and reinforcement learning. Within this tapestry, supervised learning takes center stage, divided into two fundamental forms: classification and regression. Regression is tailored for continuous outcomes, while classification specializes in c…
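The classification/regression split described above can be made concrete with two minimal scikit-learn models on synthetic data (the data and model choices are illustrative only): a classifier predicts discrete labels, a regressor predicts continuous values.

```python
from sklearn.datasets import make_classification, make_regression
from sklearn.linear_model import LogisticRegression, LinearRegression

# Classification: discrete class labels (0 or 1)
Xc, yc = make_classification(n_samples=200, n_features=5, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(Xc, yc)

# Regression: continuous real-valued targets
Xr, yr = make_regression(n_samples=200, n_features=5, noise=0.1,
                         random_state=0)
reg = LinearRegression().fit(Xr, yr)

acc = clf.score(Xc, yc)   # fraction of labels predicted correctly
r2 = reg.score(Xr, yr)    # coefficient of determination (R^2)
```

Note that the two tasks even use different success measures: accuracy counts exact label matches, while R^2 measures how much continuous variance the model explains.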