Deep learning has recently achieved considerable success, especially in the field of computer vision. This research describes a classification method applied to a dataset containing several types of images: Synthetic Aperture Radar (SAR) images and non-SAR images. Transfer learning was used, followed by fine-tuning, with architectures pre-trained on the well-known ImageNet database. The VGG16 model served as a feature extractor, and a new classifier was trained on the extracted features. The dataset consists of five classes: one SAR class (houses) and four non-SAR classes (cats, dogs, horses, and humans). A Convolutional Neural Network (CNN) was chosen for training because it yields high accuracy. The final accuracy reached 91.18% across the five classes. The results are discussed in terms of the per-class classification accuracy: the cats class reached 99.6%, the houses class reached 100%, and the remaining classes averaged 90% or above.
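Below is a minimal Keras sketch of the transfer-learning setup described above: VGG16 pre-trained on ImageNet is used as a frozen feature extractor, a small new classifier head is trained on top, and an optional fine-tuning stage follows. The input size, head layers, and learning rates are illustrative assumptions, not the paper's exact configuration.

```python
# Hedged sketch: VGG16 as a frozen ImageNet feature extractor plus a small
# trainable classifier head for five classes (houses, cats, dogs, horses, humans).
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

NUM_CLASSES = 5
IMG_SHAPE = (224, 224, 3)   # assumed input resolution

# Convolutional base pre-trained on ImageNet, without the original classifier.
base = VGG16(weights="imagenet", include_top=False, input_shape=IMG_SHAPE)
base.trainable = False      # transfer-learning stage: freeze the convolutional base

model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)   # datasets not shown here

# Optional fine-tuning stage: unfreeze the last convolutional block and
# continue training with a much smaller learning rate.
base.trainable = True
for layer in base.layers[:-4]:
    layer.trainable = False
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```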
Given the importance of conveyor systems in various industrial and service lines, it is highly desirable to make them operate as efficiently as possible. In this paper, the speed of a conveyor belt (in our study, part of an integrated training robotic system) is controlled using an artificial intelligence method, the Artificial Neural Network (ANN). A vision sensor gathers information about the status of the conveyor belt and the parts on it; based on this information, the ANN controller makes an intelligent decision about the belt speed. The ANN adjusts the speed in a way that optimizes energy efficiency through ...
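The sketch below illustrates the general idea of an ANN speed controller in Python: a small feed-forward network maps vision-derived features of the belt state to a speed set-point. The feature set (part count and average spacing), the synthetic training rule, and the network size are purely hypothetical stand-ins, not the paper's actual controller or data.

```python
# Hypothetical sketch only: a small feed-forward ANN mapping vision-derived belt
# features to a belt-speed set-point. Features, target rule, and sizes are invented.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic training set: [part_count, average_spacing_m] -> speed (m/s).
# Toy rule: more parts and tighter spacing call for a lower speed.
X = rng.uniform([0, 0.1], [20, 1.0], size=(500, 2))
y = 1.0 - 0.03 * X[:, 0] + 0.4 * X[:, 1]

controller = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
controller.fit(X, y)

# At run time, the vision sensor would supply the current belt state:
state = np.array([[8, 0.35]])            # e.g. 8 parts, 0.35 m average spacing
print("suggested belt speed (m/s):", controller.predict(state)[0])
```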
In this paper, the orbital elements of Mars were calculated. These orbital elements (the semi-major axis a, the inclination i, the longitude of the ascending node Ω, the argument of perihelion ω, and the eccentricity e) are essential for determining the size and shape of Mars' orbit. A Quick Basic program was used to calculate the orbital elements and the distance of Mars from the Earth over 10000 days starting from 25/5/1950. The elements were computed from the empirical formulae of Meeus, which depend on the Julian date and change only slightly over the 10000 days; Kepler's equation was solved to find Mars' position and its distance from the Sun. The ecliptic and equatorial coordinates of Mars were calculated. The distance between Mars and the center of the Earth ...
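A minimal sketch of the positional step mentioned above: solving Kepler's equation M = E - e·sin(E) for the eccentric anomaly by Newton-Raphson, then deriving the true anomaly and the Sun-Mars distance. The element values used are common illustrative figures, not the elements computed in the paper.

```python
# Minimal sketch: solve Kepler's equation M = E - e*sin(E) by Newton-Raphson,
# then compute the true anomaly and the heliocentric distance of Mars.
import math

def solve_kepler(M, e, tol=1e-10):
    """Return the eccentric anomaly E (rad) for mean anomaly M (rad)."""
    E = M if e < 0.8 else math.pi
    for _ in range(100):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

a = 1.523679                 # semi-major axis of Mars (AU), illustrative
e = 0.0934                   # eccentricity, illustrative
M = math.radians(50.0)       # example mean anomaly

E = solve_kepler(M, e)
nu = 2.0 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                      math.sqrt(1 - e) * math.cos(E / 2))   # true anomaly
r = a * (1 - e * math.cos(E))                                # Sun-Mars distance
print(f"true anomaly = {math.degrees(nu):.3f} deg, r = {r:.5f} AU")
```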
Among metaheuristic algorithms, population-based algorithms are explorative search algorithms that are superior to local search algorithms in exploring the search space for globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents the search from expanding into the neighborhood of promising solutions to find better ones. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. On the ...
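For reference, the sketch below implements the standard firefly movement rule (attractiveness decaying with squared distance, plus a random step), which is the baseline behavior the paper seeks to improve; it is not the paper's enhanced variant, and the toy objective is only for illustration.

```python
# Sketch of the standard firefly algorithm step: each firefly moves toward every
# brighter one, with attractiveness decaying with squared distance, plus noise.
import numpy as np

def firefly_step(X, fitness, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
    """One FA iteration on positions X (n_fireflies x dim), minimising `fitness`."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    light = fitness(X)                      # lower fitness = brighter firefly
    X_new = X.copy()
    for i in range(n):
        for j in range(n):
            if light[j] < light[i]:         # j is brighter, so i moves toward j
                r2 = np.sum((X[i] - X[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)
                X_new[i] += beta * (X[j] - X[i]) + alpha * (rng.random(d) - 0.5)
    return X_new

# Toy usage: minimise the 2-D sphere function.
sphere = lambda P: np.sum(P ** 2, axis=1)
X = np.random.default_rng(1).uniform(-5, 5, size=(15, 2))
for _ in range(50):
    X = firefly_step(X, sphere)
print("best solution found:", X[np.argmin(sphere(X))])
```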
The researcher studied the transportation problem because of its great importance to the country's economy. This paper examines several methods for finding near-optimal solutions and applies them to a practical case, taking one of the oil derivatives, the benzene product. The first purpose of this study is to reduce the total cost of transporting petrol from warehouses in the province of Baghdad to stations in the Karkh and Rusafa districts of the same province. The second is to meet the demand of each station with the required quantity, which depends on the absorptive capacity of the warehouses (the quantities supplied). And through r...
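A minimal sketch of a balanced transportation model of the kind discussed above, solved as a linear program with scipy.optimize.linprog. The warehouse supplies, station demands, and unit costs are illustrative numbers, not the study's benzene data.

```python
# Illustrative sketch: a balanced transportation problem as a linear program.
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0, 8.0],    # cost[i][j] = unit cost, warehouse i -> station j
                 [5.0, 3.0, 7.0]])
supply = [60.0, 40.0]                # warehouse capacities (quantities supplied)
demand = [30.0, 45.0, 25.0]          # station requirements (quantities demanded)

m, n = cost.shape
A_eq, b_eq = [], []
for i in range(m):                   # each warehouse ships exactly its supply
    row = np.zeros(m * n)
    row[i * n:(i + 1) * n] = 1.0
    A_eq.append(row)
    b_eq.append(supply[i])
for j in range(n):                   # each station receives exactly its demand
    row = np.zeros(m * n)
    row[j::n] = 1.0
    A_eq.append(row)
    b_eq.append(demand[j])

res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, None))
print("minimum total cost:", res.fun)
print("shipping plan:")
print(res.x.reshape(m, n))
```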
This article presents the results of an experimental investigation of using carbon fiber-reinforced polymer sheets to enhance the behavior of reinforced concrete deep beams with large web openings in the shear spans. A set of 18 specimens was fabricated and tested up to failure to evaluate the structural performance in terms of cracking, deformation, and load-carrying capacity. All tested specimens were 1500 mm long, 500 mm deep, and 150 mm wide. The parameters studied were the opening size, the opening location, and the strengthening factor. Two deep beams served as control specimens, without openings and without strengthening. Eight deep beams were fabricated with openings but without strengthening, while ...
This paper deals with how to estimate unmeasured spatial points when the spatial sample is small, a situation that is unfavorable for estimation, since the larger the dataset, the better the estimates of the unmeasured points and the smaller the estimation variance. The idea of this paper is therefore to take advantage of secondary (auxiliary) data that are strongly correlated with the primary (basic) data in order to estimate individual unmeasured points and to measure the estimation variance. The Co-kriging technique was used in this setting to build spatial predictions, and the idea was then applied to real data in th...
The research aims to identify how to enhance the quality of human resources, focusing on four dimensions (efficiency, effectiveness, flexibility, and reliability), by adopting an adventure learning method that combines theoretical and applied aspects when developing human resources and is applied using information technology through its own dimensions (cooperation, interaction, communication, and understanding). The research problem indicated a clear deficiency in the cognitive perception of how to employ the adventure learning dimensions in enhancing human resources quality, so the importance of the research lies in presenting treatments and proposals to reduce this problem. To achieve ...
The main aim of this paper is to study how different estimators of the two unknown parameters (the shape and scale parameters) of the generalized exponential distribution behave for different sample sizes and different parameter values. In particular,
Maximum Likelihood, Percentile, and Ordinary Least Squares estimators were implemented for different sample sizes (small, medium, and large) and for several contrasting initial values of the two parameters. Two performance indicators, Mean Square Error and Mean Percentile Error, were used, and the comparisons between the estimation methods were carried out using the Monte Carlo simulation technique. It was observed ...
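A minimal Monte Carlo sketch of this kind of comparison, restricted here to the maximum-likelihood estimator of the generalized exponential distribution GE(α, λ) with CDF F(x) = (1 - exp(-λx))^α, scored by Mean Square Error. The true parameter values, sample size, and replication count are illustrative choices, not the paper's settings.

```python
# Monte Carlo sketch for the MLE of the generalized exponential distribution
# GE(alpha, lam), CDF F(x) = (1 - exp(-lam*x))**alpha, scored by Mean Square Error.
import numpy as np
from scipy.optimize import minimize

def ge_sample(alpha, lam, size, rng):
    """Draw from GE(alpha, lam) by inverting the CDF."""
    u = rng.random(size)
    return -np.log(1.0 - u ** (1.0 / alpha)) / lam

def neg_loglik(theta, x):
    alpha, lam = theta
    if alpha <= 0 or lam <= 0:
        return np.inf
    return -np.sum(np.log(alpha) + np.log(lam) - lam * x
                   + (alpha - 1.0) * np.log1p(-np.exp(-lam * x)))

rng = np.random.default_rng(42)
alpha_true, lam_true, n, reps = 2.0, 1.5, 50, 1000   # illustrative settings
estimates = []
for _ in range(reps):
    x = ge_sample(alpha_true, lam_true, n, rng)
    fit = minimize(neg_loglik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
    estimates.append(fit.x)

mse = np.mean((np.array(estimates) - [alpha_true, lam_true]) ** 2, axis=0)
print("MSE(alpha_hat), MSE(lam_hat):", mse)
```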