The main aim of this study was to develop a new optimization technique based on the differential evolution (DE) algorithm for de-noising linear frequency modulation radar signals. Because the standard DE algorithm is a fixed-length optimizer, it is not suited to signal de-noising problems that require variable-length solutions. A modified crossover scheme called rand-length crossover was designed to fit the proposed variable-length DE, and the new algorithm is referred to as the random variable-length crossover differential evolution (rvlx-DE) algorithm. The measurement results demonstrate highly efficient target detection in terms of frequency response and peak forming, with the peaks well isolated from noise distortion. The modified method showed significant performance improvements over traditional de-noising techniques.
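The abstract does not reproduce the rvlx-DE operator itself, so the following is only a minimal sketch of how a variable-length DE with a rand-length crossover might be organized; the function names, the wrap-around gene indexing, and the DE/rand/1 mutation are assumptions for illustration, not the authors' exact scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_length_crossover(target, donor, cr=0.9):
    # Hypothetical rand-length crossover: the trial length is drawn between the
    # target and donor lengths, and each gene comes from the donor with prob. cr.
    lo, hi = sorted((len(target), len(donor)))
    length = int(rng.integers(lo, hi + 1))
    trial = np.empty(length)
    for j in range(length):
        d = donor[j % len(donor)]      # wrap indices so shorter parents still contribute
        t = target[j % len(target)]
        trial[j] = d if rng.random() < cr else t
    return trial

def rvlx_de(cost, init_pop, gens=200, f=0.5, cr=0.9):
    # Minimal variable-length DE/rand/1 loop built around the crossover above.
    pop = [np.asarray(x, dtype=float) for x in init_pop]
    fit = [cost(x) for x in pop]
    for _ in range(gens):
        for i in range(len(pop)):
            others = [k for k in range(len(pop)) if k != i]
            a, b, c = rng.choice(others, size=3, replace=False)
            m = min(len(pop[a]), len(pop[b]), len(pop[c]))
            donor = pop[a][:m] + f * (pop[b][:m] - pop[c][:m])   # DE/rand/1 mutation
            trial = rand_length_crossover(pop[i], donor, cr)
            trial_fit = cost(trial)
            if trial_fit < fit[i]:                               # greedy selection
                pop[i], fit[i] = trial, trial_fit
    return pop[int(np.argmin(fit))]
```

In this sketch the cost function would score a candidate de-noising filter of any length, which is what allows the population members to differ in size.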
Market share is a major indicator of business success. Understanding the impact of various economic factors on market share is critical to a company’s success. In this study, we examine the market shares of two manufacturers in a duopoly economy and present an optimal pricing approach for increasing a company’s market share. We develop two numerical models based on ordinary differential equations to investigate market success. The first model takes into account quantity demanded and investment in R&D, whereas the second model investigates a more realistic relationship between quantity demanded and pricing.
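The abstract does not state the model equations, so the sketch below only illustrates the general setup it describes: two coupled ODEs for the firms' shares, driven here by hypothetical price and R&D terms (the coefficients and functional forms are assumptions for illustration only).

```python
import numpy as np
from scipy.integrate import solve_ivp

def share_dynamics(t, s, p1, p2, rd1, rd2, k=0.8, g=0.3):
    # Hypothetical duopoly dynamics: firm 1 gains share when its price is lower
    # and its R&D spend higher than firm 2's; shares are conserved (s1 + s2 = 1).
    s1, s2 = s
    ds1 = k * (p2 - p1) * s1 * s2 + g * (rd1 - rd2) * s1 * s2
    return [ds1, -ds1]

sol = solve_ivp(share_dynamics, (0.0, 10.0), [0.5, 0.5],
                args=(1.0, 1.2, 0.4, 0.3), t_eval=np.linspace(0.0, 10.0, 50))
print(sol.y[0, -1])   # firm 1's market share after ten periods
```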
This study examines the discharge capacity of the 250 km reach of the Tigris River between the Kut and Amarah Barrages. The examination includes simulating the current capacity of the reach using the HEC-RAS model. 247 cross sections surveyed in 2012 were used in the simulation. The model was calibrated using observed discharges of 533, 800, 1025, and 3000 m³/s released at Kut Barrage during 2013, 1995, 1995, and 1988, respectively, together with the corresponding water levels at three gauge stations located along the reach. The calibration results indicated that the lowest Root Mean Square Error of 0.095 is obtained when using a Manning's n coefficient of 0.026, 0.03 for th
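As a sketch of the calibration step described above (not the actual HEC-RAS workflow), the Root Mean Square Error between observed and simulated stages can be computed for each trial Manning's n and the roughness with the smallest error retained; `run_hecras` below is a hypothetical placeholder for the hydraulic model run.

```python
import numpy as np

def rmse(observed, simulated):
    # Root Mean Square Error between observed and simulated water levels.
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return float(np.sqrt(np.mean((observed - simulated) ** 2)))

def calibrate(trial_n_values, observed_stages, run_hecras):
    # Keep the Manning's n whose simulated stages best match the gauge readings.
    best_n, best_err = None, float("inf")
    for n in trial_n_values:
        err = rmse(observed_stages, run_hecras(n))
        if err < best_err:
            best_n, best_err = n, err
    return best_n, best_err
```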
Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, SVM is widely used to select an optimal hyperplane that separates two classes. SVM has very good accuracy and is extremely robust compared with some other classification methods such as logistic regression, random forest, k-nearest neighbor, and the naïve model. However, working with large datasets can cause problems such as long training times and inefficient results. In this paper, the SVM has been modified by using a stochastic gradient descent process. The modified method, stochastic gradient descent SVM (SGD-SVM), was checked using two simulation datasets. Since the classification of different ca
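The abstract leaves the details of the SGD modification to the paper, but a generic linear SVM trained with stochastic gradient descent on the hinge loss looks roughly like the sketch below (labels coded as -1/+1; the learning rate, regularization, and update rule are standard textbook choices, not necessarily the authors').

```python
import numpy as np

def sgd_svm(X, y, lam=1e-3, epochs=20, lr=0.01, seed=0):
    # Linear SVM trained by SGD on the regularized hinge loss; y must be -1/+1.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):                # one random sample per update
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:                          # sample violates the margin
                w -= lr * (lam * w - y[i] * X[i])
                b += lr * y[i]
            else:                                   # only the regularizer acts
                w -= lr * lam * w
    return w, b

def predict(X, w, b):
    return np.sign(X @ w + b)
```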
A simple, straightforward mathematical method has been developed to cluster grid nodes on a boundary segment of arbitrary geometry that can be fitted by a suitable polynomial. The solution is accomplished in two steps. In the first step, the length of the boundary segment is evaluated using the mean value theorem, and the grids are then clustered as desired using appropriate linear clustering functions. In the second step, once the coordinates of the cell nodes have been computed and the incremental distance between each pair of nodes has been evaluated, the original coordinate of each node is computed using the same fitted polynomial with the mean value theorem, but in reverse.
The method is utilized to predict
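As a rough illustration of the two-step procedure above (with a numerical arc-length integral and a tanh stretching law standing in for the paper's mean-value-theorem evaluation and linear clustering functions), node clustering along a polynomial-fitted boundary might look like this:

```python
import numpy as np

def clustered_nodes(poly, x0, x1, n_nodes=21, beta=1.5, samples=5000):
    # Step 1: cumulative arc length s(x) along the fitted polynomial boundary.
    x_fine = np.linspace(x0, x1, samples)
    dydx = np.polyval(np.polyder(poly), x_fine)
    ds = np.sqrt(1.0 + dydx ** 2)
    s_fine = np.concatenate(([0.0],
                             np.cumsum(0.5 * (ds[1:] + ds[:-1]) * np.diff(x_fine))))
    total = s_fine[-1]                       # length of the boundary segment
    # Step 2: distribute clustered nodes along the length (tanh law crowds them
    # toward x1), then invert s(x) numerically to recover each node's x-coordinate.
    u = np.linspace(0.0, 1.0, n_nodes)
    s = total * (1.0 - np.tanh(beta * (1.0 - u)) / np.tanh(beta))
    x_nodes = np.interp(s, s_fine, x_fine)
    return x_nodes, np.polyval(poly, x_nodes)

# Example: cluster 21 nodes on the boundary y = 0.5 x^2 - x + 2 over [0, 4].
xs, ys = clustered_nodes([0.5, -1.0, 2.0], 0.0, 4.0)
```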
The first section of this research discusses the research methodology from several angles: the problem it addresses, its importance, its objectives and boundaries, the way the information was collected, and its hypothesis.
The second chapter discusses the press industry and its development, the importance and types of newspapers, and their strengths and weaknesses.
The third chapter addresses the scientific side and how to select the research hypothesis. It also discusses the face validity and reliability tests that support the analysis until results are obtained, so that the correct hypothesis for the research can be chosen.
Finally, the fourth chapter highlights the be
Comparative literature is an important research field for finding new relations and results that other types of studies do not allow.
The present research is a comparative study of two contemporary poets: Al-Sayyab and Prévert. The motivation for this research is Al-Sayyab’s reading of Western literature. Moreover, the study sheds light on translational criticism.
It tackles the lives of the two writers and their points of similarity and difference. Prévert and Al-Sayyab are two modern poets. The first employed his daily routines to express reality, especially the events of the two world wars. The second’s pain, on the other hand, was the starting point for expressing others’ suffe
Introduction: Carrier-based gutta-percha is an effective method of root canal obturation that creates a three-dimensional filling; however, retrieval of the plastic carrier is relatively difficult, particularly with smaller sizes. The purpose of this study was to develop composite carriers consisting of polyethylene (PE), hydroxyapatite (HA), and strontium oxide (SrO) for carrier-based root canal obturation. Methods: Composite fibers of HA, PE, and SrO were fabricated in the shape of a carrier for delivering gutta-percha (GP) using a melt-extrusion process. The fibers were characterized using infrared spectroscopy, and the thermal properties were determined using differential scanning calorimetry. The elastic modulus and tensile strength tests were dete
Plagiarism is becoming more of a problem in academia. It is made worse by the ease with which a wide range of resources can be found on the internet, as well as the ease with which they can be copied and pasted. It is academic theft, since the perpetrator has “taken” and presented the work of others as his or her own. Manual detection of plagiarism by a human being is difficult, imprecise, and time-consuming, because it is hard for anyone to compare their work against all existing data. Plagiarism is a big problem in higher education, and it can happen on any topic. Plagiarism detection has been studied in many scientific articles, and recognition methods have been created utilizing plagiarism analysis, authorship identification, and
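As a toy illustration of one common plagiarism-analysis building block (not a method taken from the paper), a submission can be scored against a reference corpus with TF-IDF vectors and cosine similarity; the 0.5 threshold below is arbitrary.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def similarity_scores(submission, corpus):
    # Vectorize the submission together with the corpus and score each pair.
    vec = TfidfVectorizer(ngram_range=(1, 3), stop_words="english")
    tfidf = vec.fit_transform([submission] + corpus)
    return cosine_similarity(tfidf[0:1], tfidf[1:]).ravel()

scores = similarity_scores(
    "students copy and paste text from the web",
    ["copying and pasting text from the web is common",
     "support vector machines separate two classes"])
suspects = [i for i, s in enumerate(scores) if s > 0.5]   # flag suspicious sources
```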