The Korteweg-de Vries equation plays an important role in fluid physics and applied mathematics, and is fundamental to the study of shallow water waves. The equation arises in many applications and physical phenomena, it is well established that it admits solitary waves as solutions, and it is used to characterize long waves travelling in channels. The goal of this paper is to construct a new, effective recurrence relation for these problems: the semi-analytic iterative technique offers a new way to solve Korteweg-de Vries equations. The distinctive feature of this method is that it can be used to obtain approximate travelling-wave solutions of nonlinear partial differential equations with a small amount of computation, and it does not require the restrictive assumptions or transformations of other conventional methods. In addition, several examples clarify the relevant features of the presented method, and the results of this study are discussed to show that the method is a powerful and promising tool and to illustrate its accuracy and efficiency on these problems. The Matlab symbolic manipulator was used to evaluate the results in the iterative process.
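As a small illustration of the solitary-wave property mentioned above (this is a standard symbolic check, not the paper's semi-analytic iterative scheme), the sketch below verifies with SymPy that the classical one-soliton profile satisfies the KdV equation u_t + 6uu_x + u_xxx = 0; the wave speed c = 4 is an arbitrary choice.

```python
import sympy as sp

x, t = sp.symbols('x t', real=True)
c = sp.Rational(4)  # wave speed; any positive value works

# Classical one-soliton solution of u_t + 6*u*u_x + u_xxx = 0:
# u(x, t) = (c/2) * sech(sqrt(c)/2 * (x - c*t))**2, written via cosh.
u = c / 2 * sp.cosh(sp.sqrt(c) / 2 * (x - c * t)) ** (-2)

# Residual of the KdV equation; it vanishes identically for this u.
residual = sp.diff(u, t) + 6 * u * sp.diff(u, x) + sp.diff(u, x, 3)

# Evaluate the exact symbolic residual at a few points numerically.
res_num = sp.lambdify((x, t), residual, 'math')
checks = [abs(res_num(xi, ti)) for xi in (-1.0, 0.3, 2.0)
          for ti in (0.0, 0.5)]
max_residual = max(checks)
print(max_residual)  # at floating-point round-off level
```

Because the derivatives are taken symbolically, the residual is identically zero, and the numeric spot checks only measure floating-point round-off.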
In this paper, some new types of regularity axioms, namely pairwise quasi-regular, pairwise semi-regular, pairwise pseudo-regular and pairwise regular, are defined and studied in both Čech fuzzy soft bi-closure spaces (bicsp's) and their induced fuzzy soft bitopological spaces. We also study the relationships between them. We show that for all these types of axioms, the hereditary property is satisfied under closed fs bi-csubspaces. Furthermore, we define some normality axioms, namely pairwise semi-normal, pairwise pseudo-normal, pairwise normal and pairwise completely normal, in both bicsp's and their induced fuzzy soft bitopological spaces, and study their basic properties and the relationships between them.
The log-logistic distribution is one of the important statistical distributions, as it can be applied in many fields, including biological and other experiments, and its importance comes from the importance of determining the survival function in those experiments. This research compares the maximum likelihood, least squares, and weighted least squares methods for estimating the parameters and survival function of the log-logistic distribution, using the comparison criteria MSE, MAPE, and IMSE. The research was applied to real data for breast cancer patients. The results showed that the maximum likelihood method is best for estimating the parameters.
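A minimal sketch of the maximum-likelihood step, using SciPy's `fisk` distribution (SciPy's name for the log-logistic); the shape 2.5, scale 3, and sample size here are illustrative assumptions, not the paper's patient data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical survival-time sample from a log-logistic (Fisk)
# distribution with shape c = 2.5 and scale 3 (location fixed at 0).
data = stats.fisk.rvs(c=2.5, scale=3.0, size=2000, random_state=rng)

# Maximum-likelihood fit; floc=0 keeps the two-parameter form.
c_hat, loc_hat, scale_hat = stats.fisk.fit(data, floc=0)

# Estimated survival function S(t) = 1 - F(t) at t = 3
# (the true value is 0.5, since t equals the true scale).
S3 = stats.fisk.sf(3.0, c_hat, loc=loc_hat, scale=scale_hat)
print(c_hat, scale_hat, S3)
```

The same fitted parameters feed directly into comparison criteria such as MSE of the estimated survival function against an empirical one.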
Solutions of partial differential equations, whether linear, nonlinear, homogeneous, or non-homogeneous, are important in many applications, including engineering, physics and astronomy, the medical sciences, and life technology, and in solving the heat, wave, Laplace, and telegraph equations, among others. In this paper, a new double integral transform is proposed.
In this work, we introduce a new double transform, the Double Complex EE Transform. In addition, we present the convolution theorem and prove the properties of the proposed transform, which has an effective and useful role in solving two-dimensional partial differential equations.
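To illustrate the general idea of a double integral transform (the example below uses the classical double Laplace transform, not the paper's Double Complex EE Transform), the sketch evaluates F(p, s) = ∫∫ e^(-px-st) f(x, t) dx dt numerically for f(x, t) = e^(-x-t), whose exact transform is 1/((p+1)(s+1)).

```python
import numpy as np
from scipy.integrate import dblquad

def double_laplace(f, p, s):
    """Numerically evaluate the double Laplace transform of f(x, t)
    over the first quadrant: F(p, s) = int_0^inf int_0^inf
    exp(-(p*x + s*t)) * f(x, t) dt dx."""
    integrand = lambda t, x: np.exp(-(p * x + s * t)) * f(x, t)
    value, _err = dblquad(integrand, 0, np.inf, 0, np.inf)
    return value

# f(x, t) = e^{-(x+t)} has transform 1 / ((p + 1)(s + 1)),
# so F(1, 1) = 1/4.
F_num = double_laplace(lambda x, t: np.exp(-(x + t)), p=1.0, s=1.0)
print(F_num)  # close to 0.25
```

A new transform of this kind is typically defined by replacing the kernel e^(-px-st) with the paper's kernel; the numerical machinery stays the same.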
The block cipher technique is a cryptography technique that encrypts data block by block. Serpent was one of the AES candidates. It encrypts a 128-bit block using 32 rounds of a similar calculation utilizing permutations and substitutions. Since its permutations and substitutions are static, this paper proposes dynamic methods for permutation, substitution, and key generation based on chaotic maps to obtain more security. The proposed methods were analyzed, and the results showed that they overcome the weakness resulting from the use of static permutation and substitution boxes in the original algorithm and can also reduce the number of rounds and the time usage compared with the classical Serpent block cipher.
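One common way to derive a permutation from a chaotic map is the sort-based construction sketched below; this is one plausible reading of the dynamic-permutation idea, not the paper's exact construction, and the parameters x0 and r (which act as a key) are illustrative assumptions.

```python
def logistic_permutation(n, x0=0.376, r=3.99, burn_in=100):
    """Derive a permutation of range(n) from a logistic-map trajectory.

    The chaotic sequence x_{k+1} = r * x_k * (1 - x_k) is iterated,
    a transient is discarded, and the indices 0..n-1 are ordered by
    the sampled values. Different keys (x0, r) give different
    permutations; the same key always reproduces the same one.
    """
    x = x0
    for _ in range(burn_in):          # discard the transient
        x = r * x * (1 - x)
    samples = []
    for _ in range(n):
        x = r * x * (1 - x)
        samples.append(x)
    return sorted(range(n), key=lambda i: samples[i])

perm = logistic_permutation(16)
print(perm)
```

Substitution boxes can be built the same way by permuting the value range 0..255, which is what makes the boxes key-dependent rather than static.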
This research aims to study the optical characteristics of semiconductor quantum dots (QDs) composed of CdTe and CdTe/CdSe core-shell structures. The reflux method is used to synthesize these nanoscale particles, and the growth process is followed by monitoring their optical properties over varied periods of time at pH 12. Specifically, the optical evolution of these QDs is evaluated using photoluminescence (PL) and ultraviolet (UV) spectroscopy. For CdTe QDs, a consistent increase in absorbance and peak intensity was observed across the spectrum over time. Conversely, CdTe/CdSe QDs displayed distinctive variations in absorbance and peak intensity. These disparities might stem from irregularities in forming the selenium (Se) layers.
As the banking sector strongly influences the country's economic growth, the solid financial well-being of any bank is not only a guarantee for its investors; it is also important for owners and workers and for the economy in all its joints. Capital adequacy and asset quality are essential to the functioning of the banking business. In this study, the research sample included four private banks, using quarterly data for the period 2011-2018. Data were also collected from articles, papers, the Internet, and specialized international journals. In this research, an effort was made to determine the effect of the ratio of owned capital to deposits on the value of the bank.
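As a hedged illustration of estimating such an effect, the sketch below fits an ordinary least-squares line of a bank-value series on a capital-to-deposits ratio; every number here is synthetic and labeled as such, and this is not the study's data or its exact econometric model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative quarterly panel (8 years x 4 quarters = 32 points):
# x = capital-to-deposits ratio, y = a synthetic 'bank value' series
# generated from an assumed linear relation y = 2 + 5x + noise.
x = rng.uniform(0.05, 0.25, size=32)
y = 2.0 + 5.0 * x + rng.normal(0.0, 0.1, size=32)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # [intercept, slope]
```

The slope estimate is the quantity of interest: it measures how bank value moves with the capital-to-deposits ratio under the assumed linear specification.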
The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, together with flexible, level-dependent threshold values for the case of correlated errors, which treat the coefficients at each level separately, unlike universal threshold values that treat all levels simultaneously, such as the VisuShrink, False Discovery Rate, Improvement Thresholding, and SureShrink methods. The study was conducted on real monthly data on theft-crime rates.
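The core shrinkage step can be sketched with a one-level Haar transform and soft thresholding; the universal (VisuShrink-style) threshold below is used only for illustration, and the paper's level-dependent thresholds and boundary polynomial are not reproduced. The signal and noise level are assumed.

```python
import numpy as np

def haar_level1(signal):
    """One level of the orthonormal Haar DWT; length must be even."""
    pairs = signal.reshape(-1, 2)
    approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)
    detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)
    return approx, detail

def inverse_haar_level1(approx, detail):
    """Exact inverse of haar_level1."""
    even = (approx + detail) / np.sqrt(2)
    odd = (approx - detail) / np.sqrt(2)
    return np.column_stack([even, odd]).ravel()

def soft_threshold(coeffs, lam):
    """Soft shrinkage: move coefficients toward zero by lam."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - lam, 0.0)

rng = np.random.default_rng(2)
n = 256
clean = np.sin(np.linspace(0, 4 * np.pi, n))
noisy = clean + rng.normal(0.0, 0.3, n)

# Shrink only the detail band; sigma = 0.3 is assumed known here.
approx, detail = haar_level1(noisy)
lam = 0.3 * np.sqrt(2 * np.log(n))   # universal threshold
denoised = inverse_haar_level1(approx, soft_threshold(detail, lam))

mse_noisy = float(np.mean((noisy - clean) ** 2))
mse_denoised = float(np.mean((denoised - clean) ** 2))
print(mse_noisy, mse_denoised)
```

Level-dependent schemes replace the single `lam` with one threshold per decomposition level, chosen from the coefficients of that level, which is what makes them robust to correlated errors.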
Studying extreme precipitation is very important in Iraq. In particular, the last decade witnessed an increasing trend in extreme precipitation as the climate changes, some of which caused disastrous consequences for the social and economic environment in many parts of the country. In this paper a statistical analysis of rainfall data is performed. Annual maximum rainfall data, obtained from monthly records for a period of 127 years (1887-2013 inclusive) at the Baghdad meteorology station, were analyzed. The three distributions chosen to fit the data were the Gumbel, Fréchet, and generalized extreme value (GEV) distributions. Using the maximum likelihood method, results showed that the GEV distribution fit best, followed by the Fréchet distribution.
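A minimal sketch of fitting a GEV distribution to annual maxima by maximum likelihood with SciPy; the series below is simulated with assumed parameters, not the Baghdad record, and note that SciPy's shape parameter c is the negative of the usual GEV shape ξ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical annual-maximum rainfall series (mm), 127 values,
# simulated from a GEV with location 60, scale 15, and SciPy shape
# c = -0.1 (i.e. a heavy-tailed, Frechet-type GEV).
annual_max = stats.genextreme.rvs(c=-0.1, loc=60.0, scale=15.0,
                                  size=127, random_state=rng)

# Maximum-likelihood GEV fit.
c_hat, loc_hat, scale_hat = stats.genextreme.fit(annual_max)

# 100-year return level = the 0.99 quantile of the fitted GEV.
rl_100 = stats.genextreme.ppf(0.99, c_hat, loc=loc_hat,
                              scale=scale_hat)
print(c_hat, loc_hat, scale_hat, rl_100)
```

The Gumbel and Fréchet fits are special cases (c = 0 and c < 0 in SciPy's convention), so the same code supports the three-way comparison by likelihood.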
The current study aims to apply methods of evaluating investment decisions to extract the highest value and reduce the economic and environmental costs of the health sector according to the strategy. To achieve the objectives of the study, the researcher relied on the deductive approach on the theoretical side, collecting sources and previous studies, and used the applied practical approach, relying on the data and reports of Amir almuminin Hospital for the period (2017-2031) for the purpose of evaluating investment decisions in the hospital. The study reached a set of conclusions, the most important of which is the failure to apply
A fault is an error that affects system behaviour. A software metric is a value that represents the degree to which software processes work properly and where faults are more likely to occur. In this research, we study the effects of removing redundancy and applying a log transformation, based on threshold values, for identifying fault-prone classes of software. The study also compares the metric values of the original dataset with those after removing redundancy and log transformation. E-learning and system datasets were taken as case studies. The fault ratio ranged from 1%-31% and 0%-10% for the original dataset, and 1%-10% and 0%-4% after removing redundancy and log transformation, respectively.
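A toy sketch of the preprocessing described above: removing duplicate metric values, applying a log transform, and flagging classes above a threshold. The metric values and the mean-based threshold are illustrative assumptions, not the study's datasets or its exact threshold rule.

```python
import math

# Hypothetical class-level metric values (e.g., lines of code),
# with duplicates that inflate the raw dataset.
raw = [120, 120, 45, 300, 45, 980, 10, 300]

# Step 1: remove redundancy (duplicate metric values).
deduped = sorted(set(raw))

# Step 2: log transform, using log(v + 1) to keep 0 safely mapped.
logged = [math.log(v + 1) for v in deduped]

# Step 3: flag values above a simple mean-based threshold as
# fault-prone candidates.
threshold = sum(logged) / len(logged)
fault_prone = [v for v, lv in zip(deduped, logged) if lv > threshold]
print(deduped, fault_prone)
```

The log transform compresses the long tail of large metric values, so a single threshold separates the dataset more evenly than it would on the raw scale.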