The present study develops a comparative analysis of the estimation of Meixner process parameters via the wavelet packet transform. In particular, it compares the method of moments and the wavelet packet estimator for the four parameters of the Meixner process. The research focuses on selecting the best threshold value, using the square-root-log and modified square-root-log rules with wavelet packets in the presence of noise, to improve the efficiency and effectiveness of denoising financial asset market signals. A simulation study compares the performance of moment estimation and wavelet packets for different sample sizes. The results show that wavelet packets perform best when combined with a robust thresholding strategy. The applicability of the proposed method is illustrated with a real-world application from the foreign exchange market, emphasizing the use of wavelet packets for parameter estimation and the potential for improvement in stochastic modeling and analysis.
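As a rough illustration of the thresholding step described above, the sketch below denoises a noisy series with a wavelet packet decomposition and a square-root-log (universal) threshold. The wavelet choice ('db4'), the decomposition depth, and the noise-scale estimate taken from the highest-frequency node are assumptions for illustration, not details taken from the paper.

```python
import numpy as np
import pywt

def sqrt_log_denoise(x, wavelet="db4", level=3):
    """Denoise a 1-D signal with a wavelet packet transform and a
    sqrt(2*log n) ("square-root-log") threshold on the detail nodes."""
    wp = pywt.WaveletPacket(data=x, wavelet=wavelet, mode="symmetric", maxlevel=level)
    nodes = wp.get_level(level, order="freq")

    # Robust noise-scale estimate from the highest-frequency node (assumption).
    sigma = np.median(np.abs(nodes[-1].data)) / 0.6745
    lam = sigma * np.sqrt(2.0 * np.log(len(x)))      # universal / sqrt-log threshold

    for node in nodes:
        if node.path != "a" * level:                 # keep the pure approximation node
            node.data = pywt.threshold(node.data, lam, mode="soft")

    return wp.reconstruct(update=False)[: len(x)]

# Example: denoise a noisy log-price-like series (synthetic, for illustration only).
rng = np.random.default_rng(0)
noisy = np.cumsum(rng.normal(0, 0.01, 1024)) + rng.normal(0, 0.05, 1024)
clean = sqrt_log_denoise(noisy)
```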
This research aims to examine the reality of teaching the political science research methods curriculum, to observe practices, and to identify differences in teaching and learning between Arab and Western universities. It also focuses on the difficulties that students face in acquiring the course skills. The research uses the course models of several Western and Arab universities as case studies.
This research shows that the curriculum has not yet reached its final form, unlike other political science curricula, and that its upcoming changes will reflect the needs of stakeholders. The best way to teach this curriculum is to use applied learning in groups, learning by doing, and, finally, a problem-based learning approach. Using optimal assessment deep
Reflection cracking in asphalt concrete (AC) overlays is a common form of pavement deterioration that occurs when underlying cracks and joints in the pavement structure propagate through an overlay due to thermal and traffic-induced movement, ultimately degrading the pavement’s lifespan and performance. This study aims to determine how alterations in overlay thickness and temperature conditions, the incorporation of chopped fibers, and the use of geotextiles influence the overlay’s capacity to postpone the occurrence of reflection cracking. To achieve the above objective, a total of 36 prism specimens were prepared and tested using an overlay testing machine (OTM). The variables considered in this study were the thickness of the
Permeability data are of major importance and must be handled in all reservoir simulation studies. The importance of permeability data increases in mature oil and gas fields because of its sensitivity to the requirements of certain improved recovery methods. However, the industry has a huge stock of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study proposes a correlation to convert air permeability data, conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of data loss and poorly consolidated formations, or in cas
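The abstract does not give the functional form of the proposed correlation. As a neutral illustration of how such an air-to-liquid permeability correlation might be calibrated, the sketch below fits a power-law relation k_liq = a · k_air^b by least squares in log-log space; the sample values and the power-law form are assumptions for illustration only, not the study's correlation.

```python
import numpy as np

# Hypothetical paired core measurements in millidarcies (not data from the study).
k_air = np.array([1.2, 5.8, 14.0, 42.0, 110.0, 350.0])
k_liq = np.array([0.7, 3.9, 10.1, 33.0, 92.0, 305.0])

# Fit log10(k_liq) = log10(a) + b * log10(k_air) by ordinary least squares.
b, log_a = np.polyfit(np.log10(k_air), np.log10(k_liq), 1)
a = 10.0 ** log_a

def liquid_perm(k_air_md):
    """Estimate liquid permeability (mD) from a measured air permeability (mD)."""
    return a * k_air_md ** b

print(f"k_liq ≈ {a:.3f} * k_air^{b:.3f}")
print(f"estimated k_liq at k_air = 75 mD: {liquid_perm(75.0):.1f} mD")
```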
In this work, a comparative analysis of the behavior and pattern of variation of the IF2 and T ionospheric indices was conducted for the minimum and maximum years of solar cycles 23 and 24. The correlation between the two ionospheric indices was also examined for the seasonal periods spanning August 1996 to November 2008 for solar cycle 23 and December 2008 to November 2019 for solar cycle 24. Statistical calculations were performed to compare predicted values with observed values for the selected indices during the tested timeframes. The study's findings revealed that the examined indices exhibited broadly similar variations throughout the studied timeframe. The seasonal variations were
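As a small illustration of the kind of statistical comparison described (correlation between two index series and a predicted-versus-observed check), the sketch below computes a Pearson correlation coefficient and a root-mean-square error with NumPy; the array names and sample values are placeholders, not data from the study.

```python
import numpy as np

# Placeholder monthly index series (not the study's data).
if2_index = np.array([12.1, 14.3, 18.7, 22.5, 19.9, 15.4])
t_index   = np.array([11.8, 14.9, 19.2, 21.7, 20.5, 15.0])

# Pearson correlation between the two ionospheric indices.
r = np.corrcoef(if2_index, t_index)[0, 1]

# Predicted-versus-observed comparison via root-mean-square error.
predicted = np.array([12.0, 14.5, 19.0, 22.0, 20.0, 15.5])
rmse = np.sqrt(np.mean((predicted - if2_index) ** 2))

print(f"Pearson r = {r:.3f}, RMSE = {rmse:.3f}")
```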
Twelve species from the Brassicaceae family were studied using two different molecular techniques, RAPD and ISSR; both techniques were used to detect molecular markers associated with genotype identification. RAPD analysis, using five random primers, revealed 241 amplified fragments, 62 of which were polymorphic (26%).
ISSR results showed that, of seven primers, three (ISSR3, UBC807, UBC811) could not amplify the genomic DNA; the remaining primers revealed 183 amplified fragments, 36 of which were polymorphic (20%). The similarity values and the dendrogram of genetic distances obtained by combining the two techniques showed that the highest similarity, 0.897, was between the va
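To make the marker analysis concrete, the sketch below scores a small binary band-presence matrix, computes the percentage of polymorphic fragments, and builds a UPGMA dendrogram from pairwise Jaccard distances with SciPy. The matrix, taxon labels, and distance metric are invented placeholders, not the RAPD/ISSR data or similarity coefficient of the study.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

# Rows = taxa, columns = scored fragments (1 = band present, 0 = absent).
# Purely illustrative values, not the study's RAPD/ISSR scores.
bands = np.array([
    [1, 1, 0, 1, 1, 0, 1, 1],
    [1, 1, 1, 1, 0, 0, 1, 1],
    [1, 0, 0, 1, 1, 1, 1, 0],
    [1, 1, 0, 1, 1, 0, 0, 1],
])
taxa = ["sp1", "sp2", "sp3", "sp4"]

# A fragment is polymorphic if it is neither present in all taxa nor absent in all.
col_sums = bands.sum(axis=0)
polymorphic = np.sum((col_sums > 0) & (col_sums < bands.shape[0]))
print(f"polymorphic fragments: {polymorphic}/{bands.shape[1]} "
      f"({100 * polymorphic / bands.shape[1]:.1f}%)")

# Pairwise Jaccard distances and a UPGMA (average-linkage) dendrogram.
dist = pdist(bands, metric="jaccard")
tree = linkage(dist, method="average")
leaf_order = dendrogram(tree, labels=taxa, no_plot=True)["ivl"]
print("dendrogram leaf order:", leaf_order)
```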
The area of character recognition has received considerable attention from researchers all over the world during the last three decades. This research explores the best sets of feature-extraction techniques and studies the accuracy of well-known classifiers for Arabic numerals using statistical approaches in two methods, with a comparative study between them. The first method, a linear discriminant function, yields results with accuracy as high as 90% of the original grouped cases correctly classified. The second method is a proposed algorithm; the results show the efficiency of the proposed algorithms, which achieve recognition accuracies of 92.9% and 91.4%, outperforming the first method.
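As a rough sketch of the first (linear discriminant) approach, the snippet below trains scikit-learn's LinearDiscriminantAnalysis on its bundled handwritten-digit dataset and reports classification accuracy. The dataset, feature set, and train/test split are stand-ins, not the Arabic-numeral data or features used in the paper.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Placeholder digit data (8x8 grayscale images flattened to 64 features).
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Linear discriminant classifier, analogous in spirit to the paper's first method.
lda = LinearDiscriminantAnalysis()
lda.fit(X_train, y_train)
print(f"test accuracy: {lda.score(X_test, y_test):.3f}")
```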
The Gumbel distribution has been treated with great care by researchers and statisticians. Traditional methods for estimating the two parameters of the Gumbel distribution include maximum likelihood, the method of moments, and, more recently, the resampling method known as the jackknife. However, these methods suffer from some mathematical difficulties when solved analytically. Accordingly, there are other non-traditional methods, such as the nearest-neighbors principle used in computer science, and in particular artificial intelligence algorithms, including the genetic algorithm, the artificial neural network algorithm, and others that may be classified as meta-heuristic methods. Moreover, the principle of nearest neighbors has useful statistical featu
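As a worked example of the classical baseline mentioned above, the sketch below computes method-of-moments estimates of the Gumbel location and scale parameters (β̂ = s·√6/π, μ̂ = x̄ − γβ̂, with γ the Euler-Mascheroni constant). The simulated sample is for illustration only.

```python
import numpy as np

EULER_GAMMA = 0.57721566490153286  # Euler-Mascheroni constant

def gumbel_moments(x):
    """Method-of-moments estimates of the Gumbel location (mu) and scale (beta)."""
    x = np.asarray(x, dtype=float)
    beta_hat = x.std(ddof=1) * np.sqrt(6.0) / np.pi
    mu_hat = x.mean() - EULER_GAMMA * beta_hat
    return mu_hat, beta_hat

# Simulated sample with known parameters (mu=10, beta=2), for illustration only.
rng = np.random.default_rng(1)
sample = rng.gumbel(loc=10.0, scale=2.0, size=500)
mu_hat, beta_hat = gumbel_moments(sample)
print(f"mu_hat = {mu_hat:.3f}, beta_hat = {beta_hat:.3f}")
```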
A new approach to baud time (or baud rate) estimation for a random binary signal is presented. This approach utilizes the spectrum of the signal after nonlinear processing, so that the estimation error can be reduced simply by increasing the number of processed samples rather than increasing the sampling rate. The spectrum of the new signal is shown to give an accurate estimate of the baud time when there is no a priori information and no restrictive assumptions. The performance of the estimator for random binary square waves perturbed by white Gaussian noise and ISI is evaluated and compared with that of the conventional zero-crossing-detector estimator.
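To illustrate the general idea of recovering a baud-rate line from the spectrum of a nonlinearly processed signal (not the specific estimator of the paper), the sketch below squares the derivative of a random NRZ waveform and locates the strongest spectral peak. The choice of nonlinearity, the waveform parameters, the smoothing step, and the peak search are all assumptions.

```python
import numpy as np

fs = 10_000.0                    # sampling rate (Hz), illustrative
baud = 400.0                     # true symbol rate (Hz), illustrative
n_sym, sps = 2_000, int(fs / baud)

# Random NRZ (+/-1) waveform plus white Gaussian noise.
rng = np.random.default_rng(2)
symbols = rng.choice([-1.0, 1.0], size=n_sym)
x = np.repeat(symbols, sps) + 0.2 * rng.normal(size=n_sym * sps)

# Nonlinear processing: the squared derivative marks symbol transitions,
# producing a spectral line at the baud rate.
y = np.diff(x) ** 2
# Light smoothing (illustrative choice) suppresses harmonics of the clock line.
y = np.convolve(y, np.ones(sps // 2) / (sps // 2), mode="same")

spectrum = np.abs(np.fft.rfft(y - y.mean()))
freqs = np.fft.rfftfreq(len(y), d=1.0 / fs)

# Skip the near-DC region and take the strongest line as the baud-rate estimate.
mask = freqs > 50.0
est_baud = freqs[mask][np.argmax(spectrum[mask])]
print(f"estimated baud rate: {est_baud:.1f} Hz (true {baud:.0f} Hz)")
```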
This paper presents a comparison between the electroencephalogram (EEG) channels recorded during scoliosis correction surgeries. Surgeons use many hand tools and electronic devices that directly affect the EEG channels, and these noise sources do not affect the channels uniformly. This research provides a complete system for finding the channel least affected by noise. The presented system consists of five stages: filtering; wavelet decomposition (level 4); processing the signal bands using four different criteria (mean, energy, entropy, and standard deviation); selecting the most useful channel according to the criteria values; and, finally, generating a combined signal from Channels 1 and 2. Experimentally, two channels of EEG data were recorded fro
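As a rough sketch of the band-criteria stage described above, the snippet below decomposes a signal to level 4 with a discrete wavelet transform and computes the mean, energy, Shannon entropy, and standard deviation of each band. The wavelet name, the entropy definition, and the test signal are assumptions, not details from the paper.

```python
import numpy as np
import pywt

def band_criteria(signal, wavelet="db4", level=4):
    """Level-4 wavelet decomposition and per-band mean/energy/entropy/std."""
    bands = pywt.wavedec(signal, wavelet, level=level)  # [cA4, cD4, cD3, cD2, cD1]
    stats = []
    for coeffs in bands:
        energy = np.sum(coeffs ** 2)
        p = coeffs ** 2 / energy                     # normalized coefficient energies
        entropy = -np.sum(p * np.log2(p + 1e-12))    # Shannon entropy (assumed form)
        stats.append({"mean": coeffs.mean(),
                      "energy": energy,
                      "entropy": entropy,
                      "std": coeffs.std(ddof=1)})
    return stats

# Placeholder "EEG" channel: sine plus noise, for illustration only.
rng = np.random.default_rng(3)
t = np.arange(0, 4, 1 / 256.0)
channel = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)
for name, s in zip(["cA4", "cD4", "cD3", "cD2", "cD1"], band_criteria(channel)):
    print(name, {k: round(float(v), 3) for k, v in s.items()})
```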