In this research, a 4×4 factorial experiment laid out in a randomized complete block design was studied. Designed experiments are used to study the effect of treatments on experimental units and thus to obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions introduces noise that distorts the observed values and thereby inflates the experiment's mean square error. To reduce this noise, multilevel wavelet shrinkage was used to filter the observations: an improved threshold is proposed that accounts for the different decomposition levels through a base logarithm, yielding several values of the proposed threshold; the Haar wavelet was then applied with the hard and mid thresholding rules, and the results were compared according to several criteria.
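As a minimal illustration of the filtering step, the NumPy sketch below implements a single-level Haar transform with hard thresholding. The paper's multilevel decomposition and its logarithm-based threshold values are not reproduced here; the threshold `t` is left as a free parameter, and the function names are illustrative.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar wavelet transform (assumes even length)."""
    s = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return s, d

def haar_idwt(s, d):
    """Inverse of one Haar level."""
    x = np.empty(2 * len(s))
    x[0::2] = (s + d) / np.sqrt(2)
    x[1::2] = (s - d) / np.sqrt(2)
    return x

def hard_threshold(d, t):
    """Hard rule: zero out detail coefficients with |d| <= t."""
    return np.where(np.abs(d) > t, d, 0.0)

def denoise(x, t):
    """Filter a signal by thresholding its Haar detail coefficients."""
    s, d = haar_dwt(x)
    return haar_idwt(s, hard_threshold(d, t))
```

With `t = 0` the signal is reconstructed exactly; a large `t` replaces each sample pair by its mean, which is the smoothing effect the shrinkage relies on.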
The Hubble telescope is characterized by the accuracy of the images it forms, since its surroundings are free of optical pollutants such as atmospheric gases and dust, as well as of the light pollution produced by industrial and natural light sources on the Earth's surface. The telescope has a relatively large primary mirror that gathers enough light to form a good image, because astronomical observation requires sufficient light intensity from celestial objects (galaxies, stars, planets, etc.). The Hubble telescope is classified as a Cassegrain reflecting telescope, which gives it the advantage of eliminating chromatic aberration.
This paper discusses a comparative study relating parametric and non-parametric mode decomposition algorithms for response-only data. Three popular mode decomposition algorithms are included: the Eigensystem Realization Algorithm with the Natural Excitation Technique (NExT-ERA) as the parametric algorithm, and Principal Component Analysis (PCA) and Independent Component Analysis (ICA) as the non-parametric algorithms. A comprehensive parametric study of (i) response type, (ii) excitation type, (iii) system damping, and (iv) sensor spatial resolution compares the mode shapes and modal coordinates using a 10-DOF building model. The mode decomposition results are also compared using
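As a rough sketch of the non-parametric PCA step, the snippet below extracts spatial mode shapes and modal coordinates from a matrix of measured responses via the SVD. The function name and interface are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def pca_modes(Y, n_modes):
    """PCA/POD mode decomposition of response-only data.

    Y: (n_samples, n_dof) array of measured responses.
    Returns (modes, coords): spatial mode shapes (n_dof, n_modes)
    and modal coordinates (n_samples, n_modes).
    """
    Yc = Y - Y.mean(axis=0)                        # remove the mean response
    U, S, Vt = np.linalg.svd(Yc, full_matrices=False)
    modes = Vt[:n_modes].T                         # principal directions ~ mode shapes
    coords = Yc @ modes                            # projections ~ modal coordinates
    return modes, coords
```

The extracted modes are orthonormal by construction; how closely they match the true structural mode shapes depends on the mass distribution and modal energy separation, which is exactly what the paper's parametric study probes.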
The unstable and uncertain nature of natural rubber prices makes them highly volatile and prone to outliers, which can significantly affect both modeling and forecasting. To tackle this issue, the author recommends a hybrid model that combines the autoregressive (AR) and Generalized Autoregressive Conditional Heteroscedasticity (GARCH) models. The model utilizes the Huber weighting function to ensure that rubber price forecasts remain stable even in the presence of outliers. The study aims to develop a sustainable model and to forecast daily prices over a 12-day period by analyzing 2683 daily price observations of Standard Malaysian Rubber Grade 20 (SMR 20) in Malaysia. The analysis incorporates two dispersion measurements (I
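A minimal sketch of the Huber weighting idea used to bound the influence of outliers: residuals within a cutoff get full weight, larger ones are downweighted. The tuning constant `c = 1.345` is the conventional choice for 95% efficiency under normal errors; the hybrid AR-GARCH estimation itself is not reproduced here.

```python
import numpy as np

def huber_weight(r, c=1.345):
    """Huber weights for standardized residuals r:
    w = 1 for |r| <= c, and w = c/|r| beyond, so a single
    extreme observation cannot dominate the fit."""
    r = np.asarray(r, dtype=float)
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))
```

In an iteratively reweighted estimation these weights multiply each observation's contribution, so outlying prices are retained but discounted rather than deleted.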
Sufficient high-quality data are unavailable to describe the management approach and guidelines for COVID-19 in the pediatric and adolescent population, which may be due to the mild presentation in most cases and less severe complications than at older ages.
The World Health Organization was concerned with establishing an approved guideline for managing the increasing number of COVID-19 patients worldwide, aiming to prevent or lessen the global burden of COVID-19.
The clinical features span a wide spectrum, ranging from uncomplicated mild illness and mild-to-moderate pneumonia to severe pneumonia, acute respiratory distress syndrome, sepsis, septic shock, and multisystem inflammatory syndrome in children.
Many important definitions
This study is unique in its field, combining three branches of technology: photometry, spectroscopy, and image processing. The work treats each pixel of an image according to its color, where the color corresponds to a specific wavelength on the RGB line; an image therefore carries many wavelengths across its pixels. The results identify the elements on the surface of a comet's nucleus, giving not only the elements themselves but also their mapping across the nucleus. The work considered 12 elements in two comets (Tempel 1 and 67P/Churyumov-Gerasimenko). These elements have strong emission lines in the visible range, which were recognized by our MATLAB program during image processing. The percen
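As an illustration of the pixel-to-wavelength idea (not the paper's calibrated MATLAB procedure), a common rough approximation maps a pixel's hue linearly onto the visible range; both the mapping and the endpoint values below are assumptions for the sketch.

```python
import colorsys

def pixel_wavelength(r, g, b):
    """Very rough dominant wavelength (nm) of an RGB pixel.
    Uses the common linear hue-to-wavelength approximation:
    hue 0 deg ~ 650 nm (red) down to hue 270 deg ~ 400 nm (violet)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hue_deg = min(h * 360.0, 270.0)        # clamp hues past violet
    return 650.0 - 250.0 * (hue_deg / 270.0)
```

This is only defined for spectral (rainbow) colors; purples and unsaturated pixels have no single wavelength, which is why a calibrated spectroscopic treatment is needed in practice.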
This paper describes a practical study of the impact that learning partners, a Bluetooth broadcasting system, an interactive board, a real-time response system, notepads, free internet access, computer-based examinations, an interactive classroom, and similar tools have on undergraduate student performance, achievement, and engagement with lectures. The goal of this study is to test the hypothesis that such learning techniques, tools, and strategies improve student learning, especially among the poorest-performing students. It also offers a practical comparison between the traditional and the interactive way of learning in terms of lecture time, number of tests, types of tests, student scores, and student engagement with lectures
Fluidization is widely used by a great assortment of industries worldwide and represents a trillion-dollar industry [6]. Fluidized beds are currently used in the separation, classification, drying, and mixing of particles, in chemical reactions, and in regeneration processes; one such process is mass transfer from an immersed surface to a gas fluidized bed
The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through the development of authentication and authorization techniques. On the biometric side, the best methods and algorithms were studied, and the conclusion is that the fingerprint is the best, although it has some flaws. The fingerprint algorithm was therefore improved so that it enhances the clarity of the ridge and valley structures in fingerprint images, taking into account the estimated orientation and frequency of the local ridges. On the computer side, a computer and its components, like a human, have unique
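As an illustrative sketch of one standard ingredient of such enhancement, the snippet below estimates local ridge orientation with the classic gradient method. This is a generic textbook technique, not necessarily the authors' improved algorithm.

```python
import numpy as np

def ridge_orientation(block):
    """Local ridge orientation (radians) of a fingerprint block.
    Classic gradient method: the dominant gradient direction is
    found from averaged squared gradients, and the ridge runs
    perpendicular to it."""
    gy, gx = np.gradient(block.astype(float))  # row (y) and column (x) gradients
    vx = np.sum(2.0 * gx * gy)
    vy = np.sum(gx ** 2 - gy ** 2)
    return 0.5 * np.arctan2(vx, vy) + np.pi / 2
```

Averaging the squared gradients over a block makes the estimate robust to the sign flips that occur across each individual ridge.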
Today's academics face a major hurdle in solving real-world combinatorial problems. It is nevertheless possible to use optimization techniques to find, design, and solve a genuinely optimal solution to a particular problem, despite the limitations of the applied approach. A surge of interest in population-based optimization methodologies has spawned a plethora of new and improved approaches to a wide range of engineering problems. Optimizing test suites is a combinatorial testing challenge that has been shown to be an extremely difficult combinatorial optimization problem. The authors propose an almost infallible method for selecting combinatorial test cases. It uses a hybrid whale–gray wolf
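As a hedged sketch of the grey-wolf half of such a hybrid (the whale-algorithm component and the test-suite encoding are not reproduced), one Grey Wolf Optimizer iteration moves every candidate solution toward the three current best solutions; the simplified scalar-coefficient form below is illustrative.

```python
import numpy as np

def gwo_step(wolves, fitness, a):
    """One Grey Wolf Optimizer iteration (minimization).
    wolves: (n, dim) array of candidate solutions.
    a: control parameter, decayed from 2 to 0 over the run."""
    order = np.argsort([fitness(w) for w in wolves])
    leaders = wolves[order[:3]]             # alpha, beta, delta
    new = np.empty_like(wolves)
    for i, w in enumerate(wolves):
        pos = np.zeros_like(w)
        for leader in leaders:
            r1, r2 = np.random.rand(2)
            A = 2.0 * a * r1 - a            # exploration/exploitation coefficient
            C = 2.0 * r2                    # random emphasis on the leader
            D = np.abs(C * leader - w)      # distance to the leader
            pos += leader - A * D           # pull toward the leader
        new[i] = pos / 3.0                  # average of the three pulls
    return new
```

A typical driver loop decays `a` linearly and keeps the best wolf seen so far; in the hybrid of the abstract, such a step would be interleaved with whale-algorithm updates.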