The rehabilitation of deteriorated pavements with Asphalt Concrete (AC) overlays is persistently challenged by reflection cracking, in which cracks and joints in the existing pavement layer are mirrored in the new overlay. To address this issue, the current study evaluates the effectiveness of Engineered Cementitious Composite (ECC) and geotextile fabric as mitigation strategies. ECC, characterized by its tensile ductility, fracture resistance, and high deformation capacity, was examined in interlayer thicknesses of 7, 12, and 17 mm. In addition, the effect of positioning the geotextile fabric at the base and at 1/3 depth of the AC specimen was explored. Using the Overlay Testing Machine (OTM) for the evaluations, the research showed that ECC17 significantly mitigated reflection cracking, with a 764% increase in the number of load cycles to failure (Nf) compared with the Geotextile Base (GB) specimen. Against the Reference Specimen (RS), ECC17 exhibited a 1307% enhancement in Nf, underscoring its effectiveness. Geotextile fabric, particularly at 1/3 depth, showed notable resistance but was outperformed by the ECC interlayers. The results indicate that ECC, especially ECC17, is an effective solution for mitigating reflection cracking from cracks and joints in AC overlays.
Correct grading of apple slices can help ensure quality and improve the marketability of the final product, which can affect the overall development of the post-harvest apple slice industry. The study employs the convolutional neural network (CNN) architectures ResNet-18 and DenseNet-201 and classical machine learning (ML) classifiers, namely Wide Neural Networks (WNN), Naïve Bayes (NB), and two kernels of support vector machines (SVM), to classify apple slices into different hardness classes based on their RGB values. Our research data showed that the DenseNet-201 features classified by the SVM-Cubic kernel had the highest accuracy and lowest standard deviation (SD) among all the methods we tested, at 89.51% ± 1.66%.
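As an illustration of the pipeline this abstract describes, the sketch below shows DenseNet-201 features fed to a cubic-kernel SVM. It is not the authors' code: it assumes a recent torchvision and scikit-learn, and the image-loading step, variable names, and cross-validation setup are placeholders.

```python
# Minimal sketch: DenseNet-201 as a fixed feature extractor + "SVM-Cubic"
# (degree-3 polynomial kernel). Data loading and labels are placeholders.
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

backbone = models.densenet201(weights=models.DenseNet201_Weights.IMAGENET1K_V1)
backbone.classifier = torch.nn.Identity()   # keep the 1920-d pooled features
backbone.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def extract_features(pil_images):
    """Return an (N, 1920) feature matrix for a list of RGB PIL images."""
    with torch.no_grad():
        batch = torch.stack([preprocess(img) for img in pil_images])
        return backbone(batch).numpy()

# X = extract_features(apple_slice_images)   # placeholder image list
# y = hardness_labels                        # placeholder class labels
# svm_cubic = SVC(kernel="poly", degree=3)   # the "SVM-Cubic" classifier
# scores = cross_val_score(svm_cubic, X, y, cv=5)
# print(f"accuracy {scores.mean():.4f} ± {scores.std():.4f}")
```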
In regression testing, test case prioritization (TCP) is a technique for ordering the available test cases. TCP techniques can improve fault-detection performance, which is measured by the Average Percentage of Fault Detection (APFD). History-based TCP is a family of TCP techniques that uses the history of past executions to prioritize test cases. Assigning equal priority to several test cases is a common problem for most TCP techniques, but it has not been explored for history-based TCP. To work around this problem in regression testing, most researchers resort to ordering the tied test cases randomly. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
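For reference, APFD has a standard closed form: APFD = 1 − (TF1 + … + TFm)/(n·m) + 1/(2n), where n is the number of test cases, m the number of faults, and TFi the position, in the prioritized order, of the first test case that detects fault i. A small worked example (illustrative only, not from the study):

```python
def apfd(first_detecting_positions, n_tests):
    """Average Percentage of Fault Detection for one prioritized test order.

    first_detecting_positions: for each fault, the 1-based position of the
    first test case (in the prioritized order) that detects it.
    """
    m = len(first_detecting_positions)
    return 1 - sum(first_detecting_positions) / (n_tests * m) + 1 / (2 * n_tests)

# 10 prioritized tests; 4 faults first detected by tests 1, 2, 2 and 5.
print(apfd([1, 2, 2, 5], 10))   # 0.8
```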
Two simple methods for the determination of eugenol were developed. The first depends on the oxidative coupling of eugenol with p-amino-N,N-dimethylaniline (PADA) in the presence of K3[Fe(CN)6]. A linear calibration plot for eugenol was constructed at 600 nm over a concentration range of 0.25-2.50 μg.mL⁻¹, with a correlation coefficient (r) of 0.9988. The limits of detection (LOD) and quantitation (LOQ) were 0.086 and 0.284 μg.mL⁻¹, respectively. The second method is based on dispersive liquid-liquid microextraction of the derivatized oxidative coupling product of eugenol with PADA. Under the optimized extraction procedure, the extracted colored product was determined spectrophotometrically at 618 nm.
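The abstract does not state how the LOD and LOQ were derived, but figures like these are commonly obtained from the calibration line using the ICH-style conventions LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope. The sketch below is only illustrative, with placeholder absorbance data rather than the paper's measurements.

```python
# Illustrative calibration-curve fit with ICH-style LOD/LOQ estimates.
# Concentrations and absorbances are placeholders, not the paper's raw data.
import numpy as np
from scipy import stats

conc = np.array([0.25, 0.50, 1.00, 1.50, 2.00, 2.50])             # µg/mL
absorbance = np.array([0.031, 0.060, 0.118, 0.176, 0.235, 0.292])  # placeholder

fit = stats.linregress(conc, absorbance)
residual_sd = np.std(absorbance - (fit.intercept + fit.slope * conc), ddof=2)

lod = 3.3 * residual_sd / fit.slope
loq = 10.0 * residual_sd / fit.slope
print(f"r = {fit.rvalue:.4f}, LOD = {lod:.3f} µg/mL, LOQ = {loq:.3f} µg/mL")
```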
Some people believe that the concept of science means scientific instruments and devices (educational technology), and that it does not differ from the concept of information technology. This belief is mistaken, because science is the building of organized scientific knowledge arrived at through scientific research, whereas information technology is "the practical applications of scientific knowledge in the various fields of direct benefit to human life, or the applied aspects of science and the machines and devices associated with them".
The research aims to achieve a set of objectives, the most important of which is determining the extent to which the auditors of the research sample in the Federal Bureau of Financial Supervision adhere to the requirements of the quality control system according to Iraqi Audit Manual No. (7), and the extent of the Bureau's (the research sample's) commitment to that quality control system. The researcher also seeks to test the main research hypothesis and its sub-hypotheses. To achieve this, a questionnaire was designed in Google Forms and distributed electronically to the members of the research sample, and the responses were analysed with the statistical package SPSS. In light of the applied
The present paper discusses one of the most important Russian linguistic phenomena of Arabic origin: Russian lexes denoting religious offices or political and social positions, such as Qadi, Wally, Sultan, Alam, Ruler, Caliph, Amir, Fakih, Mufti, Sharif, Ayatollah, Sheikh, etc. A lexical analysis of two of the most productive and most frequently used words of Arabic origin in Russian, "Caliph" and "Sheikh", is presented. The lexicographic analysis of these words makes it possible to identify controversial issues related to their etymology and semantic development.
The study is conducted using the modern Russian and Arabic dictionary, specifically the (Intermediate lexicon Dictionary
Currently, one of the topical areas of application of machine learning methods is the prediction of material characteristics. The aim of this work is to develop machine learning models for determining the rheological properties of polymers from experimental stress-relaxation curves. The paper presents an overview of the main families of metaheuristic approaches (local search, evolutionary algorithms) for solving combinatorial optimization problems. Metaheuristic algorithms for several important combinatorial optimization problems are described, with special emphasis on the construction of decision trees. A comparative analysis of algorithms for solving the regression problem with CatBoost Regressor has been carried out. The object of
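As a rough illustration of the regression setup the abstract refers to (not the authors' code), the sketch below fits a CatBoostRegressor to placeholder features that stand in for values sampled from stress-relaxation curves, with the usual train/test split and error metrics.

```python
# Sketch: predicting a rheological parameter from relaxation-curve features
# with CatBoost. The data here is synthetic; only the workflow is illustrative.
import numpy as np
from catboost import CatBoostRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                      # e.g. stress at fixed times
y = X @ rng.normal(size=10) + rng.normal(scale=0.1, size=200)  # target parameter

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = CatBoostRegressor(iterations=500, learning_rate=0.05, depth=6, verbose=0)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
print("R2:", r2_score(y_te, pred), "MSE:", mean_squared_error(y_te, pred))
```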
There are a number of automatic translation services that internet users can choose from to translate a given text automatically, and Google Translate is one of these services, offering over 51 languages. The present paper sheds light on the nature of the translation process offered by Google and analyzes the most prominent problems faced when Google Translate is used. Direct translation is common with Google Translate and often results in nonsensical literal renderings, particularly with long compound sentences. This is because Google's translation system uses a method based on language-pair frequency that does not take grammatical rules into account, which in turn affects the quality of the translation.
Dust is a common cause of health risks and also a driver of climate change, one of the most threatening problems for humans. In the recent decade, climate change in Iraq, typified by increased droughts and desertification, has generated numerous environmental issues. This study forecasts dust in five central Iraqi districts using a supervised machine learning framework with five regression algorithms. It was assessed using a dataset from the Iraqi Meteorological Organization and Seismology (IMOS). Simulation results show that the gradient boosting regressor (GBR) has a mean square error of 8.345 and a total accuracy ratio of 91.65%. Moreover, the results show that the decision tree (DT) yields a mean square error of 8.965.
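The evaluation pattern described above (several regressors compared by mean square error) can be sketched as follows; the data is synthetic and the feature names are placeholders, since the IMOS dataset itself is not reproduced here.

```python
# Sketch: comparing a gradient boosting regressor and a decision tree by MSE,
# mirroring the study's evaluation pattern on synthetic placeholder data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 6))        # stand-ins for meteorological features
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=500)  # dust proxy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
for name, model in [("GBR", GradientBoostingRegressor(random_state=1)),
                    ("DT", DecisionTreeRegressor(random_state=1))]:
    model.fit(X_tr, y_tr)
    print(name, "MSE:", round(mean_squared_error(y_te, model.predict(X_te)), 3))
```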
Significant advances in automated glaucoma detection have been made through the employment of Machine Learning (ML) and Deep Learning (DL) methods, an overview of which is provided in this paper. What sets the current literature review apart is its exclusive focus on these techniques for glaucoma detection, using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines to filter the selected papers. To achieve this, an advanced search was conducted in the Scopus database, specifically looking for research papers published in 2023 with the keywords "glaucoma detection", "machine learning", and "deep learning". Among the many papers found, the ones focusing