There continues to be a need for an in-situ sensor system to monitor the engine oil of internal combustion engines. Engine oil needs to be monitored for contaminants and for depletion of additives. While various sensor systems have been designed and evaluated, there is still a need to develop and evaluate new sensing technologies. This study evaluated terahertz time-domain spectroscopy (THz-TDS) for the identification and estimation of glycol contamination in automotive engine oil. Glycol contamination results from a gasket or seal leak that allows coolant to enter the engine and mix with the engine oil. An engine oil intended for use in both diesel and gasoline engines was obtained, and fresh oil samples were contaminated with four levels of glycol (0 ppm, 150 ppm, 300 ppm, and 500 ppm). The samples were analyzed with THz-TDS, and the time-domain traces were converted to the frequency-domain parameters of refractive index and absorption coefficient. While both parameters showed potential, the absorption coefficient performed better and was able to statistically discriminate among the four contamination levels.
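As an illustration of how time-domain THz traces are typically converted into the two frequency-domain parameters named above, the following sketch assumes a reference pulse through an empty cell and a sample pulse through an oil layer of known thickness; the array names, the 1 mm thickness, and the thick-sample transmission approximation are assumptions, not details taken from the study.

```python
import numpy as np

# Minimal sketch (assumed names and thickness): derive the refractive index n(f)
# and absorption coefficient alpha(f) of an oil layer from THz-TDS reference and
# sample pulses, using the common thick-sample transmission approximation that
# ignores multiple (Fabry-Perot) reflections inside the layer.
C = 2.998e8            # speed of light, m/s
D = 1.0e-3             # assumed sample thickness, m (hypothetical)

def thz_parameters(t, e_ref, e_sam, d=D):
    dt = t[1] - t[0]
    freq = np.fft.rfftfreq(len(t), dt)            # frequency axis, Hz
    T = np.fft.rfft(e_sam) / np.fft.rfft(e_ref)   # complex transmission
    phase = np.unwrap(np.angle(T))                # sign depends on FFT convention
    omega = 2.0 * np.pi * freq
    with np.errstate(divide="ignore", invalid="ignore"):
        n = 1.0 + np.abs(phase) * C / (omega * d)                 # refractive index
        alpha = -(2.0 / d) * np.log(np.abs(T) * (n + 1.0) ** 2 / (4.0 * n))
    return freq, n, alpha / 100.0                 # alpha reported in cm^-1
```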
A total of 243 serum samples were tested for the presence of Chlamydia antibodies by the indirect immunofluorescent antibody test. Ninety-nine of the women had suffered abortions, 64 were infertile, and the other 80 were non-aborted women. The incidence of Chlamydia was 15%, 9.4%, and 3.8% in the abortion, infertile, and non-aborted groups, respectively. The results also showed a difference in prevalence rate between the age groups, with the highest incidence found in the age group 20-39.
Software-defined networking (SDN) presents novel security and privacy risks, including distributed denial-of-service (DDoS) attacks. In response to these threats, machine learning (ML) and deep learning (DL) have emerged as effective approaches for quickly identifying and mitigating anomalies. To this end, this research employs various classification methods, including support vector machines (SVMs), K-nearest neighbors (KNNs), decision trees (DTs), the multilayer perceptron (MLP), and convolutional neural networks (CNNs), and compares their performance. The CNN exhibits the highest training accuracy at 97.808% yet the lowest prediction accuracy at 90.08%. In contrast, the SVM demonstrates the highest prediction accuracy of 95.5%. As such, an
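A minimal sketch of the kind of comparison described here, using scikit-learn for the classical classifiers only; the feature matrix and labels are random placeholders standing in for SDN flow features, and the CNN branch (which needs a deep-learning framework) is omitted.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 20))        # placeholder flow features
y = rng.integers(0, 2, size=2000)      # placeholder benign/DDoS labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "SVM": SVC(kernel="rbf"),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "DT": DecisionTreeClassifier(max_depth=10),
    "MLP": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500),
}
for name, model in models.items():
    clf = make_pipeline(StandardScaler(), model)   # scale, then classify
    clf.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, clf.predict(X_te)))
```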
We propose a new method for detecting abnormality in cerebral tissues present within magnetic resonance images (MRI). The proposed classifier comprises cerebral tissue extraction, division of the image into angular and distance span vectors, extraction of four features for each portion, and classification to ascertain the location of the abnormality. The threshold value and region of interest are determined using operator input and the Otsu algorithm. A novel division of brain slice images is introduced via angular and distance span vectors of 24° and 15 pixels. Rotation invariance of the angular span vector is determined. Automatic categorization of images into normal and abnormal brain tissues is performed using a support vector machine (SVM). St
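A minimal sketch (the image array `img` is a hypothetical grayscale MRI slice) of the two steps the abstract makes explicit: Otsu thresholding of the tissue region and partitioning it into 24° angular sectors; the distance-span split, the four features, and the SVM stage are not reproduced here.

```python
import numpy as np
from skimage.filters import threshold_otsu

def angular_sectors(img, step_deg=24):
    """Split the Otsu-thresholded tissue region into angular wedges."""
    t = threshold_otsu(img)                     # global Otsu threshold
    mask = img > t                              # foreground (brain tissue)
    cy, cx = np.argwhere(mask).mean(axis=0)     # centroid of the tissue mask
    yy, xx = np.indices(img.shape)
    theta = np.degrees(np.arctan2(yy - cy, xx - cx)) % 360
    sectors = []
    for start in range(0, 360, step_deg):
        sel = mask & (theta >= start) & (theta < start + step_deg)
        sectors.append(img[sel])                # pixel values in this wedge
    return sectors
```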
France attaches great importance to public funds. Article 15 of the Declaration of the Rights of Man and of the Citizen (DDHC) stipulates that society has the right to hold any public official accountable for his or her administration. For that reason, the French legislature has established a body specialized in the control of public funds, the Court of Accounts, and has created the Court of Budget and Financial Discipline to assist it. These courts are run by judges who cannot be dismissed, and they are vested with administrative jurisdiction consistent with the role and purpose for which they were established.
In the task of detecting intrinsic plagiarism, the cases where a reference corpus is absent must be dealt with. The task relies entirely on inconsistencies within a given document. Detection of internal plagiarism is treated as a classification problem and is estimated by taking into consideration self-based information from the given document.
The core contribution of the work proposed in this paper lies in the document representation: the document, as well as the disjoint segments generated from it, is represented as a weight vector describing its main content, where each element carries its average weight rather than its raw frequency.
Th
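A minimal sketch of the representation idea described above: a document and its disjoint segments are turned into weight vectors whose elements are averaged TF-IDF weights rather than raw term frequencies. The segment size, the use of TF-IDF, and the helper names are assumptions for illustration; the paper's actual weighting scheme and classification step are not reproduced.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

def segment_weight_vectors(document, segment_size=200):
    """Return a document-level vector of average weights plus per-segment vectors."""
    words = document.split()
    segments = [" ".join(words[i:i + segment_size])
                for i in range(0, len(words), segment_size)]
    vec = TfidfVectorizer()
    seg_matrix = vec.fit_transform(segments).toarray()   # one weight vector per segment
    doc_vector = seg_matrix.mean(axis=0)                 # average weight per term
    return doc_vector, seg_matrix, vec.get_feature_names_out()
```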
It is well known that the existence of outliers in the data will adversely affect the efficiency of estimation and the results of the analysis. In this paper, four methods for detecting outliers in the multiple linear regression model are studied in two cases: first, in real data; and second, after adding outliers to the data and attempting to detect them. The study is conducted for samples of different sizes and uses three measures for comparing these methods: masking, swamping, and the standard error of the estimate.
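A minimal sketch (synthetic data) of flagging potential outliers in a fitted multiple linear regression using studentized residuals and Cook's distance, two standard diagnostics of this kind; the four specific methods the paper compares are not identified in the excerpt, so this is a generic illustration only.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.7]) + rng.normal(scale=0.5, size=100)
y[:3] += 8.0                                    # inject three gross outliers

model = sm.OLS(y, sm.add_constant(X)).fit()
influence = model.get_influence()
student_resid = influence.resid_studentized_external
cooks_d = influence.cooks_distance[0]

# Flag observations with large studentized residuals or influential Cook's distance.
flagged = np.where((np.abs(student_resid) > 3) | (cooks_d > 4 / len(y)))[0]
print("suspected outliers:", flagged)
```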
In this research, some thermophysical properties of ethylene glycol with water (H2O) and with the mixed solvent dimethylformamide/water (DMF + H2O) were studied. The densities (ρ) and viscosities (η) of ethylene glycol in water and in the mixed solvent (DMF + H2O) were determined at 298.15 K over a range of concentrations from 0.1 to 1.0 molar. The ρ and η values were subsequently used to calculate the thermodynamics of mixing, including the apparent molar volume (ϕv) and the partial molar volume at infinite dilution (ϕv°). The solute-solute interaction is represented by Sv, obtained from the equation ϕv = ϕv° + Sv√m. The values of the viscosity coefficient (B) and the Falkenhagen coefficient (A) of the Jones-Dole equation and the Gibbs free
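A minimal sketch of how the two fitted quantities mentioned above are usually extracted: ϕv° and Sv from a linear fit of ϕv against √m, and the A and B coefficients from the linearized Jones-Dole relation (η/η0 − 1)/√c = A + B√c. The numeric arrays below are placeholders, not values from the paper.

```python
import numpy as np

# Masson-type fit: phi_v = phi_v0 + S_v * sqrt(m)
m = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])             # molality (placeholder)
phi_v = np.array([55.2, 55.6, 56.1, 56.5, 56.8, 57.1])   # cm^3/mol (placeholder)
S_v, phi_v0 = np.polyfit(np.sqrt(m), phi_v, 1)            # slope, intercept
print(f"phi_v0 = {phi_v0:.2f} cm^3/mol, S_v = {S_v:.2f}")

# Jones-Dole fit: (eta/eta0 - 1)/sqrt(c) = A + B*sqrt(c)
c = m                                                     # treat molarity ~ molality here
eta_rel = np.array([1.010, 1.019, 1.037, 1.054, 1.071, 1.088])  # eta/eta0 (placeholder)
B, A = np.polyfit(np.sqrt(c), (eta_rel - 1.0) / np.sqrt(c), 1)
print(f"A = {A:.4f}, B = {B:.4f}")
```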
This study aims to develop a recommendation engine methodology to enhance the model's effectiveness and efficiency. The proposed model is used to assign or propose a limited number of developers with the required skills and expertise to address and resolve a bug report. Managing collections within bug repositories and addressing specific defects is the responsibility of software engineers. Identifying the optimal allocation of personnel to activities is challenging when dealing with software defects, which necessitates a substantial workforce of developers. The purpose of this analysis is to examine new scientific methodologies that enhance comprehension of the results. Additionally, developer priorities were discussed, especially th
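A minimal sketch of a generic developer-recommendation step of the kind described above: a new bug report is matched against each developer's previously resolved reports with TF-IDF cosine similarity. The developer names and report texts are placeholders, and this is not the paper's exact model.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

history = {
    "alice": "null pointer crash in network driver on resume",
    "bob": "ui layout broken on high dpi displays",
    "carol": "memory leak in image decoder thread pool",
}
new_report = "application crashes with null pointer when network reconnects"

vec = TfidfVectorizer()
matrix = vec.fit_transform(list(history.values()) + [new_report])
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()   # report vs. each developer
ranked = sorted(zip(history.keys(), scores), key=lambda x: -x[1])
print(ranked[:2])   # top-2 recommended developers
```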
In data transmission, a change in a single bit of the received data may lead to misunderstanding or to a disaster. Each bit in the transmitted information has high priority, especially information such as the address of the receiver. The ability to detect an error caused by every single bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, show better results but still fail to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods were: 2D-Checksum me
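A minimal sketch contrasting ordinary single parity with a simple two-dimensional parity check, illustrating why single parity misses an even number of flipped bits while the 2-D scheme catches many such cases. This is a generic illustration, not the 2D-Checksum variant the paper proposes.

```python
from typing import List

def single_parity(bits: List[int]) -> int:
    """Even-parity bit over a bit list (XOR of all bits)."""
    p = 0
    for b in bits:
        p ^= b
    return p

def two_d_parity(block: List[List[int]]) -> tuple:
    """Row and column parity bits for a rectangular bit block."""
    row_par = [single_parity(row) for row in block]
    col_par = [single_parity([row[i] for row in block]) for i in range(len(block[0]))]
    return row_par, col_par

# Two bit flips in the same row: the single parity over the whole block is
# unchanged, but two column parities change, so the 2-D check detects them.
block = [[1, 0, 1, 1], [0, 1, 0, 0], [1, 1, 1, 0]]
corrupted = [row[:] for row in block]
corrupted[0][0] ^= 1
corrupted[0][2] ^= 1
flat = [b for row in block for b in row]
flat_bad = [b for row in corrupted for b in row]
print("single parity detects:", single_parity(flat) != single_parity(flat_bad))
print("2-D parity detects:   ", two_d_parity(block) != two_d_parity(corrupted))
```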