Estimating the semantic similarity between short texts plays an increasingly prominent role in text mining and natural language processing, especially given the large volume of textual data produced daily. Traditional approaches that calculate the degree of similarity between two texts from the words they share do not perform well on short texts, because two similar texts may be written with different terms, for example synonyms. Short texts therefore need to be compared semantically. This paper presents a semantic similarity measurement method that combines knowledge-based and corpus-based semantic information to build a semantic network representing the relationship between the compared texts, from which the degree of similarity is extracted. Representing a text as a semantic network is a knowledge representation close to how the human mind understands text: the network reflects the sentence's semantic, syntactic, and structural knowledge, and it provides a visual representation of knowledge objects, their qualities, and their relationships. The WordNet lexical database is used as the knowledge-based source, while pre-trained GloVe word-embedding vectors are used as the corpus-based source. The proposed method was tested on three datasets, DSCS, SICK, and MOHLER, and achieved good results in terms of RMSE and MAE.
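A minimal sketch of the general idea described above, combining a WordNet (knowledge-based) word similarity with a GloVe (corpus-based) cosine similarity and a greedy word alignment; this is not the paper's exact semantic-network construction, and the GloVe file path and the mixing weight `alpha` are assumptions.

```python
# Sketch: hybrid knowledge-based + corpus-based short-text similarity.
# Requires nltk with the 'wordnet' corpus downloaded and a local GloVe file.
import numpy as np
from nltk.corpus import wordnet as wn

def load_glove(path="glove.6B.100d.txt"):            # hypothetical local file
    vectors = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.array(parts[1:], dtype=float)
    return vectors

def wordnet_sim(w1, w2):
    """Best path similarity over all synset pairs (knowledge-based)."""
    scores = [s1.path_similarity(s2) or 0.0
              for s1 in wn.synsets(w1) for s2 in wn.synsets(w2)]
    return max(scores, default=0.0)

def glove_sim(w1, w2, vectors):
    """Cosine similarity of word embeddings (corpus-based)."""
    if w1 not in vectors or w2 not in vectors:
        return 0.0
    a, b = vectors[w1], vectors[w2]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def text_similarity(t1, t2, vectors, alpha=0.5):      # alpha: assumed mixing weight
    words1, words2 = t1.lower().split(), t2.lower().split()
    def best_match(w, others):
        return max(alpha * wordnet_sim(w, o) + (1 - alpha) * glove_sim(w, o, vectors)
                   for o in others)
    s12 = np.mean([best_match(w, words2) for w in words1])
    s21 = np.mean([best_match(w, words1) for w in words2])
    return (s12 + s21) / 2                            # symmetric sentence-level score
```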
Treated effluent wastewater is considered an alternative water resource that can make an important contribution for various purposes, so assessing wastewater quality is essential for determining its suitability for different uses before it is discharged into freshwater ecosystems. The wastewater quality index (WWQI) is a useful and effective tool for assessing wastewater quality because it condenses the overall characteristics of the wastewater into a single value. It can be used to indicate the suitability of wastewater for different uses in water-quality management and decision making. The present study evaluated the effluent quality of the Al-Diwaniyah sewage treatment plant (STP) based on wastewater …
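A brief sketch of how a single-value index of the kind described above can be computed, using a common weighted-arithmetic formulation; the parameters, permissible limits, and measured values below are illustrative placeholders, not the aggregation or data used in the Al-Diwaniyah study.

```python
# Sketch of a weighted-arithmetic wastewater quality index (WWQI).
def wwqi(measurements, standards):
    """measurements/standards: dict of parameter -> value (same units)."""
    # Unit weight inversely proportional to the permissible limit.
    k = 1.0 / sum(1.0 / s for s in standards.values())
    weights = {p: k / s for p, s in standards.items()}
    # Sub-index: measured concentration relative to its standard, scaled to 100.
    q = {p: 100.0 * measurements[p] / standards[p] for p in standards}
    return sum(weights[p] * q[p] for p in standards) / sum(weights.values())

sample = {"BOD": 28.0, "COD": 75.0, "TSS": 40.0}     # mg/L, illustrative effluent values
limits = {"BOD": 30.0, "COD": 100.0, "TSS": 50.0}    # mg/L, illustrative discharge limits
print(f"WWQI = {wwqi(sample, limits):.1f}")          # values below 100 indicate compliance in this scheme
```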
This paper presents a hybrid energy resources (HER) system consisting of solar PV, storage, and the utility grid. Extracting the maximum power point (MPP) from the PV array in real time under varying irradiance is a challenge. This work addresses challenges in identifying the global MPP, dynamic algorithm behavior, tracking speed, adaptability to changing conditions, and accuracy. A shallow neural network based on the deep-learning NARMA-L2 controller is proposed and modeled to predict the reference voltage under different irradiance levels. The dynamic, nonlinear behavior of the PV array has been trained so that the controller tracks the maximum power drawn from the PV system in real time.
Moreover, the proposed controller is …
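A minimal sketch of the reference-voltage prediction step described above, using a small feedforward network rather than the paper's NARMA-L2 controller; the synthetic training rule, network size, and operating points are assumptions for illustration only.

```python
# Sketch: shallow NN mapping (irradiance, cell temperature) -> MPP reference voltage.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
G = rng.uniform(100, 1000, 2000)          # irradiance, W/m^2
T = rng.uniform(15, 65, 2000)             # cell temperature, deg C
# Assumed MPP-voltage behaviour: weak log rise with G, linear drop with T.
V_ref = 30.0 + 1.2 * np.log(G / 1000.0) - 0.08 * (T - 25.0)

X = np.column_stack([G, T])
model = MLPRegressor(hidden_layer_sizes=(12,), activation="tanh",
                     max_iter=3000, random_state=0).fit(X, V_ref)

# The controller would pass this prediction to the DC-DC converter voltage loop.
print(model.predict([[800.0, 40.0]]))     # predicted reference voltage, volts
```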
The concept of the active contour model has been extensively utilized in image segmentation and analysis. The technique has been effectively employed to identify contours in object recognition, computer graphics and vision, and biomedical image processing of both ordinary images and medical images such as magnetic resonance imaging (MRI), X-ray, and ultrasound images. Kass, Witkin, and Terzopoulos developed this energy-minimizing "Active Contour Model" (also known as the snake) in 1987. Being curves in nature, snakes are defined within an image field and can be set in motion by external forces derived from the image data and by internal forces from the curve itself. The present study …
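A brief sketch of the snake model described above using scikit-image's `active_contour`; the sample image, initial circle, and energy weights are illustrative choices, not the settings used in the study.

```python
# Sketch: energy-minimising snake on a smoothed grayscale image.
import numpy as np
from skimage import data, filters
from skimage.segmentation import active_contour

img = filters.gaussian(data.coins(), sigma=3)       # smoothing stabilises the external forces

# Initial contour: a circle (row, col coordinates) placed over one coin.
s = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([60 + 30 * np.sin(s), 80 + 30 * np.cos(s)])

# alpha: elasticity (contour length), beta: rigidity (smoothness),
# gamma: explicit time step; the edge term pulls the curve toward image gradients.
snake = active_contour(img, init, alpha=0.015, beta=10.0, gamma=0.001)
print(snake.shape)                                   # (200, 2) fitted contour points
```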
The demand for single-photon sources in quantum key distribution (QKD) systems has necessitated the use of weak coherent pulses (WCPs), which follow a Poissonian photon-number distribution. Ensuring security against eavesdropping attacks requires keeping the mean photon number (µ) small and known to the legitimate partners. However, accurately determining µ is challenging because of discrepancies between theoretical calculations and practical implementation. This paper presents two experiments. The first involves theoretical calculation of µ using several filters to generate the WCPs. The second uses a variable attenuator to generate the WCPs, and the value of µ is estimated from the photons detected by the BB…
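A short sketch of the Poissonian photon statistics behind WCP sources and of a rough way µ can be inferred from detector clicks; the detector efficiency, pulse count, and click count below are illustrative assumptions, not measurements from the paper.

```python
# Sketch: Poissonian photon-number statistics of weak coherent pulses.
from math import exp, factorial, log

def p_n(mu, n):
    """Probability of finding n photons in a pulse with mean photon number mu."""
    return exp(-mu) * mu**n / factorial(n)

mu = 0.1
print("P(0), P(1), P(n>=2):",
      p_n(mu, 0), p_n(mu, 1), 1 - p_n(mu, 0) - p_n(mu, 1))

# Rough estimate of mu from detection statistics (dark counts ignored):
# click probability per pulse ~ 1 - exp(-eta * mu)  =>  mu ~ -ln(1 - p_click) / eta
eta = 0.10                      # assumed detector efficiency
clicks, pulses = 9_950, 1_000_000
p_click = clicks / pulses
print("estimated mu:", round(-log(1 - p_click) / eta, 3))
```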
Background and Aim: Due to the rapid growth of data communication and multimedia applications, security has become a critical issue in the communication and storage of images. This study aims to improve encryption and decryption for various types of images by decreasing time consumption and strengthening security. Methodology: An algorithm is proposed for encrypting images based on the Carlisle Adams and Stafford Tavares (CAST) block cipher combined with 3D and 2D logistic maps. A chaotic function that increases the randomness in the encrypted data and images, thereby breaking sequential relations through the encryption procedure, is introduced. Encryption time is decreased by using three secure, private S-boxes rather than using si…
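A brief sketch of how a logistic map can supply a chaotic keystream to combine with a block cipher such as CAST; the abstract refers to 2D and 3D logistic maps, while the classic 1D map below only illustrates the principle, and the seed and control parameter are placeholders.

```python
# Sketch: chaotic keystream from the logistic map x_{k+1} = r * x_k * (1 - x_k).
def logistic_keystream(x0, r, n, burn_in=1000):
    """Yield n pseudo-random bytes from the logistic map orbit."""
    x = x0
    for _ in range(burn_in):                 # discard the transient part of the orbit
        x = r * x * (1 - x)
    out = bytearray()
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)       # quantise the chaotic state to a byte
    return bytes(out)

plain = b"example image block"
stream = logistic_keystream(x0=0.543217, r=3.99, n=len(plain))
cipher_pre = bytes(p ^ s for p, s in zip(plain, stream))   # pre-whitening before the block cipher
print(cipher_pre.hex())
```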
In this work, an optical fiber biomedical sensor for detecting the hemoglobin concentration in blood is presented. A surface plasmon resonance (SPR)-based coreless optical fiber sensor was developed and implemented using single-mode and multi-mode optical fibers. The sensor is used to evaluate the refractive indices and hemoglobin concentrations of blood samples, with a 40 nm metal coating (20 nm Au and 20 nm Ag) applied to increase the sensitivity. In practice, it is found that as the refractive index of the sensed medium increases, the resonance wavelength increases due to the decrease in energy.
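A small sketch of how SPR sensor sensitivity (resonance-wavelength shift per refractive-index unit) can be extracted from a calibration series; the refractive indices and resonance wavelengths below are illustrative values, not measurements from this work.

```python
# Sketch: linear calibration of an SPR sensor, sensitivity in nm per RIU.
import numpy as np

n_samples  = np.array([1.335, 1.340, 1.345, 1.350, 1.355])   # sample refractive index
lambda_res = np.array([612.0, 623.5, 635.2, 646.8, 658.1])    # resonance wavelength, nm

slope, intercept = np.polyfit(n_samples, lambda_res, 1)
print(f"sensitivity ~ {slope:.0f} nm/RIU")

# Inverting the fit estimates an unknown sample's refractive index from its dip.
ri_unknown = (640.0 - intercept) / slope
print(f"estimated RI for a 640 nm dip: {ri_unknown:.4f}")
```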
The emphasis of Master Production Scheduling (MPS), or tactical planning, is on the temporal and spatial disaggregation of aggregate planning targets and forecasts, along with the provision and forecasting of the required resources. This procedure becomes considerably more difficult and slower as the number of resources, products, and periods considered increases. A number of studies have been carried out to understand these impediments and formulate algorithms to optimize the production-planning problem, or more specifically the master production scheduling (MPS) problem. These algorithms include an evolutionary algorithm, the Genetic Algorithm; swarm-intelligence methods such as the Gravitational Search Algorithm (GSA) and the Bat Algorithm (BAT); and T…
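A minimal genetic-algorithm sketch for an MPS-style problem of the kind surveyed above: choosing per-period production quantities that track demand without exceeding capacity. The demand, capacity, penalty weights, and GA settings are illustrative assumptions, not taken from the cited studies.

```python
# Sketch: tiny GA searching for a production plan that balances backlog and holding cost.
import random

DEMAND   = [120, 90, 150, 110, 80, 130]      # units per period
CAPACITY = 140                               # max production per period
POP, GENS, MUT = 60, 200, 0.1

def fitness(plan):
    inv, cost = 0, 0
    for produced, demand in zip(plan, DEMAND):
        inv += produced - demand
        cost += 2 * max(-inv, 0) + 1 * max(inv, 0)   # backlog penalised more than holding
    return -cost                                      # higher is better

def mutate(plan):
    return [min(CAPACITY, max(0, q + random.randint(-20, 20))) if random.random() < MUT else q
            for q in plan]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

pop = [[random.randint(0, CAPACITY) for _ in DEMAND] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]                          # elitist selection of the top half
    pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                     for _ in range(POP - len(parents))]

best = max(pop, key=fitness)
print("best plan:", best, "cost:", -fitness(best))
```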
Because of the rapid growth of electronic instruments used for detecting noxious gases, the importance of gas sensors has increased. X-ray diffraction (XRD) can be used to examine the crystal-phase structure of sensing materials, which affects their gas-sensing properties. This contributes to the study of the effect of electrochemical synthesis of titanium dioxide (TiO2) materials with various crystal-phase shapes, such as rutile TiO2 (R-TiO2 NTs) and anatase TiO2 (A-TiO2 NTs). In this work, the effect of anodization voltage on the preparation of TiO2 nanotube arrays for gas-sensor applications is studied. The results acquired from XRD and energy-dispersive spectroscopy …
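A small sketch of a standard XRD post-processing step, the Scherrer estimate of crystallite size from peak broadening; the peak position and width below are illustrative values, not readings from the paper's diffractograms.

```python
# Sketch: Scherrer equation D = K * lambda / (beta * cos(theta)).
from math import radians, cos

def scherrer(two_theta_deg, fwhm_deg, wavelength_nm=0.15406, k=0.9):
    """Crystallite size in nm from peak FWHM (beta) in degrees, Cu K-alpha by default."""
    theta = radians(two_theta_deg / 2)
    beta = radians(fwhm_deg)
    return k * wavelength_nm / (beta * cos(theta))

# Example: anatase TiO2 (101) reflection near 2-theta = 25.3 deg.
print(f"crystallite size ~ {scherrer(25.3, 0.45):.1f} nm")
```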
The implementation of technology in the provision of public services and communication with citizens, commonly referred to as e-government, has brought a multitude of benefits, including enhanced efficiency, accessibility, and transparency. Nevertheless, this approach also presents particular security concerns, such as cyber threats, data breaches, and access control. One technology that can help mitigate security vulnerabilities in e-government is the permissioned blockchain. This work examines the performance of the Hyperledger Fabric private blockchain under high transaction loads by analyzing two scenarios involving six organizations as case studies. Several parameters, such as the transaction send ra…
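A brief sketch of the throughput and latency arithmetic behind a blockchain load test of this kind (the quantities a benchmarking tool such as Hyperledger Caliper reports); the send rate and transaction timestamps below are synthetic, not results from the two scenarios studied.

```python
# Sketch: computing throughput and latency from submission/commit timestamps.
import random

random.seed(1)
SEND_RATE = 200                                   # assumed offered load, tx/s
submitted = [i / SEND_RATE for i in range(1000)]  # submission times, seconds
committed = [t + random.uniform(0.3, 2.0) for t in submitted]  # synthetic commit times

latencies = [c - s for s, c in zip(submitted, committed)]
duration = max(committed) - min(submitted)
print(f"throughput : {len(committed) / duration:.1f} tx/s")
print(f"avg latency: {sum(latencies) / len(latencies):.2f} s")
print(f"max latency: {max(latencies):.2f} s")
```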
In this paper, reliable computational methods (RCMs) based on the monomial standard polynomials are applied to solve the Jeffery-Hamel flow (JHF) problem. In addition, convenient basis functions, namely the Bernoulli, Euler, and Laguerre polynomials, are used to enhance the reliability of the computational methods. Using such functions turns the problem into a set of solvable nonlinear algebraic equations that Mathematica® 12 can solve. The JHF problem is solved with the help of improved reliable computational methods (I-RCMs), and a review of the methods is given. Published results are used for comparison. As further evidence of the accuracy and dependability of the proposed methods, the maximum error remainder …
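A sketch of the underlying idea of a monomial-polynomial method for this problem: expand the solution in powers of the similarity variable, substitute into the governing equation, and solve the resulting nonlinear algebraic system. One common dimensionless form of the Jeffery-Hamel equation, F''' + 2*alpha*Re*F*F' + 4*alpha^2*F' = 0 with F(0)=1, F'(0)=0, F(1)=0, is assumed here; the degree, collocation points, parameter values, and the use of SciPy instead of Mathematica are illustrative choices, not the paper's I-RCM procedure.

```python
# Sketch: monomial (power-series) collocation for the Jeffery-Hamel flow problem.
import numpy as np
from numpy.polynomial import polynomial as P
from scipy.optimize import fsolve

alpha, Re, N = np.deg2rad(5.0), 50.0, 8          # channel half-angle, Reynolds number, degree

def residuals(coeffs):
    c = np.asarray(coeffs)
    d1, d2, d3 = P.polyder(c, 1), P.polyder(c, 2), P.polyder(c, 3)
    eta = np.linspace(0.05, 0.95, N - 2)         # interior collocation points
    ode = (P.polyval(eta, d3)
           + 2 * alpha * Re * P.polyval(eta, c) * P.polyval(eta, d1)
           + 4 * alpha**2 * P.polyval(eta, d1))
    bcs = [P.polyval(0.0, c) - 1.0,              # F(0)  = 1
           P.polyval(0.0, d1),                   # F'(0) = 0
           P.polyval(1.0, c)]                    # F(1)  = 0
    return np.concatenate([ode, bcs])

guess = np.zeros(N + 1); guess[0] = 1.0; guess[2] = -1.0   # rough start near 1 - eta^2
coeffs = fsolve(residuals, guess)
print("max |residual|:", np.abs(residuals(coeffs)).max())  # an 'error remainder'-style check
```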