Advances in digital technology and the World Wide Web have led to an increase in digital documents used for various purposes such as publishing and digital libraries. This phenomenon raises awareness of the need for effective techniques to support the search and retrieval of text. One of the most needed tasks is clustering, which automatically categorizes documents into meaningful groups. Clustering is an important task in data mining and machine learning, and its accuracy depends tightly on the choice of text representation method. Traditional methods model documents as bags of words using term frequency-inverse document frequency (TF-IDF). This approach ignores the relationships and meanings of words in a document; as a result, the sparsity and semantic problems prevalent in textual documents remain unresolved. In this study, the sparsity and semantic problems are reduced by proposing a graph-based text representation method, namely the dependency graph, with the aim of improving the accuracy of document clustering. The dependency graph representation is created through a combination of syntactic and semantic analysis. A sample of the 20 Newsgroups dataset was used in this study. The text documents undergo pre-processing and syntactic parsing in order to identify sentence structure; the semantics of words are then modeled using a dependency graph, which is used in the cluster analysis. K-means clustering was applied, and the dependency graph based results were compared with popular text representation methods, i.e. TF-IDF and ontology-based representation. The results show that the dependency graph outperforms both TF-IDF and ontology-based text representation, demonstrating that the proposed representation method leads to more accurate document clustering.
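Since the abstract contrasts the proposed dependency-graph representation with the TF-IDF bag-of-words baseline, a minimal sketch of that baseline may help. This is an illustrative toy, not the paper's pipeline: the three invented sentences stand in for the 20 Newsgroups documents.

```python
import math
from collections import Counter

# Toy corpus standing in for the 20 Newsgroups documents (illustrative only).
docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "the cats and dogs are animals",
]
tokenized = [d.split() for d in docs]
vocab = sorted({w for doc in tokenized for w in doc})
n_docs = len(tokenized)

def tfidf_vector(doc):
    """Bag-of-words TF-IDF vector: tf(w) * log(N / df(w))."""
    tf = Counter(doc)
    vec = []
    for w in vocab:
        df = sum(1 for d in tokenized if w in d)
        idf = math.log(n_docs / df)
        vec.append(tf[w] / len(doc) * idf)
    return vec

vectors = [tfidf_vector(d) for d in tokenized]
# "the" occurs in every document, so its idf is log(3/3) = 0: TF-IDF
# down-weights uninformative terms, but it still ignores word meaning and
# order, which is the sparsity/semantics gap the dependency graph targets.
```

These sparse vectors would then be fed to k-means; the dependency-graph representation replaces this vectorization step.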
The study aimed to identify the strategic gaps between the actual management practice of the public organizations investigated and the strategy indicated by the study model. The study relied on the variable of the general organization strategy in its dimensions (the general organization strategy, the organization's political strategy, and the organization's defense strategy). The study sample comprised the General Directorate of Traffic, the Civil Status Directorate, and the Civil Defense Directorate, formations affiliated with the Ministry of the Interior, chosen for the importance of the services these public organizations provide. In order to translate the answers into a quantitative expression in the analysis…
Nowadays, internet security is a critical concern, and one of the most difficult research issues in network security is intrusion detection: the fight against external threats. Intrusion detection is a method of securing computers and data networks that are already in use. To boost the efficacy of intrusion detection systems, machine learning and deep learning are widely deployed. While existing work on intrusion detection systems based on data mining and machine learning is effective, it detects intrusions by training static batch classifiers without considering the time-varying features of a regular data stream. Real-world problems, on the other hand, rarely fit into models with such constraints. Furthermore…
The synthesis, characterization, and mesomorphic properties of two new series of triazine-core-based liquid crystals have been investigated. The amino triazine derivatives were characterized by elemental analysis, Fourier transform infrared (FTIR) spectroscopy, 1H NMR, and mass spectrometry. The liquid crystalline properties of these compounds were examined by differential scanning calorimetry (DSC) and polarizing optical microscopy (POM). DSC and POM confirmed nematic (N) and columnar mesophase textures of the materials. The formation of mesomorphic properties was found to depend on the number of methylene units in the alkoxy side chains.

Objective: Carbamazepine is typically used for the treatment of seizure disorders and neuropathic pain. One of the major problems with this drug is its low solubility in water; therefore, the objective of this study was to enhance the solubility of carbamazepine by complexation with cyclodextrin, to be formulated as effervescent and dispersible granules. Methods: The solvent evaporation method was used to prepare a binary complex (carbamazepine/β-cyclodextrin) and a ternary complex (carbamazepine/β-cyclodextrin/hydroxypropyl methyl cellulose (HPMC E5)). The more soluble complex was further formulated as unit-dose effervescent and dispersible granules. The complexes were evaluated for their solubility, drug content, percentage practical yield, and…
Background: The base of the denture is largely responsible for providing the prosthesis with retention, stability, and support by being closely adapted to the oral mucosa. However, the process of bone resorption is irreversible and may lead to an inadequate fit of the prosthesis; this can be overcome by relining. Materials and methods: An acrylic-based soft denture liner was prepared from purified methyl methacrylate monomer with (10-2) initiator and 30% dibutyl phthalate plasticizer concentrations. Biological properties were evaluated in comparison with the control material through subcutaneous implantation of specimens in New Zealand rabbits. Excisional biopsies were taken after 1 and 3 days and 1, 2, 3, and 4 weeks. Mic…
The demand for single-photon sources in quantum key distribution (QKD) systems has necessitated the use of weak coherent pulses (WCPs), characterized by a Poissonian photon-number distribution. Ensuring security against eavesdropping attacks requires keeping the mean photon number (µ) small and known to the legitimate partners. However, accurately determining µ poses challenges due to discrepancies between theoretical calculations and practical implementation. This paper introduces two experiments. The first involves theoretical calculation of µ using several filters to generate the WCPs. The second uses a variable attenuator to generate the WCPs, and the value of µ is estimated from the photons detected by the BB…
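For a weak coherent pulse the photon number follows a Poisson distribution, so the vacuum, single-photon, and security-relevant multi-photon probabilities follow directly from µ. The sketch below is an illustration of that statistics, not the paper's setup; µ = 0.1 and the attenuation values are arbitrary example numbers, and the attenuator relation µ_out = µ_in · 10^(-A/10) is the standard dB scaling assumed here.

```python
import math

# Photon-number statistics of a weak coherent pulse (Poissonian).
def poisson_pmf(n, mu):
    return math.exp(-mu) * mu ** n / math.factorial(n)

mu = 0.1  # illustrative mean photon number, not a value from the paper
p_vacuum = poisson_pmf(0, mu)        # empty pulse
p_single = poisson_pmf(1, mu)        # ideal single-photon event
p_multi = 1.0 - p_vacuum - p_single  # multi-photon pulses an eavesdropper
                                     # could exploit; kept small by small mu

# With a variable attenuator, mu scales by 10^(-A/10) for A dB of attenuation.
def attenuate(mu_in, a_db):
    return mu_in * 10 ** (-a_db / 10)
```

At µ = 0.1 most pulses are empty, which is the usual trade-off: a smaller µ suppresses multi-photon emissions at the cost of raw key rate.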
Peak ground acceleration (PGA) is one of the critical factors that determine earthquake intensity. PGA is generally utilized to describe ground motion in a particular zone and can efficiently predict the parameters of site ground motion for the design of engineering structures. Therefore, novel models are developed to forecast PGA for the Iraqi database using the particle swarm optimization (PSO) approach. A data set of 187 historical ground-motion recordings in Iraq's tectonic regions was used to build the explicit proposed models. The proposed PGA models relate to different seismic parameters, including earthquake magnitude (Mw), average shear-wave velocity (VS30), and focal depth (FD)…
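As a sketch of the optimization machinery only (not the paper's PGA model), a minimal particle swarm optimizer is shown below minimizing a stand-in objective. In the study, the objective would instead be the misfit between predicted and recorded PGA over the 187 Iraqi records; the swarm size, inertia, and acceleration coefficients here are generic textbook choices, not the authors' settings.

```python
import random

random.seed(0)

def sphere(x):
    """Stand-in objective; the paper would minimize PGA prediction error."""
    return sum(xi * xi for xi in x)

def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + cognitive pull (own best) + social pull (swarm best)
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (g[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < f(g):
                    g = pos[i][:]
    return g

best = pso(sphere)
```

Replacing `sphere` with a function that scores a candidate coefficient vector against the recorded (Mw, VS30, FD, PGA) data would recover the fitting setup the abstract describes.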
In combinatorial testing, the construction of covering arrays is a key challenge because of the multiple aspects that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining a greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic methods are used to deal with tuples that may be left uncovered by the greedy strategy; the metaheuristic algorithm then drives the result toward near-optimality. As a result, using both greedy and HC algorithms in a single test-generation system is a good candidate approach if constructed correctly. …
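The greedy-plus-hill-climbing idea can be illustrated on a tiny pairwise (strength-2) covering-array instance. The sketch below is a simplified stand-in for the paper's generator: it uses a random seed array where a greedy construction would normally go, and plain HC with sideways moves; the sizes (4 binary factors, 6 rows) are arbitrary toy values.

```python
import itertools
import random

random.seed(1)

k, v, rows = 4, 2, 6  # 4 binary factors, 6 test rows (toy sizes)

def uncovered_pairs(array):
    """Count 2-way value combinations not yet covered by any row."""
    missing = 0
    for c1, c2 in itertools.combinations(range(k), 2):
        seen = {(row[c1], row[c2]) for row in array}
        missing += v * v - len(seen)
    return missing

# Start from a random array (a greedy seed would be used instead), then
# hill-climb: mutate one cell and keep the change unless coverage worsens.
array = [[random.randrange(v) for _ in range(k)] for _ in range(rows)]
cost = uncovered_pairs(array)
for _ in range(5000):
    if cost == 0:        # every pair of factor values is covered
        break
    r, c = random.randrange(rows), random.randrange(k)
    old = array[r][c]
    array[r][c] = random.randrange(v)
    new_cost = uncovered_pairs(array)
    if new_cost <= cost:  # accept improving and sideways moves
        cost = new_cost
    else:                 # revert worsening moves
        array[r][c] = old
```

The `cost` function is exactly the "leftover tuples" notion from the abstract: the greedy stage covers most pairs cheaply, and HC repairs whatever remains.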