A resume is a candidate's first impression on a potential employer, so its importance cannot be overstated. Selecting the right candidates for a position within a company is a daunting task for recruiters who must review hundreds of resumes. To reduce the time and effort involved, NLTK and other Natural Language Processing (NLP) techniques can be used to extract the essential data from a resume. NLTK is a free, open-source, community-driven project and the leading platform for building Python programs that work with human language data. To select the resumes that best match the company's requirements, a classification algorithm such as K-Nearest Neighbors (KNN) is used. Because only the strongest resumes are shortlisted, our work also focuses on an automated system that recommends suitable skills and courses to candidates, using NLP to analyze writing style (linguistic fingerprints) and to measure style and word frequency in the submitted resume. Through semantic search over individual resumes, forensic experts can query the large semantic datasets held by companies and institutions, which also supports government forensic work based on official institutional databases. As global cybercrime grows and more applicants submit multilingual data, NLP makes this analysis easier. Given the close relationship between NLP and digital forensics, NLP techniques are increasingly used to strengthen investigations involving digital evidence and to exploit open-source data by analyzing massive amounts of public information.
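As an illustration of the kind of pipeline described above, the following is a minimal sketch that combines NLTK preprocessing with a TF-IDF representation and a K-Nearest Neighbors classifier from scikit-learn. The resume texts, category labels, and parameter choices are hypothetical placeholders, not the system built in this work.

    # Minimal sketch: classify resume text into job categories with NLTK + KNN.
    # The resumes and labels below are hypothetical placeholders.
    import nltk
    from nltk.corpus import stopwords
    from nltk.tokenize import word_tokenize
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.neighbors import KNeighborsClassifier

    nltk.download("punkt", quiet=True)
    nltk.download("stopwords", quiet=True)

    resumes = [
        "Experienced Python developer, NLP, machine learning, scikit-learn",
        "Registered nurse with five years of clinical and patient-care experience",
        "Data analyst skilled in SQL, statistics and dashboard reporting",
    ]
    labels = ["software", "healthcare", "analytics"]  # invented categories

    stop_words = set(stopwords.words("english"))

    def preprocess(text):
        # Tokenize with NLTK, drop stop words and non-alphabetic tokens.
        tokens = word_tokenize(text.lower())
        return " ".join(t for t in tokens if t.isalpha() and t not in stop_words)

    vectorizer = TfidfVectorizer()
    X = vectorizer.fit_transform(preprocess(r) for r in resumes)

    knn = KNeighborsClassifier(n_neighbors=1)  # 1 neighbor only because the toy set is tiny
    knn.fit(X, labels)

    new_resume = "Machine learning engineer familiar with Python and NLP pipelines"
    print(knn.predict(vectorizer.transform([preprocess(new_resume)])))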
Assessing the accuracy of classification algorithms is paramount, as it provides insight into their reliability and effectiveness in solving real-world problems. Accuracy assessment is essential in any remote sensing-based classification exercise, since classification maps invariably contain misclassified pixels and classification errors. In this study, two satellite images of Duhok province, Iraq, captured at regular intervals, were analyzed using spatial analysis tools to produce supervised classifications. Several operations, such as smoothing, were applied to enhance the categorization. The classification results indicate that Duhok province is divided into four classes: vegetation cover, buildings, …
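Accuracy assessment of this kind is usually carried out by comparing classified pixels against reference (ground-truth) samples. The sketch below, which uses hypothetical reference and classified label arrays for a four-class map, computes a confusion matrix, overall accuracy, and the kappa coefficient with scikit-learn; it is a generic illustration, not the exact workflow of this study.

    # Sketch of a standard accuracy assessment for a supervised classification map.
    # The reference and classified label arrays are hypothetical.
    import numpy as np
    from sklearn.metrics import confusion_matrix, accuracy_score, cohen_kappa_score

    classes = ["vegetation", "buildings", "class_3", "class_4"]  # last two unnamed above

    reference = np.array([0, 0, 1, 1, 2, 2, 3, 3, 0, 1])   # ground-truth sample labels
    classified = np.array([0, 0, 1, 2, 2, 2, 3, 1, 0, 1])  # labels read from the map

    print("Classes:", classes)
    print("Confusion matrix:\n", confusion_matrix(reference, classified))
    print("Overall accuracy:", accuracy_score(reference, classified))
    print("Kappa coefficient:", cohen_kappa_score(reference, classified))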
The subject of this study is one of the most interactive media networks: Instagram. The study uses a descriptive approach and focuses on how Instagram has turned from a media tool into a marketing tool. The research problem consists of this question: how does Instagram turn into a marketing tool, and what are the advantages and disadvantages of that? The study covers the definition of Instagram, its creation and development, Instagram in Bahrain, the uses of Instagram, Instagram as a marketing tool, and the advantages and disadvantages of using Instagram for marketing. The study confirms that Instagram is a new media network, examines how it has developed in later stages, and notes the increase in the percentage …
This paper proposes an improved structure for a neural controller based on an identification model for nonlinear systems. The goal of this work is to embed the Modified Elman Neural Network (MENN) model into the NARMA-L2 structure, instead of the Multi-Layer Perceptron (MLP) model, in order to construct a new hybrid neural structure that can be used as both an identifier model and a nonlinear controller for SISO linear or nonlinear systems. Two learning algorithms are used to adjust the weight parameters of the hybrid neural structure in its serial-parallel configuration: the first is a supervised learning algorithm based on the Back Propagation Algorithm (BPA), and the second is an intelligent algorithm …
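For readers unfamiliar with the Elman architecture, the following is a minimal NumPy sketch of a standard Elman (simple recurrent) network, in which a context layer stores a copy of the previous hidden state and feeds it back into the hidden layer. It is a generic illustration of that building block, not the authors' MENN/NARMA-L2 hybrid, and the dimensions and inputs are arbitrary.

    # Minimal Elman (simple recurrent) network: the context layer holds a copy of
    # the previous hidden state and feeds back into the hidden layer.
    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hidden, n_out = 1, 8, 1

    W_in = rng.normal(scale=0.5, size=(n_hidden, n_in))       # input   -> hidden
    W_ctx = rng.normal(scale=0.5, size=(n_hidden, n_hidden))  # context -> hidden
    W_out = rng.normal(scale=0.5, size=(n_out, n_hidden))     # hidden  -> output

    def forward(sequence):
        # Run the network over a sequence of scalar inputs.
        context = np.zeros(n_hidden)          # context layer starts at zero
        outputs = []
        for u in sequence:
            hidden = np.tanh(W_in @ np.array([u]) + W_ctx @ context)
            outputs.append(W_out @ hidden)
            context = hidden                  # Elman copy-back of the hidden state
        return np.array(outputs).ravel()

    print(forward([0.1, 0.5, -0.2, 0.3]))

In the serial-parallel (teacher-forcing) identification configuration mentioned above, the measured plant output, rather than the network's own prediction, would be fed back during training.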
Medicine is one of the fields in which advances in computer science are making significant progress. Some diseases require an immediate diagnosis in order to improve patient outcomes, and the use of computers in medicine improves precision and accelerates data processing and diagnosis. To categorize biological images, this research used hybrid machine learning, a combination of several deep learning approaches, together with a meta-heuristic algorithm. In addition, two different medical datasets were introduced, one covering magnetic resonance imaging (MRI) of brain tumors and the other dealing with chest X-rays (CXRs) of COVID-19. These datasets were fed to the combined network that contained deep learning …
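One common way to build such a hybrid is to use a pretrained CNN as a deep feature extractor and pass its features to a classical classifier. The sketch below pairs a Keras MobileNetV2 backbone with a scikit-learn SVM on randomly generated stand-in images; it illustrates the general pattern only and is not the specific combination network or meta-heuristic proposed in this research.

    # Sketch: deep features from a pretrained CNN + a classical SVM classifier.
    # Random arrays stand in for real MRI/CXR images and labels.
    import numpy as np
    from tensorflow.keras.applications import MobileNetV2
    from tensorflow.keras.applications.mobilenet_v2 import preprocess_input
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    images = (np.random.rand(8, 224, 224, 3) * 255).astype("float32")  # placeholder images
    labels = np.array([0, 0, 1, 1, 0, 1, 0, 1])                        # placeholder classes

    backbone = MobileNetV2(weights="imagenet", include_top=False, pooling="avg",
                           input_shape=(224, 224, 3))

    # Map each image to a fixed-length deep feature vector.
    features = backbone.predict(preprocess_input(images), verbose=0)

    X_train, X_test, y_train, y_test = train_test_split(features, labels,
                                                        test_size=0.25, random_state=0)
    clf = SVC(kernel="rbf").fit(X_train, y_train)
    print("Test accuracy:", clf.score(X_test, y_test))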
Background: Cytology is one of the important diagnostic tests performed on effusion fluid; it can detect malignant cells in up to 60% of malignant cases. The most important benign cell present in these effusions is the mesothelial cell. Mesothelial atypia can be striking and may simulate metastatic carcinoma, and many clinical conditions, such as anemia, SLE, liver cirrhosis, and others, may produce such reactive atypical cells. Recently, many studies have shown the value of computerized image analysis in differentiating atypical cells from malignant adenocarcinoma cells in effusion smears. Other studies support the reliability of quantitative analysis and morphometric features and have shown that they are objective prognostic indices. Method: …
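Computerized morphometric analysis of this kind typically measures features such as cell or nuclear area, perimeter, and shape descriptors from segmented objects. The sketch below, which uses a synthetic binary mask in place of a real segmented effusion smear, shows how such features can be computed with scikit-image; it is a generic illustration rather than the method of this study.

    # Sketch: extracting simple morphometric features from segmented objects.
    # A synthetic binary mask stands in for segmented cells in an effusion smear.
    import numpy as np
    from skimage.draw import disk
    from skimage.measure import label, regionprops

    mask = np.zeros((200, 200), dtype=bool)
    rr, cc = disk((60, 60), 20)      # a smaller, round object
    mask[rr, cc] = True
    rr, cc = disk((140, 130), 35)    # a larger object
    mask[rr, cc] = True

    for region in regionprops(label(mask)):
        print(f"area={region.area}, perimeter={region.perimeter:.1f}, "
              f"eccentricity={region.eccentricity:.2f}")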
This paper deals with non-polynomial spline functions ("generalized splines") for finding the approximate solution of linear Volterra integro-differential equations of the second kind, and extends this work to solve systems of linear Volterra integro-differential equations. The performance of the generalized spline functions is illustrated in test examples.
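For reference, a linear Volterra integro-differential equation of the second kind has the standard general form (stated here for context; the paper's exact formulation may differ in notation):

    u'(x) = f(x) + \lambda \int_{0}^{x} K(x,t)\, u(t)\, dt, \qquad u(0) = u_0, \quad 0 \le x \le a,

where f(x) is a known function, K(x,t) is the kernel, \lambda is a parameter, and u(x) is the unknown solution approximated here by non-polynomial (generalized) spline functions; a system is the vector-valued analogue of the same form.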
In this research, the Multi-Objective Optimization on the basis of Ratio Analysis (MOORA) approach combined with Taguchi design was used to convert the multi-performance problem into a single-performance problem for the nine experiments of a Taguchi L9 orthogonal array built for the carburization operation. The main variables with the greatest effect on the carburizing operation are the carburization temperature (°C), carburization time (hours), and tempering temperature (°C). This study also focused on calculating the amount of carbon penetration, the hardness value, and the optimal values obtained during the optimization by the Taguchi approach and the MOORA method for multiple parameters. In this study, the carburization process was carried out at temperatures between 850 and 950 °C for 2 to 6 hours …
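The MOORA step itself is straightforward to reproduce: each response column is vector-normalized, and the normalized values of the beneficial criteria are summed while those of the non-beneficial criteria are subtracted, giving a single assessment score per experiment. The sketch below uses a hypothetical 9 x 3 response matrix in place of the measured data and shows the standard MOORA ranking computation, not the study's actual results.

    # Sketch of the MOORA ratio-analysis step for a Taguchi L9 set of experiments.
    # The response matrix is hypothetical; columns might be, e.g., case depth
    # (beneficial), hardness (beneficial), and distortion (non-beneficial).
    import numpy as np

    responses = np.array([
        [0.8, 58, 0.12], [0.9, 60, 0.15], [1.1, 63, 0.18],
        [1.0, 61, 0.14], [1.2, 64, 0.20], [0.7, 57, 0.10],
        [1.3, 65, 0.22], [0.9, 59, 0.13], [1.1, 62, 0.17],
    ])
    beneficial = np.array([True, True, False])  # criteria to be maximized

    # Vector normalization: divide each column by the root of its sum of squares.
    normalized = responses / np.sqrt((responses ** 2).sum(axis=0))

    # MOORA assessment value: beneficial criteria minus non-beneficial criteria.
    scores = normalized[:, beneficial].sum(axis=1) - normalized[:, ~beneficial].sum(axis=1)

    ranking = np.argsort(-scores) + 1  # experiment numbers, best first
    print("Assessment scores:", np.round(scores, 4))
    print("Experiment ranking (best first):", ranking)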
An effective decision-making process is the basis for successfully solving any engineering problem. Many decisions taken in construction projects differ in nature because of the complexity of those projects, and one of the most crucial decisions, one that can lead to numerous issues over the course of a construction project, is the selection of the contractor. This study aims to use the Ordinal Priority Approach (OPA) for the contractor selection process in the construction industry. The proposed model involves two computer programs: the first is used to evaluate the decision-makers/experts in the construction projects, while the second is used to formulate …
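For context, the OPA is commonly stated in the literature as the following linear program over the ordinal ranks of experts i, criteria j, and alternatives k (quoted here as the standard formulation, which may not match the study's exact implementation):

    \max Z
    \text{s.t.} \quad Z \le i \bigl( j \bigl( k \,(W_{ijk} - W_{ij(k+1)}) \bigr) \bigr) \quad \forall i, j \text{ and } k = 1, \dots, m-1,
    \qquad\;\; Z \le i \, j \, m \, W_{ijm} \quad \forall i, j,
    \qquad\;\; \sum_{i}\sum_{j}\sum_{k} W_{ijk} = 1, \qquad W_{ijk} \ge 0,

where m is the number of alternatives and the optimal weights W_{ijk} are aggregated to obtain the priorities of the alternatives (contractors), the criteria, and the experts.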
In this study, gold nanoparticles were synthesized in a single-step biosynthetic method using an aqueous leaf extract of Thymus vulgaris L., which acts as both a reducing and a capping agent. The nanoparticles were characterized using UV-Visible spectra, X-ray diffraction (XRD), and FTIR. The as-prepared gold nanoparticles (GNPs) showed a surface plasmon resonance centered at 550 nm. The XRD pattern showed four strong, intense peaks indicating the crystalline nature and the face-centered cubic structure of the gold nanoparticles. The average crystallite size of the AuNPs was 14.93 nm. Field emission scanning electron microscopy (FESEM) was used to …
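Average crystallite size from XRD peak broadening is conventionally estimated with the Scherrer equation, quoted here as the standard relation (the abstract does not state which formula was applied):

    D = \frac{K \lambda}{\beta \cos\theta},

where D is the crystallite size, K \approx 0.9 is the shape factor, \lambda is the X-ray wavelength, \beta is the full width at half maximum of the diffraction peak in radians, and \theta is the Bragg angle.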
A new method characterized by simplicity, accuracy, and speed is described for the determination of the oxonium ion in ionizable inorganic acids such as hydrochloric (0.1-10), sulphuric (0.1-6), nitric (0.1-10), perchloric (0.1-7), acetic (0.1-100), and phosphoric (0.1-30) acids (mmol L-1) by continuous flow injection analysis. The proposed method is based on the generation of bromine from the reaction of BrO3- with Br- in the presence of H3O+; the bromine then reacts with fluorescein and quenches its fluorescence. Sample volumes of 31 μL (line no. 1) and 35 μL (line no. 2) were used, with a flow rate of 0.95 mL min-1 for the H2O carrier stream on line no. 1 and 1.3 mL min-1 for the fluorescein sodium salt stream on line no. 2. Linear regression of concentration (mmol L-1) versus quenched fluorescence gives a correlation …
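The calibration step described above is an ordinary least-squares fit of the quenched fluorescence against acid concentration. The sketch below, which uses hypothetical concentration and response values rather than the study's data, shows how such a calibration line and its correlation coefficient can be obtained with SciPy.

    # Sketch: linear calibration of quenched fluorescence vs. concentration.
    # The concentration and response values are hypothetical placeholders.
    import numpy as np
    from scipy.stats import linregress

    concentration = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])              # mmol/L
    quenched_fluorescence = np.array([2.1, 9.8, 19.5, 40.3, 99.0, 201.5])  # arbitrary units

    fit = linregress(concentration, quenched_fluorescence)
    print(f"slope={fit.slope:.3f}, intercept={fit.intercept:.3f}, r={fit.rvalue:.4f}")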