A resume is the first impression between a candidate and a potential employer, so its importance cannot be overstated. Selecting the right candidates for a position is a daunting task for recruiters who must review hundreds of resumes. To reduce this time and effort, NLTK and Natural Language Processing (NLP) techniques can be used to extract the essential data from a resume. NLTK is a free, open-source, community-driven project and the leading platform for building Python programs that work with human language data. To select the resumes that best match the company's requirements, a classification algorithm such as K-Nearest Neighbours (KNN) is used. Because a resume must stand out among hundreds to be selected, our work also focuses on an automated system that recommends the right skills and courses to candidates, using NLP to analyze writing style (linguistic fingerprints), measure style, and analyze word frequency in the submitted resume. Through semantic search over individual resumes, forensic experts can query the large semantic datasets provided to companies and institutions, facilitating the work of government forensics through access to official institutional databases. With the growth of global cybercrime and the increase in applicants with multilingual data, NLP makes such analysis easier. Given the important relationship between NLP and digital forensics, NLP techniques are increasingly used to enhance investigations involving digital evidence and to support open-source data analysis at massive scale.
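As a concrete illustration of the pipeline described above, the sketch below tokenizes resume text with NLTK, builds TF-IDF word-frequency features, and matches resumes to job categories with a KNN classifier. The sample resumes, category labels, and preprocessing choices are hypothetical stand-ins, not the study's actual data or configuration.

```python
# Minimal sketch: tokenize resume text with NLTK, vectorize with TF-IDF,
# and match resumes to job categories with a KNN classifier.
# The sample resumes and category labels below are hypothetical.
import nltk
from nltk.corpus import stopwords
from nltk.tokenize import word_tokenize
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

nltk.download("punkt", quiet=True)
nltk.download("stopwords", quiet=True)

def preprocess(text: str) -> str:
    """Lowercase, tokenize, and drop stop words / non-alphabetic tokens."""
    stop = set(stopwords.words("english"))
    tokens = [t.lower() for t in word_tokenize(text) if t.isalpha()]
    return " ".join(t for t in tokens if t not in stop)

# Hypothetical labeled training data (resume text -> job category).
resumes = [
    "Experienced Python developer, NLP, machine learning, pandas",
    "Registered nurse with ICU experience and patient care skills",
    "Accountant skilled in auditing, Excel and financial reporting",
]
labels = ["software", "healthcare", "finance"]

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform([preprocess(r) for r in resumes])

knn = KNeighborsClassifier(n_neighbors=1)
knn.fit(X, labels)

new_resume = "Data scientist with Python, scikit-learn and NLTK experience"
prediction = knn.predict(vectorizer.transform([preprocess(new_resume)]))
print(prediction[0])  # expected: "software"
```

In practice the labeled set would contain many resumes per category and a larger value of n_neighbors, but the flow (tokenize, vectorize by word frequency, classify with KNN) stays the same.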
Abstract:
High blood pressure is a serious disease that a person can develop without feeling any symptoms, and it arises from many causes; it therefore became necessary to study the subject and to reduce these many contributing factors to a smaller set of specific causes using factor analysis.
The researcher arrived at five factors that together explain only 71% of the total variation in the phenomenon under study, where overweight, heavy alcohol consumption, smoking, and lack of exercise are the causes with the greatest influence on the incidence of this disease.
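For readers unfamiliar with the method, the sketch below shows a factor-analysis workflow of the kind summarized above: extract five factors from standardized risk-factor measurements and estimate the share of total variance they explain. The data matrix, variable set, and the squared-loadings variance estimate are illustrative assumptions, not the study's actual data.

```python
# Minimal sketch of a factor analysis like the one described above:
# extract five factors from standardized measurements and estimate the
# share of total variance they explain. The data here is random
# stand-in data; the real study used survey measurements.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical matrix: 200 respondents x 12 observed risk-factor variables
# (e.g. weight, alcohol use, smoking, exercise, diet, stress, ...).
X = StandardScaler().fit_transform(rng.normal(size=(200, 12)))

fa = FactorAnalysis(n_components=5, random_state=0)
fa.fit(X)

# Communality-based estimate: sum of squared loadings per factor,
# divided by the number of observed (standardized) variables.
loadings = fa.components_          # shape (5 factors, 12 variables)
explained = (loadings ** 2).sum(axis=1) / X.shape[1]
print("variance explained per factor:", np.round(explained, 3))
print("total explained:", round(float(explained.sum()), 3))
```

With genuinely correlated survey data, a small number of factors can account for a large share of the variance (about 71% in the study above); with the random stand-in data the total will be much lower.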
Abstract
Juvenile delinquency is one of the most prominent social phenomena that disturbs the social system in any society, and especially in Iraq more than in other Islamic and Arab countries, owing to the abnormal conditions that Iraq has passed through: wars, blockade, and finally the invasion. It has therefore been necessary to put this phenomenon under study and analysis, to discover the important reasons behind it, and to try to treat what can be treated of its effects upon society.
This study is mainly concerned with explaining the social factors leading toward juvenile delinquency, and it tries to crystallize the problem of the study in the following question: (What are the psychological, ...
The present study is a qualitative study that aims to investigate the way the Iraqi caricaturist, Dheaa Al-Hajjar, uses caricatures to produce a satirical meaning humorously. Producing satire while at the same time maintaining humor requires creative thinking on the part of the caricaturist. Thus, the study examines the production of humorous satire in terms of creativity. The analysis is done from a cognitive linguistic point of view using Arthur Koestler's theory of bisociation as presented in his book The Act of Creation (1964). The main principle on which the theory is based is that humor is created via linking (or bisociating, in Koestler's terms) two habitually incompatible trains of thought in order to come up with a novel meaning.
Computer vision seeks to mimic the human visual system and plays an essential role in artificial intelligence. It is based on different signal preprocessing techniques; therefore, developing efficient techniques becomes essential to achieving fast and reliable processing. Various signal preprocessing operations have been used for computer vision, including smoothing, signal analysis, resizing, sharpening, and enhancement, to reduce unwanted falsifications, aid segmentation, and improve image features. For example, to reduce the noise in a disturbed signal, smoothing kernels can be used effectively; this is achieved by convolving the disturbed signal with a smoothing kernel. In addition, orthogonal moments (OMs) are a crucial ...
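To make the smoothing step concrete, the sketch below convolves a noisy 1-D signal with a normalized Gaussian kernel, which is one common choice of smoothing kernel; the kernel size, sigma, and test signal are illustrative assumptions rather than values from the abstract.

```python
# Minimal sketch of the smoothing step described above: convolve a noisy
# 1-D signal with a Gaussian kernel to suppress noise. Kernel width and
# signal shape are illustrative choices, not values from the paper.
import numpy as np

def gaussian_kernel(size: int = 9, sigma: float = 1.5) -> np.ndarray:
    """Discrete, normalized Gaussian smoothing kernel."""
    x = np.arange(size) - (size - 1) / 2
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

rng = np.random.default_rng(42)
t = np.linspace(0, 2 * np.pi, 200)
noisy = np.sin(t) + rng.normal(scale=0.3, size=t.size)   # disturbed signal

# Convolution with the smoothing kernel attenuates the added noise.
smoothed = np.convolve(noisy, gaussian_kernel(), mode="same")
print("noise std before:", round(float(np.std(noisy - np.sin(t))), 3))
print("noise std after: ", round(float(np.std(smoothed - np.sin(t))), 3))
```

The same idea extends to images, where a 2-D smoothing kernel is convolved with the pixel array before segmentation or feature extraction.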
Neurolinguistics is a new science that studies the close relationship between language and neuroscience. This interdisciplinary field confirms the functional integration between language and the nervous system, that is, the movement of linguistic information in the brain during reception, acquisition, and production in order to achieve linguistic communication, because language is in fact a mental process that takes place only through the nervous system. This research shows the benefit of each of these two fields to the other. The science includes important topics such as language acquisition, the linguistic abilities of the two hemispheres of the brain, the linguistic responsibility of the brain centers, and the time limit for language ...
OpenStreetMap (OSM), recognised for its current and readily accessible spatial database, frequently serves regions lacking precise data at the necessary granularity. Global collaboration among OSM contributors presents challenges to data quality and uniformity, exacerbated by the sheer volume of input and indistinct data annotation protocols. This study presents a methodological improvement in the spatial accuracy of OSM datasets centred over Baghdad, Iraq, utilising data derived from OSM services and satellite imagery. An analytical focus was placed on two geometric correction methods: a two-dimensional polynomial affine transformation and a two-dimensional polynomial conformal transformation. The former involves twelve coefficients for ad...
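The sketch below illustrates the first correction method named above: a second-order two-dimensional polynomial (affine-style) transformation with twelve coefficients, six per output coordinate, fitted by least squares to control points linking OSM positions to reference positions. The control points, the simulated shift, and the exact polynomial terms are assumptions for illustration, not the study's data or its precise formulation.

```python
# Minimal sketch: fit a 12-coefficient second-order 2-D polynomial
# transformation to control points (OSM coords -> reference coords)
# by least squares. All points below are made up for illustration.
import numpy as np

def design(pts: np.ndarray) -> np.ndarray:
    """Design matrix with terms [1, x, y, xy, x^2, y^2] per point."""
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])

def fit(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    coeffs, *_ = np.linalg.lstsq(design(src), dst, rcond=None)
    return coeffs                      # shape (6, 2) -> 12 coefficients

def transform(coeffs: np.ndarray, pts: np.ndarray) -> np.ndarray:
    return design(pts) @ coeffs

# Hypothetical control points around Baghdad (lon, lat) and their
# reference locations, here simulated as a small systematic shift.
rng = np.random.default_rng(1)
osm = np.column_stack([44.2 + 0.4 * rng.random(10), 33.2 + 0.3 * rng.random(10)])
ref = osm + [0.0011, -0.0007] + rng.normal(scale=1e-4, size=osm.shape)

coeffs = fit(osm, ref)
rmse = np.sqrt(((transform(coeffs, osm) - ref) ** 2).mean())
print("control-point RMSE:", float(rmse))
```

A conformal transformation would constrain the coefficients so that angles are preserved, trading flexibility for rigidity; the affine-style polynomial above leaves all twelve coefficients free.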
The increasing complexity of how humans interact with and process information has driven significant advances in Natural Language Processing (NLP), transitioning from task-specific architectures to generalized frameworks applicable across multiple tasks. Despite their success, challenges persist in specialized domains such as translation, where instruction tuning may prioritize fluency over accuracy. Against this backdrop, the present study conducts a comparative evaluation of ChatGPT-Plus and DeepSeek (R1) on a high-fidelity bilingual retrieval-and-translation task. A single standardized prompt directs each model to access the Arabic-language news section of the College of Medicine, University of Baghdad, and retrieve the three most r...
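A minimal sketch of this kind of evaluation setup is shown below: the same standardized prompt is sent to two chat models and the raw outputs are collected for side-by-side comparison. The endpoint URL, model identifiers, and prompt wording are assumptions (DeepSeek exposes an OpenAI-compatible API, and the prompt is a paraphrase of the task), not the study's exact configuration.

```python
# Minimal sketch: send one standardized prompt to two chat models and
# collect the outputs for manual comparison. Model names, the DeepSeek
# base URL, and the prompt text are assumptions for illustration.
# Requires OPENAI_API_KEY in the environment and a DeepSeek key below.
from openai import OpenAI

PROMPT = (
    "Visit the Arabic-language news section of the College of Medicine, "
    "University of Baghdad, retrieve the three most recent news items, "
    "and translate them into English."
)

targets = {
    # label: (client, model id) -- model ids are placeholders.
    "ChatGPT": (OpenAI(), "gpt-4o"),
    "DeepSeek-R1": (OpenAI(base_url="https://api.deepseek.com",
                           api_key="YOUR_DEEPSEEK_KEY"), "deepseek-reasoner"),
}

for label, (client, model) in targets.items():
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": PROMPT}],
    )
    print(f"--- {label} ---")
    print(resp.choices[0].message.content)
```

The collected outputs can then be scored manually (or with reference-based metrics) for retrieval fidelity and translation accuracy.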
This study is entitled: The pre-Islamic (Jahili) poem and body language.
Its main objective is to reveal the manifestations of this language in the text mentioned. Accordingly, the sieve poem has been given a semantic and hermeneutic reading, revealing the poet's ability to employ symbols and signals (body language) in the poem chosen for this purpose, and confirming the existence of such language in pre-Islamic poetry. After long reflection and reading, the signs and symbols of the physical movement of the body, and its feminine and aesthetic manifestations, were identified; this was achieved through the use of modern critical methodologies that bear directly on this language. The study consisted of an introduction and three topics, followed by ...