A substantial portion of today’s multimedia data exists in the form of unstructured text. However, the unstructured nature of text poses a significant challenge in meeting users’ information needs. Text classification (TC) has been extensively employed in text mining to facilitate multimedia data processing. However, accurately categorizing texts becomes challenging as the proportion of non-informative features in a corpus grows. Several reviews of TC, encompassing various feature selection (FS) approaches for eliminating non-informative features, have been published. However, these reviews do not adequately cover recently explored approaches to TC problem-solving that utilize FS, such as optimization techniques. This study comprehensively analyzes FS approaches based on optimization algorithms for TC. We begin by introducing the primary phases involved in implementing TC. We then explore a wide range of FS approaches for categorizing text documents, organizing the existing work into four fundamental approaches: filter, wrapper, hybrid, and embedded. Furthermore, we review four families of optimization algorithms applied to text FS problems: swarm intelligence-based, evolutionary-based, physics-based, and human behavior-related algorithms. We discuss the advantages and disadvantages of state-of-the-art studies that employ optimization algorithms for text FS. Additionally, we consider several aspects of each proposed method and thoroughly discuss the challenges associated with the datasets, FS approaches, optimization algorithms, machine learning classifiers, and evaluation criteria used to assess new and existing techniques. Finally, by identifying research gaps and proposing future directions, our review provides valuable guidance to researchers in developing further studies and situating them within the current body of literature.
Applying test case prioritization is a key part of system testing, intended to surface and resolve issues early in the development stage. Traditional prioritization techniques frequently fail to account for the complexities of large-scale test suites, evolving systems, and time constraints, and therefore cannot fully solve this problem. The study proposed here deals with a hybrid meta-heuristic method that addresses these modern challenges. The strategy combines a genetic algorithm with the black hole algorithm to strike a smooth trade-off between exploring numerous candidate orderings and exploiting the best one. The proposed hybrid genetic black hole (HGBH) algorithm uses the …
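The abstract is cut off before the algorithmic details, but the combination it names, genetic operators plus a black-hole-style pull toward the current best solution, can be sketched. Everything below is illustrative: the APFD fitness function, the operator choices, and all parameter values are assumptions, not the authors' HGBH.

```python
import random

def apfd(order, fault_matrix, n_faults):
    """Average Percentage of Faults Detected for a test ordering.
    fault_matrix[t] is the set of fault ids that test t detects."""
    n = len(order)
    first_pos = {}
    for pos, t in enumerate(order, start=1):
        for f in fault_matrix[t]:
            first_pos.setdefault(f, pos)
    # undetected faults are penalized as if found after the last test
    total = sum(first_pos.get(f, n + 1) for f in range(n_faults))
    return 1 - total / (n * n_faults) + 1 / (2 * n)

def hybrid_gbh(fault_matrix, n_faults, pop_size=20, gens=50, seed=1):
    """Toy GA + black-hole hybrid over test-order permutations."""
    rng = random.Random(seed)
    n = len(fault_matrix)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    fit = lambda o: apfd(o, fault_matrix, n_faults)
    for _ in range(gens):
        pop.sort(key=fit, reverse=True)
        best = pop[0]                      # the "black hole"
        next_pop = [best[:]]               # elitism
        while len(next_pop) < pop_size:
            # GA step: one-cut order crossover of two fitter parents
            a, b = rng.sample(pop[:pop_size // 2], 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + [t for t in b if t not in a[:cut]]
            # black-hole step: occasionally pull the child toward the
            # best ordering by copying a random prefix of it
            if rng.random() < 0.5:
                k = rng.randrange(1, n)
                head = best[:k]
                child = head + [t for t in child if t not in head]
            # mutation: swap two positions
            if rng.random() < 0.2:
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fit)
```

The black-hole pull plays the exploitation role (absorbing solutions toward the current best), while crossover and mutation keep exploring new orderings.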
A cumulative review with a systematic approach aimed to compare studies investigating the possible impact of the active form of vitamin D3, calcitriol (CTL), on the tooth movement caused by orthodontic forces (OTM) by evaluating the quality of evidence, collating current data from animal model studies, in vivo cell culture studies, and human clinical trials. Methods: A strict systematic review protocol was applied following registration with the International Prospective Register of Systematic Reviews (PROSPERO). A structured search strategy, including the main keywords, was defined for a detailed search of the electronic databases Medline/PubMed, EMBASE, Scopus, Web of Science, and …
Peak ground acceleration (PGA) is one of the critical factors in determining earthquake intensity. PGA is generally utilized to describe ground motion in a particular zone and can efficiently predict the site ground-motion parameters needed for the design of engineering structures. Therefore, novel models utilizing the particle swarm optimization (PSO) approach are developed to forecast PGA for the Iraqi database. A data set of 187 historical ground-motion recordings in Iraq’s tectonic regions was used to build the explicit proposed models. The proposed PGA models relate to different seismic parameters, including the earthquake magnitude (Mw), average shear-wave velocity (VS30), focal depth (FD) …
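The paper's explicit PGA models are not reproduced in this excerpt, so the snippet below only illustrates the PSO machinery itself: a plain global-best PSO minimizing the RMSE of a hypothetical linear attenuation form ln PGA = a + b·Mw + c·ln VS30 on synthetic records. The functional form, the coefficients, and the PSO parameters are all assumptions for the demo, not the published models.

```python
import math
import random

def pso_minimize(objective, dim, n_particles=30, iters=200, seed=0, span=5.0):
    """Plain global-best PSO: inertia plus cognitive and social pulls."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-span, span) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.72, 1.49, 1.49          # commonly used default coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Synthetic records (Mw, VS30 in m/s) and invented "true" coefficients.
records = [(4.5, 400.0), (5.2, 620.0), (6.1, 300.0), (6.8, 760.0), (5.7, 520.0)]
true_coef = (-2.0, 0.8, -0.3)

def ln_pga(coef, mw, vs30):
    a, b, c = coef
    return a + b * mw + c * math.log(vs30)

targets = [ln_pga(true_coef, mw, vs) for mw, vs in records]

def fit_rmse(coef):
    err = [(ln_pga(coef, mw, vs) - t) ** 2 for (mw, vs), t in zip(records, targets)]
    return math.sqrt(sum(err) / len(err))

best_coef, best_rmse = pso_minimize(fit_rmse, dim=3)
```

Each particle is a candidate coefficient vector; the swarm's best position acts as the shared attractor, which is what lets PSO calibrate such explicit ground-motion equations without gradients.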
Background: Obesity is increasingly prevalent in modern societies and constitutes a significant public health problem, carrying an increased risk of cardiovascular disease.
Objective: This study aims to determine the agreement between actual and perceived body image in the general population.
Methods: A descriptive cross-sectional study design was used with a sample size of 300. Data were collected from eight heavily populated areas of the Northern district of Karachi, Sindh, over a period of six months (10 January 2020 to 21 June 2020). The Figure Rating Scale (FRS) questionnaire was applied to collect demographic data and perceptions of body weight. Body mass index (BMI) was used for assessing …
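The abstract breaks off at the BMI assessment. Studies of this kind typically classify each respondent's actual status from BMI and then quantify actual-versus-perceived agreement with a statistic such as Cohen's kappa. The sketch below shows those two pieces under the standard WHO BMI cut-offs; it is an illustration of the usual analysis, not this study's code.

```python
def bmi_category(weight_kg, height_m):
    """Classify actual body status by WHO BMI cut-offs (kg/m^2)."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"
    if bmi < 25.0:
        return "normal"
    if bmi < 30.0:
        return "overweight"
    return "obese"

def cohens_kappa(actual, perceived):
    """Chance-corrected agreement between two categorical ratings."""
    n = len(actual)
    cats = sorted(set(actual) | set(perceived))
    p_observed = sum(a == p for a, p in zip(actual, perceived)) / n
    p_expected = sum((actual.count(c) / n) * (perceived.count(c) / n)
                     for c in cats)
    return (p_observed - p_expected) / (1 - p_expected)
```

Kappa of 1 means perfect agreement between measured and perceived categories; values near 0 mean agreement no better than chance.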
This study employs wavelet transforms to address the issue of boundary effects. Additionally, it utilizes probit transform techniques, based on probit functions, to estimate the copula density function; the estimation depends on the empirical distribution function of the variables, and the density is estimated in a transformed domain. Recent research indicates that early implementations of this strategy may have been more efficient. Nevertheless, in this work we implemented two novel methodologies utilizing the probit transform and the wavelet transform, and then evaluated and contrasted them using three criteria: root mean square error (RMSE), the Akaike information criterion (AIC), and log…
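The two comparison criteria named before the cut-off are standard and easy to state in code. A minimal sketch (the log-likelihoods themselves would come from the fitted copula models, which are not shown in this excerpt):

```python
import math

def rmse(estimates, truths):
    """Root mean square error between estimated and true density values."""
    n = len(truths)
    return math.sqrt(sum((e - t) ** 2 for e, t in zip(estimates, truths)) / n)

def aic(log_likelihood, n_params):
    """Akaike information criterion: 2k - 2 ln L (lower is better)."""
    return 2 * n_params - 2 * log_likelihood
```

RMSE rewards pointwise accuracy of the estimated density, while AIC trades fit quality against the number of parameters, so the two criteria can rank competing estimators differently.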
We cannot go back in time and watch the old plays of the giants of ancient Greek tragedy (Aeschylus, Sophocles, and Euripides) through the eyes of a past generation. Even if we could return to attend their plays, it is certain that we would not savor, or even tolerate, much of what we would see in those performances: the traditional religious rituals, accompanied by dance and music in the ancient Greek style, made up a large part of the theatrical spectacle. Can the eyes and ears of a twentieth-century audience accept those appearances as they were then?
Inevitably, it would look like nothing more than a museum collection. In light of the foregoing, we therefore find ourselves forced, when re…
The widespread use of the internet all over the world, together with the ever-increasing number of users exchanging important information over it, highlights the need for new methods to protect that information from corruption or modification by intruders. This paper suggests a new method to ensure that the text of a given document cannot be modified by intruders undetected. The method consists of a mixture of three steps. The first step borrows some concepts of the "Quran" security system to detect certain types of change occurring in a given text: a key for each paragraph is extracted from the group of letters in that paragraph that occur at positions which are multiples of a given prime number. This step cannot detect the ch…
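The key-extraction step can be sketched, though the sketch is only a guess at the mechanism from the truncated description: the key is taken here to be the letters whose 1-based position in the paragraph (counting letters only) is a multiple of a chosen prime. The paper's actual extraction rule, and its remaining two steps, may differ.

```python
def paragraph_key(paragraph, prime=7):
    """Build a paragraph's integrity key from the letters whose 1-based
    position (counting letters only) is a multiple of the chosen prime.
    NOTE: this rule is inferred from the abstract, not confirmed by it."""
    letters = [c for c in paragraph if c.isalpha()]
    return "".join(c for i, c in enumerate(letters, 1) if i % prime == 0)

def is_modified(paragraph, stored_key, prime=7):
    """Flag a paragraph whose recomputed key no longer matches the stored one."""
    return paragraph_key(paragraph, prime) != stored_key
```

A change that happens to leave every key-position letter intact would slip past this check, which matches the abstract's admission that the first step "cannot detect" some changes on its own.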
Centralization and decentralization, planning and development, and community participation in managing local affairs are among the many approaches aimed at creating the proper environment for the growth and development of a society in the place where it lives. As long as the overall trend in Iraq, represented by the permanent constitution, is decentralization to regions and provinces, the solutions to the obstacles that may face this transition in some respects lie in ways of coordinating and integrating the multiple levels of planning that the planner may exercise in the future organization. This paper presents some visions and ideas that can contribute to the organization …
Variable selection is an essential and necessary task in the statistical modeling field. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question researchers need to ask themselves is which variables are the most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined, and the posterior distributions of all the parameters are derived. The new variable selection method is tested using four simulated datasets. The new approach is compared with some existing techniques: ordinary least squares (OLS), the least absolute shrinkage and selection operator …
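The paper's model and posterior derivations are not in this excerpt, so the snippet below only illustrates the Gibbs mechanics on the textbook case of a bivariate normal with correlation rho, where both full conditionals are known in closed form. It is an illustration of the sampling technique, not the authors' variable selection sampler.

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples=5000, burn_in=500, seed=0):
    """Gibbs sampling for (X, Y) ~ bivariate standard normal, corr = rho.
    Each full conditional is N(rho * other, 1 - rho^2), so the sampler
    alternates exact draws from the two conditionals."""
    rng = random.Random(seed)
    sd = math.sqrt(1 - rho ** 2)
    x = y = 0.0
    samples = []
    for i in range(n_samples + burn_in):
        x = rng.gauss(rho * y, sd)   # draw x | y
        y = rng.gauss(rho * x, sd)   # draw y | x
        if i >= burn_in:
            samples.append((x, y))
    return samples
```

In the variable selection setting the same alternation runs over regression coefficients and inclusion indicators instead of x and y, one full conditional at a time.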
Drought is a natural phenomenon in many arid, semi-arid, and even wet regions, showing that no region worldwide is excluded from its occurrence. Extreme droughts have been intensified by global warming and climate change. It is therefore essential to review the studies conducted on drought and make use of the recommendations researchers have made. Drought is classified as meteorological, agricultural, hydrological, and socio-economic. In addition, researchers describe drought severity using various indices, each requiring different input data. The indices used by various researchers include the Joint Deficit Index (JDI), the Effective Drought Index (EDI), the Streamflow Drought Index (SDI), the Sta…
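Of the indices listed, the Streamflow Drought Index is the simplest to state: streamflow volumes are standardized against their long-run mean and standard deviation, and strongly negative values flag drought. The sketch below is deliberately minimal; the full index works on cumulative volumes per overlapping reference period and often applies a normalizing transform to skewed flow data first.

```python
import statistics

def streamflow_drought_index(volumes):
    """Standardize streamflow volumes: SDI = (V - mean) / std.
    Values below about -1 indicate drought conditions."""
    mu = statistics.mean(volumes)
    sigma = statistics.stdev(volumes)
    return [(v - mu) / sigma for v in volumes]
```

The other standardized indices mentioned follow the same pattern with different input data, which is why the abstract stresses that each index requires its own inputs.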