A substantial portion of today’s multimedia data exists in the form of unstructured text. However, the unstructured nature of text poses a significant challenge to meeting users’ information requirements. Text classification (TC) has been extensively employed in text mining to facilitate multimedia data processing. However, accurately categorizing texts becomes challenging as the proportion of non-informative features in the corpus grows. Several reviews on TC, encompassing various feature selection (FS) approaches for eliminating non-informative features, have been published. However, these reviews do not adequately cover recently explored approaches to TC problem-solving using FS, such as optimization techniques. This study comprehensively analyzes FS approaches based on optimization algorithms for TC. We begin by introducing the primary phases involved in implementing TC. We then survey a wide range of FS approaches for categorizing text documents and organize the existing work into four fundamental approaches: filter, wrapper, hybrid, and embedded. Furthermore, we review four families of optimization algorithms applied to text FS problems: swarm intelligence-based, evolutionary-based, physics-based, and human behavior-related algorithms. We discuss the advantages and disadvantages of state-of-the-art studies that employ optimization algorithms for text FS. We also consider several aspects of each proposed method and thoroughly discuss the challenges associated with datasets, FS approaches, optimization algorithms, machine learning classifiers, and the evaluation criteria used to assess new and existing techniques. Finally, by identifying research gaps and proposing future directions, our review provides valuable guidance to researchers in developing further studies and situating them within the current body of literature.
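To make the filter/wrapper/hybrid/embedded distinction concrete, here is a minimal illustrative sketch (not taken from any reviewed study) of the simplest family, a filter-style selector: each term is scored independently of any classifier (here by a simple class-frequency difference, standing in for chi-square or information gain), and only the top-k terms are kept. Wrapper and embedded approaches would replace this scoring step with classifier feedback.

```python
# Hypothetical filter-based feature selection for text classification.
# The scoring function is a stand-in for chi-square / information gain.
from collections import Counter

def filter_select(docs, labels, k):
    """Rank terms by |doc-freq in class 1 - doc-freq in class 0|, keep top k."""
    pos, neg = Counter(), Counter()
    for doc, y in zip(docs, labels):
        (pos if y == 1 else neg).update(set(doc.split()))
    vocab = set(pos) | set(neg)
    scored = sorted(vocab, key=lambda term: -abs(pos[term] - neg[term]))
    return scored[:k]

docs = [
    "cheap pills buy now",        # class 1
    "limited offer buy cheap",    # class 1
    "meeting agenda for monday",  # class 0
    "project status and agenda",  # class 0
]
labels = [1, 1, 0, 0]
# Keeps the three most class-discriminative terms; all others are filtered out.
print(filter_select(docs, labels, 3))
```

An optimization-based FS method, by contrast, would search the space of feature subsets (e.g., with a swarm or evolutionary algorithm) instead of ranking terms one at a time.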
With the heavy usage of computers and networks at the current time, the number of security threats has increased. The study of intrusion detection systems (IDS) has received much attention throughout the computer science field. The main objective of this study is to examine the existing literature on various approaches to intrusion detection. This paper presents an overview of different intrusion detection systems and a detailed analysis of multiple techniques for these systems, including their advantages and disadvantages. These techniques include artificial neural networks, bio-inspired computing, evolutionary techniques, machine learning, and pattern recognition.
This paper is a detailed review of the Spatial Autoregressive Quantile Regression (SARQR) model, which incorporates quantile regression into spatial autoregressive models to enable improved analysis of the characteristics of spatially dependent data. The relevance of SARQR is emphasized in many applications, including but not limited to fields that require the study of spatial variation and dependence. In particular, it surveys literature dated from 1971 to 2024 and shows the extent to which SARQR has already been applied in disciplines such as economics, real estate, environmental science, and epidemiology. Accordingly, evidence indicates SARQR has numerous benefits compar
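For orientation, a common specification of the model family discussed here (notation assumed for illustration; individual studies in the surveyed literature vary in estimator and identification strategy) writes, for quantile level $\tau \in (0,1)$:

```latex
% Spatial autoregressive quantile regression (one standard formulation):
y_i \;=\; \lambda(\tau)\sum_{j=1}^{n} w_{ij}\, y_j \;+\; \mathbf{x}_i^{\top}\boldsymbol{\beta}(\tau) \;+\; \varepsilon_i(\tau),
\qquad Q_{\tau}\!\left(\varepsilon_i(\tau)\mid \mathbf{X}, \mathbf{W}\mathbf{y}\right) = 0,
```

where $\mathbf{W} = (w_{ij})$ is the spatial weight matrix encoding dependence between units, $\lambda(\tau)$ is the quantile-specific spatial autoregressive parameter, and $\boldsymbol{\beta}(\tau)$ are quantile-specific covariate effects. Letting $\lambda$ and $\boldsymbol{\beta}$ vary with $\tau$ is what allows the model to capture how spatial dependence differs across the conditional distribution.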
Transdermal drug delivery has made an important contribution to medical practice but has yet to fully achieve its potential as an alternative to oral delivery and hypodermic injections. Transdermal therapeutic systems are designed to provide controlled, continuous delivery of drugs through the skin to the systemic circulation. A transdermal patch is an adhesive patch with a coating of drug; the patch is placed on the skin to deliver a particular amount of drug into the systemic circulation over a period of time. This review article provides information regarding transdermal drug delivery systems (TDDS) and their evaluation process, as a ready reference for the research scientist who is involved
The No Mobile Phone Phobia, or Nomophobia, notion refers to the psychological condition in which humans fear being disconnected from mobile phone connectivity. Hence, it is considered a recent-age phobia that has emerged as a consequence of the high engagement between people, mobile data, and communication inventions, especially smartphones. This review is based on earlier observations and current debate, such as the techniques commonly used to model and analyze this phenomenon, like statistical studies, in order to gain a better comprehension of human reactions to rapid technological ubiquity. Accordingly, humans ought to restrict their utilization of mobile phones instead of prohibit
Testing is a vital phase in software development, and having the right amount of test data is important for speeding up the process. As a result of the combinatorial optimization challenge, exhaustive testing may not always be practicable; shortages of resources, budget, and schedule also impede the testing process. Combinatorial testing (CT) can be described as a basic strategy for creating new test cases. CT has been discussed by several scholars while establishing alternative tactics depending on the interactions between parameters. Thus, an investigation into current CT methods was started in order to better understand their capabilities and limitations. In this study, 97 publications were evalua
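The core idea behind CT can be shown with a small illustrative sketch (not drawn from any of the surveyed publications): for three two-valued parameters, exhaustive testing needs 2³ = 8 cases, yet the four rows below already cover every value pair for every pair of parameters, which is the pairwise (2-way interaction) coverage criterion.

```python
# Pairwise (2-way) combinatorial coverage check for a small test suite.
from itertools import combinations, product

def covers_all_pairs(tests, n_params, values):
    """True if every pair of parameters sees every combination of values."""
    for i, j in combinations(range(n_params), 2):
        seen = {(t[i], t[j]) for t in tests}
        if seen != set(product(values, repeat=2)):
            return False
    return True

# 4 tests instead of the 8 needed for exhaustive coverage of 3 boolean params.
pairwise_suite = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]
print(covers_all_pairs(pairwise_suite, 3, (0, 1)))  # True
```

Higher interaction strengths (3-way, 4-way, ...) trade larger suites for the ability to trigger faults that depend on more parameters at once, which is the design space the surveyed CT strategies explore.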
In many video and image processing applications, frames are partitioned into blocks, which are extracted and processed sequentially. In this paper, we propose a fast algorithm for calculating features of overlapping image blocks. We assume the features are projections of the block onto separable 2D basis functions (usually orthogonal polynomials), where we benefit from symmetry with respect to the spatial variables. The main idea is based on the construction of auxiliary matrices that virtually extend the original image and make it possible to avoid time-consuming computation in loops. These matrices can be pre-calculated, stored, and used repeatedly, since they are independent of the image itself. We validated experimentally th
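The separability the abstract exploits can be illustrated with a minimal sketch (the paper's specific auxiliary-matrix construction is not reproduced here): a projection of a block onto a separable 2D basis B_p(x)·B_q(y) collapses from a double sum per feature into two small matrix products.

```python
# Separable 2D projection of an image block onto orthonormal 1D bases.
import numpy as np

rng = np.random.default_rng(0)
block = rng.random((8, 8))                    # one 8x8 image block
basis = np.linalg.qr(rng.random((8, 4)))[0]   # 4 orthonormal 1D basis vectors (columns)

# Naive projection: feature[p, q] = sum_{x,y} B_p(x) * B_q(y) * block[x, y]
naive = np.empty((4, 4))
for p in range(4):
    for q in range(4):
        naive[p, q] = np.sum(np.outer(basis[:, p], basis[:, q]) * block)

# Separable form: the same features via two 1D projections.
separable = basis.T @ block @ basis

print(np.allclose(naive, separable))  # True
```

For overlapping blocks the savings compound further, since neighboring blocks share most of their pixels and the per-block work reduces to reusing precomputed partial products, which is the role the abstract assigns to the auxiliary matrices.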
The current study introduces a novel method for calculating the stability time, based on converting the degradation conductivity curve obtained by the conventional method into a conversion curve. The stability time calculated by the novel method is shorter than the time measured by the conventional method. In the novel method, the stability time is given by the endpoint of tangency between the conversion curve and its tangent line; this point of tangency represents the stability time, as will be explained in detail. Still, it gives a clear and accurate picture of the dehydrochlorination behavior and can be generalized to all types of polyvinyl chloride, compared with the stability time measured by conventional ones based
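The abstract does not give the numerical procedure, so the following is only a hypothetical sketch of one way such a tangency endpoint could be located: fit a tangent line to the initial portion of a conversion curve and report the first time at which the curve deviates from that line by more than a tolerance. The curve, tolerances, and window size below are all invented for illustration.

```python
# Hypothetical tangent-endpoint detection on a synthetic conversion curve.
import numpy as np

def tangent_endpoint(t, y, n_fit=5, tol=1e-3):
    """Fit a tangent line to the first n_fit points, then return the first
    time at which the curve deviates from that line by more than tol."""
    slope, intercept = np.polyfit(t[:n_fit], y[:n_fit], 1)
    deviation = np.abs(y - (slope * t + intercept))
    idx = np.argmax(deviation > tol)   # index of first point past tolerance
    return t[idx]

# Synthetic curve: linear up to t = 60 min, then accelerating degradation.
t = np.linspace(0.0, 100.0, 1001)
y = 0.01 * t + np.where(t > 60.0, 0.002 * (t - 60.0) ** 2, 0.0)
print(tangent_endpoint(t, y))  # stability time shortly after t = 60
```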
At present, drug resistance is a major emerging problem in the healthcare sector. Novel antibiotics are in considerable demand because presently effective treatments have repeatedly failed. Antimicrobial peptides are biologically active secondary metabolites produced by a variety of microorganisms, such as bacteria, fungi, and algae; they possess surface-activity-reducing properties along with antimicrobial, antifungal, antioxidant, and antibiofilm activity. Antimicrobial peptides include a wide variety of bioactive compounds such as bacteriocins, glycolipids, lipopeptides, polysaccharide-protein complexes, phospholipids, fatty acids, and neutral lipids. Bioactive peptides derived from various natural sources like bacte