Among metaheuristic algorithms, population-based algorithms are explorative search methods that are superior to local search algorithms at exploring the search space for globally optimal solutions. However, their primary downside is a low exploitative capability, which prevents a thorough search of the neighborhood of promising solutions for better ones. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategy is employed to improve the quality of clustering solutions in the neighborhood region while still exploring the global regions of the search space. On this basis, this work aims to improve FA using variable neighborhood search (VNS) as a local search method, giving the hybrid the benefit of a better trade-off between exploration and exploitation. The proposed FA-VNS allows fireflies to refine their clustering solutions while maintaining the diversity of the population during the search process through the perturbation operators of VNS. To evaluate the performance of the algorithm, eight benchmark datasets are used together with four well-known clustering algorithms. The comparison according to internal and external evaluation metrics indicates that the proposed FA-VNS produces more compact clustering solutions than the well-known clustering algorithms.
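To make the hybridization concrete, the following is a minimal sketch of how a VNS-style local search might refine a single firefly's clustering solution (its set of centroids). The Gaussian shaking operator, the k-means-style refinement, and all function names are illustrative assumptions, not the paper's exact operators.

```python
import numpy as np

def vns_shake(centroids, k_neighborhood, rng, scale=0.1):
    """Hypothetical VNS 'shaking' step: perturb k_neighborhood randomly
    chosen centroids to escape the current local optimum."""
    perturbed = centroids.copy()
    idx = rng.choice(len(centroids), size=min(k_neighborhood, len(centroids)), replace=False)
    perturbed[idx] += rng.normal(0.0, scale, size=perturbed[idx].shape)
    return perturbed

def sse(data, centroids):
    """Clustering objective: sum of squared distances to the nearest centroid."""
    d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
    return float((d.min(axis=1) ** 2).sum())

def local_refine(data, centroids, iters=5):
    """k-means-style refinement used here as the VNS local search."""
    for _ in range(iters):
        d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(len(centroids)):
            if np.any(labels == j):
                centroids[j] = data[labels == j].mean(axis=0)
    return centroids

def vns_improve(data, centroids, k_max=3, rng=None):
    """Apply VNS to one firefly's clustering solution (its centroids)."""
    rng = rng or np.random.default_rng(0)
    best, best_cost, k = centroids.copy(), sse(data, centroids), 1
    while k <= k_max:
        cand = local_refine(data, vns_shake(best, k, rng))
        cost = sse(data, cand)
        if cost < best_cost:   # accept the move and restart from the first neighborhood
            best, best_cost, k = cand, cost, 1
        else:                  # otherwise widen the neighborhood
            k += 1
    return best, best_cost
```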
Purpose: The purpose of this study is to clarify the basic dimensions through which scenario practices become firmly embedded within the organization as a final result of adopting this philosophy.
Methodology: The study adopts a survey of the major literature dealing with this subject in order to provide a conceptual, theoretical account of scenario theory.
The most prominent findings: Scenario formulation succeeds only when it reaches the decision-maker's mind and shapes correct mental models, which manifest in a broadened managerial perception and are adopted as the basis for the decisions taken. The strength …
In this paper, the generalized inverted exponential distribution is considered one of the most important distributions for studying failure times. The shape and scale parameters of the distribution are estimated after removing the fuzziness that characterizes its data, which are triangular fuzzy numbers. The centroid method is used to convert the fuzzy data into crisp data. The studied distribution has two parameters that are difficult to separate and estimate directly by the maximum likelihood (MLE) method, so the Newton-Raphson method has been used.
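As a hedged illustration of the estimation pipeline described above, the sketch below defuzzifies triangular fuzzy failure times with the centroid method and then maximizes the generalized inverted exponential log-likelihood numerically. The sample data are invented for illustration, and scipy's Nelder-Mead optimizer is used only as a stand-in for the paper's Newton-Raphson iteration.

```python
import numpy as np
from scipy.optimize import minimize

def centroid_defuzzify(tfn):
    """Centroid of a triangular fuzzy number (a, b, c): (a + b + c) / 3."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def gie_neg_loglik(params, x):
    """Negative log-likelihood of the generalized inverted exponential distribution
    f(x) = (alpha*lam/x^2) * exp(-lam/x) * (1 - exp(-lam/x))^(alpha - 1)."""
    alpha, lam = params
    if alpha <= 0 or lam <= 0:
        return np.inf
    e = np.exp(-lam / x)
    return -np.sum(np.log(alpha) + np.log(lam) - 2 * np.log(x)
                   - lam / x + (alpha - 1) * np.log1p(-e))

# Hypothetical fuzzy failure times given as triangular fuzzy numbers (a, b, c).
fuzzy_data = [(0.8, 1.0, 1.3), (1.5, 1.9, 2.2), (2.4, 2.8, 3.5), (0.5, 0.7, 1.0)]
x = np.array([centroid_defuzzify(t) for t in fuzzy_data])

# Numerical maximization of the likelihood; Nelder-Mead stands in here for the
# hand-coded Newton-Raphson iteration used in the paper.
res = minimize(gie_neg_loglik, x0=np.array([1.0, 1.0]), args=(x,), method="Nelder-Mead")
alpha_hat, lam_hat = res.x
print(f"alpha_hat = {alpha_hat:.4f}, lambda_hat = {lam_hat:.4f}")
```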
Corruption, in both its administrative and financial forms, certainly has a negative impact on development and a direct impact on all sectors, including the health sector, which in Iraq has seen a marked setback in performance and services. The research aims to demonstrate the negative effects of financial and administrative corruption on health institutions and their inability to meet health needs, and to assess the extent to which regulatory institutions manage to reduce financial and administrative corruption and the ways of dealing with it. The phenomenon of financial and administrative corruption is the most serious phenomenon because of its impact on economic growth, which in turn impedes the …
Due to the lack of vehicle-to-infrastructure (V2I) communication in existing transportation systems, traffic light detection and recognition is essential for advanced driver assistance systems (ADAS) and road infrastructure surveys. Additionally, autonomous vehicles have the potential to change urban transportation by making it safe, economical, sustainable, congestion-free, and transportable in other ways. Because of their limitations, traditional traffic light detection and recognition algorithms, which take a lot of time and effort to develop, cannot recognize traffic lights as effectively as deep learning-based techniques. The main aim of this research is to propose a traffic light detection and recognition model based on …
In the current article, the photophysical properties of 3,6-bis(5-bromothiophen-2-yl)-2,5-bis(2-ethylhexyl)-2,5-dihydropyrrolo[3,4-c]pyrrole-1,4-dione were investigated. The visible absorption bands lie at 527, 558, and 362 nm in propylene carbonate, and the compound was found to be fluorescent in solution and in the plastic film, with emission wavelengths between 550 and 750 nm. The Stokes shifts in propylene carbonate (P.C.), acetonitrile, diethyl ether, tetrahydrofuran (THF), cyclohexane, dibutyl ether, and dichloromethane (DCM) are 734, 836, 668, 601, 601, 719, and 804 cm⁻¹, respectively. The Stokes shift was smaller in THF and cyclohexane than in the other solvents, which indicates that less energy is lost between the excitation and fluorescence states. The …
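The reported Stokes shifts follow from the absorption and emission maxima via the usual wavenumber difference. The snippet below shows the calculation; the 581.83 nm emission maximum is back-calculated so that the example reproduces the roughly 734 cm⁻¹ value reported for propylene carbonate, and is not a value taken from the excerpt.

```python
def stokes_shift_cm1(lambda_abs_nm, lambda_em_nm):
    """Stokes shift in wavenumbers: (1/lambda_abs - 1/lambda_em) * 1e7 (nm -> cm^-1)."""
    return (1.0 / lambda_abs_nm - 1.0 / lambda_em_nm) * 1e7

# Illustrative values: the 558 nm absorption maximum and a back-calculated
# emission maximum that reproduce the ~734 cm^-1 shift in propylene carbonate.
print(round(stokes_shift_cm1(558.0, 581.83), 1))   # ~734.0 cm^-1
```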
The article analyzes the neologisms that arose in the Iraqi dialect after the 2003 US-British invasion and the fall of Saddam Hussein's regime, according to the theory I advocate, "The Basic Outline of Reference", which develops the Arab legacy and the cognitive theory that appeared in America in 1987; accordingly, the terminology of cognitive grammar is used. This theory states that reference is the interaction between four components: perception, imagination, imaginative comprehension, and the linguistic sign or symbolization (the neological word in this article). These components are so closely related that none of them can be lacking, because they constitute a holistic whole that belongs to a deeper level. Let us …
Multi-document summarization is an optimization problem that requires more than one objective function to be optimized simultaneously. The proposed work addresses the balance between two significant objectives, content coverage and diversity, when generating summaries from a collection of text documents.
Any automatic text summarization system faces the challenge of producing a high-quality summary. Despite the existing efforts on designing and evaluating the performance of many text summarization techniques, their formulations lack any model that gives an explicit representation of coverage and diversity, the two contradictory semantics of any summary. In this work, the design of …
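As a hedged illustration of how coverage and diversity can be traded off explicitly, the sketch below performs an MMR-style greedy sentence selection with a weighting parameter lam. The TF-IDF representation, the centroid-based coverage score, and the value of lam are assumptions made for illustration, not the model proposed in this work.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def mmr_summary(sentences, budget=3, lam=0.7):
    """Greedy MMR-style selection: reward coverage of the document collection,
    penalize redundancy with sentences already selected."""
    tfidf = TfidfVectorizer().fit_transform(sentences)
    centroid = np.asarray(tfidf.mean(axis=0))           # proxy for the collection's content
    coverage = cosine_similarity(tfidf, centroid).ravel()
    pairwise = cosine_similarity(tfidf)
    selected = []
    while len(selected) < min(budget, len(sentences)):
        best, best_score = None, -np.inf
        for i in range(len(sentences)):
            if i in selected:
                continue
            redundancy = max(pairwise[i][j] for j in selected) if selected else 0.0
            score = lam * coverage[i] - (1 - lam) * redundancy
            if score > best_score:
                best, best_score = i, score
        selected.append(best)
    return [sentences[i] for i in sorted(selected)]

docs = ["The model balances coverage and diversity.",
        "Coverage measures how well the summary reflects the documents.",
        "Diversity penalizes redundant sentences in the summary.",
        "The model balances coverage with diversity in summaries."]
print(mmr_summary(docs, budget=2))
```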
Jean-Paul Sartre and Badr Shakir al-Sayyab are among the most prominent writers who critiqued the destructive role of capitalism and the patriarchal power system in the crisis period that followed World War II. Divided into three chapters, the present study examines two of the most eminent literary works in the history of Western and Eastern societies in the fifties of the last century: Jean-Paul Sartre's play The Respectful Prostitute and Badr Shakir al-Sayyab's poem The Blind Prostitute.
Chapter one discusses the position of the prostitute in patriarchal societies. Chapter two linguistically analyzes …
Distributed Denial of Service (DDoS) attacks on Web-based services have grown in both number and sophistication with the rise of advanced wireless technology and modern computing paradigms. Detecting these attacks in the sea of communication packets is very important. Initially, most DDoS attacks were directed at the network and transport layers. Over the past few years, attackers have shifted their strategies toward the application layer. Application-layer attacks can be more harmful and stealthier because attack traffic cannot easily be distinguished from normal traffic flows. Distributed attacks are hard to fight because they can affect real computing resources as well as network bandwidth. DDoS attacks …
Dust is a frequent contributor to health risks and climate change, and it is one of the most dangerous issues facing people today. Desertification, drought, agricultural practices, and sand and dust storms from neighboring regions bring on this issue. Deep learning (DL) regression based on long short-term memory (LSTM) networks is proposed as a solution to increase the accuracy of dust forecasting and monitoring. The proposed system has two parts for detecting and monitoring dust: in the first step, LSTM and dense layers are used to build a dust-detection system, while in the second step, the proposed Wireless Sensor Network (WSN) and Internet of Things (IoT) model is used for forecasting and monitoring. The experimental DL system …
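A minimal sketch of an LSTM-plus-dense regression model of the kind described for the detection step is given below (Keras). The window length, feature set, layer sizes, and training data are illustrative assumptions, since the excerpt does not specify the architecture.

```python
import numpy as np
import tensorflow as tf

# Illustrative hyperparameters: a 24-step lookback window over 3 sensor features
# (e.g. dust concentration, humidity, wind speed); assumptions, not the paper's values.
TIMESTEPS, FEATURES = 24, 3

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(TIMESTEPS, FEATURES)),
    tf.keras.layers.LSTM(64),                      # temporal encoder over the window
    tf.keras.layers.Dense(32, activation="relu"),  # dense layer on top of the LSTM
    tf.keras.layers.Dense(1),                      # regression output: dust level
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Dummy training data standing in for the WSN/IoT sensor readings.
X = np.random.rand(256, TIMESTEPS, FEATURES).astype("float32")
y = np.random.rand(256, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:1], verbose=0))
```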