Market share is a major indicator of business success, and understanding how numerous economic factors affect it is critical to a company’s success. In this study, we examine the market shares of two manufacturers in a duopoly economy and present an optimal pricing approach for increasing a company’s market share. We develop two numerical models based on ordinary differential equations to investigate market success. The first model accounts for quantity demanded and investment in R&D, whereas the second investigates a more realistic relationship between quantity demanded and pricing.
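The abstract does not reproduce the paper’s equations. Purely as an illustrative sketch, and not the authors’ model, the following integrates a generic logistic-style share ODE in which the lower-priced firm gains share; all names and parameter values (`alpha`, `p1`, `p2`, the form of the right-hand side) are hypothetical.

```python
# Illustrative sketch only: a generic duopoly market-share ODE, NOT the
# paper's actual model. Firm 1's share s1 follows a logistic-style law
# driven by the price gap; shares sum to 1 in a duopoly.
def simulate(p1=10.0, p2=12.0, alpha=0.05, s1=0.5, steps=1000, dt=0.01):
    for _ in range(steps):
        # ds1/dt = alpha * s1 * (1 - s1) * (p2 - p1):
        # the cheaper firm gains share, saturating near 0 and 1.
        ds1 = alpha * s1 * (1.0 - s1) * (p2 - p1)
        s1 += ds1 * dt                      # forward Euler step
    return s1, 1.0 - s1

s1, s2 = simulate()
print(round(s1 + s2, 6))                    # shares always sum to 1
```

With `p1 < p2` the first firm’s share drifts above its initial 0.5, which is the qualitative behavior any such price-driven share model should show.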
Identifying abiotic stress-induced genes may help in understanding plant responses and adaptability to salinity and drought stresses. Differential display reverse transcriptase–polymerase chain reaction (DDRT-PCR) was used to investigate differences in gene expression between drought- and salinity-stressed plantlets of Ruta graveolens. Drought- or salt-responsive genes were screened in R. graveolens plantlets under direct and stepwise exposure using the DDRT technique. Gene expression was investigated in both the control and the salt- or drought-stressed plantlets, and differential banding patterns with different molecular sizes were observed using the primers OPA-01 (646, 770, and 983 bp), OPA-08 (593 and 988 bp), OPA-11 (674 and 831 bp …
This research studies paired (panel) data models with mixed random parameters, which contain two types of parameters: the first random and the other fixed. The random parameter arises from differences in the marginal slopes of the cross-sections, while the fixed parameter arises from differences in the fixed terms (intercepts) and the random errors of each section. The errors exhibit heteroscedasticity in addition to first-order serial correlation. The main objective of this research is to use efficient methods suited to paired data in the case of small samples; to achieve this goal, the feasible generalized least squares …
Abstract
The problem of missing data represents a major obstacle for researchers during data analysis, since it recurs across all fields of study, including social, medical, astronomical, and clinical experiments.
The presence of this problem in the data under study may negatively affect the analysis and lead to misleading conclusions, since such conclusions carry a great bias caused by the missingness. Despite the efficiency of wavelet methods, they too are affected by missing data, in addition to the resulting loss of estimation accuracy …
Producing pseudo-random numbers (PRN) with high performance is an important issue that attracts many researchers today. This paper suggests pseudo-random number generator models that integrate a Hopfield Neural Network (HNN) with a fuzzy logic system to improve the randomness of the Hopfield pseudo-random generator. The fuzzy logic system is introduced to control the update of the HNN parameters. The proposed model is compared with three state-of-the-art baselines; the results analysis using the National Institute of Standards and Technology (NIST) statistical tests and the ENT test shows that the proposed model is statistically significant in comparison to the baselines, which demonstrates the competency of the neuro-fuzzy-based model to produce …
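For reference, the simplest test in the NIST SP 800-22 suite mentioned above is the frequency (monobit) test, which checks whether the proportions of ones and zeros in the output stream are close to 1/2. A minimal sketch follows; the bit stream here comes from Python’s `random` module, standing in for a generator under test.

```python
import math
import random

def monobit_p_value(bits):
    # NIST SP 800-22 frequency (monobit) test: map bits to +/-1, sum,
    # and compute the p-value via the complementary error function.
    # p >= 0.01 is the conventional pass threshold.
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(2.0 * n))

bits = [random.getrandbits(1) for _ in range(10_000)]
print(monobit_p_value(bits) >= 0.01)
```

A heavily biased stream (e.g. all ones) drives the p-value toward zero, while a balanced stream passes with high probability.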
Survivin, a member of the inhibitor of apoptosis (IAP) family, is increasingly used as a target in cancer therapy design because it plays a key role in cell growth and the inhibition of apoptosis. It can also be used as a biomarker for targeting cancer because it is found in almost all cancer cells but not in normal cells. Our strategy was to design (computationally) a molecule to be used as a survivin inhibitor. This molecule was named lead10 and was subsequently used to find (virtually) existing drugs with good survivin inhibition activity.
Abstract: Word sense disambiguation (WSD) is a significant field in computational linguistics, as it is indispensable for many language-understanding applications. Automatic processing of documents is made difficult by the fact that many of the terms they contain are ambiguous. WSD systems try to resolve these ambiguities and find the correct meaning. Genetic algorithms can be applied to this problem since they have been effectively used for many optimization problems. In this paper, a genetic algorithm is proposed to solve the word sense disambiguation problem by automatically selecting the intended meaning of a word in context without any additional resources. The proposed algorithm is evaluated on a collection …
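The abstract does not detail the chromosome encoding or the fitness function; the sketch below is one plausible instantiation, not the paper’s algorithm. A chromosome assigns one sense index per ambiguous word, and fitness is a Lesk-style overlap between each chosen sense’s gloss and the sentence context. The sense inventory, truncation selection, and mutation rate are all invented for illustration.

```python
import random

# Invented miniature sense inventory: word -> list of gloss strings.
SENSES = {
    "bank": ["river slope land water", "money institution deposit loan"],
    "interest": ["money rate paid loan", "curiosity attention topic"],
}
CONTEXT = set("the bank raised the loan interest rate on the deposit".split())
WORDS = list(SENSES)

def fitness(chrom):
    # Lesk-style score: gloss/context word overlap, summed over words.
    return sum(len(CONTEXT & set(SENSES[w][i].split()))
               for w, i in zip(WORDS, chrom))

def run_ga(pop_size=20, generations=30, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(len(SENSES[w])) for w in WORDS]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(WORDS))   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:               # mutation: resample one gene
                g = rng.randrange(len(WORDS))
                child[g] = rng.randrange(len(SENSES[WORDS[g]]))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = run_ga()
print([SENSES[w][i] for w, i in zip(WORDS, best)])
```

On this toy context the financial senses of both “bank” and “interest” maximize the overlap, so the GA converges to them.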
Nowadays, information systems constitute a crucial part of organizations; by losing security, these organizations also lose plenty of competitive advantages. The core of information security (InfoSecu) is risk management. There is a great deal of research and many standards on information security risk management (ISRM), including NIST 800-30 and ISO/IEC 27005. However, only a few research works focus on InfoSecu risk reduction, while the standards explain general principles and guidelines without providing implementation details for ISRM; as such, reducing InfoSecu risks in uncertain environments is painstaking. Thus, this paper applies a genetic algorithm (GA) for InfoSecu risk reduction under uncertainty. Finally, the effectiveness …
With the widespread exchange of private information in various communication applications, securing it has become a top priority. In this research, a new approach to encrypting text messages based on genetic algorithm operators is proposed. The proposed approach follows a new algorithm that generates an 8-bit chromosome to encrypt plain text after randomly selecting a crossover point. The resulting child code is flipped by one bit using the mutation operation. Two simulations were conducted to evaluate the performance of the proposed approach, covering encryption/decryption execution time and throughput. Simulation results demonstrate the robustness of the proposed approach, producing better performance on all evaluation metrics with respect to …
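A minimal invertible toy along the operator-level lines described above (8-bit chromosomes, a random crossover point, a one-bit mutation flip). The details the abstract does not give are our own assumptions here: plaintext bytes are paired and crossed over with each other, and a key seeds the choice of crossover points and mutation bits. This illustrates the operators only and is not the paper’s actual cipher.

```python
import random

def _operator_stream(n_pairs, key):
    # Key-seeded stream of (crossover point, mutation bit) choices,
    # identical for encryption and decryption.
    rng = random.Random(key)
    return [(rng.randrange(1, 8), rng.randrange(8)) for _ in range(n_pairs)]

def ga_encrypt(data: bytes, key: int) -> bytes:
    out = bytearray(data)
    pairs = range(0, len(out) - 1, 2)
    for i, (cut, bit) in zip(pairs, _operator_stream(len(out) // 2, key)):
        mask = (1 << cut) - 1                 # low `cut` bits of the chromosome
        a, b = out[i], out[i + 1]
        out[i] = (a & ~mask) | (b & mask)     # one-point crossover:
        out[i + 1] = (b & ~mask) | (a & mask) # swap the low bits of the pair
        out[i] ^= 1 << bit                    # mutation: flip one chosen bit
    return bytes(out)

def ga_decrypt(data: bytes, key: int) -> bytes:
    out = bytearray(data)
    pairs = range(0, len(out) - 1, 2)
    for i, (cut, bit) in zip(pairs, _operator_stream(len(out) // 2, key)):
        out[i] ^= 1 << bit                    # undo the mutation first
        mask = (1 << cut) - 1
        a, b = out[i], out[i + 1]
        out[i] = (a & ~mask) | (b & mask)     # the crossover swap is its
        out[i + 1] = (b & ~mask) | (a & mask) # own inverse
    return bytes(out)

msg = b"attack at dawn"
assert ga_decrypt(ga_encrypt(msg, key=1234), key=1234) == msg
```

Both operators are involutions given the key, so decryption simply replays them per pair in reverse order (mutation first, then crossover).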