Data-Driven Requirements Engineering (DDRE) represents a vision for a shift from static, traditional methods of requirements engineering to dynamic, data-driven, user-centered methods. Given the volume of available data and the increasingly complex requirements of software systems whose functions must adapt to changing needs to gain users' trust, such an approach is needed within a continuous software engineering process. This need drives new challenges in the requirements engineering discipline. The problem addressed in this study was data discrepancies in the elicitation method, which hampered the needs elicitation process and ultimately led to software that did not meet stakeholder needs or organizational goals. The research objective was to examine the process of collecting and integrating data from multiple sources while ensuring interoperability. The study concludes that the clustering algorithm used to support data collection and elicitation has a somewhat greater impact on the ratings professionals give to requirement pairs belonging to the same cluster, whereas the influence of POS tagging on professionals' ratings is relatively consistent across pairs within the same cluster and pairs in different clusters.
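The abstract does not specify the clustering algorithm or the tooling behind the elicitation pipeline; the following is a minimal sketch, assuming TF-IDF features and k-means, of how textual requirements could be grouped into clusters before professionals rate requirement pairs (the POS-tagging step is omitted here).

```python
# Minimal sketch: cluster textual requirements before pairwise comparison.
# The example requirements, the TF-IDF features, and the k-means algorithm
# are illustrative assumptions; the study's own clustering algorithm and
# its POS-tagging step are not specified in the abstract.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

requirements = [
    "The system shall export monthly sales reports as PDF.",
    "Users must be able to reset their password via email.",
    "The system shall generate quarterly revenue reports.",
    "Login must support two-factor authentication.",
]

# Represent each requirement as a TF-IDF vector and group similar ones,
# so that raters can be shown pairs from the same or from different clusters.
vectors = TfidfVectorizer(stop_words="english").fit_transform(requirements)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for text, label in sorted(zip(requirements, labels), key=lambda p: p[1]):
    print(f"cluster {label}: {text}")
```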
Abstract This research deals with the definition of the concept of creedal purposes and what relates to it, including its aim and importance, and with its importance in trying to make the creedal truths understandable to different minds, especially among those who object to introducing belief into purposes-based studies. The research comprises two requirements. The first requirement concerns the concept and aim of creedal purposes and consists of two branches: the first addresses the concept of creedal purposes, covering its definitions in terms of language and terminology and the definition we consider best suited to its aim.
Abstract
The research aims to diagnose the reality of applying the eighth requirement (Operation) of the business continuity management system according to the international standard ISO 22301:2019 at the General Tax Authority. This requirement concerns planning, implementing, and controlling the specific processes and procedures needed to address risks and opportunities. The research adopted the checklist of the ISO 22301:2019 standard to obtain information and to measure the extent of application and documentation, relying on percentages and the weighted arithmetic mean, and it reached a set of results.
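The abstract relies on the weighted arithmetic mean without stating the weighting scheme; as a reminder of the statistic it names, for checklist item scores \(x_i\) with weights \(w_i\) it is computed as

```latex
\bar{x}_w = \frac{\sum_{i=1}^{n} w_i x_i}{\sum_{i=1}^{n} w_i}
```

where the weights would typically reflect each checklist item's relative importance or response scale; the specific weights used by the research are not given in the abstract.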
The availability of different processing levels for satellite images makes it important to measure their suitability for classification tasks. This study investigates the impact of the Landsat data processing level on the accuracy of land cover classification using a support vector machine (SVM) classifier. The classification accuracy values of Landsat 8 (LS8) and Landsat 9 (LS9) data at different processing levels vary notably. For LS9, Collection 2 Level 2 (C2L2) achieved the highest accuracy (86.55%) with the polynomial kernel of the SVM classifier, surpassing the Fast Line-of-Sight Atmospheric Analysis of Spectral Hypercubes (FLAASH) at 85.31% and Collection 2 Level 1 (C2L1) at 84.93%. The LS8 data exhibits similar behavior.
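The abstract does not name the software used for classification; the sketch below, assuming scikit-learn and synthetic pixel spectra in place of the actual Landsat bands, shows how a polynomial-kernel SVM classifier of this kind is typically fit and scored.

```python
# Minimal sketch of polynomial-kernel SVM land-cover classification.
# The random "band" features and labels stand in for real Landsat pixel
# spectra and reference land-cover classes; scikit-learn is an assumption.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.random((1000, 7))           # 7 surface-reflectance bands per pixel
y = rng.integers(0, 4, size=1000)   # 4 land-cover classes

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Polynomial kernel, as named in the abstract; degree and C are illustrative.
clf = SVC(kernel="poly", degree=3, C=1.0, gamma="scale")
clf.fit(X_train, y_train)

print("overall accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```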
In this research, the Haar wavelet method has been utilized to approximate a numerical solution for linear state space systems. The solution technique uses Haar wavelet functions and the Haar wavelet operational matrix to transform the state space system into a system of linear algebraic equations, which can be solved in MATLAB over an interval from 0 to . The exactness of the state variables can be enhanced by increasing the Haar wavelet resolution. The method has been applied to different examples, and the simulation results have been illustrated graphically and compared with the exact solution.
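The abstract names Haar wavelet functions and the Haar operational matrix without showing them; below is a minimal sketch, in Python rather than the MATLAB used by the study, that constructs the Haar matrix at the usual collocation points and verifies its orthogonality. The operational matrix of integration built from this basis is what reduces the state space equations to a linear algebraic system.

```python
# Minimal sketch of the Haar wavelet machinery named in the abstract:
# build the m x m Haar matrix at collocation points and check its
# orthogonality. The resolution m = 2**J and the use of NumPy are
# assumptions for illustration.
import numpy as np

def haar_matrix(m):
    """Values of the first m (normalized) Haar functions at the
    collocation points t_l = (l + 0.5) / m on [0, 1)."""
    t = (np.arange(m) + 0.5) / m
    H = np.zeros((m, m))
    H[0, :] = 1.0                      # scaling function h_1(t) = 1
    i = 1
    J = int(np.log2(m))
    for j in range(J):
        for k in range(2 ** j):
            lo, mid, hi = k / 2**j, (k + 0.5) / 2**j, (k + 1) / 2**j
            H[i, (t >= lo) & (t < mid)] = 2 ** (j / 2)
            H[i, (t >= mid) & (t < hi)] = -(2 ** (j / 2))
            i += 1
    return H

m = 8
H = haar_matrix(m)
# Orthogonality of the discretized basis: H @ H.T == m * I
print(np.allclose(H @ H.T, m * np.eye(m)))
```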
The bile salt hydrolase gene (bshA), encoding the bile salt hydrolase enzyme (EC 3.5.1.24) of the probiotic isolate Lactobacillus acidophilus Ar strain, which is responsible for cholesterol assimilation, was studied in the present work. A DNA fragment of about 801 bp in length from the Lb. acidophilus Ar strain was amplified by PCR. Two restriction sites (PstI/SacI) were added to the ends of the fragment to allow manipulation of the DNA during cloning. The amplified fragment was inserted into the pJET1.2/blunt vector and the pMG36e vector, respectively. pJET1.2/blunt is an overexpression plasmid for E. coli MC1022, and pMG36e is a shuttle vector able to replicate in both E. coli and lactic acid bacteria. The resulting constructs were named pJ
Heuristic approaches are traditionally applied to find the optimal size and optimal location of Flexible AC Transmission Systems (FACTS) devices in power systems. The Genetic Algorithm (GA) technique has been applied to solve power engineering optimization problems, giving better results than classical methods. This paper shows the application of GA for optimal sizing and allocation of a Static Compensator (STATCOM) in a power system. STATCOM devices are used to increase transmission system capacity and enhance voltage stability by regulating the voltage at their terminals through control of the amount of reactive power injected into or absorbed from the power system. The IEEE 5-bus standard system is used as an example to illustrate the technique.
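The abstract does not describe the GA encoding or fitness function; the sketch below assumes a two-gene individual (candidate bus, MVAr size) and a placeholder fitness, purely to illustrate the selection/crossover/mutation loop that a GA-based STATCOM placement would run around a load-flow evaluation.

```python
# Minimal sketch of a genetic algorithm searching jointly for a STATCOM
# bus location and reactive-power size. The fitness function here is a
# placeholder (it is NOT a power-flow calculation); in the paper the
# fitness would come from running load flow on the IEEE 5-bus system.
import random

N_BUSES, Q_MAX = 5, 100.0          # candidate buses, max size in MVAr (assumed)

def fitness(individual):
    bus, q = individual
    # Hypothetical stand-in objective: penalize large sizes, prefer bus 3.
    return abs(bus - 3) + 0.01 * q

def random_individual():
    return (random.randint(1, N_BUSES), random.uniform(0.0, Q_MAX))

def crossover(a, b):
    return (a[0], b[1])            # take bus from one parent, size from the other

def mutate(ind):
    bus, q = ind
    if random.random() < 0.2:
        bus = random.randint(1, N_BUSES)
    q = min(Q_MAX, max(0.0, q + random.gauss(0, 5.0)))
    return (bus, q)

population = [random_individual() for _ in range(30)]
for _ in range(50):                # generations
    population.sort(key=fitness)
    parents = population[:10]      # elitist selection of the fittest
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]
    population = parents + children

print("best (bus, MVAr):", min(population, key=fitness))
```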
Artificial intelligence algorithms have been used in recent years in many scientific fields. We suggest employing the flower pollination algorithm in the environmental field to find the best estimate of the semi-parametric regression function with measurement errors in the explanatory variables and the dependent variable; measurement errors, rather than exact measurements, appear frequently in fields such as chemistry, the biological sciences, medicine, and epidemiological studies. We estimate the regression function of the semi-parametric model by estimating its parametric and non-parametric components; the parametric component is estimated using instrumental variables methods (Wald's method, Bartlett's method, and Durbin's method).
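As an illustration of one of the instrumental-variable estimators named above, the sketch below implements Wald's grouping method for a simple errors-in-variables linear model; the simulated data and the two-group split at the median are assumptions for the example, not the study's data or model.

```python
# Minimal sketch of Wald's grouping estimator for a linear model whose
# regressor is observed with measurement error. The true coefficients
# and noise levels are made up for the illustration.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x_true = rng.normal(0.0, 1.0, n)
y = 2.0 + 1.5 * x_true + rng.normal(0.0, 0.5, n)   # true line: 2 + 1.5 x
x_obs = x_true + rng.normal(0.0, 0.5, n)           # regressor measured with error

# Wald's method: split observations into two groups (here at the median
# of the observed regressor) and form the slope from the group means.
high = x_obs > np.median(x_obs)
beta = (y[high].mean() - y[~high].mean()) / (x_obs[high].mean() - x_obs[~high].mean())
alpha = y.mean() - beta * x_obs.mean()

# Naive least squares on x_obs is attenuated toward zero; Wald's estimate
# is less biased when the grouping is (approximately) independent of the error.
naive = np.polyfit(x_obs, y, 1)[0]
print(f"Wald slope: {beta:.2f}, naive OLS slope: {naive:.2f}, true slope: 1.50")
```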
Image segmentation using bi-level thresholds works well for straightforward scenarios; however, dealing with complex images that contain multiple objects or colors presents considerable computational difficulties. Multi-level thresholding is crucial for these situations, but it also introduces a challenging optimization problem. This paper presents an improved Reptile Search Algorithm (RSA) that includes a Gbest operator to enhance its performance. The proposed method determines optimal threshold values for both grayscale and color images, utilizing entropy-based objective functions derived from the Otsu and Kapur techniques. Experiments were carried out on 16 benchmark images.
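The abstract names Kapur's entropy as one of the objective functions; the sketch below, using a synthetic image and randomly sampled candidate thresholds in place of the improved RSA search, shows how such a multi-level thresholding objective is evaluated and maximized.

```python
# Minimal sketch of the Kapur entropy objective used to score a candidate
# set of thresholds in multi-level thresholding. The random "image" and
# the blind random sampling stand in for the improved RSA optimizer
# described in the paper.
import numpy as np

def kapur_entropy(hist, thresholds):
    """Sum of class entropies for thresholds t1 < t2 < ... splitting 0..255."""
    p = hist / hist.sum()
    edges = [0] + sorted(thresholds) + [256]
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()
        if w <= 0:
            continue
        q = p[lo:hi] / w
        q = q[q > 0]
        total += -(q * np.log(q)).sum()
    return total

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(128, 128))
hist = np.bincount(image.ravel(), minlength=256)

# Score a few random 3-threshold candidates; an optimizer such as RSA
# would search this space instead of sampling blindly.
candidates = [sorted(rng.choice(np.arange(1, 256), size=3, replace=False))
              for _ in range(200)]
best = max(candidates, key=lambda t: kapur_entropy(hist, t))
print("best thresholds:", best, "entropy:", round(kapur_entropy(hist, best), 3))
```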
Many approaches of differing complexity already exist for edge detection in color images. Nevertheless, the question remains of how different the results are when computationally costly techniques are employed instead of simple ones. This paper presents a comparative study of two approaches to color edge detection intended to reduce noise in the image: one based on the Sobel operator and one based on the Laplace operator. Furthermore, an efficient algorithm for implementing the two operators is presented. The operators have been applied to real images, and the results are reported in this paper. It is shown that the quality of the results increases when the second-derivative operator (the Laplace operator) is used, and that noise is reduced well.
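As a concrete illustration of the two operators compared, the sketch below applies Sobel and Laplacian filters per color channel to a synthetic image; SciPy's ndimage filters and the max-over-channels combination are assumptions for the example, not the paper's algorithm.

```python
# Minimal sketch comparing first-derivative (Sobel) and second-derivative
# (Laplace) edge responses on a color image. A synthetic RGB image with a
# bright square stands in for the real test images used in the paper.
import numpy as np
from scipy import ndimage

# Synthetic 64x64 RGB image: dark background with a bright square.
image = np.zeros((64, 64, 3), dtype=float)
image[16:48, 16:48, :] = 1.0

def sobel_edges(channel):
    gx = ndimage.sobel(channel, axis=1)
    gy = ndimage.sobel(channel, axis=0)
    return np.hypot(gx, gy)          # gradient magnitude

# Apply both operators per channel and combine by taking the maximum response.
sobel_map = np.max([sobel_edges(image[..., c]) for c in range(3)], axis=0)
laplace_map = np.max([np.abs(ndimage.laplace(image[..., c])) for c in range(3)], axis=0)

print("Sobel edge pixels:  ", int((sobel_map > 0).sum()))
print("Laplace edge pixels:", int((laplace_map > 0).sum()))
```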