Data-Driven Requirements Engineering (DDRE) represents a shift from static, traditional requirements engineering methods toward dynamic, data-driven, user-centered ones. Given the volume of available data and the increasingly complex requirements of software systems whose functions must adapt to changing needs in order to gain users' trust, an approach embedded in a continuous software engineering process is needed. This need drives new challenges in the requirements engineering discipline. The problem addressed in this study is that discrepancies in the data hampered the elicitation process, so that the resulting software exhibited mismatches and could not meet stakeholder needs or organizational goals. The objective of this research is to collect and integrate data from multiple sources while ensuring interoperability. The study concludes that the clustering algorithm used to support data collection and elicitation has a somewhat greater impact on the ratings professionals give to requirement pairs belonging to the same cluster, whereas the influence of POS tagging on professionals' ratings is relatively consistent across pairs within the same cluster and pairs in different clusters.
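By way of illustration only (the paper's actual pipeline and data are not shown), clustering of textual requirements combined with POS tagging could be sketched as follows; the corpus, cluster count, and noun/verb filter are assumptions:

```python
# Minimal sketch, not the authors' pipeline: cluster user-feedback
# requirements with TF-IDF + k-means, after keeping only POS-tagged
# nouns and verbs. Corpus and k are illustrative assumptions.
import nltk
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

feedback = [
    "The app should export reports as PDF",
    "Exported reports must include charts",
    "Login fails when the password contains symbols",
]

def keep_content_words(text):
    # POS-tag and keep nouns/verbs only (assumed preprocessing choice).
    tagged = nltk.pos_tag(nltk.word_tokenize(text))
    return " ".join(w for w, t in tagged if t.startswith(("NN", "VB")))

docs = [keep_content_words(t) for t in feedback]
X = TfidfVectorizer().fit_transform(docs)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(list(zip(feedback, labels)))
```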
In the current article, a simple and selective method is proposed for the spectrophotometric estimation of metoclopramide (MCP) in pharmaceutical preparations using a cloud point extraction (CPE) procedure. The method involves the reaction of MCP with 1-naphthol under alkaline conditions in the presence of Triton X-114 to form a stable dark purple dye. After optimization, Beer's law was obeyed in the range 0.34–9 μg mL-1 of MCP with r = 0.9959 (n = 3). The relative standard deviation (RSD) and percentage recoveries were 0.89% and 96.99–104.11%, respectively. Moreover, the use of surfactant cloud point extraction to extract MCP enhanced the molar extinction coefficient (ε) to 1.7333×10^5 L/mol·cm in the surfactant-rich phase. The small volume of organi
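As a quick illustration of how the reported molar absorptivity translates into measurable concentrations via the Beer-Lambert law (the absorbance value and 1 cm path length below are assumptions, not data from the paper):

```python
# Back-of-envelope check, not from the paper: Beer-Lambert law A = ε·b·c
# using the reported molar absorptivity in the surfactant-rich phase.
EPSILON = 1.7333e5   # L/mol·cm (value reported in the abstract)
PATH_CM = 1.0        # assumed cuvette path length
absorbance = 0.52    # hypothetical measured absorbance

conc_mol_per_L = absorbance / (EPSILON * PATH_CM)
print(f"c = {conc_mol_per_L:.3e} mol/L")   # ~3.0e-06 mol/L for this example
```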
In this work, a modified Lyapunov-Schmidt reduction is used to find a nonlinear Ritz approximation of the Fredholm functional defined by the nonhomogeneous Camassa-Holm and Benjamin-Bona-Mahony equations. We introduce the modified Lyapunov-Schmidt reduction for nonhomogeneous problems when the dimension of the null space is equal to two. The nonlinear Ritz approximation for the nonhomogeneous Camassa-Holm equation is found as a function of codimension twenty-four.
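For reference, the standard homogeneous forms of the two equations are recalled below; the nonhomogeneous variants studied in the paper add a forcing term, and the exact normalization used by the authors may differ:

```latex
% Standard forms; the paper treats nonhomogeneous versions with a
% forcing term f(x,t) on the right-hand side.
\begin{align*}
  &\text{Camassa-Holm:} && u_t - u_{xxt} + 3\,u\,u_x = 2\,u_x u_{xx} + u\,u_{xxx},\\
  &\text{Benjamin-Bona-Mahony:} && u_t + u_x + u\,u_x - u_{xxt} = 0.
\end{align*}
```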
Prediction is one of the most important purposes and uses of test results in the educational sector, because the quality of tests is tied to their ability to predict learners' future behavior and to the accuracy of the educational and administrative decisions taken in light of their results. Accordingly, the study aimed to reveal the predictive ability of the university Grade Point Average (GPA) for the score on the specialized test for the position of teacher in the Ministry of Education in the Sultanate of Oman. It further aimed to investigate differences in predictive ability according to specialization and academic year, using the descriptive approach. The sample of the study consisted of (349) male and female students enro
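As a rough illustration of what "predictive ability" means operationally, a simple linear regression of test score on GPA could be computed as below; the numbers are entirely hypothetical and do not reproduce the paper's data or statistics:

```python
# Illustrative sketch only (hypothetical data): quantifying the predictive
# ability of GPA for a test score with simple linear regression.
import numpy as np

gpa   = np.array([2.8, 3.1, 3.4, 3.6, 3.9])   # hypothetical GPAs
score = np.array([61., 66., 70., 74., 80.])   # hypothetical test scores

slope, intercept = np.polyfit(gpa, score, 1)
r = np.corrcoef(gpa, score)[0, 1]
print(f"score ≈ {slope:.1f}·GPA + {intercept:.1f},  r² = {r**2:.2f}")
```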
Geophysical data interpretation is crucial in characterizing subsurface structure. Analysis of the Bouguer gravity map of the W-NW region of Iraq serves as the basis for the current geophysical research. The Bouguer gravity data were processed using the Power Spectrum Analysis (PSA) method. Four depth slices were obtained from the PSA process, at 390 m, 1300 m, 3040 m, and 12600 m depth. The gravity anomaly depth maps show that shallow-depth anomalies are mainly related to the sedimentary cover layers and structures, while the gravity anomaly of the deeper 12600 m depth slice is more representative of the basement rocks and mantle uplift. The 2D modeling technique was used for
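A plausible sketch of this kind of power-spectrum depth estimation (the authors' exact PSA implementation is not specified here) is a radially averaged spectrum of the gridded Bouguer anomaly, with mean source depth taken from the slope of ln(power) versus wavenumber:

```python
# Assumed workflow, not the authors' exact processing: radially averaged
# power spectrum of a gridded Bouguer anomaly; mean source depth from the
# slope of ln(power) vs. wavenumber, depth ≈ -slope / (4π) for k in cycles/m.
import numpy as np

def radial_power_spectrum(grid, dx):
    ny, nx = grid.shape
    P = np.abs(np.fft.fftshift(np.fft.fft2(grid)))**2
    ky = np.fft.fftshift(np.fft.fftfreq(ny, d=dx))
    kx = np.fft.fftshift(np.fft.fftfreq(nx, d=dx))
    KX, KY = np.meshgrid(kx, ky)
    k = np.hypot(KX, KY)                      # radial wavenumber (cycles/m)
    bins = np.linspace(0, k.max(), 40)
    idx = np.digitize(k.ravel(), bins)
    k_mean, p_mean = [], []
    for i in range(1, len(bins)):
        sel = idx == i
        if sel.any():
            k_mean.append(k.ravel()[sel].mean())
            p_mean.append(P.ravel()[sel].mean())
    return np.array(k_mean), np.array(p_mean)

def depth_from_segment(k, p, k_lo, k_hi):
    # Fit a straight line to ln(power) over one wavenumber band (one "slice").
    m = (k >= k_lo) & (k <= k_hi)
    slope, _ = np.polyfit(k[m], np.log(p[m]), 1)
    return -slope / (4 * np.pi)               # depth in metres
```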
Future wireless networks will require advanced physical-layer techniques to meet the requirements of Internet of Everything (IoE) applications and massive communication systems. To this end, the massive MIMO (m-MIMO) system is to date considered one of the key technologies for future wireless networks, owing to its capability to bring significant improvements in spectral efficiency and energy efficiency. However, designing an efficient downlink (DL) training sequence for fast channel state information (CSI) estimation, i.e., with limited coherence time, in a frequency division duplex (FDD) m-MIMO system when users exhibit different correlation patterns, i.e., span distinct channel covariance matrices, is to date ve
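For orientation, the hedged sketch below shows linear MMSE estimation of a correlated downlink channel from a short pilot block; the dimensions, covariance model, and pilot construction are assumptions and not the training-sequence design proposed in the paper:

```python
# Illustrative sketch, not the paper's design: LMMSE estimation of a
# correlated m-MIMO downlink channel from T < M pilot symbols.
import numpy as np

rng = np.random.default_rng(0)
M, T, snr = 64, 16, 10.0          # BS antennas, pilot length, linear SNR

# Assumed exponential-correlation channel covariance R (M x M).
rho = 0.9
R = rho ** np.abs(np.subtract.outer(np.arange(M), np.arange(M)))

# One channel realization h ~ CN(0, R).
L = np.linalg.cholesky(R + 1e-9 * np.eye(M))
h = L @ (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)

# Generic pilot matrix S (T x M) with orthonormal rows; received y = S h + n.
A = rng.standard_normal((M, T)) + 1j * rng.standard_normal((M, T))
S = np.linalg.qr(A)[0].conj().T
sigma2 = 1.0 / snr
n = (rng.standard_normal(T) + 1j * rng.standard_normal(T)) * np.sqrt(sigma2 / 2)
y = S @ h + n

# LMMSE estimate: h_hat = R S^H (S R S^H + sigma^2 I)^{-1} y
h_hat = R @ S.conj().T @ np.linalg.solve(S @ R @ S.conj().T + sigma2 * np.eye(T), y)
print("NMSE:", np.linalg.norm(h - h_hat)**2 / np.linalg.norm(h)**2)
```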
Evolutionary algorithms outperform heuristic algorithms at finding protein complexes in protein-protein interaction networks (PPINs). Many of these algorithms depend on standard frameworks that are based on topology, and many have been examined exclusively on networks containing only reliable interaction data. The main objective of this paper is to extend the design of the canonical, topology-based evolutionary algorithms suggested in the literature to cope with noisy PPINs. The design of the evolutionary algorithm is extended based on the functional domain of the proteins rather than on the topological domain of the PPIN. The gene ontology annotation in each molecular function, biological proce
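A schematic sketch of the idea, not the authors' algorithm, is an evolutionary search over protein-to-complex assignments whose fitness rewards shared GO annotations rather than edge topology; all annotations and parameters below are toy placeholders:

```python
# Schematic sketch only (assumed design): evolve protein-to-complex labels,
# scoring each candidate complex by GO-annotation overlap instead of topology.
import random

proteins = ["P1", "P2", "P3", "P4", "P5", "P6"]
go = {  # toy GO-term annotations per protein (hypothetical)
    "P1": {"GO:0003677", "GO:0006355"}, "P2": {"GO:0003677"},
    "P3": {"GO:0005524", "GO:0016301"}, "P4": {"GO:0005524"},
    "P5": {"GO:0006355"},               "P6": {"GO:0016301"},
}

def fitness(assignment, k=2):
    # Mean Jaccard similarity of GO sets within each candidate complex.
    total, count = 0.0, 0
    for c in range(k):
        members = [p for p, lab in zip(proteins, assignment) if lab == c]
        for i in range(len(members)):
            for j in range(i + 1, len(members)):
                a, b = go[members[i]], go[members[j]]
                total += len(a & b) / len(a | b)
                count += 1
    return total / count if count else 0.0

def evolve(pop_size=30, gens=50, k=2, mut=0.2):
    pop = [[random.randrange(k) for _ in proteins] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = [[g if random.random() > mut else random.randrange(k) for g in p]
                    for p in parents]
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())
```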