Over the last two decades, networks have changed rapidly in response to evolving requirements. Current Data Center Networks (DCNs) contain large numbers of hosts (tens of thousands) with demanding bandwidth needs as cloud and multimedia content computing grow. Conventional DCNs are strained by the increasing number of users and their bandwidth requirements, which in turn impose many implementation limitations. In current networking devices, the coupling of the control and forwarding planes results in architectures that are not suited to dynamic computing and storage needs. Software Defined Networking (SDN) was introduced to change this notion of traditional networks by decoupling the control and forwarding planes. Moreover, with the rapid increase in the number of applications, websites, and storage demands, some network resources are underutilized because of static routing mechanisms. To overcome these limitations, an SDN-based, OpenFlow Data Center network architecture is used to obtain better performance parameters and to implement a traffic load-balancing function. Load balancing distributes traffic requests over the connected servers to diminish network congestion and reduce server underutilization. As a result, SDN provides more effective configuration, enhanced performance, and greater flexibility for dealing with large network designs.
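As an illustration of the load-balancing function described above, here is a minimal Python sketch of a round-robin policy that an SDN controller application might apply when assigning new flows to replica servers; the class name and server addresses are hypothetical and not taken from the paper.

```python
from itertools import cycle

class RoundRobinBalancer:
    """Toy round-robin policy: assign each new flow to the next server in turn."""

    def __init__(self, servers):
        self._servers = cycle(servers)

    def pick_server(self, flow_id):
        # In a real controller this decision would be installed as an OpenFlow rule.
        server = next(self._servers)
        print(f"flow {flow_id} -> {server}")
        return server

# Hypothetical pool of replica servers behind one virtual service address.
balancer = RoundRobinBalancer(["10.0.0.1", "10.0.0.2", "10.0.0.3"])
for flow in range(6):
    balancer.pick_server(flow)
```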
Background: The problem of the difficult gallbladder is not clearly defined and is associated with a real lack of therapeutic approaches that decrease morbidity. Moreover, the difficult gallbladder has been reported as a contributing risk factor for biliary injury owing to the increased difficulty of surgical dissection within Calot’s triangle. The aim of this study is to determine the surgical outcomes of open fundus-first cholecystectomy in lowering the rate of lethal intraoperative risks.
Subjects and Methods: Our prospective study was conducted between January 2019 and December 2022 at Ibn Sina specialized hospital, Khartoum, Sudan, on two hundred and fifty-three patients who underw
Prediction of daily rainfall is important for flood forecasting, reservoir operation, and many other hydrological applications. Artificial intelligence (AI) algorithms are generally used for stochastic rainfall forecasting but are not capable of simulating unseen extreme rainfall events, which are becoming common due to climate change. A new model is developed in this study to predict daily rainfall at different lead times based on sea level pressure (SLP), which is physically related to rainfall on land and is therefore able to predict unseen rainfall events. Daily rainfall on the east coast of Peninsular Malaysia (PM) was predicted using SLP data over the climate domain. Five advanced AI algorithms, such as extreme learning machine (ELM), Bay
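For readers unfamiliar with the extreme learning machine named above, the following is a generic ELM regression sketch in Python on synthetic data; it assumes the standard formulation (random hidden weights, least-squares output layer) and stands in for, rather than reproduces, the paper's SLP-based model.

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, seed=0):
    """Fit an extreme learning machine: random hidden layer, least-squares output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input-to-hidden weights
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                  # output weights via pseudo-inverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Synthetic stand-in for SLP predictors and a daily rainfall target.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=500)
W, b, beta = elm_fit(X[:400], y[:400])
pred = elm_predict(X[400:], W, b, beta)
print("test RMSE:", np.sqrt(np.mean((pred - y[400:]) ** 2)))
```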
Multiple elimination (de-multiple) is one of the seismic processing steps used to remove the effects of multiples and delineate the correct primary reflectors. Applying normal moveout to flatten the primaries is the way to eliminate multiples after transforming the data to the frequency-wavenumber domain. The flattened primaries align with the zero axis of the frequency-wavenumber domain, while all other reflection types (multiples and random noise) are distributed elsewhere. A dip filter applied to pass the aligned data and reject the others separates primaries from multiples once the data are transformed back from the frequency-wavenumber domain to the time-distance domain. Hence, a suggested name for this technique is normal moveout-frequency-wavenumber domain
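A minimal sketch of the f-k dip-filtering idea described above, assuming an NMO-corrected gather so that flattened primaries map to near-zero wavenumber; the array sizes and keep fraction are illustrative and not the paper's parameters.

```python
import numpy as np

def fk_dip_filter(gather, keep_fraction=0.1):
    """Keep only near-zero-wavenumber energy in the f-k domain.

    `gather` is a 2-D array (time samples x traces) assumed to be NMO-corrected,
    so flattened primaries sit near k = 0 while residual-moveout multiples
    spread to larger |k|.
    """
    spec = np.fft.fft2(gather)                        # time-offset -> f-k domain
    k = np.fft.fftfreq(gather.shape[1])               # normalized wavenumber axis
    mask = np.abs(k) <= keep_fraction * np.max(np.abs(k))
    spec *= mask[np.newaxis, :]                       # reject dipping (multiple) energy
    return np.real(np.fft.ifft2(spec))                # back to the time-distance domain

# Synthetic gather: a flat event (primary) plus a dipping event (residual multiple).
nt, nx = 256, 64
gather = np.zeros((nt, nx))
gather[100, :] = 1.0                                   # flat primary
for ix in range(nx):
    gather[150 + ix // 2, ix] += 1.0                   # dipping multiple
filtered = fk_dip_filter(gather)
```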
This study sought to investigate the impacts of big data, artificial intelligence (AI), and business intelligence (BI) on firms' e-learning and business performance in the Jordanian telecommunications industry. After the samples were checked, a total of 269 were collected. All of the information gathered throughout the investigation was analyzed using PLS software. The results show that a network of interconnections can improve both e-learning and corporate effectiveness. This research concluded that the integration of big data, AI, and BI has a positive impact on e-learning infrastructure development and organizational efficiency. The findings indicate that big data has a positive and direct impact on business performance, including Big
Background: Appreciation of the crucial role of risk factors in the development of coronary artery disease (CAD) is one of the most significant advances in the understanding of this important disease. Extensive epidemiological research has established cigarette smoking, diabetes, hyperlipidemia, and hypertension as independent risk factors for CAD. Objective: To determine the prevalence of the four conventional risk factors (cigarette smoking, diabetes, hyperlipidemia, and hypertension) among patients with CAD and to determine the correlation of the Thrombolysis in Myocardial Infarction (TIMI) risk score with the extent of coronary artery disease in patients with unstable angina/non-ST-elevation myocardial infarction (UA/NSTEMI). Methods: We
The objective of this study is to demonstrate which of the logistic regression model and the linear discriminant function has the better predictive ability, using the original data first and then principal components to reduce the dimensionality of the variables. The data come from the socio-economic household survey of Baghdad province in 2012 and comprise a sample of 615 observations with 13 variables, 12 of which are explanatory variables, while the dependent variable is the number of workers and the unemployed.
A comparison of the two methods above was conducted, and it became clear from the comparison that the logistic regression model is better than the linear discriminant function.
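A sketch of such a comparison using scikit-learn on synthetic data; it assumes the dimension reduction referred to is principal components, and the sample and variable counts merely mirror those quoted above rather than the survey data itself.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for the 615-observation, 12-predictor survey data.
X, y = make_classification(n_samples=615, n_features=12, n_informative=6, random_state=0)

models = {
    "logistic (raw data)": LogisticRegression(max_iter=1000),
    "LDA (raw data)": LinearDiscriminantAnalysis(),
    "logistic (PCA)": make_pipeline(PCA(n_components=5), LogisticRegression(max_iter=1000)),
    "LDA (PCA)": make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis()),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()   # mean classification accuracy
    print(f"{name}: {acc:.3f}")
```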
In this study, we focus on random coefficient estimation for the general regression and Swamy models of panel data. Using this type of data offers a better chance of obtaining a better method and better indicators. Entropy methods have been used to estimate the random coefficients of the general regression and Swamy panel-data models in two ways: the first is maximum dual entropy and the second is general maximum entropy, and a comparison between them has been carried out using simulation to choose the optimal method.
The results have been compared using mean squared error and mean absolute percentage error for different cases in terms of correlation valu
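For concreteness, a small Python sketch of the two comparison criteria (mean squared error and mean absolute percentage error) applied to illustrative coefficient estimates; the numbers are made up and do not come from the study's simulations.

```python
import numpy as np

def mse(actual, estimated):
    """Mean squared error between true and estimated coefficients."""
    actual, estimated = np.asarray(actual), np.asarray(estimated)
    return np.mean((actual - estimated) ** 2)

def mape(actual, estimated):
    """Mean absolute percentage error, in percent."""
    actual, estimated = np.asarray(actual), np.asarray(estimated)
    return 100 * np.mean(np.abs((actual - estimated) / actual))

# Illustrative true coefficients and two competing estimates.
beta_true = np.array([1.5, -0.8, 2.0])
beta_a = np.array([1.4, -0.9, 2.2])   # e.g. one entropy-based estimator
beta_b = np.array([1.7, -0.5, 1.6])   # e.g. the competing estimator
for name, est in [("estimator A", beta_a), ("estimator B", beta_b)]:
    print(f"{name}: MSE = {mse(beta_true, est):.4f}, MAPE = {mape(beta_true, est):.2f}%")
```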
This study investigates the impact of spatial resolution enhancement on supervised classification accuracy using Landsat 9 satellite imagery, achieved through pan-sharpening techniques leveraging Sentinel-2 data. Various methods were employed to synthesize a panchromatic (PAN) band from Sentinel-2 data, including dimension reduction algorithms and weighted averages based on correlation coefficients and standard deviation. Three pan-sharpening algorithms (Gram-Schmidt, Principal Components Analysis, Nearest Neighbour Diffusion) were employed, and their efficacy was assessed using seven fidelity criteria. Classification tasks were performed utilizing Support Vector Machine and Maximum Likelihood algorithms. Results reveal that specifi
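As one example of the pan-sharpening algorithms listed above, here is a minimal PCA-based component-substitution sketch in Python; the synthetic bands and the simple mean/std matching step are illustrative assumptions, not the study's actual workflow or data.

```python
import numpy as np
from sklearn.decomposition import PCA

def pca_pansharpen(ms_bands, pan):
    """PCA pan-sharpening sketch: replace the first principal component
    of the (resampled) multispectral bands with the panchromatic band.

    ms_bands: array (n_pixels, n_bands) already resampled to the PAN grid.
    pan:      array (n_pixels,) panchromatic band.
    """
    pca = PCA(n_components=ms_bands.shape[1])
    pcs = pca.fit_transform(ms_bands)
    # Match the PAN band to the mean/std of PC1 before substitution.
    pan_matched = (pan - pan.mean()) / pan.std() * pcs[:, 0].std() + pcs[:, 0].mean()
    pcs[:, 0] = pan_matched
    return pca.inverse_transform(pcs)           # back to band space, sharpened

# Tiny synthetic example: 4 multispectral bands and a correlated PAN band.
rng = np.random.default_rng(0)
ms = rng.normal(size=(1000, 4))
pan = ms.mean(axis=1) + 0.05 * rng.normal(size=1000)
sharpened = pca_pansharpen(ms, pan)
print(sharpened.shape)
```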
Abstract:
The current business environment is witnessing rapid changes that are reflected in the performance of enterprises wishing to survive; a reactive style is no longer sufficient for enterprises to deal with their environment, and it has quickly begun to lose its luster with the emergence of a view of the contemporary business environment as a set of parts interacting with one another, and of a behavioral concept that encompasses all dimensions of performance. It is therefore imperative that enterprises adopt a system that engages these variables and interacts with them positively through the development of strategic plans and the use of implementation and follow-up strategies to ensure the effectiveness of the method for meas