Background: Acute cholecystitis is a common surgical problem that was previously managed conservatively. Later, early open cholecystectomy was introduced as an alternative to interval cholecystectomy for the treatment of acute cholecystitis. Early open cholecystectomy was found to be safe and successful, with a comparable postoperative complication rate. With the advent of laparoscopy, laparoscopic cholecystectomy came into use for chronic cholecystitis and became the first-line treatment. Newer reports have shown that laparoscopic cholecystectomy can be used as an alternative to open cholecystectomy for the surgical treatment of acute cholecystitis.
Objectives: To compare the success and safety of early laparoscopic cholecystectomy versus early open cholecystectomy as the primary treatment of acute cholecystitis.
Methods: Sixty-eight patients were treated for clinical acute cholecystitis between January 2002 and February 2004 in the Department of Surgery at Al-Kindy Teaching Hospital. Of these, 62 patients underwent early cholecystectomy for acute cholecystitis as soon as possible after diagnosis. The preferred preoperative imaging technique was ultrasound. Thirty of the operations (48.3%) were attempted laparoscopically, whereas the remaining 32 patients (51.7%) underwent initial open cholecystectomy.
Results: The mean operative time was 75 minutes for the open cases versus 60 minutes for the laparoscopic group. There was no perioperative mortality in either group. The rate of conversion to open cholecystectomy was 10% (3 patients). Surgical complications related to laparoscopic and open cholecystectomy occurred in 2 (6.6%) and 3 (9.3%) cases, respectively. There was no difference between the open and laparoscopic groups with regard to major postoperative complications.
Conclusion: The current study shows that early cholecystectomy, whether performed open or laparoscopically, is a safe and effective treatment for acute cholecystitis. Low conversion rates can be maintained with strict guidelines for appropriate patient selection, adequate experience, and proper laparoscopic technique.
This research aims to analyze and simulate real biochemical test data in order to uncover the relationships among the tests and how each of them affects the others. The data were acquired from an Iraqi private biochemical laboratory; however, they are high-dimensional, contain a high rate of null values, and cover a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification and Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB)…
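To make the comparison concrete, a minimal scikit-learn sketch of running the supervised models named above is shown below. The file name, label column, and median imputation (included because of the high rate of null values the abstract mentions) are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch (not the authors' code): comparing the supervised models named
# in the abstract on a hypothetical, already-extracted biochemical test table.
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

df = pd.read_csv("biochemical_tests.csv")               # hypothetical file name
X = df.drop(columns=["diagnosis"])                      # hypothetical label column
y = df["diagnosis"]

models = {
    "LDA":  LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(),
    "LR":   LogisticRegression(max_iter=1000),
    "K-NN": KNeighborsClassifier(n_neighbors=5),
    "NB":   GaussianNB(),
}

for name, clf in models.items():
    # Impute the many null values, scale the tests, then fit each classifier.
    pipe = make_pipeline(SimpleImputer(strategy="median"), StandardScaler(), clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```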
The Artificial Neural Network methodology is an important and relatively new approach that builds models for analysis, data evaluation, forecasting, and control without depending on a pre-specified model or a classical statistical method describing the behavior of the statistical phenomenon. The methodology works by simulating the data to reach a robust optimum model that represents the phenomenon and can be reused at any time and under any conditions. The Box-Jenkins (ARMAX) approach was used for comparison. This paper relies on the received power to build a robust model for forecasting, analysis, and control of the power; the received power comes from…
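As an illustration of the Box-Jenkins ARMAX approach mentioned above, here is a hedged sketch using statsmodels; the file name, column names, and model orders (2, 0, 1) are assumptions chosen for demonstration, not the paper's fitted model.

```python
# Minimal sketch (assumed data layout): fitting an ARMAX-type Box-Jenkins model
# to a received-power series with statsmodels, then forecasting ahead.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

df = pd.read_csv("received_power.csv")        # hypothetical file
y = df["received_power"]                      # series to model (assumed column name)
x = df[["transmitted_power"]]                 # exogenous input (assumed)

# ARMA(p, q) with exogenous regressors, i.e. an ARMAX model; orders are illustrative.
model = ARIMA(y, exog=x, order=(2, 0, 1))
res = model.fit()
print(res.summary())

# Forecast the next 10 points, supplying placeholder future exogenous values.
future_x = x.tail(10)
print(res.forecast(steps=10, exog=future_x))
```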
Abstract: The great importance that distinguishes factorial experiments has made them desirable for use and application in many fields, particularly agriculture, which is considered the broadest area of application for experimental designs. The second case of the factorial experiment, in which researchers face great difficulty, is the unbalanced case, meaning that the frequencies of the factorial treatments are not equal (that is, an unequal number of blocks or experimental units is allocated per treatment)…
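A minimal sketch of handling the unbalanced factorial case follows, using an ordinary least-squares fit with Type II sums of squares in Python's statsmodels; the factor names, response values, and cell counts are invented for illustration and do not come from the paper.

```python
# Minimal sketch (illustrative data): a two-factor factorial with unequal
# replication per treatment, analyzed with Type II sums of squares, which do
# not rely on the balanced-design formulas.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "A":        ["a1", "a1", "a1", "a2", "a2", "a1", "a2", "a2", "a2"],
    "B":        ["b1", "b2", "b2", "b1", "b1", "b1", "b2", "b2", "b1"],
    "response": [5.1, 6.3, 6.0, 4.8, 5.0, 5.4, 7.1, 6.8, 4.9],
})

# Full factorial model with interaction; unequal cell counts are handled by the
# least-squares fit rather than by textbook balanced-design arithmetic.
model = smf.ols("response ~ C(A) * C(B)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))
```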
In this study, the performance of an adaptive optics (AO) system was analyzed through a numerical computer simulation implemented in MATLAB. Generating a phase screen involved turning computer-generated random numbers into two-dimensional arrays of phase values on a grid of sample points with matching statistics. Von Karman turbulence was created from its power spectral density. Several simulated point spread functions (PSFs) and modulation transfer functions (MTFs) for different values of the Fried coherence diameter (r0) were used to show how rough the atmosphere was. To evaluate the effectiveness of the optical system (telescope), the Strehl ratio (S) was computed. The compensation procedure for an AO system…
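For readers who want the general recipe, a Python analogue of the standard FFT-based von Karman phase-screen generator is sketched below (the paper's own implementation is in MATLAB and is not reproduced here); the grid size, grid spacing, r0, and outer scale L0 are placeholder values.

```python
# Minimal sketch (not the paper's MATLAB code): an FFT-based von Karman phase
# screen, made by filtering complex Gaussian noise with the square root of the
# phase power spectral density.
import numpy as np

def von_karman_phase_screen(N=256, delta=0.01, r0=0.1, L0=25.0, seed=0):
    """Return an N x N phase screen in radians; delta is the grid spacing in
    metres, r0 the Fried coherence diameter, L0 the outer scale."""
    rng = np.random.default_rng(seed)
    df = 1.0 / (N * delta)                          # frequency grid spacing (1/m)
    fx = np.fft.fftshift(np.fft.fftfreq(N, d=delta))
    fx, fy = np.meshgrid(fx, fx)
    f = np.hypot(fx, fy)
    f0 = 1.0 / L0                                   # outer-scale frequency
    # Von Karman phase PSD: Kolmogorov spectrum with an outer-scale roll-off.
    psd = 0.023 * r0 ** (-5.0 / 3.0) / (f ** 2 + f0 ** 2) ** (11.0 / 6.0)
    psd[N // 2, N // 2] = 0.0                       # remove the piston term
    # Complex Gaussian noise shaped by sqrt(PSD); inverse FFT gives the screen.
    cn = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
    cn *= np.sqrt(psd) * df
    return np.real(np.fft.ifft2(np.fft.ifftshift(cn))) * N ** 2

phase = von_karman_phase_screen()
print(phase.std())   # RMS phase (radians) of this realization
```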
Alongside the development of high-speed rail, rail flaw detection is of great importance for ensuring railway safety, especially as train speed and load increase. Several conventional inspection methods, such as visual, acoustic, and electromagnetic inspection, have been introduced in the past. However, these methods face several challenges in terms of detection speed and accuracy. Combined inspection methods have emerged as a promising approach to overcoming these limitations. Nondestructive testing (NDT) techniques in conjunction with artificial intelligence approaches have tremendous potential and viability, because it is highly possible to improve detection accuracy, as has been proven in various conventional nondestructive…
The basic goal of this research is to utilize an analytical method, called the Modified Iterative Method, to obtain an approximate analytic solution of the Sine-Gordon equation. The suggested method is an amalgamation of the iterative method and a well-known technique, namely the Adomian decomposition method. The method minimizes the computational size; averts round-off errors, transformation, and linearization; and does not require restrictive assumptions. Several examples are chosen to show the importance and effectiveness of the proposed method. In addition, the modified iterative method gives faster and easier solutions than other methods. These solutions are accurate and in agreement with the series…
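For reference, the Sine-Gordon equation with its initial data and a generic successive-approximation scheme of the kind such iterative methods build on can be written as follows; this is a general sketch, not the paper's exact modified iteration.

```latex
% The Sine-Gordon equation and a generic successive-approximation scheme
% (a sketch of the family of methods, not the paper's specific modification).
\begin{align}
  u_{tt} - u_{xx} + \sin u &= 0, \qquad u(x,0) = f(x), \quad u_t(x,0) = g(x), \\
  u_0(x,t) &= f(x) + t\,g(x), \\
  u_{n+1}(x,t) &= u_0(x,t)
    + \int_0^{t}\!\!\int_0^{s}
      \Bigl[(u_n)_{xx}(x,\tau) - \sin u_n(x,\tau)\Bigr]\,d\tau\,ds .
\end{align}
```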
Phishing is an internet crime carried out by imitating the legitimate website of a host in order to steal confidential information. Many researchers have developed phishing classification models that are limited in real-time performance and computational efficiency. This paper presents an ensemble learning model composed of DTree and NBayes built with the STACKING method, with DTree as the base learner. The aim is to combine the simplicity and effectiveness of DTree with the lower time complexity of NBayes. The models were integrated and appraised independently for data training, and the probabilities of each class were averaged by their accuracy on the training data through the testing process. The present results of the empirical study on phishing websites…
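A minimal scikit-learn sketch of a stacking ensemble in this spirit is shown below; taking the decision tree as the base learner and naive Bayes as the combiner is an assumption read from the abstract rather than the authors' published configuration, and the synthetic data merely stands in for extracted phishing-website features.

```python
# Minimal sketch (assumed arrangement): a stacking ensemble with a decision
# tree as base learner and naive Bayes combining its class probabilities.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Placeholder data standing in for phishing-website feature vectors.
X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

stack = StackingClassifier(
    estimators=[("dtree", DecisionTreeClassifier(max_depth=8, random_state=0))],
    final_estimator=GaussianNB(),
    stack_method="predict_proba",   # pass class probabilities to the combiner
    cv=5,
)
stack.fit(X_tr, y_tr)
print("held-out accuracy:", stack.score(X_te, y_te))
```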
Krawtchouk polynomials (KPs) and their moments are promising techniques for applications in information theory, coding theory, and signal processing. This is due to the special capabilities of KPs in feature extraction and classification processes. The main challenge in existing KP recurrence algorithms is numerical error, which occurs during the computation of the coefficients for large polynomial sizes, particularly when the KP parameter (p) deviates from 0.5 towards 0 or 1. To this end, this paper proposes a new recurrence relation for computing the coefficients of KPs of high order. In particular, this paper discusses the development of a new algorithm and presents a new mathematical model for computing the…
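For context, the classical three-term recurrence for Krawtchouk polynomials, whose numerical behaviour at large orders motivates the paper, can be sketched as follows; this is the textbook relation, not the new recurrence the paper proposes.

```python
# Classical three-term recurrence for Krawtchouk polynomials K_n(x; p, N)
# (the textbook relation, not the paper's proposed recurrence).
import numpy as np

def krawtchouk(n_max, N, p):
    """Return an (n_max+1) x (N+1) array K[n, x] of K_n(x; p, N) for x = 0..N."""
    x = np.arange(N + 1, dtype=float)
    K = np.zeros((n_max + 1, N + 1))
    K[0] = 1.0
    if n_max >= 1:
        K[1] = 1.0 - x / (p * N)
    for n in range(1, n_max):
        # p(N-n) K_{n+1} = [p(N-n) + n(1-p) - x] K_n - n(1-p) K_{n-1}
        K[n + 1] = ((p * (N - n) + n * (1 - p) - x) * K[n]
                    - n * (1 - p) * K[n - 1]) / (p * (N - n))
    return K

K = krawtchouk(n_max=5, N=8, p=0.5)
print(K[2])   # K_2(x; 0.5, 8) evaluated at x = 0..8
```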