Background: Appreciation of the crucial role of risk factors in the development of coronary artery disease (CAD) is one of the most significant advances in the understanding of this important disease. Extensive epidemiological research has established cigarette smoking, diabetes, hyperlipidemia, and hypertension as independent risk factors for CAD.

Objective: To determine the prevalence of the four conventional risk factors (cigarette smoking, diabetes, hyperlipidemia, and hypertension) among patients with CAD, and to determine the correlation of the Thrombolysis in Myocardial Infarction (TIMI) risk score with the extent of CAD in patients with unstable angina/non-ST-elevation myocardial infarction (UA/NSTEMI).

Methods: We conducted a descriptive study of 100 patients admitted with UA/NSTEMI to three major cardiac centres in Iraq (the Iraqi Centre for Heart Disease, Ibn Al-Bitar Hospital for Cardiac Surgery, and Al-Nasyria Cardiac Centre) from January 2010 to January 2011. The frequency of each conventional risk factor and the number of conventional risk factors present were estimated at study entry and compared between men and women and by age. The TIMI risk score was stratified on seven standard variables. The extent of CAD was evaluated by angiography; significant CAD was defined as ≥70% stenosis in any one of the three major epicardial vessels and ≥50% stenosis in the left main stem (LMS).

Results: Among the 100 patients with UA/NSTEMI, 82% had one or more risk factors and only 18% lacked all four conventional risk factors. Smoking was the most common risk factor in male patients, while diabetes mellitus and dyslipidemia were more common among female patients; all of these results were statistically significant. Sixty-four percent of patients had a TIMI score < 4 (low and intermediate risk) and 36% had a TIMI score > 4 (high risk). Patients with a TIMI score > 4 were more likely to have significant three-vessel and LMS disease, whereas those with a TIMI score < 4 had less severe disease (single- and two-vessel disease).

Conclusion: Antecedent exposure to the major CAD risk factors was very common among those who developed CAD, emphasizing the importance of considering all major risk factors in CAD risk estimation. Patients with a high TIMI risk score were more likely to have severe multivessel CAD than those with a low or intermediate score. Hence, patients with a TIMI score > 4 should be referred for early invasive coronary evaluation to derive clinical benefit.

Key words: unstable angina, Thrombolysis in Myocardial Infarction score, risk factors
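The seven standard TIMI UA/NSTEMI variables are age ≥ 65 years, at least three conventional CAD risk factors, known coronary stenosis ≥ 50%, ST-segment deviation ≥ 0.5 mm on admission, at least two anginal episodes in the prior 24 hours, aspirin use within the last 7 days, and elevated cardiac biomarkers. As a rough illustration of how the score and the study's > 4 cutoff work, here is a minimal Python sketch; the function and variable names are our own, not the paper's:

```python
# Illustrative sketch of the TIMI UA/NSTEMI risk score: one point per variable,
# total 0-7. Names are hypothetical; the criteria are the seven standard
# TIMI predictors referenced in the abstract.

def timi_risk_score(age, n_risk_factors, known_stenosis_ge_50,
                    st_deviation, angina_episodes_24h,
                    aspirin_last_7_days, elevated_markers):
    """Return the TIMI UA/NSTEMI score (0-7), one point per criterion."""
    points = [
        age >= 65,
        n_risk_factors >= 3,        # of the conventional CAD risk factors
        known_stenosis_ge_50,       # prior coronary stenosis >= 50%
        st_deviation,               # ST deviation >= 0.5 mm on admission ECG
        angina_episodes_24h >= 2,   # severe angina in the prior 24 h
        aspirin_last_7_days,        # aspirin use within the last 7 days
        elevated_markers,           # raised cardiac biomarkers
    ]
    return sum(points)

score = timi_risk_score(age=70, n_risk_factors=3, known_stenosis_ge_50=True,
                        st_deviation=False, angina_episodes_24h=2,
                        aspirin_last_7_days=True, elevated_markers=True)
risk_group = "high" if score > 4 else "low/intermediate"  # cutoff used in the study
print(score, risk_group)
```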
Saudi Arabia’s banking sector plays an important role in the country’s development, as it is among the leading parts of the financial sector. Considering two major Saudi banks (the National Commercial Bank and the Saudi American Bank), the present study aims to observe the impact of emotional intelligence on employee performance. The components of emotional intelligence affecting employee performance are self-management, relationship management, self-awareness, and social awareness. A quantitative methodology was applied to analyse the survey results of 300 respondents over the period from 2018 to 2019. The results show a significant positive impact of self-management, self-awareness, and relationship management on employee performance.
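A minimal sketch of how such a survey analysis might look, regressing a performance measure on the four emotional-intelligence components with statsmodels; the file name and column names are assumptions, not the study's actual data:

```python
# Hypothetical sketch: multiple regression of employee performance on the four
# emotional-intelligence components described above. The CSV file and column
# names are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ei_survey.csv")  # hypothetical survey of 300 respondents

model = smf.ols(
    "performance ~ self_management + self_awareness + "
    "relationship_management + social_awareness",
    data=df,
).fit()
print(model.summary())  # coefficient signs and p-values indicate each component's impact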
The purpose of this study is to evaluate the effect of hydrated lime (HL) addition methods, as a filler replacement, on the fatigue performance of Hot Mix Asphalt (HMA). Three HL addition methods were introduced: dry HL on dry aggregate, dry HL on saturated surface-dry aggregate (above 3% moisture), and slurry HL on dry aggregate. Ordinary limestone powder was replaced by HL at three percentages (1.0, 2.0, and 3.0%). The effect of the different methods on the fatigue properties of HMA was investigated using the third-point flexural fatigue bending test. A Pneumatic Repeated Load System (PRLS) was used to establish the effect of hydrated lime on the fatigue failure criteria and to select the proper hydrated lime application method for the fatigue behavior of HMA.
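Flexural beam fatigue results of this kind are commonly summarized with the power-law model Nf = k1 · (1/εt)^k2, where Nf is cycles to failure and εt the applied tensile strain. A hedged sketch of fitting that model, with invented data points rather than the study's measurements:

```python
# Hedged sketch: fitting the classical fatigue power law Nf = k1 * (1/strain)^k2
# to flexural beam fatigue results. The strain/Nf values below are made up for
# illustration; they are not the study's measurements.
import numpy as np

strain = np.array([400e-6, 500e-6, 600e-6, 700e-6])   # tensile strain (mm/mm)
n_f = np.array([1.2e6, 4.0e5, 1.5e5, 7.0e4])          # cycles to failure

# Linearize: log(Nf) = log(k1) + k2 * log(1/strain), then least-squares fit.
k2, log_k1 = np.polyfit(np.log(1.0 / strain), np.log(n_f), 1)
k1 = np.exp(log_k1)
print(f"k1 = {k1:.3e}, k2 = {k2:.2f}")  # compare k1/k2 across HL addition methods
```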
In recent years, data centre (DC) networks have rapidly improved their data-exchange capabilities. Software-defined networking (SDN) was introduced to change the design of conventional networks by segregating the control plane from the data plane. SDN overcomes the limitations of traditional DC networks caused by the rapidly growing number of applications, websites, data-storage needs, etc. Software-defined networking data centres (SDN-DC), based on the OpenFlow (OF) protocol, are used to achieve superior behaviour when executing traffic load-balancing (LB) jobs. The LB function divides the traffic-flow demands between the end devices to avoid link congestion. In short, SDN is proposed to manage network configurations more effectively.
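A minimal sketch of the load-balancing idea described above: each new flow is placed on the candidate path whose busiest link is least loaded, so no single link congests. The two-path topology and the demands are hypothetical, not the paper's SDN-DC testbed:

```python
# Least-loaded-path flow placement, a simplified stand-in for an SDN-DC
# load-balancing function. Topology and traffic demands are invented.
from collections import defaultdict

paths = {  # candidate paths between two hosts, as lists of links
    "p1": [("s1", "s2"), ("s2", "s4")],
    "p2": [("s1", "s3"), ("s3", "s4")],
}
link_load = defaultdict(float)  # current utilization per link (Mbps)

def place_flow(demand_mbps):
    """Assign a flow to the path whose most-loaded link is least loaded."""
    best = min(paths, key=lambda p: max(link_load[l] for l in paths[p]))
    for link in paths[best]:
        link_load[link] += demand_mbps
    return best

for demand in [100, 100, 50, 200]:
    print(place_flow(demand), dict(link_load))
```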
Visual analytics has become an important approach for discovering patterns in big data. Since visualization already struggles with the high dimensionality of data, issues like a concept hierarchy on each dimension add further difficulty and can make visualization a prohibitive task. A data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and comes with special exploration operations such as roll-up, drill-down, slicing, and dicing. All these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visualization approach that addresses these issues.
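To make the cube operations named above concrete, here is a hedged pandas sketch of a base cuboid, a roll-up along a concept hierarchy (city to country), and a slice; the sales data and hierarchy are invented for illustration:

```python
# Illustrative data-cube operations (roll-up, slice) on a tiny invented dataset.
import pandas as pd

df = pd.DataFrame({
    "country": ["US", "US", "UK", "UK"],
    "city":    ["NYC", "LA", "London", "Leeds"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "sales":   [100, 150, 80, 60],
})

cube = df.groupby(["country", "city", "quarter"])["sales"].sum()  # base cuboid
rollup = df.groupby(["country"])["sales"].sum()       # roll-up: city -> country
slice_q1 = df[df["quarter"] == "Q1"]                  # slice: fix one dimension
print(cube, rollup, slice_q1, sep="\n\n")
```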
In the current paradigms of information technology, cloud computing is the most essential kind of computing service. It satisfies the needs of high-volume customers, provides flexible computing capabilities for a range of applications such as database archiving and business analytics, and supplies the extra computing resources that give cloud providers financial value. The purpose of this investigation is to assess the viability of auditing data remotely within a cloud computing setting. The theory behind cloud computing and distributed storage systems is discussed, as well as the method of remote data auditing. This research addresses how to safeguard data that is outsourced and stored on cloud servers.
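As a simplified stand-in for the auditing protocols the abstract surveys, here is a challenge-response sketch: the data owner keeps per-block HMAC tags and later spot-checks randomly chosen blocks held by the server. Real schemes avoid retrieving whole blocks, so this is only the core idea:

```python
# Hedged sketch of remote data auditing via per-block HMAC tags and random
# challenges. A deliberate simplification, not a production protocol.
import hashlib, hmac, os, random

KEY = os.urandom(32)                            # owner's secret key
blocks = [os.urandom(1024) for _ in range(16)]  # outsourced file, split into blocks
tags = {i: hmac.new(KEY, b, hashlib.sha256).digest() for i, b in enumerate(blocks)}

def audit(server_blocks, n_challenges=4):
    """Owner verifies randomly sampled blocks still match their stored tags."""
    for i in random.sample(range(len(blocks)), n_challenges):
        proof = hmac.new(KEY, server_blocks[i], hashlib.sha256).digest()
        if not hmac.compare_digest(proof, tags[i]):
            return False  # corruption or deletion detected
    return True

print(audit(blocks))  # True while the outsourced data is intact
```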
... Show MoreThe using of the parametric models and the subsequent estimation methods require the presence of many of the primary conditions to be met by those models to represent the population under study adequately, these prompting researchers to search for more flexible parametric models and these models were nonparametric, many researchers, are interested in the study of the function of permanence and its estimation methods, one of these non-parametric methods.
For work of purpose statistical inference parameters around the statistical distribution for life times which censored data , on the experimental section of this thesis has been the comparison of non-parametric methods of permanence function, the existence
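A standard nonparametric estimator of the survival (permanence) function for right-censored lifetimes is the Kaplan-Meier product-limit estimator; the thesis's specific estimators are not named here, so this sketch shows only the general technique, on invented data:

```python
# Hedged sketch: Kaplan-Meier estimate of the survival (permanence) function
# from right-censored lifetimes. Data are invented; event=0 means censored.
import numpy as np

times  = np.array([3, 5, 5, 8, 12, 16, 16, 20])
events = np.array([1, 1, 0, 1,  1,  0,  1,  0])  # 1 = failure, 0 = censored

s_hat, surv = 1.0, {}
at_risk = len(times)
for t in np.unique(times):                     # unique times, ascending
    d = np.sum((times == t) & (events == 1))   # failures at time t
    if d > 0:
        s_hat *= 1.0 - d / at_risk             # product-limit step
        surv[t] = s_hat
    at_risk -= np.sum(times == t)              # drop failures and censored at t
print(surv)  # estimated S(t) at each failure time
```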
This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them affects the others. The data were acquired from an Iraqi private biochemical laboratory; however, they are high-dimensional, with a high rate of null values and a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB).
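A hypothetical sketch of the supervised stage described above: impute the null values, scale, and compare the five named classifiers by cross-validation. The file name, target column, and imputation strategy are assumptions, not the study's actual setup:

```python
# Hedged sketch: preprocessing (null imputation, scaling) plus the five
# classifiers named in the abstract, compared by 5-fold cross-validation.
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("biochem_tests.csv")            # hypothetical lab dataset
X, y = df.drop(columns="diagnosis"), df["diagnosis"]

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(),
    "LR": LogisticRegression(max_iter=1000),
    "K-NN": KNeighborsClassifier(),
    "NB": GaussianNB(),
}
for name, clf in models.items():
    pipe = make_pipeline(SimpleImputer(strategy="median"), StandardScaler(), clf)
    print(name, cross_val_score(pipe, X, y, cv=5).mean())
```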
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has led to an unprecedented increase in the amount of data that business organizations deal with. The amount of data available through the internet, and the seemingly random way it accumulates there, is a problem that many parties seek to solve. Forecasts suggested that by 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, so-called data mining emerged as a way to extract useful knowledge from these vast amounts of data.
Permeability data are of major importance and must be handled carefully in all reservoir simulation studies. Their importance increases in mature oil and gas fields because of their sensitivity to the requirements of specific improved-recovery methods. However, the industry holds a huge stock of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert the air permeability data that are conventionally measured during laboratory core analysis into liquid permeability. This correlation offers a feasible estimate in cases of lost data, in poorly consolidated formations, or in cases where liquid permeability measurements are unavailable.
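One common way to build such a correlation is a log-log linear fit of paired core measurements, giving a power law k_liq = a · k_air^b. The sketch below uses invented values, and the functional form is an assumption, not the study's equation:

```python
# Hedged sketch: fitting a power-law air-to-liquid permeability correlation
# from paired core measurements. Data and model form are illustrative only.
import numpy as np

k_air = np.array([12.0, 55.0, 130.0, 420.0, 900.0])   # air permeability (mD)
k_liq = np.array([8.5, 41.0, 102.0, 350.0, 760.0])    # liquid permeability (mD)

b, log_a = np.polyfit(np.log(k_air), np.log(k_liq), 1)  # log-log least squares
a = np.exp(log_a)
print(f"k_liq ≈ {a:.3f} * k_air^{b:.3f}")

predicted = a * 250.0 ** b   # estimate liquid permeability from an air measurement
print(f"{predicted:.1f} mD")
```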