Hyperbole is an obvious and intentional exaggeration: it pushes a statement to such an extreme that the audience goes too far and then pulls itself back to a more reasonable position; in other words, it is an extravagant statement or figure of speech not intended to be taken literally. This paper examines hyperbole in the campaign speeches of American presidential candidates from both formal and functional perspectives, on the hypothesis that candidates use hyperbolic expressions extensively in their electoral campaigns to persuade voters of the objectives of their campaign programs. Accordingly, it aims to analyze hyperbole in context to determine the range of pragmatic functions this figure fulfills, and to present a formal analysis demonstrating which formal realizations employed with a hyperbolic function are more or less likely to serve the persuasive aspect of hyperbole. To achieve these aims, three campaign speeches by Barack Obama from the 2012 presidential election, chosen at random from the American Presidency Project, were analyzed and their hyperbolic expressions identified. In terms of the formal analysis, the frequency findings reveal that exaggerated content in single words is the most common realization of hyperbole in Obama's speeches. In terms of the functional analysis, the results reveal that emphasis and evaluation are the most prominent functions, suggesting that the intended impression on voters is constructed only through the combined effect of these two devices.
Coronary artery disease (CAD) is the leading cause of death worldwide. Certain genetic polymorphisms play an important role in this multifactorial disease, being linked with an increased risk of early-onset CAD.
To assess six genetic polymorphisms and clinical risk factors in early-onset nondiabetic Iraqi Arab CAD patients compared with controls.
This case–control study
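In a case–control design like the one above, the association between a polymorphism and disease is typically summarized as an odds ratio. As a minimal sketch (the counts below are invented for illustration, not the study's data), the calculation is:

```python
# Hypothetical illustration: odds ratio for one polymorphism in a
# case-control design. All counts are assumptions, not study results.
def odds_ratio(case_exposed, case_unexposed, control_exposed, control_unexposed):
    """Odds ratio from a 2x2 case-control table."""
    return (case_exposed / case_unexposed) / (control_exposed / control_unexposed)

# e.g. 40 of 100 cases carry the risk allele vs 20 of 100 controls
or_value = odds_ratio(40, 60, 20, 80)
print(round(or_value, 2))  # 2.67 -> carriers have ~2.7x the odds of CAD
```

An odds ratio above 1 suggests the allele is associated with increased risk; confidence intervals and adjustment for clinical risk factors would be needed before drawing conclusions.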
In this research, several estimators of the hazard function are introduced. These estimators are based on a nonparametric method, namely kernel estimation for censored data, with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques were also used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best for all types of boundary kernel functions.
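To make the idea concrete, here is a minimal sketch of a kernel hazard estimator for right-censored data: Nelson–Aalen increments smoothed with an Epanechnikov kernel and a single global bandwidth. This is our illustration of the general technique, not the paper's exact estimators or its boundary corrections.

```python
import numpy as np

# Sketch: kernel-smoothed hazard from right-censored data, obtained by
# smoothing Nelson-Aalen increments with an Epanechnikov kernel.
# (Illustrative only; the paper's local-bandwidth and boundary-kernel
# variants are not reproduced here.)
def epanechnikov(u):
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def kernel_hazard(t, times, events, bandwidth):
    """Estimate hazard at t; `events` is 1 for an event, 0 for censored."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)        # risk-set size just before each time
    increments = events / at_risk     # Nelson-Aalen jump at each event time
    u = (t - times) / bandwidth
    return np.sum(epanechnikov(u) * increments) / bandwidth

# Simulated censored sample: true hazard is constant at 1
rng = np.random.default_rng(0)
lifetimes = rng.exponential(1.0, 200)
censor = rng.exponential(2.0, 200)
times = np.minimum(lifetimes, censor)
events = (lifetimes <= censor).astype(float)
print(round(kernel_hazard(0.5, times, events, bandwidth=0.4), 2))
```

A local-bandwidth variant would simply let `bandwidth` depend on `t`, which is where the paper's comparison between local and global choices comes in.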
This research aims to study and analyze concurrent engineering (CE) and cost optimization (CO), to use the outputs of concurrent engineering as inputs to cost optimization, and to show the role of concurrent engineering in improving product quality, achieving savings in design, manufacturing, and assembly time, and reducing costs. It also employs some models to determine the amount of time savings, among them the Lexmark model and the PERT model, to determine the savings in design, manufacturing, and assembly time. To achieve the objectives
The expanding use of multi-processor supercomputers has made a significant impact on the speed and size of many problems. The adoption of the standard Message Passing Interface (MPI) protocol has enabled programmers to write portable and efficient code across a wide variety of parallel architectures. Sorting is one of the most common operations performed by a computer. Because sorted data are easier to manipulate than randomly ordered data, many algorithms require sorted data. Sorting is of additional importance to parallel computing because of its close relation to the task of routing data among processes, which is an essential part of many parallel algorithms. In this paper, sequential sorting algorithms and the parallel implementation of man
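As a small sketch of the sequential building block behind many parallel sorting schemes, here is merge sort: in a typical MPI parallelization, each process sorts its local partition and the sorted runs are merged when gathered. This is an illustrative example, not code from the paper.

```python
# Merge sort: the divide step maps naturally onto separate processes,
# and the merge step combines the locally sorted runs.
def merge(left, right):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def merge_sort(data):
    if len(data) <= 1:
        return list(data)
    mid = len(data) // 2
    return merge(merge_sort(data[:mid]), merge_sort(data[mid:]))

print(merge_sort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```

The same `merge` routine is what a root process would apply to runs received from workers, which is why sorting and data routing are so closely related in parallel algorithms.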
Recently, the cloud computing paradigm has become widely accepted all over the world and is used by most enterprise businesses to increase their productivity. However, some concerns remain about the security provided by the cloud environment. In this research project, we discuss the evolution of the cloud computing paradigm for large business applications such as CRM, and we introduce a new framework for secure cloud computing based on the method of IT auditing. Our approach is directed toward establishing a cloud computing framework for CRM applications through the use of checklists that follow the data flow of the CRM application and its lifecycle. Those ch
This investigation was carried out to study the treatment and recycling of wastewater from the battery industry, for an effluent containing lead ions. The reuse of such an effluent can only be made possible by an appropriate treatment method, such as electrocoagulation.
The electrochemical process, which uses a cell comprising an aluminum anode and a stainless steel cathode, was applied to simulated wastewater containing lead ions at concentrations of 30–120 mg/L, under different operating conditions: current density of 0.4–1.2 mA/cm², pH of 6–10, and time of 10–180 minutes.
The results showed that the best operating conditions for complete lead removal (100%) at the maximum concentration of 120 mg/L were found to be a current density of 1.2 mA/cm²
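Two quantities recur in studies like this: the removal efficiency and the theoretical aluminum dose released at the anode (Faraday's law). The helpers below are a back-of-the-envelope sketch; the current and time values in the example are assumptions, not the study's measurements.

```python
# Illustrative electrocoagulation helpers (values below are assumed).
FARADAY = 96485  # C/mol

def removal_efficiency(c_initial, c_final):
    """Percent of lead removed, from initial/final concentrations (mg/L)."""
    return 100.0 * (c_initial - c_final) / c_initial

def aluminum_dissolved_mg(current_a, time_s, molar_mass=26.98, z=3):
    """Theoretical Al released at the anode via Faraday's law (mg)."""
    return current_a * time_s * molar_mass / (z * FARADAY) * 1000

print(removal_efficiency(120, 0))                   # 100.0 (complete removal)
print(round(aluminum_dissolved_mg(0.06, 3600), 1))  # mg Al over 1 h at 60 mA
```

In practice the measured aluminum dose can exceed the Faraday prediction because of chemical dissolution of the electrode, so this estimate is a lower bound.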
This study presents an adaptive control scheme based on synergetic control theory for suppressing the vibration of building structures due to earthquakes. The key element of the proposed controller is a magneto-rheological (MR) damper, which supports the building. Based on Lyapunov stability analysis, an adaptive synergetic control (ASC) strategy was established under variation of the stiffness and viscosity coefficients of the vibrating building. The control and adaptive laws of the ASC were developed to ensure the stability of the controlled structure. The proposed controller addresses the suppression problem for a single-degree-of-freedom (SDOF) building model, and an earthquake control scenario was conducted and simulated
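To illustrate the SDOF setting, here is a minimal simulation of a single-degree-of-freedom structure under resonant base excitation, where a crude velocity-feedback term stands in for the dissipative action of a controlled damper. All parameter values are assumptions for the sketch; this is not the paper's ASC law or its MR damper model.

```python
import math

# SDOF building under base excitation: m*x'' + c*x' + k*x = f_d - m*a_g(t)
# f_d = -g*v is a crude dissipative control force standing in for a
# controlled damper. Parameters are illustrative assumptions.
def simulate(m=1000.0, c=200.0, k=50000.0, g=1500.0, dt=0.001, t_end=5.0):
    """Return peak displacement (m) over the simulated earthquake."""
    wn = math.sqrt(k / m)          # natural frequency (rad/s)
    x, v, peak = 0.0, 0.0, 0.0
    for i in range(int(t_end / dt)):
        t = i * dt
        a_g = 0.3 * 9.81 * math.sin(wn * t)   # resonant harmonic ground motion
        f_d = -g * v                          # damper control force
        a = (-c * v - k * x + f_d - m * a_g) / m
        v += a * dt                           # semi-implicit Euler step
        x += v * dt
        peak = max(peak, abs(x))
    return peak

print(f"controlled peak drift:   {simulate():.4f} m")
print(f"uncontrolled peak drift: {simulate(g=0.0):.4f} m")
```

Near resonance, the response is governed by damping, so even this simple feedback term cuts the peak drift substantially; an adaptive synergetic law would additionally track the unknown stiffness and viscosity coefficients online.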
Abstract
The problem of missing data represents a major obstacle for researchers in the process of data analysis, since this problem recurs in all fields of study, including social, medical, astronomical, and clinical experiments.
The presence of such a problem in the data under study may negatively affect the analysis and may lead to misleading conclusions, since these conclusions result from the great bias caused by that problem. Moreover, in spite of the efficiency of wavelet methods, they too are affected by missing data, in addition to the resulting loss of estimation accuracy
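The bias the abstract warns about is easy to demonstrate. The toy example below (ours, not the paper's method) makes values missing not at random, with large values dropping out more often, so that simply discarding incomplete cases gives a biased estimate of the mean.

```python
import random
import statistics

# Toy demonstration of bias from missing data: large values (> 55) go
# missing with probability 0.7, so listwise deletion underestimates
# the true mean. Thresholds and probabilities are arbitrary choices.
random.seed(1)
full = [random.gauss(50, 10) for _ in range(10000)]
observed = [x for x in full if not (x > 55 and random.random() < 0.7)]

true_mean = statistics.mean(full)
deleted_mean = statistics.mean(observed)   # analysis after listwise deletion
print(round(true_mean, 1), round(deleted_mean, 1))
```

Any estimator applied to the observed subset, wavelet-based or otherwise, inherits this distortion unless the missingness mechanism is modeled or corrected for.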
In this study, a 3 mm thick 7075-T6 aluminium alloy sheet was used in the friction stir welding process. A design-of-experiments approach was applied to reduce the number of experiments and to obtain the optimum friction stir welding parameters, utilizing the Taguchi technique based on the ultimate tensile test results. An L9 (3³) orthogonal array was used, based on three parameters with three levels each: shoulder–workpiece interference depth (0.20, 0.25, and 0.30 mm), pin geometry (cylindrical thread with flat end, cylindrical thread with 3-flat round end, and cylindrical thread with round end), and thread pitch (0.8, 1, and 1.2 mm). This technique was executed using Minitab 17 software. The results showed th
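In a Taguchi analysis driven by ultimate tensile strength, each L9 run is typically scored with the larger-the-better signal-to-noise ratio before ranking parameter levels. The sketch below shows that calculation; the UTS readings are hypothetical, not the study's measurements.

```python
import math

# Taguchi larger-the-better S/N ratio, used to rank parameter levels
# when the response (here, ultimate tensile strength) should be
# maximized. Example values are assumptions.
def sn_larger_is_better(values):
    """S/N = -10 * log10( mean(1 / y^2) ), in dB."""
    return -10 * math.log10(sum(1 / y**2 for y in values) / len(values))

# e.g. replicated UTS readings (MPa) for one L9 run
print(round(sn_larger_is_better([310.0, 315.0, 308.0]), 2))
```

The level of each factor (interference depth, pin geometry, thread pitch) with the highest mean S/N across its three runs is taken as optimal, which is the ranking Minitab automates.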