The current research is concerned with decisive answers, the quick and conclusive replies that can effectively interrupt an opponent's argument and close the dialogue. The study follows a deliberative (pragmatic) methodology, focusing on how decisive answers operate and how they bring an exchange to an end across the several argumentative sides involved. The research consists of an introduction and three parts. The introduction sheds light on the concept of decisive answers, their uses in literature and the science of speech, and how they are treated within a single description of dialogue. The first part examines these answers through the deliberative methodology and classifies them accordingly. The second part deals with the arrangement and employment of arguments in decisive answers in line with the concept of argument, studying the mechanism of presenting arguments in this field. The third and final part deals with the origins and essence of decisive answers through the critical requirements on which arguments are built. Throughout, the research concentrates on the requirements of dialogue and the mechanism of intentions underlying such answers.
This paper aims to validate a proposed finite element model for predicting the displacements and soil stresses of a piled-raft foundation. The proposed model adopts solid elements to simulate the raft, the piles, and the soil mass. An explicit integration scheme has been used to simulate the nonlinear static behavior of the piled-raft foundation and to avoid the computational difficulties associated with implicit finite element analysis. The validation process is based on comparing the results of the proposed finite element model with those of scaled-down experimental work carried out by other researchers. A centrifuge apparatus was used in the experimental work to generate the required stresses to simulate t…
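To make the role of the explicit scheme concrete, the sketch below (a stand-in illustration, not the paper's finite element model) shows the basic idea of explicit central-difference time stepping with damping applied to a single, hypothetical nonlinear "soil spring": the solution is marched in time until the response settles, and the settled state approximates the static solution without any implicit equation solving.

```python
# A minimal sketch of an explicit scheme used for a quasi-static nonlinear
# problem: central-difference time stepping with damping, marched until the
# response settles. The single-DOF spring and all values are illustrative.
import numpy as np

def resisting_force(u, k=1.0e4, u_y=0.01):
    """Elastic-perfectly-plastic spring as a toy nonlinearity."""
    return np.clip(k * u, -k * u_y, k * u_y)

m, c = 10.0, 50.0          # lumped mass and damping (assumed values)
P = 80.0                   # applied load
dt = 1.0e-3                # must satisfy the explicit stability limit
u_prev, u = 0.0, 0.0

for step in range(20000):
    # central-difference update of m*a + c*v + f_int(u) = P
    a = (P - resisting_force(u) - c * (u - u_prev) / dt) / m
    u_next = 2.0 * u - u_prev + a * dt * dt
    u_prev, u = u, u_next

print(f"settled displacement ~ {u:.5f} m")   # approximates the static answer
```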
Delays occur commonly in construction projects, and assessing the impact of a delay is sometimes a contentious issue. Several delay analysis methods are available, but no single method can be universally used over another in all situations. The selection of the proper analysis method depends upon a variety of factors, including the information available, the time of analysis, the capabilities of the methodology, and the time, funds, and effort allocated to the analysis. This paper presents a computerized schedule analysis program that uses the daily windows analysis method, as it is recognized as one of the most credible methods and is one of the few techniques much more likely to be accepted by courts than other methods. A simple case study has been implemented…
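As a rough illustration of the bookkeeping a daily windows analysis performs (a hedged sketch, not the paper's program), the snippet below charges each day's slippage in the forecast completion date to the party responsible for that window's events; the daily records are hypothetical.

```python
# Daily windows delay attribution, illustrative only: compare each day's
# forecast completion with the previous day's and charge any slippage to the
# party responsible for that window's events.
from collections import defaultdict

# (day, forecast completion in working days, party responsible that day)
daily_updates = [
    (1, 120, None),
    (2, 120, None),
    (3, 122, "contractor"),   # e.g. slow progress on a critical activity
    (4, 122, None),
    (5, 125, "owner"),        # e.g. late design information
    (6, 125, None),
]

delay_by_party = defaultdict(int)
prev_forecast = daily_updates[0][1]
for day, forecast, party in daily_updates[1:]:
    slip = forecast - prev_forecast
    if slip > 0:
        delay_by_party[party or "unclassified"] += slip
    prev_forecast = forecast

print(dict(delay_by_party))   # {'contractor': 2, 'owner': 3}
```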
Speech recognition is a very important field that can be used in many applications, such as access control for protected areas, banking, transactions over the telephone network, database access services, voice email, investigations, and house control and management. Speaker recognition systems can be used in two modes: to identify a particular person or to verify a person's claimed identity. Family speaker recognition is a recent branch of speaker recognition; speakers from the same family share similar voice characteristics, which makes it hard to distinguish between them. Today, the scope of speech recognition is limited to speech collected from cooperative users in real-world office environments, without adverse microphone or channel impairments.
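The two operating modes can be illustrated with a small sketch (placeholder embeddings and threshold, not a system from the paper): identification picks the closest enrolled speaker, while verification accepts or rejects a claimed identity against a decision threshold.

```python
# Identification vs. verification with cosine similarity between fixed-length
# voice embeddings; any front-end (e.g. MFCC statistics) could produce them.
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

enrolled = {                      # hypothetical enrolled family members
    "father": np.array([0.9, 0.1, 0.3]),
    "mother": np.array([0.2, 0.8, 0.4]),
    "son":    np.array([0.7, 0.2, 0.6]),
}
test = np.array([0.85, 0.15, 0.35])

# Identification: pick the enrolled speaker with the highest similarity.
identified = max(enrolled, key=lambda name: cosine(enrolled[name], test))

# Verification: accept a claimed identity only above a decision threshold.
claim, threshold = "father", 0.90
accepted = cosine(enrolled[claim], test) >= threshold

print(identified, accepted)
```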
Recently, there has been an urgent need to estimate people's ages from their personal pictures, for use in personal and biometric security, human-computer interaction, information security, and law enforcement. However, in spite of advances in age estimation, it remains a difficult problem, because the facial aging process is determined not only by intrinsic factors, e.g. genetic factors, but also by external factors, e.g. lifestyle, expression, and environment. This paper utilizes a machine learning technique for intelligent age estimation from facial images using a support vector machine (SVM) on the FG_NET dataset. The proposed work consists of three phases: the first phase is image preprocessing, which includes four st…
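For the SVM stage alone, a hedged sketch is given below: a support vector regressor maps pre-extracted facial feature vectors to ages. The features here are synthetic placeholders rather than features extracted from FG_NET images, and the hyperparameters are assumed values.

```python
# SVM-based age estimation, sketch of the regression stage only.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 64))                  # placeholder facial features
ages = np.clip(30 + X[:, 0] * 10 + rng.normal(scale=3, size=300), 1, 70)

X_tr, X_te, y_tr, y_te = train_test_split(X, ages, test_size=0.2, random_state=0)
model = SVR(kernel="rbf", C=10.0, epsilon=1.0).fit(X_tr, y_tr)

pred = model.predict(X_te)
print("MAE (years):", np.mean(np.abs(pred - y_te)))   # mean absolute error
```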
Total quality management is considered one of the modern scientific approaches practiced by production and service organizations alike to provide outputs of the appropriate required quality according to the needs and desires of customers. Providing and satisfying the appropriate total quality management requirements enables the organization to continue and grow in light of increasing competition, and to face the risks it may encounter in a manner that allows them to be addressed and avoided when they recur in the future…
In this paper, a mathematical model consisting of a prey-predator system with disease in both populations is proposed and analyzed. The existence, uniqueness, and boundedness of the solution are discussed. The existence and stability of all possible equilibrium points are studied. Numerical simulation is carried out to investigate the global dynamical behavior of the system.
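To indicate what such a numerical simulation involves, the sketch below integrates an illustrative eco-epidemiological system with susceptible and infected prey and predators using SciPy; the equations and parameter values are placeholders, not the model analyzed in the paper.

```python
# Illustrative prey-predator system with disease in both populations:
# susceptible/infected prey (S, I) and susceptible/infected predators (P, Q).
import numpy as np
from scipy.integrate import solve_ivp

def model(t, y, r=1.0, K=100.0, beta=0.02, a=0.01, e=0.5,
          lam=0.05, d_I=0.1, d_P=0.1, d_Q=0.2):
    S, I, P, Q = y
    dS = r * S * (1 - (S + I) / K) - beta * S * I - a * S * (P + Q)
    dI = beta * S * I - a * I * (P + Q) - d_I * I
    dP = e * a * S * P - lam * P * Q - d_P * P
    dQ = e * a * I * Q + lam * P * Q - d_Q * Q
    return [dS, dI, dP, dQ]

sol = solve_ivp(model, (0, 200), [50.0, 5.0, 10.0, 2.0], dense_output=True)
print(sol.y[:, -1])   # state at the final time, to inspect long-run behavior
```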
In this research, a qualitative seismic processing and interpretation study is carried out using 3D seismic reflection data from the East-Baghdad oil field in the central part of Iraq. A direct hydrocarbon indicator (DHI) technique called Amplitude Versus Offset or Angle (AVO or AVA) is used. For this purpose, a cube of pre-stack 3D seismic data was chosen, in addition to the available data of wells Z-2 and Z-24. These data were processed and interpreted using the HRS-9 software, through which the AVO response within the Zubair Formation was studied and analyzed. Many AVO processing operations were carried out, including AVO processing (pre-conditioning for gathe…
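One common AVO computation behind such an analysis can be sketched as follows (hypothetical amplitudes, not tied to the HRS-9 workflow described in the paper): fitting the two-term Shuey approximation to amplitudes picked from an angle gather yields the intercept and gradient attributes that are then examined at the target horizon.

```python
# Two-term Shuey fit: R(theta) ~ A + B * sin^2(theta), where A is the AVO
# intercept and B the gradient. Angles and amplitudes are hypothetical picks.
import numpy as np

angles_deg = np.array([5, 10, 15, 20, 25, 30])                 # incidence angles
amplitudes = np.array([0.11, 0.10, 0.08, 0.05, 0.02, -0.02])   # picked reflectivity

x = np.sin(np.radians(angles_deg)) ** 2
gradient, intercept = np.polyfit(x, amplitudes, 1)              # slope = B, const = A

print(f"AVO intercept A = {intercept:.3f}, gradient B = {gradient:.3f}")
# The sign pattern of (A, B) at a horizon is one of the criteria used when
# classifying AVO anomalies as possible direct hydrocarbon indicators.
```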
In this paper, a new hybridization of supervised principal component analysis (SPCA) and stochastic gradient descent techniques, called SGD-SPCA, is proposed for real large datasets that have a small number of samples in a high-dimensional space. SGD-SPCA is proposed as an important tool that can be used to diagnose and treat cancer accurately. When we have large datasets that require many parameters, SGD-SPCA is an excellent method, and it can easily update the parameters when a new observation arrives. Two cancer datasets are used: the first is for leukemia and the second is for small round blue cell tumors. Simulated datasets are also used to compare principal component analysis (PCA), SPCA, and SGD-SPCA. The results sh…
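A hedged sketch of a supervised-PCA-style pipeline with a stochastic gradient update is shown below (an illustration of the general idea, not the authors' exact SGD-SPCA algorithm): features are first screened by their correlation with the labels, and the leading component of the retained features is then estimated with Oja's stochastic gradient rule instead of a full eigen-decomposition, so it can be updated as new observations arrive. The data are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 500))                   # 200 samples, 500 features
y = X[:, :10].sum(axis=1) + rng.normal(size=200)  # labels driven by 10 features

# 1) Supervised screening: keep the features most correlated with y.
Xc = X - X.mean(axis=0)
yc = y - y.mean()
scores = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
keep = np.argsort(scores)[-50:]                   # retain the top 50 features
Z = Xc[:, keep]

# 2) Leading principal component via Oja's stochastic gradient rule.
w = rng.normal(size=Z.shape[1])
w /= np.linalg.norm(w)
eta = 0.01
for epoch in range(20):
    for i in rng.permutation(Z.shape[0]):
        proj = Z[i] @ w
        w += eta * proj * (Z[i] - proj * w)       # Oja update
        w /= np.linalg.norm(w)                    # keep the direction unit-length

component_scores = Z @ w                          # first supervised component
print(component_scores[:5])
```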
The paper is concerned with posterior analysis of five exponentiated distributions (Weibull, Exponential, Inverted Weibull, Pareto, and Gumbel). Expressions for the Bayes estimators of the shape parameters have been derived under four different prior distributions, assuming four different loss functions. The posterior predictive distributions have been obtained, and the estimators are compared through their mean squared errors over different sample sizes generated by simulation. In general, the performance of the estimators under the chi-square prior with the squared error loss function is the best.
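Two standard relations assumed throughout this kind of comparison can be stated explicitly (generic notation, not the paper's own derivations): under squared error loss the Bayes estimator of a shape parameter is its posterior mean, and the estimators are ranked by their simulated mean squared error.

```latex
\[
  \hat{\alpha}_{SE} \;=\; E(\alpha \mid \underline{x})
  \;=\; \int_{0}^{\infty} \alpha \, \pi(\alpha \mid \underline{x}) \, d\alpha ,
  \qquad
  \mathrm{MSE}(\hat{\alpha}) \;=\; \frac{1}{R}\sum_{r=1}^{R}
  \bigl(\hat{\alpha}^{(r)} - \alpha\bigr)^{2},
\]
```

where $\pi(\alpha \mid \underline{x})$ is the posterior density of the shape parameter and $R$ is the number of simulated replications.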
A new, simple, and sensitive spectrophotometric method is described for the quantification of Nifedipine (NIF) and its pharmaceutical formulations. The selective method is based on the reduction of the NIF nitro group to a primary amino group using zinc powder with hydrochloric acid. The produced aromatic amine was subjected to an oxidative coupling reaction with pyrocatechol and ammonium ceric nitrate to form an orange-colored product measured spectrophotometrically at a maximum absorption of 467 nm. The product was determined through a flow injection analysis (FIA) system, and all the chemical and physical parameters were optimized. Beer's law was obeyed over the concentration range 5.0 to 140.0 μg.mL-1, with a limit of detection and quantitation…
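The working relations behind such a calibration are the standard ones below (the paper's own numerical limits are truncated above; σ is the standard deviation of the blank response and S the slope of the calibration curve):

```latex
\[
  A \;=\; \varepsilon\, b\, c , \qquad
  \mathrm{LOD} \;=\; \frac{3.3\,\sigma}{S}, \qquad
  \mathrm{LOQ} \;=\; \frac{10\,\sigma}{S},
\]
```

where $A$ is the measured absorbance, $\varepsilon$ the molar absorptivity, $b$ the path length, and $c$ the analyte concentration.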