Lip-reading is a field whose importance has grown significantly in recent years, particularly with the widespread use of deep learning techniques. Lip-reading is essential for speech recognition in noisy environments and for people with hearing impairments. It refers to recognizing spoken sentences from visual information acquired from lip movements. The lip region also presents several difficulties, especially for male speakers, whose mustache and beard may partially cover the lips. This paper proposes an automatic lip-reading system that recognizes and classifies short English sentences spoken by speakers using deep learning networks. Frames are extracted from the input video, and each frame is passed to the Viola-Jones detector to locate the face region. Then 68 facial landmarks are determined, and landmarks 48 to 68, which delineate the lip area, are used to build a binary mask from which the lips are extracted. The contrast of the lip image is then enhanced by applying contrast adjustment to improve its quality. Finally, sentences are classified using two deep learning models: AlexNet and VGG-16 Net. The database consists of 39 participants (32 males and 7 females), each of whom repeats the short sentences five times. The results show an accuracy of 90.00% for AlexNet and 82.34% for VGG-16 Net, from which we conclude that AlexNet classifies short sentences better than VGG-16 Net.
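The mask-and-enhance steps of the pipeline above can be sketched in NumPy. This is a minimal illustration, not the paper's implementation: the lip landmark coordinates are hypothetical (a real pipeline would obtain them from a 68-point facial landmark detector), and contrast adjustment is shown as simple min-max stretching.

```python
import numpy as np

def polygon_mask(h, w, pts):
    """Binary mask filled inside the closed polygon `pts` ((x, y) points),
    using an even-odd scanline (ray-casting) test per image row."""
    mask = np.zeros((h, w), dtype=bool)
    n = len(pts)
    for y in range(h):
        xs = []
        for i in range(n):
            (x1, y1), (x2, y2) = pts[i], pts[(i + 1) % n]
            if (y1 <= y < y2) or (y2 <= y < y1):      # edge crosses this row
                xs.append(x1 + (y - y1) * (x2 - x1) / (y2 - y1))
        xs.sort()
        for a, b in zip(xs[::2], xs[1::2]):           # fill between crossing pairs
            mask[y, int(np.ceil(a)):int(np.floor(b)) + 1] = True
    return mask

def stretch_contrast(img):
    """Simple min-max contrast stretching to the full 0..255 range."""
    img = img.astype(float)
    lo, hi = img.min(), img.max()
    scale = 255.0 / (hi - lo) if hi > lo else 0.0
    return np.rint((img - lo) * scale).astype(np.uint8)

# Hypothetical lip landmarks (a rough mouth outline) inside a 100x100 frame.
lip_pts = [(30, 55), (45, 48), (60, 48), (75, 55), (60, 68), (45, 68)]
mask = polygon_mask(100, 100, lip_pts)
frame = np.random.default_rng(0).integers(80, 160, (100, 100)).astype(np.uint8)
lip_roi = np.where(mask, frame, 0)       # lip pixels kept, background zeroed
enhanced = stretch_contrast(frame)       # contrast-adjusted frame
```

The binary mask isolates the lip region before enhancement; min-max stretching stands in for whatever contrast-adjustment operator the authors used.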
This research addresses the practical side through case studies of construction projects across several Iraqi governorates. It includes a field survey to identify the impact of parametric costs on construction projects, comparing the survey findings with the results of the analysis to assess their validity and accuracy, and it adopts personal interviews to establish the actual state of construction projects. After comparing the field data with measurements from construction projects in both the public and private sectors, the results showed a correlation of 97.8% between the expected and actual cost change, which means the data can be adopted in the re…
Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and for identifying elastic rock properties and fluid types. In the present study it was applied to reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west of the Bonaparte Basin, Australia. The AVO response along the 2D pre-stack seismic data in the Laminaria High, NW shelf of Australia, was also investigated. Three hypotheses were proposed to investigate the AVO behaviour of the amplitude anomalies, in which three different factors (fluid substitution, porosity, and thickness, via a wedge model) were tested. The AVO models with the synthetic gathers were analysed using log information to find which of these is the…
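As a sketch of how an AVO response is quantified, the two-term Shuey approximation R(θ) ≈ A + B·sin²θ relates reflectivity to incidence angle through an intercept A and a gradient B, which can be fitted by least squares from an angle gather. The values below are hypothetical, not taken from the Laminaria High data:

```python
import numpy as np

# Shuey's two-term approximation: R(theta) ~ A + B * sin^2(theta),
# where A is the AVO intercept and B the gradient. Illustrative values only.
theta = np.radians(np.arange(5, 41, 5))        # incidence angles 5..40 degrees
A_true, B_true = 0.08, -0.25                   # hypothetical intercept/gradient
rng = np.random.default_rng(1)
r = A_true + B_true * np.sin(theta) ** 2 + rng.normal(0, 0.002, theta.size)

# Least-squares fit of [A, B] from the angle-dependent reflectivity.
G = np.column_stack([np.ones_like(theta), np.sin(theta) ** 2])
(A_hat, B_hat), *_ = np.linalg.lstsq(G, r, rcond=None)
```

Cross-plotting the fitted A and B per time sample is the usual basis for classifying AVO anomalies.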
This study estimates the scale parameter, location parameter, and reliability function of the Extreme Value (EXV) distribution by two methods, namely:
- Maximum Likelihood Method (MLE).
- Probability Weighted Moments Method (PWM).
Simulation was used to generate the samples required to estimate the parameters and the reliability function, with sample sizes (n = 10, 25, 50, 100); specified true values were assigned to the parameters, and each simulation experiment was replicated (RP = 1000) times.
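As an illustration of the PWM approach (assuming the Gumbel, i.e. maximum, form of the Extreme Value distribution), a minimal NumPy sketch: samples are drawn by the inverse-CDF transform, the standard PWM estimators sigma = (2*b1 - b0)/ln 2 and mu = b0 - gamma*sigma are applied, and the reliability function R(t) = 1 - F(t) is evaluated. The parameter values are illustrative, not those of the study:

```python
import numpy as np

EULER = 0.5772156649  # Euler-Mascheroni constant (gamma)

def rvs_gumbel(mu, sigma, n, rng):
    """Sample the Extreme Value (Gumbel, max) distribution by inverse CDF:
    F(x) = exp(-exp(-(x - mu)/sigma))."""
    u = rng.uniform(size=n)
    return mu - sigma * np.log(-np.log(u))

def pwm_gumbel(x):
    """Probability Weighted Moments estimators for the Gumbel distribution:
    b0 = sample mean, b1 = (1/n) * sum((i-1)/(n-1) * x_(i)) over sorted x."""
    x = np.sort(x)
    n = len(x)
    b0 = x.mean()
    b1 = np.sum((np.arange(n) / (n - 1)) * x) / n
    sigma = (2 * b1 - b0) / np.log(2)
    return b0 - EULER * sigma, sigma

def reliability(t, mu, sigma):
    """Reliability function R(t) = 1 - F(t) for the Gumbel distribution."""
    return 1 - np.exp(-np.exp(-(t - mu) / sigma))

rng = np.random.default_rng(42)
sample = rvs_gumbel(5.0, 2.0, 100_000, rng)   # true mu = 5, sigma = 2
mu_hat, sigma_hat = pwm_gumbel(sample)
```

A simulation study like the one described would repeat this for each n and each replicate, comparing the PWM estimates against the MLE by mean squared error.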
This study aimed to investigate the effect of using computers on the efficiency of a training programme for science teachers in the Ajloun District in Jordan. It addressed the following questions:
1- What is the effect of using the computer program for the two groups (the experimental and control groups)?
2- Are there any statistically significant differences in the effect of the computer program between the two groups?
3- Are there any statistically significant differences in the effect of the computer program attributable to sex (male or female)?
The population of the study consisted of all the science teachers in the educational directorate of the Ajloun district for the academic year 2009-2010, numbering 120 (male and female). The sample of the study…
Steganography is defined as hiding confidential information in some other chosen medium without leaving any clear evidence of changing the medium's features. Most traditional hiding methods embed the message directly in the cover medium (text, image, audio, or video). Some hiding techniques leave a negative effect on the cover image, so the change in the carrier medium can sometimes be detected by humans and machines. The purpose of the suggested hiding method is to make this change undetectable. The current research focuses on using a complex method, based on a spiral search, to prevent the detection of hidden information by humans and machines; the Structural Similarity Index (SSIM) measure is used to assess the accuracy and quality…
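A minimal sketch of the quality-measurement step, under simplifying assumptions: bits are hidden in least-significant bits in plain sequential order (the paper's spiral-search ordering is not reproduced here), and a single-window (global) SSIM is computed rather than the usual windowed average:

```python
import numpy as np

def embed_lsb(cover, bits):
    """Hide a bit string in the least significant bits of the first
    len(bits) pixels (sequential order; the paper uses a spiral order)."""
    stego = cover.flatten().copy()
    stego[:len(bits)] = (stego[:len(bits)] & 0xFE) | np.array(bits, dtype=np.uint8)
    return stego.reshape(cover.shape)

def ssim(x, y, L=255):
    """Global Structural Similarity Index (one window over the whole image)."""
    C1, C2 = (0.01 * L) ** 2, (0.03 * L) ** 2   # standard stabilizing constants
    x, y = x.astype(float), y.astype(float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()           # covariance
    return ((2 * mx * my + C1) * (2 * cxy + C2)) / \
           ((mx**2 + my**2 + C1) * (vx + vy + C2))

rng = np.random.default_rng(7)
cover = rng.integers(0, 256, (64, 64)).astype(np.uint8)
bits = rng.integers(0, 2, 256)                   # 256 hypothetical message bits
stego = embed_lsb(cover, bits)
quality = ssim(cover, stego)                     # near 1.0: change is tiny
```

Because each embedded bit perturbs a pixel by at most one grey level, the SSIM between cover and stego stays very close to 1, which is the sense in which the hiding is "undetectable" by this metric.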
This study includes analytical methods for the determination of the drug amoxicillin trihydrate (Amox.) in some pharmaceutical preparations using the cobalt ion (Co(II)) as the complexing metal. The best conditions for complexation were: a reaction time of 20 minutes, pH = 1.5, and a reaction temperature of 70 °C. Benzyl alcohol was the best solvent for extracting the complex.
Keywords: Amoxicillin, Cobalt(II), Complex, Molar ratio.
Logistic regression is one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), where the dependent variable is a binary response taking two values (one when a specific event occurs and zero when it does not), such as (injured and uninjured, married and unmarried). A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate the binary-response logistic regression model by adopting the Jackknife…
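A minimal sketch of ridge-penalized binary logistic regression fitted by Newton's method (IRLS), with synthetic near-collinear predictors to imitate the multicollinearity setting; the data, coefficients, and penalty value are illustrative only, not the study's:

```python
import numpy as np

def fit_logistic_ridge(X, y, lam=1.0, iters=50):
    """Binary logistic regression fitted by Newton's method with an L2
    (ridge) penalty `lam` on the coefficients to tame multicollinearity."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        p_hat = 1 / (1 + np.exp(-X @ beta))          # fitted probabilities
        W = p_hat * (1 - p_hat)                      # IRLS weights
        grad = X.T @ (y - p_hat) - lam * beta        # penalized score
        hess = (X * W[:, None]).T @ X + lam * np.eye(p)
        beta += np.linalg.solve(hess, grad)          # Newton update
    return beta

# Synthetic binary-response data with two nearly collinear predictors.
rng = np.random.default_rng(3)
x1 = rng.normal(size=500)
x2 = x1 + rng.normal(0, 0.01, 500)                   # near-duplicate of x1
X = np.column_stack([np.ones(500), x1, x2])
y = (rng.uniform(size=500) < 1 / (1 + np.exp(-(0.5 + 1.5 * x1)))).astype(float)

beta = fit_logistic_ridge(X, y, lam=1.0)
acc = (((X @ beta) > 0).astype(float) == y).mean()   # in-sample accuracy
```

Without the ridge term the Hessian here would be nearly singular because x1 and x2 are almost identical; the penalty keeps the Newton step well conditioned, which is the motivation for combining the two estimation methods in the abstract.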
The study of statistical distributions aims to obtain the best description of sets of phenomena, each of which follows the behaviour of one of those distributions. Estimating the parameters of such distributions is an essential part of studying a variable's behaviour that cannot be neglected. This research is therefore an attempt to reach the best method for estimating the generalized linear failure rate distribution, studying the theoretical side using statistical estimation methods such as maximum likelihood, the least squares method, and a mixing method (the suggested method).
The research…