Computer literacy is an urgent necessity for university students, given the rapid development of communication technology in this era and the abundant flow of information. University transactions, administrative and academic alike, rely mainly on the computer; semester registration, first of all, is carried out by computer. Computer literacy has several characteristics that distinguish it from other fields: it cannot be defined absolutely, and its levels are difficult to delimit, because the specifications of a computer-literate individual differ from one person to another and from one time to another; what counts as a luxury in one country may be a necessity in another. To measure the level of computer literacy among university students, a computerized scale of (40) multiple-choice items, each with five alternatives, was built. To assess their electronic information-search skills, a computerized scale of (21) items was prepared, using a five-point Likert format. The results showed that the students possess computer literacy and, accordingly, electronic information-search skills.
Proxy-based sliding mode control (PSMC) is an improved version of PID control that combines the features of PID and sliding mode control (SMC) with continuous dynamic behaviour. However, the stability of the control architecture may not be well addressed. Consequently, this work focuses on modifying the original PSMC by adding an adaptive approximation compensator (AAC) term for vibration control of an Euler-Bernoulli beam. The role of the AAC term is to compensate for unmodelled dynamics and to simplify the stability proof. The stability of the proposed control algorithm is systematically proved using Lyapunov theory. The multi-modal equation of motion is derived using the Galerkin method.
Artificial intelligence algorithms have been used in recent years in many scientific fields. We suggest employing the flower pollination algorithm in the environmental field to find the best estimate of the semi-parametric regression function with measurement errors in both the explanatory variables and the dependent variable; measurement errors, rather than exact measurements, appear frequently in fields such as chemistry, the biological sciences, medicine, and epidemiological studies. We estimate the regression function of the semi-parametric model by estimating its parametric and non-parametric parts; the parametric part is estimated using instrumental-variables methods (the Wald method, Bartlett's method, and Durbin's method).
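The grouping idea behind the Wald instrumental-variables estimate can be illustrated with a short sketch. This is a minimal, hypothetical implementation (the data, instrument, and parameter values below are invented for illustration and are not from the paper):

```python
import numpy as np

def wald_estimate(z, x, y):
    """Wald's method: group observations by a binary instrument z and
    take the ratio of group-mean differences as the slope estimate.
    The instrument must be correlated with the true regressor but
    independent of the measurement error."""
    z = np.asarray(z, bool)
    slope = (y[z].mean() - y[~z].mean()) / (x[z].mean() - x[~z].mean())
    intercept = y.mean() - slope * x.mean()
    return intercept, slope

# synthetic data: true model y = 1 + 2*x, regressor observed with error
rng = np.random.default_rng(0)
x_true = rng.normal(0.0, 1.0, 2000)
x_obs = x_true + rng.normal(0.0, 0.3, 2000)      # measurement error in x
y = 1.0 + 2.0 * x_true + rng.normal(0.0, 0.2, 2000)
z = (x_true + rng.normal(0.0, 0.5, 2000)) > 0.0  # binary instrument
b0, b1 = wald_estimate(z, x_obs, y)              # b1 should be near 2
```

Because the instrument is independent of the measurement error, the ratio of group means does not suffer the attenuation bias that ordinary least squares would show here.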
This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time via hybrid integration of the classical numerical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique, creating a random distribution for the time-dependent model parameters. The LHS technique enables the MLHFD method to produce fast variation of the parameter values via a number of multidimensional simulations (100, 1000 and 5000). The generated Latin hypercube sample, which is random (non-deterministic) in nature, is further integrated into the finite difference discretization.
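The Latin hypercube sampling step itself is simple to sketch. The following minimal example (the parameter ranges are invented for illustration and are not the paper's MLHFD code) generates a stratified sample over hypothetical model-parameter ranges:

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Return n points in [0, 1)^d with exactly one point per
    equal-width stratum in every dimension (the defining LHS property)."""
    # one uniform draw inside each of the n strata, per dimension
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        rng.shuffle(u[:, j])          # decouple the dimensions
    return u

rng = np.random.default_rng(42)
unit = latin_hypercube(100, 3, rng)   # 100 samples of 3 parameters

# map the unit-cube sample to hypothetical parameter ranges
lower = np.array([0.10, 0.01, 0.05])
upper = np.array([0.50, 0.10, 0.20])
params = lower + unit * (upper - lower)   # 100 parameter sets
```

Each column of `unit` places one draw in every one of the 100 equal-probability bins, which is what gives LHS better coverage than plain Monte Carlo for the same number of simulations.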
In information security, fingerprint verification is one of the most common recent approaches for verifying human identity through a distinctive pattern. The verification process works by comparing a pair of fingerprint templates and measuring the similarity between them. Several research studies have utilized different techniques for the matching process, such as fuzzy vault and image filtering approaches. Yet these approaches still suffer from imprecise articulation of the interesting patterns in the biometrics. Deep learning architectures such as the convolutional neural network (CNN) have been used extensively for image processing and object detection tasks and have shown outstanding performance.
Background: Coronavirus, which causes respiratory illness, has been a public health issue in recent decades. Because the clinical symptoms of infection are not always specific, it is difficult to expose all suspected cases to qualitative testing in order to confirm or rule out infection. Methods: Seventy-three scientific articles and studies were retrieved using PubMed, Medline, ResearchGate and Google Scholar. The search keywords used were COVID-19, coronavirus, blood parameters, and saliva. Results: This review reports the changes in the blood and saliva tests of those infected with COVID-19. COVID-19 is a systemic infection.
... Show MoreThe particle-hole state densities have been calculated for 232Th in
the case of incident neutron with , 1 Z Z T T T T and 2 Z T T .
The finite well depth, surface effect, isospin and Pauli correction are
considered in the calculation of the state densities and then the
transition rates. The isospin correction function ( ) iso f has been
examined for different exciton configurations and at different
excitation energies up to 100 MeV. The present results are indicated
that the included corrections have more affected on transition rates
behavior for , , and above 30MeV excitation energy
The aim of this research is to estimate the parameters of the linear regression model with errors following an ARFIMA model, using the wavelet method based on maximum likelihood, alongside approximate generalized least squares and ordinary least squares. We applied the estimators to real data: the monthly inflation and dollar exchange rate series obtained from the Central Statistical Organization (CSO) for the period from 1/2005 to 12/2015. The results showed that the wavelet maximum likelihood (WML) estimator was the most reliable and efficient of the estimators, and that changing the fractional difference parameter (d) does not affect the results.
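The fractional difference parameter d mentioned above defines the ARFIMA filter (1 - B)^d, whose binomial-expansion weights are easy to compute. A minimal illustrative sketch (this is generic fractional differencing, not the paper's wavelet estimator):

```python
import numpy as np

def frac_diff_weights(d, k):
    """First k weights of the binomial expansion of (1 - B)^d:
    w_0 = 1, w_j = w_{j-1} * (j - 1 - d) / j."""
    w = np.empty(k)
    w[0] = 1.0
    for j in range(1, k):
        w[j] = w[j - 1] * (j - 1 - d) / j
    return w

def frac_diff(x, d):
    """Apply the fractional difference filter, truncated at the
    start of the series."""
    x = np.asarray(x, float)
    w = frac_diff_weights(d, len(x))
    return np.array([np.dot(w[:t + 1], x[t::-1]) for t in range(len(x))])
```

For d = 1 the weights reduce to [1, -1, 0, ...] and the filter is the ordinary first difference; non-integer d between 0 and 0.5 gives the long-memory behaviour that ARFIMA models capture.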
The flexible job-shop scheduling problem (FJSP) is one of the problem instances arising in flexible manufacturing systems. It is considered very complex to control; hence, generating a control system for this problem domain is difficult. FJSP inherits the characteristics of the job-shop scheduling problem, with an additional decision level beyond sequencing that allows each operation to be processed on any machine among a set of available machines at a facility. In this article, we present an Artificial Fish Swarm Algorithm with Harmony Search for solving the flexible job-shop scheduling problem. It is based on a new harmony improvised from the results obtained by the artificial fish swarm algorithm. This improvised solution is compared with the overall best solution.
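The harmony-improvisation step that hybrid approaches like this build on can be sketched generically. The following is a minimal continuous-variable version with invented parameter values (hmcr, par, bw), not the article's FJSP-specific encoding:

```python
import numpy as np

def improvise(memory, lower, upper, hmcr=0.9, par=0.3, bw=0.05, rng=None):
    """Build one new harmony (candidate solution).
    For each decision variable: with probability hmcr reuse a value from
    the harmony memory (optionally pitch-adjusted by up to +-bw),
    otherwise draw a fresh random value from the allowed range."""
    rng = rng if rng is not None else np.random.default_rng()
    n_vars = memory.shape[1]
    new = np.empty(n_vars)
    for i in range(n_vars):
        if rng.random() < hmcr:
            new[i] = memory[rng.integers(len(memory)), i]  # memory consideration
            if rng.random() < par:
                new[i] += rng.uniform(-bw, bw)             # pitch adjustment
        else:
            new[i] = rng.uniform(lower[i], upper[i])       # random selection
    return np.clip(new, lower, upper)

# usage: improvise from a 5-harmony memory of 4 decision variables
memory = np.zeros((5, 4))
lower, upper = np.zeros(4), np.ones(4)
candidate = improvise(memory, lower, upper, hmcr=1.0, par=0.0,
                      rng=np.random.default_rng(1))
```

In the hybrid scheme described above, the memory would be seeded with solutions found by the fish swarm algorithm before improvisation begins.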
A geographic information system (GIS) is a very effective management and analysis tool that relies on geographically located data. The use of artificial neural networks (ANNs) for the interpretation of natural resource data has been shown to be beneficial, and back-propagation networks are among the most widespread and prevalent designs. Combining geographic information systems with artificial neural networks provides a way to decrease the cost of landscape change studies by shortening the time required to evaluate data. Numerous designs and kinds of ANNs have been created; the majority of them are PC-based service domains. Using the ArcGIS Network Analyst extension, service regions around any point on a network can be located.
This paper considers the approximate solution of the hyperbolic one-dimensional wave equation with nonlocal mixed boundary conditions by improved methods that assume the solution is a double power series in orthogonal polynomials, such as Bernstein, Legendre, and Chebyshev polynomials. The solution is ultimately compared with that of the original method, based on standard polynomials, by calculating the absolute error, to verify the validity and accuracy of the performance.
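The benefit of an orthogonal-polynomial basis can be illustrated with a small Chebyshev example. The target function here is an invented smooth stand-in, not the paper's wave-equation solution:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# stand-in for an exact solution, sampled on [-1, 1]
x = np.linspace(-1.0, 1.0, 200)
u = np.sin(np.pi * x)

coeffs = C.chebfit(x, u, deg=12)       # least-squares Chebyshev fit
approx = C.chebval(x, coeffs)
abs_err = np.max(np.abs(u - approx))   # maximum absolute error
```

A degree-12 Chebyshev fit reproduces this smooth function to very high accuracy; fitting the same degree in the standard monomial basis is far more ill-conditioned, which is the practical motivation for the orthogonal bases compared in the paper.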