This paper describes the use of a microcomputer as a laboratory instrument system. The system focuses on the measurement of three weather variables: temperature, wind speed, and wind direction. The instrument is a type of data acquisition system; in this paper we deal with the design and implementation of a data acquisition system based on a personal computer (Pentium) using the Industry Standard Architecture (ISA) bus. The design of this system mainly involves a hardware implementation, together with the software programs used for testing, measurement, and control. The system can be used to display the required information transferred from the external field and processed by the system. Visual Basic with Microsoft Foundation Classes (MFC) is the fundamental tool for Windows programming. It has been used to build a Man-Machine Interface (MMI), which is used for processing and monitoring data acquired from the weather environment.
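The paper's MMI itself is written in Visual Basic; purely as an illustration of the general structure of such a polling acquisition loop (the channel layout, scale factors, and the hypothetical read_adc_channel() stand-in for the ISA-bus register reads are assumptions, not taken from the paper), a minimal Python sketch:

```python
import random
import time

def read_adc_channel(channel):
    """Hypothetical stand-in for an ISA-bus ADC register read (returns raw 12-bit counts)."""
    return random.randint(0, 4095)

def acquire_sample():
    # Convert raw counts to engineering units (scale factors are illustrative only).
    temperature_c = read_adc_channel(0) * 100.0 / 4095.0   # 0..100 deg C
    wind_speed_ms = read_adc_channel(1) * 50.0 / 4095.0    # 0..50 m/s
    wind_dir_deg  = read_adc_channel(2) * 360.0 / 4095.0   # 0..360 degrees
    return temperature_c, wind_speed_ms, wind_dir_deg

if __name__ == "__main__":
    for _ in range(5):
        t, v, d = acquire_sample()
        print(f"T = {t:5.1f} degC   wind = {v:4.1f} m/s   dir = {d:5.1f} deg")
        time.sleep(1.0)
```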
The neutron Fermi age, τ, and the neutron slowing-down density, q(r, τ), have been measured for some materials, namely Graphite and Iron, using the UCS-30 gamma spectrometry system with a NaI(Tl) detector. The technique was applied to Graphite and Iron using Indium foils covered with Cadmium, and the measurements were made at the Indium resonance of 1.46 eV. The materials were exposed to a plane 241Am/Be neutron source with a current activity of 38 mCi. The Fermi age was found to be τ = 297 ± 21 cm2 for Graphite and τ = 400 ± 28 cm2 for Iron. The neutron slowing-down density was also calculated from the measured τ value and the distance from the source.
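For reference, a minimal sketch of the standard age-theory relations for a plane source in an infinite medium, on which such a foil-activation measurement typically relies (the paper's exact analysis is not restated here; S denotes the plane source strength per unit area):

\[
  q(x,\tau) \;=\; \frac{S}{\sqrt{4\pi\tau}}\,
  \exp\!\left(-\frac{x^{2}}{4\tau}\right),
  \qquad
  \tau \;=\; \frac{\langle x^{2}\rangle}{2},
\]

so the Fermi age follows from the mean-square slowing-down distance of the 1.46 eV In-resonance activation profile.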
Conditional logistic regression is often used to study the relationship between event outcomes and specific prognostic factors, with the aim of applying logistic regression and its predictive capabilities to environmental studies. This research seeks to demonstrate a novel approach to implementing conditional logistic regression in environmental research through inference methods predicated on longitudinal data. Statistical analysis of longitudinal data therefore requires methods that properly take into account the within-subject interdependence of the response measurements. If this correlation is ignored, inferences such as statistical tests and confidence intervals can be largely invalid.
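As a rough illustration of fitting a conditional logistic model to grouped longitudinal data (a sketch only, assuming statsmodels' ConditionalLogit and simulated data rather than anything from the study itself):

```python
import numpy as np
from statsmodels.discrete.conditional_models import ConditionalLogit

# Simulated longitudinal data: repeated binary outcomes within subjects (strata).
rng = np.random.default_rng(0)
n_subjects, n_obs = 50, 6
subject = np.repeat(np.arange(n_subjects), n_obs)
exposure = rng.normal(size=n_subjects * n_obs)
# Subject-specific intercepts induce within-subject correlation.
alpha = rng.normal(scale=1.0, size=n_subjects)[subject]
p = 1.0 / (1.0 + np.exp(-(alpha + 0.8 * exposure)))
y = rng.binomial(1, p)

# Conditioning on the subject strata eliminates the subject-level intercepts,
# so the exposure effect is estimated from within-subject contrasts only.
model = ConditionalLogit(y, exposure[:, None], groups=subject)
result = model.fit()
print(result.summary())
```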
The study aims to identify metamemory and perceptual speed among college students, the correlation between metamemory and perceptual speed among college students, and to what extent metamemory contributes to perceptual speed among college students. The sample consisted of a group of students selected randomly by the researcher from five different disciplines at the College of Education for Pure Sciences. To collect the study data, the researcher utilized two scales: a perceptual speed scale translated into Arabic by Al-Shraqawi, Al-Shaikh, and Nadia Abed Al-Salam (1993), and a metamemory scale (2002) translated into Arabic by Abu Ghazal (2007). The results revealed that college students have
This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text (or both) and shows the result of applying the method to plain text (the original message), transforming the intelligible plain text into unintelligible text in order to secure information from unauthorized access and theft; the encryption scheme uses a pseudo-random encryption key generated by an algorithm. All of this is done using the Pascal matrix. Encryption and decryption are carried out using MATLAB as the programming language and Notepad++ to write the input text.
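The paper's implementation is in MATLAB; purely as an illustration of how a Pascal matrix can act as an encryption/decryption pair (an assumption about the general idea, not the paper's exact scheme), the following Python sketch multiplies character-code blocks by a lower-triangular Pascal matrix and recovers them with its exact integer inverse:

```python
import numpy as np
from math import comb

def pascal_lower(n):
    """Lower-triangular Pascal matrix: L[i][j] = C(i, j) for j <= i, else 0."""
    return np.array([[comb(i, j) if j <= i else 0 for j in range(n)]
                     for i in range(n)], dtype=np.int64)

def pascal_lower_inverse(n):
    """Exact integer inverse of the lower Pascal matrix: (-1)**(i+j) * C(i, j)."""
    return np.array([[(-1) ** (i + j) * comb(i, j) if j <= i else 0 for j in range(n)]
                     for i in range(n)], dtype=np.int64)

def encrypt(text, n=8):
    codes = [ord(c) for c in text]
    codes += [0] * (-len(codes) % n)          # pad to a multiple of the block size
    L = pascal_lower(n)
    blocks = np.array(codes, dtype=np.int64).reshape(-1, n)
    return blocks @ L.T                        # integer cipher blocks

def decrypt(cipher_blocks, n=8):
    Linv = pascal_lower_inverse(n)
    codes = (cipher_blocks @ Linv.T).reshape(-1)
    return "".join(chr(c) for c in codes if c != 0)

if __name__ == "__main__":
    cipher = encrypt("Hello, Pascal!")
    print(decrypt(cipher))   # -> Hello, Pascal!
```

Because the inverse of the lower Pascal matrix is itself an integer matrix, decryption recovers the character codes exactly without any modular arithmetic.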
Correlation equations expressing the boiling temperature as a direct function of liquid composition have been tested successfully and applied to predict the azeotropic behavior of multicomponent mixtures and the kind of azeotrope (minimum, maximum, or saddle type) using a modified correlation of the Gibbs-Konovalov theorem. Also, the binary and ternary azeotropic points have been detected experimentally using graphical determination on the basis of experimental binary and ternary vapor-liquid equilibrium data.
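As a minimal sketch of the underlying idea (illustrative data and polynomial form assumed, not the paper's correlation): by the Gibbs-Konovalov theorem, an interior extremum of the isobaric boiling temperature corresponds to an azeotrope, so fitting T(x1) and locating dT/dx1 = 0 identifies and classifies it.

```python
import numpy as np

# Illustrative isobaric boiling-temperature data for a binary mixture
# (values are made up for the sketch, not taken from the paper).
x1 = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0])
T  = np.array([80.1, 76.5, 73.8, 71.9, 70.8, 70.3, 70.5, 71.4, 73.0, 75.4, 78.3])

# Fit a polynomial T(x1) correlation and locate dT/dx1 = 0 inside (0, 1).
coeffs = np.polyfit(x1, T, 4)
dT = np.polyder(coeffs)
roots = np.roots(dT)
interior = [r.real for r in roots if abs(r.imag) < 1e-9 and 0.0 < r.real < 1.0]

for xa in interior:
    curvature = np.polyval(np.polyder(dT), xa)
    kind = "minimum-boiling" if curvature > 0 else "maximum-boiling"
    print(f"azeotrope near x1 = {xa:.3f}, T = {np.polyval(coeffs, xa):.2f} C ({kind})")
```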
In this study, isobaric vapor-liquid equilibrium for two ternary systems: “1-Propanol – Hexane – Benzene” and its binaries “1-Propanol –
In this study, we made a comparison between the LASSO and SCAD methods, two penalization methods for dealing with models in partial quantile regression. The Nadaraya-Watson kernel was used to estimate the non-parametric part; in addition, the rule-of-thumb method was used to estimate the smoothing bandwidth (h). The penalty methods proved to be efficient in estimating the regression coefficients, but the SCAD method was the best according to the mean squared error (MSE) criterion, after the missing data were estimated using the mean imputation method.
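For the non-parametric part, a minimal sketch of a Nadaraya-Watson estimator with a Gaussian kernel and a Silverman-style rule-of-thumb bandwidth (one common choice; the paper's exact kernel and constants are not restated here):

```python
import numpy as np

def rule_of_thumb_bandwidth(x):
    """Silverman-style rule-of-thumb bandwidth for a Gaussian kernel."""
    n = len(x)
    return 1.06 * np.std(x, ddof=1) * n ** (-1 / 5)

def nadaraya_watson(x_train, y_train, x_eval, h):
    """Nadaraya-Watson kernel regression estimate at the points x_eval."""
    u = (x_eval[:, None] - x_train[None, :]) / h   # pairwise scaled distances
    w = np.exp(-0.5 * u ** 2)                      # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = np.sort(rng.uniform(0, 2 * np.pi, 200))
    y = np.sin(x) + rng.normal(scale=0.3, size=x.size)
    h = rule_of_thumb_bandwidth(x)
    grid = np.linspace(0, 2 * np.pi, 5)
    print("h =", round(h, 3))
    print(np.round(nadaraya_watson(x, y, grid, h), 3))
```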
Reliable data transfer and energy efficiency are essential considerations for network performance in resource-constrained underwater environments. One of the efficient approaches for data routing in underwater wireless sensor networks (UWSNs) is clustering, in which data packets are transferred from sensor nodes to the cluster head (CH). Data packets are then forwarded to a sink node in a single-hop or multi-hop manner, which can increase the energy depletion of the CH compared to other nodes. While several mechanisms have been proposed for cluster formation and CH selection to ensure efficient delivery of data packets, less attention has been given to massive data co
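As a toy illustration of cluster formation and residual-energy-based CH selection (a simplified sketch of the general idea only, not the scheme proposed in the paper):

```python
import random

# Toy model of one clustering round in a UWSN: nodes are grouped by grid cell,
# and the node with the highest residual energy in each cluster becomes the CH.
random.seed(0)
nodes = [{"id": i,
          "pos": (random.uniform(0, 100), random.uniform(0, 100)),
          "energy": random.uniform(0.2, 1.0)} for i in range(20)]

def cluster_by_grid(nodes, cell=50.0):
    clusters = {}
    for n in nodes:
        key = (int(n["pos"][0] // cell), int(n["pos"][1] // cell))
        clusters.setdefault(key, []).append(n)
    return clusters

def select_cluster_heads(clusters):
    # Pick the member with maximum residual energy as the CH of each cluster.
    return {key: max(members, key=lambda n: n["energy"])
            for key, members in clusters.items()}

clusters = cluster_by_grid(nodes)
heads = select_cluster_heads(clusters)
for key, ch in heads.items():
    print(f"cluster {key}: CH = node {ch['id']} "
          f"(energy {ch['energy']:.2f}, {len(clusters[key])} members)")
```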
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB). This paper investigates the performance of these two classification methods using the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicated that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient because it is time-consuming and difficult to analyze due to its black-box nature.
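A minimal sketch of such a comparison using scikit-learn (random categorical data stands in for the Car Evaluation attributes so the example is self-contained; the encodings and model settings are illustrative choices, not those of the paper):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OneHotEncoder, OrdinalEncoder
from sklearn.naive_bayes import CategoricalNB
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Toy categorical data standing in for the UCI Car Evaluation attributes.
rng = np.random.default_rng(0)
levels = ["low", "med", "high"]
X = rng.choice(levels, size=(500, 6))
y = (X == "high").sum(axis=1) >= 3   # arbitrary acceptability rule for the sketch

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Naive Bayes on ordinally encoded categories.
enc = OrdinalEncoder()
nb = CategoricalNB().fit(enc.fit_transform(X_tr), y_tr)
nb_acc = accuracy_score(y_te, nb.predict(enc.transform(X_te)))

# Backpropagation network (MLP) on one-hot encoded categories.
ohe = OneHotEncoder(handle_unknown="ignore")
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
mlp.fit(ohe.fit_transform(X_tr), y_tr)
mlp_acc = accuracy_score(y_te, mlp.predict(ohe.transform(X_te)))

print(f"Naive Bayes accuracy: {nb_acc:.3f}   MLP (backprop) accuracy: {mlp_acc:.3f}")
```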