In this research, optical absorption data (the imaginary part of the dielectric function ε2 as a function of photon energy E) were re-analyzed for three samples of a-Si:H thin films using derivative methods. The aim was to resolve the ambiguity that accompanies the interpretation of the optical data of these films and to obtain the optical energy gap (Eg) and the factor r, which is related to the density-of-states distribution near the mobility edge, directly, without the prior assumption about r that is usually made in traditional methods such as the Tauc plot. The derivative method was applied for two choices of the factor q (which is connected with the dependence of the dipole matrix element on the photon energy), q = 0 and q = 2. The results showed that r may take non-integer values. For two of the samples, those prepared by Jackson et al. and by Cody, the derivative plot adopting q = 0 fitted the experimental data better, so Cody's model appears closer to the experimental results than the Tauc model. The third sample, prepared by Ferlauto et al., showed somewhat different behavior, such that neither the Cody nor the Tauc model could be considered a better fit to the experimental data for this sample.
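As a minimal sketch of the derivative idea (the power-law edge form and the symbols below follow the standard Tauc/Cody parameterization and are assumptions, not necessarily the exact expressions used in this work), taking the logarithmic derivative of the edge function removes the need to assume r in advance:

```latex
% Assumed edge parameterization: q = 0 corresponds to a Cody-type plot,
% q = 2 to a Tauc-type plot; C is a constant.
\varepsilon_2(E)\,E^{q} \;\approx\; C\,(E - E_g)^{r}
% Logarithmic derivative of the edge function f(E) = \varepsilon_2(E)\,E^{q}:
\frac{d\,\ln f}{dE} \;=\; \frac{r}{E - E_g}
% Hence the reciprocal derivative is linear in E:
\left(\frac{d\,\ln f}{dE}\right)^{-1} \;=\; \frac{E - E_g}{r}
% A straight-line fit gives slope 1/r and energy-axis intercept E_g,
% so both r and E_g follow directly, with no prior assumption about r.
```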
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB). This paper investigates the performance of these two classification methods using the Car Evaluation dataset. Two models were built, one for each algorithm, and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier but is less efficient, because it is time-consuming to train and difficult to analyze due to its black-box nature.
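A minimal sketch of such a comparison, assuming scikit-learn as the toolkit, a local car.data export of the UCI Car Evaluation dataset, and the standard attribute names (none of these implementation choices are stated in the paper):

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OrdinalEncoder, LabelEncoder
from sklearn.neural_network import MLPClassifier   # backpropagation-trained network
from sklearn.naive_bayes import CategoricalNB      # Naive Bayes for categorical data
from sklearn.metrics import accuracy_score

# Hypothetical local copy of the UCI Car Evaluation dataset (car.data).
cols = ["buying", "maint", "doors", "persons", "lug_boot", "safety", "class"]
df = pd.read_csv("car.data", names=cols)

# Encode the categorical attributes and the class label as integers.
X = OrdinalEncoder().fit_transform(df[cols[:-1]])
y = LabelEncoder().fit_transform(df["class"])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("BNN", MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0)),
                    ("NB", CategoricalNB())]:
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, model.predict(X_te)))
```

The same split is used for both models so that the accuracy figures are directly comparable.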
In this research, the Boltzmann transport equation was solved numerically in order to calculate transport parameters such as the drift velocity W, the ratio of the diffusion coefficient to the mobility D/μ, and the momentum-transfer collision frequency νm, for the purpose of determining the magnetic drift velocity WM and the magnetic deflection coefficient ψ for low-energy electrons moving in an electric field E crossed with a magnetic field B (E×B) in nitrogen, argon, helium and their gas mixtures. These were obtained as functions of E/N (the ratio of the electric field strength to the gas number density), E/P300 (the ratio of the electric field strength to the gas pressure) and D/μ, covering different ranges of E/P300 at a temperature of 300 K (kelvin). The results show
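As a hedged illustration of how these quantities are related in the simplest picture (a single electron with a constant momentum-transfer collision frequency νm and cyclotron frequency ω = eB/m; this is an assumption and is far cruder than a numerical Boltzmann solution), the steady-state momentum balance in crossed E and B fields gives:

```latex
% Steady-state momentum balance, E = E\hat{x}, B = B\hat{z}, \omega = eB/m:
0 = eE\,\hat{x} + e\,\mathbf{v}\times\mathbf{B} - m\,\nu_m\,\mathbf{v}
% Solving the x and y components gives the drift along E and along E \times B:
v_x = \frac{eE}{m}\,\frac{\nu_m}{\nu_m^{2}+\omega^{2}}
    = W\,\frac{\nu_m^{2}}{\nu_m^{2}+\omega^{2}},
\qquad
|v_y| = W\,\frac{\nu_m\,\omega}{\nu_m^{2}+\omega^{2}}
% where W = eE/(m\nu_m) is the zero-field drift velocity; the ratio
% |v_y|/v_x = \omega/\nu_m measures the magnetic deflection of the swarm.
```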
In the last decade, the web has rapidly become an attractive platform and an indispensable part of our lives. Unfortunately, as our dependency on the web has increased, programmers have focused more on functionality and appearance than on security, which has drawn the interest of attackers to exploiting serious security problems that target web applications and web-based information systems, e.g. through SQL injection attacks. SQL injection, in simple terms, is the process of passing SQL code into interactive web applications that employ database services; such applications accept user input, such as form fields, and then include this input in database requests, typically SQL statements, in a way that was not intended
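A minimal illustration of the difference between the vulnerable pattern described above and a parameterized query, using Python's built-in sqlite3 module (the table, column and input values are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

user_input = "' OR '1'='1"  # malicious form input

# Vulnerable: the input is concatenated directly into the SQL statement,
# so the attacker's quote characters change the meaning of the query.
vulnerable = f"SELECT * FROM users WHERE name = '{user_input}'"
print(conn.execute(vulnerable).fetchall())   # returns every row

# Safer: a parameterized query treats the input purely as data.
safe = "SELECT * FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())  # returns no rows
```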
Recently, digital communication has become a critical necessity, and the Internet has become the most widely used and most efficient medium for it. At the same time, data transmitted through the Internet are becoming more vulnerable. Therefore, maintaining the secrecy of data is very important, especially if the data are personal or confidential. Steganography has provided a reliable method for solving such problems. Steganography is an effective technique for secret communication in the digital world, where data sharing and transfer are increasing through the Internet, e-mail and other channels. The main challenges of steganography methods are the undetectability and the imperceptibility of con
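A minimal sketch of one classical image steganography approach, least-significant-bit (LSB) embedding, using NumPy (this is only an illustration of the general idea; it is not the method proposed in the paper):

```python
import numpy as np

def embed_lsb(cover: np.ndarray, message: bytes) -> np.ndarray:
    """Hide message bits in the least significant bit of each pixel value."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.flatten().copy()
    if bits.size > flat.size:
        raise ValueError("cover image too small for this message")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite LSBs
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    """Read back n_bytes of hidden data from the LSBs."""
    bits = stego.flatten()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

# Hypothetical 8-bit grayscale cover image.
cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
stego = embed_lsb(cover, b"hello")
print(extract_lsb(stego, 5))   # b'hello'
```

Because only the lowest bit of each pixel changes, the stego image is visually indistinguishable from the cover, which is exactly the imperceptibility requirement mentioned above.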
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, face heightened vulnerability to hacking because they are the convergence points of numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within green data centers, is a pressing concern. This paper proposes a novel approach to mitigate this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, using collected data on the operating and stoppage times of the case study.
The appropriate probability distribution is the one for which the data lie on, or close to, the fitted line of the probability plot and pass a goodness-of-fit test.
Minitab 17 software was used for this purpose after arranging the collected data and entering them into the program.
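A minimal sketch of the same workflow outside Minitab, assuming SciPy and hypothetical time-between-stoppage data (the candidate distributions and the Kolmogorov-Smirnov test are illustrative choices, not necessarily those used in the study):

```python
import numpy as np
from scipy import stats

# Hypothetical operating times (e.g., hours between stoppages).
times = np.array([120.0, 85.5, 200.3, 150.7, 95.2, 310.4, 175.8, 60.1, 240.6, 130.9])

# Fit a few candidate distributions and test the goodness of fit (KS test).
candidates = {"weibull": stats.weibull_min, "lognormal": stats.lognorm, "exponential": stats.expon}
for name, dist in candidates.items():
    params = dist.fit(times)
    ks_stat, p_value = stats.kstest(times, dist.cdf, args=params)
    print(f"{name:12s} KS={ks_stat:.3f}  p={p_value:.3f}")

# A probability plot shows how close the points lie to the fitted straight
# line (here against a normal distribution, purely for illustration).
(osm, osr), (slope, intercept, r) = stats.probplot(times, dist="norm")
print("probability-plot correlation:", round(r, 3))
```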
The estimation of the ordinary regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts that are joined by threshold point(s). This situation is regarded as a violation of the linearity of regression. Therefore, the multiphase regression model has received increasing attention as an alternative approach that describes the changing behavior of the phenomenon through threshold point estimation. The maximum likelihood estimator (MLE) has been used for both model and threshold point estimation. However, the MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of t
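A minimal sketch of two-phase (segmented) regression with a single threshold, estimated by profiling the threshold over a grid and fitting each segment by least squares (a simplification: the paper's MLE and robust alternatives are not reproduced here, and the data are synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
# Synthetic two-phase data: the slope changes at the true threshold x = 4.
y = np.where(x < 4, 1.0 + 0.5 * x, 3.0 + 2.0 * (x - 4)) + rng.normal(0, 0.3, x.size)

def sse_for_threshold(tau):
    """Sum of squared errors when fitting a separate line on each side of tau."""
    total = 0.0
    for mask in (x < tau, x >= tau):
        coef = np.polyfit(x[mask], y[mask], 1)          # least-squares line
        total += np.sum((y[mask] - np.polyval(coef, x[mask])) ** 2)
    return total

# Profile the threshold over a grid of interior candidate points.
grid = np.linspace(1, 9, 161)
tau_hat = min(grid, key=sse_for_threshold)
print("estimated threshold:", round(float(tau_hat), 2))
```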
The Internet of Things (IoT) contributes to improving the quality of life, as it supports many applications, especially healthcare systems. Data generated by IoT devices are sent to Cloud Computing (CC) for processing and storage, despite the latency caused by the distance. Because of the revolution in IoT devices, the volume of data sent to the CC has been increasing. As a result, increasing congestion on the cloud network became another problem in addition to the latency. Fog Computing (FC) has been used to solve these problems because of its proximity to IoT devices, with only filtered data being sent to the CC. FC is a middle layer located between the IoT devices and the CC layer. Due to the massive data generated by IoT devices on the FC, Dynamic Weighted Round Robin (DWRR)
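A minimal sketch of weighted round-robin dispatching of requests across fog nodes (the node names, weights and interleaving rule are illustrative assumptions; the dynamic weight-update policy of DWRR is not reproduced here):

```python
from itertools import cycle

# Hypothetical fog nodes with weights (higher weight = more capacity).
weights = {"fog-node-1": 3, "fog-node-2": 2, "fog-node-3": 1}

def weighted_round_robin(weights):
    """Yield node names so each node appears in proportion to its weight."""
    schedule = [node for node, w in weights.items() for _ in range(w)]
    return cycle(schedule)

dispatcher = weighted_round_robin(weights)
for request_id in range(8):
    print(f"request {request_id} -> {next(dispatcher)}")
```

In a dynamic variant, the weights would be recomputed from the observed load on each fog node before the next scheduling cycle is built.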
A 3D geological model of a simple petroleum reservoir of the Yamama Formation has been built in the Abu Amood Oil Field using Petrel software, a product of Schlumberger. The model contains the structure, stratigraphy and reservoir properties (porosity and water saturation) in three directions (X, Y and Z). Geologic modeling is the applied science of creating computerized representations of portions of the earth's crust, especially oil and gas fields.
The Yamama Formation in the Abu Amood Oil Field is divided into thirteen zones using well logs and their petrophysical properties, six of which are reservoir zones. From the top of the formation, these six zones are YB-1, YB-2, YB-3, YC-1, YC-2 and YC-3. These reservoir
The dependable and efficient identification of Qin seal script characters is pivotal to the discovery, preservation, and inheritance of the distinctive cultural values embodied by these artifacts. This paper uses histogram of oriented gradients (HOG) image features and an SVM model to build a character recognition model for identifying partial and blurred Qin seal script characters. The model achieves accurate recognition on a small, imbalanced dataset. First, a dataset of Qin seal script image samples is established, and Gaussian filtering is employed to remove image noise. Subsequently, a gamma transformation adjusts the image brightness and enhances the contrast between the character structures and the image background. After a s
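A minimal sketch of the HOG-plus-SVM pipeline described above, assuming scikit-image and scikit-learn and a hypothetical set of labeled character images (the preprocessing parameters such as the Gaussian sigma and the gamma value are illustrative, not taken from the paper):

```python
import numpy as np
from skimage import exposure, filters, io, transform
from skimage.feature import hog
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def preprocess_and_describe(path):
    """Denoise, enhance contrast and compute a HOG descriptor for one image."""
    img = io.imread(path, as_gray=True)
    img = transform.resize(img, (64, 64))                # common input size
    img = filters.gaussian(img, sigma=1.0)               # remove noise
    img = exposure.adjust_gamma(img, gamma=0.8)          # brightness/contrast
    return hog(img, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

# Hypothetical placeholder lists of image paths and their character labels.
paths, labels = ["char_0001.png", "char_0002.png"], ["A", "B"]  # ... many more samples
X = np.array([preprocess_and_describe(p) for p in paths])
y = np.array(labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", class_weight="balanced")  # "balanced" helps with imbalanced classes
clf.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```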