In this paper, we present the first experimental electron momentum density of Cu2Sb, measured at an intermediate resolution (0.6 a.u.) using a 59.54 keV 241Am Compton spectrometer. The measurements are compared with theoretical Compton profiles computed with density functional theory (DFT) within the linear combination of atomic orbitals (LCAO) method. In the DFT calculation, the Perdew-Burke-Ernzerhof (PBE) scheme is employed to treat correlation, whereas exchange is included following the Becke scheme. The various approximations within LCAO-DFT show relatively better agreement with the experimental Compton data. Ionic model calculations for a number of configurations (Cu+x/2)2(Sb-x) (0.0 ≤ x ≤ 2.0) are also performed utilizing free-atom profiles; the ionic model suggests a transfer of 2.0 electrons per Cu atom from the 4s state to the 5p state of Sb.
As we live in the era of the fourth technological revolution, it has become necessary to use artificial intelligence to generate electric power through sustainable solar energy, especially in Iraq. The country has gone through repeated crises and suffers from a severe shortage of electric power because of the wars and calamities it has endured. The impact of that period is still evident in all aspects of daily life experienced by Iraqis, owing to the remnants of wars, siege, terrorism, misguided policies of past and present governments, and regional interventions and their consequences, such as the destruction of electric power stations. Population growth, in turn, must be matched by an increase in electric power stations,
Neural cryptography deals with the problem of "key exchange" between two neural networks using the mutual learning concept. The two networks exchange their outputs (in bits), and the key shared by the two communicating parties is eventually represented in the final learned weights, at which point the two networks are said to be synchronized. The security of neural synchronization is put at risk if an attacker is capable of synchronizing with either of the two parties during the training process.
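Mutual learning of this kind is commonly realized with tree parity machines (TPMs). The sketch below is a minimal illustration, not the paper's exact protocol: two TPMs with illustrative parameters K, N, and L exchange outputs on a common random input and update their weights (Hebbian rule) only when the outputs agree, until the weight matrices, which serve as the shared key, coincide.

```python
import numpy as np

K, N, L = 3, 10, 3           # hidden units, inputs per unit, weight bound (illustrative)
rng = np.random.default_rng(1)

def tpm_output(w, x):
    """Hidden-unit signs and overall output tau of a tree parity machine."""
    sigma = np.sign((w * x).sum(axis=1))
    sigma[sigma == 0] = -1   # break ties
    return sigma, sigma.prod()

wA = rng.integers(-L, L + 1, size=(K, N))   # party A's secret weights
wB = rng.integers(-L, L + 1, size=(K, N))   # party B's secret weights

for step in range(100_000):
    x = rng.choice([-1, 1], size=(K, N))    # public random input
    sA, tA = tpm_output(wA, x)
    sB, tB = tpm_output(wB, x)
    if tA == tB:                             # update only when outputs agree
        for w, s, t in ((wA, sA, tA), (wB, sB, tB)):
            for k in range(K):
                if s[k] == t:                # Hebbian rule on agreeing units
                    w[k] = np.clip(w[k] + t * x[k], -L, L)
    if np.array_equal(wA, wB):               # synchronized: weights are the key
        break
```

For these small parameters the two machines typically synchronize within a few thousand exchanged bits; larger L slows an attacker's synchronization faster than the partners'.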
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local and relative absence of light. They are, first of all, a local decrease in the amount of light that reaches a surface; secondly, they are a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis. However, some factors will affect the detection result due to the complexity of the circumstances. In this paper, a segmentation-based method is presented to detect shadows in an image, and a function concept is used to remove the shadow from the image.
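As a toy illustration of the idea that a shadow is a local decrease in reflected light, the sketch below flags pixels well below the global mean intensity of a synthetic image and rescales them toward the lit-region mean; the threshold rule and the image are illustrative assumptions, not the segmentation method of the paper.

```python
import numpy as np

# synthetic grayscale "outdoor" image: bright ground with a darker shadow patch
img = np.full((100, 100), 200.0)
img[30:60, 30:60] = 80.0                      # shadowed region

# crude detection: pixels well below the global mean intensity are shadow
thresh = img.mean() - 0.5 * img.std()
shadow_mask = img < thresh

# crude removal: rescale shadow pixels toward the mean of the lit pixels
lit_mean = img[~shadow_mask].mean()
restored = img.copy()
restored[shadow_mask] *= lit_mean / img[shadow_mask].mean()
```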
The penalized least squares method is a popular approach to high-dimensional data, where the number of explanatory variables is larger than the sample size. It offers high prediction accuracy and performs estimation and variable selection at once. The penalized least squares method yields a sparse model, that is, a model with few variables, which can be interpreted easily. Penalized least squares is not robust, however: it is very sensitive to the presence of outlying observations. To deal with this problem, we can use a robust loss function to obtain a robust penalized least squares method and thereby a robust penalized estimator.
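As a minimal sketch of the (non-robust) penalized least-squares idea, the coordinate-descent lasso below recovers a sparse coefficient vector when p > n; the data, the penalty value lam, and the helper lasso_cd are illustrative. A robust variant would replace the squared loss with, e.g., a Huber loss.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 100                       # high-dimensional: p > n
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]     # only three active variables
y = X @ beta_true + 0.1 * rng.standard_normal(n)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Coordinate descent for (1/2)||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    r = y.copy()                     # residual for beta = 0
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * beta[j]   # remove coordinate j's contribution
            rho = X[:, j] @ r
            # soft-thresholding: exact zeros give the sparse model
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * beta[j]
    return beta

beta_hat = lasso_cd(X, y, lam=5.0)   # most entries come out exactly zero
```

The soft-thresholding step is what produces exact zeros, i.e. the automatic variable selection the text refers to.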
Eye detection is used in many applications such as pattern recognition, biometrics, surveillance systems, and many other systems. In this paper, a new method is presented to detect and extract the overall shape of one eye from an image, based on two principles: Helmholtz and Gestalt. According to the Helmholtz principle of perception, any observed geometric shape is perceptually "meaningful" if its number of repetitions in an image with random distribution is very small. To achieve this goal, the Gestalt principle states that humans see things either by grouping similar elements or by recognizing patterns. In general, according to the Gestalt principle, humans see things through genera
We present a reliable algorithm for solving homogeneous or inhomogeneous nonlinear ordinary delay differential equations with initial conditions. The solution is calculated as a series with easily computable components. Four examples are considered as numerical illustrations of the method. The results reveal that the semi-analytic iterative method (SAIM) is very effective and simple, and its results are very close to the exact solutions, demonstrating the reliability and efficiency of the method for such problems.
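To give a flavor of how an iterative scheme builds such a series solution term by term, the sketch below uses a Picard-type successive approximation (shown in place of the SAIM recursion itself) on the pantograph-type delay equation y'(t) = -y(t/2) with y(0) = 1; the equation is an illustrative example, not one of the paper's four.

```python
import sympy as sp

t, s = sp.symbols("t s")

# pantograph-type delay ODE: y'(t) = -y(t/2), y(0) = 1
# iterate y_{k+1}(t) = 1 - integral_0^t y_k(s/2) ds
y = sp.Integer(1)
for _ in range(6):
    y = 1 - sp.integrate(y.subs(t, s / 2), (s, 0, t))

# each sweep fixes one more coefficient of the series solution:
# y(t) = 1 - t + t^2/4 - t^3/48 + ...
y_series = sp.expand(y)
```

The low-order coefficients stabilize after as many iterations as their degree, which is the "easily computable components" behavior the abstract describes.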
Electrochemical machining (ECM) is one of the widely used non-conventional machining processes for machining complex and difficult shapes in electrically conducting materials, such as superalloys, Ti-alloys, alloy steel, tool steel, and stainless steel. Use of optimal ECM process conditions can significantly reduce operating, tooling, and maintenance costs and can produce components with higher accuracy. This paper studies the effect of process parameters on surface roughness (Ra) and material removal rate (MRR), and the optimization of process conditions in ECM. Experiments were conducted based on Taguchi's L9 orthogonal array (OA) with three process parameters, viz. current, electrolyte concentration, and inter-electrode gap. Sig
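In a Taguchi analysis the measured responses are typically converted to signal-to-noise (S/N) ratios before the array is analyzed: smaller-the-better for Ra and larger-the-better for MRR. A minimal sketch with made-up repeat readings for one L9 trial (not the paper's data):

```python
import numpy as np

# hypothetical repeat readings for one L9 trial (illustrative values)
ra = np.array([2.1, 2.3, 2.0])      # surface roughness, µm
mrr = np.array([0.42, 0.45, 0.40])  # material removal rate, g/min

# smaller-the-better S/N ratio for Ra
sn_ra = -10 * np.log10(np.mean(ra ** 2))
# larger-the-better S/N ratio for MRR
sn_mrr = -10 * np.log10(np.mean(1.0 / mrr ** 2))
```

The level of each parameter that maximizes the mean S/N ratio across the nine trials is taken as optimal.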
Free Space Optics (FSO) plays a vital role in modern wireless communications due to its advantages over fiber optics and RF techniques, making transmission of huge bandwidth and access to remote places possible. The specific aim of this research is to analyze the bit-error rate (BER) of an FSO communication system when the signal is sent over a turbulent channel, where the fading is described by the Gamma-Gamma model. The signal quality is improved by using an Optical Space-Time Block Code (OSTBC), which reduces the BER. The optical 2×2 Alamouti scheme requires a 14 dB bit-energy-to-noise ratio (Eb/N0) at a 10^-5 bit error rate (BER), giving a 3.5 dB gain compared with the no-diversity scheme. Th
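A Monte-Carlo flavor of the channel model: a Gamma-Gamma irradiance is the product of two independent unit-mean Gamma variates, and the average BER follows by averaging the conditional error rate over the fading. The turbulence parameters, the BPSK-style conditional BER, and the sample size below are illustrative assumptions; the OSTBC combining of the paper is not modeled here.

```python
import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(2)
alpha, beta = 4.0, 2.0       # illustrative turbulence parameters
n = 200_000

# Gamma-Gamma fading: product of two independent unit-mean Gamma variates
I = rng.gamma(alpha, 1 / alpha, n) * rng.gamma(beta, 1 / beta, n)

def Q(x):
    """Gaussian tail function Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * erfc(x / np.sqrt(2))

def avg_ber(snr_db):
    """Average BER over the fading for a BPSK-style conditional error rate."""
    snr = 10 ** (snr_db / 10)
    return Q(np.sqrt(2 * snr) * I).mean()
```

Averaging over the heavy-tailed fading is what flattens the BER curve; diversity schemes such as Alamouti combine several independent fades to steepen it again.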
Autism is a lifelong developmental condition that affects how people perceive the world and interact with each other. An estimated one in more than 100 people has autism, and it affects almost four times as many boys as girls. The commonly used tools for analyzing autism datasets are fMRI, EEG, and, more recently, eye tracking. A preliminary study on the eye-tracking trajectories of patients showed that a rudimentary statistical analysis (principal component analysis) provides interesting results on statistical parameters such as the time spent in a region of interest. Another study, involving tools from Euclidean and non-Euclidean geometry applied to the patients' eye trajectories, also showed interesting results. In this
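The two statistics mentioned, time spent in a region of interest and a principal component analysis of the trajectory, can be sketched on a synthetic gaze track; the random-walk data, the quadrant-shaped ROI, and the 60 Hz sampling rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
# hypothetical eye-tracking trajectory: (x, y) gaze samples at 60 Hz
traj = np.cumsum(rng.standard_normal((500, 2)), axis=0)

# statistic from the abstract: time spent inside a region of interest
roi = (traj[:, 0] > 0) & (traj[:, 1] > 0)     # upper-right quadrant as the ROI
time_in_roi = roi.sum() / 60.0                # seconds at 60 Hz

# principal component analysis of the centered trajectory via SVD
centered = traj - traj.mean(axis=0)
_, svals, _ = np.linalg.svd(centered, full_matrices=False)
explained = svals ** 2 / np.sum(svals ** 2)   # variance share per component
```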
Most of the known cases of strong gravitational lensing involve multiple imaging of an active galactic nucleus. The properties of lensed active galactic nuclei make them promising systems for astrophysical applications of gravitational lensing. We therefore present a simple model for strong lensing in gravitationally lensed systems to calculate the ages of four lensed galaxies. In the present work we adopt the Friedmann models with curvature index k = 0 (the Euclidean case), and the results show good agreement with other models.
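For the flat (k = 0), matter-dominated Friedmann model (the Einstein-de Sitter case), the age of the universe is t0 = 2/(3 H0), and the age at the redshift of a lensed galaxy scales as t(z) = t0 (1 + z)^(-3/2). The sketch below evaluates this with an assumed Hubble constant, not a value taken from the paper.

```python
# Einstein-de Sitter (flat, matter-dominated, k = 0) cosmic age: t0 = 2 / (3 H0)
H0_kms_Mpc = 70.0                   # assumed Hubble constant, km/s/Mpc
Mpc_in_km = 3.0857e19               # kilometres per megaparsec
sec_per_Gyr = 3.1557e16             # seconds per gigayear

H0_per_s = H0_kms_Mpc / Mpc_in_km   # H0 in s^-1
t0_Gyr = (2 / 3) / H0_per_s / sec_per_Gyr

def age_at_z(z):
    """Cosmic age at redshift z in the Einstein-de Sitter model, in Gyr."""
    return t0_Gyr / (1 + z) ** 1.5
```

Subtracting the age at a galaxy's redshift from t0 gives the look-back time, which is how a lensed galaxy's age enters such models.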