We studied the effect of Ca doping on the properties of Bi-based superconductors by adding different amounts of CaO to the Bi2Sr2La2-xCaxCu3O10+δ compound. Consequently, we obtained three samples A, B, and C with x = 0.0, 0.4, and 0.8, respectively. The usual solid-state reaction method was applied under optimum conditions. X-ray diffraction (XRD) analysis showed that samples A and B have tetragonal structures, whereas sample C has an orthorhombic structure. In addition, XRD analysis showed a decrease in the c-axis lattice constant, and thus a decrease in the c/a ratio, across samples A, B, and C, respectively. X-ray fluorescence confirmed that the compositions of samples A, B, and C have a Bi:Sr:La(Ca):Cu ratio of about 2:2:2:3. Resistivity was measured at different temperatures under zero magnetic field and the data were interpreted. Sample A showed semiconducting behavior, sample B showed metallic behavior, and sample C showed superconducting behavior with a zero-resistance transition temperature Tc(offset) of 85 K.
Among the metaheuristic algorithms, population-based algorithms are explorative search algorithms, superior to local search algorithms in exploring the search space to find globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents refining the neighborhood of the search space for more optimal solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA is prone to premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. On the
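The attraction-based update that gives FA its explorative, population-based character can be sketched as follows. This is a generic textbook-style firefly algorithm on a toy objective, not the clustering variant the abstract discusses; all parameter values and names here are illustrative assumptions.

```python
import random
import math

def firefly_minimize(f, dim, n=20, iters=100, alpha=0.2, beta0=1.0, gamma=1.0,
                     lo=-5.0, hi=5.0, seed=0):
    """Minimal firefly algorithm sketch: each firefly moves toward every
    brighter (lower-cost) firefly with distance-decaying attractiveness,
    plus a small random step for exploration."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    fit = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if fit[j] < fit[i]:  # firefly j is brighter than i
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)  # attractiveness
                    pop[i] = [min(hi, max(lo,
                              xi + beta * (xj - xi) + alpha * (rng.random() - 0.5)))
                              for xi, xj in zip(pop[i], pop[j])]
                    fit[i] = f(pop[i])
    best = min(range(n), key=lambda k: fit[k])
    return pop[best], fit[best]

sphere = lambda x: sum(v * v for v in x)  # toy objective
best_x, best_f = firefly_minimize(sphere, dim=2)
```

The lack of a neighborhood (local) search around `best_x` after the swarm converges is exactly the low-exploitation weakness the abstract points out.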
With the revolutionized expansion of the Internet, worldwide information increases the application of communication technology, and the rapid growth of significant data volumes boosts the requirement for secure, robust, and confidential techniques using various effective algorithms. Many algorithms and techniques are available for data security. This paper presents a cryptosystem that combines several substitution cipher algorithms with a circular-queue data structure. The two substitution techniques, a homophonic substitution cipher and a polyalphabetic substitution cipher, are merged in a single circular queue with four different keys for each of them, which produces eight different outputs for
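As a rough illustration of the general idea of driving a substitution cipher from a circular queue of keys: the sketch below rotates a `deque` of keys, applying a Vigenère-style polyalphabetic shift per character. It is a hypothetical toy, not the paper's exact eight-output homophonic/polyalphabetic construction; the function names and keys are invented.

```python
from collections import deque
import string

ALPHA = string.ascii_uppercase

def _crypt(text, keys, sign):
    """Shift-substitute each letter using the key at the front of a
    circular queue, then rotate the queue to the next key."""
    queue = deque(keys)          # circular queue of keys
    out = []
    for i, ch in enumerate(text.upper()):
        if ch not in ALPHA:      # pass non-letters through unchanged
            out.append(ch)
            continue
        key = queue[0]
        shift = sign * ALPHA.index(key[i % len(key)])
        out.append(ALPHA[(ALPHA.index(ch) + shift) % 26])
        queue.rotate(-1)         # advance the circular queue
    return "".join(out)

def encrypt(text, keys):
    return _crypt(text, keys, +1)

def decrypt(text, keys):
    return _crypt(text, keys, -1)
```

Because the queue state evolves identically during encryption and decryption, the round trip recovers the plaintext.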
Great scientific progress has led to widespread information. As information accumulates in large databases, it becomes important to revise and compile this vast amount of data in order to extract hidden information, or to classify the data according to their relations with each other, so as to take advantage of them for technical purposes.
Data mining (DM) is appropriate in this area; hence this research studies the K-Means algorithm for clustering data in a practical application, where the effect on the results can be observed by changing the sample size (n) and the number of clusters (K).
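The K-Means iteration referred to above can be sketched as plain Lloyd's algorithm; the data and helper names below are illustrative, and both the sample size `n` (length of `points`) and the cluster count `k` can be varied to observe their effect, as the study describes.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal Lloyd's K-Means sketch on 2-D points: assign each point to
    its nearest center, then recompute each center as its cluster mean."""
    rng = random.Random(seed)
    centers = list(rng.sample(points, k))  # random initial centers
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                + (p[1] - centers[c][1]) ** 2)
            clusters[j].append(p)
        for j, cl in enumerate(clusters):
            if cl:  # keep old center if a cluster emptied out
                centers[j] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centers, clusters
```

Re-running with different `k` or a different number of points shows how both parameters change the resulting partition.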
Seismic inversion is applied to 3D seismic data to predict the porosity of the carbonate Yamama Formation (Early Cretaceous) in an area located in southern Iraq. A workflow was designed to guide the manual inversion procedure. The first step of the inversion uses a model-based inversion technique to convert the 3D seismic data into 3D acoustic impedance, depending on a low-frequency model and well data, with statistical control at each inversion stage. Then, the 3D acoustic impedance volume, the seismic data, and the porosity well data are trained with multi-attribute transforms to find the best statistical attribute suitable for inverting the direct point measurements of porosity from wells into a 3D porosity distribution.
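In its simplest linear form, the multi-attribute transform step amounts to regressing well porosity on seismic attributes. The sketch below shows ordinary least squares via the normal equations under that assumption; the attribute columns and names are hypothetical, not the study's actual attributes or software.

```python
def fit_linear_attributes(X, y):
    """Ordinary least squares: solve (X^T X) w = X^T y with Gaussian
    elimination and partial pivoting (no external libraries)."""
    n, m = len(X), len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(m)]
         for i in range(m)]                      # X^T X
    b = [sum(X[r][i] * y[r] for r in range(n)) for i in range(m)]  # X^T y
    for col in range(m):                         # forward elimination
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            factor = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= factor * A[col][c]
            b[r] -= factor * b[col]
    w = [0.0] * m                                # back substitution
    for i in range(m - 1, -1, -1):
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, m))) / A[i][i]
    return w
```

Each row of `X` would hold one sample's attribute values (e.g. impedance plus a constant column for the intercept), and `y` the corresponding porosity measurements.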
In real situations, all observations and measurements are not exact numbers but more or less non-exact, also called fuzzy. In this paper, we therefore use approximate non-Bayesian computational methods to estimate the inverse Weibull parameters and reliability function with fuzzy data. The maximum likelihood and moment estimators are obtained as non-Bayesian estimates. The maximum likelihood estimators are derived numerically based on two iterative techniques, namely the Newton-Raphson and the Expectation-Maximization techniques. In addition, the obtained estimates of the parameters and reliability function are compared numerically through a Monte-Carlo simulation study
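For crisp (non-fuzzy) data, the Newton-Raphson step for the inverse Weibull maximum likelihood estimator can be sketched as below, using cdf F(x) = exp(-λ x^(-β)) as one common parameterization. This is an illustrative assumption: the paper's fuzzy-data likelihood and its exact parameterization are not reproduced here.

```python
import math
import random

def invweibull_mle(xs, beta0=1.0, tol=1e-8, max_iter=100):
    """Newton-Raphson on the profile score for the shape beta, then the
    closed-form lam = n / sum(x**-beta) for the scale."""
    n = len(xs)
    sum_log = sum(math.log(x) for x in xs)

    def score(b):  # d/db of the profile log-likelihood
        s = sum(x ** -b for x in xs)
        sl = sum((x ** -b) * math.log(x) for x in xs)
        return n / b - sum_log + n * sl / s

    b = beta0
    for _ in range(max_iter):
        g = score(b)
        h = 1e-6
        dg = (score(b + h) - score(b - h)) / (2 * h)  # numerical derivative
        b_new = b - g / dg
        if b_new <= 0:          # guard against stepping out of the domain
            b_new = b / 2
        if abs(b_new - b) < tol:
            b = b_new
            break
        b = b_new
    lam = n / sum(x ** -b for x in xs)
    return b, lam

# demo on synthetic crisp data: true beta = 2, lam = 1 (fixed seed)
rng = random.Random(1)
xs = [(-math.log(max(rng.random(), 1e-12))) ** -0.5 for _ in range(3000)]
beta_hat, lam_hat = invweibull_mle(xs)
```

The synthetic sample uses the inverse-transform x = (-ln U)^(-1/β) for λ = 1, so the recovered estimates should sit near the true values.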
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (non-polluted) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes and initial parameter values and under different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
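The Downhill Simplex (Nelder-Mead) iteration mentioned above can be sketched compactly with the standard reflection, expansion, contraction, and shrink moves; the coefficients are the textbook defaults and the objective in the usage example is illustrative, not the compound distribution's likelihood.

```python
def nelder_mead(f, x0, step=0.5, iters=200):
    """Compact Downhill Simplex sketch: keep n+1 vertices, replace the
    worst by reflection/expansion/contraction, or shrink toward the best."""
    n = len(x0)
    simplex = [list(x0)] + [[x0[j] + (step if j == i else 0.0)
                             for j in range(n)] for i in range(n)]
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst, second = simplex[0], simplex[-1], simplex[-2]
        centroid = [sum(v[j] for v in simplex[:-1]) / n for j in range(n)]
        refl = [centroid[j] + (centroid[j] - worst[j]) for j in range(n)]
        if f(refl) < f(best):
            expd = [centroid[j] + 2.0 * (centroid[j] - worst[j]) for j in range(n)]
            simplex[-1] = expd if f(expd) < f(refl) else refl
        elif f(refl) < f(second):
            simplex[-1] = refl
        else:
            contr = [centroid[j] + 0.5 * (worst[j] - centroid[j]) for j in range(n)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:  # shrink every vertex halfway toward the best
                simplex = [best] + [[(v[j] + best[j]) / 2 for j in range(n)]
                                    for v in simplex[1:]]
    simplex.sort(key=f)
    return simplex[0]
```

In the study's setting, `f` would be the negative log-likelihood of the compound distribution over its four parameters; being derivative-free is what makes the method attractive there.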
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have only small or inadequate datasets for training DL frameworks. Manual labeling is usually needed to provide labeled data, which typically involves human annotators with a broad background of knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to learn representations automatically. Ultimately, a larger amount of data would generate a better DL model, and performance is also application-dependent. This issue is the main barrier for
This study deals with the application of surface-consistent deconvolution to two-dimensional seismic data from the Block 11 area, within the administrative boundaries of the Najaf and Muthanna Governorates, covering an area of 4822; the processed seismic line (7Gn 21) is 54 km long. The study was conducted within the Processing Department of the Oil Exploration Company. Gap surface-consistent deconvolution was applied, and the best results were obtained with the following parameters: operator length 240, gap 24, and white noise 0.01%. The seismic sections of this type showed improvement, with decay of the existing complications, thus giving good continuity of the reflectors.
In this study, two types of extraction solvents were used to extract the undesirable polyaromatics: the first solvent was furfural, which is used today in the Iraqi refineries, and the second was NMP (N-methyl-2-pyrrolidone).
The studied extraction variables were the extraction temperature, ranging from 70 to 110 °C, and the solvent-to-oil ratio, in the range from 1:1 to 4:1.
The results of this investigation show that the viscosity index of the mixed-medium lubricating oil fraction increases with increasing extraction temperature, reaching 107.82 for NMP extraction at an extraction temperature of 110 °C and a solvent-to-oil ratio of 4:1, while the viscosity index reaches 101 for furfural extraction at the same extraction temperature and the same