This study is, to our knowledge, unique in its field. It combines three branches of technology: photometry, spectroscopy, and image processing. The method processes an image pixel by pixel, mapping each pixel's color to a specific wavelength along the RGB line; an image therefore yields many wavelengths, one from each of its pixels. The results identify the elements on the surface of a comet's nucleus, giving not only their identification but also their mapping across the nucleus. The work considered 12 elements in two comets, Tempel 1 and 67P/Churyumov-Gerasimenko. These elements have strong emission lines in the visible range, which our MATLAB program recognizes while processing the image. The abundance of each element was determined as a percentage relative to iron. In comet Tempel 1, the largest element-to-iron ratio is for potassium, K/Fe ~ 28.2%, while the lowest is Ca/Fe ~ 1.3%. In comet 67P/Churyumov-Gerasimenko, the largest ratio relative to iron is again for potassium, K/Fe ~ 89.5%, while the lowest is Ni/Fe ~ 0.26%. Overall, in both comets the greatest elemental percentage relative to iron is K/Fe: iron is the base element in the structure of both comets, followed by potassium.
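The abstract does not reproduce the actual MATLAB algorithm, but the pixel-color-to-wavelength idea it describes can be sketched as follows. The hue-based mapping and the short emission-line table below are illustrative assumptions, not the paper's implementation:

```python
# Illustrative sketch only: map a pixel's colour to an approximate dominant
# wavelength via its hue, then match that wavelength against strong visible
# emission lines. The mapping and the line table are assumptions.
import colorsys

# Approximate strong visible emission lines (nm) for a few of the elements.
EMISSION_LINES = {"K": 766.5, "Na": 589.0, "Fe": 438.4, "Ca": 422.7}

def pixel_to_wavelength(r, g, b):
    """Crude dominant wavelength (nm) from an 8-bit RGB pixel via its hue."""
    h, _, _ = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    h = min(h, 0.75)              # clamp past violet; greys map to the red end
    return 780 - (h / 0.75) * (780 - 380)

def match_element(wavelength, tol=10.0):
    """Return the element with the nearest line within tol nm, else None."""
    name, line = min(EMISSION_LINES.items(), key=lambda kv: abs(kv[1] - wavelength))
    return name if abs(line - wavelength) <= tol else None

print(match_element(pixel_to_wavelength(200, 40, 30)))  # reddish pixel -> 'K'
```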
The release of untreated municipal solid waste (MSW) onto land is prevalent in developing countries. Reducing the high levels of harmful components in polluted soils requires a proper evaluation of heavy metal concentrations, carried out here at the Kani Qrzhala dump in Erbil between August 2021 and February 2022. The purpose of this research was to examine the impact of improper solid waste disposal on soil properties within the landfill by assessing the contamination risk of eight heavy elements in two separate soil layers, using the geoaccumulation index (I-geo) and the pollution load index (PLI). ArcGIS software was employed to map the spatial distribution of heavy element pollution and the potential ecological risks. The I-geo values in summer…
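Both indices named above have standard definitions, sketched here in Python for reference (Cn is the measured concentration of a metal and Bn its geochemical background value; Python is used for illustration only):

```python
# Standard definitions of the geoaccumulation index and pollution load index.
from math import log2, prod

def i_geo(cn, bn):
    """Geoaccumulation index: I-geo = log2(Cn / (1.5 * Bn))."""
    return log2(cn / (1.5 * bn))

def pli(concentrations, backgrounds):
    """Pollution load index: the n-th root of the product of the
    contamination factors CF = Cn / Bn; PLI > 1 indicates pollution."""
    cfs = [c / b for c, b in zip(concentrations, backgrounds)]
    return prod(cfs) ** (1 / len(cfs))

print(i_geo(1.2, 0.3), pli([1.2, 4.0], [0.3, 2.0]))
```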
The expanded exponentiated power function (EEPF) distribution with four parameters is presented, obtained by applying the exponentiated expansion method to the power function distribution. This method yields a new distribution belonging to the exponential family. We derive the survival function and the failure (hazard) rate function of this distribution and establish some of its mathematical properties. We then estimate the parameters by the developed least squares method using a genetic algorithm (GA), and a Monte Carlo simulation study is conducted to evaluate the performance of the GA-based estimators.
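As a rough illustration of the least-squares-plus-GA estimation idea: the paper's four-parameter EEPF CDF is not reproduced here, so a placeholder two-parameter Weibull-type CDF stands in for it below.

```python
# Illustrative sketch: fit a parametric CDF to the empirical CDF by
# minimizing squared error with a tiny genetic algorithm.
import math, random

def placeholder_cdf(x, a, b):            # stand-in for the actual EEPF CDF
    return 1 - math.exp(-((x / b) ** a))

def sse(params, xs, emp):                # least-squares criterion
    a, b = params
    return sum((placeholder_cdf(x, a, b) - e) ** 2 for x, e in zip(xs, emp))

def ga_fit(xs, emp, pop=40, gens=200):
    """Tiny GA: truncation selection plus Gaussian mutation, no crossover."""
    popn = [(random.uniform(0.1, 5), random.uniform(0.1, 5)) for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda p: sse(p, xs, emp))
        elite = popn[: pop // 4]
        popn = elite + [(max(1e-3, p[0] + random.gauss(0, 0.1)),
                         max(1e-3, p[1] + random.gauss(0, 0.1)))
                        for p in random.choices(elite, k=pop - len(elite))]
    return min(popn, key=lambda p: sse(p, xs, emp))

xs = sorted(random.weibullvariate(2.0, 1.5) for _ in range(200))
emp = [(i + 0.5) / len(xs) for i in range(len(xs))]      # empirical CDF
print(ga_fit(xs, emp))                                    # roughly (1.5, 2.0)
```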
Abstract:
This research aims to compare the Bayesian method and full maximum likelihood for estimating the hierarchical Poisson regression model.
The comparison was done by simulation using different sample sizes (n = 30, 60, 120) and different numbers of replications (r = 1000, 5000) per experiment, with the mean squared error (MSE) adopted as the criterion for comparing the estimation methods and choosing the better way to fit the model. The study concluded that the hierarchical Poisson regression model estimated by full maximum likelihood with sample size n = 30 is the best at representing the maternal mortality data, based on the estimated parameter values.
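A minimal sketch of this simulation design follows, shown for the simplest Poisson-rate model rather than the paper's full hierarchical Poisson regression; the Gamma(a, b) prior and all constants are assumptions for illustration.

```python
# Compare an ML estimator and a Bayes estimator by MSE over simulated samples.
import math, random

def poisson(lam):
    """Knuth's method for a Poisson draw (adequate for small lam)."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= l:
            return k
        k += 1

def simulate_mse(true_lam=3.0, n=30, reps=1000, a=1.0, b=1.0):
    """MSE of the ML estimate (sample mean) versus the Bayes posterior
    mean under a Gamma(a, b) prior, over `reps` simulated samples."""
    mse_ml = mse_bayes = 0.0
    for _ in range(reps):
        y = [poisson(true_lam) for _ in range(n)]
        ml = sum(y) / n                    # maximum likelihood estimate
        bayes = (a + sum(y)) / (b + n)     # Gamma-Poisson posterior mean
        mse_ml += (ml - true_lam) ** 2
        mse_bayes += (bayes - true_lam) ** 2
    return mse_ml / reps, mse_bayes / reps

for n in (30, 60, 120):
    print(n, simulate_mse(n=n))
```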
Semiparametric methods combine parametric and nonparametric methods. They matter in most studies that by their nature demand more advanced, accurate statistical analysis aimed at obtaining efficient estimators. The partial linear regression model is the most popular type of semiparametric model; it consists of a parametric component and a nonparametric component. Estimating the parametric component with certain desirable properties depends on assumptions about that component; in the absence of those assumptions, the parametric component suffers from several problems, for example multicollinearity (explanatory variables that are interrelated with each other). To treat this problem we use…
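For orientation, one standard estimator for the partial linear model y = X beta + g(t) + eps is the Speckman-type approach sketched below; the kernel, bandwidth, and the optional ridge term for multicollinearity are illustrative choices, not necessarily the paper's method.

```python
# Speckman-type estimation: kernel-smooth y and each column of X on t,
# then regress the residuals; alpha adds an optional ridge penalty.
import numpy as np

def kernel_smooth(t, values, h=0.1):
    """Nadaraya-Watson smoother with a Gaussian kernel."""
    w = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
    w /= w.sum(axis=1, keepdims=True)
    return w @ values

def fit_partial_linear(X, y, t, h=0.1, alpha=0.0):
    Xt = X - kernel_smooth(t, X, h)            # partial g(t) out of X
    yt = y - kernel_smooth(t, y, h)            # ... and out of y
    beta = np.linalg.solve(Xt.T @ Xt + alpha * np.eye(X.shape[1]), Xt.T @ yt)
    g_hat = kernel_smooth(t, y - X @ beta, h)  # recover the smooth part
    return beta, g_hat

rng = np.random.default_rng(0)
t = rng.uniform(0, 1, 200)
X = rng.normal(size=(200, 2))
y = X @ np.array([1.0, -2.0]) + np.sin(2 * np.pi * t) + rng.normal(0, 0.1, 200)
print(fit_partial_linear(X, y, t)[0])          # close to [1.0, -2.0]
```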
Learning programming is among the top challenges in computer science education. In response, program visualization (PV) is used as a tool to combat the high failure and drop-out rates in introductory programming courses. Nevertheless, there are rising concerns about the effectiveness of existing PV tools, following the mixed results reported across various studies. Student engagement is also considered a vital factor in building a successful PV, and it is an important part of the learning process in general. Several techniques have been introduced to enhance PV engagement; however, student engagement with PV remains challenging. This paper employed three theories (constructivism, social constructivism, and cognitive load theory)…
Circuits that display the decimal representation of a binary number are often needed, in this paper specifically on a 7-segment display. A circuit that displays the decimal equivalent of an n-bit binary number is designed, and its behavior is described using the Verilog Hardware Description Language (HDL). This HDL description is then used to configure an FPGA to implement the designed circuit.
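The paper's design is written in Verilog; the Python sketch below only models the combinational behavior (binary value to decimal digits to segment patterns). The common-cathode gfedcba segment encoding is an assumption here.

```python
# Split an n-bit value into decimal digits (the job done in hardware by,
# e.g., the double-dabble algorithm) and look up 7-segment patterns.
SEG = [0x3F, 0x06, 0x5B, 0x4F, 0x66, 0x6D, 0x7D, 0x07, 0x7F, 0x6F]

def to_segments(value, n_bits=8):
    """7-segment patterns for the decimal digits of an n-bit value."""
    digits = [int(d) for d in str(value & ((1 << n_bits) - 1))]
    return [SEG[d] for d in digits]

print([hex(s) for s in to_segments(0b11111111)])  # 255 -> 0x5b, 0x6d, 0x6d
```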
As an important resource, entanglement light sources have been used in developing quantum information technologies such as quantum key distribution (QKD). Few experiments implement entanglement-based deterministic QKD protocols, since the security of existing protocols may be compromised in lossy channels. In this work, we report a loss-tolerant deterministic QKD experiment that follows a modified "Ping-Pong" (PP) protocol. The experimental results demonstrate for the first time that a secure deterministic QKD session can be carried out over a channel with an optical loss of 9 dB, based on a telecom-band entangled photon source. This exhibits a conceivable prospect of utilizing entanglement light sources in real-life fiber-based…
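For scale, the quoted channel loss can be converted to a transmittance with the standard decibel relation:

```python
# 9 dB of optical loss means a transmittance of 10**(-9/10),
# i.e. only about 12.6% of photons survive the channel.
loss_db = 9
print(10 ** (-loss_db / 10))  # ~0.126
```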
Interval methods for the verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods can compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods: for some problems the computed error bounds become overly pessimistic, or the integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimation in interval computations. Berz (see [1]) and his co-workers have developed Taylor model methods, which extend interval arithmetic with symbolic computations. The latter is an ef…
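A tiny interval-arithmetic example illustrates the dependency problem mentioned above, one of the overestimation sources that Taylor models mitigate by carrying symbolic terms:

```python
# Evaluating x - x with naive intervals does not give [0, 0]: each operand
# is treated as an independent range, which is the dependency problem.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(1.0, 2.0)
print(x - x)  # [-1.0, 1.0], not [0, 0]
```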