This paper determines the difference between a first, healthy image and a second, infected image by using logic gates. The proposed algorithm was applied first to binary images, second to grayscale images, and third to color images. The algorithm begins by zero-padding the images through convolution to expose more of the image and its features, then enhances them with an edge-detection filter (the Laplacian operator) and smooths them with a mean filter. To determine the change between the original image and the injured one, logic gates, specifically the XOR gate, are applied. Applying the technique to tooth decay, this comparison can locate the injury; the detected difference may be tooth decay, a broken bone, cancerous cells, or infected gums. The XOR gate proved to be the most suitable gate for this technique. A simulation program written in Visual Basic was used to produce the final results.
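The core of the pipeline above, pixel-wise XOR after preprocessing, can be sketched for the binary-image case. This is a minimal illustration, not the paper's Visual Basic implementation; the 4x4 "images" and the padding width are made-up examples.

```python
# Sketch of the abstract's pipeline on binary images:
# zero-pad both images, then XOR them pixel-wise to locate the injury.
# (The Laplacian/mean-filter enhancement steps are omitted for brevity.)

def zero_pad(img, p=1):
    """Surround the image with a border of zeros (the extension step)."""
    w = len(img[0]) + 2 * p
    out = [[0] * w for _ in range(p)]
    for row in img:
        out.append([0] * p + list(row) + [0] * p)
    out += [[0] * w for _ in range(p)]
    return out

def xor_diff(a, b):
    """Pixel-wise XOR: 1 wherever the two binary images disagree."""
    return [[x ^ y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

healthy  = [[0, 1, 1, 0],
            [1, 1, 1, 1],
            [1, 1, 1, 1],
            [0, 1, 1, 0]]
infected = [[0, 1, 1, 0],
            [1, 0, 0, 1],   # a "lesion": pixels flipped to 0
            [1, 1, 0, 1],
            [0, 1, 1, 0]]

diff = xor_diff(zero_pad(healthy), zero_pad(infected))
lesion = sum(map(sum, diff))   # number of changed pixels
print(lesion)                  # -> 3: the XOR map localizes the injury
```

The XOR output is nonzero only at changed pixels, which is why the abstract singles it out among the logic gates: AND/OR responses mix unchanged regions into the result, while XOR isolates the difference.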
The Digital Elevation Model (DEM) is a quantitative description of the Earth's surface that provides essential information about the terrain. DEMs are significant information sources for a number of practical applications that need surface elevation data. Open-source DEM datasets, such as those from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), the Shuttle Radar Topography Mission (SRTM), and the Advanced Land Observing Satellite (ALOS), usually have relatively low accuracy and coarse resolution. The errors in many DEM datasets have already been examined for their significance; dataset quality can be affected by several factors, including the types of sensors and the algorithms …
Non-uniform channelization is a crucial task in cognitive radio receivers for obtaining separate channels from the digitized wideband input signal at different intervals of time. The two main requirements for the channelizer are reconfigurability and low complexity. In this paper, a reconfigurable architecture based on a combination of the Improved Coefficient Decimation Method (ICDM) and the Coefficient Interpolation Method (CIM) is proposed. The proposed Hybrid Coefficient Decimation-Interpolation Method (HCDIM) based filter bank (FB) is able to realize the same number of channels as the ICDM, but with the maximum decimation factor divided by the interpolation factor (L), which leads to less deterioration in stop-band …
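The two coefficient manipulations the abstract combines can be sketched as plain list operations. This is an illustrative sketch of the coefficient-level idea only, not the paper's architecture; the 12-tap prototype is a placeholder, and real use would apply these to a designed lowpass FIR prototype.

```python
# Coefficient decimation keeps every D-th tap of a prototype FIR filter
# (widening/replicating its response), while coefficient interpolation
# inserts L-1 zeros between taps (compressing the response, creating
# images). HCDIM cascades the two for an effective factor of D/L.

def coeff_decimate(h, D):
    """ICDM-style decimation: retain every D-th coefficient."""
    return h[::D]

def coeff_interpolate(h, L):
    """CIM: insert L-1 zeros after each coefficient."""
    out = []
    for tap in h:
        out.append(tap)
        out.extend([0.0] * (L - 1))
    return out

proto = [0.1] * 12                 # placeholder prototype taps
h1 = coeff_decimate(proto, 4)      # 3 taps left: decimation factor D = 4
h2 = coeff_interpolate(h1, 2)      # 6 taps: interpolation factor L = 2
print(len(h1), len(h2))            # -> 3 6
```

The complexity benefit comes from the fact that both operations reuse the stored prototype coefficients rather than redesigning a filter per channel.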
Increasing world demand for renewable energy resources such as wind energy is one of the motivations behind research into optimizing energy production from wind farms. The wake is one of the important phenomena in this field. This paper focuses on understanding the effect of the angle of attack (α) on wake characteristics behind a single horizontal-axis wind turbine (HAWT). This was done by designing three rotors differing from each other in the value of α used in the rotor design process; the values of α were 4.8°, 9.5°, and 19°. The numerical simulations were conducted using the Ansys Workbench 19 Fluent code with the k-ω SST turbulence model. The results showed that the best value for extracted wind energy was at α = 19°; the spread distance of the wake …
Genistein (GEN) is one of the predominant dietary isoflavones found in legumes such as soybeans. Genistein has been recommended as an osteoporosis treatment for postmenopausal women and elderly men, with the intention of reducing cardiovascular disease and hormone-dependent malignancies. Therefore, two sensitive and simple methods for quantifying it in supplement preparations were developed. The first method (A) employed the surfactant Triton X-114 to extract the product of the diazotization reaction with 4-aminoacetophenone (4AMA) using a cloud point extraction technique. The product was extracted into micelles of the non-ionic surfactant Triton X-114 and then detected spectrophotometrically at a specified wavelength …
The majority of systems dealing with natural language processing (NLP) and artificial intelligence (AI) can assist in making automated and automatically supported decisions. However, these systems may face challenges and difficulties in identifying the information (characterization) required to elicit a decision by extracting or summarizing relevant information from large text documents or colossal content. When such documents are obtained online, for instance from social networking or social media, these sites undergo a remarkable increase in textual content. The main objective of the present study is to conduct a survey and show the latest developments in the implementation of text-mining techniques …
The current study sheds light on the measurement and estimation of the radioactivity of the radionuclides 238U, 226Ra, 232Th, and 40K in natural waters of different regions of Nineveh Governorate in Iraq. Fifteen samples were collected from different sources of natural water, and gamma-ray spectroscopy with a NaI(Tl) sodium iodide detector was used to determine the radioactivity concentration in the samples. According to the results, the radioactivity concentrations in the tested water samples ranged from 0.36 ± 0.04 to 1.57 ± 0.09 Bq/l, with an average value of 0.69 ± 0.06 Bq/l, for 238U, and from 2.9 ± 0.02 to 0.88 ± 0.03, with an average value of 0.65 ± 0.03 Bq/l, for 226Ra …
Merging images is one of the most important technologies in remote sensing applications and geographic information systems. In this study, images captured by a camera were fused using image resizing with different interpolation methods (nearest-neighbor, bilinear, and bicubic). Statistical techniques were used as efficient merging techniques in the image integration process, employing different models, namely Local Mean Matching (LMM) and Regression Variable Substitution (RVS), along with spatial-frequency techniques including the high-pass filter additive method (HPFA). Statistical measures were then used to check the quality of the merged images; this was carried out by calculating the correlation …
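One quality check of the kind this abstract mentions, the correlation between a reference image and a fused result, can be sketched as follows. This is a hedged illustration: the two short pixel lists are toy stand-ins for flattened images, and the Pearson correlation shown here is only one of the measures such studies compute.

```python
# Pearson correlation between two equal-length pixel sequences: values
# near 1 indicate the fused image preserves the reference's structure.
from math import sqrt

def correlation(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / sqrt(va * vb)

reference = [10, 20, 30, 40, 50, 60]       # toy flattened image
fused     = [12, 19, 33, 41, 48, 61]       # close to reference -> r near 1

r = correlation(reference, fused)
print(round(r, 3))
```

A fusion method that injects spatial detail (e.g. HPFA) typically trades a little spectral correlation for sharpness, which is why correlation is reported alongside other measures.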
Survival analysis is widely applied to data described by the length of time until the occurrence of an event of interest, such as death or other important events. The purpose of this paper is to use a dynamic methodology, which provides a flexible method especially for the analysis of discrete survival time, to estimate the time-varying effects of covariates in the survival analysis of dialysis patients with kidney failure until death occurs. The estimation process is based entirely on the Bayesian approach, using two estimation methods: maximum a posteriori (MAP) estimation involving Iteratively Weighted Kalman Filter Smoothing (IWKFS), and its combination with the Expectation Maximization (EM) algorithm. While the other …
In this paper, a preliminary-test shrinkage estimator is considered for estimating the shape parameter α of the Pareto distribution when the scale parameter equals the smallest loss and a prior estimate α0 of α is available as an initial value from past experience or similar cases. The proposed estimator is shown to have a smaller mean squared error in a region around α0 in comparison with the usual and existing estimators.
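The estimator family described above can be sketched in a few lines. This is a hedged illustration, not the paper's exact estimator: the acceptance region `tol`, the shrinkage weight `k`, and the simulated sample are all assumptions made for the example.

```python
# Preliminary-test shrinkage for the Pareto shape parameter:
# compute the usual MLE; if it falls inside a test region around the
# prior guess alpha0, shrink toward alpha0, otherwise keep the MLE.
import math
import random

def pareto_mle_shape(xs, scale):
    """MLE of the shape parameter when the scale is known."""
    return len(xs) / sum(math.log(x / scale) for x in xs)

def shrinkage_estimate(xs, scale, alpha0, k=0.5, tol=0.5):
    mle = pareto_mle_shape(xs, scale)
    if abs(mle - alpha0) <= tol:          # preliminary test "accepts"
        return k * mle + (1.0 - k) * alpha0
    return mle                            # otherwise the usual estimator

random.seed(1)
scale, alpha_true = 1.0, 2.0
# Inverse-transform sampling from Pareto: x = scale * U**(-1/alpha)
xs = [scale * random.random() ** (-1.0 / alpha_true) for _ in range(200)]

est = shrinkage_estimate(xs, scale, alpha0=2.0)
print(round(est, 3))
```

When the prior guess α0 is close to the truth, the convex combination pulls the estimate toward it and reduces variance, which is the source of the MSE gain near α0 that the abstract reports.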
In this paper, the parameters and the reliability function of the transmuted power function (TPF) distribution are estimated using several estimation methods: a proposed new technique alongside the White, percentile, least squares, weighted least squares, and modified moment methods. Simulation was used to generate random data following the TPF distribution in three experiments (E1, E2, E3) with real values of the parameters, with sample sizes (n = 10, 25, 50, and 100), (N = 1000) repeated samples, and selected reliability times t. Comparisons were made between the results obtained from the estimators using the mean squared error (MSE). The results showed that …
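The simulation step described above, generating TPF random data and evaluating the reliability function, can be sketched by inverse-transform sampling. This is a hedged sketch under the standard transmuted construction F_TPF(x) = (1+λ)F(x) − λF(x)² with base F(x) = (x/β)^α; the parameter values below are illustrative, not the paper's experiments.

```python
# Inverse-transform sampling from the transmuted power function (TPF)
# distribution, plus its reliability function R(t) = 1 - F_TPF(t).
import math
import random

def tpf_cdf(x, alpha, beta, lam):
    f = (x / beta) ** alpha                 # base power-function CDF
    return (1.0 + lam) * f - lam * f * f    # transmuted CDF

def tpf_sample(alpha, beta, lam, rng):
    u = rng.random()
    if lam == 0.0:
        f = u
    else:  # invert the quadratic (1+lam)F - lam*F^2 = u for F in [0, 1]
        f = ((1.0 + lam) - math.sqrt((1.0 + lam) ** 2 - 4.0 * lam * u)) / (2.0 * lam)
    return beta * f ** (1.0 / alpha)

def reliability(t, alpha, beta, lam):
    return 1.0 - tpf_cdf(t, alpha, beta, lam)

rng = random.Random(0)
alpha, beta, lam = 2.0, 1.0, 0.5            # illustrative parameter values
xs = [tpf_sample(alpha, beta, lam, rng) for _ in range(1000)]

print(all(0.0 <= x <= beta for x in xs))    # samples stay on the support
print(reliability(0.5, alpha, beta, lam))
```

An MSE comparison of the kind the abstract describes would repeat this draw N times per sample size, estimate the parameters with each method, and average the squared errors.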