The Enhanced Thematic Mapper Plus (ETM+) sensor carried onboard the Landsat-7 satellite was launched on 15 April 1999. After four years of operation, imagery collected by this sensor was severely affected by the failure of the system's Scan Line Corrector (SLC), which introduced radiometric errors. The median filter is one of the basic building blocks in many image-processing applications. Digital images are often corrupted by impulse noise caused by sensor faults, by errors occurring during analog-to-digital conversion, and by errors introduced in communication channels. Such noise changes the intensity of some pixels while leaving others unchanged, so impulse-noise removal is needed to improve the quality of the image being worked on. In this paper, Landsat-7 data were corrected for line-dropout radiometric errors using the median filter method. We studied the median filter and applied a method based on an improved median filtering algorithm [2]. A 3 x 3 median filter was applied to the Landsat-7 image, and the damaged pixels were restored using the Erdas Imagine program.
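The 3 x 3 median correction of dropped scan lines can be sketched as follows. This is a minimal NumPy illustration of the general technique, not the Erdas Imagine implementation used in the paper; the toy image, the zero-value convention for dropout pixels, and the `fill_dropout_lines` helper are all assumptions made for the example.

```python
import numpy as np

def fill_dropout_lines(img, size=3):
    """Replace dropped-out (zero-valued) pixels with the median of the
    valid pixels in each size x size neighbourhood."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = img.copy()
    rows, cols = np.nonzero(img == 0)              # pixels lost to line dropout
    for r, c in zip(rows, cols):
        window = padded[r:r + size, c:c + size]    # 3x3 neighbourhood of (r, c)
        out[r, c] = np.median(window[window > 0])  # median of valid neighbours only
    return out

# toy 5x5 band with one dropped scan line (all zeros)
band = np.array([[10, 11, 12, 13, 14],
                 [10, 11, 12, 13, 14],
                 [ 0,  0,  0,  0,  0],
                 [10, 11, 12, 13, 14],
                 [10, 11, 12, 13, 14]], dtype=float)
restored = fill_dropout_lines(band)
```

Restricting the median to the nonzero neighbours prevents the dropout line itself from dragging the estimate toward zero.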
In this paper we present a new method for solving fully fuzzy multi-objective linear programming problems and finding their fuzzy optimal solutions. Numerical examples are provided to illustrate the method.
Titanium dioxide (TiO2) nanopowder was synthesized by the hydrothermal method. The reaction took place between titanium tetrachloride (TiCl4) and a mixed solution of deionized water and ethanol in the ratio 3:7. The structure and surface morphology of the TiO2 nanopowder annealed at temperatures in the range 200-800°C for 120 min were characterized by X-ray diffraction (XRD), atomic force microscopy (AFM), scanning electron microscopy (SEM), FT-IR, and UV/visible spectroscopy. The results show that with increasing annealing temperature the intensity of the (110) peak of the rutile phase increases while its full-width at half maximum (FWHM) decreases, and the band gap decreases
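A decreasing FWHM of the (110) rutile peak is conventionally linked to crystallite growth via the Scherrer equation, D = Kλ / (β cos θ). The abstract does not state that this calculation was performed, so the following is only a standard related sketch; the peak position, FWHM values, and Cu Kα wavelength are illustrative assumptions.

```python
import math

def scherrer_size(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, K=0.9):
    """Crystallite size (nm) from the Scherrer equation D = K*lambda/(beta*cos(theta)).
    fwhm_deg: peak FWHM in degrees 2-theta; wavelength defaults to Cu K-alpha."""
    beta = math.radians(fwhm_deg)            # FWHM converted to radians
    theta = math.radians(two_theta_deg / 2)  # Bragg angle theta (half of 2-theta)
    return K * wavelength_nm / (beta * math.cos(theta))

# rutile (110) peak near 2-theta ~ 27.4 deg; a narrower FWHM gives larger crystallites
d_low  = scherrer_size(0.60, 27.4)   # broad peak, e.g. low annealing temperature
d_high = scherrer_size(0.25, 27.4)   # sharp peak, e.g. high annealing temperature
```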
Deep learning convolutional neural networks have been widely used to recognize and classify voice. Various techniques have been combined with convolutional neural networks to prepare voice data before the training process when developing the classification model. However, not all models produce good classification accuracy, as there are many types of voice and speech. Classification of Arabic alphabet pronunciation is one such voice-classification task, and accurate pronunciation is required in learning to read the Qur'an. Thus, processing the pronunciation data and training on the processed data require a specific approach. To address this issue, a method based on padding and a deep learning convolutional neural network is proposed to
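The padding step mentioned above can be sketched as follows: utterances of different durations are zero-padded to a common length so they can be stacked into one tensor for a CNN. The MFCC feature shapes, the zero pad value, and the `pad_features` helper are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def pad_features(batch, max_len=None, value=0.0):
    """Pad a list of (time, n_features) arrays to a common time length so
    they can be stacked into one (batch, time, features) CNN input tensor."""
    if max_len is None:
        max_len = max(x.shape[0] for x in batch)   # longest utterance in the batch
    out = np.full((len(batch), max_len, batch[0].shape[1]), value, dtype=np.float32)
    for i, x in enumerate(batch):
        out[i, :x.shape[0]] = x                    # keep original frames, pad the rest
    return out

# three utterances of different duration, 13 MFCC coefficients each
utts = [np.ones((40, 13)), np.ones((55, 13)), np.ones((32, 13))]
x = pad_features(utts)
# x.shape == (3, 55, 13)
```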
Simultaneous determination of Furosemide, Carbamazepine, Diazepam, and Carvedilol in bulk and in pharmaceutical formulations using partial least squares regression (PLS-1 and PLS-2) is described in this study. The two methods were successfully applied to estimate the four drugs in their quaternary mixture using UV spectral data of 84 synthetic mixtures in the range 200-350 nm at intervals of Δλ = 0.5 nm. The linear concentration range was 1-20 μg.mL-1 for all four drugs. The correlation coefficients (R2) and root mean square errors of calibration (RMSE) for FURO, CARB, DIAZ, and CARV were 0.9996, 0.9998, 0.9997, and 0.9997, and 0.1128, 0.1292, 0.1868, and 0.1562, respectively, for PLS-1; for PLS-2 they were 0.9995, 0.9999, 0.9997, and 0.9998, and 0.1127, 0.
DeepFake is a concern for celebrities and everyone else because it is simple to create. DeepFake images, especially high-quality ones, are difficult to detect by humans, by local descriptors, and by current approaches. On the other hand, video-manipulation detection is more tractable than image detection, and many state-of-the-art systems address it; moreover, detecting manipulation in video depends entirely on detection in the individual frames. Many works have addressed DeepFake detection in images, but they involve complex mathematical calculations in their preprocessing steps and suffer many limitations, including that the face must be frontal, the eyes must be open, and the mouth should be open with the teeth visible, etc. Also, the accuracy of their counterfeit detection
In this paper AlxGa1-xAs:H films have been prepared using a new deposition method based on a combined flash-thermal evaporation technique. The thickness of our samples was about 300 nm. The Al concentration was varied within the range 0 ≤ x ≤ 40.
The results of X-ray diffraction (XRD) analysis confirmed the amorphous structure of all AlxGa1-xAs:H films with x ≤ 40 and annealing temperature (Ta) < 200°C. The temperature dependence of the DC conductivity σDC has been measured for AlxGa1-xAs:H films with various Al contents.
We have found that the thermal activation energy Ea depends on the Al content and on Ta; the values of Ea were approximately equal to half the value of the optical gap.
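Extracting Ea from the temperature dependence of σDC is typically done with an Arrhenius fit, σDC = σ0 exp(−Ea / kB·T), where Ea is the slope of ln σ versus 1/T. The sketch below uses synthetic numbers (the assumed Ea, prefactor, and temperature range are illustrative, not the paper's measurements) to show the fitting step.

```python
import numpy as np

# Arrhenius form: sigma_DC = sigma_0 * exp(-Ea / (kB * T)); Ea is recovered
# from the slope of ln(sigma) vs 1/T.
kB = 8.617e-5                      # Boltzmann constant, eV/K
Ea_true, sigma0 = 0.70, 1.0e3      # assumed activation energy (eV) and prefactor
T = np.linspace(300, 450, 10)      # measurement temperatures, K
sigma = sigma0 * np.exp(-Ea_true / (kB * T))

slope, intercept = np.polyfit(1.0 / T, np.log(sigma), 1)
Ea_fit = -slope * kB               # recovered activation energy in eV
```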
The objective of this research is to develop a method for applying financial derivatives in the local environment to reduce the risk of foreign exchange rate fluctuations and thereby enhance the quality of accounting profits through the financial reporting of local units in accordance with International Financial Reporting Standards. To accomplish this objective, a sample of Iraqi units exposed to the risk of fluctuations in foreign currency rates was selected. The research found that:
- many companies and banks in the local environment have incurred substantial losses due to fluctuations in foreign currency exchange rates;
- financial derivatives in the Iraqi environment represent
Increasing global competition and continuous improvement in information technology have led to the development of modern systems and the use of modern techniques. Among these techniques are the benchmarking style and Total Quality Management, both of which are used to improve the production process and to eliminate losses.
The benchmarking style has become very important for industrial and service systems alike, and it serves as an instrument for improving their performance, especially for those suffering from high costs or wasted time.
This study aims to rely on the virtual benchmarking style in the eval
Survival analysis is the analysis of data in the form of times from a time origin until the occurrence of an end event. In medical research, the time origin is the date on which an individual or patient is enrolled in a study, such as a clinical trial comparing two or more types of medicine, and the end event is the death of the patient or the individual's withdrawal from observation. The data resulting from this process are called survival times. If the end event is not death, the resulting data are called time-to-event data. That is, survival analysis comprises the statistical steps and procedures for analyzing data whose variable of interest is the time to an event. It could be d
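A standard first step with survival times is the Kaplan-Meier estimate of the survival function S(t), which handles censored observations (individuals who left the study before the event). This is a generic sketch of that estimator, not a procedure taken from the paper; the six patients and their times are invented for illustration.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: time to event or censoring; events: 1 = event observed, 0 = censored.
    Returns a list of (event_time, S(t)) pairs."""
    times, events = np.asarray(times, float), np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv, s = [], 1.0
    for t in np.unique(times[events == 1]):        # step only at observed event times
        d = np.sum((times == t) & (events == 1))   # events at time t
        at_risk = np.sum(times >= t)               # individuals still under observation
        s *= 1.0 - d / at_risk                     # multiply in the conditional survival
        surv.append((t, s))
    return surv

# 6 patients: times in months, 1 = death observed, 0 = censored
curve = kaplan_meier([6, 7, 7, 10, 13, 16], [1, 1, 0, 1, 0, 1])
```

Censored individuals still count in the at-risk set up to their censoring time, which is what distinguishes this from a naive fraction-surviving calculation.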