Assessing the accuracy of classification algorithms is paramount, as it provides insight into their reliability and effectiveness in solving real-world problems. Accuracy assessment is essential in any remote sensing-based classification practice, given that classification maps invariably contain misclassified pixels and classification errors. In this study, two satellite images of Duhok province, Iraq, captured at different dates (2013 and 2022), were analyzed using spatial analysis tools to produce supervised classifications. Additional processing, such as smoothing, was applied to enhance the categorization. The classification results divide Duhok province into four classes: vegetation cover, buildings, water bodies, and bare land. Between 2013 and 2022, vegetation cover increased from 63% to 66%, built-up area increased from roughly 1% to 3%, water bodies decreased from 2% to 1%, and bare land decreased from 34% to 30%. Finally, classification accuracy was assessed by comparison with field data and was approximately 85%.
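As a rough sketch of the accuracy-assessment step described above, overall accuracy can be computed by comparing classified pixels against field reference data. The function names and the toy four-class labels below are illustrative, not taken from the study:

```python
import numpy as np

def overall_accuracy(reference, classified):
    """Overall accuracy: fraction of pixels whose class matches the field data."""
    reference = np.asarray(reference)
    classified = np.asarray(classified)
    return np.mean(reference == classified)

def confusion_matrix(reference, classified, n_classes):
    """Rows = reference (field) class, columns = classified class."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for r, c in zip(np.ravel(reference), np.ravel(classified)):
        m[r, c] += 1
    return m

# toy labels: 0 = vegetation, 1 = buildings, 2 = water, 3 = bare land
ref = [0, 0, 1, 2, 3, 3, 0, 1]
cls = [0, 0, 1, 2, 3, 0, 0, 1]
acc = overall_accuracy(ref, cls)  # 7 of 8 pixels agree -> 0.875
```

In practice the reference pixels come from ground-truth field samples rather than a full reference map, but the computation is the same.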
Image fusion is one of the most important technologies in remote sensing and geographic information system applications. In this study, camera images were fused in a simulated process by resizing them with three interpolation methods (nearest-neighbour, bilinear, and bicubic). Statistical techniques were used for efficient merging in the image-integration process, employing two models, namely Local Mean Matching (LMM) and Regression Variable Substitution (RVS); spatial-frequency techniques were also applied, including the High-Pass Filter Additive (HPFA) method. Statistical measures were then used to check the quality of the merged images, carried out by calculating the correlation …
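A minimal sketch of the resizing and quality-check steps described above, using nearest-neighbour interpolation (the simplest of the three methods named) and the Pearson correlation coefficient as the quality measure. The function names are illustrative and NumPy is assumed:

```python
import numpy as np

def resize_nearest(img, new_h, new_w):
    """Nearest-neighbour interpolation: each output pixel copies the closest source pixel."""
    h, w = img.shape
    rows = np.arange(new_h) * h // new_h   # source row for each output row
    cols = np.arange(new_w) * w // new_w   # source column for each output column
    return img[rows][:, cols]

def correlation(a, b):
    """Pearson correlation coefficient between two images (flattened)."""
    a = np.ravel(a).astype(float)
    b = np.ravel(b).astype(float)
    return np.corrcoef(a, b)[0, 1]
```

Bilinear and bicubic interpolation follow the same pattern but weight 4 or 16 neighbouring source pixels instead of copying one.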
Source and channel coding for wireless data transmission can reduce distortion, complexity, and delay in multimedia services. In this paper, a joint source-channel coding scheme is proposed for orthogonal frequency division multiplexing - interleave division multiple access (OFDM-IDMA) systems to transmit compressed images over noisy channels. OFDM-IDMA combines the advantages of both OFDM and IDMA: OFDM removes inter-symbol interference (ISI), while IDMA removes multiple-access interference (MAI). Convolutional coding is used as the channel coding, while a hybrid compression method is used as the source coding scheme. The hybrid compression scheme is based on wavelet transform, bit-plane slicing, polynomial …
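The bit-plane slicing stage of the hybrid compression scheme can be sketched as follows; this is a generic illustration assuming 8-bit images, not the authors' implementation:

```python
import numpy as np

def bit_planes(img, bits=8):
    """Split an 8-bit image into its binary bit planes (plane 0 = least significant bit)."""
    img = np.asarray(img, dtype=np.uint8)
    return [(img >> k) & 1 for k in range(bits)]

def from_planes(planes):
    """Reassemble the image from its bit planes (lossless round trip)."""
    out = np.zeros_like(planes[0], dtype=np.uint8)
    for k, p in enumerate(planes):
        out |= p.astype(np.uint8) << k
    return out
```

In a compression pipeline, the high-order planes (which carry most of the visual information) are coded carefully while low-order, noise-like planes can be coded coarsely or dropped.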
A new approach is presented in this study to determine the optimal edge-detection threshold value. The approach is based on extracting small homogeneous blocks from targets with unequal means. From these blocks, a small image with known edges is generated (the edges are the lines between adjoining blocks), so the simulated edges can be taken as true edges. The true simulated edges are then compared with the edges detected in the small generated image under different threshold values. The comparison is based on computing the mean square error between the simulated edge image and the edge image produced by each edge-detection method. The mean square error is computed for the total edge image (Er), for the edge regions …
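A hedged sketch of the threshold-selection idea: detect edges at several candidate thresholds and keep the threshold whose binary edge map has the smallest mean square error against the simulated true edges. The gradient-magnitude detector below is a stand-in for whichever edge detectors the study actually used:

```python
import numpy as np

def edge_map(img, threshold):
    """Simple gradient-magnitude edge detector thresholded into a binary map."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    return (mag > threshold).astype(float)

def mse(a, b):
    """Mean square error between two same-sized images."""
    return np.mean((a - b) ** 2)

def best_threshold(img, true_edges, candidates):
    """Pick the threshold whose edge map minimises MSE against the simulated true edges."""
    return min(candidates, key=lambda t: mse(edge_map(img, t), true_edges))
```

Here the "small generated image" would be built by tiling the extracted homogeneous blocks, so the block boundaries give the true edge positions.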
Necessary and sufficient conditions are given for the operator equation X^n + A*XA = I to have a real positive definite solution X. Based on these conditions, some properties of the operator A, as well as relations between the solutions X and A, are given.
This paper presents results on the existence of best approximations via nonexpansive-type maps defined on modular spaces.
Derivative spectrophotometry is one of the analytical chemistry techniques used in the analysis and determination of chemicals and pharmaceuticals. The method is characterized by simplicity, sensitivity, and speed. Derivatives of spectra can be obtained in several ways, including optical, electronic, and mathematical; the operation is usually performed within the spectrophotometer. This paper describes a new program written in Visual Basic within Microsoft Excel. The program can compute the first, second, third, and fourth derivatives of the data and return these derivatives to zero order (the normal plot). The program was applied to experimental (trial) and real values of su…
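The derivative step itself can be sketched numerically. The paper's program is written in Visual Basic within Excel; the Python version below is only an illustration of repeated differentiation of a spectrum with respect to wavelength:

```python
import numpy as np

def derivative_spectrum(absorbance, wavelength, order=1):
    """n-th derivative of an absorbance spectrum with respect to wavelength."""
    d = np.asarray(absorbance, dtype=float)
    for _ in range(order):
        d = np.gradient(d, wavelength)  # central differences on the wavelength grid
    return d
```

Real spectra are noisy, so practical implementations usually combine differentiation with smoothing (e.g. Savitzky-Golay filtering); that step is omitted here.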
The huge evolution of information technologies, especially in the last few decades, has produced an increase in the volume of data on the World Wide Web, which is still growing significantly. Retrieving the relevant information from the Internet, or from any data source, with a query of only a few words has become a big challenge. To overcome this, query expansion (QE) plays an important role in improving information retrieval (IR): the user's original query is recreated as a new query by appending related terms of comparable importance. One of the problems of query expansion is choosing suitable terms; this leads to a further challenge of how to retrieve the important documents with high precision and high recall …
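A minimal sketch of query expansion as described above: the original terms are kept and related terms are appended. The `related` mapping here stands in for whatever term-selection source is used (a thesaurus, co-occurrence statistics, relevance feedback, etc.):

```python
def expand_query(query, related, per_term=2):
    """Expand a query by appending up to `per_term` related terms per original word.

    `related` maps a term to candidate expansion terms, e.g. from a thesaurus
    or co-occurrence statistics (hypothetical data in this sketch).
    """
    expanded = list(query)
    for term in query:
        for t in related.get(term, [])[:per_term]:
            if t not in expanded:          # avoid duplicate terms
                expanded.append(t)
    return expanded
```

The hard part, as the abstract notes, is building a `related` mapping whose terms raise recall without dragging in off-topic documents that hurt precision.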
Transmitting and receiving data consume the most resources in Wireless Sensor Networks (WSNs). The energy supplied by the battery is the most important resource affecting a WSN's lifespan at the sensor node; because sensor nodes run on limited batteries, energy saving is necessary. Data aggregation can be defined as a procedure for eliminating redundant transmissions: it provides fused information to the base stations, which in turn improves energy effectiveness and increases the lifespan of energy-constrained WSNs. In this paper, a Perceptually Important Points Based Data Aggregation (PIP-DA) method for Wireless Sensor Networks is suggested to reduce redundant data before sending them to the base station.
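A sketch of one common Perceptually Important Points (PIP) selection scheme, the idea the PIP-DA method builds on: starting from the two endpoints of a sensor-reading series, repeatedly add the sample farthest (vertically) from the line joining its neighbouring selected points. Distance measures vary in the literature, so this is an illustration, not the authors' exact procedure:

```python
def pip(series, k):
    """Return the indices of k perceptually important points of a 1-D series."""
    n = len(series)
    chosen = [0, n - 1]                      # always keep both endpoints
    while len(chosen) < k:
        best_i, best_d = None, -1.0
        for a, b in zip(chosen, chosen[1:]):  # each gap between selected points
            for i in range(a + 1, b):
                # vertical distance from the straight line joining points a and b
                y = series[a] + (series[b] - series[a]) * (i - a) / (b - a)
                d = abs(series[i] - y)
                if d > best_d:
                    best_i, best_d = i, d
        chosen.append(best_i)
        chosen.sort()
    return chosen
```

Transmitting only the k selected points instead of every reading is what removes the redundancy and saves radio energy.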
This study presents a theoretical and experimental investigation of the high-loading stumbling condition for a hip prosthesis. One model, the Charnley prosthesis, was studied. It was modeled with the finite element method using ANSYS software to examine the effect of changing the design parameters (head diameter, neck length, neck ratio, stem length) on the Charnley design under the stumbling case, treated as an impact load reaching 8.7 × body weight over an impact duration of 0.005 s. An experimental rig was constructed to test the hip model; the rig consists of a wooden box with a smooth sliding shaft from which a load of 1 pound is dropped from three heights. The strain produced by this impact is measured using a rosette strain gauge connected to a Wheatstone bridge.
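For a standard 0°/45°/90° rectangular rosette (one common rosette type; the abstract does not specify which configuration was used), the principal strains follow from the three gauge readings as sketched below:

```python
import math

def principal_strains(ea, eb, ec):
    """Principal strains from a 0/45/90-degree rectangular rosette.

    ea, eb, ec are the strains read by the gauges at 0, 45 and 90 degrees.
    Uses e1,2 = (ea + ec)/2 +/- sqrt((ea - eb)^2 + (eb - ec)^2) / sqrt(2).
    """
    avg = (ea + ec) / 2.0
    r = math.sqrt((ea - eb) ** 2 + (eb - ec) ** 2) / math.sqrt(2.0)
    return avg + r, avg - r
```

With the bridge output converted to strain for each gauge, these two values give the maximum and minimum normal strains at the gauged point on the prosthesis stem.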