In this paper, Bayes estimators of the parameter of the Maxwell distribution have been derived along with the maximum likelihood estimator. The non-informative priors, Jeffreys and the extension of Jeffreys prior, have been considered under two different loss functions, the squared error loss function and the modified squared error loss function, for comparison purposes. A simulation study has been carried out to gain insight into the performance on small, moderate and large samples. The performance of these estimators has been explored numerically under different conditions, and their efficiency has been compared according to the mean squared error (MSE). The comparison by MSE shows that the efficiency of the Bayes estimators of the shape parameter of the Maxwell distribution decreases as the Jeffreys prior constant increases. The results also show that the values of the Bayes estimators are close to the maximum likelihood estimator when the Jeffreys prior constants are small, and they are identical in certain cases. Comparison with respect to the loss functions shows that the Bayes estimators under the modified squared error loss function have greater MSE than those under the squared error loss function, especially as r increases.
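To make the setup concrete, the following is a minimal sketch assuming one common rate parameterization of the Maxwell distribution and the extended Jeffreys prior π(θ) ∝ θ^(−c); the paper's exact notation and constants may differ.

```latex
% Assumed rate parameterization of the Maxwell distribution:
%   f(x \mid \theta) = \frac{4}{\sqrt{\pi}}\,\theta^{3/2} x^{2} e^{-\theta x^{2}}, \quad x>0,\ \theta>0.
% With the extended Jeffreys prior \pi(\theta) \propto \theta^{-c} and T = \sum_i x_i^2,
% the posterior is Gamma(3n/2 - c + 1, T), so under squared error loss the Bayes
% estimator is the posterior mean, while the MLE maximizes the likelihood:
\hat{\theta}_{\mathrm{MLE}} = \frac{3n}{2\sum_{i=1}^{n} x_i^{2}},
\qquad
\hat{\theta}_{\mathrm{SEL}} = E\!\left[\theta \mid x_1,\dots,x_n\right]
  = \frac{\tfrac{3n}{2} - c + 1}{\sum_{i=1}^{n} x_i^{2}} .
```

Under these assumptions the two estimators coincide exactly at c = 1 (the plain Jeffreys prior for this parameterization), which is consistent with the observation above that the estimators are identical in certain cases and move apart as the prior constant grows.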
The Phlebotomus papatasi sand fly is the main vector of zoonotic cutaneous leishmaniasis (ZCL) in Iraq. The aim of this study was to assess and predict the effects of climate change on the distribution of cutaneous leishmaniasis (CL) cases and of the main vector, at present and in the future. Data on CL cases were collected for the period 2000-2018, in addition to sand fly (SF) abundance data. A geographic information system, RStudio and the MaxEnt (maximum entropy niche model) software were used to analyse and predict the effect of elevation, population, and the bioclimatic variables Bio1-19 and Bio28-35 on the distribution of CL cases and SF occurrence. The HadGEM2-ES model with two climate change scenarios, RCP 4.5 and RCP 8.5, was used for future projections to 2050. The results showed th
The research aims to identify the theoretical foundations for measuring and analyzing quality costs and continuous improvement, and to measure and analyze quality costs for the Directorate of Electricity Supply / Middle Euphrates together with the continuous improvement of electrical energy distribution. The problem is represented by the high costs of failure and the waste of electrical energy resulting from excesses on the network and missing (lost) energy. Thus, measuring and analyzing quality costs for the distribution of electrical energy and identifying continuous improvement lead to a reduction in lost energy and an increase in sales. The research reached many conclusions, the most important of which is the high percentage o
In this research, a low-cost, portable, disposable, environmentally friendly and easy-to-use lab-on-paper platform sensor was made. The sensor was constructed using a mixture of Rhodamine-6G, gold nanoparticles and sodium chloride salt. The drop-casting method was used to prepare the platform, which is commercial office paper. The substrate was characterized using field emission scanning electron microscopy, Fourier transform infrared spectroscopy, UV-visible spectrophotometry and Raman spectroscopy. The Rh-6G Raman signal was enhanced using the surface-enhanced Raman spectroscopy (SERS) technique with gold nanoparticles. The enhancement factor of the plasmonic commercial office paper reaches up to 0.9 × 10^5 because of local surface pl
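For context, a commonly used analytical definition of the SERS enhancement factor is sketched below; whether the abstract above uses exactly this definition is not stated.

```latex
% Commonly used analytical SERS enhancement factor (assumed definition):
%   I_SERS, I_ref : Raman intensities with and without the plasmonic substrate
%   N_SERS, N_ref : numbers of probed molecules contributing to each signal
EF = \frac{I_{\mathrm{SERS}} / N_{\mathrm{SERS}}}{I_{\mathrm{ref}} / N_{\mathrm{ref}}}
```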
Achieving reliable operation under the influence of deep-submicrometer noise sources, including crosstalk noise at low-voltage operation, is a major challenge for network-on-chip links. In this paper, we propose a coding scheme that simultaneously addresses crosstalk effects on signal delay and detects up to seven random errors through wire duplication and simple parity checks calculated over the rows and columns of the two-dimensional data. This high error detection capability enables the reduction of the operating voltage on the wire, leading to energy savings. The results show that the proposed scheme reduces the energy consumption by up to 53% compared to other schemes at iso-reliability performance, despite the increase in the overhead number o
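As an illustration of the row/column parity idea only, the Python sketch below (with hypothetical helper names) arranges a payload into a two-dimensional block, appends row and column parities, and re-checks them at the receiver; the wire duplication and crosstalk-aware bit placement of the actual scheme are not modelled here.

```python
# Illustrative sketch of two-dimensional parity checking only; all names are
# hypothetical and the proposed scheme's wire duplication is not modelled.

def encode_2d_parity(bits, cols):
    """Arrange a flat bit list into rows of `cols` bits and compute parities."""
    assert len(bits) % cols == 0
    rows = [bits[i:i + cols] for i in range(0, len(bits), cols)]
    row_parity = [sum(r) % 2 for r in rows]
    col_parity = [sum(r[c] for r in rows) % 2 for c in range(cols)]
    return rows, row_parity, col_parity

def check_2d_parity(rows, row_parity, col_parity):
    """Return True if the received block is consistent with its parities."""
    rows_ok = all(sum(r) % 2 == p for r, p in zip(rows, row_parity))
    cols_ok = all(sum(r[c] for r in rows) % 2 == col_parity[c]
                  for c in range(len(col_parity)))
    return rows_ok and cols_ok

# Example: a 16-bit payload arranged as 4x4; one bit flipped in transit is detected.
payload = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1, 1, 1]
rows, rp, cp = encode_2d_parity(payload, cols=4)
rows[2][1] ^= 1                        # inject a single-bit error
print(check_2d_parity(rows, rp, cp))   # False -> error detected
```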
Recently, digital communication has become a critical necessity, and the Internet has become the most used and most efficient medium for digital communication. At the same time, data transmitted through the Internet are becoming more vulnerable. Therefore, the issue of maintaining the secrecy of data is very important, especially if the data are personal or confidential. Steganography provides a reliable method for solving such problems. Steganography is an effective technique for secret communication in the digital world, where data sharing and transfer are increasing through the Internet, e-mail and other channels. The main challenges of steganography methods are the undetectability and the imperceptibility of con
Image fusion is one of the most important techniques in digital image processing; it involves developing software to integrate multiple sets of data for the same location. It is one of the newer fields adopted to solve problems of the digital image and to produce high-quality images that contain more information for the purposes of interpretation, classification, segmentation, compression, etc. In this research, problems faced by different digital images, such as multi-focus images, are addressed through a simulation process using the camera to fuse various digital images based on previously adopted fusion techniques such as arithmetic techniques (BT, CNT and MLT), statistical techniques (LMM,
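As a generic illustration of multi-focus fusion related to the abstract above (not the specific BT/CNT/MLT arithmetic or statistical techniques it evaluates), the sketch below fuses two registered grayscale images by keeping, at each pixel, the source whose local Laplacian response is stronger; all function names are hypothetical.

```python
# Generic multi-focus fusion sketch with hypothetical names.
import numpy as np

def laplacian_magnitude(img):
    """Absolute 4-neighbour discrete Laplacian; borders are left at zero."""
    img = img.astype(float)
    lap = np.zeros_like(img)
    lap[1:-1, 1:-1] = np.abs(
        img[:-2, 1:-1] + img[2:, 1:-1] + img[1:-1, :-2] + img[1:-1, 2:]
        - 4.0 * img[1:-1, 1:-1]
    )
    return lap

def fuse_multifocus(img_a, img_b):
    """Keep, per pixel, the source image whose neighbourhood is sharper."""
    sharper_a = laplacian_magnitude(img_a) >= laplacian_magnitude(img_b)
    return np.where(sharper_a, img_a, img_b)
```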
Image pattern classification is considered a significant step in image and video processing. Although various image pattern algorithms proposed so far have achieved adequate classification, achieving higher accuracy while reducing the computation time remains challenging to date. A robust image pattern classification method is essential to obtain the desired accuracy. Such a method should accurately classify image blocks into plain, edge, and texture (PET) blocks using an efficient feature extraction mechanism. Moreover, to date, most existing studies have focused on evaluating their methods based on specific orthogonal moments, which limits the understanding of their potential application to various Discrete Orthogonal Moments (DOM
In this paper, an algorithm is introduced through which we can embed more data than the regular methods under the spatial domain. We compress the secret data using Huffman coding, and this compressed data is then embedded using the Laplacian sharpening method.

We use Laplace filters to determine the effective hiding places; then, based on a threshold value, we find the places with the highest values acquired from these filters for embedding the watermark. In this work our aim is to increase the capacity of the information to be embedded by using Huffman coding and, at the same time, to increase the security of the algorithm by hiding the data in the places that have the highest edge values and are less noticeable.

The perform
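As an illustrative sketch of the pipeline described in the abstract above (all helper names are hypothetical, and ranking by Laplacian magnitude plus LSB substitution stand in for the threshold rule and the Laplacian-sharpening embedding, which the abstract does not detail):

```python
import heapq
from collections import Counter

import numpy as np

def huffman_codes(data: bytes) -> dict:
    """Build a prefix-free Huffman code table {byte value: bit string}."""
    freq = Counter(data)
    if len(freq) == 1:                      # degenerate one-symbol message
        return {next(iter(freq)): "0"}
    heap = [[w, i, {sym: ""}] for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w0, _, left = heapq.heappop(heap)
        w1, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, [w0 + w1, tie, merged])
        tie += 1
    return heap[0][2]

def embedding_positions(cover: np.ndarray, n_bits: int) -> np.ndarray:
    """Rank pixels by 4-neighbour Laplacian magnitude; keep the n_bits strongest."""
    img = cover.astype(float)
    lap = np.zeros_like(img)
    lap[1:-1, 1:-1] = np.abs(
        img[:-2, 1:-1] + img[2:, 1:-1] + img[1:-1, :-2] + img[1:-1, 2:]
        - 4.0 * img[1:-1, 1:-1]
    )
    assert n_bits <= cover.size, "cover image too small for the compressed payload"
    strongest = np.argsort(lap, axis=None)[::-1][:n_bits]
    return np.column_stack(np.unravel_index(strongest, img.shape))

def embed(cover: np.ndarray, secret: bytes):
    """Huffman-compress `secret` and hide its bits at the strongest edge pixels."""
    codes = huffman_codes(secret)
    bits = "".join(codes[b] for b in secret)
    stego = cover.copy()
    for bit, (r, c) in zip(bits, embedding_positions(cover, len(bits))):
        stego[r, c] = (stego[r, c] & 0xFE) | int(bit)   # LSB substitution
    return stego, codes                                  # codes needed for extraction
```

Compressing first shortens the bit stream so fewer pixels are modified, while restricting the modified pixels to strong-edge locations keeps the changes less noticeable, which is the capacity/security trade-off emphasized above.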
In this paper, a wireless network is planned; the network is based on the IEEE 802.16e (WiMAX) standard. The targets of this paper are maximizing coverage and service while keeping operational fees low. The WiMAX network is planned through three approaches. In the first approach, the WiMAX network coverage is maximized by extending cell coverage and selecting the best sites (with a bandwidth (BW) of 5 MHz, 20 MHz per sector and four sectors per cell). In the second approach, interference is analysed in CNIR mode. In the third approach, Quality of Service (QoS) is tested and evaluated. ATDI ICS software (Interference Cancellation System) was used to perform the planning. The results show that the planned area covered 90.49% of Baghdad City and used 1000 mob