Metasurface polarizers are essential optical components in modern integrated optics and play a vital role in many optical applications, including Quantum Key Distribution systems in quantum cryptography. However, the inverse design of high-efficiency metasurface polarizers depends on properly predicting the structural dimensions from the required optical response. Deep neural networks can efficiently assist the inverse design process, reducing both time and simulation resource requirements while achieving better results than traditional optimization methods. Here, by combining a COMSOL Multiphysics surrogate model with deep neural networks, a metasurface grating structure with a high extinction ratio of approximately 60,000 at the visible wavelength of 632 nm is designed.
This paper compares denoising techniques based on a statistical approach: principal component analysis with local pixel grouping (LPG-PCA), in which the procedure is iterated a second time to further improve denoising performance, alongside several other enhancement filters. These include an adaptive Wiener low-pass filter, applied to a grayscale image degraded by constant-power additive noise and based on statistics estimated from a local neighborhood of each pixel; a median filter, where each output pixel contains the median value of the M-by-N neighborhood around the corresponding pixel in the input noisy image; a Gaussian low-pass filter; and an order-statistic filter. Experimental results show that the LPG-PCA method
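As a minimal illustration of the median and adaptive Wiener filters described above, the sketch below uses SciPy's `medfilt2d` and `wiener` (not the paper's own implementation); the synthetic gradient image and the noise level are assumptions made for the example:

```python
import numpy as np
from scipy.signal import wiener, medfilt2d

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))   # synthetic grayscale image
noisy = clean + rng.normal(0.0, 0.1, clean.shape)     # constant-power additive noise

denoised_median = medfilt2d(noisy, kernel_size=3)     # median of each 3x3 neighborhood
denoised_wiener = wiener(noisy, mysize=3)             # Wiener filter with local 3x3 statistics

mse = lambda a, b: float(np.mean((a - b) ** 2))
print(mse(noisy, clean), mse(denoised_median, clean), mse(denoised_wiener, clean))
```

Both filters should reduce the mean square error against the clean image on this smooth test pattern, which is the same MSE-style criterion the paper uses to rank methods.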
In this article, the probability density function of the Rayleigh distribution is derived and fitted using the ordinary least squares estimator method and the rank set estimator method. An interval is then constructed for the scale parameter of the Rayleigh distribution. A new method is used for the fuzzy scale parameter. After that, the survival and hazard functions are constructed for two ranking functions to show which one is best.
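For reference, the Rayleigh density, survival and hazard functions that the estimators above target can be written out directly; these are the standard textbook forms with scale parameter sigma, not code taken from the paper:

```python
import math

def rayleigh_pdf(x, sigma):
    """Rayleigh density f(x; sigma) = (x / sigma^2) * exp(-x^2 / (2 sigma^2)), x >= 0."""
    return (x / sigma**2) * math.exp(-x**2 / (2 * sigma**2))

def rayleigh_survival(x, sigma):
    """Survival function S(x) = exp(-x^2 / (2 sigma^2))."""
    return math.exp(-x**2 / (2 * sigma**2))

def rayleigh_hazard(x, sigma):
    """Hazard h(x) = f(x) / S(x) = x / sigma^2 (linearly increasing in x)."""
    return rayleigh_pdf(x, sigma) / rayleigh_survival(x, sigma)
```

The linearly increasing hazard is what makes the Rayleigh distribution a common lifetime model in this kind of study.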
This paper discusses the process of compounding two distributions using a new compounding procedure that connects a number of lifetime distributions (continuous distributions), where the number of these distributions is a random variable following one of the discrete distributions. Based on this procedure, the zero-truncated Poisson distribution has been compounded with the Weibull distribution to produce a new lifetime distribution with three parameters, whose advantage is that the failure rate function covers many cases (increasing, decreasing, unimodal, bathtub). The properties of the resulting distribution are studied, such as expectation, variance, cumulative function, reliability function and fa
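One common way to realize such a compound is to take the minimum of N i.i.d. Weibull lifetimes with N zero-truncated Poisson; the sketch below assumes that min-of-N construction, which may differ from the paper's exact procedure:

```python
import math

def ztp_weibull_survival(t, lam, shape, scale):
    """Survival of T = min(X_1, ..., X_N), X_i ~ Weibull(shape, scale),
    N ~ zero-truncated Poisson(lam):
        S(t) = (exp(lam * S_w(t)) - 1) / (exp(lam) - 1),
    where S_w(t) = exp(-(t / scale)**shape) is the Weibull survival.
    Assumed min-of-N compounding, for illustration only."""
    s_w = math.exp(-((t / scale) ** shape))
    return (math.exp(lam * s_w) - 1.0) / (math.exp(lam) - 1.0)
```

Note that S(0) = 1 and S(t) decreases to 0, as a survival function must; the three parameters (lam, shape, scale) match the abstract's count.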
This research investigates the problem of estimating the reliability of the two-parameter Weibull distribution using the maximum likelihood method and White's method. The comparison is done through a simulation process depending on three choices of models (α=0.8, β=0.9), (α=1.2, β=1.5) and (α=2.5, β=2), and sample sizes n=10, 70, 150. The statistical criterion based on the mean square error (MSE) is used for comparison among the methods.
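The simulation design can be sketched for the maximum likelihood side of the comparison (White's regression-based method is not reproduced here): estimate the reliability R(t0) = P(T > t0) from each simulated Weibull sample and accumulate the MSE against the true value. The parameter values below reuse one of the abstract's models; t0 and the replication count are assumptions:

```python
import numpy as np
from scipy.stats import weibull_min

def mse_of_reliability(shape, scale, n, t0, reps=100, seed=0):
    """Monte Carlo MSE of the ML estimate of R(t0) for Weibull(shape, scale) data."""
    rng = np.random.default_rng(seed)
    true_r = weibull_min.sf(t0, shape, scale=scale)
    errs = []
    for _ in range(reps):
        sample = weibull_min.rvs(shape, scale=scale, size=n, random_state=rng)
        c_hat, _, b_hat = weibull_min.fit(sample, floc=0)   # two-parameter ML fit
        errs.append(weibull_min.sf(t0, c_hat, scale=b_hat) - true_r)
    return float(np.mean(np.square(errs)))

mse_small = mse_of_reliability(1.2, 1.5, n=10, t0=1.0)
mse_large = mse_of_reliability(1.2, 1.5, n=150, t0=1.0)
```

As expected, the MSE shrinks as the sample size grows from n=10 to n=150, which is the pattern such simulation studies typically report.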
The problem of solid waste from domestic, industrial, commercial and medical sources is one of the most important problems facing local administration in all Iraqi cities. The danger of this problem increases with the rapid increase in population, changing lifestyles, consumption patterns, limited land suitable for landfill, and the high costs of collection and disposal. This research aims to address these problems by evaluating the locations of the current landfills on the outskirts of Baghdad Governorate. The ArcGIS program was used: the landfill sites were placed on the map, and from the available data about the areas it was concluded that the existing landfill sites do not meet environmental conditions and
Data encryption translates data into another form or symbol so that only people with access to the secret key or a password can read it. Encrypted data is generally referred to as ciphertext, while unencrypted data is known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values within an image are spread across more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key and the RSA encryption method on video frames, to determine which is more accurate, i.e. yields the highest entropy. The first method is achieved by applying the "CAST-128" and
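The entropy measure described above is the Shannon entropy of the gray-level histogram; a minimal sketch (the 16x16 test images are assumptions for illustration):

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy (bits/pixel) of an 8-bit grayscale image:
    H = -sum_i p_i * log2(p_i) over the 256 gray-level frequencies."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                       # 0 * log(0) is taken as 0
    return float(-np.sum(p * np.log2(p)))

flat = np.full((16, 16), 128, dtype=np.uint8)             # single gray level -> H = 0
rng = np.random.default_rng(0)
spread = rng.integers(0, 256, (16, 16), dtype=np.uint8)   # many gray levels -> high H
print(image_entropy(flat), image_entropy(spread))
```

A well-encrypted frame should look like the `spread` case: pixel values dispersed over all gray levels, driving the entropy toward its 8-bit maximum.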
The exponential distribution is one of the most common distributions in studies and scientific research, with wide applications in the fields of reliability, engineering and survival analysis; therefore, researchers have carried out extended studies of the characteristics of this distribution.
In this research, the survival function of the truncated exponential distribution is estimated using the maximum likelihood method, the first and second Bayes methods, the least squares method, and the jackknife method (which depends in the first place on the maximum likelihood method), and the estimators are then compared using simulation. To accomplish this task, samples of different sizes have been adopted by the researcher
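The maximum likelihood branch of the comparison can be sketched for an exponential distribution right-truncated at tau (the truncation point, true rate and sample-generation scheme below are assumptions, not the paper's settings):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def truncexp_survival(x, theta, tau):
    """Survival of Exp(theta) right-truncated at tau, for 0 <= x <= tau:
    S(x) = (exp(-theta x) - exp(-theta tau)) / (1 - exp(-theta tau))."""
    return (np.exp(-theta * x) - np.exp(-theta * tau)) / (1.0 - np.exp(-theta * tau))

def truncexp_mle(sample, tau):
    """Numerical MLE of theta from density f(x) = theta e^{-theta x} / (1 - e^{-theta tau})."""
    def nll(theta):
        return -(len(sample) * (np.log(theta) - np.log1p(-np.exp(-theta * tau)))
                 - theta * np.sum(sample))
    return minimize_scalar(nll, bounds=(1e-6, 50.0), method="bounded").x

rng = np.random.default_rng(1)
tau, theta_true = 2.0, 1.5
raw = rng.exponential(1.0 / theta_true, 20000)
sample = raw[raw <= tau]            # rejection sampling from the truncated law
theta_hat = truncexp_mle(sample, tau)
```

Plugging `theta_hat` into `truncexp_survival` gives the estimated survival curve; the other estimators in the paper would replace only the `truncexp_mle` step.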
Fingerprints are commonly utilized as a key technique for personal recognition and in identification systems for personal security affairs. The most widely used fingerprint systems rely on the distribution of minutiae points for fingerprint matching and representation. These techniques become unsuccessful when partial fingerprint images are captured, or when the finger ridges suffer from many cuts, injuries or skin disease. This paper suggests a fingerprint recognition technique that utilizes local features for fingerprint representation and matching. The adopted local features are determined using Haar wavelet subbands. The system was tested experimentally using the FVC2004 databases, which consist of four datasets; each set holds
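The Haar wavelet subbands underlying the local features can be sketched as a one-level 2-D decomposition; this is the generic transform, not the paper's exact feature pipeline:

```python
import numpy as np

def haar2d(img):
    """One-level 2-D Haar decomposition into LL, LH, HL, HH subbands
    (averages/differences over 2x2 blocks; image sides must be even)."""
    a = img.astype(float)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2.0   # row-wise average
    hi = (a[:, 0::2] - a[:, 1::2]) / 2.0   # row-wise detail
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0  # approximation
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0  # vertical detail
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0  # horizontal detail
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0  # diagonal detail
    return ll, lh, hl, hh

img = np.arange(64, dtype=float).reshape(8, 8)   # toy "ridge" gradient
ll, lh, hl, hh = haar2d(img)
```

Statistics of the detail subbands (LH, HL, HH) over local windows are what such wavelet-based fingerprint descriptors typically use as features.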
This article explores the process of VGI collection by assessing the relative usability and accuracy of a range of different methods (smartphone GPS, tablet, and analogue maps) for data collection amongst different demographic and educational groups, and in different geographical contexts. Assessments are made of positional accuracy, completeness, and data collectors' experiences with reference to the official cadastral data and the administration system in a case-study region of Iraq. Ownership data were validated by crowd agreement. The results show that successful VGI projects have access to varying data collection methods.