In this paper we propose the philosophy of Darwinian selection as a synthesis method, the genetic algorithm (GA), together with a new merit function of simpler form than those used in other works, for designing one kind of multilayer optical filter: the high-reflection mirror. We investigate solutions to several practical problems. The approach produces high-reflection mirrors with good performance from a reduced number of layers, which makes it possible to control the effect of layer-thickness errors on the final product, and it yields such a solution in a much shorter time by controlling the chromosome length and choosing optimal genetic operators. The results show that the multilayer high-reflection mirrors constructed with this approach can serve as a cornerstone for designing other filter types with more complicated performance requirements, which are difficult to design with other approaches. The experimental results demonstrate that the approach is a powerful technique: it can locate the global optimum automatically, with high confidence, and without the need for a good starting design.
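As an illustration of how such a synthesis might look in code, here is a minimal sketch of a GA driving a transfer-matrix reflectance model. The layer indices, wavelength band, merit function, and GA settings are all assumptions for the demonstration, not the paper's actual choices.

```python
# Minimal GA sketch for a high-reflection mirror: chromosomes are layer
# thicknesses; fitness comes from a normal-incidence transfer-matrix model.
# All material/GA parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N_LAYERS, POP, GENS = 14, 60, 200
n_h, n_l, n_sub, n_air = 2.35, 1.46, 1.52, 1.0   # assumed TiO2/SiO2 on glass
band = np.linspace(500e-9, 600e-9, 21)           # assumed high-reflection band

def reflectance(d, lam):
    """Normal-incidence reflectance of the stack via characteristic matrices."""
    M = np.eye(2, dtype=complex)
    for j, dj in enumerate(d):
        n = n_h if j % 2 == 0 else n_l
        delta = 2 * np.pi * n * dj / lam
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])
    r = (n_air * B - C) / (n_air * B + C)
    return abs(r) ** 2

def merit(d):
    # Simple merit: mean shortfall from perfect reflection over the band.
    return np.mean([1.0 - reflectance(d, lam) for lam in band])

pop = rng.uniform(30e-9, 200e-9, size=(POP, N_LAYERS))   # thickness chromosomes
for _ in range(GENS):
    f = np.array([merit(ind) for ind in pop])
    elite = pop[np.argsort(f)[:POP // 2]]          # keep the better half
    kids = elite.copy()
    rng.shuffle(kids)
    cut = rng.integers(1, N_LAYERS, size=len(kids))
    for k in range(0, len(kids) - 1, 2):           # one-point crossover
        c = cut[k]
        kids[k, c:], kids[k + 1, c:] = kids[k + 1, c:].copy(), kids[k, c:].copy()
    kids += rng.normal(0, 2e-9, kids.shape)        # Gaussian mutation
    pop = np.vstack([elite, np.clip(kids, 10e-9, 300e-9)])

best = pop[np.argmin([merit(ind) for ind in pop])]
print("merit:", merit(best), "R at 550 nm:", reflectance(best, 550e-9))
```

Here the chromosome is simply the vector of layer thicknesses, so shortening it (fewer layers) directly shrinks the search space, in line with the abstract's point about controlling chromosome length.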
In this paper, we derive estimators of the parameters, reliability function, and hazard function of a new mixed distribution (Rayleigh-Logarithmic) with two parameters and an increasing failure rate, using the Bayes method with a squared-error loss function, a Jeffreys prior, and the conditional probability of the observed random variable. The main objective of this study is to assess the efficiency of the derived Bayesian estimator relative to the maximum likelihood estimator of this function using Monte Carlo simulation under different Rayleigh-Logarithmic parameter values and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood estimator for all sample sizes, with an application.
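The comparison methodology can be sketched for the plain Rayleigh scale parameter with a Jeffreys prior and squared-error loss; this is a simplified stand-in, since the paper's Rayleigh-Logarithmic estimators are not reproduced here.

```python
# Monte Carlo MSE comparison sketch: Rayleigh scale theta = sigma^2,
# MLE vs. Bayes (Jeffreys prior, squared-error loss -> posterior mean).
import numpy as np

rng = np.random.default_rng(1)
theta, reps = 2.0, 5000                      # assumed true value, replications

for n in (10, 50, 100):
    mle, bayes = np.empty(reps), np.empty(reps)
    for r in range(reps):
        x = rng.rayleigh(scale=np.sqrt(theta), size=n)
        s = np.sum(x ** 2) / 2.0
        mle[r] = s / n                       # maximum likelihood estimator
        bayes[r] = s / (n - 1)               # inverse-gamma posterior mean
    print(n, "MSE(MLE):", np.mean((mle - theta) ** 2),
             "MSE(Bayes):", np.mean((bayes - theta) ** 2))
```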
Transforming the common normal distribution through the Kummer Beta generator into the Kummer Beta Generalized Normal Distribution (KBGND) is achieved. The distribution parameters and hazard function are then estimated using the MLE method, and these estimates are improved by employing a genetic algorithm. Simulation is used, assuming a number of models and different sample sizes. The main finding is that the common maximum likelihood (MLE) method is best in estimating the parameters of the Kummer Beta Generalized Normal Distribution (KBGND) compared to the genetic algorithm, according to the Mean Squared Error (MSE) criterion for the parameters and the Integrated Mean Squared Error (IMSE) criterion in estimating the hazard function. While the pr
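The "refine the MLE with an evolutionary search" idea can be sketched generically. Below, a generalized normal model stands in for the KBGND (whose Kummer-function density is not reproduced here), and SciPy's differential evolution plays the role of the genetic refinement; all starting values and bounds are assumptions.

```python
# Sketch: local MLE vs. evolutionary (GA-like) refinement of the fit.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(2)
x = stats.gennorm.rvs(beta=1.5, loc=0.0, scale=2.0, size=200, random_state=rng)

def nll(p):
    """Negative log-likelihood of the generalized normal stand-in model."""
    beta, loc, scale = p
    if beta <= 0 or scale <= 0:
        return np.inf
    return -np.sum(stats.gennorm.logpdf(x, beta, loc=loc, scale=scale))

# plain MLE from a local optimizer, possibly stuck near its starting point
local = optimize.minimize(nll, x0=[1.0, 0.0, 1.0], method="Nelder-Mead")
# evolutionary refinement over a box of plausible parameter values
bounds = [(0.2, 5.0), (-5.0, 5.0), (0.1, 10.0)]
evo = optimize.differential_evolution(nll, bounds, seed=3, polish=True)
print("local:", local.x, "evolutionary:", evo.x)
```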
In this paper an estimator of the reliability function for the Pareto distribution of the first kind is derived, and a simulation study by the Monte Carlo method is then carried out to compare the Bayes estimator of the reliability function with the maximum likelihood estimator of this function. It is found that the Bayes estimator is better than the maximum likelihood estimator for all sample sizes, using the integrated mean squared error (IMSE).
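For the first-kind Pareto with known scale k, both estimators of R(t) = (k/t)^α have closed forms (under a Jeffreys prior on α the posterior is Gamma), so the IMSE comparison can be sketched directly; the true parameter values and the t-grid below are illustrative assumptions.

```python
# IMSE comparison sketch for Pareto-I reliability R(t) = (k/t)^alpha.
import numpy as np

rng = np.random.default_rng(4)
alpha, k, n, reps = 2.0, 1.0, 30, 2000       # assumed truth and design
ts = np.linspace(1.1, 5.0, 40)               # grid where R(t) is evaluated
R_true = (k / ts) ** alpha

se_mle, se_bay = np.zeros_like(ts), np.zeros_like(ts)
for _ in range(reps):
    x = k * (1.0 - rng.random(n)) ** (-1.0 / alpha)   # inverse-CDF sampling
    T = np.sum(np.log(x / k))
    R_mle = (k / ts) ** (n / T)                        # plug-in MLE of R(t)
    R_bay = (T / (T + np.log(ts / k))) ** n            # posterior mean of R(t)
    se_mle += (R_mle - R_true) ** 2
    se_bay += (R_bay - R_true) ** 2

dt = ts[1] - ts[0]
print("IMSE(MLE): ", np.sum(se_mle / reps) * dt)
print("IMSE(Bayes):", np.sum(se_bay / reps) * dt)
```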
The aim of this work is to evaluate one-electron expectation values from the radial electronic density function D(r1) for different wave functions for the 2S state of the Be atom. The wave functions used were published in 1960, 1974, and 1993, respectively. Using a Hartree-Fock wave function written as a Slater determinant, the partitioning technique was applied to analyse the Be (1s²2s²) state in terms of six electronic pair wave functions: two of these are for the intra-shells (K, L) and the rest are for the inter-shells (KL). The results were obtained numerically using computer programs (Mathcad).
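Numerically, each expectation value is a radial integral, ⟨r^m⟩ = ∫₀^∞ D(r) r^m dr. A minimal sketch follows (in Python rather than Mathcad), using a single normalized 1s Slater-type orbital with an assumed effective exponent as a stand-in for the published Be wave functions.

```python
# Sketch: one-electron expectation values <r^m> from a radial density D(r).
import numpy as np
from scipy.integrate import quad

zeta = 3.68                       # assumed effective exponent for a Be 1s STO

def D(r):
    """Radial density r^2 * R(r)^2 for a normalized 1s Slater-type orbital."""
    R = 2.0 * zeta ** 1.5 * np.exp(-zeta * r)
    return r ** 2 * R ** 2

norm, _ = quad(D, 0, np.inf)                  # should be 1 for a proper D(r)
for m in (-1, 1, 2):
    val, _ = quad(lambda r: D(r) * r ** m, 0, np.inf)
    print(f"<r^{m}> =", val / norm)
```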
The aim of this paper is to find the Bayes estimator under a new loss function that combines symmetric and asymmetric loss functions, namely a proposed entropy loss function that merges the entropy loss function and the squared-log error loss function, which is quite asymmetric in nature. The Bayes estimators of the exponential distribution under the proposed function, and under the loss functions that are its ingredients, are then compared using the standard mean squared error (MSE) and the bias quantity (Mbias), where the random data are generated by simulation to estimate the exponential distribution parameters for different sample sizes (n=10, 50, 100) and (N=1000), taking initial
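A sketch of that comparison for the exponential rate under a conjugate Gamma prior is below. The component losses shown (squared-error, entropy, squared-log error) each give a closed-form Bayes estimator from the Gamma posterior; the paper's proposed combined loss is not reproduced, and the prior hyperparameters are assumptions.

```python
# MSE/Mbias comparison sketch for Bayes estimators of an exponential rate
# under three standard losses, with a Gamma(a, b) conjugate prior.
import numpy as np
from scipy.special import psi   # digamma

rng = np.random.default_rng(5)
theta, a, b, N = 1.5, 2.0, 1.0, 1000        # assumed rate, prior, replications

for n in (10, 50, 100):
    est = {"SEL": [], "entropy": [], "sq-log": []}
    for _ in range(N):
        x = rng.exponential(scale=1.0 / theta, size=n)
        alpha, beta = a + n, b + np.sum(x)   # Gamma posterior for the rate
        est["SEL"].append(alpha / beta)                  # posterior mean
        est["entropy"].append((alpha - 1) / beta)        # [E(1/theta)]^(-1)
        est["sq-log"].append(np.exp(psi(alpha)) / beta)  # exp(E[log theta])
    for name, e in est.items():
        e = np.asarray(e)
        print(n, name, "MSE:", np.mean((e - theta) ** 2),
              "Mbias:", np.mean(e - theta))
```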
In this paper, reliable computational methods (RCMs) based on the monomial standard polynomials are executed to solve the problem of Jeffery-Hamel flow (JHF). In addition, convenient basis functions, namely the Bernoulli, Euler, and Laguerre polynomials, are used to enhance the reliability of the computational methods. Using such functions turns the problem into a set of solvable nonlinear algebraic equations that Mathematica® 12 can solve. The JHF problem is solved with the help of improved reliable computational methods (I-RCMs), and a review of the methods is given. Published results are used to make comparisons. As further evidence of the accuracy and dependability of the proposed methods, the maximum error remainder
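The basic recipe, expanding the unknown in a polynomial basis and collocating the residual to get a nonlinear algebraic system, can be sketched in a few lines. Below is a monomial-basis version in Python (the paper uses Mathematica), assuming the standard similarity form of the Jeffery-Hamel equation F''' + 2αRe·F·F' + 4α²F' = 0 with F(0)=1, F'(0)=0, F(1)=0, and assumed values for the half-angle α and Reynolds number Re.

```python
# Monomial-basis collocation sketch for the Jeffery-Hamel similarity equation.
import numpy as np
from numpy.polynomial import polynomial as P
from scipy.optimize import fsolve

alpha, Re, N = np.deg2rad(5.0), 50.0, 9      # assumed half-angle, Re, #terms
pts = np.linspace(0.1, 0.9, N - 3)           # interior collocation points

def residuals(c):
    """ODE residual at the collocation points plus the three BCs."""
    F   = lambda e: P.polyval(e, c)
    dF  = lambda e: P.polyval(e, P.polyder(c, 1))
    d3F = lambda e: P.polyval(e, P.polyder(c, 3))
    ode = d3F(pts) + 2 * alpha * Re * F(pts) * dF(pts) + 4 * alpha**2 * dF(pts)
    bcs = [F(0.0) - 1.0, dF(0.0), F(1.0)]
    return np.concatenate([ode, bcs])

c0 = np.zeros(N); c0[0] = 1.0; c0[2] = -1.0  # start near F = 1 - eta^2
c = fsolve(residuals, c0)
print("F(0.5) ≈", P.polyval(0.5, c))
```

Swapping the monomial basis for Bernoulli, Euler, or Laguerre polynomials changes only how F and its derivatives are evaluated; the collocation structure stays the same.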
In this research, optical communication coding systems are designed and constructed by utilizing the frequency shift code (FSC) technique. Calculations of the system quality, represented by the signal-to-noise ratio (S/N), bit error rate (BER), and power budget, are carried out. In the FSC system, non-return-to-zero (NRZ) data at a bit rate of 190 kb/s are entered into the FSC encoder circuit in the transmitter unit. These data modulate the HFCT-5205 laser source at a wavelength of 1310 nm by the intensity modulation (IM) method, and the data are then transferred through single-mode (SM) optical fiber. The NRZ data are recovered using the decoder circuit in the receiver unit. The calculations of BER and S/N for the FSC system a
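The kind of arithmetic behind a power budget and a BER figure can be sketched as follows; every number below (launch power, losses, sensitivity, S/N) is an assumed placeholder, not a measurement from the paper.

```python
# Link-quality sketch: dB power budget plus a Gaussian-noise BER estimate
# using BER = 0.5 * erfc(Q / sqrt(2)).
import math

p_tx_dbm      = -3.0          # assumed launch power of the 1310 nm source
fiber_loss_db = 0.35 * 2.0    # assumed 0.35 dB/km over 2 km of SM fiber
connector_db  = 2 * 0.5       # two connectors at 0.5 dB each (assumed)
margin_db     = 3.0           # safety margin
rx_sens_dbm   = -30.0         # assumed receiver sensitivity at 190 kb/s

p_rx_dbm = p_tx_dbm - fiber_loss_db - connector_db
print("received power: %.2f dBm, budget spare: %.2f dB"
      % (p_rx_dbm, p_rx_dbm - rx_sens_dbm - margin_db))

snr_db = 17.0                           # assumed electrical S/N at the decider
q = math.sqrt(10 ** (snr_db / 10.0))    # Q factor from the linear S/N
ber = 0.5 * math.erfc(q / math.sqrt(2.0))
print("Q = %.2f, BER ≈ %.2e" % (q, ber))
```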
The non-homogeneous Poisson process is considered one of the statistical subjects of importance to other sciences, with wide application in different areas such as waiting lines (queues), repairable systems, computer and communication systems, and reliability theory, among many others. It is also used in modeling phenomena that occur in a non-constant way over time (all events whose rate changes with time).
This research deals with some of the basic concepts related to the non-homogeneous Poisson process. It carries out two models of the non-homogeneous Poisson process, the power law model and the Musa-Okumoto model, to estimate th
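For the power law model, the intensity is λ(t) = (β/θ)(t/θ)^(β−1), and with events observed on (0, T] the maximum likelihood estimates take the familiar closed forms used below; the true parameter values are assumed for the demonstration.

```python
# Simulate a power law NHPP on (0, T] and recover (beta, theta) by MLE.
import numpy as np

rng = np.random.default_rng(6)
beta, theta, T = 1.8, 10.0, 100.0            # assumed true values, window

# Given N(T) ~ Poisson(m(T)) with m(t) = (t/theta)^beta, the event times
# are iid with CDF (t/T)^beta on (0, T].
n = rng.poisson((T / theta) ** beta)
t = np.sort(T * rng.random(n) ** (1.0 / beta))

beta_hat = n / np.sum(np.log(T / t))         # time-truncated MLE
theta_hat = T / n ** (1.0 / beta_hat)
print(f"n = {n}, beta_hat = {beta_hat:.3f}, theta_hat = {theta_hat:.3f}")
```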
The prevalence of internet of things (IoT) applications in many fields of human life, such as the economy, social life, and healthcare, has made IoT devices targets of many cyber-attacks. Besides, the resource limitations of IoT devices, such as tiny battery power, small storage capacity, and low calculation speed, make their security a big challenge for researchers. Therefore, in this study a new technique is proposed, called the intrusion detection system based on a spiking neural network and decision tree (IDS-SNNDT). In this method, the DT is used to select the optimal samples that are fed as input to the SNN, while the SNN utilizes the non-leaky integrate-and-fire (NLIF) neuron model in order to reduce latency and minimize devices
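The NLIF neuron itself is simple to state: the membrane potential accumulates weighted input spikes with no leak term and fires on crossing a threshold. A minimal sketch follows, in which the weights, threshold, and input spike trains are illustrative stand-ins for the DT-selected features of the IDS-SNNDT pipeline.

```python
# Single non-leaky integrate-and-fire (NLIF) neuron over binary spike trains.
import numpy as np

rng = np.random.default_rng(7)
steps, n_inputs = 50, 8
weights = rng.normal(0.0, 0.4, n_inputs)          # assumed synaptic weights
spikes_in = rng.random((steps, n_inputs)) < 0.2   # stand-in DT-selected input

v, threshold, out = 0.0, 1.0, []
for s in spikes_in:
    v += float(weights @ s)      # integrate input; no leak (non-leaky)
    if v >= threshold:
        out.append(1)            # emit an output spike...
        v = 0.0                  # ...and reset the membrane potential
    else:
        out.append(0)
print("output spike train:", out)
```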