Bragg reflectors consist of periodic dielectric layers, each with an optical path length of a quarter wavelength; this periodicity gives them important spectral properties and makes them suitable for optoelectronic applications. The reflectivity can be raised to a required value by increasing the number of layers in the mirror. For example, for an 8-layer Bragg mirror (two layers per dielectric pair), a refractive-index contrast of 0.275 is needed to reach a reflectivity > 99%; doubling the number of layers raises the reflectivity to 99.99%. The high reflectivity is caused purely by multiple-interference effects. It can be analyzed with various matrix methods, such as the transfer matrix method (TMM), the simplest method for studying the characteristics of devices built from alternating layers.
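As an illustration of the TMM mentioned above, the following is a minimal sketch that multiplies the characteristic matrices of quarter-wave layers at the design wavelength and computes the stack reflectivity. The refractive indices and layer counts used here are illustrative assumptions, not values from the paper.

```python
import cmath

def layer_matrix(n, delta):
    """Characteristic matrix of one dielectric layer.
    n: refractive index; delta: phase thickness (pi/2 for a quarter-wave layer)."""
    c, s = cmath.cos(delta), cmath.sin(delta)
    return [[c, 1j * s / n], [1j * n * s, c]]

def matmul(a, b):
    """2x2 complex matrix product."""
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def bragg_reflectivity(n_h, n_l, pairs, n_in=1.0, n_sub=1.5):
    """Reflectivity of a quarter-wave (H L)^pairs stack at the design wavelength."""
    m = [[1, 0], [0, 1]]
    for _ in range(pairs):
        m = matmul(m, layer_matrix(n_h, cmath.pi / 2))
        m = matmul(m, layer_matrix(n_l, cmath.pi / 2))
    b = m[0][0] + m[0][1] * n_sub
    c = m[1][0] + m[1][1] * n_sub
    r = (n_in * b - c) / (n_in * b + c)
    return abs(r) ** 2

# More pairs -> higher reflectivity (example indices are illustrative only)
print(bragg_reflectivity(2.3, 1.45, 4))
print(bragg_reflectivity(2.3, 1.45, 8))
```

For a quarter-wave stack this reduces to the known closed form R = ((n_in - Y)/(n_in + Y))^2 with Y = (n_H/n_L)^(2N) n_sub, so the matrix product can be checked against it.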
This paper deals with thirteenth-order linear and nonlinear boundary value problems using the Modified Adomian Decomposition Method (MADM). The analytical results are obtained in terms of convergent series with easily computable components. Two numerical examples show that this method is a promising and powerful tool for solving such problems.
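The recursive idea behind Adomian-type decompositions, namely an initial component followed by components generated by repeated integration, can be sketched on a deliberately simple toy problem (not one of the thirteenth-order examples from the paper): u'(x) = u(x), u(0) = 1, whose decomposition series sums to e^x.

```python
def integrate_poly(coeffs):
    """Antiderivative of a polynomial given as a coefficient list
    (lowest degree first), with zero constant of integration."""
    return [0.0] + [c / (k + 1) for k, c in enumerate(coeffs)]

def poly_eval(coeffs, x):
    return sum(c * x**k for k, c in enumerate(coeffs))

def adomian_series(n_terms):
    """Adomian components for the toy problem u'(x) = u(x), u(0) = 1.
    u0 carries the initial condition; each subsequent component is the
    integral of the previous one, mirroring the recursion of (M)ADM."""
    components = [[1.0]]                 # u0(x) = 1
    for _ in range(n_terms - 1):
        components.append(integrate_poly(components[-1]))
    return components

# Partial sum of the decomposition series at x = 1
comps = adomian_series(12)
approx = sum(poly_eval(c, 1.0) for c in comps)
print(approx)   # close to e = 2.71828...
```

The easily computable components here are simply x^k/k!, which is what "convergent series with easily computable components" means in this toy setting.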
With the proliferation of both Internet access and data traffic, recent breaches have brought into sharp focus the need for Network Intrusion Detection Systems (NIDS) to protect networks from increasingly complex cyberattacks. To differentiate between normal network processes and possible attacks, Intrusion Detection Systems (IDS) often employ pattern recognition and data mining techniques, automatically detecting and classifying network and host system intrusions, assaults, and policy violations. Using Python's Scikit-Learn library, the results of this study show that Machine Learning (ML) techniques such as Decision Tree (DT), Naïve Bayes (NB), and K-Nearest Neighbor (KNN) can enhance the effectiveness of an Intrusion Detection System.
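Scikit-Learn provides DT, NB, and KNN classifiers out of the box; the core idea behind one of them, KNN, can be sketched in a few lines of plain Python. The feature vectors and labels below are hypothetical toy connection records, not data from the study.

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.
    `train` is a list of (feature_vector, label) pairs -- here, hypothetical
    connection records reduced to numeric features."""
    dist = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    neighbours = sorted(train, key=lambda item: dist(item[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Toy records: (duration, bytes_sent) scaled to [0, 1]; labels are illustrative
train = [((0.1, 0.2), "normal"), ((0.15, 0.25), "normal"),
         ((0.9, 0.8), "attack"), ((0.85, 0.95), "attack"),
         ((0.2, 0.1), "normal")]
print(knn_predict(train, (0.88, 0.9)))   # nearest neighbours are attacks
```

In practice the study's setting would use a full NIDS dataset and `sklearn.neighbors.KNeighborsClassifier` rather than this hand-rolled version.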
Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional algorithms in data mining and machine learning do not scale well with data size. Mining and learning from big data need time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances, as well as data generated from multiple data sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining and learning tasks.
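A minimal sketch of such a structure, under assumptions of my own (the paper's actual design is not given here): numeric values on [0, 1) are summarized into (count, sum) bins at several resolutions at once, each instance is folded in incrementally, and coarser resolutions trade accuracy for memory.

```python
class MultiResolutionAggregate:
    """Toy multi-resolution summary of a numeric stream on [0, 1).
    Resolution r keeps 2**r bins of (count, sum); coarse resolutions are
    cheap but lossy, fine resolutions cost more memory but are more
    accurate. Built once, then updated incrementally per instance."""
    def __init__(self, max_resolution=4):
        self.levels = {r: [[0, 0.0] for _ in range(2 ** r)]
                       for r in range(1, max_resolution + 1)}

    def add(self, x):
        """Incrementally fold one instance into every resolution."""
        for bins in self.levels.values():
            b = min(int(x * len(bins)), len(bins) - 1)
            bins[b][0] += 1
            bins[b][1] += x

    def bin_means(self, r):
        """Approximate per-bin means at resolution r (None for empty bins)."""
        return [s / c if c else None for c, s in self.levels[r]]

agg = MultiResolutionAggregate(max_resolution=3)
for x in [0.05, 0.1, 0.4, 0.45, 0.9, 0.95]:
    agg.add(x)
print(agg.bin_means(1))   # 2 coarse bins
print(agg.bin_means(3))   # 8 finer bins
```

A mining algorithm can then read whichever resolution matches its time and memory budget, which is the efficiency/accuracy trade-off described above.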
Polyacrylonitrile (PAN) nanofibers (PANFs), based on a well-known polymer, have been extensively employed in the manufacture of carbon nanofibers (CNFs), which have recently gained substantial attention owing to excellent features such as spinnability, environmental friendliness, and commercial feasibility. Because of their high carbon yield, their versatility in tailoring the final CNF structure, and the simple formation of ladder structures through nitrile polymerization to yield stable products, PAN and PAN-based CNFs have been the focus of extensive research as production precursors. For instance, the development of biomedical and high-performance composites has now become achievable. A PAN homopolymer or PAN-based precursor copolymer can
The problem of the study lies in the gap between the Information Technology and Knowledge Management that banks possess and what they need to support decision making, problem solving, and the achievement of High Business Value. This gap formed the focal point of the study's analysis and interpretation, which was carried out through a scientific methodology across five chapters.
The study aimed at analyzing how partnering Information Technology with Knowledge Management can achieve High Business Value at commercial banks in Jordan. Data were collected from 116 managers, experts, and advisors working for 16 Jordanian banks through a questionnaire.
In this research, we studied multiple linear regression models with two variables in the presence of autocorrelation among the error-term observations, when the errors follow a general logistic distribution. The autoregression model is involved in studying and analyzing the relationship between the variables, and through this relationship the forecasting of the variables' values is carried out. A simulation technique is used to compare the estimation methods (Generalized Least Squares, M Robust, and Laplace) by the mean square error criterion, for different sample sizes (20, 40, 60, 80, 100, 120). The M Robust method proved to be the best method.
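Of the three estimators compared, the M Robust idea is the easiest to sketch in isolation. The following toy example (my own, for a simple location parameter rather than the paper's regression setting) uses Huber weights in an iteratively reweighted average, showing how observations with outlying errors are down-weighted.

```python
def huber_location(data, c=1.345, iters=50):
    """Huber M-estimate of location via iteratively reweighted averaging.
    Observations farther than c from the current estimate get weight
    c/|residual| instead of 1, which is what makes M-estimation robust
    to outlying error terms."""
    mu = sum(data) / len(data)          # start from the ordinary mean
    for _ in range(iters):
        weights = [1.0 if abs(x - mu) <= c else c / abs(x - mu) for x in data]
        mu = sum(w * x for w, x in zip(weights, data)) / sum(weights)
    return mu

sample = [0.8, 1.1, 0.9, 1.2, 1.0, 8.0]   # one gross outlier
print(sum(sample) / len(sample))           # mean is dragged toward 8.0
print(huber_location(sample))              # M-estimate stays near 1
```

In the regression setting of the paper, the same reweighting is applied to residuals inside an iteratively reweighted least squares loop rather than to a single location parameter.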
Compressing an image and reconstructing it without degrading its original quality is a challenge that still exists nowadays. A coding system that considers both quality and compression rate is implemented in this work. The implemented system applies a highly synthetic entropy-coding scheme to store the compressed image at the smallest possible size without affecting its original quality. This coding scheme is applied with two transform-based techniques, one using the Discrete Cosine Transform and the other the Discrete Wavelet Transform. The implemented system was tested with different standard color images, and the results obtained with different evaluation metrics are shown. A comparison was made with some previous related works.
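The first of the two transforms, the DCT, can be sketched directly from its definition. This is a plain 2-D DCT-II on a small block (not the paper's implementation): for smooth image blocks the energy concentrates in a few low-frequency coefficients, which is what makes the subsequent entropy coding effective.

```python
import math

def dct2(block):
    """2-D DCT-II of a square block (list of lists), orthonormal scaling.
    Transform coders keep the few large low-frequency coefficients;
    the rest quantize to zero and compress well."""
    n = len(block)
    def alpha(k):
        return math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = 0.0
            for x in range(n):
                for y in range(n):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * n)))
            out[u][v] = alpha(u) * alpha(v) * s
    return out

# A flat 4x4 block: all energy ends up in the single DC coefficient
flat = [[10.0] * 4 for _ in range(4)]
coeffs = dct2(flat)
print(coeffs[0][0])   # DC term = n * mean = 4 * 10 = 40.0
```

Production coders use fast separable DCTs rather than this O(n^4) direct form, but the output coefficients are the same.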
An easy, selective, and precise High-Performance Liquid Chromatography (HPLC) procedure was developed and validated for the estimation of Piroxicam and Codeine phosphate. Chromatographic separation was accomplished on a C18 column (BDS Hypersil C18, 5 μm, 150 × 4.6 mm) using a mobile phase of methanol : phosphate buffer (60:40, v/v, pH 2.3) at a flow rate of 1.1 mL/min, with UV detection at 214 nm. System Suitability Tests (SSTs) are typically performed to assess the suitability and effectiveness of the entire chromatographic system. The retention time was found to be 3.95 minutes for Piroxicam and 1.46 minutes for Codeine phosphate. The developed method was validated for precision, limit of quantitation, and specificity.
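One of the quantities checked in such system suitability tests is the resolution between adjacent peaks, Rs = 2(t2 − t1)/(w1 + w2). A small worked calculation using the reported retention times is shown below; the baseline peak widths are hypothetical placeholders, since the abstract does not report them.

```python
def resolution(t1, t2, w1, w2):
    """Resolution between two peaks: Rs = 2*(t2 - t1)/(w1 + w2),
    with retention times t and baseline peak widths w in minutes."""
    return 2.0 * (t2 - t1) / (w1 + w2)

# Retention times from the method (Codeine phosphate 1.46 min,
# Piroxicam 3.95 min); the widths 0.4 and 0.5 min are assumed values.
rs = resolution(1.46, 3.95, 0.4, 0.5)
print(rs)
```

A value well above 2 would indicate baseline separation of the two analytes under these assumed widths.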
Vegetation constitutes most of the globe's land cover (LC), and plants have received increased attention since they represent an element of balance in natural ecology, maintaining the natural balance against rapid changes caused by systematic and random human uses. The subject of the current study, Bassia eriophora, represents an essential part of the United Nations Land Cover Classification System (LCCS), developed by the Food and Agriculture Organization (FAO) and the United Nations Environment Programme (UNEP) to observe basic environmental elements with modern techniques. Although this plant is distributed all over Iraq, we found that it exists primarily in the middle of the country.