This work reviews the origins of fractional calculus and how the Sumudu and Elzaki transforms are applied to fractional derivatives. The approach combines the two into a double Sumudu-Elzaki transform method for finding analytic solutions, expressed in terms of Mittag-Leffler functions, to space-time fractional partial differential equations subject to initial and boundary conditions. The method converges toward the exact solution, and its efficacy is demonstrated through numerical examples computed in MATLAB R2015a.
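Since the solutions are expressed in Mittag-Leffler functions, a truncated-series evaluation of the one-parameter Mittag-Leffler function can serve as a minimal numerical sketch (the function name and truncation length are illustrative choices, not from the paper):

```python
import math

def mittag_leffler(z, alpha, n_terms=50):
    """Truncated power series for the one-parameter Mittag-Leffler function
    E_alpha(z) = sum_{k>=0} z**k / Gamma(alpha*k + 1).
    For alpha = 1 this reduces to the ordinary exponential exp(z)."""
    return sum(z**k / math.gamma(alpha * k + 1) for k in range(n_terms))
```

As a sanity check, `mittag_leffler(z, 1.0)` should closely match `exp(z)` for moderate `z`.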
A two-time-step stochastic, multivariable, multisite hydrological forecasting model was developed and verified using a case study. The philosophy of the model is to use cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously to estimate the model parameters, which are then refined using the mutation operator of a genetic-algorithm optimization model. The objective function to be minimized is the Akaike information criterion (AIC). The case study involves four variables at three sites: monthly air temperature, humidity, precipitation, and evaporation at Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was
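The objective minimized by the genetic algorithm is the Akaike criterion; a common least-squares form of it can be sketched as follows (the exact formulation used in the paper is not given, so this helper is an assumption):

```python
import math

def aic(rss, n, k):
    """Akaike information criterion for a least-squares model fit:
    AIC = n * ln(RSS / n) + 2 * k,
    where RSS is the residual sum of squares, n the number of
    observations, and k the number of estimated parameters.
    Lower AIC indicates a better trade-off between fit and complexity."""
    return n * math.log(rss / n) + 2 * k
```

In a genetic-algorithm loop, each candidate parameter set would be scored with this function and mutated toward lower AIC values.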
This study presents a control system for regulating electron-lens resistance in order to obtain stabilized electron-lens power. It lays out the fundamental challenges, conceptual design choices, and development status of the Integrable Optics Test Accelerator (IOTA) in progress at Fermilab. An effective automatic gain control (AGC) unit is introduced that prevents fluctuations in the internal resistance of the electron lens, caused by environmental influences, from affecting the system's current and power values, keeping them stable. With this unit, the system remains balanced and unaffected by environmental variations around the electron lens.
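The stabilizing role of the AGC unit can be illustrated with a single step of a proportional feedback loop; this is a generic sketch of the AGC idea, not the paper's actual controller, and the gain constant `k_p` is a hypothetical parameter:

```python
def agc_step(power_measured, power_target, gain, k_p=0.1):
    """One update of a proportional automatic-gain-control loop.
    When measured power drops below the target (e.g. because the lens
    resistance drifted), the gain is raised; when it overshoots, the
    gain is lowered. Repeated application drives the power toward the
    target despite slow environmental drift."""
    error = power_target - power_measured
    return gain + k_p * error
```

Repeating this update at each sampling interval keeps the delivered power near its setpoint.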
Improving students’ use of argumentation is front and center in the increasing emphasis on scientific practice in K-12 science and STEM programs. We explore the construct validity of scenario-based assessments of claim-evidence-reasoning (CER) and the structure of the CER construct with respect to a learning-progression framework. We also seek to understand how middle school students progress. Establishing the purpose of an argument is a competency that a majority of middle school students meet, whereas quantitative reasoning is the most difficult; the Rasch model indicates that the competencies form a unidimensional hierarchy of skills. We also find no evidence of differential item functioning between scenarios, suggesting
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important component of secret-key cryptography, is the key itself: it plays a central role in achieving a higher level of secure communication. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard (3DES) algorithm is weakened by its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; an enhanced encryption key strengthens 3DES security. This paper proposes a combination of two efficient encryption algorithms to
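One standard way to strengthen a weak key before it feeds 3DES is to derive it from a shared secret with a key-derivation function. The paper's actual key-enhancement scheme is not specified here, so the following is only an illustrative sketch using the standard library's PBKDF2:

```python
import hashlib

def derive_3des_key(secret: bytes, salt: bytes) -> bytes:
    """Derive a 24-byte (three-key) Triple DES key from a shared secret
    using PBKDF2-HMAC-SHA256, instead of using the raw secret directly.
    The salt and iteration count are illustrative values; in practice the
    salt must be random and stored alongside the ciphertext."""
    return hashlib.pbkdf2_hmac("sha256", secret, salt, 100_000, dklen=24)
```

The derived key is uniformly spread over the full 24-byte keyspace even when the original secret is short or low-entropy, which addresses the weak-key-generation concern the abstract raises.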
A high-performance liquid chromatography (HPLC) method was employed for the quantitative determination of ascorbic acid (AA), known as vitamin C, in three types of Iraqi citrus (orange, mandarin, and aurantium). To this end, evaluating ascorbic-acid degradation is important because of its pronounced instability under ordinary atmospheric conditions. Chromatographic analysis of AA was carried out after sequential elution with KH2PO4 as the mobile phase, using a reverse-phase HPLC technique with a C8 column and UV detection at 214 nm. Poor resolution was clearly apparent with the C8 column, so an alternative condition was adopted to improve resolution by replacing the C8 column with a C18 column. Statistical treat
Image compression plays an important role in reducing data size and storage requirements while significantly increasing transmission speed over the Internet. It has been an important research topic for several decades, and recently, with the great successes achieved by deep learning in many areas of image processing, its use in image compression has grown steadily. Deep neural networks have likewise achieved great success in processing and compressing images of various sizes. In this paper, we present an image-compression architecture based on a convolutional autoencoder (CAE) for deep learning, inspired by the diversity of human eye
An optical-fiber chemical sensor based on surface plasmon resonance for sensing and measuring the refractive index and concentration of acetic acid was designed and implemented in this work. Optical-grade plastic optical fibers with a diameter of 1000 μm were used, with a core diameter of 980 μm and a cladding of 20 μm. The sensor was fabricated by embedding a small section (10 mm) at the middle of the fiber in a resin block and polishing it; a gold layer about 40 nm thick was then deposited, and the acetic acid was placed on the sensing probe.
A database is an organized, distributed collection of data arranged so that a client can access the stored data easily and conveniently. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
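The MapReduce pattern itself can be sketched in a few lines of in-process Python; this is only an illustration of the map/shuffle/reduce phases (the paper runs the real thing on a Hadoop cluster), and the EEG record format in the usage example is hypothetical:

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Minimal single-process sketch of the MapReduce pattern.
    mapper:  record -> iterable of (key, value) pairs
    reducer: (key, list_of_values) -> aggregated result
    The intermediate grouping by key stands in for Hadoop's shuffle phase."""
    groups = defaultdict(list)
    for record in records:                      # map + shuffle
        for key, value in mapper(record):
            groups[key].append(value)
    return {k: reducer(k, vs) for k, vs in groups.items()}  # reduce

# Usage: count samples per EEG channel label (hypothetical record format).
records = [("Fp1", 0.3), ("Fp1", 0.1), ("Cz", 0.7)]
counts = map_reduce(records, lambda r: [(r[0], 1)], lambda k, vs: sum(vs))
```

Because each mapper call and each reducer call is independent, the same two functions can be distributed across cluster nodes unchanged, which is what makes the technique attractive for large EEG datasets.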
Entropy, defined as a measure of uncertainty, is transformed here using the cumulative distribution function and the reliability function of the Burr Type-XII distribution. For data that exhibit volatility, a probability model is built for each failure in a sample once the function satisfies the requirements of a probability distribution. A formula for the distribution of the new entropy transform of the continuous Burr Type-XII distribution was derived; the new function was tested and found to satisfy the conditions of a probability distribution. Its mean and cumulative distribution function were also derived so that they could be used to generate data for implementing the simulation
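The two functions the transform is built on, and the data-generation step for the simulation, can be sketched directly from the standard Burr Type-XII form F(x) = 1 - (1 + x^c)^(-k); the function names are illustrative, and the paper's specific parameterization is assumed to match this standard one:

```python
def burr_xii_cdf(x, c, k):
    """Cumulative distribution function of Burr Type-XII:
    F(x) = 1 - (1 + x**c)**(-k), for x >= 0 and shape parameters c, k > 0."""
    return 1.0 - (1.0 + x**c) ** (-k)

def burr_xii_reliability(x, c, k):
    """Reliability (survival) function R(x) = 1 - F(x) = (1 + x**c)**(-k)."""
    return (1.0 + x**c) ** (-k)

def burr_xii_sample(u, c, k):
    """Inverse-transform sampling: solve F(x) = u for x, where u is a
    uniform(0, 1) draw. This is the step that generates Burr Type-XII
    data for a simulation study."""
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)
```

Feeding uniform random draws through `burr_xii_sample` yields the simulated failure data on which the entropy transform can then be evaluated.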