The artificial fish swarm algorithm (AFSA) is one of the key swarm intelligence algorithms. In this paper, the authors enhance AFSA with diversity operators (AFSA-DO). The diversity operators produce more diverse solutions, helping AFSA reach better results. AFSA-DO is applied to the flexible job shop scheduling problem (FJSSP), a significant problem in the domain of optimization and operations research. Several studies have addressed this problem, including approaches based on swarm intelligence. In this paper, a set of FJSSP benchmark instances is solved with the improved algorithm to confirm its effectiveness and evaluate its performance. The paper concludes that the diversity-enhanced algorithm differs markedly from the original AFSA, delivering both good solution quality and a good convergence rate.
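The abstract does not detail how the diversity operators work. As a hedged illustration only, the sketch below shows one common form such an operator can take: when the swarm's spread falls below a threshold, the worst fish are re-scattered at random. The sphere objective, the threshold, and the re-scatter fraction are all assumptions for illustration, not the authors' method.

```python
import random

def sphere(x):
    """Toy objective: minimize the sum of squares."""
    return sum(v * v for v in x)

def diversity(pop):
    """Mean distance of each fish from the population centroid."""
    dim = len(pop[0])
    centroid = [sum(f[d] for f in pop) / len(pop) for d in range(dim)]
    return sum(
        sum((f[d] - centroid[d]) ** 2 for d in range(dim)) ** 0.5 for f in pop
    ) / len(pop)

def diversity_operator(pop, bounds, threshold=0.1, fraction=0.3):
    """If the swarm has collapsed, re-scatter the worst fraction of fish."""
    if diversity(pop) >= threshold:
        return pop
    pop = sorted(pop, key=sphere)          # best fish first
    keep = int(len(pop) * (1 - fraction))
    lo, hi = bounds
    dim = len(pop[0])
    fresh = [[random.uniform(lo, hi) for _ in range(dim)]
             for _ in range(len(pop) - keep)]
    return pop[:keep] + fresh

random.seed(0)
swarm = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(20)]
swarm = diversity_operator(swarm, bounds=(-5, 5))
print(len(swarm))  # population size is preserved: 20
```

Such an operator would be invoked once per AFSA iteration, after the usual prey/swarm/follow moves, to keep the population from converging prematurely.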
With growing global demand for hydrocarbons and shrinking conventional reserves, the gas industry is shifting its focus toward unconventional reservoirs. Tight gas reservoirs have typically been deemed uneconomical because of their low permeability, generally understood to be below 0.1 mD, which requires advanced drilling and stimulation techniques to enhance hydrocarbon recovery. The first step in determining the economic viability of such a reservoir, however, is to estimate how much gas is initially in place. Numerical simulation is regarded across the industry as the most accurate form of gas-in-place estimation, but it is extremely costly and time-consuming. The aim of this study is to provide a framework for a simple analytical method to estimate the gas initially in place.
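The abstract does not say which analytical method the study uses. A classic candidate for gas-in-place estimation is the p/z material-balance straight line, sketched below purely as an illustration: plot p/z against cumulative production Gp, fit a line, and extrapolate to p/z = 0 to read off the original gas in place G. All numbers here are hypothetical.

```python
# p/z material balance: p/z = (p_i/z_i) * (1 - Gp/G)
# Fit a straight line to (Gp, p/z) and extrapolate to the Gp axis.

# Hypothetical production history (Gp in Bscf, p in psia, z dimensionless)
Gp = [0.0, 1.2, 2.5, 3.8]
p  = [4800.0, 4350.0, 3870.0, 3400.0]
z  = [0.92, 0.90, 0.885, 0.87]

pz = [pi / zi for pi, zi in zip(p, z)]

# Ordinary least-squares line pz = a + b*Gp
n = len(Gp)
sx, sy = sum(Gp), sum(pz)
sxx = sum(x * x for x in Gp)
sxy = sum(x * y for x, y in zip(Gp, pz))
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
a = (sy - b * sx) / n

G = -a / b  # intercept with the Gp axis = original gas in place
print(f"OGIP ~ {G:.1f} Bscf")
```

Note that this simple form assumes a volumetric (depletion-drive) tight gas reservoir; water influx or desorption would require a modified material balance.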
In this paper, the electrical performance of a pentacene-based top-contact bottom-gate (TCBG) organic field-effect transistor (OFET) model with polymethyl methacrylate (PMMA) and silicon nitride (Si3N4) as gate dielectrics was simulated. The effect of gate-dielectric thickness on device performance was investigated; the thickness of the two gate-dielectric materials was varied in the range of 100–200 nm to maintain a large current density and stable performance. MATLAB simulations produced the model's output and transfer characteristics for the drain current and the transconductance. A layer thickness of 200 nm may give rise to gate leakage current, pointing to the requirement of optimizing the t…
A dispersive liquid–liquid microextraction (DLLME) combined with UV-Vis spectrophotometry for the preconcentration and determination of mefenamic acid in a pharmaceutical preparation was developed and introduced. The proposed method is based on charge-transfer complexation between mefenamic acid, an n-electron donor, and chloranil, a π-acceptor, forming a violet chromogen complex measured at 542 nm. The important parameters affecting the efficiency of DLLME were evaluated and optimized. Under the optimum conditions, the calibration graphs of the standard and the drug were linear over the range 0.03–10 µg mL⁻¹. The limits of detection and quantification and Sandell's sensitivity were calculated. Good recoveries of the mefenamic acid standard and the drug at 0.05, …
The azo Schiff base [from the reaction of 4-aminoantipyrine, p-hydroxyacetophenone, and o-phenylenediamine] has been prepared. Azo Schiff base chelates of Co(II), Ni(II), Cu(II), and Zn(II) ions were also prepared. The chemical structures of the azo Schiff base and its complexes were characterized by elemental analysis (CHN), molar conductance measurements, ¹H and ¹³C NMR, IR, mass, and electronic spectroscopy. The elemental analyses indicated an [L:M] combination in a 1:1 ratio. Based on the IR spectral values, the azo Schiff base compound acts as a neutral hexadentate ligand, bonding to the metal ion through the two hydroxyl, two azomethine, and two azo groups of the azo Schiff base compound; its role in chelation was confirmed by IR and ¹H and ¹³C NMR spectral outco…
In this study, an efficient compression system is introduced. It is based on a wavelet transform and two types of 3D surface representations: cubic Bezier interpolation (CBI) and 1st-order polynomial approximation. Each is applied at a different scale of the image: CBI is applied over the wide areas of the image to prune the components showing large-scale variation, while the 1st-order polynomial is applied over the small areas of the residue component (i.e., after subtracting the cubic Bezier surface from the image) to prune the locally smooth components and obtain better compression gain. The produced cubic Bezier surface is subtracted from the image signal to get the residue component. Then, t…
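As a small illustration of the residue-pruning idea, the sketch below implements only the 1st-order polynomial stage: a plane is least-squares-fitted to an image block and subtracted, leaving a small residual for the rest of the pipeline to encode. The block contents, sizes, and function names are made up for illustration; the cubic Bezier and wavelet stages are omitted.

```python
import numpy as np

def plane_residual(block):
    """Fit f(x, y) = c0 + c1*x + c2*y to a block; return (coeffs, residual)."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([np.ones(h * w), xs.ravel(), ys.ravel()])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel(), rcond=None)
    surface = (A @ coeffs).reshape(h, w)
    return coeffs, block - surface

# A smooth ramp plus a little texture: the plane absorbs the ramp,
# so the residual is small and cheap to encode.
rng = np.random.default_rng(1)
block = (np.add.outer(np.arange(8.0), 2.0 * np.arange(8.0))
         + 0.1 * rng.standard_normal((8, 8)))
coeffs, residual = plane_residual(block)
print(np.abs(residual).max())  # far below the block's own dynamic range
```

The compression gain comes from storing three plane coefficients plus a low-entropy residual instead of the raw block values.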
In this paper, we present a multiple-bit-error-correction coding scheme based on an extended Hamming product code combined with type-II HARQ, using shared resources, for on-chip interconnects. The shared resources reduce the hardware complexity of the encoder and decoder compared to the existing three-stage iterative decoding method for on-chip interconnects. The proposed decoding method achieves 20% and 28% reductions in area and power consumption, respectively, with only a small increase in decoder delay compared to the existing three-stage iterative decoding scheme for multiple-bit error correction. The proposed code also achieves an excellent improvement in residual flit error rate and saves up to 58% of total power consumption compared to the other err…
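The full product-code-plus-HARQ architecture is beyond the scope of an abstract, but the component code it builds on, an extended Hamming code (SECDED), can be sketched. The (8,4) instance below corrects any single-bit error and detects double-bit errors, the latter being the case where a HARQ scheme would request retransmission. This is an illustration of the component code only, not the authors' shared-resource design.

```python
def ham_encode(d):
    """Extended Hamming(8,4): encode 4 data bits into 8 code bits (SECDED)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    code = [p1, p2, d1, p3, d2, d3, d4]   # Hamming(7,4), positions 1..7
    overall = 0
    for b in code:
        overall ^= b
    return code + [overall]               # extra bit makes overall parity even

def ham_decode(c):
    """Return (data, status): status is 'ok', 'corrected', or 'double'."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]        # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]        # positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]        # positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3       # 1-based position of a single error
    overall = 0
    for b in c:
        overall ^= b
    c = c[:]
    if syndrome == 0 and overall == 0:
        status = "ok"
    elif overall == 1:                    # odd overall parity: single error
        if syndrome:
            c[syndrome - 1] ^= 1          # flip the erroneous bit
        status = "corrected"
    else:                                 # even parity but nonzero syndrome
        status = "double"                 # uncorrectable: request retransmit
    return [c[2], c[4], c[5], c[6]], status

cw = ham_encode([1, 0, 1, 1])
bad = cw[:]
bad[2] ^= 1                               # inject a single-bit error
print(ham_decode(bad))                    # → ([1, 0, 1, 1], 'corrected')
```

In a product code, words like these form the rows and columns of a block, so the row and column syndromes together locate multiple errors per flit.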
Research on the automated extraction of essential data from electrocardiography (ECG) recordings has been a significant topic for a long time. The main focus of digital signal processing here is to locate the fiducial points that determine the beginning and end of the P, QRS, and T waves based on their waveform properties. Unavoidable noise during ECG data collection and inherent physiological differences among individuals make it challenging to identify these reference points accurately, resulting in suboptimal performance. This is done through several primary stages that rely on preliminary processing of the ECG electrical signal through a set of steps (preparing the raw data and converting it into files tha…
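The fiducial-point idea can be illustrated with a minimal R-peak detector in the spirit of the classic Pan–Tompkins pipeline: differentiate, square, moving-average, then threshold. The synthetic signal, sampling rate, and thresholds below are assumptions for illustration, not the paper's actual processing chain.

```python
import numpy as np

fs = 250                                   # hypothetical sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
beats = np.arange(0.5, 10, 1.0)            # one simulated beat per second
# Synthetic "ECG": narrow Gaussian spikes standing in for QRS complexes
ecg = sum(np.exp(-((t - b) ** 2) / (2 * 0.008 ** 2)) for b in beats)
ecg = ecg + 0.02 * rng.standard_normal(t.size)

# Differentiate, square, and moving-average to get a QRS "energy" signal
deriv = np.diff(ecg, prepend=ecg[0])
win = int(0.15 * fs)
energy = np.convolve(deriv ** 2, np.ones(win) / win, mode="same")

# Threshold the energy, then take the ECG maximum inside each region
above = energy > 0.3 * energy.max()
starts = np.flatnonzero(np.diff(above.astype(int)) == 1) + 1
ends = np.flatnonzero(np.diff(above.astype(int)) == -1) + 1
r_peaks = [s + int(np.argmax(ecg[s:e])) for s, e in zip(starts, ends)]
print(len(r_peaks))
```

Once the R peaks are fixed, the onsets and offsets of the P, QRS, and T waves are typically searched for in windows positioned relative to each R peak.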