A novel median filter based on the crow optimization algorithm (OMF) is suggested to reduce random salt-and-pepper noise and improve the quality of RGB color and gray images. The fundamental idea of the approach is that the crow optimization algorithm first detects noisy pixels and then replaces them with an optimum median value chosen by maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity (SSIM), absolute error, and mean square error are used to test the performance of the suggested filters (the original and the improved median filter) in removing noise from images. The simulation is carried out in MATLAB R2019b, and the results show that the improved median filter with the crow optimization algorithm is more effective than the original median filter algorithm and some recent methods; they show that the suggested process is robust in reducing error and removing noise thanks to the candidate median values. The results show a mean square error of at most 1.38, an absolute error of at most 0.22, a structural similarity (SSIM) of 0.9856, and a PSNR above 46 dB. Thus, the percentage of improvement in this work is 25%.
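The crow-optimized filter itself is not reproduced in this excerpt, but a minimal sketch of the baseline pipeline it improves on — salt-and-pepper corruption, a plain median filter, and the PSNR/MSE metrics quoted above — might look as follows. The function names, 3×3 window size, and stand-in image are illustrative assumptions, not the paper's implementation:

```python
import numpy as np
from scipy.ndimage import median_filter

def add_salt_pepper(img, density=0.1, seed=0):
    """Corrupt a grayscale image (values in [0, 255]) with salt-and-pepper noise."""
    rng = np.random.default_rng(seed)
    noisy = img.copy()
    mask = rng.random(img.shape)
    noisy[mask < density / 2] = 0        # pepper
    noisy[mask > 1 - density / 2] = 255  # salt
    return noisy

def psnr(reference, restored, peak=255.0):
    """Peak signal-to-noise ratio in dB, computed from the mean square error."""
    mse = np.mean((reference.astype(float) - restored.astype(float)) ** 2)
    return 10 * np.log10(peak ** 2 / mse)

# Baseline: filter every pixel with a 3x3 median window. The paper's OMF
# instead detects the noisy pixels first and replaces only those, which is
# why it preserves more detail than this blanket filtering.
img = np.random.default_rng(1).uniform(0, 255, (64, 64))  # stand-in image
noisy = add_salt_pepper(img, density=0.1)
restored = median_filter(noisy, size=3)
print(f"PSNR noisy:    {psnr(img, noisy):.2f} dB")
print(f"PSNR restored: {psnr(img, restored):.2f} dB")
```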
JPEG is the most popular image compression and encoding technique and is widely used in many applications (images, videos, and 3D animations). Researchers are therefore very interested in developing this widely deployed technique further, to compress images at higher compression ratios while keeping image quality as high as possible. For this reason, this paper introduces an improved JPEG scheme based on a fast DCT that removes most of the zeros from a transformed block while keeping their positions. Additionally, arithmetic coding is applied rather than Huffman coding. The results show that the proposed JPEG algorithm yields better image quality than the traditional JPEG technique.
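A minimal sketch of the block-transform stage this abstract describes — an 8×8 DCT, quantization, then storing only the nonzero coefficients together with their positions — could look like the following. The flat quantizer value and the random test block are assumptions for illustration; the paper's fast DCT and arithmetic coder are not reproduced:

```python
import numpy as np
from scipy.fftpack import dct, idct

def dct2(block):
    """2-D type-II DCT of an 8x8 block, as used in JPEG."""
    return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

def idct2(coeffs):
    return idct(idct(coeffs, axis=0, norm='ortho'), axis=1, norm='ortho')

# Transform one block, quantize, then keep only the nonzero coefficients
# together with their positions -- the same idea as "removing most of the
# zeros but keeping their positions" in the transformed block.
rng = np.random.default_rng(0)
block = rng.uniform(0, 255, (8, 8)) - 128        # level-shifted pixels
q = 16                                           # illustrative flat quantizer
quantized = np.round(dct2(block) / q).astype(int)

positions = np.argwhere(quantized != 0)          # (row, col) of survivors
values = quantized[quantized != 0]               # nonzero coefficients only
print(f"{values.size} of 64 coefficients kept")

# Decoder side: rebuild the sparse block from (positions, values).
rebuilt = np.zeros((8, 8), dtype=int)
rebuilt[positions[:, 0], positions[:, 1]] = values
restored = idct2(rebuilt * q) + 128
```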
The Internet provides vital communication between millions of individuals. It is also increasingly used as a commerce tool; thus, security is of high importance for securing communications and protecting vital information. Cryptographic algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard, and this is the main reason an improved structure for the Data Encryption Standard algorithm is needed. This paper proposes a new, improved structure for the Data Encryption Standard to make it secure and immune to attacks. The improved structure was accomplished using the standard Data Encryption Standard with a new way of two-key generation.
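The paper's two-key generation scheme is not detailed in this excerpt; as background, the following is a minimal sketch of the generic 16-round Feistel structure that DES is built on and into which any modified key schedule plugs. The round function and subkeys here are toy placeholders, not the real DES f-function or key schedule:

```python
def feistel_encrypt(block64, round_keys, f):
    """Generic 16-round Feistel network of the kind DES is built on.

    block64: 64-bit integer plaintext block.
    round_keys: list of 16 subkeys (a modified key schedule, such as the
                two-key derivation the paper proposes, would supply these).
    f: round function mixing a 32-bit half-block with a subkey.
    """
    left = (block64 >> 32) & 0xFFFFFFFF
    right = block64 & 0xFFFFFFFF
    for k in round_keys:
        left, right = right, left ^ f(right, k)   # classic Feistel swap
    # Recombine without a final swap, as DES does before its inverse permutation.
    return (right << 32) | left

# Toy round function and subkeys -- placeholders for illustration only.
toy_f = lambda half, key: ((half * 0x9E3779B9) ^ key) & 0xFFFFFFFF
toy_keys = [(i * 0x0F0F0F0F + 1) & 0xFFFFFFFF for i in range(16)]
print(hex(feistel_encrypt(0x0123456789ABCDEF, toy_keys, toy_f)))
```

Decryption in a Feistel network uses the same routine with the round keys in reverse order, which is why strengthening the key schedule leaves the rest of the cipher structure untouched.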
The research aims to identify the importance of using time-driven activity-based costing (TDABC) and its role in determining the cost of products more equitably, and thus its impact on resource-allocation policy, by reflecting the changes that occur on an ongoing basis in product specifications and hence in the nature and type of operations. The research was conducted at the General Company for Textile Industries in Wasit / the knitted socks factory, and was based on the main hypothesis that it is possible to calculate the cost of the activities that drive production through the time these activities take to run, after which the cost can be redistributed to the products.
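The core TDABC computation the hypothesis refers to — a capacity cost rate per time unit multiplied by the unit times of each activity — can be illustrated with a short worked example. All figures below are invented for illustration and are not the factory's data:

```python
# Time-driven ABC in its basic form:
#   capacity cost rate = cost of supplied resources / practical capacity
#   activity cost      = rate * time consumed by the activity
total_resource_cost = 60_000        # monthly cost of a department (hypothetical)
practical_capacity_min = 48_000     # practical capacity in minutes (hypothetical)
rate = total_resource_cost / practical_capacity_min   # cost per minute

# Unit times per activity for one batch of socks (hypothetical).
activity_minutes = {"knitting": 30, "dyeing": 12, "packing": 5}

batch_cost = sum(rate * t for t in activity_minutes.values())
print(f"rate = {rate:.3f} per minute, batch cost = {batch_cost:.2f}")
```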
This paper proposes improving the structure of a neural controller based on an identification model for nonlinear systems. The goal of this work is to embed the Modified Elman Neural Network (MENN) model into the NARMA-L2 structure, instead of the Multi-Layer Perceptron (MLP) model, in order to construct a new hybrid neural structure that can be used both as an identification model and as a nonlinear controller for SISO linear or nonlinear systems. Two learning algorithms are used to adjust the weight parameters of the hybrid neural structure in its serial-parallel configuration: the first is a supervised learning algorithm based on the Back-Propagation Algorithm (BPA), and the second is an intelligent algorithm.
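For readers unfamiliar with the Elman architecture the MENN modifies, the following is a minimal sketch of an Elman forward pass: a context layer stores the previous hidden state and feeds it back at the next step, and the modified variant adds a self-feedback gain on the context units. Layer sizes, the gain value, and the class name are assumptions; this does not reproduce the paper's hybrid NARMA-L2 structure:

```python
import numpy as np

class ElmanNetwork:
    """Minimal Elman recurrent network. The 'modified' Elman variant
    (MENN) is commonly described as adding a self-feedback factor alpha
    on the context units, sketched here as an assumption."""

    def __init__(self, n_in, n_hidden, n_out, alpha=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.w_in = rng.normal(0, 0.1, (n_hidden, n_in))
        self.w_ctx = rng.normal(0, 0.1, (n_hidden, n_hidden))
        self.w_out = rng.normal(0, 0.1, (n_out, n_hidden))
        self.alpha = alpha                    # context self-feedback gain
        self.context = np.zeros(n_hidden)

    def step(self, x):
        h = np.tanh(self.w_in @ x + self.w_ctx @ self.context)
        self.context = h + self.alpha * self.context   # modified context update
        return self.w_out @ h                          # linear output layer

net = ElmanNetwork(n_in=2, n_hidden=6, n_out=1)
for t in range(5):
    y = net.step(np.array([np.sin(t), 1.0]))
print(y)
```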
Gaslift reactors are employed in several bio-applications owing to their cost-effectiveness and high efficiency. However, nutrient and thermal gradients are among the obstacles that stand in the way of their widespread use in biological applications. The diagnosis, analysis, and tracking of fluid paths in an external-draft-tube gaslift bioreactor are the main topics of the current study. Several parameters were considered to assess the mixing efficiency, such as the downcomer-to-riser diameter ratio (Ded/Dr), the ratio of the diffuser position to the bioreactor height (Pd/Lr), and the gas bubble size (Db). The multiple regression of liquid velocity indicates the optimal settings: a Ded/Dr of 0.5, a Pd/Lr of 0.02, and Db
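The multiple regression mentioned above fits liquid velocity as a linear function of the three factors. A minimal least-squares sketch is shown below; the design points and velocity readings are invented placeholders, not the study's measurements:

```python
import numpy as np

# Hypothetical design matrix: columns are Ded/Dr, Pd/Lr, and Db (mm),
# and y is the measured liquid velocity (all values invented).
X = np.array([
    [0.4, 0.02, 3.0],
    [0.5, 0.02, 4.0],
    [0.5, 0.05, 3.5],
    [0.6, 0.10, 5.0],
    [0.4, 0.05, 4.5],
])
y = np.array([0.31, 0.38, 0.35, 0.29, 0.30])   # liquid velocity, m/s (invented)

# Fit v = b0 + b1*(Ded/Dr) + b2*(Pd/Lr) + b3*Db by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and coefficients:", coef)
```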
The research aims to demonstrate the impact of TDABC (a strategic technique compatible with the rapid developments and changes in the contemporary business environment) on pricing decisions. TDABC provides a new philosophy for allocating indirect costs by tracing resources and activities to the cost object through time and by identifying unused capacity and its associated costs, which supplies the management of economic units with the financial and non-financial information they need in the complex and risky process of making pricing decisions. To achieve better pricing decisions, in light of the endeavor to retain customers in a highly competitive environment with a variety of alternatives, the research
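The pricing-relevant step here, distinct from the basic cost assignment, is making unused capacity visible so its cost does not inflate quoted prices. A short sketch under invented figures:

```python
# TDABC makes unused capacity explicit: capacity supplied minus capacity used.
rate = 1.25                         # cost per minute (hypothetical)
capacity_supplied_min = 48_000      # practical capacity (hypothetical)
capacity_used_min = 41_500          # sum of activity times actually driven

unused_cost = rate * (capacity_supplied_min - capacity_used_min)
print(f"cost of unused capacity: {unused_cost:.2f}")
# A cost-plus price floor would then be built on the used-capacity cost only,
# so idle-capacity cost does not distort the quoted price.
```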
This study investigated the optimization of the wear behavior of AISI 4340 steel based on the Taguchi method under various testing conditions. A neural network and the Taguchi design method were implemented to minimize the wear rate of the 4340 steel. A back-propagation neural network (BPNN) was developed to predict the wear rate. In developing the predictive model, wear parameters such as sliding speed, applied load, and sliding distance were taken as the input variables for the AISI 4340 steel. An analysis of variance (ANOVA) was used to determine the parameters that significantly affect the wear rate. Finally, the Taguchi approach was applied to determine the optimal combination of the wear parameters.
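Since wear rate is a "smaller-the-better" response, the Taguchi step typically ranks trial runs by the corresponding signal-to-noise ratio. A minimal sketch of that calculation follows; the trial labels and wear-rate replicates are invented for illustration:

```python
import numpy as np

def sn_smaller_is_better(responses):
    """Taguchi S/N ratio for a 'smaller-the-better' response such as
    wear rate: S/N = -10 * log10(mean(y^2))."""
    y = np.asarray(responses, dtype=float)
    return -10 * np.log10(np.mean(y ** 2))

# Hypothetical wear-rate replicates (mm^3/m) for three Taguchi trial runs.
trials = {
    "run 1 (low speed, low load)":   [0.012, 0.014],
    "run 2 (mid speed, mid load)":   [0.020, 0.019],
    "run 3 (high speed, high load)": [0.031, 0.033],
}
for name, reps in trials.items():
    print(f"{name}: S/N = {sn_smaller_is_better(reps):.2f} dB")
# The factor-level combination with the highest S/N ratio is taken as optimal.
```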
The purpose of this paper is to solve the stochastic-demand unbalanced transportation problem using heuristic algorithms to obtain the optimum solution, by minimizing the costs of transporting the gasoline product for the Oil Products Distribution Company of the Iraqi Ministry of Oil. The most important conclusion reached is that the results prove that the random transportation problem with uncertain demand can be solved by a stochastic programming model. The most obvious finding to emerge from this work is that the genetic algorithm was able to address the unbalanced transportation problem, along with the possibility of applying the approved model at the oil products distribution company of the Iraqi Ministry of Oil to minimize transportation costs.
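A standard way to handle the unbalanced case is to add a dummy destination with zero cost that absorbs the surplus supply, after which the balanced problem can be solved exactly. The deterministic sketch below illustrates that balancing step with linear programming on an invented instance; it does not reproduce the paper's genetic algorithm or its stochastic-demand model:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical unbalanced instance: total supply exceeds total demand,
# so a dummy destination with zero shipping cost absorbs the surplus.
cost = np.array([[4.0, 6.0, 9.0],
                 [5.0, 3.0, 8.0]])
supply = np.array([120.0, 80.0])
demand = np.array([60.0, 70.0, 40.0])

surplus = supply.sum() - demand.sum()
cost = np.hstack([cost, np.zeros((2, 1))])   # dummy destination column
demand = np.append(demand, surplus)

m, n = cost.shape
# Equality constraints: each source ships its full supply and each
# destination receives its full demand; variables are the m*n shipments.
A_eq, b_eq = [], []
for i in range(m):                           # supply rows
    row = np.zeros(m * n); row[i * n:(i + 1) * n] = 1
    A_eq.append(row); b_eq.append(supply[i])
for j in range(n):                           # demand columns
    col = np.zeros(m * n); col[j::n] = 1
    A_eq.append(col); b_eq.append(demand[j])

res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, None))
print(res.x.reshape(m, n), res.fun)
```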
Among the variety of approaches introduced in the literature to establish duality theory, Fenchel duality has been of great importance in convex analysis and optimization. In this paper we establish conditions under which classical strong Fenchel duality holds for evenly convex optimization problems defined in infinite-dimensional spaces. The objective function of the primal problem is a family of (possibly infinitely many) evenly convex functions. The strong duality conditions we present are based on the epigraphs of the c-conjugates of the dual objective functions and the ε-c-subdifferential of the primal objective functions.
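As background, the classical Fenchel duality scheme that the c-conjugate framework generalizes can be stated as follows (the paper's evenly convex setting replaces the usual conjugate with the c-conjugate):

```latex
% Classical Fenchel duality; the c-conjugate framework in the paper
% generalizes this to evenly convex functions.
\inf_{x \in X}\, \{ f(x) + g(x) \}
\;\geq\;
\sup_{x^{*} \in X^{*}}\, \{ -f^{*}(x^{*}) - g^{*}(-x^{*}) \},
\qquad
f^{*}(x^{*}) := \sup_{x \in X}\, \{ \langle x^{*}, x \rangle - f(x) \}.
```

Weak duality (the inequality) always holds; strong duality, meaning equality with attainment in the dual, requires regularity conditions, and it is conditions of this kind that the paper derives via epigraphs of c-conjugates and the ε-c-subdifferential.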