Wireless Body Area Networks (WBANs) improve real-time patient health monitoring in hospitals, care facilities, and especially at home. WBANs have grown in popularity in recent years owing to their critical role in a vast range of medical applications. Because of the sensitive nature of the patient information transmitted through a WBAN, security is of paramount importance: a high level of security is required to guarantee the safe movement of data between sensor nodes and across WBAN networks. This research introduces a novel technique named the Integrated Grasshopper Optimization Algorithm with Artificial Neural Network (IGO-ANN) for distinguishing trusted nodes in WBAN networks by means of a classification approach, thereby strengthening the security of such networks. Feature extraction is performed using Linear Regression-Based Principal Component Analysis (LR-PCA). Test results demonstrate that the proposed IGO-ANN method outperforms several existing methods in accuracy, end-to-end delay, and packet delivery ratio for trusted WBAN node classification.
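As a point of reference, here is a minimal sketch of a plain PCA-then-neural-network classification pipeline of the kind described above, assuming scikit-learn is available. The paper's LR-PCA variant and grasshopper-based (IGO) tuning are not reproduced; the synthetic node features and labels are illustrative assumptions.

```python
# Minimal sketch: PCA feature extraction feeding an ANN classifier for
# trusted-node classification. LR-PCA and IGO tuning are NOT implemented;
# data, dimensions, and hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))        # synthetic per-node traffic features
y = rng.integers(0, 2, size=500)      # 1 = trusted node, 0 = untrusted

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(PCA(n_components=5),                       # extraction
                      MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000))
model.fit(X_tr, y_tr)
print("accuracy:", model.score(X_te, y_te))
```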
This study presents a modification of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) update (H-version) based on the determinant property of the inverse Hessian matrix (the matrix of second derivatives of the objective function). The vector s (the difference between the next solution and the current solution) is updated so that the determinant of the next inverse Hessian matrix equals the determinant of the current inverse Hessian matrix at every iteration. Consequently, the sequence of inverse Hessian matrices generated by the method never approaches a near-singular matrix, so the program never breaks down before the minimum value of the objective function is obtained. Moreover, the new modification of the BFGS update (H-version) ...
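As background, a hedged sketch of the classical determinant identity this construction rests on, in standard BFGS notation; the identity itself is textbook material, while the precise rescaling of s is the paper's contribution and is only indicated schematically here.

```latex
% Classical BFGS inverse (H-version) update, with
% s_k = x_{k+1} - x_k,  y_k = g_{k+1} - g_k,  \rho_k = 1/(y_k^\top s_k):
H_{k+1} = \left(I - \rho_k s_k y_k^{\top}\right) H_k
          \left(I - \rho_k y_k s_k^{\top}\right) + \rho_k s_k s_k^{\top},
\qquad
\det(H_{k+1}) = \det(H_k)\,\frac{s_k^{\top} H_k^{-1} s_k}{y_k^{\top} s_k}.
% Rescaling s_k so that s_k^\top H_k^{-1} s_k = y_k^\top s_k therefore
% forces det(H_{k+1}) = det(H_k) at every iteration.
```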
Albizia lebbeck biomass was used as an adsorbent in the present study to remove methyl red dye from aqueous solution. A central composite rotatable design model was used to predict the dye removal efficiency. The optimization was carried out under a temperature and mixing control system (37 °C) with particle sizes of 300 and 600 µm. The highest adsorption efficiencies were obtained at lower dye concentrations and lower adsorbent weights. Adsorption times longer than 48 h were found to have a negative effect on the removal efficiency due to secondary metabolite compounds. However, the adsorption time was found to have a positive effect at high dye concentrations and high adsorbent weights. The colour removal efficiency ...
Today, urban stormwater management is one of the main concerns of municipalities and stakeholders. Drought and water scarcity have made rainwater harvesting one of the main steps toward climate-change adaptation. Because of the deteriorating quality of urban runoff and the increase in impermeable urban land use, treatment of urban runoff is essential. Best Management Practice (BMP) and Low Impact Development (LID) approaches are necessary to combat the consequences of climate change by improving the quantity and quality of water resources. The application of bioswales along urban streets and roadways can reduce the stress on water resources, recharge groundwater, and prevent groundwater pollution. While Sulaymaniyah City has a ...
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has led to an unprecedented increase in the amount of data that business organizations deal with. The amount of data available through the internet is a problem that many parties seek to solve: why is it there in such huge, unstructured quantities? Forecasts suggested that by 2017 the number of devices connected to the internet would be roughly three times the population of the Earth, and in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, so-called data mining emerged as a ...
JPEG is the most popular image compression and encoding technique and is widely used in many applications (images, videos, and 3D animation). Researchers are therefore keen to develop this widely deployed technique further, compressing images at higher compression ratios while preserving image quality as much as possible. To this end, this paper introduces a developed JPEG based on a fast DCT that removes most of the zeros and keeps their positions in the transformed block. Additionally, arithmetic coding is applied rather than Huffman coding. The results show that the proposed developed JPEG algorithm yields better image quality than traditional JPEG techniques.
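To make the zero-removal idea concrete, the following is a hedged sketch: DCT-transform an 8 x 8 block, quantize it, and store only the nonzero coefficients together with their positions. The entropy-coding stage (arithmetic coding in the paper) is omitted, and the uniform quantization step is an assumption, not the paper's tables.

```python
# Hedged sketch of zero removal with position bookkeeping; NOT the paper's
# full codec. Quantization step and test block are illustrative assumptions.
import numpy as np
from scipy.fft import dctn, idctn

block = np.random.default_rng(1).integers(0, 256, (8, 8)).astype(float)
coeffs = dctn(block - 128.0, norm="ortho")   # 2-D DCT of the centered block
q = np.round(coeffs / 16.0)                  # crude uniform quantizer

# keep only nonzero coefficients and remember where they were
pos = np.argwhere(q != 0)
vals = q[q != 0]
print(f"kept {len(vals)} of 64 coefficients")

# decoder side: rebuild the sparse block and invert the transform
rec = np.zeros((8, 8))
rec[tuple(pos.T)] = vals
approx = idctn(rec * 16.0, norm="ortho") + 128.0
```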
The Internet provides vital communications between millions of individuals. It is also increasingly used as a commerce tool; thus, security is of high importance for securing communications and protecting vital information. Cryptography algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard, which is the main reason an improved structure of the Data Encryption Standard algorithm is needed. This paper proposes a new, improved structure for the Data Encryption Standard to make it secure and immune to attacks. The improved structure was accomplished using the standard Data Encryption Standard with a new way of two-key generation ...
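For background only, and not the paper's modified scheme: standard single-key DES as exposed by the pycryptodome library, establishing the baseline the improved structure builds on. The key and plaintext are illustrative, and ECB mode is used purely for brevity.

```python
# Baseline DES only; the paper's improved structure and two-key generation
# are NOT reproduced here. Key/plaintext are illustrative; ECB is shown for
# brevity and is not recommended in practice.
from Crypto.Cipher import DES
from Crypto.Util.Padding import pad, unpad

key = b"8bytekey"                          # DES uses an 8-byte (64-bit) key
cipher = DES.new(key, DES.MODE_ECB)
ct = cipher.encrypt(pad(b"patient record", DES.block_size))
pt = unpad(DES.new(key, DES.MODE_ECB).decrypt(ct), DES.block_size)
print(ct.hex(), pt)
```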
Association rule mining (ARM) is a fundamental and widely used data mining technique for obtaining useful information from data. Traditional ARM algorithms suffer degraded computational efficiency because they mine too many association rules, most of which are not appropriate for a given user. Recent research in ARM is investigating metaheuristic algorithms that search for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rule mining (DCS-ARM) is proposed for this purpose. The effectiveness of the algorithm is tested against a set of well-known transactional databases. The results indicate that the proposed algorithm outperforms existing metaheuristic methods.
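In the same spirit, here is a hedged sketch of metaheuristic rule mining: each candidate rule is a vector over the items (0 = absent, 1 = antecedent, 2 = consequent), and new candidates come from random component changes standing in for the discrete Levy flights of cuckoo search. The fitness weights, toy data, and mutation operator are illustrative assumptions, not the paper's DCS-ARM operators.

```python
# Hedged sketch of searching for high-quality rules with a nest population;
# NOT the paper's DCS-ARM. Weights, data, and operators are assumptions.
import random

transactions = [{"a", "b", "c"}, {"a", "c"}, {"b", "c", "d"}, {"a", "b", "c"}]
items = sorted(set().union(*transactions))

def fitness(rule):
    ante = {i for i, r in zip(items, rule) if r == 1}
    cons = {i for i, r in zip(items, rule) if r == 2}
    if not ante or not cons:
        return 0.0
    n_a = sum(ante <= t for t in transactions)          # antecedent count
    n_ac = sum(ante | cons <= t for t in transactions)  # full-rule count
    support = n_ac / len(transactions)
    confidence = n_ac / n_a if n_a else 0.0
    return 0.5 * support + 0.5 * confidence             # assumed weighting

nests = [[random.choice([0, 1, 2]) for _ in items] for _ in range(10)]
for _ in range(200):                                    # main search loop
    i = random.randrange(len(nests))
    new = nests[i][:]
    new[random.randrange(len(items))] = random.choice([0, 1, 2])  # "flight"
    if fitness(new) > fitness(nests[i]):                # greedy replacement
        nests[i] = new
best = max(nests, key=fitness)
print(best, fitness(best))
```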
The performance of most heuristic search methods depends on parameter choices. These parameter settings govern how new candidate solutions are generated and applied by the algorithm, and they play a key role in determining both the quality of the solution obtained and the efficiency of the search. Techniques for fine-tuning them remain an ongoing research area. The Differential Evolution (DE) algorithm is a very powerful optimization method that has become popular in many fields. Based on the prolonged research work on DE, it is now arguably one of the most outstanding stochastic optimization algorithms for real-parameter optimization. One reason for its popularity is its widely appreciated property of having only a small number of parameters ...
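For reference, a hedged sketch of the classic DE/rand/1/bin scheme the passage refers to, showing its two main control parameters F (mutation scale factor) and CR (crossover rate); the sphere objective and parameter values are illustrative.

```python
# Classic DE/rand/1/bin sketch; objective, bounds, and settings are
# illustrative assumptions, not tied to any particular study.
import numpy as np

def de(fobj, bounds, pop_size=20, F=0.8, CR=0.9, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([fobj(x) for x in pop])
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)   # mutation
            cross = rng.random(dim) < CR                # binomial crossover
            cross[rng.integers(dim)] = True             # keep >= 1 component
            trial = np.where(cross, mutant, pop[i])
            f = fobj(trial)
            if f < fit[i]:                              # greedy selection
                pop[i], fit[i] = trial, f
    return pop[fit.argmin()], fit.min()

best, val = de(lambda x: (x ** 2).sum(), [(-5, 5)] * 3)
print(best, val)
```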
For businesses that provide delivery services, the punctuality of the delivery process is very important. Besides increasing customer trust, efficient route management and selection are required to reduce vehicle fuel costs and expedite delivery. Some small and medium businesses still use conventional methods to manage delivery routes: decisions about delivery schedules and routes are made without any specific method to expedite the delivery settlement process. This process is inefficient, takes a long time, increases costs, and is prone to errors. Therefore, Dijkstra's algorithm has been used to improve the delivery management process. A delivery management system was developed to help managers and drivers ...
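As an illustration of the routing core, here is a minimal sketch of Dijkstra's algorithm over a small road graph; the node names and edge weights (travel times) are illustrative assumptions, not the system's data.

```python
# Minimal Dijkstra sketch: shortest travel time from a depot to every stop.
import heapq

def dijkstra(graph, start):
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                         # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

roads = {"depot": [("A", 4), ("B", 1)],      # illustrative road graph
         "B": [("A", 2), ("C", 5)],
         "A": [("C", 1)]}
print(dijkstra(roads, "depot"))
```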