This paper proposes a three-stage learning algorithm for deep multilayer perceptrons (DMLPs) with effective weight initialisation based on a sparse auto-encoder, aiming to overcome the difficulty of training deep neural networks with limited training data in a high-dimensional feature space. At the first stage, unsupervised learning with a sparse auto-encoder provides the initial weights of the feature-extraction layers of the DMLP. At the second stage, error back-propagation trains the DMLP while the feature-extraction weights obtained at the first stage are held fixed. At the third stage, all weights of the DMLP obtained at the second stage are refined by error back-propagation. Network structures and learning-parameter values are determined through cross-validation, and test datasets unseen during cross-validation are used to evaluate the performance of the DMLP trained with the three-stage algorithm. Experimental results show that the proposed method is effective in combating overfitting when training deep neural networks.
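A minimal PyTorch sketch of the three-stage procedure described above, run on random placeholder data; the layer sizes, L1 sparsity weight, learning rates, and epoch counts are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn as nn

# Illustrative dimensions and placeholder data (assumptions, not from the paper).
in_dim, hid_dim, n_classes = 784, 128, 10
X = torch.randn(512, in_dim)
y = torch.randint(0, n_classes, (512,))

# Stage 1: unsupervised pre-training with a sparse auto-encoder.
encoder = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.Sigmoid())
decoder = nn.Linear(hid_dim, in_dim)
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
for _ in range(100):
    h = encoder(X)
    # Reconstruction loss plus an L1 penalty on activations to encourage sparsity.
    loss = nn.functional.mse_loss(decoder(h), X) + 1e-3 * h.abs().mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Stage 2: train the classifier head with the pre-trained feature layers frozen.
head = nn.Linear(hid_dim, n_classes)
dmlp = nn.Sequential(encoder, head)
for p in encoder.parameters():
    p.requires_grad = False
opt = torch.optim.Adam(head.parameters(), lr=1e-3)
for _ in range(100):
    loss = nn.functional.cross_entropy(dmlp(X), y)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Stage 3: unfreeze everything and refine all weights end to end.
for p in encoder.parameters():
    p.requires_grad = True
opt = torch.optim.Adam(dmlp.parameters(), lr=1e-4)
for _ in range(100):
    loss = nn.functional.cross_entropy(dmlp(X), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```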
CR-39 is a solid-state nuclear track detector (SSNTD) that has been used in many research areas. Although CR-39 detectors are often assumed to be insensitive to beta and gamma rays, irradiation with these rays can significantly affect the detector's properties. In this study, beta- and gamma-ray mass attenuation coefficients μ/ρ (cm² g⁻¹) of the CR-39 detector were measured with a NaI(Tl) scintillation spectrometer in a standard geometrical arrangement, using beta rays in the energy range 0.546–2.274 MeV and standard gamma sources of energy 0.356, 0.5697, 0.6617, and 1.063 MeV. The total atomic cross-section (σtot), total electronic cross-section (σTE), and effective atomic number (Zeff) for gamma rays …
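For context, these quantities are conventionally derived from a narrow-beam transmission measurement via the standard relations sketched below; the abstract itself does not spell them out, so this is the usual textbook form rather than the paper's own notation.

```latex
% Beer–Lambert attenuation through a sample of thickness t and density \rho:
I = I_0\, e^{-(\mu/\rho)\,\rho t}
\quad\Longrightarrow\quad
\mu/\rho = \frac{1}{\rho t}\,\ln\frac{I_0}{I}

% For a compound with n_i atoms of element i (atomic weight A_i, atomic
% number Z_i, fractional abundance f_i; N_A is the Avogadro constant):
\sigma_{\mathrm{tot}} = \frac{(\mu/\rho)\sum_i n_i A_i}{N_A \sum_i n_i},
\qquad
\sigma_{\mathrm{TE}} = \frac{1}{N_A}\sum_i \frac{f_i A_i}{Z_i}\,(\mu/\rho)_i,
\qquad
Z_{\mathrm{eff}} = \frac{\sigma_{\mathrm{tot}}}{\sigma_{\mathrm{TE}}}
```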
A factorial experiment (2 × 3) in a randomized complete block design (RCBD) with three replications was conducted to examine the effect of the honeycomb selection method, using three interplant distances, on the vegetative growth, flowering, and fruit set of two bean cultivars, Bronco and Strike. The interplant distances were 75 × 65 cm, 90 × 78 cm, and 105 × 91 cm (row × plant), representing short (high plant density), intermediate (intermediate plant density), and wide (low plant density) spacings, respectively. The selection parameters were the number of days from planting to the initiation of the first flower, the number of nodes formed prior to the onset of the first flower, and the number of main branches. Results showed significant superiority of the Strike …
The aim of this study is to develop a novel framework for managing risks in smart supply chains by enhancing business continuity and resilience against potential disruptions. The research addresses the growing uncertainty in supply-chain environments, driven by natural phenomena such as pandemics and earthquakes as well as human-induced events, including wars, political upheavals, and societal transformations. Recognizing that traditional risk-management approaches are insufficient in such dynamic contexts, the study proposes an adaptive framework that integrates proactive and remedial measures for effective risk mitigation. A fuzzy risk matrix is employed to assess and analyze uncertainties, facilitating the identification of disruptions …
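The excerpt does not give the paper's actual fuzzy rules, so the sketch below shows only one plausible shape of a fuzzy risk matrix: triangular memberships on a 0–10 likelihood/severity scale and a 3 × 3 Mamdani-style rule table, all assumed for illustration.

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b and support [a, c]."""
    left = (x - a) / (b - a) if b > a else 1.0
    right = (c - x) / (c - b) if c > b else 1.0
    return max(min(left, right), 0.0)

# Linguistic terms on a 0-10 scale (illustrative assumption).
TERMS = {"low": (0, 0, 5), "medium": (0, 5, 10), "high": (5, 10, 10)}

# Rule table: (likelihood term, severity term) -> crisp risk-level centre.
RULES = {("low", "low"): 1, ("low", "medium"): 3, ("low", "high"): 5,
         ("medium", "low"): 3, ("medium", "medium"): 5, ("medium", "high"): 7,
         ("high", "low"): 5, ("high", "medium"): 7, ("high", "high"): 9}

def fuzzy_risk(likelihood, severity):
    """Fire each rule with min() and defuzzify by the weighted
    average of rule-output centres (centre-of-sets)."""
    num = den = 0.0
    for (lt, st), centre in RULES.items():
        w = min(tri(likelihood, *TERMS[lt]), tri(severity, *TERMS[st]))
        num += w * centre
        den += w
    return num / den if den else 0.0

print(fuzzy_risk(8.0, 6.5))  # e.g. a likely, fairly severe disruption -> ~6.9
```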
A mathematical model is constructed to study the combined effects of concentration and thermodiffusion on the nanoparticles of a Jeffrey fluid, under a magnetic field, for wave-driven transport in a three-dimensional rectangular channel filled with a porous medium. The homotopy perturbation method (HPM) is used to solve the nonlinear, coupled partial differential equations. Numerical results were obtained for the temperature distribution, nanoparticle concentration, velocity, pressure rise, pressure gradient, friction force, and stream function. The graphs show that the fluid velocity rises with increasing mean volume flow rate and magnetic parameter, while it falls with increasing Darcy number and lateral-wall effects. Also, …
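For reference, the standard HPM construction that such studies typically apply is sketched below (He's homotopy); the specific linear operator L, nonlinear operator N, and source f for this model are not given in the excerpt, so the general form is shown instead.

```latex
% For an equation L(u) + N(u) = f(r), build a homotopy with embedding
% parameter p \in [0, 1] and initial guess u_0:
H(v, p) = (1 - p)\bigl[L(v) - L(u_0)\bigr]
        + p\bigl[L(v) + N(v) - f(r)\bigr] = 0

% Expand the solution in powers of p and recover it at p = 1:
v = v_0 + p\,v_1 + p^2 v_2 + \cdots, \qquad
u = \lim_{p \to 1} v = v_0 + v_1 + v_2 + \cdots
```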
Abstract: Microfluidic devices present unique advantages for the development of efficient drug assays and screening, and microfluidic platforms may offer a more rapid and cost-effective alternative to conventional assay formats. Fluids are confined in devices whose significant dimensions are on the micrometer scale. Owing to this extreme confinement, the volumes used for drug assays are tiny (microliters to femtoliters).
In this research, a microfluidic chip consisting of micro-channels engraved in an acrylic (polymethyl methacrylate, PMMA) substrate was designed and fabricated using a carbon dioxide (CO2) laser machine. The CO2 laser parameters influence the width, depth, and roughness of the chip's channels. In order to have regular …
The internet of medical things (IoMT) is expected to become one of the most widely distributed technologies worldwide. With 5th-generation (5G) transmission, market opportunities and hazards related to the IoMT can be both expanded and detected. This framework describes a strategy for proactively addressing concerns and offering a forum to promote development, change attitudes, and maintain people's confidence in the broader healthcare system without compromising security. It is combined with a data-offloading system to speed up the transmission of medical data and improve the quality of service (QoS). Building on this, we propose the enriched energy-efficient fuzzy (EEEF) data-offloading technique to enhance the delivery of data …
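The excerpt gives no details of the EEEF rules; as a stand-in, the sketch below reduces the offloading decision to a plain weighted score over three plausible inputs (residual energy, 5G link quality, data criticality). The inputs, weights, and threshold are all assumptions, and a real fuzzy system would replace the weighted sum with membership functions and a rule base.

```python
def offload_score(residual_energy, link_quality, criticality):
    """Weighted score in [0, 1]; higher means offload to the edge.
    All inputs are normalised to [0, 1]; the weights are illustrative."""
    w_energy, w_link, w_crit = 0.4, 0.3, 0.3
    return (w_energy * (1.0 - residual_energy)  # low battery favours offloading
            + w_link * link_quality             # a strong 5G link favours offloading
            + w_crit * criticality)             # vital data favours offloading

def should_offload(residual_energy, link_quality, criticality, threshold=0.5):
    """Hypothetical decision rule: offload when the score clears the threshold."""
    return offload_score(residual_energy, link_quality, criticality) >= threshold

print(should_offload(0.2, 0.8, 0.9))  # low battery, strong link, critical data -> True
```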
This work focuses on the significance of wireless body area networks (WBANs) as a cutting-edge, self-governing technology that has garnered substantial attention from researchers. The central challenge faced by WBANs is upholding quality of service (QoS) in rapidly evolving sectors such as healthcare, and the intricate task of managing diverse traffic types with limited resources further compounds this challenge. Particularly in medical WBANs, vital data must be prioritized to ensure prompt delivery of critical information. Given the stringent requirements of these systems, any data loss or delay is untenable, necessitating intelligent algorithms. These algorithms play a pivotal …
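A minimal sketch of the kind of traffic prioritization the abstract alludes to: critical vital-sign packets are dequeued before routine telemetry. The traffic classes and the strict-priority queue discipline are assumptions for illustration, not details from the paper.

```python
import heapq
import itertools

# Lower number = higher priority (illustrative classes, not from the paper).
PRIORITY = {"emergency": 0, "vital_signs": 1, "routine": 2}

class WBANScheduler:
    """Strict-priority packet queue: within a class, FIFO order is
    preserved via a monotonically increasing sequence number."""
    def __init__(self):
        self._heap = []
        self._seq = itertools.count()

    def enqueue(self, traffic_class, payload):
        heapq.heappush(self._heap, (PRIORITY[traffic_class], next(self._seq), payload))

    def dequeue(self):
        _, _, payload = heapq.heappop(self._heap)
        return payload

sched = WBANScheduler()
sched.enqueue("routine", "step count")
sched.enqueue("emergency", "arrhythmia alert")
sched.enqueue("vital_signs", "SpO2 reading")
print(sched.dequeue())  # -> "arrhythmia alert"
```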