Transmitting and receiving data consume the most energy in Wireless Sensor Networks (WSNs). The battery is the most important resource affecting a sensor node's lifespan, and therefore the network's. Because sensor nodes run on limited batteries, energy saving is essential. Data aggregation is a procedure for eliminating redundant transmissions; it delivers fused information to the base stations, which improves energy efficiency and extends the lifespan of energy-constrained WSNs. In this paper, a Perceptually Important Points Based Data Aggregation (PIP-DA) method for Wireless Sensor Networks is proposed to reduce redundant data before sending them to the sink. The efficiency of the proposed method was measured on the Intel Berkeley Research Lab (IBRL) dataset. The experimental findings illustrate the benefits of the proposed method: it reduces the overhead at the sensor-node level to as little as 1.25% of remaining data and cuts energy consumption by up to 93% compared with the prefix frequency filtering (PFF) and ATP protocols.
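The core of a PIP-based reducer is keeping the few samples that preserve the shape of a reading series. Below is a minimal sketch assuming the classical vertical-distance criterion; the function name, the sample data, and the distance measure are illustrative and may differ from the PIP-DA method itself.

```python
import numpy as np

def pip_reduce(series, n_points):
    """Select n_points perceptually important points (PIPs) from a 1-D series.

    Starts with the two endpoints, then repeatedly adds the sample with the
    largest vertical distance from the chord joining its two nearest already
    selected neighbours. Hypothetical sketch, not the paper's exact rule.
    """
    n = len(series)
    selected = [0, n - 1]                      # endpoints are always kept
    while len(selected) < n_points:
        best_idx, best_dist = None, -1.0
        pts = sorted(selected)
        for left, right in zip(pts, pts[1:]):  # scan each gap between PIPs
            for i in range(left + 1, right):
                # interpolated value on the chord between the two PIPs
                chord = series[left] + (series[right] - series[left]) * \
                        (i - left) / (right - left)
                d = abs(series[i] - chord)
                if d > best_dist:
                    best_idx, best_dist = i, d
        if best_idx is None:                   # series shorter than n_points
            break
        selected.append(best_idx)
    return sorted(selected)

readings = np.array([20.1, 20.1, 20.2, 23.8, 23.9, 20.3, 20.2, 20.2])
keep = pip_reduce(readings, 4)
print(keep, readings[keep])                    # only the kept points are sent
```

Only the selected points would be transmitted to the sink, which is where the reduction in redundant traffic comes from.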
The intensity distribution of the comet ISON C/2013 is studied through its histogram. This distribution reveals four distinct regions corresponding to the background, tail, coma, and nucleus. One-dimensional temperature-distribution fitting is achieved using two mathematical equations related to the coordinates of the comet's center. A quiver plot of the comet's intensity gradient clearly shows the arrows pointing toward the intensity maximum.
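As a rough illustration of the two image-analysis steps (histogram, then gradient quiver plot), here is a sketch on a synthetic stand-in image; the array size and the Gaussian blob are assumptions, not the ISON data.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic comet-like image: a single Gaussian blob stands in for the real data.
y, x = np.mgrid[0:128, 0:128]
image = np.exp(-((x - 64) ** 2 + (y - 64) ** 2) / (2 * 15 ** 2))

# Histogram of pixel intensities: distinct peaks/plateaus hint at the
# background, tail, coma and nucleus regions.
counts, bin_edges = np.histogram(image, bins=64)
print("modal intensity bin starts at:", bin_edges[counts.argmax()])

# Gradient field; in a quiver plot the arrows point toward increasing
# intensity, i.e. toward the nucleus.
gy, gx = np.gradient(image)
step = 8                                       # thin the field for readability
plt.quiver(x[::step, ::step], y[::step, ::step],
           gx[::step, ::step], gy[::step, ::step])
plt.title("Gradient field of the (synthetic) comet image")
plt.show()
```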
Reverse osmosis (RO) has already proved its worth as an efficient treatment method in chemical and environmental engineering applications. Various successful RO applications for the rejection of organic and highly toxic pollutants from wastewater can be found in the literature of the last decade. Dimethylphenol is classified as a highly toxic organic compound found ubiquitously in wastewater; it poses a real threat to humans and the environment even at low concentrations. In this paper, a model-based framework was developed for the simulation and optimisation of an RO process for the removal of dimethylphenol from wastewater. We incorporated our earlier developed and validated process model into the Species Conserving Genetic Algorithm (SCGA) …
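To make the optimisation step concrete, here is a sketch of a plain genetic algorithm tuning two operating variables against a placeholder rejection model. The SCGA adds species conservation on top of a loop like this; the objective function and parameter bounds below are invented for illustration only.

```python
import random

def ro_rejection(pressure_atm, flow_m3_h):
    # Placeholder surrogate, NOT the authors' validated process model:
    # rejection improves with pressure and degrades at high feed flow.
    return 1.0 - 1.0 / pressure_atm - 0.02 * flow_m3_h

BOUNDS = [(5.0, 15.0), (0.5, 5.0)]             # (pressure, feed flow) ranges

def random_individual():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def mutate(ind, rate=0.2):
    # Gaussian perturbation, clamped to the variable bounds.
    return [min(hi, max(lo, g + random.gauss(0, 0.3))) if random.random() < rate else g
            for g, (lo, hi) in zip(ind, BOUNDS)]

pop = [random_individual() for _ in range(30)]
for _ in range(50):                            # generations
    pop.sort(key=lambda ind: ro_rejection(*ind), reverse=True)
    parents = pop[:10]                         # truncation selection
    children = [mutate([random.choice(pair)    # uniform crossover per gene
                        for pair in zip(*random.sample(parents, 2))])
                for _ in range(20)]
    pop = parents + children

best = max(pop, key=lambda ind: ro_rejection(*ind))
print("best operating point:", best, "rejection: %.3f" % ro_rejection(*best))
```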
Early detection of brain tumors is critical for enhancing treatment options and extending patient survival. Magnetic resonance imaging (MRI) provides more detailed information, such as greater contrast and clarity, than any other scanning method. Manually segmenting brain tumors in the many MRI images collected in clinical practice is a tough and time-consuming task. Tumors in brain MRI scans can be detected using algorithms and machine-learning techniques, making the process easier for doctors, because an MRI image can appear healthy even when the patient has a tumor or a malignancy. Recently, deep-learning techniques based on deep convolutional neural networks have been used to analyze medical …
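For orientation, a minimal convolutional classifier over single-channel MRI slices might look like the sketch below; the architecture, the 128x128 input size, and the binary tumor/no-tumor output are assumptions for illustration, not the network used in the paper.

```python
import torch
import torch.nn as nn

class TumorCNN(nn.Module):
    """Tiny CNN for binary tumor classification of grayscale MRI slices."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                   # 128 -> 64
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                   # 64 -> 32
        )
        self.classifier = nn.Linear(32 * 32 * 32, 2)   # 2 classes

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = TumorCNN()
dummy_slice = torch.randn(1, 1, 128, 128)      # one grayscale MRI slice
print(model(dummy_slice).shape)                # torch.Size([1, 2])
```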
Nanostructured ZrO2:MgO thin films were synthesized by a radio-frequency magnetron plasma sputtering technique at different MgO percentages (0, 6, 8, and 10)% for use as gas sensors for nitrogen dioxide (NO2). The samples were investigated by X-ray diffraction (XRD), atomic force microscopy (AFM), scanning electron microscopy (SEM), and energy-dispersive X-ray spectroscopy (EDX), and the sensing properties were also investigated. The average particle size of all prepared samples was below 33.22 nm and the structure was a monoclinic phase. The grain-size distribution was below 36.3 nm, with uniform particles on the surface. Finally, the sensing data are discussed, where the …
The current research discusses formal data within a methodological framework, defining the research problem, limits, and objectives, and the most important terms mentioned in the research. The first section of the theoretical framework addressed the concept of the Bauhaus school, its philosophy, and its logical bases. The second section dealt with the school's most important elements and structural bases, which are considered its most important formal data, and their implications for fabric and costume design. The research came up with the most important indicators resulting from the theoretical framework.
Chapter three defined the …
Discriminant analysis is a technique used to distinguish and classify an individual into one of a number of groups based on a linear combination of a set of relevant variables, known as the discriminant function. In this research, discriminant analysis is used to analyze data from a repeated-measurements design. We deal with the problem of discrimination and classification in the case of two groups, assuming a compound-symmetry covariance structure under the assumption of normality for univariate repeated-measures data.
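To illustrate the setting, the sketch below fits a linear discriminant function to synthetic two-group data whose repeated measurements follow a compound-symmetry covariance. Note that sklearn estimates an unstructured covariance, whereas the research derives the classifier under the compound-symmetry structure itself, so this is only a stand-in.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
t = 4                                          # repeated measurements per subject
sigma2, rho = 1.0, 0.5

# Compound symmetry: equal variances, one common covariance between any
# two time points -> diag = sigma2, off-diag = sigma2 * rho.
cov = sigma2 * ((1 - rho) * np.eye(t) + rho)

group1 = rng.multivariate_normal(np.zeros(t), cov, size=50)
group2 = rng.multivariate_normal(np.full(t, 1.0), cov, size=50)
X = np.vstack([group1, group2])
y = np.array([0] * 50 + [1] * 50)

lda = LinearDiscriminantAnalysis().fit(X, y)
print("discriminant function coefficients:", lda.coef_)
print("training accuracy:", lda.score(X, y))
```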
Traumatic spinal cord injury is a serious neurological disorder. Patients experience a plethora of symptoms that can be attributed to the compromised nerve fiber tracts, including limb weakness, sensory impairment, and truncal instability, as well as a variety of autonomic abnormalities. This article discusses how machine-learning classification can be used to characterize the initial impairment and subsequent recovery of electromyography (EMG) signals in a non-human primate model of traumatic spinal cord injury. The ultimate objective is to identify potential treatments for traumatic spinal cord injury. This work focuses specifically on finding a suitable classifier that differentiates between two distinct experimental …
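A typical pipeline for this kind of comparison extracts features per EMG epoch and cross-validates a candidate classifier. The sketch below assumes RMS and zero-crossing features and an SVM on synthetic epochs; the actual features, classifier, and recordings in the study may differ.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)

def emg_features(epochs):
    rms = np.sqrt((epochs ** 2).mean(axis=1))                  # amplitude feature
    zc = (np.diff(np.sign(epochs), axis=1) != 0).sum(axis=1)   # frequency proxy
    return np.column_stack([rms, zc])

# Synthetic stand-ins for pre- and post-injury EMG epochs (200 samples each);
# the weaker post-injury amplitude mimics impaired muscle activity.
pre = rng.normal(0.0, 1.0, size=(60, 200))
post = rng.normal(0.0, 0.4, size=(60, 200))
X = emg_features(np.vstack([pre, post]))
y = np.array([0] * 60 + [1] * 60)

scores = cross_val_score(SVC(), X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```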
Permeability data are of major importance and must be handled carefully in all reservoir simulation studies. Their importance increases in mature oil and gas fields because of their sensitivity to the requirements of specific improved-recovery methods. However, the industry has a huge stock of air permeability measurements against a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of data loss and poorly consolidated formations, or in cases …
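For context, one classical gas-to-liquid conversion is the Klinkenberg correction, which extrapolates measured air permeability to infinite mean pressure. The sketch below shows that standard approach on made-up readings; it is a stand-in, not the correlation proposed by the study.

```python
import numpy as np

# Klinkenberg model: k_gas = k_liq * (1 + b / p_mean), so plotting k_gas
# against 1/p_mean gives a line whose intercept is the liquid-equivalent
# permeability. The measurements below are invented for illustration.
p_mean = np.array([1.0, 2.0, 4.0, 8.0])          # mean core pressures, atm
k_air = np.array([126.0, 113.0, 106.5, 103.2])   # measured air perms, mD

slope, intercept = np.polyfit(1.0 / p_mean, k_air, 1)
print("liquid-equivalent permeability ~ %.1f mD" % intercept)
print("Klinkenberg slip factor b ~ %.2f atm" % (slope / intercept))
```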
In recent years, data centre (DC) networks have greatly improved their data-exchange capabilities. Software-defined networking (SDN) is presented as an alternative to the conventional network picture by segregating the control plane from the data plane. SDN overcomes the limitations of traditional DC networks caused by the rapidly increasing numbers of applications, websites, data-storage needs, and so on. Software-defined networking data centres (SDN-DC), based on the OpenFlow (OF) protocol, are used to achieve superior behaviour in executing traffic load-balancing (LB) jobs. The LB function divides the traffic-flow demands among the end devices to avoid link congestion. In short, SDN is proposed to manage more operative configurations …
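To show the flavour of such an LB decision, here is a sketch of a bottleneck-aware path choice a controller could make for each new flow; the topology, the load numbers, and the least-congested-link rule are illustrative assumptions, not the scheme evaluated here.

```python
# Current link utilisation (0..1) as the controller sees it.
link_load = {
    ("s1", "s2"): 0.70,
    ("s1", "s3"): 0.20,
    ("s2", "s4"): 0.10,
    ("s3", "s4"): 0.30,
}

# Precomputed candidate paths from s1 to s4.
candidate_paths = [
    [("s1", "s2"), ("s2", "s4")],
    [("s1", "s3"), ("s3", "s4")],
]

def pick_path(paths, load):
    # A path is only as good as its most congested link (bottleneck metric),
    # so choose the path whose worst link carries the least load.
    return min(paths, key=lambda p: max(load[link] for link in p))

best = pick_path(candidate_paths, link_load)
print("install flow rules along:", best)       # the s1-s3-s4 path wins here
```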