Optimal UAV Deployment for Data Collection in Deadline-based IoT Applications

The deployment of UAVs is one of the key challenges in UAV-based communications for IoT applications. In this article, a new scheme for energy-efficient, deadline-constrained data collection for the Internet of Things (IoT) using Unmanned Aerial Vehicles (UAVs) is presented. We propose a data collection method that gathers IoT node data through an efficient deployment and mobility plan for multiple UAVs, which collect data from ground IoT devices within a given deadline. In the proposed method, data collection is performed with minimum energy consumption for both the IoT nodes and the UAVs. To find an optimal solution to this problem, we first formulate a mixed integer linear programming (MILP) model and then use a heuristic to address its time complexity. Simulation results indicate the favourable performance of the proposed scheme in terms of energy consumption and the number of UAVs used.
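The abstract does not spell out the heuristic, so the following is only an illustrative sketch of the general idea (assign ground nodes to as few UAVs as possible subject to a deadline), not the authors' algorithm. The depot position, nearest-neighbour tour construction, and the `greedy_uav_assignment` name are all assumptions for illustration.

```python
import math

def greedy_uav_assignment(nodes, speed, deadline):
    """Greedy sketch: assign IoT ground nodes to UAVs so that each UAV's
    nearest-neighbour collection tour fits inside the deadline.
    `nodes` is a list of (x, y) positions; returns one tour per UAV."""
    remaining = list(nodes)
    tours = []
    while remaining:
        tour, pos, t = [], (0.0, 0.0), 0.0  # each UAV starts at a depot at the origin
        while remaining:
            nxt = min(remaining, key=lambda n: math.dist(pos, n))
            leg = math.dist(pos, nxt) / speed
            if tour and t + leg > deadline:
                break  # this UAV's tour is full; launch another UAV
            tour.append(nxt)
            remaining.remove(nxt)
            pos, t = nxt, t + leg
        tours.append(tour)
    return tours
```

A far-away node that cannot be reached within an existing tour simply spawns an additional UAV, which mirrors the paper's trade-off between deadline and number of UAVs.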

Scopus Clarivate Crossref
Publication Date
Wed Jan 01 2020
Journal Name
Ieee Access
Smart Routing Management Framework Exploiting Dynamic Data Resources of Cross-Layer Design and Machine Learning Approaches for Mobile Cognitive Radio Networks: A Survey

Scopus (20)
Crossref (20)
Scopus Clarivate Crossref
Publication Date
Sun Apr 01 2018
Journal Name
Aquatic Geochemistry
The Origin and MgCl2–NaCl Variations in an Athalassic Sag Pond: Insights from Chemical and Isotopic Data

Crossref (8)
Crossref
Publication Date
Tue Oct 23 2018
Journal Name
Journal Of Economics And Administrative Sciences
Processing of missing values in survey data using Principal Component Analysis and probabilistic Principal Component Analysis methods

The idea of carrying out research on incomplete data arose from the circumstances of our dear country and the horrors of war, which led to the loss of much important data across all aspects of life: economic, natural, health, scientific, and so on. The reasons for missingness vary: some lie beyond the control of those concerned, while others are deliberate, planned because of cost, risk, or a lack of means of inspection. The missing data in this study were processed using Principal Component Analysis and self-organizing map methods, evaluated by simulation. The variables considered were child health and the factors affecting children's health: breastfeed …
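The abstract names PCA-based imputation but gives no algorithm, so here is a minimal sketch of one standard approach, iterative low-rank PCA imputation, written with NumPy; the function name and iteration count are assumptions, and this is not necessarily the procedure the authors used.

```python
import numpy as np

def pca_impute(X, n_components=2, n_iter=50):
    """Iteratively fill missing entries (NaN) in X with a low-rank PCA
    reconstruction: start from column means, then alternate between a
    rank-k SVD fit and re-filling only the missing cells."""
    X = np.asarray(X, dtype=float)
    mask = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X_filled = np.where(mask, col_means, X)
    for _ in range(n_iter):
        mu = X_filled.mean(axis=0)
        U, s, Vt = np.linalg.svd(X_filled - mu, full_matrices=False)
        s[n_components:] = 0.0          # keep only the leading components
        X_hat = (U * s) @ Vt + mu       # rank-k reconstruction
        X_filled[mask] = X_hat[mask]    # observed entries stay untouched
    return X_filled
```

On data with genuine low-rank structure the imputed cells converge toward the values the low-rank model predicts.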

Crossref
Publication Date
Thu Feb 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Slice inverse regression with the principal components in reducing high-dimensions data by using simulation

This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates, so the problem must be addressed directly. Two approaches to high-dimensional data were used: the non-classical Sliced Inverse Regression (SIR) method, together with the proposed weighted standard SIR (WSIR) method, and Principal Component Analysis (PCA), the standard method for reducing dimensions. Both SIR and PCA are based on linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear …
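For readers unfamiliar with SIR, a minimal textbook sketch of the basic (unweighted) method follows: whiten the predictors, slice the observations by sorted response, and eigen-decompose the covariance of the slice means. This illustrates plain SIR only, not the proposed WSIR variant, and the function name and slice count are assumptions.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Basic Sliced Inverse Regression: returns the leading effective
    dimension-reduction direction(s) as columns of a (p, n_dirs) array."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T   # cov^(-1/2)
    Z = (X - mu) @ inv_sqrt                               # whitened predictors
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):           # slice by sorted y
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)              # weighted slice-mean covariance
    w, v = np.linalg.eigh(M)
    # leading eigenvectors, mapped back to the original coordinates
    return inv_sqrt @ v[:, ::-1][:, :n_dirs]
```

Unlike PCA, which ignores the response entirely, SIR uses the slice means of `y` to pick directions that are actually informative about the regression.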

Crossref
Publication Date
Sun Jan 01 2012
Journal Name
Tikrit Journal For Dental Sciences
Microleakage Evaluation of a Silorane-Based and Methacrylate-Based Packable and Nanofill Posterior Composites (in vitro comparative study)

This study compared in vitro the microleakage of a new low-shrink silorane-based posterior composite (Filtek™ P90) and two methacrylate-based composites, a packable posterior composite (Filtek™ P60) and a nanofill composite (Filtek™ Supreme XT), using a dye penetration test. Thirty sound human upper premolars were used in this study. Standardized class V cavities were prepared on the buccal surface of each tooth. The teeth were then divided into three groups of ten teeth each (Group 1: restored with Filtek™ P90; Group 2: restored with Filtek™ P60; Group 3: restored with Filtek™ Supreme XT). Each composite system was used according to the manufacturer's instructions with its corresponding adhesive system. The teeth were then …

Publication Date
Fri Jun 29 2018
Journal Name
Journal Of The College Of Education For Women
Audio Classification Based on Content Features

Audio classification is the process of classifying different audio types according to their content. It is applied in a wide variety of real-world problems; in all classification applications the target subjects can be viewed as a specific type of audio, and since audio types vary, every type has to be treated carefully according to its significant properties. Feature extraction is an important step in audio classification. This work introduces several sets of features according to audio type; two types of audio (datasets) were studied. Two different feature sets are proposed: (i) a first-order gradient feature vector, and (ii) a local roughness feature vector. The experiments showed that the results are competitive to …
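The abstract does not define the "first-order gradient feature vector", so the following is only a hypothetical sketch of what such a feature might look like: successive sample differences summarized by a fixed-length histogram plus simple statistics. The function name, bin count, and chosen statistics are all assumptions, not the authors' definition.

```python
import numpy as np

def gradient_features(signal, n_bins=8):
    """Hypothetical first-order gradient feature vector: take successive
    differences of the samples and summarize them with a density
    histogram plus mean, standard deviation, and mean absolute gradient."""
    g = np.diff(np.asarray(signal, dtype=float))       # first-order gradient
    hist, _ = np.histogram(g, bins=n_bins, density=True)
    return np.concatenate([hist, [g.mean(), g.std(), np.abs(g).mean()]])
```

The point of such a representation is that it is fixed-length regardless of clip duration, so clips of different lengths can feed the same classifier.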

Publication Date
Wed Jan 01 2014
Journal Name
Proceedings Of The Aintec 2014 On Asian Internet Engineering Conference - Aintec '14
LTE Peak Data Rate Estimation Using Modified alpha-Shannon Capacity Formula

Scopus (4)
Crossref (3)
Scopus Crossref
Publication Date
Sun Feb 10 2019
Journal Name
Journal Of The College Of Education For Women
IMPLEMENTATION OF THE SKIP LIST DATA STRUCTURE WITH IT'S UPDATE OPERATIONS

A skip list data structure is essentially a randomized simulation of a binary search tree. Skip list algorithms are simpler and faster and use less space. Conceptually, this data structure uses parallel sorted linked lists. Searching a skip list is somewhat more involved than scanning a regular sorted linked list, but it is much faster on average. Because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers. The search, insert, and delete operations each take expected O(log n) time. The skip list can be modified to implement the order-statistic operations RANK and SEARCH-BY-RANK while maintaining the same expected time.

Keywords: skip list, parallel linked list, randomized algorithm, rank.
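A minimal sketch of a standard skip list with the three update operations the abstract mentions may help; this follows the usual textbook design with per-node forward-pointer arrays and coin-flip level promotion, not necessarily the specific implementation the paper describes.

```python
import random

class SkipNode:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * level      # one forward pointer per level

class SkipList:
    MAX_LEVEL = 16

    def __init__(self, p=0.5):
        self.p = p                         # promotion probability
        self.level = 1
        self.head = SkipNode(None, self.MAX_LEVEL)

    def _random_level(self):
        lvl = 1
        while random.random() < self.p and lvl < self.MAX_LEVEL:
            lvl += 1                       # coin flips decide the node height
        return lvl

    def search(self, key):
        node = self.head
        for i in range(self.level - 1, -1, -1):   # descend from the top level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * self.MAX_LEVEL     # rightmost node before key, per level
        node = self.head
        for i in range(self.level - 1, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = SkipNode(key, lvl)
        for i in range(lvl):                      # splice into each level
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

    def delete(self, key):
        update = [None] * self.level
        node = self.head
        for i in range(self.level - 1, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        target = node.forward[0]
        if target and target.key == key:
            for i in range(self.level):           # unlink at every level it occupies
                if update[i].forward[i] is target:
                    update[i].forward[i] = target.forward[i]
```

Each search starts at the top level and drops down whenever the next key would overshoot, which is what yields the expected O(log n) behaviour.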

Publication Date
Mon Jan 01 2024
Journal Name
Aip Conference Proceedings
A multivariate Bayesian model using Gibbs sampler with real data application

Bayesian models are widely used in recent research across many scientific fields. This research presents a new Bayesian model for estimating parameters and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the posterior distribution of the Bayesian statistics. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets with sample sizes of 100, 300, and 500. The procedure was also extended to a real dataset, the rock intensity dataset. The actual dataset was collected …
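To illustrate the Gibbs sampler idea (not the paper's specific multivariate model or priors), here is a minimal sketch for a bivariate standard normal with correlation `rho`, where each coordinate is drawn from its exact conditional distribution in turn; the function name and defaults are assumptions.

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_samples=5000, burn_in=500, seed=0):
    """Minimal Gibbs sampler for a bivariate standard normal with
    correlation rho, alternately drawing each coordinate from its
    conditional x_i | x_j ~ N(rho * x_j, 1 - rho**2)."""
    rng = np.random.default_rng(seed)
    x = y = 0.0
    sd = np.sqrt(1.0 - rho ** 2)       # conditional standard deviation
    draws = []
    for t in range(n_samples + burn_in):
        x = rng.normal(rho * y, sd)    # update x given the current y
        y = rng.normal(rho * x, sd)    # update y given the new x
        if t >= burn_in:               # discard the burn-in phase
            draws.append((x, y))
    return np.array(draws)
```

In a real model the conditionals would be the derived posterior conditionals (here, for example, normal and inverse-gamma), but the alternating structure is the same.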

Scopus Crossref
Publication Date
Tue Mar 01 2022
Journal Name
Asian Journal Of Applied Sciences
Comparison between Expert Systems, Machine Learning, and Big Data: An Overview

Today, artificial intelligence has become one of the most important sciences for creating intelligent computer programs that simulate the human mind. In the medical field, the goal of artificial intelligence is to assist doctors and health-care workers in diagnosing diseases and guiding clinical treatment, reducing the rate of medical error and saving lives. The main and most widely used technologies are expert systems, machine learning, and big data. This article provides a brief overview of these three techniques to help readers understand them and their importance.

Crossref (2)
Crossref