Most recent studies have focused on using modern intelligent techniques, especially those developed for Intrusion Detection Systems (IDS). Such techniques are built on modern artificial intelligence-based modules that act like a human brain and should therefore be able to learn and to recognize what they have learned. The importance of developing such systems stems from the demand of customers and establishments to protect their property and prevent damage by intruders, a need best served by an intelligent module that raises the correct alarm. Accordingly, an interior visual intruder detection module based on Multi-Connect Architecture associative memory (MCA) is proposed. Using MCA as a new approach, the proposed module operates in two phases: a training phase, executed once during module installation, and an analysis phase. Both phases are built on MCA, each according to its process: the training phase uses the learning phase of MCA, while the analysis phase uses the convergence phase of MCA. MCA makes the training process efficient: the proposed system needs no more than 10 training images, in JPG format, out of the total number of frames. The proposed module was evaluated on 11,825 images extracted from 11 test videos. It detects the intruder with an accuracy in the range of 97%–100%, and the average training time per training video ranged from 10.2 s to 23.2 s.
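The two-phase flow described above can be sketched in code. Since the abstract does not give MCA's internal update rules, the sketch below substitutes a classical Hopfield-style associative memory for MCA; the class, the function names, the binarization scheme, and the overlap threshold are all illustrative assumptions, not the paper's method.

import numpy as np

class HopfieldMemory:
    # Simplified stand-in for MCA: stores binarized "empty scene" frames.
    def __init__(self, n_pixels):
        self.W = np.zeros((n_pixels, n_pixels))

    def learn(self, patterns):
        # Training phase: Hebbian learning over the few training frames.
        for p in patterns:
            self.W += np.outer(p, p)
        np.fill_diagonal(self.W, 0.0)

    def converge(self, x, n_iters=10):
        # Analysis phase: iterate until the state settles on an attractor.
        for _ in range(n_iters):
            x = np.sign(self.W @ x)
            x[x == 0] = 1.0
        return x

def binarize(frame):
    # Threshold a grayscale frame onto the {-1, +1} alphabet.
    return np.where(frame > frame.mean(), 1.0, -1.0).ravel()

def intruder_in(memory, frame, stored_patterns, threshold=0.9):
    recalled = memory.converge(binarize(frame))
    # If the frame converges far from every stored "empty scene"
    # attractor, flag it as containing an intruder.
    best = max(abs(recalled @ s) / s.size for s in stored_patterns)
    return best < threshold

In this sketch, learn() plays the role of MCA's learning phase and converge() the role of its convergence phase; the real module replaces both with MCA's own processes.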
Background/Objectives: This research aims at a modified image representation framework for Content-Based Image Retrieval (CBIR) built on a grayscale input image, Zernike Moments (ZMs) properties, Local Binary Pattern (LBP), the Y color space, the Slantlet Transform (SLT), and the Discrete Wavelet Transform (DWT). Methods/Statistical analysis: This study surveyed and analysed three standard datasets: WANG V1.0, WANG V2.0, and Caltech 101. The latter contains images of objects belonging to 101 classes, with approximately 40–800 images per category. The suggested infrastructure seeks to describe and operationalize the CBIR system through an automated attribute extraction system premised on CN…
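As one concrete piece of such a feature pipeline, a basic LBP descriptor can be computed as below. This is the generic 8-neighbour LBP only, not the paper's full ZM/SLT/DWT pipeline, and the function name is an illustrative assumption.

import numpy as np

def lbp_histogram(img):
    # Basic 8-neighbour Local Binary Pattern histogram for a
    # grayscale image given as a 2-D numpy array.
    center = img[1:-1, 1:-1]
    code = np.zeros_like(center, dtype=np.uint8)
    # Offsets of the 8 neighbours, clockwise from the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy: img.shape[0] - 1 + dy,
                        1 + dx: img.shape[1] - 1 + dx]
        code |= (neighbour >= center).astype(np.uint8) << bit
    hist = np.bincount(code.ravel(), minlength=256)
    return hist / hist.sum()   # normalized 256-bin descriptor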
Most Internet of Vehicles (IoV) applications are delay-sensitive and require resources for data storage and task processing that are very difficult for vehicles to afford. Such tasks are therefore often offloaded to more powerful entities, such as cloud and fog servers. Fog computing, a decentralized infrastructure located between the data source and the cloud, supplies several benefits that make it a non-trivial extension of the cloud. The high volume of data generated by vehicles' sensors and the limited computation capabilities of vehicles have imposed several challenges on VANET systems. Therefore, VANETs are integrated with fog computing to form a paradigm named Vehicular Fog Computing (VFC), which provides low-latency services to mo…
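To make the latency trade-off behind offloading concrete, a toy three-tier decision rule is sketched below. The link rates, CPU speeds, round-trip time, and cost formula are invented illustrative values; none of them come from the abstract.

def offload_target(task_bits, cpu_cycles,
                   local_hz=1e9,
                   fog_hz=8e9, fog_link_bps=50e6,
                   cloud_hz=32e9, cloud_link_bps=10e6,
                   cloud_rtt=0.1):
    # Pick the tier with the lowest estimated completion time for
    # one task (toy model: transmission time + computation time).
    t_local = cpu_cycles / local_hz
    t_fog = task_bits / fog_link_bps + cpu_cycles / fog_hz
    t_cloud = cloud_rtt + task_bits / cloud_link_bps + cpu_cycles / cloud_hz
    return min((("local", t_local), ("fog", t_fog), ("cloud", t_cloud)),
               key=lambda kv: kv[1])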
The main focus of this research is to examine the Travelling Salesman Problem (TSP) and the methods used to solve it. TSP is one of the combinatorial optimization problems that has received wide publicity and attention from researchers, owing to its simple formulation, its important applications, and its connections to the rest of the combinatorial problems. It is based on finding the optimal path through a known number of cities, where the salesman visits each city only once before returning to the city of departure. In this research, the FMOLP algorithm is employed as one of the best methods to solve the TSP, and the algorithm is applied in conjun…
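As a baseline illustration of the problem itself (not of the FMOLP approach the abstract describes), a nearest-neighbour heuristic can be sketched as follows; the function names are illustrative.

import math

def nearest_neighbour_tour(cities):
    # Greedy TSP heuristic: from the start city, repeatedly visit
    # the closest unvisited city, then return to the start.
    unvisited = set(range(1, len(cities)))
    tour = [0]
    while unvisited:
        last = cities[tour[-1]]
        nxt = min(unvisited, key=lambda j: math.dist(last, cities[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour + [0]   # close the cycle

def tour_length(cities, tour):
    return sum(math.dist(cities[a], cities[b])
               for a, b in zip(tour, tour[1:]))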
In this paper, estimation of the system reliability of the multi-component stress-strength model R(s,k) is considered, where the stress and strength are independent random variables that follow the Exponentiated Weibull Distribution (EWD) with known first shape parameter θ, while the second shape parameter α is unknown and is estimated using different estimation methods. Comparisons among the proposed estimators were made through a Monte Carlo simulation technique, based on the mean squared error (MSE) criterion.
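Under this setup, R(s,k) itself can be checked numerically. The sketch below assumes a unit-scale EWD with CDF F(x) = (1 − exp(−x^θ))^α, which it inverts to sample; the parameter values and function names are chosen purely for illustration and are not the paper's estimators.

import numpy as np

rng = np.random.default_rng(0)

def rexpweibull(size, theta, alpha):
    # Sample the unit-scale EWD by inverting F(x) = (1 - exp(-x**theta))**alpha.
    u = rng.random(size)
    return (-np.log(1.0 - u ** (1.0 / alpha))) ** (1.0 / theta)

def mc_reliability(s, k, theta, a_strength, a_stress, n_sim=100_000):
    # Monte Carlo estimate of R(s,k): the probability that at least
    # s of the k strength components exceed the common stress.
    strengths = rexpweibull((n_sim, k), theta, a_strength)
    stress = rexpweibull((n_sim, 1), theta, a_stress)
    exceed = (strengths > stress).sum(axis=1)
    return (exceed >= s).mean()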
XML is being incorporated into the foundation of e-business data applications. This paper addresses the problem of the freeform information stored in any organization and shows how XML, with this new approach, makes the search operation efficient and time-saving. The paper introduces a new solution and methodology developed to capture and manage such unstructured freeform information (multi-information), depending on the use of XML schema technologies, the neural network idea, and an object-oriented relational database, in order to provide a practical solution for efficiently managing a multi freeform information system.
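A minimal sketch of the core idea, structuring freeform records as XML so that search becomes tag-aware rather than a raw text scan, is given below; the record layout and tag names are assumptions, since the paper's actual schema is not reproduced here.

import xml.etree.ElementTree as ET

def make_record(rec_id, category, text):
    # One freeform item wrapped in a structured envelope.
    rec = ET.Element("record", id=rec_id, category=category)
    ET.SubElement(rec, "body").text = text
    return rec

store = ET.Element("freeform_store")
store.append(make_record("r1", "memo", "Quarterly supplier review notes"))
store.append(make_record("r2", "contract", "Service agreement draft"))

# Tag-aware search: restrict matching to one category instead of
# scanning all unstructured text.
hits = [r for r in store.findall("record[@category='memo']")
        if "supplier" in r.findtext("body", "")]
print([r.get("id") for r in hits])   # ['r1']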
In high-dimensional semiparametric regression, balancing accuracy and interpretability often requires combining dimension reduction with variable selection. This study introduces two novel methods for dimension reduction in additive partial linear models: (i) minimum average variance estimation (MAVE) combined with the adaptive least absolute shrinkage and selection operator (MAVE-ALASSO) and (ii) MAVE with smoothly clipped absolute deviation (MAVE-SCAD). These methods leverage the flexibility of MAVE for sufficient dimension reduction while incorporating adaptive penalties to ensure sparse and interpretable models. The performance of both methods is evaluated through simulations using the mean squared error and variable selection cri…
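For reference, the SCAD penalty at the core of MAVE-SCAD has the standard Fan and Li (2001) closed form. The sketch below implements only that generic penalty, not the paper's estimation procedure; a = 3.7 is the conventional default.

import numpy as np

def scad_penalty(t, lam, a=3.7):
    # SCAD penalty evaluated elementwise on |coefficient| values:
    # linear up to lam, quadratic blend up to a*lam, constant beyond.
    t = np.abs(np.asarray(t, dtype=float))
    linear = lam * t
    quad = (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1))
    const = lam**2 * (a + 1) / 2
    return np.where(t <= lam, linear,
                    np.where(t <= a * lam, quad, const))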
This paper attempts to study nonlinear second-order delay multi-valued problems. We want to show that the properties of such problems are the same as the properties of those without delay, just more technically involved. Our results discuss several known properties and introduce some notations and definitions. We also give an approximate solution to the considered problems using Galerkin's method.
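The projection idea behind Galerkin's method can be illustrated on a plain second-order boundary-value problem without delay (the delay case treated in the paper needs more machinery). The sketch below solves -u'' = f on (0, 1) with u(0) = u(1) = 0 in the orthonormal sine basis phi_k(x) = sqrt(2)*sin(k*pi*x), where each Galerkin equation decouples to c_k = <f, phi_k>/(k*pi)^2; the basis choice and function names are assumptions.

import numpy as np

def galerkin_solve(f, n_modes=20, n_quad=2001):
    x = np.linspace(0.0, 1.0, n_quad)
    dx = x[1] - x[0]
    coeffs = []
    for k in range(1, n_modes + 1):
        phi = np.sqrt(2.0) * np.sin(k * np.pi * x)
        load = np.sum(f(x) * phi) * dx   # ~ <f, phi_k>; phi vanishes at endpoints
        coeffs.append(load / (k * np.pi) ** 2)
    def u(xq):
        xq = np.asarray(xq, dtype=float)
        return sum(c * np.sqrt(2.0) * np.sin((k + 1) * np.pi * xq)
                   for k, c in enumerate(coeffs))
    return u

# Example: f(x) = pi^2 * sin(pi*x) has exact solution u(x) = sin(pi*x).
u = galerkin_solve(lambda x: np.pi ** 2 * np.sin(np.pi * x))
print(u(0.5))   # approximately 1.0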
Aggregate production planning (APP) is one of the most significant and complicated problems in production planning. It aims to set overall production levels for each product category to meet fluctuating or uncertain future demand, and to make decisions concerning hiring, firing, overtime, subcontracting, and inventory levels. In this paper, we present a simulated annealing (SA) approach for multi-objective linear programming to solve APP. SA is considered a good tool for imprecise optimization problems. The proposed model minimizes total production and workforce costs. In this study, the proposed SA is compared with particle swarm optimization (PSO). The results show that the proposed SA is effective in reducing total production costs and req…
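The acceptance rule at the heart of SA is standard. Below is a generic minimization skeleton assuming user-supplied cost and neighbour functions; it is not the paper's APP model, and the temperature schedule is an illustrative choice.

import math
import random

def simulated_annealing(cost, neighbour, x0,
                        t0=1.0, cooling=0.995, n_iters=20_000):
    # Accept worse moves with probability exp(-delta/T) and cool T
    # geometrically, keeping the best solution seen so far.
    x, fx, t = x0, cost(x0), t0
    best, fbest = x, fx
    for _ in range(n_iters):
        y = neighbour(x)
        fy = cost(y)
        if fy < fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest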