The smart city concept has attracted considerable research attention in recent years across diverse application domains, such as crime suspect identification, border security, transportation, and aerospace. A specific focus has been on increased automation using data-driven approaches, while leveraging remote sensing and real-time streaming of heterogeneous data from various sources, including unmanned aerial vehicles, surveillance cameras, and low-earth-orbit satellites. One of the core challenges in exploiting such high-rate temporal data streams, specifically videos, is the trade-off between the quality of video streaming and the limited transmission bandwidth. An optimal compromise is needed between video quality and, subsequently, the recognition, understanding, and efficient processing of large amounts of video data. This research proposes a novel unified approach to lossy and lossless video frame compression, which is beneficial for the autonomous processing and enhanced representation of high-resolution video data in various domains. The proposed fast block-matching motion estimation technique, namely mean predictive block matching, is based on the principle that general motion in any video frame is usually coherent. This coherent nature of video frames implies a high probability that a macroblock has the same direction of motion as the macroblocks surrounding it. The technique employs the partial distortion elimination algorithm to reduce the search time: the partial sum of the matching distortion between the current macroblock and a candidate macroblock is abandoned as soon as it surpasses the current lowest error. Experimental results demonstrate the superiority of the proposed approach over state-of-the-art techniques, including the four-step search, three-step search, diamond search, and new three-step search.
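To illustrate the partial distortion elimination idea described above, the following minimal Python sketch accumulates the sum of absolute differences row by row and abandons a candidate macroblock as soon as the partial sum exceeds the best error found so far. The function names and the simple full-search wrapper are illustrative assumptions, not the paper's mean predictive search itself.

```python
import numpy as np

def sad_with_pde(current_block, candidate_block, best_so_far):
    """Sum of absolute differences with partial distortion elimination:
    accumulate row by row and reject the candidate as soon as the
    partial sum exceeds the lowest error found so far."""
    partial = 0
    for row_cur, row_cand in zip(current_block, candidate_block):
        partial += np.abs(row_cur.astype(int) - row_cand.astype(int)).sum()
        if partial >= best_so_far:      # early termination
            return None                 # candidate rejected
    return partial

def search_window(current_block, ref_frame, top, left, radius=7):
    """Search a small window around (top, left), kept fast by PDE.
    A predictive variant would start from the median motion vector of
    neighbouring macroblocks rather than scanning from (0, 0)."""
    h, w = current_block.shape
    best_cost, best_mv = np.inf, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > ref_frame.shape[0] or x + w > ref_frame.shape[1]:
                continue
            cost = sad_with_pde(current_block, ref_frame[y:y+h, x:x+w], best_cost)
            if cost is not None and cost < best_cost:
                best_cost, best_mv = cost, (dy, dx)
    return best_mv, best_cost
```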
The modern French novel gained a distinctive status in the history of French literature during the first half of the twentieth century. This is due to many factors, including the new, objectively descriptive literary style adopted by novelists such as Alain Robbe-Grillet, who has long been regarded as the outstanding writer of the nouveau roman, as well as its major spokesman, representative writer, and leading theoretician. The new novel broke the classical rule of the single hero and evolved, by questioning the relationship between man and the world and by re-evaluating the limits of contemporary fiction, into a new form of narrative.
In the presence of deep submicron noise, providing reliable and energy-efficient network-on-chip operation is becoming a challenging objective. In this study, the authors propose a hybrid automatic repeat request (HARQ)-based coding scheme that simultaneously reduces the crosstalk-induced bus delay and provides multi-bit error protection while achieving high energy savings. This is achieved by calculating two-dimensional parities and duplicating all the bits, which provides single-error correction and the detection of up to six errors. The error correction reduces the performance degradation caused by retransmissions; combined with the voltage swing reduction enabled by the scheme's high error detection capability, this yields high energy savings.
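The core of such a scheme is two-dimensional (row/column) parity. The minimal sketch below shows how row and column parities locate and correct a single flipped data bit; the grid shape, concatenated codeword layout, and function names are illustrative assumptions, not the paper's exact code construction.

```python
import numpy as np

def encode_2d_parity(bits, rows, cols):
    """Arrange payload bits in a rows x cols grid, append even row and
    column parities, then duplicate all bits (as the abstract describes)."""
    grid = np.asarray(bits, dtype=np.uint8).reshape(rows, cols)
    row_parity = grid.sum(axis=1) % 2          # one parity bit per row
    col_parity = grid.sum(axis=0) % 2          # one parity bit per column
    codeword = np.concatenate([grid.ravel(), row_parity, col_parity])
    return np.concatenate([codeword, codeword])

def check_and_correct(received, rows, cols):
    """Use the first copy's parities: a single data-bit error flips exactly
    one row syndrome and one column syndrome, which pinpoints its position."""
    n = rows * cols + rows + cols
    copy = received[:n].copy()
    grid = copy[:rows * cols].reshape(rows, cols)
    row_syndrome = (grid.sum(axis=1) + copy[rows * cols:rows * cols + rows]) % 2
    col_syndrome = (grid.sum(axis=0) + copy[rows * cols + rows:]) % 2
    if row_syndrome.sum() == 1 and col_syndrome.sum() == 1:
        grid[np.argmax(row_syndrome), np.argmax(col_syndrome)] ^= 1  # correct single error
    return grid.ravel(), bool(row_syndrome.any() or col_syndrome.any())
```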
A simple, precise, rapid, and accurate reversed-phase high-performance liquid chromatographic method has been developed for the determination of guaifenesin in pure form, pharmaceutical formulations, and industrial effluent. Chromatography was carried out on a Supelco L7 reversed-phase column (25 cm × 4.6 mm, 5 µm), using a mixture of methanol-acetonitrile-water (80:10:10 v/v/v) as the mobile phase at a flow rate of 1.0 ml min-1. Detection was performed at 254 nm at ambient temperature. The retention time of guaifenesin was found to be 2.4 minutes. The calibration curve was linear (r = 0.9998) over the concentration range 0.08 to 0.8 mg/ml. The limit of detection (LOD) and limit of quantification (LOQ) were found to be 6 µg/ml and 18 µg/ml, respectively.
In the field of civil engineering, the adoption and use of Falling Weight Deflectometers (FWDs) is seen as a response to an ever-changing, technology-driven world. Specifically, FWDs are devices that aid in evaluating the physical properties of a pavement. This paper assesses the concepts of data processing, storage, and analysis via FWDs. The device has been found to play an important role in enabling operators and field practitioners to understand vertical deflection responses when pavements are subjected to impulse loads. In turn, the resultant data and its analysis outcomes lead to the backcalculation of the state of stiffness, with initial analyses of the deflection bowl carried out in conjunction with measured or assumed values.
The partial level density (PLD) of pre-equilibrium reactions described by Ericson's formula has been studied using different formulae for the single-particle level density g. The parameter g was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and further formulae for g were derived from the relation between g and the level density parameter a. The formulae used to derive g are the Rohr formula, the Egidy formula, the Yukawa formula, and the Thomas-Fermi formula. The partial level density results that depend on g from the Thomas-Fermi formula show good agreement with the experimental data.
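For reference, Ericson's expression for the partial level density of p particles and h holes with single-particle level density g, together with the standard relation linking g to the level density parameter a, are commonly written as below (a generic form without Pauli corrections; the paper's exact conventions may differ):

```latex
% Ericson partial level density for p particles and h holes (n = p + h)
\rho(p, h, E) = \frac{g\,(gE)^{\,n-1}}{p!\,h!\,(n-1)!}, \qquad n = p + h,
% relation between the single-particle level density g and the level density parameter a
a = \frac{\pi^{2}}{6}\, g .
```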
Thyroid disease is a common condition affecting millions worldwide. Early diagnosis and treatment can help prevent more serious complications and improve long-term health outcomes. However, diagnosis can be challenging due to the disease's variable symptoms and the limited diagnostic tests available. By processing enormous amounts of data and identifying trends that may not be immediately evident to human doctors, Machine Learning (ML) algorithms may be capable of increasing the accuracy with which thyroid disease is diagnosed. This study seeks to survey the most recent ML-based and data-driven developments and strategies for diagnosing thyroid disease, while considering the challenges associated with imbalanced data in thyroid disease datasets.
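The imbalanced-data challenge mentioned above is typically addressed with resampling or class weighting. The minimal scikit-learn sketch below uses a stratified split and balanced class weights; the synthetic feature matrix stands in for a hypothetical thyroid dataset and is not the data used in the study.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a thyroid dataset: 5% positive (diseased) class.
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.95, 0.05],
                           random_state=0)

# Stratified split preserves the minority-class proportion in both folds.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          stratify=y, random_state=0)

# class_weight="balanced" reweights samples inversely to class frequency,
# a common first remedy for imbalanced medical data.
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                             random_state=0).fit(X_tr, y_tr)

print(classification_report(y_te, clf.predict(X_te), digits=3))
```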
The cross sections for the (α,n) reaction were evaluated using the available International Atomic Energy Agency (IAEA) data, other published experimental data, and the well-known international libraries such as ENDF, JENDL, and JEFF. We considered the energy range from threshold to 25 MeV in intervals of 1 MeV. The weighted-average cross sections over all available experimental and theoretical (JENDL) data were calculated for all the considered isotopes. The cross section of each element was then calculated from the cross sections of its isotopes, taking into account their natural abundances. A representative mathematical equation was then obtained for each element.
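The element-level step described above amounts to an abundance-weighted combination of isotopic cross sections, sigma_el(E) = sum_i f_i * sigma_i(E). A minimal sketch, with made-up numbers purely for illustration, is:

```python
import numpy as np

def element_cross_section(isotope_xs, abundances):
    """Combine isotopic (alpha,n) cross sections into an elemental cross section
    as an abundance-weighted sum. isotope_xs has shape (n_isotopes, n_energies);
    abundances are fractional abundances (normalised here in case they are percentages)."""
    f = np.asarray(abundances, dtype=float)
    f = f / f.sum()
    return (f[:, None] * np.asarray(isotope_xs, dtype=float)).sum(axis=0)

# Illustrative (made-up) values: two isotopes on a 3-point energy grid, in barns.
sigma_iso = [[0.10, 0.25, 0.40],
             [0.05, 0.20, 0.35]]
print(element_cross_section(sigma_iso, [0.6, 0.4]))
```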
In this research, several estimators of the hazard function are introduced using a nonparametric method, namely the kernel method, for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best choice for all the kernel functions considered.
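As a concrete illustration of this kind of kernel hazard estimator for censored data, the sketch below smooths the Nelson-Aalen increments with an Epanechnikov kernel and a global bandwidth (a generic Ramlau-Hansen-style estimator, not the specific estimators proposed in the study; a local-bandwidth variant would let the bandwidth vary with each grid point).

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel, one of the kernels mentioned above."""
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)

def kernel_hazard(times, events, grid, bandwidth):
    """Kernel-smoothed hazard for right-censored data: smooth the Nelson-Aalen
    increments d_i / Y(t_i) with a kernel of the given (global) bandwidth."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)          # 1 = event, 0 = censored
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)                      # Y(t_i) for the sorted times
    increments = events / at_risk                   # Nelson-Aalen jumps
    grid = np.asarray(grid, dtype=float)
    u = (grid[:, None] - times[None, :]) / bandwidth
    return (epanechnikov(u) * increments[None, :]).sum(axis=1) / bandwidth

# Usage on a small illustrative censored sample.
t = [2.0, 3.5, 4.0, 5.1, 6.3, 7.0, 8.2]
d = [1,   1,   0,   1,   1,   0,   1]
print(kernel_hazard(t, d, grid=np.linspace(1, 9, 5), bandwidth=2.0))
```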
The influx of data in bioinformatics is primarily in the form of DNA, RNA, and protein sequences. This condition places a significant burden on scientists and computers. Some genomics studies depend on clustering techniques to group similarly expressed genes into one cluster. Clustering is a type of unsupervised learning that can be used to divide unlabelled data into clusters. The k-means and fuzzy c-means (FCM) algorithms are examples of algorithms that can be used for clustering. Consequently, clustering is a common approach that divides an input space into several homogeneous zones and can be achieved using a variety of algorithms. This study used three models to cluster a brain tumor dataset. The first model uses FCM.
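A minimal NumPy sketch of the FCM update loop is shown below; it alternates between membership-weighted cluster centres and fuzzy membership updates. This is a generic illustrative implementation with toy data, not the study's model.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means: alternate between updating cluster centres
    (membership-weighted means) and fuzzy memberships until convergence.
    m is the fuzzifier (m = 2 is the usual default)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, n_clusters))
    U /= U.sum(axis=1, keepdims=True)               # memberships sum to 1 per sample
    for _ in range(n_iter):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        # u_ij = 1 / sum_k (d_ij / d_ik)^(2/(m-1))
        U_new = 1.0 / (dist ** (2 / (m - 1)) *
                       (1.0 / dist ** (2 / (m - 1))).sum(axis=1, keepdims=True))
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centres, U

# Usage on toy 2-D points standing in for expression features (illustrative only).
X = np.vstack([np.random.default_rng(1).normal(0, 0.3, (50, 2)),
               np.random.default_rng(2).normal(3, 0.3, (50, 2))])
centres, memberships = fuzzy_c_means(X, n_clusters=2)
print(centres)
```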