Lossy and Lossless Video Frame Compression: A Novel Approach for High-Temporal Video Data Analytics

The smart city concept has attracted considerable research attention in recent years across diverse application domains, such as crime suspect identification, border security, transportation, and aerospace. Specific focus has been on increased automation using data-driven approaches, while leveraging remote sensing and real-time streaming of heterogeneous data from various sources, including unmanned aerial vehicles, surveillance cameras, and low-earth-orbit satellites. One of the core challenges in exploiting such high-temporal data streams, specifically videos, is the trade-off between the quality of video streaming and limited transmission bandwidth. An optimal compromise is needed between video quality and, subsequently, the recognition, understanding, and efficient processing of large amounts of video data. This research proposes a novel unified approach to lossy and lossless video frame compression, which is beneficial for the autonomous processing and enhanced representation of high-resolution video data in various domains. The proposed fast block-matching motion estimation technique, namely mean predictive block matching, is based on the principle that general motion in any video frame is usually coherent. This coherence implies a high probability that a macroblock has the same direction of motion as the macroblocks surrounding it. The technique employs the partial distortion elimination algorithm to shorten the search time: the partial sum of the matching distortion between the current macroblock and a candidate is abandoned as soon as it surpasses the current lowest error. Experimental results demonstrate the superiority of the proposed approach over state-of-the-art techniques, including the four-step search, three-step search, diamond search, and new three-step search.
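The partial distortion elimination idea can be sketched in a few lines. The following Python is an illustrative toy, not the paper's mean predictive block matching: function names and the tiny exhaustive search window are assumptions, chosen only to show how a running SAD sum lets a candidate block be rejected early.

```python
def sad_with_early_exit(cur, ref, best_so_far):
    """Sum of absolute differences with partial distortion elimination:
    abort as soon as the running sum exceeds the best error found so far."""
    total = 0
    for row_c, row_r in zip(cur, ref):
        total += sum(abs(a - b) for a, b in zip(row_c, row_r))
        if total >= best_so_far:      # partial sum already worse: stop early
            return None               # candidate rejected
    return total

def block(frame, x, y, n):
    """Extract an n-by-n macroblock whose top-left corner is (x, y)."""
    return [row[x:x + n] for row in frame[y:y + n]]

def best_match(cur_frame, ref_frame, x, y, n=4, search=2):
    """Exhaustive search around (x, y), using PDE to skip hopeless candidates."""
    cur = block(cur_frame, x, y, n)
    best_err, best_mv = float("inf"), (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            rx, ry = x + dx, y + dy
            if rx < 0 or ry < 0 or ry + n > len(ref_frame) or rx + n > len(ref_frame[0]):
                continue
            err = sad_with_early_exit(cur, block(ref_frame, rx, ry, n), best_err)
            if err is not None:
                best_err, best_mv = err, (dx, dy)
    return best_mv, best_err
```

The early exit is why candidate order matters in practice: predictive schemes such as the one proposed above try the most likely motion vector first, so `best_so_far` drops quickly and later candidates are rejected after only a few rows.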

Publication Date
Wed Mar 23 2011
Journal Name
Ibn Al-Haitham J. for Pure & Appl. Sci.
Image Compression Using Proposed Enhanced Run Length Encoding Algorithm

In this paper, we present a proposed enhancement of image compression using the RLE algorithm. The proposal decreases the size of the compressed image, whereas the original method, used primarily for compressing binary images [1], mostly increases the size of the original image when applied to color images. The enhanced algorithm was tested on a sample of ten 24-bit true-color BMP images. An application was built in Visual Basic 6.0 to show the image size before and after compression and to compute the compression ratio for both the standard RLE algorithm and the enhanced one.
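As background, the baseline run-length encoding that the paper enhances can be sketched as follows. This is a generic textbook byte-level RLE, not the authors' enhanced algorithm: it shows why long runs compress well while varied color data (few repeats) inflates, since each single pixel still costs a (count, value) pair.

```python
def rle_encode(data):
    """Basic run-length encoding: a list of (count, value) pairs.
    Runs are capped at 255 so the count fits in one byte."""
    out = []
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out.append((run, data[i]))
        i += run
    return out

def rle_decode(pairs):
    """Invert rle_encode by expanding each (count, value) pair."""
    out = []
    for count, value in pairs:
        out.extend([value] * count)
    return out
```

On a run-heavy input such as `[0]*10 + [255]*3`, the pair list is much shorter than the data; on an input with no repeated neighbors, every pixel becomes its own pair and the encoding is larger than the original, which is the failure mode on color images that the proposed enhancement targets.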

Publication Date
Sat Aug 01 2015
Journal Name
International Journal Of Computer Science And Mobile Computing
Image Compression based on Non-Linear Polynomial Prediction Model

Publication Date
Wed Mar 10 2021
Journal Name
Baghdad Science Journal
Merge Operation Effect On Image Compression Using Fractal Technique

Fractal image compression offers desirable properties such as fast decoding and very good rate-distortion curves, but it suffers from a high encoding time. Fractal image compression requires partitioning the image into ranges. In this work, we introduce a good partitioning process by means of a merge approach, since some ranges are connected to others. This paper presents a method to reduce the encoding time of the technique by reducing the number of range blocks, based on computing statistical measures between them. Experimental results on standard images show that the proposed method decreases the encoding time while keeping the visual quality acceptable.
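A minimal sketch of the merge idea, assuming range-block similarity is judged by mean and variance with hypothetical tolerances (the paper's exact statistical measures and thresholds are not given here). Merging statistically similar neighboring ranges shrinks the range pool, and with it the number of range-domain comparisons that dominate fractal encoding time.

```python
from statistics import mean, pvariance

def block_stats(block):
    """Mean and population variance of a 2-D pixel block."""
    pixels = [p for row in block for p in row]
    return mean(pixels), pvariance(pixels)

def similar(a, b, mean_tol=2.0, var_tol=4.0):
    """Hypothetical similarity test: blocks whose mean and variance differ
    by less than the given tolerances are candidates for merging."""
    (ma, va), (mb, vb) = block_stats(a), block_stats(b)
    return abs(ma - mb) < mean_tol and abs(va - vb) < var_tol

def merge_ranges(blocks):
    """Greedy left-to-right merge of a 1-D list of connected range blocks."""
    merged = [[blocks[0]]]
    for blk in blocks[1:]:
        if similar(merged[-1][-1], blk):
            merged[-1].append(blk)    # join with the previous group
        else:
            merged.append([blk])      # start a new range group
    return merged
```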

Publication Date
Mon Jun 08 2020
Journal Name
Research Journal Of Chemistry And Environment
High Performance Liquid Chromatographic and Area Under Curve Spectrophotometric Methods for Estimation of Cefixime in Pure and Marketed Formulation: A Comparative Study

Cefixime is an antibiotic useful for treating a variety of microorganism infections. In the present work, two rapid, specific, inexpensive, and nontoxic methods were proposed for cefixime determination. Area-under-curve spectrophotometric and HPLC methods were described for the micro-quantification of cefixime in highly pure and local market formulations. The area-under-curve (first) technique was used to calculate the cefixime peak using a UV-visible spectrophotometer. The HPLC (second) technique depended on the purification of cefixime by a C18 separating column, 250 mm (length) × 4.6 mm (diameter), using 50% methanol (organic modifier) and 50% deionized water as the mobile phase. An isocratic flow with a rate of 1 mL/min was applied; the temper…

Publication Date
Wed Aug 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
((Human Resource Staffing Strategy and Its Impact on High Performance: A Field Research in the Ministry of Agriculture))

Abstract

The problem of the current research was formulated as the question of the Ministry of Agriculture's awareness of adopting a staffing strategy and of identifying its shortcomings. The staffing strategy, the independent variable, is represented by three dimensions (recruitment, selection, placement); high performance, the dependent variable, is described by four dimensions (leadership; strategy; structure and processes; culture). The research uses a descriptive analytical approach. It aims to identify the correlation and the impact of the staffing strategy on high performance in the Ministry of Agriculture, and to clarify the relationship between…

Publication Date
Sat Sep 30 2017
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Spatial Data Analysis for Geostatistical Modeling of Petrophysical Properties for Mishrif Formation, Nasiriya Oil Field

Spatial data analysis is performed in order to remove skewness, a measure of the asymmetry of the probability distribution. It also improves the normality, a key statistical concept derived from the normal "bell-shaped" distribution, of properties such as porosity, permeability, and saturation, which can be visualized using histograms. Three steps of spatial analysis are involved here: exploratory data analysis, variogram analysis, and finally distributing the properties using geostatistical algorithms. Mishrif Formation (unit MB1) in Nasiriya Oil Field was chosen to analyze and model the data for the first eight wells. The field is an anticline structure with a northwest-south…
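For illustration, skewness can be computed directly from its moment definition. The sample values and the log transform below are hypothetical, showing only how a transform pulls right-skewed data of the kind common in petrophysics (e.g. permeability) toward the symmetric bell shape:

```python
import math

def skewness(xs):
    """Fisher-Pearson coefficient of skewness (population form):
    third central moment divided by the variance to the power 3/2."""
    n = len(xs)
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / n
    s3 = sum((x - m) ** 3 for x in xs) / n
    return s3 / s2 ** 1.5

# Hypothetical right-skewed sample (a few very large values drag the tail).
sample = [1, 2, 2, 3, 4, 8, 16, 50, 120, 400]
raw_skew = skewness(sample)
log_skew = skewness([math.log(x) for x in sample])  # closer to zero
```

A skewness near zero indicates a roughly symmetric distribution; the log transform here reduces the coefficient substantially, which is the kind of improvement the exploratory-data-analysis step checks with histograms before variogram modeling.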

Publication Date
Thu Oct 01 2015
Journal Name
Journal Of Engineering
Development of Spatial Data Infrastructure based on Free Data Integration

In recent years, the performance of Spatial Data Infrastructures for governments and companies is a task that has gained ample attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, and aerial and satellite images, are required to realize the geospatial data components of Spatial Data Infrastructures. In general, two distinct types of geospatial data sources exist over the Internet: formal and informal data sources. Despite the growth of informal geospatial data sources, integration between different free sources has not been achieved effectively. Addressing this task can be considered the main contribution of this research. This article addresses the research question of how the…

Publication Date
Fri Dec 30 2022
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Normalize and De-Normalize of Relative Permeability Data for Mishrif Formation in WQ1: An Experimental Work

In many oil-recovery systems, relative permeabilities (kr) are essential flow factors that affect fluid dispersion and output from petroleum resources. Traditionally, obtaining these crucial reservoir properties requires taking rock samples from the reservoir and performing suitable laboratory studies. Although kr is a function of fluid saturation, it is now well established that pore shape and distribution, absolute permeability, wettability, interfacial tension (IFT), and saturation history all influence kr values. These rock/fluid characteristics vary greatly from one reservoir region to the next, and it would be impossible to make kr measurements in all of them. The unsteady-state approach was used to calculate the relat…
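A common way to normalize and de-normalize kr curves over the mobile saturation range can be sketched as follows. The parameter names and values are illustrative assumptions, not the paper's measured data: saturation is rescaled to [0, 1] between connate water and residual oil, and kr is scaled by its endpoint so that curves from different core samples can be pooled and then mapped back to a target sample's endpoints.

```python
def normalize_kr(sw, kr, swc, sor, kr_end):
    """Normalize one (saturation, kr) point.
    swc: connate water saturation, sor: residual oil saturation,
    kr_end: endpoint relative permeability of this sample."""
    sw_n = (sw - swc) / (1.0 - swc - sor)   # mobile-range saturation in [0, 1]
    kr_n = kr / kr_end                       # kr scaled by its endpoint
    return sw_n, kr_n

def denormalize_kr(sw_n, kr_n, swc, sor, kr_end):
    """Invert the normalization for a target sample's endpoints."""
    sw = sw_n * (1.0 - swc - sor) + swc
    kr = kr_n * kr_end
    return sw, kr
```

De-normalizing with a different sample's `swc`, `sor`, and `kr_end` is what lets one averaged, normalized curve generate sample-specific curves across the reservoir regions where direct measurements are unavailable.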

Publication Date
Wed Oct 17 2018
Journal Name
Journal Of Economics And Administrative Sciences
New Robust Estimation in Compound Exponential Weibull-Poisson Distribution for both contaminated and non-contaminated Data

Abstract

The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.

 
