Document analysis of camera-captured images is a growing challenge. These photos are often poor-quality compound images composed of various objects and text, which complicates automatic analysis. OCR is one of the image processing techniques used to perform automatic identification of text. Existing image processing techniques must manage many parameters in order to recognize the text in such pictures reliably. Segmentation is regarded as one of these essential parameters. This paper discusses the accuracy of the segmentation process and its effect on the recognition process. In the proposed method, the images were first filtered using the Wiener filter, then the active contour algorithm could b
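The Wiener-filter denoising step described above can be sketched minimally. This is an illustration, not the paper's implementation: the image here is a small synthetic grayscale array with a bright "text" block plus Gaussian noise, and the filter window size (5, 5) is an assumption.

```python
import numpy as np
from scipy.signal import wiener

# Hypothetical noisy document image (grayscale, values near [0, 1]).
rng = np.random.default_rng(0)
clean = np.zeros((64, 64))
clean[20:40, 10:50] = 1.0                      # a bright "text" block
noisy = clean + rng.normal(0.0, 0.1, clean.shape)

# Wiener filtering as the pre-segmentation denoising step.
denoised = wiener(noisy, mysize=(5, 5))

# The filter should lower the noise variance in flat background regions.
flat_before = noisy[:10, :10].var()
flat_after = denoised[:10, :10].var()
```

The denoised image would then feed the active contour (segmentation) stage, which is omitted here.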
Recognizing speech emotions is an important subject in pattern recognition. This work studies the effect of extracting the minimum possible number of features on a speech emotion recognition (SER) system. In this paper, three experiments were performed to find the approach that gives the best accuracy. The first extracts only three features from the emotional speech samples: zero crossing rate (ZCR), mean, and standard deviation (SD); the second extracts only the first 12 Mel frequency cepstral coefficient (MFCC) features; and the last applies feature fusion between the two feature sets. In all experiments, the features are classified using five classification techniques, which are the Random Forest (RF),
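The first experiment's three-feature extraction (ZCR, mean, SD) can be sketched with plain NumPy. The "utterance" below is a synthetic 440 Hz tone standing in for an emotional speech sample, and the sample rate is an assumption:

```python
import numpy as np

def simple_features(signal):
    """Return the three low-cost features used in the first experiment:
    zero crossing rate (ZCR), mean, and standard deviation (SD)."""
    zcr = np.mean(np.abs(np.diff(np.sign(signal))) > 0)
    return np.array([zcr, signal.mean(), signal.std()])

# Hypothetical 1-second "utterance" at 16 kHz: a 440 Hz tone.
sr = 16000
t = np.arange(sr) / sr
utterance = np.sin(2 * np.pi * 440 * t)

feats = simple_features(utterance)   # [ZCR, mean, SD]
```

For a pure tone the ZCR tracks the frequency (about 2 × 440 / 16000 ≈ 0.055 sign changes per sample), which is why such a cheap feature still carries information about the signal.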
Social networking has dominated the whole world by providing a platform for information dissemination. People usually share information without knowing its truthfulness. Nowadays social networks are used for gaining influence in many fields, such as elections and advertisements. It is not surprising that social media has become a weapon for manipulating sentiments by spreading disinformation. Propaganda is one of the systematic and deliberate attempts to influence people for political or religious gain. In this research paper, efforts were made to classify propagandist text from non-propagandist text using supervised machine learning algorithms. Data was collected from news sources from July 2018 to August 2018. After annota
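A typical supervised pipeline for this kind of binary text classification can be sketched as follows. The hand-made toy sentences and the TF-IDF + logistic regression choice are illustrative assumptions, not the paper's dataset or its exact algorithms:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy examples standing in for the annotated news data
# (labels: 1 = propagandist, 0 = non-propagandist).
texts = [
    "our glorious leader will crush the traitorous enemies of the nation",
    "the heroic party faithful must destroy the corrupt liars forever",
    "the city council approved the new budget on Tuesday",
    "rainfall totals for July were slightly above the seasonal average",
]
labels = [1, 1, 0, 0]

# Supervised learning: TF-IDF features fed to a linear classifier.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)

pred = clf.predict(["the committee published its quarterly report"])[0]
```

Any classifier exposing `fit`/`predict` could be swapped into the pipeline, which is how multiple supervised algorithms would be compared on the same annotated data.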
Fraud includes acts of deception by parties inside and outside companies in order to obtain economic benefits at the expense of those companies. Fraud is committed when three factors are present: opportunity, motivation, and rationalization. Detecting fraud requires indications of the possibility of its existence. Here, Benford's law can play an important role in directing attention toward possible financial fraud in a company's accounting records, reducing the effort and time required to detect and prevent fraud.
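The Benford's-law screening idea can be sketched directly: the law says the leading digit d of naturally occurring amounts appears with probability log10(1 + 1/d), so a large gap between observed and expected first-digit frequencies flags records worth an audit. The sample data and the squared-error score below are illustrative assumptions:

```python
import math
from collections import Counter

def first_digit(x):
    """Leading significant digit of a positive amount."""
    x = abs(x)
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

def benford_deviation(amounts):
    """Sum of squared differences between observed first-digit
    frequencies and the Benford expectation log10(1 + 1/d).
    Larger values suggest the amounts merit closer inspection."""
    counts = Counter(first_digit(a) for a in amounts)
    n = len(amounts)
    return sum(
        (counts.get(d, 0) / n - math.log10(1 + 1 / d)) ** 2
        for d in range(1, 10)
    )

# A Benford-like sample (a geometric series spanning several orders of
# magnitude) versus fabricated amounts clustered on one leading digit.
natural = [1.32 ** k for k in range(1, 120)]
fabricated = [900 + i for i in range(119)]
```

The geometric series scores far lower than the clustered amounts, mirroring how genuine ledgers tend to follow the law while invented figures often do not.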
In this paper, the developed sprite allocation method is designed to work with the introduced block-matching method in order to minimize the allocation process time for digital video. The sprite-region allocation process consists of three main steps. The first step is the detection of the sprite area, where the sequence of frames belonging to a Group of Video sequences is analysed to detect the sprite regions that survive for a long time, and to determine the sprite type (i.e., whether it is static or dynamic). In the second step, the flagged surviving areas are passed through a gaps/islands removal stage, which enhances the detected sprite areas using post-processing operations. The third step is partitioning the sprite area in
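The block-matching primitive underlying such sprite detection can be sketched as an exhaustive Sum of Absolute Differences (SAD) search. The synthetic frames, block size, and search radius here are illustrative assumptions, not the paper's optimized method:

```python
import numpy as np

def best_match(block, frame, top, left, radius):
    """Exhaustive block matching: slide `block` over a search window in
    `frame` centred on (top, left) and return the displacement (dy, dx)
    with the smallest Sum of Absolute Differences (SAD)."""
    h, w = block.shape
    best = (None, float("inf"))
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if 0 <= y and 0 <= x and y + h <= frame.shape[0] and x + w <= frame.shape[1]:
                sad = np.abs(frame[y:y + h, x:x + w] - block).sum()
                if sad < best[1]:
                    best = ((dy, dx), sad)
    return best[0]

# A frame whose content reappears in the next frame shifted by (2, 3);
# matching an 8x8 block recovers that motion vector.
rng = np.random.default_rng(1)
frame0 = rng.random((32, 32))
frame1 = np.roll(np.roll(frame0, 2, axis=0), 3, axis=1)
block = frame0[10:18, 10:18]
motion = best_match(block, frame1, 10, 10, 4)
```

Blocks whose best-match displacement stays near zero across many frames are the long-surviving candidates for the static sprite region.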
In this paper, a new method is investigated that uses evolutionary algorithms (EAs) to cryptanalyse one of the nonlinear stream cipher cryptosystems that depends on the Linear Feedback Shift Register (LFSR) unit, using a ciphertext-only attack. Genetic Algorithm (GA) and Ant Colony Optimization (ACO) are used for attacking one of the nonlinear cryptosystems, called the "shrinking generator", using different lengths of ciphertext and different lengths of combined LFSRs. GA and ACO showed good performance in finding the initial values of the combined LFSRs. This work can be considered a warning for stream cipher designers to avoid the weak points, which may be f
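The target cryptosystem itself is simple to state: an LFSR A produces candidate keystream bits, and a selection LFSR S "shrinks" the stream by keeping A's bit only when S outputs 1. A minimal sketch, with tiny 3-bit registers and tap positions chosen for illustration (real deployments use much longer registers):

```python
def lfsr(state, taps, n):
    """Generate n bits from a Fibonacci LFSR.
    `state` is a list of bits (leftmost bit is output first);
    `taps` are 0-based positions XORed to form the feedback bit."""
    state = list(state)
    out = []
    for _ in range(n):
        out.append(state[0])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = state[1:] + [fb]
    return out

def shrinking_generator(a_state, a_taps, s_state, s_taps, n):
    """Keystream = output of LFSR A, kept only when the selection
    LFSR S outputs 1 (the 'shrinking' rule)."""
    # Generate generously, then shrink; roughly half of S's bits are 1.
    a = lfsr(a_state, a_taps, 8 * n)
    s = lfsr(s_state, s_taps, 8 * n)
    out = []
    for ai, si in zip(a, s):
        if si == 1:
            out.append(ai)
            if len(out) == n:
                break
    return out

keystream = shrinking_generator([1, 0, 0], [0, 1], [1, 1, 0], [0, 2], 10)
```

An EA-based ciphertext-only attack searches over the initial states of A and S, scoring each candidate by how plausible the resulting plaintext looks; the code above supplies only the generator being attacked.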
Industrial effluents loaded with heavy metals are a hazard to humans and other forms of life. Conventional approaches to removing copper, cadmium, and lead, such as electroplating, ion exchange, and membrane processes, are often cost-prohibitive and show low efficiency at low metal ion concentrations. Biosorption can be considered an option that has proven more efficient and economical for removing the mentioned metal ions. Biosorbents used include fungi, yeasts, oil palm shells, coir pith carbon, peanut husks, and olive pulp. Recently, low-cost natural products have also been researched as biosorbents. This paper presents an attempt at the potential use of Iraqi date pits and Al-Khriet (i.e. substances l
In this paper, a method of steganography in audio is introduced for hiding secret data in an audio media file (WAV). Hiding in audio is a challenging discipline, since the Human Auditory System is extremely sensitive. The proposed method embeds the secret text message in the frequency domain of the audio file and consists of two phases: embedding and extraction. In the embedding phase, the audio file is transformed from the time domain to the frequency domain using a 1-level linear wavelet decomposition, and only the high-frequency subband is used for hiding the secret message. The text message is encrypted using the Data Encryption Standard (DES) algorithm. Finally, the Least Significant Bit (LSB) algorithm is used to hide secr
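The final LSB step can be sketched in isolation. In the paper the bits come from the DES-encrypted message and are written into high-frequency wavelet coefficients; here, as a simplifying assumption, plain 16-bit integer samples stand in for those coefficients and the payload bits are given directly:

```python
import numpy as np

def embed_lsb(samples, bits):
    """Write one payload bit into the least significant bit of each
    sample, leaving every other bit of the sample untouched."""
    stego = samples.copy()
    stego[: len(bits)] = (stego[: len(bits)] & ~1) | bits
    return stego

def extract_lsb(samples, n):
    """Read the first n payload bits back out of the LSBs."""
    return samples[:n] & 1

carrier = np.arange(100, dtype=np.int16)      # stand-in WAV samples
payload = np.array([1, 0, 1, 1, 0, 0, 1, 0])  # bits of the secret text
stego = embed_lsb(carrier, payload)
recovered = extract_lsb(stego, len(payload))
```

Because only the LSB changes, each carrier value moves by at most 1, which is why the distortion stays below the threshold of the Human Auditory System once spread across high-frequency coefficients.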
Used automobile oils were subjected to filtration to remove solid material, and to dehydration by vacuum distillation under moderate pressure to remove water, gasoline, and light components; the dehydrated waste oil was then subjected to extraction using liquid solvents. Two solvents, n-butanol and n-hexane, were used to extract base oil from the used automobile oil, so that the expensive base oil can be reused.
The recovered base oil using the n-butanol solvent gives an 88.67% reduction in carbon residue, a 75.93% reduction in ash content, 93.73% oil recovery, 95% solvent recovery, and a viscosity index of 100.62, at a 5:1 solvent-to-used-oil ratio and a 40 °C extraction temperature, while using the n-hexane solvent gives (6