Face blurring is a complex process and an important task in advanced computer vision. It generally involves two main steps: the first detects the faces that appear in the frames, while the second tracks the detected faces based on the information extracted during the detection step. In the proposed method, an image is captured by the camera in real time, and the Viola-Jones algorithm is used to detect multiple faces in the captured image; to reduce the time consumed in handling the entire captured image, the background is removed and only the motion areas are processed. After the faces are detected, the Color-Space algorithm tracks them based on face color, and the Template Matching algorithm is used to check the differences between faces and to reduce processing time. Finally, the detected faces, as well as the faces tracked by their color, are obscured using a Gaussian filter. The achieved accuracies for a single face and for a dynamic background are about 82.8% and 76.3%, respectively.
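A minimal sketch of this detect-and-blur pipeline using OpenCV's Haar-cascade (Viola-Jones) face detector, a background subtractor to restrict work to motion areas, and a Gaussian filter to obscure the detections; the webcam index, motion threshold, and blur kernel size are assumptions, and the color-space tracking and template-matching stages described above are omitted.

```python
import cv2

# Viola-Jones face detector shipped with OpenCV as a Haar cascade.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)                  # assumed webcam index
bg = cv2.createBackgroundSubtractorMOG2()  # isolates motion areas

while True:
    ok, frame = cap.read()
    if not ok:
        break
    motion = bg.apply(frame)
    if cv2.countNonZero(motion) > 500:     # arbitrary motion threshold
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
            # Obscure each detected face with a Gaussian filter.
            frame[y:y+h, x:x+w] = cv2.GaussianBlur(
                frame[y:y+h, x:x+w], (51, 51), 0)
    cv2.imshow("blurred", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```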
In this work, an enhanced Photonic Crystal Fiber (PCF) sensor based on Surface Plasmon Resonance (SPR), using a side-polished structure for the detection of toxic arsenic ions in water, was designed and implemented. The SPR curve is obtained by polishing the side of the PCF and then coating an Au film on the polished area. According to the experimental findings, the proposed sensor exhibits a clear SPR effect. The estimated Signal-to-Noise Ratio (SNR), sensitivity (S), resolution (R), and figure of merit (FOM) are as follows: the SNR is 0.0125, S is 11.11 μm/RIU, the resolution is 1.8×10^-4 RIU, and the FOM is 13.88 for the Single-Mode Fiber-Photonic Crystal Fiber-Single-Mode Fiber (SMF-PCF-SMF) structure.
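For orientation, the quoted figures are consistent with the standard wavelength-interrogation definitions of these metrics. The sketch below shows how they are conventionally computed; the input values (peak shift, analyte index step, dip FWHM, and minimum detectable wavelength shift) are back-derived assumptions for illustration, not measurements from the paper.

```python
# Standard wavelength-interrogation figures of merit for an SPR sensor.
# All inputs below are illustrative assumptions, not values from the paper.
d_lambda_peak = 10.0    # resonance wavelength shift (nm) for the index step
d_n           = 9.0e-4  # refractive index change of the analyte (RIU)
fwhm          = 800.0   # full width at half maximum of the SPR dip (nm)
d_lambda_min  = 2.0     # minimum detectable wavelength shift (nm)

S   = d_lambda_peak / d_n                 # sensitivity (nm/RIU)
SNR = d_lambda_peak / fwhm                # signal-to-noise ratio
R   = d_n * d_lambda_min / d_lambda_peak  # resolution (RIU)
FOM = S / fwhm                            # figure of merit (1/RIU)

print(f"S   = {S / 1000:.2f} um/RIU")  # ~11.11
print(f"SNR = {SNR:.4f}")              # ~0.0125
print(f"R   = {R:.1e} RIU")            # ~1.8e-04
print(f"FOM = {FOM:.2f} 1/RIU")        # ~13.89
```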
Immunization is one of the most cost-effective and successful public health applications. The results of immunization are difficult to see because the incidence of disease is low, while adverse effects following immunization are noticeable, particularly when the vaccine is given to an apparently healthy person. The population has high safety expectations regarding vaccines, so people are more prone to hesitancy over even a small risk of adverse events, which may lead to loss of public trust.
The presence of multiple notions in a paper does not mean that the paper has no main objective, and this research is a case in point: its main objective is to introduce a new notion, the type-2 fuzzy somewhere dense set in a general type-2 fuzzy topological space, to study its relationship with the open sets of type-2 fuzzy sets in the same topological space, and to prove theorems about this notion.
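For orientation, the classical (crisp) topological notion being generalized is the following; rendering it here is an assumption based on the standard definition, since the paper's type-2 fuzzy version replaces interior and closure with their type-2 fuzzy counterparts.

```latex
% Classical definition (crisp topology): a subset A of a topological
% space (X, \tau) is somewhere dense iff its closure has nonempty interior.
\[
  A \text{ is somewhere dense} \iff
  \operatorname{int}\bigl(\operatorname{cl}(A)\bigr) \neq \varnothing .
\]
```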
Excessive skewness, which sometimes occurs in data, is an obstacle to assuming a normal distribution. Recent studies have therefore been active in studying the skew-normal distribution (SND), which fits skewed data; it generalizes the normal distribution with an additional skewness parameter (α) that gives the normal distribution more flexibility. When estimating the parameters of the SND by the method of Maximum Likelihood (ML), we face non-linear likelihood equations whose direct solutions are inaccurate and unreliable. To solve this problem, two methods can be used: the genetic algorithm (GA) and the iterative reweighting algorithm (IR) based on M-estimation.
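As a quick illustration of the estimation problem (not of the GA or IR procedures proposed above), the sketch below draws a skewed sample and fits the skew-normal parameters by maximum likelihood with SciPy, which solves the non-linear likelihood equations numerically; the sample size and true parameter values are arbitrary assumptions.

```python
import numpy as np
from scipy.stats import skewnorm

# True parameters (illustrative): skewness alpha, location, scale.
alpha_true, loc_true, scale_true = 4.0, 0.0, 2.0
sample = skewnorm.rvs(alpha_true, loc=loc_true, scale=scale_true,
                      size=1000, random_state=np.random.default_rng(0))

# Maximum likelihood fit of (alpha, loc, scale).
alpha_hat, loc_hat, scale_hat = skewnorm.fit(sample)
print(f"alpha={alpha_hat:.2f}  loc={loc_hat:.2f}  scale={scale_hat:.2f}")
```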
In this paper, we present a proposed enhancement of the image compression process using the RLE algorithm. The proposal decreases the size of the compressed image, whereas the original method, used primarily for compressing binary images [1], mostly increases the size of the original image when applied to color images. The enhanced algorithm was tested on a sample of ten 24-bit true-color BMP images, and an application was built in Visual Basic 6.0 to show the sizes before and after compression and to compute the compression ratios of RLE and of the enhanced RLE algorithm.
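A minimal sketch of plain byte-level run-length encoding, the baseline the enhancement targets; the (count, value) pair format is an assumption for illustration, and the paper's specific enhancement for color images is not reproduced here. Note how noisy color data, with few repeated bytes, can make the output larger than the input, which is the weakness described above.

```python
def rle_encode(data: bytes) -> bytes:
    """Encode as (count, value) byte pairs; runs are capped at 255."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

def rle_decode(encoded: bytes) -> bytes:
    out = bytearray()
    for count, value in zip(encoded[::2], encoded[1::2]):
        out += bytes([value]) * count
    return bytes(out)

data = b"\x00" * 30 + b"\xff" * 10 + b"\x01"  # long runs compress well
packed = rle_encode(data)
assert rle_decode(packed) == data
print(len(data), "->", len(packed))           # 41 -> 6
```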
The Internet provides vital communications between millions of individuals and is increasingly used as a commerce tool; thus, security is of high importance for securing communications and protecting vital information. Cryptography algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard (DES), which is the main reason an improved structure of the DES algorithm is needed. This paper proposes a new, improved structure for DES to make it secure and immune to attacks. The improved structure was accomplished using standard DES with a new two-key generation method.
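For context, a minimal sketch of standard single-key DES, the baseline being hardened here, using the PyCryptodome library; the key, mode, and message are arbitrary assumptions, and the paper's improved two-key structure is not reproduced.

```python
# pip install pycryptodome
from Crypto.Cipher import DES
from Crypto.Util.Padding import pad, unpad

key = b"8bytekey"                    # DES takes a 64-bit (8-byte) key
cipher = DES.new(key, DES.MODE_ECB)  # ECB only for illustration

plaintext = b"attack at dawn"
ciphertext = cipher.encrypt(pad(plaintext, DES.block_size))

decrypted = unpad(DES.new(key, DES.MODE_ECB).decrypt(ciphertext),
                  DES.block_size)
assert decrypted == plaintext
```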
Most heuristic search methods' performances depend on parameter choices. These parameter settings govern how new candidate solutions are generated and then applied by the algorithm, and they play a key role in determining the quality of the solution obtained and the efficiency of the search. Techniques for fine-tuning them are still an ongoing research area. The Differential Evolution (DE) algorithm is a very powerful optimization method that has become popular in many fields; based on the prolonged research work on DE, it is now arguably one of the most outstanding stochastic optimization algorithms for real-parameter optimization. One reason for its popularity is its widely appreciated property of having only a small number of parameters.
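A minimal sketch of the classic DE/rand/1/bin scheme, making that small parameter set explicit: population size NP, differential weight F, and crossover rate CR; the parameter values and test function are illustrative assumptions, not a tuning recommendation.

```python
import numpy as np

def de_rand_1_bin(f, bounds, NP=20, F=0.8, CR=0.9, gens=200, seed=0):
    """Classic DE with only three control parameters: NP, F, CR."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(NP, dim))
    fit = np.apply_along_axis(f, 1, pop)
    for _ in range(gens):
        for i in range(NP):
            others = [j for j in range(NP) if j != i]
            a, b, c = pop[rng.choice(others, size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)  # differential mutation
            cross = rng.random(dim) < CR               # binomial crossover
            cross[rng.integers(dim)] = True            # keep >= 1 mutant gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = f(trial)
            if f_trial <= fit[i]:                      # greedy selection
                pop[i], fit[i] = trial, f_trial
    best = int(fit.argmin())
    return pop[best], fit[best]

# Example: minimize the 5-D sphere function.
x, fx = de_rand_1_bin(lambda v: float(np.sum(v * v)), [(-5, 5)] * 5)
print(x, fx)
```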
JPEG is the most popular image compression and encoding technique and is widely used in many applications (images, videos, and 3D animations). Researchers are therefore very interested in developing this widely used technique to compress images at higher compression ratios while keeping image quality as high as possible. For this reason, this paper introduces a developed JPEG codec based on a fast DCT that removes most of the zeros while keeping their positions in the transformed block; additionally, arithmetic coding is applied rather than Huffman coding. The results showed that the proposed developed JPEG algorithm achieves better image quality than traditional JPEG techniques.
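A sketch of the transform-and-quantize stage on a single 8x8 block, showing where most coefficients become zero, which is the redundancy the developed codec removes; the smooth test block and uniform quantization step are assumptions (standard JPEG uses a quantization table), and the zero-removal and arithmetic-coding stages are not reproduced.

```python
import numpy as np
from scipy.fftpack import dct

def dct2(block):
    """2-D type-II DCT with orthonormal scaling (rows, then columns)."""
    return dct(dct(block, norm="ortho", axis=0), norm="ortho", axis=1)

x = np.arange(8, dtype=float)
block = 8.0 * np.add.outer(x, x) - 128.0  # smooth ramp, level-shifted

coeffs = dct2(block)
step = 16.0                               # uniform step (assumption)
quantized = np.round(coeffs / step).astype(int)

zeros = np.count_nonzero(quantized == 0)
print(f"{zeros}/64 coefficients are zero after quantization")
```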
Stemming is a pre-processing step in text mining applications and is very important in most Information Retrieval systems. The goal of stemming is to reduce different grammatical forms of a word, and sometimes derivationally related forms, to a common base (root or stem) form, e.g., reducing a noun, adjective, verb, or adverb to its base form. The stem need not be identical to the morphological root of the word; it is usually sufficient that related words map to the same stem, even if that stem is not itself a valid root. As in other languages, there is a need for an effective stemming algorithm for the indexing and retrieval of Arabic documents, yet Arabic stemming algorithms are not widely available.
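A minimal sketch of light stemming for Arabic, stripping a few common prefixes and suffixes while keeping at least three letters of the stem; the affix lists, their ordering, and the length guard are illustrative assumptions and fall far short of a full stemmer (real systems also normalize letters and handle infixes).

```python
# Illustrative affix lists; real light stemmers use larger, curated sets.
PREFIXES = ["وال", "بال", "كال", "فال", "ال", "لل", "و"]
SUFFIXES = ["ها", "ان", "ات", "ون", "ين", "ية", "ه", "ة", "ي"]

def light_stem(word: str) -> str:
    for p in sorted(PREFIXES, key=len, reverse=True):  # longest match first
        if word.startswith(p) and len(word) - len(p) >= 3:
            word = word[len(p):]
            break
    for s in sorted(SUFFIXES, key=len, reverse=True):
        if word.endswith(s) and len(word) - len(s) >= 3:
            word = word[:-len(s)]
            break
    return word

print(light_stem("المكتبات"))  # "the libraries" -> expected stem "مكتب"
```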