Data hiding strategies have recently gained popularity in many fields; digital watermarking technology was developed to hide copyright information in images, either visibly or invisibly. Today, 3D model technology has the potential to transform the field because it allows the production of sophisticated structures and forms that were previously impossible to achieve. In this paper, a new watermarking method for 3D models is presented. The proposed method is based on the geometrical and topological properties of the 3D model surface to increase security. The geometrical property is the mean curvature of the surface, and the topological property is the number of edges around each vertex; vertices with negative mean curvature and an odd number of incident edges are selected for embedding. A vertex with a negative mean curvature value lies in a deep (concave) region of the surface, so modifying it is not noticeable to the human eye. To evaluate the performance of the proposed algorithm, PSNR and CF are used as measures of the visibility and robustness of the 3D watermarked model. The experimental results show that the proposed algorithm has good imperceptibility, with PSNR reaching up to 44.41, and robustness against attacks, with a CF of one in many cases.
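As a rough illustration of the embedding-site selection described above, the following Python sketch picks vertices with negative mean curvature and odd valence. It is not the authors' implementation: the mesh file name is hypothetical, and it assumes the trimesh library's ball-based mean curvature measure as a stand-in for whatever curvature estimate the paper actually uses.

```python
import numpy as np
import trimesh
from trimesh.curvature import discrete_mean_curvature_measure

# Illustrative sketch only -- not the paper's exact embedding algorithm.
def select_embedding_vertices(mesh, radius=0.5):
    """Return indices of vertices with negative mean curvature and odd valence."""
    # Approximate per-vertex mean curvature over a ball of the given radius.
    H = discrete_mean_curvature_measure(mesh, mesh.vertices, radius)
    selected = []
    for i, neighbors in enumerate(mesh.vertex_neighbors):
        valence = len(neighbors)            # number of edges meeting at vertex i
        if H[i] < 0 and valence % 2 == 1:   # deep (concave) region, odd valence
            selected.append(i)
    return np.array(selected, dtype=int)

mesh = trimesh.load("model.ply")            # hypothetical carrier mesh
candidates = select_embedding_vertices(mesh)
print(f"{len(candidates)} candidate vertices for embedding")
```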
The aim of this paper is to compare classical and fuzzy filters for removing different types of noise from grayscale images. The processing consists of three steps. First, different types of noise are added to the original image to produce a noisy image (with different noise ratios). Second, classical and fuzzy filters are applied to the noisy image. Finally, the resulting images are compared using a quantitative measure, the Peak Signal-to-Noise Ratio (PSNR), to determine the best filter in each case.
The image used in this paper is 512 × 512 pixels, and all filters use a square window of size 3 × 3. Results indicate that fuzzy filters achieve varying degrees of success in noise reduction compared to classical filters.
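Since PSNR is the comparison measure used throughout, here is a minimal sketch of its standard computation for 8-bit grayscale images (not code from the paper):

```python
import numpy as np

def psnr(original, filtered, max_value=255.0):
    """Peak Signal-to-Noise Ratio between two same-sized grayscale images, in dB."""
    original = original.astype(np.float64)
    filtered = filtered.astype(np.float64)
    mse = np.mean((original - filtered) ** 2)   # mean squared error
    if mse == 0:
        return float("inf")                     # identical images
    return 10.0 * np.log10((max_value ** 2) / mse)
```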
Compression is the reduction in size of data in order to save storage space or transmission time. For data transmission, compression can be performed on just the data content or on the entire transmission unit (including header data), depending on a number of factors. In this study, we consider an audio compression method based on text coding, in which the audio file is converted into a text file in order to reduce the time needed to transfer the data over a communication channel. Approach: we propose two coding methods and optimize the solution by using CFG. Results: the application was first tested with a 4-bit coding algorithm; because its results were not satisfactory, a new approach to compressing the audio file was proposed.
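The abstract gives no implementation details, but the audio-to-text conversion step with 4 bits per character can be illustrated roughly as below. This is a hypothetical sketch, not the paper's algorithm, and the text representation alone does not shrink the data; any size reduction would come from a subsequent coding stage such as the one the paper proposes.

```python
# Hypothetical illustration of a 4-bit "audio to text" coding, not the paper's method.
def audio_to_text(audio_bytes: bytes) -> str:
    """Represent each byte as two hexadecimal characters (4 bits per character)."""
    return audio_bytes.hex()

def text_to_audio(text: str) -> bytes:
    """Recover the original byte stream from its text representation."""
    return bytes.fromhex(text)

with open("sound.wav", "rb") as f:            # hypothetical input file
    data = f.read()
encoded = audio_to_text(data)
assert text_to_audio(encoded) == data          # lossless round trip
```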
Background: DVT is a very common problem with very serious complications, such as pulmonary embolism (PE), which carries a high mortality, and many other chronic and troublesome complications (such as chronic DVT, post-phlebitic syndrome, and chronic venous insufficiency). It has many risk factors that affect its course, severity, and response to treatment. Objectives: Most of those risk factors are modifiable, and a better understanding of the relationships between them can be beneficial for better assessment of susceptible patients, prevention of the disease, and the effectiveness of our treatment modalities. The male-to-female ratio was nearly equal, so gender was not discussed among the other risk factors. Type of the study: A cross-sectional study.
This work presents an approach to modelling a decision support system framework and introduces an application for decision making in medical knowledge system analysis. First aid is extremely important worldwide; hence, a decision support framework, known as the First Aid Decision Support System (FADSS), was designed and implemented to handle experimental cases posing danger to the general population, offering advanced facilities for testing abilities in research and arranging emergency treatment through a graphical user interface (UI). The design of first aid treatment in FADSS depends on the general cases in first aid. We present a strategy for managing first aid treatment by modelling an application (FADSS) that assists people.
... Show MoreThe cadastral map is very important because it has technical and materialist
specification of the property borders and these maps which are land registration
based on it in Iraq, the problem is an ancient maps and unfit for use, despite its
importance, Therefor the updating and digitize the cadastral map is very pivotal, this
is what we have done in the present work.
In the present work, we have an old cadastral map (as a paper) was made in 1932
with modern satellite image (Quick Bird ) 2006, which has 61 cm resolution for the
same area after. Geometric correction technique has been applied by using image-toimage
method or (image registration ) and after that we get new agricultural cadaster
map and connect the
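As a rough illustration of image-to-image geometric correction, the sketch below warps a scanned map onto a reference satellite image from a few manually picked control-point pairs. The file names and point coordinates are hypothetical, and OpenCV is used here only as an example tool; the paper does not state which software was used.

```python
import cv2
import numpy as np

# Hypothetical control points: pixel locations of the same ground features,
# picked manually in the scanned map and in the reference satellite image.
map_pts = np.float32([[120, 340], [860, 310], [450, 920], [900, 880]])
sat_pts = np.float32([[135, 352], [872, 295], [460, 940], [915, 870]])

scanned_map = cv2.imread("cadastral_map_1932.png")   # digitized paper map
satellite   = cv2.imread("quickbird_2006.png")       # reference image

# Estimate a projective transform from the control points and resample the map
# into the satellite image's coordinate frame (the "image-to-image" correction).
H, _ = cv2.findHomography(map_pts, sat_pts, cv2.RANSAC)
registered = cv2.warpPerspective(scanned_map, H,
                                 (satellite.shape[1], satellite.shape[0]))
cv2.imwrite("registered_map.png", registered)
```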
Decision-making in Operations Research is a central topic in various studies of real-life applications. However, a drawback of some of these studies is that they are restricted and have not addressed the nature of the values in terms of imprecise data (ID). This paper therefore makes two contributions. First, the total costs are decreased by classifying subsets of costs. Second, the optimal solution is improved by the Hungarian assignment approach. The newly proposed method is called the fuzzy sub-Triangular form (FS-TF) under ID. The results obtained are excellent compared with previous methods, including the robust ranking technique, arithmetic operations, the magnitude ranking method, and the centroid ranking method.
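A minimal sketch of the general setting (not the FS-TF method itself): triangular fuzzy costs are reduced to crisp values by a simple ranking, here the centroid ranking, which is one of the baselines mentioned above, and the resulting assignment problem is then solved by the Hungarian algorithm via SciPy. The cost values are illustrative.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Each cost is a triangular fuzzy number (low, mid, high); values are made up.
fuzzy_costs = np.array([
    [(2, 4, 6), (5, 7, 9), (1, 3, 5)],
    [(4, 6, 8), (2, 3, 4), (6, 8, 10)],
    [(3, 5, 7), (1, 2, 3), (4, 5, 6)],
], dtype=float)

# Centroid ranking: defuzzify each triangular number to (low + mid + high) / 3.
crisp = fuzzy_costs.mean(axis=2)

# Hungarian algorithm gives the cost-minimizing one-to-one assignment.
rows, cols = linear_sum_assignment(crisp)
print("assignment:", list(zip(rows, cols)), "total cost:", crisp[rows, cols].sum())
```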
Association rules mining (ARM) is a fundamental and widely used data mining technique for extracting useful information from data. Traditional ARM algorithms degrade computational efficiency by mining too many association rules, most of which are not appropriate for a given user. Recent research in ARM investigates the use of metaheuristic algorithms that look for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rules mining, DCS-ARM, is proposed for this purpose. The effectiveness of the algorithm is tested against a set of well-known transactional databases. Results indicate that the proposed algorithm outperforms existing metaheuristic methods.
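The DCS-ARM algorithm itself is not detailed in the abstract; the following is a generic, simplified sketch of a discrete cuckoo search over association rules. The rule encoding, fitness function, and move operator are assumptions made purely for illustration.

```python
import random

# Toy transactional database: each transaction is a set of item ids.
DB = [{0, 1, 2}, {0, 2}, {1, 2, 3}, {0, 1, 2, 3}, {2, 3}]
N_ITEMS = 4

def fitness(rule):
    """Rule is a list with 0 = unused item, 1 = antecedent item, 2 = consequent item."""
    ante = {i for i, v in enumerate(rule) if v == 1}
    cons = {i for i, v in enumerate(rule) if v == 2}
    if not ante or not cons:
        return 0.0
    n_ante = sum(ante <= t for t in DB)
    n_both = sum((ante | cons) <= t for t in DB)
    if n_ante == 0:
        return 0.0
    support = n_both / len(DB)
    confidence = n_both / n_ante
    return support * confidence              # simple combined quality measure

def random_rule():
    return [random.choice([0, 1, 2]) for _ in range(N_ITEMS)]

def mutate(rule, flips=1):
    """Discrete 'Levy-like' move: re-draw a few positions at random."""
    new = rule[:]
    for _ in range(flips):
        new[random.randrange(N_ITEMS)] = random.choice([0, 1, 2])
    return new

def cuckoo_search(n_nests=10, iterations=100, abandon_frac=0.25):
    nests = [random_rule() for _ in range(n_nests)]
    for _ in range(iterations):
        # A cuckoo lays a new solution derived from a random nest.
        candidate = mutate(nests[random.randrange(n_nests)], flips=random.randint(1, 2))
        j = random.randrange(n_nests)
        if fitness(candidate) > fitness(nests[j]):
            nests[j] = candidate
        # Abandon the worst fraction of nests and rebuild them randomly.
        nests.sort(key=fitness, reverse=True)
        for k in range(int((1 - abandon_frac) * n_nests), n_nests):
            nests[k] = random_rule()
    return max(nests, key=fitness)

best = cuckoo_search()
print("best rule encoding:", best, "fitness:", round(fitness(best), 3))
```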
World statistics show that aging is directly correlated with growing health problems and comorbid conditions. As healthcare communities evolve with massive amounts of data at an ever faster pace, it is essential to predict, assist, and prevent diseases at the right time, especially for elders. Likewise, many researchers have noted that elders suffer extensively from chronic health conditions. This work reviews literature studies on prediction systems for various chronic illnesses of elderly people. Most of the reviewed papers proposed machine learning prediction models, combined with or without other related intelligence techniques, for chronic disease detection in elderly patients.
This paper focuses on reducing the time of text-processing operations by enumerating each string using a multi-hashing methodology. Text analysis is an important subject for any system that deals with strings (sequences of characters from an alphabet) and text processing (e.g., word processors, text editors, and other text-manipulation systems). Many problems arise when dealing with string operations on strings of unfixed length (e.g., the execution time); this is due to the overhead of embedded operations (such as symbol matching and conversion operations). The execution time largely depends on the string's characteristics, especially its length (i.e., the number of characters it consists of).
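The abstract does not specify which hash functions are used; the sketch below only illustrates the general idea of enumerating strings with multiple hashes so that later comparisons work on fixed-size integers instead of character sequences. The function names and hash parameters are illustrative.

```python
# Illustrative multi-hashing enumeration, not the paper's exact scheme.
def poly_hash(s: str, base: int, mod: int) -> int:
    """Polynomial rolling hash of a string."""
    h = 0
    for ch in s:
        h = (h * base + ord(ch)) % mod
    return h

def enumerate_string(s: str) -> tuple:
    """Map a string to a small tuple of integers; two hashes reduce collisions."""
    return (poly_hash(s, 257, 1_000_000_007),
            poly_hash(s, 131, 998_244_353))

# Comparing enumerations is O(1) instead of O(length of string).
words = ["processing", "processor", "processing"]
codes = [enumerate_string(w) for w in words]
print(codes[0] == codes[2], codes[0] == codes[1])   # True False
```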
This paper studies two stratified quantile regression models, of the marginal and the conditional varieties. We estimate the quantile functions of these models using two nonparametric methods, smoothing splines (B-splines) and kernel regression (Nadaraya-Watson). The estimates are obtained by solving the nonparametric quantile regression problem, that is, by minimizing the quantile regression objective function, using the varying-coefficient model approach. The main goal is to compare the estimators of the two nonparametric methods and to adopt the better of the two.
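For reference, the quantile regression objective minimized in both approaches uses the standard check (pinball) loss. In a LaTeX sketch (notation assumed here, not taken from the paper):

```latex
% Check loss at quantile level \tau and the nonparametric objective
\rho_\tau(u) = u\,\bigl(\tau - \mathbf{1}\{u < 0\}\bigr), \qquad
\hat{q}_\tau = \arg\min_{g \in \mathcal{G}} \sum_{i=1}^{n} \rho_\tau\bigl(y_i - g(x_i)\bigr)
```

Here \(\mathcal{G}\) would be the B-spline function class in the spline approach; one common kernel-based (local constant, Nadaraya-Watson-style) formulation instead weights observations near a point \(x\) with a kernel, \(\hat{q}_\tau(x) = \arg\min_{c} \sum_{i=1}^{n} K\!\bigl(\tfrac{x_i - x}{h}\bigr)\,\rho_\tau(y_i - c)\).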