The expanding use of multi-processor supercomputers has significantly increased the speed and size of problems that can be tackled. The adoption of the standard Message Passing Interface (MPI) protocol has enabled programmers to write portable and efficient code across a wide variety of parallel architectures. Sorting is one of the most common operations performed by a computer. Because sorted data are easier to manipulate than randomly ordered data, many algorithms require sorted data. Sorting is of additional importance to parallel computing because of its close relation to the task of routing data among processes, which is an essential part of many parallel algorithms. In this paper, sequential sorting algorithms, parallel implementations of several sorting methods using the MPICH.NT.1.2.3 library under the C++ programming language, and comparisons between the parallel and sequential implementations are presented. These methods are then applied in the image processing field: a median filter has been built based on the presented algorithms. As the parallel platform is unavailable, time is measured in terms of the number of computation steps and communication steps.
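The abstract above does not include code; as a minimal sequential sketch of the median-filter application it describes, the following pure-Python version sorts each neighborhood window (the sorting step is what an MPI version would distribute across processes). The function name and the border-handling choice are illustrative assumptions, not the paper's implementation.

```python
def median_filter(image, k=3):
    """Apply a k x k median filter to a 2-D list of gray levels.

    Border pixels are left unchanged for simplicity; this is an
    illustrative sequential sketch, not the paper's MPI code.
    """
    h, w = len(image), len(image[0])
    r = k // 2
    out = [row[:] for row in image]
    for y in range(r, h - r):
        for x in range(r, w - r):
            window = [image[y + dy][x + dx]
                      for dy in range(-r, r + 1)
                      for dx in range(-r, r + 1)]
            window.sort()  # the sorting step the paper parallelizes
            out[y][x] = window[len(window) // 2]
    return out
```

A parallel version would partition the image rows among MPI processes, exchange halo rows with neighbors, and filter each strip independently.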
Abstract— The growing use of digital technologies across various sectors and daily activities has made handwriting recognition a popular research topic. Despite the continued relevance of handwriting, people still require the conversion of handwritten copies into digital versions that can be stored and shared electronically. Handwriting recognition involves the computer's ability to identify and understand legible handwritten input from various sources, including documents, photographs, and others. Handwriting recognition poses a complex challenge due to the diversity of handwriting styles among individuals, especially in real-time applications. In this paper, an automatic system was designed for handwriting recognition.
Today, with the increasing use of social media, many researchers have become interested in extracting topics from Twitter. Tweets are short, unstructured, and messy texts, which makes it difficult to find topics in them. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short-text content like tweets. Fortunately, Twitter has many features that represent the interaction between users, and tweets carry rich user-generated hashtags that serve as keywords. In this paper, we exploit the hashtag feature to improve the topics learned.
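One common way to exploit hashtags for short-text topic modeling is to pool tweets that share a hashtag into longer pseudo-documents before running a topic model. The abstract does not specify the paper's exact scheme, so the sketch below is a generic hashtag-pooling step; the function name and the `#none` fallback bucket are assumptions for illustration.

```python
from collections import Counter, defaultdict

def pool_by_hashtag(tweets):
    """Group tweets into pseudo-documents keyed by hashtag.

    Pooling short tweets that share a hashtag yields longer documents,
    which standard topic models such as LDA handle better.
    """
    pools = defaultdict(list)
    for tweet in tweets:
        words = tweet.lower().split()
        tags = [w for w in words if w.startswith("#")]
        text = [w for w in words if not w.startswith("#")]
        # tweets without a hashtag fall into a catch-all bucket
        for tag in tags or ["#none"]:
            pools[tag].extend(text)
    # term frequencies per pseudo-document, ready to feed a topic model
    return {tag: Counter(ws) for tag, ws in pools.items()}
```

The resulting pseudo-documents can then be passed to any standard LDA/LSA implementation in place of individual tweets.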
Today, the role of cloud computing in our day-to-day lives is very prominent. The cloud computing paradigm makes it possible to provide resources on demand. Cloud computing has changed the way organizations manage resources due to its robustness, low cost, and pervasive nature. Data security is usually realized using different methods such as encryption. However, the privacy of data is another important challenge that should be considered when transporting, storing, and analyzing data in the public cloud. In this paper, a new method is proposed to track malicious users who use their private key to decrypt data in a system, share it with others, and cause system information leakage. Security policies are also considered to be int
Cloud computing (CC) is a fast-growing technology that offers computing, networking, and storage services that can be accessed and used over the internet. Cloud services save users money because they are pay-per-use, and they save time because they are on-demand and elastic, unique aspects of cloud computing. However, several security issues must be addressed before users store data in the cloud. Because the user will have no direct control over the data that has been outsourced to the cloud, particularly personal and sensitive data (health, finance, military, etc.), and will not know where the data is stored, the user must ensure that the cloud stores and maintains the outsourced data appropriately. The study's primary goals are to mak
In this paper, we employ the maximum likelihood estimator in addition to the shrinkage estimation procedure to estimate the system reliability (
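The abstract is cut off before the reliability expression, so the paper's exact estimator cannot be reproduced here. As a generic illustration of combining an MLE with a shrinkage step, the sketch below computes the MLE of an exponential rate and shrinks it linearly toward a prior guess; the exponential model, the weight `k`, and both function names are assumptions for illustration only.

```python
def mle_exponential_rate(samples):
    """Maximum likelihood estimate of an exponential rate: n / sum(x_i)."""
    return len(samples) / sum(samples)

def shrinkage_estimate(theta_mle, theta_prior, k):
    """Linear shrinkage: pull the MLE toward a prior guess theta_prior.

    k in [0, 1] is the shrinkage weight; k = 1 recovers the MLE and
    k = 0 returns the prior guess.  (Generic form -- the paper's exact
    reliability expression is not given in the abstract.)
    """
    return k * theta_mle + (1.0 - k) * theta_prior
```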
The research comprised five sections. The first section covers the introduction to the research and its importance, addressing the importance of gymnastics, the effectiveness of the parallel bars skill, and the importance of biochemical variables. The research problem is that learners differ in acquiring this skill, and the difficulty in learning it, most importantly the risk of falling and injury, has a negative impact on performance; a lack of a sense of the movement is another obstacle to completing the skill. The goal of the research is to design a device that helps develop biochemical changes for the skill of the rear vault dismount with a one-half twist on the parallel bars in gymnastics. And the n
In recent years, remote sensing applications have attracted great interest because the field offers many advantages, benefits, and possibilities. Satellites are among the most important platforms for remote sensing; they provide multispectral images that allow us to study many problems, such as changes in the ecological cover or biodiversity of the Earth's surface, and to illustrate the biological diversity of the studied areas by presenting the different regions of the scene according to their characteristic wavelengths. Thresholding is a commonly used operation for image segmentation; it seeks to extract a monochrome image from a gray-level image by segmenting the image into two regions (for
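The abstract does not name the thresholding rule used, so as an illustrative example the sketch below implements Otsu's classic method, which picks the gray level maximizing the between-class variance and then splits the image into two regions. The function name and the 256-level assumption are illustrative choices.

```python
def otsu_threshold(pixels, levels=256):
    """Pick the threshold maximizing between-class variance (Otsu's method).

    `pixels` is a flat list of integer gray levels in [0, levels).
    Illustrative only: the abstract does not specify the paper's rule.
    """
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * hist[i] for i in range(levels))
    w_bg = sum_bg = 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_bg += hist[t]          # background pixel count for threshold t
        if w_bg == 0:
            continue
        w_fg = total - w_bg      # foreground pixel count
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mu_bg = sum_bg / w_bg
        mu_fg = (sum_all - sum_bg) / w_fg
        var = w_bg * w_fg * (mu_bg - mu_fg) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

Pixels at or below the returned level form one region and the rest form the other, yielding the monochrome (binary) image the abstract describes.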
Maximizing the net present value (NPV) of oil field development is heavily dependent on optimizing well placement. The traditional approach entails the use of expert intuition to design well configurations and locations, followed by economic analysis and reservoir simulation to determine the most effective plan. However, this approach often proves inadequate due to the complexity and nonlinearity of reservoirs. In recent years, computational techniques have been developed to optimize well placement by defining decision variables (such as well coordinates), objective functions (such as NPV or cumulative oil production), and constraints. This paper presents a study on the use of genetic algorithms for well placement optimization, a ty
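To make the decision-variable/objective/constraint framing concrete, the sketch below runs a tiny genetic algorithm over a single well's (x, y) grid coordinates against a toy objective standing in for the simulator-evaluated NPV. The operator choices (tournament selection, uniform crossover, random-reset mutation) and all parameter values are illustrative assumptions, not the paper's configuration.

```python
import random

def ga_well_placement(npv, bounds, pop_size=30, generations=40, seed=0):
    """Tiny genetic algorithm over one well's (x, y) grid coordinates.

    `npv` is a caller-supplied objective (a toy stand-in here for the
    reservoir-simulator NPV); `bounds` gives the (x, y) coordinate
    ranges, acting as the placement constraints.
    """
    rng = random.Random(seed)
    (xlo, xhi), (ylo, yhi) = bounds
    def rand_ind():
        return (rng.randint(xlo, xhi), rng.randint(ylo, yhi))
    pop = [rand_ind() for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            a, b = rng.sample(pop, 2)
            return a if npv(a) >= npv(b) else b
        nxt = []
        for _ in range(pop_size):
            p1, p2 = tournament(), tournament()
            # uniform crossover of the two parents' coordinates
            child = (p1[0] if rng.random() < 0.5 else p2[0],
                     p1[1] if rng.random() < 0.5 else p2[1])
            if rng.random() < 0.1:   # mutation: reset to a random position
                child = rand_ind()
            nxt.append(child)
        pop = nxt
    return max(pop, key=npv)
```

In a real study the toy objective would be replaced by a reservoir-simulation run that prices the production forecast for each candidate placement.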