The meniscus has a crucial function in human anatomy, and Magnetic Resonance Imaging (M.R.I.) plays an essential role in meniscus assessment. Identifying cartilage lesions with typical image processing approaches is difficult because M.R.I. data is highly diverse: an M.R.I. sequence comprises numerous images, and the region of interest may differ from one image in the series to the next. Feature extraction therefore becomes more complicated, and traditional image processing in particular becomes very complex. In traditional image processing, a human tells the computer what to look for, whereas a deep learning (D.L.) algorithm extracts the relevant features automatically. Subtle surface changes are valuable when diagnosing a tissue sample; small, barely noticeable changes in pixel density may indicate early-stage cancer or torn tissue, details that even expert pathologists might miss. Artificial intelligence (A.I.) and D.L. have revolutionized radiology by enhancing the efficiency and accuracy of both interpretive and non-interpretive tasks. The Convolutional Neural Network (C.N.N.), a class of D.L. model, can be used to diagnose knee problems. Existing algorithms can detect and categorize cartilage lesions and meniscus tears on M.R.I., offer automated quantitative evaluation of healing, and forecast who is most likely to suffer recurrent meniscus tears based on radiographs.
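As a hedged illustration of the kind of C.N.N. slice classifier the abstract alludes to, the sketch below defines a minimal model in PyTorch. The architecture, layer sizes, 224x224 grayscale input, and the binary tear/no-tear output are all assumptions for illustration, not the model used by any of the cited algorithms.

```python
import torch
import torch.nn as nn

class MeniscusCNN(nn.Module):
    """Minimal 2-D CNN for classifying a single knee-MRI slice.
    Layer sizes and the two-class output are illustrative assumptions."""
    def __init__(self, num_classes: int = 2):  # e.g. tear / no tear
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 56 * 56, num_classes)  # assumes 224x224 input

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)                # (N, 32, 56, 56)
        return self.classifier(x.flatten(1))

# Usage: one synthetic grayscale 224x224 slice, batch of 1.
logits = MeniscusCNN()(torch.randn(1, 1, 224, 224))
```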
Authentication is the process of determining whether someone or something is, in fact, who or what it is declared to be. As dependence upon computers and computer networks grows, the need for user authentication has increased. A user's claimed identity can be verified by one of several methods. One of the most popular is something the user knows, such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authentication by identifying a living individual's physiological or behavioral attributes. Keystroke authentication is a new behavioral access control system that identifies legitimate users by their typing behavior. The objective of this paper is to provide user authentication based on keystroke dynamics.
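A minimal sketch of the idea behind keystroke authentication follows, comparing per-key hold (dwell) times and inter-key (flight) times against an enrolled profile. The feature choice, the tolerance threshold, and the synthetic timing data are assumptions for illustration, not the paper's method.

```python
import statistics

def keystroke_features(events):
    """events: list of (key, press_time, release_time) in seconds.
    Returns (mean dwell time, mean flight time) -- illustrative features."""
    dwells = [release - press for _, press, release in events]
    flights = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return statistics.mean(dwells), statistics.mean(flights)

def matches_profile(sample, profile, tol=0.05):
    """Accept if every feature is within `tol` seconds of the enrolled profile."""
    return all(abs(s - p) <= tol for s, p in zip(sample, profile))

# Synthetic enrollment and login attempt (key, press, release) timings.
enrolled = keystroke_features([("p", 0.00, 0.09), ("i", 0.21, 0.29), ("n", 0.40, 0.50)])
attempt  = keystroke_features([("p", 0.00, 0.10), ("i", 0.20, 0.28), ("n", 0.41, 0.50)])
print(matches_profile(attempt, enrolled))  # True for this synthetic data
```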
Iris research is focused on developing techniques for identifying and locating relevant biometric features, with accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in a number of steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image.
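The bit-plane step the abstract describes can be sketched as below. This is a minimal illustration of extracting the most significant bit planes from a grayscale image with NumPy, not the paper's full parameterized segmentation; the synthetic image and the choice of planes 6-7 are assumptions.

```python
import numpy as np

def bit_planes(gray: np.ndarray) -> list:
    """Split an 8-bit grayscale image into 8 binary bit planes (LSB first)."""
    return [(gray >> b) & 1 for b in range(8)]

# Synthetic 8-bit "eye image"; in practice this is the captured grayscale eye image.
img = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
planes = bit_planes(img)

# Keep only the most significant planes, which carry the coarse iris/pupil
# structure that the parameterized iris localization would operate on.
significant = planes[6] | planes[7]
print(significant.shape, significant.dtype)
```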
In this paper, a fingerprint-based biometric authentication system is proposed that incorporates the personal identity information of name and birthday. A National Identification Number (NIDN) is generated by merging the fingerprint features with the personal identity information, producing a Quick Response (QR) code image that is used in the access system. Two approaches are adopted: traditional authentication, and strong identification using the QR code and NIDN information. The system shows an accuracy of 96.153% with a threshold value of 50, and the accuracy reaches 100% when the threshold value falls below 50.
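A minimal sketch of the NIDN-to-QR pipeline follows, using the `qrcode` Python package. The merging scheme shown (a SHA-256 hash over the concatenated identity fields and fingerprint feature bytes, truncated to 16 hex digits) and the dummy minutiae bytes are assumptions; the paper's actual merging and encoding may differ.

```python
import hashlib
import qrcode  # pip install qrcode[pil]

def make_nidn(name: str, birthday: str, fingerprint_features: bytes) -> str:
    """Illustrative NIDN: hash of personal identity info merged with
    fingerprint features. The paper's actual merging scheme may differ."""
    payload = name.encode() + b"|" + birthday.encode() + b"|" + fingerprint_features
    return hashlib.sha256(payload).hexdigest()[:16]  # 16-hex-digit ID (assumption)

nidn = make_nidn("Jane Doe", "1990-01-31", b"\x12\x34\x56")   # dummy minutiae bytes
qrcode.make(f"NIDN:{nidn}").save("nidn_qr.png")  # QR image used by the access system
```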
Digital forensics is the branch of forensic science that covers crime related to computers and other digital devices. Academic studies have been interested in digital forensics for some time, and researchers aim to establish a discipline built on scientific structures by defining models that reflect their observations. This paper suggests a model to improve the whole investigation process and obtain accurate and complete evidence, and it adopts securing the digital evidence with cryptographic algorithms so that reliable evidence can be presented in a court of law. The paper also presents the main and basic concepts of the frameworks and models used in digital forensics investigation.
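As a small illustration of securing digital evidence cryptographically: the paper does not name its specific algorithms, so the SHA-256 integrity digest below is an assumption. The idea is that a digest recorded at acquisition time and re-checked later demonstrates the evidence has not been altered.

```python
import hashlib

def evidence_digest(path: str, chunk_size: int = 1 << 20) -> str:
    """SHA-256 digest of an evidence file, read in chunks so that
    large disk images do not have to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Recorded at acquisition and re-verified before presentation in court;
# any change to the file changes the digest.
# print(evidence_digest("disk_image.dd"))  # hypothetical evidence file
```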
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and messy, which makes it hard to find topics in them. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short text content like tweets. Fortunately, Twitter has many features that represent the interaction between users; in particular, tweets carry rich user-generated hashtags that act as keywords. In this paper, we exploit the hashtag feature to improve the topics learned from tweets.
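A minimal sketch of one common way to exploit hashtags, hashtag pooling, is shown below with scikit-learn's LDA: tweets sharing a hashtag are concatenated into one longer pseudo-document before fitting, giving the topic model more context per document. The toy data and this particular pooling scheme are assumptions and may differ from the paper's exact method.

```python
from collections import defaultdict
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [  # toy (hashtag, text) pairs; real input would be a large tweet corpus
    ("#ml", "training deep models on gpus"),
    ("#ml", "gradient descent converges slowly"),
    ("#food", "best pizza recipe with fresh basil"),
    ("#food", "baking sourdough bread at home"),
]

# Pool tweets sharing a hashtag into one pseudo-document each.
pools = defaultdict(list)
for tag, text in tweets:
    pools[tag].append(text)
docs = [" ".join(texts) for texts in pools.values()]

X = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
print(lda.components_.shape)  # (topics, vocabulary size)
```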
A novel median filter based on the crow search optimization algorithm (OMF) is suggested to reduce random salt-and-pepper noise and improve the quality of RGB-colored and grayscale images. The fundamental idea of the approach is that the crow search algorithm first detects noisy pixels and then replaces them with an optimal median value chosen by maximizing a fitness function. Finally, the standard measures of peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error are used to test the performance of the suggested filters (the original and the improved median filter) in removing noise from images. The simulation was carried out in MATLAB R2019b.
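A minimal sketch of the detect-then-replace idea follows, with a simple extreme-value detector standing in for the crow search optimizer, plus the PSNR measure. It uses NumPy/SciPy rather than the paper's MATLAB implementation, and the detector and window size are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter

def selective_median(img: np.ndarray, size: int = 3) -> np.ndarray:
    """Replace only suspected salt-and-pepper pixels (values 0 or 255) with the
    local median; clean pixels are left untouched. In the paper, the crow
    search algorithm optimizes this detection/replacement step."""
    noisy = (img == 0) | (img == 255)
    out = img.copy()
    out[noisy] = median_filter(img, size=size)[noisy]
    return out

def psnr(ref: np.ndarray, test: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between a reference and a test image."""
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak**2 / mse)
```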
The revolution in multimedia has been a driving force behind fast and secure data transmission techniques, and protecting image information from unauthorized access is imperative. Encryption techniques are used to transfer data, and each kind of data has its own special characteristics; thus, different methods should be used to protect image distribution. This paper improves image encryption by proposing an approach to generate efficient, intelligent session (mask) keys, built on a combination of the robust algebraic features of ECC and the construction phase of the Greedy Randomized Adaptive Search Procedure (GRASP), producing durable symmetric session mask keys consisting of ECC points.
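A loose sketch of the symmetric behavior of ECC that such mask keys rely on, using the `cryptography` package: an ECDH exchange gives both parties the same ECC-derived secret, which is stretched into a symmetric key. The GRASP construction phase the paper describes is not shown, and the curve, parties, and key-derivation parameters are assumptions.

```python
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each (hypothetical) party generates an ECC key pair on the same curve.
alice = ec.generate_private_key(ec.SECP256R1())
bob = ec.generate_private_key(ec.SECP256R1())

# ECDH yields the same shared secret on both sides: the symmetric behavior.
shared = alice.exchange(ec.ECDH(), bob.public_key())
assert shared == bob.exchange(ec.ECDH(), alice.public_key())

# Stretch the shared secret into a 32-byte symmetric session mask key.
mask_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"image-session-mask").derive(shared)
```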