Sending information today requires both speed and protection: data compression is used to provide speed, and encryption is used to provide protection. This paper presents a proposed method that compresses and secures secret information before it is sent. The proposed method is based on special keys combined with the MTF transform to provide compression, and on RNA coding with MTF encoding to provide security. The method relies on multiple secret keys, each designed in a special way; the main reason for this design is to protect the keys from being predicted by unauthorized users.
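The abstract does not spell out the MTF transform itself; assuming it refers to the Move-To-Front transform commonly paired with entropy coders, a minimal sketch of plain MTF encoding and decoding is shown below. The key-driven variant proposed in the paper is not reproduced here; the fixed byte alphabet is an illustrative assumption.

```python
# Minimal Move-To-Front (MTF) transform sketch -- illustrative only,
# not the key-based variant proposed in the paper.

def mtf_encode(data: bytes):
    """Encode a byte string as a list of MTF indices."""
    table = list(range(256))               # assumed fixed initial ordering
    out = []
    for b in data:
        i = table.index(b)                 # position of the symbol in the table
        out.append(i)
        table.insert(0, table.pop(i))      # move the symbol to the front
    return out

def mtf_decode(indices):
    """Invert the MTF transform."""
    table = list(range(256))
    out = bytearray()
    for i in indices:
        b = table[i]
        out.append(b)
        table.insert(0, table.pop(i))
    return bytes(out)

if __name__ == "__main__":
    msg = b"banana_band"
    codes = mtf_encode(msg)
    assert mtf_decode(codes) == msg
    print(codes)  # runs of repeated symbols become small indices, which compress well
```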
Data mining is one of the most popular analysis methods in medical research. It involves finding previously unknown patterns and correlations in datasets. Data mining encompasses various areas of biomedical research, including data collection, clinical decision support, illness and safety monitoring, public health, and inquiry research. Health analytics frequently uses computational data-mining methods such as clustering, classification, and regression. Studies of large numbers of diverse, heterogeneous documents, including biological and electronic health records, have provided extensive material for medical and health research.
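As a concrete illustration of the clustering methods mentioned above, a minimal scikit-learn sketch on synthetic patient-like measurements could look like the following; the feature names and values are invented for illustration and are not tied to any dataset in the study.

```python
# Illustrative clustering of synthetic "patient" records -- the data and
# feature names are hypothetical placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# hypothetical features: age, systolic blood pressure, glucose level
patients = np.column_stack([
    rng.normal(55, 12, 200),    # age (years)
    rng.normal(130, 18, 200),   # systolic BP (mmHg)
    rng.normal(105, 25, 200),   # glucose (mg/dL)
])

X = StandardScaler().fit_transform(patients)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))      # size of each discovered patient group
```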
The reduction to the pole of the aeromagnetic map of the Western Desert of Iraq has been used to outline the main basement structural features. Three selected magnetic anomalies are used to determine the depths of their magnetic sources. The depths are estimated using the slope/half-slope method and corrected through the application of a published nomogram. These depths are compared with previously published depth values, which provides a new look at the basement of the Western Desert in addition to the thickness map of the Paleozoic formations. The results shed light on the importance of the great depths of the basement structures, and in turn of the sedimentary cover, to be considered for future hydrocarbon exploration.
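The abstract does not reproduce the slope/half-slope formula; assuming the standard half-slope (Peters-type) rule is intended, the depth estimate takes roughly the form

$z \approx d_{1/2} / n, \qquad n \approx 1.6,$

where $d_{1/2}$ is the horizontal distance between the points at which tangents of half the maximum slope touch the anomaly profile and $z$ is the depth to the top of the magnetic source. The index $n$ (commonly quoted between about 1.2 and 2.0 depending on source geometry) is an assumption here, which is one reason the nomogram correction mentioned above is applied.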
In this paper, the time spent on and the frequency of use of Social Network Sites (SNS) in Android applications are investigated. We seek to raise awareness of, and limit but not eliminate, the repeated use of SNS by introducing AndroidTrack, an Android application designed to monitor usage and support valid experimental studies aimed at improving the impact of social media on Iraqi users. Data generated by the app were aggregated and updated periodically in a Google Firebase Realtime Database. A statistical factor analysis (FA) of the users' interactions is presented.
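As a hedged illustration of how a factor analysis could be run on aggregated usage records, a short scikit-learn sketch is shown below; the indicators (daily minutes, sessions per day, notification-driven opens) and their values are hypothetical and are not taken from the AndroidTrack dataset.

```python
# Hypothetical factor analysis of SNS-usage indicators -- the variables and
# values are invented; this only illustrates the FA step, not the app itself.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_users = 300
usage = np.column_stack([
    rng.normal(120, 40, n_users),   # assumed: minutes of SNS use per day
    rng.normal(25, 8, n_users),     # assumed: sessions per day
    rng.normal(15, 6, n_users),     # assumed: opens triggered by notifications
])

X = StandardScaler().fit_transform(usage)
fa = FactorAnalysis(n_components=1, random_state=0).fit(X)
print(fa.components_)  # loadings of each indicator on the latent "usage" factor
```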
The reaction of LAs-Cl8: [2,2-(1-(3,4-bis(carboxylicdichloromethoxy)-5-oxo-2,5-dihydrofuran-2-yl)ethane-1,2-diyl)bis(2,2-dichloroacetic acid)] with sodium azide in ethanol with drops of distilled water has been investigated. The new product L-AZ: (3Z,5Z,8Z)-2-azido-8-[azido(3Z,5Z)-2-azido-2,6-bis(azidocarbonyl)-8,9-dihydro-2H-1,7-dioxa-3,4,5-triazonine-9-yl]methyl]-9-[(1-azido-1-hydroxy)methyl]-2H-1,7-dioxa-3,4,5-triazonine-2,6-dicarbonylazide was isolated and characterized by elemental analysis (C.H.N), 1H-NMR, mass spectrometry, and Fourier transform infrared spectrophotometry (FT-IR). The reaction of L-AZ with M+n: [VO(II), Cr(III), Mn(II), Co(II), Ni(II), Cu(II), Zn(II), Cd(II) and Hg(II)] has been investigated.
Compressing an image and reconstructing it without degrading its original quality is a challenge that still exists nowadays. A coding system that considers both quality and compression rate is implemented in this work. The implemented system applies a high synthetic entropy coding scheme to store the compressed image at the smallest possible size without affecting its original quality. This coding scheme is applied with two transform-based techniques, one using the Discrete Cosine Transform and the other using the Discrete Wavelet Transform. The implemented system was tested with different standard color images, and the results obtained with different evaluation metrics are shown. A comparison was made with some previous related work.
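As a rough illustration of the transform-plus-entropy-coding pipeline described above (not the authors' exact coding scheme), a block-DCT quantization step followed by a generic lossless back end could be sketched as follows; the quantization level and the use of zlib as a stand-in entropy coder are assumptions made for this sketch.

```python
# Sketch of a DCT-based compression step -- zlib stands in for the paper's
# entropy coding scheme, and the quality factor q is an illustrative assumption.
import numpy as np
import zlib
from scipy.fftpack import dct

def dct2(block):
    """2-D type-II DCT of an image block."""
    return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

def compress(img: np.ndarray, q: float = 10.0) -> bytes:
    """Quantize 8x8 DCT blocks of a grayscale image and entropy-code them."""
    h, w = img.shape
    coeffs = np.zeros_like(img, dtype=np.int16)
    for i in range(0, h - h % 8, 8):
        for j in range(0, w - w % 8, 8):
            coeffs[i:i+8, j:j+8] = np.round(dct2(img[i:i+8, j:j+8]) / q)
    return zlib.compress(coeffs.tobytes())

if __name__ == "__main__":
    img = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(float)
    payload = compress(img)
    print(len(payload), "bytes for", img.size, "pixels")
```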
The Elliptic Curve Cryptography (ECC) algorithm meets the requirements for multimedia encryption, since the encipher operation of the ECC algorithm is applied to points only, which offers significant computational advantages. The encoding/decoding operations for converting a text message into points on the curve, and vice versa, are not always a simple process. In this paper, a new mapping method is investigated for converting a text message into a point on the curve, or a point into a text message, in an efficient and secure manner; it depends on the repeated values in the coordinates to establish a lookup table for the encoding/decoding operations. The proposed method for the mapping process is …
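To make the idea of table-based encoding/decoding concrete, a toy sketch on a small curve over a prime field is shown below. The curve parameters are arbitrary, the byte-to-point table is the simplest possible construction, and this is not the mapping method proposed in the paper.

```python
# Toy illustration of lookup-table message <-> point mapping on a small
# elliptic curve y^2 = x^3 + Ax + B (mod P). Parameters are arbitrary and
# this is NOT the authors' mapping scheme.
P, A, B = 307, 2, 3   # assumed toy parameters; nonsingular curve over F_307

def curve_points():
    """Enumerate all affine points of the toy curve."""
    squares = {}
    for y in range(P):
        squares.setdefault((y * y) % P, []).append(y)
    pts = []
    for x in range(P):
        rhs = (x * x * x + A * x + B) % P
        for y in squares.get(rhs, []):
            pts.append((x, y))
    return pts

POINTS = curve_points()                          # lookup table: index -> point
INDEX = {pt: i for i, pt in enumerate(POINTS)}   # reverse table: point -> index

def encode(message: bytes):
    """Map each byte to a curve point via the lookup table."""
    return [POINTS[b] for b in message]

def decode(points):
    return bytes(INDEX[pt] for pt in points)

if __name__ == "__main__":
    assert len(POINTS) >= 256, "curve too small for byte-wise mapping"
    msg = b"hello ECC"
    assert decode(encode(msg)) == msg
```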
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, based on data collected on the operating and stoppage times of the case study.
The choice of probability distribution is appropriate when the data lie on, or close to, the fitted line of the probability plot and the data pass the goodness-of-fit test.
Minitab 17 software was used for this purpose after arranging the collected data and entering it into the program.
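Minitab is a GUI tool, so no code accompanies the abstract; a hedged sketch of the same workflow (fit several candidate distributions to time data and compare goodness of fit) in Python with SciPy might look like the following. The sample data are synthetic placeholders for the operating/stoppage times.

```python
# Sketch of choosing a probability distribution for operating-time data:
# fit several candidates and compare Kolmogorov-Smirnov goodness of fit.
# The sample below is synthetic; real operating/stoppage times would replace it.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
times = rng.weibull(1.5, 60) * 100.0   # placeholder "time between failures" data

candidates = {
    "weibull":     stats.weibull_min,
    "lognormal":   stats.lognorm,
    "exponential": stats.expon,
}

for name, dist in candidates.items():
    params = dist.fit(times)                          # maximum-likelihood fit
    ks_stat, p_value = stats.kstest(times, dist.cdf, args=params)
    print(f"{name:12s} KS={ks_stat:.3f} p={p_value:.3f}")
# The distribution with the largest p-value (data closest to the probability
# plot's fitted line) would be the candidate of choice.
```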
Twitter's popularity has grown increasingly in the last few years, influencing the social, political, and business aspects of life. People leave tweets on social media about an event and simultaneously look for other people's experiences and whether they had a positive or negative opinion about that event. Sentiment analysis can be used to obtain this categorization. Product reviews, events, and other topics gathered from all users as unstructured text comments are categorized as positive, negative, or neutral using sentiment analysis; such a task is called polarity classification. This study aims to use Twitter data about OK cuisine reviews obtained from the Amazon website and compare the effectiveness …
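A minimal polarity-classification sketch is shown below to illustrate the task described above; the TF-IDF plus logistic-regression pipeline is a generic choice, not the study's method, and the reviews and labels are toy placeholders rather than the Twitter/Amazon data.

```python
# Minimal polarity-classification sketch (TF-IDF + logistic regression).
# The reviews and labels are toy placeholders, not the study's dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = [
    "the food was amazing and the staff were friendly",
    "terrible service, I will never come back",
    "great value, highly recommended",
    "cold meal and rude waiter, very disappointing",
]
labels = ["positive", "negative", "positive", "negative"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(reviews, labels)
print(model.predict(["friendly staff but the meal was disappointing"]))
```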
Due to the increasing amount of information on the World Wide Web (WWW), the question of how to extract new and useful knowledge from log files has gained great interest among researchers in data mining and knowledge discovery.
Web mining, a subset of data mining, is divided into three particular branches: web content mining, web structure mining, and web usage mining. This paper is concerned with the server log file, which belongs to the third category (web usage mining). This file is analyzed according to the suggested algorithm to extract the behavior of the user; knowing the behavior comes from knowing the complete path taken by a specific user.
Extracting these types of knowledge requires many of the KDD (knowledge discovery in databases) techniques.
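A hedged sketch of the basic step described above, extracting per-user navigation paths from a server log, is shown below. It assumes Common Log Format lines and uses the client IP as the user key; the sample lines are invented for illustration and the paper's full algorithm is not reproduced.

```python
# Sketch of extracting per-user navigation paths from a server log file.
# Assumes Common Log Format and uses the client IP as the "user" key;
# the sample lines are invented.
import re
from collections import defaultdict

LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)')

sample_log = """\
10.0.0.1 - - [01/Jan/2024:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 512
10.0.0.1 - - [01/Jan/2024:10:00:05 +0000] "GET /products.html HTTP/1.1" 200 734
10.0.0.2 - - [01/Jan/2024:10:00:07 +0000] "GET /index.html HTTP/1.1" 200 512
10.0.0.1 - - [01/Jan/2024:10:00:20 +0000] "GET /cart.html HTTP/1.1" 200 221
"""

paths = defaultdict(list)
for line in sample_log.splitlines():
    m = LOG_LINE.match(line)
    if m:
        ip, url = m.groups()
        paths[ip].append(url)           # log order preserves the visit sequence

for ip, visited in paths.items():
    print(ip, " -> ".join(visited))     # the complete path taken by each user
```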
The estimation of the regular regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts that are then joined by threshold point(s). This situation is regarded as a violation of the linearity of regression. Therefore, the multiphase regression model has received increasing attention as an alternative approach that describes the changing behavior of the phenomenon through threshold point estimation. The maximum likelihood estimator (MLE) has been used for both model and threshold point estimation. However, MLE is not resistant to violations such as the existence of outliers or heavy-tailed error distributions. The main goal of this …
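To make the two-phase setup concrete, a hedged sketch of estimating a single threshold by profiling the fit over candidate change points is shown below. Least squares is used purely for illustration (the paper works with MLE and discusses its lack of robustness), and the data are synthetic.

```python
# Illustrative two-phase (segmented) regression: pick the threshold that
# minimizes the combined residual sum of squares of two straight-line fits.
# Synthetic data; least squares stands in for the MLE discussed in the paper.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
true_tau = 6.0
signal = np.where(x < true_tau, 1.0 + 0.5 * x, 4.0 - 0.8 * (x - true_tau))
y = signal + rng.normal(0, 0.3, x.size)

def sse(xs, ys):
    """Residual sum of squares of a simple linear fit."""
    coeffs = np.polyfit(xs, ys, 1)
    return np.sum((ys - np.polyval(coeffs, xs)) ** 2)

candidates = x[10:-10]                  # keep enough points in each segment
profile = [sse(x[x < t], y[x < t]) + sse(x[x >= t], y[x >= t]) for t in candidates]
tau_hat = candidates[int(np.argmin(profile))]
print(f"estimated threshold: {tau_hat:.2f} (true {true_tau})")
```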