This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic text, English text, or both; we show the result of applying the method to plain text (the original message) and how the intelligible plaintext is transformed into unintelligible ciphertext in order to secure information from unauthorized access and theft. An encryption scheme usually uses a pseudo-random encryption key generated by an algorithm; here, all of this is done using the Pascal matrix. Encryption and decryption are implemented in MATLAB, and Notepad++ is used to write the input text.
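The abstract does not spell out the exact construction, so the following is only a minimal Python sketch of one plausible reading, in which character codes are multiplied by a lower-triangular Pascal matrix (whose inverse is integer-valued, so decryption is exact); the helper names pascal_matrix, encrypt, and decrypt are illustrative, not the paper's.

    import numpy as np

    def pascal_matrix(n):
        """Lower-triangular Pascal matrix: P[i][j] = C(i, j).
        Its inverse is integer-valued, which makes exact decryption possible."""
        P = np.zeros((n, n), dtype=np.int64)
        for i in range(n):
            P[i, 0] = 1
            for j in range(1, i + 1):
                P[i, j] = P[i - 1, j - 1] + P[i - 1, j]
        return P

    def encrypt(text):
        codes = np.array([ord(c) for c in text], dtype=np.int64)
        P = pascal_matrix(len(codes))
        return P @ codes                      # ciphertext vector

    def decrypt(cipher):
        P = pascal_matrix(len(cipher))
        codes = np.linalg.solve(P, cipher.astype(float)).round().astype(int)
        return ''.join(chr(c) for c in codes)

    msg = "Hello"
    assert decrypt(encrypt(msg)) == msg

Because binomial coefficients grow quickly, a practical version of this idea would encrypt fixed-size blocks rather than the whole message at once.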
Background: Cytology is one of the important diagnostic tests performed on effusion fluid; it can detect malignant cells in up to 60% of malignant cases. The most important benign cell present in these effusions is the mesothelial cell. Mesothelial atypia can be striking and may simulate metastatic carcinoma. Many clinical conditions, such as anemia, SLE, and liver cirrhosis, may produce such reactive atypical cells. Recently, many studies have shown the value of computerized image analysis in differentiating atypical cells from malignant adenocarcinoma cells in effusion smears. Other studies support the reliability of quantitative analysis and morphometric features and have shown them to be objective prognostic indices. Method
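The abstract does not list the exact features measured; as a rough illustration of the kind of quantitative morphometry involved, here is a minimal NumPy sketch computing area, perimeter, and circularity from a hypothetical binary nucleus mask (the function name and feature set are assumptions, not the study's protocol).

    import numpy as np

    def morphometric_features(mask):
        """Simple morphometric indices from a binary nucleus mask (2-D bool
        array), of the kind quantitative image-analysis studies use to help
        separate reactive mesothelial cells from adenocarcinoma cells."""
        area = int(mask.sum())
        # Boundary pixels: foreground pixels with at least one background
        # 4-neighbour (a crude perimeter estimate).
        padded = np.pad(mask, 1)
        core = (padded[1:-1, 1:-1] & padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
        perimeter = int((mask & ~core.astype(bool)).sum())
        # Circularity = 4*pi*A / P^2: near 1 for a circle, lower for
        # irregular nuclei.
        circularity = 4 * np.pi * area / max(perimeter, 1) ** 2
        return {"area": area, "perimeter": perimeter, "circularity": circularity}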
Aerial Robot Arms (ARAs) enable aerial drones to interact with and influence objects in various environments. Traditional ARA controllers need a high-precision model to avoid severe control chattering. Furthermore, in practical applications of aerial object manipulation, the payloads that ARAs can handle vary, depending on the nature of the task. The high uncertainties due to modeling errors and an unknown payload are inversely proportional to the stability of ARAs. To address this stability issue, a new adaptive robust controller based on a Radial Basis Function (RBF) neural network is proposed, following a three-tier approach. Firstly, a detailed new model for the ARA is derived using the Lagrange–d'Alembert formulation.
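The abstract names the RBF network but not its adaptation law, so the following Python sketch only illustrates the standard construction: a Gaussian-RBF approximator whose weights are adapted online from the tracking error. The class name and the gradient-style update rule are assumptions, not the paper's design.

    import numpy as np

    class RBFApproximator:
        """RBF network y = W^T * phi(x), with Gaussian basis functions
        phi_i(x) = exp(-||x - c_i||^2 / (2 * sigma^2)).
        In adaptive control, W is updated online so the network output
        compensates modeling errors and unknown-payload effects."""
        def __init__(self, centers, sigma, out_dim):
            self.c = centers                    # (m, n) array of RBF centers
            self.sigma = sigma
            self.W = np.zeros((len(centers), out_dim))

        def phi(self, x):
            d2 = ((self.c - x) ** 2).sum(axis=1)
            return np.exp(-d2 / (2 * self.sigma ** 2))

        def predict(self, x):
            return self.W.T @ self.phi(x)

        def adapt(self, x, error, gamma=0.05):
            # Gradient-style adaptation law (a common choice; the paper's
            # exact law is not given in the abstract).
            self.W += gamma * np.outer(self.phi(x), error)

In a controller loop, predict would supply the compensation term added to a baseline robust control law, with adapt called at every control step.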
In recent years, the number of applications utilizing mobile wireless sensor networks (WSNs) has increased, with the intent of localization for monitoring and obtaining data from hazardous areas. The location of an event is critical in a WSN, as sensed data is almost meaningless without location information. In this paper, two Monte Carlo based localization schemes, termed MCL and MSL*, are studied. MCL obtains its location through anchor nodes, whereas MSL* uses both anchor nodes and normal nodes. The use of normal nodes increases accuracy and reduces dependency on anchor nodes, but raises communication costs. For this reason, we introduce a new approach, called low communication cost schemes, to reduce communication costs.
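As background for how an anchor-based scheme like MCL works, here is a minimal particle-filter sketch in Python; the function name, the uniform motion model, and the single-hop anchor constraint are simplifying assumptions, not the paper's exact algorithm.

    import numpy as np

    rng = np.random.default_rng(0)

    def mcl_step(particles, max_speed, anchors, radio_range):
        """One round of anchor-based Monte Carlo Localization: predict each
        particle within the node's max speed, then keep only particles
        consistent with the anchors currently heard."""
        # Prediction: the node may have moved up to max_speed since last round.
        angle = rng.uniform(0, 2 * np.pi, len(particles))
        dist = rng.uniform(0, max_speed, len(particles))
        moved = particles + np.stack([dist * np.cos(angle),
                                      dist * np.sin(angle)], axis=1)
        # Filtering: a valid particle lies within radio range of every
        # one-hop anchor the node heard.
        keep = np.ones(len(moved), dtype=bool)
        for a in anchors:
            keep &= np.linalg.norm(moved - a, axis=1) <= radio_range
        survivors = moved[keep]
        if len(survivors) == 0:
            return moved                 # degenerate case; real MCL re-draws
        # Resample back to the original particle count.
        idx = rng.integers(0, len(survivors), len(particles))
        return survivors[idx]

    particles = rng.uniform(0, 100, (500, 2))        # initial guess over field
    anchors = [np.array([20.0, 30.0]), np.array([35.0, 25.0])]
    for _ in range(10):
        particles = mcl_step(particles, 5.0, anchors, 25.0)
    print(particles.mean(axis=0))                    # location estimate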
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB). This paper investigates the performance of these two classification methods using the Car Evaluation dataset. Two models were built for both algorithms and the results were compared. Our experimental results indicated that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient, because it is time-consuming and difficult to analyze due to its black-box implementation.
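A comparison along these lines is easy to reproduce; the sketch below uses scikit-learn's MLPClassifier as the backpropagation network and CategoricalNB as the Naïve Bayes model, and assumes the UCI Car Evaluation file car.data is available locally (the paper's own tooling and splits are not specified).

    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import OrdinalEncoder
    from sklearn.neural_network import MLPClassifier
    from sklearn.naive_bayes import CategoricalNB

    # UCI Car Evaluation: six categorical attributes plus a class label.
    cols = ["buying", "maint", "doors", "persons", "lug_boot", "safety", "class"]
    df = pd.read_csv("car.data", names=cols)

    X = OrdinalEncoder().fit_transform(df[cols[:-1]])
    y = df["class"]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

    bnn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X_tr, y_tr)
    nb = CategoricalNB().fit(X_tr, y_tr)

    print("BNN accuracy:", bnn.score(X_te, y_te))
    print("NB  accuracy:", nb.score(X_te, y_te))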
The widespread use of the internet all over the world, in addition to the growing number of users exchanging important information over it, highlights the need for new methods to protect this information from intruders' corruption or modification. This paper suggests a new method that ensures the texts of a given document cannot be modified by intruders. The method consists of a mixture of three steps. The first step borrows some concepts of the "Quran" security system to detect certain types of change in a given text: a key for each paragraph is extracted from the group of letters in that paragraph whose positions are multiples of a given prime number. This step cannot detect the ch
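To make the keying idea concrete, here is a minimal Python sketch; the choice prime=7 and the function name are illustrative assumptions, not the paper's parameters.

    def paragraph_key(paragraph, prime=7):
        """Sketch of the keying idea described above: build a paragraph key
        from the letters whose (1-based) positions are multiples of a given
        prime. Any edit that shifts or alters those letters changes the key,
        flagging tampering. (prime=7 is an assumed value.)"""
        letters = [c for c in paragraph if c.isalpha()]
        return ''.join(letters[i - 1]
                       for i in range(prime, len(letters) + 1, prime))

    text = "This method detects unauthorized modification of a document."
    print(paragraph_key(text))   # store this key; recompute later to verify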
In computer-based applications, there is a need for simple, low-cost devices for user authentication. Biometric authentication methods, namely keystroke dynamics, are increasingly used to strengthen the common knowledge-based method (for example, a password) effectively and cheaply for many types of applications. Due to the semi-independent nature of typing behavior, it is difficult to masquerade, making it useful as a biometric. In this paper, the C4.5 approach is used to classify a user as an authenticated user or an impostor by combining unigraph features (namely dwell time (DT) and flight time (FT)) and digraph features (namely up-up time (UUT) and down-down time (DDT)). The results show that DT enhances the performance of digraph features by i
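The four timing features are straightforward to compute from raw key events; the following Python sketch (function name and event format assumed) encodes the definitions of DT, FT, UUT, and DDT, whose vectors would then feed a C4.5-style decision tree.

    def keystroke_features(events):
        """Unigraph and digraph timing features from key events.
        events: list of (key, press_time, release_time) tuples in ms.
        DT  = release - press of one key (dwell time)
        FT  = press(next) - release(current) (flight time)
        UUT = release(next) - release(current) (up-up time)
        DDT = press(next) - press(current) (down-down time)"""
        feats = []
        for (k1, p1, r1), (k2, p2, r2) in zip(events, events[1:]):
            feats.append({
                "digraph": k1 + k2,
                "DT": r1 - p1,
                "FT": p2 - r1,
                "UUT": r2 - r1,
                "DDT": p2 - p1,
            })
        return feats

    events = [("t", 0, 95), ("h", 130, 210), ("e", 260, 340)]
    for f in keystroke_features(events):
        print(f)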
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has led to an increase in the amount of data that business organizations deal with, in an unprecedented manner. The amount of data available through the internet is a problem that many parties seek solutions for: why is so much of it available there, in such a random manner? Many forecasts predicted that in 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and that in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, so-called data mining emerged as a
Permeability data is of major importance and must be handled carefully in all reservoir simulation studies. Its importance increases in mature oil and gas fields, owing to its sensitivity to the requirements of certain improved-recovery methods. However, the industry holds a huge store of air permeability measurements against a small number of liquid permeability values, due to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, conventionally measured during laboratory core analysis, into liquid permeability. The correlation provides a feasible estimate in cases of data loss and poorly consolidated formations, or in cas
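The study's actual correlation is not reproduced here; as an illustration of how such an air-to-liquid conversion is typically built, the sketch below fits a power-law form k_liq = a * k_air^b to paired core measurements (the numbers are made-up placeholders, not the study's data).

    import numpy as np

    # Paired core measurements in millidarcies (illustrative values only):
    k_air = np.array([1.2, 5.8, 22.0, 95.0, 410.0])
    k_liq = np.array([0.7, 3.9, 16.0, 74.0, 340.0])

    # Fit k_liq = a * k_air**b, which is linear in log-log space; this is
    # the usual form of Klinkenberg-type air-to-liquid corrections.
    b, log_a = np.polyfit(np.log(k_air), np.log(k_liq), 1)
    a = np.exp(log_a)

    def liquid_perm(k_air_md):
        """Estimated liquid permeability (mD) from air permeability (mD)."""
        return a * k_air_md ** b

    print(f"k_liq ~ {a:.3f} * k_air^{b:.3f}")
    print(liquid_perm(50.0))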
Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death. The purpose of this paper is to use a dynamic approach within the deep-learning neural network method: a dynamic neural network that suits the nature of discrete survival data with time-varying effects. This network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on Bayesian methodology, the Maximum A Posteriori (MAP) method. This method was carried out using numerical algorithms re
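For reference, the core of Levenberg-Marquardt training is the damped Gauss-Newton update; the following Python sketch shows that update on a toy least-squares problem (the damping schedule and the exponential example are illustrative stand-ins, not the PDANN's architecture).

    import numpy as np

    def levenberg_marquardt(residual, jac, w0, iters=50, mu=1e-2):
        """Minimal L-M loop: the damped Gauss-Newton update
        w <- w - (J^T J + mu*I)^(-1) J^T r used to train network weights.
        residual(w): vector of prediction errors; jac(w): its Jacobian."""
        w = w0.copy()
        for _ in range(iters):
            r, J = residual(w), jac(w)
            step = np.linalg.solve(J.T @ J + mu * np.eye(len(w)), J.T @ r)
            w_new = w - step
            if np.sum(residual(w_new) ** 2) < np.sum(r ** 2):
                w, mu = w_new, mu * 0.7   # accept step, reduce damping
            else:
                mu *= 2.0                 # reject step, increase damping
        return w

    # Toy example: fit y = a*exp(b*t) to noisy data.
    t = np.linspace(0, 1, 30)
    y = 2.0 * np.exp(1.5 * t) + np.random.default_rng(0).normal(0, 0.05, 30)
    res = lambda w: w[0] * np.exp(w[1] * t) - y
    jac = lambda w: np.stack([np.exp(w[1] * t),
                              w[0] * t * np.exp(w[1] * t)], axis=1)
    print(levenberg_marquardt(res, jac, np.array([1.0, 1.0])))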