This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying the method to the plain text (original message): the intelligible plain text is transformed into unintelligible ciphertext in order to secure information from unauthorized access and theft. The encryption scheme uses a pseudo-random encryption key generated by an algorithm, and all of this is done using the Pascal matrix. Encryption and decryption are implemented in MATLAB, and Notepad++ is used to write the input text.
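The abstract does not spell out the exact algorithm, so the following is only a minimal sketch, written in Python rather than the authors' MATLAB, of one way a Pascal matrix can encrypt and decrypt text: character codes are grouped into blocks and multiplied by a lower-triangular Pascal matrix, whose exact integer inverse recovers the plaintext. The block size n = 4 and the zero-padding are illustrative assumptions, not the paper's choices.

import math
import numpy as np

def pascal_lower(n):
    # Lower-triangular Pascal matrix: P[i][j] = C(i, j) for j <= i (determinant 1).
    return np.array([[math.comb(i, j) if j <= i else 0 for j in range(n)]
                     for i in range(n)], dtype=np.int64)

def pascal_lower_inv(n):
    # Exact integer inverse of the lower-triangular Pascal matrix:
    # entries are (-1)^(i-j) * C(i, j).
    return np.array([[(-1) ** (i - j) * math.comb(i, j) if j <= i else 0
                      for j in range(n)] for i in range(n)], dtype=np.int64)

def encrypt(text, n=4):
    codes = [ord(c) for c in text]            # works for Arabic or English characters
    codes += [0] * (-len(codes) % n)          # pad to a multiple of the block size
    blocks = np.array(codes, dtype=np.int64).reshape(-1, n).T
    return pascal_lower(n) @ blocks           # each column is one encrypted block

def decrypt(cipher, n=4):
    plain = pascal_lower_inv(n) @ cipher      # exact integer recovery of the codes
    return ''.join(chr(int(v)) for v in plain.T.flatten() if v != 0)  # drop padding

cipher = encrypt("Hello, world")
print(decrypt(cipher))                        # -> Hello, world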
The Internet makes the world like a small village. It can bring groups with the same ideas, thoughts, and identity close to each other by gathering them in one place. It is a way to communicate and to share information and opinions; it shapes communities of individuals with common interests, and the participation of individuals in a fruitful dialogue helps achieve a set of goals, promote ideas, and mobilize people around issues and events of common interest.
To address the relationship between the Internet, as a medium of interactive communication, and its ability to build social capital and to discuss various issues and social events, the study formulates its research problem as follows:
Consid
Ethnographic research is perhaps the most commonly applied type of qualitative research method in psychology and medicine. In ethnographic studies, the researcher immerses himself in the participants' environment to understand the cultures, challenges, motivations, and topics that arise among them by investigating that environment directly. This type of research can last from a few days to a few years because it involves in-depth monitoring and data collection on these foundations. For this reason, the findings of the current study encourage researchers in psychology and medicine to conduct studies applying the ethnographic research method to investigate common cultural patterns of language, thinking, beliefs, and behavior.
DeepFake is a concern for celebrities and everyone else because it is simple to create. DeepFake images, especially high-quality ones, are difficult to detect by people, by local descriptors, and by current approaches. On the other hand, detecting manipulation in video is more tractable than in a single image, and many state-of-the-art systems offer it. Moreover, the detection of video manipulation depends entirely on detection through images. Many have worked on DeepFake detection in images, but their approaches involve complex mathematical calculations in the preprocessing steps and many limitations, including that the face must be frontal, the eyes must be open, and the mouth should be open with the teeth visible, etc. Also, the accuracy of their counterfeit detection
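As a rough illustration of the point that video manipulation detection rests on image-level detection, the sketch below (not this paper's method) scores a video frame by frame with a placeholder image classifier and aggregates the scores; detect_frame and the 0.5 threshold are hypothetical assumptions.

import cv2          # OpenCV, used only to read frames from a video file
import numpy as np

def detect_frame(frame) -> float:
    # Hypothetical image-level detector: returns a fake-probability in [0, 1].
    raise NotImplementedError("plug in an image DeepFake classifier here")

def detect_video(path: str, threshold: float = 0.5) -> bool:
    cap = cv2.VideoCapture(path)
    scores = []
    while True:
        ok, frame = cap.read()
        if not ok:                      # end of the video stream
            break
        scores.append(detect_frame(frame))
    cap.release()
    # Flag the video as manipulated if the average per-frame score is high.
    return bool(np.mean(scores) > threshold) if scores else False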
The subject of the Middle East in general has acquired great importance in the writings and works published since the beginning of the first half of the 1990s, with the start of the peace projects that followed the collapse of the Soviet Union and the change in the political, economic, and ideological map of the world. Although the term is not new, its expressions and connotations change as the balance of power and the directions of interests change, as it has shifted from a geographical term to
Abstract:
Cointegration is one of the important concepts in applied macroeconomics. The idea is due to Granger (1981) and was explained in detail by Engle and Granger in Econometrica (1987). The introduction of cointegration analysis into econometrics in the mid-1980s is one of the most important developments in empirical modeling; its advantage is its simplicity of computation and use, which requires only familiarity with ordinary least squares.
Cointegration describes the long-run equilibrium relations among time series, even if each of the series contains a trend.
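For illustration, the following sketch applies the standard Engle-Granger two-step procedure, which uses nothing beyond ordinary least squares and a unit-root test, to two simulated series that share a common stochastic trend; the data and parameters are assumptions, not the study's.

import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
trend = np.cumsum(rng.normal(size=500))            # common random-walk component
x = trend + rng.normal(scale=0.5, size=500)
y = 2.0 * trend + rng.normal(scale=0.5, size=500)

# Step 1: ordinary least squares of y on x estimates the long-run relation.
ols = sm.OLS(y, sm.add_constant(x)).fit()

# Step 2: test the residuals for a unit root; stationary residuals indicate
# cointegration (statsmodels.tsa.stattools.coint wraps both steps with the
# appropriate Engle-Granger critical values).
adf_stat, p_value, *_ = adfuller(ols.resid)
print(f"ADF on residuals: stat = {adf_stat:.2f}, p = {p_value:.4f}")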
Machine learning methods, one of the most important and promising branches of artificial intelligence, are of great importance in all sciences, such as engineering and medicine, and have recently become widely involved in statistics and its various branches, including survival analysis, where they can be considered a new approach to estimating survival alongside the parametric, nonparametric, and semi-parametric methods widely used to estimate survival in statistical research. In this paper, the estimation of survival based on medical images of patients with breast cancer who receive their treatment in Iraqi hospitals was discussed. Three algorithms for feature extraction were explained: the first, principal component
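As an illustrative sketch only, not the study's pipeline, the code below extracts principal-component features from synthetic grayscale images and feeds them to a Cox proportional-hazards model; the image size, number of components, and survival data are placeholder assumptions.

import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
images = rng.random((200, 32, 32))                 # 200 synthetic grayscale images
times = rng.exponential(scale=24.0, size=200)      # follow-up time (e.g. months)
events = rng.integers(0, 2, size=200)              # 1 = event observed, 0 = censored

# Feature extraction: flatten each image and keep the leading principal components.
X = images.reshape(len(images), -1)
features = PCA(n_components=5).fit_transform(X)

df = pd.DataFrame(features, columns=[f"pc{i+1}" for i in range(5)])
df["time"], df["event"] = times, events

# Survival estimation from the extracted image features.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "p"]])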
Purpose – Cloud computing (CC) and its services have enabled organizations' information centers to adapt their information and technological infrastructure, making it better suited to developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud-data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, processes, and roles and responsibilities. Therefore, our paper discusses these dimensions as challenges facing information centers with respect to their data governance and the impact
In data transmission, a change in a single bit of the received data may lead to misunderstanding or to disaster. Every bit in the transmitted information has high priority, especially information such as the address of the receiver. Detecting an error at each single bit change is therefore a key issue in the field of data transmission.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, show better results but still fail to cope with an increasing number of errors.
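A small sketch of these standard schemes (not the paper's proposed methods) makes the limitation concrete: a single parity bit misses two flipped bits, while two-dimensional (row and column) parity catches the same double error; the 4-bit words and block layout are illustrative.

def parity(bits):
    return sum(bits) % 2                       # even-parity bit over a word

def single_parity_ok(bits, p):
    return parity(bits) == p                   # True means "no error detected"

def two_d_parities(block):                     # block: list of equal-length rows
    rows = [parity(r) for r in block]          # one parity bit per row
    cols = [parity(c) for c in zip(*block)]    # one parity bit per column
    return rows, cols

data = [1, 0, 1, 1]                            # sender side
p = parity(data)
received = [1, 1, 0, 1]                        # two bits flipped (even error count)
print(single_parity_ok(received, p))           # True -> the double error goes undetected

block = [[1, 0, 1, 1],
         [0, 1, 0, 1]]
sent = two_d_parities(block)
block[0][1] ^= 1                               # flip two bits within one row
block[0][2] ^= 1
print(two_d_parities(block) == sent)           # False -> the column parities catch it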
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods were: the 2D-Checksum method