Progress in computer networks and the emergence of new technologies in this field have led to new protocols and frameworks that provide new network-based services. E-government services, a modernized version of conventional government, are created through the steady evolution of technology in addition to the growing need of societies for numerous services. Government services are deeply related to citizens' daily lives; therefore, it is important to evolve with technological developments: it is necessary to move from the traditional methods of managing government work to modern technical approaches that improve the effectiveness of government systems for providing services to citizens. Blockchain technology is among the modern technologies highly suitable for developing digital governance services, and its ability to sustain information stability is vital in digital governance systems, since it improves integrity and transparency while preventing corruption. In this study, computer networking protocols were used to build a peer-to-peer network framework for managing official documents with blockchain technology, illustrating how any element of government work may be developed using it. The suggested framework supports the addition of a new official document and the verification of an existing document. The system was implemented with socket programming in Java, and its response times were tested under many simultaneous requests using a transactions-per-second (throughput) measurement. The results showed that the proposed system processed 200 document verification transactions within 50 seconds, i.e., about 4 transactions per second. In addition, the test measured the time required for document retrieval: about three seconds to answer 100 document retrieval transactions. Furthermore, the throughput results were compared with the same measurement for popular applications such as Bitcoin, and the throughput of the proposed system was within the average range of the compared applications.
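As an illustration of the kind of socket-level exchange such a framework implies, the sketch below shows a Java client that sends document-verification requests to a node and derives throughput from the elapsed time. The host, port, and line-based "VERIFY" message format are assumptions made for illustration, not the paper's actual protocol.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

/** Hypothetical client: sends document-verification requests to a
 *  peer-to-peer node and measures throughput (transactions per second). */
public class VerificationClient {
    public static void main(String[] args) throws Exception {
        String host = "localhost";   // assumed node address
        int port = 5000;             // assumed node port
        int transactions = 200;      // matches the reported test size

        long start = System.currentTimeMillis();
        for (int i = 0; i < transactions; i++) {
            try (Socket socket = new Socket(host, port);
                 PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(socket.getInputStream()))) {
                // Hypothetical line-based protocol: "VERIFY <document id>"
                out.println("VERIFY doc-" + i);
                String reply = in.readLine(); // e.g. "VALID" or "NOT_FOUND"
            }
        }
        double seconds = (System.currentTimeMillis() - start) / 1000.0;
        // e.g. 200 transactions in 50 s -> 4 transactions per second
        System.out.printf("Throughput: %.2f tx/s%n", transactions / seconds);
    }
}
```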
Fetal heart rate (FHR) signal processing based on Artificial Neural Networks (ANN), Fuzzy Logic (FL), and the frequency-domain Discrete Wavelet Transform (DWT) was analyzed in order to perform automatic analysis using personal computers. Cardiotocography (CTG) is a primary biophysical method of fetal monitoring. The assessment of printed CTG traces was based on visual analysis of patterns describing the variability of the fetal heart rate signal. Fetal heart rate data of pregnant women between 38 and 40 weeks of gestation were studied. The first stage of the system was to convert the cardiotocography (CTG) tracing into a digital series so that it could be analyzed, while in the second stage the FHR time series was t
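As a minimal sketch of the wavelet stage, the following Java method applies one level of a Haar DWT to a digitized FHR series. The wavelet family and decomposition depth actually used in the study are not specified here, so Haar and a single level are illustrative assumptions.

```java
/** Minimal sketch: one level of a Haar DWT applied to a digitized FHR
 *  series (illustrative assumption; the study's wavelet is not given here). */
public class HaarDwt {
    /** Returns {approximation, detail} coefficients for an even-length signal. */
    public static double[][] forward(double[] signal) {
        int half = signal.length / 2;
        double[] approx = new double[half];
        double[] detail = new double[half];
        double s = Math.sqrt(2.0);
        for (int i = 0; i < half; i++) {
            approx[i] = (signal[2 * i] + signal[2 * i + 1]) / s; // low-pass
            detail[i] = (signal[2 * i] - signal[2 * i + 1]) / s; // high-pass
        }
        return new double[][] { approx, detail };
    }

    public static void main(String[] args) {
        double[] fhr = {140, 142, 139, 145, 150, 148, 143, 141}; // beats/min
        double[][] coeffs = forward(fhr);
        System.out.println(java.util.Arrays.toString(coeffs[0])); // approximation
        System.out.println(java.util.Arrays.toString(coeffs[1])); // detail
    }
}
```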
Cryptography is a method used to mask text based on an encryption method, such that only the authorized user can decrypt and read the message. An intruder may attack the communication channel in many ways, such as impersonation, repudiation, denial of service, modification of data, threatening confidentiality, and breaking the availability of services. The high volume of electronic communication between people makes it necessary to ensure that transactions remain confidential, and cryptography methods give the best solution to this problem. This paper proposes a new cryptography method based on Arabic words; the method consists of two steps, where the first step is binary encoding generation used t
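The paper's exact binary-encoding table for Arabic words is not reproduced here, so the following Java fragment is only an illustrative stand-in for that first step: it renders each UTF-8 byte of a word as an 8-bit binary string.

```java
import java.nio.charset.StandardCharsets;

/** Illustrative sketch only: stands in for the paper's binary encoding
 *  generation step by emitting the UTF-8 bytes of a word as binary. */
public class BinaryEncoding {
    public static String encode(String word) {
        StringBuilder bits = new StringBuilder();
        for (byte b : word.getBytes(StandardCharsets.UTF_8)) {
            bits.append(String.format("%8s", Integer.toBinaryString(b & 0xFF))
                              .replace(' ', '0'));
        }
        return bits.toString();
    }

    public static void main(String[] args) {
        System.out.println(encode("سلام")); // binary form of an Arabic word
    }
}
```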
Arabic text categorization for pattern recognition is challenging. We propose, for the first time, a novel holistic clustering-based method for classifying Arabic writers. The categorization is accomplished stage-wise. Firstly, the document images are segmented into lines, words, and characters. Secondly, structural and statistical features are extracted from the segmented portions. Thirdly, the F-Measure is used to evaluate the performance of the extracted features and their combinations under different linkage methods, for each distance measure and different numbers of groups. Finally, experiments are conducted on the standard KHATT dataset of Arabic handwritten text, comprising varying samples from 1000 writers. The results in the generatio
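For reference, the F-Measure used to score the results is the harmonic mean of precision and recall; a minimal Java sketch with hypothetical counts:

```java
/** Minimal sketch of the F-Measure; the counts in main are hypothetical. */
public class FMeasure {
    public static double fMeasure(int truePositives, int falsePositives, int falseNegatives) {
        double precision = truePositives / (double) (truePositives + falsePositives);
        double recall = truePositives / (double) (truePositives + falseNegatives);
        return 2 * precision * recall / (precision + recall); // harmonic mean
    }

    public static void main(String[] args) {
        // e.g. 80 correct assignments, 20 false positives, 10 misses -> F ≈ 0.842
        System.out.printf("F-Measure: %.3f%n", fMeasure(80, 20, 10));
    }
}
```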
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and messy, which makes it difficult to find topics in them. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from long documents such as articles and books, and they are often less effective when applied to short text content like Twitter. Luckily, Twitter has many features that represent the interaction between users; in particular, tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve topics learned
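As a minimal sketch of the hashtag step, the Java fragment below extracts user-generated hashtags from a tweet so they can be fed to the topic model as extra keywords. The regex and lowercasing are illustrative assumptions, not the paper's exact preprocessing.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/** Minimal sketch: pull hashtags out of a tweet as candidate topic keywords. */
public class HashtagExtractor {
    private static final Pattern HASHTAG = Pattern.compile("#(\\w+)");

    public static List<String> extract(String tweet) {
        List<String> tags = new ArrayList<>();
        Matcher m = HASHTAG.matcher(tweet);
        while (m.find()) {
            tags.add(m.group(1).toLowerCase()); // keyword without the '#'
        }
        return tags;
    }

    public static void main(String[] args) {
        System.out.println(extract("Great game tonight! #NBA #basketball"));
        // -> [nba, basketball]
    }
}
```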
Speech is the essential way for humans to interact with each other or with machines. However, it is always contaminated with different types of environmental noise. Therefore, speech enhancement algorithms (SEA) have appeared as a significant approach in the speech processing field to suppress background noise and recover the original speech signal. In this paper, a new efficient two-stage SEA with low distortion is proposed based on the minimum mean square error (MMSE) sense. The estimation of the clean signal is performed by taking advantage of Laplacian speech and noise modeling based on the distribution of orthogonal transform (Discrete Krawtchouk-Tchebichef transform) coefficients. The Discrete Kra
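For context, in the MMSE sense the clean-coefficient estimate minimizes the expected squared error, and the optimal estimator is the conditional mean of the clean coefficient given the noisy observation. This is the standard MMSE result, not the paper's specific Laplacian-model derivation:

```latex
\hat{s} \;=\; \arg\min_{\tilde{s}} \; E\!\left[(\tilde{s} - s)^2 \mid y\right] \;=\; E[\,s \mid y\,]
```

where \(s\) is the clean transform coefficient, \(y\) the noisy observation, and \(\hat{s}\) the estimate.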
The quality of Global Navigation Satellite System (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study aims to find an optimal configuration for GNSS baselines, in terms of the number and distribution of baselines, to improve the quality criteria of GNSS networks. The first order design (FOD) problem was applied in this research to optimize the GNSS network baseline configuration, based on the sequential adjustment method to solve its objective functions.
FOD for optimum precision (FOD-p) was the proposed model, based on the design criteria of A-optimality and E-optimality. These design criteria were selected as objective functions of precision, whic
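For reference, the standard textbook forms of these two precision criteria, stated over the parameter covariance matrix \(Q_{xx}\) of the adjusted network (these are the usual definitions, not quoted from the paper):

```latex
\text{A-optimality:}\quad \min \; \operatorname{trace}(Q_{xx})
\qquad
\text{E-optimality:}\quad \min \; \lambda_{\max}(Q_{xx})
```

A-optimality minimizes the average parameter variance, while E-optimality minimizes the worst-case (largest-eigenvalue) variance of the network.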
With the increasing integration of computers and smartphones into our daily lives, and the numerous benefits they offer over traditional paper-based methods of conducting affairs, it has become necessary to bring one of the most essential institutions into this integration, namely colleges. The traditional approach to conducting affairs in colleges is mostly paper-based, which increases time and workload and is relatively decentralized. This project provides educational and management services for the university environment, targeting the staff, the student body, and the lecturers, on two of the most used platforms: smartphones and reliable web applications by clo
The presented work shows a preliminary analytic method for estimating load and pressure distributions on low-speed wings with flow separation and wake rollup phenomena. A higher-order vortex panel method is coupled with the numerical lifting-line theory by means of an iterative procedure that includes models of separation and wake rollup. The computer programs are written in FORTRAN and are stable and efficient.
The capability of the present method is investigated through a number of test cases with different types of wing sections (NACA 0012 and GA(W)-1) at different aspect ratios and angles of attack. The results include the lift and drag curves and the lift and pressure distributions along the wing s
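For context, the classical relation that a numerical lifting-line procedure of this kind iterates on is Prandtl's fundamental lifting-line equation (standard form, not quoted from the paper):

```latex
\alpha(y_0) \;=\; \frac{\Gamma(y_0)}{\pi V_\infty c(y_0)}
\;+\; \alpha_{L=0}(y_0)
\;+\; \frac{1}{4\pi V_\infty} \int_{-b/2}^{b/2} \frac{(d\Gamma/dy)\,dy}{y_0 - y}
```

where \(\Gamma\) is the circulation distribution, \(c\) the local chord, \(V_\infty\) the freestream speed, \(b\) the span, and \(\alpha_{L=0}\) the zero-lift angle of the section; the panel method supplies the sectional data in place of thin-airfoil values.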
The research derives its importance from motivating the behavioural side of employees to apply modern technology in their work, given its great importance in increasing the efficiency and excellence of employees' performance. The research was based on two main hypotheses to show the relationship and impact between the variables. A questionnaire was adopted to collect data and information related to the research from a sample of (50) administrators working at different levels, supported by personal interviews and field visits. The collected data were subjected to statistical analysis using the statistical program SPSS (Statistical Package for the Social Sciences) to reach