For businesses that provide delivery services, the punctuality of the delivery process is very important. In addition to increasing customer trust, efficient route management and selection are required to reduce vehicle fuel costs and speed up delivery. Some small and medium businesses still use conventional methods to manage delivery routes: decisions about delivery schedules and routes are made without any specific method to expedite the delivery settlement process. This process is inefficient, takes a long time, increases costs, and is prone to errors. Therefore, the Dijkstra algorithm has been used to improve the delivery management process. A delivery management system was developed to help managers and drivers schedule efficient ways to deliver product orders to recipients. Based on testing, the Dijkstra algorithm included in the nearest-route search function for the delivery process works well. This system is expected to improve the efficient management and delivery of orders.
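The shortest-route computation behind such a system is typically Dijkstra's algorithm over a weighted graph of locations. The following is a minimal sketch of that idea in Python; the road network, node names, and distances are illustrative placeholders, not data from the paper.

```python
# Minimal sketch of Dijkstra's shortest-path search over a weighted graph of
# locations (node -> {neighbour: distance}). The data below is hypothetical.
import heapq

def dijkstra(graph, source):
    """Return the shortest known distance from `source` to every reachable node."""
    dist = {source: 0}
    queue = [(0, source)]                      # (distance so far, node)
    while queue:
        d, node = heapq.heappop(queue)
        if d > dist.get(node, float("inf")):   # stale queue entry, skip it
            continue
        for neighbour, weight in graph[node].items():
            new_d = d + weight
            if new_d < dist.get(neighbour, float("inf")):
                dist[neighbour] = new_d
                heapq.heappush(queue, (new_d, neighbour))
    return dist

# Hypothetical road network: distances in kilometres between a depot and drop-off points.
roads = {
    "depot": {"A": 4, "B": 2},
    "A": {"depot": 4, "B": 1, "C": 5},
    "B": {"depot": 2, "A": 1, "C": 8},
    "C": {"A": 5, "B": 8},
}
print(dijkstra(roads, "depot"))  # {'depot': 0, 'B': 2, 'A': 3, 'C': 8}
```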
An optical burst switching (OBS) network is a new-generation optical communication technology. In an OBS network, an edge node first sends a control packet, called a burst header packet (BHP), which reserves the necessary resources for the upcoming data burst (DB). Once the reservation is complete, the DB starts travelling to its destination through the reserved path. A notable attack on an OBS network is the BHP flooding attack, where an edge node sends BHPs to reserve resources but never actually sends the associated DBs. As a result, the reserved resources are wasted, and when this happens on a sufficiently large scale, a denial of service (DoS) may take place. In this study, we propose a semi-supervised machine learning approach using the k-means algorithm
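The semi-supervised idea can be sketched as: cluster edge-node traffic records with k-means, then assign each cluster the majority label of the few labelled samples that fall into it. The example below is a minimal illustration under that assumption; the features, synthetic data, and cluster count are placeholders rather than the study's actual dataset or configuration.

```python
# Hedged sketch: k-means clusters unlabelled traffic records, and a small labelled
# subset maps each cluster to "behaving" (0) or "flooding" (1). Synthetic data only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic features, e.g. (BHPs sent, DBs actually sent) per edge node.
normal   = rng.normal([100, 98], 5, size=(200, 2))   # DB count tracks BHP count
flooding = rng.normal([100, 10], 5, size=(20, 2))    # many BHPs, very few DBs
X = np.vstack([normal, flooding])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# A small labelled subset (the semi-supervised step): 0 = behaving, 1 = flooding.
labelled_idx = np.array([0, 1, 2, 200, 201])
labelled_y   = np.array([0, 0, 0, 1, 1])

# Map each cluster to the majority label among its labelled members.
cluster_label = {}
for c in range(2):
    members = labelled_y[kmeans.labels_[labelled_idx] == c]
    cluster_label[c] = int(np.round(members.mean())) if len(members) else 0

predicted = np.array([cluster_label[c] for c in kmeans.labels_])
print("flagged as flooding:", int(predicted.sum()), "of", len(X), "nodes")
```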
The Six Sigma approach is used for continuous improvement of operations in order to rationalize activities and costs, achieve efficiency in the use of available resources, and reduce the incidence of damage, waste, and rework. The accounting information system for surface production processes in the oil fields does not meet cost-management requirements for measuring and evaluating the costs of each activity or for developing indicators to evaluate the efficiency and effectiveness of production processes. To cover the shortcomings of the cost accounting system currently approved by the company, the research addresses the use of strategic cost management techniques, including the Six Sigma approach, for c
Abstract
The aim of this study was to identify the impact of knowledge management processes on organizational creativity in the airline companies operating in Sudan. The hypothesis was formulated as follows: there is a positive, statistically significant relationship between knowledge management processes (diagnosis, acquisition, storage, distribution, and application) and organizational creativity. The measurement of the variables was adopted from previous studies. The study used a descriptive approach and the analytical statistical method to construct the model, with the SPSS program for data analysis. A purposive sampling procedure was chosen and a structured questionnaire was developed. Out of 215 q
Automatic recognition of individuals is very important in the modern era. Biometric techniques have emerged as an answer to the problem of automatic individual recognition. This paper presents a technique for pupil detection that combines simple morphological operations with the Hough Transform (HT). The circular regions of the eye and pupil are segmented by the morphological filter and the Hough Transform (HT), and the local iris area is converted into a rectangular block for the purpose of calculating inconsistencies in the image. The method is implemented and tested on the Chinese Academy of Sciences (CASIA V4) iris image database of 249 persons and the IIT Delhi (IITD) iris
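A rough sketch of the detection pipeline, assuming an OpenCV-style implementation, is shown below: morphological opening suppresses eyelashes and specular highlights, and a circular Hough Transform then locates the pupil boundary. The kernel size, blur, Hough parameters, and input file name are illustrative guesses, not the values used in the paper.

```python
# Hedged sketch of pupil localisation: morphological filtering followed by a
# circular Hough Transform. "eye.png" is a hypothetical grayscale eye image.
import cv2
import numpy as np

gray = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)
assert gray is not None, "eye.png not found"

# Morphological opening removes thin eyelash structures and bright highlights
# so the dark pupil region stands out as a roughly circular blob.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
cleaned = cv2.morphologyEx(gray, cv2.MORPH_OPEN, kernel)
blurred = cv2.medianBlur(cleaned, 5)

# Circular Hough Transform searches for the pupil boundary (parameters are guesses).
circles = cv2.HoughCircles(
    blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
    param1=100, param2=30, minRadius=15, maxRadius=80,
)

if circles is not None:
    x, y, r = np.round(circles[0, 0]).astype(int)
    print(f"pupil centre ~({x}, {y}), radius ~{r}px")
else:
    print("no pupil candidate found")
```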
The study aimed to reveal the role of social capital, represented by its dimensions (structural, relational, and cognitive), in strengthening excellence management at Azadi Hospital in Duhok. To reach the goal of the study, the study variables were first addressed theoretically by framing the concepts and literary contributions of researchers in this field. In the field study, a questionnaire was used as the basic tool to collect data from the individuals in the research sample, who were officials and working staff drawn from administrators and technicians; (120) forms were distributed to the respondents, and (110) of them were retrieved in a condition valid for analysis. Several statistical methods have bee
The research deals with two important and modern subjects: strategic leadership, which has six dimensions, and knowledge management (four dimensions). The goal of the research is to identify the relationship and the effect between them in the Oil Ministry (project department). The sample was (50) persons who work in the department, and the questionnaire was the tool of data gathering.
The research is divided into four parts: the first covers the theoretical review of the research variables, the second the research methodology, the third the analysis and discussion of the empirical results, and the last the conclusions and recommendations.
In regression testing, test case prioritization (TCP) is a technique for ordering all the available test cases. TCP techniques can improve fault-detection performance, which is measured by the average percentage of faults detected (APFD). History-based TCP is one of the TCP techniques; it considers the history of past executions to prioritize test cases. Assigning equal priority to several test cases is a common problem for most TCP techniques. However, this problem has not been explored in history-based TCP techniques. To handle it in regression testing, most researchers resort to random ordering of the tied test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
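For reference, APFD is computed from the positions at which each fault is first detected in a given ordering. The sketch below implements the standard formula; the fault matrix is a synthetic example, not data from the study.

```python
# Hedged sketch of the standard APFD metric used to compare prioritized orderings.

def apfd(ordering, fault_matrix):
    """APFD = 1 - (TF_1 + ... + TF_m) / (n * m) + 1 / (2n),
    where TF_i is the 1-based position in `ordering` of the first test
    that detects fault i, n is the number of tests, m the number of faults."""
    n = len(ordering)
    faults = {f for detected in fault_matrix.values() for f in detected}
    m = len(faults)
    tf_sum = 0
    for fault in faults:
        tf_sum += next(pos for pos, test in enumerate(ordering, start=1)
                       if fault in fault_matrix.get(test, set()))
    return 1 - tf_sum / (n * m) + 1 / (2 * n)

# Which faults each test detects (synthetic example).
detects = {"t1": {1}, "t2": {1, 2, 3}, "t3": set(), "t4": {4}}

print(apfd(["t1", "t2", "t3", "t4"], detects))  # original order  -> 0.5625
print(apfd(["t2", "t4", "t1", "t3"], detects))  # prioritized     -> 0.8125
```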
The aim of this paper is to present a new methodology for finding the RSA private key. A new initial value, generated from a new equation, is selected to speed up the process; after this value is found, a brute-force attack is used to discover the private key. In addition, for the proposed equation, the multiplier of Euler's totient function used to relate the public key and the private key is set to 1. This implies that the equation estimating the new initial value is suitable for the small multiplier. The experimental results show that if all prime factors of the modulus are larger than 3 and the multiplier is 1, the distance between the initial value and the private key
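The brute-force step alone can be sketched as follows: scan candidate private exponents outward from an initial guess and accept the first one that correctly inverts encryption on a couple of probe messages. The paper's equation for generating the initial value is not reproduced here; the starting guess and the toy key below are arbitrary placeholders.

```python
# Hedged sketch of the brute-force step only. The initial guess is a placeholder;
# the paper derives it from a dedicated equation that is not reproduced here.
def brute_force_private_key(n, e, initial_guess, probes=(2, 3), max_steps=10_000):
    ciphertexts = [pow(m, e, n) for m in probes]      # encrypt known probe messages
    for step in range(max_steps):
        for d in (initial_guess + step, initial_guess - step):
            # Accept d if it decrypts every probe ciphertext back to its plaintext.
            if d > 1 and all(pow(c, d, n) == m for c, m in zip(ciphertexts, probes)):
                return d
    return None

# Toy key: p = 61, q = 53, n = 3233, e = 17, true private exponent d = 2753.
print(brute_force_private_key(n=3233, e=17, initial_guess=2700))  # -> 2753
```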
A substantial issue in exchanging confidential messages over the internet is transmitting information safely. For example, consumers and producers of digital products are keen to know that those products are genuine and can be distinguished from worthless ones. The science of encryption can be defined as the technique of embedding data in an image, audio, or video file in a way that meets the security requirements. Steganography is a branch of data-concealment science that aims to reach a desired level of security in the exchange of private, undisclosed commercial and military data. This research offers a novel technique for steganography based on hiding data inside the clusters that result from fuzzy clustering. T
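The general idea can be sketched as: cluster the cover image's pixels, then embed the message bits only in pixels belonging to a chosen cluster. The paper uses fuzzy clustering; the sketch below substitutes plain k-means as a stand-in, and the LSB embedding, cluster choice, and synthetic cover image are illustrative assumptions rather than the paper's actual method.

```python
# Hedged sketch: hide message bits in the LSBs of pixels assigned to one cluster.
# K-means stands in for the paper's fuzzy clustering; all data is synthetic.
import numpy as np
from sklearn.cluster import KMeans

def embed(cover, message_bits, n_clusters=3, target_cluster=0):
    """Hide `message_bits` (list of 0/1) in the LSBs of pixels of one cluster."""
    flat = cover.reshape(-1, 1).astype(float)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(flat)
    stego = cover.reshape(-1).copy()
    positions = np.where(labels == target_cluster)[0][: len(message_bits)]
    bits = np.array(message_bits, dtype=stego.dtype)
    stego[positions] = (stego[positions] & 0xFE) | bits   # overwrite the LSBs
    return stego.reshape(cover.shape), positions

def extract(stego, positions):
    return list(map(int, stego.reshape(-1)[positions] & 1))

cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in grayscale image
bits = [1, 0, 1, 1, 0, 0, 1, 0]
stego, pos = embed(cover, bits)
print(extract(stego, pos) == bits)  # True
```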
A database is characterized as an arrangement of data that is organized and distributed in a way that allows the client to access the stored data simply and conveniently. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
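The MapReduce pattern itself can be illustrated with a Hadoop Streaming style mapper and reducer, for example computing a per-channel mean amplitude over EEG records. The input schema ("channel,amplitude" per line) and the aggregation are hypothetical placeholders, not the study's actual pipeline.

```python
# Hedged sketch of the MapReduce pattern via Hadoop Streaming (stdin/stdout).
# Illustrative launch: hadoop jar hadoop-streaming.jar -input ... -output ...
#   -mapper "python mr_sketch.py map" -reducer "python mr_sketch.py reduce"
import sys

def mapper():
    # Emit "channel \t amplitude" for every input record on stdin.
    for line in sys.stdin:
        try:
            channel, amplitude = line.strip().split(",")
            print(f"{channel}\t{float(amplitude)}")
        except ValueError:
            continue  # skip malformed records

def reducer():
    # Hadoop delivers mapper output sorted by key, so equal channels arrive contiguously.
    current, total, count = None, 0.0, 0
    for line in sys.stdin:
        if not line.strip():
            continue
        channel, amplitude = line.strip().split("\t")
        if channel != current:
            if current is not None:
                print(f"{current}\t{total / count}")
            current, total, count = channel, 0.0, 0
        total += float(amplitude)
        count += 1
    if current is not None:
        print(f"{current}\t{total / count}")

if __name__ == "__main__":
    mapper() if sys.argv[-1] == "map" else reducer()
```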