Delivering 4K (Ultra HD) real-time video streaming over the internet at low bitrate and low latency is the challenge this paper addresses. Compression technology and the transmission link are the main elements that influence video quality, so to deliver video over the internet or any other fixed-capacity medium it is essential to compress it to manageable bitrates (customarily in the 1-20 Mbps range). In this study, video quality is examined using the H.265/HEVC compression standard, and the relationship between video quality and bitrate is investigated across various constant rate factors, GOP patterns, quantization parameters, RC-lookahead settings, and video motion sequences. An ultra-high-definition video source is down-sampled and encoded at multiple resolutions: 3840x2160, 1920x1080, 1280x720, 704x576, 352x288, and 176x144. Experiments were conducted to determine the best H.265 configuration for each resolution, achieving a PSNR of 36 dB at the specified bitrate. The resolution is selected on the delivery (encoder) side according to the end-user application, while adaptation of the stream to the available bandwidth is achieved by embedding a controller with the MPEG-DASH protocol at the client side. Adaptation methods deliver content encoded at several representations of quality and bitrate, with each representation divided into time chunks. We further propose using HTTP/2 to achieve low-latency streaming, focusing on live video and avoiding the limitations of HTTP/1.
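As a rough illustration of the client-side adaptation logic described above, the sketch below picks the representation for the next DASH chunk from a bitrate ladder covering the tested resolutions; the ladder values, safety margin, and buffer threshold are illustrative assumptions, not the paper's measured configuration.

```python
# Hedged sketch of a DASH-style rate-adaptation rule: request the highest
# representation whose bitrate fits the measured throughput, with a safety
# margin that tightens when the playback buffer runs low.
LADDER = [                 # (resolution, bitrate in Mbps) -- illustrative values
    ("3840x2160", 16.0),
    ("1920x1080", 8.0),
    ("1280x720",  4.0),
    ("704x576",   2.0),
    ("352x288",   1.0),
    ("176x144",   0.5),
]

def choose_representation(throughput_mbps, buffer_s, min_buffer_s=4.0, margin=0.8):
    """Return the (resolution, bitrate) to request for the next chunk."""
    if buffer_s < min_buffer_s:          # buffer running low: be more conservative
        margin *= 0.5
    budget = throughput_mbps * margin
    for resolution, bitrate in LADDER:   # ladder is sorted from highest to lowest
        if bitrate <= budget:
            return resolution, bitrate
    return LADDER[-1]                    # fall back to the lowest rung

print(choose_representation(throughput_mbps=6.2, buffer_s=10.0))  # -> ('1280x720', 4.0)
```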
This paper presents a new methodology for finding the RSA private key. A new initial value, generated from a proposed equation, is selected to speed up the process; once this value is found, a brute-force attack is used to discover the private key. In the proposed equation, the multiplier of the Euler totient function used to relate the public and private keys is assigned the value 1, which implies that the equation estimating the initial value is suitable for small multipliers. The experimental results show that if all prime factors of the modulus are larger than 3 and the multiplier is 1, the distance between the initial value and the private key …
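The sketch below illustrates the overall attack shape (estimate an initial value, then brute-force outward from it); the totient estimate used for the initial value is a generic assumption, since the paper's own equation is not reproduced in the abstract.

```python
# Hedged sketch: estimate d from e*d = 1 + k*phi(n) with multiplier k = 1,
# using a rough totient estimate, then search outward from that initial value.
from math import isqrt

def find_private_key(n, e, max_steps=10**6):
    phi_est = n - 2 * isqrt(n) + 1              # rough estimate of Euler's totient
    d0 = max(1, (1 + phi_est) // e)             # initial value assuming multiplier k = 1

    def works(d):                               # verify a candidate d on two test messages
        return d > 0 and all(pow(pow(m, e, n), d, n) == m for m in (2, 3))

    for step in range(max_steps):               # walk outward from the initial value
        for d in (d0 + step, d0 - step):
            if works(d):
                return d
    return None

# toy modulus: p = 61, q = 53, n = 3233, e = 17; prints a working private exponent
print(find_private_key(3233, 17))
```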
In this paper, reliable computational methods (RCMs) based on monomial standard polynomials are applied to solve the Jeffery-Hamel flow (JHF) problem. In addition, convenient basis functions, namely Bernoulli, Euler, and Laguerre polynomials, are used to enhance the reliability of the computational methods. Using such functions turns the problem into a nonlinear algebraic system that Mathematica® 12 can solve. The JHF problem is solved with the help of Improved Reliable Computational Methods (I-RCMs), and a review of the methods is given. Comparisons are made against published results. As further evidence of the accuracy and dependability of the proposed methods, the maximum error remainder …
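A minimal sketch of the collocation idea is given below, using the monomial basis and the standard reduced JHF equation f''' + 2·α·Re·f·f' + 4·α²·f' = 0 with f(0)=1, f'(0)=0, f(1)=0; the polynomial degree, half-angle, and Reynolds number are illustrative choices, and the paper's Bernoulli/Euler/Laguerre variants and error-remainder analysis are not reproduced.

```python
# Hedged collocation sketch for the reduced Jeffery-Hamel equation using a
# monomial (standard polynomial) trial solution and a nonlinear-system solver.
import numpy as np
from numpy.polynomial import Polynomial
from scipy.optimize import fsolve

alpha, Re, N = np.deg2rad(5.0), 50.0, 8        # channel half-angle, Reynolds number, degree

def residuals(c):
    f = Polynomial(c)                          # f(eta) = sum_i c_i * eta**i
    f1, f3 = f.deriv(1), f.deriv(3)
    eta = np.linspace(0.0, 1.0, N)[1:-1]       # interior collocation points
    ode = f3(eta) + 2*alpha*Re*f(eta)*f1(eta) + 4*alpha**2*f1(eta)
    bcs = [f(0.0) - 1.0, f1(0.0), f(1.0)]      # boundary conditions
    return np.concatenate([bcs, ode])          # N+1 equations for N+1 coefficients

c0 = np.zeros(N + 1); c0[0] = 1.0              # start from the constant profile f = 1
coef = fsolve(residuals, c0)
print("f(0.5) ≈", Polynomial(coef)(0.5))
```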
This research relies on the relationship between the reflected spectrum, the nature of each target, its area, and the percentage of its presence alongside other targets within the target area. Changes in land cover across different years are detected from satellite images using Modified Spectral Angle Mapper (MSAM) processing, where Landsat images are processed with two software packages (MATLAB 7.11 and ERDAS IMAGINE 2014). The proposed supervised classification method (MSAM), implemented in MATLAB, and the Maximum Likelihood Classifier in ERDAS IMAGINE are used together to obtain more precise results and detect environmental changes over the studied periods. Despite using two classification …
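For reference, the sketch below implements the classical Spectral Angle Mapper decision rule that MSAM builds on: each pixel is assigned to the reference spectrum with the smallest spectral angle. The reference spectra, angle threshold, and toy pixels are illustrative, and the paper's specific modification is not reproduced.

```python
# Hedged sketch of the classical Spectral Angle Mapper (SAM) classification rule.
import numpy as np

def spectral_angles(pixels, references):
    """pixels: (N, bands); references: (C, bands) -> angle matrix (N, C) in radians."""
    p = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
    r = references / np.linalg.norm(references, axis=1, keepdims=True)
    cos = np.clip(p @ r.T, -1.0, 1.0)
    return np.arccos(cos)

def sam_classify(pixels, references, max_angle=0.1):
    angles = spectral_angles(pixels, references)
    labels = angles.argmin(axis=1)                 # nearest reference spectrum
    labels[angles.min(axis=1) > max_angle] = -1    # unclassified if no close match
    return labels

# toy example: two reference spectra and three pixels over four Landsat-like bands
refs = np.array([[0.1, 0.2, 0.4, 0.5], [0.5, 0.4, 0.2, 0.1]])
pix  = np.array([[0.11, 0.19, 0.42, 0.48], [0.48, 0.41, 0.22, 0.09], [0.3, 0.3, 0.3, 0.3]])
print(sam_classify(pix, refs))                     # -> [ 0  1 -1]
```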
In data mining and machine learning, it is traditionally assumed that the training data, the test data, and the data to be processed in the future share the same feature-space distribution. This condition rarely holds in the real world. To overcome this challenge, domain-adaptation methods are used. One of the remaining challenges in domain adaptation is selecting the most effective features so that they remain effective in the target domain. In this paper, a new feature-selection method based on deep reinforcement learning is proposed. In the proposed method, in order to select the best and most appropriate features, the essential policies …
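Purely as an illustration of reward-driven feature-subset search, and not the paper's deep reinforcement learning architecture, the sketch below uses a simple epsilon-greedy local search in which flipping a feature bit is the action and cross-validated accuracy stands in for the reward a learned policy would optimize.

```python
# Hedged stand-in for RL-style feature selection: epsilon-greedy bit flips,
# with cross-validated classifier accuracy as the reward signal.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5, random_state=0)

def reward(mask):
    if not mask.any():
        return 0.0
    return cross_val_score(KNeighborsClassifier(5), X[:, mask], y, cv=3).mean()

mask = np.ones(X.shape[1], dtype=bool)
score = reward(mask)
best_mask, best_score = mask.copy(), score
for step in range(200):                           # each step flips one feature bit
    j = rng.integers(X.shape[1])
    trial = mask.copy(); trial[j] = ~trial[j]
    r = reward(trial)
    if r >= score or rng.random() < 0.1:          # greedy accept, epsilon exploration
        mask, score = trial, r
        if r > best_score:
            best_mask, best_score = trial.copy(), r
print(f"best subset keeps {best_mask.sum()} features, CV accuracy {best_score:.3f}")
```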
This paper analysed the effect of electronic internal auditing (EIA) based on the Control Objectives for Information and Related Technologies (COBIT) framework. Organisations must implement an up-to-date accounting information system (AIS) capable of meeting their auditing requirements. Electronic audit risk (compliance assessment, control assurance, and risk assessment), introduced by Weidenmier and Ramamoorti (2006), is used to improve the AIS. To fulfil the study's objectives, a questionnaire was prepared and distributed to a sample of 120 employees: financial managers, internal auditors, and staff of the information security departments in the General Company for Electricity D…
Traffic management at road intersections is a complex problem that has been an important topic of research and discussion. Solutions have primarily focused on vehicular ad hoc networks (VANETs). Key issues in VANETs include high mobility, road-layout constraints, frequent topology changes, failed network links, and the need for timely data delivery, all of which make routing packets to a particular destination problematic. To address these issues, a new dependable routing algorithm is proposed, which utilizes a wireless communication system between vehicles in urban vehicular networks. This routing is position-based, known as the maximum distance on-demand routing algorithm (MDORA). It aims to find an optimal route on a hop-by-hop …
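The sketch below illustrates the position-based, maximum-forward-progress idea suggested by the algorithm's name: among neighbours inside radio range, forward to the one that makes the largest progress toward the destination. Coordinates, radio range, and data structures are assumptions, and the full MDORA procedure (route discovery, link-reliability checks) is not reproduced.

```python
# Hedged sketch of greedy position-based next-hop selection.
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def next_hop(current, destination, neighbours, radio_range=250.0):
    """neighbours: dict node_id -> (x, y). Returns the chosen node_id or None."""
    best_id, best_progress = None, 0.0
    for node_id, pos in neighbours.items():
        if distance(current, pos) > radio_range:
            continue                                   # not reachable in one hop
        progress = distance(current, destination) - distance(pos, destination)
        if progress > best_progress:                   # maximum forward progress
            best_id, best_progress = node_id, progress
    return best_id                                     # None -> no useful neighbour

nodes = {"v1": (120.0, 40.0), "v2": (200.0, 10.0), "v3": (400.0, 0.0)}
print(next_hop(current=(0.0, 0.0), destination=(500.0, 0.0), neighbours=nodes))  # -> v2
```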
Optimizing access point (AP) deployment plays a major role in wireless applications, given the need to provide efficient communication at low deployment cost. Quality of Service (QoS) is a major objective to be considered alongside AP placement and overall deployment cost. This study proposes and investigates a multi-level optimization algorithm called the Wireless Optimization Algorithm for Indoor Placement (WOAIP), based on Binary Particle Swarm Optimization (BPSO). WOAIP aims to obtain the optimal multi-floor AP placement with effective coverage, better supporting QoS and cost-effectiveness. Five weight pairs (coverage, AP deployment), signal thresholds, and received s…
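A compact illustration of the BPSO machinery that WOAIP builds on is sketched below: each particle is a bit-vector over candidate AP locations, and fitness trades grid coverage against the number of APs deployed. The grid, radio range, and weights are illustrative stand-ins for the paper's weight pairs and signal thresholds, not its actual model.

```python
# Hedged Binary PSO sketch for AP placement on a 2-D grid of demand points.
import numpy as np

rng = np.random.default_rng(1)
candidates = rng.uniform(0, 50, size=(12, 2))          # 12 candidate AP positions (m)
grid = np.stack(np.meshgrid(np.arange(0, 50, 5), np.arange(0, 50, 5)), -1).reshape(-1, 2)
RANGE, W_COVER, W_COST = 15.0, 1.0, 0.04               # coverage radius and weight pair

def fitness(bits):
    aps = candidates[bits.astype(bool)]
    if len(aps) == 0:
        return 0.0
    d = np.linalg.norm(grid[:, None, :] - aps[None, :, :], axis=2)
    covered = (d.min(axis=1) <= RANGE).mean()          # fraction of grid points covered
    return W_COVER * covered - W_COST * bits.sum()     # penalise extra APs

n_particles, n_bits, iters = 20, len(candidates), 60
x = rng.integers(0, 2, size=(n_particles, n_bits)).astype(float)
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
gbest = pbest[pbest_f.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = (rng.random(x.shape) < 1 / (1 + np.exp(-v))).astype(float)   # sigmoid transfer
    f = np.array([fitness(p) for p in x])
    better = f > pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmax()].copy()

print("APs deployed:", int(gbest.sum()), "fitness:", round(fitness(gbest), 3))
```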
In this work, a chemical optical fiber sensor based on Surface Plasmon Resonance (SPR) was designed and implemented using plastic optical fiber. The sensor is used to estimate the refractive indices and concentrations of various chemicals (methanol, distilled water, ethanol, kerosene) and to evaluate performance parameters such as sensitivity, signal-to-noise ratio, resolution, and figure of merit of the fabricated sensor. For an exposed sensing region with a 40 nm thick, 10 mm long Au film, the sensitivity of the fiber-based SPR sensor was 3 μm/RIU, the SNR was 0.24, the figure of merit was 20, and the resolution was 0.00066 RIU. The type of optical fiber used i…
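The quoted figures follow from the standard SPR performance definitions (sensitivity = shift per index change, SNR = shift over linewidth, FOM = sensitivity over linewidth, resolution = smallest detectable shift over sensitivity). The worked example below uses illustrative measurement values chosen to be consistent with those figures; they are not taken from the paper.

```python
# Hedged worked example of standard SPR sensor figures of merit.
d_lambda_res = 36.0       # resonance wavelength shift (nm) for the index change below (assumed)
d_n          = 0.012      # refractive-index change of the analyte (RIU, assumed)
fwhm         = 150.0      # spectral width of the SPR dip (nm, assumed)
d_lambda_min = 2.0        # smallest detectable wavelength shift (nm, assumed)

sensitivity = d_lambda_res / d_n            # nm/RIU
snr         = d_lambda_res / fwhm           # dimensionless
fom         = sensitivity / fwhm            # 1/RIU
resolution  = d_lambda_min / sensitivity    # RIU

print(sensitivity, snr, fom, resolution)    # 3000 nm/RIU (3 um/RIU), 0.24, 20.0, ~0.00066
```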