With the continuous downscaling of semiconductor processes, the growing power density and thermal issues in multicore processors become increasingly challenging; reliable dynamic thermal management (DTM) is therefore required to prevent severe degradation of system performance. The accuracy of the thermal profile delivered to the DTM manager plays a critical role in the efficiency and reliability of DTM: different sources of noise and variation in deep submicron (DSM) technologies severely affect the thermal data, which can lead to significant degradation of DTM performance. In this article, we propose a novel fault-tolerance scheme that exploits approximate computing to mitigate DSM effects on DTM efficiency. Approximate computing in hardware design can yield significant gains in energy efficiency, area, and performance. Exploiting this opportunity requires design abstractions that systematically incorporate approximation into hardware design, which is the main contribution of our work. Our proposed scheme achieves 11.20% lower power consumption, 6.59% smaller area, and a 12% reduction in the number of wires, while increasing DTM efficiency by 5.24%.
Today, cloud computing plays a prominent role in our day-to-day lives. The cloud computing paradigm makes it possible to provide resources on demand, and it has changed the way organizations manage resources thanks to its robustness, low cost, and pervasive nature. Data security is usually realized using methods such as encryption. However, the privacy of data is another important challenge that should be considered when transporting, storing, and analyzing data in the public cloud. In this paper, a new method is proposed to track malicious users who use their private key to decrypt data in a system, share it with others, and cause system information leakage. Security policies are also considered to be int
Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their information and technology infrastructure, making it more suitable for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud-data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, process, and roles and responsibilities. Therefore, our paper discusses these dimensions as challenges facing information centers with respect to their data governance and the impa
Rutting is a crucial concern affecting the stability and long-term performance of asphalt concrete pavements, negatively impacting the comfort and safety of vehicle drivers. This research aims to evaluate the permanent deformation of pavement under different traffic and environmental conditions using an Artificial Neural Network (ANN) prediction model. The model was built on the outcomes of an experimental uniaxial repeated-loading test of 306 cylindrical specimens. Twelve independent variables representing the materials' properties, mix design parameters, loading settings, and environmental conditions were implemented in the model, resulting in a total of 3214 data points. The network accomplished high prediction accuracy with an R
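The abstract above describes a feed-forward ANN mapping twelve input variables to a single deformation output. The study's actual architecture, data, and hyperparameters are not given in this excerpt; the minimal sketch below only illustrates the general approach, using synthetic placeholder data and an arbitrarily chosen hidden-layer size.

```python
# Hypothetical sketch of an ANN rutting-prediction model of the kind described
# above: 12 input features -> one permanent-deformation output. The layer
# sizes and synthetic data are illustrative only, not the study's own.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the 3214 data points over 12 material/mix/loading/
# environment variables mentioned in the abstract.
X = rng.uniform(size=(3214, 12))
y = (X @ rng.uniform(size=12) + 0.1 * rng.standard_normal(3214)).reshape(-1, 1)

# One hidden tanh layer, trained by full-batch gradient descent on MSE.
W1 = rng.standard_normal((12, 16)) * 0.3
b1 = np.zeros(16)
W2 = rng.standard_normal((16, 1)) * 0.3
b2 = np.zeros(1)
lr = 0.05

losses = []
for _ in range(500):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    pred = h @ W2 + b2                # network output
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagate the MSE gradient through both layers.
    g2 = 2.0 * err / len(X)
    gh = (g2 @ W2.T) * (1.0 - h ** 2)
    W2 -= lr * (h.T @ g2)
    b2 -= lr * g2.sum(axis=0)
    W1 -= lr * (X.T @ gh)
    b1 -= lr * gh.sum(axis=0)

print(f"training MSE: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

In practice such a model would be fit with a standard framework and evaluated on held-out data; the hand-rolled training loop here is only to make the input-to-output mapping concrete.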
Cloud computing (CC) is a fast-growing technology that offers computing, networking, and storage services that can be accessed and used over the internet. Cloud services save users money because they are pay-per-use, and they save time because they are on-demand and elastic, a distinctive aspect of cloud computing. However, several security issues must be addressed before users store data in the cloud. Because the user has no direct control over data outsourced to the cloud, particularly personal and sensitive data (health, finance, military, etc.), and does not know where the data is stored, the user must ensure that the cloud stores and maintains the outsourced data appropriately. The study's primary goals are to mak
Experimental and theoretical investigations of the flocculation process in a pulsator clarifier are presented. An experimental system was designed to study the factors affecting the performance of the pulsator clarifier. These factors were the water level in the vacuum chamber, ranging from 60 to 150 cm; the rise time of water in the vacuum chamber, at 20, 30, and 40 seconds; and the sludge blanket height, at 20, 30, and 40 cm. The turbidity and pH of the raw water used were 200 NTU and 8.13, respectively. According to the jar test, the alum dose required for this turbidity was 20 mg/l. The performance parameters of the pulsator clarifier, such as turbidity, total solids (TS), shear rate, and volume concentration of the sludge blanket an
The first successful implementation of Artificial Neural Networks (ANNs) was published a little over a decade ago, and it is time to review the progress made in this research area. This paper provides a taxonomy for classifying Field Programmable Gate Array (FPGA) implementations of ANNs. Different implementation techniques and design issues are discussed, such as the trade-off between choosing a suitable activation function and the numerical truncation technique, and improvement of the learning algorithm to reduce the cost of a neuron and, consequently, the total cost and speed of the complete ANN. Finally, the implementation of a complete, very fast circuit for the English digit numbers pattern is presented; the NN has four layers of 70 nodes (neurons) o
The manifestations of climate change are increasing by the day: sudden rains and floods, lakes that evaporate, rivers such as the Tigris, Euphrates, Rhine, and Lape that experience unprecedentedly low water levels, and successive droughts. At the same time, energy consumption is increasing, and there is no way to stop the warming of the Earth's atmosphere despite the many conferences and growing interest in environmental problems. One aspect that has not received sufficient attention is the tremendous heat produced by human activities. This work links four elements of the built environment that are known for their high energy consumption (houses, supermarkets, greenhouses, and asphalt roads) according t