This theoretical study analyzes the legacy of the Chicago School of Urban Sociology and evaluates it in light of the growth and development of the city of Chicago and the establishment of sociology there. Sociology became a recognized academic discipline in the United States in the late nineteenth century, particularly after the first department of sociology was founded at the University of Chicago in 1892, during a period of rapid industrialization and sustained growth in the city. The Chicago School focused on Chicago in particular, as one of the American cities that grew and expanded most rapidly in the first two decades of the twentieth century; at the end of the nineteenth century, the city received large numbers of immigrants from Europe and South America. Accordingly, this study examines the heritage of the Chicago School in depth, focusing on its origin, genesis, and development. It also sheds light on the school's emergence and dominance within American academia in the first two decades of the twentieth century, and investigates the role of pragmatic thought in its growth into a prominent scientific edifice among American universities. The golden age of the school and the creativity of its pioneering scholars continued until the mid-1940s; the study therefore explains the causes of its decline after that point and assesses the school's standing from the 1940s to the first two decades of the current millennium.
The study concluded that although much of the urban sociology tradition of the Chicago School and its deeply rooted sub-fields was, and remains, important and central to the discipline, this does not mean that the styles and methods of studies conducted under the Chicago School umbrella should be applied to urban life in today's cities. They are inadequate to the reality of urban life in contemporary industrial cities because of radical transformations at every level, economic, social, political, and cultural, as well as modern communication technologies that have changed the face of the world through what is now called globalization.
Most Internet of Vehicles (IoV) applications are delay-sensitive and require resources for data storage and task processing that vehicles can rarely afford on their own. Such tasks are therefore often offloaded to more powerful entities, such as cloud and fog servers. Fog computing is a decentralized infrastructure located between the data source and the cloud, and it supplies several benefits that make it a meaningful extension of the cloud. The high volume of data generated by vehicles' sensors and the limited computation capabilities of vehicles have imposed several challenges on VANET systems. Therefore, VANETs are integrated with fog computing to form a paradigm named Vehicular Fog Computing (VFC), which provides low-latency services to mo
Convergence speed is the most important characteristic of the Back-Propagation (BP) algorithm, and many improvements have been proposed since its introduction to speed up the convergence phase. In this paper, a new modified BP algorithm called Speeding up Back-Propagation Learning (SUBPL) is proposed and compared to standard BP. Different data sets were implemented and experimented with to verify the improvement achieved by SUBPL.
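The SUBPL modification itself is not detailed in this abstract, so as context the following is a minimal sketch of the baseline being compared against: standard back-propagation on the XOR problem, with a momentum term, one of the classic convergence speed-ups. The network shape, learning rate, and momentum coefficient are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch of standard back-propagation with momentum on XOR.
# Architecture (2-4-1), lr, and mu are assumed values for illustration.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)      # XOR targets

W1 = rng.normal(0, 1, (2, 4))
W2 = rng.normal(0, 1, (4, 1))
V1 = np.zeros_like(W1)                                # momentum buffers
V2 = np.zeros_like(W2)
lr, mu = 0.5, 0.9
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

mse0 = float(np.mean((sigmoid(sigmoid(X @ W1) @ W2) - y) ** 2))

for epoch in range(2000):
    h = sigmoid(X @ W1)                               # forward pass
    out = sigmoid(h @ W2)
    # back-propagate deltas through the sigmoid derivatives
    d2 = (out - y) * out * (1 - out)
    d1 = (d2 @ W2.T) * h * (1 - h)
    # momentum update: velocity accumulates past gradient directions
    V2 = mu * V2 - lr * (h.T @ d2)
    V1 = mu * V1 - lr * (X.T @ d1)
    W2 += V2
    W1 += V1

mse = float(np.mean((sigmoid(sigmoid(X @ W1) @ W2) - y) ** 2))
```

Momentum damps oscillation across the error surface, which is the kind of convergence-phase improvement papers in this line typically benchmark against.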
This paper compares denoising techniques using a statistical approach: principal component analysis with local pixel grouping (PCA-LPG), in which the procedure is iterated a second time to further improve denoising performance, alongside other enhancement filters. These include an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter, in which each output pixel contains the median value of the M-by-N neighborhood around the corresponding pixel in the input image; a Gaussian low-pass filter; and an order-statistic filter.
Experimental results show LPG-
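Three of the comparison filters named above can be sketched directly with SciPy; this is a hedged illustration on a synthetic noisy image, not the paper's actual experiment (PCA-LPG is more involved and is omitted here). The image, noise level, and window sizes are assumptions.

```python
import numpy as np
from scipy import ndimage, signal

# Synthetic grayscale test image: a horizontal intensity ramp plus
# constant-power additive Gaussian noise (assumed sigma = 0.1).
rng = np.random.default_rng(1)
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
noisy = clean + rng.normal(0.0, 0.1, clean.shape)

# Median filter: each output pixel is the median of its M-by-N window.
median_out = ndimage.median_filter(noisy, size=3)
# Gaussian low-pass filter.
gauss_out = ndimage.gaussian_filter(noisy, sigma=1.0)
# Adaptive Wiener filter using local-neighborhood statistics.
wiener_out = signal.wiener(noisy, mysize=3)

mse = lambda img: float(np.mean((img - clean) ** 2))
```

Each filter should lower the mean squared error against the clean image relative to the noisy input, which is the kind of quantitative comparison the abstract describes.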
Human interaction technology based on motion capture (MoCap) systems is a vital tool for human kinematics analysis, with applications in clinical settings, animation, and video games. We introduce a new method for analyzing and estimating dorsal spine movement using a MoCap system. The data captured by the MoCap system are processed and analyzed to estimate the motion kinematics of three primary regions: the shoulders, spine, and hips. This work contributes a non-invasive, anatomically guided framework that enables region-specific analysis of spinal motion and could serve as a clinical alternative to invasive measurement techniques. The hierarchy of our model consists of five main levels: motion capture system settings, marker data
Cyber-attacks continue to grow, creating a need for stronger image-protection schemes. This paper presents DGEN, a Dynamic Generative Encryption Network that combines Generative Adversarial Networks with a context-adaptive key system. Unlike a fixed scheme such as AES, the method can potentially adjust itself as new threats appear, and it aims to resist brute-force, statistical, and quantum attacks. The design injects randomness, uses learning, and generates keys that depend on each image, which should provide strong security and flexibility while keeping computational cost low. Tests were run on several public image data sets, and the results show DGEN outperforming AES, chaos-based methods, and other GAN approaches. Entropy reached 7.99 bits per pix
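The entropy figure reported above is the standard Shannon-entropy metric for cipher images: a well-encrypted 8-bit image should approach 8 bits per pixel. As a hedged sketch of how that metric is computed (not DGEN itself), using an assumed uniform-random array as a stand-in for cipher output:

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy in bits per pixel of an 8-bit image."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins (0 * log 0 = 0)
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(2)
# Stand-in for a cipher image: uniformly distributed byte values.
cipher_like = rng.integers(0, 256, (256, 256), dtype=np.uint8)
# Degenerate case: a constant image carries zero entropy.
flat = np.full((256, 256), 128, dtype=np.uint8)
```

A uniform byte image scores close to the 8-bit maximum, which is why values such as 7.99 bits per pixel are read as evidence that the ciphertext is statistically indistinguishable from noise.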