Neology is one of the most actively developing branches of lexicology; its object of study is the new word, or neologism. The tasks of neology include identifying new words and new meanings of words already existing in the language, analyzing the causes and mechanisms of their appearance, describing the factors that influence the emergence of new items in the lexical system, and developing language policy with regard to new nominations. The lexicographic description of neologisms is the domain of neography. In Russian studies, the active study of neologisms began in the second half of the twentieth century, although interest in new nominations in the language arose considerably earlier. The first definition of the term neologism was given in the "Desk Dictionary for Reference in All Branches of Knowledge" edited by F. Toll (1864): "Neologism (Greek), the passion for introducing useless words into the language, i.e. words intended to express ideas clearly conveyed by other words already in use" [Alatortseva, 1999, p. 11]. The word neologism itself was used even earlier; for example, P. Ya. Vyazemsky writes: "I will allow myself neologisms, i.e. additions to the Dictionary of the Russian Academy" [ibid., p. 11].
Most Internet of Vehicles (IoV) applications are delay-sensitive and require resources for data storage and task processing that vehicles can hardly afford on their own. Such tasks are therefore offloaded to more powerful entities, such as cloud and fog servers. Fog computing is a decentralized infrastructure located between the data source and the cloud; it supplies several benefits that make it a non-trivial extension of the cloud. The high volume of data generated by vehicles' sensors, together with the limited computation capabilities of vehicles, has imposed several challenges on VANET systems. VANETs are therefore integrated with fog computing to form a paradigm called Vehicular Fog Computing (VFC), which provides low-latency services to moving vehicles…
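To make the offloading trade-off concrete, here is a minimal sketch (not from the paper) of a latency-based choice between local, fog, and cloud execution; the task sizes, link rates, CPU speeds, and round-trip delays are illustrative assumptions.

```python
# Hypothetical latency model for choosing where a vehicle's task runs.
# All numeric parameters are illustrative assumptions, not values from the paper.

def completion_time(task_bits: float, cpu_cycles: float,
                    uplink_bps: float, cpu_hz: float,
                    rtt_s: float = 0.0) -> float:
    """Transmission delay + processing delay + round-trip overhead."""
    return task_bits / uplink_bps + cpu_cycles / cpu_hz + rtt_s

def choose_target(task_bits: float, cpu_cycles: float) -> str:
    """Pick the execution site with the lowest estimated completion time."""
    options = {
        # (uplink rate, CPU speed, extra round-trip latency)
        "vehicle": completion_time(0, cpu_cycles, 1.0, 0.5e9),            # local: no uplink
        "fog":     completion_time(task_bits, cpu_cycles, 50e6, 5e9, 0.005),
        "cloud":   completion_time(task_bits, cpu_cycles, 10e6, 20e9, 0.050),
    }
    return min(options, key=options.get)

if __name__ == "__main__":
    # A 2 MB sensor frame needing 1 G CPU cycles: the fog tier wins on latency.
    print(choose_target(task_bits=16e6, cpu_cycles=1e9))
```

Under these assumed numbers the nearby fog server beats both local execution (slow CPU) and the cloud (slow, distant uplink), which is the latency argument VFC rests on.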
The convergence speed is the most important feature of the Back-Propagation (BP) algorithm. Many improvements to this algorithm have been proposed since it was first presented, in order to speed up the convergence phase. In this paper, a new modified BP algorithm called Speeding Up Back-Propagation Learning (SUBPL) is proposed and compared to standard BP. Different data sets were implemented and experimented on to verify the improvement achieved by SUBPL.
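For context, the following is a minimal sketch of the standard BP update whose convergence such speed-ups target, shown on a single sigmoid unit with plain gradient descent; SUBPL itself is not reproduced here, and the learning rate and initialization are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bp(X, y, lr=0.5, epochs=1000):
    """Standard backpropagation on a single sigmoid unit.
    The convergence speed that SUBPL-style methods improve is governed
    by how fast this loop drives the output error down."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        out = sigmoid(X @ w + b)                 # forward pass
        err = out - y                            # output error (MSE derivative)
        delta = err * out * (1 - out)            # backpropagated local gradient
        w -= lr * (X.T @ delta)                  # weight update
        b -= lr * delta.sum()                    # bias update
    return w, b
```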
The city is a built-up urban space with multifunctional structures that ensure safety, health, and the best shelter for humans. Its built structures carry various urban roofs shaped by different climatic circumstances, which creates peculiarities and changes within the local urban climate and increases the impact of urban heat islands (UHI), with wasted energy. The research question addresses the scarcity of information on renovating existing urban roofs using color as a strategy to mitigate the impact of UHI. In order to achieve local urban sustainability, the research focused on solutions using different materials and treatments to reduce urban surface heating emissions. The results showed that the new and old technologies, produ…
Toni Morrison (1931-), the first African-American winner of the Nobel Prize in Literature (1993) and winner of the 1988 Pulitzer Prize for Fiction, regards herself as the historian of the African-American people. She does not think of her writings as literature but as a sacred book dedicated to exploring the interior lives of black people. She creates history by disregarding European standards and the white man's view of African-Americans. She adopts her people's point of view, invests in their heritage, voices their pains, and uses their vernacular. She even writes for a black audience. She establishes the black novel by depicting the blackness of American literature. In choos…
This study employs wavelet transforms to address the issue of boundary effects. Additionally, it utilizes probit transform techniques, based on probit functions, to estimate the copula density function. This estimation depends on the empirical distribution function of the variables, and the density is estimated within a transformed domain. Recent research indicates that early implementations of this strategy may have been more efficient. Nevertheless, in this work we implemented two novel methodologies utilizing the probit transform and the wavelet transform, and then evaluated and contrasted these methodologies using three criteria: root mean square error (RMSE), Akaike information criterion (AIC), and log-likelihood.
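As a minimal sketch of the probit-transform step only (the paper's wavelet stage and its boundary correction are not reproduced, and the simulated data are an assumption): pseudo-observations are mapped to the real plane with the normal quantile function, a kernel density estimate is formed there, and the copula density is recovered by dividing out the normal margins.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

def probit_copula_density(u, v):
    """Estimate a copula density from pseudo-observations (u, v) in (0, 1)^2
    via the probit transform: estimate a density on the transformed domain,
    then divide by the standard normal marginal densities."""
    x, y = norm.ppf(u), norm.ppf(v)           # probit-transformed sample
    kde = gaussian_kde(np.vstack([x, y]))     # density in the transformed domain

    def c(uu, vv):
        s, t = norm.ppf(uu), norm.ppf(vv)
        return kde([s, t])[0] / (norm.pdf(s) * norm.pdf(t))

    return c

# Usage with illustrative simulated data: ranks give the pseudo-observations.
rng = np.random.default_rng(1)
z = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=500)
u = (np.argsort(np.argsort(z[:, 0])) + 1) / (len(z) + 1)
v = (np.argsort(np.argsort(z[:, 1])) + 1) / (len(z) + 1)
c = probit_copula_density(u, v)
print(round(c(0.5, 0.5), 3))   # near 1 / sqrt(1 - 0.7**2) for a Gaussian copula
```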
Digital tampering identification, which detects picture modification, is a significant area of image analysis research. Over the last five years, this area has grown to achieve exceptional precision by employing machine learning and deep learning-based strategies. Synthesis and reinforcement-based learning techniques must now evolve to keep pace with the research. However, before doing any experimentation, a scientist must first comprehend the current state of the art in the domain. Diverse paths, associated outcomes, and analysis lay the groundwork for successful experimentation and superior results. Before starting experiments, universal image forensics approaches must be thoroughly researched. As a result, this review of various…
The advancement of digital technology has increased the deployment of wireless sensor networks (WSNs) in our daily life. However, locating sensor nodes is a challenging task in WSNs, and sensed data without an accurate location are worthless, especially in critical applications. The pioneering technique among range-free localization schemes is the sequential Monte Carlo (SMC) method, which utilizes network connectivity to estimate sensor location without additional hardware. This study presents a comprehensive survey of state-of-the-art SMC localization schemes, organized as a thematic taxonomy of the localization operation in SMC. Moreover, the critical characteristics of each existing scheme are analyzed to identify its advantages…
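To illustrate the connectivity-only idea behind range-free SMC localization, here is a minimal sketch of one prediction-filtering-resampling round in the spirit of classic Monte Carlo localization; the velocity bound, radio range, and sample count are assumptions, and real schemes add two-hop constraints, sample weighting, and bounding boxes.

```python
import numpy as np

def smc_step(samples, one_hop_anchors, v_max=10.0, r=50.0, n=50, max_tries=1000):
    """One SMC localization round for a mobile node.
    samples: (n, 2) location samples from the previous time step.
    one_hop_anchors: (k, 2) positions of anchors the node currently hears."""
    rng = np.random.default_rng()
    kept, tries = [], 0
    while len(kept) < n and tries < max_tries:
        tries += 1
        # Prediction: displace a random old sample by at most v_max.
        p = samples[rng.integers(len(samples))]
        ang = rng.uniform(0, 2 * np.pi)
        cand = p + rng.uniform(0, v_max) * np.array([np.cos(ang), np.sin(ang)])
        # Filtering: keep candidates consistent with connectivity, i.e.
        # within radio range r of every one-hop anchor.
        if np.all(np.linalg.norm(one_hop_anchors - cand, axis=1) <= r):
            kept.append(cand)
    kept = np.array(kept) if kept else samples   # fall back if filtering starves
    return kept, kept.mean(axis=0)               # new samples, location estimate
```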
Cyber-attacks keep growing, which calls for stronger ways to protect images. This paper presents DGEN, a Dynamic Generative Encryption Network that combines Generative Adversarial Networks with a context-adaptive key system. Unlike a fixed scheme such as AES, the method can potentially adjust itself when new threats appear, aiming to resist brute-force, statistical, and quantum attacks. The design adds randomness, uses learning, and derives keys that depend on each individual image, which is intended to give strong security and flexibility while keeping computational cost low. Tests were run on several public image data sets, and the results show that DGEN outperforms AES, chaos-based schemes, and other GAN-based approaches. Entropy reached 7.99 bits per pixel…
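The reported entropy figure is a standard cipher-image metric; the following sketch shows how it is computed for an 8-bit image (the random test image is illustrative, not data from the paper).

```python
import numpy as np

def image_entropy(img: np.ndarray) -> float:
    """Shannon entropy of an 8-bit image in bits per pixel; an ideal cipher
    image approaches 8.0, so 7.99 indicates a near-uniform intensity histogram."""
    hist = np.bincount(img.ravel().astype(np.uint8), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]                          # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

# Usage: a uniformly random "cipher image" scores close to 8 bits per pixel.
print(round(image_entropy(np.random.randint(0, 256, (256, 256))), 3))
```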
Benthic invertebrates were used as bio-indicators to evaluate the pollution in the Al-Diwaniya River. Five stations were selected for this purpose, extending from … upstream to the Al-Sadeer District downstream. The percentage of Oligochaeta relative to the total benthic invertebrates was calculated, and the population density of tubificid worms without hair chaetae was also used for evaluation. The results were presented as the indices IOBS (Oligochaete Index of Sediment Bioindication), TUSP (Tubificidae Species Percentage) and the degree of pollution Eo. It was noticed that the percentage of Oligochaeta relative to the total benthic invertebrates ranged from 37.17 to 60.685 in station 3, while the percentage of tubificid worms to …