Ischemic stroke is a significant cause of morbidity and mortality worldwide. Autophagy, a process of intracellular degradation, has been shown to play a crucial role in the pathogenesis of ischemic stroke. Long non-coding RNAs (lncRNAs) have emerged as essential regulators of autophagy in various diseases, including ischemic stroke. Recent studies have identified several lncRNAs that modulate autophagy in ischemic stroke, including MALAT1, MIAT, SNHG12, H19, AC136007.2, C2dat2, MEG3, KCNQ1OT1, SNHG3, and RMRP. These lncRNAs regulate autophagy by interacting with key proteins involved in the autophagic process, such as Beclin-1, ATG7, and LC3. Understanding the role of lncRNAs in regulating autophagy in ischemic stroke may provide new insights into the pathogenesis of this disease and identify potential therapeutic targets for its treatment.
Accuracy in the skillful performance of the front- and back-court strokes in badminton is achieved through investing compound exercises (physical-skill exercises) in a single performance, whose characteristics give correct movement behavior and speed to the accuracy of stroke performance, as well as through identifying changes in some physiological indicators resulting from the use of these compound exercises. The research problem lies in a weakness the researcher found in the accuracy of performing the front- and back-court strokes, diagnosed through tests conducted on the players to identify the problem; this weakness was attributed to a deficiency in the necessary physical and skill abilities and t
The need for cloud services has been raised globally to provide a platform for healthcare providers to efficiently manage their citizens' health records and thus provide treatment remotely. In Iraq, the healthcare records of public hospitals are increasing progressively under poor digital management. While recent works point to cloud computing as a platform for all sectors globally, a lack of empirical evidence demands a comprehensive investigation to identify the significant factors that influence the utilization of cloud health computing. Here we provide a cost-effective, modular, and computationally efficient model of utilizing cloud computing based on organization theory and theory-of-reasoned-action perspectives. A tot
Scholars have delved deeply into truth and metaphor, and perhaps no topic of Arabic rhetoric has received as much attention and care from scholars as the topic of truth and metaphor. Metaphor opens wide horizons of expression before the writer, so that he has several means by which he can express a single experience; his imagination takes off, depicting the intelligible as tangible, the seen as audible, and the audible as seen. Such is the image presented by the creative writer.
The first thing to note is that the emergence of metaphor as a rhetorical term was at the hands of the Mu'tazila. Muslims differed over the issue of metaphor in the Holy Qur'an, and the dispute began over the verses in which the
Compressing an image and reconstructing it without degrading its original quality is a challenge that still exists nowadays. A coding system that considers both quality and compression rate is implemented in this work. The implemented system applies a synthetic entropy-coding scheme to store the compressed image at the smallest possible size without affecting its original quality. This coding scheme is applied with two transform-based techniques, one using the Discrete Cosine Transform and the other the Discrete Wavelet Transform. The implemented system was tested with different standard color images, and the results obtained with different evaluation metrics are shown. A comparison was made with some previous rel
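The transform stage of such a coder can be sketched as follows. This is a minimal illustration of block-based DCT coding under assumed names and an assumed 8x8 block size and low-frequency mask, not the paper's implementation; in a real coder the retained coefficients would be quantized and entropy coded.

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis matrix: rows are frequencies, columns are samples.
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0] *= 1 / np.sqrt(2)
    return C * np.sqrt(2 / n)

def transform_block(block, C):
    # 2D DCT: apply the 1D transform to rows and columns.
    return C @ block @ C.T

def inverse_block(coeffs, C):
    # Inverse 2D DCT (C is orthonormal, so the inverse is the transpose).
    return C.T @ coeffs @ C

# Toy 8x8 "image block"
rng = np.random.default_rng(0)
block = rng.integers(0, 256, (8, 8)).astype(float)

C = dct_matrix(8)
coeffs = transform_block(block, C)

# Lossy step: keep only the low-frequency quarter of the coefficients;
# the entropy-coding stage would then store this sparse coefficient set.
mask = np.zeros_like(coeffs)
mask[:4, :4] = 1
approx = inverse_block(coeffs * mask, C)
```

With the full coefficient set the reconstruction is exact; discarding high frequencies trades quality for compression rate, which is the balance the abstract describes.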
Whoever contemplates the Qur'an and recites its texts finds that the Qur'an did not invent words that were unknown before it; rather, the language of the Qur'an addresses all matters of speech, choosing the most honorable materials and connecting them to meaning. In places of gentleness or sweetness, we find its words easy and flowing, suited to the purposes for which the Holy Qur'an chose its vocabulary and structures. Ibn Ajeeba was among those distinguished by refined taste and knowledge of the linguistic sciences. This ability helped him to analyze and infer, to explain the reasons for which one word is preferred over another or one construction over another, and to show the efforts of Ibn Aje
Image compression is one of the data compression types applied to digital images in order to reduce their high cost of storage and/or transmission. Image compression algorithms may exploit visual sensitivity and the statistical properties of image data to deliver superior results in comparison with generic data compression schemes used for other digital data. In the first approach, the input image is divided into blocks of 16 x 16, 32 x 32, or 64 x 64 pixels. The blocks are first converted into a string and then encoded using a lossless statistical algorithm known as arithmetic coding. Pixel values that occur more frequently are coded in fewer bits compared with pixel values of less occurrence
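As a rough illustration of why arithmetic coding rewards frequent pixel values, the sketch below flattens an image into a per-block symbol string and computes each symbol's ideal code length, -log2 p(s), which an arithmetic coder approaches. The function names and toy data are assumptions for illustration, not the paper's code.

```python
import numpy as np

def blocks_to_string(img, bs):
    # Split the image into bs x bs blocks (row-major block order),
    # flatten each block, and concatenate into one symbol string.
    h, w = img.shape
    out = []
    for r in range(0, h, bs):
        for c in range(0, w, bs):
            out.extend(img[r:r + bs, c:c + bs].ravel())
    return np.array(out)

def ideal_code_lengths(symbols):
    # Map each symbol value to -log2 of its empirical probability:
    # the per-symbol bit cost an ideal arithmetic coder approaches.
    vals, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return dict(zip(vals.tolist(), (-np.log2(p)).tolist()))

# Toy 4x4 image dominated by one pixel value
img = np.array([[1, 1, 1, 1],
                [1, 1, 1, 2],
                [1, 1, 1, 1],
                [1, 1, 1, 1]])
stream = blocks_to_string(img, 2)
lengths = ideal_code_lengths(stream)
```

Here the frequent value 1 gets a much shorter ideal code than the rare value 2, which is exactly the property the abstract attributes to arithmetic coding.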
In this paper, an algorithm is introduced through which we can embed more data than regular spatial-domain methods allow. The secret data are first compressed using Huffman coding, and this compressed data is then embedded using the Laplacian sharpening method. Laplace filters are used to determine the effective hiding places; based on a threshold value, the places with the highest values acquired from these filters are selected for embedding the watermark. The aim of this work is to increase the capacity of the information to be embedded by using Huffman coding, while at the same time increasing the security of the algorithm by hiding data in the places that have the highest edge values and are least noticeable.
The perform
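The edge-selection step described above can be sketched as follows, assuming a standard 3x3 Laplacian kernel and hypothetical function names; the paper's exact filter and threshold policy are not given here.

```python
import numpy as np

# Standard 4-neighbor Laplacian kernel (an assumption; other variants exist)
LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def laplacian_response(img):
    # Valid 3x3 convolution (no padding) with the Laplacian kernel;
    # the absolute response is large near edges and sharp details.
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for r in range(h - 2):
        for c in range(w - 2):
            out[r, c] = np.sum(img[r:r + 3, c:c + 3] * LAPLACIAN)
    return np.abs(out)

def hiding_places(img, threshold):
    # Coordinates (in the cropped response map) whose |Laplacian| exceeds
    # the threshold, sorted strongest first: candidate embedding positions.
    resp = laplacian_response(img.astype(float))
    above = resp > threshold
    coords = np.argwhere(above)
    order = np.argsort(-resp[above])
    return [tuple(map(int, coords[i])) for i in order]
```

A flat region produces zero response and is excluded, while sharp transitions rank first, matching the idea of hiding data where edge values are highest and changes are least noticeable.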
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding. Signal compression is based on the concept of selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while ignoring the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample. These files are very small in size compared to the size of the original signals. The compression ratio is calculated from the size of th
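The Levinson-Durbin recursion mentioned above can be sketched as follows. This is the textbook formulation with assumed function and variable names, not the paper's code: given the autocorrelation sequence of the windowed coefficients, it returns the LP coefficients, reflection coefficients, and final prediction error that the compressed file would store.

```python
import numpy as np

def levinson_durbin(r, order):
    # r: autocorrelation sequence r[0..order]
    # Returns (a, k, err): LP coefficients with a[0] = 1, reflection
    # coefficients, and the final prediction error energy.
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    k = np.zeros(order)
    for i in range(1, order + 1):
        # Correlation of the current predictor with the next lag
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        ki = -acc / err
        k[i - 1] = ki
        # Order-update of the predictor coefficients
        a[1:i] = a[1:i] + ki * a[i - 1:0:-1]
        a[i] = ki
        # Error shrinks by the factor (1 - ki^2) at each order
        err *= (1.0 - ki * ki)
    return a, k, err
```

For an AR(1)-like autocorrelation r[k] = 0.5**k, the recursion recovers the single-tap predictor a = [1, -0.5] with all higher reflection coefficients zero, a standard sanity check for this algorithm.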