Ischemic stroke is a significant cause of morbidity and mortality worldwide. Autophagy, a process of intracellular degradation, has been shown to play a crucial role in the pathogenesis of ischemic stroke. Long non-coding RNAs (lncRNAs) have emerged as essential regulators of autophagy in various diseases, including ischemic stroke. Recent studies have identified several lncRNAs that modulate autophagy in ischemic stroke, including MALAT1, MIAT, SNHG12, H19, AC136007.2, C2dat2, MEG3, KCNQ1OT1, SNHG3, and RMRP. These lncRNAs regulate autophagy by interacting with key proteins involved in the autophagic process, such as Beclin-1, ATG7, and LC3. Understanding the role of lncRNAs in regulating autophagy in ischemic stroke may provide new insights into the pathogenesis of this disease and identify potential therapeutic targets for its treatment.
Background: Stroke is an acute neurologic injury and the second leading cause of mortality worldwide, as well as the leading cause of acquired disability and morbidity in adults.
Objective: To assess the association between stroke and its risk factors.
Type of study: Retrospective study.
Methods: The study was conducted on 312 patients in 2016. All data were collected from patients' files in the emergency unit and included basic demographic and disease characteristics, comorbid diseases, risk factors, and final diagnosis.
Results: Both previous stroke and ischemic heart disease were strong predictors of new
In this paper, an efficient method for compressing color images is presented. It allows progressive transmission and zooming of the image without the need for extra storage. The proposed method uses a cubic Bezier surface (CBI) representation over wide areas of the image in order to prune the image component that shows large-scale variation. The fitted cubic Bezier surface is then subtracted from the image signal to obtain the residue component, and a bi-orthogonal wavelet transform is applied to decompose that residue. Scalar quantization and quad-tree coding steps are applied to the resulting wavelet sub-bands. Finally, adaptive shift coding is applied to handle the remaining statistical redundancy and attain e
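The surface-fit-then-residue idea above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it fits a bicubic polynomial surface (the same 16-term function space a single cubic Bezier patch spans) by least squares as a stand-in for the Bezier fitting step, and returns the smooth surface plus the residue that a wavelet stage would then decompose.

```python
import numpy as np

def cubic_surface_residue(img):
    """Fit a bicubic polynomial surface to a grayscale image block (a
    stand-in for the cubic Bezier surface fit described above) and return
    (surface, residue). residue = img - surface holds the fine detail
    left for the bi-orthogonal wavelet stage."""
    h, w = img.shape
    y, x = np.mgrid[0:h, 0:w]
    x = x.ravel() / max(w - 1, 1)      # normalise coordinates to [0, 1]
    y = y.ravel() / max(h - 1, 1)
    # Design matrix of all monomials x^i * y^j with i, j <= 3 (16 terms).
    A = np.column_stack([x**i * y**j for i in range(4) for j in range(4)])
    coeffs, *_ = np.linalg.lstsq(A, img.ravel(), rcond=None)
    surface = (A @ coeffs).reshape(h, w)
    return surface, img - surface
```

Because the large-scale variation lives in the surface, the residue is close to zero-mean and small in magnitude, which is what makes the subsequent wavelet/quantization stages effective.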
With the occurrence of any financial crisis, whether global or regional (such as the Great Depression of 1929-1933, the Asian financial crisis at the end of the twentieth century, and the global financial crisis that began in the second half of 2008), critics have loudly accused accountants and auditors of negligence, charging that they acted as partners in financial manipulation alongside corrupt company administrators. In response, many suggestions and recommendations for upgrading the accounting system have been made.
The current research aims to identify pictorial coding and its relationship to the aesthetic taste of art education students. The research community consisted of (10) plastic artworks, of which (3) were selected as a sample for analysis and decoding in line with the aim of the research. The research tool was an analysis form, and the researcher used the following statistical methods: Cooper's equation to find the percentage of agreement between the arbitrators, Scott's equation to calculate the validity of the tool, and the Pearson correlation coefficient to extract reliability by the split-half method. Shape formations achieve encryption of the plastic image through the decoding of symbols, meanings, and the sig
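The split-half reliability computation named above can be illustrated with a small sketch. This is one plausible reading of the method, not the study's actual procedure: the item scores are split into two halves, the halves are correlated with Pearson's r, and the Spearman-Brown formula steps the half-test correlation up to a full-test reliability estimate.

```python
import numpy as np

def split_half_reliability(scores):
    """Split-half reliability estimate: Pearson r between the odd- and
    even-positioned item scores, corrected with the Spearman-Brown
    formula. An illustrative sketch, not the study's exact procedure."""
    scores = np.asarray(scores, dtype=float)
    first, second = scores[0::2], scores[1::2]   # split into two halves
    n = min(len(first), len(second))
    r = np.corrcoef(first[:n], second[:n])[0, 1]  # Pearson correlation
    return 2 * r / (1 + r)                        # Spearman-Brown step-up
```

The Spearman-Brown correction is needed because each half contains only half the items, so the raw half-to-half correlation understates the reliability of the full instrument.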
The second leading cause of death, and one of the most common causes of disability in the world, is stroke. Researchers have found that brain-computer interface (BCI) techniques can lead to better rehabilitation of stroke patients. This study applied the proposed motor imagery (MI) framework to an electroencephalogram (EEG) dataset from eight subjects in order to enhance MI-based BCI systems for stroke patients. The preprocessing portion of the framework comprises conventional filters and the independent component analysis (ICA) denoising approach. Fractal dimension (FD) and the Hurst exponent (Hur) were then calculated as complexity features, and Tsallis entropy (TsEn) and dispersion entropy (DispEn) were assessed as
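One of the entropy features named above, Tsallis entropy, is simple enough to sketch. This is a minimal histogram-based estimator, not the paper's implementation; the entropic index q and the bin count are illustrative choices. Tsallis entropy generalizes Shannon entropy as TsEn = (1 - Σ pᵢ^q) / (q - 1) and recovers the Shannon form as q → 1.

```python
import numpy as np

def tsallis_entropy(signal, q=2.0, bins=16):
    """Tsallis entropy of a 1-D signal from a histogram probability
    estimate (one plausible realisation of the TsEn feature; q and
    bins are illustrative, not the paper's settings)."""
    hist, _ = np.histogram(signal, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                                 # drop empty bins
    return (1.0 - np.sum(p ** q)) / (q - 1.0)
```

A constant signal concentrates all probability in one bin and yields zero entropy, while an EEG-like signal spread across many amplitude bins yields a higher value, which is what makes the feature useful for separating MI classes.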
Several problems must be solved in image compression to make the process workable and more efficient. Much work has been done in the field of lossy image compression based on wavelets and the Discrete Cosine Transform (DCT). In this paper, an efficient image compression scheme is proposed, based on a combined encoding transform scheme. It consists of the following steps: 1) a bi-orthogonal (tap 9/7) wavelet transform to split the image data into sub-bands; 2) DCT to de-correlate the data; 3) scalar quantization of the combined transform stage's output, followed by mapping to positive values; 4) LZW encoding to produce the compressed data. The peak signal-to-noise ratio (PSNR), compression ratio (CR), and compression gain (CG) measures were used t
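Step 3 above, scalar quantization followed by a signed-to-positive mapping, can be sketched as follows. This is an illustrative realisation, not the paper's code: a uniform quantizer with an assumed step size delta, and a sign-folding map that sends non-negative levels to even codes and negative levels to odd codes so the downstream LZW stage sees only non-negative integers.

```python
import numpy as np

def quantize_and_map(coeffs, delta=8.0):
    """Uniform scalar quantization plus signed-to-positive mapping
    (delta is an illustrative step size, not the paper's value)."""
    q = np.round(coeffs / delta).astype(np.int64)   # scalar quantizer
    return np.where(q >= 0, 2 * q, -2 * q - 1)      # fold sign into parity

def unmap_and_dequantize(codes, delta=8.0):
    """Inverse of quantize_and_map, exact up to quantization error."""
    q = np.where(codes % 2 == 0, codes // 2, -(codes + 1) // 2)
    return q * delta
```

The parity-based fold is lossless, so all distortion in the round trip comes from the quantizer itself and is bounded by delta/2 per coefficient.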
Sickle cell disease (SCD) is an inherited, lifelong blood disorder that affects many people globally. In spite of developments in treatment, SCD remains a considerable cause of mortality and morbidity. The present study assesses the role of leukocytes, represented by β-integrin (CD18), and of platelets and their productivity in the pathogenicity of the disease during the steady state and during crisis, in comparison with a healthy control group. The enrolled SCD patients (15), followed during crisis and steady state, showed a significant increase in leukocyte and platelet productivity during crisis compared with the steady state, and in the steady state compared with the healthy control group. In this study, SCD patho