The article addressed the use of artificial intelligence for big data analysis within scientific institutions, focusing on the importance of this integration in enhancing academic and research performance. It reviewed definitions of both artificial intelligence and big data and the types of data found in scientific institutions, then outlined the most prominent practical applications, such as predicting student performance, intelligent advising, analysing user behaviour, and automatic indexing. The article also discussed the ethical and professional challenges and stressed the need to adopt a strategic vision that includes training staff and embedding these technologies in educational policies. It concluded with international and Arab examples, from Saudi Arabia, Jordan, Iraq, and the UAE, of the use of smart platforms in education and research, pointing to the role of these technologies in shaping a knowledge-based future built on intelligent analysis and data-driven decision-making.
An Analysis of Phonetic Errors in Some Russian Language Textbooks
The main challenge for military tactical communication systems is access to relevant information about the particular operating environment, which is required to determine the ideal use of the waveform. Existing propagation models focus mainly on broadcasting and commercial wireless communication with high transceiver antennas, which makes them unsuitable for many military tactical communication systems. This paper presents a study of the path loss model related to the radio propagation profile in suburban areas of Kuala Lumpur. Experimental path loss data for VHF propagation were collected from various suburban settings over the 30-88 MHz frequency range. This experiment was highly affected by ecological factors and existing
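As a point of reference for the kind of modelling described above, the following Python sketch implements a generic log-distance path loss model; the reference distance, path loss exponent and example frequency are illustrative placeholders, not the model or values fitted from the Kuala Lumpur measurements.

```python
# Generic log-distance path-loss sketch (illustrative only; the exponent and
# reference values are placeholders, not the paper's fitted suburban model).
import numpy as np

def log_distance_path_loss(d_m, f_mhz, d0_m=100.0, n_exp=3.0):
    """Path loss in dB at distance d_m (metres) for frequency f_mhz (MHz):
       PL(d) = FSPL(d0, f) + 10 * n * log10(d / d0)."""
    fspl_d0 = 20 * np.log10(d0_m / 1000.0) + 20 * np.log10(f_mhz) + 32.44
    return fspl_d0 + 10 * n_exp * np.log10(d_m / d0_m)

# Example: predicted loss at 1 km and 5 km for a 60 MHz VHF carrier.
for d in (1000, 5000):
    print(d, "m:", round(log_distance_path_loss(d, 60.0), 1), "dB")
```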
With all humility, we present this book, entitled "Mental Health in the Age of Technology: An Analysis of the Challenges and Opportunities". The book seeks to shed light on the complex relationship between our ever-growing digital world and the health of our minds and hearts. We live in a time of great change driven by technological progress, in which digital tools have become an essential part of our daily lives, shaping how we communicate, learn, and interact with our surroundings. It is therefore essential to understand the impact of these technologies
In this paper, we investigate the behaviour of the Bayes estimators for the scale parameter of the Gompertz distribution under two different loss functions, the squared error loss function and the (proposed) exponential loss function, based on different double prior distributions, namely Erlang with an inverse Levy prior, Erlang with a non-informative prior, inverse Levy with a non-informative prior, and Erlang with a chi-square prior.
A simulation study was carried out to obtain the results, including the estimated values and the mean square error (MSE) for the scale parameter of the Gompertz distribution, for different cases of the scale parameter of the Gompertz distribution
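As an illustration of the kind of Monte Carlo comparison described above, the following Python sketch computes the Bayes estimate of the Gompertz scale parameter under squared error loss with a single Erlang (gamma) prior and a known shape parameter, and estimates its MSE over repeated samples; the double priors, the proposed exponential loss function, and the paper's actual parameter settings are not reproduced here, and all numeric values are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def rgompertz(n, lam, c, rng):
    # Inverse-CDF sampling from F(x) = 1 - exp(-lam * (exp(c*x) - 1)), x >= 0.
    u = rng.uniform(size=n)
    return np.log(1.0 - np.log(1.0 - u) / lam) / c

def bayes_scale_self(x, c, k, beta):
    # With the shape c known and an Erlang(k, beta) (gamma) prior on lam, the
    # posterior is Gamma(n + k, beta + sum(exp(c*x) - 1)); under squared error
    # loss the Bayes estimator is the posterior mean.
    t = np.sum(np.exp(c * x) - 1.0)
    return (len(x) + k) / (beta + t)

true_lam, c = 1.5, 0.8          # hypothetical true scale and known shape
k, beta = 2, 1.0                # hypothetical Erlang prior hyper-parameters
reps, n = 2000, 50              # Monte Carlo replications and sample size

est = np.array([bayes_scale_self(rgompertz(n, true_lam, c, rng), c, k, beta)
                for _ in range(reps)])
mse = np.mean((est - true_lam) ** 2)
print(f"mean estimate = {est.mean():.4f}, MSE = {mse:.5f}")
```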
This research examines the problem of economic development in underdeveloped countries in an unconventional way. It does not merely review the problems of economic development; it also illustrates them philosophically on the basis of the ideas of modernity, that is, by connecting the absence of the values of modernity with the failure of development in underdeveloped countries. The research follows the descriptive approach and pursues its goal through four main axes: the first axis clarifies modernity and its principles, the second axis clarifies economic development, and the third axis covers the features of the mod
Analysis of Iraq's Trade Partnerships, 2003-2013
Simulation experiments are a means of problem solving in many fields. Simulation is the process of designing a model of a real system in order to follow it and identify its behaviour through particular models and formulas, written as a repeated software procedure with a number of iterations. The aim of this study is to build a model that deals with behaviour suffering from heteroskedasticity by studying the APGARCH and NAGARCH models using Gaussian and non-Gaussian distributions for different sample sizes (500, 1000, 1500, 2000), through the stages of time series analysis (identification, estimation, diagnostic checking and prediction). The data were generated using the estimates of the parameters resulting from
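For concreteness, a minimal Python sketch of generating data from an APGARCH(1,1) process with Gaussian innovations is shown below; the parameter values are arbitrary placeholders, and the sketch does not cover the NAGARCH case, the non-Gaussian distributions, or the estimation and diagnostic stages described in the study.

```python
import numpy as np

def simulate_apgarch(n, omega, alpha, gamma, beta, delta, rng):
    """Simulate an APGARCH(1,1) series with Gaussian innovations:
       sigma_t^delta = omega + alpha*(|e_{t-1}| - gamma*e_{t-1})^delta
                             + beta*sigma_{t-1}^delta,   e_t = sigma_t * z_t."""
    z = rng.standard_normal(n)
    sigma_d = np.empty(n)                  # holds sigma_t^delta
    eps = np.empty(n)
    sigma_d[0] = omega / (1.0 - beta)      # crude start value
    eps[0] = sigma_d[0] ** (1.0 / delta) * z[0]
    for t in range(1, n):
        sigma_d[t] = (omega
                      + alpha * (abs(eps[t - 1]) - gamma * eps[t - 1]) ** delta
                      + beta * sigma_d[t - 1])
        eps[t] = sigma_d[t] ** (1.0 / delta) * z[t]
    return eps

rng = np.random.default_rng(1)
series = simulate_apgarch(1000, omega=0.05, alpha=0.1, gamma=0.3,
                          beta=0.85, delta=1.5, rng=rng)
print(series[:5])
```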
Metal cutting processes still represent the largest class of manufacturing operations, and turning is the most commonly employed material removal process. This research focuses on the analysis of the thermal field of the oblique machining process. The finite element method (FEM) software DEFORM 3D V10.2 was used together with experimental work carried out with infrared imaging equipment, covering both the hardware and software sides of the simulations. The thermal experiments were conducted on AA6063-T6 using different tool obliquities, cutting speeds and feed rates. The results show that the temperature decreases relatively as tool obliquity increases at different cutting speeds and feed rates; also, it
Recurrent strokes can be devastating, often resulting in severe disability or death. However, nearly 90% of the causes of recurrent stroke are modifiable, which means recurrent strokes can be averted by controlling risk factors that are mainly behavioural and metabolic in nature. Previous work therefore suggests that a recurrent-stroke prediction model could help minimise the possibility of a recurrent stroke. Previous works have shown promising results in predicting first-time stroke cases with machine learning approaches; however, there are limited works on recurrent stroke prediction using machine learning methods. Hence, this work proposes an empirical analysis investigating machine learning algorithms
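A minimal sketch of the kind of empirical comparison of classifiers described above is given below, using scikit-learn on synthetic data; the feature set, the models compared, and the evaluation metric are assumptions for illustration, not the dataset or algorithm list used in this work.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for behavioural/metabolic risk-factor features with an
# imbalanced outcome, mimicking the rarity of recurrent-stroke cases.
X, y = make_classification(n_samples=500, n_features=10, weights=[0.8, 0.2],
                           random_state=0)

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

# Compare the candidate models with 5-fold cross-validated ROC-AUC.
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean ROC-AUC = {auc.mean():.3f}")
```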
This paper proposes a new method for functional non-parametric regression data analysis with conditional expectation in the case where the covariates are functional, and Principal Component Analysis is utilised to de-correlate the multivariate response variables. The Nadaraya-Watson (k-nearest neighbour, KNN) estimator is used for prediction with different types of semi-metrics (based on the second derivative and on Functional Principal Component Analysis (FPCA)) for measuring the closeness between curves. Root Mean Square Error is used for the evaluation of this model, which is then compared to the independent response method. The R program is used for analysing the data. Then, when the cov
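Although the paper uses R, a compact Python sketch of a kNN-type functional Nadaraya-Watson estimator with a second-derivative-based semi-metric is given below for illustration; the discretisation grid, kernel, bandwidth choice and toy data are all assumptions, and the FPCA-based semi-metric and the PCA de-correlation of multivariate responses are not reproduced here.

```python
import numpy as np

def semimetric_d2(x1, x2):
    # L2 distance between discrete second differences of two curves, a simple
    # proxy for a second-derivative-based semi-metric.
    d1, d2 = np.diff(x1, n=2), np.diff(x2, n=2)
    return np.sqrt(np.sum((d1 - d2) ** 2))

def knn_nw_predict(X_train, y_train, x_new, k=10):
    # Kernel-weighted mean of the responses of the k nearest training curves.
    d = np.array([semimetric_d2(x_new, xi) for xi in X_train])
    idx = np.argsort(d)[:k]                      # k nearest curves
    h = d[idx].max() + 1e-12                     # local (kNN) bandwidth
    w = np.exp(-(d[idx] / h) ** 2)               # Gaussian-type kernel weights
    return np.sum(w * y_train[idx]) / np.sum(w)

# Toy example: simulated sine curves on a 50-point grid with a scalar response.
rng = np.random.default_rng(2)
grid = np.linspace(0, 1, 50)
X = np.array([np.sin(2 * np.pi * a * grid) for a in rng.uniform(1, 3, 100)])
y = X[:, -1] + 0.1 * rng.standard_normal(100)
print(knn_nw_predict(X[:-1], y[:-1], X[-1], k=10), y[-1])
```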