This study develops a recommendation-engine methodology for bug triage, aiming to improve the model's effectiveness and efficiency. Such a model proposes a small set of developers with the required skills and expertise to address and resolve a given bug report. Software engineers are responsible for managing the collections of reports within bug repositories when addressing specific defects, and identifying the optimal allocation of personnel to these tasks is challenging, since software defects can demand a substantial workforce of developers. The purpose of this analysis is to examine new scientific methodologies that improve comprehension of the results. Developer priorities are also discussed, especially their utility in allocating a problem to a specific developer. The analysis covers two key areas: first, the development of a model representing developer prioritization within the bug repository, and second, the use of hybrid machine learning techniques to select bug reports. We use this model to support developer-assignment responsibilities, taking the developers' backgrounds into account and drawing on their established knowledge and experience when formulating the pertinent objectives. The study also examines two individuals' experiences with software defects and how their actions affected their rankings as developers in a software project. Developer categorization techniques, severity assessment, and bug reopening are applied, and a suitable number of bug reports is used to evaluate the model's output. The resulting developer bug-assignment model enables the program to address software maintenance issues with a highest accuracy of 78.38%. The best engine performance was achieved by cleansing and optimizing the data, selecting relevant attributes, and processing them with deep learning.
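The core assignment idea can be illustrated with a minimal sketch: rank candidate developers by how strongly a new bug report's terms overlap with the reports each developer has previously resolved. This is a hedged toy illustration only; the function names, the scoring rule, and the sample data below are assumptions for exposition, not the paper's hybrid deep-learning model.

```python
from collections import Counter

def tokenize(text):
    """Lowercase a report and split it into simple word tokens."""
    return [w.strip(".,:;()").lower() for w in text.split() if w.strip(".,:;()")]

def rank_developers(bug_report, developer_history, top_k=3):
    """Rank developers by multiset term overlap between a new bug
    report and the reports each developer resolved before.
    `developer_history` maps developer name -> list of past reports."""
    bug_terms = Counter(tokenize(bug_report))
    scores = {}
    for dev, past_reports in developer_history.items():
        dev_terms = Counter()
        for report in past_reports:
            dev_terms.update(tokenize(report))
        # Score = shared term occurrences (Counter intersection).
        scores[dev] = sum((bug_terms & dev_terms).values())
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# Hypothetical repository history, for illustration only.
history = {
    "alice": ["null pointer crash in parser", "parser fails on empty input"],
    "bob": ["UI button misaligned", "dark theme colors wrong"],
}
print(rank_developers("crash when parser reads empty file", history, top_k=1))
# → ['alice']
```

A production triage model would replace the overlap score with learned features (severity, reopening history, developer priority), which is the gap the paper's hybrid approach addresses.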
A new approach for estimating the baud time (the reciprocal of the baud rate) of a random binary signal is presented. This approach uses the spectrum of the signal after nonlinear processing, so that the estimation error can be reduced simply by increasing the number of processed samples rather than increasing the sampling rate. The spectrum of the new signal is shown to give an accurate estimate of the baud time when there is no a priori information and no restricting assumptions. The performance of the estimator for random binary square waves perturbed by white Gaussian noise and intersymbol interference (ISI) is evaluated and compared with that of the conventional zero-crossing-detector estimator.
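For context, the conventional zero-crossing estimator the paper compares against can be sketched as follows: in the noise-free case, the shortest interval between sign transitions approximates one baud interval. The signal generator, seeds, and parameters below are illustrative assumptions, not the paper's experimental setup.

```python
import random

def make_binary_wave(n_symbols, samples_per_symbol, seed=0):
    """Generate a random ±1 binary square wave sampled at
    `samples_per_symbol` samples per baud interval (noise-free)."""
    rng = random.Random(seed)
    wave = []
    for _ in range(n_symbols):
        level = rng.choice([-1.0, 1.0])
        wave.extend([level] * samples_per_symbol)
    return wave

def baud_time_zero_crossing(wave):
    """Conventional estimate: the minimum spacing between sign
    changes approximates the baud time, in samples."""
    crossings = [i for i in range(1, len(wave)) if wave[i] != wave[i - 1]]
    intervals = [b - a for a, b in zip(crossings, crossings[1:])]
    return min(intervals) if intervals else None

wave = make_binary_wave(200, samples_per_symbol=8, seed=1)
print(baud_time_zero_crossing(wave))  # → 8 in this noise-free case
```

The weakness this exposes is the abstract's motivation: the zero-crossing estimate is quantized to the sampling grid and degrades quickly under noise and ISI, whereas a spectral method can trade more samples for lower error.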
This paper demonstrates a new technique that combines a new transform method with the homotopy perturbation method to find accurate solutions of autonomous equations with initial conditions. The technique is called the transform homotopy perturbation method (THPM), and it can solve such problems without resorting to the frequency domain. The implementation of the suggested method demonstrates its usefulness in finding exact solutions for linear and nonlinear problems. The practical results show that the technique is efficient and reliable, and easier to implement than HPM, in finding exact solutions. Finally, all algorithms in this paper were implemented in MATLAB version 7.12.
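As background, the homotopy construction underlying HPM (and hence THPM) can be written in its standard textbook form; this is the generic construction, not the paper's specific transform pairing:

```latex
% Standard HPM homotopy for A(u) - f(r) = 0, with A = L + N
% (L linear, N nonlinear) and initial guess u_0:
H(v, p) = (1 - p)\,\bigl[L(v) - L(u_0)\bigr] + p\,\bigl[A(v) - f(r)\bigr] = 0,
\qquad p \in [0, 1].
% Expand v in powers of the embedding parameter p:
v = v_0 + p\,v_1 + p^2 v_2 + \cdots
% The approximate solution is recovered in the limit p \to 1:
u = \lim_{p \to 1} v = v_0 + v_1 + v_2 + \cdots
```

At $p = 0$ the homotopy reduces to the easily solved linear problem $L(v) = L(u_0)$, and at $p = 1$ it recovers the original equation; THPM applies a transform before this deformation step.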
The operation and management of water resources projects have direct and significant effects on the optimal use of water. Artificial intelligence techniques are a new tool that helps in making optimized, knowledge-base-driven decisions in the planning, implementation, operation, and management of projects, as well as in controlling flowing water quantities to prevent flooding and in storing excess water for use during drought.
In this research, an Expert System was designed for operating and managing the system of AthTharthar Lake (ESSTAR). It was applied to all expected flow conditions, including drought, normal flow, and floods. Moreover, the cases of hypothetical op
This paper compares denoising techniques using a statistical approach: principal component analysis with local pixel grouping (PCA-LPG), in which the procedure is iterated a second time to further improve denoising performance, alongside other enhancement filters. These include an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter of the noisy input image, in which each output pixel contains the median value of the M-by-N neighborhood around the corresponding input pixel; a Gaussian low-pass filter; and an order-statistic filter. Experimental results show that the LPG-PCA method
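One of the baseline filters in the comparison, the M-by-N median filter, can be sketched as follows. This is a minimal pure-Python illustration with clamped borders; the border-handling choice and sample data are assumptions, not the paper's implementation.

```python
def median_filter(image, m=3, n=3):
    """Apply an M-by-N median filter: each output pixel is the median
    of its neighborhood, with out-of-range indices clamped to the edge."""
    rows, cols = len(image), len(image[0])
    half_m, half_n = m // 2, n // 2
    out = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            neigh = []
            for di in range(-half_m, half_m + 1):
                for dj in range(-half_n, half_n + 1):
                    r = min(max(i + di, 0), rows - 1)  # clamp row at border
                    c = min(max(j + dj, 0), cols - 1)  # clamp column at border
                    neigh.append(image[r][c])
            neigh.sort()
            out[i][j] = neigh[len(neigh) // 2]
    return out

noisy = [
    [10, 10, 10],
    [10, 255, 10],   # a single impulse-noise spike
    [10, 10, 10],
]
print(median_filter(noisy))  # the 255 spike is replaced by 10
```

The example shows why median filtering suits impulse noise: a lone outlier never reaches the middle of the sorted neighborhood, so it is removed without blurring the background level the way a mean or Gaussian filter would.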
This research studies the effect of surface modification and copper (Cu) plating of the carbon fiber (CF) surface on the thermal stability and wettability of carbon fiber (CF)/epoxy (EP) composites. The TGA results indicate that the thermal stability of the carbon fiber may be enhanced after Cu coating of the CF. The TGA curve shows that the treatment temperature enhanced the thermal stability of EP/CF, which is attributed to oxidation during heating. The Cu plating also increased the thermal conductivity; this increase may be due to reduced contact resistance at the interface, resulting from the chemical modification and copper plating, and to reduced tunneling resistance.
The increase in surface polarity after coating causes a decreas
In this paper, three main generators are discussed: the linear generator, the Geffe generator, and the Bruer generator. The Geffe and Bruer generators are improved; the autocorrelation postulate of the randomness tests is then calculated for each generator and the obtained results are compared. These properties can be measured deterministically and then compared with statistical expectations using a chi-square test.
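The classical (unimproved) Geffe generator combines three LFSR streams, with the first stream selecting between the other two, and its output can be checked against the autocorrelation postulate. This is a hedged minimal sketch: the tap positions, seeds, and stream lengths below are illustrative assumptions, not the paper's improved variants.

```python
def lfsr_stream(seed, taps, length):
    """Fibonacci LFSR: emit `length` bits from register `seed`
    (list of 0/1 bits), XORing the `taps` positions as feedback."""
    state = list(seed)
    out = []
    for _ in range(length):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
    return out

def geffe(seed1, seed2, seed3, length):
    """Geffe combiner: x1 selects between x2 and x3,
    z = (x1 AND x2) XOR (NOT x1 AND x3)."""
    x1 = lfsr_stream(seed1, (0, 2), length)  # example taps, chosen for illustration
    x2 = lfsr_stream(seed2, (0, 3), length)
    x3 = lfsr_stream(seed3, (0, 4), length)
    return [(a & b) ^ ((1 - a) & c) for a, b, c in zip(x1, x2, x3)]

def autocorrelation(bits, shift):
    """Agreements minus disagreements between the sequence and its
    shifted copy, normalized; values near 0 satisfy the postulate."""
    n = len(bits) - shift
    agree = sum(1 for i in range(n) if bits[i] == bits[i + shift])
    return (2 * agree - n) / n

keystream = geffe([1, 0, 1], [1, 0, 0, 1], [0, 1, 1, 0, 1], 200)
print(autocorrelation(keystream, 1))
```

In a full evaluation, the observed agreement counts at each shift would feed a chi-square test against the expected 50/50 split, as the abstract describes.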
The individual's signature is one of the most popular and legally recognized behavioral biometrics, used for verification and identification in many different industries, including business, law, and finance. The purpose of signature verification is to distinguish genuine from forged signatures, a task complicated by cultural and personal variance. In forensic handwriting analysis, handwriting features are analyzed, compared, and evaluated to establish whether or not the writing was produced by a known writer. In contrast to other languages, Arabic makes use of diacritics, ligatures, and overlaps that are unique to it. Due to the absence of dynamic information in the writing of Arabic signatures,
Atmospheric transmission is disturbed by scintillation, which increases beam divergence. In this work, the target image spot radius was calculated in the presence of atmospheric scintillation. The calculation depends on a few relevant equations based on atmospheric parameters (for the Middle East), the tracking range, the expansion ratio of the applied beam expanders, the receiving-unit lens F-number, and the laser wavelength, besides photodetector parameters. At the maximum target range Rmax = 20 km, the target image radius is at its maximum, Rs = 0.4 mm. As the range decreases, the spot radius decreases too, until the range reaches a limit (4 km) at which the target image spot radius is at its minimum value (0.22 mm). Then, as the range decreases further, the spot radius increases due to geom
Energy saving is a central concern in IoT sensor networks because IoT sensor nodes operate on their own limited batteries. Data transmission in IoT sensor nodes is very costly and consumes much of the energy, while the energy used for data processing is considerably lower. Several energy-saving strategies and principles exist, mainly dedicated to reducing data transmission; therefore, minimizing data transfers in IoT sensor networks can conserve a considerable amount of energy. In this research, a Compression-Based Data Reduction (CBDR) technique is suggested, which works at the level of the IoT sensor nodes. The CBDR includes two stages of compression: a lossy SAX quantization stage, which reduces the dynamic range of the
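The lossy SAX quantization stage can be sketched as follows: z-normalize a window of readings, then map each value to a small symbol alphabet via fixed breakpoints taken from the standard SAX lookup table. This is a hedged illustration of generic SAX quantization; the 4-symbol alphabet, window handling, and sample readings are assumptions, not the CBDR parameters.

```python
import statistics

# Breakpoints for a 4-symbol SAX alphabet that split a standard
# normal distribution into equiprobable regions (standard SAX table).
BREAKPOINTS = [-0.6745, 0.0, 0.6745]
ALPHABET = "abcd"

def sax_quantize(samples):
    """Lossy SAX-style quantization: z-normalize the window, then map
    each sample to a symbol by counting breakpoints it exceeds."""
    mu = statistics.mean(samples)
    sigma = statistics.pstdev(samples) or 1.0  # guard constant windows
    symbols = []
    for x in samples:
        z = (x - mu) / sigma
        idx = sum(1 for b in BREAKPOINTS if z > b)  # breakpoint bin index
        symbols.append(ALPHABET[idx])
    return "".join(symbols)

# Hypothetical sensor window, e.g. temperature readings.
readings = [20.1, 20.3, 25.7, 30.2, 30.1, 20.0]
print(sax_quantize(readings))  # → "aacdda"
```

Replacing raw floats with a few-symbol string shrinks the dynamic range before the second (lossless) compression stage, which is what lets the node transmit far fewer bits.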