Classifying overlapping objects is one of the main challenges faced by researchers working in object detection and recognition. Most existing algorithms can only classify or recognize objects that are individually separated from each other, or a single object in a scene, but not overlapping kitchen utensils. In this project, the Faster R-CNN and YOLOv5 algorithms were applied to detect and classify overlapping objects in a kitchen area, relying on the filters (kernels) in the dedicated layers of each model to separate the overlapping objects. A kitchen utensil benchmark image database, together with images of overlapping kitchen utensils collected from the internet, served as the benchmark dataset. The evaluation and training/validation sets were set at 20% and 80%, respectively. This project evaluated the performance of the two techniques and analyzed their strengths and speeds based on accuracy, precision, and F1 score. The analysis concluded that YOLOv5 produces more accurate bounding boxes, whereas Faster R-CNN detects more objects. In an identical testing environment, YOLOv5 showed better performance than Faster R-CNN: the accuracy was 0.8912 (89.12%) for YOLOv5 and 0.8392 (83.92%) for Faster R-CNN, while the loss value was 0.1852 for YOLOv5 and 0.2166 for Faster R-CNN. This comparison of the two methods on overlapping objects, especially kitchen utensils, is the most current of its kind and has not been made before.
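Precision and F1 are computed from true/false positive counts, and a detection usually counts as a true positive when its intersection-over-union (IoU) with a ground-truth box passes a threshold. A minimal sketch of these metrics (the boxes and counts below are hypothetical illustrations, not this project's results):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def precision_recall_f1(tp, fp, fn):
    """Standard detection metrics from matched-detection counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Two overlapping unit-offset boxes: intersection area 1, union area 7.
overlap = iou((0, 0, 2, 2), (1, 1, 3, 3))
p, r, f1 = precision_recall_f1(tp=85, fp=10, fn=15)  # illustrative counts
```

The same counting applies to both detectors, which is what makes the accuracy and F1 comparison between YOLOv5 and Faster R-CNN meaningful.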
Abstract:
Time series models often suffer from outliers that accompany the data collection process for many reasons, and their presence may have a significant impact on the estimation of the parameters of the studied model. Obtaining highly efficient estimators is one of the most important stages of statistical analysis, so it is important to choose appropriate methods for obtaining good estimators. The aim of this research is to compare the ordinary estimators and the robust estimators for the estimation of the parameters of
Regression models are among the most important models used in modern studies, especially research and health studies, because of the important results they achieve. Two regression models were used: the Poisson regression model and the Conway-Maxwell-Poisson regression model. This study aimed to compare the two models and choose the better of them using the simulation method, at different sample sizes (n = 25, 50, 100) and with replications (r = 1000). The Matlab program was adopted to conduct the simulation experiment. The results showed the superiority of the Poisson model under the mean squared error (MSE) criterion and also under the Akaike information criterion (AIC) for the same distribution.
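The two comparison criteria are straightforward to reproduce in a Monte Carlo design. A hedged sketch for the plain Poisson case only (the Conway-Maxwell-Poisson likelihood requires its normalizing constant and is omitted here), with illustrative settings rather than the study's own:

```python
import math

import numpy as np

rng = np.random.default_rng(0)
lam_true, n, r = 4.0, 100, 1000  # true mean, sample size, replications (illustrative)

# Monte Carlo: r Poisson samples of size n; the MLE of the mean is the sample mean.
samples = rng.poisson(lam_true, size=(r, n))
estimates = samples.mean(axis=1)
mse = float(((estimates - lam_true) ** 2).mean())  # expected to be near lam_true / n

def poisson_aic(x, lam):
    """AIC = 2k - 2*log-likelihood, with k = 1 estimated parameter here."""
    loglik = sum(xi * math.log(lam) - lam - math.lgamma(xi + 1) for xi in x)
    return 2 * 1 - 2 * loglik

aic = poisson_aic(samples[0], samples[0].mean())
```

Averaging the MSE and AIC over the r replications, as above, is the standard way such simulation comparisons rank competing models.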
Multilevel models are among the models most widely used in the application and analysis of data in which observations take a hierarchical form. In our research we examined the multilevel logistic regression model (random intercept and random slope model). The importance of the research lies in the fact that the usual regression models calculate only the total variance of the model and cannot capture the variation between levels; for such data the total variance alone is inaccurate, so multilevel regression models calculate the variation at each level of the model. The research aims to estimate the parameters of this model
The spread of the novel coronavirus disease (COVID-19) has resulted in chaos around the globe. The number of infected cases is still increasing, with many countries still showing a trend of growing daily cases. To forecast the trend of active cases, a mathematical model, namely the SIR model, was used to visualize the spread of COVID-19. In this article, the spread of the virus in Malaysia is forecast, assuming that all Malaysians will eventually be susceptible. With no vaccine or antiviral drug yet developed, the model visualizes how the peak of infection can be reduced (namely, flattening the curve) to minimize the effect of the COVID-19 disease. For Malaysians, let's ensure we follow the rules and obey the SOP to lower the
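The SIR dynamics (dS/dt = -βSI/N, dI/dt = βSI/N - γI, dR/dt = γI) can be integrated with a simple Euler step. A minimal sketch; the parameter values below are illustrative, not those fitted for Malaysia:

```python
def simulate_sir(s0, i0, r0, beta, gamma, days, dt=0.1):
    """Euler integration of the SIR model; returns the (S, I, R) trajectory."""
    s, i, r = float(s0), float(i0), float(r0)
    n = s + i + r  # total population, conserved by the dynamics
    history = [(s, i, r)]
    for _ in range(int(days / dt)):
        new_infections = beta * s * i / n * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# Illustrative run: basic reproduction number beta/gamma = 3.
hist = simulate_sir(s0=999, i0=1, r0=0, beta=0.3, gamma=0.1, days=200)
peak_infected = max(i for _, i, _ in hist)
```

Lowering beta (e.g. through movement restrictions and SOP compliance) in such a run lowers and delays `peak_infected`, which is exactly the "flattening the curve" effect the article visualizes.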
The logistic regression model is regarded as one of the most important regression models, and it has been among the most interesting subjects in recent studies as it takes an increasingly advanced role in statistical analysis.
The ordinary estimation methods fail in dealing with data that contain outlier values, and such values have an undesirable effect on the results.
Abstract:
This study is concerned with the estimation of constant and time-varying parameters in non-linear ordinary differential equations that do not have analytical solutions. The estimation is done in a multi-stage method in which constant and time-varying parameters are estimated sequentially over several stages. In the first stage, the differential equation model is converted into a regression model that includes the state variables and their derivatives; the state variables and their derivatives are then estimated by a penalized splines method, and these estimates are substituted into the regression model. In the second stage, the pseudo-least squares method was used to es
The area of character recognition has received considerable attention from researchers all over the world during the last three decades. This research explores the best sets of feature extraction techniques and studies the accuracy of well-known classifiers for Arabic numerals using statistical methods, in two approaches, and makes a comparative study between them. The first approach, a linear discriminant function, yields results with accuracy as high as 90% of the original grouped cases correctly classified. In the second approach, we propose an algorithm; the results show the efficiency of the proposed algorithms, which achieve recognition accuracies of 92.9% and 91.4%, providing more efficiency than the first approach.
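For intuition about the first approach, a two-class Fisher linear discriminant can be sketched in a few lines. The synthetic Gaussian clusters here are a stand-in for the numeral feature vectors, which the abstract does not publish:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-in for two digit classes: well-separated 2-D Gaussian clusters.
X0 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(100, 2))
X1 = rng.normal(loc=[3.0, 3.0], scale=1.0, size=(100, 2))

m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
# Pooled within-class scatter matrix.
Sw = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
w = np.linalg.solve(Sw, m1 - m0)   # Fisher discriminant direction
threshold = w @ (m0 + m1) / 2      # midpoint decision rule

# Training accuracy of the linear decision rule on the two classes.
accuracy = 0.5 * ((X0 @ w < threshold).mean() + (X1 @ w > threshold).mean())
```

The "percent of original grouped cases correctly classified" figure quoted in the abstract is exactly this kind of resubstitution accuracy of a linear discriminant rule.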
We have presented the exponentiated expanded power function (EEPF) distribution with four parameters. This distribution was created by the exponentiated expansion method introduced by Gupta, which expands a distribution by adding a new shape parameter to its cumulative function, resulting in a new distribution; the method is characterized by producing a distribution that belongs to the exponential family. We also obtained the survival function and failure rate function for this distribution, and some mathematical properties were derived. We then used the maximum likelihood (ML) method and the developed least squares method (LSD)
Researchers have shown increased interest in recent years in determining the optimum sample size needed to obtain sufficient accuracy in estimation and high-precision parameters, in order to evaluate a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is estimated, for the sample size given by each method, in high-dimensional data using artificial intelligence, namely an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data
In 2020, one of the researchers in this paper, in his first research, derived the Modified Weighted Pareto Distribution of Type I by using the Azzalini method for weighted distributions; it contains three parameters, two of them for scale and the third for shape. This research compared the distribution with two other distributions from the same family, the Standard Pareto Distribution of Type I and the Generalized Pareto Distribution, using the maximum likelihood estimator derived by the researchers for the Modified Weighted Pareto Distribution of Type I. The Monte Carlo method, one of the simulation approaches for generating random sample data, was then used with different sample sizes (n = 10, 30, 50) and in di
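For the standard Pareto Type I member of this family, the maximum likelihood estimator of the shape parameter has the closed form α̂ = n / Σ ln(x_i / x_m), which makes the Monte Carlo design easy to sketch. Values below are illustrative, and the modified weighted variant's estimator is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(2)
alpha_true, x_m, n = 3.0, 1.0, 5000  # shape, scale (known), sample size (illustrative)

# Inverse-CDF sampling from Pareto Type I: F^{-1}(u) = x_m * u**(-1/alpha).
u = rng.random(n)
x = x_m * u ** (-1.0 / alpha_true)

# Closed-form MLE of the shape parameter, with x_m assumed known.
alpha_hat = float(n / np.log(x / x_m).sum())
```

Repeating this generation and estimation over many replications at each sample size, then comparing the estimators' errors, is the Monte Carlo procedure the abstract describes.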