In this work, the authors design efficient neural networks based on modified Levenberg-Marquardt (LM) training algorithms to solve nonlinear fourth-order three-dimensional partial differential equations of two kinds: periodic and non-periodic. The proposed design employs a feed-forward approach to solve such equations by converting the original problem into a minimization problem. Efficient design is achieved through a calculated learning parameter that yields high precision. Several examples are provided to clarify the applicability, reliability, and accuracy of the design, and comparisons with other designs were conducted to demonstrate its efficiency.
Software reliability growth models are essential tools for monitoring and evaluating the evolution of software reliability. Many current models treat the software defect detection events that occur during testing and operation as counting processes. However, for large software systems the error detection process should be viewed as a random process with a continuous state space, since the number of faults found during testing is vast and the number of faults corrected by bug fixing changes only insignificantly.
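For context, the standard Levenberg-Marquardt parameter update that modified LM training schemes typically build on is shown below (notation assumed here: θ for the network weights, J the Jacobian of the residual vector r, μ the damping parameter; the paper's modified variant is not specified in this excerpt):

```latex
\theta_{k+1} = \theta_k - \left(J_k^{\top} J_k + \mu_k I\right)^{-1} J_k^{\top} r_k
```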
This study aims to construct models that can forecast trip production in the Al-Karada region of Baghdad city incorporating socioeconomic features, using various statistical approaches to trip-generation modeling, namely artificial neural networks (ANN) and multiple linear regression (MLR). To accomplish this aim, the research region was split into 11 zones. Forms were issued based on the required sample size of 1,170; only 1,050 completed forms were received, giving a response rate of 89.74% for the research region. The collected data were processed using the ANN technique in MATLAB v20. The same database was utilized to
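As a rough illustration of the MLR side of such a comparison (not the authors' actual model; the socioeconomic predictors and data below are hypothetical):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical socioeconomic predictors per household:
# household size, income level, car ownership.
X = np.array([
    [4, 2, 1],
    [6, 3, 2],
    [3, 1, 0],
    [5, 2, 1],
])
y = np.array([7, 12, 4, 9])  # observed daily trips produced (hypothetical)

mlr = LinearRegression().fit(X, y)
print("coefficients:", mlr.coef_, "intercept:", mlr.intercept_)
print("predicted trips:", mlr.predict([[4, 2, 1]]))
```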
Realizing and understanding semantic segmentation is a demanding task not only in computer vision but also in earth-sciences research. Semantic segmentation decomposes compound architectures into single elements: the most common objects in civil outdoor or indoor scenes must be classified and then enriched with the semantic meaning of each object. It is a method for automatically labeling and clustering point clouds. Classifying three-dimensional natural scenes requires a point cloud dataset as the input data representation, and many challenges arise when working with 3D data, such as the small number, low resolution, and limited accuracy of three-dimensional datasets. Deep learning is now the po
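A minimal sketch of the data format implied here, assuming a point cloud stored as an N×3 coordinate array with one semantic label per point (the class names and data are hypothetical):

```python
import numpy as np

# N points, each with (x, y, z) coordinates.
points = np.random.rand(1000, 3).astype(np.float32)

# One semantic label per point; the class ids are hypothetical.
classes = {0: "ground", 1: "building", 2: "vegetation"}
labels = np.random.randint(0, len(classes), size=len(points))

# Split the scene into per-class point sets (the "single elements").
segments = {name: points[labels == cid] for cid, name in classes.items()}
for name, seg in segments.items():
    print(f"{name}: {len(seg)} points")
```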
The study presents a modification of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) update (H-version) based on the determinant property of the inverse of the Hessian matrix (the second derivative of the objective function), via an update of the vector s (the difference between the next solution and the current solution), such that the determinant of the next inverse Hessian equals the determinant of the current inverse Hessian at every iteration. Consequently, the sequence of inverse Hessian matrices generated by the method never approaches a near-singular matrix, so the program never breaks down before the minimum value of the objective function is obtained. Moreover, the new modification of the BFGS update (H-vers
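For reference, the standard BFGS inverse-Hessian (H-version) update that such a modification starts from, with s_k = x_{k+1} - x_k and y_k the corresponding gradient difference:

```latex
H_{k+1} = \left(I - \rho_k s_k y_k^{\top}\right) H_k \left(I - \rho_k y_k s_k^{\top}\right)
          + \rho_k s_k s_k^{\top},
\qquad \rho_k = \frac{1}{y_k^{\top} s_k}
```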
Estimating the semantic similarity between short texts plays an increasingly prominent role in many fields related to text mining and natural language processing, especially with the large increase in the volume of textual data produced daily. Traditional approaches that calculate the degree of similarity between two texts based on the words they share do not perform well on short texts, because two similar texts may be written in different terms through the use of synonyms. As a result, short texts should be compared semantically. This paper presents a semantic similarity measurement method between texts that combines knowledge-based and corpus-based semantic information to build a semantic network that repre
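A toy sketch of combining the two information sources (not the paper's method): WordNet path similarity as the knowledge-based signal and cosine similarity of word vectors as the corpus-based one, mixed with an assumed weight alpha.

```python
import numpy as np
from nltk.corpus import wordnet as wn  # requires: nltk.download("wordnet")

def knowledge_sim(w1, w2):
    """Knowledge-based: best WordNet path similarity over all sense pairs."""
    scores = [s1.path_similarity(s2) or 0.0
              for s1 in wn.synsets(w1) for s2 in wn.synsets(w2)]
    return max(scores, default=0.0)

def corpus_sim(v1, v2):
    """Corpus-based: cosine similarity of word vectors (any embedding)."""
    return float(np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2)))

def combined_sim(w1, w2, v1, v2, alpha=0.5):
    # alpha is an assumed mixing weight, chosen here for illustration only
    return alpha * knowledge_sim(w1, w2) + (1 - alpha) * corpus_sim(v1, v2)
```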
Until recently, researchers have applied various techniques to intrusion detection systems (IDS), including DNA encoding and clustering, which are widely used for this purpose. The other two major detection techniques are anomaly detection and misuse detection: anomaly detection is based on user behavior, while misuse detection is based on the signatures of known attacks. Both techniques, however, have drawbacks such as a high false-alarm rate, so a hybrid IDS combines the strengths of both to overcome their limitations. In this paper, a hybrid IDS is proposed based on DNA encoding and a clustering method. The proposed DNA encoding is done based on the UNSW-NB15
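One common way to realize a "DNA encoding" of network features (this mapping is illustrative, not the paper's scheme) is to binarize each feature and translate every two bits into a nucleotide:

```python
# Hypothetical encoding: 2 bits -> 1 nucleotide.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}

def dna_encode(value: int, width: int = 8) -> str:
    """Encode an integer feature as a nucleotide string."""
    bits = format(value, f"0{width}b")
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, width, 2))

# Example: encode a few (hypothetical) numeric flow features.
record = [80, 6, 255]  # e.g., dst port mod 256, protocol, TTL
print("".join(dna_encode(v) for v in record))
```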
The bi-level programming problem is to minimize or maximize an objective function that contains another objective function within its constraints. This problem has received a great deal of attention in the programming community owing to the proliferation of applications and the use of evolutionary algorithms for this kind of problem. Two non-linear bi-level programming methods are used in this paper. The goal is to reach the optimal solution by simulation, using the Monte Carlo method with different small and large sample sizes. The research found that the branch-and-bound algorithm was preferable for solving the non-linear bi-level programming problem because it produced better results.
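A toy Monte Carlo search for a bi-level problem, under assumptions chosen only for illustration: the leader picks x to minimize F(x, y*(x)), where y*(x) is the follower's best response, approximated here by random sampling at both levels.

```python
import random

def follower_best_response(x, n_samples=500):
    """Inner level: approximate y*(x) = argmin_y (y - x)**2 over y in [0, 2]."""
    best_y, best_f = None, float("inf")
    for _ in range(n_samples):
        y = random.uniform(0.0, 2.0)
        f = (y - x) ** 2
        if f < best_f:
            best_y, best_f = y, f
    return best_y

def leader_objective(x, y):
    return (x - 1) ** 2 + (y - 1) ** 2  # hypothetical upper-level objective

best_x, best_F = None, float("inf")
for _ in range(200):  # outer Monte Carlo samples
    x = random.uniform(0.0, 2.0)
    y = follower_best_response(x)
    F = leader_objective(x, y)
    if F < best_F:
        best_x, best_F = x, F
print(f"approx. leader optimum: x={best_x:.3f}, F={best_F:.4f}")
```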
With the development of cloud computing in recent years, data center networks have become a major topic in both industrial and academic communities. Nevertheless, traditional methods based on manual configuration and hardware devices are burdensome, expensive, and cannot fully utilize the capability of the physical network infrastructure. Thus, Software-Defined Networking (SDN) has been hailed as one of the most promising solutions for future Internet performance. SDN is notable for two features: the separation of the control plane from the data plane, and support for network development through programmable capabilities instead of hardware solutions. The current paper introduces an SDN-based optimized Resch
The aim of this paper is to use a single-index model to develop and adjust the Fama-MacBeth approach. The penalized smoothing-spline regression technique (SIMPLS) provides this adjustment. Two generalized cross-validation techniques, Generalized Cross Validation Grid (GGCV) and Generalized Cross Validation Fast (FGCV), were used to select the smoothing parameter under this technique. Owing to the two-step nature of the Fama-MacBeth model, the estimation produced four estimators: SIMPLS(FGCV)-SIMPLS(FGCV), SIMPLS(FGCV)-SIMPLS(GGCV), SIMPLS(GGCV)-SIMPLS(FGCV), and SIMPLS(GGCV)-SIMPLS(GGCV). The three-factor Fama-French model (market risk premium, size factor, and value factor) and its implication for excess stock returns and portfolio return
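For reference, the standard Fama-French three-factor specification behind those estimates, with R_it - R_ft the excess return, SMB the size factor, and HML the value factor:

```latex
R_{it} - R_{ft} = \alpha_i + \beta_i \,(R_{mt} - R_{ft})
                + s_i \,\mathrm{SMB}_t + h_i \,\mathrm{HML}_t + \varepsilon_{it}
```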
Abstract
The research aims to build a training program to develop some executive functions for kindergarten children. To achieve this goal, the two researchers built the program according to the following steps:
1. Determining the general objective of the program.
2. Determining the behavioral objectives of the program.
3. Determining the content included in the program.
4. Implementing the content of the activities of the program.
5. Evaluating the program.
The program included (12) training activities. Each training activity comprised several items: the title of the activity, the time of its implementation, the general objective of the activity, the procedural behavioral objective, the means and tools u