During the COVID-19 pandemic, wearing a mask was mandated in workplaces, departments, and offices worldwide. New deep learning classifications based on convolutional neural networks (CNNs) were proposed to increase the validation accuracy of face mask detection. This work introduces a face mask model that recognizes whether a person is wearing a mask or not. The proposed model detects and recognizes the face mask in two stages: in the first stage, a Haar cascade detector locates the face, while in the second stage, a CNN model built from scratch classifies it. The experiment was applied to the masked faces (MAFA) dataset with RGB images of 160x160 pixels. The model achieves lower computational complexity with fewer layers, while being more reliable than other algorithms applied to recognize face masks. The findings reveal that the model's validation accuracy reaches 97.55% to 98.43% at different learning rates and different feature-vector sizes in the dense layer, the densely connected layer of the proposed CNN. Finally, the suggested model improves recognition performance measures such as precision, recall, and area under the curve (AUC).
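As a rough illustration of how a 160x160 input shrinks through a convolution and pooling stack before reaching the dense layer, the sketch below computes feature-map sizes; the layer count, kernel sizes, and padding are assumptions for illustration, not the paper's exact architecture.

```python
def conv2d_out(size, kernel, stride=1, pad=0):
    """Spatial output size of a convolution on a square input."""
    return (size - kernel + 2 * pad) // stride + 1

def pool_out(size, window, stride=None):
    """Spatial output size of a max-pooling layer."""
    stride = stride or window
    return (size - window) // stride + 1

# Hypothetical stack for a 160x160 RGB input: three conv(3x3, 'same') + pool(2x2) blocks.
size = 160
for _ in range(3):
    size = conv2d_out(size, kernel=3, pad=1)  # 'same' padding keeps the size
    size = pool_out(size, window=2)           # pooling halves the spatial size
print(size)  # spatial side of the final feature map before flattening into the dense layer
```

Each block halves the spatial side (160 → 80 → 40 → 20), so the flattened feature vector fed to the dense layer scales with the last map's size times its channel count.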
After 2003, Iraq faced new challenges: the predominance of sectarian, hateful, and extremist discourses at the expense of moderate political discourse, and the predominance of sub-affiliations and external agendas at the expense of national affiliation, which created an unsafe and unstable environment dominated by violence and terrorism. Moderate discourse would fuse sub-affiliations into one melting pot in which the first loyalty is to the homeland rather than to the tribe, party, or sect, and this in turn would promote peaceful coexistence among the various sub-affiliations within the framework of one communal structure.
The audience is one of the important practical elements of a theatrical show, and its importance is not confined to its static role as a mere receiver; it goes beyond that to act as an effective and influential element in the proceedings of the show and in the construction of meaning. The audience gains an active role in constructing and producing connotation, influencing and being influenced by the actor; communication channels are open between the two sides, so a kind of joint watching and interaction arises between them. Thus, it has become necessary for the actor to create a suitable environment for the onlookers so that they become an essential part of the show system.
Within this work, to promote the efficiency of organic-based solar cells, a series of novel A-π-D type small molecules were scrutinised. The designed molecules had an N,N-dimethylaniline moiety as the donor and a catechol moiety as the acceptor, linked through various conjugated π-linkers. We performed DFT (B3LYP) and TD-DFT (CAM-B3LYP) computations using the 6-31G(d,p) basis set to scrutinise the impact of the various π-linkers on optoelectronic characteristics, stability, and charge-transport rate. Compared with the reference molecule, the various π-linkers led to a smaller HOMO–LUMO energy gap, and there was a considerable red shift in the molecules under study (A1–A4). Therefore, based on
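The reported red shift follows directly from the Planck relation between an excitation energy and its absorption wavelength, λ(nm) ≈ 1239.84 / E(eV). A small numeric illustration; the gap values below are hypothetical, not the computed gaps for A1–A4.

```python
# Planck relation: lambda(nm) ~ 1239.84 / E(eV); a smaller gap red-shifts absorption.
def gap_to_wavelength_nm(gap_ev):
    return 1239.84 / gap_ev

# Hypothetical gaps for a reference molecule vs a modified pi-linker (values illustrative).
print(round(gap_to_wavelength_nm(2.8), 1))  # reference molecule
print(round(gap_to_wavelength_nm(2.2), 1))  # smaller gap -> longer wavelength (red shift)
```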
A robust video-bitrate adaptation scheme on the client side plays a significant role in maintaining a good quality of experience for video streaming. Perceived quality degrades with the amount of time playback stalls because of an empty buffer. Therefore, to keep video streaming continuous under bandwidth fluctuation, a video buffer structure based on adapting the video bitrate is considered in this work. Initially, the video buffer structure is formulated as an optimal control-theoretic problem that combines both video-bitrate and video-buffer feedback signals, while protecting the video buffer occupancy from exceeding the limited operating level can provide continuous video streaming
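The buffer-feedback idea can be sketched with a minimal heuristic controller, not the paper's control-theoretic formulation: pick the highest bitrate whose expected download time keeps the buffer above a reserve. The bitrate ladder, segment length, and reserve level below are assumptions.

```python
# Minimal buffer-feedback bitrate selector (illustrative, not the paper's controller).
def select_bitrate(bitrates_kbps, throughput_kbps, buffer_s, segment_s=4.0, reserve_s=5.0):
    for rate in sorted(bitrates_kbps, reverse=True):
        download_s = segment_s * rate / throughput_kbps  # time to fetch one segment
        # Buffer drains during the download and refills by one segment duration.
        if buffer_s - download_s + segment_s >= reserve_s:
            return rate
    return min(bitrates_kbps)  # fall back to the lowest bitrate to avoid stalling

print(select_bitrate([500, 1500, 3000, 6000], throughput_kbps=2000, buffer_s=8.0))
```

With an 8-second buffer and 2 Mbps throughput, the selector skips 6000 kbps (which would drain the buffer below the reserve) and settles on 3000 kbps.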
In combinatorial testing development, the construction of covering arrays is a key challenge because of the multiple factors that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining a greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic-based methods deal with tuples that may be left uncovered after the greedy stage; the metaheuristic algorithm then drives the result toward a near-optimal solution. As a result, using both greedy and HC algorithms in a single test-generation system is a good candidate if constructed correctly.
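The hill-climbing repair step can be sketched for pairwise (t = 2) covering arrays: start from a candidate array and flip single cells, keeping moves that do not increase the number of uncovered value pairs. The row count, step budget, and acceptance rule below are assumptions, not the paper's algorithm.

```python
import itertools, random

def uncovered_pairs(array, k, v):
    """Pairwise tuples (col_i, col_j, val_i, val_j) not yet covered by the rows."""
    needed = {(i, j, a, b) for i, j in itertools.combinations(range(k), 2)
              for a in range(v) for b in range(v)}
    covered = {(i, j, row[i], row[j]) for row in array
               for i, j in itertools.combinations(range(k), 2)}
    return needed - covered

def hill_climb(rows, k, v, steps=2000, seed=0):
    """Repair a random array by single-cell mutations that never worsen coverage."""
    rng = random.Random(seed)
    array = [[rng.randrange(v) for _ in range(k)] for _ in range(rows)]
    best = uncovered_pairs(array, k, v)
    for _ in range(steps):
        if not best:
            break                         # all pairs covered
        r, c = rng.randrange(rows), rng.randrange(k)
        old = array[r][c]
        array[r][c] = rng.randrange(v)
        cand = uncovered_pairs(array, k, v)
        if len(cand) <= len(best):
            best = cand                   # accept equal-or-better moves
        else:
            array[r][c] = old             # revert worse moves
    return array, best
```

A production generator would seed the array greedily (one-test-at-a-time) instead of randomly and recompute coverage incrementally rather than from scratch each step.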
In this paper a new method is proposed to perform N-Radon orthogonal frequency division multiplexing (OFDM), which is equivalent in spectral efficiency to 4-quadrature amplitude modulation (QAM), 16-QAM, 64-QAM, 256-QAM, etc. This non-conventional method is proposed in order to reduce the constellation energy and increase spectral efficiency. The proposed method gives a significant improvement in bit error rate performance while keeping bandwidth efficiency and spectrum shape as good as conventional fast Fourier transform based OFDM. The new structure was tested and compared with conventional OFDM for additive white Gaussian noise, flat, and multipath selective fading channels. Simulation tests were generated for different channels
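The constellation-energy motivation can be made concrete: for square M-QAM with unit-spaced levels, average symbol energy is 2(M-1)/3, so it grows almost linearly with M. This is the cost a Radon-based mapping aims to reduce; the sketch below only illustrates standard QAM energy, not the Radon mapping itself.

```python
def qam_avg_energy(m):
    """Average symbol energy of square M-QAM with levels +/-1, +/-3, ..."""
    side = int(round(m ** 0.5))
    levels = [2 * i - side + 1 for i in range(side)]  # e.g. [-3, -1, 1, 3] for 16-QAM
    # Sum |a + jb|^2 over the full grid of constellation points.
    return sum(a * a + b * b for a in levels for b in levels) / m

print(qam_avg_energy(4))    # 2.0
print(qam_avg_energy(16))   # 10.0
print(qam_avg_energy(64))   # 42.0
```

Going from 4-QAM to 64-QAM multiplies the average energy by 21 for only a 3x gain in bits per symbol, which is why energy-efficient mappings at high spectral efficiency matter.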
Non-uniform channelization is a crucial task in cognitive radio receivers for obtaining separate channels from the digitized wideband input signal at different intervals of time. The two main requirements for the channelizer are reconfigurability and low complexity. In this paper, a reconfigurable architecture based on a combination of the Improved Coefficient Decimation Method (ICDM) and the Coefficient Interpolation Method (CIM) is proposed. The proposed Hybrid Coefficient Decimation-Interpolation Method (HCDIM) based filter bank (FB) realizes the same number of channels as the ICDM, but with the maximum decimation factor divided by the interpolation factor (L), which leads to less deterioration in the stop band at
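The core idea behind coefficient decimation can be illustrated with the plain CDM, not the improved or hybrid variants this paper builds on: zeroing all but every d-th coefficient of a lowpass prototype replicates its passband across the spectrum, yielding a multiband response from one filter. A minimal pure-Python sketch with an assumed 61-tap Hamming-windowed prototype:

```python
import math

def lowpass_fir(num_taps, cutoff):
    """Hamming-windowed sinc lowpass prototype (cutoff as a fraction of Nyquist)."""
    mid = (num_taps - 1) / 2
    taps = []
    for n in range(num_taps):
        x = n - mid
        sinc = cutoff if x == 0 else math.sin(math.pi * cutoff * x) / (math.pi * x)
        window = 0.54 - 0.46 * math.cos(2 * math.pi * n / (num_taps - 1))
        taps.append(sinc * window)
    return taps

def coefficient_decimate(taps, d):
    """Plain CDM: zero all but every d-th coefficient; passband replicates at 2*pi/d."""
    return [t if n % d == 0 else 0.0 for n, t in enumerate(taps)]

def magnitude(taps, omega):
    """Magnitude of the frequency response at radian frequency omega."""
    re = sum(t * math.cos(omega * n) for n, t in enumerate(taps))
    im = -sum(t * math.sin(omega * n) for n, t in enumerate(taps))
    return math.hypot(re, im)
```

With d = 2, the decimated filter shows an extra passband at ω = π where the prototype had a deep stopband, which is the multiband behavior the channelizer exploits; the stopband deterioration the paper addresses comes from this same coefficient zeroing.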
Fuzzy logic is used to solve the load flow and contingency analysis problems, decreasing computing time; it is a better choice than the traditional methods. The proposed method is very accurate with outstanding computation time, which makes the fuzzy load flow (FLF) suitable for real-time application in small- as well as large-scale power systems. In addition, the FLF can efficiently solve the load flow problem of ill-conditioned power systems and perform contingency analysis. The FLF method using a Gaussian membership function requires fewer iterations and less computing time than the FLF method using a triangular membership function. Using a sparsity technique for the input Ybus sparse matrix data gi
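The two membership-function shapes compared above can be sketched as follows; the set centers and widths are illustrative values, not the paper's tuned parameters.

```python
import math

def gaussian_mf(x, mean, sigma):
    """Gaussian membership: smooth and nonzero everywhere."""
    return math.exp(-((x - mean) ** 2) / (2 * sigma ** 2))

def triangular_mf(x, a, b, c):
    """Triangular membership with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Both peak at 1.0 at the center of the fuzzy set, e.g. a per-unit voltage near 1.0.
print(gaussian_mf(1.0, 1.0, 0.2))         # 1.0
print(triangular_mf(1.0, 0.6, 1.0, 1.4))  # 1.0
```

The Gaussian shape has smooth, infinite support while the triangular one cuts off sharply at its feet; that difference in how mismatch values are graded is one plausible source of the iteration-count gap the abstract reports.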
The petroleum industry, one of the pillars of the national economy, has the potential to generate vast wealth and employment opportunities. The transportation of petroleum products is complicated and variable because of the hazards caused by corrosion. Hazardous chemical leaks caused by natural disasters may harm the environment, resulting in significant economic losses, and significantly threaten the aim of sustainable development. As a result, determining the likelihood of leakage and the potential for environmental harm becomes a top priority for decision-makers as they develop maintenance plans. This study aims to provide an in-depth understanding of the risks associated with oil and gas pipelines
This paper proposes a new encryption method that combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the remaining 15 keys. This complexity raises the level of the ciphering process; moreover, each derivation shifts the key only one bit to the right. The second is the nature of the encryption process: it uses two keys and mixes one round of DES with one round of AES to reduce the running time. The W-method deals with
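The described key schedule can be sketched as follows; beyond the 64+64-bit merge, the one-bit right shift, and the 15 derived keys stated above, every detail (bit ordering, rotation rather than plain shift, sample key values) is an assumption for illustration.

```python
# Illustrative sketch of the described derivation: merge two 64-bit key halves into a
# 128-bit root key, then derive 15 round keys by rotating one bit to the right each time.
MASK128 = (1 << 128) - 1

def merge_root_key(des_bits, aes_bits):
    """Concatenate a 64-bit DES-style key and a 64-bit AES-style key half."""
    return ((des_bits & 0xFFFFFFFFFFFFFFFF) << 64) | (aes_bits & 0xFFFFFFFFFFFFFFFF)

def rotate_right_1(key):
    """Rotate a 128-bit value right by one bit (assumed; the paper says 'shift')."""
    return ((key >> 1) | ((key & 1) << 127)) & MASK128

def derive_round_keys(root, count=15):
    keys, k = [], root
    for _ in range(count):
        k = rotate_right_1(k)
        keys.append(k)
    return keys

root = merge_root_key(0x0123456789ABCDEF, 0xFEDCBA9876543210)  # sample values
round_keys = derive_round_keys(root)
print(len(round_keys))  # 15
```

A rotation is used instead of a plain shift so no key bits are discarded across the 15 derivations; a real implementation would follow whatever the W-method actually specifies.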