Clobetasol propionate (CP) is a super-potent corticosteroid widely used to treat various skin disorders such as atopic dermatitis and psoriasis. However, its utility for topical application is hampered by common side effects such as skin atrophy, steroidal acne, hypopigmentation, and allergic contact dermatitis. A microsponge is a unique three-dimensional microstructured particle with micrometer- and nanometer-wide cavities that can encapsulate both hydrophilic and lipophilic drugs, providing increased efficacy and safety. The aim of the current study was to prepare and optimize clobetasol-loaded microsponges. The emulsion solvent diffusion method was used to prepare ethylcellulose (EC)-based microsponges. The impact of several formulation variables on the microsponges' properties was investigated, including the drug:polymer ratio, polyvinyl alcohol (PVA) quantity, external-phase volume, and stirring rate. The microsponges were characterized in terms of particle size, product yield, CP entrapment %, and in-vitro drug release behavior. The results showed that increasing the EC concentration led to a significant increase in particle size, with a decrease in product yield and drug entrapment %. Increasing the stirring speed, external aqueous-phase volume, or PVA w/v % caused a non-significant decrease in production yield and CP entrapment %, while particle size decreased significantly with higher stirring speed or external-phase volume and increased significantly with higher PVA w/v %. Finally, it was concluded that the suitability of ethylcellulose as a microsponge polymer matrix for preparing uniform, highly porous particles was confirmed by microscopic observation and by its compatibility with CP.
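For context, the product yield and drug entrapment percentages reported above are conventionally computed as simple mass ratios. The sketch below illustrates these standard formulas with hypothetical numbers; it is not part of the study's protocol.

```python
# Minimal sketch of the standard yield and entrapment calculations used to
# characterize microsponges; the numerical values below are hypothetical.

def product_yield(dried_microsponge_mass_mg: float,
                  drug_mass_mg: float,
                  polymer_mass_mg: float) -> float:
    """Product yield % = recovered dry mass / total solids fed * 100."""
    return 100.0 * dried_microsponge_mass_mg / (drug_mass_mg + polymer_mass_mg)

def entrapment_efficiency(actual_drug_mg: float, theoretical_drug_mg: float) -> float:
    """Entrapment % = drug actually recovered in microsponges / drug added * 100."""
    return 100.0 * actual_drug_mg / theoretical_drug_mg

if __name__ == "__main__":
    print(f"Yield: {product_yield(182.0, 100.0, 100.0):.1f} %")        # hypothetical 91.0 %
    print(f"Entrapment: {entrapment_efficiency(78.5, 100.0):.1f} %")   # hypothetical 78.5 %
```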
Social media is widely regarded as a detector platform that can be used to measure users' activities in the real world. However, the huge and unfiltered feed of messages posted on social media raises social concerns, particularly when these messages contain hate speech towards a specific individual or community. The negative effect of such messages on individuals or on society at large is of great concern to governments and non-governmental organizations. Word clouds provide a simple and efficient means of visually conveying the most common words in text documents. This research aims to develop a word cloud model based on hateful words in online social media environments such as Google News. Several steps are involved, including data acquisition …
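As a minimal illustration of the word cloud step only (the full pipeline is not reproduced above), the sketch below renders a cloud from term frequencies using the Python wordcloud package; the term list and counts are hypothetical placeholders standing in for flagged hateful words.

```python
# Minimal word-cloud sketch, assuming the 'wordcloud' package; the frequencies
# below are hypothetical placeholders for counts of flagged hateful terms.
from wordcloud import WordCloud

hateful_term_counts = {"slur_a": 120, "slur_b": 85, "threat": 60, "insult": 45}

wc = WordCloud(width=800, height=400, background_color="white")
wc.generate_from_frequencies(hateful_term_counts)   # size of each word reflects its count
wc.to_file("hate_speech_wordcloud.png")
```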
Nowadays, internet security is a critical concern, and one of the most difficult research issues in network security is intrusion detection, the fight against external threats. Intrusion detection is a novel method of securing computers and data networks that are already in use. To boost the efficacy of intrusion detection systems, machine learning and deep learning are widely deployed. Although existing work on intrusion detection systems based on data mining and machine learning is effective, it detects intrusions by training static batch classifiers without considering the time-varying features of a regular data stream. Real-world problems, on the other hand, rarely fit into models that have such constraints. Furthermore, …
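To make the batch-versus-stream distinction concrete, here is a minimal sketch of incremental training on a simulated data stream using scikit-learn's SGDClassifier.partial_fit; it is a generic illustration on synthetic data, not the detection method proposed in the paper.

```python
# Sketch contrasting static batch training with incremental (streaming) updates,
# assuming scikit-learn; synthetic data stands in for network-traffic features.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
classes = np.array([0, 1])                      # 0 = normal traffic, 1 = intrusion
clf = SGDClassifier()

for _ in range(100):                            # each iteration = one batch of the stream
    X_batch = rng.normal(size=(64, 10))         # 64 records, 10 features (synthetic)
    y_batch = rng.integers(0, 2, size=64)
    clf.partial_fit(X_batch, y_batch, classes=classes)  # model keeps adapting over time
```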
The petroleum industry, which is one of the pillars of the national economy, has the potential to generate vast wealth and employment opportunities. The transportation of petroleum products is complicated and variable because of the hazards caused by corrosion. Hazardous chemical leaks caused by natural disasters may harm the environment, resulting in significant economic losses and posing a serious threat to sustainable development. As a result, determining the likelihood of leakage and the potential for environmental harm becomes a top priority for decision-makers as they develop maintenance plans. This study aims to provide an in-depth understanding of the risks associated with oil and gas pipelines …
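As a purely generic illustration of how leakage likelihood and environmental consequence can be combined to prioritize maintenance (not the risk model developed in this study), consider the following sketch with hypothetical 1-5 scales and segment names.

```python
# Generic likelihood-consequence risk ranking for pipeline segments; the scales,
# segment names, and values are hypothetical, not the study's assessment model.

def risk_score(leak_likelihood: int, environmental_consequence: int) -> int:
    """Both inputs on a 1-5 scale; a higher product means higher maintenance priority."""
    return leak_likelihood * environmental_consequence

segments = {"segment_A": (4, 5), "segment_B": (2, 3), "segment_C": (5, 2)}
ranked = sorted(segments.items(), key=lambda kv: risk_score(*kv[1]), reverse=True)
for name, (likelihood, consequence) in ranked:
    print(name, risk_score(likelihood, consequence))   # segment_A ranked first (score 20)
```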
This paper proposes a new encryption method. It combines two cipher algorithms, i.e., DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique can be represented by two points. The first is key generation; our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the remaining 15 keys. This complexity raises the level of the ciphering process; moreover, the operation shifts only one bit to the right. The second is the nature of the encryption process itself: it includes two keys and mixes one round of DES with one round of AES to reduce the execution time. The W-method deals with …
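A hedged reading of the key-generation idea described above is sketched below: 64 bits drawn from a DES key are concatenated with 64 bits drawn from an AES key to form a 128-bit root key, and each further key is obtained by a one-bit right rotation. The sample key material and derivation details are assumptions made for illustration, not the authors' published specification.

```python
# Illustrative key-merging sketch (assumed reading of the abstract, not the
# authors' exact W-method): 64 DES-derived bits + 64 AES-derived bits -> 128-bit
# root key, then 15 further keys via one-bit right rotations.
import os

def rotate_right_128(value: int, bits: int = 1) -> int:
    """Rotate a 128-bit integer right by `bits` positions."""
    return ((value >> bits) | (value << (128 - bits))) & ((1 << 128) - 1)

des_part = int.from_bytes(os.urandom(8), "big")    # 64 bits standing in for a DES key
aes_part = int.from_bytes(os.urandom(8), "big")    # 64 bits standing in for part of an AES key
root_key = (des_part << 64) | aes_part             # 128-bit root key

round_keys = [root_key]
for _ in range(15):                                # 15 remaining keys, one-bit shifts
    round_keys.append(rotate_right_128(round_keys[-1]))
```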
Non-uniform channelization is a crucial task in cognitive radio receivers for obtaining separate channels from the digitized wideband input signal at different intervals of time. The two main requirements for the channelizer are reconfigurability and low complexity. In this paper, a reconfigurable architecture based on a combination of the Improved Coefficient Decimation Method (ICDM) and the Coefficient Interpolation Method (CIM) is proposed. The proposed Hybrid Coefficient Decimation-Interpolation Method (HCDIM) based filter bank (FB) is able to realize the same number of channels as the ICDM, but with the maximum decimation factor divided by the interpolation factor (L), which leads to less deterioration in the stop band at …
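To illustrate the coefficient decimation principle underlying ICDM (a building block only, not the full HCDIM architecture), the numpy sketch below zeroes all but every D-th coefficient of an arbitrary prototype low-pass filter, which yields a multi-band response without redesigning the filter; the prototype parameters are arbitrary, not the paper's.

```python
# Coefficient decimation (CDM-I) sketch: keep every D-th coefficient of a
# prototype low-pass FIR filter and zero the rest to obtain a multi-band response.
import numpy as np
from scipy.signal import firwin, freqz

prototype = firwin(numtaps=129, cutoff=0.05)        # arbitrary prototype low-pass filter

def coefficient_decimation_1(h: np.ndarray, D: int) -> np.ndarray:
    """CDM-I: retain every D-th coefficient, replace the others with zeros."""
    mask = np.zeros_like(h)
    mask[::D] = 1.0
    return h * mask

multiband = coefficient_decimation_1(prototype, D=4)   # multiple pass-bands appear
w, H = freqz(multiband, worN=2048)                      # inspect the modified response
```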
Heart disease is a significant and impactful health condition that ranks as the leading cause of death in many countries. To aid physicians in diagnosing cardiovascular diseases, clinical datasets are available for reference. However, with the rise of big data and large medical datasets, it has become increasingly challenging for medical practitioners to accurately predict heart disease due to the abundance of unrelated and redundant features, which increase computational complexity and reduce accuracy. As such, this study aims to identify the most discriminative features within high-dimensional datasets while minimizing complexity and improving accuracy through an Extra Trees-based feature selection technique. The study assesses the efficacy …
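As a minimal sketch of tree-importance-based feature selection (scikit-learn's ExtraTreesClassifier with SelectFromModel, applied here to synthetic data rather than the clinical dataset used in the study):

```python
# Extra Trees-based feature selection sketch, assuming scikit-learn; synthetic
# data and the "median" threshold stand in for the real heart-disease dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import SelectFromModel

X, y = make_classification(n_samples=500, n_features=40, n_informative=8,
                           random_state=0)

forest = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X, y)
selector = SelectFromModel(forest, prefit=True, threshold="median")
X_reduced = selector.transform(X)                 # keeps the most discriminative features
print(X.shape, "->", X_reduced.shape)
```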
Fuzzy logic is used to solve the load flow and contingency analysis problems, decreasing computing time and making it a better choice than traditional methods. The proposed method is very accurate with outstanding computation time, which makes the fuzzy load flow (FLF) suitable for real-time application to small- as well as large-scale power systems. In addition, the FLF is able to efficiently solve the load flow problem of ill-conditioned power systems and to perform contingency analysis. The FLF method using a Gaussian membership function requires fewer iterations and less computing time than the FLF method using a triangular membership function. Using a sparsity technique for the input Ybus sparse matrix data gi…
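For reference, the two membership function shapes compared above are sketched below in numpy; the centers and widths are arbitrary illustrative values, not the paper's tuned parameters.

```python
# Gaussian vs. triangular membership functions (illustrative parameters only).
import numpy as np

def gaussian_mf(x, center, sigma):
    """Gaussian membership: smooth bell curve around `center`."""
    return np.exp(-0.5 * ((x - center) / sigma) ** 2)

def triangular_mf(x, a, b, c):
    """Triangular membership: rises from a to the peak at b, falls to c."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

x = np.linspace(-1.0, 1.0, 201)                 # e.g. a normalized power mismatch
mu_gauss = gaussian_mf(x, center=0.0, sigma=0.3)
mu_tri = triangular_mf(x, a=-0.6, b=0.0, c=0.6)
```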