An iron-phthalocyanine (FePc) organic photoconductive detector operating in the ultraviolet (UV) and visible regions was fabricated by pulsed laser deposition (PLD). The organic semiconductor (iron phthalocyanine) was deposited on n-type silicon (Si) wafer substrates at thicknesses of 100, 200 and 300 nm. The FePc photoconductive detector was improved in two ways: first, by fabricating the detector on porous silicon (PSi) substrates, and second, by coating the detector with a polyamide (nylon) polymer to enhance its photoconductivity. The current–voltage (I–V) characteristics, responsivity, photocurrent gain, response time and quantum efficiency of the fabricated photoconductive detector were measured. The detector was characterized in the dark and under illumination from two light sources: a UV LED with a wavelength of 365 nm and a power of 10 W, and a tungsten lamp with a wavelength range of 500–800 nm and an optical power of 250 W. The photoresponse was enhanced by coating the FePc films with 200 nm of polyamide (nylon) polymer. This coating, which can be considered a surface treatment, greatly increased the photoresponse of the fabricated FePc UV detector: the results show that the responsivity increased by four orders of magnitude over that of the uncoated FePc film. The effects of the polymer coating on the responsivity and response time of the detector were investigated.
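The figures of merit named in this abstract (responsivity, photocurrent gain, quantum efficiency) follow from standard photodetector relations; a minimal sketch, with all numerical inputs hypothetical rather than taken from the paper:

```python
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
E = 1.602176634e-19  # elementary charge, C

def responsivity(photocurrent_a, optical_power_w):
    """Responsivity R = I_ph / P_opt, in A/W."""
    return photocurrent_a / optical_power_w

def quantum_efficiency(resp_a_per_w, wavelength_nm):
    """External quantum efficiency eta = R * h * c / (e * lambda)."""
    wavelength_m = wavelength_nm * 1e-9
    return resp_a_per_w * H * C / (E * wavelength_m)

def photocurrent_gain(photocurrent_a, primary_current_a):
    """Gain = collected photocurrent / primary photo-generated current."""
    return photocurrent_a / primary_current_a

# Hypothetical operating point: 2 uA of photocurrent under 10 uW at 365 nm.
r = responsivity(2e-6, 1e-5)
eta = quantum_efficiency(r, 365)
```

At 365 nm the conversion factor hc/(e*lambda) is about 3.4 V, so a responsivity of 0.1 A/W corresponds to roughly 34% external quantum efficiency.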
The emphasis of Master Production Scheduling (MPS), or tactical planning, is on the temporal and spatial disaggregation of aggregate planning targets and forecasts, along with the provision and forecasting of the required resources. This procedure becomes considerably more difficult and slow as the number of resources, products and periods considered increases. A number of studies have been carried out to understand these impediments and to formulate algorithms that optimise the production planning problem, or more specifically the master production scheduling (MPS) problem. These algorithms include an evolutionary algorithm called the Genetic Algorithm, a swarm intelligence methodology called the Gravitational Search Algorithm (GSA), and the Bat Algorithm (BAT).
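As a hedged illustration of how a Genetic Algorithm can be applied to a toy MPS instance, consider minimizing holding and shortage costs for a single product over a few periods; the demand figures, cost rates, and GA parameters below are all invented for the sketch, not drawn from the studies cited:

```python
import random

random.seed(7)

DEMAND = [90, 120, 60, 150, 100, 80]  # hypothetical demand per period
CAPACITY = 160                        # hypothetical max production per period
HOLD, SHORT = 1.0, 5.0                # per-unit holding / shortage costs

def cost(plan):
    """Inventory-balance cost of a production plan across the periods."""
    inv, total = 0, 0.0
    for produce, demand in zip(plan, DEMAND):
        inv += produce - demand
        total += HOLD * inv if inv >= 0 else SHORT * -inv
    return total

def random_plan():
    return [random.randint(0, CAPACITY) for _ in DEMAND]

def mutate(plan, rate=0.2):
    return [random.randint(0, CAPACITY) if random.random() < rate else q
            for q in plan]

def crossover(a, b):
    cut = random.randint(1, len(a) - 1)   # one-point crossover
    return a[:cut] + b[cut:]

def genetic_mps(pop_size=40, generations=200):
    """Elitist GA: keep the best quarter, breed the rest from it."""
    pop = [random_plan() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        elite = pop[: pop_size // 4]
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=cost)

best = genetic_mps()
```

Producing exactly the demand in every period has zero cost here, so the GA should converge toward plans close to the demand profile.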
In combinatorial testing, the construction of covering arrays is a key challenge because of the many factors that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining a greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic-based methods are used to deal with tuples that may be left over after the greedy strategy's redundancy elimination; a metaheuristic algorithm then ensures that the resulting test suite is near-optimal. As a result, the use of both greedy and HC algorithms in a single test-generation system is a good candidate if constructed correctly.
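A minimal sketch of the greedy-plus-hill-climbing idea for pairwise (2-way) covering arrays, assuming a toy model of four binary parameters; the candidate counts and step budgets are arbitrary choices for illustration, not the paper's configuration:

```python
import random
from itertools import combinations

random.seed(3)

VALUES = [2, 2, 2, 2]  # hypothetical system: four binary parameters

def pairs(row):
    """All parameter-pair interactions exercised by one test row."""
    return {((i, row[i]), (j, row[j]))
            for i, j in combinations(range(len(row)), 2)}

ALL_PAIRS = set()
for i, j in combinations(range(len(VALUES)), 2):
    for a in range(VALUES[i]):
        for b in range(VALUES[j]):
            ALL_PAIRS.add(((i, a), (j, b)))

def greedy_suite():
    """AETG-style greedy: keep adding the candidate row covering most new pairs."""
    suite, uncovered = [], set(ALL_PAIRS)
    while uncovered:
        candidates = [[random.randrange(v) for v in VALUES] for _ in range(20)]
        best = max(candidates, key=lambda r: len(pairs(r) & uncovered))
        gained = pairs(best) & uncovered
        if gained:
            suite.append(best)
            uncovered -= gained
    return suite

def hill_climb(suite, steps=400):
    """HC clean-up of leftover redundancy: accept a row drop or cell mutation
    only if full pair coverage survives and the suite does not grow."""
    def covered(rows):
        seen = set()
        for row in rows:
            seen |= pairs(row)
        return seen
    current = [row[:] for row in suite]
    for _ in range(steps):
        trial = [row[:] for row in current]
        if len(trial) > 1 and random.random() < 0.3:
            trial.pop(random.randrange(len(trial)))   # try dropping a row
        else:
            row = random.choice(trial)                # or mutating one cell
            i = random.randrange(len(row))
            row[i] = random.randrange(VALUES[i])
        if covered(trial) == ALL_PAIRS and len(trial) <= len(current):
            current = trial
    return current

suite = hill_climb(greedy_suite())
```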
Social media platforms are known as sensors that can be used to measure the activities of users in the real world. However, the huge and unfiltered feed of messages posted on social media triggers social warnings, particularly when these messages contain hate speech towards a specific individual or community. The negative effect of these messages on individuals or society at large is of great concern to governments and non-governmental organizations. Word clouds provide a simple and efficient means of visually conveying the most common words in text documents. This research aims to develop a word cloud model based on hateful words in online social media environments such as Google News. Several steps are involved, including data acquisition
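A word cloud is driven by term frequencies; a minimal sketch of computing cloud weights from a hypothetical hate-speech lexicon (the three-word lexicon and sample texts are invented for illustration, not the lexicon or pipeline used in the research):

```python
import re
from collections import Counter

# Hypothetical mini-lexicon; a real system would use a curated hate-speech list.
HATE_LEXICON = {"hate", "attack", "vile"}

def cloud_weights(documents, top_n=10):
    """Count lexicon hits across documents, scale counts to weights 1..100
    suitable for sizing words in a cloud."""
    counts = Counter()
    for doc in documents:
        tokens = re.findall(r"[a-z']+", doc.lower())
        counts.update(t for t in tokens if t in HATE_LEXICON)
    if not counts:
        return {}
    peak = max(counts.values())
    return {word: round(100 * n / peak) for word, n in counts.most_common(top_n)}

docs = ["They attack and hate everyone.", "Such vile, hateful attacks!", "hate hate"]
weights = cloud_weights(docs)
```

Only exact lexicon tokens are counted here; a production pipeline would add stemming so that "attacks" and "hateful" map to their base forms.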
This paper addresses the problem of segmenting an image into regions representing objects, where the boundary between two regions is defined using connected component labeling (CCL). An efficient segmentation algorithm based on this method was developed and applied to different kinds of images. The algorithm consists of four steps: first, the input image is converted to gray level; second, the gray-level image is converted to a binary image; third, edges are detected using the Canny edge detector; and in the final step, the connected components are labeled to produce the segmented images. The best segmentation rates obtained with the developed algorithm are 90%, compared with 77% obtained using CCL before enhancement.
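The thresholding and connected-component-labeling steps above can be sketched as follows (the Canny step is omitted for brevity, and the tiny grayscale image is invented for illustration):

```python
from collections import deque

def to_binary(gray, threshold=128):
    """Steps 1-2: threshold a grayscale image (list of lists) into 0/1."""
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

def label_components(binary):
    """Final step: 4-connected component labeling via breadth-first flood fill."""
    h, w = len(binary), len(binary[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not labels[y][x]:
                next_label += 1                      # start a new region
                labels[y][x] = next_label
                queue = deque([(y, x)])
                while queue:
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = next_label
                            queue.append((ny, nx))
    return labels, next_label

gray = [[200, 200, 0,   0],
        [200,   0, 0, 130],
        [  0,   0, 0, 140]]
labels, count = label_components(to_binary(gray))
```

On this toy image the bright pixels form two separate 4-connected regions, so `count` is 2 and each region receives its own label.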
In this paper, reliable computational methods (RCMs) based on standard monomial polynomials are used to solve the Jeffery-Hamel flow (JHF) problem. In addition, convenient basis functions, namely the Bernoulli, Euler and Laguerre polynomials, have been used to enhance the reliability of the computational methods. Using such functions turns the problem into a solvable system of nonlinear algebraic equations that Mathematica® 12 can handle. The JHF problem has been solved with the help of improved reliable computational methods (I-RCMs), and a review of the methods is given. Comparisons are also made with published results. As further evidence of the accuracy and dependability of the proposed methods, the maximum error remainder
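The idea of turning a differential equation into a nonlinear algebraic system via a polynomial ansatz can be shown on a much simpler problem than JHF; here u' = u^2, u(0) = 1 (exact solution 1/(1-x)) with a quadratic monomial ansatz and a hand-rolled Newton iteration. The toy ODE, collocation points, and two-unknown ansatz are choices made for this sketch, not the paper's formulation:

```python
def ansatz(a, x):
    """Monomial ansatz u(x) = 1 + a1*x + a2*x**2 (u(0) = 1 is built in)."""
    return 1.0 + a[0] * x + a[1] * x * x

def residuals(a):
    """Collocation residuals of u' - u**2 at two interior points of [0, 0.4].
    Forcing these to zero yields a 2x2 nonlinear algebraic system in (a1, a2)."""
    out = []
    for x in (0.15, 0.3):
        du = a[0] + 2.0 * a[1] * x
        out.append(du - ansatz(a, x) ** 2)
    return out

def newton2(f, a, steps=30, h=1e-7):
    """Newton iteration for a 2-unknown system with a forward-difference Jacobian."""
    for _ in range(steps):
        r = f(a)
        m = [[0.0, 0.0], [0.0, 0.0]]          # m[i][j] = d r_i / d a_j
        for j in range(2):
            ap = list(a)
            ap[j] += h
            rp = f(ap)
            for i in range(2):
                m[i][j] = (rp[i] - r[i]) / h
        det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
        # Cramer's rule for m @ delta = -r
        d0 = (-r[0] * m[1][1] + r[1] * m[0][1]) / det
        d1 = (-r[1] * m[0][0] + r[0] * m[1][0]) / det
        a = [a[0] + d0, a[1] + d1]
    return a

coeffs = newton2(residuals, [0.0, 0.0])
```

The solved coefficients drive both collocation residuals to zero, and the resulting quadratic tracks the exact solution 1/(1-x) over the interval.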
This research relies on the relationship between the reflected spectrum, the nature of each target, its area, and its percentage of presence relative to the other targets within the study area. Changes in land cover across different years were detected from satellite images using Modified Spectral Angle Mapper (MSAM) processing, with Landsat images handled in two software packages (MATLAB 7.11 and ERDAS Imagine 2014). The proposed supervised classification method (MSAM), implemented in MATLAB, was used together with a supervised Maximum Likelihood Classifier in ERDAS Imagine to obtain the most precise results and to detect environmental changes over the periods. Despite using two classification
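The classical Spectral Angle Mapper compares a pixel spectrum to each reference spectrum by the angle between them and assigns the class with the smallest angle; a minimal sketch (the four-band reference spectra are invented, and this is classical SAM rather than the modified MSAM of the paper):

```python
import math

def spectral_angle(t, r):
    """SAM angle (radians) between pixel spectrum t and reference spectrum r:
    theta = arccos( t.r / (|t||r|) )."""
    dot = sum(a * b for a, b in zip(t, r))
    norm = math.sqrt(sum(a * a for a in t)) * math.sqrt(sum(b * b for b in r))
    return math.acos(max(-1.0, min(1.0, dot / norm)))  # clamp for rounding

def classify(pixel, references):
    """Assign the pixel to the land-cover class with the smallest angle."""
    return min(references, key=lambda name: spectral_angle(pixel, references[name]))

# Hypothetical 4-band reference spectra (bright NIR for vegetation, dark for water).
REFS = {"vegetation": [0.05, 0.08, 0.04, 0.50],
        "water":      [0.10, 0.08, 0.06, 0.02]}
```

Because the angle ignores the magnitude of the vectors, SAM is insensitive to overall illumination differences between scenes, which is why it suits multi-date change detection.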
In data mining and machine learning methods, it is traditionally assumed that the training data, the test data, and any data to be processed in the future share the same feature-space distribution. This condition rarely holds in the real world. Domain adaptation-based methods are used to overcome this challenge. One of the remaining difficulties in domain adaptation is selecting the most effective features, so that they remain effective in the target domain as well. In this paper, a new feature selection method based on deep reinforcement learning is proposed. In the proposed method, in order to select the best and most appropriate features, the essential policies
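The paper's deep-reinforcement-learning selector is not reproduced here; as a hedged stand-in, a bandit-style sketch that learns each feature's marginal value from rewards of sampled subsets (the reward function and the "useful" feature set are entirely synthetic):

```python
import random

random.seed(1)

N_FEATURES = 8
USEFUL = {1, 4, 6}   # synthetic ground truth: only these features help

def reward(subset):
    """Toy stand-in for validation accuracy: useful features add value,
    every included feature carries a small cost."""
    return sum(1.0 for f in subset if f in USEFUL) - 0.2 * len(subset)

def bandit_selection(episodes=3000):
    """Treat each feature as a two-armed bandit (include / exclude) and keep
    running means of the episode reward for each arm; select the features
    whose include-arm estimate beats their exclude-arm estimate."""
    q_in = [0.0] * N_FEATURES
    q_out = [0.0] * N_FEATURES
    n_in = [0] * N_FEATURES
    n_out = [0] * N_FEATURES
    for _ in range(episodes):
        subset = {f for f in range(N_FEATURES) if random.random() < 0.5}
        r = reward(subset)
        for f in range(N_FEATURES):
            if f in subset:
                n_in[f] += 1
                q_in[f] += (r - q_in[f]) / n_in[f]    # incremental mean
            else:
                n_out[f] += 1
                q_out[f] += (r - q_out[f]) / n_out[f]
    return {f for f in range(N_FEATURES) if q_in[f] > q_out[f]}

selected = bandit_selection()
```

With enough episodes the include/exclude reward gap converges to each feature's true marginal contribution, so the synthetic useful set is recovered.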
Optimizing access point (AP) deployment plays a major role in wireless applications, owing to the need to provide efficient communication at low deployment cost. Quality of Service (QoS) is a significant parameter and objective to be considered alongside AP placement, as well as the overall deployment cost. This study proposes and investigates a multi-level optimization algorithm called the Wireless Optimization Algorithm for Indoor Placement (WOAIP), based on Binary Particle Swarm Optimization (BPSO). WOAIP aims to obtain the optimum multi-floor AP placement with effective coverage, making it more capable of supporting QoS and cost-effectiveness. Five pairs (coverage, AP deployment) of weights, signal thresholds and received signal
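A minimal sketch of Binary PSO with the sigmoid transfer rule, applied to a toy single-floor AP-placement instance; the client grid, candidate sites, coverage radius, and cost penalty are all invented for illustration, and WOAIP's multi-floor weighting is not modeled:

```python
import math
import random

random.seed(5)

# Hypothetical floor: clients on a grid, a handful of candidate AP sites.
CLIENTS = [(x, y) for x in range(0, 10, 2) for y in range(0, 10, 2)]
SITES = [(1, 1), (1, 8), (5, 5), (8, 1), (8, 8), (4, 9)]
RADIUS = 4.5     # coverage radius of one AP
AP_COST = 3.0    # fitness penalty per deployed AP

def covered(bits):
    """Number of clients within RADIUS of at least one active AP site."""
    active = [s for s, b in zip(SITES, bits) if b]
    return sum(1 for c in CLIENTS
               if any(math.dist(c, s) <= RADIUS for s in active))

def fitness(bits):
    """Trade coverage against deployment cost."""
    return covered(bits) - AP_COST * sum(bits)

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def bpso(n_particles=20, iters=80, w=0.7, c1=1.5, c2=1.5):
    """Binary PSO: real-valued velocities, bit positions resampled through a
    sigmoid of the velocity on every update."""
    dim = len(SITES)
    pos = [[random.randint(0, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = max(pos, key=fitness)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = 1 if random.random() < sigmoid(vel[i][d]) else 0
            if fitness(pos[i]) > fitness(pbest[i]):
                pbest[i] = pos[i][:]
            if fitness(pos[i]) > fitness(gbest):
                gbest = pos[i][:]
    return gbest

best = bpso()
```

The per-AP cost term is what pushes the swarm toward covering the clients with as few deployed APs as possible rather than simply activating every site.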