The Optimal Placement of Capacitors to Reduce Losses and Improve the Voltage Profile of a Distribution Network Using GA, SA

Most of the losses in a power system arise in the distribution sector, which has therefore always received attention. One of the important factors that increases losses in a distribution system is the presence of reactive power flows. The most common way to compensate reactive power in the system is to use shunt (parallel) capacitors. In addition to reducing losses, the advantages of capacitor placement are the release of network capacity at peak load and the improvement of the voltage profile. The point to be considered in capacitor placement is the optimal location and sizing of the capacitors so as to maximize these advantages. In this paper, a new technique is offered for the placement and sizing of fixed capacitors in a radial distribution network on the basis of a Genetic Algorithm (GA). The existing optimization methods for capacitor placement mostly reduce the losses and improve the voltage profile simultaneously, but the compensation cost and load variations have not been considered as influential terms in the objective function. In this article, a holistic approach is taken to find the optimal solution to this problem, one that includes all the parameters of the distribution network: the cost, the bus voltage, and load variations. Consequently, a vast search over all possible solutions is required, so we use the Genetic Algorithm (GA) as a powerful method for this optimal search.
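
As a rough, hypothetical illustration of how a GA can search over capacitor locations and sizes, the Python sketch below encodes a candidate solution as one capacitor size per bus and minimizes an assumed cost of losses plus capacitor cost; the bus set, capacitor sizes, cost coefficients, and the simplified loss model are placeholders, not the formulation used in the paper.

import random

# Hypothetical problem data: candidate buses and available capacitor sizes (kVAr).
BUSES = list(range(1, 11))
CAP_SIZES = [0, 150, 300, 450, 600]       # 0 means "no capacitor at this bus"
KE = 0.06                                  # assumed cost of energy losses ($/kWh)
KC = 3.0                                   # assumed capacitor cost ($/kVAr)

def power_loss(placement):
    """Placeholder loss model: losses fall as installed kVAr approaches a
    hypothetical optimum of 1500 kVAr, then rise again (over-compensation)."""
    total_kvar = sum(placement)
    return 120.0 + 0.00004 * (total_kvar - 1500.0) ** 2

def fitness(placement):
    """Objective: annual cost of energy losses plus capacitor purchase cost."""
    return KE * 8760 * power_loss(placement) + KC * sum(placement)

def evolve(pop_size=40, generations=100, mutation_rate=0.1):
    pop = [[random.choice(CAP_SIZES) for _ in BUSES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                      # lower cost is better
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(BUSES))  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:    # mutate one bus
                child[random.randrange(len(BUSES))] = random.choice(CAP_SIZES)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print("best placement (kVAr per bus):", best, "cost:", round(fitness(best), 1))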

Cooperative Data Caching in WSN

Wireless sensor networks (WSNs) have gained tremendous attention in recent years due to their numerous applications. Because of the limited energy resources, energy-efficient operation of sensor nodes is a key issue in wireless sensor networks. Cooperative caching, which enables data sharing among nodes, reduces the number of communications over the wireless channels and thus extends the overall lifetime of a wireless sensor network. In this paper, we propose a cooperative caching scheme called ZCS (Zone Cooperation at Sensors) for wireless sensor networks. In the ZCS scheme, the one-hop neighbors of a sensor node form a cooperative cache zone and share their cached data with each other. Simulation experiments show that the ZCS caching scheme achieves significant improvements in byte hit ratio and average query latency in comparison with other caching strategies.
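
The following Python sketch illustrates the general idea of zone cooperation described above: a node answers a query from its own cache, then from the caches of its one-hop neighbors, and only then fetches the data remotely. The class names, FIFO eviction, and cache size are illustrative assumptions rather than the actual ZCS protocol.

class SensorNode:
    """Toy model of zone cooperation: check the local cache, then the caches
    of one-hop neighbours (the 'zone'), then fall back to a remote fetch."""

    def __init__(self, node_id, cache_size=4):
        self.node_id = node_id
        self.cache = {}            # data_id -> value
        self.cache_size = cache_size
        self.neighbors = []        # one-hop neighbours forming the zone

    def put(self, data_id, value):
        if len(self.cache) >= self.cache_size:
            self.cache.pop(next(iter(self.cache)))   # naive FIFO eviction
        self.cache[data_id] = value

    def query(self, data_id, fetch_from_sink):
        if data_id in self.cache:                    # local hit
            return self.cache[data_id], "local"
        for nb in self.neighbors:                    # zone hit
            if data_id in nb.cache:
                self.put(data_id, nb.cache[data_id])
                return nb.cache[data_id], "zone"
        value = fetch_from_sink(data_id)             # remote fetch toward sink
        self.put(data_id, value)
        return value, "remote"

# Example: two nodes in the same zone share cached data.
a, b = SensorNode("A"), SensorNode("B")
a.neighbors, b.neighbors = [b], [a]
b.put("temp-17", 23.5)
print(a.query("temp-17", fetch_from_sink=lambda d: None))  # -> (23.5, 'zone')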

Watermark Bit Rate in Diverse Signal Domains

A study of the attainable watermark data rate for information-hiding algorithms is presented in this paper. Since the perceptual entropy of wideband monophonic audio signals is in the range of four to five bits per sample, a significant amount of additional information can be inserted into the signal without causing any perceptual distortion. Experimental results showed that transform-domain watermark embedding considerably outperforms watermark embedding in the time domain, and that signal decompositions with a high transform coding gain, such as the wavelet transform, are the most suitable for high-data-rate information hiding.

Keywords: Digital watermarking, information hiding, audio watermarking, watermark data rate.
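
As a generic example of transform-domain embedding (not the algorithm studied in the paper), the sketch below hides bits in the FFT coefficients of a signal frame using quantization index modulation; the step size, coefficient range, and bit payload are arbitrary illustrative choices.

import numpy as np

def qim_embed(coeffs, bits, step=0.05):
    """Quantization-index-modulation sketch: each selected coefficient is
    quantized to an even or odd multiple of `step` depending on the bit."""
    out = coeffs.copy()
    for i, bit in enumerate(bits):
        q = np.round(out[i] / step)
        if int(q) % 2 != bit:      # force quantizer parity to match the bit
            q += 1
        out[i] = q * step
    return out

def qim_extract(coeffs, n_bits, step=0.05):
    return [int(np.round(c / step)) % 2 for c in coeffs[:n_bits]]

# Toy example on a random "audio" frame transformed with an FFT.
rng = np.random.default_rng(0)
frame = rng.normal(size=1024)
spectrum = np.fft.rfft(frame)
bits = [1, 0, 1, 1, 0, 0, 1, 0]
spectrum.real[10:10 + len(bits)] = qim_embed(spectrum.real[10:10 + len(bits)], bits)
watermarked = np.fft.irfft(spectrum, n=len(frame))
recovered = qim_extract(np.fft.rfft(watermarked).real[10:10 + len(bits)], len(bits))
print(bits == recovered)   # True in this noiseless toy case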

A Novel 14 nm Extended Body FinFET for Reduced Corner Effect, Self-Heating Effect, and Increased Drain Current

In this paper, we propose a novel FinFET with an extended body under the poly gate, called EB-FinFET, and demonstrate its characteristics using three-dimensional (3-D) numerical simulation. We have analyzed it and compared it with a conventional FinFET. The dependence of drain-induced barrier lowering (DIBL) and subthreshold swing (S.S.) on the extended body height has also been investigated. According to the 3-D numerical simulation, the proposed device has a firm structure, an acceptable short channel effect (SCE), a reduced series resistance, an increased on-state drain current (I_on), and a large normalized I_DS. Furthermore, the structure can also mitigate the corner effect and reduce the self-heating effect thanks to the extended body. Our results show that the EB-FinFET is excellent for nanoscale devices.

3G WCDMA Mobile Network DoS Attack and Detection Technology

Recently, data traffic on 3G mobile networks has exploded due to the large increase in the number of smartphone users. Unlike a traditional wired infrastructure, 3G mobile networks have limited wireless resources and complex signaling procedures for managing those resources. Moreover, mobile network security technologies for handling various kinds of abnormal and malicious traffic are not yet mature. Consequently, malicious or potentially malicious traffic originating from smart devices infected with mobile malware can cause serious problems for 3G mobile networks, analogous to DoS and scanning attacks in wired networks. This paper describes the DoS security threat in the 3G mobile network and proposes a detection technology.
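
A simple and generic way to detect signaling-type DoS behavior, not necessarily the detection technology proposed in the paper, is to count connection-setup requests per device over a sliding time window and flag devices that exceed a threshold, as in this sketch (the window length and threshold are placeholders):

from collections import defaultdict, deque

class SignalingRateDetector:
    """Flags devices whose connection-setup requests within a sliding window
    exceed a threshold (a generic signaling-DoS heuristic)."""

    def __init__(self, window_sec=60, max_requests=30):
        self.window_sec = window_sec
        self.max_requests = max_requests
        self.history = defaultdict(deque)   # device_id -> request timestamps

    def observe(self, device_id, timestamp):
        q = self.history[device_id]
        q.append(timestamp)
        while q and timestamp - q[0] > self.window_sec:   # drop stale events
            q.popleft()
        return len(q) > self.max_requests                 # True -> suspicious

det = SignalingRateDetector(window_sec=60, max_requests=30)
alerts = [t for t in range(0, 120) if det.observe("imsi-001", t)]
print("first alert at t =", alerts[0] if alerts else None)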

Extracting Human Body based on Background Estimation in Modified HLS Color Space

The ability to recognize humans and their activities by computer vision is a very important task with many potential applications. The study of human motion analysis is related to several research areas of computer vision, such as motion capture and the detection, tracking, and segmentation of people. In this paper, we describe a segmentation method for extracting the human body contour in a modified HLS color space. To estimate the background, the modified HLS color space is proposed, and the background features are estimated using the HLS color components. A large human dataset, collected from DV cameras, is pre-processed. The human body and its contour are successfully extracted from the image sequences.
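
The sketch below shows one plausible way to perform background subtraction using the lightness and saturation components of HLS; the exact "modified HLS" definition and thresholds of the paper may differ, and the threshold values here are arbitrary.

import numpy as np

def rgb_to_ls(img):
    """Compute the L and S components of HLS from an RGB image in [0, 1]
    (hue is omitted in this simplified sketch)."""
    cmax = img.max(axis=2)
    cmin = img.min(axis=2)
    light = (cmax + cmin) / 2.0
    sat = np.where(cmax == cmin, 0.0,
                   (cmax - cmin) / (1.0 - np.abs(2.0 * light - 1.0) + 1e-6))
    return light, sat

def foreground_mask(frame, background, l_thresh=0.12, s_thresh=0.15):
    """Mark pixels whose lightness or saturation deviates from the estimated
    background by more than a threshold (thresholds are arbitrary here)."""
    fl, fs = rgb_to_ls(frame)
    bl, bs = rgb_to_ls(background)
    return (np.abs(fl - bl) > l_thresh) | (np.abs(fs - bs) > s_thresh)

# Toy example: a grey background with a brighter patch standing in for a person.
background = np.full((120, 160, 3), 0.5)
frame = background.copy()
frame[40:80, 60:100] = [0.9, 0.8, 0.7]
mask = foreground_mask(frame, background)
print("foreground pixels:", int(mask.sum()))   # 40*40 = 1600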

Stochastic Estimation of Cavity Flowfield

Linear stochastic estimation and quadratic stochastic estimation techniques were applied to estimate the entire velocity flow-field of an open cavity with a length-to-depth ratio of 2. The estimations used instantaneous velocity magnitudes as estimators; these measurements were obtained by Particle Image Velocimetry. The predicted flow was compared against the original flow-field in terms of the Reynolds stresses and turbulent kinetic energy. Quadratic stochastic estimation proved superior to linear stochastic estimation in resolving the shear layer flow. When the velocity fluctuations were scaled up in the quadratic estimate, both the time-averaged quantities and the instantaneous cavity flow could be predicted rather accurately.
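
For reference, linear stochastic estimation amounts to a least-squares fit of the field onto the conditional (probe) data; the sketch below demonstrates the idea on synthetic data (quadratic stochastic estimation would simply augment the probe signals with their quadratic products). The probe count, field size, and noise level are illustrative.

import numpy as np

def lse_coefficients(conditional, field):
    """Linear stochastic estimation: solve <u_c u_c^T> A = <u_c u> for A,
    where `conditional` is (n_snapshots, n_probes) and `field` is
    (n_snapshots, n_points).  Equivalent to a least-squares fit per point."""
    A, *_ = np.linalg.lstsq(conditional, field, rcond=None)
    return A                                   # shape (n_probes, n_points)

def lse_estimate(conditional, A):
    return conditional @ A

# Synthetic demo: the "full field" is a linear mixture of two probe signals
# plus noise, so LSE should recover most of it.
rng = np.random.default_rng(1)
probes = rng.normal(size=(500, 2))                        # two estimators
mixing = rng.normal(size=(2, 50))                         # 50 field points
field = probes @ mixing + 0.1 * rng.normal(size=(500, 50))
A = lse_coefficients(probes, field)
err = np.mean((lse_estimate(probes, A) - field) ** 2) / np.mean(field ** 2)
print("normalized reconstruction error:", round(float(err), 4))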

Evaluation of Sensitometric Properties of Radiographic Films at Different Processing Solutions

The aim of this study was to compare the sensitometric properties of commonly used radiographic films processed with chemical solutions in hospitals with different workloads. The effect of different processing conditions on the densities induced on radiographic films was investigated. Two readily available double-emulsion films, Fuji and Kodak, were exposed with an 11-step wedge and processed with Champion and CPAC processing solutions. The films were processed at both high- and low-workload centers. Our findings show that the speed and contrast of the Kodak film-screen system at both workloads (high and low) are higher than those of the Fuji film-screen system for both processing solutions. There were also significant differences in film contrast at both workloads when the CPAC solution was used (p = 0.000 and 0.028). The results showed that the base-plus-fog density of the Kodak film was lower than that of the Fuji film. In general, the Champion processing solution produced higher speed and contrast for the investigated films under different conditions, and there were significant differences at the 95% confidence level between the two processing solutions (p = 0.01). The low base-plus-fog density of the Kodak films provides better visibility and accuracy, and the higher contrast allows lower exposure factors to be used while still obtaining better-quality radiographs. We also found an economic advantage in using the Champion solution and Kodak film, together with a lower patient dose. Thus, in a radiologic facility, any change in the film processor, processing cycle, or chemistry should be carefully investigated before radiological procedures are performed on patients.

Analysis of Long-Term File System Activities on Cluster Systems

I/O workload is a critical factor for analyzing I/O patterns and maximizing file system performance. However, measuring the I/O workload of a running distributed parallel file system is non-trivial due to the collection overhead and the large volume of data. In this paper, we measured and analyzed file system activities on two large-scale cluster systems with TFlops-level high-performance computation resources. By comparing the file system activities of 2009 with those of 2006, we analyzed how I/O workloads changed with the development of system performance and high-speed network technology.

Wind Tunnel Investigation of the Turbulent Flow around the Panorama Giustinelli Building for VAWT Application

A boundary layer wind tunnel facility has been adopted to conduct experimental measurements of the flow field around a model of the Panorama Giustinelli Building, Trieste (Italy). Information on the main flow structures has been obtained by means of flow visualization techniques and has been compared with numerical predictions of the vortical structures developing on top of the roof, in order to investigate the optimal positioning for a vertical-axis wind energy conversion system. Good agreement between the experimental measurements and the numerical predictions was registered.

Digital Hypertexts vs. Traditional Books: An Inquiry into Non-Linearity

The current study begins with an awareness that today's media environment is characterized by technological development and a new way of reading caused by the introduction of the Internet. The researcher conducted a meta-analysis framed within Technological Determinism to investigate the process of hypertext reading, its differences from linear reading, and the effects such differences can have on people's ways of mentally structuring their world. The relationship between literacy and the comprehension achieved by reading hypertexts is also investigated. The results show that hypertexts are not always user-friendly. People experience hyperlinks as interruptions that distract their attention, generating comprehension problems and disorientation. On one hand, the jumping style of hypertextual reading generates interruptions that ultimately make people lose their concentration. On the other hand, hypertexts fascinate people, who would rather read a document in such a format even though the outcome is often frustrating and affects their ability to elaborate and retain information.

Finite Element Simulation of Multi-Stage Deep Drawing Processes and Comparison with Experimental Results

The plastic forming of sheet metal takes an important place in metal forming. The traditional techniques of tool design for sheet forming operations used in industry are experimental and expensive. Predicting the forming results and determining the punch force, the blank holder forces, and the thickness distribution of the sheet metal in advance decrease the production cost and time. In this paper, a multi-stage deep drawing simulation of an industrial part is presented using the finite element method. The entire sequence of production steps, together with additional operations such as intermediate annealing and springback, has been simulated with the ABAQUS software under axisymmetric conditions. Simulation results such as the sheet thickness distribution, punch force, and residual stresses have been extracted at each stage, and the sheet thickness distribution was compared with experimental results. The comparison shows that the FE model is in close agreement with the experiment.

Performance Evaluation of Complex Valued Neural Networks Using Various Error Functions

The backpropagation algorithm generally employs the quadratic error function; in fact, most problems involving minimization employ the quadratic error function. With alternative error functions, the performance of the optimization scheme can be improved. These error functions help suppress the ill effects of outliers and have shown good robustness to noise. In this paper we evaluate and compare the relative performance of complex-valued neural networks using different error functions. In the first simulation, for the complex XOR gate, it is observed that some error functions, such as the absolute error and the Cauchy error function, can replace the quadratic error function. In the second simulation, it is observed that for some error functions the performance of the complex-valued neural network depends on the architecture of the network, whereas with a few other error functions the convergence speed of the network is independent of the architecture.
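
For illustration, the sketch below defines the quadratic, absolute, and Cauchy error functions for complex-valued outputs and shows how differently they react to a single outlier; the exact definitions and scale parameters used in the paper may differ.

import numpy as np

def quadratic_error(target, output):
    """Standard quadratic (sum-of-squares) error for complex outputs."""
    e = target - output
    return 0.5 * np.sum(np.abs(e) ** 2)

def absolute_error(target, output):
    """Absolute (L1) error, less sensitive to outliers."""
    return np.sum(np.abs(target - output))

def cauchy_error(target, output, c=1.0):
    """Cauchy error: grows logarithmically, so large residuals
    (outliers) are strongly down-weighted."""
    e = np.abs(target - output)
    return 0.5 * c ** 2 * np.sum(np.log1p((e / c) ** 2))

# Compare how each function reacts to one outlier in a complex residual set.
target = np.array([1 + 1j, -1 + 1j, 1 - 1j, -1 - 1j])
clean = target + 0.05 * (1 + 1j)
noisy = clean.copy()
noisy[0] += 5.0                      # inject an outlier
for fn in (quadratic_error, absolute_error, cauchy_error):
    print(fn.__name__, round(fn(target, clean), 3), round(fn(target, noisy), 3))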

A Hybrid Radial-Based Neuro-GA Multiobjective Design of Laminated Composite Plates under Moisture and Thermal Actions

In this paper, the optimum weight and cost of a laminated composite plate are sought while it carries the heaviest load prior to complete failure. Various failure criteria have been defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new variant of the Genetic Algorithm (GA), operating directly on real variables, was employed as the optimization technique. However, since optimization via GAs is a long process and most of the time is consumed by the analysis, a Radial Basis Function Neural Network (RBFNN) was employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
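
The following sketch shows a basic Gaussian-RBF surrogate of the kind that can replace an expensive analysis inside a GA loop; the kernel width, the placeholder "analysis" function, and the one-center-per-sample training scheme are assumptions for illustration, not the RBFNN configuration used in the paper.

import numpy as np

def rbf_fit(X, y, sigma=1.0):
    """Fit a Gaussian RBF network with one centre per training sample by
    solving the linear system Phi w = y (a basic surrogate-model recipe)."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    Phi = np.exp(-d2 / (2 * sigma ** 2))
    return np.linalg.solve(Phi + 1e-8 * np.eye(len(X)), y)

def rbf_predict(X_train, w, X_new, sigma=1.0):
    d2 = np.sum((X_new[:, None, :] - X_train[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * sigma ** 2)) @ w

# Toy surrogate: learn a cheap stand-in for an "expensive analysis" function
# of two normalized design variables (e.g. a ply angle and a thickness).
rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(60, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2          # placeholder analysis output
w = rbf_fit(X, y, sigma=0.3)
X_test = rng.uniform(0, 1, size=(5, 2))
print(np.round(rbf_predict(X, w, X_test, sigma=0.3), 3))   # surrogate prediction
print(np.round(np.sin(3 * X_test[:, 0]) + X_test[:, 1] ** 2, 3))  # true values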

GeoSEMA: A Modelling Platform, Emerging "GeoSpatial-based Evolutionary and Mobile Agents"

Spatial and mobile computing are evolving. This paper describes a smart modeling platform called "GeoSEMA". The approach models multidimensional GeoSpatial Evolutionary and Mobile Agents. Beyond 3D and location-based issues, other dimensions may characterize spatial agents, e.g. discrete or continuous time and agent behaviors. GeoSEMA is seen as a dedicated design pattern motivating temporal geographic-based applications; it is a firm foundation for multipurpose and multidimensional spatial-based applications. It deals with multipurpose smart objects (buildings, shapes, missiles, etc.) by simulating geospatial agents. Formally, GeoSEMA refers to geospatial, spatio-evolutive, and mobile space constituents, and a conceptual geospatial space model is given in this paper. In addition to modeling and categorizing geospatial agents, the model incorporates the concept of inter-agent event-based protocols. Finally, a rapid software-architecture prototype of the GeoSEMA platform is also given; it will be implemented and validated in the next phase of our work.

Design and Analysis of Gauge R&R Studies: Making Decisions Based on ANOVA Method

In a competitive production environment, critical decisions are based on data resulting from random sampling of product units. The effectiveness of these decisions depends on data quality and reliability. This leads to the necessity of a reliable measurement system; the process of assessing a measurement system and analyzing its errors is known as Measurement System Analysis (MSA). The aim of this research is to establish the need for, and the assurance provided by, extensive development in analyzing measurement systems, particularly through Gauge Repeatability and Reproducibility (GR&R) studies, to improve physical measurements. Nowadays in production industries, repeatability and reproducibility gauge studies are well known, but they are not applied as widely as other measurement system analysis methods. To illustrate this method and provide feedback for improving measurement systems, this survey focuses on the ANOVA method, the most widespread way of calculating Repeatability and Reproducibility (R&R).
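
For concreteness, the sketch below implements the standard two-way crossed ANOVA estimates of the gauge R&R variance components (repeatability, reproducibility, part-to-part); the data set is synthetic and the study layout (10 parts, 3 operators, 2 trials) is only an example.

import numpy as np

def gauge_rr_anova(x):
    """Two-way crossed ANOVA gauge R&R.  `x` has shape
    (parts, operators, replicates).  Returns variance components using the
    standard ANOVA formulas (negative estimates are clipped to zero)."""
    p, o, r = x.shape
    grand = x.mean()
    part_means = x.mean(axis=(1, 2))
    oper_means = x.mean(axis=(0, 2))
    cell_means = x.mean(axis=2)

    ss_part = o * r * np.sum((part_means - grand) ** 2)
    ss_oper = p * r * np.sum((oper_means - grand) ** 2)
    ss_int = r * np.sum((cell_means - part_means[:, None]
                         - oper_means[None, :] + grand) ** 2)
    ss_err = np.sum((x - cell_means[:, :, None]) ** 2)

    ms_part = ss_part / (p - 1)
    ms_oper = ss_oper / (o - 1)
    ms_int = ss_int / ((p - 1) * (o - 1))
    ms_err = ss_err / (p * o * (r - 1))

    repeatability = ms_err
    interaction = max((ms_int - ms_err) / r, 0.0)
    operator = max((ms_oper - ms_int) / (p * r), 0.0)
    part = max((ms_part - ms_int) / (o * r), 0.0)
    grr = repeatability + operator + interaction
    total = grr + part
    return {"repeatability": repeatability,
            "reproducibility": operator + interaction,
            "GRR": grr, "part": part,
            "%GRR": 100 * np.sqrt(grr / total)}

# Hypothetical study: 10 parts, 3 operators, 2 repeated measurements each.
rng = np.random.default_rng(3)
parts_effect = rng.normal(0, 2.0, size=(10, 1, 1))
oper_effect = rng.normal(0, 0.3, size=(1, 3, 1))
data = 50 + parts_effect + oper_effect + rng.normal(0, 0.5, size=(10, 3, 2))
print(gauge_rr_anova(data))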

Measurement and Estimation of Evaporation from Water Surfaces: Application to Dams in Arid and Semi Arid Areas in Algeria

Many methods exist for measuring or estimating evaporation from free water surfaces. Evaporation pans provide one of the simplest, least expensive, and most widely used methods of estimating evaporative losses. In this study, the evaporation rate from a water surface was calculated by modeling, with application to dams in wet, arid, and semi-arid areas of Algeria. We calculate the evaporation rate from the pan using the energy budget equation, which offers the advantage of ease of use, but our results do not agree completely with the measurements taken by the National Agency on dams located in areas with different climates. Therefore, we developed a mathematical model to simulate evaporation. This simulation uses an energy budget at the level of a measurement pan together with Computational Fluid Dynamics (Fluent). The evaporation rates calculated by the two methods are then compared with each other and with the in-situ measurements.
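
A minimal form of the energy-budget calculation is sketched below; the Bowen-ratio closure, the constant values, and the example fluxes are illustrative assumptions, not the measurements or the model used for the Algerian dams studied here.

# Minimal energy-budget evaporation sketch (Bowen-ratio form).
LATENT_HEAT = 2.45e6        # J/kg, latent heat of vaporization near 20 degC
RHO_WATER = 1000.0          # kg/m^3

def evaporation_rate(net_radiation, heat_storage_change, bowen_ratio):
    """Return evaporation in mm/day from an energy budget:
    E = (Rn - dS) / (rho * lambda * (1 + B)), with Rn and dS in W/m^2."""
    latent_flux = (net_radiation - heat_storage_change) / (1.0 + bowen_ratio)
    metres_per_second = latent_flux / (RHO_WATER * LATENT_HEAT)
    return metres_per_second * 1000.0 * 86400.0      # convert m/s -> mm/day

# Example: Rn = 180 W/m^2, small storage change, Bowen ratio 0.2.
print(round(evaporation_rate(180.0, 10.0, 0.2), 2), "mm/day")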

Acidity of Different Jordanian Clays Characterized by TPD-NH3 and MBOH Conversion

The acidity of different raw Jordanian clays containing zeolite, bentonite, red and white kaolinite, and diatomite was characterized by means of temperature-programmed desorption (TPD) of ammonia, conversion of 2-methyl-3-butyn-2-ol (MBOH), FTIR, and BET measurements. FTIR spectra proved the presence of silanol and bridged hydroxyl groups on the clay surface. The number of acidic sites was calculated from the experimental TPD profiles. We observed that the decrease in surface acidity correlates with the decrease in Si/Al ratio, except for diatomite. On the TPD plot for zeolite, two maxima were registered owing to the different strengths of the surface acidic sites. The MBOH conversion, product yields, and selectivities were calculated for catalysis on the Jordanian clays. We found that all clay samples are able to convert MBOH into a major product, 3-methyl-3-buten-1-yne (MBYNE), formed over acidic surface sites with a selectivity close to 70%. A correlation was found between the MBOH conversion and the acidity of the clays determined by TPD-NH3, i.e. the higher the acidity, the higher the conversion of MBOH. However, diatomite provided the lowest conversion of MBOH as a result of the poor polarization of its silanol groups. A comparison of surface areas and conversions revealed the highest density of active sites for red kaolinite and the lowest for zeolite and diatomite.

On Pattern-Based Programming towards the Discovery of Frequent Patterns

The problem of frequent pattern discovery is defined as the process of searching for patterns, such as sets of features or items, that appear in data frequently. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a database. Most of the proposed frequent pattern mining algorithms have been implemented in imperative programming languages. This paradigm is inefficient when the set of patterns is large and the frequent patterns are long. We suggest applying a high-level declarative style of programming to the problem of frequent pattern discovery, considering two languages: Haskell and Prolog. Our intuition is that the problem of finding frequent patterns should be efficiently and concisely implementable in a declarative paradigm, since pattern matching is a fundamental feature supported by most functional languages and by Prolog. Our frequent pattern mining implementations in Haskell and Prolog confirm our hypothesis about the conciseness of the programs. Comparative studies of lines of code, speed, and memory usage for declarative versus imperative programming are reported in the paper.
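
For readers unfamiliar with the task itself, the sketch below gives a compact Apriori-style frequent-itemset search in Python; it is only an illustration of the problem being solved, not the Haskell or Prolog implementations evaluated in the paper.

from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Apriori-style level-wise search: keep itemsets whose support (number of
    transactions containing them) meets `min_support`, and extend only those."""
    transactions = [frozenset(t) for t in transactions]
    items = {frozenset([i]) for t in transactions for i in t}

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    result, level = {}, {s for s in items if support(s) >= min_support}
    while level:
        result.update({s: support(s) for s in level})
        candidates = {a | b for a in level for b in level if len(a | b) == len(a) + 1}
        level = {c for c in candidates
                 if support(c) >= min_support
                 and all(frozenset(sub) in result for sub in combinations(c, len(c) - 1))}
    return result

data = [{"bread", "milk"}, {"bread", "beer", "eggs"},
        {"milk", "beer", "bread"}, {"bread", "milk", "beer"}, {"milk", "eggs"}]
for itemset, sup in sorted(frequent_itemsets(data, 3).items(), key=lambda kv: -kv[1]):
    print(set(itemset), sup)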

Protocol and Method for Preventing Attacks from the Web

Nowadays, computer worms, viruses, and Trojan horses, collectively called malware, have become widespread. A decade ago, such malware merely damaged computers by deleting or rewriting important files; however, recent malware seems designed to earn money. Some malware collects personal information so that malicious people can obtain secrets such as online banking passwords, evidence for a scandal, or contact addresses related to the target. Moreover, the relationship between money and malware has become more complex: many kinds of malware spawn bots to obtain springboards for further attacks. Meanwhile, for ordinary Internet users, countermeasures against malware have come up against a blank wall. Pattern matching has become a waste of computer resources, since matching tools have to deal with the many patterns derived from variants, and virus-making tools can automatically generate such variants of malware. Moreover, metamorphic and polymorphic malware are no longer rare. Recently, malware-checking sites have appeared that inspect content on behalf of users' PCs; however, a new type of malicious site has emerged that evades inspection by these malware-checking sites. In this paper, existing protocols and methods related to the web are reconsidered in terms of protection from current attacks, and a new protocol and method are proposed for the purpose of web security.