Voltage Stability Proximity Index Determined by LES Algorithm

In this paper, we propose an easily computable proximity index for predicting voltage collapse at a load bus using only measured values of the bus voltage and power. From these measurements, a fourth-order polynomial is obtained using the LES estimation algorithm. The sum of the absolute values of the polynomial coefficients indicates how close the bus is to its critical point. We demonstrate the applicability of the proposed method on a 6-bus test system. The results verify its applicability, as well as its accuracy and simplicity. This indicator makes it possible to predict voltage instability or the proximity of a collapse. Results obtained from PV curves are compared with the corresponding values from QV curves and are observed to be in close agreement.
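
As an illustrative sketch (not the authors' implementation), the index can be computed by least-squares fitting a fourth-order polynomial to measured voltage-versus-load samples and summing the absolute values of the fitted coefficients; the function name and sample data below are hypothetical.

```python
import numpy as np

def voltage_proximity_index(load_p, bus_v, order=4):
    """Fit V(P) with a fourth-order polynomial (least squares) and
    return the sum of the absolute coefficient values as the index."""
    coeffs = np.polyfit(load_p, bus_v, order)
    return float(np.sum(np.abs(coeffs)))

# Hypothetical measurements: bus voltage (p.u.) versus bus load (MW)
load_p = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 70.0])
bus_v  = np.array([1.00, 0.99, 0.97, 0.94, 0.90, 0.84, 0.75])
print(voltage_proximity_index(load_p, bus_v))
```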

Determinants of Enterprise Risk Management Adoption: An Empirical Analysis of Malaysian Public Listed Firms

Purpose: This paper aims to gain insight into the factors that influence ERM adoption by public listed firms in Malaysia. Findings: Financial leverage and auditor type were found to be significant influential factors for ERM adoption. In other words, the findings indicate that firms with higher financial leverage and a Big Four auditor are more likely to have some form of ERM framework in place. Originality/Value: Since relatively few studies have been conducted in this area, especially in developing economies such as Malaysia, this study broadens the scope of the literature by providing novel empirical evidence.

Performance Verification of Seismic Design Codes for RC Frames

In this study, a framework for verifying the performance of well-known seismic design codes is presented. To verify code performance, the damage level of RC frames is compared with the target performance. Because of the inherent randomness of seismic design and earthquake load excitation, fragility curves are developed. These curves are used to evaluate the performance level of structures designed according to the seismic codes, and they further illustrate the effect of the codes' load combinations and reduction factors on the probability of damage exceedance. Two types of structures, very high-importance structures with high ductility and medium-importance structures with intermediate ductility, are designed according to the different seismic codes. The results reveal that a lower damage ratio usually leads to a lower probability of exceedance. In addition, the findings indicate that some buildings with larger amounts of reinforcement nevertheless have a higher probability of damage exceedance. Life-cycle cost analysis is used for comparison and for the final decision-making process.
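
Fragility curves of this kind are commonly expressed as lognormal distribution functions of the ground-motion intensity. The sketch below assumes that standard lognormal form with hypothetical median and dispersion values; it is not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def fragility(im, median, beta):
    """Probability of exceeding a damage state at intensity `im`,
    using the common lognormal fragility form (assumed here)."""
    return norm.cdf(np.log(im / median) / beta)

# Hypothetical median capacity (g) and dispersion for one damage state
im = np.linspace(0.05, 2.0, 40)                # spectral acceleration (g)
p_exceed = fragility(im, median=0.6, beta=0.5)
print(p_exceed[[0, 19, 39]])                   # low, mid and high intensity
```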

Identifying New Sequence Features for Exon-Intron Discrimination by Rescaled-Range Frameshift Analysis

To identify sequence features that discriminate exons from introns, a new paradigm, rescaled-range frameshift analysis (RRFA), is proposed. Using RRFA, two new sequence features, frameshift sensitivity (FS) and accumulative penta-mer complexity (APC), were discovered and further integrated into a larger-scale feature, persistency in anti-mutation (PAM). Feature-validation experiments were performed on six model organisms to test the discriminative power. All experimental results strongly support that FS, APC and PAM are distinguishing features between exons and introns. These newly identified sequence features provide new insight into the sequence composition of genes and have great potential to form a new basis for recognizing exon-intron boundaries in gene sequences.
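
The abstract does not spell out how APC is computed, so the following is only a generic illustration of penta-mer (5-mer) counting and a simple complexity measure over a DNA sequence; the function names and the complexity formula are assumptions, not the paper's definition.

```python
from collections import Counter

def pentamer_counts(seq, k=5):
    """Count overlapping 5-mers in a DNA sequence."""
    seq = seq.upper()
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def pentamer_complexity(seq):
    """Toy complexity measure: distinct 5-mers / total 5-mer windows
    (illustrative only; the paper's APC may be defined differently)."""
    counts = pentamer_counts(seq)
    total = sum(counts.values())
    return len(counts) / total if total else 0.0

print(pentamer_complexity("ATGCGTACGTTAGCATGCGTACGTTAGC"))
```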

Simulation-Based Optimization in Performance Evaluation of Marshaling Yard Storage Policy in a Container Port

Over the last two decades, the container transportation system has undergone rapid development. This underlines the importance of container terminals, which play a key role in linking sea and land transport. There is therefore a continuous need for the optimal use of equipment and facilities in ports. Given the complex structure of container ports, this paper presents a simulation model that compares two storage strategies for storing containers in the yard. For this purpose, we consider the loading and unloading norm as an important criterion for evaluating the performance of the Shahid Rajaee container port. Analysis of the model results shows that using a marshalling yard policy instead of the current storage system has a significant effect on the performance level of the port and can increase the loading and unloading norm by up to 14%.

Inhibition of the Growth of Pathogenic Candida spp. by Salicylhydroxamic Acid

Candida spp. are common and aggressive pathogens. Because of the growing resistance of Candida spp. to current antifungals, novel targets, found in Candida spp. but not in humans or other flora, have to be identified. The alternative oxidase (AOX) is one such possibility. This enzyme is insensitive to cyanide, but is sensitive to compounds such as salicylhydroxamic acid (SHAM), disulfiram and n-alkyl gallates. The growth of each of six Candida spp. was inhibited significantly by ~13 mM SHAM or 2 mM cyanide, albeit to differing extents. In C. dubliniensis, C. krusei and C. tropicalis the rate of O2 uptake was inhibited by 18-36% by 25 mM SHAM, whereas this had little or no effect on C. glabrata, C. guilliermondii or C. parapsilosis. Although SHAM substantially inhibited the growth of Candida spp., it is unlikely that inhibition of AOX was the cause. Salicylhydroxamic acid is used therapeutically in the treatment of urinary tract infections and urolithiasis, but it also has some potential in the treatment of Candida spp. infections.

A Study on Finding Similar Document with Multiple Categories

Similar-document search and document management are important subjects in text mining. One of the most important parts of similar-document research is the process of classifying or clustering the documents. In this study, a similar-document search approach that addresses the case of documents belonging to multiple categories (the multiple-categories problem) is carried out. The proposed method, based on Fuzzy Similarity Classification (FSC), is compared with the Rocchio algorithm and the naive Bayes method, which are widely used in text mining. Empirical results show that the proposed method is quite successful and can be applied effectively. In the second stage, a multiple-category vector method based on how frequently categories are observed together is used. Empirical results show that performance nearly doubles when the proposed method is compared with the classical approach.
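
The FSC method itself is not detailed in the abstract; as a minimal sketch of the underlying similarity-search step, the code below ranks documents against a query using term-frequency vectors and cosine similarity. The tokenizer, documents and variable names are hypothetical.

```python
import math
from collections import Counter

def tf_vector(text):
    """Term-frequency vector of a document (toy whitespace tokenizer)."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = {"d1": "fuzzy text classification methods",
        "d2": "text mining and document clustering"}
query = tf_vector("document classification in text mining")
ranked = sorted(docs, reverse=True,
                key=lambda d: cosine_similarity(query, tf_vector(docs[d])))
print(ranked)
```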

Design of Ultra Fast Polymer Electro-Optic waveguide Switch for Intelligent Optical Networks

Traditional optical networks are gradually evolving towards intelligent optical networks due to the need for faster bandwidth provisioning, protection and restoration of the network, which can be accomplished with devices such as optical switches, add-drop multiplexers and cross-connects. Since dense wavelength division multiplexing forms the physical layer for intelligent optical networking, the role of high-speed all-optical switches is important. This paper analyzes such an ultra-high-speed polymer electro-optic switch. The performance of the 2x2 optical waveguide switch with rectangular, triangular and trapezoidal grating profiles is analyzed with respect to various device parameters. The simulation results show that the trapezoidal grating is the optimal structure, with a coupling length of 81 μm and a switching voltage of 11 V at an operating wavelength of 1550 nm. The switching time of the proposed switch is 0.47 picoseconds, which makes it an important element in intelligent optical networks.

Comparison of Stochastic Point Process Models of Rainfall in Singapore

Extensive rainfall disaggregation approaches have been developed and applied in climate change impact studies such as flood risk assessment and urban storm water management. In this study, five rainfall models capable of disaggregating daily rainfall data into hourly data were investigated for the rainfall record at Changi Airport, Singapore. The objectives of this study were (i) to study the temporal characteristics of hourly rainfall in Singapore, and (ii) to evaluate the performance of various disaggregation models. The models used were: (i) the Rectangular Pulse Poisson Model (RPPM), (ii) the Bartlett-Lewis Rectangular Pulse Model (BLRPM), (iii) the Bartlett-Lewis model with two cell types (BL2C), (iv) the Bartlett-Lewis Rectangular model with cell depth distribution dependent on duration (BLRD), and (v) the Neyman-Scott Rectangular Pulse Model (NSRPM). All of these models were fitted using hourly rainfall data from 1980 to 2005 obtained from the Changi meteorological station. The results indicated that a weighting scheme with weights inversely proportional to the variance delivers more accurate outputs for fitting rainfall patterns in tropical areas, and that BLRPM performed relatively better than the other disaggregation models.
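
Parameters of such point-process models are typically fitted by minimizing the mismatch between observed and modelled rainfall statistics. The sketch below illustrates the inverse-variance weighting scheme mentioned above on a hypothetical set of statistics; it is not the study's actual fitting code.

```python
import numpy as np

def weighted_objective(model_stats, obs_stats, obs_var):
    """Sum of squared differences between model and observed statistics,
    with each statistic weighted inversely by its observed variance."""
    model = np.asarray(model_stats, dtype=float)
    obs = np.asarray(obs_stats, dtype=float)
    weights = 1.0 / np.asarray(obs_var, dtype=float)
    return float(np.sum(weights * (model - obs) ** 2))

# Hypothetical hourly statistics: mean, variance, lag-1 autocorrelation
obs_stats   = [0.35, 2.10, 0.45]
obs_var     = [0.02, 0.50, 0.01]
model_stats = [0.33, 2.30, 0.40]
print(weighted_objective(model_stats, obs_stats, obs_var))
```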

Transmission Loss Allocation via Loss Function Decomposition and Current Projection Concept

One of the major problems in liberalized power markets is loss allocation. In this paper, a different method for allocating transmission losses to pool market participants is proposed. The method is fundamentally based on the decomposition of the loss function and the current projection concept. It has been implemented and tested on several networks, and one sample case is summarized in the paper. The results show that the method is comprehensive and fair in allocating the energy losses of a power market to its participants.
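
To give a flavour of the current-projection idea, the sketch below allocates the I^2R loss of a single branch among participants in proportion to the projection of each participant's current component onto the total branch current. This is only an illustration of the concept with hypothetical numbers, not the paper's exact formulation.

```python
import numpy as np

def allocate_branch_loss(r_branch, participant_currents):
    """Split a branch's I^2*R loss among participants by projecting each
    participant's complex current onto the total branch current."""
    i_total = sum(participant_currents)
    p_loss = r_branch * abs(i_total) ** 2
    shares = [np.real(i_k * np.conj(i_total)) / abs(i_total) ** 2
              for i_k in participant_currents]        # shares sum to 1
    return [s * p_loss for s in shares]

# Hypothetical branch: R = 0.02 p.u., two participants' current phasors
losses = allocate_branch_loss(0.02, [1.0 + 0.2j, 0.5 - 0.1j])
print(losses, sum(losses))
```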

Validation of the WAsP Model for a Terrain Surrounded by Mountainous Region

The problems associated with the wind predictions of the WAsP model in complex terrain have been the target of several studies over the last decade. In this paper, the influence of the surrounding orography on the accuracy of the wind data analysis of a terrain is investigated. For the case study, a site with complex surrounding orography is considered. This site is located in Manjil, one of the windiest cities of Iran. To obtain a precise evaluation of the wind regime at the site, one year of wind data measurements from two meteorological masts is used. To validate the results obtained from WAsP, cross-prediction between the two masts is performed. The analysis reveals that the WAsP model can estimate the wind speed behavior accurately. In addition, the results show that this software can be used to predict the wind regime of flat sites with complex surrounding orography.

Continuous and Discontinuous Shock Absorber Control through Skyhook Strategy in Semi-Active Suspension System (4DOF Model)

Active vibration isolation systems are less commonly used than passive systems due to their associated cost and power requirements. In principle, semi-active isolation systems can deliver the versatility, adaptability and higher performance of fully active systems for a fraction of the power consumption. Various semi-active control algorithms have been suggested in the past. This paper studies a 4DOF model of semi-active suspension performance controlled by on-off and continuous skyhook damping control strategies. The frequency and transient responses of the model are evaluated in terms of body acceleration, roll angle and tire deflection, and are compared with those of a passive damper. The results show that the semi-active system controlled by the skyhook strategy always provides better isolation than a conventional passively damped system, except at the tire natural frequencies.
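
As a minimal sketch of the two control laws studied (with hypothetical damping limits and velocities), the on-off skyhook switches between a low and a high damping coefficient depending on the sign of the product of body velocity and relative damper velocity, while the continuous variant scales the coefficient toward an ideal sky-mounted damper and clips it to the achievable range.

```python
def onoff_skyhook(c_min, c_max, v_body, v_rel):
    """On-off skyhook: high damping when body and relative velocities
    have the same sign, low damping otherwise."""
    return c_max if v_body * v_rel > 0 else c_min

def continuous_skyhook(c_sky, c_min, c_max, v_body, v_rel, eps=1e-6):
    """Continuous skyhook: emulate a damper to the 'sky' by scaling the
    semi-active coefficient, clipped to the realizable range."""
    if v_body * v_rel > 0:
        denom = v_rel if abs(v_rel) > eps else eps
        return min(max(c_sky * v_body / denom, c_min), c_max)
    return c_min

# Hypothetical velocities (m/s) and damping limits (N*s/m)
print(onoff_skyhook(300.0, 3000.0, v_body=0.2, v_rel=0.5))
print(continuous_skyhook(2000.0, 300.0, 3000.0, v_body=0.2, v_rel=0.5))
```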

Synthesis and Reactions of Sulphone Hydrazides

The chemistry of sulphone hydrazides has gained increasing interest in both synthetic organic chemistry and the biological fields and has considerable value. The therapeutic importance of these compounds is the driving force for continuing research on this topic. The present review covers the literature to date on the synthesis, reactions and applications of such compounds.

Optimization of Unweighted Minimum Vertex Cover

The Minimum Vertex Cover (MVC) problem is a classic NP-complete graph optimization problem. In this paper a competent algorithm, called the Vertex Support Algorithm (VSA), is designed to find the smallest vertex cover of a graph. The VSA is tested on a large number of random graphs and on DIMACS benchmark graphs. A comparative study of this algorithm with other existing methods has been carried out. Extensive simulation results show that the VSA can yield better solutions than other existing algorithms found in the literature for solving the minimum vertex cover problem.
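
As a rough sketch of a support-based greedy heuristic in this spirit (not necessarily the authors' exact VSA), one can repeatedly pick the vertex whose neighbours have the largest total degree, add it to the cover, and delete its incident edges:

```python
def greedy_support_cover(edges):
    """Greedy vertex cover: repeatedly select the vertex with the largest
    'support' (sum of its neighbours' degrees), add it to the cover and
    remove its incident edges.  Illustrative sketch only."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    cover = set()
    while any(adj.values()):
        support = {u: sum(len(adj[w]) for w in nbrs)
                   for u, nbrs in adj.items() if nbrs}
        u = max(support, key=support.get)
        cover.add(u)
        for w in list(adj[u]):
            adj[w].discard(u)
        adj[u] = set()
    return cover

print(greedy_support_cover([(1, 2), (2, 3), (3, 4), (4, 1), (2, 4)]))
```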

Cartoon Effect and Ambient Illumination Based Depth Perception Assessment of 3D Video

A monitored 3-Dimensional (3D) video experience can be utilized as "feedback information" to fine-tune service parameters and provide a better service to demanding 3D service customers. The 3D video experience, which includes both video quality and depth perception, is influenced by several contextual and content-related factors (e.g., ambient illumination conditions and content characteristics) due to the complex nature of 3D video. Therefore, the factors that affect this experience should be taken into account when assessing it. In this paper, the structural information of the depth map sequences of the 3D video is considered as a content-related factor affecting depth perception assessment. A cartoon-like filter is utilized to abstract the significant depth levels in the depth map sequences in order to determine the structural information. Moreover, subjective experiments are conducted using 3D videos associated with cartoon-like depth map sequences to investigate the effect of the ambient illumination condition, which is a contextual factor, on depth perception. Using the knowledge gained through this study, 3D video experience metrics can be developed to deliver better service to 3D video service users.
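
The abstract does not specify the cartoon-like filter, so the following is only a hedged sketch of the general idea of abstracting a depth map into a few significant depth levels, here via smoothing followed by quantization; the filter choice, level count and data are assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter

def cartoonize_depth(depth, levels=8, smooth_size=5):
    """Abstract a depth map into a small number of depth levels:
    smooth, then quantize (a sketch; the paper's filter may differ)."""
    smoothed = median_filter(depth.astype(float), size=smooth_size)
    lo, hi = smoothed.min(), smoothed.max()
    if hi == lo:
        return smoothed
    q = np.round((smoothed - lo) / (hi - lo) * (levels - 1))
    return q / (levels - 1) * (hi - lo) + lo

depth = np.random.rand(64, 64)                  # hypothetical depth map
print(np.unique(cartoonize_depth(depth)).size)  # at most 8 distinct levels
```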

Measuring Cognitive Load - A Solution to Ease Learning of Programming

Learning programming is difficult for many learners. Some studies have found that the main difficulty relates to cognitive load. Cognitive overload occurs in programming because the nature of the subject intrinsically places a heavy burden on working memory; the complexity of the subject itself is the source of the load. The problem is made worse by poor instructional design methodology in the teaching and learning process. Various efforts have been proposed to reduce the cognitive load, e.g., visualization software and the part-program method, and many computer-based systems have also been tried to tackle the problem. However, little success has been achieved in alleviating it, and more has to be done to overcome this hurdle. This research attempts to understand how cognitive load can be managed so as to reduce the problem of overloading. We propose a mechanism to measure the cognitive load during the pre-instruction, in-instruction and post-instruction stages of learning. This mechanism is used to guide instruction: as the load changes, the instruction adapts itself to ensure cognitive viability. The mechanism could be incorporated as a sub-domain in the student model of various computer-based instructional systems to facilitate the learning of programming.

Flexible Wormhole-Switched Network-on-chip with Two-Level Priority Data Delivery Service

A synchronous network-on-chip using wormhole packet switching and supporting guaranteed-completion best-effort delivery with low-priority (LP) and high-priority (HP) wormhole packet delivery services is presented in this paper. Both the proposed LP and HP message services deliver a good quality of service in terms of lossless packet completion and in-order message data delivery. However, the LP message service does not guarantee a minimal completion bound. The HP packets will use 100% of the bandwidth of their reserved links if they are injected from the source node at the maximum injection rate. Hence, the HP service is suitable for small messages (less than a hundred bytes); otherwise, other HP and LP messages that also require those links will experience relatively high latency depending on the size of the HP message. The LP packets are routed using a minimal adaptive routing algorithm, while the HP packets are routed using a non-minimal adaptive routing algorithm. Therefore, an additional 3-bit field identifying the packet type is introduced in the packet headers to classify and determine the type of service committed to the packet. Our NoC prototypes have also been synthesized using a 180-nm CMOS standard-cell technology to evaluate the cost of implementing the combination of both services.
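
Purely as an illustration of how such a 3-bit packet-type field could be packed into a header flit, the sketch below encodes and decodes a header whose other field widths and type encodings are hypothetical (only the 3-bit type field comes from the abstract).

```python
# Hypothetical header-flit layout: 3-bit type | 8-bit destination | 8-bit length
TYPE_BITS, DST_BITS, LEN_BITS = 3, 8, 8

def encode_header(pkt_type, dst, length):
    assert pkt_type < (1 << TYPE_BITS)
    return (pkt_type << (DST_BITS + LEN_BITS)) | (dst << LEN_BITS) | length

def decode_header(flit):
    length = flit & ((1 << LEN_BITS) - 1)
    dst = (flit >> LEN_BITS) & ((1 << DST_BITS) - 1)
    pkt_type = flit >> (DST_BITS + LEN_BITS)
    return pkt_type, dst, length

HP, LP = 0b001, 0b000          # hypothetical encodings of the two services
flit = encode_header(HP, dst=12, length=32)
print(decode_header(flit))     # -> (1, 12, 32)
```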

The Splitting Upwind Schemes for Spectral Action Balance Equation

The spectral action balance equation is used to simulate short-crested wind-generated waves in shallow-water areas such as coastal regions and inland waters. The equation involves two spatial dimensions, wave direction and wave frequency, and can be solved by the finite difference method. When this equation, whose convection term dominates, is discretized using central differences, stability problems occur if the grid spacing is chosen too coarse. In this paper, we introduce splitting upwind schemes that avoid these stability problems and prove that they are consistent with the upwind scheme of the same accuracy. The splitting upwind schemes split the spectral action balance equation into four one-dimensional problems, each of which yields an independent tridiagonal linear system. Each smaller system can be solved by direct or iterative methods simultaneously, which is very fast when performed on a multi-processor computer.
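
Each of the one-dimensional sub-problems leads to a tridiagonal linear system, which can be solved directly with the standard Thomas algorithm. The sketch below shows such a solver on a small hypothetical system; it is generic code, not taken from the paper.

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, main diagonal b,
    super-diagonal c and right-hand side d (Thomas algorithm)."""
    n = len(b)
    cp, dp = np.zeros(n), np.zeros(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Hypothetical tridiagonal system from one 1D sub-problem
a = np.array([0.0, -1.0, -1.0, -1.0])   # sub-diagonal (a[0] unused)
b = np.array([3.0, 3.0, 3.0, 3.0])      # main diagonal
c = np.array([-1.0, -1.0, -1.0, 0.0])   # super-diagonal (c[-1] unused)
d = np.array([2.0, 1.0, 1.0, 2.0])
print(thomas_solve(a, b, c, d))
```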

Remarks on Energy Based Control of a Nonlinear, Underactuated, MIMO and Unstable Benchmark

In the last decade, energy-based control theory has achieved significant breakthroughs in dealing with underactuated mechanical systems through two successful and similar tools: controlled Lagrangians and controlled Hamiltonians (IDA-PBC). However, because of the complexity of these tools, successful case studies are lacking, in particular MIMO cases. The seminal theoretical paper on controlled Lagrangians by Bloch and his colleagues presented a benchmark example, a 4-d.o.f. underactuated pendulum on a cart, but a detailed and complete design was not given. To fill this gap, this note revisits their design idea by providing explicit control functions for a similar device motivated by a vector-thrust body hovering in the air. To the best of our knowledge, this system is the first MIMO underactuated example stabilized using energy-based tools following the original design idea. Some observations based on computer simulation are given.

Towards Cloud Computing Anatomy

Cloud Computing has recently emerged as a compelling paradigm for managing and delivering services over the internet. The rise of Cloud Computing is rapidly changing the landscape of information technology and is ultimately turning the long-held promise of utility computing into a reality. As the Cloud Computing paradigm develops rapidly, concepts and terminology are becoming imprecise and ambiguous, and different technologies are intermingling. Thus, it becomes crucial to clarify the key concepts and definitions. In this paper, we present the anatomy of Cloud Computing, covering its essential concepts, prominent characteristics, effects, architectural design and key technologies, and we differentiate the various service and deployment models. We also address the significant challenges and risks that need to be tackled in order to guarantee the long-term success of Cloud Computing. The aim of this paper is to provide a better understanding of the anatomy of Cloud Computing and to pave the way for further research in this area.