Online Programme of Excellence Model (OPEM)

Finding effective ways of improving university quality assurance also requires retraining of staff. This article illustrates an Online Programme of Excellence Model (OPEM), based on the European quality assurance model, for improving participants' formative programme standards. The results of applying the OPEM indicate the need for quality policies that support evaluators' competencies to improve formative programmes. The study concludes by outlining how faculty and agency staff can use the OPEM for the internal and external quality assurance of formative programmes.

Orders Preparation and Control on the Productive Process Efficiency

The main objective of this paper is to analyse the influence of the preparation and control of orders on performance. The activities explored in this research are procurement, production, and distribution. The changes in performance were obtained through improvement of the supply chain. Using all the company's activities, it is shown that it is possible to increase efficiency and provide adequate service, placing products in the market efficiently. To that end, the importance of the supply chain was explored, with emphasis on the practical environment and the quantification of the obtained results.

Binary Classification Tree with Tuned Observation-based Clustering

There are several approaches for handling multiclass classification. Aside from one-against-one (OAO) and one-against-all (OAA), hierarchical classification is also commonly used. A binary classification tree is a hierarchical classification structure that breaks down a k-class problem into binary sub-problems, each solved by a binary classifier. In each node, a set of classes is divided into two subsets. A good class partition should group similar classes together. Many algorithms measure similarity in terms of the distance between class centroids: classes are grouped together by a clustering algorithm when the distances between their centroids are small. In this paper, we present a binary classification tree with tuned observation-based clustering (BCT-TOB) that finds a class partition by performing clustering on observations instead of class centroids. A merging step is introduced to merge any insignificant class split. Experiments show that the performance of BCT-TOB is comparable to that of other algorithms.
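
The core idea, partitioning the classes at each tree node by clustering the training observations rather than the class centroids, can be illustrated with a minimal sketch. The sketch below is an assumption-laden simplification (KMeans for the observation clustering, logistic regression as the node classifier, a majority vote to map classes to clusters) and omits the paper's tuning and merging steps.

```python
# Minimal sketch: a binary classification tree whose class partition at each node
# is obtained by clustering the observations themselves (not class centroids).
# The choice of KMeans and logistic regression is an illustrative assumption.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

class Node:
    def __init__(self, classes):
        self.classes = classes          # classes handled by this node
        self.left = self.right = None   # child nodes
        self.clf = None                 # binary classifier separating the two subsets

def build_tree(X, y, classes):
    node = Node(classes)
    if len(classes) == 1:
        return node
    mask = np.isin(y, classes)
    Xs, ys = X[mask], y[mask]
    # Cluster the observations of this node into two groups.
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xs)
    # Assign each class to the cluster that contains most of its observations.
    left, right = [], []
    for c in classes:
        frac_in_left = np.mean(labels[ys == c] == 0)
        (left if frac_in_left >= 0.5 else right).append(c)
    # Guard against a degenerate split (all classes in one subset).
    if not left or not right:
        left, right = list(classes[:1]), list(classes[1:])
    # Train a binary classifier to separate the two class subsets.
    target = np.isin(ys, left).astype(int)
    node.clf = LogisticRegression(max_iter=1000).fit(Xs, target)
    node.left = build_tree(X, y, np.array(left))
    node.right = build_tree(X, y, np.array(right))
    return node

def predict_one(node, x):
    while node.left is not None:
        node = node.left if node.clf.predict(x.reshape(1, -1))[0] == 1 else node.right
    return node.classes[0]
```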

An Efficient Method of Shot Cut Detection

In this paper we present a method of abrupt cut detection with a novel frame-comparison logic. The current frame is compared with its motion-estimated prediction instead of with the successive frame. Four different similarity metrics were employed to estimate the resemblance of the compared frames. The obtained results were evaluated using standard measures of test accuracy and compared with an existing approach. Based on the results, we claim that the proposed method is more effective and that the Pearson correlation coefficient achieved the best results among the chosen similarity metrics.
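
The decision rule can be sketched as follows: a cut is declared when the current frame correlates poorly with its motion-compensated prediction. The motion estimation itself is abstracted away here (`predict_frame` is a placeholder callable) and the threshold value is hypothetical, so this is only a schematic of the comparison logic.

```python
# Sketch: flag an abrupt cut when the current frame poorly matches its
# motion-estimated prediction (Pearson correlation as the similarity metric).
import numpy as np

def pearson_similarity(frame, predicted):
    a = frame.astype(np.float64).ravel()
    b = predicted.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def detect_cuts(frames, predict_frame, threshold=0.5):
    """Return indices i where a cut is declared between frame i-1 and frame i."""
    cuts = []
    for i in range(1, len(frames)):
        # Motion-compensated prediction of frame i from frame i-1 (placeholder).
        prediction = predict_frame(frames[i - 1], frames[i])
        if pearson_similarity(frames[i], prediction) < threshold:
            cuts.append(i)
    return cuts
```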

Economic Dispatch Fuzzy Linear Regression and Optimization

This study presents a new approach based on Tanaka's fuzzy linear regression (FLP) algorithm to solve the well-known power system economic load dispatch (ELD) problem. Tanaka's fuzzy linear regression formulation is employed to compute the optimal solution of the optimization problem after linearization. The unknowns are expressed as fuzzy numbers with a triangular membership function characterized by a middle value and a spread. The proposed fuzzy model is formulated as a linear optimization problem in which the objective is to minimize the sum of the spreads of the unknowns, subject to double inequality constraints. A linear programming technique is employed to obtain the middle value and the symmetric spread for every unknown (power generation level). Simulation results of the proposed approach are compared with those reported in the literature.
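
The structure described above (triangular fuzzy unknowns, a spread-minimizing objective, double inequality constraints, solved by linear programming) can be sketched with a toy linear program. The specific constraints used here (power balance on the middle values, generator limits on the fuzzy interval) and all numerical values are illustrative assumptions, not the paper's exact model.

```python
# Toy sketch: unknowns are triangular fuzzy numbers (middle m_i, spread s_i >= 0);
# minimize the sum of spreads subject to double inequality constraints.
import numpy as np
from scipy.optimize import linprog

p_min = np.array([100.0, 50.0, 80.0])    # hypothetical lower generation limits (MW)
p_max = np.array([400.0, 200.0, 300.0])  # hypothetical upper generation limits (MW)
demand = 600.0                            # hypothetical total demand (MW)
n = len(p_min)

# Decision vector x = [m_1..m_n, s_1..s_n]; objective: minimize sum of spreads.
c = np.concatenate([np.zeros(n), np.ones(n)])

# Double inequalities:  m_i - s_i >= p_min_i   and   m_i + s_i <= p_max_i
A_ub = np.vstack([
    np.hstack([-np.eye(n), np.eye(n)]),   # -m_i + s_i <= -p_min_i
    np.hstack([ np.eye(n), np.eye(n)]),   #  m_i + s_i <=  p_max_i
])
b_ub = np.concatenate([-p_min, p_max])

# Power balance on the middle values: sum_i m_i = demand
A_eq = np.hstack([np.ones((1, n)), np.zeros((1, n))])
b_eq = np.array([demand])

bounds = [(None, None)] * n + [(0, None)] * n  # spreads are non-negative
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
middles, spreads = res.x[:n], res.x[n:]
```

In this toy instance nothing forces the spreads to be positive; in Tanaka's formulation additional inequalities derived from the data keep them nonzero.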

The Role of Contextual Ontologies in Enterprise Modeling

Information sharing and exchange, rather than information processing, is what characterizes information technology in the 21st century. Ontologies, as shared common understandings, are gaining increasing attention, as they appear to be the most promising solution to enable information sharing both at a semantic level and in a machine-processable way. Domain-ontology-based modeling has been exploited to provide shareability and information exchange among diversified, heterogeneous enterprise applications. Contextual ontologies are "an explicit specification of contextual conceptualization". That is, the ontology is characterized by concepts that have multiple representations and may exist in several contexts. Hence, contextual ontologies are a set of concepts and relationships seen from different perspectives. Contextualization allows ontologies to be partitioned according to their contexts. The need for contextual ontologies in enterprise modeling has become crucial due to the nature of today's competitive market. Information resources in an enterprise are distributed and diversified, and need to be shared and communicated locally through the intranet and globally through the internet. This paper discusses the roles that ontologies play in enterprise modeling and how ontologies assist in building a conceptual model in order to provide communicative and interoperable information systems. The issue of enterprise modeling based on contextual domain ontology is also investigated, and a framework is proposed for an enterprise model that consists of various applications.

A Heat-Inducible Transgene Expression System for Gene Therapy

Heat-inducible gene expression vectors are useful for hyperthermia-induced cancer gene therapy, because the combination of hyperthermia and gene therapy can considerably improve the therapeutic effects. In the present study, we developed an enhanced heat-inducible transgene expression system in which a heat-shock protein (HSP) promoter and tetracycline-responsive transactivator were combined. When the transactivator plasmid containing the tetracycline-responsive transactivator gene was co-transfected with the reporter gene expression plasmid, a high level of heat-induced gene expression was observed compared with that using the HSP promoter without the transactivator. In vitro evaluation of the therapeutic effect using HeLa cells showed that heat-induced therapeutic gene expression caused cell death in a high percentage of these cells, indicating that this strategy is promising for cancer gene therapy.

Local Image Descriptor using VQ-SIFT for Image Retrieval

In this paper, we present a local image descriptor using VQ-SIFT for more effective and efficient image retrieval. Instead of SIFT's weighted orientation histograms, we apply a vector quantization (VQ) histogram as an alternative representation for SIFT features. Experimental results show that SIFT features using VQ-based local descriptors can achieve better image retrieval accuracy than the conventional algorithm while the computational cost is significantly reduced.
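
The general VQ-histogram idea can be sketched as follows: local feature vectors sampled around a keypoint are quantized against a learned codebook, and the histogram of codeword indices becomes the descriptor. The patch sampling, codebook size, and use of KMeans are illustrative assumptions; the paper's exact pipeline differs.

```python
# Sketch of a VQ-histogram local descriptor (generic form, not the paper's exact method).
import numpy as np
from sklearn.cluster import KMeans

def learn_codebook(training_vectors, codebook_size=64, seed=0):
    """training_vectors: (N, d) array of local feature vectors pooled from training images."""
    return KMeans(n_clusters=codebook_size, n_init=10, random_state=seed).fit(training_vectors)

def vq_descriptor(local_vectors, codebook):
    """Quantize the local vectors of one keypoint and return a normalized histogram."""
    codes = codebook.predict(local_vectors)                          # nearest codeword per vector
    hist = np.bincount(codes, minlength=codebook.n_clusters).astype(np.float64)
    norm = np.linalg.norm(hist)
    return hist / norm if norm > 0 else hist
```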

Explorations in the Role of Emotion in Moral Judgment

Recent theorizations on the cognitive process of moral judgment have focused on the role of intuitions and emotions, marking a departure from the previous emphasis on conscious, step-by-step reasoning. My study investigated how being in a disgusted mood state affects moral judgment. Participants were induced to enter a disgusted mood state by listening to disgusting sounds and reading disgusting descriptions. Results show that, compared with controls who had not been induced to feel disgust, these participants were more likely to endorse actions that are emotionally aversive but maximize utilitarian returns. The result is analyzed using the 'emotion-as-information' approach to decision making and is consistent with the view that emotions play an important role in determining moral judgment.

Investigation of Organizational Work-Life Imbalance of Thai Software Developers in a Multinational Software Development Firm using Fishbone Diagram for Knowledge Management

Work stress causes organizational work-life imbalance among employees. Because of this imbalance, workers put less effort into finishing assignments, and the organization consequently experiences reduced productivity. In order to investigate the problem, this qualitative case study focuses on the organizational work-life imbalance among Thai software developers in a German-owned company in Chiang Mai, Thailand. In terms of knowledge management, the fishbone diagram is a useful analysis tool for systematically investigating the root causes of an organizational work-life imbalance in focus-group discussions. Furthermore, the fishbone diagram clearly shows the relationship between causes and effects. It was found that the organizational work-life imbalance among Thai software developers is influenced by the management team, the work environment, and the information tools used in the company over time.

A Survey on Supply Chain Management and E-Commerce Technology Adoption among Logistics Service Providers in Johor

Logistics is the part of the supply chain process that plans, implements, and controls the efficient and effective forward and reverse flow and storage of goods, services, and related information between the point of origin and the point of consumption in order to meet customer requirements. This research aims to investigate the current status and future direction of the use of Information Technology (IT) for logistics, focusing on Supply Chain Management (SCM) and E-Commerce adoption in Johor. The research therefore focuses on the types of technology being adopted and on the factors, benefits, and barriers affecting innovation in SCM and E-Commerce technology adoption among Logistics Service Providers (LSPs). A mailed questionnaire survey was conducted to collect data from 265 logistics companies in Johor. The research revealed that SCM technology adoption among LSPs was relatively high, as they had adopted SCM technology in various business processes and perceived a high level of benefit from SCM adoption. By contrast, E-Commerce technology adoption among LSPs was relatively low.

Effect of Wheat Flour Extraction Rates on Flour Composition, Farinographic Characteristics and Sensory Perception of Sourdough Naans

The effect of wheat flour extraction rates on flour composition, farinographic characteristics and the quality of sourdough naans was investigated. The results indicated that by increasing the extraction rate, the amounts of protein, fiber, fat and ash increased, whereas the moisture content decreased. Farinographic characteristics such as water absorption and dough development time increased with an increase in flour extraction rate, but dough stability and tolerance indices were reduced with increasing extraction rates. Titratable acidity for both the sourdough and the sourdough naans also increased with the flour extraction rate. The study showed that the overall quality of the sourdough naans was affected by both the flour extraction rate and the starter culture used. Sensory analysis of the sourdough naans revealed that the desirable extraction rate for sourdough naan was 76%.

Simulation of Series Compensated Transmission Lines Protected with MOV

In this paper the behavior of fixed series compensated extra-high-voltage transmission lines during faults is simulated. Many over-voltage protection schemes for series capacitors are limited in terms of size and performance and are easily affected by environmental conditions, while more compact and environmentally robust equipment is required. The use of series capacitors to compensate part of the inductive reactance of long transmission lines increases the power transmission capacity. Emphasis is given to the impact of modern capacitor protection techniques (MOV protection). The simulation study is performed using MATLAB/SIMULINK® and results are given for a three-phase and a single-phase-to-ground fault.
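
Why series compensation raises transfer capacity can be seen from the standard lossless two-bus power transfer relation. The expression below is the textbook approximation (with V_S and V_R the sending- and receiving-end voltages, δ the load angle, X_L the line reactance, and k the degree of compensation); it is included only for orientation and is not taken from the paper.

```latex
% Textbook lossless-line approximation: the series capacitive reactance X_C
% reduces the net series reactance, so the maximum transferable power increases.
P = \frac{V_S V_R}{X_L - X_C}\,\sin\delta,
\qquad X_C = k\,X_L \;(0 < k < 1)
\;\Longrightarrow\;
P_{\max} = \frac{V_S V_R}{(1-k)\,X_L}.
```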

SPH Method used for Flow Predictions at a Turgo Impulse Turbine: Comparison with Fluent

This work is an attempt to use the standard Smoothed Particle Hydrodynamics (SPH) methodology for the simulation of the complex unsteady, free-surface flow in a rotating Turgo impulse water turbine. A comparison of two different geometries was conducted. The SPH method, owing to its mesh-less nature, is capable of capturing the flow features appearing in the turbine without diffusion at the water/air interface. Furthermore, the results are compared with a commercial CFD package (Fluent®), and the SPH algorithm proves capable of providing similar results in much less time than the mesh-based CFD program. A parametric study was also performed regarding the turbine inlet angle.

MaxMin Share Based Medium Access for Attaining Fairness and Channel Utilization in Mobile Adhoc Networks

Due to the complex network architecture, the multihop nature of a mobile ad hoc network creates additional problems for its users. When the traffic load at each node increases, the additional contention caused by the traffic pattern may allow nodes close to the destination to starve nodes farther from the destination; moreover, the capacity of the network is unable to satisfy the total user demand, which results in an unfairness problem. In this paper, we propose an algorithm to compute the optimal MAC-layer bandwidth assigned to each flow in the network. The contention area of the bottleneck links determines the fair time share, which is necessary to calculate the maximum allowed transmission rate for each flow. To fully utilize the network resources, we compute two optimal rates, namely the maximum fair share and the minimum fair share. The maximum fair share is used to limit the input rate of flows that cross the bottleneck links' contention area when those flows are not allocated the optimal transmission rate, and to calculate the next-highest fair share. Through simulation results, we show that the proposed protocol achieves an improved fair share and throughput with reduced delay.
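
The notion of a fair share over a shared bottleneck can be illustrated with the classical max-min fair allocation (progressive filling). The sketch below shows that standard procedure only, not the paper's distributed MAC-layer protocol, and the capacity and demand values are hypothetical.

```python
# Generic max-min fair allocation over a single shared bottleneck (progressive filling).
def max_min_fair_share(capacity, demands):
    """Return per-flow rates: no flow gets more than it demands, and capacity
    left unused by small flows is redistributed equally among the rest."""
    allocation = [0.0] * len(demands)
    remaining = list(range(len(demands)))
    cap = float(capacity)
    while remaining:
        share = cap / len(remaining)                      # equal split of what is left
        satisfied = [i for i in remaining if demands[i] <= share]
        if not satisfied:
            for i in remaining:                           # remaining flows are capped at the fair share
                allocation[i] = share
            break
        for i in satisfied:                               # small flows get exactly what they ask for
            allocation[i] = demands[i]
            cap -= demands[i]
        remaining = [i for i in remaining if i not in satisfied]
    return allocation

# Hypothetical example: a 10 Mb/s bottleneck shared by four flows.
print(max_min_fair_share(10.0, [1.0, 2.0, 6.0, 6.0]))     # -> [1.0, 2.0, 3.5, 3.5]
```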

Adaptive Fourier Decomposition Based Signal Instantaneous Frequency Computation Approach

There have been different approaches to computing the analytic instantaneous frequency, with a variety of background reasoning, practical applicability, and restrictions. This paper presents an instantaneous frequency computation approach based on adaptive Fourier decomposition and α-counting. Adaptive Fourier decomposition is a recently proposed signal decomposition approach; the instantaneous frequency can be computed through the so-called mono-components into which it decomposes the signal. Due to its fast energy convergence, the adaptive Fourier decomposition discards the highest-frequency part of the signal, which in most situations represents noise. A new instantaneous frequency definition for a large class of so-called simple waves is also proposed in this paper. Simple waves cover a wide range of signals for which the concept of instantaneous frequency has a clear physical sense. The α-counting instantaneous frequency can be used to compute the highest frequency of a signal. By combining these two approaches, one can obtain the instantaneous frequency of the whole signal. An experiment demonstrates the computation procedure with promising results.
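
For orientation, the classical analytic-signal definition of instantaneous frequency (the phase derivative of the Hilbert-transform analytic signal) is sketched below. This is the standard baseline that mono-component-based approaches refine, not the AFD or α-counting method of the paper; the test signal and sampling rate are hypothetical.

```python
# Classical analytic-signal instantaneous frequency (baseline, not the AFD method):
# IF(t) = (1 / 2*pi) * d/dt arg( x(t) + i * Hilbert{x}(t) )
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                                    # sampling rate (Hz), hypothetical
t = np.arange(0, 1.0, 1.0 / fs)
x = np.cos(2 * np.pi * (50 * t + 20 * t**2))   # chirp whose instantaneous frequency is 50 + 40 t Hz

analytic = hilbert(x)                          # analytic signal x + i * H{x}
phase = np.unwrap(np.angle(analytic))          # continuous instantaneous phase
inst_freq = np.gradient(phase, 1.0 / fs) / (2 * np.pi)  # phase derivative, in Hz
```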

An Optimal Feature Subset Selection for Leaf Analysis

This paper describes an optimal approach for feature subset selection to classify leaves, based on a Genetic Algorithm (GA) and Kernel-based Principal Component Analysis (KPCA). Due to the high complexity of selecting the optimal features, classification of leaf image data has become a critical task. Initially, shape, texture and colour features are extracted from the leaf images. These extracted features are optimized separately by the GA and by KPCA. The approach then performs an intersection operation over the subsets obtained from the two optimization processes. Finally, the most common matching subset is used to train a Support Vector Machine (SVM). Our experimental results show that applying GA and KPCA for feature subset selection with an SVM classifier is computationally effective and improves the accuracy of the classifier.
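
The combination step can be sketched as follows: two feature-index subsets (one from the GA stage, one from the KPCA stage) are intersected, and the shared subset is used to train the SVM. The two selector functions below are placeholders standing in for the paper's GA and KPCA stages, and the fallback behaviour is an assumption.

```python
# Sketch of the subset-intersection step followed by SVM training.
import numpy as np
from sklearn.svm import SVC

def ga_selected_indices(X, y):
    """Placeholder for the GA stage: returns indices of selected features."""
    return set(range(0, X.shape[1], 2))        # dummy choice, for illustration only

def kpca_selected_indices(X, y):
    """Placeholder for the KPCA stage: returns indices of selected features."""
    return set(range(0, X.shape[1], 3))        # dummy choice, for illustration only

def train_on_common_subset(X, y):
    common = sorted(ga_selected_indices(X, y) & kpca_selected_indices(X, y))
    if not common:                              # fall back to all features if the subsets do not overlap
        common = list(range(X.shape[1]))
    clf = SVC(kernel="rbf").fit(X[:, common], y)
    return clf, common
```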

Techno-Economics Study to Select Optimum Desalination Plant for Asalouyeh Combined Cycle Power Plant in Iran

This research presents a techno-economic analysis to select the most economical desalination method for the Asalouyeh combined cycle power plant. Due to the lack of fresh water, desalination of sea water is necessary to provide the DM water required by the power plant. The most common desalination methods are RO, MSF, MED, and MED–TVC. In this research, the RO, MED, and MED–TVC methods have been compared. Simulation results show that recovering heat from the exhaust gas of the main stack is the optimum option for providing the DM water required for the injected steam of MED desalination. This is important because it improves the thermal efficiency of the power plant through extra heat recovery. It is also shown that by adding three rows of finned tubes to the de-aerator evaporator, which is very simple and low cost, the steam required to generate 5200 m3/day of desalinated water is obtainable.

A Modified Cross Correlation in the Frequency Domain for Fast Pattern Detection Using Neural Networks

Recently, neural networks have shown good results for the detection of a certain pattern in a given image. In our previous papers [1-5], a fast algorithm for pattern detection using neural networks was presented. The algorithm was designed based on cross correlation in the frequency domain between the input image and the weights of the neural networks. Conversion of the image into a symmetric shape was established so that fast neural networks could give the same results as conventional neural networks. Another symmetric configuration was suggested in [3,4] to improve the speed-up ratio. In this paper, our previous algorithm for fast neural networks is developed further. The frequency-domain cross correlation is modified in order to compensate for the symmetry condition required of the input image. Two new ideas are introduced to modify the cross correlation algorithm. Both methods accelerate the fast neural networks, as there is no longer a need to convert the input image into a symmetric one. Theoretical and practical results show that both approaches provide a faster speed-up ratio than the previous algorithm.
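
The underlying speed-up comes from computing cross correlation in the frequency domain. The sketch below shows the generic FFT identity (correlation as pointwise multiplication with a conjugated spectrum); it is the textbook form, not the authors' modified algorithm.

```python
# Generic frequency-domain cross correlation:
# corr(image, kernel) = IFFT( FFT(image) * conj(FFT(kernel)) )
import numpy as np

def cross_correlate_fft(image, kernel):
    """Circular cross correlation of a 2-D image with a smaller kernel."""
    padded = np.zeros(image.shape)
    padded[:kernel.shape[0], :kernel.shape[1]] = kernel     # zero-pad kernel to image size
    corr = np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(padded)))
    return np.real(corr)                                     # peaks mark candidate pattern locations
```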

A Fault Tolerant Token-based Algorithm for Group Mutual Exclusion in Distributed Systems

The group mutual exclusion (GME) problem is a variant of the mutual exclusion problem. In the present paper, a token-based group mutual exclusion algorithm capable of handling transient faults is proposed. The algorithm uses the concept of dynamic request sets. A time-out mechanism is used to detect token loss, and a distributed scheme is used to regenerate the token. The worst-case message complexity of the algorithm is n+1. The maximum concurrency and forum-switch complexity of the algorithm are n and min(n, m) respectively, where n is the number of processes and m is the number of groups. The algorithm also satisfies another desirable property called smooth admission. The scheme can also be adapted to handle the extended group mutual exclusion problem.