Abstract: Various problems may cause distortion of the rotor, and hence vibration, which is the most severe form of damage to turbine rotors. Over the years, different techniques have been developed for straightening bent rotors. The straightening method can be selected according to initial information from preliminary inspections and tests, such as nondestructive tests, chemical analysis and run-out tests, together with knowledge of the shaft material. This article covers the various causes of excessive bends and then reviews some common applicable straightening methods. Finally, hot spotting is selected for a particular bent rotor. A 325 MW steam turbine rotor is modeled and finite element analyses are performed to investigate this straightening process. Experimental results show that performing the hot spot straightening process precisely reduced the bending of the rotor significantly.
Abstract: The first generation of mobile-agent-based Intrusion Detection Systems had just two components, namely data collection and a single centralized analyzer. The disadvantage of this type of intrusion detection is that if the connection to the analyzer fails, the entire system becomes useless. In this work, we propose a novel hybrid model for a Mobile Agent based Distributed Intrusion Detection System to overcome this problem. The proposed model has new features such as robustness, the capability of detecting intrusions against the IDS itself and the capability of updating itself to detect new patterns of intrusion. In addition, our proposed model is also capable of tackling some of the weaknesses of centralized Intrusion Detection System models.
Abstract: The Unified Theory of Acceptance and Use of Technology (UTAUT) model has demonstrated the factors influencing the use of generic information systems such as tablet personal computers (TPC) and mobile communication. However, in the context of digital library systems, there has been very little effort to determine the factors affecting the intention to use a digital library based on the UTAUT model. This paper investigates factors that are expected to influence the intention of postgraduate students to use a digital library, based on a modified UTAUT model. The modified model comprises constructs represented by several latent variables, namely performance expectancy (PE), effort expectancy (EE), information quality (IQ) and service quality (SQ), moderated by age, gender and experience in using the digital library. Results show that performance expectancy, effort expectancy and information quality are positively related to the intention to use the digital library, while service quality is negatively related to it. Age and gender show no evidence of any significant interactions, while experience in using the digital library significantly interacts with effort expectancy and the intention to use the digital library. This provides evidence of a moderating effect of experience on the intention to use the digital library. It is expected that this research will shed new light on research into acceptance of, and intention to use, the library in a digital environment.
Abstract: Computer programming is considered a very difficult course by many computer science students. The reasons for the difficulty include the cognitive load involved in programming, the different learning styles of students, the instructional methodology and the choice of programming language. To reduce these difficulties, approaches such as pair programming, program visualization and accommodating different learning styles have been tried. However, these efforts have produced limited success. This paper reviews the problem and proposes a framework to help students overcome the difficulties involved.
Abstract: In recent years, environmental regulations have been forcing manufacturers to consider recovery of end-of-life and/or returned products for refurbishing, recycling, remanufacturing/repair and disposal in supply chain management. In this paper, a mathematical model is formulated for a single-product production-inventory system that considers remanufacturing/reuse of returned products, where the rate of returned products follows a demand-like function dependent on purchasing price and acceptance quality level. The model is useful in deciding whether to remanufacture or dispose of returned products, alongside newly produced products, to satisfy a stationary demand. In addition, a modified genetic algorithm approach, inspired by the particle swarm optimization method, is proposed. Numerical analysis of a case study is carried out to validate the model.
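A PSO-inspired modification of a genetic algorithm can take several forms; the abstract does not specify which one is used, so the following is only a minimal sketch of one plausible hybrid, in which offspring additionally drift toward the current best individual (PSO's attraction to the global best). The toy quadratic objective is a hypothetical stand-in for the paper's production-inventory cost, and every parameter value is illustrative.

```python
import random

random.seed(1)  # reproducible demo

def pso_inspired_ga(cost, dim, pop_size=30, gens=200, lo=0.0, hi=10.0):
    """Genetic algorithm whose mutation step drifts offspring toward the
    current best individual, mimicking PSO's pull toward the global best."""
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=cost)
    for _ in range(gens):
        children = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)              # pick two parents
            cut = random.randrange(dim)
            child = a[:cut] + b[cut:]                 # one-point crossover
            for i in range(dim):                      # PSO-style drift to best
                child[i] += 0.5 * random.random() * (best[i] - child[i])
                child[i] = min(hi, max(lo, child[i]))
            children.append(child)
        pop = sorted(pop + children, key=cost)[:pop_size]  # elitist selection
        best = pop[0]
    return best

# Toy quadratic cost standing in for the production-inventory objective;
# its minimum is at (3, 7).
toy_cost = lambda x: (x[0] - 3.0) ** 2 + (x[1] - 7.0) ** 2
solution = pso_inspired_ga(toy_cost, dim=2)
```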
Abstract: This paper presents the exergy analysis of a desalination unit using the humidification-dehumidification process. The unit is considered as a thermal system with three main components: the heating unit, which uses a solar collector; the evaporator, or humidifier; and the condenser, or dehumidifier. In these components, exergy is a measure of the quality or grade of energy, and it can be destroyed in each of them. According to the second law of thermodynamics, this destroyed part is due to irreversibilities, which must be determined to obtain the exergetic efficiency of the system.
In the current paper, a computer program has been developed using Visual Basic to determine the exergy destruction and the exergetic efficiencies of the components of the desalination unit under variable operating conditions, such as feed water temperature, outlet air temperature, air to feed water mass ratio and salinity, in addition to cooling water mass flow rate and inlet temperature, as well as the quantity of solar irradiance.
The results obtained indicate that the exergy efficiency of the humidifier increases with increasing mass ratio and decreasing outlet air temperature. On the other hand, the exergy efficiency of the condenser increases with the increase of this ratio and also with the increase of the outlet air temperature.
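The second-law bookkeeping the abstract describes can be sketched in a few lines. This is a generic textbook formulation of flow exergy and a component exergy balance, not the paper's program; all stream values below are hypothetical.

```python
# Flow exergy relative to the dead state (T0, h0, s0): e = (h - h0) - T0*(s - s0)
T0 = 298.15  # dead-state temperature, K

def flow_exergy(h, s, h0, s0):
    """Specific flow exergy of a stream (kJ/kg), neglecting kinetic and
    potential terms."""
    return (h - h0) - T0 * (s - s0)

def exergy_balance(ex_in, ex_out):
    """Second-law bookkeeping for one component: exergy destroyed by
    irreversibilities, and the resulting exergetic efficiency."""
    destroyed = sum(ex_in) - sum(ex_out)          # kJ, >= 0 by the second law
    efficiency = sum(ex_out) / sum(ex_in)         # = 1 - destroyed / exergy in
    return destroyed, efficiency

# Hypothetical humidifier streams (kJ); the numbers are illustrative only.
e_stream = flow_exergy(h=2550.0, s=8.5, h0=104.9, s0=0.367)
destroyed, eta = exergy_balance(ex_in=[120.0, 30.0], ex_out=[110.0])
```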
Abstract: Software complexity metrics are used to predict critical information about the reliability and maintainability of software systems. Object-oriented software development requires a different approach to software complexity metrics. Object-oriented software metrics can be broadly classified into static and dynamic metrics. Static metrics give information at the code level, whereas dynamic metrics provide information about actual runtime behavior. In this paper we discuss the various complexity metrics and compare static and dynamic complexity.
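As a concrete instance of a static metric computed purely at the code level, the sketch below approximates McCabe cyclomatic complexity by counting branching constructs in a parsed source snippet. The counting rule (1 + number of decision points) is a common simplification, not a metric defined by this paper, and the sample snippet is hypothetical.

```python
import ast

def cyclomatic_complexity(source):
    """Approximate McCabe cyclomatic complexity of a Python snippet:
    1 + number of branching constructs (a common static-metric rule)."""
    branch_nodes = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                    ast.BoolOp, ast.IfExp)
    tree = ast.parse(source)
    return 1 + sum(isinstance(n, branch_nodes) for n in ast.walk(tree))

snippet = """
def classify(x):
    if x < 0:
        return "neg"
    for i in range(x):
        if i % 2 == 0:
            pass
    return "pos"
"""
cc = cyclomatic_complexity(snippet)  # 1 + if + for + if = 4
```

A dynamic metric, by contrast, would require instrumenting and running the program, e.g. counting objects created or methods dispatched at runtime.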
Abstract: Network reconfiguration in a distribution system is realized by changing the status of sectionalizing switches to reduce the power loss in the system. This paper presents a new method that applies the artificial bee colony (ABC) algorithm to determine the sectionalizing switches to be operated in order to solve the distribution system loss minimization problem. The ABC algorithm is a new population-based metaheuristic approach inspired by the intelligent foraging behavior of honeybee swarms. One advantage of the ABC algorithm is that it does not require external parameters such as the crossover rate and mutation rate, as genetic algorithms and differential evolution do, and these parameters are hard to determine in advance. Another advantage is that global search ability is implemented in the algorithm by introducing a neighborhood source production mechanism, which is similar to the mutation process. To demonstrate the validity of the proposed algorithm, computer simulations are carried out on 14-, 33-, and 119-bus systems and compared with different approaches available in the literature. The proposed method outperforms the other methods in terms of solution quality and computational efficiency.
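The employed/onlooker/scout structure and the neighborhood source production step of ABC can be illustrated on a toy continuous problem. The paper applies ABC to discrete switch selection in a power network; the sphere function and all parameter values below are illustrative placeholders for that objective.

```python
import random

random.seed(0)  # reproducible demo

def abc_minimize(f, dim, n_sources=10, limit=20, iters=200, lo=-5.0, hi=5.0):
    """Minimal artificial bee colony for a non-negative objective f.
    Neighborhood source production: v_ij = x_ij + phi * (x_ij - x_kj)."""
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_sources)]
    trials = [0] * n_sources

    def try_improve(i):
        k = random.choice([j for j in range(n_sources) if j != i])
        j = random.randrange(dim)
        v = X[i][:]
        v[j] += random.uniform(-1.0, 1.0) * (X[i][j] - X[k][j])
        v[j] = min(hi, max(lo, v[j]))
        if f(v) < f(X[i]):                    # greedy replacement
            X[i], trials[i] = v, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_sources):            # employed bee phase
            try_improve(i)
        fits = [1.0 / (1.0 + f(x)) for x in X]
        total = sum(fits)
        for _ in range(n_sources):            # onlooker phase: roulette wheel
            r, acc, i = random.uniform(0.0, total), 0.0, -1
            while acc < r:
                i += 1
                acc += fits[i]
            try_improve(max(i, 0))
        for i in range(n_sources):            # scout phase: abandon stale sources
            if trials[i] > limit:
                X[i] = [random.uniform(lo, hi) for _ in range(dim)]
                trials[i] = 0
    return min(X, key=f)

# Toy sphere function stands in for the loss-minimization objective.
best = abc_minimize(lambda x: sum(v * v for v in x), dim=3)
```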
Abstract: In the recent past, Learning Classifier Systems have been successfully used for data mining. A Learning Classifier System (LCS) is basically a machine learning technique which combines evolutionary computing, reinforcement learning, supervised or unsupervised learning and heuristics to produce adaptive systems. An LCS learns by interacting with an environment from which it receives feedback in the form of numerical reward. Learning is achieved by trying to maximize the amount of reward received. All LCS models, more or less, comprise four main components: a finite population of condition-action rules, called classifiers; the performance component, which governs the interaction with the environment; the credit assignment component, which distributes the reward received from the environment to the classifiers accountable for the rewards obtained; and the discovery component, which is responsible for discovering better rules and improving existing ones through a genetic algorithm. In one approach, the concatenation of the production rules in the LCS forms the genotype, so the GA operates on a population of classifier systems; this is known as the 'Pittsburgh' approach. LCSs that instead apply their GA at the rule level within a single population are known as 'Michigan' classifier systems. The most predominant representation of the discovered knowledge is standard production rules (PRs) of the form IF P THEN D. PRs, however, are unable to handle exceptions and do not exhibit variable precision. Censored Production Rules (CPRs), an extension of PRs proposed by Michalski and Winston, exhibit variable precision and support an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form IF P THEN D UNLESS C, where the censor C is an exception to the rule. Such rules are employed in situations in which the conditional statement IF P THEN D holds frequently and the assertion C holds rarely. By using a rule of this type, we are free to ignore the exception conditions when the resources needed to establish their presence are tight or there is simply no information available as to whether they hold or not. Thus, the IF P THEN D part of a CPR expresses important information, while the UNLESS C part acts only as a switch that changes the polarity of D to ~D. In this paper, the Pittsburgh-style LCS approach is used for automated discovery of CPRs. An appropriate encoding scheme is suggested to represent a chromosome consisting of a fixed-size set of CPRs. Suitable genetic operators are designed for the set of CPRs and for individual CPRs, and an appropriate fitness function, incorporating basic constraints on CPRs, is proposed. Experimental results are presented to demonstrate the performance of the proposed learning classifier system.
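The censor semantics described above (conclude D when C cannot be checked; flip the polarity of D when C holds) can be rendered directly in code. This is a minimal sketch of the rule's firing logic only, not of the learning system; the bird/penguin rule is a hypothetical example.

```python
def fire_cpr(p_holds, censor_check=None):
    """Evaluate a censored production rule IF P THEN D UNLESS C.
    censor_check is a zero-argument callable establishing the censor C,
    or None when the resources to check C are unavailable."""
    if not p_holds:
        return None                            # premise fails: rule inapplicable
    if censor_check is None:
        return "D"                             # censor ignored: conclude D anyway
    return "~D" if censor_check() else "D"     # censor flips the polarity of D

# Hypothetical rule: IF bird(x) THEN flies(x) UNLESS penguin(x)
assert fire_cpr(True) == "D"                   # no resources to check the censor
assert fire_cpr(True, lambda: False) == "D"    # censor established as false
assert fire_cpr(True, lambda: True) == "~D"    # rare exception holds
assert fire_cpr(False) is None
```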
Abstract: The photonic component industry is a highly innovative industry with a large value chain. In order to ensure the growth of the industry, much effort must be devoted to roadmapping activities. In such activities, demand and price evolution forecasting tools can prove quite useful in the roadmap refinement and update process. This paper attempts to provide useful guidelines for the roadmapping of optical components and considers two models, based on diffusion theory and the extended learning curve, for demand and price evolution forecasting.
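The two model families named above are commonly instantiated as the Bass diffusion model (demand) and a power-law learning curve (price). The sketch below shows those standard forms only; the abstract does not give the paper's exact equations, and every parameter value is a hypothetical placeholder.

```python
import math

def bass_adopters(p, q, m, periods):
    """Bass diffusion model: per-period adoptions n(t) driven by an innovation
    coefficient p, an imitation coefficient q and a market potential m."""
    cumulative, out = 0.0, []
    for _ in range(periods):
        n = (p + q * cumulative / m) * (m - cumulative)
        cumulative += n
        out.append(n)
    return out

def learning_curve_price(p0, cumulative_volume, learning_rate=0.8):
    """Learning-curve price: each doubling of cumulative production volume
    multiplies the unit price by learning_rate (an 80% curve by default)."""
    b = -math.log(learning_rate, 2)            # learning elasticity
    return p0 * cumulative_volume ** (-b)

# Hypothetical optical-component market: p, q, m and the starting price are
# illustrative placeholders, not figures from the paper.
demand = bass_adopters(p=0.03, q=0.4, m=1_000_000, periods=5)
price = learning_curve_price(p0=100.0, cumulative_volume=8.0)  # 3 doublings
```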
Abstract: Distance visualization of large datasets often takes the direction of remote viewing and zooming of stored static images. However, the continuous increase in dataset size and in the cost of visualization operations leads to insufficient performance on traditional desktop computers. Additionally, visualization techniques such as isosurfacing depend on the available resources of the running machine and on the size of the dataset. Moreover, the continuous demand for powerful computing and the continuous increase in dataset size create an urgent need for a grid computing infrastructure. However, some issues arise in current grids, such as resource availability at client machines, which is not sufficient to process large datasets. On top of that, different output devices and different network bandwidths between the visualization pipeline components often result in output suitable for one machine but not for another. In this paper, we investigate how grid services can be used to support remote visualization of large datasets and to break the constraint of physical co-location of resources by applying grid computing technologies. We present our grid-enabled architecture for visualizing large medical datasets (circa 5 million polygons) for remote interactive visualization on clients with modest resources.
Abstract: Local Linear Neuro-Fuzzy Models (LLNFM), like other neuro-fuzzy systems, are adaptive networks that provide robust learning capabilities and are widely utilized in various applications such as pattern recognition, system identification, image processing and prediction. The local linear model tree (LOLIMOT) is a type of Takagi-Sugeno-Kang neuro-fuzzy algorithm which has proven its efficiency compared with other neuro-fuzzy networks in learning nonlinear systems and in pattern recognition. In this paper, dedicated reconfigurable and parallel processing hardware for the LOLIMOT algorithm and its applications is presented. This hardware realizes on-chip learning, which gives it the capability to work as a standalone device in a system. Synthesis results on FPGA platforms show its potential to run at least 250 times faster than software implementations of the algorithm.
Abstract: Teachers form the backbone of any educational system; hence, selecting qualified candidates is very crucial. In Malaysia, decision making in the selection process involves several stages: initial filtering through academic achievement, an entry examination and an interview session. The last stage is the most challenging, since it depends heavily on human judgment. Therefore, this study sought to identify the selection criteria for teacher candidates that form the basis of an efficient multi-criteria teacher-candidate selection model for that last stage. The relevant criteria were determined from the literature and from expert input, that is, from those involved in interviewing teacher candidates at a public university offering the formal training program. Three main competency criteria were identified: content knowledge, communication skills and personality. Each main criterion was further divided into several subcriteria. The Analytical Hierarchy Process (AHP) technique was employed to allocate weights to the criteria, and a Simple Weighted Average (SWA) scoring approach was then integrated to develop the selection model. Subsequently, a web-based decision support system was developed to assist in selecting qualified teacher candidates. The Teacher-Candidate Selection (TeCaS) system is able to assist the panel of interviewers during the selection process, which involves a large number of complex qualitative judgments.
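The AHP-then-SWA pipeline can be sketched in a few lines: derive criterion weights as the principal eigenvector of a pairwise-comparison matrix, then score a candidate as the weighted average of their ratings. The pairwise judgments and ratings below are illustrative only, not values from the study.

```python
def ahp_weights(pairwise, iters=100):
    """Criterion weights from an AHP pairwise-comparison matrix, computed as
    the principal eigenvector via power iteration (pure-Python eigensolver)."""
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]          # renormalize so weights sum to 1
    return w

def swa_score(weights, ratings):
    """Simple Weighted Average: a candidate's weighted interview score."""
    return sum(wt * r for wt, r in zip(weights, ratings))

# Hypothetical pairwise comparison of the three main criteria
# (content knowledge vs. communication skills vs. personality).
A = [[1.0,     2.0,     3.0],
     [1 / 2.0, 1.0,     2.0],
     [1 / 3.0, 1 / 2.0, 1.0]]
weights = ahp_weights(A)
score = swa_score(weights, ratings=[4.0, 3.5, 4.5])  # ratings on a 1-5 scale
```

A full AHP implementation would also check the consistency ratio of each comparison matrix before accepting the weights.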
Abstract: In this paper, we consider the design of a pulse shaping filter using orthogonal Hermite-Rodriguez basis functions. The pulse shaping filter design problem is formulated and solved as a quadratic programming problem with linear inequality constraints. Compared with existing approaches reported in the literature, the use of Hermite-Rodriguez functions offers an effective alternative for solving the constrained filter synthesis problem. This is demonstrated through a numerical example concerned with the design of an equalization filter for a digital transmission channel.
Abstract: In this paper, we analyze the security protocols of the ZigBee wireless sensor network at the MAC layer. The AES 128-bit encryption algorithm in CCM* mode secures transferred data; however, the AES secret key may be broken in the near future. An efficient public key algorithm, ECC, has been combined with AES to protect the ZigBee wireless sensor network from ciphertext and replay attacks. In addition, the proposed protocol can parallelize the integrity function to increase system performance.
Abstract: This work presents the results of a study carried out to determine the sliding wear behavior of components manufactured by direct metal laser sintering (DMLS) and the effect of the process parameters on it. A standard procedure and specimen were used in the present study to determine the wear behavior. Using Taguchi's experimental technique, a modified L8 orthogonal array was developed. Sliding wear testing was carried out on a pin-on-disk machine, and the analysis of variance (ANOVA) technique was used to investigate the effect of the process parameters and to identify the main process parameter influencing the wear behavior of the DMLS components. It was found that part orientation, one of the selected process parameters, had more influence on wear than the other selected process parameters.
Abstract: Economic crime (i.e. corporate fraud) has a significant impact on business. This study analyzes the fraud cases reported by the Malaysian Securities Commission. Frauds involving market manipulation and/or illegal share trading are the most common types of fraud reported over the six years analyzed. The highest number of frauds reported involved investment and fund holding companies. Alarmingly, the results indicate that quite a high number of fraud cases are committed by management. The higher number of Chinese perpetrators may be due to the fact that they are the dominant group in Malaysian business. The results also show that more than half of the companies involved in fraud are privately held companies in the investment/fund/finance sector. The results of this study highlight general characteristics of the perpetrators (person and company) that commit fraud, which could help regulators in their monitoring and enforcement activities. For investors, this would help in analyzing their business investment or portfolio risk.
Abstract: Transmission Control Protocol (TCP) Vegas detects network congestion at an early stage and successfully prevents the periodic packet loss that usually occurs in TCP Reno. It has been demonstrated that TCP Vegas outperforms TCP Reno in many aspects. However, TCP Vegas suffers from several problems that affect its congestion avoidance mechanism. One of the most important weaknesses of TCP Vegas is that alpha and beta depend on a good expected-throughput estimate, which in turn depends on a good minimum RTT estimate. To make the system more robust, alpha and beta must be made responsive to network conditions (they are currently chosen statically). This paper proposes a modified Vegas algorithm which can be adjusted to give good performance compared to other transmission control protocols (TCPs). To do this, we use the PSO algorithm to tune alpha and beta. The simulation results validate the advantages of the proposed algorithm in terms of performance.
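Tuning (alpha, beta) with PSO can be sketched with a standard PSO minimizer. In the paper the objective would be a network-simulation run scoring a given (alpha, beta); here a toy quadratic surrogate with a made-up optimum at alpha = 2, beta = 4 stands in for it, and all PSO parameters are conventional illustrative choices.

```python
import random

random.seed(7)  # reproducible demo

def pso(objective, bounds, n_particles=15, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Standard global-best PSO minimizer over box-constrained variables."""
    dim = len(bounds)
    X = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                       # personal bests
    g = min(P, key=objective)[:]                # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d] + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (g[d] - X[i][d]))
                X[i][d] += V[i][d]
                lo, hi = bounds[d]
                X[i][d] = min(hi, max(lo, X[i][d]))  # keep inside bounds
            if objective(X[i]) < objective(P[i]):
                P[i] = X[i][:]
                if objective(P[i]) < objective(g):
                    g = P[i][:]
    return g

# Toy surrogate for the simulation-based score of a (alpha, beta) pair.
surrogate = lambda ab: (ab[0] - 2.0) ** 2 + (ab[1] - 4.0) ** 2
alpha, beta = pso(surrogate, bounds=[(0.0, 6.0), (0.0, 8.0)])
```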
Abstract: This study investigated the ecological effects of particulate pollution from a cement factory on the vegetation of the western Mediterranean coastal desert of Egypt. Variations in vegetation, soil chemical characteristics, and some responses of Atriplex halimus, the dominant species in the study area, were investigated at sites located in different directions from the cement factory, between Burg El-Arab in the east and El-Hammam in the west. The results showed an obvious decrease in vegetation diversity in response to cement-kiln dust pollution, accompanied by a high dominance attributed to the high contribution of Atriplex halimus. Annual species were found to be more sensitive to cement dust pollution, as they all failed to persist in highly disturbed sites. It is remarkable that the cover and phytomass of Atriplex halimus increased greatly in response to cement dust pollution, and this was accompanied by a reduction in the mature seeds and leaf area of the plant. The few seeds of the affected individuals appeared to be more fertile, attained higher germination percentages and exhibited hardening against drought stress.
Abstract: The performance of mortar subjected to high temperature and cooled at normal ambient temperature was examined in the laboratory to replicate the burning and cooling of a structure. Four series of cubical (5 x 5 x 5 cm) mortar specimens were made from OPC, with partial replacement (10, 15, 20, 25 and 30%) of OPC by rice husk ash (RHA) produced in an uncontrolled environment. The specimens were heated in an electric furnace to 200, 300, 400, 500 and 700°C and then kept at normal room temperature for cooling. They were then tested for mechanical properties, and the results show that the 20% RHA mortar in particular exhibits better fire performance.