Abstract: The objective of this paper is to present the
development of the frame of Chulalongkorn University team in TSAE
Auto Challenge Student Formula and Student Formula SAE
Competition of Japan. Chulalongkorn University's SAE team was
established in 2003, has joined many competitions since 2006 and
has become the leading team in Thailand. Over these five years, the
space frame was the chassis type most often selected, and it was
developed year by year through six frame designs. In this paper,
discussions on the conceptual design
of these frames are introduced, focusing on the mass and torsional
stiffness improvement. The torsional stiffness test was performed on
the actual frames used in competition and the results are compared.
The 2010-2011 frame is the first designed on the basis of analysis
and experiment that considered the required mass and torsional
stiffness. From the torsional stiffness results, it can be concluded
that the frames were progressively improved, with mass decreasing
and torsional stiffness increasing through the application of
several techniques.
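The torsional stiffness compared in such tests is simply the applied torque divided by the resulting twist angle. The sketch below illustrates the calculation; the torque and twist readings are hypothetical values for illustration, not the team's measurements.

```python
def torsional_stiffness(torque_nm, twist_deg):
    """Torsional stiffness K = T / theta, in N·m per degree."""
    return torque_nm / twist_deg

# Hypothetical twist-test readings (torque in N·m, twist in degrees);
# the paper's actual measurements are not reproduced here.
readings = [(100.0, 0.20), (200.0, 0.41), (300.0, 0.62)]
stiffnesses = [torsional_stiffness(t, d) for t, d in readings]
k_mean = sum(stiffnesses) / len(stiffnesses)
print(round(k_mean, 1))
```

Averaging over several torque levels, as here, smooths out measurement noise in any single reading.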
Abstract: Security has been an important issue and concern in the
smart home systems. Because smart home networks consist of a wide
range of wired and wireless devices, there is a possibility of
illegal access to restricted data or devices. Password-based
authentication is widely used to identify authorized users, because this
method is cheap, easy and quite accurate. In this paper, a neural
network is trained to store the passwords instead of using verification
table. This method is useful in solving security problems that
happened in some authentication system. The conventional way to
train the network using Backpropagation (BPN) requires a long
training time. Hence, a faster training algorithm, Resilient
Backpropagation (RPROP), is applied to the MLP neural
network to accelerate the training process. For the data set, 200
pairs of user IDs and passwords were created and encoded into binary
as the input. Simulations were carried out to evaluate the
performance for different numbers of hidden neurons and combinations
of transfer functions. Mean Square Error (MSE), training time and
number of epochs are used to determine the network performance.
From the results obtained, using Tansig in the hidden layer, Purelin
in the output layer and 250 hidden neurons gave the best performance.
As a result, a password-based user authentication system for smart
homes using a neural network was developed successfully.
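The binary encoding of credentials into network inputs described above can be sketched as follows; the eight-bit ASCII scheme and the sample user ID are illustrative assumptions, not necessarily the paper's exact encoding.

```python
def encode_ascii_binary(text, width=8):
    """Encode each character as fixed-width binary, one bit per
    input neuron of the MLP."""
    return [int(b) for ch in text for b in format(ord(ch), f"0{width}b")]

bits = encode_ascii_binary("user01")  # hypothetical user ID
print(len(bits))  # 6 characters * 8 bits = 48 input neurons
```

The network's input layer width is then fixed by the credential length times the bit width, which is why the paper evaluates different hidden-layer sizes separately.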
Abstract: A commonly acceptable cuisine is usually discussed in
multicultural/multiethnic nations, as it represents the process of
sharing food among the ethnic groups. Such a cuisine is also
considered as a precursor in the process of constructing the national
food identity within ethnic groups in the multicultural countries. The
adaptation of certain ethnic cuisines, through their types of food,
methods of cooking, ingredients and eating decorum, by other ethnic
groups is believed to create or enhance the formation of commonly
acceptable cuisines in a multicultural country. Malaysia, as a
multicultural country, is without doubt continuing to experience
cross-cultural processes among its ethnic groups, including in cuisine.
This study empirically investigates the extent to which Malay,
Chinese and Indian chefs adapt one another's ethnic cuisine
attributes toward the formation of commonly acceptable cuisines and
a national food identity.
Abstract: The frequency contents of the non-stationary
signals vary with time. For proper characterization of such
signals, a smart time-frequency representation is necessary.
Classically, the STFT (short-time Fourier transform) is
employed for this purpose. Its limitation is its fixed time-frequency
resolution. To overcome this drawback, an enhanced STFT version is
devised. It is based on a signal-driven sampling scheme named
cross-level sampling, which adapts the sampling frequency and the
window function (length plus shape) by following the local variations
of the input signal. This adaptation gives the proposed technique its
appealing features: adaptive time-frequency resolution and
computational efficiency.
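For contrast, the classic fixed-window STFT, whose rigid resolution the proposed adaptive scheme is designed to overcome, can be sketched as follows; the window length, hop size and test signal are illustrative choices.

```python
import cmath
import math

def stft(signal, win_len, hop):
    """Classic STFT: one fixed Hann window and hop for the whole signal,
    hence a single, fixed time-frequency resolution."""
    frames = []
    for start in range(0, len(signal) - win_len + 1, hop):
        frame = signal[start:start + win_len]
        windowed = [x * 0.5 * (1 - math.cos(2 * math.pi * n / (win_len - 1)))
                    for n, x in enumerate(frame)]
        # Direct DFT for clarity; a real implementation would use an FFT.
        spectrum = [sum(windowed[n] * cmath.exp(-2j * math.pi * k * n / win_len)
                        for n in range(win_len))
                    for k in range(win_len // 2)]
        frames.append([abs(c) for c in spectrum])
    return frames

# 4 cycles over 64 samples -> 2 cycles per 32-sample window -> bin 2 peaks
sig = [math.sin(2 * math.pi * 4 * t / 64) for t in range(64)]
mags = stft(sig, win_len=32, hop=16)
print(len(mags), len(mags[0]))
```

The adaptive version in the paper would instead vary `win_len` (and the sampling itself) with the local signal behaviour, which this fixed-parameter sketch cannot do.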
Abstract: The study of the stress distribution on a hollow
cylindrical fiber placed in a composite material is considered in this
work and an analytical solution for this stress distribution has been
constructed. Some parameters, such as the fiber's thickness and
length, are then considered and their effects on the stress
distribution are investigated. To find the governing relations, the
continuity equations for the axisymmetric problem in cylindrical
coordinates (r, θ, z) are considered. Then, by assuming certain
conditions, solving the governing equations and applying the boundary
conditions, an equation relating the stress applied to the
representative volume element to the stress distribution on the fiber
is found.
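For reference, the equilibrium (continuity) equations for an axisymmetric stress state in cylindrical coordinates, from which such an analysis proceeds, take the standard textbook form (body forces neglected; this is the generic form, not necessarily the paper's exact notation):

```latex
\frac{\partial \sigma_{rr}}{\partial r}
+ \frac{\partial \tau_{rz}}{\partial z}
+ \frac{\sigma_{rr} - \sigma_{\theta\theta}}{r} = 0,
\qquad
\frac{\partial \tau_{rz}}{\partial r}
+ \frac{\partial \sigma_{zz}}{\partial z}
+ \frac{\tau_{rz}}{r} = 0 .
```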
Abstract: A Mobile Ad hoc Network (MANET) is a wireless, self-configuring network of mobile routers (and associated hosts) connected by wireless links, the union of which forms an arbitrary topology because of the random mobility of the nodes. In this paper, an attempt has been made to compare the performance of three routing protocols, DSDV, AODV and DSR, under two traffic types, namely CBR and TCP, in a large network. The simulation tool is NS2, and the scenarios are designed to examine the effect of pause times. The results presented in this paper clearly indicate that the protocols behave differently under different pause times. The results also show the main characteristics of the different traffic types operating on MANETs, and thus allow the best protocol to be selected for each scenario.
Abstract: The purpose of this paper is to propose a framework for constructing correct parallel processing programs based on the Equivalent Transformation Framework (ETF). In this framework, a problem's domain knowledge and a query are described as definite clauses, and computation is regarded as the transformation of these definite clauses. Their meaning is defined by a model of the set of definite clauses, and the transformation rules generated must preserve this meaning. We have previously proposed a parallel processing method based on “specialization", a part of the transformation operations, which resembles substitution in logic programming. That method requires a “Memo-tree", a history of specializations, to maintain correctness. In this paper we propose a new method for specialization-based parallel processing that does not require the Memo-tree.
Abstract: This paper maps the structure of the social network of
the 2011 class of sixty graduate students of the Masters of Science
(Knowledge Management) programme at the Nanyang Technological
University, based on their friending relationships on Facebook. To
ensure anonymity, actual names were not used. Instead, they were
replaced with codes constructed from their gender, nationality, mode
of study, year of enrollment and a unique number. The relationships
between friends within the class, and among the seniors and alumni
of the programme were plotted. UCINet and Pajek were used to plot
the sociogram, to compute the density, inclusivity, and degree,
global, betweenness, and Bonacich centralities, to partition the
students into two groups, namely, active and peripheral, and to
identify the cut-points. Homophily was investigated, and it was
observed for nationality and study mode. The groups the students formed
on Facebook were also studied, and of fifteen groups, eight were
classified as dead, which we defined as those that have been inactive
for over two months.
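The density and degree computations mentioned above are straightforward on an edge list; in the sketch below the edges and the gender/nationality/mode codes are invented examples in the paper's coding style, not actual class data.

```python
from collections import defaultdict

# Hypothetical anonymized friendship edges (codes as in the paper's scheme)
edges = [("F-SG-FT-2011-01", "M-IN-PT-2011-02"),
         ("F-SG-FT-2011-01", "F-CN-FT-2011-03"),
         ("M-IN-PT-2011-02", "F-CN-FT-2011-03"),
         ("F-CN-FT-2011-03", "M-SG-FT-2011-04")]

nodes = sorted({n for e in edges for n in e})
# Density of an undirected graph: actual edges / possible edges
density = 2 * len(edges) / (len(nodes) * (len(nodes) - 1))

degree = defaultdict(int)
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

print(round(density, 3), max(degree, key=degree.get))
```

Tools like UCINet and Pajek compute these same quantities (plus the centrality and partition measures) directly from such an edge list.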
Abstract: Recently, the usefulness of Concept Abduction, a novel non-monotonic inference service for Description Logics (DLs), has been argued in the context of ontology-based applications such as semantic matchmaking and resource retrieval. Based on tableau calculus, a method has been proposed to realize this reasoning task in ALN, a description logic that supports simple cardinality restrictions as well as other basic constructors. However, in many ontology-based systems, representing the ontology requires more expressive formalisms to capture domain-specific constraints, for which this language is not sufficient. In order to increase the applicability of the abductive reasoning method in such contexts, this paper presents an extension of the tableau-based algorithm for dealing with concepts represented in ALCQ, the description logic that extends ALN with full concept negation and qualified number restrictions.
Abstract: This paper presents the results of an experimental
investigation carried out to evaluate the shrinkage of High Strength
Concrete. The High Strength Concrete is made by partially replacing
cement with fly ash and silica fume. Its shrinkage has been studied
using different types of coarse and fine aggregates, i.e. sandstone
and granite of 12.5 mm size, and Yamuna and Badarpur sand. The mix
proportion of the concrete is 1:0.8:2.2, with a water-cement ratio of
0.30. A superplasticizer dose of 2% by weight of cement is added to
achieve the required degree of workability in terms of compaction
factor.
From the test results of the above investigation it can be concluded
that the shrinkage strain of High Strength Concrete increases with
age. The shrinkage strain of concrete with 10% of the cement replaced
by fly ash and silica fume, respectively, is higher (by 6 to 10%) at
various ages than the shrinkage strain of concrete without fly ash
and silica fume. The shrinkage strain of concrete with Badarpur sand
as fine aggregate at 90 days is slightly less (by 10%) than that of
concrete with Yamuna sand. Further, the shrinkage strain of concrete
with granite as coarse aggregate at 90 days is slightly less (by 6 to
7%) than that of concrete with sandstone aggregate of the same size. The
shrinkage strain of High Strength Concrete is also compared with that
of normal strength concrete. Test results show that the shrinkage
strain of high strength concrete is less than that of normal strength
concrete.
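The shrinkage strain being compared is simply the relative length change of a specimen; a minimal sketch follows, where the gauge length and readings are hypothetical, not the study's data.

```python
def shrinkage_microstrain(initial_len_mm, final_len_mm):
    """Shrinkage strain = ΔL / L0, expressed in microstrain (×10⁻⁶)."""
    return (initial_len_mm - final_len_mm) / initial_len_mm * 1e6

# Hypothetical prism readings: 250 mm gauge length shrinking by 0.1 mm
eps = shrinkage_microstrain(250.0, 249.9)
print(round(eps))
```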
Abstract: Because the electrical metrics that describe photovoltaic
cell performance are inherently multivariate, the use of a
univariate, or one-variable, statistical process control chart can have
important limitations. Development of a comprehensive process
control strategy is known to be significantly beneficial to reducing
process variability that ultimately drives up the manufacturing cost
of photovoltaic cells. The multivariate moving average (MMA) chart
is applied to the electrical metrics of photovoltaic cells to illustrate
the improved sensitivity on process variability this method of control
charting offers. The results show that the MMA chart can accommodate
as many variables as needed, suggesting an application in which
multiple photovoltaic electrical metrics are used in concert to
determine the process's state of control.
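The moving-average step of an MMA chart can be sketched as below for two cell metrics at once; the sample values, the span, and the use of a plain mean vector are illustrative simplifications of the full charting procedure (which also scales the statistic by the covariance and compares it to control limits).

```python
def moving_average_vectors(samples, span):
    """MMA statistic: the mean vector of the last `span` multivariate
    samples, computed at each time point once the window is full."""
    out = []
    for i in range(span - 1, len(samples)):
        window = samples[i - span + 1:i + 1]
        p = len(samples[0])
        out.append(tuple(sum(s[j] for s in window) / span for j in range(p)))
    return out

# Hypothetical cell metrics per cell: (efficiency %, fill factor)
data = [(17.0, 0.78), (17.2, 0.79), (16.9, 0.77), (17.1, 0.80), (17.3, 0.78)]
mma = moving_average_vectors(data, span=3)
print(len(mma), tuple(round(v, 3) for v in mma[0]))
```

Charting several metrics jointly in this way is what lets the method detect drifts that no single univariate chart would flag.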
Abstract: We describe a novel method for removing noise (in the wavelet domain) of unknown variance from microarrays. The method is based on smoothing the coefficients of the highest subbands. Specifically, we decompose the noisy microarray into wavelet subbands, apply smoothing within each highest subband, and reconstruct the microarray from the modified wavelet coefficients. This process is applied a single time, and exclusively to the first level of decomposition; i.e., in most cases a multiresolution analysis is not necessary. Denoising results compare favorably with most methods currently in use.
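A one-dimensional, one-level version of this decompose-smooth-reconstruct pipeline can be sketched with the Haar wavelet; the Haar basis and the 3-tap smoother are illustrative stand-ins for the paper's actual wavelet and smoothing operator.

```python
def haar_level1(signal):
    """One-level Haar transform: approximation and detail subbands."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def smooth(coeffs):
    """3-tap moving-average smoothing of the highest (detail) subband."""
    padded = [coeffs[0]] + coeffs + [coeffs[-1]]
    return [(padded[i - 1] + padded[i] + padded[i + 1]) / 3
            for i in range(1, len(coeffs) + 1)]

def reconstruct(approx, detail):
    """Inverse one-level Haar transform."""
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

noisy = [1.0, 1.4, 1.1, 0.7, 2.0, 1.8, 1.2, 1.0]
a, d = haar_level1(noisy)
denoised = reconstruct(a, smooth(d))
print(len(denoised))
```

Only the detail (highest) subband is modified, mirroring the paper's claim that a single decomposition level usually suffices.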
Abstract: This paper describes technological possibilities for
enhancing methane production in the anaerobic stabilization of
wastewater treatment plant excess sludge. This objective can be
achieved by the addition of waste residues: crude glycerol from
biodiesel production and residues from fishery. The addition of
glycerol in an amount of 2-5% by weight enhances methane production
by about 250-400%. At the same time, the percentage increase of total
solids concentration in the outgoing sludge is ten or more times
smaller. The methane content of the biogas is higher in the case of
the admixed substrate.
Abstract: We consider a typical problem in the assembly of
printed circuit boards (PCBs) in a two-machine flow shop system to
simultaneously minimize the weighted sum of weighted tardiness and
weighted flow time. The investigated problem is a group scheduling
problem in which PCBs are assembled in groups and the interest is to
find the best sequence of groups as well as the boards within each
group to minimize the objective function value. The type of setup
operation between any two board groups is characterized as carryover
sequence-dependent setup time, which exactly matches the real
application of this problem. As a technical constraint, all of the
boards must be kitted, by kitting staff, before the assembly
operation starts (the kitting operation). The main idea developed in this paper
is to completely eliminate the role of kitting staff by assigning the
task of kitting to the machine operator during the time he is idle
which is referred to as integration of internal (machine) and external
(kitting) setup times. Performing the kitting operation, i.e.,
preparing the next set of boards while other boards are being
assembled, results in boards continuously entering the system, that
is, having dynamic arrival times. Consequently, a dynamic PCB
assembly system, which also has characteristics similar to those of
just-in-time manufacturing, is introduced for the first time. The
problem investigated is computationally very complex, meaning that
finding optimal solutions becomes impractical as the problem size
grows.
Thus, a heuristic based on Genetic Algorithm (GA) is employed. An
example problem illustrating the application of the developed GA is
demonstrated, and numerical results of applying the GA to several
instances are provided.
Abstract: The main aim of this study is to identify the most
influential variables that cause defects on the items produced by a
casting company located in Turkey. To this end, one of the items
produced by the company with high defective percentage rates is
selected. Two approaches, regression analysis and decision trees, are
used to model the relationship between process parameters and
defect types. Although the logistic regression models failed, the
decision tree model gives meaningful results. Based on these results, it can be
claimed that the decision tree approach is a promising technique for
determining the most important process variables.
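The split criterion a classification tree uses to rank process variables can be sketched via Gini impurity; the defect labels and the split below are invented for illustration.

```python
def gini(labels):
    """Gini impurity: the split criterion a classification tree minimizes."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

# Hypothetical defect labels for parts, split by a process-parameter threshold
left = ["ok", "ok", "ok", "defect"]
right = ["defect", "defect", "ok"]
parent = left + right
gain = gini(parent) - (len(left) * gini(left)
                       + len(right) * gini(right)) / len(parent)
print(round(gain, 3))
```

Variables whose thresholds yield the largest impurity reduction end up nearest the root of the tree, which is how the most influential process parameters are identified.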
Abstract: Many corporations are seriously concerned about
security of networks and therefore, their network supervisors are still
reluctant to install WLANs. In this regard, the IEEE802.11i standard
was developed to address the security problems, yet mistrust of
wireless LAN technology still exists. The premise is that the best
security solutions can be found in open-standards-based technologies
delivered by Virtual Private Networking (VPN), which has been used
for a long time without significant security holes emerging. This
work addresses this issue
and presents a simulated wireless LAN of IEEE802.11g protocol, and
analyzes the impact of integrating Virtual Private Network technology to
secure the flow of traffic between the client and the server within the
LAN, using OPNET WLAN utility. Two Wireless LAN scenarios
have been introduced and simulated. These are based on normal
extension to a wired network and VPN over extension to a wired
network. The results of the two scenarios are compared and indicate
the performance impact, measured by response time and load, of
running a Virtual Private Network over the wireless LAN.
Abstract: Software maintenance, and in particular software
comprehension, poses the largest costs in the software lifecycle. In
order to assess the cost of software comprehension, various
complexity measures have been proposed in the literature. This paper
proposes new cognitive-spatial complexity measures, which combine
the impact of spatial as well as architectural aspect of the software to
compute the software complexity. The spatial aspect of the software
complexity is taken into account using the lexical distances (in
number of lines of code) between different program elements and the
architectural aspect of the software complexity is taken into
consideration using the cognitive weights of control structures
present in control flow of the program. The proposed measures are
evaluated using standard axiomatic frameworks and then, the
proposed measures are compared with the corresponding existing
cognitive complexity measures as well as the spatial complexity
measures for object-oriented software. This study establishes that the
proposed measures are better indicators of the cognitive effort
required for software comprehension than the other existing
complexity measures for object-oriented software.
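A minimal sketch of how spatial and cognitive components might combine: lexical distance between a definition and its uses, scaled by Wang-style cognitive weights of control structures. The line numbers, the weight table and the combining formula are illustrative assumptions, not the paper's exact measures.

```python
# Cognitive weights of basic control structures (after Wang's cognitive
# weights: sequence = 1, branch = 2, iteration = 3).
COGNITIVE_WEIGHT = {"sequence": 1, "branch": 2, "iteration": 3}

def spatial_distance(def_line, use_lines):
    """Mean lexical distance (in lines of code) between an element's
    definition and its uses -- the spatial component."""
    return sum(abs(u - def_line) for u in use_lines) / len(use_lines)

def cognitive_spatial(def_line, use_lines, enclosing_structure):
    """Hypothetical combination: spatial distance scaled by the cognitive
    weight of the control structure enclosing the uses."""
    return (COGNITIVE_WEIGHT[enclosing_structure]
            * spatial_distance(def_line, use_lines))

# Element defined at line 10, used at lines 12, 25 and 40 inside a loop
print(cognitive_spatial(10, [12, 25, 40], "iteration"))
```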
Abstract: Wireless local area networks have become very popular;
the initial IEEE802.11 standard provides wireless connectivity to
automatic machinery, equipment and stations that require rapid
deployment and that may be portable, handheld or mounted on moving
vehicles within a local area. An IEEE802.11 wireless local area
network is a shared-medium communication network that transmits
information over wireless links for all IEEE802.11 stations in its
transmission range to receive. When a user moves from one location to
another, how can another user locate the required station inside the
WLAN? To address this, we designed and implemented a system to locate a mobile user
inside the wireless local area network based on RSSI with the help of
four specially designed architectures. These architectures are based
on statistical (i.e., manual) configuration of the mapping and a
radio map of indoor and outdoor locations, with the help of available
sniffer-based and cluster-based techniques. With them we locate a
mobile user in the WLAN more accurately. We tested this work in indoor
and outdoor environments with different locations with the help of
Pamvotis, a simulator for WLAN.
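The fingerprint-matching idea behind RSSI-based location can be sketched as nearest-neighbour search in signal space; the radio-map values and location names are invented, since a real radio map comes from a site survey.

```python
def nearest_fingerprint(radio_map, observed):
    """Locate a station by nearest neighbour in RSSI space
    (fingerprinting against a pre-built radio map)."""
    def dist(fp):
        return sum((a - b) ** 2 for a, b in zip(fp, observed)) ** 0.5
    return min(radio_map, key=lambda loc: dist(radio_map[loc]))

# Hypothetical radio map: location -> RSSI (dBm) from four access points
radio_map = {
    "lab":      (-40, -65, -70, -80),
    "corridor": (-55, -50, -72, -75),
    "lobby":    (-70, -68, -45, -60),
}
print(nearest_fingerprint(radio_map, (-54, -52, -71, -77)))
```

The four-access-point vector here loosely mirrors the four architectures' reliance on a surveyed mapping between positions and signal strengths.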
Abstract: This paper aims to select the optimal location and
setting parameters of TCSC (Thyristor Controlled Series
Compensator) controller using Particle Swarm Optimization (PSO)
and Genetic Algorithm (GA) to mitigate small signal oscillations in a
multimachine power system. Though Power System Stabilizers (PSSs)
are the prime choice for this issue, installation of a FACTS device
is suggested here in order to achieve appreciable damping of system
oscillations. However, the performance of any FACTS device depends
highly on its parameters and on a suitable location in the power
network. In this paper, PSO-based and GA-based techniques are applied
separately and their performances compared to investigate this
problem. The results of small signal stability analysis have been
represented employing eigenvalue analysis as well as time-domain
response in the face of two common power system disturbances, namely
varying load and transmission line outage. It is revealed that the
PSO-based TCSC controller is more effective than the GA-based
controller, even under critical loading conditions.
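The PSO search used to tune such a controller can be sketched on a toy objective; the inertia and acceleration constants below are common textbook choices, and the sphere function stands in for the paper's power-system damping objective and TCSC parameter bounds.

```python
import random

def pso(objective, dim, n_particles=20, iters=100, seed=1):
    """Minimal particle swarm: velocities pulled toward each particle's
    personal best and the swarm's global best."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # personal bests
    gbest = min(pbest, key=objective)[:]     # global best
    w, c1, c2 = 0.7, 1.5, 1.5                # inertia, cognitive, social
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
                if objective(pbest[i]) < objective(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso(lambda x: sum(v * v for v in x), dim=2)
print(sum(v * v for v in best) < 0.5)
```

In the paper's setting, `dim` would cover the TCSC location and setting parameters, and the objective would be a damping criterion derived from the system eigenvalues.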
Abstract: This paper presents a model for the characterization
and selection of beeswaxes for use as base substitute tissue for the
manufacture of objects suitable for external radiotherapy using
megavoltage photon beams. The model of characterization was
divided into three distinct stages: 1) verification of aspects related to
the origin of the beeswax, the bee species, the flora in the vicinity of
the beehives and procedures to detect adulterations; 2) evaluation of
physical and chemical properties; and 3) evaluation of beam
attenuation capacity. The chemical composition of the beeswax
evaluated in this study was similar to other simulators commonly
used in radiotherapy. The behavior of the mass attenuation coefficient
in the radiotherapy energy range was comparable to other simulators.
The proposed model is efficient and enables convenient assessment
of the use of any particular beeswax as a base substitute tissue for
radiotherapy.
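The beam-attenuation evaluation in stage 3 rests on exponential attenuation. A sketch follows; the mass attenuation coefficient, density and thickness are assumed water-like illustrative values, not the paper's measurements.

```python
import math

def transmitted_fraction(mu_rho_cm2_per_g, density_g_cm3, thickness_cm):
    """Beam attenuation: I/I0 = exp(-(mu/rho) * rho * x)."""
    return math.exp(-mu_rho_cm2_per_g * density_g_cm3 * thickness_cm)

# Assumed values: mu/rho ~ 0.049 cm^2/g (roughly water-like at megavoltage
# energies), beeswax density ~ 0.96 g/cm^3, 5 cm slab
frac = transmitted_fraction(0.049, 0.96, 5.0)
print(round(frac, 3))
```

Comparing such mass attenuation coefficients against those of standard tissue simulators over the radiotherapy energy range is exactly the comparison the model's third stage performs.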