Abstract: Roundabouts work on the principle of circulating and entry flows, where the maximum entry flow rate depends largely on the circulating flow, bearing in mind that entry flows must give way to circulating flows. Where an existing roundabout has a road hump installed at the entry arm, it can be hypothesized that the kinematics of vehicles may prevent the entry arm from achieving optimum performance. Road humps are traffic calming devices placed across the road width solely as a speed-reduction mechanism. They are the preferred traffic calming option in Malaysia and are often used on single and dual carriageway local routes. The speed limit on local routes is 30 mph (50 km/h). Road humps in their various forms achieved the largest mean speed reduction (based on a mean speed before traffic calming of 30 mph) of up to 10 mph (16 km/h), according to the UK Department of Transport. The underlying aim of reduced speed should be to achieve a 'safe' distribution of speeds that reflects the function of the road and the impacts on the local community. Constraining this safe distribution of speeds may lead to poor driver timing and delayed reflex reactions, which can cause accidents. Previous studies on road hump impact have focused mainly on speed reduction, traffic volume, noise and vibration, and the discomfort and delay caused by road humps. This paper examines the optimal entry and circulating flows induced by road humps. Results show that roundabout entry and circulating flows perform better where there is no road hump at the entrance.
Abstract: This paper introduces a hand gesture recognition system that recognizes gestures in real time in unconstrained environments. Efforts should be made to adapt computers to our natural means of communication: speech and body language. A simple and fast algorithm using orientation histograms is developed. It recognizes a subset of MAL static hand gestures. The pattern recognition system uses a transform that converts an image into a feature vector, which is then compared with the feature vectors of a training set of gestures. The final system is a perceptron implementation in MATLAB. The paper includes experiments on 33 hand postures and discusses the results. Experiments show that the system achieves an average recognition rate of 90% and is suitable for real-time applications.
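A minimal sketch of the orientation-histogram idea: gradient orientations of a grayscale image are binned into a normalized histogram, and an unknown gesture is assigned the label of the nearest training vector. This is a simplified stand-in for the paper's MATLAB transform and perceptron classifier, not its exact pipeline.

```python
import math

def orientation_histogram(img, bins=8):
    """Histogram of gradient orientations for a 2-D grayscale image
    given as a list of rows. Central differences give the gradient;
    each pixel votes for its orientation bin, weighted by magnitude."""
    hist = [0.0] * bins
    h, w = len(img), len(img[0])
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]
            gy = img[y + 1][x] - img[y - 1][x]
            mag = math.hypot(gx, gy)
            if mag > 0:
                ang = math.atan2(gy, gx) % (2 * math.pi)
                hist[int(ang / (2 * math.pi) * bins) % bins] += mag
    total = sum(hist) or 1.0
    return [v / total for v in hist]   # normalize for illumination/scale

def nearest_gesture(feature, training):
    """Label of the training (label, vector) pair nearest in squared
    Euclidean distance -- a stand-in for the perceptron stage."""
    return min(training, key=lambda kv: sum((a - b) ** 2
               for a, b in zip(feature, kv[1])))[0]
```

A horizontal-edge image and a vertical-edge image produce clearly separated histograms, which is all the nearest-vector comparison needs.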
Abstract: Quality Function Deployment (QFD) is a structured, multi-step planning method for delivering commodities, services, and processes to customers, both external and internal to an organization. It is a way to translate between the diverse customer languages expressing demands (Voice of the Customer) and the organization's languages expressing results that satisfy those demands. The approach is to establish one or more matrices that inter-relate producer and consumer reciprocal expectations. Because of its visual appearance, the central matrix is called the "House of Quality" (HOQ). In this paper, we cast the HOQ as a multi-attribute decision making (MADM) problem and, through a proposed MADM method, rank the technical specifications. We then compute the satisfaction degree of the customer requirements, applying fuzzy set theory to handle vagueness and uncertainty in the decision making. The approach uses a supervised neural network (perceptron) for solving the MADM problem.
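The core HOQ computation behind the ranking step can be sketched crisply: each technical specification's importance is the customer-weight-weighted sum of its relationship strengths. The paper's method adds fuzzy handling and a perceptron on top; the snippet below shows only the crisp baseline.

```python
def rank_specs(cust_weights, relations):
    """Crisp House-of-Quality ranking. relations[i][j] is the strength
    linking customer requirement i to technical spec j (the HOQ
    convention often uses 1/3/9). Returns (ranking, scores): spec
    indices ordered by weighted importance, highest first."""
    n_specs = len(relations[0])
    scores = [sum(cust_weights[i] * relations[i][j]
                  for i in range(len(cust_weights)))
              for j in range(n_specs)]
    ranking = sorted(range(n_specs), key=lambda j: -scores[j])
    return ranking, scores
```

In the fuzzy variant, the weights and relationship strengths would be fuzzy numbers and the sum a fuzzy aggregation; the crisp version above is the limiting case.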
Abstract: The problem of FIR system parameter estimation is considered in this paper. A new robust recursive algorithm is proposed for the simultaneous estimation of the parameters and the scale factor of the prediction residuals in a non-stationary environment corrupted by impulsive noise. The performance of the derived algorithm is verified through simulations.
Abstract: The purpose of this paper is to present two different approaches to financial distress pre-warning models appropriate for risk supervisors, investors, and policy makers. We examine a sample of the financial institutions and electronic companies of the Taiwan Security Exchange (TSE) market from 2002 through 2008. We present a binary logistic regression with panel data analysis. With the pooled binary logistic regression we can build a model that includes more variables in the regression than with random effects, while the in-sample and out-of-sample forecasting performance is higher under random effects estimation than under pooled regression. On the other hand, we estimate an Adaptive Neuro-Fuzzy Inference System (ANFIS) with Gaussian and generalized bell (Gbell) membership functions and find that ANFIS significantly outperforms the logit regressions in both in-sample and out-of-sample periods, indicating that ANFIS is a more appropriate tool for financial risk managers and for economic policy makers in central banks and national statistical services.
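The two ANFIS membership function families mentioned above have standard closed forms, sketched here for reference; the parameter names follow common usage and are not tied to the paper's fitted values.

```python
import math

def gbell(x, a, b, c):
    """Generalized bell membership function used in ANFIS:
    1 / (1 + |(x - c)/a|^(2b)). Equals 1 at the center c and
    0.5 at x = c +/- a; b controls the steepness of the shoulders."""
    return 1.0 / (1.0 + abs((x - c) / a) ** (2 * b))

def gaussian_mf(x, c, sigma):
    """Gaussian membership function exp(-(x - c)^2 / (2 sigma^2)),
    centered at c with width sigma."""
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))
```

In an ANFIS layer, each input feature passes through several such functions, and the resulting degrees of membership feed the fuzzy rule layer.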
Abstract: Carbon nanotubes (CNTs), with their outstanding mechanical, electrical, thermal, and chemical properties, are regarded as promising materials for many potential applications. Owing to these unique properties, they can be used in a wide range of fields such as electronic devices, electrodes, drug delivery systems, hydrogen storage, and textiles. Catalytic chemical vapor deposition (CCVD) is a common method for CNT production, especially mass production. Catalysts impregnated on a suitable substrate are important for production with the chemical vapor deposition (CVD) method. Iron catalyst on an MgO substrate is one of the most common catalyst-substrate combinations used for CNTs. In this study, CNTs were produced by CCVD of acetylene (C2H2) on a magnesium oxide (MgO) powder substrate impregnated with iron nitrate (Fe(NO3)3·9H2O) solution. The CNT synthesis conditions were as follows: at synthesis temperatures of 500 and 800°C, multiwall and single-wall CNTs were produced, respectively. Iron (Fe) catalysts were prepared with Fe:MgO ratios of 1:100, 5:100, and 10:100. The synthesis durations were 30 and 60 minutes for all temperatures and catalyst percentages. The synthesized materials were characterized by thermal gravimetric analysis (TGA), transmission electron microscopy (TEM), and Raman spectroscopy.
Abstract: This paper aims to (1) analyze the profiles of transgressors (detected evaders); (2) examine the reason(s) that triggered a tax audit, the causes of tax evasion, the audit timeframe, and the tax penalty charged; and (3) assess whether tax auditors followed the guidelines stated in the 'Tax Audit Framework' when conducting tax audits. In 2011, the Inland Revenue Board Malaysia (IRBM) audited and finalized 557 company cases. With official permission, data on all 557 cases were obtained from the IRBM. Of these, a total of 421 cases with complete information were analyzed. About 58.1% were small and medium corporations, and the largest share came from the construction industry (32.8%). The selection for tax audit was based on risk analysis (66.8%), information from third parties (11.1%), and firms with low profitability or fluctuating profit patterns (7.8%). The three persistent causes of tax evasion by firms were over-claimed expenses (46.8%), fraudulent reporting of income (38.5%), and overstated purchases (10.5%). These findings are consistent with past literature. Results showed that tax auditors took six to 18 months to close audit cases. More than half of the tax evaders were fined 45% of the additional tax raised during the audit for a first offence. The study found that tax auditors did follow the guidelines in the 'Tax Audit Framework' in audit selection, settlement, and penalty imposition.
Abstract: This paper presents an investigation of the power
penalties imposed by four-wave mixing (FWM) on G.652 (Single-
Mode Fiber - SMF), G.653 (Dispersion-Shifted Fiber - DSF), and
G.655 (Non-Zero Dispersion-Shifted Fiber - NZDSF) compliant
fibers, considering the DWDM grids suggested by the ITU-T
Recommendations G.692 and G.694.1, with uniform channel
spacing of 100, 50, 25, and 12.5 GHz. The mathematical/numerical
model assumes undepleted pumping, and shows very clearly the
deleterious effect of FWM on the performance of DWDM systems,
measured by the signal-to-noise ratio (SNR). The results make it
evident that non-uniform channel spacing is practically mandatory
for WDM systems based on DSF fibers.
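Why non-uniform spacing helps can be seen by enumerating the FWM product frequencies f_i + f_j − f_k: on a uniform grid many products land exactly on channels, while an unequally spaced (Golomb-ruler-like) grid can avoid all such coincidences. The grids below are illustrative, given in integer GHz to avoid floating-point comparison issues.

```python
def fwm_products(freqs):
    """All four-wave-mixing product frequencies f_i + f_j - f_k
    (i and j may coincide, the degenerate case; k differs from both),
    as in the undepleted-pump picture."""
    prods = []
    n = len(freqs)
    for i in range(n):
        for j in range(i, n):
            for k in range(n):
                if k != i and k != j:
                    prods.append(freqs[i] + freqs[j] - freqs[k])
    return prods

def hits_on_channels(freqs):
    """Count FWM products that fall exactly on an existing channel,
    i.e. the products that show up as in-band crosstalk."""
    chan = set(freqs)
    return sum(1 for p in fwm_products(freqs) if p in chan)
```

For four channels on a uniform 100 GHz grid several products coincide with channels, whereas spacing the same four channels by a Golomb-ruler pattern eliminates every coincidence.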
Abstract: The aim of this study is to evaluate the antinociceptive and anti-inflammatory activity of Geum kokanicum. After determining the LD50 of the total extract, different doses of the extract were chosen for intraperitoneal injection. In the inflammation test, male NMRI mice were divided into 6 groups: control (normal saline), positive control (dexamethasone, 15 mg/kg), and total extract (0.025, 0.05, 0.1, and 0.2 g/kg). Inflammation was produced by xylene-induced edema. To evaluate the antinociceptive effect of the total extract, the formalin test was used. Mice were divided into 6 groups: control, positive control (morphine, 10 mg/kg), and 4 groups that received the total extract. They then received formalin, and the animals were observed for their reaction to pain. Data were analyzed using one-way ANOVA followed by the Tukey-Kramer multiple comparison test. The LD50 was 1 g/kg. The data indicated that the 0.5, 0.1, and 0.2 g/kg doses of the total extract have notable antinociceptive and anti-inflammatory effects compared with the control (P
Abstract: Magnesium is a potential implant material owing to its non-toxicity to the human body. Because of their excellent biocompatibility, Mg alloys can be applied to implants, avoiding a second removal surgery. However, commercial magnesium alloys containing aluminum are found to have low corrosion resistance, resulting in subcutaneous gas bubbles and consequently limiting their use as permanent biomaterials. Moreover, aluminum is a known pollutant and is toxic to the nervous system. Therefore, in this study an Mg-35Zn-3Ca alloy is prepared as a new biodegradable material, and pulsed power is used in the constant-current mode of DC anodization. On this basis, the corrosion resistance and biocompatibility are examined as a function of current and frequency variation. The surface properties and coating thickness were compared using scanning electron microscopy. Corrosion resistance was assessed via potentiodynamic polarization, and the effect of the oxide layer on the body was assessed by cell viability. The anodized Mg-35Zn-3Ca alloy shows good in vitro biocompatibility over the current and frequency variations.
Abstract: One important problem in today's organizations is the existence of non-integrated information systems, and the inconsistency and lack of suitable correlation between legacy and modern systems. One main solution is to transfer the local databases into a global one. In this regard, we need to extract the data structures from the legacy systems and integrate them with the new technology systems. In legacy systems, huge amounts of data are stored in legacy databases. They require particular attention, since more effort is needed to normalize, reformat, and move them to modern database environments. Designing the new integrated (global) database architecture and applying reverse engineering requires data normalization. This paper proposes the use of database reverse engineering to integrate legacy and modern databases in organizations. The suggested approach consists of methods and techniques for generating the data transformation rules needed for data structure normalization.
Abstract: This paper proposes a modeling methodology for the development of a data analysis solution. The author introduces an approach to address data warehousing issues at the enterprise level. The methodology covers the process of the requirements elicitation and analysis stage as well as the initial design of the data warehouse. The paper reviews an extended business process model that satisfies the needs of data warehouse development. The author considers the use of business process models necessary, as they reflect both enterprise information systems and business functions, which are important for data analysis. The described approach divides development into three steps with differently detailed elaborations of the models, and makes it possible to gather requirements and present them to business users in an accessible manner.
Abstract: Twist drills are geometrically complex tools, and thus various researchers have adopted different mathematical and experimental approaches for their simulation. The present paper acknowledges the increasing use of modern CAD systems: using the API (Application Programming Interface) of a CAD system, drilling simulations are carried out. The developed DRILL3D software routine creates parametrically controlled tool geometries and, using different cutting conditions, generates solid models of all the relevant data involved (drilling tool, cut workpiece, undeformed chip). The final data derived constitute a platform for further direct simulations regarding the determination of cutting forces, tool wear, drilling optimizations, etc.
Abstract: We consider a heterogeneously mixing SIR stochastic epidemic process in populations described by a general graph. Likelihood theory is developed to facilitate statistical inference for the parameters of the model under complete observation. We show that these estimators are asymptotically unbiased and Gaussian by using a martingale central limit theorem.
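A minimal stochastic SIR simulation on a general graph, of the kind that generates the complete-observation data the likelihood theory assumes. This discrete-time sketch is an assumption of convenience; the paper's process is defined in continuous time.

```python
import random

def sir_on_graph(adj, beta, gamma, seed_node=0, rng=None, max_steps=10000):
    """Discrete-time stochastic SIR on a graph given as an adjacency
    list {node: [neighbours]}. Each step, every susceptible node is
    infected independently with probability beta per infectious
    neighbour, and every infectious node recovers with probability
    gamma. Returns the final number of recovered nodes."""
    rng = rng or random.Random()
    state = {v: "S" for v in adj}
    state[seed_node] = "I"
    for _ in range(max_steps):
        infectious = [v for v in adj if state[v] == "I"]
        if not infectious:
            break
        new_inf = set()
        for v in infectious:
            for u in adj[v]:
                if state[u] == "S" and rng.random() < beta:
                    new_inf.add(u)
        for v in infectious:             # recoveries this step
            if rng.random() < gamma:
                state[v] = "R"
        for u in new_inf:                # infections take effect next step
            state[u] = "I"
    return sum(1 for s in state.values() if s == "R")
```

On a complete graph this reduces to homogeneous mixing; heterogeneity enters purely through the adjacency structure.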
Abstract: Automatic face detection is a complex problem in image processing. Many methods exist to solve it, such as template matching, Fisher linear discriminants, neural networks, SVMs, and MRC. Each has achieved success to varying degrees and with varying complexity. The proposed algorithm uses upright, frontal faces in single grayscale images of decent resolution taken under good lighting conditions. In face recognition, a single face is matched against the single faces in the training dataset. The authors propose a neural-network-based face detection algorithm that works on photographs and checks any new test data against the online scanned training dataset. Experimental results show that the algorithm achieves up to 95% detection accuracy.
Abstract: A high performance computer includes a fast processor and millions of bytes of memory. During data processing, huge amounts of information are shuffled between the memory and the processor. Because of its small size and high speed, the cache has become a common feature of high performance computers. Enhancing cache performance has proved essential in speeding up cache-based computers. Most enhancement approaches can be classified as either software based or hardware controlled. The performance of the cache is quantified in terms of the hit ratio or miss ratio. In this paper, we optimize cache performance by enhancing the cache hit ratio. Optimum cache performance is obtained by modifying the cache hardware so that it quickly rejects mismatched line tags before the hit-or-miss comparison stage, and thus a low hit time for the wanted line in the cache is achieved. In the proposed technique, which we call Even-Odd Tabulation (EOT), the cache lines coming from main memory into the cache are classified into two types, even line tags and odd line tags, depending on their least significant bit (LSB). The EOT technique exploits this division to reject mismatched line tags in far less time than that spent by the main comparator in the cache, giving an optimum hit time for the wanted cache line. The high performance of the EOT technique against the familiar mapping technique FAM is shown in the simulation results.
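The EOT idea can be modeled in a few lines: tags are tabulated by their LSB, so a lookup only compares within the matching parity group, rejecting roughly half the lines up front. This toy model counts comparisons rather than modeling hardware timing, and is a sketch of the concept, not the paper's simulator.

```python
class EOTCache:
    """Toy model of Even-Odd Tabulation for a fully associative cache:
    tags are split by their least significant bit, so a lookup only
    compares against tags of matching parity."""

    def __init__(self):
        self.tags = {0: set(), 1: set()}   # even-tag and odd-tag tables

    def insert(self, tag):
        self.tags[tag & 1].add(tag)

    def lookup(self, tag):
        """Return (hit, comparisons): whether the tag is cached, and
        how many candidate tags share its parity and so reach the
        main comparison stage."""
        group = self.tags[tag & 1]
        return tag in group, len(group)
```

With tags distributed evenly over parities, the comparison workload at the main comparator is roughly halved relative to searching the whole tag store.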
Abstract: Online trading is an alternative to conventional shopping. People trade goods that are new or pre-owned. However, there are times when a user is unable to find the desired items online. This may be because the items have not yet been posted, thus ending the search. A conventional search mechanism only works by matching the search criteria (requirement) against the data currently available in a particular database. This research aims to match current search requirements with future postings, introducing the time factor into the conventional search method. A Car Matching Alert System (CMAS) prototype was developed to test the matching algorithm. When a buyer's search returns no result, the system saves the search, and the buyer is alerted if a match is found among future postings. The algorithm developed is useful, as it can also be applied in other search contexts.
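A minimal sketch of the CMAS matching logic, under the assumption that criteria are exact-match fields: a search with no result is saved, and each new posting is checked against the saved searches. The field names and values below are hypothetical illustrations, not the prototype's schema.

```python
class MatchAlerts:
    """Saved-search alerting: failed searches are stored and matched
    against future postings, adding the time dimension to search."""

    def __init__(self):
        self.saved = []   # list of (buyer, criteria) pairs

    def search(self, buyer, criteria, postings):
        """Conventional search; if nothing matches, save the request."""
        hits = [p for p in postings
                if all(p.get(k) == v for k, v in criteria.items())]
        if not hits:
            self.saved.append((buyer, criteria))
        return hits

    def on_new_posting(self, posting):
        """Check a future posting against saved searches; return the
        buyers to alert and drop their satisfied searches."""
        matches = lambda c: all(posting.get(k) == v for k, v in c.items())
        alerted = [b for b, c in self.saved if matches(c)]
        self.saved = [(b, c) for b, c in self.saved if not matches(c)]
        return alerted
```

A production system would replace the exact-match predicate with the prototype's richer criteria (ranges, fuzzy model names), but the save-then-alert flow is the same.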
Abstract: The influence of an axial magnetic field (B = 0.48 T) on the variation of the ionization efficiency coefficient η and the secondary electron emission coefficient γ with respect to the reduced electric field E/P is studied over a new range of plane-parallel electrode spacings (0 < d < 20 cm) and nitrogen working pressures between 0.5 and 20 Pa. The axial magnetic field is produced by an inductive copper coil of radius 5.6 cm. The experimental breakdown voltage data are used to estimate the mean Paschen curves under the different working conditions. The secondary electron emission coefficient is calculated from the mean Paschen curve and used to determine the minimum breakdown voltage. A reduction of the discharge voltage of about 25% is observed under the applied axial magnetic field. At large inter-electrode spacings, the effect of the axial magnetic field becomes more significant for the obtained values of η but less so for the values of γ.
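The link between γ and the minimum breakdown voltage follows from Paschen's law, V_b = Bpd / ln[Apd / ln(1 + 1/γ)], whose minimum has a closed form. The constants A and B below are illustrative placeholders, not values fitted to the paper's nitrogen data.

```python
import math

def paschen_vb(pd, A, B, gamma):
    """Paschen breakdown voltage V_b = B*pd / ln(A*pd / ln(1 + 1/gamma)),
    with pd the pressure-distance product and A, B gas constants."""
    return B * pd / math.log(A * pd / math.log(1.0 + 1.0 / gamma))

def paschen_minimum(A, B, gamma):
    """Closed-form minimum of the Paschen curve:
    (pd)_min = (e/A) * ln(1 + 1/gamma), and V_min = B * (pd)_min,
    since the logarithmic denominator equals 1 there."""
    pd_min = math.e / A * math.log(1.0 + 1.0 / gamma)
    return pd_min, B * pd_min
```

Because (pd)_min and V_min both grow with ln(1 + 1/γ), a γ extracted from the measured mean Paschen curve immediately fixes the minimum breakdown voltage, as the abstract describes.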
Abstract: This paper presents an interval-based multi-attribute decision making (MADM) approach to support decision processes with imprecise information. The proposed decision methodology is based on the model of a linear additive utility function but extends the problem formulation with a measure of composite utility variance. A sample study concerning the evaluation of electric generation expansion strategies is provided, showing how imprecise data may affect the choice of the best solution and how a set of alternatives acceptable to the decision maker (DM) may be identified with a certain confidence.
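One common way to pair a linear additive utility with a variance measure, assuming each interval score is uniformly distributed over its bounds, is sketched below. This is a generic construction for illustration, not necessarily the paper's exact formulation.

```python
def interval_utility(weights, intervals):
    """Linear additive utility with interval-valued attribute scores.
    Each interval [a, b] is treated as a uniform random variable with
    mean (a + b)/2 and variance (b - a)^2 / 12; the composite utility
    then has mean sum(w_j * m_j) and variance sum(w_j^2 * v_j),
    assuming independent attributes."""
    mean = sum(w * (a + b) / 2 for w, (a, b) in zip(weights, intervals))
    var = sum(w * w * (b - a) ** 2 / 12
              for w, (a, b) in zip(weights, intervals))
    return mean, var
```

Two alternatives with similar mean utilities but different variances can then be separated by the confidence the decision maker attaches to the imprecise scores.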
Abstract: The dynamic behaviour of a four-bar linkage driven by a velocity-controlled DC motor is discussed in this paper. In particular, the author presents results obtained by means of specifically developed software, which implements the mathematical models of all components of the system (linkage, transmission, electric motor, control devices). The use of this software enables a more efficient design approach, since it allows the designer to check, in a simple and immediate way, the dynamic behaviour of the mechanism arising from different values of the system parameters.
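The kinematic core of such a simulation is the four-bar position analysis. A compact sketch by circle intersection (rather than the Freudenstein equation) is shown below, with the ground pivots placed on the x-axis by assumption; it is an illustrative building block, not the paper's software.

```python
import math

def fourbar_output(theta2, r1, r2, r3, r4, branch=+1):
    """Four-bar position analysis by circle intersection. Ground
    pivots at O = (0, 0) and D = (r1, 0); crank OA = r2 at angle
    theta2; coupler AB = r3; rocker DB = r4. Returns the rocker
    angle theta4 for the chosen assembly branch, or None if the
    loop cannot close at this crank angle."""
    ax, ay = r2 * math.cos(theta2), r2 * math.sin(theta2)
    dx, dy = r1 - ax, -ay                      # vector from A to D
    dist = math.hypot(dx, dy)
    if dist > r3 + r4 or dist < abs(r3 - r4) or dist == 0:
        return None                            # no assembly configuration
    # intersection of circles centered at A (radius r3) and D (radius r4)
    a = (r3 ** 2 - r4 ** 2 + dist ** 2) / (2 * dist)
    h = math.sqrt(max(0.0, r3 ** 2 - a ** 2))
    mx, my = ax + a * dx / dist, ay + a * dy / dist
    bx = mx + branch * h * (-dy) / dist
    by = my + branch * h * dx / dist
    return math.atan2(by, bx - r1)             # rocker angle at pivot D
```

Stepping theta2 according to the motor's velocity profile and differentiating theta4 numerically yields the rocker velocity, the starting point for the dynamic model.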