Abstract: Active vibration isolation systems are less commonly
used than passive systems due to their associated cost and power
requirements. In principle, semi-active isolation systems can deliver
the versatility, adaptability and higher performance of fully active
systems for a fraction of the power consumption. Various semi-active
control algorithms have been suggested in the past. This paper
studies a four-degree-of-freedom (4DOF) model of semi-active suspension
performance controlled by on–off and continuous skyhook damping control
strategies. The frequency and transient responses of the model are
evaluated in terms of body acceleration, roll angle and tire deflection,
and are compared with those of a passive damper. The results show
that the semi-active system controlled by the skyhook strategy always
provides better isolation than a conventional passively damped
system except at the tire natural frequencies.
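The two skyhook strategies named above follow standard control laws, which can be sketched for a single damper acting between the sprung (body) and unsprung (wheel) masses. The damping rates c_max, c_min and the skyhook gain c_sky below are illustrative tuning parameters, not values from the paper:

```python
# Minimal sketch of the two skyhook damping strategies, assuming a
# semi-active damper whose coefficient can vary between c_min and c_max.

def on_off_skyhook(v_body, v_rel, c_max=3000.0, c_min=300.0):
    """On-off skyhook: switch to the high damping rate only when the
    damper force can oppose the absolute body velocity (v_body * v_rel > 0),
    otherwise fall back to the minimum rate."""
    return c_max if v_body * v_rel > 0 else c_min

def continuous_skyhook(v_body, v_rel, c_sky=3000.0, c_max=3000.0, c_min=300.0):
    """Continuous skyhook: track the ideal skyhook force c_sky * v_body,
    clipped to the damping range the semi-active damper can realize."""
    if v_body * v_rel > 0:
        c = c_sky * v_body / v_rel  # coefficient that realizes the skyhook force
        return min(max(c, c_min), c_max)
    return c_min
```

The continuous variant modulates the coefficient smoothly rather than switching between two levels, which is what typically yields the smaller body acceleration reported for such strategies.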
Abstract: The chemistry of sulphone hydrazides has gained increasing interest in both synthetic organic chemistry and biological fields and has considerable value. The therapeutic importance of these compounds motivates continued research in this area. The present review covers the literature to date on the synthesis, reactions and applications of such compounds.
Abstract: The menace of counterfeit pharmaceuticals/drugs has become a major threat to consumers, healthcare providers, drug manufacturers and governments. It is a source of public health concern in both developed and developing nations. Several solutions for detecting and authenticating counterfeit drugs have been adopted by different nations of the world. In this article, a dialogue-system-based drug counterfeiting detection system was developed, and the results of a user satisfaction and acceptability study of the system are presented. The results show that the users were satisfied with the system and that it was widely accepted as a means of fighting counterfeit drugs.
Abstract: The monitored 3-Dimensional (3D) video experience can be utilized as “feedback information” to fine-tune service parameters and provide a better service to demanding 3D service customers. The 3D video experience, which includes both video quality and depth perception, is influenced by several contextual and content-related factors (e.g., ambient illumination conditions, content characteristics, etc.) due to the complex nature of 3D video. Therefore, the factors affecting this experience should be taken into account when assessing it. In this paper, the structural information of the depth map sequences of the 3D video is considered as a content-related factor affecting depth perception assessment. A cartoon-like filter is utilized to abstract the significant depth levels in the depth map sequences in order to determine the structural information. Moreover, subjective experiments are conducted using 3D videos associated with cartoon-like depth map sequences to investigate the effect of the ambient illumination condition, a contextual factor, on depth perception. Using the knowledge gained through this study, 3D video experience metrics can be developed to deliver better service to 3D video service users.
Abstract: Learning programming is difficult for many learners. Some studies have found that the main difficulty relates to cognitive load. Cognitive overload happens in programming because the subject is intrinsically demanding on working memory; it arises from the complexity of the subject itself. The problem is made worse by the poor instructional design methodology used in the teaching and learning process. Various efforts have been proposed to reduce the cognitive load, e.g. visualization software, the part-program method, etc. Many computer-based systems have also been tried to tackle the problem. However, little progress has been made in alleviating it, and more has to be done to overcome this hurdle. This research attempts to understand how cognitive load can be managed so as to reduce the problem of overloading. We propose a mechanism to measure cognitive load during the pre-instruction, in-instruction and post-instruction stages of learning. This mechanism is used to guide the instruction: as the load changes, the instruction adapts itself to ensure cognitive viability. The mechanism could be incorporated as a sub-domain in the student model of various computer-based instructional systems to facilitate the learning of programming.
Abstract: A synchronous network-on-chip using wormhole packet switching
and supporting guaranteed-completion best-effort with low-priority (LP)
and high-priority (HP) wormhole packet delivery service is presented in
this paper. Both our proposed LP and HP message services deliver a good
quality of service in terms of lossless packet completion and in-order message
data delivery. However, the LP message service does not guarantee a minimal
completion bound. The HP packets will use 100% of the bandwidth of
their reserved links if they are injected from the source node at the
maximum injection rate. Hence, the HP service is suitable for small messages
(less than a hundred bytes); otherwise, other HP and LP messages that
also require the links will experience relatively high latency, depending on
the size of the HP message. The LP packets are routed using a minimal adaptive
routing algorithm, while the HP packets are routed using a non-minimal adaptive
routing algorithm. An additional 3-bit field identifying the packet type
is therefore introduced in the packet headers to classify packets and to
determine the type of service committed to each packet. Our NoC prototypes
have also been synthesized using a 180-nm CMOS standard-cell technology to
evaluate the cost of implementing the combination of both services.
Abstract: Linear stability analysis of wake-shear layers in two-phase
shallow flows is performed in the present paper. Two-dimensional
shallow water equations are used in the analysis. It is
assumed that the fluid contains uniformly distributed solid particles.
No dynamic interaction between the carrier fluid and the particles is
assumed at the initial moment. The stability calculations are
performed for different values of the particle loading parameter and
two other parameters which characterize the velocity ratio and the
velocity deficit. The results show that the particle loading parameter
has a stabilizing effect on the flow while the increase in the velocity
ratio or in the velocity deficit destabilizes the flow.
Abstract: The spectral action balance equation is used to simulate
short-crested wind-generated waves in shallow-water areas such as
coastal regions and inland waters. The equation involves two spatial
dimensions, wave direction, and wave frequency, and can be solved by
the finite difference method. When the equation, whose convection
term dominates, is discretized using central differences, stability
problems occur if the grid spacing is chosen too coarse. In this
paper, we introduce a splitting upwind scheme that avoids these
stability problems and prove that it is consistent with the upwind
scheme of the same accuracy. The splitting upwind scheme splits the
wave spectral action balance equation into four one-dimensional
problems, each of which yields an independent tridiagonal linear
system. The smaller systems can be solved by direct or iterative
methods at the same time, which is very fast when performed on a
multi-processor computer.
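Each one-dimensional sub-problem above reduces to a tridiagonal linear system. As a minimal sketch of such a solve, consider an implicit first-order upwind discretization of the model convection equation u_t + a·u_x = 0 with a > 0, standing in for one convection direction of the action balance equation (the grid spacing, speed, and boundary treatment below are illustrative assumptions), solved directly with the Thomas algorithm:

```python
# Sketch: implicit upwind step for u_t + a*u_x = 0 (a > 0) solved via the
# Thomas algorithm, as one of the one-dimensional sub-problems would be.

def thomas(sub, main, sup, rhs):
    """Solve a tridiagonal system with sub-, main- and super-diagonals
    sub, main, sup and right-hand side rhs (sub[0] and sup[-1] unused)."""
    n = len(rhs)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = sup[0] / main[0], rhs[0] / main[0]
    for i in range(1, n):                      # forward elimination
        m = main[i] - sub[i] * cp[i - 1]
        cp[i] = sup[i] / m
        dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def upwind_step(u, speed, dt, dx):
    """Advance u one implicit upwind time step (inflow boundary held fixed)."""
    n = len(u)
    r = speed * dt / dx
    sub = [-r] * n
    main = [1.0 + r] * n
    sup = [0.0] * n
    main[0], sub[0] = 1.0, 0.0                 # Dirichlet inflow boundary
    return thomas(sub, main, sup, u)
```

Because the four directional sub-systems are independent, several such solves can proceed in parallel, which is the source of the multi-processor speed-up mentioned above.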
Abstract: In the last decade, energy-based control theory has achieved a significant breakthrough in dealing with underactuated mechanical systems through two successful and similar tools, controlled Lagrangians and controlled Hamiltonians (IDA-PBC). However, because of the complexity of these tools, successful case studies are lacking, in particular MIMO cases. The seminal theoretical paper on controlled Lagrangians by Bloch and his colleagues presented a benchmark example, a 4-d.o.f. underactuated pendulum on a cart, but a detailed and complete design was omitted. To fill this gap, this note revisits their design idea by deriving explicit control functions for a similar device, motivated by a vector-thrust body hovering in the air. To the best of our knowledge, this system is the first MIMO underactuated example stabilized using energy-based tools following the original design idea. Some observations are given based on computer simulations.
Abstract: Cloud Computing has recently emerged as a
compelling paradigm for managing and delivering services over the
internet. The rise of Cloud Computing is rapidly changing the
landscape of information technology, and is ultimately turning the
long-held promise of utility computing into a reality. As the Cloud
Computing paradigm progresses rapidly, its concepts and terminologies
are becoming imprecise and ambiguous, and different technologies are
overlapping. Thus, it becomes crucial to clarify the key concepts and
definitions. In this paper, we present the anatomy of Cloud Computing,
covering its essential concepts, prominent characteristics, effects,
architectural design and key technologies, and we differentiate the
various service and deployment models. We also discuss the significant
challenges and risks that need to be tackled in order to guarantee the
long-term success of Cloud Computing. The aim of this paper is to
provide a better understanding of the anatomy of Cloud Computing and
to pave the way for further research in this area.
Abstract: Microarrays have become effective, broadly used tools in biological and medical research, addressing a wide range of problems including the classification of disease subtypes and tumors. Many statistical methods are available for analyzing and systematizing these complex data into meaningful information, and one of the main goals in analyzing gene expression data is the detection of samples or genes with similar expression patterns. In this paper, we present and compare the performance of several clustering methods under different data preprocessing strategies, including normalization and noise removal. We also evaluate each of these clustering methods with validation measures on both simulated data and real gene expression data. The results show that the clustering methods commonly used in microarray data analysis are affected by normalization and by the degree of noise in the datasets.
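A toy illustration of why the normalization step studied above matters for distance-based clustering: an unnormalized feature on a large scale swamps the Euclidean distance, so a small-scale feature is effectively ignored until the columns are standardized. The sample values below are invented for illustration, not microarray data:

```python
# Sketch: effect of per-feature z-score normalization on which sample
# is "nearest" under Euclidean distance.

def euclidean(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def zscore(column):
    mean = sum(column) / len(column)
    sd = (sum((v - mean) ** 2 for v in column) / len(column)) ** 0.5
    return [(v - mean) / sd for v in column]

def nearest_to_first(points):
    """Index of the point closest to points[0] (excluding itself)."""
    return min(range(1, len(points)), key=lambda i: euclidean(points[0], points[i]))

# Feature 1 (scale ~1000) swamps feature 2 (scale ~1) in raw distances.
samples = [(1000.0, 0.0), (1001.0, 1.0), (1002.0, 0.0), (2000.0, 1.0)]

raw_nearest = nearest_to_first(samples)       # decided by feature 1 alone
columns = [zscore(list(c)) for c in zip(*samples)]
normalized = list(zip(*columns))
norm_nearest = nearest_to_first(normalized)   # feature 2 now contributes
```

Before normalization the first sample's nearest neighbor is decided entirely by the large-scale feature; after z-scoring, the second feature changes the answer, which is the kind of preprocessing sensitivity the comparison above measures.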
Abstract: In this paper, a two-dimensional (2D) numerical
model for the simulation of tidal currents in the Persian Gulf is presented.
The model is based on the depth-averaged shallow water equations,
which assume a hydrostatic pressure distribution. The continuity
equation and two momentum equations, including the effects of bed
friction, the Coriolis force and wind stress, have been solved. To
integrate the 2D equations, the Alternating Direction Implicit (ADI)
technique has been used. The equations were discretized with the
finite volume method on a rectangular mesh. To validate the model, a
dam-break case study with an analytical solution was selected and the
comparison was made. The capability of the model to simulate tidal
currents in a real field is then demonstrated by modeling the current
behavior in the Persian Gulf. The tidal currents in the study area are
driven by the tidal fluctuations in the Strait of Hormuz; therefore,
the water surface oscillation data at Hengam Island in the Strait of
Hormuz are used as the model input. The model is checked against
measured water surface elevations at Assaluye port. The acceptable
agreement between the computed and measured results demonstrates the
model's ability to simulate marine hydrodynamics.
Abstract: As a part of evaluation system for R&D program, the
Korean government has applied feasibility analysis since 2008.
Various professionals have put forth great effort to keep up with the
high degree of freedom of R&D programs and to contribute to evolving
the feasibility analysis. We analyze diverse R&D programs from various
viewpoints, such as technology, policy, and economics, integrate the
separate analyses, and finally arrive at a definite result: whether a
program is feasible or unfeasible. This paper describes the concept
and method of the feasibility analysis as a decision-making tool. The
analysis unit and the content of each criterion, which are key
elements in a comprehensive decision-making structure, are examined.
Abstract: Data mining is the extraction of knowledge from the
large sets of data generated by various data processing activities.
Frequent pattern mining is a very important task in data mining. The
previous approaches to generating frequent itemsets generally adopt
candidate generation and pruning techniques to achieve the desired
objective. This paper shows how the different approaches achieve the
objective of frequent pattern mining, along with the complexities
involved in performing the job. The paper also examines a hardware
cache-coherence approach to improve the efficiency of the process.
Data mining is helpful in building support systems for Management,
Bioinformatics, Biotechnology, Medical Science, Statistics,
Mathematics, Banking, Networking and other computer-related
applications. This paper proposes the use of both the upward and
downward closure properties for the extraction of frequent itemsets,
which reduces the total number of scans required for the generation
of candidate sets.
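The candidate-generation-and-pruning style mentioned above can be sketched as an Apriori-like pass that exploits the downward closure property (every subset of a frequent itemset must itself be frequent) to prune candidates before counting. The transactions and minimum support below are illustrative; this is the generic technique, not the paper's specific algorithm:

```python
# Sketch: frequent itemset mining with candidate generation and
# downward-closure pruning (Apriori style).
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    transactions = [frozenset(t) for t in transactions]
    items = sorted({i for t in transactions for i in t})

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    result = {}
    current = [frozenset([i]) for i in items
               if support(frozenset([i])) >= min_support]
    k = 1
    while current:
        for s in current:
            result[s] = support(s)
        k += 1
        frequent_prev = set(current)
        # Join step: unions of frequent (k-1)-itemsets of size k.
        candidates = {a | b for a in current for b in current if len(a | b) == k}
        # Prune step: drop candidates with an infrequent (k-1)-subset.
        candidates = {c for c in candidates
                      if all(frozenset(s) in frequent_prev
                             for s in combinations(c, k - 1))}
        current = [c for c in candidates if support(c) >= min_support]
    return result
```

The prune step is where downward closure saves work: a candidate is discarded without a database scan whenever any of its subsets already failed the support threshold.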
Abstract: We propose a method for the discrimination and
classification of ovarian tissue as benign, malignant or normal using
independent component analysis and neural networks. The method was
tested on a proteomic pattern set from a database, using radial basis
function neural networks. The best performance was obtained with
probabilistic neural networks, resulting in a 99% success rate, with
98% specificity and 100% sensitivity.
Abstract: In this paper, we propose an effective relay
communication scheme for layered video transmission as an alternative
for making the most of limited resources in a lossy wireless
communication network. Relaying brings stable multimedia services to
end clients compared to multiple description coding (MDC). In
addition, retransmitting from the relay device to the end client only
parity data, produced by a channel coder over one or more video
layers, is paramount to robustness against loss. Using these methods
in resource-constrained environments, such as real-time user-created
content (UCC) with layered video transmission, can provide
high-quality services even in a poor communication environment, and
minimal services remain possible. The mathematical analysis shows
that the proposed method reduces the GOP loss rate compared to MDC
and a raptor code without relay: the proposed method's GOP loss rate
is approximately zero, while MDC and the raptor code without relay
have GOP loss rates of 36% and 70%, respectively, at a 10% frame
loss rate.
Abstract: In this era of globalization, the role of the State in all aspects of development is widely debated. Some scholars contend the 'demise' and diminishing role of the State whilst others claim that the State is still “de facto developmental". Clearly, it is vital to ascertain which of these two contentions is reflective of the role of the State as nations ascend their development trajectories. Based on the findings of this paper, the perception that the Malaysian State plays an active and committed role in distributing equitable educational opportunities and enhancing the employability of Malaysian persons with disabilities (PWDs) is actually a myth rather than reality. Thus, in order to fulfill the promise of Vision 2020 to transform Malaysia into a caring and socially-inclusive society, this paper calls for a more interventionist and committed role by the Malaysian State to translate the universal rights to education and employment opportunities for PWDs from mere policy rhetoric into inclusive realities.
Abstract: We study the problem of decision making with a Dempster-Shafer belief structure. We analyze the previous work developed by Yager on using the ordered weighted averaging (OWA) operator in the aggregation of the Dempster-Shafer decision process. We discuss the possibility of aggregating with an ascending order in the OWA operator for cases where the smallest value is the best result. We suggest the introduction of the ordered weighted geometric (OWG) operator in the Dempster-Shafer framework. In this case, we also discuss the possibility of aggregating with an ascending order, and we find that it is strictly necessary, as the OWG operator cannot aggregate negative numbers. Finally, we give an illustrative example where we can see the different results obtained by using the OWA, the Ascending OWA (AOWA), the OWG and the Ascending OWG (AOWG) operators.
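The four operators compared above differ only in whether the weights are applied to the arguments in descending order (OWA/OWG) or ascending order (AOWA/AOWG), and in whether the aggregation is additive or geometric. A small sketch, assuming a weight vector that sums to one (the values and weights are illustrative, not from the paper's example):

```python
# Sketch of the OWA, AOWA, OWG and AOWG aggregation operators.

def owa(values, weights):
    """Ordered weighted average: weights applied to descending-ordered values."""
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

def aowa(values, weights):
    """Ascending OWA: weights applied to ascending-ordered values."""
    return sum(w * v for w, v in zip(weights, sorted(values)))

def owg(values, weights):
    """Ordered weighted geometric: weighted product over descending order.
    Requires positive arguments, hence the need for the ascending variant
    rather than sign flips when small values are preferred."""
    result = 1.0
    for w, v in zip(weights, sorted(values, reverse=True)):
        result *= v ** w
    return result

def aowg(values, weights):
    """Ascending OWG: weighted product over ascending order."""
    result = 1.0
    for w, v in zip(weights, sorted(values)):
        result *= v ** w
    return result
```

With weights emphasizing the first (reordered) position, OWA and AOWA give different results for the same arguments, which is the distinction the paper's illustrative example exploits.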
Abstract: A simple approach is demonstrated for growing large-scale,
nearly vertically aligned ZnO nanowire arrays by a thermal oxidation
method. To reveal the effect of temperature on the growth and
physical properties of the ZnO nanowires, gold-coated zinc substrates
were annealed at 300 °C and 400 °C for 4 hours in air. X-ray
diffraction patterns of the annealed samples showed a set of
well-defined diffraction peaks, indexed to the wurtzite hexagonal
phase of ZnO. Scanning electron microscopy studies show the formation
of ZnO nanowires several microns in length with an average diameter
of less than 500 nm. It is found that the areal density of the wires
is relatively higher when the annealing is carried out at the higher
temperature, i.e. at 400 °C. From the field emission studies, the
values of the turn-on and threshold fields, required to draw emission
current densities of 10 μA/cm2 and 100 μA/cm2, are observed to be
1.2 V/μm and 1.7 V/μm for the samples annealed at 300 °C, and
2.9 V/μm and 3.7 V/μm for those annealed at 400 °C, respectively. The
field emission current stability, investigated over a duration of
more than 2 hours at a preset value of 1 μA, is found to be fairly
good in both cases. The simplicity of the synthesis route, coupled
with the promising field emission properties, offers a significant
advantage for the use of ZnO field emitters in high-current-density
applications.
Abstract: This paper considers inference under progressive Type-II censoring with a compound Rayleigh failure time distribution. The maximum likelihood (ML) and Bayes methods are used for estimating the unknown parameters as well as some lifetime parameters, namely the reliability and hazard functions. We obtain Bayes estimators using conjugate priors for the shape and scale parameters. When both parameters are unknown, closed-form expressions for the Bayes estimators cannot be obtained, so we use Lindley's approximation to compute the Bayes estimates. Another Bayes estimator is obtained based on a continuous-discrete joint prior for the unknown parameters. An example with real data is discussed to illustrate the proposed method. Finally, we compare these estimators with the maximum likelihood estimators using a Monte Carlo simulation study.
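A Monte Carlo study like the one described above needs simulated progressive Type-II censored samples. These can be generated with the standard uniform-transformation algorithm of Balakrishnan and Sandhu, sketched below. The ordinary Rayleigh quantile function is used here only as a stand-in for the paper's compound Rayleigh model, and the scale parameter and censoring scheme are illustrative assumptions:

```python
# Sketch: simulate a progressive Type-II censored sample via the
# Balakrishnan-Sandhu uniform-transformation algorithm, then map the
# uniform order statistics through an inverse CDF (quantile function).
import math
import random

def progressive_type2_sample(quantile, removals, seed=None):
    """Return m ordered failure times under censoring scheme `removals`,
    where removals[i] units are withdrawn at the (i+1)-th observed failure."""
    rng = random.Random(seed)
    m = len(removals)
    w = [rng.random() for _ in range(m)]
    # V_i = W_i ** (1 / gamma_i), gamma_i = i + R_m + R_{m-1} + ... + R_{m-i+1}
    v = []
    for i in range(1, m + 1):
        gamma = i + sum(removals[m - i:])
        v.append(w[i - 1] ** (1.0 / gamma))
    # U_i = 1 - V_m * V_{m-1} * ... * V_{m-i+1} are progressive Type-II
    # censored uniform order statistics (increasing in i).
    u = [1.0 - math.prod(v[m - i:]) for i in range(1, m + 1)]
    return [quantile(ui) for ui in u]

def rayleigh_quantile(u, sigma=1.0):
    """Inverse CDF of the ordinary Rayleigh distribution (stand-in model)."""
    return sigma * math.sqrt(-2.0 * math.log1p(-u))

times = progressive_type2_sample(rayleigh_quantile, removals=[2, 0, 1, 0, 2], seed=1)
```

Replacing `rayleigh_quantile` with the compound Rayleigh inverse CDF would produce samples from the paper's model; the censoring machinery is unchanged.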