Abstract: A new generation of products made from bamboo strips,
known as laminated bamboo, has gained importance. The objective
of this research was to examine the effect of three factors on the
mechanical properties of laminated bamboo. The factors of interest
in the experimental design were (A) four bamboo species, namely
Bambusa blumeana Schultes (Pai See Suk), Dendrocalamus asper
Backer (Pai Tong), Dendrocalamus hamiltonii Nees (Pai Hok) and
Dendrocalamus sericeus Munro (Pai Sang Mon); (B) two adhesives,
polyvinyl acetate emulsion (PVAc) fortified with urea-formaldehyde
(UF), and urea-formaldehyde (UF) alone, used to make
parallel-oriented bamboo strip laminates; and (C) glue weight per
strip area, 150 g/m² and 190 g/m². Experimental results showed that
Dendrocalamus asper Backer (Pai Tong) and Dendrocalamus
sericeus Munro (Pai Sang Mon) were best suited for manufacturing
because they gave the highest modulus of rupture (MOR) and
modulus of elasticity (MOE). A glue weight of 150 g/m² yielded
higher MOR and MOE than 190 g/m². In conclusion, laminated
bamboo manufacturers can benefit from this research by selecting
the right materials according to strength, cost and accessibility.
Abstract: In this study, the ability of Aspergillus niger and
Penicillium simplicissimum to extract heavy metals from a spent
refinery catalyst was investigated. As a first step, a spent
processing catalyst from one of the oil refineries in Iran was
physically and chemically characterized. Aspergillus niger and
Penicillium simplicissimum were used to mobilize Al/Co/Mo/Ni from
the hazardous spent catalyst. The fungi were adapted to the metal
mixture at concentrations of 100 to 800 mg/L, in increments of
100 mg/L. Bioleaching experiments were carried out in batch cultures.
investigate the production of organic acids in sucrose medium,
analyses of the culture medium by HPLC were performed at specific
time intervals after inoculation. The results obtained from
inductively coupled plasma optical emission spectrometry (ICP-OES)
showed that after the one-step bioleaching process using Aspergillus
niger, maximum removal efficiencies of 27%, 66%, 62% and 38% were
achieved for Al, Co, Mo and Ni, respectively. The highest removal
efficiencies using Penicillium simplicissimum were 32%, 67%, 65%
and 38% for Al, Co, Mo and Ni, respectively.
Abstract: This study applied the L16 orthogonal array of the
Taguchi method to determine an optimized polymeric nanocomposite
asphalt binder. Three control factors were defined: polypropylene
plastomer (PP), styrene-butadiene-styrene elastomer (SBS) and
nanoclay. Four concentration levels were introduced for the
prepared asphalt binder samples. All samples were prepared with
4.5% bitumen 60/70 content. Compressive strength tests were carried
out to identify the optimized sample via the QUALITEK-4 software.
SBS at 3%, PP at 5% and nanoclay at 1.5% concentration were
identified as the optimized nanocomposite asphalt binder.
Confirmation compressive strength and softening point tests showed
that modifying asphalt binders with this method improved their
compressive strength and softening points by up to 55%.
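Taguchi analyses of this kind typically rank factor levels by a signal-to-noise (S/N) ratio; for compressive strength, a larger-is-better ratio applies. The following Python sketch is illustrative only; the replicate values and level labels are hypothetical, not data from the study:

```python
import math

# Larger-is-better signal-to-noise ratio used in Taguchi analysis:
# S/N = -10 * log10( (1/n) * sum(1 / y_i^2) )
def sn_larger_is_better(values):
    n = len(values)
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in values) / n)

# Hypothetical compressive-strength replicates (MPa) for two SBS levels;
# these numbers are illustrative, not from the study.
level_a = [2.1, 2.3, 2.2]
level_b = [2.8, 2.7, 2.9]

sn_a = sn_larger_is_better(level_a)
sn_b = sn_larger_is_better(level_b)
# The level with the higher S/N ratio is preferred.
best = "B" if sn_b > sn_a else "A"
```

Averaging such S/N ratios over the L16 runs for each level of each factor is what identifies the preferred combination.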
Abstract: In this paper, we develop an explicit analytical drain
current model, comprising surface channel potential and threshold
voltage, to explain the advantages of the proposed Gate Stack
Double Diffusion (GSDD) MOSFET design over a conventional MOSFET
with the same geometric specifications. The design exploits the
incorporation of a high-k layer between the oxide layer and the
gate metal to improve the immunity of the proposed device against
self-heating effects. To demonstrate the efficiency of the proposed
structure, we simulate a power chopper circuit. Using the proposed
structure to design a power chopper circuit shows that the GSDD
MOSFET can improve circuit operation in terms of power dissipation
and self-heating immunity. The results obtained are in close
agreement with the 2D simulation results, confirming the validity
of the proposed model.
Abstract: This paper describes a research project on Year 3 primary school students in Malaysia and their use of a computer-based video game to enhance the learning of multiplication facts (times tables) in the Mathematics subject. This study investigates whether video games actually contribute a positive effect to children's learning or otherwise. In conducting this study, the researchers assume a neutral stance, as an unbiased outcome would give a reliable answer on the impact of video games in education, contributing both to the literature of technology-based education and to the pedagogical aspects of formal education. A subject (Mathematics) with a specific topic area (multiplication facts) was chosen, and the study adopts a causal-comparative design to investigate the impact of including a computer-based video game designed to teach multiplication facts to primary-level students. The sample of 100 students was divided into two groups: A, a conventional group, and B, a conventional group aided by video games. Group A was taught multiplication facts and skills conventionally. Group B underwent the same lessons with a supplementary activity: a computer-based video game on multiplication called Timez-Attack. Marks from the pre-test were compared to the post-test using comparisons of means, t-tests and ANOVA tests to investigate the impact of computer games as an added learning activity. The findings revealed that video games, as a supplementary activity to classroom learning, bring a significant and positive effect on students' retention and mastery of multiplication tables compared to students who rely only on formal classroom instruction.
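The pre-/post-test comparison described above rests on a two-sample t statistic. A minimal sketch, using Welch's t (one common variant) with hypothetical marks rather than the study's data:

```python
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variances
    return (mb - ma) / ((va / len(a) + vb / len(b)) ** 0.5)

# Hypothetical post-test marks (out of 100); not the study's data.
group_a = [55, 60, 58, 62, 57]  # conventional instruction only
group_b = [70, 68, 75, 72, 69]  # conventional instruction + game activity

t = welch_t(group_a, group_b)  # large positive t favors group B
```

Comparing the resulting t against the critical value for the chosen significance level is what supports (or rejects) the claim of a significant effect.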
Abstract: The aim of this study is to highlight the opportunities that human-computer interaction (HCI) offers in space design as a performance area. HCI is a multidisciplinary approach that can be identified in many different fields. The aesthetic reflections of HCI through virtual reality in space design are high-tech solutions in which new innovations combine computational facilities with artistic features. The method of this paper is to examine the subject in three main parts: the first part gives a general approach to and definition of interactivity on the basis of space design; the second part discusses the concept of multimedia interactive theatre through selected examples from around the world and their interactive design aspects; the third part analyses examples from Turkey in terms of stage design principles. The results suggest that the multimedia database is the virtual approach to theatre stage design, considering the interactive means provided by computational facilities according to aesthetic aspects. HCI is mostly identified in theatre stages as computational intelligence under the effect of interactivity.
Abstract: This paper characterizes the effects of artificial
short-term aging in the laboratory on the rheological properties of
virgin 80/100 penetration grade asphalt binder. After several years
in service, asphalt mixtures begin to deteriorate due to aging.
Aging is a complex physico-chemical phenomenon that influences
asphalt binder rheological properties, causing a deterioration in
asphalt mixture performance. To ascertain the effects of asphalt
binder aging, virgin, artificially aged and extracted asphalt
binders were tested via the Rolling Thin Film Oven (RTFO), Dynamic
Shear Rheometer (DSR) and Rotational Viscometer (RV). A comparative
study between laboratory and field aging conditions was also carried
out. The results showed that conditioning specimens for 85 minutes
inside the RTFO was insufficient to simulate the actual short-term
aging that took place in the field under Malaysian conditions.
Abstract: This article presents a short discussion on
optimum neighborhood size selection in a spherical self-organizing
feature map (SOFM). The majority of the literature on SOFMs has
addressed the selection of optimal learning parameters for
Cartesian-topology SOFMs. However, experience with the spherical
SOFM suggests that the learning aspects of the Cartesian-topology
SOFM do not translate directly. This article presents an approach
for estimating the neighborhood size of a spherical SOFM from the
data. It adopts the L-curve criterion, previously suggested for
choosing the regularization parameter in problems of linear
equations whose right-hand side is contaminated with noise.
Simulation results are presented on two artificial 4D data sets of
the coupled Hénon-Ikeda map.
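One common way to operationalize the L-curve criterion is to pick the "corner" of the curve in log-log space; a simple heuristic takes the point farthest from the straight line joining the curve's endpoints. A minimal sketch, with illustrative norms rather than values from the paper:

```python
import math

def lcurve_corner(residual_norms, solution_norms):
    """Pick the L-curve 'corner': the point (in log-log space) farthest
    from the straight line joining the curve's two endpoints."""
    pts = [(math.log10(r), math.log10(s))
           for r, s in zip(residual_norms, solution_norms)]
    (x0, y0), (x1, y1) = pts[0], pts[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = math.hypot(dx, dy)
    best_i, best_d = 0, -1.0
    for i, (x, y) in enumerate(pts):
        d = abs(dy * (x - x0) - dx * (y - y0)) / norm  # point-to-line distance
        if d > best_d:
            best_i, best_d = i, d
    return best_i

# Illustrative monotone curve with an obvious corner at index 2.
res = [1e-4, 1e-3, 1e-2, 1e-1, 1.0]
sol = [1e3, 1e2, 1.0, 0.9, 0.8]
corner = lcurve_corner(res, sol)
```

In the article's setting, the index chosen this way would correspond to the candidate neighborhood size rather than a regularization parameter.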
Abstract: The tray/multi-tray distillation process is a topic that
has been investigated in great detail over the last decade by many
teams, such as Jubran et al. [1], Adhikari et al. [2], Mowla et al. [3],
Shatat et al. [4] and Fath [5], to name a few. A significant amount of
work and effort was spent on modeling and/or simulation of
specific distillation hardware designs. In this work, we have focused
our efforts on investigating and gathering experimental data on
several engineering and design variables to quantify their influence
on the yield of the multi-tray distillation process. Our goals are to
generate experimental performance data to bridge some existing gaps
in the design, engineering, optimization and theoretical modeling
aspects of the multi-tray distillation process.
Abstract: The Boundary Representation of a 3D manifold contains
FACEs (connected subsets of a parametric surface S : R² → R³).
In many science and engineering applications it is cumbersome
and algebraically difficult to deal with the polynomial set and
constraints (LOOPs) representing the FACE. Because of this reason, a
Piecewise Linear (PL) approximation of the FACE is needed, which is
usually represented in terms of triangles (i.e. 2-simplices). Solving the
problem of FACE triangulation requires producing quality triangles
which are: (i) independent of the arguments of S, (ii) sensitive to the
local curvatures, (iii) compliant with the boundaries of the FACE,
and (iv) topologically compatible with the triangles of the neighboring
FACEs. In the existing literature there are no guarantees for the point
(iii). This article contributes to the topic of triangulations conforming
to the boundaries of the FACE by applying the concept of the
parameter-independent Gabriel complex, which improves the correctness of the
triangulation regarding aspects (iii) and (iv). In addition, the article
applies the geometric concept of tangent ball to a surface at a point to
address points (i) and (ii). Additional research is needed in algorithms
that (i) take advantage of the concepts presented in the heuristic
algorithm proposed and (ii) can be proved correct.
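The Gabriel condition mentioned above can be stated concretely: an edge belongs to the Gabriel complex when no other sample point lies strictly inside the ball having that edge as its diameter. A minimal sketch with hypothetical 3D points:

```python
def is_gabriel_edge(p, q, points, eps=1e-12):
    """An edge (p, q) satisfies the Gabriel condition when no other point
    lies strictly inside the ball having the segment pq as its diameter."""
    center = [(a + b) / 2.0 for a, b in zip(p, q)]       # ball center
    r2 = sum((a - b) ** 2 for a, b in zip(p, q)) / 4.0   # squared radius
    for s in points:
        if s == p or s == q:
            continue
        d2 = sum((a - c) ** 2 for a, c in zip(s, center))
        if d2 < r2 - eps:
            return False
    return True

# Hypothetical sample points on a surface patch.
pts = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (1.0, 0.1, 0.0), (5.0, 5.0, 0.0)]
# pts[2] lies inside the diameter ball of pts[0]-pts[1], so that edge
# fails the Gabriel test, while the edge pts[1]-pts[2] passes it.
edge_01_ok = is_gabriel_edge(pts[0], pts[1], pts)  # False
edge_12_ok = is_gabriel_edge(pts[1], pts[2], pts)  # True
```

The parameter-independent variant applies the same test in R³ rather than in the parameter domain, which is what keeps the triangulation consistent across neighboring FACEs.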
Abstract: The surface electromyogram (SEMG) is a bio-signal
generated by muscle, and many research results use forearm EMG to
detect hand motions. In this paper, we describe our robot hand
system, which can control grasping power through SEMG. In our
system, we assume that muscle power is proportional to the amplitude
of the SEMG. The muscle power is estimated, and the grip power of a
robot hand is controlled using the estimated muscle power. In
addition, to achieve more precise control, a closed-loop feedback
system can be built that returns the pressure at the edge of the
hand from the object back to the subject. The objectives of this
study are to develop a method that enables reliable estimation of
hand grip force from SEMG patterns, and to apply this method to the
man-machine interface.
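The proportionality assumption above suggests a simple amplitude-to-force mapping: estimate the SEMG amplitude over a window (for example by RMS) and scale it linearly into a grip command. A minimal sketch; the gain, rest level and force cap are hypothetical calibration constants, not values from the system described:

```python
import math

def rms(window):
    """Root-mean-square amplitude of one SEMG window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def grip_command(window, gain=50.0, rest_level=0.02, max_force=100.0):
    """Map SEMG amplitude linearly to a grip-force command, under the
    assumption that muscle power is proportional to SEMG amplitude.
    gain, rest_level and max_force are illustrative calibration
    constants, not values from the described system."""
    amplitude = max(rms(window) - rest_level, 0.0)  # subtract resting noise
    return min(gain * amplitude, max_force)         # clamp to a safe maximum

# Hypothetical SEMG windows (arbitrary units).
weak = [0.03, -0.02, 0.04, -0.03]
strong = [0.6, -0.5, 0.7, -0.55]
```

A stronger contraction yields a larger RMS amplitude and hence a larger grip command, which is the behavior the closed-loop pressure feedback would then refine.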
Abstract: IEEE 802.11e, an enhanced version of the 802.11 WLAN standards, incorporates Quality of Service (QoS), which makes it a better choice for multimedia and real-time applications. In this paper we study various aspects of the 802.11e standard. Further, the analysis results for this standard are compared with the legacy 802.11 standard. Simulation results show that IEEE 802.11e outperforms legacy IEEE 802.11 in terms of quality of service due to its flow-differentiated channel allocation and better queue management architecture. We also propose a method to improve the unfair allocation of bandwidth between downlink and uplink channels by varying the medium access priority level.
Abstract: We have defined two suites of metrics, which cover
static and dynamic aspects of component assembly. The static
metrics measure complexity and criticality of component assembly,
wherein complexity is measured using Component Packing Density
and Component Interaction Density metrics. Further, four criticality
conditions, namely Link, Bridge, Inheritance and Size criticalities,
have been identified and quantified. The complexity and criticality
metrics are combined to form a Triangular Metric, which can be used
to classify the type and nature of applications. Dynamic metrics are
collected during the runtime of a complete application. Dynamic
metrics are useful for identifying super-components and for
evaluating the degree of utilisation of various components. In this
paper both static and dynamic metrics are evaluated using Weyuker's
set of properties. The result shows that the metrics provide a valid
means to measure issues in component assembly. We relate our metrics
suite to McCall's Quality Model and illustrate its impact on product
quality and on the management of component-based product
development.
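Under one common reading of these metrics, Component Packing Density is the average number of constituents per component and Component Interaction Density is the fraction of available interactions actually used. A minimal sketch with illustrative figures, not data from the paper:

```python
def component_packing_density(constituents, components):
    """Packing density: average number of constituent members
    (e.g. classes, interfaces) per component in the assembly."""
    return constituents / components

def component_interaction_density(actual, available):
    """Interaction density: fraction of the available component
    interactions that are actually used."""
    return actual / available

# Hypothetical assembly: 120 classes packed into 10 components,
# with 18 of 45 possible interactions used (illustrative figures).
cpd = component_packing_density(120, 10)
cid = component_interaction_density(18, 45)
```

Plotting (complexity, criticality) pairs of this kind is what the Triangular Metric uses to classify the type and nature of an application.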
Abstract: Faced with social and health system capacity
constraints and rising and changing demand for welfare services,
governments and welfare providers are increasingly relying on
innovation to help support and enhance services. However, the
evidence reported by several studies indicates that the realization of
that potential is not an easy task. Innovations can be deemed
inherently complex to implement and operate, because many of them
involve a combination of technological and organizational renewal
within an environment featuring a diversity of stakeholders. Many
public welfare service innovations are markedly systemic in their
nature, which means that they emerge from, and must address, the
complex interplay between political, administrative, technological,
institutional and legal issues. This paper suggests that stakeholders
dealing with systemic innovation in welfare services must deal with
ambiguous and incomplete information in circumstances of
uncertainty. Employing a literature review and a case
study, this paper identifies, categorizes and discusses different
aspects of the uncertainty of systemic innovation in public welfare
services, and argues that uncertainty can be classified into eight
categories: technological uncertainty, market uncertainty,
regulatory/institutional uncertainty, social/political uncertainty,
acceptance/legitimacy uncertainty, managerial uncertainty, timing
uncertainty and consequence uncertainty.
Abstract: One major source of performance decline in speaker
recognition systems is channel mismatch between training and
testing. This paper focuses on improving the channel robustness of a
speaker recognition system in two respects: channel compensation
techniques and channel-robust features. The system is a
text-independent speaker identification system based on two-stage
recognition. For channel compensation, this paper applies the
maximum a posteriori (MAP) channel compensation technique,
previously used in speech recognition, to speaker recognition. For
channel-robust features, this paper introduces pitch-dependent
features and a pitch-dependent speaker model for the second-stage
recognition. Based on the first-stage recognition of the test speech
using a Gaussian Mixture Model (GMM), the system uses the GMM scores
to decide whether the utterance needs to be recognized again. If it
needs to, the system selects a few speakers from all of the speakers
who participate in the first stage recognition for the second stage
recognition. For each selected speaker, the system obtains 3
pitch-dependent results from his pitch-dependent speaker model, and
then uses ANN (Artificial Neural Network) to unite the 3
pitch-dependent results and 1 GMM score for getting a fused result.
The system makes the second stage recognition based on these fused
results. The experiments show that the correct rate of two-stage
recognition system based on MAP channel compensation technique
and pitch-dependent features is 41.7% better than the baseline system
for closed-set test.
Abstract: Transmission control protocol (TCP) Vegas detects
network congestion in the early stage and successfully prevents
periodic packet loss that usually occurs in TCP Reno. It has been
demonstrated that TCP Vegas outperforms TCP Reno in many
aspects. However, TCP Vegas suffers from several problems that
affect its congestion avoidance mechanism. One of the most important
weaknesses of TCP Vegas is that alpha and beta depend on a good
expected-throughput estimate, which in turn depends on a good
minimum-RTT estimate. To make the system more robust, alpha and beta
must be made responsive to network conditions
(they are currently chosen statically). This paper proposes a modified
Vegas algorithm, which can be adjusted to present good performance
compared to other transmission control protocols (TCPs). In order to
do this, we use PSO algorithm to tune alpha and beta. The simulation
results validate the advantages of the proposed algorithm in terms
of performance.
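The PSO tuning step can be sketched as a standard particle swarm minimizing a performance objective over (alpha, beta). The objective below is a smooth surrogate with a known optimum, standing in for the paper's network simulation:

```python
import random

random.seed(0)  # deterministic run for the illustration

def objective(alpha, beta):
    """Stand-in objective: the paper evaluates alpha/beta against
    simulated network performance; here a surrogate minimized at (1, 3)."""
    return (alpha - 1.0) ** 2 + (beta - 3.0) ** 2

def pso(n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Standard global-best PSO over the 2D (alpha, beta) space."""
    pos = [[random.uniform(0, 6), random.uniform(0, 6)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [objective(*p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(2):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = objective(*pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

(alpha, beta), best_f = pso()
```

In the actual method, each objective evaluation would run a network simulation of the modified Vegas algorithm rather than evaluate a closed-form function.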
Abstract: Visual perception occupies a central position
in shaping the physical structure of a city. This paper discusses
the visual characteristics of utopian cities and their impact on the
shaping of real urban structures. Utopian examples of cities are not
discussed in terms of social and sociological conditions; rather,
the emphasis is on urban utopias and ideal cities that have had an
actual or potential impact on the shape of the physical structure of
Nikšić, whose structure reflects the Renaissance-Baroque period with
a touch of classicism. The paper's emphasis is on the physical
dimension, without excluding the importance of social equilibrium,
studies of which date back to Aristotle, Plato, Thomas More, Robert
Owen, Tommaso Campanella
and others. The emphasis is on urban utopias and their impact on the
development of sustainable physical structure of a real city in the
context of visual perception. In the case of Nikšić, this paper
identifies the common features of a real city and a utopian city, as
well as criteria for sustainable urban development in the context of
visual achievement.
Abstract: Of all the visual arts, including painting, sculpture,
graphics, photography, architecture and others, architecture is by
far the most complex, because the art category is only one of its
determinants. Architecture, to some extent, includes other arts
which can significantly influence the shaping of an urban space (artistic
interventions). These arts largely shape the visual culture in
combination with other categories: film, TV, Internet, information
technologies that are "changing the world" etc. In the area of
architecture and urbanism, visual culture is achieved through the
aspects of visual spatial effects. In this context, a complex visual
deliberation about designing urban areas in order to contribute to the
urban visual culture, and with it restore the cultural identity of the
city, is becoming almost the primary concept of contemporary urban
and architectural practice. The research in this paper relates to
the city of Nikšić and its place in visual urban culture. We examine
the city's existing visual effects and determine the directions of
transformability of its physical structure in order to achieve the
visual realization of an urban area and the renewal of the cultural
identity of a modern city.
Abstract: UML is a collection of notations for capturing a software system specification. These notations have a specific syntax defined by the Object Management Group (OMG), but many of their constructs present only informal semantics. They are primarily graphical, with textual annotation. The inadequacies of standard UML as a vehicle for the complete specification and implementation of real-time embedded systems have led to a variety of competing and complementary proposals. The Real-Time UML profile (UML-RT), developed and standardized by the OMG, defines a unified framework to express the time, scheduling and performance aspects of a system. We present in this paper a framework approach aimed at deriving a complete specification of a real-time system. To this end, we combine two methods: a semiformal one, UML-RT, which allows the visual modeling of a real-time system, and a formal one, CSP+T, a design language that includes the specification of real-time requirements. To show the applicability of the approach, a correct design of a real-time system with hard real-time constraints is obtained by applying a set of mapping rules.
Abstract: The adoption of building information modeling (BIM)
is increasing in the construction industry. However, quantity
surveyors are slower to adopt it than other professions due to a
lack of awareness of BIM's potential in their profession. It is
still unclear how BIM applications can enhance quantity surveyors'
work performance and project performance. The aim of this research
is to identify the capabilities of BIM in quantity surveying practices
and examine the relationship between BIM capabilities and project
performance. Questionnaire survey and interviews were adopted for
data collection. The literature review identified eleven BIM
capabilities in quantity surveying practice. The questionnaire results
showed that there are several BIM capabilities significantly
correlated with project performance in time, cost and quality aspects
and the results were validated through the interviews. These
findings show that BIM has the capability to enhance quantity
surveyors' performance and subsequently improve project performance.