Abstract: Chloride-induced corrosion of steel reinforcement is
the main cause of deterioration of reinforced concrete marine
structures. This paper investigates the relative performance of
alternative repair options with respect to the deterioration of
reinforced concrete bridge elements in marine environments. Focus is
placed on the initiation phase of reinforcement corrosion. A
laboratory study is described which involved exposing concrete
samples to accelerated chloride-ion ingress. The study examined the
relative efficiencies of two repair methods, namely Ordinary Portland
Cement (OPC) concrete and a concrete which utilised Ground
Granulated Blastfurnace Slag (GGBS) as a partial cement
replacement. The mix designs and materials utilised were identical to
those implemented in the repair of a marine bridge on the South East
coast of Ireland in 2007. The results of this testing regime inform
the input variables employed in probabilistic modelling of
deterioration for a subsequent reliability-based analysis comparing the
relative performance of the studied repair options.
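The initiation phase studied here is commonly modelled with the error-function solution of Fick's second law, C(x,t) = Cs·erfc(x/(2√(Dt))). A minimal sketch of estimating the corrosion-initiation time from that profile follows; the cover depth, surface chloride level, critical threshold and diffusion coefficient below are illustrative assumptions, not values from the paper.

```python
import math

def chloride_content(x_m, t_s, c_s, d_m2_s):
    """Chloride concentration (% binder) at depth x after time t,
    from the erfc solution of Fick's second law of diffusion."""
    return c_s * math.erfc(x_m / (2.0 * math.sqrt(d_m2_s * t_s)))

def initiation_time_years(cover_m, c_s, c_crit, d_m2_s):
    """Corrosion-initiation time: when C(cover, t) first reaches c_crit.
    Found by inverting erfc with bisection (math has no erfcinv)."""
    target = c_crit / c_s
    lo, hi = 1e-6, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if math.erfc(mid) > target:   # erfc is decreasing: u too small
            lo = mid
        else:
            hi = mid
    u = 0.5 * (lo + hi)               # u = cover / (2*sqrt(D*t))
    t_s = (cover_m / (2.0 * u)) ** 2 / d_m2_s
    return t_s / (365.25 * 24 * 3600)

# Assumed illustrative values: 50 mm cover, surface chloride 0.5% binder,
# critical threshold 0.05% binder, D = 1e-12 m^2/s.
t_init = initiation_time_years(0.05, 0.5, 0.05, 1e-12)
```

In a probabilistic analysis of the kind described above, the inputs (cover, D, surface chloride) would be treated as random variables rather than fixed numbers.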
Abstract: The ideal sinc filter, which ignores the noise statistics, is often
applied to generate an arbitrary sample of a bandlimited signal
from uniformly sampled data. In this article, an optimal interpolator is proposed; it attains the minimum mean square error (MMSE)
at its output in the presence of noise. The resulting interpolator is
thus a Wiener filter, and both the optimal infinite impulse response
(IIR) and finite impulse response (FIR) filters are presented. The
mean square errors (MSEs) of interpolators with impulse responses of
different lengths are obtained by computer simulation; the results show that
the MSEs of the proposed interpolators of reasonable length are
improved by about 0.4 dB under a flat power spectrum in a noisy environment with a signal-to-noise ratio (SNR) of 10 dB. As expected,
the results also demonstrate the improvement in MSE of the optimal interpolator over the ideal
sinc filter for various fractional delays at a fixed impulse-response length.
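For a flat signal spectrum in white noise, the sample autocorrelation matrix reduces to (σs²+σn²)I, so the MMSE FIR interpolator is simply the truncated sinc taps shrunk by σs²/(σs²+σn²). A minimal sketch comparing the analytic MSE of the two filters follows; the filter length, fractional delay and SNR are assumed for illustration, not taken from the paper.

```python
import math

def sinc(x):
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def interp_mse(taps, delay, var_s, var_n):
    """Analytic MSE of an FIR interpolator estimating x(delay) from noisy
    samples of a bandlimited signal with flat spectrum: the sample
    autocorrelation is var_s*delta, so R = (var_s+var_n)*I, and the
    cross-correlation is p[k] = var_s*sinc(k - delay)."""
    n0 = -(len(taps) // 2)
    mse = var_s                       # MSE = var_s - 2 w.p + w'Rw
    for k, w in enumerate(taps):
        s = sinc(n0 + k - delay)
        mse += -2.0 * var_s * w * s + (var_s + var_n) * w * w
    return mse

L, d = 21, 0.4                        # length and fractional delay (assumed)
var_s, var_n = 1.0, 0.1               # SNR = 10 dB
n0 = -(L // 2)
sinc_taps = [sinc(n0 + k - d) for k in range(L)]
alpha = var_s / (var_s + var_n)       # Wiener shrinkage for flat spectra
wiener_taps = [alpha * h for h in sinc_taps]

mse_sinc = interp_mse(sinc_taps, d, var_s, var_n)
mse_wiener = interp_mse(wiener_taps, d, var_s, var_n)
gain_db = 10.0 * math.log10(mse_sinc / mse_wiener)
```

With these assumed settings the shrunk (Wiener) taps give an MSE improvement of a few tenths of a dB over the plain sinc taps, consistent in magnitude with the roughly 0.4 dB figure quoted above.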
Abstract: Lean production (or lean management) has
gained popularity in several waves. The last three decades have been
filled with numerous attempts to apply these concepts in companies.
However, this has been only partially successful. The roots of lean
production can be traced back to Toyota's just-in-time production.
This concept, which according to the MIT research of Womack, Jones
and Roos was employed by Japanese car manufacturers, became
popular under the international names “lean production” and
“lean manufacturing” and was termed “Schlanke Produktion” in
Germany. This contribution reviews lean production in Germany over
the last thirty years: its development, trial and error, and
implementation.
Abstract: In the recent past, there has been an increasing interest
in applying evolutionary methods to Knowledge Discovery in
Databases (KDD) and a number of successful applications of Genetic
Algorithms (GA) and Genetic Programming (GP) to KDD have been
demonstrated. The predominant representation of the
discovered knowledge is the standard Production Rule (PR) in the
form If P Then D. PRs, however, are unable to handle
exceptions and do not exhibit variable precision. The Censored
Production Rules (CPRs), an extension of PRs proposed by
Michalski and Winston, exhibit variable precision and support an
efficient mechanism for handling exceptions. A CPR is an
augmented production rule of the form:
If P Then D Unless C, where C (the Censor) is an exception to the rule.
Such rules are employed in situations in which the conditional
statement 'If P Then D' holds frequently and the assertion C holds
rarely. A rule of this type lets us ignore the exception condition
when the resources needed to establish its presence are tight, or
when no information is available as to whether it holds. Thus,
the 'If P Then D' part of the CPR expresses important information,
while the Unless C part acts only as a switch that changes the
polarity of D to ~D.
This paper presents a classification algorithm based on an
evolutionary approach that discovers comprehensible rules with
exceptions in the form of CPRs.
The proposed approach uses a flexible chromosome encoding, in which
each chromosome corresponds to a CPR. Appropriate genetic
operators are suggested, and a fitness function is proposed that
incorporates the basic constraints on CPRs. Experimental results are
presented to demonstrate the performance of the proposed algorithm.
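The semantics of a CPR can be made concrete with a small sketch; the bird/penguin rule used here is a standard textbook illustration, not an example from the paper.

```python
def cpr_decide(p, c, c_known=True):
    """Evaluate a Censored Production Rule 'If P Then D Unless C'.
    Returns the truth value assigned to D, or None if the rule does
    not fire. When resources are tight (c_known=False) the censor is
    ignored, trading precision for speed."""
    if not p:
        return None          # premise P fails: rule does not fire
    if c_known and c:
        return False         # censor holds: polarity of D flips to ~D
    return True              # default conclusion D

# Illustrative rule: 'If bird Then flies Unless penguin'
verdict = cpr_decide(p=True, c=True)            # penguin: does not fly
fast_verdict = cpr_decide(p=True, c=True, c_known=False)  # censor skipped
```

The `c_known` flag captures the variable-precision behaviour described above: with the censor checked the rule is precise, and with it skipped the rule still gives the frequent default answer.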
Abstract: This paper discusses a new model of an Islamic code of
ethics for directors. Several corporate scandals and corporate
collapses, both local (e.g., Transmile and Megan Media) and overseas
(e.g., Parmalat and Enron), show that current corporate
governance and regulatory reform are unable to prevent these events
from recurring. Arguably, the code of ethics for directors is under-
researched, and current codes of ethics concentrate only on governing
the work of the employees of the organization as a whole, without
paying direct attention to the directors, the group of
people responsible for the performance of the company. This study
used semi-structured interviews with well-known Islamic
scholars, such as the Mufti, to develop the model. The expected
outcome of the research is a comprehensive model of a code of
ethics based on Islamic principles that companies can apply and use
to construct a code of ethics for their directors.
Abstract: Human amniotic membrane (HAM) is a useful
biological material for the reconstruction of a damaged ocular surface.
The processing and preservation of HAM are critical to protect
patients undergoing amniotic membrane transplantation (AMT) from
cross-infection. For HAM preparation, a human placenta is obtained
after an elective cesarean delivery. Before collection, the donor is
screened for seronegativity for HCV, HBsAg, HIV and syphilis. After
collection, the placenta is washed in balanced salt solution (BSS) in
a sterile environment. The amniotic membrane is then separated from
the placenta and the chorion while keeping the preparation in BSS.
Scraping of the HAM is then carried out manually until all the debris
is removed and a clear, transparent membrane is acquired. Nitrocellulose
membrane filters are then placed on the stromal side of the HAM and cut
around the edges, with a little membrane folded toward the other side
to make it easy to separate during surgery. The HAM is finally stored
in a 1:1 solution of glycerine and Dulbecco's Modified Eagle Medium
(DMEM) containing antibiotics. The capped Borosil vials
containing the HAM are kept at -80°C until use. At the time of surgery,
a vial is thawed to room temperature and opened under sterile
operating theatre conditions.
Abstract: Many factors affect the success of Machine Learning
(ML) on a given task. The representation and quality of the instance
data is first and foremost. If there is much irrelevant and redundant
information present or noisy and unreliable data, then knowledge
discovery during the training phase is more difficult. It is well known
that data preparation and filtering steps take a considerable amount
of processing time in ML problems. Data pre-processing includes data
cleaning, normalization, transformation, feature extraction and
selection, etc. The product of data pre-processing is the final
training set. It would be convenient if a single sequence of
data pre-processing algorithms gave the best performance on every
data set, but this does not happen. We therefore present the
best-known algorithms for each step of data pre-processing, so that
the best performance can be achieved for a given data set.
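Two of the steps listed above, cleaning and normalization, can be illustrated with a toy sketch; the pipeline and data below are illustrative assumptions, not a recommendation of one fixed sequence.

```python
def clean(rows):
    """Data cleaning: drop instances that contain missing values."""
    return [r for r in rows if all(v is not None for v in r)]

def min_max_normalize(rows):
    """Normalization: rescale each feature (column) to [0, 1]."""
    cols = list(zip(*rows))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    return [[(v - l) / (h - l) if h > l else 0.0
             for v, l, h in zip(r, lo, hi)] for r in rows]

# A tiny assumed data set: the second instance has a missing value.
raw = [[2.0, 10.0], [4.0, None], [6.0, 30.0]]
training_set = min_max_normalize(clean(raw))
```

In practice the order of these steps matters (e.g. normalizing before or after feature selection), which is exactly why no single fixed sequence is best for every data set.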
Abstract: This article presents the modeling, simulation, and
kinematic and workspace analysis of a spatial cable-suspended robot
as an Incompletely Restrained Positioning Mechanism (IRPM). These
robots have six cables, equal to the number of degrees of freedom.
After modeling, the kinds of workspace are defined, and a statically
reachable combined workspace is obtained for different geometric
structures of the fixed and moving platforms. This workspace is
defined as the set of positions of the reference point of the moving
platform (its center of mass) for which, under external forces such as
weight and neglecting inertial effects, the moving platform is in
static equilibrium, subject to the conditions that no cable exceeds
its maximum length and that all cables remain in tension (non-negative
tension forces). The effects on the workspace of various parameters,
such as the size of the moving platform, the size of the fixed
platform, the geometric configuration of the robot, and the magnitude
of the forces and moments applied to the moving platform, are then
investigated for different geometric configurations. The results
obtained should be useful for employing these robots under different
applied-wrench conditions so as to increase the workspace volume.
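The static-equilibrium membership test described above can be sketched in a simplified planar form with two cables and a point-mass platform; this is a hedged toy version of the six-cable spatial condition, and the anchor positions, weight and cable-length limit below are assumed for illustration.

```python
import math

def tensions_2d(p, a1, a2, weight):
    """Static equilibrium of a planar point mass hung from two cables:
    t1*u1 + t2*u2 = (0, weight), where u_i is the unit vector from the
    platform point p toward anchor a_i. Solved by Cramer's rule."""
    def unit(a):
        dx, dy = a[0] - p[0], a[1] - p[1]
        n = math.hypot(dx, dy)
        return dx / n, dy / n
    u1, u2 = unit(a1), unit(a2)
    det = u1[0] * u2[1] - u1[1] * u2[0]
    t1 = (0.0 * u2[1] - weight * u2[0]) / det
    t2 = (u1[0] * weight - u1[1] * 0.0) / det
    return t1, t2

def in_workspace(p, a1, a2, weight, max_len):
    """p belongs to the statically reachable workspace iff both cables
    carry non-negative tension and neither exceeds max_len."""
    t1, t2 = tensions_2d(p, a1, a2, weight)
    lengths = (math.dist(p, a1), math.dist(p, a2))
    return t1 >= 0 and t2 >= 0 and all(L <= max_len for L in lengths)
```

Sweeping `in_workspace` over a grid of candidate points is the planar analogue of the combined-workspace computation described in the abstract; in the spatial six-cable case the 2x2 solve becomes a 6x6 wrench balance.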
Abstract: Nowadays, we face network threats that cause enormous
damage to the Internet community day by day. In this situation, more
and more people try to protect their networks using traditional
mechanisms such as firewalls, Intrusion Detection Systems, etc.
Among these, the honeypot is a versatile tool for the security
practitioner: honeypots are tools that are meant to be attacked or
interacted with in order to gain more information about attackers,
their motives and their tools. In this paper, we describe the
usefulness of low-interaction and high-interaction honeypots and
compare them. We then propose a hybrid honeypot architecture that
combines low- and high-interaction honeypots to mitigate their
respective drawbacks. In this architecture, the low-interaction
honeypot is used as a traffic filter. Activities like port scanning
can be effectively detected by the low-interaction honeypot and
stopped there. Traffic that cannot be handled by the low-interaction
honeypot is handed over to the high-interaction honeypot. In this
case, the low-interaction honeypot acts as a proxy, whereas the
high-interaction honeypot offers the optimal level of realism. To
protect the high-interaction honeypot from infection, a containment
environment (VMware) is used.
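The filtering role of the low-interaction front end can be sketched as follows; the port list, scan threshold and event format are illustrative assumptions, not part of the proposed system.

```python
KNOWN_SCAN_PORTS = {21, 22, 23, 25, 80, 135, 139, 445, 3389}
SCAN_THRESHOLD = 5   # distinct ports per source before we call it a scan

def route_traffic(events):
    """Low-interaction front end of the hybrid architecture: sources
    probing many distinct ports are classified as scanners and stopped
    locally; traffic the low-interaction honeypot cannot emulate is
    proxied to the high-interaction honeypot; everything else is
    handled locally. events: (src_ip, dst_port, needs_emulation)."""
    ports_seen = {}
    decisions = []
    for src, port, needs_emulation in events:
        ports_seen.setdefault(src, set()).add(port)
        if len(ports_seen[src]) >= SCAN_THRESHOLD:
            decisions.append((src, port, "drop: port scan detected"))
        elif needs_emulation:
            decisions.append((src, port, "proxy to high-interaction"))
        else:
            decisions.append((src, port, "handle in low-interaction"))
    return decisions
```

The point of the split is visible in the three branches: cheap scan noise never reaches the expensive, containment-protected high-interaction system.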
Abstract: This paper presents a comparative analysis of a new
unsupervised PCA-based technique for steel-plate texture segmentation
aimed at defect detection. The proposed scheme, called Variance-Based
Component Analysis (VBCA), employs PCA for feature extraction,
applies a feature-reduction algorithm based on the variance of the
eigenpictures, and classifies pixels as defective or normal. Whereas
classic PCA uses a clusterer such as k-means for pixel clustering,
VBCA employs thresholding and some post-processing operations to
label pixels as defective or normal. The experimental results show
that the proposed VBCA algorithm is 12.46% more accurate and
78.85% faster than classic PCA.
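A hedged sketch of the variance-based idea follows: PCA feature extraction, variance-based reduction of the eigenpictures, and a threshold on reconstruction error instead of a clusterer. The specific thresholding rule and parameters here are our illustrative choices, not the exact VBCA procedure.

```python
import numpy as np

def vbca_segment(patches, var_keep=0.7, thresh=None):
    """PCA on image patches; keep the leading eigenpictures covering
    var_keep of total variance; label a patch defective when its
    reconstruction error exceeds a variance-based threshold
    (mean + 2*std of the errors, an assumed rule)."""
    X = patches - patches.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]              # eigenpictures by variance
    vals, vecs = vals[order], vecs[:, order]
    k = int(np.searchsorted(np.cumsum(vals) / vals.sum(), var_keep)) + 1
    basis = vecs[:, :k]
    recon = X @ basis @ basis.T                 # project and reconstruct
    err = np.linalg.norm(X - recon, axis=1)
    if thresh is None:
        thresh = err.mean() + 2.0 * err.std()
    return err > thresh                         # True = defective
```

On synthetic data with one anomalous patch, the threshold flags exactly the patch that the retained eigenpictures cannot reconstruct, which is the core of the defective/normal labelling described above.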
Abstract: Both the prognostic and diagnostic modes of a 3D baroclinic
hydrodynamic and sediment transport model based on the Princeton
Ocean Model (POM) were run to separate the prognostic and diagnostic
effects of different hydrodynamic factors on the transport of
suspended sediment discharged from rivers into the Gulf of Thailand
(GoT). The suspended sediment distribution in the GoT was numerically
simulated in both transport modes. Regarding the suspended sediment
discharged from the rivers around the GoT, it can be concluded that
most of the sediment in estuaries and coastal areas is deposited
outside the GoT under wind-driven current conditions, and only a very
small amount of the sediment is transported far away. Under wind
forcing, sediments from the lower GoT to the upper GoT are mainly
transported south-northwestward and also move continuously
north-southwestward. An obvious 3D characteristic of suspended
sediment transport is produced under the wind-driven residual
circulation condition. In this study, the transport patterns at the
third layer are generally consistent with the typhoon-induced strong
currents in two case studies of Typhoon Linda (1997). The case studies
ran the prognostic and diagnostic modes over the short period from
00 UTC 28 October to 12 UTC 6 November 1997, with the corresponding
current conditions, for pre-operational use of the suspended sediment
transport model in estuaries and coastal areas.
Abstract: Let Gα,β(γ,δ) denote the class of functions
f(z) with f(0) = f′(0) − 1 = 0 which satisfy
Re{e^(iδ)[αf′(z) + βzf″(z)]} > γ
in the open unit disk D = {z ∈ ℂ : |z| < 1}, for some α ∈ ℂ (α ≠ 0),
β ∈ ℂ and γ ∈ ℝ with 0 ≤ γ < 1. In this paper, we determine some
extremal properties, including a distortion theorem and the argument
of f′(z).
Abstract: Traditional wind tunnel models are meticulously machined from metal in a process that can take several months. While very precise, this manufacturing process is too slow to assess a new design's feasibility quickly. Rapid prototyping technology makes the concurrent study of air vehicle concepts via computer simulation and in the wind tunnel possible. This paper describes the effect of the layer thickness of rapid-prototyped models on the aerodynamic coefficients measured in wind tunnel testing. Three models were evaluated: the first with a 0.05 mm layer thickness and a horizontal-plane surface roughness of 0.1 μm (Ra); the second with a 0.125 mm layer thickness and 0.22 μm (Ra); and the third with a 0.15 mm layer thickness and 4.6 μm (Ra). The models were fabricated from Somos 18420 by stereolithography (SLA). A wing-body-tail configuration was chosen for the study. Testing covered the range Mach 0.3 to Mach 0.9 at angles of attack from -2° to +12° at zero sideslip. Coefficients of normal force, axial force, pitching moment, and lift over drag are shown at each of these Mach numbers. The results show that layer thickness does affect the aerodynamic characteristics in general, although the data differ between the three models by less than 5%. Layer thickness has a greater effect on the aerodynamic characteristics as the Mach number decreases, and has the greatest effect on the axial force and its derived coefficients.
Abstract: The ECG contains very important clinical information about the cardiac activity of the heart. Often the ECG signal needs to be captured over a long period of time in order to identify abnormalities in certain situations. Such a signal, apart from its large volume, is often characterised by low quality due to noise and other influences. In order to extract features, the ECG signal with its time-varying characteristics first needs to be preprocessed with the best parameters. It is also useful to identify specific parts of the long-lasting signal that exhibit abnormalities and to direct the practitioner to those parts of the signal. In this work we present a method based on the wavelet transform, the standard deviation and a variable threshold, which achieves 100% accuracy in identifying the ECG signal peaks and heartbeat, as well as the standard deviation, providing a quick reference to abnormalities.
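The thresholding stage can be sketched as follows: a standard-deviation-based threshold plus a refractory period for peak merging. This is a simplified illustration; the wavelet preprocessing stage is omitted, and the constants (k = 2.5, 0.25 s refractory window) are our assumptions, not the paper's parameters.

```python
def detect_r_peaks(signal, fs, k=2.5):
    """Flag a sample as an R-peak when it exceeds mean + k*std of the
    signal and is a local maximum; candidates closer together than a
    0.25 s refractory window are merged into one peak."""
    n = len(signal)
    mu = sum(signal) / n
    sd = (sum((x - mu) ** 2 for x in signal) / n) ** 0.5
    thresh = mu + k * sd
    refractory = int(0.25 * fs)
    peaks = []
    for i in range(1, n - 1):
        if (signal[i] > thresh and signal[i] >= signal[i - 1]
                and signal[i] >= signal[i + 1]):
            if not peaks or i - peaks[-1] > refractory:
                peaks.append(i)
    return peaks

def heart_rate_bpm(peaks, fs):
    """Mean heart rate from successive R-R intervals."""
    rr = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(rr) / len(rr))
```

On a long recording the same statistics computed per window, rather than globally, give the variable threshold that lets abnormal segments stand out.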
Abstract: The heat index describes the combined effect of
temperature and humidity on the human body. This combined effect is
becoming a serious threat to people's health because of the
changing climate. With climate change, climate variability, and thus
the occurrence of heat waves, is likely to increase. Evidence is
emerging from the analysis of long-term climate records of an
increase in the frequency and duration of extreme temperature events
all over Bangladesh, particularly during summer. The summer season
has lengthened while winters have become shorter in Bangladesh.
Summers have become hotter, affecting the lives of people engaged in
outdoor activities during scorching sun hours. In 2003, around 62
people died due to a heat wave across the country. In this paper,
Bangladesh is divided into four regions and the heat index is
calculated for each region for the period 1960 to 2010. The aim of
this paper is to identify the spots most vulnerable to heat strokes
and heat waves due to a high heat index. The results show an upward
trend of the heat index in almost all regions of Bangladesh. The
highest increase in heat index has been observed in areas of the
south-west and north-west regions. The largest change in average heat
index has been found in Jessore, at almost 5.5°C.
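The heat index referred to above is conventionally computed with the US National Weather Service (Rothfusz) regression. A sketch with a Celsius wrapper follows; the regression itself is standard, but we note as an assumption that the paper may use a different variant, and the quoted validity range is approximate.

```python
def heat_index_f(t_f, rh):
    """NWS (Rothfusz) heat-index regression; t_f in deg F, rh in %.
    Valid roughly for t_f >= 80 deg F and rh >= 40%."""
    return (-42.379 + 2.04901523 * t_f + 10.14333127 * rh
            - 0.22475541 * t_f * rh - 6.83783e-3 * t_f ** 2
            - 5.481717e-2 * rh ** 2 + 1.22874e-3 * t_f ** 2 * rh
            + 8.5282e-4 * t_f * rh ** 2 - 1.99e-6 * t_f ** 2 * rh ** 2)

def heat_index_c(t_c, rh):
    """Heat index in deg C for an air temperature given in deg C."""
    hi_f = heat_index_f(t_c * 9.0 / 5.0 + 32.0, rh)
    return (hi_f - 32.0) * 5.0 / 9.0
```

For example, 90 °F at 70% relative humidity yields a heat index of about 106 °F, which matches the published NWS table for those conditions.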
Abstract: The purpose of this paper is to guide the effort to
improve the economic added value of Indonesian fisheries products
through a post-fishing program, namely a cold storage program.
Indonesia's fisheries potential has been acknowledged by the world:
FAO (2009) stated that Indonesia is one of the ten largest
producers of fishery products in the world. Based on BPS (Statistics
Indonesia) data, national fisheries production in 2011 reached
5.714 million tons, of which 93.55% came from marine fisheries and
6.45% from open waters. Indonesia's territory, two-thirds of which
consists of waters, has given enormous benefits to Indonesia,
especially its fishermen. Improving the economic level of fishermen
requires efforts to develop fisheries business units. One of these
efforts is improving the quality of the products marketed at the
regional and international levels, which certainly needs the support
of various fishery facilities (from infrastructure to
superstructure), one of which is cold storage. Given the many
benefits of cold storage as a means of processing fishery resources,
the Indonesia Maritime Security Coordinating Board (IMSCB), one of
the maritime institutions for maritime security and safety, has a
program to empower coastal communities by encouraging the
development of cold storage in middle- and lower-tier fishery
business units. The development of cold storage facilities that can
play their role to the fullest requires the synergistic efforts of
various parties.
Abstract: The balance between nitrogen loading and runoff in the
forested headwater streams of the Kanna River was estimated to
elucidate the current status of nitrogen saturation in a forested
watershed. The NO3-N concentration in the study area was far higher
than the average value in Japan. Estimated nitrogen runoff accounted
for 55–57% of nitrogen loading, suggesting that the forest's nitrogen
retention capacity is most likely in decline. Since the 1970s, Japan's
forestry industry has been declining due to the decrease in lumber
demand and the increase in cheap imported materials. This decline is
thus likely to contribute significantly to further nitrogen saturation
in forest ecosystems.
Abstract: Calculations of the energy efficiency of several AAC-based
building envelopes under different climatic conditions are
presented. Expanded polystyrene and hydrophobic and hydrophilic
mineral wools are assumed as the thermal insulation materials. The
computations are carried out using the computer code HEMOT,
developed at the Department of Materials Engineering, Faculty of
Civil Engineering, Czech Technical University in Prague. The
climatic data for Athens, Kazan, Oslo, Prague and Reykjavík are
obtained using the METEONORM software.
Abstract: In this work, we try to find the best settings of a
Computational Fluid Dynamics solver for problems in the field of
supersonic internal flows. We use a supersonic air-to-air ejector to
represent the typical problem in focus. Multiple oblique shock waves,
shear layers, boundary layers and a normal shock interact in the
supersonic ejector, making this device typical of the field of
supersonic internal flows. Modeling of shocks is in general demanding
on the physical model of the fluid, because, as found in many works,
the ordinary conservation equations do not conform to the real
conditions in the near-shock region. For these reasons, we take
special care over the solver settings in this article, using an
experimental approach based on color schlieren pictures and pneumatic
measurements. Fast pressure transducers were used to measure the
unsteady static pressure in regimes with a normal shock in the mixing
chamber. The physical behavior of the ejector in several regimes is
discussed. The best choice of eddy-viscosity setting is discussed on
a theoretical basis. The final verification of the k-ω SST model is
based on a comparison between the experimental and numerical results.
Abstract: A mobile ad hoc network is a collection of mobile
nodes communicating through wireless channels without any
existing network infrastructure or centralized administration.
Because of the limited transmission range of wireless network
interfaces, multiple "hops" may be needed to exchange data
across the network. Consequently, many routing algorithms
have come into existence to satisfy the needs of
communications in such networks. Researchers have
conducted many simulations comparing the performance of
these routing protocols under various conditions and
constraints. One question that arises is whether the speed of
the nodes affects the relative performance of the routing
protocols being studied. This paper addresses this question by
simulating two routing protocols, AODV and DSDV. The
protocols were simulated using ns-2 and compared in terms of
packet delivery fraction, normalized routing load and average
delay, while varying the number of nodes and their speed.
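The three comparison metrics can be sketched from trace counts as follows; the record format here is a simplification we assume for illustration, since real ns-2 traces have their own line syntax.

```python
def trace_metrics(events):
    """Compute the three metrics used above from simplified trace
    records: ('send', pkt_id, t) and ('recv', pkt_id, t) for data
    packets, ('rtr', pkt_id, t) for routing-protocol packets."""
    sent, recv, rtr = {}, {}, 0
    for kind, pid, t in events:
        if kind == 'send':
            sent[pid] = t
        elif kind == 'recv':
            recv[pid] = t
        elif kind == 'rtr':
            rtr += 1
    pdf = len(recv) / len(sent)        # packet delivery fraction
    nrl = rtr / len(recv)              # normalized routing load
    # average end-to-end delay over delivered packets
    delay = sum(recv[p] - sent[p] for p in recv) / len(recv)
    return pdf, nrl, delay
```

Varying node count and speed in the simulation and recomputing these three numbers per run is exactly the comparison the abstract describes.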