Abstract: Logistics is the part of the supply chain process that
plans, implements, and controls the efficient and effective forward
and reverse flow and storage of goods, services, and related
information between the point of origin and the point of consumption
in order to meet customer requirements. This research investigates
the current status and future direction of the use of Information
Technology (IT) for logistics, focusing on Supply Chain Management
(SCM) and E-Commerce adoption in Malaysia.
It therefore examines the types of technology being adopted and the
factors, benefits, and barriers affecting innovation in SCM and
E-Commerce technology adoption among Logistics Service Providers
(LSPs). A mailed questionnaire survey was conducted to collect data
from 265 logistics companies in Johor. The research revealed a high
level of SCM technology adoption among LSPs: they had adopted SCM
technology in various business processes and perceived a high level
of benefits from SCM adoption.
Abstract: Lactic acid alone and in combination with nisin was
evaluated for reducing the population of naturally occurring
microorganisms on chilled shrimp. Fresh shrimps were dipped for 10
min in 0, 1.0% or 2.0% (v/v) lactic acid, alone or combined with
0.04 (g/L/kg) nisin solution. Total plate counts of aerobic bacteria
(TPCs), psychrotrophic counts, and populations of Pseudomonas spp.,
H2S-producing bacteria, and lactic acid bacteria (LAB) on shrimps
were determined during storage at 4 °C. The results indicated that
total plate counts were 2.91 and 2.63 log CFU/g higher on untreated
shrimps after 7 and 14 days of storage, respectively, than on shrimps
treated with 2.0% lactic acid combined with 0.04 (g/L/kg) nisin.
Both concentrations of lactic acid produced a significant reduction
in Pseudomonas counts during storage, with 2.0% lactic acid combined
with nisin giving the highest reduction. In addition, H2S-producing
bacteria were more sensitive to the higher concentration of lactic
acid combined with nisin during storage.
Abstract: In image processing, image compression can improve
the performance of digital systems by reducing the cost and time of
image storage and transmission without significant reduction of
image quality. This paper describes a hardware architecture for a
low-complexity Discrete Cosine Transform (DCT) for image
compression [6]. In this DCT architecture, common computations are
identified and shared to remove redundant computations in the DCT
matrix operation. Vector processing is used to implement the DCT.
This reduction in the computational complexity of the 2D DCT reduces
power consumption. The 2D DCT is performed on an 8x8 matrix using
two 1-dimensional discrete cosine transform blocks and a
transposition memory [7]. An inverse discrete cosine transform
(IDCT) is performed to recover the image matrix and reconstruct the
original image. The proposed image compression algorithm is modeled
in MATLAB, and the VLSI design of the architecture is implemented in
Verilog HDL. The proposed hardware architecture for image compression
employing the DCT was synthesized using RTL Compiler and mapped to
180 nm standard cells. Simulation is done using ModelSim, and the
simulation results from MATLAB and Verilog HDL are compared.
Detailed power and area analysis was done using RTL Compiler from
Cadence. Power consumption of the DCT core is reduced to 1.027 mW
with minimal area [1].
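The row/column decomposition described above (two 1-D DCT passes with a transposition in between) can be sketched as a Python reference model; this is a behavioral sketch for checking results, not the hardware architecture itself:

```python
import math

def dct_1d(vec):
    """Naive 1-D DCT-II of a length-N vector (orthonormal scaling)."""
    n = len(vec)
    out = []
    for k in range(n):
        s = sum(vec[i] * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                for i in range(n))
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(scale * s)
    return out

def transpose(m):
    """Swap rows and columns (the role of the transposition memory)."""
    return [list(row) for row in zip(*m)]

def dct_2d(block):
    """2-D DCT of an 8x8 block: 1-D DCT on rows, transpose, 1-D DCT again."""
    rows = [dct_1d(r) for r in block]
    cols = [dct_1d(r) for r in transpose(rows)]
    return transpose(cols)
```

For a constant 8x8 block the energy concentrates entirely in the DC coefficient, which is the property the compression step exploits.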
Abstract: A new nondestructive technique, an inverse technique based on vibration tests, is developed to characterize the nonlinear mechanical properties of adhesive layers in sandwich composites. The adhesive layer is described as a viscoelastic isotropic material with storage and loss moduli that are both frequency dependent over a wide frequency range. An optimization based on the planning of experiments and the response surface technique is applied to minimize the error functional, considerably decreasing the computational expense. The developed identification technique has been tested on aluminum panels and successfully applied to characterize the viscoelastic material properties of the 3M damping polymer ISD-112 used as a core material in sandwich panels.
Abstract: Grid computing is a form of distributed computing
that involves coordinating and sharing computational power, data
storage and network resources across dynamic and geographically
dispersed organizations. Scheduling onto the Grid is NP-complete,
so there is no best scheduling algorithm for all grid computing
systems. An alternative is to select an appropriate scheduling
algorithm for a given grid environment based on the characteristics
of the tasks, machines, and network connectivity. Job and resource
scheduling is one of the key research areas in grid computing. The
goal of scheduling is to achieve the highest possible system
throughput and to match application needs with the available
computing resources. The motivation of this survey is to help
newcomers to the field of grid computing easily understand the
concept of scheduling and contribute to developing more efficient
scheduling algorithms; it should thus benefit interested researchers
carrying out further work in this active area of research.
Abstract: This paper describes the design and development of a pico-hydro generation system that uses the water supplied to houses. Water flowing in domestic pipes carries kinetic energy with the potential to generate electricity for energy storage, in addition to serving routine activities such as laundry, cooking and bathing. The inherent pressure and flow of the water delivered from the utility's main tank for those usual activities is also used to rotate a small-scale hydro turbine driving a generator for electrical power generation. Hence, this project develops a small-scale hydro generation system using the domestic water supply as an alternative electrical energy source for residential use.
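As a rough feasibility check for such a system, the recoverable hydraulic power can be estimated as P = eta * rho * g * Q * H. A minimal sketch follows; the flow, head, and efficiency figures are illustrative assumptions, not values from the paper:

```python
def hydro_power_watts(flow_m3_s, head_m, efficiency=0.5,
                      rho=1000.0, g=9.81):
    """Hydraulic power P = rho * g * Q * H, scaled by an overall
    turbine-generator efficiency (all values here are assumptions)."""
    return efficiency * rho * g * flow_m3_s * head_m

# Hypothetical domestic figures: 0.2 L/s flow, 15 m equivalent head,
# 50% overall efficiency -> about 15 W of electrical output.
p_out = hydro_power_watts(0.0002, 15.0)
```

Even this optimistic estimate yields only a few watts, which is consistent with the paper's framing of the system as a supplementary source for energy storage rather than a primary supply.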
Abstract: As chip densities increase, designers are trying to
place ever more computational and storage facilities on a single
chip. With the growing complexity of computational and storage
circuits, designing, testing, and debugging become more complex and
expensive, so hardware designs are built using a very high speed
hardware description language, which is more efficient and cost
effective. This paper focuses on the implementation of a 32-bit ALU
design based on the Verilog hardware description language. The adder
and subtractor operate correctly on both unsigned and positive
numbers. In an ALU, addition takes most of the time if a
ripple-carry adder is used. The general strategy for designing fast
adders is to reduce the time required to form carry signals; adders
that use this principle are called carry look-ahead adders. The
carry look-ahead adder is designed as a combination of 4-bit adders.
The syntax of Verilog HDL is similar to the C programming language.
This paper proposes a unified approach to ALU design in which both
simulation and formal verification can co-exist.
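The carry look-ahead principle can be sketched as a small Python reference model of one 4-bit slice (a behavioral sketch for checking sums, not the Verilog design):

```python
def cla_4bit(a, b, cin=0):
    """4-bit carry look-ahead adder model.
    g[i] = a[i] & b[i] (generate), p[i] = a[i] ^ b[i] (propagate).
    In hardware each carry c[i+1] = g[i] | (p[i] & c[i]) is expanded
    into flat two-level logic instead of rippling; here the recurrence
    is evaluated directly."""
    bits = [((a >> i) & 1, (b >> i) & 1) for i in range(4)]
    g = [x & y for x, y in bits]
    p = [x ^ y for x, y in bits]
    c = [cin]
    for i in range(4):
        c.append(g[i] | (p[i] & c[i]))
    s = 0
    for i in range(4):
        s |= (p[i] ^ c[i]) << i   # sum bit = propagate XOR incoming carry
    return s, c[4]                # (4-bit sum, carry out)
```

Cascading eight such slices, with each slice's carry-out feeding the next slice's carry-in, gives the 32-bit adder structure the abstract describes.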
Abstract: Noise contamination in a magnetic resonance (MR)
image can occur during acquisition, storage, and transmission;
effective filtering is therefore required to avoid repeating the MR
procedure. In this paper, an iterative asymmetrical triangle fuzzy
filter with moving average center (ATMAVi filter) is used to reduce
different levels of salt and pepper noise in a brain MR image. Besides
visual inspection on filtered images, the mean squared error (MSE) is
used as an objective measurement. When compared with the median
filter, simulation results indicate that the ATMAVi filter is effective
especially for filtering a higher level noise (such as noise density =
0.45) using a smaller window size (such as 3x3) when operated
iteratively or using a larger window size (such as 5x5) when operated
non-iteratively.
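The median-filter baseline and the MSE measure used in the comparison can be sketched in Python; the ATMAVi filter itself is specific to the paper and is not reproduced here:

```python
def median_filter(img, k=3):
    """Apply a k x k median filter to a 2-D list of pixel values
    (edge pixels use the partial neighbourhood)."""
    h, w = len(img), len(img[0])
    r = k // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            win = [img[j][i]
                   for j in range(max(0, y - r), min(h, y + r + 1))
                   for i in range(max(0, x - r), min(w, x + r + 1))]
            win.sort()
            out[y][x] = win[len(win) // 2]  # median of the window
    return out

def mse(a, b):
    """Mean squared error between two images of the same size."""
    n = len(a) * len(a[0])
    return sum((pa - pb) ** 2
               for ra, rb in zip(a, b)
               for pa, pb in zip(ra, rb)) / n
```

Isolated salt or pepper pixels are outliers within their window and are replaced by the local median, which is why the median filter is the standard baseline for this noise model.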
Abstract: Block replacement algorithms to increase hit ratio
have been extensively used in cache memory management. Among
basic replacement schemes, LRU and FIFO have been shown to be
effective replacement algorithms in terms of hit rates. In this paper,
we introduce a flexible stack-based circuit which can be employed in
hardware implementation of both LRU and FIFO policies. We
propose a simple and efficient architecture such that stack-based
replacement algorithms can be implemented without the drawbacks
of the traditional architectures. The stack is modular and hence, a set
of stack rows can be cascaded depending on the number of blocks in
each cache set. Our circuit can be implemented in conjunction with
the cache controller and static/dynamic memories to form a cache
system. Experimental results show that our proposed circuit provides
an average 26% improvement in storage bits, and its maximum
operating frequency is increased by a factor of two.
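A software model of the stack behavior such a circuit implements might look like the following sketch (behavioral only, not the hardware design; a FIFO variant would simply skip the move-to-top step on a hit):

```python
class LRUStack:
    """Model of a stack-based LRU replacement list for one cache set.
    Index 0 is the most recently used row; the bottom row is the victim."""

    def __init__(self, nblocks):
        self.stack = []
        self.nblocks = nblocks   # number of blocks (rows) in the set

    def access(self, tag):
        """Returns the evicted tag on a miss that overflows the set,
        else None."""
        if tag in self.stack:
            self.stack.remove(tag)      # hit: pull the row out ...
            self.stack.insert(0, tag)   # ... and push it on top
            return None
        self.stack.insert(0, tag)       # miss: new block goes on top
        if len(self.stack) > self.nblocks:
            return self.stack.pop()     # bottom row is the LRU victim
        return None
```

Because each row only shifts relative to its neighbors, rows can be cascaded to match the set associativity, mirroring the modular stack described in the abstract.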
Abstract: The Yazd-Ardakan basin in Central Iran has two separate aquifers. The shallow unconfined aquifer supplies 40 qanats, while the deep saturated confined aquifer is the main water storage. Due to over-withdrawal, the water table has been falling for the last 25 years. A recent study shows that the decline in the aquifer is about 16 meters and land subsidence is 0.5-1.2 meters. Long, deep cracks are found just above the aquifer and swallow irrigation water and floodwater. Although most cracks trend NW-SE, matching the main direction of the Yazd-Ardakan basin, there is no direct evidence of a relation between the land subsidence and these huge cracks. Large-scale water pumping has decreased the water pressure in the aquifer; this pressure decline disturbed the balance and increased the effective pressure of the overlying sediments. Porosity therefore decreased and compaction started; the developing sediment compaction slowly produced land subsidence and some huge cracks.
Abstract: The processes of plant breeding, testing and licensing of new varieties, patent protection in seed production, trade relations, and copyright protection all depend on the identification, differentiation, and characterization of plant genotypes. We therefore focused our research on the utilization of wheat storage proteins as genetic markers suitable not only for differentiating individual genotypes but also for identifying and characterizing their important properties. In this study we analyzed a collection of 102 genotypes of bread wheat (Triticum aestivum L.), 41 genotypes of spelt wheat (Triticum spelta L.), and 35 genotypes of durum wheat (Triticum durum Desf.). Our results show that the genotypes of bread wheat and durum wheat were homogeneous and single-line, whereas the spelt wheat genotypes were heterogeneous. We observed variability of HMW-GS composition according to environmental factors and level of breeding, and predicted technological quality on the basis of Glu-score calculation.
Abstract: To assess optical fiber reliability under different environmental and stress conditions, series of tests were performed simulating the overlap of controlled, varying chemical and mechanical factors. Each series of tests can be compared using statistical processing, e.g. Weibull plots. Given the volume of data to treat, a software application proved useful for interpreting selected series of experiments as a function of the factors considered. This paper presents a software application used for the storage, modelling, and interpretation of experimental data gathered from optical fiber testing. It deals strictly with the software part of the project (the modelling, storage, and processing of user-supplied data).
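Weibull comparison of test series is typically done by linear regression on the Weibull plot, ln(-ln(1-F)) versus ln(x). A minimal sketch, assuming median-rank plotting positions (not necessarily the application's exact method):

```python
import math

def weibull_fit(failures):
    """Estimate Weibull shape (beta) and scale (eta) from failure
    stresses via least-squares on the linearized Weibull plot, using
    median-rank plotting positions F_i = (i - 0.3) / (n + 0.4)."""
    xs = sorted(failures)
    n = len(xs)
    pts = []
    for i, x in enumerate(xs, start=1):
        f = (i - 0.3) / (n + 0.4)
        pts.append((math.log(x), math.log(-math.log(1.0 - f))))
    mx = sum(px for px, _ in pts) / n
    my = sum(py for _, py in pts) / n
    beta = (sum((px - mx) * (py - my) for px, py in pts)
            / sum((px - mx) ** 2 for px, _ in pts))   # slope = shape
    eta = math.exp(mx - my / beta)                    # x where y = 0
    return beta, eta
```

The fitted shape parameter distinguishes wear-out from random failure modes between test series, which is what the Weibull comparison in such reliability studies is after.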
Abstract: Main Memory Database systems (MMDB) store their
data in main physical memory and provide very high-speed access.
Conventional database systems are optimized for the particular
characteristics of disk storage mechanisms. Memory resident
systems, on the other hand, use different optimizations to structure
and organize data, as well as to make it reliable.
This paper provides a brief overview of MMDBs and of one
memory-resident system, FastDB, and compares the processing time of
this system with that of a typical disk-resident database, based on
the results of implementing a TPC benchmark environment on both.
Abstract: The purpose of this study was to identify the main
sources of copper (Cu) accumulation in the target organs of tilapia
(Oreochromis mossambicus) and to investigate how the organism
mediates the process of Cu accumulation under prolonged exposure.
By measuring both dietary and waterborne Cu accumulation and total
concentrations in tilapia with a biokinetic modeling approach, we
were able to clarify the biokinetic coping mechanisms for long-term
Cu accumulation. This study showed that water and food are both
major sources of Cu for the muscle and liver of tilapia, implying
that controlling the Cu concentration in these two routes will
determine the Cu bioavailability for tilapia. We found that the
exposure duration and the level of waterborne Cu drove the Cu
accumulation in tilapia. The capacities for Cu biouptake and
depuration in the organs of tilapia were actively mediated under
prolonged exposure conditions. Generally, the uptake rate,
depuration rate, and net bioaccumulation ability in all selected
organs decreased with increasing levels of waterborne Cu and
extended exposure duration. Muscle tissues accounted for over 50%
of the total accumulated Cu and played a key role in buffering the
Cu burden in the initial period of exposure, while the liver played
a more important role in Cu storage as exposure was extended. We
conclude that assuming constant biokinetic rates could lead to
incorrect predictions, overestimating long-term Cu accumulation in
ecotoxicological risk assessments.
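A generic first-order biokinetic model of the kind used in such studies can be sketched as follows. The structure and parameter names (ku, ke, AE, IR) are standard biokinetic-modeling conventions, not the paper's fitted values:

```python
def accumulate_cu(cw, cf, ku, ae, ir, ke, days, dt=0.1):
    """Generic first-order biokinetic sketch (not the paper's model):
        dC/dt = ku*Cw + AE*IR*Cf - ke*C
    cw: waterborne Cu concentration, cf: dietary Cu concentration,
    ku: waterborne uptake rate constant, ae: assimilation efficiency,
    ir: ingestion rate, ke: depuration rate constant.
    Integrated with simple Euler steps of size dt (days)."""
    c = 0.0
    for _ in range(int(days / dt)):
        c += (ku * cw + ae * ir * cf - ke * c) * dt
    return c
```

Under constant rates the body burden approaches the steady state (ku*Cw + AE*IR*Cf)/ke, which illustrates the abstract's closing point: if ku and ke actually decline under prolonged exposure, a constant-rate model of this form will overpredict long-term accumulation.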
Abstract: Lake Nasser is one of the largest reservoirs in the
world. Over 120 million metric tons of sediments are deposited in its
dead storage zone every year. The main objective of the present work
was to determine the physical and chemical characteristics of Lake
Nasser sediments. The sample had a relatively low surface area of 2.9
m²/g, which increased more than 3-fold upon chemical activation. The
main chemical elements of the raw sediments were C, O and Si with
some traces of Al, Fe and Ca. The organic functional groups for the
tested sample included O-H, C=C, C-H and C-O, with indications of
Si-O and other metal-C and/or metal-O bonds normally associated
with clayey materials. Potentiometric titration of the sample in
different ionic strength backgrounds revealed an alkaline material
with a very strong positive surface charge at pH values just below
the pH of zero charge (~9). Surface interactions of the
sediments with the background electrolyte were significant. An
advanced surface complexation model was able to capture these
effects, employing a single-site approach to represent protolysis
reactions in aqueous solution, and to determine the significant surface
species in the pH range of environmental interest.
Abstract: In a wide-area environment such as a Grid, data
placement is an important aspect of distributed database systems. In
this paper, we address the problem of the initial placement of
non-replicated database fragments in a Grid architecture. We propose
a graph-based approach that considers resource restrictions. The goal is to
optimize the use of computing, storage and communication
resources. The proposed approach is developed in two phases: in the
first phase, we perform fragment grouping using knowledge about
fragments dependency and, in the second phase, we determine an
efficient placement of the fragment groups on the Grid. We also
show, via experimental analysis, that our approach gives solutions
that are close to being optimal for different databases and Grid
configurations.
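The first phase, grouping fragments by dependency, could be sketched as connected-component grouping over the dependency graph. This is an assumed criterion chosen for illustration; the paper's actual grouping heuristic may differ:

```python
def group_fragments(fragments, dependencies):
    """Group fragments into connected components of the dependency
    graph, using union-find. dependencies: iterable of (a, b) pairs
    meaning fragments a and b are frequently accessed together."""
    parent = {f: f for f in fragments}

    def find(x):
        # Path-halving find: walk to the root, shortcutting as we go.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for a, b in dependencies:
        parent[find(a)] = find(b)   # union the two components

    groups = {}
    for f in fragments:
        groups.setdefault(find(f), []).append(f)
    return list(groups.values())
```

Each resulting group would then be placed as a unit in the second phase, so that dependent fragments land on the same or nearby Grid nodes and inter-node communication is reduced.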
Abstract: Key management is a major and the most sensitive
part of cryptographic systems. It includes key generation, key
distribution, key storage, and key deletion, and is also considered
the hardest part of cryptography. Designing secure cryptographic
algorithms is hard, and keeping the keys secret is much harder.
Cryptanalysts usually attack both symmetric and public-key
cryptosystems through their key management. We introduce a protocol
to exchange cipher keys over an insecure communication channel. This
protocol is based on a public-key cryptosystem, specifically an
elliptic curve cryptosystem. It also tests the cipher keys,
selecting only the good keys and rejecting the weak ones.
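The elliptic-curve exchange underlying such a protocol can be illustrated with a toy Diffie-Hellman over a tiny curve. This is illustration only: the textbook curve y² = x³ + 2x + 2 (mod 17) is far too small for real security, and the paper's key-testing step is not shown:

```python
P, A = 17, 2   # toy curve y^2 = x^3 + 2x + 2 over GF(17); group order 19

def ec_add(p1, p2):
    """Add two curve points; None represents the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                                      # P + (-P)
    if p1 == p2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def ec_mul(k, pt):
    """Scalar multiplication by double-and-add."""
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

G = (5, 1)                        # generator of the order-19 group
alice_priv, bob_priv = 3, 7       # secret scalars
alice_pub = ec_mul(alice_priv, G)
bob_pub = ec_mul(bob_priv, G)
shared_a = ec_mul(alice_priv, bob_pub)   # Alice's view of the secret
shared_b = ec_mul(bob_priv, alice_pub)   # Bob's view: same point
```

Both parties arrive at the same curve point, from which a cipher key can be derived; a real protocol would use a standardized curve and, as the abstract notes, test the resulting keys before use.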
Abstract: Stochastic models of biological networks are well established in systems biology, where the computational treatment of such models is often focused on the solution of the so-called chemical master equation via stochastic simulation algorithms. In contrast, the development of storage-efficient model representations that are directly suitable for computer implementation has received significantly less attention. Instead, a model is usually described in terms of a stochastic process or a "higher-level paradigm" with a graphical representation, such as a stochastic Petri net. A serious problem then arises from the exponential growth of the model's state space, which is in fact a main reason for the popularity of stochastic simulation, since simulation suffers less from state space explosion than non-simulative numerical solution techniques. In this paper we present transition class models for the representation of biological network models, a compact mathematical formalism that circumvents state space explosion. Transition class models can also serve as an interface between different higher-level modeling paradigms, stochastic processes, and the implementation coded in a programming language. Moreover, the compact model representation provides the opportunity to apply non-simulative solution techniques while preserving the possible use of stochastic simulation. Illustrative examples of transition class representations are given for an enzyme-catalyzed substrate conversion and a part of the bacteriophage λ lysis/lysogeny pathway.
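The flavor of a transition class representation can be conveyed by writing each reaction of the enzyme-catalyzed conversion E + S ⇌ ES → E + P as a (rate function, state update) pair and feeding the pairs to a stochastic simulation algorithm. This is a generic Gillespie SSA sketch, not the paper's formalism verbatim:

```python
import random

def gillespie_enzyme(e0, s0, k1, k2, k3, t_end, seed=0):
    """Gillespie SSA for E + S -> ES (k1), ES -> E + S (k2),
    ES -> E + P (k3). The state is (E, S, ES, P); each transition
    class pairs a rate function with a state-update vector, so the
    state space is never enumerated explicitly."""
    rng = random.Random(seed)
    e, s, es, p = e0, s0, 0, 0
    t = 0.0
    classes = [
        (lambda e, s, es: k1 * e * s, (-1, -1, +1, 0)),  # binding
        (lambda e, s, es: k2 * es,    (+1, +1, -1, 0)),  # unbinding
        (lambda e, s, es: k3 * es,    (+1, 0, -1, +1)),  # conversion
    ]
    while t < t_end:
        rates = [f(e, s, es) for f, _ in classes]
        total = sum(rates)
        if total == 0:
            break                          # absorbing state reached
        t += rng.expovariate(total)        # time to next event
        pick = rng.uniform(0.0, total)     # choose which class fires
        for rate, (_, (de, ds, des, dp)) in zip(rates, classes):
            pick -= rate
            if pick <= 0:
                e += de; s += ds; es += des; p += dp
                break
    return e, s, es, p
```

Because the model is stored as rate functions plus update vectors rather than an enumerated state space, the same representation could equally be handed to a non-simulative solver, which is the point the abstract makes.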
Abstract: This study investigates the capacity of granular
activated carbon (GAC) for the storage of methane through
equilibrium adsorption. An experimental apparatus consisting of a
dual adsorption vessel was set up to measure the equilibrium
adsorption of methane on GAC using the volumetric (pressure decay)
technique. Experimental isotherms of methane adsorption were
determined by measuring the equilibrium uptake of methane at
different pressures (0-50 bar) and temperatures (285.15-328.15 K).
The experimental data were fitted to the Freundlich and Langmuir
equations to determine the model isotherm; the results show that the
experimental data are equally well fitted by both model isotherms.
Using the experimental data obtained at different temperatures, the
isosteric heat of methane adsorption was also calculated with the
Clausius-Clapeyron equation from the Sips isotherm model. The
results show that decreasing the temperature or increasing the
methane uptake by the GAC decreases the isosteric heat of methane
adsorption.
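The model isotherms and the Clausius-Clapeyron step can be written down directly. A minimal sketch with the standard equations follows; the fitted parameter values themselves are not reproduced here:

```python
import math

def langmuir(p, q_max, b):
    """Langmuir isotherm: q = q_max * b * p / (1 + b * p)."""
    return q_max * b * p / (1.0 + b * p)

def freundlich(p, k_f, n):
    """Freundlich isotherm: q = k_f * p**(1/n)."""
    return k_f * p ** (1.0 / n)

def isosteric_heat(p1, t1, p2, t2, r=8.314):
    """Clausius-Clapeyron estimate at constant loading, from two
    (pressure, temperature) points on an isostere:
        Qst = R * ln(p2/p1) / (1/t1 - 1/t2)   [J/mol]"""
    return r * math.log(p2 / p1) / (1.0 / t1 - 1.0 / t2)
```

Evaluating the isosteric heat at several loadings from isotherms measured at different temperatures is exactly the procedure the abstract describes; the trend of Qst with uptake then characterizes the energetic heterogeneity of the GAC surface.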
Abstract: Two algorithms are proposed to reduce the storage requirements for mammogram images. The input image goes through a shrinking process that converts 16-bit images to 8 bits using a pixel-depth conversion algorithm, followed by an enhancement process. The performance of the algorithms is evaluated objectively and subjectively. A 50% reduction in size is obtained with no loss of significant data in the breast region.
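A plausible pixel-depth conversion of the kind described can be sketched as a linear 16-to-8-bit rescaling; the abstract does not specify the paper's exact mapping, so this is an illustrative assumption:

```python
def shrink_16_to_8(img, max_val=None):
    """Linearly rescale 16-bit pixel values into the 0-255 range
    (one plausible pixel-depth conversion; halving the bit depth is
    what yields the 50% storage reduction)."""
    flat = [v for row in img for v in row]
    lo = min(flat)
    hi = max_val if max_val is not None else max(flat)
    span = max(hi - lo, 1)   # guard against a constant image
    return [[min(255, (v - lo) * 255 // span) for v in row]
            for row in img]
```

Stretching between the image's own minimum and maximum preserves the usable contrast in the diagnostically important breast region while still halving the per-pixel storage.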