Abstract: Magnesium is a potential implant material owing to its non-toxicity to the human body. Because of their excellent biocompatibility, Mg alloys can be applied to implants, avoiding a second surgery for removal. However, commercial magnesium alloys containing aluminum have been found to have low corrosion resistance, which produces subcutaneous gas bubbles and consequently limits their use as permanent biomaterials. Moreover, aluminum is a known pollutant and is toxic to the nervous system. In this study, therefore, an Mg-35Zn-3Ca alloy was prepared as a new biodegradable material, and pulsed power was used in the constant-current mode of DC anodization. On this basis, the corrosion resistance and biocompatibility were examined as functions of current and frequency. Surface properties and coating thickness were compared using scanning electron microscopy, corrosion resistance was assessed via potentiodynamic polarization, and the effect of the oxide layer on the body was assessed through cell viability. The anodized Mg-35Zn-3Ca alloy shows good in-vitro biocompatibility under the varied current and frequency conditions.
Abstract: One important problem in today's organizations is the existence of non-integrated information systems, inconsistency, and a lack of suitable correlations between legacy and modern systems. One main solution is to transfer the local databases into a global one. In this regard, we need to extract the data structures from the legacy systems and integrate them with the new technology systems. In legacy systems, huge amounts of data are stored in legacy
databases. They require particular attention since they need more effort to be normalized, reformatted, and moved to the modern
database environments. Designing the new integrated (global)
database architecture and applying the reverse engineering requires
data normalization. This paper proposes the use of database reverse
engineering in order to integrate legacy and modern databases in
organizations. The suggested approach consists of methods and
techniques for generating data transformation rules needed for the
data structure normalization.
Abstract: This paper addresses issues of integral steering of
vehicles with two steering axles, where the rear wheels can pivot either in the same direction as the front wheels or in the opposite direction.
The steering box of the rear axle is presented with simple linkages
(single contour) that correlate the pivoting of the rear wheels
according to the direction of the front wheels, respectively to the
rotation angle of the steering wheel. The functionality of the system
is analyzed – the extent to which the requirements of the integral
steering are met by the considered/proposed mechanisms. The paper
highlights the quality of the single contour linkages, with two driving
elements for meeting these requirements, emphasizing diagrams of
mechanisms with 2 driving elements. Cam variants are analyzed and
proposed for the rear axle steering box. Cam profiles are determined
by various factors.
Abstract: This work adopts the DCT and modifies the SPIHT algorithm to encode DCT coefficients. The algorithm reorganizes the DCT coefficients to concentrate signal energy and proposes combination and dictator operations to eliminate the correlation within the same subband level. The proposed algorithm also provides an enhancement function at low bit rates to improve perceptual quality. Experimental results indicate that the proposed technique improves the quality of the reconstructed image in terms of both PSNR and perceptual results, approaching JPEG2000 at the same bit rate.
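The energy-compaction property that such a coder exploits can be illustrated with a plain orthonormal 1-D DCT-II. This is a sketch only; the paper's modified SPIHT coding of the coefficients is not reproduced here.

```python
import math

def dct_ii(x):
    """Orthonormal 1-D DCT-II. For smooth inputs most of the energy
    ends up in the low-frequency coefficients, which is the property
    a coefficient coder such as SPIHT exploits."""
    n = len(x)
    out = []
    for k in range(n):
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(scale * sum(
            x[i] * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
            for i in range(n)))
    return out
```

For a constant signal, all energy lands in the DC coefficient; for any signal, the orthonormal scaling preserves total energy (Parseval), so discarding small high-frequency coefficients loses little.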
Abstract: Twist drills are geometrically complex tools, and various researchers have therefore adopted different mathematical and experimental approaches for their simulation. The present paper acknowledges the increasing use of modern CAD systems and carries out drilling simulations using the API (Application Programming Interface) of a CAD system. The developed DRILL3D software routine creates parametrically controlled tool geometries and, using different cutting conditions, generates solid models for all the relevant data involved (drilling tool, cut workpiece, undeformed chip). The final data constitute a platform for further direct simulations regarding the determination of cutting forces, tool wear, drilling optimizations, etc.
Abstract: We consider a heterogeneously mixing SIR stochastic
epidemic process in populations described by a general graph.
Likelihood theory is developed to facilitate statistical inference for the parameters of the model under complete observation. Using a martingale central limit theorem, we show that these estimators are asymptotically Gaussian and unbiased.
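A minimal discrete-time sketch of a stochastic SIR process on a general graph is shown below. The paper's model is continuous-time and its likelihood machinery is not reproduced; `beta` (per-contact infection probability per step), `gamma` (recovery probability per step), and the adjacency structure are illustrative.

```python
import random

def sir_on_graph(adj, beta, gamma, initial_infected, steps, seed=0):
    """Discrete-time stochastic SIR on a general graph.
    adj maps each node to its list of neighbours. Each step, every
    infected node independently infects each susceptible neighbour
    with probability beta, then recovers with probability gamma."""
    rng = random.Random(seed)
    state = {v: "S" for v in adj}
    for v in initial_infected:
        state[v] = "I"
    for _ in range(steps):
        infected = [v for v in adj if state[v] == "I"]
        if not infected:
            break
        newly = set()
        for v in infected:
            for u in adj[v]:
                if state[u] == "S" and rng.random() < beta:
                    newly.add(u)
        for v in infected:
            if rng.random() < gamma:
                state[v] = "R"
        for u in newly:
            state[u] = "I"
    return state
```

With beta = gamma = 1 on a path graph, the infection front moves deterministically one node per step until everyone has recovered.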
Abstract: Automatic face detection is a complex problem in image processing. Many methods exist to address it, such as template matching, the Fisher Linear Discriminant, neural networks, SVMs, and MRC, each achieving success to varying degrees and with varying complexity. The proposed algorithm targets upright, frontal faces in single grayscale images with decent resolution and good lighting conditions. In face recognition, a single face is matched against single faces from the training dataset. We propose a neural-network-based face detection algorithm that works on photographs and checks any incoming test data against the online scanned training dataset. Experimental results show that the algorithm achieves detection accuracy of up to 95%.
Abstract: A high-performance computer includes a fast processor and millions of bytes of memory. During data processing, huge amounts of information are shuffled between the memory and the processor. Because of its small size and effective speed, a cache has become a common feature of high-performance computers, and enhancing cache performance has proved essential to speeding up cache-based computers. Most enhancement approaches can be classified as either software-based or hardware-controlled, and cache performance is quantified in terms of the hit ratio or miss ratio. In this paper, we optimize cache performance by enhancing the cache hit ratio. The optimum cache performance is obtained through a cache hardware modification that quickly rejects mismatched line tags before the hit-or-miss comparison stage, so that a low hit time for the wanted line in the cache is achieved. In the proposed technique, which we call Even-Odd Tabulation (EOT), the cache lines coming from main memory into the cache are classified into two types, even line tags and odd line tags, depending on their least significant bit (LSB). The EOT technique exploits this division to reject mismatched line tags in very little time compared to the time spent by the main comparator in the cache, giving an optimum hit time for the wanted cache line. Simulation results show the high performance of the EOT technique against the familiar FAM mapping technique.
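The even/odd tag partitioning can be sketched in a few lines. This is an illustrative software model only, not the hardware comparator the paper describes: tags whose LSB differs from the requested tag are rejected without any comparison, so on average only half the tags reach the comparator.

```python
def build_eot_index(tags):
    """Partition fully-associative cache tags into even/odd buckets
    by the tag's least significant bit (the EOT idea)."""
    buckets = {0: [], 1: []}
    for tag in tags:
        buckets[tag & 1].append(tag)
    return buckets

def eot_lookup(buckets, tag):
    """Compare the requested tag only against the matching-parity
    bucket; tags of the other parity are rejected immediately.
    Returns (hit, number_of_comparisons_made)."""
    bucket = buckets[tag & 1]
    comparisons = 0
    for t in bucket:
        comparisons += 1
        if t == tag:
            return True, comparisons
    return False, comparisons
```

A miss on an even tag in a 4-line cache costs only 2 comparisons here, versus 4 in a plain fully-associative scan.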
Abstract: Online trading is an alternative to conventional shopping. People trade goods that are either new or pre-owned. However, there are times when a user cannot find the desired items online, because the items may not have been posted yet, thus ending the search. A conventional search mechanism works only by matching the search criteria (requirements) against the data currently available in a particular database. This research aims to match current search requirements with future postings, thereby bringing the time factor into the conventional search method. A Car Matching Alert System (CMAS) prototype was developed to test the matching algorithm. When a buyer's search returns no result, the system saves the search, and the buyer is alerted if a match is found among future postings. The algorithm developed is useful, as it can also be applied in other search contexts.
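The save-and-alert matching at the core of CMAS might be sketched as follows. The attribute names and the exact-match rule are assumptions; the abstract does not give the real schema or matching algorithm.

```python
def matches(criteria, posting):
    """A posting satisfies a saved search when every stated criterion
    is met (hypothetical attribute names)."""
    return all(posting.get(k) == v for k, v in criteria.items())

class AlertStore:
    """Keep searches that returned no result and fire alerts when a
    future posting matches them (the core CMAS idea)."""
    def __init__(self):
        self.saved = []  # list of (buyer, criteria)

    def save_search(self, buyer, criteria):
        self.saved.append((buyer, criteria))

    def on_new_posting(self, posting):
        """Return the buyers to alert and retire their saved searches."""
        alerted = [b for b, c in self.saved if matches(c, posting)]
        self.saved = [(b, c) for b, c in self.saved
                      if not matches(c, posting)]
        return alerted
```

A saved search is consumed on its first match here; whether the real system retires or keeps alerts active is not stated in the abstract.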
Abstract: The influence of an axial magnetic field (B = 0.48 T) on the variation of the ionization efficiency coefficient η and the secondary electron emission coefficient γ with the reduced electric field E/P is studied over a new range of plane-parallel electrode spacings (0 < d < 20 cm) and nitrogen working pressures between 0.5 and 20 Pa. The axial magnetic field is produced by an inductive copper coil of radius 5.6 cm. The experimental breakdown voltage data are used to estimate the mean Paschen curves under the different working conditions. The secondary electron emission coefficient is calculated from the mean Paschen curve and used to determine the minimum breakdown voltage. A reduction of the discharge voltage of about 25% is observed when the axial magnetic field is applied. At large inter-electrode spacings, the effect of the axial magnetic field becomes more significant for the obtained values of η but less so for γ.
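How the secondary emission coefficient fixes the minimum breakdown voltage can be seen from the classical (unmagnetized) Paschen law; the gas constants A and B used below are illustrative nitrogen-like values, not the paper's magnetized data.

```python
import math

def paschen_voltage(pd, A, B, gamma):
    """Classical Paschen law
        V(pd) = B*pd / (ln(A*pd) - ln(ln(1 + 1/gamma))),
    with gas-dependent constants A, B and secondary electron
    emission coefficient gamma (unmagnetized textbook form)."""
    denom = math.log(A * pd) - math.log(math.log(1.0 + 1.0 / gamma))
    return B * pd / denom

def paschen_minimum(A, B, gamma):
    """Analytic minimum of the Paschen curve:
        pd_min = (e/A) * ln(1 + 1/gamma),  V_min = B * pd_min."""
    pd_min = math.e / A * math.log(1.0 + 1.0 / gamma)
    return pd_min, B * pd_min
```

Substituting pd_min back into V(pd) makes the denominator exactly 1, so V_min = B·pd_min, and the curve rises on both sides of the minimum, which is the familiar Paschen shape the mean curves in the paper follow.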
Abstract: This paper presents an interval-based multi-attribute
decision making (MADM) approach in support of the decision
process with imprecise information. The proposed decision
methodology is based on the model of linear additive utility function
but extends the problem formulation with the measure of composite
utility variance. A sample study concerning the evaluation of electric generation expansion strategies is provided, showing how the
imprecise data may affect the choice toward the best solution and
how a set of alternatives, acceptable to the decision maker (DM),
may be identified with certain confidence.
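One way to attach a variance to the linear additive utility is to treat each interval-valued attribute score as a uniform random variable. This is an illustrative assumption only; the abstract does not specify the paper's exact variance model.

```python
def composite_utility(weights, intervals):
    """Linear additive utility with interval-valued attribute scores.
    Each interval [lo, hi] is treated as uniform (an assumption), so
        E[U]   = sum_i w_i * (lo_i + hi_i) / 2
        Var[U] = sum_i w_i**2 * (hi_i - lo_i)**2 / 12
    assuming the attribute scores are independent."""
    mean = sum(w * (lo + hi) / 2.0
               for w, (lo, hi) in zip(weights, intervals))
    var = sum(w * w * (hi - lo) ** 2 / 12.0
              for w, (lo, hi) in zip(weights, intervals))
    return mean, var
```

Wider intervals (more imprecise data) inflate the composite variance, which is what lets a decision maker accept a set of alternatives with stated confidence rather than a single point ranking.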
Abstract: The dynamic behaviour of a four-bar linkage driven by a velocity-controlled DC motor is discussed in this paper. In particular, the author presents the results obtained by means of specifically developed software, which implements the mathematical models of all components of the system (linkage, transmission, electric motor, control devices). The use of this software enables a more efficient design approach, since it allows the designer to check, in a simple and immediate way, the dynamic behaviour of the mechanism arising from different values of the system parameters.
Abstract: The job shop scheduling problem (JSSP) is well known as one of the most difficult combinatorial optimization problems. This paper presents a hybrid genetic algorithm for the JSSP with the objective of minimizing makespan. The efficiency of the genetic algorithm is enhanced by integrating it with a local search method. The chromosome representation of the problem is based on operations, and schedules are constructed using a procedure that generates full active schedules. In each generation, a local search heuristic based on Nowicki and Smutnicki's neighborhood is applied to improve the solutions. The approach is tested on a set of standard instances taken from the literature and compared with other approaches; the computational results validate the effectiveness of the proposed algorithm.
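Decoding an operation-based chromosome into a makespan can be sketched as follows. The decoding convention (the k-th occurrence of job j denotes job j's k-th operation) is standard for this representation but is an assumption here; the GA itself and the Nowicki-Smutnicki local search are not shown.

```python
def makespan(chromosome, processing):
    """Decode an operation-based JSSP chromosome and return the
    makespan of the resulting semi-active schedule.
    chromosome: permutation with repetition of job indices; the k-th
    occurrence of job j is job j's k-th operation.
    processing[j]: list of (machine, time) pairs in operation order."""
    n_jobs = len(processing)
    op_idx = [0] * n_jobs        # next operation of each job
    job_ready = [0] * n_jobs     # time each job becomes free
    mach_ready = {}              # time each machine becomes free
    for j in chromosome:
        m, t = processing[j][op_idx[j]]
        start = max(job_ready[j], mach_ready.get(m, 0))
        finish = start + t
        job_ready[j] = finish
        mach_ready[m] = finish
        op_idx[j] += 1
    return max(job_ready)
```

For a 2-job, 2-machine toy instance, interleaving the jobs overlaps work on both machines and beats scheduling one job completely before the other.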
Abstract: This paper presents the experimental results of a
single cylinder Enfield engine using an electronically controlled fuel
injection system which was developed to carry out exhaustive tests
using neat CNG, and mixtures of hydrogen in compressed natural gas
(HCNG) at 0, 5, 10, 15, and 20% by energy. Experiments were performed at 2000 and 2400 rpm with wide-open throttle and a varying equivalence ratio. Hydrogen, which has a fast burning rate, enhances the flame propagation rate of compressed natural gas when added to it. The emissions of HC and CO decreased with increasing percentage of hydrogen, but NOx was found to increase. The results indicated a marked improvement in the brake thermal efficiency with increasing percentage of added hydrogen, and the improvement was clearly greater in the lean region than in the rich region. This study is expected to reduce vehicular emissions while increasing thermal efficiency, and thus to help curb further environmental degradation.
Abstract: Graph-based image segmentation techniques are considered among the most efficient segmentation techniques and are mainly used as time- and space-efficient methods for real-time applications. However, there is a need to improve the quality of the segmented images obtained from earlier graph-based methods. This paper proposes an improvement to the graph-based image segmentation methods already described in the literature. We contribute to the existing method by proposing the use of a weighted Euclidean distance to calculate the edge weight, which is the key element in building the graph. We also propose a slight modification of the segmentation method already described in the literature, which results in the selection of more prominent edges in the graph. The
experimental results show the improvement in the segmentation
quality as compared to the methods that already exist, with a slight
compromise in efficiency.
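The proposed weighted Euclidean edge weight might look like this. The per-channel weights and the RGB feature vectors are assumptions for illustration; the abstract does not give the actual feature set or weight values.

```python
import math

def edge_weight(p, q, weights):
    """Weighted Euclidean distance between two pixel feature vectors
    (e.g. RGB triples), used as the edge weight when building the
    segmentation graph. With all weights equal to 1 this reduces to
    the plain Euclidean distance of the baseline method."""
    return math.sqrt(sum(w * (a - b) ** 2
                         for w, a, b in zip(weights, p, q)))
```

Raising the weight of one channel makes differences in that channel count more toward the edge weight, which changes which edges the merging step treats as prominent.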
Abstract: Logistics outsourcing is a growing trend, and measuring its performance is a challenge. Measurement must be consistent with the objectives set for logistics outsourcing, yet we have found no objective-based performance measurement system. To cover this gap, we conducted a comprehensive review of the specialist literature, which led us to identify and define these objectives. The outcome is a list of the most relevant objectives and their descriptions. This will enable us to analyse in a future study whether the indicators used for measuring logistics outsourcing performance are consistent with the objectives pursued through the outsourcing. If this is not the case, a set of financial and operational indicators will be proposed for measuring performance in logistics outsourcing that takes the pursued goals into account.
Abstract: Wavelet transform or wavelet analysis is a recently
developed mathematical tool in applied mathematics. In numerical
analysis, wavelets also serve as a Galerkin basis to solve partial
differential equations. The Haar transform is the simplest and earliest example of an orthonormal wavelet transform. Owing to its popularity in wavelet analysis, there are several definitions and various generalizations of, and algorithms for, the Haar transform. The fast Haar transform (FHT) is one algorithm that reduces the tedious calculations involved in the Haar transform. In this paper, we present a modified fast and exact algorithm for the FHT, namely the Modified Fast Haar Transform (MFHT). The proposed procedure allows certain calculations in the decomposition process to be skipped without affecting the results.
Abstract: This paper presents the results obtained when either the ShiftRows stage, the MixColumns stage, or both are omitted from the well-known block cipher Advanced Encryption Standard (AES) and from its modified version, AES with a Key-Dependent S-box (AES-KDS), using the avalanche criterion and other tests, namely encryption quality, correlation coefficient, histogram analysis, and key sensitivity tests.
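The avalanche criterion itself is easy to sketch: flip one input bit and count how many output bits change, averaged over all input bit positions. SHA-256 stands in for the cipher below purely for illustration; it is not AES or AES-KDS.

```python
import hashlib

def bit_diff(a: bytes, b: bytes) -> int:
    """Number of differing bits between two equal-length byte strings."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def avalanche_fraction(encrypt, plaintext: bytes) -> float:
    """Average fraction of output bits that flip when a single input
    bit is flipped (the avalanche criterion). A strong cipher stays
    near 0.5. `encrypt` is any bytes->bytes function."""
    base = encrypt(plaintext)
    total_bits = len(base) * 8
    flips = 0
    trials = 0
    for i in range(len(plaintext) * 8):
        mutated = bytearray(plaintext)
        mutated[i // 8] ^= 1 << (i % 8)   # flip exactly one bit
        flips += bit_diff(base, encrypt(bytes(mutated)))
        trials += 1
    return flips / (trials * total_bits)
```

The identity function scores only 1/n (one flipped bit propagates to exactly one output bit), while a well-mixing primitive scores close to 0.5, which is the contrast the paper's tests quantify when diffusion stages are removed.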
Abstract: The objective of this study was to improve our understanding of vulnerability and environmental change, of its causes and intensity, its distribution, and the human-environment effect on the ecosystem in the Apodi Valley Region. This paper identifies, assesses, and classifies vulnerability and environmental change in the Apodi Valley region using a combined approach of landscape pattern and ecosystem sensitivity. Models were developed using five thematic layers (geology, geomorphology, soil, vegetation, and land use/cover) by means of a Geographical Information System (GIS) based on hydro-geophysical parameters. In spite of data problems and shortcomings, using ESRI's ArcGIS 9.3 program to classify, weight, and combine 15 separate land-cover classes into a single vulnerability indicator provides a reliable measure of differences (6 classes) among regions and communities that are exposed to similar ranges of hazards. Indeed, the ongoing and active development of vulnerability concepts and methods has already produced tools to help overcome common issues, such as acting in a context of high uncertainty, taking into account the dynamics and spatial scale of a social-ecological system, or gathering viewpoints from different sciences to combine human- and impact-based approaches. Based on this assessment, this paper proposes concrete perspectives and possibilities for benefiting from existing commonalities in the construction and application of assessment tools.
Abstract: We investigate planar quasi-septic non-analytic systems that have a center-focus equilibrium at the origin and whose angular speed is constant. The system can be changed into an analytic system by two transformations; with the help of the computer algebra system MATHEMATICA, the conditions for a uniform isochronous center are obtained.