Abstract: This paper describes the study of cryptographic hash functions, one of the most important classes of primitives used in modern cryptography. The main aim is the development of recent cryptanalysis techniques for hash functions, mainly from the SHA family. We present different approaches to defining security properties more formally and present basic attacks on hash functions. We recall the Merkle-Damgård construction and the security properties of iterated hash functions. Recently proposed attacks on MD5 and SHA motivate a new hash function design. It is designed not only to have higher security but also to be faster than SHA-256: the performance of the new hash function is at least 30% better than that of SHA-256 in software, and it is secure against all known cryptographic attacks on hash functions.
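As a concrete reminder of the iterated (Merkle-Damgård) construction recalled above, the following Python sketch pads a message and iterates a compression function over fixed-size blocks; the block size, initial value and the toy compression function (built from SHA-256 purely for illustration) are assumptions, not the new design proposed in the paper.

```python
import hashlib

BLOCK_SIZE = 64          # bytes per message block (assumed for illustration)
IV = b"\x00" * 32        # initial chaining value (toy choice)

def compress(chaining_value: bytes, block: bytes) -> bytes:
    """Toy compression function: simply SHA-256 of (chaining value || block)."""
    return hashlib.sha256(chaining_value + block).digest()

def md_pad(message: bytes) -> bytes:
    """Merkle-Damgard strengthening: append 0x80, zeros, then the bit length."""
    length_bits = (8 * len(message)).to_bytes(8, "big")
    padded = message + b"\x80"
    padded += b"\x00" * ((-len(padded) - 8) % BLOCK_SIZE)
    return padded + length_bits

def md_hash(message: bytes) -> bytes:
    """Iterate the compression function over the padded message blocks."""
    state = IV
    padded = md_pad(message)
    for i in range(0, len(padded), BLOCK_SIZE):
        state = compress(state, padded[i:i + BLOCK_SIZE])
    return state

print(md_hash(b"abc").hex())
```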
Abstract: The purpose of this study is to identify the critical success factors (CSFs) for the effective implementation of Six Sigma in non-formal service sectors.
Based on a survey of the literature, the critical success factors (CSFs) for Six Sigma have been identified and assessed for their importance in the non-formal service sector using the Delphi technique. The selected CSFs were put forth to a panel of experts to cluster them and prepare a cognitive map to establish their relationships.
All the critical success factors obtained from the review of the literature have been assessed for their importance with respect to their contribution to Six Sigma effectiveness in the non-formal service sector.
The study is limited to the non-formal service sectors involved in the organization of religious festivals only. However, a similar exercise can be conducted for a broader sample of other non-formal service sectors such as temple/ashram management, religious tours management, etc.
The research suggests an approach to identify CSFs of Six Sigma for the non-formal service sector. Not all CSFs of the formal service sector are applicable to non-formal services, hence the opinion of experts was sought to add or delete CSFs. In the first round of Delphi, the panel of experts suggested two new CSFs, “competitive benchmarking (F19)” and “residents’ involvement (F28)”, which were added for assessment in the next round of Delphi. One of the CSFs, “full-time Six Sigma personnel (F15)”, was omitted from the proposed clusters of CSFs for non-formal organizations, as it is practically impossible to deploy full-time trained Six Sigma recruits.
Abstract: Cryptographic algorithms play a crucial role in the
information society by providing protection from unauthorized
access to sensitive data. It is clear that information technology will
become increasingly pervasive; hence we can expect the emergence of ubiquitous or pervasive computing and ambient intelligence. These new environments and applications will present new security challenges, and there is no doubt that cryptographic algorithms and protocols will form a part of the solution. The efficiency of a public key cryptosystem is mainly measured in computational overhead, key size and bandwidth. In particular, the RSA algorithm is used in many applications for providing security. Although the security of RSA is beyond doubt, the evolution in computing power has caused a growth in the necessary key length. The fact that most chips on smart cards cannot process keys exceeding 1024 bits shows that there is a need for an alternative. NTRU is such an alternative: it is a collection of mathematical algorithms based on manipulating lists of very small integers and polynomials. This allows NTRU to achieve high speeds with the use of minimal computing power. NTRU (Nth degree Truncated Polynomial Ring Unit) is the first secure public key cryptosystem not based on the factorization or discrete logarithm problems, which means that an adversary with realistic computational resources and time should not be able to break the key. Multi-party communication and the requirement of optimal resource utilization have created a present-day demand for applications that need security enforcement techniques and can be enhanced with high-end computing. This has prompted us to develop high-performance NTRU schemes using approaches such as high-end computing hardware. Peer-to-peer (P2P) and enterprise grids have proven to be one of the approaches for developing high-end computing systems, and by utilizing them one can improve the performance of NTRU through parallel execution. In this paper we propose and develop an application for NTRU using the enterprise grid middleware Alchemi. An analysis and comparison of its performance for various text files is presented.
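The core NTRU operation described above, manipulating polynomials with very small integer coefficients, is multiplication in the truncated ring Z[x]/(x^N - 1) with coefficients reduced modulo q. A minimal sketch of that convolution product is shown below; the parameters N and q are illustrative only, not a secure NTRU parameter set, and key generation and encryption are omitted.

```python
import numpy as np

N, q = 11, 32  # illustrative ring degree and modulus, not a secure NTRU parameter set

def ring_multiply(f, g, modulus):
    """Convolution product in Z[x]/(x^N - 1), coefficients reduced mod `modulus`."""
    result = np.zeros(N, dtype=np.int64)
    for i in range(N):
        for j in range(N):
            result[(i + j) % N] += f[i] * g[j]   # wrap exponents: x^N == 1
    return result % modulus

# Two "small" polynomials with coefficients in {-1, 0, 1}, as NTRU uses.
f = np.array([1, -1, 0, 1, 0, 0, -1, 0, 1, 0, -1])
g = np.array([0, 1, 1, 0, -1, 0, 0, 1, 0, -1, 0])

print(ring_multiply(f, g, q))
```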
Abstract: This paper evaluates multilevel modulation for different techniques such as multilevel amplitude shift keying (M-ASK), bipolar M-ASK (M-ASK-Bipolar), differential phase shift keying, quaternary amplitude shift keying (QASK) and quaternary polarization ASK (QPol-ASK) at a total bit rate of 107 Gbps. The aim is to find a cost-effective very high speed transport solution. Numerical investigation was performed using Monte Carlo simulations. The obtained results indicate that some modulation formats can be operated at 100 Gbps in optical communication systems with low implementation effort and high spectral efficiency.
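To illustrate the Monte Carlo approach mentioned above, the sketch below estimates the symbol error rate of a baseband 4-level ASK constellation under additive Gaussian noise; the constellation levels, noise model and noise level are assumptions for illustration, not the 107 Gbps optical system simulated in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
levels = np.array([0.0, 1.0, 2.0, 3.0])   # illustrative 4-ASK amplitude levels
n_symbols = 200_000
noise_sigma = 0.25                          # assumed noise standard deviation

tx = rng.choice(levels, size=n_symbols)                 # random transmitted symbols
rx = tx + rng.normal(0.0, noise_sigma, size=n_symbols)  # additive Gaussian noise

# Decide each received sample by choosing the nearest constellation level.
decided = levels[np.argmin(np.abs(rx[:, None] - levels[None, :]), axis=1)]

ser = np.mean(decided != tx)
print(f"Estimated symbol error rate: {ser:.4e}")
```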
Abstract: Hemodialysis patients may suffer from unhealthy care behaviors or the effects of long-term dialysis treatment and may ultimately need to be hospitalized. If the hospitalization rate of a hemodialysis center is high, its quality of service is considered low. Therefore, how to decrease the hospitalization rate is a crucial problem in health care. In this study we combined temporal abstraction with data mining techniques to analyze dialysis patients' biochemical data and develop a decision support system. The mined temporal patterns help clinicians predict hospitalization of hemodialysis patients and suggest immediate treatments to avoid hospitalization.
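Temporal abstraction, as combined with data mining above, turns a series of timestamped biochemical measurements into labeled state intervals that can then be mined for patterns. The sketch below abstracts one analyte into Low/Normal/High intervals; the analyte, thresholds and readings are illustrative assumptions, not the clinical values used in the study.

```python
from itertools import groupby

# Illustrative monthly albumin values for one patient (g/dL) and assumed thresholds.
values = [3.2, 3.4, 3.9, 4.1, 4.0, 3.3, 3.1]
LOW, HIGH = 3.5, 5.0

def state(v):
    return "Low" if v < LOW else ("High" if v > HIGH else "Normal")

# Basic temporal abstraction: merge consecutive readings with the same state
# into (state, start_index, end_index) intervals.
labeled = [(i, state(v)) for i, v in enumerate(values)]
intervals = []
for st, group in groupby(labeled, key=lambda t: t[1]):
    idx = [i for i, _ in group]
    intervals.append((st, idx[0], idx[-1]))

print(intervals)   # e.g. [('Low', 0, 1), ('Normal', 2, 4), ('Low', 5, 6)]
```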
Abstract: The article deals with the relation between rainfall in selected months and subsequent weed infestation of spring barley. The field experiment was performed at the Mendel University agricultural enterprise in Žabčice, Czech Republic. Weed infestation was measured in spring barley stands in the years 2004 to 2012. Barley was grown under three tillage variants: conventional tillage technology (CT), minimum tillage technology (MT), and no tillage (NT). Precipitation was recorded at one-day intervals, and monthly precipitation was calculated from the measured values for the months of October through April. The technique of canonical correspondence analysis was applied for further statistical processing. In total, 41 different weed species were found in the course of the 9-year monitoring period. The results clearly show that precipitation in the selected months affects the incidence of most weed species, but acts differently in the monitored variants of tillage technologies.
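One preprocessing step described above, deriving monthly precipitation for October through April from one-day records, can be sketched with pandas as follows; the synthetic daily values are a stand-in for the Žabčice station records.

```python
import numpy as np
import pandas as pd

# Illustrative stand-in for the one-day precipitation records (mm per day).
rng = np.random.default_rng(0)
dates = pd.date_range("2004-10-01", "2005-04-30", freq="D")
daily = pd.Series(rng.gamma(shape=0.4, scale=4.0, size=dates.size), index=dates)

# Monthly precipitation totals for October through April, as used in the analysis.
monthly = daily.resample("MS").sum()
monthly = monthly[monthly.index.month.isin([10, 11, 12, 1, 2, 3, 4])]
print(monthly.round(1))
```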
Abstract: This paper presents the design features of a rescue robot named CEO Mission II. Its body is of the tracked-wheel type with double front flippers for climbing over collapsed structures and rough terrain. A 125 cm long, 5-joint mechanical arm installed on the robot body is deployed not only for surveillance from the top view but also for easier and faster access to victims to check their vital signs. Two cameras and the vital-sign sensors are set up at the tip of the multi-joint mechanical arm, and a third camera at the back of the robot is used for driving control. The hardware and software of the system that controls and monitors the rescue robot are explained. The control system is used for controlling the robot locomotion and the 5-joint mechanical arm, and for turning devices on and off. The monitoring system gathers information from 7 distance sensors, IR temperature sensors, 3 CCD cameras, a voice sensor, robot wheel encoders, yaw/pitch/roll angle sensors, a laser range finder and 8 spare A/D inputs. All sensor and control data are communicated with a remote control station via IEEE 802.11b Wi-Fi, while the audio and video data are compressed and sent via a separate IEEE 802.11g Wi-Fi transmitter to obtain a real-time response. At the remote control station, the robot locomotion and the mechanical arm are controlled by joystick, and a user-friendly GUI control program based on a click-and-drag method has been developed to easily control the movement of the arm. The robot traveling map is plotted from the wheel encoder and yaw/pitch data, and a 2D obstacle map is plotted from the laser range finder data. The concept and design of this robot can be adapted to suit many other applications. As the Best Technique awardee of the Thailand Rescue Robot Championship 2006, all testing results were satisfactory.
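The traveling map mentioned above is obtained by dead reckoning from the wheel encoder and yaw data. A minimal sketch of that computation is given below; the encoder resolution, wheel radius and sample readings are assumptions, not the CEO Mission II hardware values.

```python
import math

TICKS_PER_REV = 512       # assumed encoder resolution
WHEEL_RADIUS_M = 0.08     # assumed wheel radius

def dead_reckon(samples):
    """samples: list of (encoder_ticks_delta, yaw_rad); returns the (x, y) path."""
    x = y = 0.0
    path = [(x, y)]
    for ticks, yaw in samples:
        dist = 2 * math.pi * WHEEL_RADIUS_M * ticks / TICKS_PER_REV
        x += dist * math.cos(yaw)
        y += dist * math.sin(yaw)
        path.append((x, y))
    return path

# Three samples: straight ahead twice, then one wheel revolution heading +90 degrees.
print(dead_reckon([(512, 0.0), (512, 0.0), (512, math.pi / 2)]))
```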
Abstract: Worldwide, many electrical equipment insulation failures have been reported that were caused by switching operations, even though the equipment had previously passed all the standard tests and complied with all quality requirements. The problem is mostly associated with high-frequency overvoltages generated during the opening or closing of a switching device. The transients generated during switching operations in a Gas Insulated Substation (GIS) are associated with high frequency components on the order of a few tens of MHz.
The frequency spectrum of the VFTO generated in the 220/66 kV
Wadi-Hoff GIS is analyzed using Fast Fourier Transform technique.
The main frequency with high voltage amplitude due to the operation
of disconnector (DS5) is 5 to 10 MHz, with the highest amplitude at 9
MHz. The main frequency with high voltage amplitude due to the
operation of circuit breaker (CB5) is 1 to 25 MHz, with the highest
amplitude at 2 MHz.
Mitigating techniques damped the oscillating frequencies effectively. Using a cable termination reduced the frequency oscillations more effectively than an OHTL termination. Using a shunt capacitance eliminates the high frequency components, and ferrite rings reduce the high frequency components effectively, especially in the range 2 to 7 MHz. Using RC and RL filters also eliminates the high frequency components.
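The spectrum analysis described above rests on taking the Fast Fourier Transform of the simulated VFTO waveform and locating its dominant components. A minimal numpy sketch of that step is shown below; the synthetic damped-oscillation waveform and sampling rate are placeholders for the actual GIS simulation output.

```python
import numpy as np

fs = 200e6                               # assumed sampling rate, 200 MS/s
t = np.arange(0, 20e-6, 1 / fs)          # 20 microseconds of signal

# Placeholder VFTO-like waveform: two damped oscillations near 9 MHz and 2 MHz.
v = (np.exp(-t / 5e-6) * np.sin(2 * np.pi * 9e6 * t)
     + 0.4 * np.exp(-t / 8e-6) * np.sin(2 * np.pi * 2e6 * t))

spectrum = np.abs(np.fft.rfft(v))
freqs = np.fft.rfftfreq(len(v), d=1 / fs)

peak = freqs[np.argmax(spectrum)]
print(f"Dominant frequency component: {peak / 1e6:.1f} MHz")
```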
Abstract: Burnishing is a method of finishing and hardening
machined parts by plastic deformation of the surface. Experimental
work based on central composite second order rotatable design has
been carried out on a lathe machine to establish the effects of ball
burnishing parameters on the surface roughness of brass material.
Analysis of the results by the analysis of variance technique and the F-test shows that the parameters considered have significant effects on the surface roughness.
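A minimal sketch of the analysis-of-variance step is given below: a second-order response-surface model for surface roughness is fitted and F-tests are reported with statsmodels. The factor names, synthetic run data and model terms are assumptions, not the actual central composite design matrix.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Illustrative stand-in for the experimental runs: coded levels of burnishing
# speed, force and feed with a measured surface roughness Ra (micrometres).
rng = np.random.default_rng(0)
levels = np.array([-1.0, 0.0, 1.0])
runs = pd.DataFrame(
    [(s, f, d) for s in levels for f in levels for d in levels],
    columns=["speed", "force", "feed"],
)
runs["Ra"] = (1.2 - 0.3 * runs["force"] + 0.2 * runs["speed"] ** 2
              + rng.normal(0, 0.05, len(runs)))

# Second-order model in the spirit of a central composite rotatable design.
model = ols(
    "Ra ~ speed + force + feed + I(speed**2) + I(force**2) + I(feed**2)"
    " + speed:force + speed:feed + force:feed",
    data=runs,
).fit()

# ANOVA table with F statistics and p-values for each term.
print(sm.stats.anova_lm(model, typ=2))
```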
Abstract: The major goal in defining and examining game
scenarios is to find good strategies as solutions to the game. A
plausible solution is a recommendation to the players on how to play
the game, which is represented as strategies guided by the various
choices available to the players. These choices invariably compel the
players (decision makers) to execute an action following some
conscious tactics. In this paper, we propose a refinement-based heuristic as a machine learning technique for human-like decision making in playing the Ayo game. The results show that our machine learning technique is more adaptable and more responsive in decision making than human intelligence, and it has the advantage that the search is astutely conducted in a shallow-horizon game tree. Our simulation was tested against the Awale shareware and appealing results were obtained.
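The shallow-horizon game-tree search referred to above can be illustrated with a generic depth-limited minimax routine. The game interface below (legal_moves, apply, evaluate) and the toy pile game are hypothetical placeholders, not the Ayo rules or the refinement-based heuristic itself.

```python
def minimax(state, depth, maximizing, game):
    """Depth-limited minimax over a shallow horizon.

    `game` is any object exposing legal_moves(state), apply(state, move)
    and evaluate(state); these names are illustrative placeholders.
    """
    moves = game.legal_moves(state)
    if depth == 0 or not moves:
        return game.evaluate(state), None

    best_move = None
    best_value = float("-inf") if maximizing else float("inf")
    for move in moves:
        value, _ = minimax(game.apply(state, move), depth - 1, not maximizing, game)
        if (maximizing and value > best_value) or (not maximizing and value < best_value):
            best_value, best_move = value, move
    return best_value, best_move

class ToyPile:
    """Tiny stand-in game used only to exercise the search: remove 1 or 2 stones."""
    def legal_moves(self, state):
        return [m for m in (1, 2) if m <= state]
    def apply(self, state, move):
        return state - move
    def evaluate(self, state):
        # Crude placeholder heuristic (fewer remaining stones scores higher);
        # a real Ayo evaluation would score board positions instead.
        return -state

best_value, best_move = minimax(5, depth=3, maximizing=True, game=ToyPile())
print(best_value, best_move)
```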
Abstract: This paper presents the development of analysis tools
for the Home Agriculture project. The tools are required for monitoring the condition of the greenhouse and involve two components: measurement hardware and a data analysis engine. The measurement hardware is used to measure environmental parameters such as temperature, humidity, air quality and dust, while the analysis tool is used to analyse and interpret the integrated data against the condition of weather, quality of health, irradiance, quality of soil, etc. The current development of the tools is completed for an off-line data recording technique. The data is saved on an MMC card and transferred via ZigBee to the Environment Data Manager (EDM) for data analysis. The EDM converts the raw data and plots three combination graphs. It has been applied in monitoring three months of measured irradiance, temperature and humidity data for the greenhouse.
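The EDM step described above, converting the logged raw data and plotting combination graphs of irradiance, temperature and humidity, can be sketched with pandas and matplotlib; the synthetic log below stands in for the data recorded on the MMC card.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Illustrative stand-in for the off-line log transferred from the MMC card.
timestamps = pd.date_range("2024-01-01", periods=24 * 12, freq="5min")
hours = timestamps.hour + timestamps.minute / 60
log = pd.DataFrame({
    "timestamp": timestamps,
    "irradiance": np.clip(800 * np.sin(np.pi * (hours - 6) / 12), 0, None),
    "temperature": 22 + 6 * np.sin(np.pi * (hours - 8) / 12),
    "humidity": 70 - 15 * np.sin(np.pi * (hours - 8) / 12),
})

# Three combination graphs of irradiance, temperature and humidity against time.
fig, axes = plt.subplots(3, 1, sharex=True, figsize=(8, 6))
for ax, column, unit in zip(axes, ["irradiance", "temperature", "humidity"],
                            ["W/m^2", "deg C", "%RH"]):
    ax.plot(log["timestamp"], log[column])
    ax.set_ylabel(f"{column} ({unit})")
axes[-1].set_xlabel("time")
fig.tight_layout()
plt.show()
```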
Abstract: In this paper, the robust exponential stability problem of discrete-time uncertain stochastic neural networks with time-varying delays is investigated. By introducing a new augmented Lyapunov function, some delay-dependent stability results are obtained in terms of the linear matrix inequality (LMI) technique. Compared with some existing results in the literature, the conservatism of the new criteria is reduced notably. Three numerical examples are provided to demonstrate the reduced conservatism and effectiveness of the proposed method.
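The LMI technique referenced above can be illustrated with a much simpler, delay-free example: checking the discrete-time Lyapunov inequality A^T P A - P < 0 as a semidefinite feasibility problem in CVXPY. The system matrix is arbitrary, and the augmented Lyapunov function and delay terms of the paper are not reproduced here.

```python
import cvxpy as cp
import numpy as np

# Arbitrary Schur-stable discrete-time system matrix (illustrative only).
A = np.array([[0.5, 0.1],
              [0.0, 0.3]])
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [
    P >> eps * np.eye(n),                    # P positive definite
    A.T @ P @ A - P << -eps * np.eye(n),     # discrete-time Lyapunov LMI
]
# Requires an SDP-capable solver such as SCS (installed with CVXPY by default).
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()

print("LMI feasible:", prob.status == cp.OPTIMAL)
```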
Abstract: The wavelet transform has been extensively used in machine fault diagnosis and prognosis owing to its strength in dealing with non-stationary signals. The existing wavelet-transform-based schemes for fault diagnosis employ wavelet decomposition of the entire vibration frequency range, which not only involves a huge computational overhead in extracting the features but also increases the dimensionality of the feature vector. This increase in dimensionality has the tendency to 'over-fit' the training data and can mislead the fault diagnostic model. In this paper, a novel technique, the envelope wavelet packet transform (EWPT), is proposed in which features are extracted based on the wavelet packet transform of the filtered envelope signal rather than the overall vibration signal. It not
only reduces the computational overhead in terms of reduced number
of wavelet decomposition levels and features but also improves the
fault detection accuracy. Analytical expressions are provided for the
optimal frequency resolution and decomposition level selection in
EWPT. Experimental results with both actual and simulated machine
fault data demonstrate significant gain in fault detection ability by
EWPT at reduced complexity compared to existing techniques.
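A minimal sketch of the envelope-then-wavelet-packet idea is shown below: the analytic envelope of a band-pass-filtered vibration signal is taken with scipy, and wavelet packet node energies of the envelope serve as features via PyWavelets. The synthetic signal, filter band, wavelet and decomposition level are assumptions, not the analytically selected EWPT parameters of the paper.

```python
import numpy as np
import pywt
from scipy.signal import butter, filtfilt, hilbert

fs = 12_000                                   # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)

# Synthetic fault-like signal: 3 kHz resonance amplitude-modulated at 120 Hz.
signal = (1 + 0.8 * np.sin(2 * np.pi * 120 * t)) * np.sin(2 * np.pi * 3000 * t)
signal += 0.2 * np.random.default_rng(0).normal(size=t.size)

# 1) Band-pass around the assumed resonance, then take the analytic envelope.
b, a = butter(4, [2500, 3500], btype="bandpass", fs=fs)
envelope = np.abs(hilbert(filtfilt(b, a, signal)))

# 2) Wavelet packet transform of the envelope; node energies are the features.
wp = pywt.WaveletPacket(data=envelope, wavelet="db4", mode="symmetric", maxlevel=3)
features = [np.sum(node.data ** 2) for node in wp.get_level(3, order="freq")]
print(features)
```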
Abstract: A flight management system (FMS) is a specialized
computer system that automates a wide variety of in-flight tasks,
reducing the workload on the flight crew to the point that modern
aircraft no longer carry flight engineers or navigators. The primary
function of FMS is to perform the in-flight management of the flight
plan using various sensors (such as GPS and INS often backed up by
radio navigation) to determine the aircraft's position. From the cockpit, the FMS is normally controlled through a Control Display Unit (CDU), which incorporates a small screen and keyboard or touch screen. This paper investigates the performance of GPS/INS integration techniques in which the data fusion process is done using Kalman filtering. This includes the importance of sensor calibration as well as the alignment of the strapdown inertial navigation system. The limitations of inertial navigation systems are investigated in order to understand why an INS is sometimes integrated with other navigation aids rather than operating in stand-alone mode. Finally, both the loosely coupled and tightly coupled
configurations are analyzed for several types of situations and
operational conditions.
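As a minimal illustration of the Kalman-filter fusion discussed above, the one-dimensional sketch below predicts position by integrating an INS-style acceleration and corrects it with noisy GPS position fixes. The noise levels, time step and trajectory are assumptions; a real loosely or tightly coupled integration estimates full position, velocity and attitude error states.

```python
import numpy as np

dt = 0.1                                  # assumed time step, s
F = np.array([[1, dt], [0, 1]])           # state transition for [position, velocity]
B = np.array([[0.5 * dt**2], [dt]])       # acceleration input matrix
H = np.array([[1.0, 0.0]])                # GPS measures position only
Q = 1e-3 * np.eye(2)                      # assumed process noise
R = np.array([[4.0]])                     # assumed GPS variance (2 m std dev)

rng = np.random.default_rng(1)
x = np.zeros((2, 1))                      # estimated [position; velocity]
P = np.eye(2)
true_pos, true_vel = 0.0, 0.0

for k in range(100):
    accel = 0.5                           # true (and measured) acceleration, m/s^2
    true_pos += true_vel * dt + 0.5 * accel * dt**2
    true_vel += accel * dt

    # Predict with the INS mechanization (here: simple acceleration integration).
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

    # Correct with a noisy GPS position fix.
    z = np.array([[true_pos + rng.normal(0, 2.0)]])
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

print(f"true position {true_pos:.2f} m, fused estimate {x[0, 0]:.2f} m")
```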
Abstract: The data is available in abundance in any business
organization. It includes the records for finance, maintenance,
inventory, progress reports, etc. As time progresses, the data keeps accumulating, and the challenge is to extract information from this data bank. Knowledge discovery from these large and complex databases is the key problem of this era. Data mining and machine learning techniques are needed which can scale to the size of the problems and can be customized to the business application. To develop accurate and relevant information for a particular problem, business analysts need to develop multidimensional models that give reliable information so that they can take the right decisions. If the multidimensional model does not possess advanced features, the required accuracy cannot be expected. The present work involves the development of a multidimensional data model incorporating advanced features. The computation is based on data precision and includes a slowly changing time dimension. The final results are displayed in graphical
form.
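A slowly changing time dimension of the kind mentioned above is commonly handled with the Type-2 pattern, in which a changed attribute closes the old dimension row and opens a new one with validity dates. The pandas sketch below shows that pattern; the table layout and column names are assumptions, not the authors' exact model.

```python
import pandas as pd

# Existing dimension table: one current row per customer (Type-2 layout assumed).
dim = pd.DataFrame([
    {"customer_id": 1, "region": "North", "valid_from": "2023-01-01",
     "valid_to": "9999-12-31", "is_current": True},
])

def apply_scd2(dim, customer_id, new_region, change_date):
    """Close the current row and append a new row when an attribute changes."""
    mask = (dim["customer_id"] == customer_id) & dim["is_current"]
    if dim.loc[mask, "region"].iloc[0] != new_region:
        dim.loc[mask, ["valid_to", "is_current"]] = [change_date, False]
        dim = pd.concat([dim, pd.DataFrame([{
            "customer_id": customer_id, "region": new_region,
            "valid_from": change_date, "valid_to": "9999-12-31",
            "is_current": True,
        }])], ignore_index=True)
    return dim

dim = apply_scd2(dim, customer_id=1, new_region="South", change_date="2024-06-01")
print(dim)
```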
Abstract: An interactive push VOD system is a new kind of system that incorporates push technology and interactive techniques. It can push movies to users at high speed during off-peak hours for optimal network usage, so as to save bandwidth. This paper presents an effective software-based solution for processing mass downstream data at the terminals of an interactive push VOD system, where the service can download a movie according to a viewer's selection. The downstream data is divided into two categories: (1) carousel data delivered according to the DSM-CC protocol; (2) IP data delivered according to the Euro-DOCSIS protocol. In order to accelerate the download speed and reduce the data loss rate at terminals, the software strategy introduces caching, multi-threading and resuming mechanisms. The experiments demonstrate the advantages of the software-based solution.
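A generic sketch of the caching, multi-threading and resuming mechanisms mentioned above is given below using only the Python standard library: each movie is modeled as a stream of fixed-size segments fetched by worker threads, and the set of completed segments is persisted so a download can resume. This is an illustrative pattern, not the DSM-CC or Euro-DOCSIS receiving code of the actual terminal.

```python
import json
import queue
import threading
from pathlib import Path

SEGMENT_COUNT = 20
STATE_FILE = Path("resume_state.json")     # remembers which segments are cached

def fetch_segment(index: int) -> bytes:
    """Placeholder for the actual downstream receive of one segment."""
    return b"x" * 1024

def worker(tasks: queue.Queue, done: set, lock: threading.Lock):
    while True:
        try:
            index = tasks.get_nowait()
        except queue.Empty:
            return
        data = fetch_segment(index)
        Path(f"cache_segment_{index}.bin").write_bytes(data)   # local cache
        with lock:
            done.add(index)
            STATE_FILE.write_text(json.dumps(sorted(done)))    # resume point

done = set(json.loads(STATE_FILE.read_text())) if STATE_FILE.exists() else set()
tasks = queue.Queue()
for i in range(SEGMENT_COUNT):
    if i not in done:                       # resuming: skip already cached segments
        tasks.put(i)

lock = threading.Lock()
threads = [threading.Thread(target=worker, args=(tasks, done, lock)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(f"{len(done)}/{SEGMENT_COUNT} segments cached")
```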
Abstract: The use of a Bayesian Hierarchical Model (BHM) to interpret breath measurements obtained during a 13C Octanoic Breath Test (13COBT) is demonstrated. The statistical analysis was implemented using WinBUGS, a commercially available computer package for Bayesian inference. A hierarchical setting was adopted, where poorly defined parameters associated with delayed gastric emptying (GE) were able to "borrow" strength from global distributions. This proved to be a sufficient tool to correct the model failures and data inconsistencies apparent in conventional analyses employing a non-linear least squares (NLS) technique. Direct comparison of two parameters describing gastric emptying (t_lag, the lag phase, and t_1/2, the half-emptying time) revealed a strong correlation between the two methods. Despite our large dataset (n = 164), Bayesian modeling was fast and provided a successful fit for all subjects. On the contrary, NLS failed to return acceptable estimates in cases where GE was delayed.
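For contrast with the hierarchical model, the conventional NLS analysis mentioned above is commonly based on fitting the breath-excretion curve y(t) = a t^b e^(-ct) for each subject and deriving the emptying parameters from the fitted coefficients (t_lag = b/c at the peak excretion rate, t_1/2 from the cumulative curve). The scipy sketch below shows such a single-subject fit; the functional form, parameter relations and synthetic data are stated as common conventions and assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import gammaincinv

def excretion_rate(t, a, b, c):
    """Conventional breath-test curve y(t) = a * t**b * exp(-c*t)."""
    return a * t**b * np.exp(-c * t)

# Synthetic breath samples for one subject (illustrative data, % dose/h).
t_obs = np.array([0.25, 0.5, 0.75, 1.0, 1.5, 2.0, 2.5, 3.0, 4.0, 5.0, 6.0])
y_obs = excretion_rate(t_obs, a=20.0, b=1.8, c=1.1)
y_obs += np.random.default_rng(2).normal(0, 0.3, size=t_obs.size)

(a, b, c), _ = curve_fit(excretion_rate, t_obs, y_obs, p0=(10.0, 1.0, 1.0))

t_lag = b / c                              # time of peak excretion rate
t_half = gammaincinv(b + 1, 0.5) / c       # half of the cumulative excretion
print(f"t_lag = {t_lag:.2f} h, t_1/2 = {t_half:.2f} h")
```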
Abstract: Panoramic view generation has always offered
novel and distinct challenges in the field of image processing.
Panoramic view generation is essentially the construction of a bigger mosaic image of the desired view from a set of partial images. The paper presents a solution to one of the problems of image seascape formation, where some of the partial images are color and others are grayscale. The simplest solution would be to convert all image parts into grayscale images and fuse them to get a grayscale panorama. But in a multihued world, obtaining the colored seascape will always be preferred. This can be achieved by picking colors from the color parts and injecting them into the grayscale parts of the seascape. So first the grayscale image parts should be colored with the help of the color image parts, and then these parts should be fused to construct the seascape image.
The problem of coloring grayscale images has no exact solution. In the proposed technique of panoramic view generation, the job of transferring color traits from a reference color image to a grayscale image is done by a palette-based method. In this technique, the color palette is prepared using pixel windows of a chosen size taken from the color image parts. The grayscale image part is then divided into pixel windows of the same size. For every window of the grayscale image part the palette is searched and equivalent color values are found, which are used to color the grayscale window. For palette preparation we have used the RGB color space and Kekre's LUV color space; Kekre's LUV color space gives better coloring quality. The search time through the color palette is improved over exhaustive search by using Kekre's fast search technique.
After coloring the grayscale image pieces, the next job is fusion of all these pieces to obtain the panoramic view. For similarity estimation between partial images the correlation coefficient is used.
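A compact sketch of the palette idea described above is given below: luminance windows from the color part form the palette, each grayscale window is matched to its nearest palette entry by exhaustive search (standing in for Kekre's fast search), and the matched window's color is copied; the correlation coefficient used for similarity between partial images is also shown. The window size, RGB-only palette and random test arrays are illustrative assumptions.

```python
import numpy as np

WIN = 2  # assumed window size

def windows(img):
    """Yield top-left coordinates of non-overlapping WIN x WIN windows."""
    h, w = img.shape[:2]
    for r in range(0, h - WIN + 1, WIN):
        for c in range(0, w - WIN + 1, WIN):
            yield r, c

def build_palette(color_img):
    """Palette entries: (luminance window, corresponding color window)."""
    lum = color_img.mean(axis=2)
    return [(lum[r:r+WIN, c:c+WIN].ravel(), color_img[r:r+WIN, c:c+WIN].copy())
            for r, c in windows(lum)]

def colorize(gray_img, palette):
    """Exhaustive nearest-window search (in place of Kekre's fast search)."""
    out = np.zeros(gray_img.shape + (3,))
    for r, c in windows(gray_img):
        g = gray_img[r:r+WIN, c:c+WIN].ravel()
        best = min(palette, key=lambda entry: np.sum((entry[0] - g) ** 2))
        out[r:r+WIN, c:c+WIN] = best[1]
    return out

def overlap_similarity(a, b):
    """Correlation coefficient between two overlapping strips, as used for fusion."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

rng = np.random.default_rng(3)
color_part = rng.random((8, 8, 3))          # stand-in for a color partial image
gray_part = rng.random((8, 8))              # stand-in for a grayscale partial image
colored = colorize(gray_part, build_palette(color_part))
print(colored.shape, overlap_similarity(gray_part, colored.mean(axis=2)))
```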
Abstract: Leo Breiman's Random Forests (RF) is a recent development in tree-based classifiers and has quickly proven to be one of the most important algorithms in the machine learning literature. It has shown robust and improved classification results on standard data sets. Ensemble learning algorithms such as AdaBoost and Bagging have been in active research and have shown improvements in classification results for several benchmark data sets, mainly with decision trees as their base classifiers. In this paper we apply these meta-learning techniques to random forests. We study the behavior of ensembles of random forests on standard data sets from the UCI repository, compare the original random forest algorithm with its ensemble counterparts, and discuss the results.
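A minimal sketch of the experiments described above, wrapping a random forest in Bagging and AdaBoost meta-learners and comparing them with the plain random forest on a UCI-derived data set bundled with scikit-learn, is shown below; the particular data set, hyperparameters and cross-validation setup are assumptions.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)   # UCI-derived data set shipped with sklearn
base_rf = RandomForestClassifier(n_estimators=50, random_state=0)

# `estimator=` is the scikit-learn >= 1.2 parameter name for the base classifier.
models = {
    "random forest": base_rf,
    "bagged random forests": BaggingClassifier(estimator=base_rf,
                                                n_estimators=10, random_state=0),
    "boosted random forests": AdaBoostClassifier(estimator=base_rf,
                                                 n_estimators=10, random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```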
Abstract: In this paper, local grid refinement is addressed by using a nested grid technique. A Cartesian grid numerical method is
developed for simulating unsteady, viscous, incompressible flows
with complex immersed boundaries. A finite volume method is used in
conjunction with a two-step fractional-step procedure. The key aspects that need to be considered in developing such a nested grid solver are the imposition of interface conditions on the inter-block boundary and the accurate discretization of the governing equations in cells that have the inter-block boundary as a control surface. A new interpolation procedure is
presented which allows systematic development of a spatial
discretization scheme that preserves the spatial accuracy of the
underlying solver. The present nested grid method has been tested by
two numerical examples to examine its performance in the two
dimensional problems. The numerical examples include flow past a
circular cylinder symmetrically installed in a channel and flow past
two circular cylinders with different diameters. From the numerical
experiments, the ability of the solver to simulate flows with
complicated immersed boundaries is demonstrated and the nested grid
approach can efficiently speed up the numerical solutions.
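As a generic illustration of imposing interface conditions on a nested block, the sketch below bilinearly interpolates coarse-grid cell-center values onto fine-grid ghost-cell centers along one inter-block interface; it shows only the standard second-order interpolation idea, not the new interpolation procedure developed in the paper.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Coarse background grid: cell-center coordinates and a smooth sample field.
xc = np.linspace(0.05, 0.95, 10)          # coarse cell centers, spacing 0.1
yc = np.linspace(0.05, 0.95, 10)
Xc, Yc = np.meshgrid(xc, yc, indexing="ij")
phi_coarse = np.sin(np.pi * Xc) * np.cos(np.pi * Yc)

interp = RegularGridInterpolator((xc, yc), phi_coarse, method="linear")

# Fine nested block ghost-cell centers along its left interface at x = 0.4.
y_ghost = np.linspace(0.325, 0.675, 8)    # assumed fine-grid cell-center positions
ghost_points = np.column_stack([np.full_like(y_ghost, 0.4), y_ghost])

phi_ghost = interp(ghost_points)          # interface values supplied to the fine block
print(phi_ghost)
```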