Abstract: This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear phenomena underlying the data and to partly eliminate the burden of completely specifying the structure of the model. Two different types of neural networks were applied to the Pulping of Sugar Maple problem. Three-layer feed-forward neural networks trained with Preconditioned Conjugate Gradient (PCG) methods were used in this investigation. Preconditioning improves convergence by lowering the condition number and increasing the clustering of the eigenvalues: instead of Ax = b, one solves the modified system M^-1Ax = M^-1b, where M is a positive-definite preconditioner that is closely related to A. We focused mainly on PCG-based training methods originating from optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves Update (PCGF), Preconditioned Conjugate Gradient with Polak-Ribiere Update (PCGP), and Preconditioned Conjugate Gradient with Powell-Beale Restarts (PCGB). In the simulations, the PCG methods proved robust against phenomena such as oscillations due to large step sizes.
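The preconditioning idea can be illustrated with a minimal sketch: a Jacobi (diagonal) preconditioned conjugate gradient solver for a symmetric positive-definite system. This is not the authors' network-training code, and only a Fletcher-Reeves-style beta update is shown.

```python
# Preconditioned conjugate gradient for a symmetric positive-definite
# system A x = b, with a Jacobi (diagonal) preconditioner M = diag(A).
# Illustrative sketch only; network training applies analogous update
# rules (Fletcher-Reeves, Polak-Ribiere) to the error gradient.

def pcg(A, b, tol=1e-10, max_iter=100):
    n = len(b)
    x = [0.0] * n
    r = b[:]                                  # residual r = b - A x (x = 0)
    M_inv = [1.0 / A[i][i] for i in range(n)]
    z = [M_inv[i] * r[i] for i in range(n)]   # preconditioned residual
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [M_inv[i] * r[i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        beta = rz_new / rz                    # Fletcher-Reeves-style update
        rz = rz_new
        p = [z[i] + beta * p[i] for i in range(n)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = pcg(A, b)        # exact solution of this 2x2 system is [1/11, 7/11]
```

Lowering the condition number of the iteration matrix is what makes the residual shrink in fewer steps, which is the effect the abstract attributes to preconditioning.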
Abstract: With the advent of digital cinema and digital
broadcasting, copyright protection of video data has been one of the
most important issues.
We present a novel method of watermarking for video image data
based on hardware and discrete wavelet transform techniques, and
call it “traceable watermarking” because the watermarked data are
constructed before the transmission process and traced after they have
been received by an authorized user.
In our method, we embed the watermark into the lowest part of each
image frame in the decoded video by using a hardware LSI.
Digital cinema is an important application for traceable
watermarking, since a digital cinema system makes use of watermarking
technology during content encoding, encryption, transmission,
decoding, and all the intermediate processes in the system.
The watermark is embedded into randomly selected
movie frames using hash functions.
The embedded watermark information can be extracted from the
decoded video data without any need to access the original movie
data. Our experimental results show that the proposed traceable
watermarking method for digital cinema systems outperforms
conventional watermarking techniques in terms of robustness, image
quality, speed, and simplicity of structure.
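The embedding scheme can be given a minimal blind-watermarking flavour in software: the sketch below writes watermark bits into the least significant bit of pseudo-randomly selected pixels (selection keyed by a hash, echoing the hash-based frame selection above) and reads them back without the original. It is an illustrative assumption, not the paper's hardware LSI implementation.

```python
# Blind LSB watermarking sketch: embed one watermark bit into the
# least significant bit of each key-selected pixel, and extract the
# bits again with only the key (no original frame needed).

import hashlib

def select_positions(key, n_pixels, n_bits):
    """Pseudo-randomly pick distinct pixel positions from a secret key,
    mirroring the hash-based selection described in the abstract."""
    positions, counter = [], 0
    while len(positions) < n_bits:
        digest = hashlib.sha256(f"{key}:{counter}".encode()).digest()
        pos = int.from_bytes(digest[:4], "big") % n_pixels
        if pos not in positions:
            positions.append(pos)
        counter += 1
    return positions

def embed(pixels, bits, key):
    out = pixels[:]
    for pos, bit in zip(select_positions(key, len(pixels), len(bits)), bits):
        out[pos] = (out[pos] & ~1) | bit      # overwrite the LSB
    return out

def extract(pixels, n_bits, key):
    return [pixels[pos] & 1
            for pos in select_positions(key, len(pixels), n_bits)]

frame = list(range(64))                       # stand-in 8x8 grayscale frame
marked = embed(frame, [1, 0, 1, 1], key="movie-42")
recovered = extract(marked, 4, key="movie-42")   # [1, 0, 1, 1]
```

Because extraction needs only the key, the scheme is blind, which is what allows tracing after reception without access to the original movie data.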
Abstract: Semantic Web Technologies enable machines to
interpret data published in a machine-interpretable form on the web.
At the present time, only human beings are able to understand the
product information published online. The emerging semantic Web
technologies have the potential to deeply influence the further
development of the Internet Economy. In this paper we propose a
scenario-based research approach to predict the effects of these new
technologies on electronic markets and on the business models of
traders, intermediaries, and customers. Over 300 million searches are
conducted every day on the Internet by people trying to find what
they need. A majority of these searches are in the domain of
consumer e-commerce, where a web user is looking for something to
buy. This represents a huge cost in terms of person-hours and an
enormous drain on resources. Agent-enabled semantic search will
have a dramatic impact on the precision of these searches. It will
reduce and possibly eliminate information asymmetry where a better
informed buyer gets the best value. By impacting this key
determinant of market prices, the semantic web will foster the
evolution of different business and economic models. We submit that there is a
need for developing these futuristic models based on our current
understanding of e-commerce models and nascent semantic web
technologies. We believe these business models will encourage
mainstream web developers and businesses to join the “semantic web
revolution."
Abstract: Empirical studies on High Performance Work Systems (HPWSs) and their impact on firm performance are remarkably scarce in developing countries. This paper reviews the literature on HPWS practices in different work settings in Western and Asian countries. The review of the empirical research leads to the conclusion that country differences influence Human Resource Management (HRM) practices. It is anticipated that there are similarities and differences in the extent of implementation of HPWS practices by Malaysian manufacturing firms due to organizational contextual factors, and that HPWSs have a significant impact on firm performance amongst both MNCs and local firms.
Abstract: In this paper, a subtractive-clustering-based fuzzy inference system approach is used for early detection of faults in function-oriented software systems. The approach has been tested with real defect datasets of the NASA software projects PC1 and CM1. Both the code-based model and the joined model (a combination of the requirement-based and code-based metrics) of the datasets are used for training and testing of the proposed approach. The performance of the models is recorded in terms of Accuracy, MAE, and RMSE values, and is better in the case of the joined model. As evidenced by the results obtained, it can be concluded that clustering and fuzzy logic together provide a simple yet powerful means of modeling the early detection of faults in function-oriented software systems.
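The first stage of such an approach, subtractive clustering, can be sketched as follows (Chiu's potential-based formulation; the radius and threshold parameters here are illustrative assumptions, not the values used on PC1/CM1): each point's "potential" counts its nearby neighbours, and the points with the highest residual potential become cluster centres, i.e. fuzzy rule prototypes.

```python
# Subtractive clustering sketch: pick cluster centres as the points of
# maximum residual potential, subtracting each chosen centre's
# influence before picking the next. Parameters are illustrative.

import math

def subtractive_clustering(points, ra=0.5, eps=0.15):
    alpha = 4.0 / ra ** 2
    beta = 4.0 / (1.5 * ra) ** 2              # rb = 1.5 * ra, per Chiu
    def d2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    # Initial potential of each point: density of its neighbourhood.
    pot = [sum(math.exp(-alpha * d2(p, q)) for q in points) for p in points]
    centers = []
    first_peak = max(pot)
    while True:
        i = max(range(len(points)), key=lambda k: pot[k])
        if pot[i] < eps * first_peak:         # no strong peak left
            break
        c = points[i]
        centers.append(c)
        # Suppress potential near the new centre.
        pot = [pot[k] - pot[i] * math.exp(-beta * d2(points[k], c))
               for k in range(len(points))]
    return centers

# Two tight groups of 2-D "module metric" vectors -> two centres.
data = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
        (1.0, 1.0), (1.1, 1.0), (1.0, 1.1)]
centers = subtractive_clustering(data)
```

Each centre would then seed one fuzzy rule of the inference system, with membership functions placed around it.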
Abstract: In this paper a new approach is proposed for the
adaptation of the simulated annealing search in the field of the
Multi-Objective Optimization (MOO). This new approach is called
Multi-Case Multi-Objective Simulated Annealing (MC-MOSA). It
uses some basics of a well-known recent multi-objective simulated
annealing algorithm proposed by Ulungu et al., referred to in the
literature as U-MOSA. However, some drawbacks of this algorithm
have been identified and are replaced with alternatives, especially in
the acceptance decision criterion. MC-MOSA has shown better
performance than U-MOSA in the numerical experiments. This
performance is further improved by several subvariants of
MC-MOSA, such as fast-annealing MC-MOSA, re-annealing MC-MOSA,
and two-stage annealing MC-MOSA.
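An acceptance decision criterion of the kind discussed can be sketched as follows. This is an illustrative dominance-aware rule under a weighted-sum energy, not the exact MC-MOSA criterion.

```python
# Dominance-aware simulated-annealing acceptance sketch (illustrative):
# a dominating neighbour is always accepted; otherwise acceptance is
# probabilistic with probability exp(-delta / T), as in classical
# annealing, where delta is a weighted-sum energy difference.

import math, random

def dominates(f_a, f_b):
    """True if objective vector f_a Pareto-dominates f_b (minimization)."""
    return all(a <= b for a, b in zip(f_a, f_b)) and any(
        a < b for a, b in zip(f_a, f_b))

def accept(f_new, f_cur, weights, T, rng=random):
    """Decide whether to move to the neighbouring solution."""
    if dominates(f_new, f_cur):
        return True
    delta = sum(w * (n - c) for w, n, c in zip(weights, f_new, f_cur))
    return delta <= 0 or rng.random() < math.exp(-delta / T)

# A dominating move is always taken, regardless of temperature.
always = accept((1.0, 1.0), (2.0, 2.0), weights=(0.5, 0.5), T=1e-9)
```

Variants such as fast annealing or re-annealing change how the temperature T evolves, not the acceptance rule itself.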
Abstract: The γ-turns play important roles in protein folding and
molecular recognition. The prediction and analysis of γ-turn types are
important for both protein structure predictions and better
understanding the characteristics of different γ-turn types. This study
proposed a physicochemical property-based decision tree (PPDT)
method to interpretably predict γ-turn types. In addition to the good
prediction performance of PPDT, three simple, human-interpretable
IF-THEN rules are extracted from the decision tree
constructed by PPDT. The identified informative physicochemical
properties and concise rules provide a simple way for discriminating
and understanding γ-turn types.
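The way a shallow property-based decision tree reduces to IF-THEN rules can be sketched as follows; the property names, thresholds, and labels here are hypothetical placeholders, not the three rules actually extracted by PPDT.

```python
# Sketch: a depth-2 decision tree over physicochemical properties is
# equivalent to a short ordered list of IF-THEN rules. All names and
# thresholds below are hypothetical, for illustration only.

def classify(props):
    """props: dict of physicochemical property values for a residue
    window; returns a hypothetical turn-type label."""
    if props["hydrophobicity"] <= 0.30:
        return "classic"                      # rule 1
    if props["flexibility"] > 0.55:
        return "inverse"                      # rule 2
    return "non-turn"                         # default rule

label_a = classify({"hydrophobicity": 0.1, "flexibility": 0.9})
label_b = classify({"hydrophobicity": 0.6, "flexibility": 0.8})
```

This equivalence between tree paths and rules is what makes such predictors interpretable: each leaf corresponds to one human-readable condition.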
Abstract: A model is presented to find the optimal design of the
mixed renewable warranty policy for non-repairable Weibull life
products. The optimal design considers the conflict of interests
between the customer and the manufacturer: the customer prefers
longer full-rebate and total warranty coverage periods, while the
manufacturer prefers lower warranty cost and lower risk.
The design factors are the full-rebate and total warranty
coverage periods. Results showed that the mixed policy is better than
the full-rebate policy in terms of risk and total warranty coverage
period in all three bathtub regions. In addition, the linear
policy is better than the mixed policy in the infant-mortality and
constant-failure regions, while the mixed policy is better than the
linear policy in the ageing region of the model. Furthermore, the
results showed that using a burn-in period for infant-mortality
products reduces warranty cost and risk.
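A hedged sketch of the expected per-unit cost of a mixed (full rebate followed by pro-rata rebate) warranty for a Weibull-life product is given below. The cost expression, the linear pro-rata form, and all parameter values are illustrative assumptions, not the paper's exact model.

```python
# Expected mixed-warranty cost sketch for a non-repairable product with
# Weibull lifetime: full refund if failure occurs before w1, linearly
# decreasing rebate between w1 and w2. Illustrative assumption only.

import math

def weibull_cdf(t, shape, scale):
    return 1.0 - math.exp(-((t / scale) ** shape))

def mixed_warranty_cost(price, w1, w2, shape, scale, steps=10000):
    """price * [F(w1) + integral_{w1}^{w2} (w2 - t)/(w2 - w1) dF(t)]."""
    cost = weibull_cdf(w1, shape, scale)      # full-rebate part
    dt = (w2 - w1) / steps
    for k in range(steps):                    # midpoint rule on rebate part
        t = w1 + (k + 0.5) * dt
        f = (shape / scale) * (t / scale) ** (shape - 1) * math.exp(
            -((t / scale) ** shape))          # Weibull pdf
        cost += (w2 - t) / (w2 - w1) * f * dt
    return price * cost

# Ageing region example: shape > 1 (increasing failure rate).
c = mixed_warranty_cost(price=100.0, w1=1.0, w2=2.0, shape=2.0, scale=5.0)
```

Sweeping w1 and w2 in such an expression, against a risk measure, is the kind of trade-off the optimal design described above has to balance.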
Abstract: The network traffic data provided for the design of
intrusion detection are typically large, contain much irrelevant
information, and offer only limited and ambiguous information about
users' activities. We study these problems and propose a two-phase
approach in our intrusion detection design. In the first phase, we develop a
correlation-based feature selection algorithm to remove the worthless
information from the original high dimensional database. Next, we
design an intrusion detection method to solve the problems of
uncertainty caused by limited and ambiguous information. In the
experiments, we choose six UCI databases and DARPA KDD99
intrusion detection data set as our evaluation tools. Empirical studies
indicate that our feature selection algorithm is capable of reducing the
size of the data set. Our intrusion detection method achieves better
performance than the participating intrusion detectors.
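The correlation-based idea behind the first phase can be sketched minimally (an illustration of the principle, not the paper's exact algorithm): score each feature by its absolute Pearson correlation with the class label and keep only the top-k.

```python
# Correlation-based feature ranking sketch: features that barely
# correlate with the class label are treated as worthless and dropped,
# shrinking a high-dimensional data set. Illustrative only.

import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def select_features(rows, labels, k):
    n_feat = len(rows[0])
    scores = [abs(pearson([r[j] for r in rows], labels))
              for j in range(n_feat)]
    return sorted(range(n_feat), key=lambda j: scores[j], reverse=True)[:k]

# Feature 0 tracks the label closely; feature 1 is noise-like.
rows = [(0.0, 5.0), (1.0, 4.0), (0.1, 5.0),
        (0.9, 4.0), (0.0, 4.0), (1.0, 5.0)]
labels = [0, 1, 0, 1, 0, 1]
kept = select_features(rows, labels, k=1)
```

A fuller CFS-style criterion would also penalize redundancy between the kept features, but the ranking step above already conveys how dimensionality is reduced before detection.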
Abstract: The Mongol expansion in the West and the political
and commercial interests arising from antagonisms between the
Golden Horde and the Persian Ilkhanate determined the
transformation of the Black Sea into an international trade hub
beginning in the last third of the thirteenth century. As the Volga
Khanate attracted the maritime power of Genoa in the
transcontinental project of deviating the Silk Road to its own benefit,
the latter took full advantage of the new historical conjuncture, to the
detriment of its rival, Venice. As a consequence, Genoa settled
important urban centers on the Pontic shores, having mainly a
commercial role. In the Romanian outer-Carpathian area, Vicina,
Cetatea Albă, and Chilia are notable, representing distinct, important
types of cities within the broader context of the Romanian medieval
urban genesis typology.
Abstract: This paper describes a study of cryptographic hash functions, one of the most important classes of primitives used in recent cryptographic techniques. The main aim is the development of recent techniques for the cryptanalysis of hash functions, mainly from the SHA family. We present different approaches to defining security properties more formally and present basic attacks on hash functions. We recall the Merkle-Damgård security properties of iterated hash functions. Recently proposed attacks on MD5 and SHA motivate a new hash function design. It is designed not only to have higher security but also to be faster than SHA-256: the performance of the new hash function is at least 30% better than that of SHA-256 in software, and it is secure against all known cryptographic attacks on hash functions.
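The Merkle-Damgård iteration recalled above can be sketched as follows: pad the message with length strengthening, then fold fixed-size blocks through a compression function. The compression function below is a toy, hypothetical mixer for illustration only and is in no way cryptographically secure.

```python
# Merkle-Damgard construction sketch: an iterated hash built from a
# fixed-size compression function, with strengthened length padding.
# The toy compression function is for illustration and is NOT secure.

import struct

def toy_compress(state, block):
    """Hypothetical 8-byte compression function f(h, m) -> h'."""
    h = int.from_bytes(state, "big")
    for i in range(0, len(block), 8):
        m = int.from_bytes(block[i:i + 8], "big")
        h = ((h ^ m) * 0x100000001B3) & 0xFFFFFFFFFFFFFFFF  # FNV-style mix
    return h.to_bytes(8, "big")

def md_hash(message, block_size=16):
    # Strengthened padding: 0x80 marker, zeros, then 8-byte length.
    padded = message + b"\x80"
    padded += b"\x00" * ((-len(padded) - 8) % block_size)
    padded += struct.pack(">Q", len(message))
    state = b"\x01" * 8                       # fixed initialization vector
    for i in range(0, len(padded), block_size):
        state = toy_compress(state, padded[i:i + block_size])
    return state

digest = md_hash(b"abc")
```

The security properties the paper recalls (e.g. collision resistance of the iteration reducing to that of the compression function) are statements about exactly this chaining structure.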
Abstract: Metaphor has recently gained extensive interest, most probably due to developments in the cognitive sciences and the study of language as the reflection of humans' perception of the world. Metaphor is no longer regarded as a solely literary expressive device. Nowadays it is studied in a whole range of discourses, such as politics, law, medicine, sports, etc., with the purpose of analyzing and determining its role. Scientific language is no exception. Although it might seem that metaphor cannot suit it, we hypothesize that metaphor has indeed found a stable place in terminology. In the comprehension of metaphorically represented terms, the stage of visualization plays a significant role. We proceed on the assumption that this stage is central to better term comprehension, and we exemplify it with metaphorically oriented terms.
Abstract: This paper presents the design features of a rescue robot named CEO Mission II. Its body is of the tracked-wheel type with double front flippers for climbing over collapsed structures and rough terrain. A 125 cm long, 5-joint mechanical arm installed on the robot body is deployed not only for surveillance from the top view but also for easier and faster access to victims to read their vital signs. Two cameras and the vital-sign sensors are set up at the tip of the multi-joint mechanical arm, and a third camera at the back of the robot is used for driving control. The hardware and software of the system that controls and monitors the rescue robot are explained. The control system drives the robot locomotion and the 5-joint mechanical arm and turns devices on and off. The monitoring system gathers information from 7 distance sensors, IR temperature sensors, 3 CCD cameras, a voice sensor, robot wheel encoders, yaw/pitch/roll angle sensors, a laser range finder, and 8 spare A/D inputs. All sensor and control data are communicated with a remote control station via IEEE 802.11b Wi-Fi. The audio and video data are compressed and sent via a separate IEEE 802.11g Wi-Fi transmitter for real-time response. At the remote control station, the robot locomotion and the mechanical arm are controlled by joystick; moreover, a user-friendly GUI control program based on a click-and-drag method has been developed to easily control the movement of the arm. A robot traveling map is plotted from the wheel-encoder and yaw/pitch data, and a 2D obstacle map is plotted from the laser range finder data. The concept and design of this robot can be adapted to suit many other applications. As the Best Technique awardee of the Thailand Rescue Robot Championship 2006, the robot produced satisfactory results in all tests.
Abstract: This paper offers suggestions for educators at all levels about how to better prepare our students for the future by building on the past. The discussion begins with a summary of changes in the World Wide Web, especially as the term Web 3.0 is being heard. The bulk of the discussion is retrospective, concerned with an overview of traditional teaching and research approaches as they evolved during the 20th century, beginning with those grounded in the Cartesian reality of I. A. Richards' (1929) Practical Criticism. The paper concludes with a proposal of five strategies which incorporate timeless elements from the past as well as cutting-edge elements from today, in order to better prepare our students for the future.
Abstract: Noise has an adverse effect on human health and
comfort. Noise not only causes hearing impairment, but also acts as
a causal factor for stress and raised systolic pressure. Additionally,
it can be a causal factor in work accidents, both by masking hazards
and warning signals and by impeding concentration. Industry
workers also suffer psychological and physical stress as well as
hearing loss due to industrial noise. This paper proposes an approach
to enable engineers to point out quantitatively the noisiest source for
modification, while multiple machines are operating simultaneously.
The model with the point source and spherical radiation in a free field
was adopted to formulate the problem. The procedure works very
well in ideal cases (point source and free field). However, most of the
industrial noise problems are complicated by the fact that the noise is
confined in a room. Reflections from the walls, floor, ceiling, and
equipment in a room create a reverberant sound field that alters the
sound wave characteristics from those for the free field. So the model
was validated for relatively low absorption room at NIT Kurukshetra
Central Workshop. The validation showed that the estimated sound
powers of the noise sources under simultaneous operating conditions
were on the lower side, within error limits of 3.56-6.35%,
suggesting that the methodology is suitable for practical
implementation in industry. To demonstrate the application of the
above analytical procedure for estimating the sound power of noise
sources under simultaneous operating conditions, a manufacturing
facility (Railway Workshop at Yamunanagar, India) having five
sound sources (machines) on its workshop floor is considered in this
study. The case study identified the two most
effective candidates (noise sources) for noise control in the Railway
Workshop, Yamunanagar, India. The study suggests that
modification of the design and/or replacement of these two identified
noisiest sources (machines) would be necessary to achieve an
effective reduction in noise levels. Further, the estimated data allow
engineers to better understand the noise situations of the workplace
and to revise the map when changes occur in noise level due to a
workplace re-layout.
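The free-field point-source model underlying the formulation can be sketched as follows. For spherical radiation, the sound pressure level at distance r (in metres) from a source of sound power level Lw is Lp = Lw - 20·log10(r) - 11 dB, and simultaneous sources combine on an energy basis. The machine levels and distance below are hypothetical.

```python
# Free-field point-source model with spherical radiation (illustrative
# values): Lp = Lw - 20*log10(r) - 11  (dB, r in metres), and
# simultaneous sources add on an energy (10^(L/10)) basis.

import math

def pressure_level(Lw, r):
    """Sound pressure level at distance r from a point source of
    sound power level Lw, free field, spherical radiation."""
    return Lw - 20.0 * math.log10(r) - 11.0

def combine(levels):
    """Decibel sum of simultaneous source contributions."""
    return 10.0 * math.log10(sum(10.0 ** (L / 10.0) for L in levels))

# Two machines (hypothetical 95 dB and 90 dB sound power) heard at 2 m:
Lp_total = combine([pressure_level(95.0, 2.0), pressure_level(90.0, 2.0)])
```

Inverting such relations from measured combined levels is what allows the sound power of each simultaneously operating machine to be estimated; in a reverberant room the free-field term must be corrected for the room constant, which is why the validation above was done in a low-absorption room.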
Abstract: Activity-Based Costing (ABC) represents an
alternative paradigm to the traditional cost accounting system and
often provides more accurate cost information for decision
making, such as product pricing, product mix, and make-or-buy
decisions. ABC models the causal relationships between
products and the resources used in their production and traces
the cost of products according to the activities through the use
of appropriate cost drivers. In this paper, the implementation
of ABC in a manufacturing system is analyzed, and a
comparison with the traditional cost-based system in terms of
the effects on product costs is carried out to highlight the
difference between the two costing methodologies. By using this
methodology, a valuable insight into the factors that cause the
cost is provided, helping to better manage the activities of the
company.
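The cost-driver tracing described above can be sketched in a few lines; all activities, volumes, and figures below are hypothetical, chosen only to show how a driver rate assigns activity cost to a product.

```python
# Activity-Based Costing allocation sketch: activity costs are assigned
# to products through cost-driver rates. All figures are hypothetical.

activities = {                 # activity -> (total cost, driver volume)
    "machine setup": (10000.0, 50),       # 50 setups per period
    "quality inspection": (6000.0, 300),  # 300 inspections per period
}

# Cost-driver rate = activity cost / driver volume.
driver_rates = {a: cost / volume for a, (cost, volume) in activities.items()}

def product_overhead(driver_usage):
    """driver_usage: activity -> driver units the product consumes."""
    return sum(driver_rates[a] * n for a, n in driver_usage.items())

# Product A consumes 10 setups and 40 inspections:
# rate(setup) = 200 per setup, rate(inspection) = 20 per inspection.
cost_a = product_overhead({"machine setup": 10, "quality inspection": 40})
```

A traditional system would instead spread the same overhead on a single volume base (e.g. machine hours), which is exactly the difference in product costs the comparison above highlights.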
Abstract: Automatic extraction of event information from
social text streams (emails, social networking sites, blogs, etc.) is a
vital requirement for many applications, such as event planning and
management systems and security applications. The key information
components needed from event-related text are the event title, location,
participants, date, and time. Emails are very distinct from other
social text streams in layout, format, and conversation style,
and are the most commonly used
communication channel for broadcasting and planning events.
Therefore we have chosen emails as our dataset. In our work, we
have employed two statistical NLP methods, namely Finite State
Machines (FSM) and Hidden Markov Models (HMM), for the
extraction of event-related contextual information. An application
has been developed that compares the two methods
on the event extraction task. It comprises two modules, one for
each method, and works for both bulk and direct user input.
The results are evaluated using precision, recall, and F-score.
Experiments show that both methods achieve high performance and
accuracy; however, HMM performed better for title extraction, while
FSM proved better for venue, date, and time.
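A minimal flavour of the finite-state approach is sketched below: a toy state machine that scans tokenized email text and fires on a "Month day" date pattern. The patterns are hypothetical, not the paper's trained models.

```python
# Toy finite-state extractor: states START -> MONTH -> (emit date).
# Hypothetical patterns for illustration; a real FSM would cover many
# more date, time, and venue layouts.

MONTHS = {"jan", "feb", "mar", "apr", "may", "jun",
          "jul", "aug", "sep", "oct", "nov", "dec"}

def extract_dates(tokens):
    dates, state, month = [], "START", None
    for tok in tokens:
        low = tok.strip(".,").lower()
        if state == "START" and low[:3] in MONTHS and low[:3].isalpha():
            state, month = "MONTH", tok.strip(".,")   # saw a month word
        elif state == "MONTH" and low.isdigit() and 1 <= int(low) <= 31:
            dates.append(f"{month} {low}")            # month + day -> date
            state, month = "START", None
        else:
            state, month = "START", None              # reset on mismatch
    return dates

tokens = "The meetup is on March 14 at the main hall".split()
found = extract_dates(tokens)       # ["March 14"]
```

An HMM replaces such hand-written transitions with transition and emission probabilities learned from labelled emails, which is why the two methods can excel on different fields.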
Abstract: Insufficient Quality of Service (QoS) of Voice over
Internet Protocol (VoIP) is a growing concern that has led to the need
for research and study. In this paper we investigate the performance
of VoIP and the impact of resource limitations on the performance of
Access Networks. The impact of VoIP performance in Access
Networks is particularly important in regions where Internet
resources are limited and the cost of improving these resources is
prohibitive. It is clear that perceived VoIP performance, as measured
by mean opinion score [2] in experiments, where subjects are asked
to rate communication quality, is determined by end-to-end delay on
the communication path, delay variation, packet loss, echo, the
coding algorithm in use, and noise. These performance indicators can
be measured, and their effect in the Access Network can be estimated.
This paper investigates the contribution of congestion in the Access
Network to the overall performance of VoIP services in the presence
of other substantial uses of the Internet, and ways in which Access
Networks can be designed to improve VoIP performance. Methods for analyzing
the impact of the Access Network on VoIP performance will be
surveyed and reviewed. This paper also considers some approaches
for improving performance of VoIP by carrying out experiments
using Network Simulator version 2 (NS2) software with a view to
gaining a better understanding of the design of Access Networks.
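One common way to estimate perceived quality from the indicators listed above (one-way delay, packet loss) is the ITU-T G.107 E-model; the impairment fits below are widely quoted simplifications, used here as illustrative assumptions rather than the paper's exact method.

```python
# Simplified E-model sketch (ITU-T G.107 style): map one-way delay and
# packet loss to an R-factor, then to a mean opinion score (MOS).
# The impairment formulas are common simplified fits, assumed here.

def r_factor(delay_ms, loss_pct, Ie=0.0, Bpl=34.0):
    Id = 0.024 * delay_ms                     # delay impairment
    if delay_ms > 177.3:
        Id += 0.11 * (delay_ms - 177.3)       # extra penalty past 177.3 ms
    # Effective equipment impairment grows with packet loss.
    Ie_eff = Ie + (95.0 - Ie) * loss_pct / (loss_pct + Bpl)
    return 93.2 - Id - Ie_eff

def mos(R):
    if R <= 0:
        return 1.0
    if R >= 100:
        return 4.5
    return 1.0 + 0.035 * R + 7e-6 * R * (R - 60.0) * (100.0 - R)

good = mos(r_factor(delay_ms=50.0, loss_pct=0.0))       # uncongested path
congested = mos(r_factor(delay_ms=300.0, loss_pct=3.0)) # congested access
```

Feeding NS2-measured delay and loss into such a mapping is one way to translate simulated Access Network designs into the subjective MOS scale discussed above.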
Abstract: This paper proposes an efficient method for the design
of a two-channel quadrature mirror filter (QMF) bank. To achieve a
minimum reconstruction error, close to perfect reconstruction,
a linear optimization process has been proposed. The prototype
low-pass filter has been designed using the Kaiser window function.
A modified algorithm has been developed to optimize the reconstruction
error using a linear objective function through an iterative method.
The results obtained show that the performance of the proposed
algorithm is better than that of existing methods.
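A Kaiser-window lowpass prototype of the kind used as the starting point can be sketched as follows; the tap count, cutoff, and beta below are illustrative, not the paper's optimized design values.

```python
# Kaiser-window lowpass prototype sketch: a windowed-sinc FIR filter
# whose shape parameter beta trades stopband attenuation against
# transition width. Parameters here are illustrative only.

import math

def bessel_i0(x, terms=25):
    """Zeroth-order modified Bessel function I0 via its power series."""
    s, t = 1.0, 1.0
    for k in range(1, terms):
        t *= (x / (2.0 * k)) ** 2
        s += t
    return s

def kaiser_lowpass(num_taps, cutoff, beta):
    """cutoff is normalized to the Nyquist frequency (0..1)."""
    M = num_taps - 1
    h = []
    for n in range(num_taps):
        m = n - M / 2.0
        # Ideal lowpass impulse response (sinc), shifted to be causal.
        ideal = cutoff if m == 0 else math.sin(math.pi * cutoff * m) / (math.pi * m)
        # Kaiser window weight for this tap.
        w = (bessel_i0(beta * math.sqrt(1.0 - (2.0 * n / M - 1.0) ** 2))
             / bessel_i0(beta))
        h.append(ideal * w)
    return h

h = kaiser_lowpass(num_taps=31, cutoff=0.5, beta=6.0)
dc_gain = sum(h)     # close to 1 for a unity-passband lowpass
```

In a two-channel QMF bank, the highpass filter is the mirror of this prototype, and the iterative optimization described above then adjusts the design to drive the reconstruction error toward zero.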
Abstract: In recent times, the problem of Unsolicited Bulk
Email (UBE), commonly known as spam email, has grown at a
tremendous rate. We present an analysis and survey of the
classifications of UBE in various research works. There are many
research instances of classification between spam and non-spam
emails, but very few are available for the classification
of spam emails per se. This paper does not intend to assert that some
UBE classification is better than the others, nor does it propose
any new classification; rather, it highlights the lack of harmony on
the number and definition of the categories proposed by different
researchers. The paper also elaborates on factors like the intent of
the spammer, the content of UBE, and the ambiguity among the
different categories proposed in related research works on the
classification of UBE.