Abstract: Early diagnostic decision making in industrial processes is essential to producing high-quality final products. It provides early warning of special events in a process so that their assignable causes can be found. This work presents a hybrid diagnostic scheme for batch processes in which a nonlinear representation of raw process data is combined with classification tree techniques. Nonlinear kernel-based dimension reduction is executed to obtain nonlinear classification decision boundaries for the fault classes. To enhance diagnosis performance for batch processes, the data are filtered to remove irrelevant information. Four diagnostic schemes are evaluated to compare the diagnosis performance of several representation, filtering, and future-observation estimation methods. In this work, the performance of the presented diagnosis schemes is demonstrated using batch process data.
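The pairing of kernel-based dimension reduction with a downstream classifier can be sketched generically. The snippet below is an illustration only, not the paper's actual scheme: the RBF kernel, the `gamma` value, and the synthetic two-ring "fault class" data are all assumptions made for the example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Pairwise squared distances mapped through a Gaussian (RBF) kernel.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=0.5):
    # Nonlinear dimension reduction: eigendecompose the centered kernel
    # matrix and project the training data onto the leading components.
    K = rbf_kernel(X, X, gamma)
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one   # double centering
    vals, vecs = np.linalg.eigh(Kc)              # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]  # keep the largest ones
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return Kc @ alphas                           # projected scores

# Two noisy fault classes that are not linearly separable in raw space:
# concentric rings, a classic case where a kernel map helps a tree classifier.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 40)
inner = 0.5 * np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(40, 2))
outer = 2.0 * np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(40, 2))
scores = kernel_pca(np.vstack([inner, outer]), n_components=2)
print(scores.shape)  # (80, 2)
```

The projected scores would then be handed to a classification tree, which can carve the now near-separable classes with axis-aligned splits.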
Abstract: Von Willebrand's disease is the most common
inherited bleeding disorder in humans; it is
caused by qualitative abnormalities of the von Willebrand factor
(vWF). Our objective is to determine the prevalence of this disease in
part of the Algerian population in the East and the South through a
biological diagnosis based on specific biological tests (automated
platelet count, bleeding time (TS), activated cephalin time
(TCA), measurement of the prothrombin rate (TP), vWF level, factor
VIII level, and molecular electrophoresis of vWF multimers in agarose gel
in the presence of SDS). Four patients with type III or severe
Willebrand's disease were found among 200 suspect cases. All cases
showed a deficit in the vWF level (< 5%) and in factor VIII (P
Abstract: The survey and classification of the different security
attacks in structured peer-to-peer (P2P) overlay networks can be
useful to computer system designers, programmers, administrators,
and users. In this paper, we attempt to provide a taxonomy of
structured P2P overlay network security attacks. We have especially
focused on the way these attacks can arise at each level of the
network. Moreover, we observe that most existing systems,
such as Content Addressable Network (CAN), Chord, Pastry,
Tapestry, Kademlia, and Viceroy, suffer from threats and
vulnerabilities that can disrupt and corrupt their functioning. We
hope that our survey will be of help to those working in this area of
research.
Abstract: Object-oriented simulation is considered one of the most sophisticated techniques in wide use for planning, designing, executing, and maintaining construction projects. This technique enables the modeler to focus on objects, which is extremely important for a thorough understanding of a system. Thus, identifying an object is an essential step in building a successful simulation model. In a maintenance process, an object is a maintenance work order (MWO). This study demonstrates a maintenance simulation model for the building maintenance division of the Saudi Consolidated Electric Company (SCECO) in Dammam, Saudi Arabia. The model focuses on both types of maintenance processes, namely (1) preventive maintenance (PM) and (2) corrective maintenance (CM). It is apparent from the findings that object-oriented simulation is a good diagnostic and experimental tool, because problems, limitations, bottlenecks, and so forth are easily identified. These features are very difficult to obtain when using other tools.
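A minimal sketch of what an object-oriented maintenance simulation might look like is given below. This is a generic illustration, not SCECO's actual model: the arrival pattern, service-time distributions, and crew count are all invented for the example. Each MWO is an object dispatched to the first available crew, and the average queueing delay surfaces bottlenecks.

```python
import heapq
import random

class MaintenanceWorkOrder:
    """One work-order object: preventive (PM) or corrective (CM)."""
    def __init__(self, oid, kind, arrival):
        self.oid, self.kind, self.arrival = oid, kind, arrival

def simulate(n_orders=100, crews=2, seed=1):
    rng = random.Random(seed)
    # PM orders arrive on a fixed schedule; CM orders (breakdowns) at random.
    orders = []
    for i in range(n_orders):
        kind = "PM" if i % 3 == 0 else "CM"
        arrival = float(i) if kind == "PM" else rng.uniform(0.0, float(n_orders))
        orders.append(MaintenanceWorkOrder(i, kind, arrival))
    orders.sort(key=lambda o: o.arrival)
    crew_free = [0.0] * crews          # times at which each crew is next free
    heapq.heapify(crew_free)
    waits = []
    for o in orders:
        free_at = heapq.heappop(crew_free)      # earliest-available crew
        start = max(o.arrival, free_at)
        service = 2.0 if o.kind == "PM" else rng.uniform(1.0, 6.0)
        heapq.heappush(crew_free, start + service)
        waits.append(start - o.arrival)         # queueing delay = bottleneck signal
    return sum(waits) / len(waits)

avg_wait = simulate()
print(round(avg_wait, 2))
```

Varying `crews` or the PM/CM mix and re-running is exactly the kind of cheap experiment the abstract credits object-oriented simulation with enabling.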
Abstract: In cryptography, confusion and diffusion are very
important to get confidentiality and privacy of message in block
ciphers and stream ciphers. There are two types of network to provide
confusion and diffusion properties of message in block ciphers. They
are the Substitution-Permutation network (S-P network) and the Feistel
network. NLFS (non-linear feedback stream cipher) is a fast and
secure stream cipher for software applications. NLFS has two modes:
a basic synchronous mode and a self-synchronous mode.
Real random numbers are non-deterministic. The R-box (random box)
is based on dynamic properties and performs a stochastic
transformation of data, which can be used effectively to meet the
challenge of protecting information from intentional destructive
impacts. In this paper, a new implementation of stochastic
transformation is proposed.
Abstract: The challenge in image authentication is that in many cases images need to be subjected to non-malicious operations such as compression, so authentication techniques need to be compression tolerant. In this paper we propose an image authentication system that is tolerant to JPEG lossy compression operations. A scheme for JPEG grey-scale images is proposed based on a data embedding method that uses a secret key and a secret mapping vector in the frequency domain. An encrypted feature vector, extracted from the image DCT coefficients, is embedded redundantly and invisibly in the marked image. On the receiver side, the feature vector is derived again from the received image and compared against the extracted watermark to verify the image's authenticity. The proposed scheme is robust against JPEG compression up to a maximum compression of approximately 80%, but sensitive to malicious attacks such as cutting and pasting.
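A toy version of a compression-tolerant feature vector can be built from the signs of low-frequency DCT coefficients, which tend to survive moderate JPEG quantization. This is a generic illustration, not the paper's scheme: the block size, the choice of coefficients, and the sign-based feature are assumptions, and the key-based encryption and embedding steps are omitted.

```python
import numpy as np

def dct2(block):
    # Orthonormal 2-D DCT-II built from the 1-D DCT basis matrix.
    n = block.shape[0]
    k = np.arange(n)
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0, :] = np.sqrt(1.0 / n)
    return C @ block @ C.T

def feature_vector(img, block=8, keep=3):
    # Signs of a few low-frequency DCT coefficients per 8x8 block; signs
    # survive moderate quantization, making the feature compression tolerant.
    feats = []
    for r in range(0, img.shape[0] - block + 1, block):
        for c in range(0, img.shape[1] - block + 1, block):
            d = dct2(img[r:r + block, c:c + block].astype(float))
            feats.extend(np.sign(d[0, 1:1 + keep]).astype(int))
    return np.array(feats)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(16, 16))       # stand-in grey-scale image
fv = feature_vector(img)
tampered = img.copy()
tampered[:8, :8] = 0                            # cut-and-paste style tampering
fv_tampered = feature_vector(tampered)
print((fv_tampered != fv).any())                # tampering changes the feature
```

On the receiver side, the same extraction is repeated and the result compared against the embedded (and decrypted) watermark; a mismatch flags tampering.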
Abstract: We developed a new method based on quasi-molecular
modeling to simulate the cavity flow in three cavity
shapes: rectangular, half-circular, and beer-bucket, in cgs units. Each
quasi-molecule was a group of particles that interacted in a fashion
entirely analogous to classical Newtonian molecular interactions.
When a cavity flow was simulated, the instantaneous velocity vector
fields were obtained by using an inverse distance weighted
interpolation method. In all three cavity shapes, fluid motion was
rotated counter-clockwise. The velocity vector fields of the three
cavity shapes showed a primary vortex located near the upstream
corners at time t ~ 0.500 s, t ~ 0.450 s and t ~ 0.350 s, respectively.
The configurational kinetic energy of the cavities increased as time
increased until the kinetic energy reached a maximum at time t ~
0.02 s and, then, the kinetic energy decreased as time increased. The
rectangular cavity system showed the lowest kinetic energy, while
the half-circular cavity system showed the highest kinetic energy.
The kinetic energy of the rectangular, beer-bucket, and half-circular
cavities fluctuated about stable average values of 35.62 × 10³,
38.04 × 10³, and 40.80 × 10³ ergs/particle, respectively. This
indicated that the half-circular shape is the most suitable for a
shrimp pond, because the water flows best in it compared with the
rectangular and beer-bucket shapes.
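The inverse distance weighted interpolation used to form the instantaneous velocity vector fields can be sketched as follows. This is a generic IDW implementation, not the authors' code; the power parameter of 2 and the sample velocity field are assumptions for the example.

```python
import numpy as np

def idw(points, values, query, power=2.0, eps=1e-12):
    # Inverse distance weighting: the estimate at `query` is a weighted
    # average of the known values, with weights 1 / distance**power.
    d = np.linalg.norm(points - query, axis=1)
    if d.min() < eps:                    # query coincides with a sample point
        return values[np.argmin(d)]
    w = 1.0 / d ** power
    return (w[:, None] * values).sum(axis=0) / w.sum()

# Velocity vectors (u, v) sampled at four quasi-molecule positions.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
vel = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
v = idw(pts, vel, np.array([0.5, 0.5]))
print(v)  # symmetric query point -> [0. 0.]
```

Evaluating `idw` on a regular grid of query points yields the velocity vector field from the scattered quasi-molecule velocities.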
Abstract: For the past couple of decades, weak signal detection
has been of crucial importance in various engineering and scientific
applications. It finds application in areas such as wireless
communication, radar, aerospace engineering, control systems, and
many others. Usually, weak signal detection requires a phase-sensitive
detector and a demodulation module to detect and analyze the signal.
This article gives a preamble to an intrusion detection system that
can effectively detect a weak signal from a multiplexed signal. By
carefully inspecting and analyzing the respective signal, this
system can successfully indicate any peripheral intrusion. An intrusion
detection system (IDS) is a comprehensive and straightforward approach
to detecting and analyzing any signal that is weakened and
garbled due to a low signal-to-noise ratio (SNR). This approach
is of significant importance in applications such as peripheral security
systems.
Abstract: Due to the dynamic nature of the Cloud, continuous monitoring of QoS requirements is necessary to manage the Cloud computing environment. The process of QoS monitoring and SLA violation detection consists of: collecting low and high level information pertinent to the service, analyzing the collected information, and taking corrective actions when SLA violations are detected. In this paper, we detail the architecture and the implementation of the first step of this process. More specifically, we propose an event-based approach to obtain run time information of services developed as BPEL processes. By catching particular events (i.e., the low level information), our approach recognizes the run-time execution path of a monitored service and uses the BPEL execution patterns to compute QoS of the composite service (i.e., the high level information).
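The high-level QoS computation from BPEL execution patterns can be illustrated with the standard aggregation rules for response time: durations add over a `sequence`, while a parallel `flow` is dominated by its slowest branch. The process tree and timings below are invented for illustration and are not taken from the paper.

```python
def response_time(node):
    # node is ("invoke", rt_ms) for a basic activity whose duration was
    # measured from its low-level start/end events, or
    # ("sequence" | "flow", [children]) for a structured activity.
    kind = node[0]
    if kind == "invoke":
        return node[1]
    child_times = [response_time(c) for c in node[1]]
    # Sequence: activities run one after another -> durations add.
    # Flow: activities run in parallel -> the slowest branch dominates.
    return sum(child_times) if kind == "sequence" else max(child_times)

# Run-time execution path reconstructed from caught events (illustrative).
process = ("sequence", [
    ("invoke", 120.0),
    ("flow", [("invoke", 80.0), ("invoke", 95.0)]),
    ("invoke", 40.0),
])
rt = response_time(process)
print(rt)  # 120 + max(80, 95) + 40 = 255.0
```

The composite value computed this way is the "high level information" that a later step would compare against the SLA thresholds.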
Abstract: Computer technology and the Internet have brought
about a breakthrough in data communication. This has
opened a whole new way of implementing steganography to ensure
secure data transfer. Steganography is the fine art of hiding
information. Hiding the message in the carrier file enables
deniability of the existence of any message at all. This paper designs
a stego machine to develop a steganographic application to hide data
containing text in a computer video file and to retrieve the hidden
information. This is achieved by embedding a text file in a video
file in such a way that the video does not lose its functionality, using
the Least Significant Bit (LSB) modification method. This method
applies imperceptible modifications. The proposed method strives
for high security through an eavesdropper's inability to detect the
hidden information.
Abstract: Simulation is a very helpful and valuable work tool in
manufacturing. It can be used in the industrial field to allow a
system's behavior to be learnt and tested. Simulation provides a
low-cost, secure, and fast analysis tool. It also provides benefits
that can be reached with many different system configurations. Topics to
be discussed include: Applications, Modeling, Validating, Software
and benefits of simulation. This paper provides a comprehensive
literature review on research efforts in simulation.
Abstract: This paper describes the results of an extensive study
and comparison of popular hash functions SHA-1, SHA-256,
RIPEMD-160 and RIPEMD-320 with JERIM-320, a 320-bit hash
function. The compression functions of hash functions like SHA-1
and SHA-256 are designed using serial successive iteration whereas
those like RIPEMD-160 and RIPEMD-320 are designed using two
parallel lines of message processing. JERIM-320 uses four parallel
lines of message processing, resulting in a higher level of security than
the other hash functions at comparable speed and memory requirements.
The performance evaluation of these methods has been done through
practical implementation and also by using step computation
methods. JERIM-320 proves to be secure and ensures the integrity of
messages to a higher degree. The focus of this work is to establish
JERIM-320 as an alternative to present-day hash functions for
fast-growing Internet applications.
Abstract: Existing process models for the development of mechatronic systems provide for largely parallel work in the detailed development phase. This parallel work also takes place largely independently in the various disciplines involved. An approach for a new process model further develops existing models for use in the development of adaptronic systems. This approach is based on an intermediate integration and an abstract modeling of the adaptronic system. Based on this system model, a simulation of the global system behavior, due to external and internal factors or forces, is developed. For the intermediate integration, a special data management system is used. According to the presented approach, this data management system has a number of functions that are not part of the "normal" PDM functionality. Therefore, a concept for a new data management system for the development of adaptronic systems is presented in this paper. This concept divides the functions into six layers. In the first layer, a system model is created, which divides the adaptronic system according to its components and the various technical disciplines involved. Moreover, the parameters and properties of the system are modeled and linked together with the requirements and the system model. The modeled parameters and properties result in a network, which is analyzed in the second layer. From this analysis, the adjustments to individual components necessary for a specific manipulation of the system behavior can be determined. The third layer contains an automatic abstract simulation of the system behavior. This simulation is a precursor to the network analysis and serves as a filter. Through the network analysis and simulation, changes to system components are examined and the necessary adjustments to other components are calculated.
The remaining layers of the concept cover the automatic calculation of system reliability, the "normal" PDM functionality, and the integration of discipline-specific data into the system model. A prototypical implementation of such a data management system, with the addition of automatic system development, is being carried out using the data management system ENOVIA SmarTeam V5 and the simulation system MATLAB.
Abstract: These days, MANETs are attracting much attention, as
they are expected to greatly influence communication between
wireless nodes. Along with this great strength comes a much higher
chance of nodes leaving and of attacks by malicious nodes. For this
reason, much attention is given to security and privacy issues in
MANETs, and a great deal of research on MANETs is being done. In this
paper we present an overview of MANETs, the security issues of MANETs,
IP configuration in MANETs, a solution to resolve the security
issues, and a simulation of the proposed idea. We add a method to
identify malicious nodes so that we can prevent attacks from
them. Nodes exchange information about nodes to prevent DAD
(duplicate address detection) attacks. We obtain 30% better
performance than the previous MANETConf.
Abstract: Multimedia security is an incredibly significant area
of concern. A number of papers on robust digital watermarking have
been presented, but no standards have been defined so
far. Thus, multimedia security remains a pressing problem. The aim of
this paper is to design a robust image-watermarking scheme that
can withstand a diverse set of attacks. The proposed scheme
provides a robust solution integrating image moment normalization,
content dependent watermark and discrete wavelet transformation.
Moment normalization is useful to recover the watermark even in
case of geometrical attacks. Content dependent watermarks are a
powerful means of authentication as the data is watermarked with its
own features. Discrete wavelet transforms have been used as they
describe image features in a better manner. The proposed scheme
finds its place in validating identification cards and financial
instruments.
Abstract: The dissolution of spherical particles in liquids is analyzed dynamically. Here, we consider the case in which the dissolution of solute yields a solute-free solid phase in the outer portion of a particle. As dissolution proceeds, the interface between the undissolved solid phase and the solute-free solid phase moves towards the center of the particle. We assume that there exist two resistances to the diffusion of solute molecules: the resistance due to the solute-free portion of the particle and that due to a surface layer near the solid-liquid interface. In general, the equation governing the dynamic behavior of dissolution needs to be solved numerically. However, analytical expressions for the temporal variation of the size of the undissolved portion of a particle and for the dissolution time can be obtained in some special cases. The present analysis takes into account the effect of variable bulk solute concentration on dissolution.
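The two-resistance picture described above resembles the classical shrinking-core formulation. As a hedged sketch (our reading, not necessarily the authors' exact equations), a quasi-steady solute balance over the solute-free shell in series with the surface layer gives:

```latex
% Assumed symbols: D shell diffusivity, k surface-layer mass-transfer
% coefficient, C_s saturation and C_b bulk concentrations, r core radius,
% R particle radius. Quasi-steady molar flow of solute out of the particle:
N = \frac{4\pi\,(C_s - C_b)}
         {\dfrac{1}{D}\left(\dfrac{1}{r}-\dfrac{1}{R}\right) + \dfrac{1}{k R^{2}}}
% Core shrinkage from a solute balance on the undissolved core
% (\rho_c: molar density of solute in the core):
-\,4\pi r^{2}\,\rho_c\,\frac{\mathrm{d}r}{\mathrm{d}t} = N
```

The two terms in the denominator are exactly the two resistances named in the abstract; letting either dominate recovers the special cases with analytical solutions.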
Abstract: In this paper, a new secure watermarking scheme for
color images is proposed. It splits the watermark into two shares using
a (2, 2)-threshold Visual Cryptography Scheme (VCS) with the Adaptive
Order Dithering technique and embeds one share into high textured
subband of Luminance channel of the color image. The other share
is used as the key and is available only with the super-user or the
author of the image. In this scheme only the super-user can reveal
the original watermark. The proposed scheme is dynamic in the sense
that to maintain the perceptual similarity between the original and the
watermarked image the selected subband coefficients are modified
by varying the watermark scaling factor. The experimental results
demonstrate the effectiveness of the proposed scheme. Furthermore, the
proposed scheme is able to resist all common attacks, even at strong
watermark amplitudes.
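The share-splitting step can be sketched with the classic (2, 2) VCS construction for binary pixels. This is the generic textbook construction, not the paper's scheme: the adaptive order dithering step and the subband embedding are omitted, and reconstruction here uses simple OR-stacking.

```python
import random

# Each secret pixel expands to a pair of subpixels in every share.
PATTERNS = [(0, 1), (1, 0)]

def split(secret_bits, seed=7):
    # White pixel (0): both shares get the same random pattern.
    # Black pixel (1): share 2 gets the complementary pattern.
    # Either share alone is a uniformly random pattern sequence,
    # so a single share reveals nothing about the secret.
    rng = random.Random(seed)
    s1, s2 = [], []
    for bit in secret_bits:
        p = rng.choice(PATTERNS)
        s1.append(p)
        s2.append(p if bit == 0 else (1 - p[0], 1 - p[1]))
    return s1, s2

def stack(s1, s2):
    # Physical stacking is an OR; a fully black subpixel pair means black.
    return [int((a | c) and (b | d)) for (a, b), (c, d) in zip(s1, s2)]

secret = [1, 0, 1, 1, 0, 0, 1]
share1, share2 = split(secret)
recovered = stack(share1, share2)
print(recovered == secret)  # True: only stacking both shares reveals the secret
```

In the proposed scheme, one share would be embedded in the textured wavelet subband while the other serves as the super-user's key.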
Abstract: Web services provide significant new benefits for
SOA-based applications, but they also expose significant new security
risks. There is a huge number of WS security standards and
processes. At present, there is still a lack of a comprehensive
approach that offers a methodical development path for the construction
of secure WS-based SOAs. Thus, the main objective of this paper is
to address this need by presenting a comprehensive method for Web
Services security assurance in SOA. The proposed method defines
three stages: Initial Security Analysis, Architectural Security
Guaranty, and WS Security Standards Identification. These facilitate,
respectively, the definition and analysis of WS-specific security
requirements, the development of a WS-based security architecture
and the identification of the related WS security standards that the
security architecture must articulate in order to implement the
security services.
Abstract: Governments around the world are expending
considerable time and resources framing strategies and policies to
deliver energy security. The term 'energy security' has quietly
slipped into the energy lexicon without any meaningful discourse
about its meaning or assumptions. An examination of explicit and
inferred definitions finds that the concept is inherently slippery
because it is polysemic in nature having multiple dimensions and
taking on different specificities depending on the country (or
continent), timeframe or energy source to which it is applied. But
what does this mean for policymakers? Can traditional policy
approaches be used to address the problem of energy security, or do
its polysemic qualities mean that it should be treated as a 'wicked'
problem? To answer this question, the paper assesses energy security
against nine commonly cited characteristics of wicked policy
problems and finds strong evidence of 'wickedness'.
Abstract: Historic preservation areas are extremely vulnerable to disasters because they are home to many vulnerable people and contain many closely spaced wooden houses. However, the narrow streets in these regions have historic meaning, which means that they cannot be widened and can become blocked easily during large disasters. Here, we describe our efforts to establish a methodology for planning evacuation routes in such historic preservation areas. In particular, this study aims to clarify the effectiveness of measures intended to secure two-way evacuation routes for vulnerable people during large disasters in a historic area preserved under the Cultural Properties Protection Law, Japan.