Abstract: The number of intrusions and attacks against critical
infrastructures and other information networks is increasing rapidly.
While there is no identified evidence that terrorist organizations are
currently planning a coordinated attack against the vulnerabilities of
computer systems and networks connected to critical infrastructure,
the origins of the indiscriminate cyber attacks that infect computers
on these networks remain largely unknown. The growing trend toward
the use of more automated and menacing attack tools has also
overwhelmed some of the current methodologies used for tracking
cyber attacks. There is ample possibility that cyber attacks of this
kind could escalate into cyberterrorism driven by illegal purposes.
Cyberterrorism is a matter of vital importance to national welfare.
Therefore, every country and organization has to take proper
measures to meet the situation and consider effective legislation on
cyberterrorism.
Abstract: In this paper a new approach to face recognition is presented that achieves double dimension reduction, making the system computationally efficient with better recognition results. In pattern recognition techniques, the discriminative information of an image increases with resolution up to a certain extent; consequently, face recognition results improve as face image resolution increases and level off beyond a certain resolution. In the proposed model of face recognition, an image decimation algorithm is first applied to the face image for dimension reduction down to the resolution level that provides the best recognition results. The Discrete Cosine Transform (DCT) is then applied to the face image, owing to its computational speed and feature extraction potential. A subset of DCT coefficients from low to mid frequencies that represents the face adequately and provides the best recognition results is retained. A trade-off between the decimation factor, the number of DCT coefficients retained, and the recognition rate with minimum computation is obtained. Preprocessing of the image is carried out to increase its robustness against variations in pose and illumination level. This new model has been tested on different databases, which include the ORL database, the Yale database, and a color database. The proposed technique has performed much better compared to other techniques. The significance of the model is twofold: (1) dimension reduction up to an effective and suitable face image resolution; (2) appropriate DCT coefficients are retained to achieve the best recognition results with varying image poses, intensity, and illumination levels.
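The coefficient-retention idea described above can be sketched in Python; this is a minimal, hypothetical illustration (not the authors' implementation), computing a separable orthonormal 2D DCT-II and keeping a square block of low-frequency coefficients as the feature vector.

```python
import math

def dct_1d(x):
    """Orthonormal 1D DCT-II of a list of floats."""
    N = len(x)
    out = []
    for k in range(N):
        s = sum(x[n] * math.cos(math.pi * (n + 0.5) * k / N) for n in range(N))
        scale = math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
        out.append(scale * s)
    return out

def dct_2d(img):
    """2D DCT via row transforms followed by column transforms."""
    rows = [dct_1d(r) for r in img]
    cols = [dct_1d(list(c)) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]

def dct_features(img, keep=4):
    """Retain the top-left keep x keep block of low-frequency DCT
    coefficients as a flat feature vector (a simple stand-in for the
    low-to-mid-frequency selection described in the abstract)."""
    C = dct_2d(img)
    return [C[u][v] for u in range(keep) for v in range(keep)]
```

As a sanity check, for an orthonormal DCT the DC coefficient C[0][0] equals sqrt(M*N) times the mean pixel value of an M x N image.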
Abstract: The frequency dependence of the phase field
model (PFM) is studied. A simple PFM is proposed and tested in a
laminar boundary layer. The Blasius laminar boundary layer
solution on a flat plate is used for the flow pattern, several
frequencies are imposed on the PFM, and the decay times of the
interfaces are obtained. The computations were conducted for three
cases: 1) no flow, 2) a half ball on the laminar boundary layer, and 3) a
line of mass sources in the laminar boundary layer. The computations
show that the decay time becomes shorter as the frequency increases, and
also that it is sensitive to both background disturbances and
surface tension parameters. It is concluded that the proposed simple
PFM can describe the properties of the decay process and could provide the
fundamentals for the decay of the interface in turbulent flows.
Abstract: Even though most researchers would agree that in
symbiotic relationships, like the one between parent and child,
influences become reciprocal over time, empirical evidence
supporting this claim is limited. The aim of the current study was to
develop and test a model describing the reciprocal influence between
characteristics of the parent-child relationship, such as closeness and
conflict, and the child's bullying and victimization experiences at
school. The study used data from the longitudinal Study of Early
Child-Care, conducted by the National Institute of Child Health and
Human Development. The participants were dyads of early
adolescents (5th and 6th graders during the two data collection waves)
and their mothers (N=1364). Supporting our hypothesis, the findings
suggested a reciprocal association between bullying and positive
parenting, although this association was only significant for boys.
Victimization and positive parenting were not significantly
interrelated.
Abstract: This paper describes a segmentation algorithm based
on the cooperation of an optical flow estimation method with edge
detection and region growing procedures.
The proposed method has been developed as a pre-processing
stage to be used in methodologies and tools for video/image indexing
and retrieval by content. The problem addressed consists of
extracting whole objects from the background to produce images of
single, complete objects from videos or photos. The extracted images
are used to calculate the object visual features necessary for both
indexing and retrieval processes.
The first task of the algorithm exploits the cues from motion
analysis for moving area detection. Objects and background are then
refined using respectively edge detection and region growing
procedures. These tasks are iteratively performed until objects and
background are completely resolved.
The developed method has been applied to a variety of indoor and
outdoor scenes in which objects of different types and shapes appear
against variously textured backgrounds.
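The motion-analysis step that seeds the segmentation can be illustrated with a much simpler stand-in: frame differencing with thresholding to flag moving pixels. This is a hypothetical Python sketch, not the optical flow estimator used in the paper.

```python
def moving_area_mask(prev_frame, curr_frame, threshold=20):
    """Return a binary mask marking pixels whose intensity changed by
    more than `threshold` between two grayscale frames (nested lists)."""
    return [
        [1 if abs(c - p) > threshold else 0
         for p, c in zip(prev_row, curr_row)]
        for prev_row, curr_row in zip(prev_frame, curr_frame)
    ]
```

In the full algorithm, such a coarse moving-area mask would then be refined iteratively: edge detection sharpens the object side, while region growing fills in the background side.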
Abstract: Currently, 3G mobile networks are experiencing a data
traffic explosion due to the large increase in the number of smartphone
users. Unlike a traditional wired infrastructure, 3G mobile networks
have limited wireless resources and complex signaling procedures for
wireless resource management. Moreover, mobile network security
technologies for various kinds of abnormal and malicious traffic are not
yet ready. Malicious or potentially malicious traffic originating from
smart devices infected with mobile malware can therefore cause serious
problems in 3G mobile networks, such as the DoS and scanning attacks
seen in wired networks. This paper describes the DoS security threat
in the 3G mobile network and proposes a detection technology.
Abstract: The ability to recognize humans and their activities by computer vision is a very important task, with many potential applications. The study of human motion analysis is related to several research areas of computer vision, such as motion capture and the detection, tracking, and segmentation of people. In this paper, we describe a segmentation method for extracting the human body contour in a modified HLS color space. To estimate the background, the modified HLS color space is proposed, and the background features are estimated using the HLS color components. A large human dataset, collected from DV cameras, is pre-processed. The human body and its contour are successfully extracted from the image sequences.
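As a hedged illustration of working in HLS space, the RGB-to-HLS conversion and a simple background-difference test can be sketched with Python's standard colorsys module. The modified HLS components and thresholds used by the authors are not specified in the abstract, so the tolerances below are purely illustrative assumptions.

```python
import colorsys

def to_hls(pixel):
    """Convert an 8-bit RGB pixel to (hue, lightness, saturation) in [0, 1]."""
    r, g, b = (c / 255.0 for c in pixel)
    return colorsys.rgb_to_hls(r, g, b)

def is_foreground(pixel, bg_hls, tol=(0.05, 0.10, 0.10)):
    """Flag a pixel as foreground if any HLS component deviates from the
    estimated background by more than its (assumed) per-channel tolerance."""
    h, l, s = to_hls(pixel)
    return any(abs(a - b) > t for a, b, t in zip((h, l, s), bg_hls, tol))
```

Applying such a per-pixel test against the estimated background yields a candidate body mask, from which the contour can then be traced.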
Abstract: The aim of this study was to compare the
sensitometric properties of commonly used radiographic films
processed with chemical solutions in different workload hospitals.
The effect of different processing conditions on induced densities on
radiologic films was investigated. Two accessible double-emulsion
films, Fuji and Kodak, were exposed with an 11-step wedge and
processed with Champion and CPAC processing solutions. The
films were obtained from both high- and low-workload centers.
Our findings show that the speed and contrast of the Kodak film-screen
system in both workloads (high and low) are higher than those of the
Fuji film-screen system for both processing solutions. However, there
were significant differences in film contrast between the two workloads
when the CPAC solution was used (p=0.000 and 0.028). The results also
showed that the base-plus-fog density of the Kodak film was lower than
that of the Fuji film. In general, the Champion processing solution
produced higher speed and contrast for the investigated films under
different conditions, and there was a significant difference at the 95%
confidence level between the two processing solutions (p=0.01). The low
base-plus-fog density of Kodak films provides more visibility, accuracy,
and higher contrast, allowing lower exposure factors to be used to obtain
better quality in the resulting radiographs. In this study we also found
an economic advantage, since use of the Champion solution and Kodak film
results in a lower patient dose. Thus, in a radiologic facility, any
change in the film processor, processing cycle, or chemistry should be
carefully investigated before radiological procedures on patients are
performed.
Abstract: Two commercial proteases from Bacillus
licheniformis (Alcalase 2.4 L FG and Alcalase 2.5 L, Type DX) were
screened for the production of Z-Ala-Phe-NH2 in batch reaction.
Alcalase 2.4 L FG was the most efficient enzyme for the C-terminal
amidation of Z-Ala-Phe-OMe using ammonium carbamate as
ammonium source. Immobilization of protease has been achieved by
the sol-gel method, using dimethyldimethoxysilane (DMDMOS) and
tetramethoxysilane (TMOS) as precursors (unpublished results). In
batch production, about 95% of Z-Ala-Phe-NH2 was obtained at
30°C after 24 hours of incubation. Reproducibility of different
batches of commercial Alcalase 2.4 L FG preparations was also
investigated by evaluating the amidation activity and the entrapment
yields in the case of immobilization. A packed-bed reactor (0.68 cm
ID, 15.0 cm long) was operated successfully for the continuous
synthesis of peptide amides. The immobilized enzyme retained its
initial activity over 10 cycles of repeated use in the continuous reactor
at ambient temperature. At a 0.75 mL/min flow rate of the substrate
mixture, the total conversion of Z-Ala-Phe-OMe was achieved after 5
hours of substrate recycling. The product contained about 90%
peptide amide and 10% hydrolysis byproduct.
Abstract: Optimization of the rational geometrical and mechanical
parameters of a panel with curved plywood ribs is considered in this
paper. The panel consists of cylindrical plywood ribs manufactured
from Finnish plywood, upper and bottom plywood flanges, and stiffness
diaphragms, and is filled with foam. The minimal ratio of the structure's
self-weight to the load that can be applied to the structure is taken
as the rationality criterion. The optimization is done using classical
beam theory without nonlinearities, and the discrete design variables
are optimized with a genetic algorithm.
Abstract: Spatial and mobile computing are evolving. This paper
describes a smart modeling platform called "GeoSEMA". This
approach aims to model multidimensional GeoSpatial Evolutionary
and Mobile Agents. Beyond 3D and location-based issues, there
are other dimensions that may characterize spatial agents, e.g.
discrete-continuous time and agent behaviors. GeoSEMA is seen as a
dedicated design pattern motivating temporal geographic-based
applications; it is a firm foundation for multipurpose and
multidimensional spatial-based applications. It deals with
multipurpose smart objects (buildings, shapes, missiles, etc.) by
simulating geospatial agents.
Formally, GeoSEMA refers to geospatial, spatio-evolutive, and
mobile space constituents, and a conceptual geospatial space model
is given in this paper. In addition to modeling and categorizing
geospatial agents, the model incorporates the concept of inter-agent
event-based protocols. Finally, a rapid software-architecture
prototype of the GeoSEMA platform is also given. It will be
implemented and validated in the next phase of our work.
Abstract: We present design, fabrication, and characterization of
a small (12 mm × 12 mm × 8 mm) movable railway vehicle for sensor
carrying. The miniature railway vehicle (MRV) was mainly composed
of a vibrational structure and three legs. A railway was designed and
fabricated to power and guide the MRV. It also transmits the sensed
data from the MRV to the signal processing unit. The MRV, with its legs
on the railway, moves due to its high-frequency vibration. A
model was derived to describe the motion, and FEM simulations
were performed to design the legs. The MRV and the railway
were then fabricated by precision machining. Finally, an infrared sensor
was carried and tested. The results show that the unloaded MRV
moved along the railway with a maximum speed of 12.2 mm/s,
and the test signal was successfully sensed by the MRV.
Abstract: The generator of hypotheses is a new method for data mining. It makes it possible to classify the source data automatically and produces a particular enumeration of patterns. A pattern is an expression (in a certain language) describing facts in a subset of facts. The goal is to describe the source data via patterns and/or IF...THEN rules. The evaluation criteria used are deterministic (not probabilistic). The search results are trees, a form that is easy to comprehend and interpret. The generator of hypotheses uses a very effective algorithm based on the theory of monotone systems (MS), named MONSA (MONotone System Algorithm).
Abstract: In this study, Cu-mesoporous TiO2 was developed for the
removal of acid odor, combined with ozone assistance and an online
regeneration system, with and without UV irradiation (all-weather
operation). The results showed that Cu-mesoporous TiO2 presents
desirable adsorption efficiency for acid odor without UV irradiation,
due to its larger surface area, pore size, and the additional absorption
ability provided by Cu. In the photocatalysis process, the material
structure also enables Cu-mesoporous TiO2 to degrade acid odor more
efficiently. Cu also delays the recombination of the electron-hole
pairs excited in TiO2, enhancing the photodegradation ability.
Cu-mesoporous TiO2 gains a conspicuous increase in photocatalytic
ability from ozone assistance, but no benefit in adsorption. In
addition, the online regeneration procedure can treat the used
Cu-mesoporous TiO2 to restore its adsorption ability and maintain its
photodegradation performance, relying on scrubbing, desorbing the acid
odor, and reducing Cu to the metallic state.
Abstract: The mathematical framework for the study of fuzzy approximate reasoning is presented in this paper. Two important defuzzification methods (area defuzzification and height defuzzification) are described, in addition to the center of gravity method, which is the best-known defuzzification method. The continuity of the defuzzification methods and their application to fuzzy feedback control are discussed.
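For a discretized membership function, the three defuzzification methods mentioned can be sketched as follows. This is a generic Python illustration, not the paper's formulation: center of gravity as the membership-weighted mean, the area method as the point that splits the area under the membership function in half, and the height method as an activation-weighted mean of rule peaks.

```python
def center_of_gravity(xs, mu):
    """COG defuzzification: sum(x * mu(x)) / sum(mu(x))."""
    return sum(x * m for x, m in zip(xs, mu)) / sum(mu)

def area_defuzz(xs, mu):
    """Area (bisector) defuzzification: the first sample point where the
    cumulative area under mu reaches half of the total area."""
    total = sum(mu)
    acc = 0.0
    for x, m in zip(xs, mu):
        acc += m
        if acc >= total / 2.0:
            return x
    return xs[-1]

def height_defuzz(peaks, heights):
    """Height defuzzification: weighted mean of each rule's peak position,
    weighted by that rule's activation height."""
    return sum(p * h for p, h in zip(peaks, heights)) / sum(heights)
```

For a symmetric membership function all three methods coincide at the center of symmetry, which is a convenient check on an implementation.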
Abstract: In this paper we describe the design and implementation of a parallel algorithm for data assimilation with the ensemble Kalman filter (EnKF) for the oil reservoir history matching problem. The use of a large number of observations from time-lapse seismic data leads to a large turnaround time for the analysis step, in addition to the time-consuming simulations of the realizations. For efficient parallelization it is important to consider parallel computation at the analysis step. Our experiments show that parallelizing the analysis step in addition to the forecast step scales well, exploiting the same set of resources with some additional effort.
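The analysis step whose parallelization is discussed can be illustrated, for a scalar state observed directly, by the standard stochastic EnKF update. This minimal Python sketch is a generic textbook illustration under those simplifying assumptions, not the paper's parallel implementation.

```python
import random

def enkf_analysis(ensemble, y_obs, obs_var, rng=None):
    """Stochastic EnKF analysis for a scalar state with a direct observation.

    Each member is nudged toward a perturbed observation using the
    Kalman gain K = P / (P + R), where P is the ensemble variance and
    R the observation error variance."""
    rng = rng or random.Random(0)
    n = len(ensemble)
    mean = sum(ensemble) / n
    P = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    K = P / (P + obs_var)
    return [x + K * (y_obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]
```

With many observations and large ensembles, this per-member update generalizes to matrix operations whose cost motivates parallelizing the analysis step alongside the forecast simulations.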
Abstract: Quantitative investigation of the contribution of different factors toward measuring the reusability of software components can be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components in existing legacy systems, which can save the cost of developing software from scratch. But the issue of the relative significance of the contributing factors has remained relatively unexplored. In this paper, we use Taguchi's approach to analyze the significance of different structural attributes, or factors, in deciding the reusability level of a particular component. The results obtained show that complexity is the most important factor in deciding the reusability of function-oriented software. In the case of object-oriented software, coupling and complexity collectively play a significant role in high reusability.
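As a hedged illustration of the Taguchi-style analysis, the standard larger-the-better signal-to-noise ratio can be computed per factor level as below; the study's actual factors and response data are not reproduced here, so the example values are purely illustrative.

```python
import math

def snr_larger_is_better(ys):
    """Taguchi larger-the-better S/N ratio:
    S/N = -10 * log10( (1/n) * sum(1 / y_i^2) )."""
    n = len(ys)
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / n)
```

Ranking factor levels by their mean S/N ratio (and the spread between levels) is what identifies which attribute, e.g. complexity or coupling, dominates the reusability response.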
Abstract: Medical image modalities such as computed
tomography (CT), magnetic resonance imaging (MRI), ultrasound
(US), and X-ray are used to diagnose disease. These modalities
provide flexible means of reviewing anatomical cross-sections and
physiological states in different parts of the human body. Raw
medical images have huge file sizes and large storage
requirements, so their size must be reduced to make them practical
for telemedicine applications. Image compression is thus a key
factor in reducing the bit rate for transmission or
storage while maintaining an acceptable reproduction quality, but it is
natural to raise the question of how much an image can be compressed
while still preserving sufficient information for a given clinical
application. Many techniques for achieving data compression have
been introduced. In this study, three different MRI modalities which
are Brain, Spine and Knee have been compressed and reconstructed
using wavelet transform. Subjective and objective evaluation has
been done to investigate the clinical information quality of the
compressed images. For the objective evaluation, the results show
that the PSNR which indicates the quality of the reconstructed image
is ranging from (21.95 dB to 30.80 dB, 27.25 dB to 35.75 dB, and
26.93 dB to 34.93 dB) for Brain, Spine, and Knee respectively. For
the subjective evaluation test, the results show that the compression
ratio of 40:1 was acceptable for brain image, whereas for spine and
knee images 50:1 was acceptable.
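The objective metric reported above, PSNR, is computed from the mean squared error between the original and reconstructed images; a minimal Python version for 8-bit images is shown below as a generic sketch (the wavelet codec itself is not reproduced).

```python
import math

def psnr(original, reconstructed, max_val=255.0):
    """PSNR in dB: 10 * log10(MAX^2 / MSE), over flattened pixel lists."""
    mse = sum((o - r) ** 2
              for o, r in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)
```

Higher compression ratios raise the MSE of the reconstruction and therefore lower the PSNR, which is why the reported PSNR ranges shrink as the ratio grows toward the subjectively acceptable limits of 40:1 and 50:1.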
Abstract: Cryptography provides the secure manner of
information transmission over the insecure channel. It authenticates
messages based on the key but not on the user. It requires a lengthy
key to encrypt and decrypt the sent and received messages,
respectively, but these keys can be guessed or cracked. Moreover,
maintaining and sharing lengthy, random keys for the enciphering and
deciphering processes is a critical problem in cryptography
systems. A new approach is described for generating a crypto key
that is acquired from a person's iris pattern. In the biometric field, a
template created by a biometric algorithm can only be
authenticated by the same person. Among biometric templates,
iris features can efficiently distinguish individuals and
produce fewer false positives in a large population. This type of iris
code distribution provides very low intra-class variability, which aids
the cryptosystem in confidently decrypting messages with an exact
match of the iris pattern. In the proposed approach, the iris features
are extracted using multiresolution wavelets, producing a 135-bit iris
code for each subject that is used for encrypting and decrypting the
messages. Autocorrelators are used to recall the original messages
from the partially corrupted data produced by the decryption process.
It intends to resolve the repudiation and key management problems.
Results were analyzed for both the conventional iris cryptography
system (CIC) and the non-repudiation iris cryptography system (NRIC),
showing that this new approach provides considerably high
authentication accuracy in the enciphering and deciphering processes.
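A core operation implied above, comparing two binary iris codes, is typically expressed as a normalized Hamming distance. This hedged Python sketch illustrates the exact-match requirement the cryptosystem relies on; the 135-bit code generation via multiresolution wavelets is not reproduced here.

```python
def hamming_distance(code_a, code_b):
    """Fraction of differing bits between two equal-length bit strings."""
    assert len(code_a) == len(code_b)
    diff = sum(1 for a, b in zip(code_a, code_b) if a != b)
    return diff / len(code_a)

def codes_match(code_a, code_b, threshold=0.0):
    """With threshold 0.0 an exact match is required, as needed when the
    iris code itself serves as the encryption/decryption key."""
    return hamming_distance(code_a, code_b) <= threshold
```

Low intra-class variability means genuine re-acquisitions of the same iris sit near distance 0 while impostors sit far away, which is what lets a strict threshold work as a key-matching criterion.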
Abstract: Experimental investigations were made on the instability of supercritical kerosene flowing in active cooling channels. Two approaches were used to control the pressure in the channel: one is a back-pressure valve, while the other is a venturi. Under both conditions, a kind of low-frequency oscillation of pressure and temperature is observed, and the oscillation periods are calculated. By comparison with the flow time, it is concluded that the instability occurring in active cooling channels is probably a kind of density wave instability. Its period has no relationship with the cooling channel geometry or the pressure, but depends only on the flow time of the kerosene in the active cooling channels. When the mass flow rate, density, and pressure drop couple with each other, the density wave instability appears.