Abstract: In view of growing competition in the service sector,
services are as much in need of modeling, analysis and improvement
as business or working processes. Graphical process models are
important means to capture process-related know-how for an
effective management of the service process. In this contribution, a
human performance analysis of process model development paying
special attention to model development time and the working method
was conducted. It was found that modelers with higher application
experience needed significantly less time for mental activities than
modelers with lower application experience, spent more time
labeling graphical elements, and achieved higher process model
quality in terms of activity label quality.
Abstract: Skin color can provide a useful and robust cue
for human-related image analysis, such as face detection,
pornographic image filtering, hand detection and tracking,
people retrieval in databases and on the Internet. The major
problem with such skin color detection algorithms is that they are
time consuming and hence cannot be applied in real-time
systems. To overcome this problem, we introduce a new
fast technique for skin detection which can be applied in a real
time system. In this technique, instead of testing each image
pixel to label it as skin or non-skin (as in classic techniques),
we skip a set of pixels. The rationale for skipping is
the high probability that neighbors of skin pixels are
also skin pixels, especially in adult images, and vice versa. The
proposed method can rapidly detect skin and non-skin
pixels, which in turn dramatically reduces the CPU time
required for the protection process. Since many fast detection
techniques are based on image resizing, we apply our
proposed pixel skipping technique with image resizing to
obtain better results. The performance evaluation of the
proposed skipping and hybrid techniques in terms of the
measured CPU time is presented. Experimental results
demonstrate that the proposed methods achieve better results
than the relevant classic method.
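The pixel-skipping idea can be sketched as follows; note that the RGB skin rule used here is a common heuristic from the literature, not the paper's classifier, and the block-propagation step is one possible interpretation of "skipping a set of pixels":

```python
# Hypothetical sketch of pixel skipping: classify only every `step`-th
# pixel and propagate the label to the skipped neighbors.

def is_skin(r, g, b):
    # Classic RGB skin heuristic (an assumption, not the paper's model).
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)

def skin_mask_skipping(image, step=2):
    """image: 2-D list of (r, g, b) tuples; returns a 2-D list of booleans."""
    h, w = len(image), len(image[0])
    mask = [[False] * w for _ in range(h)]
    for y in range(0, h, step):
        for x in range(0, w, step):
            label = is_skin(*image[y][x])
            # Assign the same label to the skipped neighborhood block,
            # exploiting the locality of skin regions.
            for dy in range(step):
                for dx in range(step):
                    if y + dy < h and x + dx < w:
                        mask[y + dy][x + dx] = label
    return mask
```

With `step=2`, only a quarter of the pixels are actually classified, which is where the CPU-time saving comes from.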
Abstract: Depressurization and pressurization streams in
industrial systems constitute a work exchange network (WEN). In this
paper, a novel graphical approach for targeting energy conservation
potential of a WEN is proposed. Through constructing the composite
work curves in the pressure-work diagram and assuming all of the
mechanical energy of the depressurization streams is recovered by
expanders, the maximum work target of a WEN can be determined via
the proposed targeting steps. A WEN in an ammonia production
process is used as a case study to illustrate the applicability of the
proposed graphical approach.
Abstract: Constant amplitude fatigue crack growth (FCG) tests
were performed on dissimilar metal welded plates of Type 316L
Stainless Steel (SS) and IS 2062 Grade A Carbon steel (CS). The
plates were welded by TIG welding using SS E309 as the electrode. FCG
tests were carried out on Side Edge Notch Tension (SENT)
specimens of 5 mm thickness, with the crack initiator (notch) in the base
metal region (BM), weld metal region (WM) and heat affected zones
(HAZ). The tests were performed at a test frequency of 10 Hz and at
load ratios (R) of 0.1 and 0.6. The FCG rate was found to increase with
stress ratio for the weld metals and base metals, whereas in the case of
HAZ, FCG rates were almost equal at high ΔK. FCG rate of HAZ of
stainless steel was found to be lowest at low and high ΔK. At
intermediate ΔK, WM showed the lowest FCG rate. CS showed
a higher crack growth rate at all ΔK. However, the scatter band of the data
was found to be narrow. Fracture toughness (Kc) was found to vary
across different locations of the weldments. Kc was lowest for the
weldment and highest for the HAZ of stainless steel. A novel method of
characterizing the FCG behavior using an Infrared thermography
(IRT) camera was attempted. By monitoring the temperature rise at
the fast moving crack tip region, the amount of plastic deformation
was estimated.
Abstract: The acute toxicity of nano SiO2, ZnO, MCM-41
(mesoporous silica), Cu, Multi-Wall Carbon Nanotubes (MWCNT),
Single-Wall Carbon Nanotubes (SWCNT) and coated Fe to the bacterium
Vibrio fischeri was evaluated using a homemade luminometer. The
nominal effective concentrations (EC) causing 20% and 50%
inhibition of bioluminescence were calculated using two mathematical
models at exposure times of 5 and 30 minutes. The luminometer was
designed with a photomultiplier tube (PMT) detector. The luminol
chemiluminescence reaction was used for the calibration graph.
In the linear calibration range, the correlation coefficient and
coefficient of variation (CV) were 0.988 and 3.21% respectively,
demonstrating that the accuracy and reproducibility of the instrument
are suitable. An important part of this research was optimizing
the conditions for maximum bioluminescence. Cultures of
Vibrio fischeri in liquid media were stirred at 120 rpm
at a temperature of 15 °C to 18 °C and incubated for 24 to 72 hours,
while the solid medium was held at 18 °C for 48 hours. A suspension
of ZnO nanoparticles, after 30 min of contact time with
Vibrio fischeri, showed the highest toxicity
while SiO2 nanoparticles showed the lowest toxicity. After 5 min
exposure time, the toxicity of ZnO was the strongest and MCM-41
was the weakest toxicant component.
Abstract: Quantum cryptography offers a way of key agreement,
which is unbreakable by any external adversary. Authentication is
of crucial importance, as perfect secrecy is worthless if the identity
of the addressee cannot be ensured before sending important information.
Message authentication has been studied thoroughly, but no
approach seems to be able to explicitly counter meet-in-the-middle
impersonation attacks. The goal of this paper is the development of
an authentication scheme being resistant against active adversaries
controlling the communication channel. The scheme is built on top
of a key-establishment protocol and is unconditionally secure if built
upon quantum cryptographic key exchange. In general, the security
is the same as for the key-agreement protocol lying underneath.
Abstract: Reachability graph (RG) generation suffers from the
problem of exponential space and time complexity. To alleviate the
more critical problem of time complexity, this paper presents a new
approach to RG generation for Petri net (PN) models of parallel
processes. Independent RGs for each parallel process in the PN
structure are generated in parallel, and the cross-product of these RGs
yields the exhaustive state space from which the RG of the given
parallel system is determined. The complexity analysis of the
presented algorithm shows a significant decrease in the time
complexity of RG generation. The proposed technique is
applicable to parallel programs having multiple threads with
synchronization.
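The cross-product construction can be illustrated with a minimal sketch; the per-process RGs here are plain adjacency dictionaries, and the synchronization-based pruning that extracts the final RG from the exhaustive state space is omitted:

```python
from itertools import product

def cross_product_rg(rgs):
    """rgs: list of per-process reachability graphs, each a dict
    {state: [successor states]}. Returns the interleaved global graph
    {(s1, ..., sn): [successor tuples]} -- the exhaustive state space
    from which the RG of the parallel system would be determined."""
    global_rg = {}
    for combo in product(*[rg.keys() for rg in rgs]):
        succs = []
        for i, rg in enumerate(rgs):
            # Each process may fire independently: replace only its
            # local component in the global state tuple.
            for nxt in rg[combo[i]]:
                succs.append(combo[:i] + (nxt,) + combo[i + 1:])
        global_rg[combo] = succs
    return global_rg
```

The exhaustive space has the product of the per-process state counts, which is why generating the component RGs in parallel saves time.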
Abstract: In this paper an efficient implementation of the RIPEMD-160
hash function is presented. Hash functions are a special family
of cryptographic algorithms used in technological
applications with requirements for security, confidentiality and
validity. Applications like PKI, IPSec, DSA and MACs incorporate
hash functions and are widely used today. RIPEMD-160
emanated from the need for algorithms that remain very strong
under cryptanalysis. The proposed hardware implementation can be
synthesized easily for a variety of FPGA and ASIC technologies.
Simulation results, using commercial tools, verified the efficiency of
the implementation in terms of performance and throughput. Special
care has been taken so that the proposed implementation doesn't
introduce extra design complexity, while functionality was
kept at the required levels.
Abstract: The unique structural configuration found in the human foot allows easy walking. Similar movement is hard to imitate even for an ape. It is obvious that human ambulation relates to the foot structure itself. Suppose the bones are represented as vertices and the joints as edges. This leads to the development of a special graph that represents the human foot. On a footprint there are points-of-contact which touch the ground; these involve specific vertices. Theoretically, for an ideal ambulation, these points provide reactions onto the ground, i.e., the static equilibrium forces. They are arranged in sequence in the form of a path, and the ambulating footprint follows this path. Combining the human foot graph with this path results in a representation that describes the profile of an ideal ambulation. This profile cites the locations where the points-of-contact experience normal reaction forces and highlights the significance of these points.
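The bones-as-vertices, joints-as-edges representation can be sketched as a small adjacency structure; the bone names and contact sequence below are hypothetical placeholders to show the graph-plus-path idea, not anatomical data from the paper:

```python
# Hypothetical foot graph: vertices are bones, edges are joints.
foot_graph = {
    "calcaneus": {"talus", "cuboid"},
    "talus": {"calcaneus", "navicular"},
    "navicular": {"talus", "cuneiform"},
    "cuneiform": {"navicular", "metatarsal1"},
    "cuboid": {"calcaneus", "metatarsal5"},
    "metatarsal1": {"cuneiform", "hallux"},
    "metatarsal5": {"cuboid"},
    "hallux": {"metatarsal1"},
}

# Points-of-contact on the footprint, ordered as the ambulation path.
contact_path = ["calcaneus", "cuboid", "metatarsal5"]

def is_path(graph, path):
    """Check that consecutive contact points are joined in the foot graph."""
    return all(b in graph[a] for a, b in zip(path, path[1:]))
```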
Abstract: Grid environments consist of the volatile integration
of discrete heterogeneous resources. The notion of the Grid is to
unite different users and organisations and pool their resources into
one large computing platform where they can harness, inter-operate,
collaborate and interact. If the Grid Community is to achieve this
objective, then participants (Users and Organisations) need to be
willing to donate or share their resources and permit other
participants to use their resources. Resources do not have to be
shared at all times, since constant sharing may leave users without
access to their own resources. The idea of reward-based computing was
developed to address the sharing problem in a pragmatic manner.
Participants are offered a reward to donate their resources to the
Grid. A reward may include monetary recompense or a pro rata share
of available resources when constrained. This latter point may imply
a quality of service, which in turn may require some globally agreed
reservation mechanism. This paper presents a platform for economy-based
computing using the WebCom Grid middleware. Using this
middleware, participants can configure their resources at times and
priority levels to suit their local usage policy. The WebCom system
accounts for processing done on individual participants' resources
and rewards them accordingly.
Abstract: With the popularity of multi-core and many-core architectures there is a great need for software frameworks which can support parallel programming methodologies. In this paper we introduce an Eclipse toolkit, JConqurr, which is easy to use and provides robust support for flexible parallel programming. JConqurr is a multi-core and many-core programming toolkit for Java which is capable of providing support for common parallel programming patterns, including task, data, divide-and-conquer and pipeline parallelism. The toolkit uses an annotation and a directive mechanism to convert sequential code into parallel code. In addition, we have proposed a novel mechanism to achieve parallelism using graphics processing units (GPU). Experiments with common parallelizable algorithms have shown that our toolkit can be easily and efficiently used to convert sequential code to parallel code, and significant performance gains can be achieved.
Abstract: The objective of this paper is to investigate a new
approach based on the idea of pictograms for food portion sizes. This
approach adopts the model of the United States Pharmacopeia Drug
Information (USP-DI). The representation of each food portion size
is composed of three parts: the frame, the connotation of the dietary
portion size, and the layout. To investigate users' comprehension of this
approach, two experiments were conducted with 122 Taiwanese
participants, 60 male and 62 female, aged between 16 and 64 (divided
into age groups of 16-30, 31-45 and 46-64). In Experiment 1, the mean
correct rate for the understanding level of food items is 48.54%
(S.D. = 95.08) and the mean response time 2.89 sec (S.D. = 2.14). The
difference in the correct rates for different age groups is significant
(P*=0.00
Abstract: In this paper, we propose a routing scheme that guarantees
the residual lifetime of wireless sensor networks where each
sensor node operates with a limited budget of battery energy. The
scheme maximizes the communications QoS while sustaining the
residual battery lifetime of the network for a specified duration.
Communication paths of wireless nodes are translated into a directed
acyclic graph (DAG) and the maximum-flow algorithm is applied to
the graph. The resulting maximum flow is assigned to sender nodes, so
as to maximize their communication QoS. Based on assigned flows,
the scheme determines the routing path and the transmission rate of
data packets so that no sensor node on the path exhausts
its battery energy before the specified duration.
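The flow-assignment step can be illustrated with a standard Edmonds-Karp maximum-flow routine on a small capacity graph; this is a generic textbook algorithm standing in for the paper's scheme, and the node names and capacities are illustrative:

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp max-flow on a capacity dict {u: {v: cap}}."""
    # Build the residual graph, adding zero-capacity reverse edges.
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path from source to sink.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # Recover the path and its bottleneck capacity.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck
```

In the paper's setting the edge capacities would encode per-node energy budgets, so the computed flow bounds the transmission rates that keep the network alive for the specified duration.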
Abstract: In Geographic Information Systems, one source of
needed geographic data is digitizing analog maps and
evaluating aerial and satellite photos. In this study, a method is
discussed which can be used to extract vector features and
create vectorized drawing files from aerial photos; a software tool
was also developed for this purpose. Converting from raster to
vector, also known as vectorization, is the most important step
in creating vectorized drawing files. In the developed algorithm,
preprocessing is first performed on the aerial photo:
converting to grayscale if necessary, reducing noise, applying
filters, and detecting object edges. After these steps,
every pixel of the photo is traversed from upper left
to lower right by examining its neighborhood relationships, and one-pixel-wide
lines or polylines are obtained. Traced lines have to be
removed to prevent confusion as vectorization continues, because
otherwise they could be perceived as new lines; however, erasing them
outright could cause discontinuities in the vector drawing, so the image
is converted from 2-bit to 8-bit and the detected pixels are marked with
a different value. In conclusion, the aerial photo can be converted to a
vector form which includes lines and polylines and can be opened in any
CAD application.
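The line-following step can be sketched as follows, assuming an 8-neighborhood and a grid where foreground pixels are 1; marking traced pixels with the value 2 stands in for the described trick of re-encoding the image so traced pixels are distinguishable rather than erased:

```python
# Minimal sketch of one-pixel-wide line tracing (an interpretation of
# the described algorithm, not the authors' implementation).

NEIGHBORS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
             (0, 1), (1, -1), (1, 0), (1, 1)]

def trace_lines(image):
    """image: 2-D list of ints (1 = foreground). Mutates the image,
    marking traced pixels as 2, and returns a list of polylines."""
    h, w = len(image), len(image[0])
    polylines = []
    for y in range(h):          # scan from upper left to lower right
        for x in range(w):
            if image[y][x] != 1:
                continue
            line = [(y, x)]
            image[y][x] = 2     # mark instead of erasing
            cy, cx = y, x
            while True:
                for dy, dx in NEIGHBORS:
                    ny, nx = cy + dy, cx + dx
                    if 0 <= ny < h and 0 <= nx < w and image[ny][nx] == 1:
                        image[ny][nx] = 2
                        line.append((ny, nx))
                        cy, cx = ny, nx
                        break
                else:
                    break       # no untraced neighbor: the line ends
            polylines.append(line)
    return polylines
```

Because traced pixels keep a distinct value instead of being deleted, later passes can still see where lines ran, avoiding the discontinuity problem described above.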
Abstract: With the advance of multimedia and diagnostic
imaging technologies, the number of radiographic images is increasing
constantly. The medical field demands sophisticated systems for
searching and retrieving the produced multimedia documents. This
paper presents an ongoing research that focuses on the semantic
content of radiographic image documents to facilitate semantic-based
radiographic image indexing and retrieval. The proposed
model divides a radiographic image document, based on its
semantic content, into a logical structure and
a semantic structure. The logical structure represents the overall
organization of information. The semantic structure, which is bound
to logical structure, is composed of semantic objects with
interrelationships in the various spaces in the radiographic image.
Abstract: This paper presents an interactive modeling system for
polyhedra using isomorphic graphs. In particular, Conway
polyhedron notation is implemented, and the notation can be observed
as an interactive animation.
Abstract: Financial forecasting using machine learning techniques has received great attention in the last decade. In this ongoing work, we show how machine learning of graphical models is able to infer visualized causal interactions between different banks in the Saudi equities market. One important discovery from such learned causal graphs is how companies influence each other and to what extent. In this work, a set of graphical models, namely Gaussian graphical models, with developed ensemble penalized feature selection methods that combine a filtering method, a wrapper method and a regularizer will be shown. A comparison between these different developed ensemble combinations will also be shown. The best ensemble method will be used to infer the causal relationships between banks in the Saudi equities market.
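The core Gaussian-graphical-model idea, reading conditional-dependence edges off the precision (inverse covariance) matrix, can be sketched on synthetic data; the bank names, threshold and data below are illustrative, and the paper's ensemble penalized feature selection is not reproduced:

```python
import numpy as np

# Synthetic daily returns: BankB depends on BankA, BankC is independent.
rng = np.random.default_rng(0)
n, names = 500, ["BankA", "BankB", "BankC"]
a = rng.normal(size=n)
b = 0.8 * a + 0.2 * rng.normal(size=n)
c = rng.normal(size=n)
X = np.column_stack([a, b, c])

# Partial correlations from the precision matrix: in a Gaussian
# graphical model, a zero entry means conditional independence.
prec = np.linalg.inv(np.cov(X, rowvar=False))
d = np.sqrt(np.diag(prec))
partial_corr = -prec / np.outer(d, d)

# Keep an edge where the partial correlation is large (arbitrary cutoff).
edges = [(names[i], names[j])
         for i in range(len(names)) for j in range(i + 1, len(names))
         if abs(partial_corr[i, j]) > 0.3]
```

On this toy data the only surviving edge is BankA-BankB, mirroring how the learned graph would visualize which banks influence each other.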
Abstract: The steady state response of bond graphs representing
passive and active suspensions is presented. A bond graph with a
preferred derivative causality assignment is proposed to obtain the
steady state, and a general junction structure of this bond graph
is given. The proposed methodology is applied to passive and active
suspensions.
Abstract: The availability of broadband internet and increased
access to computers has been instrumental in the rise of internet
literacy in Malaysia. This development has led to the adoption of
online shopping by many Malaysians. On another note, the
Government has supported the development and production of local
herbal products. This has resulted in an increase in the production and
diversity of products by SMEs. The purpose of this study is to
evaluate the influence of the Malaysian demographic factors and
selected attitudinal characteristics in relation to the online purchasing
of herbal products. In total, 1054 internet users were interviewed
online and Chi-square analysis was used to determine the relationship
between demographic variables and different aspects of online
shopping for herbal products. The overall results show that the
demographic variables such as age, gender, education level, income
and ethnicity were significant when considering the online shopping
antecedents of trust, quality of herbal products, perceived risks and
perceived benefits.
Abstract: Although the STL (stereolithography) file format is
widely used as a de facto industry standard in the rapid prototyping
industry due to its simplicity and its ability to tessellate almost all
surfaces, there are always some defects and shortcomings in its
usage, many of which are difficult to correct manually. In
processing complex models, the size of the file and its defects grow
dramatically, and correcting STL files therefore becomes difficult. In this
paper, by optimizing the existing algorithms, the size of the files and
the computer memory needed to process them are reduced. Regardless
of the type and extent of the errors in STL files, a tail-to-head
searching method and analysis of the nearest distance between tails
and heads were used. As a result, STL models are sliced
rapidly, and fully closed contours are produced effectively and without error.
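One plausible reading of tail-to-head searching with nearest-distance analysis can be sketched on 2-D slice segments; the function, tolerance and data below are illustrative, not the paper's implementation:

```python
import math

def chain_contours(segments, tol=1e-6):
    """segments: list of (head, tail) pairs of 2-D points from slicing.
    Grows each contour by attaching the segment whose head is nearest
    to the current tail, closing when it returns to the start."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    remaining = list(segments)
    contours = []
    while remaining:
        head, tail = remaining.pop(0)
        contour = [head, tail]
        # Nearest-head search from the current tail (tail-to-head chaining).
        while dist(contour[-1], contour[0]) > tol and remaining:
            i = min(range(len(remaining)),
                    key=lambda k: dist(remaining[k][0], contour[-1]))
            contour.append(remaining.pop(i)[1])
        contours.append(contour)
    return contours
```

The tolerance absorbs small gaps left by defective facets, which is how nearest-distance matching can still close contours in imperfect STL data.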