Abstract: In this paper, we propose a fast and efficient method for drawing very large-scale graph data. The conventional force-directed method proposed by Fruchterman and Reingold (the FR method) is well known. It defines repulsive forces between every pair of nodes and attractive forces between nodes connected by an edge, and calculates the corresponding potential energy. An optimal layout is obtained by iteratively updating node positions to minimize the potential energy. Here, the positions of all nodes are updated simultaneously at every global time step. In the proposed method, each node has its own individual time and time step, and nodes are updated at different frequencies depending on the local situation. The proposed method is inspired by the hierarchical individual time step method used for high-accuracy calculations of dense particle fields, such as star clusters, in astrophysical dynamics. Experiments show that the proposed method outperforms the original FR method in both speed and accuracy. We implement the proposed method on the MDGRAPE-3 PCI-X special-purpose parallel computer and achieve a speedup of several hundred times.
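The FR baseline referred to above can be sketched compactly. The following is a minimal Python illustration (function and parameter names are ours, not the paper's): repulsive forces between every pair, attractive forces along edges, and one shared, shrinking step cap ("temperature") per global time step, i.e., exactly the shared-time-step update that the proposed per-node scheme replaces.

```python
import numpy as np

def fruchterman_reingold(n_nodes, edges, iterations=50, seed=0):
    """Minimal sketch of the Fruchterman-Reingold layout (illustrative,
    not the paper's implementation)."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-0.5, 0.5, size=(n_nodes, 2))
    k = np.sqrt(1.0 / n_nodes)          # ideal pairwise distance
    t = 0.1                             # initial temperature (step cap)
    for _ in range(iterations):
        disp = np.zeros_like(pos)
        for i in range(n_nodes):        # repulsion: f_r(d) = k^2 / d
            delta = pos[i] - pos
            d = np.linalg.norm(delta, axis=1) + 1e-12
            d[i] = np.inf               # no self-repulsion
            disp[i] += (delta * (k**2 / d**2)[:, None]).sum(axis=0)
        for a, b in edges:              # attraction: f_a(d) = d^2 / k
            delta = pos[a] - pos[b]
            f = delta * (np.linalg.norm(delta) / k)
            disp[a] -= f
            disp[b] += f
        # every node moves at most t in this shared global step, then cool
        lengths = np.linalg.norm(disp, axis=1) + 1e-12
        pos += disp / lengths[:, None] * np.minimum(lengths, t)[:, None]
        t *= 0.95
    return pos
```

In this shared scheme every node pays the full per-step cost even in quiet regions; the paper's individual time steps let slowly moving nodes be updated less often.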
Abstract: The evolution of logic circuits, which falls under the heading of evolvable hardware, is carried out by evolutionary algorithms (EAs). These algorithms are able to automatically configure reconfigurable devices. One of the main difficulties in developing evolvable hardware with the ability to design functional electrical circuits is choosing the most favourable EA features, such as the fitness function, chromosome representation, population size, genetic operators, and individual selection. Until now, several researchers from the evolvable hardware community have used and tuned these parameters, and various rules on how to select the value of a particular parameter have been proposed. However, to date, no one has presented a study regarding the size of the chromosome representation (circuit layout) to be used as a platform for the evolution in order to increase evolvability, reduce the number of generations, and optimize digital logic circuits by reducing the number of logic gates. In this paper, this topic has been thoroughly investigated and optimal parameters for these EA features are proposed. The evolution of logic circuits has been carried out by an extrinsic evolvable hardware system which uses a (1+λ) evolution strategy as the core of the evolution.
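The (1+λ) evolution strategy named above can be sketched in generic form. This is a bit-string version with illustrative names; in the actual system the genome would encode a gate-level circuit layout and the fitness would score circuit behaviour.

```python
import random

def one_plus_lambda(fitness, genome_len, lam=4, generations=200,
                    mut_rate=0.05, seed=0):
    """(1+lambda) ES on a bit-string genome: keep one parent, create
    lam mutants per generation, and promote a child whose fitness is
    at least the parent's (ties favour the offspring, allowing
    neutral drift)."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(genome_len)]
    best_f = fitness(parent)
    for _ in range(generations):
        for _ in range(lam):
            # flip each bit independently with probability mut_rate
            child = [b ^ (rng.random() < mut_rate) for b in parent]
            f = fitness(child)
            if f >= best_f:
                parent, best_f = child, f
    return parent, best_f
```

With `fitness=sum` (the OneMax toy problem) the loop steadily accumulates 1-bits, which mimics how the hardware system accumulates correct output rows of a truth table.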
Abstract: In this paper, a lattice Boltzmann simulation of turbulent natural convection with large-eddy simulation (LES) in a water-filled square cavity is investigated. The present results are validated against the findings of other investigations carried out with different numerical methods. Calculations were performed for high Rayleigh numbers of Ra = 10^8 and 10^9. The results confirm that this method is in acceptable agreement with other verifications of such a flow. This investigation attempts to present the large-eddy turbulence flow model within the lattice Boltzmann method (LBM) in a clear and simple formulation. The effects of increasing the Rayleigh number are displayed through streamlines, isotherm contours, and the average Nusselt number. The results show that the average Nusselt number increases as the Rayleigh number grows.
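A single collision-and-streaming step of the underlying lattice Boltzmann method (BGK collision, D2Q9 lattice) can be sketched as below. This is a generic sketch, not the paper's LES code; in the LES variant the relaxation time `tau` would additionally carry a Smagorinsky eddy-viscosity contribution computed from local strain rates, and a buoyancy force term would couple the flow to temperature.

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, ux, uy):
    """Second-order equilibrium distribution of the BGK model."""
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    usq = ux**2 + uy**2
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def lbm_step(f, tau):
    """One BGK collision followed by periodic streaming."""
    rho = f.sum(axis=0)
    ux = (c[:, 0, None, None] * f).sum(axis=0) / rho
    uy = (c[:, 1, None, None] * f).sum(axis=0) / rho
    f = f - (f - equilibrium(rho, ux, uy)) / tau      # collision
    for i in range(9):                                # streaming
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
    return f
```

Both collision and streaming conserve total mass exactly, which is a quick sanity check for any LBM implementation.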
Abstract: As the number of networked computers grows, intrusion detection is an essential component in keeping networks secure. Various approaches to intrusion detection are currently in use, each with its own merits and demerits. This paper presents our work to test and improve the performance of a new class of decision tree, the C-fuzzy decision tree, for detecting intrusions. The work also includes identifying the best candidate feature subset for building an efficient C-fuzzy decision tree based Intrusion Detection System (IDS). We investigated the usefulness of the C-fuzzy decision tree for developing an IDS with a data partition based on horizontal fragmentation. Empirical results indicate the usefulness of our approach in developing an efficient IDS.
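The clustering step at the heart of a C-fuzzy decision tree is fuzzy c-means: each tree node groups its records into c fuzzy clusters, and the tree grows by re-clustering the most heterogeneous node. A minimal fuzzy c-means sketch (illustrative, not the paper's implementation) follows.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Standard fuzzy c-means: alternate between computing cluster
    centers from weighted means and updating fuzzy memberships from
    inverse distances (fuzzifier m > 1)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)           # random fuzzy memberships
    for _ in range(iters):
        um = U ** m
        centers = (um.T @ X) / um.sum(axis=0)[:, None]
        # distances of every point to every center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
        inv = d ** (-2.0 / (m - 1))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U
```

In an IDS setting, `X` would hold the selected feature subset of the training connections, and a record's membership vector decides which child node it flows to.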
Abstract: Water-borne polyurethane (PU) based on a newly prepared hyperbranched poly(amine-ester) (HBPAE) was applied and evaluated as an organic coating material. HBPAE was prepared through a one-pot synthesis between trimethylol propane as a core and an AB2 branched monomer obtained via Michael addition of methyl methacrylate (MMA) and diethanol amine (DEA). PU was prepared from HBPAE using different ratios of toluene diisocyanate (TDI) to form a cured coating film. The prepared HBPAE was characterized using GPC, FT-IR, and 1H-NMR. The mechanical properties (impact, hardness, adhesion, and flexibility), thermal properties (DSC and TGA), and chemical resistance of the applied film were evaluated. The results indicated that 50% TDI is the optimal ratio. This formulation represents a promising candidate for use as a coating material.
Abstract: During the past several years, face recognition in video has received significant attention. Not only the wide range of commercial and law enforcement applications but also the availability of feasible technologies after several decades of research contribute to this trend. Although current face recognition systems have reached a certain level of maturity, their development is still limited by the conditions imposed by many real applications. For example, recognition of faces in video sequences acquired in an open environment, with changes in illumination and/or pose, facial occlusion, and/or low resolution of the acquired images, remains a largely unsolved problem. In other words, algorithms that can handle these conditions are yet to be developed. This paper provides an up-to-date survey of video-based face recognition research. To present a comprehensive survey, we categorize existing video-based recognition approaches and present detailed descriptions of representative methods within each category. In addition, relevant topics such as real-time detection and real-time tracking in video, and issues such as illumination, pose, 3D, and low resolution, are covered.
Abstract: Flash floods are natural disasters that can cause casualties and destruction of infrastructure. The problem is that flash floods, particularly in arid and semi-arid zones, take place within a very short time. It is therefore important to forecast flash floods ahead of their occurrence, with a lead time of up to 48 hours, to give early warning alerts that avoid or minimize disasters. The flash flood that took place over Wadi Watier, Sinai Peninsula, on October 24th, 2008, has been simulated, investigated, and analyzed using a state-of-the-art regional weather model. The Weather Research and Forecasting (WRF) model, a reliable short-term forecasting tool for precipitation events, has been utilized over the study area. The model results have been calibrated against the real rainfall measurements, for the same date and time, recorded at the Sorah gauging station. The WRF model forecasted a total rainfall of 11.6 mm, while the measured value was 10.8 mm. The calibration shows significant consistency between the WRF model results and the real measurements.
Abstract: Based on general proportional integral (GPI) observers and the sliding mode control technique, a robust control method is proposed for the master-slave synchronization of chaotic systems in the presence of parameter uncertainty and with a partially measurable output signal. Using a GPI observer, the master dynamics are reconstructed from observations of a measurable output within the differential algebraic framework. Driven by the signals provided by the GPI observer, a sliding mode control technique is used for the tracking control and synchronization of the master-slave dynamics. Convincing numerical results reveal that the proposed method is effective and successfully accommodates system uncertainties, disturbances, and noise corruption.
Abstract: In this paper, we propose a Connect6 solver which
adopts a hybrid approach based on a tree-search algorithm and image
processing techniques. The solver must deal with the complicated
computation and provide high performance in order to make real-time
decisions. The proposed approach enables the solver to be
implemented on a single Spartan-6 XC6SLX45 FPGA produced by
XILINX without using any external devices. The compact
implementation is achieved through image processing techniques to
optimize a tree-search algorithm for the Connect6 game. Tree search is widely used in computer games, and an optimal search yields the best move in every turn. Thus, many tree-search algorithms, such as the minimax algorithm, and artificial intelligence approaches have been proposed in this field. However, there is one fundamental problem in this area: the computation time increases rapidly with the growth of the game tree. For a hardware implementation this means that the larger the game tree is, the bigger the circuit is, because of its highly parallel computation characteristics.
Here, this paper aims to reduce the size of a Connect6 game tree using
image processing techniques and its position symmetric property. The
proposed solver is composed of four computational modules: a
two-dimensional checkmate strategy checker, a template matching
module, a skilful-line predictor, and a next-move selector. These
modules work well together in selecting next moves from some
candidates and the total amount of their circuits is small. The details of
the hardware design for an FPGA implementation are described and
the performance of this design is also shown in this paper.
Abstract: All-to-all personalized communication, also known as complete exchange, is one of the densest communication patterns in parallel computing. In this paper, we propose new indirect algorithms for complete exchange on all-port ring and torus networks. The new algorithms fully utilize all communication links and transmit messages along shortest paths to completely achieve the theoretical lower bounds on message transmission, which have not been achieved by other existing indirect algorithms. For a 2D r × c ( r % c ) all-port torus, the algorithm achieves optimal transmission cost with an O(c) message startup cost. In addition, the proposed algorithms accommodate non-power-of-two tori, where the number of nodes in each dimension need not be a power of two or a square. Finally, the algorithms are conceptually simple and symmetrical for every message and every node, so that they can be easily implemented and achieve the optimum in practice.
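As a simple point of reference for the shortest-path routing the algorithms build on, the sketch below simulates a naive hop-by-hop complete exchange on a bidirectional ring. It deliberately ignores link contention and the conflict-free scheduling that the proposed algorithms contribute; it only verifies that every personalized message can reach its destination along a shortest ring path. All names are illustrative.

```python
def ring_shortest_hops(n, src, dst):
    """Shortest path length between src and dst on a bidirectional n-ring."""
    d = (dst - src) % n
    return min(d, n - d)

def complete_exchange_naive(n):
    """Every node holds one personalized message for every other node;
    each message moves one hop per time step in its shorter ring
    direction until delivered. Returns total steps and per-message hops."""
    msgs = [(s, d, s) for s in range(n) for d in range(n) if s != d]
    hops = {(s, d): 0 for s, d, _ in msgs}
    steps = 0
    while any(cur != d for _, d, cur in msgs):
        nxt = []
        for s, d, cur in msgs:
            if cur == d:
                nxt.append((s, d, cur))
                continue
            fwd = (d - cur) % n        # remaining forward distance
            cur = (cur + 1) % n if fwd <= n - fwd else (cur - 1) % n
            hops[(s, d)] += 1
            nxt.append((s, d, cur))
        msgs = nxt
        steps += 1
    return steps, hops
```

Every message here takes exactly its shortest-path hop count; turning this traffic into a schedule in which each link carries at most one message per step, while still meeting the transmission lower bound, is the problem the paper's algorithms solve.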
Abstract: Medical image segmentation based on image smoothing followed by edge detection assumes a great degree of importance in the field of image processing. In this regard, this paper proposes a novel algorithm for medical image segmentation based on vigorous smoothing, achieved by identifying the type of noise, and an edge detection methodology that shows promise for medical image diagnosis. The main objective of this algorithm is to take a particular medical image as input, preprocess it to remove the noise content by employing a suitable filter after identifying the type of noise, and finally carry out edge detection for image segmentation. The algorithm consists of three parts. First, the type of noise present in the medical image is identified as additive, multiplicative, or impulsive by analysis of local histograms, and the image is denoised by employing a Median, Gaussian, or Frost filter accordingly. Second, edge detection of the filtered medical image is carried out using the Canny edge detection technique. The third part is the segmentation of the edge-detected medical image by the method of normalized-cut eigenvectors. The method is validated through experiments on real images. The proposed algorithm has been simulated on the MATLAB platform. The simulation results show that the proposed algorithm is very effective and can deal with low-quality or marginally vague images with high spatial redundancy, low contrast, and considerable noise, and that it has potential for practical use in medical image diagnosis.
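Two building blocks of such a pipeline can be sketched in a few lines. These are simplified stand-ins, not the paper's MATLAB implementation: a plain 3x3 median filter for the impulsive-noise branch, and a Sobel gradient-magnitude map in place of the full Canny detector (which adds non-maximum suppression and hysteresis thresholding).

```python
import numpy as np

def median_filter(img, size=3):
    """3x3 median filter: the classic remedy for impulsive (salt-and-
    pepper) noise, replacing each pixel by its neighbourhood median."""
    pad = size // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.empty_like(img, dtype=float)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.median(padded[i:i + size, j:j + size])
    return out

def sobel_edges(img, thresh=0.5):
    """Gradient-magnitude edge map via Sobel kernels; pixels above
    thresh * max(magnitude) are marked as edges."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    padded = np.pad(img, 1, mode='edge')
    gx = np.zeros_like(img, dtype=float)
    gy = np.zeros_like(img, dtype=float)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            win = padded[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    mag = np.hypot(gx, gy)
    return mag > thresh * mag.max()
```

Filtering before edge detection matters: an isolated noise pixel would otherwise produce a spurious local gradient maximum that the segmentation stage must then explain away.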
Abstract: The aim of this study is to test the "work values" inventory developed by Tevruz and Turgut and to utilize the concept in a model which aims to create a greater understanding of the work experience. In the study, the multiple effects of work values, work-value congruence, and work centrality on organizational citizenship behavior are examined. In this respect, it is hypothesized that work values and work-value congruence predict organizational citizenship behavior through work centrality. The work-goal congruence test and Tevruz and Turgut's work values inventory were administered, along with Kanungo's work centrality test and Podsakoff et al.'s [47] organizational citizenship behavior test, to employees working in Turkish SMEs. The study validated that Tevruz and Turgut's work values inventory and the work-value congruence test are reliable and can be used in future research. The study revealed a mediating role of work centrality only for the relationship between work values and the responsibility dimension of citizenship behavior. Most importantly, this study brought in an important concept, work-value congruence, which enables a better understanding of work values and their relation to various attitudinal variables.
Abstract: The system development life cycle (SDLC) is a process used during the development of any system. The SDLC consists of four main phases: analysis, design, implementation, and testing. During the analysis phase, a context diagram and data flow diagrams are used to produce the process model of a system. Consistency between the context diagram and the lower-level data flow diagrams is very important for smoothing the development process of a system. However, manually checking consistency from the context diagram to the lower-level data flow diagrams using a checklist is a time-consuming process. At the same time, the limitation of human ability to detect errors is one of the factors that influence the correctness and balancing of the diagrams. This paper presents a tool that automates the consistency check between Data Flow Diagrams (DFDs) based on the rules of DFDs. The tool serves two purposes: as an editor to draw the diagrams and as a checker to check the correctness of the diagrams drawn. The consistency check from the context diagram to the lower-level data flow diagrams is embedded inside the tool to overcome the manual checking problem.
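The core balancing rule such a checker enforces — the data flows entering and leaving a parent process must reappear in its decomposition, and no new external flows may appear — can be sketched as follows. The function and flow names are illustrative; a real tool would also check flow directions, process numbering, and external entities.

```python
def check_consistency(parent_flows, child_flows):
    """Balancing check between a context diagram (or any parent
    process) and its lower-level DFD: report flows missing from the
    decomposition, flows invented by it, and the overall verdict."""
    parent, child = set(parent_flows), set(child_flows)
    return {
        "missing_in_child": sorted(parent - child),   # rule violation
        "extra_in_child": sorted(child - parent),     # rule violation
        "balanced": parent == child,
    }
```

For example, a context diagram exposing flows {"order", "invoice"} against a level-1 DFD exposing {"order", "invoice", "log"} would be flagged as unbalanced with "log" as an extra external flow.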
Abstract: This paper presents a new problem solving approach
that is able to generate optimal policy solution for finite-state
stochastic sequential decision-making problems with high data
efficiency. The proposed algorithm iteratively builds and improves
an approximate Markov Decision Process (MDP) model, along with cost-to-go value approximations, by generating finite-length trajectories through the state space. The approach creates a synergy between an
approximate evolving model and approximate cost-to-go values to
produce a sequence of improving policies finally converging to the
optimal policy through an intelligent and structured search of the
policy space. The approach modifies the policy update step of the
policy iteration so as to result in a speedy and stable convergence to
the optimal policy. We apply the algorithm to a non-holonomic
mobile robot control problem and compare its performance with
other Reinforcement Learning (RL) approaches, namely Q-learning, Watkins' Q(λ), and SARSA(λ).
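For reference, the classic policy iteration whose policy-update step the proposed method modifies can be sketched on a small finite MDP. This is illustrative textbook code, not the paper's algorithm: it assumes the true model (transition matrices and rewards) is known, whereas the paper builds an approximate model from sampled trajectories.

```python
import numpy as np

def policy_iteration(P, R, gamma=0.9):
    """Classic policy iteration on a finite MDP.
    P[a][s, s'] : transition probabilities under action a
    R[s, a]     : expected immediate reward."""
    n_s, n_a = R.shape
    policy = np.zeros(n_s, dtype=int)
    while True:
        # policy evaluation: solve (I - gamma * P_pi) V = R_pi exactly
        P_pi = np.array([P[policy[s]][s] for s in range(n_s)])
        R_pi = R[np.arange(n_s), policy]
        V = np.linalg.solve(np.eye(n_s) - gamma * P_pi, R_pi)
        # policy improvement: greedy one-step lookahead on Q(s, a)
        Q = np.array([R[:, a] + gamma * P[a] @ V for a in range(n_a)]).T
        new_policy = Q.argmax(axis=1)
        if np.array_equal(new_policy, policy):
            return policy, V
        policy = new_policy
```

On a two-state toy MDP where action 1 swaps states, action 0 stays, and only staying in state 1 is rewarded, the method converges to "move, then stay" with values V = (9, 10) at gamma = 0.9.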
Abstract: Nowadays pharmaceutical care departments located in
hospitals are amongst the important pillars of the healthcare system.
The aim of this study was to evaluate quality of hospital drugstores
affiliated with Kermanshah University of Medical Sciences.
In this cross-sectional study a validated questionnaire was used. The questionnaire was filled in by one of the researchers in all seventeen hospital drugstores located in the teaching and non-teaching hospitals affiliated with Kermanshah University of Medical Sciences. The results show that in the observed hospitals, 24% of pharmacy environments, 25% of pharmacy store and storage conditions, 49% of storage procedures, 25% of drug and supply ordering, 73% of receiving supplies (proper procedures are followed for receiving supplies), 35% of receiving supplies (prompt action is taken if deterioration of received drugs is suspected), 23.35% of drug delivery to patients, and finally 0% of stock-card use for proper inventory control are in full compliance with standards.
Abstract: Contact centres have been exemplars of scientific management in the discipline of operations management for more than a decade now. With the movement of industries from a resource-based economy to a knowledge-based economy, businesses have started to realize that customer centricity is the key to sustainability amidst the high velocity of the market. However, as technologies have converged and advanced, so have the contact centres. Contact centres have redirected supply chains, and the concept of retailing has been greatly diminished due to an excessive emphasis on cost-reduction strategies. In conditions of high environmental velocity, together with services featuring considerable information intensity, contact centres will require up-to-date and enlightened agents to satisfy the demands placed upon them by those requesting their services. In this paper we examine salient factors such as power distance, knowledge structures, and the dynamics of job specialisation and enlargement to suggest critical success factors in the domain of contact centres.
Abstract: Recently, permeable breakwaters have been suggested to overcome the disadvantages of fully protective breakwaters. These protection structures have minor impacts on the coastal environment and neighboring beaches, while providing more economical protection from waves and currents. For regular waves, a numerical model (FLOW-3D, VOF) is used to investigate the hydraulic performance of a permeable breakwater. The model of the permeable breakwater consists of a pair of identical vertical slotted walls with impermeable upper and lower parts, where the draft is a decimal multiple of the total depth. The middle part is permeable with a porosity of 50%. The second barrier is located at distances of 0.5 and 1.5 times the water depth from the first one. The numerical model is validated by comparisons with previous laboratory data and semi-analytical results for the same model. Good agreement between the numerical results and both the laboratory data and the semi-analytical results has been shown, and the results indicate the applicability of the numerical model to reproduce most of the important features of the interaction. Through the numerical investigation, the friction factor of the model is carefully discussed.
Abstract: Lateral-torsional buckling (LTB) is one of the phenomena controlling the ultimate bending strength of steel I-beams carrying distributed loads on the top flange. Built-up I-sections
are used as main beams and distributors. This study investigates the
ultimate bending strength of such beams with sections of different
classes including slender elements. The nominal strengths of the
selected beams are calculated for different unsupported lengths
according to the provisions of the American Institute of Steel Construction (AISC-LRFD). These calculations are compared with
results of a nonlinear inelastic study using accurate FE model for this
type of loading. The goal is to investigate the performance of the
provisions for the selected sections. A continuous distributed load at the top flange of the beams was applied in the FE model. Imperfections of different magnitudes were introduced into the FE model to examine their effect on the LTB of the beams at failure and, hence, on the ultimate strength of the beams. The study also introduces a
procedure for evaluating the performance of the provisions compared
with the accurate FEA results of the selected sections. A simplified
design procedure is given and recommendations for future code
updates are made.
Abstract: A new tool path planning method for 5-axis flank
milling of a globoidal indexing cam is developed in this paper. The
globoidal indexing cam is a practical transmission mechanism due
to its high transmission speed, accuracy and dynamic performance.
Machining the cam profile is a complex and precise task. The profile
surface of the globoidal cam is generated by the conjugate contact
motion of the roller. The generated complex profile surface is usually
machined by 5-axis point-milling method. The point-milling method
is time-consuming compared with flank milling. The tool path for
5-axis flank milling of globoidal cam is developed to improve the
cutting efficiency. The flank milling tool path is globally optimized
according to the minimum zone criterion, and high accuracy is
guaranteed. The computational example and cutting simulation finally
validate the developed method.
Abstract: Hierarchical classification is a problem with applications in many areas, such as protein function prediction, where the data are hierarchically structured. It is therefore necessary to develop algorithms able to induce hierarchical classification models. This paper presents experiments using the algorithm for hierarchical classification called Multi-label Hierarchical Classification using a Competitive Neural Network (MHC-CNN). It was tested on ten datasets from the Gene Ontology (GO) Cellular Component domain. The results are compared with those of Clus-HMC and Clus-HSC using the hF-measure.