Abstract: This study proposes a new recommender system based on collaborative folksonomy. The purpose of the proposed system is to recommend Internet resources (such as books, articles, documents, pictures, audio and video) to users. The proposed method includes four steps: creating the user profile based on tags, grouping similar users into clusters using agglomerative hierarchical clustering, finding similar resources based on the user's past collections by using content-based filtering, and recommending similar items to the target user. This study examines the system's performance on a dataset collected from "del.icio.us," a well-known social bookmarking website. Experimental results show that the proposed hybrid recommender system, which combines tag-based collaborative filtering and content-based filtering, is promising and effective for folksonomy-based bookmarking websites.
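The first step of the pipeline described above, building a tag-based user profile, and the similarity measure that the clustering and filtering steps would rely on can be sketched as follows. This is a minimal illustration with invented tags and URLs; the abstract does not specify the similarity function, and cosine similarity is assumed here as a common choice:

```python
from math import sqrt

def tag_profile(bookmarks):
    """Build a user profile as a tag -> frequency map from tagged bookmarks."""
    profile = {}
    for tags in bookmarks.values():
        for tag in tags:
            profile[tag] = profile.get(tag, 0) + 1
    return profile

def cosine(p, q):
    """Cosine similarity between two sparse tag-frequency profiles."""
    dot = sum(w * q[t] for t, w in p.items() if t in q)
    norm = sqrt(sum(w * w for w in p.values())) * sqrt(sum(w * w for w in q.values()))
    return dot / norm if norm else 0.0

# invented example data: two users' bookmark-to-tags maps
alice = tag_profile({"url1": ["python", "web"], "url2": ["python", "ml"]})
bob = tag_profile({"url3": ["python", "ml"], "url4": ["ml", "stats"]})
```

Profiles built this way feed directly into agglomerative clustering (users as vectors) and content-based matching (resources as tag vectors).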
Abstract: A simple algorithm is presented for the fast calculation of the kernel functions required in fluid simulations using the Smoothed Particle Hydrodynamics (SPH) method. The proposed algorithm improves the linked-list algorithm and adopts the pair-wise interaction technique, both of which are widely used for evaluating kernel functions in SPH fluid simulations. The algorithm is easy to implement without any programming complexity. Benchmark examples are used to show the simulation time saved by the proposed algorithm. Parametric studies on the number of sub-domain divisions, the smoothing length and the total number of particles are conducted to show the effectiveness of the present technique. A compact formulation is proposed for practical use.
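A cell-based linked-list neighbor search with pair-wise interaction can be sketched as follows. This is a generic 2D illustration under assumed conventions (cell side equal to the smoothing length h, a half stencil of adjacent cells so each pair is visited once), not the paper's exact formulation:

```python
from math import floor

def build_cells(positions, h):
    """Hash each particle index into a square cell of side h (the smoothing length)."""
    cells = {}
    for i, (x, y) in enumerate(positions):
        cells.setdefault((floor(x / h), floor(y / h)), []).append(i)
    return cells

def neighbor_pairs(positions, h):
    """Return unique particle pairs closer than h, scanning only a half stencil
    of adjacent cells so every pair is visited exactly once (pair-wise interaction)."""
    cells = build_cells(positions, h)
    pairs = []
    for (cx, cy), members in cells.items():
        for dx in (0, 1):
            for dy in (-1, 0, 1):
                if dx == 0 and dy < 0:          # half stencil: skip mirrored cell
                    continue
                other = cells.get((cx + dx, cy + dy), [])
                for i in members:
                    for j in other:
                        if (dx, dy) == (0, 0) and j <= i:
                            continue            # same cell: count each pair once
                        (xi, yi), (xj, yj) = positions[i], positions[j]
                        if (xi - xj) ** 2 + (yi - yj) ** 2 < h * h:
                            pairs.append((i, j))
    return pairs
```

Because only particles in adjacent cells are compared, the cost drops from O(n²) all-pairs checks to roughly O(n) for uniformly distributed particles.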
Abstract: The AAM (active appearance model) has been successfully applied to face and facial feature localization. However, its performance is sensitive to initial parameter values. In this paper, we propose a two-stage AAM for robust face alignment, which first fits an inner face-AAM model to the inner facial feature points and then localizes the whole face and its features by optimizing the whole face-AAM model parameters. Experiments show that the proposed two-stage AAM face alignment method is more robust to background variation and head pose than the standard AAM-based face alignment method.
Abstract: An experiment was conducted to examine the effect of the level of performance stabilization on human adaptability to perceptual-motor perturbation in a complex coincident timing task. Three levels of performance stabilization were established operationally: pre-stabilization, stabilization, and super-stabilization groups. Each group practiced the task until it reached its level of stabilization in a constant sequence of movements and under a constant time constraint before exposure to perturbation. The results clearly showed that performance stabilization is a pre-condition for adaptation. Moreover, variability before reaching stabilization is harmful to adaptation, whereas persistent variability after stabilization is beneficial. Finally, the behavior of variability is specific to each measure.
Abstract: This study tests the causal relationship between growth and unemployment, using time series data for Pakistan from 1972 to 2006. Growth is considered a pathway to decreasing the level of unemployment. Unemployment is a social and political issue; it is a phenomenon in which human resources are wasted, leading to a deceleration in growth. The Johansen cointegration test shows that there is a long-run relationship between growth and unemployment. For short-run dynamics and causality, the study utilizes a Vector Error Correction Model (VECM). The VECM results indicate a short- and long-run causal relationship between growth and unemployment, with capital, labor and human capital as explanatory variables.
Abstract: The article contains the results of a flour and bread quality assessment of grains of spring spelt, also known as an ancient wheat. Spelt was cultivated on heavy and medium soils following the principles of organic farming. Based on laboratory studies of the flour and bread, as well as laboratory baking, the technological usefulness of the studied flour was determined. The results were compared against a standard derived from common wheat cultivated under the same conditions. Grain of spring spelt is a good raw material for manufacturing bread flour from which high-quality bakery products can be obtained, although this depends strictly on the variety of ancient wheat.
Abstract: In computer science and mathematics, a sorting algorithm is an algorithm that puts the elements of a list in a certain order, i.e., ascending or descending. Sorting is perhaps the most widely studied problem in computer science and is frequently used as a benchmark of a system's performance. This paper presents a comparative performance study of four sorting algorithms on different platforms. For each machine, it is found that the best-performing algorithm depends on the number of elements to be sorted. In addition, as expected, the results show that the relative performance of the algorithms differed across the various machines. Thus, algorithm performance depends on data size, and hardware also has an impact.
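A minimal sketch of how such a benchmark can be set up follows. The abstract does not name its four algorithms or platforms; insertion sort and merge sort here are stand-ins chosen to show the data-size dependence:

```python
import random
import time

def benchmark(sort_fn, data):
    """Time one sorting function on a fresh copy of the data."""
    arr = list(data)
    start = time.perf_counter()
    result = sort_fn(arr)
    return time.perf_counter() - start, result

def insertion_sort(a):
    # O(n^2): competitive only for small inputs
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge_sort(a):
    # O(n log n): wins as the element count grows
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

random.seed(42)
data = [random.randrange(10_000) for _ in range(2_000)]
timings = {f.__name__: benchmark(f, data)[0] for f in (insertion_sort, merge_sort)}
```

Running the same script on different machines and input sizes reproduces the kind of platform-by-size comparison the study describes.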
Abstract: As the information age matures, major social infrastructures such as communication, finance, military and energy have become ever more dependent on information communication systems. Since these infrastructures are connected to the Internet, electronic intrusions such as hacking and viruses have become a new security threat. In particular, the disturbance or neutralization of a major social infrastructure can result in extensive material damage and social disorder. To address this issue, many nations around the world are researching and developing various techniques and information security policies in government-wide efforts to protect their infrastructures from newly emerging threats. This paper proposes an evaluation method for the information security level of CIIP (Critical Information Infrastructure Protection), which can enhance the security level of critical information infrastructure by checking the current security status and establishing security measures accordingly to protect infrastructures effectively.
Abstract: A new approach to improving the generalization ability of neural networks is presented, based on the point of view of fuzzy theory. The approach is implemented by shrinking or magnifying the input vector, thereby reducing the difference between the training set and the testing set. It is called the "shrinking-magnifying approach" (SMA). In addition, a new algorithm, the α-algorithm, is presented to find the appropriate shrinking-magnifying factor (SMF) α and obtain better generalization ability. A number of simulation experiments study the effect of the SMA and the α-algorithm. The experimental results are discussed in detail, and the working principle of the SMA is analyzed theoretically. The results of the experiments and analyses show that the new approach is not only simple and easy to apply, but also effective for many neural networks and many classification problems. In our experiments, the improvement in the generalization ability of the neural networks reached as much as 90%.
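The core SMA operation is simply scaling the input vector by the factor α. A minimal sketch follows, with a placeholder search over candidate α values standing in for the paper's α-algorithm, whose actual selection rule is not given here (the validation-score criterion is hypothetical):

```python
def apply_smf(vector, alpha):
    """Shrink (alpha < 1) or magnify (alpha > 1) an input vector by the SMF alpha."""
    return [alpha * x for x in vector]

def choose_alpha(candidates, validation_score):
    """Pick the candidate alpha with the best validation score; a hypothetical
    stand-in for the paper's alpha-algorithm."""
    return max(candidates, key=validation_score)
```

In practice `validation_score` would evaluate the trained network on held-out data after rescaling its inputs with each candidate α.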
Abstract: The usual correctness condition for a schedule of
concurrent database transactions is some form of serializability of
the transactions. For general forms, the problem of deciding whether
a schedule is serializable is NP-complete. In those cases other approaches
to proving correctness, using proof rules that allow the steps
of the proof of serializability to be guided manually, are desirable.
Such an approach is possible in the case of conflict serializability
which is proved algebraically by deriving serial schedules using
commutativity of non-conflicting operations. However, conflict serializability
can be an unnecessarily strong form of serializability restricting
concurrency and thereby reducing performance. In practice,
weaker, more general, forms of serializability for extended models of
transactions are used. Currently, there are no known methods using
proof rules for proving those general forms of serializability. In this
paper, we define serializability for an extended model of partitioned
transactions, which we show to be as expressive as serializability
for general partitioned transactions. An algebraic method for proving
general serializability is obtained by giving an initial-algebra specification
of serializable schedules of concurrent transactions in the
model. This demonstrates that it is possible to conduct algebraic
proofs of correctness of concurrent transactions in general cases.
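For contrast with the general forms discussed above, the classical conflict-serializability test can be sketched as a precedence-graph cycle check. This is the standard textbook construction, not the paper's initial-algebra specification, and the schedule encoding is assumed:

```python
def conflict_serializable(schedule):
    """schedule: list of (txn_id, op, item) with op in {'r', 'w'}.
    Two operations conflict when they touch the same item, come from different
    transactions, and at least one is a write.  The schedule is
    conflict-serializable iff the precedence graph is acyclic."""
    edges = {}
    for i, (ti, oi, xi) in enumerate(schedule):
        for tj, oj, xj in schedule[i + 1:]:
            if ti != tj and xi == xj and 'w' in (oi, oj):
                edges.setdefault(ti, set()).add(tj)

    # depth-first search with three colors to detect a cycle
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {t: WHITE for t, _, _ in schedule}

    def has_cycle(t):
        color[t] = GRAY
        for u in edges.get(t, ()):
            if color[u] == GRAY or (color[u] == WHITE and has_cycle(u)):
                return True
        color[t] = BLACK
        return False

    return not any(color[t] == WHITE and has_cycle(t) for t in list(color))
```

Deriving a serial order by commuting non-conflicting operations, as the paper describes, succeeds exactly when this graph has no cycle.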
Abstract: In general dynamic analyses, the lower-mode response is of interest; the higher modes of spatially discretized equations generally do not represent the real behavior and do not affect the global response much. Some implicit algorithms are therefore introduced to filter out the high-frequency modes through intentional numerical error. The objective of this study is to introduce the P-method and the PC α-method and to compare them with the dissipation method and the Newmark method through stability analysis and numerical examples. The PC α-method gives more accuracy than the other methods because it is based on the α-method and inherits the superior properties of the implicit α-method. In finite element analysis, the PC α-method is more useful than the other methods because it is an explicit scheme and achieves second-order accuracy and numerical damping simultaneously.
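For reference, the baseline Newmark scheme mentioned above can be sketched for an undamped single-degree-of-freedom system. This is only the comparison baseline (average acceleration, β = 1/4, γ = 1/2: unconditionally stable, second-order, no numerical damping), not the P-method or PC α-method themselves:

```python
def newmark_sdof(m, k, x0, v0, dt, steps, beta=0.25, gamma=0.5):
    """Newmark time integration of m*x'' + k*x = 0 from (x0, v0)."""
    x, v = x0, v0
    a = -k * x / m                      # initial acceleration from equilibrium
    history = [x]
    for _ in range(steps):
        # displacement predictor from quantities known at step n
        x_pred = x + dt * v + dt * dt * (0.5 - beta) * a
        # enforce m*a_new + k*x_new = 0 with x_new = x_pred + beta*dt^2*a_new
        a_new = -k * x_pred / (m + beta * dt * dt * k)
        x = x_pred + beta * dt * dt * a_new
        v = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        a = a_new
        history.append(x)
    return history
```

With these default parameters the discrete energy is conserved, so the free-vibration amplitude stays at its initial value; dissipative variants (γ > 1/2) introduce the numerical damping that filters high-frequency modes.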
Abstract: To reduce accidents in industry, WSN (wireless sensor network) sensor data is used. WSN sensor data has persistence and continuity; therefore, we design and exploit a buffer management system with persistence and continuity to avoid data delivery conflicts. To develop the modules, we use multiple buffers and design buffer management modules that transfer sensor data through context-aware methods.
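A multi-buffer design of the kind described can be sketched as one bounded FIFO per sensor context. This is a minimal illustration; the context keys and capacity are hypothetical, and the paper's actual modules are not specified:

```python
from collections import deque

class MultiBufferManager:
    """One bounded FIFO buffer per sensor context: readings from different
    sensors never overwrite each other (persistence) and each stream keeps
    its arrival order (continuity)."""

    def __init__(self, capacity=64):
        self.capacity = capacity
        self.buffers = {}

    def put(self, context, reading):
        # context is a hypothetical key such as "gas" or "vibration"
        buf = self.buffers.setdefault(context, deque(maxlen=self.capacity))
        buf.append(reading)

    def get(self, context):
        """Pop the oldest reading for a context, or None if the buffer is empty."""
        buf = self.buffers.get(context)
        return buf.popleft() if buf else None
```

Keying buffers by context keeps concurrent sensor streams isolated, which is one simple way to avoid the delivery conflicts the abstract mentions.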
Abstract: The aim of this paper is to introduce a parametric distribution model for fatigue life reliability analysis that deals with variation in material properties. Service loads, in the form of a response-time history signal of Belgian pavé, were replicated on a multi-axial spindle-coupled road simulator, and the stress-life method was used to estimate the fatigue life of an automotive stub axle. A PSN curve was obtained by monotonic tension testing, and a two-parameter Weibull distribution function was used to acquire the mean life of the component. A Pearson system was developed to evaluate the fatigue life reliability by considering the stress range intercept and the slope of the PSN curve as random variables. Assuming a normal distribution of fatigue strength, the fatigue life of the stub axle is found to have the highest reliability between 10,000 and 15,000 cycles. By taking into account the variation of material properties associated with the size effect and machining and manufacturing conditions, the method described in this study can be effectively applied to determine the probability of failure of mass-produced parts.
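The two-parameter Weibull quantities used above can be computed directly. A minimal sketch follows; the scale and shape values in the tests are hypothetical, not the paper's fitted parameters:

```python
from math import exp, gamma

def weibull_mean_life(eta, beta):
    """Mean of a two-parameter Weibull(scale eta, shape beta):
    E[N] = eta * Gamma(1 + 1/beta)."""
    return eta * gamma(1.0 + 1.0 / beta)

def weibull_reliability(n, eta, beta):
    """Probability that the component survives beyond n cycles:
    R(n) = exp(-(n / eta) ** beta)."""
    return exp(-((n / eta) ** beta))
```

With shape β = 1 the Weibull reduces to the exponential distribution, so the mean life equals the scale parameter; fitted β > 1 models wear-out failure as in fatigue.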
Abstract: Music Information Retrieval (MIR) and modern data mining techniques are applied to identify style markers in MIDI music for stylometric analysis and author attribution. Over 100 attributes are extracted from a library of 2,830 songs and then mined using supervised learning data mining techniques. Two attributes are identified that provide high information gain. These attributes are then used as style markers to predict authorship. Using these style markers, the authors are able to correctly distinguish songs written by the Beatles from those that were not, with a precision and accuracy of over 98 percent. The identification of these style markers, as well as the architecture of this research, provides a foundation for future research in musical stylometry.
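The information-gain criterion used to rank attributes as style markers can be sketched for discrete attributes. This is the generic formulation; the attribute values and labels below are invented:

```python
from math import log2

def entropy(labels):
    """Shannon entropy of a label multiset, in bits."""
    total = len(labels)
    counts = {}
    for label in labels:
        counts[label] = counts.get(label, 0) + 1
    return -sum((c / total) * log2(c / total) for c in counts.values())

def information_gain(values, labels):
    """Entropy reduction from splitting the labels on a discrete attribute."""
    total = len(labels)
    groups = {}
    for v, label in zip(values, labels):
        groups.setdefault(v, []).append(label)
    split = sum(len(g) / total * entropy(g) for g in groups.values())
    return entropy(labels) - split
```

Ranking the 100+ extracted attributes by this score and keeping the top two is one way to arrive at a compact pair of style markers as the study describes.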
Abstract: High-level and high-velocity flood flows are potentially harmful to bridge piers, as evidenced by many toppled piers, among which single-column piers are considered the most vulnerable. Flood flow characteristic parameters, including drag coefficient, scouring and vortex shedding, are built into a pier-flood interaction model to investigate structural safety against flood hazards, considering the effects of local scouring, hydrodynamic forces, and vortex-induced resonance vibrations. By extracting the pier-flood simulation results embedded in a neural network code, two cases of pier toppling that occurred during typhoons were reexamined: (1) a bridge overwhelmed by a flash flood near a mountainside; (2) a bridge washed away by a flood across a wide channel near an estuary. The modeling procedures and simulations are capable of identifying the probable causes of the toppled bridge piers during heavy floods, which include excessive pier bending moments and resonance in structural vibrations.
Abstract: A conventional GA combined with a local search algorithm, such as 2-OPT, forms a hybrid genetic algorithm (HGA) for the traveling salesman problem (TSP). However, geometric properties, which are problem-specific knowledge, can be used to improve the search process of the HGA. Some tour segments (edges) of a TSP are good, while others may be too long to appear in a short tour. This knowledge can constrain the GA to work with good tour segments and to consider long tour segments less often. Consequently, a new algorithm is proposed, called the intelligent-OPT hybrid genetic algorithm (IOHGA), to improve the GA and the 2-OPT algorithm and thereby reduce the search time for the optimal solution. Based on the geometric properties, all tour segments are assigned two-level priorities to distinguish between good and bad genes. A simulation study was conducted to evaluate the performance of the IOHGA. The experimental results indicate that, in general, the IOHGA obtains near-optimal solutions in less time and with better accuracy than a hybrid genetic algorithm with simulated annealing (HGA(SA)).
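The 2-OPT local search that the HGA builds on can be sketched as follows. This is the standard formulation on a toy instance; the IOHGA's two-level priority scheme is not reproduced here:

```python
import math

def tour_length(tour, dist):
    """Total length of a closed tour given a distance matrix."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def two_opt(tour, dist):
    """Repeatedly reverse tour segments while doing so shortens the tour."""
    tour, n = list(tour), len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            # skip the j that would pair edge (i, i+1) with the wrap-around edge
            for j in range(i + 2, n if i > 0 else n - 1):
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d] - 1e-12:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour

# invented instance: four corners of a unit square
points = [(0, 0), (0, 1), (1, 0), (1, 1)]
dist = [[math.dist(p, q) for q in points] for p in points]
```

The edge-length test inside the inner loop is exactly where geometric priorities could prune candidate moves, which is the kind of refinement the IOHGA proposes.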
Abstract: Electronic voting (e-voting) over the Internet has recently been carried out in some nations and regions. It removes the spatial restriction of voters having to visit a polling place in person, but Internet e-voting requires a computer with an Internet connection. It also requires an access code for the e-voting, obtained through a voter's advance registration. To minimize these disadvantages, we propose a method in which a voter who holds a wireless certificate issued in advance uses his or her own cellular phone for e-voting, without any special registration to vote. Our proposal allows a voter to cast a vote in a simple and convenient way, without limits of time and location, thereby increasing the voting rate while ensuring confidentiality and anonymity.
Abstract: The purpose of this research was to analyze and compare the instability of the contact surface between copper and a copper-nickel alloy cathode in vacuum; cathodes with nickel contents of 1%, 2% and 4% were examined using the cathode spot model. The transient recovery voltage is predicted. The cathode spot region is treated as a collisionless space-charge sheath connected with a singly ionized collisional plasma. It was found that the transient voltage decreases with an increasing percentage of nickel in the cathode material.
Abstract: The objective of this study is to investigate combustion in a pilot-ignited supercharged dual-fuel engine fueled with different types of gaseous fuels under various equivalence ratios. It is found that if certain operating conditions are maintained, the conventional dual-fuel combustion mode can be transformed into a combustion mode with two-stage heat release. This mode was named PREMIER (PREmixed Mixture Ignition in the End-gas Region) combustion. During PREMIER combustion, the combustion initially progresses as premixed flame propagation; then, due to mixture autoignition in the end-gas region ahead of the propagating flame front, a transition occurs with a rapid increase in the heat release rate.
Abstract: Applying a rigorous process to optimize the elements of a supply-chain network resulted in reduced waiting time for the service provider and the customer. Different sources of downtime of a hydraulic pressure controller/calibrator (HPC) were causing interruptions in operations. The process examined all the issues to drive greater efficiency. The issues included inherent design issues with the HPC pump, contamination of the HPC with impurities, and the lead time required for annual calibration in the USA.
The HPC is used for mandatory testing/verification of formation tester, pressure measurement and logging-while-drilling tools by oilfield service providers, including Halliburton.
After market study and analysis, it was concluded that the current HPC model is best suited to the oilfield industry. To use the existing HPC model effectively, the design and contamination issues were addressed through design and process improvements. An optimum network is proposed after comparing different supply-chain models for calibration lead-time reduction.