Abstract: The WWW is currently the first resource scholars turn to
when searching for information. However, analyzing and interpreting
this volume of information can overload researchers as they pursue
their work.
Trend detection in scientific publication retrieval systems helps
scholars to find relevant, new and popular special areas by
visualizing the trend of input topic.
However, little research exists on trend detection in scientific
corpora, and the models proposed so far do not appear suitable.
In particular, previous works lack an appropriate representation
scheme for research topics.
This paper describes a method that combines Semantic Web
technologies and ontologies to support advanced search functions such
as trend detection in the context of a scholarly Semantic Web system
(SSWeb).
Abstract: Basel III (or the Third Basel Accord) is a global
regulatory standard on bank capital adequacy, stress testing and
market liquidity risk agreed upon by the members of the Basel
Committee on Banking Supervision in 2010-2011, and scheduled to
be introduced from 2013 until 2018. Basel III is a comprehensive set
of reform measures. These measures aim to: (1) improve the banking
sector's ability to absorb shocks arising from financial and economic
stress, whatever the source; (2) improve risk management and
governance; and (3) strengthen banks' transparency and disclosures.
The reforms likewise target: (1) bank-level, or micro-prudential,
regulation, which will help raise the resilience of individual banking
institutions to periods of stress; and (2) macro-prudential regulation
of system-wide risks that can build up across the banking sector, as
well as the pro-cyclical amplification of these risks over time. These
two approaches to supervision are complementary, as greater resilience
at the individual bank level reduces the risk of system-wide shocks.
Regarding the macroeconomic impact of Basel III, the OECD estimates
that the medium-term impact of its implementation on GDP growth is
in the range of -0.05 to -0.15 percent per year. Economic output is
mainly affected by an increase in bank lending spreads, as banks pass
the rise in funding costs caused by higher capital requirements on to
their customers. Consequently, the estimated
effects on GDP growth assume no active response from monetary
policy. Basel III impact on economic output could be offset by a
reduction (or delayed increase) in monetary policy rates by about 30
to 80 basis points. The aim of this paper is to create a framework
based on the recent regulations in order to prevent financial crises.
Thus, the measures taken to overcome the global financial crisis
should also help guard against financial crises that may occur in
future periods. The first part of the paper examines the effects of
the global crisis on the banking system and the concept of financial
regulation. The second part analyzes financial regulations, with a
particular focus on Basel III. The last section explores the possible
macroeconomic impacts of Basel III.
Abstract: Tacit knowledge has been one of the most discussed
and contradictory concepts in the field of knowledge management
since the mid-1990s. The concept is used relatively vaguely to refer
to any type of information that is difficult to articulate, which has led
to discussions about the original meaning of the concept (adopted
from Polanyi's philosophy) and the nature of tacit knowing. It is
proposed that the subject should be approached from the perspective
of cognitive science in order to connect tacit knowledge to
empirically studied cognitive phenomena. Some of the most
important examples of tacit knowing presented by Polanyi are
analyzed in order to trace the cognitive mechanisms of tacit knowing
and to promote better understanding of the nature of tacit knowledge.
The cognitive approach to Polanyi's theory reveals that the
tacit/explicit typology of knowledge often presented in the
knowledge management literature is not only artificial but, in fact,
the opposite of Polanyi's own approach.
Abstract: The present investigation is concerned with the
sub-impacts that take place when a rigid hemispherical-head block
transversely impacts a beam at different locations. A dynamic
substructure technique for elastic-plastic impact is applied to solve
this problem numerically. The time history of the impact force and the
energy exchange between the block and the beam are obtained, and the
process of sub-impacts is analyzed from the energy-exchange point of
view. The results verify the influence of the impact location on the
impact duration, the first sub-impact, and the energy exchange between
the beam and the block.
Abstract: A comparison of two approaches for simulating the
dynamic behaviour of a permanent magnet linear actuator is
presented. These are a fully coupled model, where the electromagnetic
field, electric circuit, and mechanical motion problems are solved
simultaneously, and a decoupled model, where a set of static
magnetic field analyses is first carried out and then the electric
circuit and mechanical motion equations are solved employing bi-cubic
spline approximations of the field analysis results. The results show
that the proposed decoupled model is of satisfactory accuracy and
gives more flexibility when the actuator response must be estimated
for different external conditions, e.g. external circuit parameters or
mechanical loads.
Abstract: We present a novel scheme to evaluate sinusoidal functions with low complexity and high precision using cubic spline interpolation. To this end, two different approaches are proposed to find the interpolating polynomial of sin(x) within the range [-π, π]. The first deals with only a single data point, while the other uses two, to keep the realization cost as low as possible. An approximation error optimization technique for cubic spline interpolation is introduced next and is shown to increase the interpolator's accuracy without increasing the complexity of the associated hardware. The architectures for the proposed approaches are also developed, and they exhibit flexibility of implementation with low power requirements.
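The cubic spline evaluation idea can be illustrated with a generic piecewise cubic (Hermite) interpolant of sin(x) over [-π, π]; this is only a sketch of the general technique, not the paper's optimized single- or two-point interpolator, and the segment count n below is an arbitrary illustrative choice:

```python
import math

def spline_sin(x, n=8):
    """Approximate sin(x) on [-pi, pi] by piecewise cubic Hermite
    interpolation with n equal segments."""
    a = -math.pi
    h = 2 * math.pi / n
    i = min(int((x - a) / h), n - 1)      # segment index
    x0 = a + i * h
    t = (x - x0) / h                      # local coordinate in [0, 1]
    y0, y1 = math.sin(x0), math.sin(x0 + h)
    m0, m1 = math.cos(x0) * h, math.cos(x0 + h) * h  # scaled endpoint slopes
    # Cubic Hermite basis functions
    return ((1 + 2*t) * (1 - t)**2 * y0 + t * (1 - t)**2 * m0
            + t*t * (3 - 2*t) * y1 + t*t * (t - 1) * m1)

# Maximum error over a dense grid; the theoretical bound is h^4/384,
# which is below 1e-3 for n = 8
err = max(abs(spline_sin(k / 1000) - math.sin(k / 1000))
          for k in range(-3141, 3142))
```

The O(h⁴) error of a cubic interpolant is what makes a small lookup table of polynomial coefficients competitive with much larger direct tables of sine values.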
Abstract: Requirements are critical to system validation as they guide all subsequent stages of systems development. Inadequately specified requirements generate systems that require major revisions or cause system failure entirely. Use cases have become the main vehicle for requirements capture in many current object-oriented (OO) development methodologies, and a means for developers to communicate with different stakeholders. In this paper we present the results of a laboratory experiment that explored whether different use case formats are equally effective in facilitating understanding by high-knowledge users. Results showed that providing diagrams along with the textual use case descriptions significantly improved user comprehension of system requirements in both familiar and unfamiliar application domains. However, when comparing groups that received textual descriptions accompanied by diagrams at different levels of detail (simple and detailed), we found no significant difference in performance.
Abstract: With the development of the Internet and database application techniques, it has become common to allow authorized users to remotely query and access the many databases available on the Internet, which raises the problem of how to protect the copyright of relational databases. This paper first briefly introduces the cloud model, including cloud generators and similar clouds. Then, combining the properties of the cloud with the idea of digital watermarking and the characteristics of relational databases, a method of protecting relational database copyright with a cloud watermark is proposed. The corresponding watermark algorithms, namely the cloud watermark embedding and detection algorithms, are also presented. Experiments are then run and the results analyzed to validate the correctness and feasibility of the watermarking scheme. Finally, the prospects and research directions of relational database watermarking are discussed.
Abstract: Research in quantum computation explores the consequences of having information encoding, processing, and communication exploit the laws of quantum physics, i.e. the laws which govern the ultimate knowledge that we have, today, of the foreign world of elementary particles, as described by quantum mechanics. This paper starts with a short survey of the principles which underlie quantum computing and of some of the major breakthroughs brought by the first ten to fifteen years of research in this domain; quantum algorithms and quantum teleportation are very briefly presented. The next sections are devoted to one among the many directions of current research in the quantum computation paradigm, namely quantum programming languages and their semantics. A few other hot topics and open problems in quantum information processing and communication are briefly mentioned in the concluding remarks, the most difficult of them being the physical implementation of a quantum computer. The interested reader will find a list of useful references at the end of the paper.
Abstract: In this paper, we propose a modified version of the
Constant Modulus Algorithm (CMA) tailored for the blind Decision
Feedback Equalizer (DFE) of first-order Markovian time-varying
channels. The proposed NonStationary CMA (NSCMA) is designed
so that it explicitly takes into account the Markovian structure of
the channel nonstationarity. Hence, unlike the classical CMA, the
NSCMA is not blind with respect to the channel time variations.
This greatly helps the equalizer in the case of realistic channels and
avoids frequent transmission of training sequences.
This paper develops a theoretical analysis of the steady-state
performance of the CMA and the NSCMA for DFEs in a time-varying
context, deriving approximate expressions for the mean square errors.
We prove that in the steady state the NSCMA exhibits better
performance than the classical CMA, and these new results are
confirmed by simulation.
Through an experimental study, we demonstrate that the Bit Error
Rate (BER) is reduced by the NSCMA-DFE, and that the BER
improvement achieved by the NSCMA-DFE becomes more significant
as the channel time variations become more severe.
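For reference, the classical CMA weight update that the NSCMA modifies can be sketched for a single-tap equalizer over a constant-gain channel; the channel gain, step size, and symbol sequence below are illustrative assumptions, and the Markovian channel tracking that distinguishes the NSCMA is deliberately omitted:

```python
import cmath
import math

# Classical CMA update for a single-tap equalizer: the weight w adapts
# blindly so that the output modulus |y| approaches the constant R2.
h = 0.5          # unknown channel gain the equalizer must invert
mu = 0.05        # adaptation step size
R2 = 1.0         # squared modulus of the transmitted QPSK constellation
w = 1.0 + 0.0j   # initial equalizer weight

for k in range(2000):
    # Deterministic QPSK symbol sequence, unit modulus
    s = cmath.exp(1j * (math.pi / 2) * (k % 4) + 1j * math.pi / 4)
    x = h * s                                          # received sample
    y = w * x                                          # equalizer output
    w = w - mu * (abs(y)**2 - R2) * y * x.conjugate()  # CMA gradient step

# After adaptation, |w * h| is close to 1: the equalizer has blindly
# inverted the channel gain without any training sequence.
```

Because the update penalizes only deviations of |y|² from R2, no reference symbols are needed, which is exactly the blindness property the abstract relies on.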
Abstract: An Optimal Power Flow based on Improved Particle
Swarm Optimization (OPF-IPSO) with a generator capability curve
constraint is used as a reference by a neural-network OPF (NN-OPF)
to obtain generator scheduling patterns. The NN-OPF is designed in
three stages. The first stage is the design of the OPF-IPSO with the
generator capability curve constraint. The second stage is clustering
the load into specific ranges and calculating its index. The third stage
is training the NN-OPF using the constructive backpropagation
method. In the training process, the total load and load index are used
as inputs, and the generator scheduling pattern as the output. The data
used in this paper are from the Java-Bali power system, and the
simulations are carried out in MATLAB.
Abstract: The most severe damage to a turbine rotor is distortion.
The rotor straightening process must lead, in the first stage, to
removal of the stresses from the material by annealing and, next, to
straightening of the plastic distortion by hot spotting without leaving
any residual stress. The straightening method does not produce stress
concentrations, and the heating technique, developed specifically for
solid forged rotors and disks, makes it possible to avoid local
overheating and structural changes in the material. This process also
does not leave stresses in the shaft material. An experimental study
of hot spotting is carried out on a large turbine rotor, and some of the
most important parameters that must be considered in the annealing
and hot spotting processes are investigated in this paper.
Abstract: Skin color based tracking techniques often assume a
static skin color model obtained either from an offline set of library
images or from the first few frames of a video stream. Such models
can perform poorly under changing lighting or imaging conditions.
We propose an adaptive skin color model based on the Gaussian
mixture model to handle these changing conditions. An initial
estimate of the number and weights of the skin color clusters is
obtained using a modified form of the general Expectation-Maximization
algorithm. The model adapts to changes in imaging conditions and
refines its parameters dynamically using spatial and temporal
constraints. Experimental results show that the method can be used
to effectively track hand and face regions.
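The Expectation-Maximization step underlying such a mixture model can be sketched for a minimal one-dimensional, two-component case; the synthetic data, initial parameters, and iteration count below are illustrative assumptions, and the paper's modified initialization and spatio-temporal refinements are not reproduced:

```python
import math
import random

# Synthetic 1-D data: two well-separated Gaussian clusters
random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(200)] + \
       [random.gauss(5.0, 1.0) for _ in range(200)]

mu = [1.0, 4.0]      # initial component means
var = [1.0, 1.0]     # initial component variances
w = [0.5, 0.5]       # initial mixture weights

def pdf(x, m, v):
    """Gaussian density N(x; m, v)."""
    return math.exp(-(x - m)**2 / (2 * v)) / math.sqrt(2 * math.pi * v)

for _ in range(50):  # EM iterations
    # E-step: responsibility of each component for each data point
    r = [[w[j] * pdf(x, mu[j], var[j]) for j in (0, 1)] for x in data]
    r = [[a / (a + b), b / (a + b)] for a, b in r]
    # M-step: re-estimate weights, means, and variances
    for j in (0, 1):
        nj = sum(ri[j] for ri in r)
        w[j] = nj / len(data)
        mu[j] = sum(ri[j] * x for ri, x in zip(r, data)) / nj
        var[j] = sum(ri[j] * (x - mu[j])**2 for ri, x in zip(r, data)) / nj
```

After the loop the estimated means sit close to the true cluster centers 0 and 5; an adaptive skin model applies the same two steps per frame, to pixel color values rather than scalars.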
Abstract: Batch fermentation of 5, 10, and 25 g/L biodiesel-derived
crude glycerol was carried out at 30, 37, and 45 °C by
Clostridium pasteurianum cells immobilized on silica. Maximum
yields of 1,3-propanediol (PDO) (0.60 mol/mol) and ethanol (0.26
mol/mol) were obtained from 10 g/L crude glycerol at 30 and 37 °C,
respectively. The maximum yield of butanol (0.28 mol/mol substrate
added) was obtained at 37 °C with 25 g/L substrate. None of the three
products was detected at 45 °C even after 10 days of fermentation;
only traces of ethanol (0.01 mol/mol) were detected at 45 °C with
5 g/L substrate. The results obtained for 25 g/L substrate utilization
were fitted to a first-order rate equation to obtain the rate constants
for the bioconversion of glycerol at the three temperatures. The
first-order rate constants at 30, 37, and 45 °C were found to be 0.198,
0.294, and 0.029/day, respectively. The activation energy (Ea) for
crude glycerol bioconversion was calculated to be 57.62 kcal/mol.
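The rate constants quoted above admit a standard two-point Arrhenius estimate between 30 and 37 °C; this is a sketch of the textbook formula only, not a reproduction of the paper's own fit, and the 45 °C point is excluded here because fermentation is essentially inactive at that temperature:

```python
import math

# Two-point Arrhenius estimate: ln(k2/k1) = (Ea/R) * (1/T1 - 1/T2)
R = 8.314                   # gas constant, J/(mol K)
k1, T1 = 0.198, 303.15      # first-order rate constant at 30 deg C
k2, T2 = 0.294, 310.15      # first-order rate constant at 37 deg C

Ea = R * math.log(k2 / k1) / (1/T1 - 1/T2)   # activation energy, J/mol
Ea_kcal = Ea / 4184.0                        # converted to kcal/mol
```

A two-point estimate over a narrow temperature window is sensitive to measurement error in each k, which is one reason a fit over the full dataset can report a different value.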
Abstract: One of the difficulties of vibration-based damage identification methods is the nonuniqueness of the damage identification results: different damage locations and severities may produce identical response signals, a problem that is even more severe for multiple-damage detection. This paper proposes a new damage detection strategy to avoid this nonuniqueness. The strategy first determines the approximate damage area based on a statistical pattern recognition method using the dynamic strain signal measured by distributed fiber Bragg gratings, and then accurately evaluates the damage information based on a Bayesian model updating method using the experimental modal data. A stochastic simulation method is then used to compute the high-dimensional integral in the Bayesian problem. Finally, an experiment on a plate structure, simulating one part of a mechanical structure, is used to verify the effectiveness of this approach.
Abstract: Sports science has historically been supported by the positivist idea of science, especially the mechanistic/reductionist view, and has become a field that treats experimentation and measurement as the major research domains. The disposition to simplify nature and to view the world in parts has fragmented and reduced the athlete's body to the idea of a machine. In this paper we intend to rethink this perception through the lens of Complexity Theory. We put forward the idea of the athlete as a reflexive and active being (corporeity-body). Therefore, the construction of a training program that considers the cultural, biological, and psychological elements of the experience of human corporal movements in a circumspect and responsible way could bring better chances of accomplishment. In the end, we hope to help coaches understand the intrinsic complexity of the body they are training and how to better deal with it, and, in the context of a deep globalization among the different types of knowledge, to respect and accept the peculiarities of the knowledge that comprises this area.
Abstract: This paper presents a method of model selection and
identification of Hammerstein systems by hybridization of the genetic
algorithm (GA) and particle swarm optimization (PSO). An unknown
nonlinear static part to be estimated is approximately represented
by an automatic choosing function (ACF) model. The weighting
parameters of the ACF and the system parameters of the linear
dynamic part are estimated by the linear least-squares method. On
the other hand, the adjusting parameters of the ACF model structure
are properly selected by the hybrid algorithm of the GA and PSO,
where the Akaike information criterion is utilized as the evaluation
value function. Simulation results are shown to demonstrate the
effectiveness of the proposed hybrid algorithm.
Abstract: The need for multilingual communication in Japan has
increased due to an increase in the number of foreigners in the
country. When people communicate in their nonnative language,
the differences in language prevent mutual understanding among
the communicating individuals. In the medical field, communication
between the hospital staff and patients is a serious problem. Currently,
medical translators accompany patients to medical care facilities, and
the demand for medical translators is increasing. However, medical
translators cannot necessarily provide support, especially in cases in
which round-the-clock support is required or in case of emergencies.
The medical field has high expectations from information technology.
Hence, a system that supports accurate multilingual communication is
required. Despite recent advances in machine translation technology,
it is very difficult to obtain highly accurate translations. We have
developed a support system called M3 for multilingual medical
reception. M3 provides support functions that aid foreign patients in
the following respects: conversation, questionnaires, reception procedures,
and hospital navigation; it also has a Q&A function. Users
can operate M3 using a touch screen and receive text-based support.
In addition, M3 uses accurate translation tools called parallel texts
to facilitate reliable communication through conversations between
the hospital staff and the patients. However, if there is no parallel
text that expresses what users want to communicate, the users cannot
communicate. In this study, we have developed a circulating support
environment for multilingual medical communication using parallel
texts. The proposed environment can circulate necessary parallel texts
through the following procedure: (1) a user provides feedback about
the necessary parallel texts, following which (2) these parallel texts
are created and evaluated.
Abstract: There are many real-world problems in which
parameters such as the arrival time of new jobs, the failure of
resources, and the completion time of jobs change continuously. This
paper tackles the problem of scheduling jobs with random due dates
on multiple identical machines in a stochastic environment. First, the
Longest Processing Time (LPT) scheduling rule is used to assign jobs
to the different machine centers; then the particular sequence of jobs
to be processed on each machine is found using simple stochastic
techniques. The performance parameter under consideration is the
maximum lateness with respect to the stochastic due dates, which are
independent and exponentially distributed. Finally, a relevant problem
is solved using the techniques presented in the paper.
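The LPT assignment stage mentioned above follows a standard greedy rule: sort the jobs by processing time in decreasing order and repeatedly place the next job on the currently least-loaded machine. A minimal sketch, with made-up job times and machine count:

```python
def lpt(jobs, m):
    """Assign jobs (processing times) to m identical machines by the
    Longest Processing Time rule; return the schedule and its makespan."""
    loads = [0.0] * m
    assignment = [[] for _ in range(m)]
    for t in sorted(jobs, reverse=True):   # longest jobs first
        i = loads.index(min(loads))        # pick the least-loaded machine
        loads[i] += t
        assignment[i].append(t)
    return assignment, max(loads)          # makespan = heaviest load

# Example: five jobs on two machines
schedule, makespan = lpt([7, 6, 5, 4, 3], 2)
```

LPT is a deterministic heuristic for balancing machine loads; the paper's stochastic treatment of due dates enters only in the subsequent per-machine sequencing stage.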
Abstract: Semantic Web Technologies enable machines to
interpret data published in a machine-interpretable form on the web.
At the present time, only human beings are able to understand the
product information published online. The emerging semantic Web
technologies have the potential to deeply influence the further
development of the Internet economy. In this paper we propose a
scenario-based research approach to predict the effects of these new
technologies on electronic markets and on the business models of
traders, intermediaries, and customers. Over 300 million searches are
conducted every day on the Internet by people trying to find what
they need. A majority of these searches are in the domain of
consumer e-commerce, where a web user is looking for something to
buy. This represents a huge cost in terms of person-hours and an
enormous drain of resources. Agent enabled semantic search will
have a dramatic impact on the precision of these searches. It will
reduce and possibly eliminate information asymmetry where a better
informed buyer gets the best value. By impacting this key
determinant of market prices semantic web will foster the evolution
of different business and economic models. We submit that there is a
need for developing these futuristic models based on our current
understanding of e-commerce models and nascent semantic web
technologies. We believe these business models will encourage
mainstream web developers and businesses to join the “semantic web
revolution.”