Abstract: A serious problem on the WWW is finding reliable
information. Not everything found on the Web is true and the
Semantic Web does not change that in any way. The problem will be
even more crucial for the Semantic Web, where agents will be
integrating and using information from multiple sources. If an
incorrect premise is used due to a single faulty source, then any
conclusions drawn may be in error. Thus, statements published on
the Semantic Web have to be seen as claims rather than as facts, and
there should be a way to decide which among many possibly
inconsistent sources is most reliable. In this work, we propose a trust
model for the Semantic Web. The proposed model is inspired by the
use of trust in human society. Trust is a type of social knowledge that
encodes evaluations about which agents can be taken as reliable
sources of information or services. Our proposed model allows
agents to decide which among different sources of information to
trust, and thus to act rationally on the Semantic Web.
Abstract: The Charge Simulation Method (CSM) is one of the most widely used numerical field computation techniques in High Voltage (HV) engineering. High-voltage fields of varying non-uniformity are encountered in practice. Since CSM programs are case specific, the simulation accuracy depends heavily on the user's (programmer's) experience. This work is an effort to understand CSM errors and to evolve guidelines for setting up accurate CSM models, relating non-uniformities with assignment factors. The results are for the six-point-charge model of the sphere-plane gap geometry. Using a genetic algorithm (GA) as a tool, optimum assignment factors at different non-uniformity factors for this model have been evaluated and analyzed. It is shown that symmetrically placed six-point-charge models can be good enough to set up CSM programs with potential errors less than 0.1% when the field non-uniformity factor is greater than 2.64 (field utilization factor less than 52.76%).
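The GA search for an optimum assignment factor can be sketched as follows. This is a minimal real-coded GA, not the authors' implementation: the fitness here is a smooth surrogate with a known minimum, whereas a real CSM fitness would place the point charges at depths set by the assignment factor, solve for the charge magnitudes, and return the maximum potential error on the electrode contour.

```python
import random

# Toy stand-in for the CSM potential-error objective (assumed, for illustration):
# a real CSM fitness would evaluate the potential error at contour check points.
def potential_error(assignment_factor):
    return (assignment_factor - 1.5) ** 2 + 0.01

def ga_minimize(fitness, lo=0.1, hi=3.0, pop_size=30, generations=60):
    random.seed(0)
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                   # best (lowest error) first
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b)               # arithmetic crossover
            child += random.gauss(0.0, 0.05)    # Gaussian mutation
            children.append(min(hi, max(lo, child)))
        pop = parents + children
    return min(pop, key=fitness)

best = ga_minimize(potential_error)
print(round(best, 2))  # converges near the surrogate optimum of 1.5
```

The same loop applies unchanged once `potential_error` is replaced by an actual CSM error evaluation for a given gap geometry.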
Abstract: Single-photon detectors have been fabricated from NbN
nanowires. These detectors are fabricated from high-quality, ultra-high-vacuum
sputtered NbN thin films on a sapphire substrate. In this
work, a typical schematic of the nanowire single-photon detector
structure and the driving and measurement electronic circuit are
shown.
The response of superconducting nanowire single-photon detectors
during a photodetection event is modeled by two special electrical
circuits.
Finally, the current through the wire is calculated by solving the
equations of these models.
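A commonly used lumped electrothermal picture of such a detection event, not necessarily the authors' exact two-circuit model, treats the nanowire as a kinetic inductance in series with a time-varying hotspot resistance, shunted by the readout load. All component values below are assumptions for illustration.

```python
import numpy as np

# Lumped circuit sketch: L_k dI/dt = (I_b - I) R_L - I R_hs(t)
L_k = 100e-9   # kinetic inductance, 100 nH (assumed)
R_L = 50.0     # readout load resistance
I_b = 20e-6    # bias current, 20 uA (assumed)

def R_hs(t):
    # rectangular hotspot pulse: 500 ohm while the normal domain exists (assumed)
    return 500.0 if 1e-9 <= t < 1.5e-9 else 0.0

dt = 1e-12
t = np.arange(0.0, 10e-9, dt)
I = np.empty_like(t)
I[0] = I_b
for k in range(1, len(t)):          # forward-Euler integration
    dIdt = ((I_b - I[k-1]) * R_L - I[k-1] * R_hs(t[k-1])) / L_k
    I[k] = I[k-1] + dt * dIdt

print(f"min current {I.min()*1e6:.1f} uA, final {I[-1]*1e6:.1f} uA")
```

The current diverts into the load while the hotspot resistance exists and then recovers with the time constant L_k/R_L, which is the qualitative behavior such circuit models are meant to capture.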
Abstract: Bioinformatics methods for predicting T cell
coreceptor usage from the membrane proteins of HIV-1 are
investigated. In this study, we aim to propose an effective prediction
method for dealing with the three-class classification problem of
CXCR4 (X4), CCR5 (R5) and CCR5/CXCR4 (R5X4). We
investigated the coreceptor prediction problem as follows: 1)
proposing a feature set of informative physicochemical properties
which, used with an SVM, achieves a high prediction test
accuracy of 81.48%, compared with the existing method's
accuracy of 70.00%; 2) establishing a large, up-to-date data set by
increasing the size from 159 to 1225 sequences to verify the proposed
prediction method, for which the mean test accuracy is 88.59%; and 3)
analyzing the set of 14 informative physicochemical properties to
further understand the characteristics of HIV-1 coreceptors.
Abstract: The presented work deals with a new scope of application of information and communication technologies: the improvement of the election process in a biased environment. We introduce a new concept for constructing an information-communication system for an election participant. It consists of four main components: Software, Physical Infrastructure, Structured Information and Trained Staff. The Structured Information is the basis of the whole system and is the collection of all possible events (irregularities among them) at the polling stations, which are structured in special templates and forms and integrated in mobile devices. The Software represents a package of analytic modules, which operates on a dynamic database. The application of modern communication technologies facilitates the immediate exchange of information and of relevant documents between the polling stations and the Server of the participant. No less important is the training of the staff for the proper functioning of the system. An e-training system with various modules should be applied in this respect. The presented methodology is primarily focused on the election processes in countries of emerging democracy. It can be regarded as a tool for the monitoring of the election process by political organization(s) and as one of the instruments to foster the spread of democracy in these countries.
Abstract: This paper presents a simple method for estimating the
additional load, as a factor of the existing load, that may be drawn
before reaching the point of maximum line loadability of a radial
distribution system (RDS) with different realistic load models at
different substation voltages. The proposed method involves a simple
line loadability index (LLI) that gives a measure of the proximity of
the present state of a line in the distribution system to that point. The
LLI can be used to assess voltage instability and the line loading
margin. The proposed method is also compared with the existing
method based on the maximum loadability index [10]. The simulation
results show that the LLI can identify not only the weakest
line/branch causing system instability but also the system voltage
collapse point, as the index approaches unity near collapse. This
feature enables us to set an index threshold to monitor and predict
system stability on-line so that proper action can be taken to prevent
the system from collapsing. To demonstrate the validity of the
proposed algorithm, computer simulations are carried out on two-bus
and 69-bus RDSs.
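One classical radial-line stability index illustrates the idea; note the paper's LLI is normalized differently (approaching unity at collapse), whereas the textbook form below reaches zero at the maximum loadability point. For a branch of impedance r + jx with sending-end voltage Vs and receiving-end load P + jQ, real receiving-end voltage solutions exist while SI = Vs^4 - 4(Px - Qr)^2 - 4(Pr + Qx)Vs^2 >= 0.

```python
# Classical two-bus radial-line voltage-stability index (a sketch; the
# paper's LLI may differ in form and normalization).
def stability_index(Vs, r, x, P, Q):
    return Vs**4 - 4.0*(P*x - Q*r)**2 - 4.0*(P*r + Q*x)*Vs**2

def loading_margin(Vs, r, x, P, Q, tol=1e-9):
    # bisection on the load multiplier lam: largest lam keeping SI >= 0
    lo, hi = 1.0, 1.0
    while stability_index(Vs, r, x, hi*P, hi*Q) > 0:
        hi *= 2.0
    while hi - lo > tol:
        mid = 0.5*(lo + hi)
        if stability_index(Vs, r, x, mid*P, mid*Q) >= 0:
            lo = mid
        else:
            hi = mid
    return lo

# two-bus example in per unit (illustrative values, not from the paper)
Vs, r, x, P, Q = 1.0, 0.05, 0.10, 1.0, 0.5
print(f"SI = {stability_index(Vs, r, x, P, Q):.3f}")
print(f"additional load factor before collapse: {loading_margin(Vs, r, x, P, Q):.2f}x")
```

The load multiplier at which the index hits its limit is exactly the "additional load as a factor of the existing load" the abstract refers to.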
Abstract: The challenge in the swing-up problem of a double
inverted pendulum on a cart (DIPC) is to design a controller that
brings all of the DIPC's states, especially the joint angles of the two
links, into the region of attraction of the desired equilibrium. This
paper proposes a new method to swing up the DIPC based on a series
of rest-to-rest maneuvers of the first link about its vertically upright
configuration while holding the cart fixed at the origin. The rest-to-rest
maneuvers are designed such that each one results in a net gain
in the energy of the second link. This results in swing-up of the DIPC's
configuration to the region of attraction of the desired equilibrium. A
three-step algorithm is provided for swing-up control, followed by a
stabilization step. Simulation results, with a comparison to
experimental work reported in the literature, are presented to demonstrate
the efficacy of the approach.
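The energy-gain criterion behind the rest-to-rest maneuvers can be sketched as simple bookkeeping on the second link's energy, measured against the upright (unstable) equilibrium. The link parameters and the sequence of rest states below are illustrative assumptions, not values from the paper.

```python
import math

# Second-link energy relative to the upright configuration (theta2 = 0),
# where E2 = 0; below upright the potential term is negative.
m2, l2, J2, g = 0.1, 0.3, 0.003, 9.81   # mass, COM distance, inertia (assumed)

def link2_energy(theta2, omega2):
    return 0.5 * J2 * omega2**2 - m2 * g * l2 * (1.0 - math.cos(theta2))

# (theta2, omega2) at the rest points of successive maneuvers (illustrative):
# the link starts hanging (theta2 = pi) and ends upright (theta2 = 0)
rest_states = [(math.pi, 0.0), (2.4, 0.0), (1.6, 0.0), (0.7, 0.0), (0.0, 0.0)]
energies = [link2_energy(th, om) for th, om in rest_states]
gains = [e2 - e1 for e1, e2 in zip(energies, energies[1:])]
print(all(gain > 0 for gain in gains))   # each maneuver adds energy -> True
```

A swing-up design in this spirit accepts a maneuver only if the resulting rest state has strictly higher second-link energy, so the sequence climbs monotonically toward the upright equilibrium.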
Abstract: This paper presents the design, analysis and comparison of different rotor-type permanent magnet machines. The machines are designed with the same geometrical dimensions and the same materials for comparison. The main machine parameters of interior and exterior rotor type machines, including the eddy current effect, torque-speed characteristics and magnetic analysis, are investigated using the MAXWELL program. With this program, the components of permanent magnet machines can be calculated with high accuracy. Six types of permanent magnet machines are compared with respect to their topology, size, magnetic field, air gap flux, voltage, torque, loss and efficiency. The analysis results demonstrate the effectiveness of the proposed machine design methodology. We believe that this study will be a helpful resource for the examination and comparison of the basic structure and magnetic features of PM (permanent magnet) machines with different rotor structures.
Abstract: An image compression method has been developed
using a fuzzy edge image together with the basic Block Truncation Coding
(BTC) algorithm. The fuzzy edge image has been validated against
classical edge detectors, taking the results of the well-known
Canny edge detector as the reference, prior to its use in the proposed method. The
bit plane generated by the conventional BTC method is replaced with
the fuzzy bit plane generated by the logical OR operation between
the fuzzy edge image and the corresponding conventional BTC bit
plane. The input image is encoded with the block mean and standard
deviation and the fuzzy bit plane. The proposed method has been
tested with test images of 8 bits/pixel and size 512×512 and found to
be superior with better Peak Signal to Noise Ratio (PSNR) when
compared to the conventional BTC, and adaptive bit plane selection
BTC (ABTC) methods. The raggedness and jagged appearance, and
the ringing artifacts at sharp edges are greatly reduced in
reconstructed images by the proposed method with the fuzzy bit
plane.
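The encoding scheme can be sketched for a single block. This is a minimal moment-preserving BTC with the bit plane OR-ed against an edge bit plane; the edge plane here is a stand-in for the fuzzy edge image, and the block values are illustrative.

```python
import numpy as np

def btc_encode(block, edge_bits):
    # block mean/std plus the bit plane; OR with the edge plane forces
    # edge pixels onto the high reconstruction level (the fuzzy bit plane idea)
    mean, std = block.mean(), block.std()
    bits = (block >= mean) | edge_bits.astype(bool)
    return mean, std, bits

def btc_decode(mean, std, bits):
    n = bits.size
    q = int(bits.sum())                        # pixels at the high level
    if q in (0, n):
        return np.full(bits.shape, mean)
    lo = mean - std * np.sqrt(q / (n - q))     # standard BTC levels that
    hi = mean + std * np.sqrt((n - q) / q)     # preserve block mean and variance
    return np.where(bits, hi, lo)

block = np.array([[12, 14, 90, 95],
                  [10, 15, 92, 96],
                  [11, 13, 91, 94],
                  [12, 16, 93, 97]], dtype=float)
edge_bits = np.zeros_like(block, dtype=bool)   # no extra edge pixels in this toy case
mean, std, bits = btc_encode(block, edge_bits)
rec = btc_decode(mean, std, bits)
print(abs(rec.mean() - block.mean()) < 1e-9)   # block mean preserved -> True
```

Replacing `edge_bits` with a fuzzy edge map thresholded per block reproduces the structure of the proposed coder: the transmitted data per block remain the mean, the standard deviation, and one bit plane.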
Abstract: Recognition of Indian language scripts is a challenging problem. In Optical Character Recognition (OCR), a character or symbol to be recognized can be machine-printed or handwritten. There are several approaches to the recognition of numerals/characters, depending on the type of features extracted and the different ways of extracting them. This paper proposes a recognition scheme for handwritten Hindi (Devanagari) numerals, one of the most widely used scripts in the Indian subcontinent. Our work focuses on a global feature extraction technique using end-point information, which is extracted from images of isolated numerals. These feature vectors are fed to a neuro-memetic model [18] that has been trained to recognize Hindi numerals. A prototype of the system has been tested on a variety of numeral images. In the proposed scheme, data sets are fed to the neuro-memetic algorithm, which identifies the rule with the highest fitness value (nearly 100%); the template associated with this rule is the identified numeral. Experimental results show a recognition rate of 92-97%, which compares favorably with other models.
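The end-point feature can be sketched directly: on a thinned (skeletonized) numeral image, an end-point is a foreground pixel with exactly one 8-connected neighbour. The toy skeleton below is an assumption for illustration, not data from the paper.

```python
import numpy as np

def end_points(skel):
    # an end-point is a skeleton pixel with exactly one 8-connected neighbour
    pts = []
    padded = np.pad(skel, 1)
    for r in range(skel.shape[0]):
        for c in range(skel.shape[1]):
            if skel[r, c]:
                neigh = padded[r:r+3, c:c+3].sum() - 1   # 8-neighbourhood sum
                if neigh == 1:
                    pts.append((r, c))
    return pts

# toy skeleton of a "1"-like stroke: a vertical line has two end-points
skel = np.zeros((7, 7), dtype=int)
skel[1:6, 3] = 1
print(end_points(skel))   # -> [(1, 3), (5, 3)]
```

The coordinates (and count) of such end-points over the isolated numeral form the global feature vector that is then passed to the classifier.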
Abstract: Along with the progress of our information society,
various risks are becoming increasingly common, causing multiple
social problems. For this reason, risk communication for establishing
consensus among stakeholders who have different priorities has
become important. However, it is not always easy for decision makers
to agree on measures to reduce risks that are based on opposing
concepts, such as security, privacy and cost. Therefore, we previously
developed and proposed the "Multiple Risk Communicator" (MRC)
with the following functions: (1) modeling support for the role of the
risk specialist, (2) an optimization engine, and (3) display of the
computed results. In this paper, MRC program version 1.0 is applied
to the personal information leakage problem. The application process
and the validation of the results are discussed.
Abstract: Most Question Answering (QA) systems are
composed of three main modules: question processing,
document processing and answer processing. The question
processing module plays an important role in QA systems; if
this module does not work properly, it creates problems for
the other sections. Moreover, answer processing is an
emerging topic in Question Answering, where systems
are often required to rank and validate candidate answers.
These techniques, aimed at finding short and precise answers,
are often based on semantic classification.
This paper discusses a new model for question
answering which improves the two main modules, question
processing and answer processing.
Two important components form the basis
of question processing. The first component is question
classification, which specifies the types of the question and the answer.
The second is reformulation, which converts the user's
question into a question understandable by the QA system in a
specific domain. The answer processing module consists of
candidate answer filtering and candidate answer ordering
components, and it also has a validation section for interacting
with the user. This module makes it easier to find the exact
answer. In this paper we describe the question and answer
processing modules through modeling, implementation and
evaluation of the system. The system was implemented in two
versions. Results show that Version No. 1 gave correct answers to 70%
of the questions (30 correct answers to 50 asked questions) and
Version No. 2 gave correct answers to 94% of the questions (47
correct answers to 50 asked questions).
Abstract: In this study, the effect of mechanical activation on the synthesis of Fe3Al/Al2O3 nanocomposite by the mechanochemical method has been investigated. For this purpose, aluminum powder and hematite were used as precursors in stoichiometric ratio, and the other effective parameters of the milling process were kept constant. Phase formation, crystallite size and lattice strain were studied by X-ray diffraction (XRD) using the Williamson-Hall method, and the microstructure and morphology were explored by scanning electron microscopy (SEM). Energy-dispersive X-ray spectroscopy (EDX) analysis was also used to probe the particle distribution. The results showed that after 30 hours of milling, the reaction started, proceeded combustively, and went to completion.
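The Williamson-Hall analysis mentioned above fits the relation beta·cos(theta) = K·lambda/D + 4·eps·sin(theta), so a linear fit of beta·cos(theta) against 4·sin(theta) yields the crystallite size D from the intercept and the lattice strain eps from the slope. The peak positions and widths below are hypothetical, standing in for measured XRD data.

```python
import numpy as np

# Williamson-Hall fit: beta*cos(theta) = K*lambda/D + 4*eps*sin(theta)
K, lam = 0.9, 0.15406          # shape factor; Cu K-alpha wavelength in nm
two_theta = np.radians([33.2, 40.8, 49.5, 62.1])   # hypothetical XRD peak positions
beta = np.radians([0.55, 0.60, 0.68, 0.80])        # hypothetical FWHM values, radians

theta = two_theta / 2.0
x = 4.0 * np.sin(theta)
y = beta * np.cos(theta)
slope, intercept = np.polyfit(x, y, 1)   # least-squares straight-line fit
D = K * lam / intercept                  # crystallite size, nm
eps = slope                              # lattice strain
print(f"D = {D:.0f} nm, strain = {eps:.4f}")
```

In practice the instrumental broadening is subtracted from the measured FWHM before the fit; that correction is omitted here for brevity.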
Abstract: Due to the tremendous amount of information provided
by the World Wide Web (WWW), developing methods for mining
the structure of web-based documents is of considerable interest. In
this paper we present a similarity measure for graphs representing
web-based hypertext structures. Our similarity measure is mainly
based on a novel representation of a graph as linear integer strings,
whose components represent structural properties of the graph. The
similarity of two graphs is then defined as the optimal alignment of
the underlying property strings. In this paper we apply the well-known
technique of sequence alignment to solve a novel and challenging
problem: measuring the structural similarity of generalized trees.
In other words, we first transform our graphs, considered as high-dimensional
objects, into linear structures. Then we derive similarity
values from the alignments of the property strings in order to
measure the structural similarity of generalized trees. Hence, we
transform a graph similarity problem into a string similarity problem
to develop an efficient graph similarity measure. We demonstrate that
our similarity measure captures important structural information by
applying it to two different test sets consisting of graphs representing
web-based document structures.
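The alignment step can be sketched with a standard global alignment (Needleman-Wunsch) over integer property strings. The scoring scheme and the normalization by the self-alignment score are illustrative choices, not necessarily the paper's.

```python
# Global alignment (Needleman-Wunsch) of two integer property strings.
def align_score(a, b, match=1, mismatch=-1, gap=-1):
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        dp[i][0] = i * gap
    for j in range(1, n + 1):
        dp[0][j] = j * gap
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = match if a[i-1] == b[j-1] else mismatch
            dp[i][j] = max(dp[i-1][j-1] + sub,   # substitute/match
                           dp[i-1][j] + gap,     # gap in b
                           dp[i][j-1] + gap)     # gap in a
    return dp[m][n]

def graph_similarity(s1, s2):
    # normalize the alignment score to [0, 1] by the best self-alignment
    best = max(align_score(s1, s1), align_score(s2, s2))
    return max(0.0, align_score(s1, s2) / best)

# property strings, e.g. out-degree sequences of two generalized trees (toy data)
s1 = [3, 1, 0, 2, 0, 0]
s2 = [3, 1, 0, 1, 0]
print(graph_similarity(s1, s1))   # identical structures -> 1.0
print(round(graph_similarity(s1, s2), 2))
```

Because each graph is reduced to one or more linear property strings first, the quadratic-time alignment replaces a far more expensive direct graph comparison.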
Abstract: In the current context of globalization, a large number of companies have sought to develop as groups in order to reach other markets or to meet the criteria necessary for listing on a stock exchange. The issue of consolidated financial statements prepared by a parent, an investor or a venturer, and of the financial reporting standards governing them, therefore becomes even more important. The aim of our paper is to expose this issue in a consistent manner, first by summarizing the international accounting and financial reporting standards applicable before the 1st of January 2013 and considering the role of the crisis in shaping the standard-setting process, and secondly by analyzing the newly issued or modified standards and the main changes they introduce.
Abstract: Analysis of heart rate variability (HRV) has become a
popular non-invasive tool for assessing the activity of the autonomic
nervous system. Most of the methods were borrowed from techniques
used for time series analysis. Currently used methods fall into time
domain, frequency domain, geometrical and fractal categories. A new
technique, which searches for pattern repeatability in a time series, is
proposed for quantifying heart rate (HR) time series. This set of
indices, termed the pattern repeatability measure and the
pattern repeatability ratio, is able to distinguish HR data clearly
from noise and electroencephalogram (EEG) signals. The results of analysis
using these measures give an insight into the fundamental difference
between the composition of HR time series with respect to EEG and
noise.
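The idea of counting repeated patterns can be sketched with a template-matching index in the spirit of the proposed measures; the exact definitions of the pattern repeatability measure and ratio are the paper's, and the version below (fraction of length-m templates that recur within a Chebyshev tolerance) is only an assumed illustration.

```python
import numpy as np

def pattern_repeatability(series, m=3, tol=0.2):
    # fraction of length-m patterns that recur elsewhere within tolerance tol
    x = np.asarray(series, dtype=float)
    templates = np.array([x[i:i+m] for i in range(len(x) - m + 1)])
    repeats = 0
    for i, t in enumerate(templates):
        d = np.max(np.abs(templates - t), axis=1)   # Chebyshev distance
        d[i] = np.inf                               # ignore the self-match
        if np.min(d) <= tol:
            repeats += 1
    return repeats / len(templates)

rng = np.random.default_rng(1)
periodic = np.sin(np.linspace(0, 20*np.pi, 400))    # strongly repeatable signal
noise = rng.normal(size=400)                        # weakly repeatable signal
print(pattern_repeatability(periodic) > pattern_repeatability(noise))  # True
```

A quasi-periodic HR series sits between these two extremes, which is why such an index can separate HR data from both EEG and noise.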
Abstract: Calcite and aragonite are the two common polymorphs of CaCO3 observed as biominerals. Sea water universally contains a high concentration of Mg2+ (50 mM) relative to Ca2+ (10 mM). In in vivo crystallization, Mg2+ inhibits calcite formation. For this reason, stony coral skeletons may be formed only of aragonite crystals during biocalcification. Soft corals are exceptional in that they form only calcite crystals; this interesting phenomenon, still uncharacterized in the marine environment, has been explored in this study using newly purified cell-free proteins isolated from the endoskeletal sclerites of soft coral. By recording the decline of pH in vitro, the control of CaCO3 nucleation and crystal growth by the cell-free proteins was revealed. Using an atomic force microscope, we find that these endoskeletal cell-free proteins significantly shape the morphology in the molecular-scale kinetics of crystal formation and that those proteins act as surfactants to promote ion attachment at calcite steps.
Keywords: Biomineralization, Calcite, Cell-free protein, Soft coral
Abstract: International trade involves both large and small firms
engaged in business overseas. Possible drivers that force companies
to enter international markets include increasing competition at the
domestic market, maturing domestic markets, and limited domestic
market opportunities. Technology is an important factor in
shaping international marketing strategy, as well as a driving force
towards a more global marketplace, especially communication
technology, which includes telephones, the internet, computer
systems and e-mail. There are three main marketing strategy choices,
namely the standardization approach, the adaptation approach and the
middle-of-the-road approach, that companies implement in overseas markets.
The decision depends on situations and factors facing the companies
in the international markets. In this paper, the contingency concept is
considered that no single strategy can be effective in all contexts.
The effect of strategy on performance depends on specific situational
variables. Strategic fit is employed to investigate export marketing
strategy adaptation under certain environmental conditions, which in
turn can lead to superior performance.
Abstract: In this paper, the existence of periodic solutions of a delayed competitive system with the effect of toxic substances is investigated by using Gaines and Mawhin's continuation theorem of coincidence degree theory on time scales. New sufficient conditions are obtained for the existence of periodic solutions. The approach is unified, providing the existence of the desired solutions for continuous differential equations and discrete difference equations alike; moreover, it has been widely applied to study the existence of periodic solutions in differential and difference equations.
Abstract: Water level forecasting using records of past time series is of importance in water resources engineering and management. For example, water level affects groundwater tables in low-lying coastal areas, as well as the hydrological regimes of some coastal rivers. Hence, a reliable prediction of sea-level variations is required in coastal engineering and hydrologic studies. During the past two decades, approaches based on Genetic Programming (GP) and Artificial Neural Networks (ANN) have been developed. In the present study, GP is used to forecast daily water level variations for a set of time intervals using observed water levels. Measurements from a single tide gauge at Urmia Lake, Northwest Iran, were used to train and validate the GP approach for the period from January 1997 to July 2008. Two statistics, the root mean square error and the correlation coefficient, are used to verify the model by comparison with the corresponding outputs from an Artificial Neural Network model. The results show that both of these artificial intelligence methodologies are satisfactory and can be considered as alternatives to conventional harmonic analysis.
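The two verification statistics can be made concrete in a few lines; the water levels and forecasts below are illustrative placeholders, not values from the Urmia Lake record.

```python
import numpy as np

# root mean square error and correlation coefficient between observed and
# forecast series -- the two verification statistics named in the abstract
def rmse(obs, pred):
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return np.sqrt(np.mean((obs - pred) ** 2))

def corr(obs, pred):
    return np.corrcoef(obs, pred)[0, 1]

# illustrative daily water levels (m) and model forecasts (assumed values)
obs  = [1271.80, 1271.78, 1271.75, 1271.74, 1271.70]
pred = [1271.82, 1271.77, 1271.76, 1271.72, 1271.71]
print(f"RMSE = {rmse(obs, pred):.3f} m, r = {corr(obs, pred):.3f}")
```

A low RMSE together with a correlation coefficient near 1 is the criterion by which the GP and ANN forecasts would be judged satisfactory.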