Abstract: The main aim of this research is to investigate a novel technique for implementing a more natural and intelligent conversation system. Conversation systems are designed to converse like a human as much as their intelligence allows; sometimes we can think of them as the embodiment of Turing's vision. Such systems usually return predetermined answers in a predetermined order, but real conversations abound with uncertainties of various kinds. This research focuses on an integrated natural language processing approach comprising an integrated knowledge-base construction module, a conversation understanding and generation module, and a state manager module. We discuss the effectiveness of this approach based on an experiment.
Abstract: The necessity of updating numerical models' inputs, because of geometrical and resistive variations in rivers subject to solid transport phenomena, requires detailed control and monitoring activities. The human and financial resources these activities demand move the research toward the development of expeditious methodologies able to evaluate the outflows through the measurement of more easily acquirable quantities. Recent studies highlighted the dependence of the entropic parameter on the kinematic and geometrical flow conditions, showing significant variability according to the section shape, dimension and slope. Such dependences, even if not yet well defined, could reduce the difficulties of field activities as well as the data-processing time. On the basis of this evidence, the relationships between the entropic parameter and the geometrical and resistive quantities, obtained through a large and detailed laboratory campaign on steady free-surface flows in conditions of macro and intermediate homogeneous roughness, are analyzed and discussed.
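The abstract does not define the entropic parameter; in the entropy-based open-channel flow literature it is usually Chiu's parameter M, which ties the cross-sectional mean velocity to the maximum velocity. A minimal sketch of that relation (an assumption on our part, since the abstract states no formulas):

```python
import math

def phi(M):
    """Chiu's entropy relation u_mean / u_max = e^M/(e^M - 1) - 1/M.
    Assumed here as the role of the entropic parameter M, since the
    abstract itself gives no equations."""
    return math.exp(M) / (math.exp(M) - 1) - 1 / M

# phi grows monotonically with M and approaches 1 as M -> infinity;
# a larger M corresponds to a more uniform velocity profile.
for M in (1.0, 2.0, 6.0):
    print(M, round(phi(M), 3))
```

The laboratory relationships the abstract describes would then amount to expressing M (or phi(M)) as a function of the section's geometrical and roughness quantities.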
Abstract: The aim of this paper is to present a three-step methodology for forecasting supply chain demand. In the first step, various data mining techniques are applied in order to prepare the data for entry into the forecasting models. In the second step, the modeling step, an artificial neural network and a support vector machine are presented, after defining the Mean Absolute Percentage Error (MAPE) index for measuring error. The structure of the artificial neural network is selected based on previous researchers' results, and in this article the accuracy of the network is increased by using sensitivity analysis. The best forecast for the classical forecasting methods (Moving Average, Exponential Smoothing, and Exponential Smoothing with Trend) is obtained from the prepared data, and this forecast is compared with the results of the support vector machine and the proposed artificial neural network. The results show that the artificial neural network can forecast more precisely than the other methods. Finally, the stability of the forecasting methods is analyzed using raw data, and the effectiveness of the clustering analysis is also measured.
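The classical baselines and the MAPE index named above can be sketched in a few lines (a minimal illustration with made-up demand figures; function names and parameters are ours, not the paper's):

```python
def mape(actual, forecast):
    """Mean Absolute Percentage Error, the accuracy index used above."""
    return 100 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

def moving_average(series, window):
    """One-step-ahead moving-average forecast from the last `window` periods."""
    return sum(series[-window:]) / window

def exponential_smoothing(series, alpha):
    """Simple exponential smoothing; the final level is the next-period forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

demand = [100, 110, 105, 115, 120, 118]   # made-up demand history
print(moving_average(demand, 3))
print(exponential_smoothing(demand, alpha=0.3))
print(mape([100, 110], [95, 112]))
```

The ANN and SVM models of the paper would be compared against such baselines by computing MAPE on the held-out periods.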
Abstract: Due to the increasing and varied risks that economic units face, derivative instruments have gained substantial importance, and trading volumes of derivatives have reached very significant levels. In parallel with these high trading volumes, researchers have developed many different models, some parametric, some nonparametric. In this study, the aim is to analyse the success of artificial neural networks (ANN) in the pricing of options, using S&P 100 index options data. Generally, previous studies cover data on European-type call options; this study includes not only European call options but also American call and put options and European put options. Three data sets are used to build three different ANN models. One includes only data directly observed in the economic environment, i.e. strike price, spot price, interest rate, maturity and type of the contract. The others include an extra input that is not observable data but a parameter, i.e. volatility. With these detailed data, the performance of ANN along the put/call, American/European and moneyness dimensions is analyzed, and whether the contribution of volatility as an input to the neural network improves prediction performance is examined. The most striking results revealed by the study are that ANN performs better when pricing call options than put options, and that the use of the volatility parameter as an input does not improve performance.
Abstract: Introducing survivability into embedded real-time systems (ERTS) can improve their capacity to survive. This paper mainly discusses the survivability of ERTS in four parts. The first is the origin of ERTS survivability. The second is survivability analysis: based on a definition of survivability grounded in a survivability specification, and on a division of the entire survivability analysis process for ERTS, a survivability analysis profile is presented. The quantitative analysis model of this profile is emphasized and illuminated in detail; quantitative analysis of the system is shown to help evaluate system survivability more accurately. The third is the platform design for survivability analysis: in terms of the profile, the analysis process is encapsulated and assembled into one platform, on which quantification, standardization and simplification of survivability analysis are all achieved. The fourth is survivability design: according to the characteristics of ERTS, a strengthened design method is selected to realize the system survivability design. Through the analysis of an embedded mobile video-on-demand system, intrusion-tolerant technology is introduced into the overall survivability design.
Abstract: NFκB activation plays a crucial role in the anti-apoptotic response to apoptotic signaling during tumor necrosis factor (TNFα) stimulation in Multiple Myeloma (MM). Although several drugs have been found effective for the treatment of MM, mainly by inhibiting the NFκB pathway, there are no quantitative or qualitative comparative assessments of the inhibitory effects of different single drugs or drug combinations. Computational modeling is becoming increasingly indispensable for applied biological research, mainly because it can provide strong quantitative predictive power. In this study, a novel computational pathway modeling approach is employed to comparatively assess the inhibitory effects of specific single drugs and drug combinations on the NFκB pathway in MM, and especially to predict synergistic drug combinations.
Abstract: As a result of urbanization, the unpredictable growth of industry and transport, the production of chemicals, military activities, etc., the concentration of anthropogenic toxicants spread in nature exceeds all permissible standards. The most dangerous of these contaminants are organic compounds of great persistence, bioaccumulation and toxicity, with a prominent occurrence in the environment and the food chain. Among natural ecological tools, plants, which still occupy above 40% of the world's land, were until recently considered organisms with only a limited ecological potential, accumulating contaminants of different structures in plant biomass and partially volatilizing them. However, analysis of experimental data from the last two decades has revealed the essential role of plants in environmental remediation, owing to their ability to carry out intracellular degradation processes leading to partial or complete decomposition of the carbon skeleton of contaminants of different structures. Though phytoremediation technologies are still in research and development, various applications have already been used successfully. This paper aims to analyze the mechanisms of organic contaminant uptake and detoxification in plants, the less-studied issue in evaluating and exploiting the potential of plants for environmental remediation.
Abstract: This research was conducted, for the first time, on the southeastern coasts of the Caspian Sea in order to evaluate the performance of Osteichthyes cooperatives through a production (catch) function. Using one of the indirect valuation methods, the contributory factors in the catch were identified and inserted into the function as independent variables. To carry out this research, the performance of 25 Osteichthyes-catching cooperatives that were involved in fishing in the Miankale wildlife refuge region during the 2009 utilization year was examined. The contributory factors in the catch were divided into groups of economic, ecological and biological factors. In the mentioned function, the catch rate of the cooperatives was inserted as the dependent variable, and fourteen partial variables, grouped into nine general variables, as independent variables. Finally, after estimating the function, seven variables were found significant at the 99 percent confidence level. The results of the function estimation indicated that human resources (fisherman quantity) had the greatest positive effect on the catch rate, with an influence coefficient of 1.7, while weather conditions had the greatest negative effect on the catch rate of the cooperatives, with an influence coefficient of -2.07. Moreover, factors such as members' share, experience and fisherman training, and fishing effort played the main roles in the catch rate of the cooperatives, with influence coefficients of 0.81, 0.5 and 0.21, respectively.
Abstract: Choosing the right metadata is critical, as good information (metadata) attached to an image will facilitate its visibility among a pile of other images. The image's value is enhanced not only by the quality of the attached metadata but also by the search technique. This study proposes a simple but efficient technique to predict a single human image from a website, using the basic image data and the metadata embedded in the image's content appearing on web pages. The result is very encouraging, with a prediction accuracy of 95%. This technique may become a great aid to librarians, researchers and many others in automatically and efficiently identifying a set of human images out of a greater set of images.
Abstract: This article presents the research objective of the project "European Ecological Network Natura 2000 – opportunities and threats". Since Natura 2000 sites constitute a form of environmental protection, several legal problems are likely to result. Most controversially, certain sites will be subject to two regimes of protection: as national parks and as Natura 2000 sites. This dualism of legal regulation makes it difficult to perform certain legal obligations related to the regimes envisaged under each form of environmental protection. Which regime, and which obligations resulting from a particular form of environmental protection, have priority and should prevail? What should be done if these obligations are contradictory? Furthermore, there is an institutional problem: no public administration authority has the power to resolve legal conflicts concerning the application of a particular regime to a given site. There are also no criteria for deciding the priority and superiority of one form of environmental protection over the other. Which regulations are more important: those that pertain to national parks or those that pertain to Natura 2000 sites? In light of the current regulations, it is impossible to give a decisive answer to these questions. The internal hierarchy of forms of environmental protection has not been determined, and all such forms should be treated equally.
Abstract: The dynamics of User Datagram Protocol (UDP) traffic over Ethernet between two computers are analyzed using nonlinear dynamics, which shows that there are two clear regimes in the data flow: free flow and saturated. The two most important variables affecting this are the packet size and the packet flow rate. However, the transition between the regimes is due to a transcritical bifurcation rather than a phase transition, as in models of vehicle traffic or of theorized large-scale computer network congestion. It is hoped this model will help lay the groundwork for further research on the dynamics of networks, especially computer networks.
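The transcritical bifurcation invoked above has the textbook normal form dx/dt = rx - x², whose fixed points x = 0 and x = r exchange stability as r crosses zero. A minimal numerical sketch (this is the generic normal form, not the authors' fitted traffic model):

```python
def simulate(r, x0=0.5, dt=1e-3, steps=20000):
    """Euler integration of dx/dt = r*x - x**2, the normal form of a
    transcritical bifurcation (illustrative only, not the traffic model)."""
    x = x0
    for _ in range(steps):
        x += dt * (r * x - x * x)
    return x

# Below the bifurcation (r < 0) the state decays to x = 0 ("free flow");
# above it (r > 0) it saturates at the exchanged fixed point x = r.
print(simulate(-1.0))   # ~ 0
print(simulate(2.0))    # ~ 2
```

In the traffic interpretation suggested by the abstract, the control parameter r would be set by packet size and packet flow rate, with the saturated branch appearing once r becomes positive.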
Abstract: This study applied the Theory of Planned Behaviour (TPB) to explain knowledge sharing behaviour among academic staff at a Public Higher Education Institution (HEI) in Malaysia. The main objectives of this study are to identify the components that influence knowledge sharing behaviour and to determine the levels of knowledge sharing behaviour among academic staff. A total of 200 respondents participated in answering questionnaires. The findings of this study revealed that knowledge sharing behaviour is perceived and implemented among academic staff at a Public HEI in Malaysia, but not openly or strongly practiced. The findings are discussed, and recommendations for future research are also addressed.
Abstract: The necessity of accurate and timely field data is shared among organizations engaged in fundamentally different activities, whether public services or commercial operations. Basically, there are three major components in the process of qualitative research: data collection, interpretation and organization of the data, and the analytic process. Representative technological advancements have been made in mobile devices (mobile phones, PDAs, tablets, laptops, etc.), offering resources that can potentially be applied to the data collection activity of field research in order to improve this process.
This paper presents and discusses the main features of a mobile-phone-based solution for field data collection, composed of basically three modules: a survey editor, a server web application and a client mobile application. The data gathering process begins with the survey creation module, which enables the production of tailored questionnaires. The field workforce receives the questionnaire(s) on their mobile phones, collects the interview responses and sends them back to a server for immediate analysis.
Abstract: Network layer multicast, i.e. IP multicast, even after many years of research, development and standardization, is not deployed at large scale due to both technical (e.g. router upgrades) and political (e.g. policy making and negotiation) issues. Researchers looked for alternatives and proposed application/overlay multicast, where multicast functions are handled by end hosts, not network layer routers. Member hosts wishing to receive multicast data form a multicast delivery tree. The intermediate hosts in the tree also act as routers, i.e. they forward data to the hosts below them in the tree. Unlike IP multicast, where a router cannot leave the tree until all members below it leave, in overlay multicast any member can leave the tree at any time, thus disjoining the tree and disrupting the data dissemination. All the disrupted hosts have to rejoin the tree. This characteristic of overlay multicast causes multicast tree instability, data loss and rejoin overhead. In this paper, we propose that each node set its leaving time from the tree and send join requests to a number of nodes in the tree. The nodes in the tree reject the request if their leaving time is earlier than the requesting node's; otherwise they accept the request. The node can join at one of the accepting nodes. This makes the tree more stable, as the nodes join the tree according to their leaving times, with the earliest-leaving nodes at the leaves of the tree. Some intermediate nodes may not follow their leaving time and may leave earlier, thus disrupting the tree. For this, we propose a proactive recovery mechanism so that disrupted nodes can rejoin the tree at predetermined nodes immediately. We have shown by simulation that there is less overhead when joining the multicast tree and that the recovery time of the disrupted nodes is much less than in previous works.
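The leaving-time join rule described above can be sketched as follows (a hypothetical illustration; function names and data layout are ours, not the authors' simulation code):

```python
# A node in the tree accepts a join request only if it will stay in
# the tree at least as long as the requester, so longer-lived nodes
# end up nearer the root and early leavers end up at the leaves.

def accept_join(candidate_leaving_time, requester_leaving_time):
    """Accept iff the candidate parent leaves no earlier than the requester."""
    return candidate_leaving_time >= requester_leaving_time

def choose_parent(candidates, requester_leaving_time):
    """Send the request to several tree nodes; among those that accept,
    attach to the one that stays in the tree the longest."""
    accepting = [(t, node) for node, t in candidates.items()
                 if accept_join(t, requester_leaving_time)]
    return max(accepting)[1] if accepting else None

candidates = {"A": 50, "B": 120, "C": 90}   # node -> announced leaving time
print(choose_parent(candidates, requester_leaving_time=80))  # prints "B"
```

Node A rejects the request (it leaves at 50, before the requester's 80), while B and C accept; the requester attaches to B, the longest-lived accepting node.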
Abstract: Supply chain networks are frequently hit by unplanned events which lead to disruptions and cause operational and financial consequences. It is neither possible to avoid disruption risk entirely, nor are network members able to prepare for every possible disruptive event. Therefore, continuity planning should be set up that supports effective operational responses in supply chain networks in times of emergency. In this research, network-related degrees of freedom that determine the options for responsive actions are derived from interview data. The findings are then embedded into a common risk management process. The paper provides support for researchers and practitioners in identifying the network-related options for responsive actions and in determining the need to improve reaction capabilities.
Abstract: In the current research, a neuro-fuzzy model and a regression model were developed to predict the Material Removal Rate (MRR) in the Electrical Discharge Machining process for AISI D2 tool steel with a copper electrode. Extensive experiments were conducted with various levels of discharge current, pulse duration and duty cycle. The experimental data were split into two sets, one for training and the other for validation of the models. The training data were used to develop the models, and the test data, which had not been used earlier in model development, were used to validate them. Subsequently, the models were compared. It was found that the predicted and experimental results were in good agreement, and the coefficients of correlation were found to be 0.999 and 0.974 for the neuro-fuzzy and regression models, respectively.
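The coefficient of correlation quoted above is the standard Pearson coefficient between predicted and measured values; a minimal sketch with made-up MRR figures (not the paper's data):

```python
import math

def correlation(pred, actual):
    """Pearson coefficient of correlation, the agreement metric
    reported in the abstract (0.999 and 0.974)."""
    n = len(pred)
    mp = sum(pred) / n
    ma = sum(actual) / n
    cov = sum((p - mp) * (a - ma) for p, a in zip(pred, actual))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    sa = math.sqrt(sum((a - ma) ** 2 for a in actual))
    return cov / (sp * sa)

# Hypothetical predicted vs. measured MRR values (illustration only):
predicted = [1.2, 2.1, 3.0, 4.2, 5.1]
measured  = [1.1, 2.0, 3.2, 4.0, 5.3]
print(round(correlation(predicted, measured), 3))
```

A coefficient close to 1, as in the paper's neuro-fuzzy model, indicates that predicted and experimental values rise and fall together almost perfectly.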
Abstract: The main aim of this research is to study the possible use of recycled fine aggregate made from waste rubble walls to partially substitute for the natural sand used in the production of cement-and-sand bricks. The brick specimens were first prepared using 100% natural sand; the natural sand was then replaced by recycled fine aggregate at 25, 50, 75 and 100% by weight. A series of tests was carried out to study the effect of using recycled aggregate on the physical and mechanical properties of the bricks, such as density, drying shrinkage, water absorption characteristics, and compressive and flexural strength. The test results indicate that it is possible to manufacture bricks containing recycled fine aggregate with good characteristics, similar in physical and mechanical properties to those of bricks with natural aggregate, provided that the percentage of recycled fine aggregate is limited to 50-75%.
Abstract: The purpose of this research was to determine the role of the immunogenic 49 kDa protein from V. alginolyticus, which is capable of initiating the expression of MHC class II molecules at receptors of Cromileptes altivelis. The method used was in vivo experimental research, testing the immunogenic 49 kDa protein from V. alginolyticus in Cromileptes altivelis (250-300 grams) using three boosters, with the immunogenic protein injected intramuscularly. The response of the expressed MHC molecule was shown using immunocytochemistry and SEM. The results indicated that the 49 kDa V. alginolyticus adhesin, which has an immunogenic character, could trigger the expression of MHC class II at the grouper's receptors, as proven by immunocytochemical staining and SEM with labeling using an anti-MHC antibody (anti-mouse). This visible expression is based on the binding between antigen epitopes and the anti-MHC antibody at the receptor. Using immunocytochemistry, the intracellular response of MHC to in vivo induction with the immunogenic adhesin from V. alginolyticus was shown.
Abstract: In this paper, we propose an algorithm to compute initial cluster centers for K-means clustering. The data in a cell are partitioned using a cutting plane that divides the cell into two smaller cells. The plane is perpendicular to the data axis with the highest variance and is designed to reduce the sum of squared errors of the two cells as much as possible, while at the same time keeping the two cells as far apart as possible. Cells are partitioned one at a time until the number of cells equals the predefined number of clusters, K. The centers of the K cells become the initial cluster centers for K-means. The experimental results suggest that the proposed algorithm is effective and converges to better clustering results than those of the random initialization method. The research also indicates that the proposed algorithm greatly improves the likelihood of every cluster containing some data.
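The cell-splitting initialization described above can be sketched as follows (a simplification on our part: each cell is cut at the mean of its highest-variance axis, whereas the paper's criterion also trades off the two cells' sums of squared errors and their separation):

```python
# Simplified sketch of cutting-plane initialization for K-means:
# repeatedly split the worst cell perpendicular to its highest-variance
# axis until K cells remain, then use their centroids as initial centers.

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def centroid(cell):
    dims = len(cell[0])
    return tuple(mean([p[d] for p in cell]) for d in range(dims))

def sse(cell):
    """Within-cell sum of squared errors around the centroid."""
    c = centroid(cell)
    return sum((p[d] - c[d]) ** 2 for p in cell for d in range(len(c)))

def split_cell(cell):
    """Cut the cell with a plane perpendicular to its highest-variance axis."""
    dims = len(cell[0])
    axis = max(range(dims), key=lambda d: variance([p[d] for p in cell]))
    cut = mean([p[axis] for p in cell])
    left = [p for p in cell if p[axis] <= cut]
    right = [p for p in cell if p[axis] > cut]
    return left, right

def initial_centers(data, k):
    """Split the highest-SSE cell until k cells remain; return centroids."""
    cells = [list(data)]
    while len(cells) < k:
        cells.sort(key=sse, reverse=True)
        left, right = split_cell(cells.pop(0))
        cells += [c for c in (left, right) if c]
    return [centroid(c) for c in cells]

data = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
print(initial_centers(data, 2))
```

On this toy data the first cut separates the two obvious groups, so the resulting centers land near (0.33, 0.33) and (10.33, 10.33), which is exactly the deterministic behavior that makes such initialization preferable to random seeding.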
Abstract: The purposes of this paper are to (1) promote excellence in computer science by suggesting a cohesive, innovative approach to filling well-documented deficiencies in current computer science education, (2) justify (using the authors' and others' anecdotal evidence from both the classroom and the real world) why this approach holds great potential to successfully eliminate the deficiencies, and (3) invite other professionals to join the authors in proof-of-concept research. The authors' experiences, though anecdotal, strongly suggest that a new approach involving visual modeling technologies should allow computer science programs to retain a greater percentage of prospective and declared majors, as students become more engaged learners, more successful problem-solvers, and better prepared programmers. In addition, the graduates of such computer science programs will make greater contributions to the profession as skilled problem-solvers. Instead of wearily re-memorizing code as they move to the next course, students will have the problem-solving skills to think and work in more sophisticated and creative ways.