Abstract: The aim of this paper is to examine factors related to the system environment (namely, system quality and vendor support) that influence ERP implementation success in Iranian companies. Implementation success is assessed from the perspectives of user satisfaction and organizational impact. The study adopts the survey questionnaire approach to collect empirical data. The questionnaire was distributed to ERP users, and a total of 384 responses were used for analysis. The results illustrate that both system quality and vendor support have a significant effect on ERP implementation success. This implies that companies must ensure they source the best available system and choose a vendor that is dependable, reliable and trustworthy.
Abstract: This study aims to conduct a preliminary investigation to determine the topic to focus on in developing a Virtual Laboratory for Biology (VLab-Bio). The questionnaire respondents were Form Five students (equivalent to A-Level) and biology teachers. The time and economic resources required for setting up and constructing scientific laboratories can be reduced by adopting virtual laboratories as an educational tool. Thus, it is hoped that the proposed virtual laboratory will help students to learn abstract concepts in biology. Findings show that the difficult topic chosen is Cell Division, and the learning objective to be focused on in developing the virtual lab is "Describe the application of knowledge on mitosis in cloning".
Abstract: E-mail has become an important means of electronic communication, but the viability of its usage is marred by Unsolicited Bulk e-mail (UBE) messages. UBE takes many forms, including pornographic, virus-infected and 'cry-for-help' messages, as well as fake and fraudulent offers for jobs, winnings and medicines. UBE poses technical and socio-economic challenges to the usage of e-mail. To meet this challenge and combat this menace, we need to understand UBE. Towards this end, the current paper presents a content-based textual analysis of more than 2,700 body-enhancement medicinal UBE messages. Technically, this is an application of text parsing and tokenization to unstructured textual documents, which we approach using Bag of Words (BOW) and Vector Space Document Model techniques. We identify the most frequently occurring lexis in UBE documents that advertise various body-enhancement products, and present an analysis of the top 100 such terms. We exhibit the relationship between the occurrence of a word from the identified lexis set in a given UBE message and the probability that the message advertises a fake medicinal product. To the best of our knowledge and survey of the related literature, this is the first formal attempt to identify the most frequently occurring lexis in such UBE through textual analysis. Finally, this is a sincere attempt to raise awareness of, and mitigate the threat posed by, such luring but fake UBE.
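The BOW step described above can be sketched as follows; this is a minimal illustration, not the authors' pipeline, and the miniature corpus stands in for the real UBE bodies:

```python
from collections import Counter
import re

def tokenize(text):
    """Lower-case a document and split it into word tokens (Bag of Words)."""
    return re.findall(r"[a-z']+", text.lower())

def top_lexis(documents, k):
    """Return the k most frequently occurring tokens across a corpus."""
    counts = Counter()
    for doc in documents:
        counts.update(tokenize(doc))
    return [word for word, _ in counts.most_common(k)]

# Hypothetical miniature corpus standing in for the 2,700+ UBE bodies
corpus = [
    "Amazing pills for fast enhancement, order pills now",
    "Cheap pills, guaranteed enhancement results",
]
print(top_lexis(corpus, 2))  # → ['pills', 'enhancement']
```

The relative frequency of each top-lexis term across labelled messages could then serve as an estimate of the probability that a message containing it advertises a fake product.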
Abstract: This paper presents two novel techniques for skew estimation of binary document images. These algorithms are based on connected component analysis and the Hough transform, and both focus on reducing the amount of input data provided to the Hough transform. In the first method, referred to as the word centroid approach, the centroids of selected words are used for skew detection. In the second method, referred to as the dilate-and-thin approach, selected characters are blocked and dilated to obtain word blocks, and thinning is then applied; the final image fed to the Hough transform contains the thinned coordinates of the word blocks. The methods succeed in reducing the computational complexity of Hough-transform-based skew estimation algorithms. Promising experimental results are also provided to demonstrate the effectiveness of the proposed methods.
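The word centroid idea can be illustrated with a Hough-style voting sketch: for each candidate angle, the centroids are projected onto the axis perpendicular to that angle, and the angle at which projections cluster most tightly into bins (i.e. text lines align) wins. This is an illustrative reconstruction under assumed parameters, not the paper's algorithm:

```python
import math

def estimate_skew(centroids, angle_range=5.0, step=0.1, bin_size=1.0):
    """Vote over candidate angles: project word centroids along each angle
    and score the angle by the largest cluster of projected values."""
    best_angle, best_score = 0.0, -1
    a = -angle_range
    while a <= angle_range + 1e-9:
        t = math.tan(math.radians(a))
        bins = {}
        for x, y in centroids:
            b = round((y - x * t) / bin_size)
            bins[b] = bins.get(b, 0) + 1
        score = max(bins.values())
        if score > best_score:
            best_score, best_angle = score, a
        a += step
    return best_angle

# Hypothetical centroids of words on two text lines skewed by 2 degrees
t = math.tan(math.radians(2.0))
pts = [(x, 10 + x * t) for x in range(0, 200, 20)]
pts += [(x, 40 + x * t) for x in range(0, 200, 20)]
print(estimate_skew(pts))  # close to 2 degrees
```

Feeding only word centroids (a few dozen points) rather than every foreground pixel into such voting is precisely what keeps the Hough accumulation cheap.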
Abstract: In this paper, we present a new method for solving quadratic programming problems that are not strictly convex. The constraints of the problem are linear equalities and inequalities, with bounded variables. The suggested method combines active-set strategies with support methods. The algorithm of the method and numerical experiments are presented, comparing our approach with the active-set method on randomly generated problems.
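One building block shared by active-set and support methods is solving the equality-constrained subproblem defined by the current working set. The sketch below (illustrative, not the paper's algorithm) solves min ½xᵀQx + cᵀx subject to Ax = b via its KKT system:

```python
import numpy as np

def solve_eq_qp(Q, c, A, b):
    """Solve the KKT system [[Q, A^T], [A, 0]] [x; lam] = [-c; b] for the
    equality-constrained QP subproblem of an active-set iteration."""
    n, m = Q.shape[0], A.shape[0]
    K = np.block([[Q, A.T], [A, np.zeros((m, m))]])
    rhs = np.concatenate([-c, b])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:]

# Example: min x^2 + y^2 - 2x - 4y  s.t.  x + y = 1
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-2.0, -4.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x, lam = solve_eq_qp(Q, c, A, b)
print(x)  # → [0. 1.]
```

An active-set loop would repeatedly solve such subproblems while adding or dropping inequality constraints based on the signs of the multipliers `lam`.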
Abstract: Eight heavy metals (Cu, Cr, Zn, Hg, Pb, Cd, Ni and As) were analyzed in sediment samples collected in the dry and wet seasons from November 2009 to October 2010 in West Port, Peninsular Malaysia. The heavy metal concentrations (mg/kg dry weight) ranged from 23.4 to 98.3 for Zn, 22.3 to 80 for Pb, 7.4 to 27.6 for Cu, 0.244 to 3.53 for Cd, 7.2 to 22.2 for Ni, 20.2 to 162 for As, 0.11 to 0.409 for Hg and 11.5 to 61.5 for Cr. Metal concentrations in the dry season were higher than in the rainy season, except for copper and chromium. Analysis of variance with the Statistical Analysis System (SAS) shows that the mean concentrations of metals in the two seasons (α = 0.05) are not significantly different, which indicates that the metals were held firmly in the sediment matrix. There are, however, significant differences between the control station and the other stations. According to the Interim Sediment Quality Guidelines (ISQG), the sediments are moderately polluted by these metals, except for arsenic, which shows the highest level of pollution.
Abstract: Legionella pneumophila is involved in more than 95% of cases of severe atypical pneumonia. Infection occurs mainly by inhalation of indoor aerosols from water-coolant systems. Because some Legionella strains may be viable but not culturable, Taq polymerase DNA amplification and semi-nested PCR were carried out to detect the Legionella-specific 16S rDNA sequence. For this purpose, 1.5-litre water samples from 77 water-coolant systems were collected from four different hospitals, two nursing homes and one student hostel in the city of Kerman, Iran, each in a brand-new plastic bottle, during the summer season of 2006 (from April to August). The samples were filtered under sterile conditions through a Millipore membrane filter. DNA was extracted from the membrane and used for PCR to detect Legionella spp. The PCR product was then subjected to semi-nested PCR for detection of L. pneumophila. Of the 77 water samples tested by PCR, 30 (39%) were positive for Legionella species; L. pneumophila was detected in 14 (18.2%) of the water samples by semi-nested PCR. From these results it can be concluded that the water-coolant systems of hospitals and nursing homes in Kerman are highly contaminated with L. pneumophila and pose a serious concern. We therefore recommend avoiding this type of coolant system in hospitals and nursing homes.
Abstract: Cantilever L-shaped walls are known to be a relatively economical retaining solution. Design starts by proportioning the wall dimensions, for which stability is then checked. A ratio between the lengths of the base and the stem of 0.5 to 0.7 satisfies the stability requirements in most cases; however, the displacement pattern of the wall in terms of rotations and translations, and the lateral pressure profile, are not the same for all wall proportions, as is usually assumed. In the present work the results of a numerical analysis are presented for different wall geometries. The results show that the proportioning governs the equilibrium between the instantaneous rotation and the translation of the wall toe. Moreover, the lateral pressure estimation based on the average of the at-rest and active pressures, recommended by most design standards, is found not to be applicable to all walls.
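For context, the "average" design coefficient the abstract questions is typically formed from Jaky's at-rest coefficient K0 = 1 − sin φ and Rankine's active coefficient Ka = tan²(45° − φ/2) for a cohesionless backfill. A minimal numeric sketch (illustrative values, not from the paper):

```python
import math

def pressure_coefficients(phi_deg):
    """Return (K0, Ka, average) for a cohesionless backfill with friction
    angle phi: Jaky's at-rest K0 and Rankine's active Ka."""
    phi = math.radians(phi_deg)
    k0 = 1.0 - math.sin(phi)                            # at-rest (Jaky)
    ka = math.tan(math.radians(45.0 - phi_deg / 2)) ** 2  # active (Rankine)
    return k0, ka, (k0 + ka) / 2.0

# Example: a typical sand with phi = 30 degrees
k0, ka, k_avg = pressure_coefficients(30.0)
print(round(k0, 3), round(ka, 3), round(k_avg, 3))  # → 0.5 0.333 0.417
```

The paper's point is that the actual mobilized coefficient depends on the rotation/translation mode of the wall, which this single averaged value cannot capture for all proportions.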
Abstract: Load forecasting plays a paramount role in the operation and management of power systems. Accurate estimation of future power demands for various lead times facilitates the task of generating power reliably and economically. The forecasting of future loads for relatively large lead times (months to a few years), i.e. long-term load forecasting, is studied here. Among the various techniques used in load forecasting, artificial intelligence techniques provide greater accuracy than conventional techniques. Fuzzy logic, a very robust artificial intelligence technique, is applied in this paper to forecast load on a long-term basis. The paper gives a general algorithm for long-term load forecasting. The algorithm extends a short-term load forecasting method to the long term and concentrates not only on the forecast load values but also on the errors incorporated into the forecast. Hence, by correcting these errors, forecasts with very high accuracy have been achieved. The algorithm is demonstrated using data collected for the residential sector (LT2(a) type load: domestic consumers). Load is determined for three consecutive years (April 2006 to March 2009) in order to demonstrate the efficiency of the algorithm, and is forecast for the next two years (April 2009 to March 2011).
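The error-correction idea can be sketched with a minimal fuzzy inference step: the observed forecast error is fuzzified with triangular membership functions and a weighted-average defuzzification yields a correction. The membership supports and rule outputs below are illustrative assumptions, not the paper's rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_correction(error_pct):
    """Map a past forecast error (%) to a correction (%) via three fuzzy
    sets (negative, zero, positive) and centroid-style defuzzification."""
    sets = [((-10, -5, 0), -5.0), ((-5, 0, 5), 0.0), ((0, 5, 10), 5.0)]
    num = den = 0.0
    for (a, b, c), out in sets:
        mu = tri(error_pct, a, b, c)
        num += mu * out
        den += mu
    return num / den if den else 0.0

forecast = 1000.0  # MW, hypothetical long-term forecast value
corrected = forecast * (1 - fuzzy_correction(2.0) / 100.0)
print(corrected)  # → 980.0
```

In the paper's scheme, repeatedly folding such corrections back into the forecast is what drives the reported accuracy gains.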
Abstract: Measurements of the capacitance C and dissipation factor tan δ of the stator insulation system provide useful information about internal defects within the insulation. The index k is defined as the proportionality constant between the changes at high voltage of the capacitance, ΔC, and of the dissipation factor, Δtan δ. ΔC and Δtan δ values were highly correlated when small flat defects were present within the insulation, and that correlation was lost in the presence of large narrow defects such as electrical treeing. The discrimination between small and large defects is made by resorting to partial discharge (PD) phase-angle analysis. To validate the results, C and tan δ measurements were carried out on a 15 MVA, 4160 V steam-turbine turbogenerator located in a sugar mill. In addition, laboratory test results obtained by other authors were analyzed jointly. In those laboratory tests, model coil bars subjected to thermal cycling became highly degraded, and the ΔC and Δtan δ values were not correlated; thus, the index k could not be calculated.
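Numerically, with paired measurements the proportionality constant k can be fitted by least squares through the origin, and the Pearson correlation coefficient indicates whether the proportionality actually holds. The data below are hypothetical, purely to illustrate the computation:

```python
import math

# Hypothetical paired high-voltage increases (not measured values)
dC = [1.0, 2.1, 2.9, 4.2, 5.1]     # Delta C, pF
dtan = [0.5, 1.1, 1.4, 2.0, 2.6]   # Delta tan(delta), x1e-3

# Least-squares slope through the origin: k = sum(x*y) / sum(x^2)
k = sum(x * y for x, y in zip(dC, dtan)) / sum(x * x for x in dC)

# Pearson correlation between Delta C and Delta tan(delta)
n = len(dC)
mx, my = sum(dC) / n, sum(dtan) / n
cov = sum((x - mx) * (y - my) for x, y in zip(dC, dtan))
r = cov / math.sqrt(sum((x - mx) ** 2 for x in dC)
                    * sum((y - my) ** 2 for y in dtan))
print(round(k, 3), round(r, 3))
```

A correlation near 1 (small flat defects) makes k meaningful; a low correlation (e.g. treeing-degraded bars) is exactly the situation in which the abstract notes k cannot be computed.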
Abstract: Finding the shortest path between two positions is a fundamental problem in transportation, routing, and communications applications. In robot motion planning, the robot should pass around the obstacles, touching none of them; i.e., the goal is to find a collision-free path from a starting to a target position. This task has many specific formulations depending on the shape of the obstacles, the allowable directions of movement, knowledge of the scene, etc. Research on path planning has yielded many fundamentally different approaches to its solution, mainly based on various decomposition and roadmap methods. In this paper, we show a possible use of visibility graphs in point-to-point motion planning in the Euclidean plane, and an alternative approach using Voronoi diagrams that decreases the probability of collisions with obstacles. The second application area investigated here focuses on finding minimal networks connecting a set of given points in the plane, using either only straight connections between pairs of points (minimum spanning tree) or allowing the addition of auxiliary points to the set to obtain shorter spanning networks (minimum Steiner tree).
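The minimum spanning tree case can be sketched directly; this is a standard Prim's-algorithm illustration on the complete Euclidean graph, not the paper's implementation:

```python
import math

def mst_weight(points):
    """Prim's algorithm on the complete Euclidean graph: total length of the
    minimum spanning tree using only direct point-to-point segments."""
    n = len(points)
    in_tree = [False] * n
    dist = [math.inf] * n   # cheapest connection of each point to the tree
    dist[0] = 0.0
    total = 0.0
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: dist[i])
        in_tree[u] = True
        total += dist[u]
        ux, uy = points[u]
        for v in range(n):
            if not in_tree[v]:
                d = math.hypot(points[v][0] - ux, points[v][1] - uy)
                dist[v] = min(dist[v], d)
    return total

# Unit square: the MST uses three unit-length edges
print(mst_weight([(0, 0), (1, 0), (0, 1), (1, 1)]))  # → 3.0
```

The unit square also illustrates the Steiner gain mentioned above: allowing two auxiliary points yields a Steiner minimal tree of length 1 + √3 ≈ 2.732, shorter than the MST's 3.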
Abstract: The article touches upon questions of information security in the Russian economy. It covers the theoretical bases of information security and the causes of its development. The theory is supported by an analysis of business activities and the main tendencies of information security development. Perm region has been chosen as the basis for the analysis, being the fastest-developing region that uses information security methods in managing its economy. As a result of the study, the authors have formulated their own vision of the problem of information security in various branches of the economy and stated the prospects of information security development and its growing role in the Russian economy.
Abstract: The paper investigates the feasibility of constructing a software multi-agent-based monitoring and classification system and utilizing it to provide automated and accurate classification of end users developing applications in the spreadsheet domain. The agents function autonomously to provide continuous and periodic monitoring of Excel spreadsheet workbooks. The result is the Multi-Agent Classification System (MACS), which complies with the specifications of the Foundation for Intelligent Physical Agents (FIPA). Different technologies have been brought together to build MACS; the strength of the system is the integration of agent technology and the FIPA specifications with other technologies, namely Windows Communication Foundation (WCF) services, Service-Oriented Architecture (SOA), and Oracle Data Mining (ODM). Microsoft .NET Windows-service-based agents were utilized to develop the monitoring agents of MACS, while the .NET WCF services together with the SOA approach allow the distribution of, and communication between, agents over the web in order to satisfy the multiple-developer aspect of monitoring and classification. ODM was used to automate the classification phase of MACS.
Abstract: The study of non-equilibrium systems has attracted increasing interest in recent years, mainly because they lack the theoretical frameworks available for their equilibrium counterparts. Studying steady states and/or simple systems is thus one of the main approaches. Hence, in this work we have focused our attention on the driven lattice gas (DLG) model, consisting of interacting particles subject to an external field E. The dynamics of the system are given by hopping of particles to nearby empty sites, with rates biased towards jumps in the direction of E. Using small two-dimensional DLG systems, the stochastic properties at the non-equilibrium steady state were studied analytically. To understand the non-equilibrium phenomena, we applied an analytic approach via the master equation to calculate the probability function and to analyze the violation of detailed balance in terms of the fluctuation-dissipation theorem. Monte Carlo simulations were performed to validate the analytic results.
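The biased-hopping dynamics can be sketched with a minimal Monte Carlo step; this simplified version uses an illustrative bias weight and omits the nearest-neighbour interaction energy of the full DLG model:

```python
import random

def dlg_step(occ, L, drive_weight, rng):
    """One attempted hop on an L x L periodic lattice: a random particle
    tries to move to a neighbouring empty site, with the +x direction
    (along the field E) favoured by drive_weight extra votes."""
    x, y = rng.randrange(L), rng.randrange(L)
    if not occ[x][y]:
        return
    moves = [(1, 0)] * drive_weight + [(-1, 0), (0, 1), (0, -1)]
    dx, dy = rng.choice(moves)
    nx, ny = (x + dx) % L, (y + dy) % L
    if not occ[nx][ny]:            # hop only into empty sites
        occ[x][y], occ[nx][ny] = False, True

rng = random.Random(0)
L = 8
occ = [[rng.random() < 0.5 for _ in range(L)] for _ in range(L)]
n0 = sum(map(sum, occ))
for _ in range(10000):
    dlg_step(occ, L, 3, rng)
print(n0, sum(map(sum, occ)))  # particle number is conserved
```

Histogramming configurations visited by such a chain is the simulation counterpart of the probability function obtained analytically from the master equation.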
Abstract: The purpose of this study is to investigate the ideas conveyed by a virtual fitting-room and to develop the discussion in an auxiliary narrative way. The research is structured around a 3D virtual fitting-room as the research subject. We first discuss the principles of narrative study, user demand, and related topics, using a narrative design pattern to convey the objective indications of "people-situation-reason-object", and then analyze virtual fitting-room examples that can provide new thinking, in "storytelling" and "user-centered design" forms, for designers engaged in the clothing industry. Clothing designs do not merely cover the body to keep it warm; through interactive designs they address people's physiological and psychological demands so as to achieve cognition between people and environment. Beyond the "outside" goal of clothing's functional design, we use the behavioral characteristics of tribal groups to "transform" existing personal cultural stories and "reform" them into appropriate interactive products. Synthesizing the above, "narrative" can be regarded not only as a kind of functional thinking process but also as an activity of selecting, arranging, and expressing stories, allowing the spirit of interactive design, product characteristics, and experience ideas to be transmitted to the target tribal group through visual imagery. This is a confident and innovative attempt that can also achieve entertaining, joyful, and similar fundamental interactive transmissions. Therefore, this study takes "user-centered design" thinking as the basis for establishing a set of clothing designs with interactive experience patterns, assisting designers to examine the five senses of interactive demands in order to create new value in the textile industry.
Abstract: In this paper, we propose a new framework that incorporates an intelligent software agent into a crisis communication portal (CCNet) in order to send alert news to subscribed users via e-mail and mobile services such as the Short Message Service (SMS), Multimedia Messaging Service (MMS) and General Packet Radio Service (GPRS). The content on the mobile services can be delivered through either a mobile phone or a Personal Digital Assistant (PDA). This research shows that, with the proposed framework, the embodied conversational agent system can handle questions intelligently using our multilayer architecture. At the same time, the extended framework can deliver content through a more humanoid interface on mobile devices.
Abstract: Quantum mechanics simulation was applied to calculate the interaction force between two molecules at the atomic level. The simple extractive distillation system considered is ternary, consisting of two close-boiling components (A, the lower-boiling component, and B, the higher-boiling component) and a solvent (S). The quantum mechanics simulation was used to calculate the intermolecular (interaction) forces between the close-boiling components and the solvents, i.e. between A-S and B-S. The requirement for a promising extractive distillation solvent is that the solvent (S) form a stronger intermolecular force with only one of the components (A or B) than with the other. In this study, aromatic-aromatic, aromatic-cycloparaffin, and paraffin-diolefin systems were selected to demonstrate solvent selection. The study defines a new screening term, the relative interaction force, which is calculated from the quantum mechanics simulation. The results showed that the relative interaction force gives good agreement with literature data (relative volatilities from experiment); the reasons are discussed. Finally, this study suggests that quantum mechanics results can improve relative volatility estimation for solvent screening, reducing the time and money consumed.
Abstract: In mobile computing environments, many problems arise that do not exist in distributed systems consisting of stationary hosts, owing to host mobility, sudden disconnection by handoff in wireless networks, voluntary disconnection for efficient power consumption of a mobile host, etc. To solve these problems, we propose the architecture of a Partial Connection Manager (PCM) in this paper. PCM creates a limited number of mobile agents according to priority, sends them in parallel to servers, and combines the results to process the user request rapidly. By applying the proposed PCM to a mobile market agent service, we find that the mobile agent technique is well suited to the mobile computing environment and to managing the partial connection problem.
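The dispatch-and-combine pattern described above can be sketched as follows; the server names, priorities, and `query_server` stand-in are hypothetical, and threads stand in for mobile agents:

```python
from concurrent.futures import ThreadPoolExecutor

def query_server(server):
    """Stand-in for a mobile agent visiting one server and returning a
    partial result."""
    return {server: f"result-from-{server}"}

def partial_connection_query(servers, max_agents):
    """PCM-style dispatch: pick at most max_agents servers by priority
    (lowest number = highest priority), query them in parallel, and
    combine whatever partial results come back."""
    chosen = sorted(servers, key=lambda s: servers[s])[:max_agents]
    combined = {}
    with ThreadPoolExecutor(max_workers=max_agents) as pool:
        for result in pool.map(query_server, chosen):
            combined.update(result)
    return combined

servers = {"market-a": 1, "market-b": 2, "market-c": 3}  # name -> priority
print(partial_connection_query(servers, 2))
```

Capping the agent count and merging partial results is what lets a request complete quickly even when some servers are unreachable during a disconnection.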
Abstract: This study focuses on an evaluation of Hokkaido, the northernmost prefecture of Japan and the largest by surface area, and particularly on two points: the rivalry between different kinds of land use, such as urban land and agricultural and forestry land, in various cities and their surrounding areas; and the possibilities for forestry biomass in areas other than those mentioned above. It identifies which areas require examination of the nature of land use control and guidance by conducting land use analysis at the district level using GIS (Geographic Information Systems). The results of the analysis demonstrate that it is essential to divide the whole of Hokkaido into two areas, those within delineated city planning areas and those outside them, and to conduct an evaluation of land use control for each.
In delineated urban areas, and particularly in built-up areas, it is essential to re-examine land use from the point of view of compact cities or smart cities, along with conducting an evaluation of land use control that focuses on the rivalry between kinds of land use such as urban land and agricultural and forestry land. In areas outside delineated urban areas, it is desirable to build a specific community recycling range based on forest biomass utilization by evaluating land use control concerning the possibilities for forest biomass, focusing particularly on forests within and outside city planning areas.
Abstract: This paper presents an effective traffic light detection method for night-time scenes. First, candidate blobs of traffic lights are extracted from the RGB color image: the input image is represented in the dominant color domain using the color transform proposed by Ruta, and red- and green-dominant regions are selected as candidates. After candidate blob selection, a shape filter is applied for noise reduction using blob information such as length, area, bounding-box area, etc. A multi-class classifier based on an SVM (Support Vector Machine) is then applied to the candidates. Three kinds of features are used: basic features such as blob width, height, center coordinates, area, and blob area; brightness-based stochastic features; and, in particular, geometric moment values between the candidate region and the adjacent region, which are proposed here and used to improve detection performance. The proposed system is implemented on an Intel Core CPU at 2.80 GHz with 4 GB of RAM and tested on urban and rural road videos. Through these tests, we show that the proposed method using PF, BMF, and GMF reaches a detection rate of up to 93% with an average computation time of 15 ms/frame.
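The shape-filtering stage can be sketched with simple geometric cues on each blob (area, bounding-box aspect ratio, fill ratio); the thresholds and blob fields below are illustrative assumptions, not the paper's values:

```python
def shape_filter(blobs, min_area=20, max_area=400, max_aspect=1.5):
    """Keep candidate blobs whose geometry is plausible for a traffic
    light lens: bounded area, near-square bounding box, and a high fill
    ratio (lights are roughly circular, reflections are elongated)."""
    kept = []
    for blob in blobs:
        w, h, area = blob["w"], blob["h"], blob["area"]
        aspect = max(w, h) / min(w, h)
        fill = area / (w * h)
        if min_area <= area <= max_area and aspect <= max_aspect and fill > 0.5:
            kept.append(blob)
    return kept

# Hypothetical candidate blobs from the color-dominance step
candidates = [
    {"w": 10, "h": 10, "area": 78},   # round blob: plausible light
    {"w": 40, "h": 4, "area": 150},   # elongated: likely a reflection
    {"w": 3, "h": 3, "area": 6},      # too small: noise
]
print(len(shape_filter(candidates)))  # → 1
```

Only the survivors of such cheap geometric tests need to reach the more expensive SVM stage, which is what keeps the per-frame cost low.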