Abstract: Most existing text mining approaches are designed with
the transaction database model in mind. The mined dataset is thus
structured using just one concept, the “transaction”, while the
whole dataset is modeled using the “set” abstract type. In such
cases, the structure of the whole dataset and the relationships
among the transactions themselves are not modeled and
consequently not considered in the mining process.
We believe that taking the structural properties of
hierarchically structured information (e.g., textual documents)
into account in the mining process can lead to better results. For
this purpose, a hierarchical association rule mining approach for
textual documents is proposed in this paper, and the classical
set-oriented mining approach is reconsidered in favor of a
Directed Acyclic Graph (DAG) oriented approach. Natural language
processing techniques are used to obtain the DAG structure. Based
on this graph model, a hierarchical bottom-up algorithm is
proposed, whose main idea is that each node is mined together with
its parent node.
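The bottom-up idea can be sketched as follows, under heavy simplifying assumptions: a toy DAG of document sections, each holding a bag of terms, with co-occurring term pairs counted over each node merged with its parent. The node names, terms and pair-counting measure are invented for illustration and are not the paper's actual algorithm:

```python
from collections import Counter
from itertools import combinations

# Hypothetical toy structure: each node of the document DAG holds a bag
# of terms; `parent` maps a node to its parent, None for the root.
terms = {
    "root": {"mining", "text"},
    "sec1": {"mining", "rules", "association"},
    "sec2": {"text", "graph", "mining"},
}
parent = {"root": None, "sec1": "root", "sec2": "root"}

def mine_with_parent(terms, parent):
    """Count term pairs co-occurring in a node merged with its parent,
    a rough sketch of the 'mine each node with its parent' idea."""
    counts = Counter()
    for node, bag in terms.items():
        if parent[node] is None:
            continue
        merged = bag | terms[parent[node]]  # child terms joined with parent terms
        for pair in combinations(sorted(merged), 2):
            counts[pair] += 1
    return counts

pairs = mine_with_parent(terms, parent)
```

Pairs that appear in every section-plus-parent bag (here, "mining" with "text") accumulate the highest counts, which is the kind of hierarchical regularity a set-oriented miner would miss.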
Abstract: An application framework provides a reusable design
and implementation for a family of software systems. Frameworks
are introduced to reduce the cost of a product line (i.e., a family
of products that share common features). Software testing is a
time-consuming and costly ongoing activity during the application
software development process. Generating reusable test cases for
framework applications during the framework development stage, and
providing and using those test cases to test part of a framework
application whenever the framework is used, reduces the application
development time and cost considerably. This paper introduces the
Framework Interface State Transition Tester (FIST2), a tool for
automated unit testing of Java framework applications. During the
framework development stage, given the formal descriptions of the
framework hooks, the specifications of the methods of the
framework's extensible classes, and the illegal behavior description
of the Framework Interface Classes (FICs), FIST2 generates
unit-level test cases for the classes. At the framework application
development stage, given the customized method specifications of
the implemented FICs, FIST2 automates the use, execution, and
evaluation of the already generated test cases to test the
implemented FICs. The paper illustrates the use of the FIST2 tool
for testing several applications that use the SalesPoint framework.
Abstract: The fast development of technologies, economic globalization and many other external circumstances stimulate companies' competitiveness. One of the major trends in today's business is the shift toward exploiting the Internet and the electronic environment for entrepreneurial needs. Recent research confirms that the e-environment provides a range of possibilities and opportunities for companies, especially for micro-, small- and medium-sized companies, which have limited resources. The usage of e-tools raises the effectiveness and profitability of an organization, as well as its competitiveness.
In the electronic market, as in the classic one, there are factors, such as globalization, the development of new technology, price-sensitive consumers, the Internet, and new distribution and communication channels, that influence entrepreneurship. As a result of e-environment development, e-commerce and e-marketing grow as well.
Objective of the paper: To describe and identify the factors influencing a company's competitiveness in the e-environment.
Research methodology: The authors employ well-established quantitative and qualitative methods of research: grouping, analysis, statistical methods, factor analysis in the SPSS 20 environment, etc. The theoretical and methodological background of the research is formed from scientific research and publications, such as those from mass media and professional literature; statistical information from legal institutions; and information collected by the authors during the survey process.
Research result: The authors detected and classified the factors influencing competitiveness in the e-environment.
In this paper, the authors present their findings based on theoretical, scientific, and field research. The authors conducted research on e-environment utilization among Latvian enterprises.
Abstract: This paper argues that increased uncertainty, in
certain situations, may actually encourage investment. Since earlier
studies mostly base their arguments on the assumption of geometric
Brownian motion, this study extends the assumption to alternative
stochastic processes, such as the mixed diffusion-jump,
mean-reverting, and jump amplitude processes. A general Monte Carlo
simulation approach is developed to derive the optimal investment
trigger in situations where a closed-form solution cannot be readily
obtained under an alternative process. The main finding is that the
overall effect of uncertainty on investment is captured by the
probability of investing, and the relationship between uncertainty
and investment appears to be an inverted U-shaped curve. The
implication is that uncertainty does not always discourage
investment, even under several sources of uncertainty. Furthermore,
high-risk projects are not always dominated by low-risk projects,
because high-risk projects may have a positive realization effect
that encourages investment.
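As a rough illustration of the simulation approach, the sketch below works under the geometric Brownian motion assumption only, with hypothetical drift, volatility, trigger and horizon values that are not taken from the paper: the probability of investing is estimated as the fraction of simulated value paths that reach the investment trigger within the horizon.

```python
import math
import random

def prob_invest(v0=1.0, trigger=1.5, mu=0.02, sigma=0.2,
                horizon=5.0, steps=250, n_paths=2000, seed=7):
    """Estimate the probability that a GBM project value hits the
    investment trigger within the horizon (all parameters hypothetical)."""
    rng = random.Random(seed)
    dt = horizon / steps
    hits = 0
    for _ in range(n_paths):
        v = v0
        for _ in range(steps):
            z = rng.gauss(0.0, 1.0)
            # exact GBM step: dV/V = mu dt + sigma dW
            v *= math.exp((mu - 0.5 * sigma**2) * dt
                          + sigma * math.sqrt(dt) * z)
            if v >= trigger:
                hits += 1
                break
    return hits / n_paths

p = prob_invest()
```

With these illustrative parameters (trigger well above the drifted mean), the hitting probability rises with volatility, consistent with the idea that higher uncertainty can raise the probability of investing.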
Abstract: On-line (near infrared) spectroscopy is widely used to support the operation of complex process systems. Information extracted from a spectral database can be used to estimate unmeasured product properties and monitor the operation of the process. These techniques are based on looking for similar spectra with nearest-neighbor algorithms and distance-based searching methods. The search for nearest neighbors in the spectral space is an NP-hard problem whose computational complexity increases with the number of points in the discrete spectrum and the number of samples in the database. To reduce the calculation time, some kind of indexing can be used. The main idea presented in this paper is to combine indexing and visualization techniques to reduce the computational requirements of the estimation algorithms by providing a two-dimensional index that can also be used to visualize the structure of the spectral database. This 2D visualization of the spectral database not only supports the application of distance- and similarity-based techniques but also enables the use of advanced clustering and prediction algorithms based on the Delaunay tessellation of the mapped spectral space. This means that prediction need not operate in the high-dimensional space but can be based on the mapped space as well. The results illustrate that the proposed method is able to segment (cluster) spectral databases and to detect outliers that are not suitable for instance-based learning algorithms.
Abstract: The present work considers a scheme for evaluating
the transition probability in a quantum system. It is based on the
path integral representation of the transition probability amplitude
and its evaluation by means of a saddle-point method applied to
part of the integration variables. The whole integration process is
reduced to solving initial value problems for Hamilton's equations
with random initial phase points. The scheme is related to
semiclassical initial value representation approaches that use a
great number of trajectories. In contrast to them, only one path
from the total set of generated phase paths is selected for each
initial coordinate value in the Monte Carlo process.
Abstract: There has been a growing shift in communication
management from the simple coordination of promotional tools to a
complex strategic process. This study examines the current
marketing communications and engagement strategies used in
addressing key stakeholders. In the case of the fertilizer industry
in Malaysia, there has been little empirical research on
stakeholder communication, even though a major challenge facing the
modern corporation is the need to communicate its identity, its
values and its products in order to distinguish itself from
competitors. The study will employ both quantitative and
qualitative methods and use Structural Equation Modeling (SEM) to
establish causal relationships among the key factors of stakeholder
communication strategies, increases in consumers'
choice/acceptance, and impact on financial performance. One of the
major contributions is a conceptual framework for communication
strategies and engagement in increasing consumers' acceptance level
and the firm's financial performance.
Abstract: In inspection and workpiece localization, sampled point data is an important issue. Since sampling devices capture only discrete points, not the complete surface, the size and location of the sampled point set must be taken into consideration. This paper presents a method for determining the size and location of the sampled points to achieve efficient sampling. First, an uncertainty analysis of the localization parameters is investigated: a localization uncertainty model is developed to predict the uncertainty of the localization process, and this model is used to predict the minimum number of sampled points. Second, based on algebraic theory, an eigenvalue-optimal optimization is proposed. A freeform surface is then used in simulation and the proposed optimization is implemented. The simulation results show its effectiveness.
Abstract: The development of the intelligent assembly cell concept includes a new kind of solution for creating the structures of automated and flexible assembly systems. The current trend of increasing final product quality is affected by time analysis of the entire manufacturing process. The primary requirement of manufacturing is to produce as many products as possible, as quickly as possible, at the lowest possible cost, but of course with the highest quality. Such requirements can be satisfied only if all the elements entering and affecting the production cycle are in a fully functional condition. These elements consist of the sensory equipment and intelligent control elements that are essential for building intelligent manufacturing systems. The intelligent behavior of the system as a control system rests on monitoring the important parameters of the system in real time. An intelligent manufacturing system should itself be a system that can flexibly respond to changes entering and exiting the process in interaction with its surroundings.
Abstract: Industrial surveys show that manufacturing
companies judge the quality of a thermal removal process by the
dimensions and physical appearance of the cut material
surface. Therefore, the roughness of the surface area of material
cut by the plasma arc cutting process and the rate of material
removed by a manual plasma arc cutting machine were the main
considerations. A Selco Genesis 90 plasma arc cutter was used to
cut standard AISI 1017 steel of 200 mm x 100 mm x 6 mm manually,
based on the selected parameter settings. The material removal rate
(MRR) was measured by determining the weight of the specimens
before and after the cutting process. The surface roughness (SR)
analysis was conducted using a Mitutoyo CS-3100 to determine the
average roughness value (Ra). The Taguchi method was utilized to
achieve the optimum condition for both outputs studied. The
microstructure analysis in the region of the cutting surface was
performed using SEM. The results reveal that the SR values are
inversely proportional to the MRR values. The quality of the
surface roughness depends on the dross peaks that occur after the
cutting process.
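The Taguchi optimization relies on signal-to-noise (S/N) ratios. As a minimal sketch (the replicate values below are hypothetical, not the measured data), the two standard ratios relevant here are larger-is-better for MRR and smaller-is-better for Ra:

```python
import math

def sn_larger_is_better(values):
    """Taguchi S/N ratio for a larger-is-better response (e.g. MRR)."""
    return -10.0 * math.log10(sum(1.0 / v**2 for v in values) / len(values))

def sn_smaller_is_better(values):
    """Taguchi S/N ratio for a smaller-is-better response (e.g. Ra)."""
    return -10.0 * math.log10(sum(v**2 for v in values) / len(values))

# Hypothetical replicate measurements for one parameter setting:
mrr = [12.1, 11.8, 12.4]   # material removal rate replicates
ra = [3.2, 3.5, 3.1]       # average roughness replicates
sn_mrr = sn_larger_is_better(mrr)
sn_ra = sn_smaller_is_better(ra)
```

The parameter level with the highest S/N ratio for each response is selected as the optimum, which is how both outputs can be optimized from the same orthogonal-array experiment.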
Abstract: The dual bell nozzle is a promising altitude
adaptation nozzle concept, offering increased nozzle
performance in rocket engines. Its advantage is its simplicity,
due to the absence of any additional mechanical device or movable
parts. Hence it offers reliability along with the improved nozzle
performance demanded by future launch vehicles. Among other
issues, the flow transition to the extension nozzle of a dual bell
nozzle is one of the major issues studied in the development of
dual bell nozzles. A parameter named the over-expansion factor,
which controls the value of the wall inflection angle, has been
reported to have a substantial influence on this transition
process. This paper studies, through CFD and cold flow experiments,
the effect of the over-expansion factor on flow transition in dual
bell nozzles.
Abstract: Multi-criteria decision analysis (MCDA) covers both
data and experience. It is commonly used to solve problems with
many parameters and uncertainties. GIS-supported solutions improve
and speed up the decision process. Weighted grading, as an MCDA
method, is employed here to solve geotechnical problems. In this
study, the geotechnical parameters soil type, SPT (N) blow
number, shear wave velocity (Vs) and depth of the underground
water level (DUWL) are combined in MCDA and GIS. In terms of
geotechnical aspects, the settlement suitability of the municipal
area was analyzed by the method. The MCDA results were compatible
with the geotechnical observations and experience. The method can
be employed in geotechnically oriented microzoning studies if the
criteria are well evaluated.
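Weighted grading itself reduces to a weighted sum of parameter grades per map cell. A minimal sketch, with illustrative weights and a hypothetical 1-5 grading of the four parameters (none of these values come from the study):

```python
# Hypothetical criterion weights for the four geotechnical parameters.
weights = {"soil": 0.3, "spt_n": 0.3, "vs": 0.25, "duwl": 0.15}

def suitability(grades, weights):
    """Weighted grade of one map cell; a higher score means the cell
    is more suitable for settlement."""
    return sum(weights[p] * g for p, g in grades.items())

# Illustrative grades (1 = poor, 5 = good) for one cell.
cell = {"soil": 4, "spt_n": 3, "vs": 5, "duwl": 2}
s = suitability(cell, weights)
```

In a GIS setting the same score would be computed per raster cell and the resulting surface classified into suitability zones.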
Abstract: The peel of the dragon fruit is a byproduct left over after consumption. Normally, plant material used as an antioxidant source must be dried before further processing. Therefore, the aim of this study was to dry the peel with a heat pump dryer (45 °C) and a fluidized bed dryer (110 °C), compared with the sun drying method. Samples with an initial moisture content of about 85-91% wet basis were dried down to about 10% wet basis, which took 620 and 25 min for the heat pump dryer and the fluidized bed dryer, respectively. In comparison, sun drying took about 900 min to dry the peel. The samples were then evaluated for antioxidant activity and β-carotene and betalain contents. The results showed that the antioxidant activity and betalain contents of dried peel obtained from heat pump and fluidized bed drying were significantly higher than those from sun drying (p < 0.05). Moreover, drying with the heat pump provided the highest β-carotene content.
Abstract: Text Mining is an important step of the Knowledge
Discovery process. It is used to extract hidden information from
non-structured or semi-structured data. This aspect is fundamental
because much of the Web's information is semi-structured due to the
nested structure of HTML code, much of it is linked, and much of it
is redundant. Web Text Mining supports the whole knowledge mining
process in the mining, extraction and integration of useful data,
information and knowledge from Web page contents.
In this paper, we present a Web Text Mining process able to
discover knowledge in a distributed and heterogeneous
multi-organization environment. The Web Text Mining process is
based on a flexible architecture and is implemented in four steps
able to examine web content and to extract useful hidden
information through mining techniques. Our Web Text Mining
prototype starts from the retrieval of Web job offers, from which,
through a Text Mining process, useful information for their fast
classification is drawn out; this information essentially consists
of the job offer location and the required skills.
Abstract: Subdivision surfaces are applied to entire meshes
in order to produce smooth surface refinements from a coarse
mesh. Several schemes have been introduced in this area to provide
sets of rules that converge to smooth surfaces. However, computing
and rendering all the vertices is really inconvenient in terms of
memory consumption and runtime during the subdivision process. It
leads to a heavy computational load, especially at higher levels of
subdivision. Adaptive subdivision is a method that subdivides only
certain areas of the mesh while the rest is kept at a lower polygon
count. Although adaptive subdivision occurs only in the selected
areas, the quality of the produced surfaces, i.e. their smoothness,
can be preserved just as with regular subdivision. Nevertheless,
the adaptive subdivision process is burdened by two tasks:
calculations are needed to define the areas that are required to be
subdivided, and to remove the cracks created by the subdivision
depth difference between the selected and unselected areas.
Unfortunately, adaptive subdivision still suffers from memory
consumption problems when it reaches higher levels of subdivision.
This research introduces an iterative process of adaptive
subdivision that improves the previous adaptive method by reducing
memory consumption on triangular meshes. The result of this
iterative process is noticeably better in memory consumption and
appearance, producing fewer polygons while preserving smooth
surfaces.
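As a minimal sketch of the adaptive idea (not the paper's method), one midpoint 1-to-4 subdivision step can be applied only to triangles selected by a predicate; the crack removal between selected and unselected neighbours discussed above is deliberately omitted here, and the predicate (a simple area test) is an illustrative assumption:

```python
def midpoint(a, b):
    """Midpoint of two vertices given as coordinate tuples."""
    return tuple((x + y) / 2.0 for x, y in zip(a, b))

def subdivide_adaptive(triangles, needs_split):
    """Split each selected triangle into four via edge midpoints;
    unselected triangles are kept as-is (cracks are not handled)."""
    out = []
    for a, b, c in triangles:
        if needs_split((a, b, c)):
            ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
            out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
        else:
            out.append((a, b, c))
    return out

def area2d(t):
    """Area of a 2D triangle via the cross product."""
    (ax, ay), (bx, by), (cx, cy) = t
    return abs((bx - ax) * (cy - ay) - (cx - ax) * (by - ay)) / 2.0

tris = [((0.0, 0.0), (4.0, 0.0), (0.0, 4.0)),
        ((4.0, 0.0), (4.0, 1.0), (3.0, 1.0))]
refined = subdivide_adaptive(tris, lambda t: area2d(t) > 1.0)
```

Only the large triangle is refined, so the output carries five triangles instead of the eight that uniform subdivision would produce, which is exactly where the memory saving comes from.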
Abstract: This paper proposes a vibration analysis method for
on-line monitoring and predictive maintenance during the
milling process. Adapting the envelope method to the diagnostics
and analysis of milling tool materials is an important contribution
to the qualitative and quantitative characterization of milling
capacity and a step toward modeling the three-dimensional cutting
process. An experimental protocol was designed and developed for
the acquisition, processing and analysis of the three-dimensional
signal. Vibration envelope analysis is proposed to detect the
cutting capacity of the tool, together with the optimized
application of cutting parameters. The research focuses on Hilbert
transform optimization to evaluate the dynamic behavior of the
machine/tool/workpiece system.
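The envelope itself comes from the analytic signal given by the Hilbert transform. A minimal sketch, assuming a short discrete signal and using a naive O(n^2) DFT so the block stays self-contained (a real implementation would use an FFT routine such as scipy.signal.hilbert):

```python
import cmath
import math

def envelope(x):
    """Vibration envelope via the analytic signal: zero the negative
    frequencies of the DFT, invert, and take the magnitude."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    # Analytic-signal weights: keep DC (and Nyquist for even n),
    # double the positive frequencies, zero the negative ones.
    H = [0.0] * n
    H[0] = 1.0
    half = n // 2
    if n % 2 == 0:
        H[half] = 1.0
        for k in range(1, half):
            H[k] = 2.0
    else:
        for k in range(1, half + 1):
            H[k] = 2.0
    Xa = [X[k] * H[k] for k in range(n)]
    xa = [sum(Xa[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
          for t in range(n)]
    return [abs(v) for v in xa]

n = 64
sig = [math.cos(2 * math.pi * 8 * t / n) for t in range(n)]  # illustrative tone
env = envelope(sig)
```

For a pure tone the envelope is flat at the tone's amplitude; in tool monitoring, periodic modulation of the envelope is what reveals degradation of the cutting capacity.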
Abstract: Global competitiveness has recently become the
biggest concern of both manufacturing and service companies.
Electronic commerce, as a key technology, enables firms to reach
all potential consumers from all over the world. In this study, we
present commonly used electronic payment systems and then
evaluate these systems with respect to different criteria. The
payment systems included in this research are the credit card, the
virtual credit card, electronic money, mobile payment, the credit
transfer and debit instruments. We carry out a systematic
comparison of these systems with respect to three main criteria:
technical, economic and social. We conduct a fuzzy multi-criteria
decision making procedure to deal with the multi-attribute nature
of the problem. The subjectiveness and imprecision of the
evaluation process are modeled using triangular fuzzy numbers.
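As a minimal sketch of how triangular fuzzy numbers can enter such an evaluation (the linguistic scale, criterion weights and ratings below are illustrative assumptions, not the paper's data), each alternative's fuzzy ratings are weighted, summed and defuzzified by the centroid:

```python
# Triangular fuzzy numbers (TFNs) represented as (l, m, u) triples.
# Hypothetical linguistic scale for the ratings:
SCALE = {"low": (0, 1, 3), "medium": (3, 5, 7), "high": (7, 9, 10)}

def tfn_add(a, b):
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def tfn_scale(a, w):
    return (a[0] * w, a[1] * w, a[2] * w)

def defuzzify(a):
    """Centroid of a triangular fuzzy number."""
    return sum(a) / 3.0

def score(ratings, weights):
    """Crisp weighted score of one payment system over all criteria."""
    total = (0.0, 0.0, 0.0)
    for criterion, label in ratings.items():
        total = tfn_add(total, tfn_scale(SCALE[label], weights[criterion]))
    return defuzzify(total)

# Illustrative weights over the paper's three main criteria and an
# invented rating of one alternative:
weights = {"technical": 0.4, "economic": 0.35, "social": 0.25}
credit_card = {"technical": "high", "economic": "medium", "social": "high"}
s = score(credit_card, weights)
```

Scoring every payment system the same way and ranking the crisp results is the basic shape of a fuzzy multi-criteria comparison; the TFN arithmetic is what carries the subjectiveness of the linguistic judgments through the aggregation.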
Abstract: The paper presents frame and burst acquisition in a satellite communication network based on time division multiple access (TDMA), in which the transmissions may be carried on different transponders. A unique word pattern is used for the acquisition process. The search for the frame is aided by soft decisions on QPSK-modulated signals in an additive white Gaussian noise channel. Results show that when the false alarm rate is low, the probability of detection is also low and the acquisition time is long. Conversely, when the false alarm rate is high, the probability of detection is also high and the acquisition time is short. Thus, system operators can trade high false alarm rates for high detection probabilities and shorter acquisition times.
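The unique-word search can be sketched as a sliding Hamming-distance test, where the detection threshold embodies the false-alarm/detection trade-off described above. The UW pattern and frame below are illustrative, not the paper's, and hard bits stand in for the soft decisions:

```python
import random

# Illustrative 16-bit unique word (not the actual pattern from the paper).
UW = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]

def detect(received, threshold):
    """Positions where the UW is declared found: Hamming distance to the
    pattern is at most `threshold`. A lower threshold lowers the false
    alarm rate but also the detection probability, and vice versa."""
    hits = []
    for i in range(len(received) - len(UW) + 1):
        d = sum(r != u for r, u in zip(received[i:i + len(UW)], UW))
        if d <= threshold:
            hits.append(i)
    return hits

rng = random.Random(3)
frame = [rng.randint(0, 1) for _ in range(200)]  # random background bits
pos = 120
frame[pos:pos + len(UW)] = UW   # embed the unique word at a known position
```

Raising the threshold can only add hit positions, never remove them, which is the monotone trade-off the abstract describes: more detections at the price of more false alarms.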
Abstract: Loop detectors report traffic characteristics in real
time and are at the core of the traffic control process.
Intuitively, one would expect that as the density of detection
increases, so would the quality of the estimates derived from
detector data. However, as detector deployment increases, the
associated operating and maintenance costs increase. Thus, traffic
agencies often need to decide where to add new detectors and which
detectors should continue receiving maintenance, given their
resource constraints. This paper evaluates the effect of detector
spacing on freeway travel time estimation. A freeway section
(Interstate-15) in the Salt Lake City metropolitan region is
examined. The research reveals that travel time accuracy does not
necessarily deteriorate with increased detector spacing. Rather,
the actual location of the detectors has far greater influence on
the quality of travel time estimates. The study presents an
innovative computational approach that delivers optimal detector
locations through a process that relies on a Genetic Algorithm
formulation.
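A compact sketch of such a Genetic Algorithm formulation (the speed profile, fitness definition and GA settings are all hypothetical, not the study's): candidate detector placements are evolved to minimize the travel-time estimation error obtained by assigning each milepoint the speed of its nearest detector.

```python
import math
import random

# Hypothetical setup: a smooth "true" speed profile along a freeway of
# N unit-length milepoints; detectors report the speed at their location.
random.seed(11)
N = 60
speed = [60.0 + 25.0 * math.sin(i / 7.0) for i in range(N)]

def tt_error(detectors):
    """|true travel time - estimated travel time|, where the estimate
    gives each milepoint the speed of its nearest detector."""
    true_tt = sum(1.0 / s for s in speed)
    est_tt = sum(1.0 / speed[min(detectors, key=lambda d: abs(d - i))]
                 for i in range(N))
    return abs(true_tt - est_tt)

def ga(k=6, pop_size=30, gens=40, pmut=0.3):
    """Tiny GA: elitist survival, union crossover, point mutation."""
    pop = [sorted(random.sample(range(N), k)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=tt_error)
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = set(random.sample(sorted(set(a) | set(b)), k))  # crossover
            if random.random() < pmut:                              # mutation
                while len(child) < k + 1:
                    child.add(random.randrange(N))
                child = set(random.sample(sorted(child), k))
            children.append(sorted(child))
        pop = survivors + children
    return min(pop, key=tt_error)

best = ga()
```

The fitness rewards placements that straddle the bends of the speed profile rather than evenly spaced ones, mirroring the study's finding that location matters more than spacing.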
Abstract: Wireless sensor networks consist of small nodes with
sensing, computation and communication capabilities, and such
networks are spreading everywhere. These networks have resource
limitations on communication, computation and energy consumption.
Since the nodes of a sensor network have limited energy, optimizing
energy consumption in these networks is of great importance and has
created many challenges. Previous works have shown that by
organizing the network nodes into a number of clusters, energy
consumption can be reduced considerably, and the lifetime of the
network thereby increased. In this paper, we use the Queen-bee
algorithm to create energy-efficient clusters in wireless sensor
networks. The Queen-bee (QB) algorithm mimics nature in that the
queen bee plays a major role in the reproduction process. The QB is
simulated with the J-Sim simulator. The simulation results show
that clustering by the QB algorithm decreases energy consumption
relative to other existing algorithms and increases the lifetime of
the network.