Abstract: Data mining is an extraordinarily demanding field concerned with the extraction of implicit knowledge and relationships that are not explicitly stored in databases. A wide variety of data mining methods have been introduced (classification, characterization, generalization, etc.), and each of these methods comprises more than one algorithm. A data mining system involves different categories of users, which means that user behavior must be a component of the system. The problem at this level is to know which algorithm of which method to employ for an exploratory goal, which for a decisional goal, and how they can collaborate and communicate. The agent paradigm offers a new way of conceiving and realizing data mining systems. The purpose is to combine different data mining algorithms to prepare elements for decision-makers, benefiting from the possibilities offered by multi-agent systems. In this paper the agent framework for data mining is introduced, and its overall architecture and functionality are presented. The validation is carried out on spatial data, and the principal results are presented.
Abstract: This paper presents a novel method for data hiding based on neighborhood pixel information, used to calculate the number of bits available for substitution, together with a modified Least Significant Bit (LSB) technique for data embedding. The modified solution is independent of the nature of the data to be hidden and gives correct results with unnoticeable image degradation. To determine the number of bits that can be used for data hiding, the technique uses the green component of the image, as it is less sensitive to the human eye, making it very difficult for an observer to tell whether the image carries hidden data. For further security, the application encrypts the data using a custom-designed algorithm before embedding the bits into the image. The overall process consists of three main modules, namely embedding, encryption, and extraction.
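As an illustration of the embedding module, the following is a minimal sketch of classic LSB substitution applied to green-channel values. The paper's neighborhood-based bit-count and its custom encryption step are not specified here, so a fixed capacity of 2 bits per pixel is an assumption made purely for illustration.

```python
# Sketch of LSB substitution in the green channel. The fixed
# bits_per_pixel capacity is an illustrative assumption; the paper
# derives the capacity from neighborhood pixel information.

def embed_bits(green_values, payload_bits, bits_per_pixel=2):
    """Replace the lowest `bits_per_pixel` bits of each green value
    with successive payload bits."""
    out = []
    i = 0
    for g in green_values:
        if i < len(payload_bits):
            chunk = payload_bits[i:i + bits_per_pixel]
            chunk = chunk.ljust(bits_per_pixel, '0')
            g = (g & ~((1 << bits_per_pixel) - 1)) | int(chunk, 2)
            i += bits_per_pixel
        out.append(g)
    return out

def extract_bits(green_values, n_bits, bits_per_pixel=2):
    """Read back the embedded bits in order."""
    bits = ''
    for g in green_values:
        bits += format(g & ((1 << bits_per_pixel) - 1),
                       '0{}b'.format(bits_per_pixel))
        if len(bits) >= n_bits:
            break
    return bits[:n_bits]

pixels = [200, 123, 54, 77]
stego = embed_bits(pixels, '1011')
assert extract_bits(stego, 4) == '1011'
```

Because only the low-order bits of the green component change, the per-pixel error is at most 3 for a 2-bit capacity, which is what keeps the degradation visually unnoticeable.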
Abstract: Client-server systems that use mobile communications
networks for data transmission have become very attractive to
many economic agents as a means of promoting and offering
electronic services to their clients. E-services are well suited
to developing business and increasing financial benefits. The
products or services can be delivered efficiently to a large
number of clients using mobile Internet access technologies.
Clients can access e-services anywhere and anytime, supported by
the channel bandwidth, data services, and protocols of 3G, GPRS,
WLAN, and similar technologies. Based on the evolution and
development of mobile communications networks, a convergence of
the technological and financial interests of mobile operators,
software developers, mobile terminal producers, and e-content
providers has been established. This will lead to a high level
of integration of IT&C resources and will facilitate the
delivery of value-added services through mobile communications
networks. This paper presents a client-server system for
e-services access, with mobile software applications for
Smartphones and PDAs installed on the Symbian and Windows Mobile
operating systems.
Abstract: As chip densities increase, designers are trying to
put as many computational and storage facilities as possible on
a single chip. With the growing complexity of computational and
storage circuits, designing, testing, and debugging become more
and more complex and expensive. Hardware designs are therefore
built using very high speed hardware description languages,
which are more efficient and cost-effective. This paper focuses
on the implementation of a 32-bit ALU design based on the
Verilog hardware description language. The adder and subtractor
operate correctly on both unsigned and signed numbers. In an
ALU, addition takes most of the time if a ripple-carry adder is
used. The general strategy for designing fast adders is to
reduce the time required to form carry signals; adders that use
this principle are called carry look-ahead adders. The carry
look-ahead adder is designed as a combination of 4-bit adder
blocks. The syntax of Verilog HDL is similar to that of the C
programming language. This paper proposes a unified approach to
ALU design in which both simulation and formal verification can
co-exist.
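The carry look-ahead idea can be sketched behaviorally. The following is expressed in Python rather than Verilog, as a model of one 4-bit block: generate terms g_i = a_i AND b_i, propagate terms p_i = a_i XOR b_i, and each carry c_{i+1} = g_i OR (p_i AND c_i) expanded so that, in hardware, all carries are formed directly from the inputs instead of rippling.

```python
# Behavioral model of a 4-bit carry look-ahead adder block.
# In hardware the carry recurrence is flattened into a two-level
# expression per carry; the sequential loop here models the same logic.

def cla_4bit(a, b, c0=0):
    """Add two 4-bit values using generate/propagate carry logic.
    Returns (4-bit sum, carry-out)."""
    g = [((a >> i) & 1) & ((b >> i) & 1) for i in range(4)]  # generate
    p = [((a >> i) & 1) ^ ((b >> i) & 1) for i in range(4)]  # propagate
    c = [c0]
    for i in range(4):
        c.append(g[i] | (p[i] & c[i]))  # c[i+1] = g_i | (p_i & c_i)
    s = 0
    for i in range(4):
        s |= (p[i] ^ c[i]) << i         # sum bit: p_i XOR c_i
    return s, c[4]

assert cla_4bit(9, 7) == (0, 1)  # 9 + 7 = 16: sum wraps to 0, carry-out 1
```

A 32-bit ALU adder, as described in the abstract, would chain eight such blocks, with the carry-out of each block feeding the next (or a second level of look-ahead logic).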
Abstract: In an open real-time system environment, the coexistence of different kinds of real-time and non-real-time applications confronts the system scheduling mechanism with new requirements and challenges. A two-level scheduling scheme for open real-time systems is introduced, and it is pointed out that when hard and soft real-time applications are scheduled indistinguishably as the same type of real-time application, Quality of Service (QoS) cannot be guaranteed. This scheme has two flaws. First, it cannot differentiate the scheduling priorities of hard and soft real-time applications; that is, it neglects the characteristic differences between hard real-time applications and soft ones, so it does not suit more complex real-time environments. Second, the worst-case execution time of soft real-time applications cannot be predicted exactly, so it is not worthwhile to incur large overheads to ensure that no soft real-time application misses its deadline; doing so may waste resources. To solve this problem, a novel two-level real-time scheduling mechanism (comprising a scheduling profile and a scheduling algorithm) that adds a process for handling soft real-time applications is proposed. Finally, we verify the real-time scheduling mechanism both theoretically and experimentally. The results indicate that our scheduling mechanism achieves the following objectives. (1) It reflects the difference in priority when scheduling hard and soft real-time applications. (2) It ensures the schedulability of hard real-time applications, that is, their deadline miss rate is 0. (3) The overall deadline miss rate of soft real-time applications can be kept below 1. (4) Although no deadline is set for a non-real-time application, the scheduling algorithm used by the server avoids the "starvation" of jobs and increases QoS. As a result, our scheduling mechanism is more compatible with different types of applications and can be applied more widely.
Abstract: The group mutual exclusion (GME) problem is an
interesting generalization of the mutual exclusion problem.
Several solutions to the GME problem have been proposed for
message-passing distributed systems; however, none of them is
suitable for real-time distributed systems. In this paper, we
propose a token-based distributed algorithm for the GME problem
in soft real-time distributed systems. The algorithm uses the
concepts of a priority queue, a dynamic request set, and the
process state. It uses a first-come-first-served approach in
selecting the next session type among requests of the same
priority level and satisfies the concurrent occupancy property.
The algorithm allows all n processors to be inside their
critical sections (CS) provided they request the same session.
A performance analysis and a correctness proof of the algorithm
are also included in the paper.
Abstract: Model Predictive Control (MPC) has previously been
applied to supply chain problems with promising results;
however, the systems proposed so far have had no information on
future demand. A forecasting methodology will surely promote the
efficiency of control actions by providing insight into the
future. This paper presents a complete supply chain management
framework based on MPC and time series forecasting. The proposed
framework is tested on industrial data in order to assess the
efficiency of the method and the impact of forecast accuracy on
the overall control performance of the supply chain. To this
end, forecasting methodologies with different characteristics
are implemented on test data to generate forecasts that serve as
input to the MPC module.
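The coupling between the two modules can be sketched on the forecasting side. The paper's actual forecasting methodologies are not specified here; as an illustrative assumption, the sketch below uses a simple moving average, rolled forward over the prediction horizon to produce the demand sequence an MPC module would consume.

```python
# Illustrative moving-average forecaster for the MPC demand input.
# The window length and horizon are assumptions, not the paper's setup.

def moving_average_forecast(history, horizon, window=3):
    """Forecast `horizon` future demands by rolling the averaging
    window forward, feeding each forecast back in as a
    pseudo-observation for the next step."""
    data = list(history)
    forecasts = []
    for _ in range(horizon):
        f = sum(data[-window:]) / window
        forecasts.append(f)
        data.append(f)
    return forecasts

demand_history = [100, 120, 110, 130, 125]
future_demand = moving_average_forecast(demand_history, horizon=3)
```

In the framework described, `future_demand` would populate the demand trajectory over the MPC prediction horizon, so forecast accuracy directly shapes the optimization that produces ordering decisions.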
Abstract: The purpose of this study was to explore the
correlation between leisure participation and perceived
wellness, with the students of a nursing college in southern
Taiwan as the subjects. One thousand six hundred and ninety-six
(1,696) surveys were sent, and 1,408 were received, for an
83.02% valid response rate. Analyzing the data with canonical
correlation analysis, the results showed that the linear
combination of the two sets of variables produces five
significant canonical variates. Of the five canonical variates,
only the first has sufficient explanatory power; its canonical
correlation coefficient is 0.660. This indicates that leisure
participation and perceived wellness are significantly
correlated.
Abstract: In designing river intakes and diversion structures, it is paramount that the sediments entering the intake are minimized or, if possible, completely separated. Due to high water velocity, sediments can significantly damage hydraulic structures, especially when mechanical equipment such as pumps and turbines is used, which subsequently results in wasted water, wasted electricity, and further costs. It is therefore prudent to investigate and analyze the performance of lateral intakes as affected by sediment control structures. Laboratory experiments, despite their vast potential and benefits, face certain limitations and challenges, including limited equipment and facilities, space constraints, equipment errors such as inadequate precision or mal-operation, and, finally, human error. Research has shown that to achieve the ultimate goal of intake structure design, which is to design long-lasting and proficient structures, the best combination of sediment control structures (such as sills and submerged vanes) should be determined, along with the parameters that increase their performance (such as diversion angle and location). Cost, difficulty of execution, and environmental impacts should also be included in evaluating the optimal design, and the resulting solution can then be applied to similar problems in the future. Consequently, the model used to arrive at the optimal design requires a high level of accuracy and precision in order to avoid improper design and execution of projects, and the process of creating and executing the design should be as comprehensive and applicable as possible. It is therefore important that the influential parameters and vital criteria are fully understood and applied at all stages of choosing the optimal design. In this article, the parameters influencing optimal intake performance, together with the advantages, disadvantages, and efficiency of a given design, are studied. A multi-criterion decision matrix is then utilized to choose the optimal model, which can be used to determine the proper parameters for constructing the intake.
Abstract: This paper proposes a vibration analysis method for
on-line monitoring and predictive maintenance during the milling
process. Adapting the envelope method to the diagnostics and
analysis of milling tool materials is an important contribution
to the qualitative and quantitative characterization of milling
capacity and a step toward modeling the three-dimensional
cutting process. An experimental protocol was designed and
developed for the acquisition, processing, and analysis of the
three-dimensional signal. Vibration envelope analysis is
proposed to detect the cutting capacity of the tool together
with the optimization of cutting parameters. The research is
focused on Hilbert transform optimization to evaluate the
dynamic behavior of the machine/tool/workpiece system.
Abstract: Noise contamination in a magnetic resonance (MR)
image can occur during acquisition, storage, and transmission,
so effective filtering is required to avoid repeating the MR
procedure. In this paper, an iterative asymmetrical triangle
fuzzy filter with moving average center (ATMAVi filter) is used
to reduce different levels of salt-and-pepper noise in a brain
MR image. Besides visual inspection of the filtered images, the
mean squared error (MSE) is used as an objective measurement.
Compared with the median filter, simulation results indicate
that the ATMAVi filter is effective especially for filtering
higher-level noise (such as noise density 0.45), using a smaller
window size (such as 3x3) when operated iteratively or a larger
window size (such as 5x5) when operated non-iteratively.
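The ATMAVi filter's fuzzy weighting is not reproduced here; as a grounded point of reference, the following sketches the 3x3 median filter the abstract compares against, together with the MSE measure used for evaluation, on a tiny patch with one salt impulse.

```python
# Baseline 3x3 median filter and MSE measure, as used for comparison
# with the ATMAVi filter. Images are plain nested lists of pixel values.

def median_filter_3x3(img):
    """Apply a 3x3 median filter; border pixels are left unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]  # median of the 9 window values
    return out

def mse(a, b):
    """Mean squared error between two images of equal size."""
    n = len(a) * len(a[0])
    return sum((pa - pb) ** 2
               for ra, rb in zip(a, b) for pa, pb in zip(ra, rb)) / n

# A clean patch with one salt (255) impulse at the center:
noisy = [[10, 10, 10],
         [10, 255, 10],
         [10, 10, 10]]
clean = [[10, 10, 10],
         [10, 10, 10],
         [10, 10, 10]]
filtered = median_filter_3x3(noisy)  # the impulse is replaced by 10
```

At low impulse densities the median in each window is a clean pixel, so the salt value is removed exactly; at higher densities the window median is itself likely corrupted, which is the regime where the abstract reports the fuzzy filter's advantage.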
Abstract: Objective: To review recent publications on patient
safety culture and investigate the relationship between
leadership behavior, safety culture, and safety performance in
the healthcare industry. Method: In this cross-sectional study,
350 questionnaires were mailed to hospital workers, and 195
valid responses were obtained, a 55.7% valid response rate.
Confirmatory factor analysis (CFA) was carried out to test the
factor structure and determine whether the composite reliability
was significant, with factor loadings of >0.5 resulting in an
acceptable model fit. Results: One-way ANOVA showed that
physicians have significantly more negative patient safety
culture perceptions and safety performance perceptions than
non-physicians. Conclusions: The path analysis results show that
leadership behavior affects safety culture and safety
performance in the healthcare industry. Safety performance was
affected and improved by contingency leadership and a positive
patient safety organizational culture. The study suggests
improving safety performance by providing a well-managed system
that includes considerate leadership, training courses for
hospital workers, and a solid safety reporting system.
Abstract: In this study, an analysis has been performed of the
heat and mass transfer of a steady laminar boundary-layer flow
of a viscous fluid past a nonlinearly stretching sheet. The
parameters n, Ec, k0, and Sc represent the dominance of the
nonlinearity, viscous, radiation, and mass transfer effects,
respectively, which appear in the governing equations. The
similarity transformation and the finite-difference method have
been used to analyze the present problem.
Abstract: Photonic Crystal (PhC) based devices are being
increasingly used as multifunctional, compact devices in
integrated optical communication systems. They provide excellent
controllability of light while maintaining the small size
required for miniaturization. In this paper, the band gap
properties of PhCs and their typical applications in optical
waveguiding are considered. Novel PhC-based applications such as
nonlinear switching and tapers are considered, and simulation
results are shown using an accurate time-domain numerical method
based on the Finite Difference Time Domain (FDTD) scheme. The
suitability of these devices for novel applications is discussed
and evaluated.
Abstract: This research investigates risk factors for defective products in auto-parts factories. Under a Bayesian framework, a generalized linear mixed model (GLMM) in which the dependent variable, the number of defective products, has a Poisson distribution is adopted, and its performance is compared with that of a Poisson GLM under the same Bayesian framework. The factors considered are the production process, machines, and workers, and the products coded RT50 are observed. The study found that the Poisson GLMM is more appropriate than the Poisson GLM. For the production process factor, the highest risk of producing defective products lies with Process 1; for the machine factor, with Machine 5; and for the worker factor, with Worker 6.
Abstract: Water pollution assessment problems arise frequently
in environmental science. In this research, a finite difference method
for solving the one-dimensional steady convection-diffusion equation
with variable coefficients is proposed; it is then used to optimize
water treatment costs.
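A minimal sketch of such a finite-difference solve is given below, assuming a representative equation of the form v(x) c'(x) = D(x) c''(x) on [0, 1] with Dirichlet boundary values, discretized with central differences and solved by the Thomas (tridiagonal) algorithm. The equation form, coefficient profiles, and boundary values are illustrative assumptions, not the paper's model or data.

```python
# Finite-difference solve of v(x) c' = D(x) c'' on [0, 1] with
# Dirichlet boundaries c(0) = c0, c(1) = c1. Central differences give
# a tridiagonal system solved by the Thomas algorithm.

def solve_convection_diffusion(v, D, c0, c1, n=50):
    h = 1.0 / (n + 1)
    a, b, c, d = [], [], [], []          # sub/diag/super/rhs for i = 1..n
    for i in range(1, n + 1):
        x = i * h
        vi, Di = v(x), D(x)
        a.append(-Di / h**2 - vi / (2 * h))   # coeff of c_{i-1}
        b.append(2 * Di / h**2)               # coeff of c_i
        c.append(-Di / h**2 + vi / (2 * h))   # coeff of c_{i+1}
        d.append(0.0)
    d[0] -= a[0] * c0                    # fold boundary values into rhs
    d[-1] -= c[-1] * c1
    for i in range(1, n):                # Thomas forward sweep
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    sol = [0.0] * n                      # back substitution
    sol[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        sol[i] = (d[i] - c[i] * sol[i + 1]) / b[i]
    return [c0] + sol + [c1]

# Sanity check: pure diffusion (v = 0, D = 1) yields a linear profile.
profile = solve_convection_diffusion(lambda x: 0.0, lambda x: 1.0, 0.0, 1.0)
```

With variable coefficients, the same tridiagonal structure holds because each interior equation only couples neighboring nodes, which keeps the solve O(n) and suitable for the repeated evaluations a cost optimization would require.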
Abstract: Today's economy is in permanent change, causing
mergers, acquisitions, and cooperations between enterprises. As
a consequence, process adaptations and realignments result in
systems integration and software development projects. The
processes and procedures used to execute such projects still
rely on the craftsmanship of highly skilled workers. A generally
accepted, industrialized production, characterized by high
efficiency and quality, seems inevitable.
In spite of this, current concepts of software industrialization
are aimed at traditional software engineering and do not
consider the characteristics of systems integration. The present
work points out these particularities and discusses the
applicability of existing industrial concepts in the systems
integration domain. Consequently, it defines further areas of
research necessary to bring the field of systems integration
closer to an industrialized production, allowing higher
efficiency, quality, and return on investment.
Abstract: Global competitiveness has recently become the
biggest concern of both manufacturing and service companies.
Electronic commerce, as a key technology, enables firms to reach
all potential consumers from all over the world. In this study,
we present commonly used electronic payment systems and then
evaluate these systems with respect to different criteria. The
payment systems included in this research are the credit card,
the virtual credit card, electronic money, mobile payment,
credit transfer, and debit instruments. We carry out a
systematic comparison of these systems with respect to three
main criteria: technical, economical, and social. We conduct a
fuzzy multi-criteria decision-making procedure to deal with the
multi-attribute nature of the problem, modeling the
subjectiveness and imprecision of the evaluation process with
triangular fuzzy numbers.
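The triangular-fuzzy-number arithmetic underlying such an evaluation can be sketched as follows: ratings and criterion weights are triples (l, m, u), aggregated by a fuzzy weighted sum and defuzzified by the centroid. The specific multi-criteria procedure of the study is not reproduced; the ratings and weights below are illustrative assumptions, not the paper's data.

```python
# Triangular fuzzy number (TFN) arithmetic for a fuzzy weighted score.
# A TFN is a triple (l, m, u) with l <= m <= u.

def tfn_mul(a, b):
    """Approximate product of two positive TFNs (component-wise)."""
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

def tfn_add(a, b):
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def defuzzify(t):
    """Centroid of a triangular fuzzy number (l, m, u)."""
    return sum(t) / 3.0

def fuzzy_score(ratings, weights):
    """Crisp score for one alternative: defuzzified weighted sum
    of its TFN ratings under TFN criterion weights."""
    total = (0.0, 0.0, 0.0)
    for r, w in zip(ratings, weights):
        total = tfn_add(total, tfn_mul(r, w))
    return defuzzify(total)

# One hypothetical payment system rated on the three main criteria
# (technical, economical, social), with hypothetical fuzzy weights:
ratings = [(6, 7, 8), (4, 5, 6), (7, 8, 9)]
weights = [(0.3, 0.4, 0.5), (0.2, 0.3, 0.4), (0.2, 0.3, 0.4)]
score = fuzzy_score(ratings, weights)
```

Ranking the six payment systems then amounts to computing such a crisp score per alternative and sorting, with the fuzzy triples carrying the subjectiveness of the expert judgments through the aggregation.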
Abstract: Intelligence tests are series of tasks designed to measure the capacity to make abstractions, to learn, and to deal with novel situations. The visual abilities of the shape understanding system (SUS) are tested using visual intelligence tests. In this paper, the progressive matrices tests are formulated as tasks given to SUS. These tests require good visual problem-solving abilities of the human subject. SUS solves these tests by performing complex visual reasoning that transforms the visual forms (tests) into string forms. The experiment showed that the proposed method, which is part of the SUS visual understanding abilities, can solve a test that is very difficult for human subjects.
Abstract: Borate minerals have attracted considerable attention in recent years due to their structural chemistry and mechanical properties in several industries. Increasing attention has been paid to the use of synthetically produced magnesium borates as catalysts, reinforcing materials for plastics, agents for the conversion of hydrocarbons, electro-conductive treating agents, and anti-wear and anti-corrosion materials. Magnesium borates can be synthesized by several methods, such as hydrothermal and solid-state (thermal) processes. In this study, the hydrothermal production method was applied at the modest temperature of 80 °C along with convenient crystal growth. Using MgCl2·6H2O, H3BO3, and NaOH as starting materials, reaction times of 30, 60, 120, and 240 minutes were studied. Afterwards, the crystal structure and morphology of the products were examined by X-Ray Diffraction (XRD) and Fourier Transform Infrared Spectroscopy (FT-IR). As a result, the Admontite and Mcallisterite mineral forms were synthesized.