Abstract: This paper reviews various approaches that have been
used for the modeling and simulation of large-scale engineering
systems and determines their appropriateness in the development of a
RICS modeling and simulation tool. Bond graphs, linear graphs,
block diagrams, differential and difference equations, modeling
languages, cellular automata and agents are reviewed. Such a tool
should be based on a linear graph representation and support symbolic
programming, functional programming, the development of noncausal
models and the incorporation of decentralized approaches.
Abstract: A new approach to the timestamp ordering problem in
serializable schedules is presented. Since the number of database
users is increasing rapidly, accuracy and high throughput are major
concerns in the database field. Strict 2PL does not allow all
possible serializable schedules and therefore does not yield high
throughput. The main advantages of the approach are its ability to
enforce recoverable transaction execution and the high performance
achievable for concurrent execution in centralized databases.
Compared to Strict 2PL, the general structure of the algorithm is
simple and deadlock-free, and it allows executing all possible
serializable schedules, which results in high throughput. Various
examples involving different orders of database operations are
discussed.
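The abstract does not detail the proposed algorithm, so as background, here is a minimal Python sketch of the classical timestamp-ordering rule it builds on (class and variable names are illustrative, not the paper's):

```python
# Minimal sketch of classical timestamp ordering (background only, not
# the paper's exact algorithm): each data item tracks the largest
# timestamps of transactions that read/wrote it; operations arriving
# "too late" are rejected, and the transaction would abort and restart.

class TimestampOrdering:
    def __init__(self):
        self.read_ts = {}    # item -> largest reader timestamp
        self.write_ts = {}   # item -> largest writer timestamp

    def read(self, ts, item):
        # A read with timestamp ts must not follow a younger write.
        if ts < self.write_ts.get(item, 0):
            return False                      # too late to read
        self.read_ts[item] = max(self.read_ts.get(item, 0), ts)
        return True

    def write(self, ts, item):
        # A write must not precede a younger read or write.
        if ts < self.read_ts.get(item, 0) or ts < self.write_ts.get(item, 0):
            return False                      # too late to write
        self.write_ts[item] = ts
        return True

sched = TimestampOrdering()
ok1 = sched.read(1, "x")    # T1 reads x: allowed
ok2 = sched.write(2, "x")   # T2 writes x: allowed
ok3 = sched.write(1, "x")   # T1 writes x after T2's write: rejected
```

Schedules accepted by this rule are serializable in timestamp order, which is why the approach admits schedules that Strict 2PL forbids.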
Abstract: Modern Kazakh society is characterized by strengthened cross-cultural communication, the emergence of new powerful subcultures, and accelerated change in social systems and values. Socio-political reforms in all fields have changed the quality of social relationships and of spiritual life. A cross-cultural approach involves the analysis of different types of behavior and communication, including the manifestation of conflict and the formation of marginal destructive stereotypes.
Abstract: Fast forecasting of stock market prices is very important for
strategic planning. In this paper, a new approach for fast forecasting of
stock market prices is presented. The algorithm uses new high-speed
time delay neural networks (HSTDNNs). The operation of these
networks relies on performing cross correlation in the frequency
domain between the input data and the input weights of neural
networks. It is proved mathematically and practically that the number
of computation steps required for the presented HSTDNNs is less
than that needed by traditional time delay neural networks
(TTDNNs). Simulation results using MATLAB confirm the
theoretical computations.
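The frequency-domain cross correlation that the HSTDNNs rely on follows from the correlation theorem; a small pure-Python sketch (a naive DFT stands in for the FFT used in practice, and the signals are illustrative) shows that IDFT(conj(DFT(x)) · DFT(w)) equals the direct circular cross correlation:

```python
# Cross correlation computed in the frequency domain, the principle
# behind the HSTDNN speedup: with an FFT the product form costs
# O(n log n) instead of the O(n^2) direct sum.
import cmath

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                for k in range(n)) for j in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n)
                for j in range(n)) / n for k in range(n)]

def xcorr_freq(x, w):
    # Correlation theorem: r = IDFT(conj(DFT(x)) * DFT(w)).
    X, W = dft(x), dft(w)
    return [c.real for c in idft([a.conjugate() * b for a, b in zip(X, W)])]

def xcorr_direct(x, w):
    # Direct circular cross correlation for comparison.
    n = len(x)
    return [sum(x[k] * w[(k + m) % n] for k in range(n)) for m in range(n)]

x = [1.0, 2.0, 0.0, -1.0]   # input data (illustrative)
w = [0.5, -0.5, 1.0, 0.0]   # input weights (illustrative)
freq = xcorr_freq(x, w)
direct = xcorr_direct(x, w)
```

Both routes produce the same correlation sequence, which is what lets the weight matching be moved to the frequency domain without changing the network's output.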
Abstract: This paper describes studies carried out to investigate
the viability of using wireless cameras as a tool in monitoring
changes in air quality. A camera is used to monitor the change in
colour of a chemically responsive polymer within view of the camera
as it is exposed to varying chemical species concentration levels. The
camera captures this image and the colour change is analyzed by
averaging the RGB values present. This novel chemical sensing
approach is compared with an established chemical sensing method
using the same chemically responsive polymer coated onto LEDs. In
this way, the concentration levels of acetic acid in the air can be
tracked using both approaches. These approaches to chemical plume
tracking have many applications for air quality monitoring.
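The colour-change analysis described above, averaging the RGB values over the polymer region, can be sketched as follows (the pixel values are hypothetical stand-ins for camera data):

```python
# Sketch of the colour-change analysis: average the R, G and B
# channels over the image region containing the chemically
# responsive polymer, then track the shift between frames.

def mean_rgb(pixels):
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return (r, g, b)

# Two hypothetical frames: before and after exposure to the analyte.
before = [(200, 180, 60), (198, 182, 58), (202, 178, 62)]
after  = [(120, 180, 140), (118, 182, 138), (122, 178, 142)]

rgb_before = mean_rgb(before)
rgb_after = mean_rgb(after)
# The per-channel shift tracks the concentration change.
delta = tuple(a - b for a, b in zip(rgb_after, rgb_before))
```

A real system would calibrate this channel shift against known concentrations of the target species.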
Abstract: An enhanced particle swarm optimization (PSO) algorithm
is presented in this work to solve the non-convex optimal power flow
(OPF) problem, which has both discrete and continuous optimization
variables.
The objective functions considered are the conventional quadratic
function and the augmented quadratic function. The latter model
presents non-differentiable and non-convex regions that challenge
most gradient-based optimization algorithms. The variables to be
optimized are the generator real power outputs and
voltage magnitudes, discrete transformer tap settings, and discrete
reactive power injections due to capacitor banks. The set of equality
constraints taken into account are the power flow equations while the
inequality ones are the limits of the real and reactive power of the
generators, voltage magnitude at each bus, transformer tap settings,
and capacitor bank reactive power injections. The proposed
algorithm combines PSO with the Newton-Raphson algorithm to
minimize the fuel cost function. The IEEE 30-bus system with six
generating units is used to test the proposed algorithm. Several cases
were investigated to test and validate the consistency of detecting
optimal or near optimal solution for each objective. Results are
compared to solutions obtained using sequential quadratic
programming and Genetic Algorithms.
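As background, here is a minimal sketch of standard PSO (not the paper's PSO/Newton-Raphson hybrid; the cost curve, bounds and parameter values are illustrative) minimizing a quadratic fuel-cost-style function:

```python
# Minimal 1-D particle swarm optimization sketch: particles keep a
# personal best, the swarm keeps a global best, and velocities are
# pulled toward both; bound clamping plays the role of the
# inequality limits on the optimization variables.
import random

def pso(cost, lo, hi, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    random.seed(0)                          # deterministic for the demo
    xs = [random.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                           # personal best positions
    gbest = min(xs, key=cost)               # global best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vs[i] = (w * vs[i] + c1 * r1 * (pbest[i] - xs[i])
                     + c2 * r2 * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))   # enforce limits
            if cost(xs[i]) < cost(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest, key=cost)
    return gbest

# Hypothetical quadratic fuel-cost curve with its minimum at p = 50 MW.
fuel_cost = lambda p: 100 + 0.02 * (p - 50) ** 2
best = pso(fuel_cost, lo=10, hi=100)
```

Because PSO needs only cost evaluations, it tolerates the non-differentiable, non-convex regions of the augmented cost function that defeat gradient-based methods.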
Abstract: Five decades of extensive research in the field of speech
processing for compression and recognition have resulted in intense
competition among the various methods and paradigms introduced. In
this paper we cover the different representations of speech in the
time-frequency and time-scale domains for the purposes of
compression and recognition, and examine these representations
across a variety of related work. In particular, we emphasize
methods related to Fourier analysis paradigms and wavelet-based
ones, along with the advantages and disadvantages of both
approaches.
Abstract: In this paper, a method for deriving a group priority vector in the Fuzzy Analytic Network Process (FANP) is proposed. By introducing importance weights for multiple decision makers (DMs) based on their experience, the Fuzzy Preference Programming method (FPP) is extended to a fuzzy group prioritization problem in the FANP. Additionally, fuzzy pairwise comparison judgments are used rather than exact numerical assessments in order to model the uncertainty and imprecision in the DMs' judgments, and the fuzzy group prioritization problem is then transformed into a fuzzy non-linear programming optimization problem which maximizes group satisfaction. Unlike known fuzzy prioritization techniques, the new method proposed in this paper can easily derive crisp weights from an incomplete and inconsistent set of fuzzy comparison judgments and does not require additional aggregation procedures. Detailed numerical examples are used to illustrate the implementation of our approach and to compare it with the latest fuzzy prioritization methods.
Abstract: This work presents a novel means of extracting fixed-length parameters from voice signals, such that words can be recognized
in linear time. The power and the zero crossing rate are first
calculated segment by segment from a voice signal; by doing so, two
feature sequences are generated. We then construct an FIR system
across these two sequences. The parameters of this FIR system, used
as the input to a multilayer perceptron recognizer, can be derived by
recursive LSE (least-squares estimation), implying that the complexity of the overall process is linear in the signal size. In the second part of
this work, we introduce a weighting factor λ to emphasize recent
input; therefore, we can further recognize continuous speech signals.
Experiments employ the voice signals of numbers, from zero to nine, spoken in Mandarin Chinese. The proposed method is verified to
recognize voice signals efficiently and accurately.
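The front end described above, short-time power and zero-crossing rate computed segment by segment, can be sketched as follows (segment length and signal values are illustrative):

```python
# Sketch of the two feature sequences described above: per-segment
# power and zero-crossing rate, from which the FIR-system parameters
# would then be estimated by recursive least squares.

def segment_features(signal, seg_len):
    powers, zcrs = [], []
    for s in range(0, len(signal) - seg_len + 1, seg_len):
        seg = signal[s:s + seg_len]
        # Mean squared amplitude of the segment.
        powers.append(sum(x * x for x in seg) / seg_len)
        # Fraction of adjacent sample pairs that change sign.
        crossings = sum(1 for a, b in zip(seg, seg[1:]) if a * b < 0)
        zcrs.append(crossings / (seg_len - 1))
    return powers, zcrs

# Hypothetical toy signal: a quiet segment followed by a loud,
# rapidly oscillating one.
sig = [0.0, 0.1, -0.1, 0.1] + [1.0, -1.0, 1.0, -1.0]
powers, zcrs = segment_features(sig, seg_len=4)
```

Both features rise for the second segment, which is the kind of contrast the FIR model across the two sequences exploits.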
Abstract: In this paper, the finite-time stabilization of a class of fractional-order systems with multi-state time delay is investigated. First, we define finite-time stability for the fractional-order system. Second, by using the generalized Gronwall inequality approach and related inequality methods, we obtain some conditions for the finite-time stability of the fractional system with multi-state delay. Finally, a numerical example is given to illustrate the result.
Abstract: Machine-understandable data, when strongly
interlinked, constitutes the basis for the Semantic Web. Annotating
web documents is one of the major techniques for creating metadata
on the Web. Annotating websites defines their data in a
form which is suitable for interpretation by machines. In this paper,
we present an improved approach over previous work [1] for
annotating the texts of websites based on a knowledge base.
Abstract: The service sector continues to grow and the percentage
of GDP accounted for by service industries keeps increasing. The
growth and importance of service to an economy is not just a
phenomenon of advanced economies, service is now a majority of the
world's gross domestic product. However, the performance evaluation
process of new service development generally involves uncertain and
imprecise data. This paper presents a 2-tuple fuzzy linguistic
computing approach for dealing with heterogeneous information and
information-loss problems during the integration of subjective
evaluations. The proposed method, based on a group decision-making
scenario, assists business managers in measuring the performance of
new service development; it manages the heterogeneous integration
process and effectively avoids information loss.
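As background on the representation the paper builds on, the standard 2-tuple linguistic model encodes a value β on a label scale as a pair (s_i, α) with i = round(β) and α = β − i, so that aggregating assessments loses no information; a minimal sketch (the label set is hypothetical):

```python
# Sketch of standard 2-tuple fuzzy linguistic computing: values are
# kept as (label, symbolic translation alpha) pairs, aggregated in
# the continuous beta domain, and converted back without rounding
# away information.

LABELS = ["none", "low", "medium", "high", "perfect"]  # hypothetical scale

def to_two_tuple(beta):
    i = round(beta)
    return (LABELS[i], beta - i)          # alpha in [-0.5, 0.5)

def from_two_tuple(label, alpha):
    return LABELS.index(label) + alpha

def aggregate(two_tuples):
    # Arithmetic mean in the beta domain, then back to a 2-tuple.
    mean = sum(from_two_tuple(l, a) for l, a in two_tuples) / len(two_tuples)
    return to_two_tuple(mean)

# Three hypothetical evaluators' ratings of one service criterion.
ratings = [to_two_tuple(2.0), to_two_tuple(3.0), to_two_tuple(3.5)]
result = aggregate(ratings)
```

Because α records exactly how far the mean sits from the chosen label, heterogeneous assessments can be merged and reported on the original linguistic scale without the information loss the abstract refers to.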
Abstract: The original idea for a feature film may come from a
writer, a director or a producer. The director is the person
responsible for the creative aspects, both interpretive and
technical, of a motion picture production. The director may be shot
discussing the project with his or her co-writers, members of the
production staff and the producer, and may be shown selecting
locales or constructing sets. All these activities provide, of
course, ways of externalizing the director's ideas about the film. A
director sometimes pushes both the film image and the techniques of
narration to new artistic limits, but the director's main
responsibility is to lead the spectator to an original opinion
through his or her philosophical approach. The director tries to
find an artistic angle in every scene, turns the screenplay into an
effective story, and sets the film on a spiritual and philosophical
base.
Abstract: The objective of this paper is to propose an adaptive multi-threshold for image segmentation, specifically for object detection. Due to the different types of license plates in use, the requirements for automatic license plate recognition (LPR) differ for each country. The proposed technique is applied to a Malaysian LPR application. It is based on a multilayer perceptron trained by backpropagation. The proposed adaptive threshold is introduced to find the optimum threshold values. The technique relies on the peak value in the graph of the number of objects versus a specific range of threshold values. The proposed approach has improved the overall performance compared to current optimal threshold techniques. Further improvement of this method is in progress to accommodate real-time system specifications.
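The peak-search idea, choosing the threshold at the peak of the object-count-versus-threshold curve, can be sketched in simplified 1-D form (the paper works on 2-D plate images; the intensity profile and candidate range here are hypothetical):

```python
# Simplified 1-D sketch of the adaptive threshold search: sweep
# candidate thresholds, count the objects each produces, and keep the
# threshold at the peak of the object-count curve.  Objects are
# counted as maximal runs of above-threshold values.

def count_objects(profile, t):
    count, inside = 0, False
    for v in profile:
        if v > t and not inside:
            count, inside = count + 1, True
        elif v <= t:
            inside = False
    return count

def adaptive_threshold(profile, candidates):
    # Threshold yielding the most objects (first one among ties).
    return max(candidates, key=lambda t: count_objects(profile, t))

# Hypothetical profile: nonzero background (2), three character-like
# peaks of heights 9, 9 and 7.  Too low a threshold merges everything
# into one object; too high a threshold loses the weakest character.
profile = [2, 5, 9, 5, 2, 6, 9, 6, 2, 4, 7, 4, 2]
best_t = adaptive_threshold(profile, candidates=range(0, 9))
```

At the chosen threshold all three "characters" separate cleanly, which is the behaviour the peak of the curve is meant to capture.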
Abstract: Due to the recovering global economy, enterprises are
increasingly focusing on logistics. Investing in logistic measures for
a production generates a large potential for achieving a good starting
point within a competitive field. Unlike during the global economic
crisis, enterprises are now challenged with investing available capital
to maximize profits. In order to be able to create an informed and
quantifiably comprehensible basis for a decision, enterprises need an
adequate model for logistically and monetarily evaluating measures
in production. The Collaborative Research Centre 489 (SFB 489) at
the Institute for Production Systems (IFA) developed a Logistic
Information System which supports decision-making and is designed
specifically for the forging industry. The aim of a proposed
follow-up project is to transfer this process in order to develop a
universal approach to logistically and monetarily evaluating
measures in production.
Abstract: Industrial radiography is a well-known technique for the identification and evaluation of discontinuities, or defects, such as cracks, porosity and foreign inclusions found in welded joints. Although this technique has been well developed, improving both the inspection process and operating time, it does suffer from several drawbacks. The poor quality of radiographic images is due to the physical nature of radiography as well as the small size of the defects and their poor orientation relative to the size and thickness of the evaluated parts. Digital image processing techniques allow the interpretation of the image to be automated, avoiding the need for human operators and making the inspection system more reliable, reproducible and faster. This paper describes our attempt to develop and implement digital image processing algorithms for the purpose of automatic defect detection in radiographic images. Because of the complex nature of the considered images, and in order for the detected defect region to represent the real defect as accurately as possible, the choice of global and local preprocessing and segmentation methods must be appropriate.
Abstract: Information is power. Geographic information science is an
emerging field that is advancing the development of knowledge to
further the understanding of the relationship of “place” with
other disciplines such as crime. The researchers used crime data for
the years 2004 to 2007 from the Baguio City Police Office to
determine the incidence and actual locations of crime hotspots.
Combined qualitative and quantitative research methodology was
employed through extensive fieldwork and observation, geographic
visualization with Geographic Information Systems (GIS) and Global
Positioning Systems (GPS), and data mining. The paper discusses
emerging geographic visualization and data mining tools and
methodologies that can be used to generate baseline data for
environmental initiatives such as urban renewal and rejuvenation.
The study was able to demonstrate that crime hotspots can be
computed and were seen to occur in some select places in the
Central Business District (CBD) of Baguio City. It was observed that
some characteristics of the hotspot places' physical design and
milieu may play an important role in creating opportunities for
crime. A list of these environmental attributes was generated. This
derived information may be used to guide the design or redesign of
the City's urban environment so as to reduce crime and at the same
time improve it physically.
Abstract: As networking has become popular, Web-based learning
has become a trend in tool design. Moreover, five-axis
machining has been widely used in industry recently; however, it has
potential collision problems between its axial tables. Thus this
paper aims at proposing an efficient Web-based collision detection
tool for five-axis machining. Collision detection, however, consumes
heavy resources that few devices can support, so this research uses
a systematic approach based on web knowledge to detect collisions. The
methodologies include the kinematics analyses for five-axis motions,
separating axis method for collision detection, and computer
simulation for verification. The machine structure is modeled as STL
format in CAD software. The input to the detection system is the
g-code part program, which describes the tool motions to produce the
part surface. This research produced a simulation program in the C
programming language and demonstrated a five-axis machining example
with collision detection on a web site. The system simulates the
five-axis CNC motion along the tool trajectory, detects any
collisions according to the input g-codes, and also supports
high-performance web service benefiting from C. The results show
that our method improves computational efficiency by a factor of 4.5
compared to the conventional detection method.
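The separating-axis method named above can be illustrated with a small 2-D sketch (in Python for brevity, though the paper's implementation is in C; the polygons are hypothetical): two convex shapes intersect if and only if no axis exists on which their projections are disjoint.

```python
# Separating-axis test for 2-D convex polygons (vertex lists in
# order).  The candidate axes are the normals of both polygons'
# edges; finding one axis with disjoint projections proves the
# shapes do not collide.

def project(poly, axis):
    dots = [x * axis[0] + y * axis[1] for x, y in poly]
    return min(dots), max(dots)

def edge_normals(poly):
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        yield (-(y2 - y1), x2 - x1)     # perpendicular to the edge

def collide(a, b):
    for axis in list(edge_normals(a)) + list(edge_normals(b)):
        amin, amax = project(a, axis)
        bmin, bmax = project(b, axis)
        if amax < bmin or bmax < amin:
            return False                # separating axis found
    return True                         # no separating axis: overlap

sq1 = [(0, 0), (2, 0), (2, 2), (0, 2)]
sq2 = [(1, 1), (3, 1), (3, 3), (1, 3)]   # overlaps sq1
sq3 = [(5, 5), (6, 5), (6, 6), (5, 6)]   # far from sq1
```

In 3-D, face normals and edge cross products supply the candidate axes, but the early exit on the first separating axis is the same property that keeps the test cheap enough for a web service.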
Abstract: Atrial fibrillation is the most common sustained
arrhythmia encountered by clinicians. Because the atrial activity
waveform in atrial fibrillation is not visible to the human eye, it
is necessary to develop an automatic diagnosis system. The 12-lead
ECG is now available in hospitals and is appropriate for using
Independent Component Analysis (ICA) to estimate the atrial activity
(AA) period. In this research, we also adopt a second-order blind
identification approach to transform the sources extracted by ICA
into more precise signals, and then use a frequency-domain algorithm
for classification. In experiments, we obtained significant results
on clinical data.
Abstract: This paper investigates a possible optimization of some
linear algebra problems which can be solved by parallel processing
using special arrays called systolic arrays. Special types of
transformations are used for designing these arrays, and we show
their characteristics. The main focus is on discussing the
advantages of these arrays in the parallel computation of matrix
products, with special attention to the design of a systolic array
for matrix multiplication. Multiplication of large matrices requires
a lot of computational time, and its complexity is O(n^3). Many
algorithms, both sequential and parallel, have been developed with
the purpose of minimizing the calculation time, and systolic arrays
are well suited for this purpose. In this paper we show that using
an appropriate transformation leads to more optimal arrays for
performing calculations of this type.
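The O(n^3) computation that a systolic array pipelines can be sketched by simulating its schedule step by step (plain sequential Python; in a real array, each time step's multiply-accumulates happen in parallel across the cells):

```python
# Sketch of the matrix product as a systolic array would schedule it:
# an n-by-n grid of cells, each holding one accumulator c[i][j]; at
# time step k, cell (i, j) consumes a[i][k] and b[k][j] and performs
# a single multiply-accumulate.  The outer loop below is time; the
# inner loops stand in for the cells firing simultaneously.

def systolic_matmul(a, b):
    n = len(a)
    c = [[0] * n for _ in range(n)]     # one accumulator per cell
    for k in range(n):                  # time steps of the array
        for i in range(n):              # cells fire in parallel
            for j in range(n):
                c[i][j] += a[i][k] * b[k][j]
    return c

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = systolic_matmul(A, B)
```

With n^2 cells working concurrently, the n time steps replace the n^3 sequential operations, which is the source of the speedup the transformations in the paper aim to expose.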