Abstract: With major technological advances and the need to reduce the
cost of training apprentices for real-time critical systems, the
development of Intelligent Tutoring Systems for training apprentices
in these systems has become necessary. These systems generally have
interactive features so that learning is actually more efficient,
making the learner more familiar with the mechanism in question. In
the final stage of learning, tests are performed to assess the student's
performance, a measure of their use of the system. The aim of this paper
is to present a framework to model an Intelligent Tutoring System using
the UML language. At the various steps of the analysis, we consider the
diagrams required to build a general model, whose purpose is to
present the different perspectives of its development.
Abstract: Unlike their conventional counterparts, Islamic banks are
forbidden by Islamic principles from taking any interest-related
income, which makes deposits from depositors an important source of
funds for their operations and financing. Consequently, the risk of
deposit withdrawal by depositors is an important aspect that should be
well-managed in Islamic banking. This paper aims to investigate factors
that influence depositors' withdrawal behavior in Islamic banks,
particularly in Malaysia, using the framework of the theory of reasoned
action. A total of 368 respondents from the Klang Valley are involved in
the analysis. The paper finds that all the construct variables, i.e.,
normative beliefs, subjective norms, behavioral beliefs, and attitude
towards behavior, are perceived to be distinct by the respondents. In
addition, the structural equation model is able to verify the structural
relationships between subjective norms, attitude towards behavior,
and behavioral intention. Subjective norms exert more influence on
depositors' deposit-withdrawal decisions than attitude towards
behavior.
Abstract: Vision-based intelligent vehicle applications often require large amounts of memory to handle video streaming and image processing, which in turn increases the complexity of hardware and software. This paper presents an FPGA implementation of a vision-based blind spot warning system. Using video frames, the information in the blind spot area is reduced to one-dimensional information. Analysis of the estimated image entropy allows timely detection of an object. This idea has been implemented in the XtremeDSP video starter kit. The blind spot warning system uses only 13% of the kit's logic resources and 95k bits of block memory, and its frame rate is over 30 frames per second (fps).
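The entropy idea can be illustrated with a small sketch (the FPGA pipeline itself is not reproduced here): it estimates the Shannon entropy of a hypothetical one-dimensional intensity profile taken from the blind-spot region; a textured object raises the entropy well above that of a near-uniform road surface. The bin count and the decision threshold are illustrative assumptions, not the paper's values.

```python
import math
from collections import Counter

# Illustrative sketch: Shannon entropy of a 1-D intensity profile (values
# 0..255) from the blind-spot region; bin count and threshold are assumed.
def entropy(profile, bins=16):
    counts = Counter(v * bins // 256 for v in profile)
    n = len(profile)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def object_present(profile, threshold=2.0):
    # A textured object (e.g. a car) raises entropy far above that of a
    # near-uniform road surface.
    return entropy(profile) > threshold

road = [100] * 64                            # flat road: one intensity
car = [(i * 37) % 256 for i in range(64)]    # varied object intensities
```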
Abstract: We developed a non-contact method for the in-situ
monitoring of the thermal forming of glass and Si foils to optimize
the manufacture of mirrors for high-resolution space x-ray
telescopes. Their construction requires precise and light-weight
segmented optics with angular resolution better than 5 arcsec. We
used 75x25 mm Desag D263 glass foils 0.75 mm thick and 0.6 mm
thick Si foils. The glass foils were shaped by free slumping on a
frame at viscosities in the range of 10^9.3 to 10^12 dPa·s, the Si foils by
forced slumping above 1000°C. Using a Nikon D80 digital camera,
we took snapshots of a foil's shape every 5 min during its isothermal
heat treatment. The obtained results can be used for computer
simulations. By comparing the measured and simulated data, we can
more precisely define the material properties of the foils and optimize
the forming technology.
Abstract: Searching the internet is currently very popular, especially in the academic field. The huge amount of educational information, such as research papers, overloads users. Community-based web sites have therefore been developed to help users find information more easily, by customizing a web site to the needs of each specific user or set of users. This paper proposes using association rules to analyze community groups in research-paper bookmarking. A set of design goals for community group frameworks is developed and discussed. Additionally, we analyze the initial relations, using association rule discovery between the antecedent and the consequent of a rule in groups of users, to generate ideas for improving the ranking of search results and for developing a recommender system.
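The support/confidence core of association rule discovery can be sketched as follows. The bookmark data and topic tags are hypothetical, and a real system would mine rules with an Apriori-style search rather than evaluating single rules by hand.

```python
# Hypothetical bookmarking data: each user's set of research-paper topic
# tags; support and confidence are the standard association-rule measures.
bookmarks = [
    {"data-mining", "clustering"},
    {"data-mining", "clustering", "ranking"},
    {"data-mining", "ranking"},
    {"clustering"},
]

def support(itemset):
    """Fraction of users whose bookmarks contain every tag in itemset."""
    return sum(itemset <= b for b in bookmarks) / len(bookmarks)

def confidence(antecedent, consequent):
    """confidence(A -> B) = support(A union B) / support(A)."""
    return support(antecedent | consequent) / support(antecedent)
```

A rule such as {data-mining} -> {clustering} would then be kept or discarded by comparing its support and confidence against minimum thresholds.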
Abstract: Environmental aspects play a central role in an environmental management system (EMS) because they are the basis for the identification of an organization's environmental targets. The
existing methods for the assessment of environmental aspects can be grouped into three categories: risk assessment-based (RA-based),
LCA-based, and criterion-based methods. To combine the benefits of
these three categories of research, this study proposes an integrated framework combining RA-, LCA- and criterion-based methods. The
integrated framework incorporates LCA techniques for the identification of the causal linkage between aspect, pathway, receptor and
impact, uses fuzzy logic to assess aspects, considers fuzzy conditions
in likelihood assessment, and employs a new multi-criteria decision analysis method, multi-criteria and multi-connection comprehensive
assessment (MMCA), to estimate significant aspects in the EMS. The proposed model is verified using a real case study, and the results show
that this method successfully prioritizes the environmental aspects.
Abstract: Cryptographic protocols are widely used in various
applications to provide secure communications. They are usually
represented as communicating agents that send and receive messages.
These agents use their knowledge to exchange information and
communicate with other agents involved in the protocol. An agent's
knowledge can be partitioned into explicit knowledge and procedural
knowledge. The explicit knowledge refers to the set of information
which is either proper to the agent or directly obtained from other
agents through communication. The procedural knowledge relates to
the set of mechanisms used to get new information from what is
already available to the agent.
In this paper, we propose a mathematical framework which specifies
the explicit knowledge of an agent involved in a cryptographic
protocol. Modelling this knowledge is crucial for the specification,
analysis, and implementation of cryptographic protocols. We also
report on a prototype tool that allows the representation and the
manipulation of the explicit knowledge.
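A minimal sketch of how explicit knowledge grows through procedural mechanisms is shown below, using two such mechanisms: pair projection and symmetric decryption with a known key. The `"pair"`/`"enc"` message constructors are hypothetical illustrations, not the paper's notation.

```python
# Sketch: saturate a set of known messages under pair projection and
# symmetric decryption; constructors ("pair", "enc") are hypothetical.
def close(knowledge):
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for msg in list(known):
            if isinstance(msg, tuple) and msg[0] == "pair":
                for part in msg[1:]:          # project both components
                    if part not in known:
                        known.add(part)
                        changed = True
            elif isinstance(msg, tuple) and msg[0] == "enc" and msg[2] in known:
                if msg[1] not in known:       # decrypt enc(m, k) when k known
                    known.add(msg[1])
                    changed = True
    return known

# The agent receives a pair containing a ciphertext and its key ...
initial = {("pair", ("enc", "secret", "k1"), "k1")}
```

Applying `close` to `initial` first splits the pair (yielding the ciphertext and the key `"k1"`), then decrypts the ciphertext, so `"secret"` ends up in the agent's explicit knowledge.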
Abstract: Seismic design may require non-conventional
concepts, due to the fact that the stiffness and layout of the structure
have a great effect on the overall structural behaviour, on the seismic
load intensity, and on the internal force distribution. To find an
economical and optimal structural configuration, the key issue is the
optimal design of the lateral load resisting system. This paper focuses
on the optimal design of regular, concentric braced frame (CBF)
multi-storey steel building structures. The optimal configurations are
determined by a numerical method using a genetic algorithm approach
developed by the authors. The aim is to find structural configurations
with minimum structural cost. The design constraints of the objective
function are assigned in accordance with Eurocode 3 and Eurocode 8
guidelines. In this paper the results are presented for various building
geometries, different seismic intensities, and levels of energy
dissipation.
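The genetic-algorithm search can be illustrated with a toy sketch: each individual encodes a brace layout as one bit per bay, and selection, crossover, and mutation drive the population towards the cheapest layout that satisfies a constraint. The bit-string encoding, the cost function, and the stand-in stiffness constraint below are all illustrative assumptions, nothing like the authors' Eurocode-checked objective.

```python
import random

random.seed(1)  # reproducible toy run

# Toy GA: an individual is a brace layout, one bit per bay (1 = braced).
# Cost and the "too flexible" penalty are purely illustrative.
N_BAYS, POP, GENS = 8, 20, 40

def cost(layout):
    steel = sum(layout)                        # material grows with braces
    penalty = 0 if sum(layout) >= 3 else 100   # stand-in stiffness constraint
    return steel + penalty

def evolve():
    pop = [[random.randint(0, 1) for _ in range(N_BAYS)] for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=cost)                     # elitist selection
        parents = pop[:POP // 2]
        children = []
        while len(children) < POP - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_BAYS)  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:          # point mutation
                i = random.randrange(N_BAYS)
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)
```

In this toy problem the optimum is any layout with exactly three braces (cost 3); elitist selection guarantees the best individual is never lost between generations.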
Abstract: In this paper, we investigate the strategic stochastic air traffic flow management problem, which seeks to balance airspace capacity and demand under weather disruptions. The goal is to reduce the need for myopic tactical decisions that do not account for probabilistic knowledge about the near-future states of the National Airspace System (NAS). We present and discuss a scenario-based modeling approach based on a time-space stochastic process to depict weather disruption occurrences in the NAS. A solution framework is also proposed, along with a distributed implementation aimed at overcoming scalability problems. Issues related to this implementation are also discussed.
Abstract: Crime is a major societal problem for most of the
world's nations. Consequently, the police need to develop new
methods to improve their efficiency in dealing with ever-increasing crime rates. Two common difficulties that the police
face in crime control are crime investigation and the provision of crime information to the general public to help them protect themselves. Crime control in police operations involves the use of
spatial data, crime data and the related crime data from different organizations (depending on the nature of the analysis to be made).
These types of data are collected from several heterogeneous sources
in different formats and from different platforms, resulting in a lack of standardization. Moreover, there is no standard framework for
crime data collection, integration and dissemination through mobile
devices. An investigation into the current situation in crime control was carried out to identify what is needed to resolve these issues. This
paper proposes and investigates the use of service oriented
architecture (SOA) and the mobile spatial information service in crime control. SOA plays an important role in crime control as an
appropriate way to support data exchange and model sharing from
heterogeneous sources. Crime control also needs to facilitate mobile
spatial information services in order to exchange, receive, share and release information based on location to mobile users anytime and
anywhere.
Abstract: Recently, a great amount of interest has been shown
in the field of modeling and controlling hybrid systems. One of the
efficient and common methods in this area utilizes mixed logical-dynamical
(MLD) systems in the modeling. In this method, the
system constraints are transformed into mixed-integer inequalities by
defining some logic statements. In this paper, a system containing
three tanks is modeled as a nonlinear switched system by using the
MLD framework. Comparing the model size of the three-tank system
with that of a two-tank system, it is deduced that the number of
binary variables, the size of the system, and its complexity
increase tremendously with the number of tanks, which makes the
control of the system more difficult. Therefore, methods should be
found which result in fewer mixed-integer inequalities.
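The core MLD step, turning a logic statement into mixed-integer inequalities, can be sketched for a single level threshold. The standard big-M translation of [delta = 1] <-> [x >= H] is checked numerically below; the threshold and level bounds are illustrative, not the three-tank model's values.

```python
# Big-M encoding sketch of the MLD logic step [delta = 1] <-> [x >= H]
# as two mixed-integer inequalities; H and the bounds are illustrative.
H, X_MIN, X_MAX, EPS = 0.5, 0.0, 1.0, 1e-6
M_MAX = H - X_MIN   # max of f(x) = H - x over the level range
M_MIN = H - X_MAX   # min of f(x)

def feasible(x, delta):
    """Both inequalities of the big-M translation hold."""
    f = H - x
    return f <= M_MAX * (1 - delta) and f >= EPS + (M_MIN - EPS) * delta

def delta_from_logic(x):
    # The logic statement the inequalities are meant to reproduce.
    return 1 if x >= H else 0
```

For every level x in the range, exactly the delta dictated by the logic statement satisfies both inequalities, which is what lets a mixed-integer solver handle the switched dynamics.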
Abstract: This paper describes an algorithm to estimate real-time vehicle velocity using image processing techniques and known camera calibration parameters. The presented algorithm involves several main steps. First, the moving object is extracted by utilizing a frame differencing technique. Second, an object tracking method is applied, and the speed is estimated based on the displacement of the object's centroid. Several assumptions are listed to simplify the transformation from the 3D real world to 2D images. The results obtained from the experiment have been compared to the estimated ground truth. The experiment shows that the proposed algorithm achieves a velocity estimation accuracy of about ±1.7 km/h.
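The two main steps can be sketched in miniature: frame differencing isolates the moving object, then speed follows from the displacement of its centroid between frames. The frame rate and the pixel-to-metre scale below are hypothetical calibration values standing in for the camera parameters.

```python
# Sketch of the speed step only; FPS and the pixel-to-metre scale are
# hypothetical calibration values.
FPS = 30.0
METRES_PER_PIXEL = 0.05

def diff_mask(prev, curr, thresh=20):
    """Binary mask of pixels that changed between two grayscale frames."""
    return [[abs(c - p) > thresh for p, c in zip(pr, cr)]
            for pr, cr in zip(prev, curr)]

def centroid(mask):
    pts = [(x, y) for y, row in enumerate(mask)
           for x, on in enumerate(row) if on]
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))

def speed_kmh(c0, c1):
    dx, dy = c1[0] - c0[0], c1[1] - c0[1]
    metres_per_frame = (dx * dx + dy * dy) ** 0.5 * METRES_PER_PIXEL
    return metres_per_frame * FPS * 3.6   # m/frame -> km/h
```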
Abstract: With the advent of emerging personal computing paradigms such as ubiquitous and mobile computing, Web contents are becoming accessible from a wide range of mobile devices. Since these devices do not have the same rendering capabilities, Web contents need to be adapted for transparent access from a variety of client agents. Such content adaptation is applied to either an individual element or a set of consecutive elements in a Web document and results in better rendering and faster delivery to the client device. Nevertheless, Web content adaptation sets new challenges for semantic markup. This paper presents an advanced components platform, called SMC, enabling the development of mobility applications and services according to a channel model based on the principles of Service Oriented Architecture (SOA). It then goes on to describe the potential for integration with the Semantic Web through a novel framework of external semantic annotation that prescribes a scheme for representing semantic markup files and a way of associating Web documents with these external annotations. The role of semantic annotation in this framework is to describe the contents of the individual documents themselves, assuring the preservation of the semantics during the process of adapting content rendering. Semantic Web content adaptation is a way of adding value to Web contents and facilitates the repurposing of Web contents (enhanced browsing, Web Services location and access, etc.).
Abstract: Property investment in the real estate industry carries
high risk, due to the uncertainty factors that affect the decisions
made, and high cost. The analytic hierarchy process has long relied
on experts' opinions to measure the uncertainty of the risk factors
for risk analysis. However, experts with different levels of
experience will form different opinions, leading to conflict among
the experts in the field. The objective of this paper is to propose a
new technique to measure the uncertainty of the risk factors, based
on a multidimensional data model and data mining techniques, as a
deterministic approach. The proposed technique consists of a basic
framework which includes four modules: user, technology, end-user
access tools, and applications. The property investment risk analysis
is defined as a micro-level analysis, since the features of the
property are considered in the analysis in this paper.
Abstract: In this paper, we present a comparative study between two computer vision systems for object recognition and tracking. These algorithms describe two different approaches based on regions, constituted by sets of pixels, which parameterize objects in shot sequences. For image segmentation and object detection, the FCM technique is used; the overlap between cluster distributions is minimized by the use of a suitable color space (other than RGB). The first technique takes into account the a priori probabilities governing the computation of the various clusters to track objects. A Parzen kernel method is described that allows identifying the players in each frame, and we also show the importance of the choice of the standard deviation of the Gaussian probability density function. Region matching is carried out by an algorithm that operates on the Mahalanobis distance between region descriptors in two subsequent frames and uses singular value decomposition to compute a set of correspondences satisfying both the principle of proximity and the principle of exclusion.
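The region-matching step can be sketched with a Mahalanobis distance over region descriptors, here simplified to a hypothetical diagonal covariance so no matrix inversion is needed, and a greedy proximity-plus-exclusion assignment (the paper computes correspondences via singular value decomposition instead). The descriptors and variances below are illustrative.

```python
import math

# Region-matching sketch: diagonal-covariance Mahalanobis distance plus a
# greedy proximity-and-exclusion assignment; data below are hypothetical.
def mahalanobis(desc_a, desc_b, variances):
    return math.sqrt(sum((a - b) ** 2 / v
                         for a, b, v in zip(desc_a, desc_b, variances)))

def match(regions_prev, regions_curr, variances, max_dist=3.0):
    """Pair each previous region with its nearest unclaimed current region."""
    pairs, taken = [], set()
    for i, a in enumerate(regions_prev):
        candidates = [(mahalanobis(a, b, variances), j)
                      for j, b in enumerate(regions_curr) if j not in taken]
        if candidates:
            dist, j = min(candidates)
            if dist <= max_dist:           # proximity
                pairs.append((i, j))
                taken.add(j)               # exclusion: one match per region
    return pairs
```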
Abstract: Lately, significant work in the area of Intelligent
Manufacturing has become public, mainly applied within the
frame of industrial purposes. Special efforts have been made in the
implementation of new technologies and of management and control
systems, among many others, which have all evolved the field. Aware
of all this, and due to the scope of new projects and the need to
turn existing flexible ideas into more autonomous and intelligent
ones, i.e., Intelligent Manufacturing, the present paper aims to
contribute to the design and analysis of the material flow in
systems, cells, or workstations under this new "intelligent"
denomination. Besides offering a conceptual basis for some of the
key points to be taken into account and some general principles to
consider in the design and analysis of the material flow, the paper
also offers tips on how to define other possible alternative material
flow scenarios and a classification of the states of a system, cell,
or workstation. All this is done with the intention of relating it to
the use of simulation tools, which are briefly addressed with a
special focus on the Witness simulation package. For better
comprehension, the previous elements are supported by a detailed
layout, other figures, and a few expressions which could help in
obtaining the necessary data. Such data and others will be used in the
future, when simulating the scenarios in search of the best material
flow configurations.
Abstract: With the gradual increase of enterprise scale, firms
may possess many manufacturing plants located in geographically
different places. This change results in multi-site production
planning problems in an environment of multiple plants or
production resources. Our research proposes a structural framework
to analyze multi-site planning problems. The analytical framework
is composed of six elements: a multi-site conceptual model, product
structure (bill of manufacturing), production strategy, manufacturing
capability and characteristics, production planning constraints, and
key performance indicators. In addition to discussing these six
elements, we also review related literature in this paper to match
our analytical framework. Finally, we take a real-world practical
example of a TFT-LCD manufacturer in Taiwan to explain our
proposed analytical framework for the multi-site production
planning problems.
Abstract: Clustering in high dimensional space is a difficult
problem which is recurrent in many fields of science and
engineering, e.g., bioinformatics, image processing, pattern
recognition, and data mining. In high dimensional space some of
the dimensions are likely to be irrelevant, thus hiding the possible
clustering. In very high dimensions it is common for all the objects in
a dataset to be nearly equidistant from each other, completely
masking the clusters. Hence, performance of the clustering algorithm
decreases.
In this paper, we propose an algorithmic framework which
combines the (reduct) concept of rough set theory with the k-means
algorithm to remove the irrelevant dimensions in a high dimensional
space and obtain appropriate clusters. Our experiment on test data
shows that this framework increases efficiency of the clustering
process and accuracy of the results.
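Under simplifying assumptions, the two-stage framework can be sketched with a variance filter standing in for the rough-set reduct (both discard attributes that cannot discern between objects), followed by plain k-means on the surviving dimensions. The data, variance cutoff, and iteration count are illustrative.

```python
import random

random.seed(0)

# Stage 1 (reduct stand-in): drop dimensions whose variance is too small
# to discern objects. Stage 2: plain k-means on the remaining dimensions.
def relevant_dims(points, min_var=1e-3):
    n, d = len(points), len(points[0])
    keep = []
    for j in range(d):
        col = [p[j] for p in points]
        mean = sum(col) / n
        if sum((x - mean) ** 2 for x in col) / n > min_var:
            keep.append(j)
    return keep

def kmeans(points, k, iters=20):
    centres = random.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centres[c])))
            groups[i].append(p)
        centres = [tuple(sum(xs) / len(xs) for xs in zip(*g)) if g
                   else centres[i] for i, g in enumerate(groups)]
    return centres, groups
```

On a toy dataset with one irrelevant constant dimension, the filter keeps only the informative dimension, and k-means then recovers the two clusters cleanly.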
Abstract: Quantum computation using qubits made of two-component Bose-Einstein condensates (BECs) is analyzed. We construct a general framework for quantum algorithms to be executed using the collective states of the BECs. The use of BECs allows for an increase of energy scales via bosonic enhancement, resulting in two-qubit gate operations that can be performed in a time reduced by a factor of N, where N is the number of bosons per qubit. We illustrate the scheme by an application to Deutsch's and Grover's algorithms, and discuss possible experimental implementations. Decoherence effects are analyzed under both general conditions and for the proposed experimental implementation.
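The behaviour of Grover's algorithm can be checked with a classical statevector sketch (the collective-state BEC encoding itself is not modelled here): for N = 4 items, a single oracle-plus-diffusion step finds the marked item with certainty.

```python
# Classical statevector sketch of one Grover iteration over n_items basis
# states; the BEC gate-level implementation is not modelled.
def grover_step(n_items, marked):
    amp = [1 / n_items ** 0.5] * n_items   # uniform superposition
    amp[marked] *= -1                      # oracle: phase-flip the target
    mean = sum(amp) / n_items
    return [2 * mean - a for a in amp]     # diffusion: invert about the mean

# With 4 items, one iteration concentrates all probability on the target.
probs = [a * a for a in grover_step(4, 2)]
```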
Abstract: This paper describes a computationally more
efficient, adaptive threshold estimation method for image
denoising in the wavelet domain, based on Generalized Gaussian
Distribution (GGD) modeling of subband coefficients. In the
proposed method, the choice of the threshold estimate is made
by analysing statistical parameters of the wavelet subband
coefficients, such as the standard deviation, arithmetic mean, and
geometric mean. The noisy image is first decomposed into many
levels to obtain different frequency bands. Then a soft thresholding
method is used to remove the noisy coefficients, fixing the optimum
threshold value by the proposed method. Experimental results on
several test images show that this method yields significantly
superior image quality and better Peak Signal to Noise Ratio
(PSNR). To demonstrate the efficiency of this method in image
denoising, we compare it with various denoising methods such as
the Wiener filter, average filter, VisuShrink, and BayesShrink.
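The soft-thresholding step alone can be sketched as follows; the wavelet decomposition itself would come from a library such as PyWavelets, and the universal (VisuShrink-style) threshold here stands in for the paper's GGD-based estimate.

```python
import math

# Soft-thresholding sketch; the universal threshold below is the classical
# VisuShrink choice, not the paper's GGD-based estimate.
def soft_threshold(coeffs, t):
    """Shrink each coefficient towards zero by t, zeroing the small ones."""
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

def universal_threshold(coeffs):
    n = len(coeffs)
    median_abs = sorted(abs(c) for c in coeffs)[n // 2]
    sigma = median_abs / 0.6745            # robust noise estimate (MAD)
    return sigma * math.sqrt(2 * math.log(n))
```

In a full pipeline, `universal_threshold` (or the paper's adaptive estimate) would be computed per detail subband and applied with `soft_threshold` before reconstructing the image.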