Abstract: The element of justice or al-‘adl in the context of
Islamic critical thinking deals with the notion of justice in a thinking
process which critically rationalizes the truth in a fair and objective
manner with no irrelevant interference that can jeopardize a sound
judgment. This Islamic axiological element is vital in technological
decision making as it addresses the issues of religious values and
ethics that are primarily set to fulfill the purpose of human life on
earth. The main objective of this study was to examine and analyze
the perception of Muslim engineering students in Malaysian higher
education institutions towards the concept of al-‘adl as an essential
element of Islamic critical thinking. The study employed a mixed-methods
approach comprising data collected from a questionnaire survey and
interview responses. A total of 557
Muslim engineering undergraduates from six Malaysian universities
participated in the study. The study generally indicated that Muslim
engineering undergraduates in higher institutions have a rather good
comprehension and consciousness of al-‘adl, with a slight awareness of
the importance of objective thinking. Nonetheless, a few items on the
concept implied a comparatively low perception of rational justice in
Islam as the means to grasp the ultimate truth.
Abstract: The experiment was conducted to evaluate the protein
digestibility of Canola Meals (CMs) in caecectomised and intact adult
Rhode Island Red (RIR) cockerels using the conventional addition
method (CAM) over 7 d: a 4-d adaptation and a 3-d experimental period,
on the basis of a completely randomized design with 4 replicates.
Results indicated that caecectomy decreased (P
Abstract: A newly designed gas-distributor for granulation of powdery materials in an equilibrated fluidized bed, together with a system for collecting the prepared granules, is suggested. The aim of these designs is to solve the problems arising in the granulation of powdery materials in fluidized-bed devices. The gas-distributor and the collection system proved to be reliable in operation; they reduce the size of still zones, effectively disperse the binding solution in the bed, and ensure the collection of granules of a given diameter.
Abstract: Current tools for data migration between document-oriented
and relational databases have several disadvantages. We propose a new
approach for data migration between document-oriented and relational
databases. During data migration, the relational schema of the target
(relational database) is automatically created from a collection of
XML documents. The proposed approach is verified on data migration
between the document-oriented database IBM Lotus Notes/Domino and a
relational database implemented in the relational database management
system (RDBMS) MySQL.
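As a sketch of how such schema inference might look (this is not the paper's actual algorithm; the flat one-table mapping, element names, and function names are illustrative assumptions), the union of field names across a document collection can serve as the relational column set:

```python
import xml.etree.ElementTree as ET

def infer_schema(xml_docs):
    """Infer a flat relational schema (column set) from a collection of
    XML documents by taking the union of top-level field names."""
    columns = set()
    for doc in xml_docs:
        root = ET.fromstring(doc)
        for child in root:
            columns.add(child.tag)
    return sorted(columns)

def to_rows(xml_docs, columns):
    """Map each document onto the inferred schema; missing fields become None (NULL)."""
    rows = []
    for doc in xml_docs:
        root = ET.fromstring(doc)
        values = {child.tag: child.text for child in root}
        rows.append(tuple(values.get(c) for c in columns))
    return rows

docs = [
    "<note><author>Ann</author><subject>Budget</subject></note>",
    "<note><author>Bob</author><cc>Carol</cc></note>",
]
cols = infer_schema(docs)
rows = to_rows(docs, cols)
```

A real migration would additionally infer column types, map nested elements to separate tables with foreign keys, and emit the corresponding CREATE TABLE and INSERT statements for the target RDBMS.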
Abstract: Along with the forward supply chain, organizations need
to consider the impact of reverse logistics due to its economic
advantages, social awareness, and strict legislation. In this paper, we
develop a system dynamics framework for a closed-loop supply
chain with fuzzy demand and a fuzzy collection rate by incorporating
a product exchange policy in the forward channel and various recovery
options in the reverse channel. The uncertainty issues associated with
the acquisition and collection of used products have been quantified
using possibility measures. In the simulation study, we analyze order
variation at both the retailer and distributor levels and compare the
bullwhip effects of different logistics participants over time between
the traditional forward supply chain and the closed-loop supply chain.
Our results suggest that the integration of reverse logistics can reduce
the order variation and bullwhip effect of a closed-loop system. Finally,
sensitivity analysis is performed to examine the impact of various
parameters on the recovery process and the bullwhip effect.
Abstract: With the extensive inclusion of documents, especially
text, in business systems, data mining does not cover the full
scope of Business Intelligence. Data mining cannot deliver its impact
on extracting useful details from large collections of unstructured
and semi-structured written materials based on natural languages.
The most pressing issue is to draw potential business intelligence
from text. In order to gain competitive advantages for the business, it
is necessary to develop a new, powerful tool, text mining, to expand
the scope of business intelligence.
In this paper, we work out the strong points of text mining in
extracting business intelligence from the huge amount of textual
information sources within business systems. We apply text mining to
each stage of Business Intelligence systems to show that text mining
is a powerful tool for expanding the scope of BI. After reviewing
basic definitions and some related technologies, we discuss their
relationship to and benefits for text mining. Some examples and
applications of text mining are also given. The motivation is to
develop a new approach to effective and efficient textual information
analysis, and thus to expand the scope of Business Intelligence using
the powerful tool of text mining.
Abstract: In this paper, we propose an improved 3D star skeleton
technique, a skeletonization suitable for human posture representation
that reflects the 3D information of human posture. Moreover, the
proposed technique is simple and can therefore be performed in real
time. Existing skeleton construction techniques, such as distance
transformation, Voronoi diagrams, and thinning, focus on the precision
of skeleton information. Those techniques are therefore not applicable
to real-time posture recognition, since they are computationally
expensive and highly susceptible to boundary noise. Although a 2D star
skeleton was proposed to mitigate these problems, it also has
limitations in describing the 3D information of the posture. To
represent human posture effectively, the constructed skeleton should
consider the 3D information of the posture. The proposed 3D star
skeleton contains 3D data of the human body and focuses on human
action and posture recognition. Our 3D star skeleton uses eight
projection maps that hold 2D silhouette information and depth data of
the human surface, and the extremal points can be extracted as the
features of the 3D star skeleton without searching the whole boundary
of the object. In terms of execution time, our 3D star skeleton is
therefore faster than the "greedy" 3D star skeleton, which uses all
boundary points on the surface. Moreover, our method can offer a more
accurate skeleton of the posture than the existing star skeleton, since
the 3D data of the object are taken into account. Additionally, we
build a codebook, a collection of representative 3D star skeletons of
7 postures, to recognize the posture of a constructed skeleton.
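A minimal 2D sketch of the star-skeleton idea that the 3D method builds on (the toy contour and the centroid-distance criterion are illustrative; the paper's method works on eight projection maps rather than a single boundary):

```python
import math

def star_extremal_points(boundary):
    """Extremal points of a star skeleton: local maxima of the distance
    from the shape centroid along an ordered, closed boundary."""
    cx = sum(p[0] for p in boundary) / len(boundary)
    cy = sum(p[1] for p in boundary) / len(boundary)
    d = [math.hypot(x - cx, y - cy) for x, y in boundary]
    n = len(d)
    extremal = []
    for i in range(n):
        if d[i] > d[(i - 1) % n] and d[i] > d[(i + 1) % n]:
            extremal.append(boundary[i])
    return extremal

# A cross-shaped contour: the four arm tips are the extremal points.
contour = [(0, 3), (1, 1), (3, 0), (1, -1), (0, -3), (-1, -1), (-3, 0), (-1, 1)]
tips = star_extremal_points(contour)
```

Connecting the extremal points (e.g. head, hands, feet) back to the centroid yields the star-shaped skeleton used as a posture feature.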
Abstract: In this study, a dispersed model is used to predict the
gas-phase concentration and the liquid-drop concentration. The venturi
scrubber efficiency is calculated from the gas-phase concentration. The
modified model has been validated against the available experimental
data of Johnstone, Field, and Tasler for a range of throat gas
velocities, liquid-to-gas ratios, and particle diameters, and is used
to study the effect of some design parameters on collection efficiency.
Abstract: Text mining is the application of knowledge discovery techniques to unstructured text; it is also termed knowledge discovery in text (KDT) or text data mining. In neural networks that address classification problems, the training set, the testing set, and the learning rate are considered key elements: the collections of input/output patterns used to train the network and to assess its performance, and the rate of weight adjustment. This paper describes a proposed back-propagation neural network classifier that performs cross-validation on the original neural network in order to optimize classification accuracy and reduce training time. The feasibility and benefits of the proposed approach are demonstrated by means of five data sets: contact-lenses, cpu, weather-symbolic, weather, and labor-nega-data. It is shown that, compared to the existing neural network, training is more than 10 times faster when the data set is larger than cpu or the network has many hidden units, while accuracy ('percent correct') was the same for all data sets except contact-lenses, which is the only one with missing attributes. For contact-lenses, the accuracy of the proposed neural network was on average around 0.3% lower than that of the original neural network. The algorithm is independent of specific data sets, so many of its ideas and solutions can be transferred to other classifier paradigms.
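A minimal sketch of the k-fold cross-validation harness described above (a trivial majority-class classifier stands in for the back-propagation network; the data and function names are our own illustrative assumptions):

```python
def k_fold_cross_validate(samples, labels, k, train_fn, predict_fn):
    """Estimate accuracy by k-fold cross-validation: train on k-1 folds,
    test on the held-out fold, and average the fold accuracies."""
    n = len(samples)
    folds = [list(range(i, n, k)) for i in range(k)]
    accuracies = []
    for test_idx in folds:
        train_idx = [i for i in range(n) if i not in test_idx]
        model = train_fn([samples[i] for i in train_idx],
                         [labels[i] for i in train_idx])
        correct = sum(predict_fn(model, samples[i]) == labels[i]
                      for i in test_idx)
        accuracies.append(correct / len(test_idx))
    return sum(accuracies) / k

# Stand-in classifier: always predict the majority training label.
def train_majority(xs, ys):
    return max(set(ys), key=ys.count)

def predict_majority(model, x):
    return model

xs = list(range(10))
ys = ["yes"] * 8 + ["no"] * 2
acc = k_fold_cross_validate(xs, ys, 5, train_majority, predict_majority)
```

Replacing the stand-in `train_fn`/`predict_fn` with network training and a forward pass gives the cross-validated accuracy estimate the paper reports.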
Abstract: This research studies the types of products and services
that employ ambient media and the respective techniques in their
advertisement materials. Data collection was done via analyses of a
total of 62 advertisements that employed the ambient media approach in
Thailand during the years 2004 to 2011. The 62 advertisements were
qualifying advertisements of the Adman Awards & Symposium under the
category of Outdoor & Ambience. Analysis results reveal that a total
of 14 products and services chose to utilize ambient media in their
advertisements. Amongst all ambient media techniques, 'intrusion',
which uses the value of a medium in its representation of content, is
used most often. Following intrusion is 'interaction', where consumers
are invited to participate and interact with the advertising materials.
'Illusion' ranks third, subjecting viewers to distortions of reality
that make the division between reality and fantasy less clear.
Abstract: Automatic Vehicle Identification (AVI) has many
applications in traffic systems (highway electronic toll collection, red
light violation enforcement, border and customs checkpoints, etc.).
License Plate Recognition is an effective form of AVI system. In
this study, a smart and simple algorithm is presented for a vehicle
license plate recognition system. The proposed algorithm consists of
three major parts: extraction of the plate region, segmentation of
characters, and recognition of plate characters. For extracting the
plate region, edge detection and smearing algorithms are used. In the
segmentation part, smearing algorithms, filtering, and some
morphological algorithms are used. Finally, statistics-based template
matching is used for recognition of plate characters. The
performance of the proposed algorithm has been tested on real
images. Based on the experimental results, we noted that our
algorithm shows superior performance in car license plate
recognition.
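A minimal sketch of statistics-based template matching on binary character images (the 3x3 templates and the pixel-agreement score are illustrative assumptions; a real system would use larger, normalized glyphs):

```python
def match_score(glyph, template):
    """Fraction of pixels on which a binary glyph and template agree."""
    matches = sum(g == t for grow, trow in zip(glyph, template)
                  for g, t in zip(grow, trow))
    total = len(glyph) * len(glyph[0])
    return matches / total

def recognize(glyph, templates):
    """Return the template label with the highest agreement score."""
    return max(templates, key=lambda label: match_score(glyph, templates[label]))

# Tiny 3x3 binary templates (hypothetical character set).
templates = {
    "I": [[0, 1, 0], [0, 1, 0], [0, 1, 0]],
    "L": [[1, 0, 0], [1, 0, 0], [1, 1, 1]],
}
noisy_I = [[0, 1, 0], [1, 1, 0], [0, 1, 0]]  # one flipped pixel
best = recognize(noisy_I, templates)
```

Because the score is an average over all pixels, a single noisy pixel still leaves the correct template as the best match.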
Abstract: The purpose of this paper is to develop a typology
based on market orientation (MO) and innovation orientation (IO),
and to illustrate to what extent housing companies in Sweden fit
within this framework. A qualitative study on 11 public housing
companies in the central part of Sweden has been conducted with the
help of open and semi-structured questions for data collection. Four
public housing company types, i.e. reactive prospector, proactive
prospector, reactive defender, and proactive defender, have been
identified by combining the MO and IO dimensions. Future research can
include other dimensions, such as entrepreneurship and networks, to
observe how they affect MO. An empirical study could compare public
and private housing companies on the basis of the MO and IO dimensions.
One major contribution of the paper is the proposition of a typology
that can be used to describe public housing companies and to decide
their future courses of action.
Abstract: The notion of S-fuzzy left h-ideals in a hemiring is introduced, and its basic properties are investigated. We also study the homomorphic image and preimage of S-fuzzy left h-ideals of hemirings. Using a collection of left h-ideals of a hemiring, S-fuzzy left h-ideals of hemirings are established. The notion of a finite-valued S-fuzzy left h-ideal is introduced, and its characterization is given. S-fuzzy relations on hemirings are discussed. The notions of direct product and S-product are introduced, and some properties of the direct product and S-product of S-fuzzy left h-ideals of hemirings are also discussed.
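For orientation, the classical fuzzy left h-ideal conditions can be sketched as follows (these are the standard definitions with the minimum; reading S as the operation that replaces the minimum in the S-fuzzy variant is our assumption, not stated in this abstract):

```latex
% Classical fuzzy left h-ideal \mu of a hemiring H:
\mu(x + y) \ge \min\{\mu(x), \mu(y)\}, \qquad
\mu(xy) \ge \mu(y), \qquad \text{for all } x, y \in H,
% together with the h-condition:
x + a + z = b + z \;\Longrightarrow\;
\mu(x) \ge \min\{\mu(a), \mu(b)\},
\qquad \text{for all } x, z, a, b \in H.
```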
Abstract: Many people regard food events as part of gastronomic tourism and as important in enhancing visitors’ experiences. Realizing the importance and contribution of food events to the country’s economy, the Malaysian government is undertaking greater efforts to promote such tourism activities to international tourists. Among other food events, the Ramadan bazaar is a unique food culture event, which receives significant attention from the Malaysian Ministry of Tourism. This study reports an empirical investigation into international tourists’ perceptions of, attraction towards, and willingness to disseminate information about the Ramadan bazaar. Using the Ramadan bazaar at Kampung Baru, Kuala Lumpur as the data collection setting, results revealed that the Ramadan bazaar attributes (food and beverages, events, and culture) significantly influenced the international tourists’ attraction to such a bazaar. Their high level of experience and satisfaction positively influenced their willingness to disseminate information. The positive response among the international tourists indicates that the Ramadan bazaar, as gastronomic tourism, can be used alongside other tourism products as a catalyst to generate and boost the local economy. The related authorities that are closely associated with the tourism industry should therefore not ignore this indicator but continue to take proactive action in promoting the gastronomic event as one of the major tourist attractions.
Abstract: Cryptographic algorithms play a crucial role in the
information society by providing protection from unauthorized
access to sensitive data. It is clear that information technology will
become increasingly pervasive; hence we can expect the emergence of
ubiquitous or pervasive computing and ambient intelligence. These
new environments and applications will present new security
challenges, and there is no doubt that cryptographic algorithms and
protocols will form a part of the solution. The efficiency of a public
key cryptosystem is mainly measured in computational overhead,
key size, and bandwidth. In particular, the RSA algorithm is used in
many applications for providing security. Although the security
of RSA is beyond doubt, the evolution in computing power has
caused a growth in the necessary key length. The fact that most chips
on smart cards cannot process keys exceeding 1024 bits shows that
there is a need for an alternative. NTRU is such an alternative: a
collection of mathematical algorithms based on manipulating lists of
very small integers and polynomials. This allows NTRU to achieve high
speeds with the use of minimal computing power. NTRU (Nth degree
Truncated Polynomial Ring Unit) is the first secure public key
cryptosystem not based on the factorization or discrete logarithm
problems. This means that given sufficient computational resources
and time, an adversary should not be able to break the key.
Multi-party communication and the requirement of optimal resource
utilization create a present-day demand for applications whose
security enforcement techniques can be enhanced with high-end
computing. This has prompted us to develop high-performance NTRU
schemes using approaches such as the use of high-end computing
hardware. Peer-to-peer (P2P) and enterprise grids are proven
approaches for developing high-end computing systems, and by
utilizing them one can improve the performance of NTRU through
parallel execution. In this paper we propose and develop an
application for NTRU using the enterprise grid middleware Alchemi.
An analysis and comparison of its performance for various text files
is presented.
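The core NTRU operation alluded to above is star-multiplication of polynomials in the ring Z_q[x]/(x^N − 1), a cyclic convolution over lists of small integers. A minimal sketch (toy parameters for illustration, not a secure configuration):

```python
def conv_mul(f, g, N, q):
    """Multiply polynomials f and g (coefficient lists of length N) in
    Z_q[x]/(x^N - 1): cyclic (star) convolution with reduction mod q."""
    h = [0] * N
    for i in range(N):
        for j in range(N):
            h[(i + j) % N] = (h[(i + j) % N] + f[i] * g[j]) % q
    return h

# N = 3, q = 7: (1 + x) * (1 + x + x^2) in Z_7[x]/(x^3 - 1)
f = [1, 1, 0]
g = [1, 1, 1]
h = conv_mul(f, g, 3, 7)
```

Because each output coefficient is an independent sum, these convolutions parallelize naturally, which is what makes grid-based parallel execution of NTRU attractive.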
Abstract: The response surface methodology (RSM) is a collection of
mathematical and statistical techniques useful for modeling and
analyzing problems in which a dependent variable is influenced by
several independent variables, in order to determine the conditions
under which these variables should operate to optimize a production
process. RSM estimates a first-order regression model and sets the
search direction using the method of maximum/minimum slope up/down
(MMS U/D). However, this method selects the step size intuitively,
which can affect the efficiency of RSM. This paper assesses how the
step size affects the efficiency of this methodology. Numerical
examples are carried out through Monte Carlo experiments, evaluating
three response variables: the efficiency of the gain function, the
distance to the optimum, and the number of iterations. The simulation
experiments showed that the gain-function efficiency and the distance
to the optimum were not affected by the step size, whereas the number
of iterations was affected by both the step size and the type of test
function used.
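A minimal sketch of the first-order step that RSM performs (the coded 2^2 factorial design, the noise-free response, and the step-size rule that scales by the largest coefficient are illustrative assumptions, one common convention among several):

```python
def fit_first_order(X, y):
    """Fit y = b0 + b1*x1 + b2*x2 on an orthogonal coded 2^2 factorial
    design (levels -1/+1), where each coefficient is a simple average."""
    n = len(y)
    b0 = sum(y) / n
    b1 = sum(x[0] * yi for x, yi in zip(X, y)) / n
    b2 = sum(x[1] * yi for x, yi in zip(X, y)) / n
    return b0, b1, b2

def steepest_ascent_step(b1, b2, step):
    """Move along the gradient of the fitted plane, scaled so the
    larger coefficient moves by exactly `step` coded units."""
    m = max(abs(b1), abs(b2))
    return (step * b1 / m, step * b2 / m)

# Coded 2^2 design with noise-free responses from y = 10 + 4*x1 + 2*x2.
X = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
y = [4.0, 12.0, 8.0, 16.0]
b0, b1, b2 = fit_first_order(X, y)
dx1, dx2 = steepest_ascent_step(b1, b2, 1.0)
```

The choice of `step` here is exactly the intuitive step-size selection the abstract discusses: the direction is fixed by the coefficients, but how far to move along it is left to the experimenter.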
Abstract: The world's largest Pre-stressed Concrete Cylinder
Pipe (PCCP) water supply project had a series of pipe failures which
occurred between 1999 and 2001. This led the Man-Made River
Authority (MMRA), the authority in charge of the implementation
and operation of the project, to set up a rehabilitation plan for the
conveyance system while maintaining the uninterrupted flow of
water to consumers. At the same time, MMRA recognized the need
for a long-term management tool that would facilitate repair and
maintenance decisions and enable taking the appropriate preventive
measures through continuous monitoring and estimation of the
remaining life of each pipe. This management tool is known as the
Pipe Risk Management System (PRMS) and is now in operation at
MMRA. Both the rehabilitation plan and the PRMS require the
availability of complete and accurate pipe construction and
manufacturing data.
This paper describes a systematic approach to the collection,
analysis, evaluation, and correction of the construction and
manufacturing data files of Phase I pipes, which are the platform for
the PRMS database and any other related decision support system.
Abstract: This paper presents a system, called EART (Extract
Association Rules from Text), for discovering association rules from
collections of unstructured documents. The EART system treats texts
only, not images or figures. EART discovers association rules amongst
keywords labeling the collection of textual documents. The main
characteristic of EART is that the system integrates XML technology
(to transform unstructured documents into structured documents) with
an Information Retrieval scheme (TF-IDF) and a Data Mining technique
for association rule extraction. EART depends on word features to
extract association rules. It consists of four phases: a structure
phase, an index phase, a text mining phase, and a visualization phase.
Our work depends on the analysis of the keywords in the extracted
association rules, through the co-occurrence of the keywords in one
sentence in the original text and the existence of the keywords in one
sentence without co-occurrence. Experiments were applied to a
collection of scientific documents selected from MEDLINE that are
related to the outbreak of the H5N1 avian influenza virus.
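A minimal sketch of the TF-IDF weighting used in an index phase of this kind (the tokenized toy documents are illustrative, and the exact TF-IDF variant used by EART may differ):

```python
import math

def tf_idf(docs):
    """TF-IDF weights for each term of each tokenized document:
    tf = raw count in the document, idf = log(N / document frequency)."""
    n = len(docs)
    df = {}
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    weights = []
    for doc in docs:
        w = {}
        for term in doc:
            w[term] = w.get(term, 0) + 1
        for term in w:
            w[term] *= math.log(n / df[term])
        weights.append(w)
    return weights

docs = [["h5n1", "virus", "outbreak"],
        ["virus", "vaccine"],
        ["poultry", "outbreak", "outbreak"]]
weights = tf_idf(docs)
```

Terms that appear in every document get weight zero, so the surviving high-weight keywords are the ones worth feeding into association-rule mining.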
Abstract: Text categorization is the problem of classifying text documents into a set of predefined classes. After a preprocessing step, the documents are typically represented as large sparse vectors. When training classifiers on large collections of documents, both the time and memory restrictions can be quite prohibitive. This justifies the application of feature selection methods to reduce the dimensionality of the document-representation vector. Four feature selection methods are evaluated: Random Selection, Information Gain (IG), Support Vector Machine (called SVM_FS), and Genetic Algorithm with SVM (GA_FS). We show that the best results were obtained with the SVM_FS and GA_FS methods for a relatively small dimension of the feature vector, compared with the IG method, which requires longer vectors for quite similar classification accuracies. We also present a novel method to better correlate the SVM kernel's parameters (polynomial or Gaussian kernel).
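A minimal sketch of Information Gain scoring for a single binary term feature (the toy labels are illustrative assumptions; in practice the score is computed for every term and the top-scoring terms are kept):

```python
import math

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    h = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        h -= p * math.log2(p)
    return h

def information_gain(has_term, labels):
    """IG of a binary feature: H(labels) minus the weighted entropy of
    the documents that do / do not contain the term."""
    n = len(labels)
    with_t = [l for f, l in zip(has_term, labels) if f]
    without = [l for f, l in zip(has_term, labels) if not f]
    return (entropy(labels)
            - len(with_t) / n * entropy(with_t)
            - len(without) / n * entropy(without))

labels = ["spam", "spam", "ham", "ham"]
perfect = [1, 1, 0, 0]   # term occurs exactly in the spam documents
useless = [1, 0, 1, 0]   # term independent of the class
ig_perfect = information_gain(perfect, labels)
ig_useless = information_gain(useless, labels)
```

A perfectly class-aligned term scores the full class entropy (1 bit here), while a class-independent term scores zero, which is what makes IG a natural ranking criterion for feature selection.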
Abstract: The dynamic or complex modulus test is considered
to be a mechanistically based laboratory test to reliably characterize
the strength and load-resistance of Hot-Mix Asphalt (HMA) mixes
used in the construction of roads. The most common observation is
that the data collected from these tests are often noisy and somewhat
non-sinusoidal. This hampers accurate analysis of the data to obtain
engineering insight. The goal of the work presented in this paper is to
develop and compare automated evolutionary computational
techniques to filter test noise in the collection of data for the HMA
complex modulus test. The results showed that the Covariance
Matrix Adaptation-Evolutionary Strategy (CMA-ES) approach is
computationally efficient for filtering data obtained from the HMA
complex modulus test.
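As an illustration of the evolutionary-filtering idea (a simplified (1+1)-evolution strategy stands in here for CMA-ES, and the synthetic sinusoid is an illustrative assumption, not HMA test data):

```python
import math
import random

def sse(a, phi, t, y, freq):
    """Sum of squared errors of the model a*sin(freq*t + phi)."""
    return sum((a * math.sin(freq * ti + phi) - yi) ** 2
               for ti, yi in zip(t, y))

def es_fit_sinusoid(t, y, freq, iters=2000, seed=1):
    """Fit y ~ a*sin(freq*t + phi) with a (1+1)-evolution strategy:
    mutate (a, phi) with Gaussian noise and keep the mutant only when
    the squared error improves. A simplified stand-in for CMA-ES."""
    rng = random.Random(seed)
    a, phi = 1.0, 0.0
    best = sse(a, phi, t, y, freq)
    for _ in range(iters):
        a2 = a + rng.gauss(0, 0.5)
        phi2 = phi + rng.gauss(0, 0.5)
        err = sse(a2, phi2, t, y, freq)
        if err < best:
            a, phi, best = a2, phi2, err
    return a, phi, best

# Synthetic trace standing in for a measured modulus signal: 3*sin(2t + 0.5).
t = [i * 0.1 for i in range(100)]
y = [3.0 * math.sin(2.0 * ti + 0.5) for ti in t]
a, phi, err = es_fit_sinusoid(t, y, freq=2.0)
```

Fitting a clean sinusoid to the noisy record and keeping the fitted curve is the filtering step; CMA-ES improves on this sketch by adapting the full covariance of the mutation distribution rather than using a fixed step size.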