Abstract: Managing and improving efficiency in the current
highly competitive global automotive industry demands that those
companies adopt leaner and more flexible systems. During the past
20 years the domestic automotive industry in North America has been
focusing on establishing new management strategies in order to meet
market demands. The lean management process, also known as the
Toyota Production System (TPS) or lean manufacturing, encompasses
tools and techniques established to provide the best-quality product
with the fastest lead time at the
lowest cost. The following paper presents a study that focused on
improving labor efficiency at one of the Big Three (Ford, GM,
Chrysler LLC) domestic automotive facilities in North America. The
objective of the study was to utilize several lean management tools in
order to optimize the efficiency and utilization levels at the “Pre-
Marriage” chassis area in a truck manufacturing and assembly
facility. Utilizing three different lean tools (i.e., standardization of
work, the 7 wastes, and 5S), this research was able to improve efficiency
by 51% and utilization by 246%, and to reduce the number of operations by 14%. The
return on investment calculated based on the improvements made
was 284%.
Abstract: Fast changing knowledge systems on the Internet can
be accessed more efficiently with the help of automatic document
summarization and updating techniques. The aim of multi-document
update summary generation is to construct a summary unfolding the
mainstream of data from a collection of documents based on the
hypothesis that the user has already read a set of previous documents.
To extract richer semantic information from the documents,
deeper linguistic and semantic analysis of the source documents was
used instead of relying only on document word frequencies to select
important concepts. Producing a responsive summary requires
meaning-oriented structural analysis. To address this issue,
the proposed system presents a document summarization approach
based on sentence annotation with aspects, prepositions and named
entities. Semantic element extraction strategy is used to select
important concepts from documents which are used to generate
enhanced semantic summary.
Abstract: Construction cost estimation is one of the most
important aspects of construction project design. For generations, the
process of cost estimating has been manual, time-consuming and
error-prone. This has partly led to most cost estimates being unclear
and riddled with inaccuracies that at times lead to over- or under-estimation
of construction cost. The development of standard sets of
measurement rules, understandable by all those involved in a
construction project, has not fully solved these challenges. Emerging
Building Information Modelling (BIM) technologies can exploit
standard measurement methods to automate the cost estimation process
and improve accuracy. This requires standard measurement
methods to be structured in an ontological, machine-readable format
so that BIM software packages can easily read them. Most standard
measurement methods are still text-based in textbooks and require
manual editing into tables or spreadsheets during cost estimation. The
aim of this study is to explore the development of an ontology based
on New Rules of Measurement (NRM) commonly used in the UK for
cost estimation. The methodology adopted is Methontology, one of
the most widely used ontology engineering methodologies. The
challenges in this exploratory study are also reported and
recommendations for future studies proposed.
Abstract: Clustering involves the partitioning of n objects into k
clusters. Many clustering algorithms use hard-partitioning techniques
where each object is assigned to exactly one cluster. In this paper we propose
an overlapping algorithm, MCOKE, which allows objects to belong to
one or more clusters. The algorithm differs from fuzzy clustering
techniques because objects that overlap are assigned a membership
value of 1 (one) rather than a fuzzy membership degree. It also
differs from other overlapping algorithms that require a similarity
threshold to be defined a priori, which can be difficult for novice
users to determine.
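The thresholding idea can be illustrated with a short sketch: run hard k-means first, then let each object also join any cluster whose centroid lies within the largest object-to-assigned-centroid distance observed in the hard partition. The threshold rule shown is an assumption standing in for MCOKE's actual data-derived threshold, which the abstract does not spell out:

```python
import numpy as np

def mcoke(X, k, n_iter=50, seed=0):
    """Hard k-means, then overlap: an object joins every cluster whose
    centroid is within maxdist, the largest object-to-assigned-centroid
    distance seen in the hard partition (assumed threshold rule)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    maxdist = d.min(axis=1).max()            # data-derived threshold
    membership = (d <= maxdist).astype(int)  # crisp 0/1, not fuzzy degrees
    return membership, centroids
```

Each row of `membership` has at least one 1 (the object's k-means cluster) and may have several, unlike a fuzzy partition where degrees sum to 1.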
Abstract: High Peak to Average Power Ratio (PAPR) of the
transmitted signal is a serious problem in multicarrier systems (MC),
such as Orthogonal Frequency Division Multiplexing (OFDM), or in
Multi-Carrier Code Division Multiple Access (MC-CDMA) systems,
due to the large number of subcarriers. This effect can be reduced
with PAPR reduction techniques. Spreading sequences, in the presence
of the Saleh and Rapp models of the high power amplifier (HPA),
strongly influence system behavior. In this paper we investigate the
bit-error-rate (BER) performance of MC-CDMA systems. Simulations
show that the MC-CDMA system with the iterative algorithm provides
significantly better results than the conventional MC-CDMA system;
the results of our analyses are verified via simulation.
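PAPR is the ratio of the signal's peak instantaneous power to its average power, usually quoted in dB. A minimal numpy sketch; the subcarrier count and QPSK mapping are illustrative assumptions, not the simulation setup used in the paper:

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# Illustrative OFDM symbol: N QPSK subcarriers -> IFFT gives the
# time-domain multicarrier signal whose envelope fluctuates.
rng = np.random.default_rng(1)
N = 256
qpsk = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
ofdm = np.fft.ifft(qpsk) * np.sqrt(N)
print(f"PAPR = {papr_db(ofdm):.1f} dB")
```

A constant-envelope signal has 0 dB PAPR; random multicarrier symbols typically land near 8 to 12 dB, which is what PAPR reduction techniques target.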
Abstract: The current study investigated the influence of milling
time and ball-to-powder (BPR) weight ratio on the microstructural
constituents and mechanical properties of bulk nanocrystalline Al;
Al-10%Cu; and Al-10%Cu-5%Ti alloys. Powder consolidation was
carried out using a high frequency induction heat sintering where the
processed metal powders were sintered into a dense and strong bulk
material. The powders and the bulk samples were characterized using
XRD and FEGSEM techniques. The mechanical properties were
evaluated at various temperatures of 25°C, 100°C, 200°C, 300°C and
400°C to study the thermal stability of the processed alloys. The
processed bulk nanocrystalline alloys displayed extremely high
hardness values even at elevated temperatures. The Al-10%Cu-5%Ti
alloy displayed the highest hardness values at room and elevated
temperatures which are related to the presence of Ti-containing
phases such as Al3Ti and AlCu2Ti. These phases are thermally stable
and retain the high hardness values at elevated temperatures up to
400°C.
Abstract: Job Scheduling plays an important role for efficient
utilization of grid resources available across different domains and
geographical zones. Scheduling of jobs is challenging and NP-complete.
Evolutionary and swarm intelligence algorithms have been
extensively used to address this NP-complete problem in grid scheduling.
Artificial Bee Colony (ABC) has been proposed for optimization
problems based on foraging behaviour of bees. This work proposes a
modified ABC algorithm, Cluster Heterogeneous Earliest First Min-
Min Artificial Bee Colony (CHMM-ABC), to optimally schedule
jobs for the available resources. The proposed model utilizes a novel
Heterogeneous Earliest Finish Time (HEFT) Heuristic Algorithm
along with Min-Min algorithm to identify the initial food source.
Simulation results show the performance improvement of the
proposed algorithm over other swarm intelligence techniques.
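The Min-Min component used to seed the food sources can be sketched as follows; the expected-time-to-compute matrix is a toy example, and the HEFT ranking step of CHMM-ABC is omitted:

```python
def min_min(etc):
    """Min-Min heuristic: etc[j][r] is the expected time of job j on
    resource r.  Repeatedly schedule the job whose earliest possible
    completion time is smallest, on the resource achieving it."""
    n_res = len(etc[0])
    ready = [0.0] * n_res                  # when each resource frees up
    unscheduled = set(range(len(etc)))
    schedule = {}
    while unscheduled:
        best = None                        # (completion time, job, resource)
        for j in unscheduled:
            for r in range(n_res):
                ct = ready[r] + etc[j][r]
                if best is None or ct < best[0]:
                    best = (ct, j, r)
        ct, j, r = best
        ready[r] = ct
        schedule[j] = r
        unscheduled.remove(j)
    return schedule, max(ready)            # job->resource map and makespan

print(min_min([[3, 5], [4, 1], [2, 6]]))   # ({1: 1, 2: 0, 0: 0}, 5.0)
```

Short jobs are placed first, which keeps fast resources free; heuristics such as HEFT then refine the ordering before the ABC search takes over.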
Abstract: Text mining techniques are generally applied to classify
text and to find fuzzy relations and structures in data sets. Text
mining offers a broad range of capabilities; one common application
is text classification and event extraction, which involves deducing
specific knowledge about incidents referred to in texts. The main
contribution of this paper is the
clarification of a concept graph generation mechanism, which is based
on a text classification and optimal fuzzy relationship extraction.
Furthermore, the work presented in this paper explains the application
of fuzzy relationship extraction and branch and bound (BB) method
to simplify the texts.
Abstract: Motion Tracking and Stereo Vision are complicated,
albeit well-understood, problems in computer vision. Existing
software that combines the two approaches to perform stereo motion
tracking typically employs complicated and computationally expensive
procedures. The purpose of this study is to create a simple and
effective solution capable of combining the two approaches. The
study aims to explore a strategy to combine the two techniques
of two-dimensional motion tracking using a Kalman filter, and depth
detection of objects using stereo vision. In conventional approaches,
objects in the scene of interest are observed using a single camera.
For stereo motion tracking, however, the scene of interest is
observed using video feeds from two calibrated cameras. Using two
simultaneous measurements from the two cameras a calculation for
the depth of the object from the plane containing the cameras is made.
The approach attempts to capture the entire three-dimensional spatial
information of each object at the scene and represent it through a
software estimator object. At discrete intervals, the estimator tracks
object motion in the plane parallel to the plane containing the
cameras and updates the object's perpendicular distance from that
plane as its depth. The ability to efficiently track
the motion of objects in three-dimensional space using a simplified
approach could prove to be an indispensable tool in a variety of
surveillance scenarios. The approach may find applications ranging
from high-security surveillance scenes, such as the premises of bank
vaults, prisons or other detention facilities, to low-cost
applications in supermarkets and car parking lots.
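For rectified cameras, the depth computed from the two simultaneous measurements follows the standard triangulation relation Z = f·B/d, where f is the focal length in pixels, B is the baseline between the cameras, and d is the horizontal disparity. A minimal sketch; the numbers are illustrative, not the study's calibration:

```python
def depth_from_disparity(f_px, baseline_m, x_left_px, x_right_px):
    """Triangulated depth from a rectified stereo pair:
    Z = f * B / d, with focal length f in pixels, baseline B in metres,
    and disparity d the horizontal pixel offset between the two views."""
    d = x_left_px - x_right_px
    if d <= 0:
        raise ValueError("expected positive disparity")
    return f_px * baseline_m / d

# e.g. f = 800 px, B = 0.25 m, disparity = 400 - 360 = 40 px
print(depth_from_disparity(800, 0.25, 400, 360))  # 5.0 (metres)
```

The Kalman filter tracks (x, y) in the image plane; this depth value supplies the third coordinate the estimator updates at each interval.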
Abstract: Composite materials, due to their unique properties
such as high strength to weight ratio, corrosion resistance, and impact
resistance have huge potential as structural materials in automotive,
construction and transportation applications. However, these
properties often come at higher cost owing to complex design
methods, difficult manufacturing processes and raw material cost.
Traditionally, tapered laminated composite structures are
manufactured using autoclave manufacturing process by ply drop off
technique. Autoclave manufacturing, though very powerful, suffers
from high capital investment and high energy consumption. Following
current trends in composite manufacturing, Out of Autoclave
(OoA) processes are seen as emerging technologies for
manufacturing the structural composite components for aerospace
and defense applications. However, there is a need for improvement
among these processes to make them reliable and consistent. In this
paper, the feasibility of using an out-of-autoclave process to
manufacture a variable-thickness cantilever beam is discussed. The
minimum-weight design for the composite beam is obtained using the
constant-stress beam concept by tailoring the thickness of the beam.
The ply drop-off technique was used to fabricate the variable-thickness beam from
glass/epoxy prepregs. Experiments were conducted to measure
bending stresses along the span of the cantilever beam at different
intervals by applying the concentrated load at the free end.
Experimental results showed that the stresses in the beam at different
intervals were constant. This proves the ability of OoA process to
manufacture the constant stress beam. Finite element model for the
constant stress beam was developed using commercial finite element
simulation software. It was observed that the simulation results
agreed very well with the experimental results, thus validating the
design and manufacturing approach used.
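The constant-stress tailoring idea can be made concrete for a rectangular-section cantilever with an end load P: bending stress is σ = 6P(L−x)/(b·t(x)²), so holding σ constant along the span gives t(x) = √(6P(L−x)/(bσ)). A small sketch with illustrative numbers, not the paper's glass/epoxy beam dimensions:

```python
import math

def thickness_profile(P, L, b, sigma, x):
    """Thickness of a constant-stress rectangular cantilever at station x:
    sigma = 6*P*(L - x) / (b * t**2)  =>  t = sqrt(6*P*(L - x)/(b*sigma))."""
    return math.sqrt(6 * P * (L - x) / (b * sigma))

# Illustrative values: P = 100 N end load, L = 0.5 m span,
# b = 50 mm width, sigma = 50 MPa design stress.
P, L, b, sigma = 100.0, 0.5, 0.05, 50e6
for x in (0.0, 0.25, 0.45):
    t = thickness_profile(P, L, b, sigma, x)
    check = 6 * P * (L - x) / (b * t**2)      # recovers sigma at every x
    print(f"x = {x:.2f} m: t = {t*1000:.2f} mm, stress = {check/1e6:.1f} MPa")
```

The thickness tapers toward the free end while the bending stress stays at the design value everywhere, which is what the ply drop-offs approximate in discrete steps.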
Abstract: This paper focuses on the assessment of the air
pollution and morbidity relationship in Tunisia. Air pollution is
measured by ozone air concentration and the morbidity is measured
by the number of respiratory-related restricted activity days during
the 2-week period prior to the interview. Socioeconomic data are also
collected in order to adjust for any confounding covariates. Our
sample is composed of 407 Tunisian respondents: 44.7% are women,
the average age is 35.2, nearly 69% live in a house built after
1980, and 27.8% have reported at least one day of respiratory-related
restricted activity. The model consists of regressing the
number of respiratory-related restricted activity days on the air
quality measure and the socioeconomic covariates. In order to correct
for zero-inflation and heterogeneity, we estimate several models
(Poisson, negative binomial, zero inflated Poisson, Poisson hurdle,
negative binomial hurdle and finite mixture Poisson models).
Bootstrapping and post-stratification techniques are used in order to
correct for any sample bias. According to the Akaike information
criteria, the hurdle negative binomial model has the greatest goodness
of fit. The main result indicates that, after adjusting for
socioeconomic data, the ozone concentration increases the probability
of a positive number of restricted activity days.
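The model-selection step can be illustrated with a toy computation: fit an intercept-only Poisson and a zero-inflated Poisson to synthetic zero-inflated counts and compare AIC = 2k − 2·logL. The counts are hypothetical stand-ins for restricted-activity days, and the paper's hurdle negative binomial model is not reproduced here:

```python
import math

def poisson_ll(counts, lam):
    """Log-likelihood of an intercept-only Poisson model."""
    return sum(y * math.log(lam) - lam - math.lgamma(y + 1) for y in counts)

def zip_ll(counts, pi, lam):
    """Zero-inflated Poisson: zeros arise either from the inflation
    point (probability pi) or from the Poisson component."""
    ll = 0.0
    for y in counts:
        if y == 0:
            ll += math.log(pi + (1 - pi) * math.exp(-lam))
        else:
            ll += math.log(1 - pi) + y * math.log(lam) - lam - math.lgamma(y + 1)
    return ll

# Hypothetical zero-inflated counts (many zero days, a few positive)
counts = [0] * 30 + [1, 2, 3, 2, 1, 4, 2, 3, 1, 2]

aic_pois = 2 * 1 - 2 * poisson_ll(counts, sum(counts) / len(counts))

# Crude grid search over the two ZIP parameters
ll_zip = max(zip_ll(counts, p / 50, l / 20)
             for p in range(1, 48) for l in range(2, 80))
aic_zip = 2 * 2 - 2 * ll_zip

print(aic_zip < aic_pois)  # True: the zero-inflated model fits better
```

The same comparison, applied across Poisson, negative binomial, zero-inflated, and hurdle variants, is how the hurdle negative binomial model was selected in the study.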
Abstract: Improving the quality of experience (QoE) has recently
become an important issue. Since performance degradation at the cell
edge severely reduces QoE, several techniques are defined in the
LTE/LTE-A standards to remove inter-cell interference (ICI). However,
the conventional techniques have a disadvantage: there is a
trade-off between resource allocation and reliable communication.
The proposed scheme reduces the ICI more efficiently by using
channel state information (CSI) smartly. It is shown that the proposed
scheme can reduce the ICI with fewer resources.
Abstract: In this study, we propose a novel technique for acoustic
echo suppression (AES) during speech recognition under barge-in
conditions. Conventional AES methods based on spectral subtraction
apply fixed weights to the estimated echo path transfer function
(EPTF) at the current signal segment and to the EPTF estimated until
the previous time interval. However, the effects of echo path changes
should be considered for eliminating the undesired echoes. We
describe a new approach that adaptively updates weight parameters in
response to abrupt changes in the acoustic environment due to
background noises or double-talk. Furthermore, we devised a voice
activity detector and an initial time-delay estimator for barge-in speech
recognition in communication networks. The initial time delay is
estimated using a log-spectral distance measure as well as
cross-correlation coefficients. The experimental results show that the
developed techniques can be successfully applied in barge-in speech
recognition systems.
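A heavily simplified sketch of the two ingredients described above, spectral-subtraction echo suppression and an adaptively weighted EPTF update; the adaptation rule shown (weight proportional to the log-spectral distance) is a hypothetical stand-in for the paper's method:

```python
import numpy as np

def suppress_echo(frame_spec, echo_spec, floor=0.05):
    """One-frame spectral subtraction: remove the estimated echo
    magnitude, keep the frame's phase, and floor the residual."""
    mag = np.abs(frame_spec)
    clean = np.maximum(mag - np.abs(echo_spec), floor * mag)
    return clean * np.exp(1j * np.angle(frame_spec))

def update_eptf(h_prev, h_cur, dist, d0=1.0):
    """Weight the current EPTF estimate more heavily when the
    log-spectral distance `dist` signals an echo-path change."""
    alpha = min(1.0, dist / d0)   # hypothetical adaptation rule
    return alpha * h_cur + (1 - alpha) * h_prev
```

Fixed-weight schemes use a constant `alpha`; letting it grow with the measured distance is what allows fast recovery after double-talk or a burst of background noise.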
Abstract: Operations research (OR) has a strong record of success in
developing and applying scientific methods for problem solving and
decision-making. By using OR techniques, we can enhance the use of
computer decision support systems to achieve optimal management for
institutions. OR applies comprehensive analysis, including all
relevant factors, and builds mathematical models to solve business
or organizational problems. In addition, it improves decision-making
and uses available resources efficiently. The adoption of OR by
universities would contribute to the development and enhancement of
OR techniques. This paper provides an understanding of the
structures, approaches and models of OR in problem solving and
decision-making.
Abstract: Ad hoc networks are the future of wireless technology, as
users demand fast, accurate, error-free information. With this in
mind, this research paper optimizes Bit Error Rate (BER) and power
using the Genetic Algorithm (GA). The digital modulation techniques
considered are Binary Phase Shift Keying (BPSK), M-ary Phase Shift
Keying (M-ary PSK), and Quadrature Amplitude Modulation (QAM). This
work is implemented on wireless ad hoc networks (WLAN). We then
analyze which modulation technique performs best in optimizing the
BER and power of the WLAN.
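The GA in such a study evaluates a BER objective for each candidate configuration; for BPSK over an AWGN channel the closed-form objective is Pb = Q(√(2·Eb/N0)) = ½·erfc(√(Eb/N0)). A minimal sketch of that objective only; the GA itself is not reproduced:

```python
import math

def ber_bpsk(ebn0_db):
    """Theoretical BER of coherent BPSK over AWGN:
    Pb = Q(sqrt(2*Eb/N0)) = 0.5 * erfc(sqrt(Eb/N0))."""
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebn0))

for snr_db in (0, 5, 10):
    print(f"Eb/N0 = {snr_db:2d} dB -> BER = {ber_bpsk(snr_db):.2e}")
```

Analogous closed forms exist for M-ary PSK and QAM, so a GA can compare modulation schemes without running a full Monte-Carlo simulation at each generation.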
Abstract: The use of eXtensible Markup Language (XML) in
web, business, and scientific databases has led to the development of
methods, techniques, and systems to manage and analyze XML data.
Semi-structured documents suffer from heterogeneity and high
dimensionality. XML structure and content mining represent
convergence for research in semi-structured data and text mining. As
the information available on the internet grows drastically, extracting
knowledge from XML documents becomes a harder task. Indeed,
documents are often so large that the data set returned in answer to
a query may be too large to convey the required information. To
improve the query answering, a Semantic Tree Based Association
Rule (STAR) mining method is proposed. This method provides
intentional information by considering the structure, content and the
semantics of the content. The method is applied to the Reuters dataset
and the results show that the proposed method performs well.
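The abstract does not detail STAR itself, but any association-rule miner rests on the support and confidence measures, sketched here on toy "concept" transactions that are hypothetical stand-ins for tagged XML content:

```python
def support(transactions, itemset):
    """Fraction of transactions containing every item of the itemset."""
    s = set(itemset)
    return sum(s <= t for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """conf(A -> B) = supp(A union B) / supp(A)."""
    return (support(transactions, set(antecedent) | set(consequent))
            / support(transactions, antecedent))

tx = [{"oil", "price"}, {"oil", "opec", "price"}, {"grain"}, {"oil", "opec"}]
print(confidence(tx, {"oil"}, {"opec"}))  # two of three 'oil' transactions contain 'opec'
```

A semantic variant replaces the raw items with annotated concepts so that rules reflect meaning rather than surface tokens.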
Abstract: We present a trigonometric scheme to approximate a
circular arc with its two end points and two end tangents/unit
tangents. A rational cubic trigonometric Bézier curve is constructed
whose end control points are defined by the end points of the circular
arc. Weight functions and the remaining control points of the cubic
trigonometric Bézier curve are estimated by variational approach to
reproduce a circular arc. The radius error is calculated and found to
be smaller than that of existing techniques.
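The radius-error criterion can be illustrated with the classical polynomial cubic Bézier approximation of a unit quarter circle, the kind of baseline such trigonometric schemes are compared against; this is not the paper's rational trigonometric construction:

```python
import math

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t."""
    u = 1 - t
    return tuple(u**3 * a + 3*u**2*t * b + 3*u*t**2 * c + t**3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

# Standard polynomial cubic for a unit quarter circle
k = 4 / 3 * math.tan(math.pi / 8)            # ~0.5523
p0, p1, p2, p3 = (1, 0), (1, k), (k, 1), (0, 1)

radius_err = max(abs(math.hypot(*cubic_bezier(p0, p1, p2, p3, i / 1000)) - 1)
                 for i in range(1001))
print(radius_err)   # about 2.7e-4 for the polynomial cubic
```

The radius error is the maximum deviation of the sampled curve from the unit circle; the trigonometric scheme in the paper is designed to push this figure lower.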
Abstract: Red blood cells (RBC) are the most common types of
blood cells and are the most intensively studied in cell biology. The
lack of RBCs is a condition in which the amount of hemoglobin level
is lower than normal and is referred to as “anemia”. Abnormalities in
RBCs will affect the exchange of oxygen. This paper presents a
comparative study for various techniques for classifying the RBCs as
normal or abnormal (anemic) using WEKA. WEKA is an open-source
suite of machine learning algorithms for data
mining applications. The algorithms tested are Radial Basis Function
neural network, Support vector machine, and K-Nearest Neighbors
algorithm. Two sets of combined features were utilized for
classification of blood cell images. The first set, consisting
exclusively of geometrical features, was used to identify whether the
tested blood cell is spherical or non-spherical. The second set,
consisting mainly of textural features, was used to recognize the
types of the spherical cells. We have provided an evaluation based on
applying these classification methods to our RBC image dataset,
which was obtained from Serdang Hospital, Malaysia, and
measuring the accuracy of test results. The best achieved
classification rates are 97%, 98%, and 79% for Support vector
machines, Radial Basis Function neural network, and K-Nearest
Neighbors algorithm respectively.
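A typical geometrical feature of the kind used in the first set is circularity, 4πA/P², which equals 1 for a perfect circle and falls below 1 for irregular shapes; the feature is illustrative, since the abstract does not list the exact features used:

```python
import math

def circularity(area, perimeter):
    """4*pi*A / P**2: 1.0 for a perfect circle, < 1 for irregular shapes."""
    return 4 * math.pi * area / perimeter ** 2

r = 10.0
print(round(circularity(math.pi * r**2, 2 * math.pi * r), 6))  # 1.0 for a circle
```

Thresholding such a shape score is one simple way to separate spherical cells from non-spherical ones before the textural classifier takes over.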
Abstract: The classroom of the 21st century is an ever-changing
forum for new and innovative thoughts and ideas. With increasing
technology and opportunity, students have rapid access to
information that only decades ago would have taken weeks to obtain.
Unfortunately, new techniques and technology are not the cure for
the fundamental problems that have plagued the classroom ever since
education was established. Class size has been an issue long debated
in academia. While it is difficult to pinpoint an exact number, it is
clear that in this case more does not mean better. By looking into
the successes and pitfalls of class size, the true advantages of
smaller classes will become clear. Previously, one class comprised 50
students. Being seventeen- and eighteen-year-old students, they
sometimes found it quite difficult to stay focused. To help
them understand and gain much knowledge, a researcher introduced
“The Theory of Multiple Intelligence” and this, in fact, enabled
students to learn according to their own learning preferences no
matter how they were being taught. In this lesson, the researcher
designed a cycle of learning activities involving all intelligences so
that everyone had equal opportunities to learn.
Abstract: Image spam is a kind of email spam where the spam
text is embedded in an image. It is a new spamming technique
used by spammers to send their messages to large numbers of internet
users. Spam email has become a big problem in the lives of internet
users, causing time consumption and economic losses. The main
objective of this paper is to detect the image spam by using histogram
properties of an image. Though there are many techniques to
automatically detect and avoid this problem, spammers employ
new tricks to bypass them; as a result, those techniques are
inefficient at detecting spam mails. In this paper we have proposed a
new method to detect the image spam. Here the image features are
extracted by using RGB histogram, HSV histogram and combination
of both the RGB and HSV histograms. Based on the optimized image
feature set, classification is done using the k-Nearest Neighbor
(k-NN) algorithm. Experimental results show that our method has
achieved better accuracy, and that the combination of RGB and HSV
histograms with the k-NN algorithm gives the best accuracy in
spam detection.
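The pipeline described, histogram features followed by k-NN classification, can be sketched in a few lines of numpy; the bin count, distance metric, and k are illustrative assumptions rather than the paper's tuned settings:

```python
import numpy as np

def rgb_histogram(img, bins=8):
    """Concatenated per-channel histogram, normalised so images of
    different sizes yield comparable feature vectors."""
    feats = [np.histogram(img[..., c], bins=bins, range=(0, 256))[0]
             for c in range(3)]
    v = np.concatenate(feats).astype(float)
    return v / v.sum()

def knn_predict(train_X, train_y, x, k=3):
    """Plain k-NN: majority label among the k nearest feature vectors."""
    dist = np.linalg.norm(train_X - x, axis=1)
    votes = [train_y[i] for i in np.argsort(dist)[:k]]
    return max(set(votes), key=votes.count)
```

Training on labelled spam/ham histograms and classifying a probe image then reduces to `knn_predict(train_X, train_y, rgb_histogram(img))`; an HSV histogram computed the same way is simply concatenated onto the feature vector.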