Abstract: Complex engineering design problems consist of
numerous factors of varying criticality. Treating fundamental design features and minor details alike results in an extensive waste of time and effort. Design parameters should be introduced gradually, as appropriate, based on their significance within the
problem context. This motivates the representation of design parameters at multiple levels of an abstraction hierarchy. However, developing abstraction hierarchies is an area that is not well
understood. Our research proposes a novel hierarchical abstraction methodology to plan effective engineering designs and processes. It
provides a theoretically sound foundation to represent, abstract and stratify engineering design parameters and tasks according to causality and criticality. The methodology creates abstraction
hierarchies in a recursive and bottom-up approach that guarantees no
backtracking across any of the abstraction levels. The methodology consists of three main phases: representation, abstraction, and layering into multiple hierarchical levels. The effectiveness of the
developed methodology is demonstrated by a design problem.
Abstract: The study of human hand morphology reveals that developing an artificial hand with the capabilities of the human hand is an extremely challenging task. This paper presents the development of a robotic prosthetic hand focusing on the improvement of a tendon-driven mechanism towards a biomimetic prosthetic hand. The design of this prosthetic hand is geared towards achieving a high level of dexterity and anthropomorphism by means of a new hybrid mechanism that integrates a miniature motor-driven actuation mechanism, a Shape Memory Alloy actuated mechanism, and a passive mechanical linkage. The synergy of these actuators enables flexion-extension movement at each of the finger joints within limited size, shape, and weight constraints. Tactile sensors are integrated on the fingertips and the finger phalanges. This prosthetic hand is built to an exact size ratio that mimics a biological hand. Its behavior resembles the human counterpart in terms of working envelope, speed, and torque, and thus resembles both the key physical features and the grasping functionality of an adult hand.
Abstract: The aim of this paper is to characterize a larger set of
wavelet functions for implementation in a still image compression
system using SPIHT algorithm. This paper discusses important
features of wavelet functions and filters used in subband coding to
convert image into wavelet coefficients in MATLAB. Image quality
is measured objectively using peak signal to noise ratio (PSNR) and
its variation with bit rate (bpp). The effect of different parameters on different wavelet functions is studied. Our results provide a useful reference for application designers of wavelet-based coders.
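The objective quality metric used above can be made concrete. The following is a minimal sketch of PSNR for 8-bit images (the function name and flat pixel-list representation are assumptions for illustration, not from the paper):

```python
import math

def psnr(original, reconstructed, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between two equal-length pixel sequences."""
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")  # identical images: no distortion
    return 10.0 * math.log10(max_val ** 2 / mse)
```

In a study like this one, PSNR would be computed for each wavelet filter at several bit rates (bpp) and plotted to compare coders.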
Abstract: Within the domain of Systems Engineering the need
to perform property aggregation to understand, analyze and manage
complex systems is unequivocal. This can be seen in numerous
domains such as capability analysis, Mission Essential Competencies
(MEC) and Critical Design Features (CDF). Furthermore, the need
to consider uncertainty propagation as well as the sensitivity of
related properties within such analysis is equally important when
determining a set of critical properties within such a system.
This paper describes this property breakdown in a number of
domains within Systems Engineering and, within the area of CDFs,
emphasizes the importance of uncertainty analysis. As part of this, a
section of the paper describes possible techniques which may be used
within uncertainty propagation and, in conclusion, an example is described that utilizes one of the techniques for property and uncertainty
aggregation within an aircraft system to aid the determination of
Critical Design Features.
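One of the simplest uncertainty-propagation techniques in this family is analytic propagation for an additive aggregation of independent properties: the mean of the sum is the sum of the means, and the variance of the sum is the sum of the variances. A minimal sketch (the function name and the aircraft-mass example are illustrative assumptions, not the paper's specific technique):

```python
import math

def aggregate_sum(means, stds):
    """Aggregate independent properties additively.

    Mean of the sum = sum of the means; variance of the sum = sum of the
    variances (so std devs combine in quadrature).
    """
    total_mean = sum(means)
    total_std = math.sqrt(sum(s * s for s in stds))
    return total_mean, total_std

# e.g. aggregating the masses of independent aircraft subsystems
total = aggregate_sum([10.0, 20.0], [3.0, 4.0])  # (30.0, 5.0)
```

This kind of aggregation makes it easy to see which property's uncertainty dominates the aggregate, which is exactly the sensitivity question raised above.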
Abstract: This study investigates the possibility of mapping gully erosion through supervised classification of satellite images (ETM+) in two land types, mountainous and plain. These land types were parts of the Varamin plain, Tehran province, and the Roodbar sub-basin, Guilan province, as the plain and mountain land types, respectively. The positions of 652 and 124 ground control points were
recorded by GPS in the mountain and plain land types, respectively. Gully erosion, land use, and plant cover were investigated at these points. Based on the ground control points and auxiliary points, training points for gully erosion and other surface features were introduced into the software (Ilwis 3.3 Academic). The supervised classification map of gully erosion was prepared by the maximum likelihood method, and then
the overall accuracy of this map was computed. Results showed that supervised classification of gully erosion is not feasible, although more studies are needed to generalize the results to other mountainous regions. Also, as the number of land uses and other surface features increases in the plain physiography, classification accuracy decreases.
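The maximum likelihood classifier used here fits a Gaussian model per class from the training points and assigns each pixel to the most likely class. A minimal single-band sketch (function names, the use of one spectral band, and the class labels are illustrative assumptions):

```python
import math

def train(samples):
    """samples: {class_name: [pixel values]} -> per-class (mean, variance)."""
    params = {}
    for cls, vals in samples.items():
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / len(vals)
        params[cls] = (mean, max(var, 1e-9))  # guard against zero variance
    return params

def classify(pixel, params):
    """Assign the class whose Gaussian gives the highest log-likelihood."""
    def log_lik(mean, var):
        return -0.5 * math.log(2 * math.pi * var) - (pixel - mean) ** 2 / (2 * var)
    return max(params, key=lambda c: log_lik(*params[c]))
```

Real ETM+ classification would use all bands with full covariance matrices, but the decision rule is the same.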
Abstract: The heuristic decision rules used for project
scheduling will vary depending upon the project's size, complexity,
duration, personnel, and owner requirements. The concept of project
complexity has received little detailed attention. The need to
differentiate between easy and hard problem instances and the
interest in isolating the fundamental factors that determine the
computing effort required by these procedures inspired a number of
researchers to develop various complexity measures.
In this study, the most common measures of project complexity are
presented. A new measure of project complexity is developed. The
main advantage of the proposed measure is that it considers size,
shape and logic characteristics, time characteristics, resource
demands and availability characteristics as well as number of critical
activities and critical paths. The sensitivity of the proposed measure has been tested and evaluated against the other complexity measures on the fifty project networks considered in this study. The
developed measure showed more sensitivity to the changes in the
network data and gives accurate quantified results when comparing
the complexities of networks.
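Among the common measures surveyed above, one of the simplest is the classic coefficient of network complexity (CNC), the ratio of precedence arcs to activities. This sketch computes CNC only; it is not the paper's proposed measure, and the dictionary representation is an assumption:

```python
def coefficient_of_network_complexity(network):
    """network: {activity: [successor activities]} (activity-on-node form).

    CNC = number of precedence arcs / number of activities; higher values
    indicate a more densely connected (and usually harder) network.
    """
    num_activities = len(network)
    num_arcs = sum(len(succ) for succ in network.values())
    return num_arcs / num_activities
```

Measures like the one proposed in the study extend this idea with time, resource, and critical-path characteristics.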
Abstract: The advances in wireless communication have opened unlimited horizons, but there are some challenges as well. The air medium between the MS (Mobile Station) and BS (Base Station) is beyond human control and produces channel impairments. The impact of natural conditions on the air medium is the biggest issue in wireless communication. Natural conditions make reliability more cumbersome; here, reliability refers to the efficient recovery of lost or erroneous data. The SR-ARQ (Selective Repeat-Automatic Repeat Request) protocol is a de facto standard for any wireless technology at the air interface, with its standard reliability features. Our focus in this research is on the reliability of the control, or feedback, signal of the SR-ARQ protocol. The proposed mechanism, RSR-ARQ (Reliable SR-ARQ), is an enhancement of the SR-ARQ protocol that ensures the reliability of the control signals through a channel-impairment-sensitive mechanism. We have modeled the system under a two-state discrete-time Markov channel. The simulation results demonstrate improved recovery of lost or erroneous data, which increases the overall system performance.
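The two-state discrete-time Markov channel used for the modeling is the familiar Gilbert-Elliott construction: a "good" and a "bad" state with fixed transition probabilities per slot. A minimal simulation sketch (transition probabilities and the loss assumption are illustrative, not the paper's parameters):

```python
import random

def simulate_channel(steps, p_good_to_bad, p_bad_to_good, seed=1):
    """Per-slot states ('G' or 'B') of a two-state discrete-time Markov
    channel; a packet or feedback signal sent in a 'B' slot is assumed lost."""
    rng = random.Random(seed)  # seeded for reproducible simulation runs
    state, states = "G", []
    for _ in range(steps):
        states.append(state)
        if state == "G":
            state = "B" if rng.random() < p_good_to_bad else "G"
        else:
            state = "G" if rng.random() < p_bad_to_good else "B"
    return states
```

Over a long run, the fraction of bad slots approaches the stationary probability p/(p+q), which is what makes such a channel convenient for comparing SR-ARQ variants.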
Abstract: Social media has led to paradigm shifts in ways
people work and do business, interact and socialize, learn and obtain
knowledge. So much so that social media has established itself as an
important spatial extension of this nation's historicity and challenges.
Despite the enabling reputation and recommendation features provided through social networks embedded in the social media system, the overflow of broadcast and publicized media content turns the table from engendering trust to doubting the trust system.
When trust is in doubt, the effects include deactivation of
accounts and creation of multiple profiles, which lead to the overflow
of 'ghost' contents (i.e., 'the abundance of abandoned ships'). In
most literature, the study of trust can be related to culture; hence the
difference between the Western 'openness' and the Eastern 'blue-chip'
concepts in networking and relationships. From a survey on issues
and challenges among Malaysian social media users, 'authenticity'
emerges as one of the main factors that causes and is caused by other
factors. The other issue that has surfaced is credibility either in terms
of message/content and source. Another is the quality of the
knowledge that is shared. This paper explores the terrain of this critical space, which in recent years has been dominated increasingly by social networks embedded in the social media system and by the overflow of broadcast and publicized media content.
Abstract: Text categorization is the problem of classifying text documents into a set of predefined classes. After a preprocessing step, the documents are typically represented as large sparse vectors. When training classifiers on large collections of documents, both the time and memory restrictions can be quite prohibitive. This justifies the application of feature selection methods to reduce the dimensionality of the document-representation vector. Four feature selection methods are evaluated: Random Selection, Information Gain (IG), Support Vector Machine (called SVM_FS), and Genetic Algorithm with SVM (GA_FS). We show that the best results were obtained with the SVM_FS and GA_FS methods for a relatively small feature-vector dimension, compared with the IG method, which requires longer vectors for quite similar classification accuracies. We also present a novel method to better correlate the SVM kernel parameters (polynomial or Gaussian kernel).
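Of the four methods, Information Gain is the easiest to state: the IG of a term is the reduction in class entropy obtained by knowing whether the term occurs in a document. A minimal sketch for a binary term feature (function names and the tiny example are assumptions for illustration):

```python
import math

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n)
                for c in (labels.count(l) for l in set(labels)))

def information_gain(term_present, labels):
    """IG of a binary term feature: H(Y) - H(Y | term present/absent)."""
    n = len(labels)
    ig = entropy(labels)
    for flag in (True, False):
        subset = [y for x, y in zip(term_present, labels) if bool(x) == flag]
        if subset:
            ig -= (len(subset) / n) * entropy(subset)
    return ig
```

Ranking all vocabulary terms by this score and keeping the top k is the IG feature selection the abstract compares against SVM_FS and GA_FS.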
Abstract: A Web-based learning tool, the Learn IN Context
(LINC) system, designed and used in some of the institution's
courses in mixed-mode learning, is presented in this paper. This
mode combines face-to-face and distance approaches to education.
LINC can achieve both collaborative and competitive learning. In
order to provide both learners and tutors with a more natural way to
interact with e-learning applications, a conversational interface has
been included in LINC. Hence, the components and essential features
of LINC+, the voice enhanced version of LINC, are described. We
report evaluation experiments of LINC/LINC+ in a real use context
of a computer programming course taught at the Université de
Moncton (Canada). The findings show that when the learning
material is delivered in the form of a collaborative and voice-enabled
presentation, the majority of learners seem to be satisfied with this
new medium, and confirm that it does not negatively affect their
cognitive load.
Abstract: The paper proposes a novel technique for iris
recognition using texture and phase features. Texture features are
extracted on the normalized iris strip using Haar Wavelet while phase
features are obtained using LOG Gabor Wavelet. The matching
scores generated from individual modules are combined using sum of
score technique. The system is tested on databases obtained from Bath University and the Indian Institute of Technology Kanpur, achieving accuracies of 95.62% and 97.66%, respectively. The FAR and FRR of the combined system are also comparatively reduced.
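Sum-of-scores fusion typically normalizes each matcher's score to a common range before adding. A minimal min-max sketch (function names and score ranges are illustrative assumptions; the paper does not specify its normalization):

```python
def min_max_normalize(score, lo, hi):
    """Map a matcher score into [0, 1] using that matcher's score range."""
    return (score - lo) / (hi - lo)

def fused_score(texture_score, phase_score, texture_range, phase_range):
    """Sum-of-scores fusion of the Haar-texture and LOG-Gabor-phase matchers."""
    return (min_max_normalize(texture_score, *texture_range)
            + min_max_normalize(phase_score, *phase_range))
```

The fused score is then thresholded as usual to trade off FAR against FRR.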
Abstract: As emails communications have no consistent
authentication procedure to ensure the authenticity, we present an
investigation analysis approach for detecting forged emails based on
Random Forests and Naïve Bays classifiers. Instead of investigating
the email headers, we use the body content to extract a unique writing
style for all the possible suspects. Our approach consists of four main
steps: (1) The cybercrime investigator extract different effective
features including structural, lexical, linguistic, and syntactic
evidence from previous emails for all the possible suspects, (2) The
extracted features vectors are normalized to increase the accuracy
rate. (3) The normalized features are then used to train the learning
engine, (4) upon receiving the anonymous email (M); we apply the
feature extraction process to produce a feature vector. Finally, using
the machine learning classifiers the email is assigned to one of the
suspects- whose writing style closely matches M. Experimental
results on real data sets show the improved performance of the
proposed method and the ability of identifying the authors with a
very limited number of features.
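As a rough sketch of the Naïve Bayes half of the pipeline, the following trains a multinomial model on raw word counts (a stand-in for the lexical features only; the author names, emails, and uniform prior are invented for illustration):

```python
import math
from collections import Counter

def train_nb(emails_by_author):
    """emails_by_author: {author: [email text, ...]} -> per-author word counts
    plus the shared vocabulary (Laplace smoothing applied at prediction)."""
    vocab = set()
    counts = {}
    for author, emails in emails_by_author.items():
        words = [w for e in emails for w in e.lower().split()]
        counts[author] = Counter(words)
        vocab.update(words)
    return counts, vocab

def predict_author(message, model):
    """Assign the anonymous message to the author with the highest
    smoothed log-likelihood (uniform prior over suspects assumed)."""
    counts, vocab = model
    best, best_lp = None, -math.inf
    for author, c in counts.items():
        total = sum(c.values())
        lp = sum(math.log((c[w] + 1) / (total + len(vocab)))
                 for w in message.lower().split())
        if lp > best_lp:
            best, best_lp = author, lp
    return best
```

The real system would replace raw words with the normalized structural, lexical, linguistic, and syntactic feature vectors described above, and would compare this classifier against the Random Forest.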
Abstract: The ringing effect is one of the most annoying visual artifacts in digital video. It is a significant factor in subjective quality
deterioration. However, there is a widely-accepted misunderstanding
of its cause. In this paper, we propose a reasonable interpretation of the
cause of the ringing effect. Based on this interpretation, we further suggest two methods to reduce the ringing effect in DCT-based video coding. The
methods adaptively adjust quantizers according to video features. Our experiments show that the methods can efficiently improve subjective quality at an acceptable additional computational cost.
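The abstract does not describe the two methods in detail; the following is only a hypothetical illustration of feature-driven quantizer adjustment, using a coarse edge-activity feature (the threshold, scaling, and pixel-range heuristic are all assumptions):

```python
def adaptive_qstep(base_qstep, block, edge_threshold=40):
    """Use a finer quantization step for blocks whose pixel range suggests a
    sharp edge, where ringing around the edge would be most visible.

    Illustrative heuristic only: real feature extraction and the paper's
    actual adjustment rules are not specified in the abstract.
    """
    activity = max(block) - min(block)
    if activity > edge_threshold:
        return max(1, base_qstep // 2)  # finer quantization near edges
    return base_qstep
```

The common thread with the paper's methods is that the quantizer is tuned per block from local video features rather than being held constant.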
Abstract: Face recognition is a field with multidimensional applications, and extensive work has been done on most of its details. Face recognition using PCA is one such approach. In this paper, PCA is used for feature extraction, and the face under consideration is matched against the test image using eigenface coefficients. The crux of the work lies in optimizing the Euclidean distance and in testing the algorithm using MATLAB, an efficient tool with a powerful user interface and a simple way of representing complex images.
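The matching step amounts to nearest-neighbor search over eigenface coefficient vectors under Euclidean distance. A minimal sketch (the PCA projection itself is omitted; the gallery structure and names are assumptions):

```python
import math

def euclidean(u, v):
    """Euclidean distance between two coefficient vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def match_face(probe_coeffs, gallery):
    """gallery: {person: eigenface coefficient vector}.

    Return the person whose stored coefficients are closest to the probe."""
    return min(gallery, key=lambda p: euclidean(probe_coeffs, gallery[p]))
```

Optimizing this distance computation (e.g., early termination, fewer coefficients) is the kind of tuning the abstract refers to.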
Abstract: In this paper, the processing of sonar signals has been
carried out using Minimal Resource Allocation Network (MRAN)
and a Probabilistic Neural Network (PNN) to differentiate commonly encountered features in indoor environments. The
stability-plasticity behaviors of both networks have been
investigated. Experimental results show that MRAN possesses lower network complexity but experiences higher plasticity than PNN. An enhanced version called parallel MRAN (pMRAN) is proposed to solve this problem; it is stable in prediction and also outperforms the original MRAN.
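A PNN is essentially a Parzen-window classifier: one Gaussian kernel per stored training sample, summed per class. A minimal one-dimensional sketch (feature values, class names, and the smoothing parameter are illustrative assumptions, not the sonar data):

```python
import math

def pnn_classify(x, train_by_class, sigma=1.0):
    """Probabilistic Neural Network decision rule: sum a Gaussian kernel
    activation over every stored sample of each class, and pick the class
    with the largest total activation."""
    def activation(samples):
        return sum(math.exp(-((x - s) ** 2) / (2 * sigma ** 2)) for s in samples)
    return max(train_by_class, key=lambda c: activation(train_by_class[c]))
```

Because every training sample is stored, PNN is very stable but grows with the data, which matches the stability-plasticity contrast with MRAN drawn above.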
Abstract: Unlike general-purpose processors, digital signal
processors (DSP processors) are strongly application-dependent. To
meet the needs for diverse applications, a wide variety of DSP
processors based on different architectures ranging from the
traditional to VLIW have been introduced to the market over the
years. The functionality, performance, and cost of these processors
vary over a wide range. In order to select a processor that meets the
design criteria for an application, processor performance is usually
the major concern for digital signal processing (DSP) application
developers. Performance data are also essential for the designers of
DSP processors to improve their design. Consequently, several DSP
performance benchmarks have been proposed over the past decade or
so. However, none of these benchmarks seems to have included recent DSP applications.
In this paper, we use a new benchmark that we recently developed
to compare the performance of popular DSP processors from Texas
Instruments and StarCore. The new benchmark is based on the
Selectable Mode Vocoder (SMV), a speech-coding program from the
recent third generation (3G) wireless voice applications. All
benchmark kernels are compiled by the compilers of the respective
DSP processors and run on their simulators. Weighted arithmetic
mean of clock cycles and arithmetic mean of code size are used to
compare the performance of five DSP processors.
In addition, we studied how the performance of a processor is affected by code structure, processor architecture features, and compiler optimization. The extensive experimental data gathered,
analyzed, and presented in this paper should be helpful for DSP
processor and compiler designers to meet their specific design goals.
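The two aggregate metrics named above are straightforward to compute once per-kernel results are collected. A minimal sketch (kernel names and weights are invented placeholders; the paper does not publish its weights here):

```python
def weighted_mean_cycles(cycles, weights):
    """cycles, weights: {kernel: value}.

    Weighted arithmetic mean of clock cycles, weighting each SMV kernel by,
    e.g., its share of total execution time."""
    total_w = sum(weights.values())
    return sum(cycles[k] * weights[k] for k in cycles) / total_w

def mean_code_size(sizes):
    """Plain arithmetic mean of per-kernel code sizes."""
    return sum(sizes.values()) / len(sizes)
```

Ranking the five processors then reduces to comparing these two numbers per processor.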
Abstract: Eye localization is necessary for face recognition and related application areas. Most eye localization algorithms reported so far still need improvement in precision and computational time for successful applications. In this paper, we propose an eye localization method based on multi-scale Gabor feature vectors that is more robust with respect to initial points. Eye localization based on Gabor feature vectors first constructs an Eye Model Bunch for each eye (left or right), consisting of n Gabor jets and the average eye coordinates obtained from n model face images, and then localizes eyes in an incoming face image by exploiting the fact that the true eye coordinates are most likely to be very close to the position whose Gabor jet has the best similarity with a Gabor jet in the Eye Model Bunch. Similar ideas have already been proposed, for example in EBGM (Elastic Bunch Graph Matching). However, the method used in EBGM is known not to be robust with respect to initial values and may need an extensive search range to achieve the required performance, which causes a much greater computational burden. In this paper, we propose a multi-scale approach with little added computational burden: eyes are first localized from Gabor feature vectors in a coarse face image obtained by downsampling the original face image, and then localized in the original-resolution face image using the eye coordinates found in the coarse-scale image as initial points. Several experiments and comparisons with eye localization methods reported in other papers show the efficiency of our proposed method.
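The coarse-to-fine structure can be illustrated on a one-dimensional similarity profile (a stand-in for Gabor-jet similarity over image positions; the downsampling factor and refinement window are illustrative assumptions):

```python
def coarse_to_fine_peak(signal, factor=4, window=4):
    """Coarse-to-fine maximum search: find the best index in a downsampled
    signal, then refine only in a small window around the scaled-up
    coordinate instead of searching the full-resolution signal."""
    coarse = signal[::factor]
    ci = max(range(len(coarse)), key=lambda i: coarse[i])
    center = ci * factor  # map coarse index back to full resolution
    lo = max(0, center - window)
    hi = min(len(signal), center + window + 1)
    return max(range(lo, hi), key=lambda i: signal[i])
```

The saving is the same as in the paper's 2-D case: the expensive full-resolution comparison is confined to a small neighborhood of the coarse estimate.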
Abstract: Identifying and classifying intersections according to severity is very important for the implementation of safety-related countermeasures, and effective models are needed to compare and assess severity. Highway safety organizations have considered intersection safety among their priorities. In spite of significant advances in highway safety, large numbers of severe crashes still occur on highways. Investigation of influential
factors on crashes enables engineers to carry out calculations in order
to reduce crash severity. Previous studies lacked a model capable of
simultaneous illustration of the influence of human factors, road,
vehicle, weather conditions and traffic features including traffic
volume and flow speed on the crash severity. Thus, this paper is
aimed at developing the models to illustrate the simultaneous
influence of these variables on the crash severity in urban highways.
The models represented in this study have been developed using
binary Logit Models. SPSS software has been used to calibrate the
models. It must be mentioned that backward regression method in
SPSS was used to identify the significant variables in the model.
From the obtained results, it can be concluded that the main factors increasing crash severity in urban highways are driver age, movement in reverse gear, technical defects of the vehicle, collisions with motorcycles and bicycles, collisions with bridges, frontal impacts, frontal-lateral collisions, and multi-vehicle crashes.
Abstract: Voice over Internet Protocol (VoIP) applications, commonly known as softphones, have been developing an increasingly large market in today's telecommunication world, and the trend is expected to continue with the addition of further features. This includes leveraging existing presence services and location and contextual information to enable more ubiquitous and seamless communications. In this paper, we discuss the concept of seamless session transfer for real-time applications such as VoIP and IPTV, and our prototype implementation of this concept on a selected open-source VoIP application. The first part of this paper covers performance evaluations and assessments of some common open-source VoIP applications, namely Ekiga, Kphone, Linphone, and Twinkle, to identify one of them for implementing our design of seamless session transfer. Subjective testing was carried out to evaluate the audio performance of these VoIP applications and rank them according to their Mean Opinion Score (MOS) results. The second part of this paper discusses the performance evaluation of our prototype implementation of session transfer using Linphone.
Abstract: The amount of dissolved oxygen in a river has a great direct effect on aquatic macroinvertebrates, and this indirectly influences the regional ecosystem. In this paper, dissolved oxygen in rivers is predicted by employing a simple fuzzy logic model, the Wang-Mendel method. This model uses only previous records to estimate upcoming values. For this purpose, daily and hourly records of eight stations in the Au Sable watershed in Michigan, United States are employed for periods of 12 years and 50 days, respectively. Calculations indicate that for long-period prediction it is better to increase the input interval, but for filling missing data it is advisable to decrease it. Increasing the partitioning of the input and output features influences accuracy only slightly but makes the model very time-consuming. Increasing the number of inputs acts similarly to increasing the number of partitions. A large amount of training data does not essentially modify accuracy, so an optimum training length should be selected.
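The Wang-Mendel method generates one fuzzy rule per training sample over triangular partitions of the variable ranges, keeping for each input region the rule with the highest firing degree. A much-simplified single-input sketch (partition centers, widths, and the use of raw sample values as rule consequents instead of output fuzzy regions are simplifying assumptions):

```python
def tri(x, center, width):
    """Triangular membership function centered at `center`."""
    return max(0.0, 1.0 - abs(x - center) / width)

def wang_mendel_rules(pairs, centers, width):
    """Generate one rule per (x, y) sample: IF x is region_i THEN y = sample y,
    resolving conflicts by keeping the rule with the highest firing degree."""
    rules = {}  # input-region index -> (firing degree, consequent y)
    for x, y in pairs:
        i = max(range(len(centers)), key=lambda j: tri(x, centers[j], width))
        deg = tri(x, centers[i], width)
        if i not in rules or deg > rules[i][0]:
            rules[i] = (deg, y)
    return rules

def predict(x, rules, centers, width):
    """Defuzzify: weighted average of rule consequents by membership degree."""
    num = den = 0.0
    for i, (_, y) in rules.items():
        w = tri(x, centers[i], width)
        num += w * y
        den += w
    return num / den if den else None
```

This makes the trade-offs in the abstract tangible: finer partitioning (more centers) multiplies the rule table, while the prediction step stays a cheap weighted average over fired rules.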