Abstract: As a vital activity for companies, new product
development (NPD) is also a very risky process, owing to the high
degree of uncertainty encountered at every development stage and
the inevitable dependence on how successfully previous steps were
accomplished. Hence, there is an apparent need to evaluate new
product initiatives systematically and make accurate decisions under
uncertainty. Another major concern is the time pressure to launch a
significant number of new products to preserve and increase the
competitive power of the company. In this work, we propose an
integrated decision-making framework based on neural networks and
fuzzy logic to make appropriate decisions and accelerate the
evaluation process. We are especially interested in the two initial
stages, where new product ideas are selected (go/no-go decision) and
the implementation order of the corresponding projects is
determined. We show that this two-stage intelligent approach allows
practitioners to separate good and bad product ideas quickly, if
roughly, by drawing on previous experience, and then analyze the
shortened list rigorously.
Abstract: In this paper, we propose a new robust and secure
watermarking system based on the combination of two different
transforms, the Discrete Wavelet Transform (DWT) and the
Contourlet Transform (CT). Combining the transforms compensates
for the drawbacks of using each transform separately. The proposed
algorithm has been designed, implemented and tested successfully.
The experimental results show that selecting the best sub-band for
embedding from both transforms improves the imperceptibility and
robustness of the new combined algorithm. The combined DWT-CT
algorithm achieved a PSNR value of 88.11 dB, demonstrating good
imperceptibility, and it also improved robustness, showing better
resistance against Gaussian noise attacks. In addition, the
implemented system provided a successful extraction method that
extracts the watermark efficiently.
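As a rough, hedged illustration of sub-band embedding (the paper combines DWT and CT; this sketch, written for this article, uses only a single-level Haar DWT with an additive rule, and all parameter values are assumptions), the following Python fragment embeds watermark bits into one detail sub-band and reports the PSNR:

```python
import math

def haar_dwt2(img):
    """Single-level 2D Haar DWT: split an even-sized image into
    LL, LH, HL and HH sub-bands."""
    h, w = len(img) // 2, len(img[0]) // 2
    LL = [[0.0] * w for _ in range(h)]
    LH = [[0.0] * w for _ in range(h)]
    HL = [[0.0] * w for _ in range(h)]
    HH = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            a, b = img[2 * i][2 * j], img[2 * i][2 * j + 1]
            c, d = img[2 * i + 1][2 * j], img[2 * i + 1][2 * j + 1]
            LL[i][j] = (a + b + c + d) / 2.0   # approximation
            LH[i][j] = (a + b - c - d) / 2.0   # horizontal detail
            HL[i][j] = (a - b + c - d) / 2.0   # vertical detail
            HH[i][j] = (a - b - c + d) / 2.0   # diagonal detail
    return LL, LH, HL, HH

def haar_idwt2(LL, LH, HL, HH):
    """Exact inverse of haar_dwt2."""
    h, w = len(LL), len(LL[0])
    img = [[0.0] * (2 * w) for _ in range(2 * h)]
    for i in range(h):
        for j in range(w):
            ll, lh, hl, hh = LL[i][j], LH[i][j], HL[i][j], HH[i][j]
            img[2 * i][2 * j] = (ll + lh + hl + hh) / 2.0
            img[2 * i][2 * j + 1] = (ll + lh - hl - hh) / 2.0
            img[2 * i + 1][2 * j] = (ll - lh + hl - hh) / 2.0
            img[2 * i + 1][2 * j + 1] = (ll - lh - hl + hh) / 2.0
    return img

def embed(img, bits, alpha=2.0):
    """Additively embed watermark bits into the HL sub-band;
    alpha trades robustness against imperceptibility."""
    LL, LH, HL, HH = haar_dwt2(img)
    for k, bit in enumerate(bits):
        i, j = divmod(k, len(HL[0]))
        HL[i][j] += alpha if bit else -alpha
    return haar_idwt2(LL, LH, HL, HH)

def psnr(orig, marked, peak=255.0):
    """Peak signal-to-noise ratio (dB) between two images."""
    n = len(orig) * len(orig[0])
    mse = sum((o - m) ** 2 for ro, rm in zip(orig, marked)
              for o, m in zip(ro, rm)) / n
    return float("inf") if mse == 0 else 10.0 * math.log10(peak ** 2 / mse)
```

In this sketch, increasing alpha makes the embedded bits survive noise better at the cost of a lower PSNR, which is the trade-off the abstract's sub-band selection is tuning.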
Abstract: Deep brain stimulation (DBS) is a second-line treatment
for Parkinson's disease. Its three parameters, frequency, pulse
width and voltage, must be optimized to achieve successful
treatment. At present this tuning is done clinically by neurologists,
and there is no established numerical method for determining the
parameters. The aim of this research
is to introduce simulation and modeling of Parkinson's Disease
treatment as a computational procedure to select optimum voltage.
We recorded finger tremor signals of some Parkinsonian patients
under DBS treatment at constant frequency and pulse width but
variable voltages; then, we fitted a new model to these data. The
optimum voltages obtained from the data fitting were the same as the
voltages recommended by the neurologists, which means that modeling
can be used as an engineering method to select optimum stimulation
voltages.
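To make the data-fitting step concrete, here is a hedged Python sketch (not the authors' model, whose functional form is not given in the abstract; a convex quadratic tremor-amplitude-versus-voltage curve and the sample data are assumptions purely for illustration) that fits recorded tremor amplitudes and returns the voltage at the fitted minimum:

```python
def fit_quadratic(volts, amps):
    """Least-squares fit of amp ~ a*v^2 + b*v + c via the 3x3
    normal equations, solved with Gaussian elimination."""
    s = [sum(v ** k for v in volts) for k in range(5)]            # sums v^0..v^4
    t = [sum(a * v ** k for v, a in zip(volts, amps)) for k in range(3)]
    A = [[s[4], s[3], s[2]],
         [s[3], s[2], s[1]],
         [s[2], s[1], s[0]]]
    y = [t[2], t[1], t[0]]
    for col in range(3):                      # forward elimination w/ pivoting
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        y[col], y[piv] = y[piv], y[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            y[r] -= f * y[col]
    beta = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                       # back substitution
        beta[r] = (y[r] - sum(A[r][c] * beta[c] for c in range(r + 1, 3))) / A[r][r]
    return beta                               # (a, b, c)

def optimum_voltage(volts, amps):
    """Voltage minimizing the fitted tremor curve (vertex of the parabola)."""
    a, b, _ = fit_quadratic(volts, amps)
    return -b / (2.0 * a)
```

The fitted vertex plays the role of the "optimum voltage"; the paper's actual model may be more elaborate, but the selection principle (minimize modeled tremor over voltage) is the same.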
Abstract: Security risk models have been successful in estimating the likelihood of attack for simple security threats. However, modeling complex systems and their security risk remains a challenge. Many methods have been proposed to address this problem, but because they are often difficult to manipulate and not comprehensive enough, they are not as popular with administrators and decision makers as they should be. In this paper we propose a new tool designed specifically for modeling large systems. The software takes into account both attack threats and security strengths.
Abstract: In the world of Peer-to-Peer (P2P) networking,
different protocols have been developed to make resource sharing
and information retrieval more efficient. The SemPeer protocol is a
new layer on Gnutella that transforms the connections of the nodes
based on semantic information to make information retrieval more
efficient. However, this transformation causes high clustering in the
network, which decreases the number of nodes reached, and therefore
the probability of finding a document also decreases. In this paper we
describe a mathematical model for the Gnutella and SemPeer
protocols that captures clustering-related issues, followed by a
proposal to modify the SemPeer protocol to achieve moderate
clustering. This modification is a sort of link management for the
individual nodes that allows the SemPeer protocol to be more
efficient, because the probability of a successful query in the P2P
network is appreciably increased. To validate the models, we ran a
series of simulations, which supported our results.
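The clustering effect described above can be illustrated with a toy flooding simulation (a sketch written for this article, not the paper's mathematical model; graph sizes, degrees and TTL values are arbitrary assumptions): a highly clustered topology wastes messages on neighbours that already know each other, so a query with the same TTL reaches fewer distinct nodes than in a random topology of equal degree:

```python
import random
from collections import deque

def flood_reach(adj, start, ttl):
    """Count nodes reached by Gnutella-style query flooding with a TTL."""
    seen = {start}
    frontier = deque([(start, ttl)])
    while frontier:
        node, t = frontier.popleft()
        if t == 0:
            continue
        for nb in adj[node]:
            if nb not in seen:
                seen.add(nb)
                frontier.append((nb, t - 1))
    return len(seen)

def ring_lattice(n, k):
    """Highly clustered graph: each node linked to its k nearest ring
    neighbours (neighbours of a node are also neighbours of each other)."""
    return {i: [(i + d) % n for d in range(-k // 2, k // 2 + 1) if d != 0]
            for i in range(n)}

def random_graph(n, k, seed=0):
    """Random graph with roughly the same average degree k (low clustering)."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    while sum(len(v) for v in adj.values()) < n * k:
        a, b = rng.randrange(n), rng.randrange(n)
        if a != b:
            adj[a].add(b)
            adj[b].add(a)
    return {i: sorted(v) for i, v in adj.items()}
```

With equal node count, degree and TTL, flooding the clustered lattice reaches far fewer nodes than flooding the random graph, which is exactly why moderating SemPeer's clustering raises the query success probability.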
Abstract: The stereophotogrammetry modality is gaining more widespread use in the clinical setting. Registration and visualization of these data, in conjunction with conventional 3D volumetric image modalities, provide virtual human data with textured soft tissue and internal anatomical and structural information. In this investigation, computed tomography (CT) and stereophotogrammetry data are acquired from four anatomical phantoms and registered using the trimmed iterative closest point (TrICP) algorithm. This paper fully addresses the issue of imaging artifacts around the stereophotogrammetry surface edge, using the registered CT data as a reference. Several iterative algorithms are implemented to automatically identify and remove stereophotogrammetry surface edge outliers, improving the overall visualization of the combined stereophotogrammetry and CT data. This paper shows that outliers at the surface edge of stereophotogrammetry data can be removed automatically and successfully.
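A hedged sketch of the edge-outlier idea (not the paper's exact algorithms; the iterative mean-plus-k-sigma distance threshold and all constants are assumptions for illustration): stereophotogrammetry points whose distance to the registered reference CT surface is anomalously large are removed iteratively until the distance distribution stabilizes:

```python
import math

def nearest_dist(p, ref):
    """Euclidean distance from point p to its nearest neighbour in ref
    (brute force, fine for a sketch; a k-d tree would be used in practice)."""
    return min(math.dist(p, q) for q in ref)

def remove_edge_outliers(points, ref, k=2.0, max_iter=10):
    """Iteratively drop points whose distance to the reference surface
    exceeds mean + k * std of the current distance distribution."""
    pts = list(points)
    for _ in range(max_iter):
        d = [nearest_dist(p, ref) for p in pts]
        mean = sum(d) / len(d)
        std = (sum((x - mean) ** 2 for x in d) / len(d)) ** 0.5
        kept = [p for p, x in zip(pts, d) if x <= mean + k * std]
        if len(kept) == len(pts):      # converged: nothing removed
            break
        pts = kept
    return pts
```

Because removing the worst outliers tightens the distance statistics, the threshold shrinks on each pass, which is why an iterative scheme (as in the paper) outperforms a single fixed cutoff.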
Abstract: In order to guarantee secure communication for wireless sensor networks (WSNs), many user authentication schemes have drawn researchers' attention and been studied widely. In 2012, He et al. proposed a robust biometric-based user authentication scheme for WSNs. However, this paper demonstrates that He et al.'s scheme has some drawbacks: a poor reparability problem, a user impersonation attack, and a sensor node impersonation attack.
Abstract: This paper presents a new version of the SVM mixture algorithm initially proposed by Kwok for classification and regression problems. In both cases, a slight modification of the mixture model leads to a standard SVM training problem and to the existence of an exact solution, and it allows the direct use of well-known decomposition and working-set selection algorithms. Only the regression case is considered in this paper, but classification can be addressed in a very similar way. The method has been successfully applied to modeling engine pollutant emissions.
Abstract: This paper presents the development of a Bayesian
belief network classifier for prediction of graft status and survival
period in renal transplantation using the patient profile information
prior to the transplantation. The objective was to explore feasibility
of developing a decision making tool for identifying the most suitable
recipient among the candidate pool members. The dataset was
compiled from the University of Toledo Medical Center Hospital
patients as reported to the United Network for Organ Sharing (UNOS), and had
1228 patient records for the period covering 1987 through 2009. The
Bayes net classifiers were developed using the Weka machine
learning software workbench. Two separate classifiers were induced
from the data set, one to predict the status of the graft as either failed
or living, and a second classifier to predict the graft survival period.
The classifier for graft status prediction performed very well with a
prediction accuracy of 97.8% and true positive values of 0.967 and
0.988 for the living and failed classes, respectively. The second
classifier to predict the graft survival period yielded a prediction
accuracy of 68.2% and a true positive rate of 0.85 for the class
representing those instances with kidneys failing during the first year
following transplantation. Simulation results indicated that it is
feasible to develop a successful Bayesian belief network classifier for
prediction of graft status, but not the graft survival period, using the
information in the UNOS database.
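The abstract's classifiers were induced with Weka's Bayes net learners; as a minimal hedged illustration of the underlying idea (a naive Bayes classifier over categorical patient attributes, written for this article with invented toy data, not the UNOS dataset), consider:

```python
from collections import Counter, defaultdict

def train_nb(rows, labels):
    """Count label frequencies and, per feature index, how often each
    categorical value co-occurs with each label."""
    label_counts = Counter(labels)
    feat_counts = defaultdict(lambda: defaultdict(Counter))
    for row, lab in zip(rows, labels):
        for i, val in enumerate(row):
            feat_counts[i][lab][val] += 1
    return label_counts, feat_counts

def predict_nb(model, row):
    """Most probable label under naive Bayes, with a Laplace-style
    add-one smoothing so unseen values never zero out a class."""
    label_counts, feat_counts = model
    total = sum(label_counts.values())
    best, best_p = None, -1.0
    for lab, lc in label_counts.items():
        p = lc / total                         # class prior
        for i, val in enumerate(row):
            counts = feat_counts[i][lab]
            p *= (counts[val] + 1) / (lc + len(counts) + 1)
        if p > best_p:
            best, best_p = lab, p
    return best
```

Weka's BayesNet generalizes this by also learning dependency arcs between attributes instead of assuming conditional independence, which is what made the graft-status classifier in the study so accurate.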
Abstract: Tracing and locating the geographical location of users (Geolocation) is used extensively in today's Internet. Whenever we, for example, request a page from Google, we are, unless a specific configuration has been made, automatically forwarded to the page in the relevant language, and, among other things, advertisements specific to our identified location are presented. Geolocation has a significant impact especially within the area of network security. Because of the way the Internet works, attacks can be executed from almost anywhere. Therefore, for attribution, knowledge of the origin of an attack, and thus Geolocation, is mandatory in order to trace back an attacker. In addition, Geolocation can also be used very successfully to increase the security of a network during operation (i.e. before an intrusion has actually taken place). Similar to greylisting in email, Geolocation allows one to (i) correlate detected attacks with new connections and (ii) consequently classify traffic a priori as more suspicious (in particular, allowing this traffic to be inspected in more detail). Although numerous techniques for Geolocation exist, each strategy is subject to certain restrictions. Following the ideas of Endo et al., this publication tries to overcome these shortcomings with a combined solution of different methods to allow improved and optimized Geolocation. Thus, we present our architecture for improved Geolocation, designing a new algorithm which combines several Geolocation techniques to increase accuracy.
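A hedged sketch of one way to combine several geolocation estimates (this is not the algorithm of the paper or of Endo et al.; the confidence-weighted centroid with outlier rejection and the distance cutoff are illustrative assumptions):

```python
import math

def combine_estimates(estimates, max_km=2000.0):
    """Combine (lat, lon, weight) estimates from several geolocation
    techniques: compute a confidence-weighted centroid, discard
    estimates farther than max_km from it, and recombine."""
    def centroid(ests):
        total = sum(w for _, _, w in ests)
        return (sum(la * w for la, _, w in ests) / total,
                sum(lo * w for _, lo, w in ests) / total)

    def dist_km(p, q):
        # Equirectangular approximation; adequate for this sketch.
        lat1, lon1, lat2, lon2 = map(math.radians, (p[0], p[1], q[0], q[1]))
        x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
        y = lat2 - lat1
        return 6371.0 * math.hypot(x, y)

    c = centroid(estimates)
    kept = [e for e in estimates if dist_km((e[0], e[1]), c) <= max_km]
    return centroid(kept if kept else estimates)
```

The point of combining techniques is visible here: a single wildly wrong method (e.g. a stale WHOIS entry) gets voted out by the agreeing majority, so the fused position is more accurate than any fixed single source.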
Abstract: Due to a high unemployment rate among local people
and a high reliance on expatriate workers, the governments in the
Gulf Co-operation Council (GCC) countries have been implementing
programmes of localisation (replacing foreign workers with GCC
nationals). These programmes have been successful in the public
sector but much less so in the private sector. However, there are now
insufficient jobs for locals in the public sector and the onus to provide
employment has fallen on the private sector. This paper is concerned
with a study, which is a work in progress (certain elements are
complete but not the whole study), investigating the effective
implementation of localisation policies in four- and five-star hotels in
the Kingdom of Saudi Arabia (KSA) and the United Arab Emirates
(UAE). The purpose of the paper is to identify the research gap, and
to present the need for the research. Further, it will explain how this
research was conducted.
Studies of localisation in the GCC countries are under-represented
in scholarly literature. Currently, the hotel sectors in KSA and UAE
play an important part in the countries’ economies. However, the
total proportion of Saudis working in the hotel sector in KSA is
slightly under 8%, and in the UAE, the hotel sector remains highly
reliant on expatriates. There is therefore a need for research on
strategies to enhance the implementation of the localisation policies
in general and in the hotel sector in particular.
Further, despite the importance of the hotel sector to their
economies, there remains a dearth of research into the
implementation of localisation policies in this sector. Indeed, as far as
the researchers are aware, there is no study examining localisation in
the hotel sector in KSA, and few in the UAE. This represents a
considerable research gap.
Regarding how the research was carried out, a multiple case study
strategy was used. The four- and five-star hotel sector in KSA is one
of the cases, while the four- and five-star hotel sector in the UAE is
the other case. Four- and five-star hotels in KSA and the UAE were
chosen as these countries have the longest established localisation
policies of all the GCC states and there are more hotels of these
classifications in these countries than in any of the other Gulf
countries. A literature review was carried out to underpin the
research. The empirical data were gathered in three phases. In order
to gain a pre-understanding of the issues pertaining to the research
context, Phase I involved eight unstructured interviews with officials
from the Saudi Commission for Tourism and Antiquities (three
interviewees); the Saudi Human Resources Development Fund (one);
the Abu Dhabi Tourism and Culture Authority (three); and the Abu
Dhabi Development Fund (one).
In Phase II, a questionnaire was administered to 24 managers and
24 employees in four- and five-star hotels in each country to obtain
their beliefs, attitudes, opinions, preferences and practices concerning
localisation.
Unstructured interviews were carried out in Phase III with six
managers in each country in order to allow them to express opinions
that may not have been explored in sufficient depth in the
questionnaire. The interviews in Phases I and III were analysed using
thematic analysis, while SPSS will be used to analyse the questionnaire
data.
It is recommended that future research be undertaken on a larger
scale, with a larger sample taken from all over KSA and the UAE
rather than from only four cities (i.e., Riyadh and Jeddah in KSA and
Abu Dhabi and Sharjah in the UAE), as was the case in this research.
Abstract: Data security in u-Health systems is an important
issue because wireless networks are vulnerable to hacking. However, it
is not easy to implement a proper security algorithm in an embedded
u-Health monitoring system because of hardware constraints such as
low performance, limited power, and limited memory size. To
secure data that contain personal and biosignal information, we
implemented several security algorithms such as Blowfish, data
encryption standard (DES), advanced encryption standard (AES) and
Rivest Cipher 4 (RC4) for our u-Health monitoring system and the
results were successful. Under the same experimental conditions, we
compared these algorithms. RC4 had the fastest execution time.
Memory usage was the most efficient for DES. However, considering
both performance and security capability, we concluded that AES was
the most appropriate algorithm for a personal u-Health monitoring
system.
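The comparison methodology can be sketched as follows (a hedged illustration written for this article, not the embedded implementation from the paper; only RC4 is implemented here because, unlike AES or DES, it fits in a few lines of pure Python):

```python
import time

def rc4(key, data):
    """RC4 stream cipher: key-scheduling (KSA) followed by the
    pseudo-random generation algorithm (PRGA), XORed with the data.
    Encryption and decryption are the same operation."""
    S = list(range(256))
    j = 0
    for i in range(256):                       # KSA
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out = bytearray()
    i = j = 0
    for byte in data:                          # PRGA
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

def time_cipher(fn, key, data, runs=5):
    """Best wall-clock encryption time over several runs."""
    times = []
    for _ in range(runs):
        t0 = time.perf_counter()
        fn(key, data)
        times.append(time.perf_counter() - t0)
    return min(times)
```

On the paper's embedded platform the ranking of the four ciphers could of course differ from a desktop run; the point here is only the measurement harness (encrypt a fixed biosignal-sized buffer, repeat, take the best time).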
Abstract: A two-dimensional numerical simulation of crossflow
around four cylinders in an in-line rectangular configuration is
studied by using the lattice Boltzmann method (LBM). Special
attention is paid to the effect of the spacing between the cylinders.
The Reynolds number (Re) is fixed at Re = 100, and the spacing
ratio L/D is set at 0.5, 1.5, 2.5, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0 and
10.0. Results show that, depending on the spacing, the flow fields
exhibit four distinct regimes: single-square-cylinder flow, stable
shielding flow, wiggling shielding flow and vortex shedding flow.
The effects of the spacing ratio on physical quantities such as the
mean drag coefficient, the Strouhal number and the root-mean-square
values of the drag and lift coefficients are also presented.
There is more than one shedding frequency at small spacing ratios.
The mean drag coefficients for downstream cylinders are less than
that of the single cylinder for all spacing ratios. The present results
using the LBM are compared with some existing experimental data
and numerical studies. The comparison shows that the LBM can
capture the characteristics of the bluff body flow reasonably well and
is a good tool for bluff body flow studies.
Abstract: In this paper, we propose a novel algorithm for
delineating the endocardial wall from a human heart ultrasound scan.
We assume that the gray levels in the ultrasound images are
independent and identically distributed random variables with
different Rician Inverse Gaussian (RiIG) distributions. Both synthetic
and real clinical data will be used for testing the algorithm. Algorithm
performance will be evaluated against, first, an expert radiologist's
assessment of a soft copy of an ultrasound scan during the scanning
process and, second, the doctor's conclusion after reviewing a printed
copy of the same scan. Successful implementation of this algorithm
should make it possible to differentiate normal from abnormal soft
tissue and help identify the disease, its stage, and how best to treat
the patient. We hope that an automated system that uses this
algorithm will be useful in public hospitals especially in Third World
countries where problems such as shortage of skilled radiologists and
shortage of ultrasound machines are common. These public hospitals
are usually the first and last stop for most patients in these countries.
Abstract: Forecasting the values of the indicators that
characterize the effectiveness of an organization's performance is of
great importance for its successful development. Such forecasting
is necessary in order to assess the current state and to foresee future
developments, so that measures to improve the organization's
activity can be undertaken in time. The article presents an
overview of the applied mathematical and statistical methods for
developing forecasts. Special attention is paid to artificial neural
networks as a forecasting tool. Their strengths and weaknesses are
analyzed and a synopsis is made of the application of artificial neural
networks in the field of forecasting of the values of different
education efficiency indicators. A method of evaluation of the
activity of universities using the Balanced Scorecard is proposed and
Key Performance Indicators for assessment of e-learning are
selected. Resulting indicators for the evaluation of efficiency of the
activity are proposed. An artificial neural network is constructed and
applied in the forecasting of the values of indicators for e-learning
efficiency on the basis of the KPI values.
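As a hedged illustration of the forecasting tool (a minimal one-hidden-layer network written for this article; the paper's network topology, training method and KPI data are not specified in the abstract, so everything below, including the toy data, is an assumption), consider:

```python
import math
import random

def train_mlp(X, y, hidden=4, lr=0.1, epochs=2000, seed=0):
    """Train a tiny one-hidden-layer network (tanh units, linear output)
    with per-sample gradient descent; returns a predict(x) function."""
    rng = random.Random(seed)
    n_in = len(X[0])
    W1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    W2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0

    def forward(x):
        h = [math.tanh(sum(w * xi for w, xi in zip(W1[j], x)) + b1[j])
             for j in range(hidden)]
        return h, sum(w * hj for w, hj in zip(W2, h)) + b2

    for _ in range(epochs):
        for x, t in zip(X, y):
            h, out = forward(x)
            err = out - t                      # gradient of 0.5*err^2 w.r.t. out
            for j in range(hidden):
                grad_h = err * W2[j] * (1.0 - h[j] ** 2)   # backprop through tanh
                W2[j] -= lr * err * h[j]
                for i in range(n_in):
                    W1[j][i] -= lr * grad_h * x[i]
                b1[j] -= lr * grad_h
            b2 -= lr * err
    return lambda x: forward(x)[1]
```

Each input vector would hold normalized KPI values and the target the e-learning efficiency indicator to be forecast; in practice a library implementation with proper validation would replace this sketch.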
Abstract: Nowadays companies strive to survive in a
competitive global environment. To speed up product
development/modifications, it is suggested to adopt a collaborative
product development approach. However, despite the advantages of
new IT improvements, many CAx systems still work separately and
locally. Collaborative design and manufacture require a product
information model that supports related CAx product data models. To
solve this problem many solutions have been proposed, the most
successful of which is adopting the STEP standard as a product data
model to develop a collaborative CAx platform. However, several
factors that usually slow down the implementation of the STEP
standard in collaborative data exchange, management and integration
should be considered: the evolution of STEP's Application Protocols
(APs) over time, the huge number of STEP APs and CCs, the high
costs of implementation, the costly process of converting older CAx
software files to the STEP neutral file format, and the lack of STEP
knowledge. In this paper the requirements for a successful
collaborative CAx system are discussed. The STEP standard's
capability for product data integration
and its shortcomings as well as the dominant platforms for supporting
CAx collaboration management and product data integration are
reviewed. Finally, a platform named LAYMOD is proposed to fulfil
the requirements of a collaborative CAx environment and to integrate
the product data. The platform is layered, in order to enable
global collaboration among different CAx software
packages/developers. It also adopts the STEP modular architecture
and the XML data structures to enable collaboration between CAx
software packages as well as overcoming the STEP standard
limitations. The architecture and procedures of the LAYMOD platform
for managing collaboration and avoiding conflicts in product data
integration are introduced.
Abstract: A mathematical model based on a mass and energy
balance for the combustion in a cement rotary kiln was developed.
The model was used to investigate the impact of replacing about
45% of the primary coal energy with different alternative fuels.
Refuse derived fuel, waste wood, solid hazardous waste and liquid
hazardous waste were used in the modeling. The results showed that
in order to keep the kiln temperature unchanged, and thereby
maintain the required clinker quality, the production capacity had to
be reduced by 1-15 %, depending on the fuel type. The reason for the
reduction is increased exhaust gas flow rates caused by the fuel
characteristics. The model, which has been successfully validated in a
full-scale experiment, was also used to show that the negative impact
on the production capacity can be avoided if a relatively small part of
the combustion air is replaced by pure oxygen.
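The mechanism described above can be sketched with a toy steady-state balance (all coefficients below are illustrative assumptions made for this article, not values from the paper's validated model): a fuel that produces more exhaust gas per MJ carries more enthalpy out of the kiln, so less heat remains for clinker formation at a fixed energy input:

```python
def mixed_gas_per_mj(g_coal, g_alt, alt_fraction):
    """Exhaust-gas yield (kg gas per MJ fuel) of a coal/alternative mix."""
    return (1.0 - alt_fraction) * g_coal + alt_fraction * g_alt

def clinker_production(energy_mj_h, gas_per_mj, cp_gas=0.0011,
                       dt_gas=300.0, h_clinker=3.3):
    """Toy steady-state balance: of the fuel energy fed to the kiln
    (MJ/h), the enthalpy carried out by the exhaust gas
    (m_gas * cp * dT, with cp in MJ/(kg*K)) is lost; the remainder
    drives clinker formation (h_clinker in MJ per kg clinker)."""
    q_exhaust = energy_mj_h * gas_per_mj * cp_gas * dt_gas
    return (energy_mj_h - q_exhaust) / h_clinker   # kg clinker per hour
```

With the illustrative numbers below (0.35 kg gas/MJ for coal versus 0.45 kg gas/MJ for an alternative fuel at 45% substitution), the computed capacity drops by roughly 2%, the same direction as the 1-15% reduction reported above; replacing part of the combustion air with oxygen reduces the gas yield per MJ and thus recovers capacity.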
Abstract: The concept of e-government has begun to spread among countries. It is based on the use of information and communication technology (ICT) to fully utilize government resources, as well as to provide government services to citizens, investors and foreigners. Critical factors are those determined by the senior management of each organization; the success or failure of the organization depends on their effective implementation. These factors vary from one organization to another according to activity, size and function. It is very important that organizations identify them in order to avoid the risk of implementing initiatives that may fail, while exploiting opportunities that are likely to succeed. The main focus of this paper is to investigate the majority of the critical success factors (CSFs) associated with the implementation of e-government projects. The study considers both technical and non-technical factors. The paper concludes by listing the majority of CSFs relating to successful e-government implementation in Bahrain.
Abstract: As German companies roll out their standardized
production systems to offshore manufacturing plants, they face the
challenge of implementing them in different cultural environments.
Studies show that the local adaptation is one of the key factors for a
successful implementation. Thus the question arises of where the line
between standardization and adaptation can be drawn. To answer
this question the influence of culture on production systems is
analysed in this paper. The culturally contingent components of
production systems are identified. Also the contingency factors are
classified according to their impact on the necessary adaptation
changes and implementation effort. Culturally specific decision
making, coordination, communication and motivation patterns
require one-time changes in organizational and process design. The
attitude towards rules requires more intense coaching and controlling.
Lastly, a framework is developed to depict standardization and
adaptation needs when transplanting production systems into different
cultural environments.
Abstract: Embedding sustainability in technological curricula has become a crucial factor in educating engineers with competences in sustainability. In 2008, the Technical University of Catalonia (UPC) designed the Sustainable Technology Excellence Program STEP 2015 in order to ensure that sustainability was successfully embedded. The program took advantage of the opportunity offered by the redesign, by 2010, of all Bachelor's and Master's degrees in Spain under the European Higher Education Area (EHEA) framework. The STEP program goals are: to design compulsory courses in each degree; to develop the conceptual base and identify reference models in sustainability for all specialties at UPC; to create an internal interdisciplinary network of faculty from all the schools; to initiate new transdisciplinary research activities in technology-sustainability-education; to spread the know-how attained; to achieve international scientific excellence in technology-sustainability-education; and to graduate the first engineers/architects of the new EHEA bachelor's degrees with sustainability as a generic competence. Specifically, in this paper the authors explain their experience in leading the STEP program, and two examples are presented: the Industrial Robotics subject and the curriculum for the School of Architecture.