Abstract: A computational simulation of steam flow and heat transfer in power plant condensers, based on a three-dimensional mathematical model of flow through porous media, is presented. The Streamline Upwind Petrov-Galerkin (SUPG) finite element method is applied to solve the mathematical model. Comparison of the simulation results with experimental data from an experimental condenser confirms that the SUPG finite element method can be successfully applied to the three-dimensional mathematical model of steam flow and heat transfer in power plant condensers.
Abstract: Single crystals of magnesium alloys such as Mg-1Al, Mg-1Zn-0.5Y, Mg-3Li, and AZ31, as well as of pure Mg, were successfully fabricated in this study by employing the modified Bridgman method. To determine the exact orientation of the crystals, the Laue back-reflection method and pole figure measurements were carried out on each single crystal. The single crystals were 10 mm in diameter and 120 mm in length. Hardness and compression tests were conducted, and the results revealed that hardness and strength depended strongly on orientation: the closer the orientation was to the basal orientation, the higher the hardness and compressive strength. The effect of alloying was smaller than that of orientation. After compressive deformation of the single crystals, their orientation was found to rotate and become parallel to the basal orientation.
Abstract: α-Pinene is the main component of most turpentine oils. The hydration of α-pinene with acid catalysts leads to a complex mixture of monoterpenes. In order to obtain more valuable products, the α-pinene in turpentine can be hydrated in dilute mineral acid solutions to produce α-terpineol. The design of separation processes requires information on phase equilibrium and related thermodynamic properties. This paper reports the results of a study on the liquid-liquid equilibrium (LLE) of the systems α-pinene + water and α-terpineol + water.
Binary LLE data for the α-pinene + water and α-terpineol + water systems were determined experimentally at 301 K and atmospheric pressure. The two-component mixture was stirred for about 30 min and then left for about 2 h to allow complete phase separation. The compositions of both phases were analyzed by gas chromatography. The experimental data were correlated with both the NRTL and UNIQUAC activity coefficient models. The LLE data for both systems were correlated successfully by the NRTL model, whereas the UNIQUAC model did not fit the experimental data satisfactorily. For the α-pinene + water system at 301 K, the NRTL model gave an RMSD of 0.0404% with α = 0.3 and 0.0058% with α = 0.61; for the α-terpineol + water system at 301 K, it gave an RMSD of 0.1487% with α = 0.3 and 0.0032% with α = 0.6, between the experimental and calculated mole fractions.
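The RMSD figures quoted above compare experimental and calculated mole fractions. The abstract does not give the exact expression used, so the following Python sketch shows the conventional root-mean-square deviation; the function name `rmsd` and the sample values are our own illustration, not the authors' code.

```python
import math

def rmsd(x_exp, x_calc):
    # Root-mean-square deviation between experimental and
    # calculated mole fractions over N tie-line points.
    n = len(x_exp)
    return math.sqrt(sum((e - c) ** 2 for e, c in zip(x_exp, x_calc)) / n)
```

A smaller RMSD between the two composition sets indicates a better correlation of the tie-line data by the activity coefficient model.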
Abstract: Given recent developments in the field of information and communication technologies, there is a need to use these new technologies actively in education. Clearly, the integration of technology into the education system will differ between primary and higher education, and between traditional and distance education. This study discusses the integration of technology into distance education, taken from the viewpoint of students. Using student feedback on an education programme in which new technological media are used, it explains how survey variables can be grouped into positive, negative, and supporting factors, and how the education strategy of higher education institutions can be redesigned by examining the variables of each identified factor. The paper concludes with recommendations on the need for experts from different fields to work as a group, and on the use of numerical methods, in establishing a successful education strategy.
Abstract: As a vital activity for companies, new product
development (NPD) is also a very risky process, owing to the high degree of uncertainty encountered at every development stage and the inevitable dependence on how successfully previous steps are
accomplished. Hence, there is an apparent need to evaluate new
product initiatives systematically and make accurate decisions under
uncertainty. Another major concern is the time pressure to launch a
significant number of new products to preserve and increase the
competitive power of the company. In this work, we propose an
integrated decision-making framework based on neural networks and
fuzzy logic to make appropriate decisions and accelerate the
evaluation process. We are especially interested in the two initial
stages where new product ideas are selected (go/no go decision) and
the implementation order of the corresponding projects is determined. We show that this two-stage intelligent approach allows practitioners to separate good and bad product ideas roughly and quickly by making use of previous experience, and then to analyze a shortened list rigorously.
Abstract: Object-oriented database management systems, which appeared around 1986, had not achieved the success expected of them five years after their birth. One of the major difficulties is query optimization. In this paper we propose a new approach that enriches the existing query optimization techniques for object-oriented databases. Given the success of query optimization in the relational model, our approach draws on those relational optimization techniques and extends them so that they can support the new concepts introduced by object databases.
Abstract: In this paper, we propose a new robust and secure watermarking system based on the combination of two different transforms, the Discrete Wavelet Transform (DWT) and the Contourlet Transform (CT). The combined transforms compensate for the drawbacks of using each transform separately. The proposed algorithm has been designed, implemented, and tested successfully. The experimental results showed that selecting the best sub-band for embedding from both transforms improves the imperceptibility and robustness of the combined algorithm. The combined DWT-CT algorithm achieved good imperceptibility, with a PSNR value of 88.11 dB, and improved robustness, showing better resistance against Gaussian noise attacks. In addition, the implemented system provides a successful extraction method that extracts the watermark efficiently.
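The PSNR value of 88.11 reported above is the standard peak signal-to-noise ratio between the original and watermarked images. The paper's exact implementation is not given, so the following is only a sketch of the usual definition; the function name `psnr` and the flat-list pixel representation are our assumptions.

```python
import math

def psnr(original, watermarked, max_val=255.0):
    # Peak signal-to-noise ratio in dB between two equal-size
    # images given as flat pixel lists; higher = less distortion.
    mse = sum((o - w) ** 2 for o, w in zip(original, watermarked)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)
```

PSNR values around 88 dB, as reported, correspond to a very small mean squared error and hence high imperceptibility of the embedded watermark.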
Abstract: Deep Brain Stimulation (DBS) is a second-line treatment for Parkinson's disease. Its three parameters, frequency, pulse width, and voltage, must be optimized to achieve successful treatment. At present this is done clinically by neurologists, and there is no established numerical method for determining them. The aim of this research is to introduce simulation and modeling of Parkinson's disease treatment as a computational procedure for selecting the optimum voltage. We recorded finger tremor signals of several Parkinsonian patients under DBS treatment at constant frequency and pulse width but variable voltages; we then fitted a new model to these data. The optimum voltages obtained from the data fitting were the same as the voltages recommended by neurologists, which suggests that modeling can be used as an engineering method for selecting optimum stimulation voltages.
Abstract: Security risk models have been successful in estimating the likelihood of attack for simple security threats. However, modeling complex systems and their security risk remains a challenge. Many methods have been proposed to address this problem, but, being difficult to use and insufficiently comprehensive, they are not as widely adopted by administrators and decision makers as they should be. In this paper we propose a new tool for modeling large systems for this purpose. The software takes into account both attack threats and security strength.
Abstract: In the world of Peer-to-Peer (P2P) networking
different protocols have been developed to make the resource sharing
or information retrieval more efficient. The SemPeer protocol is a
new layer on Gnutella that transforms the connections of the nodes
based on semantic information to make information retrieval more
efficient. However, this transformation causes high clustering in the network, which decreases the number of nodes reached and therefore the probability of finding a document. In this paper we
describe a mathematical model for the Gnutella and SemPeer
protocols that captures clustering-related issues, followed by a
proposition to modify the SemPeer protocol to achieve moderate
clustering. This modification is a sort of link management for the
individual nodes that allows the SemPeer protocol to be more
efficient, because the probability of a successful query in the P2P network is appreciably increased. To validate the models, we ran a series of simulations, which supported our results.
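The claim that reaching more nodes raises the probability of a successful query can be illustrated with a toy calculation. Assuming, purely for illustration, that each reached node holds the requested document independently with probability `p_doc` (an assumption of ours, not a statement from the abstract), the success probability of a flooded query is:

```python
def query_success_probability(nodes_reached, p_doc):
    # Probability that at least one reached node holds the document,
    # assuming independent replication with per-node probability p_doc.
    return 1.0 - (1.0 - p_doc) ** nodes_reached
```

Under this toy model the success probability grows monotonically with the number of nodes reached, which is why excessive clustering (fewer reachable nodes) hurts retrieval.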
Abstract: Software project effort estimation is frequently seen
as complex and expensive for individual software engineers.
Software production is in crisis: it suffers from excessive costs and is often out of control. It has been suggested that software production is out of control because we do not measure it, and you cannot control what you cannot measure. During the last decade, a number of studies on cost estimation have been conducted. Metric-set selection plays a vital role in software cost estimation studies, yet its importance has been ignored, especially in neural-network-based studies. In this study we explore the reasons for these disappointing results and implement different neural network models using an augmented set of new metrics. The results obtained are
compared with previous studies using traditional metrics. To be able
to make comparisons, two types of data have been used. The first
part of the data is taken from the Constructive Cost Model
(COCOMO'81) which is commonly used in previous studies and the
second part is collected according to new metrics in a leading
international company in Turkey. The accuracy of the selected
metrics and the data samples are verified using statistical techniques.
The model presented here is based on Multi-Layer Perceptron
(MLP). Another difficulty associated with the cost estimation studies
is the fact that the data collection requires time and care. To make a
more thorough use of the samples collected, k-fold, cross validation
method is also implemented. It is concluded that, as long as an accurate and quantifiable set of metrics is defined and measured correctly, neural networks can be applied successfully in software cost estimation studies.
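The k-fold cross-validation mentioned above partitions the scarce cost data so that every sample is used for both training and testing. A minimal sketch of the index bookkeeping follows; the function `k_fold_indices` is our own illustration of the standard procedure, not the paper's code.

```python
def k_fold_indices(n_samples, k):
    # Partition sample indices into k contiguous folds; each fold
    # serves once as the test set while the rest form the training set.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return [([i for j, fold in enumerate(folds) if j != f for i in fold], folds[f])
            for f in range(k)]
```

Each of the k train/test splits trains one model instance; averaging the k test errors gives a more reliable accuracy estimate than a single hold-out split when data collection is expensive.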
Abstract: The present study was designed to test the influence
of confirmed expectations, perceived usefulness and perceived
competence on e-learning satisfaction among university teachers. A
questionnaire was completed by 125 university teachers from 12
different universities in Norway. We found that 51% of the variance in university teachers' satisfaction with e-learning could be explained by the three proposed antecedents. Perceived usefulness appears to be the most important predictor of teachers' satisfaction with e-learning.
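The "51% of the variance explained" statistic corresponds to the coefficient of determination R² of the regression model. As a hedged illustration (the study's actual statistical software is not named, and the function below is our own sketch), R² can be computed as:

```python
def r_squared(y, y_pred):
    # Coefficient of determination: proportion of variance in the
    # observed outcomes y that is explained by the predictions y_pred.
    mean_y = sum(y) / len(y)
    ss_tot = sum((v - mean_y) ** 2 for v in y)
    ss_res = sum((v - p) ** 2 for v, p in zip(y, y_pred))
    return 1.0 - ss_res / ss_tot
```

An R² of 0.51, as reported, means the three antecedents jointly account for about half of the variation in satisfaction scores.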
Abstract: The stereophotogrammetry modality is gaining more widespread use in the clinical setting. Registration and visualization of these data, in conjunction with conventional 3D volumetric image modalities, provide virtual human data with textured soft tissue together with internal anatomical and structural information. In this investigation, computed tomography (CT) and stereophotogrammetry data were acquired from four anatomical phantoms and registered using the trimmed iterative closest point (TrICP) algorithm. This paper fully addresses the issue of imaging artifacts around the stereophotogrammetry surface edge, using the registered CT data as a reference. Several iterative algorithms are implemented to automatically identify and remove stereophotogrammetry surface edge outliers, improving the overall visualization of the combined stereophotogrammetry and CT data. This paper shows that outliers at the surface edge of stereophotogrammetry data can be successfully removed automatically.
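One simple way to realize the edge-outlier removal described above, keeping only stereophotogrammetry points that lie close to the registered CT reference surface, is a distance-threshold trim. This is a toy sketch under our own assumptions (brute-force nearest-point search, hypothetical function name), not the paper's iterative algorithms:

```python
def remove_edge_outliers(points, reference, threshold):
    # Keep only 3D points lying within `threshold` of the nearest
    # reference point; a stand-in for iterative edge trimming.
    def nearest_distance(p):
        return min(((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 + (p[2] - q[2]) ** 2) ** 0.5
                   for q in reference)
    return [p for p in points if nearest_distance(p) <= threshold]
```

In practice a spatial index (e.g. a k-d tree) would replace the brute-force search, and the threshold could be tightened iteratively, but the filtering principle is the same.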
Abstract: In order to guarantee secure communication in wireless sensor networks (WSNs), many user authentication schemes have drawn researchers' attention and been studied widely. In 2012, He et al. proposed a robust biometric-based user authentication scheme for WSNs. However, this paper demonstrates that He et al.'s scheme has several drawbacks: a poor reparability problem, and vulnerability to user impersonation and sensor node impersonation attacks.
Abstract: This paper presents a new version of the SVM mixture algorithm initially proposed by Kwok for classification and regression problems. For both cases, a slight modification of the mixture model leads to a standard SVM training problem, to the existence of an exact solution and allows the direct use of well known decomposition and working set selection algorithms. Only the regression case is considered in this paper but classification has been addressed in a very similar way. This method has been successfully applied to engine pollutants emission modeling.
Abstract: This paper presents the development of a Bayesian
belief network classifier for prediction of graft status and survival
period in renal transplantation using the patient profile information
prior to the transplantation. The objective was to explore feasibility
of developing a decision making tool for identifying the most suitable
recipient among the candidate pool members. The dataset was
compiled from the University of Toledo Medical Center Hospital
patients as reported to the United Network for Organ Sharing (UNOS), and had
1228 patient records for the period covering 1987 through 2009. The
Bayes net classifiers were developed using the Weka machine
learning software workbench. Two separate classifiers were induced
from the data set, one to predict the status of the graft as either failed
or living, and a second classifier to predict the graft survival period.
The classifier for graft status prediction performed very well, with a prediction accuracy of 97.8% and true positive rates of 0.967 and 0.988 for the living and failed classes, respectively. The second
classifier to predict the graft survival period yielded a prediction
accuracy of 68.2% and a true positive rate of 0.85 for the class
representing those instances with kidneys failing during the first year
following transplantation. Simulation results indicated that it is
feasible to develop a successful Bayesian belief network classifier for
prediction of graft status, but not the graft survival period, using the
information in the UNOS database.
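The accuracy and true positive values reported above are standard quantities derived from the classifier's confusion matrix. As a hedged illustration (the function name and example counts below are ours, not figures from the UNOS data), for a binary problem:

```python
def binary_metrics(tp, fn, fp, tn):
    # Accuracy and per-class true positive rates from a 2x2 confusion
    # matrix (tp/fn counted for the positive class, tn/fp for the negative).
    total = tp + fn + fp + tn
    accuracy = (tp + tn) / total
    tpr_positive = tp / (tp + fn)
    tpr_negative = tn / (tn + fp)
    return accuracy, tpr_positive, tpr_negative
```

A high true positive rate for the failed class matters clinically, since missing a likely graft failure is the costlier error.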
Abstract: Tracing and locating the geographical location of users (geolocation) is used extensively in today's Internet. Whenever we request a page from Google, for example, we are automatically forwarded (unless a specific configuration has been made) to the page in the relevant language and, among other things, shown advertisements specific to our identified location. Geolocation has a significant impact especially in the area of network security. Because of the way the Internet works, attacks can be executed from almost anywhere. Therefore, for attribution, knowledge of the origin of an attack, and thus geolocation, is mandatory in order to be able to trace back an attacker. In addition, geolocation can also be used very successfully to increase the security of a network during operation (i.e. before an intrusion has actually taken place). Similar to greylisting for email, geolocation makes it possible (i) to correlate detected attacks with new connections and (ii) consequently to classify traffic a priori as more suspicious (in particular, allowing this traffic to be inspected in more detail). Although numerous geolocation techniques exist, each strategy is subject to certain restrictions. Following the ideas of Endo et al., this publication tries to overcome these shortcomings with a combined solution of different methods to allow improved and optimized geolocation. We therefore present our architecture for improved geolocation, designing a new algorithm that combines several geolocation techniques to increase accuracy.
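One simple way to combine several geolocation estimates, in the spirit of the architecture described above, is a confidence-weighted average of the individual position fixes. This is only a toy sketch under our own assumptions; it ignores antimeridian wrap-around and is not the paper's actual combination algorithm.

```python
def combine_estimates(estimates):
    # Confidence-weighted average of (lat, lon, weight) position fixes
    # produced by different geolocation techniques; higher weight = more trusted.
    total_weight = sum(w for _, _, w in estimates)
    lat = sum(la * w for la, _, w in estimates) / total_weight
    lon = sum(lo * w for _, lo, w in estimates) / total_weight
    return lat, lon
```

Weights could, for instance, reflect each technique's historically measured accuracy, so that a precise delay-based measurement outvotes a coarse registry lookup.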
Abstract: Due to a high unemployment rate among local people
and a high reliance on expatriate workers, the governments in the
Gulf Co-operation Council (GCC) countries have been implementing
programmes of localisation (replacing foreign workers with GCC
nationals). These programmes have been successful in the public
sector but much less so in the private sector. However, there are now
insufficient jobs for locals in the public sector and the onus to provide
employment has fallen on the private sector. This paper is concerned
with a study, which is a work in progress (certain elements are
complete but not the whole study), investigating the effective
implementation of localisation policies in four- and five-star hotels in
the Kingdom of Saudi Arabia (KSA) and the United Arab Emirates
(UAE). The purpose of the paper is to identify the research gap, and
to present the need for the research. Further, it will explain how this
research was conducted.
Studies of localisation in the GCC countries are under-represented
in scholarly literature. Currently, the hotel sectors in KSA and UAE
play an important part in the countries’ economies. However, the
total proportion of Saudis working in the hotel sector in KSA is
slightly under 8%, and in the UAE, the hotel sector remains highly
reliant on expatriates. There is therefore a need for research on
strategies to enhance the implementation of the localisation policies
in general and in the hotel sector in particular.
Further, despite the importance of the hotel sector to their
economies, there remains a dearth of research into the
implementation of localisation policies in this sector. Indeed, as far as
the researchers are aware, there is no study examining localisation in
the hotel sector in KSA, and few in the UAE. This represents a
considerable research gap.
Regarding how the research was carried out, a multiple case study
strategy was used. The four- and five-star hotel sector in KSA is one
of the cases, while the four- and five-star hotel sector in the UAE is
the other case. Four- and five-star hotels in KSA and the UAE were
chosen as these countries have the longest established localisation
policies of all the GCC states and there are more hotels of these
classifications in these countries than in any of the other Gulf
countries. A literature review was carried out to underpin the
research. The empirical data were gathered in three phases. In order
to gain a pre-understanding of the issues pertaining to the research
context, Phase I involved eight unstructured interviews with officials
from the Saudi Commission for Tourism and Antiquities (three
interviewees); the Saudi Human Resources Development Fund (one);
the Abu Dhabi Tourism and Culture Authority (three); and the Abu
Dhabi Development Fund (one).
In Phase II, a questionnaire was administered to 24 managers and
24 employees in four- and five-star hotels in each country to obtain
their beliefs, attitudes, opinions, preferences and practices concerning
localisation.
Unstructured interviews were carried out in Phase III with six
managers in each country in order to allow them to express opinions
that may not have been explored in sufficient depth in the
questionnaire. The interviews in Phases I and III were analysed using
thematic analysis and SPSS will be used to analyse the questionnaire
data.
It is recommended that future research be undertaken on a larger
scale, with a larger sample taken from all over KSA and the UAE
rather than from only four cities (i.e., Riyadh and Jeddah in KSA and
Abu Dhabi and Sharjah in the UAE), as was the case in this research.
Abstract: Existing image coding standards generally degrade at low bit-rates because of the underlying block-based Discrete Cosine Transform scheme. Over the past decade, the success of wavelets in solving many different problems has contributed to their unprecedented popularity. Due to implementation constraints, scalar wavelets do not simultaneously possess all the properties, such as orthogonality, short support, linear phase symmetry, and a high order of approximation through vanishing moments, that are essential for signal processing. A new class of wavelets called multiwavelets, which possess more than one scaling function, overcomes this problem. This paper presents a new image coding scheme based on nonlinear approximation of multiwavelet coefficients combined with multistage vector quantization. The performance of the proposed scheme is compared with the results obtained from scalar wavelets.
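Nonlinear approximation, as used above, conventionally means keeping only the largest-magnitude transform coefficients and zeroing the rest. A minimal sketch of that step follows; the function name and the flat coefficient list are our illustration, not the paper's implementation.

```python
def nonlinear_approximation(coeffs, m):
    # Keep the m largest-magnitude coefficients, zero the rest:
    # the standard nonlinear (best m-term) approximation used in
    # transform-based image coding.
    keep = set(sorted(range(len(coeffs)), key=lambda i: abs(coeffs[i]), reverse=True)[:m])
    return [c if i in keep else 0.0 for i, c in enumerate(coeffs)]
```

The surviving coefficients (and their positions) are what a coder such as the proposed multistage vector quantizer would then encode.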
Abstract: A feasibility study for the design and construction of a
pilot plant for the extraction of castor oil in South Africa was
conducted. The study emphasized the four critical aspects of project
feasibility analysis, namely technical, financial, market and
managerial aspects. The technical aspect involved research on
existing oil extraction technologies, namely: mechanical pressing and
solvent extraction, as well as assessment of the proposed production
site for both short- and long-term viability of the project. The site is on the outskirts of Nkomazi village in the Mpumalanga province, where connections for water and electricity are currently underway. The potential raw material supply promises to be reliable, since the province is known for its commercial farming. The managerial aspect was
evaluated based on the fact that the current producer of castor oil will
be fully involved in the project while receiving training and technical
assistance from Sasol Technology, the TSC and SEDA. Market and
financial aspects were evaluated and the project was considered
financially viable with a Net Present Value (NPV) of R2 731 687 and
an Internal Rate of Return (IRR) of 18% at an annual interest rate of
10.5%. The payback time is 6 years for analysis over the first 10 years, with a net income of R1 971 000 in the first year. The project was thus found to be feasible, with a high chance of success, while contributing to socio-economic development. It was recommended that laboratory tests be conducted to establish the process kinetics to be used in the initial design of the plant.
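The NPV and IRR figures quoted above follow the standard discounted-cash-flow definitions. As a hedged sketch (the project's actual cash-flow schedule is not given in the abstract, so the example flows below are hypothetical), NPV and IRR can be computed as:

```python
def npv(rate, cash_flows):
    # Net present value; cash_flows[0] is the initial outlay at t = 0,
    # later entries are end-of-year net cash flows.
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0):
    # Internal rate of return by bisection: the discount rate at which
    # the NPV of the cash-flow series is zero.
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0.0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0
```

A project is conventionally considered financially viable when its NPV at the chosen discount rate (here 10.5%) is positive and its IRR (here 18%) exceeds that rate, which matches the feasibility conclusion above.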