Abstract: Meshing is the process of discretizing a problem domain
into many subdomains before a numerical calculation can be
performed. One of the most popular mesh types is the tetrahedral
mesh, owing to its flexibility to fit almost any domain shape. In
both 2D and 3D domains, triangular and tetrahedral meshes can be
generated using Delaunay triangulation. Mesh quality is an
important factor in any Computational Fluid Dynamics (CFD)
simulation, as the results are highly affected by it. Many efforts
have been made to improve mesh quality. This paper describes a mesh
generation routine that has been developed to generate high-quality
tetrahedral cells in arbitrarily complex geometries. A few CFD test
cases are used to test the mesh generator, and the resulting meshes
are compared with those generated by a commercial software package.
The results show that no slivers exist in the generated meshes, and
the overall quality is acceptable since the percentage of bad
tetrahedra is relatively small. Boundary recovery was also
completed successfully, with all missing faces rebuilt.
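The sliver check described above depends on a quantitative cell-quality measure. The abstract does not specify which metric the routine uses; the sketch below assumes one common choice, the normalized volume-to-edge-length ratio, which equals 1 for a regular tetrahedron and tends to 0 for slivers.

```python
import math

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def cross(a, b):
    return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])

def tet_quality(p0, p1, p2, p3):
    """Shape quality q = 6*sqrt(2)*V / l_rms**3: equals 1 for a regular
    tetrahedron and approaches 0 for a sliver (near-zero volume)."""
    volume = abs(dot(sub(p1, p0), cross(sub(p2, p0), sub(p3, p0)))) / 6.0
    edges = [sub(b, a) for a, b in
             [(p0, p1), (p0, p2), (p0, p3), (p1, p2), (p1, p3), (p2, p3)]]
    l_rms = math.sqrt(sum(dot(e, e) for e in edges) / 6.0)
    return 6.0 * math.sqrt(2.0) * volume / l_rms ** 3

# A regular tetrahedron scores 1; a nearly flat one scores close to 0.
regular = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]
sliver = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0.5, 0.5, 1e-3)]
```

In a mesh generator, cells whose quality falls below a small threshold (e.g. 0.05) would be flagged as slivers and targeted for local remeshing.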
Abstract: Medical Decision Support Systems (MDSSs) are sophisticated, intelligent systems that can perform inference under incomplete information and uncertainty. In such systems, uncertainty is modeled using various soft computing methods such as Bayesian networks, rough sets, artificial neural networks, fuzzy logic, inductive logic programming and genetic algorithms, as well as hybrid methods formed by combining several of these. In this study, symptom-disease relationships are presented in a framework modeled with formal concept analysis, with diseases as objects and symptoms as attributes. After a concept lattice is formed, Bayes' theorem can be used to determine the relationships between attributes and objects. A discernibility relation, which forms the basis of rough sets, can be applied to the attribute data sets in order to reduce attributes and decrease the computational complexity.
Abstract: It is widely acknowledged that there is a shortage of software developers, not only in South Africa, but also worldwide. Despite reports on a gap between industry needs and software education, the gap has mostly been explored in quantitative studies. This paper reports on the qualitative data of a mixed method study of the perceptions of professional software developers regarding what topics they learned from their formal education and the importance of these topics to their actual work. The analysis suggests that there is a gap between industry’s needs and software development education and the following recommendations are made: 1) Real-life projects must be included in students’ education; 2) Soft skills and business skills must be included in curricula; 3) Universities must keep the curriculum up to date; 4) Software development education must be made accessible to a diverse range of students.
Abstract: The six sigma method is a project-driven management approach to improving an organization's products, services, and processes by continually reducing defects. Understanding the key features, obstacles, and shortcomings of the six sigma method allows organizations to better support their strategic directions and to address their growing needs for coaching, mentoring, and training. It also provides opportunities to implement six sigma projects more effectively. The purpose of this paper is to survey the six sigma process and its impact on organizational productivity. To this end, the key concepts and the problem-solving process of six sigma are studied, along with important fields such as DMAIC, applied six sigma and productivity programmes, and other advantages of six sigma. The paper ends with the research conclusions, which indicate a direct and positive relation between six sigma and productivity.
Abstract: Arbitrarily shaped video objects are an important
concept in modern video coding methods. The techniques presently
used are based not on image elements but on video objects having an
arbitrary shape. In this paper, spatial shape error concealment
techniques for object-based video in error-prone environments are
proposed. We consider a geometric shape
representation consisting of the object boundary, which can be
extracted from the α-plane. Three different approaches are used to
replace a missing boundary segment: Bézier interpolation, Bézier
approximation, and NURBS approximation. Experimental results on
object shapes of varying concealment difficulty demonstrate the
performance of the proposed methods. Comparisons among the proposed
methods are also presented.
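As a rough illustration of the boundary-based concealment idea, the sketch below fills a missing boundary segment with a cubic Bézier curve whose interior control points extend the tangents of the surviving boundary on each side of the gap. The control-point choice and the de Casteljau evaluation are assumptions for illustration; the paper's interpolation, approximation and NURBS variants differ in how the curve is fitted.

```python
def bezier_point(ctrl, t):
    """Evaluate a Bézier curve at t in [0, 1] via de Casteljau."""
    pts = list(ctrl)
    while len(pts) > 1:
        pts = [((1-t)*x0 + t*x1, (1-t)*y0 + t*y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

def conceal_gap(before, after, n=10):
    """Fill a missing boundary segment with a cubic Bézier curve.
    `before`/`after` hold the last/first two known boundary points on
    each side of the gap; interior control points extend their tangents."""
    (xa1, ya1), (xa0, ya0) = before      # second-to-last, last known point
    (xb0, yb0), (xb1, yb1) = after       # first, second point after the gap
    c1 = (2*xa0 - xa1, 2*ya0 - ya1)      # continue the incoming tangent
    c2 = (2*xb0 - xb1, 2*yb0 - yb1)      # continue the outgoing tangent
    ctrl = [(xa0, ya0), c1, c2, (xb0, yb0)]
    return [bezier_point(ctrl, i / (n - 1)) for i in range(n)]

# Straight collinear boundary: the concealed segment stays on the line.
seg = conceal_gap([(0.0, 1.0), (1.0, 1.0)], [(3.0, 1.0), (4.0, 1.0)])
```

The reconstructed segment joins the surviving boundary with positional and tangent continuity at both ends, which is the usual goal of spatial shape concealment.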
Abstract: Despite many manufacturing safety success stories, many organizations are still reluctant to invest in safety, perceiving it as cost-increasing and time-consuming. A clear contributor may be the use of lagging indicators rather than leading-indicator measures. The study therefore proposes a combinatorial model for determining the best safety strategy. Combination theory and cost-benefit analysis were employed to develop a monetary saving/loss function in terms of the value of prevention and the cost of the prevention strategy. Documentation, interviews and a structured questionnaire were used to collect before-and-after safety programme records from a tobacco company for the periods 1993-2001 (pre-safety) and 2002-2008 (safety period) for the model application. Three combinatorial alternatives A, B and C were obtained, yielding 4, 6 and 4 strategies respectively, with PPE and training being predominant. A total of 728 accidents were recorded over the 9-year pre-safety period and 163 accidents over the 7-year safety period. Six prevention activities (alternative B) yielded the best results, which were experienced in all years of operation except 2004. The study provides a leading resource for planning a successful safety programme.
Abstract: Web applications have become very complex and crucial,
especially when combined with areas such as CRM (Customer
Relationship Management) and BPR (Business Process Reengineering).
The scientific community has therefore focused its attention on Web
application design, development, analysis, and testing, studying
and proposing methodologies and tools. This paper proposes an
approach to automatic multi-dimensional concern mining for Web
applications, based on concept analysis, impact analysis, and
token-based concern identification. This approach lets the user
analyse and traverse the Web software relevant to a particular
concern (concept, goal, purpose, etc.) via multi-dimensional
separation of concerns, in order to document, understand and test
Web applications. The technique was developed in the context of the
WAAT (Web Applications Analysis and Testing) project. A
semi-automatic tool to support it is currently under development.
Abstract: Creating 3D environments, including characters and
cities, is a significantly time-consuming process due to the large
amount of work involved in designing and modelling. There have been
a number of attempts to automatically generate 3D objects employing
shape grammars. However, it is still too early to apply the
mechanism to real problems such as real-time computer games. The
purpose of this research is to introduce a time-efficient and
cost-effective method to automatically generate various 3D objects
for real-time 3D games. This shape grammar-based real-time City
Generation (RCG) model is a conceptual model for generating 3D
environments in real-time and can be applied to 3D games or
animations. The RCG system can generate even a large city by
applying fundamental principles of shape grammars to building
elements at various levels of detail in real-time.
Abstract: Canola is a specific edible type of rapeseed, developed
in the 1970s, which contains about 40 percent oil. This research was
carried out to determine the yield and some quality characteristics of
some winter canola cultivars during the 2010-2011 vegetation period
in the Central Anatolia region of Turkey. The winter canola
varieties Oase, Dante, Californium, Excalibur, Elvis, ES Hydromel,
Licord, Orkan, Vectra, Nelson, Champlain and NK Petrol were used as
material. The field experiment was set up in a “Randomized Complete
Block Design” with three replications on 21 September 2010. Seed
yield, oil content, protein content, oil yield and protein yield
were examined.
As a result of this research, seed yield, oil content, oil yield
and protein yield showed significant differences between the
cultivars, whereas protein content did not. The highest seed yield
(6348 kg ha-1) was obtained from NK Petrol, while the lowest seed
yield (3949 kg ha-1) was obtained from the Champlain cultivar. The
highest oil content (46.73%) was observed in Oase and the lowest
value (41.87%) in the Vectra cultivar. The highest oil yield (2950
kg ha-1) was determined for NK Petrol, while the lowest value (1681
kg ha-1) was determined for the Champlain cultivar. The highest
protein yield (1539.3 kg ha-1) was obtained from NK Petrol and the
lowest protein yield (976.5 kg ha-1) from the Champlain cultivar.
The main purpose of cultivating oil crops is to increase the oil
yield per unit area. According to the results of this research, the
NK Petrol cultivar, which ranks first among the cultivars in both
seed yield and oil yield, is the most suitable winter canola
cultivar for local conditions.
Abstract: Optical Burst Switching (OBS) is a relatively new
optical switching paradigm. Contention and burst loss in OBS
networks are major concerns. To resolve contentions, an interesting
alternative to discarding the entire data burst is to partially drop the
burst. Partial burst dropping is based on the burst segmentation
concept, whose implementation is constrained by several technical
challenges, besides the complexity added to the algorithms and
protocols on both edge and core nodes. In this paper, the burst
segmentation concept is
investigated, and an implementation scheme is proposed and
evaluated. An appropriate dropping policy that effectively manages
the size of the segmented data bursts is presented. The dropping
policy is further supported by a new control packet format that
provides constant transmission overhead.
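Burst segmentation can be sketched in a few lines. The abstract does not fix a dropping policy, so the example below assumes tail dropping, one policy from the segmentation literature, in which the already-scheduled burst keeps only the segments that complete before the contending burst arrives.

```python
def segment_tail_drop(scheduled, incoming):
    """Tail-dropping burst segmentation (illustrative policy).
    A burst is (start_time, [segment_duration, ...]). When the incoming
    burst overlaps the scheduled one, the scheduled burst retains only
    the segments that finish before the incoming burst starts; the
    overlapping tail segments are dropped."""
    s_start, s_segs = scheduled
    i_start, _ = incoming
    kept, t = [], s_start
    for d in s_segs:
        if t + d <= i_start:   # segment completes before contention begins
            kept.append(d)
            t += d
        else:                  # this and all later segments are dropped
            break
    return (s_start, kept)

# A burst of four 1-ms segments starting at t=0 contends with a burst
# arriving at t=2.5 ms: the last two segments are dropped.
trimmed = segment_tail_drop((0.0, [1.0, 1.0, 1.0, 1.0]), (2.5, [1.0, 1.0]))
```

Dropping only the overlapping tail preserves part of the payload that would otherwise be lost with whole-burst discarding, at the cost of the extra signalling the proposed control packet format is designed to keep constant.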
Abstract: Different agricultural waste peels were assessed for
their suitability to be used as primary substrates for the
bioremediation of free cyanide (CN-) by a cyanide-degrading fungus
Aspergillus awamori isolated from cyanide containing wastewater.
The bioremediated CN- concentrations were in the range of 36 to 110
mg CN-/L, with Orange (C. sinensis) > Carrot (D. carota) > Onion
(A. cepa) > Apple (M. pumila), being chosen as suitable substrates
for large scale CN- degradation processes due to: 1) the high
concentration of bioremediated CN-, 2) total reduced sugars released
into solution to sustain the biocatalyst, and 3) minimal residual NH4-
N concentration after fermentation. The bioremediation rate
constants (k) were 0.017 h-1 (0 h < t < 24 h), with improved
bioremediation rates (0.02189 h-1) observed after 24 h. The average
nitrilase activity was ~10 U/L.
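The reported rate constants correspond to first-order decay, C(t) = C0·e^(-kt). A minimal sketch, assuming that model and the two piecewise constants quoted in the abstract:

```python
import math

def residual_cyanide(c0, t_hours):
    """Free-cyanide concentration (mg CN-/L) under first-order decay,
    using the piecewise rate constants reported in the abstract:
    k = 0.017 1/h for t <= 24 h, then k = 0.02189 1/h afterwards."""
    k1, k2 = 0.017, 0.02189
    if t_hours <= 24:
        return c0 * math.exp(-k1 * t_hours)
    c24 = c0 * math.exp(-k1 * 24)              # concentration at 24 h
    return c24 * math.exp(-k2 * (t_hours - 24))

# Starting from the upper reported concentration of 110 mg CN-/L.
c48 = residual_cyanide(110.0, 48.0)
```

Under these assumptions, roughly 60% of the cyanide is degraded within 48 hours; the actual time course depends on the substrate and the fungal biocatalyst.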
Abstract: This paper intends to identify the ethnic Kazakhstani
Koreans' political process of identity formation by exploring their
narratives and practices regarding the state language, as
represented in the course of their becoming citizens of a newly
independent state. The Russophone Kazakhstani Koreans' inability to
speak the official language of their state is considered here as
failing to satisfy a basic requirement of citizenship, so that they
are becoming marginalized from the public sphere. Their
contradictory attitude, which at once demonstrates nominal
acceptance and practical rejection of the obligatory state
language, unveils a high internal barrier between self-language and
other-language. In this paper, the ethnic Korean group's conflicting
linguistic identity is seen not as a free and simple choice, but as
a dynamic struggle and political process in which the subject's past
experiences and memories intersect with external elements of
pressure.
Abstract: The response of growth and yield of rainfed chickpea to
population density should be evaluated in long-term experiments so
as to account for climate variability. This is achievable only
through simulation. In this simulation study, the evaluation was done by
running the CYRUS model for long-term daily weather data of five
locations in Iran. The tested population densities were 7 to 59 (with
interval of 2) stands per square meter. Various functions, including
quadratic, segmented, beta, broken linear, and dent-like functions,
were tested. Considering root mean square of deviations and linear
regression statistics [intercept (a), slope (b), and correlation
coefficient (r)] for predicted versus observed variables, the quadratic
and broken linear functions appeared to be appropriate for describing
the changes in biomass and grain yield, and in harvest index,
respectively. Results indicated that in all locations, grain yield
tends to increase with population density but subsequently
decreases. This was also true for biomass in all five locations.
The harvest index appeared to plateau across low population
densities, but showed a decreasing trend as density increased
further. The turning point (optimum population density) for grain
yield was 30.68 stands per square meter in Isfahan, 30.54 in Shiraz,
31.47 in Kermanshah, 34.85 in Tabriz, and 32.00 in Mashhad. The
optimum population density for biomass ranged from 24.6 (in
Tabriz) to 35.3 stands per square meter (Mashhad). For harvest index
it varied between 35.87 and 40.12 stands per square meter.
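The turning points reported above follow directly from the fitted quadratic: for y = a·d² + b·d + c with a < 0, the optimum density is the vertex d* = -b/(2a). A minimal sketch with hypothetical coefficients (not the fitted values from the study):

```python
def quadratic_optimum(a, b):
    """Turning point of y = a*d**2 + b*d + c (with a < 0): d* = -b/(2a)."""
    return -b / (2.0 * a)

# Hypothetical coefficients, chosen only so that the vertex falls near
# the 30-35 stands/m2 optima reported in the abstract.
a, b, c = -1.2, 74.0, 900.0
d_opt = quadratic_optimum(a, b)          # optimum density, stands per m2
y_opt = a * d_opt**2 + b * d_opt + c     # yield at the optimum
```

The same vertex calculation applies at each location once the quadratic has been fitted to the simulated yield-density data; the broken linear function used for harvest index instead gives a plateau followed by a declining segment.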
Abstract: This article proposes a new methodology for SMEs (Small and Medium Enterprises) to characterize their performance in quality, highlighting weaknesses and areas for improvement. The methodology aims to identify the principal causes of quality problems and to help prioritize improvement initiatives. It is a self-assessment methodology intended to be easy to implement by companies with a low maturity level in quality. The methodology is organized in six steps, which include gathering information about predetermined processes and subprocesses of quality management, defined on the basis of the well-known Juran's trilogy for quality management (quality planning, quality control and quality improvement), and predetermined results categories, defined based on the quality concept. A set of tools for data collection and analysis, such as interviews, flowcharts, process analysis diagrams and Failure Mode and Effects Analysis (FMEA), is used. The article also presents the conclusions obtained from applying the methodology in two case studies.
Abstract: Since the pioneering work of Zadeh, fuzzy set theory has been applied to a myriad of areas. Song and Chissom introduced the concept of fuzzy time series and applied some methods to the enrollments of the University of Alabama. In recent years, a number of techniques have been proposed for forecasting based on fuzzy set theory methods. These methods have either used enrollment numbers or differences of enrollments as the universe of discourse. We propose using the year to year percentage change as the universe of discourse. In this communication, the approach of Jilani, Burney, and Ardil is modified by using the year to year percentage change as the universe of discourse. We use enrollment figures for the University of Alabama to illustrate our proposed method. The proposed method results in better forecasting accuracy than existing models.
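The proposed universe of discourse can be illustrated directly. The sketch below computes year-to-year percentage changes from a few enrollment figures commonly quoted in the fuzzy time series literature and bounds them with a small margin; the fuzzification and forecasting steps of the modified Jilani, Burney, and Ardil method are not shown.

```python
def pct_changes(series):
    """Year-to-year percentage changes, used as the universe of
    discourse instead of raw enrollments or their differences."""
    return [100.0 * (b - a) / a for a, b in zip(series, series[1:])]

def universe(changes, margin=1.0):
    """Bounding interval [min - margin, max + margin] of the changes,
    from which the fuzzy intervals would then be partitioned."""
    return (min(changes) - margin, max(changes) + margin)

# First few University of Alabama enrollment figures as commonly
# quoted in this literature (for illustration only).
enrollments = [13055, 13563, 13867, 14696]
changes = pct_changes(enrollments)
lo, hi = universe(changes)
```

Working in percentage changes keeps the universe of discourse on a comparable scale across years, which is the motivation the abstract gives for the modification.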
Abstract: While financial institutions have faced difficulties
over the years for a multitude of reasons, the major cause of serious
banking problems continues to be directly related to lax credit
standards for borrowers and counterparties, poor portfolio risk
management, or a lack of attention to changes in economic or other
circumstances that can lead to a deterioration in the credit standing of
a bank's counterparties. Credit risk is most simply defined as the
potential that a bank borrower or counterparty will fail to meet its
obligations in accordance with agreed terms. The goal of credit risk
management is to maximize a bank's risk-adjusted rate of return by
maintaining credit risk exposure within acceptable parameters. Banks
need to manage the credit risk inherent in the entire portfolio as well
as the risk in individual credits or transactions. Banks should also
consider the relationships between credit risk and other risks. The
effective management of credit risk is a critical component of a
comprehensive approach to risk management and essential to the
long-term success of any banking organization. In this research we
also study the relationship between credit risk indices and
borrowers' timely payback at Karafarin Bank.
Abstract: This paper adopts a notion of expectation-perception
gap of systems users as information systems (IS) failure. Problems
leading to the expectation-perception gap are identified and modelled
as five interrelated discrepancies or gaps throughout the process of
information systems development (ISD). It describes an empirical
study on how systems developers and users perceive the size of each
gap and the extent to which each problematic issue contributes to the
gap. The key to achieving success in ISD is to keep the
expectation-perception gap closed by closing all five pertaining
gaps. The gap model
suggests that most factors in IS failure are related to organizational,
cognitive and social aspects of information systems design.
Organization requirement analysis, being the weakest link of IS
development, is particularly worthy of investigation.
Abstract: The research focused on the design, development and
evaluation of a sustainable web-based network system to be used as
an interoperable environment for university process workflow and
document management. In this manner, most university process
workflows can be realized entirely electronically, promoting an
integrated university. Defining the most commonly used university
process workflows enabled the creation of electronic workflows and
their execution on standard workflow execution engines. The
definition or reengineering of workflows increased work efficiency
and helped standardize processes across different faculties. The
concept, the process definitions, and the solution applied as a
case study are evaluated, and the findings are reported.
Abstract: Classifier fusion may generate more accurate
classification than each of the basic classifiers. Fusion is often based
on fixed combination rules like the product, average etc. This paper
presents decision templates as a classifier fusion method for the
recognition of handwritten English and Farsi numerals (1-9). The
process involves extracting a feature vector from well-known image
databases. The extracted feature vector is fed to the multiple
classifier fusion scheme. A set of experiments was conducted to
compare decision templates (DTs) with some fixed combination rules.
Decision templates achieved recognition rates of 97.99% and 97.28%
for Farsi and English handwritten digits, respectively.
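For readers unfamiliar with decision templates: each class template is the mean decision profile (one row of soft outputs per base classifier) over that class's training samples, and a test sample is assigned to the class whose template its profile most resembles. A toy sketch with assumed numbers (not the paper's features or databases):

```python
def make_templates(profiles, labels, classes):
    """Decision template for each class: the mean decision profile
    over that class's training samples. A decision profile is a list
    of rows, one row of soft class outputs per base classifier."""
    templates = {}
    for c in classes:
        rows = [p for p, y in zip(profiles, labels) if y == c]
        n = float(len(rows))
        templates[c] = [[sum(dp[i][j] for dp in rows) / n
                         for j in range(len(rows[0][0]))]
                        for i in range(len(rows[0]))]
    return templates

def classify(dp, templates):
    """Assign the class whose template is nearest (squared Euclidean)."""
    def dist(t):
        return sum((t[i][j] - dp[i][j]) ** 2
                   for i in range(len(t)) for j in range(len(t[0])))
    return min(templates, key=lambda c: dist(templates[c]))

# Toy example: 2 base classifiers, 2 classes, soft outputs per class.
profiles = [
    [[0.9, 0.1], [0.8, 0.2]],   # training sample of class 0
    [[0.7, 0.3], [0.9, 0.1]],   # training sample of class 0
    [[0.2, 0.8], [0.1, 0.9]],   # training sample of class 1
    [[0.3, 0.7], [0.2, 0.8]],   # training sample of class 1
]
labels = [0, 0, 1, 1]
templates = make_templates(profiles, labels, classes=[0, 1])
pred = classify([[0.85, 0.15], [0.75, 0.25]], templates)   # → 0
```

Unlike fixed rules such as the product or average, the templates are learned from training data, which is what allows them to outperform fixed combination rules in experiments like those reported above.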
Abstract: The driving forces of international markets are changing
continuously; companies therefore need to gain a competitive edge
in such markets. Improving a company's products, processes and
practices is no longer an auxiliary activity. Lean production is a production
management philosophy that consolidates work tasks with minimum
waste resulting in improved productivity. Lean production practices
can be mapped into many production areas. One of these is
Manufacturing Equipment and Technology (MET). Many lean
production practices can be implemented in MET, namely, specific
equipment configurations, total preventive maintenance, visual
control, new equipment/ technologies, production process
reengineering, and a shared vision of perfection. The purpose of
this paper is to investigate the implementation level of these six
practices in Jordanian industries. To achieve that, a questionnaire
survey was designed according to a five-point Likert scale. The
questionnaire was validated through a pilot study and expert
review. A sample of 350 Jordanian companies was surveyed, with a
response rate of 83%. The respondents were asked to rate the extent
of implementation of each practice. A conceptual relationship
model is developed, hypotheses are proposed, and consequently the
essential statistical analyses are then performed. An assessment tool
that enables management to monitor the progress and the
effectiveness of lean practices implementation is designed and
presented. Consequently, the results show that the average
implementation level of lean practices in MET is 77%, that
Jordanian companies are successfully implementing the considered
lean production practices, and that the presented model has a
Cronbach's alpha value of 0.87, which is good evidence of model
consistency and validates the results.