Abstract: Meshing is the process of discretizing a problem domain into
many subdomains before a numerical calculation can be performed. One of
the most popular mesh types is the tetrahedral mesh, owing to its
flexibility to fit almost any domain shape. In both 2D and 3D domains,
triangular and tetrahedral meshes can be generated using Delaunay
triangulation. Mesh quality is an important factor in any Computational
Fluid Dynamics (CFD) simulation, as the results are highly affected by
it. Many efforts have been made to improve mesh quality. This paper
describes a mesh generation routine developed to generate high-quality
tetrahedral cells in arbitrarily complex geometries. A few CFD test
cases are used to test the mesh generator, and the resulting mesh is
compared with one generated by a commercial software package. The
results show that no slivers exist in the generated meshes, and the
overall quality is acceptable, since the percentage of bad tetrahedra
is relatively small. Boundary recovery was also completed successfully,
with all missing faces rebuilt.
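Since the abstract hinges on sliver detection and tetrahedral quality, one standard shape measure can be sketched as follows. The normalized-volume metric q = 6√2·V / L_max³ (1 for a regular tetrahedron, near 0 for a sliver) is a common choice in the mesh-quality literature, not necessarily the exact metric used by the paper:

```python
import math
from itertools import combinations

def tet_quality(p0, p1, p2, p3):
    """Shape quality q = 6*sqrt(2)*V / L_max^3: equals 1 for a regular
    tetrahedron and approaches 0 for a sliver (near-zero volume with
    edges of ordinary length)."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    u, v, w = sub(p1, p0), sub(p2, p0), sub(p3, p0)
    # Volume = |u . (v x w)| / 6
    cx = (v[1]*w[2] - v[2]*w[1],
          v[2]*w[0] - v[0]*w[2],
          v[0]*w[1] - v[1]*w[0])
    vol = abs(u[0]*cx[0] + u[1]*cx[1] + u[2]*cx[2]) / 6.0
    lmax = max(math.dist(a, b) for a, b in combinations((p0, p1, p2, p3), 2))
    return 6.0 * math.sqrt(2.0) * vol / lmax**3

# A regular tetrahedron scores 1; a nearly flat one scores near 0.
reg = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]
print(round(tet_quality(*reg), 6))  # 1.0
```

A mesh generator can reject or repair any cell whose quality falls below a chosen tolerance, which is how "no sliver exists" claims are typically verified.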
Abstract: Lately there has been a significant boost of interest in
music digital libraries, which constitute an attractive area of research
and development due to their inherent interesting issues and
challenging technical problems, solutions to which will be highly
appreciated by enthusiastic end-users. We present here a DL that we
have developed to support users in their quest for classical music
pieces within a particular collection of 18,000+ audio recordings.
To cope with the early DL model limitations, we have used a refined
socio-semantic and contextual model that allows rich bibliographic
content description, along with semantic annotations, reviewing,
rating, knowledge sharing etc. The multi-layered service model
allows incorporation of local and distributed information,
construction of rich hypermedia documents, expressing the complex
relationships between various objects and multi-dimensional spaces,
agents, actors, services, communities, scenarios, etc., and facilitates
collaborative activities to offer individual users the collections and
services they need.
Abstract: The Virtual Reality Modelling Language (VRML) is a description language belonging to the Window-on-World class of virtual reality systems. A file in VRML format can be interpreted by a VRML browser as a three-dimensional scene. VRML was created with the aim of representing virtual reality on the Internet more easily. The development of 3D graphics is closely connected with Silicon Graphics Corporation. VRML 2.0 is the file format for describing interactive 3D scenes and objects. It can be used in conjunction with the WWW to create complex 3D representations of scenes, products, or VR applications, and it can represent both static and animated objects. An interesting application of VRML is in the presentation of manufacturing systems.
Abstract: The growing senior population is gradually increasing the
demand for information and communication technology to support
satisfactory lives. This paper presents the development of an
integrated TV-based system that offers an opportunity to provide
value-added services to a large number of elderly citizens and thus
helps improve their quality of life. The design philosophy underlying
this paper is to address both technological and human aspects. The
balance between these two dimensions has recently been stressed as a
crucial element in the design of systems that are usable in practice,
particularly for the elderly, who may experience physical and mental
decline. As a first step, we identified human and social factors that
affect the elderly's quality of life through a literature review and,
based on them, built four fundamental services: information,
healthcare, learning, and social network services. Secondly, the system
architecture, the employed technologies, and the elderly-friendly
design considerations are presented, reflecting both technological and
human perspectives in the system design. Finally, we describe some
scenarios that illustrate the potential of the proposed system to
improve elderly people's quality of life.
Abstract: This study focuses on teamwork in Finnish working
life. Through a wide cross-section of teams the study examines the
causes to which team members attribute the outcomes of their teams.
Qualitative data was collected from 314 respondents. They wrote 616
stories to describe memorable experiences of success and failure in
teamwork. The stories revealed 1930 explanations. The findings
indicate that both favorable and unfavorable team outcomes are
perceived as being caused by the characteristics of team members,
relationships between members, team communication, team
structure, team goals, team leadership, and external forces. These
categories represent different attribution levels in the context of
organizational teamwork.
Abstract: Mammographic image and data analysis to facilitate modelling
or computer-aided diagnostic (CAD) software development is best done
using a common database that can handle various mammographic image
file formats and relate them to other patient information. This would
optimize the use of the data, as both primary reporting and enhanced
extraction of research data could be performed from a single dataset.
One desired improvement is the integration of DICOM file header
information into the database as an efficient and reliable source of
supplementary patient information intrinsically available in the
images. The purpose of this paper was to design a suitable database to
link and integrate different types of image files and gather common
information that can be used further for research purposes. An
interface was developed for accessing, adding, updating, modifying,
and extracting data from the common database, enhancing possible
future application of the data in CAD processing. Envisaged future
developments include an advanced search function that selects image
files based on descriptor combinations, whose results can be used for
specific CAD processing and other research, and a user-friendly
configuration utility for importing the required fields from the
DICOM files.
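The header-to-database integration described above can be sketched in outline with a relational table keyed by file path. The field names and values below are hypothetical placeholders; in practice a DICOM library (e.g. pydicom) would extract the real header attributes:

```python
import sqlite3

# Hypothetical header fields as a DICOM library might extract them.
header = {"patient_id": "P0001", "modality": "MG",
          "study_date": "20080115", "file_path": "/data/mg/P0001_L_CC.dcm"}

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE images (
    patient_id TEXT, modality TEXT, study_date TEXT,
    file_path TEXT PRIMARY KEY)""")
# Named placeholders map the header dict straight into the table.
con.execute("INSERT INTO images VALUES "
            "(:patient_id, :modality, :study_date, :file_path)", header)

# A descriptor-combination search, e.g. all mammograms for one patient.
rows = con.execute("SELECT file_path FROM images "
                   "WHERE patient_id = ? AND modality = ?",
                   ("P0001", "MG")).fetchall()
print(rows)  # [('/data/mg/P0001_L_CC.dcm',)]
```

Keying on file path keeps each image unique while allowing the same patient to own many images, which supports both primary reporting and research extraction from the one dataset.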
Abstract: Noise level has critical effects on the diagnostic
performance of the signal-averaged electrocardiogram (SAECG), because
the true starting and end points of the QRS complex can be masked by
residual noise and are sensitive to the noise level. Several studies
and commercial machines perform SAECG analysis using a fixed number of
heart beats (typically between 200 and 600) or a predefined noise
level (typically between 0.3 and 1.0 μV) in each of the X, Y, and Z
leads. However, different criteria or methods for performing SAECG
cause discrepancies in noise levels among study subjects. According to
the recommendations of the 1991 ESC, AHA, and ACC Task Force Consensus
Document on the use of SAECG, the determination of onset and offset is
closely related to the mean and standard deviation of the noise
sample. Hence, this study performs SAECG using consistent
root-mean-square (RMS) noise levels across study subjects and analyzes
the effects of noise level on SAECG. It also evaluates the differences
between normal subjects and chronic renal failure (CRF) patients in
the time-domain SAECG parameters.
The study subjects comprised 50 normal Taiwanese subjects and 20 CRF
patients. During signal-averaged processing, different RMS noise
levels were set to evaluate their effects on three time-domain
parameters: (1) filtered total QRS duration (fQRSD), (2) RMS voltage
of the last 40 ms of the QRS (RMS40), and (3) duration of the
low-amplitude signals below 40 μV (LAS40). The results demonstrated
that reducing the RMS noise level can increase fQRSD and LAS40,
decrease RMS40, and further increase the differences in fQRSD and
RMS40 between normal subjects and CRF patients. The SAECG may also
become abnormal as the RMS noise level is reduced. In conclusion, it
is essential to establish diagnostic criteria for SAECG using
consistent RMS noise levels to reduce noise level effects.
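The consistent-RMS-noise idea rests on two simple computations that can be sketched as follows. The √N noise-reduction law assumes the residual noise is uncorrelated between beats, which is the standard assumption behind signal averaging; the numbers in the demo are illustrative, not the paper's data:

```python
import math

def rms(segment):
    """Root-mean-square amplitude of a signal segment (e.g. a noise
    window taken after the QRS of the averaged beat)."""
    return math.sqrt(sum(x * x for x in segment) / len(segment))

def beats_for_noise_target(single_beat_noise_rms, target_rms):
    """Averaging N beats reduces uncorrelated noise by sqrt(N), so
    reaching a target residual noise level needs roughly
    (noise_single / target)^2 beats."""
    return math.ceil((single_beat_noise_rms / target_rms) ** 2)

print(rms([3.0, -3.0, 3.0, -3.0]))          # 3.0
print(beats_for_noise_target(10.0, 0.5))    # 400
```

Averaging until every subject reaches the same RMS noise level, rather than averaging a fixed beat count, is precisely what makes the fQRSD, RMS40, and LAS40 measurements comparable across subjects.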
Abstract: Processing of the electrocardiogram (ECG) signal consists
essentially in detecting the characteristic points of the signal,
which are an important tool in the diagnosis of heart diseases; the
most important of these is the detection of R waves. In this paper, we
present various mathematical tools for filtering the ECG, based on
digital filtering and Discrete Wavelet Transform (DWT) filtering. In
addition, the paper covers two main R-peak detection methods that
apply a windowing process: the first is based on derivative
calculations; the second is a time-frequency method based on the
Dyadic Wavelet Transform (DyWT).
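The derivative-based approach can be sketched minimally as follows: square the first difference to emphasize the steep QRS slopes, then pick maxima above a threshold while enforcing a refractory window. The threshold ratio and refractory period here are illustrative parameters, not values from the paper:

```python
def detect_r_peaks(ecg, fs, threshold_ratio=0.6, refractory_s=0.2):
    """Sketch of derivative-based R-peak detection. Squaring the first
    difference makes the sharp QRS upstroke dominate; a refractory
    window (no two beats closer than refractory_s seconds) suppresses
    double detections on a single complex."""
    slope = [(ecg[i + 1] - ecg[i]) ** 2 for i in range(len(ecg) - 1)]
    thr = threshold_ratio * max(slope)
    refractory = int(refractory_s * fs)
    peaks, last = [], -refractory
    for i, s in enumerate(slope):
        if s >= thr and i - last >= refractory:
            peaks.append(i)
            last = i
    return peaks

# Synthetic trace: flat baseline with two sharp "R" spikes (fs = 100 Hz)
fs = 100
ecg = [0.0] * 300
ecg[50] = 1.0
ecg[150] = 1.0
print(detect_r_peaks(ecg, fs))  # two indices, each near a spike
```

Real ECG would first pass through the band-pass or DWT filtering stage the abstract mentions, so that baseline wander and T waves do not contaminate the slope signal.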
Abstract: Historic preservation areas are extremely vulnerable to disasters because they are home to many vulnerable people and contain many closely spaced wooden houses. However, the narrow streets in these areas have historic meaning, which means that they cannot be widened and can become blocked easily during large disasters. Here, we describe our efforts to establish a methodology for planning evacuation routes in such historic preservation areas. In particular, this study aims to clarify the effectiveness of measures intended to secure two-way evacuation routes for vulnerable people during large disasters in a historic area preserved under the Cultural Properties Protection Law of Japan.
Abstract: The response of growth and yield of rainfed chickpea to
population density should be evaluated on the basis of long-term
experiments to account for climate variability, which is achievable
only through simulation. In this simulation study, the evaluation was
done by running the CYRUS model on long-term daily weather data for
five locations in Iran. The tested population densities ranged from 7
to 59 stands per square meter (with an interval of 2). Various
functions, including quadratic, segmented, beta, broken-linear, and
dent-like functions, were tested. Considering the root mean square of
deviations and the linear regression statistics [intercept (a), slope
(b), and correlation coefficient (r)] for predicted versus observed
variables, the quadratic and broken-linear functions appeared
appropriate for describing the changes in biomass and grain yield, and
in harvest index, respectively. Results indicated that in all
locations, grain yield tends to increase with population density but
subsequently decreases; the same was true for biomass in all five
locations. The harvest index appeared to plateau at low population
densities but decreased as density increased further. The turning
point (optimum population density) for grain yield was 30.68 stands
per square meter in Isfahan, 30.54 in Shiraz, 31.47 in Kermanshah,
34.85 in Tabriz, and 32.00 in Mashhad. The optimum population density
for biomass ranged from 24.6 (Tabriz) to 35.3 stands per square meter
(Mashhad); for harvest index it varied between 35.87 and 40.12 stands
per square meter.
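The turning-point calculation behind the quadratic-function results can be sketched as follows: fit y = ax² + bx + c by least squares (normal equations solved with Cramer's rule) and take the vertex x* = -b/(2a) as the optimum density. The yield numbers below are synthetic stand-ins, since the CYRUS model outputs are not reproduced here:

```python
def quadratic_fit(x, y):
    """Least-squares fit of y = a*x^2 + b*x + c via the 3x3 normal
    equations, solved with Cramer's rule (pure Python, no numpy)."""
    n = float(len(x))
    sx = sum(x); sx2 = sum(v**2 for v in x)
    sx3 = sum(v**3 for v in x); sx4 = sum(v**4 for v in x)
    sy = sum(y)
    sxy = sum(v * w for v, w in zip(x, y))
    sx2y = sum(v * v * w for v, w in zip(x, y))
    A = [[sx4, sx3, sx2], [sx3, sx2, sx], [sx2, sx, n]]
    r = [sx2y, sxy, sy]

    def det3(m):
        return (m[0][0] * (m[1][1]*m[2][2] - m[1][2]*m[2][1])
              - m[0][1] * (m[1][0]*m[2][2] - m[1][2]*m[2][0])
              + m[0][2] * (m[1][0]*m[2][1] - m[1][1]*m[2][0]))

    d = det3(A)
    cols = [[[r[i] if j == c else A[i][j] for j in range(3)]
             for i in range(3)] for c in range(3)]
    a, b, c = (det3(m) / d for m in cols)
    return a, b, c

# Synthetic yield response peaking at 31 stands/m^2 (illustrative only)
dens = list(range(7, 60, 2))
yields = [-0.5 * (d - 31)**2 + 900 for d in dens]
a, b, c = quadratic_fit(dens, yields)
print(round(-b / (2 * a), 2))  # optimum density: 31.0
```

The same vertex formula applied to the fitted per-location curves is what yields location-specific optima such as 30.68 stands per square meter for Isfahan.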
Abstract: The International Classification of Primary Care (ICPC), which belongs to the WHO Family of International Classifications (WHO-FIC), has low granularity, which is convenient for describing general medical practice. However, its lack of specificity means it is best used along with an interface terminology. An international survey was performed, using a questionnaire sent by email to experts from 25 countries, to describe the terminologies interfacing with ICPC. Eleven interface terminologies were identified, developed in Argentina, Australia, Belgium (2), Canada, Denmark, France, Germany, Norway, South Africa, and The Netherlands. Globally, these systems have so far been poorly assessed.
Abstract: A multi-agent system is developed here to predict
monthly details of the upcoming peak of the 24th solar magnetic
cycle. While studies typically predict the timing and magnitude of
cycle peaks using annual data, this one utilizes the unsmoothed
monthly sunspot number instead. Monthly numbers display more
pronounced fluctuations during periods of strong solar magnetic
activity than the annual sunspot numbers. Because strong magnetic
activities may cause significant economic damage, predicting
monthly variations should provide different and perhaps helpful
information for decision-making purposes. The multi-agent system
developed here operates in two stages. In the first, it produces twelve
predictions of the monthly numbers. In the second, it uses those
predictions to deliver a final forecast. Acting as expert agents, genetic
programming and neural networks produce the twelve fits and
forecasts as well as the final forecast. According to the results
obtained, the next peak is predicted to be 156 and is expected to
occur in October 2011, with an average of 136 for that year.
Abstract: This paper, dedicated to describing the effect of the
“significant other”, presents a new model of the interrelation between
self-reflection, the “significant other” phenomenon, and aggression.
Tendencies in the development of frustration-response direction and
type are discussed in detail. New results have been obtained through
the design of an original experiment based on modifications of
S. Rosenzweig's “Picture-Frustration Study” test.
Abstract: The characteristics of ad hoc networks, and even their existence, depend on the nodes forming them. Thus, services and applications designed for ad hoc networks should adapt to this dynamic and distributed environment. In particular, multicast algorithms with reliability and scalability requirements should abstain from centralized approaches. We aspire to define a reliable and scalable multicast protocol for ad hoc networks, and our target is to utilize epidemic techniques for this purpose. In this paper, we present a brief survey of epidemic algorithms for reliable multicasting in ad hoc networks and describe formulations and analytical results for simple epidemics. Then, the P2P anti-entropy algorithm for content distribution and our prototype simulation model are described, together with initial results demonstrating the behavior of the algorithm.
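The anti-entropy idea can be illustrated with a minimal push-pull sketch: each node periodically picks a random peer and both adopt the union of their message sets, so a message spreads epidemically in an expected O(log n) rounds. This is the generic textbook form; the paper's P2P variant for content distribution may differ in its peer selection and exchange details:

```python
import random

def anti_entropy_round(stores):
    """One push-pull anti-entropy round: every node gossips with one
    random peer and both keep the union of their message sets."""
    for node in stores:
        peer = random.choice([s for s in stores if s is not node])
        union = node | peer
        node.update(union)   # pull: node learns what peer had
        peer.update(union)   # push: peer learns what node had

# One node injects a message; repeated rounds spread it to all nodes.
stores = [set() for _ in range(16)]
stores[0].add("m1")
rounds = 0
while not all("m1" in s for s in stores):
    anti_entropy_round(stores)
    rounds += 1
print(rounds)  # typically only a few rounds for 16 nodes
```

Because every exchange is pairwise and random, there is no central coordinator to fail, which is exactly the reliability and scalability property sought for ad hoc networks.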
Abstract: This paper adopts a notion of expectation-perception
gap of systems users as information systems (IS) failure. Problems
leading to the expectation-perception gap are identified and modelled
as five interrelated discrepancies or gaps throughout the process of
information systems development (ISD). It describes an empirical
study on how systems developers and users perceive the size of each
gap and the extent to which each problematic issue contributes to the
gap. The key to achieving success in ISD is to keep the
expectation-perception gap closed by closing all five constituent
gaps. The gap model suggests that most factors in IS failure are
related to organizational, cognitive, and social aspects of
information systems design. Organizational requirement analysis, being
the weakest link of IS development, is particularly worthy of
investigation.
Abstract: The performance of the time-reversal MUSIC algorithm degrades dramatically in the presence of strong noise and multiple scattering (i.e. when scatterers are close to each other), owing to errors in determining the number of scatterers. This paper provides a new approach to alleviate this problem using an information-theoretic criterion referred to as the minimum description length (MDL). The merits of the approach are confirmed by numerical examples, which indicate that time-reversal MUSIC yields accurate estimates of the target locations even with considerable noise and multiple scattering in the received signals.
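A standard form of the MDL criterion for this kind of model-order selection is the Wax-Kailath estimator, which picks the number of sources k minimizing a likelihood term (how far the smallest eigenvalues of the sample covariance are from being equal) plus a complexity penalty. This is the classical criterion; the paper's exact variant may differ:

```python
import math

def mdl_num_sources(eigvals, n_snapshots):
    """Wax-Kailath MDL estimate of the number of sources from the
    eigenvalues of the sample covariance matrix (sorted descending).
    The likelihood term is zero when the M-k smallest eigenvalues are
    equal (pure noise); the penalty term grows with model order k."""
    m = len(eigvals)
    best_k, best_val = 0, float("inf")
    for k in range(m):
        tail = eigvals[k:]
        p = m - k
        geo = math.exp(sum(math.log(v) for v in tail) / p)
        ari = sum(tail) / p
        val = (-n_snapshots * p * math.log(geo / ari)
               + 0.5 * k * (2 * m - k) * math.log(n_snapshots))
        if val < best_val:
            best_k, best_val = k, val
    return best_k

# Two strong eigenvalues over a flat noise floor -> two scatterers.
print(mdl_num_sources([10.0, 8.0, 1.0, 1.0, 1.0, 1.0], 100))  # 2
```

Feeding the estimated k into the signal/noise subspace split is what stabilizes the MUSIC pseudospectrum when noise or multiple scattering blurs the eigenvalue gap.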
Abstract: Monitoring lightning electromagnetic pulses (sferics)
and other terrestrial as well as extraterrestrial transient radiation
signals is of considerable interest for practical and theoretical
purposes in astro- and geophysics as well as meteorology. To manage a
continuous flow of data, automation of the detection and classification
process is important. Features based on a combination of wavelet and
statistical methods proved efficient for the analysis and
characterisation of transients, and as input into a radial basis
function network trained to discriminate transients ranging from
pulse-like to wave-like.
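One plausible wavelet-plus-statistics feature of the kind described is the per-scale detail energy of a Haar decomposition: pulse-like transients concentrate energy at fine scales, wave-like ones at coarse scales. The paper's actual feature set is not specified, so this is only an illustrative sketch:

```python
def haar_step(signal):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation) and pairwise half-differences (detail)."""
    half = len(signal) // 2
    approx = [(signal[2*i] + signal[2*i + 1]) / 2 for i in range(half)]
    detail = [(signal[2*i] - signal[2*i + 1]) / 2 for i in range(half)]
    return approx, detail

def detail_energies(signal, levels=3):
    """Energy of the detail coefficients at each scale: a compact
    feature vector for feeding a classifier such as an RBF network."""
    feats, a = [], list(signal)
    for _ in range(levels):
        a, d = haar_step(a)
        feats.append(sum(x * x for x in d))
    return feats

pulse = [1.0] + [0.0] * 7
print(detail_energies(pulse))  # energy concentrated at the finest scale
```

A smoothly varying (wave-like) signal shows the opposite profile, with most detail energy at the coarsest scale, which is why such energies separate the two transient classes.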
Abstract: The purpose of the experiments described in this article was the comparison of an integrated fixed-film activated sludge (IFAS) system and an activated sludge (AS) system. The IFAS system used cigarette filter rods (waste filters from tobacco factories) as the biofilm carrier. The comparison with activated sludge was performed in two parallel treatment lines. Organic substance, ammonia, and TP removal were investigated over a four-month period. Synthetic wastewater was prepared with ordinary tap water and glucose as the main sources of carbon and energy, plus balanced macro- and micronutrients. COD removal percentages of 94.55% and 81.62% were achieved for the IFAS and activated sludge systems, respectively. Ammonia concentration also decreased significantly with increasing HRT in both systems; average ammonia removals of 97.40% and 96.34% were achieved for the IFAS and activated sludge systems, respectively. The removal efficiency of total phosphorus (TP) was 60.64% for IFAS, higher than the 56.63% achieved by the AS process.
Abstract: A simple method for the simultaneous determination of
hippuric acid and benzoic acid in urine using reversed-phase
high-performance liquid chromatography is described. Chromatography
was performed on a Nova-Pak C18 (3.9 x 150 mm) column with a mobile
phase of methanol:water:acetic acid (20:80:0.2) and UV detection at
254 nm. The calibration curve was linear over the concentration range
of 0.125 to 6.0 mg/ml for both hippuric acid and benzoic acid. The
recovery, accuracy, and coefficient of variation were 104.54%, 0.2%,
and 0.2%, respectively, for hippuric acid, and 98.48%, 1.25%, and
0.60%, respectively, for benzoic acid. The detection limit of the
method was 0.01 ng/l for hippuric acid and 0.06 ng/l for benzoic acid.
The method has been applied to the analysis of urine samples from
suspected toluene abusers and glue sniffers among secondary school
students in Johor Bahru.
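The linear-calibration step underlying such quantification can be sketched as follows: fit a least-squares line of detector response against standard concentrations, then invert it for unknown samples. The standard concentrations and peak areas below are illustrative placeholders, not the paper's data:

```python
def linear_calibration(conc, response):
    """Least-squares calibration line response = m*conc + b, computed
    from closed-form slope/intercept formulas (pure Python)."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(response) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(conc, response))
         / sum((x - mx) ** 2 for x in conc))
    b = my - m * mx
    return m, b

conc = [0.125, 0.5, 1.0, 2.0, 4.0, 6.0]      # standards, mg/ml
area = [2.6, 10.1, 20.3, 39.8, 80.2, 119.9]  # detector response (a.u.)
m, b = linear_calibration(conc, area)
unknown_area = 50.0
print(round((unknown_area - b) / m, 2))  # estimated concentration, mg/ml
```

In validated methods the same fit also supplies the linearity check (correlation coefficient) and, via the residual scatter near zero, the detection limit.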
Abstract: The self-organizing map (SOM) provides both clustering and visualization capabilities in data mining. Dynamic self-organizing maps such as the Growing Self-Organizing Map (GSOM) have been developed to overcome the fixed structure of the SOM and enable better representation of the discovered patterns. However, when mining large datasets or historical data, the hierarchical structure of the data is also useful for viewing cluster formation at different levels of abstraction. In this paper, we present a technique to generate concept trees from the GSOM. The formation of trees from different spread factor values of the GSOM is also investigated, and the quality of the trees is analyzed. The results show that concept trees can be generated from the GSOM, thus eliminating the need to re-cluster the data from scratch to obtain a hierarchical view of the data under study.
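The spread factor's role can be made concrete with the growth-threshold relation from the original GSOM formulation, GT = -D·ln(SF), where D is the data dimensionality and SF ∈ (0, 1). A low spread factor gives a high threshold (a coarse map with few nodes), a high one a fine map, which is what lets trees be generated at different abstraction levels:

```python
import math

def growth_threshold(dim, spread_factor):
    """GSOM growth threshold GT = -D * ln(SF): a node spawns new
    neighbours once its accumulated quantization error exceeds GT,
    so a smaller SF (larger GT) yields a coarser, more abstract map."""
    if not 0.0 < spread_factor < 1.0:
        raise ValueError("spread factor must be in (0, 1)")
    return -dim * math.log(spread_factor)

# Thresholds for 10-dimensional data at three abstraction levels.
for sf in (0.1, 0.5, 0.9):
    print(sf, round(growth_threshold(10, sf), 3))
```

Running the GSOM at a ladder of spread factors and linking each coarse node to the finer nodes it subsumes is, in outline, how a concept tree over the same data can be assembled without re-clustering from scratch.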