Abstract: The purpose of this study was to analyze the correlation between permitted building areas and housing distribution ratios, together with their fluctuation, and to test a distribution model across three successive governments in five cities, including Bucheon, using time-series administrative data. The results were then interpreted in light of the policies pursued by each government in order to examine the structural fluctuation of permitted building areas and housing distribution ratios.
Spectral analysis was performed to analyze the fluctuation of permitted building areas and housing distribution ratios under the three governments and to examine the cycles in the time-series data; tabulation was used to describe the correlation between permitted building areas and housing distribution ratios statistically; and a goodness-of-fit test was conducted to explain the differences in the fluctuation distributions of permitted building areas and housing distribution ratios among the three governments.
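The goodness-of-fit step can be sketched with a minimal example. This assumes a Pearson chi-square statistic (the abstract does not name the particular test), and the category counts below are hypothetical, not the study's data:

```python
def chi_square_statistic(observed, expected):
    """Pearson's chi-square statistic: sum of (O - E)^2 / E over all cells."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts of building-permit categories under one government,
# compared against the counts implied by the null hypothesis
observed = [48, 35, 17]
expected = [40, 40, 20]

stat = chi_square_statistic(observed, expected)

# Tabulated chi-square critical value for df = 2 at the 5% level
CRITICAL_5PCT_DF2 = 5.991
reject_null = stat > CRITICAL_5PCT_DF2
```

The statistic is compared against the tabulated critical value for the appropriate degrees of freedom; with these illustrative counts the null distribution would not be rejected.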
Abstract: The highly nonlinear characteristics of drying processes have prompted researchers to seek new nonlinear control solutions. However, the relationship between implementation complexity, on-line processing complexity, control-structure reliability and controller performance is not well established. This paper proposes high-performance nonlinear fuzzy controllers for real-time operation of a drying machine, developed under a consistent match between those issues. A PCI-6025E data acquisition device from National Instruments® was used, and the control system was fully designed in the MATLAB®/SIMULINK® language. The drying parameters, namely relative humidity and temperature, were controlled through MIMO Hybrid Bang-bang+PI (BPI) and four-dimensional Fuzzy Logic (FLC) real-time controllers to perform drying tests on biological materials. The performance of the drying strategies was compared through several criteria, which are reported without controller retuning. The performance analysis showed that the FLC performed much better than the BPI controller: the absolute errors were lower than 8.85% for the Fuzzy Logic Controller, about three times lower than the experimental results with BPI control.
Abstract: One major issue that is regularly cited as a block to
the widespread use of online assessments in eLearning is that of the
authentication of the student and the level of confidence that an
assessor can have that the assessment was actually completed by that
student. Currently, this issue is either ignored, in which case
confidence in the assessment and any ensuing qualification is
damaged, or else assessments are conducted at central, controlled
locations at specified times, losing the benefits of the distributed
nature of the learning programme. Particularly as we move towards
constructivist models of learning, with intentions towards achieving
heutagogic learning environments, the benefits of a properly
managed online assessment system are clear. Here we discuss some
of the approaches that could be adopted to address these issues,
looking at the use of existing security and biometric techniques,
combined with some novel behavioural elements. These approaches
offer the opportunity to validate the student on accessing an
assessment, on submission, and also during the actual production of
the assessment. These techniques are currently under development in
the DECADE project, and future work will evaluate and report on their
use.
Abstract: Grid computing is a group of clusters connected over
high-speed networks that involves coordinating and sharing
computational power, data storage and network resources operating
across dynamic and geographically dispersed locations. Resource
management and job scheduling are critical tasks in grid computing.
Resource selection becomes challenging due to heterogeneity and
dynamic availability of resources. Job scheduling is an NP-complete problem, and different heuristics may be used to reach an optimal or near-optimal solution. This paper proposes a model for resource and job scheduling in a dynamic grid environment. The main focus is to maximize resource utilization and minimize the processing time of jobs. The grid resource selection strategy is based on a Max Heap Tree (MHT), which is well suited to large-scale applications; the root node of the MHT is selected for job submission. A job grouping concept is used to maximize resource utilization when scheduling jobs in grid computing. The proposed resource selection model and job grouping concept are used to enhance the scalability, robustness, efficiency and load-balancing ability of the grid.
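The resource-selection and job-grouping ideas can be sketched as follows. This is a minimal illustration rather than the paper's actual MHT algorithm: the capacity key, cluster names and job sizes are hypothetical, and Python's min-heap `heapq` is used with negated keys to obtain max-heap behaviour:

```python
import heapq

def build_resource_heap(resources):
    """Arrange resources as a max-heap keyed on computing capacity, so the
    root is always the most capable resource. heapq is a min-heap, so
    capacities are negated."""
    heap = [(-capacity, name) for name, capacity in resources.items()]
    heapq.heapify(heap)
    return heap

def group_jobs(jobs, capacity):
    """Greedily pack small jobs into groups no larger than the selected
    resource's capacity, so each dispatch carries a full workload."""
    groups, current, used = [], [], 0
    for job in jobs:
        if used + job > capacity and current:
            groups.append(current)
            current, used = [], 0
        current.append(job)
        used += job
    if current:
        groups.append(current)
    return groups

# Hypothetical clusters with capacities (e.g. available MIPS)
resources = {"cluster_a": 500, "cluster_b": 1200, "cluster_c": 800}
heap = build_resource_heap(resources)
capacity, root = -heap[0][0], heap[0][1]   # root of the max-heap receives jobs
groups = group_jobs([300, 400, 250, 500, 200], capacity)
```

The root of the heap ("cluster_b" here) is the submission target, and the five small jobs are bundled into two groups that each fit within its capacity.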
Abstract: We present a method for fast volume rendering using graphics hardware (GPU). To our knowledge, it is the first implementation of the Shear-Warp algorithm on the GPU. Our GPU-based method provides real-time frame rates and outperforms the CPU-based implementation. When the number of slices is not sufficient, we add in-between slices computed by interpolation, which improves the quality of the rendered images. We have also implemented the ray marching algorithm on the GPU. The results generated by the three algorithms (CPU-based and GPU-based Shear-Warp, GPU-based Ray Marching) for two test models show that the ray marching algorithm outperforms the shear-warp methods in terms of both speed-up and image quality.
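The in-between-slice idea amounts to linear interpolation between adjacent volume slices. A minimal CPU-side sketch (on the GPU this would typically be a hardware-filtered texture fetch; the slice data here are illustrative):

```python
def interpolate_slice(slice_a, slice_b, t):
    """Linearly blend two adjacent volume slices; t in [0, 1] is the position
    of the in-between slice (0 -> slice_a, 1 -> slice_b)."""
    return [[(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(slice_a, slice_b)]

s0 = [[0.0, 10.0], [20.0, 30.0]]   # two tiny 2x2 density slices
s1 = [[10.0, 20.0], [30.0, 40.0]]
mid = interpolate_slice(s0, s1, 0.5)   # slice halfway between s0 and s1
```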
Abstract: The sub-prime mortgage crisis that began in the US is regarded as the most severe economic crisis since the Great Depression of the early 20th century. In particular, hidden problems in the efficient operation of businesses were disclosed all at once, and many financial institutions went bankrupt or filed for court receivership. The collapse of the physical market led to the bankruptcy of manufacturing and construction businesses. This study analyzes the dynamic efficiency of construction businesses during the five years around the global financial crisis. By uncovering the trend and stability of the efficiency of construction businesses, this study aims to improve their management efficiency in the ever-changing construction market. Variables were selected by analyzing corporate information on the top 20 construction businesses in Korea, which were analyzed for static efficiency in 2008 and dynamic efficiency between 2006 and 2010. Unlike other studies, this study deduces the efficiency trend and stability of construction businesses over five years by using the DEA/Window model. Using the analysis results, efficient and inefficient companies could be identified. In addition, relative efficiency among DMUs was measured by comparing the relationship between the input and output variables of the construction businesses. This study can serve as a reference for improving the management efficiency of companies with low efficiency, based on the efficiency analysis of construction businesses.
Abstract: Nanophotocatalysts such as titanium (TiO2), zinc (ZnO), and iron (Fe2O3) oxides can be used in the oxidation of organic pollutants, among many other applications. However, among the challenges for the technological application (scale-up) of these scientific developments in nanotechnology, two aspects remain little explored: research on the environmental risk of nanomaterial preparation methods, and the study of the variability of nanomaterial properties and/or performance. An environmental analysis was performed for six different methods of ZnO nanoparticle synthesis and showed that it is possible to identify the most environmentally compatible process even in laboratory-scale research. The obtained ZnO nanoparticles were tested as photocatalysts and increased the degradation rate of the Rhodamine B dye by up to 30 times.
Abstract: Iterative learning control (ILC) aims to achieve zero tracking error of a specific command. This is accomplished by iteratively adjusting the command given to a feedback control system, based on the tracking error observed in the previous iteration. One would like the iterations to converge to zero tracking error in spite of any error present in the model used to design the learning law. First, this need for stability robustness is discussed, followed by the need for robustness of the property that the transients are well behaved. Methods of producing the needed robustness to parameter variations and to singular perturbations are presented. Then a method involving reverse-time runs is given that lets the real-world behavior produce the ILC gains in such a way as to eliminate the need for a mathematical model. Since the real world is producing the gains, there is no issue of model error. Provided the world behaves linearly, the approach gives an ILC law with both stability robustness and good transient robustness, without the need to generate a model.
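The basic ILC iteration can be illustrated on a toy problem. This sketch assumes a hypothetical first-order discrete plant and the simple learning law u_{k+1}[t] = u_k[t] + gain * e_k[t]; it is not the reverse-time-run method of the paper, only the update-from-previous-error idea stated at the start of the abstract:

```python
def run_plant(u):
    """Hypothetical first-order discrete plant: y[t] = 0.5*y[t-1] + u[t]."""
    y, out = 0.0, []
    for ut in u:
        y = 0.5 * y + ut
        out.append(y)
    return out

def ilc_iteration(u, y_desired, gain=1.0):
    """One learning pass: u_{k+1}[t] = u_k[t] + gain * e_k[t], where e_k is
    the tracking error observed on the previous run of the plant."""
    y = run_plant(u)
    e = [yd - yt for yd, yt in zip(y_desired, y)]
    u_next = [ut + gain * et for ut, et in zip(u, e)]
    return u_next, max(abs(et) for et in e)

y_des = [1.0] * 10                 # step command to track
u = [0.0] * 10                     # start with no knowledge of the plant
errors = []
for _ in range(15):
    u, err = ilc_iteration(u, y_des)
    errors.append(err)
# gain = 1.0 happens to give deadbeat convergence for this particular toy plant
```

The maximum tracking error shrinks from 1.0 on the first pass to essentially zero, without the learning law ever using a model of the plant beyond the recorded error.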
Abstract: This paper presents an intrusion detection system based on a hybrid neural network model combining RBF and Elman networks. It is used for both anomaly detection and misuse detection. The model has a memory function and can effectively detect discrete and temporally related aggressive behavior. The RBF network acts as a real-time pattern classifier, while the Elman network provides memory of former events. The intrusion detection system based on this hybrid model is evaluated on the DARPA data set, and ROC curves are used to display the test results intuitively. The experiments show that this hybrid-model intrusion detection system can effectively improve the detection rate and reduce the false alarm and failure rates.
Abstract: Electrocardiogram (ECG) segmentation is necessary to help reduce the time-consuming task of manually annotating ECGs. Several algorithms have been developed to segment the ECG automatically. We first review several such methods, and then present a new single-lead segmentation method based on adaptive piecewise constant approximation (APCA) and piecewise derivative dynamic time warping (PDDTW). The results are tested on the QT database. We compared our results to Laguna's two-lead method. Our proposed approach has a comparable mean error, but yields a slightly higher standard deviation than Laguna's method.
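The warping component can be sketched with the classic dynamic-programming recurrence applied to first differences. This shows only the core idea behind derivative-based DTW, not the full APCA+PDDTW pipeline, and the sequences below are hypothetical miniature waveforms:

```python
def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two sequences, filled
    in O(len(a)*len(b)) by dynamic programming."""
    INF = float("inf")
    n, m = len(a), len(b)
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def derivative(seq):
    """First differences, a crude stand-in for the derivative estimate used
    by derivative-based DTW variants."""
    return [seq[i + 1] - seq[i] for i in range(len(seq) - 1)]

# Tiny illustrative "beats": the second is the first with a stretched onset
template = [0.0, 0.1, 1.2, 0.2, 0.0]
beat     = [0.0, 0.0, 0.1, 1.2, 0.2, 0.0]
score = dtw_distance(derivative(template), derivative(beat))
```

Because DTW aligns on shape rather than absolute time, the stretched beat still matches the template almost perfectly; aligning the derivatives instead of the raw samples makes the match focus on slope features such as QRS edges.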
Abstract: The termination mechanism is an indispensable part of the emergency management mechanism. Despite its importance in both theory and practice, it is almost a brand-new field of research. This paper first proposes the concept of a termination mechanism, and then discusses the design and implementation issues that help guarantee the effectiveness and integrity of emergency management. Starting with an introduction to the problems caused by absent or incorrect termination, the essence of the termination mechanism is analyzed, a model based on optimal stopping theory is constructed, and a termination index is given. The model can be applied to find the best termination time point. The termination decision should be considered not only in the termination stage but throughout the whole emergency management process, which makes it a dynamic decision-making process. In addition, the main actors and the procedure of termination are illustrated once the termination time point is given. Some future work is discussed last.
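The optimal-stopping principle behind the model can be illustrated by finite-horizon backward induction. The payoff profile and per-stage carrying cost below are hypothetical, and the paper's actual termination index is not specified in the abstract; this is only a sketch of the backward-induction idea:

```python
def optimal_stopping_time(stop_values, carry_cost):
    """Backward induction over a finite horizon: at stage t, terminate if
    stopping now is worth at least as much as continuing one more stage
    (the continuation value minus a per-stage carrying cost)."""
    T = len(stop_values)
    value = stop_values[-1]          # the response must stop at the final stage
    best_t = T - 1
    for t in range(T - 2, -1, -1):
        continuation = value - carry_cost
        if stop_values[t] >= continuation:
            value, best_t = stop_values[t], t
        else:
            value = continuation
    return best_t, value

# Hypothetical "benefit of terminating now" profile of an emergency response
stop_values = [2.0, 5.0, 9.0, 10.0, 10.5]
t_star, v = optimal_stopping_time(stop_values, carry_cost=1.0)
```

With this profile the small late gains no longer cover the carrying cost, so the rule terminates at stage 2 even though the raw payoff keeps rising.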
Abstract: The use of hard and brittle materials has become increasingly extensive in recent years, so processing these materials for parts fabrication has become a challenging problem. It is time-consuming to machine hard, brittle materials with traditional metal-cutting techniques that use abrasive wheels, and the tool suffers excessive wear as well. However, if ultrasonic energy is applied to the machining process and coupled with the use of hard abrasive grits, hard and brittle materials can be machined effectively. The ultrasonic machining process is mostly used for brittle materials. The present research work has developed finite element models to predict the mechanical stresses and strains produced in the tool during the ultrasonic machining process. The flow behavior of the abrasive slurry coming out of the nozzle has also been studied in simulation using the ANSYS CFX module. Abrasives of different grit sizes have been used for the experimental work.
Abstract: Sickness absence represents a major economic and social issue. Analysis of sick-leave data is a recurrent challenge to analysts because of the complexity of the data structure, which is often time-dependent, highly skewed and clumped at zero. Ignoring these features when making statistical inference is likely to be inefficient and misguided, and traditional approaches do not address these problems. In this study, we discuss model methodologies in terms of statistical techniques for addressing the difficulties with sick-leave data. We also introduce and demonstrate a new method by performing a longitudinal assessment of long-term absenteeism, using as a working example a large registration dataset from the Helsinki Health Study on municipal employees in Finland during the period 1990-1999. We present a comparative study on model selection and a critical analysis of the temporal trends and the occurrence and degree of long-term sickness absence among municipal employees. The strengths of this working example include the large sample size over a long follow-up period, providing strong evidence in support of the new model. Our main goal is to propose a way to select an appropriate model, to introduce a new methodology for analysing sickness absence data, and to demonstrate the model's applicability to complicated longitudinal data.
Abstract: This research was conducted using the self-reports of shoplifters apprehended in supermarkets while stealing. Over three years, 943 shoplifters were interviewed right after the act of stealing and before the police were called. The aim of the study is to characterize shoplifting in Saudi Arabia, including the traits of shoplifters and the circumstances of the supermarkets where the stealing takes place. The analysis was based on the written information about each thief, following the documentary research method. Descriptive statistics as well as some inferential statistics were employed. The results show that there are differences between genders, age groups, occupations, times of day, days of the week, months, ways of stealing, and individual versus group thieves, as well as other supermarket circumstances, in the type of items stolen, their total price and the number of items. The results and recommendations will serve as a guide for retailers on where, when and whom to watch to prevent shoplifting.
Abstract: Metal stamping die design is a complex, experience-based and time-consuming task. Various artificial intelligence (AI) techniques are being used by researchers worldwide for stamping die design to reduce the complexity, the dependence on human expertise and the time taken in the design process, as well as to improve design efficiency. This paper presents a comprehensive review of applications of AI techniques in the manufacturability evaluation of sheet metal parts, die design and process planning for metal stamping dies. Further, the salient features of major research work published in the area of metal stamping are presented in tabular form, and the scope of future research work is identified.
Abstract: An advanced Monte Carlo simulation method, called Subset Simulation (SS), is presented in this paper for the time-dependent reliability prediction of underground pipelines. SS provides better resolution at low failure-probability levels by efficiently investigating the rare failure events that are commonly encountered in pipeline engineering applications. In the SS method, random samples leading to progressive failure are generated efficiently and used to compute probabilistic performance through statistical variables. SS gains its efficiency by expressing a small-probability event as a product of a sequence of intermediate events with larger conditional probabilities. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment. It is hoped that this development work can promote the use of SS tools for uncertainty propagation in the decision-making process of underground pipeline network reliability prediction.
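The SS mechanism of intermediate events with larger conditional probabilities can be sketched for a one-dimensional standard-normal input. This is a minimal illustration with a simple Metropolis chain, not the paper's pipeline reliability model; the limit-state function, sample sizes and conditional probability p0 below are illustrative choices:

```python
import math
import random

def subset_simulation(g, N=2000, p0=0.1, seed=1, max_levels=20):
    """Minimal Subset Simulation estimate of P(g(X) <= 0) for scalar
    X ~ N(0, 1). Each level keeps the p0*N "most failing" samples as seeds,
    then grows Metropolis chains conditioned on the current intermediate
    event {g <= c} until the true failure level is reached."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(N)]   # level 0: plain Monte Carlo
    pf = 1.0
    for _ in range(max_levels):
        ranked = sorted(xs, key=g)
        nc = int(p0 * N)                           # number of seeds per level
        c = g(ranked[nc - 1])                      # intermediate threshold
        if c <= 0.0:                               # failure event reached
            return pf * sum(1 for x in xs if g(x) <= 0.0) / N
        pf *= p0                                   # accumulate conditional prob.
        seeds, xs = ranked[:nc], []
        for x in seeds:                            # conditional MCMC per seed
            for _ in range(N // nc):
                cand = x + rng.uniform(-1.0, 1.0)
                # Metropolis acceptance w.r.t. the standard-normal density
                if rng.random() < math.exp((x * x - cand * cand) / 2.0):
                    if g(cand) <= c:               # stay inside the event
                        x = cand
                xs.append(x)
    return pf

# Rare event: failure when x > 3; exact P = 1 - Phi(3), roughly 1.35e-3
p_hat = subset_simulation(lambda x: 3.0 - x)
```

The estimate is the product of the p0 factors for each intermediate level and the failure fraction at the final level, which is how SS reaches probabilities far smaller than 1/N samples of plain Monte Carlo could resolve.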
Abstract: The transesterification of candlenut (Aleurites moluccana) oil with methanol using potassium hydroxide as catalyst was studied. The objective of the present investigation was to produce methyl ester for use as biodiesel. The operating variables employed were the methanol-to-oil molar ratio (3:1 – 9:1), catalyst concentration (0.50 – 1.5%) and temperature (303 – 343 K). An oil volume of 150 mL and a reaction time of 75 min were fixed as common parameters in all experiments. The concentration of methyl ester was evaluated by a mass balance of the free glycerol formed, which was analyzed using periodic acid. The optimal triglyceride conversion was attained with a methanol-to-oil ratio of 6:1 and a potassium hydroxide concentration of 1%, at room temperature. The methyl ester formed was characterized by its density, viscosity, and cloud and pour points. The biodiesel had properties similar to those of diesel oil, except for its viscosity, which was higher.
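The glycerol mass balance rests on the transesterification stoichiometry: one triglyceride plus three methanol gives three methyl esters plus one glycerol, so moles of glycerol formed equal moles of triglyceride converted. A minimal sketch, with an assumed average molar mass for candlenut-oil triglycerides and hypothetical sample masses (illustrative values, not from the paper):

```python
# Molar masses (g/mol): glycerol is exact; the average triglyceride mass
# for candlenut oil is an assumed illustrative value
M_GLYCEROL = 92.09
M_TRIGLYCERIDE = 872.0

def conversion_from_glycerol(m_glycerol, m_oil):
    """Triglyceride conversion from the free-glycerol mass balance:
    mol glycerol formed / mol triglyceride initially charged."""
    mol_glycerol = m_glycerol / M_GLYCEROL
    mol_tg_initial = m_oil / M_TRIGLYCERIDE
    return mol_glycerol / mol_tg_initial

# Hypothetical masses in grams for one experimental run
conv = conversion_from_glycerol(m_glycerol=14.0, m_oil=138.0)
```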
Abstract: In this article, the accumulated results on the effects and duration of manufacturing and production projects at the university and research level are combined with a working definition of the project management process, in order to arrive at a proportional pattern for the "time and action" stages. Studies show that many of the problems confronting researchers in these projects are connected to the lack of: 1) autonomous timing for gathering the educational material, 2) autonomous timing for planning and patterning, with presentation before construction, and 3) autonomous timing for manufacture and presentation of a sample of the output. The results of this study indicate that every manufacturing and production project should be divided into three smaller autonomous projects, each with its own type, budget and expenditure, and a defined shape and order of stages for the management of these kinds of projects. In this case study, real results are compared with theoretical results.
Abstract: A wireless sensor network (WSN) consists of a set of battery-powered nodes that collaborate to perform sensing tasks in a given environment. Each node in a WSN should be capable of acting for long periods of time with little or no external management. One requirement for this independence is that, in the presence of adverse conditions, the sensor nodes must be able to configure themselves. Hence, to determine the existence of unusual events in their surroundings, the nodes should make use of position awareness mechanisms. This work approaches the problem by treating possible unusual events as diseases, thus making it possible to diagnose them through their symptoms, namely, their side effects. Considering these awareness mechanisms as a foundation for high-level monitoring services, this paper also shows how they are included in the initial design of an intrusion detection system.
Abstract: The trial was conducted in Omidiyeh, a city located 170 kilometers from the Iranian city of Ahvaz. The main factor in this project comprised four levels: a control (without hormones) and application of the hormone at the seed, vegetative and flowering stages, respectively. The sub-plots comprised three local varieties of vetch. The effects of light essence and auxin, applied at different times in the vegetative and reproductive stages, were investigated on the different varieties of vetch. The experiment was laid out in plots in a randomized complete block design with four replications. In order to study the effects of the hormone auxin applied at different growth stages (seed, vegetative and flowering), against a control (no auxin), on three local varieties of vetch, the plant height, number of pods per plant, number of seeds per pod, seeds per plant, grain weight, grain yield, plant dry weight and protein content were measured. Among the vetch varieties, differences in plant height, number of pods per plant, seeds per plant, grain weight, grain yield, plant dry weight and protein content were significant at the 1% level, and differences in the number of seeds per pod per plant at the 5% level. The interactions for grain yield per plant, grain yield and protein were significant at the 1% level, and for the number of seeds per pod and seed weight at the 5% level, while the interaction effects on plant height and plant dry weight were not significant.