Abstract: A total of 33,680 nuclear power plant (NPP) workers were monitored and their doses recorded from 1990 to 2007. According to the records, the average individual radiation dose decreased continually, from 3.20 mSv/man in 1990 to 1.12 mSv/man at the end of 2007. After the International Commission on Radiological Protection (ICRP) 60 recommendation was adopted in South Korea, no nuclear power plant worker received more than 20 mSv, and the number of relatively highly exposed workers has been decreasing continuously. The age distribution of radiation workers in nuclear power plants was composed mainly of 20-30-year-olds (83%) for 1990 ~ 1994 and 30-40-year-olds (75%) for 2003 ~ 2007. The difference in individual average dose by age was not significant. Most (77%) of NPP radiation exposures from 1990 to 2007 occurred during the refueling period. With regard to exposure type, the majority of exposures were external, representing 95% of the total, while internal exposures represented only 5%. External effective dose was affected mainly by gamma radiation exposure, with an insignificant amount of neutron exposure. As for internal effective dose, tritium (3H) in the pressurized heavy water reactor (PHWR) was the biggest cause of exposure.
Abstract: Air conditioning is used mainly as a human comfort
cooling medium, and is used most in high-temperature countries such as
Malaysia. Proper estimation of the cooling load is needed to achieve the
ideal temperature; improper estimation leads to over-estimation or
under-estimation, while the ideal temperature should be comfortable enough.
This study develops a program to calculate the ideal cooling load
demand, matched to the heat gain, so that cooling load
estimation becomes straightforward. The objective of this study is to
develop a user-friendly and easily accessible cooling load program,
so that the cooling load can be estimated by any individual
rather than by rule-of-thumb. The software is developed
using a Matlab GUI, and is valid only for
common buildings in Malaysia. An office building was selected as a
case study to verify the applicability and accuracy of the developed software.
In conclusion, the main objective was achieved: the developed
software is user-friendly and makes it easy to estimate the cooling load demand.
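The heat-gain balance underlying such a program can be sketched in a few lines. The following is a minimal illustration (in Python rather than the study's Matlab GUI); all U-values, areas, temperature differences, and internal-gain figures are hypothetical placeholders, not data from the case study:

```python
def conduction_gain(u_value, area, delta_t):
    """Sensible heat gain through one surface: Q = U * A * dT, in watts."""
    return u_value * area * delta_t

def cooling_load(surfaces, occupants, equipment_w, lighting_w):
    """Sum the main heat-gain components into one cooling-load figure (W)."""
    envelope = sum(conduction_gain(u, a, dt) for u, a, dt in surfaces)
    people = occupants * 115  # assumed ~115 W total heat gain per person
    return envelope + people + equipment_w + lighting_w

# Hypothetical office zone: surfaces given as (U in W/m^2K, area in m^2, dT in K)
surfaces = [(2.5, 40.0, 8.0), (0.9, 60.0, 10.0)]
demand = cooling_load(surfaces, occupants=6, equipment_w=1200, lighting_w=600)
print(demand)  # -> 3830.0 (watts)
```

A full estimate would also account for solar gain through glazing, infiltration, and latent loads, but the matching of cooling load demand to total heat gain follows the same summation.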
Abstract: In this paper, a genetic algorithm (GA) is proposed as
the basis of an optimization algorithm for bandwidth
allocation in ATM networks. In Broadband ISDN, ATM is a high-bandwidth,
fast packet switching and multiplexing technique. Using
ATM, the network can be flexibly reconfigured and the
bandwidth reassigned to meet the requirements of all types of services. By
dynamically routing the traffic and adjusting the bandwidth
assignment, the average packet delay of the whole network can be
reduced to a minimum. The M/M/1 model can be used to analyze the
performance.
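Under the M/M/1 analysis, each link is treated as an independent M/M/1 queue, so the network-wide average packet delay is T = (1/gamma) * sum_i lambda_i / (C_i - lambda_i), Kleinrock's formula; this is the quantity a GA would minimize over routings and bandwidth assignments. A minimal sketch with hypothetical link loads:

```python
def avg_network_delay(links, gamma):
    """Kleinrock's formula: T = (1/gamma) * sum(lambda_i / (C_i - lambda_i)).

    links: list of (flow, capacity) pairs in packets/s, one M/M/1 queue per link;
    gamma: total packet arrival rate into the network (packets/s).
    """
    for flow, cap in links:
        if flow >= cap:
            raise ValueError("M/M/1 requires flow < capacity on every link")
    return sum(f / (c - f) for f, c in links) / gamma

# Hypothetical two-link network carrying 800 packets/s in total
links = [(400.0, 1000.0), (700.0, 1000.0)]
print(avg_network_delay(links, gamma=800.0))  # average delay in seconds (~3.75 ms)
```

A GA fitness function for a candidate bandwidth assignment could simply return the negative of this delay.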
Abstract: Space exploration is a highly visible endeavour of
humankind to seek profound answers to questions about the origins
of our solar system, whether life exists beyond Earth, and how we
could live on other worlds. Different platforms have been utilized in
planetary exploration missions, such as orbiters, landers, rovers, and
penetrators.
Having low mass, good mechanical contact with the surface, the
ability to acquire high-quality scientific subsurface data, and the ability to
be deployed in areas that may not be conducive to landers or rovers,
penetrators provide an alternative and complementary solution that
makes possible scientific exploration of hard-to-access sites (icy
areas, gully sites, highlands, etc.).
The Canadian Space Agency (CSA) has made space exploration
one of the pillars of its space program, and has established the ExCo program
to prepare Canada for future international planetary exploration.
ExCo sets surface mobility as its focus and priority, and invests
mainly in the development of rovers because of Canada's niche space
robotics technology. Meanwhile, CSA is also investigating how
micro-penetrators can help Canada to fulfill its scientific objectives
for planetary exploration.
This paper presents a review of micro-penetrator technologies,
past missions, and lessons learned. It gives a detailed analysis of the
technical challenges of micro-penetrators, such as high impact
survivability, high-precision guidance, navigation and control, thermal
protection, and communications. Then, a Canadian perspective of
a possible micro-penetrator mission is given, including Canadian
scientific objectives and priorities, potential instruments, and flight
opportunities.
Abstract: This paper presents a method for determining the
uniaxial tensile properties such as Young's modulus, yield strength
and the flow behaviour of a material in a virtually non-destructive
manner. To achieve this, a new dumb-bell shaped miniature
specimen has been designed. This helps in avoiding the removal of
large size material samples from the in-service component for the
evaluation of current material properties. The proposed miniature
specimen has an advantage in finite element modelling with respect
to computational time and memory space. Test fixtures have been
developed to enable the tension tests on the miniature specimen in a
testing machine. The studies have been conducted on a chromium
(H11) steel and an aluminum alloy (AR66). The output from the
miniature test viz. load-elongation diagram is obtained and the finite
element simulation of the test is carried out using a 2D plane stress
analysis. The results are compared with the experimental results. It is
observed that the results from the finite element simulation
agree well with the miniature test results. The approach seems
to have potential to predict the mechanical properties of the
materials, which could be used in remaining life estimation of the
various in-service structures.
Abstract: This paper describes new computer vision algorithms
that have been developed to track moving objects as part of a
long-term study into the design of (semi-)autonomous vehicles. We
present the results of a study to exploit variable kernels for tracking in
video sequences. The basis of our work is the mean shift
object-tracking algorithm; for a moving target, it is usual to define a
rectangular target window in an initial frame, and then process the data
within that window to separate the tracked object from the background
by the mean shift segmentation algorithm. Rather than use the
standard Epanechnikov kernel, we have used a kernel weighted by the
Chamfer distance transform to improve the accuracy of target
representation and localization, minimising the distance between the
two distributions in RGB color space using the Bhattacharyya
coefficient. Experimental results show the improved tracking
capability and versatility of the algorithm in comparison with results
using the standard kernel. These algorithms are incorporated as part of
a robot test-bed architecture which has been used to demonstrate their
effectiveness.
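The similarity measure at the core of such a tracker is the Bhattacharyya coefficient between the target's and the candidate window's normalized color histograms; maximizing the coefficient minimizes the distance between the two distributions. A minimal sketch with toy histograms (real use would bin the RGB values of the pixels in each kernel-weighted window):

```python
import math

def bhattacharyya(p, q):
    """Bhattacharyya coefficient of two normalized histograms.

    Returns a value in [0, 1]; 1 means the distributions are identical.
    """
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

p = [0.2, 0.5, 0.3]   # toy target histogram
q = [0.3, 0.4, 0.3]   # toy candidate-window histogram
print(round(bhattacharyya(p, q), 4))  # -> 0.9922
```

The mean shift iterations move the candidate window toward the location whose histogram maximizes this coefficient.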
Abstract: Model-based approaches have been applied successfully
to a wide range of tasks such as specification, simulation, testing, and
diagnosis. But one bottleneck often prevents the introduction of these
ideas: manual modeling is a non-trivial, time-consuming task.
Automatically deriving models by observing and analyzing running
systems is one possible way to alleviate this bottleneck. To
derive a model automatically, some a priori knowledge about the
model structure, i.e. about the system, must exist. Such a model
formalism would be used as follows: (i) by observing the network
traffic, a model of the long-term system behavior could be generated
automatically; (ii) test vectors could be generated from the model;
(iii) while the system is running, the model could be used to diagnose
abnormal system behavior.
The main contribution of this paper is the introduction of a model
formalism called 'probabilistic regression automaton' suitable for the
tasks mentioned above.
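The probabilistic regression automaton formalism itself is not reproduced here, but the core idea of deriving a probabilistic model from observed traffic can be sketched by estimating transition probabilities from bigram counts over event traces; the event names below are hypothetical:

```python
from collections import defaultdict

def learn_transitions(traces):
    """Estimate P(next | current) by counting symbol bigrams in traces."""
    counts = defaultdict(lambda: defaultdict(int))
    for trace in traces:
        for cur, nxt in zip(trace, trace[1:]):
            counts[cur][nxt] += 1
    return {s: {t: c / sum(nxts.values()) for t, c in nxts.items()}
            for s, nxts in counts.items()}

# Hypothetical observed message traces
traces = [["req", "ack", "data"], ["req", "ack", "nak"], ["req", "timeout"]]
model = learn_transitions(traces)
print(model["req"])  # P(next | "req")
```

An observed transition whose estimated probability is near zero would then flag non-normal behavior, as in step (iii) above.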
Abstract: The presence of cold air combined with the convergent
topography of the Lut valley over the valley's sloping terrain can
generate low-level jets (LLJs). Moreover, the valley-parallel
pressure gradients and the northerly LLJ are produced as a result of
large-scale processes. In this numerical study, the regional MM5
model was run to obtain an appropriate dynamical analysis
of the flows in the region for summer and winter. The results of this
study show that summer synoptic systems cause the
formation of north-south pressure gradients in the valley, which can
lead to winds with velocities of more than 14 m s-1
and to severe dust and wind storms lasting more than 120 days.
In winter, by contrast, the presence of cold air masses in the region
decreases the average speed of the LLJs; at this time, downslope
flows are noticeable in creating the nocturnal LLJs.
Abstract: Bioinformatics and computational biology involve
the use of techniques including applied mathematics,
informatics, statistics, computer science, artificial intelligence,
chemistry, and biochemistry to solve biological problems
usually on the molecular level. Research in computational
biology often overlaps with systems biology. Major research
efforts in the field include sequence alignment, gene finding,
genome assembly, protein structure alignment, protein structure
prediction, prediction of gene expression and protein-protein
interactions, and the modeling of evolution. Various
global rearrangements of permutations, such as reversals and
transpositions, have recently become of interest because of their
applications in computational molecular biology. A reversal is
an operation that reverses the order of a substring of a permutation.
A transposition is an operation that swaps two adjacent
substrings of a permutation. The problem of determining the
smallest number of reversals required to transform a given
permutation into the identity permutation is called sorting by
reversals. Similar problems can be defined for transpositions
and other global rearrangements. In this work we perform a
study about some genome rearrangement primitives. We show
how a genome is modelled by a permutation, introduce some
of the existing primitives and the lower and upper bounds
on them. We then provide a comparison of the introduced
primitives.
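The two primitives defined above can be illustrated directly on a permutation, together with the breakpoint count, a quantity commonly used in lower bounds for sorting by reversals (b(pi)/2 <= d(pi)); this is a generic illustration, not the paper's specific set of primitives:

```python
def reversal(perm, i, j):
    """Reverse the substring perm[i:j] -- the 'reversal' primitive."""
    return perm[:i] + perm[i:j][::-1] + perm[j:]

def transposition(perm, i, j, k):
    """Swap the adjacent substrings perm[i:j] and perm[j:k]."""
    return perm[:i] + perm[j:k] + perm[i:j] + perm[k:]

def breakpoints(perm):
    """Count adjacent pairs that are not consecutive integers (with 0 and
    n+1 appended at the ends), a classic lower-bound ingredient."""
    ext = [0] + perm + [len(perm) + 1]
    return sum(1 for a, b in zip(ext, ext[1:]) if abs(a - b) != 1)

pi = [3, 1, 2]
print(reversal(pi, 0, 3))          # -> [2, 1, 3]
print(transposition(pi, 0, 1, 3))  # -> [1, 2, 3]  (sorted in one move)
print(breakpoints(pi))             # -> 3
```

Here a single transposition sorts the permutation, while sorting by reversals would need more than one move, illustrating how the choice of primitive changes the distance.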
Abstract: Support vector machines (SVMs) have shown
superior performance compared to other machine learning techniques,
especially in classification problems. Yet one limitation of SVMs is
the lack of an explanation capability which is crucial in some
applications, e.g. in the medical and security domains. In this paper, a
novel approach for eclectic rule-extraction from support vector
machines is presented. This approach utilizes the knowledge acquired
by the SVM and represented in its support vectors as well as the
parameters associated with them. The approach includes three stages:
training, propositional rule-extraction, and rule quality evaluation.
Results from four different experiments have demonstrated the value
of the approach for extracting comprehensible rules of high accuracy
and fidelity.
Abstract: In this study, the contact problem of a layered composite which consists of two materials with different elastic constants and heights resting on two rigid flat supports with sharp edges is considered. The effect of gravity is neglected. While friction between the layers is taken into account, it is assumed that there is no friction between the supports and the layered composite so that only compressive tractions can be transmitted across the interface. The layered composite is subjected to a uniform clamping pressure over a finite portion of its top surface. The problem is reduced to a singular integral equation in which the contact pressure is the unknown function. The singular integral equation is evaluated numerically and the results for various dimensionless quantities are presented in graphical forms.
Abstract: In this paper, a multi-agent robot system is presented. The system consists of four robots. The developed robots are able to automatically enter and patrol a harmful environment, such as a building infected with a virus or a factory with leaking hazardous gas. Furthermore, every robot is able to perform obstacle avoidance and search for victims. Several operation modes are designed: remote control, obstacle avoidance, automatic searching, and so on.
Abstract: In this paper, all-optical signal processors that perform
both microwave mixing and bandpass filtering in a radio-over-fiber
(RoF) link are presented. The key device is a Mach-Zehnder
modulator (MZM) which performs all-optical microwave mixing. An
up-converted microwave signal is obtained and other unwanted
frequency components are suppressed at the end of the fiber span.
Abstract: Isobaric vapor-liquid equilibrium measurements are
reported for the binary mixture of Methyl acetate and
Isopropylbenzene at 97.3 kPa. The measurements have been
performed using a vapor recirculating type (modified Othmer's)
equilibrium still. The mixture shows positive deviation from ideality
and does not form an azeotrope. The activity coefficients have been
calculated taking into consideration the vapor phase nonideality. The
data satisfy the thermodynamic consistency tests of Herington and
Black. The activity coefficients have been satisfactorily correlated by
means of the Margules, NRTL, and Black equations. The activity
coefficients obtained from the experimental data have also been
compared with the predictions of the UNIFAC model.
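The Margules correlation mentioned above can be sketched for a binary system as follows; the interaction parameters used here are hypothetical placeholders, not the values fitted to the methyl acetate + isopropylbenzene data:

```python
import math

def margules(x1, a12, a21):
    """Two-parameter Margules activity coefficients for a binary mixture.

    ln g1 = x2^2 * [A12 + 2*(A21 - A12)*x1]
    ln g2 = x1^2 * [A21 + 2*(A12 - A21)*x2]
    """
    x2 = 1.0 - x1
    ln_g1 = x2 ** 2 * (a12 + 2.0 * (a21 - a12) * x1)
    ln_g2 = x1 ** 2 * (a21 + 2.0 * (a12 - a21) * x2)
    return math.exp(ln_g1), math.exp(ln_g2)

# Hypothetical parameters; at infinite dilution g1 -> exp(A12)
g1, g2 = margules(0.5, a12=0.5, a21=0.7)
print(round(g1, 4), round(g2, 4))
```

With positive parameters, both activity coefficients exceed unity, consistent with the positive deviation from ideality reported above.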
Abstract: Based on Traub's methods for solving the nonlinear
equation f(x) = 0, we develop two families of third-order
methods for solving systems of nonlinear equations F(x) = 0. The
families include well-known existing methods as special cases.
The stability is corroborated by numerical results. Comparison
with well-known methods shows that the present methods are
robust. These higher-order methods may be very useful in
numerical applications requiring high precision, because
they yield a clear reduction in the number of iterations.
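For the scalar case, one well-known third-order member of such families is Traub's two-step scheme with a frozen derivative; the sketch below shows its rapid convergence on a simple equation (the paper's families extend this kind of construction to systems F(x) = 0):

```python
def traub_step(f, df, x):
    """One step of Traub's third-order two-step method (frozen derivative)."""
    d = df(x)
    y = x - f(x) / d      # Newton predictor
    return y - f(y) / d   # corrector reuses f'(x)

f = lambda t: t ** 3 - 2.0    # root: the cube root of 2
df = lambda t: 3.0 * t ** 2
x = 1.0
for _ in range(5):
    x = traub_step(f, df, x)
print(x)  # ~1.2599210498948732
```

Each iteration costs two function evaluations and one derivative evaluation, yet roughly triples the number of correct digits, which is the iteration-count saving the abstract refers to.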
Abstract: In this paper we present a frequency-domain based
classification method for video scenes. Videos from certain topical
areas often contain activities with repeating movements. Sports
videos, home improvement videos, or videos showing mechanical
motion are some example areas. Assessing main and side frequencies
of each repeating movement gives rise to the motion type. We
obtain the frequency domain by transforming spatio-temporal motion
trajectories. We then explain how to compute frequency features
for video clips and how to use them for classification. The focus of
the experimental phase is on transforms utilized for our system.
By comparing various transforms, the experiments identify the optimal
transform for a motion-frequency based approach.
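The feature computation described above can be sketched for a single tracked trajectory: transform it to the frequency domain and read off the dominant frequency. A minimal sketch with a synthetic 4 Hz repeating movement, using a naive DFT as a stand-in for whichever transform performed best in the experiments:

```python
import cmath, math

def dft(signal):
    """Naive discrete Fourier transform of a real-valued sequence."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def main_frequency(trajectory, fps):
    """Dominant frequency (Hz) of a motion trajectory sampled at fps."""
    spectrum = [abs(c) for c in dft(trajectory)]
    k = max(range(1, len(trajectory) // 2), key=spectrum.__getitem__)
    return k * fps / len(trajectory)

fps = 32
traj = [math.sin(2 * math.pi * 4 * t / fps) for t in range(fps)]  # 4 Hz motion
print(main_frequency(traj, fps))  # -> 4.0
```

Collecting the main and strongest side frequencies of every repeating movement in a clip yields the feature vector handed to the classifier.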
Abstract: The article investigates how 14- to 15-year-olds build informal conceptions of inferential statistics as they engage in a modelling process and build their own computer simulations with dynamic statistical software. This study proposes four primary phases of informal inferential reasoning for the students in the statistical modelling and simulation process. Findings show shifts in the conceptual structures across the four phases and point to the potential of all of these phases for fostering the development of students' robust knowledge of the logic of inference when using computer-based simulations to model and investigate statistical questions.
Abstract: In this paper we examine the use of global texture analysis based approaches for the purpose of Persian font recognition in machine-printed document images. Most existing methods for font recognition make use of local typographical features and connected component analysis. However, the derivation of such features is not an easy task. Gabor filters are appropriate tools for texture analysis and are motivated by the human visual system. Here we consider document images as textures and use Gabor filter responses for identifying the fonts. The method is content independent and involves no local feature analysis. Two different classifiers, Weighted Euclidean Distance (WED) and SVM, are used for classification. Experiments on seven different typefaces and four font styles show average accuracies of 85% with the WED and 82% with the SVM classifier over the typefaces.
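The Weighted Euclidean Distance classifier used above can be sketched as a nearest-mean rule with per-dimension weights (typically the inverse variances of each feature); the class names and the tiny three-dimensional feature vectors below are hypothetical stand-ins for the Gabor-filter response features:

```python
import math

def wed(x, mean, weights):
    """Weighted Euclidean distance between a query and a class mean."""
    return math.sqrt(sum(w * (a - m) ** 2 for a, m, w in zip(x, mean, weights)))

def classify(x, classes):
    """classes: {font_name: (mean_vector, weight_vector)}; nearest class wins."""
    return min(classes, key=lambda c: wed(x, *classes[c]))

# Hypothetical per-font mean features and weights
classes = {
    "font_A": ([0.8, 0.2, 0.5], [1.0, 2.0, 1.0]),
    "font_B": ([0.3, 0.7, 0.9], [1.0, 1.0, 0.5]),
}
print(classify([0.75, 0.25, 0.45], classes))  # -> font_A
```

The SVM variant replaces this distance rule with a trained maximum-margin classifier over the same Gabor feature vectors.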
Abstract: The main goal of this paper is to establish a
methodology for testing and optimizing GPRS performance over
the Libyan GSM network, as well as to propose a suitable optimization
technique to improve that performance. Measurements of
download, upload, throughput, round-trip time, reliability, handover,
security enhancement and packet loss over a GPRS access network
were carried out. The measured values are compared with theoretical
values that can be calculated beforehand. The data are
processed and delivered by the server across the wireless network to
the client, which processes the received pieces of data on the fly.
We also illustrate the results by describing the
main parameters that affect the quality of service. Finally, Libya's
two mobile operators, Libyana Mobile Phone and Al-Madar al-
Jadeed Company, are selected as a case study to validate our
methodology.
Abstract: Positron emission particle tracking (PEPT) is a
technique in which a single radioactive tracer particle can be
accurately tracked as it moves. A limitation of PET is that in order to
reconstruct a tomographic image it is necessary to acquire a large
volume of data (millions of events), so it is difficult to study rapidly
changing systems. Because PEPT needs only a small number of
events, it is a very fast process compared with PET.
In PEPT, detection of both photons defines a line, and the annihilation
is assumed to have occurred somewhere along this line. The location
of the tracer can be determined to within a few mm from coincident
detection of a small number of pairs of back-to-back gamma rays and
using triangulation. This can be achieved many times per second and
the track of a moving particle can be reliably followed. This
technique was invented at the University of Birmingham [1].
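The triangulation described above can be sketched as a least-squares problem: each coincident photon pair gives a line of response, and the tracer location is the point minimizing the summed squared distance to those lines. A minimal two-dimensional sketch with synthetic lines that intersect at (1, 2); a real system solves the same normal equations in 3-D:

```python
def locate(lines):
    """Least-squares point closest to a set of 2-D lines.

    lines: list of ((px, py), (dx, dy)) with (dx, dy) a unit direction.
    Solves the normal equations sum(P_i) x = sum(P_i p_i), P_i = I - d d^T.
    """
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (px, py), (dx, dy) in lines:
        # Projection onto the line's normal: P = I - d d^T
        p11, p12, p22 = 1.0 - dx * dx, -dx * dy, 1.0 - dy * dy
        a11 += p11; a12 += p12; a22 += p22
        b1 += p11 * px + p12 * py
        b2 += p12 * px + p22 * py
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

s = 2 ** -0.5
lines = [((0.0, 2.0), (1.0, 0.0)),   # line of response y = 2
         ((1.0, 0.0), (0.0, 1.0)),   # line of response x = 1
         ((0.0, 1.0), (s, s))]       # line of response y = x + 1
print(locate(lines))  # ~ (1.0, 2.0)
```

With noisy real data, outlier lines (from scattered photons) are discarded and the fit repeated, which is what lets a small number of gamma-ray pairs locate the tracer to within a few mm many times per second.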
The attempt in PEPT is not to form an image of the tracer particle
but simply to determine its location with time. If this tracer is
followed for a long enough period within a closed, circulating system
it explores all possible types of motion.
The application of PEPT to industrial process systems carried out
at the University of Birmingham falls into two areas: the
behaviour of granular materials and of viscous fluids. Granular
materials are processed in industry, for example in the manufacture of
pharmaceuticals, ceramics, food, and polymers, and PEPT has been used
in a number of ways to study the behaviour of these systems [2].
PEPT allows the possibility of tracking a single particle within the
bed [3]. PEPT has also been used to study systems such as fluid
flow and viscous fluids in mixers [4], using a neutrally buoyant tracer
particle [5].