Abstract: A high-performance clarification system is discussed
for advanced aqueous reprocessing of FBR spent fuel.
Dissolver residue is a cause of trouble in the operation of
reprocessing plants. In this study, a new clarification system based
on a hybrid of centrifugation and filtration was proposed to achieve
high separation of every component of the insoluble sludge.
Clarification tests with simulated solid species were carried out to
evaluate the clarification performance using a small-scale test
apparatus comprising a centrifuge and a filter unit. The effect of the
density of the solid species on the collection efficiency was mainly
evaluated in the centrifugal clarification test. In the filtration
test using a ceramic filter with a pore size of 0.2 μm, on the other
hand, permeability and filtration rate were evaluated in addition to
the filtration efficiency. As a result, the collection efficiency of
solid species in the new clarification system was estimated to be
nearly 100%. In conclusion, high clarification performance for
dissolver liquor can be achieved by the hybrid centrifuge-filtration
system.
Abstract: This paper presents an evaluation of the wind potential in the area of the Lagoon of Venice (Italy). A full two-year anemometric campaign, performed by the "Osservatorio Bioclimatologico dell'Ospedale al Mare di Venezia", has been analyzed to obtain the Weibull wind speed distribution and the main wind directions. The annual energy outputs of two candidate horizontal-axis wind turbines ("Aventa AV-7 LoWind" and "Gaia Wind 133-11kW") have been estimated on the basis of the computed Weibull wind distribution; the former turbine performed better owing to its higher ratio of rotor swept area to rated generator power, which yields a lower cut-in wind speed.
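The estimation chain this abstract describes (fitting a Weibull distribution to the wind record, then integrating it against a turbine power curve for the annual energy output) can be sketched as follows. The wind record and the power-curve parameters are synthetic placeholders, not the Venice measurements or the AV-7/Gaia curves:

```python
import numpy as np
from math import gamma

# Simulated anemometer record (m/s); NOT the Venice data.
rng = np.random.default_rng(0)
v = 5.0 * rng.weibull(1.8, 20000)

# Method-of-moments Weibull fit (one common empirical approach)
k = (v.std() / v.mean()) ** -1.086          # shape
c = v.mean() / gamma(1.0 + 1.0 / k)         # scale (m/s)

def turbine_power(u, cut_in=2.5, rated_v=9.5, cut_out=25.0, p_rated=6.5):
    """Illustrative small-turbine power curve in kW (cubic ramp to rated)."""
    if u < cut_in or u > cut_out:
        return 0.0
    if u >= rated_v:
        return p_rated
    return p_rated * ((u - cut_in) / (rated_v - cut_in)) ** 3

# AEP = 8760 h * integral of P(u) * weibull_pdf(u) du  (Riemann sum)
u = np.linspace(0.0, 25.0, 2001)
pdf = (k / c) * (u / c) ** (k - 1) * np.exp(-((u / c) ** k))
powers = np.array([turbine_power(x) for x in u])
aep_kwh = 8760.0 * np.sum(powers * pdf) * (u[1] - u[0])
```

A lower cut-in speed shifts the usable part of the power curve toward the bulk of the Weibull distribution, which is why the ratio of swept area to rated power matters in low-wind sites.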
Abstract: Air conditioning is mainly used for human comfort
cooling. It is used extensively in hot-climate countries such as
Malaysia. Proper estimation of the cooling load is needed to achieve
the ideal temperature; improper estimation leads to over-estimation or
under-estimation, while the ideal temperature should be comfortable
enough. This study develops a program to calculate the ideal cooling
load demand, matched to the heat gain, so that cooling load estimation
becomes straightforward. The objective of this study is to develop a
user-friendly and easily accessible cooling load program, so that the
cooling load can be estimated by any individual rather than by
rule of thumb. The software was developed using the MATLAB GUI. The
development is valid only for common buildings in Malaysia. An office
building was selected as a case study to verify the applicability and
accuracy of the developed software. In conclusion, the main objective
was successfully met: the developed software is user friendly and
easily estimates the cooling load demand.
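The abstract does not detail the calculation method; a minimal sketch of one component of such an estimate, conduction heat gain (Q = U·A·ΔT per surface) plus occupant gains, with illustrative values that are not the paper's office-building data:

```python
# Conduction heat gain per envelope surface plus internal gains from
# occupants. All U-values, areas, temperature differences and occupant
# counts below are hypothetical examples.
surfaces = [
    # (name, U-value W/m^2.K, area m^2, outdoor-indoor temp diff K)
    ("roof",   0.5, 200.0, 12.0),
    ("wall_n", 1.8,  60.0,  8.0),
    ("wall_s", 1.8,  60.0, 10.0),
    ("window", 5.8,  30.0,  8.0),
]

q_conduction = sum(u * a * dt for _, u, a, dt in surfaces)  # watts

occupants = 25
q_people = occupants * 115.0   # ~115 W per seated person (typical textbook value)

q_total_kw = (q_conduction + q_people) / 1000.0
```

A full cooling load program would add solar gain through glazing, lighting and equipment loads, and ventilation/infiltration air, but each term has this same factor-times-area-times-difference shape.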
Abstract: This paper presents a method for determining the
uniaxial tensile properties, such as Young's modulus, yield strength
and the flow behaviour of a material in a virtually non-destructive
manner. To achieve this, a new dumb-bell shaped miniature
specimen has been designed. This helps in avoiding the removal of
large size material samples from the in-service component for the
evaluation of current material properties. The proposed miniature
specimen has an advantage in finite element modelling with respect
to computational time and memory space. Test fixtures have been
developed to enable the tension tests on the miniature specimen in a
testing machine. The studies have been conducted on a chromium
(H11) steel and an aluminum alloy (AR66). The output from the
miniature test viz. load-elongation diagram is obtained and the finite
element simulation of the test is carried out using a 2D plane stress
analysis. The simulation results are compared with the experimental
results, and the finite element simulation is observed to agree well
with the miniature test results. The approach seems
to have potential to predict the mechanical properties of the
materials, which could be used in remaining life estimation of the
various in-service structures.
Abstract: Video Mosaicing is the stitching of selected frames of
a video by estimating the camera motion between the frames and
thereby registering successive frames of the video to arrive at the
mosaic. Different techniques have been proposed in the literature for
video mosaicing. Despite the large number of papers dealing with
techniques to generate mosaics, only a few authors have investigated
the conditions under which these techniques produce good estimates of
the motion parameters. In this paper, these techniques are studied on
different videos, and the reasons for their failures are identified.
We propose algorithms incorporating outlier removal for better
estimation of the motion parameters.
Abstract: The aim of this paper is to emphasize and alleviate the effect of phase noise due to imperfect local oscillators on the performance of a Multi-Carrier CDMA system. After the cancellation of the Common Phase Error (CPE), an iterative approach is introduced which estimates the Inter-Carrier Interference (ICI) components in the frequency domain and cancels their contribution in the time domain. Simulations are conducted in order to investigate the achievable performance for several parameters, such as the spreading factor, the modulation order, the phase noise power and the transmission Signal-to-Noise Ratio.
Abstract: The problem of Small Area Estimation (SAE) is complex because of various information sources and insufficient data. In this paper, an approach for SAE is presented for decision-making at national, regional and local levels. We propose the Empirical Best Linear Unbiased Predictor (EBLUP) as an estimator in order to combine several information sources to evaluate various indicators. First, we present the urban audit project and its environmental, social and economic indicators. Second, we propose an approach for decision-making in order to estimate the indicators. An application is used to validate the theoretical proposal. Finally, a decision support system based on an open-source environment is presented.
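As a concrete, illustrative instance of the EBLUP this abstract proposes, a Fay-Herriot-style area-level predictor shrinks a direct survey estimate toward a synthetic regression estimate; all numbers below are hypothetical, not the urban-audit indicators:

```python
import numpy as np

# Fay-Herriot-style shrinkage: EBLUP_i = gamma_i * direct_i
#                                       + (1 - gamma_i) * synthetic_i
# with gamma_i = sigma_v^2 / (sigma_v^2 + psi_i).
direct = np.array([12.0, 8.5, 15.2])      # direct estimates per small area
var_direct = np.array([4.0, 9.0, 2.5])    # their sampling variances psi_i
synthetic = np.array([11.0, 10.0, 14.0])  # model-based estimates x_i' beta
sigma2_v = 3.0                            # area-effect variance (assumed known here)

gamma = sigma2_v / (sigma2_v + var_direct)  # shrinkage weights in (0, 1)
eblup = gamma * direct + (1 - gamma) * synthetic
```

Areas with noisy direct estimates (large psi_i) lean more on the synthetic component, which is how multiple information sources get combined.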
Abstract: Traffic density, an indicator of traffic
conditions, is one of the most critical characteristics for
Intelligent Transport Systems (ITS). This paper investigates
recursive traffic density estimation using the information
provided from inductive loop detectors. On the basis of the
phenomenological relationship between speed and density, the
existing studies incorporate a state space model and update the
density estimate using vehicular speed observations via the
extended Kalman filter, where an approximation is made
because of the linearization of the nonlinear observation
equation. In practice, this may lead to substantial estimation
errors. This paper incorporates a suitable transformation to
deal with the nonlinear observation equation so that the
approximation is avoided when using the Kalman filter to
estimate the traffic density. A numerical study is conducted. It
is shown that the developed method outperforms the existing
methods for traffic density estimation.
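The transformation idea can be sketched as follows. We assume an Underwood speed-density relation v = vf·exp(-ρ/ρc) (the paper does not specify its model), under which z = ln v is linear in the density ρ, so a plain Kalman filter applies with no linearization error. All numbers are illustrative:

```python
import numpy as np

vf, rc = 110.0, 40.0            # free-flow speed (km/h), critical density (veh/km)
q_var, r_var = 4.0, 0.0025      # process noise var; noise var of z = ln(v)

def estimate_density(speeds, rho0=20.0, p0=25.0):
    """Scalar Kalman filter on the transformed observation z = ln(v),
    which is linear in density: z = ln(vf) - rho / rc."""
    rho, p = rho0, p0
    H = -1.0 / rc               # observation matrix after the log transform
    out = []
    for v in speeds:
        p = p + q_var                             # predict (random-walk density)
        innovation = np.log(v) - (np.log(vf) + H * rho)
        s = H * p * H + r_var                     # innovation variance
        gain = p * H / s
        rho = rho + gain * innovation             # update
        p = (1.0 - gain * H) * p
        out.append(rho)
    return out

rng = np.random.default_rng(0)
true_rho = 30.0
speeds = vf * np.exp(-true_rho / rc + rng.normal(0.0, 0.05, 60))
est = estimate_density(speeds)
```

With an extended Kalman filter the observation equation would be linearized around the current estimate at every step; here the log transform removes that approximation entirely.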
Abstract: Tourism is a phenomenon that human communities have valued for a long time. It has evolved continually in response to a variety of social and economic needs; with the growth of communication and the considerable increase in tourist numbers, the resulting exchange income has brought benefits such as employment for the communities. For the purpose of tourism development in this zone, suitable times and locations for tourist attendance need to be specified. One of the most important needs of tourists is knowledge of the climate conditions and suitable times for sightseeing. In this survey, the climate trend has been identified for tourist attendance in Isfahan province using the modified Tourism Climate Index (TCI) together with the SPSS, GIS, Excel and Surfer software packages. This index systematically evaluates the climate conditions for tourism activities using the monthly mean of maximum daily temperature, daily mean temperature, minimum relative humidity, daily mean relative humidity, precipitation (mm), total sunshine hours, wind speed and dust. The results obtained using Kendall's correlation test show that all twelve months, January through December, are significant and have an increasing trend, indicating the best conditions for tourist attendance. S, P, T_mean, T_max and dust were estimated for 1976-2005, and Kendall's correlation test was repeated to see which parameters were effective. Based on the test, the rate of dust in February, March, April, May, June, July, August, October and November is decreasing, precipitation in September and January is increasing, and the radiation rate in May and August is increasing, all of which indicate improved comfort conditions. The maximum temperature in June is also decreasing.
Isfahan province has two peaks, in spring and fall, and the best places for tourism are in the north and western areas.
Abstract: In this paper we present a technique to speed up
ICA based on the idea of reducing the dimensionality of the data
set while preserving the quality of the results. In particular we refer to
FastICA algorithm which uses the Kurtosis as statistical property
to be maximized. By performing a particular Johnson-Lindenstrauss
like projection of the data set, we find the minimum dimensionality
reduction rate ρ, defined as the ratio between the size k of the reduced
space and the original one d, which guarantees a narrow confidence
interval for such an estimator with a high confidence level. The derived
dimensionality reduction rate depends on a system control parameter
β easily computed a priori on the basis of the observations only.
Extensive simulations have been done on different sets of real world
signals. They show that the achievable dimensionality reduction is in
fact very high, that it preserves the quality of the decomposition, and
that it impressively speeds up FastICA. On the other hand, a set of
signals, on which the
estimated reduction rate is greater than 1, exhibits bad decomposition
results if reduced, thus validating the reliability of the parameter β.
We are confident that our method will lead to a better approach to
real time applications.
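The two ingredients this abstract combines, a Johnson-Lindenstrauss-type random projection and the kurtosis statistic that FastICA maximizes, can be sketched as follows; the dimensions, sources and reduction rate are illustrative, not the paper's signal sets:

```python
import numpy as np

rng = np.random.default_rng(1)

def jl_project(X, k):
    """Project d-dimensional observations X (d x n) down to k dimensions
    with a scaled Gaussian random matrix (JL-type embedding)."""
    d, _ = X.shape
    R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, d))
    return R @ X

def excess_kurtosis(y):
    """The statistic this FastICA variant maximizes (zero for Gaussians)."""
    y = (y - y.mean()) / y.std()
    return float(np.mean(y ** 4) - 3.0)

n = 5000
S = np.vstack([rng.laplace(size=n),           # super-Gaussian source
               rng.uniform(-1.0, 1.0, n)])    # sub-Gaussian source
A = rng.normal(size=(10, 2))                  # mix into 10-dim observations
X = A @ S

Xr = jl_project(X, 4)                         # reduced space, rate rho = k/d = 0.4
```

Because a JL-type projection approximately preserves the geometry of the data, kurtosis-driven separation can be run in the k-dimensional space at a fraction of the cost, which is the speed-up the abstract reports.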
Abstract: Unlike previous research, which focused largely on
model-driven option pricing, this study pays attention to the trading
behavior of the option market. The Black-Scholes (B-S) model, for
example, is one of the most famous option pricing models, but its
assumptions have been questioned in several reviews of pricing models.
This paper argues for the importance of dynamic characteristics in
option pricing, which is also the reason for using a genetic algorithm
(GA). Exploiting the GA's natural-selection and evolutionary
mechanisms, this study proposes a hybrid model, the Genetic-BS model,
which combines the GA and the B-S model to estimate prices more
accurately. In the final experiments, the estimated prices show a
lower MAE than the prices calculated by either the B-S model or its
enhanced version, the Gram-Charlier GARCH (G-C GARCH) model. This
work therefore concludes that the Genetic-BS pricing model is
practical.
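For reference, the B-S leg of the hybrid is the standard Black-Scholes European call formula; the GA layer (whose details the abstract does not give) would search around such model prices. A minimal implementation:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call: spot S, strike K,
    maturity T (years), risk-free rate r, volatility sigma."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

price = bs_call(S=100.0, K=100.0, T=1.0, r=0.05, sigma=0.2)
```

In a Genetic-BS style hybrid, the GA would typically evolve inputs such as the volatility (or correction terms to the model price) against market data, scoring candidates by an error measure such as the MAE the abstract reports.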
Abstract: Computed tomography (CT) dosimetry normally uses
an ionization chamber 100 mm long to estimate the computed
tomography dose index (CTDI), however some reports have already
indicated that small devices could replace the long ion chamber to
improve quality assurance procedures in CT dosimetry. This paper
presents a novel dosimetry system based on a commercial
phototransistor evaluated for CT dosimetry. Three detector
configurations were developed for this system: with a single, two and
four devices. Dose profile measurements were obtained with each
configuration and their angular responses were evaluated. The results
showed that the novel dosimetry system with the phototransistor could
be an alternative for CT dosimetry. It allows the CT dose profile to
be obtained in detail and the CTDI to be estimated over a longer
length than the 100 mm pencil chamber. The angular response showed
that the one-device detector configuration is the most adequate among the three
configurations analyzed in this study.
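For reference, the CTDI being estimated is the integral of the single-rotation dose profile D(z) along the scan axis divided by the nominal beam width nT; a numerical sketch using a synthetic slit-plus-scatter profile, not measured phototransistor data:

```python
import numpy as np
from math import erf, sqrt

nT = 10.0                                   # nominal beam width (mm)
sigma = 3.0                                 # scatter blur (mm), illustrative
z = np.linspace(-50.0, 50.0, 2001)          # 100 mm integration range (mm)

# Synthetic dose profile: an ideal 10 mm beam convolved with a Gaussian,
# giving error-function edges and scatter tails.
D = np.array([0.5 * (erf((nT / 2 - x) / (sigma * sqrt(2.0)))
                     - erf((-nT / 2 - x) / (sigma * sqrt(2.0)))) for x in z])

# CTDI = (1 / nT) * integral of D(z) dz  (Riemann sum here)
ctdi = np.sum(D) * (z[1] - z[0]) / nT
```

A point-like detector stepped along z recovers D(z) itself, so the integration length is not fixed at 100 mm the way it is for the pencil chamber; that is the advantage the abstract claims for the phototransistor system.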
Abstract: Methane is the second most important greenhouse gas
(GHG) after carbon dioxide. The amount of methane emitted from the
energy sector is increasing day by day through various activities. In
the present work, various sources of methane emission from the
upstream, midstream and downstream segments of the oil & gas sector
are identified and categorised as per the IPCC 2006 guidelines. Data
were collected from various oil & gas activities: (i) exploration &
production of oil & gas, (ii) supply through pipelines, (iii) refinery
throughput & production, (iv) storage & transportation, and (v) end
use. Methane emission factors for the various categories were
determined by applying the Tier-II and Tier-I approaches to the
collected data. Total methane emission from the Indian oil & gas
sector was thus estimated for the years 1990 to 2007.
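The Tier-I style estimation this abstract applies multiplies activity data by category emission factors; a schematic sketch with placeholder values that are neither the paper's data nor IPCC factors:

```python
# Tier-I style estimate: emission = activity data x emission factor,
# summed over source categories. All values are hypothetical examples.
activities = {                      # activity data (e.g., 10^3 m^3 handled)
    "production":   50_000.0,
    "transmission": 30_000.0,
    "refining":     20_000.0,
}
emission_factors = {                # CH4 emission factor (Gg per unit activity)
    "production":   2.0e-3,
    "transmission": 1.1e-3,
    "refining":     0.4e-3,
}

total_ch4 = sum(activities[c] * emission_factors[c] for c in activities)
# total_ch4 is in Gg CH4 under these units
```

A Tier-II estimate has the same structure but replaces default factors with country- or facility-specific ones derived from collected data, which is the refinement the abstract describes.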
Abstract: Psoriasis is a widespread skin disease affecting up to 2% of the population, with plaque psoriasis accounting for about 80% of cases. It can be identified as a red lesion, and at higher severity the lesion is usually covered with rough scale. Psoriasis Area Severity Index (PASI) scoring is the gold-standard method for measuring psoriasis severity. Scaliness is one of the PASI parameters that needs to be quantified in PASI scoring. The surface roughness of a lesion can be used as a scaliness feature, since scale on the lesion surface makes the lesion rougher. The dermatologist usually assesses severity through the tactile sense, so direct contact between doctor and patient is required, and the doctor may not assess the lesion objectively. In this paper, a digital image analysis technique is developed to objectively determine the scaliness of the psoriasis lesion and provide the PASI scaliness score. The psoriasis lesion is modelled as a rough surface, created by superimposing a smooth average (curve) surface with a triangular waveform. For roughness determination, a polynomial surface fit is used to estimate the average surface, followed by subtraction of the average surface from the rough surface to give the elevation surface (surface deviations). The roughness index is calculated by applying the average roughness equation to the height-map matrix. The roughness algorithm has been tested on 444 lesion models. In the roughness validation, only 6 models could not be accepted (percentage error greater than 10%); these errors occur due to the scanned image quality. The roughness algorithm was also validated by roughness measurement on abrasive papers on a flat surface. The Pearson's correlation coefficient between the grade value (G) of the abrasive paper and Ra is -0.9488, which shows a strong relation between G and Ra. The algorithm needs to be improved by surface filtering, especially to overcome problems with noisy data.
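The roughness pipeline this abstract describes (polynomial fit of the average surface, subtraction to get surface deviations, then the average roughness Ra) can be sketched in one dimension as follows; the synthetic lesion profile is illustrative, not the paper's model:

```python
import numpy as np

# Synthetic "lesion" profile: a smooth average surface superimposed
# with a triangular waveform standing in for scale, as in the abstract.
x = np.linspace(0.0, 10.0, 500)
average_true = 0.05 * x ** 2 - 0.3 * x + 2.0     # smooth average surface
triangle = 0.2 * np.abs((x * 4.0) % 2.0 - 1.0)   # triangular waveform (scale)
surface = average_true + triangle

# Step 1: estimate the average surface with a polynomial fit
# (degree 2 is a choice matching the synthetic surface here).
coeffs = np.polyfit(x, surface, deg=2)
average_fit = np.polyval(coeffs, x)

# Step 2: subtract to obtain the elevation surface (surface deviations)
elevation = surface - average_fit

# Step 3: average roughness Ra = mean absolute deviation
Ra = float(np.mean(np.abs(elevation)))
```

On a 2-D height map the same steps apply with a polynomial surface fit and a matrix-wide mean of absolute deviations; rougher (scalier) lesions yield larger Ra.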
Abstract: This paper attempts to identify the significance of
Information and Communications Technology (ICT) and
competitiveness to the profit efficiency of commercial banks in
Malaysia. The profit efficiency of commercial banks in Malaysia, the
dependent variable, was estimated using the Stochastic Frontier
Approach (SFA) on a sample of unbalanced panel data, covering 23
commercial banks, from 1995 to 2007. Based on the empirical
results, ICT was not found to exert a significant impact on profit
efficiency, whereas competitiveness, non-ICT stock expenditure and
ownership were significant contributors. On the other hand, the size
of banks was found to have significantly reduced profit efficiency,
opening up for various interpretations of the interrelated role of ICT
and competition.
Abstract: The innovative intelligent fuzzy weighted input
estimation method (FWIEM) can be applied to the inverse heat
conduction problem (IHCP) to estimate the unknown
time-varying heat flux of the multilayer materials as presented in this
paper. The feasibility of the method is verified by a temperature
measurement experiment. The experimental module is constructed from a
copper sample stacked with four aluminum samples of different
thicknesses. The bottoms of the copper samples are heated by a
standard heat source, and the temperatures on the tops of the aluminum
samples are measured with thermocouples. The temperature measurements
are then used as inputs to the presented method to estimate the heat
flux at the bottoms of the copper samples. The influences on the
estimation of the thickness of the measured sample, the process noise
covariance Q, the weighting factor γ, the sampling time interval Δt,
and the spatial discretization interval Δx are investigated through
the experimental verification. The
results show that this method is efficient and robust to estimate the
unknown time-varying heat input of the multilayer materials.
Abstract: Thrombosis can be life threatening, necessitating its instant treatment. Hydergine, a nootropic agent, is used as a cognition enhancer in stroke patients, but relatively little is known about its anti-thrombotic effect. To investigate this aspect, in vivo and ex vivo experiments were designed and conducted. Three groups of rats were injected with 1.5 mg, 3.0 mg and 4.5 mg hydergine intraperitoneally, with and without prior exposure to fresh plasma. Positive and negative controls were run in parallel. Animals were sacrificed after 1.5 h, and BT, CT, PT, INR, APTT and plasma calcium levels were estimated. For the ex vivo analyses, each 1 ml of aspirated blood was exposed to a 0.1 mg, 0.2 mg or 0.3 mg dose of hydergine, with parallel controls. The parameters analyzed were as above. Statistical analysis was by one-way ANOVA; Duncan's and Tukey's tests provided intra-group variance. BT, CT, PT, INR and APTT increased while calcium levels dropped significantly (P
Abstract: Operating rooms are important assets for hospitals as
they generate the largest revenue and, at the same time, produce the
largest cost for hospitals. The model presented in this paper helps
make capacity planning decisions on the combination of open
operating rooms (ORs) and estimated overtime to satisfy the
allocated OR time to each specialty. The model combines both
decisions on determining the amount of OR time to open and to
allocate to different surgical specialties. The decisions made are
based on OR costs, overutilization and underutilization costs, and
contribution margins from allocating OR time. The results show the
importance of having a good estimate of specialty usage of OR time
to determine the amount of needed capacity and highlight the
tradeoff that the OR manager faces between opening more ORs
versus extending the working time of the ORs already in use.
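The tradeoff this abstract describes, opening more OR hours versus paying for overtime or idle time, can be sketched as a toy expected-cost minimization; the demand distribution and cost rates are illustrative, not the paper's model:

```python
import numpy as np

# Monte Carlo sample of one specialty's weekly OR-hour demand
# (distribution and all cost rates below are hypothetical).
rng = np.random.default_rng(3)
demand = rng.normal(40.0, 6.0, 10000)

def expected_cost(opened_hours, c_open=100.0, c_over=150.0, c_under=60.0):
    """Cost of opening OR time plus expected overtime (overutilization)
    and idle-time (underutilization) penalties."""
    over = np.maximum(demand - opened_hours, 0.0)    # overtime hours
    under = np.maximum(opened_hours - demand, 0.0)   # idle hours
    return c_open * opened_hours + c_over * float(over.mean()) \
        + c_under * float(under.mean())

# Grid search over candidate amounts of opened OR time
candidates = np.arange(30.0, 55.0, 0.5)
best = min(candidates, key=expected_cost)
```

With these numbers the optimum opens somewhat less time than mean demand and absorbs the tail in overtime, illustrating why a good estimate of specialty usage drives the open-versus-overtime decision.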
Abstract: The draft Auckland Unitary Plan outlines the future land use for new housing and businesses as Auckland's population grows over the next thirty years. According to the Auckland Unitary Plan, over the next 30 years the population of Auckland is projected to increase by one million, and up to 70% of total new dwellings will occur within the existing urban area. Intensification will not only increase the number of medium- or higher-density houses, such as terraced houses and apartment buildings, within the existing urban area but will also change the mean housing design data, which can impact building thermal performance under the local climate. Based on the mean energy consumption and building design data of a number of Auckland sample houses, and the relationships between them, this study estimates the future mean housing energy consumption associated with the change in mean housing design data and evaluates housing energy efficiency under the Auckland Unitary Plan.
Abstract: This study discusses the effect of uncertainty on
production levels of a petrochemical complex. Uncertainty or
variations in some model parameters, such as prices, supply and
demand of materials, can affect the optimality or the efficiency of any
chemical process. For any petrochemical complex with many plants,
there are many sources of uncertainty and frequent variations which
require more attention. Many optimization approaches are proposed
in the literature to incorporate uncertainty within the model in order
to obtain a robust solution. In this work, a stability analysis approach
is applied to a deterministic LP model of a petrochemical complex
consisting of ten plants to investigate the effect of such variations on
the obtained optimal production levels. The proposed approach can
determine the allowable variation ranges of some parameters,
mainly objective-function or RHS coefficients, before the system loses its
optimality. Parameters with relatively narrow range of variations, i.e.
stability limits, are classified as sensitive parameters or constraints
that need accurate estimate or intensive monitoring. These stability
limits offer easy-to-use information to the decision maker and help in
understanding the interaction between some model parameters and
deciding when the system needs to be re-optimized. The study shows
that maximum production of ethylene and the prices of intermediate
products are the most sensitive factors that affect the stability of the
optimum solution.