Abstract: Today, money laundering (ML) poses a serious threat
not only to financial institutions but also to nations. This criminal
activity is becoming increasingly sophisticated and appears to have
moved beyond the cliché of drug trafficking to financing terrorism,
not to mention personal gain. Most international financial
institutions have implemented anti-money laundering (AML)
solutions to fight investment fraud. However, traditional investigative
techniques consume numerous man-hours. Recently, data mining
approaches have been developed and are considered well suited
to detecting ML activities. Within the scope of a
collaborative project to develop a new solution for
the AML units of an international investment bank, we proposed a
data mining-based solution for AML. In this paper, we present a
heuristic approach to improve the performance of this solution. We
also show some preliminary results obtained with this method on
transaction datasets.
Abstract: To understand the operating characteristics of a micro combustor,
a computer code has been developed to study the combustion of a
hydrogen-air mixture in a series of chambers with the same aspect
ratio but dimensions ranging from the millimeter to the micrometer level.
The algorithm and the computer code are capable of
modeling mixture effects in different fluid flows, including chemical
reactions and viscous and mass diffusion effects. The effect on
combustion of various heat transfer conditions at the chamber wall
(e.g., an adiabatic wall, a wall with heat loss, and heat conduction
within the wall) is analyzed. These thermal conditions have strong
effects on the combustion, especially as the chamber dimension becomes
smaller and the ratio of surface area to volume becomes larger.
Both factors, larger heat loss through the chamber wall
and smaller chamber size, may lead to thermal
quenching of micro-scale combustion. Through such systematic
numerical analysis, a proper operating space for the micro-combustor
is suggested, which may be used as a guideline for micro-combustor
design. In addition, the results reported in this paper
illustrate that numerical simulation can be one of the most
powerful and beneficial tools for micro-combustor design,
optimization and performance analysis.
Abstract: This is a comprehensive large-sample study of Australian earnings management. Using a sample of 4,844 firm-year observations across nine Australian industries from 2000 to 2006, we find substantial corporate earnings management activity across several Australian industries. We document strong evidence that size and return on assets are primary determinants of earnings management in Australia. The effects of size and return on assets are also found to be dominant in both income-increasing and income-decreasing earnings manipulation. We also document that periphery-sector firms are more likely to engage in earnings management of larger magnitude than firms in the core sector.
Abstract: This paper presents a new adaptive DMC controller
that improves controller performance in the case of plant-model
mismatch. The new controller monitors the measured plant output,
compares it with the model output and calculates weights that are
applied to the controller move. Simulations show that the new
controller can help improve control performance and avoid instability
in the case of severe model mismatch.
Abstract: We demonstrate the synthesis of intermediary views
within a sequence of color-encoded, materials-discriminating X-ray
images that exhibit animated depth in a visual display. During the
image acquisition process, the requirement for a linear X-ray detector
array is replaced by image synthesis. The Scale Invariant Feature
Transform (SIFT), in combination with material-segmented morphing,
is employed to produce the synthetic imagery. A quantitative analysis of
the feature matching performance of SIFT is presented along with
a comparative study of the synthetic imagery. We show that the total
number of matches produced by SIFT decreases as the angular
separation between the generating views increases. This effect is
accompanied by an increase in the total number of synthetic pixel
errors. The trends observed are obtained from 15 different luggage
items. This programme of research is in collaboration with the UK
Home Office and the US Dept. of Homeland Security.
Abstract: In this report we present a rule-based approach to
detecting anomalous telephone calls. The method described here uses
subscriber usage CDR (call detail record) data sampled over two
observation periods: a study period and a test period. The study period
contains call records of customers' non-anomalous behaviour.
Customers are first grouped according to similar usage
behaviour (e.g., average number of local calls per week). For the
customers in each group, we develop a probabilistic model to describe
their usage. Next, we use maximum likelihood estimation (MLE) to
estimate the parameters of the calling behaviour. We then determine
thresholds by calculating the acceptable change within a group. MLE is
applied to the data in the test period to estimate the parameters of the
calling behaviour, and these parameters are compared against the
thresholds. Any deviation beyond a threshold is used to raise an alarm.
This method has the advantage of identifying local anomalies, as
compared to techniques that identify global anomalies. The method was
tested on 90 days of study data and 10 days of test data from telecom
customers. For medium to large deviations in the test window,
the method is able to identify 90% of anomalous usage with a false
alarm rate below 1%.
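The group-then-estimate-then-threshold pipeline described above can be sketched as follows. This is an illustrative toy, not the authors' implementation: it assumes (hypothetically) that daily call counts within one usage group are roughly Poisson, so the MLE of each customer's rate is simply the sample mean, and it places the group thresholds at three standard deviations of the study-period rates.

```python
import numpy as np

rng = np.random.default_rng(1)

# Study period: 90 days of daily call counts for one usage group of
# 100 customers whose true rates vary between 15 and 25 calls/day.
# (Group membership, rates, and the Poisson assumption are all
# hypothetical choices for this sketch.)
true_rates = rng.uniform(15, 25, size=100)
study = rng.poisson(true_rates[:, None], size=(100, 90))

# MLE of a Poisson rate is the sample mean; the "acceptable change
# within the group" is taken as mean +/- 3 std of the study rates.
study_rates = study.mean(axis=1)
lo = study_rates.mean() - 3 * study_rates.std()
hi = study_rates.mean() + 3 * study_rates.std()

# Test period: 10 days; customer 0 suddenly triples their usage.
test = rng.poisson(true_rates[:, None], size=(100, 10))
test[0] = rng.poisson(60, size=10)

# Re-estimate rates in the test window and compare to the thresholds;
# any estimate outside the band raises an alarm.
test_rates = test.mean(axis=1)
alarms = (test_rates < lo) | (test_rates > hi)
```

Because the thresholds are computed per group, a customer is compared only against peers with similar behaviour, which is what lets the method flag local anomalies that a single global threshold would miss.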
Abstract: This paper presents an integrated model that
automatically measures changes in rivers, the damaged area around
bridges, and changes in vegetation. The proposed model is based on
a neuro-fuzzy mechanism enhanced by an SOM optimization
algorithm, and it includes three functions for processing river imagery.
High-resolution imagery from the FORMOSAT-2 satellite, taken before
and after the invasion period, is adopted. For a bridge randomly
selected from the 129 destroyed bridges, the recognition results show that
the average river width increased by 66%. The ruined segment of the
bridge is located exactly at the region of greatest scour. The vegetation
coverage was also reduced to nearly 90% of the original. The results
yielded by the proposed model demonstrate a pinpoint accuracy rate
of 99.94%. This study provides a successful tool not only for
large-scale damage assessment but also for precise measurement of
disaster impacts.
Abstract: Atherosclerosis has been identified as a chronic inflammatory process resulting from interactions between plasma lipoproteins, cellular components (monocytes, macrophages, T lymphocytes, endothelial cells and smooth muscle cells) and the extracellular matrix of the arterial wall. Several types of genes are known to be expressed during the formation of atherosclerosis. This study was carried out to identify unknown differentially expressed genes (DEGs) in atherogenesis. Rabbit aorta tissues were stained with H&E for histomorphology. GeneFishing™ PCR analysis was performed on total RNA extracted from the aorta tissues. The DNA fragment from the DEG was cloned, sequenced and validated by real-time PCR. Histomorphology showed intimal thickening in the aorta. The DEG detected with ACP-41 was identified as the cathepsin B gene and showed upregulation at week 8 and week 12 of atherogenesis. Therefore, ACP-based GeneFishing™ PCR facilitated the identification of the cathepsin B gene, which was differentially expressed during the development of atherosclerosis.
Abstract: The interaction of tunneling or mining with
groundwater has become a very relevant problem, not only due to the
need to guarantee the safety of workers and to assure the efficiency of
tunnel drainage systems, but also to safeguard water resources
from impoverishment and pollution risk. It is therefore very
important to forecast the drainage processes (i.e., to evaluate the
drained discharge and the drawdown caused by the excavation). The aim
of this study was to better understand the system and to quantify the flow
drained from the Fontane mines, located in Val Germanasca (Turin,
Italy). This made it possible to understand the local hydrogeological
changes over time. The work has therefore been structured as follows:
the reconstruction of the conceptual model through geological,
hydrogeological and geological-structural study; the calculation of
the tunnel inflows (through the use of structural methods) and their
comparison with the measured flow rates; and the water balance at the
basin scale. In this way it was possible to understand the
relationships between rainfall, groundwater level variations and the
draining effect of the tunnels.
Subsequently, the effects produced by the excavation of the mining
tunnels were quantified through numerical modeling. In particular,
the modeling made it possible to observe the drawdown variation as a
function of the number of tunnels, the excavation depth and different
mine linings.
Abstract: This paper presents a new method for estimating the mean curve of impulse voltage waveforms recorded during impulse tests. In practice, these waveforms are distorted by noise, oscillations and overshoot. The problem is formulated as an estimation problem, and the signal parameters are estimated using a fast and accurate technique based on a discrete dynamic filtering (DDF) algorithm. The main advantage of the proposed technique is its ability to produce the estimates in a very short time and with a very high degree of accuracy. The algorithm uses sets of digital samples of the recorded impulse waveform. The proposed technique has been tested using simulated data of practical waveforms, and the effects of the number of samples and the data window size are studied. Results are reported and discussed.
Abstract: The human genome is not only the evolutionary
summation of all advantageous events; it also houses lesions of
deleterious footprints. A single gene mutation may sometimes
express multiple consequences in numerous tissues, and a linear
relationship between genotype and phenotype may often be obscure.
β-Thalassemia minor, a transfusion-independent mild anaemia,
coupled with environment among other factors, may articulate into
phenotypic pleiotropy with hypocholesterolemia, vitamin D
deficiency, tissue hypoxia, hyperparathyroidism and psychological
alterations. The occurrence of pancreatic insufficiency, resultant
steatorrhoea, and vitamin D (25-OH) deficiency (13.86 ngm/ml) with
hypocholesterolemia (85 mg/dl) in a 30-year-old male β-thalassemia-minor
patient (hemoglobin 11 mg/dl with fetal hemoglobin 2.10%, Hb A2
4.60%, Hb Adult 84.80% and an altered hemogram) with increased
parathyroid hormone (62 pg/ml) and moderate serum Ca2+
(9.5 mg/ml) indicates a cascade of phenotypic pleiotropy
in which the β-thalassemia mutation, be it in the 5' cap site of the
mRNA, differential splicing, etc., in the heterozygous state affects
several metabolic pathways. Compensatory extramedullary
hematopoiesis may not have coped well with the stressful lifestyle of
the young individual, and increased erythropoietic stress with a high
demand for cholesterol for RBC membrane synthesis may have
resulted in hypocholesterolemia. Oxidative stress and tissue hypoxia
may have caused the pancreatic insufficiency, leading to vitamin D
deficiency. This in turn may have caused secondary
hyperparathyroidism to sustain the serum calcium level. The irritability
and stress intolerance of the patient were a cumulative effect of the
vicious cycle of metabolic compromises. From these findings we propose
that the metabolic deficiencies in β-thalassemia mutations may
be considered the phenotypic display of pleiotropy to explain
the genetic epidemiology.
According to the recommendations from the NIH Workshop on
Gene-Environment Interplay in Common Complex Diseases: Forging
an Integrative Model, observational study designs should be
informed by gene-environment hypotheses, and the results of a study
of genetic diseases should be published to inform future hypotheses.
A variety of approaches is needed to capture data on all possible
aspects, each of which is likely to contribute to the etiology of
disease. Speakers also agreed that there is a need for the development of
new statistical methods and measurement tools to appraise
information that may be missed by conventional methods, where a
large sample size is needed to detect a considerable effect.
A meta-analytic cohort study in the future may bring significant
insight into the question raised in the title.
Abstract: In the upstream section we place a ring and rotate
it at 83 Hz, 166 Hz, 333 Hz and 666 Hz to find the effect of the
periodic distortion. In a physical experiment this type of perturbation
would not be permitted, since a mechanical failure of any part of the
upstream equipment would destroy the blade system; this type of
study is only possible by CFD. We use two pumps: NS32
(ENSAM) and a three-blade pump (Tamagawa Univ.). The benchmark
computations were performed without the perturbation parts and confirm
that the computational results agree well with the measured head-flow
rate. We obtained the pressure fluctuation growth rate, which represents
the global instability of the turbo-system. The fluctuating torque
components were 0.01 Nm (5000 rpm), 0.1 Nm (10000 rpm),
0.04 Nm (20000 rpm) and 0.15 Nm (40000 rpm), respectively. Only at
10000 rpm (166 Hz) was the output torque random, which implies that
unsteady flow is created by separations on the blades and will reduce the
pressure loss significantly.
Abstract: Leave of absence is important in maintaining the
quality of human resources. Allowing employees to be temporarily
free from their routine assignments can revitalize workers' morale
and productivity. This is particularly critical for securing satisfactory
service quality among healthcare professionals, whose work is typically
labor-intensive and complicated. As
one of the veterans' hospitals founded and operated by the
Veteran Department of Taiwan, the case hospital had its nursing staff
squeezed to an extreme minimum under the pressure of a
tight budget. Taking leave on schedule became extremely
difficult, especially in the intensive care units (ICU), which
require close monitoring of the patients under care, and this had made
the ICU nurses all the more anxious. Even worse, the deferred leaves
in the ICU exceeded 10 days at any given time because of fluctuating
occupancy. As a result, this had caused a serious setback for this
particular nursing team and consequently undermined job
performance and service quality. To solve this problem and
thereby strengthen morale, a cross-departmental project team was
organized specifically for this purpose. Detailed information
regarding job and position requirements, labor resources, and actual
working hours was collected and analyzed in the team
meetings. Several alternatives were finalized. These included job
rotation, job combination, impromptu leave and cross-departmental
redeployment. Consequently, the deferred leave days were sharply
reduced by 70%, to a level of 3 days or fewer. This improvement not only
provided relief for the ICU nurses, which improved their job
performance and patient safety, but also encouraged the nurses to
participate actively in projects and to learn problem-solving skills
with their colleagues.
Abstract: Is a company's CSR commitment, as stated in its Social
Report, actually perceived by its stakeholders? And to what
extent? Moreover, are stakeholders satisfied with the company's
CSR efforts? Indeed, business returns from Corporate Social
Responsibility (CSR) practices, such as company reputation and
customer loyalty, depend heavily on how stakeholders perceive the
company's social conduct. In this paper, we propose a methodology to
assess a company's CSR commitment based on Global Reporting
Initiative (GRI) indicators, Content Analysis and a CSR positioning
matrix. We evaluate three aspects of CSR: the commitment the company
discloses through its Social Report; the commitment
perceived by its stakeholders; and the CSR commitment that stakeholders
require of the company. The positioning of the company under study
in the CSR matrix is based on the comparison among the three
commitment aspects (disclosed, perceived, required), and it allows the
assessment and development of CSR strategies.
Abstract: Reinforced concrete crash barriers used in road traffic
must meet a number of criteria. Crash barriers are laid lengthwise,
one behind another, and joined using specially designed steel locks.
While developing BSV reinforced concrete crash barriers (type
ŽPSV), experiments and calculations were carried out to optimize the
shape of a newly designed lock and the quantity and distribution of
reinforcement in a crash barrier. The tension-carrying
capacity of two locks joined in parallel was determined experimentally.
Based on the performed experiments, the nonlinear properties of steel
used in the calculations were adjusted. The obtained
results served as a basis for optimizing the lock design using a
computational model that takes into account the plastic behaviour of
steel and the influence of the surrounding concrete [6]. The response
to vehicle impact has been analyzed using a specially elaborated
complex computational model, comprising both the nonlinear model
of the damping wall or crash barrier and a detailed model of the
vehicle [7].
Abstract: Data quality is a complex and unstructured concept that concerns information systems managers. The reason for this attention is the high cost of maintaining and cleaning inefficient data. Beyond these expenses, poor-quality data causes wrong statistics, analyses and decisions in organizations. Managers therefore intend to improve the quality of their information systems' data. One of the basic prerequisites of quality improvement is evaluating the current level of quality. In this paper, we present a precautionary method whose application gives the data of information systems better quality. Our method covers different dimensions of data quality and therefore has the necessary integrity. The presented method has been tested on three dimensions (accuracy, value-added and believability), and the results confirm the improvement achieved and the integrity of the method.
Abstract: Through an analysis of the residual strain and stress
distributions obtained at the surface of high-speed-milled specimens
of AA 6082-T6 aluminium alloy, the performance of an improved
indentation method is evaluated. This method integrates a special
indentation device into a universal measuring machine. The
device makes it possible to introduce elongated indents, which
diminishes the absolute error of measurement. It must be noted that the
present method offers the great advantage of avoiding both
specific equipment and highly qualified personnel, and their inherent
high costs. In this work, the cutting tool geometry and the high-speed
parameters are selected to introduce reduced plastic damage.
By varying the depth of cut, the stability of the shapes
adopted by the residual strain and stress distributions is evaluated.
The results show that the strain and stress distributions remain
unchanged, compressive and small. Moreover, these distributions
reveal a similar asymmetry when the gradients corresponding to the
conventional and climb cutting zones are compared.
Abstract: In this paper, we study the application of the Extreme
Learning Machine (ELM) algorithm for single-hidden-layer feedforward
neural networks to non-linear chaotic time series problems. In this
algorithm the input weights and the hidden layer biases are randomly
chosen. The ELM formulation leads to solving a system of linear
equations in terms of the unknown weights connecting the hidden
layer to the output layer. The solution of this general system of
linear equations is obtained using the Moore-Penrose generalized
pseudoinverse. To study the application of the method, we
consider the time series generated by the Mackey-Glass delay
differential equation with different time delays, the Santa Fe A series
and the UCR heart beat rate ECG time series. For the sigmoid,
sine and hardlim activation functions, the optimal values of the
memory order and the number of hidden neurons that give the
best prediction performance in terms of root mean square error are
determined. It is observed that the results obtained are in close
agreement with the exact solutions of the problems considered,
which clearly shows that ELM is a very promising alternative
method for time series prediction.
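The ELM training step described above, random input weights and biases followed by a closed-form least-squares solve for the output weights, can be sketched as follows. The toy series, memory order and neuron count are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np

def elm_train(X, T, n_hidden, rng):
    """Train a single-hidden-layer feedforward net via ELM.
    Input weights and biases are random; the output weights are
    obtained in closed form with the Moore-Penrose pseudoinverse."""
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # sigmoid hidden outputs
    beta = np.linalg.pinv(H) @ T                     # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# One-step-ahead prediction of a toy oscillatory series with memory order 4:
rng = np.random.default_rng(0)
s = np.sin(np.arange(300) * 0.3) + 0.5 * np.sin(np.arange(300) * 0.11)
m = 4                                                # memory order: s[t-4:t] -> s[t]
X = np.array([s[t - m:t] for t in range(m, len(s))])
T = s[m:]
W, b, beta = elm_train(X, T, n_hidden=40, rng=rng)
rmse = np.sqrt(np.mean((elm_predict(X, W, b, beta) - T) ** 2))
```

Because only the output weights are fitted, the whole "training" is a single pseudoinverse computation, which is what makes ELM fast compared with gradient-based network training.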
Abstract: This paper presents work on signal discrimination,
specifically for the electrocardiogram (ECG) waveform. An ECG signal
comprises P, QRS and T waves in each normal heart beat, which
describe a pattern of heart rhythms specific to an
individual. Further medical diagnosis can then be performed to detect
heart-related disease using the ECG information. The classification of
the QRS complex is discussed further to illustrate its
importance. The Pan-Tompkins algorithm, a widely known
technique, has been adapted to realize the QRS complex
classification process. There are eight steps involved, namely
sampling, normalization, low-pass filtering, high-pass filtering
(together forming a band-pass filter), differentiation, squaring,
averaging and, lastly, QRS detection. The simulation results obtained
are presented in a Graphical User Interface (GUI) developed using MATLAB.
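A much-simplified sketch of this eight-step chain is shown below. The moving-average filters and the fixed threshold here are crude stand-ins for the original Pan-Tompkins band-pass filters and adaptive thresholds, so this illustrates only the sequence of steps, not the algorithm as implemented in the paper.

```python
import numpy as np

def detect_qrs(ecg, fs):
    """Simplified Pan-Tompkins-style QRS detection (illustrative only).
    Steps: normalize -> crude band-pass (short moving average minus a
    slow one) -> derivative -> squaring -> moving-window integration
    -> fixed-threshold detection with a refractory period."""
    x = (ecg - ecg.mean()) / (np.abs(ecg).max() + 1e-12)   # normalization

    def movavg(v, n):
        return np.convolve(v, np.ones(n) / n, mode="same")

    lp = movavg(x, int(0.04 * fs))        # low-pass: short moving average
    bp = lp - movavg(lp, int(0.4 * fs))   # high-pass: subtract slow trend
    d = np.gradient(bp)                   # derivative emphasizes steep QRS slopes
    sq = d ** 2                           # squaring makes all slopes positive
    integ = movavg(sq, int(0.15 * fs))    # moving-window integration
    thr = 0.4 * integ.max()               # simple fixed threshold
    above = integ > thr
    edges = np.flatnonzero(above[1:] & ~above[:-1])  # rising threshold crossings
    beats, last = [], -np.inf
    for e in edges:                       # 200 ms refractory period
        if e - last > 0.2 * fs:
            beats.append(e)
            last = e
    return np.array(beats)

# Synthetic test signal: 10 narrow "R wave" spikes on a slow baseline.
fs = 250
t = np.arange(0, 10, 1 / fs)
ecg = 0.1 * np.sin(2 * np.pi * 0.3 * t)
for r in np.arange(0.5, 10, 1.0):
    ecg += np.exp(-((t - r) ** 2) / (2 * 0.01 ** 2))
beats = detect_qrs(ecg, fs)
```

The derivative and squaring stages are the heart of the idea: they convert the steep slopes of the QRS complex into large positive values, so a simple threshold on the integrated signal can separate QRS complexes from the lower-frequency P and T waves.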
Abstract: In many turbulent flows, a major part of the flow field
involves no complicated turbulent behavior. In this research work, in
order to reduce the required memory and CPU time, the flow field was
decomposed into several blocks, each with its own turbulence
treatment. A two-dimensional backward-facing step was considered
here. Four combinations of the Prandtl mixing length and standard
k-ε models were implemented as well. Computer memory and CPU
time consumption, in addition to the numerical convergence and accuracy
of the obtained results, were the main subjects of investigation. The
observations showed that a suitable combination of turbulence models in
different blocks led to results with the same accuracy as using the
higher-order turbulence model in all of the blocks, in addition to
reductions in memory and CPU time consumption.