Abstract: Cosmic rays, arriving from their places of origin in space,
generate cascades of secondary particles called Extensive Air Showers
(EAS) upon entering the Earth's atmosphere. Detection and analysis of
EAS and similar high-energy particle showers involve a plethora of
experimental setups with constraints for which soft-computational tools
like Artificial Neural Networks (ANNs) can be adopted. The optimality
of ANN classifiers can be enhanced further by the use of a Multiple
Classifier System (MCS) and certain data-dimension reduction
techniques. This work describes the performance of data-dimension
reduction techniques such as Principal Component Analysis (PCA),
Independent Component Analysis (ICA) and Self-Organizing Map (SOM)
approximators for application with an MCS formed using a Multi-Layer
Perceptron (MLP), a Recurrent Neural Network (RNN) and a Probabilistic
Neural Network (PNN). The data inputs are obtained from an array of
detectors placed in a circular arrangement resembling a practical
detector grid; these inputs are high-dimensional and strongly
correlated. The PCA, ICA and SOM blocks reduce the correlation and
generate a form suitable for real-time practical prediction of the
primary energy and location of EAS from density values captured by
detectors in a circular grid.
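As a concrete illustration of the dimension-reduction stage (not taken from the study), the sketch below applies PCA via the SVD to synthetic correlated "detector density" readings; the array sizes, noise level and variance threshold are assumptions made only for this example.

```python
import numpy as np

# Synthetic stand-in for correlated detector densities: a few latent shower
# parameters mixed into many detector channels (all sizes are illustrative).
rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 3))           # 3 underlying shower parameters
mixing = rng.normal(size=(3, 16))            # 16 correlated detector channels
densities = latent @ mixing + 0.1 * rng.normal(size=(500, 16))

# PCA via the SVD of the centered data matrix
X = densities - densities.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
explained = (S ** 2) / (S ** 2).sum()
k = int(np.searchsorted(np.cumsum(explained), 0.95) + 1)  # keep 95% variance

reduced = X @ Vt[:k].T                       # decorrelated, low-dimensional inputs
print(k, reduced.shape)
```

The `reduced` array would then serve as the lower-dimensional, decorrelated input to the classifier ensemble.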
Abstract: Nowadays, ontologies are the only widely accepted paradigm for managing sharable and reusable knowledge in a way that allows its automatic interpretation. They are collaboratively created across the Web and used to index, search and annotate documents. The vast majority of ontology-based approaches, however, focus on indexing texts at the document level. Recently, with advances in ontological engineering, it became clear that information indexing can benefit greatly from the use of general-purpose ontologies, which aid the indexing of documents at the word level. This paper presents a concept indexing algorithm which adds ontology information to words and phrases and allows full text to be searched, browsed and analyzed at different levels of abstraction. The algorithm uses a general-purpose ontology, OntoRo, and an ontologically tagged corpus, OntoCorp, both developed for the purpose of this research. OntoRo and OntoCorp are used in a two-stage supervised machine learning process aimed at generating ontology tagging rules. The first experimental tests show a tagging accuracy of 78.91%, which is encouraging in terms of the further improvement of the algorithm.
Abstract: This paper reports on the enhanced photoluminescence
(PL) of nanocomposites through the layered structuring of phosphor
and quantum dot (QD). Green phosphor of Sr2SiO4:Eu, red QDs of
CdSe/CdS/CdZnS/ZnS core-multishell, and thermo-curable resin
were used for this study. Two kinds of composite (layered and mixed)
were prepared, and the schemes for optical energy transfer between
QD and phosphor were suggested and investigated based on PL decay
characteristics. It was found that the layered structure is more effective
than the mixed one in terms of PL intensity, PL decay and thermal loss.
When this layered nanocomposite (QDs on phosphor) is used to make a
white light-emitting diode (LED), the brightness is increased by 37%,
and the color rendering index (CRI) value is raised to 88.4, compared
with 80.4 for the mixed case.
Abstract: In recent years, fast neural networks for object/face detection have been introduced based on cross correlation in the frequency domain between the input matrix and the hidden weights of neural networks. In our previous papers [3,4], fast neural networks for certain code detection were introduced. It was proved in [10] that for fast neural networks to give the same correct results as conventional neural networks, both the weights of the neural networks and the input matrix must be symmetric. This condition made those fast neural networks slower than conventional neural networks. Another symmetric form for the input matrix was introduced in [1-9] to speed up the operation of these fast neural networks. Here, corrections to the cross correlation equations (given in [13,15,16]) to compensate for the symmetry condition are presented. After these corrections, it is proved mathematically that the number of computation steps required by fast neural networks is less than that needed by classical neural networks. Furthermore, there is no need to convert the input data into symmetric form. Moreover, this new idea is applied to increase the speed of neural networks when processing complex values. Simulation results after these corrections, obtained using MATLAB, confirm the theoretical computations.
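The frequency-domain computation at the heart of these fast networks can be sketched as follows. This is the generic circular cross-correlation identity, not the paper's corrected equations; the signal and weight vectors are random illustrations.

```python
import numpy as np

# Circular cross correlation computed in the frequency domain matches the
# spatial-domain result without requiring symmetric inputs.
rng = np.random.default_rng(1)
x = rng.normal(size=64)                      # input signal (e.g., an image row)
w = rng.normal(size=64)                      # hidden-layer weight vector

# Frequency domain: corr = IFFT(conj(FFT(w)) * FFT(x))
corr_freq = np.fft.ifft(np.conj(np.fft.fft(w)) * np.fft.fft(x)).real

# Direct spatial-domain circular cross correlation, for comparison
corr_direct = np.array([np.dot(w, np.roll(x, -s)) for s in range(64)])
print(np.allclose(corr_freq, corr_direct))   # → True
```

The speed advantage comes from the FFT's O(N log N) cost versus the O(N²) cost of the direct sliding dot products.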
Abstract: This is a comprehensive large-sample study of Australian earnings management. Using a sample of 4,844 firm-year observations across nine Australian industries from 2000 to 2006, we find substantial corporate earnings management activity across several Australian industries. We document strong evidence that size and return on assets are the primary determinants of earnings management in Australia. The effects of size and return on assets are also found to be dominant in both income-increasing and income-decreasing earnings manipulation. We also document that periphery-sector firms are more likely to engage in earnings management of larger magnitude than firms in the core sector.
Abstract: In the present study, the incorporation of graphene
into blends of acrylonitrile-butadiene-styrene terpolymer with
polypropylene (ABS/PP) was investigated focusing on the
improvement of their thermomechanical characteristics and the effect
on their rheological behavior. The blends were prepared by melt
mixing in a twin-screw extruder and were characterized by measuring
the MFI as well as by performing DSC, TGA and mechanical tests.
The addition of graphene to ABS/PP blends tends to increase their
melt viscosity, due to the confinement of polymer chain motion.
Graphene also causes an increase in the crystallization temperature
(Tc), especially in blends with higher PP content, because of the
reduction of the surface energy for PP nucleation, which is a
consequence of the attachment of PP chains to the surface of graphene
through the intermolecular CH-π interaction. Moreover, this nanofiller
improves the thermal stability of PP and increases the residue of
thermal degradation at all the investigated blend compositions, due to
the thermal insulation effect and the mass-transport barrier effect.
Regarding the mechanical properties, the addition of graphene improves
the elastic modulus, because of its intrinsic mechanical
characteristics and rigidity, and this effect is particularly strong
in the case of pure PP.
Abstract: In this report we present a rule-based approach to
detect anomalous telephone calls. The method described here uses
subscriber usage CDR (call detail record) data sampled over two
observation periods: a study period and a test period. The study period
contains call records of customers' non-anomalous behaviour.
Customers are first grouped according to similar usage behaviour
(e.g., average number of local calls per week). For
customers in each group, we develop a probabilistic model to describe
their usage. Next, we use maximum likelihood estimation (MLE) to
estimate the parameters of the calling behaviour. Then we determine
thresholds by calculating acceptable change within a group. MLE is
used on the data in the test period to estimate the parameters of the
calling behaviour. These parameters are compared against thresholds.
Any deviation beyond the threshold is used to raise an alarm. This
method has the advantage of identifying local anomalies as compared
to techniques which identify global anomalies. The method is tested
for 90 days of study data and 10 days of test data of telecom
customers. For medium to large deviations in the data in test window,
the method is able to identify 90% of anomalous usage with less than
1% false alarm rate.
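The fit-then-threshold scheme can be sketched as below. This is a minimal illustration, not the paper's model: it assumes a single Poisson-distributed daily call count per group, a hypothetical 4-sigma threshold, and simulated data.

```python
import numpy as np

# Model a group's daily call counts as Poisson, fit by MLE on the study
# period, and flag test-period behaviour beyond the group threshold.
rng = np.random.default_rng(2)
study = rng.poisson(lam=20, size=90)         # 90 days of normal usage
lam_hat = study.mean()                       # MLE of the Poisson rate

# Threshold: acceptable change within the group (4 standard deviations here)
upper = lam_hat + 4 * np.sqrt(lam_hat)

test_normal = rng.poisson(lam=20, size=10).mean()     # 10-day test window
test_anomalous = rng.poisson(lam=60, size=10).mean()  # simulated anomaly
print(test_normal > upper, test_anomalous > upper)    # → False True
```

A deviation beyond `upper` would raise an alarm; per-group thresholds are what make the anomalies local rather than global.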
Abstract: This paper presents a new method for estimating the mean curve of impulse voltage waveforms recorded during impulse tests. In practice, these waveforms are distorted by noise, oscillations and overshoot. The problem is formulated as an estimation problem, and the signal parameters are estimated using a fast and accurate technique based on a discrete dynamic filtering (DDF) algorithm. The main advantage of the proposed technique is its ability to produce the estimates in a very short time and with a very high degree of accuracy. The algorithm uses sets of digital samples of the recorded impulse waveform. The proposed technique has been tested using simulated data of practical waveforms. The effects of the number of samples and the data window size are studied. Results are reported and discussed.
Abstract: The human genome is not only the evolutionary
summation of all advantageous events, but also houses lesions of
deleterious footprints. A single gene mutation may sometimes
express multiple consequences in numerous tissues, and a linear
relationship between genotype and phenotype may often be obscure.
β-Thalassemia minor, a transfusion-independent mild anaemia,
coupled with environment among other factors, may articulate into
phenotypic pleiotropy with hypocholesterolemia, vitamin D
deficiency, tissue hypoxia, hyperparathyroidism and psychological
alterations. The occurrence of pancreatic insufficiency, resultant
steatorrhoea, vitamin D (25-OH) deficiency (13.86 ng/ml) and
hypocholesterolemia (85 mg/dl) in a 30-year-old male β-Thal-minor
patient (hemoglobin 11 mg/dl with fetal hemoglobin 2.10%, Hb A2
4.60% and Hb Adult 84.80%, and an altered hemogram) with increased
parathyroid hormone (62 pg/ml) and moderate serum Ca2+
(9.5 mg/ml) indicates a cascade of phenotypic pleiotropy
in which the β-thalassemia mutation, be it in the 5' cap site of the
mRNA, differential splicing, etc., in the heterozygous state is
affecting several metabolic pathways. Compensatory extramedullary
hematopoiesis may not have coped well with the stressful lifestyle of
the young individual, and increased erythropoietic stress, with a high
demand for cholesterol for RBC membrane synthesis, may have
resulted in hypocholesterolemia. Oxidative stress and tissue hypoxia
may have caused the pancreatic insufficiency, leading to vitamin D
deficiency. This may in turn have caused secondary
hyperparathyroidism to sustain the serum calcium level. The
irritability and stress intolerance of the patient were a cumulative
effect of the vicious cycle of metabolic compromises. From these
findings we propose that the metabolic deficiencies in β-thalassemia
mutations may be considered the phenotypic display of pleiotropy to
explain the genetic epidemiology.
According to the recommendations from the NIH Workshop on
Gene-Environment Interplay in Common Complex Diseases: Forging
an Integrative Model, the design of observational studies should be
informed by gene-environment hypotheses, and the results of studies
of genetic diseases should be published to inform future hypotheses.
A variety of approaches is needed to capture data on all possible
aspects, each of which is likely to contribute to the etiology of
disease. Speakers also agreed that there is a need to develop
new statistical methods and measurement tools to appraise
information that may be missed by conventional methods, where a
large sample size is needed to resolve a considerable effect.
A meta-analytic cohort study may in future bring significant
insight into the question posed in the title.
Abstract: Leave of absence is important in maintaining the quality
of human resources. Allowing employees to be temporarily free from
routine assignments can revitalize workers' morale and productivity.
This is particularly critical for securing a satisfactory service
quality among healthcare professionals, whose work is typically
labor-intensive and complicated. As one of the veterans' hospitals
founded and operated by the Veteran Department of Taiwan, the case
hospital had its nursing staff squeezed to an extreme minimum under
the pressure of tight budgeting. Taking leave on schedule became
extremely difficult, especially in the intensive care units (ICU),
which require close monitoring of patients and therefore place ICU
nurses under greater stress. Even worse, the deferred leave exceeded
10 days at any given time in the ICU because of fluctuating occupancy.
These conditions set back this particular nursing team and
consequently undermined job performance and service quality. To solve
this problem and thereby strengthen morale, a project team was
organized across different departments. Detailed information regarding
job and position requirements, labor resources, and actual working
hours was collected and analyzed in team meetings. Several
alternatives were finalized, including job rotation, job combination,
impromptu leave and cross-departmental redeployment. Consequently,
deferred leave was sharply reduced by 70%, to 3 days or fewer. This
improvement not only relieved the ICU nurses, improving their job
performance and patient safety, but also encouraged them to
participate actively in a project and to learn the skills of solving
problems with colleagues.
Abstract: Is a company's CSR commitment, as stated in its Social
Report, actually perceived by its stakeholders? And to what extent?
Moreover, are stakeholders satisfied with the company's CSR efforts?
Indeed, business returns from Corporate Social Responsibility (CSR)
practices, such as company reputation and customer loyalty, depend
heavily on how stakeholders perceive the company's social conduct. In
this paper, we propose a methodology to assess a company's CSR
commitment based on Global Reporting Initiative (GRI) indicators,
Content Analysis and a CSR positioning matrix. We evaluate three
aspects of CSR: the commitment the company discloses through its
Social Report; the commitment perceived by its stakeholders; and the
CSR commitment that stakeholders require of the company. The
positioning of the company under study in the CSR matrix is based on
a comparison among the three commitment aspects (disclosed, perceived,
required), and it allows the assessment and development of CSR
strategies.
Abstract: A different concept for the design and detailing of
reinforced concrete precast frame structures is analyzed in this
paper. The new detailing of the joints derives from special hybrid
moment-frame joints. The special reinforcements of this alternative
detailing, named the modified special hybrid joint, are bondless with
respect to both the column and the beams. Full-scale tests were
performed on a planar model representing part of a 5-story structure,
cropped at the mid-span of the beams and columns. A theoretical
approach was developed, based on the results of testing the
twice-repaired model subjected to lateral seismic-type loading. A
discussion of the behavior of the modified special hybrid joint, and
of the further research needed, concludes the presentation.
Abstract: Color image segmentation can be considered a
clustering procedure in feature space. The k-means algorithm and its
adaptive version, the competitive learning approach, are powerful
tools for data clustering. However, k-means and competitive learning
suffer from several drawbacks, such as the dead-unit problem and the
need to pre-specify the number of clusters. In this paper, we explore
the use of a competitive and cooperative learning (CCL) approach to
perform color image segmentation. In this approach, seed points not
only compete with each other; the winner also dynamically selects
several of its nearest competitors to form a cooperative team that
adapts to the input together. As a result, the method can
automatically select the correct number of clusters and avoid the
dead-unit problem. Experimental results show that CCL obtains better
segmentation results.
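For reference, the baseline being improved upon can be sketched as plain k-means on pixel colors; the CCL method itself is not reproduced here, and the two-color "image" below is a synthetic illustration.

```python
import numpy as np

# Plain k-means on RGB pixel values: note that k must be fixed in advance,
# one of the drawbacks that motivates the CCL approach.
rng = np.random.default_rng(3)
pixels = np.vstack([rng.normal([200, 40, 40], 8, size=(300, 3)),   # reddish
                    rng.normal([40, 40, 200], 8, size=(300, 3))])  # bluish

k = 2
centers = np.array([pixels[0], pixels[-1]])  # one seed per region, for determinism
for _ in range(10):
    # assignment step: nearest center in RGB feature space
    labels = np.argmin(((pixels[:, None] - centers) ** 2).sum(-1), axis=1)
    # update step: each center moves to the mean of its assigned pixels
    centers = np.array([pixels[labels == j].mean(axis=0) for j in range(k)])
print(np.round(centers).astype(int))
```

Each pixel's cluster label then becomes its segment label; CCL replaces the fixed `k` and winner-take-all update with competition plus a cooperating team of nearby seed points.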
Abstract: Through a proper analysis of residual strain and stress
distributions obtained at the surface of high speed milled specimens
of AA 6082–T6 aluminium alloy, the performance of an improved
indentation method is evaluated. This method integrates a special
indentation device into a universal measuring machine. The device
introduces elongated indents, which diminishes the absolute error of
measurement. It must be noted that the present method offers the great
advantage of avoiding the need for both specific equipment and highly
qualified personnel, and their inherently high costs. In this work,
the cutting-tool geometry and the high-speed parameters are selected
to introduce reduced plastic damage.
Through the variation of the depth of cut, the stability of the shapes
adopted by the residual strain and stress distributions is evaluated.
The results show that the strain and stress distributions remain
unchanged, compressive and small. Moreover, these distributions
reveal a similar asymmetry when the gradients corresponding to
conventional and climb cutting zones are compared.
Abstract: Physical methods for RNA secondary-structure prediction are time-consuming and expensive, so computational prediction is a proper alternative. Various algorithms have been used for RNA structure prediction, including dynamic programming and metaheuristic algorithms. Harmony search, a recently developed metaheuristic inspired by musicians' improvisation behavior, has been successful in a wide variety of complex optimization problems. This paper proposes a harmony search algorithm (HSRNAFold) to find RNA secondary structures with minimum free energy that are similar to the native structure. HSRNAFold is compared with the dynamic programming benchmark mfold and with metaheuristic algorithms (RnaPredict, SetPSO and HelixPSO). The results show that HSRNAFold is comparable to mfold and better than the other metaheuristics in finding the minimum free energies and the number of correct base pairs.
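The harmony search scheme itself can be sketched on a toy objective. This is a generic illustration of the metaheuristic, minimizing a simple quadratic rather than RNA free energy; the parameter values (memory size, HMCR, PAR, bandwidth) are conventional choices, not those of HSRNAFold.

```python
import random

random.seed(4)

def f(x):                                    # toy objective: minimum at x = 3
    return (x - 3.0) ** 2

HMS, HMCR, PAR, BW = 10, 0.9, 0.3, 0.5       # memory size and standard HS rates
memory = [random.uniform(-10, 10) for _ in range(HMS)]

for _ in range(2000):
    if random.random() < HMCR:               # pick a value from harmony memory...
        x = random.choice(memory)
        if random.random() < PAR:            # ...and maybe pitch-adjust it
            x += random.uniform(-BW, BW)
    else:                                    # or improvise a fresh random value
        x = random.uniform(-10, 10)
    worst = max(range(HMS), key=lambda i: f(memory[i]))
    if f(x) < f(memory[worst]):              # replace the worst harmony
        memory[worst] = x

best = min(memory, key=f)
print(round(best, 2))                        # close to 3.0
```

In HSRNAFold the decision variables would encode candidate helices and the objective would be the thermodynamic free energy of the folded structure.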
Abstract: This paper presents work on signal discrimination,
specifically for the electrocardiogram (ECG) waveform. An ECG signal
comprises P, QRS and T waves in each normal heartbeat, describing a
heart-rhythm pattern specific to an individual. Further medical
diagnosis can be carried out from ECG information to determine any
heart-related disease. The emphasis on QRS-complex classification is
discussed further to illustrate its importance. The Pan-Tompkins
algorithm, a widely known technique, has been adapted to realize the
QRS-complex classification process. Eight steps are involved, namely
sampling, normalization, low-pass filtering and high-pass filtering
(building a band-pass filter), differentiation, squaring, averaging
and finally QRS detection. The simulation results obtained are
presented in a Graphical User Interface (GUI) developed using MATLAB.
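The later stages of the pipeline can be sketched as below. This is a simplified illustration of the derivative, squaring, moving-window integration and thresholding steps on a synthetic spike train standing in for ECG; the band-pass stage, the exact coefficients and the sampling rate are not taken from the paper.

```python
import numpy as np

# Synthetic "ECG": one crude QRS bump per second for 10 seconds.
fs = 200                                     # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
ecg = np.zeros_like(t)
for b in np.arange(0.5, 10, 1.0):            # beat times
    i = int(b * fs)
    ecg[i - 2:i + 3] = [0.2, 0.6, 1.0, 0.6, 0.2]

deriv = np.gradient(ecg)                     # differentiation stage
squared = deriv ** 2                         # squaring stage
window = int(0.15 * fs)                      # 150 ms integration window
mwi = np.convolve(squared, np.ones(window) / window, mode="same")

thresh = 0.5 * mwi.max()                     # simple fixed threshold
above = mwi > thresh
onsets = np.flatnonzero(above[1:] & ~above[:-1])  # rising edges = detected beats
print(len(onsets))                           # → 10
```

The squaring step makes all slopes positive and emphasizes the steep QRS edges, which is why the integrated signal peaks once per beat.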
Abstract: Bioinformatics and cheminformatics are computational disciplines providing tools for the acquisition, storage, processing, analysis and integration of biological and chemical data, and for the development of potential applications of such data. A chemical database is a database designed exclusively to store chemical information. NMRShiftDB is one of the main databases used to represent chemical structures in 2D or 3D. The SMILES format is one of many ways to write a chemical structure in a linear format. In this study we extracted antimicrobial structures in SMILES format from NMRShiftDB and stored them, with their corresponding information, in our local data warehouse. Additionally, we developed a search tool that responds to a user's query using the JME Editor, which allows the user to draw or edit molecules and converts the drawn structure into SMILES format. We applied the Quick Search algorithm to search for antimicrobial structures in our local data warehouse.
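The string-matching step can be sketched with Sunday's Quick Search algorithm applied to SMILES strings; the molecules below are generic examples, not records from the study's warehouse.

```python
def quick_search(pattern, text):
    """Sunday's Quick Search: return start indices of pattern in text."""
    m, n = len(pattern), len(text)
    # Shift table: jump distance based on the character just past the window
    shift = {c: m - i for i, c in enumerate(pattern)}
    hits, i = [], 0
    while i <= n - m:
        if text[i:i + m] == pattern:
            hits.append(i)
        if i + m >= n:                       # no character past the window
            break
        i += shift.get(text[i + m], m + 1)
    return hits

store = {
    "aspirin": "CC(=O)Oc1ccccc1C(=O)O",
    "paracetamol": "CC(=O)Nc1ccc(O)cc1",
}
# Substring query: which stored SMILES contain the fragment "c1ccc"?
matches = [name for name, s in store.items() if quick_search("c1ccc", s)]
print(matches)                               # → ['aspirin', 'paracetamol']
```

Note that substring matching over SMILES is only a rough filter: the same molecule can have many valid SMILES spellings, so a production system would canonicalize strings first.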
Abstract: This research aimed to study the potential for
recycling organic waste at Suan Sunandha Rajabhat University as
compost. In doing so, the composition of solid waste generated on the
campus was investigated, while the physical and chemical properties
of the organic waste were analyzed in order to evaluate the portion
of waste suitable for recycling as compost. The study found that (1)
the amount of organic waste averaged 299.8 kg/day, of which mixed
food waste had the highest amount at 191.9 kg/day, followed by mixed
leaf & yard waste and mixed fruit & vegetable waste at 66.3 and 41.6
kg/day, respectively; (2) regarding the physical and chemical
properties of the organic waste, the moisture content was between
69.54% and 78.15%, the major plant nutrients N, P and K were
0.14-0.17%, 0.46-0.52% and 0.16-0.18%, respectively, and the
carbon/nitrogen (C/N) ratio was about 15:1 to 17.5:1; (3) recycling
of organic waste as compost was designed by aerobic decomposition
using mixed food waste : mixed leaf & yard waste : mixed fruit &
vegetable waste in a 3:2:1 ratio by weight, in accordance with the
amounts available and their physical and chemical properties.
Abstract: This paper presents a new approach to probability density function estimation using the Support Vector Machine (SVM) and Expectation Maximization (EM) algorithms. In the proposed approach, an advanced algorithm for SVM density estimation, which incorporates Mean Field theory in the learning process, is used. Instead of using ad-hoc values for the parameters of the kernel function used by the SVM algorithm, the proposed approach uses the EM algorithm for automatic optimization of the kernel. Experimental evaluation using a simulated data set shows encouraging results.
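As background, the EM iteration itself can be sketched for a simple density-estimation task. This is a generic two-component 1-D Gaussian mixture fit, not the paper's SVM/Mean-Field estimator, and the simulated data are illustrative.

```python
import numpy as np

# Simulated data: two Gaussian populations of unequal weight.
rng = np.random.default_rng(5)
data = np.concatenate([rng.normal(-2, 0.5, 400), rng.normal(3, 1.0, 600)])

mu = np.array([-1.0, 1.0])                   # initial component means
sd = np.array([1.0, 1.0])                    # initial standard deviations
w = np.array([0.5, 0.5])                     # initial mixing weights
for _ in range(100):
    # E-step: responsibility of each component for each data point
    pdf = np.exp(-0.5 * ((data[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    resp = w * pdf
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means and standard deviations
    nk = resp.sum(axis=0)
    w = nk / len(data)
    mu = (resp * data[:, None]).sum(axis=0) / nk
    sd = np.sqrt((resp * (data[:, None] - mu) ** 2).sum(axis=0) / nk)
print(np.round(mu, 1), np.round(w, 2))       # means near -2 and 3
```

In the paper's setting, the quantities being optimized by EM are the kernel parameters of the SVM density estimator rather than mixture means and variances.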
Abstract: In this article we present a methodology which
enables preschool and primary-school children without developed
speech to remember words, phrases and texts with the help of graphic
signs: letters, syllables and words. Reading becomes a support for
the child's speech development. Teaching is based on the principle
"from simple to complex": "a letter - a syllable - a word - a
sentence - a text". The availability of multi-level texts allows this
methodology to be used for working with children who have different
levels of speech development.