Abstract: Only recently have water ethics received focused interest in the international water community. Because water is metabolically basic to life, an ethical dimension persists in every decision related to water. Water ethics at once express human society's approach to water and act as guidelines for behaviour. Ideas around water are often implicit and embedded as assumptions. They can be entrenched in behaviour and difficult to contest because they are difficult to “see”. By explicitly revealing the ethical ideas underlying water-related decisions, human society's relationship with water, and with the natural systems of which water is part, can be contested and shifted, or accepted with conscious intention. In recent decades, improved understanding of water's importance for ecosystem functioning and of the ecological services essential for human survival has been moving us beyond the growth-driven, supply-focused management paradigm. Environmental ethics challenge this paradigm by extending the ethical sphere to the environment, and thus to water and water resources management per se. An ethical approach is a legitimate, important, and often ignored way to effect change in environmental decision making. This qualitative research explores principles of water ethics and examines the underlying ethical precepts of selected water policy examples. The constructed water ethic principles act as a set of criteria against which a policy comparison can be established. This study shows that water resources management can progress by embracing full public participation, a new planning model, and knowledge-generation initiatives.
Abstract: Using efficient classification methods is necessary for an automatic fingerprint recognition system. This paper introduces a new structural approach to fingerprint classification that uses the directional image of fingerprints to increase the number of subclasses. In this method, the directional image of a fingerprint is segmented into regions consisting of pixels with the same direction. Afterwards, the relational graph of the segmented image is constructed and, from it, a super graph containing the prominent information of this graph is formed. Ultimately, we apply a matching technique with a cost function to compare the obtained graph with the model graphs in order to classify fingerprints. The increased number of subclasses with acceptable classification accuracy, together with faster processing in fingerprint recognition, makes this system superior.
Abstract: The purpose of this article is to analyze the
degree of concentration in the banking market in EU member
states as well as to determine the impact of the length of EU
membership on the degree of concentration. In that sense
several analyses were conducted, specifically panel analysis,
calculation of correlation coefficient and regression analysis of
the impact of the length of EU membership on the degree of
concentration. Panel analysis was conducted to determine
whether there is a similar trend of concentration in three
groups of countries - countries with a low, moderate and high
level of concentration. The conducted panel analysis showed
that in EU countries with a moderate level of concentration,
the level of concentration decreases. The calculation of
correlation showed that, together with other influential
factors, the length of EU membership to some extent negatively
affects concentration in the banking market. Using the
regression analysis for investigation of the influence of the
length of EU membership on the level of concentration in the
banking sector in a particular country, the results reveal that
there is a negative effect of the length of EU membership on
market concentration, although it is not a significantly influential
variable.
Abstract: Speckle noise affects all coherent imaging systems
including medical ultrasound. In medical images, noise suppression
is a particularly delicate and difficult task. A tradeoff between noise
reduction and the preservation of actual image features has to be made
in a way that enhances the diagnostically relevant image content.
Even though wavelets have been extensively used for denoising
speckle images, we have found that denoising using contourlets gives
much better performance in terms of SNR, PSNR, MSE, variance and
correlation coefficient. The objective of the paper is to determine the
number of levels of Laplacian pyramidal decomposition, the number
of directional decompositions to perform on each pyramidal level and
thresholding schemes which yield optimal despeckling of medical
ultrasound images in particular. The proposed method consists of the
log transformed original ultrasound image being subjected to contourlet
transform, to obtain contourlet coefficients. The transformed
image is denoised by applying thresholding techniques on individual
bandpass subbands using a Bayes shrinkage rule. We quantify the
achieved performance improvement.
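The Bayes shrinkage rule mentioned above can be sketched for a single subband of transform coefficients. The following is only a minimal illustration, assuming additive Gaussian noise after the log transform; the contourlet transform itself is omitted (no standard library implements it), and the function names, noise level, and synthetic data are hypothetical, not the paper's implementation:

```python
import numpy as np

def bayes_shrink_threshold(subband, sigma_noise):
    """BayesShrink rule: T = sigma_n^2 / sigma_x, where sigma_x is the
    estimated standard deviation of the noise-free coefficients."""
    var_total = float(np.mean(subband ** 2))
    sigma_x = np.sqrt(max(var_total - sigma_noise ** 2, 0.0))
    if sigma_x == 0.0:
        # Subband is essentially pure noise: threshold everything away.
        return float(np.max(np.abs(subband)))
    return sigma_noise ** 2 / sigma_x

def soft_threshold(coeffs, t):
    """Soft (shrinkage) thresholding of a coefficient array."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

# Synthetic "subband": true detail coefficients plus additive noise.
rng = np.random.default_rng(0)
clean = rng.normal(0.0, 2.0, 1024)
noisy = clean + rng.normal(0.0, 0.5, 1024)

t = bayes_shrink_threshold(noisy, sigma_noise=0.5)
denoised = soft_threshold(noisy, t)
```

In the paper's pipeline this rule would be applied per directional bandpass subband of the contourlet decomposition, with the noise level estimated from the finest subband rather than supplied directly.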
Abstract: In this paper, a new method of image edge detection
and characterization is presented. The “Parametric Filtering” method uses
a judiciously defined filter, which preserves the correlation
structure of the input signal in the autocorrelation of the output. This makes it
possible to follow the evolution of the image correlation structure, as well as
various distortion measures which quantify the deviation between
two zones of the signal (the two Hamming signals), for the detection
of an image edge.
Abstract: In this paper, a theoretical formula is presented to
predict the instantaneous folding force of the first fold creation in a
square column under axial loading. Calculations are based on analysis
of “Basic Folding Mechanism" introduced by Wierzbicki and
Abramowicz. For this purpose, the sum of dissipated energy rate under
bending around horizontal and inclined hinge lines and dissipated
energy rate under extensional deformations are equated to the work rate
of the external force on the structure. The final formula obtained in this
research reasonably predicts the instantaneous folding force of the first
fold creation versus folding distance and folding angle, giving
the instantaneous folding force rather than only the average value. Finally,
according to the calculated theoretical relation, the instantaneous folding
force of the first fold creation in a square column was plotted
against folding distance and compared with experimental results,
which showed good correlation.
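The energy-rate balance the abstract describes can be written schematically; the symbols here are generic placeholders, not necessarily the authors' notation:

```latex
P\,\dot{\delta} = \dot{E}_{b} + \dot{E}_{m}
\qquad\Longrightarrow\qquad
P(\delta) = \frac{\dot{E}_{b} + \dot{E}_{m}}{\dot{\delta}}
```

where $P$ is the instantaneous folding force, $\dot{\delta}$ the folding rate, $\dot{E}_{b}$ the rate of bending dissipation around the horizontal and inclined hinge lines, and $\dot{E}_{m}$ the rate of extensional (membrane) dissipation.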
Abstract: Multimedia distributed systems deal with heterogeneous
data, such as texts, images, graphics, video and audio. The specification
of temporal relations among different data types and distributed
sources is an open research area. This paper proposes a fully
distributed synchronization model to be used in multimedia systems.
One original aspect of the model is that it avoids the use of a common
reference (e.g. wall clock and shared memory). To achieve this, all
possible multimedia temporal relations are specified according to
their causal dependencies.
Abstract: Most of the existing text mining approaches have been
proposed with the transaction database model in mind. Thus, the
mined dataset is structured using just one concept, the “transaction",
whereas the whole dataset is modeled using the “set" abstract type. In
such cases, the structure of the whole dataset and the relationships
among the transactions themselves are not modeled and,
consequently, not considered in the mining process.
We believe that taking into account the structural properties of
hierarchically structured information (e.g. textual documents)
in the mining process can lead to better results. For this purpose, a
hierarchical association rule mining approach for textual documents
is proposed in this paper, and the classical set-oriented mining
approach is reconsidered in favor of a Directed Acyclic Graph (DAG)
oriented approach. Natural language processing techniques are used
in order to obtain the DAG structure. Based on this graph model, a
hierarchical bottom-up algorithm is proposed. The main idea is that
each node is mined with its parent node.
Abstract: Frequent patterns are patterns such as sets of features or items that appear in data frequently. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a dataset. Most of the proposed frequent pattern mining algorithms have been implemented in imperative programming languages such as C, C++, and Java. The imperative paradigm is significantly inefficient when the itemset is large and the frequent pattern is long. We suggest a high-level declarative style of programming using a functional language. Our supposition is that the problem of frequent pattern discovery can be efficiently and concisely implemented via a functional paradigm, since pattern matching is a fundamental feature supported by most functional languages. Our frequent pattern mining implementation in the Haskell language confirms our hypothesis about the conciseness of the program. The performance studies on speed and memory usage support our intuition about the efficiency of functional languages.
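The frequent-itemset task itself can be stated compactly. Since the paper's Haskell code is not reproduced here, the following is a hypothetical brute-force sketch (in Python, with invented names and toy data), not the authors' implementation:

```python
from collections import Counter
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Return {itemset (tuple): count} for all itemsets whose absolute
    support count is at least min_support. Brute-force, level-by-level
    enumeration; fine for small illustrative inputs."""
    result = {}
    k = 1
    while True:
        # Count every size-k candidate occurring in any transaction.
        counts = Counter(
            cand
            for t in transactions
            for cand in combinations(sorted(t), k)
        )
        frequent = {c: n for c, n in counts.items() if n >= min_support}
        if not frequent:
            break  # no frequent k-itemsets => no larger ones either
        result.update(frequent)
        k += 1
    return result

tx = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}]
fi = frequent_itemsets(tx, min_support=2)
```

Real miners (Apriori, FP-growth) prune candidates instead of enumerating all of them; this sketch only fixes the problem definition the abstract discusses.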
Abstract: School homework has been synonymous with students' life in Chinese national type primary schools in Malaysia. Although many reports in the press claimed that students were burdened with too much of it, homework continues to be a common practice in national type schools that is believed to contribute to academic achievement. This study was conducted to identify the relationship between the burden of school homework and academic achievement among pupils in Chinese National Type Primary Schools in the state of Perak, Malaysia. A total of 284 students (142 from urban and 142 from rural areas) were chosen as participants in this study. The variables of gender and location (urban/rural areas) have shown significant differences in student academic achievement. Female Chinese students from rural areas showed a higher mean score than males from urban areas. Therefore, Chinese language teachers should give appropriate and relevant homework to primary school students to achieve good academic performance.
Abstract: Our medicine-oriented research is based on a medical
data set of real patients. Sharing patients' private data with
people other than clinicians or hospital staff is a security
problem. We have to remove person-identifying information
from the medical data. The medical data without private data
are available after a de-identification process for any research
purpose. In this paper, we introduce a universal automatic
rule-based de-identification application to perform all of this on
heterogeneous medical data. A patient's private identification is
replaced by a unique identification number, even in burned-in
annotations in pixel data. The same identifier is used for all of a
patient's medical data, so relationships within the data are preserved.
The hospital can take advantage of research feedback based
on the results.
Abstract: We analyze the effectiveness of different pseudo-noise (PN) and orthogonal sequences for encrypting speech signals in terms of perceptual intelligibility. A speech signal can be viewed as a sequence of correlated samples, and each sample as a sequence of bits. The residual intelligibility of the speech signal can be reduced by removing the correlation among the speech samples. PN sequences have random-like properties that help in reducing the correlation among speech samples. The mean square aperiodic auto-correlation (MSAAC) and the mean square aperiodic cross-correlation (MSACC) measures are used to test the randomness of the PN sequences. Results of the investigation show the effectiveness of large Kasami sequences for this purpose among the many PN sequences examined.
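The MSAAC measure used above can be sketched as follows. Normalizations vary across papers, so this is one common convention, illustrated on the length-7 Barker sequence rather than the large Kasami sequences the abstract studies; any ±1 PN sequence could be substituted:

```python
import numpy as np

def aperiodic_autocorr(x):
    """Normalized aperiodic autocorrelation c(tau) for lags 0..N-1:
    c(tau) = (1/N) * sum_n x[n] * x[n + tau]."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return np.array([np.dot(x[:n - tau], x[tau:]) for tau in range(n)]) / n

def msaac(x):
    """Mean square of the nonzero-lag aperiodic autocorrelation sidelobes
    (one common convention; normalizations differ across papers)."""
    c = aperiodic_autocorr(x)
    return float(np.mean(c[1:] ** 2))

# Length-7 Barker sequence mapped to +/-1; all its aperiodic sidelobes
# have magnitude <= 1/7, so its MSAAC is small.
barker7 = [1, 1, 1, -1, -1, 1, -1]
sidelobe_energy = msaac(barker7)
```

A smaller MSAAC indicates more random-like behavior, which is the property the abstract links to reduced residual intelligibility; MSACC is the analogous mean over cross-correlations between sequence pairs.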
Abstract: The connection between solar activity and adverse phenomena in the Earth's environment that can affect space- and ground-based technologies has spurred interest in Space Weather (SW) research. A great effort has been put into the development of suitable models that can provide advanced forecasts of SW events. With the progress in computational technology, it is becoming possible to develop operational large-scale physics-based models which can incorporate the most important physical processes and domains of the Sun-Earth system. In order to enhance our SW prediction capabilities we are developing advanced numerical tools. With operational requirements in mind, our goal is to develop a modular simulation framework for the propagation of disturbances from the Sun through interplanetary space to the Earth. Here, we report on and discuss the development of the coronal field and solar wind components for a large-scale MHD code. The model for these components is based on a potential field source surface model and an empirical Wang-Sheeley-Arge solar wind relation.
Abstract: This paper argues that increased uncertainty, in certain
situations, may actually encourage investment. Since earlier studies
mostly base their arguments on the assumption of geometric Brownian
motion, the study extends the assumption to alternative stochastic
processes, such as mixed diffusion-jump, mean-reverting process, and
jump amplitude process. A general approach of Monte Carlo
simulation is developed to derive optimal investment trigger for the
situation that the closed-form solution could not be readily obtained
under the assumption of alternative process. The main finding is that
the overall effect of uncertainty on investment is interpreted by the
probability of investing, and the relationship between uncertainty and
investment appears to be an inverted U-shaped curve. The implication
is that uncertainty does not always discourage investment even under
several sources of uncertainty. Furthermore, high-risk projects are not
always dominated by low-risk projects because the high-risk projects
may have a positive realization effect on encouraging investment.
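A minimal sketch of the Monte Carlo approach described above, assuming a Merton-style mixed diffusion-jump process and a fixed upper investment trigger. All function names and parameter values are illustrative, not the study's; the study additionally searches for the *optimal* trigger, which is omitted here:

```python
import numpy as np

def prob_hit_trigger(v0, trigger, mu, sigma, lam, jump_mu, jump_sigma,
                     horizon=5.0, steps=500, n_paths=2000, seed=0):
    """Monte Carlo estimate of P(project value reaches the investment
    trigger before the horizon) under a mixed diffusion-jump process:
    geometric Brownian motion plus Poisson-arriving Gaussian log-jumps."""
    rng = np.random.default_rng(seed)
    dt = horizon / steps
    v = np.full(n_paths, v0, dtype=float)
    hit = np.zeros(n_paths, dtype=bool)
    for _ in range(steps):
        z = rng.normal(size=n_paths)
        n_jumps = rng.poisson(lam * dt, size=n_paths)
        # Aggregate jump in log-value over this step (sum of n_jumps
        # independent Gaussian jumps, drawn in closed form).
        jump = jump_mu * n_jumps + jump_sigma * np.sqrt(n_jumps) * rng.normal(size=n_paths)
        log_step = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z + jump
        v *= np.exp(log_step)
        hit |= v >= trigger
    return float(hit.mean())

p_low = prob_hit_trigger(1.0, 1.5, mu=0.05, sigma=0.10, lam=0.5,
                         jump_mu=0.0, jump_sigma=0.10)
p_high = prob_hit_trigger(1.0, 1.5, mu=0.05, sigma=0.40, lam=0.5,
                          jump_mu=0.0, jump_sigma=0.10)
```

With these illustrative parameters the higher-volatility process reaches the trigger more often, showing how uncertainty can act through the probability of investing rather than only through option-value delay.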
Abstract: There has been a growing emphasis in
communication management from simple coordination of
promotional tools to a complex strategic process. This study will
examine the current marketing communications and engagement
strategies used in addressing the key stakeholders. In the case of
fertilizer industry in Malaysia, there has been little empirical
research on stakeholder communication, even though a major challenge
facing the modern corporation is the need to communicate its
identity, its values and its products in order to distinguish itself from
competitors. The study will employ both quantitative and qualitative
methods and the use of Structural Equation Modeling (SEM) to
establish a causal relationship amongst the key factors of stakeholder
communication strategies and increment in consumers'
choice/acceptance and impact on financial performance. One of the
major contributions is a conceptual framework for communication
strategies and engagement in increasing consumers' acceptance level
and the firm's financial performance.
Abstract: This study aims to examine the factors affecting
knowledge sharing behavior in knowledge-based electronic communities (e-communities) because quantity and quality of
knowledge shared among the members play a critical role in the community's sustainability. Past research has suggested three
perspectives that may affect the quantity and quality of knowledge
shared: economics, social psychology, and social ecology. In this
study, we strongly believe that an economic perspective may be suitable to validate factors influencing newly registered members'
knowledge contribution at the beginning of relationship development.
Accordingly, this study proposes a model to validate the factors influencing members' knowledge sharing based on Transaction Cost
Theory. By doing so, we may empirically test our hypotheses in various types of e-communities to determine the generalizability of our research models.
Abstract: The objective of the present study was to examine the
dose-response relationships between antioxidant parameters and liver
contaminant levels of Kazakhstan light crude oil (KLCO) in albino
rats. The animals were repeatedly exposed, by intraperitoneal
injection, to low dosages (0.5–1.5 ml/kg) of KLCO. Rats exposed to
these dose levels did not show any apparent symptoms of
intoxication. Serum aminotransferases increased significantly
(p
Abstract: In this paper we evaluated the efficacy of
photodynamic treatment of infected wounds on a pig animal model by
diffuse reflectance spectrometry. The study was conducted on fifteen
wounds contaminated with Staphylococcus aureus bacteria that were
incubated for 30 min with methylene blue solution (c = 3.3 × 10⁻³ M)
and exposed to laser radiation (λ = 670 nm, P = 15 mW) for 15 min.
The efficiency of photodynamic inactivation of bacteria was
evaluated by microbiological exams and diffuse reflectance
spectrometry. The results of the microbiological exams showed that
the bacterial concentration has decreased from 6.93±0.138
logCFU/ml to 3.12±0.108 logCFU/ml. The spectral examination
showed that the diffuse reflectance of wounds contaminated with
Staphylococcus aureus has decreased from 5.06±0.036 % to
3.36±0.025 %. In conclusion, photodynamic therapy is an effective
method for the treatment of infected wounds and there is a correlation
between the CFU count and diffuse reflectance.
Abstract: The aim of the work presented here was to either use
existing forest dynamic simulation models or calibrate a new one,
both within the SYMFOR framework, with the purpose of examining
changes in stand-level basal area and functional composition in
response to selective logging, considering trees > 10 cm d.b.h., for two
areas of undisturbed Amazonian non-flooded tropical forest in Brazil
and one in Peru. Model biological realism was evaluated for forest in
the undisturbed and selectively logged state and it was concluded that
forest dynamics were realistically represented. Results of the logging
simulation experiments showed that, in relation to an undisturbed-forest
simulation subject to no form of harvesting intervention, there was a
significant amount of change over a 90-year simulation period that
was positively proportional to the intensity of logging. Areas which,
in the dynamic equilibrium of undisturbed forest, had a greater
proportion of a specific ecological guild of trees known as the light
hardwoods (LHWs) seemed to respond more favorably, in terms of
less deviation, but only within a specific range of baseline forest
composition beyond which compositional diversity became more
important. These findings are partially in line with practical management
experience and partially with basic systematics theory, respectively.
Abstract: This paper provides an exergy analysis of the multistage refrigeration cycle used in a C2+ recovery plant. The behavior of an industrial refrigeration cycle with propane as refrigerant has been investigated by the exergy method. A computational model based on exergy analysis is presented for investigating the effects of the valves on the exergy losses, the second-law efficiency, and the coefficient of performance (COP) of a vapor compression refrigeration cycle. The equations of exergy destruction and exergetic efficiency for the main cycle components such as evaporators, condensers, compressors, and expansion valves are developed. The relations for the total exergy destruction in the cycle and the cycle exergetic efficiency are obtained. An ethane recovery unit with its refrigeration cycle has been simulated to prepare the exergy analysis. Using a typical actual work input value, the exergetic efficiency of the refrigeration cycle is determined to be 39.90%, indicating a great potential for improvement. The simulation results reveal that the exergetic efficiencies of the heat exchanger and expansion sections rank lowest among the compartments of the refrigeration cycle. Refrigeration calculations have been carried out through the analysis of T–S and P–H diagrams, where the coefficient of performance (COP) was obtained as 1.85. The novelty of this article includes the effect and sensitivity analysis of molar flow, pressure drops and temperature on the exergy efficiency and coefficient of performance of the cycle.
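The two headline figures above (COP and a second-law efficiency) can be related by simple formulas. The sketch below takes the second-law efficiency as the ratio of actual to Carnot COP, which is only one of several definitions and not necessarily the authors' exergy-destruction-based one; all duty and temperature values are illustrative, not from the paper's simulation:

```python
def cop(q_evap_kw, w_comp_kw):
    """Coefficient of performance: refrigeration duty over compressor work."""
    return q_evap_kw / w_comp_kw

def carnot_cop(t_cold_k, t_hot_k):
    """Reversible (Carnot) refrigeration COP between two reservoirs."""
    return t_cold_k / (t_hot_k - t_cold_k)

def second_law_efficiency(actual_cop, t_cold_k, t_hot_k):
    """Second-law efficiency taken here as actual COP over Carnot COP."""
    return actual_cop / carnot_cop(t_cold_k, t_hot_k)

# Illustrative values chosen to reproduce the abstract's COP of 1.85.
cop_cycle = cop(q_evap_kw=1850.0, w_comp_kw=1000.0)
eta_ii = second_law_efficiency(cop_cycle, t_cold_k=230.0, t_hot_k=310.0)
```

A full exergy analysis, as in the paper, instead sums exergy destruction component by component (evaporators, condensers, compressors, valves), which is why its 39.90% figure need not equal the COP-ratio estimate here.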