Abstract: The problem of frequent pattern discovery is defined
as the process of searching for patterns such as sets of features or items that appear in data frequently. Finding such frequent patterns
has become an important data mining task because it reveals associations, correlations, and many other interesting relationships
hidden in a database. Most of the proposed frequent pattern mining
algorithms have been implemented with imperative programming
languages. Such paradigm is inefficient when set of patterns is large
and the frequent pattern is long. We suggest a high-level declarative
style of programming apply to the problem of frequent pattern
discovery. We consider two languages: Haskell and Prolog. Our
intuitive idea is that the problem of finding frequent patterns should
be efficiently and concisely implemented via a declarative paradigm
since pattern matching is a fundamental feature supported by most
functional languages and Prolog. Our frequent pattern mining
implementation using the Haskell and Prolog languages confirms our
hypothesis about the conciseness of the programs. Comparative
studies of lines of code, speed, and memory usage for declarative
versus imperative programming are reported in the paper.
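The mining task itself can be made concrete with a small sketch. The following is a deliberately naive frequent itemset miner in Python, not the paper's Haskell or Prolog code; all names and the toy transactions are illustrative:

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Enumerate all itemsets whose support (fraction of transactions
    containing them) is at least min_support. Naive: tries all sizes."""
    items = sorted({i for t in transactions for i in t})
    result = {}
    for size in range(1, len(items) + 1):
        found_any = False
        for candidate in combinations(items, size):
            cand = set(candidate)
            support = sum(cand <= set(t) for t in transactions) / len(transactions)
            if support >= min_support:
                result[candidate] = support
                found_any = True
        if not found_any:
            # Apriori property: if no itemset of this size is frequent,
            # no larger itemset can be frequent either.
            break
    return result

txns = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}]
freq = frequent_itemsets(txns, min_support=0.5)
```

On the four toy transactions, every single item and every pair reaches the 0.5 support threshold, while the full triple does not.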
Abstract: Nowadays, computer worms, viruses and Trojan horses
have become common; they are collectively called malware. A decade
ago, such malware merely damaged computers by deleting or rewriting
important files. Recent malware, however, appears to be created to
earn money. Some malware collects personal information so that
malicious actors can obtain secrets such as passwords for online
banking, evidence of a scandal, or contact addresses related to the
target. Moreover, the relationship between money and malware has
become more complex: many kinds of malware install bots to obtain
springboards for further attacks. Meanwhile, for ordinary Internet
users, countermeasures against malware have come up against a blank
wall. Pattern matching wastes computer resources, since matching
tools must handle the large number of patterns derived from
subspecies; virus-making tools can automatically generate subspecies
of malware, and metamorphic and polymorphic malware are no
longer rare. Recently, malware-checking sites have appeared that
inspect content in place of users' PCs. However, a new type of
malicious site has emerged that evades inspection by these checking
sites. In this paper, existing protocols and methods related to the web
are reconsidered in terms of protection from current attacks, and a
new protocol and method are proposed for the purpose of securing the
web.
Abstract: The availability of water in adequate quantity and
quality is imperative for sustainable development. Worldwide,
significant imbalance exists with regards to sustainable development
particularly from a water and sanitation perspective. Water is a
critical component of public health, and failure to supply safe water
will place a heavy burden on the entire population. Although the 21st
century has witnessed wealth and advanced development, it has not
been realized everywhere. Billions of people are still striving to
access the most basic human needs which are food, shelter, safe
drinking water and adequate sanitation. The global picture conceals
various inequalities particularly with regards to sanitation coverage in
rural and urban areas. Currently, water scarcity, and in particular
water governance, is the main challenge threatening the
sustainable development goals. Within the context of water,
sanitation and health, sustainable development is a confusing concept,
particularly when examined from the viewpoint of policy options for
developing countries. This perspective paper aims to summarize and
critically evaluate evidence of published studies in relation to water,
sanitation and health and to identify relevant solutions to reduce
public health impacts. Evidently, improving water and sanitation
services will result in significant and lasting gains in health and
economic development.
Abstract: This paper reports an analysis of outdoor air pollution in the urban centre of the city of Messina. The variations of the most critical pollutant concentrations (PM10, O3, CO, C6H6) and their trends with respect to climatic parameters and vehicular traffic have been studied. Linear regressions were computed to represent the relations among the pollutants; the differences between pollutant concentrations on weekends and weekdays were also analyzed. In order to evaluate air pollution and its effects on human health, a method for calculating a pollution index was implemented and applied in the urban centre of the city. This index is based on the weighted mean of the most detrimental air pollutant concentrations relative to their limit values for the protection of human health. The analyzed data on the polluting substances were collected by the Assessorship of the Environment of the Regional Province of Messina in the year 2004. A statistical analysis of the air quality index trends is also reported.
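The index described above can be sketched as a weighted mean of concentrations normalized by their limit values. The paper's actual weights and limit values are not reproduced here; the figures below are illustrative limits, not the Messina data:

```python
def pollution_index(concentrations, limits, weights=None):
    """Weighted mean of pollutant concentrations normalized by their
    limit values; an index above 1.0 means the weighted average
    exceeds the limits for the protection of human health."""
    pollutants = list(concentrations)
    if weights is None:
        weights = {p: 1.0 for p in pollutants}  # equal weights by default
    total_w = sum(weights[p] for p in pollutants)
    return sum(weights[p] * concentrations[p] / limits[p]
               for p in pollutants) / total_w

# Illustrative measured values and limit values (not the paper's data)
conc = {"PM10": 45.0, "O3": 60.0, "CO": 4.0, "C6H6": 2.5}
lims = {"PM10": 50.0, "O3": 120.0, "CO": 10.0, "C6H6": 5.0}
idx = pollution_index(conc, lims)
```

With equal weights, each pollutant contributes its concentration as a fraction of its limit, so the example gives (0.9 + 0.5 + 0.4 + 0.5) / 4 = 0.575.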
Abstract: The generator of hypotheses is a new method for data mining. It makes it possible to classify the source data automatically and produces a particular enumeration of patterns. A pattern is an expression (in a certain language) describing facts in a subset of the facts. The goal is to describe the source data via patterns and/or IF...THEN rules. The evaluation criteria used are deterministic (not probabilistic). The search results are trees, a form that is easy to comprehend and interpret. The generator of hypotheses uses a very effective algorithm based on the theory of monotone systems (MS), named MONSA (MONotone System Algorithm).
Abstract: In this study, Cu-mesoporous TiO2 is developed for the
removal of acid odor, combined with ozone assistance and an online
regeneration system, with or without UV irradiation (all-weather
operation). The results showed that Cu-mesoporous TiO2 exhibits
desirable adsorption efficiency for acid odor without UV irradiation,
owing to its larger surface area and pore size and the additional
absorption ability provided by Cu. In the photocatalysis process, the
material structure also helps Cu-mesoporous TiO2 achieve more
outstanding efficiency in degrading acid odor. Cu also delays the
recombination of electron-hole pairs excited in TiO2, enhancing the
photodegradation ability. Cu-mesoporous TiO2 gained a conspicuous
increase in photocatalytic ability from ozone assistance, but no
benefit in adsorption. In addition, the online regeneration procedure
could treat the used Cu-mesoporous TiO2 to restore its adsorption
ability and maintain its photodegradation performance, based on
scrubbing, desorbing acid odor, and reducing Cu to the metallic state.
Abstract: In this paper we describe the design and implementation of a parallel algorithm for data assimilation with the ensemble Kalman filter (EnKF) for the oil reservoir history matching problem. The use of a large number of observations from time-lapse seismic data leads to a large turnaround time for the analysis step, in addition to the time-consuming simulations of the realizations. For efficient parallelization it is important to consider parallel computation at the analysis step. Our experiments show that parallelization of the analysis step in addition to the forecast step has good scalability, exploiting the same set of resources with some additional effort.
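The analysis step that the paper parallelizes can be illustrated in heavily reduced form for a scalar state with a directly observed variable; the matrix version used for reservoir history matching follows the same pattern per observation. This is a toy sketch, not the authors' implementation:

```python
import random
from statistics import mean, pvariance

def enkf_analysis(ensemble, obs, obs_err_var, rng):
    """EnKF analysis step for a scalar, directly observed state: each
    member is nudged toward a perturbed observation with the Kalman
    gain K = P / (P + R), where P is the ensemble variance and R the
    observation error variance."""
    P = pvariance(ensemble)
    K = P / (P + obs_err_var)
    return [x + K * (obs + rng.gauss(0.0, obs_err_var ** 0.5) - x)
            for x in ensemble]

rng = random.Random(0)
prior = [rng.gauss(5.0, 2.0) for _ in range(200)]        # forecast ensemble
posterior = enkf_analysis(prior, obs=8.0, obs_err_var=0.5, rng=rng)
```

The update pulls the ensemble mean toward the observation and shrinks its spread, which is exactly the per-observation work that grows expensive with many time-lapse seismic observations.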
Abstract: Quantitative investigation of the contribution of different factors towards measuring the reusability of software components could be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components within existing legacy systems, which can save the cost of developing software from scratch. However, the issue of the relative significance of the contributing factors has remained relatively unexplored. In this paper, we use Taguchi's approach to analyze the significance of different structural attributes, or factors, in deciding the reusability level of a particular component. The results obtained show that complexity is the most important factor in deciding the reusability of function-oriented software. In the case of object-oriented software, coupling and complexity collectively play a significant role in achieving high reusability.
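Taguchi's approach ranks factors by their effect on a signal-to-noise ratio; for a quality like reusability, the "larger-the-better" form applies. The sketch below, with made-up scores, shows only the S/N computation, not the full orthogonal-array analysis:

```python
import math

def sn_larger_the_better(values):
    """Taguchi signal-to-noise ratio for a 'larger-the-better' response
    (e.g. a reusability score): S/N = -10 * log10(mean(1 / y^2))."""
    n = len(values)
    return -10.0 * math.log10(sum(1.0 / v ** 2 for v in values) / n)

# Hypothetical reusability scores under two settings of one factor
# (say, low vs. high coupling); the larger S/N marks the better setting.
sn_low_coupling = sn_larger_the_better([8.2, 7.9, 8.5])
sn_high_coupling = sn_larger_the_better([5.1, 4.8, 5.6])
```

Comparing S/N ratios across factor levels, as in the paper's analysis, identifies which factor setting most improves the response.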
Abstract: Given a parallel program to be executed on a heterogeneous
computing system, the overall execution time of the program
is determined by a schedule. In this paper, we analyze the worst-case
performance of the list scheduling algorithm for scheduling tasks
of a parallel program in a mixed-machine heterogeneous computing
system such that the total execution time of the program is minimized.
We prove tight lower and upper bounds for the worst-case
performance ratio of the list scheduling algorithm. We also examine
the average-case performance of the list scheduling algorithm. Our
experimental data reveal that the average-case performance of the list
scheduling algorithm is much better than the worst-case performance
and is very close to optimal, except for large systems with large
heterogeneity. Thus, the list scheduling algorithm is very useful in
real applications.
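The algorithm analyzed above is, in essence, a greedy loop. A minimal sketch for a mixed-machine system, with machines modeled only by a speed factor (a simplification of the paper's model), might look like:

```python
def list_schedule(task_times, machine_speeds, priority):
    """Greedy list scheduling: tasks are taken in the given priority
    order and each is assigned to the machine on which it would finish
    earliest, given that machine's current load and speed. Returns the
    makespan (total execution time)."""
    loads = [0.0] * len(machine_speeds)
    for task in priority:
        # choose the machine minimizing this task's completion time
        best = min(range(len(machine_speeds)),
                   key=lambda m: loads[m] + task_times[task] / machine_speeds[m])
        loads[best] += task_times[task] / machine_speeds[best]
    return max(loads)

# 5 independent tasks on two machines, the second twice as fast
makespan = list_schedule([4, 3, 2, 2, 1], [1.0, 2.0], priority=[0, 1, 2, 3, 4])
```

On this tiny instance both machines finish at time 4.0, which is optimal; the paper's worst-case bounds concern instances where the greedy choice is far from optimal.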
Abstract: Shear walls are used in most tall buildings to carry
the lateral load. When openings for doors or windows must be
provided in the shear walls, a special type of shear wall called a
"coupled shear wall" is used, which in some cases is stiffened by
specific beams and is then called a "stiffened coupled shear wall".
In this paper, a mathematical method for geometrically nonlinear
analysis of the stiffened coupled shear walls has been presented.
Then, a suitable formulation for determining the critical load of the
stiffened coupled shear walls under gravity force has been proposed.
The governing differential equations for equilibrium and deformation
of the stiffened coupled shear walls have been obtained by setting up
the equilibrium equations and the moment-curvature relationships for
each wall. Because of the complexity of the differential equation, the
energy method has been adopted for approximate solution of the
equations.
Abstract: This paper presents an equivalent circuit model based on piecewise linear parallel branches (PLPB) to study solar cell modules which are partially shaded. The PLPB model can easily be used in circuit simulation software such as the ElectroMagnetic Transients Program (EMTP). This PLPB model allows the user to simulate several different configurations of solar cells, the influence of partial shadowing on a single or multiple cells, the influence of the number of solar cells protected by a bypass diode and the effect of the cell connection configuration on partial shadowing.
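While the paper's EMTP implementation is not reproduced here, the core idea of a piecewise linear branch (evaluating current from a table of breakpoints) can be sketched as follows; the example curve is invented, not taken from the paper:

```python
def piecewise_linear_current(v, breakpoints):
    """Evaluate a piecewise linear I-V characteristic. breakpoints is a
    list of (voltage, current) pairs; between breakpoints the current is
    linearly interpolated, and outside the range the end segments are
    extrapolated."""
    pts = sorted(breakpoints)
    if v <= pts[0][0]:
        (v0, i0), (v1, i1) = pts[0], pts[1]
    elif v >= pts[-1][0]:
        (v0, i0), (v1, i1) = pts[-2], pts[-1]
    else:
        for k in range(len(pts) - 1):
            if pts[k][0] <= v <= pts[k + 1][0]:
                (v0, i0), (v1, i1) = pts[k], pts[k + 1]
                break
    return i0 + (i1 - i0) * (v - v0) / (v1 - v0)

# Crude 3-segment approximation of an illuminated cell's I-V curve:
# near-constant photocurrent, then a knee, then a drop to open circuit.
curve = [(0.0, 3.00), (0.45, 2.85), (0.55, 1.50), (0.60, 0.00)]
i_at_half_volt = piecewise_linear_current(0.50, curve)
```

A partially shaded module is then modeled by combining such branches per cell, which is the kind of configuration study the PLPB model supports.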
Abstract: Sending encrypted messages frequently draws the attention
of third parties, perhaps prompting attempts to break and
reveal the original messages. Steganography is introduced to hide
the existence of the communication by concealing a secret message
in an appropriate carrier like text, image, audio or video. In quantum
steganography, the sender (Alice) embeds her steganographic
information into the cover and sends it to the receiver (Bob) over a
communication channel. Alice and Bob share an algorithm and hide
quantum information in the cover. An eavesdropper (Eve) without
access to the algorithm cannot detect the existence of the quantum
message. In this paper, a text quantum steganography technique based
on the use of the indefinite articles (a) or (an) in conjunction with
nonspecific or non-particular nouns in the English language and a
quantum gate truth table has been proposed. The authors also introduce a
new code representation technique (SSCE - Secret Steganography
Code for Embedding) at both ends in order to achieve high level of
security. Before the embedding operation, each character of the secret
message is converted to its SSCE value and then embedded into the
cover text. Finally, the stego text is formed and transmitted to the
receiver. At the receiver side, the reverse operations are carried out
to recover the original information.
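The shared SSCE table can be pictured as a keyed, invertible character-to-code substitution that both ends reconstruct from a common seed. The sketch below covers only this substitution step, not the a/an embedding or the quantum-gate part, and the table itself is invented:

```python
import random

def make_ssce_table(seed):
    """Build a toy SSCE-style substitution table: a keyed one-to-one
    mapping from printable ASCII characters to code values, shared by
    sender and receiver. (The real SSCE table from the paper is not
    reproduced here.)"""
    rng = random.Random(seed)
    chars = [chr(c) for c in range(32, 127)]
    codes = list(range(len(chars)))
    rng.shuffle(codes)                     # the key determines the shuffle
    encode = dict(zip(chars, codes))
    decode = {v: c for c, v in zip(chars, codes)}
    return encode, decode

enc, dec = make_ssce_table(seed=42)        # both ends use the same seed
secret = "quantum"
values = [enc[ch] for ch in secret]        # what would be embedded in the cover
recovered = "".join(dec[v] for v in values)
```

Because the mapping is one-to-one and keyed, only a party holding the seed can map embedded values back to the secret characters.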
Abstract: Stilling basins are commonly used to dissipate the
energy and protect the downstream floor from erosion. The aim of
the present experimental work is to improve the roughened stilling
basin using T-shape roughness instead of the regular cubic one and
design this new shape. As a result of the present work the best
intensity and the best roughness length are identified. Also, it is
found that the T-shape roughness saves materials and reduces the jump
length compared to the cubic one. A sensitivity analysis was performed
and it was noticed that the change in the length of jump is more
sensitive to the change in roughness length than the change in
intensity.
Abstract: Medical image modalities such as computed
tomography (CT), magnetic resonance imaging (MRI), ultrasound
(US), and X-ray are used to diagnose disease. These modalities
provide flexible means of reviewing anatomical cross-sections and
physiological state in different parts of the human body. Raw
medical images have huge file sizes and large storage
requirements, so their size should be reduced to make them
suitable for telemedicine applications. Thus image
compression is a key factor to reduce the bit rate for transmission or
storage while maintaining an acceptable reproduction quality, but it is
natural to raise the question of how much an image can be compressed
and still preserve sufficient information for a given clinical
application. Many techniques for achieving data compression have
been introduced. In this study, MRI images of three types, namely
Brain, Spine, and Knee, have been compressed and reconstructed
using wavelet transform. Subjective and objective evaluation has
been done to investigate the clinical information quality of the
compressed images. For the objective evaluation, the results show
that the PSNR which indicates the quality of the reconstructed image
is ranging from (21.95 dB to 30.80 dB, 27.25 dB to 35.75 dB, and
26.93 dB to 34.93 dB) for Brain, Spine, and Knee respectively. For
the subjective evaluation test, the results show that the compression
ratio of 40:1 was acceptable for brain image, whereas for spine and
knee images 50:1 was acceptable.
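The PSNR figures quoted above follow from the standard definition: for 8-bit images it is computed from the mean squared error as below. The tiny pixel arrays are illustrative, not the paper's images:

```python
import math

def psnr(original, reconstructed, max_value=255):
    """Peak signal-to-noise ratio in dB between two equally sized
    images, given here as flat lists of pixel intensities."""
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")                # identical images
    return 10 * math.log10(max_value ** 2 / mse)

orig = [52, 55, 61, 66, 70, 61, 64, 73]
recon = [54, 55, 60, 66, 69, 62, 64, 72]   # e.g. after lossy compression
quality = psnr(orig, recon)
```

Higher PSNR means the reconstruction is closer to the original, which is why the reported 21.95-35.75 dB ranges serve as the objective quality measure.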
Abstract: Cryptography provides the secure manner of
information transmission over the insecure channel. It authenticates
messages based on the key, not on the user. It requires a lengthy
key to encrypt and decrypt messages, but these keys can be guessed
or cracked. Moreover, maintaining and sharing lengthy, random keys
in the enciphering and deciphering process is a critical problem in
cryptographic systems. A new approach is described for generating a
crypto key, which is acquired from a person's iris pattern. In the
biometric field,
template created by the biometric algorithm can only be
authenticated by the same person. Among biometric templates, iris
features can efficiently distinguish individuals and produce fewer
false positives in a large population. This type of iris code
distribution exhibits low intra-class variability, which aids the
cryptosystem in confidently decrypting messages with an exact
matching of iris pattern. In this proposed approach, the iris features
are extracted using multi-resolution wavelets, producing a 135-bit
iris code for each subject that is used for encrypting/decrypting the
messages. Autocorrelators are used to recall the original messages
from the partially corrupted data produced by the decryption process.
It intends to resolve the repudiation and key management problems.
Results were analyzed in both conventional iris cryptography system
(CIC) and the non-repudiation iris cryptography system (NRIC). The
results show that this new approach provides considerably high
authentication accuracy in the enciphering and deciphering processes.
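One way to picture turning a 135-bit iris code into usable key material is sketched below. Note that the paper uses the iris code itself (with autocorrelator-based recovery of corrupted data) rather than a hash; hashing is a common alternative shown purely for illustration, with a randomly generated code standing in for real iris features:

```python
import hashlib
import random

def derive_key(iris_code_bits):
    """Derive a symmetric key by hashing a biometric bit string.
    (Illustrative variation; the paper works with the 135-bit code
    directly.)"""
    as_int = int("".join(map(str, iris_code_bits)), 2)
    as_bytes = as_int.to_bytes((len(iris_code_bits) + 7) // 8, "big")
    return hashlib.sha256(as_bytes).hexdigest()

rng = random.Random(7)
code = [rng.randint(0, 1) for _ in range(135)]   # stand-in for iris features
key_enroll = derive_key(code)
key_verify = derive_key(code)                     # same iris -> same key
key_other = derive_key([1 - b for b in code])     # different pattern
```

The same iris pattern deterministically yields the same key, while a different pattern yields an unrelated one, which is the property the cryptosystem relies on.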
Abstract: Rainfall data at fine resolution and knowledge of their
characteristics play a major role in the efficient design and operation
of agricultural, telecommunication, runoff and erosion control, as well
as water quality control systems. This paper aims to study the
statistical distribution of hourly rainfall depth for 12 representative
stations spread across Peninsular Malaysia. Hourly rainfall data
covering periods of 10 to 22 years were collected and their statistical
characteristics were estimated. Three probability distributions, namely the Generalized
Pareto, Exponential and Gamma distributions were proposed to
model the hourly rainfall depth, and three goodness-of-fit tests,
namely, Kolmogorov-Smirnov, Anderson-Darling and Chi-Squared
tests were used to evaluate their fitness. Results indicate that the
east coast of the Peninsula receives a higher depth of rainfall than
the west coast; however, the rainfall frequency is found to be
irregular. Results from the goodness-of-fit tests show that all three
models fit the rainfall data at the 1% level of significance.
However, the Generalized Pareto distribution fits better than the
Exponential and Gamma distributions and is therefore recommended
as the best fit.
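As an illustration of the goodness-of-fit step, the Kolmogorov-Smirnov statistic for an Exponential fit can be computed with the standard library alone. The Generalized Pareto and Gamma fits and the other two tests are omitted, and the synthetic sample stands in for the rainfall data:

```python
import math
import random

def ks_statistic_exponential(data):
    """Kolmogorov-Smirnov statistic for an Exponential fit whose rate
    is estimated by maximum likelihood (rate = 1 / sample mean): the
    largest gap between the empirical and fitted CDFs."""
    xs = sorted(data)
    n = len(xs)
    rate = n / sum(xs)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = 1.0 - math.exp(-rate * x)    # fitted Exponential CDF
        d = max(d, abs(cdf - i / n), abs((i + 1) / n - cdf))
    return d

rng = random.Random(1)
sample = [rng.expovariate(0.8) for _ in range(500)]  # stand-in rainfall depths
d_stat = ks_statistic_exponential(sample)
```

A small statistic (relative to the critical value at the chosen significance level) means the fitted distribution is not rejected, which is how the three candidate models were compared.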
Abstract: Large- and small-scale shaking table tests, conducted
to investigate the damage evolution of piles inside liquefied soil,
are numerically simulated and experimentally verified by 3D nonlinear
finite element analysis. The damage evolution of elasto-plastic
circular steel piles and of reinforced concrete (RC) piles with
cracking and yielding of reinforcement is the focus, and the failure
patterns and residual damage are captured by the proposed constitutive
models. The excitation of the superstructure behind the quay wall is
reproduced as well.
Abstract: Extraction of laccase produced by L. polychrous in an
aqueous two-phase system, composed of polyethylene glycol and
phosphate salt at pH 7.0 and 25 °C, was investigated. The effect of
PEG molecular weight, PEG concentration and phosphate
concentration was determined. Laccase preferentially partitioned to
the top phase. Good extraction of laccase to the top phase was
observed with PEG 4000. The optimum system contained 12% w/w
PEG 4000 and 16% w/w phosphate salt
with KE of 88.3, purification factor of 3.0-fold and 99.1% yield.
Some properties of the enzyme such as thermal stability, effect of
heavy metal ions and kinetic constants were also presented in this
work. The thermal stability decreased sharply at temperatures
above 60 °C. The enzyme was inhibited by Cd2+, Pb2+, Zn2+ and
Cu2+. The Vmax and Km values of the enzyme were 74.70
μmol/min/ml and 9.066 mM respectively.
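The reported kinetic constants plug directly into the Michaelis-Menten equation; in particular, at a substrate concentration equal to Km the rate is half of Vmax:

```python
def michaelis_menten(s, vmax, km):
    """Michaelis-Menten rate: v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)

# Constants reported in the abstract: Vmax = 74.70 umol/min/ml,
# Km = 9.066 mM. At [S] = Km the rate equals Vmax / 2.
v_at_km = michaelis_menten(9.066, vmax=74.70, km=9.066)
```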
Abstract: The convergence of heterogeneous wireless access technologies characterizes 4G wireless networks. In such converged systems, seamless and efficient handoff between
different access technologies (vertical handoff) is essential and remains a challenging problem. The heterogeneous co-existence of access technologies with largely different characteristics creates the decision problem of determining the "best" available network at the
"best" time, so as to reduce unnecessary handoffs. This paper proposes a dynamic decision model that decides the "best" network at the "best"
moment to hand off. The proposed dynamic decision model makes the right vertical handoff decisions by determining the "best"
network at the "best" time among available networks based on dynamic
factors, such as the received signal strength (RSS) of the network and
the velocity of the mobile station, together with static factors such as usage expense, link capacity (offered bandwidth) and power
consumption. This model not only meets individual user needs but also improves overall system performance by reducing unnecessary handoffs.
Abstract: Educational reform is a focal point for many nations.
New reform movements generally claim that something is wrong with
the current state of affairs, that the system is deficient in its goals
and accomplishments, and that it has failed to adapt to global changes
around the world. The same holds for the Turkish education system.
This paper considers recent reforms of teacher education in Turkey
and the extent to which they reflect a response to global economic
pressures. The paper challenges the view that such impositions are
inevitable determinants of educational policy and argues that any
country needs to develop its own national approach to modernizing
teacher education in light of the global context and its particular
circumstances. It draws on the idea of reflexive modernization
developed by educators and discusses its implications for teacher
education policy. The paper deals with four themes in Turkish teacher
education policy over the last decade: the shift away from the
educational disciplines, the shift towards school-based approaches,
and the emergence of more centralized forms of accountability for
teacher competence.