Abstract: Node Gain Scores (NGSs), determined from optimized bandwidth and latency discrepancy ratios, are used as the basis for shaping a max-heap overlay. The NGSs, determined as the respective bandwidth-latency products, govern the construction of the max-heap-form overlay. Each NGS combines the discrepancy ratio of the requested bandwidth with respect to the estimated available bandwidth and the latency discrepancy ratio between the node and the source node. The resulting tree yields enhanced-delivery overlay multicasting, increasing packet delivery that could otherwise be hindered by the packet loss induced in schemes that do not consider the combined effect of these parameters when placing nodes in the overlay. The NGS is a function of four main parameters: the estimated available bandwidth, Ba; the individual node's requested bandwidth, Br; the proposed latency from the node to its prospective parent, Lp; and the best latency suggested by the source node, Lb. The bandwidth discrepancy ratio (BDR) and latency discrepancy ratio (LDR) carry weights of α and (1,000 − α), respectively, with α chosen between 0 and 1,000 so that the NGS values, used as node IDs, remain likely to be unique while balancing the relative importance of the BDR and the LDR. The max-heap-form tree is constructed under the assumption that every node's NGS is less than that of the source node. To maintain load balance, children are distributed evenly among the siblings of each level: a node cannot accept a second child, and so on, until all of its siblings able to do so have acquired the same number of children, proceeding logically from left to right in the conceptual overlay tree. Records of the pairwise approximate available bandwidths, measured at individual nodes with the pathChirp scheme, are maintained.
The scheme has been evaluated against the Bandwidth Aware multicaSt architecturE (BASE), the Tree Building Control Protocol (TBCP), and the Host Multicast Tree Protocol (HMTP). The new scheme generally offers a better trade-off among packet delivery ratio, link stress, control overhead, and end-to-end delay.
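The abstract does not give the exact NGS formula, only its ingredients; the following sketch assumes the weighted combination NGS = α·BDR + (1,000 − α)·LDR, with BDR = Ba/Br and LDR = Lb/Lp chosen so that higher scores favour nodes with spare bandwidth and low latency. All of these forms are assumptions for illustration only.

```python
# Hypothetical Node Gain Score (NGS) sketch. The exact formula is not
# stated in the abstract; we assume NGS = alpha * BDR + (1000 - alpha) * LDR,
# with BDR = Ba / Br and LDR = Lb / Lp, so that nodes with more available
# bandwidth and lower proposed latency score higher and sit nearer the
# root of the max-heap overlay.

def node_gain_score(Ba, Br, Lp, Lb, alpha=500):
    """Ba: estimated available bandwidth; Br: requested bandwidth;
    Lp: proposed latency to prospective parent; Lb: best latency
    suggested by the source; alpha: weight in [0, 1000]."""
    bdr = Ba / Br            # bandwidth discrepancy ratio (assumed form)
    ldr = Lb / Lp            # latency discrepancy ratio (assumed form)
    return alpha * bdr + (1000 - alpha) * ldr
```

Under these assumptions a well-provisioned, low-latency node outranks a constrained, high-latency one, which is the ordering the max-heap construction relies on.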
Abstract: As music is increasingly used therapeutically and its meditative effects are being verified, interest is growing in the psychological balance or remedy that music can provide. Traditional studies have verified that music whose spectral envelope varies approximately as 1/f (where f is frequency) down to the low-frequency band provides psychological balance. In this paper, we investigate the signal properties of music that provides psychological balance, deriving these properties from the voice. Music composed from voice shows large values of the normalized cumulative spectral distribution (NCSD), and we measure the degree of difference between pieces of music by the curvature of the NCSD. In music that provides psychological balance the curvature is high; otherwise, the curvature is low.
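The abstract names the curvature of the normalized cumulative spectral distribution as its discriminating feature but does not define it precisely; the following is one plausible pure-Python reading, where NCSD(k) is the running sum of the magnitude spectrum normalized to 1 and curvature is approximated by the largest absolute second difference of that curve. Both choices are assumptions.

```python
# Illustrative sketch of the NCSD and a discrete curvature proxy for it.
# The paper's exact definitions are not given in the abstract.

def ncsd(spectrum):
    """Normalized cumulative distribution of a magnitude spectrum."""
    total = sum(spectrum)
    acc, out = 0.0, []
    for s in spectrum:
        acc += s
        out.append(acc / total)
    return out

def ncsd_curvature(spectrum):
    """Largest absolute second difference of the NCSD curve."""
    c = ncsd(spectrum)
    return max(abs(c[i + 1] - 2 * c[i] + c[i - 1]) for i in range(1, len(c) - 1))
```

A flat (white) spectrum gives a linear NCSD with zero curvature, while a 1/f-shaped spectrum bends early and yields a larger curvature value, matching the qualitative claim in the abstract.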
Abstract: This paper proposes the requirements and design of an RFID-based system for shop floor control (SFC) that achieves real-time factory controllability and enables the development of an E-Manufacturing system. Detailed logical specifications of the core functions and design diagrams of the RFID-based system are developed. RFID deployment in E-Manufacturing systems is then investigated.
Abstract: In recent years, the genomes of more and more species have been sequenced, providing data for phylogenetic reconstruction based on genome rearrangement measures. A main task in all phylogenetic reconstruction algorithms is to solve the median of three problem. Although this problem is NP-hard even for the simplest distance measures, there are exact algorithms for the breakpoint median and the reversal median that are fast enough for practical use. In this paper, this approach is extended to the transposition median as well as to the weighted reversal and transposition median. Although no exact polynomial algorithm is known even for the pairwise distances, we show that these problems can in most cases be solved exactly within reasonable time using a branch and bound algorithm.
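The branch-and-bound idea can be illustrated on a simplified toy version of the breakpoint median of three. This sketch is not the paper's algorithm: it uses directed adjacencies of unsigned permutations and the cost already incurred by the prefix as a (weak) lower bound, pruning any partial median whose prefix cost already matches the best complete solution.

```python
# Toy branch-and-bound for a simplified breakpoint median of three.
# A genome is an unsigned permutation (tuple); the distance counts
# adjacencies of one permutation absent from the other.

def breakpoints(p, q):
    adj = set(zip(q, q[1:]))
    return sum(1 for a in zip(p, p[1:]) if a not in adj)

def median_bnb(genomes):
    """Exhaustively build the median permutation element by element,
    pruning when the prefix cost reaches the best known total cost."""
    adjsets = [set(zip(g, g[1:])) for g in genomes]
    best = {"cost": float("inf"), "perm": None}

    def extend(prefix, remaining, cost):
        if cost >= best["cost"]:
            return                       # bound: prefix cost only grows
        if not remaining:
            best["cost"], best["perm"] = cost, tuple(prefix)
            return
        for x in sorted(remaining):
            added = 0
            if prefix:
                a = (prefix[-1], x)      # new adjacency created by x
                added = sum(1 for s in adjsets if a not in s)
            extend(prefix + [x], remaining - {x}, cost + added)

    extend([], set(genomes[0]), 0)
    return best["perm"], best["cost"]
```

Real breakpoint-median solvers use much tighter lower bounds (e.g. derived from pairwise distances), which is what makes them practical; the skeleton of branching plus pruning is the same.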
Abstract: This paper introduces a novel multi-join algorithm for joining multiple relations. The algorithm is based on a hash-based join of two relations that produces a double index, built by scanning the two relations once. Instead of moving records into buckets, a double index is constructed, eliminating the collisions that can occur with a complete hash algorithm. The double index is divided into join buckets of similar categories from the two relations, and the algorithm joins buckets with matching keys to produce joined buckets. This ultimately yields a complete join index of the two relations without actually joining the relations themselves. The time complexity of building the join index of two categories is O(m log m), where m is the size of each category, for a total of O(n log m) over all buckets. The join index is used to materialize the joined relation if required; otherwise, it is used along with the join indices of other relations to build a lattice for multi-join operations with minimal I/O requirements. The lattice of join indices can be fitted into main memory to reduce the time complexity of the multi-join algorithm.
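The abstract does not specify the double-index layout, so the following sketch shows only the general idea of a join index: sorting each relation's (key, row-id) pairs is the O(m log m) step, after which a linear merge emits matching row-id pairs without materializing any joined rows. The function name and layout are illustrative assumptions.

```python
# Sketch of building a join index (pairs of row ids) for r JOIN s on a key,
# without materializing joined rows. Sorting dominates at O(m log m);
# the merge over equal-key runs is linear in the output.

def build_join_index(r, s, key=lambda row: row[0]):
    ri = sorted(range(len(r)), key=lambda i: key(r[i]))
    si = sorted(range(len(s)), key=lambda j: key(s[j]))
    out, i, j = [], 0, 0
    while i < len(ri) and j < len(si):
        ki, kj = key(r[ri[i]]), key(s[si[j]])
        if ki < kj:
            i += 1
        elif ki > kj:
            j += 1
        else:
            # emit this r-row paired with every s-row in the equal-key run
            j2 = j
            while j2 < len(si) and key(s[si[j2]]) == ki:
                out.append((ri[i], si[j2]))
                j2 += 1
            i += 1
    return out
```

The returned index can later materialize the join, or be combined with other relations' indices, which parallels the lattice construction described above.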
Abstract: Y chromosome microdeletions are the most common
genetic cause of male infertility and screening for these
microdeletions in azoospermic or severely oligospermic men is now
standard practice. Analysis of the Y chromosome in men with
azoospermia or severe oligozoospermia has resulted in the
identification of three regions in the euchromatic part of the long arm
of the human Y chromosome (Yq11) that are frequently deleted in
men with otherwise unexplained spermatogenic failure. PCR analysis
of microdeletions in the AZFa, AZFb and AZFc regions of the
human Y chromosome is an important screening tool. The aim of this
study was to analyse the types of microdeletions in men with fertility disorders in Slovakia. We evaluated 227 patients with azoospermia and a normal karyotype. All patient samples were analyzed cytogenetically. The Devyser AZF set was used for PCR amplification of sequence-tagged sites (STS) of the AZFa, AZFb and AZFc regions of the Y chromosome. Fluorescently labeled primers for all markers were used in one multiplex PCR reaction, and automated visualization and identification of the STS markers were performed on an ABI 3500xl genetic analyzer (Life Technologies). We report 13 cases of deletions in the AZF region (5.73%). Particular types of deletions were recorded in each of the AZFa, AZFb and AZFc regions, with microdeletions in the AZFc region being the most frequent. The study confirmed that the percentage of microdeletions in the AZF region is low in Slovak azoospermic patients, but important from a prognostic point of view.
Abstract: This paper presents experimental results comparing leakage currents and discharge currents. The leakage currents were obtained on a polluted porcelain insulator, whereas the discharge currents were obtained on a lightly, artificially polluted porcelain specimen. The measured quantities were the leakage or discharge current and the applied voltage; the insulator or specimen was placed in a hermetically sealed chamber, and the current waveforms were analyzed using the FFT. The results indicate that in the leakage current (LC) under low relative humidity the fifth harmonic is visible, followed by the seventh harmonic, and the insulator behaves capacitively. At 99% relative humidity the fifth harmonic is also visible, and the phase angle reaches 12.2 degrees. In the discharge current, by contrast, the third harmonic is visible, followed by the fifth harmonic, and the third harmonic increases as the pressure is reduced; under this condition the specimen exhibits non-linear characteristics.
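The harmonic analysis described (magnitudes and phase angles of the third, fifth and seventh harmonics) amounts to reading single bins of the DFT of the sampled current. A minimal pure-Python sketch, correlating the waveform with sine and cosine at each harmonic rather than calling an FFT library:

```python
import math

def harmonic(samples, k, samples_per_cycle):
    """Magnitude and phase (degrees) of the k-th harmonic of a waveform
    sampled uniformly over an integer number of cycles. Equivalent to
    reading one DFT bin, as in the FFT analysis described above."""
    n = len(samples)
    w = 2 * math.pi * k / samples_per_cycle
    re = sum(s * math.cos(w * i) for i, s in enumerate(samples)) * 2 / n
    im = sum(s * math.sin(w * i) for i, s in enumerate(samples)) * 2 / n
    return math.hypot(re, im), math.degrees(math.atan2(im, re))
```

For example, a current containing a fundamental plus a small fifth harmonic yields near-zero magnitude at the seventh harmonic and recovers the injected amplitudes at the first and fifth.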
Abstract: An effective approach for realizing a binary tree structure representing combinational logic with enhanced throughput is discussed in this paper. The optimization of the maximum operating frequency was achieved through delay minimization, in turn obtained by reducing the depth of the binary network. The proposed synthesis methodology has been validated experimentally with FPGAs as the target technology. Although our proposal is technology independent, the heuristic achieves better throughput optimization, even after technology mapping, for Boolean functions whose reduced CNF form has a lower literal cost than their reduced DNF form at the Boolean equation level; in other cases, our method converges to results similar to those of [12]. Practical results obtained for a variety of case studies demonstrate improvements in the maximum throughput rate for the Spartan IIE (XC2S50E-7FT256) and Spartan 3 (XC3S50-4PQ144) FPGA logic families of 10.49% and 13.68%, respectively. With respect to the LUTs and IOBUFs required for the physical implementation of the requisite non-regenerative logic functionality, the proposed method achieved savings of 44.35% and 44.67%, respectively, over the existing efficient method in the literature [12].
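The selection criterion above hinges on comparing the literal cost of a function's reduced CNF and DNF forms. A minimal sketch of that comparison, modelling each form as a list of clauses/terms of literal strings (the representation and function names are illustrative assumptions, not the paper's):

```python
# Literal cost comparison sketch. A CNF or DNF form is a list of
# clauses/terms, each a tuple of literals like 'a' or '~b'; literal
# cost is simply the total number of literal occurrences.

def literal_cost(clauses):
    return sum(len(c) for c in clauses)

def cheaper_form(cnf, dnf):
    """Report which reduced form has the lower literal cost — the
    condition under which the heuristic above optimizes best."""
    return "CNF" if literal_cost(cnf) < literal_cost(dnf) else "DNF"
```

For instance f = (a+b)(a+c) has CNF cost 4, while its DNF a + bc has cost 3, so DNF is the cheaper form for that function.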
Abstract: We identify clawback triggers from firms' proxy statements (Form DEF 14A) and use the likelihood of restatements to proxy for financial reporting quality. Based on a sample of 578 U.S. firms that voluntarily adopted clawback provisions during 2003-2009, we decompose restatement-based triggers into two types, fraud and unintentional error, and find evidence that using fraud triggers is associated with higher financial reporting quality. The findings support the view that fraud triggers can enhance the deterrent effect of clawback provisions by establishing a viable disincentive against fraud, misconduct, and other harmful acts. These results are robust to controlling for compensation components, to different sample specifications, and to a number of sensitivity tests.
Abstract: In this paper, a new algorithm for generating codebook is proposed for vector quantization (VQ) in image coding. The significant features of the training image vectors are extracted by using the proposed Orthogonal Polynomials based transformation. We propose to generate the codebook by partitioning these feature vectors into a binary tree. Each feature vector at a non-terminal node of the binary tree is directed to one of the two descendants by comparing a single feature associated with that node to a threshold. The binary tree codebook is used for encoding and decoding the feature vectors. In the decoding process the feature vectors are subjected to inverse transformation with the help of basis functions of the proposed Orthogonal Polynomials based transformation to get back the approximated input image training vectors. The results of the proposed coding are compared with the VQ using Discrete Cosine Transform (DCT) and Pairwise Nearest Neighbor (PNN) algorithm. The new algorithm results in a considerable reduction in computation time and provides better reconstructed picture quality.
Abstract: SoftBoost is a recently presented boosting algorithm,
which trades off the size of achieved classification margin and
generalization performance. This paper presents a performance
evaluation of SoftBoost algorithm on the generic object recognition
problem. An appearance-based generic object recognition
model is used. The evaluation experiments are performed using
a difficult object recognition benchmark. An assessment with respect
to different degrees of label noise as well as a comparison to
the well-known AdaBoost algorithm is performed. The obtained results reveal that SoftBoost is preferable when the training data are known to have a high degree of label noise; otherwise, AdaBoost achieves better performance.
Abstract: BACKGROUND: DIALIGN is a DNA/protein alignment tool for performing pairwise and multiple alignments through the comparison of gap-free segments (fragments) between sequence pairs. An alignment of two sequences is a chain of fragments, i.e. local gap-free pairwise alignments, with the highest total score. METHOD: This article defines a new approach that relies on using three-dimensional fragments, i.e. local three-way alignments, in the alignment process instead of two-dimensional ones. These three-dimensional fragments are gap-free alignments consisting of equal-length segments belonging to three distinct sequences. RESULTS: The results obtained show good improvements over the performance of DIALIGN.
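The chaining step above, finding the highest-scoring chain of colinear, non-overlapping fragments, can be sketched with a simple O(n²) dynamic program over 2-D fragments; the 3-D case adds one more start coordinate and one more non-overlap check. The fragment encoding is an illustrative assumption.

```python
# Minimal fragment-chaining sketch. Each fragment is (i, j, length, score):
# a gap-free local alignment starting at position i in one sequence and j
# in the other. Fragment b can precede fragment a in a chain if b ends at
# or before a's start in both sequences.

def best_chain_score(fragments):
    frags = sorted(fragments, key=lambda f: (f[0] + f[2], f[1] + f[2]))
    best = [f[3] for f in frags]        # best chain score ending at each fragment
    for a in range(len(frags)):
        ia, ja, la, sa = frags[a]
        for b in range(a):
            ib, jb, lb, sb = frags[b]
            if ib + lb <= ia and jb + lb <= ja:   # colinear, non-overlapping
                best[a] = max(best[a], best[b] + sa)
    return max(best) if frags else 0.0
```

Sparse-dynamic-programming variants bring this down from quadratic time, but the recurrence is the one shown.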
Abstract: Network layer multicast, i.e. IP multicast, even after
many years of research, development and standardization, is not
deployed in large scale due to both technical (e.g. upgrading of
routers) and political (e.g. policy making and negotiation) issues.
Researchers looked for alternatives and proposed application/overlay
multicast where multicast functions are handled by end hosts, not
network layer routers. Member hosts wishing to receive multicast
data form a multicast delivery tree. The intermediate hosts in the tree
act as routers also, i.e. they forward data to the lower hosts in the
tree. Unlike IP multicast, where a router cannot leave the tree until all members below it leave, in overlay multicast any member can leave the tree at any time, thus partitioning the tree and disrupting data dissemination. All disrupted hosts then have to rejoin the tree. This characteristic of overlay multicast makes the multicast tree unstable and causes data loss and rejoin overhead. In this paper, we propose that each node
sets its leaving time from the tree and sends join request to a number
of nodes in the tree. The nodes in the tree will reject the request if
their leaving time is earlier than the requesting node otherwise they
will accept the request. The node can join at one of the accepting
nodes. This makes the tree more stable as the nodes will join the tree
according to their leaving time, earliest leaving time node being at the
leaf of the tree. Some intermediate nodes may not follow their leaving
time and leave earlier than their leaving time thus disrupting the tree.
For this, we propose a proactive recovery mechanism so that disrupted
nodes can rejoin the tree at predetermined nodes immediately. We
have shown by simulation that there is less overhead when joining
the multicast tree and the recovery time of the disrupted nodes is
much less than in previous works.
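The acceptance rule described above (a node rejects a join request if it plans to leave earlier than the requester) is simple enough to state directly in code; the parent-selection policy among accepting nodes is our illustrative assumption.

```python
# Sketch of the leaving-time join rule proposed above: a node accepts a
# join request only if it will stay in the tree at least as long as the
# requester, so longer-lived nodes end up closer to the root.

def accepts(parent_leaving_time, requester_leaving_time):
    return parent_leaving_time >= requester_leaving_time

def choose_parent(candidates, requester_leaving_time):
    """candidates: list of (node_id, leaving_time). Among accepting
    candidates, pick (for illustration) the one leaving latest."""
    accepting = [c for c in candidates if accepts(c[1], requester_leaving_time)]
    return max(accepting, key=lambda c: c[1])[0] if accepting else None
```

Applied recursively, this rule sorts the tree by leaving time, with the earliest-leaving nodes pushed toward the leaves, which is exactly the stability property the abstract claims.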
Abstract: The major objective of this paper is to introduce a new method for selecting genes from DNA microarray data. As the selection criterion, we propose measuring the local changes in the correlation graph of each gene and selecting those genes whose local changes are largest. More precisely, we calculate correlation networks from DNA microarray data of cervical cancer, where each network represents tissue of a certain tumor stage and each node in the network represents a gene. From these networks we extract one tree for each gene by a local decomposition of the correlation network. The tree represents, on its n-th level, the n-nearest-neighbor genes as measured by the Dijkstra distance, and hence gives the local embedding of a gene within the correlation network. For the obtained trees we measure the pairwise similarity between trees rooted at the same gene from normal to cancerous tissues, which evaluates the modification of tree topology due to tumor progression. Finally, we rank the obtained similarity values from all tissue comparisons and select the top-ranked genes: those whose local neighborhoods in the correlation networks change most between normal and cancerous tissues. As a result we find that the top-ranked genes are candidates suspected of involvement in tumor growth, indicating that our method captures essential information from the underlying DNA microarray data of cervical cancer.
Abstract: Games can be classified as games of skill, games of chance, or otherwise as mixed games. This paper deals with scientifically classifying mixed games as more reliant on elements of chance or elements of skill, and with ways to scientifically measure the amount of skill involved. This is predominantly useful for classifying games as legal or illegal in different jurisdictions based on local gaming laws. We propose a novel measure of the skill-to-chance ratio called the Game Skill Measure (GSM) and utilize it to calculate the skill component of a popular variant of Poker.
Abstract: The purpose of this study is to revisit the concept of rape as represented by professionals in the literature, as well as its perception (beliefs and attitudes) in the population at large, and to propose methodological improvements to its measurement tool. Rape is a serious crime threatening its victim's physical and mental health and integrity, and as such is legally prosecuted in all modern societies. The problem is not in accepting or rejecting rape as a criminal act, but rather in the vagueness of its interpretations and "justifications" maintained in the mentality of modern societies, known in the literature as the phenomenon of the "rape myth". The rape myth can be studied from different perspectives: criminology, sociology, ethics, medicine and psychology. Its investigation requires rigorous scientific objectivity, free of passion (victims of rape are at risk of emotional bias), free of activism (social activists, even if well-intentioned, are also biased), and free of any pre-emptive assumptions or prejudices. To apply a rigorous scientific procedure, we need a solid, valid and reliable measurement. Rape is a form of heterosexual or homosexual aggression that violently forces the victim to give in to the sexual activity of the aggressor against her/his will. Human beings always try to "understand" or find a reason justifying their acts. The psychological literature provides multiple clinical and experimental examples: the famous studies by Milgram on the level of electric shock delivered by the "teacher" to the "learner" when "scientifically justified", or the studies on the behavior of "prisoners" and "guards", among many other experiments and field observations. Sigmund Freud described the phenomenon of unconscious justification and called it rationalization. The multiple justifications, rationalizations and repeated opinions about sexual behavior contribute to a myth maintained in society. What kind of "rationale" do our societies apply to "understand" non-consensual sexual behavior? There are many; to mention a few:
• Sex is a ludic activity for both participants; therefore, even if not consented to, it should bring pleasure to both.
• Everybody wants sex, but only men are allowed to manifest
it openly while women have to pretend the opposite, thus men have
to initiate sexual behavior and women would follow.
• A person who strongly needs sex is free to manifest it and struggle to get it; the person who does not want it must not reveal her/his sexual attraction and should avoid risky situations; otherwise she/he is perceived as a promiscuous seducer.
• A person who does not fight against the sexual initiator unconsciously accepts the rape (does this explain why homosexual rapes are reported less frequently than rapes against women?).
• Women who are raped deserve it because their wardrobe is very revealing and seductive and they "willingly" go to highly risky places (alleys, dark roads, etc.).
• Men need to ventilate their sexual energy and if they are
deprived of a partner their urge to have sex is difficult to control.
• Men are supposed to initiate and insist even by force to have
sex (their testosterone makes them both sexual and aggressive).
The paper reviews numerous cultural beliefs about masculine versus feminine behavior and their impact on the "rape myth".
Abstract: Self-organizing map (SOM) is a well known data
reduction technique used in data mining. It can reveal structure in
data sets through data visualization that is otherwise hard to detect
from raw data alone. However, interpretation through visual
inspection is prone to errors and can be very tedious. There are
several techniques for the automatic detection of clusters of code
vectors found by SOM, but they generally do not take into account
the distribution of code vectors; this may lead to unsatisfactory
clustering and poor definition of cluster boundaries, particularly
where the density of data points is low. In this paper, we propose the
use of an adaptive heuristic particle swarm optimization (PSO)
algorithm for finding cluster boundaries directly from the code
vectors obtained from SOM. The application of our method to
several standard data sets demonstrates its feasibility. The PSO algorithm utilizes the so-called U-matrix of the SOM to determine cluster boundaries; the results of this novel automatic method compare very favorably with boundary detection through traditional algorithms, namely k-means and hierarchical clustering, which are normally used to interpret the output of a SOM.
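The U-matrix the PSO operates on is a standard SOM construct: for each map unit, the average distance between its code vector and the code vectors of its grid neighbours, with high values marking cluster boundaries. A minimal sketch, assuming a rectangular grid with 4-connected neighbours (the neighbourhood choice is our assumption):

```python
import math

def u_matrix(codebook):
    """codebook: 2-D grid (list of rows) of code-vector tuples.
    Returns a grid of the same shape where each cell is the mean
    Euclidean distance from that unit's code vector to its 4-connected
    grid neighbours; high values indicate cluster boundaries."""
    rows, cols = len(codebook), len(codebook[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            dists = []
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    dists.append(math.dist(codebook[r][c], codebook[nr][nc]))
            out[r][c] = sum(dists) / len(dists)
    return out
```

On a map whose top and bottom rows hold code vectors from two distant clusters, the U-matrix values straddling the boundary are large while same-cluster cells score near zero, which is the structure the boundary-detection step exploits.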
Abstract: In spite of the advent of new materials, clay bricks
remain, arguably, the most popular construction materials today.
Nevertheless, the low cost and versatility of clay bricks cannot always be associated with high environmental and sustainability values,
especially in terms of raw material sources and manufacturing
processes. At the same time, the worldwide agricultural footprint is
fast growing, with vast agricultural land cultivation and active
expansion of the agro-based industry. The resulting large quantities of
agricultural wastes, unfortunately, are not always well managed or
utilised. These wastes can be recycled, such as by retrieving fibres
from disposed leaves and fruit bunches, and then incorporated in
brick-making. This way the clay bricks are made a 'greener' building
material and the discarded natural wastes can be reutilised, avoiding
otherwise wasteful landfill and harmful open incineration. This study
examined the physical and mechanical properties of clay bricks made
by adding two natural fibres to a clay-water mixture, with baked and
non-baked conditions. The fibres were sourced from pineapple leaves
(PF) and oil palm fruit bunch (OF), and added within the range of
0.25-0.75 %. Cement was added as a binder to the mixture at 5-15 %.
Although the two fibres had different effects on the bricks produced,
cement appeared to dominate the compressive strength. The
non-baked bricks disintegrated when submerged in water, while the
baked ones displayed cement-dependent characteristics in
water-absorption and density changes. Interestingly, further increase
in fibre content did not cause significant density decrease in both the
baked and non-baked bricks.
Abstract: This paper looks into frameworks that aim at furthering the discussion of the role of regenerative design practices in a city's historic core and of urban design as a tool for achieving urban revitalization on the island of Cyprus. It also examines the region's
demographic mix, the effectiveness of its governmental coordination
and the strategies of adaptive reuse and strategic investments in older
areas with existing infrastructure. The two main prongs of
investigation will consider the effect of the existing and proposed
changes in the physical infrastructure and fabric of the city, as well as
the catalytic effect of sustainable urban design practices. Through this
process, the work hopes to integrate the contained potential within
the existing historic core and the contributions and participation of
the migrant and immigrant populations to the local economy. It also
examines ways in which this coupling of factors can bring to the front
the positive effects of this combined effort on an otherwise sluggish
local redevelopment effort. The data for this study is being collected
and organized as part of ongoing urban design and development
student workshop efforts in urban planning and design education.
The work is presented in graphic form and includes data collected
from interviews with study area organizations and the community at
large. Planning work is also based on best practices initiated by the
staff of the Nicosia Master Plan task force, which coordinates holistic
planning efforts for the historic center of the city of Nicosia.
Abstract: Most routing protocols (DSR, AODV, etc.) designed for wireless ad hoc networks incorporate a broadcasting operation in their route discovery scheme. Probabilistic broadcasting techniques have been developed to optimize the broadcast operation, which is otherwise very expensive in terms of the redundancy and traffic it generates. In this paper we explore percolation theory to gain a different perspective on the probabilistic broadcasting schemes that have been actively researched in recent years. This theory has helped us estimate the broadcast probability in a wireless ad hoc network as a function of the size of the network. We also show that operating at these optimal values of broadcast probability yields at least a 25-30% reduction in packet regeneration during successful broadcasting.
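The setting above (probabilistic rebroadcast on a wireless ad hoc network, with a percolation-style threshold in the broadcast probability) can be explored with a small simulation on a random geometric graph, the usual model for radio connectivity. This sketch is illustrative only; the graph model and parameters are our assumptions, not the paper's setup.

```python
import random

def random_geometric_graph(n, radius, rng):
    """Nodes at random points in the unit square; an edge joins any two
    nodes within `radius` of each other (a simple radio-range model)."""
    pts = [(rng.random(), rng.random()) for _ in range(n)]
    adj = {i: [] for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if (pts[i][0] - pts[j][0]) ** 2 + (pts[i][1] - pts[j][1]) ** 2 <= radius ** 2:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def delivery_ratio(adj, p, source, rng):
    """Fraction of nodes reached when each receiving node rebroadcasts
    with probability p (the source always broadcasts)."""
    reached, frontier = {source}, [source]
    while frontier:
        nxt = []
        for u in frontier:
            if u != source and rng.random() > p:
                continue                 # this node declines to rebroadcast
            for v in adj[u]:
                if v not in reached:
                    reached.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(reached) / len(adj)
```

Sweeping p from 0 to 1 on such graphs exhibits the percolation-like transition: delivery stays near zero below a threshold probability and jumps toward the full connected component above it, which is the behaviour the percolation analysis above quantifies.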