Abstract: Eukaryotic protein-coding genes are interrupted by spliceosomal introns, which are removed from the RNA transcripts before translation into a protein. The exon-intron structures of different eukaryotic species are quite different from each other, and the evolution of such structures raises many questions. We try to address some of these questions using statistical analysis of whole genomes. We go through all the protein-coding genes in a genome and study correlations between the net length of all the exons in a gene, the number of exons, and the average length of an exon. We also take average values of these features for each chromosome and study correlations between those averages on the chromosomal level. Our data show universal features of exon-intron structures common to animals, plants, and protists (specifically, Arabidopsis thaliana, Caenorhabditis elegans, Drosophila melanogaster, Cryptococcus neoformans, Homo sapiens, Mus musculus, Oryza sativa, and Plasmodium falciparum). We have verified a linear correlation between the number of exons in a gene and the length of the protein encoded by the gene: protein length increases in proportion to the number of exons. On the other hand, the average length of an exon always decreases with the number of exons. Finally, chromosome clustering based on average chromosome properties and on the parameters of the linear regression between the number of exons in a gene and the net length of those exons demonstrates that these average chromosome properties are genome-specific features.
Abstract: Webcam systems now function as the new privileged
vantage points from which to view the city. This transformation of
CCTV technology from surveillance to promotional tool is significant
because its 'scopic regime' presents, back to the public, a new virtual
'site' that sits alongside its real-time counterpart. Significantly,
this raw 'image' data can, in fact, be co-opted and processed so as to
disrupt its original purpose. This paper will demonstrate this
disruptive capacity through an architectural project. It will reveal how
the adaptation of the webcam image offers a technical springboard by
which to initiate alternate urban form making decisions and subvert
the disciplinary reliance on the 'flat' orthographic plan. In so doing,
the paper will show how this 'digital material' exceeds the imagistic
function of the image, shifting it from being a vehicle of signification
to a site of affect.
Abstract: Coronary artery bypass grafts (CABG) are widely
studied with respect to the hemodynamic conditions which play an
important role in the development of restenosis. However, papers
concerned with the constitutive modeling of CABG are lacking in
the literature. The purpose of this study is to find a constitutive
model for CABG tissue. A sample of the CABG obtained during an
autopsy underwent an inflation-extension test. Displacements were
recorded by CCD cameras and subsequently evaluated by digital
image correlation. Pressure-radius and axial force-elongation
data were used to fit the material model. The tissue was modeled as a
one-layered composite reinforced by two families of helical fibers. The
material is assumed to be locally orthotropic, nonlinear,
incompressible and hyperelastic. Material parameters are estimated
for two strain energy functions (SEF). The first is the classical
exponential. The second SEF is logarithmic, which allows
interpretation by means of limiting (finite) strain extensibility.
The presented material parameters are estimated by optimization based
on the radial and axial equilibrium equations for a thick-walled tube.
Both material models fit the experimental data successfully. The
exponential model fits the relationship between axial force and
axial strain significantly better than the logarithmic one.
Abstract: Modeling transfer phenomena in several chemical
engineering operations leads to the resolution of systems of partial
differential equations. Depending on the complexity of the operation
mechanisms, the equations take a nonlinear form and analytical
solutions become difficult to obtain; we then have to use numerical
methods, which are based on approximations, in order to transform a
differential system into an algebraic one. The finite element method
is one of the numerical methods which can be used to obtain an
accurate solution in many complex cases of chemical engineering.
Packed columns find wide application as contactors for liquid-liquid
systems such as solvent extraction. In the literature, the modeling of
this type of equipment has received less attention in comparison with
plate columns. A two-dimensional mathematical model with radial and
axial dispersion, simulating packed tower extraction behavior, was
developed, and the partial differential equation was solved using the
finite element method by adopting the Galerkin formulation. We
developed a Mathcad program which can be used for similar
equations, and concentration profiles are obtained along the column.
The influence of radial dispersion was proved, and it cannot be
neglected; the results were compared with experimental concentrations
at the top of the column for the extraction system
acetone/toluene/water.
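The Galerkin finite-element idea can be illustrated on a one-dimensional toy problem. The sketch below solves a steady advection-dispersion equation v c' = D c'' on [0, 1] with linear elements and checks the nodal values against the exact solution; it is a minimal stand-in for the paper's two-dimensional packed-column model, and all coefficients are illustrative, not taken from the study:

```python
# Minimal Galerkin FEM sketch: steady advection-dispersion  v c' = D c''
# on [0, 1] with c(0) = 1, c(1) = 0, discretized with linear elements.
# A toy stand-in for the 2D packed-column model; numbers are illustrative.
import math

v, D = 1.0, 0.1           # advection velocity, dispersion coefficient
n_el = 50                 # number of linear elements
h = 1.0 / n_el
n_nodes = n_el + 1

# Element matrix: diffusion D/h*[[1,-1],[-1,1]] plus Galerkin
# advection v*[[-1/2, 1/2], [-1/2, 1/2]] (from the weak form).
K = [[0.0] * n_nodes for _ in range(n_nodes)]
for e in range(n_el):
    ke = [[D / h - v / 2, -D / h + v / 2],
          [-D / h - v / 2, D / h + v / 2]]
    for a in range(2):
        for b in range(2):
            K[e + a][e + b] += ke[a][b]

# Zero source term; impose Dirichlet BCs by overwriting boundary rows.
f = [0.0] * n_nodes
for j in range(n_nodes):
    K[0][j] = 0.0
    K[-1][j] = 0.0
K[0][0] = K[-1][-1] = 1.0
f[0], f[-1] = 1.0, 0.0

# Solve K c = f by Gaussian elimination with partial pivoting.
for i in range(n_nodes):
    p = max(range(i, n_nodes), key=lambda r: abs(K[r][i]))
    K[i], K[p] = K[p], K[i]
    f[i], f[p] = f[p], f[i]
    for r in range(i + 1, n_nodes):
        m = K[r][i] / K[i][i]
        f[r] -= m * f[i]
        for j in range(i, n_nodes):
            K[r][j] -= m * K[i][j]
c = [0.0] * n_nodes
for i in reversed(range(n_nodes)):
    s = sum(K[i][j] * c[j] for j in range(i + 1, n_nodes))
    c[i] = (f[i] - s) / K[i][i]

# Exact solution: c(x) = (e^Pe - e^(Pe x)) / (e^Pe - 1), Pe = v/D.
Pe = v / D
exact = [(math.exp(Pe) - math.exp(Pe * i * h)) / (math.exp(Pe) - 1)
         for i in range(n_nodes)]
err = max(abs(a - b) for a, b in zip(c, exact))
print(f"max nodal error: {err:.2e}")
```

The mesh is chosen so the cell Peclet number v h / D stays well below 2; coarser meshes would need upwinding to avoid the oscillations of the plain Galerkin scheme.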
Abstract: The aim of this paper is to describe the notion of
death for prisoners and the ways they deal with it. They express
indifference, coldness, an inability to accept blame; they have no
shame and no empathy. Is this enough to make them perform acts
verging on death? In this paper we describe the mechanisms and
regularities of self-destructive behaviour in view of the relevant
literature. The explanation of the phenomenon is of a biological
and socio-psychological nature. It must be clearly stated that all
forms of self-destructive behaviour result from various impulses, conflicts and
deficits. That is why they should be treated differently in terms of
motivation and functions which they perform in a given group of
people. Behind self-destruction there seems to be a motivational
mechanism which forces prisoners to rebel and fight against the hated
law and penitentiary systems. The imprisoned believe that pain and
suffering inflicted on them by themselves are better than passive
acceptance of repression. The variety of self-destruction acts is wide,
and some of them take strange forms. We assume that a life-death
barrier is a kind of game for them. If they cannot change the
degrading situation, their life loses sense.
Abstract: Numerical study of a plane jet occurring in a vertical
heated channel is carried out. The aim is to explore the influence of
the forced flow, issued from a flat nozzle located in the entry section
of a channel, on the up-going fluid along the channel walls. The
Reynolds number based on the nozzle width and the jet velocity
ranges between 3×10^3 and 2×10^4, whereas the Grashof number based
on the channel length and the wall temperature difference is
2.57×10^10. Computations are carried out for a symmetrically heated
channel and various nozzle positions. The system of governing
equations is solved with a finite volume method. The obtained
results show that the jet-wall interactions enhance the heat transfer,
and that varying the nozzle position modifies the heat transfer,
especially at low Reynolds numbers: the heat transfer is enhanced on
the adjacent wall but decreased on the opposite one. The numerical
velocity and temperature fields are post-processed to compute the
quantities of engineering interest such as the induced mass flow rate,
and the Nusselt number along the plates.
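As a quick sanity check on the quoted ranges, the two nondimensional groups can be computed directly. The fluid properties and inputs below are assumed illustrative values (roughly air at ambient conditions), not data from the study:

```python
# Back-of-the-envelope check of the nondimensional groups quoted above.
# Fluid properties and inputs are assumed values, not from the paper.
nu = 1.5e-5      # kinematic viscosity, m^2/s (assumed)
beta = 3.3e-3    # thermal expansion coefficient, 1/K (assumed)
g = 9.81         # gravitational acceleration, m/s^2

def reynolds(U, b):
    """Re based on jet velocity U (m/s) and nozzle width b (m)."""
    return U * b / nu

def grashof(dT, L):
    """Gr based on wall temperature difference dT (K) and channel length L (m)."""
    return g * beta * dT * L**3 / nu**2

# Illustrative inputs chosen so the numbers land in the quoted ranges.
print(f"Re = {reynolds(4.5, 0.01):.2e}")   # order 10^3
print(f"Gr = {grashof(60.0, 1.0):.2e}")    # order 10^10
```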
Abstract: More recent satellite projects/programs make
extensive use of real-time embedded systems. 16-bit processors
which meet the MIL-STD-1750 standard architecture have been used in
on-board systems. Most space applications have been written
in Ada. From a futuristic point of view, 32-bit/64-bit processors are
needed in the area of spacecraft computing and therefore an effort is
desirable in the study and survey of 64 bit architectures for space
applications. This will also result in significant technology
development in terms of VLSI and software tools for Ada (as the
legacy code is in Ada).
There are several basic requirements for a special processor for
this purpose. They include Radiation Hardened (RadHard) devices,
very low power dissipation, compatibility with existing operational
systems, scalable architectures for higher computational needs,
reliability, higher memory and I/O bandwidth, predictability, a
real-time operating system, and manufacturability of such processors.
Further on, these may include selection of FPGA devices, selection
of EDA tool chains, design flow, partitioning of the design, pin
count, performance evaluation, timing analysis etc.
This project deals with a brief study of 32 and 64 bit processors
readily available in the market and designing/ fabricating a 64 bit
RISC processor named RISC MicroProcessor with added
functionalities of an extended double precision floating point unit
and a 32 bit signal processing unit acting as co-processors. In this
paper, we emphasize the ease and importance of using open cores
(the OpenSPARC T1 Verilog RTL) and open-source EDA tools such as
Icarus Verilog to develop FPGA-based prototypes quickly. Commercial tools
such as Xilinx ISE for Synthesis are also used when appropriate.
Abstract: In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), the capabilities of programming languages such as symbolic and intuitive programming, program portability and geometrical portfolio have special importance. They make it possible to save time and to avoid errors during part programming, and permit code re-use. Our updated literature review indicates that the current state of the art presents voids in parametric programming, program portability and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language which allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing for flexibility in the choice of the executing CNC machine and for portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit re-use of the programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
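To illustrate the compile-to-elementary-G-code idea, here is a toy translator that lowers named variables and a flat WHILE loop into plain G-code lines. Its miniature syntax (LET, WHILE/END, {name} substitution) is invented for illustration and is not the actual EGCL grammar:

```python
# Toy illustration of lowering a high-level part program (variables and
# a WHILE loop) into elementary ISO G-code. The miniature syntax here is
# made up for illustration; it is NOT the real EGCL grammar. Nested
# loops are not supported in this sketch.

def compile_mini(src: str) -> list[str]:
    env: dict[str, float] = {}
    out: list[str] = []
    lines = [l.strip() for l in src.strip().splitlines() if l.strip()]
    i = 0
    while i < len(lines):
        line = lines[i]
        if line.startswith("LET "):
            name, expr = line[4:].split("=", 1)
            env[name.strip()] = eval(expr, {"__builtins__": {}}, env)
            i += 1
        elif line.startswith("WHILE "):
            cond = line[6:]
            j = lines.index("END", i)        # matching END (flat loops only)
            body = lines[i + 1 : j]
            while eval(cond, {"__builtins__": {}}, env):
                for b in body:
                    if b.startswith("LET "):
                        name, expr = b[4:].split("=", 1)
                        env[name.strip()] = eval(expr, {"__builtins__": {}}, env)
                    else:
                        out.append(b.format(**env))  # substitute variables
            i = j + 1
        else:
            out.append(line.format(**env))
            i += 1
    return out

program = """
LET depth = 0.0
G00 X0 Y0
WHILE depth > -3.0
    LET depth = depth - 1.0
    G01 Z{depth} F100
END
G00 Z5.0
"""
for g in compile_mini(program):
    print(g)
```

Unrolling the loop at compile time, as above, is what makes the output executable on any ISO-compliant controller with no run-time support for variables.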
Abstract: This paper proposes a new method for analyzing textual data. The method deals with items of textual data, where each item is described from various viewpoints. It acquires 2-class classification models of the viewpoints by applying an inductive learning method to items with multiple viewpoints, and infers whether the viewpoints are assigned to new items by using these models. It then extracts expressions from the new items classified into the viewpoints and identifies characteristic expressions corresponding to the viewpoints by comparing the frequency of expressions among the viewpoints. This paper also applies the method to questionnaire data given by guests at a hotel and verifies its effect through numerical experiments.
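The frequency-comparison step can be sketched simply: expressions that occur much more often under one viewpoint than under the others are taken as characteristic of it. The texts and viewpoints below are invented examples, not the paper's questionnaire data:

```python
# Sketch of the frequency-comparison step: words far more frequent under
# one viewpoint than the others are "characteristic" of it.
# Texts and viewpoint labels are invented examples.
from collections import Counter

docs = {
    "service": ["friendly helpful staff", "staff were friendly",
                "helpful front desk"],
    "room":    ["clean spacious room", "room was clean",
                "spacious quiet room"],
}

def characteristic(docs, viewpoint, ratio=2.0):
    """Words at least `ratio` times more frequent under `viewpoint`."""
    target = Counter(w for t in docs[viewpoint] for w in t.split())
    other = Counter(w for v, ts in docs.items() if v != viewpoint
                    for t in ts for w in t.split())
    return sorted(w for w, c in target.items()
                  if c >= ratio * other.get(w, 0) and c >= 2)

print(characteristic(docs, "service"))
print(characteristic(docs, "room"))
```

In the full method this comparison would run over items first routed to each viewpoint by the learned 2-class classifiers.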
Abstract: Some believe that stigma is the worst side effect faced by
people who have mental illness. Mental illness researchers have
focused on the influence of mass media on the stigmatization of
people with mental illness. However, no studies have investigated the
effects of the interactive media, such as blogs, on the stigmatization
of mentally ill people, even though the media have a significant
influence on people in all areas of life. The purpose of this study is to
investigate the use of interactivity in destigmatization of the mentally
ill and the moderating effect of self-construal (independent versus
interdependent self-construal) on the relation between interactivity
and destigmatization. The findings suggested that people in the
human-human interaction condition had less social distance toward
people with mental illness. Additionally, participants with higher
independence showed more favorable affect and less social
distance toward mentally ill people. Finally, direct contact with
mentally ill people increased a person's positive affect toward people
with mental illness. The current study should provide insights for
mental health practitioners by suggesting how they can use
interactive media to approach the public that stigmatizes the mentally
ill.
Abstract: Clustering algorithms help to understand the hidden
information present in datasets. A dataset may contain intrinsic and
nested clusters, the detection of which is of utmost importance. This
paper presents a Distributed Grid-based Density Clustering algorithm
capable of identifying arbitrary shaped embedded clusters as well as
multi-density clusters over large spatial datasets. For handling
massive datasets, we implemented our method using a 'shared-nothing'
architecture where multiple computers are interconnected
over a network. Experimental results are reported to establish the
superiority of the technique in terms of scale-up, speedup as well as
cluster quality.
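The grid-based density idea can be sketched on a single machine: bin points into cells, keep cells above a density threshold, and merge adjacent dense cells into clusters. This sketch omits the paper's distributed shared-nothing execution and multi-density refinements; data and parameters are illustrative:

```python
# Minimal single-machine sketch of grid-based density clustering:
# 1) bin points into grid cells, 2) keep cells with >= min_pts points,
# 3) merge 8-adjacent dense cells by BFS. Points in sparse cells are
# treated as noise. Data and parameters are illustrative.
from collections import defaultdict, deque

def grid_cluster(points, cell=1.0, min_pts=3):
    cells = defaultdict(list)
    for p in points:
        cells[(int(p[0] // cell), int(p[1] // cell))].append(p)
    dense = {c for c, pts in cells.items() if len(pts) >= min_pts}

    clusters, seen = [], set()
    for c in dense:
        if c in seen:
            continue
        comp, queue = [], deque([c])
        seen.add(c)
        while queue:
            cur = queue.popleft()
            comp.extend(cells[cur])
            for dx in (-1, 0, 1):            # visit 8-adjacent cells
                for dy in (-1, 0, 1):
                    nb = (cur[0] + dx, cur[1] + dy)
                    if nb in dense and nb not in seen:
                        seen.add(nb)
                        queue.append(nb)
        clusters.append(comp)
    return clusters

# two well-separated blobs plus an outlier
blob_a = [(0.1 * i, 0.1 * i) for i in range(10)]
blob_b = [(5 + 0.1 * i, 5.0) for i in range(10)]
clusters = grid_cluster(blob_a + blob_b + [(20.0, 20.0)])
print(len(clusters))  # -> 2 (the outlier's cell is below min_pts)
```

A distributed version would partition the grid across machines, compute local dense cells independently, and merge only the cell boundaries, which is what makes the shared-nothing layout attractive.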
Abstract: In this article, while attempting to describe the
problem and its importance, transformational leadership is studied by considering leadership theories. Issues such as the definition of
transformational leadership and its aspects are compared on the basis of the ideas of various experts, and then transformational leadership is examined in successful and
unsuccessful companies. Following the methodology, the
method of research, hypotheses, population and statistical sample
are described, and the research findings are analyzed using descriptive and inferential statistical methods in the framework of
analytical tables. Finally, our conclusion is provided by considering the results of the statistical tests. The final result shows that
transformational leadership is significantly higher in successful companies than in unsuccessful ones.
Abstract: We report on a high-speed quantum cryptography
system that utilizes simultaneous entanglement in polarization and in
“time-bins". With multiple degrees of freedom contributing to the
secret key, we can achieve over ten bits of random entropy per detected coincidence. In addition, we collect from multiple spots on
the downconversion cone to further amplify the data rate, allowing us to achieve over 10 Mbits of secure key per second.
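An order-of-magnitude check on "ten bits per coincidence": with independent entangled degrees of freedom, the entropy per detected pair is the sum of log2 of the number of resolvable outcomes per degree of freedom. The outcome counts below are assumed for illustration, not taken from the experiment:

```python
# Entropy per coincidence = sum of log2(#outcomes) over independent
# degrees of freedom. Outcome counts are assumed, not experimental.
import math

degrees_of_freedom = {
    "polarization": 2,    # two polarization outcomes (assumed)
    "time_bins": 512,     # resolvable time bins per frame (assumed)
}
bits = sum(math.log2(n) for n in degrees_of_freedom.values())
print(f"{bits:.0f} bits per coincidence")  # -> 10 bits (1 + 9)
```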
Abstract: Software reliability, defined as the probability of a
software system or application functioning without failure or errors
over a defined period of time, has been an important area of research
for over three decades. Several research efforts aimed at developing
models to improve reliability are currently underway. One of the
most popular approaches to software reliability adopted by some of
these research efforts involves the use of operational profiles to
predict how software applications will be used. Operational profiles
are a quantification of usage patterns for a software application. The
research presented in this paper investigates an innovative multi-agent
framework for the automatic creation and management of
operational profiles for generic distributed systems after their release
into the market. The architecture of the proposed Operational Profile
MAS (Multi-Agent System) is presented along with detailed
descriptions of the various models arrived at following the analysis
and design phases of the proposed system. The operational profile in
this paper is extended to comprise seven different profiles. Further,
the criticality of operations is defined using a new composite metric
in order to organize the testing process as well as to decrease the time
and cost involved in this process. A prototype implementation of the
proposed MAS is included as proof-of-concept and the framework is
considered as a step towards making distributed systems intelligent
and self-managing.
Abstract: Optimization is often a critical issue for most system
design problems. Evolutionary Algorithms are population-based,
stochastic search techniques, widely used as efficient global
optimizers. However, finding optimal solutions to complex
high-dimensional, multimodal problems often requires highly
computationally expensive function evaluations and hence can be
practically prohibitive. The Dynamic Approximate Fitness based
Hybrid EA (DAFHEA) model presented in our earlier work [14]
reduced computation time by controlled use of meta-models to
partially replace the actual function evaluation by approximate
function evaluation. However, the underlying assumption in
DAFHEA is that the training samples for the meta-model are
generated from a single uniform model. Situations like model
formation involving variable input dimensions and noisy data
certainly cannot be covered by this assumption. In this paper we
present an enhanced version of DAFHEA that incorporates a
multiple-model based learning approach for the SVM approximator.
DAFHEA-II (the enhanced version of the DAFHEA framework) also
overcomes the high computational expense involved with additional
clustering requirements of the original DAFHEA framework. The
proposed framework has been tested on several benchmark functions
and the empirical results illustrate the advantages of the proposed
technique.
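The controlled use of a cheap approximator in place of the expensive fitness function can be sketched with the standard library alone. Here a nearest-neighbour lookup over an archive of exactly evaluated points stands in for the SVM meta-models of DAFHEA; the objective, trust radius, and all settings are illustrative:

```python
# Sketch of surrogate-assisted evolution: a cheap approximator (here a
# nearest-neighbour archive, standing in for the SVM models) partially
# replaces the expensive fitness function. All settings are illustrative.
import random
random.seed(1)

def expensive(x):                 # stand-in for the costly objective (sphere)
    return sum(v * v for v in x)

def dist(a, b):
    return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5

archive = []                      # (point, exact fitness) pairs
exact_calls = 0

def fitness(x, trust=0.15):
    """Reuse an archived exact value when a close neighbour exists."""
    global exact_calls
    if archive:
        xn, fn = min(archive, key=lambda a: dist(a[0], x))
        if dist(xn, x) < trust:
            return fn             # approximate (surrogate) evaluation
    exact_calls += 1
    f = expensive(x)              # exact evaluation, added to the archive
    archive.append((x, f))
    return f

# (mu + lambda) evolution strategy with Gaussian mutation
dim, mu, lam, sigma = 3, 5, 20, 0.3
pop = [[random.uniform(-2, 2) for _ in range(dim)] for _ in range(mu)]
for gen in range(40):
    offspring = [[v + random.gauss(0, sigma) for v in random.choice(pop)]
                 for _ in range(lam)]
    pop = sorted(pop + offspring, key=fitness)[:mu]

best = min(expensive(x) for x in pop)
print(f"best exact fitness: {best:.4f}, exact calls: {exact_calls}")
```

The point of the exercise is the counter: `exact_calls` stays well below the roughly 1000 evaluations a plain (mu + lambda) run would spend, at the cost of a bounded approximation error inside the trust radius.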
Abstract: The use of electronic sensors in the electronics
industry has become increasingly popular over the past few years,
and they have become highly competitive products. The frequency
adjustment process is regarded as one of the most important processes
in electronic sensor manufacturing. Due to inaccuracies
in the frequency adjustment process, rework can cause up to 80%
waste; therefore, this study aims to provide a
preliminary understanding of the role of parameters used in the
frequency adjustment process, and also make suggestions in order to
further improve performance. Four parameters are considered in this
study: air pressure, dispensing time, vacuum force, and the distance
between the needle tip and the product. A 2^k full factorial
experimental design was used to determine the parameters that
significantly affect the accuracy of the frequency adjustment process,
where a deviation in the frequency after adjustment and the target
frequency is expected to be 0 kHz. The experiment was conducted on
two levels, using two replications and with five center-points added.
In total, 37 experiments were carried out. The results reveal that air
pressure and dispensing time significantly affect the frequency
adjustment process. The mathematical relationship between these
two parameters was formulated, and the optimal parameters for air
pressure and dispensing time were found to be 0.45 MPa and 458 ms,
respectively. The optimal parameters were examined by carrying out
a confirmation experiment in which an average deviation of 0.082
kHz was achieved.
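The run count stated above can be reconstructed directly: a 2^k full factorial in k = 4 factors with 2 replications plus 5 center points gives 2 × 2⁴ + 5 = 37 runs. The factor names follow the abstract; physical low/high values are not given there, so coded levels are used:

```python
# Reconstructing the design size: 2 replications of a 2^4 full factorial
# plus 5 center points -> 2 * 16 + 5 = 37 runs. Coded levels only; the
# physical low/high settings are not given in the abstract.
from itertools import product

factors = ["air_pressure", "dispensing_time", "vacuum_force",
           "needle_distance"]
replications = 2
center_points = 5

# coded levels: -1 (low) and +1 (high) for each factor
runs = [dict(zip(factors, levels))
        for levels in product((-1, 1), repeat=len(factors))
        for _ in range(replications)]
runs += [{f: 0 for f in factors}] * center_points  # center points at coded 0

print(len(runs))  # -> 37
```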
Abstract: This research aims to study the quality of surface water
for consumers in Samut Songkram province. Water samples were
collected from 217 sampling sites, comprising 72 sampling sites in
Amphawa, 67 sampling sites in Bangkhonthee and 65 sampling sites
in Muang. Water samples were collected in December 2011 for
winter, March 2012 for summer and August 2012 for the rainy season.
From the investigation of surface water quality in the Mae Klong
River and the main and tributary canals in Samut Songkram province,
we found that the water quality meets the Type III surface water
quality standard issued under the National Environmental Quality Act
B.E. 2535 (1992). Seasonal variations of pH, temperature, nitrate,
lead and cadmium show statistically significant differences between
the three seasons.
Abstract: The number of electronic participation (eParticipation) projects introduced by different governments and international organisations is considerably high and increasing. In order to have an overview of the development of these projects, various evaluation frameworks have been proposed. In this paper, a five-level participation model, which takes into account the advantages of the Social Web or Web 2.0, together with a quantitative approach for the evaluation of eParticipation projects is presented. Each participation level is evaluated independently, taking into account three main components: Web evolution, media richness, and communication channels. This paper presents the evaluation of a number of existing Voting Advice Applications (VAAs). The results provide an overview of the main features implemented by each project, their strengths and weaknesses, and the participation levels reached.
Abstract: This study aims at investigating factors in research
and development (R&D) growth and exploring the role of R&D
management in enhancing social innovation and productivity
improvement in Iran's industrial sector. It basically explores the
common types of R&D activities and the industries which benefited
the most from active R&D units in Iran. The researchers generated
qualitative analyses obtained from primary and secondary data.
The primary data have been retrieved through interviews with five
key players (Managing Director, Internal Manager, General Manager,
Executive Manager, and Project Manager) in the industrial sector.
The secondary data were acquired from an investigation in Mazandaran, a
province of northern Iran. The findings highlight that Iran's R&D
focuses on cost reduction and upgrading productivity. Industries that
have benefited the most from active R&D units are metallic,
machinery and equipment design, and automotive.
We rank-order the primary effects of R&D on productivity
improvement as follows: industry improvement, economic growth,
use of professional human resources, generation of a productivity and
creativity culture, creation of a competitive and innovative
environment, and increase of people's knowledge.
Generally, low budget dedication and an insufficient supply of highly
skilled scientists and engineers are two important obstacles for R&D
in Iran. While R&D has resulted in improvements in Iranian
society, the transfer of contemporary knowledge into the international
market is still lacking.
Abstract: In this paper, the action-research-driven design of a
context-relevant, developmental peer review of teaching model, its
implementation strategy and its impact at an Australian university are
presented. PRO-Teaching realizes an innovative process that
triangulates contemporaneous teaching quality data from a range of
stakeholders including students, discipline academics, learning and
teaching expert academics, and teacher reflection to create reliable
evidence of teaching quality. Data collected over multiple classroom
observations allows objective reporting on development differentials
in constructive alignment, peer, and student evaluations. Further
innovation is realized in the application of this highly structured
developmental process to provide summative evidence of sufficient
validity to support claims for professional advancement and learning
and teaching awards. Design decision points and contextual triggers
are described within the operating domain. Academics and
developers seeking to introduce structured peer review of teaching
into their organization will find this paper a useful reference.