Abstract: The aim of this paper is to describe the notion of death for prisoners and the ways they deal with it. They express indifference, coldness and an inability to accept blame; they show no shame and no empathy. Is it enough to perform acts verging on death? In this paper we describe the mechanisms and regularities of self-destructive behaviour in the light of the relevant literature. The explanation of the phenomenon is of a biological and socio-psychological nature. It must be clearly stated that all forms of self-destructive behaviour result from various impulses, conflicts and deficits. That is why they should be treated differently in terms of the motivation behind them and the functions they perform in a given group of people. Behind self-destruction there seems to be a motivational mechanism which drives prisoners to rebel and fight against the hated law and penitentiary systems. The imprisoned believe that the pain and suffering they inflict on themselves are better than passive acceptance of repression. The variety of self-destructive acts is wide, and some of them take strange forms. We assume that the life-death barrier is a kind of game for them. If they cannot change their degrading situation, their life loses its sense.
Abstract: The purpose of this study was to identify the main sources of copper (Cu) accumulation in target organs of tilapia (Oreochromis mossambicus) and to investigate how the organism mediates Cu accumulation under prolonged exposure. By measuring both dietary and waterborne Cu accumulation and total Cu concentrations in tilapia with a biokinetic modeling approach, we were able to clarify the biokinetic coping mechanisms underlying long-term Cu accumulation. This study showed that water and food are both major sources of Cu for the muscle and liver of tilapia, implying that controlling the Cu concentration in these two routes will determine the Cu bioavailability for tilapia. We found that the duration and level of waterborne Cu exposure drove Cu accumulation in tilapia, and that the capacities for Cu uptake and depuration in the organs of tilapia were actively regulated under prolonged exposure. Generally, the uptake rate, depuration rate and net bioaccumulation capacity in all selected organs decreased with increasing levels of waterborne Cu and with longer exposure. Muscle tissue accounted for over 50% of the total accumulated Cu and played a key role in buffering the Cu burden in the initial period of exposure, whereas the liver played a more important role in Cu storage as exposure lengthened. We conclude that assuming constant biokinetic rates can lead to incorrect predictions, overestimating long-term Cu accumulation in ecotoxicological risk assessments.
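The accumulation process described above can be illustrated with a standard one-compartment biokinetic model (a generic sketch only: the rate constants k_u and k_e, the assimilation efficiency AE and the ingestion rate IR are illustrative values, not the paper's measured parameters):

```python
import numpy as np

# Illustrative (not measured) parameters of a one-compartment model:
ku = 0.05   # waterborne uptake rate constant (L/g/d)
Cw = 20.0   # waterborne Cu concentration (ug/L)
AE = 0.3    # dietary assimilation efficiency (-)
IR = 0.02   # ingestion rate (g/g/d)
Cf = 50.0   # Cu concentration in food (ug/g)
ke = 0.02   # depuration (efflux) rate constant (1/d)

dt, days = 0.1, 200
t = np.arange(0, days, dt)
C = np.zeros_like(t)          # tissue Cu concentration over time
for i in range(1, len(t)):
    # dC/dt = waterborne uptake + dietary uptake - depuration
    dC = ku * Cw + AE * IR * Cf - ke * C[i - 1]
    C[i] = C[i - 1] + dC * dt

# With constant rates, the model relaxes toward an analytical steady state;
# the paper's point is that the rates themselves change under long exposure.
steady_state = (ku * Cw + AE * IR * Cf) / ke
```

With these illustrative values, the waterborne term (ku*Cw) and the dietary term (AE*IR*Cf) both contribute, mirroring the finding that water and food are both major Cu sources.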
Abstract: A numerical study of a plane jet occurring in a vertical, heated channel is carried out. The aim is to explore the influence of the forced flow, issuing from a flat nozzle located in the entry section of the channel, on the fluid rising along the channel walls. The Reynolds number, based on the nozzle width and the jet velocity, ranges between 3×10³ and 2×10⁴, whereas the Grashof number, based on the channel length and the wall temperature difference, is 2.57×10¹⁰. Computations are carried out for a symmetrically heated channel and various nozzle positions. The system of governing equations is solved with a finite volume method. The results show that jet-wall interactions enhance the heat transfer, and that varying the nozzle position modifies the heat transfer, especially at low Reynolds numbers: the heat transfer is enhanced on the adjacent wall but decreased on the opposite one. The numerical velocity and temperature fields are post-processed to compute quantities of engineering interest, such as the induced mass flow rate and the Nusselt number along the plates.
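For reference, the dimensionless groups quoted above follow the standard definitions (symbol names are assumed, not taken from the paper: b the nozzle width, U_j the jet exit velocity, L the channel length, ΔT the wall-to-inlet temperature difference, and ν, β, g the kinematic viscosity, thermal expansion coefficient and gravitational acceleration):

```latex
Re = \frac{U_j\, b}{\nu}, \qquad
Gr = \frac{g\, \beta\, \Delta T\, L^{3}}{\nu^{2}}
```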
Abstract: Dust storms are among the most costly and destructive events in many desert regions. They can cause massive damage to both natural environments and human lives. This paper presents a preliminary study of dust storms as a major natural hazard in arid and semi-arid regions. As a case study, dust storm events that occurred in Zabol city, in the Sistan region of Iran, were analyzed in order to diagnose and predict dust storms. The identification and prediction of dust storm events could contribute significantly to damage reduction. Existing models for this purpose are complicated and not appropriate for the many areas with poor data environments. The present study explores the Gamma test for identifying the inputs of an ANN model for dust storm prediction. Results indicate that further work is needed on dust storm identification and on discriminating between the various dust storm types.
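The Gamma test used above for input selection can be sketched as follows (a minimal one-dimensional illustration on synthetic data, not the Sistan data: it regresses half the mean-squared output gaps of the k-th nearest neighbours against the corresponding mean-squared input gaps, and reads an estimate of the output noise variance off the intercept):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
x = rng.uniform(0, 2 * np.pi, n)
noise_var = 0.04                               # known noise level for checking
y = np.sin(x) + rng.normal(0, np.sqrt(noise_var), n)

def gamma_test(x, y, p=10):
    """Estimate output-noise variance via the Gamma test.

    For k = 1..p, delta(k) is the mean squared input distance to the
    k-th nearest neighbour and gamma(k) is half the mean squared output
    difference; the intercept of gamma regressed on delta estimates the
    variance of the noise on y.
    """
    x = np.asarray(x, float).reshape(len(x), -1)
    d = np.linalg.norm(x[:, None] - x[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    idx = np.argsort(d, axis=1)[:, :p]         # p nearest neighbours per point
    delta, gamma = [], []
    for k in range(p):
        nbr = idx[:, k]
        delta.append(np.mean(np.linalg.norm(x - x[nbr], axis=1) ** 2))
        gamma.append(0.5 * np.mean((y - y[nbr]) ** 2))
    slope, intercept = np.polyfit(delta, gamma, 1)
    return intercept

est = gamma_test(x, y)                         # should approach noise_var
```

In an input-selection setting, the test is repeated over candidate input subsets and the subset giving the lowest Gamma statistic is fed to the ANN.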
Abstract: In recent work on mixture discriminant analysis (MDA), the expectation-maximization (EM) algorithm is used to estimate the parameters of Gaussian mixtures. However, the initial values of the EM algorithm affect the final parameter estimates. Moreover, when the EM algorithm is applied twice to the same data set, it can give different parameter estimates, which affects the classification accuracy of MDA. To overcome this problem, we use the Self-Organizing Mixture Network (SOMN) algorithm to estimate the parameters of the Gaussian mixtures in MDA, since SOMN is more robust when random initial parameter values are used [5]. We show the effectiveness of this method on the popular simulated waveform data sets and on a real glass data set.
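The initialisation sensitivity motivating this work is easy to reproduce with a plain EM implementation (a generic two-component, one-dimensional sketch on synthetic data; this is not the paper's SOMN algorithm, nor its waveform or glass data):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D data from a two-component mixture (true means -2 and 3).
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])

def em_gmm(x, mu, n_iter=100):
    """Plain EM for a two-component 1-D Gaussian mixture."""
    mu = np.array(mu, float)
    sigma = np.array([1.0, 1.0])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each point
        # (the 1/sqrt(2*pi) factor cancels in the normalisation).
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and standard deviations.
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return mu

mu_good = em_gmm(x, mu=[-1.0, 1.0])   # well-spread start: recovers both modes
mu_poor = em_gmm(x, mu=[0.50, 0.51])  # near-degenerate start: may converge
                                      # elsewhere, or only very slowly
```

Runs started from different random initialisations need not agree, which is exactly the MDA instability the SOMN estimator is brought in to avoid.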
Abstract: Recent satellite projects and programs make extensive use of real-time embedded systems. 16-bit processors conforming to the MIL-STD-1750 standard architecture have been used in on-board systems, and most space applications have been written in Ada. Looking ahead, 32-bit and 64-bit processors are needed in spacecraft computing, so an effort to study and survey 64-bit architectures for space applications is desirable. This will also drive significant technology development in terms of VLSI and software tools for Ada (as the legacy code is in Ada).
There are several basic requirements for a special-purpose processor of this kind. They include radiation-hardened (RadHard) devices, very low power dissipation, compatibility with existing operational systems, scalable architectures for higher computational needs, reliability, higher memory and I/O bandwidth, predictability, real-time operating system support and manufacturability. Further considerations include the selection of FPGA devices, the selection of EDA tool chains, the design flow, design partitioning, pin count, performance evaluation, timing analysis, etc.
This project comprises a brief study of the 32- and 64-bit processors readily available in the market, and the design and fabrication of a 64-bit RISC processor, named RISC MicroProcessor, with the added functionality of an extended double-precision floating-point unit and a 32-bit signal processing unit acting as co-processors. In this paper, we emphasize the ease and importance of using an open core (the OpenSparc T1 Verilog RTL) and open-source EDA tools such as Icarus to develop FPGA-based prototypes quickly. Commercial tools, such as Xilinx ISE for synthesis, are also used when appropriate.
Abstract: In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), programming-language capabilities such as symbolic and intuitive programming, program portability and a rich geometrical portfolio have special importance. They save time, help avoid errors during part programming, and permit code re-use. Our updated literature review indicates that the current state of the art leaves voids in parametric programming, program portability and programming flexibility. In response, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language which allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and supporting portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit re-use of programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
Abstract: Some believe that stigma is the worst side effect faced by people who have a mental illness. Mental illness researchers have focused on the influence of the mass media on the stigmatization of people with mental illness. However, no studies have investigated the effects of interactive media, such as blogs, on the stigmatization of mentally ill people, even though these media have a significant influence on people in all areas of life. The purpose of this study is to investigate the use of interactivity in the destigmatization of the mentally ill, and the moderating effect of self-construal (independent versus interdependent) on the relation between interactivity and destigmatization. The findings suggest that people in the human-human interaction condition felt less social distance toward people with mental illness. Additionally, participants with higher independence showed more favorable affect and less social distance toward mentally ill people. Finally, direct contact with mentally ill people increased a person's positive affect toward people with mental illness. The current study should provide insights for mental health practitioners by suggesting how they can use interactive media to reach the public that stigmatizes the mentally ill.
Abstract: Crosstalk is the major limiting factor in very-high-bit-rate digital subscriber line (VDSL) systems, in terms of both bit rate and service coverage. At the central office, joint signal processing combined with appropriate power allocation enables complex multiuser processors to provide near-capacity rates. Unfortunately, complexity grows with the square of the number of lines within a binder; however, since only a few dominant crosstalkers contribute the main part of the crosstalk power, the canceller structure can be simplified, resulting in much lower run-time complexity. In this paper, a multiuser power control scheme, namely iterative waterfilling, is combined with previously proposed partial crosstalk cancellation approaches; the resulting performance, the best achieved so far, is verified by simulation results.
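The power-allocation step referenced above can be sketched with classic single-user waterfilling, the core step that iterative waterfilling repeats per line while treating the other lines' crosstalk as noise (the subchannel values below are illustrative):

```python
import numpy as np

def waterfill(inv_gain, budget):
    """Single-user waterfilling over parallel subchannels.

    inv_gain[k] is noise power divided by channel gain on subchannel k;
    the power on k is max(level - inv_gain[k], 0), with the water level
    chosen so that the allocated powers sum to the budget.
    """
    g = np.asarray(inv_gain, float)
    gs = np.sort(g)
    for m in range(len(gs), 0, -1):
        # Try filling only the m best (lowest inv_gain) subchannels.
        level = (budget + gs[:m].sum()) / m
        if level > gs[m - 1]:
            break
    return np.maximum(level - g, 0.0)

# Three subchannels: two usable, one badly attenuated.
p = waterfill([1.0, 2.0, 10.0], budget=3.0)   # -> [2.0, 1.0, 0.0]
```

In the iterative (multiuser) version, each line in the binder repeats this allocation against the interference currently produced by the others, until the allocations stop changing.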
Abstract: In this article, after describing the problem and its importance, transformational leadership is studied in the light of leadership theories. Issues such as the definition of transformational leadership and its dimensions are compared on the basis of the views of various experts, and transformational leadership is then examined in successful and unsuccessful companies. The research method, hypotheses, population and statistical sample are described, and the research findings are analyzed using descriptive and inferential statistical methods in the framework of analytical tables. Finally, our conclusion is drawn from the results of the statistical tests. The final result shows that transformational leadership is significantly higher in successful companies than in unsuccessful ones.
Abstract: We report on a high-speed quantum cryptography system that utilizes simultaneous entanglement in polarization and in "time bins". With multiple degrees of freedom contributing to the secret key, we can achieve over ten bits of random entropy per detected coincidence. In addition, we collect from multiple spots of the downconversion cone to further amplify the data rate, allowing us to achieve over 10 Mbits of secure key per second.
Abstract: Effective estimation of software development effort is an important aspect of successful project management. Based on a large database of 4,106 completed projects, this study statistically examines the factors that influence development effort. The factors found to be significant are project size, the average number of developers who worked on the project, the type of development, the development language, the development platform, and the use of rapid application development. Among these factors, project size is the most critical cost driver. Unsurprisingly, this study found that the use of CASE tools does not necessarily reduce development effort, which supports the claim that the effect of tools is subtle. As many current estimation models are rarely or unsuccessfully used, this study proposes a parsimonious parametric model for effort prediction that is both simpler and more accurate than previous models.
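A parsimonious parametric effort model of the kind proposed can be sketched as a log-linear fit of effort against size (synthetic data and coefficients for illustration only; the study's actual model and database are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(3)
size = rng.uniform(10, 1000, 200)              # project size, e.g. function points
true_a, true_b = 3.0, 1.1                      # illustrative "true" coefficients
effort = true_a * size ** true_b * np.exp(rng.normal(0, 0.2, 200))

# Fit log(effort) = log(a) + b*log(size) by ordinary least squares:
# the power-law model effort = a * size^b becomes linear in log space.
b, log_a = np.polyfit(np.log(size), np.log(effort), 1)
a = np.exp(log_a)
```

A model of this shape keeps project size as the dominant driver; categorical factors (language, platform, development type) would enter as additional multiplicative terms.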
Abstract: In this paper the concept of cosets of an anti L-fuzzy normal subgroup of a group is given. Furthermore, the group G/A of cosets of an anti L-fuzzy normal subgroup A of a group G is shown to be isomorphic, in a natural way, to a factor group of G. Finally, we prove that if f : G1 → G2 is an epimorphism of groups, then there is a one-to-one, order-preserving correspondence between the anti L-fuzzy normal subgroups of G2 and those of G1 that are constant on the kernel of f.
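The correspondence in the final claim can be made explicit in the standard way for transfer theorems of this kind (notation assumed, not taken from the paper): an anti L-fuzzy normal subgroup B of G2 is pulled back along f by composition,

```latex
B \;\longmapsto\; B \circ f, \qquad (B \circ f)(x) = B\bigl(f(x)\bigr), \quad x \in G_1,
```

and every anti L-fuzzy normal subgroup of G1 that is constant on \ker f arises in this way.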
Abstract: Software reliability, defined as the probability of a
software system or application functioning without failure or errors
over a defined period of time, has been an important area of research
for over three decades. Several research efforts aimed at developing
models to improve reliability are currently underway. One of the
most popular approaches to software reliability adopted by some of
these research efforts involves the use of operational profiles to
predict how software applications will be used. Operational profiles
are a quantification of usage patterns for a software application. The
research presented in this paper investigates an innovative multi-agent framework for the automatic creation and management of operational profiles for generic distributed systems after their release
into the market. The architecture of the proposed Operational Profile
MAS (Multi-Agent System) is presented along with detailed
descriptions of the various models arrived at following the analysis
and design phases of the proposed system. The operational profile in
this paper is extended to comprise seven different profiles. Further, the criticality of operations is defined using a new composite metric, in order to organize the testing process and to decrease its time and cost. A prototype implementation of the
proposed MAS is included as proof-of-concept and the framework is
considered as a step towards making distributed systems intelligent
and self-managing.
Abstract: Optimization is often a critical issue for most system
design problems. Evolutionary Algorithms are population-based,
stochastic search techniques, widely used as efficient global
optimizers. However, finding optimal solutions to complex, high-dimensional, multimodal problems often requires computationally expensive function evaluations and is hence practically prohibitive. The Dynamic Approximate Fitness based
Hybrid EA (DAFHEA) model presented in our earlier work [14]
reduced computation time by controlled use of meta-models to
partially replace the actual function evaluation by approximate
function evaluation. However, the underlying assumption in
DAFHEA is that the training samples for the meta-model are
generated from a single uniform model. Situations like model
formation involving variable input dimensions and noisy data
certainly cannot be covered by this assumption. In this paper we
present an enhanced version of DAFHEA that incorporates a
multiple-model based learning approach for the SVM approximator.
DAFHEA-II (the enhanced version of the DAFHEA framework) also
overcomes the high computational expense involved with additional
clustering requirements of the original DAFHEA framework. The
proposed framework has been tested on several benchmark functions
and the empirical results illustrate the advantages of the proposed
technique.
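The meta-model idea behind DAFHEA can be illustrated with a minimal surrogate sketch (here a quadratic least-squares surrogate stands in for the paper's SVM approximator, and the sphere function stands in for the expensive objective; all names and values are illustrative):

```python
import numpy as np

def expensive(x):
    """Stand-in for a costly objective (here just the sphere function)."""
    return float(np.sum(x ** 2))

def features(x):
    """Quadratic feature map for a 2-D input."""
    x1, x2 = x
    return np.array([1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2])

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(30, 2))          # sampled true evaluations
y = np.array([expensive(x) for x in X])

# Fit the surrogate by least squares on quadratic features; in DAFHEA
# this cheap model replaces most true fitness evaluations, and the true
# function is re-evaluated only periodically to keep the model honest.
Phi = np.array([features(x) for x in X])
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

def predict(x):
    return float(features(x) @ w)

q = np.array([0.5, -0.25])
err = abs(predict(q) - expensive(q))          # surrogate error at a test point
```

The multiple-model extension in DAFHEA-II amounts to training several such approximators on different regions or noise regimes rather than a single uniform one.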
Abstract: Shadow detection is still considered one of the major challenges for intelligent automated video surveillance systems. A prerequisite for reliable and accurate detection and tracking is correct shadow detection and classification. In this landscape, privacy issues add further complexity and likewise demand reliable shadow detection.
In this work the interplay between security, accuracy, reliability and privacy is analyzed and, accordingly, a novel architecture for Privacy Enhancing Video Surveillance (PEVS) is introduced. Shadow detection and masking are handled through the simultaneous combination of two different approaches. This results in a unique privacy enhancement without affecting security. The methodology was subsequently employed successfully in a large-scale wireless video surveillance system; privacy-relevant information was stored encrypted on the unit, without transferring it over an untrusted network.
Abstract: The use of electronic sensors in the electronics industry has become increasingly popular over the past few years, and they have become highly competitive products. The frequency adjustment process is regarded as one of the most important processes in electronic sensor manufacturing. Inaccuracies in the frequency adjustment process can cause up to 80% waste through rework; this study therefore aims to provide a preliminary understanding of the role of the parameters used in the frequency adjustment process, and to make suggestions for further improving performance. Four parameters are considered in this study: air pressure, dispensing time, vacuum force, and the distance between the needle tip and the product. A full factorial 2^k design of experiments was used to determine the parameters that significantly affect the accuracy of the frequency adjustment process, where the deviation between the adjusted frequency and the target frequency is expected to be 0 kHz. The experiment was conducted at two levels, with two replications and five added center points; in total, 37 experiments were carried out. The results reveal that air pressure and dispensing time significantly affect the frequency adjustment process. The mathematical relationship between these two parameters was formulated, and the optimal settings for air pressure and dispensing time were found to be 0.45 MPa and 458 ms, respectively. The optimal parameters were examined in a confirmation experiment, in which an average deviation of 0.082 kHz was achieved.
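The run count quoted above (37 experiments) follows directly from the design: a 2^4 full factorial with two replications plus five center points. A small sketch (factor names taken from the abstract; the coded levels are an assumption of the sketch):

```python
from itertools import product

factors = ["air_pressure", "dispensing_time", "vacuum_force", "needle_distance"]
levels = [-1, +1]          # coded low/high levels
replications = 2
center_points = 5

# Build the replicated 2^k design matrix, then append the center points.
runs = [combo for combo in product(levels, repeat=len(factors))
        for _ in range(replications)]
runs += [(0,) * len(factors)] * center_points

print(len(runs))           # 2^4 runs x 2 replications + 5 center points = 37
```

The center points let the experimenters check for curvature in the response, which is why they appear on top of the pure two-level factorial.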
Abstract: The present work was conducted to determine the effect of a biofertilizer formulated with four species of bacteria (two species of Azotobacter and two of Lysobacter) and of zinc sulphate. Field experiments with mustard plants were conducted to study the effectiveness of soil application of zinc sulphate and biofertilizer at 0, 10, 20, 30, 40 and 50 days after sowing. Three treatments were used in the field experiment: bacteria only, zinc sulphate only, and a mixture of biofertilizer and zinc sulphate. Plant height and plant condition increased significantly more with the mixture of biofertilizer and zinc sulphate than with the other treatments after 40 days of sowing. The mixed treatment also gave the best yield (4,688.008 kg/ha) within 50 days of sowing, outperforming the other treatments; zinc sulphate alone gave the second-best yield (3,380.75 kg/ha), and biofertilizer alone gave 2,639.04 kg/ha.
Abstract: The control system of a Proton Exchange Membrane Fuel Cell (PEMFC) has an important effect on the operation of the cell. Traditional controllers cannot deliver acceptable responses because of the time-varying, long-hysteresis, uncertain, strongly coupled and nonlinear characteristics of PEMFCs, so an intelligent or adaptive controller is needed. In this paper a neural network predictive controller is designed to control the cell voltage in the presence of temperature fluctuations. The designed NN predictive controller was applied to a dynamic electrochemical model of a small 5 kW PEM fuel cell and simulated in MATLAB/Simulink.
Abstract: The number of electronic participation (eParticipation) projects introduced by governments and international organisations is considerable and increasing. To gain an overview of the development of these projects, various evaluation frameworks have been proposed. In this paper, a five-level participation model, which takes into account the advantages of the Social Web or Web 2.0, is presented together with a quantitative approach for the evaluation of eParticipation projects. Each participation level is evaluated independently, taking into account three main components: Web evolution, media richness, and communication channels. The paper presents the evaluation of a number of existing Voting Advice Applications (VAAs). The results provide an overview of the main features implemented by each project, their strengths and weaknesses, and the participation levels reached.