Abstract: The purpose of this study was to identify the main sources of copper (Cu) accumulation in target organs of tilapia (Oreochromis mossambicus) and to investigate how the organism mediates Cu accumulation under prolonged exposure. By measuring both dietary and waterborne Cu accumulation and total Cu concentrations in tilapia with a biokinetic modeling approach, we were able to clarify the biokinetic coping mechanisms underlying long-term Cu accumulation. This study showed that water and food are both major sources of Cu for the muscle and liver of tilapia, implying that controlling the Cu concentrations in these two routes will govern Cu bioavailability for tilapia. We found that exposure duration and the level of waterborne Cu drove Cu accumulation in tilapia, and that the capacities for Cu uptake and depuration in the organs of tilapia were actively modulated under prolonged exposure. In general, the uptake rate, depuration rate, and net bioaccumulation capacity in all selected organs decreased with increasing waterborne Cu levels and longer exposure. Muscle tissue accounted for over 50% of the total accumulated Cu and played a key role in buffering the Cu burden in the initial period of exposure, whereas the liver played a more important role in Cu storage as exposure was extended. We conclude that assuming constant biokinetic rates could lead to incorrect predictions, overestimating long-term Cu accumulation in ecotoxicological risk assessments.
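The uptake and depuration rates discussed above are conventionally combined in a one-compartment biokinetic model, dC/dt = k_u·C_w + AE·IR·C_f − k_e·C. The sketch below is a minimal illustration of that standard model with invented parameter values, not the study's fitted estimates:

```python
# Minimal one-compartment biokinetic accumulation model. All parameter
# values below are assumptions for illustration, not the study's estimates.

def simulate_accumulation(k_u, c_w, ae, ir, c_f, k_e, days, dt=0.01):
    """Euler integration of dC/dt = k_u*C_w + AE*IR*C_f - k_e*C."""
    c = 0.0                                   # initial tissue Cu (ug/g)
    for _ in range(int(days / dt)):
        uptake = k_u * c_w + ae * ir * c_f    # waterborne + dietary uptake
        c += (uptake - k_e * c) * dt          # net change over one time step
    return c

# Assumed example parameters: uptake rate k_u (L/g/d), water Cu c_w (ug/L),
# assimilation efficiency ae, ingestion rate ir (g/g/d), food Cu c_f (ug/g),
# depuration rate k_e (1/d).
c_final = simulate_accumulation(k_u=0.02, c_w=10.0, ae=0.3, ir=0.05,
                                c_f=50.0, k_e=0.1, days=60)
steady_state = (0.02 * 10.0 + 0.3 * 0.05 * 50.0) / 0.1  # analytic limit
```

With constant rates the tissue concentration approaches the analytic steady state (k_u·C_w + AE·IR·C_f)/k_e; the study's point is precisely that the rates are not constant under prolonged exposure, so this limit overestimates long-term accumulation.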
Abstract: Faults in a network may take various forms, such as hardware/software errors and vertex/edge faults. The folded hypercube is a well-known variation of the hypercube and can be constructed from a hypercube by adding a link between every pair of nodes with complementary addresses. Let FFv (respectively, FFe) be the set of faulty nodes (respectively, faulty links) in an n-dimensional folded hypercube FQn. Hsieh et al. have shown that FQn - FFv - FFe for n ≥ 3 contains a fault-free cycle of length at least 2^n - 2|FFv|, under the constraints that (1) |FFv| + |FFe| ≤ 2n - 4 and (2) every node in FQn is incident to at least two fault-free links. In this paper, we further relax the first constraint to |FFv| + |FFe| ≤ 2n - 3. We prove that FQn - FFv - FFe for n ≥ 5 still has a fault-free cycle of length at least 2^n - 2|FFv|, under the constraints (1) |FFv| + |FFe| ≤ 2n - 3, (2) |FFe| ≥ n + 2, and (3) every vertex is still incident with at least two fault-free links.
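The construction named above (an n-cube plus one extra link per pair of complementary addresses) can be sketched directly; this only illustrates the FQn structure and its degree n + 1, not the fault-tolerance argument:

```python
# Build the edge set of the n-dimensional folded hypercube FQ_n:
# the n-cube plus an edge between complementary addresses.
def folded_hypercube(n):
    edges = set()
    for v in range(2 ** n):
        for i in range(n):                 # hypercube edges: flip one bit
            u = v ^ (1 << i)
            edges.add((min(u, v), max(u, v)))
        comp = v ^ ((1 << n) - 1)          # complement edge: flip all bits
        edges.add((min(v, comp), max(v, comp)))
    return edges

edges = folded_hypercube(3)
# Every node of FQ_n has degree n + 1, so FQ_3 has 2^3 * 4 / 2 = 16 edges.
```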
Abstract: A numerical study of a plane jet occurring in a vertical heated channel is carried out. The aim is to explore the influence of the forced flow, issued from a flat nozzle located in the entry section of the channel, on the fluid rising along the channel walls. The Reynolds number, based on the nozzle width and the jet velocity, ranges between 3×10^3 and 2×10^4, whereas the Grashof number, based on the channel length and the wall temperature difference, is 2.57×10^10. Computations are performed for a symmetrically heated channel and various nozzle positions. The system of governing equations is solved with a finite volume method. The results show that the jet-wall interactions activate the heat transfer and that varying the nozzle position modifies the heat transfer, especially at low Reynolds numbers: heat transfer is enhanced on the adjacent wall but decreased on the opposite one. The numerical velocity and temperature fields are post-processed to compute quantities of engineering interest, such as the induced mass flow rate and the Nusselt number along the plates.
Abstract: Dust storms are among the most costly and destructive events in many desert regions. They can cause massive damage to both natural environments and human lives. This paper presents a preliminary study on dust storms as a major natural hazard in arid and semi-arid regions. As a case study, dust storm events that occurred in Zabol, a city in the Sistan region of Iran, were analyzed in order to diagnose and predict dust storms. The identification and prediction of dust storm events could contribute significantly to damage reduction. Existing models for this purpose are complicated and not appropriate for data-poor environments. The present study explores the Gamma test for identifying the inputs of an artificial neural network (ANN) model for dust storm prediction. The results indicate that further efforts are needed in dust storm identification and in distinguishing between the various dust storm types.
Abstract: In recent work on mixture discriminant analysis (MDA), the expectation-maximization (EM) algorithm is used to estimate the parameters of Gaussian mixtures. However, the initial values given to the EM algorithm affect the final parameter estimates: when the EM algorithm is applied twice to the same data set, it can give different parameter estimates, and this affects the classification accuracy of MDA. To overcome this problem, we use the Self-Organizing Mixture Network (SOMN) algorithm to estimate the parameters of the Gaussian mixtures in MDA, since SOMN is more robust when random initial values of the parameters are used [5]. We show the effectiveness of this method on the popular simulated waveform data set and a real glass data set.
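The EM fitting whose initialization sensitivity motivates the work above can be sketched in a minimal, self-contained 1-D two-component form; this is a generic textbook EM (not the SOMN algorithm), run on invented data with one particular initialization:

```python
import math
import random

# Minimal 1-D two-component Gaussian-mixture EM. A generic sketch of the
# procedure whose initialization sensitivity is discussed above; the data
# and the starting means are invented for illustration.

def em_gmm(data, mu_init, iters=200):
    mu = list(mu_init)          # component means
    sigma = [1.0, 1.0]          # component standard deviations
    w = [0.5, 0.5]              # mixing weights
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [w[k] / (sigma[k] * math.sqrt(2 * math.pi))
                 * math.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2)
                 for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2
                      for r, x in zip(resp, data)) / nk
            sigma[k] = math.sqrt(max(var, 1e-6))
    return sorted(mu)

random.seed(0)
data = ([random.gauss(0.0, 1.0) for _ in range(200)]
        + [random.gauss(5.0, 1.0) for _ in range(200)])
means = em_gmm(data, mu_init=[-1.0, 6.0])  # a well-placed initialization
```

With a well-placed initialization the two true means are recovered; a poor initialization (e.g. both starting means in the same cluster) can converge to a different, inferior local optimum, which is the instability SOMN is intended to mitigate.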
Abstract: The study deals with the modelling of gas flow during heliox therapy. A special model has been developed to study the effect of helium on the gas flow in the airways during spontaneous breathing. The lower density of helium compared with air decreases the Reynolds number, which improves the flow during spontaneous breathing. In cases where the flow becomes turbulent when the patient inspires air, it remains laminar when the patient inspires heliox. The use of heliox decreases the work of breathing and improves ventilation, and in some cases it makes it possible to avoid intubating the patient.
Abstract: Recent satellite projects and programs make extensive use of real-time embedded systems. 16-bit processors conforming to the MIL-STD-1750 standard architecture have been used in on-board systems, and most space applications have been written in Ada. Looking ahead, 32-bit and 64-bit processors are needed in the area of spacecraft computing, so an effort to study and survey 64-bit architectures for space applications is desirable. This will also result in significant technology development in terms of VLSI and software tools for Ada (as the legacy code is in Ada).

There are several basic requirements for a special-purpose processor of this kind. They include radiation-hardened (RadHard) devices, very low power dissipation, compatibility with existing operational systems, scalable architectures for higher computational needs, reliability, high memory and I/O bandwidth, predictability, a real-time operating system, and manufacturability. Further considerations include the selection of FPGA devices, the selection of EDA tool chains, design flow, partitioning of the design, pin count, performance evaluation, timing analysis, etc.

This project comprises a brief study of the 32- and 64-bit processors readily available in the market and the design and fabrication of a 64-bit RISC processor, named RISC MicroProcessor, with the added functionality of an extended double-precision floating-point unit and a 32-bit signal processing unit acting as co-processors. In this paper, we emphasize the ease and importance of using an open core (the OpenSparc T1 Verilog RTL) and open-source EDA tools such as Icarus to develop FPGA-based prototypes quickly. Commercial tools such as Xilinx ISE are also used for synthesis when appropriate.
Abstract: In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), the capabilities of programming languages, such as symbolic and intuitive programming, program portability, and a geometrical portfolio, have special importance. They save time, help avoid errors during part programming, and permit code re-use. Our updated literature review indicates that the current state of the art presents gaps in parametric programming, program portability, and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language which allows the use of descriptive variable names, geometrical functions, and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and in portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit re-use of the programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
Abstract: In the present essay, a model of choice by actors is analysed by utilizing chaos theory to explain how change comes about. Then, by using ancient and modern sources of literature, the theory of the social contract is analysed as a historical phenomenon that first appeared during the period of Classical Greece. Based on the findings of this analysis, the practice of direct democracy and public choice in ancient Athens is then analysed through two historical cases: the political programs of Eubulus and Lycurgus in the second half of the 4th century BC. The main finding of this research is that these policies can be interpreted as the implementation of a social contract, through which citizens took decisions based on rational choice according to economic considerations.
Abstract: In this paper we address the balancing problem of transfer lines, seeking the optimal line balance that minimizes non-productive time. We focus on tool change time and face orientation change time, both of which influence the makespan. We consider machine capacity limitations and the technological constraints associated with the manufacturing process of automotive cylinder heads. The problem is represented by a mixed integer programming model that distributes the design features to workstations and sequences the machining processes at minimum non-productive time. The proposed model is solved by an algorithm built on linearization schemes and a Benders decomposition approach. The experiments show the efficiency of the algorithm in reaching the exact solution of small and medium problem instances in reasonable time.
Abstract: This paper proposes a new method for analyzing textual data. The method deals with items of textual data, each of which is described from various viewpoints. It acquires two-class classification models of the viewpoints by applying an inductive learning method to items with multiple viewpoints, and uses the models to infer whether the viewpoints should be assigned to new items. The method then extracts expressions from the new items classified into each viewpoint and identifies characteristic expressions for each viewpoint by comparing expression frequencies across viewpoints. The paper also applies the method to questionnaire data given by guests at a hotel and verifies its effectiveness through numerical experiments.
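The frequency-comparison step described above can be sketched as follows; the toy hotel-review items and viewpoint labels are invented for illustration, and the abstract does not specify the paper's exact scoring function:

```python
from collections import Counter

# Score each word by how much more often it occurs in one viewpoint's items
# than in all other viewpoints' items (a simple frequency-difference score;
# the items and viewpoints below are invented for illustration).
def characteristic_expressions(items_by_viewpoint, top=2):
    counts = {v: Counter(w for item in items for w in item.split())
              for v, items in items_by_viewpoint.items()}
    result = {}
    for v, c in counts.items():
        other = Counter()                      # pooled counts of other views
        for v2, c2 in counts.items():
            if v2 != v:
                other.update(c2)
        scores = {w: n - other.get(w, 0) for w, n in c.items()}
        result[v] = [w for w, _ in sorted(scores.items(),
                                          key=lambda x: (-x[1], x[0]))[:top]]
    return result

items = {"room":    ["clean quiet room", "spacious room view"],
         "service": ["friendly staff service", "fast friendly check-in"]}
chars = characteristic_expressions(items)
# "room" is characteristic of the room viewpoint, "friendly" of service
```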
Abstract: A high energy dual-wavelength extracavity KTA
optical parametric oscillator (OPO) with excellent stability and beam
quality, which is pumped by a Q-switched single-longitudinal-mode
Nd:YAG laser, has been demonstrated based on a type II noncritical
phase matching (NCPM) KTA crystal. The maximum pulse energy of
10.2 mJ, with output stability better than 4.1% rms at 3.467 μm, is obtained at a repetition rate of 10 Hz and a pulse width of 2 ns; 11.9 mJ of 1.535 μm radiation is obtained simultaneously. This
extracavity NCPM KTA OPO is very useful when high energy, high
beam quality and smooth time domain are needed.
Abstract: Some believe that stigma is the worst consequence faced by people who have mental illness. Mental illness researchers have focused on the influence of the mass media on the stigmatization of people with mental illness. However, no studies have investigated the effects of interactive media, such as blogs, on the stigmatization of mentally ill people, even though these media have a significant influence on people in all areas of life. The purpose of this study is to investigate the use of interactivity in the destigmatization of the mentally ill and the moderating effect of self-construal (independent versus interdependent) on the relation between interactivity and destigmatization. The findings suggest that people in the human-human interaction condition had less social distance toward people with mental illness. Additionally, participants with higher independence showed more favorable affect and less social distance toward mentally ill people. Finally, direct contact with mentally ill people increased a person's positive affect toward people with mental illness. The study should provide insights for mental health practitioners by suggesting how they can use interactive media to reach the public that stigmatizes the mentally ill.
Abstract: The long-term variation of solar insolation has been widely studied; however, parallel observations on short time scales are rather lacking. This paper investigates the short-time-scale evolution of the solar radiation spectrum (UV, PAR, and NIR bands) due to atmospheric aerosols and water vapor. A total of 25 days of global and diffuse solar spectra, ranging from air mass 2 to 6, were collected using a ground-based spectrometer with the shadowband technique. The results show that the variation of solar radiation is least in the UV fraction, followed by PAR, and greatest in the NIR. The broader variations in PAR and NIR are associated with short-time-scale fluctuations of aerosol and water vapor. The corresponding daily evolution of the UV, PAR, and NIR fractions implies that aerosol and water vapor variation could also be responsible for the deviation pattern in the Langley-plot analysis.
Abstract: Crosstalk is the major limiting issue in very-high-bit-rate digital subscriber line (VDSL) systems in terms of bit rate and service coverage. At the central office side, joint signal processing accompanied by appropriate power allocation enables complex multiuser processors to provide near-capacity rates. Unfortunately, complexity grows with the square of the number of lines within a binder; by taking into account that only a few dominant crosstalkers contribute the main part of the crosstalk power, the canceller structure can be simplified, resulting in much lower run-time complexity. In this paper, a multiuser power control scheme, namely iterative waterfilling, is combined with previously proposed partial crosstalk cancellation approaches; the resulting performance, the best achieved to date, is verified by simulation results.
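The per-user step inside iterative waterfilling is the classic single-user water-filling allocation; a minimal sketch follows, with invented per-tone noise levels and power budget (not the paper's multiuser scheme or its crosstalk model):

```python
# Classic single-user water-filling: pour a total power budget over tones so
# that power goes first to the tones with the lowest noise-to-gain level.
# Noise levels and budget below are invented for illustration.
def waterfill(noise, total_power, tol=1e-9):
    lo, hi = 0.0, max(noise) + total_power    # bracket the water level mu
    while hi - lo > tol:
        mu = (lo + hi) / 2
        used = sum(max(mu - n, 0.0) for n in noise)
        if used > total_power:
            hi = mu                           # water level too high
        else:
            lo = mu                           # water level too low
    mu = (lo + hi) / 2
    return [max(mu - n, 0.0) for n in noise]  # per-tone power allocation

powers = waterfill(noise=[1.0, 2.0, 3.0], total_power=3.0)
# Water level mu = 3, so the allocation is [2, 1, 0]
```

In the multiuser (iterative) version, each line repeats this step against the interference produced by the others until the allocations converge.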
Abstract: Clustering algorithms help to understand the hidden
information present in datasets. A dataset may contain intrinsic and
nested clusters, the detection of which is of utmost importance. This
paper presents a Distributed Grid-based Density Clustering algorithm
capable of identifying arbitrary shaped embedded clusters as well as
multi-density clusters over large spatial datasets. For handling
massive datasets, we implemented our method using a 'shared-nothing'
architecture in which multiple computers are interconnected
over a network. Experimental results are reported to establish the
superiority of the technique in terms of scale-up, speedup as well as
cluster quality.
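The core grid-based density idea (bin points into cells, keep the dense cells, merge adjacent dense cells) can be sketched in a single-machine toy version; this is an illustrative sketch of the general technique, not the paper's distributed algorithm, and the points and thresholds are invented:

```python
from collections import defaultdict, deque

# Toy grid-based density clustering: assign points to grid cells, keep cells
# with at least min_pts points, and merge 8-connected dense cells by BFS.
def grid_density_cluster(points, cell=1.0, min_pts=2):
    cells = defaultdict(list)
    for p in points:
        cells[(int(p[0] // cell), int(p[1] // cell))].append(p)
    dense = {c for c, pts in cells.items() if len(pts) >= min_pts}
    clusters, seen = [], set()
    for c in dense:
        if c in seen:
            continue
        comp, queue = [], deque([c])          # grow one connected component
        seen.add(c)
        while queue:
            cur = queue.popleft()
            comp.extend(cells[cur])
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    nb = (cur[0] + dx, cur[1] + dy)
                    if nb in dense and nb not in seen:
                        seen.add(nb)
                        queue.append(nb)
        clusters.append(comp)
    return clusters

pts = [(0.1, 0.1), (0.2, 0.3), (0.9, 0.8), (5.1, 5.2), (5.3, 5.1), (5.2, 5.4)]
clusters = grid_density_cluster(pts)          # two well-separated blobs
```

In the shared-nothing setting, the cell counting parallelizes naturally: each machine bins its partition of the data and only the per-cell counts are exchanged before the merge step.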
Abstract: In this paper, by using the continuation theorem of coincidence degree theory and M-matrix theory, and by constructing suitable Lyapunov functions, we obtain sufficient conditions for the existence and global exponential stability of periodic solutions of recurrent neural networks with distributed delays and impulses on time scales. Since the boundedness of the activation functions g_j, h_j is not assumed, these results are less restrictive than those given in earlier references.
Abstract: In this article, after describing the problem and its importance, transformational leadership is studied in light of leadership theories. Issues such as the definition of transformational leadership and its dimensions are compared on the basis of the views of various experts, and transformational leadership is then examined in successful and unsuccessful companies. Following the methodology, the research method, hypotheses, population, and statistical sample are described, and the research findings are analyzed using descriptive and inferential statistical methods in the framework of analytical tables. Finally, our conclusion is drawn from the results of the statistical tests. The final result shows that transformational leadership is significantly higher in successful companies than in unsuccessful ones P
Abstract: We report on a high-speed quantum cryptography system that utilizes simultaneous entanglement in polarization and in "time-bins". With multiple degrees of freedom contributing to the secret key, we can achieve over ten bits of random entropy per detected coincidence. In addition, we collect from multiple spots of the downconversion cone to further amplify the data rate, allowing us to achieve over 10 Mbits of secure key per second.
Abstract: Effective estimation of software development effort is an important aspect of successful project management. Based on a large database of 4106 completed projects, this study statistically examines the factors that influence development effort. The factors found to be significant for effort are project size, the average number of developers who worked on the project, the type of development, the development language, the development platform, and the use of rapid application development. Among these factors, project size is the most critical cost driver. Notably, this study found that the use of CASE tools does not necessarily reduce development effort, which adds support to the claim that the benefit of such tools is subtle. As many of the current estimation models are rarely or unsuccessfully used, this study proposes a parsimonious parametric model for the prediction of effort which is both simple and more accurate than previous models.
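Parametric effort models of this kind are commonly fitted as a power law, effort = a·size^b, via least squares in log space. The sketch below illustrates that fitting procedure on synthetic data; it is not the study's model, and the data points are invented:

```python
import math

# Fit effort = a * size^b by ordinary least squares on log-transformed data.
# The data below are synthetic (generated from a known power law), used only
# as a sanity check of the fitting procedure.
def fit_power_law(sizes, efforts):
    xs = [math.log(s) for s in sizes]
    ys = [math.log(e) for e in efforts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))    # slope = size exponent
    a = math.exp(my - b * mx)                 # intercept = multiplier
    return a, b

sizes = [10, 50, 100, 500, 1000]
efforts = [2 * s ** 1.1 for s in sizes]       # exact effort = 2 * size^1.1
a, b = fit_power_law(sizes, efforts)
# Recovers a ≈ 2 and b ≈ 1.1
```

An exponent b > 1 encodes the diseconomy of scale that makes project size the dominant cost driver; additional significant factors enter such models as multiplicative adjustment terms.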