Abstract: Lanthanide-doped upconversion nanoparticles, which convert near-infrared light to visible light, have attracted growing interest because of their great potential in fluorescence imaging. Upconversion fluorescence imaging with excitation in the near-infrared (NIR) region has been used to image biological cells and tissues. However, improving detection sensitivity and decreasing absorption and scattering in biological tissues remain unresolved problems. In the present study, a novel NIR-reflected multispectral imaging system was developed for upconversion fluorescence imaging in small animals. Using this system, we obtained high-contrast, autofluorescence-free images when biocompatible upconversion particles (UCPs) were injected near the body surface or deep into the tissue. Furthermore, we extracted the respective spectra of the upconversion fluorescence and relatively quantified the fluorescence intensity with multispectral analysis. To our knowledge, this is the first analysis and quantification of upconversion fluorescence in small-animal imaging.
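The multispectral analysis described above separates the upconversion emission from background signals. As a rough illustration of the idea only (the four-band spectra and all numbers below are made up, not the authors' data or instrument), a pixel spectrum can be unmixed into endmember abundances by ordinary least squares:

```python
# Minimal linear spectral unmixing sketch (hypothetical data): each pixel
# spectrum is modeled as a mix of two known endmember spectra (upconversion
# emission vs. background), and abundances are recovered by least squares.

def unmix(pixel, endmembers):
    """Solve min ||a1*e1 + a2*e2 - pixel||^2 via the 2x2 normal equations."""
    e1, e2 = endmembers
    g11 = sum(x * x for x in e1)                 # Gram matrix entries
    g22 = sum(x * x for x in e2)
    g12 = sum(x * y for x, y in zip(e1, e2))
    b1 = sum(x * p for x, p in zip(e1, pixel))   # right-hand side
    b2 = sum(x * p for x, p in zip(e2, pixel))
    det = g11 * g22 - g12 * g12
    a1 = (g22 * b1 - g12 * b2) / det
    a2 = (g11 * b2 - g12 * b1) / det
    return a1, a2

# Hypothetical 4-band spectra: 'uc' mimics a sharp upconversion emission,
# 'bg' a broad autofluorescence background.
uc = [0.1, 0.9, 0.2, 0.0]
bg = [0.5, 0.5, 0.5, 0.5]
pixel = [0.3 * u + 0.7 * b for u, b in zip(uc, bg)]
a_uc, a_bg = unmix(pixel, (uc, bg))
```

Because the synthetic pixel lies exactly in the span of the two endmembers, the least-squares fit recovers the mixing fractions exactly.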
Abstract: This paper introduces the evolution of information and communication technology and illustrates its usage in the work domain. The paper is divided into two parts. The first part gives an overview of the different phases of information processing in the work domain. It starts by charting the past and present usage of computers in work environments and describes current technological trends that are likely to influence future business applications. The second part starts by briefly describing how the usage of computers changed business processes in the past, and presents early Ambient Intelligence applications based on identification and localization information that are already used in the production and retail sectors. Based on current systems and prototype applications, the paper gives an outlook on how Ambient Intelligence technologies could change business processes in the future.
Abstract: Using spatial models as a shared common basis of information about the environment for different kinds of context-aware systems has been a heavily researched topic in recent years. This research has focused on how to create, update, and merge spatial models so as to enable highly dynamic, consistent, and coherent spatial models at large scale. In this paper, however, we concentrate on how context-aware applications could use this information to adapt their behavior to the situation they are in. The main idea is to provide the spatial model infrastructure with a situation recognition component based on generic situation templates. A situation template is, as part of a much larger situation template library, an abstract, machine-readable description of a certain basic situation type, which can be used by different applications to evaluate their situation. In this paper, different theoretical and practical issues, technical, ethical, and philosophical ones, are discussed that are important for understanding and developing situation-dependent systems based on situation templates. A basic system design is presented which allows for reasoning with uncertain data using an improved version of a learning algorithm for the automatic adaptation of situation templates. Finally, to support the development of adaptive applications, we present a new situation-aware adaptation concept based on workflows.
Abstract: This paper compares the Hilditch, Rosenfeld, Zhang-Suen, and Nagendraprasad-Wang-Gupta (NWG) thinning algorithms for Javanese character image recognition. Thinning is an effective process when the focus is not on the size of the pattern, but rather on the relative position of the strokes in the pattern. The research analyzes the thinning of 60 Javanese characters. Time-wise, the Zhang-Suen algorithm gives the best results, with an average process time of 0.00455188 seconds. In terms of the percentage of pixels that meet one-pixel thickness, the Rosenfeld algorithm gives the best results, with a 99.98% success rate. In terms of the number of pixels erased, the NWG algorithm gives the best results, erasing 84.12% of pixels on average. It can be concluded that the Hilditch algorithm performs least successfully compared to the other three algorithms.
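For reference, the Zhang-Suen algorithm compared above can be sketched in a few dozen lines. This is a generic textbook implementation run on a toy bitmap, not the authors' code or their Javanese-character data:

```python
# Zhang-Suen thinning sketch on a 0/1 image stored as a list of lists.

def neighbours(img, y, x):
    # P2..P9, clockwise starting from the pixel above
    return [img[y-1][x], img[y-1][x+1], img[y][x+1], img[y+1][x+1],
            img[y+1][x], img[y+1][x-1], img[y][x-1], img[y-1][x-1]]

def zhang_suen(img):
    img = [row[:] for row in img]
    changed = True
    while changed:
        changed = False
        for step in (0, 1):                      # the two sub-iterations
            to_clear = []
            for y in range(1, len(img) - 1):
                for x in range(1, len(img[0]) - 1):
                    if img[y][x] != 1:
                        continue
                    p = neighbours(img, y, x)
                    b = sum(p)                   # B(P1): black neighbours
                    # A(P1): number of 0 -> 1 transitions around the ring
                    a = sum(p[i] == 0 and p[(i + 1) % 8] == 1 for i in range(8))
                    if step == 0:
                        cond = p[0]*p[2]*p[4] == 0 and p[2]*p[4]*p[6] == 0
                    else:
                        cond = p[0]*p[2]*p[6] == 0 and p[0]*p[4]*p[6] == 0
                    if 2 <= b <= 6 and a == 1 and cond:
                        to_clear.append((y, x))
            for y, x in to_clear:                # delete only after the scan
                img[y][x] = 0
                changed = True
    return img

# Toy input: a 3-pixel-wide vertical bar, which should thin to a 1-pixel line.
bar = [[0] * 5 for _ in range(7)]
for y in range(1, 6):
    for x in range(1, 4):
        bar[y][x] = 1
skeleton = zhang_suen(bar)
```

The batched deletion per sub-iteration (collecting `to_clear` first) is what preserves connectivity; deleting pixels during the scan would bias the result toward the scan direction.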
Abstract: This paper presents an approach based on the adoption of a distributed cognition framework and a non-parametric multicriteria evaluation methodology (DEA) designed specifically to compare e-commerce websites from the consumer/user viewpoint. In particular, the framework treats a website's relative efficiency as a measure of its quality and usability. A website is modelled as a black box capable of providing the consumer/user with a set of functionalities. When the consumer/user interacts with the website to perform a task, he/she is involved in a cognitive activity, sustaining a cognitive cost to search, interpret, and process information, and experiencing a sense of satisfaction. The degree of ambiguity and uncertainty he/she perceives and the search time needed determine the size of the effort, and hence the cognitive cost, he/she has to sustain to perform the task. Conversely, task completion and result achievement induce a sense of gratification, satisfaction, and usefulness. In total, 9 variables are measured, classified into a set of 3 website macro-dimensions (user experience, site navigability, and structure). The framework is applied to compare 40 websites of businesses performing electronic commerce in the information technology market. A questionnaire to collect subjective judgements on the websites in the sample was purposely designed and administered to 85 university students enrolled in computer science and information systems engineering undergraduate courses.
Abstract: This paper presents a hybrid approach for solving the n-queen problem by combining PSO and SA. PSO is a population-based heuristic method that can become trapped in local optima. To address this problem we can use SA. Although SA requires many iterations and converges slowly on some problems, with careful tuning of the initial parameters, such as the temperature and the length of the temperature stages, SA guarantees convergence. In this article we use discrete PSO (due to the nature of the n-queen problem) to reach a good local optimum, and then use SA to escape from it. The experimental results show that our hybrid method converges to a result faster than SA alone, especially for high-dimensional n-queen problems.
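As an illustration of the SA component alone (a generic sketch with assumed parameters, not the paper's hybrid, which additionally seeds SA with a discrete-PSO solution), a board can be encoded as a permutation so that only diagonal attacks remain, and improved with swap moves:

```python
# Simulated-annealing sketch for n-queens; board[i] = row of the queen in
# column i. Temperature schedule and move set are illustrative choices.
import math
import random

def conflicts(board):
    """Number of attacking queen pairs (same row or same diagonal)."""
    n = len(board)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if board[i] == board[j] or abs(board[i] - board[j]) == j - i)

def anneal(board, t0=2.0, cooling=0.995, steps=20000, seed=0):
    rng = random.Random(seed)
    cur, best, t = board[:], board[:], t0
    while steps and conflicts(best):
        steps -= 1
        i, j = rng.sample(range(len(cur)), 2)
        cand = cur[:]
        cand[i], cand[j] = cand[j], cand[i]      # swap two columns' rows
        delta = conflicts(cand) - conflicts(cur)
        # Accept improvements always, worsenings with Boltzmann probability
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            cur = cand
        if conflicts(cur) < conflicts(best):
            best = cur[:]
        t *= cooling                             # geometric cooling
    return best

start = list(range(8))       # identity permutation: every pair conflicts
solution = anneal(start)
```

The swap neighbourhood keeps the rows a permutation, so row conflicts can never be introduced; only the diagonal terms of `conflicts` do any work after the first move.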
Abstract: Nuclear energy sources have been widely used in the
past decades in order to power spacecraft subsystems. Nevertheless,
their use has attracted controversy because of the risk of harmful
material released into the atmosphere if an accident were to occur
during the launch phase of the mission, leading to the general
adoption of photovoltaic systems.
As compared to solar cells, wind turbines have a great advantage
on Mars, as they can continuously produce power both during dust
storms and at night-time: this paper focuses on the potential of a wind
energy conversion system (WECS) considering the atmospheric
conditions on Mars. The wind potential on the Martian surface has been
estimated, as well as the average energy requirements of a Martian
probe or surface rover. Finally, the expected daily energy output of
the WECS has been computed on the basis of both the swept area of
the rotor and the equivalent wind speed at the landing site.
Abstract: We consider linear regression models where both the input data (the values of the independent variables) and the output data (the observations of the dependent variable) are interval-censored. We introduce a possibilistic generalization of the least squares estimator, the so-called OLS-set for the interval model. This set captures the impact on the OLS estimator of the loss of information caused by interval censoring and provides a tool for quantifying this effect. We study complexity-theoretic properties of the OLS-set. We also deal with restricted versions of the general interval linear regression model, in particular the crisp input – interval output model. We give an argument that natural descriptions of the OLS-set in the crisp input – interval output model cannot be computed in polynomial time. We then derive easily computable approximations of the OLS-set which can be used instead of the exact description, and illustrate the approach with an example.
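One common way to formalize such a set (the notation below is assumed for illustration, not necessarily the authors' own): writing $\mathbf{X}$ for the interval matrix of inputs and $\mathbf{y}$ for the interval vector of outputs, the OLS-set collects every least-squares estimate attainable as the unobserved data range over their intervals.

```latex
\[
\mathrm{OLS}(\mathbf{X},\mathbf{y})
  = \left\{ (X^{\top}X)^{-1}X^{\top}y \;:\;
     X \in \mathbf{X},\ y \in \mathbf{y},\ X^{\top}X \text{ nonsingular} \right\}
\]
```

In the crisp input – interval output special case, $\mathbf{X}$ degenerates to a single matrix $X$ and the set is the image of the box $\mathbf{y}$ under the linear map $(X^{\top}X)^{-1}X^{\top}$, i.e. a convex polytope.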
Abstract: Policies that support entrepreneurship are key to the generation of new businesses. In Brazil, seed capital, the installation of technology parks, zero-interest financing programs, and economic subsidies such as the First Innovative Company Program (PRIME) are examples of incentive policies. To implement PRIME in particular, the Brazilian Innovation Agency (FINEP) decentralized its operation so that business incubators could select innovative projects. This paper analyzes the PRIME program at the Business Incubator Center of the State of Sergipe (CISE) by calculating the mean and standard deviation of the grades companies obtained on the factors of innovation, market potential, economic and financial return, market strategy, and staff, and by applying the Mann-Whitney test.
Abstract: Electromyography (EMG) is the study of muscle function through analysis of the electrical activity produced by muscles. This electrical activity, which is displayed in the form of a signal, is the result of neuromuscular activation associated with muscle contraction. The most common EMG signal recording techniques use surface and needle/wire electrodes, where the latter are usually used when deep muscles are of interest. This paper focuses on the surface electromyogram (SEMG) signal. During SEMG recording, several problems have to be countered, such as noise, motion artifacts, and signal instability. Thus, various signal processing techniques have been implemented to produce a reliable signal for analysis. The SEMG signal finds broad application, particularly in the biomedical field. It has been analyzed and studied for various purposes, such as neuromuscular disease, enhancement of muscular function, and human-computer interfaces.
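One widely used SEMG conditioning step can be sketched in pure Python (an illustrative textbook step, not the paper's specific pipeline): full-wave rectification folded into a moving-RMS envelope that smooths the noisy interference signal into an amplitude estimate. The synthetic "contraction burst" below is invented for the example.

```python
# Moving-RMS envelope of a raw signal; squaring makes explicit rectification
# unnecessary, since x^2 = |x|^2.
import math

def rms_envelope(signal, window):
    """Moving RMS over a sliding window; returns len(signal) - window + 1 values."""
    out = []
    for i in range(len(signal) - window + 1):
        seg = signal[i:i + window]
        out.append(math.sqrt(sum(x * x for x in seg) / window))
    return out

# Synthetic burst: 50 samples of a +/-2 square wave (a crude stand-in for a
# contraction) between stretches of baseline zeros.
signal = ([0.0] * 50
          + [2.0 if k % 2 == 0 else -2.0 for k in range(50)]
          + [0.0] * 50)
env = rms_envelope(signal, window=10)
```

Inside the burst the envelope sits at the signal's RMS amplitude (2.0 here), on the baseline it is zero, and at the burst edges it ramps between the two, which is exactly the smoothing behaviour wanted for onset detection.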
Abstract: This article describes the remains of the base of a minaret found in 2009 at the shakhristan of the medieval fortress Aktobe, which is located along the courses of the Balta and Aksu rivers. The minaret, which consists of two parts, a stylobate set in a pit and a base section, dates to the XI-XII centuries. The preserved height of the structure is 3.6 meters. The quadrangular stylobate of the minaret, whose corners are oriented toward the four cardinal directions, measures 8.65 x 8.5 m, with a height of 2.6 m. The octagonal upper cap has a diameter of 7.85 m and a preserved height of 1 m. This minaret is of particular importance among the historical and architectural monuments of Kazakhstan, as it is so far the only known minaret belonging to the Karakhanid epoch, during which Islam was the state religion.
Abstract: The main purpose of this research is the calculation of implicit prices for the environmental level of air quality in the city of Moscow on the basis of housing property prices. The database used contains records of approximately 20 thousand apartments and was provided by a leading real estate agency operating in Russia. The explanatory variables include physical characteristics of the houses, environmental data (industry emissions), neighbourhood sociodemographic data, and geographic data (the GPS coordinates of each house). The hedonic regression results for the ecological variables show 'negative' prices as the level of air contamination from substances such as carbon monoxide, nitrogen dioxide, sulphur dioxide, and suspended particles (CO, NO2, SO2, TSP) increases. The marginal willingness to pay for higher environmental quality is presented for linear and log-log models.
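To make the log-log specification concrete, here is a minimal sketch on synthetic data (the coefficient values, noise level, and single-regressor setup are invented for illustration; the Moscow dataset and its full covariate set are not reproduced). In a log-log hedonic model the slope on a pollutant is an elasticity, and a negative estimate is the "negative" implicit price reported above:

```python
# ln(price) = a + b * ln(CO) + noise, fitted by one-regressor OLS.
import math
import random

def ols_simple(xs, ys):
    """Slope and intercept of a simple OLS fit via centered sums."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return b, my - b * mx

rng = random.Random(42)
co = [rng.uniform(0.5, 3.0) for _ in range(500)]     # contamination level
# True elasticity -0.15: a 1% rise in CO lowers price by about 0.15%.
ln_price = [12.0 - 0.15 * math.log(c) + rng.gauss(0, 0.01) for c in co]
elasticity, intercept = ols_simple([math.log(c) for c in co], ln_price)
```

With 500 observations and small noise, the fitted elasticity lands very close to the true -0.15, illustrating how the sign and size of the implicit price are read directly off the log-log coefficient.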
Abstract: Compared to oil production from microorganisms, little work has been performed on mixed cultures of microalgae and yeast. This article aims to show the high oil accumulation potential of a mixed culture of the microalga Chlorella sp. KKU-S2 and the oleaginous yeast Torulaspora maleeae Y30 using sugarcane molasses as substrate. The monoculture of T. maleeae Y30 grew faster than that of the microalga Chlorella sp. KKU-S2. In the yeast monoculture, a biomass of 6.4 g/L with a specific growth rate (μ) of 0.265 (1/d) and a lipid yield of 0.466 g/L were obtained, while 2.53 g/L of biomass with μ of 0.133 (1/d) and a lipid yield of 0.132 g/L were obtained for the monoculture of Chlorella sp. KKU-S2. The biomass concentration in the mixed culture of T. maleeae Y30 with Chlorella sp. KKU-S2 increased faster and was higher compared with those in the monocultures and in the mixed culture of microalgae. In the mixed culture of the microalgae Chlorella sp. KKU-S2 and C. vulgaris TISTR8580, a biomass of 3.47 g/L and a lipid yield of 0.123 g/L were obtained. In the mixed culture of T. maleeae Y30 with Chlorella sp. KKU-S2, a maximum biomass of 7.33 g/L and a lipid yield of 0.808 g/L were obtained. The maximum cell yield coefficient (YX/S, 0.229 g/L), specific lipid yield (YP/X, 0.11 g lipid/g cells), and volumetric lipid production rate (QP, 0.115 g/L/d) were obtained in the mixed culture of yeast and microalgae. Clearly, T. maleeae Y30 and Chlorella sp. KKU-S2 efficiently use sugarcane molasses as organic nutrients in mixed culture under mixotrophic growth. The biomass productivity and lipid yield are notably enhanced in comparison with the monocultures.
Abstract: Traditional higher-education classrooms allow lecturers to observe students' behaviours and responses to a particular pedagogy during learning in a way that can influence changes to the pedagogical approach. Within current e-learning systems it is difficult to perform continuous analysis of the cohort's behavioural tendency, making real-time pedagogical decisions difficult. This paper presents a Virtual Learning Process Environment (VLPE) based on the Business Process Management (BPM) conceptual framework. Within the VLPE, course designers can model various education pedagogies in the form of learning process workflows using an intuitive flow diagram interface. These diagrams are used to visually track the learning progress of a cohort of students. This helps assess the effectiveness of the chosen pedagogy, providing the information required to improve course design. A case scenario of a cohort of students is presented, and quantitative statistical analysis of their learning process performance is gathered and displayed in real time using dashboards.
Abstract: A rice seed expression (cDNA) library in the Lambda ZAP II® phage, constructed from developing grain 10-20 days after flowering, was transformed into yeast for functional complementation assays in three salt-sensitive yeast mutants, the S. cerevisiae strains CY162, G19, and Axt3K. Transformed cells of G19 and Axt3K carrying the pYES vector with cDNA inserts showed enhanced tolerance compared with those carrying the empty pYES vector. Sequencing of the cDNA inserts revealed that they encode a putative protein with sequence homology to the rice putative protein PROLM24 (Os06g31070), a prolamin precursor. Expression of this cDNA did not affect yeast growth in the absence of salt. Axt3K and G19 strains expressing PROLM24 were able to grow in up to 400 mM and 600 mM NaCl, respectively. Similarly, the Axt3K mutant expressing PROLM24 showed a comparatively higher growth rate in medium with excess LiCl (50 mM). The observation that expression of PROLM24 rescued the salt-sensitive phenotypes of G19 and Axt3K indicates the existence of a regulatory system that ameliorates the effect of salt stress in the transformed yeast mutants. However, the exact function of the cDNA sequence, which shows partial sequence homology to yeast UTR1, is not clear. Although UTR1 is involved in ferrous uptake and iron homeostasis in yeast cells, there is no evidence of a role in Na+ homeostasis in yeast cells. The absence of transmembrane regions in the Os06g31070 protein indicates that salt tolerance is achieved not through direct functional complementation of the mutant genes but through an alternative mechanism.
Abstract: This paper proposes a Particle Swarm Optimization (PSO) based technique for the optimal allocation of Distributed Generation (DG) units in power systems. Our aim is to decide the optimal number, type, size, and location of DG units for voltage profile improvement and power loss reduction in a distribution network. Two types of DGs are considered, and a distribution load flow is used to calculate the exact loss. The load flow algorithm is combined with PSO and iterated until acceptable results are obtained. The suggested method is implemented in MATLAB. Test results indicate that the PSO method obtains better results than a simple heuristic search method on the 30-bus and 33-bus radial distribution systems. It obtains the maximum loss reduction for each of the two types of optimally placed multiple DGs. Moreover, voltage profile improvement is achieved.
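The PSO machinery itself can be illustrated with a minimal continuous sketch (the quadratic toy objective below merely stands in for the paper's load-flow loss evaluation; all parameter values are generic textbook choices, not the authors'):

```python
# Minimal global-best PSO minimizing f(x) = sum(x_i^2).
import random

def pso(f, dim=2, n_particles=20, iters=200, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # personal best positions
    gbest = min(pbest, key=f)[:]                 # swarm (global) best
    w, c1, c2 = 0.7, 1.5, 1.5                    # inertia, cognitive, social
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i][:]
    return gbest

loss = lambda x: sum(v * v for v in x)           # stand-in for network loss
best = pso(loss)
```

In the paper's setting, evaluating `f` would mean running the distribution load flow for a candidate DG placement and returning the resulting power loss, with the position vector encoding DG sizes and locations.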
Abstract: This paper presents a conceptual model of agreement options for negotiation support in multi-person decisions on optimizing high-rise building columns. The decision is complicated because many parties are involved in choosing a single alternative from a set of solutions, and different concerns arise from differing preferences, experiences, and backgrounds. The candidate building columns are referred to as agreement options, which are determined by identifying the possible decision-maker groups and then determining the optimal solution for each group. The grouping in this paper is based on the preferences of three decision makers: the designer, the programmer, and the construction manager. Decision techniques are applied to determine the relative value of the alternative solutions for performing the function. The Analytical Hierarchy Process (AHP) was applied for the decision process, and a game-theory-based agent system for coalition formation. An n-person cooperative game is represented by the set of all players. The proposed coalition formation model enables each agent to individually select its allies or coalition. It further emphasizes the importance of performance evaluation in the design process and of value-based decisions.
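The AHP step can be illustrated with a small priority-vector computation (the three criterion weights below are invented for the example, not taken from the paper): criterion weights are derived as the principal eigenvector of a pairwise comparison matrix, here approximated by power iteration.

```python
# AHP priority vector via power iteration on the comparison matrix.

def ahp_weights(M, iters=100):
    n = len(M)
    w = [1.0 / n] * n                            # uniform starting vector
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]                   # normalize to sum to 1
    return w

# A perfectly consistent matrix built from assumed weights (0.5, 0.3, 0.2):
# entry M[i][j] = w_i / w_j, as in Saaty's ratio-scale construction.
true_w = [0.5, 0.3, 0.2]
M = [[wi / wj for wj in true_w] for wi in true_w]
weights = ahp_weights(M)
```

For a perfectly consistent matrix one multiplication already lands on the exact eigenvector; with real, slightly inconsistent judgements the iteration converges to the principal eigenvector, and a consistency ratio check would normally accompany it.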
Abstract: Most real queuing systems include special properties and constraints that cannot be analyzed directly using the results of solved classical queuing models. Examples of such conditions are the lack of Markov-chain features, non-exponential patterns, and service constraints. This paper presents an applied general algorithm for analyzing and optimizing queuing systems. The stages of the algorithm are described through a real case study, which consists of an almost completely non-Markov system with a limited number of customers and limited capacities, as well as many of the common exceptions found in real queuing networks. Simulation is used to optimize this system. The stages introduced in this article include primary modeling, determining the kind of queuing system, index definition, statistical analysis and goodness-of-fit testing, model validation, and optimization of the system via simulation.
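The simulation stage can be sketched in its simplest form with the Lindley recurrence for a single-server queue (a toy M/M/1 stand-in with invented rates; the paper's case study is non-Markov and far more constrained): W_{k+1} = max(0, W_k + S_k - A_{k+1}).

```python
# Monte-Carlo estimate of the mean queueing delay in a single-server queue.
import random

def simulate_queue(lam, mu, n_customers, seed=0):
    rng = random.Random(seed)
    wait, total_wait = 0.0, 0.0
    for _ in range(n_customers):
        total_wait += wait
        service = rng.expovariate(mu)            # S_k
        interarrival = rng.expovariate(lam)      # A_{k+1}
        wait = max(0.0, wait + service - interarrival)
    return total_wait / n_customers

avg_wait = simulate_queue(lam=0.5, mu=1.0, n_customers=100_000)
# For M/M/1 the theoretical mean queueing delay is rho/(mu - lam) = 1.0 here.
```

Replacing `expovariate` with empirically fitted distributions (the goodness-of-fit stage above) turns the same loop into a G/G/1 simulator, which is exactly why simulation is the tool of choice once Markov assumptions fail.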
Abstract: Innovations in technology have created new ethical
challenges. Essential use of electronic communication in the
workplace has escalated at an astronomical rate over the past decade.
As such, legal and ethical dilemmas confronted by both the employer
and the employee concerning managerial control and ownership of
e-information have increased dramatically in the USA. From the
employer's perspective, ownership and control of all information
created for the workplace is an undeniable source of economic
advantage and must be monitored zealously. From the perspective of
the employee, individual rights, such as privacy, freedom of speech,
and freedom from unreasonable search and seizure, continue to be
stalwart legal guarantees that employers are not legally or ethically
entitled to abridge in the workplace. These issues have been the
source of great debate and the catalyst for legal reform. The fine line
between ethical and legal has been complicated by emerging
technologies. This manuscript will identify and discuss a number of
specific legal and ethical issues raised by the dynamic electronic
workplace and conclude with suggestions that employers should
follow to respect the delicate balance between employees' legal
rights to privacy and the employer's right to protect its knowledge
systems and infrastructure.
Abstract: Mapping between local and global coordinates is an
important issue in finite element method, as all calculations are
performed in local coordinates. The concern arises when sub-parametric
elements are used, in which the shape functions of the field variable
and the geometry of the element are not the same. This is particularly
the case for C* elements in which the extra degrees of freedoms
added to the nodes make the elements sub-parametric. In the present
work, transformation matrix for C1* (an 8-noded hexahedron
element with 12 degrees of freedom at each node) is obtained using
equivalent C0 elements (with the same number of degrees of
freedom). The convergence rate of 8-noded C1* element is nearly
equal to its equivalent C0 element, while it consumes less CPU time
with respect to the C0 element. The existence of derivative degrees
of freedom at the nodes of C1* element along with excellent
convergence makes it superior compared with its equivalent C0
element.
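The local-to-global mapping discussed above can be illustrated with the standard trilinear geometry map of an 8-node hexahedron (a generic C0 textbook construction; the paper's C1* element adds derivative degrees of freedom on top of this, and the node ordering below is an assumption for the example):

```python
# Isoparametric geometry map x(xi) = sum_i N_i(xi, eta, zeta) * x_i.

# Corner signs of the reference cube [-1, 1]^3, one tuple per node.
SIGNS = [(-1, -1, -1), (1, -1, -1), (1, 1, -1), (-1, 1, -1),
         (-1, -1,  1), (1, -1,  1), (1, 1,  1), (-1, 1,  1)]

def shape_functions(xi, eta, zeta):
    """Trilinear N_i = (1 + s_x*xi)(1 + s_y*eta)(1 + s_z*zeta) / 8."""
    return [(1 + sx * xi) * (1 + sy * eta) * (1 + sz * zeta) / 8.0
            for sx, sy, sz in SIGNS]

def map_to_global(nodes, xi, eta, zeta):
    """Map a local point to global coordinates via the nodal interpolation."""
    N = shape_functions(xi, eta, zeta)
    return [sum(n * node[d] for n, node in zip(N, nodes)) for d in range(3)]

# A unit cube in global coordinates, matching the node ordering above.
nodes = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
         (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]
center = map_to_global(nodes, 0.0, 0.0, 0.0)
```

The shape functions form a partition of unity at every local point, and the local origin maps to the element centroid, which is the sanity check usually run before assembling element matrices in this mapping.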