Abstract: Healthcare providers sometimes use the power of
humor as a treatment and therapy to support mental health or ease
mental disorders, because humor can provide relief from distress and
conflict. Humor is also well suited to advertising, for similar
reasons. This study examines humor's widespread use in advertising
and identifies relationships among humor mechanisms, female
depictions, and product types. The purpose is to conceptualize how
humor theories can be used not only to place a product within one of
the four color categories of the product color matrix, but also to
identify compelling contemporary female depictions through humor in
ads. The results offer guidance to marketing managers and consumers
on how female role depictions can be used effectively with humor in
ads. The four propositions developed herein are derived from the
related literature, through the identification of marketing strategy
formulations that enhance product memory by properly matching humor
mechanisms with female role depictions.
Abstract: The concept of housing affordability is a contested
issue, but a pressing and widespread problem for many countries.
Simple ratio measures based on housing expenditure and income are
habitually used to define and assess housing affordability. However,
conceptualising and measuring affordability in this manner focuses
only on financial attributes and fails to deal with wider issues such as
housing quality, location and access to services and facilities.
The research is based on the notion that the housing affordability
problem encompasses more than the financial costs of housing and a
household's ability to meet such costs, and must address larger issues
such as social and environmental sustainability and the welfare of
households. Therefore, the need arises for a broad and more
encompassing set of attributes by which housing affordability can be
assessed. This paper presents a system of criteria by which the
affordability of different housing locations could be assessed in a
comprehensive and sustainable manner. Moreover, the paper explores
the way in which such criteria could be measured.
Abstract: This research was conducted in cooperation with the
Emergency Department of a medical center in Taiwan. A predictive
model for the triage system was constructed, covering the construction
procedure, the selection of parameters, and sample screening. 2,000
patient records were selected at random by computer. After applying
three data mining classification methods (Multi-group Discriminant
Analysis, Multinomial Logistic Regression, and Back-propagation
Neural Networks), it was found that Back-propagation Neural
Networks best distinguish the patients' degree of emergency, with an
accuracy rate as high as 95.1%. The Back-propagation Neural Network
with the highest accuracy rate was built into the triage acuity expert
system in this research. With data mining, the predictive model of the
triage acuity expert system can be updated regularly, both to improve
the system and for education and training, and is not affected by
subjective factors.
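The back-propagation step at the core of such a network can be sketched in a few lines. The two synthetic features, three acuity classes, and all hyperparameters below are illustrative assumptions, not the study's actual model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for triage data: 2 hypothetical vital-sign features,
# 3 acuity classes. The study itself used real emergency-department records.
centers = np.array([[0.0, 0.0], [3.0, 3.0], [6.0, 0.0]])
X = np.vstack([c + 0.5 * rng.standard_normal((100, 2)) for c in centers])
y = np.repeat(np.arange(3), 100)
T = np.eye(3)[y]                                  # one-hot targets

# One hidden layer trained with plain back-propagation.
W1 = 0.1 * rng.standard_normal((2, 8)); b1 = np.zeros(8)
W2 = 0.1 * rng.standard_normal((8, 3)); b2 = np.zeros(3)

for _ in range(500):
    H = np.tanh(X @ W1 + b1)                      # hidden activations
    Z = H @ W2 + b2
    P = np.exp(Z - Z.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)             # softmax class probabilities
    G2 = (P - T) / len(X)                         # output-layer error signal
    G1 = (G2 @ W2.T) * (1.0 - H**2)               # error back-propagated through tanh
    W2 -= 0.5 * (H.T @ G2); b2 -= 0.5 * G2.sum(axis=0)
    W1 -= 0.5 * (X.T @ G1); b1 -= 0.5 * G1.sum(axis=0)

pred = np.argmax(np.tanh(X @ W1 + b1) @ W2 + b2, axis=1)
accuracy = (pred == y).mean()
```

With three well-separated clusters the network reaches high training accuracy; real triage data would of course require many more features and proper validation.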
Abstract: This study addresses the effect of impurities on the
crystallization of Na2CO3 produced within a strategy for capturing
CO2 from flue gases by alkaline absorption. A novel technology,
membrane-assisted crystallization, is proposed for Na2CO3
crystallization from mother liquors containing impurities. High-purity
Na2CO3•10H2O crystals were obtained without impairing the mass
transfer of water vapor through the membranes during crystallization.
Abstract: Fundamental sensor-motor couplings form the backbone
of most mobile robot control tasks, and often need to be implemented
fast, efficiently and nevertheless reliably. Machine learning
techniques are therefore often used to obtain the desired sensor-motor
competences.
In this paper we present an alternative to established machine
learning methods such as artificial neural networks, one that is very
fast, easy to implement, and has the distinct advantage that it
generates transparent, analysable sensor-motor couplings: system
identification through nonlinear polynomial mapping.
This work, which is part of the RobotMODIC project at the
universities of Essex and Sheffield, aims to develop a theoretical understanding
of the interaction between the robot and its environment.
One of the purposes of this research is to enable the principled design
of robot control programs.
As a first step towards this aim we model the behaviour of the
robot, as this emerges from its interaction with the environment, with
the NARMAX modelling method (Nonlinear, Auto-Regressive, Moving
Average models with eXogenous inputs). This method produces
explicit polynomial functions that can be subsequently analysed using
established mathematical methods.
In this paper we demonstrate the fidelity of the obtained NARMAX
models in the challenging task of robot route learning; we present a
set of experiments in which a Magellan Pro mobile robot was taught
to follow four different routes, always using the same mechanism to
obtain the required control law.
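The identification idea, an explicit polynomial map from lagged inputs and outputs fitted by least squares, can be sketched as follows. The simulated system and the choice of lags and polynomial terms are illustrative assumptions, not the RobotMODIC robot data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a toy nonlinear ARX system standing in for the robot:
# y(t) = 0.5*y(t-1) + 0.3*u(t-1) + 0.2*u(t-1)**2   (noise-free)
u = rng.uniform(-1.0, 1.0, 200)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.5 * y[t - 1] + 0.3 * u[t - 1] + 0.2 * u[t - 1] ** 2

# NARMAX-style regressor matrix: polynomial terms in lagged signals.
Phi = np.column_stack([
    y[:-1],           # y(t-1)
    u[:-1],           # u(t-1)
    u[:-1] ** 2,      # u(t-1)^2
    np.ones(199),     # constant term
])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
# theta is an explicit, human-readable behaviour law: each entry is the
# coefficient of one polynomial term, open to mathematical analysis.
```

Because the data are noise-free and the true system lies in the model class, least squares recovers the coefficients [0.5, 0.3, 0.2, 0.0]; this transparency is exactly the advantage over a trained neural network.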
Abstract: In this paper, the results of calculations of the dynamic
response of a multi-storey reinforced concrete building to a strong
mining shock originating from the main region of mining activity in
Poland (i.e. the Legnica-Glogow Copper District) are presented. The
representative time histories of accelerations registered in three
directions were used as ground motion data in calculations of the
dynamic response of the structure. Two variants of a numerical model
were applied: the model including only structural elements of the
building and the model including both structural and non-structural
elements (i.e. partition walls and ventilation ducts made of brick). It
turned out that the non-structural elements of multi-storey RC buildings
have a small impact, of about 10%, on the natural frequencies of these
structures. It was also shown that the dynamic response of the building
to the mining shock obtained when all non-structural elements are
included in the numerical model is about 20% smaller than when only
structural elements are considered. The principal stresses obtained in
the calculations of the dynamic response of the multi-storey building to
the strong mining shock are at about 30% of the values obtained from
static analysis (dead load).
Abstract: In blended learning environments, the Internet can be combined with other technologies. The aim of this research was to design, introduce and validate a model to support synchronous and asynchronous activities by managing content domains in an Adaptive Hypermedia System (AHS). The application is based on information recovery techniques, clustering algorithms and adaptation rules to adjust the user's model to contents and objects of study. This system was applied to blended learning in higher education. The research strategy used was the case study method. Empirical studies were carried out on courses at two universities to validate the model. The results of this research show that the model had a positive effect on the learning process. The students indicated that the synchronous and asynchronous scenario is a good option, as it involves a combination of work with the lecturer and the AHS. In addition, they gave positive ratings to the system and stated that the contents were adapted to each user profile.
Abstract: In the present work, a study of the vibration of thin cylindrical shells made of a functionally gradient material (FGM) composed of stainless steel and nickel is presented. Material properties are graded in the thickness direction of the shell according to a volume fraction power law distribution. The objective is to study the natural frequencies, the influence of constituent volume fractions and the effects of boundary conditions on the natural frequencies of the FG cylindrical shell. The study is carried out using third order shear deformation shell theory. The analysis is carried out using Hamilton's principle. The governing equations of motion of FG cylindrical shells are derived based on shear deformation theory. Results are presented on the frequency characteristics, the influence of constituent volume fractions and the effects of clamped-free boundary conditions.
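The volume fraction power law can be sketched as follows; the convention Vf = (z/h + 1/2)^N and the nominal room-temperature moduli used below are assumptions for illustration, since the paper's exact parameter values are not given here.

```python
import numpy as np

def graded_property(P_inner, P_outer, z, h, N):
    """Effective material property at thickness coordinate z in [-h/2, h/2],
    assuming the common volume-fraction power law Vf = (z/h + 1/2)**N
    (the paper's exact convention may differ)."""
    Vf = (z / h + 0.5) ** N
    return (P_outer - P_inner) * Vf + P_inner

# Young's modulus graded from nickel (inner face) to stainless steel
# (outer face); the GPa values are nominal room-temperature figures.
E_nickel, E_steel = 205.98, 207.79
h = 0.05                                   # shell thickness (m), illustrative
z = np.linspace(-h / 2, h / 2, 5)
E = graded_property(E_nickel, E_steel, z, h, N=1.0)
```

At z = -h/2 the property equals the inner constituent's value and at z = +h/2 the outer one's; the exponent N controls how quickly the mixture shifts between the two.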
Abstract: Prospective analysis is presented as an important tool to identify the most relevant opportunities and needs in research and development arising from planned interventions in innovation systems. This study chose Phyllanthus niruri, known as "stone breaker", to describe the knowledge about the species, using biotechnological forecasting through the software Vantage Point. A considerable increase in studies on Phyllanthus niruri can be seen in recent years, and patents on this plant date back twenty-five years. India was the country that carried out the most research on the species, showing interest mainly in studies of hepatoprotective, antioxidant and anti-cancer activities. Brazil is in second place, with special interest in anti-tumor studies. Given the identification of the Brazilian groups that exploit the species, it is possible to mediate partnerships and cooperation aimed at helping to implement the Program of Herbal Medicines (phytotherapics) in Brazil.
Abstract: This paper presents a counterexample showing that some
properties of the decision rules in the literature do not hold. We
give necessary and sufficient conditions under which these properties
are valid. These results will be helpful when one tries to choose the
right decision rules in research on rough set theory.
Abstract: We measured the major and trace element contents
and Rb-Sr isotopic compositions of 12 tektites from the Maoming
area, Guangdong province (south China). All the samples studied are
splash-form tektites which show pitted or grooved surfaces with
schlieren structures on some surfaces. The trace element ratios Ba/Rb
(avg. 4.33), Th/Sm (avg. 2.31), Sm/Sc (avg. 0.44), Th/Sc (avg. 1.01),
La/Sc (avg. 2.86), Th/U (avg. 7.47), Zr/Hf (avg. 46.01) and the rare
earth elements (REE) contents of tektites of this study are similar to the
average upper continental crust. From the chemical composition, it is
suggested that the tektites in this study are derived from a similar
parental terrestrial sedimentary deposit which may be related to
post-Archean upper crustal rocks. The tektites from the Maoming area
have high positive εSr(0) values, ranging from 176.9 to 190.5, which
indicate that the parental material for these tektites has a Sr isotopic
composition similar to old terrestrial sedimentary rocks and that they
were not dominantly derived from recent young sediments (such as soil
or loess). The Sr isotopic data obtained by the present study support the
conclusion proposed by Blum et al. (1992)[1] that the depositional age
of sedimentary target materials is close to 170 Ma (Jurassic). Mixing
calculations based on the model proposed by Ho and Chen (1996)[2]
for various amounts and combinations of target rocks indicate that the
best fit for tektites from the Maoming area is a mixture of 40% shale,
30% greywacke, and 30% quartzite.
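Such a mixing calculation amounts to a proportion-weighted average of endmember compositions. The endmember values below are placeholders for illustration, not the data of Ho and Chen (1996):

```python
import numpy as np

# Hypothetical endmember compositions (wt% SiO2, Al2O3, FeO) for the three
# target rocks; a real mixing model uses full major/trace element sets.
endmembers = {
    "shale":     np.array([62.0, 18.0, 6.5]),
    "greywacke": np.array([68.0, 14.0, 5.0]),
    "quartzite": np.array([95.0,  2.0, 1.0]),
}
# Best-fit proportions reported for the Maoming tektites.
weights = {"shale": 0.40, "greywacke": 0.30, "quartzite": 0.30}

# The modelled tektite composition is the proportion-weighted average
# of the endmember compositions.
mixture = sum(w * endmembers[rock] for rock, w in weights.items())
```

In practice the proportions are varied over a grid and the combination minimizing the misfit to the measured tektite composition is retained as the best fit.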
Abstract: In this paper, a numerical solution based on sinc
functions is used for finding the solution of boundary value problems
which arise from problems in the calculus of variations. This
approximation reduces the problems to an explicit system of algebraic
equations. Some numerical examples are also given to illustrate the
accuracy and applicability of the presented method.
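The sinc basis underlying such methods is the Whittaker cardinal function. Below is a minimal sketch of sinc interpolation on a uniform grid; the paper's full collocation scheme for variational boundary value problems involves additional quadrature and boundary transformations not shown here.

```python
import numpy as np

def sinc_interpolate(f_vals, h, x):
    """Whittaker cardinal (sinc) interpolation on the uniform grid k*h:
    C(f, h)(x) = sum_k f(k*h) * sinc((x - k*h) / h)."""
    k = np.arange(len(f_vals)) - len(f_vals) // 2
    nodes = k * h
    return sum(fk * np.sinc((x - xk) / h) for fk, xk in zip(f_vals, nodes))

h = 0.25
nodes = (np.arange(41) - 20) * h
f_vals = np.exp(-nodes**2)            # sample a smooth, rapidly decaying f
approx = sinc_interpolate(f_vals, h, 0.1)
err = abs(approx - np.exp(-0.1**2))   # error at an off-grid point
```

The expansion reproduces the sampled values exactly at the grid nodes, and for smooth, rapidly decaying functions the off-grid error decreases exponentially as h shrinks, which is the source of the method's accuracy.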
Abstract: One of the purposes of the robust method of
estimation is to reduce the influence of outliers in the data, on the
estimates. The outliers arise from gross errors or contamination from
distributions with long tails. The trimmed mean is a robust estimate.
This means that it is not sensitive to violation of distributional
assumptions of the data. It is called an adaptive estimate when the
trimming proportion is determined from the data rather than being
fixed a priori.
The main objective of this study is to find out the robustness
properties of the adaptive trimmed means in terms of efficiency, high
breakdown point and influence function. Specifically, it seeks to find
out the magnitude of the trimming proportion of the adaptive
trimmed mean which will yield efficient and robust estimates of the
parameter for data which follow a modified Weibull distribution with
parameter λ = 1/2 , where the trimming proportion is determined by a
ratio of two trimmed means defined as the tail length. Secondly, the
asymptotic properties of the tail length and the trimmed means are
also investigated. Finally, a comparison is made of the efficiency of
the adaptive trimmed means, in terms of the standard deviation, between
data-determined trimming proportions and proportions fixed a priori.
The asymptotic tail lengths, defined as the ratio of two trimmed
means, and the asymptotic variances were computed using the formulas
derived. The values of the standard deviations of the derived tail
lengths for samples of size 40 simulated from a Weibull distribution
were computed over 100 iterations using a computer program written
in Pascal.
The findings of the study revealed that the tail lengths of the
Weibull distribution increase in magnitude as the trimming proportions
increase; that the measure of the tail length and the adaptive trimmed
mean are asymptotically independent as the number of observations n
approaches infinity; that the tail length is asymptotically distributed
as the ratio of two independent normal random variables; and that the
asymptotic variances decrease as the trimming proportions increase.
The simulation study revealed empirically that the standard error of
the adaptive trimmed mean using the ratio of tail lengths is smaller,
for different values of the trimming proportion, than its counterpart
with the trimming proportions fixed a priori.
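The adaptive scheme can be sketched as follows. The particular trimming levels used in the tail-length ratio and the cutoff values for choosing the trimming proportion are illustrative assumptions, not the study's exact definitions.

```python
import numpy as np

def trimmed_mean(x, alpha):
    """Symmetric alpha-trimmed mean: drop floor(alpha*n) points per tail."""
    x = np.sort(np.asarray(x))
    g = int(np.floor(alpha * len(x)))
    return x[g:len(x) - g].mean() if g > 0 else x.mean()

def tail_length(x):
    """Tail-length measure as a ratio of two trimmed means
    (the trimming levels 0.05 and 0.25 are assumptions here)."""
    return trimmed_mean(x, 0.05) / trimmed_mean(x, 0.25)

def adaptive_trimmed_mean(x):
    """Pick the trimming proportion from the data via the tail length,
    using illustrative cutoffs, then return the trimmed mean."""
    q = tail_length(x)
    alpha = 0.05 if q < 1.1 else (0.10 if q < 1.3 else 0.20)
    return trimmed_mean(x, alpha)

rng = np.random.default_rng(2)
sample = rng.weibull(0.5, size=40)   # heavy-tailed Weibull, shape 1/2
est = adaptive_trimmed_mean(sample)
```

A heavier tail inflates the lightly trimmed mean relative to the heavily trimmed one, so the ratio grows and the rule responds by trimming more aggressively.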
Abstract: This work involved the use of phytoremediation to
remediate an aged soil contaminated with polychlorinated biphenyls
(PCBs). Tests at microcosm scale were prepared using soil samples
that had been collected in an industrial area, with a total PCB
concentration of about 250 μg kg-1. Medicago sativa and Lolium
italicum were the species selected in this study, which serves as a
"feasibility test" for full-scale remediation. The experiment was
carried out with the addition of a mixture of randomly methylated
beta-cyclodextrins (RAMEB). At the end of the experiment, analysis
of soil samples showed that, in general, the presence of plants led
to higher degradation of most congeners with respect to non-vegetated
soil. The efficiencies of the two plant species were comparable and
were improved by RAMEB addition, with a final reduction of total
PCBs of nearly 50%. The removal percentage of PCBs progressively
decreased with increasing chlorination of the congeners.
Abstract: This article is devoted to the numerical solution of
large-scale quadratic eigenvalue problems. Such problems arise in
a wide variety of applications, such as the dynamic analysis of
structural mechanical systems, acoustic systems, fluid mechanics,
and signal processing. We first introduce a generalized second-order
Krylov subspace based on a pair of square matrices and two initial
vectors and present a generalized second-order Arnoldi process for
constructing an orthonormal basis of the generalized second-order
Krylov subspace. Then, by using the projection technique and the
refined projection technique, we propose a restarted generalized
second-order Arnoldi method and a restarted refined generalized
second-order Arnoldi method for computing some eigenpairs of
large-scale quadratic eigenvalue problems. Some theoretical results
are also presented, along with numerical examples that illustrate the
effectiveness of the proposed methods.
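For context, a quadratic eigenvalue problem (λ²M + λC + K)x = 0 is classically reduced to a doubled-size generalized eigenproblem by linearization. The small dense sketch below uses that standard companion reduction with numpy, not the restarted Arnoldi methods proposed in the paper, which are designed for the large sparse case.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
M = np.eye(n)                              # mass matrix (identity for simplicity)
C = 0.1 * np.eye(n)                        # light proportional damping
K = rng.standard_normal((n, n))
K = K @ K.T + n * np.eye(n)                # SPD stiffness matrix

# First companion linearization of (lam^2 M + lam C + K) x = 0:
#   [ 0   I ] [ x  ]         [ I  0 ] [ x  ]
#   [-K  -C ] [lam*x] = lam  [ 0  M ] [lam*x]
A = np.block([[np.zeros((n, n)), np.eye(n)], [-K, -C]])
B = np.block([[np.eye(n), np.zeros((n, n))], [np.zeros((n, n)), M]])
evals, evecs = np.linalg.eig(np.linalg.solve(B, A))

# Residual check: each recovered pair (lam, x) should satisfy the
# original quadratic problem to machine precision.
residuals = []
for lam, v in zip(evals, evecs.T):
    x = v[:n]                              # top block carries the QEP eigenvector
    r = (lam**2 * M + lam * C + K) @ x
    residuals.append(np.linalg.norm(r) / np.linalg.norm(x))
max_residual = max(residuals)
```

The doubling of dimension is exactly why projection onto a second-order Krylov subspace, as in the paper, is attractive at large scale: it avoids forming and factoring the 2n-by-2n pencil.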
Abstract: The Sphere Method is a flexible interior point algorithm for linear programming problems, developed mainly by Professor Katta G. Murty. It consists of two steps, the centering step and the descent step. The centering step is the most expensive part of the algorithm. For this centering step we propose some improvements, such as introducing two or more initial feasible solutions and solving for the more favorable new solution by objective value while rigorously updating the feasible region, along with some ideas integrated into the descent step. An illustration is given confirming the advantage of the proposed procedure.
Abstract: In a time of globalisation, growing uncertainty, ambiguity and change, traditional ways of doing business are no longer sufficient, and it is important to consider non-conventional methods and approaches to release creativity and facilitate innovation and growth. Thus, creative industries, as a natural source of creativity and innovation, draw particular attention. This paper explores the feasibility of building creative partnerships between the creative industries and business, and brings attention to the mutual benefits derived from such partnerships. Design/approach: This paper is a theoretical exploration of projects, practices and research findings addressing collaboration between the creative industries and business. Thus, it concerns the creative industries, arts, business and their representatives in order to define the requirements for creative partnerships to work and succeed. Findings: Current practices of engaging in arts-business partnerships are still very few, although most creative partnerships have proved to be highly valuable and mutually beneficial. Certain conditions must be provided in order to benefit from arts-business creative synergy. Originality/value: By integrating different sources of literature, this article provides a base for conducting empirical research in several dimensions of arts-business partnerships.
Abstract: Formaldehyde is an illegal chemical substance used
for food preservation in fish and vegetables. It can promote
carcinogenesis. Superoxide dismutases are important antioxidative
enzymes that catalyze the dismutation of the superoxide anion into
oxygen and hydrogen peroxide. The resulting level of oxidative
stress in formaldehyde-treated lymphocytes was investigated. Human
lymphocytes were treated with formaldehyde at concentrations of 0,
20, 40, 60, 80 and 120 μmol/L for 12 hours. After the 12-hour
treatment, the change in superoxide dismutase activity was measured
in the formaldehyde-treated lymphocytes. The results showed that
formaldehyde concentrations of 60, 80 and 120 μmol/L significantly
decreased superoxide dismutase activities in lymphocytes (P < 0.05).
The change in superoxide dismutase activity in formaldehyde-treated
lymphocytes may serve as a biomarker for detecting cellular injury,
such as damage to DNA, due to formaldehyde exposure.
Abstract: Climate Change, an issue of great knowledge complexity, has a significant impact on human existence and has become an essential concern. Specific national policies, some of which address educational aspects, have been published to confront this imperative problem. Accordingly, this study aims to analyze and integrate the relationship between Climate Change and environmental education, and to apply the perspective of the concept map to represent the knowledge contents and structures of Climate Change; by doing so, the knowledge contents of Climate Change can be represented in a more comprehensive way and used as a tool for environmental education. The method adopted for this study is a knowledge conversion model providing a platform for experts and teachers, who were the participants in this study, to cooperate and combine each participant's standpoint into a complete knowledge framework that forms the foundation for structuring the concept map. The result of this research contains the important concepts, the precise propositions and the entire concept map representing the robust concepts of Climate Change.
Abstract: The concept of privacy, seen in connection with the consumer's private space and personalization, has recently gained greater importance as a consequence of the increasing marketing efforts of organizations based on the capturing, processing and usage of consumers' personal data. The paper intends to provide a definition of the consumer's private space based on the types of personal data the consumer is willing to disclose, to assess attitudes toward personalization, and to identify the means preferred by consumers to control their personal data and defend their private space. Several implications generated by the definition of the consumer's private space are identified and weighted from both the consumers' and the organizations' perspectives.