Abstract: Unsatisfactory experiences caused by a shortage of information about the future pay-offs of actual choices yield satisficing decision-making. This research examines, for the first time in the literature, the motivation behind suboptimal decisions under uncertainty by testing Adam Smith’s and Jeremy Bentham’s assumptions about the nature of the actions that lead to satisficing behavior, in order to clarify the theoretical background of a “consumption-based satisfactory time” concept. The contribution of this paper to the existing literature is threefold. Firstly, it is shown that Adam Smith’s uncertainty is related to the problem of the constancy of ideas and not directly to beliefs. Secondly, possessions, as in Jeremy Bentham’s oeuvre, are assumed to be pleasing insofar as they protect and improve the actual or expected quality of life, so long as they reduce any displeasure due to the undesired outcomes of uncertainty. Finally, each consumption decision incurs its own satisfactory time period, owed to not feeling hungry, being healthy, having transportation, etc. This reveals that the level of satisfaction is indeed a behavioral phenomenon whose value depends on the simultaneous satisfaction derived from all activities.
Abstract: Adopting a greenery approach on architectural sites is an indispensable strategy for improving ecological habitats, reducing the heat-island effect, purifying the air, and relieving surface runoff as well as noise pollution, all in pursuit of a sustainable environment. Whether plant design can attain the best visual quality and ideal carbon dioxide fixation depends on whether greenery is used appropriately according to the nature of the architectural site. To achieve this goal, architects and landscape architects must be provided with sufficient local references. Current greenery studies focus mainly on the urban heat-island effect at a large scale, and most architects still rely on people with years of expertise for the selection and placement of plants at the microclimate scale. Therefore, environmental design, which integrates science and aesthetics, requires fundamental research on landscape environment technology as distinct from building environment technology. By doing so, mutual benefits between green buildings and the environment can be created. This issue is extremely important for the greening design of the sites of green buildings in cities and of various open spaces. The purpose of this study is to establish plant selection and allocation strategies under different levels of building shading. Initially, taking the shading of sunshine on the greened sites as the starting point, the effects of the shade produced by different building types on greening strategies were analyzed. Then, by measuring photosynthetically active radiation (PAR), the relative daily light integral (DLI) was calculated, and a DLI map was established to evaluate the effect of building shading on the established environmental greening, thereby serving as a reference for plant selection and allocation. The results are to be applied in the evaluation of environmental greening of green buildings and in establishing a “right plant, right place” design strategy of multi-level ecological greening for application in urban design and landscape design development, as well as greening criteria to feed back into eco-city green buildings.
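The PAR-to-DLI relationship the abstract relies on can be sketched numerically. The conversion below is the standard one (DLI in mol·m⁻²·day⁻¹ from a mean PAR photon flux density in µmol·m⁻²·s⁻¹); the sample readings and the open-sky reference value are hypothetical, not taken from the study.

```python
# Sketch of the relative-DLI computation described in the abstract.
# DLI (mol/m^2/day) = mean PPFD (umol/m^2/s) * daylight seconds / 1e6.

def dli(mean_ppfd_umol, daylight_hours):
    """Daily light integral from a mean PAR photon flux density."""
    return mean_ppfd_umol * daylight_hours * 3600 / 1e6

def relative_dli(site_dli, open_sky_dli):
    """Fraction of full-sun DLI reaching a shaded greening site."""
    return site_dli / open_sky_dli

# Hypothetical readings: a site shaded by a building vs. open sky.
shaded = dli(mean_ppfd_umol=250, daylight_hours=12)     # 10.8 mol/m^2/day
open_sky = dli(mean_ppfd_umol=1000, daylight_hours=12)  # 43.2 mol/m^2/day
print(round(shaded, 1), round(relative_dli(shaded, open_sky), 2))
```

A map of such relative-DLI values over the site is, in effect, what the abstract calls the DLI Map.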
Abstract: This study discusses a Turkish music education model similar to its Venezuelan counterpart El Sistema, in which socialization and human development are the main goals. The Music for Peace (Baris Icin Muzik) model, founded in 2005 by an idealist humanitarian in Istanbul, started as a pilot project with the accordion and today offers symphonic music education. The program aims to foster social change through free-of-charge music education. In a big city like Istanbul, people in deprived inner-city areas live in poor economic, social, and cultural conditions. In the Edirnekapi district, people have few opportunities to join cultural and social life, such as music or sports. It is believed that this initiative has closed part of this gap by giving children the opportunity to participate in social and cultural life. This study seeks to understand what social changes music education can make in children’s lives. In the complimentary music lessons, children work in groups, which helps them learn solidarity, friendship, communion, and sharing. Through the Music for Peace project, children connect with the community; they believe they can succeed in life because they feel loved by their friends, instructors, and families. In short, they feel that they are important, which brings success in life. Additionally, it is believed that this program has achieved success: today approximately 400 children participate in its orchestras and choirs, and some students have been admitted to conservatories. The center is not just a place where they get music lessons but also a place where they socialize, and music education helps children develop a strong sense of identity, self-confidence, and self-esteem.
Abstract: Localization information is crucial for the operation of a WSN. There are principally two types of localization algorithms. Range-based localization algorithms have strict hardware requirements and are thus expensive to implement in practice. Range-free localization algorithms reduce the hardware cost; however, they achieve high accuracy only in ideal scenarios. In this paper, we locate unknown nodes by incorporating the advantages of these two types of methods. The proposed algorithm makes each unknown node select the nearest anchor using the Received Signal Strength Indicator (RSSI) and then choose the two other anchors that are most accurate for estimating its location. Our algorithm improves localization accuracy compared with previous algorithms, as demonstrated by the simulation results.
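The abstract does not give the algorithm’s formulas, so the sketch below only illustrates the two standard building blocks such a hybrid scheme rests on: converting RSSI to a distance estimate with the log-distance path-loss model, and trilaterating from three anchors. All constants (path-loss exponent, reference power) and the anchor layout are assumed for illustration, not taken from the paper.

```python
# Illustrative building blocks of RSSI-assisted localization
# (assumed constants; not the paper's exact algorithm).

def rssi_to_distance(rssi_dbm, p0_dbm=-40.0, n=2.0):
    """Log-distance path-loss model: rssi = p0 - 10*n*log10(d)."""
    return 10 ** ((p0_dbm - rssi_dbm) / (10 * n))

def trilaterate(anchors, dists):
    """Solve for (x, y) from three anchors and estimated distances."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Subtracting pairs of circle equations gives a 2x2 linear system.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

anchors = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
true_pos = (1.0, 2.0)
dists = [((true_pos[0] - x) ** 2 + (true_pos[1] - y) ** 2) ** 0.5
         for x, y in anchors]
print(trilaterate(anchors, dists))  # recovers (1.0, 2.0)
```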
Abstract: New environmental regulations and the increasing market preference for companies that respect the ecosystem have encouraged industry to look for new treatments for its effluents. The sugar industry, one of the largest emitters of environmental pollutants, follows this tendency. Membrane technology is convenient for separating suspended solids, colloids, and high-molecular-weight materials present in sugar industry wastewater. The idea is to microfilter the wastewater so that the permeate passes through the membrane and becomes available for recycling and reuse in the sugar manufacturing process. For microfiltration of this effluent, a tubular ceramic membrane with a pore size of 200 nm was used at a transmembrane pressure in the range of 1–3 bar and a flow rate in the range of 50–150 l/h. A Kenics static mixer was used for permeate flux enhancement. Turbidity and suspended solids were removed, and the permeate flux was continuously monitored during the microfiltration process. The flux achieved after 90 minutes of microfiltration was in the range of 50–70 l/m2h. The turbidity decrease obtained was in the range of 50–99%, and the suspended solids were completely removed.
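For reference, the permeate flux figures quoted above follow the usual definition J = V/(A·Δt), reported in l/(m²·h). The membrane area and permeate volume in the snippet are made up to illustrate the unit handling, not taken from the study.

```python
# Permeate flux J = permeate volume / (membrane area * elapsed time),
# in l/(m^2*h). Numbers below are illustrative only.

def permeate_flux(volume_l, area_m2, hours):
    return volume_l / (area_m2 * hours)

# Hypothetical example: 3.0 l collected over 1.5 h through 0.036 m^2
# of membrane, which lands inside the reported 50-70 l/m2h range.
print(round(permeate_flux(3.0, 0.036, 1.5), 1))  # 55.6
```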
Abstract: This research presents the main ideas for implementing an intelligent system composed of communicating wireless sensors that measure environmental data linked to drought indicators (such as air temperature, soil moisture, etc.). In addition, a spatio-temporal database communicating with a Web mapping application is proposed for real-time monitoring, 24 hours a day, 7 days a week, to allow the time evolution of the drought parameters to be screened and extracted. The system thus helps detect areas affected by drought. Spatio-temporal conceptual models address the needs of users who manage soil water content for irrigation, fertilization, or other activities aimed at increasing crop yield. Indeed, spatio-temporal conceptual models give users a readable, easy-to-apprehend data diagram. Combined with socio-economic information, the system helps identify the people impacted by the phenomenon and the corresponding severity, especially as this information is accessible to farmers and stakeholders themselves. The study will be applied in the Siliana watershed in northern Tunisia.
Abstract: Localization of nodes is one of the key issues of Wireless Sensor Networks (WSNs) that has gained wide attention in recent years. Existing localization techniques can generally be categorized into two types: range-based and range-free. Compared with range-based schemes, range-free schemes are more cost-effective, because no additional ranging devices are needed. As a result, we focus our research on range-free schemes. In this paper, we study three range-free localization algorithms to compare their localization error and energy consumption. The centroid algorithm requires that a normal node have at least three neighbor anchors, while the DV-hop algorithm does not have this requirement. The third algorithm studied is the amorphous algorithm, which is similar to DV-hop; the idea is to calculate the hop distance between two nodes instead of the linear distance between them. The simulation results show that the localization accuracy of the amorphous algorithm is higher than that of the other algorithms, while its energy consumption does not increase too much.
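Of the three schemes compared, the centroid algorithm is simple enough to state in a few lines: a node with at least three neighbor anchors takes the mean of their coordinates as its position estimate. The sketch below is a generic illustration of that idea, not the paper’s simulation code.

```python
# Centroid localization: estimate an unknown node's position as the
# mean of the coordinates of its neighbor anchors (at least 3 needed).

def centroid_localize(neighbor_anchors):
    if len(neighbor_anchors) < 3:
        raise ValueError("centroid scheme needs at least 3 neighbor anchors")
    xs = [x for x, _ in neighbor_anchors]
    ys = [y for _, y in neighbor_anchors]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

print(centroid_localize([(0.0, 0.0), (2.0, 0.0), (1.0, 3.0)]))  # (1.0, 1.0)
```

DV-hop and the amorphous algorithm replace the geometric mean with hop-count-based distance estimates, which is what removes the three-anchor neighborhood requirement.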
Abstract: To help the expert validate association rules extracted from data, several quality measures have been proposed in the literature. We distinguish two categories: objective and subjective measures. The former depend on a fixed threshold and on the quality of the data from which the rules are extracted. The latter consist of providing the expert with tools to explore and visualize rules during the evaluation step. However, the number of extracted rules to validate remains high, so manual rule validation is very hard. To solve this problem, we propose in this paper a semi-automatic method to assist the expert during association rule validation. Our method uses rule-based classification as follows: (i) we transform association rules into classification rules (classifiers); (ii) we use the generated classifiers for data classification; (iii) we visualize association rules with their classification quality to inform the expert and assist him during the validation process.
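Steps (i) and (ii) of the method can be sketched as follows: only rules whose consequent bears on a class-label attribute are kept, and a record is classified by the first kept rule whose antecedent it satisfies. The rule representation and the toy data below are assumed for illustration; the paper’s actual encoding may differ.

```python
# Sketch of steps (i)-(ii): filter association rules whose consequent
# is the class attribute, then classify records by first matching rule.

def to_classifiers(assoc_rules, class_attr):
    """(i) Keep only rules that conclude on the class attribute."""
    return [(ante, cons[1]) for ante, cons in assoc_rules
            if cons[0] == class_attr]

def classify(record, classifiers, default=None):
    """(ii) Apply the first rule whose antecedent holds in the record."""
    for antecedent, label in classifiers:
        if all(record.get(attr) == val for attr, val in antecedent):
            return label
    return default

rules = [
    ((("outlook", "sunny"), ("windy", False)), ("play", "yes")),
    ((("outlook", "rainy"),), ("play", "no")),
    ((("outlook", "sunny"),), ("humidity", "high")),  # dropped: not a class rule
]
clf = to_classifiers(rules, "play")
print(classify({"outlook": "sunny", "windy": False}, clf))  # yes
print(classify({"outlook": "rainy"}, clf))                  # no
```

Step (iii) would then annotate each original association rule with the accuracy its classifier achieves on labeled data, giving the expert a ranking to validate against.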
Abstract: The purpose of this article is to survey Security Studies, presenting its theories and concepts in order to understand the role they have played in interpreting the changes and continuities of the world order and their impact on policies addressing the problems of the 21st century. The aim is to build a bridge between security studies as a subfield and the meaning that has been given to the world order. The idea of epistemic communities serves as a methodological proposal for the different research programs in security studies, showing their influence on the realities of states, intergovernmental organizations, and transnational forces as they move to implement, perpetuate, and project a vision of the world order.
Abstract: Bloom’s Taxonomy has changed over the years. This paper concerns the revisions that have occurred in both its facts and its terms. It also contains case studies of using the cognitive domain of Bloom’s taxonomy in teaching geometric solids to secondary school students, affective objectives in a creative workshop for adults, and psychomotor objectives in fixing a malfunctioning refrigerator lamp. It also points to the important role of classifying objectives in adult education as a way to prevent memory loss.
Abstract: Biodiesel production from vegetable oil yields glycerol as a by-product, amounting to about 10% of the biodiesel produced. This glycerol needs an alternative handling route so that it does not become a waste that pollutes the environment. One solution is to process glycerol into polyglycidyl nitrate (PGN). PGN is synthesized from glycerol in a three-step reaction sequence: nitration of glycerol, cyclization of 1,3-dinitroglycerine, and polymerization of glycidyl nitrate. The optimum conditions for the nitration of glycerol with nitric acid are not yet known, and a thermodynamic feasibility study should be carried out before running experiments in the laboratory. The aim of this study was to determine the parameters that affect the nitration of glycerol with nitric acid and to choose the operating conditions. Many parameters were simulated to verify the feasibility of experimenting under conditions that would give the highest conversion to 1,3-dinitroglycerine and to identify the ideal conditions for obtaining it. The parameters studied to obtain the highest conversion to 1,3-dinitroglycerine were the molar ratio of nitric acid to glycerol, the reaction temperature, the molar ratio of glycerol to dichloromethane, and the pressure. The highest conversion was obtained at a nitric acid/glycerol molar ratio between 2/1 and 5/1, a reaction temperature of 5–25 °C, and a pressure of 1 atm. The parameters that need further study to obtain the highest conversion to 1,3-DNG are the nitric acid/glycerol molar ratio and the reaction temperature.
Abstract: The efficient and economic allocation of resources is one of the main goals in the field of production planning and control. Nowadays, a new variable is gaining importance throughout the planning process: energy. Energy efficiency has already been widely discussed in the literature, but with a strong focus on reducing the overall amount of energy used in production. This paper provides a brief systematic approach showing how energy-supply orientation can be used for energy-cost-efficient production planning, thus combining the ideas of energy efficiency and energy flexibility.
Abstract: The growth of organic farming practices in the last few decades continues to stimulate international debate about this alternative food market. As part of a PhD research project on embeddedness in Alternative Food Networks (AFNs), this paper focuses on the promotional aspects of organic farm websites from the Madrid region. As a theoretical tool, knowledge categories drawn from the geographic studies literature are used to classify the many ideas expressed in the web pages. By analysing the texts and pictures of 30 websites, the study aims to examine how, and to what extent, actors from the organic world communicate to potential customers their personal beliefs about farming practices, product qualities, and ecological and social benefits. Moreover, the paper raises the question of whether organic farming laws and regulations lack completeness regarding the social and cultural aspects of food.
Abstract: Background in music analysis: Traditionally, when we think about a composer’s sketches, the chances are that we are thinking in terms of the working out of detail rather than the evolution of an overall concept. Since music is a “time art,” it follows that questions of form cannot be entirely detached from considerations of time. One could say that composers tend to regard time either as a place filled gradually and partly intuitively, or they look for a specific strategy to occupy it. One thing that sheds light on Stockhausen’s compositional thinking is his frequent use of “form schemas,” often a single-page representation of the entire structure of a piece.
Background in music technology: Sonic Visualiser (SV) is a program used to study musical recordings. It is an open-source application for viewing, analyzing, and annotating music audio files. It contains a number of visualisation tools designed with useful default parameters for musical analysis. Additionally, the Vamp plugin format of SV supports analyses such as structural segmentation.
Aims: The aim of this paper is to show how SV may be used to obtain a better understanding of a specific musical work, and how the compositional strategy impacts musical structures and musical surfaces. It is known that “traditional” music-analytic methods do not allow one to indicate the interrelationships between the musical surface (which is perceived) and the underlying musical/acoustical structure.
Main Contribution: Stockhausen dealt with the most diverse musical problems by the most varied methods. One characteristic that he never ceased to place at the center of his thought and work was the quest for a new balance founded upon an acute connection between speculation and intuition. In the case of Mikrophonie I (1964) for tam-tam and 6 players, Stockhausen makes a distinction between the “connection scheme,” which indicates the ground rules underlying all versions, and the form scheme, which is associated with a particular version. The preface to the published score includes both the connection scheme and a single instance of a “form scheme,” which is what one can hear on the CD recording. In the current study, the insight into the compositional strategy chosen by Stockhausen is compared with the auditory image, that is, with the perceived musical surface. Stockhausen’s musical work is analyzed in terms of both melodic/voice and timbre evolution.
Implications: The current study shows how musical structures determine the musical surface. The general assumption is that, while listening to music, we can extract basic kinds of musical information from musical surfaces. It is shown that interactive strategies of musical structure analysis can offer a very fruitful way of looking directly into certain structural features of music.
Abstract: In this paper, a new method for creating a center of motion is proposed. The new method uses cables, which makes it very useful in robots because it is light and easy to assemble. The method is particularly useful in robots that need to be in contact with objects, as described in the following. The accuracy of the idea is demonstrated by two experiments. This system could be used in robots that need a fixed point of contact with an object while making a circular motion.
Abstract: The notion of power and gender domination is one of the inseparable thematic aspects of postmodern literature. The reasons for its importance have been discussed frequently since the rise of Michel Foucault and his insight into the circulation of power and the transgression of forces. Language and society operate as the basic grounds for this study, as all human beings are bound to the set of rules and norms that shape them into acceptable form in the macrocosm. How different genders in different positions behave and react to the provocation of social forces and the superiority of one another is of great interest to writers and literary critics. Mamet’s works are notable for their controversial but timely themes, which illustrate human conflict with society and the greed for power. Many critics, such as Christopher Bigsby and Harold Bloom, have discussed Mamet and his ideas in recent years. This paper is a study of Oleanna, Mamet’s masterpiece about the teacher-student relationship and the circulation of power between a man and a woman. He shows the very fragile boundaries of gender domination and the downfall of speech as the consequence of transgression and freedom. The failure of the language the teacher uses, and the abuse of his own words by a student who seeks superiority and knowledge, are the main subjects of the discussion. Supported by the ideas of Foucault, the language Mamet uses to present his characters becomes the fundamental premise of this study. As a result, language becomes both the means of achievement and of downfall.
Abstract: This paper presents an application of a “Systematic Soft Domain Driven Design Framework” as a soft systems approach to domain-driven design for information systems development. The framework uses SSM as a guiding methodology within which we have embedded a sequence of design tasks based on UML, leading to the implementation of a software system using the Naked Objects framework. This framework has been used in action research projects involving the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within this framework, Soft Systems Methodology (SSM) is used as a guiding methodology to explore the problem situation and to develop the domain model using UML for the given business domain. The framework was proposed and evaluated in our previous work; in this paper, a real case study, an “Information Retrieval System for academic research,” is used to show further practice and evaluation of the framework in a different business domain. We argue that there are advantages in combining and using techniques from different methodologies in this way for business domain modelling. The framework is overviewed and justified as a multimethodology using Mingers’ multimethodology ideas.
Abstract: The future development of science is seen in interdisciplinary areas such as biomedical engineering. Self-assembled structures, similar to stem cell niches, would inhibit the fast division process and subsequently capture stem cells from the blood flow. By means of surface topography, stiffness, and microstructure, progenitor cells should be differentiated towards the formation of an endothelial cell monolayer that effectively inhibits activation of the coagulation cascade. The idea of developing such material surfaces has met with the interest of clinical institutions, which support the development of science in this area and are waiting for scientific solutions that could contribute to the development of heart assist systems. This would improve the efficiency of treatment for patients with myocardial failure supported by artificial heart assist systems. Innovative materials would enable the redesign, in post-project activity, of ventricular heart assist devices.
Abstract: It is a well-established fact that terrorism is one of the foremost threats to present-day international security. Creating tools or mechanisms for confronting it effectively and efficiently will only be possible through an objective assessment of the phenomenon. To that end, this paper has three main objectives. Firstly, it sets out to find the reasons that have prevented the establishment of a universally accepted definition of terrorism, and consequently tries to outline the main features defining the face of the terrorist threat in order to discover the fundamental goals of what is now a serious blight on world society. Secondly, it tries to explain the differences between a terrorist movement and a terrorist organisation, and the reasons for which a terrorist movement can be led to transform itself into an organisation; after analysing these motivations and the characteristics of a terrorist organisation, an example of the latter is succinctly analysed to help the reader understand the ideas expressed. Lastly, it discovers and exposes the factors that can lead to the appearance of terrorist tendencies, and discusses the most efficient and effective responses that can be given to this global security threat.
Abstract: Nowadays, cloud environments are becoming a necessity for companies. This new technology offers the opportunity to access data anywhere and anytime. It also provides optimized and secure access to resources and greater security for the data stored on the platform. However, some companies do not trust cloud providers, believing that providers can access and modify confidential data such as bank accounts. Much work has been done in this context, concluding that encryption performed by the provider ensures confidentiality, while overlooking the fact that the cloud provider can also decrypt the confidential resources. The best solution here is to apply some operations to the data before sending it to the cloud provider, in order to make it unreadable to the provider. The principal idea is to allow the user to protect his data with his own methods. In this paper, we demonstrate our approach and show that it is more efficient, in terms of execution time, than some existing methods. This work aims at enhancing providers’ quality of service and ensuring customers’ trust.
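The principle the abstract argues for, transforming data on the client before it ever reaches the provider, can be illustrated with a minimal stdlib-only sketch. The keystream construction below (SHA-256 in counter mode) is a toy stand-in chosen so the example runs without third-party libraries; it is not the authors’ scheme and should not be used as production cryptography.

```python
# Toy client-side encryption: the user derives a keystream locally and
# XORs the data before upload, so the provider only ever sees ciphertext.
# The SHA-256-in-counter-mode keystream is illustrative, NOT production crypto.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def xor_transform(key: bytes, data: bytes) -> bytes:
    """Same operation encrypts and decrypts (XOR is its own inverse)."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

secret = b"IBAN: XX00 1234 5678"
cipher = xor_transform(b"user-held key", secret)  # what the cloud stores
print(xor_transform(b"user-held key", cipher))    # client recovers plaintext
```

The design point matches the abstract: the key never leaves the user, so the provider can store and serve the blob but cannot read it.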