Abstract: This research paper presents guidelines on how to integrate
social media into higher education courses. The research
methodology used a survey approach. The research instrument was a
questionnaire about guidelines on how to integrate social media into
higher education courses. Thirty-one lecturers completed the
questionnaire. The data were scored by frequency and percentage.
The research results, reflecting the lecturers’ opinions concerning the
integration of social media into higher education courses, were as follows: 1)
Lecturers deem that the most suitable learning theory is Collaborative
Learning. 2) Lecturers consider that the most important learning and
innovation skills in the 21st century are communication and
collaboration skills. 3) Lecturers think that the most suitable
evaluation technique is authentic assessment. 4) Lecturers consider
that the most appropriate portion used as blended learning should be
70% in the classroom setting and 30% online.
Abstract: In this paper, we report the development of a device
for diagnosing the state of the cardiovascular system and an associated
automated workstation for large-scale collection and analysis of
medical measurement data. It is shown that the optimal design for the
monitoring device is a wristband, as it represents an engineering trade-off
between accuracy and usability. The monitoring device is based on an
infrared reflective photoplethysmographic sensor, which allows
collecting multiple physiological parameters, such as heart rate and
pulse wave characteristics. The developed device uses a BLE interface
to transmit medical and supplementary data to a coupled
mobile phone, which processes the data and sends them to the doctor's
automated workstation. Testing of this experimental model
confirmed the applicability of the proposed approach.
Abstract: Discursive practices enacted by educators in
kindergarten create a blueprint for how the educational trajectories of
students with disabilities are constructed. This two-year ethnographic
case study critically examines educators’ relationships with students
considered to present challenging behaviors in one kindergarten
classroom located in a predominantly White middle class school
district in the Northeast of the United States. Focusing on the
language and practices used by one special education teacher and
three teaching assistants, this paper analyzes how teacher responses
to students’ behaviors construct and position students over one year
of kindergarten education. Using critical discourse analysis, it shows
that educators understand students’ behaviors as deficits requiring
consequences. This study highlights how educators’ responses reflect
students' individual characteristics including family background,
socioeconomic status and ability status. This paper offers an in-depth analysis
of two students’ stories, which evidence that the language used by
educators amplifies the social positioning of students within the
classroom and creates a foundation for who they are constructed to
be. Through exploring routine language and practices, this paper
demonstrates that educators outlined a blueprint of kindergartners,
which positioned students as learners in ways that became the ground
for either a limited or a promising educational pathway for them.
Abstract: This research presents the main ideas for implementing an
intelligent system composed of communicating wireless sensors
measuring environmental data linked to drought indicators (such as
air temperature, soil moisture, etc.). In addition, the setting
up of a spatio-temporal database communicating with a Web mapping
application is proposed for real-time monitoring, 24 hours a day, 7
days a week, to allow the screening of the time evolution of drought
parameters and their extraction. This system thus
helps detect areas affected by drought.
Spatio-temporal conceptual models address the needs of users who
must manage soil water content for irrigation, fertilization or other
activities pursuing crop yield augmentation. Indeed, spatio-temporal
conceptual models enable users to obtain a readable,
easy-to-apprehend data diagram. Combined with socio-economic
information, the system helps identify people impacted by the phenomenon
and the corresponding severity, especially as this information is
accessible to farmers and stakeholders themselves. The study will be
applied in the Siliana watershed, northern Tunisia.
Abstract: Indonesia has experienced annual forest fires that have
rapidly destroyed and degraded its forests. Fires in the peat swamp
forests of Riau Province have set the stage for problems to worsen,
this being the ecosystem most prone to fires (which are also the most
difficult to extinguish). Despite various efforts to curb deforestation
and forest degradation processes, severe forest fires are still
occurring. To find an effective solution, the basic causes of the
problems must be identified. It is therefore critical to have an
in-depth understanding of the underlying causal factors that have
contributed to deforestation and forest degradation as a whole, in
order to attain reductions in their rates. An assessment of the drivers of deforestation and forest
degradation was carried out, in order to design and implement
measures that could slow these destructive processes. Research was
conducted in Giam Siak Kecil–Bukit Batu Biosphere Reserve
(GSKBB BR), in the Riau Province of Sumatera, Indonesia. A
biosphere reserve was selected as the study site because such reserves
aim to reconcile conservation with sustainable development. A
biosphere reserve should promote a range of local human activities,
together with development values that are in line spatially and
economically with the area conservation values, through use of a
zoning system. Moreover, GSKBB BR is an area with vast peatlands,
and is experiencing forest fires annually. Various factors were
analysed to assess the drivers of deforestation and forest degradation
in GSKBB BR; data were collected from focus group discussions
with stakeholders, key informant interviews with key stakeholders,
field observation and a literature review. Landsat satellite imagery was used to map forest-cover changes
for various periods. Analysis of Landsat images taken during the
period 2010-2014 revealed that, within the non-protected area of the core
zone, there was a trend towards decreasing peat swamp forest areas,
increasing land clearance, and increasing areas of community oil-palm
and rubber plantations. Fire was used for land clearing, and most
of the forest fires occurred in the most populous area (the transition
area). The study found a relationship between the deforested/
degraded areas, and certain distance variables, i.e. distance from
roads, villages and the borders between the core area and the buffer
zone. The further the distance from the core area of the reserve, the
higher was the degree of deforestation and forest degradation. Research findings suggested that agricultural expansion may be
the direct cause of deforestation and forest degradation in the reserve,
whereas socio-economic factors were the underlying driver of forest
cover changes; these factors consist of a combination of socio-cultural,
infrastructural, technological, institutional (policy and governance), demographic (population pressure) and economic
(market demand) considerations. These findings indicated that local
factors/problems were the critical causes of deforestation and
degradation in GSKBB BR. This research therefore concluded that
reductions in deforestation and forest degradation in GSKBB BR
could be achieved through ‘local actor’-tailored approaches such as
community empowerment.
Abstract: To help the expert validate association rules
extracted from data, several quality measures have been proposed in the
literature. We distinguish two categories: objective and subjective
measures. The first depends on a fixed threshold and on the quality
of the data from which the rules are extracted. The second consists
of providing the expert with tools to explore and
visualize rules during the evaluation step. However, the number of
extracted rules to validate remains high, making manual rule mining
a very hard task. To solve this problem, we propose, in this
paper, a semi-automatic method to assist the expert during
association rule validation. Our method uses rule-based
classification as follows: (i) we transform association rules into
classification rules (classifiers); (ii) we use the generated classifiers
for data classification; (iii) we visualize association rules with their
classification quality to inform the expert and assist him
during the validation process.
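The three-step method above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the rule representation (an antecedent item set paired with a class label) and the toy data are assumptions made for the example.

```python
def rule_to_classifier(antecedent, label):
    """Turn an association rule {antecedent} -> label into a one-rule classifier."""
    antecedent = frozenset(antecedent)
    def classify(record):
        # The rule "fires" only if every antecedent item is present in the record.
        return label if antecedent <= set(record) else None
    return classify

def classification_quality(classifiers, dataset):
    """Fraction of records each rule-based classifier labels correctly."""
    quality = []
    for clf in classifiers:
        hits = sum(1 for record, true_label in dataset
                   if clf(record) == true_label)
        quality.append(hits / len(dataset))
    return quality

# Toy market-basket data: (items, class label) -- hypothetical values.
data = [({"bread", "butter"}, "dairy_buyer"),
        ({"bread", "jam"}, "other"),
        ({"milk", "butter"}, "dairy_buyer")]

clf = rule_to_classifier({"butter"}, "dairy_buyer")
print(classification_quality([clf], data))
```

The quality score attached to each rule is what would then be visualized for the expert during validation.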
Abstract: The development of transport systems has negative
impacts on the environment, although it has beneficial effects on
society. Car-oriented policy has caused many problems, such as the
spectacular growth of fuel consumption and hence the very large increase
in urban pollution, traffic congestion in certain places and at certain
times, and the increase in the number of accidents. The exhaust emissions
from cars and weather conditions are the main factors that determine
the level of pollution in the urban atmosphere. These conditions lead to
the phenomena of heat transfer and radiation occurring between the
air and the soil surface of any town. These exchanges give rise, in
urban areas, to heat island effects, which correspond to the
appearance of an excess air temperature between the city and its
surrounding space. To this end, we perform a numerical simulation
of the plume generated by car exhaust gases and show that these
gases form a screening effect above the city, which causes the
heat island in the presence of wind flow. This study allows us: 1) to
understand the different mechanisms of interaction between these
phenomena; and 2) to consider appropriate technical solutions to mitigate
the effects of the heat island.
Abstract: Based on the hypothesis that disaster risk is
constructed socially and historically, this article shows the
importance of keeping alive the historical memory of disaster by
means of architectural and urban heritage conservation. This is
illustrated with three examples of Latin American World Heritage
cities, where disasters like floods and earthquakes have shaped urban
form. Therefore, the study of urban form or "Urban Morphology" is
proposed as a tool to understand and analyze urban transformations
with the documentation of the occurrence of disasters. Lessons
learned from such cities may be useful to reduce disaster risk in
contemporary built environments.
Abstract: Public space is essential to strengthen the social and
urban fabric and the social cohesion; there lies the importance of its
study. Hence, the aim of this paper is to analyze the quality of public
space in the XXI century in both quantitative and qualitative terms. In
this article, the concept of public space includes open spaces such as
parks, public squares and walking areas. To make this analysis, we
take Mexico City as the case study. It has a population of nearly 9
million inhabitants and is composed of sixteen boroughs. For this
analysis, we consider both existing public spaces and government
intervention to build new public spaces and improve existing
ones. Results show that, on the one hand, quantitatively there is not
an equitable distribution of public spaces, due both to the growth of
the city itself and to the absence of political will to create
public spaces. Another factor is the evolution of this city, which has
been growing merely in a “patched pattern”, where public space has
played no role at all with a total absence of urban design. On the
other hand, qualitatively, even the boroughs with the most public
spaces have not shown interest in making these spaces qualitatively
inclusive and open to the general population aiming for integration.
Therefore, urban projects that privatize public space seem to be the
rule, rather than efforts to rehabilitate existing public spaces.
Hence, the state should reinforce its role as an agent of
social change, acting for the benefit of the majority of inhabitants
through the promotion of more inclusive public spaces.
Abstract: This study analyzes the critical gaps in the
architecture of European stability and the expected role of the
banking union as the new important step towards completing the
Economic and Monetary Union, which should enable the creation of
a safe and sound financial sector for the euro area market. The single
rulebook together with the Single Supervisory Mechanism and the
Single Resolution Mechanism - as two main pillars of the banking
union, should provide a consistent application of common rules and
administrative standards for supervision, recovery and resolution of
banks – with the final aim of replacing the former bail-out practice
with the bail-in system, through which possible future bank failures
would be resolved with banks' own funds, i.e. with minimal costs for
taxpayers and the real economy. In this way, the vicious circle between
banks and sovereigns would be broken. It would also reduce the
financial fragmentation recorded in the years of crisis as the result of
divergent behaviors in risk premium, lending activities and interest
rates between the core and the periphery. In addition, it should
strengthen the effectiveness of monetary transmission channels, in
particular the credit channels and liquidity spillovers on the money
market, which, due to the fragmentation of the common financial
market, have been significantly impaired in periods of crisis. However,
contrary to all the positive expectations related to the future
functioning of the banking union, major findings of this study
indicate that characteristics of the economic system in which the
banking union will operate should not be ignored. The euro area is an
integration of strong and weak entities with large differences in
economic development, wealth, assets of banking systems, growth
rates and accountability of fiscal policy. The analysis indicates that
low and unbalanced economic growth remains a challenge for the
maintenance of financial stability and this problem cannot be
resolved just by a single supervision. In many countries bank assets
exceed their GDP by several times and large banks are still a matter
of concern, because of their systemic importance for individual
countries and the euro zone as a whole. The creation of the Single
Supervisory Mechanism and the Single Resolution Mechanism is a
response to the European crisis, which has particularly affected
peripheral countries and caused the associated loop between the
banking crisis and the sovereign debt crisis, but has also influenced
banks’ balance sheets in the core countries, as a result of cross-border
capital flows. The creation of the SSM and the SRM should
prevent similar episodes from happening again and should also provide
a new opportunity for strengthening of economic and financial
systems of the peripheral countries. On the other hand, there is a
potential threat that future focus of the ECB, resolution mechanism
and other relevant institutions will be excessively oriented towards
large and significant banks (half of which operate in the
core and most important euro area countries); it therefore remains
questionable to what extent the common resolution funds will be used to rescue less important institutions. Recent geopolitical
developments will be the best indicator of whether the
previously established mechanisms are sufficient to maintain
adequate financial stability in the euro area market.
Abstract: Latin America is probably the region with the greatest
social inequality, in contrast to the number of rights enshrined in its
constitutions. In the last decade of the twentieth century, the region
underwent significant changes towards democratization and constitutional
reform. Through low-key public policy, political leaders activated
participation in the culture of human rights. The struggle for social
rights in Latin America has been constant. Their
enshrinement at the constitutional level has driven the search for their
enforcement. The constitutionalization and judicial protection of these
rights have been crucial in countries like Argentina, Venezuela, Peru
and Colombia. This paper presents an analytical view of the
constitutionalization of social rights in the Latin American context
and their justiciability.
Abstract: Life cycle assessment is a technique to assess the
environmental aspects and potential impacts associated with a
product, process, or service, by compiling an inventory of relevant
energy and material inputs and environmental releases; evaluating the
potential environmental impacts associated with identified inputs and
releases; and interpreting the results to help make a more
informed decision. In this paper, the life cycle assessment of
aluminum and beech wood, two materials commonly used in Egypt
for window frames, is presented, highlighting their benefits and
weaknesses. Window frames of the two materials have been assessed
on the basis of their production, energy consumption and
environmental impacts. It has been found that the climate change impacts of
the aluminum and beech wood windows, for a
reference window (1.2 m × 1.2 m), are 81.7 mPt and -52.5 mPt,
respectively. Among the most important results are: fossil fuel
consumption, potential contributions to the greenhouse effect and
quantities of solid waste tend to be minor for wood products
compared to aluminum products; incineration of wood products can
cause higher impacts of acidification and eutrophication than
aluminum, whereas thermal energy can be recovered.
Abstract: This paper applied factor conditions from Porter’s
Diamond Model (1990) to understand the various challenges facing
the AMISA. Factor conditions highlighted in Porter’s model are
grouped into two categories, namely basic and advanced factors. Two
AMISA associations representing over 10 000 employees were
interviewed. The largest Clothing, Textiles and Leather (CTL)
apparel retail group was also interviewed, along with a government
department implementing the industrialization
policy. The paper points out that AMISA have the basic factor conditions
necessary for competitive advantage in the apparel industries.
However, advanced factor creation has proven to be a challenge for
AMISA, Higher Education Institutions (HEIs) and government. Poor
infrastructural maintenance has contributed to high manufacturing
costs and poor quick response technologies. The use of Porter’s
Factor Conditions as a tool to analyze the sector’s competitive
advantage challenges and opportunities has increased knowledge
regarding factors that limit the AMISA’s competitiveness. It is
therefore argued that other studies on Porter’s Diamond model
factors like Demand conditions, Firm strategy, structure and rivalry
and Related and supporting industries can be used to analyze the
situation of the AMISA for the purposes of improving competitive
advantage.
Abstract: Singular value decomposition based optimisation of
geometric design parameters of a 5-speed gearbox is studied. During
the optimisation, a four-degree-of-freedom torsional vibration model
of the pinion gear-wheel gear system is obtained and the minimum
singular value of the transfer matrix is considered as the objective
function. The computational cost of the associated singular value
problems is quite low for the objective function, because it is only
necessary to compute the largest and smallest singular values (μmax
and μmin) that can be achieved by using selective eigenvalue solvers;
the other singular values are not needed. The design parameters are
optimised under several constraints that include bending stress,
contact stress and constant distance between gear centres. Thus, by
optimising geometric parameters of the gearbox such as the
module, number of teeth and face width, it is possible to obtain a
lightweight gearbox structure. It is concluded that all optimised
geometric design parameters also satisfy all constraints.
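The objective described above, the minimum singular value of the transfer matrix, is cheap to evaluate. The sketch below uses a hypothetical 4×4 matrix whose values are illustrative only, not taken from the study; for large models, selective solvers that return only the extreme singular values would replace the full SVD.

```python
import numpy as np

# Hypothetical transfer matrix of a four-degree-of-freedom torsional
# model (illustrative values only, not from the study).
T = np.array([[2.0, 0.3, 0.0, 0.1],
              [0.3, 1.5, 0.2, 0.0],
              [0.0, 0.2, 1.8, 0.4],
              [0.1, 0.0, 0.4, 1.2]])

# numpy returns singular values sorted in descending order, so the
# extreme values mu_max and mu_min are the first and last entries.
sigma = np.linalg.svd(T, compute_uv=False)
mu_max, mu_min = sigma[0], sigma[-1]

# mu_min is the objective function value for this candidate set of
# design parameters; the optimiser would vary the parameters that
# generate T (module, number of teeth, face width) subject to the
# stress and centre-distance constraints.
print(mu_max, mu_min)
```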
Abstract: This study makes an integrated investigation on how
life satisfaction is associated with the Korean game users'
psychological variables (self-esteem, game and life self-efficacy),
social variables (bonding and bridging social capital), and
demographic variables (age, gender). The data used for the empirical
analysis came from a representative sample survey conducted in South
Korea. Results show that self-esteem and game efficacy were
important antecedents of the degree of users’ life satisfaction. Both
bonding social capital and bridging social capital enhance the level of
the users’ life satisfaction. The importance of these perspectives, as well as
their implications for game users and further associated research, is
explored.
Abstract: Particle size distribution, the most important
characteristic of aerosols, is obtained through electrical
characterization techniques. The dynamics of charged nanoparticles
under the influence of electric field in Electrical Mobility
Spectrometer (EMS) reveals the size distribution of these particles.
The accuracy of this measurement is influenced by flow conditions,
geometry, electric field and particle charging process, therefore by
the transfer function (transfer matrix) of the instrument. In this work,
a wire-cylinder corona charger was designed and the combined
field-diffusion charging process of injected polydisperse aerosol particles
was numerically simulated as a prerequisite for the study of a
multichannel EMS. The resulting cloud of particles, with a non-uniform
charge distribution, was introduced into the EMS. The flow pattern and
electric field in the EMS were simulated using Computational Fluid
Dynamics (CFD) to obtain particle trajectories in the device and
therefore to calculate the reported signal by each electrometer.
Based on the output signals (resulting from the bombardment of
particles and the transfer of their charges as currents), we proposed a
modification to the size of detecting rings (which are connected to
electrometers) in order to evaluate particle size distributions more
accurately. Based on the capability of the system to transfer
information contents about size distribution of the injected particles,
we proposed a benchmark for the assessment of optimality of the
design. This method applies the concept of Von Neumann entropy
and borrows the definition of entropy from information theory
(Shannon entropy) to measure optimality. Entropy, in Shannon's
sense, is the "average amount of information contained in
an event, sample or character extracted from a data stream".
Evaluating the responses (signals) obtained from various
configurations of detecting rings, the configuration that gave
the best predictions of the size distributions of the injected particles
was the modified one; it was also the configuration with the
maximum entropy. A reasonable consistency was also
observed between the accuracy of the predictions and the entropy
content of each configuration. In this method, entropy is extracted
from the transfer matrix of the instrument for each configuration.
Ultimately, various clouds of particles were introduced into the
simulations, and the predicted size distributions were compared with the
exact size distributions.
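The Shannon entropy used as the optimality benchmark follows directly from its definition as average information content. The distribution below is a made-up example, not an actual electrometer response from the simulated instrument.

```python
import math

def shannon_entropy(probs):
    """Average information content (in bits) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A hypothetical normalised response (e.g. one column of a transfer
# matrix rescaled to sum to 1), read as a probability distribution.
column = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(column))  # → 1.75 bits
```

In the benchmark described above, a detecting-ring configuration whose transfer matrix carries more entropy preserves more information about the injected size distribution.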
Abstract: The purpose of this study is to determine the
relationship between anxiety level and gender among undergraduates
at a private university in Malaysia. A convenience sampling method was used,
in which students were selected based on the
grouping assigned by the faculty. A total of 214 undergraduates
registered in probability courses participated in this study.
The Mathematics Anxiety Rating Scale (MARS) was the instrument used
to determine students’ anxiety level towards
probability. Reliability and validity testing of the instrument was carried out before the
main study was conducted. In the main study, students were given a
briefing about the study. Participation was
voluntary, and students were given a consent form to indicate whether
they agreed to participate. Students had two weeks to complete
the online questionnaire. The
collected data were analyzed using the Statistical Package for the
Social Sciences (SPSS) to determine the level of anxiety. There were
three anxiety levels: low, average and high. Students’ anxiety
level was determined by comparing their scores with
the mean and standard deviation: scores more than one standard deviation below
the mean indicated low anxiety; scores within one standard
deviation of the mean indicated average anxiety; and scores more than one
standard deviation above the mean indicated high anxiety.
Results showed that both genders had an average
anxiety level. At the low, average and high anxiety levels, the frequency
of males was found to be higher than that of females; accordingly, the
mean value obtained for males (M = 3.62) was higher than that for females
(M = 3.42). For the difference in anxiety level between
genders to be significant, the p-value should be less than .05. The p-value obtained in
this study was .117, which is greater than .05. Thus,
there was no significant difference in anxiety level between genders;
in other words, there was no relationship between anxiety level and
gender.
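The scoring rule described above, classification relative to the mean and one standard deviation, can be sketched as follows. The scores are hypothetical illustrative values, not the study's data.

```python
from statistics import mean, stdev

def anxiety_level(score, scores):
    """Classify a score as low/average/high relative to mean ± 1 SD."""
    m, sd = mean(scores), stdev(scores)
    if score < m - sd:
        return "low"
    if score > m + sd:
        return "high"
    return "average"

# Hypothetical MARS scores (illustrative only).
scores = [2.1, 2.8, 3.0, 3.4, 3.5, 3.6, 3.9, 4.2, 4.8]
print([anxiety_level(s, scores) for s in (2.1, 3.5, 4.8)])
# → ['low', 'average', 'high']
```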
Abstract: In this paper, we provide a literature survey on the
artificial stock market (ASM). The paper begins by exploring the
complexity of the stock market and the need for ASMs. An ASM
aims to investigate the link between individual behaviors (micro
level) and financial market dynamics (macro level). The variety of
patterns at the macro level is a function of the ASM's complexity. The
financial market system is a complex system in which the relationship
between the micro and macro levels cannot be captured analytically.
Computational approaches, such as simulation, are expected to
capture this connection. Agent-based simulation is a simulation
technique commonly used to build ASMs. The paper proceeds by
discussing the components of the ASM. We consider the role
of behavioral finance (BF) alongside the traditional risk-averse
assumption in the construction of agents' attributes. The
influence of social networks on the development of agent interactions is
also addressed. Network topologies such as small-world, distance-based,
and scale-free networks may be utilized to outline economic
collaborations. In addition, the primary methods for developing
agents' learning and adaptive abilities are summarized.
These include approaches such as genetic algorithms, genetic
programming, artificial neural networks and reinforcement learning.
Furthermore, the most common statistical properties (the stylized facts)
of stocks that are used for calibration and validation of ASMs are
discussed. We also review the major related previous
studies and categorize the approaches utilized in them.
Finally, research directions and potential research questions
are discussed. Research directions for ASMs may focus on the macro
level, by analyzing market dynamics, or on the micro level, by
investigating the wealth distributions of the agents.
Abstract: Experiential marketing is one of the marketing
approaches that offer an exceptional framework to integrate elements
of experience and entertainment in a product or service. Experiential
marketing is defined as a memorable experience that goes deeply into
the customer’s mind. In addition, customer satisfaction is defined as
an emotional response to the experiences provided by and associated
with particular products or services purchased. Thus, experiential
marketing activities can affect the level of customer satisfaction and
loyalty. In this context, the research aims to explore the relationship
among experiential marketing, customer satisfaction and customer
loyalty among the cosmetic products customers in Konya. The partial
least squares (PLS) method is used to analyze the survey data.
Findings of the present study revealed that experiential marketing is
a significant predictor of customer satisfaction and customer
loyalty, and that it has a significantly positive effect on both.
Abstract: Sustainability is a very important and heavily
discussed subject, extending to tourism as well. The study
proposition was to collect data and present it to the competent bodies
so they can mold their public policies to improve the conditions of
the site. It was hypothesized that the lack of data is currently
affecting the quality of life and the sustainable development of the
site and its tourism. The research was conducted in Mateiros, a city in the
state of Tocantins (TO), Brazil, near Palmas, the state capital.
Mateiros was chosen because of the concentration of tourists during
the high season and the several tourist attractions nearby.
The methodological procedure followed a script of theoretical construction
and investigation under the parameters of the deductive scientific method,
through a case study in the Jalapão/TO/Brazil region, using a
questionnaire administered to the competent bodies in
interviews, with the UN sustainability indexes as a base. Across the three
scopes of sustainable development, environmental, social and economic,
the results indicated that the data presented by the interviewees were
scarce or nonexistent. This shows that more research is necessary,
providing the tools for the ones responsible to propose action plans to
improve the site, strengthening the tourism and making it even more
sustainable.