Abstract: Wikis are promoted as collaborative writing tools that
allow students to transform a text into a collective document by
information sharing and group reflection. However, despite the
promising collaborative capabilities of wikis, their pedagogical value
regarding collaborative writing is still questionable. A wiki alone
cannot make collaborative writing happen, and students do not
automatically become more active, participate more, or collaborate with
others simply because they use wikis. To foster collaborative writing and active
involvement in wiki development, a systematic approach to wikis is needed.
The main goal of this paper is to propose and
evaluate a co-writing approach to the development of wikis, along
with a study of three wiki applications that reports on the pedagogical
implications of collaborative writing in higher education.
Abstract: The issue of leadership has been investigated from
several perspectives, but far less often from an ethical perspective.
With the growing number of corporate scandals and unethical roles
played by business leaders in several parts of the world, the need to
examine leadership from an ethical perspective cannot be
overemphasized. The importance of leadership credibility has been
discussed in the authentic model of leadership. Authentic leaders
display a high degree of integrity, have a deep sense of purpose, and
are committed to their core values. As a result, they promote more
trusting relationships in their work groups, which translate into several
positive outcomes. The present study examined how authentic
leadership contributes to subordinates' trust in leadership and how this
trust, in turn, predicts subordinates' work engagement. A sample of
395 employees was randomly selected from several local banks
operating in Malaysia. Standardized tools such as the ALQ, OTI, and
EEQ were employed. Results indicated that authentic leadership
promoted subordinates' trust in the leader and contributed to work
engagement. Interpersonal trust also predicted employees' work
engagement and mediated the relationship between this style of
leadership and employees' work engagement.
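The mediation logic described in this abstract can be sketched with simple regressions. The data below are illustrative, generated to follow the hypothesized chain (authentic leadership predicts trust, trust predicts engagement); they are not the study's data, and the simple standardized-slope helper is a hypothetical stand-in for the SPSS-style analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 395  # same sample size as the study, but synthetic values

# Illustrative data following the hypothesized chain:
# authentic leadership -> trust -> work engagement
leadership = rng.normal(0, 1, n)
trust = 0.6 * leadership + rng.normal(0, 1, n)
engagement = 0.5 * trust + rng.normal(0, 1, n)

def beta(x, y):
    """Standardized simple-regression slope (equals Pearson r for z-scored data)."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.mean(x * y))

total = beta(leadership, engagement)   # total effect c
a = beta(leadership, trust)            # path a: leadership -> trust
b = beta(trust, engagement)            # path b: trust -> engagement
print(f"total={total:.2f}, a={a:.2f}, b={b:.2f}, indirect a*b={a*b:.2f}")
```

A nonzero indirect effect a·b alongside a reduced direct effect is the classic signature of mediation that the study reports.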
Abstract: This paper discusses the use of spline interpolation
and mean square error (MSE) as tools to process data acquired from
a developed simulator that replicates the sea bed logging environment.
Sea bed logging (SBL) is a new technique that uses a marine
controlled-source electromagnetic (CSEM) sounding technique and has
proven very successful in detecting and characterizing hydrocarbon
reservoirs in deep-water areas using resistivity contrasts. It uses
very low frequencies, 0.1 Hz to 10 Hz, to obtain greater wavelengths.
In this work the in-house-built simulator was used, provided
with predefined parameters, and the transmitted frequency was varied
for sediment thicknesses of 1000 m to 4000 m in environments with and
without hydrocarbon. From a series of simulations, synthetic data were
generated. These data were interpolated using the spline interpolation
technique (degree three), and the mean square error (MSE) was
calculated between the original and interpolated data. Comparisons
were made by studying the trends and the relationship between frequency
and sediment thickness based on the calculated MSE. It was found
that the MSE showed an increasing trend in the setup with
hydrocarbon present compared with the one without. The MSE
also showed a decreasing trend as sediment thickness increased
and at higher transmitted frequencies.
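The interpolate-then-score step can be sketched as follows. The decaying response curve here is an assumed illustrative stand-in for the simulator's synthetic SBL data, not actual output from the paper.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical synthetic response: received field magnitude versus offset
# (illustrative decaying shape, not real sea bed logging data)
offsets = np.linspace(0, 10, 11)        # offsets in km
field = np.exp(-0.3 * offsets)          # assumed decaying response

# Degree-three spline fitted on a coarse subset, evaluated on the full grid
spline = CubicSpline(offsets[::2], field[::2])
interpolated = spline(offsets)

# Mean square error between original and interpolated data
mse = np.mean((field - interpolated) ** 2)
print(f"MSE = {mse:.3e}")
```

In the paper this MSE is the quantity tracked across frequencies and sediment thicknesses to compare settings with and without hydrocarbon.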
Abstract: In various working fields, vibration may cause injury to the human body, especially vibration that is constantly and repeatedly transferred to the human. This causes a serious physical problem known as Raynaud's phenomenon. In this paper, we propose a vibration transmissibility reduction module with a flexure mechanism for personal tools. First, we select a target personal tool, a grass cutter, and measure the level of vibration transmissibility on the hand. We then develop a concept design of the module with a stiffness that reduces the vibration transmissibility by more than 20%, where the vibration transmissibility is measured with an accelerometer. In addition, the vibration reduction can be enhanced when the interior gap between the inner and outer body is filled with silicone gel. This will be verified in further experiments.
Abstract: As the latest advancement and trend in the IT field, Green
& Smart IT has attracted more and more attention from researchers.
This study focuses on the development of assessment tools that can be
used for evaluating the Green & Smart IT level within an organization. In
order to achieve meaningful results, a comprehensive review of the
relevant literature was performed in advance; then a Delphi survey and
other processes were employed to develop the assessment tools
for the Green & Smart IT level. Two rounds of Delphi questionnaire
survey were conducted with 20 IT experts in the public sector. The results
reveal that the top five weighted KPIs for evaluating the maturity of Green &
Smart IT were: (1) electronic execution of business processes; (2)
shutdown of unused IT devices; (3) virtualization of servers; (4)
automation of constant temperature and humidity; and (5) introduction
of a smart-work system. Finally, these tools were applied to a case study
of a public research institute in Korea. The findings presented in this
study provide organizations with useful implications for the
introduction and promotion of Green & Smart IT in the future.
Abstract: This paper describes part of a project about Learning-
by-Modeling (LbM). Studying complex systems is increasingly
important in teaching and learning many science domains. Many
features of complex systems make it difficult for students to develop
deep understanding. Previous research indicates that involvement
with modeling scientific phenomena and complex systems can play a
powerful role in science learning. Some researchers dispute this
view, arguing that models and modeling do not contribute to
understanding complexity concepts, since they increase the
cognitive load on students. This study investigates the effect of
different modes of involvement in exploring scientific phenomena
using computer simulation tools on students' mental models, from the
perspective of structure, behavior, and function. Quantitative and
qualitative methods are used to report on 121 freshman students
who engaged in participatory simulations about complex phenomena
showing emergent, self-organized, and decentralized patterns. Results
show that LbM plays a major role in students' concept formation
about complexity concepts.
Abstract: The primary objective of this paper was to construct a
“kinematic parameter-independent modeling of three-axis machine
tools for geometric error measurement” technique. Improving the
geometric error accuracy of three-axis machine tools is one of
the core machine tool techniques. This paper first applied the
traditional homogeneous transformation matrix (HTM) method to
deduce the geometric error model for three-axis machine tools.
This geometric error model was related to the
three-axis kinematic parameters, and the overall error was relative
to the machine reference coordinate system. Given that the
measurement of the linear axis in this model should be taken on the ideal
motion axis, there were practical difficulties. Through a measurement
method consolidating translational errors and rotational errors in the
geometric error model, we simplified the three-axis geometric error
model to a kinematic parameter-independent model. Finally, based on
the new measurement method corresponding to this error model, we
established a truly practical and more accurate error measuring
technique for three-axis machine tools.
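A minimal sketch of the HTM formulation for one linear axis is shown below. The error values are hypothetical, and the first-order small-angle error matrix is the standard textbook form; the paper's actual consolidated model is not reproduced here.

```python
import numpy as np

def translation(x, y, z):
    """Ideal homogeneous transformation for a linear-axis move."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def error_htm(dx, dy, dz, ex, ey, ez):
    """First-order HTM for translational (dx,dy,dz) and rotational (ex,ey,ez) errors."""
    E = np.eye(4)
    E[:3, :3] += np.array([[0, -ez,  ey],
                           [ez,  0, -ex],
                           [-ey, ex,  0]])   # small-angle rotation part
    E[:3, 3] = [dx, dy, dz]
    return E

# Ideal X move of 100 mm combined with small assumed axis errors
T_actual = translation(100, 0, 0) @ error_htm(0.002, 0.001, -0.001, 1e-5, 2e-5, 0)
tool = T_actual @ np.array([0, 0, 0, 1])     # tool point in the reference frame
geom_error = tool[:3] - np.array([100, 0, 0])
print(geom_error)                            # deviation at the tool point
```

Composing one such matrix per axis yields the full three-axis geometric error model the abstract refers to.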
Abstract: Pressures for urban redevelopment are intensifying in
all large cities. A new logic for urban development is required –
green urbanism – that provides a spatial framework for directing
population and investment inwards to brownfield and greyfield
precincts, rather than outwards to the greenfields. This represents
both a major opportunity and a major challenge for city planners in
pluralist liberal democracies. However, plans for more compact
forms of urban redevelopment are stalling in the face of community
resistance. A new paradigm and spatial planning platform is required
that will support timely multi-level and multi-actor stakeholder
engagement, resulting in the emergence of consensus plans for
precinct-level urban regeneration capable of more rapid
implementation. Using Melbourne, Australia as a case study, this
paper addresses two of the urban intervention challenges – where and
how – via the application of ENVISION, a 21st-century planning tool
created for this purpose.
Abstract: Opinion extraction about products from customer
reviews is becoming an interesting area of research. Customer
reviews about products are nowadays available from blogs and
review sites, and tools are being developed to extract opinions
from these reviews to help users as well as merchants track the
most suitable choice of product. Efficient methods and techniques
are therefore needed to extract opinions from reviews and blogs. As
reviews of products mostly contain discussion about features,
functions, and services, efficient techniques are required to
extract user comments about the desired features, functions, and
services. In this paper we propose a novel idea for finding product
features from user reviews in an efficient way. Our focus in this
paper is to get the features and opinion-oriented words about
products from text through the auxiliary verbs (AV) {is, was, are, were,
has, have, had}. From the results of our experiments we found that
82% of features and 85% of opinion-oriented sentences include AVs.
Thus these AVs are good indicators of features and opinion
orientation in customer reviews.
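The AV-based idea can be sketched minimally: sentences containing an auxiliary verb are flagged as opinion-oriented, and the word just before the AV is taken as a candidate feature. This helper and its naive sentence splitting are illustrative simplifications, not the authors' full method.

```python
import re

AUX_VERBS = {"is", "was", "are", "were", "has", "have", "had"}

def av_sentences(review):
    """Return (candidate feature, sentence) pairs for sentences containing
    an auxiliary verb; the word preceding the AV is the feature candidate."""
    hits = []
    for sent in re.split(r"[.!?]", review):
        words = sent.lower().split()
        for i, w in enumerate(words):
            if w in AUX_VERBS and i > 0:
                hits.append((words[i - 1], sent.strip()))
                break
    return hits

review = ("The battery life is excellent. The screen has a slight glare. "
          "Shipping took two weeks.")
print(av_sentences(review))
```

Here the first two sentences contain AVs and yield "life" and "screen" as feature candidates, while the AV-free third sentence is skipped, mirroring the indicator role the abstract describes.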
Abstract: The use of new technologies such as the internet (e-mail, chat
rooms) and cell phones has increased steeply in recent years.
Use of technological tools and equipment is widespread, especially
among children and young people. Although many teachers and
administrators now recognize the problem of school bullying, few are
aware that students are being harassed through electronic
communication. Referred to as electronic bullying, cyber bullying, or
online social cruelty, this phenomenon includes bullying through e-mail,
instant messaging, in a chat room, on a website, or through
digital messages or images sent to a cell phone. Cyber bullying is
defined as causing deliberate/intentional harm to others using the internet
or other digital technologies. This study has a quantitative research design and
uses the relational survey as its method. The participants consisted of
300 secondary school students in the city of Konya, Turkey; 195
(64.8%) participants were female and 105 (35.2%) were male. 39
(13%) students were at grade 1, 187 (62.1%) were at grade 2, and 74
(24.6%) were at grade 3. The “Cyber Bullying Question List”
developed by Arıcak (2009) was given to the students. Following
questions about demographics, a functional definition of cyber
bullying was provided. In order to specify students' human values, the
“Human Values Scale (HVS)” developed by Dilmaç (2007) for
secondary school students was administered. The scale consists of 42
items in six dimensions. Data analysis was conducted by the primary
investigator of the study using SPSS 14.00 statistical analysis
software. Descriptive statistics were calculated for the analysis of
students' cyber bullying behaviour, and simple regression analysis was
conducted to test whether each value in the scale could
explain cyber bullying behaviour.
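The simple regression step can be sketched outside SPSS. The scores below are illustrative (one hypothetical human-value dimension against a cyber-bullying score), not the study's data.

```python
import numpy as np

# Illustrative scores for a handful of students (not the study's data):
# one human-value dimension vs. a cyber-bullying behaviour score
value_score = np.array([10, 14, 18, 22, 26, 30], dtype=float)
bullying = np.array([8.0, 7.1, 6.2, 5.4, 4.3, 3.5])

# Simple (single-predictor) linear regression
slope, intercept = np.polyfit(value_score, bullying, 1)
pred = intercept + slope * value_score
r_squared = 1 - np.sum((bullying - pred) ** 2) / np.sum((bullying - bullying.mean()) ** 2)
print(f"slope={slope:.3f}, R^2={r_squared:.3f}")
```

A significant negative slope would be the pattern consistent with a value dimension explaining (inversely) cyber-bullying behaviour.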
Abstract: Expression data analysis is based mostly on the
statistical approaches that are indispensable for the study of
biological systems. Large amounts of multidimensional data resulting
from the high-throughput technologies are not completely served by
biostatistical techniques and are usually complemented with visual,
knowledge discovery, and other computational tools. In many cases
in biological systems, we can only speculate about the processes that
are causing the changes, and it is during visual explorative analysis
of the data that a hypothesis is formed. We would like to show the
usability of multidimensional visualization tools and promote their
use in the life sciences. We survey and demonstrate some
multidimensional visualization tools in the process of data
exploration, such as parallel coordinates and RadViz, and we extend
them by combining them with the self-organizing map algorithm. We
use a time course data set of transitional cell carcinoma of the bladder
in our examples. Analysis of data with these tools has the potential to
uncover additional relationships and non-trivial structures.
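A minimal one-dimensional self-organizing map, of the kind combined with the visualizations above, can be sketched on toy data. The two synthetic clusters stand in for real expression data; the implementation is a bare-bones illustration, not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for multidimensional expression data: two clusters in 5-D
data = np.vstack([rng.normal(0, 0.3, (20, 5)),
                  rng.normal(2, 0.3, (20, 5))])

def quantization_error(units, data):
    """Mean distance from each sample to its best matching unit (BMU)."""
    d = np.linalg.norm(data[:, None, :] - units[None, :, :], axis=2)
    return d.min(axis=1).mean()

units = rng.normal(1, 0.5, (6, 5))            # a line of 6 map units
qe_before = quantization_error(units, data)

for t in range(300):
    lr = 0.5 * (1 - t / 300)                  # decaying learning rate
    x = data[rng.integers(len(data))]
    bmu = int(np.argmin(np.linalg.norm(units - x, axis=1)))
    for j in range(len(units)):
        h = np.exp(-((j - bmu) ** 2) / 2.0)   # Gaussian neighbourhood on the map
        units[j] += lr * h * (x - units[j])

qe_after = quantization_error(units, data)
print(f"quantization error: {qe_before:.2f} -> {qe_after:.2f}")
```

After training, the line of units is pulled onto the data, so projecting samples to their BMU gives a low-dimensional ordering that can be overlaid on parallel coordinates or RadViz plots.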
Abstract: As the web continues to grow exponentially, the idea
of crawling the entire web on a regular basis becomes less and less
feasible, so domain-specific search engines, which cover information
on a specific domain, have been proposed. As more information
becomes available on the World Wide Web, it becomes more difficult
to provide effective search tools for information access. Today,
people access web information through two main kinds of search
interfaces: Browsers (clicking and following hyperlinks) and Query
Engines (queries in the form of a set of keywords showing the topic
of interest) [2]. Web search tools need better support for expressing
one's information need and for returning high-quality search results.
There appears to be a need for systems that do reasoning
under uncertainty and are flexible enough to recover from the
contradictions, inconsistencies, and irregularities that such reasoning
involves. In a multi-view problem, the features of the domain can be
partitioned into disjoint subsets (views) that are sufficient to learn the
target concept. Semi-supervised, multi-view algorithms, which
reduce the amount of labeled data required for learning, rely on the
assumptions that the views are compatible and uncorrelated. This
paper describes the use of a semi-supervised machine learning
approach with active learning for domain-specific search engines. A
domain-specific search engine is “an information access system that
allows access to all the information on the web that is relevant to a
particular domain.” The proposed work shows that with the help of
this approach, relevant data can be extracted with a minimum of
queries fired by the user. It requires a small number of labeled data
and a pool of unlabelled data to which the learning algorithm is
applied to extract the required data.
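One round of generic two-view co-training, the classic semi-supervised multi-view scheme under the compatible/uncorrelated-views assumption, can be sketched as follows. The data, the nearest-centroid classifier, and all names are illustrative; this is not the paper's system.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
labels = np.array([0, 1] * 50)                 # ground-truth classes (toy)

# Two "views" of the same items; each view alone separates the classes,
# mimicking the multi-view compatibility assumption
view1 = labels[:, None] + rng.normal(0, 0.3, (n, 2))
view2 = 2 * labels[:, None] + rng.normal(0, 0.5, (n, 2))

y = np.full(n, -1)                             # -1 = unlabelled
y[:6] = labels[:6]                             # tiny labelled pool
labeled = list(range(6))
unlabeled = list(range(6, n))

def centroid_predict(view, idx_l, y_all, idx_u):
    """Nearest-centroid classifier trained on the labelled rows of one view."""
    c0 = view[[i for i in idx_l if y_all[i] == 0]].mean(axis=0)
    c1 = view[[i for i in idx_l if y_all[i] == 1]].mean(axis=0)
    d0 = np.linalg.norm(view[idx_u] - c0, axis=1)
    d1 = np.linalg.norm(view[idx_u] - c1, axis=1)
    return (d1 < d0).astype(int), np.abs(d0 - d1)   # label + confidence

# One co-training round: each view pseudo-labels its most confident examples
for view in (view1, view2):
    pred, conf = centroid_predict(view, labeled, y, unlabeled)
    for b in np.argsort(conf)[-5:]:            # 5 most confident examples
        labeled.append(unlabeled[b])
        y[unlabeled[b]] = pred[b]              # pseudo-label for the other view
    unlabeled = [u for u in unlabeled if u not in labeled]

print(len(labeled), len(unlabeled))
```

Iterating such rounds grows the labelled pool cheaply, which is how these algorithms reduce the amount of labelled data required for learning.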
Abstract: This paper presents a study of the Taguchi design
application to optimize surface quality in damper inserted end milling
operation. Maintaining good surface quality usually involves
additional manufacturing cost or loss of productivity. The Taguchi
design is an efficient and effective experimental method in which a
response variable can be optimized, given various factors, using
fewer resources than a factorial design. This study included spindle
speed, feed rate, and depth of cut as control factors, and used
different tools of the same specification, which introduced tool
condition and dimensional variability. An L9(3^4) orthogonal array
was used, and ANOVA was carried out to identify the significant factors
affecting surface roughness; the optimal cutting combination was
determined by seeking the best surface roughness (response) and
signal-to-noise ratio. Finally, confirmation tests verified that the
Taguchi design was successful in optimizing milling parameters for
surface roughness.
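For surface roughness, Taguchi analysis typically uses the smaller-the-better signal-to-noise ratio, which can be computed per L9 run as below. The roughness replicates are illustrative values, not the study's measurements.

```python
import numpy as np

# Surface roughness Ra (micrometres) replicates for one L9 run (illustrative)
ra = np.array([0.82, 0.79, 0.85])

# Smaller-the-better S/N ratio: S/N = -10 * log10(mean(y^2));
# a larger S/N means better (lower, more consistent) roughness
sn = -10 * np.log10(np.mean(ra ** 2))
print(f"S/N = {sn:.2f} dB")
```

Computing this S/N for each of the nine runs and averaging by factor level is what identifies the optimal cutting combination the abstract mentions.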
Abstract: In the artificial intelligence field, knowledge
representation and reasoning are important areas for intelligent
systems, especially knowledge base systems and expert systems.
Knowledge representation methods play an important role in
designing such systems. There have been many models of knowledge,
such as semantic networks, conceptual graphs, and neural networks.
These models are useful tools for designing intelligent systems. However,
they are not suitable for representing knowledge in the domains of
real-world applications. In this paper, new models for knowledge
representation called computational networks are presented. They have
been used in designing several knowledge base systems in education
for solving problems, such as a system that supports studying
knowledge and solving analytic geometry problems, a program for
studying and solving problems in plane geometry, and a program for
solving problems about alternating current in physics.
Abstract: In this paper we propose an intelligent agent approach
to control the electric power grid at a smaller granularity in order to
give it self-healing capabilities. We develop a method using the
influence model to transform transmission substations into
information processing, analyzing and decision making (intelligent
behavior) units. We also develop a wireless communication method
to deliver real-time uncorrupted information to an intelligent
controller in a power system environment. A combined networking
and information theoretic approach is adopted in meeting both the
delay and error probability requirements. We use a mobile agent
approach in optimizing the achievable information rate vector and in
the distribution of rates to users (sensors). We develop the concept
and the quantitative tools required to create cooperating
semi-autonomous subsystems that put the electric grid on the path
towards an intelligent and self-healing system.
Abstract: Evaluation of educational portals is an important
subject area that needs more attention from researchers. A university
whose educational portal is difficult for teachers, students, or
management staff to use and interact with can see its standing and
reputation reduced. Therefore, it is important to be able to
evaluate the quality of the e-services the university provides
in order to improve them over time.
The present study evaluates the usability of the Information
Technology Faculty portal at University of Benghazi. Two evaluation
methods were used: a questionnaire-based method and an online
automated tool-based method. The first method was used to measure
the portal's external attributes of usability (Information, Content and
Organization of the portal, Navigation, Links and Accessibility,
Aesthetic and Visual Appeal, Performance and Effectiveness and
educational purpose) from users' perspectives, while the second
method was used to measure the portal's internal attributes of
usability (number and size of HTML files, number and size of images,
load time, HTML check errors, browser compatibility problems,
number of bad and broken links), which cannot be perceived by the
users. The study showed that some of the usability aspects were
found to be at an acceptable level of performance and quality, while
others were not. In general, it was concluded that the usability of
the IT faculty educational portal is generally acceptable.
Recommendations and suggestions to address the weaknesses and
improve the quality of the portal's usability are presented in this study.
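Collecting internal usability attributes of the kind listed above (image counts, link inventories for a broken-link check) can be sketched with a small HTML scanner. The page snippet and class name are hypothetical; real automated tools do considerably more.

```python
from html.parser import HTMLParser

class PortalStats(HTMLParser):
    """Collect simple internal usability attributes from an HTML page:
    the number of images and the list of link targets (candidates for
    a subsequent broken-link check)."""
    def __init__(self):
        super().__init__()
        self.images = 0
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.images += 1
        elif tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

page = ('<html><body><img src="logo.png">'
        '<a href="/courses">Courses</a>'
        '<a href="http://old.example/broken">Old</a></body></html>')
stats = PortalStats()
stats.feed(page)
print(stats.images, stats.links)
```

Counts and sizes gathered this way are exactly the internal attributes that, as the abstract notes, cannot be perceived directly by users.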
Abstract: A computational platform is presented in this
contribution. It has been designed as a virtual laboratory to be used
for exploring optimization algorithms in biological problems. This
platform is built on a blackboard-based agent architecture. As a test
case, the version of the platform presented here is devoted to the
study of protein folding, initially with a bead-like description of the
chain and with the widely used model of hydrophobic and polar
residues (HP model). Some details of the platform design are
presented along with its capabilities, and some explorations of the
protein folding problem with different types of discrete space are
reviewed. The capability of the platform to incorporate specific tools
for the structural analysis of the runs, in order to understand and
improve the optimization process, is also shown.
Accordingly, the results obtained demonstrate that assembling the
computational tools into a single platform is worthwhile in itself,
since experiments developed on it can be designed to fulfill different
levels of information in a self-consistent fashion. We are currently
exploring how an experiment design can be used to create a
computational agent to be included within the platform. Such designed
agents – or software pieces – help the platform better accomplish its
tasks. As the number of agents increases, the new version of the
virtual laboratory gains robustness and functionality.
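The HP-model objective that such a platform optimizes can be sketched compactly: on a 2-D lattice, every pair of hydrophobic (H) residues that are adjacent on the lattice but not consecutive in the chain contributes -1 to the energy. The sequence and fold below are illustrative toys.

```python
# Minimal 2-D lattice HP-model energy (illustrative sequence and fold)
def hp_energy(sequence, coords):
    """Energy = -1 per non-bonded H-H lattice contact."""
    energy = 0
    for i in range(len(sequence)):
        for j in range(i + 2, len(sequence)):      # skip chain neighbours
            if sequence[i] == "H" and sequence[j] == "H":
                dx = abs(coords[i][0] - coords[j][0])
                dy = abs(coords[i][1] - coords[j][1])
                if dx + dy == 1:                   # adjacent lattice sites
                    energy -= 1
    return energy

seq = "HPHH"
fold = [(0, 0), (1, 0), (1, 1), (0, 1)]            # a 2x2 square conformation
print(hp_energy(seq, fold))
```

Optimization algorithms explored in the virtual laboratory search the space of self-avoiding folds for the conformation minimizing this energy.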
Abstract: National Biodiversity Database System (NBIDS) has
been developed for collecting Thai biodiversity data. The goal of this
project is to provide advanced tools for querying, analyzing,
modeling, and visualizing patterns of species distribution for
researchers and scientists. NBIDS records two types of datasets:
biodiversity data and environmental data. Biodiversity data are
species presence data and species status. The attributes of
biodiversity data can be further classified into two groups: universal
and project-specific attributes. Universal attributes are attributes
that are common to all of the records, e.g., X/Y coordinates, year,
and collector name. Project-specific attributes are attributes that
are unique to one or a few projects, e.g., flowering stage.
Environmental data include atmospheric data, hydrology data, soil
data, and land cover data collected using GLOBE protocols. We have
developed web-based tools for data entry. Google Earth KML and ArcGIS
were used as tools for map visualization. webMathematica was used for
simple data visualization and also for advanced data analysis and
visualization, e.g., spatial interpolation and statistical analysis.
NBIDS will be used by park rangers at Khao Nan National Park and by
researchers.
Abstract: Performance measurement is still a difficult task for forwarding companies, caused on the one hand by missing resources and on the other hand by missing tools. The research project “Management Information System for Logistics Service Providers” aims to close the gap between needed and available solutions. The core of the project is the development
Abstract: Six Sigma is a well-known discipline that reduces
variation using complex statistical tools and the DMAIC model. By
integrating Goldratt's Theory of Constraints, the Five Focusing
Points, and Systems Thinking tools, Six Sigma projects can be
selected where they can have the most impact on the company. This
research defines an integrated model of Six Sigma and constraint
management that provides a step-by-step guide using the original
methodologies from each discipline and is evaluated in a case study
from the production line of an automobile V8 engine monoblock,
resulting in an increase in the line capacity from 18.7 pieces per
hour to 22.4 pieces per hour, a 60% reduction of work-in-process,
and a variation decrease of 0.73%.