Abstract: Perth will run out of available sustainable natural
water resources by 2015 if nothing is done to slow usage rates,
according to a Western Australian study [1]. Alternative water
technology options need to be considered for the long-term
guaranteed supply of water for agricultural, commercial, domestic
and industrial purposes. Seawater is an alternative source of water for
human consumption, because it can be desalinated and
supplied in large quantities at very high quality.
While seawater desalination is a promising option, the technology
requires a large amount of energy which is typically generated from
fossil fuels. The combustion of fossil fuels emits greenhouse gases
(GHG) and is implicated in climate change. In addition to
environmental emissions from electricity generation for desalination,
greenhouse gases are emitted in the production of chemicals and
membranes for water treatment. Since Australia is a signatory to the
Kyoto Protocol, it is important to quantify greenhouse gas emissions
from desalinated water production.
A life cycle assessment (LCA) has been carried out to determine
the greenhouse gas emissions from the production of 1 gigalitre (GL)
of water from the new plant. In this LCA analysis, a new desalination
plant that will be installed in Bunbury, Western Australia, and known
as the Southern Seawater Desalination Plant (SSDP), was taken as a
case study. The system boundary of the LCA mainly consists of three
stages: seawater extraction, treatment and delivery. The analysis
found that the equivalent of 3,890 tonnes of CO2 could be emitted
from the production of 1 GL of desalinated water. This LCA analysis
has also identified that the reverse osmosis process would cause the
most significant greenhouse emissions as a result of the electricity
used, if this is generated from fossil fuels.
Abstract: The computer, among the most important inventions of the twentieth century, has become an increasingly important component of our everyday lives. Computer games, too, have become increasingly popular, owing to their realistic virtual environments, audio and visual features, and the roles they offer players. In the present study, the metaphors students hold for computer games are investigated, in an effort to fill a gap in the literature. Students were asked to complete the sentence 'Computer game is like/similar to….because….' to determine middle school students' metaphorical images of the concept of 'computer game'. The metaphors created by the students were grouped into six categories based on the source of the metaphor. Ordered by the number of metaphors they included, these categories were 'computer game as a means of entertainment', 'computer game as a beneficial means', 'computer game as a basic need', 'computer game as a source of evil', 'computer game as a means of withdrawal', and 'computer game as a source of addiction'.
Abstract: In this paper, the main objective is to analyze the
quality of service of the bus companies operating in the city of
Campos, located in the state of Rio de Janeiro, Brazil. This analysis,
based on the opinion of the bus customers, will help to determine
their degree of satisfaction with the service provided by the bus
companies. The result of this assessment shows that the bus
customers are displeased with the quality of service supplied by the
bus companies. Therefore, it is necessary to identify alternative
solutions to minimize the consequences of the main problems related
to customers' dissatisfaction identified in our evaluation and to help
the bus companies operating in Campos better fulfill their riders'
needs.
Abstract: The discrete choice model is the most widely used methodology for studying travelers' mode choice and demand. However, calibrating a discrete choice model requires extensive questionnaire surveys. In this study, an aggregate model is proposed. Historical passenger-volume data for high-speed rail and domestic civil aviation are employed to calibrate and validate the model. Different models are compared in order to propose the best one. The results show that systems of equations forecast better than a single equation does, and that models including an external variable, oil price, outperform models based on a closed-system assumption.
Abstract: Intellectual capital measurement is a central aspect of knowledge management. The measurement and evaluation of intangible assets play a key role in allowing effective management of these assets as sources of competitiveness. For these reasons, managers and practitioners need conceptual and analytical tools that take into account the unique characteristics and economic significance of intellectual capital. Following this lead, we propose an efficiency and productivity analysis of intellectual capital as a determinant of a company's competitive advantage. The analysis is carried out by means of Data Envelopment Analysis (DEA) and the Malmquist Productivity Index (MPI). These techniques identify best-practice companies that have achieved competitive advantage by implementing successful intellectual capital management strategies, and offer inefficient companies development paths by means of benchmarking. The proposed methodology is applied to the biotechnology industry over the period 2007-2010.
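As an illustration of the DEA step, the sketch below computes input-oriented CCR efficiency scores with a linear program; the four firms, the single input (e.g. R&D spend) and output (e.g. patents) are hypothetical toy data, not the study's dataset, and SciPy is assumed to be available.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o`.
    X: (n, m) inputs, Y: (n, s) outputs for n DMUs."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimize theta
    # inputs:  sum_j lam_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # outputs: -sum_j lam_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[o]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1))
    return res.fun

# hypothetical data: 4 firms, 1 input, 1 output
X = np.array([[2.0], [4.0], [3.0], [5.0]])
Y = np.array([[1.0], [2.0], [3.0], [2.0]])
scores = [round(dea_efficiency(X, Y, o), 3) for o in range(4)]
```

A score of 1 marks a best-practice frontier firm; lower scores indicate the proportional input reduction a benchmark composite of frontier firms would allow.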
Abstract: The objective of this research is to calculate the
optimal inventory lot-sizing for each supplier and minimize the total
inventory cost which includes joint purchase cost of the products,
transaction cost for the suppliers, and holding cost for remaining
inventory. Genetic algorithms (GAs) are applied to the multi-product
and multi-period inventory lot-sizing problem with supplier
selection under storage-space constraints: a maximum storage space
for the decision maker in each period is considered. The decision maker
needs to determine what products to order in what quantities with
which suppliers in which periods. It is assumed that demand of
multiple products is known over a planning horizon. The problem is
formulated as a mixed integer program and is solved with the
GAs. The detailed computation results are presented.
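To make the GA formulation concrete, here is a minimal sketch under assumed toy data (two suppliers, three periods, made-up prices, transaction costs and a storage cap, not the paper's instance): a chromosome holds the order quantity per supplier per period, and large penalties steer the search toward schedules that meet demand without exceeding the storage space.

```python
import random
random.seed(1)

DEMAND = [40, 50, 30]      # demand per period (hypothetical)
PRICE  = [2.0, 2.4]        # unit price per supplier (hypothetical)
TRANS  = [10.0, 6.0]       # fixed transaction cost per order placed
HOLD   = 0.5               # holding cost per unit per period
CAP    = 80                # maximum storage space in any period

def cost(chrom):
    """Purchase + transaction + holding cost, with penalty terms
    for storage-space violations and unmet demand."""
    total, inv = 0.0, 0
    for t, orders in enumerate(chrom):   # chrom[t][s] = qty from supplier s
        inv += sum(orders)
        if inv > CAP:
            total += 1e4 * (inv - CAP)   # storage-space penalty
        for s, q in enumerate(orders):
            if q > 0:
                total += PRICE[s] * q + TRANS[s]
        inv -= DEMAND[t]
        if inv < 0:
            total += 1e4 * -inv          # unmet-demand penalty
            inv = 0
        total += HOLD * inv
    return total

def random_chrom():
    return [[random.randint(0, 60) for _ in PRICE] for _ in DEMAND]

def mutate(c):
    c = [row[:] for row in c]
    t, s = random.randrange(len(DEMAND)), random.randrange(len(PRICE))
    c[t][s] = max(0, c[t][s] + random.randint(-10, 10))
    return c

def crossover(a, b):
    cut = random.randrange(1, len(DEMAND))   # one-point crossover on periods
    return a[:cut] + b[cut:]

pop = [random_chrom() for _ in range(60)]
for _ in range(300):
    pop.sort(key=cost)
    elite = pop[:20]                          # elitist selection
    pop = elite + [mutate(crossover(random.choice(elite),
                                    random.choice(elite)))
                   for _ in range(40)]
best = min(pop, key=cost)
```

The penalty weights and GA parameters are illustrative; a mixed integer programming solver, as in the paper, would give an exact optimum for comparison.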
Abstract: One important problem in today's organizations is the
existence of non-integrated information systems, inconsistency and
lack of suitable correlations between legacy and modern systems.
One main solution is to transfer the local databases into a global one.
In this regard, we need to extract the data structures from the legacy
systems and integrate them with the new technology systems. In
legacy systems, huge amounts of data are stored in legacy
databases. These require particular attention, since they need more
effort to be normalized, reformatted and moved to modern
database environments. Designing the new integrated (global)
database architecture and applying the reverse engineering requires
data normalization. This paper proposes the use of database reverse
engineering in order to integrate legacy and modern databases in
organizations. The suggested approach consists of methods and
techniques for generating data transformation rules needed for the
data structure normalization.
Abstract: This paper proposes a modeling methodology for the
development of data analysis solutions. The author introduces an
approach to addressing data warehousing issues at the enterprise level.
The methodology covers the requirements elicitation and analysis
stage as well as the initial design of the data warehouse. The paper
reviews an extended business process model that satisfies the needs of
data warehouse development. The author considers the use of
business process models necessary, as they reflect both enterprise
information systems and business functions, which are important for
data analysis. The described approach divides development into
three steps, each elaborating the models at a different level of detail,
and makes it possible to gather requirements and present them to
business users in an accessible manner.
Abstract: Graph-based image segmentation techniques are
considered to be among the most efficient segmentation techniques,
mainly used as time- and space-efficient methods for real-time
applications. However, there is a need to focus on improving the
quality of segmented images obtained from the earlier graph based
methods. This paper proposes an improvement to the graph based
image segmentation methods already described in the literature. We
contribute to the existing method by proposing the use of a weighted
Euclidean distance to calculate the edge weight which is the key
element in building the graph. We also propose a slight modification
of the segmentation method already described in the literature, which
results in selection of more prominent edges in the graph. The
experimental results show the improvement in the segmentation
quality as compared to the methods that already exist, with a slight
compromise in efficiency.
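The proposed edge weight can be pictured with the sketch below: it builds a 4-connected grid graph over a toy image, scores each edge with a weighted Euclidean distance over the RGB channels, and merges pixels with a union-find structure. The channel weights, threshold and image are made-up values, and the fixed merge threshold is a simplification of the adaptive merging criterion used in graph-based segmentation methods.

```python
import numpy as np

def edge_weight(fa, fb, w):
    """Weighted Euclidean distance between two pixel feature vectors."""
    d = fa - fb
    return float(np.sqrt(np.dot(w, d * d)))

def segment(img, w, thresh):
    """Merge 4-connected pixels whose edge weight is below `thresh`;
    returns the number of resulting segments."""
    h, wd, _ = img.shape
    parent = list(range(h * wd))          # union-find forest
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    edges = []
    for y in range(h):
        for x in range(wd):
            for dy, dx in ((0, 1), (1, 0)):    # right and down neighbours
                ny, nx = y + dy, x + dx
                if ny < h and nx < wd:
                    wgt = edge_weight(img[y, x], img[ny, nx], w)
                    edges.append((wgt, y * wd + x, ny * wd + nx))
    for wgt, a, b in sorted(edges):            # lightest edges first
        if wgt < thresh:
            ra, rb = find(a), find(b)
            if ra != rb:
                parent[ra] = rb
    return len({find(i) for i in range(h * wd)})

# toy image: left half dark, right half bright
img = np.zeros((4, 4, 3))
img[:, 2:] = 1.0
n_segments = segment(img, w=np.array([0.5, 0.3, 0.2]), thresh=0.1)
```

Tuning the per-channel weights changes which colour differences dominate the edge weight, which is the lever the abstract describes for selecting more prominent edges.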
Abstract: An academic research information service is a must for surveying previous studies in the research and development process. OntoFrame is an academic research information service built on a Semantic Web framework, unlike simple keyword-based services such as CiteSeer and Google Scholar. The first purpose of this study is to reveal user behavior in literature surveys, the purposes for which academic research information services are used, and users' needs. The second is to apply the lessons learned from the results to OntoFrame.
Abstract: The purpose of this study was to investigate the current status of support services for students with special education needs (SEN) at colleges and universities in Taiwan. Seventy-two colleges and universities received a questionnaire on their resource room operation processes, and four resource room staff members, each from a different area, were interviewed using semi-structured interview forms. The main findings were: (1) most colleges and universities offered sufficient administrative resources; (2) more effort should be made on prevention for SEN students and on establishing disability awareness among all campus faculty; (3) more comprehensive services were required to help students make a better transition into post-school life; (4) most schools met basic administrative resource requirements, but the quality of the resource room programs needed to be enhanced; and (5) most resource room staff lacked professional knowledge in counseling SEN students, which needs to be strengthened in the future.
Abstract: Effective knowledge support relies on providing
operation-relevant knowledge to workers promptly and accurately. A
knowledge flow represents an individual's or a group's
knowledge-needs and referencing behavior of codified knowledge
during operation performance. The flow has been utilized to facilitate
organizational knowledge support by illustrating workers'
knowledge-needs systematically and precisely. However,
conventional knowledge-flow models cannot work well in cooperative
teams, in which team members usually have diverse knowledge-needs
depending on their roles, because those models only provide a single
view to all participants and do not reflect individual knowledge-needs
in flows. Hence, we propose a role-based knowledge-flow view model
in this work. The model builds knowledge-flow views (or virtual
knowledge flows) by creating appropriate virtual knowledge nodes
and generalizing knowledge concepts to required concept levels. The
customized views can represent each role's knowledge-needs
in a teamwork context. The novel model indicates knowledge-needs in a
condensed representation from a role's perspective and enhances the
efficiency of cooperative knowledge support in organizations.
Abstract: High precision in motion is required to manipulate the
micro objects in precision industries for micro assembly, cell
manipulation, etc. Precision manipulation is achieved based on the
appropriate mechanism design of micro devices such as
microgrippers. Design of a compliant mechanism is a better
option for achieving highly precise and controlled motion. This
research article highlights the method of designing a compliant
three-fingered microgripper suitable for holding asymmetric objects.
Topology optimization, a systematic technique, is
implemented in this research work to arrive at a topologically optimized
design of the mechanism needed to perform the required micro
motion of the gripper. The optimization technique has the drawback of
generating meaningless regions, such as node-to-node connectivity and
a staircase effect at the boundaries. Hence, post-processing of the
design is required to make it manufacturable. To reduce the
effect of the post-processing stage and to preserve the edges of the image,
a cubic spline interpolation technique is introduced in the MATLAB
program. Structural performance of the topologically developed
mechanism design is tested using finite element method (FEM)
software. Further the microgripper structure is examined to find its
fatigue life and vibration characteristics.
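Wait — the line above closes the abstract; the spline post-processing step it describes can be pictured with the short sketch below, which fits a cubic spline through a hypothetical staircase boundary profile (the coordinates are illustrative, not taken from the gripper design, and the paper uses MATLAB rather than Python/SciPy) to obtain a smooth, manufacturable edge.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# hypothetical staircase boundary from a topology-optimized layout:
# x runs along the edge, y is the jagged step height
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.0, 0.0, 1.0, 1.0, 2.0, 2.0])

spline = CubicSpline(x, y)            # interpolating cubic spline
xs = np.linspace(0.0, 5.0, 51)        # dense samples of the smoothed edge
smooth = spline(xs)
```

The spline passes through every sampled boundary point, so the overall geometry is preserved while the staircase corners are replaced by a twice-differentiable curve.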
Abstract: Knowledge is attributed to humans, whose problem-solving
behavior is subjective and complex. In today's knowledge
economy, the need to manage knowledge produced by a community
of actors cannot be overemphasized. This is due to the fact that
actors possess some level of tacit knowledge which is generally
difficult to articulate. Problem-solving requires searching and sharing
of knowledge among a group of actors in a particular context.
Knowledge expressed within the context of a problem resolution
must be capitalized for future reuse. In this paper, an approach that
permits dynamic capitalization of relevant and reliable actors'
knowledge in solving decision problems following the Economic
Intelligence process is proposed. A knowledge annotation method and
temporal attributes are used for handling the complexity in the
communication among actors and in contextualizing expressed
knowledge. A prototype is built to demonstrate the functionalities of
a collaborative Knowledge Management system based on this
approach. It is tested with sample cases, and the results showed that
dynamic capitalization leads to knowledge validation, hence
increasing reliability of captured knowledge for reuse. The system
can be adapted to various domains.
Abstract: Saudi Arabia has in recent years seen a drastic increase
in traffic-related crashes. With a population of over 29 million, Saudi
Arabia is considered a fast-growing and emerging economy. The
rapid population increase and economic growth has resulted in rapid
expansion of transportation infrastructure, which has led to increase
in road crashes. The Saudi Ministry of Interior reported more than
7,000 people killed and 68,000 injured in 2011, ranking Saudi Arabia
among the worst countries worldwide in traffic safety. The traffic
safety issues in the country also cause distress to road users and an
economic loss exceeding 3.7 billion Euros annually. Keeping this in
view, the researchers in Saudi Arabia are investigating ways to
improve traffic safety conditions in the country. This paper presents a
multilevel approach to collecting the traffic safety data required for
traffic safety studies in the region. Two highway corridors, the
39-kilometre King Fahd Highway and the 42-kilometre Gulf
Cooperation Council Highway, connecting the cities of Dammam and
Khobar, were selected as the study area. Traffic data collected included
traffic counts, crash data, travel time data, and speed data. The
collected data was analysed using a geographic information system to
evaluate correlations. Further research is needed to investigate the
effectiveness of traffic safety related data when collected in a
concerted effort.
Abstract: Modern management of water distribution systems
(WDS) needs water quality models that can accurately predict
the dynamics of water quality variations within the distribution system
environment. Before water quality models can be applied to solve
system problems, they must be calibrated. Although previous
researchers have used GA solvers to calibrate the relevant parameters,
these are difficult to apply to large- or medium-scale real systems
owing to long computation times. In this paper, a new method is designed
which combines both macro and detailed models to optimize the water
quality parameters. This new combinational algorithm uses radial
basis function (RBF) metamodeling as a surrogate to be optimized for
the purpose of reducing the number of time-consuming water quality
simulations, and can rapidly calibrate the pipe wall reaction
coefficients of the chlorine model of a large-scale WDS. Two case
studies show this method to be more efficient and promising, and
worth generalizing in the future.
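The surrogate idea can be sketched as follows, assuming SciPy's RBFInterpolator and a stand-in one-parameter error function in place of the actual chlorine-decay simulation: the expensive simulator is sampled at a few candidate wall-reaction coefficients, an RBF metamodel is fitted to the resulting error surface, and the cheap metamodel is searched instead of the simulator.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_simulation(k):
    """Stand-in for a time-consuming chlorine-decay simulation:
    returns squared error vs. observations for wall coefficient k.
    (Hypothetical; the true optimum here is k = 0.35.)"""
    return (k - 0.35) ** 2 + 0.01

# sample the "simulator" at a few wall-reaction coefficients
K = np.linspace(0.0, 1.0, 9).reshape(-1, 1)
E = np.array([expensive_simulation(k[0]) for k in K])

surrogate = RBFInterpolator(K, E)     # RBF metamodel of the error surface

# cheaply search the surrogate instead of re-running the simulator
grid = np.linspace(0.0, 1.0, 1001).reshape(-1, 1)
k_best = float(grid[np.argmin(surrogate(grid))])
```

In practice the promising region found on the surrogate would be re-evaluated with the detailed model, which is the macro/detailed combination the abstract describes.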
Abstract: Deaths from cardiovascular diseases have decreased substantially over the past two decades, largely as a result of advances in acute care and cardiac surgery. These developments have produced a growing population of patients who have survived a myocardial infarction. These patients need to be continuously monitored so that treatment can be initiated within the crucial golden hour. The available conventional monitoring methods mostly perform offline analysis and restrict the mobility of these patients to a hospital or room. Hence the aim of this paper is to design a Portable Cardiac Telemedicine System to help patients regain their independence and return to an active work schedule, thereby improving their psychological well-being. The portable telemedicine system consists of a Wearable ECG Transmitter (WET) and a slightly modified mobile phone with an inbuilt ECG analyzer. The WET is placed on the body of the patient and continuously acquires ECG signals from high-risk cardiac patients, who can move around anywhere. The WET transmits the ECG to the patient's Bluetooth-enabled mobile phone over Bluetooth. The ECG analyzer inbuilt in the mobile phone continuously analyzes the heartbeats derived from the received ECG signals. In case of any panic condition, the mobile phone alerts the patient's caretaker by SMS and initiates the transmission of a sample ECG signal to the doctor via the mobile network.
Abstract: Information hiding, especially watermarking is a
promising technique for the protection of intellectual property rights.
This technology has mainly been advanced for multimedia, but the same
has not been done for text. Web pages, like other documents, need
protection against piracy. In this paper, some techniques are
proposed to show how to hide information in web pages using some
features of the markup language used to describe these pages. Most
of the techniques proposed here hide information in white space or
in the markup language's alternative ways of representing elements.
Experiments on a very small page and an analysis of five
thousand web pages show that these techniques have a wide
bandwidth available for information hiding, and they might form a
solid base to develop a robust algorithm for web page watermarking.
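One of the whitespace techniques can be sketched as follows (an illustrative encoding of the general idea, not the paper's exact algorithm): append a trailing space for a 0 bit or a tab for a 1 bit to successive lines of the page, since trailing whitespace does not change how browsers render the markup.

```python
def hide(html, bits):
    """Embed a bit string by appending a space (0) or a tab (1)
    to successive lines of the page."""
    lines = html.split("\n")
    out = []
    for i, line in enumerate(lines):
        if i < len(bits):
            line += " " if bits[i] == "0" else "\t"
        out.append(line)
    return "\n".join(out)

def reveal(html):
    """Recover the hidden bit string from trailing whitespace."""
    bits = ""
    for line in html.split("\n"):
        if line.endswith("\t"):
            bits += "1"
        elif line.endswith(" "):
            bits += "0"
    return bits

page = "<html>\n<body>\n<p>Hello</p>\n<p>World</p>\n</body>\n</html>"
stego = hide(page, "1011")
```

The bandwidth of this particular scheme is one bit per line, which is why the abstract's analysis of page sizes matters; a robust watermark would combine several such channels.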
Abstract: Recent years have seen a growing trend towards the
integration of multiple information sources to support large-scale
prediction of protein-protein interaction (PPI) networks in model
organisms. Despite advances in computational approaches, the
combination of multiple "omic" datasets representing the same type
of data, e.g. different gene expression datasets, has not been
rigorously studied. Furthermore, there is a need to further investigate
the inference capability of powerful approaches, such as fully-connected
Bayesian networks, in the context of the prediction of PPI
networks. This paper addresses these limitations by proposing a
Bayesian approach to integrating multiple datasets, some of which
encode the same type of "omic" data, to support the identification of
PPI networks. The case study reported involved the combination of
three gene expression datasets relevant to human heart failure (HF).
In comparison with two traditional methods, Naive Bayesian and
maximum likelihood ratio approaches, the proposed technique can
accurately identify known PPIs and can be applied to infer potentially
novel interactions.
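The Naive Bayesian integration used as a baseline can be illustrated with a short sketch: under the independence assumption, each dataset contributes a likelihood ratio that multiplies into the prior odds that a protein pair interacts. The prior and the per-dataset ratios below are made-up toy values, not estimates from the heart failure study.

```python
import math

def combine_evidence(prior, likelihood_ratios):
    """Naive-Bayes combination: multiply independent likelihood ratios
    into the prior odds of interaction, then convert back to a
    posterior probability."""
    log_odds = math.log(prior / (1 - prior))
    for lr in likelihood_ratios:
        log_odds += math.log(lr)   # independence assumption
    odds = math.exp(log_odds)
    return odds / (1 + odds)

# toy values: prior chance a random protein pair interacts,
# and likelihood ratios from three gene-expression datasets
prior = 0.001
p = combine_evidence(prior, [50.0, 10.0, 4.0])
```

Working in log-odds keeps the combination numerically stable when many datasets are integrated; the fully-connected Bayesian network studied in the paper relaxes the independence assumption that this baseline relies on.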
Abstract: Grid computing provides an effective infrastructure for massive computation among a flexible and dynamic collection of individual systems for resource discovery. The major challenge for grid computing is to prevent breaches and secure the data from trespassers. To overcome such conflicts, a semantic approach can be designed that filters peers' access requests by checking the resource description specifying the data and the metadata as factual statements. Between every pair of nodes in the grid, a semantic firewall will be present as middleware. A requester will be required to present an application specifying its needs to the firewall, and accordingly the system will grant or deny the request.