Abstract: Data mining has been integrated into application systems to enhance the quality of decision-making. This study focuses on the integration of data mining technology and Knowledge Management Systems (KMS), owing to the ability of data mining technology to create useful knowledge from large volumes of data, while a KMS vitally supports the creation and use of knowledge. The integration of data mining technology and KMS is widely used in business for enhancing and sustaining organizational performance. However, there is a lack of studies applying data mining technology and KMS in the education sector, particularly to students' academic performance, even though this reflects the performance of Institutions of Higher Learning (IHLs). Realizing its importance, this study seeks to integrate data mining technology and KMS to promote effective management of knowledge within IHLs. Several concepts from the literature are adapted to propose a new integrative data mining technology and KMS framework for an IHL.
Abstract: Simulation is a very powerful method for high-performance
and high-quality design in distributed systems, and currently perhaps
the only practical one, considering the heterogeneity, complexity and
cost of distributed systems. In Grid environments, for example, it is
hard or even impossible to perform scheduler performance
evaluation in a repeatable and controllable manner, as resources and
users are distributed across multiple organizations with their own
policies. In addition, Grid test-beds are limited, and creating an
adequately sized test-bed is expensive and time-consuming.
Scalability, reliability and fault tolerance are important
requirements for distributed systems in order to support distributed
computation. A distributed system with such characteristics is called
dependable. Large environments, like Clouds, offer unique
advantages, such as low cost, dependability and QoS satisfaction for all
users. Resource management in large environments calls for
high-performance scheduling algorithms guided by QoS constraints. This
paper presents a performance evaluation of scheduling heuristics
guided by different optimization criteria. The algorithms for
distributed scheduling are analyzed with respect to satisfying user
constraints while at the same time considering the independent
capabilities of resources. This analysis acts as a profiling step for
algorithm calibration. The performance evaluation is based on simulation.
The simulator is MONARC, a powerful tool for large-scale distributed
systems simulation. The novelty of this paper consists in synthetic
analysis results that offer guidelines for scheduler service
configuration and support empirically based decisions. The results
can inform optimizations to existing Grid DAG scheduling and the
selection of the proper DAG scheduling algorithm in various practical
situations.
Abstract: This study describes a micro device integrated with
multi-chamber for polymerase chain reaction (PCR) with different
annealing temperatures. The device consists of a reaction
polydimethylsiloxane (PDMS) chip and a cover glass chip, and is
equipped with cartridge heaters, fans, and thermocouples for
temperature control. In this prototype, commercial software is utilized
to determine the geometric and operational parameters that are
responsible for creating the denaturation, annealing, and extension
temperatures within the chip. Two cartridge heaters are placed at two
sides of the chip and maintained at two different temperatures to
achieve a thermal gradient on the chip during the annealing step. The
temperatures on the chip surface are measured via an infrared imager.
Some thermocouples inserted into the reaction chambers are used to
obtain the transient temperature profiles of the reaction chambers
during several thermal cycles. The experimental temperatures show a
trend similar to the simulated results. This work should interest
researchers working on high-temperature-based reactions and on
genomics or cell analysis.
Abstract: Consumer behaviour analysis represents an important
field of study in marketing. In particular, strategy development for
marketing and communications becomes more focused and effective
when marketers understand the motivations, behaviour
and psychology of consumers. While materialism has been found to
be one of the important elements in consumer behaviour, compulsive
consumption represents another aspect that has recently attracted
more attention. This is because of the growing prevalence of
dysfunctional buying that has raised concern in consumer societies.
Existing studies and analyses of the origins and motivations of
compulsive buying have mainly focused on either individual factors
or groups of related factors, and hence a need for a holistic view
exists. This paper provides a comprehensive perspective on
compulsive consumption and establishes relevant propositions
keeping the family life cycle stages as a reference for the incidence of
chronic consumer states and their influence on compulsive
consumption.
Abstract: The increasing use of the cell phone as a medium of human interaction is playing a vital role in solving riddles of crime as well. A young girl went missing from her home late one evening in August 2008, when her enraged relatives and villagers physically assaulted and chased away her fiancé, who often frequented her home. Two years later, her mother lodged a complaint against the relatives and villagers, alleging that after abduction her daughter had been either sold or killed, as she had failed to trace her. On investigation, a rusted cell phone with a partially visible IMEI number, clothes, bangles, a human skeleton, etc., recovered from an abandoned well in May 2011, were examined in the lab. All hopes were pinned on the identity of the cell phone, the only linking evidence to fix the scene of occurrence, supported by the call detail record (CDR), and to dispel doubts about the mode of the sudden disappearance or death, since DNA technology did not help in establishing the identity of the deceased. Conventional scientific methods were used without success, but the international mobile equipment identity (IMEI) number of the cell phone could be generated by using statistical analysis followed by online verification.
Abstract: Social cognitive theory explains that the power to inaugurate change is determined by the mutual influence of personal proclivity and social factors, which shapes one's motivations and expectations. In the construction industry, the green concept offers an opportunity to leave a lighter footprint on the environment. This opportunity, however, has not been fully grasped by many countries. As such, venturing into green construction would, for many practitioners, be their maiden experience. The decision to venture into a new practice such as green construction is influenced by certain drivers. This paper explores these drivers, which are further expanded into motivational factors that later become the platform upon which expectations for green construction stand. This theoretical concept of motivation and expectations, adapted from social cognitive theory, focuses on developers' views because of their crucial role in green application. The resulting conceptual framework, which serves as the basis for further research, will benefit the industry as it elucidates cognitive angles to attract more new entrants to green business.
Abstract: For high-speed networks, providing resilience against failures is an essential requirement. The main feature in designing next-generation optical networks is protecting and restoring high-capacity WDM networks from failures. Quick detection, identification and restoration make networks stronger and more reliable, even though failures cannot be avoided. Hence, it is necessary to develop fast, efficient and dependable fault localization or detection mechanisms. In this paper, we propose a new fault localization algorithm for WDM networks which can identify the location of a failure on a failed lightpath. Our algorithm detects the failed connection and then attempts to reroute the data stream through an alternate path. In addition, we develop an algorithm to analyze the information in the alarms generated by the components of an optical network in the presence of a fault. It uses alarm correlation to reduce the list of suspected components shown to network operators. Our simulation results show that the proposed algorithms achieve lower blocking probability and delay while attaining higher throughput.
Abstract: Environmental micro-organisms include a large number of taxa, and some species that are generally considered non-pathogenic can represent a risk in certain conditions, especially for elderly people and immunocompromised individuals. Chemotaxonomic techniques are powerful tools for identifying environmental micro-organisms, and cellular fatty acid methyl ester (FAME) content provides a powerful fingerprinting identification technique. A system based on an unsupervised artificial neural network (ANN) was set up using, as learning data, the fatty acid profiles of standard bacterial strains obtained by gas chromatography. We analysed 45 certified strains belonging to the genera Acinetobacter, Aeromonas, Alcaligenes, Aquaspirillum, Arthrobacter, Bacillus, Brevundimonas, Enterobacter, Flavobacterium, Micrococcus, Pseudomonas, Serratia, Shewanella and Vibrio. A set of 79 bacteria isolated from a drinking water line (AMGA, the major water supply system in Genoa) was used as an identification example, with results compared to the standard MIDI method. The resulting ANN output map proved to be a very powerful tool for identifying these fresh isolates.
Abstract: Various formal and informal brand alliances are being formed in professional service firms. A professional service corporate brand is heavily dependent on the brands of the professional employees who comprise it, and professional employee brands are in turn dependent on the corporate brand. Prior work provides limited scientific evidence of brand alliance effects in the professional service area – i.e., how professional service corporate-employee brand allies are affected by an alliance, what the brand attitude effects are after alliance formation, and how these effects vary with different strengths of an ally. Scientific literature analysis and theoretical modeling are the main methods of the current study. As a result, a theoretical model is constructed for estimating spillover effects of professional service corporate-employee brand alliances and for comparison among different professional service firm expertise practice models – from the “brains” to the “procedure” model. The resulting theoretical model lays the basis for future experimental studies.
Abstract: Pipeline infrastructures normally represent a high investment cost, and a pipeline must be free from risks that could cause environmental hazards and potential threats to personnel safety. Pipeline integrity monitoring and management thus become crucial to providing unimpeded transportation and avoiding unnecessary production deferment. Proper cleaning and inspection are therefore the key to safe and reliable pipeline operation, play an important role in a pipeline integrity management program, and have become a standard industry procedure. In view of this, understanding the motion (dynamic behavior) of the PIG and predicting and controlling its speed are important in executing a pigging operation, as they offer significant benefits such as estimating the PIG's arrival time at the receiving station, planning a suitable pigging operation, and improving the efficiency of pigging tasks. The objective of this paper is to review recent developments in speed control systems for pipeline PIGs. The review serves industrial applications as a quick reference on recent developments in pipeline PIG speed control, may prompt others to add to or update the list in the future, leading to a knowledge base, and should attract the active interest of others in sharing their viewpoints.
Abstract: Measuring the complexity of software has been an
insoluble problem in software engineering. Complexity measures can
be used to predict critical information about testability, reliability,
and maintainability of software systems from automatic analysis of
the source code. During the past few years, many complexity
measures have been invented based on the emerging Cognitive
Informatics discipline. These software complexity measures,
including cognitive functional size, are based on the total
cognitive weights of basic control structures such as loops
and branches. This paper shows that the current calculation
method can generate different results that are algebraically
equivalent. However, analysis of the combinatorial meaning of this
calculation method reveals a significant flaw in the measure, which also
explains why it does not satisfy Weyuker's properties. Based on these
findings, improvement directions, such as measure fusion and a
cumulative variable counting scheme, are suggested to enhance the
effectiveness of cognitive complexity measures.
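As a hedged illustration of the weighting approach the abstract refers to, the commonly cited cognitive-weight scheme assigns each basic control structure (BCS) a weight, adds weights of structures at the same nesting level, and multiplies across nesting levels. A minimal sketch (the weights and the toy program are illustrative assumptions, not the paper's data):

```python
# Hedged sketch of the cognitive-weight calculation for basic control
# structures (BCSs).  The weights (sequence = 1, branch = 2,
# iteration = 3) and the toy program are illustrative assumptions.
BCS_WEIGHT = {"sequence": 1, "branch": 2, "iteration": 3}

def weight(node):
    """node = (bcs_kind, [nested children]); weights of structures at
    the same level add, while nesting multiplies."""
    kind, children = node
    w = BCS_WEIGHT[kind]
    if children:
        w *= sum(weight(c) for c in children)
    return w

# a loop whose body contains a branch followed by a plain statement
program = ("iteration", [("branch", []), ("sequence", [])])
print(weight(program))   # 3 * (2 + 1) = 9
```

The flaw analyzed in the paper concerns exactly this kind of calculation, where structurally different programs can yield the same total weight.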
Abstract: Within the realm of e-government, development has moved towards testing new means for democratic decision-making, such as e-panels, electronic discussion forums, and polls. Although such new developments seem promising, they are not problem-free, and their outcomes are seldom used in the subsequent formal political procedures. Nevertheless, process models offer promising potential when it comes to structuring and supporting the transparency of decision processes, in order to integrate the public into decision-making procedures in a reasonable and manageable way. Based on real-life cases of urban planning processes in Sweden, we present an outline of an integrated framework for public decision-making to: a) provide tools for citizens to organize discussion and form opinions; b) enable governments, authorities, and institutions to better analyse these opinions; and c) enable governments to account for this information in planning and societal decision-making by employing a process model for structured public decision-making.
Abstract: A precision CMOS chopping amplifier is adopted in this work to make a CMOS temperature sensor sensitive enough for intracranial temperature monitoring. An amplified temperature sensitivity of 18.8 ± 3×0.2 mV/°C is attained over the temperature range from 20 °C to 80 °C across 10 samples from the same wafer. The analog front-end design outputs temperature-dependent and temperature-independent signals which can be directly interfaced to a 10-bit ADC to build an accurate temperature instrumentation system.
Abstract: The classification of the protein structure is commonly
not performed for the whole protein but for structural domains, i.e.,
compact functional units preserved during evolution. Hence, a first
step to a protein structure classification is the separation of the
protein into its domains. We approach the problem of protein domain
identification by proposing a novel graph-theoretical algorithm. We
represent the protein structure as an undirected, unweighted and
unlabeled graph whose nodes correspond to the secondary structure
elements of the protein. This graph is called the protein graph. The
domains are then identified as partitions of the graph corresponding
to vertex sets obtained by maximizing an objective function,
which mutually maximizes the cycle distributions found in the
partitions of the graph. Our algorithm does not utilize any other kind
of information besides the cycle-distribution to find the partitions. If
a partition is found, the algorithm is iteratively applied to each of
the resulting subgraphs. As a stop criterion, we numerically calculate
a significance level which indicates the stability of the predicted
partition against a random rewiring of the protein graph. Hence,
our algorithm terminates its iterative application automatically. We
present results for one- and two-domain proteins, compare our
results with the domains manually assigned in the SCOP database,
and discuss the differences.
Abstract: The source voltage of a high-power fuel cell shows strong load dependence at comparatively low voltage levels. In order to provide a voltage of 750 V on the DC link for feeding electrical energy into the mains via a three-phase inverter, a step-up converter with a large step-up ratio is required. The output voltage of this DC/DC converter must remain stable during variations of the load current and of the fuel-cell voltage. This paper presents the methods and results of calculating the efficiency and the realization cost of DC/DC-converter circuits that meet these requirements.
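The need for a large step-up ratio can be illustrated with the ideal (lossless) boost-converter relation V_out = V_in / (1 − D), a standard textbook formula rather than the paper's specific topology; the 250 V fuel-cell voltage below is an assumed example value:

```python
# Ideal boost-converter conversion ratio V_out = V_in / (1 - D).
# Standard textbook relation, used here only to illustrate the large
# step-up ratio required; 250 V is an assumed fuel-cell voltage.
def boost_duty(v_in, v_out):
    """Duty cycle D needed by an ideal boost converter."""
    return 1.0 - v_in / v_out

print(boost_duty(250.0, 750.0))   # D ≈ 0.667 for a 3:1 step-up to 750 V
```

Real converters deviate from this ideal ratio because of losses, which is why the paper evaluates efficiency alongside cost.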
Abstract: In this work, the condensation fraction and transition
temperature of a neutral many-boson system are studied within the
static fluctuation approximation (SFA). The effect of the potential
parameters such as the strength and range on the condensate fraction
was investigated. A model potential consisting of a repulsive step
potential and an attractive potential well was used. As the potential
strength or the core radius of the repulsive part increases, the
condensation fraction is found to decrease at the same
temperature. Also, as the potential depth or the range of the attractive
part increases, the condensation fraction is found to increase. The
transition temperature decreases as the potential strength or the
core radius of the repulsive part increases, and increases as the
potential depth or the range of the attractive part increases.
Abstract: We consider a cooperative game played by n players against a referee. The players' names are randomly distributed among n lockers, with one name per locker. Each player may open up to half of the lockers and must find his own name. Once the game starts, the players may not communicate. It has previously been shown that, quite surprisingly, an optimal strategy exists for which the success probability is never worse than 1 − ln 2 ≈ 0.306. In this paper we consider an extension where the number of lockers is greater than the number of players, so that some lockers are empty. We show that the players may still win with positive probability even if there is a constant number k of empty lockers. We show that for each fixed probability p, there is a constant c such that the players can win with probability at least p if they are allowed to open cn lockers.
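The strategy behind the 1 − ln 2 bound is the well-known cycle-following (pointer-chasing) strategy: player i opens locker i, then the locker named by each card found, and the team wins iff every cycle of the underlying permutation has length at most n/2. A Monte Carlo sketch of the classic game (not the paper's extension with empty lockers) estimates this probability:

```python
import random

def play(n, trials=20000, seed=0):
    """Monte Carlo estimate of the success probability of the
    cycle-following strategy in the n-player locker game."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        perm = list(range(n))
        rng.shuffle(perm)          # locker i contains name perm[i]
        seen = [False] * n
        ok = True
        for start in range(n):
            length, cur = 0, start
            while not seen[cur]:   # walk each cycle once
                seen[cur] = True
                cur = perm[cur]
                length += 1
            if length > n // 2:    # a long cycle: every player on it fails
                ok = False
                break
        wins += ok
    return wins / trials

p_hat = play(100)
print(p_hat)   # close to the asymptotic bound 1 - ln 2 ≈ 0.3069
```

For finite n the exact success probability is slightly above the asymptotic bound, which the simulation reproduces.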
Abstract: One of the basic concepts in marketing is the concept
of meeting customers' needs. Since customer satisfaction is essential
for lasting survival and development of a business, screening and
observing customer satisfaction and recognizing its underlying
factors must be one of the key activities of every business.
The purpose of this study is to identify the drivers that affect
customer satisfaction in a business-to-business setting in order to
improve marketing activities. We conducted a survey in which 93
business customers of a manufacturer of diesel generators in Iran
participated, sharing their views on and satisfaction with the
supplier's services related to its products. We first developed the
measures for the drivers of satisfaction through investigative research
(by means of feedback from executives and customers of the sponsoring
firm). Then, based on these measures, we created a mail survey and asked
the respondents to state their opinion about the sponsoring firm, which
was a supplier of diesel generators and similar products. Furthermore,
the survey required the participants to state their functional areas
and their company features.
In conclusion, we found that there are three drivers of customer
satisfaction: reliability, information about the product, and
commercial features. Buyers/users from different functional areas
attribute different degrees of importance to the last two drivers. For
instance, people from the buying and management areas believe that
commercial features are more important than information about
products, whereas people in the engineering, maintenance and production
areas believe that having information about products is more
important than commercial aspects. Marketing experts should
consider customers' attitudes regarding information about the
product and commercial features to improve market share.
Abstract: Computer modeling has played a unique role in
understanding electrocardiography. Modeling and simulating cardiac
action potential propagation is suitable for studying normal and
pathological cardiac activation. This paper presents a 2-D Cellular
Automata model for simulating action potential propagation in
cardiac tissue. We demonstrate a novel algorithm that uses a
minimum number of neighbors, relying on the summation of the
excitability attributes of excited neighboring cells. We
eliminate flat edges in the resulting patterns by introducing
probability into the model. We also preserve the real shape of the
action potential by using linear curve fitting of a well-known
electrophysiological model.
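The excitable-media behaviour such models capture can be sketched with a minimal Greenberg-Hastings-style cellular automaton; this is an illustrative sketch of action-potential wave propagation, not the authors' calibrated model (the states, refractory period and grid size are assumptions):

```python
import numpy as np

# Minimal Greenberg-Hastings-style excitable-media CA (assumed
# parameters).  States: 0 = resting, 1 = excited, 2..R = refractory.
R, N = 4, 50

def step(grid):
    new = grid.copy()
    excited = (grid == 1).astype(int)
    # number of excited von Neumann neighbours of each cell
    nbrs = (np.roll(excited, 1, 0) + np.roll(excited, -1, 0)
            + np.roll(excited, 1, 1) + np.roll(excited, -1, 1))
    new[(grid == 0) & (nbrs >= 1)] = 1     # resting cell becomes excited
    aging = (grid >= 1) & (grid < R)
    new[aging] = grid[aging] + 1           # excited/refractory cells age
    new[grid == R] = 0                     # recovery back to resting
    return new

grid = np.zeros((N, N), dtype=int)
grid[N // 2, N // 2] = 1                   # single excited seed cell
for _ in range(10):
    grid = step(grid)
# an expanding diamond-shaped wavefront of excited cells
print(int((grid == 1).sum()))   # 40 cells at Manhattan distance 10
```

The refractory states behind the wavefront are what prevent re-excitation, so a single stimulus produces one outward-travelling wave, as in cardiac tissue.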
Abstract: Landslide susceptibility map delineates the potential
zones for landslide occurrence. Previous works have applied
multivariate methods and neural networks for mapping landslide
susceptibility. This study proposed a new approach to integrate
decision tree model and spatial cluster statistic for assessing landslide
susceptibility spatially. A total of 2057 landslide cells were digitized
for developing the landslide decision tree model. The relationships of
landslides and instability factors were explicitly represented by using
tree graphs in the model. The local Getis-Ord statistics were used to
cluster cells with high landslide probability. The analytic result from
the local Getis-Ord statistics was classified to create a map of landslide
susceptibility zones. The map was validated using new landslide data
with 482 cells. Results of validation show an accuracy rate of 86.1% in
predicting new landslide occurrence. This indicates that the proposed
approach is useful for improving landslide susceptibility mapping.
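The hot-spot clustering step can be sketched with the local Getis-Ord Gi* statistic on a raster of landslide probabilities; the 3×3 neighbourhood, binary weights and toy data below are illustrative assumptions, not the study's exact parameterisation (and edges wrap around for simplicity):

```python
import numpy as np

# Sketch of the local Getis-Ord Gi* hot-spot statistic on a probability
# raster (assumed 3x3 binary-weight neighbourhood; toy data).
def getis_ord_gstar(p):
    n = p.size
    xbar, s = p.mean(), p.std()
    # 3x3 neighbourhood sum, cell itself included (binary weights)
    local = sum(np.roll(np.roll(p, di, 0), dj, 1)
                for di in (-1, 0, 1) for dj in (-1, 0, 1))
    w = 9.0                      # sum of weights per cell
    num = local - xbar * w
    den = s * np.sqrt((n * w - w * w) / (n - 1))
    return num / den             # z-scores: large values = clusters

prob = np.zeros((20, 20))
prob[9:12, 9:12] = 0.9           # a cluster of high-probability cells
z = getis_ord_gstar(prob)
print(bool(z[10, 10] > z[0, 0])) # the cluster centre is a hot spot
```

Classifying such z-scores into ranges is one way to derive susceptibility zones of the kind the map in this study presents.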