Abstract: Vibration characteristics of subcooled flow boiling on
thin and long structures such as a heating rod were recently
investigated by the author. The results show that the intensity of the
subcooled boiling-induced vibration (SBIV) was influenced strongly
by the conditions of the subcooling temperature, linear power density
and flow velocity. Implosive bubble formation and collapse are the
defining features of subcooled boiling, and their behavior is the sole
source of SBIV. Therefore, in order to explain the
phenomenon of SBIV, it is essential to obtain reliable information
about bubble behavior in subcooled boiling conditions. This was
investigated at different conditions of coolant subcooling
temperatures of 25 to 75°C, coolant flow velocities of 0.16 to
0.53 m/s, and linear power densities of 100 to 600 W/cm. High-speed
photography at 13,500 frames per second was performed at these
conditions. The results show that even at the highest subcooling
condition, the vast majority of bubbles collapse very close to the
heating surface after detaching from it. Based on these
observations, a simple model of surface tension and momentum
change is introduced to offer a rough quantitative estimate of the
force exerted on the heating surface during the bubble ebullition. The
formation of a typical bubble in subcooled boiling is predicted to
exert an excitation force on the order of 10⁻⁴ N.
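The order-of-magnitude claim can be illustrated with a minimal sketch combining the two force contributions named in the abstract, surface tension and momentum change. The bubble diameter, surface tension, liquid density and growth/collapse time scale below are generic illustrative values, not data from the paper:

```python
import math

def surface_tension_force(d, sigma):
    """Surface tension force acting around a bubble's contact perimeter (N)."""
    return math.pi * d * sigma

def momentum_change_force(rho, d, dt):
    """Force from the momentum change as a bubble of diameter d grows and
    collapses over a time dt: F = m * a with a ~ d / dt^2."""
    m = rho * (math.pi / 6.0) * d**3   # mass of displaced liquid (kg)
    return m * d / dt**2

# Illustrative values (assumptions, not taken from the paper):
sigma = 0.059   # N/m, water surface tension near saturation
rho = 958.0     # kg/m^3, saturated-water density
d = 1.0e-3      # m, typical subcooled-boiling bubble diameter
dt = 1.0e-3     # s, growth/collapse time scale

F_st = surface_tension_force(d, sigma)
F_mom = momentum_change_force(rho, d, dt)
print(f"surface tension: {F_st:.1e} N, momentum change: {F_mom:.1e} N")
```

With these values both contributions land on the order of 10⁻⁴ N, consistent with the estimate quoted in the abstract.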
Abstract: Rapid advances in the field of Information and Communication Technology (ICT) have facilitated the development of teaching and learning methods and prepared them to serve the needs of diverse educational institutions. In other words, the information age has redefined the fundamentals and transformed institutions and methods of service delivery forever. The vision articulates a desire to transform teaching and learning through e-learning. E-learning is commonly understood as the use of networked information and communication technology in teaching and learning practice. This paper deals with the general aspects of e-learning, including the issues, developments, opportunities and challenges that higher-education institutions face.
Abstract: Understanding biological behavior and phenomena at the system level requires various elements such as gene sequences, protein structures, gene functions and metabolic pathways. Challenging problems include representing, learning and reasoning about these biochemical reactions, gene and protein structures, the relation between genotype and phenotype, and the expression systems built on those interactions. The goal of our work is to understand the behavior of interaction networks and to model their evolution in time and in space. In this study we propose an ontological meta-model for the knowledge representation of genetic regulatory networks. In artificial intelligence, an ontology defines the fundamental categories and relations that provide a framework for knowledge models. Domain ontologies are now commonly used to enable heterogeneous information resources, such as knowledge-based systems, to communicate with each other. The interest of our model is to represent spatial, temporal and spatio-temporal knowledge. We validated our propositions on the genetic regulatory network of the Arabidopsis thaliana flower.
Abstract: The most influential programming paradigm today
is object oriented (OO) programming and it is widely used in
education and industry. Recognizing the importance of equipping
students with OO knowledge and skills, it is not surprising that most
Computer Science degree programs offer OO-related courses. How
do we assess whether students have acquired the right object-oriented
skills after they have completed their OO courses? What are
object-oriented skills? None of the current assessment techniques can
provide this answer. Traditional forms of OO programming assessment
provide a way to assign numerical scores and determine letter grades,
but this rarely reveals how students actually understand OO concepts.
A better understanding of how to define and assess OO skills is
therefore needed, and developing a criterion-referenced model is one
way to achieve it. This is especially critical in the context of
Malaysia, where there is currently a growing concern over the level of
competency of Malaysian IT graduates in object-oriented programming.
This paper discusses the approach used to develop the
criterion-referenced assessment model. The model can serve as a
guideline when conducting OO programming assessment. The proposed
model is derived using the Goal Question Metric (GQM) methodology,
which helps formulate the metrics of interest. The paper concludes
with a few suggestions for further study.
Abstract: With the explosive growth of data available on the
Internet, personalization of this information space has become a
necessity. With the rapidly increasing popularity of the WWW,
websites play a crucial role in conveying knowledge and
information to the end users. Discovering hidden and meaningful
information about Web users' usage patterns is critical to determining
effective marketing strategies and to optimizing Web server usage to
accommodate future growth. The task of mining useful information
becomes more challenging when the Web traffic volume is enormous
and keeps growing. In this paper, we propose an intelligent model
to discover and analyze useful knowledge from the available Web
log data.
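As an illustration of the kind of processing such a model rests on, the sketch below sessionizes toy Web log records and counts frequent page transitions. The session gap, record layout and URLs are assumptions for illustration, not the paper's actual model:

```python
from collections import Counter, defaultdict

def sessionize(log, gap=1800):
    """Group (user, timestamp, url) records into sessions: a new session
    starts when the gap between consecutive requests exceeds `gap` seconds."""
    by_user = defaultdict(list)
    for user, ts, url in sorted(log, key=lambda r: (r[0], r[1])):
        by_user[user].append((ts, url))
    sessions = []
    for user, hits in by_user.items():
        current = [hits[0][1]]
        for (t_prev, _), (t_cur, url) in zip(hits, hits[1:]):
            if t_cur - t_prev > gap:
                sessions.append(current)
                current = []
            current.append(url)
        sessions.append(current)
    return sessions

def frequent_transitions(sessions, top=3):
    """Count consecutive page transitions across all sessions."""
    counts = Counter()
    for s in sessions:
        counts.update(zip(s, s[1:]))
    return counts.most_common(top)

# Hypothetical log records: (user, unix timestamp, requested page)
log = [
    ("u1", 0, "/home"), ("u1", 60, "/products"), ("u1", 120, "/cart"),
    ("u1", 5000, "/home"), ("u1", 5060, "/products"),
    ("u2", 10, "/home"), ("u2", 70, "/products"),
]
print(frequent_transitions(sessionize(log)))
```

Real deployments would add log parsing, robot filtering and scalable pattern mining, but the session/transition decomposition above is the usual starting point.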
Abstract: The evaluation of conversational agents or chatterbot question answering systems is a major research area that needs much attention. Before the rise of domain-oriented conversational agents based on natural language understanding and reasoning, evaluation was never a problem, as information retrieval-based metrics were readily available for use. However, when chatterbots began to become more domain-specific, evaluation became a real issue. This is especially true when understanding and reasoning are required to cater for a wider variety of questions and, at the same time, to achieve high-quality responses. This paper discusses the inappropriateness of the existing measures for response quality evaluation, and the call for new standard measures and related considerations is brought forward. As a short-term solution for evaluating the response quality of conversational agents, and to demonstrate the challenges in evaluating systems of a different nature, this research proposes a black-box approach using observation, a classification scheme and a scoring mechanism to assess and rank three example systems: AnswerBus, START and AINI.
Abstract: Intelligent systems are required to quickly and accurately analyze the enormous quantities of data in the Internet environment. In intelligent systems, information extraction processes can be divided into supervised learning and unsupervised learning. This paper investigates intelligent clustering by unsupervised learning. Intelligent clustering is a clustering system that determines the clustering model for data analysis and evaluates the results by itself. Such a system can build a clustering model more rapidly, objectively and accurately than a human analyzer. The methodology for the automatic clustering intelligent system is a multi-agent system that comprises a clustering agent and a cluster performance evaluation agent. The agents exchange information about clusters, and the system determines the optimal cluster number from this information. Experiments using data sets from the UCI Machine Learning Repository are performed to demonstrate the validity of the system.
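A minimal sketch of the two-agent idea, assuming a simple 1-D k-means as the clustering agent and a silhouette score as the evaluation agent; the paper's actual agents and evaluation criteria may differ:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal 1-D k-means (the clustering agent)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: abs(p - centers[i]))].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return clusters

def silhouette(clusters):
    """Mean silhouette coefficient (the evaluation agent); singletons score 0."""
    clusters = [c for c in clusters if c]
    if len(clusters) < 2:
        return -1.0
    scores = []
    for ci, c in enumerate(clusters):
        for p in c:
            if len(c) == 1:
                scores.append(0.0)
                continue
            a = sum(abs(p - q) for q in c if q is not p) / (len(c) - 1)
            b = min(sum(abs(p - q) for q in o) / len(o)
                    for cj, o in enumerate(clusters) if cj != ci)
            scores.append((b - a) / max(a, b) if max(a, b) > 0 else 0.0)
    return sum(scores) / len(scores)

def best_k(points, k_range=range(2, 6)):
    """The agents cooperate: try each k and keep the best-scoring partition."""
    return max(k_range, key=lambda k: silhouette(kmeans(points, k)))

data = [1.0, 1.1, 0.9, 1.2, 5.0, 5.1, 4.9, 5.2]
print(best_k(data))   # should select 2 for two well-separated groups
```

The point of the sketch is the division of labor: one component proposes partitions, another scores them, and the system, not the analyst, picks the cluster number.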
Abstract: As the use of registration packages spreads, the number of aligned image pairs in image databases (aligned either by manual or by automatic methods) increases dramatically. These image pairs can serve as a set of training data; correspondingly, the images that are to be registered serve as testing data. In this paper, a novel medical image registration method is proposed which is based on a priori knowledge of the expected joint intensity distribution estimated from pre-aligned training images. The goal of the registration is to find the optimal transformation such that the distance between the observed joint intensity distribution obtained from the testing image pair and the expected joint intensity distribution obtained from the corresponding training image pair is minimized. The distance is measured using a divergence measure based on Tsallis entropy. Experimental results show that, compared with the widely used Shannon mutual information as well as Tsallis mutual information, the proposed method is computationally more efficient without sacrificing registration accuracy.
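The Tsallis-entropy-based distance can be sketched with the standard Tsallis relative entropy applied to toy joint-intensity histograms; the bin layout and values below are hypothetical, and the paper's exact divergence form may differ:

```python
def tsallis_divergence(p, r, q=2.0):
    """Tsallis relative entropy D_q(P||R) = (sum p_i^q * r_i^(1-q) - 1)/(q - 1).
    Reduces to the Kullback-Leibler divergence as q -> 1."""
    assert abs(sum(p) - 1.0) < 1e-9 and abs(sum(r) - 1.0) < 1e-9
    s = sum(pi**q * ri**(1.0 - q) for pi, ri in zip(p, r) if pi > 0)
    return (s - 1.0) / (q - 1.0)

# Hypothetical flattened 2x2 joint-intensity histograms:
expected = [0.40, 0.10, 0.10, 0.40]        # from a pre-aligned training pair
observed_good = [0.38, 0.12, 0.11, 0.39]   # well-aligned test pair
observed_bad = [0.25, 0.25, 0.25, 0.25]    # poorly aligned test pair

print(tsallis_divergence(observed_good, expected, q=2.0))
print(tsallis_divergence(observed_bad, expected, q=2.0))
```

Registration then searches the transformation space for the pose whose observed histogram minimizes this divergence against the expected one.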
Abstract: In this study, a high accuracy protein-protein interaction
prediction method is developed. The importance of the proposed
method is that it only uses sequence information of proteins while
predicting interaction. The method extracts phylogenetic profiles of
proteins by using their sequence information. Combining the phylogenetic
profiles of two proteins by checking the existence of homologs
in different species and fitting this combined profile into a statistical
model, it is possible to make predictions about the interaction status
of two proteins.
For this purpose, we apply a collection of pattern recognition
techniques on the dataset of combined phylogenetic profiles of protein
pairs. Support Vector Machines, Feature Extraction using ReliefF,
Naive Bayes Classification, K-Nearest Neighbor Classification,
Decision Trees, and Random Forest Classification are the methods
we applied for finding the classification method that best predicts
the interaction status of protein pairs. Random Forest Classification
outperformed all other methods, with a prediction accuracy of 76.93%.
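A minimal sketch of combining two phylogenetic profiles and classifying the result. A toy k-nearest-neighbour classifier stands in for the paper's Random Forest, and the profiles and training pairs are entirely hypothetical:

```python
def combine_profiles(p1, p2):
    """Element-wise combination of two binary phylogenetic profiles: for each
    reference species, encode homolog presence of (protein A, protein B)
    as one of four states 0..3."""
    return [2 * a + b for a, b in zip(p1, p2)]

def knn_predict(train, query, k=3):
    """Toy k-nearest-neighbour stand-in for the paper's Random Forest;
    train is a list of (combined_profile, interacts) pairs."""
    dist = lambda u, v: sum(x != y for x, y in zip(u, v))  # Hamming distance
    votes = sorted(train, key=lambda t: dist(t[0], query))[:k]
    return sum(lbl for _, lbl in votes) > k // 2

# Hypothetical profiles over six reference species (1 = homolog present):
pA = [1, 1, 0, 1, 0, 1]
pB = [1, 1, 0, 1, 0, 0]
train = [
    (combine_profiles([1, 1, 0, 1, 0, 1], [1, 1, 0, 1, 0, 1]), 1),  # co-evolving
    (combine_profiles([1, 0, 1, 0, 1, 0], [0, 1, 0, 1, 0, 1]), 0),  # anti-correlated
    (combine_profiles([1, 1, 1, 1, 0, 1], [1, 1, 0, 1, 0, 0]), 1),
    (combine_profiles([0, 0, 1, 1, 1, 0], [1, 1, 0, 0, 0, 1]), 0),
]
print(knn_predict(train, combine_profiles(pA, pB)))
```

The combined profile preserves which protein contributed each presence/absence call, which is the feature representation the statistical model learns from.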
Abstract: Currently in many major cities, public transit schedules
are disseminated through lists of routes, grids of stop times and
static maps. This paper describes a web based geographic information
system which disseminates the same schedule information through
intuitive GIS techniques. Using data from Calgary, Canada, a
map-based interface has been created that allows users to see routes,
stops and moving buses all at once. Zoom and pan controls as well as
satellite imagery allow users to apply their personal knowledge of the
local geography to achieve faster and more pertinent transit results.
Using asynchronous requests to web services, users are immersed
in an application where buses and stops can be added and removed
interactively, without the need to wait for responses to HTTP requests.
Abstract: Trust management is one of the main challenges in Peer-to-Peer (P2P) systems. The lack of centralized control makes it difficult to control the behavior of the peers. Reputation systems are one approach to providing trust assessment in P2P systems. In this paper, we use fuzzy logic to model trust in a P2P environment. Our trust model combines first-hand (direct experience) and second-hand (reputation) information to allow peers to represent and reason with uncertainty regarding other peers' trustworthiness. Fuzzy logic can help in handling the imprecise nature and uncertainty of trust, and linguistic labels are used to enable peers to assign a trust level intuitively. Our fuzzy trust model is flexible in that inference rules are used to weight first-hand and second-hand information accordingly.
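A minimal sketch of the model's ingredients, assuming triangular membership functions for the linguistic labels and a fixed 0.7/0.3 weighting of first-hand versus second-hand information; both are illustrative assumptions, not the paper's actual inference rules:

```python
def triangular(x, a, b, c):
    """Triangular membership function over [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Linguistic trust labels as fuzzy sets on [0, 1] (illustrative shapes):
LABELS = {
    "low":    lambda x: triangular(x, -0.01, 0.0, 0.5),
    "medium": lambda x: triangular(x, 0.0, 0.5, 1.0),
    "high":   lambda x: triangular(x, 0.5, 1.0, 1.01),
}

def combined_trust(direct, reputation, w_direct=0.7):
    """Weight first-hand (direct) experience above second-hand reputation;
    the 0.7/0.3 split is an assumption for illustration."""
    return w_direct * direct + (1.0 - w_direct) * reputation

def linguistic_label(trust):
    """Map a crisp trust value to the best-matching linguistic label."""
    return max(LABELS, key=lambda name: LABELS[name](trust))

t = combined_trust(direct=0.9, reputation=0.6)
print(t, linguistic_label(t))
```

A full fuzzy system would replace the fixed weights with inference rules (e.g. discounting reputation when direct experience is plentiful), but the fuzzification and labeling steps above are the core of the representation.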
Abstract: The theory of rough sets is generalized by using a
filter. The filter is induced by binary relations and it is used to
generalize the basic rough set concepts. Knowledge representation
and the processing of binary relations in the style of rough set
theory are investigated.
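A minimal sketch of relation-based (rather than equivalence-based) approximations, the basic concepts that the filter construction generalizes; the universe and relation below are illustrative:

```python
def successor(R, x, universe):
    """Neighbourhood of x under a binary relation R (a set of (a, b) pairs)."""
    return {y for y in universe if (x, y) in R}

def lower_approx(R, X, universe):
    """Lower approximation: elements whose whole neighbourhood lies in X."""
    return {x for x in universe if successor(R, x, universe) <= X}

def upper_approx(R, X, universe):
    """Upper approximation: elements whose neighbourhood meets X."""
    return {x for x in universe if successor(R, x, universe) & X}

U = {1, 2, 3, 4}
# An arbitrary (non-equivalence) binary relation, generalizing the
# indiscernibility relation of classical rough set theory:
R = {(1, 1), (1, 2), (2, 2), (3, 3), (3, 4), (4, 4)}
X = {1, 3}

print(lower_approx(R, X, U))   # neighbourhoods fully inside X
print(upper_approx(R, X, U))   # neighbourhoods intersecting X
```

Here X = {1, 3} has an empty lower approximation but a nonempty upper one, so it is "rough" with respect to R; the boundary between the two approximations is exactly what the generalized theory studies.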
Abstract: In many countries, digital city or ubiquitous city
(u-City) projects have been initiated to provide digitalized economic
environments to cities. Recently in Korea, Kangwon Province has
started the u-Kangwon project to boost local economy with digitalized
tourism services. We analyze the limitations of the ubiquitous IT
approach through the u-Kangwon case. We have found that travelers
are more interested in the quality of information access than in its
speed. For
improved service quality, we are looking to develop an
IT-convergence service design framework (ISDF). The ISDF is based
on service engineering techniques and is composed of three parts:
Service Design, Service Simulation, and the Service Platform.
Abstract: The information society is an entirely new social formation in which the infrastructure and social relations correspond to the socialized essence of mankind's "information genotype". The information society is a natural social environment that allows a person to reveal his or her information nature fully and to use intelligence to create, jointly with other people, new information on the basis of the knowledge accumulated by previous generations.
Abstract: This work is devoted to the description of the information
technologies created and successfully maintained at the Institute of
Information Technologies of the National Academy of Sciences (NAS)
of Azerbaijan. On the basis of a decision of the board of the Supreme
Certifying Commission under the President of the Azerbaijan Republic
and of the Presidium of the National Academy of Sciences of the
Azerbaijan Republic, the Institute of Information Technologies was
entrusted with organizing Computer Science training courses for all
post-graduate students and dissertators of the republic and with
administering the corresponding candidate-minimum examinations.
Accordingly, the Institute's Educational Center teaches computer
science to post-graduate students and dissertators, provides a
scientific and methodological manual on the effective application of
new information technologies in their research work, and administers
the candidate-minimum examinations.
Information and communication technologies offer new opportunities
and prospects for teaching and training. The new level of literacy
demands the creation of an essentially new technology for obtaining
scientific knowledge. Methods of training and development, social and
professional requirements, and the globalization of the communicative,
economic and political projects connected with the construction of a
new society all depend on the level to which information and
communication technologies are applied in the educational process.
Computer technologies develop the ideas of programmed instruction and
open completely new, as yet unexplored ways of training connected
with the unique capabilities of modern computers and
telecommunications. Computer training technologies are processes of
preparing and transferring information to the trainee by means of a
computer. Scientific and technical progress, as well as the global
spread of the technologies created in the most developed countries of
the world, is the main proof of the leading role of education in the
twenty-first century. The information society needs individuals with
modern knowledge. In practice, all technologies that use special
technical information means (computer, audio, video) are called
information technologies of education.
Abstract: Construction project control attempts to obtain real-time information and to enhance dynamic control and management via information sharing and analysis among project participants, so as to eliminate construction conflicts and project delays. However, survey results for Taiwan indicate that commercial construction project management software is not widely accepted by subcontractors and suppliers. To solve the project communication problems among participants, this study presents a novel system called the Construction Dynamic Teams Communication Management (Con-DTCM) system for small-to-medium-sized subcontractors and suppliers in the Taiwanese construction industry, and demonstrates that the Con-DTCM system delivers the most recent project information efficiently and enhances the management of project teams (general contractor, suppliers and subcontractors) through a web-based environment. Web-based technology effectively enhances information sharing during construction project management and generates cost savings via the Internet. The main unique characteristic of the proposed Con-DTCM system is that it is extremely user-friendly and easy to design compared with current commercial project management applications. The Con-DTCM system is applied to a case study of a building construction project in Taiwan to confirm the proposed methodology and demonstrate the effectiveness of information sharing during the construction phase. The advantages of the Con-DTCM system lie in improving project control and management efficiency for general contractors, and in providing dynamic project tracking and management, which enables subcontractors and suppliers to acquire the most recent project-related information. Furthermore, this study presents and implements a generic system architecture.
Abstract: Automated rule discovery is, due to its applicability, one of the most fundamental and important methods in KDD, and it has been an active research area in the recent past. Hierarchical representation allows us to easily manage the complexity of knowledge, to view the knowledge at different levels of detail, and to focus our attention on the interesting aspects only. One such efficient and easy-to-understand system is the Hierarchical Production Rule (HPR) system. An HPR, a standard production rule augmented with generality and specificity information, is of the following form: Decision If <condition> Generality <generality information> Specificity <specificity information>. HPR systems are capable of handling the taxonomical structures inherent in knowledge about the real world. This paper focuses on the issue of mining quantified rules with a crisp hierarchical structure using a Genetic Programming (GP) approach to knowledge discovery. The post-processing scheme presented in this work uses quantified production rules as the initial individuals of GP and discovers the hierarchical structure. In the proposed approach, rules are quantified using Dempster-Shafer theory. Suitable genetic operators are proposed for the suggested encoding, and an appropriate fitness function is suggested based on the Subsumption Matrix (SM). Finally, Quantified Hierarchical Production Rules (HPRs) are generated from the discovered hierarchy using Dempster-Shafer theory. Experimental results are presented to demonstrate the performance of the proposed algorithm.
Abstract: A virtualized and virtual approach is presented for
academically preparing students to successfully engage at a strategic
perspective to understand those concerns and measures that are both
structured and not structured in the area of cyber security and
information assurance. The Master of Science in Cyber Security and
Information Assurance (MSCSIA) is a professional degree for those
who endeavor through technical and managerial measures to ensure
the security, confidentiality, integrity, authenticity, control,
availability and utility of the world's computing and information
systems infrastructure. The National University Cyber Security and
Information Assurance program is offered as a Master's degree. The
emphasis of the MSCSIA program uniquely includes hands-on
academic instruction using virtual computers. This past year, 2011,
the NU facility has become fully operational using system
architecture to provide a Virtual Education Laboratory (VEL)
accessible to both onsite and online students. The first student cohort
completed their MSCSIA training this past March 2, 2012 after
fulfilling 12 courses, for a total of 54 units of college credits. The
rapid pace scheduling of one course per month is immensely
challenging, perpetually changing, and virtually multifaceted. This
paper analyses these descriptive terms in light of the
globalization-driven penetration breaches present in today's world of
cyber security. In addition, we present current NU practices to
mitigate risks.
Abstract: Hidden failure in a protection system has been
recognized as one of the main causes of power system instability
leading to system cascading collapse. This paper
presents a computationally systematic approach used to obtain the
estimated average probability of a system cascading collapse by
considering the effect of the probability of hidden failure in a protection
system. The estimated average probability of a system cascading
collapse is then used to determine the severe loading condition
contributing to the higher risk of critical system cascading collapse.
This information is essential to the system utility since it will assist
the operator to determine the highest point of increased system
loading condition prior to the event of critical system cascading
collapse.
Abstract: This paper challenges the relevance of knowledge-based
management research by arguing that the majority of the
literature emphasizes information and knowledge provision instead of
their business usage. For this reason the related processes are
considered valuable and eligible as such, which has led to the
overlapping nature of knowledge-based management disciplines. As
a solution, this paper turns the focus on the information usage. Value
of knowledge and respective management tasks are then defined by
the business need and the knowledge-user becomes the main actor.
The paper analyses the prevailing literature streams and recognizes
the need for a more focused and robust understanding of
knowledge-based value creation. The paper contributes by synthesizing the
existing literature and pinpointing the essence of knowledge-based
management disciplines.