Abstract: Gradual patterns have been studied for many years as
they contain precious information. They have been integrated in
many expert systems and rule-based systems, for instance to reason
about knowledge such as “the greater the number of turns, the greater
the number of car crashes”. In many cases, this knowledge has been
treated as a rule, “the greater the number of turns → the greater
the number of car crashes”. Historically, work has thus
focused on the representation of such rules, studying how implication
could be defined, especially fuzzy implication. These rules were
defined by experts who were in charge of describing the systems they
worked on so that those systems could operate automatically. More
recently, approaches have been proposed in order to mine databases
for automatically discovering such knowledge. Several approaches
have been studied, the main scientific topics being: how to determine
what a relevant gradual pattern is, and how to discover such patterns as
efficiently as possible (in terms of both memory and CPU usage).
However, in some cases, end-users are not interested in raw, low-level
knowledge, but rather in trends. Moreover, it may be
the case that no relevant pattern can be discovered at a low level of
granularity (e.g. city), whereas some can be discovered at a higher
level (e.g. county). In this paper, we thus extend gradual pattern
approaches to consider multi-level gradual patterns. For
this purpose, we consider two aggregation policies, namely
horizontal and vertical.
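One common way to make the relevance of a gradual pattern concrete is to count the object pairs that are ordered consistently on all the attributes involved. The sketch below illustrates this pair-based support measure; the data and attribute names are hypothetical, and actual mining approaches use more elaborate definitions and optimizations than this brute-force loop.

```python
from itertools import combinations

def gradual_support(rows, attrs):
    """Support of the gradual pattern 'the greater each attribute, the
    greater the others': the fraction of object pairs that are ordered
    consistently (concordant) on all attributes at once."""
    pairs = list(combinations(range(len(rows)), 2))
    concordant = 0
    for i, j in pairs:
        if all(rows[i][a] < rows[j][a] for a in attrs) or \
           all(rows[i][a] > rows[j][a] for a in attrs):
            concordant += 1
    return concordant / len(pairs) if pairs else 0.0

# hypothetical data: number of turns vs. number of crashes per road segment
data = [{"turns": 2, "crashes": 1},
        {"turns": 5, "crashes": 3},
        {"turns": 8, "crashes": 7},
        {"turns": 3, "crashes": 2}]
print(gradual_support(data, ["turns", "crashes"]))  # 1.0: all pairs concordant
```

With this measure, a pattern holding at city level can be re-evaluated after aggregating rows to county level, which is the kind of multi-level comparison the paper targets.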
Abstract: The traveling salesman problem (TSP) is hard to solve
when the number of cities and routes becomes large. A frequency
graph is constructed to tackle the problem. A frequency graph
maintains the topological relationships of the original weighted graph.
The numbers on the edges are the frequencies of the edges, derived
from the local optimal Hamiltonian paths. The simplest kind of local
optimal Hamiltonian paths are computed based on the four vertices
and three lines inequality. A search algorithm is given to find the
optimal Hamiltonian circuit based on the frequency graph. The
experiments show that the method can find the optimal Hamiltonian
circuit within several trials.
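As an illustration of the idea (not the paper's exact construction, which derives its candidate paths from the four vertices and three lines inequality), a frequency graph can be sketched as an edge counter accumulated over a set of locally optimal Hamiltonian paths:

```python
from collections import Counter

def edge_frequencies(paths):
    """Count how often each undirected edge appears across a set of
    candidate (e.g. locally optimal) Hamiltonian paths."""
    freq = Counter()
    for path in paths:
        for u, v in zip(path, path[1:]):
            freq[frozenset((u, v))] += 1  # undirected: (u,v) == (v,u)
    return freq

# hypothetical locally optimal paths on 4 cities
paths = [[0, 1, 2, 3], [0, 2, 1, 3], [1, 0, 2, 3]]
freq = edge_frequencies(paths)
print(freq[frozenset((2, 3))])  # edge (2,3) appears in 2 of the 3 paths
```

A search for the optimal circuit can then prefer high-frequency edges, since they survive many local optima while the graph's topology is preserved.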
Abstract: The state of the art in instructional design for
computer-assisted learning has been strongly influenced by advances
in information technology, the Internet and Web-based systems. The
emphasis of educational systems has shifted from training to
learning. The course delivered has also changed from large,
inflexible content to sequential small chunks of learning objects. The
concepts of learning objects together with the advanced technologies
of Web and communications support the reusability, interoperability,
and accessibility design criteria currently exploited by most learning
systems. These concepts enable just-in-time learning. We propose to
extend these design criteria further to include a learnability
concept that helps adapt content to the needs of learners. The
learnability concept offers better personalization, leading to the
creation and delivery of course content better suited to the
performance and interests of each learner. In this paper we present a
new framework of learning environments containing knowledge
discovery as a tool to automatically learn patterns of learning
behavior from learners' profiles and history.
Abstract: Laser soldering is based on applying a soldering material (albumin) onto the approximated edges of the cut and heating the solder (and the underlying tissues) with a laser beam. Endogenous and exogenous materials such as indocyanine green (ICG) are often added to solders to enhance light absorption. Gold nanoshells are new materials whose optical response is dictated by plasmon resonance. The wavelength at which the resonance occurs depends on the core and shell sizes, allowing nanoshells to be tailored for particular applications. The purpose of this study was to use a combination of ICG and different concentrations of gold nanoshells for skin tissue soldering, and also to examine the effect of laser soldering parameters on the properties of the repaired skin. Two mixtures of albumin solder and different combinations of ICG and gold nanoshells were prepared. A full-thickness incision of 2×20 mm² was made on the surface and, after addition of the mixtures, it was irradiated by an 810 nm diode laser at different power densities. The changes of tensile strength σt due to temperature rise, number of scans (Ns), and scan velocity (Vs) were investigated. The results showed that, at constant laser power density (I), σt of repaired incisions increases with increasing concentration of gold nanoshells in the solder and with increasing Ns, and with decreasing Vs. It is therefore important to consider the tradeoff between the scan velocity and the surface temperature to achieve an optimum operating condition. In our case this corresponds to σt = 1800 g/cm² at I ≈ 47 W/cm², T ≈ 85 °C, Ns = 10 and Vs = 0.3 mm/s.
Abstract: Surface metrology with image processing is a challenging task with wide applications in industry. Surface roughness can be evaluated using a texture classification approach. An important aspect here is the appropriate selection of features that characterize the surface. We propose an effective combination of features for multi-scale and multi-directional analysis of engineering surfaces. The features include the standard deviation, kurtosis and the Canny edge detector. We apply the method by analyzing the surfaces with the Discrete Wavelet Transform (DWT) and the Dual-Tree Complex Wavelet Transform (DT-CWT). We used the Canberra distance metric for similarity comparison between the surface classes. Our database includes surface textures manufactured by three machining processes, namely milling, casting and shaping. The comparative study shows that DT-CWT outperforms DWT, giving a correct classification performance of 91.27% with the Canberra distance metric.
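The Canberra distance used for similarity comparison is standard: the sum over coordinates of |x_i − y_i| / (|x_i| + |y_i|). A minimal sketch, with hypothetical feature vectors standing in for the per-subband statistics:

```python
def canberra(x, y):
    """Canberra distance: sum_i |x_i - y_i| / (|x_i| + |y_i|),
    skipping terms where both coordinates are zero."""
    total = 0.0
    for a, b in zip(x, y):
        denom = abs(a) + abs(b)
        if denom:
            total += abs(a - b) / denom
    return total

# hypothetical feature vectors (e.g. std-dev / kurtosis per wavelet subband)
print(canberra([1.0, 2.0, 3.0], [1.0, 1.0, 0.0]))  # 0 + 1/3 + 1 = 1.333...
```

Because each term is normalized by the coordinate magnitudes, the metric is sensitive to small differences in small-valued features, which suits heterogeneous texture statistics.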
Abstract: The planning community has long been discussing emerging paradigms within planning theory in the face of the changing conditions of the world order. The paradigm shift concept was introduced by Thomas Kuhn in 1960, who claimed the necessity of shifts within the boundaries of scientific knowledge; following him, in 1970 Imre Lakatos also gave priority to the emergence of multi-paradigm societies [24]. The multi-paradigm condition is changing our predetermined lifeworld through uncertainties. These uncertainties have two sides: the first is uncertainty as a concept of possibility and creativity in the public sphere, and the second is uncertainty as a risk. Therefore, it is necessary to apply a resilience planning approach to be more dynamic in controlling uncertainties which have the potential to transfigure present definitions of time and space. In this way, the stability of the system can be achieved. Uncertainty is not only an outcome of worldwide changes but also a place-specific issue, i.e. it changes from continent to continent, country to country and region to region. Therefore, applying strategic spatial planning with respect to the resilience principle contributes to controlling, grasping and internalizing uncertainties through place-specific strategies. In today's fast-changing world, the planning system should follow strategic spatial projects to manage multi-paradigm societies with adaptive capacities. Here, we have selected two alternatives to demonstrate this: 1. Tehran (Iran) from the Middle East and 2. Bath (United Kingdom) from Europe. The study elaborates the uncertainties and particularities in their strategic spatial planning processes in a comparative manner. Through the comparison, the study aims at assessing place-specific priorities in strategic planning. The approach is a two-way stream, in which the case cities, from opposite ends of the spectrum, can learn from each other.
The structure of this paper is first to compare a semi-periphery city (Tehran) and a core-periphery city (Bath), with a focus on revealing how they are equipped to face uncertainties according to their geographical locations and local particularities. Secondly, the key message to address is: “Each locality requires its own strategic planning approach to be resilient.”
Abstract: This article discusses the concept of student ownership of knowledge and seeks to determine how to move students from knowledge acquisition to knowledge application and ultimately to knowledge generation in a virtual setting. Instructional strategies for fostering student engagement in a virtual environment are critical to the learner's strategic ownership of the knowledge. A number of relevant theories that focus on learning, affect, needs and adult concerns are presented to provide a basis for exploring the transfer of knowledge from teacher to learner. A model under development is presented that combines the dimensions of knowledge approach and the teacher-student relationship with regard to knowledge authority and teaching approach, to demonstrate a recursive and scaffolded design for the creation of virtual learning environments.
Abstract: Wind power is among the most actively developing distributed generation (DG) technologies. The majority of wind power based DG technologies employ wind turbine induction generators (WTIG) instead of synchronous generators, for technical advantages such as reduced size, increased robustness, lower cost, and increased electromechanical damping. However, dynamic changes of wind speed make the amount of active/reactive power injected into or drawn from a WTIG-embedded distribution network highly variable. This paper analyzes the effect of wind speed changes on the active and reactive power penetration of the wind energy embedded distribution network. Four types of wind speed change, namely constant, linear, gust and random, are considered in the analysis. The study is carried out by three-phase, non-linear, dynamic simulation of distribution system component models. Results obtained from the investigation are presented and discussed.
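The four wind speed change types can be sketched as simple time profiles; the shapes below (a cosine-shaped gust, a linear ramp, uniform random fluctuation) and all numeric parameters are illustrative assumptions, not the values used in the paper:

```python
import math
import random

def wind_speed(t, kind, base=10.0):
    """Hypothetical wind-speed profiles (m/s) for the four change types."""
    if kind == "constant":
        return base
    if kind == "linear":
        return base + 0.5 * t                      # steady ramp
    if kind == "gust":                             # one cosine gust, 5 s..10 s
        if 5.0 <= t <= 10.0:
            return base + 3.0 * (1 - math.cos(2 * math.pi * (t - 5.0) / 5.0)) / 2
        return base
    if kind == "random":
        return base + random.uniform(-2.0, 2.0)    # bounded random fluctuation
    raise ValueError(kind)

print(wind_speed(7.5, "gust"))  # peak of the gust: 13.0
```

Feeding such profiles into the WTIG model is what exposes how active and reactive power exchange with the network varies with each change type.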
Abstract: Stick models are widely used in studying the
behaviour of straight as well as skew bridges and viaducts subjected
to earthquakes while carrying out preliminary studies. The
application of such models to highly curved bridges continues to
pose challenging problems. A viaduct proposed in the foothills of the
Himalayas in Northern India is chosen for the study. It has 8
simply supported spans at 30 m c/c. It is doubly curved in the horizontal
plane with a 20 m radius and is inclined in the vertical plane as well. The
superstructure consists of a box section. Three models have been
used: a conventional stick model, an improved stick model and a 3D
finite element model. The improved stick model is employed by
making use of body constraints in order to study its capabilities. The
first 8 frequencies differ by about 9.71% between the latter two models;
the difference increases to 80% by the 50th mode. The viaduct was
subjected to all three components of the El Centro earthquake of May
1940. The numerical integration was carried out using the Hilber-
Hughes-Taylor method as implemented in SAP2000. Axial forces
and moments in the bridge piers as well as lateral displacements at
the bearing levels are compared for the three models. The maximum
differences in axial forces, bending moments and displacements
are up to 25% between the improved stick and finite element
models, whereas the maximum differences in axial forces,
moments, and displacements in various sections are up to 35%
between the improved stick model and the equivalent straight stick
model. The difference in torsional moment was as high as 75%. It is
concluded that the stick model with body constraints to model the
bearings and expansion joints is not desirable in very sharp S-curved
viaducts, even for preliminary analysis. This model can be used only
to determine the first 10 frequencies and mode shapes, but not member
forces. A 3D finite element analysis must be carried out for
meaningful results.
Abstract: The possible advantages of technology in educational
contexts have required defining the boundaries of formal and informal
learning. The increasing opportunity for ubiquitous learning through
technological support has raised the question of how to discover
the potential of individuals in spontaneous environments such as
social networks. This seems related to the question of for what
purposes social networks are being used. Social networks
provide various advantages in an educational context, such as
collaboration, knowledge sharing, common interests, active
participation and reflective thinking. Consequently, the purpose of
this study is to propose a new model that can determine the
factors which affect the adoption of social network applications for
use in an educational context. In developing the model proposal, the
existing adoption and diffusion models were reviewed; an original
perspective was considered more suitable than adopting other
diffusion or acceptance models wholesale, because the nature of
education differs from that of other organizations. In the
proposed model, social factors, perceived ease of use, perceived
usefulness and innovativeness are identified as the four direct
constructs that affect the adoption process. Facilitating conditions,
image, subjective norms and community identity are incorporated
into the model as antecedents of these four direct constructs.
Abstract: Facial expression analysis is rapidly becoming an
area of intense interest in the computer science and human-computer
interaction design communities. The most expressive way humans
display emotions is through facial expressions. In this paper we
present a method to analyze facial expression from images by
applying Gabor wavelet transform (GWT) and Discrete Cosine
Transform (DCT) on face images. Radial Basis Function (RBF)
Network is used to classify the facial expressions. As a second stage,
the images are preprocessed to enhance the edge details, and
non-uniform down-sampling is done to reduce the computational
complexity and processing time. Our method works reliably even
with faces that carry heavy expressions.
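As a sketch of one of the transforms involved, the 2-D DCT of a face image block can be computed with an orthonormal DCT-II basis matrix, keeping the top-left (low-frequency) coefficients as a compact feature vector. The block size and the number of retained coefficients here are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix of size n x n."""
    k = np.arange(n)[:, None]           # frequency index (rows)
    i = np.arange(n)[None, :]           # sample index (columns)
    m = np.cos(np.pi * (2 * i + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
    m[0, :] = np.sqrt(1.0 / n)          # DC row gets the smaller scale
    return m

def dct2_features(img, keep=4):
    """2-D DCT of an image block; keep the keep x keep low-frequency
    coefficients as a compact feature vector."""
    n, m = img.shape
    coeffs = dct_matrix(n) @ img @ dct_matrix(m).T
    return coeffs[:keep, :keep].ravel()

img = np.outer(np.hanning(8), np.hanning(8))   # stand-in for a face crop
feats = dct2_features(img)
print(feats.shape)  # (16,)
```

Vectors like `feats` (possibly concatenated with Gabor responses) are what an RBF network would then classify into expression categories.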
Abstract: Judgment is affected by many agents, and distortion in this assessment is unpreventable. Personality dimensions are among the factors that contribute to this distortion. In this research, the relations between the personality dimensions of a subject and his judgment of his friends' personality dimensions are investigated. One hundred friend couples completed both the NEO Five Factor Inventory (NEO-FFI) and the Ahvaz Reality Distortion Inventory (ARDI) to make judgments about themselves and their friends. Observations show that the judge's Agreeableness and Neuroticism dimensions are affected by reality distortion. On the other hand, this reality distortion interferes with one's evaluation of a friend's Agreeableness, Neuroticism, and Conscientiousness dimensions. Conscientiousness, with a suppressive effect on the judge's other dimensions, plays an irrelevant role in personality judgment. Therefore, observer-rating tools used as a conventional criterion appear not to be valid, because of the reality distortion due to the judge's personality dimensions.
Abstract: This paper presents recent work on the improvement
of a robotic vision-based control strategy for an underwater pipeline
tracking system. The study focuses on developing image processing
algorithms and a fuzzy inference system for the analysis of the
terrain. The main goal is to implement the supervisory fuzzy learning
control technique to reduce the errors on navigation decision due to
the pipeline occlusion problem. The system developed is capable of
interpreting underwater images containing occluded pipeline, seabed
and other unwanted noise. The algorithm proposed in previous work
does not explore the cooperation between fuzzy controllers,
knowledge and learnt data to improve the outputs for underwater
pipeline tracking. Computer simulations and prototype simulations
demonstrate the effectiveness of this approach. The accuracy level
of the system is also discussed.
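A supervisory fuzzy rule of the kind described ("trust the visual offset less as pipeline occlusion grows") can be sketched with triangular memberships and a weighted-average (Sugeno-style) blend. The rule set, membership shapes and variable ranges below are illustrative assumptions, not the paper's controller:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def steer_correction(occlusion, offset):
    """Blend two hypothetical rules: follow the visual offset when the
    pipeline is visible, hold course when it is occluded."""
    w_clear = tri(occlusion, -0.5, 0.0, 0.6)   # 'pipeline visible'
    w_occl  = tri(occlusion, 0.4, 1.0, 1.5)    # 'pipeline occluded'
    # rule 1: visible -> follow offset; rule 2: occluded -> hold course (0)
    return (w_clear * offset + w_occl * 0.0) / (w_clear + w_occl)

print(steer_correction(0.0, 0.2))  # fully visible: follow the offset, 0.2
```

The learning component described in the abstract would tune such memberships and rule outputs from navigation data rather than fixing them by hand.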
Abstract: This paper develops the fiscal health index of 21 local
governments in Taiwan over the 1984 to 2010 period. A quantile
regression analysis was used to explore the extent to which economic
variables, political budget cycles, and legislative checks and balances
impact different quantiles of the fiscal health index over the
sample period. Our findings suggest that local governments at
the lower quantile benefit significantly from political budget
cycles and from increases in central government revenues, while
effective legislative checks and balances and increases in central
government expenditures have a significantly negative effect on local
fiscal health. When local governments are in the upper tail of the
distribution, legislative checks and balances and macroeconomic
growth have significant and adverse effects on the fiscal
health of local governments. However, increases in central
government revenues have significant and positive effects on the
health status of local government in Taiwan.
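Quantile regression estimates a conditional quantile by minimizing the pinball (check) loss rather than squared error. A minimal sketch for the intercept-only case, where the minimizer is simply an empirical quantile of the data (the numbers are illustrative):

```python
def pinball_loss(q, y, pred):
    """Average check (pinball) loss at quantile q for a constant prediction."""
    return sum((q * (yi - pred)) if yi >= pred else ((q - 1) * (yi - pred))
               for yi in y) / len(y)

y = [1.0, 2.0, 4.0, 10.0]
# For an intercept-only model, the loss is minimized at the empirical q-quantile:
candidates = [c / 10 for c in range(0, 101)]
best = min(candidates, key=lambda c: pinball_loss(0.25, y, c))
print(best)  # lands in [1.0, 2.0], the empirical 0.25-quantile interval of y
```

With regressors added, the same loss is minimized over coefficients, which is why the estimated effects of, say, political budget cycles can differ between the lower and upper quantiles of fiscal health.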
Abstract: In this paper we define Business Case Management and its characteristics, and highlight its link to knowledge workers. Business Case Management combines knowledge and process effectively, supports the ad hoc and unpredictable nature of cases, and coordinates a range of other technologies to appropriately support knowledge-intensive processes. We emphasize the growing importance of knowledge workers and the currently poor support for knowledge work automation. We also discuss the challenges in supporting this kind of knowledge work and propose a novel approach to overcome them.
Abstract: With the rapid growth in business size, today's businesses are orienting towards electronic technologies. Amazon.com and eBay.com are some of the major stakeholders in this regard. Unfortunately, the enormous amount of hugely unstructured data on the web, even for a single commodity, has become a cause of ambiguity for consumers. Extracting valuable information from such ever-increasing data is an extremely tedious task and is fast becoming critical to the success of businesses. Web content mining can play a major role in solving these issues. It involves using efficient algorithmic techniques to search and retrieve the desired information from seemingly impossible-to-search unstructured data on the Internet. Application of web content mining can be very promising in the areas of customer relations modeling, billing records, logistics investigations, product cataloguing and quality management. In this paper we present a review of some very interesting, efficient yet implementable techniques from the field of web content mining and study their impact in areas specific to business user needs, focusing on the customer as well as the producer. The techniques we review include mining by developing a knowledge-base repository of the domain, iterative refinement of user queries for personalized search, a graph-based approach to the development of a web crawler, and filtering information for personalized search using website captions. These techniques are analyzed and compared on the basis of their execution time and the relevance of the results they produce for a particular search.
Abstract: The importance of low power consumption is widely
acknowledged due to the increasing use of portable devices, which
require minimizing the consumption of energy. Energy dissipation is
heavily dependent on the software used in the system. Applying
design patterns in object-oriented designs is a common practice
nowadays. In this paper we analyze six design patterns and explore
their effect on energy consumption and performance.
Abstract: Natural disasters, including earthquakes, kill many people around the world every year. Societal rescue actions, which start after the earthquake and are abbreviated LAST, include locating, access, stabilization and transportation. In the present article, we have studied the process of gaining local access to the injured and transporting them to health care centers. Given the heavy traffic load due to the earthquake, the destruction of connecting roads and bridges, and the heavy debris in alleys and streets, which put the lives of the injured and of people buried under the debris in danger, accelerating rescue actions and facilitating access are obviously of great importance. Tehran, the capital of Iran, is among the most crowded cities in the world and is the center of extensive economic, political, cultural and social activities. Tehran has a population of about 9.5 million, partly because of the immigration of people from the surrounding cities. Furthermore, considering the fact that Tehran is located on two important and large faults, an earthquake of magnitude 6 on the Richter scale in this city could lead to the greatest catastrophe in human history. The present study is a review, and a major part of the required information for it has been obtained from libraries. Rescue vehicles from around the world, including rescue helicopters, ambulances, fire-fighting vehicles and rescue boats, their applied technology, and also the robots specifically designed for rescue systems, with their advantages and disadvantages, have been investigated. The studies show that there is a significant relationship between the rescue team's arrival time at the incident zone and the number of people saved: if the duration of burial under debris is 30 minutes, the probability of survival is 99.3%; after a day it is 81%, after 2 days 19%, and after 5 days 7.4%. The existing transport systems all have some defects.
If these defects were removed, more people could be saved each hour and preparedness against natural disasters would be increased. In this study, a transport system has been designed for the rescue team and the injured, which can carry the rescue team to the incident zone and the injured to health care centers. In addition, this system is able both to fly in the air and to move on the ground, so that the destruction of roads and the heavy traffic load cannot prevent the rescue team from arriving early at the incident zone. The system also has the equipment required for debris removal, optimal transport of the injured and first aid.
Abstract: Electronic government is one of the notable concepts
that has been implemented successfully within recent decades.
Electronic government is a digital, wall-free government with a
virtual organization for presenting online governmental services
and for further cooperation in different political and social activities.
In order to implement an electronic government strategy successfully,
to benefit from its full potential, and generally to establish and
apply electronic government, it is necessary to have various
infrastructures as the basis of electronic government, without which
it is impossible to benefit from such services. For this purpose, in
this paper we identify the obstacles to the establishment of
electronic government in Iran. All data required for identifying the
obstacles were collected through a questionnaire from a statistical
population of specialists at the Ministry of Communications &
Information Technology of Iran and the Information Technology
Organization of Tehran Municipality. Then, using a five-point Likert
scale and μ = 3 as the index of relevant factors in the proposed
model, we specify the current obstacles to electronic government in
Iran, along with some guidelines and proposals in this regard.
According to the results, the obstacles to applying electronic
government in Iran are as follows: technical and technological
problems; legal, judicial and safety problems; economic problems;
and humanistic problems.
Abstract: The MATCH project [1] entails the development of an
automatic diagnosis system that aims to support the treatment of colon
cancer by discovering mutations that occur in tumour
suppressor genes (TSGs) and contribute to the development of
cancerous tumours. The system is based on a) colon cancer clinical
data and b) biological information that will be derived by data
mining techniques from genomic and proteomic sources. The core
mining module will consist of popular, well-tested hybrid feature
extraction methods and new combined algorithms designed especially
for the project. Elements of rough sets, evolutionary computing,
cluster analysis, self-organizing maps and association rules will be
used to discover the associations between genes and their influence
on tumours [2]-[11].
The methods used to process the data have to address its high
complexity, potential inconsistency and problems of dealing with
missing values. They must integrate all the useful information
necessary to solve the expert's question. For this purpose, the system
has to learn from data, or allow a domain specialist to interactively
specify, the part of the knowledge structure it needs to answer a
given query. The program should also take into account the
importance/rank of the particular parts of the data it analyses, and
adjust the algorithms used accordingly.
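Among the listed techniques, association rules are the simplest to sketch: a rule A → B is scored by its support (how often A and B occur together) and its confidence (how often B occurs given A) over a set of transactions. The item names below are hypothetical stand-ins for gene and tumour annotations:

```python
def support(transactions, itemset):
    """Fraction of transactions containing every item of the itemset."""
    s = set(itemset)
    return sum(s <= t for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Estimated P(consequent | antecedent) over the transactions."""
    return support(transactions, set(antecedent) | set(consequent)) / \
           support(transactions, antecedent)

# hypothetical annotation transactions, one per patient sample
tx = [{"geneA", "mut1"}, {"geneA", "mut1", "tumour"},
      {"geneA", "tumour"}, {"mut1", "tumour"}]
print(support(tx, {"geneA", "mut1"}))          # 2/4 = 0.5
print(confidence(tx, {"geneA"}, {"tumour"}))   # 2/3 ~ 0.667
```

Rules exceeding chosen support and confidence thresholds would then be candidate gene-tumour associations for the mining module to report.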