Abstract: We have defined two suites of metrics, which cover
static and dynamic aspects of component assembly. The static
metrics measure complexity and criticality of component assembly,
wherein complexity is measured using Component Packing Density
and Component Interaction Density metrics. Further, four criticality
conditions, namely Link, Bridge, Inheritance, and Size criticalities,
have been identified and quantified. The complexity and criticality
metrics are combined to form a Triangular Metric, which can be used
to classify the type and nature of applications. Dynamic metrics are
collected during the runtime of a complete application. Dynamic
metrics are useful for identifying super-components and for evaluating
the degree of utilisation of various components. In this paper, both
static and dynamic metrics are evaluated using Weyuker's set of
properties. The results show that the metrics provide a valid means to
measure issues in component assembly. We relate our metrics suite to
McCall's Quality Model and illustrate their impact on product quality
and on the management of component-based product development.
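As a rough illustration of how such assembly-level metrics can be computed, here is a minimal sketch; the formulas below are our own reading of the named metrics (constituents per component; actual versus available interactions), not necessarily the paper's exact definitions, and the example numbers are hypothetical.

```python
# Hedged sketch of component-assembly metrics; the formulas are illustrative
# assumptions, NOT necessarily the paper's exact definitions.

def packing_density(n_constituents: int, n_components: int) -> float:
    """Average number of constituents (classes, methods, ...) per component."""
    return n_constituents / n_components

def interaction_density(actual: int, available: int) -> float:
    """Fraction of the available component interactions actually used."""
    return actual / available

# A hypothetical assembly: 5 components packing 120 classes,
# using 9 of 20 available inter-component interfaces.
print(packing_density(120, 5))      # 24.0
print(interaction_density(9, 20))   # 0.45
```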
Abstract: Computer languages are usually lumped together
into broad 'paradigms', leaving us in want of a finer classification
of kinds of language. Theories distinguishing between 'genuine
differences' among languages have been called for, and we propose that
such differences can be observed through a notion of expressive mode.
We outline this concept, propose how it could be operationalized, and
indicate a possible context for the development of a corresponding
theory. Finally, we consider a possible application in connection
with the evaluation of language revision. We illustrate this with a case,
investigating possible revisions of the relational algebra in order to
overcome weaknesses of the division operator in connection with
universal queries.
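For readers unfamiliar with the operator in question, here is a minimal sketch of relational division, the operator that expresses universal ("for all") queries; the relation names and example data are illustrative assumptions, not taken from the paper.

```python
# Relational division: R(A, B) / S(B) = {a | for every b in S, (a, b) is in R}.
# Minimal set-based sketch; the relations below are illustrative only.

def divide(r: set, s: set) -> set:
    """Return the a-values related in r to every b-value in s."""
    candidates = {a for (a, b) in r}
    return {a for a in candidates if all((a, b) in r for b in s)}

# 'Which students have taken ALL required courses?' -- a classic universal query.
takes = {("ann", "db"), ("ann", "os"), ("bob", "db")}
required = {"db", "os"}
print(divide(takes, required))  # {'ann'}
```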
Abstract: This paper solves the environmental/economic dispatch
power system problem using the Non-dominated Sorting Genetic
Algorithm-II (NSGA-II) and its hybrid with a Convergence Accelerator
Operator (CAO), called the NSGA-II/CAO. These multiobjective
evolutionary algorithms were applied to the standard IEEE 30-bus
six-generator test system. Several optimization runs were carried out
on different cases of problem complexity. Different quality measures
that compare the performance of the two solution techniques were
considered. The results demonstrated that the inclusion of the CAO
in the original NSGA-II improves its convergence while preserving
the diversity properties of the solution set.
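For context, environmental/economic dispatch is usually posed with two competing per-generator objectives: a quadratic fuel-cost function and an emission function. The sketch below evaluates both objectives for a candidate dispatch; the coefficient values are placeholder assumptions, not the IEEE 30-bus data (and emission models often add an exponential term omitted here).

```python
import numpy as np

# Illustrative coefficients for a toy 3-generator system (NOT the IEEE 30-bus data).
a, b, c = np.array([10.0, 15.0, 20.0]), np.array([2.0, 1.8, 2.2]), np.array([0.010, 0.012, 0.008])
alpha, beta, gamma = np.array([4.0, 3.5, 5.0]), np.array([-0.05, -0.04, -0.06]), np.array([0.006, 0.005, 0.007])

def cost(p):
    """Total fuel cost: sum of a_i + b_i*P_i + c_i*P_i^2 ($/h)."""
    return float(np.sum(a + b * p + c * p**2))

def emission(p):
    """Total emission: sum of alpha_i + beta_i*P_i + gamma_i*P_i^2 (ton/h)."""
    return float(np.sum(alpha + beta * p + gamma * p**2))

p = np.array([50.0, 60.0, 40.0])  # a candidate dispatch in MW
print(cost(p), emission(p))       # the two objectives NSGA-II trades off
```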
Abstract: Irradiated material is a typical example of a complex
system with nonlinear coupling between its elements. During
irradiation, radiation damage develops, and this development
exhibits bifurcations and qualitatively different kinds of behavior.
The accumulation of primary defects in irradiated crystals is
considered within the framework of the nonlinear evolution of a
complex system. Nonlinear thermo-concentration feedback is identified
as the mechanism of self-oscillation development.
It is shown that there are two regimes of defect density evolution
under stationary irradiation. The first is the accumulation of defects: the
defect density monotonically grows and tends to its stationary state
for some system parameters. The other regime, which occurs for
suitable parameters, is the development of self-oscillations of the
defect density.
The stationary state, its stability and type are found. The
bifurcation values of parameters (environment temperature, defect
generation rate, etc.) are obtained. The frequency of the
self-oscillations and the conditions for their development are found
and estimated. It is shown that the defect density, heat fluxes and
temperature during self-oscillations can reach much higher values than
the expected steady-state values. This can lead to a departure from
normal operation and to accidents, e.g. in nuclear equipment.
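A toy two-variable model can illustrate the thermo-concentration feedback loop described above; the equations, rates, and parameter values here are purely illustrative assumptions of ours, not the paper's model.

```python
import numpy as np

# Toy feedback model (illustrative only, NOT the paper's equations):
# n = defect density, T = temperature. Defects are generated at rate K and
# annealed at a thermally activated rate; heat released on annealing warms
# the crystal, closing the thermo-concentration feedback loop.
K, Eb, Q, C, h, T_env = 1.0, 5.0, 8.0, 1.0, 2.0, 1.0

def rhs(n, T):
    anneal = n * np.exp(-Eb / T)             # thermally activated annealing
    dn = K - anneal                          # generation vs. annealing
    dT = (Q * anneal - h * (T - T_env)) / C  # released heat vs. cooling
    return dn, dT

# Forward-Euler integration; depending on (K, Q, h, ...) the trajectory either
# tends monotonically to a steady state or develops self-oscillations.
n, T, dt = 0.0, T_env, 1e-3
for _ in range(200_000):
    dn, dT = rhs(n, T)
    n, T = n + dt * dn, T + dt * dT
print(n, T)
```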
Abstract: The emergence of blended learning has been
influenced by the rapid changes in Higher Education within the last
few years. However, there is a lack of studies that look into the future
of blended learning in the Saudi context. The most likely explanation
is that blended learning is relatively new and, with respect to learning
in general, under-researched. This study addresses this gap and
explores the views of lecturers and students towards the future of
blended learning in Saudi Arabia. This study was informed by the
interpretive paradigm, which appears most appropriate for
understanding and interpreting the perceptions of students and instructors
towards a new learning environment. While globally there has been
considerable research on the perceptions of e-learning and blended
learning with its different models, there is plenty of space for further
research specifically in the Arab region, and in Saudi Arabia where
blended learning is now being introduced.
Abstract: In digital signal processing it is important to
approximate multi-dimensional data by a method called rank
reduction, in which the rank of multi-dimensional data is reduced from
higher to lower. For 2-dimensional data, singular value
decomposition (SVD) is one of the best-known rank reduction
techniques. In addition, an outer product expansion extending SVD to
multi-dimensional data was proposed and implemented, and has
been widely applied to image processing and pattern recognition.
However, the multi-dimensional outer product expansion suffers from
high computational complexity and lacks orthogonality between the
expansion terms. We have therefore proposed an alternative method,
the Third-order Orthogonal Tensor Product Expansion (3-OTPE).
3-OTPE uses the power method instead of a nonlinear optimization
method to reduce computing time. At the same time, the group
of De Lathauwer proposed the Higher-Order SVD (HOSVD), which is
likewise developed as an extension of the SVD to multi-dimensional data.
3-OTPE and HOSVD are thus similar approaches to the rank reduction of
multi-dimensional data; the results the two methods produce are
sometimes identical and sometimes slightly different. In this paper,
we compare 3-OTPE with HOSVD in terms of calculation accuracy and computing time,
and clarify the difference between these two methods.
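To make the power-method idea concrete, here is a minimal sketch of the alternating (higher-order) power iteration for the best rank-1 approximation of a third-order tensor; this generic textbook iteration is an assumption of ours, not code from the paper.

```python
import numpy as np

def rank1_power(T, iters=100):
    """Best rank-1 approximation sigma * (u o v o w) of a 3-way tensor T
    via alternating (higher-order) power iteration."""
    I, J, K = T.shape
    rng = np.random.default_rng(0)
    u, v, w = (rng.standard_normal(n) for n in (I, J, K))
    u, v, w = u / np.linalg.norm(u), v / np.linalg.norm(v), w / np.linalg.norm(w)
    for _ in range(iters):
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    sigma = np.einsum('ijk,i,j,k->', T, u, v, w)
    return sigma, u, v, w

T = np.random.default_rng(1).standard_normal((4, 5, 6))
sigma, u, v, w = rank1_power(T)
approx = sigma * np.einsum('i,j,k->ijk', u, v, w)
print(np.linalg.norm(T - approx))  # residual of the rank-1 fit
```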
Abstract: As the enormous amount of on-line text grows on the
World-Wide Web, the development of methods for automatically
summarizing this text becomes more important. The primary goal of
this research is to create an efficient tool that is able to summarize
large documents automatically. We propose an Evolving
Connectionist System: an adaptive, incremental learning and
knowledge representation system that evolves its structure and
functionality. In this paper, we propose a novel approach to
Part-of-Speech disambiguation using a recurrent neural network, a paradigm
capable of dealing with sequential data. We observed that
the connectionist approach to text summarization has a natural way of
learning grammatical structures through experience. Experimental
results show that our approach achieves acceptable performance.
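As a generic illustration of the recurrent paradigm mentioned above (not the authors' evolving connectionist implementation), a minimal Elman-style network processes a token sequence one step at a time, carrying a hidden state that encodes left context; all dimensions and the tag set below are placeholder assumptions.

```python
import numpy as np

# Minimal Elman RNN forward pass over a token sequence (illustrative sketch;
# vocabulary size, hidden width, and tag set are placeholder assumptions).
rng = np.random.default_rng(0)
V, H, TAGS = 10, 8, 4                      # vocab size, hidden units, POS tags
Wx = rng.standard_normal((H, V)) * 0.1     # input-to-hidden weights
Wh = rng.standard_normal((H, H)) * 0.1     # hidden-to-hidden (recurrent) weights
Wy = rng.standard_normal((TAGS, H)) * 0.1  # hidden-to-output weights

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

h = np.zeros(H)
for token_id in [3, 7, 1]:                 # a toy sentence as word indices
    x = np.eye(V)[token_id]                # one-hot encoding
    h = np.tanh(Wx @ x + Wh @ h)           # hidden state carries left context
    tag_probs = softmax(Wy @ h)            # distribution over POS tags
    print(tag_probs.argmax(), tag_probs.max())
```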
Abstract: The complexity of today's software systems makes
collaborative development necessary to accomplish tasks.
Frameworks are necessary to allow developers to perform their tasks
independently yet collaboratively. Similarity detection is one of the
major issues to consider when developing such frameworks. It allows
developers to mine existing repositories when developing their own
views of a software artifact, and it is necessary for identifying the
correspondences between the views to allow merging them and
checking their consistency. Due to the importance of the
requirements specification stage in software development, this paper
proposes a framework for collaborative development of Object-
Oriented formal specifications along with a similarity detection
approach to support the creation, merging and consistency checking
of specifications. The paper also explores the impact of using
additional concepts on improving the matching results. Finally, the
proposed approach is empirically evaluated.
Abstract: XML has become a popular standard for information exchange via the web. Each XML document can be represented as a rooted, ordered, labeled tree, where a node's label shows its exact position in the original document. Region and Dewey encoding are two well-known methods of labeling trees. In this paper, we propose a new insert-friendly labeling method named IFDewey, based on a recently proposed scheme called Extended Dewey. In Extended Dewey, many labels must be modified when a new node is inserted into the XML tree. Our method eliminates this problem by reserving even numbers for future insertions: whereas the numbers generated by Extended Dewey may be even or odd, IFDewey generates only odd numbers, so that even numbers can be used for a much easier insertion of nodes.
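A minimal sketch of the odd-only labeling idea described above; the function names and tuple representation of Dewey labels are our own illustrative assumptions.

```python
# Odd-only Dewey-style labels: existing children get components 1, 3, 5, ...,
# leaving even numbers free so a later insertion needs no relabeling.
# (Illustrative sketch; names and representation are assumptions, not the paper's code.)

def label_children(parent_label: tuple, n_children: int) -> list:
    """Assign odd component numbers to the children of parent_label."""
    return [parent_label + (2 * i + 1,) for i in range(n_children)]

def insert_between(left: tuple, right: tuple) -> tuple:
    """Insert a new sibling between two existing ones using a free even number."""
    assert left[:-1] == right[:-1] and right[-1] - left[-1] >= 2
    return left[:-1] + ((left[-1] + right[-1]) // 2,)

root = (1,)
kids = label_children(root, 3)          # [(1, 1), (1, 3), (1, 5)]
new = insert_between(kids[0], kids[1])  # (1, 2) -- no existing label changes
print(kids, new)
```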
Abstract: Personal computers draw non-sinusoidal current
in which the odd harmonics are the most significant. The power quality of
distribution networks is severely affected by the flow of these
harmonics generated during the operation of electronic loads. In
this paper, mathematical modeling of the odd current harmonics
(3rd, 5th, 7th and 9th) that influence power quality is presented.
Live signals have been captured with the help of a power quality
analyzer for analysis. An interesting feature, verified
theoretically, is that the Total Harmonic Distortion (THD) of the
current decreases as the nonlinear load increases. The results
obtained using the mathematical expressions have been compared with
the practical results, and encouraging results have been found.
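For reference, current THD is conventionally the RMS of the harmonic components relative to the fundamental. The sketch below computes it for illustrative harmonic amplitudes; the values are placeholders, not the paper's measurements.

```python
import math

def thd(i1: float, harmonics: list) -> float:
    """THD_I = sqrt(sum of squared harmonic amplitudes) / fundamental amplitude."""
    return math.sqrt(sum(ih**2 for ih in harmonics)) / i1

# Placeholder amplitudes (A) for the fundamental and the 3rd, 5th, 7th, 9th harmonics.
i1, odd = 1.00, [0.60, 0.40, 0.20, 0.10]
print(f"THD = {100 * thd(i1, odd):.1f}%")  # ~75.5% for these values
```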
Abstract: Simultaneous transient conduction and radiation heat
transfer with heat generation is investigated. Analysis is carried out
for both steady and unsteady situations. A two-dimensional gray
cylindrical enclosure with an absorbing, emitting, and isotropically
scattering medium is considered. Enclosure boundaries are assumed
at specified temperatures. The heat generation rate is considered
uniform and constant throughout the medium. The lattice Boltzmann
method (LBM) was used to solve the energy equation of a transient
conduction-radiation heat transfer problem. The control volume finite
element method (CVFEM) was used to compute the radiative
information. To study the compatibility of the LBM for the energy
equation and the CVFEM for the radiative transfer equation, transient
conduction and radiation heat transfer problems in 2-D cylindrical
geometries were considered. In order to establish the suitability of the
LBM, the energy equation of the present problem was also solved
using the finite difference method (FDM) of computational
fluid dynamics. The CVFEM used in the radiative heat transfer was
employed to compute the radiative information required for the
solution of the energy equation using the LBM or the FDM (of the
CFD). To study the compatibility and suitability of the LBM for the
solution of the energy equation and the CVFEM for the radiative
information, results were analyzed for the effects of various
parameters such as the boundary emissivity. The results of the
LBM-CVFEM combination were found to be in excellent agreement with
the FDM-CVFEM combination. The number of iterations and the
steady-state temperatures in the two combinations were found to be
comparable. Results are presented for situations with and without heat
generation. Heat generation is found to have significant bearing on
temperature distribution.
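To give a flavor of using the LBM for the energy (conduction) equation, here is a minimal 1-D D1Q2 diffusion sketch with a uniform heat-generation source. It is a generic textbook-style illustration in lattice units, not the paper's 2-D cylindrical LBM-CVFEM solver, and the radiative source computed by the CVFEM is omitted.

```python
import numpy as np

# 1-D D1Q2 lattice Boltzmann for transient conduction with heat generation.
# Lattice units (dx = dt = 1); thermal diffusivity alpha = (tau - 0.5) for D1Q2.
# Generic illustration only -- NOT the paper's 2-D cylindrical solver.
N, tau, q_gen, steps = 100, 1.0, 1e-4, 5000
w = np.array([0.5, 0.5])                 # D1Q2 weights (right- and left-movers)
T = np.zeros(N); T[0], T[-1] = 1.0, 0.0  # fixed hot / cold wall temperatures
f = w[:, None] * T[None, :]              # start from equilibrium

for _ in range(steps):
    feq = w[:, None] * T[None, :]
    f = f - (f - feq) / tau + w[:, None] * q_gen   # collide + volumetric source
    f[0] = np.concatenate(([f[0, 0]], f[0, :-1]))  # stream right-movers
    f[1] = np.concatenate((f[1, 1:], [f[1, -1]]))  # stream left-movers
    T = f.sum(axis=0)
    T[0], T[-1] = 1.0, 0.0                         # Dirichlet walls
    f[:, 0], f[:, -1] = w * T[0], w * T[-1]        # equilibrium populations at walls

print(np.round(T[::10], 3))  # temperature profile between the walls
```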
Abstract: In this paper, we propose a reversible watermarking
scheme based on histogram shifting (HS) to embed watermark bits
into the H.264/AVC standard videos by modifying the last nonzero
level in the context adaptive variable length coding (CAVLC) domain.
The proposed method collects all of the last nonzero coefficients (also
called last-level coefficients) of the 4×4 sub-macroblocks in a macroblock
and uses predictions of the current last level from the
neighboring blocks' last levels to embed watermark bits. The proposed
method has low computational cost and supports reversible
recovery. The experimental results have demonstrated that
our proposed scheme has acceptable degradation on video quality and
output bit-rate for most test videos.
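As a generic illustration of the histogram-shifting step (applied here to arbitrary integer values rather than actual CAVLC last levels, which we omit), the sketch below embeds one bit per peak-valued sample and recovers the original exactly; all names are our own.

```python
from collections import Counter

# Generic histogram-shifting (HS) reversible embedding on integers
# (e.g., prediction errors). H.264/AVC CAVLC specifics are omitted.

def hs_embed(values, bits):
    peak = Counter(values).most_common(1)[0][0]   # most frequent value
    out, it = [], iter(bits)
    for v in values:
        if v > peak:
            out.append(v + 1)                     # shift to make room
        elif v == peak:
            out.append(v + next(it, 0))           # embed one bit (0 or 1)
        else:
            out.append(v)
    return out, peak

def hs_extract(marked, peak):
    bits, restored = [], []
    for v in marked:
        if v == peak:
            bits.append(0); restored.append(v)
        elif v == peak + 1:
            bits.append(1); restored.append(peak)
        elif v > peak + 1:
            restored.append(v - 1)                # undo the shift
        else:
            restored.append(v)
    return bits, restored

vals = [0, 1, 0, 2, 0, 3, 1, 0]
marked, peak = hs_embed(vals, [1, 0, 1, 1])
bits, restored = hs_extract(marked, peak)
print(marked, bits, restored == vals)             # exact recovery -> True
```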
Abstract: This paper presents the initial design of a low-speed
Axial Flux Permanent Magnet (AFPM) machine with a non-slotted
TORUS topology, obtained by means of a design algorithm (Appendix).
The design algorithm is validated against selected data from
an initial prototype machine. The analytical design calculations are
carried out by means of the design algorithm, and the results obtained
are compared with the results of the Finite Element Method (FEM).
Abstract: Soccer simulation is an effort to motivate researchers and practitioners to carry out artificial intelligence and robotics research and, at the same time, to put the results into practice and test them. Many researchers and practitioners throughout the world are continuously working to polish their ideas and improve their implemented systems, while new groups are forming and bringing fresh ideas to the field. The research includes designing and executing robotic soccer simulation algorithms. In our research, a soccer simulation player is considered to be an intelligent agent capable of receiving information from the environment, analyzing it, and choosing the best action from a set of possible ones for its next move. We concentrate on developing a two-phase method for the soccer player agent to choose its best next move. The method is implemented in our software system, the Nexus simulation team of Ferdowsi University. This system is based on the TsinghuAeolus [1] team, which was the champion of the world RoboCup soccer simulation contest in 2001 and 2002.
Abstract: Recent advances in the field of computing have
led to the massive use of web-based electronic documents. Current
copyright protection laws are inadequate to prove ownership of
electronic documents and do not provide strong safeguards against
copying and manipulating information from the web. This has
opened many channels for securing information, and significant
advances have been made in the area of information security.
Digital watermarking has developed into a very dynamic area of
research and has addressed challenging issues for digital content.
Watermarking can be visible (logos or signatures) or invisible
(encoding and decoding). Many visible watermarking techniques
have been studied for text documents, but there are very few for
web-based text. XML files are used to exchange information on the
internet and contain important information. In this paper, two invisible
watermarking techniques using synonyms and acronyms are
proposed for XML files to prove intellectual ownership and to
achieve security. The techniques are analyzed against different attacks,
and the capacity that can be embedded in the XML file is also examined.
A comparative analysis of capacity is made for both methods.
The system has been implemented in C#, and all tests were
carried out in practice to obtain the results.
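As a generic illustration of synonym-based invisible watermarking (the synonym table and bit convention below are our own assumptions, not the paper's), each watermark bit selects which member of a synonym pair appears in the text:

```python
# Illustrative synonym-substitution watermarking: bit 0 keeps the first member
# of a synonym pair, bit 1 substitutes the second. The pairs are placeholder
# assumptions, not the paper's synonym table.
PAIRS = [("big", "large"), ("quick", "fast"), ("start", "begin")]
LOOKUP = {w: (i, j) for i, pair in enumerate(PAIRS) for j, w in enumerate(pair)}

def embed(text, bits):
    it, out = iter(bits), []
    for word in text.split():
        if word in LOOKUP:
            i, _ = LOOKUP[word]
            out.append(PAIRS[i][next(it, 0)])   # pick the variant for this bit
        else:
            out.append(word)
    return " ".join(out)

def extract(text):
    return [LOOKUP[w][1] for w in text.split() if w in LOOKUP]

marked = embed("the big dog made a quick start", [1, 0, 1])
print(marked)            # the large dog made a quick begin
print(extract(marked))   # [1, 0, 1]
```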
Abstract: Subjective loneliness describes people who feel a
disagreeable or unacceptable lack of meaningful social relationships,
both at the quantitative and qualitative level. The studies to be
presented tested an Italian 18-item self-report loneliness measure,
that included items adapted from scales previously developed,
namely a short version of the UCLA (Russell, Peplau and Cutrona,
1980), and the 11-item Loneliness scale by De Jong-Gierveld &
Kamphuis (JGLS; 1985). The studies aimed at testing the developed
scale and at verifying whether loneliness is better conceptualized as a
unidimensional (so-called 'general loneliness') or a bidimensional
construct, namely comprising the distinct facets of social and
emotional loneliness. The loneliness questionnaire included two
single-item criterion measures of sad mood and social contact, and asked
participants to supply information on a number of socio-demographic
variables. Factorial analyses of responses obtained in two
preliminary studies, with 59 and 143 Italian participants respectively,
showed good factor loadings and subscale reliability and confirmed
that perceived loneliness clearly has two components, a social and an
emotional one, the latter measured by two subscales, a 7-item
'general' loneliness subscale derived from the UCLA scale, and a 6-item
'emotional' scale included in the JGLS. Results further showed that
type and amount of loneliness are related, negatively, to frequency of
social contacts, and, positively, to sad mood. In a third study data
were obtained from a nation-wide sample of 9,097 Italian subjects,
aged 12 to about 70, who completed the test online on the Italian
web site of a large-audience magazine, Focus. The results again
confirmed the reliability of the component subscales, namely social,
emotional, and 'general' loneliness, and showed that they were
highly correlated with each other, especially the latter two.
Loneliness scores were significantly predicted by sex, age, education
level, sad mood and social contact, and, less so, by other variables –
e.g., geographical area and profession. The scale validity was
confirmed by the results of a fourth study, with elderly men and
women (N = 105) living at home or in residential care units. The three
subscales were significantly related, among others, to depression, and
to various measures of the extension of, and satisfaction with, social
contacts with relatives and friends. Finally, a fifth study with 315
career-starters showed that social and emotional loneliness correlate
with life satisfaction, and with measures of emotional intelligence.
Altogether the results showed a good validity and reliability in the
tested samples of the entire scale, and of its components.
Abstract: Transmission control protocol (TCP) Vegas detects
network congestion in the early stage and successfully prevents
periodic packet loss that usually occurs in TCP Reno. It has been
demonstrated that TCP Vegas outperforms TCP Reno in many
aspects. However, TCP Vegas suffers from several problems that affect its
congestion avoidance mechanism. One of the most important
weaknesses of TCP Vegas is that alpha and beta depend on a good
expected-throughput estimate, which in turn depends on a
good minimum-RTT estimate. To make the system more
robust, alpha and beta must be made responsive to network conditions
(they are currently chosen statically). This paper proposes a modified
Vegas algorithm that can be tuned to achieve good performance
compared to other transmission control protocols (TCPs). In order to
do this, we use the particle swarm optimization (PSO) algorithm to tune
alpha and beta. The simulation results validate the advantages of the
proposed algorithm in terms of
performance.
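A minimal PSO sketch of the tuning idea follows. The objective function is a stand-in placeholder (a real setup would score each alpha/beta pair through a network simulation), and all constants and bounds are assumptions of ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(ab):
    """Placeholder objective for an (alpha, beta) pair. In a real study this
    would run a network simulation and score throughput/loss; here we use a
    toy function with an arbitrary optimum at alpha=2, beta=4."""
    a, b = ab
    return -((a - 2.0) ** 2 + (b - 4.0) ** 2)

# Standard global-best PSO over the 2-D (alpha, beta) space.
n, dim, w, c1, c2 = 20, 2, 0.7, 1.5, 1.5
x = rng.uniform(0.0, 10.0, (n, dim))          # particle positions
v = np.zeros((n, dim))                        # particle velocities
pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
gbest = pbest[pbest_f.argmax()].copy()

for _ in range(100):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, 0.0, 10.0)
    f = np.array([fitness(p) for p in x])
    improved = f > pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmax()].copy()

print("tuned (alpha, beta) ~", np.round(gbest, 3))
```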
Abstract: This paper presents a new classification algorithm using colour and texture for obstacle detection. Colour information is computationally cheap to learn and process. However, in many cases colour alone does not provide enough information for classification. Texture information can improve classification performance but usually comes at a high computational cost. Our algorithm uses both colour and texture features, but texture is only needed when colour is unreliable. During the training stage, texture features are learned specifically to improve the performance of a colour classifier. The algorithm learns a set of simple texture features, and only the most effective features are used in the classification stage. Therefore our algorithm achieves a very good classification rate while still being fast enough to run on a limited computer platform. The proposed algorithm was tested on a challenging outdoor image set. Test results show that the algorithm achieves a much better trade-off between classification performance and efficiency than a typical colour classifier.
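A minimal sketch of the colour-first, texture-as-fallback idea (our own generic rendering, not the paper's classifier): texture features are computed only when the colour classifier's confidence falls below a threshold.

```python
import numpy as np

# Colour-first cascade (illustrative): classify each patch by colour, and fall
# back to the costlier texture classifier only when colour confidence is low.
# Both classifiers here are placeholder stubs, not the paper's models.
CONF_THRESHOLD = 0.8

def colour_classify(patch):
    """Cheap stub: 'obstacle' probability from mean redness of the patch."""
    p = float(patch[..., 0].mean())
    return ("obstacle" if p > 0.5 else "ground"), max(p, 1.0 - p)

def texture_classify(patch):
    """Costly stub: decide from local variance, a crude texture measure."""
    return "obstacle" if patch.var() > 0.05 else "ground"

def classify(patch):
    label, confidence = colour_classify(patch)
    if confidence >= CONF_THRESHOLD:
        return label                  # colour alone was decisive
    return texture_classify(patch)    # texture only when colour is unsure

patch = np.random.default_rng(0).random((16, 16, 3))
print(classify(patch))
```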
Abstract: A mobile ad hoc network consists of a set of mobile
nodes. It is a dynamic network that does not have a fixed topology.
The network has no infrastructure or central administration,
and hence it is called an infrastructure-less network. Because of the
changing topology, the route from source to destination is not fixed
but dynamic, changing with respect to time. The nature of the
network requires the algorithm to perform route discovery, maintain
route and detect failure along the path between two nodes [1]. This
paper presents the enhancements of ARA [2] to improve the
performance of the routing algorithm. ARA [2] finds routes between
nodes in a mobile ad hoc network. It is an on-demand, source-initiated
routing algorithm based on the principles of swarm
intelligence; it is adaptive and scalable and favors load
balancing. The improvements suggested in this paper are the handling of
lost ants and resource reservation.
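As a generic sketch of the swarm-intelligence principle behind ant-based routing such as ARA (the update rule and constants here are illustrative assumptions, not ARA's exact scheme): next hops are chosen probabilistically by pheromone level, confirmed routes are reinforced, and pheromone evaporates so stale routes fade.

```python
import random

# Generic ant-routing pheromone table for one destination (illustrative only).
pheromone = {"B": 1.0, "C": 1.0, "D": 1.0}   # next-hop neighbours of this node
EVAPORATION, REINFORCE = 0.1, 0.5

def choose_next_hop():
    """Pick a neighbour with probability proportional to its pheromone."""
    hops, weights = zip(*pheromone.items())
    return random.choices(hops, weights=weights)[0]

def on_ant_returned(hop):
    """A backward ant confirming a route reinforces the used next hop."""
    pheromone[hop] += REINFORCE

def evaporate():
    """Periodic decay lets stale routes fade (lost ants never reinforce)."""
    for hop in pheromone:
        pheromone[hop] *= (1.0 - EVAPORATION)

for _ in range(20):
    hop = choose_next_hop()
    if hop == "B":              # pretend routes via B succeed most often
        on_ant_returned(hop)
    evaporate()
print(pheromone)                # B accumulates the most pheromone
```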
Abstract: Cellular networks provide voice and data services to users with mobility. To deliver services to mobile users, the cellular network is capable of tracking the locations of the users and allowing user movement during conversations. These capabilities are achieved by location management. Location management in mobile communication systems is concerned with those network functions necessary to allow users to be reached wherever they are in the network coverage area. In a cellular network, a service coverage area is divided into smaller areas of hexagonal shape, referred to as cells; the cellular concept was introduced to enable reuse of the radio frequency. Continued expansion of cellular networks, coupled with an increasingly restricted mobile spectrum, has established the reduction of communication overhead as a highly important issue. Much of this overhead is incurred in determining the precise location of individual users when relaying calls, and the field of location management aims to reduce it through prediction of user location. This paper describes and compares various location management schemes in cellular networks.