Abstract: The Internet is the global data communications
infrastructure based on the interconnection of both public and private
networks using protocols that implement Internetworking on a global
scale. Hence the control of protocol and infrastructure development,
resource allocation, and network operation constitutes a set of crucial
and interlinked concerns. Internet Governance is the hotly debated and contentious
subject that refers to the global control and operation of key Internet
infrastructure such as domain name servers and resources such as
domain names. It is impossible to separate technical and political
positions as they are interlinked. Furthermore the existence of a
global market, transparency and competition impact upon Internet
Governance and related topics such as network neutrality and
security. Current trends and developments in Internet
governance, with a focus on the policy-making process, security, and
control, are examined to evaluate their current and future
implications for the Internet. The multi-stakeholder approach to
Internet Governance discussed in this paper presents a number of
opportunities, issues and developments that will affect the future
direction of the Internet. Internet operation, maintenance and
advisory organisations such as the Internet Corporation for Assigned
Names and Numbers (ICANN) or the Internet Governance Forum
(IGF) are currently in the process of formulating policies for future
Internet Governance. Given the controversial nature of the issues at
stake and the current lack of agreement, it is predicted that
institutional as well as market governance will remain present for both
network access and content.
Abstract: Current image-based individual human recognition
methods, such as the fingerprint, face, or iris biometric modalities,
generally require a cooperative subject, views from certain angles,
and physical contact or close proximity. These methods cannot
reliably recognize non-cooperating individuals at a distance in the
real world under changing environmental conditions. Gait, which
concerns recognizing individuals by the way they walk, is a relatively
new biometric without these disadvantages. The inherent gait
characteristics of an individual make gait irreplaceable and useful in
visual surveillance.
In this paper, an efficient gait recognition system for human
identification is proposed, based on two extracted features: the width
vector of the binary silhouette and the MPEG-7 region-based shape
descriptors. In the proposed method, foreground objects
(i.e., humans and other moving objects) are extracted by estimating the
background with a Gaussian Mixture Model (GMM); subsequently, a
median filtering operation removes noise from the
background-subtracted image. A moving target classification
algorithm, based on shape and boundary information, is used to separate
human beings (i.e., pedestrians) from other foreground objects
(e.g., vehicles).
Subsequently, the width vector of the outer contour of the binary
silhouette and the MPEG-7 Angular Radial Transform coefficients are
taken as the feature vector. Next, Principal Component Analysis (PCA)
is applied to the selected feature vector to reduce its dimensionality.
These extracted feature vectors are used to train a Hidden Markov
Model (HMM) for identifying individuals. The proposed
system is evaluated on gait sequences, and the experimental
results show the efficacy of the proposed algorithm.
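As an illustration of the silhouette feature named above, the per-row width vector can be sketched as follows. This is a minimal interpretation, assuming a row's width is the span between its outermost foreground pixels; the function name and details are illustrative, not the authors' implementation:

```python
import numpy as np

def width_vector(silhouette):
    """Per-row width of a binary silhouette: for each row, the span
    between the leftmost and rightmost foreground pixels (0 if empty)."""
    widths = np.zeros(silhouette.shape[0], dtype=int)
    for r, row in enumerate(silhouette):
        cols = np.flatnonzero(row)
        if cols.size:
            widths[r] = cols[-1] - cols[0] + 1
    return widths

# Toy 4x5 silhouette with rows of varying width
sil = np.array([[0, 1, 1, 0, 0],
                [1, 1, 1, 1, 0],
                [0, 1, 1, 1, 1],
                [0, 0, 0, 0, 0]])
print(width_vector(sil))  # → [2 4 4 0]
```

In the full pipeline, one such vector per frame would feed the PCA-reduced feature sequence used to train the HMM.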
Abstract: In this paper, a plane-strain orthotropic elasto-plastic
dynamic constitutive model is established, and with this constitutive
model, the thermal shock wave induced by intense pulsed X-ray
radiation in a cylindrical shell composite is simulated with a finite element
code, and the properties of thermal shock wave propagation are then
discussed. The results show that the thermal shock wave exhibits
different shapes under soft and hard X-ray radiation, and that, when
the composite is irradiated along different principal axes, great
differences arise in aspects such as the attenuation of the peak stress
value and spallation.
Abstract: Using a scoring system, this paper provides a
comparative assessment of data quality between XBRL-formatted
financial reports and non-XBRL financial reports. It shows a
major improvement in the data quality of XBRL-formatted financial
reports. Although XBRL-formatted financial reports did not show
much quality advantage at the beginning, they have lately displayed
a large improvement in data quality in almost all aspects. With
improved XBRL web applications for data management, presentation,
and analysis, XBRL-formatted financial reports are much more
accessible, more accurate, and more timely.
Abstract: One of the most important aspects expected from ERP systems is the integration of the various operations existing in the administrative, financial, commercial, human resources, and production departments of the client organization. It is also often necessary to integrate the new ERP system with the organization's legacy systems when implementing the ERP package. Without relying on an appropriate software architecture to realize the required integration, ERP implementation processes become error prone and time consuming; in some cases, the ERP implementation may even encounter serious risks. In this paper, we propose a new architecture that is based on the agent-oriented vision and supplies the integration expected from ERP systems using several independent but cooperating agents. Besides integration, which is the main issue of this paper, the presented architecture also addresses some aspects of the intelligence and learning capabilities existing in ERP systems.
Abstract: The mosaicing technique has been employed in more and more application fields, from entertainment to scientific ones. In the latter case, the final evaluation is often still left to human beings, who visually assess the quality of the mosaic. Often, the lack of objective measurements in microscopic mosaicing may prevent the mosaic from being used as a starting image for further analysis. In this work we analyze three different metrics and indexes, in the domains of signal analysis, image analysis, and visual quality, to measure the quality of different aspects of the mosaicing procedure, such as registration errors and visual quality. As the case study we consider the mosaicing algorithm we developed. The experiments have been carried out on mosaics with very different features: histological samples, which consist of detailed and contrasted images, and live stem cells, which show very low contrast and low detail levels.
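The abstract does not name its three metrics; as one example of a signal-analysis measure commonly used to score agreement between overlapping mosaic tiles, a PSNR sketch is shown below (illustrative only, not necessarily a metric used in the paper):

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio (dB) between two same-sized images.
    Higher is better; identical images yield infinity."""
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)

# Toy example: two 8x8 tiles differing in a single pixel by 10 levels
a = np.full((8, 8), 100, dtype=np.uint8)
b = a.copy()
b[0, 0] = 110
print(round(psnr(a, b), 2))  # → 46.19
```

A low PSNR over a tile overlap region would flag a registration error that visual inspection might miss.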
Abstract: This paper presents a new approach for image
segmentation by applying the Pillar K-means algorithm. This
segmentation process includes a new mechanism for clustering the
elements of high-resolution images in order to improve precision and
reduce computation time. The system applies K-means clustering to
the image segmentation after being optimized by the Pillar algorithm.
The Pillar algorithm treats the placement of pillars, which should be
located as far apart as possible to withstand the pressure
distribution of a roof, as analogous to the placement of centroids
within the data distribution. This algorithm is able to optimize
K-means clustering for image segmentation in terms of precision
and computation time. It designates the initial centroids' positions
by calculating an accumulated distance metric between each data
point and all previously selected centroids, and then selects the data
points with the maximum distance as new initial centroids. This algorithm
distributes all initial centroids according to the maximum
accumulated distance metric. This paper evaluates the proposed
approach for image segmentation by comparing it with the K-means and
Gaussian Mixture Model algorithms across the RGB, HSV, HSL,
and CIELAB color spaces. The experimental results demonstrate the
effectiveness of our approach in improving segmentation quality in
terms of precision and computation time.
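The accumulated-distance initialization described above can be sketched as follows. This is a simplified reading of the Pillar initialization (it omits the algorithm's outlier handling, and the choice of the point farthest from the grand mean as the first centroid is an assumption); all names are illustrative:

```python
import numpy as np

def pillar_init(X, k):
    """Pick k well-spread initial centroids: start with the point
    farthest from the grand mean, then repeatedly pick the point whose
    accumulated distance to all centroids chosen so far is largest."""
    mean = X.mean(axis=0)
    centroids = [X[np.argmax(np.linalg.norm(X - mean, axis=1))]]
    acc = np.zeros(len(X))
    while len(centroids) < k:
        acc += np.linalg.norm(X - centroids[-1], axis=1)  # accumulate
        centroids.append(X[np.argmax(acc)])
    return np.array(centroids)

# Two well-separated groups of 2-D points (e.g., pixel colors)
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0], [0.0, 5.0]])
print(pillar_init(X, 2))  # picks one point from each far corner
```

The returned centroids would then seed an ordinary K-means run in the chosen color space.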
Abstract: The stochastic nature of tool life using conventional discrete-wear data from experimental tests usually arises from many individual and interacting parameters. It is a common practice in batch production to continually use the same tool to machine different parts, using disparate machining parameters. In such an environment, the optimal points at which tools have to be changed, while achieving minimum production cost and maximum production rate within the surface roughness specifications, have not been adequately studied. In the current study, two relevant aspects are investigated using coated and uncoated inserts in turning operations: (i) the accuracy of using machinability information from fixed-parameter testing procedures when variable-parameter situations emerge, and (ii) the credibility of tool life machinability data from prior discrete testing procedures in non-stop machining. A novel technique is proposed and verified to normalize the conventional fixed-parameter machinability data to suit cases in which parameters have to be changed for the same tool. In addition, an experimental investigation has been established to evaluate the error in tool life assessment when machinability data from discrete testing procedures are employed in uninterrupted practical machining.
Abstract: Market competition and the desire to gain advantage in a globalized market drive companies toward innovation efforts. Project overload is an unpleasant phenomenon experienced by employees inside organizations trying to make the most efficient use of their resources to be innovative. But what are the impacts of project overload on an organization's innovation capabilities? Advanced engineering (AE) teams inside a major heavy equipment manufacturer are suffering from project overload in their quest for innovation. In this paper, agent-based modeling (ABM) is used to examine the current reality of the company context, and of the AE team, where the opportunities and challenges for reducing the risk of project overload and moving towards innovation were identified. Project overload is more likely to stifle innovation and creativity inside teams. On the other hand, motivation through properly challenging goals is more likely to help individuals alleviate the negative aspects of a low level of project overload.
Abstract: The prediction of software quality during the development life cycle of a software project helps the development organization make efficient use of available resources to produce a product of the highest quality. A "whether a module is faulty or not" approach can be used to predict the quality of a software module. A number of software quality prediction models are described in the literature, based upon genetic algorithms, artificial neural networks, and other data mining algorithms. One of the promising approaches for quality prediction is based on clustering techniques. Most quality prediction models that are based on clustering techniques make use of the K-means, Mixture-of-Gaussians, Self-Organizing Map, Neural Gas, and fuzzy K-means algorithms for prediction. In all these techniques a predefined structure is required; that is, the number of neurons or clusters must be known before the clustering process starts. In the case of Growing Neural Gas, however, there is no need to predetermine the number of neurons or the topology of the structure: it starts with a minimal neuron structure that is incremented during training until it reaches a user-defined maximum number of clusters. Hence, in this work we have used Growing Neural Gas as the underlying clustering algorithm, which produces an initial set of labeled clusters from the training data set; this set of clusters is then used to predict the quality of software modules in the test data set. The best testing results show 80% accuracy in evaluating the quality of software modules. Hence, the proposed technique can be used by programmers to evaluate the quality of modules during software development.
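Whatever clustering algorithm grows the labeled clusters, the final prediction step amounts to nearest-labeled-cluster assignment. A minimal sketch of that step only (the centroids here stand in for the trained Growing Neural Gas units; data and names are illustrative):

```python
import numpy as np

def predict_quality(centroids, labels, X_test):
    """Assign each test module to its nearest labeled cluster centroid
    and inherit that cluster's fault label."""
    # Pairwise distances: (n_test, n_clusters)
    d = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
    return labels[np.argmin(d, axis=1)]

# Toy metric space: two labeled clusters of software-module metrics
centroids = np.array([[0.0, 0.0], [10.0, 10.0]])
labels = np.array(["not-faulty", "faulty"])
X_test = np.array([[1.0, 0.5], [9.0, 11.0]])
print(predict_quality(centroids, labels, X_test))  # → ['not-faulty' 'faulty']
```

Accuracy such as the reported 80% would be computed by comparing these inherited labels against the known fault status of the test modules.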
Abstract: As new challenges emerge in power electrical
workplace safety, it is the responsibility of the systems designer to
seek out new approaches and solutions that address them. Design
decisions made today will impact the cost, safety, and serviceability of
the installed systems over their 40- to 50-year useful life for the
owner. Studies have shown that this cost is of the order of
7 to 10 times the installed cost of the power distribution equipment.
This paper reviews some aspects of earthing system design for a power
substation surrounded by residential houses. The earth potential
rise and the split factor are discussed, and a few recommendations are
provided to achieve safe voltages in the area beyond the boundary
of the substation.
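For context, the two quantities discussed are related as in the standard earthing analysis (e.g., the IEEE Std 80 formulation): the earth potential rise is the grid resistance times the fraction of the fault current (the split factor) that returns through the grid. The numbers below are illustrative, not from the paper:

```python
def earth_potential_rise(fault_current_a, split_factor, grid_resistance_ohm):
    """EPR = I_g * R_g, where the grid current I_g is the split
    factor S_f times the total earth fault current I_F."""
    grid_current_a = split_factor * fault_current_a
    return grid_current_a * grid_resistance_ohm

# Illustrative values: 10 kA fault, 60% returning via the grid, 0.5 ohm grid
print(earth_potential_rise(10_000, 0.6, 0.5))  # → 3000.0  (volts)
```

Recommendations such as lowering grid resistance or rerouting fault current directly reduce this value and hence the voltages appearing beyond the substation boundary.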
Abstract: Software projects are very dynamic and require
recurring adjustments of their project plans. These adjustments can be
understood as reconfigurations of the schedule, the resource
allocation, and other plan elements. Moreover, during the planning and
execution of a software project, the integration of project-specific
activities with the activities that take part in the organization's
common activity flow should be considered. This article presents the
results of a systematic review of aspects related to the dynamic
reconfiguration of software projects, emphasizing the integration of
project management with organizational flows. A series of studies
from the year 2000 to the present was analyzed. The results of this
work show that there is a diversity of techniques and strategies for
the dynamic reconfiguration of software projects. However, few
approaches consider the integration of software project activities with
the activities that take part in the organization's common workflow.
Abstract: Computers are being integrated into various aspects
of everyday human life in different shapes and with different
capabilities. This fact has intensified the requirement for software
development technologies to be: 1) portable, 2) adaptable, and 3)
simple to develop. This is also known as the Pervasive
Computing Problem (PCP), which can be addressed in different
ways, each with its own pros and cons; Context-Oriented
Programming (COP) is one of the methods for addressing the PCP.
In this paper, a design for a COP framework, a context-aware
framework, is presented that eliminates the weak points of a
previous design based on interpreted languages, while bringing the
power of compiled languages to the implementation of such frameworks.
The key point of this improvement is the combination of COP and
Dependency Injection (DI) techniques. Both the old and the new
frameworks are analyzed to show their advantages and disadvantages.
Finally, a simulation of both designs is presented, indicating that the
practical results agree with the theoretical analysis, with the new
design running almost 8 times faster.
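The combination of COP and DI can be illustrated minimally: the implementation matching the current context is selected once, at injection time, instead of being branched on at every call site. This is a Python sketch of the idea only (the paper's framework targets compiled languages, and all names here are hypothetical):

```python
class QuietRenderer:
    """Behavior variant active in a 'meeting' context."""
    def render(self, msg):
        return f"[muted] {msg}"

class NormalRenderer:
    """Default behavior variant."""
    def render(self, msg):
        return msg

class Notifier:
    """The component never inspects the context itself; the renderer
    matching the active context is injected into its constructor."""
    def __init__(self, renderer):
        self.renderer = renderer

    def notify(self, msg):
        return self.renderer.render(msg)

# A trivial injector: map the active context to an implementation
CONTEXT_BINDINGS = {"meeting": QuietRenderer, "default": NormalRenderer}

def make_notifier(context):
    return Notifier(CONTEXT_BINDINGS[context]())

print(make_notifier("meeting").notify("build finished"))  # → [muted] build finished
print(make_notifier("default").notify("build finished"))  # → build finished
```

Resolving the binding once per context switch, rather than testing the context on every call, is one way such a design can avoid the per-call overhead of interpreted COP frameworks.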
Abstract: This research is a comparative study of complexity, as a multidimensional concept, in the context of streetscape composition in Algeria and Japan. 80 streetscape visual arrays were collected and then presented to 20 participants with different cultural backgrounds, in order to be categorized and classified according to their degree of complexity. Three analysis methods were used in this research: cluster analysis, the ranking method, and Hayashi's Quantification Method III. The results showed that complexity, disorder, irregularity, and disorganization are often conflicting concepts in the urban context. Algerian daytime streetscapes appear balanced, ordered, and regular, whereas Japanese daytime streetscapes appear unbalanced, regular, and vivid. Variety, richness, and irregularity, with some aspects of order and organization, seem to characterize Algerian night streetscapes. Japanese night streetscapes seem to be more related to balance, regularity, order, and organization, with some aspects of confusion and ambiguity. Complexity mainly characterized Algerian avenues with green infrastructure. Accordingly, for Japanese participants, Japanese traditional night streetscapes were complex, and for foreigners, Algerian and Japanese avenue nightscapes were the most complex visual arrays.
Abstract: This paper explores how Critical Systems Thinking and Action Research can be used to improve student performance in Networking. When describing a system from a systems thinking perspective, the following aspects can be identified: the total system performance, the system's environment, the resources, the components, and the management of the system. Following the history of systems thinking, we observe three methodologies that emerged: hard systems, soft systems, and critical systems. This paper uses Critical Systems Thinking (CST), which describes systems in terms of contradictions and conflict. It demonstrates how CST can be used in an Action Research (AR) project to improve the performance of students. An intervention in terms of student assessment is described, and its impact is discussed.