Abstract: Among various testing methodologies, Built-In Self-Test (BIST) is recognized as a low-cost, effective paradigm. Full adders are also among the basic building blocks of most arithmetic circuits in all processing units. In this paper, an optimized testable 2-bit full adder is proposed as a test building block. A BIST procedure is then introduced to scale up the building block and generate self-testable n-bit full adders. The target design can achieve 100% fault coverage with a negligible amount of hardware redundancy. Moreover, overall test time is reduced by utilizing polymorphic gates and by testing full adder building blocks in parallel.
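The 100% fault-coverage claim can be illustrated with a toy behavioural sketch (this is not the paper's gate-level design or its BIST hardware): a 1-bit full adder plus an exhaustive check that every single stuck-at fault on an input line is detected by the exhaustive test set.

```python
def full_adder(a, b, cin):
    """Return (sum, carry-out) for one bit position."""
    s = a ^ b ^ cin
    cout = (a & b) | (a & cin) | (b & cin)
    return s, cout

def stuck_at_coverage():
    """Fraction of input stuck-at faults exposed by the exhaustive test set."""
    tests = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
    faults, detected = 0, 0
    for line in range(3):            # which input line is faulty
        for stuck in (0, 1):         # stuck-at-0 or stuck-at-1
            faults += 1
            for t in tests:
                faulty = list(t)
                faulty[line] = stuck
                if full_adder(*t) != full_adder(*faulty):
                    detected += 1    # some vector distinguishes the fault
                    break
    return detected / faults

print(stuck_at_coverage())  # all 6 input stuck-at faults are detected -> 1.0
```

A real BIST scheme would generate these vectors on-chip and compact the responses; the sketch only verifies that the fault model is fully covered.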
Abstract: Knowledge management (KM) is generally
considered to be a positive process in an organisation, facilitating
opportunities to achieve competitive advantage via better quality
information handling, compilation of expert know-how and rapid
response to fluctuations in the business environment. The KM
paradigm as portrayed in the literature informs the processes that can
increase intangible assets so that corporate knowledge is preserved.
However, in some instances, knowledge management exists in a
universe of dynamic tension among the conflicting needs to respect
privacy and intellectual property (IP), to guard against data theft, to
protect national security and to stay within the laws. While the
Knowledge Management literature focuses on the bright side of the
paradigm, there is also a different side in which knowledge is
distorted, suppressed or misappropriated due to personal or
organisational motives (the paradox). This paper describes the ethical
paradoxes that occur within the taxonomy and deontology of
knowledge management and suggests that recognising both the
promises and pitfalls of KM requires wisdom.
Abstract: Designing and implementing intelligent systems has become a crucial factor for the innovation and development of better space-technology products. A neural network is a parallel system, capable of resolving paradigms that linear computing cannot. A field programmable gate array (FPGA) is a digital device with reprogrammable properties and robust flexibility. For a neural-network-based instrument prototype in a real-time application, conventional application-specific VLSI neural chip design suffers limitations in time and cost. With low-precision artificial neural network designs, FPGAs offer higher speed and smaller size for real-time applications than VLSI and DSP chips, so many researchers have made great efforts to realize neural networks (NNs) using FPGA techniques. In this paper, ANN and FPGA techniques are briefly introduced. VHDL code is proposed to implement ANNs, and simulation results with floating-point arithmetic are presented. Synthesis results for the ANN controller are developed using Precision RTL. The proposed VHDL implementation provides a flexible, fast method with a high degree of parallelism for implementing ANNs. Implementing a multi-layer NN using lookup tables (LUTs) reduces both the resources needed for implementation and the execution time.
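The LUT idea can be sketched in software (this is an illustration of the general technique, not the paper's VHDL; the table depth and input range are assumed): FPGA neuron designs often replace the sigmoid with a small precomputed lookup table of low-precision samples, trading accuracy for speed and area.

```python
import math

LUT_SIZE = 64               # assumed table depth
X_MIN, X_MAX = -8.0, 8.0    # assumed input range before saturation

# Precompute sigmoid samples once, as an FPGA block RAM would hold them.
SIGMOID_LUT = [1.0 / (1.0 + math.exp(-(X_MIN + i * (X_MAX - X_MIN) / (LUT_SIZE - 1))))
               for i in range(LUT_SIZE)]

def sigmoid_lut(x):
    """Approximate sigmoid by nearest-entry lookup, saturating outside range."""
    x = max(X_MIN, min(X_MAX, x))
    i = round((x - X_MIN) * (LUT_SIZE - 1) / (X_MAX - X_MIN))
    return SIGMOID_LUT[i]

# Even 64 entries keep the approximation error small across the range:
err = max(abs(sigmoid_lut(x / 10) - 1 / (1 + math.exp(-x / 10)))
          for x in range(-80, 81))
print(err)
```

The worst-case error here is bounded by half the sample spacing times the sigmoid's maximum slope, which is why low-precision tables suffice in practice.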
Abstract: The main aim of this paper was to investigate the
existing architecture in Cyprus, and thus identify and describe the
overall architectural rationale of the built environment. In Cyprus,
where individuals live in a society that reflects postmodern
paradigms rather than modern ones, the existing built environment
has many different reflections of the structure of its society.
Abstract: In the self-stabilizing algorithmic paradigm, each node has only a local view of the system, yet in a finite amount of time the system converges to a global state with the desired property. In a graph G = (V, E), a subset S ⊆ V is a 2-packing if ∀i ∈ V: |N[i] ∩ S| ≤ 1.
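The 2-packing property itself can be checked directly (this sketch verifies the property only; the self-stabilizing protocol is not reproduced): S ⊆ V is a 2-packing iff every closed neighbourhood N[i] contains at most one member of S.

```python
def is_2packing(adj, S):
    """adj: dict node -> set of neighbours; S: candidate set of nodes."""
    for i in adj:
        closed = adj[i] | {i}          # closed neighbourhood N[i]
        if len(closed & S) > 1:        # more than one member of S in N[i]
            return False
    return True

# Path graph 0-1-2-3-4: {0, 3} is a 2-packing, {0, 2} is not
path = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
print(is_2packing(path, {0, 3}))  # True
print(is_2packing(path, {0, 2}))  # False: N[1] contains both 0 and 2
```

Equivalently, any two members of S are at distance greater than 2, which is why {0, 2} fails on the path graph.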
Abstract: After Apple first introduced its smartphone, the iPhone, in Korea at the end of 2009, the number of Korean smartphone users increased so rapidly that half of the Korean population had become smartphone users as of February 2012. Currently, smartphones are positioned as a major digital medium with powerful influence in Korea, and Koreans now learn new information, enjoy games and communicate with other people anytime and anywhere. As smartphone performance increased, the number of usable services grew, while adequate GUI development was required to implement various functions on smartphones. The strategy of providing familiar experiences on smartphones, through familiar features based on the functions of existing media, contributed most to the popularization of smartphones, together with their iconic GUIs.
The spread of smartphones has increased mobile web access, so attempts to bring the PC web to the smartphone web continue to be made. The mobile web GUI provides familiar experiences to users through designs that adequately utilize the smartphone's GUIs. As users have grown familiar with smartphones and mobile web GUIs, PCs have started to adopt smartphone GUIs, a reversal of the usual direction of remediation.
This study defines this phenomenon as reversed remediation and reviews cases in which PCs exhibit reversed remediation of smartphone GUI characteristics. For this purpose, the research questions are as follows:
· What is reversed remediation?
· What are the characteristics of smartphone GUIs?
· What kind of interrelationship exists between the smartphone and the PC web site?
This study is meaningful for forecasting future GUI changes through an understanding of the characteristics of the paradigm shifts in PC and smartphone GUI design. It will also help in establishing strategies for digital device development and design.
Abstract: In Thailand, both the 1997 and the current 2007 Thai Constitutions have mentioned the establishment of independent organizations as a new mechanism to play a key role in proposing policy recommendations to national decision-makers in the interest of collective consumers. Over the last ten years, no independent organizations have yet been set up. Evidently, nobody could point out who should be key players in establishing provincial independent consumer bodies. The purpose of this study was to find definitive stakeholders in establishing and developing independent consumer bodies in a Thai context. This was a cross-sectional study between August and September 2007, using a postal questionnaire with telephone follow-up. The questionnaire was designed and used to obtain multiple stakeholder assessment of three key attributes (power, interest and influence). Study population was 153 stakeholders associated with policy decision-making, formulation and implementation processes of civil-based consumer protection in pilot provinces. The population covered key representatives from five sectors (academics, government officers, business traders, mass media and consumer networks) who participated in the deliberative forums at 10 provinces. A 49.7% response rate was achieved. Data were analyzed, comparing means of three stakeholder attributes and classification of stakeholder typology. The results showed that the provincial health officers were the definitive stakeholders as they had legal power, influence and interest in establishing and sustaining the independent consumer bodies. However, only a few key representatives of the provincial health officers expressed their own paradigm on the civil-based consumer protection. Most provincial health officers put their own standpoint of building civic participation at only a plan-implementation level. 
For effective policy implementation by the independent consumer bodies, the Thai government should provide budgetary support for the operation of the provincial health officers with their paradigm shift as well as their own clarified standpoint on corporate governance.
Abstract: In this paper challenges associated with a new
generation of Computer Science students are examined. The mode of
education in tertiary institutes has progressed slowly while the needs
of students have changed rapidly in an increasingly technological
world. The major learning paradigms and learning theories within
these paradigms are studied to find a suitable strategy for educating
modern students. These paradigms include Behaviourism,
Constructivism, Humanism and Cognitivism. Social Learning theory
and Elaboration theory are two theories that are further examined and
a survey is done to determine how these strategies will be received by
students. The results and findings are evaluated and indicate that
students are fairly receptive to a method that incorporates both Social
Learning theory and Elaboration theory, but that some aspects of all
paradigms need to be implemented to create a balanced and effective
strategy with technology as foundation.
Abstract: This paper describes a simulation model for analyzing artificial emotion injected into the design of game characters. Most game storyboards are interactive in nature, and the virtual characters of the game are equipped with an individual personality and a dynamic emotion value similar to real-life emotion and behavior. The uncertainty of real expression, mood and behavior is also exhibited in the game paradigm, and the present paper addresses this through a fuzzy-logic-based agent and storyboard. Subsequently, a pheromone distribution or labeling is presented, mimicking the behavior of social insects.
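The fuzzy-logic ingredient can be sketched as follows (the emotion labels, membership shapes and numbers are invented for illustration and are not the paper's model): a triangular membership function maps a character's numeric mood value to graded linguistic emotion sets, which a fuzzy agent would then feed into its rules.

```python
def tri(x, a, b, c):
    """Triangular membership: rises a->b, falls b->c, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify_mood(mood):
    """mood in [0, 1] -> membership degrees of three assumed emotion sets."""
    return {
        "sad":     tri(mood, -0.5, 0.0, 0.5),
        "neutral": tri(mood,  0.0, 0.5, 1.0),
        "happy":   tri(mood,  0.5, 1.0, 1.5),
    }

# A mood of 0.25 is partly "sad" and partly "neutral" at the same time,
# which is the graded uncertainty the fuzzy approach captures.
print(fuzzify_mood(0.25))
```

The overlapping memberships are what let a character's expression blend smoothly rather than switch abruptly between discrete emotional states.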
Abstract: Manufacturing industries face a crucial change as products and processes are required to be easily and efficiently reconfigurable and reusable. To stay competitive and flexible, situations also demand global distribution of enterprises, which requires the implementation of efficient communication strategies. A prototype system called the "Broadcaster" has been developed under the assumption that the control environment description has been engineered using the component-based system paradigm. This prototype distributes information to a number of globally distributed partners via an adoption of the circular-based data-processing mechanism. The work highlighted in this paper includes the implementation of this mechanism in the domain of the manufacturing industry. The proposed solution enables real-time remote propagation of machine information to a number of distributed supply-chain client resources, such as an HMI, VRML-based 3D views and remote client instances, regardless of their distribution nature and/or their mechanisms. This approach is presented together with a set of evaluation results. The authors' main concentration surrounds the reliability and performance metrics of the adopted approach. Performance evaluation is carried out in terms of the response times taken to process the data in this domain and is compared with an alternative data-processing implementation, the linear queue mechanism. Based on the evaluation results obtained, the authors justify the benefits achieved by the proposed implementation and highlight further research work to be carried out.
Abstract: For identifying the discriminative sequence features between exons and introns, a new paradigm, rescaled-range frameshift analysis (RRFA), was proposed. By RRFA, two new sequence features, the frameshift sensitivity (FS) and the accumulative penta-mer complexity (APC), were discovered and further integrated into a new feature of larger scale, the persistency in anti-mutation (PAM). Feature-validation experiments were performed on six model organisms to test the power of discrimination. All the experimental results strongly support that FS, APC and PAM are all distinguishing features between exons and introns. These newly identified sequence features provide new insights into the sequence composition of genes and have great potential to form a new basis for recognizing exon-intron boundaries in gene sequences.
Abstract: Cloud Computing has recently emerged as a
compelling paradigm for managing and delivering services over the
internet. The rise of Cloud Computing is rapidly changing the
landscape of information technology, ultimately turning the long-held promise of utility computing into a reality. As the development of the Cloud Computing paradigm progresses rapidly, concepts and terminologies are becoming imprecise and ambiguous, and different technologies are interfering with one another. Thus, it becomes crucial to
clarify the key concepts and definitions. In this paper, we present the
anatomy of Cloud Computing, covering its essential concepts,
prominent characteristics, its effects, architectural design and key
technologies. We differentiate various service and deployment
models. Also, significant challenges and risks need to be tackled in
order to guarantee the long-term success of Cloud Computing. The
aim of this paper is to provide a better understanding of the anatomy
of Cloud Computing and pave the way for further research in this
area.
Abstract: This paper proposes a novel approach that combines statistical models and support vector machines. A hybrid scheme that appropriately incorporates the advantages of both the generative and discriminative model paradigms is described and evaluated. Support vector machines (SVMs) are trained to divide the whole speaker space into small subsets of speakers within a hierarchical tree structure. During testing, a speech token is assigned to its corresponding group, and evaluation using Gaussian mixture models (GMMs) is then performed. Experimental results show that the proposed method can significantly improve the performance of the text-independent speaker identification task. We report improvements of up to a 50% reduction in identification error rate compared to the baseline statistical model.
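The second stage of such a hybrid scheme can be sketched in miniature (a heavily simplified illustration: single-component one-dimensional Gaussians stand in for full GMMs, and the speaker names and numbers are invented): once the SVM tree has narrowed the search to a small candidate group, each speaker in the group is scored by Gaussian log-likelihood and the best score wins.

```python
import math

def gauss_loglik(x, mean, var):
    """Log-likelihood of x under a 1-D Gaussian (stands in for a full GMM)."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def identify(feature, candidates, models):
    """Pick the candidate speaker whose model best explains the feature."""
    return max(candidates, key=lambda spk: gauss_loglik(feature, *models[spk]))

models = {"alice": (0.0, 1.0), "bob": (3.0, 1.0), "carol": (6.0, 1.0)}
# The SVM stage (not shown) has already ruled out "carol" for this token:
print(identify(2.4, ["alice", "bob"], models))  # "bob": mean 3.0 is closest
```

Restricting the expensive likelihood evaluation to a pruned candidate set is the efficiency argument for placing the discriminative tree in front of the generative scoring.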
Abstract: Many studies have applied the Theory of Planned
Behavior (TPB) in predicting health behaviors among unique
populations. However, a new paradigm is emerging where focus is
now directed to modification and expansion of the TPB model rather
than utilization of the traditional theory. This review proposes new
models modified from the Theory of Planned Behavior and suggests
an appropriate study design that can be used to test the models within
physical activity and dietary practice domains among Type 2
diabetics in Kenya. The review was conducted by means of literature
search in the field of nutrition behavior, health psychology and
mixed methods using predetermined key words. The results identify
pre-intention and post intention gaps within the TPB model that need
to be filled. Additional psychosocial factors are proposed for inclusion in the TPB model to generate new models, whose efficacy is to be tested using a mixed-methods design.
Abstract: Culture and family structure provide a sense of security. Further, the chrono-, macro- and micro-contexts of development influence developmental transitions and timetables, particularly owing to variations in the macrosystem associated with non-normative life events such as migration. Migration threatens family links, security and
attachment bonds. Rising migratory trends have prompted an
increased interest in migration consequences on familial bonds,
developmental autonomy, socialization process, and sense of
security. This paper takes a narrative approach and applies the
attachment paradigm from a lifespan perspective, to examine the
settlement experiences of an India-born migrant student in Sydney,
Australia. It focuses on her quest to preserve family ties; her remote
secure base; her continual struggle to balance dependency and
autonomy, a major developmental milestone. As positional parental
power is culturally more potent in the Indian society, the paper
therefore raises some important concerns related to cultural
expectations, adaptation, acculturative stress and sense of security.
Abstract: The Partitioned Global Address Space (PGAS) programming
paradigm offers ease-of-use in expressing parallelism
through a global shared address space while emphasizing performance
by providing locality awareness through the partitioning of
this address space. Therefore, the interest in PGAS programming
languages is growing and many new languages have emerged and
are becoming ubiquitously available on nearly all modern parallel
architectures. Recently, new parallel machines with multiple cores have been designed to target high-performance applications. Most of the effort has gone into benchmarking, but there are only a few examples of real high-performance applications running on multicore machines.
In this paper, we present and evaluate a parallelization technique
for implementing a local DNA sequence alignment algorithm using
a PGAS based language, UPC (Unified Parallel C) on a chip
multithreading architecture, the UltraSPARC T1.
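The sequential core of local DNA sequence alignment can be sketched as follows (the Smith-Waterman recurrence is the standard local alignment algorithm; the paper's UPC parallelization is not reproduced, and the scoring values below are common defaults rather than the paper's):

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Return the best local alignment score between strings a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            # Local alignment floors every cell at zero.
            H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman("GATTACA", "GATTACA"))  # 14: seven matches at +2 each
print(smith_waterman("AAAA", "TTTT"))        # 0: no local similarity
```

The anti-diagonals of the matrix H are mutually independent, which is the data-parallel structure a PGAS implementation such as the paper's UPC version can exploit across cores.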
Abstract: This paper describes a paradigmatic approach to developing the architecture of secure systems by describing the requirements from four different points of view: those of the owner, the administrator, the user, and the network. Deriving requirements and developing architecture implies jointly eliciting and describing the problem and the structure of the solution. The viewpoints proposed in this paper are those of the parties we consider major contributors to the design, implementation, usage and maintenance of secure systems. The dramatic growth of Internet technology and of the applications deployed on the World Wide Web has led to a situation where security has become a very important concern in systems development. Many security approaches are currently used in organizations, yet in spite of the widespread use of many different security solutions, security remains a problem. It is argued that the approach described in this paper for the development of secure architecture is practical by all means. The models representing these multiple points of view are termed the requirements model (views of the owner and administrator) and the operations model (views of the user and network). In this paper, this multiple-view paradigm is explained by first describing the specific requirements and/or characteristics of secure systems (particularly in the domain of networks) and then the secure architecture / system development methodology.
Abstract: Nowadays, ontologies are the only widely accepted paradigm for the management of sharable and reusable knowledge in a way that allows its automatic interpretation. They are collaboratively created across the Web and used to index, search and annotate documents. The vast majority of the ontology based approaches, however, focus on indexing texts at document level. Recently, with the advances in ontological engineering, it became clear that information indexing can largely benefit from the use of general purpose ontologies which aid the indexing of documents at word level. This paper presents a concept indexing algorithm, which adds ontology information to words and phrases and allows full text to be searched, browsed and analyzed at different levels of abstraction. This algorithm uses a general purpose ontology, OntoRo, and an ontologically tagged corpus, OntoCorp, both developed for the purpose of this research. OntoRo and OntoCorp are used in a two-stage supervised machine learning process aimed at generating ontology tagging rules. The first experimental tests show a tagging accuracy of 78.91% which is encouraging in terms of the further improvement of the algorithm.
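The word-level concept indexing idea can be illustrated in miniature (OntoRo and OntoCorp are not public in this abstract, so the tiny "ontology" and its concept paths below are invented): each word is annotated with its concept at a chosen level of abstraction, so text can later be searched and analyzed at the concept level rather than the word level.

```python
CONCEPTS = {            # word -> assumed concept path, most general first
    "car":   ["artifact", "vehicle", "car"],
    "truck": ["artifact", "vehicle", "truck"],
    "rose":  ["organism", "plant", "rose"],
}

def index_at_level(words, level):
    """Annotate each known word with its concept at the given abstraction level."""
    return [(w, CONCEPTS[w][min(level, len(CONCEPTS[w]) - 1)])
            for w in words if w in CONCEPTS]

# At level 1, "car" and "truck" collapse to the same concept "vehicle",
# so a concept-level search for vehicles would retrieve both:
print(index_at_level(["car", "truck", "rose", "the"], 1))
```

A real system would disambiguate words with multiple senses using learned tagging rules, which is precisely the part the paper's two-stage supervised learning process addresses.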
Abstract: In order to maximize the efficiency of an information management platform and to assist in decision making, the collection, storage and analysis of performance-relevant data have become of fundamental importance. This paper addresses the merits and drawbacks of the OLAP paradigm for efficiently navigating large volumes of performance measurement data hierarchically. System managers or database administrators navigate through adequately (re)structured measurement data, aiming to detect performance bottlenecks, identify causes of performance problems or assess the impact of configuration changes on the system and its representative metrics. Of particular importance is finding the root cause of an imminent problem threatening the availability and performance of an information system. Leveraging OLAP techniques, in contrast to traditional static reporting, this can be accomplished within a moderate amount of time and with little processing complexity. It is shown how OLAP techniques can help improve the understandability and manageability of measurement data and, hence, improve the whole performance analysis process.
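An OLAP-style roll-up over measurement data can be sketched minimally (the schema, host names and numbers are invented for illustration): aggregating response times from the (host, hour) level up to the host level, as an administrator drilling up a time hierarchy would.

```python
from collections import defaultdict

MEASUREMENTS = [             # (host, hour, response_ms)
    ("db1", 10, 120), ("db1", 11, 180), ("web1", 10, 40), ("web1", 11, 60),
]

def roll_up(rows):
    """Collapse the hour dimension: average response time per host."""
    sums = defaultdict(lambda: [0, 0])      # host -> [total_ms, count]
    for host, _hour, ms in rows:
        sums[host][0] += ms
        sums[host][1] += 1
    return {host: total / n for host, (total, n) in sums.items()}

print(roll_up(MEASUREMENTS))  # {'db1': 150.0, 'web1': 50.0}
```

Drilling back down would simply re-expose the hour dimension; the point of the hierarchical (re)structuring is that such moves are cheap compared to re-running a static report.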
Abstract: The current trend of increasing quality and demands on the final product is affected by time analysis of the entire manufacturing process. The primary requirement of manufacturing is to produce as many products as possible, as quickly as possible, at the lowest possible cost, but of course with the highest quality. Such requirements can be satisfied only if all the elements entering and affecting the production cycle are in a fully functional condition. These elements include the sensory equipment and intelligent control elements that are essential for building intelligent manufacturing systems. The intelligent manufacturing paradigm includes a new approach to production system structure design. Intelligent behavior is based on monitoring important parameters of the system and its environment, and on flexible reaction to changes. Realizing and utilizing this design paradigm as an "intelligent manufacturing system" enables the system to react flexibly to production requirements as well as to environmental changes. The results of this flexibility are a smaller layout space, decreased production and investment costs and increased productivity. An intelligent manufacturing system should itself be a system that can flexibly respond to changes entering and exiting the process in interaction with its surroundings.