Abstract: A compact 1x3 power splitter based on photonic crystal waveguides (PCWs) with a flexible power-splitting ratio is presented in this paper. A multimode interference (MMI) coupler is integrated with the PCW. The reduction in device size compared with a conventional MMI power splitter is attributed to the large dispersion of the PCW. The Band Solve tool is used to calculate the band structure of the PCW, and the finite-difference time-domain (FDTD) method is adopted to simulate the structure at a wavelength of 1550 nm. The device is polarization insensitive and allows the output (o/p) powers to be controlled within certain percentage points for both polarizations.
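As background, the FDTD method referenced above marches Maxwell's equations forward in time on a staggered grid. The following is a minimal one-dimensional sketch of the update scheme in Python, purely illustrative and far simpler than the two-dimensional photonic crystal simulation the paper performs; the grid size, time step, and source parameters are all invented.

```python
import numpy as np

# Minimal 1D FDTD sketch (illustrative only). Normalized units: c = 1.
nx, nt = 400, 1000
dx = 1.0
dt = 0.5 * dx          # Courant-stable time step for 1D (S = 0.5)

ez = np.zeros(nx)      # electric field samples
hy = np.zeros(nx)      # magnetic field, staggered half a cell

for n in range(nt):
    # Update H from the spatial derivative of E
    hy[:-1] += dt / dx * (ez[1:] - ez[:-1])
    # Update E from the spatial derivative of H
    ez[1:] += dt / dx * (hy[1:] - hy[:-1])
    # Soft source: a Gaussian pulse injected at one grid point
    ez[50] += np.exp(-((n - 60) / 20.0) ** 2)

print("peak |Ez| after propagation:", np.abs(ez).max())
```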
Abstract: Factoring Boolean functions is one of the basic operations in algorithmic logic synthesis. A novel algebraic factorization heuristic for single-output combinational logic functions, developed on a set-theory paradigm, is presented in this paper. The impact of factoring is analyzed mainly from a low-power design perspective for standard-cell-based digital designs. The physical implementations of a number of MCNC/IWLS combinational benchmark functions and sub-functions are compared before and after factoring, based on a simple technology-mapping procedure utilizing only standard gate primitives (readily available as standard cells in a technology library) rather than cells corresponding to optimized complex logic. The power results were obtained at the gate level by means of an industry-standard power analysis tool from Synopsys, targeting a 130 nm (0.13 μm) UMC CMOS library for the typical case. Wire loads were inserted automatically and the simulations were performed with maximum input activity. The gate-level simulations demonstrate the advantage of the proposed factoring technique over other existing methods from a low-power perspective for arbitrary examples. Though the benchmark experiments report mixed results, the mean savings in total power and dynamic power for the factored solution over a non-factored solution were 6.11% and 5.85% respectively. In terms of leakage power, the average savings for the factored forms was a significant 23.48%. The factored solution is also expected to better its non-factored counterpart in terms of the power-delay product, as it is well known that factoring, in general, yields a delay-efficient multi-level solution.
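The abstract does not detail the heuristic itself, but algebraic factoring in general extracts shared sub-expressions from a sum-of-products form. As a hypothetical toy illustration (not the authors' set-theoretic heuristic), the Python sketch below pulls the largest common literal set out of a list of product terms:

```python
# Toy illustration of algebraic factoring: pull the common literal set
# out of a sum-of-products expression. This is a naive sketch, not the
# paper's set-theory-based heuristic.
def factor_common(cubes):
    """cubes: list of product terms, each a set of literals,
    e.g. [{'a','b'}, {'a','c'}] for ab + ac."""
    common = set.intersection(*cubes)
    if not common:
        return None
    remainder = [cube - common for cube in cubes]
    return common, remainder

# ab + ac + abd  ->  a(b + c + bd)
cubes = [{'a', 'b'}, {'a', 'c'}, {'a', 'b', 'd'}]
common, rest = factor_common(cubes)
print("common factor:", common)     # {'a'}
print("remainder terms:", rest)     # [{'b'}, {'c'}, {'b', 'd'}]
```

In a multi-level network, factoring of this kind reduces the literal count and hence the number and size of mapped gates, which is the mechanism behind the power savings discussed above.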
Abstract: This article analyses conspiracy theories as part of the
wider discourses of missionary politics. It presents a case study of
Venezuela and describes how its leaders use conspiracy theories as
political tools. Through quotes taken from Venezuelan president Chávez's public speeches and other sources, and through a short analysis of the ideological basis of his discourses, it shows how
conspiracy theories are constructed and how they affect the local
political praxis. The article also describes how conspiracy theories
have been consistently used as an important part of the construction of
a political religion for the New Man of the Bolivarian Revolution. It
concludes that the use of conspiracy theories by political leaders
produces a sense of loss of political agency.
Abstract: Vector quantization is a powerful tool for speech coding applications. This paper deals with LPC coding of speech signals using a new technique called Multi Switched Split Vector Quantization. This is a hybrid of two product-code vector quantization techniques, namely the multistage vector quantization technique and the switched split vector quantization technique. The Multi Switched Split Vector Quantization technique quantizes the linear predictive coefficients in terms of line spectral frequencies. The results show that Multi Switched Split Vector Quantization provides a better trade-off between bitrate and spectral distortion performance, computational complexity, and memory requirements when compared to the switched split vector quantization, multistage vector quantization, and split vector quantization techniques. By employing the switching technique at each stage of the vector quantizer, the spectral distortion, computational complexity, and memory requirements were greatly reduced. Spectral distortion was measured in dB, computational complexity in floating-point operations (flops), and memory requirements in floats.
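To picture the product-code structure described above: a split vector quantizer partitions each LSF vector into sub-vectors quantized independently, and a multistage quantizer feeds the residual of one stage into the next. The Python sketch below combines the two; the vector dimension, stage count, and random codebooks are invented placeholders, not trained codebooks.

```python
import numpy as np

rng = np.random.default_rng(0)

def vq_nearest(codebook, x):
    """Return the codeword nearest to x (full-search VQ)."""
    idx = np.argmin(((codebook - x) ** 2).sum(axis=1))
    return codebook[idx]

# Hypothetical setup: 10-dim LSF vector split into two 5-dim parts,
# two quantization stages, 16 codewords per part per stage.
x = rng.random(10)
stages = [[rng.random((16, 5)) for _ in range(2)] for _ in range(2)]

residual = x.copy()
reconstruction = np.zeros_like(x)
for stage_books in stages:
    parts = np.split(residual, 2)                 # split VQ on this stage
    quantized = np.concatenate(
        [vq_nearest(book, part) for book, part in zip(stage_books, parts)])
    reconstruction += quantized
    residual = residual - quantized               # residual feeds next stage

print("quantization error:", np.linalg.norm(x - reconstruction))
```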
Abstract: The aim of this study is to examine the reading comprehension scores of Turkish 5th-grade students according to the variables given in the student questionnaire. A total of 279 5th-grade students, who studied at 10 different primary schools in four districts of Ankara in the 2008-2009 academic year, participated in this descriptive survey study. Two data collection tools were used in the study: a “Reading Comprehension Test" and a “Student Information Questionnaire". Independent-samples t-tests, one-way ANOVA, and two-way ANOVA were used in the analyses of the gathered data. The results of the study indicate that the reading comprehension scores of the students differ significantly according to the sex of the students, the number of books in their homes, the frequency of summarizing activities on the reading texts, and the frequency of free reading hours provided by their teachers; but they do not differ significantly according to the educational level of their mothers and fathers.
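The analyses named above are standard significance tests. Purely to show the mechanics (on made-up scores, not the study's data), a minimal SciPy sketch:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical reading-comprehension scores for two groups (e.g. by sex)
girls = rng.normal(70, 10, 140)
boys = rng.normal(66, 10, 139)

t, p = stats.ttest_ind(girls, boys)
print(f"independent-samples t-test: t = {t:.2f}, p = {p:.4f}")

# Hypothetical scores grouped by number of books at home (3 levels)
few, some, many = (rng.normal(m, 10, 90) for m in (64, 68, 72))
f, p = stats.f_oneway(few, some, many)
print(f"one-way ANOVA: F = {f:.2f}, p = {p:.4f}")
```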
Abstract: The importance of good requirements engineering is well documented. Agile practices, which promote collaboration and communication, facilitate the elicitation and management of volatile requirements. However, current Agile practices assume a well-defined environment: it is necessary to have a co-located customer, and with distributed development it is not always possible to realize this co-location. In such an environment, a suitable process, possibly supported by tools, is required to support changing requirements. This paper introduces the issues of concern when managing requirements in a distributed environment and describes work done at the Software Technology Research Centre as part of the NOMAD project.
Abstract: This article presents the development of efficient algorithms for comparing tablet copies. Image recognition has specialized uses in digital systems such as medical imaging, computer vision, defense, and communication. Comparing two images that look indistinguishable is a formidable task: two images taken from different sources might look identical but, owing to different digitizing properties, they are not, and small variations in image information such as cropping, rotation, and slight photometric alteration defeat naive matching techniques. In this paper we introduce different matching algorithms designed to help art centers identify real painting images from fake ones. Different vision algorithms for local image features are implemented using MATLAB. In this framework a Table Comparison Computer Tool (TCCT) is designed to facilitate our research. The TCCT is a Graphical User Interface (GUI) tool used to identify images by their shapes and objects. The parameters of the vision system are fully accessible to the user through this graphical user interface. For matching, the tool applies different description techniques that can identify the exact figures of objects.
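Local-feature matching of the kind described is commonly implemented with keypoint descriptors that tolerate cropping, rotation, and photometric change. The following OpenCV sketch in Python is an analogue, not the paper's MATLAB tool; the image paths and the match-distance threshold are placeholders.

```python
import cv2

# Illustrative local-feature matching with ORB (rotation-tolerant,
# partially robust to photometric change). The image paths below are
# placeholders, not files from the paper.
img1 = cv2.imread("original_painting.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("suspect_copy.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Hamming distance suits ORB's binary descriptors; cross-check enforces
# mutually nearest matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# A high share of low-distance matches suggests the two images depict
# the same work; a low share suggests a different (or fake) copy.
good = [m for m in matches if m.distance < 40]   # threshold is illustrative
print(f"{len(good)} strong matches out of {len(matches)}")
```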
Abstract: In the present investigation, H13 tool steel has been deposited on a copper alloy substrate using both CO2 and diode lasers. A detailed parametric analysis has been carried out in order to find the optimum processing zone for coating defect-free H13 tool steel on the copper alloy substrate. Following the parametric optimization, the microstructure and microhardness of the deposited clads were evaluated. SEM micrographs revealed a dendritic microstructure in both clads; however, the microhardness of the CO2 laser-deposited clad was much higher than that of the diode laser-deposited clad.
Abstract: Perceptions of quality from both the designer's and the user's perspective have now stretched beyond traditional usability, incorporating abstract and subjective concepts. This has led to a shift in the focus of human-computer interaction research communities: a shift toward achieving user experience (UX) by fulfilling not only conventional usability needs but also those that go beyond them. The term UX, although widespread and given significant importance, lacks a consensus definition. In this paper, we survey various UX definitions and modeling frameworks and examine them as the foundation for proposing a UX evolution lifecycle framework for understanding UX in detail. In the proposed framework we identify the building blocks of UX and discuss how UX evolves in various phases. The framework can be used as a tool to understand experience requirements and evaluate them, resulting in better UX design and hence improved user satisfaction.
Abstract: In the proposed method for Web page ranking, a novel theoretic model is introduced and tested on examples of order relationships among IP addresses. Ranking is induced using a convexity feature, which is learned from these examples using a self-organizing procedure. We consider the problem of self-organizing learning from IP data, represented by a semi-random convex polygon procedure in which the vertices correspond to IP addresses. Based on recent developments in our regularization theory for convex polygons and the corresponding Euclidean-distance-based methods for classification, we develop an algorithmic framework for learning ranking functions grounded in computational geometric theory. We show that our algorithm is generic, and present experimental results demonstrating the potential of our approach. In addition, we illustrate the generality of our approach by showing its possible use as a visualization tool for data obtained from diverse domains, such as public administration and education.
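The abstract leaves the convexity feature abstract. Purely to fix ideas, the following hypothetical Python sketch embeds IPv4 addresses as 2D points and uses their convex hull and spread as a crude ordering signal; it illustrates the general flavor of the idea, not the authors' algorithm.

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(2)

def ip_to_point(ip):
    """Map an IPv4 address to a 2D point (first two vs last two octets)."""
    a, b, c, d = (int(o) for o in ip.split("."))
    return (a * 256 + b, c * 256 + d)

# Invented example IPs standing in for the paper's training examples
ips = [f"10.{rng.integers(256)}.{rng.integers(256)}.{rng.integers(256)}"
       for _ in range(30)]
pts = np.array([ip_to_point(ip) for ip in ips])

hull = ConvexHull(pts)        # the convex polygon over the embedded IPs
centroid = pts.mean(axis=0)

# Crude ranking signal: distance from the centroid of the point cloud,
# so hull vertices (extreme points) rank first.
order = np.argsort(-np.linalg.norm(pts - centroid, axis=1))
print("hull vertices:", sorted(hull.vertices.tolist()))
print("top-ranked IPs:", [ips[i] for i in order[:5]])
```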
Abstract: Decision support based upon risk analysis for comparing electricity generation from different renewable energy technologies can provide information about their effects on the environment and society. The aim of this paper is to develop an assessment framework covering the risks to health and the environment, and the societal benefits, of electric power generation from different renewable sources. A multicriteria framework combining the multiattribute risk analysis technique with the decision analysis interview technique is applied to support the decision-making process for implementing renewable energy projects in the Bangkok case study. After analysing the local conditions and appropriate technologies, five renewable power plants are postulated as options. As this work demonstrates, the analysis can provide a tool to aid decision-makers in achieving targets related to promoting a sustainable energy system.
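Multiattribute analysis of the kind described typically aggregates criterion scores with weights elicited in decision-analysis interviews. The Python sketch below shows a bare-bones additive scoring of five options; the technologies, criteria, weights, and scores are invented placeholders, not the study's data.

```python
import numpy as np

# Hypothetical criteria weights (health risk, environmental risk,
# social benefit), as if elicited from decision-maker interviews.
weights = np.array([0.4, 0.35, 0.25])

# Hypothetical normalized scores per option (higher = better).
options = {
    "solar PV":        np.array([0.9, 0.8, 0.6]),
    "wind":            np.array([0.8, 0.7, 0.5]),
    "biomass":         np.array([0.6, 0.5, 0.8]),
    "small hydro":     np.array([0.7, 0.6, 0.7]),
    "waste-to-energy": np.array([0.5, 0.4, 0.9]),
}

# Simple additive multiattribute value: weighted sum of scores.
for name, scores in sorted(options.items(),
                           key=lambda kv: -(weights @ kv[1])):
    print(f"{name:16s} score = {weights @ scores:.3f}")
```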
Abstract: The ARMrayan Multimedia Mobile CMS (Content Management System) is the first mobile CMS that enables users to create multimedia J2ME mobile applications with their desired content, design, and logo, simply and without the need to write even a line of code. The low-level programming and compatibility problems of J2ME, along with UI design difficulties, make it hard for most people, even programmers, to broadcast their content to the mobile phones used by nearly everyone. This system provides user-friendly, PC-based tools for creating a tree index of pages and inserting multiple multimedia contents (e.g. sound, video, and pictures) in each page of a J2ME mobile application. The output is a stand-alone Java mobile application that has a user interface, shows texts and pictures, and plays music and videos regardless of the type of device used, as long as the device supports the J2ME platform. Bitmap fonts have also been used, so Middle Eastern languages can be easily supported on all mobile phone devices. We have hidden programming concepts from users in order to simplify the design of multimedia content-oriented mobile applications for use in educational, cultural, or marketing centers. Ordinary operators can now create a variety of multimedia mobile applications, such as tutorials, catalogues, books, and guides, in minutes rather than months. Simplicity and power have been the goals of this CMS. In this paper, we present the software engineering and design concepts of the ARMrayan MCMS along with the implementation challenges faced and the solutions adopted.
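The page structure described above is essentially a tree whose nodes carry text and media attachments. A hypothetical minimal model of such a page index in Python (all names are invented for illustration; the actual CMS generates a J2ME application from its index):

```python
from dataclasses import dataclass, field

@dataclass
class Page:
    """One node of the CMS page tree: text plus attached media files."""
    title: str
    text: str = ""
    media: list = field(default_factory=list)   # e.g. sound/video/picture paths
    children: list = field(default_factory=list)

    def add_child(self, page):
        self.children.append(page)
        return page

# Hypothetical tree index for a small tutorial application
root = Page("Tutorial")
intro = root.add_child(Page("Introduction", text="Welcome!"))
intro.media.append("intro.mp3")
root.add_child(Page("Chapter 1", media=["figure1.png", "clip1.3gp"]))

def walk(page, depth=0):
    print("  " * depth + f"{page.title} ({len(page.media)} media)")
    for child in page.children:
        walk(child, depth + 1)

walk(root)
```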
Abstract: Systems Analysis and Design is a key subject in Information Technology courses, but students do not find it easy to cope with, since it is not “precise" like programming nor exact like mathematics. It is a subject that works with many concepts, modeling ideas into visual representations and then translating the pictures into a real-life system. To complicate matters, users who are not necessarily familiar with computers need to give their input to ensure that they get the system they need. Systems Analysis and Design also covers two fields: Analysis, focusing on the analysis of the existing system, and Design, focusing on the design of the new system. To test the analysis and design of a system, it is necessary to develop the system, or at least a prototype of it, to check the validity of the analysis and design. The skills necessary for each aspect differ vastly; project management skills, database knowledge, and object-oriented principles are all necessary. In the context of a developing country, where students enter tertiary education underprepared and the digital divide is alive and well, students need to be motivated to learn the necessary skills and be given an opportunity to test them in a “live" but protected environment, within the framework of a university. The purpose of this article is to improve the learning experience in Systems Analysis and Design by reviewing the underlying teaching principles used, the teaching tools implemented, the observations made, and the reflections that will influence future developments in Systems Analysis and Design. Action research principles allow the focus to be on a few problematic aspects during a particular semester.
Abstract: Detecting protein-protein interactions is a central problem in computational biology, and aberrant interactions have been implicated in a number of neurological disorders. As a result, the prediction of protein-protein interactions has recently received considerable attention from biologists around the globe, and computational tools that can effectively identify protein-protein interactions are much needed. In this paper, we propose a method to detect protein-protein interactions based on a substring similarity measure: two protein sequences may interact by virtue of the similarities of the substrings they contain. When applied to the currently available protein-protein interaction data for the yeast Saccharomyces cerevisiae, the proposed method delivered a reasonable improvement over existing methods.
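The substring-similarity idea can be made concrete with k-mers: two sequences that share many length-k substrings score as similar. The Python sketch below uses Jaccard similarity over k-mer sets as one plausible instantiation; the paper's exact measure, choice of k, and decision threshold are not specified, so those choices here are assumptions.

```python
def kmers(seq, k=3):
    """All length-k substrings of a protein sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def substring_similarity(p, q, k=3):
    """Jaccard similarity between the k-mer sets of two sequences."""
    a, b = kmers(p, k), kmers(q, k)
    return len(a & b) / len(a | b) if a | b else 0.0

# Toy protein fragments; a score above a tuned threshold would be
# predicted as an interaction (0.1 is an arbitrary example cutoff).
p1 = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
p2 = "MKTAYIAKQRQISFVKSHFARQLEERLGLIEVQ"
score = substring_similarity(p1, p2)
print(f"similarity = {score:.2f}, predicted interaction = {score > 0.1}")
```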
Abstract: The manufacture of large-scale precision aerospace
components using CNC requires a highly effective maintenance
strategy to ensure that the required accuracy can be achieved over
many hours of production. This paper reviews a strategy for a
maintenance management system based on Failure Mode Avoidance,
which uses advanced techniques and technologies to underpin a
predictive maintenance strategy. It is shown how condition
monitoring (CM) is important to predict potential failures in high
precision machining facilities and achieve intelligent and integrated
maintenance management. There are two distinct ways in which CM
can be applied. One is to monitor key process parameters and
observe trends which may indicate a gradual deterioration of
accuracy in the product. The other is to use CM techniques to monitor high-status machine parameters, enabling trends to be observed and corrected before machine failure and downtime occur.
It is concluded that the key to developing a flexible and intelligent maintenance framework in any precision manufacturing operation is the ability to reliably and routinely evaluate machine tool condition using condition monitoring techniques within a framework of Failure Mode Avoidance.
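Trend monitoring of the first kind can be illustrated with a rolling-mean drift check on a monitored parameter. The Python sketch below is schematic; the signal, window length, and alarm limit are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical monitored parameter (e.g. spindle vibration) with a slow
# drift injected after sample 300 to mimic gradual deterioration.
signal = rng.normal(1.0, 0.05, 600)
signal[300:] += np.linspace(0, 0.4, 300)

window, limit = 50, 1.15   # rolling window and illustrative alarm limit

# Rolling mean smooths noise so a sustained trend stands out
rolling = np.convolve(signal, np.ones(window) / window, mode="valid")
alarms = np.nonzero(rolling > limit)[0]
if alarms.size:
    print(f"trend alarm at sample {alarms[0] + window - 1}: "
          f"rolling mean {rolling[alarms[0]]:.3f} exceeds {limit}")
```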
Abstract: The scale, complexity and worldwide geographical
spread of the LHC computing and data analysis problems are
unprecedented in scientific research. The complexity of processing
and accessing this data is increased substantially by the size and
global span of the major experiments, combined with the limited
wide area network bandwidth available. We present the latest
generation of the MONARC (MOdels of Networked Analysis at
Regional Centers) simulation framework, as a design and modeling
tool for large scale distributed systems applied to HEP experiments.
We present simulation experiments designed to evaluate the capability of the current real-world distributed infrastructure to support existing physics analysis processes, and the means by which the experiments band together to meet the technical challenges posed by the storage, access, and computing requirements of LHC data analysis within the CMS experiment.
Abstract: Some quality control tools use non-metric subjective information coming from experts, who qualify the intensity of the relations existing inside processes without quantifying them. In this paper we develop a quality control analytic tool that measures the impact, or strength, of the relationship between process operations and product characteristics. The tool includes two models: a qualitative model, allowing relationships to be described and analysed, and a formal quantitative model, by means of which relationship quantification is achieved. In the first, concepts from graph theory were applied to identify those process elements which can be sources of variation, that is, those quality characteristics or operations that have some sort of primacy over the others and that should become control items. The most dependent elements can also be identified, that is, those elements receiving the effects of the elements identified as variation sources. If controls are focused on those dependent elements, the efficiency of control is compromised by the fact that we are controlling effects, not causes. The second model adapts the multivariate statistical technique of covariance structure analysis. This approach allowed us to quantify the relationships. The computer package LISREL was used to obtain statistics and to validate the model.
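The graph-theoretic part of the qualitative model can be pictured as a directed graph over operations and quality characteristics: nodes with no incoming edges act as sources of variation, while nodes receiving many edges are the dependent elements. A toy Python sketch follows (the elements and edges are invented, not the paper's process):

```python
# Toy directed graph of process relations: an edge (u, v) means
# "u influences v". The elements are invented for illustration.
edges = [("op_mixing", "char_viscosity"),
         ("op_mixing", "char_density"),
         ("op_curing", "char_hardness"),
         ("char_viscosity", "char_hardness")]

nodes = {n for e in edges for n in e}
in_deg = {n: 0 for n in nodes}
out_deg = {n: 0 for n in nodes}
for u, v in edges:
    out_deg[u] += 1
    in_deg[v] += 1

# Sources of variation: influence others but are influenced by nothing.
sources = [n for n in nodes if in_deg[n] == 0 and out_deg[n] > 0]
# Most dependent element: receives the most influences.
dependent = max(nodes, key=lambda n: in_deg[n])

print("control these (sources of variation):", sorted(sources))
print("most dependent element (an effect, not a cause):", dependent)
```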
Abstract: In the past decade, artificial neural networks (ANNs) have been regarded as an instrument for problem-solving and decision-making, and they have already delivered substantial efficiency and effectiveness improvements in industry and business. In this paper, back-propagation neural networks (BPNs) are modularized to implement the collaborative forecasting (CF) function of a Collaborative Planning, Forecasting and Replenishment (CPFR®) system. CPFR balances sufficient product supply against the necessary customer demand in a Supply and Demand Chain (SDC). Several classical standard BPNs are grouped, made to collaborate, and exploited for an easy implementation of the proposed modular ANN framework based on the topology of an SDC. Each individual BPN is applied as a modular tool to forecast the SKU (stock-keeping unit) levels that are managed and supervised at a POS (point of sale), a wholesaler, and a manufacturer in the SDC. The proposed modular BPN-based CF system is exemplified and experimentally verified using numerous datasets from the simulated SDC. The experimental results show that a complex CF problem can be divided into a group of simpler sub-problems based on the single independent trading partners distributed over the SDC, and that the SKU forecasting accuracy was satisfactory when the forecasted values were compared to the original simulated SDC data. The primary task of implementing autonomous CF involves the study of supervised ANN learning methodology, which aims at making “knowledgeable" decisions for the best SKU sales plan and stock management.
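Each modular forecaster is a standard back-propagation network. Below is a minimal one-hidden-layer BPN sketch in Python trained on a toy demand series; the architecture, learning rate, and data are placeholders, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy SKU demand series; predict the next value from the last 4.
series = 50 + 10 * np.sin(np.arange(120) / 6) + rng.normal(0, 1, 120)
X = np.array([series[i:i + 4] for i in range(len(series) - 4)]) / 100
y = series[4:] / 100

# One-hidden-layer BPN trained with plain gradient descent.
W1, b1 = rng.normal(0, 0.5, (4, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
lr = 0.1

for epoch in range(2000):
    h = np.tanh(X @ W1 + b1)                  # hidden activations
    pred = (h @ W2 + b2).ravel()              # linear output
    err = pred - y
    # Back-propagate the mean-squared-error gradient
    g_out = err[:, None] / len(y)
    W2 -= lr * h.T @ g_out;  b2 -= lr * g_out.sum(axis=0)
    g_h = (g_out @ W2.T) * (1 - h ** 2)       # tanh derivative
    W1 -= lr * X.T @ g_h;    b1 -= lr * g_h.sum(axis=0)

print(f"final RMSE (normalized units): {np.sqrt((err ** 2).mean()):.4f}")
```

In the modular framework described above, one such network would sit at each trading partner (POS, wholesaler, manufacturer), each trained on that partner's own SKU data.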
Abstract: The successful implementation of Service-Oriented Architecture (SOA) is not confined to Information Technology systems; it requires changes across the whole enterprise. In order to align IT with the business, the enterprise requires adequate and measurable methods. The adoption of SOA creates new problems with regard to measuring and analysing performance: the enterprise should investigate to what extent the development of services will increase the value of the business, and every business needs to measure how well its SOA adoption matches the goals of the enterprise. Moreover, precise performance metrics, combined with advanced evaluation methodologies, should be defined as a solution. The aim of this paper is to present a systematic methodology for designing a measurement system at the technical and business levels, so that (1) measurement metrics are determined precisely, and (2) the results are analysed by mapping the identified metrics to the measurement tools.
Abstract: Electrochemical discharge machining (ECDM) is an emerging hybrid machining process used in the precision machining of hard and brittle non-conducting materials. The present paper gives a critical review of the materials machined by ECDM under the prevailing machining conditions, and the capability indicators of the process are reported. Some results obtained while performing micro-channeling experiments on soda-lime glass using ECDM are also presented. In these experiments, tool wear (TW) and material removal (MR) were studied using design of experiments and an L4 orthogonal array. The experimental results showed that the applied voltage was the most influential parameter in both the MR and TW studies. Field emission scanning electron microscopy (FESEM) results obtained on the microchannels confirmed the presence of micro-cracks, primarily responsible for MR, and chemical etching was also seen along the edges. Energy-dispersive spectroscopy (EDS) results were used to detect the elements present in the debris and specimens.
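The L4 orthogonal array mentioned above screens up to three two-level factors in four runs, and a factor's main effect is the difference between the mean responses at its two levels. A worked Python sketch on invented responses follows (the factor names and numbers are hypothetical, chosen so that voltage dominates, echoing the paper's finding):

```python
import numpy as np

# L4 orthogonal array: 4 runs, 3 two-level factors (levels coded -1/+1).
L4 = np.array([[-1, -1, -1],
               [-1, +1, +1],
               [+1, -1, +1],
               [+1, +1, -1]])
factors = ["applied voltage", "electrolyte conc.", "duty factor"]

# Hypothetical material-removal responses for the four runs (mg).
response = np.array([2.1, 2.4, 4.8, 5.1])

# Main effect of each factor: mean response at +1 minus mean at -1.
for j, name in enumerate(factors):
    hi = response[L4[:, j] == +1].mean()
    lo = response[L4[:, j] == -1].mean()
    print(f"{name:18s} main effect = {hi - lo:+.2f}")
# The largest |effect| flags the most influential parameter.
```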