Abstract: Overcurrent (OC) relays are the major protection devices in a distribution system. The operating times of the OC relays must be coordinated properly to avoid mal-operation of the backup relays. OC relay time coordination in ring-fed distribution networks is a highly constrained optimization problem that can be stated as a linear programming problem (LPP). The purpose is to find an optimum relay setting that minimizes the operating times of the relays while, at the same time, keeping the relays properly coordinated to avoid mal-operation.
This paper presents a two-phase simplex method for the optimum time coordination of OC relays. The method is based on the simplex algorithm, which is used to find the optimum solution of an LPP. The method introduces artificial variables to obtain an initial basic feasible solution (IBFS). The artificial variables are removed by the iterative first phase, which minimizes an auxiliary objective function. The second phase minimizes the original objective function and gives the optimum time coordination of the OC relays.
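As a minimal sketch of the approach described above, the following code implements a small two-phase simplex solver and applies it to a toy two-relay coordination LP. The relay times, the coordination time interval (CTI = 0.3 s), and the minimum operating time (0.1 s) are illustrative assumptions, not figures from the paper.

```python
def pivot(T, basis, r, col):
    """Normalize pivot row r on column col, eliminate col from other rows."""
    p = T[r][col]
    T[r] = [v / p for v in T[r]]
    for i in range(len(T)):
        if i != r and abs(T[i][col]) > 1e-12:
            f = T[i][col]
            T[i] = [a - f * b for a, b in zip(T[i], T[r])]
    basis[r] = col

def iterate(T, basis, m, limit):
    """Simplex iterations until reduced costs of the first `limit` columns
    are all non-negative."""
    while True:
        col = min(range(limit), key=lambda j: T[-1][j])
        if T[-1][col] >= -1e-9:
            return
        r, best = None, None
        for i in range(m):
            if T[i][col] > 1e-9:
                ratio = T[i][-1] / T[i][col]
                if best is None or ratio < best:
                    best, r = ratio, i
        if r is None:
            raise ValueError("LP is unbounded")
        pivot(T, basis, r, col)

def two_phase_simplex(c, A, b):
    """Minimize c.x subject to A x = b, x >= 0 (b must be >= 0)."""
    m, n = len(A), len(c)
    # Phase 1: one artificial variable per row; minimize their sum.
    T = [A[i] + [1.0 if j == i else 0.0 for j in range(m)] + [b[i]]
         for i in range(m)]
    basis = list(range(n, n + m))
    obj = [0.0] * n + [1.0] * m + [0.0]
    for row in T[:m]:
        obj = [o - v for o, v in zip(obj, row)]
    T.append(obj)
    iterate(T, basis, m, n)          # artificials leave, never re-enter
    if abs(T[-1][-1]) > 1e-7:
        raise ValueError("LP is infeasible")
    # Phase 2: drop artificial columns, restore the original objective.
    # (Assumes no artificial variable remains basic, true for this example.)
    T = [row[:n] + [row[-1]] for row in T[:m]]
    obj = [c[j] - sum(c[basis[i]] * T[i][j] for i in range(m))
           for j in range(n)]
    obj.append(-sum(c[basis[i]] * T[i][-1] for i in range(m)))
    T.append(obj)
    iterate(T, basis, m, n)
    x = [0.0] * n
    for i in range(m):
        x[basis[i]] = T[i][-1]
    return x

# Toy coordination LP in standard form, variables t1, t2, s1, s2, s3 >= 0:
#   t2 - t1 - s1 = 0.3   (backup lags primary by at least the CTI)
#   t1      - s2 = 0.1   (primary operating time >= 0.1 s)
#   t2      - s3 = 0.1   (backup operating time >= 0.1 s)
c = [1.0, 1.0, 0.0, 0.0, 0.0]   # minimize t1 + t2
A = [[-1.0, 1.0, -1.0, 0.0, 0.0],
     [1.0, 0.0, 0.0, -1.0, 0.0],
     [0.0, 1.0, 0.0, 0.0, -1.0]]
b = [0.3, 0.1, 0.1]
t1, t2 = two_phase_simplex(c, A, b)[:2]
```

For this instance the optimum is the expected boundary solution: the primary relay operates at the minimum time and the backup exactly one CTI later.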
Abstract: In recent years, global concern about energy security has steadily increased and is expected to become a major issue over the next few decades. Energy security refers to a resilient energy system: one capable of withstanding threats through a combination of active, direct security measures and passive or more indirect measures such as redundancy, duplication of critical equipment, diversity in fuels and other sources of energy, and reliance on less vulnerable infrastructure. Threats and disruptions (disturbances) to one part of the energy system affect the others. This paper presents a methodology, with its theoretical background, that treats the energy system as an interconnected network and models the impact of energy supply disturbances on that network. The proposed methodology uses a network flow approach to develop a mathematical model of the energy system network as a system of nodes and arcs with energy flowing from node to node along paths in the network.
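The node-and-arc view above can be sketched with a standard max-flow computation (Edmonds-Karp, BFS augmenting paths). The node names and arc capacities below are illustrative assumptions, not data from the paper.

```python
from collections import deque

def max_flow(cap, s, t):
    """cap[u][v] = arc capacity; returns the maximum s-t flow.
    cap is modified in place to hold the residual graph."""
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in cap.get(u, {}).items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        # Reconstruct the path, find its bottleneck, then augment.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(cap[u][v] for u, v in path)
        for u, v in path:
            cap[u][v] -= bottleneck
            cap.setdefault(v, {})[u] = cap.get(v, {}).get(u, 0) + bottleneck
        flow += bottleneck

# Toy energy network: a generation node, a demand node, and two
# transmission corridors with limited arc capacities.
network = {
    "src": {"a": 3, "b": 2},
    "a": {"city": 2, "b": 1},
    "b": {"city": 3},
}
f = max_flow(network, "src", "city")
```

A disturbance can then be modelled by reducing an arc's capacity and recomputing the maximum deliverable flow.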
Abstract: The major building block of most elliptic curve cryptosystems is the computation of multi-scalar multiplications. This paper proposes a novel algorithm for simultaneous multi-scalar multiplication that employs addition chains. Previously known methods use the double-and-add algorithm with binary representations. To accomplish our purpose, an efficient empirical method for finding addition chains for multi-exponents is proposed.
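For reference, the binary double-and-add baseline the abstract mentions can be sketched in its simultaneous form (Straus/Shamir's trick) for computing k1*P + k2*Q with one shared doubling chain. Plain integers stand in for elliptic-curve points here, so "add" is + and "double" is *2; a real implementation would use curve group operations instead.

```python
def simultaneous_scalar_mult(k1, k2, P, Q, add, double, identity):
    """Compute k1*P + k2*Q, scanning the bits of both scalars together."""
    PQ = add(P, Q)  # precomputed table entry for the bit pair (1, 1)
    R = identity
    for bit in range(max(k1.bit_length(), k2.bit_length()) - 1, -1, -1):
        R = double(R)                      # one shared doubling per bit
        b1, b2 = (k1 >> bit) & 1, (k2 >> bit) & 1
        if b1 and b2:
            R = add(R, PQ)
        elif b1:
            R = add(R, P)
        elif b2:
            R = add(R, Q)
    return R

# With the integer stand-in group, the result must equal 13*3 + 7*5.
result = simultaneous_scalar_mult(13, 7, 3, 5,
                                  add=lambda a, b: a + b,
                                  double=lambda a: 2 * a,
                                  identity=0)
```

The attraction of the joint scan is that the doublings are shared between the two scalars; the addition-chain approach proposed in the paper aims to reduce the remaining additions further.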
Abstract: Antimicrobial resistance is becoming a major factor in virtually all hospital-acquired infections, which may soon be untreatable, and is a serious public health problem. These concerns have led to a major research effort to discover alternative strategies for the treatment of bacterial infections. Nanobiotechnology is an emerging and fast-developing field with potential applications for human welfare. An important area of nanotechnology is the development of reliable and environmentally friendly processes for the synthesis of nanoscale particles through biological systems. The present study reports on the use of the fungal strain Aspergillus species for the extracellular synthesis of bionanoparticles from 1 mM silver nitrate (AgNO3) solution. The report focuses on the synthesis of metallic silver bionanoparticles by reduction of aqueous Ag+ ions with the culture supernatants of microorganisms. The bio-reduction of the Ag+ ions in solution was monitored in the aqueous component, and the spectrum of the solution was measured with a UV-visible spectrophotometer. The bionanoscale particles were further characterized by Atomic Force Microscopy (AFM), Fourier Transform Infrared Spectroscopy (FTIR), and thin layer chromatography. The synthesized bionanoscale particles showed a maximum absorption at 385 nm in the visible region. Atomic Force Microscopy of the silver bionanoparticles showed that they ranged in size from 250 nm to 680 nm. The work also analyzed the antimicrobial efficacy of the silver bionanoparticles against various multi-drug-resistant clinical isolates. The present study emphasizes the applicability of synthesizing metallic nanostructures and understanding the biochemical and molecular mechanism of nanoparticle formation by the cell filtrate, in order to achieve better control over the size and polydispersity of the nanoparticles. This would help to develop nanomedicines against various multi-drug-resistant human pathogens.
Abstract: Localized surface plasmon resonance (LSPR) is the
coherent oscillation of conductive electrons confined in noble
metallic nanoparticles excited by electromagnetic radiation, and
nanosphere lithography (NSL) is one of the cost-effective methods to
fabricate metal nanostructures for LSPR. NSL can be categorized into two major groups: dispersed NSL and closely packed NSL. In recent years, gold nanocrescents and gold nanoholes with vertical sidewalls fabricated by dispersed NSL, and silver nanotriangles and gold nanocaps on silica nanospheres fabricated by closely packed NSL, have been reported for LSPR biosensing. This paper introduces
several novel gold nanostructures fabricated by NSL in LSPR
applications, including 3D nanostructures obtained by evaporating
gold obliquely on dispersed nanospheres, nanoholes with slant
sidewalls, and patchy nanoparticles on closely packed nanospheres,
all of which render satisfactory sensitivity for LSPR sensing. Since
the LSPR spectrum is very sensitive to the shape of the metal
nanostructures, formulas are derived and software is developed for
calculating the profiles of the obtainable metal nanostructures by
NSL, for different nanosphere masks with different fabrication
conditions. The simulated profiles coincide well with the profiles of
the fabricated gold nanostructures observed under scanning electron
microscope (SEM) and atomic force microscope (AFM), which
proves that the software is a useful tool for the process design of
different LSPR nanostructures.
Abstract: Chua’s circuit is one of the most important electronic devices used for chaos and bifurcation studies, and it plays a central role in secure communication. Since adaptive control is widely used in the control of linear systems, here we introduce a new application of the adaptive method in the field of chaos control. In this paper, we derive a new adaptive control scheme for controlling Chua’s circuit, because the control of chaos is often very important in practical operations. The novelty of this approach lies in its robustness against external perturbations, which are simulated as additive noise in all measured states, and in the fact that it can be generalized to other chaotic systems. Our approach is based on Lyapunov analysis, and an adaptation law is considered for the feedback gain; for this reason, we have named it NAFT (Nonlinear Adaptive Feedback Technique). Finally, simulations show the capability of the presented technique for Chua’s circuit.
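A minimal sketch of the idea is the following Euler-integrated simulation of the dimensionless Chua equations with an adaptive feedback gain. The circuit parameters, the placement of the control on the x-equation only, and the adaptation law k' = gamma * x^2 are illustrative assumptions in the spirit of the abstract, not the paper's exact NAFT scheme.

```python
import math

def chua_rhs(x, y, z, alpha=15.6, beta=28.0, m0=-8.0 / 7.0, m1=-5.0 / 7.0):
    """Dimensionless Chua equations with the piecewise-linear nonlinearity."""
    fx = m1 * x + 0.5 * (m0 - m1) * (abs(x + 1.0) - abs(x - 1.0))
    return alpha * (y - x - fx), x - y + z, -beta * y

def simulate(steps=50000, dt=0.002, gamma=2.0):
    """Euler integration with adaptive feedback u = -k*x, k' = gamma*x^2."""
    x, y, z, k = 0.1, 0.0, 0.0, 0.0
    for _ in range(steps):
        dx, dy, dz = chua_rhs(x, y, z)
        u = -k * x                  # feedback applied to the x equation only
        k += gamma * x * x * dt     # gain adapts while x is away from zero
        x += (dx + u) * dt
        y += dy * dt
        z += dz * dt
    return x, y, z, k

xf, yf, zf, kf = simulate()
```

Because the adaptation law is non-negative, the gain can only grow, and it stops growing once the x state has been driven near zero.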
Abstract: Grid networks provide the ability to perform higher-throughput computing by taking advantage of many networked computers' resources to solve large-scale computation problems. As the popularity of Grid networks has increased, there is a need to efficiently distribute the load among the resources accessible on the network. In this paper, we present a stochastic network system that gives a distributed load-balancing scheme by generating almost regular networks. This network system is self-organized and depends only on local information for load distribution and resource discovery. The in-degree of each node reflects its free resources, and the job assignment and resource discovery processes required for load balancing are accomplished by fitted random sampling. Simulation results show that the generated network system provides an effective, scalable, and reliable load-balancing scheme for the distributed resources accessible on Grid networks.
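The sampling-based assignment can be sketched as follows. Here "fitted random sampling" is approximated by comparing a few randomly sampled candidate nodes and picking the one with the most free resources; the real scheme in the paper, and the node capacities used below, are assumptions for illustration.

```python
import random

def assign_jobs(free, jobs, candidates=2, rng=None):
    """Assign each job to the freest of a few randomly sampled nodes.
    `free` maps node name -> remaining resource units (mutated in place)."""
    rng = rng or random.Random(0)
    placement = []
    for _ in range(jobs):
        eligible = [n for n in free if free[n] > 0]
        if not eligible:
            raise RuntimeError("no free resources left")
        sample = rng.sample(eligible, min(candidates, len(eligible)))
        node = max(sample, key=lambda n: free[n])  # freest sampled candidate
        free[node] -= 1
        placement.append(node)
    return placement

free = {"n1": 4, "n2": 4, "n3": 4}
placement = assign_jobs(free, jobs=9)
```

Sampling a small candidate set keeps the scheme local (no global state is consulted) while still steering jobs toward nodes with more free resources.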
Abstract: Independent component analysis (ICA) in the
frequency domain is used for solving the problem of blind source
separation (BSS). However, this method has some problems. For
example, a general ICA algorithm cannot determine the permutation
of signals which is important in the frequency domain ICA. In this
paper, we propose an approach to solving the permutation problem. The idea is to effectively combine two conventional
approaches. This approach improves the signal separation
performance by exploiting features of the conventional approaches.
We show the simulation results using artificial data.
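One conventional ingredient of frequency-domain permutation solving is aligning each frequency bin's outputs to a reference bin by correlating their amplitude envelopes; a minimal sketch of that ingredient, on toy two-channel data, is below. (The paper combines two conventional approaches; only this envelope-correlation part is sketched, and the data are assumptions.)

```python
def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da * db else 0.0

def align_permutations(env):
    """env[f][c] = amplitude envelope of channel c in frequency bin f.
    Returns the bins reordered so channel identities agree with bin 0."""
    ref, aligned = env[0], [env[0]]
    for bins in env[1:]:
        keep = corr(bins[0], ref[0]) + corr(bins[1], ref[1])
        swap = corr(bins[0], ref[1]) + corr(bins[1], ref[0])
        aligned.append(bins if keep >= swap else [bins[1], bins[0]])
    return aligned

# Toy data: two sources with opposite on/off envelopes; bin 1 arrives
# with its channels permuted and should be swapped back.
s1, s2 = [1.0, 0.0, 1.0, 0.0], [0.0, 1.0, 0.0, 1.0]
aligned = align_permutations([[s1, s2], [s2, s1]])
```

Because the envelopes of a given source are correlated across frequency bins, the correlation test detects and undoes the per-bin permutation.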
Abstract: The mobile agent paradigm provides a promising technology for the development of distributed and open applications. However, one of the main obstacles to widespread adoption of the mobile agent paradigm seems to be security. This paper treats the security of the mobile agent against malicious host attacks. It describes a generic mobile agent protection architecture. The proposed approach is based on dynamic adaptability and adopts reflexivity as a model of design and implementation. In order to protect the agent against behaviour analysis attempts, the suggested approach supplies the mobile agent with a flexibility faculty that allows it to present unexpected behaviour. Furthermore, some classical protection mechanisms are used to reinforce the level of security.
Abstract: It is necessary to evaluate the condition of bridges and to strengthen them, or parts of them, when required. The need for reinforcement arises for several reasons. First, a change in the use of a bridge may produce internal forces in part of the structure that exceed the existing cross-sectional capacity. Second, bridges may need reinforcement because of damage due to external factors, which reduces the cross-sectional resistance to external loads. Another factor is misdesign of some details, affecting the safety of the bridge or parts of it. This article identifies the design demands of the Qing Shan bridge, located on the He gang - Nen Jiang Road (provincial highway 303), Wudalianchi area, Heilongjiang Province, China, an important bridge in the urban area. The investigation program included observation and evaluation of the damage in the T-section concrete beams and the prestressed concrete box girder sections, in addition to an evaluation of the whole state of the bridge, including the piers, abutments, bridge decks, wings, bearings, capping beams, and joints. The test results show that the general structural condition of the bridge is good. In span No. 10, cracks were observed in the T beam, extending upward along the ribbed T beam and continuing into the T beam flange. Crack widths vary between 0.1 mm and 0.4 mm, with a maximum of about 0.4 mm. The flexural bending strength of the bridge needs to be improved, especially for the T beam section.
Abstract: IP multicasting is a key technology for many existing and emerging applications on the Internet. Furthermore, with the increasing popularity of wireless devices and mobile equipment, it is necessary to determine the best way to provide this service in a wireless environment. IETF Mobile IP, which provides mobility for hosts in IP networks, proposes two approaches for mobile multicasting, namely remote subscription (MIP-RS) and bi-directional tunneling (MIP-BT). In MIP-RS, a mobile host re-subscribes to the multicast groups each time it moves to a new foreign network; MIP-RS therefore suffers from serious packet losses while mobile host handoff occurs. In MIP-BT, mobile hosts send and receive multicast packets by way of their home agents (HAs), using Mobile IP tunnels; it therefore suffers from inefficient routing and wastage of system resources. In this paper, we propose a protocol called Mobile Multicast support using the Old Foreign Agent (MMOFA) for mobile hosts. MMOFA is derived from MIP-RS and, with the assistance of the mobile host's old foreign agent, routes the datagrams missed due to handoff to the adjacent network via tunneling. We also studied the performance of the proposed protocol by simulation under ns-2.27. The results demonstrate that MMOFA has optimal routing efficiency and low delivery cost compared to other approaches.
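The handoff behaviour can be contrasted with a toy event sequence: under plain remote subscription the packets sent during the handoff window are lost, while under the MMOFA idea the old foreign agent keeps copies and tunnels them to the host after the handoff completes. Packet numbers and handoff timing below are illustrative assumptions, not a simulation of the protocol itself.

```python
def deliver(packets, handoff_window, use_mmofa):
    """Return the sequence numbers the mobile host ends up receiving."""
    received, buffered = [], []
    for seq in packets:
        if seq in handoff_window:       # host is between foreign agents
            if use_mmofa:
                buffered.append(seq)    # old FA keeps a copy of the datagram
        else:
            if buffered:                # handoff done: old FA tunnels copies
                received.extend(buffered)
                buffered = []
            received.append(seq)
    return received

packets = list(range(1, 11))
handoff = {5, 6}                        # datagrams sent mid-handoff
plain = deliver(packets, handoff, use_mmofa=False)
mmofa = deliver(packets, handoff, use_mmofa=True)
```

The plain scheme drops the mid-handoff datagrams; the buffered-and-tunneled variant delivers the full sequence.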
Abstract: The last decade has shown that the object-oriented concept by itself is not powerful enough to cope with the rapidly changing requirements of ongoing applications. Component-based
systems achieve flexibility by clearly separating the stable parts of
systems (i.e. the components) from the specification of their
composition. In order to realize the reuse of components effectively
in CBSD, it is required to measure the reusability of components.
However, due to the black-box nature of components, where the source code is not available, it is difficult to use conventional metrics in component-based development, as these metrics require analysis of source code. In this paper, we survey a few existing component-based reusability metrics. These metrics give a broader view of components' understandability, adaptability, and portability. The paper also describes an analysis, in terms of quality factors related to reusability, contained in an approach that aids significantly in assessing existing components for reusability.
Abstract: In this paper, to optimize the “Characteristic Straight Line Method”, which is used in soil displacement analysis, a “best estimate” of the geodetic leveling observations has been achieved by taking into account the concept of height systems. This concept, and consequently the concept of “height”, is discussed in detail. In landslide dynamic analysis, the soil is considered as a mosaic of rigid blocks. The soil displacement has been monitored and analyzed using the “Characteristic Straight Line Method”, whose characteristic components are constructed from a “best estimate” of the topometric observations. In the measurement of elevation differences, we have used the most modern leveling equipment available. Observational procedures have also been designed to provide the most effective method of acquiring data. In addition, systematic errors which cannot be sufficiently controlled by instrumentation or observational techniques are minimized by applying appropriate corrections to the observed data: the level collimation correction minimizes the error caused by non-horizontality of the leveling instrument's line of sight for unequal sight lengths; the refraction correction is modeled to minimize the refraction error caused by temperature (density) variation of air strata; the rod temperature correction accounts for variation in the length of the leveling rod's Invar/LO-VAR® strip which results from temperature changes; the rod scale correction ensures a uniform scale which conforms to the international length standard; and the concept of height systems is introduced, in which all types of height (orthometric, dynamic, normal, gravity correction, and equipotential surface) are investigated. The “Characteristic Straight Line Method” is slightly more convenient than the “Characteristic Circle Method”. It permits the evaluation of a displacement of very small magnitude, even when the displacement is an infinitesimal quantity.
The inclination of the landslide is given by the inverse of the distance from reference point O to the “Characteristic Straight Line”. Its direction is given by the bearing of the normal directed from point O to the Characteristic Straight Line (Fig. 6). A “best estimate” of the topometric observations was used to measure the elevation of carefully selected points, before and after the deformation. Gross errors have been eliminated by statistical analyses and by comparing the heights within local neighborhoods. The results of a test using an area where very interesting land surface deformation occurs are reported. Monitoring with different options and a qualitative comparison of results based on a sufficient number of check points are presented.
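Two of the systematic corrections named above can be sketched in simplified textbook form: the collimation correction removes the bias from unequal backsight/foresight distances, and the rod temperature correction accounts for thermal expansion of the Invar strip. The numeric values and the exact correction formulas used by the authors are assumptions for illustration.

```python
def corrected_height_difference(dh_obs, collim_err, d_back, d_fore,
                                alpha, t_mean, t_ref):
    """Apply simplified collimation and rod-temperature corrections to an
    observed height difference dh_obs (metres)."""
    # Collimation: a line-of-sight tilt of collim_err (rad) biases each
    # reading by collim_err * sight distance, so unequal sight lengths
    # leave a net error proportional to (d_back - d_fore).
    c_collim = -collim_err * (d_back - d_fore)
    # Rod temperature: the Invar strip expands by alpha per degree C,
    # scaling the measured height difference accordingly.
    c_temp = alpha * (t_mean - t_ref) * dh_obs
    return dh_obs + c_collim + c_temp

# Illustrative observation: 2.0 m difference, slightly unequal sights,
# rod 10 degrees C above its calibration temperature.
dh = corrected_height_difference(dh_obs=2.0, collim_err=2.0e-5,
                                 d_back=50.0, d_fore=45.0,
                                 alpha=1.0e-6, t_mean=30.0, t_ref=20.0)
```

Both corrections are tiny for a single setup, but they accumulate over a long leveling line, which is why they matter when a "best estimate" of the observations is sought.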
Abstract: The purpose of this paper is to perform a multidisciplinary design and analysis (MDA) of honeycomb panels used in satellite structural design. All the analyses are based on clamped-free boundary conditions. In the present work, detailed finite element models for honeycomb panels are developed and analysed. Experimental tests were carried out on a honeycomb specimen, with the goal of comparing them with the earlier modal analysis made by the finite element method as well as with the existing equivalent approaches. The obtained results show good agreement between the finite element analysis, the equivalent approaches, and the test results; the difference in the first two frequencies is less than 4%, and less than 10% for the third frequency. The results of the equivalent model presented in this analysis are obtained with good accuracy. Moreover, the investigations carried out in this research concern the honeycomb plate modal analysis under several aspects, including structural geometrical variation, by studying the influence of the dimension parameters on the modal frequency and the variation of the core and skin materials of the honeycomb. The various results obtained in this paper are promising and show that the geometry parameters and the type of material have an effect on the value of the honeycomb plate modal frequency.
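Equivalent approaches of the kind compared above typically rest on the standard sandwich-theory estimate of the panel's bending stiffness, in which the thin face sheets carry bending and the core carries shear. The sketch below uses that textbook thin-face approximation; the material values are illustrative assumptions, not the paper's data.

```python
def sandwich_bending_stiffness(E_face, t_face, t_core, poisson):
    """Equivalent bending stiffness D (N*m per unit width) of a sandwich
    panel with two identical thin face sheets, thin-face approximation."""
    d = t_core + t_face  # distance between the face-sheet centroids
    return E_face * t_face * d ** 2 / (2.0 * (1.0 - poisson ** 2))

# Assumed values: aluminium faces (0.5 mm) on a 10 mm honeycomb core.
D = sandwich_bending_stiffness(E_face=70e9, t_face=0.5e-3,
                               t_core=10e-3, poisson=0.33)
```

The strong dependence on the face-sheet separation d (squared) is what makes the geometry parameters so influential on the modal frequencies.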
Abstract: A filter is used to remove undesirable frequency information from a dynamic signal. This paper shows that the Z-notch filtering technique can be applied to remove the noise nuisance from a machining signal. In machining, the noise components were identified from the sound produced by the operation of the machine components themselves, such as the hydraulic system, the motor, and the machine environment. By correlating the noise components with the measured machining signal, the components of interest in the measured machining signal, which were less interfered with by the noise, can be extracted. Thus, the filtered signal is more reliable to analyse in terms of noise content than the unfiltered signal. Significantly, the I-kaz method, which comprises a three-dimensional graphical representation and the I-kaz coefficient Z∞, could differentiate between the filtered and the unfiltered signal. A larger scattering space and a higher value of Z∞ indicated that the signal was highly interrupted by noise. This method can be utilised as a proactive tool for evaluating the noise content in a signal. The evaluation of noise content is as important as its elimination, especially for machining operation fault diagnosis. The Z-notch filtering technique was reliable in extracting noise components from the measured machining signal with high efficiency. Even though the measured signal was exposed to high noise disruption, the signal generated by the interaction between the cutting tool and the workpiece could still be acquired. Therefore, the interruption of noise, which could change the original signal features and consequently degrade the useful sensory information, can be eliminated.
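A basic second-order IIR notch filter illustrates the general idea of removing a known noise frequency from a machining signal while leaving the cutting signal intact. The "Z-notch" technique of the paper is taken here only in this generic notch-filtering sense; the sample rate and the frequencies are assumptions.

```python
import math

def notch_filter(x, f0, fs, r=0.95):
    """Biquad notch: zeros on the unit circle at f0, poles at radius r."""
    w0 = 2.0 * math.pi * f0 / fs
    b = [1.0, -2.0 * math.cos(w0), 1.0]
    a = [1.0, -2.0 * r * math.cos(w0), r * r]
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for xn in x:
        yn = b[0] * xn + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        y.append(yn)
        x2, x1 = x1, xn
        y2, y1 = y1, yn
    return y

def rms(sig):
    return (sum(v * v for v in sig) / len(sig)) ** 0.5

fs, f0, n = 1000.0, 50.0, 2000
noise_tone = [math.sin(2 * math.pi * f0 * i / fs) for i in range(n)]
cut_tone = [math.sin(2 * math.pi * 200.0 * i / fs) for i in range(n)]
# The 50 Hz "machine noise" is suppressed; a 200 Hz component passes.
atten = rms(notch_filter(noise_tone, f0, fs)[n // 2:]) / rms(noise_tone[n // 2:])
passed = rms(notch_filter(cut_tone, f0, fs)[n // 2:]) / rms(cut_tone[n // 2:])
```

The pole radius r sets the notch width: closer to 1 gives a narrower notch, which removes less of the neighbouring machining signal but settles more slowly.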
Abstract: A human verification system is presented in this paper. The system consists of several steps: background subtraction, thresholding, line connection, region growing, morphology, star skeletonization, feature extraction, feature matching, and decision making. The proposed system combines the advantages of star skeletonization and simple statistical features. Correlation matching and probability voting are used for verification, followed by a logical operation in the decision-making stage. The proposed system uses a small number of features, and its reliability is convincing.
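The first two steps of such a pipeline (background subtraction, then thresholding) can be sketched on a toy grayscale frame represented as a nested list; the pixel values and threshold are illustrative assumptions.

```python
def subtract_and_threshold(frame, background, threshold):
    """Return a binary foreground mask: 1 where |frame - background|
    exceeds the threshold, else 0."""
    return [[1 if abs(f - b) > threshold else 0
             for f, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

background = [[10, 10, 10],
              [10, 10, 10],
              [10, 10, 10]]
frame = [[10, 10, 10],
         [10, 200, 210],    # a bright moving object enters the scene
         [10, 205, 10]]
mask = subtract_and_threshold(frame, background, threshold=30)
```

The resulting binary mask is what the later stages (region growing, morphology, star skeletonization) would operate on.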
Abstract: Nigeria is considered one of the many countries in sub-Saharan Africa with a weak economy and gross deficiencies in technology and engineering. Available data from international monitoring and regulatory organizations show that technology is pivotal in determining the economic strength of nations all over the world. Education is critical to technology acquisition, development, dissemination, and adaptation. Thus, this paper seeks to critically assess and discuss the issues and challenges facing technological advancement in Nigeria, particularly in the education sector, and also proffers solutions to resuscitate the Nigerian education system towards achieving national technological and economic sustainability, such that Nigeria can compete favourably with other technologically driven economies of the world in the not-too-distant future.
Abstract: In this paper, we argue that Design research is basic to countries' national productivity and competition agendas, while at the same time the vagaries of research training present one of the barriers faced by Design Higher Degree by Research students in engaging those agendas. We argue that, given industry requirements for research-trained recruits, students have the right to expect that research training will provide the foundations of a successful career on an academic, research, or professional pathway, but that universities have yet to address problems in their provision of research training for Design doctoral students. We suggest that, to facilitate this, rigorous research on the provision of doctoral programs in Design would serve to inform future activities in Design research in productive ways.
Abstract: The paper is concerned with the relationships between SSME and ICTs and focuses on the role of Web 2.0 tools in the service development process. The research presented aims at exploring how collaborative technologies can support and improve service processes, highlighting customer centrality and value co-production. The core idea of the paper is the centrality of user participation, with collaborative technologies as enabling factors; Wikipedia is analyzed as an example. The result of this analysis is the identification and description of a pattern characterising specific services in which users, by means of web tools, collaborate as value co-producers during the service process. This pattern of collaborative co-production, concerning several categories of services including knowledge-based services, is then discussed.
Abstract: A fully on-chip low drop-out (LDO) voltage regulator with a 100 pF output load capacitor is presented. A novel frequency compensation scheme using a current buffer is adopted to realize a single dominant pole within the unity-gain frequency of the regulation loop; the phase margin (PM) is at least 50 degrees over the full range of the load current, and the power supply rejection (PSR) characteristic is improved compared with conventional Miller compensation. In addition, a differentiator provides a high-speed path during load current transients. Implemented in 0.18 μm CMOS technology, the LDO voltage regulator provides 100 mA of load current with a stable 1.8 V output voltage while consuming 80 μA of quiescent current.