Abstract: This paper presents two simplified models to
determine nodal voltages in power distribution networks. These
models make it possible to estimate the impact of installing reactive
power compensation equipment such as fixed or switched capacitor
banks. The procedure used to develop the models is similar to the
procedure used to develop linear power flow models of transmission
lines, which have been widely used in optimization problems of
operation planning and system expansion. The steady state non-linear
load flow equations are approximated by linear equations relating the
voltage amplitudes and currents. The approximations in the linear
equations rely on the high ratio of line resistance to line reactance
(R/X), which is characteristic of power distribution networks. The
performance and accuracy of the models are evaluated
through comparisons with the exact results obtained from the
solution of the load flow using two test networks: a hypothetical
network with 23 nodes and a real network with 217 nodes.
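The linearized voltage estimate at the heart of such models can be illustrated with a short sketch. This is not the paper's exact formulation: the feeder layout, branch data and the drop expression V_j ≈ V_i − (R·P + X·Q)/V_nom are illustrative assumptions in the spirit of linearized distribution power flow.

```python
# Minimal sketch of a linearized radial power-flow estimate (hypothetical data).
# For each branch i -> j, the voltage drop is approximated as
#   V_j ≈ V_i - (R*P + X*Q) / V_nom
# where P, Q are the active/reactive power flowing through the branch.

V_NOM = 1.0  # per-unit reference voltage

# Hypothetical radial feeder: (from, to, R_pu, X_pu, branch P_pu, branch Q_pu)
branches = [
    ("slack", "n1", 0.02,  0.01,  0.6, 0.3),
    ("n1",    "n2", 0.03,  0.015, 0.4, 0.2),
    ("n2",    "n3", 0.025, 0.012, 0.2, 0.1),
]

def nodal_voltages(branches, v_slack=1.0):
    """Walk the feeder from the slack bus, accumulating linearized drops."""
    v = {"slack": v_slack}
    for frm, to, r, x, p, q in branches:
        v[to] = v[frm] - (r * p + x * q) / V_NOM
    return v

volts = nodal_voltages(branches)
# Adding reactive compensation (e.g. a capacitor bank) at n3 reduces Q on
# every upstream branch, which raises all downstream voltage estimates.
```

The attraction of this form for planning studies is that the voltage at every node is linear in the branch powers, so the effect of a capacitor bank reduces to a linear correction term.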
Abstract: In some real applications of Statistical Process Control
it is necessary to design a control chart that does not signal small
process shifts while keeping good performance in detecting moderate
and large shifts in quality. In this work we develop a new quality
control chart, the synthetic T2 control chart, which can be designed
to meet this objective. A multi-objective optimization is carried out
employing
Genetic Algorithms, finding the Pareto-optimal front of
non-dominated solutions for this optimization problem.
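The notion of a Pareto-optimal front of non-dominated solutions can be sketched as follows. The two objectives and their numeric values are hypothetical stand-ins for the chart's trade-off between ignoring small shifts and catching large ones, not the paper's actual objective functions.

```python
# A minimal sketch of extracting the Pareto-optimal front of non-dominated
# chart designs. The two objectives here are hypothetical stand-ins:
#   f1 = 1 / ARL_small  (minimize: the chart should NOT react to small shifts)
#   f2 = ARL_large      (minimize: the chart SHOULD react fast to large shifts)

def dominates(a, b):
    """True if design a is at least as good as b in every objective
    and strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Return the designs not dominated by any other design."""
    return [d for d in designs
            if not any(dominates(o, d) for o in designs if o is not d)]

# Hypothetical (f1, f2) evaluations of candidate chart parameter sets:
designs = [(0.10, 5.0), (0.05, 8.0), (0.12, 6.0), (0.02, 12.0), (0.06, 7.0)]
front = pareto_front(designs)
```

A Genetic Algorithm such as NSGA-II repeats this non-dominated filtering generation after generation, which is what yields the front of trade-off designs the abstract refers to.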
Abstract: This research was conducted for the first time at the
southeastern coasts of the Caspian Sea in order to evaluate the
performance of osteichthyes cooperatives through production (catch)
function. Using one of the indirect valuation methods in this research,
contributory factors in catch were identified and were inserted into
the function as independent variables. To this end, the performance
of 25 osteichthyes fishing cooperatives operating in the Miankale
wildlife refuge region during the 2009 utilization year was examined.
The contributory factors in catch were divided into groups of
economic, ecological and biological factors. In this function, the
catch rate of each cooperative was entered as the dependent variable,
with fourteen partial variables, grouped into nine general variables,
as the independent variables. Finally, after estimating the function,
seven variables were found significant at the 99 percent confidence
level. The results of the function estimation
indicated that human resource (fisherman quantity) had the greatest
positive effect on catch rate with an influence coefficient of 1.7 while
weather conditions had the greatest negative effect on the catch rate
of cooperatives with an influence coefficient of -2.07. Moreover,
factors such as members' share, experience and fisherman training,
and fishing effort played major roles in the cooperatives' catch rate,
with influence coefficients of 0.81, 0.5 and 0.21, respectively.
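Influence coefficients of this kind are typically estimated as elasticities in a log-linear (Cobb-Douglas style) production function. The sketch below shows the mechanics on synthetic data; the variable names, coefficient values and noise-free data generation are assumptions for illustration, not the study's data.

```python
# Sketch of estimating influence coefficients (elasticities) with a log-linear
# catch function, as is common for production functions:
#   ln(catch) = b0 + b1*ln(fishermen) + b2*ln(effort)
# The data below are synthetic; the "true" coefficients are chosen up front.
import math
import random

random.seed(0)
B0, B1, B2 = 0.5, 1.7, 0.21  # coefficients used to generate the data

rows = []
for _ in range(200):
    fishermen = random.uniform(5, 50)
    effort = random.uniform(10, 100)
    ln_catch = B0 + B1 * math.log(fishermen) + B2 * math.log(effort)
    rows.append((1.0, math.log(fishermen), math.log(effort), ln_catch))

def ols(rows):
    """Ordinary least squares via the normal equations (3x3 system,
    solved with Gaussian elimination to stay dependency-free)."""
    k = 3
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * r[3] for r in rows) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c]
                                for c in range(r + 1, k))) / xtx[r][r]
    return beta

b0, b1, b2 = ols(rows)
```

Because the synthetic data is noise-free, the regression recovers the generating coefficients; on real catch data the estimates would carry standard errors, which is where the 99 percent significance test in the abstract enters.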
Abstract: A 10-bit, 40 MS/s sample-and-hold circuit, implemented in 0.18-μm CMOS technology with a 3.3 V supply, is presented for application in the front-end stage of an analog-to-digital converter. Topology selection, biasing, compensation and common-mode feedback are discussed. A cascode technique has been used to increase the dc gain. The proposed opamp provides 149 MHz unity-gain bandwidth (wu), 80 degrees of phase margin and a differential peak-to-peak output swing of more than 2.5 V. The circuit achieves 55 dB Total Harmonic Distortion (THD), using an improved fully differential two-stage operational amplifier with 91.7 dB gain. The power dissipation of the designed sample-and-hold is 4.7 mW. The designed system demonstrates a relatively suitable response across different process, temperature and supply corners (PVT corners).
Abstract: In-core memory requirement is a bottleneck in solving
large three dimensional Navier-Stokes finite element problem
formulations using sparse direct solvers. An out-of-core solution
strategy is a viable alternative for reducing the in-core memory
requirements while solving large scale problems. This study
evaluates the performance of various out-of-core sequential solvers
based on multifrontal or supernodal techniques in the context of
finite element formulations for three dimensional problems on a
Windows platform. Here three different solvers, HSL_MA78,
MUMPS and PARDISO are compared. The performance of these
solvers is evaluated on a 64-bit machine with 16GB RAM for finite
element formulation of flow through a rectangular channel. It is
observed that relatively large problems can be solved using the
out-of-core PARDISO solver. The implementation of the Newton and
modified Newton iterations is also discussed.
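The Newton and modified Newton iterations mentioned above differ in whether the Jacobian is rebuilt at every step. A scalar sketch follows; the function f and the starting point are illustrative, and in the paper's setting the derivative is a large Jacobian whose factorization is the expensive out-of-core operation.

```python
# Sketch of Newton vs. modified Newton on a scalar example f(x) = x**3 - 2.
# The idea is identical for nonlinear systems, with the Jacobian in place of f'.

def f(x):
    return x**3 - 2.0

def fprime(x):
    return 3.0 * x**2

def newton(x0, tol=1e-12, max_iter=100):
    """Full Newton: the derivative (Jacobian) is re-evaluated every step."""
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / fprime(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

def modified_newton(x0, tol=1e-12, max_iter=1000):
    """Modified Newton: the derivative is evaluated (factorized) once at x0
    and reused, trading slower convergence for cheaper iterations -- the
    trade-off that matters when each Jacobian factorization is an expensive
    out-of-core solve."""
    x = x0
    d0 = fprime(x0)
    for _ in range(max_iter):
        x_new = x - f(x) / d0
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

root = 2.0 ** (1.0 / 3.0)
```

Both variants reach the same root here; the modified scheme simply needs more (but cheaper) iterations, which is why it can win when each factorization spills to disk.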
Abstract: This article, written within the project “European
Ecological Network Natura 2000 – opportunities and threats”, examines
the legal problems that arise because Natura 2000 sites constitute a
distinct form of environmental protection. Most
controversially, certain sites will be subject to two regimes of
protection: as national parks and as Natura 2000 sites. This dualism
of the legal regulation makes it difficult to perform certain legal
obligations related to the regimes envisaged under each form of
environmental protection. Which regime and which obligations
resulting from the particular form of environmental protection have
priority and should prevail? What should be done if these obligations
are contradictory? Furthermore, an institutional problem consists in
that no public administration authority has the power to resolve legal
conflicts concerning the application of a particular regime on a given
site. There are also no criteria for deciding the priority of one
form of environmental protection over the other. Which
regulations are more important, those that pertain to national parks or
to Natura 2000 sites? In the light of the current regulations, it is
impossible to give a decisive answer to these questions. The internal
hierarchy of forms of environmental protection has not been
determined, and all such forms should be treated equally.
Abstract: Nowadays, engineering ceramics have significant
applications in different industries, such as the automotive,
aerospace, electrical, electronics and even military industries, due
to their
attractive physical and mechanical properties like very high hardness
and strength at elevated temperatures, chemical stability, low friction
and high wear resistance. However, these attractive properties,
together with low thermal conductivity, make their machining
processes very hard, costly and time-consuming. Many attempts have
been made in order
to make the grinding process of engineering ceramics easier and
many scientists have tried to find proper techniques to economize
ceramics' machining processes. This paper proposes a new diamond
plunge grinding technique using ultrasonic vibration for grinding
Alumina ceramic (Al2O3). For this purpose, a set of laboratory
equipment has been designed, simulated using the Finite Element
Method (FEM), and constructed for use in various
measurements. The results obtained have been compared with the
conventional plunge grinding process without ultrasonic vibration
and indicated that the surface roughness and fracture strength
improved and the grinding forces decreased.
Abstract: Developing countries are facing a problem of slums, and there appears to be no foolproof solution to eradicate them. Of the three approaches to slum development for improving the quality of life, the in-situ upgradation approach is found to be the best, while the relocation approach has proved to be a failure. The basic aim of this paper is to assess the factors responsible for the failure of relocation projects. These factors are loss of livelihood, lack of security of tenure and inefficiency of the Government; they are traced and mapped from examples of Western and Indian cities. The National Habitat and Resettlement Policy emphasized the relationship between shelter and workplace. The SRA has identified 55 slums for relocation due to reservation of land uses, security of tenure and the non-notified status of slums. Policy guidelines are suggested for successful relocation projects. Keywords: Livelihood, Relocation, Slums, Urban poor.
Abstract: The necessity of accurate and timely field data is
shared among organizations engaged in fundamentally different
activities, public services or commercial operations. Basically, there
are three major components in the process of the qualitative research:
data collection, interpretation and organization of data, and analytic
process. Significant technological advancements have been made in
mobile devices (mobile phones, PDAs, tablets, laptops, etc.),
resources that can potentially be applied to field data collection in
order to improve this process.
This paper presents and discusses the main features of a mobile
phone based solution for field data collection, composed of basically
three modules: a survey editor, a server web application and a client
mobile application. The data gathering process begins with the
survey creation module, which enables the production of tailored
questionnaires. The field workforce receives the questionnaire(s) on
their mobile phones, collects the interview responses and sends them
back to a server for immediate analysis.
Abstract: In this paper, we propose a Perceptually Optimized Foveation-based Embedded ZeroTree Image Coder (POEFIC) that applies perceptual weighting to wavelet coefficients prior to SPIHT encoding, in order to reach a targeted bit rate with improved perceptual quality around a given fixation point, which determines the region of interest (ROI). The paper also introduces a new objective quality metric based on a psychovisual model that integrates the properties of the human visual system (HVS); this metric plays an important role in our POEFIC quality assessment. Our POEFIC coder is based on a vision model that incorporates various masking effects of HVS perception. Thus, our coder weights the wavelet coefficients based on that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) foveation masking, to remove or reduce considerable high frequencies from peripheral regions, 2) luminance and contrast masking, and 3) the contrast sensitivity function (CSF), to achieve the perceptual decomposition weighting. The new perceptually optimized codec has the same complexity as the original SPIHT technique. However, the experimental results show that our coder demonstrates very good performance in terms of quality measurement.
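The foveation-masking idea of attenuating coefficients with distance from the fixation point can be sketched simply. The Gaussian falloff, the sigma value and the pixel-domain weighting below are illustrative assumptions, not the paper's CSF-based subband model.

```python
# A minimal sketch of foveation weighting: samples far from the fixation
# point get smaller perceptual weights, so the encoder spends its bits on
# the region of interest. The Gaussian falloff and sigma are assumptions.
import math

def foveation_weights(width, height, fx, fy, sigma):
    """Weight in (0, 1] for each pixel, decaying with distance from (fx, fy)."""
    w = []
    for y in range(height):
        row = []
        for x in range(width):
            ecc = math.hypot(x - fx, y - fy)  # eccentricity in pixels
            row.append(math.exp(-(ecc ** 2) / (2 * sigma ** 2)))
        w.append(row)
    return w

w = foveation_weights(16, 16, fx=8, fy=8, sigma=4.0)
# The fixation point keeps full weight; the corners are strongly attenuated,
# so their wavelet coefficients are encoded later / more coarsely by SPIHT.
```

Scaling wavelet coefficients by such a map before zerotree coding is what biases the embedded bitstream toward the ROI at any truncation point.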
Abstract: The increased number of automobiles in recent years
has resulted in great demand for fossil fuel. This has led to the
development of automobiles running on alternative fuels, including
gaseous fuels, biofuels and vegetable oils. Energy from biomass, and
more specifically bio-diesel, is one of the opportunities that could
offset a future shortage of fossil fuels. Biomass in the form of
cashew nut shells represents a new and abundant energy source in
India. Bio-fuel derived from cashew nut shell oil, and its blends
with diesel, are promising alternative fuels for diesel engines. In
this work, pyrolysed Cashew Nut Shell Liquid (CNSL)-Diesel Blends
(CDB) were used to run a Direct Injection (DI) diesel engine. The
experiments were conducted
with various blends of CNSL and Diesel namely B20, B40, B60, B80
and B100. The results are compared with neat diesel operation. The
brake thermal efficiency decreased for all blends of CNSL and
diesel except the lower blend B20, whose brake thermal efficiency
is nearly that of diesel fuel. The emission levels of all CNSL-diesel
blends also increased compared to neat diesel. The higher viscosity
and lower volatility of CNSL lead to
poor mixture formation and hence lower brake thermal efficiency and
higher emission levels. The higher emission level can be reduced by
adding suitable additives and oxygenates with CNSL and Diesel
blends.
Abstract: In this paper, a tooth shape optimization method for
cogging torque reduction in Permanent Magnet (PM) motors is
developed by using the Reduced Basis Technique (RBT) coupled with
Finite Element Analysis (FEA) and Design of Experiments (DOE)
methods. The primary objective of the method is to reduce the
enormous number of design variables required to define the tooth
shape. RBT is a weighted combination of several basis shapes. The
aim of the method is to find the best combination using the weights
for each tooth shape as the design variables. A multi-level design
process is developed to find suitable basis shapes or trial shapes at
each level that can be used in the reduced basis technique. Each level
is treated as a separate optimization problem until the required
objective – minimum cogging torque – is achieved. The process is
started with geometrically simple basis shapes that are defined by
their shape co-ordinates. The experimental design of Taguchi method
is used to build the approximation model and to perform
optimization. This method is demonstrated on the tooth shape
optimization of an 8-pole/12-slot PM motor.
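The core of the reduced basis technique, representing a candidate shape as a weighted combination of a few basis shapes so that only the weights are design variables, can be sketched as follows. The sample profiles are hypothetical, not real tooth geometries.

```python
# Sketch of the reduced-basis idea: instead of optimizing every tooth-shape
# coordinate, the candidate shape is a weighted combination of basis shapes,
# and only the weights are design variables.

def combine(basis_shapes, weights):
    """Weighted combination of basis shapes (all sampled at the same points)."""
    n = len(basis_shapes[0])
    return [sum(w * shape[i] for w, shape in zip(weights, basis_shapes))
            for i in range(n)]

# Two hypothetical basis profiles (e.g. tooth-tip height at 5 sample points):
flat   = [1.0, 1.0, 1.0, 1.0, 1.0]
curved = [0.0, 0.5, 1.0, 0.5, 0.0]

# A design is now just (w1, w2): 2 variables instead of 5 shape coordinates.
shape = combine([flat, curved], [0.6, 0.4])
```

This is the dimensionality reduction the abstract describes: the DOE/Taguchi optimization then searches over the small weight vector, with each candidate evaluated by FEA for cogging torque.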
Abstract: Due to the ever growing amount of publications about
protein-protein interactions, information extraction from text is
increasingly recognized as one of the crucial technologies in
bioinformatics. This paper presents a Protein Interaction Extraction
system from biomedical abstracts using a Link Grammar parser
(PIELG). PIELG uses the linkages produced by the Link Grammar Parser
to perform a case-based analysis of the contents of various syntactic
roles as well as their linguistically significant and meaningful
combinations. The system uses phrasal-prepositional verb patterns to
overcome problems with preposition combinations. The recall and
precision are 74.4% and 62.65%, respectively. Experimental
comparisons with two other state-of-the-art extraction systems
indicate that PIELG
achieves better performance. For further evaluation, the system is
augmented with a graphical package (Cytoscape) for extracting
protein interaction information from sequence databases. The result
shows that the performance is remarkably promising.
Abstract: We propose that Virtual Learning Environments (VLEs) should be designed by taking into account the characteristics, the special needs and the specific operating rules of the academic institutions in which they are employed. In this context, we describe a VLE module that extends the support of the organization and delivery of course material by including administration activities related to the various stages of teaching. These include the co-ordination, collaboration and monitoring of the course material development process and institution-specific course material delivery modes. Our specialized module, which enhances VLE capabilities by Helping Educators and Learners through a Laboratory Assistance System, is intended to assist the Greek tertiary technological sector, which includes Technological Educational Institutes (T.E.I.).
Abstract: This paper presents a mathematical model and a
methodology to analyze the losses in transmission expansion
planning (TEP) under uncertainty in demand. The methodology is
based on discrete particle swarm optimization (DPSO). DPSO is a
useful and powerful stochastic evolutionary algorithm for solving
large-scale, discrete and nonlinear optimization problems such as
TEP.
The effectiveness of the proposed idea is tested on an actual
transmission network of the Azerbaijan regional electric company,
Iran. The simulation results show that considering losses, even in
transmission expansion planning for a network with low load growth,
decreases operational costs considerably and enables the network to
deliver electric power to load centers more reliably.
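A single discrete-PSO update of the kind used for TEP can be sketched as follows. The bit encoding (one bit per candidate transmission line), the inertia/acceleration constants and the sigmoid bit-flip rule are the standard binary-PSO choices, assumed here for illustration rather than taken from the paper.

```python
# A minimal sketch of one binary/discrete PSO step, where each bit decides
# whether a candidate transmission line is built. The real TEP objective
# (investment + operational cost, losses, reliability) is not modeled here.
import math
import random

random.seed(1)

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def step(position, velocity, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One discrete-PSO update: continuous velocity, probabilistic bit flip."""
    new_v, new_x = [], []
    for x, v, pb, gb in zip(position, velocity, pbest, gbest):
        v = (w * v
             + c1 * random.random() * (pb - x)
             + c2 * random.random() * (gb - x))
        new_v.append(v)
        # The sigmoid of the velocity is the probability of the bit being 1.
        new_x.append(1 if random.random() < sigmoid(v) else 0)
    return new_x, new_v

pos = [0, 1, 0, 1, 0]        # current candidate plan (5 candidate lines)
vel = [0.0] * 5
pbest = [1, 1, 0, 0, 1]      # best plan this particle has seen
gbest = [1, 0, 0, 0, 1]      # best plan the swarm has seen
pos, vel = step(pos, vel, pbest, gbest)
```

In a full TEP run, each particle's plan is evaluated with a power-flow/cost model, pbest and gbest are updated, and the step repeats until the swarm converges on a low-cost expansion plan.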
Abstract: Supply chain networks are frequently hit by
unplanned events which lead to disruptions and cause operational and
financial consequences. It is neither possible to avoid disruption risk
entirely, nor are network members able to prepare for every possible
disruptive event. Therefore, continuity planning should be
established to support effective operational responses in supply
chain networks in times of emergency. In this research, network-related
degrees of freedom which determine the options for responsive
actions are derived from interview data. The findings are further
embedded into a common risk management process. The paper
provides support for researchers and practitioners to identify the
network related options for responsive actions and to determine the
need for improving the reaction capabilities.
Abstract: Finite impulse response (FIR) filters have the advantages of linear phase, guaranteed stability, fewer finite-precision errors, and efficient implementation. In contrast, they have the major disadvantage of requiring a higher order (more coefficients) than their IIR counterparts with comparable performance. The higher order demands more hardware, arithmetic operations, area and power consumption when designing and fabricating the filter. Therefore, minimizing or reducing these parameters is a major goal in the digital filter design task. This paper proposes an algorithm for modifying the values and the number of non-zero coefficients used to represent the FIR digital pulse-shaping filter response. With this algorithm, the FIR filter frequency and phase response can be represented with a minimum number of non-zero coefficients, thereby reducing the arithmetic complexity needed to compute the filter output. Consequently, the system characteristics, i.e. power consumption, area usage, and processing time, are also reduced. The proposed algorithm is more powerful when integrated with multiplierless algorithms such as distributed arithmetic (DA) in designing high-order digital FIR filters. Here, the use of DA eliminates the need for multipliers when implementing the multiply-and-accumulate (MAC) unit, and the proposed algorithm reduces the number of adders and addition operations needed to compute the filter output by minimizing the number of non-zero coefficients.
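The coefficient-reduction idea can be sketched as simple magnitude thresholding of FIR taps. The windowed-sinc prototype and the threshold value below are illustrative assumptions; the paper's algorithm for choosing which coefficients to keep and how to modify their values is more elaborate.

```python
# Sketch of coefficient reduction: zero out FIR taps whose magnitude is below
# a threshold, shrinking the number of multiply/accumulate operations while
# keeping the frequency response close to the original.
import cmath
import math

def lowpass_fir(num_taps, cutoff):
    """Hamming-windowed-sinc low-pass prototype (cutoff as fraction of fs)."""
    m = num_taps - 1
    h = []
    for n in range(num_taps):
        k = n - m / 2
        x = 2 * math.pi * cutoff * k
        sinc = 2 * cutoff if k == 0 else math.sin(x) / (math.pi * k)
        window = 0.54 - 0.46 * math.cos(2 * math.pi * n / m)
        h.append(sinc * window)
    return h

def sparsify(h, threshold):
    """Zero every tap smaller in magnitude than the threshold."""
    return [c if abs(c) >= threshold else 0.0 for c in h]

def response_at(h, f):
    """|H(e^{j*2*pi*f})| evaluated directly from the DTFT."""
    return abs(sum(c * cmath.exp(-2j * math.pi * f * n)
                   for n, c in enumerate(h)))

h = lowpass_fir(63, 0.1)
h_sparse = sparsify(h, 1e-3)
nonzero = sum(1 for c in h_sparse if c != 0.0)
# The sparse filter needs fewer MAC operations (one per non-zero tap) while
# its passband response stays close to the original.
```

Every zeroed tap removes one multiply-accumulate per output sample, which is exactly the adder/area/power saving the abstract targets when combined with distributed arithmetic.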
Abstract: The overall penumbra is usually defined as the distance, p20–80, separating the points that receive 20% and 80% of the dose on the beam axis at the depth of interest. This overall penumbra also accounts for the fact that some photons emitted by the distal parts of the source are only partially attenuated by the collimator. Medulloblastoma is the most common type of childhood brain tumor and often spreads to the spine. Current guidelines call for surgery to remove as much of the tumor as possible, followed by radiation of the brain and spinal cord, and finally treatment with chemotherapy.
The purpose of this paper was to present results on the uniformity of dose distribution in radiation fields surrounding the spine, using film dosimetry, and a comparison with 3D treatment planning software.
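The p20–80 penumbra defined above can be computed from a measured lateral profile by interpolating the 20% and 80% crossing points. The trapezoidal profile below is synthetic stand-in data, not film measurements from the study.

```python
# Sketch of computing the overall penumbra p20-80 from a lateral dose profile:
# the distance between the points receiving 20% and 80% of the maximum dose.

def penumbra_p20_80(positions, doses):
    """Width between the 20% and 80% dose points on one beam edge.

    Assumes `doses` rises monotonically over the edge region supplied."""
    dmax = max(doses)
    d20, d80 = 0.2 * dmax, 0.8 * dmax

    def crossing(level):
        # linear interpolation between the bracketing samples
        for i in range(len(doses) - 1):
            if doses[i] <= level <= doses[i + 1]:
                t = (level - doses[i]) / (doses[i + 1] - doses[i])
                return positions[i] + t * (positions[i + 1] - positions[i])
        raise ValueError("level not crossed in profile")

    return crossing(d80) - crossing(d20)

# Synthetic beam edge: dose ramps linearly from 0 to 100% over 10 mm.
pos = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]        # mm
dose = [0.0, 20.0, 40.0, 60.0, 80.0, 100.0]  # % of central-axis dose
p = penumbra_p20_80(pos, dose)  # 80% at 8 mm, 20% at 2 mm -> 6 mm
```

Applying this to film profiles on both edges of adjacent spinal fields is how junction uniformity of the kind studied here is quantified.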
Abstract: The major objective of this paper is to introduce a new method for selecting genes from DNA microarray data. As a criterion for selecting genes, we suggest measuring the local changes in the correlation graph of each gene and selecting those genes whose local changes are largest. More precisely, we calculate correlation networks from DNA microarray data of cervical cancer, where each network represents a tissue of a certain tumor stage and each node in a network represents a gene. From these networks we extract one tree for each gene by a local decomposition of the correlation network. The interpretation of a tree is that it represents the n-nearest-neighbor genes on the n-th level of the tree, measured by the Dijkstra distance, and hence gives the local embedding of a gene within the correlation network. For the obtained trees we measure the pairwise similarity between trees rooted at the same gene from normal to cancerous tissues. This evaluates the modification of the tree topology due to tumor progression. Finally, we rank the obtained similarity values from all tissue comparisons and select the top-ranked genes. For these genes, the local neighborhood in the correlation networks changes most between normal and cancerous tissues. As a result we find that the top-ranked genes are candidates suspected to be involved in tumor growth. This indicates that our method captures essential information from the underlying DNA microarray data of cervical cancer.
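The local-decomposition step, grouping genes into tree levels by their Dijkstra distance from a root gene, can be sketched as follows. The toy graph, gene names and level width are hypothetical; in the paper the edge weights would come from correlation strengths.

```python
# Sketch of the local decomposition: for a chosen gene, neighbors are grouped
# by Dijkstra distance in the correlation network, giving the levels of that
# gene's tree (its local embedding).
import heapq

def dijkstra(graph, source):
    """Shortest-path distance from `source` to every reachable node."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def neighborhood_levels(graph, source, level_width=1.0):
    """Bucket genes into tree levels by Dijkstra distance from `source`."""
    levels = {}
    for gene, d in dijkstra(graph, source).items():
        if gene != source:
            levels.setdefault(int(d // level_width) + 1, []).append(gene)
    return levels

# Hypothetical weighted correlation graph (weight ~ 1 - |correlation|):
graph = {
    "g0": [("g1", 0.2), ("g2", 0.9)],
    "g1": [("g0", 0.2), ("g3", 0.5)],
    "g2": [("g0", 0.9)],
    "g3": [("g1", 0.5)],
}
levels = neighborhood_levels(graph, "g0", level_width=0.5)
# Comparing such level structures for the same root gene between normal and
# tumor networks reveals genes whose local embedding changes most.
```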
Abstract: Soil washing with a surfactant solution is a potential technology for the rapid removal of hydrophobic organic compounds (HOC) from soil. However, a large amount of wash water is produced during operation, and this should be treated effectively by proper methods. The wash water from a site with complex contamination may contain high amounts of pollutants such as HOC and heavy metals, as well as the used surfactant. The heavy metals in the wash water have toxic effects on microbial activities and thus should be removed before the water proceeds to a biological waste-water treatment system. Moreover, the used surfactant solutions need to be recovered to reduce the operating cost of soil washing. In the present study, activated carbon (AC) was used to simultaneously remove heavy metals and HOC from soil-wash water. In an anionic-nonionic mixed surfactant solution, Cd(II) and phenanthrene (PHE) were effectively removed by adsorption on activated carbon. The removal efficiency for Cd(II) increased from 0.027 mmol-Cd/g-AC to 0.142 mmol-Cd/g-AC as the mole ratio of SDS increased in the presence of PHE. The adsorptive capacity for PHE also increased with the SDS mole ratio, due to the decrease in the molar solubilization ratio (MSR) for PHE in an anionic-nonionic surfactant mixture. The simultaneous adsorption of HOC and cationic heavy metals using activated carbon could be a useful method for surfactant recovery and for reducing heavy metal toxicity in a surfactant-enhanced soil washing process.