Abstract: This paper undertakes the problem of optimal
capacitor placement in a distribution system. The problem is how to
optimally determine the locations to install capacitors, the types and
sizes of capacitors to be installed and, during each load level, the
control settings of these capacitors in order that a desired objective
function is minimized while the load constraints, network constraints
and operational constraints (e.g. voltage profile) at different load
levels are satisfied. The problem is formulated as a combinatorial
optimization problem with a nondifferentiable objective function.
Four solution methodologies based on genetic algorithms (GA),
simulated annealing (SA), tabu search (TS), and hybrid GA-SA
algorithms are presented. The solution
methodologies are preceded by a sensitivity analysis to select the
candidate capacitor installation locations.
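As a hedged illustration of the combinatorial formulation, the sketch below runs a minimal genetic algorithm over discrete capacitor sizes at a few candidate buses. The size options, reactive demands, cost weight, and GA settings are all invented for illustration; they are not the paper's objective function or network model.

```python
# A minimal genetic-algorithm sketch for discrete capacitor sizing.
# Illustrative only: the cost model, candidate buses, and capacitor
# sizes below are hypothetical, not the paper's formulation.
import random

SIZES = [0, 150, 300, 450, 600]      # kvar options per candidate bus (assumed)
REACTIVE_DEMAND = [200, 450, 350]    # kvar needed at 3 candidate buses (assumed)
COST_PER_KVAR = 0.5                  # installation cost weight (assumed)

def fitness(chrom):
    # Penalize unmet reactive demand plus capacitor cost (toy objective).
    unmet = sum(abs(d - s) for d, s in zip(REACTIVE_DEMAND, chrom))
    cost = COST_PER_KVAR * sum(chrom)
    return unmet + cost

def evolve(pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.choice(SIZES) for _ in REACTIVE_DEMAND] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(a))       # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:               # mutation
                child[rng.randrange(len(child))] = rng.choice(SIZES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
```

In the papers surveyed, the chromosome would also encode control settings per load level and the fitness would include network and voltage constraints; the sketch keeps only the discrete-selection skeleton.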
Abstract: The connected dominating set (CDS) problem in unit disk
graphs has a significant impact on the efficient design of routing protocols
in wireless sensor networks, where the searching space for a
route is reduced to nodes in the set. A set is dominating if all the
nodes in the system are either in the set or neighbors of nodes in the
set. In this paper, a simple and efficient heuristic method is proposed
for finding a minimum connected dominating set (MCDS) in ad hoc
wireless networks, based on a new parameter: the support of vertices.
With this parameter, the proposed heuristic approach effectively
finds the MCDS of a graph. Extensive computational experiments
show that the proposed approach outperforms the recently proposed
heuristics found in the literature for the MCDS.
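The dominating-set definition above can be checked directly. The sketch below verifies, on a hypothetical path graph, that a vertex set dominates the graph and induces a connected subgraph; it is a definition check only, not the paper's support-of-vertices heuristic.

```python
# A small sketch of the CDS definition: every node is either in the set
# or adjacent to a member, and the set induces a connected subgraph.
# The example graph is hypothetical, not from the paper's experiments.
from collections import deque

def is_dominating(adj, s):
    s = set(s)
    return all(v in s or s & adj[v] for v in adj)

def induces_connected(adj, s):
    s = set(s)
    if not s:
        return False
    start = next(iter(s))
    seen, queue = {start}, deque([start])
    while queue:                      # BFS restricted to members of s
        u = queue.popleft()
        for w in adj[u] & s:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return seen == s

# Path graph 0-1-2-3-4: {1, 2, 3} is a connected dominating set.
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
cds = {1, 2, 3}
```

Routing protocols that use a CDS as a virtual backbone restrict route search to members of such a set, which is why minimizing its size matters.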
Abstract: In today's competitive environment, security concerns have grown tremendously. In the modern world, possession is known to be nine-tenths of the law. Hence, it is imperative to be able to safeguard one's property from worldly harms such as theft, destruction of property, and people with malicious intent. With the advent of technology in the modern world, the methods used by thieves and robbers for stealing have been improving exponentially. Therefore, it is necessary for surveillance techniques to also improve with the changing world. With the improvement in mass media and various forms of communication, it is now possible to monitor and control the environment to the advantage of the owners of the property. The latest technologies used in the fight against theft and destruction are video surveillance and monitoring. Using these technologies, it is possible to monitor and capture every inch and second of the area of interest. However, the technologies used so far are passive in nature, i.e., the monitoring systems only help in detecting a crime but do not actively participate in stopping or curbing it while it takes place. We have therefore developed a methodology to detect motion in a video stream environment, so that monitoring systems not only detect a crime but actively participate in stopping it while it is taking place. Hence, the system detects any motion in a live streaming video; once motion has been detected in the live stream, the software activates a warning system and captures the live streaming video.
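A common way to detect motion in a video stream is frame differencing, which the sketch below illustrates on synthetic frames: motion is flagged when enough pixels change between consecutive frames. The threshold values are illustrative assumptions, not the paper's settings.

```python
# A minimal frame-differencing sketch of the motion-detection idea:
# flag motion when enough pixels change between consecutive frames.
# The threshold values are illustrative assumptions, not the paper's.
import numpy as np

def motion_detected(prev, curr, pixel_thresh=25, area_thresh=0.01):
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    changed = (diff > pixel_thresh).mean()   # fraction of changed pixels
    return changed > area_thresh

frame_a = np.zeros((120, 160), dtype=np.uint8)   # static background
frame_b = frame_a.copy()
frame_b[40:80, 60:100] = 200                      # a "moving object" appears
```

In a live system, the same test would run on each grabbed frame, and a positive result would trigger the warning system and start recording.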
Abstract: In this paper we introduce three watermarking methods that can be used to count the number of times that a user has played some content. The proposed methods are tested with audio content in our experimental system using the most common signal processing attacks. The test results show that the watermarking methods used enable the watermark to be extracted under the most common attacks with a low bit error rate.
Abstract: A multi-residue analysis method for penicillins was
developed and validated in bovine muscle, chicken, milk, and flatfish.
Detection was based on liquid chromatography tandem mass
spectrometry (LC/MS/MS). The developed method was validated for
specificity, precision, recovery, and linearity. The analytes were
extracted with 80% acetonitrile and cleaned up by a single
reversed-phase solid-phase extraction step. Six penicillins presented
recoveries higher than 76% with the exception of Amoxicillin
(59.7%). Relative standard deviations (RSDs) were not more than
10%. LOQ values ranged from 0.1 to 4.5 µg/kg. The method was
applied to 128 real samples. Benzylpenicillin was detected in 15
samples and Cloxacillin was detected in 7 samples. Oxacillin was
detected in 2 samples. However, all detected levels were below the
MRLs established for penicillins.
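The validation statistics quoted above, recovery and relative standard deviation (RSD), are simple to compute. The sketch below shows both; the replicate concentrations are made-up illustration data, not the study's measurements.

```python
# A short sketch of the validation statistics reported above: recovery
# (measured vs. spiked concentration) and relative standard deviation.
# The replicate values below are made-up illustration data.
from statistics import mean, stdev

def recovery_percent(measured_mean, spiked):
    return 100.0 * measured_mean / spiked

def rsd_percent(replicates):
    return 100.0 * stdev(replicates) / mean(replicates)

replicates = [9.6, 9.9, 10.2, 9.8, 10.1]  # ug/kg, hypothetical
rec = recovery_percent(mean(replicates), 10.0)
rsd = rsd_percent(replicates)
```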
Abstract: High-voltage generators are subject to higher voltage
ratings and are designed to operate in harsh conditions. Stator
windings are the main component of generators, in which electrical,
magnetic, and thermal stresses remain the major causes of insulation
degradation and accelerated aging. A large number of
generators failed due to stator winding problems, mainly insulation
deterioration. Insulation degradation assessment plays a vital role in
asset life management. Stator failures are mostly catastrophic,
causing significant damage to the plant. Other than generation loss,
stator failure involves heavy repair or replacement costs.
Electro-thermal analysis is central to improving the design of stator
slot insulation. Dielectric parameters such as insulation thickness,
spacing, material type, and the geometry of the winding and slot are
major design considerations. A very powerful method available to
analyze electro-thermal performance is the Finite Element Method
(FEM), which is used in this paper. Analyses of various stator coil
and slot configurations are used to design a better dielectric system
to reduce electrical and thermal stresses in order to increase the
power of generator in the same volume of core. This paper describes
the process used to perform classical design and improvement
analysis of stator slot insulation.
Abstract: In general, class complexity is measured based on factors
such as Lines of Code (LOC), Function Points (FP), Number of Methods
(NOM), Number of Attributes (NOA), and so on. Researchers have
developed several new techniques, methods, and metrics based on
different factors for calculating the complexity of a class in
Object-Oriented (OO) software. Earlier, Arockiam et al. proposed a new
complexity measure, Extended Weighted Class Complexity (EWCC), an
extension of the Weighted Class Complexity proposed by Mishra et al.
EWCC is the sum of the cognitive weights of the attributes and methods
of a class and those of its derived classes. In EWCC, the cognitive
weight of each attribute is taken to be 1. The main problem with the
EWCC metric is that every attribute holds the same value, whereas the
cognitive load of understanding different types of attributes is, in
general, not the same. We therefore propose a new metric, Attribute
Weighted Class Complexity (AWCC). In AWCC, cognitive weights are
assigned to attributes according to the effort needed to understand
their data types. Case studies and experiments show the proposed
metric to be a better measure of the complexity of a class with
attributes.
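The AWCC idea can be sketched as a weighted sum. The weight table below is an illustrative assumption; the paper's actual cognitive-weight values per data type are not reproduced here.

```python
# A minimal sketch of the AWCC idea: weight each attribute by the
# effort needed to understand its data type, then add the cognitive
# weights of the methods. The weight table is a hypothetical
# assumption, not the paper's calibrated values.
ATTRIBUTE_WEIGHTS = {          # hypothetical effort per data type
    "int": 1, "float": 1, "string": 2, "array": 3, "object": 4,
}

def awcc(attribute_types, method_weights):
    attr = sum(ATTRIBUTE_WEIGHTS[t] for t in attribute_types)
    return attr + sum(method_weights)

# A class with two int attributes, one array attribute, and two
# methods of cognitive weight 2 and 3.
score = awcc(["int", "int", "array"], [2, 3])
```

Under EWCC, the same class would count each attribute as 1 regardless of type; AWCC's type-dependent weights are exactly what distinguishes the two metrics.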
Abstract: Polymers are among the most widely used materials in everyday life. The subject of renewable resources has attracted great attention in recent years. New polymeric materials derived from renewable resources such as carbohydrates have drawn public attention, especially because of their biocompatibility and biodegradability. The aim of our paper was to obtain environmentally compatible polymers from monosaccharides. Novel glycopolymers based on D-glucose have been obtained by copolymerization of a new carbohydrate-bearing monomer with methyl methacrylate (MMA) via free radical bulk polymerization. Differential scanning calorimetry (DSC) was performed to study the copolymerization of the monomer with the chosen co-monomer; the activation energy of this process was evaluated using the Ozawa method. The copolymers obtained were characterized using ATR-FTIR spectroscopy, and the thermal stability of the products was studied by thermogravimetry (TG).
Abstract: Zero-inflated models are commonly used in modeling count
data with excess zeros, where the excess zeros may be structural or
may occur by chance. These types of data are found in various
disciplines such as finance, insurance, biomedicine, econometrics,
ecology, and the health sciences, including sexual health and dental
epidemiology. The most popular zero-inflated models are the
zero-inflated Poisson and zero-inflated negative binomial models. In
addition, zero-inflated generalized Poisson and zero-inflated double
Poisson models are discussed in some of the literature. Recently, the
zero-inflated inverse trinomial and zero-inflated strict arcsine
models have been advocated and shown to serve as alternatives in
modeling count data that are overdispersed because of excess zeros and
unobserved heterogeneity. The purpose of this paper is to review the
related literature and provide a variety of examples of the
application of zero-inflated models in different disciplines.
Different model selection methods used in model comparison are also
discussed.
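The zero-inflated Poisson (ZIP) model mentioned above mixes a point mass at zero with a Poisson component, as the sketch below shows. The parameter values are illustrative, not fitted to any data set.

```python
# A sketch of the zero-inflated Poisson (ZIP) probability mass
# function: with probability pi the count is a structural zero, and
# with probability 1 - pi it is drawn from Poisson(lam).
# Parameter values are illustrative only.
from math import exp, factorial

def zip_pmf(k, pi, lam):
    poisson = exp(-lam) * lam ** k / factorial(k)
    return pi + (1 - pi) * poisson if k == 0 else (1 - pi) * poisson

pi, lam = 0.3, 2.0
p0 = zip_pmf(0, pi, lam)          # inflated zero probability
total = sum(zip_pmf(k, pi, lam) for k in range(50))
```

The inflation term is what lets the model match data whose observed zero frequency exceeds what a plain Poisson(lam) would predict.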
Abstract: This study applies the sequential panel selection
method (SPSM) procedure proposed by Chortareas and Kapetanios
(2009) to investigate the time-series properties of energy
consumption in 50 US states from 1963 to 2009. SPSM involves the
classification of the entire panel into a group of stationary series and
a group of non-stationary series to identify how many and which
series in the panel are stationary processes. Empirical results obtained
through SPSM with the panel KSS unit root test developed by Ucar
and Omay (2009) combined with a Fourier function indicate that
energy consumption is stationary in all 50 US states. The results
of this study have important policy implications for the 50 US states.
Abstract: The suburban area is important to the development of a city and a country. Russia's economy is going through major transitions. These transitions are rapidly changing the relationship between cities (urban areas) and the countryside (rural areas), and the development, growth, and popularity of suburbia. The process of suburbanization is taking place in the biggest cities of Russia, including Krasnoyarsk City. Modern Krasnoyarsk, with a population of about 1 million people, occupies a territory of 34,115 ha. This article analyzes the functions of the suburban area and connects these functions with the zoning of the suburban territory. The author uses a hierarchy method to select the best conditions for each function with respect to the natural component, transportation, and distance from the city. The result of this research is a map of the functional zoning of the suburban area of Krasnoyarsk City. The author uses a variety of factors that influence the suburban area to compare and choose the best conditions. Keywords: Suburban area, zoning of territory, Krasnoyarsk City.
Abstract: This paper describes a method to improve the robustness of a face recognition system based on the combination of two compensating classifiers. The face images are preprocessed by appearance-based statistical approaches, namely Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). The LDA features of the face image are taken as the input of a Radial Basis Function Network (RBFN). The proposed approach has been tested on the ORL database. The experimental results show that the LDA+RBFN algorithm achieves a recognition rate of 93.5%.
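The PCA preprocessing step mentioned above projects centered data onto the top eigenvectors of its covariance matrix, as the sketch below shows. The random matrix stands in for face-image vectors; it is not the ORL data, and the LDA and RBFN stages are omitted.

```python
# A minimal PCA sketch of the appearance-based preprocessing step:
# project centered data onto the top eigenvectors of the covariance
# matrix. Random data stands in for face images; it is not ORL.
import numpy as np

def pca_project(X, n_components):
    Xc = X - X.mean(axis=0)                     # center the data
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)            # ascending eigenvalues
    top = vecs[:, ::-1][:, :n_components]       # top principal axes
    return Xc @ top

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 8))                    # 20 samples, 8 features
Y = pca_project(X, 3)
```

In the described system, projections like these (followed by LDA) become the low-dimensional feature vectors fed to the RBFN classifier.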
Abstract: A full six degrees of freedom (6-DOF) flight dynamics
model is proposed for the accurate prediction of short and long-range
trajectories of high spin and fin-stabilized projectiles via atmospheric
flight to final impact point. The projectile is assumed to be both rigid
(non-flexible) and rotationally symmetric about its spin axis, launched
at low and high pitch angles. The mathematical model is based on the
full equations of motion set up in the no-roll body reference frame and
is integrated numerically from given initial conditions at the firing
site. The projectile's maneuvering motion depends on the most
significant force and moment variations, in addition to wind and
gravity. The computational flight analysis takes into consideration the
Mach number and total angle of attack effects by means of the
variable aerodynamic coefficients. For the purposes of the present
work, linear interpolation has been applied from the tabulated database
of McCoy's book. The developed computational method gives
satisfactory agreement with published data of verified experiments and
computational codes on atmospheric projectile trajectory analysis for
various initial firing flight conditions.
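The numerical integration from given initial conditions can be illustrated with a much simpler point-mass (not 6-DOF) model: Euler steps with a constant drag coefficient. All parameter values are invented for illustration and are far from the paper's full no-roll-frame equations of motion.

```python
# A simplified point-mass sketch of numerical trajectory integration:
# constant drag coefficient, forward-Euler steps. This is NOT the
# paper's 6-DOF model; all parameter values are illustrative.
import math

def trajectory(v0, pitch_deg, cd_over_m=0.0005, g=9.81, dt=0.001):
    vx = v0 * math.cos(math.radians(pitch_deg))
    vy = v0 * math.sin(math.radians(pitch_deg))
    x = y = 0.0
    while y >= 0.0:
        v = math.hypot(vx, vy)
        vx -= cd_over_m * v * vx * dt           # drag opposes velocity
        vy -= (g + cd_over_m * v * vy) * dt
        x += vx * dt
        y += vy * dt
    return x                                    # range at ground impact

range_drag = trajectory(300.0, 30.0)
range_vacuum = trajectory(300.0, 30.0, cd_over_m=0.0)
```

The full model replaces the constant drag term with Mach- and angle-of-attack-dependent aerodynamic coefficients interpolated from tabulated data, and adds the moment equations for spin and pitch/yaw motion.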
Abstract: An electrocardiogram (ECG) data compression algorithm
is needed that reduces the amount of data to be transmitted, stored,
and analyzed without losing the clinical information content. A
wavelet ECG data codec based on the Set Partitioning In Hierarchical
Trees (SPIHT) compression algorithm is proposed in this paper. The
SPIHT algorithm has achieved notable success in still image coding.
We modified the algorithm for the one-dimensional (1-D) case and
applied it to compression of ECG data.
With this compression method, a small percent root mean square
difference (PRD) and a high compression ratio are achieved with low
implementation complexity. Experiments on selected
records from the MIT-BIH arrhythmia database revealed that the
proposed codec is significantly more efficient in compression and in
computation than previously proposed ECG compression schemes.
Compression ratios of up to 48:1 for ECG signals lead to acceptable
results for visual inspection.
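The two figures of merit used above, compression ratio and PRD, can be computed as in the sketch below. The signals are synthetic stand-ins, not MIT-BIH records, and the bit counts are illustrative.

```python
# A short sketch of the two figures of merit used above: compression
# ratio and percent root-mean-square difference (PRD) between the
# original and reconstructed signal. The signals are synthetic.
import math

def prd(original, reconstructed):
    num = sum((o - r) ** 2 for o, r in zip(original, reconstructed))
    den = sum(o ** 2 for o in original)
    return 100.0 * math.sqrt(num / den)

def compression_ratio(original_bits, compressed_bits):
    return original_bits / compressed_bits

x = [math.sin(0.1 * n) for n in range(200)]          # "ECG" stand-in
x_hat = [v + 0.01 for v in x]                         # small distortion
quality = prd(x, x_hat)
cr = compression_ratio(200 * 11, 200 * 11 / 48)       # 48:1 example
```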
Abstract: Increasingly sophisticated technologies can now assist
surgeons in improving surgical performance through various training
programs. Equally important to skills learning is the assessment
method, as it determines the learning and technical proficiency of a
trainee. A consistent and rigorous assessment system will ensure that
trainees acquire the required level of competency prior to
certification. This paper
reviews the methods currently in use for assessment of surgical
skill and some modern techniques using computer-based
measurements and virtual reality systems for more quantitative
measurements.
Abstract: It is a challenge to provide a wide range of queries to
database query systems for small mobile devices, such as PDAs
and cell phones. Currently, due to the physical and resource
limitations of these devices, most reported database querying systems
developed for them offer only a small set of pre-determined queries
that users can pose. This can be resolved by
allowing free-form queries to be entered on the devices. Hence, a
query language that does not restrict the combination of query terms
entered by users is proposed. This paper presents the free-form query
language and the method used in translating free-form queries to
their equivalent SQL statements.
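One way such a translation can work is by matching query terms against a known schema, as the hypothetical sketch below shows. The schema, synonym table, and grammar here are invented for illustration; the abstract does not specify the paper's actual query language or translation rules.

```python
# A hypothetical sketch of translating a free-form query into SQL by
# keyword matching against a known schema. The schema, synonyms, and
# grammar are invented for illustration, not the paper's method.
SCHEMA = {"employee": {"name", "salary", "dept"}}
SYNONYMS = {"earning": "salary", "department": "dept"}

def to_sql(free_form):
    tokens = [SYNONYMS.get(t, t) for t in free_form.lower().split()]
    table = next(t for t in tokens if t in SCHEMA)           # pick a table
    cols = [t for t in tokens if t in SCHEMA[table]] or ["*"]
    return f"SELECT {', '.join(cols)} FROM {table}"

sql = to_sql("show name and earning of every employee")
```

Because no fixed combination of terms is required, the same translator accepts "list employee" or "employee earning department" without a pre-determined template.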
Abstract: This paper describes a method of product analysis from the
recycling point of view. The analysis is based on a set of measures
that assess a product from the point of view of the final stages of
its lifecycle. It was assumed that such an analysis would be performed
at the design phase; to support it, a computer system that aids the
designer during the design process has been developed. The structure
of this computer tool, based on agent technology, and example results
are also included in the paper.
Abstract: From the planning point of view, it is essential to model
mode choice, given the massive costs incurred in transportation
systems. Intercity travellers in Libya have distinct features compared
with travellers from other countries, including cultural and
socioeconomic factors. Consequently, the goal of this study is to
recognize the behavior of intercity travel using disaggregate models,
for projecting the demand of nation-level intercity travel in Libya.
A multinomial logit model for all intercity trips has been formulated
to examine national-level intercity transportation in Libya. The model
was calibrated using a nationwide revealed preference (RP) and stated
preference (SP) survey and was developed for the different purposes of
intercity trips (work, social, and recreational). The model parameters
were estimated by the maximum likelihood method. The data needed for
model development were obtained from all major intercity corridors
in Libya. The final sample size consisted of 1300 interviews. About
two-thirds of these data were used for model calibration, and the
remaining parts were used for model validation. This study, which is
the first of its kind in Libya, investigates the intercity traveler’s
mode-choice behavior. The intercity travel mode-choice model was
successfully calibrated and validated. The outcomes indicate that the
overall model is effective and yields high estimation precision. The
proposed model is beneficial because it is sensitive to many variables
and can be employed to determine the impact of changes in numerous
characteristics on the demand for various travel modes. The model
estimates may also be valuable to planners, who can estimate the
shares of the various modes and determine the impact of specific
policy changes on the demand for intercity travel.
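In a multinomial logit model, choice probabilities are a softmax over mode utilities, as the sketch below shows. The utility values are illustrative placeholders, not the calibrated Libyan model coefficients.

```python
# A minimal multinomial-logit sketch: each mode's choice probability
# is exp(utility) normalized over all modes. Utilities are
# illustrative, not the paper's estimated values.
import math

def mnl_probabilities(utilities):
    exps = {m: math.exp(u) for m, u in utilities.items()}
    total = sum(exps.values())
    return {m: e / total for m, e in exps.items()}

# Hypothetical systematic utilities for a work trip.
probs = mnl_probabilities({"car": 1.2, "bus": 0.4, "air": -0.5})
```

Policy analysis with such a model amounts to changing the attributes that enter the utilities (cost, travel time, and so on) and recomputing the predicted mode shares.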
Abstract: The traveling salesman problem (TSP) is hard to solve when
the number of cities and routes becomes large. The frequency
graph is constructed to tackle the problem. A frequency graph
maintains the topological relationships of the original weighted graph.
The numbers on the edges are the frequencies of the edges accumulated
from the locally optimal Hamiltonian paths. The simplest kind of
locally optimal Hamiltonian path is computed based on the four-vertex
and three-line inequality. A search algorithm is given to find the
optimal Hamiltonian circuit based on the frequency graph. The
experiments show that the method can find the optimal Hamiltonian
circuit within several trials.
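The edge-frequency idea can be sketched on a toy instance: collect good Hamiltonian paths and count how often each edge appears. Here the "good" paths are simply the brute-force best path from each start vertex of a 5-city example; the instance and this way of generating paths are illustrative, not the paper's four-vertex, three-line construction.

```python
# A small sketch of the edge-frequency idea: tally how often each
# edge appears across good Hamiltonian paths of a toy 5-city
# instance. The distance matrix and path generation are illustrative.
from itertools import permutations

DIST = [[0, 2, 9, 10, 7],
        [2, 0, 6, 4, 3],
        [9, 6, 0, 8, 5],
        [10, 4, 8, 0, 6],
        [7, 3, 5, 6, 0]]

def path_length(path):
    return sum(DIST[a][b] for a, b in zip(path, path[1:]))

def best_path_from(start, n=5):
    # Brute-force best Hamiltonian path starting at `start`.
    rest = [v for v in range(n) if v != start]
    return min(((start,) + p for p in permutations(rest)), key=path_length)

freq = {}
for start in range(5):
    best = best_path_from(start)
    for a, b in zip(best, best[1:]):
        edge = tuple(sorted((a, b)))
        freq[edge] = freq.get(edge, 0) + 1
```

Edges with high frequency are the ones the search over the frequency graph would favor when assembling a candidate optimal circuit.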
Abstract: Sediment formation and its transport along a river course are important hydraulic considerations in river engineering. Their impact on river morphology on the one hand, and their importance in the design and construction of hydraulic structures on the other, have attracted the attention of experts in arid and semi-arid regions. Under certain conditions, where the momentum energy of the flow reaches a specific rate, sediment materials start to be transported with the flow. This is usually analyzed in two categories: suspended load and bed load materials. Sedimentation along the waterways and the conveyance of vast volumes of material into canal networks can influence water abstraction at the intake structures. This poses a serious threat to operational sustainability and water delivery performance in the canal networks. The situation is serious where ineffective watershed management (poor vegetation cover in the water basin) is the underlying cause of the soil erosion that feeds material into the waterways, which in turn necessitates comprehensive study. The present paper aims to provide an analytical investigation of the sediment process in the waterways on the one hand and an estimation of the sediment load transported into the lined canals, using the SHARC software, on the other. To this end, the paper focuses on a comparative analysis of the hydraulic behavior of the Sabilli main canal, which feeds the pumping station, and that of the Western canal in the Greater Dezful region, in order to identify the factors effective in sedimentation and ways of mitigating their impact on water abstraction in the canal systems. The method involved the use of observational data available at the Dezful Dastmashoon hydrometric station along a 6 km waterway of the Sabilli main canal, with the SHARC software used to estimate the suspended load concentration and bed load materials.
Results showed the transport of a significant volume of sediment load from the waterways into the canal system; this is assumed to have arisen from the absence of a stilling basin on the one hand and the gravity flow on the other, and has caused serious challenges. This is contrary to what occurs in the Sabilli canal, where the design feature that incorporates a settling basin just before the pumping station is the major cause of the reduced sediment load transported into the canal system. Results showed that modifying the present design by constructing a settling basin just upstream of the Western intake structure can considerably reduce the entry of sediment material into the canal system. Not only can this contribute to the sustainability of the hydraulic structures, but it can also improve the operational performance of the water conveyance and distribution system, all of which are prerequisites for a reliable and equitable water delivery regime for the command area.