Abstract: This paper proposes a novel heuristic algorithm to determine the optimal size and location of distributed generators in unbalanced distribution networks. The proposed heuristic algorithm handles planning cases where power loss is to be minimized without violating practical system constraints. The distributed generation units in the proposed algorithm are modeled as voltage-controlled nodes, with the flexibility to be converted to constant-power-factor nodes in case of a reactive power limit violation. The proposed algorithm is implemented in MATLAB and tested on the IEEE 37-node feeder. The results obtained show the effectiveness of the proposed algorithm.
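The PV-to-PQ switching rule described above can be sketched as follows. This is a hedged illustration only: the function name, data layout, and limit values are hypothetical, not the paper's implementation.

```python
import math

def check_dg_node(q_injected, q_min, q_max, p_injected):
    """Sketch of the switching rule: a DG modeled as a voltage-controlled (PV)
    node is converted to a constant-power-factor (PQ) node when its reactive
    power limit is violated. All names and values are illustrative."""
    if q_min <= q_injected <= q_max:
        return {"type": "PV", "q": q_injected}            # limits respected: stay PV
    q_clamped = max(q_min, min(q_max, q_injected))        # clamp to violated limit
    pf = p_injected / math.hypot(p_injected, q_clamped)   # resulting power factor
    return {"type": "PQ", "q": q_clamped, "pf": pf}

# DG asking for 1.5 p.u. reactive power against a 1.0 p.u. upper limit
node = check_dg_node(q_injected=1.5, q_min=-1.0, q_max=1.0, p_injected=2.0)
```

In a full power-flow loop this check would run after each iteration, re-declaring the node type before the next solve.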
Abstract: In this paper, we present a model-based regression test
suite reduction approach that uses EFSM model dependence analysis
and a probability-driven greedy algorithm to reduce software
regression test suites. The approach automatically identifies the
differences between the original model and the modified model as a
set of elementary model modifications. EFSM dependence analysis is
performed for each elementary modification to reduce the regression
test suite, and then the probability-driven greedy algorithm is adopted
to select from the reduced regression test suite the minimum set of
test cases that covers all interaction patterns. Our initial experience
shows that the approach may significantly reduce the size of
regression test suites.
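The greedy selection step described above is an instance of greedy set cover. The sketch below (hypothetical data and names, not the paper's code) approximates the probability-driven choice by always taking the test that covers the most still-uncovered interaction patterns.

```python
def greedy_select(tests, patterns):
    """Greedy set cover: tests maps test_name -> set of patterns it covers.
    Returns a small list of tests whose union covers `patterns`."""
    uncovered, selected = set(patterns), []
    while uncovered:
        # pick the test covering the most uncovered patterns
        best = max(tests, key=lambda t: len(tests[t] & uncovered))
        if not tests[best] & uncovered:
            break                       # remaining patterns are not coverable
        selected.append(best)
        uncovered -= tests[best]
    return selected

tests = {"t1": {"p1", "p2"}, "t2": {"p2", "p3"}, "t3": {"p3"}}
suite = greedy_select(tests, {"p1", "p2", "p3"})  # two tests suffice here
```

A probability-driven variant would replace the `max` key with a weight derived from each test's likelihood of exercising an affected pattern.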
Abstract: This research aims to develop an algorithm to
generate a schedule for multiple cranes that maximizes load
throughput in an anodizing operation. The proposed algorithm uses
an enumerative strategy to search for a constant time between
successive loads and the crane covering range over the baths. The
computer program developed is able to generate a near-optimal crane
schedule within a reasonable time, i.e. within 10 minutes. Its results
are compared with existing solutions from an aluminum extrusion
plant. The program can be used to generate crane schedules for
mixed products, thus allowing mixed-model line balancing to
improve overall cycle times.
Abstract: This paper presents a computational study of steady-state,
three-dimensional, highly turbulent flow and heat transfer
characteristics in a constant-temperature-surfaced circular duct fitted
with 90° hemispherical inline baffles. The computations are based on
the realizable k-ɛ model with the standard wall function, using the
finite volume method, and the SIMPLE algorithm has been
implemented. The computational study is carried out for Reynolds
numbers, Re, ranging from 80,000 to 120,000, a Prandtl number, Pr,
of 0.73, and pitch ratios, PR, of 1, 2, 3, 4 and 5 based on the
hydraulic diameter of the channel, with the hydrodynamic entry
length, thermal entry length and test section modeled. Ansys Fluent
15.0 software has been used to solve the flow field. The study
reveals that the circular pipe with baffles has a higher Nusselt
number and friction factor than the smooth circular pipe without
baffles. The maximum Nusselt number and friction factor are
obtained for PR = 5 and PR = 1, respectively. The Nusselt number
increases with increasing pitch ratio in the range of study, whereas
the friction factor decreases up to PR = 3, after which it remains
almost constant up to PR = 5. The thermal enhancement factor
increases with increasing pitch ratio and decreases slightly with
Reynolds number in the range of study, becoming almost constant at
higher Reynolds numbers. The computational results reveal that the
optimum thermal enhancement factor of the 90° inline hemispherical
baffles is about 1.23, obtained for pitch ratio 5 at a Reynolds number
of 120,000. This also shows that the optimum pitch ratio at which
the baffles should be installed in such highly turbulent flows is 5.
The results show that the pitch ratio and Reynolds number play an
important role in both the fluid flow and heat transfer characteristics.
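The thermal enhancement factor quoted in the abstract is commonly defined at equal pumping power as (Nu/Nu_s) / (f/f_s)^(1/3), where the subscript s denotes the smooth duct. The abstract does not spell out its definition, so this is the standard form, and the numbers below are illustrative, not the paper's data.

```python
def thermal_enhancement_factor(nu, nu_smooth, f, f_smooth):
    """Equal-pumping-power thermal enhancement factor (common definition)."""
    return (nu / nu_smooth) / (f / f_smooth) ** (1.0 / 3.0)

# Illustrative values only: a baffled duct with a 56% Nusselt gain
# and a 3x friction penalty still yields a factor above unity.
tef = thermal_enhancement_factor(nu=250.0, nu_smooth=160.0, f=0.09, f_smooth=0.03)
```

A factor above 1, as in the paper's reported 1.23, means the heat transfer gain outweighs the friction penalty at equal pumping power.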
Abstract: With the evolution of technology, the expression of
opinions has shifted to the digital world. The domain of politics, one
of the hottest topics of opinion mining research, is here combined
with behavior analysis for determining political affiliation in texts,
which constitutes the subject of this paper. This study aims to
classify the text of news articles and blogs as either Republican or
Democrat with the minimum number of features. As an initial set, 68
features, 64 of which were Linguistic Inquiry and Word Count
(LIWC) features, were tested against 14 benchmark classification
algorithms. In later experiments, the dimensionality of the feature
vector was reduced using 7 feature selection algorithms. The results
show that the “Decision Tree”, “Rule Induction” and “M5 Rule”
classifiers, when used with the “SVM” and “IGR” feature selection
algorithms, performed best, with up to 82.5% accuracy on the given
dataset. Further tests on single-feature and linguistics-based feature
sets showed similar results. The feature “Function”, an aggregate
feature of the linguistic category, was found to be the most
discriminative of the 68 features, with an accuracy of 81% in
classifying articles as either Republican or Democrat.
Abstract: Submerged arc welding is a very complex process,
but also a very efficient and high-performance one. In the present
study, an attempt has been made to reduce welding distortion by
adding an increased amount of oxide flux, in the form of TiO2, in
the submerged arc welding process. Care has been taken to avoid an
excessive amount of the additive while still attaining significant
results. A Data Envelopment Analysis (DEA) based BAT algorithm
is used for parametric optimization, in which DEA is used to convert
the multiple response parameters into a single response parameter.
The present study also helps to establish the effectiveness of adding
TiO2 to the active flux during the submerged arc welding process.
Abstract: Any signal transmitted over a channel is corrupted by noise and interference. A host of channel coding techniques has been proposed to alleviate the effects of such noise and interference. Among these, Turbo codes are recommended because of their increased capacity at higher transmission rates and their superior performance over convolutional codes. Multimedia content, which involves large amounts of data, is best protected by Turbo codes. Turbo decoders employ the Maximum A-posteriori Probability (MAP) and Soft Output Viterbi Algorithm (SOVA) decoding algorithms. Conventional Turbo-coded systems employ Equal Error Protection (EEP), in which all the data in an information message are protected uniformly. Some applications involve Unequal Error Protection (UEP), in which important information bits receive a higher level of protection than the other bits. In this work, the traditional Log-MAP decoding algorithm is enhanced by using optimized scaling factors for both decoders. The error-correcting performance under UEP in Additive White Gaussian Noise (AWGN) and Rayleigh fading channels is analyzed for the transmission of an image, with the Discrete Cosine Transform (DCT) as the source coding technique. This paper compares the performance of the Log-MAP, Modified Log-MAP (MlogMAP) and Enhanced Log-MAP (ElogMAP) algorithms for image transmission. The MlogMAP algorithm is found to be best at lower Eb/N0 values, but at higher Eb/N0 the ElogMAP algorithm performs better with optimized scaling factors. The performance comparison of the AWGN and fading channels indicates the robustness of the proposed algorithm. According to the performance of the three message classes, class 3 is protected more strongly than the other two classes. From the performance analysis, it is observed that the ElogMAP algorithm with UEP is best for the transmission of an image, compared to the Log-MAP and MlogMAP decoding algorithms.
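At the heart of Log-MAP decoding is the max* (Jacobian logarithm) operation; scaling its correction term, or the extrinsic information, is a common way optimized scaling factors enter a decoder. The sketch below shows the operation itself; the value s = 0.75 is only illustrative, not the paper's optimized factor.

```python
import math

def max_star(a, b, s=1.0):
    """max*(a, b) = max(a, b) + s * ln(1 + exp(-|a - b|)).
    s = 1 gives exact Log-MAP; s < 1 is a scaled correction term."""
    return max(a, b) + s * math.log1p(math.exp(-abs(a - b)))

exact = max_star(1.0, 0.5)            # full Log-MAP correction
scaled = max_star(1.0, 0.5, s=0.75)   # scaled correction (illustrative s)
```

Setting s = 0 collapses max* to plain max, i.e. the Max-Log-MAP approximation, which is why tuning s trades complexity against accuracy.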
Abstract: A sensor network consists of multiple detection
locations called sensor nodes, each of which is tiny, lightweight
and portable. Single-path routing protocols in wireless sensor
networks can lead to holes in the network, since only the nodes
on the single path are used for data transmission. Apart from
advantages such as reduced computation, complexity and resource
utilization, they have drawbacks such as low throughput, increased
traffic load and delays in data delivery. Therefore, multipath routing
protocols are preferred for WSNs. Distributing the traffic among
multiple paths increases the network lifetime. We propose a scheme
in which data is transmitted through a dominant path to save energy.
In order to obtain a high delivery ratio, a basic route reconstruction
protocol is utilized to reconstruct the path whenever a failure is
detected. A basic reconstruction routing (BRR) algorithm is
proposed, in which a node can leap over a path failure by using the
already existing routing information from its neighbourhood while
the collected data is transmitted from the source to the sink. In order
to save energy and attain a high data delivery ratio, data is
transmitted along multiple paths, which is achieved by the BRR
algorithm whenever a failure is detected. Further, an analysis of how
the proposed protocol overcomes the drawbacks of the existing
protocols is presented. The performance of our protocol is compared
to AOMDV and the energy efficient node-disjoint multipath routing
protocol (EENDMRP). The system is implemented using NS-2.34.
The simulation results show that the proposed protocol achieves a
high delivery ratio with low energy consumption.
Abstract: Nowadays, the use of renewable energy sources has grown considerably because of rising costs and public demand for clean energy sources. One of the fastest growing sources is wind energy. In this paper, a Wind Diesel Hybrid System (WDHS) comprising a Diesel Generator (DG), a Wind Turbine Generator (WTG), the consumer load, a Battery-based Energy Storage System (BESS), and a Dump Load (DL) is used. Voltage is controlled by the Diesel Generator, while frequency is controlled by the BESS and the DL. Eliminating the BESS is an efficient way to reduce maintenance costs and improve the dynamic response. Simulation results with graphs of the power system frequency, active power, and battery power are presented for load changes. The controller parameters are optimized using the Imperialist Competitive Algorithm (ICA). The simulation results for the BESS and no-BESS cases are compared. The results show that in the no-BESS case, frequency control using ICA is better than in the BESS case.
Abstract: The main function of Medium Access Control (MAC) is to share the channel efficiently between all nodes. In a real-time scenario, a certain amount of bandwidth is wasted due to back-off periods: more bandwidth is wasted in the idle state if the back-off period is very long, and collisions may occur if the back-off period is short. So, optimization is needed for this problem. The main objective of this work is to reduce the delay due to the back-off period, thereby reducing collisions and increasing throughput. Here a method called the virtual back-off algorithm (VBA) is used to optimize the back-off period, thereby increasing throughput and reducing collisions. The main idea is to optimize the number of transmissions for every node. A counter is introduced at each node to implement this idea; the counter value represents the sequence number. VBA is classified into two types: VBA with counter sharing (VBA-CS) and VBA with no counter sharing (VBA-NCS). These two variants of VBA are compared for various parameters. Simulation is done in the NS-2 environment. The results obtained are found to be promising.
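The per-node counter idea can be sketched as follows. This is a hedged illustration of the kind of counter-derived back-off the abstract describes, not the paper's exact VBA rule; the class name, slot formula, and slot count are assumptions.

```python
class VbaNode:
    """Sketch: each node keeps a counter (its transmission sequence number)
    and derives its back-off slot from it instead of a wide random window,
    keeping per-node transmissions balanced. Details are illustrative."""

    def __init__(self, node_id, slots=8):
        self.node_id = node_id
        self.counter = 0          # sequence number of this node's transmissions
        self.slots = slots

    def next_backoff(self):
        # deterministic, counter-derived slot (hypothetical formula)
        slot = (self.counter + self.node_id) % self.slots
        self.counter += 1         # one more transmission scheduled
        return slot

a, b = VbaNode(0), VbaNode(1)
slots_a = [a.next_backoff() for _ in range(3)]   # node 0's slots stay offset
slots_b = [b.next_backoff() for _ in range(3)]   # from node 1's, avoiding collisions
```

The counter-sharing variant (VBA-CS) would additionally exchange counter values between neighbours so the offsets stay disjoint network-wide.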
Abstract: This paper presents a study of three kinds of algorithms:
channel equalization with the ZF and MMSE criteria, applied to the
BRAN A channel model, and the adaptive filtering algorithms LMS
and RLS used to estimate the parameters of the equalizer filter, i.e.
to track the channel estimate and thus reflect the temporal variations
of the channel and reduce the error in the transmitted signal. The
performance of the ZF and MMSE equalizers is presented for the
noiseless case, together with a performance comparison of the LMS
and RLS algorithms.
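The LMS adaptation mentioned above follows the standard stochastic-gradient tap update. The sketch below uses the textbook rule; the step size, tap count, and signals are illustrative, not taken from the paper.

```python
def lms_step(weights, x, desired, mu=0.05):
    """One LMS iteration: x is the current input window (same length as
    weights). Returns the updated taps and the a-priori error."""
    y = sum(w * xi for w, xi in zip(weights, x))   # filter output
    e = desired - y                                # estimation error
    # gradient-descent tap update: w <- w + mu * e * x
    return [w + mu * e * xi for w, xi in zip(weights, x)], e

w = [0.0, 0.0]
w, e1 = lms_step(w, [1.0, 0.0], desired=1.0)   # error is large at first
w, e2 = lms_step(w, [1.0, 0.0], desired=1.0)   # error shrinks as taps adapt
```

RLS replaces the scalar step size with a recursively updated inverse-correlation matrix, converging faster at higher per-iteration cost, which is the trade-off such comparisons measure.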
Abstract: Hybrid electric vehicles can reduce pollution and
improve fuel economy. Power-split hybrid electric vehicles (HEVs)
provide two power paths between the internal combustion engine
(ICE) and the energy storage system (ESS) through the gears of an
electrically variable transmission (EVT). The EVT allows the ICE to
operate independently of vehicle speed at all times. Therefore, the
ICE can operate in the efficient region of its characteristic brake
specific fuel consumption (BSFC) map. The two-mode powertrain
can operate in input-split or compound-split EVT modes and in four
different fixed gear configurations. The power-split architecture is
advantageous because it combines the conventional series and
parallel power paths. This research focuses on the input-split and
compound-split modes of the two-mode power-split powertrain.
Fuzzy Logic Control (FLC) for the internal combustion engine and
PI control for the electric machines (EMs) are derived for an urban
driving cycle simulation. These control algorithms reduce vehicle
fuel consumption and improve ICE efficiency while maintaining the
state of charge (SOC) of the energy storage system in an efficient
range.
Abstract: In this paper we propose a computer-aided solution
based on Genetic Algorithms in order to reduce the effort of drafting
two reports required at product launch, the FMEA analysis and the
Control Plan, and to improve the knowledge base of the
development teams for future projects. The solution allows the
design team to enter the data required for the FMEA. The actual
analysis is performed using Genetic Algorithms to find an optimum
between the RPN risk factor and the cost of production. A feature of
Genetic Algorithms is that they can be used as a means of finding
solutions to multi-criteria optimization problems. In our case, the
three specific FMEA risk factors are considered together with the
reduction of production cost. The analysis tool generates final
reports for all FMEA processes. The data obtained in the FMEA
reports are automatically integrated, together with the other entered
parameters, into the Control Plan. The solution is implemented as an
application running on an intranet on two servers: one containing the
analysis and plan generation engine and the other containing the
database where the initial parameters and results are stored. The
results can then be used as starting solutions in the synthesis of other
projects. The solution was applied to the welding, laser cutting and
bending processes used to manufacture chassis for buses. The
advantages of the solution are the efficient elaboration of documents
in the current project, by automatically generating the FMEA and
Control Plan reports using multi-criteria optimization of production,
and the building of a solid knowledge base for future projects. The
solution we propose is a cheap alternative to other solutions on the
market, as it uses Open Source tools in its implementation.
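The three FMEA risk factors referred to above are, in standard FMEA practice, severity, occurrence, and detection, whose product is the risk priority number (RPN). The sketch below shows that standard relation and one hypothetical way a GA fitness could combine RPN with production cost; the weights are assumptions, not the paper's formulation.

```python
def rpn(severity, occurrence, detection):
    """Standard FMEA risk priority number: each factor is rated 1..10."""
    return severity * occurrence * detection

def fitness(severity, occurrence, detection, cost, w_rpn=1.0, w_cost=0.5):
    """Hypothetical weighted-sum fitness a GA could minimize; the weights
    w_rpn and w_cost are illustrative assumptions."""
    return w_rpn * rpn(severity, occurrence, detection) + w_cost * cost

r = rpn(7, 4, 3)                       # a moderately risky failure mode
f = fitness(7, 4, 3, cost=100.0)       # combined objective for the GA
```

A multi-criteria GA could instead keep RPN and cost as separate objectives and search for a Pareto front rather than a weighted sum.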
Abstract: This paper develops a Non-Linear Model Predictive
Control (NMPC) scheme for water quality in Drinking Water
Distribution Systems (DWDS) based on an advanced non-linear
quality dynamics model including disinfection by-products (DBPs).
Special attention is paid to the analysis of the impact of the flow
trajectories prescribed by the upper control level of the recently
developed two-time-scale architecture for integrated quality and
quantity control in DWDS. The new quality controller is to operate
within this architecture in the fast time scale as the lower-level
quality controller. The controller performance is validated by a
comprehensive simulation study based on an example case study
DWDS.
Abstract: In this paper, we are interested in the problem of
finding similar images in a large database. For this purpose we
propose a new algorithm based on a combination of 2-D histogram
intersection in the HSV space and statistical moments. The proposed
histogram is based on a 3x3 window and not only on the intensity of
the pixel. This approach overcomes the drawback of the
conventional 1-D histogram, which ignores the spatial distribution
of pixels in the image, while the statistical moments are used to
mitigate the effects of the discretisation of the color space that is
intrinsic to the use of histograms. We compare the performance of
our new algorithm to various state-of-the-art methods and show that
it has several advantages: it is fast, consumes little memory and
requires no learning. To validate our results, we apply this algorithm
to search for similar images in different image databases.
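The histogram intersection measure underlying the approach above is simple to state. The sketch below uses toy 1-D histograms for brevity; the paper's version operates on a 2-D HSV histogram built over 3x3 windows and adds statistical moments.

```python
def histogram_intersection(h1, h2):
    """Similarity in [0, 1] for two normalized histograms of equal length:
    the sum of the bin-wise minima."""
    return sum(min(a, b) for a, b in zip(h1, h2))

h_query = [0.5, 0.3, 0.2]   # toy normalized histograms
h_match = [0.4, 0.4, 0.2]
sim = histogram_intersection(h_query, h_match)   # close to 1 => similar
```

The measure is fast (one pass over the bins) and needs no training, which matches the advantages the abstract claims for the combined method.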
Abstract: Image segmentation and color identification are
important processes used in various emerging fields such as
intelligent robotics. A method is proposed for a manipulator to grasp
a colored object and place it in the correct location. Existing
methods such as PSO have problems with slow convergence and
with converging to a local minimum, leading to sub-optimal
performance. To improve the performance, we use the watershed
algorithm for segmentation and EPSO for color identification. The
EPSO method is used to reduce the probability of being stuck in a
local minimum. The proposed method offers the particles a more
powerful global exploration capability. The EPSO method can
detect particles stuck in a local minimum and can also enhance the
learning speed, as the particle movement will be faster.
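The standard PSO velocity and position update that EPSO builds on can be sketched as follows; the inertia and acceleration coefficients are typical textbook values, and the escape mechanism EPSO adds on top (re-energizing stuck particles) is not reproduced here, only the base update.

```python
import random

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=random.Random(0)):
    """One PSO update for a 1-D particle: inertia term plus random pulls
    toward the personal best (pbest) and global best (gbest)."""
    v_new = (w * v
             + c1 * rng.random() * (pbest - x)    # cognitive component
             + c2 * rng.random() * (gbest - x))   # social component
    return x + v_new, v_new

x, v = 0.0, 0.0
x, v = pso_step(x, v, pbest=1.0, gbest=2.0)   # particle moves toward the bests
```

An enhanced variant would monitor particles whose velocity stays near zero far from `gbest` and re-randomize them, which is the kind of local-minimum escape the abstract attributes to EPSO.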
Abstract: In this study, the pedestrian simulation platform
VISWALK is integrated with a program implementing an ant
algorithm to construct a schedule planning model for renovation
engineering. The simulation platform models the construction site
and the users walking through it; after computing the delays users
experience due to the construction, the ant algorithm finds the
schedule plan with the minimum delay time, adds the loss of
business computed from the volume and the deactivated unit area,
and finally selects the best schedule plan by weighing the two
different positions of the owners and the users. To assess and
validate its effectiveness, this study applied the model to a
renovation engineering case on one floor of a shopping mall. The
case verifies that the project schedule planning program proposed by
the model can effectively reduce both the delay time and the mall's
loss of business due to users' walking, minimizing the impact of the
renovation work on the facilities in the building.
Abstract: The Economic Lot Scheduling Problem (ELSP) is a
valuable mathematical model that can support decision-makers in
making scheduling decisions. The basic period approach is effective
for solving the ELSP. The assumption behind the basic period
approach is that a product must be produced at its maximum
production rate. However, a product can be produced at a lower rate
to reduce the average total cost when the facility has extra idle time.
Past research has discussed how a product adjusts its production rate
under the common cycle approach. To the best of our knowledge, no
studies have addressed how a product lowers its production rate
under the basic period approach. This is the first paper to discuss
this topic. The research develops a simple fixed-rate approach that
adjusts the production rate of a product under the basic period
approach to solve the ELSP. Our numerical example shows that our
approach can find a better solution than the traditional basic period
approach. Our mathematical model, which applies the fixed-rate
approach under the basic period approach, can serve as a reference
for other related research.
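The cost effect of lowering the production rate can be seen in the standard EPQ-style average cost per unit time for one product over a cycle of length T: setup A/T plus holding h*(d*T/2)*(1 - d/p). Lowering p toward the demand rate d shrinks the holding term, which is the effect the fixed-rate approach exploits. The formula is the textbook one, not the paper's full model, and the numbers are illustrative.

```python
def avg_cost(T, A, h, d, p):
    """EPQ-style average cost per unit time for one product.
    T: cycle length, A: setup cost, h: holding cost rate,
    d: demand rate, p: production rate (must cover demand)."""
    assert p >= d, "production rate must cover demand"
    return A / T + h * (d * T / 2.0) * (1.0 - d / p)

fast = avg_cost(T=10.0, A=100.0, h=2.0, d=5.0, p=20.0)   # maximum rate
slow = avg_cost(T=10.0, A=100.0, h=2.0, d=5.0, p=10.0)   # lowered rate
```

With spare capacity, producing more slowly keeps inventory lower over the cycle, so the average cost drops even though the setup cost is unchanged.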
Abstract: The main purpose of a system for analyzing and
eliciting public grievances is to receive and process all sorts of
complaints from the public and respond to the users. As the number
of complaints grows, the complaint data becomes big data, which is
difficult to store and process. The proposed system uses HDFS to
store the big data and MapReduce to process it. The concept of a
cache is applied in the system to provide immediate responses and
timely action using big data analytics; the cache-enabled big data
approach improves the response time of the system. The
unstructured data provided by the users are efficiently handled
through the MapReduce algorithm. Complaints are processed in the
order of the hierarchy of authority. The drawbacks of the traditional
database system used in the existing system are addressed by our
system through the cache-enabled Hadoop Distributed File System.
MapReduce framework code can leak sensitive data through the
computation process; we propose a system that adds noise to the
output of the reduce phase to avoid signaling the presence of
sensitive data. If a complaint is not processed in ample time, it is
automatically forwarded to the higher authority, which ensures
accountability in processing. A copy of the filed complaint is sent as
a digitally signed PDF document to the user's mail id, which serves
as proof. The system's reports serve as essential data when making
important decisions based on legislation.
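The idea of perturbing the reduce phase's output so sensitive records are not signaled is usually realized with Laplace noise on the emitted counts. The sketch below is a hedged illustration of that technique, not the paper's mechanism; the scale b and the seed are assumptions.

```python
import math
import random

def noisy_count(true_count, b=1.0, rng=random.Random(42)):
    """Add Laplace(0, b) noise to a count via inverse-CDF sampling, so the
    published value does not reveal the exact count. b is an assumed scale."""
    u = rng.random() - 0.5
    return true_count - b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

# A reducer would emit this perturbed value instead of the exact count 128.
published = noisy_count(128)
```

Choosing b as sensitivity/epsilon would make this the standard differentially private Laplace mechanism; smaller b means less distortion but weaker masking.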
Abstract: In the past decade, the use of digital image correlation
(DIC) techniques has increased significantly in the area of
experimental mechanics, especially for materials behavior
characterization. This non-contact tool enables full field displacement
and strain measurements over a complete region of interest. The DIC
algorithm requires a random contrast pattern on the surface of the
specimen in order to perform properly. To create this pattern, the
specimen is usually first coated using a white matt paint. Next, a
black random speckle pattern is applied using any suitable method. If
the applied paint coating is too thick, its top surface may not be able
to exactly follow the deformation of the specimen, and consequently,
the strain measurement might be underestimated. In the present
article, a study of the influence of the paint thickness on the strain
underestimation is performed for different strain levels. The results
are then compared to typical paint coating thicknesses applied by
experienced DIC users. A slight strain underestimation was observed
for paint coatings thicker than about 30 μm. However, this value
was found to be uncommonly high compared with the coating
thicknesses typically applied by DIC users.