Abstract: Textures are replications, symmetries and
combinations of various basic patterns, usually with some random
variation of the gray-level statistics. This article proposes a
new approach to segmenting texture images. The proposed approach
proceeds in two stages. First, the local texture information of
a pixel is obtained by the fuzzy texture unit, and the global
texture information of an image is obtained by the fuzzy texture
spectrum. The purpose of this paper is to demonstrate the
usefulness of the fuzzy texture spectrum for texture
segmentation. The second stage of the method is devoted to a
decision process, applying a global analysis followed by a fine
segmentation focused only on ambiguous points. The proposed
approach was applied to brain images to identify the components
of the brain, which in turn are used to locate a brain tumor and
estimate its growth rate.
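The texture-unit idea underlying the first stage can be sketched as follows. This is the classical crisp texture unit and texture spectrum (a histogram of texture-unit numbers over the image); the fuzzy variant the abstract refers to would replace the hard three-way comparison with fuzzy membership grades. The function name and the use of NumPy are illustrative, not taken from the paper.

```python
import numpy as np

def texture_unit_spectrum(img):
    """Crisp texture-unit spectrum: each pixel's 3x3 neighbourhood is
    coded as a base-3 number (neighbour <, ==, > centre), and the
    spectrum is the histogram of these codes over the image."""
    H, W = img.shape
    # offsets of the 8 neighbours in a 3x3 window, clockwise
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    spectrum = np.zeros(3 ** 8, dtype=int)
    for r in range(1, H - 1):
        for c in range(1, W - 1):
            v = img[r, c]
            ntu = 0
            for k, (dr, dc) in enumerate(offs):
                n = img[r + dr, c + dc]
                e = 0 if n < v else (1 if n == v else 2)  # ternary code
                ntu += e * 3 ** k
            spectrum[ntu] += 1
    return spectrum
```

On a perfectly uniform image every neighbour equals the centre, so every interior pixel maps to the single texture-unit number whose ternary digits are all 1.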
Abstract: Star graphs are Cayley graphs of symmetric groups of permutations, with transpositions as the generating sets. A star graph is preferred over the hypercube as an interconnection network topology for its ability to connect a greater number of nodes with lower degree. However, an attractive property of the hypercube is that it has a Hamiltonian decomposition, i.e. its edges can be partitioned into disjoint Hamiltonian cycles, and therefore a simple routing can be found in the case of an edge failure. The existence of Hamiltonian cycles in Cayley graphs has been known for some time. So far, there have been no published results on the much stronger condition of the existence of Hamiltonian decompositions. In this paper, we give a construction of a Hamiltonian decomposition of the star graph 5-star of degree 4, by defining an automorphism for the 5-star and a Hamiltonian cycle which is edge-disjoint with its image under that automorphism.
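The object of study can be constructed directly: the 5-star has 120 vertices (the permutations of five symbols) and degree 4, so a Hamiltonian decomposition would split its 240 edges into two edge-disjoint Hamiltonian cycles of 120 edges each. The sketch below only builds the Cayley graph; the automorphism and the Hamiltonian cycle are the paper's contribution and are not reproduced here.

```python
from itertools import permutations

def star_graph(n):
    """Adjacency of the star graph S_n: vertices are permutations of
    {0,...,n-1}; each generator swaps position 0 with position i."""
    verts = list(permutations(range(n)))
    adj = {}
    for v in verts:
        nbrs = []
        for i in range(1, n):
            w = list(v)
            w[0], w[i] = w[i], w[0]   # apply transposition (1 i)
            nbrs.append(tuple(w))
        adj[v] = nbrs
    return adj

adj = star_graph(5)                        # the 5-star from the paper
n_edges = sum(len(nb) for nb in adj.values()) // 2
```

Every generator is an involution, so the graph is undirected and 4-regular, as the abstract states.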
Abstract: Prior to 1975, women in Laos suffered from reduced
power over decision-making in their families and in their
communities. This had a negative impact on their ability to
develop their own identities. Their roles were limited to
responsibility for household activities and preparations for
marriage. Many women lost opportunities to get an education and
to access work outside the home that might have empowered them
to improve their situations. So far, no accurate figures on
either emigrants or return migrants have been compiled, but it
appears that most of them were women, and it was women who most
frequently remitted money home. However, very few recent studies
have addressed the relationship between remittances and the
roles of women in Laos.
This study, therefore, aims at redressing to some extent the
deficiencies in knowledge. Qualitative techniques were used to gather
data, including individual in-depth interviews and direct observation
in combination with the content analysis method. Forty women in
Vientiane Municipality and Savannakhet province were individually
interviewed. It was found that monetary remittances were
typically spent on family security and well-being; on fungible
activities; on economic and business activities; and on
community development, especially concerning hospitality and
providing daily household necessities. Remittances played
important roles in improving many respondents' livelihoods and
positively changed their identities in their families and
communities. Women became empowered as they were able to start
commercial businesses, rather than taking care of (just)
housework, children and elders. Interviews indicated that 92.5%
of the respondents felt their quality of life had improved, 90%
felt happier in their families and 82.5% felt conflicts in their
families were reduced.
Abstract: Transmission network expansion planning (TNEP) is
a basic part of power system planning that determines where, when
and how many new transmission lines should be added to the
network. To date, various methods have been presented to solve
the static transmission network expansion planning (STNEP)
problem, but none of them has considered the network adequacy
restriction. Thus, in this paper, the STNEP problem is studied
considering the network adequacy restriction, using a discrete
particle swarm optimization (DPSO) algorithm. The goal of this
paper is to obtain a configuration for network expansion with
the lowest expansion cost and a specified adequacy. The proposed
idea has been tested on Garver's network and compared with the
decimal codification genetic algorithm (DCGA). The results show
that the network will possess maximum economic efficiency. It is
also shown that the precision and convergence speed of the
proposed DPSO-based method for solving the STNEP problem are
higher than those of the DCGA approach.
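A minimal binary (discrete) PSO of the kind the abstract applies can be sketched as follows. The toy objective, which charges line-construction costs and penalizes configurations with too few lines as a crude adequacy proxy, is a hypothetical stand-in for the full STNEP objective; all parameters are illustrative.

```python
import random, math

def dpso(cost, n_bits, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal binary PSO: positions are bit vectors and a sigmoid
    maps each velocity component to a bit-setting probability."""
    random.seed(1)
    X = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    V = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    pcost = [cost(x) for x in X]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_bits):
                V[i][d] = (w * V[i][d]
                           + c1 * random.random() * (pbest[i][d] - X[i][d])
                           + c2 * random.random() * (gbest[d] - X[i][d]))
                # sigmoid of velocity gives the probability of bit = 1
                X[i][d] = 1 if random.random() < 1 / (1 + math.exp(-V[i][d])) else 0
            c = cost(X[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = X[i][:], c
                if c < gcost:
                    gbest, gcost = X[i][:], c
    return gbest, gcost

# toy stand-in for the TNEP objective: construction costs of six
# candidate lines plus a penalty when fewer than 3 lines are built
line_cost = [10, 7, 12, 5, 9, 8]
def toy_cost(x):
    built = sum(x)
    return sum(c for c, b in zip(line_cost, x) if b) + (0 if built >= 3 else 1000)

best, best_cost = dpso(toy_cost, len(line_cost))
```

The same bit-vector encoding (one bit per candidate line) is what makes the problem discrete rather than continuous PSO.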
Abstract: In this paper we propose a multistage adaptive
ARQ/HARQ/HARQ scheme. This method combines a pure ARQ
(Automatic Repeat reQuest) mode at low channel bit error rates
with hybrid ARQ modes using two different Reed-Solomon codes
under medium and high error-rate conditions; our scheme
therefore has three stages. The main goal is to increase the
number of states in adaptive HARQ methods and thereby achieve
maximum throughput at every channel bit error rate. We verify
the proposal by calculation and then by simulations in a land
mobile satellite channel environment. Optimization of the
scheme's system parameters is described in order to maximize the
throughput over the whole defined Signal-to-Noise Ratio (SNR)
range in the selected channel environment.
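The mode-selection logic can be sketched with textbook throughput formulas: pure ARQ succeeds only if the whole block is error-free, while a type-I HARQ mode with an RS(n,k) code trades rate k/n for the ability to correct up to (n-k)/2 symbol errors. The block length and code parameters below are illustrative assumptions, not the codes used in the paper.

```python
from math import comb

def arq_throughput(p, n=2040):
    """Pure ARQ over an n-bit (here 255-byte) block: the block gets
    through only if all n bits are error-free."""
    return (1 - p) ** n

def rs_harq_throughput(p, n=255, k=223):
    """Type-I HARQ with an RS(n,k) code over GF(256)."""
    ps = 1 - (1 - p) ** 8            # symbol (byte) error probability
    t = (n - k) // 2                 # correctable symbol errors
    p_dec = sum(comb(n, i) * ps ** i * (1 - ps) ** (n - i)
                for i in range(t + 1))
    return (k / n) * p_dec

def adaptive_throughput(p):
    """Three-stage scheme: pick the best of pure ARQ and two RS-coded
    HARQ modes (a weak high-rate code and a strong low-rate code)."""
    modes = [arq_throughput(p),
             rs_harq_throughput(p, 255, 239),   # weak code, high rate
             rs_harq_throughput(p, 255, 191)]   # strong code, low rate
    return max(modes)
```

At very low BER the uncoded ARQ mode wins (no rate loss), while as the BER rises the coded modes take over, which is exactly the switching behaviour the abstract describes.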
Abstract: Heat pipes are used to manage thermal problems in
electronic cooling. Dissipating heat to a heat sink is
especially difficult in space compared with on Earth. To address
this problem, in this study the Poiseuille (Po) number, which is
the main measure of the performance of a heat pipe, is studied
by CFD; the heat pipe performance is then verified against
experimental results. A heat pipe is then fabricated for a space
environment, and an in-house code
is developed. Further, a heat pipe subsystem, which consists of a heat
pipe, MLI (Multi Layer Insulator), SSM (Second Surface Mirror), and
radiator, is tested and correlated with the TMM (Thermal
Mathematical Model) through a commercial code. The correlation
results satisfy the 3K requirement, and the generated thermal
model is verified for application to a space environment.
Abstract: Bendability is constrained by the maximum top-roller
load-imparting capacity of the machine. The maximum load is
encountered during the edge pre-bending stage of roller bending.
The capacity of a 3-roller plate bending machine is specified by
the maximum thickness and minimum shell diameter combinations
that can be pre-bent for a given plate material of maximum
width.
The commercially available plate width, or the width of plate
that can be accommodated on the machine, decides the maximum
rolling width. Original equipment manufacturers (OEMs) provide
the machine capacity chart based on a reference material,
assuming a perfectly plastic material model. The reported work
presents a bendability analysis of a heavy-duty 3-roller plate
bending machine. The input variables for the industry are plate
thickness, shell diameter and material property parameters, as
these are fixed by the design. Analytical models of equivalent
thickness, equivalent width and maximum width based on a
power-law material model were derived to study the bendability.
The equation for maximum width gives the bendability of a
designed configuration, i.e. the material property, shell
diameter and thickness combinations within the machine
limitations. Equivalent thicknesses based on the perfectly
plastic and power-law material models were compared for four
different material grades of C-Mn steel in order to predict the
bendability. The effect of top-roller offset on the bendability
at the maximum top-roller load-imparting capacity is also
reported.
Abstract: There are several existing Java benchmarks, application benchmarks and micro benchmarks as well as mixtures of both, such as Java Grande, Spec98, CaffeMark and HBech. But none of them deals with the behavior of multitasking operating systems. As a result, the outputs obtained are not satisfactory for performance evaluation engineers. The behavior of a multitasking operating system depends on the scheduling policy the system employs: different processes can have different priorities when sharing the same resources. Time measured from when an application starts to when it finishes does not reflect the actual time the system needs to run the program, so a new approach to this problem is needed. In this paper we therefore present a new Java benchmark, named the FHOJ benchmark, which directly addresses the multitasking behavior of a system. Our study shows that in some cases results from the FHOJ benchmark are far more reliable than those from some existing Java benchmarks.
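The timing flaw described here, measuring wall-clock time from start to finish and thereby including time the scheduler gives to other processes, can be illustrated in a few lines. This Python sketch is not part of FHOJ (whose code is not given here); it only contrasts the two clocks the abstract distinguishes.

```python
import time

def busy(n):
    """A small CPU-bound workload: sum of squares below n."""
    s = 0
    for i in range(n):
        s += i * i
    return s

wall0, cpu0 = time.perf_counter(), time.process_time()
busy(200_000)
wall = time.perf_counter() - wall0   # elapsed wall-clock time, which
                                     # includes time spent in other tasks
cpu = time.process_time() - cpu0     # CPU time consumed by this process only
```

On a loaded multitasking system `wall` can greatly exceed `cpu`, which is why start-to-finish measurement misrepresents the cost of the benchmarked program.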
Abstract: In this work, we consider an application of neural networks in the LD converter. This approach enables reliable prediction of the steel temperature and reduces the reblow ratio in the steelworks. A conventional model has been applied to charge calculation, but the results obtained by that technique are not always good, owing to the complexity of the process. The difficulties are mainly caused by noisy measurements and the nonlinearities of the process. Artificial Neural Networks (ANNs) have become a powerful tool for such complex applications. A backpropagation algorithm is used to train the neural networks, and the ANN is used to predict the steel bath temperature in the oxygen converter process at the end condition. The model has 11 input process variables and one output. The model was tested in the steelworks, and the results obtained by the neural approach are better than those of the conventional model.
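As a sketch of the approach, the following trains a minimal one-hidden-layer network by backpropagation on a toy regression. The 11 inputs and single output echo the abstract's model, but the hidden width, learning rate and data are illustrative assumptions, not the paper's configuration.

```python
import random, math

random.seed(0)
N_IN, N_HID = 11, 4     # 11 process variables in, one temperature out
w1 = [[random.uniform(-0.5, 0.5) for _ in range(N_IN)] for _ in range(N_HID)]
w2 = [random.uniform(-0.5, 0.5) for _ in range(N_HID)]

def forward(x):
    """Hidden tanh layer followed by a linear output unit."""
    h = [math.tanh(sum(wi * xi for wi, xi in zip(row, x))) for row in w1]
    return sum(w * hi for w, hi in zip(w2, h)), h

def train_step(x, y, lr=0.05):
    """One stochastic gradient step on the squared error."""
    out, h = forward(x)
    err = out - y
    for j in range(N_HID):
        grad_h = err * w2[j] * (1 - h[j] ** 2)   # backprop through tanh
        for i in range(N_IN):
            w1[j][i] -= lr * grad_h * x[i]
        w2[j] -= lr * err * h[j]
    return 0.5 * err ** 2

# toy data: the normalized target is a fixed combination of the inputs
xs = [[random.uniform(-1, 1) for _ in range(N_IN)] for _ in range(50)]
data = [(x, 0.3 * x[0] - 0.2 * x[5] + 0.1 * x[10]) for x in xs]
losses = [sum(train_step(x, y) for x, y in data) for _ in range(200)]
```

The decreasing epoch loss is the only claim this sketch makes; a real end-point temperature model would be trained on plant measurements.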
Abstract: This paper presents the averaging model of a buck
converter derived from the generalized state-space averaging method.
The sliding mode control is used to regulate the output voltage of the
converter and is taken into account in the model. The proposed
model requires far less computational time than the full
topology model. Intensive time-domain simulations of the exact
topology model are used as the reference. The results show
that a good agreement between the proposed model and the switching
model is achieved in both transient and steady-state responses. The
reported model is suitable for the optimal controller design by using
the artificial intelligence techniques.
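The averaging idea can be sketched numerically: replacing the switch by its duty cycle d gives L di/dt = d*Vin - v and C dv/dt = i - v/R, which can be integrated directly. The component values below are illustrative, not the paper's, and the sliding-mode controller is omitted; in steady state the averaged output settles near d*Vin.

```python
# Averaged model of a buck converter, integrated with forward Euler.
# L, C, R, Vin are illustrative values, not taken from the paper.
L, C, R, Vin = 1e-3, 100e-6, 10.0, 24.0

def simulate_avg(duty, t_end=0.05, dt=1e-6):
    """Integrate the duty-cycle-averaged buck dynamics and return the
    output voltage at t_end."""
    i, v, t = 0.0, 0.0, 0.0
    while t < t_end:
        di = (duty * Vin - v) / L       # inductor current dynamics
        dv = (i - v / R) / C            # capacitor voltage dynamics
        i += di * dt
        v += dv * dt
        t += dt
    return v

v_ss = simulate_avg(0.5)    # steady state approaches duty * Vin
```

Because the switching ripple is averaged away, the model can use time steps far larger than the switching period, which is the computational saving the abstract claims.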
Abstract: Safety instrumented systems (SISs) are becoming
increasingly complex and the proportion of programmable electronic
parts is growing. The IEC 61508 global standard was established to
ensure the functional safety of SISs, but it was expressed in highly
macroscopic terms. This study introduces an evaluation process for
hardware safety integrity levels through failure modes, effects, and
diagnostic analysis (FMEDA). FMEDA is widely used to evaluate
safety levels, and it provides the information on failure rates and
failure mode distributions necessary to calculate a diagnostic coverage
factor for a given component. In our evaluation process, the
components of the SIS subsystem are first defined in terms of failure
modes and effects. Then, the failure rate and failure mechanism
distribution are assigned to each component. The safety mode and
detectability of each failure mode are determined for each component.
Finally, the hardware safety integrity level is evaluated based on the
calculated results.
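The final calculation step can be sketched as follows: from the four FMEDA failure-rate categories one obtains the safe failure fraction (SFF) and diagnostic coverage, and the SFF is mapped to a hardware safety integrity level through the IEC 61508 architectural-constraint tables. The failure rates and the type-B, HFT = 0 table reading below are illustrative assumptions, not values from the study.

```python
def fmeda_metrics(l_sd, l_su, l_dd, l_du):
    """Safe failure fraction and diagnostic coverage from the four
    FMEDA categories (safe/dangerous x detected/undetected rates)."""
    total = l_sd + l_su + l_dd + l_du
    sff = (l_sd + l_su + l_dd) / total    # all but dangerous undetected
    dc = l_dd / (l_dd + l_du)             # coverage of dangerous failures
    return sff, dc

def sil_type_b_hft0(sff):
    """Architectural-constraint SIL for a type-B element with no
    hardware fault tolerance (illustrative reading of IEC 61508-2)."""
    if sff < 0.60:
        return 0          # not allowed for safety functions
    if sff < 0.90:
        return 1
    if sff < 0.99:
        return 2
    return 3

# hypothetical failure rates in failures per hour
sff, dc = fmeda_metrics(l_sd=100e-9, l_su=50e-9, l_dd=40e-9, l_du=10e-9)
```

Here the FMEDA-derived diagnostic coverage of 80% yields an SFF of 95%, which this table reading places at SIL 2.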
Abstract: While OCD is one of the most commonly occurring
psychiatric conditions experienced by older adults, there is a paucity
of research conducted into the treatment of older adults with OCD.
This case study represents the first published investigation of a
cognitive treatment for geriatric OCD. It describes the successful
treatment of an 86-year-old man with a 63-year history of OCD
using Danger Ideation Reduction Therapy (DIRT). The client
received 14 individual 50-minute treatment sessions of DIRT over
13 weeks. Clinician-rated Y-BOCS scores decreased by 84%, from
25 (severe) at pre-treatment to 4 (subclinical) at the 6-month
post-treatment follow-up interview, demonstrating the efficacy
of DIRT for this client. DIRT may have particular advantages
over exposure and response prevention (ERP) and pharmacological
approaches; however, further research is required in older
adults with OCD.
Abstract: Distant-talking voice-based HCI systems suffer from
performance degradation due to the mismatch between the acoustic
speech (runtime) and the acoustic model (training). The mismatch is
caused by the change in the power of the speech signal as observed at
the microphones. This change is greatly influenced by the change in
distance, affecting speech dynamics inside the room before reaching
the microphones. Moreover, as the speech signal is reflected, its
acoustical characteristic is also altered by the room properties. In
general, power mismatch due to distance is a complex problem. This
paper presents a novel approach in dealing with distance-induced
mismatch by intelligently sensing instantaneous voice power variation
and compensating model parameters. First, the distant-talking speech
signal is processed through microphone array processing, and the
corresponding distance information is extracted.
Distance-sensitive Gaussian Mixture Models (GMMs), pre-trained
to capture both speech power and room properties, are used to
predict the optimal distance of the speech source. Consequently,
pre-computed statistical priors corresponding to the optimal
distance are selected to correct the statistics of the generic
model, which was frozen during training. Thus, the combined
model parameters are post-conditioned to match the power of the
instantaneous speech acoustics at runtime. This results in an
improved likelihood of predicting the correct speech command at
farther distances. We experiment using real data recorded inside two
rooms. Experimental evaluation shows voice recognition performance
using our method is more robust to the change in distance compared
to the conventional approach. In our experiment, under the most
acoustically challenging environment (i.e., Room 2: 2.5 meters), our
method achieved 24.2% improvement in recognition performance
against the best-performing conventional method.
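The model-selection step, choosing the pre-trained distance model that best explains the observed feature, can be sketched with single diagonal Gaussians standing in for the paper's GMMs. The 2-D (power, reverberation) features, distances and model parameters are hypothetical.

```python
import math

def diag_gauss_loglik(x, mean, var):
    """Log-likelihood of vector x under a diagonal Gaussian."""
    return sum(-0.5 * (math.log(2 * math.pi * v) + (xi - m) ** 2 / v)
               for xi, m, v in zip(x, mean, var))

def predict_distance(feature, models):
    """Return the distance whose pre-trained model gives the observed
    feature the highest likelihood."""
    return max(models, key=lambda d: diag_gauss_loglik(feature, *models[d]))

# hypothetical per-distance models: (mean, variance) of a 2-D
# (normalized power, reverberation) feature
models = {
    0.5: ([0.9, 0.2], [0.01, 0.01]),   # near: high power, low reverb
    2.5: ([0.3, 0.7], [0.01, 0.01]),   # far: low power, high reverb
}
```

Once the best-matching distance is found, the corresponding pre-computed priors would be used to correct the generic acoustic model, as the abstract describes.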
Abstract: Nowadays, with the emergence of new applications such
as robot control in image processing, artificial vision for
visual servoing is a rapidly growing discipline, and
human-machine interaction plays a significant role in
controlling the robot. This paper presents a new algorithm based
on spatio-temporal volumes for visual servoing that aims to
control robots. In this algorithm, after the necessary
pre-processing of the video frames, a spatio-temporal volume is
constructed for each gesture and a feature vector is extracted.
These volumes are then analyzed for matching in two consecutive
stages. For hand gesture recognition and classification we
tested different classifiers, including k-nearest neighbor,
learning vector quantization and back-propagation neural
networks. We tested the proposed algorithm on the collected data
set, and the results showed a correct gesture recognition rate
of 99.58 percent. We also tested the algorithm on noisy images,
where it achieved a correct recognition rate of 97.92 percent.
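Of the classifiers compared, k-nearest neighbour is the simplest to sketch. The feature vectors and gesture labels below are hypothetical stand-ins for the spatio-temporal features the algorithm extracts.

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Majority vote among the k nearest training examples;
    `train` is a list of (feature_vector, label) pairs."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = sorted(train, key=lambda tv: dist(tv[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# toy 2-D feature vectors for two hypothetical gestures
train = [((0.10, 0.20), "swipe"), ((0.20, 0.10), "swipe"),
         ((0.15, 0.15), "swipe"),
         ((0.90, 0.80), "wave"), ((0.80, 0.90), "wave"),
         ((0.85, 0.85), "wave")]
```

The same interface would apply to the higher-dimensional feature vectors extracted from the spatio-temporal volumes.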
Abstract: Transesterified vegetable oils (biodiesel) are a promising alternative fuel for diesel engines. Used vegetable oils are disposed of by restaurants in large quantities, but their higher viscosity restricts their direct use in diesel engines. In this study, used cooking oil was dehydrated and then transesterified using an alkaline catalyst. The combustion, performance and emission characteristics of Used Cooking oil Methyl Ester (UCME) and its blends with diesel oil are analysed in a direct injection C.I. engine. The fuel properties and the combustion characteristics of UCME are found to be similar to those of diesel. A minor decrease in thermal efficiency, with a significant reduction in particulates, carbon monoxide and unburnt hydrocarbons, is observed compared to diesel. The use of transesterified used cooking oil and its blends as fuel for diesel engines will reduce dependence on fossil fuels and also considerably decrease environmental pollution.
Abstract: In this work we present, to our knowledge for the first time, an efficient digital watermarking scheme for MPEG audio layer 3 files that operates directly in the compressed data domain, manipulating the time and subband/channel domain. In addition, it does not need the original signal to detect the watermark. Our scheme was implemented with special care for the efficient usage of the two limited resources of computer systems: time and space. It offers the industrial user watermark embedding and detection in time comparable to the real playing time of the original audio file, depending on the MPEG compression, while the end user/audience does not perceive any artifacts or delays when hearing the watermarked audio file. Furthermore, it overcomes the vulnerability to compression/recompression attacks of algorithms operating in the PCM data domain, as it places the watermark in the scale-factor domain and not in the digitized sound data. The strength of our scheme, which allows it to be used successfully for both authentication and copyright protection, relies on the fact that ownership is established not simply by detecting the bit pattern that comprises the watermark itself, but by showing that the legal owner knows a hard-to-compute property of the watermark.
Abstract: This paper presents an efficient method for obtaining a straight-line motion in the tool configuration space between two specified points using an articulated robot. The simulation and implementation results show the effectiveness of the method.
Abstract: Statistical process control (SPC) is one of the most powerful tools developed to assist in effective quality control; it involves collecting, organizing and interpreting data during production. This article aims to show how industries can use SPC to control and continuously improve product quality through monitoring of production, detecting deviations of the parameters representing the process and thereby reducing the amount of off-specification product and the costs of production. This study also conducted a technological forecast in order to characterize the research being done on SPC. The survey was conducted in the Espacenet and WIPO databases and the database of the National Institute of Industrial Property (INPI). The United States is among the largest depositors, together with deposits filed via the PCT, and the classification section appearing in greatest abundance was section F.
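The monitoring that SPC performs can be sketched with a minimal individuals control chart: estimate the process mean and spread from an in-control run, set 3-sigma limits, and flag points outside them as possible process deviations. The sample measurements below are hypothetical.

```python
import statistics

def control_limits(samples):
    """Centre line and 3-sigma limits for an individuals chart,
    estimated from an in-control baseline run."""
    mean = statistics.fmean(samples)
    sd = statistics.stdev(samples)
    return mean - 3 * sd, mean, mean + 3 * sd

def out_of_control(samples, lcl, ucl):
    """Indices of points outside the limits, signalling a possible
    deviation of the process parameters."""
    return [i for i, x in enumerate(samples) if not (lcl <= x <= ucl)]

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]  # in-control run
lcl, centre, ucl = control_limits(baseline)
```

Flagged points prompt investigation before off-specification product accumulates, which is the cost-reduction mechanism the abstract describes.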
Abstract: The facility location problem involves locating a
facility to optimize some performance measure. The location of a
public facility that serves the community, such as a fire
station, significantly affects its service quality. The main
objective in locating a fire station is to minimize the response
time, which is the time between receiving a call and reaching
the place of the incident. In metropolitan areas, fire vehicles
need to cross highways and other traffic obstacles through
obstacle-overcoming points, which delay the response time. In
this paper, the fire station location problem is analyzed.
Simulation models are developed for location problems that
involve obstacles. Particular case problems are analyzed and the
results are presented.
Abstract: One of the essential requirements of a realistic
surgical simulator is to reproduce haptic sensations due to the
interactions in the virtual environment. However, the
interaction needs to be performed in real time, since a delay
between the user action and the system reaction reduces the
sensation of immersion. In this paper, a prototype of a coronary
stent implant simulator is presented;
this system allows real-time interactions with an artery by means of a
specific haptic device. To improve the realism of the
simulation, the virtual environment is built from real patients'
images, and a Web Portal is used to search geographically remote
medical centres for a virtual environment with specific features
in terms of pathology or anatomy. The functional architecture of
the system defines several Medical Centres that store virtual
environments built from real patients' images, together with
related metadata describing their pathological and anatomical
features. The selected data are downloaded from the Medical
Centre to a Training Centre equipped with a specific haptic
device and with the software necessary to manage the interaction
in the virtual environment.
After the integration of the virtual environment in the simulation
system it is possible to perform training on the specific surgical
procedure.