Abstract: The vertex connectivity of a graph is the smallest number of vertices whose deletion disconnects the graph or makes it trivial. This work addresses the problem of testing the vertex connectivity of graphs in a distributed environment, based on a general and constructive approach. The contribution of this paper is threefold. First, using a pre-constructed spanning tree of the considered graph, we present a protocol to test whether a given graph is 2-connected using only local knowledge. Second, we present an encoding of this protocol using graph relabeling systems. The last contribution is the implementation of this protocol in the message-passing model. For a given graph G, where M is the number of its edges, N the number of its nodes and Δ its maximum degree, our algorithms have the following requirements: the first uses O(Δ×N²) steps and O(Δ×log Δ) bits per node; the second uses O(Δ×N²) messages, O(N²) time and O(Δ×log Δ) bits per node. Furthermore, the studied network is semi-anonymous: only the root of the pre-constructed spanning tree needs to be identified.
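For reference, 2-connectivity can be decided sequentially by searching for articulation points with a depth-first search: a graph is 2-connected exactly when it is connected and no such vertex exists. The sketch below shows only this classical centralized check, not the paper's distributed protocol; the adjacency-dict representation is an assumption for illustration.

```python
import sys

def is_biconnected(adj):
    """adj: dict vertex -> list of neighbours (undirected graph).
    Returns True iff the graph is connected and has no articulation point.
    Classical sequential check, not the paper's distributed protocol."""
    sys.setrecursionlimit(10000)
    start = next(iter(adj))
    disc, low = {}, {}
    counter = [0]
    articulation = [False]

    def dfs(u, parent):
        disc[u] = low[u] = counter[0]
        counter[0] += 1
        children = 0
        for v in adj[u]:
            if v not in disc:
                children += 1
                dfs(v, u)
                low[u] = min(low[u], low[v])
                # a non-root u is an articulation point if some DFS child
                # cannot reach above u through a back edge
                if parent is not None and low[v] >= disc[u]:
                    articulation[0] = True
            elif v != parent:
                low[u] = min(low[u], disc[v])
        # the root is an articulation point iff it has >= 2 DFS children
        if parent is None and children >= 2:
            articulation[0] = True

    dfs(start, None)
    return len(disc) == len(adj) and not articulation[0]
```

On a cycle the check succeeds, while on a path the middle vertex is an articulation point and the check fails.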
Abstract: A wireless sensor network (WSN), an emerging technology for procuring and processing information, consists of autonomous nodes with versatile devices underpinned by applications. Nodes are equipped with different capabilities, such as sensing, computing, actuation and wireless communication, based on application requirements. WSN applications range from military deployment on the battlefield and environmental monitoring to the health sector and emergency-response surveillance. The nodes are deployed independently to cooperatively monitor physical and environmental conditions. The architecture of a WSN differs based on the application requirements, with a focus on low cost, flexibility, fault-tolerance capability, ease of deployment and energy conservation. In this paper we present the characteristics, architectural design objectives and architecture of WSNs.
Abstract: Considering the complexity of products and the new geometrical
design and investment tolerances that are necessary, measurement and
dimensional control call for modern and more precise methods. The
photo-digitizing method, which uses two cameras to record pictures,
together with the conventional method known as "point clouds" and data
analysis using the ATOUS software, is known as modern and
efficient in this context. In this paper, the benefits of the photo-digitizing
method in evaluating samples of machining processes
are put forward. For example, the assessment of geometrical
surface integrity in a 5-axis milling process and the measurement of
carbide tool wear in a turning process can be brought forward.
The advantages of this method compared to conventional methods are
presented.
Abstract: In this paper we present a novel design of a wearable
electronic textile. After defining a specific application, we used the
specifications of some low-power, tiny elements, including sensors,
microcontrollers and transceivers, and a fault-tolerant special topology to
achieve the highest reliability as well as low power consumption and
a longer lifetime. We considered two different conditions,
normal and bodily critical, and set priorities for using
different sensors in the various conditions to obtain a longer effective
lifetime.
Abstract: The objective of this work which is based on the
approach of simultaneous engineering is to contribute to the
development of a CIM tool for the synthesis of functional design
dimensions expressed by average values and tolerance intervals. In
this paper, the dispersions method known as the Δl method which
proved reliable in the simulation of manufacturing dimensions is
used to develop a methodology for the automation of the simulation.
This methodology is constructed around three procedures. The first
procedure executes the verification of the functional requirements by
automatically extracting the functional dimension chains in the
mechanical sub-assembly. Then a second procedure performs an
optimization of the dispersions on the basis of unknown variables.
The third procedure uses the optimized values of the dispersions to
compute the optimized average values and tolerances of the
functional dimensions in the chains. A statistical and cost based
approach is integrated in the methodology in order to take account of
the capabilities of the manufacturing processes and to distribute
optimal values among the individual components of the chains.
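As a minimal illustration of what a functional dimension chain computes (under a worst-case model, not the optimized statistical distribution of the third procedure), the chain's average value is the signed sum of the link nominals and its tolerance is the sum of the link dispersions. The function name and example values below are hypothetical.

```python
# Illustrative worst-case stack-up of a functional dimension chain:
# the chain nominal is the signed sum of the link nominals, and the
# chain half-tolerance is the sum of the link half-tolerances
# (dispersions). Names and values here are hypothetical; the Δl method
# additionally optimizes how the tolerance is distributed among links.
def stack_chain(links):
    """links: list of (nominal, half_tolerance, sign) with sign = +1/-1."""
    nominal = sum(sign * nom for nom, tol, sign in links)
    half_tol = sum(tol for _, tol, _ in links)  # worst case: tolerances add
    return nominal, half_tol

# Example: closing dimension of a hypothetical 3-link chain (mm)
nom, tol = stack_chain([(50.0, 0.1, +1), (20.0, 0.05, -1), (10.0, 0.05, -1)])
# nominal 20.0 mm with a +/-0.2 mm worst-case interval
```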
Abstract: Optical character recognition of cursive scripts
presents a number of challenging problems in both segmentation and
recognition processes in different languages, including Persian. In
order to overcome these problems, we use a newly developed Persian
word segmentation method together with a recognition-based
segmentation technique. This method is robust as well as flexible,
and it also increases the system's tolerance to font variations. The
implementation results of this method on a comprehensive database
show a high degree of accuracy which meets the requirements for
commercial use. Extended with suitable pre- and post-processing, the
method offers a simple and fast framework for developing a full OCR
system.
Abstract: The Institute of Product Development is dealing
with the development, design and dimensioning of micro components
and systems as a member of the Collaborative Research
Centre 499 “Design, Production and Quality Assurance of
Molded micro components made of Metallic and Ceramic Materials".
Because of technological restrictions in the miniaturization
of conventional manufacturing techniques, shape and
material deviations cannot be scaled down in the same proportion
as the micro parts, rendering components with relatively
wide tolerance fields. Systems that include such components
should be designed with this particularity in mind, often requiring
large clearances. In the end, the output of such systems
turns out to be variable and prone to dynamic instability. To save
production time and resources, every study of these effects
should happen early in the product development process and
be based on computer simulation to avoid costly prototypes. A
suitable method is proposed here and applied, by way of example, to a
micro technology demonstrator developed by the CRC499. It
consists of a one-stage planetary gear train in a sun-planet-ring
configuration, with input through the sun gear and output
through the carrier. The simulation procedure relies on ordinary
Multi Body Simulation methods and subsequently adds
other techniques to further investigate details of the system's
behavior and to predict its response. The selection of the relevant
parameters and output functions followed the engineering
standards for regular sized gear trains. The first step is to
quantify the variability and to reveal the most critical points of
the system, performed through a whole-mechanism Sensitivity
Analysis. Due to the lack of previous knowledge about the system's
behavior, different DOE methods involving small and
large numbers of experiments were selected to perform the SA.
In this particular case the parameter space can be divided into
two well-defined groups, one of them containing the gear's profile
information and the other the components' spatial location.
This has been exploited to explore the different DOE techniques
more promptly. A reduced set of parameters is derived for
further investigation and to feed the final optimization process,
whether as optimization parameters or as an external perturbation
collective. The 10 most relevant perturbation factors and 4 to 6
prospective variable parameters are considered in a new, simplified
model. All of the parameters are affected by the mentioned
production variability. The objective functions of interest
are based on variability measures of scalar outputs, so the
problem becomes an optimization under robustness and reliability
constraints. The study shows an initial step on the development
path of a method to design and optimize complex micro
mechanisms composed of widely toleranced elements, accounting
for the robustness and reliability of the systems' output.
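The screening step described above can be illustrated with the simplest DOE variant, a one-at-a-time (OAT) perturbation around nominal parameter values. The toy model function and parameter names below are invented and merely stand in for the multi-body simulation outputs of the study.

```python
# Hedged illustration of a one-at-a-time (OAT) sensitivity screening,
# one of the simpler DOE techniques. The model function and parameter
# names are hypothetical stand-ins for the multi-body simulation.
def oat_sensitivity(model, nominal, deltas):
    """Perturb each parameter by +/- its delta and report the output spread."""
    base = model(nominal)
    spread = {}
    for name, d in deltas.items():
        hi = dict(nominal); hi[name] += d
        lo = dict(nominal); lo[name] -= d
        spread[name] = abs(model(hi) - model(lo))
    return base, spread

# Toy "transmission error" model: more sensitive to the (hypothetical)
# profile error than to the center offset
model = lambda p: 3.0 * p["profile_err"] + 0.5 * p["center_offset"]
base, s = oat_sensitivity(model,
                          {"profile_err": 0.0, "center_offset": 0.0},
                          {"profile_err": 0.01, "center_offset": 0.01})
# the spread ranks profile_err above center_offset
```

A real screening would rank all production-affected parameters this way and keep only the most critical ones for the optimization stage.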
Abstract: With the intention of screening for heavy metal
tolerance, a number of bacteria were isolated and characterized from
a pristine soil. Two Gram positive isolates were identified as
Paenibacillus sp. and Bacillus thuringiensis. Tolerance of Cd2+, Cu2+
and Zn2+ by these bacteria was studied, and it was found that both
bacteria were highly sensitive to Cu2+ compared to the other two
metals. Both bacteria showed the same pattern of metal tolerance, in
the order Zn2+ > Cd2+ > Cu2+. When the metal tolerance of the two
bacteria was compared, Paenibacillus sp. showed the highest
sensitivity to Cu2+, whereas B. thuringiensis showed the highest
sensitivity to Cd2+ and Zn2+. These findings reveal the potential of
Paenibacillus sp. for developing a biosensor to detect Cu2+ in
environmental samples.
Abstract: In this paper we address the issue of classifying the fluorescent intensity of a sample in Indirect Immuno-Fluorescence (IIF). Since IIF is a subjective, semi-quantitative test by its very nature, we discuss a strategy to reliably label the image data set by using the diagnoses performed by different physicians. Then, we discuss image pre-processing, feature extraction and selection. Finally, we propose two ANN-based classifiers that can separate intrinsically dubious samples and whose error tolerance can be flexibly set. Measured performance shows error rates of less than 1%, which makes the method a candidate for use in daily medical practice, either to perform pre-selection of cases to be examined or to act as a second reader.
Abstract: The present work analyses different parameters of pressure die casting to minimize casting defects. Pressure die casting is usually applied for casting aluminium alloys. A good surface finish with the required tolerances and dimensional accuracy can be achieved by optimization of controllable process parameters such as solidification time, molten-metal temperature, filling time, injection pressure and plunger velocity. Moreover, by selection of optimum process parameters, pressure die casting defects such as porosity, insufficient spread of molten material, flash etc. are also minimized. Therefore, a pressure die casting component, a carburetor housing of aluminium alloy (Al2Si2O5), has been considered. The effects of the selected process parameters on casting defects and the subsequent setting of parameters at appropriate levels have been accomplished by Taguchi's parameter design approach. The experiments have been performed as per the combinations of levels of different process parameters suggested by the L18 orthogonal array. Analyses of variance have been performed for the mean and the signal-to-noise ratio to estimate the percentage contribution of the different process parameters. A confidence interval has also been estimated for a 95% confidence level, and three confirmation experiments have been performed to validate the optimum levels of the different parameters. An overall 2.352% reduction in defects has been observed with the suggested optimum process parameters.
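For a defect count, Taguchi's parameter design uses the smaller-the-better signal-to-noise ratio, SN = -10·log10(Σy²/n), computed per trial of the orthogonal array. The replicate values in this sketch are hypothetical, not the paper's measurements.

```python
import math

# Smaller-the-better signal-to-noise ratio from Taguchi's parameter
# design (defect counts are to be minimized): SN = -10*log10(mean(y^2)).
# The sample data below are invented for illustration.
def sn_smaller_the_better(ys):
    return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))

# Replicated defect measurements for one hypothetical L18 trial
sn = sn_smaller_the_better([2.0, 2.5, 1.8])
```

Trials with a higher (less negative) SN ratio are more robust; ANOVA on the SN values per factor level then gives each parameter's percentage contribution.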
Abstract: Large-scale systems such as the computational Grid are
distributed computing infrastructures that can provide globally
available network resources. The evolution of information-processing
systems in Data Grids is characterized by a strong decentralization of
data across several sites, whose objective is to ensure the availability
and reliability of the data and thereby provide fault tolerance
and scalability, which can only be achieved through replication
techniques. Unfortunately, the use of these techniques
has a high cost, because it is necessary to maintain consistency
between the distributed data. Nevertheless, agreeing to live with
certain imperfections can improve the performance of the system by
improving concurrency. In this paper, we propose a multi-layer protocol
combining pessimistic and optimistic approaches, conceived
for data consistency maintenance in large-scale systems. Our
approach is based on a hierarchical representation model with three
layers, whose objective is twofold: first, it makes
it possible to reduce response times compared to a completely
pessimistic approach, and second, it improves the quality
of service compared to an optimistic approach.
Abstract: This paper presents the applicability of artificial
neural networks to 24-hour-ahead solar power generation forecasting
for a 20 kW photovoltaic system; the developed forecasting is suitable
for reliable Microgrid energy management. In total, four neural
networks were proposed, namely: multi-layer perceptron, radial
basis function, recurrent, and a neural network ensemble
consisting of bagged networks. The forecasting reliability of the
proposed neural networks was evaluated in terms of forecasting error
performance based on statistical and graphical methods. The
experimental results showed that all the proposed networks achieved
an acceptable forecasting accuracy. In terms of comparison, the neural
network ensemble gives the highest forecasting precision compared
to the conventional networks. In fact, each network of the ensemble
over-fits to some extent, and this leads to a diversity which enhances
the noise tolerance and the forecasting generalization performance
compared to the conventional networks.
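The bagging idea behind the ensemble can be sketched independently of any particular network type: each member is fit on a bootstrap resample of the training data and the forecasts are averaged. Here a trivial mean predictor stands in for the neural networks, and the data values are hypothetical kW readings.

```python
import random

# Minimal sketch of an ensemble of bagged regressors: each member is
# trained on a bootstrap resample and the forecast is their average.
# A trivial mean predictor stands in for the paper's neural networks;
# the power readings are hypothetical.
def bootstrap(data):
    return [random.choice(data) for _ in data]

def train_mean_model(sample):
    mu = sum(sample) / len(sample)
    return lambda: mu            # "network" that always forecasts mu

def bagged_forecast(data, n_members=10):
    members = [train_mean_model(bootstrap(data)) for _ in range(n_members)]
    return sum(m() for m in members) / n_members  # ensemble average

random.seed(0)
power = [12.1, 13.4, 11.8, 12.9, 13.0, 12.5]   # hypothetical kW readings
forecast = bagged_forecast(power)
```

The diversity the abstract mentions comes from the resampling: each member sees slightly different data, so their individual over-fitting tends to average out.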
Abstract: Electronic products that achieve high levels of integrated communications, computing and entertainment multimedia features in small, stylish and robust new form factors are winning in the marketplace. Due to the high costs that an industry may undergo, and because a high yield is directly proportional to high profits, IC (Integrated Circuit) manufacturers struggle to maximize yield; but today's customers demand miniaturization, low costs, high performance and excellent reliability, making yield maximization a never-ending search for an enhanced assembly process. With factors such as minimum tolerances and tighter parameter variations, a systematic approach is needed in order to predict the assembly process. In order to evaluate the quality of upcoming circuits, yield models are used which not only predict manufacturing costs but also provide vital information to ease the process of correction when yields fall below expectations. For an IC manufacturer to obtain higher assembly yields, all factors such as boards, placement, components, the material from which the components are made, and processes must be taken into consideration. Effective placement yield depends heavily on machine accuracy and on the vision system, which needs the ability to recognize the features on the board and component in order to place the device accurately on the pads and bumps of the PCB. There are currently two methods for accurate positioning: using the edge of the package, and using solder ball locations, also called footprints. The only assumption that a yield model makes is that all boards and devices are completely functional. This paper focuses on the Monte Carlo method, a class of computational algorithms that depends on repeated random sampling to compute results. This method is utilized to recreate, in simulation, the placement and assembly processes within a production line.
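A minimal Monte Carlo sketch of the placement step might draw random placement offsets and count how often both axes stay within the pad tolerance. The Gaussian accuracy model and all numeric values below are assumptions for illustration, not parameters from the study.

```python
import random

# Monte Carlo sketch of placement yield: a placement succeeds when the
# machine's (noisy) offset stays within the pad tolerance in both axes.
# The Gaussian accuracy model and the values (mm) are hypothetical; a
# real model would use measured machine-accuracy distributions.
def placement_yield(n_trials, sigma=0.02, tol=0.05, seed=1):
    rng = random.Random(seed)
    good = 0
    for _ in range(n_trials):
        dx = rng.gauss(0.0, sigma)   # placement offset in x, mm
        dy = rng.gauss(0.0, sigma)   # placement offset in y, mm
        if abs(dx) <= tol and abs(dy) <= tol:
            good += 1
    return good / n_trials

y = placement_yield(100_000)
```

Repeating the experiment over many sampled offsets converges toward the true placement yield, which is the core of the repeated-random-sampling idea the abstract describes.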
Abstract: This paper uses p-tolerance with the lowest posterior
loss, a quadratic loss function, the average length criterion, the
average coverage criterion, and the worst outcome criterion to
compute the sample size for estimating a proportion in the binomial
probability function with a beta prior distribution. The proposed
methodology is examined, and its effectiveness is shown.
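With a Beta(a, b) prior, the posterior after x successes in n trials is Beta(a + x, b + n - x). A stdlib-only sketch of a worst-outcome-style search picks the smallest n whose worst-case posterior standard deviation keeps an approximate ±2-sd interval inside a target half-width; this normal-style shortcut only illustrates the idea, while the paper works with exact posterior interval lengths and loss functions.

```python
# Stdlib-only sketch of a "worst outcome"-style sample-size search for a
# binomial proportion with a Beta(a, b) prior. The posterior after x
# successes in n trials is Beta(a + x, b + n - x); we take the smallest n
# whose worst-case posterior sd keeps an approximate +/-2-sd interval
# within a target half-width. Illustrative shortcut only, not the
# paper's exact-posterior criteria.
def beta_sd(alpha, beta):
    v = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
    return v ** 0.5

def sample_size(a, b, half_width, max_n=10_000):
    for n in range(1, max_n + 1):
        worst = max(beta_sd(a + x, b + n - x) for x in range(n + 1))
        if 2 * worst <= half_width:
            return n
    return None

n = sample_size(a=1, b=1, half_width=0.1)
```

With a uniform Beta(1, 1) prior and a ±0.1 target, the worst case occurs near x = n/2, where the posterior is widest.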
Abstract: This paper proposes a novel game theoretical
technique to address the problem of data object replication in
large-scale distributed computing systems. The proposed technique draws
inspiration from computational economic theory and employs the
extended Vickrey auction. Specifically, players in a non-cooperative
environment compete for server-side scarce memory space to
replicate data objects so as to minimize the total network object
transfer cost, while maintaining object concurrency. Optimization of
such a cost in turn leads to load balancing, fault-tolerance and
reduced user access time. The method is experimentally evaluated
against four well-known techniques from the literature: branch and
bound, greedy, bin-packing and genetic algorithms. The experimental
results reveal that the proposed approach outperforms the four
techniques in both the execution time and solution quality.
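The mechanism the technique builds on is the classical sealed-bid Vickrey auction: the highest bidder wins but pays the second-highest bid, which makes truthful valuation a dominant strategy. The single-item sketch below, with hypothetical server valuations for hosting a replica, shows only this building block, not the paper's extended multi-object form.

```python
# Minimal single-item Vickrey (second-price) auction: the highest
# bidder wins the scarce memory slot but pays the second-highest bid.
# Server names and valuations are hypothetical.
def vickrey_winner(bids):
    """bids: dict player -> valuation. Returns (winner, price_paid)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

winner, price = vickrey_winner({"s1": 9.0, "s2": 7.5, "s3": 4.0})
# s1 wins and pays 7.5 (the second-highest bid)
```

Because the price paid does not depend on the winner's own bid, each server's best strategy is to bid its true value of hosting the replica, which is what lets a non-cooperative environment converge to low-cost placements.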
Abstract: This research was aimed at determining the impact of conservation techniques, including bench terraces, stone terraces, mulching, grass strips and intercropping, on soil erosion in a tobacco-based farming system at the Progo Hulu subwatershed, Central Java, Indonesia. Research was conducted from September 2007 to September 2009 at the Progo Hulu subwatershed, Central Java, Indonesia. The research site was divided into 27 land units, and experimental fields were grouped based on soil type and slope, i.e. 30%, 45% and 70%, with the following treatments: 1) ST0 = stone terrace (control); 2) ST1 = stone terrace + Setaria spacelata grass strip on a 5 cm high dike at the terrace lips + tobacco stem mulch at a dose of 50% (7 ton/ha); 3) ST2 = stone terrace + Setaria spacelata grass strip on a 5 cm high dike at the terrace lips + tobacco stem mulch at a dose of 100% (14 ton/ha); 4) ST3 = stone terrace + tobacco and red bean intercropping + tobacco stem mulch at a dose of 50% (7 ton/ha); 5) BT0 = bench terrace (control); 6) BT1 = bench terrace + Setaria spacelata grass strip at the terrace lips + tobacco stem mulch at a dose of 50% (7 ton/ha); 7) BT2 = bench terrace + Setaria spacelata grass strip at the terrace lips + tobacco stem mulch at a dose of 100% (14 ton/ha); 8) BT3 = bench terrace + tobacco and red bean intercropping + tobacco stem mulch at a dose of 50% (7 ton/ha). The results showed that the actual erosion rate of the research site was higher than the tolerable erosion rate, with mean values of 89.08 ton/ha/year and 33.40 ton/ha/year, respectively. As a result, 69% of the total research site (5,119.15 ha) is highly degraded. The conservation technique ST2 was the most effective in suppressing soil erosion, by 42.87%, followed by BT2 with 30.63%. The other treatments suppressed erosion by less than 21%.
Abstract: Neighborhood Rough Sets (NRS) have been proven to
be an efficient tool for heterogeneous attribute reduction. However,
most research has focused on dealing with complete and noiseless
data. In fact, most information systems are noisy, namely,
filled with incomplete data and inconsistent data. In this paper, we
introduce a generalized neighborhood rough sets model, called
VPTNRS, to deal with the problem of heterogeneous attribute
reduction in noisy systems. We generalize the classical NRS model
with a tolerance neighborhood relation and probabilistic theory.
Furthermore, we use the neighborhood dependency to evaluate the
significance of a subset of heterogeneous attributes and construct a
forward greedy algorithm for attribute reduction based on it.
Experimental results show that the model is effective in dealing with
noisy data.
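The forward greedy reduction can be sketched around a plain neighborhood dependency measure: an object counts as consistent for an attribute subset B when every object within distance δ of it under B carries the same decision label. The data, δ, and the max-metric below are toy assumptions, and this plain dependency stands in for the variable-precision tolerance version used by VPTNRS.

```python
# Illustrative forward greedy attribute reduction driven by a plain
# neighborhood dependency measure. An object is consistent for subset B
# when every object within distance delta of it (under B, max-metric)
# shares its decision label. Toy data; VPTNRS adds a tolerance relation
# and a probabilistic (variable-precision) threshold.
def dependency(X, y, attrs, delta):
    consistent = 0
    for i, xi in enumerate(X):
        nbrs = [j for j, xj in enumerate(X)
                if max(abs(xi[a] - xj[a]) for a in attrs) <= delta]
        if all(y[j] == y[i] for j in nbrs):
            consistent += 1
    return consistent / len(X)

def greedy_reduct(X, y, n_attrs, delta=0.2):
    chosen, remaining, best = [], list(range(n_attrs)), 0.0
    while remaining:
        gain, a = max((dependency(X, y, chosen + [a], delta), a)
                      for a in remaining)
        if gain <= best:
            break                      # no attribute improves dependency
        best = gain
        chosen.append(a)
        remaining.remove(a)
    return chosen, best

# Toy data: attribute 0 separates the classes, attribute 1 is noise
X = [(0.0, 0.5), (0.1, 0.4), (0.9, 0.5), (1.0, 0.45)]
y = [0, 0, 1, 1]
chosen, dep = greedy_reduct(X, y, n_attrs=2)
```

On this toy table the greedy loop keeps only the discriminative attribute and stops once the dependency cannot grow further.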
Abstract: The purpose of this study was to examine postpartum breastfeeding mothers in order to determine the impact their psychosocial and spiritual dimensions play in promoting full-term (6-month duration) breastfeeding of their infants. Purposive and snowball sampling methods were used to identify and recruit the study's participants. A total of 23 postpartum mothers, who were breastfeeding within 6 weeks after giving birth, participated in this study. In-depth interviews combined with observations, participant focus groups, and ethnographic records were used for data collection. The data were then analyzed using content analysis and typology. The results of this study illustrated that postpartum mothers experienced fear and worry that they would lack support from their spouse, family and peers, and that their infant would not get enough milk. It was found that the main barrier mothers faced in breastfeeding to full term was the difficulty of continuing to breastfeed when returning to work. 81.82% of the primiparous mothers and 91.67% of the non-primiparous mothers were able to breastfeed for the desired full term of 6 months. Factors found to be related to breastfeeding for six months included 1) belief and faith in breastfeeding, 2) support from spouse and family members, and 3) counseling from public health nurses and friends. The sample also provided evidence that religious principles such as tolerance, effort, love and compassion toward their infant, and positive thinking, were used in solving their physical, mental and spiritual problems.
Abstract: This paper studies the dependability of component-based
applications, especially embedded ones, from the diagnosis
point of view. The principle of the diagnosis technique is to
implement inter-component tests in order to detect and locate the
faulty components without redundancy. The proposed approach for
diagnosing faulty components consists of two main aspects. The first
one concerns the execution of the inter-component tests which
requires integrating test functionality within a component. This is the
subject of this paper. The second one is the diagnosis process itself
which consists of the analysis of inter-component test results to
determine the fault-state of the whole system. The advantages of this
diagnosis method, when compared to classical redundancy-based
fault-tolerant techniques, are application autonomy, cost-effectiveness
and better usage of system resources. Such advantages are very
important for many systems, and especially for embedded ones.
Abstract: A rice seed expression (cDNA) library in the Lambda
Zap 11® phage, constructed from the developing grain 10-20 days
after flowering, was transformed into yeast for functional
complementation assays in three salt-sensitive yeast mutants: S.
cerevisiae strains CY162, G19 and Axt3K. Transformed cells of G19
and Axt3K carrying the pYES vector with cDNA inserts showed
enhanced tolerance compared to those with the empty pYES vector.
Sequencing of the
cDNA inserts revealed that they encode for the putative proteins with
the sequence homologous to rice putative protein PROLM24
(Os06g31070), a prolamin precursor. Expression of this cDNA did
not affect yeast growth in the absence of salt. Axt3K and G19 strains
expressing PROLM24 were able to grow up to 400 mM and 600
mM NaCl, respectively. Similarly, the Axt3K mutant with PROLM24
expression showed a comparatively higher growth rate in the medium
with excess LiCl (50 mM). The observation that expression of
PROLM24 rescued the salt-sensitive phenotypes of G19 and Axt3K
indicates the existence of a regulatory system that ameliorates the
effect of salt stress in the transformed yeast mutants. However, the
exact function of the cDNA sequence, which shows partial sequence
homology to yeast UTR1, is not clear. Although UTR1 is involved in
ferrous uptake and iron homeostasis in yeast cells, there is no
evidence to prove its role in Na+ homeostasis in yeast cells. The absence
of transmembrane regions in Os06g31070 protein indicates that salt
tolerance is achieved not through the direct functional
complementation of the mutant genes but through an alternative
mechanism.