Abstract: For gamma radiation detection, assemblies consist of
scintillation crystals coupled to a photomultiplier tube; a
preamplifier is connected to the detector because the signals from
the photomultiplier tube are of small amplitude. After pre-amplification
the signals are sent to the amplifier and then to the multichannel
analyser. The multichannel analyser sorts all incoming electrical
signals according to their amplitudes and sorts the detected photons
in channels covering small energy intervals. The energy range of
each channel depends on the gain settings of the multichannel
analyser and the high voltage across the photomultiplier tube. The
output spectrum data of the two main isotopes studied are fed into
the biomass program and processed with a Matlab program to obtain the
solid holdup image (solid spherical nuclear fuel).
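A minimal Python sketch of the pulse-height sorting the multichannel
analyser performs is given below; the pulse amplitudes, channel count,
and gain calibration are illustrative assumptions, not the settings
used in the study.

import numpy as np

# Hypothetical digitized pulse amplitudes (volts) from the amplifier.
amplitudes = np.random.default_rng(0).gamma(shape=2.0, scale=0.5, size=10_000)

n_channels = 1024        # assumed number of MCA channels
gain = 3.0 / n_channels  # volts per channel (assumed 3 V full scale)

# Each channel covers a small amplitude (energy) interval; a pulse is
# counted in the channel its amplitude falls into.
channels = np.clip((amplitudes / gain).astype(int), 0, n_channels - 1)
spectrum = np.bincount(channels, minlength=n_channels)
print("counts in channels 100-104:", spectrum[100:105])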
Abstract: Because today's media-centric students have adopted
digital as their native form of communication, teachers are having an
increasingly difficult time motivating reluctant readers to read and
write. Our research has shown these text-averse individuals can learn
to understand the importance of reading and writing if the instruction
is based on digital narratives. While these students are naturally
attracted to stories, they are better at consuming them than creating
them. Therefore, any intervention that utilizes story as its basis needs
to include instruction on the elements of story making. This paper
presents a series of digitally-based tools to identify potential
weaknesses of visually impaired visual learners and to help motivate
these and other media-centric students to select and complete books
that are assigned to them.
Abstract: Breast carcinoma is the most common form of cancer
in women. Multicolour fluorescence in-situ hybridisation (m-FISH) is
a common method for staging breast carcinoma. The interpretation
of m-FISH images is complicated due to two effects: (i) Spectral
overlap in the emission spectra of fluorochrome marked DNA probes
and (ii) tissue autofluorescence. In this paper hyper-spectral images of
m-FISH samples are used and spectral unmixing is applied to produce
false colour images with higher contrast and better information
content than standard RGB images. The spectral unmixing is realised
by combinations of: Orthogonal Projection Analysis (OPA), Alternating
Least Squares (ALS), Simple-to-use Interactive Self-Modeling
Mixture Analysis (SIMPLISMA) and VARIMAX. These are applied
to the data to reduce tissue autofluorescence and resolve the spectral
overlap in the emission spectra. The results show that spectral unmixing
methods reduce the intensity caused by tissue autofluorescence by
up to 78% and enhance image contrast by algorithmically reducing
the overlap of the emission spectra.
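The alternating least squares step at the heart of such unmixing can
be sketched as follows; the data matrix, component count, and random
initialization are illustrative assumptions (the paper initializes via
OPA/SIMPLISMA rather than randomly).

import numpy as np

def als_unmix(D, n_components, n_iter=50, seed=0):
    """Alternate nonnegative least-squares updates so that D ~= C @ S.

    D: (pixels, bands) hyperspectral data; C: abundances; S: spectra.
    A simplified clipped least-squares variant; a real SIMPLISMA/OPA
    initialization would replace the random start used here.
    """
    rng = np.random.default_rng(seed)
    C = rng.random((D.shape[0], n_components))
    S = rng.random((n_components, D.shape[1]))
    for _ in range(n_iter):
        # Least-squares update for C with S fixed, clipped to >= 0.
        C = np.clip(D @ np.linalg.pinv(S), 0, None)
        # Least-squares update for S with C fixed, clipped to >= 0.
        S = np.clip(np.linalg.pinv(C) @ D, 0, None)
    return C, S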
Abstract: An epidemiological cross-sectional study was
undertaken in Yaoundé in 2002 and updated in 2005. Focused on
health within the city, the objectives were to measure diarrheal
prevalence and to identify the risk factors associated with it.
Results of microbiological examinations have revealed an urban
average prevalence rate of 14.5%. Access to basic services in the
living environment appears to be an important risk factor for
diarrheas. Statistical and spatial analyses conducted have revealed
that the prevalence of diarrheal diseases varies between the two main
types of settlement (informal and planned). More importantly, this
study shows that diarrhea prevalence rates (notably of bacterial and
parasitic diarrheas) vary according to the sub-category of settlements. The
study draws a number of theoretical and policy implications for
researchers and policy decision makers.
Abstract: Moulded parts make up more than 70% of the
components in products. However, common defects exist, particularly
in plastic injection moulding, such as warpage, shrinkage, sink
marks, and weld lines. In this paper, Taguchi experimental design
methods are applied to reduce the warpage defect of a thin
Acrylonitrile Butadiene Styrene (ABS) plate and are demonstrated at two
levels, namely Taguchi orthogonal arrays and the Analysis of
Variance (ANOVA). Eight trials have been run in which the optimal
parameters that can minimize the warpage defect in factorial
experiment are obtained. The results of the ANOVA analysis, compared
with those derived from MINITAB, identify the most significant
factors that may cause warpage in the injection moulding process.
Moreover, the ANOVA approach is more accurate than other approaches
such as the S/N ratio, and by accounting for the interaction of
factors it is possible to achieve better outcomes.
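For illustration, the smaller-the-better signal-to-noise ratio that
Taguchi analysis applies to warpage data can be computed as below;
the eight trials and their measurements are hypothetical stand-ins
for the paper's actual experiment.

import numpy as np

# Hypothetical warpage measurements (mm), two replicates per trial.
trials = np.array([
    [0.42, 0.45], [0.51, 0.49], [0.38, 0.40], [0.55, 0.57],
    [0.33, 0.35], [0.47, 0.44], [0.60, 0.58], [0.36, 0.39],
])

# Taguchi smaller-the-better signal-to-noise ratio per trial:
# S/N = -10 * log10(mean(y^2)); a higher S/N means less warpage.
sn = -10 * np.log10((trials**2).mean(axis=1))
print("best trial (highest S/N):", int(sn.argmax()) + 1)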
Abstract: We introduce an effective approach for automatic offline authentication of handwritten samples where the forgeries are skillfully done, i.e., the true and forgery sample appearances are almost alike. Subtle details of the temporal information used in online verification are not available offline and are also hard to recover robustly. Thus spatial dynamic information such as the pen-tip pressure characteristics is considered, with emphasis on the extraction of low density pixels. These points result from the ballistic rhythm of a genuine signature, which a forgery, however skillful it may be, always lacks. Ten effective features, including these low density points and the density ratio, are proposed to distinguish between a true and a forgery sample. An adaptive decision criterion is also derived for better verification judgements.
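A toy sketch of extracting low-density (light-pressure) pixels from a
grayscale signature image is given below; the thresholds, intensity
convention, and feature names are illustrative assumptions, not the
paper's exact definitions.

import numpy as np

def low_density_features(gray, ink_thresh=200, high_pressure=100):
    """Toy pressure-based features from a grayscale signature.

    gray: 2-D uint8 array, 0 = black ink, 255 = white paper (assumed).
    Pixels darker than ink_thresh count as ink; ink pixels lighter
    than high_pressure are treated as 'low density' (light pressure).
    """
    ink = gray < ink_thresh
    low = ink & (gray > high_pressure)     # faint, light-pressure ink
    n_ink, n_low = ink.sum(), low.sum()
    density_ratio = n_low / max(n_ink, 1)  # stand-in for one feature
    return n_low, density_ratio

# Hypothetical 'signature': random grays standing in for a scan.
img = np.random.default_rng(1).integers(0, 256, (100, 300), dtype=np.uint8)
print(low_density_features(img))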
Abstract: Multicast network technology has pervaded our
lives, from networking techniques to the improvement of the various
routing devices we use. Multicast is a technology that offers many
applications to the user, such as high-speed voice and high-speed
data services, a space presently dominated by conventional
networking together with cable-system and digital subscriber line
(DSL) technologies, and it has advantages over other routing
techniques. QoS (Quality of Service) guarantees are required in most
multicast applications. For the bandwidth-delay constrained
optimization, we use a multi-objective model and a routing approach
based on a genetic algorithm (GA) that optimizes multiple QoS
parameters simultaneously. The proposed approach yields non-dominated
routes and performs with the high efficiency of the GA; its
improvement and high degree of optimization have been verified. We
have also correlated the results of the multicast GA with broadband
wireless to minimize the delay in the path.
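A toy sketch of how a GA fitness function can fold the bandwidth-delay
constraints into a single route score is shown below; the link data,
weights, and penalty form are hypothetical, and the paper's actual
multi-objective formulation keeps objectives separate to find
non-dominated routes rather than scalarizing them.

def route_fitness(path_links, delay_bound, bw_demand):
    """Score a candidate multicast route for a GA (toy sketch).

    path_links: list of (delay_ms, bandwidth_mbps) per link.
    Routes violating the delay bound or bandwidth demand are
    penalized, so feasible, low-delay routes come to dominate the
    population over generations.
    """
    total_delay = sum(d for d, _ in path_links)
    min_bw = min(b for _, b in path_links)
    penalty = 0.0
    if total_delay > delay_bound:
        penalty += 100.0 * (total_delay - delay_bound)
    if min_bw < bw_demand:
        penalty += 100.0 * (bw_demand - min_bw)
    return -(total_delay + penalty)  # higher fitness = better route

print(route_fitness([(10, 8), (15, 6), (5, 10)], delay_bound=40, bw_demand=5))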
Abstract: The customary practice of identifying industrial sickness is a set of traditional techniques which rely upon a range of manual monitoring and compilation of financial records. This makes the process tedious and time consuming, and it is often susceptible to manipulation. Therefore, certain readily available tools are required which can deal with the uncertain situations arising out of industrial sickness. This is all the more significant for a country like India, where the fruits of development are rarely equally distributed. In this paper, we propose an approach based on an Artificial Neural Network (ANN) to deal with industrial sickness, with specific focus on a few such units taken from a less developed north-east (NE) Indian state, Assam. The proposed system provides decisions regarding industrial sickness using eight different parameters which are directly related to the stages of sickness of such units. The mechanism primarily uses certain signals and symptoms of industrial health to decide upon the state of a unit. Specifically, we formulate an ANN-based block with data obtained from a few selected units of Assam so that the required decisions related to industrial health can be taken. The system thus formulated could become an important part of planning and development. It can also contribute towards the computerization of decision support systems related to industrial health and help in better management.
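As a rough illustration of such an eight-parameter ANN block, the
following sketch runs one unit's indicators through a small
feedforward network; the indicator meanings, network size, and
weights are hypothetical (in the paper the weights would be learned
by training on data from the Assam units).

import numpy as np

rng = np.random.default_rng(42)

# Eight hypothetical health indicators for one industrial unit,
# scaled to [0, 1] (e.g. debt ratio, capacity utilisation, ...).
x = rng.random(8)

# One hidden layer of 6 neurons and one sickness-probability output.
# Random stand-in weights; in practice these come from training.
W1, b1 = rng.normal(size=(6, 8)), np.zeros(6)
W2, b2 = rng.normal(size=(1, 6)), np.zeros(1)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
hidden = np.tanh(W1 @ x + b1)
p_sick = sigmoid(W2 @ hidden + b2)[0]
print(f"estimated probability the unit is sick: {p_sick:.2f}")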
Abstract: Titanium nitride (TiN) has been synthesized using the
sheet plasma negative ion source (SPNIS). The parameters for
its effective synthesis have been determined from previous
experiments and studies. In this study, further enhancement of the
deposition rate of TiN synthesis and advancement of the SPNIS
operation is presented. This is primarily achieved by the addition of
Sm-Co permanent magnets and a modification of the configuration in
the TiN deposition process. The magnetic enhancement is aimed at
optimizing the sputtering rate and the sputtering yield of the process.
The Sm-Co permanent magnets are placed below the Ti target for
better sputtering by argon. The Ti target is biased from –250 V to
–350 V and is sputtered by Ar plasma produced at a discharge current
of 2.5–4 A and a discharge potential of 60–90 V. Steel substrates of
dimensions 20×20×0.5 mm³ were coated at N2:Ar volumetric
ratios of 1:3, 1:5 and 1:10. Visual inspection of the samples shows the
bright gold color associated with TiN. XRD characterization
confirmed the effective TiN synthesis as all samples exhibit the (200)
and (311) peaks of TiN and the non-stoichiometric Ti2N (220) facet.
Cross-sectional SEM results showed an increase in the TiN deposition
rate to up to 0.35 μm/min, double what was previously obtained
[1]. Scanning electron micrographs give a comparative
morphological picture of the samples. Vickers hardness measurements
gave a maximum hardness value of 21.094 GPa.
Abstract: For complete support of Quality of Service, it is better that the environment itself predict the resource requirements of a job by using special methods in Grid computing. Exact and correct prediction enables exact matching of the required resources with the available resources. After the execution of each job, the used resources are saved in an active database named "History". First, some attributes are extracted from the submitted job; according to a defined similarity algorithm, the most similar executed job is retrieved from "History", and the resource requirements are predicted using statistical measures such as linear regression or the average. The new idea in this research is based on an active database and centralized history maintenance. Implementation and testing of the proposed architecture result in an accuracy of 96.68% in predicting the CPU usage of jobs, 91.29% for memory usage, and 89.80% for bandwidth usage.
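A minimal sketch of the similarity-plus-average prediction step is
given below; the History records, attribute vectors, and distance
measure are hypothetical stand-ins for the paper's similarity
algorithm.

import numpy as np

# Hypothetical "History": job attribute vectors -> observed usage.
history_attrs = np.array([[1.0, 0.2, 3.0], [0.9, 0.3, 2.5], [5.0, 4.0, 1.0]])
history_cpu   = np.array([120.0, 110.0, 900.0])   # e.g. CPU seconds

def predict_cpu(job_attrs, k=2):
    """Predict CPU usage of a new job from its k most similar
    executed jobs in History, using the average as the estimator
    (the paper also mentions linear regression as an alternative)."""
    dists = np.linalg.norm(history_attrs - job_attrs, axis=1)
    nearest = np.argsort(dists)[:k]
    return history_cpu[nearest].mean()

print(predict_cpu(np.array([1.1, 0.25, 2.8])))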
Abstract: Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused attention on Web application design, development, analysis, and testing by studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. The use of traditional static source code analysis may be very difficult due to the presence of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so we need a pruning mechanism.
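A generic example of the kind of source-level mutation such an
analysis can apply is sketched below; the operator shown
(relational-operator flipping) and the toy fragment are illustrative,
and the WAAT project's actual mutation operators may differ.

import re

def mutate_relational_ops(source):
    """Generate simple mutants of server-side source by flipping
    relational operators, one occurrence at a time (a generic
    mutation operator, not necessarily the project's own)."""
    swaps = {"<=": ">", ">=": "<", "==": "!=", "!=": "=="}
    pattern = re.compile("|".join(re.escape(op) for op in swaps))
    for match in pattern.finditer(source):
        op = match.group()
        yield source[:match.start()] + swaps[op] + source[match.end():]

snippet = 'if ($age >= 18) { echo "adult"; }'   # toy PHP-like fragment
for mutant in mutate_relational_ops(snippet):
    print(mutant)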
Abstract: This study aims to screen out and to optimize the
major nutrients for maximum carotenoid production and
antioxidation characteristics of Rhodotorula rubra. It was found that
supplementation of the medium with 10 g/l glucose as carbon source,
1 g/l ammonium sulfate as nitrogen source, and 1 g/l yeast extract as
growth factor provided the best carotenoid yield of 30.39 μg/g cell
dry weight. The antioxidation of Rhodotorula rubra measured by the
DPPH, ABTS and MDA methods was 1.463%, 34.21% and 34.09 μmol/l,
respectively.
Abstract: This paper presents a signal analysis process for
improving energy completeness based on the Hilbert-Huang
Transform (HHT). First, the vibration signal of a DC motor, acquired
with an accelerometer, is used as the model signal for the analysis.
Second, the intrinsic mode functions (IMFs) and the Hilbert
spectrum of the decomposed signal are obtained by applying the HHT.
The reconstruction from the IMF components is compared with the
original signal, and the process of energy loss is discussed. Finally, the
differences between Wavelet Transform (WT) and HHT in analyzing
the signal are compared. The simulation results reveal that the
analysis process based on the HHT is advantageous for the enhancement
of energy completeness.
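The Hilbert step of the HHT, applied to a single component, can be
sketched as follows; the synthetic component stands in for one IMF of
the motor vibration signal, and the EMD sifting stage that produces
the IMFs is assumed to have been done already.

import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
# One oscillatory component standing in for a single IMF.
imf = np.cos(2 * np.pi * (50 * t + 30 * t**2)) * (1 + 0.3 * np.sin(2 * np.pi * 2 * t))

analytic = hilbert(imf)                # analytic signal via Hilbert transform
amplitude = np.abs(analytic)           # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency (Hz)

# Energy bookkeeping of the kind used for completeness checks.
print("component energy:", np.sum(imf**2))
print("mean instantaneous frequency (Hz):", inst_freq.mean())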
Abstract: This paper reviews recent studies and particularly the
effects of Climate Change in the North Tropical Atlantic by studying
atmospheric conditions that prevailed in 2005: the coral bleaching
HotSpot and Hurricane Katrina. With the aim of better understanding
and estimating the impact of the physical phenomenon, i.e. the
Thermal Oceanic HotSpot (TOHS), isotopic studies of δ18O and δ13C on
marine animals from Guadeloupe (French Caribbean island) were
carried out. Recorded measurements show Sea Surface Temperatures (SST)
of up to 35°C in August, much higher than the 32°C recorded by
NOAA satellites. After having reviewed the process that led to
the creation of Hurricane Katrina, which hit New Orleans on August
29, 2005, it will be shown that the climatic conditions in the
Caribbean from August to October 2005 influenced Katrina's
evolution. This TOHS is a combined effect of various phenomena
and represents an additional factor in estimating future climate
changes.
Abstract: In this work a new platform for mobile-health systems is
presented. The system's target application is providing decision
support to rescue corps or military medical personnel in combat
areas. The software architecture relies on a distributed
client-server system that manages a hierarchy of wireless ad-hoc
networks in which several different types of clients operate. Each
client is characterized by different hardware and software
requirements. The lower hierarchy levels rely on a network of
completely custom devices that store clinical information and patient
status and are designed to form an ad-hoc network operating in the
2.4 GHz ISM band and complying with the IEEE 802.15.4 standard
(ZigBee). Medical personnel may interact with these devices, which
are called MICs (Medical Information Carriers), by means of a PDA
(Personal Digital Assistant) or a MDA (Medical Digital Assistant),
and transmit the information stored in their local databases as well as
issue a service request to the upper hierarchy levels by using IEEE
802.11 a/b/g standard (WiFi). The server acts as a repository that
stores both medical evacuation forms and associated events (e.g., a
teleconsulting request). All the actors participating in the diagnostic
or evacuation process may asynchronously access this repository
and update its content or generate new events. The designed system
aims to optimise and improve information spreading and flow
among all the system components, with the goal of improving both
diagnostic quality and the evacuation process.
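A minimal sketch of the repository data model described above is
given below; the class and field names are hypothetical, not the
paper's actual schema.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class EvacuationForm:
    """Toy stand-in for a medical evacuation form stored on a MIC
    and replicated to the server repository."""
    patient_id: str
    status: str
    updated: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class Repository:
    """Server-side store of forms and events that PDAs/MDAs and
    other actors read and update asynchronously."""
    def __init__(self):
        self.forms, self.events = {}, []

    def upsert_form(self, form):
        self.forms[form.patient_id] = form

    def post_event(self, kind, patient_id):
        self.events.append((kind, patient_id))  # e.g. teleconsulting request

repo = Repository()
repo.upsert_form(EvacuationForm("P-017", "stable"))
repo.post_event("teleconsulting_request", "P-017")
print(repo.events)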
Abstract: The Sensor Network consists of densely deployed
sensor nodes. Energy optimization is one of the most important
aspects of sensor application design. Data acquisition and aggregation
techniques for processing data in-network should be energy efficient.
Due to the cross-layer design, resource-limited and noisy nature
of Wireless Sensor Networks (WSNs), it is challenging to study
the performance of these systems in a realistic setting. In this
paper, we propose optimizing queries through data aggregation and the
exploitation of data redundancy, to reduce energy consumption without
requiring all sensed data, together with the directed diffusion
communication paradigm, to achieve power savings, robust
communication, and in-network data processing. To estimate the
per-node power consumption, the PowerTOSSIM mica2 energy model is
used, which provides scalable and accurate results. The performance
analysis shows that the proposed methods outperform existing methods
in terms of energy consumption in wireless sensor networks.
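A toy sketch of the suppress-and-aggregate idea follows; the epsilon
threshold and suppression policy are illustrative, not the paper's
exact scheme.

def aggregate_readings(readings, epsilon=0.5):
    """In-network aggregation sketch: instead of forwarding every
    sensed value, a node suppresses readings that are redundant
    (within epsilon of the running mean) and forwards an aggregate."""
    forwarded, total, count = [], 0.0, 0
    for value in readings:
        if count == 0 or abs(value - total / count) > epsilon:
            forwarded.append(value)   # novel reading: worth transmitting
        total, count = total + value, count + 1
    return sum(forwarded) / len(forwarded), len(forwarded)

mean, sent = aggregate_readings([21.0, 21.1, 21.2, 25.0, 25.1, 21.0])
print(f"aggregate={mean:.2f}, transmissions={sent} instead of 6")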
Abstract: A novel methodology has been used to design the
evaporator coil of a refrigerator. The methodology follows a
complete Computer Aided Design/Computer Aided Engineering
approach, by means of a Computational Fluid Dynamics/Finite
Element Analysis model which is executed many times by a
commercial optimizer for the thermal-fluid exploration of several
design configurations. Hence the design is carried out automatically
by parallel computations, with an optimization package taking the
decisions rather than the design engineer. The engineer instead takes
decisions regarding the physical settings and initialization of the
computational models to employ, the number and range of the
geometrical parameters of the coil fins, and the optimization tools
to be employed. The final design of the coil geometry was found to be
better than the initial design.
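The automated optimizer-in-the-loop workflow can be sketched as
follows; the analytic objective is a stand-in for the CFD/FEA run,
and the parameter names, bounds, and starting point are hypothetical.

from scipy.optimize import minimize

def coil_objective(params):
    """Stand-in for a CFD/FEA run: in the real workflow this function
    would mesh the coil for the given fin geometry, run the solvers
    in parallel, and return a thermal-performance metric to minimise.
    Here a smooth analytic bowl plays that role."""
    fin_pitch, fin_height = params
    return (fin_pitch - 2.2) ** 2 + 0.5 * (fin_height - 9.0) ** 2

# The optimization package, not the engineer, picks each candidate
# geometry; the engineer only sets bounds and the starting point.
result = minimize(coil_objective, x0=[3.0, 6.0],
                  bounds=[(1.0, 5.0), (4.0, 15.0)])
print("best fin pitch/height:", result.x)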
Abstract: The term interactive education refers to the
multidisciplinary aspects of distance education that follow
contemporary means around a common basis with different functional
requirements. The aim of this paper is to reflect the new techniques
in education along with new methods and inventions; these methods are
better served by interactivity. The integration of interactive
facilities into education through distance learning is not a new
concept, but the usage of these methods is only now being adapted to
design education. In this paper, the general approach of this method
is presented and, after the analysis of different samples, the
advantages and disadvantages of these approaches are identified. The
method of this paper is to evaluate the related samples and then to
analyze the main hypothesis. The main focus is on the formation
processes of this kind of education. Technological developments in
education should be filtered through the necessities of design
education, and the structure of the system could then be formed or
renewed. The conclusion indicates that interactive methods of
education in design are a concept capturing not only technical and
computational intelligence aspects but also aesthetic and artistic
approaches coming together around the same purpose.
Abstract: The Power Spectral Density (PSD) computed by taking the Fourier transform of auto-correlation functions (the Wiener-Khintchine theorem) gives better results for noisy data than the Periodogram approach. However, the computational complexity of the Wiener-Khintchine approach is higher than that of the Periodogram approach. For the computation of the short-time Fourier transform (STFT), this problem becomes even more prominent, since the PSD must be recomputed after every shift of the analysis window. In this paper, a recursive version of the Wiener-Khintchine theorem is derived by using the sliding DFT approach meant for the computation of the STFT. The computational complexity of the proposed recursive Wiener-Khintchine algorithm, for a window size of N, is O(N).
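A minimal sketch of the sliding DFT recursion the derivation builds
on is given below; the recursive autocorrelation bookkeeping of the
proposed algorithm itself is not reproduced here, and the signal and
window size are illustrative.

import numpy as np

def sliding_dft(x, N):
    """Sliding DFT: update an N-point DFT in O(N) per one-sample
    shift instead of recomputing an O(N log N) FFT per window:
    X_new[k] = (X_old[k] + x_in - x_out) * exp(j*2*pi*k/N)."""
    twiddle = np.exp(2j * np.pi * np.arange(N) / N)
    X = np.fft.fft(x[:N])          # initial window, computed once
    yield X
    for m in range(len(x) - N):
        X = (X + x[m + N] - x[m]) * twiddle
        yield X

x = np.random.default_rng(0).standard_normal(64)
for X in sliding_dft(x, N=16):
    psd = (np.abs(X) ** 2) / 16    # periodogram-style PSD per window
print("final PSD bins:", np.round(psd[:4], 3))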
Abstract: Green buildings have been commonly cited to be more
expensive than conventional buildings. However, limited research
has been conducted to clearly identify elements that contribute to this
cost differential. The construction cost of buildings can typically
be divided into “hard” cost and “soft” cost elements. Using a review
analysis of the existing literature, the study identified six main
elements in green buildings that contribute to cost elements that are
“soft” in nature. The six elements found are insurance, developer's
experience, design cost, certification, commissioning and energy
modeling. Out of the six elements, most of the literature has
highlighted the increase in design cost for green design as compared
to conventional design, due to additional architectural and
engineering costs, eco-charrettes, extra design time, and the further
need for a green consultant. The study concluded that these elements of soft cost
contribute to the green premium or cost differential of green
buildings.