Abstract: Quality Function Deployment (QFD) is an elaborated, multi-step planning method for delivering products, services, and processes to customers, both external and internal to an organization. It is a way to translate between the diverse customer languages expressing demands (the Voice of the Customer) and the organization's languages expressing results that satisfy those demands. The approach is to establish one or more matrices that inter-relate producer and consumer mutual expectations; owing to its visual appearance, the first of these is called the "House of Quality" (HOQ). In this paper, we cast the HOQ as a multi-attribute decision making (MADM) problem and rank the technical specifications through a proposed MADM method. We then compute the satisfaction degree of the customer requirements, using fuzzy set theory to handle vagueness and uncertainty in decision making. The approach further proposes a supervised neural network (perceptron) for solving the MADM problem.
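As a minimal illustration of the ranking step described above, the sketch below scores technical specifications by a fuzzy weighted sum over a small HOQ-style relationship matrix and defuzzifies by centroid; the matrix values, the triangular fuzzy weights, and the 1-3-9 scale are illustrative assumptions, not the paper's data or its exact MADM method.

```python
# Illustrative sketch (not the paper's exact method): ranking technical
# specifications with a fuzzy weighted sum over a small HOQ-style matrix.
import numpy as np

# Rows: customer requirements; columns: technical specifications.
# Relationship strengths use the common HOQ 1-3-9 scale (assumed data).
R = np.array([[9, 3, 1],
              [3, 9, 3],
              [1, 3, 9]], dtype=float)

# Requirement importance as triangular fuzzy numbers (low, mid, high).
W = np.array([[0.2, 0.3, 0.4],
              [0.3, 0.4, 0.5],
              [0.1, 0.2, 0.3]])

fuzzy_scores = W.T @ R              # one weighted sum per fuzzy point
crisp = fuzzy_scores.mean(axis=0)   # centroid of a triangle = mean of (l, m, u)
ranking = np.argsort(-crisp)
print("spec ranking (best first):", ranking, "scores:", np.round(crisp, 2))
```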
Abstract: The three-time-scale plant model of a wind power
generator, including a wind turbine, a flexible vertical shaft, a Variable
Inertia Flywheel (VIF) module, an Active Magnetic Bearing (AMB)
unit and the applied wind sequence, is constructed. So that the wind power generator can keep operating when the spindle speed exceeds its rated value, the VIF is equipped to slow the spindle down appropriately whenever a stronger wind field is exerted. To prevent potential damage from collision of the shaft against conventional bearings, the AMB unit is proposed to regulate the shaft position deviation. By singular perturbation order-reduction
technique, a lower-order plant model can be established for the
synthesis of the feedback controller. Two major system parameter uncertainties, an additive uncertainty and a multiplicative uncertainty, are constituted by the wind turbine and the VIF, respectively. A Frequency-Shaping Sliding Mode Control (FSSMC) loop is proposed to account for these uncertainties and to suppress the unmodeled higher-order plant dynamics. Finally, the efficacy of the FSSMC is verified by intensive computer simulations and experiments on regulation of the shaft position deviation and counter-balancing of unpredictable wind disturbances.
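The core of a sliding mode regulation loop of the kind described can be sketched in a few lines. The plant below is a plain double integrator for the shaft position deviation with a bounded wind-like disturbance; the frequency-shaping filter of the FSSMC is omitted, and all gains and numbers are illustrative assumptions.

```python
# Minimal (non-frequency-shaped) sliding mode sketch: regulate a shaft-position
# deviation x under a bounded unknown disturbance d, with x'' = u + d.
import numpy as np

dt, T = 1e-4, 0.5
lam, k = 50.0, 30.0            # sliding-surface slope and switching gain (k > |d|)
x, v = 1e-3, 0.0               # initial position deviation [m] and velocity

for i in range(int(T / dt)):
    d = 5.0 * np.sin(2 * np.pi * 3 * i * dt)   # unknown bounded disturbance
    s = v + lam * x                            # sliding variable s = e' + lam*e
    u = -lam * v - k * np.tanh(s / 1e-3)       # equivalent term + smoothed switching
    a = u + d
    x, v = x + v * dt, v + a * dt              # forward-Euler integration

print(f"final position deviation: {x:.2e} m")  # driven toward zero despite d
```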
Abstract: Two-dimensional Direct Numerical Simulation (DNS)
of high Schmidt number mass transfer in a convective flow environment
(Rayleigh-Bénard) is carried out and the results are compared to experimental data. A fourth-order accurate WENO scheme has been used for scalar transport in order to achieve high accuracy in areas
distance between downward plumes of cold high concentration water
and the eddy size are in good agreement with experiments using a
combined PIV-LIF technique for simultaneous and spatially synoptic
measurements of 2D velocity and concentration fields.
Abstract: One important problem in today's organizations is the existence of non-integrated information systems, inconsistency, and a lack of suitable correlation between legacy and modern systems. One main solution is to transfer the local databases into a global one. In this regard, we need to extract the data structures from the legacy systems and integrate them with the new-technology systems. In legacy systems, huge amounts of data are stored in legacy
databases. They require particular attention since they need more
efforts to be normalized, reformatted and moved to the modern
database environments. Designing the new integrated (global)
database architecture and applying the reverse engineering requires
data normalization. This paper proposes the use of database reverse
engineering in order to integrate legacy and modern databases in
organizations. The suggested approach consists of methods and
techniques for generating data transformation rules needed for the
data structure normalization.
Abstract: The study of the transport coefficients in electronic
devices is currently carried out by analytical and empirical models.
This study requires several simplifying assumptions, generally
necessary to lead to analytical expressions in order to study the
different characteristics of the electronic silicon-based devices.
Further progress in the development, design and optimization of
Silicon-based devices necessarily requires new theory and modeling
tools. In our study, we use the PSO (Particle Swarm Optimization)
technique as a computational tool to develop analytical approaches in order to study the transport phenomena of electrons in crystalline silicon as a function of temperature and doping concentration. Good agreement between our results and measured data has been found. The optimized analytical models can also be incorporated into circuit simulators to study Si-based devices without impact on computational time or data storage.
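A minimal version of the PSO core used in such model fitting looks as follows; the sphere objective stands in for the actual model-versus-measurement error, and the swarm parameters are common textbook values, not the paper's settings.

```python
# Minimal particle swarm optimization (PSO) core; the objective is a stand-in.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):                      # placeholder for the model-vs-data error
    return np.sum(x**2, axis=1)

n, dim, iters = 30, 2, 200
w, c1, c2 = 0.72, 1.49, 1.49           # inertia and acceleration coefficients
pos = rng.uniform(-5, 5, (n, dim))
vel = np.zeros((n, dim))
pbest, pbest_val = pos.copy(), objective(pos)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    val = objective(pos)
    improved = val < pbest_val          # update personal bests, then global best
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best point:", gbest, "objective:", objective(gbest[None])[0])
```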
Abstract: The algorithm rearranges the DCT coefficients to concentrate signal energy and proposes a combination scheme to eliminate the correlation within same-level subbands when encoding DCT-based images. This work adopts the DCT and modifies the SPIHT algorithm to encode the DCT coefficients. The proposed algorithm also provides an enhancement function at low bit rates in order to improve perceptual quality. Experimental results indicate that the proposed technique improves the quality of the reconstructed image, achieving both PSNR and perceptual results close to JPEG2000 at the same bit rate.
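The property this codec relies on, that a blockwise 2-D DCT concentrates signal energy into a few coefficients so SPIHT-style significance coding works well, can be checked with a tiny sketch (synthetic 8x8 block, not the paper's codec):

```python
# Energy compaction of the 2-D DCT on a smooth synthetic 8x8 block.
import numpy as np
from scipy.fft import dctn

block = np.outer(np.linspace(0, 255, 8), np.ones(8))  # smooth gradient block
coeffs = dctn(block, norm='ortho')                    # 2-D DCT-II

energy = coeffs**2
top4 = np.sort(energy.ravel())[::-1][:4].sum()
print(f"energy captured by 4 of 64 coefficients: {100 * top4 / energy.sum():.1f}%")
```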
Abstract: Automatic face detection is a complex problem in
image processing. Many methods exist to solve this problem such as
template matching, Fisher Linear Discriminant, Neural Networks,
SVM, and MRC. Success has been achieved with each method to
varying degrees and complexities. The proposed algorithm uses upright, frontal faces in single gray-scale images with decent resolution under good lighting conditions. In face recognition, a single face is matched against single faces from the training dataset. The authors propose a neural-network-based face detection algorithm for photographs; whenever new test data appears, it is checked against the online scanned training dataset. Experimental results show that the algorithm achieves a detection accuracy of up to 95%.
Abstract: A high performance computer includes a fast processor and millions of bytes of memory. During data processing, huge amounts of information are shuffled between the memory and the processor. Because of its small size and its speed, the cache has become a common feature of high performance computers. Enhancing cache performance has proved essential to speeding up cache-based computers. Most enhancement approaches can be
classified as either software based or hardware controlled. The
performance of the cache is quantified in terms of hit ratio or miss
ratio. In this paper, we optimize cache performance by enhancing the cache hit ratio. The optimum cache performance is obtained through a cache hardware modification that quickly rejects mismatched line tags at the hit-or-miss comparison stage, thus achieving a low hit time for the wanted line in the cache. In the proposed technique, which we call Even-Odd Tabulation (EOT), the cache lines coming from main memory into the cache are classified into two types, even line tags and odd line tags, depending on the least significant bit (LSB) of the tag. The EOT technique exploits this division to reject mismatched line tags in far less time than is spent by the main comparator in the cache, giving an optimum hit time for the wanted cache line. Simulation results show the high performance of the EOT technique against the familiar mapping technique, FAM.
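A toy model of the EOT idea, with invented sizes and a fully associative organization, can make the mechanism concrete: stored tags are pre-sorted by their LSB, so a lookup only runs the full comparison against tags of matching parity, rejecting roughly half of the candidates outright.

```python
# Illustrative model of the EOT parity filter over resident cache-line tags.
import random

random.seed(1)
tags = random.sample(range(1 << 16), 64)          # resident line tags
even = [t for t in tags if t & 1 == 0]            # LSB = 0 bucket
odd  = [t for t in tags if t & 1 == 1]            # LSB = 1 bucket

def eot_lookup(tag):
    bucket = even if tag & 1 == 0 else odd        # quick parity-based rejection
    comparisons = len(bucket)                     # full compares still needed
    return tag in bucket, comparisons

hit, n = eot_lookup(tags[0])
print(f"hit={hit}, compares={n} (vs {len(tags)} without the parity filter)")
```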
Abstract: The purpose of this study was to study postpartum breastfeeding mothers to determine the impact their psychosocial and spiritual dimensions play in promoting full-term (6 month duration) breastfeeding of their infants. Purposive and snowball sampling methods were used to identify and recruit the study's participants. A total of 23 postpartum mothers, who were breastfeeding within 6 weeks after giving birth, participated in this study. In-depth interviews combined with observations, participant focus groups, and ethnographic records were used for data collection. The Data were then analyzed using content analysis and typology. The results of this study illustrated that postpartum mothers experienced fear and worry that they would lack support from their spouse, family and peers, and that their infant would not get enough milk It was found that the main barrier mothers faced in breastfeeding to full-term was the difficulty of continuing to breastfeed when returning to work. 81.82% of the primiparous mothers and 91.67% of the non-primiparous mothers were able to breastfeed for the desired full-term of 6 months. Factors found to be related to breastfeeding for six months included 1) belief and faith in breastfeeding, 2) support from spouse and family members, 3) counseling from public health nurses and friends. The sample also provided evidence that religious principles such as tolerance, effort, love, and compassion to their infant, and positive thinking, were used in solving their physical, mental and spiritual problems.
Abstract: Graph based image segmentation techniques are
considered to be one of the most efficient segmentation techniques
which are mainly used as time- and space-efficient methods for real-time applications. However, there is a need to focus on improving the quality of the segmented images obtained from earlier graph-based methods. This paper proposes an improvement to the graph-based image segmentation methods already described in the literature. We contribute to the existing method by proposing the use of a weighted Euclidean distance to calculate the edge weight, which is the key element in building the graph. We also propose a slight modification of the segmentation method already described in the literature, which results in the selection of more prominent edges in the graph. The experimental results show an improvement in segmentation quality compared to the existing methods, with a slight compromise in efficiency.
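A sketch of the proposed edge weight, a weighted Euclidean distance between neighbouring pixels' colour values, is given below; the per-channel weights and the tiny random image are assumptions for illustration, and edges are sorted by weight as in the classic graph-based formulation.

```python
# Building a 4-connected pixel graph with weighted Euclidean edge weights.
import numpy as np

w_ch = np.array([0.30, 0.59, 0.11])   # per-channel weights (illustrative)

def edge_weight(p, q):
    """Weighted Euclidean distance between two RGB pixels p, q."""
    d = np.asarray(p, float) - np.asarray(q, float)
    return float(np.sqrt(np.sum(w_ch * d**2)))

img = np.random.default_rng(0).integers(0, 256, (4, 4, 3))
edges = [((y, x), (y, x + 1), edge_weight(img[y, x], img[y, x + 1]))
         for y in range(4) for x in range(3)]
edges += [((y, x), (y + 1, x), edge_weight(img[y, x], img[y + 1, x]))
          for y in range(3) for x in range(4)]
edges.sort(key=lambda e: e[2])        # process edges in nondecreasing weight order
print(edges[0])
```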
Abstract: Microcirculation is essential for the proper supply of
oxygen and nutritive substances to the biological tissue and the
removal of waste products of metabolism. The determination of
blood flow in the capillaries is therefore of great interest to clinicians.
A comparison between the developed non-invasive, non-contact, whole-field laser speckle contrast imaging (LSCI) technique and a commercially available laser Doppler blood flowmeter (LDF), used to evaluate blood flow at the fingertip and elbow, is presented here. The LSCI technique gives more quantitative information on the velocity of blood than the perfusion values obtained using the LDF. Measurement of blood
flow in capillaries can be of great interest to clinicians in the
diagnosis of vascular diseases of the upper extremities.
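The quantity underlying LSCI can be sketched directly: the speckle contrast K is the ratio of standard deviation to mean intensity over a small sliding window, with lower K corresponding to faster flow. The synthetic exponential-intensity image below mimics fully developed static speckle, for which K is close to 1.

```python
# Speckle contrast K = sigma / mean over a sliding window (synthetic image).
import numpy as np

rng = np.random.default_rng(0)
img = rng.exponential(scale=100.0, size=(64, 64))  # speckle-like intensities

w = 7                                              # typical 7x7 window
K = np.zeros((64 - w + 1, 64 - w + 1))
for y in range(K.shape[0]):
    for x in range(K.shape[1]):
        patch = img[y:y + w, x:x + w]
        K[y, x] = patch.std() / patch.mean()

print(f"mean speckle contrast K = {K.mean():.2f}")  # ~1 for static speckle
```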
Abstract: An active inductor in CMOS technology with a supply voltage of 1.8 V is presented. The value of the inductance L can range from 0.12 nH to 0.25 nH at high frequency (HF). The proposed active inductor is designed in TSMC 0.18-um CMOS technology. The power dissipation of this inductor remains constant across all operating frequency bands, at around 20 mW from the 1.8 V power supply. Inductors realized as integrated circuits occupy a much smaller area, which has attracted researchers' attention for more than a decade. In this design, Advanced Design System (ADS) was used for circuit simulation.
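Most CMOS active inductors are built as gyrator-C cells; assuming that topology (the abstract does not state it), the equivalent inductance follows the standard relation below, where C is the capacitance at the internal node and g_{m1}, g_{m2} are the two transconductances.

```latex
% Standard gyrator-C relation (assumed topology, not confirmed by the abstract):
\[
  L_{\mathrm{eq}} \;=\; \frac{C}{g_{m1}\, g_{m2}}
\]
```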
Abstract: This paper demonstrates the results when either the ShiftRows stage or the MixColumns stage, or both stages, are omitted in the well-known block cipher Advanced Encryption Standard (AES) and its modified version, AES with Key-Dependent S-box (AES-KDS), using the avalanche criterion and other tests, namely encryption quality, correlation coefficient, histogram analysis, and key sensitivity tests.
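The avalanche criterion used in these tests can be measured with a short script: flip one plaintext bit and count how many ciphertext bits change (about 50% for the full cipher). The sketch below uses standard AES from the pyca/cryptography package; evaluating the stage-omitted or AES-KDS variants would require substituting those implementations.

```python
# Avalanche measurement: one flipped plaintext bit should change ~50% of
# ciphertext bits for the full AES.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)

def aes_ecb(pt: bytes) -> bytes:
    enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    return enc.update(pt) + enc.finalize()

pt = bytearray(os.urandom(16))
c0 = aes_ecb(bytes(pt))
pt[0] ^= 0x01                       # flip a single plaintext bit
c1 = aes_ecb(bytes(pt))

diff = sum(bin(a ^ b).count("1") for a, b in zip(c0, c1))
print(f"avalanche: {diff}/128 bits changed ({100 * diff / 128:.1f}%)")
```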
Abstract: PARADIGMA (PARticipative Approach to DIsease
Global Management) is a pilot project which aims to develop and
demonstrate an Internet based reference framework to share scientific
resources and findings in the treatment of major diseases.
PARADIGMA defines and disseminates a common methodology and
optimised protocols (Clinical Pathways) to support service functions
directed to patients and individuals on matters like prevention, post-hospitalisation support and awareness. PARADIGMA will provide a platform of information services, user-oriented and optimised against social, cultural and technological constraints, supporting the
Health Care Global System of the Euro-Mediterranean Community
in a continuous improvement process.
Abstract: In this paper, we start by first characterizing the most
important and distinguishing features of wavelet-based watermarking
schemes. We studied the overwhelming number of algorithms proposed in the literature. The copyright-protection application scenario is considered and, building on the experience gained, two distinctive watermarking schemes were implemented. A detailed comparison and the obtained results are presented and discussed. We conclude that Joo's [1] technique is more robust to standard noise attacks than Dote's [2] technique.
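A generic DWT-domain embedding step, of the kind both compared schemes build on (not Joo's or Dote's exact algorithm), can be sketched with PyWavelets; the embedding strength and the use of the horizontal detail band are illustrative choices.

```python
# Additive watermark in the first-level horizontal DWT detail band,
# with an informed (original-assisted) correlation detector.
import numpy as np
import pywt

rng = np.random.default_rng(0)
image = rng.uniform(0, 255, (64, 64))
wm = rng.choice([-1.0, 1.0], size=(32, 32))     # bipolar watermark sequence
alpha = 2.0                                     # embedding strength (assumed)

cA, (cH, cV, cD) = pywt.dwt2(image, 'haar')     # one-level 2-D DWT
marked = pywt.idwt2((cA, (cH + alpha * wm, cV, cD)), 'haar')

_, (cH2, _, _) = pywt.dwt2(marked, 'haar')      # re-extract detail band
corr = np.sum((cH2 - cH) * wm) / (alpha * wm.size)
print(f"normalized detector response: {corr:.2f} (about 1 when mark present)")
```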
Abstract: The paper presents a computational tool developed for
the evaluation of technical and economic advantages of an innovative
cleaning and conditioning technology of fluidized bed steam/oxygen
gasifiers outlet product gas. This technology integrates into a single
unit the steam gasification of biomass and the hot gas cleaning and
conditioning system. Both components of the computational tool,
process flowsheet and economic evaluator, have been developed
under the IPSEpro software. The economic model provides information that can help potential users, especially small and medium-sized enterprises active in the renewable energy field, to decide the optimal scale of a plant and to better understand both the potential and the limits of the system when applied to a wide range of conditions.
Abstract: The current paper conceptualizes the technique of release consistency together with user-defined synchronization. A programming model built on objects and classes is illustrated and demonstrated. The essence of the paper is phases, events, and parallel execution; the technique by which values on shared variables become visible is implemented. The second part of the paper consists of the implementation of user-defined high-level synchronization primitives and a system architecture with memory protocols. Techniques are also proposed that are central to deciding on validating and invalidating a stale page.
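The visibility rule at the heart of release consistency can be shown with a toy page-based model: writes stay local until a release() flushes them to shared memory, and an acquire() invalidates locally cached pages so stale copies are re-fetched. All names below are illustrative, not the paper's primitives.

```python
# Toy release-consistency model: visibility of writes is deferred to release,
# and acquire invalidates stale cached pages.
shared = {"page0": 0}                # home copy of each page

class Proc:
    def __init__(self):
        self.cache, self.dirty = {}, set()

    def read(self, page):
        if page not in self.cache:                 # miss: fetch a valid copy
            self.cache[page] = shared[page]
        return self.cache[page]

    def write(self, page, value):
        self.cache[page] = value                   # local, not yet visible
        self.dirty.add(page)

    def release(self):
        for page in self.dirty:                    # make writes visible
            shared[page] = self.cache[page]
        self.dirty.clear()

    def acquire(self):
        self.cache.clear()                         # invalidate stale pages

p1, p2 = Proc(), Proc()
p1.write("page0", 42)
print(p2.read("page0"))   # 0: p1 has not released yet
p1.release()
p2.acquire()
print(p2.read("page0"))   # 42: visible after the release/acquire pairing
```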
Abstract: Multimedia information availability has increased
dramatically with the advent of video broadcasting on handheld
devices. But with this availability comes problems of maintaining the
security of information that is displayed in public. ISMA Encryption
and Authentication (ISMACryp) is one of the chosen technologies for
service protection in DVB-H (Digital Video Broadcasting-
Handheld), the TV system for portable handheld devices. Content protected with ISMACryp is encoded with H.264/AVC (Advanced Video Coding), while all structural data is left as it is. Two modes of ISMACryp are available: the CTR (Counter) mode and the CBC (Cipher Block Chaining) mode. Both modes are based on the 128-bit AES algorithm. The AES algorithm is complex and requires a long execution time, which is not suitable for real-time applications like live TV. The proposed system aims to gain a deep understanding of video data security in multimedia technologies and to provide security for real-time video applications using selective encryption for H.264/AVC. Five levels of security are proposed in this paper, based on the content of the NAL unit in the Constrained Baseline profile of H.264/AVC. Selective encryption at the different levels covers intra-prediction modes, residue data, inter-prediction modes, or motion vectors only. The experimental results described in this paper show that the fifth level, full ISMACryp, provides the highest level of security at the cost of more encryption time, while the first level, which encrypts only the motion vectors, provides a lower level of security with lower execution time, without compromising compression or the quality of the visual content. The encryption scheme adds little cost to the compression process and keeps the file format unchanged, with some direct operations still supported. Simulations were carried out in MATLAB.
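The level-one case, encrypting only the motion-vector bytes while leaving headers and the rest of the bitstream in the clear, can be sketched with AES-128 in CTR mode (via the pyca/cryptography package); the byte offsets below are invented, since a real implementation must parse the H.264/AVC syntax to locate the motion vectors.

```python
# Selective encryption sketch: AES-128 CTR applied only to chosen byte ranges
# of a NAL-unit-like buffer, keeping everything else untouched.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key, nonce = os.urandom(16), os.urandom(16)
nal = bytearray(os.urandom(64))          # stand-in for a parsed NAL unit
orig = bytes(nal)
mv_ranges = [(8, 24), (40, 56)]          # hypothetical motion-vector spans

# One continuous keystream across the selected ranges (avoids keystream reuse).
enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
for lo, hi in mv_ranges:                 # "level one": encrypt MVs only
    nal[lo:hi] = enc.update(orig[lo:hi])

print("headers intact:", nal[:8] == orig[:8])       # True
print("MVs encrypted:", bytes(nal[8:24]) != orig[8:24])  # True (w.h.p.)
```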
Abstract: There are many situations where input feature vectors are incomplete, and methods to tackle the problem have been studied for a long time. A commonly used procedure is to replace each missing value with an imputation. This paper presents a method to perform categorical missing-data imputation from numerical and categorical variables. The imputations are based on Simpson's fuzzy min-max neural networks, where the input variables for learning and classification are solely numerical. The proposed method extends the input to categorical variables by introducing new fuzzy sets, a new operation, and a new architecture. The procedure is tested and compared with others using opinion poll data.
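The numerical core the method extends is Simpson's hyperbox membership function; a sketch of it (with inputs scaled to [0, 1] and an assumed sensitivity gamma) is given below. The paper's contribution, new fuzzy sets and an operation for categorical variables, would sit on top of this.

```python
# Simpson-style fuzzy min-max membership for a hyperbox [v, w] in [0, 1]^n.
import numpy as np

def hyperbox_membership(a, v, w, gamma=4.0):
    """Degree to which pattern a falls in or near the hyperbox [v, w]."""
    a, v, w = map(np.asarray, (a, v, w))
    above = np.maximum(0, 1 - np.maximum(0, gamma * np.minimum(1, a - w)))
    below = np.maximum(0, 1 - np.maximum(0, gamma * np.minimum(1, v - a)))
    return float(np.mean((above + below) / 2))

v, w = [0.2, 0.3], [0.4, 0.5]
print(hyperbox_membership([0.3, 0.4], v, w))  # 1.0: inside the box
print(hyperbox_membership([0.9, 0.9], v, w))  # < 1: membership decays outside
```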
Abstract: The healthcare environment is generally perceived as
being information rich yet knowledge poor. However, there is a lack
of effective analysis tools to discover hidden relationships and trends
in data. In fact, valuable knowledge can be discovered from
application of data mining techniques in healthcare system. In this
study, a proficient methodology for the extraction of significant
patterns from the Coronary Heart Disease warehouses for heart
attack prediction, which unfortunately continues to be a leading cause of mortality worldwide, has been presented. For this purpose, we propose to dynamically enumerate the optimal subsets of reduced features of high interest using a rough sets technique associated with dynamic programming. We then validate the classification using a Random Forest (RF) decision tree to identify the risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions, based on the medical profiles of patients. Moreover, the experts' knowledge in
this field has been taken into consideration in order to define the
disease, its risk factors, and to establish significant knowledge
relationships among the medical factors. A computer-aided system is
developed for this purpose based on a population of 525 adults. The
performance of the proposed model is analyzed and evaluated against a set of benchmark techniques applied to this classification problem.
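The Random Forest validation step can be illustrated with scikit-learn on synthetic stand-in data; the clinical records, the rough-set feature reduction, and the benchmark comparisons are not reproduced here.

```python
# Random Forest classification on synthetic stand-in data (not clinical records).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(525, 8))                      # 525 "patients", 8 features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)      # synthetic risk label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"hold-out accuracy: {accuracy_score(y_te, clf.predict(X_te)):.2f}")
```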