Abstract: One important problem in today's organizations is the
existence of non-integrated information systems, inconsistency,
and a lack of suitable correlation between legacy and modern
systems. One main solution is to transfer the local databases into
a global one. In this regard, we need to extract the data structures
from the legacy systems and integrate them with the new technology
systems. In legacy systems, huge amounts of data are stored in
legacy databases. They require particular attention since they need
more effort to be normalized, reformatted and moved to modern
database environments. Designing the new integrated (global)
database architecture and applying reverse engineering require
data normalization. This paper proposes the use of database reverse
engineering in order to integrate legacy and modern databases in
organizations. The suggested approach consists of methods and
techniques for generating the data transformation rules needed for
data structure normalization.
Abstract: In this paper, He's homotopy perturbation method (HPM) is applied to Cauchy problems for the one- and three-dimensional inhomogeneous wave equation in order to obtain exact solutions. HPM is used for the analytic handling of these equations. The results reveal that HPM is very effective, convenient and quite accurate for such types of partial differential equations (PDEs).
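As a sketch of how HPM is typically set up for such a problem (the general textbook construction, not this paper's specific derivation), consider the 1-D inhomogeneous wave equation with Cauchy data:

```latex
% HPM sketch for u_{tt} - u_{xx} = f(x,t),
% with u(x,0) = g(x), u_t(x,0) = h(x), and initial guess u_0.
% Homotopy deforming a trivial operator into the full equation:
(1-p)\left(\frac{\partial^2 v}{\partial t^2}
   - \frac{\partial^2 u_0}{\partial t^2}\right)
 + p\left(\frac{\partial^2 v}{\partial t^2}
   - \frac{\partial^2 v}{\partial x^2} - f(x,t)\right) = 0,
 \qquad p \in [0,1].
% Expand v = v_0 + p\,v_1 + p^2 v_2 + \cdots and collect powers of p:
p^0:\; \frac{\partial^2 v_0}{\partial t^2}
      = \frac{\partial^2 u_0}{\partial t^2},
\qquad
p^1:\; \frac{\partial^2 v_1}{\partial t^2}
      = \frac{\partial^2 v_0}{\partial x^2} + f(x,t)
      - \frac{\partial^2 u_0}{\partial t^2},
\qquad \ldots
% The solution is recovered at p = 1:
u = \lim_{p \to 1} v = v_0 + v_1 + v_2 + \cdots
```

Each correction term is obtained by integrating twice in $t$ with the Cauchy data; for polynomial or trigonometric $f$ the series often terminates, which is how exact solutions arise.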
Abstract: The study of the transport coefficients in electronic
devices is currently carried out by analytical and empirical models.
This study requires several simplifying assumptions, generally
necessary to lead to analytical expressions in order to study the
different characteristics of the electronic silicon-based devices.
Further progress in the development, design and optimization of
Silicon-based devices necessarily requires new theory and modeling
tools. In our study, we use the Particle Swarm Optimization (PSO)
technique as a computational tool to develop analytical approaches
in order to study the transport phenomena of electrons in crystalline
silicon as a function of temperature and doping concentration. Good
agreement between our results and measured data has been found.
The optimized analytical models can also be incorporated into
circuit simulators to study Si-based devices without impacting
computational time and data storage.
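As an illustration of the optimization machinery involved (a minimal, generic PSO sketch, not the authors' code; the objective, bounds and hyper-parameters here are assumptions standing in for the model-fitting problem):

```python
import random

def pso(f, dim, lo, hi, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    # Minimal particle swarm optimizer: each particle is pulled toward
    # its personal best and the swarm's global best position.
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(1)
# Toy objective standing in for the model-vs-measurement error surface.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3, lo=-5.0, hi=5.0)
```

In the paper's setting, the objective would be the squared error between the analytical mobility model and measured data over temperature and doping concentration.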
Abstract: The algorithm represents images with DCT coefficients to concentrate signal energy and proposes techniques to eliminate the correlation within the same-level subband when encoding DCT-based images. This work adopts the DCT and modifies the SPIHT algorithm to encode DCT coefficients. The proposed algorithm also provides an enhancement function at low bit rates in order to improve perceptual quality. Experimental results indicate that the proposed technique improves the quality of the reconstructed image in terms of PSNR, with perceptual results close to JPEG2000 at the same bit rate.
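The energy-compaction property of the DCT that such a coder builds on can be seen in a small self-contained sketch (a textbook 2-D DCT-II, not the paper's coder):

```python
import math

def dct_2d(block):
    # Orthonormal 2-D DCT-II of an N x N block, computed directly
    # from the definition (O(N^4), fine for illustration).
    n = len(block)
    def alpha(k):
        return math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = 0.0
            for x in range(n):
                for y in range(n):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * n)))
            out[u][v] = alpha(u) * alpha(v) * s
    return out

# A smooth (constant) 8x8 block: all energy collapses into the DC
# coefficient, which is what makes zerotree-style coding effective.
coeffs = dct_2d([[1.0] * 8 for _ in range(8)])
```

SPIHT then orders and transmits these coefficients by significance, which is the part the paper modifies.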
Abstract: This paper proposes a modeling methodology for the
development of data analysis solutions. The author introduces an
approach to address data warehousing issues at the enterprise level.
The methodology covers the requirements elicitation and analysis
stage as well as the initial design of the data warehouse. The paper
reviews an extended business process model which satisfies the needs
of data warehouse development. The author considers the use of
business process models necessary, as they reflect both enterprise
information systems and business functions, which are important for
data analysis. The described approach divides development into
three steps with different levels of model elaboration, and makes
it possible to gather requirements and present them to business
users in an easy manner.
Abstract: Twist drills are geometrically complex tools, and thus various researchers have adopted different mathematical and experimental approaches for their simulation. The present paper acknowledges the increasing use of modern CAD systems, and drilling simulations are carried out using the API (Application Programming Interface) of a CAD system. The developed DRILL3D software routine creates parametrically controlled tool geometries and, using different cutting conditions, generates solid models for all the relevant data involved (drilling tool, cut workpiece, undeformed chip). The final data derived constitute a platform for further direct simulations regarding the determination of cutting forces, tool wear, drilling optimizations, etc.
Abstract: A high performance computer includes a fast
processor and millions of bytes of memory. During data processing,
huge amounts of information are shuffled between the memory and
the processor. Because of its small size and its speed, the cache
has become a common feature of high performance computers.
Enhancing cache performance has proved essential in speeding up
cache-based computers. Most enhancement approaches can be
classified as either software based or hardware controlled. The
performance of the cache is quantified in terms of the hit ratio or
miss ratio. In this paper, we optimize cache performance by
enhancing the cache hit ratio. Optimum cache performance is
obtained by modifying the cache hardware so as to quickly reject
the tags of missed lines at the hit-or-miss comparison stage, and
thus a low hit time for the wanted line in the cache is achieved.
In the proposed technique, which we call Even-Odd Tabulation (EOT),
the cache lines coming from main memory into the cache are
classified into two types, even line tags and odd line tags,
depending on their least significant bit (LSB). This division is
exploited by the EOT technique to reject mismatched line tags in a
very short time compared to the time spent by the main comparator
in the cache, giving an optimum hit time for the wanted cache line.
The high performance of the EOT technique compared to the familiar
FAM mapping technique is shown in the simulation results.
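A behavioural sketch of the EOT idea (a software model of the hardware filter; the tag values and organization here are assumptions for illustration):

```python
def eot_lookup(cache_tags, wanted_tag):
    # EOT sketch: tags are pre-partitioned by their least significant
    # bit, so a lookup compares the wanted tag only against the lines
    # whose tag parity matches, rejecting the rest immediately.
    even = [t for t in cache_tags if t & 1 == 0]
    odd = [t for t in cache_tags if t & 1 == 1]
    bucket = even if wanted_tag & 1 == 0 else odd
    # 'comparisons' models the work left for the main comparator;
    # a plain fully associative search would do len(cache_tags).
    return wanted_tag in bucket, len(bucket)

hit, comparisons = eot_lookup([0b1010, 0b0100, 0b0111, 0b0001, 0b1101],
                              0b0100)
```

On average the parity filter halves the number of tags the main comparator must examine, which is the source of the reduced hit time.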
Abstract: The purpose of this study was to examine postpartum breastfeeding mothers to determine the impact their psychosocial and spiritual dimensions play in promoting full-term (6 month duration) breastfeeding of their infants. Purposive and snowball sampling methods were used to identify and recruit the study's participants. A total of 23 postpartum mothers, who were breastfeeding within 6 weeks after giving birth, participated in this study. In-depth interviews combined with observations, participant focus groups, and ethnographic records were used for data collection. The data were then analyzed using content analysis and typology. The results of this study illustrated that postpartum mothers experienced fear and worry that they would lack support from their spouse, family and peers, and that their infant would not get enough milk. It was found that the main barrier mothers faced in breastfeeding to full-term was the difficulty of continuing to breastfeed when returning to work. 81.82% of the primiparous mothers and 91.67% of the non-primiparous mothers were able to breastfeed for the desired full-term of 6 months. Factors found to be related to breastfeeding for six months included 1) belief and faith in breastfeeding, 2) support from spouse and family members, 3) counseling from public health nurses and friends. The sample also provided evidence that religious principles such as tolerance, effort, love, and compassion toward their infant, and positive thinking, were used in solving their physical, mental and spiritual problems.
Abstract: This paper presents an interval-based multi-attribute
decision making (MADM) approach in support of the decision
process with imprecise information. The proposed decision
methodology is based on the model of linear additive utility function
but extends the problem formulation with the measure of composite
utility variance. A sample study concerning the evaluation of
electric generation expansion strategies is provided showing how the
imprecise data may affect the choice toward the best solution and
how a set of alternatives, acceptable to the decision maker (DM),
may be identified with certain confidence.
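A minimal numeric sketch of the extended model (the uniform-distribution reading of the intervals and the independence assumption are made here for illustration and are not necessarily the paper's exact variance measure):

```python
def composite_utility(weights, intervals):
    # Linear additive utility with interval-valued attribute scores.
    # Assumption: each score is uniform on [a, b], giving per-attribute
    # mean (a + b) / 2 and variance (b - a)**2 / 12; independence makes
    # the composite variance the weighted sum below.
    mean = sum(w * (a + b) / 2.0 for w, (a, b) in zip(weights, intervals))
    var = sum(w * w * (b - a) ** 2 / 12.0
              for w, (a, b) in zip(weights, intervals))
    return mean, var

# Two hypothetical expansion strategies scored on two equally
# weighted attributes, with interval-valued (imprecise) scores.
m1, v1 = composite_utility([0.5, 0.5], [(0.6, 0.8), (0.3, 0.9)])
m2, v2 = composite_utility([0.5, 0.5], [(0.5, 0.7), (0.55, 0.65)])
```

The first alternative has the higher mean utility but also the higher variance, which is exactly the kind of trade-off the composite utility variance is meant to expose to the decision maker.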
Abstract: The aim of the paper is, based on a detailed analysis of
literary sources and the research carried out, to develop a model
for the development and implementation of an innovation strategy in
business. The paper brings the main results of the research
conducted by the authors on a sample of 462 respondents, which
shows the current situation in Slovak enterprises in the use of
innovation strategy. The research and analysis carried out provided
the base for a model of the development and implementation of an
innovation strategy in business, which is explained in the paper in
detail, step by step, with emphasis on the implementation process.
Implementation of the innovation strategy is described by a
separate model. The paper contains recommendations for the
successful implementation of an innovation strategy in business.
These recommendations should serve mainly business managers as a
valuable tool in implementing the innovation strategy.
Abstract: The job shop scheduling problem (JSSP) is well known as one of the most difficult combinatorial optimization problems. This paper presents a hybrid genetic algorithm for the JSSP with the objective of minimizing makespan. The efficiency of the genetic algorithm is enhanced by integrating it with a local search method. The chromosome representation of the problem is based on operations. Schedules are constructed using a procedure that generates full active schedules. In each generation, a local search heuristic based on Nowicki and Smutnicki's neighborhood is applied to improve the solutions. The approach is tested on a set of standard instances taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed algorithm.
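The operation-based representation can be sketched with a standard decoder for this encoding (the tiny two-job instance is made up for illustration; the paper's full algorithm adds active-schedule generation and local search on top of this):

```python
def makespan(jobs, chromosome):
    # Decode an operation-based chromosome: each gene is a job index,
    # and the k-th occurrence of job j denotes job j's k-th operation.
    # jobs[j] is an ordered list of (machine, duration) pairs.
    next_op = [0] * len(jobs)     # next unscheduled operation per job
    job_ready = [0] * len(jobs)   # finish time of each job's last op
    mach_ready = {}               # finish time of each machine's last op
    end = 0
    for j in chromosome:
        machine, dur = jobs[j][next_op[j]]
        start = max(job_ready[j], mach_ready.get(machine, 0))
        finish = start + dur
        job_ready[j] = finish
        mach_ready[machine] = finish
        next_op[j] += 1
        end = max(end, finish)
    return end

# Two jobs on two machines; gene order changes the schedule quality.
jobs = [[(0, 3), (1, 2)],
        [(1, 2), (0, 4)]]
good = makespan(jobs, [0, 1, 0, 1])
bad = makespan(jobs, [1, 1, 0, 0])
```

Every permutation-with-repetition is a feasible schedule under this decoding, which is why the representation is popular for genetic operators.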
Abstract: In the design of condensers, the prediction of pressure
drop is as important as the prediction of the heat transfer
coefficient. Modeling of two-phase flow, particularly liquid-vapor
flow under diabatic conditions inside a horizontal tube, using CFD
analysis is difficult with the two-phase models available in FLUENT
due to continuously changing flow patterns. In the present analysis,
CFD analysis of two-phase flow of refrigerants inside a horizontal
tube of inner diameter 0.0085 m and length 1.2 m is carried out
using the homogeneous model under adiabatic conditions. The
refrigerants considered are R22, R134a and R407C. The analysis is
performed at different saturation temperatures and at different flow
rates to evaluate the local frictional pressure drop. Using the
homogeneous model, average properties are obtained for each of the
refrigerants, which is then treated as a single-phase pseudo-fluid.
The pressure drop data thus obtained are compared with the separated
flow models available in the literature.
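The homogeneous averaging can be sketched as follows (the McAdams viscosity mixing rule and the Blasius friction factor are common choices assumed here for illustration, not necessarily those used in the paper; property values below are made up):

```python
def homogeneous_dpdz(x, G, D, rho_l, rho_g, mu_l, mu_g):
    # Homogeneous two-phase model: the mixture is treated as a
    # single-phase pseudo-fluid with quality-averaged properties.
    # x: vapor quality, G: mass flux [kg/m^2 s], D: tube diameter [m].
    rho_h = 1.0 / (x / rho_g + (1.0 - x) / rho_l)
    mu_h = 1.0 / (x / mu_g + (1.0 - x) / mu_l)  # McAdams rule (assumed)
    re = G * D / mu_h                           # mixture Reynolds number
    f = 0.079 * re ** -0.25                     # Blasius, Fanning (assumed)
    return 2.0 * f * G * G / (D * rho_h)        # frictional dp/dz [Pa/m]

# Illustrative call with the paper's tube diameter and made-up
# refrigerant-like properties.
dpdz = homogeneous_dpdz(x=0.5, G=300.0, D=0.0085,
                        rho_l=1200.0, rho_g=40.0,
                        mu_l=2.0e-4, mu_g=1.2e-5)
```

Since f scales as G^-0.25, the frictional gradient grows roughly as G^1.75, which is the trend the flow-rate sweep in the analysis probes.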
Abstract: The Available Bit Rate (ABR) service is a lower-priority
service well suited to the transmission of data. In wireline
ATM networks, the ABR source continuously receives feedback from
switches about increases or decreases in bandwidth according to
changing network conditions, and a minimum bandwidth is guaranteed.
In wireless networks, guaranteeing the minimum bandwidth is a
challenging task, as the source is mobile, traveling from one cell
to another. Re-establishing virtual circuits from start to end every
time causes delay in transmission. In our proposed solution, we
present a mechanism to provide more available bandwidth to the ABR
source by re-using part of the old virtual channels and establishing
new ones. We want the ABR source to transmit data continuously
(non-stop) in order to avoid delay. In the worst-case scenario, at
least the minimum bandwidth is allocated. In order to keep the data
flowing continuously, priority is given to the handoff ABR call
over a new ABR call.
Abstract: This paper presents the experimental results of a
single cylinder Enfield engine using an electronically controlled fuel
injection system which was developed to carry out exhaustive tests
using neat CNG, and mixtures of hydrogen in compressed natural gas
(HCNG) at 0, 5, 10, 15 and 20% by energy. Experiments were
performed at 2000 and 2400 rpm with wide open throttle and varying
the equivalence ratio. Hydrogen, which has a fast burning rate,
enhances the flame propagation rate of compressed natural gas when
added to it. The emissions of HC and CO decreased with an increasing
percentage of hydrogen, but NOx was found to increase. The results indicated a
marked improvement in the brake thermal efficiency with the
increase in the percentage of hydrogen added. The improvement in
thermal efficiency was clearly more pronounced in the lean region
than in the rich region. This study is expected to reduce vehicular
emissions along with an increase in thermal efficiency, and thus
help curb further environmental degradation.
Abstract: Graph-based image segmentation techniques are
considered among the most efficient segmentation techniques and
are mainly used as time- and space-efficient methods for real-time
applications. However, there is a need to focus on improving the
quality of the segmented images obtained from the earlier
graph-based methods. This paper proposes an improvement to the
graph-based image segmentation methods already described in the
literature. We contribute to the existing method by proposing the
use of a weighted Euclidean distance to calculate the edge weight,
which is the key element in building the graph. We also propose a
slight modification of the segmentation method already described in
the literature, which results in the selection of more prominent
edges in the graph. The experimental results show the improvement
in segmentation quality compared to the existing methods, with a
slight compromise in efficiency.
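The proposed edge weight can be illustrated with a small sketch (the luminance-style channel weights here are an assumption for illustration; the paper's actual weights may differ):

```python
def edge_weight(p, q, weights=(0.30, 0.59, 0.11)):
    # Weighted Euclidean distance between two RGB pixels: each squared
    # channel difference is scaled before summing, so perceptually
    # dominant channels contribute more to the edge weight.
    return sum(w * (a - b) ** 2 for w, a, b in zip(weights, p, q)) ** 0.5

w_same = edge_weight((120, 80, 40), (120, 80, 40))  # identical pixels
w_unit = edge_weight((0, 0, 0), (1, 1, 1))          # unit step per channel
```

In the graph, each pixel is a vertex and this weight labels the edge to its neighbors; the segmentation then merges regions across low-weight edges first.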
Abstract: In this article we discuss the improvement of the
multi-class classification problem using a multi-layer perceptron.
The considered approach consists of breaking down the n-class
problem into two-class subproblems. The training of each two-class
subproblem is done independently; in the test phase, the vector to
be classified is confronted with all the two-class models, and the
elected class is the strongest one, the one that does not lose any
competition with the other classes. Recognition rates obtained with
the multi-class approach by two-class decomposition are clearly
better than those obtained by the simple multi-class approach.
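The election rule can be sketched with pairwise voting (a common realization of the "loses no competition" rule; the toy 1-D threshold models below stand in for trained two-class perceptrons and are assumptions):

```python
def one_vs_one_predict(duels, classes, x):
    # 'duels' maps each class pair to a trained two-class model that
    # returns the winning class for input x.  Counting wins realizes
    # the "loses no duel" rule: a class that wins all its duels
    # necessarily has the strictly highest win count.
    wins = {c: 0 for c in classes}
    for clf in duels.values():
        wins[clf(x)] += 1
    return max(classes, key=lambda c: wins[c])

# Toy 1-D "models" standing in for the trained two-class perceptrons.
duels = {
    (0, 1): lambda x: 0 if x < 5 else 1,
    (0, 2): lambda x: 0 if x < 8 else 2,
    (1, 2): lambda x: 1 if x < 8 else 2,
}
label = one_vs_one_predict(duels, [0, 1, 2], 6)
```

An n-class problem yields n(n-1)/2 such duels, each trained independently as the abstract describes.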
Abstract: Logistics outsourcing is a growing trend and measuring its performance, a challenge. It must be consistent with the objectives set for logistics outsourcing, but we have found no objective-based performance measurement system. We have conducted a comprehensive review of the specialist literature to cover this gap, which has led us to identify and define these objectives. The outcome is that we have obtained a list of the most relevant objectives and their descriptions. This will enable us to analyse in a future study whether the indicators used for measuring logistics outsourcing performance are consistent with the objectives pursued with the outsourcing. If this is not the case, a proposal will be made for a set of financial and operational indicators to measure performance in logistics outsourcing that take the goals being pursued into account.
Abstract: An active inductor in CMOS technology with a supply voltage of 1.8 V is presented. The value of the inductance L can range from 0.12 nH to 0.25 nH at high frequency (HF). The proposed active inductor is designed in TSMC 0.18-um CMOS technology. The power dissipation of this inductor remains constant over all operating frequency bands, at around 20 mW from the 1.8 V power supply. Inductors implemented in integrated circuits occupy a much smaller area and have for this reason attracted researchers' attention for more than a decade. In this design we used Advanced Design System (ADS) to simulate the circuit.
Abstract: This paper demonstrates the results when either the
ShiftRows stage or the MixColumns stage, and when both stages, are
omitted in the well-known block cipher Advanced Encryption
Standard (AES) and its modified version, AES with Key-Dependent
S-box (AES-KDS), using the avalanche criterion and other tests,
namely encryption quality, correlation coefficient, histogram
analysis and key sensitivity tests.
Abstract: PARADIGMA (PARticipative Approach to DIsease
Global Management) is a pilot project which aims to develop and
demonstrate an Internet based reference framework to share scientific
resources and findings in the treatment of major diseases.
PARADIGMA defines and disseminates a common methodology and
optimised protocols (Clinical Pathways) to support service functions
directed to patients and individuals on matters like prevention,
post-hospitalisation support and awareness. PARADIGMA will provide a
platform of information services - user oriented and optimised
against social, cultural and technological constraints - supporting the
Health Care Global System of the Euro-Mediterranean Community
in a continuous improvement process.