Abstract: The success of IT projects that implement business application software depends strongly on efficient requirements management, which is needed to understand the business requirements and to realize them in IT. In practice, however, small and medium-sized enterprises (SMEs) in the IT sector do not fully exploit the potential of requirements management. To derive recommendations for action, and ultimately a solution that allows this potential to be exploited more fully, a scientific research project will examine which problems occur and what causes them. Equally important are the storage of knowledge gained from requirements management and its later reuse, which are needed to achieve sustainable improvements in the competitiveness of IT SMEs. Requirements engineering is one of the most important topics in product management for software, serving the goal of optimizing the success of the software product.
Abstract: Photo-crosslinked rice starch-based biodegradable films were prepared by casting the film solution on leveled trays and applying ultraviolet (UV) irradiation for 10 minutes. The effect of the photosensitiser (sodium benzoate) content (3, 6 and 9 wt.%) on the mechanical properties, water vapor permeability (WVP) and structural properties of the rice starch films was investigated. Tensile strength increased, while elongation at break and water resistance decreased, with the addition and increasing content of photosensitiser. The percentage crystallinity of the films decreased as the photosensitiser content increased and UV irradiation was applied. The FTIR spectra of the films showed the carboxylate group band of sodium benzoate, and incorporation of 6% photosensitiser produced the highest absorption band in the resulting films, indicating that the strongest interaction between the starch molecules occurred at this content.
Abstract: The gases generated in oil-filled transformers can be used for the qualitative determination of incipient faults. Dissolved Gas Analysis (DGA) has been widely used by utilities throughout the world as the primary diagnostic tool for transformer maintenance. In this paper, the various artificial intelligence techniques that researchers have applied in the past are reviewed, conclusions are drawn, and a sequential hybrid system is proposed. The synergy of an artificial neural network (ANN) and a fuzzy inference system (FIS) can be a good solution for reliable fault prediction, because one should not rely on a single technology when dealing with real-life applications.
Abstract: In this paper, fully developed flow and heat transfer of viscoelastic materials in curved ducts of square cross-section under constant heat flux are investigated. A staggered mesh is used as the computational grid, and the flow and heat transfer parameters are allocated on this mesh with the marker-and-cell method. The governing equations are solved numerically with the FTCS finite difference method. The Criminale-Ericksen-Filbey (CEF) constitutive equation is used as the viscoelastic model. The CEF equation is well suited to studying steady shear flow of viscoelastic materials, since it captures the effects of both the first and second normal stress differences. It is shown that the first and second normal stress differences have a noticeable and opposing effect on the intensity of the secondary flows and on the mean Nusselt number, which is the main novelty of the current research.
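The FTCS (Forward-Time Centered-Space) scheme named above can be illustrated on the much simpler 1D heat equation; this is a minimal sketch for orientation only, not the paper's 3D curved-duct solver with the CEF model:

```python
import numpy as np

def ftcs_heat_1d(u0, alpha, dx, dt, steps):
    """Advance the 1D heat equation u_t = alpha * u_xx with the FTCS scheme.

    FTCS is an explicit update, stable only for r = alpha*dt/dx**2 <= 0.5.
    Dirichlet boundary values are held fixed.
    """
    r = alpha * dt / dx**2
    assert r <= 0.5, "FTCS stability limit violated"
    u = np.array(u0, dtype=float)
    for _ in range(steps):
        # RHS is evaluated on the old field before assignment (explicit step)
        u[1:-1] = u[1:-1] + r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u

# toy run: an initial spike diffuses and stays symmetric
u = ftcs_heat_1d([0.0, 0.0, 1.0, 0.0, 0.0], alpha=1.0, dx=1.0, dt=0.25, steps=10)
```

The same explicit time-marching idea extends to the coupled momentum and energy equations on the staggered mesh, at the cost of the corresponding stability restrictions on the time step.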
Abstract: Modeling a distributed system allows us to represent its entire functionality. A working instance of the system, however, rarely exercises all of the functionality represented by the model; some parts of this functionality usually need to be accessible only periodically. A reporting system based on the data warehouse concept seems an intuitive example of a system in which some functionality is required only from time to time. When analyzing the enterprise risk associated with such periodic changes in system functionality, we should consider not only the inaccessibility of components (objects) but also of their functions (methods), and the impact of such a situation on the system's functionality from a business point of view. In this paper we suggest that these risk attributes should be estimated from risk attributes specified at the requirements level (use cases in the UML model), on the basis of information about the structure of the model (presented at the other levels of the UML model). We argue that it is desirable to consider the influence of periodic changes in requirements on the enterprise risk estimate. Finally, a proposed solution based on the UML system model is presented.
Abstract: Trends in business intelligence, e-commerce and remote access make it both necessary and practical to store data in different ways on multiple systems running different operating systems. As businesses evolve and grow, they require efficient computerized solutions to update data and to access data from diverse enterprise business applications. The objective of this paper is to demonstrate the capability of DTS [1] as a database solution for automatic data transfer and update in solving business problems. The DTS package described here was developed for the sale of a variety of plants and eventually expanded into a commercial supply and landscaping business. Dimensional data modeling is used in the DTS package to extract, transform and load data from heterogeneous database systems such as MySQL, Microsoft Access and Oracle, consolidating it into a data mart residing in SQL Server. The data transfer from the various databases is scheduled to run automatically every quarter so that sales can be analyzed efficiently. DTS is therefore an attractive solution for automatic data transfer and update that meets today's business needs.
Abstract: The Artificial Immune System (AIS) has been applied as a heuristic algorithm for decades. Many of these applications have exploited the benefits of the algorithm, but few have proposed approaches for enhancing its efficiency. In this paper, a self-evolving Artificial Immune System is proposed: the T-cell and B-cell mechanisms of the immune system are developed further, and a self-evolving mechanism is built to cope with problems of differing complexity. The research focuses on enhancing the efficiency of clonal selection, which is responsible for producing the affinities that resist invading antigens. T cells and B cells are the main mechanisms by which clonal selection produces different combinations of antibodies, so their development influences the efficiency of clonal selection in searching for better solutions. Furthermore, to improve the cooperation between the two cell types, a co-evolutionary strategy is applied to coordinate them for more effective production of antibodies. Flow-shop scheduling instances from the OR-Library are finally adopted to validate the proposed algorithm.
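A basic clonal selection loop can be sketched as follows. This is a hedged toy on a continuous test function, not the paper's self-evolving T/B-cell variant or its flow-shop encoding; the population size, clone count and mutation schedule are illustrative assumptions:

```python
import random

def clonal_selection(fitness, dim, pop=20, gens=50, clones=5, seed=0):
    """Minimal clonal-selection sketch (hypothetical parameters):
    antibodies are real vectors, the fitter half is cloned and
    hypermutated, and an improving mutant replaces its parent."""
    rng = random.Random(seed)
    ab = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        ab.sort(key=fitness)                 # best (lowest cost) first
        for i in range(pop // 2):            # clone the better half
            best = ab[i]
            rate = 0.5 * (i + 1) / pop       # weaker parents mutate more
            for _ in range(clones):
                clone = [x + rng.gauss(0, rate) for x in best]
                if fitness(clone) < fitness(best):
                    ab[i] = clone
                    best = clone
        # replace the worst antibodies with fresh random ones (diversity)
        for i in range(pop - 2, pop):
            ab[i] = [rng.uniform(-5, 5) for _ in range(dim)]
    return min(ab, key=fitness)

sphere = lambda v: sum(x * x for x in v)     # toy objective
best = clonal_selection(sphere, dim=3)
```

For combinatorial problems such as flow-shop scheduling, the antibody would instead be a job permutation and hypermutation a swap or insertion move; the selection/cloning skeleton stays the same.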
Abstract: In this work, we consider a deterministic model for the transmission of leptospirosis, which is currently spreading in the Thai population. An SIR model incorporating the features of this disease is applied to epidemiological data from Thailand. The numerical solutions of the SIR equations are seen to be in good agreement with the real empirical data. Further improvements are discussed.
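The standard SIR equations can be integrated numerically with a simple forward-Euler step, as sketched below; the β and γ values here are illustrative, not the parameters fitted to the Thai data:

```python
def sir_simulate(beta, gamma, s0, i0, r0, days, dt=0.1):
    """Forward-Euler integration of the standard SIR model
        dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I
    with S, I, R as population fractions (S + I + R = 1).
    """
    s, i, r = s0, i0, r0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt    # newly infected this step
        new_rec = gamma * i * dt       # newly recovered this step
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return s, i, r

# illustrative run with basic reproduction number R0 = beta/gamma = 4
s, i, r = sir_simulate(beta=0.4, gamma=0.1, s0=0.99, i0=0.01, r0=0.0, days=200)
```

By construction the update conserves S + I + R exactly, and with R0 > 1 the simulated epidemic rises, peaks and burns out, which is the qualitative behavior compared against the empirical data.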
Abstract: This paper investigates the problems associated with enhancing the recovery of light crude oil-water emulsions in an oil field of the Algerian Sahara. Measurements were taken in experiments using a RheoStress rheometer (RS600). The effects of shear rate, temperature and light-oil concentration on the viscosity behavior were considered. Experimental measurements were performed in terms of shear stress versus shear rate, yield stress and flow index on mixtures of light crude oil and water. The emulsions exhibited non-Newtonian shear-thinning behavior (Herschel-Bulkley). The laboratory experiments showed that some water-in-light-crude-oil emulsions formed during the oil recovery process are stable. Breaking these emulsions with additives may involve higher costs and could be very expensive; further research should therefore be directed at finding solutions to the problems encountered.
Abstract: Many current industrial local area networks can guarantee bandwidth to synchronous traffic, in particular by providing Constant Bit Rate (CBR) channels, which allow improved bandwidth management. Some of these networks operate over Ethernet and deliver channels with sufficient capacity, especially when compression is used, to integrate multimedia traffic from many sources in industrial monitoring and image processing applications. In such industrial environments, where low latency is an essential requirement, JPEG is an adequate compression technique, but it generates Variable Bit Rate (VBR) traffic. Transmitting VBR traffic over CBR channels is inefficient, and current solutions to this problem significantly increase the latency or further degrade the quality. In this paper an R(q) model is used that allows on-line calculation of the JPEG quantization factor. We obtained increased quality, a lower capacity requirement for the CBR channel and a reduced number of discarded frames, along with better use of the channel bandwidth.
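The on-line control loop can be illustrated as follows. This is a hedged sketch only: the paper's actual R(q) model is not reproduced here, so a hypothetical one-parameter model R(q) ≈ a/q is assumed, with q acting as a quantization scaling factor (larger q, coarser quantization, smaller frame):

```python
def choose_quality(prev_q, prev_bytes, budget_bytes, q_min=5, q_max=95):
    """Pick the quantization factor for the next frame so that its
    predicted size fits the CBR channel budget.

    Assumed model (hypothetical, for illustration): frame size R(q) ~ a/q,
    with a estimated from the previous frame's (q, size) pair.
    """
    a = prev_bytes * prev_q               # fit the one-parameter model
    q = a / budget_bytes                  # invert R(q) = budget
    return max(q_min, min(q_max, int(round(q))))

# previous frame: q=50 produced 12000 bytes; channel allows 8000 bytes/frame,
# so the controller coarsens quantization to q=75 for the next frame
q = choose_quality(prev_q=50, prev_bytes=12000, budget_bytes=8000)
```

Because q is recomputed per frame from the most recent observation, the encoder tracks scene changes on-line instead of discarding frames or buffering (which would add latency).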
Abstract: In an era of knowledge explosion, the volume of data grows rapidly day by day. Since data storage is a limited resource, reducing the space that data occupies during processing becomes a challenging issue. Data compression provides a good solution that can lower the required space. Data mining has found many useful applications in recent years because it helps users discover interesting knowledge in large databases. However, existing compression algorithms are not appropriate for data mining. In [1, 2], two different approaches were proposed to compress databases and then perform the mining process; however, both lack the ability to decompress the data to its original state and to improve data mining performance. In this research a new approach, Mining Merged Transactions with the Quantification Table (M2TQT), is proposed to solve these problems. M2TQT uses the relationships between transactions to merge related transactions, and builds a quantification table to prune the candidate itemsets that cannot become frequent, in order to improve the performance of mining association rules. Experiments show that M2TQT outperforms existing approaches.
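The merging idea can be sketched as follows. This is a hedged toy, assuming the quantification table simply records the multiplicity of each merged transaction; the paper's actual merging and pruning rules are not reproduced:

```python
from collections import Counter
from itertools import combinations

def merge_transactions(db):
    """Merge identical transactions into one entry with a multiplicity,
    so later support counting touches fewer records."""
    return Counter(frozenset(t) for t in db)

def frequent_itemsets(merged, min_support, k=2):
    """Count k-itemset support over the merged transactions, weighting
    each merged entry by its multiplicity, and keep the frequent ones."""
    counts = Counter()
    for trans, mult in merged.items():
        for combo in combinations(sorted(trans), k):
            counts[combo] += mult
    return {c: n for c, n in counts.items() if n >= min_support}

db = [["a", "b"], ["a", "b"], ["a", "c"], ["a", "b", "c"]]
merged = merge_transactions(db)            # 4 transactions -> 3 entries
freq = frequent_itemsets(merged, min_support=3)
```

Even in this toy, duplicate transactions collapse before counting, which is the source of the speed-up: the support loop runs over merged entries rather than the raw database.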
Abstract: Abrasive waterjet machining is a novel process capable of handling a wide range of hard-to-machine materials. This research addresses the modeling and optimization of the process parameters for this machining technique. To model the process, a set of experimental data was used to evaluate the effects of various parameter settings in cutting 6063-T6 aluminum alloy. The process variables considered were nozzle diameter, jet traverse rate, jet pressure and abrasive flow rate. Depth of cut, one of the most important output characteristics, was evaluated for the different parameter settings. The Taguchi method and regression modeling were used to establish the relationships between the input and output parameters, and the adequacy of the model was evaluated using the analysis of variance (ANOVA) technique. The pairwise effects of the process parameter settings on the process outputs are also shown graphically. The proposed model was then embedded in a simulated annealing algorithm to optimize the process parameters; the objective is to determine the parameter levels that yield any desired depth of cut. Computational results demonstrate that the proposed solution procedure is quite effective for such multi-variable problems.
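The optimization step can be sketched with a generic simulated annealing loop. The paper's fitted regression model and parameter bounds are not reproduced; a toy quadratic stands in for the objective, and the step size, temperature and cooling rate are illustrative assumptions:

```python
import math
import random

def simulated_annealing(cost, x0, step=0.5, t0=1.0, cooling=0.95,
                        iters=500, seed=1):
    """Generic simulated-annealing sketch: perturb the current setting,
    always accept improvements, accept worse moves with probability
    exp(-delta/T), and cool T geometrically."""
    rng = random.Random(seed)
    x, fx = list(x0), cost(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = cost(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling
    return best, fbest

# toy objective standing in for "squared error between the regression
# model's predicted depth of cut and the desired depth of cut"
target = lambda v: (v[0] - 2.0) ** 2 + (v[1] + 1.0) ** 2
best, fbest = simulated_annealing(target, [0.0, 0.0])
```

In the paper's setting, `cost` would penalize the deviation of the regression-predicted depth of cut from the desired value, with candidates clipped to the feasible ranges of nozzle diameter, traverse rate, pressure and abrasive flow rate.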
Abstract: Image retrieval is a topic of high current scientific interest. The important steps in an image retrieval system are the extraction of discriminative features and a feasible similarity metric for retrieving the database images that are similar in content to the query image. Gabor filtering is a widely adopted technique for extracting features from texture images. The recently proposed sparsity-promoting l1-norm minimization technique finds the sparsest solution of an under-determined system of linear equations. In this paper, l1-norm minimization is used as a similarity metric for image retrieval. Simulation results demonstrate that it provides a promising alternative to existing similarity metrics; in particular, the cases in which it works better than the Euclidean distance metric are singled out.
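The retrieval idea, expressing the query as a sparse combination of database feature vectors and returning the image with the dominant coefficient, can be sketched as follows. As a plainly named substitute, greedy orthogonal matching pursuit is used here in place of the exact l1-norm (basis-pursuit) solver, which needs a linear-programming backend:

```python
import numpy as np

def omp(A, y, n_nonzero=2):
    """Orthogonal matching pursuit: greedily pick the dictionary column
    most correlated with the residual, re-fit the selected columns by
    least squares, and repeat."""
    residual, support = y.astype(float), []
    for _ in range(n_nonzero):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

# columns = feature vectors of database images (toy values); y = query
A = np.array([[1.0, 0.0, 0.7],
              [0.0, 1.0, 0.7],
              [0.0, 0.0, 0.1]])
y = A[:, 2] * 2.0                   # query matches (a scaled) image 2
x = omp(A, y, n_nonzero=1)
best_match = int(np.argmax(np.abs(x)))
```

In practice the columns of A would be Gabor feature vectors of the database images, and ranking by coefficient magnitude replaces ranking by Euclidean distance.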
Abstract: Road traffic accidents are a major cause of death worldwide. In an attempt to reduce accidents, some research efforts have focused on creating Advanced Driver Assistance Systems (ADAS) able to detect vehicle, driver and environmental conditions and to use this information to identify cues for potential accidents. This paper presents continued work on a novel Non-intrusive Intelligent Driver Assistance and Safety System (Ni-DASS) for assessing the driver's point of regard within the vehicle. It uses an on-board CCD camera to observe the driver's face. A template matching approach is used to compare the driver's eye-gaze pattern with a set of eye-gesture templates of the driver looking at different focal points within the vehicle. The windscreen is divided into cells, and comparing the driver's eye-gaze pattern with templates of the driver's eyes looking at each cell determines the driver's point of regard on the windscreen. Results indicate that the proposed technique could be useful in situations where low-resolution estimates of the driver's point of regard are adequate, for instance to allow ADAS systems to alert the driver if he or she has clearly failed to observe a hazard.
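The per-cell matching step can be sketched with normalized cross-correlation, a common score for template matching; this is a hedged illustration with tiny synthetic patches, and the camera calibration and template acquisition of Ni-DASS are outside its scope:

```python
import numpy as np

def ncc(patch, template):
    """Zero-mean normalized cross-correlation between two equally sized
    patches; returns a score in [-1, 1], with 1 for a perfect match."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def best_cell(eye_patch, cell_templates):
    """Score the observed eye patch against the stored template for each
    windscreen cell and return the cell with the highest NCC score."""
    scores = {cell: ncc(eye_patch, tpl) for cell, tpl in cell_templates.items()}
    return max(scores, key=scores.get)

# toy 3x3 "eye patches": the observation matches cell B's template
templates = {"A": np.array([[0, 1, 0], [1, 1, 1], [0, 1, 0]], float),
             "B": np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]], float)}
observed = templates["B"] + 0.01     # brightness offset; NCC is invariant to it
cell = best_cell(observed, templates)
```

Because the score is zero-mean and normalized, uniform illumination changes between template capture and observation do not affect the ranking, which matters inside a vehicle.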
Abstract: An adaptive fuzzy inference perceptual model is proposed for the watermarking of digital images. The model depends on the human visual characteristics of image sub-regions in the multi-resolution frequency (wavelet) domain. In the proposed model, a multi-variable fuzzy architecture is designed to produce a perceptual membership degree both for the candidate embedding sub-regions and for the watermark embedding strength factor. Benchmark images of different sizes, with watermarks of different sizes, were applied to the model. Several experimental attacks, such as JPEG compression, noise and rotation, were applied to verify the robustness of the scheme, and the model was compared with other watermarking schemes. The proposed model proved robust to attacks while at the same time achieving a high level of imperceptibility.
Abstract: Nowadays companies strive to survive in a competitive global environment. To speed up product development and modification, a collaborative product development approach is recommended. However, despite recent IT improvements, many CAx systems still work separately and locally. Collaborative design and manufacture require a product information model that supports the related CAx product data models. Many solutions to this problem have been proposed, the most successful of which is to adopt the STEP standard as the product data model on which to develop a collaborative CAx platform. Several obstacles must nevertheless be considered: the evolution of the STEP Application Protocols (APs) over time, the huge number of STEP APs and conformance classes, the high costs of implementation, the costly conversion of legacy CAx files to the STEP neutral file format, and a general lack of STEP knowledge, all of which usually slow the adoption of the STEP standard for collaborative data exchange, management and integration. In this paper the requirements for a successful collaborative CAx system are discussed. The capability of the STEP standard for product data integration, its shortcomings, and the dominant platforms for supporting CAx collaboration management and product data integration are reviewed. Finally, a platform named LAYMOD is proposed to fulfil the requirements of a collaborative CAx environment and to integrate the product data. LAYMOD is a layered platform enabling global collaboration among different CAx software packages and developers. It adopts the STEP modular architecture and XML data structures to enable collaboration between CAx software packages while overcoming the limitations of the STEP standard. The architecture and procedures by which the LAYMOD platform manages collaboration and avoids contradictions in product data integration are described.
Abstract: Due to the coexistence of different Radio Access Technologies (RATs), Next Generation Wireless Networks (NGWN) are predicted to be heterogeneous in nature. This coexistence requires Common Radio Resource Management (CRRM) to support the provision of Quality of Service (QoS) and the efficient utilization of radio resources. RAT selection algorithms are part of the CRRM algorithms: their role, simply put, is to verify whether an incoming call can fit into the heterogeneous wireless network, to decide which of the available RATs is most suitable for the needs of the incoming call, and to admit it. The goal of a RAT selection algorithm is to guarantee the QoS requirements of all accepted calls while at the same time providing the most efficient utilization of the available radio resources. Conventional call admission control algorithms are designed for homogeneous wireless networks and do not provide a solution that fits the heterogeneous wireless networks that characterize the NGWN; RAT selection algorithms for heterogeneous networks therefore need to be developed. In this paper, we propose an approach to RAT selection that comprises receiving different criteria, assessing them and making decisions, and then selecting the most suitable RAT for incoming calls. A comprehensive survey of existing RAT selection algorithms for heterogeneous wireless networks is also presented.
Abstract: In the gas refineries of Iran's South Pars Gas Complex, the Sulfrex demercaptanization process is used to remove volatile and corrosive mercaptans from liquefied petroleum gas (LPG) with a caustic solution. The process consists of two steps: extraction of low-molecular-weight mercaptans and regeneration of the spent caustic. The effective factors include the LPG feed temperature, caustic concentration and mercaptan content of the feed in the extraction step, and the sodium mercaptide content of the caustic, catalyst concentration, caustic temperature and air injection rate in the regeneration step. This paper focuses on temperature, which plays a key role in both mercaptan extraction and caustic regeneration. The experimental results demonstrate that, by optimizing the temperature, the sodium mercaptide content of the caustic is minimized through good oxidation and the sulfur impurities in the product are reduced.
Abstract: In this paper, we study a new modified Novikov equation for its classical and nonclassical symmetries, and use the symmetries to reduce it to a nonlinear ordinary differential equation (ODE). With the aid of solutions of the nonlinear ODE obtained by the recently proposed modified (G'/G)-expansion method, multiple exact traveling wave solutions are obtained; these traveling wave solutions are expressed in terms of hyperbolic, trigonometric and rational functions.
Abstract: The hydrodynamic pressures acting on the upstream face of a concrete dam during an earthquake are an important factor in designing and assessing the safety of these structures in earthquake regions. Owing to inherent complexities, assessing the exact hydrodynamic pressure is only feasible for problems with simple geometry. In this research, the governing equation of concrete gravity dam reservoirs, including the effect of fluid viscosity, is solved in the frequency domain and compared with the solution in which viscosity is assumed to be zero. The results show that viscosity influences the reservoir's natural frequencies, and that at excitation frequencies near these natural frequencies the hydrodynamic pressure differs considerably from the inviscid result.