Abstract: A typical intelligent decision support system rests on four components: Data Warehouse (DW), Online Analytical Processing (OLAP), Data Mining (DM), and model-based decision support, an architecture known as the Decision Support System Based on Data Warehouse (DSSBDW). It takes ETL, OLAP, and DM as its implementation means and integrates traditional model-driven and data-driven DSS into a whole. This paper analyzes the DSSBDW architecture and the DW model, and discusses the following key issues: ETL design and realization; metadata management using XML; and SQL implementation, performance optimization, and data mapping in OLAP. Finally, it illustrates the design principles and methods of the DW in DSSBDW.
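The ETL stage named above can be pictured as a minimal extract-transform-load pipeline. The sketch below is purely illustrative; the record layout, field names, and cleansing rules are hypothetical and not taken from the paper.

```python
# Minimal illustrative ETL sketch (hypothetical record layout and rules;
# the paper's actual schemas are not given in the abstract).

def extract(rows):
    """Extract: pull raw operational records (here, plain dicts)."""
    return list(rows)

def transform(rows):
    """Transform: cleanse and conform records for the warehouse."""
    out = []
    for r in rows:
        if r.get("amount") is None:                     # drop incomplete records
            continue
        out.append({
            "customer": r["customer"].strip().title(),  # conform names
            "amount": round(float(r["amount"]), 2),     # normalize numeric form
        })
    return out

def load(rows, warehouse):
    """Load: append conformed records to a warehouse fact table."""
    warehouse.setdefault("fact_sales", []).extend(rows)
    return warehouse

raw = [{"customer": "  alice ", "amount": "19.5"},
       {"customer": "bob", "amount": None}]
dw = load(transform(extract(raw)), {})
```

In a real DSSBDW the load step would target warehouse tables via SQL rather than an in-memory dict; the three-function split is the point of the sketch.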
Abstract: In this paper, the application of artificial intelligence to baby and child care is studied, and a new idea for injury prevention and safety warning based on digital image processing is presented. The paper describes the structure of the proposed system, which assesses the danger posed to children and babies in yards, gardens, swimming pools, and similar settings. In the presented approach, a multi-camera system is used: the received video streams are processed to find the hazardous areas, and the entrance of children and babies into those areas is then analyzed. When an entry is detected, the system performs a programmed action such as capturing an image, raising an alarm or tone, or sending a message.
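The hazard-entry check described above can be sketched very simply. In this illustration the hazardous areas are assumed to be axis-aligned rectangles produced by a prior processing step, and a detected child is represented by the centre of a bounding box; all names, coordinates, and the action list are hypothetical.

```python
# Hypothetical sketch of the hazard-entry logic: a detection whose centre
# falls inside a hazardous rectangle triggers the programmed actions.

def inside(point, rect):
    """True if point (x, y) lies inside rect (x0, y0, x1, y1)."""
    x, y = point
    x0, y0, x1, y1 = rect
    return x0 <= x <= x1 and y0 <= y <= y1

def check_frame(detections, hazard_rects):
    """Return the programmed actions triggered by this frame."""
    actions = []
    for det in detections:
        if any(inside(det, r) for r in hazard_rects):
            actions.append(("capture", det))   # capture an image
            actions.append(("alarm", det))     # raise an alarm or tone
    return actions

hazards = [(100, 100, 200, 200)]               # e.g. a pool region
alerts = check_frame([(150, 150), (50, 50)], hazards)
```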
Abstract: This paper presents a review of vision-aided systems and proposes an approach to visual rehabilitation using stereo vision technology. The proposed system combines stereo vision, image processing, and a sonification procedure to support blind navigation. The developed system comprises a wearable computer, stereo cameras as vision sensors, and stereo earphones, all mounted in a helmet. The vision sensors capture the scene in front of the visually handicapped user, and the captured images are processed to enhance the features important for navigation assistance. The image processing is designed as a model of human vision, identifying obstacles and their depth information. The processed image is mapped onto musical stereo sound so that the blind user can understand the scene ahead. The developed method has been tested in indoor and outdoor environments, and the proposed image processing methodology is found to be effective for object identification.
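One plausible form of the depth-to-sound mapping described above is sketched below: a nearer obstacle maps to a louder, higher-pitched tone, and its horizontal image position maps to the left/right stereo balance. The mapping constants and function are hypothetical stand-ins, not the paper's actual sonification procedure.

```python
# Hypothetical sonification sketch: obstacle position -> stereo pan,
# obstacle depth -> loudness and pitch.

def sonify(obstacle_x, depth_m, image_width=640, max_depth_m=10.0):
    """Map one obstacle to (left_gain, right_gain, frequency_hz)."""
    pan = obstacle_x / image_width        # 0.0 = far left, 1.0 = far right
    nearness = max(0.0, 1.0 - depth_m / max_depth_m)
    loudness = nearness                   # closer obstacle -> louder tone
    freq = 200.0 + 800.0 * nearness       # closer obstacle -> higher pitch
    return ((1.0 - pan) * loudness, pan * loudness, freq)

# An obstacle on the right of the image, 2.5 m away:
left, right, freq = sonify(obstacle_x=480, depth_m=2.5)
```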
Abstract: Information on weed distribution within a field is necessary to implement spatially variable herbicide application. Since hand labor is costly, an automated weed control system could be feasible. This paper deals with the development of an algorithm for a real-time specific weed recognition system based on histogram analysis of images, which is used for the weed classification. The algorithm is specifically developed to classify images into broad-leaf and narrow-leaf classes for real-time selective herbicide application. The developed system has been tested on weeds in the laboratory, and the tests show it to be very effective in weed identification. Further, the results show very reliable performance on images of weeds taken under varying field conditions. Analysis of the results shows over 95 percent classification accuracy on 140 sample images (broad and narrow), with 70 samples from each category of weeds.
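The abstract does not give the exact histogram features used, so the toy sketch below illustrates only the general idea of a histogram-based broad/narrow rule: a hypothetical classifier that looks at how concentrated the vegetation pixels are across image columns (compact broad-leaf blobs versus dispersed grassy blades). The threshold and feature are assumptions for illustration.

```python
# Toy histogram-based broad/narrow classification (hypothetical rule).

def column_histogram(mask):
    """Count vegetation pixels (1s) in each column of a binary mask."""
    return [sum(row[c] for row in mask) for c in range(len(mask[0]))]

def classify(mask, spread_threshold=0.5):
    """'broad' if vegetation occupies few columns, else 'narrow'."""
    hist = column_histogram(mask)
    occupied = sum(1 for h in hist if h > 0)
    spread = occupied / len(hist)
    return "broad" if spread < spread_threshold else "narrow"

blob = [[0, 1, 1, 0, 0, 0],    # one compact leaf blob
        [0, 1, 1, 0, 0, 0]]
grass = [[1, 0, 1, 0, 1, 0],   # thin blades spread across the image
         [0, 1, 0, 1, 0, 1]]
```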
Abstract: To deliver a high level of expertise, the computer-aided design of mechanical systems involves specific activities focused on processing two types of information: knowledge and data. Expert rule-based knowledge generally processes qualitative information and involves searching for suitable solutions and combining them into a synthetic variant. Data processing is based on computational models and is intended to be interrelated with the reasoning performed in the knowledge processing. In this paper an intelligent integrated system is proposed with the objective of choosing the adequate material. The software is developed in the Prolog-based Flex software and takes into account various constraints that arise in the accurate operation of gears.
Abstract: The mitigation of crop loss due to damaging freezes
requires accurate air temperature prediction models. Previous work
established that the Ward-style artificial neural network (ANN) is a
suitable tool for developing such models. The current research
focused on developing ANN models with reduced average prediction
error by increasing the number of distinct observations used in
training, adding additional input terms that describe the date of an
observation, increasing the duration of prior weather data included in
each observation, and reexamining the number of hidden nodes used
in the network. Models were created to predict air temperature at
hourly intervals from one to 12 hours ahead. Each ANN model,
consisting of a network architecture and set of associated parameters,
was evaluated by instantiating and training 30 networks and
calculating the mean absolute error (MAE) of the resulting networks
for a given set of input patterns. The inclusion of seasonal input terms,
up to 24 hours of prior weather information, and a larger number of
processing nodes were some of the improvements that reduced
average prediction error compared to previous research across all
horizons. For example, the four-hour MAE of 1.40°C was 0.20°C, or 12.5%, less than that of the previous model. Prediction MAEs eight and 12
hours ahead improved by 0.17°C and 0.16°C, respectively,
improvements of 7.4% and 5.9% over the existing model at these
horizons. Networks instantiating the same model but with different
initial random weights often led to different prediction errors. These
results strongly suggest that ANN model developers should consider
instantiating and training multiple networks with different initial
weights to establish preferred model parameters.
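The evaluation protocol described above, instantiating the same architecture several times with different random initial weights and recording the MAE of each instance, can be sketched in numpy. The toy data, architecture, and hyperparameters below are illustrative, not the paper's Ward-style model, and five instances stand in for the paper's thirty.

```python
import numpy as np

# Illustrative multi-seed evaluation: train one tiny MLP per seed and
# record its mean absolute error (MAE) on the training patterns.

def train_mae(seed, X, y, hidden=8, lr=0.05, steps=500):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1))
    b2 = np.zeros(1)
    for _ in range(steps):
        h = np.tanh(X @ W1 + b1)            # forward pass
        pred = h @ W2 + b2
        err = pred - y
        # backpropagation for the mean-squared-error loss
        gW2 = h.T @ err / len(X)
        gb2 = err.mean(axis=0)
        gh = (err @ W2.T) * (1 - h ** 2)
        gW1 = X.T @ gh / len(X)
        gb1 = gh.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return float(np.abs(pred - y).mean())   # MAE of this instantiation

X = np.linspace(-1, 1, 64).reshape(-1, 1)   # toy input patterns
y = np.sin(3 * X)                           # toy target "temperature"
maes = [train_mae(seed, X, y) for seed in range(5)]
```

Comparing the spread of `maes` across seeds is exactly the point the abstract makes: identical architectures with different initial weights land at different prediction errors, so model parameters should be chosen from multiple instantiations.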
Abstract: Truncated multiplier is a good candidate for digital
signal processing (DSP) applications including finite impulse
response (FIR) and discrete cosine transform (DCT). Through
truncated multiplier a significant reduction in Field Programmable
Gate Array (FPGA) resources can be achieved. This paper presents
for the first time a comparison of resource utilization of Spartan-3AN
and Virtex-5 implementation of standard and truncated multipliers
using Very High Speed Integrated Circuit Hardware Description
Language (VHDL). The Virtex-5 FPGA shows significant improvement over the Spartan-3AN device. On the Virtex-5, the percentage ratio of occupied slices between the standard and truncated multipliers increases from 40% to 73.86%, whereas on the Spartan-3AN it decreases from 68.75% to 58.78%. The results also show that the anomalies in average connection delay and maximum pin delay observed on the Spartan-3AN device are efficiently reduced on the Virtex-5 device.
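The resource saving of a truncated multiplier comes from not forming the low-order partial-product columns. The bit-level sketch below illustrates that idea for an unsigned multiplier; the column cut-off and parameters are hypothetical and do not reproduce the specific architecture compared in the paper.

```python
# Illustrative truncated unsigned multiplier: partial-product columns below
# a cut-off are discarded, trading a bounded arithmetic error for fewer
# adder cells (and hence fewer FPGA slices).

def truncated_multiply(a, b, width=8, keep=8):
    """Multiply two `width`-bit ints, forming only the top `keep` of the
    2*width partial-product columns."""
    cut = 2 * width - keep                  # first column index that is kept
    total = 0
    for i in range(width):                  # bit i of a
        for j in range(width):              # bit j of b
            if i + j >= cut and (a >> i) & 1 and (b >> j) & 1:
                total += 1 << (i + j)       # kept partial-product bit
    return total

exact = 200 * 180
approx = truncated_multiply(200, 180, width=8, keep=8)
```

The dropped columns bound the error below one unit in the lowest kept column, which is why truncation is acceptable in DSP datapaths such as FIR filters and the DCT.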
Abstract: Super resolution (SR) technologies are now being
applied to video to improve resolution. Some TV sets are now
equipped with SR functions. However, it is not known whether super resolution image reconstruction (SRR) really works for TV. Super resolution with non-linear signal processing (SRNL) has recently been proposed. SRR and SRNL are the only methods capable of processing video signals in real time. The results of subjective assessments of SRR and SRNL are described in this paper. SRR video
was produced in simulations with quarter precision motion vectors and
100 iterations. These are ideal conditions for SRR. We found that the
image quality of SRNL is better than that of SRR even though SRR
was processed under ideal conditions.
Abstract: The Mahin area is part of the Tarom-Hashtjin zone, located west of Qazvin province in northwestern Iran. Many copper and base-metal ore deposits are hosted by this zone, so identifying high-potential localities in the area is essential. The objective of this research is to find hydrothermal alteration zones by remote sensing methods and to determine the best processing technique for Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data. Different methods such as band ratios, Principal Component Analysis (PCA), Minimum Noise Fraction (MNF), and Least Squares Fit (LS-Fit) were used for mapping the hydrothermal alteration zones.
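The PCA step listed above treats spectral bands as variables and pixels as observations, ordering components by explained variance so that correlated alteration signals concentrate in the leading components. The numpy sketch below runs PCA on a synthetic three-band "scene" standing in for ASTER data; the data and band count are illustrative.

```python
import numpy as np

# Minimal PCA on a synthetic multispectral image (pixels x bands).

rng = np.random.default_rng(0)
signal = rng.normal(size=(1000, 1))                    # shared spectral signal
bands = np.hstack([2.0 * signal,                       # band 1: strong signal
                   -1.0 * signal,                      # band 2: anti-correlated
                   0.1 * rng.normal(size=(1000, 1))])  # band 3: noise

X = bands - bands.mean(axis=0)              # centre each band
cov = X.T @ X / (len(X) - 1)                # band covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)      # eigh returns ascending order
order = np.argsort(eigvals)[::-1]           # sort by explained variance
pcs = X @ eigvecs[:, order]                 # principal-component "images"
explained = eigvals[order] / eigvals.sum()
```

In alteration mapping the interesting information is often in a specific component (or in MNF components, which additionally whiten the noise), which is why several transforms are compared in the study.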
Abstract: This research presents the development of a simulation model for WIP management in semiconductor fabrication. Manufacturing simulation modeling is needed for productivity optimization analysis because the process flows are complex: more than 35 percent of the processing steps are re-entrant, revisiting the same equipment more than 15 times. Furthermore, semiconductor fabrication must produce a high product mix, with total processing steps varying from 300 to 800 and cycle times between 30 and 70 days. Besides this complexity, the high cost of wafers, which can erode the company's profit margin whenever a due date is missed, is another motivation to explore simulation modeling for such analyses. In this paper, the simulation model is developed on the existing commercial software platform AutoSched AP, with customized integration with the Manufacturing Execution System (MES) and Advanced Productivity Family (APF) for the data collection used to configure the model parameters and data sources. Model parameters such as per-step cycle time, equipment performance, handling time, and operator efficiency are collected through this customization. Once the parameters are validated, a few further customizations are made to ensure the prior model executes correctly. The accuracy of the simulation model is validated against the actual daily output of all equipment; the comparison shows that the simulation model achieves 95 percent accuracy over 30 days. The model was later used to perform various what-if analyses to understand impacts on cycle time and overall output. With this simulation model, a complex manufacturing environment such as a semiconductor fab now has an alternative means of validating the impact of any new requirement.
Abstract: To compress, improve bit-error performance, and enhance 2-D images, a new scheme called the Iterative Cellular-Turbo System (IC-TS) is introduced. In IC-TS, the original image is partitioned into 2^N quantization levels, where N denotes the number of bit planes. Each of the N bit planes is coded by a Turbo encoder and transmitted over an Additive White Gaussian Noise (AWGN) channel. At the receiver side, the bit planes are reassembled taking into consideration the neighborhood relationships of pixels in 2-D images. Each noisy bit-plane value of the image is evaluated iteratively using the IC-TS structure, which is composed of an equalization block, the Iterative Cellular Image Processing Algorithm (ICIPA), and a Turbo decoder, with an iterative feedback link between ICIPA and the Turbo decoder. ICIPA uses the mean and standard deviation of the estimated values in each pixel neighborhood. The scheme yields highly satisfactory results in both Bit Error Rate (BER) and image enhancement performance for Signal-to-Noise Ratio (SNR) values below -1 dB, compared to the traditional turbo coding scheme and 2-D filtering applied separately. Compression can also be achieved with IC-TS: less memory storage is used and the data rate is increased by up to N-1 times simply by choosing a subset of the bit slices, sacrificing resolution. Hence, it is concluded that the IC-TS system is a promising approach for 2-D image transmission, recovery of noisy signals, and image compression.
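The bit-plane partitioning on which IC-TS is built, and the compression trade-off of keeping only some planes, can be shown in a few lines of numpy. The toy 8-bit image below is illustrative; the Turbo coding and ICIPA stages are not reproduced.

```python
import numpy as np

# Split an image with 2**N grey levels into N binary planes (MSB first),
# reassemble it, and show the effect of keeping only the top planes.

def to_bit_planes(img, n_bits):
    """Split an integer image into n_bits binary planes, MSB first."""
    return [(img >> b) & 1 for b in range(n_bits - 1, -1, -1)]

def from_bit_planes(planes):
    """Reassemble an image from binary planes (MSB first)."""
    n = len(planes)
    return sum(p << (n - 1 - b) for b, p in enumerate(planes))

img = np.array([[0, 255], [128, 7]], dtype=np.int64)   # 8-bit toy image
planes = to_bit_planes(img, 8)
exact = from_bit_planes(planes)                        # lossless reassembly
coarse = from_bit_planes(planes[:4]) << 4              # keep top 4 planes only
```

Dropping the four low-order planes halves the data to transmit while quantizing the image to 16 levels, which is the resolution-for-rate trade-off the abstract describes.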
Abstract: Document retrieval in Information Retrieval Systems (IRS) is essentially about understanding the information in the documents concerned: the better the system understands the contents of the documents, the more effective the retrieval outcomes will be. But understanding content is a very complex task. Conventional IRS apply algorithms that can only approximate the meaning of document contents through a keyword approach using the vector space model. Keywords may be unstemmed or stemmed; when keywords are stemmed and conflated in the retrieval process, we are a step forward in applying semantic technology in IRS. Word stemming is a process in morphological analysis under natural language processing, preceding syntactic and semantic analysis. We have developed stemming algorithms for Malay and Arabic and incorporated stemming in our experimental systems in order to measure retrieval effectiveness. The results show that retrieval effectiveness increases when stemming is used in the systems.
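The effect described above, stemming conflating morphological variants so that a query matches more of the relevant documents, can be illustrated with a deliberately minimal two-rule English stemmer; it is a toy stand-in for the paper's Malay and Arabic algorithms, and the documents are hypothetical.

```python
# Toy keyword retrieval with and without stemming.

def stem(word):
    """Strip a couple of common English suffixes (illustrative only)."""
    for suffix in ("ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def retrieve(query, docs, use_stemming):
    """Return indices of documents sharing a (possibly stemmed) term."""
    norm = stem if use_stemming else (lambda w: w)
    q = {norm(w) for w in query.split()}
    return [i for i, d in enumerate(docs)
            if q & {norm(w) for w in d.split()}]

docs = ["document retrieval", "retrieving the document", "cooking recipes"]
plain = retrieve("documents", docs, use_stemming=False)
stemmed = retrieve("documents", docs, use_stemming=True)
```

Without stemming the plural query "documents" matches nothing; with stemming it conflates to "document" and retrieves both relevant documents, which is the recall improvement the experiments measure.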
Abstract: With the availability of powerful image processing software and improving computer literacy, it has become easy to tamper with images. The manipulation of digital images in fields such as courts of law and medical imaging creates a serious problem nowadays. Copy-move forgery is one of the most common types of forgery: some part of an image is copied and pasted onto another part of the same image to cover an important scene. In this paper, a copy-move forgery detection method based on the Fourier transform is proposed. First, the image is divided into blocks of equal size and the Fourier transform is computed for each block. Similarity between the Fourier transforms of different blocks provides an indication of a copy-move operation. The experimental results show that the proposed method runs in reasonable time and works well for both grayscale and colour images, and that its computational complexity is reduced by the use of the Fourier transform.
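The block-matching idea described above can be sketched in numpy: tile the image into equal blocks, summarise each block by its Fourier-magnitude spectrum, and flag pairs of blocks with near-identical spectra. The block size, distance threshold, and non-overlapping tiling are simplifying assumptions; the paper's exact matching strategy is not given in the abstract.

```python
import numpy as np

# Illustrative copy-move detection: compare block FFT-magnitude features.

def copy_move_pairs(img, block=4, tol=1e-6):
    h, w = img.shape
    feats, coords = [], []
    for r in range(0, h - block + 1, block):        # non-overlapping tiles
        for c in range(0, w - block + 1, block):
            tile = img[r:r + block, c:c + block]
            feats.append(np.abs(np.fft.fft2(tile)).ravel())
            coords.append((r, c))
    pairs = []
    for i in range(len(feats)):
        for j in range(i + 1, len(feats)):
            if np.linalg.norm(feats[i] - feats[j]) < tol:
                pairs.append((coords[i], coords[j]))  # suspected copy-move
    return pairs

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(8, 8)).astype(float)
img[4:8, 4:8] = img[0:4, 0:4]                      # forge: copy one block
pairs = copy_move_pairs(img)
```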
Abstract: UK breweries generate extensive by products in the
form of spent grain, slurry and yeast. Much of the spent grain is
produced by large breweries and processed in bulk for animal feed.
Spent brewery grains contain up to 20% protein dry weight and up to
60% fiber and are useful additions to animal feed. Bulk processing is
economic and allows spent grain to be sold so providing an income
to the brewery. A proportion of spent grain, however, is produced by
small local breweries and is more variably distributed to farms or
other users using intermittent collection methods. Such use is much
less economic and may incur losses if not carefully assessed for
transport costs. This study reports the economic returns of using wet brewery spent grain (WBSG) in animal feed, using the Co-product
Optimizer Decision Evaluator model (Cattle CODE) developed by
the University of Nebraska to predict performance and economic
returns when byproducts are fed to finishing cattle. The results
indicated that distance from brewery to farm had a significantly
greater effect on the economics of use of small brewery spent grain
and that alternative uses than cattle feed may be important to
develop.
Abstract: The stream of the city of Yasuj, the Beshar River, supplies water for different uses such as aquaculture farms, drinking, agriculture, and industry. Fish processing plants, agricultural farms, and the wastewater of industrial zones and hospitals, all generated by human activity, produce a considerable volume of effluent, and when released into the stream this effluent can affect the water quality and downstream aquatic systems. This study was conducted to evaluate the effects of effluent outflow from different human activities, and from point and non-point pollution sources, on the water quality and health of the Beshar River near Yasuj, the biggest and most important city in Kohkiloye and Boyerahmad province. The Beshar River is one of the most important aquatic ecosystems in the upstream Karun watershed in southwestern Iran and is affected by point and non-point pollutant sources. The river is approximately 190 km long and lies at the geographical position 51° 20' to 51° 48' E and 30° 18' to 30° 52' N. In this research project, five study stations were selected to examine water pollution in the Beshar River system. Human activity is now one of the most important factors affecting the hydrology and water quality of the river: humans use large amounts of resources to sustain various standards of living, although measures of sustainability vary widely depending on how sustainability is defined, and the Beshar River ecosystems are particularly sensitive and vulnerable to such activities. The water samples were analyzed and important water quality parameters such as pH, dissolved oxygen (DO), Biochemical Oxygen Demand (BOD5), Chemical Oxygen Demand (COD), Total Suspended Solids (TSS), turbidity, temperature, nitrates (NO3), and phosphates (PO4) were estimated at the stations. The results show a downward trend in water quality downstream of the city: the amounts of BOD5, COD, TSS, temperature, turbidity, NO3, and PO4 at the downstream stations were considerably higher than at station 1, whereas the DO at the downstream stations was lower. As effluent discharges resulting from human activities are released into the Beshar River near the city, the quality of the river decreases, and the environmental problems of the river are predicted to increase in the coming years.
Abstract: This paper presents a method for multivariate time series forecasting using Independent Component Analysis (ICA) as a preprocessing tool. The idea of this approach is to perform the forecasting in the space of independent components (sources) and then to transform the results back into the original time series space. The forecasting can be done separately, and with a different method, for each component, depending on its time structure. The paper also gives a review of the main algorithms for independent component analysis in the case of instantaneous mixture models, using second- and higher-order statistics. The method has been applied in simulation to an artificial multivariate time series with five components, generated from three sources and a randomly generated mixing matrix.
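The forecast-in-source-space pipeline described above is sketched below on an artificial series like the one in the simulation (five observed series from three sources). For brevity the unmixing matrix is taken as the pseudo-inverse of the known mixing matrix rather than being estimated by an ICA algorithm, and each source is forecast with a simple AR(1) fit; both choices are illustrative stand-ins.

```python
import numpy as np

# Forecast in source space, then map the forecast back to the observed space.

rng = np.random.default_rng(0)
T = 200
sources = np.vstack([
    np.sin(0.1 * np.arange(T)),            # slow oscillation
    np.cos(0.05 * np.arange(T)),           # slower oscillation
    0.1 * rng.standard_normal(T),          # noise source
])                                         # shape (3 sources, T samples)
A = rng.normal(size=(5, 3))                # mixing matrix: 5 observed series
X = A @ sources                            # observed multivariate series

W = np.linalg.pinv(A)                      # stand-in for an ICA unmixing matrix
S = W @ X                                  # recovered sources

def ar1_forecast(s):
    """One-step AR(1) forecast fitted by least squares."""
    phi = (s[:-1] @ s[1:]) / (s[:-1] @ s[:-1])
    return phi * s[-1]

s_next = np.array([ar1_forecast(s) for s in S])  # forecast each source
x_next = A @ s_next                              # transform back
```

In the actual method the per-source forecaster would be chosen to suit each component's time structure, and `W` would come from a second- or higher-order-statistics ICA algorithm.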
Abstract: The goal of gene expression analysis is to understand the processes that underlie the regulatory networks and pathways controlling inter-cellular and intra-cellular activities. In recent times, microarray datasets have been used extensively for this purpose, and the scope of such analysis has broadened towards the reconstruction of gene networks and other holistic approaches of systems biology. Evolutionary methods are proving successful in such problems, and a number of them have been proposed; however, all of these methods are based on processing genotypic information. There is therefore a need to develop evolutionary methods that address phenotypic interactions together with genotypic interactions. We present a novel evolutionary approach, called the Phenomic algorithm, whose focus is on phenotypic interaction: we use the expression profiles of genes to model the interactions between them at the phenotypic level. We apply this algorithm to the yeast sporulation dataset and show that it can identify gene networks with relative ease.
Abstract: A new target detection technique is presented in this
paper for the identification of small boats in coastal surveillance. The
proposed technique employs an adaptive progressive thresholding (APT) scheme to first process the given input scene to separate any
objects present in the scene from the background. The preprocessing
step results in an image having only the foreground objects, such as
boats, trees and other cluttered regions, and hence reduces the search
region for the correlation step significantly. The processed image is then fed to the shifted phase-encoded fringe-adjusted joint transform
correlator (SPFJTC) technique, which produces a single delta-like correlation peak for a potential target present in the input scene. A
post-processing step involves using a peak-to-clutter ratio (PCR) to determine whether the boat in the input scene is authorized or unauthorized. Simulation results are presented to show that the
proposed technique can successfully determine the presence of an authorized boat and identify any intruding boat present in the given input scene.
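A progressive thresholding pre-processing step in the spirit of the APT stage described above can be sketched as follows: the threshold is raised in steps until the surviving foreground falls below a target fraction, leaving only candidate objects for the correlation stage. The stopping rule, step size, and target fraction are hypothetical, not the paper's exact adaptive scheme.

```python
import numpy as np

# Illustrative progressive thresholding: raise the threshold until the
# foreground mask is small enough to hand to the correlation stage.

def progressive_threshold(img, target_fraction=0.1, step=5):
    t = int(img.min())
    mask = img > t
    while mask.mean() > target_fraction and t < int(img.max()):
        t += step                           # progressively raise threshold
        mask = img > t
    return t, mask

rng = np.random.default_rng(2)
scene = rng.integers(0, 60, size=(64, 64)).astype(int)   # dim sea background
scene[20:28, 30:44] = 220                                # bright boat region
t, mask = progressive_threshold(scene)
```

The returned `mask` keeps the bright boat region while discarding most of the background, which is what shrinks the search region for the SPFJTC correlation step.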
Abstract: Optical properties of sputter-deposited ZnS thin films
were investigated as potential replacements for CBD (chemical bath deposition) CdS buffer layers in the application of CIGS solar cells.
ZnS thin films were fabricated on glass substrates at room temperature, 150°C, 200°C, and 250°C with 50 sccm Ar gas using an RF magnetron sputtering system. The crystal structure of the thin films is found to be the zinc blende (cubic) structure. The lattice parameter of ZnS is slightly larger than that of CdS in the plane and is thus better matched to that of CIGS. Within a
400-800 nm wavelength region, the average transmittance was larger
than 75%. When the deposition temperature of the thin film was
increased, the blue shift phenomenon was enhanced. Band gap energy
of the ZnS thin film tended to increase as the deposition temperature
increased. ZnS thin film is a promising material system for the CIGS
buffer layer, in terms of ease of processing, low cost, environmental
friendliness, higher transparency, and electrical properties.
Abstract: Measuring the quality of image compression is important for image processing applications. In this paper, we propose an objective image quality assessment for grayscale compressed images that correlates well with subjective quality measurement (MOS) and takes the least time. The new objective image quality measurement is developed from a few fundamental objective measurements to evaluate compressed image quality based on JPEG and JPEG2000, and the reliability between each fundamental objective measurement and the subjective measurement (MOS) is determined. From the experimental results, we found that the Maximum Difference measurement (MD) and a newly proposed measurement, Structural Content Laplacian Mean Square Error (SCLMSE), are suitable measurements for evaluating the quality of JPEG2000 and JPEG compressed images, respectively. In addition, the MD and SCLMSE measurements are scaled to make them equivalent to MOS, rating compressed image quality from 1 to 5 (unacceptable to excellent).
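Of the measurements named above, Maximum Difference (MD) is the simplest: the largest absolute pixel difference between the original and the compressed image. A short numpy sketch, on toy data, of that definition:

```python
import numpy as np

# Maximum Difference (MD): the largest absolute pixel difference between
# the original image and its compressed version.

def maximum_difference(original, compressed):
    """MD = max |original - compressed| over all pixels."""
    diff = np.abs(original.astype(float) - compressed.astype(float))
    return float(diff.max())

original = np.array([[10, 20], [30, 40]])
compressed = np.array([[12, 20], [25, 40]])
md = maximum_difference(original, compressed)
```

Because MD reacts to the single worst pixel, it is sensitive to localized compression artifacts, which is consistent with its use here for JPEG2000-compressed images.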