Abstract: The remarkable development of information technology, the expansion of communications and the internet, city managers' demand for new ideas to run their cities, and calls for greater citizen participation all encourage the completion of the electronic city as soon as possible. The foundations of the electronic city lie in information technology. Citizen participation in metropolitan management is a crucial topic, and information technology does not impede it; on the contrary, it can improve public participation and the interaction between citizens and city managers. Through internet-based digital media and computer networks, citizens can offer their ideas, opinions and votes on topical matters and receive appropriate replies and services, and they can participate in urban projects by becoming aware of the city's plans. The most significant challenges are information and communication management, changing citizens' views, and legal and administrative documentation.
This research identifies the obstacles to the electronic city. The required data were gathered through questionnaires distributed to a statistical population of specialists and practitioners from the Ministry of Information and Communication Technology and the municipal information technology organization.
The conclusions show that the barriers to electronic city implementation in Iran, in order of priority, are: support problems (non-financial), behavioral, cultural and educational problems, security, legal and licensing problems, hardware, terminological and infrastructural constraints, and software and financial problems.
Abstract: This article proposes a new methodology for SMEs (small and medium enterprises) to characterize their quality performance, highlighting weaknesses and areas for improvement. The methodology aims to identify the principal causes of quality problems and to help prioritize improvement initiatives. It is a self-assessment methodology intended to be easy to implement by companies with a low maturity level in quality. The methodology is organized in six steps, which include gathering information about predetermined processes and subprocesses of quality management, defined on the basis of Juran's well-known trilogy for quality management (quality planning, quality control and quality improvement), and predetermined results categories, defined on the basis of the quality concept. A set of tools for data collection and analysis is used, such as interviews, flowcharts, process analysis diagrams and Failure Mode and Effects Analysis (FMEA). The article also presents the conclusions obtained from applying the methodology in two case studies.
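FMEA, one of the tools mentioned above, ranks failure modes by a Risk Priority Number (RPN), the product of severity, occurrence and detection scores. A minimal sketch of this standard calculation; the failure modes and scores below are hypothetical illustrations, not data from the article:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number used in FMEA: higher means higher priority.
    Each factor is conventionally scored on a 1-10 scale."""
    return severity * occurrence * detection

# Hypothetical failure modes for a quality self-assessment exercise.
failure_modes = {
    "late supplier delivery": rpn(7, 5, 3),   # 105
    "dimension out of spec":  rpn(9, 2, 4),   # 72
    "missing inspection":     rpn(4, 6, 2),   # 48
}

# Prioritise improvement initiatives by descending RPN.
priorities = sorted(failure_modes, key=failure_modes.get, reverse=True)
```

Ranking by RPN gives a simple, repeatable way to decide which quality problem to attack first.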
Abstract: This paper proposes a power-controlled scheduling scheme for devices using directional antennas in a smart home. In a home network using directional antennas, devices can transmit data concurrently in the same frequency band; accordingly, throughput increases over that of devices using omni-directional antennas in proportion to the number of concurrent transmissions. The number of concurrent transmissions depends on the antenna beamwidth, the number of devices operating in the network, the transmission power, interference, and so on. In particular, the less transmission power is used, the more concurrent transmissions occur, owing to the smaller transmission range. In this paper, we consider a sub-optimal scheduling scheme for throughput maximization and power consumption minimization in which each device is equipped with a directional antenna. Various beamwidths, path loss components, and antenna radiation efficiencies are considered. Numerical results show that the proposed scheme outperforms a scheduling scheme using directional antennas without power control.
Abstract: The distinction among urban, periurban and rural areas is a classical example of uncertainty in land classification. Satellite images, geostatistical analysis and spatial data of all kinds are very useful in urban sprawl studies, but it is important to define precise rules for combining large amounts of data to build complex knowledge about territory. Rough Set theory may be a useful method in this field: it takes a different mathematical approach to uncertainty by capturing indiscernibility, whereby two different phenomena can be indiscernible in some contexts and classified in the same way when the available information about them is combined. This approach has been applied in a case study, comparing the results achieved with the Map Algebra technique and with Spatial Rough Sets. The case study area, Potenza Province, is particularly suitable for the application of this theory because it includes 100 municipalities with different numbers of inhabitants and morphologic features.
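The core of Rough Set theory is the pair of lower and upper approximations induced by an indiscernibility relation: areas with identical attribute values cannot be told apart, and the boundary between the approximations captures the classification uncertainty. A minimal sketch of this standard construction; the attribute table and the "urban" target set below are invented for illustration, not the Potenza data:

```python
# Each area is described by discrete attributes; areas with identical
# attribute values are indiscernible.
areas = {
    "a1": ("high_density", "served"),
    "a2": ("high_density", "served"),
    "a3": ("low_density",  "served"),
    "a4": ("low_density",  "served"),
}
urban = {"a1", "a2", "a3"}  # hypothetical target set to approximate

def equivalence_classes(table):
    """Group objects by identical attribute tuples (indiscernibility)."""
    classes = {}
    for name, attrs in table.items():
        classes.setdefault(attrs, set()).add(name)
    return list(classes.values())

# Lower approximation: classes fully contained in the target set.
lower = set().union(*(c for c in equivalence_classes(areas) if c <= urban))
# Upper approximation: classes that intersect the target set.
upper = set().union(*(c for c in equivalence_classes(areas) if c & urban))
# The boundary region is where "urban" cannot be decided from the data.
boundary = upper - lower
```

Here a3 and a4 share all attributes but only a3 is labelled urban, so both fall into the boundary region: exactly the periurban-style uncertainty the abstract describes.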
Abstract: While financial institutions have faced difficulties
over the years for a multitude of reasons, the major cause of serious
banking problems continues to be directly related to lax credit
standards for borrowers and counterparties, poor portfolio risk
management, or a lack of attention to changes in economic or other
circumstances that can lead to a deterioration in the credit standing of
a bank's counterparties. Credit risk is most simply defined as the
potential that a bank borrower or counterparty will fail to meet its
obligations in accordance with agreed terms. The goal of credit risk
management is to maximize a bank's risk-adjusted rate of return by
maintaining credit risk exposure within acceptable parameters. Banks
need to manage the credit risk inherent in the entire portfolio as well
as the risk in individual credits or transactions. Banks should also
consider the relationships between credit risk and other risks. The
effective management of credit risk is a critical component of a
comprehensive approach to risk management and essential to the
long-term success of any banking organization. In this research we
also study the relationship between credit risk indices and borrowers'
timely payback in Karafarin Bank.
Abstract: Much work has been done on predicting the fault proneness of software systems. However, the severity of faults matters more than their number, since major faults affect a developer most and demand immediate attention. In this paper, we try to predict the level of impact of the faults existing in software systems. Neuro-Fuzzy based predictor models are applied to NASA's public-domain defect dataset, coded in the C programming language. Correlation-based Feature Selection (CFS) evaluates the worth of a subset of attributes by considering the individual predictive ability of each feature along with the degree of redundancy between them, so CFS is used to select the metrics most highly correlated with the level of severity of faults. The results are compared with the prediction results of Logistic Model Trees (LMT), earlier quoted as the best technique in [17], and are recorded in terms of Accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). The results show that the Neuro-Fuzzy based model provides relatively better prediction accuracy than the other models and can therefore be used to model the level of impact of faults in function-based systems.
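The evaluation metrics named above have standard definitions; a minimal sketch of MAE and RMSE, with hypothetical severity levels rather than the NASA dataset values:

```python
import math

def mae(actual, predicted):
    """Mean Absolute Error: average magnitude of the prediction errors."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root Mean Squared Error: penalises large errors more than MAE."""
    return math.sqrt(
        sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

# Hypothetical fault severity levels: actual vs. model prediction.
actual    = [3, 1, 2, 4, 2]
predicted = [3, 2, 2, 3, 1]
errors = (mae(actual, predicted), rmse(actual, predicted))
```

Because RMSE squares the residuals, a model that makes occasional large severity misjudgements scores worse on RMSE than on MAE.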
Abstract: The nature of consumer products makes future demand difficult to forecast, and forecast accuracy significantly affects the overall performance of the supply chain system. In this study, two data mining methods, artificial neural networks (ANN) and support vector machines (SVM), were used to predict the demand for consumer products. The training data were the actual demand for six different products from a consumer product company in Thailand. The results indicated that SVM achieved better forecast quality (in terms of MAPE) than ANN in every product category. Another important finding was that the MAPE gap between the two methods was especially large when the data were highly correlated.
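MAPE, the forecast-quality measure compared above, expresses the average error as a percentage of actual demand. A minimal sketch of the standard formula; the demand figures are hypothetical, not the Thai company's data:

```python
def mape(actual, forecast):
    """Mean Absolute Percentage Error; lower values mean better forecasts.
    Assumes no actual demand value is zero."""
    n = len(actual)
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / n

# Hypothetical monthly demand for one consumer product.
actual   = [100, 120, 80, 110]
forecast = [ 90, 126, 88, 99]
```

Being scale-free, MAPE allows forecast quality to be compared across products with very different demand volumes, which is why it suits a six-product comparison.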
Abstract: To study the effect of PEG and NaCl stress on germination and the early seedling stage of two corn cultivars, two separate experiments were conducted at the physiology laboratory, Faculty of Agriculture, Razi University, Kermanshah, Iran in 2009. The investigation was performed as a factorial experiment under a Completely Randomized Design (CRD) with three replications. The cultivar factor comprised two varieties (sweet corn SC403 and flint corn SC704), combined with five stress levels (0, -2, -4, -6 and -8 bar). The principal aim of the study was to compare the two maize varieties under the stress conditions. The results indicated significant decreases in germination percentage, germination rate, radicle and plumule length, and radicle and plumule dry matter. On the basis of the results, NaCl had a greater effect on germination and the early seedling stage than PEG, and sweet corn was more resistant than flint corn under both stress conditions.
Abstract: In this paper, Gaussian type quadrature rules for fuzzy functions are discussed. The error representation and convergence theorems are given. Moreover, four kinds of Gaussian type quadrature rules with error terms for approximating fuzzy integrals are presented. The present paper complements the theoretical results of the paper by T. Allahviranloo and M. Otadi [T. Allahviranloo, M. Otadi, Gaussian quadratures for approximate of fuzzy integrals, Applied Mathematics and Computation 170 (2005) 874-885]. The obtained results are illustrated by solving some numerical examples.
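For the crisp (non-fuzzy) case underlying these rules, an n-point Gaussian quadrature integrates polynomials up to degree 2n−1 exactly. A minimal two-point Gauss–Legendre sketch on [−1, 1]; this illustrates only the classical rule, not the paper's fuzzy extension:

```python
import math

def gauss_legendre_2pt(f):
    """Two-point Gauss-Legendre rule on [-1, 1]:
    nodes at +/- 1/sqrt(3), both weights equal to 1.
    Exact for polynomials up to degree 3."""
    x = 1.0 / math.sqrt(3.0)
    return f(-x) + f(x)

# x^3 integrates to 0 over [-1, 1]; x^2 integrates to 2/3 -- both exact.
cubic_result = gauss_legendre_2pt(lambda t: t ** 3)
square_result = gauss_legendre_2pt(lambda t: t ** 2)
```

In the fuzzy setting the same nodes and weights are applied level-wise to the endpoints of the fuzzy function's α-cuts, which is what makes error terms for the fuzzy rules tractable.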
Abstract: The purpose of the experiments described in this article was to compare an integrated fixed-film activated sludge (IFAS) system with a conventional activated sludge (AS) system. The IFAS system used cigarette filter rods (waste filters from tobacco factories) as the biofilm carrier. The comparison with activated sludge was performed in two parallel treatment lines, and the removal of organic substances, ammonia and total phosphorus was investigated over a four-month period. Synthetic wastewater was prepared from ordinary tap water with glucose as the main source of carbon and energy, plus balanced macro- and micronutrients. COD removal of 94.55% and 81.62% was achieved for the IFAS and activated sludge systems, respectively. Ammonia concentration also decreased significantly with increasing HRT in both systems, with average ammonia removal of 97.40% and 96.34% for the IFAS and activated sludge systems, respectively. The total phosphorus (TP) removal efficiency of IFAS was 60.64%, higher than the 56.63% achieved by the AS process.
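The removal percentages above follow the standard definition of removal efficiency, the fraction of the influent concentration eliminated by treatment. A minimal sketch; the influent and effluent concentrations below are hypothetical examples chosen to reproduce the IFAS COD figure, not the study's raw data:

```python
def removal_efficiency(influent, effluent):
    """Percentage of a pollutant removed between influent and effluent."""
    return 100.0 * (influent - effluent) / influent

# Hypothetical COD concentrations in mg/L (chosen for illustration).
cod_removal = removal_efficiency(500.0, 27.25)
```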
Abstract: In this paper, we apply the p-q theory with a shunt active power filter to an unbalanced and distorted power system voltage in order to compensate the perturbations generated by a nonlinear load. The power factor of the source current is also improved. A PLL system is used to extract the fundamental positive-sequence component of the power system voltage under the conditions mentioned.
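The instantaneous p-q theory computes real and imaginary powers from the Clarke (αβ) transform of the three-phase voltages and currents; the filter then injects currents that cancel the oscillating and imaginary parts. A minimal sketch of the transform and power terms under the power-invariant convention (a balanced, in-phase snapshot is used as a hypothetical input; the paper's control loop is not reproduced):

```python
import math

def clarke(a, b, c):
    """Power-invariant Clarke transform of a three-phase quantity."""
    alpha = math.sqrt(2.0 / 3.0) * (a - 0.5 * b - 0.5 * c)
    beta = math.sqrt(2.0 / 3.0) * (math.sqrt(3.0) / 2.0) * (b - c)
    return alpha, beta

def pq(va, vb, vc, ia, ib, ic):
    """Instantaneous real power p and imaginary power q (p-q theory)."""
    v_alpha, v_beta = clarke(va, vb, vc)
    i_alpha, i_beta = clarke(ia, ib, ic)
    p = v_alpha * i_alpha + v_beta * i_beta
    q = v_beta * i_alpha - v_alpha * i_beta
    return p, q
```

For a balanced sinusoidal load in phase with the voltage, q is zero and p is constant; any nonzero or oscillating component is what the shunt filter is asked to compensate.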
Abstract: Biological data has several characteristics that strongly differentiate it from typical business data: it is much more complex, usually large in size, and continuously changing. Until recently, business data was the main target for discovering trends, patterns and future expectations; with the recent rise of biotechnology, however, the powerful technology used for analyzing business data is now being applied to biological data. With this advanced technology at hand, the main trend in biological research is rapidly shifting from structural DNA analysis to understanding the cellular functions of DNA sequences. DNA chips are now used to perform experiments, and researchers rely on DNA analysis processes. Clustering is one of the important processes for grouping similar entities, and many clustering algorithms exist, such as hierarchical clustering, self-organizing maps and K-means clustering. In this paper, we propose a clustering algorithm that imitates the ecosystem, taking into account the features of biological data. We implemented the system using an Ant-Colony clustering algorithm; the system decides the number of clusters automatically. It processes the input biological data, runs the Ant-Colony algorithm, draws the Topic Map, assigns clusters to the genes and displays the output. We tested the algorithm with test data of 100 to 1000 genes and 24 samples, with promising results for applying this algorithm to clustering DNA chip data.
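The abstract does not give its ant rules, so as an assumption we sketch the widely used Lumer–Faieta pick/drop probabilities that many ant-colony clustering systems build on: ants pick up items lying in dissimilar neighbourhoods and drop them in similar ones, so clusters, and their number, emerge without being fixed in advance. The constants k1 and k2 are conventional tuning parameters, not values from the paper:

```python
def pick_probability(f, k1=0.1):
    """Lumer-Faieta pick-up rule: an ant is likely to pick up an item
    whose local neighbourhood similarity density f (in [0, 1]) is low."""
    return (k1 / (k1 + f)) ** 2

def drop_probability(f, k2=0.15):
    """Drop rule: an ant is likely to drop a carried item where the
    local neighbourhood similarity density f is high."""
    return (f / (k2 + f)) ** 2
```

Iterating these two rules over a grid of gene-expression items is what lets the cluster count emerge from the data rather than being supplied as a parameter, matching the system behaviour described above.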
Abstract: This paper presents an innovative approach within the area of Group Decision Support Systems (GDSS) using tools based on intelligent agents. It introduces iGDSS, a software platform for decision support and collaboration, and an application of this platform, eCollaborative Decisions, for the academic environment, all developed within the framework of a research project.
Abstract: Classifier fusion may generate more accurate classification than each of the basic classifiers. Fusion is often based on fixed combination rules such as the product and the average. This paper presents decision templates as a classifier fusion method for the recognition of handwritten English and Farsi numerals (1-9). The process involves extracting a feature vector from well-known image databases; the extracted feature vector is fed to the multiple classifier fusion stage. A set of experiments was conducted to compare decision templates (DTs) with some combination rules. Decision templates achieved recognition rates of 97.99% and 97.28% for Farsi and English handwritten digits, respectively.
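In the decision-template method, each class's template is the element-wise mean of the training samples' decision profiles (one row of soft outputs per classifier), and a new sample is assigned to the class whose template is nearest to its profile. A minimal sketch with hypothetical soft outputs for two classifiers and two classes, not the paper's digit data:

```python
def template(profiles):
    """Decision template: element-wise mean of the decision profiles
    (one profile per training sample of the class)."""
    n = len(profiles)
    return [[sum(p[i][j] for p in profiles) / n
             for j in range(len(profiles[0][0]))]
            for i in range(len(profiles[0]))]

def similarity(profile, tmpl):
    """Negated squared-Euclidean distance between profile and template."""
    return -sum((profile[i][j] - tmpl[i][j]) ** 2
                for i in range(len(tmpl)) for j in range(len(tmpl[0])))

# Hypothetical decision profiles: rows = classifiers, cols = class supports.
train_class0 = [[[0.9, 0.1], [0.8, 0.2]], [[0.7, 0.3], [0.9, 0.1]]]
train_class1 = [[[0.2, 0.8], [0.1, 0.9]], [[0.3, 0.7], [0.2, 0.8]]]
templates = [template(train_class0), template(train_class1)]

sample = [[0.85, 0.15], [0.75, 0.25]]  # new profile to classify
label = max(range(2), key=lambda c: similarity(sample, templates[c]))
```

Unlike the product or average rules, the templates are learned from training data, which is what lets DTs exploit class-specific agreement patterns between the classifiers.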
Abstract: This paper presents a new fingerprint coding technique
based on contourlet transform and multistage vector quantization.
Wavelets have shown their ability in representing natural images that
contain smooth areas separated with edges. However, wavelets
cannot efficiently take advantage of the fact that the edges usually
found in fingerprints are smooth curves. This issue is addressed by
directional transforms, known as contourlets, which have the
property of preserving edges. The contourlet transform is a new
extension to the wavelet transform in two dimensions using
nonseparable and directional filter banks. The major difficulty in implementing a vector quantizer lies in its computation and storage requirements: in the full-search algorithm, the computation and storage complexity is an exponential function of the number of bits used to quantize each frame of spectral information, whereas the storage requirement of multistage vector quantization is much lower. The contourlet transform coefficients are therefore quantized by multistage vector quantization, and the quantized coefficients are encoded by Huffman coding. The results obtained are tabulated and compared with existing wavelet-based ones.
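Multistage VQ reduces storage because each stage quantizes only the residual left by the previous stages, so codebook sizes add rather than multiply across the bit budget. A minimal two-stage sketch; the 2-D codebooks are hypothetical, not the paper's trained codebooks:

```python
def nearest(codebook, vec):
    """Index of the codeword closest (squared Euclidean) to vec."""
    return min(range(len(codebook)),
               key=lambda i: sum((c - v) ** 2
                                 for c, v in zip(codebook[i], vec)))

def msvq_encode(vec, stages):
    """Encode vec as one index per stage; each stage quantizes the
    residual left by the previous stages."""
    indices, residual = [], list(vec)
    for codebook in stages:
        i = nearest(codebook, residual)
        indices.append(i)
        residual = [r - c for r, c in zip(residual, codebook[i])]
    return indices

def msvq_decode(indices, stages):
    """Reconstruction is the sum of the selected codewords."""
    out = [0.0] * len(stages[0][0])
    for i, codebook in zip(indices, stages):
        out = [o + c for o, c in zip(out, codebook[i])]
    return out

# Hypothetical 2-D codebooks: a coarse stage plus a residual stage.
stages = [[[0.0, 0.0], [4.0, 4.0]],
          [[-1.0, 0.0], [1.0, 0.0], [0.0, 1.0]]]
idx = msvq_encode([3.2, 4.1], stages)
rec = msvq_decode(idx, stages)
```

The stage indices are exactly the symbols that a subsequent entropy coder such as Huffman coding would compress.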
Abstract: The forces driving international markets change continuously, so companies need to gain a competitive edge in such markets; improving a company's products, processes and practices is no longer optional. Lean production is a production management philosophy that consolidates work tasks with minimum waste, resulting in improved productivity. Lean production practices can be mapped onto many production areas, one of which is Manufacturing Equipment and Technology (MET). Many lean production practices can be implemented in MET, namely specific equipment configurations, total preventive maintenance, visual control, new equipment/technologies, production process reengineering, and a shared vision of perfection. The purpose of this paper is to investigate the implementation level of these six practices in Jordanian industries. To achieve this, a questionnaire survey was designed using a five-point Likert scale and validated through a pilot study and expert review. A sample of 350 Jordanian companies was surveyed, with a response rate of 83%; respondents were asked to rate the extent of implementation of each practice. A conceptual relationship model is developed, hypotheses are proposed, and the essential statistical analyses are performed. An assessment tool that enables management to monitor the progress and effectiveness of lean practice implementation is designed and presented. The results show that the average implementation level of lean practices in MET is 77%, that Jordanian companies are successfully implementing the considered lean production practices, and that the presented model has a Cronbach's alpha of 0.87, which is good evidence of model consistency and validates the results.
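Cronbach's alpha, the consistency measure reported above, is computed from the item variances and the variance of the respondents' total scores: α = k/(k−1) · (1 − Σσᵢ²/σ_total²). A minimal sketch of this standard formula; the Likert responses below are hypothetical, not the survey data:

```python
def variance(xs):
    """Population variance of a list of scores."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent sums
    item_var = sum(variance(scores) for scores in items)
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Hypothetical 5-point Likert responses: 3 items x 4 respondents.
items = [[4, 5, 3, 4],
         [4, 4, 3, 5],
         [5, 4, 2, 4]]
alpha = cronbach_alpha(items)
```

Values approaching 1 indicate that the items move together across respondents; a survey-model alpha of 0.87, as reported, is conventionally read as good internal consistency.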
Abstract: Meeting users' requirements is one predictor of project success: there should be a match between the expectations of the users and the perceptions of key project personnel with respect to usability and functionality. The aim of this study is to compare key project personnel's and potential users' (customer representatives') evaluations of the relative importance of usability and functionality factors in a software design project. The Analytic Network Process (ANP) was used to analyze the relative importance of the factors. The results show that navigation and interaction are the most significant factors, and satisfaction and efficiency the least important, for both groups. Further, the similar ordering and scoring of usability and functionality factors by both groups indicates that key project personnel have captured the expectations and requirements of potential users accurately.
Abstract: Biodiesel is traditionally produced from oleaginous
plants. On the other hand, increasing biodiesel production from these
raw materials could create problems of food supply. Producing
biodiesel from microalgae could help to overcome this difficulty,
because microalgae are rich in lipids and do not compete for arable
lands. However, no previous studies had compared vegetable- and microalgae-oil-based biodiesel in terms of yield, viscosity and heat of combustion. In the present study, commercial canola and microalgae oils were therefore transesterified with methanol over a homogeneous alkali catalyst (potassium hydroxide) at 100 °C for 1 h. The results showed that the microalgae-based oil gave a higher biodiesel yield, 89.7% (g biodiesel/g oil), and a lower kinematic viscosity (at 22 °C) of 4.31 mm²/s than canola oil.
Abstract: Vermicomposting is the conversion of organic waste into bio-fertilizers through the action of earthworms, a technology widely used for organic solid waste management. Waste corn pulp blended with cow dung manure was vermicomposted over 30 days using the earthworm species Eisenia fetida. pH, temperature, moisture content, and electrical conductivity were monitored daily, and the feedstock, vermicompost and vermiwash were analyzed for nutrient composition. The average temperature and moisture content in the vermi-reactor were 22.5 °C and 42.5%, respectively. The vermicompost and vermiwash had an almost neutral pH, whilst the electrical conductivity was 21% higher in the vermicompost. The nitrogen and potassium content was 57% and 79.6% richer, respectively, in the vermicompost than in the vermiwash. However, the vermiwash was 84% richer in phosphorus than the vermicompost, 89.1% and 97.6% richer in Ca and Mg respectively, and 97.8% richer in Na salts. The vermiwash also contained significantly more micronutrients. Both bio-fertilizers were rich in the nutrients specified for fertilizers.
Abstract: Sediment formation is a complex hydraulic phenomenon that has emerged as a major operation and maintenance consideration in modern hydraulic engineering in general and river engineering in particular. Sediment accumulation along a river course, and its eventual storage in the form of islands, affects water intake in canal systems fed by storage reservoirs. Without proper management, sediment transport can lead to major operational challenges in the water distribution systems of arid regions such as the Dez and Hamidieh command areas. This paper investigates sedimentation in the Western Canal of the Dez Diversion Weir using the SHARC model and compares the results with those for the two intake structures of the Hamidieh dam in Iran obtained with the SSIIM model. The objective was to identify the factors that influence the process, check the reliability of the outcomes, and suggest ways to mitigate the implications for operation and maintenance of the structures. The results estimated sand and silt bed load concentrations at 193 ppm and 827 ppm, respectively. This followed a more or less similar pattern in Hamidieh, where sediment formation impeded water intake in the canal system. Given the available data on average annual bed loads and average suspended sediment loads of 165 ppm and 837 ppm in the Dez, there was a significant statistical difference (16%) between the sand grains, whereas no significant difference (1.2%) was found in the silt grain sizes. One explanation for this finding is that considerable meandering effects along the 6 km river course account for the recent shift in the hydraulic behavior of the stream under investigation. The sand concentration downstream, relative to the present state of the canal, showed a steeply descending curve, while sediment trapping showed a steeply ascending curve; these occurred because the diversion weir was not included in the simulation model.
The comparative study showed very close similarities in the results, indicating that both software packages can be used as accurate and reliable analytical tools for simulating sedimentation in hydraulic engineering.