Abstract: A multi-agent system is developed here to predict
monthly details of the upcoming peak of the 24th solar magnetic
cycle. While studies typically predict the timing and magnitude of
cycle peaks using annual data, this one utilizes the unsmoothed
monthly sunspot number instead. Monthly numbers display more
pronounced fluctuations during periods of strong solar magnetic
activity than annual numbers do. Because strong magnetic activity
can cause significant economic damage, predicting monthly
variations should provide different and potentially useful
information for decision-making. The multi-agent system
developed here operates in two stages. In the first, it produces twelve
predictions of the monthly numbers. In the second, it uses those
predictions to deliver a final forecast. Acting as expert agents, genetic
programming and neural networks produce the twelve fits and
forecasts as well as the final forecast. According to the results
obtained, the next peak is predicted to reach 156 and is expected to
occur in October 2011, with an average of 136 for that year.
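The two-stage scheme described above can be sketched in miniature. The toy expert functions, history values, and equal combination weights below are illustrative assumptions standing in for the paper's genetic-programming and neural-network agents:

```python
# Hypothetical sketch of a two-stage multi-agent forecast combiner.

def stage_one(history, experts):
    """Stage one: each expert agent produces its own prediction."""
    return [expert(history) for expert in experts]

def stage_two(predictions, weights=None):
    """Stage two: combine the expert predictions into a final forecast
    (here a simple weighted mean; equal weights by default)."""
    if weights is None:
        weights = [1.0 / len(predictions)] * len(predictions)
    return sum(w * p for w, p in zip(weights, predictions))

# Toy experts standing in for the GP and neural-network agents.
experts = [
    lambda h: h[-1],            # persistence: repeat the last value
    lambda h: sum(h) / len(h),  # mean of the history
    lambda h: max(h),           # optimistic peak estimate
]
history = [120.0, 140.0, 160.0]  # made-up monthly sunspot numbers
preds = stage_one(history, experts)
final = stage_two(preds)
```

In the actual system, stage two is itself a trained expert rather than a fixed average; the sketch only shows the data flow between the stages.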
Abstract: In the current economy of increasing global
competition, many organizations are attempting to use knowledge as
one of the means to gain sustainable competitive advantage. As with
large organizations, the success of SMEs can be linked to how well
they manage their knowledge. Despite the profusion of research
on knowledge management (KM) within large organizations, few
studies have analyzed KM in SMEs.
This research proposes a new framework showing the determinant
role of organizational dimensions in shaping KM approaches. The paper
and its propositions are based on a literature review and analysis.
In this research, personalization versus codification,
individualization versus institutionalization, and IT-based versus
non-IT-based are highlighted as three distinct dimensions of
knowledge management approaches.
The study contributes to research by providing a more nuanced
classification of KM approaches and offers guidance to managers
on the types of KM approaches that should be adopted based on
the size, geographical dispersion and task nature of SMEs.
To the author's knowledge, the paper is the first of its kind to
examine whether there are suitable configurations of KM approaches for
SMEs with different dimensions. It provides valuable information that
will hopefully help the SME sector implement KM.
Abstract: This work presents a low-cost and eco-friendly
building material named Agrostone panel. Africa's urban population
is growing at an annual rate of 2.8%, and 62% of its population will
live in urban areas by 2050. As a consequence, many of the least
urbanized and least developed African countries will face serious
challenges in providing affordable housing to urban dwellers.
Since the cost of building materials accounts for the largest
proportion of the overall construction cost, developing low-cost
building materials is vital. Agrostone panel is used in housing projects
in Ethiopia. It uses raw materials of agricultural/industrial wastes
and/or natural minerals as a filler, magnesium-based chemicals as a
binder and fiberglass as reinforcement. Agrostone panel reduces the
cost of wall construction by 50% compared with the conventional
building materials. The pros and cons of the Agrostone panel, as well as
the use of other waste materials as raw material to make the panel
more sustainable, lower-cost and better-performing, are discussed.
Abstract: This article proposes a new methodology to be used by SMEs (Small and Medium Enterprises) to characterize their performance in quality, highlighting weaknesses and areas for improvement. The methodology aims to identify the principal causes of quality problems and help prioritize improvement initiatives. It is a self-assessment methodology intended to be easy to implement by companies with a low maturity level in quality. The methodology is organized in six steps, which include gathering information about predetermined processes and subprocesses of quality management, defined based on the well-known Juran's trilogy for quality management (quality planning, quality control and quality improvement), and predetermined result categories, defined based on the quality concept. A set of tools for data collection and analysis, such as interviews, flowcharts, process analysis diagrams and Failure Mode and Effects Analysis (FMEA), is used. The article also presents the conclusions obtained from the application of the methodology in two case studies.
Abstract: Optical fault monitoring in an FTTH-PON using an ACS
is demonstrated. This device can achieve real-time fault monitoring
to protect the feeder fiber. In addition, the ACS can distinguish
an optical fiber fault from the transmission services to other customers
in the FTTH-PON. It is essential to use a wavelength different from
the triple-play service operating wavelengths for failure detection;
the ACS uses an operating wavelength of 1625 nm for monitoring and
failure-detection control. Our solution works on a standard local area
network (LAN) using specially designed hardware interfaced with a
microcontroller-integrated Ethernet.
Abstract: Since the pioneering work of Zadeh, fuzzy set theory has been applied to a myriad of areas. Song and Chissom introduced the concept of fuzzy time series and applied some methods to the enrollments of the University of Alabama. In recent years, a number of techniques based on fuzzy set theory have been proposed for forecasting. These methods have used either enrollment numbers or differences of enrollments as the universe of discourse. We propose using the year-to-year percentage change as the universe of discourse instead. In this communication, the approach of Jilani, Burney, and Ardil is modified accordingly. We use enrollment figures for the University of Alabama to illustrate the proposed method, which achieves better forecasting accuracy than existing models.
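The proposed universe of discourse can be sketched as follows; the enrollment figures and fuzzy intervals below are illustrative assumptions, not the full Alabama series or the paper's actual partition:

```python
def pct_changes(series):
    """Year-to-year percentage change: the proposed universe of discourse,
    replacing raw enrollments or first differences."""
    return [100.0 * (b - a) / a for a, b in zip(series, series[1:])]

def fuzzify(value, intervals):
    """Map a percentage change to the index of the fuzzy interval
    (fuzzy set) into which it falls."""
    for i, (lo, hi) in enumerate(intervals):
        if lo <= value < hi:
            return i
    return len(intervals) - 1  # clamp values beyond the last interval

# Illustrative enrollment figures and an illustrative 3-interval partition.
enroll = [13055, 13563, 13867, 14696]
changes = pct_changes(enroll)          # roughly [3.89, 2.24, 5.98] percent
intervals = [(0, 2), (2, 4), (4, 6)]
labels = [fuzzify(c, intervals) for c in changes]
```

A complete model would then build fuzzy logical relationships between consecutive labels and defuzzify them back into an enrollment forecast; the sketch covers only the change of universe.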
Abstract: This paper proposes a power-controlled scheduling scheme for devices using directional antennas in a smart home. In a home network using directional antennas, devices can transmit data concurrently in the same frequency band. Accordingly, the throughput increases compared to that of devices using omni-directional antennas, in proportion to the number of concurrent transmissions. The number of concurrent transmissions depends on the antenna beamwidth, the number of devices operating in the network, the transmission power, interference and so on. In particular, the less transmission power is used, the more concurrent transmissions occur, owing to the smaller transmission range. In this paper, we consider a sub-optimal scheduling scheme for throughput maximization and power consumption minimization, in which each device is equipped with a directional antenna. Various beamwidths, path loss components, and antenna radiation efficiencies are considered. Numerical results show that the proposed scheme outperforms a scheduling scheme using directional antennas without power control.
Abstract: The characteristics of ad hoc networks, and even their existence, depend on the nodes forming them. Thus, services and applications designed for ad hoc networks should adapt to this dynamic and distributed environment. In particular, multicast algorithms with reliability and scalability requirements should abstain from centralized approaches. We aspire to define a reliable and scalable multicast protocol for ad hoc networks, and our target is to utilize epidemic techniques for this purpose. In this paper, we present a brief survey of epidemic algorithms for reliable multicasting in ad hoc networks and describe formulations and analytical results for simple epidemics. Then, the P2P anti-entropy algorithm for content distribution and our prototype simulation model are described, together with initial results demonstrating the behavior of the algorithm.
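A minimal sketch of one anti-entropy round, assuming peers are represented as plain dictionaries of content chunks (an illustrative simplification of the P2P protocol, ignoring digests, message loss and peer selection):

```python
def anti_entropy_round(peer_a, peer_b):
    """One push-pull anti-entropy round: each peer learns which items
    the other holds and pulls whatever it is missing. Repeated pairwise
    rounds spread every item to every node with high probability."""
    for key, chunk in list(peer_b.items()):
        peer_a.setdefault(key, chunk)   # a pulls items it lacks
    for key, chunk in list(peer_a.items()):
        peer_b.setdefault(key, chunk)   # b pulls items it lacks

# Two peers with partially overlapping content (made-up chunk ids).
a = {1: "chunk-1", 2: "chunk-2"}
b = {2: "chunk-2", 3: "chunk-3"}
anti_entropy_round(a, b)
```

After the round both peers hold chunks 1, 2 and 3; in a real ad hoc deployment each node would run such rounds periodically against randomly chosen neighbors.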
Abstract: The growing world population has fundamental, and
often catastrophic, impacts on natural habitats. The indiscriminate
consumption of energy, the destruction of forests and the extinction
of plant and animal species are among its consequences.
Urban sustainability and sustainable urban development, so widely
discussed these days, should be considered a strategy, goal and
policy that goes beyond environmental issues and protection alone.
The desert climate poses many problems for its residents. The
very hot and dry summers of the Iranian desert areas, at a time
when there was no access to modern energy sources and mechanical
cooling systems, led Iranian architects to design a
natural ventilation system into their buildings. The structure, a
tower rising above the roof, served as a passive
ventilation system in addition to its ornamental role of
giving the building a beautiful appearance. This paper names the
problems of the area and their inconveniences, points out some
answers to them, and introduces the BADGIR (wind-catcher) as an
alternative solution, one that has long played a major role in
dealing with these problems.
Abstract: A settling tank must have a uniform and calm flow field
to achieve high performance. Recirculation zones, however,
commonly occur in sedimentation tanks, and their presence
may have several effects. The non-uniformity
of the velocity field, the short-circuiting at the surface and
the motion of the jet at the bed of the tank, which arises from the
recirculation in the sedimentation layer, are all affected by the
geometry of the tank. There are ways to decrease the size of these dead
zones and thereby increase performance; one of them is to
use a suitable baffle configuration. In this study, baffles at
different positions have been investigated by a finite volume
method with the VOF (Volume of Fluid) model. In addition, the k-ε
turbulence model is used in the numerical calculations. The results
indicate that the best baffle position is obtained when the
volume of the recirculation region is minimized or divided into
smaller parts and the flow field tends to be uniform in the settling
zone.
Abstract: The distinction among urban, periurban and rural areas represents a classical example of uncertainty in land classification. Satellite images, geostatistical analysis and all kinds of spatial data are very useful in urban sprawl studies, but it is important to define precise rules for combining large amounts of data to build complex knowledge about a territory. Rough Set theory may be a useful method to employ in this field; it represents a different mathematical approach to uncertainty by capturing indiscernibility: two different phenomena can be indiscernible in some contexts and classified in the same way when the available information about them is combined. This approach has been applied in a case study, comparing the results achieved with the Map Algebra technique and with Spatial Rough Sets. The case study area, Potenza Province, is particularly suitable for the application of this theory because it includes 100 municipalities with different numbers of inhabitants and morphological features.
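The indiscernibility relation at the heart of Rough Set theory can be sketched as follows; the land units and attribute values are hypothetical, not drawn from the Potenza Province data:

```python
def indiscernibility_classes(objects, attributes):
    """Partition objects into indiscernibility classes: objects with
    identical values on every chosen attribute fall in the same class
    and cannot be told apart given that information."""
    classes = {}
    for name, values in objects.items():
        key = tuple(values[a] for a in attributes)
        classes.setdefault(key, set()).add(name)
    return list(classes.values())

# Hypothetical land units described by two made-up attributes.
units = {
    "u1": {"density": "high", "road_access": "good"},
    "u2": {"density": "high", "road_access": "good"},
    "u3": {"density": "low",  "road_access": "poor"},
}
classes = indiscernibility_classes(units, ["density", "road_access"])
```

Here u1 and u2 are indiscernible (same class), so any urban/periurban/rural labeling based only on these attributes must treat them identically, which is exactly the uncertainty the theory captures.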
Abstract: While financial institutions have faced difficulties
over the years for a multitude of reasons, the major cause of serious
banking problems continues to be directly related to lax credit
standards for borrowers and counterparties, poor portfolio risk
management, or a lack of attention to changes in economic or other
circumstances that can lead to a deterioration in the credit standing of
a bank's counterparties. Credit risk is most simply defined as the
potential that a bank borrower or counterparty will fail to meet its
obligations in accordance with agreed terms. The goal of credit risk
management is to maximize a bank's risk-adjusted rate of return by
maintaining credit risk exposure within acceptable parameters. Banks
need to manage the credit risk inherent in the entire portfolio as well
as the risk in individual credits or transactions. Banks should also
consider the relationships between credit risk and other risks. The
effective management of credit risk is a critical component of a
comprehensive approach to risk management and essential to the
long-term success of any banking organization. In this research, we
also study the relationship between credit risk indices and borrowers'
timely payback in Karafarin Bank.
Abstract: A great deal of work has been done on predicting the fault proneness of software systems. However, the severity of the faults is more important than the number of faults existing in the developed system, as the major faults matter most to a developer and need immediate attention. In this paper, we attempt to predict the level of impact of the existing faults in software systems. A neuro-fuzzy-based predictor model is applied to NASA's public-domain defect dataset, coded in the C programming language. Correlation-based Feature Selection (CFS) evaluates the worth of a subset of attributes by considering the individual predictive ability of each feature along with the degree of redundancy between them, so CFS is used to select the metrics that correlate most highly with the level of severity of faults. The results are compared with the prediction results of Logistic Model Trees (LMT), earlier quoted as the best technique in [17], and are recorded in terms of Accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). The results show that the neuro-fuzzy-based model provides relatively better prediction accuracy than the other models and hence can be used for modeling the level of impact of faults in function-based systems.
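The CFS criterion described above (high feature-class correlation, low feature-feature redundancy) can be sketched as follows; the merit formula is the standard CFS merit, while the example data are invented:

```python
def pearson(x, y):
    """Plain Pearson correlation coefficient between two sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def cfs_merit(features, target):
    """CFS merit of a feature subset:
    merit = k * r_cf / sqrt(k + k*(k-1) * r_ff),
    where r_cf is the mean |feature-class| correlation and r_ff the
    mean |feature-feature| correlation (the redundancy penalty)."""
    k = len(features)
    rcf = sum(abs(pearson(f, target)) for f in features) / k
    if k == 1:
        return rcf
    pairs = [(i, j) for i in range(k) for j in range(i + 1, k)]
    rff = sum(abs(pearson(features[i], features[j])) for i, j in pairs) / len(pairs)
    return k * rcf / (k + k * (k - 1) * rff) ** 0.5
```

Subsets are then searched for the highest merit; adding a metric that is redundant with one already selected raises r_ff and so does not improve the score.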
Abstract: Among the chemicals used for ammunition production, TNT (trinitrotoluene) has played a significant role since World Wars I and II. Various types of military weapons use TNT in a casting process. However, in TNT casting for warheads it is difficult to control the cooling rate of the liquid TNT, because the casting process lacks equipment to detect the temperature during the procedure. This study presents temperatures detected by an infrared camera to illustrate the cooling rate and the cooling zone during curing, and demonstrates the optimization of TNT conditions to reduce the risk of air gaps in the warhead, which can later result in its destruction: casting defects can cause premature initiation of explosive-filled projectiles in response to set-back forces during gun firing. Finally, the study helps improve the TNT casting process: operators can control the curing of the TNT inside the case by raising the heating rod at the proper time. Consequently, this can save a tremendous amount of rework time if air gaps occur and increase strength at a lower elastic modulus. It can therefore be concluded that the use of infrared cameras in this process is another method to improve the casting procedure.
Abstract: Monitoring lightning electromagnetic pulses (sferics)
and other terrestrial as well as extraterrestrial transient radiation
signals is of considerable interest for practical and theoretical
purposes in astro- and geophysics as well as meteorology. To manage
a continuous flow of data, automation of the detection and classification
process is important. Features based on a combination of wavelet and
statistical methods proved efficient for the analysis and characterisation
of transients, and serve as input to a radial basis function network that is
trained to discriminate transients ranging from pulse-like to wave-like.
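A minimal sketch of one such wavelet-based feature, assuming a plain (unnormalized) Haar transform and invented example signals; the actual feature set combines several wavelet and statistical quantities:

```python
def haar_step(signal):
    """One level of the (unnormalized) Haar wavelet transform:
    pairwise averages (approximation) and half-differences (detail)."""
    avg = [(a + b) / 2.0 for a, b in zip(signal[0::2], signal[1::2])]
    det = [(a - b) / 2.0 for a, b in zip(signal[0::2], signal[1::2])]
    return avg, det

def detail_energy(signal):
    """Energy of the first-level detail coefficients: a simple feature
    that is large for impulsive (pulse-like) transients and small for
    smooth (wave-like) ones."""
    _, det = haar_step(signal)
    return sum(d * d for d in det)

pulse = [0.0, 0.0, 4.0, 0.0, 0.0, 0.0, 0.0, 0.0]  # made-up impulsive transient
wave = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]   # made-up smooth signal
```

Feature vectors of this kind are what the radial basis function network is trained on to separate the two transient classes.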
Abstract: In this paper, a controller for the pitch angle of an
aircraft with respect to the elevator deflection angle is designed.
How the elevator angle affects the pitching motion of the
aircraft is pointed out, as well as how a pitch controller can be
applied for the aircraft to reach a certain pitch angle. In this digital
optimal system, the elevator deflection angle and the pitching angle
of the plane are considered to be the input and output, respectively.
A single-input single-output (SISO) system is presented, and a
digital pitch controller for the aircraft is demonstrated. A simulation
of the whole system has been performed, and the optimal control
weighting matrices, Q and R, have been determined.
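The optimal-gain computation behind such a design can be illustrated in the simplest scalar case; the parameters a, b and weights q, r below are made-up values, and this scalar Riccati iteration is only a stand-in for the full state-space digital LQR design:

```python
def dlqr_scalar(a, b, q, r, iters=500):
    """Iterate the scalar discrete-time Riccati equation
        p <- q + a^2*p - (a*b*p)^2 / (r + b^2*p)
    to its steady state, then return the optimal state-feedback
    gain k for the control law u = -k*x on x' = a*x + b*u."""
    p = q
    for _ in range(iters):
        p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
    return a * b * p / (r + b * b * p)

# Made-up scalar plant and weights; for a=b=q=r=1 the Riccati fixed
# point is the golden ratio and the gain is its reciprocal.
k = dlqr_scalar(1.0, 1.0, 1.0, 1.0)
closed_loop = 1.0 - 1.0 * k  # a - b*k, must lie inside the unit circle
```

The weighting choice (q versus r) trades pitch-angle tracking error against elevator effort, which is exactly the role of the Q and R weights in the paper's design.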
Abstract: This paper discusses a genetic algorithm (GA) based optimal load shedding that can be applied to electrical distribution networks with and without dispersed generators (DG). The proposed method can also consider constant and variable capacity deficiencies caused by unscheduled outages in the bulk generation and transmission system of the bulk power supply. The GA is employed to search for the optimal load shedding strategy in distribution networks considering DGs for two cases: constant and variable modelling of the bulk power supply of the distribution networks. Electrical power distribution systems traditionally have a radial network and unidirectional power flows; with the advent of dispersed generation, the distribution system acquires a locally looped network and bidirectional power flows. Installed DG in electrical distribution systems can therefore cause operational problems and impact existing operational schemes. The introduction of DGs has raised many new issues at the operational and planning levels, and load shedding, as one of the operational issues, is no exception. The objective is to minimize the sum of curtailed load and system losses within the framework of system operational and security constraints. The proposed method is tested on a radial distribution system with 33 load points to demonstrate its practical applicability.
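A miniature sketch of the GA search, assuming a toy penalty-based fitness (curtailed load plus a large penalty for any uncovered capacity deficit) and made-up load values; the paper's method additionally accounts for system losses, DG modelling and operational constraints:

```python
import random

def fitness(shed_mask, loads, deficit):
    """Cost of a shedding plan: total curtailed load plus a heavy
    penalty for any part of the capacity deficit left uncovered."""
    curtailed = sum(l for l, s in zip(loads, shed_mask) if s)
    shortfall = max(0.0, deficit - curtailed)
    return curtailed + 1000.0 * shortfall

def ga_load_shedding(loads, deficit, pop=30, gens=100, seed=0):
    """Tiny elitist GA over shed/keep bit masks with bit-flip mutation."""
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in loads] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda m: fitness(m, loads, deficit))
        elite = population[: pop // 2]          # keep the fitter half
        children = []
        for parent in elite:
            child = parent[:]
            child[rng.randrange(len(loads))] ^= 1  # bit-flip mutation
            children.append(child)
        population = elite + children
    return min(population, key=lambda m: fitness(m, loads, deficit))

loads = [5.0, 3.0, 2.0, 4.0]  # made-up load-point demands (MW)
best = ga_load_shedding(loads, deficit=4.0)
curtailed = sum(l for l, s in zip(loads, best) if s)
```

The penalty weight makes any plan that fails to cover the deficit dominated by any plan that does, steering the search toward minimal curtailment that still satisfies the capacity constraint.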
Abstract: The objective of this research was the
design, development and evaluation of a sustainable web-based
network system to be used as an interoperable environment for
University process workflows and document management. In this
manner, most of the process workflows in Universities can be
realized entirely electronically, promoting an integrated University.
Defining the most used University process workflows enabled
creating electronic workflows and executing them on standard
workflow execution engines. The definition or reengineering of
workflows increased work efficiency and helped in establishing
standardized processes across different faculties. The concept and the
process definitions, as well as the solution applied as a case study,
are evaluated, and the findings are reported.
Abstract: The nature of consumer products makes future
demand difficult to forecast, and the accuracy of the forecasts
significantly affects the overall performance of the supply chain
system. In this study, two data mining methods, artificial neural
networks (ANN) and support vector machines (SVM), were utilized to
predict the demand for consumer products. The training data used was
the actual demand for six different products from a consumer product
company in Thailand. The results indicated that SVM had better
forecast quality (in terms of MAPE) than ANN in every category of
products. Moreover, another important finding was that the difference
in MAPE between the two methods was especially large
when the data was highly correlated.
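The MAPE measure used for the comparison can be computed as below; the actual and forecast series are invented for illustration and do not come from the Thai demand data:

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent: the forecast-quality
    measure used to compare the two methods."""
    return 100.0 * sum(abs(a - f) / abs(a)
                       for a, f in zip(actual, forecast)) / len(actual)

# Made-up monthly demand and two made-up forecasts of it.
actual = [100.0, 120.0, 90.0]
close_forecast = [98.0, 118.0, 93.0]   # small relative errors
loose_forecast = [90.0, 130.0, 99.0]   # larger relative errors
m_close = mape(actual, close_forecast)
m_loose = mape(actual, loose_forecast)
```

A lower MAPE means better forecast quality, which is the sense in which SVM outperformed ANN in the study.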
Abstract: Biological data has several characteristics that strongly differentiate it from typical business data: it is much more complex, usually large in size, and continuously changing. Until recently, business data was the main target for discovering trends, patterns and future expectations. However, with the recent rise of biotechnology, the powerful technology that was used for analyzing business data is now being applied to biological data. With this advanced technology at hand, the main trend in biological research is rapidly shifting from structural DNA analysis to understanding the cellular functions of DNA sequences. DNA chips are now being used to perform experiments, and DNA analysis processes are being used by researchers. Clustering is one of the important processes used for grouping together similar entities; there are many clustering algorithms, such as hierarchical clustering, self-organizing maps, K-means clustering and so on. In this paper, we propose a clustering algorithm that imitates an ecosystem, taking into account the features of biological data. We implemented the system using an Ant-Colony clustering algorithm, and the system decides the number of clusters automatically. The system processes the input biological data, runs the Ant-Colony algorithm, draws the Topic Map, assigns clusters to the genes and displays the output. We tested the algorithm with test data of 100 to 1000 genes and 24 samples, and show promising results for applying this algorithm to clustering DNA chip data.
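Automatic determination of the number of clusters can be illustrated by a much simpler similarity-threshold grouping (only a stand-in for the Ant-Colony algorithm, with a made-up distance threshold and one-dimensional toy expression values), in which the cluster count emerges from the data rather than being fixed in advance:

```python
def threshold_clusters(points, threshold):
    """Assign each point to the first cluster whose centroid lies within
    the distance threshold; otherwise start a new cluster. The number of
    clusters is thus decided by the data, not given as a parameter."""
    clusters = []  # each cluster is a mutable pair [centroid, members]
    for p in points:
        for c in clusters:
            centroid, members = c
            if abs(centroid - p) <= threshold:
                members.append(p)
                c[0] = sum(members) / len(members)  # update the centroid
                break
        else:
            clusters.append([p, [p]])
    return [members for _, members in clusters]

# Made-up 1-D expression values: three natural groups emerge.
groups = threshold_clusters([0.1, 0.2, 5.0, 5.1, 9.9], threshold=1.0)
```

The Ant-Colony algorithm reaches the same goal stochastically, with ants picking up and dropping genes based on neighborhood similarity; only the property of an emergent cluster count is shown here.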