Low-Cost Eco-Friendly Building Material: A Case Study in Ethiopia

This work presents a low-cost and eco-friendly building material named Agrostone panel. Africa's urban population is growing at an annual rate of 2.8%, and 62% of its population will live in urban areas by 2050. As a consequence, many of the least urbanized and least developed African countries will face serious challenges in providing affordable housing to urban dwellers. Since the cost of building materials accounts for the largest proportion of the overall construction cost, developing low-cost building materials is vital. Agrostone panels are used in housing projects in Ethiopia. They use agricultural/industrial wastes and/or natural minerals as fillers, magnesium-based chemicals as a binder and fiberglass as reinforcement. The Agrostone panel reduces the cost of wall construction by 50% compared with conventional building materials. The pros and cons of the Agrostone panel, as well as the use of other waste materials as raw materials to make the panel more sustainable, lower-cost and better-performing, are discussed.

A Methodology for Quality Problems Diagnosis in SMEs

This article proposes a new methodology for SMEs (small and medium-sized enterprises) to characterize their performance in quality, highlighting weaknesses and areas for improvement. The methodology aims to identify the principal causes of quality problems and to help prioritize improvement initiatives. It is a self-assessment methodology intended to be easy to implement by companies with a low maturity level in quality. The methodology is organized in six steps, which include gathering information about predetermined processes and subprocesses of quality management, defined based on the well-known Juran trilogy for quality management (quality planning, quality control and quality improvement), and predetermined results categories, defined based on the quality concept. A set of tools for data collection and analysis, such as interviews, flowcharts, process analysis diagrams and Failure Mode and Effects Analysis (FMEA), is used. The article also presents the conclusions obtained from the application of the methodology in two case studies.
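
A core ingredient of the step-level analysis is FMEA, which ranks failure modes by a risk priority number (RPN), the product of severity, occurrence and detection scores. A minimal Python sketch of this prioritization (the quality-management failure modes and 1-10 scores are illustrative assumptions, not taken from the case studies):

    # Minimal FMEA prioritization sketch: RPN = severity * occurrence * detection.
    # The failure modes and 1-10 scores below are illustrative placeholders.
    failure_modes = [
        # (description, severity, occurrence, detection)
        ("Wrong raw material batch accepted", 8, 3, 4),
        ("Inspection step skipped under time pressure", 6, 5, 7),
        ("Customer complaint not logged", 5, 4, 3),
    ]

    ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
    for desc, s, o, d in ranked:
        print(f"RPN={s * o * d:4d}  {desc}")

The highest-RPN modes are the natural candidates for the prioritized improvement initiatives the methodology produces.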

Access Control System: Monitoring Tool for Fiber to the Home Passive Optical Network

Optical fault monitoring in an FTTH-PON using an access control system (ACS) is demonstrated. This device achieves real-time fault monitoring for protection of the feeder fiber. In addition, the ACS can distinguish an optical fiber fault from the transmission services to other customers in the FTTH-PON. It is essential to use a wavelength different from the triple-play service operating wavelengths for failure detection; the ACS uses the 1625 nm operating wavelength for monitoring and failure detection control. Our solution works on a standard local area network (LAN) using specially designed hardware interfaced with a microcontroller with integrated Ethernet.

Fuzzy Time Series Forecasting Using Percentage Change as the Universe of Discourse

Since the pioneering work of Zadeh, fuzzy set theory has been applied to a myriad of areas. Song and Chissom introduced the concept of fuzzy time series and applied several methods to the enrollments of the University of Alabama. In recent years, a number of techniques have been proposed for forecasting based on fuzzy set theory methods. These methods have used either enrollment numbers or differences of enrollments as the universe of discourse. In this communication, the approach of Jilani, Burney, and Ardil is modified by using the year-to-year percentage change as the universe of discourse. We use enrollment figures for the University of Alabama to illustrate the proposed method, which results in better forecasting accuracy than existing models.
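
A minimal Python sketch of the central idea (the enrollment figures are the first seven years of the University of Alabama dataset commonly used in this literature, but the seven-interval partition and the naive midpoint defuzzification are illustrative assumptions, not the modified Jilani-Burney-Ardil method itself):

    # Sketch: fuzzify year-to-year percentage changes into equal-width intervals
    # and forecast next year's value from the midpoint of the matched interval.
    enroll = [13055, 13563, 13867, 14696, 15460, 15311, 15603]

    pct = [100.0 * (b - a) / a for a, b in zip(enroll, enroll[1:])]
    lo, hi = min(pct), max(pct)
    n_intervals = 7                      # illustrative partition size
    width = (hi - lo) / n_intervals

    def fuzzify(p):
        """Index of the interval A_i that the percentage change p falls into."""
        return min(int((p - lo) / width), n_intervals - 1)

    # Naive defuzzification: forecast from the midpoint of the last change's interval.
    i = fuzzify(pct[-1])
    midpoint = lo + (i + 0.5) * width
    forecast = enroll[-1] * (1 + midpoint / 100.0)
    print(f"interval A_{i}, forecast = {forecast:.0f}")

Working on percentage changes keeps the universe of discourse bounded and scale-free, which is what makes the partition reusable as enrollments grow.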

A Power-Controlled Scheduling Scheme Using a Directional Antenna in Smart Home

This paper proposes a power-controlled scheduling scheme for devices using directional antennas in a smart home. In a home network using directional antennas, devices can transmit data concurrently in the same frequency band. Accordingly, the throughput increases, compared to that of devices using omni-directional antennas, in proportion to the number of concurrent transmissions. The number of concurrent transmissions depends on the antenna beamwidth, the number of devices operating in the network, transmission power, interference and so on. In particular, the lower the transmission power, the more concurrent transmissions occur, owing to the smaller transmission range. In this paper, we consider a sub-optimal scheduling scheme for throughput maximization and power consumption minimization, in which each device is equipped with a directional antenna. Various beamwidths, path-loss exponents, and antenna radiation efficiencies are considered. Numerical results show that the proposed scheme outperforms a scheduling scheme using directional antennas without power control.
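
A minimal Python sketch of one way such power-controlled admission could work: links are added greedily at the smallest power level that keeps every admitted link's SINR above a threshold, with interference discounted by the probability of directional beam overlap (the geometry, flat-top gain model, thresholds and power levels are illustrative assumptions, not the paper's formulation):

    import math

    ALPHA, SINR_MIN, NOISE = 3.0, 5.0, 1e-6   # path-loss exponent, threshold, noise

    def gain(bw_deg):
        """Idealized flat-top directional gain for beamwidth bw_deg degrees."""
        return 360.0 / bw_deg

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def feasible(sched, bw):
        """Every scheduled link keeps its SINR above the threshold."""
        for tx, rx, p in sched:
            sig = p * gain(bw) / dist(tx, rx) ** ALPHA
            itf = sum(q * gain(bw) * (bw / 360.0) / dist(t2, rx) ** ALPHA
                      for t2, r2, q in sched if (t2, r2) != (tx, rx))
            if sig / (NOISE + itf) < SINR_MIN:
                return False
        return True

    def min_power(sched, link, bw, levels=(0.001, 0.005, 0.01, 0.05)):
        """Smallest power level (if any) at which the link can join the schedule."""
        tx, rx = link
        for p in levels:
            if feasible(sched + [(tx, rx, p)], bw):
                return p
        return None

    links = [((0, 0), (5, 0)), ((0, 8), (5, 8)), ((2, 4), (2, 9))]
    sched = []
    for link in links:                        # greedy admission with minimal power
        p = min_power(sched, link, bw=45)
        if p is not None:
            sched.append((link[0], link[1], p))
    print(f"{len(sched)} concurrent links admitted, powers:", [l[2] for l in sched])

Admitting each link at the lowest feasible power is exactly what leaves interference headroom for further concurrent transmissions.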

Analyzing Periurban Fringe with Rough Set

The distinction among urban, periurban and rural areas represents a classical example of uncertainty in land classification. Satellite images, geostatistical analysis and all kinds of spatial data are very useful in urban sprawl studies, but it is important to define precise rules for combining great amounts of data to build complex knowledge about the territory. Rough set theory may be a useful method to employ in this field: it represents a different mathematical approach to uncertainty, capturing indiscernibility. Two different phenomena can be indiscernible in some contexts and classified in the same way when the available information about them is combined. This approach has been applied in a case study, comparing the results achieved with the Map Algebra technique and with spatial rough sets. The study area, Potenza Province, is particularly suitable for the application of this theory because it includes 100 municipalities with differing numbers of inhabitants and morphologic features.
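
A minimal Python sketch of the rough set machinery involved: municipalities that are indiscernible on the chosen attributes fall into the same block, and a target class such as "urban" gets a lower approximation (certainly urban) and an upper approximation (possibly urban), whose difference is exactly the uncertain fringe (the toy attribute table is invented for illustration, not the Potenza data):

    from collections import defaultdict

    # Toy attribute table: municipality -> (population class, built-up density).
    # Values invented for illustration; X is the set labeled "urban".
    table = {"A": ("high", "dense"), "B": ("high", "dense"),
             "C": ("low", "dense"), "D": ("low", "sparse"), "E": ("low", "sparse")}
    X = {"A", "C"}

    blocks = defaultdict(set)          # indiscernibility classes on all attributes
    for m, attrs in table.items():
        blocks[attrs].add(m)

    lower = {m for b in blocks.values() if b <= X for m in b}   # certainly urban
    upper = {m for b in blocks.values() if b & X for m in b}    # possibly urban
    print("lower:", lower, "boundary (fringe):", upper - lower)

Here A and B share all attribute values, so A's urban label cannot be certified from the data alone; the boundary region is precisely the periurban fringe the classification cannot resolve.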

Credit Risk Management and Analysis in an Iranian Bank

While financial institutions have faced difficulties over the years for a multitude of reasons, the major cause of serious banking problems continues to be directly related to lax credit standards for borrowers and counterparties, poor portfolio risk management, or a lack of attention to changes in economic or other circumstances that can lead to a deterioration in the credit standing of a bank's counterparties. Credit risk is most simply defined as the potential that a bank borrower or counterparty will fail to meet its obligations in accordance with agreed terms. The goal of credit risk management is to maximize a bank's risk-adjusted rate of return by maintaining credit risk exposure within acceptable parameters. Banks need to manage the credit risk inherent in the entire portfolio as well as the risk in individual credits or transactions. Banks should also consider the relationships between credit risk and other risks. The effective management of credit risk is a critical component of a comprehensive approach to risk management and essential to the long-term success of any banking organization. In this research, we also study the relationship between credit risk indices and borrowers' timely payback in Karafarin Bank.

Predicting the Impact of the Defect on the Overall Environment in Function Based Systems

Much work has been done on predicting the fault proneness of software systems. However, the severity of the faults is more important than the number of faults in the developed system, since the major faults matter most to a developer and need immediate attention. In this paper, we predict the level of impact of existing faults in software systems. Neuro-fuzzy based predictor models are applied to NASA's public domain defect dataset, coded in the C programming language. Correlation-based Feature Selection (CFS) evaluates the worth of a subset of attributes by considering the individual predictive ability of each feature along with the degree of redundancy between them, so CFS is used to select the metrics that correlate most highly with the level of severity of faults. The results are compared with the prediction results of Logistic Model Trees (LMT), earlier reported as the best technique in [17]. The results are recorded in terms of accuracy, mean absolute error (MAE) and root mean squared error (RMSE). They show that the neuro-fuzzy based model provides relatively better prediction accuracy than the other models and can hence be used for modeling the level of impact of faults in function based systems.
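
CFS scores a candidate subset of k features with the merit heuristic Merit = k * r_cf / sqrt(k + k(k-1) * r_ff), where r_cf is the average feature-class correlation and r_ff the average feature-feature correlation. A minimal Python sketch (the toy metric data are invented, and Pearson correlation stands in for the symmetric-uncertainty measure typically used by CFS implementations):

    import math
    from itertools import combinations

    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    def cfs_merit(features, target):
        """Merit = k*r_cf / sqrt(k + k*(k-1)*r_ff) for a candidate feature subset."""
        k = len(features)
        r_cf = sum(abs(pearson(f, target)) for f in features) / k
        r_ff = (sum(abs(pearson(f, g)) for f, g in combinations(features, 2))
                / (k * (k - 1) / 2)) if k > 1 else 0.0
        return k * r_cf / math.sqrt(k + k * (k - 1) * r_ff)

    # Toy metric vectors (e.g. complexity, LOC) against binary severity labels.
    m1, m2, sev = [1, 4, 2, 8, 5], [2, 8, 4, 16, 10], [0, 1, 0, 1, 1]
    # Adding a perfectly redundant feature (m2 = 2*m1) does not raise the merit.
    print(cfs_merit([m1], sev), cfs_merit([m1, m2], sev))

The denominator is what penalizes redundancy: strongly inter-correlated metrics raise r_ff and pull the merit back down.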

Microwave Imaging by Application of Information Theory Criteria in MUSIC Algorithm

The performance of the time-reversal MUSIC algorithm degrades dramatically in the presence of strong noise and multiple scattering (i.e., when scatterers are close to each other), owing to errors in determining the number of scatterers. The present paper provides a new approach to alleviate this problem using an information theoretic criterion referred to as minimum description length (MDL). The merits of the novel approach are confirmed by numerical examples. The results indicate that time-reversal MUSIC yields accurate estimates of the target locations even with considerable noise and multiple scattering in the received signals.
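
A minimal Python sketch of the Wax-Kailath form of MDL, which selects the model order k (here, the number of scatterers) minimizing the criterion computed from the eigenvalues of the data covariance matrix (the eigenvalues below are illustrative; p is the number of array elements and N the number of snapshots):

    import math

    def mdl_order(eigvals, N):
        """Wax-Kailath MDL: pick k minimizing
        -N*(p-k)*log(geom_mean/arith_mean of the p-k smallest eigenvalues)
        + 0.5*k*(2p-k)*log(N)."""
        lam = sorted(eigvals, reverse=True)
        p = len(lam)
        best_k, best_val = 0, float("inf")
        for k in range(p):
            tail = lam[k:]
            g = math.exp(sum(math.log(v) for v in tail) / len(tail))
            a = sum(tail) / len(tail)
            val = -N * len(tail) * math.log(g / a) + 0.5 * k * (2 * p - k) * math.log(N)
            if val < best_val:
                best_k, best_val = k, val
        return best_k

    # Three dominant eigenvalues over a noise floor -> three scatterers (illustrative).
    print(mdl_order([9.1, 6.4, 4.2, 0.11, 0.10, 0.09, 0.10, 0.11], N=200))

Once the signal-subspace dimension is estimated this way, the MUSIC pseudospectrum can be formed from the remaining noise-subspace eigenvectors as usual.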

More on Gaussian Quadratures for Fuzzy Functions

In this paper, Gaussian type quadrature rules for fuzzy functions are discussed. Error representations and convergence theorems are given. Moreover, four kinds of Gaussian type quadrature rules with error terms for the approximation of fuzzy integrals are presented. The present paper complements the theoretical results of T. Allahviranloo and M. Otadi [T. Allahviranloo, M. Otadi, Gaussian quadratures for approximate of fuzzy integrals, Applied Mathematics and Computation 170 (2005) 874-885]. The obtained results are illustrated by solving some numerical examples.
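
As a hedged illustration of the setting (a standard level-set formulation of fuzzy integration, not necessarily the paper's exact construction): writing the alpha-cuts of the fuzzy-valued function f as intervals [f^-(x; alpha), f^+(x; alpha)], an n-point Gaussian rule with nodes x_i and weights w_i is applied endpoint-wise,

    \int_a^b f^{\pm}(x;\alpha)\,dx \;\approx\; \sum_{i=1}^{n} w_i\, f^{\pm}(x_i;\alpha), \qquad 0 \le \alpha \le 1,

so the fuzzy integral is approximated alpha-cut by alpha-cut, and the classical error term of the chosen Gaussian rule carries over to each endpoint function.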

An Ant-based Clustering System for Knowledge Discovery in DNA Chip Analysis Data

Biological data has several characteristics that strongly differentiate it from typical business data: it is much more complex, usually large in size, and continuously changing. Until recently, business data was the main target for discovering trends, patterns and future expectations. However, with the recent rise of biotechnology, the powerful technology that was used for analyzing business data is now being applied to biological data. With this advanced technology at hand, the main trend in biological research is rapidly shifting from structural DNA analysis to understanding the cellular functions of DNA sequences. DNA chips are now being used to perform experiments, and DNA analysis processes are widely used by researchers. Clustering is one of the important processes for grouping together similar entities; well-known algorithms include hierarchical clustering, self-organizing maps and K-means clustering. In this paper, we propose a clustering algorithm that imitates the ecosystem, taking into account the features of biological data. We implemented the system using an ant-colony clustering algorithm, and the system decides the number of clusters automatically. The system processes the input biological data, runs the ant-colony algorithm, draws the Topic Map, assigns clusters to the genes and displays the output. We tested the algorithm with test data of 100 to 1000 genes and 24 samples and show promising results for applying this algorithm to clustering DNA chip data.
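
A minimal Python sketch of the pick/drop rule at the heart of Lumer-Faieta style ant clustering, the family to which such ant-colony clustering belongs (grid size, parameters and the random two-group expression data are illustrative assumptions, not the paper's implementation):

    import math, random
    random.seed(0)

    GRID, K1, K2, ALPHA, STEPS = 20, 0.1, 0.15, 5.0, 20000

    # Illustrative data: 60 "genes" as 24-sample expression vectors in two groups.
    genes = [[random.gauss(mu, 0.3) for _ in range(24)]
             for mu in [0.0] * 30 + [2.0] * 30]
    pos = {i: (random.randrange(GRID), random.randrange(GRID))
           for i in range(len(genes))}

    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def density(i, at):
        """Lumer-Faieta local density: similarity of gene i to its 3x3 neighbors."""
        x, y = at
        f = sum(1 - dist(genes[i], genes[j]) / ALPHA
                for j, (xj, yj) in pos.items()
                if j != i and abs(xj - x) <= 1 and abs(yj - y) <= 1)
        return max(f / 9.0, 0.0)

    for _ in range(STEPS):
        i = random.choice(range(len(genes)))
        if random.random() < (K1 / (K1 + density(i, pos[i]))) ** 2:   # pick up
            for _ in range(200):       # carry to random cells until dropped
                spot = (random.randrange(GRID), random.randrange(GRID))
                f = density(i, spot)
                if random.random() < (2 * f if f < K2 else 1.0):      # drop rule
                    pos[i] = spot
                    break

    # Similar genes now pile up on nearby cells; clusters (and their number) can
    # be read off the grid, e.g. by flood-filling occupied neighboring cells.
    same = tot = 0
    for i in pos:
        for j in pos:
            if i != j and abs(pos[i][0] - pos[j][0]) <= 1 \
                    and abs(pos[i][1] - pos[j][1]) <= 1:
                tot += 1
                same += (i < 30) == (j < 30)
    print(f"neighborhood purity: {same / tot:.2f}")  # near 1.0 once clusters form

Because the number of piles emerges from the dynamics rather than being fixed in advance, this family of algorithms determines the cluster count automatically, which is the property the proposed system relies on.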

Fingerprint Compression Using Contourlet Transform and Multistage Vector Quantization

This paper presents a new fingerprint coding technique based on the contourlet transform and multistage vector quantization. Wavelets have shown their ability to represent natural images that contain smooth areas separated by edges. However, wavelets cannot efficiently exploit the fact that the edges usually found in fingerprints are smooth curves. This issue is addressed by directional transforms, known as contourlets, which have the property of preserving edges. The contourlet transform is a new extension of the wavelet transform in two dimensions using non-separable and directional filter banks. The computation and storage requirements are the major difficulty in implementing a vector quantizer: in the full-search algorithm, the computation and storage complexity is an exponential function of the number of bits used to quantize each frame of spectral information. The storage requirement of multistage vector quantization is lower than that of full-search vector quantization. The coefficients of the contourlet transform are quantized by multistage vector quantization, and the quantized coefficients are encoded by Huffman coding. The results obtained are tabulated and compared with the existing wavelet-based ones.
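
A minimal Python sketch of the multistage vector quantization step, in which each stage quantizes the residual left by the previous stage so that two small codebooks replace one exponentially larger one (random training vectors stand in for contourlet coefficients, and the codebook sizes are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(0)

    def kmeans(X, k, iters=25):
        """Tiny k-means for codebook training (illustrative, not production)."""
        C = X[rng.choice(len(X), k, replace=False)]
        for _ in range(iters):
            idx = np.argmin(((X[:, None] - C[None]) ** 2).sum(-1), axis=1)
            for j in range(k):
                if np.any(idx == j):
                    C[j] = X[idx == j].mean(0)
        return C

    def train_msvq(X, stage_sizes):
        codebooks, R = [], X.copy()
        for k in stage_sizes:
            C = kmeans(R, k)
            codebooks.append(C)
            idx = np.argmin(((R[:, None] - C[None]) ** 2).sum(-1), axis=1)
            R = R - C[idx]                  # next stage codes the residual
        return codebooks

    def encode(x, codebooks):
        r, idxs = x.copy(), []
        for C in codebooks:
            j = int(np.argmin(((C - r) ** 2).sum(-1)))
            idxs.append(j)
            r = r - C[j]
        return idxs                         # one small index per stage

    # Illustrative: 4-dim "coefficient" vectors; two stages of 16 entries cost
    # 8 bits/vector, versus one 256-entry codebook for the same rate.
    X = rng.normal(size=(2000, 4))
    cbs = train_msvq(X, [16, 16])
    print(encode(X[0], cbs))

Searching and storing two 16-entry codebooks instead of one 256-entry codebook is exactly the complexity saving the abstract attributes to the multistage structure; the stage indices then feed the Huffman coder.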

To Design Holistic Health Service Systems on the Internet

There are different kinds of online systems on the Internet for people who need support and want to develop new knowledge. Online communities and Ask the Expert systems are two such systems. In the health care area, the number of users of these systems has increased at a rapid pace. Interactions with medically trained experts take place online, and people with concerns about similar health problems come together to share experiences and advice. The systems are also used as repositories of health information and are browsed accordingly. Over the years, studies have been conducted on the usage of the different systems. However, the ways in which the systems can be used together to enhance learning have not been explored. This paper presents results from a study of online health communities and an Ask the Expert system for people who are overweight. Differences and similarities with regard to posted issues and replies are discussed, and suggestions for a new holistic design of the two systems are presented.

Robust Adaptive ELS-QR Algorithm for Linear Discrete Time Stochastic Systems Identification

This work proposes a recursive weighted ELS algorithm for system identification that applies numerically robust orthogonal Householder transformations. The properties of the proposed algorithm show that it obtains acceptable results in a noisy environment: fast convergence and asymptotically unbiased estimates. A comparative analysis with other robust methods well known from the literature is also presented.
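
A minimal Python sketch of the numerically robust core step: solving a weighted (forgetting-factor) least squares subproblem via a Householder QR factorization rather than the normal equations (the ELS pseudo-linear regression supplying the noise-term regressors is omitted; the ARX data and weights are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative ARX-type data: y[t] = a*y[t-1] + b*u[t] + noise.
    a_true, b_true, n = 0.7, 1.5, 200
    u = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = a_true * y[t - 1] + b_true * u[t] + 0.1 * rng.normal()

    Phi = np.column_stack([y[:-1], u[1:]])       # regressor matrix
    Y = y[1:]
    lam = 0.98                                   # exponential forgetting factor
    w = np.sqrt(lam ** np.arange(len(Y) - 1, -1, -1))

    # Weighted LS via Householder QR: min || diag(w) (Y - Phi theta) ||.
    Q, R = np.linalg.qr(w[:, None] * Phi)        # numpy's QR uses Householder reflections
    theta = np.linalg.solve(R, Q.T @ (w * Y))
    print(theta)                                 # close to [0.7, 1.5]

Avoiding the explicit product Phi^T Phi is what buys the numerical robustness: the QR route roughly halves the effective loss of precision in ill-conditioned problems.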

Analytical Investigation of the Effects of a Standing Ocean Wave in a Wave-Power Device OWC

In this work, we study analytically and numerically the mean heave motion of an oscillating water column (OWC) coupled with the governing equation of ocean waves spreading due to the width variation of an open parabolic channel with constant depth. The ocean wave propagation is considered under the assumption of shallow flow conditions. In order to verify the effect of the waves on the OWC, we first establish the analytical model in non-dimensional form based on the energy equation. The proposed wave-power system has two aims: one is to perturb the ocean waves, as a consequence of the channel shape, so as to concentrate the maximum ocean wave amplitude in the neighborhood of the OWC; the second is to determine the pressure and volume oscillation of the air inside the compression chamber.

An Improved Algorithm for Pilot-Signal-Based Channel Estimation in OFDM Systems

This paper presents a new algorithm for channel estimation in orthogonal frequency division multiplexing (OFDM) systems based on a pilot signal, for the new generation of high data rate communication systems. In OFDM systems over fast-varying fading channels, channel estimation and tracking are generally carried out by transmitting known pilot symbols at given positions of the frequency-time grid. We propose an improved algorithm based on calculating the mean and the variance of the adjacent pilot signals for a specific distribution of the pilots in the OFDM frequency-time grid, and then computing all the unknown channel coefficients from the mean and variance equations. Simulation results show that the performance of the OFDM system improves as the channel length increases, since the accuracy of the estimated channel increases under this low-complexity algorithm. Moreover, the number of pilot signals that need to be inserted in the OFDM signal is reduced, which increases the throughput of the signal over the OFDM system compared with other pilot distributions such as comb-type and block-type channel estimation.
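
A minimal Python sketch of comb-type pilot estimation of the kind the algorithm builds on: the channel is measured at pilot subcarriers by least squares, and the remaining subcarriers are filled in from the adjacent pilot estimates, here simply their mean (the proposed mean-and-variance update itself is not reproduced; the channel, pilot spacing and noise level are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(0)
    N, step = 64, 8                              # subcarriers, pilot spacing

    # Illustrative frequency-selective channel and known all-ones pilot symbols.
    h = np.fft.fft(np.array([0.8, 0.5, 0.3]), N)  # true channel frequency response
    pilots = np.arange(0, N, step)
    X = np.ones(N, complex)
    Y = h * X + 0.05 * (rng.normal(size=N) + 1j * rng.normal(size=N))

    H_ls = Y[pilots] / X[pilots]                 # LS estimate at pilot positions

    # Fill non-pilot subcarriers from adjacent pilot estimates: here, the mean
    # of the two neighboring pilots (a simple statistical stand-in).
    H_hat = np.zeros(N, complex)
    for k in range(N):
        left = pilots[pilots <= k].max()
        right = pilots[pilots >= k].min() if (pilots >= k).any() else pilots[-1]
        i, j = left // step, right // step
        H_hat[k] = (H_ls[i] + H_ls[j]) / 2 if i != j else H_ls[i]

    print(f"MSE = {np.mean(np.abs(H_hat - h) ** 2):.4f}")

Any scheme that infers the in-between coefficients from pilot statistics, rather than from more pilots, trades estimation cleverness for throughput, which is the gain the abstract claims.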

Cross Signal Identification for PSG Applications

The standard investigative method for obstructive sleep apnea syndrome (OSAS) diagnosis is polysomnography (PSG), which consists of a simultaneous, usually overnight, recording of multiple electro-physiological signals related to sleep and wakefulness. This is an expensive, encumbering and not readily repeated protocol, and therefore there is a need for simpler and easily implemented screening and detection techniques. Identification of apnea/hypopnea events in the screening recordings is the key factor in the diagnosis of OSAS. The analysis of a single-lead electrocardiographic (ECG) signal for OSAS diagnosis, which may be done with portable devices at the patient's home, has been the challenge of recent years. A novel artificial neural network (ANN) based approach for feature extraction and automatic identification of respiratory events in ECG signals is presented in this paper. A nonlinear principal component analysis (NLPCA) method was considered for feature extraction, and a support vector machine for classification/recognition. An alternative representation of the respiratory events by means of a Kohonen type neural network is discussed. Our prospective study was based on OSAS patients of the Clinical Hospital of Pneumology from Iaşi, Romania, both males and females, as well as on non-OSAS investigated human subjects. Our computational analysis includes a learning phase based on cross-signal PSG annotation.
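
A minimal Python sketch of the classification stage: per-epoch features computed from single-lead ECG-derived RR intervals (simple heart-rate-variability statistics stand in for the NLPCA features) are fed to a support vector machine (the synthetic apneic/normal RR data are illustrative assumptions):

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    def features(rr):
        """Toy HRV features for one epoch of RR intervals (seconds)."""
        d = np.diff(rr)
        return [rr.mean(), rr.std(), np.sqrt((d ** 2).mean())]  # mean, SDNN, RMSSD

    # Illustrative data: apneic epochs show slow cyclic RR variation, normal do not.
    def epoch(apnea):
        t = np.arange(60)
        rr = 0.9 + 0.02 * rng.normal(size=60)
        if apnea:
            rr = rr + 0.1 * np.sin(2 * np.pi * t / 25)
        return features(rr)

    X = np.array([epoch(a) for a in [0] * 80 + [1] * 80])
    y = np.array([0] * 80 + [1] * 80)
    print(cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean())

In the study itself, the PSG annotations play the role of the labels y during the learning phase, which is what "cross-signal annotation" supplies.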

Effect of Rotor to Casing Ratios with Different Rotor Vanes on Performance of Shaft Output of a Vane Type Novel Air Turbine

This paper deals with a new concept: using compressed atmospheric air as a zero-pollution power source for running motorbikes. The motorbike is equipped with an air turbine in place of an internal combustion engine and transforms the energy of the compressed air into shaft work. The mathematical modeling and performance evaluation of a small capacity compressed-air-driven vane-type novel air turbine are presented in this paper. The effects of isobaric admission and adiabatic expansion of high pressure air for different rotor-to-casing diameter ratios with respect to different vane angles (numbers of vanes) have been considered and analyzed. It is found that the shaft work output is optimum for some typical values of the rotor-to-casing diameter ratio at a particular vane angle (number of vanes). In this study, the maximum power is obtained as 4.5-5.3 kW (5.5-6.25 HP) when the casing diameter is 100 mm and the rotor-to-casing diameter ratio is kept between 0.55 and 0.65. This output is sufficient to run a motorbike.
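
As a hedged illustration of the modeled cycle (a textbook expression for an ideal positive-displacement air motor, not necessarily the authors' exact model): isobaric admission of air at pressure p_1 into swept volume v_1, followed by adiabatic expansion to the exhaust pressure p_2, yields a cycle work of approximately

    W \approx \frac{\gamma}{\gamma - 1}\, p_1 v_1 \left[ 1 - \left( \frac{p_2}{p_1} \right)^{(\gamma - 1)/\gamma} \right],

with gamma of about 1.4 for air; the rotor-to-casing diameter ratio and the vane angle enter through the admitted and expanded volumes swept between adjacent vanes, which is why an optimum appears at particular ratios.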

A New Method for Contour Approximation Using the Basic Ramer Idea

This paper presents two new efficient algorithms for contour approximation. The proposed algorithms are compared with the Ramer (good quality), Triangle (faster) and Trapezoid (fastest) algorithms, which are briefly described. The Cartesian coordinates of an input contour are processed in such a manner that the contour is finally represented by a set of selected vertices of its edge. The main idea of the analyzed procedures for contour compression is presented. For comparison, the mean square error and signal-to-noise ratio criteria are used. The computational time of the analyzed methods is estimated based on the number of numerical operations. Experimental results are reported in terms of image quality, compression ratio and speed. The main advantage of the analyzed algorithms is the small number of arithmetic operations compared to the existing algorithms.
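
A minimal Python sketch of the basic Ramer (Ramer-Douglas-Peucker) idea on which the new algorithms build: recursively keep the vertex farthest from the current chord whenever its distance exceeds a tolerance (the sample contour and tolerance are illustrative):

    import math

    def perp_dist(p, a, b):
        """Perpendicular distance from point p to the line through a and b."""
        (px, py), (ax, ay), (bx, by) = p, a, b
        dx, dy = bx - ax, by - ay
        L = math.hypot(dx, dy)
        if L == 0:
            return math.hypot(px - ax, py - ay)
        return abs(dy * (px - ax) - dx * (py - ay)) / L

    def ramer(points, eps):
        """Ramer-Douglas-Peucker polyline simplification."""
        if len(points) < 3:
            return points
        d, k = max((perp_dist(points[i], points[0], points[-1]), i)
                   for i in range(1, len(points) - 1))
        if d <= eps:
            return [points[0], points[-1]]       # chord is close enough
        return ramer(points[:k + 1], eps)[:-1] + ramer(points[k:], eps)

    contour = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
    print(ramer(contour, eps=0.5))               # selected contour vertices

The per-split distance computations are where the arithmetic cost lies, which is what the faster Triangle and Trapezoid variants, and the proposed algorithms, reduce.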

Comprehensive Hierarchy Evaluation of Power Quality Based on an Incentive Mechanism

In a liberalized electricity market, it is not surprising that different customers require different power quality (PQ) levels at different prices. Power quality, related to several power disturbances, is described by many parameters, so how to define a comprehensive hierarchy evaluation system of power quality (PQCHES) has become an issue of concern. In this paper, based on four electromagnetic compatibility (EMC) levels, the numerical range of each power disturbance is divided into five grades (Grade I to Grade V), and the "barrel principle" of power quality is used to assess the overall PQ performance with a single grade indicator. A case study based on actual monitored PQ data shows that the site PQ grade indicates the electromagnetic environment level and also expresses the characteristics of the loads served by the site. The shortest-plank principle of the PQ barrel is an incentive mechanism, which can be combined with a rewards/penalty mechanism (RPM) for energy consumed "on quality demand", to stimulate utilities to improve the overall PQ level and to encourage end users to be "smarter" under the infrastructure of the future smart grid.
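
A minimal Python sketch of the "shortest plank" aggregation: each disturbance receives a grade from I (best) to V, and the site's overall grade is its worst individual grade (the disturbance values and grade boundaries are illustrative assumptions, not the paper's EMC-derived limits):

    # Illustrative grade boundaries per disturbance (upper limits for grades
    # I..IV; anything beyond the last limit is grade V). Not EMC-based values.
    LIMITS = {
        "THD_voltage_pct": [2, 3, 5, 8],
        "flicker_Pst": [0.4, 0.7, 1.0, 1.4],
        "unbalance_pct": [0.5, 1.0, 2.0, 3.0],
    }
    GRADES = ["I", "II", "III", "IV", "V"]

    def grade(name, value):
        for g, limit in zip(GRADES, LIMITS[name]):
            if value <= limit:
                return g
        return "V"

    site = {"THD_voltage_pct": 2.6, "flicker_Pst": 0.9, "unbalance_pct": 0.4}
    per = {k: grade(k, v) for k, v in site.items()}
    overall = max(per.values(), key=GRADES.index)   # the shortest plank rules
    print(per, "-> overall:", overall)

Because one bad disturbance drags the whole site grade down, a utility is incentivized to fix its worst parameter first, which is exactly the incentive mechanism the paper pairs with the rewards/penalty scheme.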