Trends, Problems and Needs of Urban Housing in Malaysia

The right to housing is a basic need, while good quality, affordable housing is a reflection of a high quality of life. However, housing remains a major problem for most, especially for the bottom billions. Satisfaction with housing and neighbourhood conditions is one of the important indicators that reflect quality of life. These indicators are also important in the process of evaluating housing policy, with the objective of improving the quality of housing and neighbourhoods. The research method is purely quantitative, using a survey. The findings show that the housing purchasing trend in urban Malaysia is determined by demographic profiles, mainly education level, age, gender and income. The period of housing ownership also influenced the socio-cultural interactions and the satisfaction of house owners with their neighbourhoods. The findings also show that the main concerns for house buyers in urban areas are the price and location of the house. Respondents feel that houses in urban Malaysia are too expensive and beyond their affordability. The location of houses and their distance from the workplace are also regarded as main concerns. However, respondents are fairly satisfied with the religious and socio-cultural facilities in their housing areas and, most importantly, few regard ethnicity as an issue in their decision-making when buying a house.

A Flexible and Scalable Agent Platform for Multi-Agent Systems

A multi-agent system is composed of several agents capable of reaching a goal cooperatively. Such a system needs an agent platform for efficient and stable interaction between intelligent agents. In this paper, we propose a flexible and scalable agent platform that composes containers of multiple hierarchical agent groups. Unlike JADE, it also allows multiple domain presentations of the agents to be implemented efficiently. The proposed platform provides both group management and individual management of agents for efficiency. The platform has been implemented and tested, and it can be used as a flexible foundation for dynamic multi-agent systems targeting the seamless delivery of ubiquitous services.
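
As a rough illustration only (the class and method names below are hypothetical and not taken from the proposed platform), a container holding hierarchical agent groups could support both group-level and individual-level management along these lines:

```python
class Agent:
    def __init__(self, name):
        self.name = name

    def receive(self, message):
        print(f"{self.name} received: {message}")


class AgentGroup:
    """A group holding agents and nested sub-groups (hierarchical)."""
    def __init__(self, name):
        self.name = name
        self.agents = {}
        self.subgroups = {}

    def add_agent(self, agent):
        self.agents[agent.name] = agent

    def add_subgroup(self, group):
        self.subgroups[group.name] = group

    def broadcast(self, message):
        """Group management: deliver to every agent in this group and below."""
        for agent in self.agents.values():
            agent.receive(message)
        for sub in self.subgroups.values():
            sub.broadcast(message)

    def find(self, agent_name):
        """Individual management: locate one agent anywhere in the hierarchy."""
        if agent_name in self.agents:
            return self.agents[agent_name]
        for sub in self.subgroups.values():
            hit = sub.find(agent_name)
            if hit is not None:
                return hit
        return None


# A container is simply a named collection of such group hierarchies.
container = {"main": AgentGroup("root")}
sensors = AgentGroup("sensors")
sensors.add_agent(Agent("temperature-agent"))
container["main"].add_subgroup(sensors)
container["main"].add_agent(Agent("coordinator"))

container["main"].broadcast("service update")             # group management
print(container["main"].find("temperature-agent").name)   # individual management
```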

Universal Method for Timetable Construction based on Evolutionary Approach

Timetabling problems are often hard and time-consuming to solve. Most methods for solving them address only one problem instance or class. This paper describes a universal method for solving large, highly constrained timetabling problems from different domains. The solution is based on an evolutionary algorithm framework and operates on two levels: a first-level evolutionary algorithm tries to find a solution based on a given set of operating parameters, while a second-level algorithm is used to establish those parameters. Tabu search is employed to speed up the solution-finding process on the first level. The method has been used to solve three different timetabling problems with promising results.
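
The abstract does not give pseudocode; the following Python sketch only illustrates the general two-level arrangement under stated assumptions: a second-level evolutionary loop searches over operating parameters (here, mutation rate and population size), while the first-level evolutionary algorithm, helped by a small tabu list, uses those parameters to search for a low-penalty timetable on a toy conflict problem.

```python
import random

EVENTS = 20          # toy problem: assign 20 events to 8 timeslots
SLOTS = 8
CONFLICTS = [(i, i + 1) for i in range(0, EVENTS - 1, 2)]  # pairs that must not share a slot

def penalty(timetable):
    """Number of violated constraints (lower is better)."""
    return sum(1 for a, b in CONFLICTS if timetable[a] == timetable[b])

def first_level_ea(mutation_rate, pop_size, generations=100, tabu_size=20):
    """First level: evolve timetables under the given operating parameters,
    with a small tabu list to avoid revisiting recently generated solutions."""
    pop = [[random.randrange(SLOTS) for _ in range(EVENTS)] for _ in range(pop_size)]
    tabu = []
    best = min(pop, key=penalty)
    for _ in range(generations):
        parent = min(random.sample(pop, 3), key=penalty)       # tournament selection
        child = [random.randrange(SLOTS) if random.random() < mutation_rate else g
                 for g in parent]
        if tuple(child) in tabu:
            continue                                           # tabu: skip recent solutions
        tabu.append(tuple(child))
        tabu = tabu[-tabu_size:]
        pop[random.randrange(pop_size)] = child                # replace a random individual
        if penalty(child) < penalty(best):
            best = child
    return penalty(best)

def second_level_ea(meta_generations=10):
    """Second level: evolve the operating parameters themselves."""
    params = [(random.uniform(0.01, 0.3), random.randint(10, 40)) for _ in range(6)]
    for _ in range(meta_generations):
        scored = sorted(params, key=lambda p: first_level_ea(*p))
        survivors = scored[:3]
        params = survivors + [(max(0.01, m + random.gauss(0, 0.05)),
                               max(5, n + random.randint(-5, 5)))
                              for m, n in survivors]            # mutate the survivors
    return min(params, key=lambda p: first_level_ea(*p))

print("best operating parameters found:", second_level_ea())
```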

Integrated Approaches to Enhance Aggregate Production Planning with Inventory Uncertainty Based On Improved Harmony Search Algorithm

This work presents a multiple objective linear programming (MOLP) model based on the desirability function approach for solving the aggregate production planning (APP) decision problem, building on Masud and Hwang's model. The proposed model minimises total production costs, carrying or backordering costs, and rates of change in labor levels. An industrial case demonstrates the feasibility of applying the proposed model to APP problems with three scenarios of inventory levels. The proposed model yields an efficient compromise solution and the overall level of decision maker (DM) satisfaction with the multiple combined response levels. There has been a trend towards solving complex planning problems using various metaheuristics. Therefore, in this paper, the multi-objective APP problem is also solved by hybrid metaheuristics that add hunting search (HuSIHSA) and firefly (FAIHSA) mechanisms to the improved harmony search algorithm. The results obtained from these solution mechanisms are then compared. It is observed that the FAIHSA can be used as a successful alternative solution mechanism for solving APP problems over the three scenarios. Furthermore, the FAIHSA provides a systematic framework for facilitating the decision-making process, enabling a decision maker to interactively modify the desirability function approach and related model parameters until a satisfactory solution is obtained with proper selection of the control parameters.
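
As a hedged illustration of the desirability-function idea (the functional form and the numbers below are assumptions, not the paper's model), individual smaller-is-better desirabilities for each APP objective can be aggregated into one overall satisfaction level:

```python
def desirability(value, target, worst):
    """Smaller-is-better desirability: 1 at the target value, 0 at the worst
    acceptable value, linear in between (a common Derringer-type form)."""
    if value <= target:
        return 1.0
    if value >= worst:
        return 0.0
    return (worst - value) / (worst - target)

def overall_desirability(values, targets, worsts):
    """Combine individual desirabilities with a geometric mean, so a very poor
    objective drags the overall satisfaction level down."""
    ds = [desirability(v, t, w) for v, t, w in zip(values, targets, worsts)]
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical objectives for one candidate plan: total production cost,
# carrying/backordering cost, and rate of change in labor level.
values  = [1_050_000, 82_000, 0.12]
targets = [1_000_000, 75_000, 0.05]
worsts  = [1_200_000, 120_000, 0.25]
print("overall satisfaction level:", round(overall_desirability(values, targets, worsts), 3))
```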

A Study of Liver Checkup in Patients with Hepatitis C in the Region of Batna

Hepatitis C is an infectious disease transmitted by blood and caused by the hepatitis C virus (HCV), which attacks the liver. The infection is characterized by liver inflammation (hepatitis) that is often asymptomatic but can progress to chronic hepatitis and later cirrhosis and liver cancer. This study aims to highlight, on the one hand, the prevalence of this infectious disease in the population of the region of Batna and, on the other hand, the biological characteristics of the disease, through screening and specific diagnosis based on serological tests and a liver checkup (measurement of haematological and biochemical parameters). Serology establishes the diagnosis of hepatitis C infection: in this study, the serological test identified 24 cases of hepatitis C among 1000 suspected cases (7 cases with normal transaminases and 17 cases with elevated transaminases), giving a prevalence of 2.4% in the study population. The presence of hepatitis C disrupts liver function, including the onset of cytolysis, cholestasis, jaundice, thrombocytopenia, and coagulation disorders.

A New Model of English-Vietnamese Bilingual Information Retrieval System

In this paper, we propose a new model of an English-Vietnamese bilingual information retrieval system. Although many CLIR systems have been researched and built, the accuracy of search results across the languages that a CLIR system supports still needs to improve, especially in finding bilingual documents. The problems identified in this paper are the limitations of machine translation output and the extremely large collections of documents to be searched. We therefore try to establish a different model to overcome these problems.

Adaptive Neuro-Fuzzy Inference System for Financial Trading using Intraday Seasonality Observation Model

The prediction of financial time series is a very complicated process. If the efficient market hypothesis holds, then the predictability of most financial time series would be a rather controversial issue, because the current price already contains all available information in the market. This paper extends the Adaptive Neuro-Fuzzy Inference System for High Frequency Trading, an expert system capable of using fuzzy reasoning combined with the pattern recognition capability of neural networks for financial forecasting and high-frequency trading. However, in order to eliminate unnecessary input in the training phase, a new event-based volatility model is proposed. Taking volatility and the scaling laws of financial time series into consideration has brought about the development of the Intraday Seasonality Observation Model. This new model allows the observation of specific events and seasonalities in the data and subsequently removes any unnecessary data. This event-based volatility model provides the ANFIS system with more accurate input and has increased the overall performance of the system.
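
The exact Intraday Seasonality Observation Model is not specified in the abstract; the sketch below only illustrates the general event-based idea under an assumed rule: retain an observation only when the price has moved by at least a threshold since the last retained point, and discard the rest before training the predictor.

```python
def event_filter(prices, threshold=0.005):
    """Keep only the points where the price has moved by at least `threshold`
    (relative) since the last retained point; intermediate data are discarded.
    This is a generic event-based filter, not the paper's exact model."""
    if not prices:
        return []
    kept = [0]
    last = prices[0]
    for i, p in enumerate(prices[1:], start=1):
        if abs(p - last) / last >= threshold:
            kept.append(i)
            last = p
    return kept

# Toy intraday series: mostly flat with two bursts of movement.
prices = [100.0, 100.01, 100.02, 100.8, 100.82, 100.81, 101.6, 101.61, 101.6]
indices = event_filter(prices, threshold=0.005)
print("retained observations:", [prices[i] for i in indices])
```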

Introducing Successful Financial Innovations: Rewriting the Rules in Light of the Global Financial Crisis

Since the 1980s, banks and financial service institutions have been running an endless race of innovation to cope with advancing technology, fierce competition, and more sophisticated and demanding customers. To guide their innovation efforts, several studies were conducted to identify the success and failure factors of new financial services. These mainly included organizational factors, marketplace factors and new service development process factors. Almost all of them emphasized the importance of customer and market orientation as a response to the highly perceptual and intangible characteristics of financial services. However, they de-emphasized the critical characteristics of high risk involvement and close correlation with economic conditions, factors that heavily contributed to the global financial crisis of 2008. This paper reviews the success and failure factors of new financial services. It then adds new perspectives emerging from an analysis of the role of innovation in the global financial crisis.

Analysis of Relation between Unlabeled and Labeled Data to Self-Taught Learning Performance

Obtaining labeled data in supervised learning is often difficult and expensive, and thus the trained learning algorithm tends to overfit due to the small amount of training data. As a result, some researchers have focused on using unlabeled data, which does not necessarily follow the same generative distribution as the labeled data, to construct high-level features for improving performance on supervised learning tasks. In this paper, we investigate the impact of the relationship between unlabeled and labeled data on classification performance. Specifically, we apply different unlabeled datasets, which have different degrees of relation to the labeled data, to a handwritten digit classification task based on the MNIST dataset. Our experimental results show that the higher the degree of relation between unlabeled and labeled data, the better the classification performance. Although unlabeled data drawn from a completely different generative distribution than the labeled data provides the lowest classification performance, we still achieve high classification performance. This leads to expanding the applicability of supervised learning algorithms using unsupervised learning.
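
A minimal self-taught-learning pipeline can be sketched as follows; this is not the paper's exact setup (it uses scikit-learn's small load_digits set as a stand-in for MNIST and PCA as a stand-in for the unsupervised feature learner), but it shows the two stages: learn features from unlabeled data, then train a supervised classifier on a small labeled set in that feature space.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

digits = load_digits()                      # small stand-in for MNIST
X, y = digits.data, digits.target

# Pretend most of the data is unlabeled and only 200 samples are labeled.
X_unlabeled, X_labeled, _, y_labeled = train_test_split(
    X, y, test_size=200, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X_labeled, y_labeled, test_size=0.5, random_state=0)

# Unsupervised stage: learn a feature representation from the unlabeled pool.
# Swapping in a more or less related unlabeled pool here is how the degree of
# relation to the labeled data would be varied.
feature_learner = PCA(n_components=30).fit(X_unlabeled)

# Supervised stage: train on the small labeled set in the learned feature space.
clf = LogisticRegression(max_iter=1000).fit(feature_learner.transform(X_train), y_train)
print("test accuracy:", clf.score(feature_learner.transform(X_test), y_test))
```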

Food Security in India: A Case Study of Kandi Region of Punjab

Banishing hunger from the face of the earth has been a goal frequently expressed in various international, national and regional conferences since 1974. Providing food security has become an important issue across the world, particularly in developing countries. In a developing country like India, where the growth rate of the population is higher than that of food grain production, food security is a question of great concern. According to the International Food Policy Research Institute's Global Hunger Index, 2011, India ranks 67 among the 81 countries of the world with the worst food security status. After the Green Revolution, India became a food surplus country. Its production increased from 74.23 million tonnes in 1966-67 to 257.44 million tonnes in 2011-12. But after achieving self-sufficiency in food during the last three decades, the country is now facing new challenges due to increasing population, climate change and stagnation in farm productivity. Therefore, the main objective of the present paper is to examine the food security situation at the national level and, further, to explain the paradox of food insecurity in a food surplus state of India, i.e. Punjab, at the micro level. In order to achieve these objectives, secondary data collected from the Ministry of Agriculture and the Agriculture Department of Punjab State were analyzed. The results of the study show that, despite having surplus food production, the country is still facing a food insecurity problem at the micro level. Within the Kandi belt of Punjab State, the area adjacent to the plains is food secure, while the area along the hills falls in the food insecure zone. The present paper is divided into the following three sections: (i) Introduction; (ii) Analysis of the food security situation at the national level as well as the micro level (Kandi belt of Punjab State); (iii) Concluding Observations.

A Simple Affymetrix Ratio-transformation Method Yields Comparable Expression Level Quantifications with cDNA Data

Gene expression profiling is rapidly evolving into a powerful technique for investigating tumor malignancies. Researchers are overwhelmed with microarray-based platforms and methods that give them the freedom to conduct large-scale gene expression profiling measurements. Simultaneously, investigations into cross-platform integration methods have started gaining momentum due to their underlying potential to help comprehend a myriad of broad biological issues in tumor diagnosis, prognosis, and therapy. However, comparing results from different platforms remains a challenging task, as various inherent technical differences exist between microarray platforms. In this paper, we explain a simple ratio-transformation method that can provide some common ground between the cDNA and Affymetrix platforms towards cross-platform integration. The method is based on the characteristic data attributes of the Affymetrix and cDNA platforms. In this work, we considered seven childhood leukemia patients and their gene expression levels on either platform. With a dataset of 822 differentially expressed genes from both platforms, we applied a specific ratio treatment to the Affymetrix data, which subsequently showed an improved relationship with the cDNA data.
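
The abstract does not spell out the ratio treatment; one common convention, shown below purely as an assumption-laden sketch, is to divide each Affymetrix intensity by the gene's mean across samples and take log2, which places the single-channel data on a log-ratio scale comparable to two-colour cDNA measurements.

```python
import numpy as np

# Hypothetical Affymetrix intensities: rows = genes, columns = patients.
affy = np.array([
    [1200.0,  950.0, 1800.0],
    [ 300.0,  280.0,  150.0],
    [5000.0, 5200.0, 4800.0],
])

# Ratio transformation (one common convention, not necessarily the paper's):
# divide each intensity by the gene's mean across patients, then take log2.
gene_means = affy.mean(axis=1, keepdims=True)
affy_log_ratios = np.log2(affy / gene_means)

print(np.round(affy_log_ratios, 2))
```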

Marketing Strategy Analysis of Boon Rawd Brewery Company

Boon Rawd Brewery is a beer company based in Thailand that has an exemplary image, both as a good employer and a well-managed company with a strong record of social responsibility. The most famous of the company’s products is Singha beer. To study the company’s marketing strategy, a case study analysis was conducted together with qualitative research methods. The study analyzed the marketing strategy of Boon Rawd Brewery before the liberalization of the liquor market in 2000. The company’s marketing strategies consisted of the following: product line strategy, product development strategy, block channel strategy, media strategy, trade strategy, and consumer incentive strategy. Additionally, the company employed marketing mix strategy based on the 4Ps: product, price, promotion and place (of distribution).

Modified Fuzzy ARTMAP and Supervised Fuzzy ART: Comparative Study with Multispectral Classification

In this article, a modification of the fuzzy ART network algorithm, aiming at making it supervised, is carried out. It consists of searching for the comparison, training and vigilance parameters that give the minimum quadratic distances between the outputs of the training base and those obtained by the network. The same process is applied to determine the parameters of the fuzzy ARTMAP giving the most powerful network. The modification consists in having the fuzzy ARTMAP learn a base of examples not just once, as is usual, but as many times as its architecture keeps evolving or the objective error is not yet reached. In this way, we do not have to worry about the values to impose on the eight (8) parameters of the network. To evaluate each of these three modified networks, a comparison of their performances is carried out. As an application, we carried out a classification of an image of the Bay of Algiers taken by SPOT XS. We use as evaluation criteria the training duration, the mean square error (MSE) at the control step, and the rate of correct classification per class. The results of this study, presented as curves, tables and images, show that the modified fuzzy ARTMAP offers the best compromise between quality and computing time.
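
A full fuzzy ARTMAP is beyond an abstract-sized sketch; the code below implements only a minimal unsupervised fuzzy ART (an assumption-laden stand-in, not the paper's supervised modification) to illustrate the roles of the vigilance, choice and learning parameters that the described procedure searches over.

```python
import numpy as np

def complement_code(x):
    return np.concatenate([x, 1.0 - x])

class FuzzyART:
    """Minimal unsupervised fuzzy ART, used here only to illustrate the vigilance
    (rho), choice (alpha) and learning (beta) parameters."""
    def __init__(self, rho=0.75, alpha=0.001, beta=1.0):
        self.rho, self.alpha, self.beta = rho, alpha, beta
        self.weights = []                               # one weight vector per category

    def train(self, X, epochs=1):
        for _ in range(epochs):
            for x in X:
                self._learn(complement_code(np.asarray(x, dtype=float)))
        return len(self.weights)                        # number of categories created

    def _learn(self, x):
        # Rank existing categories by the choice function.
        scores = [np.minimum(x, w).sum() / (self.alpha + w.sum()) for w in self.weights]
        for j in np.argsort(scores)[::-1]:
            w = self.weights[j]
            if np.minimum(x, w).sum() / x.sum() >= self.rho:   # vigilance test passed
                self.weights[j] = self.beta * np.minimum(x, w) + (1 - self.beta) * w
                return
        self.weights.append(x.copy())                   # no match: create a new category

# Higher vigilance -> finer categories; this is the kind of effect the parameter
# search described above is trying to optimise.
X = np.random.RandomState(0).rand(50, 2)
for rho in (0.5, 0.8, 0.95):
    print("rho =", rho, "-> categories =", FuzzyART(rho=rho).train(X))
```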

Life Cycle Assessment of Seawater Desalinization in Western Australia

Perth will run out of available sustainable natural water resources by 2015 if nothing is done to slow usage rates, according to a Western Australian study [1]. Alternative water technology options need to be considered for the long-term guaranteed supply of water for agricultural, commercial, domestic and industrial purposes. Seawater is an alternative source of water for human consumption, because seawater can be desalinated and supplied in large quantities at a very high quality. While seawater desalination is a promising option, the technology requires a large amount of energy, which is typically generated from fossil fuels. The combustion of fossil fuels emits greenhouse gases (GHG) and is implicated in climate change. In addition to environmental emissions from electricity generation for desalination, greenhouse gases are emitted in the production of chemicals and membranes for water treatment. Since Australia is a signatory to the Kyoto Protocol, it is important to quantify greenhouse gas emissions from desalinated water production. A life cycle assessment (LCA) has been carried out to determine the greenhouse gas emissions from the production of 1 gigalitre (GL) of water from the new plant. In this LCA, a new desalination plant to be installed in Bunbury, Western Australia, known as the Southern Seawater Desalination Plant (SSDP), was taken as a case study. The system boundary of the LCA mainly consists of three stages: seawater extraction, treatment and delivery. The analysis found that the equivalent of 3,890 tonnes of CO2 could be emitted from the production of 1 GL of desalinated water. The LCA also identified that the reverse osmosis process would cause the most significant greenhouse emissions as a result of the electricity used, if this electricity is generated from fossil fuels.
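
Purely as an illustration of the accounting, the sketch below sums hypothetical stage-wise electricity use and embodied emissions into a per-GL figure; the emission factor and the breakdown are assumptions chosen only so the total matches the reported 3,890 t CO2-e, not the paper's actual inventory.

```python
# Hypothetical life-cycle inventory per GL of desalinated water (illustrative only;
# the stage names follow the system boundary, the numbers do not come from the paper).
EMISSION_FACTOR_KG_CO2_PER_KWH = 0.9        # assumed fossil-fuel-dominated grid

electricity_kwh_per_GL = {
    "seawater extraction": 300_000,
    "treatment (reverse osmosis)": 3_500_000,
    "delivery": 400_000,
}
chemicals_and_membranes_t_co2 = 110          # assumed embodied emissions

total_t_co2 = sum(v * EMISSION_FACTOR_KG_CO2_PER_KWH
                  for v in electricity_kwh_per_GL.values()) / 1000
total_t_co2 += chemicals_and_membranes_t_co2
print(f"approx. {total_t_co2:,.0f} t CO2-e per GL")
```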

Natural Discovery: Electricity Potential from Vermicompost (Waste to Energy)

Wastes such as grated coconut meat, spent tea and used sugarcane have contributed negative impacts to the environment. The vermicomposting method is used to manage these wastes in a more sustainable way. The worms used in the vermicomposting are Eisenia foetida and Eudrilus eugeniae. This research shows that the vermicompost of these wastes produces an electrical voltage and is able to light up a light-emitting diode (LED) device. Based on the experiment, using replicated, double compartments of the component produces double the voltage. Hence, in conclusion, this harmless and low-cost vermicompost technology can act as a dry cell and reduce the usage of hazardous chemicals that can contaminate the environment.

Study on Performance of Wigner Ville Distribution for Linear FM and Transient Signal Analysis

This research paper presents methods to assess the performance of the Wigner-Ville distribution (WVD) for the time-frequency representation of non-stationary signals, in comparison with other representations such as the STFT and the spectrogram. The simultaneous time-frequency resolution of the WVD is one of the important properties that makes it preferable for the analysis and detection of linear FM and transient signals. Two algorithms are proposed here to assess the resolution and to compare signal detection performance. The first method is based on measuring the area under the time-frequency plot, in the case of linear FM signal analysis. The second method is based on instantaneous power calculation and is used in the case of transient, non-stationary signals. The implementation of both methods is explained briefly with suitable diagrams. The accuracy of the measurements is validated to show the better performance of the WVD representation in comparison with the STFT and spectrograms.
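
A simplified sketch of the first proposed measure is given below: compute a discrete Wigner-Ville distribution of a complex linear-FM chirp and estimate the "area" as the fraction of the time-frequency plane holding significant energy. The implementation details (zero-padded edges, the threshold choice) are assumptions, not the paper's algorithm.

```python
import numpy as np

def wigner_ville(x):
    """Discrete pseudo Wigner-Ville distribution of a complex (analytic) signal x;
    a simplified sketch in which edge lags are zero-padded."""
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        kmax = min(n, N - 1 - n)
        kernel = np.zeros(N, dtype=complex)
        for k in range(-kmax, kmax + 1):
            kernel[k % N] = x[n + k] * np.conj(x[n - k])
        W[:, n] = np.real(np.fft.fft(kernel))
    return W

# Linear FM (chirp) test signal, generated directly as a complex exponential.
N = 256
t = np.arange(N)
phase = 2 * np.pi * (0.05 * t + 0.15 * t**2 / (2 * N))
x = np.exp(1j * phase)

W = wigner_ville(x)

# Crude "area under the time-frequency plot": fraction of cells holding
# significant energy; a smaller area means sharper time-frequency concentration.
threshold = 0.1 * W.max()
area_fraction = (W > threshold).mean()
print("fraction of the TF plane above threshold:", round(area_fraction, 4))
```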

Phase Noise Impact on BER in Space Communication

This paper deals with the modeling and evaluation of the influence of multiplicative phase noise on the bit error ratio in a general space communication system. Our research is focused on systems with multi-state phase shift keying modulation techniques, and it turns out that the phase noise significantly affects the bit error rate, especially at higher signal-to-noise ratios. These results come from a system model created in the Matlab environment and are shown in the form of constellation diagrams and bit error rate dependencies. Changes in the user data bit rate are also considered and included in the simulation results. The obtained outcomes confirm the theoretical presumptions.
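
The paper's model is built in Matlab; the following Python sketch reproduces the general idea under simplifying assumptions (QPSK only, Gaussian per-symbol phase noise): multiplicative phase noise produces an error floor that becomes dominant at higher signal-to-noise ratios.

```python
import numpy as np

def qpsk_ber_with_phase_noise(ebn0_db, phase_noise_std_rad, n_bits=200_000, rng=None):
    """Monte Carlo BER of Gray-coded QPSK with AWGN plus a multiplicative phase
    rotation per symbol. A simplified sketch, not the paper's exact model."""
    rng = rng or np.random.default_rng(0)
    bits = rng.integers(0, 2, n_bits)
    b0, b1 = bits[0::2], bits[1::2]                   # bit pairs -> QPSK symbols
    symbols = ((1 - 2 * b0) + 1j * (1 - 2 * b1)) / np.sqrt(2)

    # Multiplicative phase noise and additive white Gaussian noise.
    ebn0 = 10 ** (ebn0_db / 10)
    noise_std = np.sqrt(1 / (2 * 2 * ebn0))           # 2 bits/symbol, Es = 1
    phase = rng.normal(0, phase_noise_std_rad, symbols.size)
    rx = symbols * np.exp(1j * phase) + noise_std * (
        rng.normal(size=symbols.size) + 1j * rng.normal(size=symbols.size))

    # Hard decisions and bit error count.
    d0, d1 = (rx.real < 0).astype(int), (rx.imag < 0).astype(int)
    return (np.sum(d0 != b0) + np.sum(d1 != b1)) / n_bits

for snr in (4, 8, 12):
    print(snr, "dB | no phase noise:", qpsk_ber_with_phase_noise(snr, 0.0),
          "| with phase noise:", qpsk_ber_with_phase_noise(snr, 0.15))
```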

Remarks Regarding Queuing Model and Packet Loss Probability for the Traffic with Self-Similar Characteristics

Network management techniques have long been of interest to the networking research community. The queue size plays a critical role in network performance. An adequate queue size maintains Quality of Service (QoS) requirements within a limited network capacity for as many users as possible. Appropriate estimation of the queuing model parameters is crucial both for initial size estimation and during the process of resource allocation. An accurate resource allocation model for the management system increases network utilization. The present paper demonstrates the results of empirical observation of memory allocation for packet-based services.
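
As a baseline for how buffer size relates to loss, the classical M/M/1/K formula can be sketched as below; this Markovian model is only a reference point and is known to underestimate loss for self-similar traffic, which is why empirical sizing of the kind reported here matters.

```python
def mm1k_loss_probability(rho, K):
    """Packet loss probability of an M/M/1/K queue with utilisation rho and
    buffer size K -- a classical Markovian baseline, not the paper's model."""
    if rho == 1.0:
        return 1.0 / (K + 1)
    return (1 - rho) * rho**K / (1 - rho**(K + 1))

# How loss falls with buffer size at 80% utilisation (illustrative numbers only).
for K in (5, 10, 20, 50):
    print(f"K = {K:3d}  loss = {mm1k_loss_probability(0.8, K):.2e}")
```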

Assessing the Function of Light and Color in Architectural View

Light is one of the most important qualitative and symbolic factors and has a special position in architecture and urban development with regard to its practical function. The main function of light, either natural or artificial, is lighting up the environment and the constructional forms, which is called lighting. However, light is also used by architects to redefine urban spaces with regard to three factors: aesthetic, conceptual and symbolic. In architecture and urban development, light has a function beyond lighting up the environment, and designers consider it one of the basic components. The present research aims at studying the function of light and color in the architectural view and their effects on buildings.

A Complexity Measure for Java Bean based Software Components

The traditional software product and process metrics are neither suitable nor sufficient for measuring the complexity of software components, which is ultimately necessary for quality and productivity improvement within organizations adopting CBSE. Researchers have proposed a wide range of complexity metrics for software systems. However, these metrics are not sufficient for components and component-based systems, as they are restricted to module-oriented and object-oriented systems. In this study, it is proposed to measure the complexity of JavaBean software components as a reflection of their quality, so that a component can be adapted accordingly to make it more reusable. The proposed metric involves only the design issues of the component and does not consider packaging and deployment complexity. In this way, the complexity of software components can be kept within certain limits, which in turn helps in enhancing quality and productivity.