Conservation and Repair Works for Traditional Timber Mosque in Malaysia: A Review on Techniques

No building is exempt from defects and deterioration over its life cycle; they are common problems in both newly built and aged buildings. Buildings constructed from wood are particularly affected by deteriorating agents, and serious defects and damage can reduce a building's value. In repair works, it is important to identify the causes and the repair techniques that best suit the condition. This paper reviews the conservation of traditional timber mosques in Malaysia, covering the concepts, principles and approaches of mosque conservation in general. In conservation practice, wood in historic buildings can be conserved using various restoration and conservation techniques, which can be grouped as full and partial replacement, mechanical reinforcement, consolidation by impregnation and reinforcement, paint removal, and wood preservation and insect-invasion control, all aimed at prolonging and extending the function of timber in a building. The review finds that the techniques commonly adopted in timber mosque conservation are conventional ones, and that sound repair practice requires the use of preserved wood only, in order to prevent premature future defects.

A Review of Critical Success Factor in Building Maintenance Management Practice for University Sector

Building maintenance plays an important role among the activities of building operation. Building defects and damage are the 'bread and butter' of building maintenance, since the findings recorded during building inspection largely determine the assessment of building performance; there is no escape route or shortcut around building maintenance work. This study attempts to identify the competitive performance measures that translate Critical Success Factor achievements and satisfactorily meet the university's expectations. The quality and efficiency of building maintenance management operations depend, to some extent, on the building condition information, the expectations of the university sector, and the works carried out for each maintenance activity. This paper reviews the critical success factors in building maintenance management practice for the university sector from four (4) perspectives: (1) customer, (2) internal processes, (3) financial, and (4) learning and growth. Enhancing these perspectives makes it possible to reach the maintenance management goal of a better living environment on university campuses.

Gluten-Free Cookies Enriched with Blueberry Pomace: Optimization of Baking Process

With the aim of improving the nutritional profile and antioxidant capacity of gluten-free cookies, blueberry pomace, a by-product of juice production, was processed into a new food ingredient by drying and grinding and used in a gluten-free cookie formulation. Since the quality of a baked product is highly influenced by the baking conditions, the objective of this work was to optimize the baking time and the thickness of the dough pieces by applying Response Surface Methodology (RSM) in order to obtain the best technological quality of the cookies. The experiments were carried out according to a Central Composite Design (CCD), with dough thickness and baking time as the independent variables, while hardness, color parameters (L*, a* and b* values), water activity, diameter and short/long ratio were the response variables. According to the results of the RSM analysis, a baking time of 13.74 min and a dough thickness of 4.08 mm were found to be optimal for a baking temperature of 170°C. As similar optimal parameters were obtained in a previously conducted experiment based on sensory analysis, RSM can be considered a suitable approach to optimize the baking process.
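
As an aside, the quadratic model fitting and optimum location that RSM performs on a CCD can be sketched in a few lines of Python. The snippet below is a minimal illustration only: the coded design points follow a standard two-factor CCD, but the hardness values and the resulting optimum are placeholder numbers, not the study's data.

```python
import numpy as np

# Hypothetical CCD runs in coded units: x1 = dough thickness, x2 = baking time.
# The hardness values are illustrative placeholders, not the study's measurements.
X_coded = np.array([
    [-1, -1], [1, -1], [-1, 1], [1, 1],                 # factorial points
    [-1.414, 0], [1.414, 0], [0, -1.414], [0, 1.414],   # axial points
    [0, 0], [0, 0], [0, 0], [0, 0], [0, 0],             # centre points
])
y = np.array([31.2, 28.5, 26.9, 25.1, 30.4, 26.2, 29.8, 24.9,
              27.0, 27.3, 26.8, 27.1, 27.2])            # e.g. hardness (N)

# Second-order model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
x1, x2 = X_coded[:, 0], X_coded[:, 1]
D = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
b, *_ = np.linalg.lstsq(D, y, rcond=None)

# Stationary point of the fitted surface (coded units):
# solve [2*b11  b12; b12  2*b22] [x1; x2] = -[b1; b2]
B = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
x_stat = np.linalg.solve(B, -b[1:3])
print("coefficients:", np.round(b, 3))
print("stationary point (coded units):", np.round(x_stat, 3))
```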

Multiple Sequence Alignment Using Optimization Algorithms

Proteins or genes that have similar sequences are likely to perform the same function. One of the most widely used techniques for sequence comparison is sequence alignment. Sequence alignment allows mismatches and insertions/deletions, which represent biological mutations. Sequence alignment is usually performed on only two sequences; multiple sequence alignment is a natural extension of two-sequence alignment, in which the emphasis is on finding an optimal alignment for a group of sequences. Several applicable techniques were examined in this research, from traditional methods such as dynamic programming to widely used stochastic optimization methods such as Genetic Algorithms (GAs) and Simulated Annealing. A framework combining a Genetic Algorithm with Simulated Annealing is presented to solve the Multiple Sequence Alignment problem: the Genetic Algorithm phase explores new regions of the solution space, while Simulated Annealing acts as an alignment improver for any near-optimal solution produced by the GA.
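
To make the GA-then-SA framework concrete, here is a hedged toy sketch in Python: alignments are encoded as gap-position lists, scored with a simple sum-of-pairs match count, explored by a small GA and then refined by Simulated Annealing. The sequences, operators and scoring are deliberately simplified illustrations, not the operators or scoring scheme used in the paper.

```python
import math
import random

SEQS = ["ACGTGCA", "ACGGCA", "AGTGCA"]   # toy sequences; real inputs would be proteins/genes
ALN_LEN = max(len(s) for s in SEQS) + 2  # fixed alignment length, leaving room for gaps

def random_gaps(seq):
    """Choose gap positions so the padded sequence has length ALN_LEN."""
    return sorted(random.sample(range(ALN_LEN), ALN_LEN - len(seq)))

def expand(seq, gaps):
    """Insert '-' at the chosen positions."""
    out, it = [], iter(seq)
    for i in range(ALN_LEN):
        out.append('-' if i in gaps else next(it))
    return ''.join(out)

def score(ind):
    """Sum-of-pairs score: +1 for every identical non-gap pair in a column."""
    rows = [expand(s, g) for s, g in zip(SEQS, ind)]
    total = 0
    for col in zip(*rows):
        for i in range(len(col)):
            for j in range(i + 1, len(col)):
                if col[i] != '-' and col[i] == col[j]:
                    total += 1
    return total

def mutate(ind):
    """Move one gap of one sequence to a new position."""
    ind = [list(g) for g in ind]
    k = random.randrange(len(SEQS))
    gaps = set(ind[k])
    gaps.remove(random.choice(ind[k]))
    gaps.add(random.choice([p for p in range(ALN_LEN) if p not in gaps]))
    ind[k] = sorted(gaps)
    return ind

def crossover(a, b):
    """Per sequence, inherit the gap list from either parent."""
    return [random.choice(pair)[:] for pair in zip(a, b)]

def ga_phase(pop_size=30, generations=100):
    pop = [[random_gaps(s) for s in SEQS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        parents = pop[:pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=score)

def sa_phase(ind, t0=2.0, cooling=0.995, steps=2000):
    """Refine the GA result with a Metropolis acceptance rule."""
    best = cur = ind
    t = t0
    for _ in range(steps):
        cand = mutate(cur)
        d = score(cand) - score(cur)
        if d >= 0 or random.random() < math.exp(d / t):
            cur = cand
            if score(cur) > score(best):
                best = cur
        t *= cooling
    return best

seed = ga_phase()
refined = sa_phase(seed)
for s, g in zip(SEQS, refined):
    print(expand(s, g))
print("sum-of-pairs score:", score(refined))
```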

Trust Building Mechanisms for Electronic Business Networks and Their Relation to eSkills

Globalization, supported by information and communication technologies, changes the rules of competitiveness and increases the significance of information, knowledge and network cooperation. In line with this trend, the need for efficient trust-building tools has emerged. The absence of trust-building mechanisms and strategies has been identified in several studies. Through trust development, participation in e-business networks and the usage of network services will increase and provide SMEs with new economic benefits. This work focuses on the development of effective trust-building strategies for electronic business network platforms. Based on the identification of trust-building mechanisms, a questionnaire-based analysis of their significance and minimum level of requirements was conducted. In the paper, we confirm that trust depends on e-Skills, which play a crucial role in achieving a higher level of trust in more sophisticated and complex trust-building ICT solutions.

Detecting Abnormal ECG Signals Utilising Wavelet Transform and Standard Deviation

The ECG contains very important clinical information about the cardiac activity of the heart. Often the ECG signal needs to be captured over a long period of time in order to identify abnormalities in certain situations. Such a signal, besides being large in volume, is often characterised by low quality due to noise and other influences. In order to extract features, the ECG signal, with its time-varying characteristics, first needs to be preprocessed with the best parameters. It is also useful to identify the specific parts of the long-lasting signal that contain abnormalities and to direct the practitioner to those parts of the signal. In this work we present a method based on the wavelet transform, the standard deviation and a variable threshold which achieves 100% accuracy in identifying the ECG signal peaks and heartbeats, as well as computing the standard deviation, providing a quick reference to abnormalities.
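
A minimal sketch of this kind of pipeline — wavelet preprocessing followed by a windowed mean-plus-k·standard-deviation threshold for peak picking — is given below in Python using the PyWavelets library. The synthetic signal, the db4 wavelet, the universal threshold and k = 2.5 are assumptions for illustration; they are not the parameters or the exact procedure of the paper.

```python
import numpy as np
import pywt

fs = 250                                    # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
# Toy ECG-like signal: sharp "R peaks" once per second plus baseline noise.
ecg = np.zeros_like(t)
ecg[(np.arange(len(t)) % fs) == 0] = 1.0
ecg = np.convolve(ecg, np.hanning(9), mode="same") + 0.05 * np.random.randn(len(t))

# 1) Wavelet denoising: soft-threshold the detail coefficients.
coeffs = pywt.wavedec(ecg, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745            # noise estimate from the finest level
thr = sigma * np.sqrt(2 * np.log(len(ecg)))
coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
clean = pywt.waverec(coeffs, "db4")[: len(ecg)]

# 2) Variable threshold per window: mean + k * standard deviation.
win, k = 2 * fs, 2.5
peaks = []
for start in range(0, len(clean), win):
    seg = clean[start:start + win]
    level = seg.mean() + k * seg.std()
    for i in range(1, len(seg) - 1):
        if seg[i] > level and seg[i] >= seg[i - 1] and seg[i] >= seg[i + 1]:
            if peaks and (start + i - peaks[-1]) < int(0.3 * fs):
                continue                                   # refractory period between beats
            peaks.append(start + i)

rr = np.diff(peaks) / fs                                   # RR intervals in seconds
print(f"{len(peaks)} beats, heart rate ≈ {60 / rr.mean():.1f} bpm, RR std = {rr.std():.3f} s")
```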

Vibration of Functionally Graded Cylindrical Shells under the Effects of Free-Free and Clamped-Clamped Boundary Conditions

In the present work, a study of the vibration of thin cylindrical shells made of a functionally graded material (FGM) composed of stainless steel and nickel is presented. Material properties are graded in the thickness direction of the shell according to a volume-fraction power-law distribution. The objective is to study the natural frequencies, the influence of the constituent volume fractions and the effects of boundary conditions on the natural frequencies of the FG cylindrical shell. The study is carried out using third-order shear deformation shell theory, and the governing equations of motion of the FG cylindrical shell are derived from Hamilton's principle. Results are presented on the frequency characteristics, the influence of the constituent volume fractions and the effects of free-free and clamped-clamped boundary conditions.
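
The volume-fraction power law referred to above is commonly written as follows (this particular form is an assumption, since the abstract does not give the exact expression used):

```latex
V_o(z) = \left(\frac{z}{h} + \frac{1}{2}\right)^{N}, \qquad
P_{\mathrm{eff}}(z) = \bigl(P_o - P_i\bigr)\,V_o(z) + P_i ,
\qquad -\tfrac{h}{2} \le z \le \tfrac{h}{2},
```

where h is the shell thickness, N ≥ 0 is the power-law exponent governing the constituent volume fractions, and P stands for an effective property (e.g. Young's modulus or density) of the outer (o) and inner (i) constituents, here stainless steel and nickel.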

High Performance Liquid Chromatographic Method for Determination of Colistin Sulfate and Its Application in Medicated Premix and Animal Feed

The aim of the present study was to develop and validate an inexpensive and simple high performance liquid chromatographic (HPLC) method for the determination of colistin sulfate. Separation of colistin sulfate was achieved on a ZORBAX Eclipse XDB-C18 column using UV detection at λ=215 nm. The mobile phase was 30 mM sulfate buffer (pH 2.5):acetonitrile (76:24). Excellent linearity (r2=0.998) was found in the concentration range of 25-400 μg/mL. Intra-day and inter-day precisions of the method (%RSD, n=3) were less than 7.9%. The developed and validated method was applied to the determination of the colistin sulfate content in medicated premix and animal feed samples. The recovery of colistin from animal feed was satisfactory, ranging from 90.92 to 93.77%. The results demonstrated that the HPLC method developed in this work is appropriate for the direct determination of colistin sulfate in commercial medicated premixes and animal feed.

Advanced Stochastic Models for Partially Developed Speckle

Speckled images arise when coherent microwave, optical and acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar systems and medical ultrasound systems. Speckle noise is a form of object- or target-induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted by speckle noise are complicated by the nature of the noise and are not as straightforward as detection and estimation in additive noise. In this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this context involves a partially developed speckle model in which an underlying Poisson point process modulates a Gram-Charlier series of Laguerre-weighted exponential functions, resulting in a doubly stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in a closed canonical form. It is observed that as the mean number of scatterers in a resolution cell increases, the probability density function approaches an exponential distribution. This is consistent with fully developed speckle noise, as demonstrated by the Central Limit Theorem.
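
The limiting behaviour noted in the last two sentences can be checked with a simple Monte Carlo random-phasor-sum experiment, sketched below in Python. The unit-amplitude scatterers and Poisson counts are illustrative assumptions; the sketch only reproduces the qualitative trend (speckle contrast approaching 1, the fully developed, exponential-intensity case, as the mean number of scatterers grows), not the Gram-Charlier model derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def speckle_intensity(mean_scatterers, n_cells=20_000):
    """Random-phasor-sum model of one resolution cell, repeated n_cells times.

    The number of scatterers per cell is Poisson distributed; each scatterer
    contributes unit amplitude with a uniformly random phase.  These statistics
    are illustrative assumptions, not the paper's model.
    """
    counts = rng.poisson(mean_scatterers, n_cells)
    intensity = np.empty(n_cells)
    for i, n in enumerate(counts):
        phases = rng.uniform(0, 2 * np.pi, n)
        field = np.sum(np.exp(1j * phases))
        intensity[i] = np.abs(field) ** 2
    return intensity

for nbar in (1, 3, 10, 50):
    I = speckle_intensity(nbar)
    I /= I.mean()
    # For fully developed speckle (exponential intensity pdf) the contrast
    # std(I)/mean(I) equals 1; partially developed speckle deviates from 1.
    print(f"mean scatterers = {nbar:3d}  speckle contrast = {I.std():.3f}")
```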

A New Protocol for Concealed Data Aggregation in Wireless Sensor Networks

A wireless sensor network (WSN) consists of many sensor nodes placed in unattended environments, such as military sites, in order to collect important information. Implementing a secure protocol that prevents the forwarding of forged data and the modification of aggregated data, while keeping the delay and the communication, computation and storage overhead low, is therefore very important. This paper presents a new protocol for concealed data aggregation (CDA). In this protocol, the network is divided into virtual cells, and the nodes within each cell establish a shared key for exchanging concealed data with each other. Because data aggregation in each cell is performed locally and a secure authentication mechanism is implemented, the data aggregation delay is very low and malicious nodes cannot inject false data into the network. To evaluate the performance of the proposed protocol, we present computational models that demonstrate its performance and low overhead.
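
For readers unfamiliar with concealed data aggregation, the core idea — aggregating ciphertexts so that the aggregator never sees the individual readings — can be illustrated with an additively homomorphic stream cipher in the style of Castelluccia-Mykletun-Tsudik. The Python sketch below is only such an illustration: the SHA-256-based keystream, the modulus and the per-cell key handling are assumptions and do not reproduce the authentication mechanism or key establishment of the proposed protocol.

```python
import hashlib

M = 2 ** 32                      # modulus large enough to hold the aggregated sum

def keystream(key: bytes, nonce: int) -> int:
    """Hypothetical per-message keystream derived from the cell's shared key."""
    digest = hashlib.sha256(key + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest[:4], "big")

def encrypt(key, nonce, reading):
    return (reading + keystream(key, nonce)) % M

def aggregate(ciphertexts):
    # Additive homomorphism: the sum of ciphertexts encrypts the sum of readings,
    # so the aggregator never sees an individual value.
    total = 0
    for c in ciphertexts:
        total = (total + c) % M
    return total

def decrypt_sum(key, nonces, aggregated):
    stream_sum = sum(keystream(key, n) for n in nonces) % M
    return (aggregated - stream_sum) % M

# Three nodes in one virtual cell share `cell_key` and report temperature readings.
cell_key = b"per-cell shared key (illustrative)"
readings = [21, 24, 19]
nonces = [101, 102, 103]                     # e.g. per-epoch message counters
cts = [encrypt(cell_key, n, r) for n, r in zip(nonces, readings)]
agg = aggregate(cts)
print("recovered sum:", decrypt_sum(cell_key, nonces, agg), "expected:", sum(readings))
```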

An Approach for Reducing the End-to-end Delay and Increasing Network Lifetime in Mobile Ad Hoc Networks

A mobile ad hoc network (MANET) is a collection of mobile devices that form a communication network with no pre-existing wiring or infrastructure. Multiple routing protocols have been developed for MANETs. As MANETs gain popularity, their need to support real-time applications is growing as well. Such applications have stringent quality of service (QoS) requirements, such as throughput, end-to-end delay and energy. Due to the dynamic topology and bandwidth constraints, supporting QoS is a challenging task, and QoS-aware routing is an important building block for QoS support. The primary goal of a QoS-aware protocol is to determine a path from source to destination that satisfies the QoS requirements. This paper proposes a new energy- and delay-aware protocol called energy and delay aware TORA (EDTORA), based on an extension of the Temporally Ordered Routing Algorithm (TORA). Energy and delay verification of the query packet is performed at each node. Simulation results show that the proposed protocol outperforms TORA in terms of network lifetime, packet delivery ratio and end-to-end delay.
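
As a rough illustration of the per-node check mentioned above (each node verifying energy and delay before propagating the route query), consider the following Python sketch. The packet fields, thresholds and pruning rule are hypothetical; the abstract does not specify the exact verification procedure used by EDTORA.

```python
from dataclasses import dataclass

@dataclass
class QueryPacket:
    source: str
    destination: str
    accumulated_delay_ms: float      # delay gathered along the path so far
    min_residual_energy_j: float     # lowest node energy seen along the path

def forward_query(pkt: QueryPacket, node_energy_j: float, link_delay_ms: float,
                  delay_bound_ms: float = 150.0, energy_floor_j: float = 5.0):
    """Decide whether this node may propagate the TORA-style route query.

    The packet is forwarded only if the path still meets the QoS delay bound and
    the node has enough residual energy; otherwise the query is dropped and this
    branch is excluded from route construction.  All values are illustrative.
    """
    new_delay = pkt.accumulated_delay_ms + link_delay_ms
    if node_energy_j < energy_floor_j or new_delay > delay_bound_ms:
        return None                               # prune this branch of the query flood
    pkt.accumulated_delay_ms = new_delay
    pkt.min_residual_energy_j = min(pkt.min_residual_energy_j, node_energy_j)
    return pkt

q = QueryPacket("S", "D", accumulated_delay_ms=40.0, min_residual_energy_j=12.0)
print(forward_query(q, node_energy_j=8.0, link_delay_ms=25.0))
```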

Molecular Evolutionary Analysis of Yeast Protein Interaction Network

To understand life as a biological system, an evolutionary perspective is indispensable. Protein interaction data are rapidly accumulating and are suitable for system-level evolutionary analysis. We have analyzed the yeast protein interaction network by both mathematical and biological approaches. In this poster presentation, we infer the evolutionary birth periods of yeast proteins by reconstructing phylogenetic profiles. It has been thought that hub proteins, which have a high connection degree, are evolutionarily old, but our analysis showed that hub proteins are in fact evolutionarily new. We also examined the evolutionary processes of protein complexes, and found that the member proteins of a complex tend to have appeared in the same evolutionary period. Our results suggest that the protein interaction network evolved by modules that form functional units. We also reconstructed standardized phylogenetic trees and calculated the evolutionary rates of yeast proteins, which showed no obvious correlation between the evolutionary rates and the connection degrees of yeast proteins.

Hexagonal Honeycomb Sandwich Plate Optimization Using Gravitational Search Algorithm

Honeycomb sandwich panels are increasingly used in the construction of space vehicles because of their outstanding strength, stiffness and light-weight properties. However, the use of honeycomb sandwich plates brings difficulties to the design process as a result of the large number of design variables involved, including the composite material design, shape and geometry. Hence, this work presents an optimal design of hexagonal honeycomb sandwich structures subjected to the space environment. The optimization process is performed using a set of algorithms including the gravitational search algorithm (GSA), and numerical results are obtained and presented for each of them. The results obtained by the GSA are much better than those of the other algorithms used in this study.
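
A minimal implementation of the GSA update rules (masses derived from fitness, a decaying gravitational constant, and force-based velocity updates) is sketched below in Python on a generic objective. The honeycomb-specific design variables, constraints and objective of the study are not reproduced; the test function and parameter values are assumptions.

```python
import numpy as np

def gsa_minimize(objective, bounds, n_agents=30, iters=200, g0=100.0, alpha=20.0, seed=0):
    """Minimal gravitational search algorithm (Rashedi et al.-style update rules)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = lo.size
    x = rng.uniform(lo, hi, (n_agents, dim))
    v = np.zeros_like(x)
    best_x, best_f = None, np.inf

    for t in range(iters):
        f = np.apply_along_axis(objective, 1, x)
        if f.min() < best_f:
            best_f, best_x = f.min(), x[f.argmin()].copy()
        # Masses from fitness: the best agent gets mass ~1, the worst ~0.
        worst, best = f.max(), f.min()
        m = (f - worst) / (best - worst + 1e-12)
        M = m / (m.sum() + 1e-12)
        G = g0 * np.exp(-alpha * t / iters)            # decaying gravitational constant
        # Resulting acceleration on each agent from every other agent.
        acc = np.zeros_like(x)
        for i in range(n_agents):
            diff = x - x[i]
            dist = np.linalg.norm(diff, axis=1) + 1e-12
            acc[i] = np.sum(rng.random((n_agents, 1)) * G *
                            (M[:, None] * diff) / dist[:, None], axis=0)
        v = rng.random(x.shape) * v + acc
        x = np.clip(x + v, lo, hi)
    return best_x, best_f

# Usage on a simple test function (sphere); a real run would plug in the
# honeycomb mass/stiffness objective and its geometric design variables.
xb, fb = gsa_minimize(lambda z: np.sum(z ** 2), bounds=[(-5, 5)] * 4)
print(np.round(xb, 4), fb)
```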

Design of Smith-like Predictive Controller with Communication Delay Adaptation

This paper addresses the design of a predictive networked controller with adaptation to the communication delay. The networked control system contains random delays from sensor to controller and from controller to actuator. The proposed predictive controller includes an adaptation loop which decreases the influence of the communication delay on the control performance. The predictive controller also contains a filter which improves the robustness of the control system. The performance of the proposed adaptive predictive controller is demonstrated by simulation results, in comparison with a PI controller and a predictive controller that assumes a constant delay.
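
The structure of a Smith-type predictor, in which the controller is fed the measured output corrected by the difference between a delay-free internal model and a delayed internal model, can be sketched in a short discrete-time simulation. Everything below (first-order plant, PI gains, a fixed delay estimate) is an illustrative assumption; the paper's controller additionally adapts the delay estimate and includes a robustness filter, which this sketch omits.

```python
import numpy as np

# First-order plant y[k+1] = a*y[k] + b*u[k-d] with an input (network) delay of d samples.
a, b, d = 0.95, 0.05, 12
kp, ki = 2.0, 0.15                    # illustrative PI gains
N, r = 300, 1.0                       # simulation length, setpoint

y = np.zeros(N); ym = np.zeros(N); ym_delayed = np.zeros(N)
u = np.zeros(N); integ = 0.0
d_hat = d                             # the paper's adaptation loop would update this estimate

for k in range(N - 1):
    # Smith-predictor feedback: measured output plus (delay-free model - delayed model);
    # with an exact model this equals the delay-free model output, removing the delay
    # from the loop seen by the PI controller.
    fb = y[k] + ym[k] - ym_delayed[k]
    e = r - fb
    integ += e
    u[k] = kp * e + ki * integ

    u_delayed = u[k - d] if k >= d else 0.0
    u_model_delayed = u[k - d_hat] if k >= d_hat else 0.0
    y[k + 1] = a * y[k] + b * u_delayed                           # real plant (with delay)
    ym[k + 1] = a * ym[k] + b * u[k]                              # internal model, no delay
    ym_delayed[k + 1] = a * ym_delayed[k] + b * u_model_delayed   # internal model, estimated delay

print("final output:", round(float(y[-1]), 3), "setpoint:", r)
```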

Application of Central Composite Design-Based Response Surface Methodology in Parameter Optimization of Cellulase Production Using Agricultural Waste

Response Surface Methodology (RSM) is a powerful and efficient mathematical approach widely applied in the optimization of cultivation processes. Cellulase enzyme production by Trichoderma reesei RutC30, using the agricultural wastes rice straw and banana fiber as carbon sources, was investigated. In this work, a sequential, statistically designed optimization strategy was employed to enhance the production of cellulase through submerged cultivation. A fractional factorial design (2^(6-2)) was applied to elucidate the process parameters that significantly affect cellulase production. Temperature, substrate concentration, inducer concentration, pH, inoculum age and agitation speed were identified as important process parameters affecting cellulase synthesis, with the concentrations of lignocellulose and lactose (inducer) in the cultivation medium found to be the most significant factors. The steepest ascent method was used to locate the optimal domain, and a Central Composite Design (CCD) was used to estimate the quadratic response surface from which the factor levels for maximum cellulase production were determined.
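
For illustration, a 2^(6-2) fractional factorial design of the kind used for screening can be generated as follows. The generator choice E = ABC, F = BCD (a common resolution-IV construction) and the coded levels are assumptions; the paper does not state which generators or level settings were used.

```python
import itertools

factors = ["temperature", "substrate_conc", "inducer_conc", "pH", "inoculum_age", "agitation"]

# 2^(6-2) fractional factorial: four base factors run at coded levels -1/+1, and the
# remaining two are defined by the generators E = ABC and F = BCD (assumed here).
runs = []
for a, b, c, d in itertools.product((-1, 1), repeat=4):
    e = a * b * c
    f = b * c * d
    runs.append(dict(zip(factors, (a, b, c, d, e, f))))

print(len(runs), "runs")          # 16 runs instead of the 64 of a full 2^6 design
for run in runs[:4]:
    print(run)
```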

A Neural Network Approach in Predicting the Blood Glucose Level for Diabetic Patients

Diabetes mellitus is a chronic metabolic disorder in which improper management of the blood glucose level in diabetic patients leads to the risk of heart attack, kidney disease and renal failure. This paper attempts to enhance the accuracy of predicting the advancing blood glucose levels of diabetic patients by combining principal component analysis and a wavelet neural network. The proposed system makes separate blood glucose predictions for the morning, afternoon, evening and night intervals, using a dataset from one patient covering a period of 77 days. Comparisons of the prediction accuracy with other neural network models using the same dataset are made. The comparison results show an overall improvement in accuracy, which indicates the effectiveness of the proposed system.
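
A hedged sketch of the overall idea — compressing correlated glucose-history features with PCA and feeding them to a small feed-forward network — is shown below using scikit-learn. The data are synthetic placeholders, and an ordinary MLP regressor stands in for the paper's wavelet neural network, which differs in its activation functions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic stand-in for the real records: each row holds recent glucose history and
# context features for one morning interval; the target is the next reading.
# (The paper's actual 77-day single-patient dataset is not reproduced here.)
X = rng.normal(size=(77, 12))
y = 6.0 + X[:, 0] * 0.8 - X[:, 3] * 0.5 + rng.normal(scale=0.3, size=77)

# PCA compresses the correlated inputs; an ordinary MLP stands in for the wavelet
# neural network of the paper.
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=4),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)
model.fit(X[:60], y[:60])                      # train on the earlier days
pred = model.predict(X[60:])                   # predict the remaining days
rmse = np.sqrt(np.mean((pred - y[60:]) ** 2))
print(f"hold-out RMSE: {rmse:.2f} mmol/L (synthetic data)")
```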

Classification of Non-Stationary Signals Using Ben Wavelet and Artificial Neural Networks

The automatic classification of non-stationary signals is an important practical goal in several domains. An essential classification task is to allocate the incoming signal to the group associated with the kind of physical phenomena producing it. In this paper, we present a modular system composed of three blocks: 1) representation, 2) dimensionality reduction and 3) classification. The originality of our work consists in the use of a new wavelet, called the "Ben wavelet", in the representation stage. For the dimensionality reduction, we propose a new algorithm based on random projection and principal component analysis.
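
The dimensionality-reduction stage can be sketched as follows in Python: a Gaussian random projection gives a cheap first reduction, and PCA is then applied to the projected data. The ordering, sizes and random placeholder data are assumptions; the abstract does not specify how the paper's algorithm combines the two steps.

```python
import numpy as np

rng = np.random.default_rng(42)

def random_projection_then_pca(X, k_rp=64, k_pca=10):
    """Two-stage dimensionality reduction sketch: Gaussian random projection,
    then PCA on the projected data.  The exact combination used in the paper
    is not specified, so this ordering is an assumption."""
    n, d = X.shape
    R = rng.normal(scale=1.0 / np.sqrt(k_rp), size=(d, k_rp))   # Johnson-Lindenstrauss style
    Z = X @ R                                                    # cheap first reduction
    Zc = Z - Z.mean(axis=0)                                      # centre before PCA
    _, _, Vt = np.linalg.svd(Zc, full_matrices=False)
    return Zc @ Vt[:k_pca].T                                     # top principal components

# Example: 200 "signals" represented by 1024 wavelet coefficients each
# (placeholder for the Ben-wavelet representation stage).
X = rng.normal(size=(200, 1024))
features = random_projection_then_pca(X)
print(features.shape)          # (200, 10) low-dimensional features for the classifier
```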

Monte Carlo Analysis and Fuzzy Sets for Uncertainty Propagation in SIS Performance Assessment

The objective of this work is the probabilistic performance evaluation of safety instrumented systems (SIS), i.e. the average probability of dangerous failure on demand (PFDavg) and the average frequency of dangerous failure per hour (PFH), taking into account the uncertainties in the different parameters that come into play: failure rate (λ), common cause failure proportion (β), diagnostic coverage (DC)... This leads to an accurate and safe assessment of the safety integrity level (SIL) inherent to the safety function performed by such systems, in keeping with the requirements of the IEC 61508 standard with respect to handling uncertainty. To do this, we propose an approach that combines (1) Monte Carlo simulation and (2) fuzzy sets: the first method is appropriate where representative statistical data are available (using the probability density functions of the relevant parameters), while the latter applies in cases characterized by vague and subjective information (using membership functions). The proposed approach is fully supported by a suitable computer code.
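
The Monte Carlo half of the approach can be illustrated with a small Python sketch that propagates parameter uncertainty through the familiar single-channel low-demand approximation PFDavg ≈ λ_DU·T_I/2 and reads off the IEC 61508 SIL band. The architecture, the lognormal/uniform parameter distributions and the numbers are assumptions for illustration, not the systems or data analysed in the paper; the common-cause factor β, relevant to redundant architectures, is omitted.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Illustrative parameter uncertainty (assumed distributions, not the paper's data).
lam = rng.lognormal(mean=np.log(2e-6), sigma=0.4, size=n)   # dangerous failure rate (1/h)
dc = rng.uniform(0.55, 0.75, size=n)                        # diagnostic coverage
ti = 8760.0                                                  # proof-test interval (h)

# Single-channel (1oo1) low-demand approximation: PFDavg ≈ λ_DU * T_I / 2,
# with λ_DU = λ * (1 - DC).  Common-cause (β) terms only appear for redundancy.
pfd = lam * (1 - dc) * ti / 2
p5, p95 = np.percentile(pfd, [5, 95])

def sil_band(p):
    """Low-demand SIL bands of IEC 61508: SIL n holds when 1e-(n+1) <= PFDavg < 1e-n."""
    for n_sil in (4, 3, 2, 1):
        if 10.0 ** -(n_sil + 1) <= p < 10.0 ** -n_sil:
            return n_sil
    return 0

print(f"PFDavg mean = {pfd.mean():.2e}, 90% interval = [{p5:.2e}, {p95:.2e}]")
print("SIL claimed from the 95th percentile (conservative):", sil_band(p95))
```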

Efficient Implementation of Serial and Parallel Support Vector Machine Training with a Multi-Parameter Kernel for Large-Scale Data Mining

This work deals with aspects of support vector learning for large-scale data mining tasks. Based on a decomposition algorithm that can be run in serial and parallel mode, we introduce a data transformation that allows the use of an expensive generalized kernel without additional cost. In order to speed up the decomposition algorithm, we analyze the problem of working set selection for large data sets and examine the influence of the working set size on the scalability of the parallel decomposition scheme. Our modifications and settings lead to improved support vector learning performance and thus allow the use of extensive parameter search methods to optimize classification accuracy.
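
Working set selection is usually based on the gradient of the SVM dual; the classical first-order rule (the "maximal violating pair") is sketched below in Python for context. This is a generic textbook rule, not the selection strategy, working-set sizes or kernel transformation studied in the paper.

```python
import numpy as np

def max_violating_pair(alpha, grad, y, C, eps=1e-3):
    """First-order working-set selection for SVM decomposition.

    `grad` is the gradient of the dual objective 0.5*a'Qa - e'a (i.e. Q @ alpha - 1).
    Returns the pair (i, j) to optimise next, or None once the KKT gap drops below eps.
    """
    up = np.flatnonzero(((y == 1) & (alpha < C)) | ((y == -1) & (alpha > 0)))
    low = np.flatnonzero(((y == 1) & (alpha > 0)) | ((y == -1) & (alpha < C)))
    score = -y * grad
    i = up[np.argmax(score[up])]
    j = low[np.argmin(score[low])]
    return None if score[i] - score[j] < eps else (i, j)

# Tiny usage example: linear kernel on four toy points, all alphas still at zero.
X = np.array([[0.0, 1.0], [1.0, 1.0], [0.0, -1.0], [-1.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
alpha, C = np.zeros(4), 1.0
Q = (y[:, None] * y[None, :]) * (X @ X.T)
grad = Q @ alpha - 1.0
print(max_violating_pair(alpha, grad, y, C))     # -> a most-violating (i, j) pair
```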

Supportability Analysis in LCI Environment

Starting from the basic pillars of supportability analysis, this paper examines its characteristics in an LCI (Life Cycle Integration) environment. The research methodology comprises a review of modern logistics engineering literature, with the objective of collecting and synthesizing the knowledge relating to standards of supportability design in an e-logistics environment. The results show that the LCI framework has properties that are fully compatible with the requirements of simultaneous logistics support and product-service bundle design. The proposed approach contributes to a more comprehensive and efficient supportability design process. Contributions are also reflected in a greater consistency of the collected data, the automated creation of reports suitable for different analyses, and the possibility of customizing them according to customer needs. A further advantage of this approach is its practical use in real time. In a broader sense, LCI allows the integration of enterprises on a worldwide basis, facilitating electronic business.