Investigation of Crack Parameters at a V-Notch Using the Photoelasticity Method

V-notches are among the most likely sites for crack initiation in mechanical parts. The characteristics of a crack at the notch tip are influenced by the opening angle, tip radius, and depth of the V-notch. In this study, the effect of the V-notch opening angle on the stress intensity factor (SIF) and T-stress of a crack emanating from the notch is investigated. Experiments were carried out for different opening angles and various crack lengths under mode I loading using the photoelasticity method. The results show that, for a constant crack length, the SIF and T-stress decrease as the opening angle increases. Moreover, the effect of the V-notch angle is stronger for short cracks than for long cracks; as the crack length increases, the notch effect becomes negligible and the crack behaves like that of a single-edge-cracked specimen. Finally, the results were compared with numerical finite element analysis, and good agreement was observed.
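For reference, photoelastic determination of these two crack parameters is commonly based on the standard two-parameter description of the mode I crack-tip stress field, in which the singular term is scaled by the stress intensity factor and the first non-singular term is the T-stress (the symbols and angular functions below are the textbook ones, not notation taken from the paper):

```latex
\sigma_{ij}(r,\theta) \;=\; \frac{K_I}{\sqrt{2\pi r}}\, f_{ij}(\theta)
\;+\; T\,\delta_{1i}\,\delta_{1j} \;+\; O\!\left(r^{1/2}\right)
```

Here $K_I$ is the mode I stress intensity factor, $T$ is the constant T-stress acting parallel to the crack, $(r,\theta)$ are polar coordinates centred at the crack tip, and $f_{ij}(\theta)$ are the known angular functions.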

Preparation of Polylactic Acid Graft Polyvinyl Acetate Compatibilizers for 50/50 Starch/PLLA Blending

Polylactic acid-g-polyvinyl acetate (PLLA-g-PVAc) was used as a compatibilizer for a 50/50 starch/PLLA blend. PLLA-g-PVAc copolymers with different mol% PVAc contents were prepared by grafting PVAc onto the PLLA backbone via free radical polymerization in a solution process. Various conditions, such as the type and amount of initiator, monomer concentration, and polymerization time and temperature, were studied. The results showed that the highest PVAc grafting (16 mol%) was achieved by conducting graft copolymerization in toluene at 110°C for 10 h using DCP as the initiator. The chemical structure of the PVAc-grafted PLLA was confirmed by 1H NMR. Modified starch and PLLA were blended in the presence of compatibilizers with different amounts and mol% PVAc using an internal mixer at 160°C for 15 min. The effects of the PVAc content and the amount of compatibilizer on the mechanical properties of the polymer blend were studied. The results revealed that blends containing the compatibilizer with higher PVAc grafting content exhibited higher tensile strength and tensile modulus than those with lower PVAc grafting content. The optimum amount of compatibilizer was found to be in the range of 0.5-1.0 wt%, depending on the mol% PVAc.

A Discretizing Method for Reliability Computation in Complex Stress-strength Models

This paper proposes, implements and evaluates an original discretization method for continuous random variables, in order to estimate the reliability of systems for which stress and strength are defined as complex functions and whose reliability is not derivable through analytic techniques. The method is compared with two other discretization approaches from the literature through a comparative study involving four engineering applications. The results show that the proposed method is very efficient in terms of the closeness of its estimates to the true (simulated) reliability. In the study we analyzed both normal and non-normal distributions for the random variables; the method is, in principle, applicable to any parametric family.
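The abstract does not give the paper's specific discretization rule, but the general idea can be sketched as follows: each continuous variable is replaced by a small set of support points with probabilities, and the reliability R = P(Strength > Stress) is obtained by summing over the discretized joint support. This is a minimal sketch under assumed distributions and an assumed equal-probability discretization, not the scheme proposed in the paper; the stress/strength functions and parameters are hypothetical.

```python
# Sketch: estimate R = P(Strength > Stress) when stress and strength are
# functions of underlying continuous variables. The discretization rule
# (equal-probability slices, mid-quantile points) is a generic illustration.
import numpy as np
from scipy import stats

def discretize(dist, n=15):
    """Return n support points and equal probabilities for a continuous rv."""
    probs = np.full(n, 1.0 / n)
    qs = (np.arange(n) + 0.5) / n      # mid-quantile of each slice (assumption)
    return dist.ppf(qs), probs

# Hypothetical example: stress = P / A, strength = Sy
P  = stats.norm(100, 10)   # load
A  = stats.norm(20, 1)     # cross-sectional area
Sy = stats.norm(6.5, 0.8)  # yield strength (same units as P/A)

(p_pts, p_pr), (a_pts, a_pr), (s_pts, s_pr) = map(discretize, (P, A, Sy))

R = 0.0
for p, wp in zip(p_pts, p_pr):
    for a, wa in zip(a_pts, a_pr):
        stress = p / a                         # "complex" function (simple here)
        R += wp * wa * s_pr[s_pts > stress].sum()

# Monte Carlo check of the same quantity
n = 200_000
mc = np.mean(Sy.rvs(n) > P.rvs(n) / A.rvs(n))
print(f"discretized R = {R:.4f}, Monte Carlo R = {mc:.4f}")
```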

Innovative Techniques for Characterization of Nonwoven Insulation Materials Embedded with Aerogel

The major objective of this study is to assess the potential of newly fabricated equipment for studying the thermal properties of nonwoven textile fabrics treated with aerogel at subzero temperatures. Thermal conductivity was calculated using the empirical relation given by Fourier's law. The relationship between the thermal conductivity and thermal resistance of the samples was studied at various environmental temperatures, set in the climatic temperature system between +25°C and -25°C. The newly fabricated equipment was found to be suitable for measurements at subzero temperatures. This field of measurement is still being developed and will be the subject of further research aimed at covering a wider range of thermal characteristics.
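As context, the Fourier's-law evaluation mentioned above reduces, for one-dimensional steady-state conduction through a flat sample, to the following relations (the symbols here, in particular the sample thickness $h$, are illustrative and not taken from the paper):

```latex
q \;=\; -kA\,\frac{dT}{dx}
\quad\Longrightarrow\quad
k \;=\; \frac{Q\,h}{A\,\Delta T},
\qquad
R \;=\; \frac{h}{k}
```

where $Q$ is the steady-state heat flow through the sample, $A$ its area, $h$ its thickness, $\Delta T$ the temperature difference across it, $k$ the thermal conductivity, and $R$ the areal thermal resistance.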

Face Detection Using Variance-Based Haar-Like Features and SVM

This paper proposes a new approach to the problem of real-time face detection. The proposed method combines the primitive Haar-like feature with a variance value to construct a new feature, the so-called variance-based Haar-like feature. With this new feature, a face in an image can be represented by a small number of features. We used SVM instead of AdaBoost for training and classification. We built a database containing 5,000 face samples and 10,000 non-face samples extracted from real images for learning purposes. The 5,000 face samples include images captured under widely varying lighting conditions. Experiments showed that a face detection system using the variance-based Haar-like feature and SVM can be much more efficient than one using the primitive Haar-like feature and AdaBoost. We tested our method on two face databases and one non-face database. We obtained a correct detection rate of 96.17% on the YaleB face database, which is 4.21% higher than that obtained using the primitive Haar-like feature and AdaBoost.
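Both ingredients of the proposed feature, a primitive Haar-like response and the pixel variance of a detection window, can be computed in constant time from integral images. The sketch below shows these two standard computations; how the paper actually combines them into the variance-based feature is not specified in the abstract, and the 24x24 window is an assumption.

```python
# Sketch: Haar-like response and window variance from integral images.
import numpy as np

def integral_images(img):
    ii  = img.cumsum(0).cumsum(1)          # sum of pixels
    ii2 = (img ** 2).cumsum(0).cumsum(1)   # sum of squared pixels
    return ii, ii2

def box_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] using an integral image."""
    total = ii[r1 - 1, c1 - 1]
    if r0 > 0: total -= ii[r0 - 1, c1 - 1]
    if c0 > 0: total -= ii[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0: total += ii[r0 - 1, c0 - 1]
    return total

def window_variance(ii, ii2, r0, c0, r1, c1):
    n = (r1 - r0) * (c1 - c0)
    mean = box_sum(ii, r0, c0, r1, c1) / n
    return box_sum(ii2, r0, c0, r1, c1) / n - mean ** 2

def two_rect_haar(ii, r0, c0, r1, c1):
    """Vertical two-rectangle Haar-like feature: left half minus right half."""
    cm = (c0 + c1) // 2
    return box_sum(ii, r0, c0, r1, cm) - box_sum(ii, r0, cm, r1, c1)

window = np.random.rand(24, 24)            # hypothetical 24x24 detection window
ii, ii2 = integral_images(window)
feat = two_rect_haar(ii, 0, 0, 24, 24)
var  = window_variance(ii, ii2, 0, 0, 24, 24)
print(feat, var)                           # candidate inputs to the SVM
```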

Strengthening of RC Beams with Large Openings in Shear by CFRP Laminates: 2D Nonlinear FE Analysis

To date, theoretical studies concerning the Carbon Fiber Reinforced Polymer (CFRP) strengthening of RC beams with openings have been rather limited. In addition, the numerical analyses presented so far have mostly simulated the behaviour of solid beams strengthened with FRP materials. In this paper, a two-dimensional nonlinear finite element analysis is presented and validated against laboratory test results for six RC beams. All beams had the same rectangular cross-section geometry and were loaded under four-point bending. The crack patterns predicted by the finite element models show good agreement with the crack patterns of the experimental beams. The load versus midspan deflection curves of the finite element models were stiffer than those of the experimental beams. A possible reason is the perfect-bond assumption between the concrete and the steel reinforcement.

A New Configurable Decimation Filter Using Pascal's Triangle Theorem

This paper presents a new configurable decimation filter for sigma-delta modulators. The filter employs Pascal's triangle theorem to build the coefficients of non-recursive decimation filters, and it can be connected to the back end of various modulators with different output accuracies. In this work, two methods are presented and compared from the viewpoint of area occupation. The first method uses memory, while the second employs the Pascal's-triangle method, aiming to reduce the number of required gates. Xilinx ISE v10 is used for implementation and verification of the filter.
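One well-known connection behind this idea: for a decimate-by-2 stage, the taps of a non-recursive sinc^K (comb) decimation filter, i.e. (1 + z^-1)^K, are exactly the K-th row of Pascal's triangle. The sketch below generates such a row and applies it to a 1-bit modulator stream; the configuration (decimation by 2, filter order, 1-bit input) is an illustrative assumption, not the paper's architecture.

```python
# Sketch: Pascal's-triangle coefficients as taps of a non-recursive
# decimate-by-2 sinc^K filter stage.
import numpy as np

def pascal_row(k):
    """Return row k of Pascal's triangle, i.e. the taps of (1 + z^-1)^k."""
    row = [1]
    for _ in range(k):
        row = [a + b for a, b in zip([0] + row, row + [0])]
    return np.array(row, dtype=np.int64)

def decimate_by_2(x, order):
    taps = pascal_row(order)                 # e.g. order 4 -> [1 4 6 4 1]
    y = np.convolve(x, taps) / taps.sum()    # low-pass with unity DC gain
    return y[::2]                            # keep every second sample

bitstream = np.random.randint(0, 2, 1024)    # hypothetical 1-bit SDM output
print(pascal_row(4), decimate_by_2(bitstream, 4)[:5])
```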

Enhanced Character Based Algorithm for Small Parsimony

A phylogenetic tree is a graphical representation of the evolutionary relationships among three or more genes or organisms. Such trees show the relatedness of data sets, the divergence times of species or genes, and the nature of their common ancestors. The quality of a phylogenetic tree is assessed using a parsimony criterion, and various approaches have been proposed for constructing the most parsimonious trees. This paper is concerned with calculating and minimizing the number of state changes required, a task addressed by small parsimony algorithms. It proposes an enhanced small parsimony algorithm that gives a better score, based on the number of evolutionary changes needed to produce the observed sequence changes on the tree, and also reconstructs the ancestors of the given input.
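The abstract does not detail which baseline the enhancement starts from; for orientation, the classical Fitch small parsimony algorithm on a rooted binary tree is sketched below. It returns the candidate ancestral states at the root and the minimum number of state changes for one character; the tiny tree and leaf states are hypothetical examples.

```python
# Sketch of the classical Fitch small-parsimony algorithm (textbook baseline,
# not the paper's enhanced algorithm).

def fitch(node, leaf_states):
    """node: a leaf name (str) or a (left, right) tuple.
    Returns (candidate state set, minimum change count)."""
    if isinstance(node, str):                      # leaf
        return {leaf_states[node]}, 0
    left, right = node
    ls, lc = fitch(left, leaf_states)
    rs, rc = fitch(right, leaf_states)
    inter = ls & rs
    if inter:                                      # agreement: no extra change
        return inter, lc + rc
    return ls | rs, lc + rc + 1                    # disagreement: one change

# Hypothetical example: tree ((A,B),(C,D)) with one nucleotide per leaf
tree = (("A", "B"), ("C", "D"))
states = {"A": "T", "B": "C", "C": "A", "D": "C"}
candidates, changes = fitch(tree, states)
print(candidates, changes)   # root candidate states and parsimony score
```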

Fast Dummy Sequence Insertion Method for PAPR Reduction in WiMAX Systems

In the literature, many studies have proposed methods to reduce the peak-to-average power ratio (PAPR). Among these, Dummy Sequence Insertion (DSI) is one of the most attractive methods for WiMAX systems because it does not require side information to be transmitted along with the user data. However, conventional DSI methods find the dummy sequence through an iterative procedure that is repeated until the PAPR falls below a desired threshold. This causes a significant delay in finding the dummy sequence and also affects the overall performance of WiMAX systems. In this paper, a new DSI-based method is proposed that finds the dummy sequence without an iterative procedure. The fast DSI method can reduce the PAPR without either delay or side information. The simulation results confirm that the proposed method achieves PAPR performance similar to that of the other methods without any delay. In addition, simulations of a WiMAX system with adaptive modulation are carried out to examine the use of the proposed method under various fading scenarios. The results suggest that WiMAX designers should adopt a modified Signal to Noise Ratio (SNR) criterion for adaptation.
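For reference, the quantity that DSI keeps below a threshold is the PAPR of the time-domain OFDM symbol, defined as the peak instantaneous power divided by the average power. The sketch below computes it for one symbol; the data/dummy split, QPSK mapping, and dummy values are illustrative placeholders, not the paper's non-iterative selection rule.

```python
# Sketch: PAPR of one OFDM symbol containing data and dummy subcarriers.
import numpy as np

def papr_db(freq_symbol):
    # In practice the symbol is usually oversampled (e.g. 4x) before the IFFT
    # for a more accurate peak estimate; omitted here for brevity.
    x = np.fft.ifft(freq_symbol)
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

n_data, n_dummy = 192, 8                                      # assumed split
data  = np.random.choice([1+1j, 1-1j, -1+1j, -1-1j], n_data)  # QPSK data
dummy = np.random.choice([1+1j, -1-1j], n_dummy)              # placeholder dummy seq.
symbol = np.concatenate([data, dummy])
print(f"PAPR = {papr_db(symbol):.2f} dB")
```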

Analyzing the Factors Affecting Passenger Car Breakdowns Using a Com-Poisson GLM

The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of explanatory factors has been a challenging problem. In this paper, we aim to estimate the effects of various factors on the number of breakdowns experienced by passenger cars, based on a study performed in Mauritius over one year. We observe that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using a Com-Poisson regression model, with the parameters estimated by a quasi-likelihood approach. The under-dispersion parameter is estimated to be 2.14, justifying the appropriateness of the Com-Poisson distribution for modelling the under-dispersed count responses recorded in this study.
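For reference, the Com-Poisson (Conway-Maxwell-Poisson) distribution generalizes the Poisson by a dispersion parameter $\nu$:

```latex
P(Y = y) \;=\; \frac{\lambda^{y}}{(y!)^{\nu}\, Z(\lambda,\nu)},
\qquad
Z(\lambda,\nu) \;=\; \sum_{j=0}^{\infty} \frac{\lambda^{j}}{(j!)^{\nu}},
\qquad y = 0,1,2,\dots
```

with $\nu > 1$ giving under-dispersion (as with the estimate of 2.14 reported here), $\nu = 1$ recovering the Poisson, and $\nu < 1$ giving over-dispersion. In the regression setting, $\lambda$ is typically linked to the covariates through a log link.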

Real-time Target Tracking Using a Pan and Tilt Platform

In recent years, there has been increasing interest in efficient tracking systems for surveillance applications. Many of the proposed techniques are designed for static camera environments. When the camera is moving, tracking moving objects becomes more difficult and many techniques fail to detect and track the desired targets. The problem becomes more complex when a specific object must be tracked in real time using a moving Pan and Tilt camera system that keeps the target within the image. This type of tracking is of high importance in surveillance applications: when a target is detected in a given zone, the ability to track it automatically and continuously, keeping it within the image until action is taken, is very important for security personnel working at sensitive sites. This work presents a real-time tracking system permitting the detection and continuous tracking of targets using a Pan and Tilt camera platform. A novel and efficient approach for dealing with occlusions is presented. In addition, a new intelligent forgetting factor is introduced to account for target shape variations and to avoid learning undesired objects. Tests conducted in outdoor operational scenarios show the efficiency and robustness of the proposed approach.
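The paper's "intelligent" forgetting factor is not specified in the abstract; the sketch below only illustrates the generic idea such a factor builds on: an exponentially forgetting appearance update that follows slow target shape changes but freezes learning when the match score is poor (for example during occlusion). The gating rule, threshold, and patch sizes are illustrative assumptions.

```python
# Sketch: forgetting-factor appearance update with a simple learning gate.
import numpy as np

def update_template(template, observation, score, alpha=0.05, min_score=0.6):
    """Blend the new observation into the template only when the match is trusted."""
    if score < min_score:          # poor match: likely occlusion, do not learn
        return template
    return (1.0 - alpha) * template + alpha * observation

template = np.random.rand(32, 32)   # hypothetical grayscale target patch
patch    = np.random.rand(32, 32)   # patch extracted at the tracked position
template = update_template(template, patch, score=0.85)
```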

Application of Micro-continuum Approach in the Estimation of Snow Drift Density, Velocity and Mass Transport in Hilly Bound Cold Regions

We estimate snow velocity and snow drift density on hilly terrain under the assumption that the drifting snow mass can be represented using a micro-continuum approach (i.e., a non-classical mechanics approach based on a class of fluids for which the basic equations of mass, momentum and energy have been derived). In our model, the theory of couple stress fluids proposed by Stokes [1] is employed for the computation of the flow parameters. Analyses of the bulk drift velocity, drift density, drift transport and mass transport of snow particles have been carried out, and computations made for various parametric effects. The results are compared with those of classical mechanics (logarithmic wind profile) and indicate that particle size affects the flow characteristics significantly.
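For orientation, Stokes' couple stress fluid theory augments the classical momentum balance with a higher-order viscous term; for an incompressible fluid it is usually written in the generic form below (this is the textbook statement of the theory, not the specific snow-drift model derived in the paper):

```latex
\rho\,\frac{D\mathbf{q}}{Dt} \;=\; -\nabla p \;+\; \mu\,\nabla^{2}\mathbf{q}
\;-\; \eta\,\nabla^{4}\mathbf{q} \;+\; \rho\,\mathbf{f},
\qquad \nabla\cdot\mathbf{q} = 0
```

where $\mathbf{q}$ is the velocity, $p$ the pressure, $\mu$ the usual viscosity, $\eta$ the couple stress viscosity, and $\mathbf{f}$ the body force; setting $\eta = 0$ recovers the classical case underlying the logarithmic wind profile used for comparison.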

Target Tracking in Sensor Networks: A Distributed Constraint Satisfaction Approach

In distributed resource allocation, a set of agents must assign their resources to a set of tasks. This problem arises in many real-world domains such as distributed sensor networks, disaster rescue, hospital scheduling and others. Despite the variety of approaches proposed for distributed resource allocation, a systematic formalization of the problem that explains the different sources of difficulty, together with a formal account of the strengths and limitations of key approaches, is still missing. We take a step towards this goal by using a formalization of distributed resource allocation that represents both the dynamic and the distributed aspects of the problem. In this paper we present a new idea for target tracking in sensor networks and compare it with previous approaches. The central contribution of the paper is a generalized mapping from distributed resource allocation to the Dynamic Distributed Constraint Satisfaction Problem (DDCSP). This mapping is proven to correctly solve resource allocation problems of a given level of difficulty. This theoretical result is verified in practice by a simulation on a real-world distributed sensor network.

Antecedent Factors of Ethical Ideologies in Moral Judgment: Evidence from a Mixed-Method Study

This research investigates the factors that influence moral judgment when dealing with ethical dilemmas in an organizational context, as well as the antecedents of individual ethical ideology (idealism and relativism). A mixed-method design combining qualitative (field study) and quantitative (survey) approaches was used. An initial model was developed and then fine-tuned based on the field studies. Data were collected from managers in large Malaysian organizations. The results reveal that in-group collectivism culture, power distance culture, parental values, and religiosity are significant antecedents of ethical ideology, whereas their direct effects on moral judgment are not significant. Furthermore, the results confirm the significant effect of ethical ideology on moral judgment. This study provides valuable insight for evaluating the validity of existing theory as proposed in the literature and offers significant practical implications.

Detailed Mapping of Pyroclastic Flow Deposits by SAR Data Processing for an Active Volcano in the Torrid Zone

Field mapping of an active volcano, particularly in the Torrid Zone, is usually hampered by problems such as steep terrain and bad atmospheric conditions. In this paper we present a simple solution to this problem based on a combination of Synthetic Aperture Radar (SAR) and geostatistical methods. With this combination, we could reduce the speckle effect in the SAR data and then estimate the roughness distribution of the pyroclastic flow deposits. The main purpose of this study is to accurately detect the spatial distribution of new pyroclastic flow deposits, termed P-zones, using β° data derived from two RADARSAT-1 SAR level-0 scenes. A single Hyperion scene and field observations were used for cross-validation of the SAR results. Mt. Merapi in central Java, Indonesia, was chosen as the study site, and the eruptions of May-June 2006 were examined. The P-zones were found on the western and southern flanks. The total area and the longest flow distance were calculated as 2.3 km² and 6.8 km, respectively. The grain-size variation of the P-zones was mapped in detail, from fine to coarse deposits, relative to the C-band wavelength of 5.6 cm.

The Maximum Likelihood Method for the Random Coefficient Dynamic Regression Model

The Random Coefficient Dynamic Regression (RCDR) model is developed from the Random Coefficient Autoregressive (RCA) and Autoregressive (AR) models; it is obtained by adding exogenous variables to the RCA model. In this paper, the Maximum Likelihood (ML) method is used to estimate the parameters of the RCDR(1,1) model. Simulation results based on the AIC and BIC criteria are used to compare the performance of the RCDR(1,1) model. Good estimates are obtained for stationary and weakly stationary data when the exogenous variables are weakly stationary. However, model selection indicates nonstationary behaviour of the variables when it is based on stationary exogenous data.
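The exact RCDR(1,1) specification is not given in the abstract; a common way of adding an exogenous variable to the standard RCA(1) model, which matches the description above in spirit, is the following (the notation is illustrative and the paper's definition may differ):

```latex
y_t \;=\; (\phi + b_t)\,y_{t-1} \;+\; \beta\,x_t \;+\; \varepsilon_t,
\qquad
b_t \sim \mathrm{iid}\,(0,\sigma_b^{2}),\quad
\varepsilon_t \sim \mathrm{iid}\,(0,\sigma_\varepsilon^{2}),
```

where $x_t$ is the exogenous variable and $b_t$, $\varepsilon_t$ are mutually independent. Gaussian ML estimation then uses the conditional moments $E(y_t \mid \mathcal{F}_{t-1}) = \phi\,y_{t-1} + \beta\,x_t$ and $\mathrm{Var}(y_t \mid \mathcal{F}_{t-1}) = \sigma_b^{2}\,y_{t-1}^{2} + \sigma_\varepsilon^{2}$.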

The Social and Environmental Roles of Verandah in Tropical Houses

Since Malaysia is located within the tropical belt, certain rules should be implemented in creating passive sustainable housing designs there. The traditional Malay house possesses a strong character, with special spaces that create a sustainable house suited to the Malaysian tropical climate. One of these special spaces, known as the verandah or serambi gantung, offers various advantages in addressing a range of issues. However, this space is rarely applied in current designs, which produces major problems in terms of social and environmental aspects. This phenomenon has a negative impact on occupants, even though Malaysia previously possessed an excellent housing design. Therefore, this paper aims to explore both of the main issues mentioned above and to reveal the advantages of incorporating the verandah into passive sustainable housing design in Malaysia. A systematic literature review is the main methodology used in this research to identify the various advantages of the verandah. The study reveals that the verandah is the best solution to these social and environmental issues and should be implemented in current housing design in Malaysia.

On the Move to Semantic Web Services

Semantic Web services will enable the semi-automatic and automatic annotation, advertisement, discovery, selection, composition, and execution of inter-organizational business logic, making the Internet a common global platform where organizations and individuals communicate with each other to carry out various commercial activities and to provide value-added services. There is a growing consensus that Web services alone will not be sufficient to develop valuable solutions, owing to the degree of heterogeneity, autonomy, and distribution of the Web. This paper deals with two of the hottest R&D and technology areas currently associated with the Web: Web services and the Semantic Web. It presents the synergies that can be created between Web services and Semantic Web technologies to provide a new generation of e-services.

Comparing Arabic and Latin Handwritten Digits Recognition Problems

A comparison of the performance of Latin and Arabic handwritten digit recognition is presented. The performance of ten different classifiers is tested on two similar Arabic and Latin handwritten digit databases. The analysis shows that the Arabic handwritten digit recognition problem is easier than the Latin one, because the inter-class differences for Latin digits are smaller than for Arabic digits and the variances in writing Latin digits are larger. Consequently, weaker yet fast classifiers are expected to play a more prominent role in Arabic handwritten digit recognition.

Enhancement of Essential Oil from Agarwood by Subcritical Water Extraction and Pretreatments on Hydrodistillation

The traditional method for essential oil extraction from agarwood (Aquilaria crassna) is to soak it in water followed by hydrodistillation. The effects of various agarwood pretreatments (ethanol, acid, alkali, enzymes, and ultrasound) and of subcritical water extraction (SWE) were studied and compared with the traditional method. The major components of the agarwood oil obtained by hydrodistillation were aroma compounds, as follows: aristol-9-en-8-one (21.53%), selina-3,7(11)-diene (12.96%), τ-himachalene (9.28%), β-guaiene (5.79%), hexadecanoic acid (4.90%) and guaia-3,9-diene (4.21%). In contrast, the agarwood oil obtained from the ethanol and ultrasound pretreatments and from SWE consisted mainly of fatty acid compounds. Extraction using these pretreatments could improve the agarwood oil yield to up to two times that of the traditional method. The sample pretreated with dilute acid (H2SO4) at pH 4 gave oil components quite similar to those of the traditional method. Therefore, the enhancement of essential oil extraction from agarwood depends on the type of oil required, which in turn determines the extraction method.