Robust Clustering with Dimension Reduction

Clustering is the process of identifying homogeneous groups of objects, called clusters, and is one of the most interesting topics in data mining. Objects within a group, or class, share similar characteristics. This paper discusses a robust clustering process for image data with two dimension reduction approaches: two-dimensional principal component analysis (2DPCA) and principal component analysis (PCA). A standard approach to the high dimensionality of image data is dimension reduction, which transforms high-dimensional data into a lower-dimensional space with limited loss of information; PCA is one of the most common forms of dimensionality reduction. 2DPCA, often called a variant of PCA, treats the image matrices directly as 2D matrices; they do not need to be transformed into vectors, so the image covariance matrix can be constructed directly from the original image matrices. The decomposed classical covariance matrix, however, is very sensitive to outlying observations. The objective of this paper is to compare the performance of the robust minimizing vector variance (MVV) estimator under the 2DPCA projection and under PCA for clustering arbitrary image data when outliers are hidden in the data set. Simulation results on robustness and an illustration of image clustering are discussed at the end of the paper.
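
The distinguishing step of 2DPCA described above, building the image covariance matrix directly from 2D image matrices with no vectorization, can be sketched as follows. This is a minimal illustration on toy nested-list "images"; the helper functions and the toy data are not from the paper.

```python
# Minimal 2DPCA sketch: the image covariance matrix G is built directly
# from the 2D image matrices (no flattening to vectors, unlike classical PCA):
#   G = (1/M) * sum_i (A_i - Abar)^T (A_i - Abar),  G is n x n for m x n images.

def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    Bt = transpose(B)
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

def mat_scale(A, s):
    return [[x * s for x in row] for row in A]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_sub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def image_covariance(images):
    M = len(images)
    mean = images[0]
    for A in images[1:]:
        mean = mat_add(mean, A)
    mean = mat_scale(mean, 1.0 / M)          # mean image Abar
    n = len(images[0][0])
    G = [[0.0] * n for _ in range(n)]
    for A in images:
        D = mat_sub(A, mean)                 # centered image
        G = mat_add(G, matmul(transpose(D), D))
    return mat_scale(G, 1.0 / M)

# Three toy 2x3 "images": the resulting G is 3x3 and symmetric.
imgs = [[[1, 2, 3], [4, 5, 6]],
        [[2, 2, 2], [4, 4, 4]],
        [[0, 1, 2], [3, 5, 7]]]
G = image_covariance(imgs)
```

The projection axes of 2DPCA are then the leading eigenvectors of this matrix; a robust variant such as the MVV estimator replaces the classical mean and covariance above with outlier-resistant estimates.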

Reducing Stock-out Incidents at a Hospital Using Six Sigma

In managing healthcare logistics, cost is not the only factor to be considered. The criticality level of the items used in patient care services plays an important role as well: a stock-out incident involving a highly critical item could threaten a patient's life. In this paper, the DMAIC (Define-Measure-Analyze-Improve-Control) methodology is used to drive improvement projects based on customer-driven critical-to-quality characteristics at a Jordanian hospital. This paper shows how the application of Six Sigma improves the performance of the case hospital's logistics system by reducing the number of stock-out incidents.

Smart Surveillance using PDA

The aim of this research is to develop a fast and reliable surveillance system based on a personal digital assistant (PDA) device, extending to the PDA the moving-object detection capability already available on personal computers. A second aim is to compare the performance of the background subtraction (BS) and temporal frame differencing (TFD) techniques to determine which is more suitable for the PDA platform. To reduce noise and prepare frames for the moving-object detection stage, each frame is first converted to a gray-scale representation and then smoothed using a Gaussian low-pass filter. Two moving-object detection schemes, BS and TFD, have been analyzed. The background frame is updated using an infinite impulse response (IIR) filter so that it adapts to varying illumination conditions and geometry settings. To reduce the effect of noise pixels resulting from frame differencing, the morphological filters erosion and dilation are applied. This research found that the TFD technique is more suitable for motion detection than BS in terms of speed: on average, TFD is approximately 170 ms faster than the BS technique.
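
The two detection schemes can be sketched on toy one-dimensional "frames" (lists of gray levels): TFD thresholds the difference between consecutive frames, while BS thresholds the difference against a background that is updated by a first-order IIR filter. The threshold and learning rate below are illustrative values, not the paper's.

```python
# Sketch of the two moving-object detection schemes compared in the paper.
THRESH = 20      # illustrative difference threshold
ALPHA = 0.05     # illustrative IIR learning rate

def tfd_mask(prev, curr, thresh=THRESH):
    """Temporal frame differencing: compare consecutive frames."""
    return [1 if abs(c - p) > thresh else 0 for p, c in zip(prev, curr)]

def bs_step(background, curr, thresh=THRESH, alpha=ALPHA):
    """Background subtraction with IIR background update."""
    mask = [1 if abs(c - b) > thresh else 0 for b, c in zip(background, curr)]
    # The IIR update slowly adapts the background to illumination changes.
    new_bg = [(1 - alpha) * b + alpha * c for b, c in zip(background, curr)]
    return mask, new_bg

# Toy sequence: an "object" (bright pixels) appears in the last frame.
frames = [[10, 10, 10, 10], [10, 10, 10, 10], [10, 200, 200, 10]]
bg = [float(v) for v in frames[0]]
for prev, curr in zip(frames, frames[1:]):
    motion_tfd = tfd_mask(prev, curr)
    motion_bs, bg = bs_step(bg, curr)
```

In a full pipeline, the resulting binary masks would then be cleaned with erosion and dilation, as described above.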

Bio-Inspired Generalized Global Shape Approach for Writer Identification

Writer identification is an area of pattern recognition that attracts many researchers, particularly in forensic and biometric applications, where writing style can be used as a biometric feature for authenticating an identity. The challenging task in writer identification is the extraction of unique features through which the individuality of handwriting styles can be captured in a bio-inspired generalized global shape. In this paper, the feasibility of the generalized global shape concept of complementary binding in the Artificial Immune System (AIS) for writer identification is explored. An experiment based on the proposed framework has been conducted to prove the validity and feasibility of the proposed approach for off-line writer identification.

Credit Spread Changes and Volatility Spillover Effects

The purpose of this paper is to investigate the influence of a number of variables on the conditional mean and conditional variance of credit spread changes. The empirical analysis in this paper is conducted within the context of bivariate GARCH-in-Mean models, using the so-called BEKK parameterization. We show that credit spread changes are determined by interest-rate and equity-return variables, which is in line with theory as provided by the structural models of default. We also identify the credit spread change volatility as an important determinant of credit spread changes, and provide evidence on the transmission of volatility between the variables under study.
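
The defining feature of a GARCH-in-Mean model, that the conditional variance enters the conditional mean equation, can be illustrated with a simulation. The sketch below is a simplified univariate GARCH(1,1)-in-Mean, not the bivariate BEKK parameterization estimated in the paper, and all parameter values are illustrative.

```python
# Univariate GARCH(1,1)-in-Mean simulation sketch:
#   r_t = mu + lam * h_t + eps_t,   eps_t = sqrt(h_t) * z_t,
#   h_t = omega + alpha * eps_{t-1}^2 + beta * h_{t-1}.
import random

random.seed(0)
MU, LAM = 0.0, 0.1                   # mean intercept and in-mean term
OMEGA, ALPHA, BETA = 0.05, 0.1, 0.85 # GARCH parameters (alpha + beta < 1)

def simulate_garch_m(n):
    h = OMEGA / (1 - ALPHA - BETA)   # start at the unconditional variance
    eps_prev, series = 0.0, []
    for _ in range(n):
        h = OMEGA + ALPHA * eps_prev ** 2 + BETA * h
        eps_prev = (h ** 0.5) * random.gauss(0, 1)
        series.append(MU + LAM * h + eps_prev)  # variance feeds the mean
    return series

changes = simulate_garch_m(500)
```

The BEKK form used in the paper generalizes the variance recursion to a full conditional covariance matrix, which is what allows volatility spillovers between the series to be identified.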

Two Undetectable On-line Dictionary Attacks on Debiao et al.’s S-3PAKE Protocol

In 2011, Debiao et al. pointed out that the S-3PAKE protocol, proposed by Lu and Cao for password-authenticated key exchange in the three-party setting, is vulnerable to an off-line dictionary attack. They then proposed countermeasures to eliminate this security vulnerability of S-3PAKE. Nevertheless, this paper points out that, contrary to their claim, their enhanced S-3PAKE protocol is still vulnerable to undetectable on-line dictionary attacks.

Effects of Dry Period Length on Milk Production and Composition, Blood Metabolites and Complete Blood Count in the Subsequent Lactation of Holstein Dairy Cows

Twenty-nine Holstein cows were used to evaluate the effects of different dry period (DP) lengths on milk yield and composition, selected blood metabolites, and complete blood count (CBC). Cows were assigned to one of two treatments: 1) a 60-d DP or 2) a 35-d DP. Milk yield from calving to 60 days did not differ between treatments (p = 0.130). Cows on the 35-d DP produced more milk protein and solids-not-fat (SNF) compared with cows on the 60-d DP (p ≤ 0.05). Serum glucose, non-esterified fatty acids (NEFA), beta-hydroxybutyrate (BHBA), blood urea nitrogen (BUN), urea, and glutamic oxaloacetic transaminase (GOT) were all similar between treatments, as were body condition score (BCS), body weight (BW), CBC, and health problems. The results of this study demonstrate that the dry period can be reduced to 35 days without adverse effects.

Using the Combined Model of PROMETHEE and Fuzzy Analytic Network Process for Determining Question Weights in Scientific Exams through Data Mining Approach

The need for an appropriate system for evaluating students' educational development is a key problem in achieving predefined educational goals. The volume of papers in recent years attempting to prove or disprove the necessity and adequacy of student assessment corroborates this. Some of these studies have tried to increase the precision of determining question weights in scientific examinations, but all of them attempt to adjust the initial question weights while the accuracy and precision of those initial weights remain in question. Thus, in order to increase the precision of assessing students' educational development, the present study proposes a new method for determining the initial question weights by considering question factors such as difficulty, importance, and complexity, and by implementing a combined method of PROMETHEE and fuzzy analytic network process using a data mining approach to improve the model's inputs. The results of the implemented case study demonstrate the improved performance and precision of the proposed model.
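
The PROMETHEE component of such a combination can be sketched as a net-flow ranking of questions over the stated criteria. The criteria scores, the weights, and the use of the simple "usual" preference function below are illustrative stand-ins, not the paper's actual data or preference functions.

```python
# PROMETHEE II net-flow sketch for ranking exam questions by weight.
WEIGHTS = [0.4, 0.35, 0.25]          # difficulty, importance, complexity

def preference(d):
    """Usual criterion: any positive difference counts as full preference."""
    return 1.0 if d > 0 else 0.0

def net_flows(scores, weights=WEIGHTS):
    n = len(scores)
    def pi(a, b):   # aggregated preference of alternative a over b
        return sum(w * preference(fa - fb)
                   for w, fa, fb in zip(weights, scores[a], scores[b]))
    phi = []
    for a in range(n):
        plus = sum(pi(a, b) for b in range(n) if b != a) / (n - 1)
        minus = sum(pi(b, a) for b in range(n) if b != a) / (n - 1)
        phi.append(plus - minus)     # net outranking flow
    return phi

# Three hypothetical questions scored on the three criteria (higher = more).
questions = [[0.9, 0.8, 0.7], [0.4, 0.9, 0.3], [0.2, 0.3, 0.5]]
flows = net_flows(questions)
best = max(range(len(flows)), key=lambda i: flows[i])
```

In the proposed model, the fuzzy analytic network process would supply the criteria weights (here fixed by hand), and the resulting flows would inform the initial question weights.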

A Fully Implicit Finite-Difference Solution to One Dimensional Coupled Nonlinear Burgers’ Equations

A fully implicit finite-difference method is proposed for the numerical solution of the one-dimensional coupled nonlinear Burgers' equations on uniform mesh points. The method forms a system of nonlinear difference equations that must be solved at each time step. Newton's iterative method has been implemented to solve this assembled nonlinear system of equations, with the linear system at each iteration of Newton's method solved by Gauss elimination with partial pivoting. Three test examples are presented to illustrate the accuracy of the method. Solutions computed by the proposed scheme are compared with analytical solutions and with those already available in the literature by computing L2 and L∞ errors.
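
The solver core described above, Newton iteration with the inner linear solve done by Gauss elimination with partial pivoting, can be sketched on a toy 2x2 nonlinear system standing in for the assembled difference equations (the full Burgers' discretization is not reproduced here).

```python
# Newton's method with Gauss elimination (partial pivoting) for F(x) = 0.

def gauss_solve(A, b):
    """Solve A x = b using Gauss elimination with partial pivoting."""
    n = len(b)
    A = [row[:] for row in A]    # work on copies
    b = b[:]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(A[i][k]))   # pivot row
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
            b[i] -= m * b[k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):   # back substitution
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

def newton(F, J, x, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        dx = gauss_solve(J(x), [-f for f in F(x)])   # J dx = -F
        x = [xi + di for xi, di in zip(x, dx)]
        if max(abs(d) for d in dx) < tol:
            break
    return x

# Toy system: x^2 + y^2 = 1, x - y = 0  (root at x = y = 1/sqrt(2)).
F = lambda v: [v[0] ** 2 + v[1] ** 2 - 1.0, v[0] - v[1]]
J = lambda v: [[2 * v[0], 2 * v[1]], [1.0, -1.0]]
root = newton(F, J, [1.0, 0.5])
```

In the actual scheme, F would be the vector of nonlinear difference equations at one time level and J its (banded) Jacobian.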

A Review of Critical Success Factor in Building Maintenance Management Practice for University Sector

Building maintenance plays an important role among the activities of building operation. Building defects and damage are the 'bread and butter' of building maintenance, and recording them during building inspection is well justified, particularly for determining building performance; there is no escape route or shortcut around building maintenance work. This study attempts to identify a competitive level of performance that translates critical success factor achievements and satisfactorily meets the university's expectations. The quality and efficiency of building maintenance management depend, to some extent, on building condition information, the expectations of the university sector, and the work carried out for each maintenance activity. This paper reviews the critical success factors in building maintenance management practice for the university sector from four perspectives: (1) customer, (2) internal processes, (3) financial, and (4) learning and growth. Enhancing these perspectives makes it possible to reach the maintenance management goal of a better living environment on a university campus.

Experimental Study of Upsetting and Die Forging with Controlled Impact

The results of experimental research on deformation by upsetting and die forging of lead specimens with controlled impact are presented. The laboratory setup used for the investigations, based on a cold rocket engine operated on compressed air, is described. The results show that controlled impact achieves greater plastic deformation and consumes less impact energy than an ordinary impact deformation process.

The Contraction Point for Phan-Thien/Tanner Model of Tube-Tooling Wire-Coating Flow

The simulation of the extrusion process is widely studied in order to both increase output and improve quality, with broad application in wire coating. The annular tube-tooling extrusion was modeled by the Navier-Stokes equations together with a rheological model of differential form based on the single-mode exponential Phan-Thien/Tanner constitutive equation, in a two-dimensional cylindrical coordinate system, to predict the contraction point of the polymer melt beyond the die. Numerical solutions are sought through a semi-implicit Taylor-Galerkin pressure-correction finite element scheme. The investigation focused on incompressible creeping flow with long relaxation times, in terms of Weissenberg numbers up to 200. The isothermal case was considered, with surface tension effects on the free surface of the extrudate flow and no slip at the die wall. The Streamline Upwind Petrov-Galerkin method is employed to stabilize the solution. The mesh structure after the die exit was adjusted following the predictions of both the top and bottom free surfaces, so as to keep the location of the contraction point at around one unit length, which is close to experimental results.

Multiple Sequence Alignment Using Optimization Algorithms

Proteins or genes that have similar sequences are likely to perform the same function. One of the most widely used techniques for sequence comparison is sequence alignment, which allows mismatches and insertions/deletions that represent biological mutations. Sequence alignment is usually performed on only two sequences; multiple sequence alignment is its natural extension, in which the emphasis is on finding an optimal alignment for a group of sequences. Several applicable techniques were examined in this research, from traditional methods such as dynamic programming to widely used stochastic optimization methods such as Genetic Algorithms (GAs) and Simulated Annealing. A framework combining a Genetic Algorithm with Simulated Annealing is presented to solve the multiple sequence alignment problem: the Genetic Algorithm phase explores new regions of the solution space, while Simulated Annealing acts as an alignment improver for any near-optimal solution produced by the GA.
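
The "alignment improver" role of Simulated Annealing can be sketched in miniature: anneal the position of a single gap in one sequence so that more columns match a target sequence. The match-only scoring and the single-gap move are deliberately minimal stand-ins for the full GA + SA framework.

```python
# Simulated annealing sketch: relocate one gap to improve an alignment.
import math
import random

random.seed(1)

def score(a, b):
    """Number of matching columns (gaps never match)."""
    return sum(1 for x, y in zip(a, b) if x == y and x != "-")

def move_gap(seq):
    """Remove the single gap and reinsert it at a random position."""
    i = seq.index("-")
    core = seq[:i] + seq[i + 1:]
    j = random.randrange(len(seq))
    return core[:j] + "-" + core[j:]

def anneal(seq, target, temp=2.0, cooling=0.95, steps=200):
    best = curr = seq
    for _ in range(steps):
        cand = move_gap(curr)
        delta = score(cand, target) - score(curr, target)
        # Accept improvements always; accept worse moves with
        # probability exp(delta / temp) so the search can escape optima.
        if delta >= 0 or random.random() < math.exp(delta / temp):
            curr = cand
        if score(curr, target) > score(best, target):
            best = curr
        temp *= cooling
    return best

target = "ACGTAC"
start = "-ACGTA"              # gap misplaced: zero columns match
improved = anneal(start, target)
```

In the full framework, the same acceptance rule would act on whole multi-sequence alignments produced by the GA, with a biologically motivated scoring function in place of the match count.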

Optimum Design of an 8x8 Optical Switch with Thermal Compensated Mechanisms

This paper studies the optimum design for reducing the optical loss of an 8x8 mechanical-type optical switch due to temperature changes. The 8x8 optical switch is composed of a base, 8 input fibers, 8 output fibers, 3 fixed mirrors, and 17 movable mirrors. First, an innovative switch configuration with a thermal-compensated design is proposed. Most mechanical-type optical switches have the disadvantage that their precision and accuracy are influenced by the ambient temperature; the thermal-compensated design addresses this by using materials with different thermal expansion coefficients (α). Second, a parametric modeling program is developed to generate solid models for finite element analysis, and the thermal and structural behaviors of the switch are analyzed. Finally, an integrated optimum design program, combining Autodesk Inventor Professional, finite element analysis software, and genetic algorithms, is developed to improve the thermal behavior so that the optical loss of the switch is reduced. By changing the design parameters of the switch in the integrated design program, a final optimum design that satisfies the design constraints and specifications can be found.

The Concept of Place and Sense of Place In Architectural Studies

Place is a dimension formed by people's relationships with physical settings, individual and group activities, and meanings. 'Place attachment', 'place identity', and 'sense of place' are concepts that describe the quality of people's relationships with a place. The concept of sense of place is used in studying human-place bonding, attachment, and place meaning. Sense of place is usually defined as an overarching impression encompassing the general ways in which people feel about a place, sense it, and assign concepts and values to it. Sense of place is highlighted in this article as one of the prevailing concepts in place-based research. Considering the dimensions of sense of place has always been beneficial for investigating attachment to public places and pro-environmental attitudes towards them. The creation or preservation of sense of place is important in maintaining the quality of the environment as well as the integrity of human life within it. While many scholars have argued that sense of place is a vague concept, this paper summarizes and analyzes the existing seminal literature. First, the concept of sense of place and its characteristics are examined; then the scales of sense of place are reviewed and the factors that contribute to forming sense of place are evaluated; finally, place attachment is described as an objective dimension for measuring sense of place.

Trust Building Mechanisms for Electronic Business Networks and Their Relation to eSkills

Globalization, supported by information and communication technologies, changes the rules of competitiveness and increases the significance of information, knowledge, and network cooperation. In line with this trend, a need for efficient trust-building tools has emerged. The absence of trust-building mechanisms and strategies has been identified in several studies. Through trust development, participation in e-business networks and usage of network services will increase, providing SMEs with new economic benefits. This work focuses on developing effective trust-building strategies for electronic business network platforms. Based on the identification of trust-building mechanisms, a questionnaire-based analysis of their significance and minimum level of requirements was conducted. In the paper, we confirm that trust depends on e-Skills, which play a crucial role in achieving a higher level of trust in more sophisticated and complex trust-building ICT solutions.

Study of Remote Sensing and Satellite Images Ability in Preparing Agricultural Land Use Map (ALUM)

In this research, the preparation of a land use map from LISS III scanner data of the IRS satellite for the Aghche region in Isfahan province is studied. For this purpose, IRS satellite images from August 2008 were used, and the various land uses in the region, including rangeland, irrigated farming, dry farming, gardens, and urban areas, were separated and identified. GPS data and Erdas Imagine software were used, and three classification methods, maximum likelihood, Mahalanobis distance, and minimum distance, were analyzed. For each method, the error matrix and Kappa index were calculated, yielding accuracies of 53.13%, 56.64%, and 48.44%, respectively. Given the low accuracy of these methods in separating land use classes, visual interpretation of the map was used instead. Finally, 150 randomly selected points were visited in the field and no error was observed, showing that the map prepared by visual interpretation has high accuracy. Although errors due to visual interpretation and geometric correction may occur, the achieved map accuracy of more than 85 percent is reliable.
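
The accuracy figures quoted above come from an error (confusion) matrix: overall accuracy is the share of correctly classified samples, and the Kappa index corrects it for chance agreement between the map and the reference data. The 3x3 matrix below is illustrative, not the study's data.

```python
# Overall accuracy and Kappa index from a classification error matrix.

def overall_accuracy(m):
    total = sum(sum(row) for row in m)
    return sum(m[i][i] for i in range(len(m))) / total

def kappa(m):
    n = len(m)
    total = sum(sum(row) for row in m)
    p_o = sum(m[i][i] for i in range(n)) / total   # observed agreement
    # Expected chance agreement from row (reference) and column (map) marginals.
    p_e = sum(sum(m[i]) * sum(m[j][i] for j in range(n))
              for i in range(n)) / (total * total)
    return (p_o - p_e) / (1 - p_e)

# Rows: reference classes; columns: mapped classes
# (e.g. rangeland, irrigated farming, dry farming).
error_matrix = [[50, 10, 5],
                [8, 40, 12],
                [6, 9, 35]]
acc = overall_accuracy(error_matrix)
k = kappa(error_matrix)
```

A Kappa value well below the overall accuracy, as here, signals that part of the apparent agreement is attributable to chance, which is why both figures are reported per method.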

Operational Risk – Scenario Analysis

This paper focuses on operational risk measurement techniques and on economic capital estimation methods. A data sample of operational losses provided by an anonymous Central European bank is analyzed using several approaches. Loss Distribution Approach and scenario analysis method are considered. Custom plausible loss events defined in a particular scenario are merged with the original data sample and their impact on capital estimates and on the financial institution is evaluated. Two main questions are assessed – What is the most appropriate statistical method to measure and model operational loss data distribution? and What is the impact of hypothetical plausible events on the financial institution? The g&h distribution was evaluated to be the most suitable one for operational risk modeling. The method based on the combination of historical loss events modeling and scenario analysis provides reasonable capital estimates and allows for the measurement of the impact of extreme events on banking operations.
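
The Loss Distribution Approach mentioned above can be sketched by Monte Carlo: the annual aggregate loss is a Poisson-distributed number of individual severities, and economic capital is read off as a high quantile of the simulated aggregate distribution. A lognormal severity stands in here for the g&h distribution found most suitable in the paper, and all parameter values are illustrative.

```python
# Loss Distribution Approach sketch: compound Poisson aggregate losses.
import math
import random

random.seed(42)
LAMBDA = 25            # mean number of loss events per year (illustrative)
MU, SIGMA = 10.0, 1.2  # lognormal severity parameters (illustrative)

def poisson(lam):
    """Knuth's method: multiply uniforms until the product drops below e^-lam."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def annual_loss():
    return sum(random.lognormvariate(MU, SIGMA) for _ in range(poisson(LAMBDA)))

losses = sorted(annual_loss() for _ in range(5000))
var_999 = losses[int(0.999 * len(losses))]   # 99.9% quantile (capital proxy)
mean_loss = sum(losses) / len(losses)
```

Scenario analysis then amounts to merging a few plausible extreme losses into the simulated (or historical) sample and observing how far the high quantile, and hence the capital estimate, moves.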

High Performance Liquid Chromatographic Method for Determination of Colistin Sulfate and Its Application in Medicated Premix and Animal Feed

The aim of the present study was to develop and validate an inexpensive and simple high performance liquid chromatographic (HPLC) method for the determination of colistin sulfate. Separation of colistin sulfate was achieved on a ZORBAX Eclipse XDB-C18 column using UV detection at λ = 215 nm. The mobile phase was 30 mM sulfate buffer (pH 2.5):acetonitrile (76:24). Excellent linearity (r² = 0.998) was found in the concentration range of 25-400 μg/mL. Intra-day and inter-day precisions of the method (%RSD, n = 3) were less than 7.9%. The developed and validated method was applied to the determination of colistin sulfate content in medicated premix and animal feed samples. The recovery of colistin from animal feed ranged satisfactorily from 90.92% to 93.77%. The results demonstrate that the HPLC method developed in this work is appropriate for direct determination of colistin sulfate in commercial medicated premixes and animal feed.
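
The linearity check behind a figure like r² = 0.998 is a least-squares line through concentration/response pairs. The sketch below shows the computation; the peak-area values are hypothetical, not the study's measurements.

```python
# Calibration linearity sketch: least-squares line and coefficient of
# determination r^2 for concentration vs. detector response.

def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

conc = [25, 50, 100, 200, 400]            # ug/mL, matching the stated range
area = [12.1, 24.5, 49.8, 98.0, 201.5]    # hypothetical peak areas
slope, intercept, r2 = linear_fit(conc, area)
```

An r² close to 1 over the working range is what justifies quantifying unknown samples from the fitted line.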

Advanced Stochastic Models for Partially Developed Speckle

Speckled images arise when coherent microwave, optical, or acoustic imaging techniques are used to image an object, surface, or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar systems, and medical ultrasound systems. Speckle noise is a form of object- or target-induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted by speckle noise are complicated by the nature of the noise and are not as straightforward as detection and estimation in additive noise. In this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this context involves a partially developed speckle model in which an underlying Poisson point process modulates a Gram-Charlier series of Laguerre-weighted exponential functions, resulting in a doubly stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in closed canonical form. It is observed that, as the mean number of scatterers in a resolution cell increases, the probability density function approaches an exponential distribution, consistent with fully developed speckle noise as predicted by the central limit theorem.
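
The limiting behavior noted above can be checked numerically: model a resolution cell's field as a sum of unit-amplitude scatterer phasors with independent uniform phases, so that for many scatterers the intensity tends to an exponential distribution (whose standard deviation equals its mean, i.e. speckle contrast tends to 1). This is a simulation sketch of the fully developed limit, not the paper's partially developed closed-form model; the scatterer count and sample sizes are illustrative.

```python
# Random-phasor-sum sketch of speckle intensity statistics.
import cmath
import math
import random

random.seed(7)

def cell_intensity(n_scatterers):
    """Intensity |E|^2 of a sum of unit phasors with random phases."""
    field = sum(cmath.exp(1j * random.uniform(0, 2 * math.pi))
                for _ in range(n_scatterers))
    return abs(field) ** 2

N = 50                                        # scatterers per resolution cell
samples = [cell_intensity(N) for _ in range(4000)]
mean_i = sum(samples) / len(samples)          # expected to be ~N
var_i = sum((s - mean_i) ** 2 for s in samples) / len(samples)
contrast = var_i ** 0.5 / mean_i              # -> 1 for exponential intensity
```

In the partially developed regime the scatterer count per cell is itself random (Poisson), which is what makes the resulting process doubly stochastic and pulls the density away from the exponential limit.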