Analysis of Long-Term File System Activities on Cluster Systems

I/O workload is a critical factor in analyzing I/O patterns and maximizing file system performance. However, measuring the I/O workload of a running distributed parallel file system is non-trivial because of collection overhead and the large volume of data. In this paper, we measure and analyze file system activities on two large-scale cluster systems with TFlops-level computation resources. By comparing file system activities in 2009 with those in 2006, we analyze how I/O workloads changed with advances in system performance and high-speed network technology.

GeoSEMA: A Modelling Platform for Emerging "GeoSpatial-based Evolutionary and Mobile Agents"

Spatial and mobile computing continue to evolve. This paper describes a smart modeling platform called "GeoSEMA", an approach for modeling multidimensional GeoSpatial Evolutionary and Mobile Agents. Beyond 3D and location-based issues, other dimensions may characterize spatial agents, e.g. discrete-continuous time and agent behaviors. GeoSEMA can be seen as a dedicated design pattern motivating temporal geographic-based applications; it provides a firm foundation for multipurpose and multidimensional spatial-based applications. It deals with multipurpose smart objects (buildings, shapes, missiles, etc.) by simulating geospatial agents. Formally, GeoSEMA refers to geospatial, spatio-evolutive, and mobile space constituents, for which a conceptual geospatial space model is given in this paper. In addition to modeling and categorizing geospatial agents, the model incorporates the concept of inter-agent event-based protocols. Finally, a rapid software-architecture prototype of the GeoSEMA platform is also given; it will be implemented and validated in the next phase of our work.

Design and Analysis of Gauge R&R Studies: Making Decisions Based on ANOVA Method

In a competitive production environment, critical decisions are based on data obtained by random sampling of product units. The effectiveness of these decisions depends on the quality of the data and on how reliable they are, which leads to the need for a reliable measurement system. The process of estimating and analysing measurement errors is known as Measurement System Analysis (MSA). The aim of this research is to establish the need for, and support, further development in analysing measurement systems, particularly through Gauge Repeatability and Reproducibility (GR&R) studies to improve physical measurements. Although gauge repeatability and reproducibility studies are now well established in manufacturing industries, they are not applied as widely as other measurement system analysis methods. To introduce this method and provide feedback for improving measurement systems, this survey focuses on the ANOVA method as the most widespread way of calculating repeatability and reproducibility (R&R).
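As an illustration of the ANOVA approach to GR&R (a sketch under standard crossed-design assumptions, not code from the paper), the following Python function estimates the usual variance components from a measurement table; the column names, statsmodels usage, and negative-component truncation are assumptions.

```python
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

def gauge_rr(df, n_parts, n_operators, n_replicates):
    """Estimate Gauge R&R variance components with the crossed ANOVA model.

    df must have columns 'part', 'operator', 'y' (hypothetical names)."""
    model = ols("y ~ C(part) * C(operator)", data=df).fit()
    tbl = anova_lm(model, typ=2)
    ms = tbl["sum_sq"] / tbl["df"]              # mean squares
    ms_p  = ms["C(part)"]
    ms_o  = ms["C(operator)"]
    ms_po = ms["C(part):C(operator)"]
    ms_e  = ms["Residual"]

    repeatability = ms_e                                          # equipment variation
    interaction   = max((ms_po - ms_e) / n_replicates, 0.0)
    operator      = max((ms_o - ms_po) / (n_parts * n_replicates), 0.0)
    part          = max((ms_p - ms_po) / (n_operators * n_replicates), 0.0)
    reproducibility = operator + interaction                      # appraiser variation
    return {"repeatability": repeatability,
            "reproducibility": reproducibility,
            "gauge_rr": repeatability + reproducibility,
            "part_to_part": part}
```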

On Pattern-Based Programming towards the Discovery of Frequent Patterns

The problem of frequent pattern discovery is defined as the process of searching for patterns, such as sets of features or items, that appear frequently in data. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a database. Most of the proposed frequent pattern mining algorithms have been implemented in imperative programming languages. This paradigm is inefficient when the set of patterns is large and the frequent patterns are long. We suggest applying a high-level declarative style of programming to the problem of frequent pattern discovery, considering two languages: Haskell and Prolog. Our intuition is that finding frequent patterns should be implemented efficiently and concisely in a declarative paradigm, since pattern matching is a fundamental feature supported by most functional languages and by Prolog. Our frequent pattern mining implementations in Haskell and Prolog confirm our hypothesis about the conciseness of the programs. Comparative studies of lines of code, speed, and memory usage for declarative versus imperative programming are reported in the paper.
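For readers unfamiliar with the task itself, the following minimal Python sketch (not the paper's Haskell or Prolog implementation) finds all itemsets whose support meets an absolute threshold with a simple Apriori-style level-wise search; the function names and toy data are illustrative.

```python
def frequent_itemsets(transactions, min_support):
    """Level-wise (Apriori-style) search for frequent itemsets.

    transactions: list of sets of items; min_support: absolute count."""
    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    items = {frozenset([i]) for t in transactions for i in t}
    current = {s for s in items if support(s) >= min_support}
    frequent = set(current)
    k = 2
    while current:
        # candidate k-itemsets built from unions of frequent (k-1)-itemsets
        candidates = {a | b for a in current for b in current if len(a | b) == k}
        current = {c for c in candidates if support(c) >= min_support}
        frequent |= current
        k += 1
    return frequent

# toy usage
data = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}]
print(frequent_itemsets(data, min_support=2))
```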

Independent Spanning Trees on Systems-on-chip Hypercubes Routing

Independent spanning trees (ISTs) provide a number of advantages for data broadcasting, for example in fault-tolerant network protocols for distributed computing and in bandwidth utilization. However, constructing multiple ISTs is considered hard for arbitrary graphs. In this paper we present an efficient algorithm to construct ISTs on hypercubes that requires minimal resources.
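The defining property of ISTs is that all trees share a common root and, for every other vertex, its paths to the root in the different trees are internally vertex-disjoint. The Python sketch below (illustrative only, not the construction algorithm presented in the paper) checks this property for candidate spanning trees given as parent maps over a hypercube's vertex labels.

```python
def root_path(parent, v, root):
    """Path from v to the root following parent pointers (v and root included)."""
    path = [v]
    while path[-1] != root:
        path.append(parent[path[-1]])
    return path

def are_independent(parents, root, vertices):
    """Check the IST property for trees given as parent dictionaries."""
    for v in vertices:
        if v == root:
            continue
        paths = [root_path(p, v, root) for p in parents]
        for i in range(len(paths)):
            for j in range(i + 1, len(paths)):
                # internal vertices exclude v itself and the root
                if set(paths[i][1:-1]) & set(paths[j][1:-1]):
                    return False
    return True

# usage on Q_2 (vertices 0..3), two hand-made spanning trees rooted at 0
t1 = {1: 0, 3: 1, 2: 0}          # edges 0-1, 1-3, 0-2
t2 = {2: 0, 3: 2, 1: 0}          # edges 0-2, 2-3, 0-1
print(are_independent([t1, t2], root=0, vertices=range(4)))
```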

Design and Fabrication of a Miniature Railway Vehicle

We present the design, fabrication, and characterization of a small (12 mm × 12 mm × 8 mm) movable railway vehicle for carrying sensors. The miniature railway vehicle (MRV) is mainly composed of a vibrational structure and three legs. A railway was designed and fabricated to power and guide the MRV; it also transmits the sensed data from the MRV to the signal processing unit. The MRV moves along the railway by means of the high-frequency vibration of its legs. A model was derived to describe the motion, and FEM simulations were performed to design the legs. The MRV and the railway were then fabricated by precision machining. Finally, an infrared sensor was carried and tested. The results show that the unloaded MRV moved along the railway with a maximum speed of 12.2 mm/s, and the test signal was successfully acquired by the sensor on the MRV.

Ultimate Load Capacity of the Cable Tower of Liede Bridge

The cable tower of Liede Bridge is a double-column, curved-lever, arched-beam portal frame structure. Being novel and unique, the cable tower differs in complexity from traditional ones. This paper analyzes the ultimate load capacity of the cable tower through finite element calculations and model tests, which indicate that the constitutive relations applied here give a good simulation of the actual failure process of prestressed reinforced concrete. In the vertical load, horizontal load, and overloading tests, the response of the tower model to stepped loading is linear, and the test data show good repeatability. All of this suggests that the cable tower has good bearing capacity, a rational design, and a high emergency overload capacity.

Analysis of Air Quality in the Outdoor Environment of the City of Messina by an Application of the Pollution Index Method

This paper reports an analysis of outdoor air pollution in the urban centre of the city of Messina. The variations in the concentrations of the most critical pollutants (PM10, O3, CO, C6H6) and their trends with respect to climatic parameters and vehicular traffic have been studied. Linear regressions were computed to represent the relations among the pollutants, and the differences between weekend and weekday pollutant concentrations were also analyzed. In order to evaluate air pollution and its effects on human health, a method for calculating a pollution index was implemented and applied in the urban centre of the city. This index is based on the weighted mean of the concentrations of the most detrimental air pollutants relative to their limit values for the protection of human health. The analyzed pollution data were collected by the Assessorship of the Environment of the Regional Province of Messina during the year 2004. A statistical analysis of the air quality index trends is also reported.
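As a purely illustrative sketch of such a weighted-mean index (the specific weights and limit values used in the paper are not reproduced), the Python function below normalizes each pollutant concentration by its protection limit and averages the normalized values with user-supplied weights.

```python
def pollution_index(concentrations, limits, weights):
    """Weighted mean of pollutant concentrations normalized by their limit values.

    concentrations, limits, weights: dicts keyed by pollutant name.
    An index above 1.0 means the weighted average exceeds the protection limits."""
    total_weight = sum(weights[p] for p in concentrations)
    return sum(weights[p] * concentrations[p] / limits[p]
               for p in concentrations) / total_weight

# illustrative numbers only, not measured Messina data or official limit values
conc   = {"PM10": 42.0, "O3": 95.0, "CO": 1.8, "C6H6": 3.2}
limit  = {"PM10": 50.0, "O3": 120.0, "CO": 10.0, "C6H6": 5.0}
weight = {"PM10": 1.0, "O3": 1.0, "CO": 1.0, "C6H6": 1.0}
print(round(pollution_index(conc, limit, weight), 2))
```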

Generator of Hypotheses: An Approach to Data Mining Based on Monotone Systems Theory

The generator of hypotheses is a new method for data mining. It makes it possible to classify the source data automatically and produces a particular enumeration of patterns. A pattern is an expression (in a certain language) describing facts in a subset of facts. The goal is to describe the source data via patterns and/or IF...THEN rules. The evaluation criteria used are deterministic (not probabilistic). The search results are trees, a form that is easy to comprehend and interpret. The generator of hypotheses uses a very efficient algorithm based on the theory of monotone systems (MS), named MONSA (MONotone System Algorithm).

Parallelization of Ensemble Kalman Filter (EnKF) for Oil Reservoirs with Time-lapse Seismic Data

In this paper we describe the design and implementation of a parallel algorithm for data assimilation with the ensemble Kalman filter (EnKF) for the oil reservoir history matching problem. The use of a large number of observations from time-lapse seismic data leads to a long turnaround time for the analysis step, in addition to the time-consuming simulations of the realizations. For efficient parallelization it is therefore important to parallelize the analysis step as well. Our experiments show that parallelizing the analysis step in addition to the forecast step scales well, exploiting the same set of resources with only modest additional effort.
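For context, a minimal serial NumPy sketch of the EnKF analysis step in its standard perturbed-observation form is shown below; it is not the paper's parallel implementation, and all array shapes, names, and the random perturbation scheme are illustrative assumptions.

```python
import numpy as np

def enkf_analysis(X_f, d, H, R, rng=np.random.default_rng(0)):
    """Perturbed-observation EnKF analysis step.

    X_f : (n_state, n_ens) forecast ensemble
    d   : (n_obs,)         observation vector
    H   : (n_obs, n_state) observation operator
    R   : (n_obs, n_obs)   observation error covariance"""
    n_state, n_ens = X_f.shape
    A = X_f - X_f.mean(axis=1, keepdims=True)         # ensemble anomalies
    HA = H @ A
    # sample covariances in observation space and cross-covariance
    P_hh = HA @ HA.T / (n_ens - 1) + R
    P_xh = A @ HA.T / (n_ens - 1)
    K = np.linalg.solve(P_hh, P_xh.T).T               # Kalman gain (n_state, n_obs)
    # perturb observations so the analysis ensemble keeps the correct spread
    D = d[:, None] + rng.multivariate_normal(np.zeros(len(d)), R, size=n_ens).T
    return X_f + K @ (D - H @ X_f)
```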

Performance Analysis of List Scheduling in Heterogeneous Computing Systems

Given a parallel program to be executed on a heterogeneous computing system, the overall execution time of the program is determined by a schedule. In this paper, we analyze the worst-case performance of the list scheduling algorithm for scheduling tasks of a parallel program in a mixed-machine heterogeneous computing system such that the total execution time of the program is minimized. We prove tight lower and upper bounds for the worst-case performance ratio of the list scheduling algorithm. We also examine the average-case performance of the list scheduling algorithm. Our experimental data reveal that the average-case performance of the list scheduling algorithm is much better than the worst-case performance and is very close to optimal, except for large systems with large heterogeneity. Thus, the list scheduling algorithm is very useful in real applications.
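To make the algorithm under analysis concrete, here is a minimal Python sketch of list scheduling on a mixed-machine system: tasks are taken in list (priority) order and each is assigned to the machine on which it would finish earliest, given machine-dependent execution times. Precedence constraints are omitted for brevity, and the ETC matrix and task order are toy illustrations, not the paper's experimental setup.

```python
def list_schedule(etc, task_order):
    """Greedy list scheduling of independent tasks on heterogeneous machines.

    etc[t][m] is the execution time of task t on machine m (ETC matrix);
    task_order is the priority list of task indices."""
    n_machines = len(etc[0])
    ready_time = [0.0] * n_machines          # time at which each machine becomes free
    assignment = {}
    for t in task_order:
        # pick the machine on which this task would complete earliest
        finish = [ready_time[m] + etc[t][m] for m in range(n_machines)]
        m_best = min(range(n_machines), key=lambda m: finish[m])
        assignment[t] = m_best
        ready_time[m_best] = finish[m_best]
    makespan = max(ready_time)
    return assignment, makespan

# toy example: 4 tasks, 2 machines with different speeds
etc = [[2.0, 4.0], [3.0, 1.5], [1.0, 2.5], [4.0, 2.0]]
print(list_schedule(etc, task_order=[0, 1, 2, 3]))
```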

Developing Student Teachers to Be Professional Teachers

Practicum placements are a critical factor for student teachers on Education Programs. How can student teachers become professionals? This study investigated the problems, weaknesses, and obstacles of practicum placements and developed guidelines for partnership in practicum placements. In response to this issue, a partnership concept was implemented for developing student teachers into professionals. Data were collected through questionnaires on attitudes toward the problems, weaknesses, and obstacles of practicum placements of student teachers in Rajabhat universities, together with focus group interviews. The research revealed that learning management, classroom management, curriculum, assessment and evaluation, classroom action research, and teacher demeanor are important factors affecting the professional development of Education Program student teachers. Learning management planning and classroom management, covering instructional design, teaching techniques, instructional media, and student behavior management, are other important aspects influencing the professional development of student teachers.

Quality Evaluation of Compressed MRI Medical Images for Telemedicine Applications

Medical imaging modalities such as computed tomography (CT), magnetic resonance imaging (MRI), ultrasound (US), and X-ray are used to diagnose disease. These modalities provide flexible means of reviewing anatomical cross-sections and physiological state in different parts of the human body. Raw medical images have large file sizes and therefore large storage requirements, so their size must be reduced to make them practical for telemedicine applications. Image compression is thus a key factor in reducing the bit rate for transmission or storage while maintaining acceptable reproduction quality, but it naturally raises the question of how much an image can be compressed while still preserving sufficient information for a given clinical application. Many data compression techniques have been introduced. In this study, images from three different MRI examinations, of the brain, spine, and knee, were compressed and reconstructed using the wavelet transform. Subjective and objective evaluations were performed to investigate the clinical information quality of the compressed images. For the objective evaluation, the results show that the PSNR, which indicates the quality of the reconstructed image, ranges from 21.95 dB to 30.80 dB for the brain, 27.25 dB to 35.75 dB for the spine, and 26.93 dB to 34.93 dB for the knee images. For the subjective evaluation, the results show that a compression ratio of 40:1 was acceptable for the brain images, whereas 50:1 was acceptable for the spine and knee images.
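As a rough illustration of this kind of pipeline (the specific wavelet, decomposition level, and threshold used in the study are not stated here, so the choices below are assumptions), the following Python sketch compresses an image by hard-thresholding wavelet detail coefficients with PyWavelets and reports the PSNR of the reconstruction.

```python
import numpy as np
import pywt

def psnr(original, reconstructed, max_val=255.0):
    """Peak signal-to-noise ratio in dB."""
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)

def wavelet_compress(img, wavelet="db2", level=3, threshold=20.0):
    """Hard-threshold the detail coefficients of a 2-D wavelet decomposition."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(d, threshold, mode="hard") for d in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(new_coeffs, wavelet)

# usage with a synthetic image standing in for an MRI slice
img = np.random.default_rng(0).integers(0, 256, size=(256, 256)).astype(float)
rec = wavelet_compress(img)
print(f"PSNR: {psnr(img, rec):.2f} dB")
```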

An Efficient Biometric Cryptosystem using Autocorrelators

Cryptography provides a secure means of transmitting information over an insecure channel. It authenticates messages based on a key, not on the user, and requires a lengthy key to encrypt and decrypt the messages being sent and received. Such keys can, however, be guessed or cracked, and maintaining and sharing lengthy random keys for enciphering and deciphering is a critical problem in cryptographic systems. A new approach is described for generating a cryptographic key from a person's iris pattern. In biometrics, a template created by a biometric algorithm can only be authenticated by the same person. Among biometric templates, iris features can efficiently distinguish individuals and produce fewer false positives in large populations. This distribution of iris codes provides low intra-class variability, which helps the cryptosystem confidently decrypt messages with an exact match of the iris pattern. In the proposed approach, the iris features are extracted using multi-resolution wavelets, producing a 135-bit iris code for each subject that is used for encrypting and decrypting messages. Autocorrelators are used to recall the original messages from the partially corrupted data produced by the decryption process. The approach is intended to resolve the repudiation and key management problems. Results were analyzed for both a conventional iris cryptography system (CIC) and a non-repudiation iris cryptography system (NRIC), showing that the new approach provides considerably high authentication accuracy in the enciphering and deciphering processes.
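One common form of autoassociative (autocorrelation) memory that can recall a stored pattern from a partially corrupted version is sketched below in Python with bipolar patterns; it is an illustration of the general technique, not necessarily the authors' exact recall procedure.

```python
import numpy as np

def train_autocorrelator(patterns):
    """Build an autocorrelation (Hopfield-style) weight matrix from bipolar patterns."""
    patterns = np.asarray(patterns, dtype=float)      # shape (n_patterns, n_bits), entries +/-1
    W = patterns.T @ patterns
    np.fill_diagonal(W, 0.0)                          # no self-connections
    return W

def recall(W, probe, n_iter=10):
    """Iteratively clean up a corrupted bipolar probe vector."""
    state = np.asarray(probe, dtype=float).copy()
    for _ in range(n_iter):
        state = np.where(W @ state >= 0, 1.0, -1.0)
    return state

# usage: store one 8-bit pattern, corrupt two bits, then recall it
stored = np.array([[1, -1, 1, 1, -1, -1, 1, -1]], dtype=float)
W = train_autocorrelator(stored)
noisy = stored[0].copy()
noisy[[1, 5]] *= -1
print(recall(W, noisy))
```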

Linear Pocket Profile based Threshold Voltage Model for sub-100 nm n-MOSFET

This paper presents a threshold voltage model for pocket-implanted sub-100 nm n-MOSFETs that incorporates the drain and substrate bias effects using two linear pocket profiles. Two linear equations are used to model the pocket profiles along the channel surface from the source and drain edges towards the center of the n-MOSFET. The effective doping concentration is then derived and used in the threshold voltage equation, which is obtained by solving Poisson's equation in the depletion region at the surface. Simulated threshold voltages for various gate lengths fit well with experimental data already published in the literature. The simulated results are compared with two other pocket profiles used to derive threshold voltage models of n-MOSFETs. The comparison shows that the linear model has a simple, compact form that can be utilized to study and characterize pocket-implanted advanced ULSI devices.
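For illustration only (the paper's exact profile parameters and boundary values are not reproduced here), a pair of linear pocket profiles along a channel of length $L$, with pocket length $L_p$, peak pocket doping $N_{pk}$ at the source/drain edges, and substrate doping $N_{sub}$, could take a form such as

$$N_s(x) = N_{pk} - \frac{N_{pk} - N_{sub}}{L_p}\,x, \qquad 0 \le x \le L_p$$

$$N_d(x) = N_{sub} + \frac{N_{pk} - N_{sub}}{L_p}\bigl(x - (L - L_p)\bigr), \qquad L - L_p \le x \le L$$

Averaging such a piecewise-linear profile over the channel gives an effective doping of

$$N_{eff} = \frac{1}{L}\int_0^{L} N(x)\,dx = N_{sub} + \bigl(N_{pk} - N_{sub}\bigr)\frac{L_p}{L},$$

which would then replace the uniform substrate doping in the usual threshold voltage expression; the actual model in the paper may differ in these details.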

Probability Distribution of Rainfall Depth at Hourly Time-Scale

Rainfall data at fine resolution, and knowledge of its characteristics, play a major role in the efficient design and operation of agricultural, telecommunication, runoff and erosion control, and water quality control systems. This paper studies the statistical distribution of hourly rainfall depth at 12 representative stations spread across Peninsular Malaysia. Hourly rainfall data covering periods of 10 to 22 years were collected, and their statistical characteristics were estimated. Three probability distributions, namely the Generalized Pareto, Exponential, and Gamma distributions, were proposed to model hourly rainfall depth, and three goodness-of-fit tests, namely the Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared tests, were used to evaluate their fit. The results indicate that the east coast of the Peninsula receives greater rainfall depths than the west coast, although the rainfall frequency is irregular. The goodness-of-fit tests show that all three models fit the rainfall data at the 1% level of significance; however, the Generalized Pareto distribution fits better than the Exponential and Gamma distributions and is therefore recommended as the best fit.
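A minimal Python sketch of this kind of fitting and testing procedure with SciPy is shown below; the synthetic sample stands in for real station data, and the treatment of zero-rainfall hours, parameter constraints, and significance levels used in the paper are not reproduced.

```python
import numpy as np
from scipy import stats

def fit_and_test(depths):
    """Fit candidate distributions to hourly rainfall depths and apply the K-S test."""
    candidates = {
        "genpareto": stats.genpareto,
        "expon": stats.expon,
        "gamma": stats.gamma,
    }
    results = {}
    for name, dist in candidates.items():
        params = dist.fit(depths)                          # maximum-likelihood fit
        ks_stat, p_value = stats.kstest(depths, dist.cdf, args=params)
        results[name] = {"params": params, "ks": ks_stat, "p": p_value}
    return results

# synthetic positive rainfall depths standing in for real station data
rng = np.random.default_rng(1)
sample = rng.gamma(shape=0.8, scale=5.0, size=1000)
for name, r in fit_and_test(sample).items():
    print(f"{name:10s} KS={r['ks']:.3f} p={r['p']:.3f}")
```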

A Framework for Ranking Quality of Information on Weblog

The vast amount of information on the World Wide Web is created and published by many different types of providers. Unlike books and journals, most of this information is not subject to editing or peer review by experts. This lack of quality control and the explosion of web sites make the task of finding quality information on the web especially critical. Meanwhile, new tools for producing web pages, such as blogs, make this issue even more significant, because blogs provide simple content management tools that enable non-experts to build easily updatable web diaries or online journals. On the other hand, despite a decade of active research in information quality (IQ), there is still no framework for measuring information quality on blogs. This paper presents a novel experimental framework for ranking the quality of information on weblogs. The data analysis revealed seven IQ dimensions for the weblog. For each dimension, variables and related coefficients were calculated, so that the presented framework is able to assess the IQ of weblogs automatically.

Performance Analysis of Genetic Algorithm with kNN and SVM for Feature Selection in Tumor Classification

Tumor classification is a key area of research in the field of bioinformatics. Microarray technology is commonly used in the study of disease diagnosis based on gene expression levels. The main drawback of gene expression data is that it contains thousands of genes but very few samples. Feature selection methods are used to select the informative genes from the microarray and considerably improve classification accuracy. In the proposed method, a Genetic Algorithm (GA) is used for effective feature selection. Informative genes are identified based on T-statistic, Signal-to-Noise Ratio (SNR), and F-test values, and the initial candidate solutions of the GA are obtained from the top-m informative genes. The classification accuracy of the k-Nearest Neighbor (kNN) method is used as the fitness function for the GA. In this work, kNN and Support Vector Machine (SVM) classifiers are used. The experimental results show that the proposed approach performs effective feature selection: with the selected genes, the GA-kNN method achieves 100% accuracy on 4 datasets and the GA-SVM method on 5 out of 10 datasets. The GA with kNN and SVM is thus demonstrated to be an accurate method for microarray-based tumor classification.
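As an illustration of using classifier accuracy as the GA fitness function (scikit-learn, the truncation-selection GA loop, and all parameter values below are assumptions, and the paper's T-statistic/SNR/F-test seeding is not reproduced), a minimal Python sketch:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def fitness(mask, X, y, k=3):
    """kNN cross-validation accuracy on the genes selected by a binary mask."""
    if mask.sum() == 0:
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=k)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=5).mean()

def simple_ga(X, y, n_genes, pop_size=20, generations=30, rng=np.random.default_rng(0)):
    """Very small GA: truncation selection of the top half plus bit-flip mutation."""
    pop = rng.integers(0, 2, size=(pop_size, n_genes))
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: pop_size // 2]]
        children = parents.copy()
        flip = rng.random(children.shape) < 0.05      # mutation
        children[flip] ^= 1
        pop = np.vstack([parents, children])
    scores = np.array([fitness(ind, X, y) for ind in pop])
    return pop[scores.argmax()], scores.max()

# usage (X: samples x genes expression matrix, y: class labels):
# best_mask, best_acc = simple_ga(X, y, n_genes=X.shape[1])
```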

A Study of Factors Influencing the Improvement of Technology Business Incubator's Effectiveness: An Explanatory Model

In both developed and developing countries, governments play a basic role in making policies, programs, and instruments that support the development of micro, small, and medium enterprises. One of the mechanisms employed to nurture small firms for more than two decades is technology business incubation. The main aim of this research was to establish the factors influencing Technology Business Incubators' effectiveness and to build an explanatory model. Among the 56 Technology Business Incubators in Iran, 32 active incubators were selected, and 528 start-ups were chosen by stratified random sampling. The validity of the research questionnaires was determined by expert consensus, item analysis, and factor analysis, and their reliability was calculated with Cronbach's alpha. Data were then analyzed using the SPSS and LISREL software packages. Both organizational procedures and entrepreneurial behaviors were significant mediators. Organizational procedures (P < .01, β = 0.45) were a stronger mediator of the improvement of Technology Business Incubators' effectiveness than entrepreneurial behavior (P < .01, β = 0.36).

Fault Detection and Identification of COSMED K4b2 Based On PCA and Neural Network

COSMED K4b2 is a portable electrical device designed to test pulmonary function. It is ideal for many applications that require measurement of the cardio-respiratory response either in the field or in the lab, and it can deliver real-time data to a sink node or a PC base station while simultaneously storing data in memory. However, the actual sensor outputs and received data may contain errors, such as impulsive noise, which can be caused by the sensors, low batteries, the environment, or disturbances in the data acquisition process. These abnormal outputs might cause misinterpretation of the exercise or living activities of the persons being monitored. In this paper we propose an effective and feasible method to detect and identify such errors using principal component analysis (PCA) and a back-propagation (BP) neural network.
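A minimal Python sketch of the PCA part of such a scheme is given below, assuming scikit-learn and using the squared prediction error (SPE, or Q statistic) against an empirical quantile threshold to flag faulty samples; the signals, threshold choice, and the BP network used in the paper are not reproduced.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def fit_pca_detector(X_normal, n_components=3, quantile=0.99):
    """Fit PCA on fault-free training data and set an SPE control limit."""
    scaler = StandardScaler().fit(X_normal)
    Z = scaler.transform(X_normal)
    pca = PCA(n_components=n_components).fit(Z)
    residual = Z - pca.inverse_transform(pca.transform(Z))
    spe = (residual ** 2).sum(axis=1)                 # squared prediction error
    threshold = np.quantile(spe, quantile)            # empirical control limit
    return scaler, pca, threshold

def detect_faults(scaler, pca, threshold, X_new):
    """Flag samples whose reconstruction error exceeds the control limit."""
    Z = scaler.transform(X_new)
    residual = Z - pca.inverse_transform(pca.transform(Z))
    spe = (residual ** 2).sum(axis=1)
    return spe > threshold
```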