An Agent Based Dynamic Resource Scheduling Model with FCFS-Job Grouping Strategy in Grid Computing

Grid computing connects groups of clusters over high-speed networks and involves coordinating and sharing computational power, data storage and network resources across dynamic and geographically dispersed locations. Resource management and job scheduling are critical tasks in grid computing. Resource selection becomes challenging due to the heterogeneity and dynamic availability of resources. Job scheduling is an NP-complete problem, and different heuristics may be used to reach an optimal or near-optimal solution. This paper proposes a model for resource and job scheduling in a dynamic grid environment. The main focus is to maximize resource utilization and minimize the processing time of jobs. The grid resource selection strategy is based on a Max Heap Tree (MHT), which is well suited to large-scale applications; the root node of the MHT is selected for job submission. A job grouping concept is used to maximize resource utilization when scheduling jobs in grid computing. Together, the proposed resource selection model and job grouping concept enhance the scalability, robustness, efficiency and load-balancing ability of the grid.
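
As an illustration of the scheduling idea (a sketch under assumed parameters, not the authors' implementation), the following Python fragment keeps resources in a max-heap keyed on available processing speed, always submits to the heap root, and groups fine-grained FCFS jobs until they fill the capacity the root can deliver within one scheduling window; job lengths, speeds and the window length are invented for the example.

```python
# Sketch only: resources sit in a max-heap keyed on available speed (MIPS); the
# heap root is always the submission target, and FCFS jobs are grouped until the
# group would exceed what the root can process within one scheduling window.
import heapq

def group_and_dispatch(jobs_mi, resources_mips, window_s=10):
    """jobs_mi: job lengths in million instructions, in FCFS (arrival) order.
    resources_mips: available processing speeds of the grid resources."""
    def fresh_heap():
        # heapq is a min-heap, so capacities are negated to emulate a max-heap.
        h = [(-mips, rid) for rid, mips in enumerate(resources_mips)]
        heapq.heapify(h)
        return h

    heap = fresh_heap()
    neg, rid = heapq.heappop(heap)          # root of the max-heap
    capacity = -neg * window_s              # MI the root can process in one window
    dispatched, group, used = [], [], 0

    for mi in jobs_mi:                      # FCFS: preserve arrival order
        if group and used + mi > capacity:  # group is full: submit it to the root
            dispatched.append((rid, group))
            if not heap:                    # every resource used once: new round
                heap = fresh_heap()
            neg, rid = heapq.heappop(heap)
            capacity = -neg * window_s
            group, used = [], 0
        group.append(mi)
        used += mi
    if group:
        dispatched.append((rid, group))
    return dispatched

# Hypothetical jobs (MI) and resources (MIPS)
print(group_and_dispatch([200, 500, 300, 800, 100, 250], [120, 90, 60]))
```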

Environmental Analysis of the Zinc Oxide Nanophotocatalyst Synthesis

Nanophotocatalysts such as titanium (TiO2), zinc (ZnO) and iron (Fe2O3) oxides can be used for the oxidation of organic pollutants, among many other applications. However, among the challenges for technological application (scale-up) of nanotechnology developments, two aspects remain little explored: the environmental risk of nanomaterial preparation methods, and the variability of nanomaterial properties and/or performance. An environmental analysis was performed for six different ZnO nanoparticle synthesis methods and showed that it is possible to identify the most environmentally compatible process even in laboratory-scale research. The obtained ZnO nanoparticles were tested as photocatalysts and increased the degradation rate of the Rhodamine B dye by up to 30 times.

Modeling “Web of Trust” with Web 2.0

“Web of Trust” is one of the recognized goals for Web 2.0. It aims to make it possible for people, including organizations, businesses and individual users, to take responsibility for what they publish on the web. These objectives, among others, drive most of the technologies and protocols recently standardized by the governing bodies. One of the great advantages of the Web infrastructure is the decentralization of publication. The primary motivation behind Web 2.0 is to help people add content for Collective Intelligence (CI) while providing mechanisms to link content with people for evaluation and accountability of information. Such a structure interconnects users and content so that users can use content to find participants and vice versa. This paper proposes a conceptual information storage and linking model, based on a decentralized information structure, that links content and people together. The model uses FOAF, Atom, RDF and RDFS and can be used as a blueprint to develop Web 2.0 applications for any e-domain. However, the primary target of this paper is the online trust evaluation domain. The proposed model aims to assist individuals in establishing a “Web of Trust” in the online trust domain.
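
As a hedged illustration of the kind of linking the model describes (the namespace, URIs and property names below are hypothetical, not the paper's concrete schema), a few rdflib statements can connect a FOAF person with a published entry so that content and its accountable author reference each other:

```python
# Minimal sketch: link a FOAF person to an Atom-style entry she published, so the
# content can be traced back to an accountable author and vice versa.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import FOAF, RDF, RDFS

EX = Namespace("http://example.org/")          # hypothetical application namespace

g = Graph()
g.bind("foaf", FOAF)
g.bind("ex", EX)

alice = URIRef("http://example.org/people/alice")
entry = URIRef("http://example.org/entries/42")

g.add((alice, RDF.type, FOAF.Person))
g.add((alice, FOAF.name, Literal("Alice")))
g.add((alice, FOAF.knows, URIRef("http://example.org/people/bob")))

g.add((entry, RDF.type, EX.Entry))             # stands in for an Atom entry
g.add((entry, RDFS.label, Literal("Review of vendor X")))
g.add((entry, EX.publishedBy, alice))          # content -> person link
g.add((alice, EX.published, entry))            # person -> content link (navigable both ways)

print(g.serialize(format="turtle"))
```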

Bi-Criteria Latency Optimization of Intra- and Inter-Autonomous System Traffic Engineering

Traffic Engineering (TE) is the process of controlling how traffic flows through a network in order to facilitate efficient and reliable network operations while simultaneously optimizing network resource utilization and traffic performance. TE improves the management of data traffic within a network and provides better utilization of network resources. Many research works consider intra- and inter-Autonomous System (AS) Traffic Engineering separately, but in reality one influences the other; hence the network performance of both intra- and inter-AS traffic is not optimized properly. To achieve better joint optimization of both intra- and inter-AS TE, we propose a joint optimization technique that considers intra-AS features during inter-AS TE and vice versa. This work focuses on one important criterion, latency, both within an AS and between ASes, and proposes a bi-criteria latency optimization model. Overall network performance, in terms of latency, can thus be improved with this joint optimization technique.
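
A minimal sketch of the bi-criteria idea (illustrative only; the paper's actual formulation is not reproduced here): egress selection that weighs intra-AS latency to each candidate egress together with the inter-AS latency from that egress onward, rather than optimizing either criterion in isolation.

```python
# Illustrative sketch: pick an egress router for a destination prefix by jointly
# weighing intra-AS latency to the egress and inter-AS latency beyond it, instead
# of hot-potato routing on the intra-AS metric alone.
def pick_egress(intra_latency_ms, inter_latency_ms, alpha=0.5):
    """intra_latency_ms / inter_latency_ms: dicts mapping egress -> latency (ms).
    alpha weighs the two criteria of the bi-criteria objective."""
    best, best_cost = None, float("inf")
    for egress in intra_latency_ms:
        cost = alpha * intra_latency_ms[egress] + (1 - alpha) * inter_latency_ms[egress]
        if cost < best_cost:
            best, best_cost = egress, cost
    return best, best_cost

intra = {"egress_A": 4.0, "egress_B": 9.0}       # assumed measurements
inter = {"egress_A": 60.0, "egress_B": 25.0}
print(pick_egress(intra, inter, alpha=0.5))      # egress_B wins despite higher intra-AS latency
```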

Active Packaging Influence on Shelf Life Extension of Sliced Wheat Bread

The research object was wheat bread. Experiments were carried out at the Faculty of Food Technology of the Latvia University of Agriculture. Active packaging in combination with a modified atmosphere (MAP, 60% CO2 and 40% N2) was examined and compared with traditional packaging in ambient air. Polymer Multibarrier 60, PP and OPP bags were used. The influence of an iron-based oxygen absorber in 100 cc sachets (Mitsubishi Gas Chemical Europe Ageless®) on quality during the shelf life of wheat bread was tested. Samples of 40±4 g were packaged in polymer pouches (110 mm x 120 mm), hermetically sealed with a MULTIVAC C300 vacuum chamber machine, and stored at room temperature (+21.0±0.5 °C). The physicochemical properties – weight losses, moisture content, hardness, pH, colour, and changes of the atmosphere content (CO2 and O2) in the headspace of the packs – and the microbial conditions were analysed before packaging and on the 7th, 14th, 21st and 28th day of storage.

A New True RMS-to-DC Converter in CMOS Technology

This paper presents a new true RMS-to-DC converter circuit based on a square-root-domain squarer/divider. The circuit is designed by employing an up-down translinear loop built with MOSFET transistors operating in the strong-inversion saturation region. The converter offers the advantages of two-quadrant input current, low circuit complexity, a low supply voltage (1.2 V) and immunity from the body effect. The circuit has been simulated with HSPICE. The simulation results conform to the theoretical analysis and show the benefits of the proposed circuit.
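
The implicit-computation principle behind such converters can be sketched behaviorally (a numerical illustration, not the transistor-level circuit): the squarer/divider produces x²/y, the low-pass filter averages it, and the feedback settles at y = sqrt(mean(x²)), the true RMS value.

```python
# Behavioral sketch of implicit RMS computation: y <- lowpass(x^2 / y); at steady
# state y equals the RMS of the input. Sample rate, signal and time constant are
# arbitrary illustration values.
import math

def true_rms(samples, tau_samples=500.0):
    y = 1e-3                           # small non-zero start to avoid divide-by-zero
    a = 1.0 / tau_samples              # first-order low-pass coefficient
    for x in samples:
        y += a * (x * x / y - y)       # squarer/divider output averaged by the filter
    return y

# 1 kHz sine of amplitude 1 sampled at 100 kHz for 50 ms; its true RMS is ~0.707
sig = [math.sin(2 * math.pi * 1e3 * n / 1e5) for n in range(5000)]
print(true_rms(sig))
```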

Performance and Availability Analyses of PV Generation Systems in Taiwan

This article applies the monthly final energy yield and failure data of 202 PV systems installed in Taiwan to analyze PV operational performance and system availability. The data were collected by the Industrial Technology Research Institute through manual records. Bad-data detection and failure-data estimation approaches are proposed to guarantee the quality of the received information. The performance ratio and system availability are then calculated and compared with those of other countries. The results indicate that the average performance ratio of Taiwan's PV systems is 0.74 and the availability is 95.7%. These results are similar to those of Germany, Switzerland, Italy and Japan.
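
For reference, the standard yield-based metrics mentioned above can be computed as follows; the figures in this sketch are invented and only chosen to land near the reported averages, they are not the measured Taiwanese data.

```python
# Hedged illustration of the usual performance-ratio and availability definitions
# (PR = final yield / reference yield; availability = time in service / period).
def performance_ratio(energy_kwh, rated_kwp, insolation_kwh_m2, g_stc_kw_m2=1.0):
    final_yield = energy_kwh / rated_kwp               # kWh per kWp (Y_f)
    reference_yield = insolation_kwh_m2 / g_stc_kw_m2  # peak-sun hours (Y_r)
    return final_yield / reference_yield               # PR = Y_f / Y_r

def availability(hours_in_service, hours_in_period):
    return hours_in_service / hours_in_period

# Hypothetical month: 3 kWp array, 330 kWh produced, 148 kWh/m2 in-plane insolation
print(round(performance_ratio(330.0, 3.0, 148.0), 2))    # -> 0.74
print(round(availability(689.0, 720.0), 3))               # -> 0.957
```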

Contourlet versus Wavelet Transform for a Robust Digital Image Watermarking Technique

In this paper, a watermarking algorithm that uses the wavelet transform with Multiple Description Coding (MDC) and Quantization Index Modulation (QIM) concepts is introduced. The paper also investigates the role of the Contourlet Transform (CT) versus the Wavelet Transform (WT) in providing robust image watermarking. Two measures are utilized in the comparison between the wavelet-based and contourlet-based methods: Peak Signal to Noise Ratio (PSNR) and Normalized Cross-Correlation (NCC). Experimental results reveal that the introduced algorithm is robust against different attacks and performs well compared to the contourlet-based algorithm.
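
A minimal sketch of QIM embedding and blind extraction on a list of transform coefficients (the step size and coefficient values are illustrative; the paper applies the idea to wavelet and contourlet subbands together with MDC):

```python
# QIM sketch: each watermark bit selects one of two interleaved quantization
# lattices; extraction re-quantizes on both lattices and keeps the closer one.
def qim_embed(coeffs, bits, delta=8.0):
    out = []
    for c, b in zip(coeffs, bits):
        d = 0.0 if b == 0 else delta / 2.0            # dither selects the lattice
        out.append(delta * round((c - d) / delta) + d)
    return out

def qim_extract(coeffs, delta=8.0):
    bits = []
    for c in coeffs:
        e0 = abs(c - delta * round(c / delta))
        e1 = abs(c - (delta * round((c - delta / 2.0) / delta) + delta / 2.0))
        bits.append(0 if e0 <= e1 else 1)
    return bits

marked = qim_embed([13.2, -5.7, 20.1, 3.3], [1, 0, 1, 1])
print(marked)                 # quantized coefficients carrying the watermark
print(qim_extract(marked))    # -> [1, 0, 1, 1]
```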

Prediction of Tool and Nozzle Flow Behavior in Ultrasonic Machining Process

The use of hard and brittle materials has become increasingly extensive in recent years, so processing these materials for parts fabrication has become a challenging problem. Machining hard, brittle materials with the traditional metal-cutting techniques that use abrasive wheels is time-consuming, and the tool suffers excessive wear. However, if ultrasonic energy is applied to the machining process and coupled with hard abrasive grits, hard and brittle materials can be machined effectively; the ultrasonic machining process is mostly used for brittle materials. The present research work has developed models using a finite element approach to predict the mechanical stresses and strains produced in the tool during the ultrasonic machining process. The flow behavior of the abrasive slurry coming out of the nozzle has also been simulated using the ANSYS CFX module. Different abrasives of different grit sizes were used in the experimental work.

Product-Based Industrial Information Systems (Application to the Steel Industry)

This paper presents a simple and effective approach to the design and implementation of Industrial Information Systems (IIS) aimed at controlling the characteristics of each individual product manufactured on a production line, as well as its manufacturing conditions. The particular products considered in this work are large steel strips that are coiled just after their manufacture. However, the approach is directly applicable to coiled strips in other industries, such as paper, textile, aluminum, etc. These IIS provide very detailed information on each manufactured product, which complements the general information managed by the ERP system of the production line. In spite of the high importance of this type of IIS for guaranteeing and improving the quality of the products manufactured in many industries, there are very few works about them in the technical literature. For this reason, this paper represents an important contribution to the development of this type of IIS, providing guidelines for their design, implementation and exploitation.

Lung Nodule Detection in CT Scans

In this paper we describe a computer-aided diagnosis (CAD) system for automated detection of pulmonary nodules in computed-tomography (CT) images. After extracting the pulmonary parenchyma using a combination of image processing techniques, a region growing method is applied to detect nodules based on 3D geometric features. We applied the CAD system to CT scans collected in a screening program for lung cancer detection. Each scan consists of a sequence of about 300 slices stored in DICOM (Digital Imaging and Communications in Medicine) format. All malignant nodules were detected and a low false-positive detection rate was achieved.
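
A much simplified 2D sketch of region growing from a seed point (the CAD system operates on full 3D volumes and applies geometric feature tests afterwards; the seed, threshold and toy image below are illustrative only):

```python
# Flood-fill style region growing: starting from a seed, absorb 4-connected
# neighbours whose intensity stays within a threshold of the seed value.
from collections import deque

def region_grow(image, seed, threshold):
    rows, cols = len(image), len(image[0])
    seed_val = image[seed[0]][seed[1]]
    region, frontier = set(), deque([seed])
    while frontier:
        r, c = frontier.popleft()
        if (r, c) in region or not (0 <= r < rows and 0 <= c < cols):
            continue
        if abs(image[r][c] - seed_val) > threshold:
            continue
        region.add((r, c))
        frontier.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return region

# Toy "slice" in Hounsfield-like units: a bright blob on a dark lung background.
img = [[-900, -900, -900, -900],
       [-900,  -50,  -40, -900],
       [-900,  -60, -900, -900],
       [-900, -900, -900, -900]]
nodule = region_grow(img, seed=(1, 1), threshold=100)
print(sorted(nodule))     # -> [(1, 1), (1, 2), (2, 1)]
```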

On Methodologies for Analysing Sickness Absence Data: An Insight into a New Method

Sickness absence represents a major economic and social issue. Analysis of sick leave data is a recurrent challenge for analysts because of the complexity of the data structure, which is often time dependent, highly skewed and clumped at zero. Ignoring these features when making statistical inference is likely to be inefficient and misguided, and traditional approaches do not address these problems. In this study, we discuss modelling methodologies in terms of the statistical techniques for addressing the difficulties of sick leave data. We also introduce and demonstrate a new method by performing a longitudinal assessment of long-term absenteeism, using as a working example a large registration dataset from the Helsinki Health Study covering municipal employees in Finland during the period 1990-1999. We present a comparative study on model selection and a critical analysis of the temporal trends, the occurrence and the degree of long-term sickness absences among municipal employees. The strengths of this working example include the large sample size over a long follow-up period, providing strong evidence in support of the new model. Our main goal is to propose a way to select an appropriate model, to introduce a new methodology for analysing sickness absence data, and to demonstrate model applicability to complicated longitudinal data.
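
One simple way to respect the zero-clumped, skewed structure is a two-part (hurdle-style) summary, modelling occurrence and magnitude separately; the sketch below uses fabricated counts and is only meant to illustrate the idea, not the specific model developed in the paper.

```python
# Two-part treatment of zero-clumped sick-leave counts: part one models whether
# any absence occurs at all, part two summarises the positive part on a log scale
# because it is highly skewed.
import math

days = [0, 0, 0, 2, 0, 15, 0, 3, 0, 0, 45, 1, 0, 7, 0, 0, 120, 4, 0, 0]  # fake data

# Part 1: probability of any absence in the period.
p_any = sum(1 for d in days if d > 0) / len(days)

# Part 2: geometric mean of the positive absence lengths.
positive = [d for d in days if d > 0]
log_mean = sum(math.log(d) for d in positive) / len(positive)
geometric_mean_days = math.exp(log_mean)

print(round(p_any, 2), round(geometric_mean_days, 1))
```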

The Influence of Preprocessing Parameters on Text Categorization

Text categorization (the assignment of texts in natural language into predefined categories) is an important and extensively studied problem in Machine Learning. Currently, popular techniques developed to deal with this task include many preprocessing and learning algorithms, many of which in turn require tuning nontrivial internal parameters. Although partial studies are available, many authors fail to report the values of the parameters they use in their experiments, or the reasons why these values were chosen over others. The goal of this work, then, is to provide a more thorough comparison of preprocessing parameters and their mutual influence, and to report interesting observations and results.
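
As an example of the kind of parameter combinations under study (the grid values here are arbitrary illustrations, not the settings evaluated in the paper), a small sweep over stop-word removal, n-gram range and document-frequency cut-off already changes the feature space considerably:

```python
# Sweep a few common preprocessing parameters and report the resulting vocabulary
# size; in a full study each configuration would feed a classifier and be scored.
from sklearn.feature_extraction.text import TfidfVectorizer

docs = ["The quick brown fox", "Foxes are quick and brown", "Slow red turtles"]

for stop_words in (None, "english"):          # stop-word removal off/on
    for ngram_range in ((1, 1), (1, 2)):      # unigrams vs. unigrams + bigrams
        for min_df in (1, 2):                 # minimum document frequency cut-off
            vec = TfidfVectorizer(lowercase=True,
                                  stop_words=stop_words,
                                  ngram_range=ngram_range,
                                  min_df=min_df)
            X = vec.fit_transform(docs)
            print(stop_words, ngram_range, min_df, "-> vocabulary size", X.shape[1])
```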

A Hybrid Approach for Quantification of Novelty in Rule Discovery

Rule discovery is an important technique for mining knowledge from large databases. The use of objective measures for discovering interesting rules leads to another data mining problem, although of reduced complexity. Data mining researchers have studied subjective measures of interestingness to reduce the volume of discovered rules and ultimately improve the overall efficiency of the KDD process. In this paper we study the novelty of discovered rules as a subjective measure of interestingness. We propose a hybrid approach that uses objective and subjective measures to quantify the novelty of discovered rules in terms of their deviations from known rules. We analyze the types of deviation that can arise between two rules and categorize the discovered rules according to a user-specified threshold. We implement the proposed framework and experiment with some public datasets, and the experimental results are quite promising.
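
A hedged sketch of deviation-based novelty scoring (the Jaccard-style distance and the threshold are illustrative choices, not the paper's exact quantification): each rule is written as a pair of antecedent and consequent item sets, and a discovered rule is labelled by its smallest deviation from any known rule.

```python
# Compare a discovered rule against known rules and categorize it by deviation.
def deviation(rule_a, rule_b):
    """Return (antecedent_deviation, consequent_deviation) as Jaccard distances."""
    def jaccard_distance(x, y):
        x, y = set(x), set(y)
        return 1.0 - len(x & y) / len(x | y)
    return jaccard_distance(rule_a[0], rule_b[0]), jaccard_distance(rule_a[1], rule_b[1])

def categorize(discovered, known_rules, threshold=0.5):
    """Label a discovered rule as 'known', 'deviating' or 'novel'."""
    best = min(sum(deviation(discovered, k)) / 2.0 for k in known_rules)
    if best == 0.0:
        return "known"
    return "deviating" if best <= threshold else "novel"

known = [({"bread", "butter"}, {"milk"}), ({"beer"}, {"chips"})]
print(categorize(({"bread"}, {"milk"}), known))       # small deviation -> 'deviating'
print(categorize(({"laptop"}, {"mouse"}), known))     # no overlap      -> 'novel'
```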

AI Applications to Metal Stamping Die Design – A Review

Metal stamping die design is a complex, experience-based and time-consuming task. Various artificial intelligence (AI) techniques are being used by researchers worldwide for stamping die design to reduce complexity, dependence on human expertise and the time taken in the design process, as well as to improve design efficiency. This paper presents a comprehensive review of applications of AI techniques to manufacturability evaluation of sheet metal parts, die design and process planning of metal stamping dies. Further, the salient features of major research works published in the area of metal stamping are presented in tabular form, and the scope for future research is identified.

A Hybrid Approach for Color Image Quantization Using K-means and Firefly Algorithms

Color image quantization (CQ) is an important problem in computer graphics and image processing. The aim of quantization is to reduce the number of colors in an image with minimum distortion. Clustering is a widely used technique for color quantization; all colors in an image are grouped into a small number of clusters. In this paper, we propose a new hybrid approach for color quantization using the firefly algorithm (FA) and the K-means algorithm. The firefly algorithm is a swarm-based algorithm that can be used for solving optimization problems. The proposed method can overcome the drawbacks of both algorithms, such as the local optima problem of K-means and the premature convergence of the firefly algorithm. Experiments on three commonly used images show that the proposed algorithm surpasses both the baseline K-means clustering and the original firefly algorithm.
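
For orientation, plain K-means colour quantization on a handful of RGB pixels looks as follows; in the hybrid approach the firefly algorithm would supply and refine the initial palette instead of the random initialization used here (pixel values are illustrative).

```python
# Baseline K-means colour quantization: cluster RGB pixels, use the cluster
# centres as the reduced palette, and map each pixel to its palette colour.
import numpy as np
from sklearn.cluster import KMeans

pixels = np.array([[250, 10, 10], [240, 20, 5], [10, 10, 245],
                   [5, 25, 250], [130, 130, 130], [120, 125, 135]], dtype=float)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pixels)
palette = kmeans.cluster_centers_.round().astype(int)   # the quantized colour palette
quantized = palette[kmeans.labels_]                      # each pixel mapped to its palette colour

print(palette)
print(quantized)
```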

Mode III Interlaminar Fracture in Woven Glass/Epoxy Composite Laminates

In the present study, the fracture behavior of woven fabric-reinforced glass/epoxy composite laminates under mode III crack growth was experimentally investigated and numerically modeled. Two methods were used for the calculation of the strain energy release rate: the experimental compliance calibration (CC) method and the Virtual Crack Closure Technique (VCCT). To achieve this aim, the Edge Crack Torsion (ECT) test was used to evaluate fracture toughness in mode III loading (out-of-plane shear) at different crack lengths. Load-displacement curves and the associated energy release rates were obtained for the various cases of interest. To calculate the fracture toughness JIII, two criteria were considered, the non-linearity and maximum points of the load-displacement curve, and it was observed that JIII increases with crack length. Both the experimental compliance method and the virtual crack closure technique proved applicable for the interpretation of the fracture mechanics data of woven glass/epoxy laminates in mode III.
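
The compliance calibration route can be sketched with the Irwin-Kies relation G = P²/(2B)·dC/da (the numbers below are invented; for the ECT specimen the exact compliance fit and width term follow the test procedure rather than this simplified linear fit):

```python
# Compliance calibration sketch: fit measured compliance C against crack length a,
# then evaluate G = P^2 / (2*B) * dC/da at the critical load.
import numpy as np

a = np.array([20.0, 25.0, 30.0, 35.0])          # crack lengths, mm
C = np.array([0.010, 0.012, 0.0145, 0.017])     # measured compliance, mm/N

dCda, _ = np.polyfit(a, C, 1)                   # slope of the linear C(a) fit, 1/N
P_crit = 850.0                                   # critical load, N
B = 38.0                                         # specimen width, mm

G_III = P_crit ** 2 / (2.0 * B) * dCda           # N/mm, numerically equal to kJ/m^2
print(round(G_III, 3), "kJ/m^2")
```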

Visualisation and Navigation in Large Scale P2P Service Networks

In Peer-to-Peer service networks, where peers offer any kind of publicly available services or applications, intuitive navigation through all services in the network becomes more difficult as the number of services increases. In this article, a concept is discussed that enables users to intuitively browse and use large-scale P2P service networks. The concept extends the idea of creating virtual 3D environments solely based on Peer-to-Peer technologies. Aside from browsing, users shall be able to emphasize services of interest using their own semantic criteria. The appearance of the virtual world shall intuitively reflect network properties that may be of interest to the user. Additionally, the concept comprises options for load and traffic balancing. In this article, the requirements concerning the underlying infrastructure and the graphical user interface are defined. First impressions of the appearance of future systems are presented and the next steps towards a prototypical implementation are discussed.

A Single-chip Proportional to Absolute Temperature Sensor Using CMOS Technology

Nowadays there is a trend for electronic circuit designers to integrate all system components on a single chip. This paper proposes the design of a single-chip proportional to absolute temperature (PTAT) sensor, including a voltage reference circuit, using CEDEC 0.18 µm CMOS technology. It is a challenge to design a single-chip, wide-range, linear-response temperature sensor for many applications. The channel widths of the compensation transistor and the reference transistor are critical for the design of the PTAT temperature sensor circuit. The designed temperature sensor shows excellent linearity between -100 °C and 200 °C, and the sensitivity is about 0.05 mV/°C. The chip is designed to operate from a single 1.6 V supply voltage.
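
For context, the PTAT principle commonly used in such sensors rests on the junction-voltage difference ΔVbe = (kT/q)·ln(n), which is linear in absolute temperature; the current-density ratio n and the unity gain in this sketch are illustrative and not taken from the paper, whose reported sensitivity also reflects the circuit's own scaling.

```python
# Back-of-the-envelope PTAT calculation: the voltage difference of two junctions
# operated at a 1:n current-density ratio grows linearly with absolute temperature.
import math

k = 1.380649e-23      # Boltzmann constant, J/K
q = 1.602176634e-19   # elementary charge, C

def ptat_voltage_mv(temp_c, n=8, gain=1.0):
    """PTAT core voltage in mV at temp_c degrees Celsius."""
    T = temp_c + 273.15
    return gain * (k * T / q) * math.log(n) * 1e3

# Sensitivity in mV/°C is the per-degree difference (constant, since V is linear in T).
print(round(ptat_voltage_mv(28.0) - ptat_voltage_mv(27.0), 3))   # ~0.179 mV/°C for n = 8
```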

IVE: Virtual Humans’ AI Prototyping Toolkit

The IVE toolkit has been created to facilitate research, education and development in the field of virtual storytelling and computer games. Primarily, the toolkit is intended for modelling action selection mechanisms of virtual humans, investigating level-of-detail AI techniques for large virtual environments, and exploring joint behaviour and the role-passing technique (Sec. V). Additionally, the toolkit can be used as AI middleware without any changes. A key feature of IVE is that it serves for prototyping both the AI and the virtual worlds themselves. The purpose of this paper is to describe IVE's features in general and to present our current work - including an educational game - on this platform.