An Innovative Wireless Sensor Network Protocol Implementation using a Hybrid FPGA Technology

Traditional development of wireless sensor network motes is generally based on a System-on-Chip (SoC) platform. Such an approach faces three main drawbacks: a lack of flexibility in development due to the limited resources and rigid architecture of the SoC; poor evolvability and portability versus performance when specific micro-controller architecture features are used; and the rapid obsolescence of the micro-controller compared to the long lifetime of power plants and other industrial installations. To overcome these drawbacks, we have explored a new approach to developing wireless sensor network motes using a hybrid FPGA technology. The application of this approach is illustrated through the implementation of an innovative wireless sensor network protocol called OCARI.

Advanced Image Analysis Tools Development for the Early Stage Bronchial Cancer Detection

Autofluorescence (AF) bronchoscopy is an established method for detecting dysplasia and carcinoma in situ (CIS). For this reason the “Sotiria” Hospital uses the Karl Storz D-light system. However, in early tumor stages the lesions are not easily visualized. Using a PC, we analyzed the captured color images with tools we developed in Matlab®. We used statistical methods based on texture analysis, signal processing methods based on Gabor models, and conversion algorithms between device-dependent color spaces. We believe these tools reduce the error made by the naked eye and thus improve patients' quality of life.
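
As a rough illustration of the kind of processing involved (not the authors' Matlab implementation), the following Python sketch applies a Gabor filter bank and a device-dependent colour-space conversion to a captured frame using scikit-image; the file name, filter parameters and thresholds are hypothetical.

    import numpy as np
    from skimage import io, color
    from skimage.filters import gabor

    # Load a captured bronchoscopy frame (hypothetical file name).
    rgb = io.imread("af_frame.png")[:, :, :3] / 255.0

    # Conversion between device-dependent colour spaces (RGB -> HSV here).
    hsv = color.rgb2hsv(rgb)

    # Texture analysis: Gabor filter bank on the green channel, which carries
    # most of the autofluorescence contrast.
    green = rgb[:, :, 1]
    responses = []
    for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
        real, imag = gabor(green, frequency=0.15, theta=theta)
        responses.append(np.sqrt(real ** 2 + imag ** 2))

    # Simple texture-energy map: maximum Gabor magnitude over orientations.
    texture_energy = np.max(responses, axis=0)

    # Flag pixels with an unusual red/green ratio and high texture energy
    # as candidate abnormal regions (illustrative thresholds only).
    rg_ratio = rgb[:, :, 0] / (rgb[:, :, 1] + 1e-6)
    candidates = (rg_ratio > 1.2) & (texture_energy > texture_energy.mean())
    print("candidate pixels:", int(candidates.sum()))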

Analysis of Linked in Series Servers with Blocking, Priority Feedback Service and Threshold Policy

The use of buffer thresholds, blocking and adequate service strategies are well-known techniques for traffic congestion control in computer networks. This motivates the study of series queues with blocking, feedback (service under a Head-of-Line (HoL) priority discipline) and finite-capacity buffers with thresholds. In this paper, the external traffic is modelled using a Poisson process and the service times are exponentially distributed. We consider a three-station network with two finite buffers, for which a set of thresholds (tm1 and tm2) is defined. This computer network behaves as follows. A task that finishes its service at station B is sent back to station A for re-processing with a feedback probability σ. When the number of tasks in the second buffer exceeds the threshold tm2 and the number of tasks in the first buffer is less than tm1, the fed-back task is served under the HoL priority discipline. Otherwise, a “no two priority services in succession” rule is applied to fed-back tasks, preventing a possible overflow of the first buffer. Using an open Markovian queuing scheme with blocking, priority feedback service and thresholds, a closed-form, cost-effective analytical solution is obtained. The model of servers linked in series is very accurate: it is derived directly from a two-dimensional state graph and a set of steady-state equations, followed by the calculation of the main measures of effectiveness. Consequently, efficient expressions with low computational cost are obtained. Based on numerical experiments and the collected results, we conclude that the proposed model with blocking, feedback and thresholds can provide accurate performance estimates of networks of servers linked in series.
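
A minimal numerical sketch of the underlying idea, assuming a simplified two-station tandem queue with finite buffers, blocking and Bernoulli feedback (the threshold-based HoL priority of the paper is omitted): the continuous-time Markov chain generator is built over states (n1, n2) and the steady-state distribution is solved directly, in the spirit of the two-dimensional state graph mentioned above. All rates and buffer sizes are hypothetical.

    import numpy as np

    lam, mu1, mu2, sigma = 1.0, 2.0, 1.5, 0.3   # arrival rate, service rates, feedback prob.
    N1, N2 = 5, 4                               # finite buffer capacities

    states = [(n1, n2) for n1 in range(N1 + 1) for n2 in range(N2 + 1)]
    index = {s: i for i, s in enumerate(states)}
    Q = np.zeros((len(states), len(states)))

    def add(src, dst, rate):
        Q[index[src], index[dst]] += rate

    for (n1, n2) in states:
        if n1 < N1:                              # external Poisson arrival to station A
            add((n1, n2), (n1 + 1, n2), lam)
        if n1 > 0 and n2 < N2:                   # A completes; task moves to B (blocked if B is full)
            add((n1, n2), (n1 - 1, n2 + 1), mu1)
        if n2 > 0:
            if n1 < N1:                          # B completes; feedback to A with probability sigma
                add((n1, n2), (n1 + 1, n2 - 1), mu2 * sigma)
                add((n1, n2), (n1, n2 - 1), mu2 * (1 - sigma))
            else:                                # first buffer full: the completed task departs
                add((n1, n2), (n1, n2 - 1), mu2)

    np.fill_diagonal(Q, -Q.sum(axis=1))

    # Steady state: solve pi Q = 0 together with the normalisation sum(pi) = 1.
    A = np.vstack([Q.T, np.ones(len(states))])
    b = np.zeros(len(states) + 1); b[-1] = 1.0
    pi = np.linalg.lstsq(A, b, rcond=None)[0]

    mean_n1 = sum(p * s[0] for p, s in zip(pi, states))
    mean_n2 = sum(p * s[1] for p, s in zip(pi, states))
    print(f"mean number of tasks: station A = {mean_n1:.3f}, station B = {mean_n2:.3f}")

From the same steady-state vector, other measures of effectiveness (blocking probabilities, throughput, mean response time) follow directly.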

Enhancing Efficiency for Reducing Sugar from Cassava Bagasse by Pretreatment

Cassava bagasse is one of the major biomass wastes from the starch processing industry in Thailand, containing a high starch content of about 60%. The objective of this study was to investigate the optimal conditions for hydrothermally pretreating cassava bagasse with or without acid addition. The reducing sugar yield of the pretreated samples was measured either directly or after enzymatic hydrolysis (alpha-amylase). For enzymatic hydrolysis, the highest reducing sugar content was obtained after hydrothermal pretreatment at 125 °C for 30 min. The results show that pretreating cassava bagasse increased the efficiency of enzymatic hydrolysis. For acid hydrolysis, pretreating cassava bagasse with sulfuric acid at 120 °C for 60 min gave the maximum reducing sugar yield. In this study, sulfuric acid had a greater capacity for hydrolyzing cassava bagasse than phosphoric acid. In comparison, dilute acid hydrolysis provided a higher yield of reducing sugar than enzymatic hydrolysis combined with hydrothermal pretreatment. Nevertheless, enzymatic hydrolysis combined with hydrothermal pretreatment remains an alternative for enhancing the efficiency of reducing sugar production from cassava bagasse.

Grid Learning; Computer Grid Joins to e-Learning

With the development of communications and web-based technologies in recent years, e-Learning has become very important and is seen as one of the most dynamic teaching methods. Grid computing is a pattern for increasing the computing power and storage capacity of a system, based on hardware and software resources shared across a network with a common purpose. In this article we study grid architecture and describe and analyze its different layers. We then introduce a new architecture suitable for e-Learning that is based on the grid network, which we therefore call the Grid Learning Architecture. The various sections and layers of the proposed architecture are analyzed, especially the grid middleware layer, which plays the key role: it is the heart of the grid learning architecture, and without it, e-Learning based on grid architecture is not feasible.

Importance of Pastoral Human Factor Overloading in Land Desertification: Case Studies in Northeastern Libya

Grazing and pastoral overloading through human factors result in significant land desertification. Failure to treat desertification as a serious problem can lead to an environmental disaster because of the damage caused by land encroachment. The soil in residential and urban areas is affected as a result of the deterioration of vegetation. Overgrazing, or grazing on open and unregulated land, is practiced in these areas almost throughout the year, especially during the growth cycle of edible plants, thereby leading to their disappearance. In addition, the large number of livestock in these areas exceeds the capacity of the pastures because of pastoral land overloading, which results in deterioration and desertification in the region. The increasing rarity and extinction of some edible plants in the region and the emergence of plants unsuitable for grazing must also be taken into consideration, along with the occurrence of dust and sand storms during the dry seasons (summer to autumn) due to the degradation of vegetation. These results show that strategic plans and regulations that protect the environment from desertification must be developed. Increased pastoral load is therefore a key human factor in the deterioration of vegetation cover, leading to land desertification in this region.

A Comparison of Fuzzy Clustering Algorithms to Cluster Web Messages

Our objective in this paper is to propose an approach capable of clustering web messages. The clustering is carried out by assigning, with a certain probability, texts written by the same web user to the same cluster, based on stylometric features and using fuzzy clustering algorithms. The focus of the present work is on comparing the most popular algorithms in fuzzy clustering theory, namely Fuzzy C-Means, Possibilistic C-Means and Fuzzy Possibilistic C-Means.
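
As a minimal sketch of one of the compared algorithms, the following implements the standard Fuzzy C-Means update equations (fuzzifier m, alternating centre and membership updates) on synthetic feature vectors; the stylometric feature extraction itself is not shown and the data below are random placeholders.

    import numpy as np

    def fuzzy_c_means(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
        """Standard FCM: returns cluster centres V and membership matrix U (c x n)."""
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        U = rng.random((c, n))
        U /= U.sum(axis=0)                               # memberships sum to 1 per sample
        for _ in range(max_iter):
            Um = U ** m
            V = Um @ X / Um.sum(axis=1, keepdims=True)   # weighted cluster centres
            d = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2) + 1e-10
            U_new = 1.0 / (d ** (2.0 / (m - 1.0)))
            U_new /= U_new.sum(axis=0)                   # normalised memberships
            if np.max(np.abs(U_new - U)) < tol:
                U = U_new
                break
            U = U_new
        return V, U

    # Placeholder "stylometric" feature vectors for a set of web messages.
    X = np.vstack([np.random.randn(50, 8) + 2, np.random.randn(50, 8) - 2])
    V, U = fuzzy_c_means(X, c=2)
    print("hard assignment of first 5 messages:", U.argmax(axis=0)[:5])

The Possibilistic and Fuzzy-Possibilistic variants differ only in the membership update, which relaxes the constraint that memberships sum to one per sample.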

Determinants of Capital Structure in Malaysia Electrical and Electronic Sector

Capital structure is one of the most important financial decisions in corporate financing strategy. It involves the choice of debt and equity levels in financing a company's operations. This study aims to investigate whether the capital structure choice of Malaysian electrical and electronic manufacturing companies listed on Bursa Malaysia can be explained by the factors that most studies have found to be dominant determinants of capital structure (company size, profitability, asset tangibility, liquidity and growth). Using the debt ratio as the proxy for capital structure and applying pooled ordinary least squares multiple regression, the results show that, on average, Malaysian electrical and electronic manufacturing companies use relatively little debt in funding their business operations. The findings also show that size and asset tangibility have a significant positive relationship with the debt level, while liquidity has a significant negative relationship with leverage.
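
A minimal sketch of the estimation step, assuming a firm-year panel with the stated determinants; the variable names and CSV file are hypothetical, and statsmodels is used for the pooled OLS regression of the debt ratio on the explanatory factors.

    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical firm-year panel: one row per company per year.
    df = pd.read_csv("ee_firms.csv")    # columns assumed below

    X = df[["size", "profitability", "tangibility", "liquidity", "growth"]]
    X = sm.add_constant(X)              # intercept term
    y = df["debt_ratio"]                # proxy for capital structure

    model = sm.OLS(y, X).fit()
    print(model.summary())              # coefficients, t-statistics, R-squared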

Algorithm for Bleeding Determination Based On Object Recognition and Local Color Features in Capsule Endoscopy

Automatic detection of blood in dim or noisy capsule endoscopy images is difficult due to the low signal-to-noise ratio; in particular, analysis of such images may be inaccurate because of external disturbances. We therefore propose detection methods that do not depend only on color bands. To locate bleeding regions, the identification of object outlines in the frame and the features of their local colors are taken into consideration. The results show that the capability of detecting bleeding is much improved.
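
An illustrative OpenCV sketch of the general idea (object outlines plus local colour statistics), not the authors' algorithm: contours are extracted from the frame and the mean colour inside each contour is tested against a simple redness criterion; the file name and thresholds are assumptions.

    import cv2
    import numpy as np

    frame = cv2.imread("capsule_frame.png")            # hypothetical capsule endoscopy frame

    # Object outlines: edge detection followed by contour extraction.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 30, 90)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    bleeding_regions = []
    for cnt in contours:
        if cv2.contourArea(cnt) < 50:                  # ignore tiny outlines (noise)
            continue
        mask = np.zeros(gray.shape, dtype=np.uint8)
        cv2.drawContours(mask, [cnt], -1, 255, thickness=-1)
        b, g, r, _ = cv2.mean(frame, mask=mask)        # local colour features inside the outline
        # Simple redness test: red clearly dominates green and blue (illustrative thresholds).
        if r > 1.5 * g and r > 1.5 * b and r > 60:
            bleeding_regions.append(cnt)

    print("candidate bleeding regions:", len(bleeding_regions))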

Development of Improved Three Dimensional Unstructured Tetrahedral Mesh Generator

Meshing is the process of discretizing the problem domain into many subdomains before numerical calculation can be performed. One of the most popular mesh types is the tetrahedral mesh, owing to its flexibility to fit almost any domain shape. In both 2D and 3D domains, triangular and tetrahedral meshes can be generated using Delaunay triangulation. Mesh quality is an important factor in any Computational Fluid Dynamics (CFD) simulation, as the results are highly affected by it, and much effort has been devoted to improving it. This paper describes a mesh generation routine developed to generate high-quality tetrahedral cells in arbitrarily complex geometries. A few CFD test cases are used to test the mesh generator, and the resulting meshes are compared with those generated by commercial software. The results show that no slivers exist in the generated meshes and that the overall quality is acceptable, since the percentage of bad tetrahedra is relatively small. Boundary recovery was also performed successfully, with all missing faces rebuilt.
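
A small sketch of the two basic ingredients discussed above, using SciPy rather than the developed routine: Delaunay tetrahedralisation of a point cloud and a simple per-element quality check (volume relative to edge length) used to flag sliver-like cells; the quality threshold is an assumption.

    import numpy as np
    from scipy.spatial import Delaunay
    from itertools import combinations

    rng = np.random.default_rng(1)
    points = rng.random((200, 3))              # sample point cloud in a unit cube
    mesh = Delaunay(points)                    # 3D Delaunay -> tetrahedral cells

    def tet_quality(p):
        """Volume / (mean edge length)^3 -- small values indicate sliver-like cells."""
        vol = abs(np.linalg.det(p[1:] - p[0])) / 6.0
        edges = [np.linalg.norm(p[i] - p[j]) for i, j in combinations(range(4), 2)]
        return vol / (np.mean(edges) ** 3)

    qualities = np.array([tet_quality(points[simp]) for simp in mesh.simplices])
    bad = qualities < 0.01                     # illustrative sliver threshold
    print(f"{len(mesh.simplices)} tetrahedra, {bad.sum()} flagged as poor quality "
          f"({100.0 * bad.mean():.1f}%)")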

Effect of Dynamic Stall, Finite Aspect Ratio and Streamtube Expansion on VAWT Performance Prediction using the BE-M Model

A multiple-option analytical model is presented for evaluating the energy performance and the distribution of aerodynamic forces acting on a vertical-axis Darrieus wind turbine, depending on both rotor architecture and operating conditions. For this purpose, a numerical algorithm capable of generating the desired rotor configuration from the design geometric parameters is coupled to a Single/Double-Disk Multiple-Streamtube Blade Element – Momentum (BE-M) code. Both single- and double-disk configurations are analyzed, and model predictions are compared to experimental data from the literature in order to test the capability of the code to predict rotor performance. Effective airfoil characteristics based on the local blade Reynolds number are obtained through interpolation of low-Reynolds airfoil databases from the literature. Some corrections are introduced into the original model to also simulate the effects of blade dynamic stall, rotor streamtube expansion and finite blade aspect ratio, for which a new empirical relationship is proposed to better fit the experimental data. In order to also predict open-field rotor operation, a freestream wind shear profile is implemented, reproducing the effect of the atmospheric boundary layer.
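
For reference, the core relations behind a single-streamtube BE-M evaluation can be written in their standard textbook form (not the specific corrections introduced in the paper): the axial induction factor a links the momentum estimate of the thrust coefficient to the blade-element loads, and the local flow at azimuth θ determines the blade angle of attack.

    V = V_\infty (1 - a), \qquad C_{T,\mathrm{momentum}} = 4a(1 - a)

    W_t = \omega R + V \cos\theta, \qquad W_n = V \sin\theta

    \alpha = \arctan\!\left(\frac{W_n}{W_t}\right), \qquad W = \sqrt{W_t^2 + W_n^2}

The induction factor is found iteratively by equating the momentum thrust coefficient with the value obtained by averaging the blade-element loads over a revolution; the double-disk variant applies the same balance separately to the upwind and downwind halves of the rotor.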

Development of Composite Adsorbent for Waste Water Treatment Using Adsorption and Electrochemical Regeneration

A unique combination of adsorption and electrochemical regeneration with a proprietary adsorbent material called Nyex 100 was introduced at the University of Manchester for waste water treatment applications. Nyex 100 is based on a graphite intercalation compound; it is a non-porous, electrically conducting adsorbent material. This material exhibits a very small BET surface area (2.75 m² g⁻¹) and, consequently, small adsorptive capacities for various organic pollutants. This work aims to develop a composite adsorbent material capable of electrochemical regeneration coupled with improved adsorption characteristics. An organic dye, Acid Violet 17, was used as the standard organic pollutant. The developed composite material was successfully regenerated electrochemically using a DC current of 1 A for 60 minutes. The regeneration efficiency was maintained at around 100% for five adsorption-regeneration cycles.

Information Fusion as a Means of Forecasting Expenditures for Regenerating Complex Investment Goods

Planning capacities when regenerating complex investment goods involves particular challenges, in that the planning is subject to a large degree of uncertainty regarding load information. Using information fusion, by applying Bayesian networks, a method is developed for forecasting the anticipated expenditures (human labor, tool and machinery utilization, time, etc.) for regenerating a good. The generated forecasts then serve as a tool for planning capacities and ensure greater stability in the planning processes.
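
A minimal sketch of the information-fusion step using a discrete Bayesian network in pgmpy; the network structure, variable names and probability tables are purely illustrative assumptions, not the model developed in the paper.

    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    # Hypothetical structure: observed operating load and inspection finding
    # jointly determine the component's damage state, which drives labour expenditure.
    model = BayesianNetwork([("Load", "Damage"), ("Finding", "Damage"), ("Damage", "Labour")])

    cpd_load = TabularCPD("Load", 2, [[0.6], [0.4]])            # low / high operating load
    cpd_find = TabularCPD("Finding", 2, [[0.7], [0.3]])         # unremarkable / suspicious
    cpd_damage = TabularCPD(
        "Damage", 2,
        [[0.9, 0.6, 0.5, 0.1],                                  # P(minor | Load, Finding)
         [0.1, 0.4, 0.5, 0.9]],                                 # P(major | Load, Finding)
        evidence=["Load", "Finding"], evidence_card=[2, 2],
    )
    cpd_labour = TabularCPD(
        "Labour", 2,
        [[0.8, 0.2],                                            # P(low hours | Damage)
         [0.2, 0.8]],                                           # P(high hours | Damage)
        evidence=["Damage"], evidence_card=[2],
    )

    model.add_cpds(cpd_load, cpd_find, cpd_damage, cpd_labour)
    assert model.check_model()

    # Fuse the available load information to forecast the labour expenditure.
    infer = VariableElimination(model)
    print(infer.query(["Labour"], evidence={"Load": 1, "Finding": 1}))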

Assessing Habitat-Suitability Models with a Virtual Species at Khao Nan National Park, Thailand

This study examined a habitat-suitability assessment method, namely the Ecological Niche Factor Analysis (ENFA). A virtual species was created and then dispatched in a geographic information system model of a real landscape under three historic scenarios: (1) spreading, (2) equilibrium, and (3) overabundance. In each scenario, the virtual species was sampled and the simulated data sets were used as inputs for the ENFA to reconstruct the habitat-suitability model. The 'equilibrium' scenario gave the highest quantity and quality among the three scenarios. ENFA was sensitive to the distribution scenarios but not to sample size. The use of a virtual species proved to be a very efficient method, allowing one to fully control the quality of the input data as well as to accurately evaluate the predictive power of the analyses.
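
A small sketch of the virtual-species idea, assuming a synthetic environmental raster: a known suitability function is defined, presences are drawn from it, and the resulting sample can then be fed to a habitat-suitability method such as ENFA (the ENFA step itself is not reproduced here); the layers, scenario scaling and sample size are assumptions.

    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic environmental layers over a 100 x 100 landscape grid.
    elevation = rng.normal(500, 150, size=(100, 100))
    moisture = rng.normal(0.5, 0.15, size=(100, 100))

    # "True" habitat suitability of the virtual species (known by construction).
    suitability = 1.0 / (1.0 + np.exp(-(-(elevation - 600) / 100 + (moisture - 0.5) * 8)))

    # Dispatch the species: occupancy probability scaled per scenario
    # (spreading < equilibrium < overabundance).
    scenario_scale = {"spreading": 0.3, "equilibrium": 1.0, "overabundance": 1.5}
    occupied = rng.random((100, 100)) < np.clip(suitability * scenario_scale["equilibrium"], 0, 1)

    # Simulated field sampling: random cells, recording presence plus environment.
    idx = rng.choice(100 * 100, size=300, replace=False)
    rows, cols = np.unravel_index(idx, (100, 100))
    samples = np.column_stack([elevation[rows, cols], moisture[rows, cols], occupied[rows, cols]])
    print("sampled presences:", int(samples[:, 2].sum()), "of", len(samples))

Because the true suitability is known by construction, the model reconstructed from the samples can be scored exactly, which is the point of the virtual-species approach.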

Component Based Framework for Authoring and Multimedia Training in Mathematics

The new programming technologies allow for the creation of components which can be automatically or manually assembled to reach a new experience in understanding and mastering knowledge, or in acquiring skills for a specific knowledge area. The project proposes an interactive framework that permits the creation, combination and utilization of components that are specific to mathematical training in high schools. The main objectives of the framework are:
• authoring lessons by the teacher or the students; all they need are simple operating skills for Equation Editor (or something similar, or LaTeX), the rest being drag & drop operations, inserting data into a grid, or navigating through menus
• allowing audio presentations of mathematical texts and solving hints (more easily understood by the students)
• offering graphical representations of a mathematical function edited in Equation Editor
• storing learning objects in a database
• storing predefined lessons (efficient for expressions and commands, the rest being calculations; this allows high compression)
• viewing and/or modifying predefined lessons, according to the curricula
The whole framework is centered on a mini-compiler for mathematical expressions, which stores code that is later used for different purposes (tables, graphics, and optimisations). Regarding programming technologies, a Visual C# .NET implementation is proposed. New and innovative digital learning objects for mathematics will be developed; they are capable of interpreting, contextualizing and reacting depending on the architecture in which they are assembled.

Analyzing Periurban Fringe with Rough Set

The distinction among urban, periurban and rural areas is a classical example of uncertainty in land classification. Satellite images, geostatistical analysis and all kinds of spatial data are very useful in urban sprawl studies, but it is important to define precise rules for combining large amounts of data to build complex knowledge about the territory. Rough Set theory may be a useful method to employ in this field. It represents a different mathematical approach to uncertainty by capturing indiscernibility: two different phenomena can be indiscernible in some contexts and classified in the same way when the available information about them is combined. This approach has been applied to a case study, comparing the results achieved with the Map Algebra technique and with Spatial Rough Sets. The study area, Potenza Province, is particularly suitable for the application of this theory because it includes 100 municipalities with different numbers of inhabitants and different morphological features.
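
A compact sketch of the rough-set machinery referred to above: municipalities (hypothetical identifiers and attribute values) are grouped into indiscernibility classes over the chosen condition attributes, and the lower and upper approximations of a 'periurban' class are computed.

    from collections import defaultdict

    # Hypothetical information table: municipality -> (density class, distance-to-city class).
    table = {
        "m01": ("high", "near"), "m02": ("high", "near"), "m03": ("low", "near"),
        "m04": ("low", "far"),   "m05": ("low", "far"),   "m06": ("low", "near"),
    }
    # Hypothetical target set: municipalities labelled periurban by experts.
    periurban = {"m01", "m02", "m03"}

    # Indiscernibility classes: objects with identical condition-attribute values.
    classes = defaultdict(set)
    for obj, attrs in table.items():
        classes[attrs].add(obj)

    lower, upper = set(), set()
    for eq_class in classes.values():
        if eq_class <= periurban:            # entirely inside the target set
            lower |= eq_class
        if eq_class & periurban:             # overlaps the target set
            upper |= eq_class

    print("lower approximation (certainly periurban):", sorted(lower))
    print("upper approximation (possibly periurban):", sorted(upper))
    print("boundary region (uncertain):", sorted(upper - lower))

The boundary region is exactly where the classification remains uncertain given the chosen attributes, which is the information the analysis exploits.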

An Optimal Load Shedding Approach for Distribution Networks with DGs considering Capacity Deficiency Modelling of Bulked Power Supply

This paper discusses a genetic algorithm (GA) based optimal load shedding approach that can be applied to electrical distribution networks with and without dispersed generators (DG). The proposed method is also able to consider constant and variable capacity deficiencies caused by unscheduled outages in the bulk generation and transmission system of the bulk power supply. The GA is employed to search for the optimal load shedding strategy in distribution networks with DGs for both constant and variable modelling of the bulk power supply. Electrical power distribution systems traditionally have a radial network and unidirectional power flows; with the advent of dispersed generation, the distribution system has locally looped networks and bidirectional power flows. Installed DG in electrical distribution systems can therefore cause operational problems and impact existing operational schemes. The introduction of DGs has raised many new issues at the operational and planning levels, and load shedding, as one of the operational issues, is no exception. The objective is to minimize the sum of the curtailed load and the system losses within the framework of the system's operational and security constraints. The proposed method is tested on a radial distribution system with 33 load points for more practical applications.
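
A toy sketch of the GA formulation, assuming a small set of load points with hypothetical demands and a single capacity-deficiency constraint (losses and network constraints are omitted): each chromosome is a vector of shed fractions, and the fitness penalises both curtailed load and violation of the available supply.

    import numpy as np

    rng = np.random.default_rng(0)
    demand = np.array([90.0, 60.0, 120.0, 45.0, 80.0, 150.0])  # hypothetical load points (kW)
    available = 420.0                      # supply remaining after the capacity deficiency
    pop_size, generations, mut_rate = 60, 200, 0.1

    def fitness(shed):
        """Lower is better: curtailed load plus a heavy penalty for exceeding supply."""
        served = demand * (1.0 - shed)
        curtailed = demand.sum() - served.sum()
        overload = max(0.0, served.sum() - available)
        return curtailed + 1e3 * overload

    pop = rng.random((pop_size, demand.size))              # chromosomes: shed fractions in [0, 1]
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[: pop_size // 2]]  # truncation selection
        pairs = rng.integers(0, len(parents), size=(pop_size, 2))
        mask = rng.random((pop_size, demand.size)) < 0.5    # uniform crossover
        children = np.where(mask, parents[pairs[:, 0]], parents[pairs[:, 1]])
        mutate = rng.random(children.shape) < mut_rate      # random-reset mutation
        children[mutate] = rng.random(mutate.sum())
        children[0] = parents[0]                            # elitism: keep the current best
        pop = children

    best = pop[np.argmin([fitness(ind) for ind in pop])]
    print("shed fractions:", np.round(best, 2))
    print("served load:", round(float((demand * (1 - best)).sum()), 1), "of", demand.sum())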

Improved Approximation to the Derivative of a Digital Signal Using Wavelet Transforms for Crosstalk Analysis

The information revealed by derivatives can help to better characterize digital near-end crosstalk signatures, with the ultimate goal of identifying the specific aggressor signal. Unfortunately, derivatives tend to be very sensitive to even low levels of noise. In this work we approximate the derivatives of both quiet and noisy digital signals using a wavelet-based technique. Results are presented for Gaussian digital edges, IBIS-model digital edges, and digital edges in oscilloscope data captured from an actual printed circuit board, and the trade-offs between accuracy and noise immunity are discussed. The results show that the wavelet technique can produce first-derivative approximations that are accurate to within 5% or better, even under noisy conditions, and can be used to calculate the derivative of a digital signal edge when conventional methods fail.
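
A brief sketch of one way to realise a wavelet-based derivative estimate with PyWavelets (not necessarily the exact technique of the paper): the noisy edge is decomposed, the detail coefficients are soft-thresholded, and the derivative is taken on the denoised reconstruction; the wavelet, level and threshold are assumptions.

    import numpy as np
    import pywt
    from math import erf

    # Synthetic digital edge (Gaussian-shaped transition) with additive noise.
    t = np.linspace(-1, 1, 1024)
    edge = 0.5 * (1 + np.array([erf(x / 0.05) for x in t]))
    noisy = edge + np.random.normal(0, 0.02, t.size)

    # Wavelet denoising: decompose, soft-threshold detail coefficients, reconstruct.
    coeffs = pywt.wavedec(noisy, "sym8", level=5)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745             # robust noise estimate
    threshold = sigma * np.sqrt(2 * np.log(t.size))            # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, threshold, "soft") for c in coeffs[1:]]
    denoised = pywt.waverec(coeffs, "sym8")[: t.size]

    # Derivative of the denoised edge vs. the raw numerical derivative.
    d_wavelet = np.gradient(denoised, t)
    d_raw = np.gradient(noisy, t)
    true_d = np.gradient(edge, t)
    print("max |error|, raw:     ", np.max(np.abs(d_raw - true_d)))
    print("max |error|, wavelet: ", np.max(np.abs(d_wavelet - true_d)))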

Usability and Functionality: A Comparison of Key Project Personnel's and Potential Users' Evaluations

Meeting users' requirements is one of the predictors of project success: there should be a match between the expectations of the users and the perceptions of key project personnel with respect to usability and functionality. The aim of this study is to compare key project personnel's and potential users' (customer representatives') evaluations of the relative importance of usability and functionality factors in a software design project. The Analytic Network Process (ANP) was used to analyze the relative importance of the factors. The results show that navigation and interaction are the most significant factors, and satisfaction and efficiency are the least important factors, for both groups. Further, the similar rankings and scores of the usability and functionality factors for both groups indicate that key project personnel have captured the expectations and requirements of potential users accurately.
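
A tiny sketch of the computational core of ANP, assuming a hypothetical weighted supermatrix over a handful of usability/functionality factors: the column-stochastic supermatrix is raised to powers until it converges, and the limit columns give the relative priorities.

    import numpy as np

    factors = ["navigation", "interaction", "satisfaction", "efficiency"]

    # Hypothetical weighted supermatrix (columns sum to 1): entry (i, j) is the
    # influence of factor i with respect to factor j, derived from pairwise comparisons.
    W = np.array([
        [0.00, 0.45, 0.40, 0.35],
        [0.50, 0.00, 0.40, 0.35],
        [0.20, 0.25, 0.00, 0.30],
        [0.30, 0.30, 0.20, 0.00],
    ])

    # Limit supermatrix: repeated squaring until the columns stop changing.
    limit = W.copy()
    for _ in range(100):
        nxt = limit @ limit
        nxt /= nxt.sum(axis=0, keepdims=True)     # keep columns stochastic
        if np.allclose(nxt, limit, atol=1e-10):
            break
        limit = nxt

    priorities = limit[:, 0]                       # any column of the limit matrix
    for name, p in sorted(zip(factors, priorities), key=lambda x: -x[1]):
        print(f"{name:12s} {p:.3f}")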

Generating Concept Trees from Dynamic Self-organizing Map

The self-organizing map (SOM) provides both clustering and visualization capabilities in data mining. Dynamic self-organizing maps such as the Growing Self-Organizing Map (GSOM) have been developed to overcome the fixed-structure limitation of the SOM and enable better representation of the discovered patterns. However, when mining large or historical datasets, a hierarchical structure is also useful for viewing cluster formation at different levels of abstraction. In this paper, we present a technique for generating concept trees from the GSOM. The formation of trees from different spread factor values of the GSOM is also investigated, and the quality of the resulting trees is analyzed. The results show that concept trees can be generated from the GSOM, thus eliminating the need to re-cluster the data from scratch to obtain a hierarchical view of the data under study.
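
Since GSOM implementations are not part of standard libraries, the following sketch illustrates the general idea with a fixed-topology SOM (MiniSom) as a stand-in: the map is trained, and a concept tree is then built by hierarchically clustering the trained node weight vectors rather than re-clustering the raw data; the map size and dataset are placeholders.

    import numpy as np
    from minisom import MiniSom
    from scipy.cluster.hierarchy import linkage

    # Placeholder dataset: rows of normalised numeric attributes.
    rng = np.random.default_rng(3)
    data = np.vstack([rng.normal(m, 0.3, size=(100, 5)) for m in (-1.0, 0.0, 1.5)])

    # Train a small SOM (stand-in for a GSOM grown with a given spread factor).
    som = MiniSom(6, 6, data.shape[1], sigma=1.0, learning_rate=0.5, random_seed=3)
    som.train_random(data, 2000)

    # Build a concept tree by agglomeratively clustering the trained node weight
    # vectors, giving a hierarchical view without re-clustering the raw data.
    nodes = som.get_weights().reshape(-1, data.shape[1])
    tree = linkage(nodes, method="ward")
    print(tree[-3:])                      # the last (top-level) merges of the tree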