Evolving a Fuzzy Rule-Base for Image Segmentation

A new method for color image segmentation using fuzzy logic is proposed in this paper. Our aim is to automatically produce a fuzzy system for color classification and image segmentation with the fewest rules and the lowest error rate. Particle swarm optimization is a subclass of evolutionary algorithms inspired by the social behavior of fish, bees, birds, and other animals that live together in colonies. We use the comprehensive learning particle swarm optimization (CLPSO) technique to find optimal fuzzy rules and membership functions because it discourages premature convergence. Each particle of the swarm encodes a set of fuzzy rules. During evolution, a population member tries to maximize a fitness criterion, which here rewards a high classification rate and a small number of rules. Finally, the particle with the highest fitness value is selected as the best set of fuzzy rules for image segmentation. Our results, using this method for soccer field image segmentation in RoboCup contests, show 89% accuracy. This method also requires less computational load than methods such as ANFIS because it generates a smaller number of fuzzy rules. The large and varied training dataset makes the proposed method invariant to illumination noise.
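A minimal sketch of the fitness criterion described above follows; the 0.9/0.1 weighting and the linear rule-count penalty are illustrative assumptions, not the paper's actual formulation.

```python
def fitness(correct, total, n_rules, max_rules, w=0.9):
    """Reward a high classification rate while penalizing large rule bases.

    The weight w and the linear parsimony term are illustrative
    assumptions; the paper's exact fitness function may differ.
    """
    accuracy = correct / total
    parsimony = 1.0 - n_rules / max_rules
    return w * accuracy + (1.0 - w) * parsimony
```

Under this sketch, a particle encoding 10 rules out of a budget of 50 and classifying 89 of 100 pixels correctly scores 0.9 * 0.89 + 0.1 * 0.8 = 0.881.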

Utilizing Virtual Worlds in Education: The Implications for Practice

Multi-user virtual worlds are becoming a valuable educational tool. Learning experiences within these worlds focus on discovery and active experiences that both engage students and motivate them to explore new concepts. As educators, we need to explore these environments to determine how they can most effectively be used in our instructional practices. This paper explores the current application of virtual worlds to identify meaningful educational strategies that are being used to engage students and enhance teaching and learning.

Structural Characteristics of Three-Dimensional Random Packing of Aggregates with Wide Size Distribution

The mechanical properties of granular solids are dependent on the flow of stresses from one particle to another through inter-particle contact. Although some experimental methods have been used to study the inter-particle contacts in the past, preliminary work with these techniques indicated that they do not have the necessary resolution to distinguish between those contacts that transmit the load and those that do not, especially for systems with a wide distribution of particle sizes. In this research, computer simulations are used to study the nature and distribution of contacts in a compact with wide particle size distribution, representative of aggregate size distribution used in asphalt pavement construction. The packing fraction, the mean number of contacts and the distribution of contacts were studied for different scenarios. A methodology to distinguish and compute the fraction of load-bearing particles and the fraction of space-filling particles (particles that do not transmit any force) is needed for further investigation.

Choosing Search Algorithms in Bayesian Optimization Algorithm

The Bayesian Optimization Algorithm (BOA) is an algorithm based on the estimation of distributions. It uses techniques for modeling data with Bayesian networks to estimate the joint distribution of promising solutions. Different search algorithms can be used to obtain the structure of the Bayesian network. The key point that BOA addresses is whether the constructed Bayesian network can generate new and useful solutions (strings) that lead the algorithm in the right direction to solve the problem. Undoubtedly, this ability is a crucial factor in the efficiency of BOA. Various search algorithms can be used in BOA, but their performance differs. To choose between them, a suitable method for quantifying their differences is needed. In this paper, a greedy search algorithm and a stochastic search algorithm are used in BOA to solve a given optimization problem. A method using Kullback-Leibler (KL) divergence to reflect their difference is described.
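The KL divergence used to compare the two search algorithms can be sketched as follows for discrete distributions; the epsilon smoothing is an implementation convenience, not part of the definition.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i).

    p and q are discrete probability distributions over the same
    support (e.g., over candidate solutions or network structures);
    eps guards against division by zero when q_i == 0.
    """
    return sum(pi * math.log((pi + eps) / (qi + eps))
               for pi, qi in zip(p, q) if pi > 0)
```

The divergence is zero when the two distributions coincide and grows as the distribution produced by one search algorithm drifts from the other, which is what makes it usable as a comparison score.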

Improving Intrusion Detection for Malware Activities on Dual-Stack Network Environments

Malware is software designed to harm computers, and it is becoming a significant threat to computer networks. A malware attack does not merely cause financial loss; in some cases it can cause fatal errors that may cost lives. As the new Internet Protocol version 6 (IPv6) emerged, many people believed this protocol could solve most malware propagation issues due to its broader addressing scheme. Because IPv6 is still new compared with native IPv4, transition mechanisms have been introduced to promote smoother migration. Unfortunately, these transition mechanisms allow some malware to propagate attacks from IPv4 to IPv6 network environments. In this paper, a proof of concept is presented to show that some existing IPv4 malware detection techniques need to be improved in order to detect malware attacks in dual-stack networks more efficiently. A testbed of a dual-stack network environment was deployed, and genuine malware samples were released to observe their behavior. The results from the different scenarios are analyzed and discussed in terms of behavior and propagation methods. They show that malware behaves differently on IPv6 than on the IPv4 network protocol in a dual-stack environment, and a new detection technique is called for to address this problem in the near future.

Geomatics Techniques for Urban Transport Planning

Major urban centers all face rapid growth, most often associated with spreading urbanization; the social status of the car has also changed, as it has become a commodity of mass consumption. There were about 5.26 million cars in Algeria in 2008, a number that increases by 200,000 new cars every year. These phenomena induce a demand for greater mobility and a significant need for transport infrastructure. Faced with these problems and the growing use of the automobile, central governments and local authorities in charge of urban transport issues are aware of the need to develop their urban transport systems but often lack the means. Urban Transport Plans (PDU) were born in reaction to the "culture of the automobile." They have existed around the world since the 1980s, but had little success until laws on air quality and the rational use of energy in the 1990s substantially altered their content and made their implementation mandatory in cities of over 100,000 inhabitants (abroad) [1]. The objective of this work is to use Geomatics techniques as decision support in the organization and management of travel, the results of which will then be translated into a National Urban Transport Plan.

Experimentation on Piercing with Abrasive Waterjet

Abrasive waterjet (AWJ) cutting is a highly efficient method for cutting almost any type of material. When holes are to be cut, the waterjet first needs to pierce the material. This paper presents an extensive experimental analysis of the effect of piercing parameters on piercing time. Results from experiments on feed rates, workpiece thicknesses, abrasive flow rates, standoff distances, and water pressure are presented, as well as studies of three methods for dynamic piercing. It is shown that a large amount of time and resources can be saved by choosing the piercing parameters correctly. The large number of experiments puts demands on the experimental setup; an automated setup including piercing detection is presented to enable large series of experiments to be carried out efficiently.

Analysis of Highway Slope Failure by an Application of the Stereographic Projection

Mountain road slope failures triggered by earthquake activity and torrential rain create disasters. Provincial Road No. 24 is a main route to Wutai Township, and the study area is located between mileages 46K and 47K along the road. The road has suffered frequent damage as a result of landslides and slope failures during typhoon seasons, so an understanding of the sliding behavior in the area is necessary. Because slope failures triggered by earthquake activity and heavy rainfall occur frequently, this study aims to understand the mechanism of slope failures and to look for ways to deal with the situation. To achieve these objectives, the paper applies a theoretical and structural-geology data interpretation program to assess potential slope sliding behavior. The study showed an intimate relationship between the landslide behavior of the slopes and the stratum materials; structural-geology analysis methods were used to analyze slope stability and determine the slope safety coefficient in order to predict the sites of failure layers. According to the case study and parameter analysis results, the main slip direction of the slope is toward the southeast of the site, and rainfall-induced rises in the groundwater level were found to be the main driver of the landslide mechanism. In the future, effective horizontal drains should be installed at the correct locations, which can effectively restrain mountain road slope failures and increase slope stability.

Expert Witness Testimony in the Battered Woman Syndrome

Expert witness testimony (EWT) is information given by an expert specialized in the field (here, battered woman syndrome, BWS) to the jury in order to help the court better understand the case. EWT does not always work in favor of battered women. Two main decision-making models are discussed in the paper: the mathematical model and the explanation model. In the first, jurors calculate "the importance and strength of each piece of evidence," whereas in the second they try to integrate the EWT with the evidence and create a coherent story that describes the crime. The jury often misunderstands and misjudges battered women for their action (or, in this case, inaction), assuming that these women are masochists who accept being mistreated, on the reasoning that if a man abuses a woman constantly, she should and could divorce him or simply leave at any time. Research in the domain has found that expert witness testimony indeed has a powerful influence on jurors' decisions, so its quality needs to be further explored. One important factor that needs further study is a bias called the dispositionist worldview (the belief that what happens to people is of their own doing). This attributional bias is a tendency to think that a person's behavior is due to his or her disposition, even when the behavior is clearly attributable to the situation. Hypothesis: if a juror has a dispositionist worldview, then he or she will blame the rape victim for triggering the assault; the juror would therefore commit the fundamental attribution error and believe that the victim's disposition, and not the situation she was in, caused the rape. Methods: the subjects were 500 randomly sampled undergraduate students from McGill, Concordia, Université de Montréal, and UQAM. Dispositional worldview was scored on the Dispositionist Worldview Questionnaire.
After reading the rape scenarios, each student was asked to play the role of a juror and answer a questionnaire consisting of 7 questions about the responsibility, causality, and fault of the victim. Results: the results confirm the hypothesis; jurors with a dispositionist worldview blamed the rape victim for triggering the assault, thereby committing the fundamental attribution error by believing that the victim's disposition, and not the constraints or opportunities of the situation, caused the rape.

The Relationship between Learners' Motivation (Integrative and Instrumental) and English Proficiency among Iranian EFL Learners

The current study aims at investigating the relationship between learners' integrative and instrumental motivation and English proficiency among Iranian EFL learners. The participants consisted of 128 undergraduate university students, including 64 males and 64 females majoring in English as a foreign language at Shiraz Azad University. Two research instruments were used to gather the data: 1) a language proficiency test, and 2) a motivation scale that determines the type of the EFL learners' motivation. Correlation coefficients and t-tests were used to analyze the collected data, and the main result was as follows: there is a significant relationship between both integrative and instrumental motivation and English proficiency among EFL learners at Shiraz Azad University.

A Hyper-Domain Image Watermarking Method Based on Macro Edge Blocks and the Wavelet Transform for Digital Signal Processors

Watermarking is the first consideration for protecting the copyright of original digital data. However, algorithms that achieve high image quality often cannot run on embedded systems because of their computational complexity, while most algorithms today must run on consumer products, since integrated circuits have made huge progress at low prices. In this paper, we propose a novel algorithm that efficiently inserts a watermark into a digital image and is easy to implement on a digital signal processor. Furthermore, we select a general-purpose, inexpensive digital signal processor made by Analog Devices to fit consumer applications. The experimental results show that watermarked images achieve 46 dB, which is acceptable to human vision, and that the algorithm executes in real time on the digital signal processor.
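A hedged sketch of the wavelet side of such a scheme: a one-level Haar transform (chosen here for simplicity; the paper's actual wavelet and embedding rule may differ) and an additive embedding of one watermark bit into the detail band.

```python
def haar_1d(x):
    """One-level Haar transform of an even-length sequence:
    pairwise averages (approximation band) and pairwise
    differences (detail band)."""
    avg = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    det = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return avg, det

def embed_bit(det, bit, strength=2.0):
    """Additively embed one watermark bit into the detail band.

    The additive rule and the strength value are illustrative
    assumptions, not the paper's embedding method.
    """
    sign = 1.0 if bit else -1.0
    return [c + sign * strength for c in det]
```

Embedding in the detail (edge) band rather than the approximation band is what keeps the watermark perceptually unobtrusive, which is consistent with the edge-block focus of the method.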

BugCatcher.Net: Detecting Bugs and Proposing Corrective Solutions

Although achieving a zero-defect software release is practically impossible, software companies should take maximum care to detect defects/bugs well ahead of time, allowing only a bare minimum to creep into the released version. This is a clear indicator of the important role time plays in bug detection. In addition, software quality is a major factor in the software engineering process, and early detection can be achieved only through static code analysis, as opposed to conventional testing. BugCatcher.Net is a static analysis tool that detects bugs in .NET® languages through MSIL (Microsoft Intermediate Language) inspection. The tool uses a parser based on finite state automata to carry out bug detection. Once detected, bugs need to be corrected immediately; BugCatcher.Net facilitates correction by proposing a corrective solution for reported warnings/bugs to end users with minimum side effects. The tool is also capable of analyzing the bug trend of a program under inspection.
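To illustrate the flavor of finite-state bug detection over MSIL, here is a deliberately tiny two-state scan. The single pattern it flags, a `ldnull` immediately followed by `callvirt` (a guaranteed null dereference), is a toy example, not one of BugCatcher.Net's actual detectors.

```python
def detect_null_callvirt(instructions):
    """Two-state finite automaton over a simplified MSIL stream.

    Returns the indices of 'callvirt' instructions whose receiver
    was just pushed as null. An illustrative simplification: real
    MSIL analysis would track the full evaluation stack.
    """
    warnings = []
    state = "START"
    for i, op in enumerate(instructions):
        if state == "START" and op == "ldnull":
            state = "NULL_ON_STACK"   # null is now on top of the stack
        elif state == "NULL_ON_STACK":
            if op == "callvirt":
                warnings.append(i)    # virtual call on a null receiver
            state = "START"           # any other op invalidates the pattern
    return warnings
```

A real detector would of course model the evaluation stack and control flow; the point here is only how a state machine turns a bug pattern into a linear scan over the instruction stream.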

Performance Analysis of Digital Signal Processors Using SMV Benchmark

Unlike general-purpose processors, digital signal processors (DSP processors) are strongly application-dependent. To meet the needs of diverse applications, a wide variety of DSP processors based on different architectures, ranging from the traditional to VLIW, have been introduced to the market over the years. The functionality, performance, and cost of these processors vary over a wide range. In order to select a processor that meets the design criteria for an application, processor performance is usually the major concern for digital signal processing (DSP) application developers. Performance data are also essential for the designers of DSP processors to improve their designs. Consequently, several DSP performance benchmarks have been proposed over the past decade or so. However, none of these benchmarks seems to have included recent new DSP applications. In this paper, we use a new benchmark that we recently developed to compare the performance of popular DSP processors from Texas Instruments and StarCore. The new benchmark is based on the Selectable Mode Vocoder (SMV), a speech-coding program from recent third-generation (3G) wireless voice applications. All benchmark kernels are compiled by the compilers of the respective DSP processors and run on their simulators. The weighted arithmetic mean of clock cycles and the arithmetic mean of code size are used to compare the performance of five DSP processors. In addition, we studied how the performance of a processor is affected by code structure, processor architecture features, and compiler optimization. The extensive experimental data gathered, analyzed, and presented in this paper should help DSP processor and compiler designers meet their specific design goals.
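The weighted arithmetic mean used to aggregate per-kernel cycle counts can be sketched as follows; the kernel weights in the usage note are invented for illustration, not the weights used in the SMV benchmark.

```python
def weighted_mean(values, weights):
    """Weighted arithmetic mean: sum(w_i * v_i) / sum(w_i).

    For a cycle-count benchmark, values are per-kernel clock cycles
    and weights reflect each kernel's share of total execution
    (e.g., call frequency in the vocoder).
    """
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)
```

For example, a kernel costing 100 cycles with weight 1 and one costing 200 cycles with weight 3 yields (100 + 600) / 4 = 175 cycles, so the heavily exercised kernel dominates the score, which is the point of weighting by workload share.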

Intrusion Detection Mechanism Software for Virtual Platforms

Security is an interesting and significant issue for popular virtual platforms, such as virtualization clusters and cloud platforms. Virtualization is a powerful technology for cloud computing services, and virtual machine tools, called hypervisors, offer many benefits: they can quickly deploy many kinds of virtual operating systems on a single platform, control all virtual system resources effectively, cut platform deployment costs, and provide customization, high elasticity, and high reliability. However, some important security problems need to be addressed and resolved in virtual platforms, including viruses, malicious programs, illegal operations, and intrusion behavior. In this paper, we present Intrusion Detection Mechanism (IDM) software that not only automatically analyzes all system operations against an accounting journal database, but is also able to monitor the system state of virtual platforms.

IT Management: How IT Managers Gain IT Knowledge

It is no secret that IT management has become more and more an integrated part of almost all organizations. IT managers possess an enormous amount of knowledge, both organizational knowledge and general IT knowledge. This article investigates how IT managers keep themselves updated on IT knowledge in general, and looks into how much time IT managers spend on a weekly basis searching the net for new or problem-solving IT knowledge. The theory used in this paper is applied to investigate the current role of IT managers and the issues they face. Furthermore, a study is conducted in which seven IT managers in medium-sized and large Danish companies are interviewed to add further focus on the role of the IT manager and on how they keep themselves updated. Besides finding a substantial need for more research, we find that IT managers, whether generalists or specialists, have only limited knowledge resources at hand for updating their own knowledge, leaving much of the initiative to vendors.

Multi-Scale Gabor Feature Based Eye Localization

Eye localization is necessary for face recognition and related application areas. Most eye localization algorithms reported so far still need improvement in precision and computation time for successful application. In this paper, we propose an eye localization method based on multi-scale Gabor feature vectors that is more robust with respect to initial points. Eye localization based on Gabor feature vectors first constructs an Eye Model Bunch for each eye (left or right), consisting of n Gabor jets and the average eye coordinates obtained from n model face images, and then tries to localize eyes in an incoming face image by exploiting the fact that the true eye coordinates are most likely to be very close to the position whose Gabor jet has the best similarity match with a Gabor jet in the Eye Model Bunch. Similar ideas have already been proposed, for example in EBGM (Elastic Bunch Graph Matching). However, the method used in EBGM is known not to be robust with respect to initial values and may need an extensive search range to achieve the required performance, which imposes a much greater computational burden. In this paper, we propose a multi-scale approach with only a small increase in computational burden: one first localizes the eyes based on Gabor feature vectors in a coarse face image obtained by downsampling the original face image, and then localizes the eyes in the original-resolution face image using the eye coordinates found in the coarse-scale image as initial points. Several experiments and comparisons with other reported eye localization methods show the efficiency of our proposed method.
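The jet-matching step above can be sketched as follows, using the magnitude-only (phase-ignoring) normalized dot product common in EBGM-style systems. Real Gabor jets are complex-valued responses of a filter bank; this sketch operates on their coefficient magnitudes and is an illustrative simplification.

```python
import math

def jet_similarity(j1, j2):
    """Magnitude-based Gabor jet similarity: the normalized dot
    product of coefficient magnitudes, in [0, 1] for nonnegative
    magnitudes, with 1 meaning identical direction."""
    num = sum(a * b for a, b in zip(j1, j2))
    den = (math.sqrt(sum(a * a for a in j1))
           * math.sqrt(sum(b * b for b in j2)))
    return num / den if den else 0.0

def best_match(candidate, bunch):
    """Index of the model jet in the Eye Model Bunch most similar
    to the candidate jet extracted at a trial eye position."""
    sims = [jet_similarity(candidate, jet) for jet in bunch]
    return max(range(len(sims)), key=sims.__getitem__)
```

In the multi-scale scheme, this comparison is run first on jets from the downsampled image and then, seeded by the coarse result, on jets from the full-resolution image.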

Insights into Smoothies with High Levels of Fibre and Polyphenols: Factors Influencing Chemical, Rheological and Sensory Properties

Attempts to add fibre and polyphenols (PPs) into popular beverages present challenges related to the properties of finished products such as smoothies. Consumer acceptability, viscosity and phenolic composition of smoothies containing high levels of fruit fibre (2.5-7.5 g per 300 mL serve) and PPs (250-750 mg per 300 mL serve) were examined. The changes in total extractable PP, vitamin C content, and colour of selected smoothies over a storage stability trial (4°C, 14 days) were compared. A set of acidic aqueous model beverages were prepared to further examine the effect of two different heat treatments on the stability and extractability of PPs. Results show that overall consumer acceptability of high fibre and PP smoothies was low, with average hedonic scores ranging from 3.9 to 6.4 (on a 1-9 scale). Flavour, texture and overall acceptability decreased as fibre and polyphenol contents increased, with fibre content exerting a stronger effect. Higher fibre content resulted in greater viscosity, with an elevated PP content increasing viscosity only slightly. The presence of fibre also aided the stability and extractability of PPs after heating. A reduction of extractable PPs, vitamin C content and colour intensity of smoothies was observed after a 14-day storage period at 4°C. Two heat treatments (75°C for 45 min or 85°C for 1 min) that are normally used for beverage production, did not cause significant reduction of total extracted PPs. It is clear that high levels of added fibre and PPs greatly influence the consumer appeal of smoothies, suggesting the need to develop novel formulation and processing methods if a satisfactory functional beverage is to be developed incorporating these ingredients.

Carbon Accumulation in Winter Wheat under Different Growing Intensity and Climate Change

World population growth drives food demand and promotes the intensification of agriculture and the development of new production technologies and varieties more suitable for regional natural conditions. Climate change can affect the length of the growing period and the biomass and carbon accumulation in winter wheat. The increasing mean air temperature resulting from climate change can shorten the growth period of cereals and, without adequate adjustments in growing technologies or varieties, can reduce biomass and carbon accumulation. A deeper understanding of, and effective measures for, monitoring and managing the cereal growth process are needed for adaptation to changing climatic and technological conditions.

Crash Severity Modeling in Urban Highways Using Backward Regression Method

Identifying and classifying intersections according to severity is very important for implementing safety-related countermeasures, and effective models are needed to compare and assess severity. Highway safety organizations have placed intersection safety among their priorities. In spite of significant advances in highway safety, large numbers of high-severity crashes still occur on highways. Investigating the factors that influence crashes enables engineers to carry out calculations aimed at reducing crash severity. Previous studies lacked a model capable of simultaneously illustrating the influence of human factors, road, vehicle, weather conditions, and traffic features, including traffic volume and flow speed, on crash severity. Thus, this paper aims to develop models that illustrate the simultaneous influence of these variables on crash severity in urban highways. The models presented in this study were developed as binary logit models and calibrated in SPSS, using the backward regression method to identify the significant variables. From the obtained results it can be concluded that the main factors increasing crash severity in urban highways are driver age, movement in reverse gear, technical defects of the vehicle, vehicle collisions with motorcycles and bicycles, collisions with bridges, frontal impact collisions, frontal-lateral collisions, and multi-vehicle crashes.
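The binary logit form underlying such models can be sketched as follows; the coefficient values in the usage note are invented for illustration and are not the paper's calibrated estimates.

```python
import math

def logit_prob(coeffs, intercept, x):
    """Binary logit model: P(severe) = 1 / (1 + exp(-(b0 + b . x))).

    coeffs are the slope coefficients b, intercept is b0, and x is
    the vector of crash attributes (driver age, collision type
    indicators, etc.). Values are illustrative placeholders.
    """
    z = intercept + sum(b * xi for b, xi in zip(coeffs, x))
    return 1.0 / (1.0 + math.exp(-z))
```

A positive coefficient, such as one on a reverse-gear indicator, raises the predicted probability of a severe crash when that attribute is present, which is how the significant variables identified by backward regression are read off the fitted model.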

Modelling of Soil Erosion by Non-Conventional Methods

Soil erosion is among the most serious problems faced at the global and local levels, so planning soil conservation measures has become a prominent agenda for water basin managers. To plan soil conservation measures, information on soil erosion is essential. The Universal Soil Loss Equation (USLE), Revised Universal Soil Loss Equation (RUSLE1, or RUSLE), Modified Universal Soil Loss Equation (MUSLE), RUSLE 1.06, RUSLE 1.06c, and RUSLE2 are the most widely used conventional erosion estimation methods. The essential drawback of the USLE and RUSLE1 equations is that they are based on average annual values of their parameters, so their applicability at small temporal scales is questionable; nor do they estimate runoff-generated soil erosion, so their applicability to that problem is also questionable. The data used in the formulation of the USLE and RUSLE1 equations were plot data, so applying them at larger spatial scales requires scale-correction factors. On the other hand, MUSLE is unsuitable for predicting the sediment yield of small and large events. Although the new revised forms of the USLE, such as RUSLE 1.06, RUSLE 1.06c, and RUSLE2, are land-use independent and have cleared almost all the drawbacks of the earlier versions, they are based on regional data from specific areas, and their applicability to other areas with different climates, soils, and land uses is questionable. These conventional equations are applicable to sheet and rill erosion and are unable to predict gully erosion or the spatial pattern of rills. Research has therefore focused on the development of non-conventional methods of soil erosion estimation; when these non-conventional methods are combined with GIS and remote sensing (RS), they give the spatial distribution of soil erosion. In the present paper, a review of the literature on non-conventional methods of soil erosion estimation supported by GIS and RS is presented.
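For reference, the multiplicative structure shared by the USLE-family models discussed above is simple; the factor values in the usage note are invented for illustration only.

```python
def usle(R, K, LS, C, P):
    """Universal Soil Loss Equation: A = R * K * LS * C * P.

    R  - rainfall erosivity factor
    K  - soil erodibility factor
    LS - slope length and steepness factor
    C  - cover-management factor
    P  - support practice factor
    Returns A, the average annual soil loss per unit area.
    """
    return R * K * LS * C * P
```

Because every factor enters multiplicatively and R and K are average annual values, the equation cannot resolve individual storm events, which is exactly the small-temporal-scale limitation noted above.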