A Monte Carlo Method for Data Stream Analysis

Data stream analysis is the process of computing various summaries and derived values from large amounts of data that are continuously generated at a rapid rate. The nature of a stream does not allow revisiting each data element. Furthermore, data processing must be fast to produce timely analysis results. These requirements impose constraints on algorithm design in order to balance correctness against timely responses. Several techniques have been proposed over the past few years to address these challenges. They can be categorized as either data-oriented or task-oriented. The data-oriented approach analyzes a subset of data or a smaller transformed representation, whereas the task-oriented approach solves the problem directly via approximation techniques. We propose a hybrid approach to the data stream analysis problem: the stream is statistically transformed into a smaller representation, and its characteristics are then computationally approximated. We adopt a Monte Carlo method in the approximation step. The data reduction is performed both horizontally and vertically through our EMR sampling method. The proposed method is evaluated by a series of experiments; we apply our algorithm to clustering and classification tasks to assess the utility of our approach.
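
As a rough illustration of the two-stage idea, the sketch below pairs one-pass stream sampling with a Monte Carlo estimate of a simple summary (the mean). Reservoir sampling stands in for the paper's EMR sampling, whose details are not given here, and all sizes and parameters are illustrative assumptions.

```python
import random

def reservoir_sample(stream, k, rng=random.Random(0)):
    """Keep a uniform random sample of k elements from a one-pass stream
    (a stand-in for the paper's EMR sampling, which is not specified here)."""
    sample = []
    for i, x in enumerate(stream):
        if i < k:
            sample.append(x)
        else:
            j = rng.randint(0, i)      # replaces an element with probability k/(i+1)
            if j < k:
                sample[j] = x
    return sample

def monte_carlo_mean(sample, n_trials=200, rng=random.Random(1)):
    """Approximate a summary (here: the mean) by averaging over random resamples."""
    estimates = []
    for _ in range(n_trials):
        resample = [rng.choice(sample) for _ in sample]
        estimates.append(sum(resample) / len(resample))
    return sum(estimates) / n_trials

# Usage: a synthetic stream of one million values reduced to 1,000 samples.
stream = (random.gauss(10.0, 2.0) for _ in range(1_000_000))
s = reservoir_sample(stream, 1000)
print(monte_carlo_mean(s))
```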

The Effect of Mercerization Treatment Parameters on Natural Fiber Reinforced Polymer Matrix Composites: A Brief Review

Environmental awareness and the depletion of petroleum resources are among the vital factors motivating researchers to explore the potential of reusing natural fibers as an alternative composite material in industries such as packaging, automotive and building construction. Natural fibers are abundant, low in cost and lightweight, and, most importantly, biodegradable, which is why the resulting composites are often called "eco-friendly" materials. However, their applications are still limited by several factors such as moisture absorption, poor wettability and large scatter in mechanical properties. One of the main challenges of natural fiber reinforced matrix composites is the fibers' tendency to entangle and form agglomerates during processing due to fiber-fiber interaction. This hinders good dispersion of the fibers in the matrix, resulting in poor interfacial adhesion between the hydrophobic matrix and the hydrophilic natural fiber reinforcement. To overcome this challenge, fiber treatment, which modifies the fiber surface topology by chemical, physical or mechanical techniques, is a common option. This paper focuses on the effect of mercerization treatment on the enhancement of the mechanical properties of natural fiber reinforced composites, or so-called biocomposites. It specifically discusses mercerization parameters and the resulting enhancement of the mechanical properties of natural fiber reinforced composites.

Factors of Effective Business Software Systems Development and Enhancement Projects Work Effort Estimation

The majority of Business Software Systems (BSS) Development and Enhancement Projects (D&EP) fail to meet their effectiveness criteria, which leads to considerable financial losses. One of the fundamental reasons for such projects' exceptionally low success rate is improperly derived estimates of their costs and time. In the case of BSS D&EP these attributes are determined by the work effort, yet reliable and objective effort estimation still appears to be a great challenge to software engineering. This paper therefore presents the most important synthetic conclusions from the author's own studies concerning the main factors of effective BSS D&EP work effort estimation. Rational investment decisions made on the basis of reliable and objective criteria make it possible to reduce losses caused not only by abandoned projects but also by large overruns of the time and costs of BSS D&EP execution.

The Para-Universe of Collaborative Group Work in Today's University Classrooms: Strategies to Help Ensure Success

Group work, projects and discussions are important components of teacher education courses, whether they are delivered in face-to-face, blended or exclusively online formats. This paper examines the variety of tasks and challenges of this learning format in a face-to-face teacher education class, providing specific examples of both failure and success from the student and instructor perspectives. The discussion begins with a brief history of collaborative and cooperative learning, moves to an exploration of the promised benefits, and then examines some of the challenges that can arise specifically from the use of new technologies. The discussion concludes with guidelines and specific suggestions.

Context Aware Lightweight Energy Efficient Framework

Context awareness is the capability of mobile computing devices to sense their physical environment and adapt their behavior accordingly. The term context-awareness, in ubiquitous computing, was introduced by Schilit in 1994 and has become one of the most exciting concepts in early 21st-century computing, fueled by recent developments in pervasive computing (i.e. mobile and ubiquitous computing). These include computing devices worn by users, embedded devices, smart appliances, sensors surrounding users and a variety of wireless networking technologies. Context-aware applications use context information to adapt interfaces, tailor the set of application-relevant data, increase the precision of information retrieval, discover services, make user interaction implicit, or build smart environments. For example, a context-aware mobile phone can recognize that its user is currently in a meeting room and reject any unimportant calls. One of the major challenges in providing users with context-aware services lies in continuously monitoring their contexts based on numerous sensors connected to the context-aware system through wireless communication. A number of sensor-based context-aware frameworks have been proposed, but many of them neglect the fact that monitoring with sensors imposes heavy workloads on ubiquitous devices with limited computing power and battery. In this paper, we present CALEEF, a lightweight and energy-efficient context-aware framework for resource-limited ubiquitous devices.
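
To make the adaptation idea concrete, here is a minimal sketch of a context-driven policy in the spirit of the meeting-room example above. CALEEF's actual context model, sensors and policies are not described here, so the fields, rules and thresholds below are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Context:
    location: str          # e.g. reported by an indoor positioning sensor
    activity: str          # e.g. "in_meeting", inferred from a calendar or accelerometer
    battery_level: float   # 0.0 - 1.0, used to throttle sensing frequency

def handle_incoming_call(ctx: Context, caller_priority: int) -> str:
    """Adapt phone behaviour to the sensed context (illustrative policy only)."""
    if ctx.activity == "in_meeting" and caller_priority < 5:
        return "reject"                 # unimportant call during a meeting
    if ctx.battery_level < 0.1:
        return "divert_to_voicemail"
    return "ring"

def sensing_interval(ctx: Context, base_seconds: float = 5.0) -> float:
    """A lightweight policy: sense less often when the battery is low."""
    return base_seconds if ctx.battery_level > 0.3 else base_seconds * 4

print(handle_incoming_call(Context("meeting_room", "in_meeting", 0.6), caller_priority=2))
print(sensing_interval(Context("office", "working", 0.2)))
```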

Sprayer Boom Active Suspension Using Intelligent Active Force Control

Controlling the undesired vibrations of a sprayer boom poses a great challenge due to varying disturbances and operating conditions. Sprayer boom movements lead to reduced spraying efficiency and crop yield. This paper describes the design of a novel control method for an active suspension system that applies a proportional-integral-derivative (PID) controller with an active force control (AFC) scheme, integrated with an iterative learning algorithm, to a sprayer boom. Iterative learning, as an intelligent method, is principally used to calculate the best value of the estimated inertia of the sprayer boom needed for the AFC loop. Results show that the proposed AFC-based scheme performs much better than the standard PID control technique. They also show that the system is more robust and accurate.
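
The sketch below illustrates the general PID-plus-AFC idea on a single-axis mass model: the AFC loop estimates the disturbance from the measured acceleration, the applied force and an estimated inertia, and subtracts it from the PID command. The boom dynamics, the exact AFC formulation and the iterative-learning update used in the paper are not reproduced; all masses and gains are illustrative assumptions.

```python
import math

# Minimal single-axis sketch of PID + Active Force Control (AFC) on a mass model.
m_true = 12.0            # "true" boom inertia seen by the actuator (kg)
m_est = 11.0             # AFC's estimated inertia (what iterative learning would tune)
kp, ki, kd = 400.0, 50.0, 60.0
dt, T = 0.001, 3.0
x, v = 0.02, 0.0         # initial deflection (m) and velocity (m/s)
integ, prev_err, u, a = 0.0, -x, 0.0, 0.0

for k in range(int(T / dt)):
    t = k * dt
    err = 0.0 - x                           # regulate boom deflection to zero
    integ += err * dt
    deriv = (err - prev_err) / dt
    prev_err = err

    # AFC: estimate the disturbance from the previously measured acceleration and
    # applied force, then subtract it so the plant (ideally) sees only the PID force.
    d_est = m_est * a - u
    u_pid = kp * err + ki * integ + kd * deriv
    u = u_pid - d_est

    # Plant: mass driven by the actuator force plus an unknown ground-induced disturbance.
    d = 6.0 * math.sin(2 * math.pi * 1.5 * t)
    a = (u + d) / m_true
    v += a * dt
    x += v * dt

print(f"final deflection: {x:.5f} m")
```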

A Case Study of Collective Action in Fishermen's Wives Group (KUNITA), Malaysia

Collective action can be an effective means of local development as well as an important strategy to enhance livelihoods, especially among rural people. This article explores the level of collective action among members of the Fishermen's Wives Group (KUNITA) in Malaysia. KUNITA was established by the Malaysian Fishery Development Authority (LKIM) with the objective of raising the socio-economic status of fishermen's families. The members, who are mostly the wives and daughters of fishermen, are strongly encouraged by LKIM to venture into entrepreneurship activities. The objective of this research was to examine the level of collective action among members of KUNITA groups in the state of Selangor. The findings show that the high level of collective action among KUNITA members is strongly based on volunteerism. However, the level of cooperation among members in the group is relatively low. These findings present significant challenges for the group in maintaining the sustainability of the KUNITA organization.

mCRM's New Opportunities for Customer Satisfaction

This paper addresses a new challenge of customer satisfaction in mobile customer relationship management (mCRM). It presents a conceptualization of mCRM in terms of its unique customer satisfaction characteristics and develops an empirical framework for customer satisfaction in mCRM. A single-case study is applied as the methodology. In order to gain an overall view of the empirical case, the investigation draws on otherwise invisible but important company information. Interviews with the company's main informants are the key data source, through which the issues are identified and the proposed framework is built. The study supports the development of customer satisfaction in mCRM, links the theoretical framework to practice, and provides directions for future research. The paper is therefore useful for industry, as it helps companies understand how customer satisfaction changes the mCRM structure and increases competitive advantage. Finally, it contributes to practice by linking a theoretical framework of customer satisfaction in mCRM to a practical real case.

GeNS: a Biological Data Integration Platform

The scientific achievements of molecular biology depend greatly on the capability of computational applications to analyze laboratory results. A comprehensive analysis of an experiment typically requires studying the obtained dataset together with data available in several distinct public databases. However, developing centralized access to these distributed databases raises a set of challenges, such as: what is the best integration strategy, how to resolve nomenclature clashes, how to handle overlapping data across databases, and how to deal with huge datasets. In this paper we present GeNS, a system that uses a simple yet innovative approach to address several biological data integration issues. Compared with existing systems, the main advantages of GeNS are its maintenance simplicity and its coverage and scalability, in terms of the number of supported databases and data types. To support our claims we present the current use of GeNS in two concrete applications. GeNS currently contains more than 140 million biological relations and can be publicly downloaded or remotely accessed through SOAP web services.
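
As an illustration of remote access, a SOAP client call might look like the sketch below. The WSDL location and operation names are placeholders, since the abstract only states that GeNS exposes SOAP web services; the zeep library is used here purely as an example SOAP client.

```python
from zeep import Client

# Hypothetical WSDL location and operation name: the abstract does not publish
# the actual GeNS endpoint or its service interface.
WSDL_URL = "http://example.org/gens/service?wsdl"

client = Client(WSDL_URL)

# e.g. fetch all stored relations for a given gene identifier (illustrative call)
relations = client.service.getRelationsForEntity(entityId="BRCA1", entityType="gene")
for rel in relations:
    print(rel)
```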

Cognitive Radio Networks (CRN): Resource Allocation Techniques Based On DNA-inspired Computing

Spectrum is a scarce commodity, and the spectrum scarcity faced by wireless service providers has led to high congestion levels. Because all networks share a common pool of channels, exhausting the available channels forces networks to block services. Researchers have found that cognitive radio (CR) technology may resolve the spectrum scarcity. A CR is a self-configuring entity in a wireless network that senses its environment, tracks changes, and frequently exchanges information with its network. However, cognitive radio networks (CRN) face challenges, and conditions worsen while tracking changes, i.e. when reallocating to another under-utilized channel as a primary network user arrives. In this paper, a channel (resource) reallocation technique for CRN based on a DNA-inspired computing algorithm is proposed.
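
Since the abstract does not detail the DNA-inspired algorithm, the sketch below uses a genetic-algorithm-style encoding (a channel assignment as a "chromosome") only to illustrate evolutionary channel reallocation when primary users reclaim channels; the sizes, penalties and operators are illustrative assumptions, not the paper's method.

```python
import random

rng = random.Random(42)
N_USERS, N_CHANNELS = 8, 5          # secondary users and available channels (illustrative)
OCCUPIED = {1, 3}                   # channels currently reclaimed by primary users

def fitness(assign):
    """Reward assignments that avoid primary-occupied channels and collisions."""
    penalty = sum(1 for c in assign if c in OCCUPIED)    # interference with primaries
    collisions = len(assign) - len(set(assign))          # two secondaries on one channel
    return -(3 * penalty + collisions)

def evolve(pop_size=30, generations=60, mutation=0.1):
    pop = [[rng.randrange(N_CHANNELS) for _ in range(N_USERS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, N_USERS)
            child = a[:cut] + b[cut:]                    # one-point crossover
            if rng.random() < mutation:
                child[rng.randrange(N_USERS)] = rng.randrange(N_CHANNELS)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("channel assignment:", best, "fitness:", fitness(best))
```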

Agent-Based Simulation and Analysis of Network-Centric Air Defense Missile Systems

Network-Centric Air Defense Missile Systems (NCADMS) represent an advanced development of air defense missile systems and have been regarded as one of the major research issues in the military domain at present. Due to the lack of knowledge and experience of NCADMS, modeling and simulation becomes an effective approach to operational analysis, compared with equation-based ones. However, the complex dynamic interactions among entities and the flexible architectures of NCADMS put forward new requirements and challenges for the simulation framework and models. Agent-Based Simulation (ABS) explicitly addresses the modeling of heterogeneous individuals' behaviors. Agents have the capability to sense and understand their surroundings, make decisions, and act on the environment. They can also cooperate with others dynamically to perform the tasks assigned to them. ABS therefore proves to be an effective approach to explore the new operational characteristics emerging in NCADMS. In this paper, based on an analysis of the network-centric architecture and new cooperative engagement strategies for NCADMS, an agent-based simulation framework was designed by expanding the simulation framework of the so-called System Effectiveness Analysis Simulation (SEAS). The framework specifies the components, the relationships and interactions between them, and the structure and behavior rules of an agent in NCADMS. Based on scenario simulations, information and decision superiority and operational advantages in NCADMS were analyzed, and suggestions were provided for its future development.
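
A minimal sense-decide-act loop, sketched below, conveys the agent structure described above: each agent senses the shared air picture, decides on a target and shares its engagement so that others can cooperate. The SEAS-based framework and the NCADMS behavior rules of the paper are not reproduced; all entities and numbers are illustrative.

```python
import random

class InterceptorAgent:
    """Minimal sense-decide-act agent; not the paper's SEAS-based models."""
    def __init__(self, name, coverage):
        self.name, self.coverage, self.engaged = name, coverage, None

    def sense(self, tracks):
        return [t for t in tracks if abs(t["position"]) <= self.coverage]

    def decide(self, visible, claimed):
        free = [t for t in visible if t["id"] not in claimed]
        return min(free, key=lambda t: abs(t["position"])) if free else None

    def act(self, target, claimed):
        if target is not None:
            self.engaged = target["id"]
            claimed.add(target["id"])   # share the engagement decision over the "network"

rng = random.Random(0)
tracks = [{"id": i, "position": rng.uniform(-100, 100)} for i in range(6)]
agents = [InterceptorAgent("A", 60), InterceptorAgent("B", 90)]
claimed = set()                          # shared picture enabling cooperative engagement
for agent in agents:
    agent.act(agent.decide(agent.sense(tracks), claimed), claimed)
    print(agent.name, "engages track", agent.engaged)
```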

Requirements Engineering for Enterprise Applications Development: Seven Challenges in Higher Education Environment

This paper describes the challenges of requirements engineering for developing enterprise applications in a higher education environment. The development activities include software implementation, maintenance, enhancement, and support for both online transaction processing and overnight batch processing. Generally, enterprise applications for a higher education environment may include a Student Information System (SIS), an HR/payroll system, financial systems, etc. Many challenges arise in the requirements engineering phases when providing two distinct services: production processing support and systems development.

Challenges of Irrigation Water Supply in Croplands of Arid Regions and their Environmental Consequences – A Case Study in the Dez and Moghan Command Areas of Iran

Renewable water resources are crucial production variables in arid and semi-arid regions where intensive agriculture is practiced to meet the ever-increasing demand for food and fiber. This is crucial for the Dez and Moghan command areas, where water delivery problems and adverse environmental issues are widespread. This paper aims to identify major problem areas using on-farm surveys of 200 farmers, agricultural extensionists and water suppliers, complemented by secondary data and field observations during the 2010-2011 cultivating season. The SPSS package was used to analyze and synthesize the data. Results indicated inappropriate canal operations in both schemes, though there was no unanimity about the underlying causes. Inequitable and inflexible distribution was found to be rooted in deficient hydraulic structures, particularly in the main and secondary canals. The inadequacy and inflexibility of the water scheduling regime was the underlying cause of recurring pest and disease spread, which often led to declines in crop yield and quality. Although these problems were not disputed, the water suppliers were not prepared to link them to deficiencies in the operation of the main and secondary canals. They instead attributed them to the prevailing salinity, alkalinity, water table fluctuations and leaching of valuable agro-chemical inputs from the plants' root zone, with far-reaching consequences. Examples include the pollution of ground and surface water resources due to over-irrigation at the farm level, which falls under the growers' own responsibility. Poor irrigation efficiency and adverse environmental problems were attributed to deficient and outdated farming practices that were in turn rooted in poor extension programs and irrational water charges.

Optimal Water Conservation in Mechanical Cooling Tower Operations

Water recycling represents an important challenge for many countries, in particular those where this natural resource is scarce. A high proportion of the water consumed by industry is used for cooling purposes, and generally this water is discharged directly into the environment. This discharge causes serious environmental damage as well as a significant waste of a precious resource. One way to solve these problems is to reuse and recycle the warm water by cooling it with a natural medium, such as air, in a heat exchanger unit known as a cooling tower. Poor performance, design or reliability of a cooling tower results in a lower cooling water flow rate and an increase in water evaporation, and hence losses of water and energy. This paper presents an experimental investigation of the thermal and hydraulic performance of a mechanical cooling tower. It shows that the water evaporation rate, Mev, increases with the air and water flow rates as well as with the inlet water temperature, and that, for fixed air flow rates, the pressure drop (ΔPw/Z) increases with increasing water flow rate, L, due to the hydrodynamic behavior of the air/water flow.
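
For reference, the evaporated water can be related to the air-side humidity change through the standard psychrometric mass balance below; this is a textbook relation, not necessarily the correlation used in the experimental analysis.

```latex
% Air-side mass balance for the evaporated water in a cooling tower:
%   G   : dry-air mass flow rate [kg/s]
%   W_1 : humidity ratio of the air at the inlet  [kg water / kg dry air]
%   W_2 : humidity ratio of the air at the outlet [kg water / kg dry air]
\[
  M_{ev} = G \, (W_{2} - W_{1})
\]
% Higher air or water flow rates and a hotter inlet raise W_2, hence M_ev increases.
```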

Authentication Protocol for Wireless Sensor Networks

Wireless sensor networks can be used to measure and monitor many challenging problems and are typically used in monitoring, tracking and controlling applications such as battlefield monitoring, object tracking, habitat monitoring and home sentry systems. However, wireless sensor networks pose unique security challenges, including forgery of sensor data, eavesdropping, denial of service attacks, and the physical compromise of sensor nodes. Nodes in a sensor network may disappear due to power exhaustion or malicious attacks. To extend the life span of the sensor network, new nodes must be deployed. In military scenarios, an intruder may directly deploy malicious nodes or manipulate existing nodes to introduce malicious new nodes through many kinds of attacks. To prevent malicious nodes from joining the sensor network, security must be built into the design of sensor network protocols. In this paper, we propose a security framework that provides a complete security solution against the known attacks in wireless sensor networks. Our framework accomplishes authentication of new nodes along with recognition of malicious nodes. When deployed as a framework, a higher degree of security is achievable than with conventional sensor network security solutions. The proposed framework can protect against most of the notorious attacks on sensor networks and attains better computation and communication performance. Unlike conventional authentication methods based on node identity alone, it incorporates both node identity and a node security timestamp into the authentication procedure. Hence the security protocols not only verify the identity of each node but also distinguish between new nodes and old nodes.
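
A minimal sketch of identity-plus-timestamp authentication is given below, using an HMAC over the node identity and a freshness window; the paper's actual message formats, key management and malicious-node detection are not specified in the abstract, so everything here is an illustrative assumption.

```python
import hashlib, hmac, time

NETWORK_KEY = b"pre-shared-deployment-key"   # illustrative pre-shared key
FRESHNESS_WINDOW = 30.0                      # seconds: older join requests are rejected

def join_request(node_id: str) -> dict:
    """A joining node authenticates its identity together with a timestamp."""
    ts = time.time()
    mac = hmac.new(NETWORK_KEY, f"{node_id}|{ts}".encode(), hashlib.sha256).hexdigest()
    return {"node_id": node_id, "timestamp": ts, "mac": mac}

def verify_join(req: dict, known_nodes: set) -> str:
    """Verify the MAC, check freshness, and distinguish new from old nodes."""
    expected = hmac.new(NETWORK_KEY, f"{req['node_id']}|{req['timestamp']}".encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, req["mac"]):
        return "reject: bad MAC (possible forged node)"
    if time.time() - req["timestamp"] > FRESHNESS_WINDOW:
        return "reject: stale timestamp (possible replay)"
    status = "old node rejoining" if req["node_id"] in known_nodes else "new node admitted"
    known_nodes.add(req["node_id"])
    return status

known = {"node-01", "node-02"}
print(verify_join(join_request("node-07"), known))   # new node admitted
print(verify_join(join_request("node-01"), known))   # old node rejoining
```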

Medical Image Segmentation Using Deformable Model and Local Fitting Binary: Thoracic Aorta

This paper presents an application of level sets for the segmentation of abdominal and thoracic aortic aneurysms in CTA datasets. An important challenge in reliably detecting the aorta is the need to overcome problems associated with intensity inhomogeneities. Level sets are part of an important class of methods that utilize partial differential equations (PDEs) and have been extensively applied in image segmentation. A kernel function in the level set formulation aids the suppression of noise in the extracted regions of interest and then guides the motion of the evolving contour for the detection of weak boundaries. The speed of curve evolution has been significantly improved, with a resulting decrease in segmentation time compared with previous implementations of level sets, and the method is shown to be more effective than other approaches in coping with intensity inhomogeneities. We apply the Courant-Friedrichs-Lewy (CFL) condition as the stability criterion for our algorithm.
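
For explicit level-set updates of the form φ_t + F|∇φ| = 0, the CFL stability criterion bounds the time step roughly as shown below (standard form; the exact constant used in the paper is not stated in the abstract).

```latex
% CFL stability criterion for the explicit level-set update
% \phi_t + F\,|\nabla\phi| = 0 on a grid with spacings \Delta x, \Delta y:
\[
  \Delta t \;\le\; \frac{\min(\Delta x,\, \Delta y)}{\max_{\Omega} |F|}
\]
% i.e. the evolving front may not cross more than one grid cell per time step.
```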

High-Speed Train Planning in France: Lessons from the Mediterranean TGV Line

To fight against the economic crisis, the French Government, like many others in Europe, has decided to give a boost to high-speed line projects. This paper explores the implementation and decision-making process in TGV projects and their evolution, especially since the Mediterranean TGV line. This project was probably the most controversial, yet paradoxically it represents today a huge success for all the actors involved. What lessons can we learn from this experience? How can we evaluate the impact of this project on TGV line planning? How can we characterize this implementation and decision-making process with regard to sustainability challenges? The construction of the Mediterranean TGV line was the occasion for several innovations: introducing more dialogue into the decision-making process, taking the environment into account, and introducing new project management methods and technological innovations. That is why this project appears today as an example of the integration of sustainable development. In this paper we examine the different kinds of innovations developed in this project, using concepts from the sociology of innovation to understand how these solutions emerged in a controversial situation. We then analyze the lessons drawn from this decision-making process (both immediately and a posteriori) and the way in which procedures evolved: the creation of new tools and devices (public consultation, project management...). Finally, we highlight the impact of this evolution on TGV project governance. In particular, new methods of implementation and financing involve a reconfiguration of the system of actors. The aim of this paper is to define the impact of this reconfiguration on negotiations between stakeholders.

Application of the Data Distribution Service for Flexible Manufacturing Automation

This paper discusses the applicability of the Data Distribution Service (DDS) to the development of automated and modular manufacturing systems, which require a flexible and robust communication infrastructure. DDS is an emerging standard for data-centric publish/subscribe middleware that provides an infrastructure for platform-independent, many-to-many communication. It particularly addresses the needs of real-time systems that require deterministic data transfer, have low memory footprints, and have high robustness requirements. After an overview of the standard, several aspects of DDS are related to current challenges in the development of modern manufacturing systems with distributed architectures. Finally, an example application based on a modular active fixturing system is presented to illustrate the described aspects.
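
The snippet below is a self-contained illustration of the topic-based, many-to-many publish/subscribe pattern that DDS standardizes, using a fixture-status topic loosely inspired by the example application; it is not the DDS API, and real deployments would rely on a concrete DDS implementation with QoS policies, discovery and typed topics.

```python
from collections import defaultdict

class Bus:
    """Toy topic-based publish/subscribe bus, illustrating the decoupled,
    many-to-many communication pattern of data-centric middleware."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, sample):
        # Every reader of the topic receives the sample; publishers and
        # subscribers never reference each other directly.
        for cb in self._subscribers[topic]:
            cb(sample)

bus = Bus()
bus.subscribe("FixtureStatus", lambda s: print("cell controller saw:", s))
bus.subscribe("FixtureStatus", lambda s: print("HMI saw:", s))
bus.publish("FixtureStatus", {"fixture": "F1", "clamp": "closed"})
```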

Efficient Supplies to Assembly Areas from Storage Stages

Guaranteeing the availability of the required parts at the scheduled time is a key logistical challenge, especially when several parts are required together. This article describes a tool that supports positioning within the trade-off between low stock costs and a high service level for a consumer.

GridNtru: High Performance PKCS

Cryptographic algorithms play a crucial role in the information society by protecting sensitive data from unauthorized access. It is clear that information technology will become increasingly pervasive; hence we can expect the emergence of ubiquitous or pervasive computing and ambient intelligence. These new environments and applications will present new security challenges, and there is no doubt that cryptographic algorithms and protocols will form part of the solution. The efficiency of a public key cryptosystem is mainly measured in computational overhead, key size and bandwidth. The RSA algorithm in particular is used in many applications to provide security. Although the security of RSA is beyond doubt, the evolution of computing power has caused a growth in the necessary key length. The fact that most smart card chips cannot process keys exceeding 1024 bits shows that an alternative is needed. NTRU is such an alternative: a collection of mathematical algorithms based on manipulating lists of very small integers and polynomials. This allows NTRU to achieve high speeds with minimal computing power. NTRU (Nth degree Truncated Polynomial Ring Unit) is the first secure public key cryptosystem not based on the factorization or discrete logarithm problems; an adversary, even with substantial computational resources and time, should not be able to break the key. Multi-party communication and the requirement of optimal resource utilization create a present-day demand for applications that need security enforcement techniques and can be enhanced with high-end computing. This has prompted us to develop high-performance NTRU schemes using approaches such as high-end computing hardware. Peer-to-peer (P2P) and enterprise grids are proven approaches for developing high-end computing systems, and by utilizing them one can improve the performance of NTRU through parallel execution. In this paper we propose and develop an application for NTRU using the enterprise grid middleware Alchemi. An analysis and comparison of its performance for various text files is presented.
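
For orientation, the core textbook NTRU operations over the truncated polynomial ring are summarized below (the original 1998 formulation; parameter choices and any GridNtru- or Alchemi-specific details are assumptions not drawn from the abstract).

```latex
% NTRU over the ring R = \mathbb{Z}[x]/(x^N - 1) with small moduli p \ll q:
\begin{align*}
  \text{Key generation:} \quad & f, g \in R \ \text{small}, \quad
      f_p \equiv f^{-1} \pmod{p}, \quad f_q \equiv f^{-1} \pmod{q}, \\
      & h \equiv f_q \cdot g \pmod{q} \quad \text{(public key; $f$, $f_p$ kept private)} \\[4pt]
  \text{Encryption:}  \quad & e \equiv p\,r \cdot h + m \pmod{q}
      \quad \text{($r$ a random small polynomial, $m$ the message polynomial)} \\[4pt]
  \text{Decryption:}  \quad & a \equiv f \cdot e \pmod{q}
      \ \text{(coefficients centered in } (-q/2,\, q/2]\text{)}, \qquad
      m \equiv f_p \cdot a \pmod{p}
\end{align*}
```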