Impact of MAC Layer on the Performance of Routing Protocols in Mobile Ad hoc Networks

A mobile ad hoc network is an autonomous system of mobile nodes connected by multi-hop wireless links without centralized infrastructure support. As mobile communication gains popularity, the need for suitable ad hoc routing protocols will continue to grow. Efficient dynamic routing is an important research challenge in such a network. Bandwidth-constrained mobile devices rely on on-demand routing protocols because of their effectiveness and efficiency. Many researchers have conducted numerous simulations comparing the performance of these protocols under varying conditions and constraints. Most of these studies, however, do not account for the MAC protocol, which affects the relative performance of the routing protocols considered in different network scenarios. In this paper we investigate how the choice of MAC protocol affects the relative performance of ad hoc routing protocols under different scenarios. We have evaluated the performance of these protocols using NS2 simulations. Our results show that the performance of ad hoc routing protocols suffers when they are run over different MAC layer protocols.

Bridging the Green-Value-Gap: A South African Approach

Green spaces might be very attractive, but where are the economic benefits? What value do nature and landscape have for us? What difference will they make to jobs, health and the economic strength of areas struggling with deprivation and social problems? [1]. There is a need to consider green spaces from a different perspective. Green planning is not just about flora and fauna, but also about planning for economic benefits [2]. It is worth trying to quantify the value of green spaces, since nature and landscape are crucially important to our quality of life and sustainable development. The reality, however, is that urban development often takes place at the expense of green spaces. Urbanization is an ongoing process throughout the world; however, hyper-urbanization without environmental planning is destructive, not constructive [3]. Urban spaces are believed to be more valuable than other land uses, particularly green areas, simply because of the market value attached to urban spaces. However, attractive landscapes can help raise the quality and value of the urban market even more. In order to reach these objectives of integrated planning, the Green-Value-Gap needs to be bridged. Economists have to understand the concept of green planning and its spin-offs, and environmentalists have to understand the importance of urban economic development and its benefits to green planning. An interface between Environmental Management, Economic Development and sustainable Spatial Planning is needed to bridge the Green-Value-Gap.

Utilizing Ontologies Using Ontology Editor for Creating Initial Unified Modeling Language (UML) Object Model

One of the problems in object-oriented software development is the difficulty of identifying appropriate and suitable objects with which to start the system. In this work, ontologies are used to support object discovery in the initial stages of object-oriented software development. Many studies have demonstrated a strong correspondence between object models and ontologies. Constructing an ontology from an object model, known as ontology engineering, is well established; this research, on the other hand, aims to show that building an object model from ontologies is also promising and practical. Ontology classes for many specific domains are available online and can be located with semantic search engines. Several supporting tools exist; the ones used in this research are the Protégé ontology editor and Visual Paradigm, which together give a good outcome. This research shows how the approach works efficiently on a real case study using ontology classes in the travel/tourism domain. Classes, properties and relationships from more than two ontologies need to be combined in order to generate the object model. This paper presents a simple methodological framework that explains the process of discovering objects. The results show that the framework has great value and that there is room for expansion. Reusing existing ontologies offers a much cheaper alternative than building new ones from scratch. More ontologies are becoming available on the web, and online ontology libraries for storing and indexing ontologies are increasing in number and demand. Semantic and ontology search engines have also started to appear, facilitating the search and retrieval of online ontologies.
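
As a rough illustration of the ontology-to-object-model direction described above (not the Protégé/Visual Paradigm tool chain used in the paper), the following Python sketch uses the owlready2 library and a hypothetical travel ontology IRI to list ontology classes, data properties and object properties as candidate UML classes, attributes and associations.

```python
# Illustrative sketch only: extract candidate UML elements from an OWL ontology.
# The ontology IRI below is hypothetical; owlready2 is an assumed helper library.
from owlready2 import get_ontology

ONTOLOGY_IRI = "http://example.org/travel.owl"  # hypothetical ontology location

def sketch_object_model(iri: str) -> None:
    onto = get_ontology(iri).load()

    # Candidate UML classes come from the ontology classes.
    for cls in onto.classes():
        parents = [p.name for p in cls.is_a if hasattr(p, "name")]
        print(f"class {cls.name}  (generalizes: {parents or ['Thing']})")

        # Data properties whose domain includes this class become attributes.
        for prop in onto.data_properties():
            if cls in prop.domain:
                print(f"  attribute: {prop.name}")

    # Object properties become candidate associations between classes.
    for prop in onto.object_properties():
        domains = [d.name for d in prop.domain if hasattr(d, "name")]
        ranges = [r.name for r in prop.range if hasattr(r, "name")]
        print(f"association {prop.name}: {domains} -> {ranges}")

if __name__ == "__main__":
    sketch_object_model(ONTOLOGY_IRI)
```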

Application of the Balanced Scorecard into the Formulation of the Firm Strategy

In the contemporary global and dynamically developing environment, strategic planning is fundamental. It is a complicated yet important process for continually maintaining competitive advantage. The aim of the paper is the formulation of strategic goals for the needs of small enterprises. The Balanced Scorecard is used as a balanced system of indicators for clarifying and translating the vision into particular goals, and within its individual perspectives the paper focuses on strategic goals. Subsequently, the IDINMOSU concept of competitiveness, which connects to the Balanced Scorecard, is discussed.

On the Verification of Power Nap Associated with Stage 2 Sleep and Its Application

One of the most important causes of accidents is driver fatigue. To reduce the accident rate, the driver needs a quick nap when feeling sleepy. Hence, searching for the minimum nap period is a very challenging problem. The purpose of this paper is twofold: to investigate the shortest possible nap period and its relationship with stage 2 sleep, and to develop an automatic stage 2 sleep detection and alarm device. The experiment for this investigation was designed with 21 subjects. It yields the result that waking the subjects after they have been in stage 2 sleep for 3-5 minutes can efficiently reduce sleepiness. Furthermore, the automatic stage 2 sleep detection and alarm device achieves a real-time detection accuracy of approximately 85%, which is comparable to that of a commercial sleep laboratory system.
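
The abstract does not detail the detection algorithm, so the following Python sketch is purely illustrative: stage 2 sleep is commonly associated with sleep spindles (roughly 12-14 Hz bursts), and the toy detector below flags 30-second EEG epochs whose spindle-band power exceeds an arbitrary threshold. The sampling rate, epoch length and threshold are assumptions, not parameters from the paper.

```python
# Toy stage 2 proxy: elevated power in the sleep-spindle band (12-14 Hz).
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 256          # assumed EEG sampling rate (Hz)
EPOCH_SEC = 30    # standard sleep-scoring epoch length (s)

def spindle_band_power(eeg_epoch: np.ndarray, fs: int = FS) -> float:
    """Mean power of the 12-14 Hz (sleep spindle) band in one EEG epoch."""
    sos = butter(4, [12, 14], btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, eeg_epoch)
    return float(np.mean(filtered ** 2))

def looks_like_stage2(eeg_epoch: np.ndarray, threshold: float) -> bool:
    """Very rough proxy for stage 2 sleep: elevated spindle-band power."""
    return spindle_band_power(eeg_epoch) > threshold

# Example with synthetic data: background noise plus a 13 Hz burst.
t = np.arange(0, EPOCH_SEC, 1 / FS)
epoch = np.random.default_rng(0).normal(0, 1, t.size)
epoch[:FS * 2] += 3 * np.sin(2 * np.pi * 13 * t[:FS * 2])
print(looks_like_stage2(epoch, threshold=0.05))
```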

Choice of Exchange Rate Regimes: Case of Ex-Yugoslavia Countries

There are few subjects in macroeconomics that are so widely discussed yet remain as controversial and as far from a clear solution as the choice of exchange rate regime. National authorities need to take numerous fundamentals into consideration while trying to fulfil the goals of economic growth, low and stable inflation and international stability. This paper focuses on the countries of ex-Yugoslavia and their exchange rate history as independent states. We follow the development of the regimes in six countries during the transition, through the financial crisis of the second half of the 2000s, to the prospects of their final goal: full membership in the European Union. The main question is to what extent the exchange rate regime has contributed to their economic success, considering other objective factors.

Development of an Autonomous Greenhouse Gas Monitoring System

This paper describes the designs of a first- and second-generation autonomous gas monitoring system and the successful field trial of the final (second-generation) system. Infrared sensing technology is used to detect and measure the greenhouse gases methane (CH4) and carbon dioxide (CO2) at point sources. The ability to monitor real-time events is further enhanced through the implementation of both GSM and Bluetooth technologies to communicate these data in real time. These systems are robust, reliable and a necessary tool wherever the monitoring of gas events in real time is needed.

Methodology Issues and Design Approach of VLE on Mathematical Concepts Acquisition within Secondary Education in England

This study used a positivist quantitative approach to examine the acquisition of mathematical concepts by KS4 (14-16) Special Educational Needs (SEN) students within the school sector in England. The research is based on a pilot study, and the design is holistic in its approach, mixing methodologies. The study combines qualitative and quantitative methods in gathering formative data for the design process. Although the approach could best be described as mixed-method, it rests fundamentally on a strong positivist paradigm, informed by an earlier understanding of the differentiation of the students, the student-teacher body and the various indicators being measured, which requires an attenuated description of individual research subjects. The design process involves four phases with five key stages: literature review and document analysis, survey, interview, observation, and finally analysis of the data set. The research identified the need for triangulation, with Reid's phases of data management providing the scaffold for the study. The study clearly identified the ideological and philosophical aspects of educational research design for the study of mathematics by special educational needs (SEN) students in England using a virtual learning environment (VLE) platform.

A Blind SLM Scheme for Reduction of PAPR in OFDM Systems

In this paper we propose a blind algorithm for peak-to-average power ratio (PAPR) reduction in OFDM systems, based on the selected mapping (SLM) algorithm as a distortionless method. The main drawback of the conventional SLM technique is the need to transmit several side information bits for each data block, which results in a loss in data rate. In the proposed method, a special number of carriers in the OFDM frame is reserved and rotated with one of the possible phases according to the number of phase sequence blocks in the SLM algorithm. Reserving a limited number of carriers does not compromise the PAPR reduction of the OFDM signal. Simulation results show that using the ML criterion at the receiver leads to the same system performance as the conventional SLM algorithm, while there is no need to send any side information to the receiver.
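
For reference, the sketch below implements the conventional SLM selection step that the proposed blind scheme builds on: several random phase rotations of the same data block are tried and the candidate with the lowest PAPR is transmitted. It does not implement the reserved-carrier phase embedding of the blind variant; the block size and number of phase sequences are arbitrary.

```python
# Conventional SLM selection (illustrative only, not the blind variant).
import numpy as np

def papr(x: np.ndarray) -> float:
    p = np.abs(x) ** 2
    return float(p.max() / p.mean())

def slm_select(symbols: np.ndarray, num_sequences: int = 8, rng=None):
    """Return the time-domain OFDM block with the lowest PAPR among
    num_sequences random phase rotations of the same data block."""
    rng = np.random.default_rng(rng)
    n = symbols.size
    best_signal, best_papr, best_index = None, np.inf, -1
    for u in range(num_sequences):
        phases = np.exp(1j * rng.uniform(0, 2 * np.pi, n))
        candidate = np.fft.ifft(symbols * phases)
        p = papr(candidate)
        if p < best_papr:
            best_signal, best_papr, best_index = candidate, p, u
    return best_signal, best_papr, best_index

# Example: one OFDM block of 64 QPSK subcarriers.
bits = np.random.default_rng(0).integers(0, 2, (64, 2))
qpsk = (np.array([1, -1])[bits] @ np.array([1, 1j])) / np.sqrt(2)
signal, ratio, idx = slm_select(qpsk, num_sequences=8, rng=0)
print(f"selected sequence {idx}, PAPR = {10 * np.log10(ratio):.2f} dB")
```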

Noise Reduction in Image Sequences using an Effective Fuzzy Algorithm

In this paper, we propose a novel spatiotemporal fuzzy-based algorithm for noise filtering of image sequences. Our proposed algorithm uses adaptive weights based on triangular membership functions, and a median filter is used to suppress noise. Experimental results show that when the images are corrupted by high-density salt-and-pepper noise, our fuzzy-based algorithm for noise filtering of image sequences is much more effective in suppressing noise and preserving edges than previously reported algorithms such as [1-7]. Indeed, the weights assigned to noisy pixels are highly adaptive, so that they make good use of the correlation between pixels. On the other hand, motion estimation methods are error-prone and, under high-density noise, may degrade the filter performance; therefore, our proposed fuzzy algorithm does not need any estimation of motion trajectories. The proposed algorithm removes noise admissibly without any knowledge of the salt-and-pepper noise density.
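
A minimal single-frame sketch of the underlying idea (not the authors' spatiotemporal algorithm) is given below: each pixel receives a triangular membership weight based on its distance from the local median, and pixels with low membership, typical of salt-and-pepper impulses, are pulled towards that median. The window size and membership width are assumptions.

```python
# Single-frame fuzzy-weighted median denoising sketch (illustrative only).
import numpy as np

def triangular_membership(diff, width: float = 60.0):
    """1 at diff = 0, linearly decreasing to 0 at |diff| >= width."""
    return np.clip(1.0 - np.abs(diff) / width, 0.0, 1.0)

def fuzzy_median_denoise(frame: np.ndarray, width: float = 60.0) -> np.ndarray:
    h, w = frame.shape
    padded = np.pad(frame.astype(float), 1, mode="edge")
    out = frame.astype(float).copy()
    for y in range(h):
        for x in range(w):
            window = padded[y:y + 3, x:x + 3]
            med = np.median(window)
            weight = triangular_membership(frame[y, x] - med, width)
            # Weight ~1 keeps the original pixel, weight ~0 trusts the median.
            out[y, x] = weight * frame[y, x] + (1.0 - weight) * med
    return out.astype(frame.dtype)
```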

Validation on 3D Surface Roughness Algorithm for Measuring Roughness of Psoriasis Lesion

Psoriasis is a widespread skin disease affecting up to 2% of the population, with plaque psoriasis accounting for about 80% of cases. It can be identified as a red lesion, and at higher severities the lesion is usually covered with rough scale. Psoriasis Area Severity Index (PASI) scoring is the gold standard method for measuring psoriasis severity, and scaliness is one of the PASI parameters that needs to be quantified. The surface roughness of a lesion can be used as a scaliness feature, since the scale on the lesion surface makes the lesion rougher. Dermatologists usually assess severity through their tactile sense, so direct contact between doctor and patient is required, and the assessment may not be objective. In this paper, a digital image analysis technique is developed to objectively determine the scaliness of psoriasis lesions and provide the PASI scaliness score. A psoriasis lesion is modelled as a rough surface created by superimposing a triangular waveform on a smooth average (curved) surface. For roughness determination, polynomial surface fitting is used to estimate the average surface, followed by a subtraction between the rough and average surfaces to give the elevation surface (surface deviations). The roughness index is calculated by applying the average-roughness equation to the height-map matrix. The roughness algorithm has been tested on 444 lesion models; in the validation, only 6 models could not be accepted (percentage error greater than 10%), and these errors are due to the quality of the scanned images. The roughness algorithm is also validated for roughness measurement on abrasive papers with flat surfaces. The Pearson correlation coefficient between the grade value (G) of the abrasive paper and Ra is -0.9488, which shows a strong relationship between G and Ra. The algorithm needs to be improved by surface filtering, especially to overcome problems with noisy data.
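
The roughness computation described above (polynomial surface fitting, subtraction and the average-roughness equation) can be sketched as follows; the polynomial order and the synthetic test surface are assumptions made for illustration only.

```python
# Sketch of the roughness pipeline: fit a low-order polynomial average surface,
# subtract it to obtain the elevation surface, then apply the Ra formula.
import numpy as np

def average_roughness(height_map: np.ndarray, order: int = 2) -> float:
    """Ra = mean absolute deviation from a fitted polynomial average surface."""
    h, w = height_map.shape
    y, x = np.mgrid[0:h, 0:w]
    x, y, z = x.ravel(), y.ravel(), height_map.ravel().astype(float)

    # Design matrix with all monomials x^i * y^j, i + j <= order.
    columns = [(x ** i) * (y ** j)
               for i in range(order + 1)
               for j in range(order + 1 - i)]
    A = np.column_stack(columns)

    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    average_surface = A @ coeffs          # estimated smooth average surface
    deviations = z - average_surface      # elevation surface (surface deviations)
    return float(np.mean(np.abs(deviations)))

# Example: synthetic rough surface = smooth paraboloid + triangular ripple.
yy, xx = np.mgrid[0:64, 0:64]
smooth = 0.01 * (xx - 32) ** 2 + 0.02 * (yy - 32) ** 2
ripple = 2.0 * np.abs(((xx / 4.0) % 2) - 1)   # triangular waveform, amplitude 2
print(f"Ra = {average_roughness(smooth + ripple):.3f}")
```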

On the AC-Side Interface Filter in Three-Phase Shunt Active Power Filter Systems

The proper selection of the AC-side passive filter interconnecting the voltage source converter to the power supply is essential to obtain satisfactory performance of an active power filter system. The use of an LCL-type filter has the advantage of eliminating the high-frequency switching harmonics in the current injected into the power supply. This paper is mainly focused on analyzing the influence of the interface filter parameters on the active filtering performance, and some design aspects are pointed out. Thus, the design of the AC interface filter starts from transfer functions, by imposing a filter performance requirement of significant attenuation of the switching-frequency current harmonics without affecting the harmonics to be compensated. A Matlab/Simulink model of the entire active filtering system, including a concrete nonlinear load, has been developed to examine the system performance. It is shown that a gamma LC filter can accomplish the required attenuation of the current provided by the converter. Moreover, the existence of an optimal value of the grid-side inductance which minimizes the total harmonic distortion factor of the power supply current is pointed out. Nevertheless, a small converter-side inductance and a damping resistance in series with the filter capacitance are absolutely needed in order to keep the ripple and oscillations of the current at the converter side within acceptable limits. The effect of changes in the LCL filter parameters is evaluated. It is concluded that good active filtering performance can be achieved with small values of the capacitance and converter-side inductance.
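
As a worked illustration of the attenuation requirement, the sketch below evaluates the magnitude of the standard LCL transfer function from converter voltage to grid-side current, with a damping resistance in series with the filter capacitance. The component values and frequencies are placeholders, not the values designed in the paper.

```python
# |i_grid / v_converter| for a damped LCL filter (grid voltage treated as 0).
import numpy as np

L1 = 1.0e-3    # converter-side inductance (H), assumed
L2 = 0.3e-3    # grid-side inductance (H), assumed
CF = 10e-6     # filter capacitance (F), assumed
RD = 2.0       # damping resistance (ohm), assumed

def lcl_gain(f_hz: float) -> float:
    """Magnitude of the grid-current response to the converter voltage."""
    s = 1j * 2 * np.pi * f_hz
    num = 1 + s * RD * CF
    den = s**3 * L1 * L2 * CF + s**2 * RD * CF * (L1 + L2) + s * (L1 + L2)
    return abs(num / den)

# Gain at a low-order harmonic to be compensated vs. a switching-frequency ripple:
for f in (250.0, 10_000.0):   # 5th harmonic of 50 Hz, and 10 kHz switching ripple
    print(f"{f:8.0f} Hz : {20 * np.log10(lcl_gain(f)):7.2f} dB")
```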

Study of Efficiency and Capability LZW++ Technique in Data Compression

The purpose of this paper is to show the efficiency and capability of LZW++ in data compression. The LZW++ technique is an enhancement of the existing LZW technique, which is modified to produce LZW++. LZW reads one character at a time, whereas LZW++ reads three characters at a time. This paper focuses on data compression and tests the efficiency and capability of LZW++ on different data formats such as doc, pdf and text files. Several experiments have been done with different types of data format. The results show that the LZW++ technique is better than the existing LZW technique in terms of compressed file size.
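
For context, the following is a textbook sketch of the baseline LZW compressor that reads one character at a time; the LZW++ variant described above would instead consume three characters per step. It is not the paper's implementation.

```python
# Baseline LZW compressor (one character per step), for reference only.
def lzw_compress(data: str) -> list[int]:
    """Return the list of dictionary codes for the input string."""
    dictionary = {chr(i): i for i in range(256)}
    next_code = 256
    current = ""
    output = []
    for ch in data:
        candidate = current + ch
        if candidate in dictionary:
            current = candidate            # keep extending the match
        else:
            output.append(dictionary[current])
            dictionary[candidate] = next_code
            next_code += 1
            current = ch
    if current:
        output.append(dictionary[current])
    return output

print(lzw_compress("TOBEORNOTTOBEORTOBEORNOT"))
```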

Heterogeneity-Aware Load Balancing for Multimedia Access over Wireless LAN Hotspots

Wireless LAN (WLAN) access in public hotspot areas has become popular in recent years. Since more and more multimedia information is available on the Internet, there is an increasing demand for accessing multimedia information through WLAN hotspots. Currently, the bandwidth offered by an IEEE 802.11 WLAN cannot support many simultaneous real-time video accesses. A possible way to increase the offered bandwidth in a hotspot is the use of multiple access points (APs). However, a mobile station is usually connected to the WLAN AP with the strongest received signal strength indicator (RSSI), so the total consumed bandwidth cannot be fairly allocated among the APs. In this paper, we propose an effective load-balancing scheme based on the support of IAPP and SNMP in the APs. The proposed scheme is an open solution and does not need any changes in either the wireless stations or the APs. This makes load balancing possible in WLAN hotspots, where a variety of heterogeneous mobile devices are employed.
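
A toy illustration of the load-balancing idea (not the IAPP/SNMP mechanism proposed in the paper) is sketched below: instead of always associating with the strongest-RSSI AP, a station is steered to the acceptable AP with the most spare capacity. The RSSI threshold and AP figures are invented for the example.

```python
# Load-aware AP selection vs. pure RSSI association (illustrative only).
from dataclasses import dataclass

@dataclass
class AccessPoint:
    name: str
    rssi_dbm: float        # signal strength seen by the station
    load_mbps: float       # bandwidth currently consumed on this AP
    capacity_mbps: float   # nominal capacity of this AP

def pick_ap(candidates: list[AccessPoint], min_rssi_dbm: float = -75.0) -> AccessPoint:
    usable = [ap for ap in candidates if ap.rssi_dbm >= min_rssi_dbm]
    if not usable:
        # Fall back to pure RSSI association if no AP has acceptable signal.
        return max(candidates, key=lambda ap: ap.rssi_dbm)
    return max(usable, key=lambda ap: ap.capacity_mbps - ap.load_mbps)

aps = [AccessPoint("AP1", -55.0, 20.0, 25.0),
       AccessPoint("AP2", -68.0, 5.0, 25.0)]
print(pick_ap(aps).name)   # AP2: weaker signal but far more spare capacity
```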

A Frame Work for Query Results Refinement in Multimedia Databases

In the current age, retrieval of relevant information from massive amounts of data is a challenging job. Over the years, precise and relevant retrieval of information has attained high significance. There is a growing need in the market to build systems which can retrieve multimedia information that precisely meets the user's current needs. In this paper, we introduce a framework for refining query results before showing them to the user, using ambient intelligence, the user profile, the group profile, user location, time, day, user device type and extracted features. A prototype tool was also developed to demonstrate the efficiency of the proposed approach.
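
As a hypothetical illustration of result refinement (the framework itself also exploits ambient intelligence, user and group profiles and extracted features), the sketch below re-ranks results by blending the base content relevance with the overlap between result tags and the current context; the fields and weight are assumptions.

```python
# Context-weighted re-ranking of query results (illustrative only).
from dataclasses import dataclass, field

@dataclass
class Result:
    title: str
    content_score: float              # relevance score from the base query
    tags: set = field(default_factory=set)

def refine(results: list[Result], context: dict, context_weight: float = 0.4):
    """Re-rank results by blending content relevance with context overlap."""
    wanted = {context.get("location"), context.get("time_of_day"),
              context.get("device")} - {None}

    def score(r: Result) -> float:
        overlap = len(r.tags & wanted) / max(len(wanted), 1)
        return (1 - context_weight) * r.content_score + context_weight * overlap

    return sorted(results, key=score, reverse=True)

results = [Result("city-guide video", 0.90, {"desktop"}),
           Result("walking-tour audio", 0.80, {"mobile", "outdoors", "evening"})]
ctx = {"location": "outdoors", "time_of_day": "evening", "device": "mobile"}
print([r.title for r in refine(results, ctx)])
```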

Optimization of Multicast Transmissions in NC-HMIPv6 Environment

Multicast transmission allows a host (the source) to send a single flow destined for a group of hosts (the receivers). Any host wishing to belong to the group may explicitly register with that group via its multicast router, which is then responsible for conveying all information relating to the group to all registered hosts. However, in an environment in which the final receiver or the source frequently moves, the multicast flows need particular treatment. This constitutes one of the problems of multicast transmission, around which several proposals have been made, mostly for the general Mobile IPv6 case. In this article, we describe the problems involved in IPv6 multicast mobility and the existing proposals for their resolution. We then propose an architecture aiming to satisfy and optimize these transmissions in the specific case of a mobile multicast receiver in an NC-HMIPv6 environment.

Developing Online Bookstore to Facilitate Manual Process – UTP Case Study

Knowledge sharing enables information or knowledge to be transmitted from one source to another. This paper demonstrates the need for an online book catalogue that can be used to facilitate the dissemination of information on the textbooks used in the university. The project aims to give students and lecturers access to the list of books in the bookstore and, at the same time, to allow book reviewing without having to visit the bookstore physically. The research is carried out within boundaries that cover the current process of new book purchasing, the current system used by the bookstore and the current process lecturers follow when reviewing textbooks. A questionnaire distributed to 100 students and 40 lecturers is used to gather the requirements. The project has enabled a manual process to be carried out automatically through a web-based platform. The user acceptance survey shows that the target groups found this web service feasible to implement at Universiti Teknologi PETRONAS (UTP) and that they have shown positive interest in utilizing it in the future.

Speedup of Data Vortex Network Architecture

In this paper, 3X3 routing nodes are proposed to provide speedup and parallel processing capability in Data Vortex network architectures. The new design not only significantly improves network throughput and latency, but also eliminates the need for the distributed traffic control mechanism originally embedded among the nodes and the need for nodal buffering. Cost effectiveness is studied by comparison with the previously proposed 2-input buffered networks, and considerable performance enhancement can be achieved with similar or lower hardware cost. Unlike previous implementations, the network leaves a small probability of contention; therefore, the packet drop rate must be kept low for such an implementation to be feasible and attractive, and this can be achieved with a proper choice of operating conditions.

Towards a Suitable and Systematic Approach for Component Based Software Development

The software crisis refers to the situation in which developers are not able to complete projects within time and budget constraints and, moreover, these over-schedule and over-budget projects are of low quality as well. Several methodologies have been adopted from time to time to overcome this situation, and the current focus is component-based software engineering. In this approach, the emphasis is on the reuse of already existing software artifacts. But the results cannot be achieved just by preaching the principles; they need to be practiced as well. This paper highlights some of the very basic elements of this approach which have to be in place to get the desired goals of high-quality, low-cost software products with shorter time-to-market.

Salbutamol Sulphate-Ethylcellulose Tabletted Microcapsules: Pharmacokinetic Study using Convolution Approach

The aim of this article is to demonstrate the utility of a novel simulation approach, i.e. the convolution method, to predict the blood concentration of a drug from the dissolution data of salbutamol sulphate microparticulate formulations with different release patterns (1:1, 1:2 and 1:3, drug:polymer). USP 2007 dissolution apparatus II with 900 ml of double-distilled water stirred at 50 rpm was employed for the dissolution analysis. From the dissolution data, the blood drug concentration was determined, and in turn the predicted blood drug concentration data were used to calculate the pharmacokinetic parameters Cmax, Tmax and AUC. Convolution is a good biowaiver technique; however, to be fully useful it needs to be applied under conditions where biorelevant dissolution media are used.
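
A minimal sketch of the convolution approach, under an assumed one-compartment disposition model and placeholder parameter values (not salbutamol sulphate estimates from the paper), is given below: the input rate derived from cumulative dissolution data is convolved with a unit impulse response to predict the concentration-time profile, from which Cmax, Tmax and AUC are read.

```python
# Convolution-based prediction of a plasma-concentration profile (illustrative).
import numpy as np

dt = 0.25                                   # time step (h)
t = np.arange(0, 24, dt)                    # prediction horizon (h)

# Cumulative fraction dissolved, e.g. a first-order release profile.
frac_dissolved = 1 - np.exp(-0.4 * t)

dose_mg, F, vd_l, ke = 8.0, 0.5, 150.0, 0.14   # assumed dose and PK parameters

# Input rate (mg/h) is the derivative of the cumulative amount dissolved.
input_rate = np.gradient(dose_mg * F * frac_dissolved, dt)

# Unit impulse response of a one-compartment model (concentration per mg dosed).
uir = np.exp(-ke * t) / vd_l

# Discrete convolution approximates the convolution integral (units: mg/L).
conc = np.convolve(input_rate, uir)[: t.size] * dt

cmax, tmax = conc.max(), t[conc.argmax()]
auc = float(conc.sum() * dt)                # rectangle-rule AUC (mg*h/L)
print(f"Cmax = {cmax * 1000:.1f} ng/mL at Tmax = {tmax:.2f} h, "
      f"AUC = {auc * 1000:.1f} ng*h/mL")
```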