Performance Analysis of Wireless Ad-Hoc Network Based on EDCA IEEE802.11e

IEEE 802.11e is the enhanced version of the IEEE 802.11 MAC designed to provide Quality of Service (QoS) in wireless networks. It supports QoS through service differentiation and prioritization mechanisms: data traffic receives different priorities based on its QoS requirements. Fundamentally, applications are divided into four Access Categories (ACs). Each AC has its own buffer queue and behaves as an independent backoff entity, and every frame with a specific traffic priority is assigned to one of these access categories. IEEE 802.11e EDCA (Enhanced Distributed Channel Access) is designed to enhance the IEEE 802.11 DCF (Distributed Coordination Function) mechanism by providing a distributed access method that can support service differentiation among different classes of traffic. The performance of the IEEE 802.11e MAC layer with different ACs is evaluated to understand the actual benefits derived from the MAC enhancements.
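
For orientation, the sketch below lists the commonly cited default EDCA parameter set for an OFDM PHY (aCWmin = 15, aCWmax = 1023) and the per-AC backoff draw. The concrete values and the backoff helper are assumptions for illustration; the abstract does not state which parameter set the evaluation used.

```python
import random

# Commonly cited default EDCA parameters for an OFDM PHY (aCWmin = 15, aCWmax = 1023).
# Smaller AIFSN/CW values give an access category statistically earlier channel access.
EDCA_PARAMS = {
    "AC_BK": {"aifsn": 7, "cw_min": 15, "cw_max": 1023, "txop_ms": 0.0},
    "AC_BE": {"aifsn": 3, "cw_min": 15, "cw_max": 1023, "txop_ms": 0.0},
    "AC_VI": {"aifsn": 2, "cw_min": 7,  "cw_max": 15,   "txop_ms": 3.008},
    "AC_VO": {"aifsn": 2, "cw_min": 3,  "cw_max": 7,    "txop_ms": 1.504},
}

def draw_backoff_slots(ac, retry=0):
    """Each AC behaves as an independent backoff entity: after waiting AIFS it draws
    a uniform number of slots in [0, CW], where CW doubles per retry up to CWmax."""
    p = EDCA_PARAMS[ac]
    cw = min((p["cw_min"] + 1) * (2 ** retry) - 1, p["cw_max"])
    return random.randint(0, cw)
```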

Commercializing Technology Solutions: Moving from Products to Solutions

The paper outlines the drivers behind the movement from products to solutions in Hi-Tech Business-to-Business markets. It lists the challenges in enabling the transformation from products to solutions and also explores strategic and operational recommendations based on the authors' factual experiences with Japanese Hi-Tech manufacturing organizations. Organizations in Hi-Tech Business-to-Business markets are increasingly being compelled to move from the conventional products model to a solutions model. Despite the added complexity of solutions, successful technology commercialization can be achieved by making prudent choices in defining a relevant solutions model, by backing the solutions model with an appropriate organizational design, and by overhauling the new product development process and supporting infrastructure.

Fracture Characterization of Plain Woven Fabric Glass-Epoxy Composites

Delamination between layers is a major structural failure mode in composite materials. Delamination resistance is quantified by the critical strain energy release rate (SERR). The present investigation deals with the strain energy release rate of two woven fabric composites. The materials used are made of two types of plain-weave glass fiber (360 gsm and 600 gsm) with epoxy as the matrix. The fracture behavior is studied using the mode I double cantilever beam (DCB) test and the mode II end notched flexure (ENF) test, in order to determine the energy required for the initiation and growth of an artificial crack. The delamination energies of the two materials are compared in order to study the effect of weave and reinforcement on mechanical properties. The fracture mechanism is also analyzed by means of scanning electron microscopy (SEM). It is observed that the plain-weave fabric composite with the smaller strand width has higher interlaminar fracture properties than the plain-weave fabric composite with the larger strand width.
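
For context, a commonly used data-reduction scheme for these two tests (a sketch under the assumption of standard beam-theory reduction; the abstract does not state which scheme the study applied) evaluates the mode I SERR from the DCB test with modified beam theory and the mode II SERR from the ENF test with direct beam theory:

\[
G_{I} = \frac{3 P \delta}{2 b \left(a + |\Delta|\right)}, \qquad
G_{II} = \frac{9 P \delta a^{2}}{2 b \left(2 L^{3} + 3 a^{3}\right)},
\]

where \(P\) is the applied load, \(\delta\) the load-point displacement, \(b\) the specimen width, \(a\) the delamination length, \(L\) the half-span of the ENF specimen, and \(\Delta\) a correction for crack-tip rotation.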

On Best Estimation of the Parameters of the Weibull Distribution

The objective of this study is to introduce estimators for the parameters and survival function of the Weibull distribution using three different methods: maximum likelihood estimation, standard Bayes estimation, and modified Bayes estimation. We then compare the three methods in a simulation study to find the best one based on MPE and MSE.
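
As a minimal sketch of the first of the three methods only, the snippet below fits a two-parameter Weibull by maximum likelihood and evaluates the implied survival function; the standard and modified Bayes estimators are not shown, and the use of scipy is an assumption for illustration.

```python
import numpy as np
from scipy.stats import weibull_min

def weibull_mle_and_survival(data, t):
    """Maximum-likelihood fit of a two-parameter Weibull (location fixed at 0)
    and the resulting survival-function estimate S(t) = exp(-(t/scale)**shape)."""
    shape, loc, scale = weibull_min.fit(np.asarray(data), floc=0)
    survival = np.exp(-(np.asarray(t) / scale) ** shape)
    return shape, scale, survival
```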

Labeling Method in Steganography

In this paper a way of hiding a text message (steganography) in a gray-scale image is presented. The method first finds the binary value of each character of the text message, and then finds the dark (black) places of the gray image by converting the original image to a binary image and labeling each object of the image using 8-connectivity. These images are then converted to RGB images in order to locate the dark places, because in this way each gray level maps to an RGB color and the dark level of the gray image can be identified; if the gray image is very light, the histogram must be adjusted manually so that only the dark places are found. In the final stage, each group of 8 pixels from the dark places is treated as a byte, and the binary value of each character is placed in the low bit of each such byte; building the bytes from dark-region pixels increases the security of the basic LSB steganography method.
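
A simplified sketch of the embedding step is given below, assuming a uint8 gray-scale image, a fixed darkness threshold, and scipy's connected-component labeling as a stand-in for the paper's labeling stage; the RGB conversion and manual histogram adjustment are not reproduced.

```python
import numpy as np
from scipy import ndimage

def embed_in_dark_regions(gray, message, dark_thresh=40):
    """Hide message bits in the LSBs of pixels belonging to labeled dark regions
    of a gray-scale image (uint8). A sketch, not the full method."""
    dark_mask = gray < dark_thresh
    # Label dark objects using 8-connectivity, as in the described method.
    labels, _ = ndimage.label(dark_mask, structure=np.ones((3, 3)))
    idx = np.flatnonzero(labels.ravel() > 0)               # candidate pixel positions
    bits = np.unpackbits(np.frombuffer(message.encode(), dtype=np.uint8))
    if bits.size > idx.size:
        raise ValueError("message too long for available dark pixels")
    stego = gray.copy().ravel()
    stego[idx[:bits.size]] = (stego[idx[:bits.size]] & 0xFE) | bits   # write LSBs
    return stego.reshape(gray.shape)
```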

Analysis of Linear Equalizers for Cooperative Multi-User MIMO Based Reporting System

In this paper, we consider a multi-user multiple-input multiple-output (MU-MIMO) based cooperative reporting system for a cognitive radio network. In the reporting network, the secondary users forward the primary user data to a common fusion center (FC). The FC is equipped with linear equalizers and an energy detector to make the decision about the spectrum. The primary user data are assumed to be a digital video broadcasting - terrestrial (DVB-T) signal. The sensing channel and the reporting channel are modeled as additive white Gaussian noise and independent identically distributed Rayleigh fading, respectively. We analyze the detection probability of the MU-MIMO system with linear equalizers and derive a closed-form expression for the average detection probability. The system performance is also investigated under various MIMO scenarios through Monte Carlo simulations.
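
The sketch below illustrates the simulation idea only: zero-forcing equalization over an i.i.d. Rayleigh reporting channel followed by an energy detector, with a noise-only chi-square threshold. All parameters and the ZF choice are assumptions for illustration and the threshold is approximate; this is not the closed-form analysis of the paper.

```python
import numpy as np
from scipy.stats import chi2

def detection_probability(n_tx=2, n_rx=4, snr_db=0.0, n_samples=500, n_trials=2000, pfa=0.01):
    """Monte Carlo sketch: equalize the received streams with a zero-forcing filter
    and apply an energy detector to one equalized stream."""
    rng = np.random.default_rng(0)
    snr = 10.0 ** (snr_db / 10.0)
    thr = 0.5 * chi2.ppf(1.0 - pfa, df=2 * n_samples)       # threshold for unit-variance noise
    detections = 0
    for _ in range(n_trials):
        H = (rng.standard_normal((n_rx, n_tx)) + 1j * rng.standard_normal((n_rx, n_tx))) / np.sqrt(2)
        s = np.sqrt(snr / 2) * (rng.standard_normal((n_tx, n_samples)) + 1j * rng.standard_normal((n_tx, n_samples)))
        w = (rng.standard_normal((n_rx, n_samples)) + 1j * rng.standard_normal((n_rx, n_samples))) / np.sqrt(2)
        y = np.linalg.pinv(H) @ (H @ s + w)                  # zero-forcing equalization
        if np.sum(np.abs(y[0]) ** 2) > thr:                  # energy detector on the first stream
            detections += 1
    return detections / n_trials
```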

Genetic Algorithm for Solving Non-Convex Economic Dispatch Problem

Economic dispatch (ED) is considered to be one of the key functions in electric power system operation. This paper presents a new hybrid genetic algorithm (GA) based approach to economic dispatch problems. The GA is a widely used optimization algorithm predicated on the principle of natural evolution. Combining a chaotic queue with the GA generates several neighborhoods of near-optimal solutions to maintain solution variation and keep the search process from becoming premature. For chaotic queue generation, using the tent map instead of the logistic map improves the iterative speed. The results of the proposed approach are compared, in terms of fuel cost, with existing differential evolution and other methods in the literature.
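
A minimal sketch of the chaotic-queue idea is shown below: a tent-map sequence is mapped into the generator limits to produce candidate dispatch vectors around the current solution. The function names, the pull factor of 0.5, and mu = 1.99 are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def tent_map_sequence(length, x0=0.37, mu=1.99):
    """Chaotic sequence from the tent map x_{k+1} = mu * min(x_k, 1 - x_k);
    mu slightly below 2 avoids the finite-precision collapse to zero."""
    x = np.empty(length)
    x[0] = x0
    for k in range(1, length):
        x[k] = mu * min(x[k - 1], 1.0 - x[k - 1])
    return x

def chaotic_neighbourhood(solution, lower, upper, n_neighbours=10):
    """Build a queue of candidate dispatch vectors around the current solution
    by mapping the chaotic sequence into the generator output limits."""
    solution, lower, upper = map(np.asarray, (solution, lower, upper))
    seq = tent_map_sequence(n_neighbours * solution.size).reshape(n_neighbours, solution.size)
    candidates = lower + seq * (upper - lower)
    return 0.5 * (candidates + solution)     # pull candidates towards the current solution
```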

Extraction of Semantic Digital Signatures from MRI Photos for Image-Identification Purposes

This paper makes an attempt to solve the problem of searching for and retrieving similar MRI photos via Internet services using morphological features extracted from the original image. The study aims to serve as an additional tool alongside existing search and retrieval methods; until now, the main search mechanism has been syntactic, based on keywords. The proposed technique aims to serve the new requirements of libraries, one of which is the development of computational tools for the control and preservation of the intellectual property of digital objects, and especially of digital images. For this purpose, this paper proposes the use of a serial number extracted with a previously tested semantic-properties method. This method, centered on the multiple layers of a set of arithmetic points, assures two properties: the uniqueness of the final extracted number and the semantic dependence of this number on the image used as the method's input. The major advantage of the method is that it can control, to a reliable degree, the authentication of a published image or its partial modification. It also takes on the advantages of the known hash functions used by digital signature schemes, producing alphanumeric strings for authentication checking and for assessing the degree of similarity between an unknown image and an original image.

Speckle Reducing Contourlet Transform for Medical Ultrasound Images

Speckle noise affects all coherent imaging systems, including medical ultrasound. In medical images, noise suppression is a particularly delicate and difficult task: a tradeoff between noise reduction and the preservation of actual image features has to be made in a way that enhances the diagnostically relevant image content. Even though wavelets have been extensively used for denoising speckle images, we have found that denoising using contourlets gives much better performance in terms of SNR, PSNR, MSE, variance and correlation coefficient. The objective of the paper is to determine the number of levels of Laplacian pyramidal decomposition, the number of directional decompositions to perform on each pyramidal level, and the thresholding schemes which yield optimal despeckling of medical ultrasound images in particular. In the proposed method, the log-transformed original ultrasound image is subjected to the contourlet transform to obtain contourlet coefficients, and the transformed image is denoised by applying thresholding techniques to the individual bandpass subbands using a Bayes shrinkage rule. We quantify the achieved performance improvement.
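
The thresholding step can be sketched as follows on a single bandpass subband, assuming a BayesShrink-style soft threshold and the usual robust median estimate of the noise level; the contourlet decomposition itself and the surrounding log/exponential transform of the image are not shown.

```python
import numpy as np

def estimate_noise_sigma(finest_subband):
    """Robust median estimator of the noise level from the finest subband."""
    return np.median(np.abs(finest_subband)) / 0.6745

def bayes_shrink(subband, noise_sigma):
    """BayesShrink soft-thresholding of one bandpass subband (a sketch)."""
    sigma_y2 = np.mean(subband ** 2)                              # signal + noise variance
    sigma_x = np.sqrt(max(sigma_y2 - noise_sigma ** 2, 1e-12))    # signal std estimate
    thr = noise_sigma ** 2 / sigma_x                              # BayesShrink threshold
    return np.sign(subband) * np.maximum(np.abs(subband) - thr, 0.0)
```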

Analysis and Design of a Novel Active Soft-Switched Phase-Shifted Full Bridge Converter

This paper proposes an active soft-switching circuit for bridge converters, aiming to improve the power conversion efficiency. The proposed circuit achieves lossless switching for both the main and auxiliary switches without increasing the main switch current/voltage rating. A winding coupled to the primary of the power transformer ensures ZCS for the auxiliary switches during their turn-off. A 350 W, 100 kHz phase-shifted full bridge (PSFB) converter is built to validate the analysis and design. Theoretical loss calculations for the proposed circuit are presented, and the proposed circuit is compared with a passive soft-switched PSFB in terms of efficiency and loss in duty cycle.

Theoretical Investigation of the Instantaneous Folding Force during the First Fold Creation in a Square Column

In this paper, a theoretical formula is presented to predict the instantaneous folding force of the first fold creation in a square column under axial loading. Calculations are based on an analysis of the "Basic Folding Mechanism" introduced by Wierzbicki and Abramowicz. For this purpose, the sum of the dissipated energy rate under bending around horizontal and inclined hinge lines and the dissipated energy rate under extensional deformations is equated to the work rate of the external force on the structure. The final formula obtained in this research reasonably predicts the folding force of the first fold creation versus folding distance and folding angle, giving the instantaneous value instead of the average one. Finally, according to the calculated theoretical relation, the instantaneous folding force of the first fold creation in a square column was plotted versus folding distance and compared to experimental results, which showed a good correlation.
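
In symbolic form, the energy-rate balance described above can be written as (a restatement of the stated balance only; the closed-form dissipation terms derived in the paper are not reproduced here):

\[
P\,\dot{\delta} \;=\; \dot{E}_{b,h} + \dot{E}_{b,i} + \dot{E}_{m}
\qquad\Longrightarrow\qquad
P(\delta,\alpha) \;=\; \frac{\dot{E}_{b,h} + \dot{E}_{b,i} + \dot{E}_{m}}{\dot{\delta}},
\]

where \(P\) is the instantaneous folding force, \(\delta\) the folding distance, \(\alpha\) the folding angle, \(\dot{E}_{b,h}\) and \(\dot{E}_{b,i}\) the bending dissipation rates about the horizontal and inclined hinge lines, and \(\dot{E}_{m}\) the extensional (membrane) dissipation rate.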

A Temporal Synchronization Model for Heterogeneous Data in Distributed Systems

Multimedia distributed systems deal with heterogeneous data, such as texts, images, graphics, video and audio. The specification of temporal relations among different data types and distributed sources is an open research area. This paper proposes a fully distributed synchronization model to be used in multimedia systems. One original aspect of the model is that it avoids the use of a common reference (e.g. wall clock and shared memory). To achieve this, all possible multimedia temporal relations are specified according to their causal dependencies.

Hierarchies Based on the Number of Cooperating Systems of Finite Automata on Four-Dimensional Input Tapes

In theoretical computer science, the Turing machine has played a number of important roles in understanding and exploiting basic concepts and mechanisms in computing and information processing [20]; it is a simple mathematical model of computers [9]. Subsequently, M. Blum and C. Hewitt first proposed two-dimensional automata as a computational model of two-dimensional pattern processing and investigated their pattern recognition abilities in 1967 [7]. Since then, many researchers in this field have been investigating properties of automata on two- or three-dimensional tapes. On the other hand, the question of whether processing four-dimensional digital patterns is much more difficult than processing two- or three-dimensional ones is of great interest from both theoretical and practical standpoints. Thus, the study of four-dimensional automata as a computational model of four-dimensional pattern processing has been meaningful [8]-[19],[21]. This paper introduces a cooperating system of four-dimensional finite automata as one model of four-dimensional automata. A cooperating system of four-dimensional finite automata consists of a finite number of four-dimensional finite automata and a four-dimensional input tape on which these finite automata work independently (in parallel). Finite automata whose input heads scan the same cell of the input tape can communicate with each other; that is, every finite automaton is allowed to know the internal states of the other finite automata on the cell it is scanning at the moment. In this paper, we mainly investigate the accepting powers of cooperating systems of eight-way and seven-way four-dimensional finite automata. A seven-way four-dimensional finite automaton is an eight-way four-dimensional finite automaton whose input head can move east, west, south, north, up, down, or in the future, but not in the past, on a four-dimensional input tape.

Improved IDR(s) Method for Gaining Very Accurate Solutions

The IDR(s) method, based on an extended IDR theorem, was proposed by Sonneveld and van Gijzen. The original IDR(s) method has excellent properties compared with conventional iterative methods in terms of efficiency and memory requirements. The IDR(s) method, however, has the unexpected property that the relative residual 2-norm stagnates at a level of less than 10^-12. In this paper, an effective strategy for stagnation detection, stagnation avoidance that adaptively uses information on the parameter s, and improvement of the convergence rate of the IDR(s) method itself are proposed, in order to obtain highly accurate approximate solutions with the IDR(s) method. Through numerical experiments, the effectiveness of the adaptively tuned IDR(s) method is verified and demonstrated.
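
As a minimal sketch of the stagnation-detection idea only (the IDR(s) iteration and its adaptive choice of s are not reproduced, and the window size and factor are illustrative assumptions), one can monitor the relative residual history and flag stagnation when it stops improving over a sliding window:

```python
class StagnationMonitor:
    """Flag stagnation when the relative residual 2-norm fails to improve by
    `factor` over a sliding window of iterations."""
    def __init__(self, window=20, factor=0.9):
        self.window, self.factor = window, factor
        self.history = []

    def update(self, rel_residual):
        """Record the latest relative residual; return True if stagnation is detected."""
        self.history.append(rel_residual)
        if len(self.history) <= self.window:
            return False
        # stagnation: newest residual not at least `factor` times the one `window` steps back
        return self.history[-1] > self.factor * self.history[-1 - self.window]
```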

Minimizing Target Localization Error Using a Multi-Robot System and Particle Filters

In recent years the number of applications of multi-robot systems (MRS) has been growing in various areas. However, their design is often difficult in practice, and many algorithms are proposed on purely theoretical grounds, without considering the errors and noise present in real conditions, so they are not usable in a real environment. These errors are also clearly visible in the task of target localization, in which robots try to find and estimate the position of a target using their sensors. Target localization is possible with a single robot, but, as examined here, target finding and localization by a group of mobile robots can estimate the target position more accurately and faster. The accuracy of the target position estimate is achieved by the cooperation of the MRS and particle filtering. The advantage of using an MRS with particle filtering was tested on the task of fixed-target localization by a group of mobile robots.
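
A minimal sketch of one particle-filter measurement update for a static target is given below, assuming each robot contributes a noisy range measurement with a Gaussian likelihood and that systematic resampling is used; the cooperative motion and exploration strategy of the MRS are not modeled here.

```python
import numpy as np

def particle_filter_update(particles, weights, robot_positions, range_meas, meas_sigma=0.2):
    """One measurement update for a static target from several robots' range readings."""
    for robot, z in zip(robot_positions, range_meas):
        predicted = np.linalg.norm(particles - robot, axis=1)           # expected range per particle
        weights *= np.exp(-0.5 * ((z - predicted) / meas_sigma) ** 2)   # Gaussian likelihood
    weights += 1e-300
    weights /= weights.sum()
    # systematic resampling
    n = len(particles)
    positions = (np.arange(n) + np.random.uniform()) / n
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
    return particles[idx], np.full(n, 1.0 / n)

# Target estimate after an update: the (weighted) mean of the particles, e.g.
# estimate = particles.mean(axis=0)
```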

Scrum as the Method Supporting the Implementation of Knowledge Management in an Organization

Many companies have switched to project-oriented processes in recent years. This brings new possibilities and effectiveness not only in external processes connected with product delivery but also in internal processes. However, centralized project organization, based on the role of the project manager in the team, has proved insufficient in some cases. Agile methods of project organization try to solve this problem by bringing a new view of project organization, roles, processes and competences. Scrum is one of these methods; it builds on the principles of knowledge management to drive the project to effectiveness from all angles. Using this method to organize internal and delivery projects helps the organization create and share knowledge throughout the company. It also supports the formation of unique competences of individuals and project teams and drives innovation in the company.

Preliminary Analysis of Energy Efficiency in Data Center: Case Study

As the data-driven economy grows faster than ever and the demand for energy is spurred, we face unprecedented challenges in improving energy efficiency in data centers. Effectively maximizing energy efficiency, or minimizing the cooling energy demand, is becoming pervasive for data centers. This paper investigates the overall energy consumption and the energy efficiency of the cooling system for a data center in Finland as a case study. The power, cooling and energy consumption characteristics and the operating conditions of the facilities are examined and analyzed. Potential energy and cooling saving opportunities are identified, and further suggestions for improving the performance of the cooling system are put forward. Results are presented as a comprehensive evaluation of both the energy performance and good practices of energy-efficient cooling operation for the data center. Utilization of an energy recovery concept for the cooling system is proposed. The conclusion is that even though the analyzed data center demonstrated relatively high energy efficiency, based on its power usage effectiveness (PUE) value, there is still significant potential for energy saving in its cooling systems.
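
For reference, the power usage effectiveness metric mentioned above is simply the ratio of total facility energy to IT equipment energy; the numbers in the usage comment below are illustrative only, not the case-study data.

```python
def pue(it_energy_kwh, cooling_energy_kwh, other_energy_kwh):
    """Power Usage Effectiveness: total facility energy divided by IT energy."""
    total = it_energy_kwh + cooling_energy_kwh + other_energy_kwh
    return total / it_energy_kwh

# Illustrative numbers only: pue(1000.0, 300.0, 100.0) -> 1.4
```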

Enhancing Operational Effectiveness in the Norwegian Army through Simulation-Based Training

The Norwegian Military Academy (Army) has initiated a project with the main ambition to explore possible avenues to enhancing operational effectiveness through an increased use of simulation-based training and exercises. Within a cost/benefit framework, we discuss opportunities and limitations of vertical and horizontal integration of the existing tactical training system. Vertical integration implies expanding the existing training system to span the full range of training from tactical level (platoon, company) to command and staff level (battalion, brigade). Horizontal integration means including other domains than army tactics and staff procedures in the training, such as military ethics, foreign languages, leadership and decision making. We discuss each of the integration options with respect to purpose and content of training, "best practice" for organising and conducting simulation-based training, and suggest how to evaluate training procedures and measure learning outcomes. We conclude by giving guidelines towards further explorative work and possible implementation.

Dynamic Response of a Water Tower Composed of Interlocked Panels

Earthquakes produce some of the most violent loading situations that a structure can be subjected to, and if a structure fails under these loads then human life is inevitably put at risk. One of the most common ways in which a structure fails under seismic loading is at the connections between structural elements. The research presented in this paper investigates interlock systems as a novel method for building structures. The main objective of this experimental study was to determine the dynamic characteristics and the seismic behaviour of the proposed structures compared to conventional structural systems during seismic motions. Results of this study indicate that the interlock mechanism of the panels influences the behaviour of the lateral load-resisting systems of the structures during earthquakes, contributing to better structural flexibility and easier maintenance.

Primer Design with Specific PCR Product Using Particle Swarm Optimization

Before performing polymerase chain reactions (PCR), a feasible primer set is required, and many primer design methods have been proposed for designing such sets. However, the majority of these methods require a relatively long time to obtain an optimal solution, since large quantities of template DNA need to be analyzed, and the designed primer sets usually do not provide a specific PCR product. In recent years, evolutionary computation has been applied to PCR primer design and has yielded promising results. In this paper, a particle swarm optimization (PSO) algorithm is proposed to solve primer design problems while providing a specific product for PCR experiments. A test set of the gene CYP1A1, associated with a heightened lung cancer risk, was analyzed, and a comparison of accuracy and running time with the genetic algorithm (GA) and the memetic algorithm (MA) was performed. The comparison of results indicates that the proposed PSO method for primer design finds optimal or near-optimal primer sets and effective PCR products in a relatively short time.
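
The core PSO velocity/position update can be sketched generically as below; the swarm size, inertia and acceleration coefficients, and the continuous encoding are illustrative assumptions, and the primer-specific fitness terms (melting temperature, GC content, product specificity) would have to be supplied through the `fitness` callable, which is hypothetical here.

```python
import numpy as np

def pso_minimize(fitness, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bounds=(0.0, 1.0)):
    """Generic PSO sketch: velocity/position updates driven by personal and global bests."""
    lo, hi = bounds
    rng = np.random.default_rng(0)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
        x = np.clip(x + v, lo, hi)                                  # position update within bounds
        vals = np.array([fitness(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()
```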