Relationship between Communication Effectiveness and the Extent of Communication among Organizational Units

This contribution deals with the relationship between communication effectiveness and the extent of communication among organizational units. Knowledge of communication tools is necessary to facilitate communication between employees and to increase mutual understanding. Recent experience has shown that personal communication is critical for the smooth running of companies and cannot be fully replaced by any technical communication device. The paper presents the outcomes of research on the relationship between the extent of communication among organizational units and its effectiveness.

Decimation Filter Design Toolbox for Multi-Standard Wireless Transceivers using MATLAB

The demand for new telecommunication services requiring higher capacities, data rates and different operating modes has motivated the development of new-generation multi-standard wireless transceivers. A multi-standard design often involves extensive system-level analysis and architectural partitioning, typically requiring extensive calculations. In this research, a decimation filter design tool covering the wireless communication standards GSM, WCDMA, WLANa, WLANb, WLANg and WiMAX is developed in MATLAB® using the GUIDE environment for visual analysis. The user selects a wireless communication standard and obtains the corresponding multistage decimation filter implementation from the toolbox. The toolbox enables the user or design engineer to perform a quick design and analysis of decimation filters for multiple standards without carrying out the extensive calculations of the underlying methods.
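The toolbox itself is MATLAB-based and its design equations are not given in the abstract; as an illustration of the multistage decimation idea it automates, the following Python/SciPy sketch splits a large overall decimation factor into a cascade of smaller stages. The standards table, oversampling ratios, and sample rates are illustrative assumptions, not the toolbox's actual parameters.

```python
import numpy as np
from scipy.signal import decimate

# Assumed (illustrative) oversampling ratios per standard; the real
# toolbox derives these from each standard's specifications.
OSR = {"GSM": 64, "WCDMA": 16, "WLANa": 8}

def multistage_decimate(x, total_factor):
    """Decimate in stages of at most 8, as multistage designs
    typically split a large factor into a cascade of smaller ones."""
    factors = []
    f = total_factor
    while f > 1:
        for d in (8, 4, 2):
            if f % d == 0:
                factors.append(d)
                f //= d
                break
        else:
            factors.append(f)
            f = 1
    for d in factors:
        x = decimate(x, d, n=30, ftype="fir")  # 30-tap FIR per stage
    return x, factors

fs = 26e6                        # assumed modulator output rate
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 100e3 * t)
y, stages = multistage_decimate(x, OSR["GSM"])
```

Cascading two decimate-by-8 stages here replaces a single decimate-by-64 filter, which is the usual motivation for multistage designs: each stage's filter can be much shorter.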

A Characterized and Optimized Approach for End-to-End Delay Constrained QoS Routing

QoS routing aims to find paths between senders and receivers that satisfy the QoS requirements of the application while using network resources efficiently; the underlying routing algorithm must be able to find low-cost paths that satisfy the given QoS constraints. The problem of finding least-cost delay-constrained paths is known to be NP-complete, and several algorithms have been proposed to find near-optimal solutions. However, these heuristics either impose relationships among the link metrics to reduce the complexity of the problem, which may limit their general applicability, or are too costly in execution time to be applicable to large networks. In this paper, we analyze two algorithms: Characterized Delay Constrained Routing (CDCR) and Optimized Delay Constrained Routing (ODCR). The CDCR algorithm takes an approach to delay-constrained routing that captures the trade-off between cost minimization and the risk level regarding the delay constraint. ODCR uses an adaptive path weight function together with an additional constraint imposed on the path cost to restrict the search space, and hence finds a near-optimal solution in much less time.
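As context for what heuristics such as CDCR and ODCR approximate, the following Python sketch solves the delay-constrained least-cost routing problem exactly on a toy graph by enumerating simple paths. The edge costs and delays are invented for illustration; this exponential enumeration is precisely what makes the problem hard and what such heuristics avoid.

```python
import math

# Toy network: edge -> (cost, delay). Values are illustrative only.
EDGES = {
    ("s", "a"): (1, 5), ("a", "t"): (1, 5),
    ("s", "b"): (2, 2), ("b", "t"): (2, 2),
    ("s", "t"): (10, 1),
}

def neighbors(u):
    return [v for (x, v) in EDGES if x == u]

def dcr_exhaustive(src, dst, delay_bound):
    """Exact delay-constrained least-cost routing by enumerating
    simple paths (feasible only on small graphs)."""
    best = (math.inf, None)
    stack = [(src, [src], 0, 0)]
    while stack:
        u, path, cost, delay = stack.pop()
        if delay > delay_bound or cost >= best[0]:
            continue                       # prune infeasible/dominated
        if u == dst:
            best = (cost, path)
            continue
        for v in neighbors(u):
            if v not in path:              # keep paths simple
                c, d = EDGES[(u, v)]
                stack.append((v, path + [v], cost + c, delay + d))
    return best

cost, path = dcr_exhaustive("s", "t", delay_bound=6)
```

On this graph the cheapest path s-a-t violates the delay bound, so the search returns the cheapest feasible alternative, illustrating the cost/delay trade-off the paper discusses.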

Geostatistical Analysis and Mapping of Ground-Level Ozone in a Medium-Sized Urban Area

Ground-level tropospheric ozone is one of the air pollutants of greatest concern. It is mainly produced by photochemical processes involving nitrogen oxides and volatile organic compounds in the lower parts of the atmosphere. Ozone levels become particularly high in regions close to strong emissions of ozone precursors and during summer, when stagnant meteorological conditions with high insolation and high temperatures are common. In this work, results are shown of a study of urban ozone distribution patterns in the city of Badajoz, the largest and most industrialized city in the Extremadura region (southwest Spain). Fourteen sampling campaigns, at least one per month, were carried out with an automatic portable analyzer to measure ambient air ozone concentrations during periods selected for conditions favourable to ozone production. The measured ozone data were then analyzed using geostatistical techniques to evaluate the ozone distribution across the city. First, the exploratory analysis revealed that the data were normally distributed, which is a desirable property for the subsequent stages of the geostatistical study. Secondly, in the structural analysis, theoretical spherical models provided the best fit for all monthly experimental variograms. The parameters of these variograms (sill, range and nugget) revealed that the maximum distance of spatial dependence lies between 302 and 790 m and that air ozone concentration is not evenly distributed over short distances. Finally, predictive ozone maps were derived for all points of the experimental study area using geostatistical algorithms (kriging). High prediction accuracy was obtained in all cases, as cross-validation showed. Probability maps based on kriging interpolation and the kriging standard deviation also provided useful information for hazard assessment.
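The spherical variogram model used in the structural analysis has a standard closed form: it rises from the nugget and levels off at the sill once the lag reaches the range. A small Python sketch follows; the nugget, sill, and range values are illustrative, not the study's fitted parameters.

```python
import numpy as np

def spherical_variogram(h, nugget, sill, rng):
    """Spherical model: gamma(0) = 0, a nugget jump near the origin,
    then growth to the sill, reached exactly at the range."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
    return np.where(h >= rng, sill, np.where(h == 0, 0.0, g))

# Illustrative parameters; the range lies within the 302-790 m
# interval reported in the abstract.
gamma = spherical_variogram([0, 250, 500, 1000], nugget=0.1, sill=1.0, rng=500)
```

Beyond the range (here 500 m) the variogram is flat at the sill, meaning pairs of measurement points farther apart than that are spatially uncorrelated.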

The Auto-Tuning PID Controller for Interacting Water Level Process

This paper presents an approach to designing an auto-tuning PID controller for an interacting water level process using the integral step response. The Integral Step Response (ISR) method models a dynamic process easily, conveniently and very efficiently, which makes it advantageous for designing the auto-tuned PID controller. Our scheme uses the root locus technique to design the PID controller, and MATLAB is used for modeling and testing of the control system. The experimental results for the interacting water level process satisfactorily illustrate both the transient response and the steady-state response.
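The paper's ISR modeling and root-locus tuning are not reproduced here; as a minimal sketch of PID control of an interacting two-tank level process, the following Python simulation uses a linearized toy model with hand-picked gains. All parameters (tank coefficients, gains, setpoint) are assumptions for illustration, and the control signal is not clipped as a physical pump would require.

```python
def simulate_pid(kp, ki, kd, sp=1.0, dt=0.05, steps=4000):
    """Discrete PID on a linearized interacting two-tank model:
    u is the inflow to tank 1, h2 the controlled level of tank 2."""
    h1 = h2 = 0.0
    integral = 0.0
    prev_e = sp - h2
    for _ in range(steps):
        e = sp - h2
        integral += e * dt
        deriv = (e - prev_e) / dt
        prev_e = e
        u = kp * e + ki * integral + kd * deriv
        # "Interacting" coupling: inter-tank flow depends on both levels
        q12 = 0.5 * (h1 - h2)
        h1 += dt * (u - q12)                 # forward-Euler integration
        h2 += dt * (q12 - 0.25 * h2)
    return h2

level = simulate_pid(kp=1.0, ki=0.2, kd=0.5)
```

The integral term drives the steady-state error to zero, so after the transient the level settles at the setpoint, which is the behavior an auto-tuned PID loop is expected to deliver.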

An Augmented Beam-search Based Algorithm for the Strip Packing Problem

In this paper, the use of beam search and look-ahead strategies for solving the strip packing problem (SPP) is investigated. Given a strip of fixed width W, unlimited length L, and a set of n circular pieces of known radii, the objective is to determine the minimum length of the initial strip that packs all the pieces. An augmented algorithm that combines beam search with a look-ahead strategy is proposed. The look-ahead is used to evaluate the nodes at each level of the search tree; the best nodes are then retained for branching. The computational investigation shows that the proposed augmented algorithm is able to improve the best known solutions in the literature on most of the instances used.
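The beam-search skeleton can be sketched generically: expand all retained nodes level by level, score the children with an evaluation function (in the paper, a look-ahead packing evaluation), and keep only the best few. The Python version below runs on a toy layered choice problem standing in for the SPP tree; the expansion and scoring functions are illustrative, not the paper's node evaluation.

```python
def beam_search(root, expand, lookahead_value, width):
    """Generic beam search: at each tree level, expand every retained
    node, rank children by the evaluation function, keep `width` best."""
    beam = [root]
    best = None
    while beam:
        children = [c for node in beam for c in expand(node)]
        if not children:
            break
        children.sort(key=lookahead_value)   # look-ahead scoring
        beam = children[:width]              # prune to beam width
        best = beam[0]
    return best

# Toy problem: pick one option per level, minimizing total cost
# (a stand-in for choosing which piece to place next in the strip).
LEVELS = [[3, 1, 2], [2, 5], [4, 1]]

def expand(state):
    depth, cost = state
    if depth == len(LEVELS):
        return []
    return [(depth + 1, cost + c) for c in LEVELS[depth]]

result = beam_search((0, 0), expand, lookahead_value=lambda s: s[1], width=2)
```

With width 2 the search keeps two partial solutions alive per level, so it can recover from a locally bad first choice, unlike a purely greedy heuristic, while exploring far fewer nodes than exhaustive branching.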

Contingent Pay and Experience with its use by Organizations of the Czech Republic Operating in the Field of Environmental Protection

Apart from basic wages or salary, employee benefits and intangible elements, one part of an employee's total reward is so-called contingent (variable) pay. Contingent pay is linked to the performance, contribution, competency or skills of individual employees, to team or company-wide performance, or to a combination of several of these. The main aim of this article is to define contingent pay on the basis of available information and to describe the reasons for its implementation and the arguments for and against this type of remuneration, but also to report both the extent and level of its utilization by organizations of the Czech Republic operating in the field of environmental protection and their practical experience with this type of remuneration.

Deflection Control in Composite Building by Using Belt Truss and Outriggers Systems

The design of a high-rise building is more often dictated by its serviceability than by its strength. Structural engineers are always striving to overcome the challenges of controlling lateral deflection and storey drifts as well as the self-weight of the structure imposed on the foundation. One of the most effective techniques is the use of an outrigger and belt truss system in composite structures, which can astutely solve both issues in high-rise construction. This paper investigates deflection control through effective utilisation of a belt truss and outrigger system on a 60-storey composite building subjected to wind loads. A three-dimensional finite element analysis is performed with one, two and three outrigger levels. The reductions in lateral deflection are 34%, 42% and 51%, respectively, compared to a model without any outrigger system. There is also an appreciable decline in storey drifts with the introduction of these stiffer arrangements.

Association of G-174C Polymorphism of the Interleukin-6 Gene Promoter with Obesity in Iranian Population

Expression and secretion of inflammation markers are disturbed in obesity. Interleukin-6 reduces body fat mass, and the common G-174C polymorphism in the promoter of the IL-6 gene has been reported to affect its transcriptional regulation. The objective was to investigate the association of the common G-174C polymorphism with obesity in the Iranian population. This cross-sectional association study included 242 individuals (110 men and 132 women). Serum IL-6 levels, C-reactive protein, fasting blood glucose and blood lipid profiles were measured, and BMI and WHR were calculated. Genotyping was carried out by PCR and RFLP. The frequencies of the G and C alleles were 64.5% and 35.5%, respectively. The G-174C polymorphism was not associated with BMI or WHR. However, among obese individuals, fasting blood glucose was significantly higher in carriers of the C allele than in non-carriers. The IL-6 G-174C polymorphism is not a risk factor for obesity in the Iranian population.

A Blind Digital Watermark in Hadamard Domain

A new blind gray-level watermarking scheme is described. In the proposed method, the host image is first divided into 4*4 non-overlapping blocks. For each block, the first two AC coefficients of its Hadamard transform are then estimated using the DC coefficients of its neighboring blocks. A gray-level watermark is then added to the estimated values. Since embedding the watermark does not change the DC coefficients, the watermark can be extracted by re-estimating the AC coefficients and comparing them with their actual values. Several experiments were carried out, and the results suggest the robustness of the proposed algorithm.
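The key property the scheme relies on is that perturbing AC coefficients in the Hadamard domain leaves the DC coefficient, and hence the block mean, untouched. The Python fragment below demonstrates this on a single 4*4 block; the neighbor-based AC estimation step of the actual scheme is omitted, and the embedding strength is an arbitrary illustrative value.

```python
import numpy as np
from scipy.linalg import hadamard

H = hadamard(4)                      # 4x4 Hadamard matrix, H @ H = 4*I

def hadamard2d(block):
    return H @ block @ H / 4.0       # 2-D Hadamard transform

def inv_hadamard2d(coeffs):
    return H @ coeffs @ H / 4.0      # same operation inverts it

block = np.arange(16, dtype=float).reshape(4, 4)
C = hadamard2d(block)
C[0, 1] += 2.0                       # embed in the first AC coefficients
C[1, 0] += 2.0                       # (strength 2.0 is illustrative)
watermarked = inv_hadamard2d(C)
```

Because the DC basis function is the all-ones pattern and every AC pattern sums to zero, the watermarked block has exactly the same mean as the original, which is what allows blind extraction from the unchanged DC values.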

Experimental Investigation and Sensitivity Analysis of the Effects of Fracture Parameters on the Conductance Properties of Laterite

This paper investigates the effects of fracture parameters such as depth, length, width, angle and number of fractures on the conductance properties of laterite, using the DUK-2B digital electrical measurement system combined with a method of simulating the fractures. The results show that changes in the fracture parameters affect the conductance properties of laterite. The conductivity of laterite clearly decreases as the depth, length, width, angle or number of fractures is gradually increased. When the depth of a fracture exceeds half the thickness of the soil body, the conductivity of laterite shows an evidently non-linear decreasing pattern and the amplitude of the decrease tends to grow. The length of a fracture has less effect on the conductivity than its depth. Once the width of a fracture reaches a certain value, the conductivity becomes less sensitive to further changes in width and remains at a stable level. When the angle of a fracture is less than 45°, the decrease in conductivity becomes more pronounced as the angle increases; beyond 45°, the change in conductivity with increasing angle is relatively gentle. An increasing number of fractures causes the other fracture parameters to have a greater impact on the conductivity. With moisture content and temperature held constant, the depth and angle of fractures are the major factors affecting the conductivity of laterite soil, while quantity, length and width are minor influencing factors. The order of sensitivity of the fracture parameters affecting the conductivity of laterite soil is: depth > angle > quantity > length > width.

Hybridized Technique to Analyze Work-Stress-Related Data via the StressCafé

This paper presents an approach that hybridizes two or more artificial intelligence (AI) techniques to fuzzify the work-stress level ranking and categorize the rating accordingly. A hybrid of two or more techniques has been considered here because combining different techniques may neutralize each other's weaknesses, generating a superior hybrid solution. Recent research has shown the need for more valid and reliable tools for assessing work stress, so artificial intelligence techniques have been applied in this instance to provide a solution to a psychological application. The paper also gives an overview of a novel and autonomous interactive model for analysing work stress that has been developed using multi-agent systems, and describes the establishment of the intelligent multi-agent decision analyser (IMADA), which uses a hybridized technique of neural networks and fuzzy logic within the multi-agent-based framework.

Environmental Efficiency of Electric Power Industry of the United States: A Data Envelopment Analysis Approach

The importance of the environmental efficiency of the electric power industry stems from high demand for energy combined with global warming concerns. It is especially essential for the world's largest economies, such as that of the United States. The paper introduces a Data Envelopment Analysis (DEA) model of environmental efficiency using indicators of fossil fuel utilization, emissions rate, and electric power losses. DEA is advantageous in this situation over other approaches due to its nonparametric nature. The paper analyzes data for the period 1990-2006 by comparing actual yearly levels in each dimension with the best values of the partial indicators over the period. Positive factors of efficiency include the declining trend in emissions rates starting in 2000 and in electric power losses starting in 2004, together with the increasing trend in fuel utilization starting in 1999. As a result, the dynamics of environmental efficiency have been positive since 2002. The main concern is the decline in fossil fuel utilization in 2006; this negative change should be reversed to comply with ecological and economic requirements.
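As an illustration of the DEA machinery (the paper's indicator data and exact model formulation are not reproduced in the abstract), the following Python/SciPy sketch solves a minimal input-oriented CCR model on invented toy data with one input and one output per decision-making unit.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, j0):
    """Input-oriented CCR DEA efficiency of unit j0, with inputs X
    (m x n) and outputs Y (s x n). Returns theta in (0, 1]."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]             # minimize theta
    # X @ lam - theta * x0 <= 0  (peer inputs within scaled own input)
    A1 = np.c_[-X[:, [j0]], X]
    b1 = np.zeros(m)
    # -Y @ lam <= -y0            (peer outputs at least own output)
    A2 = np.c_[np.zeros((s, 1)), -Y]
    b2 = -Y[:, j0]
    res = linprog(c, A_ub=np.r_[A1, A2], b_ub=np.r_[b1, b2],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

# Toy data: 3 units, 1 input, 1 output (illustrative, not the paper's)
X = np.array([[2.0, 4.0, 4.0]])
Y = np.array([[2.0, 4.0, 2.0]])
eff = [dea_ccr_efficiency(X, Y, j) for j in range(3)]
```

Units on the best-practice frontier score 1.0; an inefficient unit's score says how far its inputs could be scaled down while a combination of peers still matched its outputs, which is the nonparametric comparison the paper exploits across years.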

Effect of Electric Field Amplitude on Electrical Fatigue Behavior of Lead Zirconate Titanate Ceramic

The fatigue behavior of lead zirconate titanate (PZT) ceramics under bipolar electrical loads of different amplitudes has been investigated. Fatigue behavior is represented by the change in the hysteresis loops and the remnant polarization. Three electrical load amplitudes (1.00, 1.25 and 1.50 kV/mm) were applied in this experiment. It was found that the remnant polarization decreased significantly with the number of loading cycles. The degree of fatigue degradation depends on the amplitude of the electric field: the higher the amplitude, the greater the fatigue degradation.

Signal-Driven Sampling and Filtering: A Promising Approach for Time-Varying Signal Processing

Mobile systems are powered by batteries, and reducing system power consumption is key to increasing their autonomy. Such systems mostly deal with time-varying signals. We therefore aim to achieve power efficiency by smartly adapting the system's processing activity to the local characteristics of the input signal. This is done by completely rethinking the processing chain and adopting signal-driven sampling and processing. In this context, a signal-driven filtering technique based on level-crossing sampling is devised. It adapts the sampling frequency and the filter order by analysing the local variations of the input signal, thereby correlating the processing activity with the signal variations. This leads to a drastic computational gain for the proposed technique compared to the classical one.
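A minimal Python sketch of the level-crossing sampling principle the technique builds on follows: a sample is taken only when the signal crosses one of a set of predefined levels, so sampling activity tracks the signal's local variations. The levels and test signal are illustrative, and the adaptive filter-order selection of the full technique is not shown.

```python
import numpy as np

def level_crossing_sample(t, x, levels):
    """Signal-driven (level-crossing) sampling: emit a sample only
    when the signal crosses one of the predefined levels."""
    samples = []
    for i in range(1, len(x)):
        for L in levels:
            if (x[i - 1] - L) * (x[i] - L) < 0:   # sign change => crossing
                # linear interpolation for the crossing instant
                frac = (L - x[i - 1]) / (x[i] - x[i - 1])
                samples.append((t[i - 1] + frac * (t[i] - t[i - 1]), L))
    return samples

t = np.linspace(0, 1, 1001)
x = np.sin(2 * np.pi * t)            # one sine period as a test signal
s = level_crossing_sample(t, x, levels=[-0.5, 0.0, 0.5])
```

One sine period against three levels yields only a handful of non-uniformly spaced samples, dense where the signal moves quickly and absent where it is flat, which is the source of the computational savings.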

Virtual Laboratory for Learning Biology – A Preliminary Investigation

This study conducts a preliminary investigation to determine the topic to be focused on in developing a Virtual Laboratory for Biology (VLab-Bio). The questionnaire respondents were form five students (equivalent to A-Level) and biology teachers. The time and economic resources needed to set up and run scientific laboratories can be saved by adopting virtual laboratories as an educational tool, and it is hoped that the proposed virtual laboratory will help students learn the abstract concepts in biology. Findings show that the difficult topic chosen is Cell Division, and the learning objective to be focused on in developing the virtual lab is "Describe the application of knowledge on mitosis in cloning".

An Integrated Logistics Model of Spare Parts Maintenance Planning within the Aviation Industry

Avoidable unscheduled maintenance events and unnecessary spare parts deliveries are mostly caused by an incorrect choice of the underlying maintenance strategy. To supply spare parts for an airline's aircraft faster and more efficiently, we examine options for improving the underlying logistics network integrated in an existing aviation industry network. This paper presents a dynamic prediction model as decision support for maintenance method selection, considering the requirements of an entire flight network. The objective is to guarantee a high supply of spare parts through an optimal interaction of the various network levels and thus to reduce unscheduled maintenance events and minimize total costs. By using a prognostics-based preventive maintenance strategy, unscheduled component failures are avoided, increasing the availability and reliability of the entire system. The model is intended for use in an aviation company that utilizes a structured planning process based on collected component failure data.

Assessment of Sediment Quality according to Heavy Metal Status in the West Port of Malaysia

Eight heavy metals (Cu, Cr, Zn, Hg, Pb, Cd, Ni and As) were analyzed in sediment samples collected in the dry and wet seasons from November 2009 to October 2010 in the West Port of Peninsular Malaysia. The heavy metal concentrations (mg/kg dry weight) ranged from 23.4 to 98.3 for Zn, 22.3 to 80 for Pb, 7.4 to 27.6 for Cu, 0.244 to 3.53 for Cd, 7.2 to 22.2 for Ni, 20.2 to 162 for As, 0.11 to 0.409 for Hg and 11.5 to 61.5 for Cr. Metal concentrations in the dry season were higher than in the rainy season, except for copper and chromium. Analysis of variance with the Statistical Analysis System (SAS) shows that the mean concentrations of the metals in the two seasons (α = 0.05) are not significantly different, indicating that the metals were held firmly in the sediment matrix. There are, however, significant differences between the control station and the other stations. According to the Interim Sediment Quality Guidelines (ISQG), the sediments are moderately polluted with respect to the metal concentrations, except for arsenic, which shows the highest level of pollution.

The Applications of Quantum Mechanics Simulation for Solvent Selection in Chemicals Separation

Quantum mechanics simulation was applied to calculate the interaction force between two molecules at the atomic level. A simple extractive distillation system is ternary, consisting of two close-boiling components (A, the lower-boiling component, and B, the higher-boiling component) and a solvent (S). The quantum mechanics simulation was used to calculate the intermolecular (interaction) forces between the close-boiling components and the solvents, namely between A-S and B-S. The requirement for a promising extractive distillation solvent is that the solvent (S) form a stronger intermolecular force with one component than with the other (A or B). In this study, aromatic-aromatic, aromatic-cycloparaffin, and paraffin-diolefin systems were selected as demonstrations for solvent selection. The study defines a new screening term called the relative interaction force, calculated from the quantum mechanics simulation. The results showed that the relative interaction force is in good agreement with literature data (relative volatilities from experiment); the reasons are discussed. Finally, this study suggests that quantum mechanics results can improve relative volatility estimation for solvent screening, reducing the time and money consumed.

An Evaluation of Land Use Control in Hokkaido, Japan

This study evaluates land use in Hokkaido, the northernmost and largest prefecture by surface area in Japan, focusing on two points: the rivalry between kinds of land use, such as urban land and agricultural and forestry land, in various cities and their surrounding areas; and the possibilities for forestry biomass in areas other than those. It identifies which areas require examination of the nature of land use control and guidance by conducting land use analysis at the district level using GIS (Geographic Information Systems). The results of the analysis demonstrate that it is essential to divide the whole of Hokkaido into two areas, those within delineated city planning areas and those outside them, and to evaluate land use control in each. Within delineated areas, particularly urban areas, it is essential to re-examine land use from the point of view of compact cities or smart cities, along with evaluating land use control with a focus on the rivalry between kinds of land use such as urban land and agricultural and forestry land. Outside delineated areas, it is desirable to build a specific community recycling range based on forest biomass utilization, evaluating land use control with respect to the possibilities for forest biomass and focusing particularly on forests within and outside city planning areas.