Regular Data Broadcasting Plan with Grouping in Wireless Mobile Environment

The broadcast problem, including the plan design, is considered. The data are inserted and numbered in a predefined order into relations of customized size. The server's ability to create a full, Regular Broadcast Plan (RBP) with single and multiple channels after some data transformations is examined. The Regular Geometric Algorithm (RGA) prepares an RBP and enables users to catch their items while avoiding energy waste on their devices. Moreover, the Grouping Dimensioning Algorithm (GDA), based on integrated relations, can guarantee the discrimination of services with a minimum number of channels. This last property, together with self-monitoring and self-organizing, can be offered by today's servers, which also provide channel availability and lower energy consumption by using a smaller number of channels. Simulation results are provided.
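As an illustration of what regularity buys the client, the sketch below (our simplified example, not the paper's RGA) builds a flat, single-channel broadcast plan in which every item recurs at a fixed spacing, so a client tuning in at any slot can bound its waiting time and sleep until its item is due:

```python
# Minimal sketch of a regular broadcast plan on a single channel.
# A "regular" plan repeats each item at fixed, equal spacing, so a
# client can predict exactly when its item will next be broadcast.

def regular_broadcast_plan(items, cycles=2):
    """Flat round-robin plan: each item appears once per cycle,
    so its inter-arrival spacing is constant (= len(items))."""
    return [item for _ in range(cycles) for item in items]

def wait_slots(plan, item, start):
    """Slots a client tuned in at `start` waits before seeing `item`."""
    for offset in range(len(plan) - start):
        if plan[start + offset] == item:
            return offset
    return None

plan = regular_broadcast_plan(["a", "b", "c"])
# plan == ["a", "b", "c", "a", "b", "c"]
```

Because the spacing is constant, the worst-case wait for any item is one full cycle, which is what lets a device power down its receiver between slots.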

The Statistical Properties of Filtered Signals

In this paper, the statistical properties of filtered or convolved signals are considered by deriving the resulting density functions as well as the exact mean and variance expressions, given prior knowledge of the statistics of the individual signals in the filtering or convolution process. It is shown that the density function after linear convolution is a mixture density, where the number of density components is equal to the number of observations of the shortest signal. For circular convolution, the observed samples are characterized by a single density function, which is a sum of products.
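The exact mean and variance of a linearly convolved sample can be checked numerically. The sketch below (our illustration, not the paper's derivation) enumerates a two-point input distribution exhaustively and compares the empirical mean and variance of one output sample against the closed-form expressions for i.i.d. inputs: E[y[n]] = m_x Σ h[j] and Var[y[n]] = v_x Σ h[j]² over the taps overlapping index n:

```python
import itertools

# Numerical check of the exact mean/variance of a linearly convolved
# sample for i.i.d. inputs: for y[n] = sum_j h[j] x[n-j],
#   E[y[n]]   = m_x * sum of the overlapping h[j]
#   Var[y[n]] = v_x * sum of the overlapping h[j]^2

def linear_convolve(x, h):
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

def output_mean(h, n, len_x, mx):
    return mx * sum(hj for j, hj in enumerate(h) if 0 <= n - j < len_x)

def output_var(h, n, len_x, vx):
    return vx * sum(hj**2 for j, hj in enumerate(h) if 0 <= n - j < len_x)

# Exhaustive two-point input: x[i] uniform on {0, 1} (mean 0.5, var 0.25).
h, len_x, n = [1.0, -2.0, 0.5], 3, 2
outs = [linear_convolve(list(x), h)[n]
        for x in itertools.product([0.0, 1.0], repeat=len_x)]
emp_mean = sum(outs) / len(outs)
emp_var = sum((o - emp_mean) ** 2 for o in outs) / len(outs)
```

The variance identity holds here because the input samples are independent; for dependent inputs, cross-covariance terms would appear.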

Improved Hill Climbing and Simulated Annealing Algorithms for Size Optimization of Trusses

The truss optimization problem has been extensively studied during the past 30 years, and many different methods have been proposed for it. Even though most of these methods assume that the design variables are continuously valued, in reality the design variables of optimization problems such as cross-sectional areas are discretely valued. In this paper, an improved hill climbing and an improved simulated annealing algorithm are proposed to solve the truss optimization problem with discrete values for cross-sectional areas. The obtained results have been compared to other methods in the literature, and the comparison shows that the proposed methods can be used more efficiently than previously proposed methods.
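A basic simulated annealing loop over a discrete section catalogue can be sketched as follows. This is our toy illustration, not the paper's improved algorithm: the "truss" is reduced to a weight proxy with a penalty standing in for stress/displacement constraints, and the area catalogue and member demands are hypothetical:

```python
import math, random

# Toy discrete design space: each member picks an area from a catalogue.
AREAS = [1.0, 2.0, 3.5, 5.0, 7.5]   # hypothetical section catalogue
LOADS = [4.0, 6.0, 3.0]             # hypothetical member demands

def cost(design):
    """Weight proxy plus a large penalty when a member is under-sized
    (a stand-in for the constraints of a real truss model)."""
    weight = sum(design)
    penalty = sum(100.0 for a, q in zip(design, LOADS) if a < q)
    return weight + penalty

def simulated_annealing(seed=0, t0=10.0, cooling=0.95, steps=2000):
    rng = random.Random(seed)
    design = [rng.choice(AREAS) for _ in LOADS]
    best = design[:]
    t = t0
    for _ in range(steps):
        cand = design[:]
        cand[rng.randrange(len(cand))] = rng.choice(AREAS)  # discrete move
        delta = cost(cand) - cost(design)
        # Accept improvements always; accept worse moves with a
        # temperature-dependent probability (Metropolis criterion).
        if delta <= 0 or rng.random() < math.exp(-delta / max(t, 1e-12)):
            design = cand
            if cost(design) < cost(best):
                best = design[:]
        t *= cooling
    return best

best = simulated_annealing()
```

The key point for the discrete setting is that the neighbourhood move resamples a catalogue entry rather than perturbing a continuous value, so every visited design is directly manufacturable.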

Deniable Authentication Protocol Resisting Man-in-the-Middle Attack

Deniable authentication is a protocol which not only enables a receiver to identify the source of a received message but also prevents a third party from identifying that source. The protocol proposed in this paper makes use of bilinear pairings over elliptic curves, as well as the Diffie-Hellman key exchange protocol. Besides the security properties shared with previous authentication protocols, the proposed protocol provides the same level of security with smaller public key sizes.
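The deniability property can be illustrated with a plain Diffie-Hellman construction (a toy stand-in, not the paper's pairing-based scheme, and with textbook-size parameters that offer no real security): because the MAC key is a shared secret, the receiver could have produced the authentication tag itself, so it cannot convince a third party of the message's origin:

```python
import hashlib, hmac

# Toy DH-based deniable authentication: sender and receiver derive the
# same MAC key, so a tag proves origin only to the receiver.
P = 2**127 - 1   # a Mersenne prime; real systems use vetted groups/curves
G = 3            # illustrative base

a, b = 6, 15                       # private keys (toy values)
A, B = pow(G, a, P), pow(G, b, P)  # public keys

k_sender = pow(B, a, P)            # sender's view of the shared secret
k_receiver = pow(A, b, P)          # receiver's view: identical

def tag(key_int, message):
    key = key_int.to_bytes(16, "big")
    return hmac.new(key, message, hashlib.sha256).hexdigest()

msg = b"launch at dawn"
sender_tag = tag(k_sender, msg)
receiver_forgery = tag(k_receiver, msg)  # indistinguishable from sender's
```

Since `receiver_forgery` equals `sender_tag`, a transcript shown to a judge is worthless as proof of authorship, which is exactly the deniability goal.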

Geochemistry of Cenozoic Basaltic Rocks around Liuhe National Geopark, Jiangsu Province, Eastern China: Petrogenesis and Mantle Source

Cenozoic basalts found in Jiangsu province of eastern China include tholeiites and alkali basalts. The present paper analyzes the major elements, trace elements, and rare earth elements of these Cenozoic basalts and, combined with the Sr-Nd isotopic compositions reported by Chen et al. (1990)[1] in the literature, discusses the petrogenesis of these basalts and the geochemical characteristics of the source mantle. Based on major and trace elements and the fractional crystallization model established by Brooks and Nielsen (1982)[2], we suggest that the basaltic magma experienced olivine + clinopyroxene fractionation during its evolution. The chemical compositions of basaltic rocks from Jiangsu province indicate that these basalts may belong to the same magmatic system. Spidergrams reveal that Cenozoic basalts from Jiangsu province have geochemical characteristics similar to those of ocean island basalts (OIB). The slight positive Nb and Ti anomalies found in the basaltic rocks of this study suggest the presence of Ti-bearing minerals in the mantle source; these Ti-bearing minerals contributed to the basaltic magma during partial melting, indicating that a metasomatic event might have occurred before the partial melting. Based on the Sr vs. Nd isotopic ratio plots, we suggest that the Jiangsu basalts may be derived from partial melting of a mantle source representing two-end-member mixing of DMM and EM-I. Some Jiangsu basaltic magma may be derived from partial melting of EM-I heated by the upwelling asthenospheric mantle or by asthenospheric diapirism.

Roundabout Optimal Entry and Circulating Flow Induced by Road Hump

Roundabouts work on the principle of circulation and entry flows, where the maximum entry flow rates depend largely on the circulating flow, bearing in mind that entry flows must give way to circulating flows. Where an existing roundabout has a road hump installed at the entry arm, it can be hypothesized that the kinematics of vehicles may prevent the entry arm from achieving optimum performance. Road humps are traffic calming devices placed across the road width solely as a speed reduction mechanism. They are the preferred traffic calming option in Malaysia and are often used on single and dual carriageway local routes. The speed limit on local routes is 30 mph (50 km/h). Road humps in their various forms achieved the biggest mean speed reduction (based on a mean speed before traffic calming of 30 mph) of up to 10 mph (16 km/h), according to the UK Department of Transport. The underlying aim of reduced speed should be to achieve a 'safe' distribution of speeds which reflects the function of the road and the impacts on the local community. Constraining the safe distribution of speeds may lead to poor driver timing and delayed reflex reactions that can cause accidents. Previous studies on road hump impact have focused mainly on speed reduction, traffic volume, noise and vibrations, discomfort, and delay from the use of road humps. This paper examines the optimal entry and circulating flow as influenced by road humps. Results show that roundabout entry and circulating flows perform better where there is no road hump at the entrance.
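The give-way principle can be expressed with a generic linear entry-capacity relation. This is an illustrative form with hypothetical coefficients, not the paper's calibrated model: entry capacity falls as circulating flow rises, and a hump-induced speed reduction is crudely modelled as a capacity reduction factor:

```python
# Generic linear entry-capacity relation for a roundabout arm:
# capacity shrinks as circulating flow grows, because entering
# vehicles must give way. Coefficients below are hypothetical.

def entry_capacity(circulating_pcu_h, a=1200.0, b=0.5):
    """a = free-entry capacity (pcu/h), b = sensitivity to circulating flow."""
    return max(0.0, a - b * circulating_pcu_h)

def entry_capacity_with_hump(circulating_pcu_h, reduction=0.85):
    """Hypothesized hump effect: slower approach speeds cut usable capacity."""
    return reduction * entry_capacity(circulating_pcu_h)
```

Under this sketch the hump-equipped arm is dominated by the hump-free arm at every circulating flow level, which is the qualitative result the abstract reports.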

Evolution of Quality Function Deployment (QFD) via Fuzzy Concepts and Neural Networks

Quality Function Deployment (QFD) is an elaborated, multi-step planning method for delivering commodities, services, and processes to customers, both external and internal to an organization. It is a way to translate between the diverse customer languages expressing demands (Voice of the Customer) and the organization's languages expressing results that satisfy those demands. The policy is to establish one or more matrices that inter-relate producer and consumer reciprocal expectations. Due to its visual appearance, it is called the "House of Quality" (HOQ). In this paper, we cast the HOQ as a multi-attribute decision making (MADM) problem and, through a proposed MADM method, rank the technical specifications. We then compute the satisfaction degree of the customer requirements, applying fuzzy set theory to handle vagueness and uncertainty in the decision making. This approach employs a supervised neural network (perceptron) for solving the MADM problem.

Robust Parameter and Scale Factor Estimation in Nonstationary and Impulsive Noise Environment

The problem of FIR system parameter estimation is considered in this paper. A new robust recursive algorithm for the simultaneous estimation of the parameters and the scale factor of the prediction residuals in a non-stationary environment corrupted by impulsive noise is proposed. The performance of the derived algorithm has been tested by simulations.
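The general idea of joint parameter and scale estimation can be sketched as below. This is our minimal illustration, not the paper's algorithm: a first-order FIR gain is estimated recursively, the prediction residual is clipped by a Huber influence function, and the residual scale factor is itself tracked recursively so the clipping threshold adapts to the noise level:

```python
# Robust recursive estimation of theta in d[n] = theta * x[n] under
# impulsive noise: Huber-clipped residual, recursively estimated scale.

def huber(e, k=1.5):
    """Huber influence function: linear near zero, clipped beyond k."""
    return max(-k, min(k, e))

def robust_estimate(xs, ds, mu=0.1):
    theta, scale = 0.0, 1.0
    for x, d in zip(xs, ds):
        e = d - theta * x                      # prediction residual
        scale = 0.95 * scale + 0.05 * abs(e)   # recursive scale factor
        psi = huber(e / max(scale, 1e-9)) * scale
        theta += mu * psi * x                  # robustified update
    return theta

xs = [1.0] * 300
ds = [2.0 * x for x in xs]                     # true parameter: 2.0
for i in range(0, 300, 50):
    ds[i] = 100.0                              # impulsive outliers
theta = robust_estimate(xs, ds)
```

Because an outlier's influence is bounded by the clipped residual times the current scale, each impulse nudges the estimate only slightly, and the clean samples in between pull it back toward the true value.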

Examining Corporate Tax Evaders: Evidence from the Finalized Audit Cases

This paper aims to (1) analyze the profiles of transgressors (detected evaders); (2) examine the reason(s) that triggered a tax audit, the causes of tax evasion, the audit timeframe, and the tax penalty charged; and (3) assess whether tax auditors followed the guidelines stated in the 'Tax Audit Framework' when conducting tax audits. In 2011, the Inland Revenue Board Malaysia (IRBM) audited and finalized 557 company cases. With official permission, data on all 557 cases were obtained from the IRBM. Of these, a total of 421 cases with complete information were analyzed. About 58.1% were small and medium corporations, and the largest share came from the construction industry (32.8%). The selection for tax audit was based on risk analysis (66.8%), information from third parties (11.1%), and firms with low profitability or fluctuating profit patterns (7.8%). The three persistent causes of tax evasion by firms were over-claimed expenses (46.8%), fraudulent reporting of income (38.5%), and overstated purchases (10.5%). These findings are consistent with past literature. Results showed that tax auditors took six to 18 months to close audit cases. More than half of the tax evaders were fined 45% of the additional tax raised during the audit for the first offence. The study found that tax auditors did follow the guidelines in the 'Tax Audit Framework' in audit selection, settlement, and penalty imposition.

Enhanced Efficacy of Kinetic Power Transform for High-Speed Wind Field

The three-time-scale plant model of a wind power generator, including a wind turbine, a flexible vertical shaft, a Variable Inertia Flywheel (VIF) module, an Active Magnetic Bearing (AMB) unit, and the applied wind sequence, is constructed. In order to keep the wind power generator operable as the spindle speed exceeds its rated speed, the VIF is equipped so that the spindle speed can be appropriately slowed down once a stronger wind field is exerted. To prevent potential damage from collision of the shaft against conventional bearings, the AMB unit is proposed to regulate the shaft position deviation. By the singular perturbation order-reduction technique, a lower-order plant model can be established for the synthesis of the feedback controller. Two major system parameter uncertainties, an additive uncertainty and a multiplicative uncertainty, are constituted by the wind turbine and the VIF, respectively. A Frequency Shaping Sliding Mode Control (FSSMC) loop is proposed to account for these uncertainties and suppress the unmodeled higher-order plant dynamics. Finally, the efficacy of the FSSMC is verified by intensive computer and experimental simulations for regulation of the shaft position deviation and counter-balancing of unpredictable wind disturbances.

Analysis of FWM Penalties in DWDM Systems Based on G.652, G.653, and G.655 Optical Fibers

This paper presents an investigation of the power penalties imposed by four-wave mixing (FWM) on G.652 (Single-Mode Fiber - SMF), G.653 (Dispersion-Shifted Fiber - DSF), and G.655 (Non-Zero Dispersion-Shifted Fiber - NZDSF) compliant fibers, considering the DWDM grids suggested by ITU-T Recommendations G.692 and G.694.1, with uniform channel spacings of 100, 50, 25, and 12.5 GHz. The mathematical/numerical model assumes undepleted pumping and shows very clearly the deleterious effect of FWM on the performance of DWDM systems, measured by the signal-to-noise ratio (SNR). The results make it evident that non-uniform channel spacing is practically mandatory for WDM systems based on DSF fibers.
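Why unequal spacing helps can be seen by counting FWM products directly. The sketch below (our illustration, with example channel values) brute-forces the mixing products f_i + f_j - f_k and counts those landing exactly on an active channel, where they act as in-band crosstalk: a uniform grid produces many such coincidences, while a suitably unequal grid can produce none:

```python
# Brute-force count of four-wave mixing products f_i + f_j - f_k
# (with k != i and k != j) that fall exactly on an active channel.
# These coincident products are what degrade the SNR in DWDM systems.

def fwm_hits(channels):
    chans = set(channels)
    hits = 0
    for fi in channels:
        for fj in channels:
            for fk in channels:
                if fk == fi or fk == fj:
                    continue
                if fi + fj - fk in chans:
                    hits += 1
    return hits

uniform = [193100 + 100 * n for n in range(4)]   # uniform 100 GHz grid (GHz)
unequal = [193100, 193200, 193450, 193850]       # unequal spacing (GHz)
```

With unequal spacing chosen so that all pairwise channel separations are distinct, no mixing product can coincide with a channel; the FWM power then falls between channels, where it can be filtered out.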

Using Automated Database Reverse Engineering for Database Integration

One important problem in today's organizations is the existence of non-integrated information systems, inconsistency, and the lack of suitable correlations between legacy and modern systems. One main solution is to transfer the local databases into a global one. In this regard, we need to extract the data structures from the legacy systems and integrate them with the new technology systems. In legacy systems, huge amounts of data are stored in legacy databases. They require particular attention since they need more effort to be normalized, reformatted, and moved to modern database environments. Designing the new integrated (global) database architecture and applying the reverse engineering requires data normalization. This paper proposes the use of database reverse engineering in order to integrate legacy and modern databases in organizations. The suggested approach consists of methods and techniques for generating the data transformation rules needed for data structure normalization.
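A tiny end-to-end illustration of the idea, using SQLite's system catalogue (our example with hypothetical table and column names, not the paper's method): recover a legacy table's structure from the database itself, then emit simple rename rules toward an integrated global schema:

```python
import sqlite3

# Reverse-engineer a legacy table's structure from the catalogue,
# then generate transformation rules for the integrated schema.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE LEGACY_CUST (CUSTNO TEXT, CUSTNAME TEXT, CITY TEXT)")

def extract_structure(con, table):
    """Read column names and declared types from the system catalogue."""
    return [(row[1], row[2]) for row in con.execute(f"PRAGMA table_info({table})")]

def transformation_rules(structure, mapping):
    """Simple rename rules: legacy column -> global-schema column."""
    return [(old, mapping.get(old, old)) for old, _ in structure]

structure = extract_structure(con, "LEGACY_CUST")
rules = transformation_rules(structure,
                             {"CUSTNO": "customer_id", "CUSTNAME": "customer_name"})
```

Real reverse engineering would also recover keys, dependencies, and implicit constraints, but the pattern is the same: read structure from the catalogue, then derive mechanical transformation rules from it.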

Structure of Linkages and Cam Gear for Integral Steering of Vehicles

This paper addresses issues of integral steering of vehicles with two steering axles, where the rear wheels can be pivoted in the direction of the front wheels, but also in the opposite direction. The steering box of the rear axle is presented with simple linkages (single contour) that correlate the pivoting of the rear wheels with the direction of the front wheels, respectively with the rotation angle of the steering wheel. The functionality of the system is analyzed, namely the extent to which the requirements of integral steering are met by the considered/proposed mechanisms. The paper highlights the quality of the single-contour linkages with two driving elements for meeting these requirements, presenting diagrams of mechanisms with two driving elements. Cam variants are analyzed and proposed for the rear axle steering box. Cam profiles are determined taking various factors into account.

Enhancing Cache Performance Based on Improved Average Access Time

A high performance computer includes a fast processor and millions of bytes of memory. During data processing, huge amounts of information are shuffled between the memory and the processor. Because of its small size and high speed, the cache has become a common feature of high performance computers. Enhancing cache performance has proved to be essential in speeding up cache-based computers. Most enhancement approaches can be classified as either software based or hardware controlled. The performance of the cache is quantified in terms of the hit ratio or miss ratio. In this paper, we optimize cache performance by enhancing the cache hit ratio. The optimum cache performance is obtained by focusing on a cache hardware modification that quickly rejects mismatched line tags at the hit-or-miss comparison stage, so that a low hit time for the wanted line in the cache is achieved. In the proposed technique, which we call Even-Odd Tabulation (EOT), the cache lines coming from main memory into the cache are classified into two types, even line tags and odd line tags, depending on their Least Significant Bit (LSB). This division is exploited by the EOT technique to reject mismatched line tags in a very short time compared to the time spent by the main comparator in the cache, giving an optimum hit time for the wanted cache line. The high performance of the EOT technique compared to the familiar mapping technique FAM is shown in the simulation results.
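The even/odd split can be sketched in software (a behavioural model of the idea, not the hardware design): tags are stored in two banks keyed by the LSB, so a lookup examines only the bank sharing its parity and rejects every tag in the other bank without any comparison:

```python
# Behavioural sketch of the Even-Odd Tabulation (EOT) idea: stored
# tags are partitioned by their least significant bit, so a lookup
# compares only against tags of matching parity.

class EOTCache:
    def __init__(self):
        self.banks = {0: set(), 1: set()}   # even-tag bank, odd-tag bank

    def insert(self, tag):
        self.banks[tag & 1].add(tag)

    def lookup(self, tag):
        """Return (hit, comparisons): only one bank is ever searched."""
        bank = self.banks[tag & 1]
        return tag in bank, len(bank)

cache = EOTCache()
for t in [2, 4, 6, 8, 1, 3]:
    cache.insert(t)
hit, compared = cache.lookup(5)   # odd tag: only the 2 odd tags examined
```

With tags roughly balanced between parities, about half of the candidate comparisons are eliminated before the main comparator runs, which is the source of the claimed hit-time reduction.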

Innovation Strategy in Slovak Businesses

The aim of this paper, based on a detailed analysis of the literature and on research carried out, is to develop a model for the development and implementation of innovation strategy in business. The paper brings the main results of the research conducted by the authors on a sample of 462 respondents, which shows the current situation regarding the use of innovation strategy in Slovak enterprises. The research and analysis carried out provided the basis for a model for developing and implementing an innovation strategy in the business, which is explained in the paper in detail, step by step, with emphasis on the implementation process. The implementation of the innovation strategy is described by a separate model. The paper contains recommendations for the successful implementation of an innovation strategy in business. These recommendations should serve business managers in particular as a valuable tool in implementing an innovation strategy.

Identifying the Objectives of Outsourcing Logistics Services as a Basis for Measuring Its Financial and Operational Performance

Logistics outsourcing is a growing trend, and measuring its performance is a challenge. Such measurement must be consistent with the objectives set for logistics outsourcing, but we have found no objective-based performance measurement system. We have conducted a comprehensive review of the specialist literature to cover this gap, which has led us to identify and define these objectives. The outcome is a list of the most relevant objectives and their descriptions. This will enable us to analyse in a future study whether the indicators used for measuring logistics outsourcing performance are consistent with the objectives pursued with the outsourcing. If this is not the case, a proposal will be made for a set of financial and operational indicators to measure performance in logistics outsourcing that takes the goals being pursued into account.

A Remote Sensing Approach for Vulnerability and Environmental Change in Apodi Valley Region, Northeast Brazil

The objective of this study was to improve our understanding of vulnerability and environmental change: its causes, intensity, distribution, and human-environment effects on the ecosystem in the Apodi Valley Region. This paper identifies, assesses, and classifies vulnerability and environmental change in the Apodi valley region using a combined approach of landscape pattern and ecosystem sensitivity. Models were developed using the following five thematic layers: geology, geomorphology, soil, vegetation, and land use/cover, by means of a Geographical Information System (GIS) based on hydro-geophysical parameters. In spite of the data problems and shortcomings, using ESRI's ArcGIS 9.3 program, the vulnerability score, used to classify, weight, and combine 15 separate land cover classes into a single indicator, provides a reliable measure of the differences (6 classes) among regions and communities that are exposed to similar ranges of hazards. Indeed, the ongoing and active development of vulnerability concepts and methods has already produced some tools to help overcome common issues, such as acting in a context of high uncertainty, taking into account the dynamics and spatial scale of a social-ecological system, or gathering viewpoints from different sciences to combine human- and impact-based approaches. Based on this assessment, this paper proposes concrete perspectives and possibilities to benefit from existing commonalities in the construction and application of assessment tools.
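The weight-and-combine step can be sketched as a weighted overlay per map cell (our illustration with hypothetical weights and scores, not the study's calibrated values): each thematic layer contributes a weighted score, and the combined indicator is binned into a fixed number of vulnerability classes:

```python
# Weighted-overlay sketch: combine five thematic layer scores per cell
# into one vulnerability indicator, then bin it into 6 classes.
# Weights and cell scores below are hypothetical.

LAYERS = ["geology", "geomorphology", "soil", "vegetation", "land_use"]
WEIGHTS = {"geology": 0.15, "geomorphology": 0.20, "soil": 0.20,
           "vegetation": 0.20, "land_use": 0.25}   # sum to 1.0

def vulnerability_score(cell):
    """Weighted sum of the layer scores for one raster cell."""
    return sum(WEIGHTS[layer] * cell[layer] for layer in LAYERS)

def vulnerability_class(score, n_classes=6, max_score=10.0):
    """Bin a score in [0, max_score] into classes 1..n_classes."""
    return min(n_classes, 1 + int(score / max_score * n_classes))

cell = {"geology": 3, "geomorphology": 7, "soil": 5,
        "vegetation": 8, "land_use": 9}            # scores on a 0-10 scale
score = vulnerability_score(cell)
```

In a GIS this arithmetic is applied cell-by-cell across the rasterized layers; the sketch shows only the per-cell combination rule.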

Algerian Irrigation in Transition: Effects on Irrigation Profitability in Irrigation Schemes: The Case of the East Mitidja Scheme

In Algeria, liberalization reforms undertaken since the 1990s have resulted in negative effects on the development and management of irrigation schemes, as well as on the conditions of farmers. Reforms have been undertaken to improve the performance of irrigation schemes, such as the national plan of agricultural development (PNDA) in 2000 and the water pricing policy of 2005. However, after the implementation of these policies, questions have arisen with regard to irrigation performance and its suitability for agricultural development. Hence, the aim of this paper is to provide insight into the profitability of irrigation during the transition period under current irrigation agricultural policies in Algeria. By using the method of farm crop budget analysis in the East Mitidja irrigation scheme, the returns from using surface water resources, based on farm typology, were found to vary among crops and farmers' groups within the scheme. Irrigation under the current situation is profitable for all farmers, including both those who benefit from subsidies and those who do not. However, the returns to water were found to be very sensitive to crop price fluctuations, particularly for non-subsidized groups and less so for those whose farming is based on orchards. Moreover, the socio-economic environment of the farmers contributed to less significant impacts of the PNDA policy. In fact, the limiting factor is not only the water, but also the lack of land ownership title. Market access constraints led to less agricultural investment and therefore to low intensification and low water productivity. It is financially feasible to recover the annual O&M costs in the irrigation scheme. By comparing the irrigation water price, the returns to water, and the O&M costs of water delivery, it is clear that irrigation can be profitable in the future. However, water productivity must be improved by enhancing farmers' income through farming investment, improving access to assets, and allocating activities and crops which bring high returns to water; this could allow the farmers to pay more for water and allow cost recovery for water systems.
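The core calculation of a farm crop budget analysis can be sketched in two lines (our illustration with hypothetical placeholder figures, not the study's data): the gross margin of a crop per hectare, and the returns to water as that margin per cubic metre of irrigation water applied:

```python
# Farm crop budget sketch: gross margin per hectare, then returns
# to water per cubic metre. All figures are hypothetical placeholders.

def gross_margin(yield_t_ha, price_per_t, variable_costs):
    """Revenue minus variable costs, per hectare."""
    return yield_t_ha * price_per_t - variable_costs

def return_to_water(margin, water_m3_ha):
    """Currency units earned per m^3 of irrigation water applied."""
    return margin / water_m3_ha

margin = gross_margin(yield_t_ha=30.0, price_per_t=50.0, variable_costs=600.0)
rtw = return_to_water(margin, water_m3_ha=4500.0)
```

Comparing `rtw` against the irrigation water price and the O&M cost of delivery per cubic metre is exactly the profitability test the abstract describes.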

On the Use of Ionic Liquids for CO2 Capturing

In this work, ionic liquids (ILs) for CO2 capture in a typical absorption/stripping process are considered. The use of ionic liquids is considered to be cost-effective because it requires less energy for solvent recovery compared to other conventional processes. A mathematical model is developed for the process based on the Peng-Robinson (PR) equation of state (EoS), which is validated with experimental data for various solutions involving CO2. The model is utilized to study the sorbent and energy demand for three types of ILs at specific CO2 capture rates. The energy demand is manifested by the vapor-liquid equilibrium temperature necessary to remove the captured CO2 from the used solvent in the regeneration step. It is found that a higher recovery temperature is required for solvents with a higher solubility coefficient. For all ILs, the temperature requirement is less than that required by the typical monoethanolamine (MEA) solvent. The effect of the CO2 loading of the sorbent stream on the process performance is also examined.
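The pressure-explicit form of the Peng-Robinson EoS used in such models can be sketched for pure CO2 (a minimal illustration, not the paper's full mixture model; the critical constants below are standard textbook values for CO2):

```python
import math

# Peng-Robinson EoS, pressure-explicit form, for pure CO2.
# Critical constants from standard tables: Tc ~ 304.13 K,
# Pc ~ 7.377 MPa, acentric factor omega ~ 0.2239.
R = 8.314                      # J/(mol K)
TC, PC, OMEGA = 304.13, 7.377e6, 0.2239

def pr_pressure(T, V):
    """Pressure (Pa) at temperature T (K) and molar volume V (m^3/mol)."""
    a = 0.45724 * R**2 * TC**2 / PC
    b = 0.07780 * R * TC / PC
    kappa = 0.37464 + 1.54226 * OMEGA - 0.26992 * OMEGA**2
    alpha = (1 + kappa * (1 - math.sqrt(T / TC)))**2
    return R * T / (V - b) - a * alpha / (V * (V + b) + b * (V - b))

p = pr_pressure(350.0, 1.0e-3)   # below the ideal-gas value R*T/V
```

Applying the EoS to IL/CO2 mixtures additionally requires mixing rules and binary interaction parameters fitted to the solubility data, which is where the experimental validation in the abstract comes in.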

Securing Message in Wireless Sensor Network by using New Method of Code Conversions

Recently, wireless sensor networks have attracted more interest; they are widely used in many commercial and military applications and may be deployed in critical scenarios (e.g., when a malfunctioning network results in danger to human life or great financial loss). Such networks must be protected against human intrusion by using secret keys to encrypt the messages exchanged between communicating nodes. Both symmetric and asymmetric methods have their own drawbacks for use in key management. Thus, we avoid the weaknesses of these two cryptosystems and make use of their advantages to establish a secure environment by developing a new encryption method based on the idea of code conversion. The code conversion equations are used as the key for designing the proposed system, based on the principles of logic gates. Using our security architecture, we show how to reduce significant attacks on wireless sensor networks.
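Since the paper's exact code-conversion equations are not reproduced here, the flavour of a gate-level code conversion can be illustrated with the classic binary-to-Gray conversion, which is built entirely from XOR gates, used as a stand-in transform combined with a keyed XOR (an illustrative toy, not a secure cipher and not the paper's scheme):

```python
# Illustrative stand-in for a logic-gate code conversion: the
# binary <-> Gray-code transform (pure XOR gates), combined with a
# keyed XOR. Toy construction only; not the paper's actual equations.

def to_gray(b):
    """Binary -> Gray: each output bit XORs adjacent input bits."""
    return b ^ (b >> 1)

def from_gray(g):
    """Gray -> binary: cumulative XOR down the shifted codeword."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

def encrypt(byte, key):
    return to_gray(byte) ^ key      # code conversion, then keyed XOR

def decrypt(cipher, key):
    return from_gray(cipher ^ key)

msg = [0x48, 0x69]                  # "Hi"
key = 0x5A
ct = [encrypt(m, key) for m in msg]
pt = [decrypt(c, key) for c in ct]  # round-trips back to msg
```

The appeal for sensor nodes is that such conversions cost only a handful of XOR gates per bit, far cheaper than conventional symmetric or asymmetric primitives.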