Impulse Response Shortening for Discrete Multitone Transceivers using Convex Optimization Approach

In this paper, we propose a new criterion for solving the channel shortening problem in multi-carrier systems. In a discrete multitone receiver, a time-domain equalizer (TEQ) reduces intersymbol interference (ISI) by shortening the effective duration of the channel impulse response. The minimum mean square error (MMSE) method for TEQ design does not give satisfactory results. In [1], a new criterion was introduced for partially equalizing severe ISI channels to reduce the cyclic prefix overhead of the discrete multitone transceiver (DMT), assuming a fixed transmission bandwidth. Because of the specific constraint used in that method (a unit norm constraint on the target impulse response (TIR)), the freedom to choose the optimum TIR vector is reduced; better results can be obtained by avoiding the unit norm constraint. In this paper, we recast the cost function proposed in [1] as the maximization of a determinant subject to a linear matrix inequality (LMI) and a quadratic constraint, and we solve the resulting optimization problem. The usefulness of the proposed method is demonstrated by simulations.
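The reformulation described above falls into the classical max-det family of convex problems. The sketch below is a minimal, generic max-det example in Python using cvxpy (a library not mentioned in the paper); the matrices, the particular LMI, and the quadratic constraint are placeholder assumptions, not the paper's channel-dependent formulation.

```python
import cvxpy as cp
import numpy as np

# Hypothetical data standing in for the channel-dependent matrices of the
# paper's formulation; generated randomly here purely for illustration.
n = 4
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
R = A @ A.T + n * np.eye(n)          # a positive-definite data matrix

X = cp.Variable((n, n), symmetric=True)
constraints = [
    X >> 0,                                   # LMI: X is positive semidefinite
    R - X >> 0,                               # LMI coupling X to the data matrix R
    cp.sum_squares(X @ np.ones(n)) <= 1.0,    # an additional quadratic constraint
]

# Maximizing det(X) is done through its concave surrogate log det(X).
problem = cp.Problem(cp.Maximize(cp.log_det(X)), constraints)
problem.solve()
print("optimal log-determinant:", problem.value)
```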

Moving From Problem Space to Solution Space

Extracting and elaborating software requirements and transforming them into a viable software architecture is still an intricate task. This paper defines a solution architecture based on a blurred amalgamation of the problem space and the solution space. The dependencies between domain constraints, requirements, and architecture, and their importance, are described; these must be considered collectively while evolving from the problem space to the solution space. The paper proposes a revised version of the Twin Peaks Model, named the Win Peaks Model, that reconciles software requirements and architecture in a more consistent and adaptable manner. Further, conflicts between stakeholders' win-requirements are resolved by the proposed voting methodology, which is a simple adaptation of the win-win requirements negotiation model and QARCC.
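To illustrate the kind of conflict resolution described above, the following is a hypothetical weighted-voting sketch in Python: stakeholders score conflicting win-requirements and the weighted totals decide the conflict. The stakeholder names, weights, and scoring scale are invented for illustration and are not part of the proposed Win Peaks methodology.

```python
from collections import defaultdict

# Hypothetical stakeholder ballots: each stakeholder scores every conflicting
# win-requirement (0-5); weights reflect stakeholder importance.
votes = {
    "customer":  {"R1-fast-delivery": 5, "R2-low-cost": 3},
    "developer": {"R1-fast-delivery": 2, "R2-low-cost": 4},
    "tester":    {"R1-fast-delivery": 3, "R2-low-cost": 5},
}
weights = {"customer": 0.5, "developer": 0.3, "tester": 0.2}

totals = defaultdict(float)
for stakeholder, ballot in votes.items():
    for requirement, score in ballot.items():
        totals[requirement] += weights[stakeholder] * score

# The requirement with the highest weighted score resolves the conflict.
winner = max(totals, key=totals.get)
print(dict(totals), "->", winner)
```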

Molecular Mechanism of Amino Acid Discrimination for the Editing Reaction of E. coli Leucyl-tRNA Synthetase

Certain aminoacyl-tRNA synthetases have developed highly accurate molecular machinery to discriminate their cognate amino acids. These aaRSs achieve this via an editing reaction in the connective polypeptide 1 (CP1) domain. Recently, mutagenesis studies have revealed the critical importance of residues in the CP1 domain for editing activity, and X-ray structures have shown the binding mode of noncognate amino acids in the editing domain. To elucidate the molecular mechanism of amino acid discrimination, molecular modeling studies were performed. Our results suggest that the aaRS binds the noncognate amino acid more tightly than the cognate one. Finally, by comparing the binding conformations of the amino acids in three systems, the amino acid binding mode was elucidated and a discrimination mechanism proposed. The results strongly suggest that the conserved threonines are responsible for amino acid discrimination. This is achieved through side chain interactions between T252 and T247/T248 as well as between those threonines and the incoming amino acids.

Reduction of Power Losses in Distribution Systems

Loss reduction initiatives in distribution systems have been prompted by the increasing cost of supplying electricity, fuel shortages with ever-increasing costs to produce more power, and global warming concerns. These initiatives have been introduced to the utilities in the shape of incentives and penalties. Recently, the electricity distribution companies in Oman have been incentivized to reduce distribution technical and non-technical losses at an equal annual reduction rate for 6 years. In this paper, different techniques for loss reduction in the Mazoon Electricity Company (MZEC) are addressed. In this company, a high number of substations and feeders were found to be non-compliant with the Distribution System Security Standard (DSSS). Therefore, 33 projects have been suggested to bring the 29 non-complying substations and 28 feeders up to the planning criteria and into compliance with the DSSS. The largest part of MZEC's network (the South Batinah region) was modeled with the ETAP software package. The model was extended to implement the proposed projects and to examine their effects on loss reduction. Simulation results have shown that the implementation of these projects leads to a significant improvement in the voltage profile and a reduction in the active and reactive power losses. Finally, the economic analysis has revealed that the implementation of the proposed projects in MZEC leads to an annual saving of about US$ 5 million.
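For orientation only, the following is a minimal Python sketch of how an annual loss-reduction saving of this kind can be estimated. Every number in it (energy delivered, loss rates, energy price) is a hypothetical placeholder, not a figure from the MZEC study.

```python
# Purely illustrative estimate of the annual saving from reducing
# distribution losses; all figures are hypothetical placeholders.
energy_delivered_gwh = 3000.0          # annual energy delivered (GWh)
loss_rate_before = 0.12                # technical + non-technical losses before projects
loss_rate_after = 0.10                 # losses after the proposed projects
energy_price_usd_per_mwh = 80.0        # average cost of supplying energy

energy_saved_mwh = energy_delivered_gwh * 1000 * (loss_rate_before - loss_rate_after)
annual_saving_usd = energy_saved_mwh * energy_price_usd_per_mwh
print(f"energy saved: {energy_saved_mwh:,.0f} MWh/year")
print(f"annual saving: US$ {annual_saving_usd:,.0f}")
```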

A Robust Data Hiding Technique based on LSB Matching

Many researchers are working on information hiding techniques, using different ideas and domains to hide their secret data. This paper introduces a robust technique for hiding secret data in images based on LSB insertion and RSA encryption. The key idea of the proposed technique is to encrypt the secret data, convert the encrypted data into a bit stream, and divide it into a number of segments. The cover image is also divided into the same number of segments. Each data segment is compared with each image segment to find the best matching segment, producing a new, random-looking sequence of segments that is then embedded in the cover image. Experimental results show that the proposed technique has a high security level and produces better stego-image quality.
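A minimal sketch of the segment-matching step is shown below in Python. The RSA encryption is represented only by a placeholder bit stream, and the segment size, matching metric (Hamming distance on the LSB plane), and greedy pairing are our assumptions for illustration, not the paper's exact design.

```python
import numpy as np

def best_match_order(data_bits, cover_lsbs, n_segments):
    """Pair each data segment with the free cover segment whose LSBs differ
    least (fewest bit changes when embedding). Greedy matching; illustrative."""
    data_segs = np.array_split(data_bits, n_segments)
    cover_segs = np.array_split(cover_lsbs, n_segments)
    free = list(range(n_segments))
    order = []
    for d in data_segs:
        # Hamming distance between the data segment and each free cover segment
        costs = [(np.sum(d != cover_segs[c][: len(d)]), c) for c in free]
        _, best = min(costs)
        free.remove(best)
        order.append(best)
    return order  # the random-looking sequence in which segments are embedded

# Hypothetical inputs: encrypted payload bits and the LSB plane of a cover image
rng = np.random.default_rng(1)
payload_bits = rng.integers(0, 2, 256)              # stands in for RSA-encrypted data
cover = rng.integers(0, 256, 1024, dtype=np.uint8)  # stands in for cover-image pixels
print(best_match_order(payload_bits, cover & 1, n_segments=8))
```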

Investigating Transformations in the Cartesian Plane Using Spreadsheets

The link between coordinate transformations in the plane and their effects on the graph of a function can be difficult for students studying college-level mathematics to comprehend. To solidify this conceptual link in the mind of a student, Microsoft Excel can serve as a convenient graphing tool and pedagogical aid. The authors of this paper describe how various transformations and their related functional symmetry properties can be graphically displayed with an Excel spreadsheet.
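The spreadsheet idea can be mimicked in a few lines of Python; the sketch below tabulates the kind of transformation columns the paper describes (horizontal and vertical shifts, reflections). The choice of f, a, and b is arbitrary and purely illustrative.

```python
import numpy as np

# Tabulate a function and standard transformations of its graph, the way the
# spreadsheet columns would; f, a and b are arbitrary illustrative choices.
f = lambda x: x**2
a, b = 2.0, 3.0
x = np.linspace(-5, 5, 11)

columns = {
    "f(x)":      f(x),
    "f(x - a)":  f(x - a),   # horizontal shift right by a
    "f(x) + b":  f(x) + b,   # vertical shift up by b
    "f(-x)":     f(-x),      # reflection about the y-axis
    "-f(x)":     -f(x),      # reflection about the x-axis
}
for name, values in columns.items():
    print(f"{name:10s}", np.round(values, 2))
```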

The Parameters Analysis for the Intersection Collision Avoidance Systems Based on Radar Sensors

This paper studies the analysis of parameters in an intersection collision avoidance (ICA) system based on radar sensors. The parameters include the positioning errors, the repetition period of the radar sensor, the conditions for potential collisions of two cross-path vehicles, etc. The analysis of these parameters can provide the requirements, limitations, or specifications of the ICA system. The analysis shows that the positioning errors increase as the measured vehicle approaches the intersection. In addition, it is not necessary to mount the radar sensor at a higher position, since the positioning sensitivity degrades as the height of the radar sensor increases. A concept of safety buffer distances for the front and rear of the measured vehicle is also proposed. The conditions for potential collisions of two cross-path vehicles are also presented to facilitate the computation algorithm.
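As a simple illustration of a cross-path collision condition with safety buffers, consider the Python sketch below: each vehicle occupies the conflict point during a time window, and a potential collision is flagged when the windows overlap. The geometry, speeds, vehicle length, and buffer values are hypothetical and not taken from the paper.

```python
# Illustrative check for a potential collision between two cross-path vehicles
# approaching an intersection; all numerical values are hypothetical.
def potential_collision(d1, v1, d2, v2, length=4.5, buffer_front=2.0, buffer_rear=2.0):
    """d1, d2: distances (m) of vehicles 1 and 2 to the conflict point;
    v1, v2: speeds (m/s). A collision is possible if the time windows during
    which each vehicle (extended by its safety buffers) occupies the conflict
    point overlap."""
    occupied = length + buffer_front + buffer_rear
    t1_in, t1_out = d1 / v1, (d1 + occupied) / v1
    t2_in, t2_out = d2 / v2, (d2 + occupied) / v2
    return t1_in < t2_out and t2_in < t1_out

print(potential_collision(d1=50, v1=14, d2=48, v2=13))   # overlapping windows -> True
print(potential_collision(d1=50, v1=14, d2=10, v2=13))   # disjoint windows -> False
```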

Accelerating Integer Neural Networks On Low Cost DSPs

In this paper, low-end Digital Signal Processors (DSPs) are applied to accelerate integer neural networks. The use of DSPs to accelerate neural networks has been a topic of study for some time and has demonstrated significant performance improvements. Recently, work has been done on integer-only neural networks, which greatly reduce hardware requirements and thus allow for cheaper hardware implementations. DSPs with Arithmetic Logic Units (ALUs) that support floating- or fixed-point arithmetic are generally more expensive than their integer-only counterparts due to increased circuit complexity. However, if the need for floating- or fixed-point math operations can be removed, then simpler, lower-cost DSPs can be used. To achieve this, an integer-only neural network is created in this paper and then accelerated using DSP instructions to improve performance.
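To make the integer-only idea concrete, here is a minimal Python sketch of one fully connected layer computed entirely in integer arithmetic, with a power-of-two right shift standing in for floating-point rescaling. The layer sizes, shift amount, and clipping range are illustrative assumptions, not the network used in the paper.

```python
import numpy as np

# Integer-only forward pass for one layer: 8-bit weights/activations,
# 32-bit accumulation, rescaling by a right shift (no FPU required).
def int_layer(x_int8, w_int8, b_int32, shift=7):
    acc = x_int8.astype(np.int32) @ w_int8.astype(np.int32) + b_int32  # integer MAC
    acc = np.maximum(acc, 0)                 # ReLU in integer arithmetic
    out = acc >> shift                       # rescale by a power of two
    return np.clip(out, -128, 127).astype(np.int8)

rng = np.random.default_rng(0)
x = rng.integers(-128, 128, size=16, dtype=np.int8)
w1 = rng.integers(-128, 128, size=(16, 8), dtype=np.int8)
b1 = rng.integers(-1000, 1000, size=8, dtype=np.int32)
print(int_layer(x, w1, b1))
```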

Economical Operation of Hydro-Thermal Power System based on Multi-path Adaptive Tabu Search

An economic operation scheduling problem for a hydro-thermal power generation system is solved by the proposed multipath adaptive tabu search (MATS) algorithm. Four reservoirs with their own hydro plants and one thermal plant are integrated into the studied system, which is used to formulate the objective function under complicated constraints, e.g., water management, power balance, and thermal generator limits. MATS, with four sub-search units (ATSs) and a two-stage discarding mechanism (DM), is configured to solve the problem over 25 trials under a function-evaluation criterion. It is shown that MATS provides superior results compared with a single ATS and with other previous methods, namely genetic algorithms (GA) and differential evolution (DE).
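The sketch below shows, in Python, the basic neighborhood / tabu-list / aspiration mechanics that an adaptive tabu search unit builds on, applied to a toy quadratic objective. It is only an illustration of the search skeleton, not the hydro-thermal scheduling formulation or the multipath/discarding extensions of MATS.

```python
import random

def cost(x):
    return sum((xi - 7) ** 2 for xi in x)       # hypothetical objective

def tabu_search(n=5, iters=200, tabu_len=5, seed=0):
    rng = random.Random(seed)
    x = [rng.randint(0, 20) for _ in range(n)]
    best, best_cost = list(x), cost(x)
    tabu = []                                   # recently forbidden moves
    for _ in range(iters):
        candidates = []
        for i in range(n):                      # neighborhood: +/- 1 on one coordinate
            for d in (-1, 1):
                y = list(x)
                y[i] += d
                c = cost(y)
                if (i, d) not in tabu or c < best_cost:   # aspiration criterion
                    candidates.append((c, y, (i, d)))
        c, y, move = min(candidates)            # best admissible neighbor
        x = y
        tabu.append((move[0], -move[1]))        # forbid the immediate reverse move
        tabu = tabu[-tabu_len:]
        if c < best_cost:
            best, best_cost = list(y), c
    return best, best_cost

print(tabu_search())
```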

Controlling of Load Elevators by the Fuzzy Logic Method

In this study, a fuzzy-logic-based control system was designed to ensure that time and energy are saved during the operation of the load elevators used in the construction of tall buildings. In the devised control system, to make the load elevators work more efficiently, the energy interval in which the motor operates was taken as the output variable, whereas the amount of load and the building height were taken as input variables. The most appropriate working intervals, depending on the characteristics of these variables, were defined with the help of an expert. The fuzzy expert system software was developed using the Delphi programming language. In this design, the Mamdani max-min inference mechanism was used, and the centroid method was employed for defuzzification. In conclusion, the designed system is observed to be feasible, and this is supported by statistical analyses.
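For readers unfamiliar with Mamdani max-min inference and centroid defuzzification, the Python sketch below runs a two-rule toy controller with load and building height as inputs and a motor energy interval as output. The membership functions, rule base, and variable ranges are illustrative placeholders, not the expert-defined ones used in the paper (which was implemented in Delphi).

```python
import numpy as np

# Triangular membership function on [a, b, c]
def tri(x, a, b, c):
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

load, height = 600.0, 40.0                 # crisp inputs: load (kg), building height (m)
energy = np.linspace(0, 100, 501)          # output universe: motor energy interval (%)

# Rule 1: IF load is heavy AND height is tall THEN energy is high
w1 = min(tri(load, 400, 800, 1200), tri(height, 20, 60, 100))
# Rule 2: IF load is light AND height is low THEN energy is low
w2 = min(tri(load, 0, 200, 500), tri(height, 0, 10, 30))

# Max-min inference: clip each output set at its rule strength, then aggregate.
agg = np.maximum(np.minimum(w1, tri(energy, 60, 80, 100)),
                 np.minimum(w2, tri(energy, 0, 20, 40)))

# Centroid defuzzification
crisp = float(np.sum(energy * agg) / (np.sum(agg) + 1e-9))
print(f"recommended energy setting: {crisp:.1f} %")
```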

A New Scheme for Improving the Quality of Service in Heterogeneous Wireless Network for Data Stream Sending

In this paper, we first consider the quality of service (QoS) problems of heterogeneous wireless networks when sending video data, for which the real-time requirement is a pronounced challenge. We then present a method for ensuring end-to-end QoS at the application-layer level for adaptive transmission of video data over heterogeneous wireless networks. To do this, mechanisms in different layers are used: the stop mechanism, the adaptation mechanism, and graceful degradation at the application layer; the multi-level congestion feedback mechanism at the network layer; and the connection cut-off decision mechanism at the link layer. Finally, the presented method and the achieved improvement are simulated and evaluated in the NS-2 software.
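An illustrative decision rule combining the mechanisms named above is sketched in Python below: multi-level congestion feedback chooses between normal sending, bitrate adaptation, graceful degradation, stopping, or cutting the connection. The level names, thresholds, and actions are assumptions for illustration only, not the paper's actual rules.

```python
def sender_action(congestion_level, link_quality):
    """congestion_level: 0 (none) .. 3 (severe), from network-layer feedback.
    link_quality: 0.0 .. 1.0, reported by the link layer."""
    if link_quality < 0.2:
        return "cut connection"            # link-layer cut-off decision
    if congestion_level >= 3:
        return "stop sending"              # stop mechanism
    if congestion_level == 2:
        return "graceful degradation"      # drop enhancement layers / lower quality
    if congestion_level == 1:
        return "adapt bitrate"             # adaptation mechanism
    return "send at full quality"

for level in range(4):
    print(level, sender_action(level, link_quality=0.8))
```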

The Functionality and Usage of CRM Systems

Modern information and communication technologies offer a variety of support options for the efficient handling of customer relationships. CRM systems have been developed, which are designed to support the processes in the areas of marketing, sales and service. Along with technological progress, CRM systems are constantly changing, i.e. the systems are continually enhanced by new functions. However, not all functions are suitable for every company because of different frameworks and business processes. In this context the question arises whether or not CRM systems are widely used in Austrian companies and which business processes are most frequently supported by CRM systems. This paper aims to shed light on the popularity of CRM systems in Austrian companies in general and the use of different functions to support their daily business. First of all, the paper provides a theoretical overview of the structure of modern CRM systems and proposes a categorization of currently available software functionality for collaborative, operational and analytical CRM processes, which provides the theoretical background for the empirical study. Apart from these theoretical considerations, the paper presents the empirical results of a field survey on the use of CRM systems in Austrian companies and analyzes its findings.

Selecting Materialized Views Using Two-Phase Optimization with Multiple View Processing Plan

A data warehouse (DW) is a system whose value and role lie in supporting decision-making through querying. Queries to a DW are critical with regard to their complexity and length; they often access millions of tuples and involve joins between relations and aggregations. Materialized views can provide better performance for DW queries. However, these views incur maintenance costs, so materializing all views is not possible. An important challenge of the DW environment is therefore materialized view selection, because we have to realize the trade-off between query performance and view maintenance cost. In this paper, we introduce a new approach aimed at solving this challenge based on Two-Phase Optimization (2PO), which is a combination of Simulated Annealing (SA) and Iterative Improvement (II), with the use of a Multiple View Processing Plan (MVPP). Our experiments show that our method provides a further improvement in terms of query processing cost and view maintenance cost.
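To convey the flavor of the SA phase of 2PO, the Python sketch below anneals over subsets of candidate views using a toy benefit/maintenance cost model. The random per-view benefits and costs stand in for an MVPP-derived cost model and are not part of the paper's experiments.

```python
import math
import random

random.seed(0)
n_views = 12
benefit = [random.uniform(5, 50) for _ in range(n_views)]   # query-cost reduction per view
maint = [random.uniform(1, 30) for _ in range(n_views)]     # maintenance cost per view

def total_cost(selected):
    return sum(maint[i] for i in selected) - sum(benefit[i] for i in selected)

def anneal(t0=50.0, cooling=0.95, steps=2000):
    current = set()
    best, best_c = set(current), total_cost(current)
    t = t0
    for _ in range(steps):
        i = random.randrange(n_views)
        neighbor = set(current) ^ {i}                 # flip one view in/out of the selection
        delta = total_cost(neighbor) - total_cost(current)
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = neighbor
            if total_cost(current) < best_c:
                best, best_c = set(current), total_cost(current)
        t *= cooling                                  # cooling schedule
    return sorted(best), best_c

print(anneal())
```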

Ghazal Ozon River and Preserving the Existent Aquatics While Constructing the Siazakh Dam

The main purpose of a dam is to control surface streams and rivers. Dam construction and the formation of large water reservoirs in a river valley is a major intervention that changes the surrounding area considerably. In fact, when a dam is constructed, the valley is closed off, fish can no longer migrate between upstream and downstream, and this ultimately leads to their death. To resolve this, it is necessary to create a passage for fish during the construction of the dam. Establishing a set of stepped pools overlooking each other, known as a fishway or fish ladder, provides a proper pathway for fish movement. In this article, we first examine the surrounding environment and then the Ghazal Ozon River and the preservation of its aquatic life.

Theoretical Considerations for Software Component Metrics

We have defined two suites of metrics covering the static and dynamic aspects of component assembly. The static metrics measure the complexity and criticality of component assembly, wherein complexity is measured using the Component Packing Density and Component Interaction Density metrics. Further, four criticality conditions, namely Link, Bridge, Inheritance, and Size criticalities, have been identified and quantified. The complexity and criticality metrics are combined to form a Triangular Metric, which can be used to classify the type and nature of applications. The dynamic metrics are collected during the runtime of a complete application; they are useful for identifying super-components and for evaluating the degree of utilisation of the various components. In this paper, both the static and the dynamic metrics are evaluated using Weyuker's set of properties. The results show that the metrics provide a valid means to measure issues in component assembly. We relate our metrics suite to McCall's Quality Model and illustrate its impact on product quality and on the management of component-based product development.
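As a concrete reading of two of the static metrics, the Python sketch below computes a packing density (constituents per component) and an interaction density (actual interactions over available interfaces). These ratio-style formulas are our interpretation of the metric names for illustration; they are not quoted from the paper.

```python
# Illustrative computations for two static metrics; the definitions are our
# reading of the metric names, not the paper's exact formulas.
def component_packing_density(n_constituents, n_components):
    """Constituents (e.g. lines of code, classes, operations) packed per component."""
    return n_constituents / n_components

def component_interaction_density(actual_interactions, available_interfaces):
    """Fraction of the available component interfaces that actually interact."""
    return actual_interactions / available_interfaces

cpd = component_packing_density(n_constituents=120, n_components=10)
cid = component_interaction_density(actual_interactions=18, available_interfaces=45)
print(f"CPD = {cpd:.1f} constituents/component, CID = {cid:.2f}")
```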

Confronting the Uncertainty of Systemic Innovation in Public Welfare Services

Faced with social and health system capacity constraints and rising and changing demand for welfare services, governments and welfare providers are increasingly relying on innovation to help support and enhance services. However, the evidence reported by several studies indicates that the realization of that potential is not an easy task. Innovations can be deemed inherently complex to implement and operate, because many of them involve a combination of technological and organizational renewal within an environment featuring a diversity of stakeholders. Many public welfare service innovations are markedly systemic in their nature, which means that they emerge from, and must address, the complex interplay between political, administrative, technological, institutional and legal issues. This paper suggests that stakeholders dealing with systemic innovation in welfare services must deal with ambiguous and incomplete information in circumstances of uncertainty. Employing a literature review methodology and case study, this paper identifies, categorizes and discusses different aspects of the uncertainty of systemic innovation in public welfare services, and argues that uncertainty can be classified into eight categories: technological uncertainty, market uncertainty, regulatory/institutional uncertainty, social/political uncertainty, acceptance/legitimacy uncertainty, managerial uncertainty, timing uncertainty and consequence uncertainty.

Linux based Embedded Node for Capturing, Compression and Streaming of Digital Audio and Video

A prototype for audio and video capture and compression in real time on a Linux platform has been developed. It is able to display both the captured and the compressed video at the same time, as well as the captured and compressed audio, with the goal of comparing their quality. As it is based on free software, the final goal is to run it on an embedded system running Linux, thereby implementing a node that captures and compresses such multimedia information. The project could then be considered part of a larger one aimed at live broadcasting of audio and video using a streaming server that communicates with our node. This would yield a very powerful and flexible system with several practical applications.
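One quick way to prototype such a capture-and-compress node on a desktop Linux machine is to drive ffmpeg from Python, as sketched below. The devices, codecs, and output file are assumptions for illustration; the abstract does not state which capture and compression toolchain the prototype actually uses.

```python
import subprocess

# Capture webcam video and ALSA audio, compress them, and write a short test
# file; all device names and codec choices are illustrative assumptions.
cmd = [
    "ffmpeg",
    "-f", "v4l2", "-i", "/dev/video0",           # raw video from a webcam
    "-f", "alsa", "-i", "default",               # audio from the default ALSA device
    "-c:v", "libx264", "-preset", "ultrafast",   # real-time friendly video compression
    "-c:a", "aac",                               # audio compression
    "-t", "10",                                  # record 10 seconds for a quick test
    "capture_test.mp4",
]
subprocess.run(cmd, check=True)
```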

Radiation Damage as Nonlinear Evolution of Complex System

Irradiated material is a typical example of a complex system with nonlinear coupling between its elements. During irradiation, radiation damage develops, and this development exhibits bifurcations and qualitatively different kinds of behavior. The accumulation of primary defects in irradiated crystals is considered in the framework of the nonlinear evolution of a complex system. The nonlinear thermo-concentration feedback is treated as the mechanism driving the development of self-oscillations. It is shown that there are two regimes of defect density evolution under stationary irradiation. In the first, defects simply accumulate: for some system parameters, the defect density grows monotonically and tends to its stationary state. In the second, which occurs for suitable parameters, self-oscillations of the defect density develop. The stationary state, its stability, and its type are found. The bifurcation values of the parameters (environment temperature, defect generation rate, etc.) are obtained. The frequency of the self-oscillations and the conditions for their development are found and estimated. It is shown that the defect density, heat fluxes, and temperature during self-oscillations can reach much higher values than the expected steady-state values. This can lead to a departure from normal operation and to accidents, e.g., in nuclear equipment.
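To fix notation, a schematic pair of coupled rate equations with thermo-concentration feedback can be written as follows; this generic form is given only for orientation and may differ from the model actually analyzed in the paper.

$$\frac{dN}{dt} = K - R(N, T), \qquad c\,\frac{dT}{dt} = Q(N, T) - \kappa\,\bigl(T - T_{\mathrm{env}}\bigr),$$

where $N$ is the defect density, $K$ the defect generation rate, $R(N,T)$ a temperature-dependent recombination/annealing term, $T$ the local temperature, $Q(N,T)$ the heat released by defect reactions, $c$ the heat capacity, and $\kappa$ the heat transfer coefficient to an environment at temperature $T_{\mathrm{env}}$. Self-oscillations appear when the feedback between $N$ and $T$ destabilizes the stationary point of this system (typically a Hopf-type bifurcation).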

A Hybrid Search Algorithm for Solving Constraint Satisfaction Problems

In this paper, we present a hybrid search algorithm for solving constraint satisfaction and optimization problems. The algorithm combines ideas from two basic approaches: complete and incomplete algorithms, also known as systematic search and local search. The characteristics of systematic search and local search methods are complementary; therefore, we have tried to combine the advantages of both approaches in the presented algorithm. The major advantage of the presented algorithm is that it finds partial, sound solutions for complicated problems whose complete solutions cannot be found in a reasonable time. The algorithm's results are compared with those of other algorithms using the well-known n-queens problem.
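A compact illustration of a systematic-plus-local hybrid on n-queens is sketched below in Python: a node-limited backtracking phase builds a (possibly partial) consistent assignment, and a randomized min-conflicts phase repairs and completes it. This shows the general combination idea under our own assumptions, not the specific hybrid proposed in the paper.

```python
import random

# Columns are stored per row; None marks an unassigned row.
def n_conflicts(cols, row, col):
    """Queens in other rows attacking square (row, col)."""
    return sum(1 for r, c in enumerate(cols)
               if r != row and c is not None
               and (c == col or abs(c - col) == abs(r - row)))

def backtrack(cols, row, n, budget):
    """Depth-first (systematic) search that gives up once its node budget is spent."""
    if row == n or budget <= 0:
        return budget
    for col in range(n):
        budget -= 1
        if n_conflicts(cols, row, col) == 0:
            cols[row] = col
            budget = backtrack(cols, row + 1, n, budget)
            if budget <= 0 or all(c is not None for c in cols):
                return budget
            cols[row] = None
    return budget

def min_conflicts(cols, n, max_steps=20000, seed=0):
    """Local-search repair of the assignment left by the systematic phase."""
    rng = random.Random(seed)
    for r in range(n):
        if cols[r] is None:
            cols[r] = rng.randrange(n)
    for _ in range(max_steps):
        conflicted = [r for r in range(n) if n_conflicts(cols, r, cols[r]) > 0]
        if not conflicted:
            return cols
        r = rng.choice(conflicted)
        best = min(n_conflicts(cols, r, c) for c in range(n))
        cols[r] = rng.choice([c for c in range(n) if n_conflicts(cols, r, c) == best])
    return None

n = 30
cols = [None] * n
backtrack(cols, 0, n, budget=2000)   # systematic phase (may stop early)
print(min_conflicts(cols, n))        # local-search phase finishes the job
```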

Designing a Framework for Network Security Protection

As the Internet continues to grow at a rapid pace as the primary medium for communications and commerce, and as telecommunication networks and systems continue to expand their global reach, digital information has become the most popular and important information resource, and our dependence upon the underlying cyber infrastructure has increased significantly. Unfortunately, as our dependency has grown, so has the threat to the cyber infrastructure from spammers, attackers, and criminal enterprises. In this paper, we propose a new machine-learning-based network intrusion detection framework for cyber security. The detection process of the framework consists of two stages: model construction and intrusion detection. In the model construction stage, a semi-supervised machine learning algorithm is applied to a collected set of network audit data to generate a profile of normal network behavior. In the intrusion detection stage, input network events are analyzed and compared with the patterns gathered in the profile, and events are flagged as anomalies if they are sufficiently far from the expected normal behavior. The proposed framework is particularly applicable to situations where only a small amount of labeled network training data is available, which is very typical in real-world network environments.
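A minimal anomaly-detection sketch in the spirit of this two-stage process is shown below in Python: a profile is built from (mostly unlabeled) normal audit records, and events are flagged when they lie too far from it. A Mahalanobis-distance profile and synthetic feature vectors are used here as stand-ins for the paper's semi-supervised learning algorithm and real network audit data.

```python
import numpy as np

# Stage 1: build a profile of normal behavior from synthetic audit records
# (three hypothetical features per event, e.g. bytes, error rate, duration).
rng = np.random.default_rng(0)
normal = rng.normal(loc=[100.0, 0.2, 50.0], scale=[10.0, 0.05, 5.0], size=(500, 3))
mean = normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

def mahalanobis(x):
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

# Stage 2: compare incoming events with the profile and flag outliers.
events = np.vstack([normal[:5], [[400.0, 0.9, 5.0]]])   # last row: injected attack-like event
threshold = 4.0                                         # distance beyond which events are anomalies
for x in events:
    flag = "ANOMALY" if mahalanobis(x) > threshold else "normal"
    print(np.round(x, 2), f"distance={mahalanobis(x):.1f}", flag)
```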