Pontrjagin Duality and Codes over Finite Commutative Rings

We present linear codes over finite commutative rings which are not necessarily Frobenius. We treat the notion of syndrome decoding using Pontrjagin duality. We also give a version of Delsarte's theorem over rings relating trace codes and subring subcodes.
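
For orientation, the classical finite-field version of Delsarte's theorem, which the paper generalizes to rings, can be stated as follows (this is the standard textbook form, not the paper's ring-theoretic formulation): for a linear code $C \subseteq \mathbb{F}_{q^m}^n$,

    $$\left( C\big|_{\mathbb{F}_q} \right)^{\perp} \;=\; \operatorname{Tr}\left( C^{\perp} \right),$$

where $C|_{\mathbb{F}_q} = C \cap \mathbb{F}_q^n$ is the subfield (subring) subcode and $\operatorname{Tr}$ is the coordinatewise trace map from $\mathbb{F}_{q^m}$ to $\mathbb{F}_q$.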

Development of a Meta Description Language for Software/Hardware Cooperative Design and Verification for Model-Checking Systems

Model-checking tools such as the Symbolic Model Verifier (SMV) and NuSMV are available for checking hardware designs. These tools can automatically check the formal correctness of a design. However, NuSMV is too low-level for describing a complete hardware design. It is therefore necessary to translate the system definition, as designed in a language such as Verilog or VHDL, into a language such as NuSMV for validation. In this paper, we present Melasy, a meta hardware description language with code generators for existing hardware description languages (HDLs) and for model-checking languages, which solves this problem.

Numerical Analysis on Rapid Decompression in Conventional Dry Gases using One-Dimensional Mathematical Modeling

The paper presents a one-dimensional transient mathematical model of compressible thermal multi-component gas mixture flows in pipes. The set of mass, momentum, and enthalpy conservation equations for the gas phase is solved. The thermo-physical properties of the multi-component gas mixture are calculated by solving an Equation of State (EOS) model; the Soave-Redlich-Kwong (SRK) EOS is chosen. The gas mixture viscosity is calculated on the basis of the Lee-Gonzalez-Eakin (LGE) correlation. Numerical analysis of rapid decompression in conventional dry gases is performed using the proposed mathematical model. The model is validated against measured values of the decompression wave speed in dry natural gas mixtures. All predictions show excellent agreement with the experimental data at both high and low pressure. The presented model predicts the decompression in dry natural gas mixtures much better than the GASDECOM and OLGA codes, which are the most frequently used codes in the oil and gas pipeline transport service.
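
For reference, the SRK EOS used for the thermo-physical properties has the standard form (component parameters shown; mixture values follow from the usual mixing rules):

    $$p = \frac{RT}{v - b} - \frac{a\,\alpha(T)}{v(v + b)}, \qquad a = 0.42748\,\frac{R^2 T_c^2}{p_c}, \qquad b = 0.08664\,\frac{R T_c}{p_c},$$

with $\alpha(T) = \bigl[1 + m\bigl(1 - \sqrt{T/T_c}\bigr)\bigr]^2$ and $m = 0.480 + 1.574\,\omega - 0.176\,\omega^2$, where $T_c$, $p_c$, and $\omega$ are the critical temperature, critical pressure, and acentric factor of the component.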

Unequal Error Protection of Facial Features for Personal ID Image Coding

This paper presents an approach for unequal error protection of facial features in personal ID image coding. We consider unequal error protection (UEP) strategies for the efficient progressive transmission of embedded image codes over noisy channels. The new method is based on the progressive embedded zerotree wavelet (EZW) image compression algorithm and a UEP technique with a defined region of interest (ROI). In this case, the ROI corresponds to the facial features within the personal ID image. The ROI technique is important in applications where different parts of the image have different importance. In ROI coding, a chosen ROI is encoded with higher quality than the background (BG). Unequal error protection of the image is provided by different coding techniques and by encoding the LL band separately. In our proposed method, the image is divided into two parts (ROI and BG), which consist of more important bytes (MIB) and less important bytes (LIB), respectively. The proposed unequal error protection of image transmission has been shown to be more appropriate for low-bit-rate applications, producing better-quality output for the ROI of the compressed image. The experimental results verify the effectiveness of the design and compare the UEP of image transmission, with the ROI defined as the facial features, against equal error protection (EEP) over an additive white Gaussian noise (AWGN) channel.
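
The MIB/LIB allocation idea can be sketched as follows; this is a minimal Python illustration, and the code rates and the MIB test below are assumptions for the example, not the paper's actual parameters.

    # Assign a stronger channel code to more important bytes (ROI and the
    # separately coded LL band) and a weaker one to background bytes.
    MIB_RATE = 1 / 3   # heavier protection for MIB (assumed rate)
    LIB_RATE = 2 / 3   # lighter protection for LIB (assumed rate)

    def protection_plan(stream):
        """stream: iterable of (byte, is_mib) pairs; is_mib marks bytes
        belonging to the ROI or the LL band."""
        return [(byte, MIB_RATE if is_mib else LIB_RATE)
                for byte, is_mib in stream]

    def channel_bits(plan):
        # a byte protected at code rate r occupies 8 / r channel bits
        return sum(8 / rate for _, rate in plan)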

Identification of Individual Objects at the Intelligent Assembly Cell

This contribution presents a complete design for the identification of individual objects in the workplace of an intelligent assembly cell. The intelligent assembly cell is situated at the Institute of Manufacturing Systems and Applied Mechanics and is used for pneumatic actuator assembly. The pneumatic actuator components are a pneumatic roller, a cover, a piston, and a spring. Two alternatives for identifying objects for assembly are designed in the workplace of the industrial robot. The contribution evaluates the alternatives and selects the suitable one: a 2D-code reader. The complete design of individual object identification builds on knowledge of intelligent manufacturing systems. Intelligent assembly and manufacturing systems, as systems of a new generation, are gradually being introduced into mechanical production; they remove human operations from the production process and also shorten production times.

Efficient Web-Learning Collision Detection Tool for Five-Axis Machines

As networking has become popular, Web-based learning has become a trend in tool design. Moreover, five-axis machining has recently been widely used in industry; however, it has potential axis/table collision problems. This paper therefore proposes an efficient Web-based collision detection learning tool for five-axis machining. Because collision detection consumes heavy resources that few devices can support, this research uses a systematic, Web-based approach to detect collisions. The methodologies include kinematics analyses of five-axis motions, the separating-axis method for collision detection, and computer simulation for verification. The machine structure is modeled in STL format in CAD software. The input to the detection system is the G-code part program, which describes the tool motions that produce the part surface. This research produced a simulation program in the C programming language and demonstrated a five-axis machining example with collision detection on a web site. The system simulates the five-axis CNC motion along the tool trajectory, detects any collisions according to the input G-codes, and supports a high-performance web service thanks to the efficiency of C. The results show that our method improves computational efficiency by a factor of 4.5 compared with the conventional detection method.
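
Since the separating-axis method carries the core of the detection logic, a minimal two-dimensional Python sketch of it may help; the paper applies the method to 3D machine components, and the polygons below are only illustrative.

    # Separating-axis test for two convex polygons: if the projections onto
    # some edge normal do not overlap, the polygons do not collide.
    def edge_normals(poly):
        n = len(poly)
        for i in range(n):
            (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
            yield (-(y2 - y1), x2 - x1)   # perpendicular to the edge

    def project(poly, axis):
        dots = [x * axis[0] + y * axis[1] for x, y in poly]
        return min(dots), max(dots)

    def collide(a, b):
        for axis in list(edge_normals(a)) + list(edge_normals(b)):
            min_a, max_a = project(a, axis)
            min_b, max_b = project(b, axis)
            if max_a < min_b or max_b < min_a:
                return False   # separating axis found: no collision
        return True            # no separating axis: collision

    square = [(0, 0), (2, 0), (2, 2), (0, 2)]
    assert collide(square, [(1, 1), (3, 1), (3, 3)])
    assert not collide(square, [(5, 5), (6, 5), (6, 6)])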

Low Complexity Regular LDPC Codes for Magnetic Storage Devices

LDPC codes could be used in magnetic storage devices because of their better decoding performance compared with other error correction codes. However, their hardware implementation results in large and complex decoders. This is one of the main obstacles to incorporating these decoders in magnetic storage devices. We construct small, high-girth column-weight-2 codes from cage graphs. Although these codes perform worse than higher-column-weight codes, they are easier to implement. This ease of implementation makes them more suitable for applications such as magnetic recording. Cages are the smallest known regular graphs of given degree and girth, and they therefore give us the smallest known column-weight-2 codes for a given size, girth, and rate of the code.
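
The construction can be illustrated with the (3,5)-cage, the Petersen graph: taking the parity-check matrix H to be the vertex-edge incidence matrix of the cage gives a code whose columns all have weight exactly 2. A minimal Python sketch (using the standard labeling of the Petersen graph):

    petersen_edges = (
        [(i, (i + 1) % 5) for i in range(5)] +          # outer 5-cycle
        [(i, i + 5) for i in range(5)] +                # spokes
        [(5 + i, 5 + (i + 2) % 5) for i in range(5)]    # inner pentagram
    )

    n_vertices, n_edges = 10, len(petersen_edges)
    H = [[0] * n_edges for _ in range(n_vertices)]
    for col, (u, v) in enumerate(petersen_edges):
        H[u][col] = H[v][col] = 1                       # one edge per column

    assert all(sum(H[r][c] for r in range(n_vertices)) == 2
               for c in range(n_edges))                 # column weight 2
    assert all(sum(row) == 3 for row in H)              # 3-regular cage

A graph of girth g yields a Tanner graph of girth 2g, so the girth-5 Petersen graph gives a column-weight-2 code with Tanner-graph girth 10.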

Improving Image Quality in Remote Sensing Satellites using Channel Coding

Among the factors that characterize satellite communication channels is their high bit error rate. We present a system for still image transmission over noisy satellite channels. The system couples image compression with error control coding to improve the received image quality while maintaining the bandwidth requirements. The proposed system is tested using high-resolution satellite imagery simulated over a Rician fading channel. Evaluation results show an improvement in the overall system, including image quality and bandwidth requirements, compared with similar systems using different coding schemes.
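
As a rough illustration of the channel model, the following Python/NumPy sketch passes BPSK symbols through a Rician fading channel; the K-factor, SNR, and detection scheme are assumptions for the example rather than the paper's settings.

    import numpy as np

    rng = np.random.default_rng(0)
    K = 5.0                    # assumed Rician K-factor (LOS-to-scatter ratio)
    n, ebn0_db = 10_000, 6.0   # assumed block size and Eb/N0

    bits = rng.integers(0, 2, n)
    x = 1.0 - 2.0 * bits       # BPSK mapping: 0 -> +1, 1 -> -1

    # Rician gain: deterministic line-of-sight part plus Rayleigh scatter
    los = np.sqrt(K / (K + 1))
    scat = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
    h = los + np.sqrt(1 / (K + 1)) * scat   # normalized so E[|h|^2] = 1

    sigma = np.sqrt(1 / (2 * 10 ** (ebn0_db / 10)))
    y = h * x + sigma * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

    bits_hat = (np.real(np.conj(h) * y) < 0).astype(int)  # coherent detection
    print("uncoded BER:", np.mean(bits_hat != bits))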

Evolving a Fuzzy Rule-Base for Image Segmentation

A new method for color image segmentation using fuzzy logic is proposed in this paper. Our aim is to automatically produce a fuzzy system for color classification and image segmentation with the least number of rules and the minimum error rate. Particle swarm optimization is a subclass of evolutionary algorithms inspired by the social behavior of fish, bees, birds, and other creatures that live together in colonies. We use the comprehensive learning particle swarm optimization (CLPSO) technique to find optimal fuzzy rules and membership functions because it discourages premature convergence. Here, each particle of the swarm encodes a set of fuzzy rules. During evolution, a population member tries to maximize a fitness criterion, which here is a high classification rate and a small number of rules. Finally, the particle with the highest fitness value is selected as the best set of fuzzy rules for image segmentation. Our results using this method for soccer-field image segmentation in RoboCup contests show 89% performance. Less computational load is needed with this method than with other methods such as ANFIS, because it generates a smaller number of fuzzy rules. The large and varied training dataset makes the proposed method invariant to illumination noise.
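
For reference, CLPSO's defining update (from Liang et al.'s formulation, independent of the fuzzy-rule encoding used here) replaces the global-best term of standard PSO with a per-dimension exemplar:

    $$v_i^d \leftarrow w\,v_i^d + c\,r_i^d\left(\mathit{pbest}_{f_i(d)}^d - x_i^d\right), \qquad x_i^d \leftarrow x_i^d + v_i^d,$$

where $f_i(d)$ selects, dimension by dimension, which particle's personal best particle $i$ learns from; since no single global best drives every dimension, premature convergence is discouraged.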

Effect of U-Turn in Reinforced Concrete Dog-Legged Stair Slabs

Reinforced concrete stair slabs with mid landings, i.e., dog-legged slabs, are conventionally designed per the specifications of standard codes of practice, which prescribe the effective span according to the varying support conditions. Here, the behavior of such slabs has been investigated using the finite element method. A single-flight stair slab with landings on both sides, supported at the ends on walls, and a multi-flight stair slab with landings and six different support arrangements have been analyzed. The results obtained for stresses, strains, and deflections are used to describe the behavior of such stair slabs, including the locations of critical moments and deflections. Values of critical moments obtained by the F.E. analysis have also been compared with those obtained from conventional analysis. The analytical results show that the moments are also critical near the kinks, i.e., the junctions of the mid-landing and the inclined waist slab. This change in the behavior of the dog-legged stair slab may be due to the continuity of the material in the transverse direction in the two landings adjoining the waist slab, and hence the additional stiffness achieved. This change in behavior is generally not accounted for in the conventional method of design.

On the Use of Pseudo-Noise Code Sequences of Different Lengths to Reduce Interference between Users of a CDMA Network

The third generation (3G) of cellular systems adopted spread spectrum as the solution for data transmission in the physical layer. Unlike IS-95 or CDMAOne (the spread-spectrum systems of the preceding generation), the new standard, called the Universal Mobile Telecommunications System (UMTS), uses long codes on the downlink. The system is designed for both voice communication and data transmission. The downlink is particularly important because of the asymmetric demand for data, i.e., more downloading toward the mobiles than uploading toward the base station. Moreover, UMTS uses orthogonal spreading with a variable spreading factor (OVSF, for Orthogonal Variable Spreading Factor) on the downlink. This characteristic makes it possible to increase the data rate of one or more users by reducing their spreading factor without changing the spreading factors of the other users. In the current UMTS standard, two techniques have been proposed to increase downlink performance: transmit antenna diversity and space-time codes. These two techniques combat only fading. The receiver proposed for the mobile station is the RAKE receiver, but one can imagine a more sophisticated receiver, able to reduce multi-user interference and the impact of colored noise and narrowband interference. In this context, where the users have synchronized long codes with variable spreading factors and the mobile is unaware of the other active codes/users, the use of pseudo-noise code sequences of different lengths is presented as one of the most appropriate solutions.
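
The OVSF construction itself is a simple binary tree: each code c of spreading factor SF spawns the two codes (c, c) and (c, -c) of spreading factor 2SF, and all codes on one level are mutually orthogonal. A minimal Python sketch:

    def ovsf_codes(sf):
        """All OVSF codes of spreading factor sf (a power of two)."""
        codes = [[1]]
        while len(codes[0]) < sf:
            next_level = []
            for c in codes:
                next_level.append(c + c)                # child (c, c)
                next_level.append(c + [-x for x in c])  # child (c, -c)
            codes = next_level
        return codes

    # codes of the same spreading factor are mutually orthogonal
    c8 = ovsf_codes(8)
    assert all(sum(a * b for a, b in zip(c8[i], c8[j])) == 0
               for i in range(8) for j in range(8) if i != j)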

A Critical Survey of Reusability Aspects for Component-Based Systems

The last decade has shown that the object-oriented concept by itself is not powerful enough to cope with the rapidly changing requirements of ongoing applications. Component-based systems achieve flexibility by clearly separating the stable parts of systems (i.e., the components) from the specification of their composition. To realize the reuse of components effectively in component-based software development (CBSD), it is necessary to measure the reusability of components. However, due to the black-box nature of components, whose source code is not available, it is difficult to use conventional metrics in component-based development, as these metrics require analysis of source code. In this paper, we survey a few existing component-based reusability metrics. These metrics give a broader view of a component's understandability, adaptability, and portability. The paper also describes the analysis, in terms of quality factors related to reusability, contained in an approach that aids significantly in assessing existing components for reusability.

Flexural Strength and Ductility Improvement of NSC Beams

To calculate the flexural strength of normal-strength concrete (NSC) beams, the nonlinear actual concrete stress distribution within the compression zone is normally replaced by an equivalent rectangular stress block, with two coefficients, α and β, regulating the intensity and depth of the equivalent stress, respectively. For NSC beam design, α and β are usually assumed constant, at 0.85 and 0.80, in reinforced concrete (RC) codes. An earlier investigation by the authors showed that α is not a constant but is significantly affected by the flexural strain gradient, increasing with the strain gradient up to a maximum value. This indicates that a larger concrete stress can be developed in flexure than is stipulated by design codes. As an extension and application of the authors' previous study, the modified equivalent concrete stress block is used here to produce a series of design charts showing the maximum design limits of flexural strength and ductility of singly and doubly reinforced NSC beams, through which both strength and ductility design limits are improved by taking the strain gradient effect into account.
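
With the rectangular stress block, force equilibrium and the moment capacity of a singly reinforced section take the standard form

    $$\alpha f_c'\,\beta c\,b = A_s f_y, \qquad M_n = A_s f_y \left(d - \frac{\beta c}{2}\right),$$

where $c$ is the neutral-axis depth, $b$ the section width, $d$ the effective depth, and $A_s f_y$ the steel tension force; an $\alpha$ that grows with strain gradient thus directly raises the admissible compression force and the resulting design limits.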

Attribute Weighted Class Complexity: A New Metric for Measuring Cognitive Complexity of OO Systems

In general, class complexity is measured based on one of several factors, such as lines of code (LOC), function points (FP), number of methods (NOM), number of attributes (NOA), and so on. Researchers have developed several new techniques, methods, and metrics based on different factors for calculating class complexity in object-oriented (OO) software. Earlier, Arockiam et al. proposed a complexity measure called Extended Weighted Class Complexity (EWCC), an extension of the Weighted Class Complexity proposed by Mishra et al. EWCC is the sum of the cognitive weights of the attributes and methods of a class and of the classes it derives. In EWCC, the cognitive weight of each attribute is taken to be 1. The main problem with the EWCC metric is that every attribute carries the same weight, whereas in general the cognitive load of understanding different types of attributes is not the same. We therefore propose a new metric, Attribute Weighted Class Complexity (AWCC), in which cognitive weights are assigned to attributes based on the effort needed to understand their data types. Case studies and experiments show that the proposed metric is a better measure of the complexity of classes with attributes.
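
A minimal Python sketch of how an AWCC-style score could be computed; the per-type weights below are illustrative placeholders, since the actual cognitive weights are defined in the paper.

    TYPE_WEIGHTS = {   # assumed effort to understand each data type
        "int": 1, "float": 1, "str": 2, "list": 3, "dict": 4, "object": 5,
    }

    def awcc(attribute_types, method_weights):
        """attribute_types: data-type names of the class's attributes;
        method_weights: cognitive weights of its methods (as in WCC/EWCC)."""
        attr_part = sum(TYPE_WEIGHTS.get(t, 5) for t in attribute_types)
        return attr_part + sum(method_weights)

    # a class with an int, a dict, and two methods of weights 2 and 3
    print(awcc(["int", "dict"], [2, 3]))   # -> 10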

Computational Prediction of Complicated Atmospheric Motion for Spinning or Non-Spinning Projectiles

A full six-degrees-of-freedom (6-DOF) flight dynamics model is proposed for the accurate prediction of short- and long-range trajectories of high-spin and fin-stabilized projectiles in atmospheric flight up to the final impact point. The projectile is assumed to be rigid (non-flexible) and rotationally symmetric about its spin axis, and is launched at low and high pitch angles. The mathematical model is based on the full equations of motion, set up in the no-roll body reference frame, and is integrated numerically from given initial conditions at the firing site. The projectile's maneuvering motion depends on the most significant force and moment variations, in addition to wind and gravity. The computational flight analysis takes into consideration Mach number and total angle-of-attack effects by means of variable aerodynamic coefficients. For the purposes of the present work, linear interpolation has been applied to the tabulated database of McCoy's book. The developed computational method gives satisfactory agreement with published data from verified experiments and computational codes on atmospheric projectile trajectory analysis for various initial firing conditions.
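
The generic rigid-body form of such a model, written in a body-fixed (here, no-roll) frame, is

    $$m\left(\dot{\mathbf{V}} + \boldsymbol{\omega}\times\mathbf{V}\right) = \mathbf{F}, \qquad \mathbf{I}\,\dot{\boldsymbol{\omega}} + \boldsymbol{\omega}\times\left(\mathbf{I}\,\boldsymbol{\omega}\right) = \mathbf{M},$$

where $\mathbf{V}$ and $\boldsymbol{\omega}$ are the translational and angular velocities and $\mathbf{I}$ is the inertia tensor; the aerodynamic, gravity, and wind contributions enter through the total force $\mathbf{F}$ and moment $\mathbf{M}$, and the paper's Mach- and angle-of-attack-dependent coefficients act inside $\mathbf{F}$ and $\mathbf{M}$.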

Recycling-Oriented Product Assessment during the Design Process Using Agent Technology

This paper describes a method of product analysis from the recycling point of view. The analysis is based on a set of measures that assess a product from the point of view of the final stages of its lifecycle. It is assumed that such an analysis will be performed at the design phase; in order to conduct it, a computer system that aids the designer during the design process has been developed. The structure of the computer tool, based on agent technology, and example results are also included in the paper.

Error Correction Codes in Wireless Sensor Networks: An Energy-Aware Approach

Link reliability and transmitted power are two important constraints in wireless network design. Error control coding (ECC) is a classic approach used to increase link reliability and to lower the required transmitted power. It provides coding gain, resulting in transmitter energy savings at the cost of added decoder power consumption. However, the choice of ECC is critical in the case of wireless sensor networks (WSNs). Since WSNs are energy-constrained in nature, both the BER and the power consumption have to be taken into account. This paper develops a step-by-step approach to finding suitable error control codes for WSNs. Several simulations are performed with different error control codes, and the results show that RS(31,21) fits both the BER and the power consumption criteria.
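
For orientation, the following Python lines recover what RS(31,21) provides; these are standard Reed-Solomon facts, independent of the paper's simulation setup.

    m = 5              # symbol size in bits: the code lives over GF(2**m)
    n = 2**m - 1       # block length in symbols -> 31
    k = 21             # information symbols
    t = (n - k) // 2   # correctable symbol errors per block -> 5
    print(f"RS({n},{k}): corrects {t} symbol errors, "
          f"rate {k/n:.3f}, {(n - k) * m} parity bits per block")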

Array Data Transformation for Source Code Obfuscation

Obfuscation is a low-cost software protection methodology intended to hinder reverse engineering and re-engineering of applications. Source code obfuscation aims at obscuring the source code to hide the functionality of the code. This paper proposes an array data transformation to obfuscate source code that uses arrays. Applications using the proposed data structures force the logic to be obscured without manual effort by the programmer. This makes the resulting obfuscated code hard to reverse engineer and also protects the functionality of the code.
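
One common form of array data transformation (shown here as a minimal Python sketch, not necessarily the paper's exact scheme) remaps the logical index through a modular permutation and masks the stored values, while preserving the observable behavior:

    N, A, B, MASK = 16, 5, 7, 0x5A   # illustrative parameters; gcd(A, N) = 1

    class ObfuscatedArray:
        def __init__(self):
            self._data = [0] * N

        def _loc(self, i):
            return (A * i + B) % N                    # index permutation

        def __setitem__(self, i, value):
            self._data[self._loc(i)] = value ^ MASK   # value masking

        def __getitem__(self, i):
            return self._data[self._loc(i)] ^ MASK

    a = ObfuscatedArray()
    a[3] = 42
    assert a[3] == 42   # behavior preserved; layout and contents are not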

Performance Evaluation of Wavelet-Based Coders on Brain MRI Volumetric Medical Datasets for Storage and Wireless Transmission

In this paper, we evaluate the performance of several wavelet-based coding algorithms: 3D QT-L, 3D SPIHT, and JPEG2K. In the first step, we perform an objective comparison of the three coders, namely 3D SPIHT, 3D QT-L, and JPEG2K. For this purpose, eight MRI head-scan test sets of 256 × 256 × 124 voxels have been used. The results show the superior performance of the 3D SPIHT algorithm, while 3D QT-L outperforms JPEG2K. The second step consists of evaluating the robustness of the 3D SPIHT and JPEG2K coding algorithms over wireless transmission. The compressed datasets are transmitted over an AWGN or a Rayleigh wireless channel. The results show the superiority of JPEG2K over these two channel models; indeed, JPEG2K proves more robust to coding errors. We therefore conclude that error-correcting codes are necessary to protect the transmitted medical information.

Application of Biometrics to Obtain High Entropy Cryptographic Keys

In this paper, a two-factor scheme is proposed to generate cryptographic keys directly from biometric data, which, unlike passwords, are strongly bound to the user. The hash value of the reference iris code is used as the cryptographic key, and its length depends only on the hash function, being independent of any other parameter. The entropy of such keys is 94 bits, which is much higher than that of any comparable system. The most important and distinctive feature of this scheme is that it regenerates the reference iris code given a genuine iris sample and the correct user password. Since iris codes obtained from two images of the same eye are not exactly the same, error-correcting codes (a Hadamard code and a Reed-Solomon code) are used to deal with the variability. The scheme proposed here can be used to provide keys for a cryptographic system and/or for user authentication. The performance of this system is evaluated on two publicly available iris biometrics databases, namely the CBS and ICE databases. The operating point of the system (the values of the False Acceptance Rate (FAR) and the False Rejection Rate (FRR)) can be set by properly selecting the error correction capacity (ts) of the Reed-Solomon codes; e.g., on the ICE database, at ts = 15, the FAR is 0.096% and the FRR is 0.76%.
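
The key-binding idea can be sketched as a fuzzy commitment; in the toy Python below, a 3x repetition code stands in for the paper's Hadamard/Reed-Solomon cascade, the password factor is omitted, and every parameter is illustrative.

    import hashlib, secrets

    def rep_encode(bits):   # toy stand-in for Hadamard/RS encoding
        return [b for b in bits for _ in range(3)]

    def rep_decode(bits):   # majority vote corrects isolated bit flips
        return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

    def xor(a, b):
        return [x ^ y for x, y in zip(a, b)]

    # Enrollment: bind a random secret to the reference iris code.
    secret = [secrets.randbelow(2) for _ in range(64)]
    reference_iris = [secrets.randbelow(2) for _ in range(192)]
    lock = xor(rep_encode(secret), reference_iris)    # stored helper data
    key = hashlib.sha256(bytes(reference_iris)).digest()

    # Verification: a slightly noisy genuine sample regenerates the
    # reference iris code, and hence the same key.
    noisy = list(reference_iris)
    noisy[10] ^= 1                                    # one flipped bit
    recovered = rep_decode(xor(lock, noisy))
    regenerated = xor(lock, rep_encode(recovered))
    assert hashlib.sha256(bytes(regenerated)).digest() == key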