Parallel Joint Channel Coding and Cryptography

The method of Parallel Joint Channel Coding and Cryptography is analyzed and simulated in this paper. The method is an extension of Soft Input Decryption with feedback, which is used to improve the channel decoding of secured messages. Parallel Joint Channel Coding and Cryptography improves the coding gain of channel decoding by more than 2 dB. These results follow from the combination of receiver components and their interoperability.

Turbo-Coded Mobile Terrestrial Communication Systems in Urban and Suburban Areas for Wireless Multimedia Applications

With the rapid popularization of internet services, it is apparent that next generation terrestrial communication systems must be capable of supporting various applications like voice, video, and data. This paper presents the performance evaluation of turbo-coded mobile terrestrial communication systems, which are capable of providing high quality services for delay-sensitive (voice or video) and delay-tolerant (text transmission) multimedia applications in urban and suburban areas. Different types of multimedia information require different service qualities, which are generally expressed in terms of a maximum acceptable bit-error-rate (BER) and a maximum tolerable latency. The breakthrough discovery of turbo codes allows us to significantly reduce the probability of bit errors with feasible latency. In a turbo-coded system, a trade-off between latency and BER results from the choice of convolutional component codes, interleaver type and size, decoding algorithm, and the number of decoding iterations. This trade-off can be exploited for multimedia applications by using optimal and suboptimal combinations of performance parameters to achieve different service qualities. The results therefore suggest an adaptive framework for turbo-coded wireless multimedia communications, which incorporates a set of performance parameters that achieve an appropriate set of service qualities depending on the application's requirements.
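
As a hedged illustration of such an adaptive framework, the sketch below maps application classes to turbo decoder parameter sets; all names and values are illustrative placeholders, not the paper's measured operating points.

```python
# Hedged sketch: select turbo decoder parameters per QoS class.
# The concrete values below are illustrative, not the paper's results.
PROFILES = {
    "voice": {"iterations": 2, "interleaver": 256,  "note": "low latency, higher BER"},
    "video": {"iterations": 4, "interleaver": 1024, "note": "moderate latency"},
    "data":  {"iterations": 8, "interleaver": 4096, "note": "high latency, lowest BER"},
}

def select_profile(application):
    """Return the decoder parameter set matching the application's QoS needs."""
    return PROFILES[application]

print(select_profile("voice"))
```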

Codebook Generation for Vector Quantization on Orthogonal Polynomials based Transform Coding

In this paper, a new codebook generation algorithm is proposed for vector quantization (VQ) in image coding. The significant features of the training image vectors are extracted using the proposed Orthogonal Polynomials based transformation. We propose to generate the codebook by partitioning these feature vectors into a binary tree. Each feature vector at a non-terminal node of the binary tree is directed to one of the two descendants by comparing a single feature associated with that node to a threshold, as sketched below. The binary tree codebook is used for encoding and decoding the feature vectors. In the decoding process the feature vectors are subjected to the inverse transformation with the help of the basis functions of the proposed Orthogonal Polynomials based transformation to recover the approximated input image training vectors. The results of the proposed coding are compared with VQ using the Discrete Cosine Transform (DCT) and the Pairwise Nearest Neighbor (PNN) algorithm. The new algorithm results in a considerable reduction in computation time and provides better reconstructed picture quality.
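
The tree-structured search described above can be sketched as follows; the toy two-level tree and its thresholds are illustrative stand-ins for a codebook trained on real feature vectors.

```python
import numpy as np

# Hedged sketch of tree-structured codebook search: at each internal node a
# single feature is compared to a threshold. This hand-built tree is purely
# illustrative, not a trained codebook.
tree = {
    "feature": 0, "threshold": 0.5,
    "left":  {"feature": 1, "threshold": 0.2,
              "left": {"index": 0}, "right": {"index": 1}},
    "right": {"index": 2},
}

def encode(vector, node):
    """Descend the binary tree to a leaf; return its codeword index."""
    while "index" not in node:
        branch = "left" if vector[node["feature"]] <= node["threshold"] else "right"
        node = node[branch]
    return node["index"]

print(encode(np.array([0.3, 0.9]), tree))   # feature 0 <= 0.5, feature 1 > 0.2 -> 1
print(encode(np.array([0.8, 0.0]), tree))   # feature 0 > 0.5 -> 2
```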

A Multi-Signature Scheme based on Coding Theory

In this paper we propose the first two non-generic constructions of multisignature schemes based on coding theory. The first scheme makes use of the CFS signature scheme and is secure in the random oracle model, while the second is based on the KKS construction and is a few-times signature scheme. The security of our constructions relies on a difficult problem in coding theory: the Syndrome Decoding problem, which has been proved NP-complete [4].
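
A minimal sketch of the underlying Syndrome Decoding problem, using an illustrative [7,4] parity-check matrix: finding a low-weight error pattern with a given syndrome requires, in general, an exponential search, which is the hardness such schemes rely on.

```python
import itertools
import numpy as np

# Toy parity-check matrix H of a [7,4] Hamming code (illustrative only).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

def syndrome(H, e):
    """Syndrome s = H e^T over GF(2)."""
    return H.dot(e) % 2

def syndrome_decode(H, s, w):
    """Brute-force search for an error pattern of weight <= w with syndrome s.
    Exponential in general: this hardness underpins the schemes' security."""
    n = H.shape[1]
    for weight in range(w + 1):
        for support in itertools.combinations(range(n), weight):
            e = np.zeros(n, dtype=np.uint8)
            e[list(support)] = 1
            if np.array_equal(syndrome(H, e), s):
                return e
    return None

e = np.array([0, 0, 0, 0, 1, 0, 0], dtype=np.uint8)  # single error
s = syndrome(H, e)
print(syndrome_decode(H, s, 1))  # recovers the weight-1 error pattern
```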

H-ARQ Techniques for Wireless Systems with Punctured Non-Binary LDPC as FEC Code

This paper presents a comparison of H-ARQ techniques for OFDM systems with a new family of non-binary LDPC (NB-LDPC) codes developed within the EU FP7 DAVINCI project. The punctured NB-LDPC codes have been used in a simulated model of the transmission system. The link-level performance has been evaluated in terms of spectral efficiency, codeword error rate and average number of retransmissions. The NB-LDPC codes can be easily and effectively implemented with different retransmission methods that are invoked when correct decoding of a codeword fails. Here the Optimal Symbol Selection method is proposed as a Chase Combining technique.
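
A hedged sketch of the Chase Combining idea the scheme builds on: soft information (LLRs) from each retransmission of the same codeword is accumulated before re-decoding. BPSK over AWGN and all parameters below are illustrative simplifications, not the paper's NB-LDPC setup.

```python
import numpy as np

# Hedged sketch: Chase Combining accumulates per-bit LLRs across
# retransmissions of the same codeword before the decoder runs again.
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 16)          # illustrative codeword bits
tx = 1 - 2.0 * bits                    # BPSK: 0 -> +1, 1 -> -1

def receive(tx, snr_db):
    """One noisy reception; returns per-bit LLRs (AWGN channel, BPSK)."""
    sigma2 = 10 ** (-snr_db / 10)
    y = tx + rng.normal(0, np.sqrt(sigma2), tx.size)
    return 2 * y / sigma2

combined = np.zeros_like(tx)
for attempt in range(3):               # initial transmission + 2 retransmissions
    combined += receive(tx, snr_db=0)  # combining raises the effective SNR
    decided = (combined < 0).astype(int)
    print(f"attempt {attempt}: bit errors = {np.sum(decided != bits)}")
```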

2D Bar Codes Reading: Solutions for Camera Phones

Two-dimensional (2D) bar codes were designed to carry significantly more data, with higher information density and robustness, than their 1D counterparts. Thanks to the popular combination of cameras and mobile phones, using the camera phone for 2D bar code reading naturally brings great commercial value. This paper addresses the problem of 2D bar code design specifically for mobile phones and introduces a low-level encoding method for matrix codes. At the same time, we propose an efficient scheme for decoding 2D bar codes, with the effort put on overcoming the difficulties introduced by the low image quality that is very common in bar code images taken by phone cameras.

Performance Analysis of HSDPA Systems using Low-Density Parity-Check (LDPC) Coding as Compared to Turbo Coding

HSDPA is a new feature introduced in the Release-5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. Moreover, the HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay compared to the Release-99 DSCH. To date, HSDPA systems use turbo coding, a coding technique that approaches the Shannon limit. However, the main drawbacks of turbo coding are high decoding complexity and high latency, which make it unsuitable for some applications such as satellite communications, where the transmission distance itself introduces latency due to the limited speed of light. Hence, in this paper it is proposed to use LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and decoding complexity. LDPC coding does, however, increase the encoding complexity. Though the complexity of the transmitter increases at the Node B, the end user gains in terms of receiver complexity and bit error rate. In this paper the LDPC encoder is implemented using a sparse parity check matrix H to generate a codeword, and the Belief Propagation algorithm is used for LDPC decoding. Simulation results show that in LDPC coding the BER drops sharply as the number of iterations increases with a small increase in Eb/No, which is not possible in turbo coding. The same BER was also achieved using fewer iterations, and hence the latency and receiver complexity are decreased for LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better quality and more reliable and more robust data services. In other words, while realistic data rates are only a few Mbps, the actual quality and number of users achieved will improve significantly.
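
For illustration, the sketch below implements the simpler hard-decision bit-flipping variant of iterative LDPC decoding over a toy sparse parity-check matrix; it is a stand-in for the Belief Propagation decoder used in the paper, not the paper's implementation.

```python
import numpy as np

# Hedged sketch: hard-decision bit-flipping decoding, a simplified stand-in
# for Belief Propagation; H is an illustrative toy sparse matrix.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]], dtype=np.uint8)

def bit_flip_decode(H, r, max_iter=20):
    r = r.copy()
    for _ in range(max_iter):
        s = H.dot(r) % 2                  # syndrome: which checks fail
        if not s.any():
            return r                      # valid codeword found
        fails = s.dot(H)                  # failed checks touching each bit
        r[fails == fails.max()] ^= 1      # flip the worst offender(s)
    return r

codeword = np.zeros(6, dtype=np.uint8)    # all-zero codeword is always valid
received = codeword.copy()
received[2] ^= 1                          # inject a single bit error
print(bit_flip_decode(H, received))       # -> [0 0 0 0 0 0]
```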

Pontrjagin Duality and Codes over Finite Commutative Rings

We present linear codes over finite commutative rings which are not necessarily Frobenius. We treat the notion of syndrome decoding by using Pontrjagin duality. We also give a version of Delsarte's theorem over rings relating trace codes and subring subcodes.
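
For reference, the classical finite-field form of Delsarte's theorem, which this work generalizes to finite commutative rings via Pontrjagin duality, can be stated as follows (a sketch of the standard formulation, not the ring-theoretic version proved in the paper):

```latex
% Delsarte's theorem over finite fields: for a linear code C of length n
% over an extension field E of F,
\[
  \bigl( C\big|_{F} \bigr)^{\perp} \;=\; \operatorname{Tr}_{E/F}\!\bigl( C^{\perp} \bigr),
\]
% i.e. the dual of the subfield subcode of C equals the trace code of the
% dual of C.
```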

Robust Digital Cinema Watermarking

With the advent of digital cinema and digital broadcasting, copyright protection of video data has become one of the most important issues. We present a novel method of watermarking for video image data based on hardware and digital wavelet transform techniques and name it "traceable watermarking", because the watermarked data is constructed before the transmission process and traced after it has been received by an authorized user. In our method, we embed the watermark into the lowest part of each image frame of the decoded video by using a hardware LSI. Digital cinema is an important application for traceable watermarking, since a digital cinema system makes use of watermarking technology during content encoding, encryption, transmission, decoding and all the intermediate processes of the digital cinema system. The watermark is embedded into randomly selected movie frames using hash functions. The embedded watermark information can be extracted from the decoded video data without any need to access the original movie data. Our experimental results show that the proposed traceable watermarking method for digital cinema systems is much better than conventional watermarking techniques in terms of robustness, image quality, speed and simplicity.
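
A hedged sketch of hash-based frame selection: a keyed hash of the frame index decides which frames carry watermark bits, so the extractor can reproduce the selection without the original movie. The key and the selection rate are illustrative assumptions, not the paper's parameters.

```python
import hashlib

# Hedged sketch: a keyed hash deterministically selects watermark-carrying
# frames; key and rate are illustrative placeholders.
KEY = b"demo-key"

def carries_watermark(frame_index, rate=8):
    """True for roughly 1 of every `rate` frames, reproducible from the key."""
    digest = hashlib.sha256(KEY + frame_index.to_bytes(4, "big")).digest()
    return digest[0] % rate == 0

selected = [i for i in range(60) if carries_watermark(i)]
print(selected)   # the extractor recomputes the same list from KEY alone
```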

A New Hardware Implementation of Manchester Line Decoder

In this paper, we present a simple circuit for Manchester decoding that does not use any complicated or programmable devices. This circuit can decode transmitted encoded data at 90 kbps; higher transmission rates can be decoded if high-speed devices are used. We also present a new method for extracting the embedded clock from Manchester data in order to use it for serial-to-parallel conversion. All of our experimental measurements have been obtained through simulation.
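
A software model of the decoding logic may clarify the circuit's task; the sketch below assumes the IEEE 802.3 convention (0 encoded as a high-to-low transition, 1 as low-to-high) and two samples per bit, whereas the paper realizes this in hardware. The guaranteed mid-bit transition in every bit cell is also what makes clock extraction possible.

```python
# Hedged sketch: software model of Manchester coding, assuming the
# IEEE 802.3 convention (0 -> high-to-low, 1 -> low-to-high) and two
# samples per bit; the paper's decoder does this with a simple circuit.

def manchester_encode(bits):
    signal = []
    for b in bits:
        signal += [0, 1] if b else [1, 0]  # mid-bit transition carries the bit
    return signal

def manchester_decode(signal):
    bits = []
    for i in range(0, len(signal), 2):
        pair = (signal[i], signal[i + 1])
        if pair == (0, 1):
            bits.append(1)
        elif pair == (1, 0):
            bits.append(0)
        else:
            raise ValueError("missing mid-bit transition: lost synchronization")
    return bits

data = [1, 0, 1, 1, 0]
assert manchester_decode(manchester_encode(data)) == data
```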

Low Complexity Regular LDPC codes for Magnetic Storage Devices

LDPC codes could be used in magnetic storage devices because of their better decoding performance compared to other error correction codes. However, their hardware implementation results in large and complex decoders, which is one of the main obstacles to incorporating LDPC decoders in magnetic storage devices. We construct small, high-girth and high-rate column-weight-2 codes from cage graphs. Though these codes have lower performance than higher column-weight codes, they are easier to implement. This ease of implementation makes them more suitable for applications such as magnetic recording. Cages are the smallest known regular distance graphs, which give us the smallest known column-weight-2 codes for a given size, girth and rate.
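
The construction can be illustrated with the Petersen graph, the (3,5)-cage: the vertex-edge incidence matrix of any graph is a parity-check matrix of column weight 2. The sketch below assumes the networkx library and is illustrative only.

```python
import numpy as np
import networkx as nx

# Hedged sketch: a column-weight-2 parity-check matrix is the vertex-edge
# incidence matrix of a graph; the Petersen graph is the (3,5)-cage.
G = nx.petersen_graph()
H = nx.incidence_matrix(G).toarray().astype(np.uint8)

print(H.shape)                       # (10, 15): 10 checks, 15 code bits
print((H.sum(axis=0) == 2).all())    # every column (edge) meets 2 checks: True
print(H.sum(axis=1))                 # row weight 3: the cage is 3-regular
```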

Hybrid Method Using Wavelets and Predictive Method for Compression of Speech Signal

Signal compression algorithms are making steady progress, continuously improved by new tools that aim to reduce, on average, the number of bits necessary to represent the signal while minimizing the reconstruction error. This article proposes the compression of Arabic speech signals by a hybrid method combining the wavelet transform and linear prediction. The adopted approach rests, on the one hand, on the decomposition of the original signal by a bank of analysis filters, followed by the compression stage, and, on the other hand, on the application of linear prediction of order 5 to the compressed signal coefficients. The aim of this approach is the estimation of the prediction error, which is then coded and transmitted. The decoding operation is then used to reconstruct the original signal. Thus, an adequate choice of the filter bank is necessary to increase the compression rate while keeping the distortion imperceptible from an auditory point of view.
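
A hedged sketch of the hybrid pipeline under stated assumptions (the PyWavelets library, a db4 filter bank, a synthetic test signal): the signal is wavelet-decomposed and an order-5 linear predictor is fitted to each band, so that only the low-energy prediction residual would need to be coded.

```python
import numpy as np
import pywt

# Hedged sketch of the hybrid idea: wavelet-decompose the signal, then apply
# order-5 linear prediction per band and keep only the residual. Wavelet
# family, decomposition level, and test signal are illustrative choices.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 512)
signal = np.sin(2 * np.pi * 5 * t) + 0.05 * rng.normal(size=512)

coeffs = pywt.wavedec(signal, "db4", level=3)   # analysis filter bank

def lpc_residual(x, order=5):
    """Fit prediction coefficients by least squares; return the residual."""
    rows = np.array([x[i - order:i][::-1] for i in range(order, len(x))])
    a, *_ = np.linalg.lstsq(rows, x[order:], rcond=None)
    residual = x[order:] - rows.dot(a)
    return a, residual

for band in coeffs:
    a, res = lpc_residual(band)
    print(f"band length {len(band)}: residual/signal energy = "
          f"{np.sum(res**2) / np.sum(band**2):.4f}")
```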

Method to Improve Channel Coding Using Cryptography

A new approach for the improvement of coding gain in channel coding using the Advanced Encryption Standard (AES) and the Maximum A Posteriori (MAP) algorithm is proposed. This new approach uses the avalanche effect of the block cipher AES and the soft output values of the MAP decoding algorithm. The performance of the proposed approach is evaluated in the presence of Additive White Gaussian Noise (AWGN). Computer simulation results are included to verify the proposed approach.
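
The avalanche effect exploited here is easy to demonstrate; the sketch below (assuming the PyCryptodome library) flips one ciphertext bit and counts how many decrypted bits change, which is what allows candidate flips of low-reliability bits to be verified against a cryptographic check.

```python
# Hedged sketch of the avalanche effect: flipping one bit of an AES block
# changes roughly half of the decrypted bits. Assumes the PyCryptodome
# library (pip install pycryptodome); key and block are illustrative.
from Crypto.Cipher import AES

key = bytes(16)                          # illustrative all-zero 128-bit key
cipher = AES.new(key, AES.MODE_ECB)      # one block; ECB suffices for the demo

block = bytes(range(16))
ct = cipher.encrypt(block)
ct_flipped = bytes([ct[0] ^ 0x01]) + ct[1:]   # flip a single received bit

pt = cipher.decrypt(ct)
pt_flipped = cipher.decrypt(ct_flipped)
diff = sum(bin(a ^ b).count("1") for a, b in zip(pt, pt_flipped))
print(f"{diff} of 128 plaintext bits changed")   # typically close to 64
```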

Web Page Watermarking: XML files using Synonyms and Acronyms

Recent enhancements in the field of computing have led to the massive use of web-based electronic documents. Current copyright protection laws are inadequate to prove ownership of electronic documents and do not provide strong protection against copying and manipulating information from the web. This has opened many channels for securing information, and significant evolutions have been made in the area of information security. Digital watermarking has developed into a very dynamic area of research and has addressed challenging issues for digital content. Watermarking can be visible (logos or signatures) or invisible (encoding and decoding). Many visible watermarking techniques have been studied for text documents, but there are very few for web-based text. XML files are used to trade information on the internet and contain important information. In this paper, two invisible watermarking techniques using synonyms and acronyms are proposed for XML files to prove intellectual ownership and to achieve security. An analysis is made for different attacks, and the capacity that can be embedded in the XML file is also measured. A comparative capacity analysis is made for both methods. The system has been implemented in the C# language, and all tests were carried out in practice to obtain the results.
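
A minimal sketch of the synonym technique on a toy XML fragment; the synonym pairs and the bit-to-word assignment are illustrative assumptions, not the paper's word lists or its C# implementation.

```python
# Hedged sketch of synonym-based text watermarking: each watermark bit picks
# one member of a synonym pair; word list and XML are illustrative only.
import xml.etree.ElementTree as ET

SYNONYMS = {"big": "large", "fast": "quick"}   # bit 0 -> key word, bit 1 -> value

def embed(text, bits):
    out, i = [], 0
    for word in text.split():
        base = word.lower()
        if base in SYNONYMS and i < len(bits):
            word = SYNONYMS[base] if bits[i] else base
            i += 1
        out.append(word)
    return " ".join(out)

doc = ET.fromstring("<note><body>a big dog and a fast cat</body></note>")
doc.find("body").text = embed(doc.find("body").text, [1, 0])
print(ET.tostring(doc, encoding="unicode"))
# -> a large dog and a fast cat  (watermark bits 1,0 embedded invisibly)
```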

Encoding and Compressing Data for Decreasing Number of Switches in Baseline Networks

This method decreases power consumption in networks on chip (NoC). It applies data coding to data transfers in order to reduce power consumption, and it uses data compression to reduce the data size. The power consumption of an NoC is calculated from established models and the transition activity at the input ports. The goal of the simulation is to weigh the power cost of encoding, decoding and compressing in Baseline networks against the reduction in the number of switches in this type of network. Keywords: Networks on chip, Compression, Encoding, Baseline networks, Banyan networks.

Peak-to-Average Power Ratio Reduction in OFDM Systems using Huffman Coding

In this paper we propose the use of Huffman coding to reduce the PAR of an OFDM system as a distortionless scrambling technique, and we utilize the amount saved in the total bit rate by the Huffman coding to send the encoding table for accurate decoding at the receiver without reducing the effective throughput. We found that the use of Huffman coding reduces the PAR by about 6 dB. We have also investigated the effect of the PAR reduction due to Huffman coding by testing the spectral spreading and the in-band distortion due to the HPA at different IBO values. We found a complete match between our expectations for the proposed solution and the obtained simulation results.
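
Two ingredients of the scheme can be sketched briefly: a small Huffman encoder, whose bit-rate saving on skewed sources is what the paper reuses to transmit the table, and the PAR metric of an IFFT-modulated OFDM symbol. All sizes and the mapping below are illustrative, not the paper's configuration.

```python
import heapq
import numpy as np

# Hedged sketch, part 1: Huffman code construction for a skewed source.
def huffman_code(freqs):
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        code = {s: "0" + c for s, c in lo[2].items()}
        code.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], lo[1], code])
    return heap[0][2]

print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
# skewed sources compress below 2 bits/symbol, freeing rate for the table

# Hedged sketch, part 2: the PAR metric of one OFDM symbol.
def papr_db(symbols):
    """Peak-to-average power ratio of the IFFT-modulated OFDM symbol."""
    x = np.fft.ifft(symbols)
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

bits = np.random.default_rng(2).integers(0, 2, 128)
qpsk = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])
print(f"PAR: {papr_db(qpsk):.2f} dB")
```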

A Low Power SRAM Based on Novel Word-Line Decoding

This paper proposes a low power SRAM based on a five-transistor SRAM cell. The proposed SRAM uses novel word-line decoding such that, during read/write operations, only the selected cell is connected to the bit-line, whereas in a conventional SRAM (CV-SRAM) all cells in the selected row are connected to their bit-lines, which develops differential voltages across all bit-lines and wastes energy on the unselected ones. In the proposed SRAM the memory array is divided into two halves, which reduces the data-line capacitance. The proposed SRAM also uses a single bit-line and thus has lower bit-line leakage than the CV-SRAM. Furthermore, the proposed SRAM incurs no area overhead and has read/write performance comparable to the CV-SRAM. Simulation results in a standard 0.25 μm CMOS technology show that in the worst case the proposed SRAM has 80% smaller dynamic energy consumption per cycle than the CV-SRAM. In addition, the per-cycle energy consumption of the proposed SRAM and the CV-SRAM has been investigated analytically, and the results are in good agreement with the simulation results.

Decoder Design for a New Single Error Correcting/Double Error Detecting Code

This paper presents the decoder design for the single error correcting and double error detecting code proposed by the authors in an earlier paper. The speed of error detection and correction of a code is largely dependent upon the associated encoder and decoder circuits. The complexity and the speed of such circuits are determined by the number of 1's in the parity check matrix (PCM). The number of 1's in the parity check matrix of the code proposed by the authors is fewer than in any currently known single error correcting/double error detecting code, which results in simplified encoding and decoding circuitry for error detection and correction.
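
For context, generic SEC-DED decoding logic over an extended-Hamming-style PCM looks as follows; this is a hedged sketch of the standard procedure, not the authors' specific low-density PCM.

```python
import numpy as np

# Hedged sketch of generic SEC-DED decoding (not the authors' code): an
# extended Hamming PCM corrects single errors and flags double errors.
H = np.array([[1, 0, 1, 0, 1, 0, 1, 0],
              [0, 1, 1, 0, 0, 1, 1, 0],
              [0, 0, 0, 1, 1, 1, 1, 0],
              [1, 1, 1, 1, 1, 1, 1, 1]], dtype=np.uint8)  # last row: overall parity

def decode(r):
    s = H.dot(r) % 2
    if not s.any():
        return r, "no error"
    if s[-1] == 1:                       # overall parity fails: odd error count
        col = next(i for i in range(8) if np.array_equal(H[:, i], s))
        r = r.copy()
        r[col] ^= 1
        return r, f"corrected single error at bit {col}"
    return r, "double error detected"    # nonzero syndrome, even error count

received = np.zeros(8, dtype=np.uint8)   # start from the all-zero codeword
received[5] ^= 1                         # inject one error
print(decode(received)[1])               # -> corrected single error at bit 5
```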

Enhancing the Error-Correcting Performance of LDPC Codes through an Efficient Use of Decoding Iterations

The decoding of Low-Density Parity-Check (LDPC) codes operates over a redundant structure known as the bipartite graph, meaning that the full set of bit nodes is not absolutely necessary for decoder convergence. In 2008, Soyjaudah and Catherine designed a recovery algorithm for LDPC codes based on this assumption and showed that the error-correcting performance of their codes outperformed conventional LDPC codes. In this work, the use of the recovery algorithm is further explored to test the performance of LDPC codes as the number of iterations is progressively increased. For experiments conducted with small blocklengths of up to 800 bits and up to 2000 iterations, the results interestingly demonstrate that, contrary to conventional wisdom, the error-correcting performance keeps improving as the number of iterations increases.

Performance of Random Diagonal Codes for Spectral Amplitude Coding Optical CDMA Systems

In this paper we study the use of a new code, called the Random Diagonal (RD) code, for Spectral Amplitude Coding (SAC) optical Code Division Multiple Access (CDMA) networks using Fiber Bragg Gratings (FBGs); an FBG consists of a fiber segment whose index of refraction varies periodically along its length. The RD code is constructed using a code level and a data level; one of the important properties of this code is that the cross-correlation at the data level is always zero, which means that Phase Induced Intensity Noise (PIIN) is reduced. We find that the performance of the RD code is better than that of the Modified Frequency Hopping (MFH) and Hadamard codes. It has been observed through experimental and theoretical simulation that the BER for the RD code is significantly better than for the other codes. Proof-of-principle simulations of encoding with 3 channels and 10 Gbps data transmission have been successfully demonstrated, together with an FBG decoding scheme for canceling the code level from the SAC signal.