Abstract: Digital watermarking provides a means of secure multimedia data communication in addition to its copyright-protection role. The spread spectrum (SS) modulation principle is widely used in digital watermarking to make multimedia signals robust against various signal-processing operations. Several SS watermarking algorithms have been proposed for multimedia signals, but few works have discussed the issues responsible for secure data communication and for improving robustness. The present paper critically analyzes several such factors that strongly affect detection reliability, namely the properties of the spreading codes, a signal decomposition suitable for data embedding, the security provided by the key, a successive bit-cancellation method applied at the decoder, and the secure communication of a significant signal under the camouflage of insignificant signals. Based on this analysis, a robust SS watermarking scheme for secure data communication is proposed in the wavelet domain, and improvements in secure communication and robustness performance are reported through experimental results. The reported results also show improved visual and statistical invisibility of the hidden data.
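The basic SS embedding and correlation detection referred to above can be sketched as follows; the Gaussian coefficient model, the strength `ALPHA`, and the key value are illustrative assumptions, not values from the paper:

```python
import random

random.seed(1)

N = 1024           # number of host coefficients (stand-in for a wavelet subband)
ALPHA = 2.0        # embedding strength (assumed value)

host = [random.gauss(0.0, 4.0) for _ in range(N)]

def pn_sequence(key, n):
    """Key-dependent pseudo-noise spreading sequence of +/-1 chips."""
    rng = random.Random(key)
    return [rng.choice((-1.0, 1.0)) for _ in range(n)]

def embed(coeffs, bit, key):
    """Spread one watermark bit over all coefficients."""
    pn = pn_sequence(key, len(coeffs))
    b = 1.0 if bit else -1.0
    return [c + ALPHA * b * p for c, p in zip(coeffs, pn)]

def detect(coeffs, key):
    """Correlate with the keyed PN sequence; the sign gives the bit."""
    pn = pn_sequence(key, len(coeffs))
    corr = sum(c * p for c, p in zip(coeffs, pn)) / len(coeffs)
    return corr > 0.0

marked = embed(host, True, key=42)
print(detect(marked, key=42))  # True
```

Without the correct key, the correlation is near zero, which is the security role the key plays in such schemes.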
Abstract: This article describes aspects of the formation of
the national idea and national identity through the prism of gender
control and its contradistinction to the obsolete, Soviet component.
The role of women in ethnic and national projects is considered from
the point of view of Dr. Nira Yuval-Davis: as biological reproducers
of the members of ethnic communities; as reproducers of the borders
of ethnic/national groups; as central participants in the ideological
reproduction of the community and transmitters of its culture; as symbols
in the ideology, reproduction and transformation of ethnic/national
categories; and as participants in national, economic, political and
military struggles. A society of the transitional type uses the
symbolic resources of forming the gender component of the
national project. Gender patterns act as cultural codes,
performing an important ideological function in the formation of the
national female image; for instance, the discussion on the hijab is not just a
discussion about control over the female body, but a discussion about the
metaphor of social order.
Abstract: This paper proposes a visual cryptography by random
grids scheme with identifiable shares. The method encodes an image
O into two shares that exhibit the following features: (1) each generated
share has the same scale as O, (2) each share singly has a noise-like
appearance that reveals no secret information about O, (3) the secret can
be revealed by superimposing the two shares, (4) folding a share up
discloses some identification patterns, and (5) both the secret
information and the designated identification patterns are recognized
by the naked eye without any computation. The ability to show
identification patterns on folded shares establishes a simple and
friendly interface for users to manage the numerous shares created by
VC schemes.
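For reference, the classical Kafri-Keren random-grid encoding underlying schemes of this kind can be sketched as follows; this is a minimal sketch of plain random-grid VC only, and does not reproduce the paper's identification patterns on folded shares:

```python
import random

random.seed(0)

# Secret binary image: 0 = white, 1 = black
secret = [
    [0, 1, 1, 0],
    [1, 0, 0, 1],
]

def encode(secret):
    """Kafri-Keren random-grid encoding into two noise-like shares."""
    share1 = [[random.randint(0, 1) for _ in row] for row in secret]
    # White secret pixel: copy the share-1 pixel; black pixel: complement it.
    share2 = [[b if s == 0 else 1 - b for s, b in zip(srow, s1row)]
              for srow, s1row in zip(secret, share1)]
    return share1, share2

def stack(a, b):
    """Superimpose transparencies: a pixel is black if either share is black."""
    return [[x | y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

s1, s2 = encode(secret)
merged = stack(s1, s2)
# Every black secret pixel is fully black after stacking; white pixels stay
# black only about half the time, giving a contrast of 1/2.
print(all(merged[i][j] == 1
          for i in range(2) for j in range(4) if secret[i][j] == 1))  # True
```

Each share alone is uniformly random, which is why it reveals nothing about the secret.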
Abstract: A novel interpolation scheme to extend the usable spectrum
and perform upconversion in high-performance D/A converters is addressed in this
paper. By adjusting the pulse width of the clock cycle and the
code-generation circuit, the interpolation process inserts expansion codes
that are either null codes or complementary codes. The number and type of
interpolated codes determine whether the DAC works in normal mode or in a
multi-mixer mode, converting the input digital data into either a normal
analog signal or a mixed analog signal whose mixer frequency is higher than the
data frequency. Simulation results show that the proposed scheme and
apparatus can extend the usable frequency spectrum into the fifth to
sixth Nyquist zones, beyond conventional DACs.
Abstract: An experiment was performed with a 24.5 MeV 14N
beam on a 12C target in the cyclotron DC-60 located in Astana,
Kazakhstan, to study the elastic scattering of 14N on 12C; the
scattering was also analyzed at different energies to track the
remarkable structure observed at large angles. The aims were to
extend the measurements to very large angles, and attempt to
uniquely identify the elastic scattering potential. Good agreement
between the theoretical and experimental data has been obtained with
suitable optical potential parameters. Optical model calculations with
l -dependent imaginary potentials were also applied to the data and
relatively good agreement was found.
Abstract: Circular tubes have been widely used as structural
members in engineering applications. Their collapse behavior
has therefore been studied for many decades, focusing on their energy-absorption
characteristics. To predict the collapse behavior of such members,
one could rely on finite element codes or experiments.
These tools are helpful and highly accurate, but costly and require
extensive running time. An approximate model of the tube
collapse mechanism is therefore an alternative for the early design stage. This
paper aims to develop a closed-form solution for a thin-walled
circular tube subjected to bending. It extends the model of Elchalakani et
al. (Int. J. Mech. Sci. 2002; 44:1117-1143) to include the
rate of energy dissipation of the rolling hinge in the circumferential
direction. The 3-D geometrical collapse mechanism was analyzed by
adding oblique hinge lines along the longitudinal tube within the
length of the plastically deforming zone. The model was based on the
principle of energy rate conservation; accordingly, the rates of internal
energy dissipation were calculated for each hinge line, defined in
terms of a velocity field. Inextensional deformation and
perfectly plastic material behavior were assumed in the derivation of the
deformation energy rate. The analytical results were compared with
experimental results obtained from a number of tubes with various D/t ratios,
and good agreement between analysis and experiment was achieved.
Abstract: Many metrics have been proposed to evaluate the
characteristics of the analysis and design model of a given product,
which in turn help to assess the quality of the product. The function point
metric is a measure of the 'functionality' delivered by the software.
This paper presents an analysis, through the function point metric, of a set
of programs of a project developed in Cµ. Function points
are measured from a Data Flow Diagram (DFD) of the case developed
at the initial stage. Lines of Code (LOC) and possible errors are
calculated with the help of the measured Function Points (FPs). The
calculations are performed using suitable established functions.
The calculated LOC and errors are compared with the actual LOC and
errors found during analysis and design review, implementation,
and testing. It has been observed that the errors actually found exceed
the calculated errors. On the basis of this analysis and these observations,
the authors conclude that function points provide useful insight and help
to analyze the drawbacks in the development process.
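The kind of calculation involved can be illustrated with a standard unadjusted function-point count and an FP-to-LOC conversion; the component counts, the value adjustment factor, and the LOC-per-FP gearing factor below are assumed values for illustration, not figures from the paper:

```python
# Average-complexity weights from the standard unadjusted-FP table.
FP_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_files": 10,
    "external_interfaces": 7,
}

def unadjusted_fp(counts):
    return sum(FP_WEIGHTS[k] * n for k, n in counts.items())

def adjusted_fp(ufp, vaf):
    """vaf: value adjustment factor, typically 0.65 + 0.01 * sum of GSC ratings."""
    return ufp * vaf

# Hypothetical component counts for a small DFD-level model.
counts = {"external_inputs": 6, "external_outputs": 4,
          "external_inquiries": 3, "internal_files": 2,
          "external_interfaces": 1}

ufp = unadjusted_fp(counts)            # 6*4 + 4*5 + 3*4 + 2*10 + 1*7 = 83
fp = adjusted_fp(ufp, vaf=1.0)
LOC_PER_FP = 128                       # assumed gearing factor, not from the paper
print(ufp, round(fp * LOC_PER_FP))     # 83 10624
```

Estimated errors are obtained in the same spirit, by applying an established defect-density function to the measured FP count.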
Abstract: Space-time block codes (STBC) and spatial multiplexing
(SM) are promising techniques that effectively exploit multiple-input
multiple-output (MIMO) transmission to achieve more reliable
communication and a higher multiplexing rate, respectively. In this
paper, we study a practical design for a hybrid scheme in multiple-input
multiple-output orthogonal frequency division multiplexing (MIMO-OFDM)
systems that flexibly optimizes the tradeoff between diversity
and multiplexing gains. Unlike existing STBC and SM designs,
which are suitable only for integer multiplexing rates, the proposed
design can achieve an arbitrary multiplexing rate.
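As a reference point for the STBC side of the tradeoff, the classical rate-1 Alamouti 2x1 code can be sketched as follows; this is the textbook scheme, not the paper's hybrid design, and the channel gains and symbols are made-up values:

```python
def alamouti_encode(s1, s2):
    """Two symbols over two antennas and two time slots.
    Returns [(ant1_t1, ant2_t1), (ant1_t2, ant2_t2)]."""
    return [(s1, s2), (-s2.conjugate(), s1.conjugate())]

def alamouti_combine(r1, r2, h1, h2):
    """Linear ML combining at a single receive antenna (noise-free here)."""
    s1_hat = h1.conjugate() * r1 + h2 * r2.conjugate()
    s2_hat = h2.conjugate() * r1 - h1 * r2.conjugate()
    g = abs(h1) ** 2 + abs(h2) ** 2
    return s1_hat / g, s2_hat / g

h1, h2 = 0.8 + 0.3j, -0.5 + 0.9j          # flat-fading channel gains (assumed)
s1, s2 = 1 + 1j, -1 + 1j                  # QPSK symbols
t1, t2 = alamouti_encode(s1, s2)
r1 = h1 * t1[0] + h2 * t1[1]              # received in slot 1
r2 = h1 * t2[0] + h2 * t2[1]              # received in slot 2
print(alamouti_combine(r1, r2, h1, h2))   # recovers (s1, s2) up to rounding
```

The orthogonal structure is what yields full diversity at multiplexing rate 1; SM instead sends independent streams per antenna, and the hybrid design trades between the two.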
Abstract: In the present communication, we have developed
suitable constraints for the given mean codeword length and the
measures of entropy. This development proves that Renyi's
entropy gives the minimum value of the logarithm of the harmonic mean
and of the logarithm of the power mean. We have also developed an important
relation between the best 1:1 code and the uniquely decipherable code by
using different measures of entropy.
Abstract: Wireless mobile communications have experienced
phenomenal growth over the last decades. The advances in
wireless mobile technologies have brought about a demand for high-quality
multimedia applications and services. For such applications
and services to work, a signaling protocol is required for establishing,
maintaining and tearing down multimedia sessions. The Session
Initiation Protocol (SIP) is an application-layer signaling protocol
based on a request/response transaction model. This paper considers the
SIP INVITE transaction over an unreliable medium, since it has
recently been modified in Request for Comments (RFC) 6026. To
help assure that the functional correctness of this modification is
achieved, the SIP INVITE transaction is modeled and analyzed using
Colored Petri Nets (CPNs). Based on the model analysis, it is
concluded that the SIP INVITE transaction is free of livelocks and
dead code, but at the same time has both desirable and
undesirable deadlocks. Therefore, the SIP INVITE transaction should be
subjected to additional updates in order to eliminate the undesirable
deadlocks. In order to reduce the cost of implementing and
maintaining SIP, additional remodeling of the SIP INVITE
transaction is recommended.
Abstract: Heart failure is the most common cause of death
nowadays, but if medical help is given promptly, the patient's life
may be saved in many cases. Numerous heart diseases can be
detected by analyzing electrocardiograms (ECG). Artificial
Neural Networks (ANN) are computer-based expert systems that
have proved useful in pattern-recognition tasks. ANNs can be
used in different phases of the decision-making process, from
classification to diagnostic procedures. This work presents a
review followed by a novel method.
The purpose of the review is to assess the evidence of healthcare
benefits from applying artificial neural networks to the
clinical functions of diagnosis, prognosis and survival analysis of
ECG signals. The developed method is based on a compound neural
network (CNN) that classifies ECGs as normal or carrying an
AtrioVentricular heart Block (AVB). This method uses three
different feed-forward multilayer neural networks. A single output
unit encodes the probability of AVB occurrence: a value between 0
and 0.1 is the desired output for a normal ECG, while a value between 0.1
and 1 indicates the occurrence of an AVB. The results show that
this compound network performs well in detecting AVBs,
with a sensitivity of 90.7% and a specificity of 86.05%. The accuracy
value is 87.9%.
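The decision rule described above (output in [0, 0.1] read as normal, above 0.1 as AVB) and the reported sensitivity/specificity figures can be illustrated with a small sketch; the sample outputs and labels below are invented for illustration:

```python
THRESHOLD = 0.1

def classify(output):
    """Map the single network output to a class label."""
    return "AVB" if output > THRESHOLD else "normal"

def sensitivity_specificity(outputs, labels):
    """Sensitivity = TP / all AVBs; specificity = TN / all normals."""
    tp = sum(1 for o, y in zip(outputs, labels)
             if classify(o) == "AVB" and y == "AVB")
    tn = sum(1 for o, y in zip(outputs, labels)
             if classify(o) == "normal" and y == "normal")
    return tp / labels.count("AVB"), tn / labels.count("normal")

outputs = [0.02, 0.76, 0.08, 0.93, 0.05, 0.40]      # hypothetical CNN outputs
labels  = ["normal", "AVB", "normal", "AVB", "AVB", "AVB"]
print(sensitivity_specificity(outputs, labels))      # (0.75, 1.0)
```

On the paper's test set this computation yields the reported 90.7% sensitivity and 86.05% specificity.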
Abstract: The study in this paper underlines the importance of
the correct joint selection of the spreading codes at the transmitter
side and the detector at the receiver side for the uplink of multi-carrier
code division multiple access (MC-CDMA) in the presence of nonlinear
distortion due to a high power amplifier (HPA). The bit error rate
(BER) of the system for different spreading sequences (Walsh code, Gold
code, orthogonal Gold code, Golay code and Zadoff-Chu code) and
different kinds of receivers (the minimum mean-square error receiver
(MMSE-MUD) and the microstatistic multi-user receiver (MSF-MUD))
is compared by means of simulations of the MC-CDMA transmission
system. The results of the analysis show that the combination
of MSF-MUD with Golay codes can significantly outperform
the other tested spreading codes and receivers for all
commonly used HPA models.
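As a point of reference for the code comparison above, Walsh codes (one of the tested spreading families) can be generated by the Sylvester construction, and their zero cross-correlation under perfect synchronization checked directly:

```python
def hadamard(n):
    """Sylvester Hadamard matrix of order n (n must be a power of two)."""
    if n == 1:
        return [[1]]
    h = hadamard(n // 2)
    return ([row + row for row in h] +
            [row + [-x for x in row] for row in h])

def cross_correlation(a, b):
    """Inner product of two spreading sequences at zero lag."""
    return sum(x * y for x, y in zip(a, b))

codes = hadamard(8)                     # 8 Walsh codes of length 8
print(all(cross_correlation(codes[i], codes[j]) == 0
          for i in range(8) for j in range(8) if i != j))  # True
```

This ideal orthogonality degrades under HPA nonlinearity and asynchronism, which is why code families with better aperiodic properties, such as Golay codes, can win in the comparison.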
Abstract: The decoding of Low-Density Parity-Check (LDPC) codes operates over a redundant structure known as the bipartite graph, meaning that the full set of bit nodes is not absolutely necessary for decoder convergence. In 2008, Soyjaudah and Catherine designed a recovery algorithm for LDPC codes based on this assumption and showed that the error-correcting performance of their codes outperformed that of conventional LDPC codes. In this work, the use of the recovery algorithm is further explored to test the performance of LDPC codes as the number of iterations is progressively increased. For experiments conducted with small block lengths of up to 800 bits and up to 2000 iterations, the results interestingly demonstrate that, contrary to conventional wisdom, the error-correcting performance keeps improving as the number of iterations increases.
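For context, iterative decoding on the bipartite (Tanner) graph can be sketched with a generic hard-decision bit-flipping decoder; this is a minimal illustration on a toy (7,4) Hamming parity-check matrix, not the recovery algorithm of Soyjaudah and Catherine:

```python
H = [  # parity-check matrix of a (7,4) Hamming code
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def syndrome(H, word):
    """One parity bit per check node; all-zero means a valid codeword."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

def bit_flip_decode(H, word, max_iters=50):
    """Iteratively flip the bit with the largest fraction of unsatisfied checks."""
    word = list(word)
    degree = [sum(row[j] for row in H) for j in range(len(word))]
    for _ in range(max_iters):
        s = syndrome(H, word)
        if not any(s):
            return word                      # all parity checks satisfied
        votes = [sum(s[i] for i in range(len(H)) if H[i][j])
                 for j in range(len(word))]
        frac = [v / d for v, d in zip(votes, degree)]
        word[frac.index(max(frac))] ^= 1     # flip the most suspicious bit
    return word

received = [0, 0, 0, 0, 1, 0, 0]             # all-zero codeword with one bit error
print(bit_flip_decode(H, received))          # [0, 0, 0, 0, 0, 0, 0]
```

The iteration count `max_iters` plays the role of the iteration budget studied in the abstract: each pass propagates check information one step further through the graph.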
Abstract: Results in one field necessarily give insight into
others, and all have much potential for scientific and technological
application. The Hadamard-transform technique, once applied to
spectrometry, also has its use in the SNR enhancement of OTDR.
In this report, a new set of codes (Simplex codes) is discussed,
and the origin of the additional SNR gain is explained.
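The construction of Simplex codes from a Sylvester Hadamard matrix, and the well-known multiplexing SNR advantage of S-matrix probing, can be sketched as follows; this illustrates the standard S-matrix construction rather than any OTDR-specific detail from the report:

```python
import math

def hadamard(order):
    """Sylvester Hadamard matrix; order must be a power of two."""
    if order == 1:
        return [[1]]
    h = hadamard(order // 2)
    return ([row + row for row in h] +
            [row + [-x for x in row] for row in h])

def simplex(order):
    """S-matrix: drop the first row and column, map +1 -> 0, -1 -> 1."""
    h = hadamard(order)
    return [[0 if x == 1 else 1 for x in row[1:]] for row in h[1:]]

S = simplex(8)                 # 7 x 7 S-matrix: 7 probe patterns of length 7
n = len(S)
print(n, all(sum(row) == (n + 1) // 2 for row in S))  # 7 True
# Multiplexing advantage of S-matrix measurement over single-pulse probing:
snr_gain = (n + 1) / (2 * math.sqrt(n))               # about 1.51 for n = 7
```

Each probe pattern fires about half of the n pulse slots, so the averaged measurements carry more optical energy per acquisition, which is where the additional SNR comes from.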
Abstract: In this paper we study the use of a new code, called the
Random Diagonal (RD) code, for Spectral Amplitude Coding (SAC)
optical Code Division Multiple Access (CDMA) networks using
Fiber Bragg Gratings (FBG); an FBG consists of a fiber segment whose
index of refraction varies periodically along its length. The RD code is
constructed using a code level and a data level; one of the important
properties of this code is that the cross-correlation at the data level is
always zero, which means that Phase-Induced Intensity Noise (PIIN)
is reduced. We find that the performance of the RD code is
better than that of the Modified Frequency Hopping (MFH) and Hadamard codes.
It has been observed through experimental and theoretical simulation
that the BER for the RD code is significantly better than for the other codes.
Proof-of-principle simulations of encoding with 3 channels and 10
Gbps data transmission have been successfully demonstrated, together
with an FBG decoding scheme for canceling the code level from the SAC signal.
Abstract: Nowadays, OCR systems have several
applications and are increasingly employed in daily life. Much
research has been done on the identification of Latin,
Japanese, and Chinese characters. However, very little investigation
has been performed on Farsi/Arabic character recognition.
The reason is probably the difficulty and complexity of identifying those
characters compared to the others, and the limited IT activity in
Farsi- and Arabic-speaking countries. In this paper, a technique is
employed to identify isolated Farsi/Arabic characters. A chain-code-based
algorithm, along with other significant peculiarities such
as the number and location of dots and auxiliary parts and the number of
holes in the isolated character, is used in this study to
identify Farsi/Arabic characters. Experimental results show the
relatively high accuracy of the developed method when it is tested on
several standard Farsi fonts.
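The chain-code idea at the heart of such algorithms can be illustrated with the classical Freeman 8-direction chain code; the paper's additional features (dots, auxiliary parts, holes) are not reproduced in this sketch:

```python
# Direction symbols: 0 = east, then counter-clockwise in image coordinates
# (y grows downward, so "northeast" is (dx, dy) = (1, -1)).
DIRECTIONS = {(1, 0): 0, (1, -1): 1, (0, -1): 2, (-1, -1): 3,
              (-1, 0): 4, (-1, 1): 5, (0, 1): 6, (1, 1): 7}

def chain_code(boundary):
    """Freeman chain code of a closed boundary given as (x, y) pixels."""
    code = []
    for (x0, y0), (x1, y1) in zip(boundary, boundary[1:] + boundary[:1]):
        code.append(DIRECTIONS[(x1 - x0, y1 - y0)])
    return code

# A 2x2 pixel square traversed clockwise in image coordinates.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(chain_code(square))  # [0, 6, 4, 2]
```

The resulting symbol sequence is compact and translation-invariant, which makes it a convenient shape feature for isolated-character recognition.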
Abstract: In this paper, we study a class of serially concatenated block codes (SCBC) based on matrix interleavers, to be employed in fixed wireless communication systems. The performance of SCBC-coded systems is investigated under various interleaver dimensions. Numerical results reveal that the matrix interleaver can be a competitive candidate over the conventional block interleaver for frame lengths of 200 bits; hence, SCBC coding based on a matrix interleaver is a promising technique for speech transmission applications in many international standards, such as the pan-European Global System for Mobile communications (GSM), Digital Cellular Systems (DCS) 1800, and Joint Detection Code Division Multiple Access (JD-CDMA) mobile radio systems, where the speech frame contains around 200 bits.
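A matrix interleaver of the kind studied above writes the frame row by row and reads it column by column; the 3 x 4 toy dimensions below are for illustration only (a ~200-bit speech frame would use correspondingly larger dimensions):

```python
def matrix_interleave(bits, rows, cols):
    """Write the frame row by row into a rows x cols array, read column by column."""
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def matrix_deinterleave(bits, rows, cols):
    """Inverse permutation: write column by column, read row by row."""
    return [bits[c * rows + r] for r in range(rows) for c in range(cols)]

frame = list(range(12))
shuffled = matrix_interleave(frame, rows=3, cols=4)
print(shuffled)                                      # [0, 4, 8, 1, 5, 9, 2, 6, 10, 3, 7, 11]
print(matrix_deinterleave(shuffled, 3, 4) == frame)  # True
```

Spreading adjacent frame bits `cols` positions apart is what breaks up burst errors between the inner and outer codes of the serial concatenation.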
Abstract: In this paper we present a generic approach to the problem of blind estimation of the parameters of linear and convolutional error-correcting codes. In a non-cooperative context, an adversary has access only to the noisy transmission he has intercepted; the interceptor has no knowledge of the parameters used by the legitimate users. So, before gaining access to the information, he first has to blindly estimate the parameters of the error-correcting code used in the communication. The presented approach has the main advantage that the problem of reconstructing such codes can be expressed in a very simple way. This allows us to evaluate theoretical bounds on the complexity of the reconstruction process, as well as bounds on the estimation rate. We show that some classical reconstruction techniques are optimal, and we also explain why some of them have theoretical complexities greater than those observed experimentally.