Implementation of TinyHash based on Hash Algorithm for Sensor Network

In recent years, several security architectures for sensor networks have been proposed [2][4]. One of these, TinySec, by Chris Karlof, Naveen Sastry, and David Wagner, is a link-layer security architecture that takes the constraints of sensor networks (i.e., energy, bandwidth, computation capability, etc.) into account. TinySec employs CBC-mode encryption and CBC-MAC authentication based on the Skipjack block cipher, and it is currently incorporated in TinyOS for sensor network security. This paper introduces TinyHash, which is based on a general hash algorithm. TinyHash is a module intended to replace the authentication and integrity parts of TinySec; in other words, it applies a hash algorithm to the TinySec architecture. For compatibility with TinySec, the components in TinyHash are structured similarly to those of TinySec. TinyHash implements an HMAC component for authentication and a Digest component for message integrity. Additionally, we define several interfaces for the services associated with the hash algorithm.
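
For concreteness, below is a minimal sketch of hash-based packet authentication of the kind an HMAC component provides, written in Python with a truncated SHA-1 HMAC. The key, payload, and 4-byte truncation are illustrative assumptions only; TinyHash itself is implemented as TinyOS/nesC components.

```python
import hashlib
import hmac

def make_mac(key: bytes, payload: bytes) -> bytes:
    """Compute a truncated HMAC over a link-layer payload.

    Link-layer MACs are usually truncated to a few bytes to save radio
    bandwidth; 4 bytes here is an illustrative choice, not TinyHash's format.
    """
    tag = hmac.new(key, payload, hashlib.sha1).digest()
    return tag[:4]

def verify_mac(key: bytes, payload: bytes, tag: bytes) -> bool:
    """Recompute the MAC and compare in constant time."""
    return hmac.compare_digest(make_mac(key, payload), tag)

# Sender computes the tag, receiver verifies it with the shared key.
key = b"hypothetical-shared-link-key"
packet = b"\x01\x02sensor-reading-42"
tag = make_mac(key, packet)
assert verify_mac(key, packet, tag)
```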

Secure Block-Based Video Authentication with Localization and Self-Recovery

Because of the great advances in multimedia technology, digital multimedia is vulnerable to malicious manipulation. In this paper, a public-key, self-recovery, block-based video authentication technique is proposed that can not only precisely localize alterations but also recover the missing data with high reliability. In the proposed block-based technique, multiple description coding (MDC) is used to generate two codes (two descriptions) for each block. Although one block description is enough to rebuild an altered block, the altered block is rebuilt with better quality when both descriptions are available, so using MDC increases the reliability of data recovery. A block signature is computed using a cryptographic hash function, and a doubly linked chain is utilized to embed the block signature copies and the block descriptions into the LSBs of distant blocks and of the block itself. The doubly linked chain scheme gives the proposed technique the capability to thwart vector quantization attacks. In the proposed technique, anyone can check the authenticity of a given video using the public key. The experimental results show that the proposed technique is reliable for detecting, localizing, and recovering alterations.
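
The hash-and-embed step can be illustrated with a small sketch: hash a block's most significant bit planes and write the digest into the LSBs of a distant carrier block. This is a simplified stand-in (toy 16x16 blocks, no MDC descriptions, no public-key signature), not the paper's full scheme.

```python
import hashlib
import numpy as np

def block_digest(block: np.ndarray) -> bytes:
    """Hash a block's most significant bits (the LSB plane carries the payload)."""
    msb = (block >> 1).astype(np.uint8)
    return hashlib.sha256(msb.tobytes()).digest()

def embed_bits(block: np.ndarray, payload: bytes) -> np.ndarray:
    """Write payload bits into the LSBs of a (distant) carrier block."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    out = block.copy().ravel()
    out[: bits.size] = (out[: bits.size] & 0xFE) | bits  # replace LSBs only
    return out.reshape(block.shape)

def extract_bits(block: np.ndarray, n_bytes: int) -> bytes:
    """Read n_bytes of payload back from the carrier block's LSBs."""
    bits = (block.ravel()[: n_bytes * 8] & 1).astype(np.uint8)
    return np.packbits(bits).tobytes()

# Toy 8-bit blocks standing in for video-frame blocks; a 16x16 block holds
# exactly the 256 bits of a 32-byte digest.
rng = np.random.default_rng(0)
src = rng.integers(0, 256, (16, 16), dtype=np.uint8)        # block to protect
carrier = rng.integers(0, 256, (16, 16), dtype=np.uint8)    # distant block

sig = block_digest(src)
carrier = embed_bits(carrier, sig)
assert extract_bits(carrier, 32) == sig   # verifier recovers the signature copy
```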

The Incorporation of In in GaAsN as a Means of N Fraction Calibration

InGaAsN and GaAsN epitaxial layers with similar nitrogen compositions within a sample were successfully grown on a GaAs (001) substrate by solid-source molecular beam epitaxy. An electron cyclotron resonance nitrogen plasma source was used to generate atomic nitrogen during the growth of the nitride layers. The indium composition was varied from sample to sample to give compressively and tensilely strained InGaAsN layers. The layer characteristics were assessed by high-resolution x-ray diffraction to determine the relationship between the lattice constant of the GaAs1-yNy layer and the In fraction x. The objective was to determine the In fraction x in an InxGa1-xAs1-yNy epitaxial layer that exactly cancels the strain present in a GaAs1-yNy epitaxial layer with the same nitrogen content when grown on a GaAs substrate.
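
As a rough guide to the expected result, a first-order estimate can be written down by assuming Vegard's law for the quaternary (an assumption the x-ray measurements are meant to refine, since dilute nitrides are known to deviate from it) and using approximate literature lattice constants:

```latex
% Vegard's law for In_x Ga_{1-x} As_{1-y} N_y to first order in x and y:
%   a \approx a_{\mathrm{GaAs}} + x\,(a_{\mathrm{InAs}} - a_{\mathrm{GaAs}})
%                               - y\,(a_{\mathrm{GaAs}} - a_{\mathrm{GaN}})
% Zero net strain on GaAs requires a = a_{\mathrm{GaAs}}, hence
x \approx y\,\frac{a_{\mathrm{GaAs}} - a_{\mathrm{GaN}}}
                 {a_{\mathrm{InAs}} - a_{\mathrm{GaAs}}}
  \approx y\,\frac{5.653 - 4.50}{6.058 - 5.653} \approx 2.9\,y
% i.e. roughly three In atoms per incorporated N atom
% (a_{GaN} taken as the zinc-blende value).
```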

Extended "2D-RIB" for Impression-Based Satisfactory Retrieval and Its Evaluation

Recently, many researchers have been attracted to retrieving from multimedia databases by using impression words and their values. Ikezoe's research is one representative approach and uses eight pairs of opposite impression words. We modified its retrieval interface and proposed '2D-RIB' in previous work. The aim of the present paper is to improve the user's satisfaction with the retrieval results in 2D-RIB. Our method is to extend 2D-RIB. One of our extensions is to define and introduce the following two measures: 'melody goodness' and 'general acceptance'. Another extension is three types of customization menus. The results of an evaluation using a pilot system are as follows. Both measures, 'melody goodness' and 'general acceptance', can contribute to the improvement. Moreover, it is effective to introduce a customization menu that enables the user to relax the strictness level of the retrieval condition in an impression pair according to his/her needs.

A Novel Multiplex Real-Time PCR Assay Using TaqMan MGB Probes for Rapid Detection of Trisomy 21

Cytogenetic analysis remains the gold-standard method for prenatal diagnosis of trisomy 21 (Down syndrome, DS). Nevertheless, conventional cytogenetic analysis requires live cultured cells and is too time-consuming for routine clinical application. In contrast, molecular methods such as FISH, QF-PCR, MLPA, and quantitative real-time PCR are rapid assays with results available within 24 hours. In the present study, we have successfully used a novel MGB TaqMan probe-based real-time PCR assay for rapid diagnosis of trisomy 21 status in Down syndrome samples. We have also compared the results of this molecular method with the corresponding results obtained by cytogenetic analysis. Blood samples obtained from DS patients (n=25) and normal controls (n=20) were tested by quantitative real-time PCR in parallel with standard G-banding analysis. Genomic DNA was extracted from peripheral blood lymphocytes. A high-precision TaqMan probe quantitative real-time PCR assay was developed to determine the gene dosage of DSCAM (target gene on 21q22.2) relative to PMP22 (reference gene on 17p11.2). The DSCAM/PMP22 ratio was calculated according to the formula ratio = 2^(-ΔΔCt). The quantitative real-time PCR was able to distinguish between trisomy 21 samples and normal controls, with gene ratios of 1.49±0.13 and 1.03±0.04, respectively (p value
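
The dosage calculation itself is a one-line application of the 2^(-ΔΔCt) formula quoted above. The sketch below uses hypothetical Ct values (not the paper's data), chosen so that a trisomic sample yields a DSCAM/PMP22 ratio near 1.5.

```python
def ddct_ratio(ct_target_sample, ct_ref_sample, ct_target_calib, ct_ref_calib):
    """Relative gene dosage by the 2^(-ΔΔCt) method.

    ΔCt  = Ct(target) - Ct(reference), for the test sample and for a normal
           calibrator sample.
    ΔΔCt = ΔCt(sample) - ΔCt(calibrator);  ratio = 2 ** (-ΔΔCt).
    """
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_calib = ct_target_calib - ct_ref_calib
    return 2 ** -(d_ct_sample - d_ct_calib)

# Illustrative Ct values only: a trisomic sample amplifies the chromosome-21
# target about 0.58 cycles earlier than expected, giving a ratio near 1.5.
print(ddct_ratio(ct_target_sample=24.42, ct_ref_sample=25.00,
                 ct_target_calib=25.00, ct_ref_calib=25.00))   # ~1.49
```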

Development of Monitoring and Simulation System of Human Tracking System Based On Mobile Agent Technologies

In recent years, the number of information leak cases has been increasing. Companies and research institutions take various measures against information theft and security incidents. One such measure is the adoption of crime prevention systems, including monitoring systems based on surveillance cameras. To address the difficulties of monitoring with multiple cameras, we develop an automatic human tracking system that uses mobile agents across multiple surveillance cameras to track target persons. In this paper, we develop a monitor that confirms which mobile agents are tracing target persons, and a simulator of video picture analysis used to construct the tracking algorithm.

Bin Bloom Filter Using Heuristic Optimization Techniques for Spam Detection

A Bloom filter is a probabilistic and memory-efficient data structure designed to answer rapidly whether an element is present in a set. It can report that an element is definitely not in the set, while reporting its presence only with a certain probability. The trade-off in using a Bloom filter is a configurable risk of false positives; this risk can be made very low if the filter size and the number of hash functions are chosen appropriately. For spam detection, a weight is attached to each set of elements: the spam weight of a word is a measure used to rate the e-mail, and each word is assigned to a Bloom filter based on its weight. The proposed work introduces an enhanced Bloom filter concept called the Bin Bloom Filter (BBF). The performance of the BBF relative to a conventional Bloom filter is evaluated under various optimization techniques. Real and synthetic data sets are used for the experimental analysis, and results are reported for bin sizes 4, 5, 6, and 7. Analysis of the results shows that the BBF using heuristic techniques performs better than the traditional Bloom filter in spam detection.
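
A minimal plain Bloom filter is sketched below, together with a hypothetical two-bin arrangement suggesting how a Bin Bloom Filter might route words by spam weight. The bin sizes, hash counts, and example weights are illustrative assumptions, not the paper's configuration.

```python
import hashlib

class BloomFilter:
    """Plain Bloom filter: k hash positions over an m-bit array."""

    def __init__(self, m_bits: int, k_hashes: int):
        self.m = m_bits
        self.k = k_hashes
        self.bits = bytearray((m_bits + 7) // 8)

    def _positions(self, item: str):
        # Derive k positions by salting a single hash function.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item: str):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item: str) -> bool:
        return all(self.bits[p // 8] >> (p % 8) & 1 for p in self._positions(item))

# A Bin Bloom Filter, as described above, keeps several such filters ("bins")
# and routes each word to the bin chosen for its spam weight.
spam_bins = {"high": BloomFilter(1 << 12, 5), "low": BloomFilter(1 << 12, 5)}
spam_bins["high"].add("lottery")
spam_bins["low"].add("newsletter")
print("lottery" in spam_bins["high"])   # True
print("invoice" in spam_bins["high"])   # False with high probability
```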

Genetic Variants and Atherosclerosis

Atherosclerosis is a condition in which an artery wall thickens as the result of a build-up of fatty materials such as cholesterol. It is a syndrome affecting arterial blood vessels: a chronic inflammatory response in the walls of arteries, in large part due to the accumulation of macrophage white blood cells, promoted by low-density (especially small-particle) lipoproteins (plasma proteins that carry cholesterol and triglycerides) without adequate removal of fats and cholesterol from the macrophages by functional high-density lipoproteins (HDL). It is commonly referred to as a hardening or furring of the arteries and is caused by the formation of multiple plaques within the arteries.

A New Approach for Classifying Large Number of Mixed Variables

The issue of classifying objects into one of several predefined groups when the measured variables are of mixed types has been of interest among statisticians for many years. Several methods for dealing with this situation have been introduced, including parametric, semi-parametric, and nonparametric approaches. This paper discusses the problem of classifying data when the number of measured mixed variables is larger than the sample size. A proposed approach that integrates dimensionality reduction via principal component analysis with a discriminant function based on the location model is discussed. The study aims to offer practitioners another potential tool for classification problems in which the observed variables are mixed and very numerous.
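
A sketch of the two-stage idea (reduce with PCA, then build a discriminant rule on the scores) is shown below on synthetic mixed data. The location model itself is not available in scikit-learn, so ordinary linear discriminant analysis stands in for it purely to illustrate the pipeline; the sample size, dimensions, and data are all made up.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n, p = 40, 200                       # fewer samples than (mixed) variables
y = rng.integers(0, 2, n)            # two predefined groups
# Mixed predictors: continuous columns plus binary columns, coded numerically.
X_cont = rng.normal(size=(n, 150)) + y[:, None] * 0.5
X_bin = (rng.random((n, 50)) < (0.3 + 0.3 * y[:, None])).astype(float)
X = np.hstack([X_cont, X_bin])

# Stage 1: project onto a handful of principal components (q << n).
Z = PCA(n_components=5).fit_transform(X)

# Stage 2: build the discriminant rule on the reduced scores
# (LDA here is only a stand-in for the location-model discriminant).
clf = LinearDiscriminantAnalysis().fit(Z, y)
print("training accuracy:", clf.score(Z, y))
```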

New Proxy Signatures Preserving Privacy and as Secure as ElGamal Signatures

A digital signature is a useful primitive for attaining integrity and authenticity in various wired or wireless communications. A proxy signature is one type of digital signature: it allows a proxy signer to sign messages on behalf of the original signer. It is very useful when the original signer (e.g., the president of a company) is not available to sign a specific document. If the original signer cannot forge valid proxy signatures by impersonating the proxy signer, the scheme is robust in a virtual environment, since the original signer cannot shift any illegal action initiated by herself onto the proxy signer. In this paper, we propose a new proxy signature scheme. The new scheme prevents the original signer from impersonating the proxy signer to sign messages. The proposed scheme is based on the regular ElGamal signature. In addition, fair privacy of the proxy signer is maintained: the privacy of the proxy signer is preserved, and it can be revealed when necessary.
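
For readers unfamiliar with the underlying primitive, here is a textbook ElGamal signature in Python with a deliberately tiny prime. This is an illustration only (not secure, and not the proxy construction proposed in the paper).

```python
import hashlib
import random
from math import gcd

# Toy public parameters; real deployments use large primes or standard groups.
p = 467          # small prime, for illustration only
g = 2            # generator modulo p

def H(message: bytes) -> int:
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % (p - 1)

def keygen():
    x = random.randrange(2, p - 2)        # private key
    return x, pow(g, x, p)                # (private x, public y = g^x mod p)

def sign(x: int, message: bytes):
    while True:
        k = random.randrange(2, p - 2)
        if gcd(k, p - 1) == 1:            # k must be invertible mod p-1
            break
    r = pow(g, k, p)
    s = (H(message) - x * r) * pow(k, -1, p - 1) % (p - 1)
    return r, s

def verify(y: int, message: bytes, sig) -> bool:
    r, s = sig
    if not 0 < r < p:
        return False
    # Check g^H(m) == y^r * r^s (mod p).
    return pow(g, H(message), p) == pow(y, r, p) * pow(r, s, p) % p

x, y = keygen()
msg = b"delegate: sign contract #42 on my behalf"
assert verify(y, msg, sign(x, msg))
```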

Influence of Type of Burner on NOx Emission Characteristics from Combustion of Palm Methyl Ester

Palm methyl ester (PME) is one of the alternative biomass fuels to liquid fossil fuels. To investigate the combustion characteristics of PME as an alternative fuel for gas turbines, combustion experiments were performed under atmospheric pressure using two types of burners. One burner has a configuration that produces a strong non-premixed flame, whereas the other has a configuration that promotes prevaporization of fuel droplets. The results show that NOx emissions can be reduced, without accumulation of soot, by employing the latter burner when PME is used as the fuel. A burner configuration promoting prevaporization of fuel droplets is therefore recommended for PME.

Efficient Pipelined Hardware Implementation of RIPEMD-160 Hash Function

In this paper an efficient implementation of the RIPEMD-160 hash function is presented. Hash functions are a special family of cryptographic algorithms used in technological applications with requirements for security, confidentiality, and validity. Applications such as PKI, IPSec, DSA, and MACs incorporate hash functions and are widely used today. RIPEMD-160 emerged from the need for algorithms that are very strong against cryptanalysis. The proposed hardware implementation can be synthesized easily for a variety of FPGA and ASIC technologies. Simulation results, obtained using commercial tools, verified the efficiency of the implementation in terms of performance and throughput. Special care has been taken so that the proposed implementation does not introduce extra design complexity, while functionality is kept at the required levels.
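
A software reference is handy for checking the digests produced by such a hardware pipeline. The sketch below uses Python's hashlib, which exposes RIPEMD-160 only when the underlying OpenSSL build provides it; the test vectors are from the RIPEMD-160 specification.

```python
import hashlib

def ripemd160(data: bytes) -> str:
    # hashlib.new raises ValueError if the OpenSSL build lacks RIPEMD-160.
    h = hashlib.new("ripemd160")
    h.update(data)
    return h.hexdigest()

# Known test vectors from the RIPEMD-160 specification.
assert ripemd160(b"") == "9c1185a5c5e9fc54612808977ee8f548b2258d31"
print(ripemd160(b"abc"))   # 8eb208f7e05d987a9b044a8e98c6b087f15a0bfc
```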

Evaluation of a Bio-Mechanism by Graphed Static Equilibrium Forces

The unique structural configuration of the human foot allows easy walking; similar movement is hard to imitate even for an ape. It is evident that human ambulation is related to the structure of the foot itself. Suppose the bones are represented as vertices and the joints as edges. This leads to the development of a special graph that represents the human foot. On a footprint there are points of contact with the ground, which involve specific vertices. Theoretically, for an ideal ambulation, these points provide reactions onto the ground, i.e., the static equilibrium forces. They are arranged in sequence in the form of a path, and the ambulating footprint follows this path. Combining the human foot graph with this path yields a representation that describes the profile of an ideal ambulation. This profile cites the locations where the points of contact experience normal reaction forces and highlights the significance of these points.
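
The graph-and-path idea can be made concrete with a toy model: bones as vertices, joints as edges, and the ordered points of contact as a path checked against the graph. The vertex labels and the contact sequence below are illustrative placeholders, not the paper's actual foot graph.

```python
# Toy model in the spirit described above: bones as vertices, joints as edges.
foot_graph = {
    "heel": ["midfoot"],
    "midfoot": ["heel", "ball"],
    "ball": ["midfoot", "big_toe"],
    "big_toe": ["ball"],
}

# Points of contact in the order they meet the ground during one step; for an
# ideal ambulation these are the vertices where the static equilibrium
# (ground reaction) forces act.
contact_path = ["heel", "midfoot", "ball", "big_toe"]

def is_path(graph: dict, path: list) -> bool:
    """Check that consecutive contact points are joined by an edge (a joint)."""
    return all(b in graph[a] for a, b in zip(path, path[1:]))

print(is_path(foot_graph, contact_path))  # True for this toy model
```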

Convergence and Divergence in Telephone Conversations: A Case of Persian

People usually have a "telephone voice", which means they adjust their speech to fit particular situations and to blend in with other interlocutors. The question is: do we speak differently to different people? This possibility has been suggested by social psychologists within Accommodation Theory [1]. Converging toward the speech of another person can be regarded as a polite speech strategy, while choosing a language not used by the other interlocutor can be considered the clearest example of speech divergence [2]. The present study sets out to investigate such processes in the course of everyday telephone conversations. Using Joos's [3] model of formality in spoken English, the researchers explore convergence to or divergence from the addressee. The results show that lexical choice, and subsequently patterns of style, vary intriguingly in accordance with the person being addressed.

Quality Estimation of Video Transmitted over an Additive WGN Channel Based on Digital Watermarking and Wavelet Transform

This paper presents an evaluation of a wavelet-based digital watermarking technique used to estimate the quality of video sequences transmitted over an Additive White Gaussian Noise (AWGN) channel in terms of a classical objective metric, Peak Signal-to-Noise Ratio (PSNR), without the need for the original video. In this method, a watermark is embedded into the Discrete Wavelet Transform (DWT) domain of the original video frames using a quantization method. The degradation of the extracted watermark can be used to estimate the video quality in terms of PSNR with good accuracy. We calculated the PSNR of video frames contaminated with AWGN and compared the values with those estimated using the watermarking/DWT-based approach. The calculated and estimated quality measures of the video frames are highly correlated, suggesting that this method can provide a good quality measure for video frames transmitted over an AWGN channel without the need for the original video.
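
The reference metric being estimated is the classical full-reference PSNR. A short sketch follows, using a synthetic frame and additive Gaussian noise in place of a real AWGN channel; in the paper's method the same degradation would instead be inferred from the extracted watermark rather than from the original frame.

```python
import numpy as np

def psnr(reference: np.ndarray, distorted: np.ndarray, peak: float = 255.0) -> float:
    """Classical full-reference PSNR in dB."""
    mse = np.mean((reference.astype(np.float64) - distorted.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (144, 176)).astype(np.float64)   # QCIF-sized toy frame
noisy = frame + rng.normal(0.0, 10.0, frame.shape)            # AWGN with sigma = 10
print(round(psnr(frame, noisy), 2))                           # roughly 28 dB
```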

CAPWAP Status and Design Considerations for Seamless Roaming Support

Wireless LAN technologies have gained momentum in recent years due to their ease of deployment, cost, and availability. The era of the wireless LAN has also given rise to applications such as VoIP, IPTV, and unified messaging. However, these real-time applications are very sensitive to network and handoff latencies. To successfully support them, seamless roaming during the movement of a mobile station has become crucial. Nowadays, centralized architecture models support roaming in WLANs; they have the ability to manage, control, and troubleshoot large-scale WLAN deployments. This model is managed by the Control and Provisioning of Wireless Access Points (CAPWAP) protocol. This paper covers the CAPWAP architectural solution along with the proposals that have emerged around it. Based on the literature survey conducted in this paper, we find that the algorithms proposed to reduce roaming latency in the CAPWAP architecture do not support seamless roaming and are not sufficient during the initial period of the network. The paper also suggests important design considerations for mobility support in future centralized IEEE 802.11 networks.

Pedestrian Areas and Sustainable Development

Transportation is one of the most fundamental challenges of urban development in the contemporary world. At the same time, sustainable urban development has received tremendous public attention in the last few years. This trend, in addition to other factors such as energy cost, environmental concerns, traffic congestion, and the feeling of a lack of belonging, has contributed to the development of pedestrian areas. The purpose of this paper is to study the role of walkable streets in the sustainable development of cities. Accordingly, documentary research based on valid sources has been used to substantiate this study. The findings demonstrate that walking can lead to sustainable urban development in its physical, social, environmental, cultural, economic, and political aspects. Pedestrian areas, which are the main context of walking, act as focal points of development in cities and have a great effect on modifying and stimulating their adjacent urban spaces.

Recursive Similarity Hashing of Fractal Geometry

A new technique of topological multi-scale analysis is introduced. By performing clustering recursively to build a hierarchy, and analyzing the co-scale and intra-scale similarities, an Iterated Function System (IFS) can be extracted from any data set. The study of fractals shows that this method is efficient at extracting self-similarities and can find elegant solutions to the inverse problem of building fractals. The theoretical aspects and practical implementations are discussed, together with examples of analyses of simple fractals.
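
To illustrate the object being recovered, the sketch below renders a simple known IFS (the Sierpinski triangle) with the chaos game. The inverse problem addressed in the paper is the reverse direction: extracting such contraction maps from data.

```python
import random

# Forward rendering of a simple IFS via the chaos game: repeatedly apply a
# randomly chosen contraction map and collect the visited points.
maps = [
    lambda x, y: (0.5 * x, 0.5 * y),
    lambda x, y: (0.5 * x + 0.5, 0.5 * y),
    lambda x, y: (0.5 * x + 0.25, 0.5 * y + 0.5),
]

x, y = 0.0, 0.0
points = []
for i in range(20000):
    x, y = random.choice(maps)(x, y)
    if i > 20:                      # discard the transient before the attractor
        points.append((x, y))

print(len(points), "attractor samples; e.g.", points[:2])
```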

Study of Compaction in Hot-Mix Asphalt Using Computer Simulations

During the process of compaction in Hot-Mix Asphalt (HMA) mixtures, the distance between aggregate particles decreases as they come together and eliminate air voids. By measuring the inter-particle distances in a cut section of an HMA sample, the degree of compaction can be estimated. For this purpose, a calibration curve is generated by computer simulation when the gradation and asphalt content of the HMA mixture are known. A two-dimensional cross-section of an HMA specimen was simulated using the mixture design information (gradation, asphalt content, and air-void content). Nearest-neighbor distance methods such as Delaunay triangulation were used to study the changes in inter-particle distance and area distribution during the compaction of HMA. Such computer simulations make it possible to perform several hundred repetitions in a short period of time, without the need to compact and analyze laboratory specimens, in order to obtain good statistics on the defined parameters. The distributions of the statistical parameters obtained from the computer simulations showed trends similar to those of laboratory specimens.
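
A minimal sketch of the measurement step is given below, using SciPy's Delaunay triangulation on synthetic particle centres. The particle count, section size, and the way compaction is mimicked (a simple vertical shrink) are assumptions for illustration only, not the paper's simulation procedure.

```python
import numpy as np
from scipy.spatial import Delaunay

# Toy 2-D cross-section: aggregate particle centres in a "loose" state and a
# "compacted" state mimicked by reducing the section height by 8%.
rng = np.random.default_rng(42)
centres = rng.random((300, 2)) * [150.0, 100.0]   # mm, loose state
compacted = centres * [1.0, 0.92]

def mean_edge_length(points: np.ndarray) -> float:
    """Mean length of Delaunay edges, a nearest-neighbour distance statistic."""
    tri = Delaunay(points)
    edges = set()
    for simplex in tri.simplices:
        for i in range(3):
            a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
            edges.add((a, b))
    lengths = [np.linalg.norm(points[a] - points[b]) for a, b in edges]
    return float(np.mean(lengths))

# Inter-particle spacing shrinks as the section compacts.
print(mean_edge_length(centres), mean_edge_length(compacted))
```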

Mathematical Correlation for Brake Thermal Efficiency and NOx Emission of CI Engine using Ester of Vegetable Oils

The aim of this study is to develop mathematical relationships for the performance parameter brake thermal efficiency (BTE) and the emission parameter nitrogen oxides (NOx) for various esters of vegetable oils used as CI engine fuels. BTE is an important performance parameter defining the ability of the engine to utilize the energy supplied and the power developed; it is likewise an indication of the efficiency of the fuels used. The esters of cottonseed oil, soybean oil, jatropha oil, and hingan oil are prepared using a transesterification process and characterized for their physical and main fuel properties, including viscosity, density, flash point, and higher heating value, using standard test methods. These esters are tested as CI engine fuels to analyze the performance and emission parameters in comparison with diesel. The results of the study indicate that the esters do not differ greatly from diesel in their properties. The CI engine performance with esters as fuel is in line with that of diesel, whereas the emission parameters are reduced with the use of esters. A correlation is developed between BTE and brake power (BP), gross calorific value (CV), air-fuel ratio (A/F), and the heat carried away by cooling water (HCW). Another equation is developed between the NOx emission and CO, HC, smoke density (SD), and exhaust gas temperature (EGT). The equations are verified by comparing the observed and calculated values, which gives coefficients of correlation of 0.99 and 0.96 for the BTE and NOx equations, respectively.
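
The abstract does not give the functional form of the correlations, so the sketch below simply fits a linear relation of BTE to BP, CV, A/F, and HCW by least squares and reports the coefficient of correlation between observed and calculated values. All numbers are synthetic placeholders, not the paper's measurements.

```python
import numpy as np

# Hypothetical observations: columns are BP (kW), CV (kJ/kg), A/F, HCW (kW).
X = np.array([[1.0, 37600.0, 18.2, 2.1],
              [2.0, 37650.0, 17.5, 2.6],
              [3.0, 37700.0, 16.9, 3.0],
              [4.0, 37750.0, 16.2, 3.5],
              [5.0, 37800.0, 15.6, 3.9],
              [5.5, 37820.0, 15.3, 4.1]])
bte_observed = np.array([14.0, 19.5, 23.8, 26.9, 28.7, 29.3])   # BTE (%)

A = np.column_stack([np.ones(len(bte_observed)), X])     # intercept + predictors
coeffs, *_ = np.linalg.lstsq(A, bte_observed, rcond=None)

bte_calculated = A @ coeffs
r = np.corrcoef(bte_observed, bte_calculated)[0, 1]      # coefficient of correlation
print(coeffs.round(4), round(r, 3))
```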