Abstract: Steganography is the process of hiding one file inside another such that others can neither identify the meaning of the embedded object nor even recognize its existence. Current trends favor using digital image files as the cover file to hide another digital file that contains the secret message or information. One of the most common methods of implementation is Least Significant Bit Insertion, in which the least significant bit of every byte is altered to form the bit-string representing the embedded file. Altering the LSB causes only minor changes in color, and thus is usually not noticeable to the human eye. While this technique works well for 24-bit color image files, steganography has not been as successful with 8-bit color image files, due to limitations in color variation and the use of a colormap. This paper presents the results of research investigating the combination of image compression and steganography. The technique developed starts with a 24-bit color bitmap file, then compresses the file by organizing and optimizing an 8-bit colormap. After compression, a text message is hidden in the final, compressed image. Results indicate that the final technique has the potential to be useful in the steganographic world.
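The LSB-insertion idea described above can be sketched in a few lines. This is a minimal illustration that operates on a raw byte buffer standing in for bitmap pixel data; it is not the paper's compressed-colormap technique, and the function names are chosen only for this sketch.

```python
# Minimal LSB-insertion sketch: each bit of the message replaces the least
# significant bit of one cover byte, so no byte changes by more than 1.

def embed(pixels: bytearray, message: bytes) -> bytearray:
    """Write each bit of `message` (MSB first) into the LSBs of successive bytes."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for message")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # clear the LSB, then set it to the message bit
    return out

def extract(pixels: bytearray, n_bytes: int) -> bytes:
    """Read back `n_bytes` bytes from the LSBs, MSB first."""
    msg = bytearray()
    for i in range(n_bytes):
        byte = 0
        for bit_index in range(8):
            byte = (byte << 1) | (pixels[i * 8 + bit_index] & 1)
        msg.append(byte)
    return bytes(msg)

cover = bytearray(range(200))   # stand-in for 24-bit bitmap pixel bytes
stego = embed(cover, b"hi")
recovered = extract(stego, 2)
```

Because only the last bit of each byte can change, the per-channel color error is at most 1 out of 255, which is the imperceptibility argument the abstract makes.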
Abstract: For stochastic networks whose activities have uncertain durations, securing the network completion time becomes problematic, not only because each node has a non-identical duration pdf, but also because the network paths are interdependent. As evidenced by Adlakha & Kulkarni [1], many methods and algorithms have been put forward in an attempt to resolve this issue, but most have encountered the same large-network problem. In this research, we therefore focus on network reduction through a combined series/parallel mechanism. Our suggested algorithm, the Activity Network Reduction Algorithm (ANRA), can efficiently transform a large network into a Series/Parallel Irreducible Network (SPIN). SPIN can enhance stochastic network analysis, as well as serve as a test of symmetry in graph theory.
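The paper defines ANRA itself; as a generic illustration of the series/parallel mechanism it builds on, the following sketch repeatedly merges parallel edges between the same node pair and splices out series nodes (one predecessor, one successor). The edge-list representation and the label-combining scheme are assumptions made only for this sketch.

```python
# Illustrative series/parallel network reduction (not the paper's ANRA itself).
# Edges are (u, v, label) tuples; labels record which activities were combined.

from collections import Counter

def reduce_sp(edges, source, sink):
    edges = list(edges)
    changed = True
    while changed:
        changed = False
        # (a) parallel merge: collapse duplicate (u, v) pairs into one edge
        seen, merged = {}, []
        for u, v, lab in edges:
            if (u, v) in seen:
                i = seen[(u, v)]
                merged[i] = (u, v, f"({merged[i][2]}|{lab})")
                changed = True
            else:
                seen[(u, v)] = len(merged)
                merged.append((u, v, lab))
        edges = merged
        # (b) series merge: splice out an interior node with in/out degree 1
        indeg = Counter(v for _, v, _ in edges)
        outdeg = Counter(u for u, _, _ in edges)
        for node in set(indeg) | set(outdeg):
            if node in (source, sink):
                continue
            if indeg[node] == 1 and outdeg[node] == 1:
                (u, _, l1) = next(e for e in edges if e[1] == node)
                (_, w, l2) = next(e for e in edges if e[0] == node)
                edges = [e for e in edges if node not in (e[0], e[1])]
                edges.append((u, w, f"({l1}.{l2})"))
                changed = True
                break
    return edges

# a triangle network: path s->a->t in series, in parallel with edge s->t
result = reduce_sp([("s", "a", "A"), ("a", "t", "B"), ("s", "t", "C")], "s", "t")
```

A fully series/parallel network collapses to a single source-to-sink edge; whatever remains when no more merges apply is the irreducible core (the SPIN in the paper's terminology).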
Abstract: The main goal of this article is to find efficient methods for elemental and molecular analysis of living microorganisms (algae) under defined environmental conditions and cultivation processes. The overall knowledge of chemical composition is obtained using laser-based techniques: Laser-Induced Breakdown Spectroscopy (LIBS) for acquiring information about elemental composition, and Raman Spectroscopy for gaining molecular information. Algal cells were suspended in liquid media and characterized using their spectra. Results obtained employing LIBS and Raman Spectroscopy will help to elucidate algae biology (nutrition dynamics depending on cultivation conditions) and to identify algal strains with potential applications in metal-ion absorption (bioremediation) and the biofuel industry. Moreover, bioremediation can be readily combined with the production of third-generation biofuels. In order to use algae for efficient fuel production, the optimal cultivation parameters have to be determined, leading to high oil production in selected cells without significant inhibition of photosynthetic activity and culture growth rate; for example, it is necessary to distinguish conditions favoring algal strains containing high amounts of highly unsaturated fatty acids. Measurements employing LIBS and Raman Spectroscopy were used to characterize the alga Trachydiscus minutus, with emphasis on the lipid content inside the algal cell and on the ability of the algae to withdraw nutrients from the environment (elemental composition, relevant to bioremediation). This article can serve as a reference for further efforts to describe the complete chemical composition of algal samples employing laser-ablation techniques.
Abstract: A case study of the generation scheduling optimization of multiple hydroplants on the Yuan River Basin in China is reported in this paper. To account for the uncertainty of the inflows, the long/mid-term generation scheduling (LMTGS) problem is solved by a stochastic model in which the inflows are treated as stochastic variables. For the short-term generation scheduling (STGS) problem, a constraint violation priority is defined in case not all constraints can be satisfied. Given stage-wise separability and low dimensionality, the hydroplant-based operational region schedules (HBORS) problem is solved by dynamic programming (DP). The coordination of LMTGS and STGS is presented as well. The feasibility and effectiveness of the models and solution methods are verified by numerical results.
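The HBORS formulation itself is in the paper; the stage-wise separability that makes DP applicable can be illustrated with a toy backward recursion over discretized storage levels. The state set, inflow series, release options and linear reward below are all hypothetical stand-ins, not the paper's model.

```python
# Schematic stage-wise dynamic program for a single hydroplant: stages are
# time periods, states are discretized storage levels, the reward is a
# hypothetical generation function of the release decision.

def solve_dp(n_stages, states, inflow, release_options, reward):
    """Backward DP: value[s] = best total reward from the current stage onward."""
    value = {s: 0.0 for s in states}
    policy = []
    for t in reversed(range(n_stages)):
        new_value, stage_policy = {}, {}
        for s in states:
            best = None
            for r in release_options:
                s_next = s + inflow[t] - r   # water balance for this stage
                if s_next not in value:      # infeasible storage transition
                    continue
                total = reward(r) + value[s_next]
                if best is None or total > best[0]:
                    best = (total, r)
            new_value[s] = best[0] if best else float("-inf")
            stage_policy[s] = best[1] if best else None
        value = new_value
        policy.append(stage_policy)
    policy.reverse()
    return value, policy

value, policy = solve_dp(n_stages=2, states={0, 1, 2, 3}, inflow=[1, 1],
                         release_options=[0, 1, 2], reward=lambda r: r)
```

The recursion works precisely because each stage's reward depends only on that stage's state and decision, which is the separability condition the abstract invokes.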
Abstract: To study the efficacy of green manure applied before chickpea planting, field experiments were carried out in the 2007 and 2008 growing seasons. In this research the effects of different soil fertilization strategies on grain yield and yield components, minerals, organic compounds and cooking time of chickpea were investigated. Experimental units were arranged in split-split plots based on randomized complete blocks with three replications. Main plots consisted of the green manure levels, (G1): establishing a mixed vegetation of Vicia panunica and Hordeum vulgare, and (G2): control. Five strategies for supplying the base fertilizer requirement were assigned to the sub plots: (N1): 20 t.ha-1 farmyard manure; (N2): 10 t.ha-1 compost; (N3): 75 kg.ha-1 triple superphosphate; (N4): 10 t.ha-1 farmyard manure + 5 t.ha-1 compost; and (N5): 10 t.ha-1 farmyard manure + 5 t.ha-1 compost + 50 kg.ha-1 triple superphosphate. Furthermore, four levels of biofertilizers were arranged in the sub-sub plots: (B1): Bacillus lentus + Pseudomonas putida; (B2): Trichoderma harzianum; (B3): Bacillus lentus + Pseudomonas putida + Trichoderma harzianum; and (B4): control (without biofertilizers). Results showed that integrating biofertilizers (B3) and green manure (G1) produced the highest grain yield. The highest yields were obtained with the G1×N5 interaction. Comparison of all two-way and three-way interactions showed G1N5B3 to be the superior treatment. The significant increase of N, P2O5, K2O, Fe and Mg content in leaves and grains underscores the superiority of this treatment, because each of these nutrients has an established role in chlorophyll synthesis and the photosynthetic capacity of crops. The combined application of compost, farmyard manure and chemical phosphorus (N5), in addition to giving the highest yield, produced the best grain quality owing to high protein, starch and total sugar contents, low crude fiber and reduced cooking time.
Abstract: Hepatitis B and hepatitis C are among the most significant hepatic infections worldwide and may lead to hepatocellular carcinoma. This study was performed for the first time at the blood transfusion centre of Omar hospital, Lahore. It aims to determine the sero-prevalence of these diseases by screening apparently healthy blood donors, who might be carriers of HBV or HCV and pose a high transmission risk. It also compares the sensitivity of two diagnostic tests: a chromatographic immunoassay (one-step test device) and the Enzyme-Linked Immunosorbent Assay (ELISA). Blood serum of 855 apparently healthy blood donors was screened for Hepatitis B surface antigen (HBsAg) and for anti-HCV antibodies. SPSS version 12.0 and the χ2 (Chi-square) test were used for statistical analysis. The sero-prevalence of HCV was 8.07% by the device method and 9.12% by ELISA, and that of HBV was 5.6% by the device and 6.43% by ELISA. The unavailability of vaccination against HCV makes it more prevalent. Comparing the two diagnostic methods, ELISA proved to be more sensitive.
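The abstract reports the rates but not the underlying contingency counts; as an illustration of the Chi-square analysis it mentions, the statistic for a 2×2 table can be computed directly. The counts below are reconstructed from the reported HCV rates (8.07% and 9.12% of 855 donors) and are assumptions, not the study's actual table.

```python
# Pearson chi-square statistic for a 2x2 table, pure stdlib.

def chi_square_2x2(table):
    """table = [[a, b], [c, d]] of observed counts; returns the chi-square statistic."""
    (a, b), (c, d) = table
    n = a + b + c + d
    # expected count for each cell = row total * column total / n
    expected = [[(a + b) * (a + c) / n, (a + b) * (b + d) / n],
                [(c + d) * (a + c) / n, (c + d) * (b + d) / n]]
    observed = [[a, b], [c, d]]
    return sum((observed[i][j] - expected[i][j]) ** 2 / expected[i][j]
               for i in range(2) for j in range(2))

# rows: device test, ELISA; columns: HCV positive, negative (illustrative counts)
table = [[69, 786], [78, 777]]
stat = chi_square_2x2(table)  # ~0.60, below the 3.84 cutoff at alpha = 0.05 (1 df)
```

With these reconstructed counts the difference between the two methods would not reach significance at the 5% level; the study's own conclusion rests on its actual data.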
Abstract: This interdisciplinary research aims to distinguish universal scale-free and field-like fundamental principles of self-organization observable across many disciplines, such as computer science, neuroscience, microbiology and social science. Based on these universal principles we provide basic premises and postulates for designing holistic social simulation models. We also introduce the pervasive information field (PIF) concept, which serves as a simulation medium for contextual information storage, dynamic distribution and organization in complex social networks. The PIF concept specifically targets field-like, uncoupled and indirect interactions among social agents capable of affecting and perceiving broadcast contextual information. The proposed approach is expressive enough to represent broadcast contextual information in a form locally accessible and immediately usable by network agents. This paper offers a prospective vision of how a system's resources (tangible and intangible) could be simulated as oscillating processes immersed in the all-pervasive information field.
Abstract: Hydrogen diffusion is the main cause of corrosion fatigue in corrosive environments. To analyze the phenomenon, it is necessary to understand the behavior of hydrogen during diffusion. Predicting hydrogen embrittlement, as a principal corrosive contribution to fracture, requires solving combinations of different equations mathematically. The key to obtaining the governing equation is knowledge of the source that causes diffusion and drives atoms into the material, called the driving force; it is produced by a gradient of either electrical or chemical potential. In this work, we consider the gradient of chemical potential to obtain the property equation. During diffusion some atoms may be trapped, but under certain conditions trapping can be neglected. In line with the phenomenon of hydrogen embrittlement, the thermodynamic and chemical properties of hydrogen are considered in order to relate them to fracture mechanics. In particular, a stress intensity factor is obtained by using fugacity as a property of hydrogen or other gases. Although the diffusive behavior and the embrittlement mechanism are common to other gases as well, for clarity we describe them for hydrogen. Treating a definite gas in this way helps clarify the importance of this relation.
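The chemical-potential driving force mentioned above takes its standard textbook form: writing the diffusive flux in terms of the gradient of chemical potential and assuming an ideal (dilute) solution recovers Fick's first law (this is the general relation, not the paper's specific derivation):

```latex
J = -\frac{D\,c}{RT}\,\nabla\mu,
\qquad
\mu = \mu_0 + RT\ln c
\;\Rightarrow\;
J = -D\,\nabla c
```

Here $J$ is the flux, $D$ the diffusivity, $c$ the concentration and $\mu$ the chemical potential; substituting $\nabla\mu = RT\,\nabla c / c$ into the first relation gives the second.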
Abstract: In an emergency, combining wireless sensor network data with knowledge gathered from various other information sources and with navigation algorithms could help safely guide people to a building exit while avoiding the risky areas. This paper presents an emergency response and navigation support architecture for data gathering, knowledge manipulation, and navigational support in an emergency situation. In the normal state, the system monitors the environment. When an emergency event is detected, the system sends messages to first responders and immediately separates the risky areas from the safe areas in order to establish escape paths. The main functionalities of the system include gathering data from a wireless sensor network deployed in a multi-story indoor environment, processing it with information available in a knowledge base, and sharing the resulting decisions with first responders and people in the building. The proposed architecture will help reduce the risk of losing human lives by evacuating people much faster and with the least congestion in an emergency environment.
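The escape-path idea can be illustrated with a shortest-path search over a floor graph in which nodes flagged as risky incur a heavy penalty. The floor layout, node names, weights and penalty value below are hypothetical; the paper's knowledge-base reasoning is richer than this sketch.

```python
# Illustrative escape-path computation: Dijkstra to the nearest exit,
# with edges into risky areas penalized so safe routes are preferred.

import heapq

def escape_path(graph, start, exits, risky, penalty=1000.0):
    """Return the cheapest path from `start` to any node in `exits`, or None."""
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        if node in exits:
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return list(reversed(path))
        for neigh, w in graph.get(node, []):
            cost = d + w + (penalty if neigh in risky else 0.0)
            if cost < dist.get(neigh, float("inf")):
                dist[neigh], prev[neigh] = cost, node
                heapq.heappush(heap, (cost, neigh))
    return None  # no reachable exit

floor = {
    "room101": [("corridorA", 1.0)],
    "corridorA": [("stairN", 2.0), ("stairS", 2.0)],
    "stairN": [("exitN", 1.0)],
    "stairS": [("exitS", 1.0)],
}
# with the north stair marked risky, the southern route is chosen
path = escape_path(floor, "room101", {"exitN", "exitS"}, risky={"stairN"})
```

In the architecture described above, the `risky` set would be updated continuously from the sensor data and knowledge base as the emergency evolves.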
Abstract: There are many classical algorithms for finding routing in FPGAs, but using DNA computing we can solve the routes efficiently and fast. The run-time complexity of DNA algorithms is much lower than that of the classical algorithms used for solving routing in FPGAs. Research in DNA computing is still at an early stage, yet the high information density of DNA molecules and the massive parallelism of DNA reactions make DNA computing a powerful tool. Many research accomplishments have shown that any procedure that can be programmed on a silicon computer can be realized as a DNA computing procedure. In this paper we propose a two-tier approach to the FPGA routing solution. First, the geometric FPGA detailed routing task is solved by transforming it into a Boolean satisfiability equation with the property that any assignment of input variables that satisfies the equation specifies a valid routing; the absence of a satisfying assignment implies that the layout is un-routable. In the second step, a DNA search algorithm is applied to this Boolean equation to explore routing alternatives, utilizing the properties of DNA computation. The simulated results are satisfactory and indicate the applicability of DNA computing to solving the FPGA routing problem.
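The first tier above encodes routing as Boolean satisfiability. A toy version of such an encoding, checked here by brute-force enumeration standing in for the DNA search stage, can be sketched as follows; the "net on track" variable meanings are a hypothetical encoding chosen for this sketch.

```python
# Toy satisfiability check for a routing-style CNF formula.
# Each clause is a list of literals: positive k means variable k is true,
# negative k means variable k is false.

from itertools import product

def satisfying_assignments(clauses, n_vars):
    """Yield every assignment (tuple of bools for vars 1..n) satisfying all clauses."""
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in clause)
               for clause in clauses):
            yield bits

# hypothetical encoding: x1 = "net A on track 1", x2 = "net A on track 2",
#                        x3 = "net B on track 1", x4 = "net B on track 2"
clauses = [
    [1, 2],    # net A must take some track
    [3, 4],    # net B must take some track
    [-1, -3],  # nets A and B cannot share track 1
    [-2, -4],  # nets A and B cannot share track 2
]
solutions = list(satisfying_assignments(clauses, 4))
# each solution specifies a valid routing; an empty list would mean "un-routable"
```

Brute force is exponential in the number of variables, which is exactly why the paper turns to the massive parallelism of DNA reactions for the search step.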
Abstract: This paper analyses the structural changes in the education sector since the introduction of the liberalization policy in India. It explains how the so-called non-profit trusts and societies appropriated the liberalization policy and established themselves as a new capitalist class in the higher education sector. Over the decades, the policy witnessed the role of the private sector in maintaining market equilibrium. The state also witnessed the incompatibility of the private sector in inculcating the values of social justice. The most important consequence of the policy is the rise of a new capitalist class and of academic capitalism. When the state realized that it could no longer cope with market demands, it opened higher education to the private sector. Concessions and tax exemptions were provided to trusts and societies to establish higher education institutions. There is a basic difference between western countries and India in the provision of higher education by trusts and societies. In western countries, big business houses contributed their surplus revenues to promote higher education and research as a complementary service to society and nation. In India, several entrepreneurs entered the education sector with a business motive. Over time, they accumulated wealth at the cost of students and of concessions from the government. Four major results can now be identified: production of manpower in view of market demands; reduction of standards in higher education; bypassing of the values of social justice; and the rise of a new capitalist class from the business of education. This paper tries to substantiate these issues with inputs from case studies.
Abstract: Signature amortization schemes have been introduced for authenticating multicast streams, in which a single signature is amortized over several packets. The hash value of each packet is computed, and some hash values are appended to other packets, forming what is known as a hash chain. These schemes divide the stream into blocks, each block being a number of packets; the signature packet in these schemes is either the first or the last packet of the block. Amortization schemes are efficient solutions in terms of computation and communication overhead, especially in real-time environments. The main factor in the effectiveness of an amortization scheme is its hash chain construction. Some studies show that signing the first packet of each block reduces the receiver's delay and prevents DoS attacks; other studies show that signing the last packet reduces the sender's delay. To our knowledge, no studies show which is better, signing the first or the last packet, in terms of authentication probability and resistance to packet loss.
In this paper we introduce another scheme for authenticating multicast streams that is robust against packet loss, reduces the overhead, and at the same time prevents the DoS attacks experienced by the receiver. Our scheme, the Multiple Connected Chain signing the First packet (MCF) scheme, appends the hash values of specific packets to other packets, then appends some hashes to the signature packet, which is sent as the first packet in the block. This scheme is especially efficient in terms of the receiver's delay. We discuss and evaluate the performance of our proposed scheme against schemes that sign the last packet of the block.
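A minimal single-chain version of the sign-first idea can be sketched as follows: each packet carries the hash of the next one, and the signature packet, sent first, authenticates the head of the chain. This is not MCF itself (which connects multiple chains for loss resistance); the "signature" below is a keyed-hash stand-in for a real asymmetric signature.

```python
# Minimal signature-amortization sketch in the spirit of signing the first
# packet of a block: one hash chain built back-to-front, signature on its head.

import hashlib

def build_block(payloads, sign):
    """Chain packets back-to-front, then sign the hash of the first packet."""
    packets, next_hash = [], b""
    for payload in reversed(payloads):
        packet = payload + next_hash            # append the hash of the following packet
        next_hash = hashlib.sha256(packet).digest()
        packets.append(packet)
    packets.reverse()
    signature = sign(hashlib.sha256(packets[0]).digest())
    return signature, packets

def verify_block(signature, packets, check_sig):
    """Verify the signature on the first packet, then walk the chain forward."""
    if not check_sig(hashlib.sha256(packets[0]).digest(), signature):
        return False
    for i in range(len(packets) - 1):
        expected = hashlib.sha256(packets[i + 1]).digest()
        if not packets[i].endswith(expected):
            return False
    return True

# stand-in "signature": a real scheme would use e.g. RSA or ECDSA here
sign = lambda digest: hashlib.sha256(b"key" + digest).digest()
check = lambda digest, sig: sig == sign(digest)

sig, pkts = build_block([b"p1", b"p2", b"p3"], sign)
ok = verify_block(sig, pkts, check)
```

With the signature arriving first, the receiver can authenticate each packet as soon as its predecessor is verified, which is the low receiver-delay property the abstract claims for sign-first schemes.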
Abstract: In this paper, an efficient method for personal identification based on the pattern of the human iris is proposed. It is composed of image acquisition, image preprocessing that unwraps the iris into a flat representation, conversion into an eigeniris, and a decision stage that uses only a one-dimensional reduction of the iris. By comparing the eigenirises it is determined whether two irises are similar. The results show that the proposed method is quite effective.
Abstract: The mathematical equations for the separation of a binary aqueous solution are developed by using the Spiegler-Kedem theory. The characteristics of a B-9 hollow fibre module of Du Pont are determined by using these equations, and the results are compared with the experimental results of Ohya et al. The agreement between these results is found to be excellent.
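For reference, the Spiegler-Kedem relations underlying such separation equations, in their commonly used form (this is the textbook statement of the theory, not the paper's specific derivation):

```latex
J_v = L_p\left(\Delta P - \sigma\,\Delta\pi\right),
\qquad
J_s = P_s\,\Delta c + (1-\sigma)\,J_v\,\bar{c}
```

with the real rejection obtained by integrating across the membrane:

```latex
R = \frac{\sigma\,(1 - F)}{1 - \sigma F},
\qquad
F = \exp\!\left(-\frac{(1-\sigma)\,J_v}{P_s}\right)
```

Here $J_v$ is the volume flux, $J_s$ the solute flux, $L_p$ the hydraulic permeability, $\sigma$ the reflection coefficient, $P_s$ the solute permeability and $\bar{c}$ the mean solute concentration in the membrane.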
Abstract: This paper presents an evolutionary method for designing electronic circuits and numerical methods associated with monitoring systems. The instruments described here have been used in studies of weather and climate changes due to global warming, and also in medical patient supervision. Genetic Programming systems have been used both for designing circuits and sensors and for determining sensor parameters. The authors advance the thesis that the software side of such a system should be written in computer languages with a strong mathematical and logical background in order to prevent software obsolescence and achieve program correctness.
Abstract: Code mobility technologies attract more and more developers and consumers. Numerous domains are concerned, many platforms have been developed, and interesting applications have been realized. However, developing good software products requires modeling, analyzing and proving steps, and the choice of models and modeling languages is critical at these steps. Formal tools are powerful for the analyzing and proving steps; however, the poverty of classical modeling languages for modeling mobility calls for new models. The objective of this paper is to provide a specific formalism, "Coloured Reconfigurable Nets", and to show how it is adequate for modeling different kinds of code mobility.
Abstract: The performance of the Optical Code Division Multiplexing / Wavelength Division Multiplexing (OCDM/WDM) technique for an Optical Packet Switch is investigated. The impact on performance of the impairments due to both Multiple Access Interference and beat noise is studied. The Packet Loss Probability due to output packet contentions is evaluated as a function of the main switch and traffic parameters when Gold coherent optical codes are adopted. The Packet Loss Probability of the OCDM/WDM switch can reach 10^-9 when M=16 wavelengths, a Gold code of length L=511, and only 24 wavelength converters are used in the switch.
Abstract: This study investigated the suitability of a Lahar/HDPE composite as the primary material for low-cost, small-scale biogas digesters. While sources of raw materials for biogas are abundant in the Philippines, the cost of the technology has made widespread utilization of this resource an indefinite proposition. Aside from capital economics, another problem arises with the space requirements of current digester designs. These problems may be simultaneously addressed by fabricating digesters on a smaller, household scale to reach a wider market, and by using materials that allow optimization of the overall design and fabrication cost without sacrificing operational efficiency. This study involved actual fabrication of the Lahar/HDPE composite at varying composition and geometry, subsequent mechanical and thermal characterization, and statistical analysis to find intrinsic relationships between variables. From the results, the Lahar/HDPE composite was found to be feasible as a digester material from both mechanical and economic standpoints.
Abstract: This work presents a numerical simulation of the interaction of an incident shock wave, propagating from left to right, with a cone placed in a shock tube. The mathematical model is based on a non-stationary, viscous, axisymmetric flow. The discretization of the Navier-Stokes equations is carried out by the finite volume method in integral form, along with the Flux Vector Splitting method of Van Leer. An adequate combination of time-stepping parameter, CFL coefficient and mesh refinement level is selected to ensure numerical convergence. The numerical simulation considers a shock tube filled with air. The incident shock wave propagates to the right with a given Mach number and crosses the cone, leaving behind it a stationary detached shock wave in front of the nose cone. This type of interaction is observed as the flow evolves in time.
Abstract: This paper presents a predictive model of the sensor readings of a mobile robot. The model predicts the sensor readings over a given time horizon based on the current sensor readings and the wheel velocities assumed for this horizon. Similar models for such anticipation have been proposed in the literature. The novelty of the model presented in this paper comes from the fact that its structure takes into account physical phenomena and is not just a black box such as a neural network. From this point of view it may be regarded as a semi-phenomenological model. The model is developed for the Khepera robot, but after certain modifications it may be applied to any robot with distance sensors such as infrared or ultrasonic sensors.