Abstract: Video data embedding is currently a challenging and interesting approach to keeping real-time video data secure, and the technique can be implemented and used in high-level applications. The rate-distortion performance of any image is not guaranteed, because the gain provided by accurate image frame segmentation is offset by the inefficiency of coding objects of arbitrary shape, with losses that depend on both the coding scheme and the object structure. By using a rate controller in association with the encoder, the target bitrate can be adjusted dynamically. This paper discusses how to keep videos secure by mixing signature data into the original video with negligible distortion, while keeping the quality of the steganographic video as close as possible to that of the original video. We propose a method for embedding the signature data into separate video frames using the block Discrete Cosine Transform. These frames are then encoded with a real-time H.264 encoding scheme. After processing, recovery of the original video and the signature data at the receiver end is proposed.
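As an illustration only (the abstract does not give the embedding details), the sketch below shows a quantization-index-modulation style way of hiding one signature bit in a mid-frequency block-DCT coefficient of a frame; the block size, coefficient position and quantization step are assumptions, not values from the paper.

```python
import numpy as np
from scipy.fftpack import dct, idct  # 2-D DCT built from separable 1-D transforms

def embed_bit(block, bit, coeff=(4, 3), step=24.0):
    """Hide one signature bit in a mid-frequency DCT coefficient of an 8x8 block
    (illustrative quantization-index-modulation style embedding)."""
    d = dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')
    q = np.round(d[coeff] / step)
    # force the parity of the quantized coefficient to match the bit
    if int(q) % 2 != bit:
        q += 1
    d[coeff] = q * step
    return idct(idct(d, axis=1, norm='ortho'), axis=0, norm='ortho')

def extract_bit(block, coeff=(4, 3), step=24.0):
    d = dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')
    return int(np.round(d[coeff] / step)) % 2

frame_block = np.random.rand(8, 8) * 255          # stand-in for a luminance block
stego = embed_bit(frame_block, bit=1)
assert extract_bit(stego) == 1
```

In practice one such bit per block, spread over selected frames, would carry the signature data before H.264 encoding.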
Abstract: Building a service-centric business model requires new knowledge and capabilities in companies. This paper highlights the challenges small and medium-sized firms (SMEs) face when developing their service-centric business models and examines the premises for the required knowledge transfer and capability development. The objective of this paper is to increase knowledge about SMEs' transformation to service-centric business models. The paper reports an action-research-based case study and provides empirical evidence from three case companies; the empirical data was collected through multiple methods. The findings of the paper are: first, the model developed to analyze the current state in companies; second, the process of building service-centric business models; third, the selection of suitable service development methods. The lack of a holistic understanding of service logic suggests that SMEs need practical and easy-to-use methods to improve their business.
Abstract: Part IV of the Civil Code of the Russian Federation, dedicated to the legal regulation of intellectual property rights, came into force in 2008. It is the first attempt at codification in the intellectual property sphere in Russia, which is why many new norms appeared. The main problem of the Russian Civil Code (Part IV) is that many of its rules contradict the norms of international intellectual property law (i.e. protection of inventions, creations, ideas, know-how, trade secrets, innovations). Intellectual property rights protect innovations and creations and reward innovative and creative activity. Intellectual property rights are international in character and in that respect fit rather well with the economic reality of the global economy. Inventors prefer not to take out a patent for inventions because it is a very difficult procedure that takes a lot of time and is very expensive. That is why they try to protect their inventions as ideas, know-how and confidential information. An idea is the main element of any object of intellectual property (creation, invention, innovation, know-how, etc.), but ideas are not protected by the Civil Code of the Russian Federation. The aim of the paper is to reveal the main problems of the legal regulation of intellectual property in Russia and to suggest possible solutions. The authors of this paper have raised these essential issues through different activities: through a panel survey and questionnaires distributed among participants in intellectual activities, the main problems of implementing innovations and of protecting ideas and know-how were identified. The implementation of the research results will help to solve economic and legal problems of innovation, transfer of innovations and intellectual property.
Abstract: Hepatitis B and hepatitis C are among the most significant hepatic infections worldwide and may lead to hepatocellular carcinoma. This study was performed for the first time at the blood transfusion centre of Omar Hospital, Lahore. It aims to determine the sero-prevalence of these diseases by screening apparently healthy blood donors who might be carriers of HBV or HCV and pose a high risk of transmission. It also compares the sensitivity of two diagnostic tests: a chromatographic immunoassay (one-step test device) and the Enzyme-Linked Immunosorbent Assay (ELISA). Blood serum of 855 apparently healthy blood donors was screened for hepatitis B surface antigen (HBsAg) and for anti-HCV antibodies. SPSS version 12.0 and the χ² (chi-square) test were used for statistical analysis. The sero-prevalence of HCV was 8.07% by the device method and 9.12% by ELISA, and that of HBV was 5.6% by the device and 6.43% by ELISA. The unavailability of vaccination against HCV makes it more prevalent. Comparing the two diagnostic methods, ELISA proved to be more sensitive.
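For illustration, the sketch below shows how such a device-versus-ELISA comparison could be run as a chi-square test in Python; the positive counts are reconstructed approximately from the prevalences reported above, and SciPy is assumed (the abstract itself mentions only SPSS).

```python
from scipy.stats import chi2_contingency

N = 855  # screened donors (from the abstract)
# HCV-positive counts reconstructed approximately from the reported prevalences
positives = {"device": round(0.0807 * N), "ELISA": round(0.0912 * N)}

table = [[positives["device"], N - positives["device"]],
         [positives["ELISA"],  N - positives["ELISA"]]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"HCV device vs ELISA: chi2={chi2:.3f}, p={p:.3f}")
```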
Abstract: This paper describes a fast and efficient method for page segmentation of documents containing non-rectangular blocks. The segmentation is based on an edge-following algorithm using a small window of 16 by 32 pixels. This segmentation is very fast since only the border pixels of a paragraph are used, without scanning the whole page. However, the segmentation may contain errors if the space between blocks is smaller than the window used in edge following. Consequently, this paper reduces this error by first identifying the missed segmentation point using the direction information from edge following, and then applying an X-Y cut at the missed segmentation point to separate the connected columns. The advantage of the proposed method is the fast identification of missed segmentation points. This methodology is faster, with less overhead, than other algorithms that need to access many more pixels of a document.
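A minimal sketch of one X-Y cut step of the kind used to separate wrongly merged columns is given below; the projection-profile logic and the min_gap threshold are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def largest_gap_cut(mask, axis=1, min_gap=8):
    """One X-Y cut step: split a binary text mask (1 = ink) at the widest
    whitespace gap along the given axis (axis=1 -> vertical cut between columns)."""
    profile = mask.sum(axis=0) if axis == 1 else mask.sum(axis=1)
    empty = profile == 0
    best, start, run = None, None, 0
    for i, e in enumerate(empty):
        if e:
            run += 1
            if start is None:
                start = i
        else:
            if run >= min_gap and (best is None or run > best[1]):
                best = (start + run // 2, run)
            start, run = None, 0
    if run >= min_gap and (best is None or run > best[1]):
        best = (start + run // 2, run)
    return best  # (cut position, gap width) or None if no wide enough gap exists

page = np.zeros((200, 300), dtype=np.uint8)
page[20:180, 10:130] = 1     # left column
page[20:180, 170:290] = 1    # right column
print(largest_gap_cut(page))  # cuts roughly in the middle of the inter-column gap
```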
Abstract: In this paper, an extreme learning machine with an automatic segmentation algorithm is applied to heart disorder classification from heart sound signals. From continuous heart sound signals, the starting points of the first (S1) and second (S2) heart pulses are extracted and corrected by utilizing an inter-pulse histogram. From the corrected pulse positions, a single period of the heart sound signal is extracted and converted to a feature vector including the mel-scaled filter bank energy coefficients and the envelope coefficients of uniform-sized sub-segments. An extreme learning machine is used to classify the feature vector. In our cardiac disorder classification and detection experiments with 9 cardiac disorder categories, the proposed method shows significantly better performance than the multi-layer perceptron, support vector machine, and hidden Markov model; it achieves a classification accuracy of 81.6% and a detection accuracy of 96.9%.
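The following is a minimal sketch of an extreme learning machine classifier of the kind used here (random hidden layer, closed-form least-squares output weights); the hidden-layer size, tanh activation and the toy feature vectors are assumptions made only for illustration.

```python
import numpy as np

class ELM:
    """Minimal extreme learning machine: random hidden layer, least-squares output weights."""
    def __init__(self, n_hidden=150, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_features = X.shape[1]
        n_classes = int(y.max()) + 1
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)          # random nonlinear projection
        T = np.eye(n_classes)[y]                  # one-hot targets
        self.beta = np.linalg.pinv(H) @ T         # closed-form output weights
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return (H @ self.beta).argmax(axis=1)

# toy usage with random vectors standing in for the heart-sound feature vectors
X = np.random.rand(200, 40)
y = np.random.randint(0, 9, size=200)   # 9 cardiac disorder categories, as in the abstract
model = ELM().fit(X, y)
print((model.predict(X) == y).mean())
```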
Abstract: In an emergency, combining Wireless Sensor Network data with knowledge gathered from various other information sources and navigation algorithms could help guide people safely to a building exit while avoiding the risky areas. This paper presents an emergency response and navigation support architecture for data gathering, knowledge manipulation, and navigational support in an emergency situation. In the normal state, the system monitors the environment. When an emergency event is detected, the system sends messages to first responders and immediately distinguishes the risky areas from the safe areas in order to establish escape paths. The main functionalities of the system include gathering data from a wireless sensor network deployed in a multi-story indoor environment, processing it with information available in a knowledge base, and sharing the decisions made with first responders and the people in the building. The proposed architecture reduces the risk of losing human lives by evacuating people much faster and with the least congestion in an emergency environment.
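As a sketch of the navigational-support idea, assuming the building is modelled as a weighted connectivity graph and that sensor readings flag certain nodes as risky, the code below computes the shortest path to any exit that avoids the flagged areas; the graph, node names and travel costs are hypothetical.

```python
import heapq

def safest_exit_path(graph, start, exits, risky):
    """Dijkstra over a building connectivity graph, skipping nodes flagged risky
    by the sensor network; returns the shortest path to any exit, or None."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u in exits:
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return list(reversed(path))
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            if v in risky:
                continue
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return None

# hypothetical two-corridor floor: node -> [(neighbour, travel cost)]
graph = {"R1": [("C1", 1)], "C1": [("C2", 1), ("Stairs", 2)],
         "C2": [("ExitA", 1)], "Stairs": [("ExitB", 1)]}
print(safest_exit_path(graph, "R1", {"ExitA", "ExitB"}, risky={"C2"}))
# -> ['R1', 'C1', 'Stairs', 'ExitB']
```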
Abstract: There are many classical algorithms for finding routing in FPGAs, but using DNA computing we can solve the routing problem efficiently and quickly. The run-time complexity of DNA algorithms is much lower than that of the classical algorithms used for solving routing in FPGAs. Research in DNA computing is still at an early stage; the high information density of DNA molecules and the massive parallelism involved in DNA reactions make DNA computing a powerful tool. Many research accomplishments have shown that any procedure that can be programmed on a silicon computer can be realized as a DNA computing procedure. In this paper we propose a two-tier approach to the FPGA routing solution. First, the geometric FPGA detailed routing task is solved by transforming it into a Boolean satisfiability equation with the property that any assignment of input variables that satisfies the equation specifies a valid routing: a satisfying assignment for a particular route results in a valid routing, and the absence of a satisfying assignment implies that the layout is un-routable. In the second step, a DNA search algorithm is applied to this Boolean equation to solve for routing alternatives, exploiting the properties of DNA computation. The simulated results are satisfactory and indicate the applicability of DNA computing to solving the FPGA routing problem.
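A toy illustration of the first tier, assuming a two-net, two-track channel: the routing constraints are expressed as a Boolean formula, and an exhaustive search (standing in here for the massively parallel DNA search of the second tier) enumerates the satisfying assignments.

```python
from itertools import product

# Tiny illustrative instance: two nets, each may use track A or track B of one channel.
# Variables: n1_A, n1_B, n2_A, n2_B (True = that net uses that track).
def routing_formula(n1_A, n1_B, n2_A, n2_B):
    each_net_routed = (n1_A or n1_B) and (n2_A or n2_B)
    one_track_per_net = not (n1_A and n1_B) and not (n2_A and n2_B)
    no_track_conflict = not (n1_A and n2_A) and not (n1_B and n2_B)
    return each_net_routed and one_track_per_net and no_track_conflict

# Exhaustive enumeration stands in for the DNA search step.
solutions = [v for v in product([False, True], repeat=4) if routing_formula(*v)]
print(solutions)   # any satisfying assignment is a valid track assignment;
                   # an empty list would mean the layout is un-routable
```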
Abstract: Signature amortization schemes have been introduced for authenticating multicast streams, in which a single signature is amortized over several packets. The hash value of each packet is computed, and some hash values are appended to other packets, forming what is known as a hash chain. These schemes divide the stream into blocks, each block consisting of a number of packets, and the signature packet in these schemes is either the first or the last packet of the block. Amortization schemes are efficient solutions in terms of computation and communication overhead, especially in real-time environments. The main factor affecting an amortization scheme is its hash chain construction. Some studies show that signing the first packet of each block reduces the receiver's delay and prevents DoS attacks; other studies show that signing the last packet reduces the sender's delay. To our knowledge, there are no studies that show which is better, signing the first or the last packet, in terms of authentication probability and resistance to packet loss.
In this paper we introduce another scheme for authenticating multicast streams that is robust against packet loss, reduces the overhead, and at the same time prevents the DoS attacks experienced by the receiver. Our scheme, the Multiple Connected Chain signing the First packet (MCF), appends the hash values of specific packets to other packets, then appends some hashes to the signature packet, which is sent as the first packet in the block. This scheme is especially efficient in terms of the receiver's delay. We discuss and evaluate the performance of our proposed scheme against schemes that sign the last packet of the block.
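A minimal single-chain sketch of the sign-the-first-packet idea (not the multiple connected chains of MCF itself): each packet carries the hash of its successor, and the signature packet, sent first, carries the hash of packet 0. The byte layout and the "SIG:" placeholder are illustrative; in practice that value would be digitally signed.

```python
import hashlib

def build_chain_sign_first(packets):
    """Simple hash chain where the signature packet is sent first: each packet
    carries the hash of the next one, and the signature packet carries the hash
    of packet 0, so the receiver can verify the block as packets arrive."""
    hashes = [None] * len(packets)
    augmented = [None] * len(packets)
    # walk backwards so every packet can embed the hash of its successor
    for i in range(len(packets) - 1, -1, -1):
        nxt = hashes[i + 1] if i + 1 < len(packets) else b""
        augmented[i] = packets[i] + nxt
        hashes[i] = hashlib.sha256(augmented[i]).digest()
    signature_packet = b"SIG:" + hashes[0]   # stand-in for a digitally signed value
    return [signature_packet] + augmented

block = [b"pkt0", b"pkt1", b"pkt2"]
for p in build_chain_sign_first(block):
    print(p[:12], len(p))
```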
Abstract: The goal of this paper is to examine the effects of laser radiation on skin wound healing, using infrared thermography as a non-invasive method for monitoring the changes in skin temperature during laser treatment. Thirty Wistar rats were used in this study. A skin lesion was created on the leg of each rat. The animals were exposed to laser radiation (λ = 670 nm, P = 15 mW, DP = 16.31 mW/cm²) for 600 s. Thermal images of the wound were acquired before and after laser irradiation. The results demonstrate that the tissue temperature decreases from 35.5±0.50°C on the first treatment day to 31.3±0.42°C after the third treatment day. This value is close to the normal skin temperature and indicates the end of the skin repair process. In conclusion, the improvements in wound healing following exposure to laser radiation have been revealed by infrared thermography.
Abstract: By utilizing the echoic intensity and distribution of different organs and local details of the human body, ultrasonic images can capture important medical pathological changes, which unfortunately may be affected by ultrasonic speckle noise. A feature-preserving ultrasonic image denoising and edge enhancement scheme is put forth, which includes two terms, anisotropic diffusion and edge enhancement, controlled by the optimum smoothing time. In this scheme, the anisotropic diffusion is governed by the local coordinate transformation and the first- and second-order normal derivatives of the image, while the edge enhancement is performed by the hyperbolic tangent function. Experiments on real ultrasonic images indicate that our scheme effectively preserves edges, local details and ultrasonic echoic bright strips during denoising.
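For illustration only, the sketch below pairs a simpler Perona-Malik style anisotropic diffusion with a hyperbolic-tangent enhancement step, rather than the local-coordinate, normal-derivative diffusion of the paper; the iteration count, edge-stopping constant and tanh gain are assumed values.

```python
import numpy as np

def denoise_enhance(img, n_iter=20, kappa=0.1, dt=0.2, gain=5.0):
    """Illustrative speckle smoothing: Perona-Malik style anisotropic diffusion
    followed by hyperbolic-tangent edge enhancement (a simplification of the
    diffusion/enhancement split described in the abstract)."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)          # edge-stopping function
    for _ in range(n_iter):
        # differences toward the four neighbours (periodic borders for brevity)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u,  1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u,  1, axis=1) - u
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    # tanh maps intensities through an S-curve that steepens transitions near 0.5
    return 0.5 * (np.tanh(gain * (u - 0.5)) / np.tanh(gain * 0.5) + 1.0)

noisy = np.clip(0.5 + 0.1 * np.random.randn(64, 64), 0, 1)  # stand-in ultrasonic patch
clean = denoise_enhance(noisy)
```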
Abstract: Most pedestrian-car accidents at signalized intersections occur because pedestrians cannot cross the intersection safely within the green phase. From the pedestrian's viewpoint, there are two main reasons. The first is that some pedestrians, such as the elderly, cannot speed up to cross the intersection in time. The other is that pedestrians do not sense that the signal phase is about to change and that they are about to lose their right-of-way. Developing signal logic to protect pedestrians crossing an intersection is the first purpose of this study. Another purpose is to improve the reliability and reduce the delay of public transportation service; therefore, bus preemption is also considered in the designed signal logic. In this study, traffic data from the intersection of Chong-Qing North Road and Min-Zu West Road, Taipei, Taiwan, is employed to calibrate and validate the signal logic by simulation. VISSIM 5.20, a microscopic traffic simulation package, is employed to simulate the signal logic. The simulated results show that the signal logic presented in this study can successfully protect pedestrians crossing the intersection, and that the bus preemption design can reduce the average delay. However, the pedestrian safety and bus preemption signals largely increase the average delay of cars. Thus, whether to apply the pedestrian safety and bus preemption signal logic to an isolated intersection should be evaluated carefully.
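A toy decision rule, not the calibrated VISSIM logic, can illustrate how pedestrian protection and bus preemption interact; the phase names, minimum green time and extension cap below are assumptions for the sketch.

```python
def next_phase(phase, t_in_phase, ped_in_crosswalk, bus_approaching,
               min_green=15, max_extension=10):
    """Toy rule combining pedestrian protection and bus preemption: hold the
    pedestrian green while someone is still in the crosswalk (up to a cap), and
    serve an early green to the bus approach when a bus is detected."""
    if phase == "ped_green":
        if t_in_phase < min_green:
            return "ped_green"
        if ped_in_crosswalk and t_in_phase < min_green + max_extension:
            return "ped_green"               # protect slow pedestrians
        return "bus_green" if bus_approaching else "car_green"
    if bus_approaching and phase == "car_green":
        return "bus_green"                    # preemption shortens the car phase
    return phase

print(next_phase("ped_green", 18, ped_in_crosswalk=True, bus_approaching=True))
# -> 'ped_green' (the crossing stays protected before preemption is served)
```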
Abstract: An accurate prediction of the minimum fluidization velocity is a crucial hydrodynamic aspect of the design of fluidized bed reactors. Common approaches for predicting the minimum fluidization velocities of binary-solid fluidized beds are first discussed here. Data from our own careful experimental investigation of a binary-solid pair fluidized with water are presented. The effect of the relative composition of the two solid species comprising the fluidized bed on the bed void fraction at the incipient fluidization condition is reported, and its influence on the minimum fluidization velocity is discussed. In this connection, the capability of packing models to predict the bed void fraction is also examined.
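For context, one common correlation for a mono-sized bed is that of Wen and Yu; the sketch below applies it to illustrative particle and fluid properties (a binary bed would additionally require an averaged particle diameter and density, which is part of what the paper examines experimentally).

```python
import math

def u_mf_wen_yu(d_p, rho_s, rho_f, mu, g=9.81):
    """Minimum fluidization velocity from the Wen-Yu correlation:
    Re_mf = sqrt(33.7**2 + 0.0408*Ar) - 33.7,
    Ar = d_p**3 * rho_f * (rho_s - rho_f) * g / mu**2."""
    Ar = d_p ** 3 * rho_f * (rho_s - rho_f) * g / mu ** 2
    Re_mf = math.sqrt(33.7 ** 2 + 0.0408 * Ar) - 33.7
    return Re_mf * mu / (rho_f * d_p)

# 500-micron glass beads fluidized with water (illustrative values only)
print(u_mf_wen_yu(d_p=500e-6, rho_s=2500.0, rho_f=1000.0, mu=1.0e-3))  # ~2 mm/s
```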
Abstract: Uniqueness and distinctiveness of localities (referred to as genius loci or sense of place) are important to ensure people's identification with their locality. Existing frameworks reveal that the affective dimension of environments is rarely mentioned or explored, and that limited public participation was used in constructing the frameworks. This research argues that the complexity of sense of place can be recognised, and appropriate planning guidelines formulated, by exploring and integrating the affective dimension of a site. The aims of the research are therefore (i) to explore relational dimensions between people and a natural rural landscape, (ii) to implement a participatory approach to obtain insight into different relational dimensions, and (iii) to concretise socio-affective relational dimensions into site planning guidelines. A qualitative, interdisciplinary research approach was followed and conducted on the farm Kromdraai, Vredefort Dome World Heritage Site. In essence, the first phase of the study reveals various affective responses and projections of personal meanings. The findings of phase 1 informed the second phase, which involved people from various disciplines with different involvement in the area in making visual presentations of appropriate planning and design of the site, in order to capture the meanings of the interactions between people and their environment. Final site planning and design guidelines were formulated based on these findings. This research provides planners with new possibilities for exploring the dimensions between people and places, as well as appropriate methods for participation to obtain insight into the underlying meanings of sites.
Abstract: Chromite is one of the principal ores of chromium, in which the metal exists as a complex oxide (FeO·Cr2O3). Prepared chromite can be widely used as a refractory in high-temperature applications. This study describes the use of local chromite ore as a refractory material. To study the feasibility of the local chromite, its chemical analysis and refractoriness are first measured. To produce the chromite refractory brick, the material is pressed in a 400-ton press, dried and fired at 1580°C for fifty-two hours. Then the standard properties that the chromite brick should possess, such as cold crushing strength, apparent porosity, apparent specific gravity, bulk density and water absorption, are measured. According to the results obtained, the brick made from the local chromite ore is suitable for use as a refractory brick.
Abstract: A numerical study of the effect of the side-dump angle on fuel droplet sizing and effective mass fraction is presented in this paper. The mass of fuel vapor inside the flammability limit is defined as the effective mass fraction. In the first step, we consider a side-dump combustor with a dump angle of 0° (across the cylinder) and, by increasing the entrance airflow velocity from 20 to 30, 40 and 50 m/s respectively, study the mean diameter of the fuel droplets and the effective mass fraction. After this step, we change the dump angle from 0° to 30°, 45° and finally 60° in the direction of the cylinder, and again increase the entrance airflow velocity from 20 up to 50 m/s in increments of 10 m/s, to examine its effects on the fuel droplet sizing as well as the effective mass fraction. These calculations are repeated at each increase of the entrance airflow velocity. The results show that, as the dump angle grows, the effective mass fraction decreases and the mean droplet diameter increases. To carry out the calculations, a modified version of the KIVA-3V code, a transient, three-dimensional, multiphase, multicomponent code for the analysis of chemically reacting flows with sprays, is used.
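A post-processing sketch of the effective-mass-fraction definition above: sum the fuel vapor mass in cells whose local mixture fraction lies within the flammability limits and normalize by the total fuel vapor mass. The limit values and the toy cell data are placeholders, not values from the paper.

```python
import numpy as np

def effective_mass_fraction(cell_fuel_mass, cell_mixture_fraction,
                            lean_limit=0.035, rich_limit=0.10):
    """Fraction of fuel vapor mass lying inside the flammability limits,
    per the definition quoted in the abstract (limits here are illustrative)."""
    inside = (cell_mixture_fraction >= lean_limit) & (cell_mixture_fraction <= rich_limit)
    return cell_fuel_mass[inside].sum() / cell_fuel_mass.sum()

# toy field standing in for post-processed combustor cell data
m = np.random.rand(1000) * 1e-6                 # fuel vapor mass per cell [kg]
z = np.random.rand(1000) * 0.15                 # local mixture fraction [-]
print(effective_mass_fraction(m, z))
```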
Abstract: Most known methods for measuring the structural similarity of document structures are based on, e.g., tag measures, path metrics and tree measures in terms of their DOM-Trees. Other methods measure the similarity in the framework of the well-known vector space model. In contrast to these, we present a new approach to measuring the structural similarity of web-based documents represented by so-called generalized trees, which are more general than DOM-Trees, which represent only directed rooted trees. We design a new similarity measure for graphs representing web-based hypertext structures. Our similarity measure is mainly based on a novel representation of a graph as strings of integers, whose components represent structural properties of the graph. The similarity of two graphs is then defined as the optimal alignment of the underlying property strings. In this paper we apply the well-known technique of sequence alignment to solve a novel and challenging problem: measuring the structural similarity of generalized trees. More precisely, we first transform our graphs, considered as high-dimensional objects, into linear structures. Then we derive similarity values from the alignments of the property strings in order to measure the structural similarity of generalized trees. Hence, we transform a graph similarity problem into a string similarity problem. We demonstrate that our similarity measure captures important structural information by applying it to two different test sets consisting of graphs representing web-based documents.
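As a rough sketch of the alignment idea (with simple unit scores rather than the paper's scoring scheme), two integer property strings can be compared by global alignment and the score normalized into a similarity value; the example property strings below are hypothetical.

```python
def align_score(a, b, match=1, mismatch=-1, gap=-1):
    """Global (Needleman-Wunsch) alignment score of two integer property strings;
    a crude stand-in for the optimal alignment used to compare generalized trees."""
    n, m = len(a), len(b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * gap
    for j in range(1, m + 1):
        dp[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            dp[i][j] = max(dp[i - 1][j - 1] + sub,
                           dp[i - 1][j] + gap,
                           dp[i][j - 1] + gap)
    return dp[n][m]

def structural_similarity(a, b):
    # normalize the alignment score into [0, 1] by the best achievable score
    return max(align_score(a, b), 0) / max(len(a), len(b))

# hypothetical per-level degree sequences as "property strings" of two hypertext graphs
print(structural_similarity([3, 2, 2, 1, 0], [3, 2, 1, 1, 0]))  # -> 0.6
```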
Abstract: In this paper the influence of heterogeneous traffic on the temporal variation of ambient PM10, PM2.5 and PM1 concentrations at a busy arterial route (Sardar Patel Road) in Chennai city is analyzed. The hourly PM concentrations, traffic counts and average vehicle speeds were monitored at the study site for one week (19th-25th January 2009). The results indicate that the concentrations of coarse PM (PM10) and fine PM (PM2.5 and PM1) at SP Road follow a similar trend during peak and non-peak hours, irrespective of the day. The PM concentrations show two daily peaks corresponding to the morning (8 to 10 am) and evening (7 to 9 pm) peak-hour traffic flow. The PM10 concentration is dominated by fine particles (53% PM2.5 and 45% PM1). The high PM2.5/PM10 ratio indicates that the majority of PM10 particles originate from re-suspension of road dust. The analysis of traffic flow at the study site shows that 2W, 3W and 4W have a diurnal trend similar to the PM concentrations, confirming that 2W, 3W and 4W are the main emission sources contributing to the ambient PM concentration at SP Road. The speed measurements at SP Road show that the average speeds of 2W, 3W, 4W, LCV and HCV are 38, 40, 38, 40 and 38 km/hr on weekdays and 43, 41, 42, 40 and 41 km/hr on the weekend, respectively.
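For illustration, the fine-to-coarse share discussed above is simply the mean hourly PM2.5/PM10 ratio; the sketch below computes it on hypothetical hourly values, not the monitored data.

```python
import numpy as np

def mean_ratio(pm25, pm10):
    """Mean hourly share of PM10 mass made up of fine particles (PM2.5/PM10)."""
    pm25, pm10 = np.asarray(pm25, float), np.asarray(pm10, float)
    return float(np.mean(pm25 / pm10))

# hypothetical hourly concentrations (ug/m3) for part of a day
pm10 = [80, 120, 150, 90, 70, 110, 160, 100]
pm25 = [45, 65, 80, 48, 36, 60, 85, 52]
print(f"mean PM2.5/PM10 ratio: {mean_ratio(pm25, pm10):.2f}")
```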
Abstract: This contribution was developed from research within a doctoral thesis. Its objective was to create multimedia materials for sport gymnastics. We then surveyed the influence of their practical application on the efficiency of teaching at a university. We verified the stated hypothesis about the efficiency of the teaching process using a single-factor experiment, where the independent input variable was the change in the system of tuition and the dependent output variable was the change in the level of acquired motor skills. The results confirmed the positive impact of using multimedia materials on the efficiency of the teaching process. Further, with the aid of questionnaires, we evaluated how the tested subjects perceive the innovative methods in sport gymnastics. The responses showed that the students rate the application of multimedia materials very positively.
Abstract: The conventional assessment of human semen is highly subjective, with considerable intra- and inter-laboratory variability. Computer-Assisted Sperm Analysis (CASA) systems provide a rapid and automated assessment of sperm characteristics, together with improved standardization and quality control. However, the outcome of CASA systems is sensitive to the method of experimentation. While conventional CASA systems use digital microscopes with phase-contrast accessories to produce higher-contrast images, we have used raw semen samples (no staining materials) and a regular light microscope, with a digital camera attached directly to its eyepiece, to ensure cost benefits and simple assembly of the system. Since accurately finding the sperms in the semen image is the first step in the examination and analysis of the semen, any error in this step can affect the outcome of the analysis. This article introduces and explains an algorithm for finding sperms in low-contrast images: first, an image enhancement algorithm is applied to remove extra particles from the image; then the foreground particles (including sperms and round cells) are segmented from the background; finally, based on certain features and criteria, sperms are separated from other cells.
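A minimal sketch of this three-stage pipeline is given below, using background subtraction for enhancement, a global threshold for segmentation, and component area as the separating criterion; the smoothing scales, threshold and area limits are illustrative assumptions, not the paper's features and criteria.

```python
import numpy as np
from scipy import ndimage

def find_sperm_candidates(img, blur_sigma=1.0, min_area=20, max_area=200):
    """Sketch of the three-stage pipeline: enhance the low-contrast image,
    separate foreground particles, then keep only components whose size is
    plausible for a sperm head (area thresholds here are illustrative)."""
    # 1. enhancement: background subtraction via a heavy Gaussian blur
    background = ndimage.gaussian_filter(img.astype(float), sigma=15)
    enhanced = ndimage.gaussian_filter(img - background, sigma=blur_sigma)
    # 2. foreground segmentation with a global threshold on the enhanced image
    fg = enhanced > enhanced.mean() + 2 * enhanced.std()
    labels, n = ndimage.label(fg)
    # 3. separate sperms from round cells/debris by connected-component area
    areas = ndimage.sum(fg, labels, index=range(1, n + 1))
    keep = [i + 1 for i, a in enumerate(areas) if min_area <= a <= max_area]
    return labels, keep

img = (np.random.rand(256, 256) * 50).astype(np.uint8)  # stand-in for a raw frame
labels, candidates = find_sperm_candidates(img)
print(len(candidates), "candidate objects")
```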