Abstract: Hepatitis B and hepatitis C are among the most significant hepatic infections worldwide and may lead to hepatocellular carcinoma. This study was performed for the first time at the blood transfusion centre of Omar Hospital, Lahore. It aims to determine the sero-prevalence of these diseases by screening apparently healthy blood donors, who may be carriers of HBV or HCV and pose a high risk of transmission. It also aims to compare the sensitivity of two diagnostic tests: the chromatographic immunoassay (one-step test device) and the Enzyme-Linked Immunosorbent Assay (ELISA). Blood serum of 855 apparently healthy blood donors was screened for hepatitis B surface antigen (HBsAg) and for anti-HCV antibodies. SPSS version 12.0 and the chi-square test were used for statistical analysis. The sero-prevalence of HCV was 8.07% by the device method and 9.12% by ELISA, and that of HBV was 5.6% by the device and 6.43% by ELISA. The unavailability of a vaccine against HCV makes it the more prevalent of the two. Comparing the two diagnostic methods, ELISA proved to be more sensitive.
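The chi-square comparison mentioned above can be sketched on a 2x2 contingency table. The positive counts below are back-calculated from the reported HCV prevalences (8.07% by device and 9.12% by ELISA, n = 855 each) and are illustrative only; the paper's actual tables are not given.

```python
# Minimal chi-square test of independence on a 2x2 table (pure stdlib).
# Counts are back-calculated from the reported HCV prevalences and are
# illustrative, not the study's raw data.

def chi_square_2x2(table):
    """Return the chi-square statistic for a 2x2 contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# rows: device, ELISA; columns: HCV-positive, HCV-negative
observed = [[69, 786], [78, 777]]
stat = chi_square_2x2(observed)
# Critical value for df = 1 at alpha = 0.05 is 3.841.
print(f"chi-square = {stat:.3f}, significant = {stat > 3.841}")
```

With these illustrative counts the statistic stays below the 3.841 critical value, i.e. the difference in detection rates alone would not reach significance at this sample size.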
Abstract: This paper describes a fast and efficient method for page segmentation of documents containing non-rectangular blocks. The segmentation is based on an edge-following algorithm using a small window of 16 by 32 pixels. It is very fast because only the border pixels of each paragraph are used, without scanning the whole page. However, the segmentation may contain errors if the space between blocks is smaller than the window used in edge following. This paper therefore reduces this error by first identifying the missed segmentation points using the direction information in edge following, and then applying an X-Y cut at the missed segmentation points to separate the connected columns. The advantage of the proposed method is the fast identification of missed segmentation points. The methodology is faster, with less overhead, than other algorithms that need to access many more pixels of a document.
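The X-Y cut used to separate connected columns can be sketched as a recursive split at whitespace gaps in the projection profiles. This is a generic sketch of the classical algorithm, not the paper's implementation; the minimum-gap parameter is hypothetical.

```python
# Illustrative recursive X-Y cut on a binary page image (1 = ink, 0 = background).

def projection(block, axis):
    if axis == 0:                              # row profile
        return [sum(row) for row in block]
    return [sum(col) for col in zip(*block)]   # column profile

def find_gap(profile, min_gap):
    """Midpoint of the first internal empty run of at least min_gap cells."""
    start = None
    for i, v in enumerate(profile + [1]):      # sentinel closes a trailing run
        if v == 0 and start is None:
            start = i
        elif v != 0 and start is not None:
            internal = start > 0 and i < len(profile)
            if internal and i - start >= min_gap:
                return (start + i) // 2
            start = None
    return None

def xy_cut(block, min_gap=2):
    """Recursively split a block at whitespace gaps; return leaf blocks."""
    for axis in (0, 1):
        cut = find_gap(projection(block, axis), min_gap)
        if cut is not None:
            if axis == 0:
                return xy_cut(block[:cut], min_gap) + xy_cut(block[cut:], min_gap)
            left = [row[:cut] for row in block]
            right = [row[cut:] for row in block]
            return xy_cut(left, min_gap) + xy_cut(right, min_gap)
    return [block]

page = [[1, 1, 0, 0, 0, 1, 1] for _ in range(4)]   # two ink columns
print(len(xy_cut(page)))   # -> 2
```

In the paper's scheme this cut is applied only at the missed segmentation points flagged by the edge-following pass, not to the whole page.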
Abstract: This interdisciplinary research aims to distinguish universal scale-free and field-like fundamental principles of self-organization observable across many disciplines, such as computer science, neuroscience, microbiology, and social science. Based on these universal principles, we provide basic premises and postulates for designing holistic social simulation models. We also introduce the pervasive information field (PIF) concept, which serves as a simulation medium for contextual information storage, dynamic distribution and organization in complex social networks. The PIF concept specifically targets field-like, uncoupled and indirect interactions among social agents capable of affecting and perceiving broadcast contextual information. The proposed approach is expressive enough to represent broadcast contextual information in a form locally accessible and immediately usable by network agents. This paper offers a prospective vision of how a system's resources (tangible and intangible) could be simulated as oscillating processes immersed in the all-pervasive information field.
Abstract: In this paper, an extreme learning machine with an automatic segmentation algorithm is applied to heart disorder classification from heart sound signals. From continuous heart sound signals, the starting points of the first (S1) and second (S2) heart pulses are extracted and corrected using an inter-pulse histogram. From the corrected pulse positions, a single period of the heart sound signal is extracted and converted to a feature vector comprising the mel-scaled filter bank energy coefficients and the envelope coefficients of uniform-sized sub-segments. An extreme learning machine is used to classify the feature vector. In our cardiac disorder classification and detection experiments with 9 cardiac disorder categories, the proposed method performs significantly better than the multi-layer perceptron, support vector machine, and hidden Markov model, achieving a classification accuracy of 81.6% and a detection accuracy of 96.9%.
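The pulse-position correction step can be illustrated with an interval statistic: detections that arrive much earlier than the typical heart period are treated as spurious. The paper uses an inter-pulse histogram; the median interval and the tolerance below are simplified stand-ins.

```python
# Hedged sketch of correcting detected pulse onsets (times in seconds) with
# an inter-pulse interval statistic; the median stands in for the paper's
# inter-pulse histogram.
import statistics

def correct_pulses(onsets, tol=0.5):
    """Drop detections that arrive much earlier than the typical period."""
    intervals = [b - a for a, b in zip(onsets, onsets[1:])]
    period = statistics.median(intervals)
    corrected = [onsets[0]]
    for t in onsets[1:]:
        if t - corrected[-1] >= tol * period:   # too close -> spurious
            corrected.append(t)
    return corrected

onsets = [0.0, 0.8, 1.6, 1.7, 2.4, 3.2]        # 1.7 s is a false detection
print(correct_pulses(onsets))   # -> [0.0, 0.8, 1.6, 2.4, 3.2]
```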
Abstract: In an emergency, combining wireless sensor network data with knowledge gathered from various other information sources and with navigation algorithms could help guide people safely to a building exit while avoiding risky areas. This paper presents an emergency response and navigation support architecture for data gathering, knowledge manipulation, and navigational support in an emergency situation. In the normal state, the system monitors the environment. When an emergency event is detected, the system sends messages to first responders and immediately distinguishes risky areas from safe areas in order to establish escape paths. The main functionalities of the system include gathering data from a wireless sensor network deployed in a multi-storey indoor environment, processing the data together with the information available in a knowledge base, and sharing the resulting decisions with first responders and people in the building. The proposed architecture aims to reduce the risk of losing human lives by evacuating people faster and with less congestion in an emergency environment.
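The escape-path idea can be sketched as a shortest-path search over a floor-plan graph that skips nodes flagged as risky. The graph, node names and the plain breadth-first search below are illustrative; the paper's architecture does not specify this particular algorithm.

```python
# Illustrative breadth-first search for a shortest escape path (fewest hops)
# that avoids risky nodes; the floor-plan graph is hypothetical.
from collections import deque

def escape_path(graph, start, exits, risky):
    """Shortest path from start to any exit, skipping risky nodes."""
    if start in risky:
        return None
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node in exits:
            return path
        for nxt in graph.get(node, []):
            if nxt not in visited and nxt not in risky:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

floor = {"room1": ["hall"], "hall": ["room1", "stairs", "lobby"],
         "stairs": ["hall", "exitA"], "lobby": ["hall", "exitB"]}
print(escape_path(floor, "room1", {"exitA", "exitB"}, risky={"stairs"}))
# -> ['room1', 'hall', 'lobby', 'exitB']
```

When the stairs are flagged risky, the search automatically reroutes through the lobby, which is the behaviour the architecture wants from its navigation component.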
Abstract: The internet is one of the major sources of information for people in almost all fields of life. The major language used to publish information on the internet is English. This becomes a problem in a country like Pakistan, where Urdu is the national language and only 10% of the population can understand English. As a result, millions of people are deprived of the precious information available on the internet. This paper presents a system for translation from English to Urdu. A module named LESSA uses a rule-based algorithm to read the input text in English, understand it, and translate it into Urdu. The approach was further extended to translate complete websites from English to Urdu: an option appears in the browser to translate the webpage in a new window. The designed system will help millions of internet users to benefit from the internet and access the latest information and knowledge posted daily.
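A rule-based English-to-Urdu step can be illustrated in miniature. The LESSA module itself is not public, so the four-word dictionary (romanized Urdu) and the single subject-verb-object to subject-object-verb reordering rule below are hypothetical stand-ins for what such a rule base contains.

```python
# Toy rule-based translation sketch: hypothetical lexicon plus one word-order
# rule (English SVO -> Urdu SOV). Not the paper's LESSA algorithm.

LEXICON = {"ali": "Ali", "reads": "parhta hai", "a": "aik", "book": "kitab"}

def translate(sentence):
    words = sentence.lower().rstrip(".").split()
    # Rule: English subject-verb-object becomes Urdu subject-object-verb.
    subject, verb, obj = words[0], words[1], words[2:]
    reordered = [subject] + obj + [verb]
    return " ".join(LEXICON.get(w, w) for w in reordered)

print(translate("Ali reads a book."))   # -> "Ali aik kitab parhta hai"
```

A real rule base must of course handle morphology, agreement and unknown words, which is where the bulk of the LESSA work lies.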
Abstract: "New ways of working" refers to non-traditional work practices, settings and locations supported by information and communication technologies (ICT) to supplement or replace traditional ways of working. It questions the contemporary work practices and settings still widely used in knowledge-intensive organizations today. In this study, new ways of working are seen to consist of two elements: the work environment (physical, virtual and social) and work practices. This study aims to gather the scattered information together and deepen understanding of new ways of working. Moreover, the objective is to provide some evidence of the still unclear productivity impacts of new ways of working using a case study approach.
Abstract: There are many classical algorithms for finding routes in FPGAs, but DNA computing can solve routing efficiently and quickly: the run-time complexity of DNA algorithms is much lower than that of the classical algorithms used for FPGA routing. Research in DNA computing is still at an early stage, yet the high information density of DNA molecules and the massive parallelism involved in DNA reactions make DNA computing a powerful tool. Many research results have shown that any procedure that can be programmed on a silicon computer can be realized as a DNA computing procedure. In this paper we propose a two-tier approach to the FPGA routing problem. First, the geometric FPGA detailed routing task is transformed into a Boolean satisfiability equation with the property that any assignment of input variables satisfying the equation specifies a valid routing: a satisfying assignment for a particular route yields a valid routing, while the absence of a satisfying assignment implies that the layout is unroutable. In the second step, a DNA search algorithm is applied to this Boolean equation to find routing alternatives, exploiting the properties of DNA computation. The simulation results are satisfactory and indicate the applicability of DNA computing to the FPGA routing problem.
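The satisfiability formulation can be shown on a toy instance: two nets each need exactly one of two routing channels and may not share one. The exhaustive enumeration below stands in for the massively parallel DNA search step; the constraints are illustrative, not the paper's actual routing encoding.

```python
# Toy SAT formulation of a routing conflict. Variables x[i][j] mean
# "net i uses channel j". A valid routing assigns each net exactly one
# channel and gives no channel to two nets.
from itertools import product

def valid(x):
    one_channel_per_net = all(sum(row) == 1 for row in x)
    no_conflict = all(sum(col) <= 1 for col in zip(*x))
    return one_channel_per_net and no_conflict

solutions = [bits for bits in product([0, 1], repeat=4)
             if valid([bits[:2], bits[2:]])]
print(solutions)   # -> [(0, 1, 1, 0), (1, 0, 0, 1)]
```

An empty `solutions` list would correspond to an unroutable layout; the DNA procedure explores all assignments in parallel rather than sequentially as here.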
Abstract: This paper analyses the structural changes in the education sector since the introduction of the liberalization policy in India. It explains how so-called non-profit trusts and societies appropriated the liberalization policy and established themselves as a new capitalist class in the higher education sector. Over the decades, the policy relied on the private sector to maintain market equilibrium, while the state also witnessed the private sector's incompatibility with inculcating the values of social justice. The most important consequence of the policy is the rise of a new capitalist class and of academic capitalism. When the state realized that it could no longer cope with market demands, it opened the higher education sector to private entry, providing concessions and tax exemptions to trusts and societies to establish higher education institutions. There is a basic difference between western countries and India in the provision of higher education by trusts and societies: in western countries, big business houses contributed their surplus revenues to promote higher education and research as a complementary service to society and nation, whereas in India several entrepreneurs entered the education sector with a business motive and, over time, accumulated wealth at the cost of students and of concessions from the government. Four major results can now be identified: production of manpower in view of market demands; reduction of standards in higher education; bypassing of the values of social justice; and the rise of a new capitalist class from the business of education. This paper tries to substantiate these issues with inputs from case studies.
Abstract: In this paper, an efficient method for personal identification based on the pattern of the human iris is proposed. It is composed of image acquisition, image preprocessing to unwrap the iris into a flat representation, conversion into the eigeniris space, and a decision stage carried out using only a one-dimensional reduction of the iris. By comparing the eigenirises it is determined whether two irises are similar. The results show that the proposed method is quite effective.
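The matching stage can be sketched as follows: an unwrapped (flat) iris image is reduced to a one-dimensional signature and two irises are compared by Euclidean distance. The column-mean reduction and the tiny arrays below are illustrative stand-ins for the paper's eigeniris projection.

```python
# Hedged sketch of 1-D iris matching; column means stand in for the
# eigeniris projection, and the images are toy 2x3 intensity arrays.
import math

def signature(flat_iris):
    """Reduce a 2-D unwrapped iris (rows x cols) to a 1-D column profile."""
    rows = len(flat_iris)
    return [sum(col) / rows for col in zip(*flat_iris)]

def distance(sig_a, sig_b):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(sig_a, sig_b)))

iris_a = [[0.1, 0.8, 0.3], [0.3, 0.6, 0.5]]
iris_b = [[0.1, 0.9, 0.3], [0.3, 0.5, 0.5]]   # near-duplicate of iris_a
iris_c = [[0.9, 0.1, 0.8], [0.7, 0.2, 0.9]]   # a different pattern
same = distance(signature(iris_a), signature(iris_b))
diff = distance(signature(iris_a), signature(iris_c))
print(same < diff)   # -> True
```

A real system thresholds the distance to accept or reject an identity claim; the eigeniris basis additionally concentrates the discriminative variation before this 1-D comparison.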
Abstract: The goal of this paper is to examine the effects of laser radiation on skin wound healing, using infrared thermography as a non-invasive method for monitoring skin temperature changes during laser treatment. Thirty Wistar rats were used in this study. A skin lesion was created on the leg of each rat. The animals were exposed to laser radiation (λ = 670 nm, P = 15 mW, DP = 16.31 mW/cm²) for 600 s. Thermal images of the wound were acquired before and after laser irradiation. The results demonstrate that the tissue temperature decreases from 35.5 ± 0.50 °C on the first treatment day to 31.3 ± 0.42 °C after the third treatment day. This value is close to the normal skin temperature and indicates the end of the skin repair process. In conclusion, the improvement in wound healing following exposure to laser radiation has been revealed by infrared thermography.
Abstract: In this research, STNEP is studied considering network adequacy and an investment-cost limit, using a decimal codification genetic algorithm (DCGA). The goal is to obtain the maximum network adequacy at the lowest expansion cost for a specific investment. The proposed idea is applied to Garver's 6-bus network. The results show that taking network adequacy into account in solving the STNEP problem means that, among the expansion plans for a given investment, the GA-based method proposes the configuration with relatively lower expansion cost and higher adequacy. Finally, the curve of adequacy versus expansion cost shows that more optimal network expansion configurations are obtained at lower investment costs.
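The decimal-codification idea can be sketched generically: each chromosome is a list of decimal digits (here decoding to candidate line counts per corridor), evolved by selection, crossover and digit mutation. The toy cost function below is illustrative only; it is not the STNEP adequacy/cost objective.

```python
# Generic sketch of a decimal-codification GA; the target vector and cost
# are hypothetical stand-ins for the STNEP objective.
import random

random.seed(1)
TARGET = [3, 1, 4, 1, 5]                      # hypothetical optimal plan

def cost(chrom):
    return sum(abs(g - t) for g, t in zip(chrom, TARGET))

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.1):
    return [random.randrange(10) if random.random() < rate else g
            for g in chrom]

pop = [[random.randrange(10) for _ in range(5)] for _ in range(30)]
for _ in range(60):
    pop.sort(key=cost)
    elite = pop[:10]
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(20)]
best = min(pop, key=cost)
print(best, cost(best))
```

In the STNEP setting the decoded plan would be priced and its adequacy evaluated by a network analysis instead of this toy distance.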
Abstract: Mathematical equations for the separation of a binary aqueous solution are developed using the Spiegler-Kedem theory. The characteristics of a B-9 hollow-fibre module of Du Pont are determined using these equations, and the results are compared with the experimental results of Ohya et al. The agreement between these results is found to be excellent.
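The core Spiegler-Kedem relation can be written down directly: the real rejection is R = sigma * (1 - F) / (1 - sigma * F) with F = exp(-Jv * (1 - sigma) / Pm), where sigma is the reflection coefficient, Pm the solute permeability and Jv the volume flux. The parameter values below are illustrative, not the fitted B-9 module characteristics.

```python
# Standard Spiegler-Kedem rejection expression; parameter values are
# illustrative, not the paper's fitted module data.
import math

def rejection(jv, sigma, pm):
    f = math.exp(-jv * (1.0 - sigma) / pm)
    return sigma * (1.0 - f) / (1.0 - sigma * f)

# Rejection rises toward sigma as the volume flux increases:
for jv in (1e-6, 5e-6, 2e-5):                 # m/s, illustrative
    print(f"Jv = {jv:.0e}  R = {rejection(jv, sigma=0.95, pm=1e-6):.3f}")
```

Fitting sigma and Pm to measured rejections at several fluxes is how the module characteristics are extracted from data of the kind reported by Ohya et al.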
Abstract: Code mobility technologies attract more and more developers and consumers. Numerous domains are concerned, many platforms have been developed, and interesting applications have been realized. However, developing good software products requires modeling, analyzing and proving steps, and the choice of models and modeling languages is critical in these steps. Formal tools are powerful for the analyzing and proving steps; however, the inadequacy of classical modeling languages for expressing mobility calls for new models. The objective of this paper is to provide a specific formalism, "Coloured Reconfigurable Nets", and to show how it is adequate for modeling different kinds of code mobility.
Abstract: Short-Term Load Forecasting (STLF) plays an important role in the economic and secure operation of power systems. In this paper, a Continuous Genetic Algorithm (CGA) is employed to evolve the optimal structure and connection weights of large neural networks for the one-day-ahead electric load forecasting problem. This study describes the process of developing three-layer feed-forward large neural networks for load forecasting and then presents a heuristic search algorithm for an important task in this process: optimal network structure design. The proposed method is applied to STLF for the local utility. Data are clustered according to differences in their characteristics, and special days are extracted from the normal training sets and handled separately. In this way, a solution is provided for all load types, including working days, weekends and special days. We find good performance for the large neural networks: the proposed methodology consistently gives lower percentage errors. Thus, it can be applied to automatically design an optimal load forecaster based on historical data.
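The weight-evolution part can be sketched with a real-coded (continuous) GA on a tiny fixed network. The structure search, data clustering and load history are omitted; the 1-2-1 tanh network and the toy fitting target below are illustrative stand-ins.

```python
# Minimal continuous (real-coded) GA evolving the 7 weights of a 1-2-1
# tanh network to fit a toy curve; not the paper's load-forecasting setup.
import math, random

random.seed(7)
SAMPLES = [(x / 10.0, (x / 10.0) ** 2) for x in range(-10, 11)]

def forward(w, x):
    h1 = math.tanh(w[0] * x + w[1])
    h2 = math.tanh(w[2] * x + w[3])
    return w[4] * h1 + w[5] * h2 + w[6]

def mse(w):
    return sum((forward(w, x) - y) ** 2 for x, y in SAMPLES) / len(SAMPLES)

def child(a, b):
    blend = [ga + random.random() * (gb - ga) for ga, gb in zip(a, b)]
    return [g + random.gauss(0, 0.1) for g in blend]  # arithmetic crossover + mutation

pop = [[random.uniform(-1, 1) for _ in range(7)] for _ in range(40)]
initial = min(mse(w) for w in pop)
for _ in range(150):
    pop.sort(key=mse)
    pop = pop[:10] + [child(random.choice(pop[:10]), random.choice(pop[:10]))
                      for _ in range(30)]
final = min(mse(w) for w in pop)
print(f"MSE: {initial:.4f} -> {final:.4f}")
```

The real-coded chromosome is what distinguishes a continuous GA from a binary one: weights are perturbed directly rather than decoded from bit strings.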
Abstract: This work presents a numerical simulation of the interaction of an incident shock wave, propagating from left to right, with a cone placed in a shock tube. The mathematical model is based on non-stationary, viscous, axisymmetric flow. The Navier-Stokes equations are discretized by the finite volume method in integral form, together with the flux vector splitting method of Van Leer. An adequate combination of time-stepping parameter, CFL coefficient and mesh size is selected to ensure numerical convergence. The simulation considers a shock tube filled with air. The incident shock wave propagates to the right with a given Mach number and crosses the cone, leaving behind it a stationary detached shock wave in front of the nose of the cone. This type of interaction is observed over the course of the flow.
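The role of the CFL coefficient in choosing the time step can be shown on the simplest explicit scheme: first-order upwind finite volumes for 1-D linear advection, a stand-in for the Van Leer flux-vector-splitting scheme of the paper. For this model problem the scheme is stable for CFL <= 1 and blows up beyond it.

```python
# Sketch of CFL-limited explicit time stepping: upwind finite volumes for
# 1-D linear advection with periodic boundaries (illustrative, not the
# paper's axisymmetric Navier-Stokes solver).

def advect(u, speed, dx, cfl, steps):
    """March a profile with the upwind scheme; dt is set from the CFL number."""
    dt = cfl * dx / speed
    for _ in range(steps):
        u = [u[i] - speed * dt / dx * (u[i] - u[i - 1]) for i in range(len(u))]
    return u

u0 = [1.0 if 4 <= i < 8 else 0.0 for i in range(40)]
stable = advect(u0, speed=1.0, dx=1.0, cfl=0.9, steps=20)
print(max(stable))                       # stays bounded for CFL <= 1
unstable = advect(u0, speed=1.0, dx=1.0, cfl=1.5, steps=20)
print(max(abs(v) for v in unstable))     # grows without bound for CFL > 1
```

The same trade-off drives the "adequate combination" mentioned above: a finer mesh forces a proportionally smaller time step for a fixed CFL coefficient.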
Abstract: This paper presents a predictive model of sensor readings for a mobile robot. The model predicts sensor readings over a given time horizon based on the current sensor readings and the wheel velocities assumed for this horizon. Similar models for such anticipation have been proposed in the literature. The novelty of the model presented here is that its structure takes physical phenomena into account and is not just a black box such as a neural network; from this point of view, it may be regarded as a semi-phenomenological model. The model is developed for the Khepera robot but, after certain modifications, may be applied to any robot with distance sensors such as infrared or ultrasonic sensors.
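The phenomenological core of such a model can be sketched by propagating the robot pose from the assumed wheel velocities with differential-drive kinematics, as used by Khepera-class robots. Mapping the predicted pose to expected sensor readings is omitted, and the wheelbase and step values are illustrative.

```python
# Sketch of pose prediction over a horizon from assumed wheel velocities
# (differential-drive kinematics); sensor-reading prediction from the pose
# is left out, and parameter values are illustrative.
import math

def predict_pose(x, y, theta, v_left, v_right, wheelbase, dt, steps):
    """Integrate differential-drive kinematics over the prediction horizon."""
    for _ in range(steps):
        v = (v_left + v_right) / 2.0             # forward speed
        omega = (v_right - v_left) / wheelbase   # turn rate
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += omega * dt
        # In the full model the predicted pose would be mapped here to
        # expected infrared distance readings via an environment model.
    return x, y, theta

pose = predict_pose(0.0, 0.0, 0.0, v_left=0.1, v_right=0.1,
                    wheelbase=0.053, dt=0.1, steps=10)
print(pose)   # straight-line motion: roughly (0.1, 0.0, 0.0)
```

Tying the prediction to such kinematics, rather than to a learned black box, is what makes the model semi-phenomenological.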
Abstract: This paper presents a new compact analytical model of the gate leakage current in high-k-based nanoscale MOSFETs, assuming a two-step inelastic trap-assisted tunneling (ITAT) process as the conduction mechanism. The model combines the ITAT mechanism with a semi-empirical gate leakage current formulation from the BSIM 4 model. The gate tunneling currents have been calculated as a function of gate voltage for different gate dielectric structures, such as HfO2, Al2O3 and Si3N4, with an equivalent oxide thickness (EOT) of 1.0 nm. The proposed model is compared with Sentaurus simulation results to verify its accuracy, and excellent agreement is found between the analytical and simulated data. The analytical model is suitable for different high-k gate dielectrics simply by adjusting two fitting parameters. It is also shown that gate leakage is reduced when a high-k gate dielectric replaces SiO2.
Abstract: An accurate prediction of the minimum fluidization velocity is a crucial hydrodynamic aspect of the design of fluidized bed reactors. Common approaches for predicting the minimum fluidization velocities of binary-solid fluidized beds are first discussed. The data of our own careful experimental investigation, involving a binary-solid pair fluidized with water, are presented. The effect of the relative composition of the two solid species comprising the fluidized bed on the bed void fraction at the incipient fluidization condition is reported, and its influence on the minimum fluidization velocity is discussed. In this connection, the capability of packing models to predict the bed void fraction is also examined.
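A common single-species route to the minimum fluidization velocity, against which binary-solid approaches are usually discussed, equates the Ergun pressure drop to the buoyant weight of the bed and solves the resulting quadratic for U_mf. The property values below describe an illustrative sand-water system, not the paper's binary-solid data.

```python
# Minimum fluidization velocity from the Ergun equation (SI units).
# Balance: 150*(1-eps)*mu*U/(eps^3*dp^2) + 1.75*rho_f*U^2/(eps^3*dp)
#          = (rho_p - rho_f)*g
import math

def u_mf(dp, eps, rho_f, rho_p, mu, g=9.81):
    a = 1.75 * rho_f / (eps**3 * dp)                 # inertial coefficient
    b = 150.0 * (1.0 - eps) * mu / (eps**3 * dp**2)  # viscous coefficient
    c = (rho_p - rho_f) * g                          # buoyant weight term
    return (-b + math.sqrt(b * b + 4.0 * a * c)) / (2.0 * a)

# 500-micron sand fluidized with water, void fraction 0.45 (illustrative):
print(f"U_mf = {u_mf(dp=500e-6, eps=0.45, rho_f=1000.0, rho_p=2600.0, mu=1e-3):.4f} m/s")
```

The sensitivity of this estimate to the void fraction eps is exactly why the composition-dependent bed void fraction at incipient fluidization, and the packing models that predict it, matter for binary-solid beds.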
Abstract: Controlling software evolution requires a deep understanding of the changes and of their impact on the system's different heterogeneous artifacts, and an understanding of the descriptive knowledge of the developed software artifacts is a prerequisite for the success of the evolutionary process. Implementing an evolutionary process means making more or less significant changes to many heterogeneous software artifacts, such as source code, analysis and design models, unit tests, XML deployment descriptors, and user guides. These changes can degrade the modified software in functional, qualitative or behavioral terms. Hence the need for a unified approach to the extraction and representation of the different heterogeneous artifacts, one that ensures a unified and detailed description of them, exploitable by several software tools, and that allows those responsible for the evolution to carry out the reasoning about the changes concerned.