Abstract: This research contribution presents the
orbit design, orbit propagator and geomagnetic field estimator for
nanosatellites, specifically for the upcoming CubeSat ICUBE-1 of
the Institute of Space Technology (IST), Islamabad, Pakistan. The
ICUBE mission is designed for a low Earth orbit at an approximate
altitude of 700 km. The presented research designs the
Keplerian elements for the ICUBE-1 orbit while incorporating the
mission requirements, and propagates the orbit using J2 perturbations.
The attitude determination system of ICUBE-1 consists of
attitude determination sensors such as a magnetometer and a sun sensor. The
geomagnetic field estimator is developed according to the
International Geomagnetic Reference Field (IGRF) model for comparison with the
magnetic field measurements of the magnetometer for attitude
determination. The outputs of the propagator, namely the Keplerian elements,
position and velocity vectors, and the magnetic field vectors, are
compared and verified against the same scenario generated in the
Satellite Tool Kit (STK).
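The J2 secular perturbation used by such a propagator can be sketched in a few lines. The following is a minimal illustration (not the ICUBE-1 implementation; the orbit values in the example are assumed) of the standard first-order secular drift rates of the right ascension of the ascending node (RAAN) and the argument of perigee:

```python
import math

MU = 398600.4418      # Earth's gravitational parameter, km^3/s^2
RE = 6378.137         # Earth's equatorial radius, km
J2 = 1.08262668e-3    # Earth's J2 zonal harmonic coefficient

def j2_secular_rates(a_km, e, inc_deg):
    """First-order secular drift rates (rad/s) of RAAN and argument
    of perigee due to the J2 perturbation."""
    n = math.sqrt(MU / a_km**3)           # mean motion
    p = a_km * (1.0 - e**2)               # semi-latus rectum
    k = 1.5 * n * J2 * (RE / p)**2
    i = math.radians(inc_deg)
    raan_dot = -k * math.cos(i)                     # nodal regression
    argp_dot = 0.5 * k * (5.0 * math.cos(i)**2 - 1.0)  # apsidal rotation
    return raan_dot, argp_dot

# Illustrative ~700 km near-sun-synchronous orbit (assumed values)
raan_dot, argp_dot = j2_secular_rates(RE + 700.0, 0.001, 98.2)
print(math.degrees(raan_dot) * 86400.0)  # RAAN drift in deg/day
```

For a retrograde inclination near 98 degrees the node drifts eastward by roughly one degree per day, which is the basis of sun-synchronous orbit design.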
Abstract: In this paper we investigate a possible
optimization of some linear algebra problems which can be
solved by parallel processing using special arrays called
systolic arrays. We use some special types of
transformations for the design of these arrays and show
their characteristics. The main focus is on
discussing the advantages of these arrays in the parallel
computation of matrix products, with a special approach to the
design of a systolic array for matrix multiplication.
Multiplication of large matrices requires a lot of
computational time and its complexity is O(n³). Many
algorithms (both sequential and parallel) have been developed
with the purpose of minimizing the calculation time. Systolic
arrays are well suited for this purpose. In this paper we show
that using an appropriate transformation leads to
more efficient arrays for calculations of this type.
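As an illustration of the computation such an array performs, the sketch below simulates an output-stationary n × n systolic array for matrix multiplication: each processing element (i, j) accumulates C[i][j], and the skewed schedule k = t − i − j models operands flowing in from the left and top. The total work is still O(n³), but it completes in 3n − 2 parallel wavefronts. This schedule is a common textbook choice, not necessarily the transformation derived in the paper:

```python
def systolic_matmul(A, B):
    """Simulate an n x n output-stationary systolic array.

    PE (i, j) accumulates C[i][j]; at time step t it multiplies the A
    element arriving from the left with the B element arriving from the
    top, which under the skewed input schedule is the pair with
    index k = t - i - j.
    """
    n = len(A)
    C = [[0] * n for _ in range(n)]
    # 3n - 2 wavefronts cover every (i, j, k) exactly once
    for t in range(3 * n - 2):
        for i in range(n):          # in hardware these two loops
            for j in range(n):      # run concurrently across PEs
                k = t - i - j
                if 0 <= k < n:
                    C[i][j] += A[i][k] * B[k][j]
    return C
```

On a real array the inner two loops execute simultaneously in all n² processing elements, so the time complexity drops from O(n³) to O(n) steps.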
Abstract: This paper presents an effective framework for Chinese syntactic parsing, which consists of two parts. The first is a parsing framework based on an improved bottom-up chart parsing algorithm, which integrates the beam search strategy of the N-best algorithm and the heuristic function of the A* algorithm for pruning, and produces multiple parsing trees. The second is a novel evaluation model, which integrates contextual and partial lexical information into the traditional PCFG model and defines a new score function. Using this model, the tree with the highest score is selected as the best parsing tree. Finally, the contrasting experimental results are given. Keywords: syntactic parsing, PCFG, pruning, evaluation model.
Abstract: The world's largest Pre-stressed Concrete Cylinder
Pipe (PCCP) water supply project had a series of pipe failures which
occurred between 1999 and 2001. This led the Man-Made River
Authority (MMRA), the authority in charge of the implementation
and operation of the project, to set up a rehabilitation plan for the
conveyance system while maintaining the uninterrupted flow of
water to consumers. At the same time, MMRA recognized the need
for a long-term management tool that would facilitate repair and
maintenance decisions and enable taking the appropriate preventive
measures through continuous monitoring and estimation of the
remaining life of each pipe. This management tool is known as the
Pipe Risk Management System (PRMS) and is now in operation at
MMRA. Both the rehabilitation plan and the PRMS require the
availability of complete and accurate pipe construction and
manufacturing data.
This paper describes a systematic approach to data collection,
analysis, evaluation and correction for the construction and
manufacturing data files of phase I pipes, which are the platform for
the PRMS database and any other related decision support system.
Abstract: LDPC codes could be used in magnetic storage devices because of their better decoding performance compared to other error correction codes. However, their hardware implementation results in large and complex decoders. This is one of the main obstacles to incorporating the decoders in magnetic storage devices. We construct small, high-girth and high-rate column-weight 2 codes from cage graphs. Though these codes have lower performance compared to higher column-weight codes, they are easier to implement. The ease of implementation makes them more suitable for applications such as magnetic recording. Cages are the smallest known regular graphs of a given degree and girth, which give us the smallest known column-weight 2 codes for a given size, girth and rate.
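The column-weight 2 construction can be illustrated as follows: taking the vertex-edge incidence matrix of a graph as the parity-check matrix gives every column weight exactly 2, since each edge touches exactly two vertices, and cycles in the graph map to cycles of twice the length in the code's Tanner graph. The sketch below is an illustrative construction (not necessarily the exact one in the paper) using the Petersen graph, the (3,5)-cage:

```python
def incidence_matrix(edges, n_vertices):
    """Parity-check matrix from a graph: rows = vertices (check nodes),
    columns = edges (code bits). Every edge touches exactly two
    vertices, so every column has weight 2."""
    H = [[0] * len(edges) for _ in range(n_vertices)]
    for j, (u, v) in enumerate(edges):
        H[u][j] = 1
        H[v][j] = 1
    return H

# Petersen graph: the (3,5)-cage, i.e. the smallest 3-regular graph
# of girth 5 (10 vertices, 15 edges)
petersen = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0),   # outer 5-cycle
            (5, 7), (7, 9), (9, 6), (6, 8), (8, 5),   # inner pentagram
            (0, 5), (1, 6), (2, 7), (3, 8), (4, 9)]   # spokes
H = incidence_matrix(petersen, 10)
```

The resulting 10 × 15 parity-check matrix has column weight 2 and row weight 3, and its Tanner graph has girth 10, twice the girth of the Petersen graph.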
Abstract: Among the factors that characterize satellite communication
channels is their high bit error rate. We present a system for
still image transmission over noisy satellite channels. The system
couples image compression with error control codes to
improve the received image quality while maintaining its bandwidth
requirements. The proposed system is tested using high resolution
satellite imagery simulated over a Rician fading channel. Evaluation
results show improvement in overall system performance, including image quality
and bandwidth requirements, compared to similar systems with different
coding schemes.
Abstract: For more than a decade, many proposals and standards have been designed to deal with mobility issues; however, there are still serious limitations in basing solutions on them. In this paper we discuss the possibility of handling mobility at the application layer. We do this while revisiting the conventional implementation of the Two Phase Commit (2PC) protocol, which is a fundamental asset of transactional technology for ensuring the consistent commitment of distributed transactions. The solution is based on an execution framework providing an efficient extension that is aware of mobility and preserves the 2PC principle.
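For reference, the 2PC principle that such a framework must preserve can be sketched with minimal in-process classes (the names here are hypothetical and unrelated to the paper's mobility-aware framework): the coordinator commits globally only if every participant votes YES in the voting phase.

```python
class Participant:
    """Toy transaction participant with a local commit capability."""

    def __init__(self, can_commit=True):
        self.can_commit = can_commit
        self.state = "init"

    def prepare(self):
        # Phase 1: vote YES only if the local work can be made durable.
        self.state = "prepared" if self.can_commit else "aborted"
        return self.can_commit

    def commit(self):
        self.state = "committed"

    def abort(self):
        self.state = "aborted"


def two_phase_commit(participants):
    """Coordinator logic: unanimous YES votes -> global commit,
    any NO vote -> global abort."""
    # Phase 1 (voting)
    if all(p.prepare() for p in participants):
        # Phase 2 (decision): everyone voted YES
        for p in participants:
            p.commit()
        return "commit"
    for p in participants:
        p.abort()
    return "abort"
```

In a real distributed setting the prepare votes and decisions travel over the network and both phases are logged for recovery; the mobility issues the paper addresses arise precisely because participants may disconnect between the two phases.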
Abstract: Type 2 diabetes mellitus (T2DM) is a complex
metabolic disorder characterized by high blood glucose
resulting from insulin resistance and insufficiency due to
deterioration of β-cell function in the islets of Langerhans. T2DM is commonly
caused by a combination of inherited genetic variations and
lifestyle. Metallothionein (MT) is a cysteine-rich
protein responsible for zinc homeostasis, which is important in
insulin signaling and secretion, as well as for protecting the body from
reactive oxygen species (ROS). MT scavenges ROS and free
radicals, which are among the causes of T2DM and its
complications. The objective of this study was to investigate the
association of MT1A and MT2A polymorphisms between T2DM and
control subjects in the Malay population. This study involved 150
T2DM and 120 healthy individuals of Malay ethnicity with mixed
genders. Genomic DNA was extracted from buccal cells and
amplified for the MT1A and MT2A loci; banding patterns of 347 bp and
238 bp respectively were produced by means of the Polymerase
Chain Reaction (PCR). The PCR products were digested with MluCI
and Tsp45I restriction enzymes respectively, producing
fragment lengths of 158/189/347 bp and 103/135/238 bp
respectively. An ANOVA test was conducted and showed that there
was a significant difference between diabetic and control subjects for
age, BMI, WHR, SBP, FPG, HbA1c, LDL, TG, TC and family
history (P<0.05). The genotype
frequency for AA, AG and GG of the MT1A polymorphism was 72.7%,
22.7% and 4.7% in cases and 15%, 55% and 30% in controls
respectively. As for MT2A, the genotype frequency of GG, GC and CC
was 42.7%, 27.3% and 30% in cases and 5%, 40% and 55% in
controls respectively. Both polymorphisms show a significant difference
between the two investigated groups (P=0.000). A post hoc test
was conducted and shows a significant difference between the
genotypes within each polymorphism (P=0.000). The MT1A and
MT2A polymorphisms are believed to be reliable molecular
markers to distinguish T2DM subjects from healthy individuals in
the Malay population.
Abstract: In contrast to existing calculations of the temperature field of the profile part of a blade with convective cooling, which do not take into account multi-connectivity in the broad sense of this term, we develop mathematical models and highly effective combined (BIEM and FDM) numerical methods from the point of view of implementation on a PC. The theoretical substantiation of these methods is proved by the appropriate theorems.
Abstract: Face recognition is a field with multidimensional
applications, and a lot of work has been done, extensively, on most of
the details related to it. Face recognition using
PCA is one such idea. In this paper, PCA features are used for feature
extraction, and the face under consideration is matched with the test
image using eigenface coefficients. The
crux of the work lies in optimizing the Euclidean distance and in
testing the algorithm using MATLAB, which is an efficient tool
with a powerful user interface and simplicity in representing
complex images.
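A minimal sketch of the eigenface matching step described above, written in Python/NumPy rather than the paper's MATLAB code (function and parameter names are illustrative): faces are projected onto the principal components and the nearest training face in Euclidean distance is returned.

```python
import numpy as np

def eigenface_match(train, test, n_components=4):
    """Toy eigenface matcher.

    train: (n_samples, n_pixels) array of flattened face images.
    test:  (n_pixels,) flattened query face.
    Returns the index of the nearest training face in the space of
    the first n_components eigenface coefficients (Euclidean distance).
    """
    mean = train.mean(axis=0)
    X = train - mean
    # Right singular vectors of the centered data are the principal
    # axes ("eigenfaces")
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    W = Vt[:n_components]              # (k, n_pixels) eigenfaces
    train_coef = X @ W.T               # eigenface coefficients per face
    test_coef = (test - mean) @ W.T
    dists = np.linalg.norm(train_coef - test_coef, axis=1)
    return int(np.argmin(dists))
```

Because the comparison happens in the low-dimensional coefficient space, the distance computation is far cheaper than comparing raw pixel vectors, which is the optimization the abstract alludes to.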
Abstract: The study site was located in Ratchaburi Province,
Thailand. Four experimental plots in dry dipterocarp forest (DDF)
and four plots in mixed deciduous forest (MDF) were set up to
estimate the above-ground biomass of trees, saplings and bamboo.
Allometry equations were used to investigate the above-ground biomass
of these vegetation types. Seedlings and other understory vegetation were determined
using a direct harvesting method. Carbon storage in above-ground
biomass was calculated based on IPCC (2006) guidelines.
The results showed that the above-ground biomass of DDF at
20-40% slope,
Abstract: In this paper, in order to classify ORL database face
pictures, Principal Component Analysis (PCA) and Kernel Principal
Component Analysis (KPCA) feature extraction methods are used
together with Elman neural network and Support Vector Machine (SVM)
classification methods. The Elman network, as a recurrent neural network, is proposed
for modeling storage systems, and it is also used to review the
effect of the number of PCA components on the system's classification precision rate
and on the classification time of database pictures. Classification stages are
conducted with various numbers of components, and the results obtained
with the Elman neural network and the support vector
machine are compared. In the optimal configuration, 97.41% recognition
accuracy is obtained.
Abstract: The amount of dissolved oxygen in a river has a great direct effect on aquatic macroinvertebrates, and this in turn indirectly influences the regional ecosystem. In this paper we try to predict dissolved oxygen in rivers by employing a simple fuzzy logic model, the Wang-Mendel method. This model uses only previous records to estimate upcoming values. For this purpose, daily and hourly records of eight stations in the Au Sable watershed in Michigan, United States are employed, for a 12-year period and a 50-day period respectively. Calculations indicate that for long-period prediction it is better to increase the input intervals, but for filling missing data it is advisable to decrease the interval. Increasing the partitioning of input and output features influences accuracy only a little but makes the model very time consuming. Increasing the number of input data acts similarly to increasing the number of partitions. A large amount of training data does not essentially modify accuracy, so an optimum training length should be selected.
Abstract: Antimicrobial resistance is becoming a major factor in
virtually all hospital-acquired infections, which may soon be untreatable, and is a
serious public health problem. These concerns have led to a major
research effort to discover alternative strategies for the treatment of
bacterial infections. Nanobiotechnology is an upcoming and fast-developing
field with potential applications for human welfare. An
important area of nanotechnology is the development of reliable and
environmentally friendly processes for the synthesis of nanoscale particles
through biological systems. The present study reports on the
use of a fungal strain of the Aspergillus species for the extracellular synthesis
of bionanoparticles from a 1 mM silver nitrate (AgNO3) solution. The
report focuses on the synthesis of metallic bionanoparticles
of silver by reduction of aqueous Ag+ ions with the
culture supernatants of microorganisms. The bio-reduction of the
Ag+ ions in the solution was monitored in the aqueous
component, and the spectrum of the solution was measured with a
UV-visible spectrophotometer. The bionanoscale particles were
further characterized by Atomic Force Microscopy (AFM), Fourier
Transform Infrared Spectroscopy (FTIR) and thin layer
chromatography. The synthesized bionanoscale particles showed a
maximum absorption at 385 nm in the visible region. Atomic Force
Microscopy investigation of the silver bionanoparticles identified that
they ranged in size from 250 nm to 680 nm; the work analyzed the
antimicrobial efficacy of the silver bionanoparticles against various
multi-drug-resistant clinical isolates. The present study
emphasizes the applicability of synthesizing metallic
nanostructures and understanding the biochemical and molecular
mechanism of nanoparticle formation by the cell filtrate in order to
achieve better control over the size and polydispersity of the
nanoparticles. This would help to develop nanomedicine against
various multi-drug-resistant human pathogens.
Abstract: Validation of an automation system is an important issue. The goal is to check whether the system under investigation, modeled by a Petri net, never enters undesired states. Usually, tools dedicated to Petri nets such as DESIGN/CPN are used to perform reachability analysis. The biggest problem with this approach is that it is impossible to generate the full occurrence graph of the system because it is too large. In this paper, we show how computational methods such as temporal logic model checking and Groebner bases can be used to verify the correctness of the design of an automation system. We report our experimental results with two automation systems: the Automated Guided Vehicle (AGV) system and the traffic light system. Validation of these two systems took from 10 to 30 seconds on a PC, depending on the optimizing parameters.
Abstract: Detection of incipient abnormal events is important to
improve safety and reliability of machine operations and reduce losses
caused by failures. Improper set-up or alignment of parts often leads to
severe problems in many machines. The construction of prediction
models for predicting faulty conditions is quite essential in making
decisions on when to perform machine maintenance. This paper
presents a multivariate calibration monitoring approach based on the
statistical analysis of machine measurement data. The calibration
model is used to predict two faulty conditions from historical reference
data. This approach utilizes genetic algorithms (GA) based variable
selection, and we evaluate the predictive performance of several
prediction methods using real data. The results show that the
calibration model based on supervised probabilistic principal
component analysis (SPPCA) yielded the best performance in this work.
By adopting a proper variable selection scheme in calibration models,
the prediction performance can be improved by excluding
non-informative variables from their model building steps.
Abstract: The Unified Theory of Acceptance and Use of Technology
(UTAUT) model has demonstrated the influencing factors for generic
information system use, such as the tablet personal computer (TPC) and
mobile communication. However, in the context of digital library
systems, there has been very little effort to determine the factors affecting
the intention to use a digital library based on the UTAUT model. This
paper investigates factors that are expected to influence the intention
of postgraduate students to use a digital library, based on a modified
UTAUT model. The modified model comprises constructs
represented by several latent variables, namely performance
expectancy (PE), effort expectancy (EE), information quality (IQ)
and service quality (SQ), moderated by age, gender and
experience in using the digital library. Results show that performance
expectancy, effort expectancy and information quality are positively
related to the intention to use the digital library, while service quality is
negatively related to it. Age and
gender have shown no evidence of any significant interactions, while
experience in using the digital library significantly interacts with effort
expectancy and the intention to use the digital library. This provides
evidence of a moderating effect of experience on the intention to use
the digital library. It is expected that this research will shed new light
on research into acceptance and the intention to use the library in a digital
environment.
Abstract: This paper presents a novel idea for controlling computer
mouse cursor movement with human eyes. The working
of the product is described, showing how it helps people with special
needs share their knowledge with the world. A number of traditional
techniques, such as head and eye movement tracking systems,
exist for cursor control; they make use of image processing, in which
light is the primary source. Electro-oculography (EOG) is a new
technology for sensing eye signals with which the mouse cursor can be
controlled. The signals captured using sensors are first amplified,
then denoised and digitized, before being transferred to
a PC for software interfacing.
Abstract: In the modern era, the biggest challenge facing the
software industry is the advent of new technologies, so
software engineers are gearing themselves up to meet and manage
change in large software systems. They also find it difficult to deal
with software cognitive complexity. In the last few years many
metrics have been proposed to measure the cognitive complexity of
software. This paper aims at a comprehensive survey of software
cognitive complexity metrics. Some classic and efficient software
cognitive complexity metrics, such as Class Complexity (CC),
Weighted Class Complexity (WCC), Extended Weighted Class
Complexity (EWCC), Class Complexity due to Inheritance (CCI) and
Average Complexity of a program due to Inheritance (ACI), are
discussed and analyzed. The comparison and the relationships between these
software complexity metrics are also presented.
Abstract: The objective of this research is to study principal
component analysis for the classification of 67 soil samples collected from
different agricultural areas in the western part of Thailand. Six soil
properties were measured on the soil samples and used as the original
variables. Principal component analysis was applied to reduce the
number of original variables. A model based on the first two
principal components accounts for 72.24% of the total variance. Score
plots of the first two principal components were used to map the
agricultural areas, divided into horticulture, field crops and wetland.
The results showed some relationships between soil properties and
agricultural areas. PCA was shown to be a useful tool for the
classification of agricultural areas based on soil properties.
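The variance accounting described above can be reproduced generically: the fraction of total variance captured by the first k principal components of standardized data follows from the singular values of the centered data matrix. A minimal sketch (illustrative, not the study's actual computation or data):

```python
import numpy as np

def pca_explained_variance(X, k=2):
    """Fraction of total variance captured by the first k principal
    components of standardized data X (rows = samples, cols = variables)."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each variable
    # Squared singular values of the centered data are proportional
    # to the variances of the principal components
    s = np.linalg.svd(Z, compute_uv=False)
    var = s**2
    return var[:k].sum() / var.sum()

# Synthetic stand-in for the soil data: 67 samples, 6 variables driven
# by two latent factors, so two components should capture most variance
rng = np.random.default_rng(1)
factors = rng.normal(size=(67, 2))
X = factors @ rng.normal(size=(2, 6)) + 0.05 * rng.normal(size=(67, 6))
print(pca_explained_variance(X, k=2))
```

With genuinely two-factor data the first two components capture nearly all the variance; the study's 72.24% figure indicates that the six measured soil properties carry more than two independent sources of variation.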