Abstract: We review a knowledge-extractor model for constructing 3G killer applications. The success of 3G is essential for the government, as it has become part of the national telecommunications strategy. 3G wireless technologies can reach larger areas and increase the country's ICT penetration. To understand future customers' needs, operators require the proper information (knowledge) lying within them. Our work approaches future customers as a complex system whose complex knowledge may expose regular behavior. The hidden information about future 3G customers is revealed using fractal-based questionnaires, and further statistical analysis is then used to match the results with the operator's strategic plan. The development of 3G applications also considers their saturation time and subsequent improvement.
Abstract: The purpose of this study is to analyze the travel-information sources of island tourists, as well as their satisfaction with destination services. Questionnaires were administered, using convenience sampling, to tourists traveling from the island of Taiwan to the Penghu Islands to engage in tourism activities; a total of 889 valid questionnaires were collected. Statistical analysis showed that: 1. the main travel-information source for tourists to the Penghu Islands was "friends and family who have visited Penghu"; 2. among the services of the outlying islands of Penghu, tourists rated "friendly local residents" highest; 3. different demographic variables affect both the travel-information sources and service satisfaction. Based on these findings, the study offers operating suggestions to Penghu's tourism industry and its governing agencies, as well as suggestions for future research by other researchers.
Abstract: With the fast evolution of digital data exchange, information security has become very important in data storage and transmission. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. In this paper, we analyze the Advanced Encryption Standard (AES) and add a key stream generator (A5/1, W7) to AES to improve the encryption performance, mainly for images characterised by reduced entropy. Both techniques have been implemented for experimental purposes, and detailed results in terms of security analysis and implementation are given. A comparative study with traditional encryption algorithms shows the superiority of the modified algorithm.
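As background for the keystream-mixing step described above, the sketch below XORs data with the output of a toy 16-bit Fibonacci LFSR. This is a generic illustration only, assuming nothing about the paper's design: it is not the A5/1 or W7 generator, and it is not cryptographically secure.

```python
# Toy keystream mixing: a generic 16-bit Fibonacci LFSR (NOT A5/1 or W7,
# and not secure) XORed with the data, to illustrate the idea only.

def lfsr_stream(state, taps, nbits):
    """Yield nbits from a 16-bit Fibonacci LFSR given a seed and tap mask."""
    out = []
    for _ in range(nbits):
        out.append(state & 1)
        fb, s = 0, state & taps
        while s:                  # parity of the tapped bits
            fb ^= s & 1
            s >>= 1
        state = (state >> 1) | (fb << 15)
    return out

def xor_bytes(data, key_bits):
    """XOR a byte string with a keystream, consuming 8 bits per byte."""
    out = bytearray()
    for i, b in enumerate(data):
        k = 0
        for j in range(8):
            k = (k << 1) | key_bits[8 * i + j]
        out.append(b ^ k)
    return bytes(out)
```

Because XOR is an involution, applying the same keystream twice recovers the plaintext, which is the property that makes the mixing step reversible at the receiver.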
Abstract: The traditional Failure Mode and Effects Analysis
(FMEA) uses Risk Priority Number (RPN) to evaluate the risk level
of a component or process. The RPN index is determined by
calculating the product of severity, occurrence and detection indexes.
The most critically debated disadvantage of this approach is that
various sets of these three indexes may produce an identical value of
RPN. This research paper seeks to address the drawbacks in
traditional FMEA and to propose a new approach to overcome these
shortcomings. The Risk Priority Code (RPC) is used to prioritize
failure modes, when two or more failure modes have the same RPN.
A new method is proposed to prioritize failure modes, when there is a
disagreement in ranking scale for severity, occurrence and detection.
An Analysis of Variance (ANOVA) is used to compare means of
RPN values. The SPSS (Statistical Package for the Social Sciences) statistical analysis package is used to analyze the data. The results presented are based on two case studies. It is found that the proposed methodology resolves the limitations of the traditional FMEA approach.
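The tie problem the abstract criticises is easy to reproduce: the sketch below computes the classical RPN as the product of the three indexes and breaks ties with a simple severity-first rule. The failure modes and index values are invented, and the severity-first tie-break is only a stand-in for the paper's actual Risk Priority Code.

```python
# Classical RPN = severity * occurrence * detection; different (S, O, D)
# triples can yield identical RPNs, so a secondary ranking rule is needed.
# The modes and values are invented; the tie-break here (severity first)
# is a simple stand-in, not the paper's RPC definition.

def rpn(severity, occurrence, detection):
    return severity * occurrence * detection

modes = {
    "seal leak":     (8, 2, 5),   # RPN 80
    "shaft fatigue": (4, 5, 4),   # RPN 80 -> same RPN, different risk profile
    "sensor drift":  (3, 6, 2),   # RPN 36
}

# Rank by RPN, then by severity to separate the tied modes.
ranked = sorted(modes, key=lambda m: (rpn(*modes[m]), modes[m][0]),
                reverse=True)
```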
Abstract: Memory error detection and correction (EDAC) aims to secure the transaction of data between the central processing unit of a satellite onboard computer and its local memory. In this paper, the application of a double-bit error detection and correction method is described and implemented in Field Programmable Gate Array (FPGA) technology. The performance of the proposed EDAC method is measured and compared with that of two different EDAC devices, using the same FPGA technology. A statistical analysis of single-event upset (SEU) and multiple-bit upset (MBU) activity in commercial memories onboard the first Algerian microsatellite, Alsat-1, is given.
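For readers unfamiliar with EDAC codes, the sketch below shows a standard extended Hamming(8,4) SEC-DED code (single-error correction, double-error detection), the classical baseline for this kind of memory protection. It is background only, assuming nothing about the paper's FPGA method, which claims stronger double-bit handling.

```python
# Standard SEC-DED extended Hamming(8,4) code: corrects any single-bit
# error and detects (but cannot correct) any double-bit error. Background
# illustration only; not the paper's FPGA implementation.

def encode(d):
    """Encode 4 data bits [d0, d1, d2, d3] into an 8-bit codeword."""
    d0, d1, d2, d3 = d
    p1 = d0 ^ d1 ^ d3            # parity over codeword positions 1,3,5,7
    p2 = d0 ^ d2 ^ d3            # parity over codeword positions 2,3,6,7
    p4 = d1 ^ d2 ^ d3            # parity over codeword positions 4,5,6,7
    word = [p1, p2, d0, p4, d1, d2, d3]
    p0 = 0
    for b in word:               # overall parity bit enables double detection
        p0 ^= b
    return word + [p0]

def decode(w):
    """Return (status, data); status is 'ok', 'corrected' or 'double-error'."""
    w = list(w)
    s1 = w[0] ^ w[2] ^ w[4] ^ w[6]
    s2 = w[1] ^ w[2] ^ w[5] ^ w[6]
    s4 = w[3] ^ w[4] ^ w[5] ^ w[6]
    syndrome = s1 + 2 * s2 + 4 * s4      # 1-based position of a single error
    overall = 0
    for b in w:
        overall ^= b                     # zero if overall parity holds
    if syndrome == 0 and overall == 0:
        return 'ok', [w[2], w[4], w[5], w[6]]
    if overall == 1:                     # odd parity -> single error, fixable
        if syndrome:
            w[syndrome - 1] ^= 1         # flip the erroneous bit
        else:
            w[7] ^= 1                    # the error hit the parity bit itself
        return 'corrected', [w[2], w[4], w[5], w[6]]
    return 'double-error', None          # syndrome nonzero, parity even
```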
Abstract: Xanthan gum is one of the major commercial biopolymers. Due to its excellent rheological properties, xanthan gum is used in many applications, mainly in the food industry. Commercial production of xanthan gum uses glucose as the carbon substrate; consequently, the price of xanthan production is high. One way to decrease the price of xanthan is to use cheaper substrates, such as agricultural wastes. Iran is one of the biggest date-producing countries; however, approximately 50% of its date production is wasted annually. The goal of this study is to produce xanthan gum from waste dates using Xanthomonas campestris PTCC1473 by submerged fermentation. The effects of three variables (phosphorus amount, nitrogen amount, and agitation rate), each at three levels, were studied using response surface methodology (RSM). Results of the statistical analysis with the Design Expert 7.0.0 software showed that xanthan yield increased with an increasing level of phosphorus, and that a low level of nitrogen led to higher xanthan production. Increasing agitation also had a positive influence on the xanthan amount. The statistical model identified the optimum conditions for xanthan as a nitrogen amount of 3.15 g/l, a phosphorus amount of 5.03 g/l, and an agitation rate of 394.8 rpm. To validate the model, experiments were carried out at the optimum conditions. The mean result for xanthan was 6.72±0.26, close to the value predicted by RSM.
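The RSM step above amounts to fitting a second-order polynomial to a factorial design and solving for its stationary point. The sketch below does this for two coded factors on synthetic data generated from a known quadratic; the design points and coefficients are invented, not the fermentation data.

```python
import numpy as np

# RSM sketch: fit y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# by least squares, then solve grad(y) = 0 for the stationary point.
# Synthetic data from a known quadratic; NOT the fermentation dataset.

def fit_quadratic_optimum(x1, x2, y):
    """Fit a second-order response surface and return its stationary point."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    b = np.linalg.lstsq(A, y, rcond=None)[0]
    # grad = 0  =>  [[2*b11, b12], [b12, 2*b22]] @ x = -[b1, b2]
    H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
    g = -np.array([b[1], b[2]])
    return np.linalg.solve(H, g)

# 3x3 full factorial in coded levels -1, 0, +1 (as in a three-level design).
x1, x2 = np.meshgrid([-1.0, 0.0, 1.0], [-1.0, 0.0, 1.0])
x1, x2 = x1.ravel(), x2.ravel()
y = 5.0 + 1.2 * x1 - 0.8 * x2 - 1.0 * x1**2 - 1.5 * x2**2  # known optimum

opt = fit_quadratic_optimum(x1, x2, y)   # analytic optimum: (0.6, -0.2667)
```

In a real study the coded optimum would then be converted back to natural units (g/l, rpm), which is how point estimates like those quoted above are obtained.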
Abstract: Traditional higher-education classrooms allow lecturers to observe students' behaviours and responses to a particular pedagogy during learning in a way that can inform changes to the pedagogical approach. Within current e-learning systems it is difficult to perform continuous analysis of a cohort's behavioural tendency, making real-time pedagogical decisions difficult. This paper presents a Virtual Learning Process Environment (VLPE) based on the Business Process Management (BPM) conceptual framework. Within the VLPE, course designers can model various educational pedagogies in the form of learning-process workflows using an intuitive flow-diagram interface. These diagrams are used to visually track the learning progress of a cohort of students. This helps assess the effectiveness of the chosen pedagogy, providing the information required to improve course design. A case scenario of a cohort of students is presented, and quantitative statistical analysis of their learning-process performance is gathered and displayed in real time using dashboards.
Abstract: Most real queuing systems include special properties and constraints that cannot be analyzed directly using the results of solved classical queuing models. The absence of Markov-chain features, non-exponential patterns, and service constraints are among these conditions. This paper presents an applied general algorithm for analyzing and optimizing queuing systems. The stages of the algorithm are described through a real case study: an almost completely non-Markovian system with a limited number of customers and limited capacities, as well as many of the common exceptions found in real queuing networks. Simulation is used to optimize this system. The stages introduced in this article include primary modeling, determining the kind of queuing system, index definition, statistical analysis and goodness-of-fit testing, model validation, and simulation-based methods for optimizing the system.
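The simulation stage the abstract ends with can be illustrated with the smallest possible example: a single-server FIFO queue driven by explicit arrival and service times. This is a toy, assuming nothing about the case-study system, whose arrival and service patterns are not given in the abstract.

```python
# Toy single-server FIFO queue: given arrival times and service durations,
# compute each customer's waiting time. Illustrates the simulation step
# only; the real case-study system is non-Markovian and more complex.

def simulate_fifo(arrivals, services):
    """Return per-customer waiting times for a single-server FIFO queue."""
    t_free = 0.0                      # time when the server next becomes idle
    waits = []
    for arrive, serve in zip(arrivals, services):
        start = max(arrive, t_free)   # wait only if the server is still busy
        waits.append(start - arrive)
        t_free = start + serve
    return waits
```

With arrivals every time unit but a service time of 2, the backlog (and hence each wait) grows by one unit per customer, a quick sanity check on the bookkeeping.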
Abstract: This paper shows that the application of probability-statistical methods is unfounded at the early stage of diagnosing the technical condition of an aviation gas turbine engine (GTE), when the flight information is fuzzy, limited, and uncertain. Hence, the efficiency of applying the new Soft Computing technology at these diagnosing stages, using Fuzzy Logic and Neural Network methods, is considered. Fuzzy multiple linear and non-linear models (fuzzy regression equations), obtained on the basis of statistical fuzzy data, are trained with high accuracy. To make a more adequate model of the GTE technical condition, the dynamics of changes in the skewness and kurtosis coefficients are analysed. Studies of the changes in the skewness and kurtosis coefficient values show that the distributions of GTE operating parameters have a fuzzy character; hence, consideration of fuzzy skewness and kurtosis coefficients is expedient. Investigation of the dynamics of changes in the basic characteristics of GTE operating parameters leads to the conclusion that Fuzzy Statistical Analysis is necessary for the preliminary identification of the engines' technical condition. Studies of the changes in correlation-coefficient values also show their fuzzy character; therefore, the results of Fuzzy Correlation Analysis are used for model selection. To check model adequacy, the Fuzzy Multiple Correlation Coefficient of the Fuzzy Multiple Regression is considered. When the information is sufficient, a recurrent algorithm for identifying the aviation GTE technical condition (using Hard Computing technology) is applied to measurements of the input and output parameters of the multiple linear and non-linear generalised models in the presence of measurement noise (a new recursive Least Squares Method (LSM)). The developed GTE condition-monitoring system provides stage-by-stage estimation of the engine's technical condition. As an application of the given technique, the temperature condition of a new operating aviation engine was estimated.
Abstract: Eukaryotic protein-coding genes are interrupted by spliceosomal introns, which are removed from the RNA transcripts before translation into a protein. The exon-intron structures of different eukaryotic species are quite different from each other, and the evolution of such structures raises many questions. We try to address some of these questions using statistical analysis of whole genomes. We go through all the protein-coding genes in a genome and study correlations between the net length of all the exons in a gene, the number of exons, and the average length of an exon. We also take average values of these features for each chromosome and study correlations between those averages at the chromosomal level. Our data show universal features of exon-intron structures common to animals, plants, and protists (specifically, Arabidopsis thaliana, Caenorhabditis elegans, Drosophila melanogaster, Cryptococcus neoformans, Homo sapiens, Mus musculus, Oryza sativa, and Plasmodium falciparum). We have verified a linear correlation between the number of exons in a gene and the length of the protein coded by the gene: the protein length increases in proportion to the number of exons. On the other hand, the average length of an exon always decreases with the number of exons. Finally, chromosome clustering based on average chromosome properties and on the parameters of the linear regression between the number of exons in a gene and the net length of those exons demonstrates that these average chromosome properties are genome-specific features.
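The per-gene correlation step described above is ordinary least-squares regression of protein length on exon count plus a Pearson correlation. The sketch below runs it on an invented eight-gene table; the paper applies the same computation across whole genomes.

```python
import numpy as np

# OLS regression of protein length against exon count, plus the Pearson
# correlation coefficient. The gene table is invented for illustration;
# the study runs this over all protein-coding genes in a genome.

exon_counts = np.array([1, 2, 3, 4, 5, 6, 8, 10], dtype=float)
protein_len = np.array([210, 395, 612, 801, 1005, 1190, 1620, 1985],
                       dtype=float)   # residues, invented

# Design matrix [exon_count, 1] for y = slope * x + intercept.
A = np.vstack([exon_counts, np.ones_like(exon_counts)]).T
slope, intercept = np.linalg.lstsq(A, protein_len, rcond=None)[0]

r = np.corrcoef(exon_counts, protein_len)[0, 1]   # Pearson correlation
```

A slope of roughly 200 residues per exon on data like this is what "protein length increases in proportion to the number of exons" looks like numerically.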
Abstract: Functional magnetic resonance imaging (fMRI) is a noninvasive imaging technique that measures the hemodynamic response related to neural activity in the human brain. Event-related fMRI (efMRI) is a form of fMRI in which a series of fMRI images is time-locked to a stimulus presentation and averaged together over many trials. An event-related potential (ERP), in turn, is a measured brain response that is directly the result of a thought or perception. Here, the neuronal response of the human visual cortex in normal healthy subjects has been studied. The subjects were asked to perform a visual three-choice reaction task; from the responses of each subject, the corresponding neuronal activity in the visual cortex was imaged. The average number of neurons in the adult human primary visual cortex has been estimated at around 140 million per hemisphere. Statistical analysis of this experiment was done with the SPM5 (Statistical Parametric Mapping version 5) software. The results show a robust design for imaging the neuronal activity of the human visual cortex.
Abstract: In this paper, a new approach for quality assessment
tasks in lossy compressed digital video is proposed. The research
activity is based on the visual fixation data recorded by an eye
tracker. The method involves both a new paradigm for subjective quality evaluation and a subsequent statistical analysis that matches the subjective scores provided by the observers to the data obtained from the eye-tracker experiments. The study brings improvements to the
state of the art, as it solves some problems highlighted in literature.
The experiments prove that data obtained from an eye tracker can be
used to classify videos according to the level of impairment due to
compression. The paper presents the methodology, the experimental
results and their interpretation. Conclusions suggest that the eye
tracker can be useful in quality assessment, if data are collected and
analyzed in a proper way.
Abstract: The prevalence of non-organic constipation differs from country to country, and the reliability of the estimated rates is uncertain. Moreover, the clinical relevance of subdividing the heterogeneous functional constipation disorders into pre-defined subgroups is largely unknown. Aim: to estimate the prevalence of constipation in a population-based sample and to determine whether clinical subgroups can be identified. An age- and gender-stratified sample population from 5 Italian cities was evaluated using a previously validated questionnaire. Data mining by cluster analysis was used to determine constipation subgroups. Results: 1,500 complete interviews were obtained from 2,083 contacted households (72%). Self-reported constipation correlated poorly with symptom-based constipation, which was found in 496 subjects (33.1%). Cluster analysis identified four constipation subgroups, which correlated with the subgroups identified according to pre-defined symptom criteria. Significant differences in socio-demographics and lifestyle were observed among the subgroups.
Abstract: In this paper, a novel method is proposed for estimating the frequencies of multiple one-dimensional real-valued sinusoidal signals in the presence of additive Gaussian noise. A computationally simple frequency estimation method with efficient statistical performance is attractive in many array signal processing applications. The prime focus of this paper is to combine a subspace-based technique with a simple peak-search approach. The paper presents a variant of the Propagator Method (PM), in which a collaborative approach of SUMWE and the Propagator Method is applied to estimate the multiple real-valued sine-wave frequencies. A new data model is proposed in which the dimension of the signal subspace equals the number of frequencies present in the observation, whereas in the conventional MUSIC method the signal-subspace dimension is twice the number of frequencies when estimating the frequencies of real-valued sinusoidal signals. The statistical analysis of the proposed method is studied, and an explicit expression for the asymptotic (large-sample) mean-squared error (MSE), or variance, of the estimation error is derived. The performance of the method is demonstrated, and the theoretical analysis is substantiated through numerical examples. The proposed method achieves consistently high estimation accuracy and frequency resolution at lower SNR, which is verified through simulations comparing it with the conventional MUSIC, ESPRIT, and Propagator methods.
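For orientation, the sketch below implements the conventional MUSIC-style baseline the paper compares against, for a single real sinusoid: form a sample covariance, take the noise subspace from its eigendecomposition, and search a frequency grid for the pseudospectrum peak. Note the rank-2 signal subspace for one real sinusoid, which is the overhead the paper's data model removes. Signal parameters here are invented; the paper's PM/SUMWE variant avoids the eigendecomposition entirely.

```python
import numpy as np

# Conventional MUSIC-style pseudospectrum search for ONE real sinusoid.
# A real sinusoid occupies a rank-2 subspace (the point the paper's new
# data model improves on). Baseline illustration only, not the paper's
# PM/SUMWE method; the grid and snapshot length are arbitrary choices.

def music_freq(x, m=20, grid=None):
    """Estimate one real sinusoid frequency (cycles/sample) from samples x."""
    if grid is None:
        grid = np.linspace(0.01, 0.49, 481)
    N = len(x)
    # Sample covariance from overlapping length-m snapshots.
    X = np.array([x[i:i + m] for i in range(N - m + 1)])
    R = X.T @ X / X.shape[0]
    w, V = np.linalg.eigh(R)          # eigenvalues in ascending order
    En = V[:, :-2]                    # noise subspace (drop 2 signal vectors)
    n = np.arange(m)
    best_f, best_p = grid[0], -np.inf
    for f in grid:
        a = np.exp(2j * np.pi * f * n)                  # steering vector
        p = 1.0 / np.linalg.norm(En.conj().T @ a) ** 2  # pseudospectrum
        if p > best_p:
            best_f, best_p = f, p
    return best_f
```

At the true frequency the steering vector is (nearly) orthogonal to the noise subspace, so the pseudospectrum peaks there; away from it, the projection is large and the peak collapses.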
Abstract: The use of a Bayesian Hierarchical Model (BHM) to interpret breath measurements obtained during a 13C octanoic acid breath test (13COBT) is demonstrated. The statistical analysis was implemented using WinBUGS, a freely available computer package for Bayesian inference. A hierarchical setting was adopted in which poorly defined parameters associated with delayed gastric emptying (GE) were able to "borrow" strength from global distributions. This proved to be a sufficient tool for correcting the model's failures and the data inconsistencies apparent in conventional analyses employing a non-linear least squares (NLS) technique. Direct comparison of two parameters describing gastric emptying (tlag, the lag phase, and t1/2, the half-emptying time) revealed a strong correlation between the two methods. Despite our large dataset (n = 164), Bayesian modeling was fast and provided a successful fit for all subjects. In contrast, NLS failed to return acceptable estimates in cases where GE was delayed.
Abstract: The concentrations of As, Hg, Co, Cr and Cd were measured in each soil sample, and their spatial patterns were analyzed using the semivariogram approach of geostatistics and geographical information system technology. Multivariate statistical approaches (principal component analysis and cluster analysis) were used to identify the sources of the heavy metals and their spatial patterns. Principal component analysis, coupled with the correlations between heavy metals, showed that the primary inputs of As, Hg and Cd were anthropogenic, while Co and Cr were associated with pedogenic factors. Ordinary kriging was carried out to map the spatial patterns of the heavy metals. The main pollution sources identified were related to the use of urban and industrial wastewater. The results of this study are helpful for environmental risk assessment and for decision making on industrial adjustment and the remediation of soil pollution.
Abstract: Wave height and wind speed data were collected from three existing oil fields in the South China Sea, in the offshore Peninsular Malaysia, Sarawak and Sabah regions. Extreme values and other significant data were employed for the analysis. The data were
recorded from 1999 until 2008. The results show that offshore
structures are susceptible to unacceptable motions initiated by wind
and waves with worst structural impacts caused by extreme wave
heights. To protect offshore structures from damage, there is a need
to quantify descriptive statistics and determine spectra envelope of
wind speed and wave height, and to ascertain the frequency content
of each spectrum for offshore structures in the South China Sea
shallow waters using measured time series. The results indicate that the process is nonstationary; it is converted to a stationary process by first differencing the time series. For the descriptive statistical analysis, both wind speed and wave height have a significant influence on the offshore structure during the northeast monsoon, with a high mean wind speed of 13.5195 knots (σ = 6.3566 knots) and a high mean wave height of 2.3597 m (σ = 0.8690 m). Through observation of the
spectra, there is no clear dominant peak and the peaks fluctuate
randomly. Each wind speed spectrum and wave height spectrum has
its individual identifiable pattern. The wind speed spectrum tends to grow gradually in the lower frequency range and increases until it doubles in the higher frequency range, with a mean peak frequency range of 0.4104 Hz to 0.4721 Hz, while the wave height spectrum tends to grow drastically in the low frequency range and then fluctuates and decreases slightly in the high frequency range, with a mean peak frequency range of 0.2911 Hz to 0.3425 Hz.
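The first-differencing step used above to obtain stationarity can be shown in two lines: a random walk is nonstationary, but its first difference recovers the stationary increments. The series below is synthetic, not the measured wind/wave records.

```python
import numpy as np

# First differencing as a stationarity transform: a random walk is
# nonstationary (its variance grows with time), but differencing exactly
# inverts the cumulative sum and returns the stationary increments.
# Synthetic data only, not the South China Sea measurements.

rng = np.random.default_rng(42)
increments = rng.normal(0.0, 1.0, size=1000)   # stationary white noise
walk = np.cumsum(increments)                   # nonstationary random walk

diffed = np.diff(walk)                         # first difference of the walk
```

In practice one would difference the measured series the same way before estimating descriptive statistics and spectra, exactly as the abstract describes.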
Abstract: Recent trends in building construction in Libya are moving toward tall (high-rise) building projects. As a consequence, a better estimation of the lateral loading in the design process is becoming the focus of a safe and cost-effective building industry. By and large, Libya is not considered a potential earthquake-prone zone, making wind the dominant design lateral load. Current design practice in the country estimates wind speeds on a merely random basis, applying a certain factor of safety to the chosen wind speed. The need for a more accurate estimation of wind speeds in Libya was therefore the motivation behind this study. Wind speed records were collected from 22 meteorological stations in Libya and statistically analysed. The analysis of more than four decades of wind speed records suggests that the country can be divided into four zones of distinct wind speeds. A computer "survey" program was used to draw a design wind-speed contour map for the state of Libya.
The paper presents the statistical analysis of Libya's recorded wind speed data and proposes design wind speed values for a 50-year return period that cover the entire country.
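One common way to turn annual-maximum wind records into an N-year return-period design value is a Gumbel fit by the method of moments; the sketch below shows that computation. The abstract does not state which extreme-value model the study used, and the speeds here are invented, so this is background only.

```python
import math

# 50-year return value from annual maxima via a Gumbel (Type I) fit using
# the method of moments. The distribution choice is an assumption (the
# paper does not name its model) and the speeds are invented.

def gumbel_return_value(annual_maxima, return_period_years):
    """Return the value exceeded on average once per return period."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((v - mean) ** 2 for v in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi     # scale (method of moments)
    mu = mean - 0.5772 * beta                 # location (Euler-Mascheroni)
    p = 1.0 - 1.0 / return_period_years       # annual non-exceedance prob.
    return mu - beta * math.log(-math.log(p))

speeds = [22.1, 25.4, 19.8, 28.0, 24.3, 21.7, 26.5, 23.2]  # m/s, invented
v50 = gumbel_return_value(speeds, 50)
```

The 50-year value naturally lands above every observed annual maximum, which is why a short record can still yield a usable long-return-period design speed.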
Abstract: In this paper, the spatial variability of some chemical and physical soil properties was investigated in the mountain rangelands of Nesho, Mazandaran province, Iran. 110 soil samples were taken from a depth of 0-30 cm, using a systematic method on a 30×30 m grid, in regions with different vegetation cover, and were transported to the laboratory. Soil chemical and physical parameters, including acidity (pH), electrical conductivity, CaCO3, bulk density, particle density, total phosphorus, total nitrogen, available potassium, organic matter, saturation moisture, soil texture (percentages of sand, silt and clay), sodium, calcium and magnesium, were then measured in the laboratory. After data normalization, statistical analysis was performed to describe the soil properties, geostatistical analysis was used to indicate the spatial correlation between these properties, and maps of the spatial distribution of the soil properties were prepared using the kriging method. The results indicated that, in the study area, saturation moisture and the percentage of sand had the highest and lowest spatial correlation, respectively.
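The geostatistical step above rests on the empirical semivariogram, γ(h) = half the mean squared difference between sample values separated by distance h; kriging then fits a model (spherical, exponential, ...) to these estimates. The sketch below computes the empirical semivariogram on invented coordinates and values.

```python
import numpy as np

# Empirical semivariogram: gamma(h) = 0.5 * mean((z_i - z_j)^2) over all
# sample pairs whose separation falls in the lag bin around h. Invented
# data; a kriging workflow would next fit a variogram model to gammas.

def empirical_semivariogram(coords, values, bin_edges):
    """Return gamma for each lag bin defined by consecutive bin_edges."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    n = len(values)
    dists, sqdiffs = [], []
    for i in range(n):
        for j in range(i + 1, n):          # every unordered sample pair
            dists.append(np.linalg.norm(coords[i] - coords[j]))
            sqdiffs.append((values[i] - values[j]) ** 2)
    dists, sqdiffs = np.array(dists), np.array(sqdiffs)
    gammas = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (dists >= lo) & (dists < hi)
        gammas.append(0.5 * sqdiffs[mask].mean() if mask.any() else np.nan)
    return np.array(gammas)
```

A property with strong spatial correlation shows small γ at short lags rising toward a sill; a property with weak correlation (like the sand percentage reported above) looks flat from the start.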
Abstract: Numerous concrete structure projects are currently running in Libya as part of a US$50 billion government funding programme. The quality of the concrete used in 20 different construction projects was assessed, based mainly on the concrete compressive strength achieved. The projects are scattered all over the country and are at various levels of completeness. For most of these projects, the concrete compressive strength was obtained from test results of a 150 mm standard cube mold. Statistical analysis of the collected concrete compressive strengths reveals that the data in general followed a normal distribution pattern. The study covers the comparison and assessment of concrete quality aspects such as quality control, strength range, data standard deviation, data scatter, and the ratio of minimum strength to design strength. Site quality control for these projects ranged from very good to poor according to the ACI 214 criteria [1]. The ranges (Rg) of the strength (maximum strength minus minimum strength) divided by the average strength are from 34% to 160%. Data scatter, measured as the range (Rg) divided by the standard deviation (σ), is found to be 1.82 to 11.04, indicating that the range is ±3σ. International construction companies working in Libya follow different assessment criteria for concrete compressive strength in the absence of a national unified procedure. The study reveals that assessments of concrete quality conducted by these construction companies usually meet their adopted (internal) standards, but sometimes fail to meet internationally known standard requirements. The assessment of concrete presented in this paper is based on ACI and British standards and on proposed Libyan concrete strength assessment criteria.
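The scatter measures quoted above (the range Rg, Rg over the mean, and Rg over the standard deviation) are straightforward to compute; the sketch below does so on an invented set of cube strengths, not the Libyan project data, and does not attempt the ACI 214 quality classification itself.

```python
import statistics

# The scatter measures named in the abstract, computed on invented cube
# strengths (MPa). Compare range_ratio with the reported 34%-160% band
# and scatter with the reported 1.82-11.04 band; no ACI 214 grading here.

strengths = [31.5, 28.0, 35.2, 30.1, 26.4, 33.8, 29.7, 32.6]  # MPa, invented

rg = max(strengths) - min(strengths)      # strength range Rg
mean = statistics.mean(strengths)
sd = statistics.stdev(strengths)          # sample standard deviation

range_ratio = rg / mean                   # Rg divided by average strength
scatter = rg / sd                         # Rg divided by standard deviation
```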