Abstract: e-Government structures permit the government to operate in a more transparent and accountable manner, which increases the power of the individual in relation to that of the government. This paper identifies the factors that determine customers' attitudes towards e-Government services, using a theoretical model based on the Technology Acceptance Model. Data relating to the constructs were collected from 200 respondents. The research model was tested using Structural Equation Modeling (SEM) techniques via the Analysis of Moment Structures (AMOS 16) software. SEM is a comprehensive approach to testing hypotheses about relations among observed and latent variables. The proposed model fits the data well. The results demonstrate that acceptance of e-Government services can be explained in terms of compatibility and attitude towards those services: users for whom the setup of the services is compatible with the way they work are more likely to adopt e-Government services, owing to their familiarity with the Internet for various official, personal, and recreational uses. Managerial implications for government policy makers, government agencies, and system developers are also discussed.
Abstract: The Internet is the global data communications
infrastructure based on the interconnection of both public and private
networks using protocols that implement Internetworking on a global
scale. Hence the control of protocol and infrastructure development,
resource allocation and network operation are crucial and interlinked
aspects. Internet Governance is a hotly debated and contentious
subject that refers to the global control and operation of key Internet
infrastructure such as domain name servers and resources such as
domain names. It is impossible to separate technical and political
positions as they are interlinked. Furthermore, the existence of a
global market, transparency and competition impact upon Internet
Governance and related topics such as network neutrality and
security. Current trends and developments regarding Internet
governance with a focus on the policy-making process, security and
control have been observed in order to evaluate current and future
implications for the Internet. The multi-stakeholder approach to
Internet Governance discussed in this paper presents a number of
opportunities, issues and developments that will affect the future
direction of the Internet. Internet operation, maintenance and
advisory organisations such as the Internet Corporation for Assigned
Names and Numbers (ICANN) or the Internet Governance Forum
(IGF) are currently in the process of formulating policies for future
Internet Governance. Given the controversial nature of the issues at
stake and the current lack of agreement it is predicted that
institutional as well as market governance will remain present for
network access and content.
Abstract: This paper presents a technical speaker adaptation method called WMLLR, which is based on maximum likelihood linear regression (MLLR). In MLLR, a linear regression-based transform which adapts the HMM mean vectors is calculated so as to maximize the likelihood of the adaptation data. In this paper, the prior knowledge of the initial model is adequately incorporated into the adaptation. A series of speaker adaptation experiments is carried out on a database of 30 famous city names to investigate the efficiency of the proposed method. Experimental results show that the WMLLR method outperforms the conventional MLLR method, especially when only a few utterances from a new speaker are available for adaptation.
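The core MLLR operation described above can be sketched as follows. This is an illustrative fragment, not the paper's exact WMLLR formulation: MLLR adapts HMM Gaussian means with a shared affine transform, and the interpolation weight `w` below is a hypothetical stand-in for how prior knowledge of the initial model might be incorporated.

```python
import numpy as np

def mllr_adapt(means, A, b):
    """Apply a shared affine transform mu' = A @ mu + b to HMM mean vectors."""
    return means @ A.T + b

def wmllr_adapt(means, A, b, w):
    """Interpolate between MLLR-adapted means and the prior (initial) means.
    w = 1 recovers plain MLLR; w = 0 keeps the initial model unchanged.
    (The weighting scheme here is illustrative, not the paper's.)"""
    return w * mllr_adapt(means, A, b) + (1.0 - w) * means

means = np.array([[1.0, 2.0], [3.0, 4.0]])  # two toy Gaussian mean vectors
A = np.eye(2) * 1.1                          # toy regression matrix
b = np.array([0.5, -0.5])                    # toy bias vector
adapted = wmllr_adapt(means, A, b, w=0.7)
print(adapted)
```

In a real system, `A` and `b` would be estimated by maximizing the likelihood of the adaptation utterances rather than chosen by hand.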
Abstract: To evaluate the ability to predict xerostomia after radiotherapy, we constructed and compared neural network and logistic regression models. In this study, 61 patients who completed a questionnaire about their quality of life (QoL) before and after a full course of radiation therapy were included. Based on this questionnaire, statistical data about the condition of the patients' salivary glands were obtained, and these variables were used as the inputs of the neural network and logistic regression models in order to predict the probability of xerostomia. Seven variables were selected from the statistical data according to Cramer's V and point-biserial correlation values and were used to train each model. The resulting performance was 0.88 and 0.89 for AUC, 9.20 and 7.65 for SSE, and 13.7% and 19.0% for MAPE, for the neural network and logistic regression models respectively. These figures demonstrate that both neural network and logistic regression methods are effective for predicting the condition of the parotid glands.
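The three comparison metrics reported above (AUC, SSE, MAPE) can be sketched as follows, computed from predicted probabilities. The toy data are illustrative, not the study's patient data.

```python
import numpy as np

def auc(y_true, y_score):
    """Rank-based AUC: probability that a positive case outranks a negative one."""
    pos = y_score[y_true == 1]
    neg = y_score[y_true == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sse(y_true, y_score):
    """Sum of squared errors between outcomes and predicted probabilities."""
    return float(np.sum((y_true - y_score) ** 2))

def mape(y_true, y_score):
    """Mean absolute percentage error; only meaningful for nonzero targets."""
    mask = y_true != 0
    return float(np.mean(np.abs((y_true[mask] - y_score[mask]) / y_true[mask])))

y = np.array([1, 1, 0, 0])          # toy binary outcomes
p = np.array([0.9, 0.7, 0.4, 0.2])  # toy predicted probabilities
print(auc(y, p), sse(y, p))
```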
Abstract: This study demonstrates the use of Class F fly ash in
combination with lime or lime kiln dust in the full depth reclamation
(FDR) of asphalt pavements. FDR, in the context of this paper, is a
process of pulverizing a predetermined amount of flexible pavement
that is structurally deficient, blending it with chemical additives and
water, and compacting it in place to construct a new stabilized base
course. Test sections of two structurally deficient asphalt pavements
were reclaimed using Class F fly ash in combination with lime and
lime kiln dust. In addition, control sections were constructed using
cement, cement and emulsion, lime kiln dust and emulsion, and mill
and fill. The service performance and structural behavior of the FDR
pavement test sections were monitored to determine how the fly ash
sections compared to other more traditional pavement rehabilitation
techniques. Service performance and structural behavior were
determined with the use of sensors embedded in the road and Falling
Weight Deflectometer (FWD) tests. Monitoring results of the FWD
tests conducted up to 2 years after reclamation show that the cement,
fly ash+LKD, and fly ash+lime sections exhibited two year resilient
modulus values comparable to open graded cement stabilized
aggregates (more than 750 ksi). The cement treatment resulted in a
significant increase in resilient modulus within 3 weeks of
construction; beyond this curing time, the stiffness gain was slow.
The fly ash+LKD and fly ash+lime test sections, on the other hand,
exhibited a slower short-term increase in stiffness. The fly
ash+LKD and fly ash+lime section average resilient modulus values
at two years after construction were in excess of 800 ksi. Additional
longer-term testing data will be available from ongoing pavement
performance and environmental condition data collection at the two
pavement sites.
Abstract: Experimental data are presented to show the influence of different types of chemical demulsifier on the stability and demulsification of emulsions. Three groups of demulsifier with different functional groups were used in this work, namely amines, alcohols, and polyhydric alcohols. The results obtained in this study expose the capability of chemical breaking agents to destabilize water-in-crude-oil emulsions. The present study found that the molecular weight of the demulsifier influenced the ability of the emulsion to separate.
Abstract: Cancers can normally be marked by a number of differentially expressed genes which show enormous potential as biomarkers for a certain disease. In recent years, cancer classification based on the investigation of gene expression profiles derived from high-throughput microarrays has been widely used. The selection of discriminative genes is, therefore, an essential preprocessing step in carcinogenesis studies. In this paper, we propose a novel gene selector using information-theoretic measures for biological discovery. This multivariate filter is a four-stage framework comprising analyses of feature relevance, feature interdependence, feature redundancy-dependence, and subset rankings, and it has been examined on the colon cancer data set. Our experimental results show that the proposed method outperformed other information-theoretic filters in all aspects of classification error and classification performance.
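The relevance stage of such an information-theoretic filter can be sketched as follows: rank (discretized) features by mutual information with the class label. This is a minimal illustration of the first stage only; the four-stage framework described above also analyzes feature interdependence, redundancy-dependence, and subset rankings.

```python
import numpy as np
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a discrete sequence."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for discrete sequences."""
    joint = list(zip(x, y))
    return entropy(x) + entropy(y) - entropy(joint)

# Toy discretized expression matrix: 6 samples x 3 genes, binary class label.
X = np.array([[0, 1, 0],
              [0, 1, 1],
              [0, 0, 0],
              [1, 0, 1],
              [1, 0, 0],
              [1, 1, 1]])
y = np.array([0, 0, 0, 1, 1, 1])

scores = [mutual_information(X[:, j].tolist(), y.tolist()) for j in range(X.shape[1])]
ranking = np.argsort(scores)[::-1]  # most relevant gene first
print(scores, ranking)
```

Here gene 0 perfectly separates the classes and therefore tops the ranking with one full bit of mutual information.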
Abstract: In this study, active tendons with Proportional-Integral-Derivative (PID) controllers were applied to an SDOF and an MDOF building model. Physical models of the buildings were constituted with virtual springs, dampers, and rigid masses. The equations of motion for all degrees of freedom were then obtained, and Matlab Simulink was used to build the corresponding block diagrams. The controller parameters were found by trial and error. After earthquake acceleration data were applied to the systems, building responses such as displacements, velocities, accelerations, and transfer functions were analyzed for all degrees of freedom. Comparisons of displacement vs. time, velocity vs. time, acceleration vs. time, and transfer function (dB) vs. frequency (Hz) were made for the uncontrolled and controlled buildings. The results suggest that the method is feasible.
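The control idea above can be sketched in a few lines. This is a minimal illustration, not the paper's Simulink model: a discrete PID controller supplies an active-tendon force on an SDOF model m·x'' + c·x' + k·x = -m·ag + u, integrated with explicit Euler. All numerical values are illustrative only.

```python
def simulate(kp, ki, kd, m=1000.0, c=200.0, k=50000.0, dt=0.001, steps=5000):
    """Return the peak displacement of a toy SDOF building under PID control."""
    x, v, integ, prev_err = 0.0, 0.0, 0.0, 0.0
    ag = 1.0  # constant ground-acceleration step (toy excitation)
    peak = 0.0
    for _ in range(steps):
        err = -x                                  # regulate displacement to zero
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv    # tendon control force
        prev_err = err
        a = (-m * ag - c * v - k * x + u) / m     # equation of motion
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

uncontrolled = simulate(0.0, 0.0, 0.0)
controlled = simulate(kp=2.0e5, ki=1.0e4, kd=1.0e4)
print(uncontrolled, controlled)
```

With the (hand-picked, not tuned) gains above, the controlled peak displacement is substantially smaller than the uncontrolled one, mirroring the qualitative comparison made in the study.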
Abstract: One problem of synthetic sunflower cultivation is the
erratic germination of the seeds. To improve germination,
pre-sowing seed treatment with ultrasound was tested. All treatments
were carried out at 40 kHz frequency with the intensities of 40, 60,
80 and 100% of the ultrasonic generator total power (250 W) for the
durations of 5, 10, 15 and 20 minutes. Data on seed germination
percentage, seed vigor index (SVI), root and shoot lengths of
seedlings were collected. The results showed that germination, SVI,
root and shoot lengths of ultrasonic treated seedlings were different
from the control, depending on intensity of the ultrasound. The
effects of ultrasonic treatment were significant on germination,
resulting in a maximum increase of 43% at 40 and 60% intensities
compared to that of the control seeds. In addition, seedlings of these 2
treatments had higher SVI and longer root and shoot lengths than
the control seedlings. All treatment durations resulted in higher
germination and SVI, and longer root and shoot lengths of
seedlings than the control. Among the treatment durations, only SVI
and seedling root length were significantly different.
Abstract: The common bean is the most important grain legume for direct human consumption in the world, and BCMV is one of the world's most serious bean diseases, able to reduce the yield and quality of the harvested product. To determine the best tolerance index to BCMV and to identify tolerant genotypes, two experiments were conducted under field conditions. Twenty-five common bean genotypes were sown in two separate randomized complete block (RCB) designs with three replications, under contamination and non-contamination conditions. On the basis of the index correlations, GMP, MP, and HARM were determined to be the most suitable tolerance indices. Principal components analysis indicated that the first two components together explained 98.52% of the variation in the data; the first and second components were named potential yield and stress susceptibility, respectively. Based on the assessment of the BCMV tolerance indices and biplot analysis, WA8563-4, WA8563-2, and Cardinal were the genotypes that exhibited potential seed yield under both contamination and non-contamination conditions.
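The three indices named above have standard definitions in the stress-tolerance literature, sketched below with Yp the yield under non-contamination and Ys the yield under contamination; the yield values are illustrative, not the study's data.

```python
import math

def mp(yp, ys):
    """Mean productivity: arithmetic mean of the two yields."""
    return (yp + ys) / 2.0

def gmp(yp, ys):
    """Geometric mean productivity."""
    return math.sqrt(yp * ys)

def harm(yp, ys):
    """Harmonic mean of the two yields."""
    return 2.0 * yp * ys / (yp + ys)

yp, ys = 3.2, 2.0  # toy seed yields (t/ha)
print(mp(yp, ys), gmp(yp, ys), harm(yp, ys))
```

Genotypes scoring high on all three indices yield well under both conditions, which is why these indices correlate with the "potential yield" component of the biplot.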
Abstract: This paper presents the optimization problem for the multi-element synthetic transmit aperture (MSTA) method in ultrasound imaging applications. The optimal choice of the transmit aperture size is performed as a trade-off between the lateral resolution, penetration depth, and frame rate. Results of the analysis obtained by the developed optimization algorithm are presented, with maximum penetration depth and the best lateral resolution at given depths chosen as the optimization criteria. The optimization algorithm was tested using synthetic aperture data of point reflectors simulated by the Field II program for Matlab® for the case of a 5 MHz, 128-element linear transducer array with 0.48 mm pitch. Visualizations of experimentally obtained synthetic aperture data of a tissue-mimicking phantom and of in vitro measurements of beef liver are also shown. These data were obtained using the SonixTOUCH Research system equipped with a linear 4 MHz, 128-element transducer with 0.3 mm element pitch, 0.28 mm element width, and 70% fractional bandwidth, excited by a one-cycle sine burst at the transducer's center frequency.
Abstract: The number of frameworks conceived for e-learning is constantly increasing; unfortunately, the creators of learning materials and the educational institutions engaged in e-learning adopt a proprietary approach, in which the developed products (courses, activities, exercises, etc.) can be exploited only in the framework where they were conceived, and their use in other learning environments requires a costly adaptation in terms of time and effort. Each framework proposes courses whose organization, contents, modes of interaction, and presentation are identical for all learners; unfortunately, learners are heterogeneous and are not interested in the same information, but only in services or documents adapted to their needs. The current trend for e-learning frameworks is the interoperability of learning materials. Several standards exist (DCMI (Dublin Core Metadata Initiative) [2], LOM (Learning Object Metadata) [1], SCORM (Shareable Content Object Reference Model) [6][7][8], ARIADNE (Alliance of Remote Instructional Authoring and Distribution Networks for Europe) [9], CANCORE (Canadian Core Learning Resource Metadata Application Profiles) [3]); they all converge on the idea of learning objects, and they are also interested in adapting learning materials to the learners' profiles. This article proposes an approach for the composition of courses adapted to the various profiles (knowledge, preferences, objectives) of learners, based on two ontologies (the domain to be taught and an educational ontology) and on learning objects.
Abstract: Pakistani doctors (MBBS) are emigrating to developed countries for professional advancement. This study aims to highlight the causes and consequences of doctors' brain drain from Pakistan. Primary data were collected at Mayo Hospital, Lahore by interviewing doctors (n=100) selected through a systematic random sampling technique. The study found that various socio-economic and political conditions act as push and pull factors for the brain drain of doctors from Pakistan. A majority of doctors (83%) cited poor remuneration and the professional infrastructure of the health department as push factors; 81% claimed that continuous political instability and the threat of terrorism are responsible for the emigration of doctors; and 84% of respondents considered the scarcity of opportunities for further study responsible for their emigration. The brain drain of doctors is badly affecting the health sector's policies and programs, standard doctor-patient ratios, and the quality of health services.
Abstract: A gold passbook is an investment tool that is especially suitable for investors making small investments in solid gold. The gold passbook carries lower risk than other ways of investing in gold, but its price is still affected by the gold price, and many factors influence the gold price. Therefore, building a model to predict the price of the gold passbook can both reduce the risk of investment and increase the benefits. This study investigates the important factors that influence the gold passbook price and utilizes the Group Method of Data Handling (GMDH) to build the predictive model. This method can not only identify the significant variables but also perform well in prediction. The significant variables for the gold passbook price identified by GMDH are the US dollar exchange rate, international petroleum price, unemployment rate, wholesale price index, rediscount rate, foreign exchange reserves, misery index, prosperity coincident index, and industrial index.
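The GMDH idea can be sketched in a minimal one-layer form (this is an illustration of the general technique, not the study's full model): for every pair of candidate inputs, fit a quadratic polynomial neuron by least squares on a training split, then rank the pairs by validation error. The economic series named above would play the role of the input columns; here the data are synthetic.

```python
import numpy as np

def poly_features(x1, x2):
    """Quadratic GMDH neuron basis: 1, x1, x2, x1*x2, x1^2, x2^2."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

def gmdh_layer(X_tr, y_tr, X_va, y_va):
    """Return (validation_error, i, j, coeffs) for the best input pair."""
    best = None
    n = X_tr.shape[1]
    for i in range(n):
        for j in range(i + 1, n):
            A = poly_features(X_tr[:, i], X_tr[:, j])
            coef, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
            pred = poly_features(X_va[:, i], X_va[:, j]) @ coef
            err = float(np.mean((pred - y_va) ** 2))
            if best is None or err < best[0]:
                best = (err, i, j, coef)
    return best

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                              # synthetic inputs
y = 1.5 * X[:, 0] - 0.8 * X[:, 2] + 0.3 * X[:, 0] * X[:, 2]  # depends on cols 0, 2
err, i, j, coef = gmdh_layer(X[:100], y[:100], X[100:], y[100:])
print(err, (i, j))
```

The external validation split is what lets GMDH select significant variables automatically: pairs involving irrelevant inputs incur large validation error and are discarded.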
Abstract: PPX (Pretty Printer for XML) is a query language that offers a concise method of describing how XML data are formatted into HTML. In this paper, we propose a simple formatting specification that combines automatic layout operators and variables in the layout expression of the GENERATE clause of PPX. This method can automatically format irregular XML data included in part of an XML document, using layout decision rules that refer to the DTD. In our experiment, a quick comparison shows that PPX requires far less description than XSLT or XQuery programs performing the same tasks.
Abstract: The deep and radical social reforms of the 1990s in many Eastern European countries caused changes in the field of Information Technology (IT). Inefficient information technologies were rapidly replaced with state-of-the-art IT solutions; for example, Eastern European countries now have a high penetration of high-quality, high-speed Internet. The authors took part in the introduction of these changes at Latvia's leading IT research institute. Drawing on that experience, this paper offers an IT-services-based model for analyzing these change and development processes in higher education and research, i.e., for the development of research e-infrastructure. Compared to international practice, such services were developed in Eastern Europe in an untraditional way, which produced swift and positive technological change.
Abstract: More and more home videos are being generated with the ever-growing popularity of digital cameras and camcorders. For many home videos, a photo rendering, whether capturing a moment or a scene within the video, provides a complementary representation to the video. In this paper, a video motion mining framework for creative rendering is presented. The user's capture intent is derived by analyzing video motions, and respective metadata are generated for each capture type. The metadata can be used in a number of applications, such as creating video thumbnails, generating panorama posters, and producing slideshows of the video.
Abstract: This article outlines the conceptualization and implementation of an intelligent system capable of extracting knowledge from databases. The use of hybridized features of both Rough and Fuzzy Set theory gives the developed system flexibility in dealing with discrete as well as continuous datasets. A raw data set provided to the system is initially transformed into a computer-legible format, followed by pruning of the data set. The refined data set is then processed through various Rough Set operators which enable the discovery of parameter relationships and interdependencies. The discovered knowledge is automatically transformed into a rule base expressed in Fuzzy terms. Two exemplary cancer repository datasets (for breast and lung cancer) have been used to test the proposed framework.
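The basic Rough Set operators mentioned above can be sketched as follows: the lower and upper approximations of a target set of records with respect to the indiscernibility relation induced by a set of attributes. The toy records are illustrative, not the cancer repository data.

```python
from collections import defaultdict

def approximations(records, attrs, target):
    """records: {id: {attr: value}}; target: set of record ids.
    Returns the (lower, upper) Rough Set approximations of `target`."""
    blocks = defaultdict(set)
    for rid, row in records.items():
        key = tuple(row[a] for a in attrs)   # indiscernibility-class key
        blocks[key].add(rid)
    lower, upper = set(), set()
    for block in blocks.values():
        if block <= target:
            lower |= block                   # block certainly inside the target
        if block & target:
            upper |= block                   # block possibly inside the target
    return lower, upper

records = {
    1: {"size": "big", "shape": "round"},
    2: {"size": "big", "shape": "round"},
    3: {"size": "big", "shape": "oval"},
    4: {"size": "small", "shape": "oval"},
}
lower, upper = approximations(records, ["size", "shape"], target={1, 3})
print(sorted(lower), sorted(upper))
```

Records 1 and 2 are indiscernible under the chosen attributes, so record 1 falls only in the upper approximation; the gap between the two approximations (the boundary region) is what the fuzzy rule base would then describe in graded terms.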
Abstract: Using mini-modules of Tmotes, it is possible to automate a small personal area network; this idea can be extended to large networks by implementing multi-hop routing. By linking the various Tmotes using programming languages such as nesC and Java, with transmitter and receiver sections, a network can be monitored. It is foreseen that, depending on the application, a long range at a low data transfer rate or average throughput may be an acceptable trade-off. To reduce the overall costs involved, the optimum number of Tmotes to be used under various conditions (indoor/outdoor) is to be deduced. By analyzing the data rates or throughputs at various Tmote locations, it is possible to deduce an optimal number of Tmotes for a specific network. This paper deals with the determination of optimum distances to reduce the cost and increase the reliability of the entire sensor network with Wireless Local Loop (WLL) capability.
Abstract: The aim of this paper is to present a new method which can be used for the progressive transmission of electrocardiograms (ECG). The idea consists in transforming any ECG signal into an image containing one beat in each row. In a first step, the beats are synchronized in order to reduce the high frequencies due to inter-beat transitions. The resulting image is then transformed using a discrete version of the Radon Transform (DRT). Transmitting the ECG thus amounts to transmitting the most significant energy of the transformed image in the Radon domain. For decoding, the receiver needs the inverse Radon Transform as well as the two synchronization frames.
The presented protocol can be adapted for lossy to lossless compression systems. In lossy mode, we show that the compression ratio can be multiplied by an average factor of 2 for an acceptable quality of the reconstructed signal. These results were obtained on real signals from the MIT database.
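The first step described above, forming an image with one beat per row, can be sketched as follows. This is a minimal illustration only: beat onsets are assumed to be given, shorter beats are zero-padded, and the Radon-domain coding itself is not reproduced here.

```python
import numpy as np

def beats_to_image(signal, onsets):
    """signal: 1-D array; onsets: sorted sample indices of beat starts.
    Returns a 2-D image with one (zero-padded) beat per row."""
    bounds = list(onsets) + [len(signal)]
    segments = [signal[a:b] for a, b in zip(bounds, bounds[1:])]
    width = max(len(s) for s in segments)
    image = np.zeros((len(segments), width))
    for row, seg in enumerate(segments):
        image[row, :len(seg)] = seg      # one beat per row, zero-padded
    return image

# Toy quasi-periodic signal with slightly varying beat lengths.
beat = np.sin(np.linspace(0, np.pi, 50))
signal = np.concatenate([beat, beat[:48], beat])
image = beats_to_image(signal, [0, 50, 98])
print(image.shape)
```

Because successive rows of this image are highly similar, most of its energy concentrates in a few Radon-domain coefficients, which is what makes progressive transmission by decreasing coefficient energy effective.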