Abstract: The purpose of this study is to investigate the effects
of the modality principle in instructional software on first-grade
pupils' achievement in learning the Arabic language. Two modes of
instructional software were systematically designed and developed:
audio with images (AI) and text with images (TI). A
quasi-experimental design was used in the study. The sample
consisted of 123 male and female pupils from the Irbid Education
Directorate, Jordan. The pupils were randomly assigned to one of
the two modes. The independent variables were the two modes of the
instructional software, the pupils' achievement levels in the
Arabic language class, and gender. The dependent variable was the
pupils' achievement on the Arabic language test. The theoretical
framework of this study was based on Mayer's Cognitive Theory of
Multimedia Learning. Four hypotheses were postulated and tested.
Analysis of variance (ANOVA) showed that pupils using the AI mode
performed significantly better than those using the TI mode. The
study concluded that the audio-with-images mode was a more
effective aid to learning than the text-with-images mode.
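The group comparison above rests on a one-way ANOVA. A minimal sketch of the F statistic it computes, with invented scores standing in for the study's data (which are not reproduced here):

```python
# One-way ANOVA F statistic, computed from scratch.
from statistics import mean

def one_way_anova_f(groups):
    """Return the F statistic for a one-way ANOVA over a list of groups."""
    scores = [x for g in groups for x in g]
    grand = mean(scores)
    k, n = len(groups), len(scores)
    # Between-group and within-group sums of squares.
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

ai = [85, 90, 78, 92, 88]  # hypothetical audio-with-images scores
ti = [70, 75, 80, 72, 74]  # hypothetical text-with-images scores
f_stat = one_way_anova_f([ai, ti])
```

A large F relative to the critical value of the F(k-1, n-k) distribution indicates a significant difference between the modes.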
Abstract: In this experimental investigation, shake table tests
were conducted on two reduced-scale models representing a typical
single-room building constructed from Compressed Stabilized Earth
Blocks (CSEB) made of locally available soil. One model was built
with earthquake-resisting features (EQRF), namely sill, lintel, and
vertical bands to control building vibration, and the other was
built without them. To examine the seismic capacity of the models,
particularly under long-period, large-amplitude ground motion with
many cycles of repeated loading, each specimen was shaken
repeatedly until failure. The test results from a high-end data
acquisition system show that the model with EQRF behaves better
than the one without. This modified masonry model, combining the
new material with the new bands, can be used to improve the
behavior of masonry buildings.
Abstract: As the web continues to grow exponentially, crawling the
entire web on a regular basis becomes less and less feasible, so
domain-specific search engines, which index information on a
particular domain, have been proposed. As more information becomes
available on the World Wide Web, it becomes more difficult to
provide effective search tools for information access. Today,
people access web information through two main kinds of search
interfaces: browsers (clicking and following hyperlinks) and query
engines (queries in the form of a set of keywords indicating the
topic of interest) [2]. Better support is needed by web search
tools for expressing one's information need and returning
high-quality search results. There appears to be a need for systems
that reason under uncertainty and are flexible enough to recover
from the contradictions, inconsistencies, and irregularities that
such reasoning involves. In a multi-view problem, the features of
the domain can be partitioned into disjoint subsets (views), each
of which is sufficient to learn the target concept.
Semi-supervised, multi-view algorithms, which reduce the amount of
labeled data required for learning, rely on the assumptions that
the views are compatible and uncorrelated. This paper describes the
use of a semi-supervised machine learning approach with active
learning for domain-specific search engines. A domain-specific
search engine is "an information access system that allows access
to all the information on the web that is relevant to a particular
domain." The proposed work shows that, with the help of this
approach, relevant data can be extracted with a minimum of queries
fired by the user. It requires a small number of labeled data and a
pool of unlabeled data to which the learning algorithm is applied
to extract the required data.
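The semi-supervised, multi-view idea above can be sketched as co-training: each view trains a simple classifier on the labeled pool and confidently labels points for the other. The 1-D views and nearest-centroid classifiers below are illustrative inventions, not the paper's system:

```python
# Co-training with two views and nearest-centroid classifiers.
from statistics import mean

def centroid_predict(vals, labels, x):
    """Nearest-centroid prediction for one view, plus a confidence margin."""
    c0 = mean(v for v, y in zip(vals, labels) if y == 0)
    c1 = mean(v for v, y in zip(vals, labels) if y == 1)
    d0, d1 = abs(x - c0), abs(x - c1)
    return (0 if d0 < d1 else 1), abs(d0 - d1)  # bigger margin = surer

def co_train(view_a, view_b, labels, rounds=10):
    """labels holds 0/1 for labeled points and None for unlabeled ones."""
    for _ in range(rounds):
        for view in (view_a, view_b):
            known = [i for i, y in enumerate(labels) if y is not None]
            unknown = [i for i, y in enumerate(labels) if y is None]
            if not unknown:
                return labels
            vals = [view[i] for i in known]
            lab = [labels[i] for i in known]
            # Each view labels the unlabeled point it is most confident on.
            best = max(unknown,
                       key=lambda i: centroid_predict(vals, lab, view[i])[1])
            labels[best] = centroid_predict(vals, lab, view[best])[0]
    return labels

view_a = [0.1, 0.2, 0.9, 1.0, 0.15, 0.95]  # view 1 of six documents
view_b = [0.0, 0.1, 1.0, 0.9, 0.05, 0.85]  # view 2 of the same documents
labels = co_train(view_a, view_b, [0, None, 1, None, None, None])
```

Only two labels are supplied; the remaining four are inferred from the pool of unlabeled data, which is the labeling economy the abstract claims.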
Abstract: A house is one of the essential requirements of human
beings. To make contemporary houses satisfying places for their
residents, attention must be paid to the residents' culture. This
article reviews the relevant theoretical literature on cultural
symbols, using architectural semiotics, to help construct houses as
better places for living. Indeed, making a place for everyday life
by turning a house into a home is one of the most challenging
subjects for architects all around the world. The aim of this
article is to identify the cultural symbols of Cypriot houses that
can assist architects in designing and building contemporary houses
that better satisfy their residents according to the Cypriot
lifestyle and culture. Researching the effect of cultural symbols
on housing would require various methods; however, this study
focuses on two, quantitative and qualitative. The purpose of the
case-specific study is to find the symbols used in contemporary
houses, with attention to Cypriot cultural symbols in Famagusta
houses.
Abstract: Compared to Western cultures, women who smoke in Korea are not tolerated; Korean people are prejudiced against women smoking. In spite of the relative prevalence of sexual equality in South Korea, women too often feel obliged to confine their smoking to only a few public spaces, such as designated smoking rooms, coffee shops, or pubs. Korean Confucianism classifies people according to gender and social status. In Confucian culture, cigarettes convey clear social meanings, reinforcing status, age, and gender beyond personal preference. For these reasons, the significance of smoking in Korea varies according to gender. This study determines the reasons for the ongoing sexual discrimination against female Korean smokers through an analysis of Korean films, since film is a medium that reflects social phenomena. Roland Barthes' Mythology Theory is used to analyze the films.
Abstract: Various methods based on regression ideas have been created to handle data sets containing censored observations, e.g. the Buckley-James method, Miller's method, the Cox method, and the Koul-Susarla-Van Ryzin estimators. Even though comparison studies show that the Buckley-James method performs better than some other methods, it is still rarely used by researchers, mainly because of the limited diagnostic analysis developed for it thus far. Therefore, a diagnostic tool for the Buckley-James method is proposed in this paper. It is called the renovated Cook's distance, RD*_i, and has been developed based on Cook's idea. The renovated Cook's distance RD*_i has advantages (depending on the analyst's demand) over (i) the change in the fitted value for a single case, DFIT*_i, as it measures the influence of case i on all n fitted values Ŷ* (not just the fitted value for case i, as DFIT*_i does), and (ii) the change in the coefficient estimate when the ith case is deleted, DBETA*_i, since DBETA*_i corresponds to the number of variables p, so it is usually easier to look at a diagnostic measure such as RD*_i, in which information from the p variables is considered simultaneously. Finally, an example using the Stanford Heart Transplant data is provided to illustrate the proposed diagnostic tool.
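The renovated RD*_i for censored data cannot be reconstructed from the abstract alone, but the classical Cook's distance it renovates can be sketched for ordinary (uncensored) simple regression. The data below are invented, with one high-leverage outlying case:

```python
# Classical Cook's distance for simple (one-predictor) least squares.
def cooks_distance(xs, ys):
    n = len(xs)
    xbar = sum(xs) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    b1 = sum((x - xbar) * y for x, y in zip(xs, ys)) / sxx  # slope
    b0 = sum(ys) / n - b1 * xbar                            # intercept
    resid = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
    p = 2                                    # parameters: intercept + slope
    mse = sum(e * e for e in resid) / (n - p)
    ds = []
    for x, e in zip(xs, resid):
        h = 1.0 / n + (x - xbar) ** 2 / sxx  # leverage of case i
        ds.append(e * e / (p * mse) * h / (1 - h) ** 2)
    return ds

xs = [1, 2, 3, 4, 10]           # the last case has high leverage...
ys = [1.1, 1.9, 3.2, 4.1, 2.0]  # ...and does not follow the trend
d = cooks_distance(xs, ys)
```

As with RD*_i, one number per case summarizes influence on all n fitted values at once, rather than one number per coefficient as with DBETA-type measures.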
Abstract: This study investigated the hydrophilic-to-hydrophobic
transition of polyacrylamide hydrogel modified by the inclusion of
N-isopropylacrylamide (NIPAM). The modification was carried out by
mimicking micellar polymerization, which resulted in a better
arrangement of NIPAM chains in the polyacrylamide network. The
degree of NIPAM arrangement is described by the NH number. The
hydrophilic-to-hydrophobic transition was measured through the
partition coefficient, K, of Orange II and Methylene Blue between
hydrogel and water. These dyes were chosen as models for solutes
with different degrees of hydrophobicity. The study showed that
hydrogels with higher NH values gave better solubility of both
dyes. Moreover, temperatures above the lower critical solution
temperature (LCST) of poly(N-isopropylacrylamide) (PNIPAM) caused
the NIPAM chains to collapse, producing a more hydrophobic
environment that increases the solubility of Methylene Blue and
decreases the solubility of Orange II in the NIPAM-containing
hydrogels.
Abstract: The aim of this paper is to study in depth some
methodological aspects of social intervention, focusing on the
desirable passage from the social maternage method to the peer
advocacy method. For this purpose, we intend to analyze the social
and organizational components that affect the operator's
professional action and that are part of his psychological
environment, besides the physical and social ones. In fact, the
operator's intervention should neither be limited to a pure supply
of techniques nor take shape as improvised action merely "full of
good purposes".
Abstract: The objective of this research is to explore the role of actors at the local level in managing the pre-hospital Emergency Medical Service (EMS) system in Thailand. The research was conducted through documentary research, individual interviews, and one forum held in each province. This paper uses the cases of three provinces located in three regions of Thailand: Ubon Ratchathani (North-eastern region), Lampang (Northern region), and Songkhla (Southern region). The results show that the role of local government as the service provider for local people has recently received increasing attention. The key success factors of the EMS system include: (i) the vision and influence of the local executives of both the Provincial Administration Organisation (PAO) and the Tambon Administration Organisation (TAO), whose decisions are vital to addressing the overall challenges in EMS development; (ii) an administrative system whose reformed working style creates flexibility in running the EMS task; (iii) network-based management among different agencies at the local level, which leads to better EMS practices; and (iv) human resource development, which is vital to delivering effective services.
Abstract: In the study of honeycomb crushing under quasi-static loading, two parameters are important: the mean crushing stress and the wavelength of the folding mode. Previous theoretical models did not consider the true cylindrical curvature effects and the flow stress in the folding mode of the honeycomb material. The present paper introduces a modification of Wierzbicki's model that accounts for these two parameters in estimating the mean crushing stress and the wavelength through the energy method. Comparison of the results obtained by the new model and by Wierzbicki's model with existing experimental data shows better prediction by the model presented in this paper.
Abstract: Computerized lip reading has been one of the most
actively researched areas of computer vision in the recent past
because of its crime-fighting potential and its invariance to the
acoustic environment. However, several factors such as fast speech,
bad pronunciation, poor illumination, movement of the face,
moustaches, and beards make lip reading difficult. In the present
work, we propose a solution for automatically tracking the lip
contour and recognizing letters of the English language spoken by
speakers, using only the information available from lip movements.
The level set method is used for tracking the lip contour with a
contour velocity model, and a feature vector of lip movements is
then obtained. Character recognition is performed using a modified
k-nearest-neighbor algorithm that assigns more weight to nearer
neighbors. The proposed system achieves an accuracy of 73.3% for
character recognition with the speaker's lip movements as the only
input and without any speech recognition system running in
parallel. The approach used in this work is found to serve the
purpose of lip reading well when the database is small.
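The modified k-NN rule described above, with nearer neighbors weighted more heavily, can be sketched as follows; the toy 2-D vectors stand in for the actual lip-movement feature vectors:

```python
# Distance-weighted k-nearest-neighbor classification.
import math
from collections import defaultdict

def weighted_knn(train, query, k=3):
    """train: list of (feature_vector, label); nearer neighbors weigh more."""
    nearest = sorted((math.dist(v, query), y) for v, y in train)[:k]
    votes = defaultdict(float)
    for dist, label in nearest:
        votes[label] += 1.0 / (dist + 1e-9)  # inverse-distance weight
    return max(votes, key=votes.get)

train = [((0.0, 0.0), 'a'), ((0.1, 0.1), 'a'), ((0.2, 0.0), 'a'),
         ((1.0, 1.0), 'b'), ((0.9, 1.1), 'b')]
label = weighted_knn(train, (0.15, 0.05), k=3)
```

The inverse-distance weighting here is one common choice; the paper's exact weighting scheme is not specified in the abstract.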
Abstract: The setting up of rural telecentres, popularly referred
to as Common Service Centres (CSCs), is considered one of the
initial forerunners of rural e-Governance initiatives under the
Government of India's National e-Governance Plan (NeGP). CSCs are
implemented through public-private partnership (PPP), where state
governments play a major role in facilitating the establishment of
CSCs and investments are made by private companies referred to as
Service Centre Agencies (SCAs). CSC implementation is expected to
help improve public service delivery in a transparent and efficient
manner. However, very little research has been undertaken to study
the actual impact of CSC implementation at the grassroots level.
This paper addresses that gap by identifying the circumstances,
concerns, and expectations from the citizens' point of view and by
examining the finer aspects of social processes in the context of
rural e-Governance.
Abstract: Solid waste can be considered either an urban burden or a
valuable resource, depending on how it is managed. To meet the
rising demand for energy and to address environmental concerns, a
conversion from conventional energy systems to renewable resources
is essential. For the sustainability of human civilization, an
environmentally sound and techno-economically feasible method of
treating recyclable waste is very important. Several technologies
are available for realizing the potential of solid waste as an
energy source, ranging from very simple systems for disposing of
dry waste to more complex technologies capable of dealing with
large amounts of industrial waste. There are three main pathways
for converting waste material to energy: thermochemical,
biochemical, and physicochemical. This paper investigates the
thermochemical conversion of solid waste for energy recovery. The
processes, advantages, and disadvantages of various thermochemical
conversion processes are discussed and compared. Special attention
is given to gasification, as it provides better solutions regarding
public acceptance, feedstock flexibility, near-zero emissions,
efficiency, and security. Finally, this paper presents comparative
statements of the thermochemical processes and introduces an
integrated waste management system.
Abstract: A computational platform is presented in this
contribution. It has been designed as a virtual laboratory for
exploring optimization algorithms in biological problems. The
platform is built on a blackboard-based agent architecture. As a
test case, the version of the platform presented here is devoted to
the study of protein folding, initially with a bead-like
description of the chain and with the widely used model of
hydrophobic and polar residues (the HP model). Some details of the
platform design are presented, along with its capabilities, and
some explorations of the protein folding problem in different types
of discrete space are reviewed. The platform's ability to
incorporate specific tools for the structural analysis of the runs,
in order to understand and improve the optimization process, is
also shown. The results obtained demonstrate that assembling these
computational tools into a single platform is worthwhile in itself,
since experiments developed on it can be designed to fulfill
different levels of information in a self-consistent fashion. We
are currently exploring how an experiment design can be used to
create a computational agent to be included within the platform.
Such designed agents, or software pieces, help the platform
accomplish its tasks better; as the number of agents increases, the
virtual laboratory gains robustness and functionality.
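The HP model used as the platform's test case has a simple energy function: each non-bonded lattice contact between two hydrophobic (H) residues contributes -1. A minimal 2-D sketch, with an invented sequence and fold:

```python
# Energy of a 2-D lattice HP-model conformation.
MOVES = {'U': (0, 1), 'D': (0, -1), 'L': (-1, 0), 'R': (1, 0)}

def hp_energy(sequence, fold):
    """sequence: 'H'/'P' string; fold: moves, one per bond in the chain."""
    coords = [(0, 0)]
    for m in fold:                       # walk the chain on the lattice
        dx, dy = MOVES[m]
        x, y = coords[-1]
        coords.append((x + dx, y + dy))
    if len(set(coords)) != len(coords):
        raise ValueError("fold is not self-avoiding")
    pos = {c: i for i, c in enumerate(coords)}
    energy = 0
    for i, (x, y) in enumerate(coords):
        if sequence[i] != 'H':
            continue
        for dx, dy in MOVES.values():
            j = pos.get((x + dx, y + dy))
            # Count each H-H contact once and skip bonded chain neighbors.
            if j is not None and j > i + 1 and sequence[j] == 'H':
                energy -= 1
    return energy

energy = hp_energy('HPHH', 'RUL')  # fold the 4-residue chain into a square
```

An optimization agent on such a platform would search the space of self-avoiding folds for the conformation of minimum energy.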
Abstract: There are two common types of operational research techniques: optimisation and metaheuristic methods. The latter may be defined as sequential processes that intelligently perform exploration and exploitation, drawing strong inspiration from natural intelligence, to form several iterative searches; the aim is to determine near-optimal solutions in a solution space effectively. In this work, a metaheuristic called Ant Colony Optimisation (ACO), inspired by the foraging behaviour of ants, was adapted to find optimal solutions of eight non-linear continuous mathematical models. Over a solution space restricted to a specified region of each model, sub-solutions may contain the global optimum or multiple local optima. Moreover, the algorithm has several common parameters, namely the number of ants, moves, and iterations, which act as the algorithm's drivers. A series of computational experiments for initialising these parameters was conducted using the Rigid Simplex (RS) and Modified Simplex (MSM) methods. Experimental results were analysed in terms of the best-so-far solutions, mean, and standard deviation. Finally, a recommendation of proper level settings of the ACO parameters for all eight functions is given. These parameter settings can be applied as a guideline for future uses of ACO, so as to promote ease of use of ACO in real industrial processes. It was found that the results obtained from MSM were very similar to those gained from RS. However, when results with noise standard deviations of 1 and 3 are compared, MSM reaches optimal solutions more efficiently than RS in terms of speed of convergence.
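ACO adapted to a continuous function can be sketched as archive-guided Gaussian sampling (in the spirit of continuous-domain ACO variants such as ACO_R); the parameter values and test function below are illustrative, not the paper's eight models:

```python
# Continuous ACO sketch: an archive of good solutions guides the ants.
import random

def aco_minimise(f, lo, hi, ants=10, archive=10, iters=200, seed=1):
    """Minimise f on [lo, hi] by sampling ants around archived solutions."""
    rng = random.Random(seed)
    sols = sorted((rng.uniform(lo, hi) for _ in range(archive)), key=f)
    for _ in range(iters):
        trials = []
        for _ in range(ants):
            guide = rng.choice(sols[:archive // 2])  # bias toward best half
            # Spread proportional to the archive's dispersion around guide.
            sigma = sum(abs(s - guide) for s in sols) / (len(sols) - 1)
            trials.append(min(hi, max(lo, rng.gauss(guide, sigma + 1e-12))))
        sols = sorted(sols + trials, key=f)[:archive]  # keep the best
    return sols[0]

best = aco_minimise(lambda x: (x - 2.0) ** 2 + 1.0, -10.0, 10.0)
```

The number of ants, the archive size, and the iteration count play the role of the driver parameters the abstract tunes via RS and MSM.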
Abstract: In this paper, two centrifuge model tests (case 1: a raft
foundation; case 2: a 2x2 piled raft foundation) were conducted in
order to evaluate the effect of ground subsidence on the load
sharing between piles and raft and on the settlement of raft and
piled raft foundations. For each case, two conditions were
considered: undrained (without groundwater pumping) and drained
(with groundwater pumping). Vertical loads were applied to the
models after the foundations were completely consolidated under
self-weight at 50g. The results show that the load shared by the
piles in the piled raft foundation (pile load share) decreases
faster under the drained condition than under the undrained
condition. The settlement of both the raft and the piled raft
foundations increases more quickly under the drained condition than
under the undrained condition. In addition, under the drained
condition the settlement of the raft foundation increases more than
that of the piled raft foundation.
Abstract: A novel file splitting technique for reducing the nth-order entropy of text files is proposed. The technique is based on mapping the original text file into a non-ASCII binary file using a new codeword assignment method; the resulting binary file is then split into several subfiles, each containing one or more bits from each codeword of the mapped binary file. The statistical properties of the subfiles are studied, and it is found that they reflect the statistical properties of the original text file, which is not the case when the ASCII code is used as the mapper. The nth-order entropy of these subfiles is determined, and the sum of their entropies is found to be less than the entropy of the original text file for the same extension values. These interesting statistical properties of the resulting subfiles can be used to achieve better compression ratios when conventional compression techniques are applied to the subfiles individually, on a bit-wise rather than a character-wise basis.
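The splitting idea can be sketched as follows: map each character to a fixed-width codeword (a simple 5-bit enumeration here, standing in for the paper's codeword assignment method), collect bit position k of every codeword into subfile k, and compare first-order entropies:

```python
# Bit-position file splitting and first-order entropy measurement.
import math
from collections import Counter

def entropy(symbols):
    """First-order (per-symbol) Shannon entropy in bits."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def split_bits(text, width=5):
    """Map each character to a fixed-width codeword; subfile k keeps bit k."""
    alphabet = sorted(set(text))
    code = {ch: format(i, f'0{width}b') for i, ch in enumerate(alphabet)}
    return [''.join(code[ch][k] for ch in text) for k in range(width)]

text = "this is a simple illustrative text sample"
subfiles = split_bits(text)
h_orig = entropy(text)
h_subs = [entropy(s) for s in subfiles]
```

Each subfile is a binary string, so its first-order entropy is at most 1 bit per symbol; the paper's nth-order comparison across extension values is not reproduced here.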
Abstract: Pattern matching based on regular tree grammars has been widely used in many areas of computer science. In this paper, we propose a pattern matcher within the framework of code generation, based on a generic and formalized approach. According to this approach, parsers for regular tree grammars are adapted to a general pattern matching solution, rather than adapting the pattern matching to their parsing behavior. Hence, we first formalize the construction of the pattern matches for input trees drawn from a regular tree grammar, in the form of so-called match trees. Then, we adopt a recently developed generic parser and tightly couple its parsing behavior with this construction. In addition to its generality, the resulting pattern matcher is characterized by its soundness and efficient implementation, as demonstrated by the proposed theory and by the algorithms derived for its implementation. A comparison with similar and well-known approaches, such as those based on tree automata and LR parsers, shows that our pattern matcher can be applied to a broader class of grammars and achieves a better approximation of pattern matches in one pass. Furthermore, its use as a machine code selector incurs minimal overhead, owing to the balanced distribution of the cost computations between static ones, during parser generation, and dynamic ones, during parsing.
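The grammar-driven match-tree construction is beyond a short sketch, but the underlying notion of matching tree patterns against the subtrees of an expression (as a code selector does) can be illustrated; the tuple-encoded trees and the '_' wildcard are conventions invented here:

```python
# Naive tree pattern matching over tuple-encoded expression trees.
def matches(pattern, tree):
    if pattern == '_':                     # wildcard matches any subtree
        return True
    if isinstance(pattern, tuple) and isinstance(tree, tuple):
        return (len(pattern) == len(tree)
                and pattern[0] == tree[0]  # operator labels must agree
                and all(matches(p, t)
                        for p, t in zip(pattern[1:], tree[1:])))
    return pattern == tree                 # leaf symbols must be identical

def all_matches(pattern, tree):
    """Collect every subtree of `tree` that the pattern matches."""
    found = [tree] if matches(pattern, tree) else []
    if isinstance(tree, tuple):
        for child in tree[1:]:
            found.extend(all_matches(pattern, child))
    return found

expr = ('add', ('mul', 'a', 'b'), ('mul', ('add', 'c', 'd'), 'e'))
hits = all_matches(('mul', '_', '_'), expr)
```

A grammar-based matcher like the one proposed would find all such matches in a single pass instead of re-walking the tree for every pattern, which is the efficiency the abstract claims.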
Abstract: This paper presents an overview of the design and
implementation of an online rule-based expert system for Islamic
medication. This Online Islamic Medication Expert System (OIMES)
focuses on physical illnesses only. The knowledge base of the
expert system exhaustively covers the types of illness together
with their related cures or treatments/therapies, obtained
exclusively from the Quran and Hadith. Extensive research and study
were conducted to ensure that the expert system is able to provide
the most suitable treatment with reference to the relevant verses
cited in the Quran or Hadith. These verses come together with their
related 'actions' (bodily actions/gestures or other acts) to be
performed by the patient to treat a particular illness/sickness.
The verses and the instructions for the 'actions' are displayed
unambiguously on the computer screen. The online platform offers
the patient the advantage of getting treatment practically anytime
and anywhere, as long as a computer and an Internet connection are
available. The patient does not need to make an appointment to see
an expert for a therapy.
Abstract: This paper proposes a new performance characterization for a test strategy intended for second-order filters, called the Transient Analysis Method (TRAM). We evaluate the ability of this test strategy to detect deviation faults under simultaneous statistical fluctuation of the non-faulty parameters. For this purpose, we use Monte Carlo simulations and a fault model that considers as faulty only one component of the filter under test, while the other components adopt random values (within their tolerance bands) drawn from their statistical distributions. The new data reported here show, for the filters under study, the presence of hard-to-test components and relatively low fault coverage values for small deviation faults. These results suggest that the fault coverage value obtained using only nominal values for the non-faulty components (the traditional evaluation of TRAM) is a poor predictor of the test performance.
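The style of Monte Carlo evaluation described can be sketched under illustrative assumptions: a second-order RC section summarized by its natural frequency f0 (standing in for TRAM's transient-response measurements), 5% component tolerances, and an 8% acceptance band, none of which are the paper's actual figures:

```python
# Monte Carlo fault coverage with fluctuating non-faulty components.
import math, random

def natural_freq(r1, r2, c1, c2):
    """Natural frequency f0 (Hz) of a second-order RC section."""
    return 1.0 / (2 * math.pi * math.sqrt(r1 * r2 * c1 * c2))

def fault_coverage(deviation, tol=0.05, trials=2000, seed=7):
    """Fraction of Monte Carlo runs in which the deviated R1 is detected."""
    rng = random.Random(seed)
    r_nom, c_nom = 10e3, 10e-9
    f_nom = natural_freq(r_nom, r_nom, c_nom, c_nom)
    band = 0.08                        # accept f0 within +/-8% of nominal
    detected = 0
    for _ in range(trials):
        # Non-faulty components fluctuate within their tolerance bands.
        r2 = r_nom * (1 + rng.uniform(-tol, tol))
        c1 = c_nom * (1 + rng.uniform(-tol, tol))
        c2 = c_nom * (1 + rng.uniform(-tol, tol))
        r1 = r_nom * (1 + deviation)   # the single faulty component
        if abs(natural_freq(r1, r2, c1, c2) - f_nom) / f_nom > band:
            detected += 1
    return detected / trials

small = fault_coverage(0.10)  # 10% deviation fault
large = fault_coverage(0.50)  # 50% deviation fault
```

Even in this toy setting, the small deviation fault is masked in most runs by the tolerance fluctuations of the non-faulty components, mirroring the low fault coverage the abstract reports for small deviations.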