Abstract: This paper investigates the effect of International
Financial Reporting Standards (IFRS) adoption on the frequency of
earnings management towards small positive profits. We focus on
two emerging-market IFRS adopters: South Africa and Turkey.
We tested our logistic regression using appropriate panel estimation
techniques over a sample of 330 South African and 210
Turkish firm-year observations over the period 2002-2008. Our
results document that mandatory adoption of IFRS is not associated
with a reduction in earnings management towards small positive
profits in emerging markets. These results contradict most of the
previous findings of the studies conducted in developed countries.
Based on the legal system factor, we compare the intensity of
earnings management between a code law country (Turkey) and a
common law country (South Africa) over the pre- and post-adoption
periods. Our findings show that the frequency of such earnings
management practice increases significantly for the code law
country.
Abstract: A large amount of valuable information is available in
plain text clinical reports. New techniques and technologies are
applied to extract information from these reports. In this study, we
developed a domain-based software system to transform 600
Otorhinolaryngology discharge notes to a structured form for
extracting clinical data from the discharge notes. In order to decrease
the system processing time, the discharge notes were transformed into a
data table after preprocessing. Several word lists were constituted to
identify the common sections in the discharge notes, including patient
history, age, problems, and diagnosis. The n-gram method was used
to discover term co-occurrences within each section. Using this
method, a dataset of candidate concepts was generated for the
validation step, and the Predictive Apriori algorithm for Association
Rule Mining (ARM) was then applied to validate the candidate concepts.
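The n-gram step above can be sketched as follows; the function names and the minimum-count threshold are illustrative assumptions, not the paper's implementation, and the Predictive Apriori validation step is omitted:

```python
from collections import Counter

def ngrams(tokens, n):
    """Return the list of n-grams (as tuples) over a token sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def candidate_terms(sections, n=2, min_count=2):
    """Collect n-grams that co-occur at least min_count times across the
    texts of one section type, as candidate clinical concepts for the
    later validation step."""
    counts = Counter()
    for text in sections:
        counts.update(ngrams(text.lower().split(), n))
    return {g for g, c in counts.items() if c >= min_count}
```

For example, two history sections both containing "chronic otitis media" yield the candidate bigrams ("chronic", "otitis") and ("otitis", "media").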
Abstract: Traditionally, VLSI implementations of spiking
neural nets have featured large neuron counts for fixed computations
or small exploratory, configurable nets. This paper presents the
system architecture of a large configurable neural net system
employing a dedicated mapping algorithm for projecting the targeted
biology-analog nets and dynamics onto the hardware with its
attendant constraints.
Abstract: The aim of this paper is to identify the sociodemographic
and operational-financial determinants of the service
quality perceived by users of the national health services. Using
a survey conducted by the Ministry of Health,
comprising 16,936 interviews in 2006, we intend to find out whether
any characteristic determines the 2006 survey results.
Through a review of the literature, we also want to know whether the
operational-financial results have implications for hospital users'
perception of the quality of the services received. In order to achieve
our main goals, we make use of regression analysis to find
the possible dimensions that determine those results.
Abstract: The paper presents an investigation into the effect of neural network predictive control of a UPFC on the transient stability performance of a multimachine power system. The proposed controller consists of a neural network model of the test system. This model is used to predict the future control inputs using the damped Gauss-Newton method, which employs ‘backtracking’ as the line search method for step selection. The benchmark two-area, four-machine system that mimics the behavior of large power systems is taken as the test system for the study and is subjected to three-phase short circuit faults at different locations over a wide range of operating conditions. The simulation results clearly establish the robustness of the proposed controller to the fault location, an increase in the critical clearing time for the circuit breakers, and an improved damping of the power oscillations as compared to the conventional PI controller.
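The damped Gauss-Newton step with a backtracking line search can be illustrated on a scalar least-squares problem; this is a minimal sketch of the step-selection idea only, not the paper's neural predictive controller:

```python
def damped_gauss_newton(residual, jacobian, x0, iters=20):
    """Damped Gauss-Newton for a scalar parameter: take the Gauss-Newton
    direction, then backtrack (halve the step) until the sum of squared
    residuals does not increase."""
    x = x0
    for _ in range(iters):
        r = residual(x)
        j = jacobian(x)
        g = sum(ji * ri for ji, ri in zip(j, r))   # gradient J^T r
        h = sum(ji * ji for ji in j) or 1e-12      # Gauss-Newton Hessian J^T J
        step = -g / h                              # Gauss-Newton direction
        f0 = sum(ri * ri for ri in r)
        t = 1.0
        while sum(ri * ri for ri in residual(x + t * step)) > f0 and t > 1e-8:
            t *= 0.5                               # backtracking line search
        x = x + t * step
    return x
```

For example, fitting the slope a of y = a·x to samples generated with a = 3 converges to a = 3 in a single step, since the model is linear in the parameter.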
Abstract: Recently, grid computing has attracted wide attention in
the science, industry, and business fields, which require a vast
amount of computing. Grid computing provides an environment in
which many nodes (i.e., many computers) are connected with each
other through a local/global network and made available to many
users. In this environment, to achieve data processing among nodes
for any application, each node executes mutual authentication by
using certificates issued by the Certificate Authority
(CA, for short). However, if a failure or fault occurs in the
CA, no new certificates can be issued by the CA. As
a result, a new node cannot participate in the grid environment.
In this paper, an off-the-shelf scheme for dependable grid systems
using virtualization techniques is proposed and its implementation is
verified. The proposed approach uses virtualization techniques
to restart an application, e.g., the CA, if it has failed. The system
can thus tolerate a failure or fault that occurs in the CA. Since
the proposed scheme is easily implemented at the application level,
its implementation cost for the system builder is low
compared with other methods. Simulation results show that the
CA in the system can recover from its failure or fault.
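The restart idea can be sketched as a simple application-level watchdog; the class name and callbacks are illustrative assumptions, and the virtualization layer the paper relies on is not modeled here:

```python
class ServiceWatchdog:
    """Minimal sketch of the recovery idea: if the monitored service
    (e.g., the CA) fails a health check, restart it. The paper restarts
    the application inside a virtualized environment; here the restart
    action is just an injected callable."""
    def __init__(self, is_alive, restart):
        self.is_alive = is_alive    # callable: health check for the service
        self.restart = restart      # callable: relaunch the failed service
        self.restarts = 0

    def poll(self):
        """One monitoring cycle; returns True if a recovery was taken."""
        if not self.is_alive():
            self.restart()
            self.restarts += 1
            return True
        return False
```

In a deployment, `poll()` would run periodically; a failed CA triggers exactly one restart, after which subsequent polls see a healthy service.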
Abstract: The curriculum of the primary school science course was redesigned on the basis of constructivism in the 2005-2006 academic year in Turkey. In this context, the name of this course was changed to “Science and Technology”, and the content, course books, and student workbooks for this course were redesigned in light of constructivism. The aim of this study is to determine whether the Science and Technology course books and student workbooks for the 5th grade of primary school are appropriate for constructivism by evaluating them in terms of its fundamental principles. In this study, the documentation technique (i.e., document analysis), a qualitative research method, is applied; for sample selection, criterion sampling, a purposeful sampling technique, is used. When the Science and Technology course book and workbook for the 5th grade in primary education are examined, it is seen that both books complement each other in certain areas. Consequently, it can be claimed that, in spite of some inadequate and missing points in the course book and workbook of the primary school Science and Technology course for 5th grade students, an attempt has been made to design these books in terms of the principles of constructivism. To overcome the inadequacies in the books, redesigning them can be suggested. In addition, so as not to ignore the technology dimension of the course, activities that encourage the students to prepare projects using the technology cycle should be included.
Abstract: This paper reports our analysis of 163 ks observations
of PSR J0538+2817 with the Rossi X-Ray Timing Explorer
(RXTE). The pulse profiles, detected up to 60 keV, show a single
peak, as in the radio band. The profile is well described
by a single Gaussian function with a full width at half maximum (FWHM)
of 0.04794. We compared the difference in arrival time between the radio
and X-ray pulse profiles for the first time. It turns out that the
radio emission precedes the X-ray emission by 8.7 ± 4.5 ms. Furthermore, we
obtained the pulse profiles in the energy ranges of 2.29-6.18 keV,
6.18-12.63 keV and 12.63-17.36 keV. The intensity of the pulses
decreases with increasing energy. We discuss the emission
geometry in our work.
Abstract: This paper proposes a re-modification of
the minimum moment approach to resource leveling, which is itself a modified minimum moment approach to the traditional method by
Harris. The method is based on the critical path method. The new approach differs from the earlier methods in the
selection criterion for the activity that needs to be shifted when leveling the resource histogram. In the traditional method, the improvement factor
is found first to select the activity for each possible day of shifting. In
the modified method, the maximum value of the product of resource rate
and free float is found first, and the improvement factor is then
calculated for the activity that needs to be shifted. In the proposed
method, the activity to be selected first for shifting is chosen on the basis of the largest value of resource rate. The process is repeated for all the
remaining activities for possible shifting to get the updated histogram.
The proposed method significantly reduces the number of iterations
and is easier for manual computations.
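A minimal sketch of the proposed selection rule, assuming activities are given as (start, duration, resource rate) triples and shifts are limited to each activity's free float; the moment is taken as the sum of squared daily resource demands, as in minimum moment methods:

```python
def histogram(acts, horizon):
    """Daily resource demand for activities given as (start, dur, rate)."""
    h = [0] * horizon
    for start, dur, rate in acts:
        for day in range(start, start + dur):
            h[day] += rate
    return h

def moment(h):
    """Sum of squared daily demands; leveling seeks to minimize this."""
    return sum(x * x for x in h)

def level(acts, free_floats, horizon):
    """Sketch of the proposed rule: consider activities in descending
    order of resource rate and shift each within its free float to the
    start time that minimizes the histogram moment."""
    acts = list(acts)
    for i in sorted(range(len(acts)), key=lambda i: -acts[i][2]):
        s, d, r = acts[i]
        best_s, best_m = s, None
        for shift in range(free_floats[i] + 1):
            acts[i] = (s + shift, d, r)       # trial shift
            m = moment(histogram(acts, horizon))
            if best_m is None or m < best_m:
                best_s, best_m = s + shift, m
        acts[i] = (best_s, d, r)              # keep the best position
    return acts
```

For example, two activities of rate 3, one critical (zero float) and one with two days of float, level to a flat histogram of 3 units per day.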
Abstract: In this paper, a new concept of the partial complement of a graph G is introduced and, using it, a new graph parameter, called the completion number of a graph G and denoted by c(G), is defined. Some basic properties of the completion number are studied, and upper bounds on the completion number for several classes of graphs are obtained; a characterization is also included.
Abstract: Water vapour transport properties of gypsum block
are studied as a function of relative humidity using inverse analysis
based on a genetic algorithm. The computational inverse analysis is
performed for the relative humidity profiles measured along the
longitudinal axis of a rod sample. Within the performed transient
experiment, the studied sample is exposed to two environments with
different relative humidity, whereas the temperature is kept constant.
For the basic gypsum characterisation and for the assessment of input
material parameters necessary for computational application of
genetic algorithm, the basic material properties of gypsum are
measured as well as its thermal and water vapour storage parameters.
By applying the genetic algorithm, the relative
humidity-dependent water vapour diffusion coefficient and the water
vapour diffusion resistance factor are calculated.
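The inverse step can be illustrated with a toy genetic algorithm that recovers a single transport parameter from a measured profile; the real analysis fits a relative-humidity-dependent diffusion coefficient to measured humidity profiles, so everything below (population size, operators, the linear model in the example) is an illustrative assumption:

```python
import random

def fit_parameter(model, measured, lo, hi, pop=30, gens=60, seed=1):
    """Toy genetic algorithm for the inverse step: search a scalar
    parameter p so that model(p) reproduces the measured profile under a
    sum-of-squares misfit."""
    rng = random.Random(seed)
    def misfit(p):
        return sum((m - y) ** 2 for m, y in zip(measured, model(p)))
    population = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=misfit)          # selection: keep the fitter half
        parents = population[: pop // 2]
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)    # crossover: blend two parents
            child = 0.5 * (a + b) + rng.gauss(0.0, 0.05 * (hi - lo))  # mutation
            children.append(min(hi, max(lo, child)))
        population = parents + children
    return min(population, key=misfit)
```

With a linear toy model y = p·x and data generated at p = 2, the search recovers a value close to 2.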
Abstract: Ireland developed a National Strategy 2030 that
argued for the creation of a new form of higher education institution,
a Technological University. The research reported here reviews the
first stage of this partnership development. The study found that
national policy can create system capacity and change, but that
individual partners may have more to gain or lose in collaborating.
When collaboration is presented as a zero-sum activity, fear among partners is high.
The level of knowledge and networking within the higher education
system possessed by each partner contributed to decisions to
participate or not in a joint proposal for collaboration. Greater
success resulted when there were gains for all partners. This research
concludes that policy mandates can provide motivation to
collaborate, but that the partnership needs to be built on shared
values rather than on coercion by mandates.
Abstract: The ARMrayan Multimedia Mobile CMS (Content
Management System) is the first mobile CMS that gives
users the opportunity to create multimedia J2ME mobile
applications with their desired content, design, and logo, simply and
without any need to write even a line of code. The low-level
programming and compatibility problems of J2ME, along with
UI design difficulties, make it hard for most people, even
programmers, to broadcast their content to the widespread mobile
phones used by nearly all people. This system provides user-friendly,
PC-based tools for creating a tree index of pages and inserting
multiple multimedia contents (e.g. sound, video and picture) in each
page for creating a J2ME mobile application. The output is a standalone
Java mobile application that has a user interface, shows texts
and pictures and plays music and videos regardless of the type of
devices used as long as the devices support the J2ME platform.
Bitmap fonts have also been used, so Middle Eastern languages can
be easily supported on all mobile phone devices. We omitted
programming concepts for users in order to simplify multimedia
content-oriented mobile application design for use in educational,
cultural or marketing centers. Ordinary operators can now create a
variety of multimedia mobile applications such as tutorials,
catalogues, books, and guides in minutes rather than months.
Simplicity and power have been the goals of this CMS. In this paper,
we present the software engineering and design concepts of the
ARMrayan MCMS, along with the implementation challenges faced
and the solutions adopted.
Abstract: Wireless channels are characterized by serious
bursty and location-dependent errors. Many packet scheduling
algorithms have been proposed for wireless networks to guarantee
fairness and delay bounds. However, most existing schemes do not
consider the difference of traffic natures among packet flows. This
will cause the delay-weight coupling problem. In particular, serious
queuing delays may be incurred for real-time flows. In this paper, a
scheduling algorithm is proposed that takes the traffic types of flows
into consideration when scheduling packets and that provides
scheduling flexibility by trading off video quality to meet the
playback deadline.
Abstract: In order to increase chickpea quality and
agroecosystem sustainability, field experiments were carried out in
2007 and 2008 growing seasons. In this research, the effects of
different organic, chemical, and biological fertilizers on the grain
yield and quality of chickpea were investigated. Experimental
units were arranged in split-split plots based on randomized complete
blocks with three replications. The highest amounts of yield and yield
components were obtained with the G1×N5 interaction. The significant
increase in the N, P, K, Fe, and Mg content of leaves and grains
underlines the superiority of this treatment, because each one
of these nutrients has an established role in chlorophyll synthesis and
the photosynthetic ability of the crop. The combined application of
compost, farmyard manure and chemical phosphorus (N5) had the
best grain quality due to high protein, starch and total sugar contents,
low crude fiber and reduced cooking time.
Abstract: The HIV genome varies over a wide range because
it is highly heterogeneous. Hence, the infection ability of the virus
changes with different chemokine receptors. From this point,
R5 and X4 HIV viruses use the CCR5 and CXCR4 coreceptors
respectively, while R5X4 viruses can utilize both coreceptors.
Recently, in bioinformatics, R5X4 viruses have been studied for
classification by using the coreceptor usage of the HIV genome.
The aim of this study is to develop the optimal Multilayer
Perceptron (MLP) for high classification accuracy of HIV sub-type
viruses. To accomplish this purpose, the number of units in the hidden
layer was incremented one by one, from one to a particular number.
The statistical data of the R5X4, R5 and X4 viruses were preprocessed
by signal processing methods. Accessible residues of these virus
sequences were extracted and modeled by an Auto-Regressive Model
(AR), because the numbers of residues are large and differ from each
other. Finally, the pre-processed dataset was used to train MLPs with
various numbers of hidden units to determine R5X4 viruses.
Furthermore, ROC analysis was used to figure out the optimal MLP
structure.
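The hidden-unit sweep described above can be sketched as a generic model-selection loop; `train_eval` stands in for training a one-hidden-layer MLP and returning its validation score (e.g., the area under the ROC curve), and is an assumption, not the study's code:

```python
def select_hidden_units(train_eval, max_units):
    """Train a candidate model for each hidden-layer size from 1 to
    max_units and keep the size with the best score, mirroring the
    one-by-one increment of hidden units described in the abstract."""
    scores = {h: train_eval(h) for h in range(1, max_units + 1)}
    best = max(scores, key=scores.get)
    return best, scores
```

With a toy score function that peaks at four hidden units, the loop returns 4; in the study, the score for each size would come from training and ROC analysis on the AR-model features.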
Abstract: Laser beam forming is a novel technique developed for the joining of metallic components. In this study, an overview of the laser beam forming process, its areas of application, the basic mechanisms of the process, some recent research
studies, and the need to focus more research effort on improving the
laser-material interaction in the laser beam forming of titanium and its
alloys is presented.
Abstract: In this article, we introduce a mechanism by which the same concept of differentiated services used in network transmission can be applied to provide quality-of-service levels to pervasive systems applications. The classical DiffServ model, including marking and classification, assured forwarding, and expedited forwarding, is utilized to create quality-of-service guarantees for various pervasive applications requiring different levels of quality of service. Through a collection of various sensors, personal devices, and data sources, the transmission of context-sensitive data can automatically occur within a pervasive system with a given quality-of-service level. Triggers, initiators, sources, and receivers are the four entities labeled in our mechanism. The role of each is explained, along with how quality of service is guaranteed.
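A minimal sketch of the marking/classification step for pervasive messages; the class names mirror DiffServ (expedited forwarding, assured forwarding, best effort), while the message types and priority values are illustrative assumptions, not the paper's mechanism:

```python
import heapq

# DiffServ-style classes: lower number = served first.
PRIORITY = {"EF": 0, "AF": 1, "BE": 2}

def mark(message_type):
    """Classify/mark a message from a pervasive source (assumed types)."""
    if message_type == "alarm":       # e.g., a critical sensor event
        return "EF"                   # expedited forwarding
    if message_type == "telemetry":   # periodic sensor readings
        return "AF"                   # assured forwarding
    return "BE"                       # best effort for everything else

class DiffServQueue:
    """Serve marked messages strictly by class, FIFO within a class."""
    def __init__(self):
        self._q, self._n = [], 0
    def push(self, message_type, payload):
        heapq.heappush(self._q, (PRIORITY[mark(message_type)], self._n, payload))
        self._n += 1                  # tie-breaker keeps FIFO within a class
    def pop(self):
        return heapq.heappop(self._q)[2]
```

Pushing a best-effort message, a telemetry reading, and an alarm in that order dequeues the alarm first, then the telemetry, then the best-effort message.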
Abstract: Systems Analysis and Design is a key subject in
Information Technology courses, but students do not find it easy to
cope with, since it is not “precise” like programming and not exact
like Mathematics. It is a subject working with many concepts,
modeling ideas into visual representations and then translating the
pictures into a real-life system. To complicate matters, users who are
not necessarily familiar with computers need to give their inputs to
ensure that they get the system they need. Systems Analysis and
Design also covers two fields, namely Analysis, focusing on the
analysis of the existing system and Design, focusing on the design of
the new system. To be able to test the analysis and design of a
system, it is necessary to develop a system or at least a prototype of
the system to test the validity of the analysis and design. The skills
necessary in each aspect differ vastly. Project Management Skills,
Database Knowledge and Object Oriented Principles are all
necessary. In the context of a developing country where students
enter tertiary education underprepared and the digital divide is alive
and well, students need to be motivated to learn the necessary skills
and get an opportunity to test them in a “live” but protected environment –
within the framework of a university. The purpose of this article is to
improve the learning experience in Systems Analysis and Design
through reviewing the underlying teaching principles used, the
teaching tools implemented, the observations made and the
reflections that will influence future developments in Systems
Analysis and Design. Action research principles allow the focus to
be on a few problematic aspects during a particular semester.
Abstract: In this paper we address the issue of classifying the fluorescent intensity of a sample in Indirect Immuno-Fluorescence (IIF). Since IIF is a subjective, semi-quantitative test in its very nature, we discuss a strategy to reliably label the image data set by using the diagnoses performed by different physicians. Then, we discuss image pre-processing, feature extraction and selection. Finally, we propose two ANN-based classifiers that can separate intrinsically dubious samples and whose error tolerance can be flexibly set. Measured performance shows error rates of less than 1%, which makes the method a candidate for use in daily medical practice, either to perform pre-selection of cases to be examined or to act as a second reader.