Abstract: As part of the new industrial revolution, advances in nanotechnology have been followed with interest throughout the world, including in Turkey. The media play an important role in conveying these advances to the public, raising public awareness, and shaping attitudes toward nanotechnology. Beyond representing how a subject is treated, media frames determine how the public thinks about that subject. The literature identifies distinct frames for nanoscience and nanotechnology, such as process, regulation, conflict and risk, in studies focusing on different countries. How, then, is nanotechnology news framed, and in which news categories does it appear, in Turkey, one of the developing countries? This study aims to examine, via the framing analysis developed in agenda-setting research, the variables in Turkish media coverage of nanotechnology that affect public attitudes, such as category, frame, story tone and source. The analysis uses data from 2005 to 2009 obtained from the five national newspapers with the widest circulation in Turkey. The study reports the tone of media coverage of nanotechnology, the frames through which nanotechnological advances were brought to the agenda and reported as news, and the sectoral, legal, economic and social picture of nanotechnology in Turkey that these frames convey to the public.
Abstract: WLAN positioning has been addressed by many approaches in the literature, using the characteristics of Received Signal Strength (RSS), Time of Arrival (TOA) or Time Difference of Arrival (TDOA), Angle of Arrival (AOA), and cell ID. Among these, the RSS approach is the simplest to implement because it requires no modification of either access points or client devices, although its accuracy suffers badly from the physical environment. For the TOA or TDOA approach, the accuracy is quite acceptable, but most studies have to modify either software or hardware in the existing WLAN infrastructure; the modifications range from the access card alone up to changes in the WLAN protocol. Hence, TOA or TDOA is an unattractive approach for a positioning system. In this paper, a new concept of merging the RSS and TOA positioning techniques is proposed. In addition, a method is presented for obtaining the TOA characteristic for positioning a WLAN user without any extra modification to the existing system. The measurement results confirm that the proposed technique using both RSS and TOA characteristics provides better accuracy than using either the RSS or the TOA approach alone.
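As an illustration of the two ranging ingredients, the sketch below converts an RSS reading to a distance via a log-distance path-loss model, converts a time of flight to a distance, and fuses the two by inverse-variance weighting. All parameters (reference power, path-loss exponent, error variances) are assumptions for illustration; this is not the paper's actual fusion method.

```python
# Assumed log-distance path-loss parameters (illustrative, not from the paper)
P0 = -40.0    # RSS in dBm at the reference distance d0
d0 = 1.0      # reference distance in metres
n_pl = 3.0    # path-loss exponent for an indoor environment
c = 3e8       # speed of light, m/s

def rss_range(rss_dbm):
    """Distance estimate from RSS via the log-distance path-loss model."""
    return d0 * 10 ** ((P0 - rss_dbm) / (10 * n_pl))

def toa_range(t_flight):
    """Distance estimate from time of flight."""
    return c * t_flight

def fused_range(rss_dbm, t_flight, var_rss=4.0, var_toa=1.0):
    """Inverse-variance weighted fusion of the two range estimates."""
    w_rss, w_toa = 1.0 / var_rss, 1.0 / var_toa
    return (w_rss * rss_range(rss_dbm) + w_toa * toa_range(t_flight)) / (w_rss + w_toa)

# Example: RSS of -70 dBm and 34 ns flight time from the same access point
d_rss = rss_range(-70.0)      # log-distance model gives 10.0 m
d_toa = toa_range(34e-9)      # 10.2 m
d_hat = fused_range(-70.0, 34e-9)
```

With these assumed variances the fused estimate leans toward the (more reliable) TOA range, which is the intuition behind merging the two techniques.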
Abstract: In the present paper, an improved initial value numerical technique is presented to analyze the free vibration of symmetrically laminated rectangular plates. A combination of the initial value (IV) method and the finite difference (FD) method is used to develop the present (IVFD) technique. The technique is applied to the equation of motion of a vibrating laminated rectangular plate under various types of boundary conditions. Three common types of laminates, symmetrically cross-ply, orthotropic and isotropic plates, are analyzed here. The convergence and accuracy of the presented Initial Value-Finite Difference (IVFD) technique have been examined. The merits and validity of the improved technique are also demonstrated by comparing the obtained results with those available in the literature, indicating good agreement.
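As a hedged illustration of the finite-difference ingredient alone (not the paper's full IVFD plate formulation), the sketch below discretizes the fourth-derivative operator for a simply supported one-dimensional strip and checks its eigenvalues against the analytic values (mπ)⁴ of the non-dimensional free-vibration problem w'''' = λw:

```python
import numpy as np

# Standard second-difference matrix with w = 0 at both ends of (0, 1).
n = 99                                   # interior grid points (h = 1/100)
h = 1.0 / (n + 1)
D2 = (np.diag(np.full(n, -2.0))
      + np.diag(np.ones(n - 1), 1)
      + np.diag(np.ones(n - 1), -1)) / h**2

# For simply supported ends (w = w'' = 0), the discrete fourth derivative
# factorizes as D2 @ D2, which remains symmetric.
D4 = D2 @ D2

# Discrete eigenvalues of w'''' = lam * w; analytically lam_m = (m * pi)**4.
lam = np.sort(np.linalg.eigvalsh(D4))
```

Refining the grid (larger `n`) drives the discrete eigenvalues toward the analytic ones at the O(h²) rate, which is the kind of convergence check the abstract refers to.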
Abstract: This paper proposes, implements and evaluates an original discretization method for continuous random variables, in order to estimate the reliability of systems for which stress and strength are defined as complex functions and whose reliability is not derivable through analytic techniques. The method is compared with two other discretization approaches from the literature, in a comparative study involving four engineering applications. The results show that the proposal is very efficient in terms of the closeness of the estimates to the true (simulated) reliability. In the study we analyzed both a normal and a non-normal distribution for the random variables; the method is theoretically suitable for any parametric family.
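A minimal sketch of the general idea behind such estimators: discretize a stress and a strength distribution with a simple CDF-difference scheme and sum the joint probability mass where strength exceeds stress. The normal-normal pair below is purely illustrative (the paper's own discretization scheme and test cases differ), chosen because the exact reliability is known for comparison.

```python
import math

def norm_cdf(x, mu, sd):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

def discretize(mu, sd, k=200):
    """Discretize N(mu, sd^2) into k midpoints over mu +/- 5 sd,
    assigning each point the probability mass of its interval."""
    lo, hi = mu - 5 * sd, mu + 5 * sd
    step = (hi - lo) / k
    pts = [lo + (i + 0.5) * step for i in range(k)]
    probs = [norm_cdf(lo + (i + 1) * step, mu, sd) - norm_cdf(lo + i * step, mu, sd)
             for i in range(k)]
    return pts, probs

# Hypothetical stress-strength pair (values are illustrative, not from the paper)
s_pts, s_pr = discretize(40.0, 4.0)    # stress
y_pts, y_pr = discretize(50.0, 5.0)    # strength

# Reliability = P(strength > stress), summed over the discretized joint mass
R_hat = sum(ps * py
            for xs, ps in zip(s_pts, s_pr)
            for xy, py in zip(y_pts, y_pr)
            if xy > xs)

# Exact normal-normal result for comparison: R = Phi((50 - 40) / sqrt(25 + 16))
R_exact = norm_cdf(10.0 / math.sqrt(41.0), 0.0, 1.0)
```

The "closeness of the estimates to the true reliability" criterion in the abstract corresponds here to the gap between `R_hat` and `R_exact`.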
Abstract: In the literature, many studies have proposed various methods to reduce the PAPR (Peak-to-Average Power Ratio). Among these, DSI (Dummy Sequence Insertion) is one of the most attractive methods for WiMAX systems because it does not require side information to be transmitted along with the user data. However, the conventional DSI methods find the dummy sequence by performing an iterative procedure until the PAPR falls under a desired threshold. This causes a significant delay in finding the dummy sequence and also affects the overall performance of WiMAX systems. In this paper, a new method based on DSI is proposed that finds the dummy sequence without the need for an iterative procedure. The fast DSI method can reduce the PAPR without either delays or required side information. The simulation results confirm that the proposed method achieves PAPR performance similar to the other methods without any delay. In addition, simulations of a WiMAX system with adaptive modulation are also investigated to assess the use of the proposed methods over various fading schemes. The results suggest that WiMAX designers should adopt a new Signal-to-Noise Ratio (SNR) criterion for adaptation.
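For reference, the PAPR metric that DSI methods aim to reduce can be computed for a single OFDM symbol as below; the subcarrier count and QPSK mapping are illustrative assumptions, not the WiMAX configuration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256                                   # number of subcarriers (illustrative)

# Random QPSK symbols on every subcarrier.
X = np.exp(1j * (np.pi / 4 + (np.pi / 2) * rng.integers(0, 4, N)))
x = np.fft.ifft(X) * np.sqrt(N)           # time-domain OFDM symbol (unitary scaling)

# Peak-to-Average Power Ratio of the time-domain symbol.
papr = np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2)
papr_db = 10 * np.log10(papr)             # DSI tries to push this below a threshold
```

A DSI scheme would append a dummy sequence to `X` and recompute `papr_db`; the paper's contribution is finding that sequence without iterating.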
Abstract: ERP systems are the largest software applications adopted by universities, involving quite significant implementation investments. However, unlike for other applications, little research has been conducted on these systems in a university environment. This paper provides a critical review of previous research on ERP systems in higher education, with a special focus on higher education in Australia. The review not only forms the basis for an evaluation of previous research and research needs; it also makes inroads into identifying the payoff of ERPs in the sector from different perspectives, with particular reference to the user. The paper is divided into two parts: the first focuses on ERP literature in higher education at large, while the second focuses on ERP literature in higher education in Australia.
Abstract: This research investigates the factors that influence moral judgments when dealing with ethical dilemmas in the organizational context, as well as the antecedents of individual ethical ideology (idealism and relativism). A mixed-method design, combining qualitative (field study) and quantitative (survey) approaches, was used. An initial model was developed first and then fine-tuned based on the field studies. Data were collected from managers in large Malaysian organizations. The results reveal that in-group collectivism culture, power distance culture, parental values, and religiosity were significant antecedents of ethical ideology; however, the direct effects of these variables on moral judgment were not significant. Furthermore, the results confirm the significant effects of ethical ideology on moral judgment. This study provides valuable insight for evaluating the validity of existing theory as proposed in the literature and offers significant practical implications.
Abstract: The importance of nurturing, accumulating, and efficiently deploying knowledge resources through formal structures and organisational mechanisms is well understood. Recent trends in knowledge management (KM) highlight that the effective creation and transfer of knowledge can also rely upon extra-organisational channels, such as informal networks. The perception exists that the role of informal networks in knowledge creation and performance has been underestimated in the organisational context. The literature indicates that many managers fail to comprehend and successfully exploit the potential of informal networks to create value for their organisations. This paper investigates: 1) whether managers share work-specific knowledge with informal contacts within and outside organisational boundaries; and 2) how important they consider this knowledge collaboration to be for their learning and work outcomes.
Abstract: For a country located within the tropical belt, certain rules should be implemented in creating a passive sustainable housing design in Malaysia. The traditional Malay house possesses a strong character, with special spaces that create a sustainable house suited to Malaysia's tropical climate. One of these special spaces, known as the verandah or serambi gantung, offers advantages in solving various issues. However, this space is rarely applied today, which produces major problems in terms of social and environmental aspects. This phenomenon has a negative impact on occupants, even though Malaysia already had an excellent housing design in the past. Therefore, this paper aims to explore both of the main issues mentioned above and to reveal the advantages of incorporating the verandah into passive sustainable housing design in Malaysia. A systematic literature review is the main methodology used in this research to identify the various advantages of the verandah. The study reveals that the verandah is the best solution to these social and environmental issues and should be implemented in current housing design in Malaysia.
Abstract: In this paper we consider the problem of distributed adaptive estimation over sensor networks. To deal with a more realistic scenario, a different observation noise variance is assumed for each sensor in the network. To handle these differing variances, the proposed method is divided into two phases: I) estimating each sensor's observation noise variance, and II) using the estimated variances to obtain the desired parameter. Our algorithm is based on a diffusion least mean square (LMS) implementation with a linear combiner model, in which the step-size parameter and the coefficients of the linear combiner are adjusted according to the estimated observation noise variances. As the simulation results show, the proposed algorithm considerably improves on the diffusion LMS algorithm given in the literature.
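A minimal sketch of the combination idea: local LMS updates are merged with weights inversely proportional to each sensor's noise variance, so noisier sensors contribute less. Here the variances are assumed known (the paper's phase I estimates them online), and the fully connected topology and all parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
M = 4                                    # length of the unknown parameter vector
w_true = rng.standard_normal(M)
N = 5                                    # number of sensors
# Per-sensor observation noise variances; assumed known here for brevity.
sigma2 = np.array([0.01, 0.04, 0.09, 0.25, 1.0])

mu = 0.05                                # LMS step size
w = np.zeros((N, M))                     # local estimates
c = (1.0 / sigma2) / np.sum(1.0 / sigma2)   # inverse-variance combiner weights

for _ in range(2000):
    psi = np.empty((N, M))
    for k in range(N):                   # adaptation step at each sensor
        u = rng.standard_normal(M)       # regressor
        d = u @ w_true + np.sqrt(sigma2[k]) * rng.standard_normal()
        psi[k] = w[k] + mu * u * (d - u @ w[k])
    # Combination step: every node merges the intermediate estimates,
    # weighting noisier sensors less.
    w = np.tile(c @ psi, (N, 1))

msd = np.mean((w - w_true) ** 2)         # mean-square deviation after adaptation
```

With uniform weights the high-variance sensors would dominate the steady-state error; the inverse-variance combiner is what the variance-estimation phase enables.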
Abstract: Underpricing is one anomaly in the initial public offerings (IPO) literature that has been widely observed across different stock markets, with different trends emerging over different time periods. This study seeks to determine how IPOs on the JSE performed on the first day, first week and first month over the period 1996-2011. Underpricing trends are documented for both hot and cold market periods in terms of four main sectors (cyclical, defensive, growth and interest-rate-sensitive stocks). Using a sample of 360 companies listed on the JSE, the empirical analysis establishes that IPOs on the JSE are significantly underpriced, with an average market-adjusted first-day return of 62.9%. It is also established that hot-market IPOs on the JSE are more underpriced than cold-market IPOs, and that as the offer price per share increases above the median price for any given period, the level of underpricing decreases substantially. While significant differences exist in the level of underpricing of IPOs in the four sectors in the hot and cold market periods, interest-rate-sensitive stocks showed a different trend from the other sectors and thus require further investigation to explain this pattern.
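The market-adjusted first-day return behind figures such as the 62.9% average can be sketched as follows; the prices below are hypothetical, not JSE data.

```python
def market_adjusted_return(offer_price, close_price, index_open, index_close):
    """Raw first-day return minus the market (index) return over the same day."""
    raw = close_price / offer_price - 1.0
    market = index_close / index_open - 1.0
    return raw - market

# Hypothetical IPO: offered at R10.00, closes at R16.00 on a day
# when the benchmark index rises 2%.
mar = market_adjusted_return(10.0, 16.0, 100.0, 102.0)   # 0.60 - 0.02 = 0.58
```

Averaging this quantity over all sample IPOs (per sector, per hot/cold period) yields the underpricing statistics the study reports.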
Abstract: Multi-Layer Perceptron (MLP) neural networks have been very successful in a number of signal processing applications. In this work we study the possibilities of, and the difficulties encountered in, applying MLP neural networks to the prediction of daily solar radiation data. We use the Polak-Ribière algorithm for training the neural networks. A comparison, in terms of statistical indicators, with a linear model commonly used in the literature is also performed, and the results obtained show that the neural networks are more efficient and give the best results.
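A minimal sketch of the Polak-Ribière conjugate-gradient update (here in its common PR+ variant, applied to a small quadratic test function with exact line search, not to an MLP training loss):

```python
import numpy as np

# Quadratic test function f(x) = 0.5 x'Ax - b'x (illustrative, not from the paper)
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite Hessian
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b

x = np.zeros(2)
g = grad(x)
d = -g                                    # first direction: steepest descent
for _ in range(50):
    alpha = -(g @ d) / (d @ A @ d)        # exact line search on a quadratic
    x = x + alpha * d
    g_new = grad(x)
    # Polak-Ribiere beta (PR+ variant: clipped at zero to avoid bad directions)
    beta = max(0.0, g_new @ (g_new - g) / (g @ g))
    d = -g_new + beta * d
    g = g_new
    if np.linalg.norm(g) < 1e-10:
        break
# x converges to the minimizer A^{-1} b
```

For MLP training, `grad` would be the backpropagated loss gradient and the exact line search would be replaced by an inexact one; the β update is the same.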
Abstract: Complexity, as a theoretical background, has made it easier to understand and explain the features and dynamic behavior of various complex systems. As the common theoretical background has confirmed, borrowing design terminology from the natural sciences has helped to control and understand urban complexity. Phenomena like self-organization, evolution and
adaptation are appropriate to describe the formerly inaccessible
characteristics of the complex environment in unpredictable bottom-up systems. Increased computing capacity has been a key element in
capturing the chaotic nature of these systems.
A paradigm shift in urban planning and architectural design has forced us to give up the illusion of total control over the urban environment and, consequently, to seek novel methods for steering its development. New methods using dynamic modeling
have offered a real option for more thorough understanding of
complexity and urban processes. At best, the new approaches may renew design processes so that we get a better grip on the complex world via more flexible processes, support urban environmental diversity, and respond to our needs beyond basic welfare by liberating ourselves from standardized minimalism.
A complex system and its features are as such beyond human
ethics. Self-organization and evolution are neither good nor bad; their mechanisms are by nature devoid of reason. They are common in
urban dynamics as in natural processes. They are features
of a complex system, and they cannot be prevented. Yet their
dynamics can be studied and supported.
The paradigm of complexity and the new design approaches have been criticized for a lack of humanity and morality, but the ethical implications of scientific or computational design processes have not been much discussed. It is important to distinguish the (unexciting) ethics of the theory and tools from the ethics of computer-aided processes based on ethical decisions. Urban planning and architecture cannot be based on the survival of the fittest; however, the natural dynamics of the system cannot be impeded on the grounds of being "non-human".
In this paper the ethical challenges of using dynamic models are contemplated in light of a few examples of new architecture, dynamic urban models, and the literature. It is suggested that the ethical challenges in computational design processes could be reframed under the concepts of responsibility and transparency.
Abstract: The segmentation of endovascular tools in fluoroscopy images can be performed accurately, automatically or with minimal user intervention, using known modern techniques; this has been demonstrated in the literature, but no clinical implementation exists so far because the computational time requirements of such technology have not yet been met. A classical segmentation scheme is composed of edge-enhancement filtering, line detection, and segmentation. A new method is presented that consists of a vector that propagates in the image to track an edge as it advances. The filtering is performed progressively along the projected path of the vector, whose orientation allows for oriented edge detection, and only a minimal image area is filtered globally. Such an algorithm is rapidly computed and can be implemented in real-time applications. It was tested on medical fluoroscopy images from an endovascular cerebral intervention. Experiments showed that the 2D tracking was limited to guidewires without intersection crosspoints, while the 3D implementation was able to cope with such planar difficulties.
Abstract: In the literature of fuzzy measures, there exist many well-known parametric and non-parametric measures, each with its own merits and limitations. Our main emphasis, however, is on the application of these measures to a variety of disciplines. To extend the scope of application of these fuzzy measures to geometry, some special fuzzy measures are needed. In this communication, we introduce two new fuzzy measures involving trigonometric functions and provide their applications to obtain basic results already existing in the literature of geometry.
Abstract: The basic objective of this study is to develop a regression-analysis method that can estimate the plastic hinge length, an important design parameter, by making use of the outcomes (lateral load-lateral displacement hysteretic curves) of experimental studies conducted on square reinforced concrete columns. For this aim, the results of 170 different square reinforced concrete column tests have been collected from the existing literature. The parameters thought to affect the plastic hinge length, such as cross-section properties, material properties, axial load level, confinement of the column, and the longitudinal reinforcement bars in the column, have been obtained from these 170 tests. In the study, regression analyses based on the experimental test results have been carried out separately to determine the plastic hinge length and compared with one another. In addition, the outcomes of the mentioned methods for determining the plastic hinge length of reinforced concrete columns have been compared with other methods available in the literature.
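A minimal sketch of estimating plastic hinge length by least-squares regression on column parameters. The predictors, coefficients, and data below are synthetic illustrations of the procedure, not the 170 collected test results or the study's fitted model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical predictors: section depth h (mm) and axial load ratio n
h = rng.uniform(300, 600, 170)
n = rng.uniform(0.05, 0.40, 170)

# Synthetic "measured" plastic hinge lengths: assumed ground truth plus noise
Lp = 0.25 * h + 120.0 * n + rng.normal(0.0, 5.0, 170)

# Ordinary least squares with an intercept term
X = np.column_stack([h, n, np.ones_like(h)])      # design matrix
coef, *_ = np.linalg.lstsq(X, Lp, rcond=None)

pred = X @ coef
r2 = 1 - np.sum((Lp - pred) ** 2) / np.sum((Lp - Lp.mean()) ** 2)
```

The same machinery extends to the fuller parameter set the abstract lists (confinement, reinforcement, material properties) by adding columns to `X`.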
Abstract: Artifact rejection plays a key role in many signal processing applications. Artifacts are disturbances that can occur during signal acquisition and that can alter the analysis of the signals themselves. Our aim is to automatically remove artifacts, in particular from electroencephalographic (EEG) recordings. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for artifact extraction and on high-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we enhance this technique by proposing a new method based on Rényi's entropy. The performance of our method was tested and compared with that of the method in the literature, and the former proved to outperform the latter.
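A minimal sketch of the statistical markers involved: excess kurtosis and an order-2 Rényi entropy estimated from a histogram. The signals are synthetic stand-ins for ICA components (the spiky one mimics an artifact), not EEG data, and the estimators are deliberately simplified relative to any used in the paper.

```python
import numpy as np

def kurtosis(x):
    """Excess kurtosis; large values flag spiky, artifact-like components."""
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 4) - 3.0

def renyi_entropy(x, alpha=2.0, bins=50):
    """Order-alpha Renyi entropy from a histogram estimate of the density."""
    counts, _ = np.histogram(x, bins=bins, density=False)
    p = counts / counts.sum()
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

rng = np.random.default_rng(0)
neural = rng.standard_normal(10000)      # background-like component
artifact = neural.copy()
artifact[::200] += 15.0                  # sparse large spikes, blink-like

# The artifact-like component shows much higher kurtosis and a lower
# (more peaked) Renyi entropy, which is what the rejection rule exploits.
```

Components whose markers cross a threshold would be zeroed before reconstructing the cleaned EEG from the remaining independent components.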
Abstract: The transformation of vocal characteristics aims at modifying a voice so that the intelligibility of aphonic speech is increased, or so that the voice of one speaker (the source speaker) is perceived as if another speaker (the target speaker) had uttered it. In
this paper, the current state-of-the-art voice characteristics
transformation methodology is reviewed. Special emphasis is placed
on voice transformation methodology and issues for improving the
transformed speech quality in intelligibility and naturalness are
discussed. In particular, it is suggested to use the modulation theory
of speech as a base for research on high quality voice transformation.
This approach allows one to separate the linguistic, expressive, organic and perspectival information of speech, based on an analysis of how
they are fused when speech is produced. Therefore, this theory
provides the fundamentals not only for manipulating non-linguistic,
extra-/paralinguistic and intra-linguistic variables for voice
transformation, but also for paving the way for easily transposing the
existing voice transformation methods to emotion-related voice
quality transformation and speaking style transformation. From the
perspectives of human speech production and perception, the popular voice transformation techniques are described and classified based on their underlying principles, drawn from the speech production mechanism, the perception mechanism, or both. In addition, the advantages and limitations of voice transformation techniques and the experimental manipulation of vocal cues are discussed through examples from past and present research. Finally, conclusions and a road map toward more natural voice transformation algorithms in the future are presented.
Abstract: This paper aims to present the main instruments used in the economic literature for measuring price risk, pointing out the advantages brought by the conditional variance in this respect. The theoretical approach is exemplified by elaborating an EGARCH model for wheat price returns, on both the Romanian and the international market. To our knowledge, no previous empirical research, either on price risk measurement for the Romanian markets or using the ARIMA-EGARCH methodology, has been conducted. After estimating the corresponding models, the paper compares the estimated conditional variance on the two markets.
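The EGARCH(1,1) conditional-variance recursion at the core of such a model can be sketched as follows; the parameter values are illustrative assumptions, not the estimates for the wheat return series.

```python
import numpy as np

# Illustrative EGARCH(1,1) parameters (not the paper's estimates):
# ln s2_t = omega + beta * ln s2_{t-1} + alpha * (|z_{t-1}| - E|z|) + gamma * z_{t-1}
omega, beta, alpha, gamma = -0.1, 0.95, 0.2, -0.05

rng = np.random.default_rng(0)
T = 500
log_s2 = np.empty(T)
r = np.empty(T)
log_s2[0] = omega / (1 - beta)            # start at the unconditional level
E_abs_z = np.sqrt(2 / np.pi)              # E|z| for standard normal innovations

for t in range(T):
    s = np.exp(0.5 * log_s2[t])           # conditional standard deviation
    z = rng.standard_normal()             # standardized innovation
    r[t] = s * z                          # simulated (de-meaned) return
    if t + 1 < T:
        log_s2[t + 1] = (omega + beta * log_s2[t]
                         + alpha * (abs(z) - E_abs_z) + gamma * z)
```

Because the recursion works on the log-variance, positivity of the conditional variance is automatic, and a negative `gamma` lets negative price shocks raise variance more than positive ones, which is the asymmetry EGARCH adds over standard GARCH.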
Abstract: Ontology matching is a task needed in various applications, for example for comparison or merging purposes. In the literature, many algorithms solving the matching problem can be found, but most of them do not consider instances at all. Mappings are determined by calculating the string similarity of labels, by recognizing linguistic word relations (synonyms, subsumptions, etc.), or by analyzing the (graph) structure. Given that instances are often modeled within the ontology and that the set of instances describes the meaning of the concepts better than their meta-information, instances should definitely be incorporated into the matching process. In this paper several novel instance-based matching algorithms are presented which enhance the quality of matching results obtained with common concept-based methods. Different kinds of formalisms are used to classify concepts on account of their instances and finally to compare the concepts directly.
Keywords: Instances, Ontology Matching, Semantic Web
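One simple instance-based formalism compares two concepts directly by the overlap of their instance sets, for example with a Jaccard coefficient; the concepts and instance identifiers below are hypothetical, and the paper's own formalisms may differ.

```python
def jaccard(a, b):
    """Jaccard similarity of two instance sets: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical instance sets of two concepts from different ontologies,
# e.g. "Paper" in ontology A and "Article" in ontology B.
paper_instances_A = {"pub1", "pub2", "pub3", "pub4"}
article_instances_B = {"pub2", "pub3", "pub4", "pub5"}

sim = jaccard(paper_instances_A, article_instances_B)   # 3 shared / 5 total
```

A high score suggests the two concepts denote the same class of things even when their labels or graph neighborhoods disagree, which is exactly the signal concept-based matchers miss.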