Abstract: This study examines, in chronological order, the formation of international financial crises and the political factors behind economic crises in Turkey. Relevant studies conducted internationally and in Turkey are assessed in the literature review. The main purpose of the study is to examine in detail the linkage between crises and political stability in Turkey, and to assess Turkey's position in this regard. The introduction is followed, in the second part of the study, by a literature survey on models explaining the causes and results of crises. In the third part, the formation of world financial crises is studied. In the fourth part, the Turkish financial crises of 1994, 2000, 2001 and 2008 are reviewed and their political causes are analyzed. The last part of the study presents results and recommendations. Political administrations have laid the grounds for economic crises in Turkey. In this study, the emergence of economic crises in Turkey and the developments after the crises are chronologically examined, and an explanation is offered of the cause-and-effect relationship between political administration and economic equilibrium in the country. Economic crises can be characterized as follows: high prices of consumables, high interest rates, current account deficits, budget deficits, structural defects in government finance, rising inflation under fixed exchange-rate regimes, rising government debt, declining savings rates and increased dependency on foreign capital. When crisis conditions arise at a time when the exchange value of the country's national currency is rising, speculative financial movements and shrinking foreign currency reserves follow, driven by expectations of devaluation and by foreign investors' reluctance to finance the national debt, and a financial risk emerges.
During the February 2001 crisis and immediately afterwards, devaluation and loss of value occurred in Turkey's stock market. This study also discusses the effects of the crisis on the real economy during the transition, in the midst of this crisis, to a floating exchange-rate system. The policies administered included financial reforms, such as the restructuring of the banking system, followed by the provision of foreign financial support. There have been winners and losers in the imbalance of income distribution, which has recently become more evident in Turkey's fragile economy.
Abstract: This paper presents Qmulus, a cloud-based GPS model. Qmulus is designed to compute the best possible route leading the driver to a specified destination in the shortest time while taking real-time constraints into account. The intelligence incorporated into Qmulus's design makes it capable of generating and prioritizing a list of optimal routes through customizable dynamic updates. The goal of this design is to minimize travel and cost overheads, maintain reliability and consistency, and provide scalability and flexibility. The proposed model focuses on narrowing the gap between a client application and a cloud service so as to render operations seamless. Qmulus's system model is closely integrated, and its concept has the potential to be extended to several other integrated applications, making it adaptable to different media and resources.
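The abstract does not specify the routing algorithm Qmulus uses; as a hedged illustration of shortest-time routing on a travel-time-weighted road graph (the graph, node names and weights below are invented for the example), a standard Dijkstra search might look like:

```python
import heapq

def shortest_time_route(graph, src, dst):
    # graph: {node: [(neighbor, travel_time), ...]}
    # Dijkstra's algorithm: returns (total_time, route)
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    if dst not in dist:
        return float("inf"), []
    route, node = [dst], dst
    while node != src:
        node = prev[node]
        route.append(node)
    return dist[dst], route[::-1]
```

In a real system the edge weights would be refreshed from live traffic data, which is where the "customizable dynamic updates" of the model would enter.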
Abstract: Haptics has been used extensively in many applications, especially in human-machine interaction and virtual reality systems. Haptic technology allows the user to perceive virtual reality as in the real world. However, commercially available haptic devices are expensive and may not be suitable for educational purposes. This paper describes the design and development of a low-cost haptic knob, with only one degree of freedom, for use in rehabilitation or in training hand pronation and supination. End-effectors can be changed to suit different applications or variations in hand size and hand orientation.
Abstract: This paper proves that the problem of finding a connected vertex cover in a 2-connected planar graph (CVC-2) with maximum degree 4 is NP-complete. The motivation for proving this result is to give a shorter and simpler proof of the NP-completeness of the TRA-MLC (Top Right Access point Minimum-Length Corridor) problem [1] by finding a reduction from CVC-2. TRA-MLC has many applications in laying optical fibre cables for data communication and in electrical wiring of floor plans. The problem of finding a connected vertex cover in any planar graph (CVC) with maximum degree 4 is NP-complete [2]. We first show that CVC-2 belongs to NP, and then we find a polynomial reduction from CVC to CVC-2. Let a graph G0 and an integer K form an instance of CVC, where G0 is a planar graph and K is an upper bound on the size of the connected vertex cover in G0. We construct a 2-connected planar graph, say G, by identifying the blocks and cut vertices of G0 and then finding the planar representation of all the blocks of G0, leading to a plane graph G1. We replace the cut vertices with cycles in such a way that the resultant graph G is a 2-connected planar graph with maximum degree 4. We consider L = K - 2t + 3 Σ_{i=1}^{t} d_i, where t is the number of cut vertices in G1 and d_i is the number of blocks for which the i-th cut vertex is common. We prove that G has a connected vertex cover of size at most L if and only if G0 has a connected vertex cover of size at most K.
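The reduction itself is graph-surgical; as background, the underlying decision problem can be illustrated with a brute-force connected vertex cover check on small graphs (exponential time, for illustration only; the graphs in the usage are invented examples, not the paper's constructions):

```python
from itertools import combinations

def has_connected_vertex_cover(vertices, edges, k):
    # True iff the graph has a vertex cover of size <= k whose
    # induced subgraph is connected (brute force; tiny graphs only)
    def covers(S):
        return all(u in S or v in S for u, v in edges)

    def connected(S):
        if not S:
            return True
        start = next(iter(S))
        seen, stack = {start}, [start]
        while stack:  # BFS/DFS inside the induced subgraph
            x = stack.pop()
            for u, v in edges:
                for a, b in ((u, v), (v, u)):
                    if a == x and b in S and b not in seen:
                        seen.add(b)
                        stack.append(b)
        return seen == S

    for size in range(1, k + 1):
        for S in combinations(vertices, size):
            if covers(S) and connected(set(S)):
                return True
    return False
```

On the 4-cycle, for example, {1, 3} is a vertex cover of size 2 but its induced subgraph is disconnected, so the smallest connected vertex cover has size 3, which is exactly the kind of gap the connectivity constraint introduces.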
Abstract: This paper studies the high-level modelling and design of delta-sigma (ΔΣ) noise shapers for an audio Digital-to-Analog Converter (DAC), so as to eliminate the in-band Signal-to-Noise Ratio (SNR) degradation that accompanies channel mismatch in the audio signal. The converter combines cascaded digital signal interpolation with a single-loop noise-shaping delta-sigma modulator using a 5-bit quantizer in the final stage. To reduce sensitivity to DAC nonlinearities in the last stage, a high-pass second-order Data Weighted Averaging (R2DWA) technique is introduced. This paper presents a MATLAB modelling approach for the proposed DAC architecture with low-distortion and swing-suppression integrator designs. The ΔΣ modulator can be configured as a third-order design and accepts 24-bit PCM at a sampling rate of 64 kHz for Digital Video Disc (DVD) audio applications. The modelling approach provides 139.38 dB of dynamic range over a 32 kHz signal band at a -1.6 dBFS input signal level.
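The paper's third-order modulator is only described at a high level; the noise-shaping principle it relies on can be sketched with a first-order loop (a much simpler structure than the one proposed, shown only to illustrate how the integrator-plus-quantizer feedback makes the output's average track the input while pushing quantization error out of band):

```python
import numpy as np

def delta_sigma_first_order(x):
    # First-order delta-sigma modulator with a 1-bit quantizer:
    # the integrator accumulates input minus previous output, and
    # the sign quantizer produces the +/-1 bitstream.
    acc = 0.0
    y = np.empty_like(x)
    for i, s in enumerate(x):
        acc += s - (y[i - 1] if i else 0.0)
        y[i] = 1.0 if acc >= 0.0 else -1.0
    return y
```

For a DC input of 0.5 the bitstream settles into a pattern whose mean equals the input, which is the property higher-order shapers refine by moving more of the quantization noise to high frequencies.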
Abstract: Antimicrobial (AM) starch-based films were developed by incorporating chitosan and lauric acid as antimicrobial agents into a starch-based film. Chitosan has a wide range of applications as a biomaterial, but barriers still exist to its broader use due to its physical and chemical limitations. In this work, a series of starch/chitosan (SC) blend films containing 8% lauric acid was prepared by the casting method. The structure of the films was characterized by Fourier transform infrared spectroscopy (FTIR), X-ray diffraction (XRD), and scanning electron microscopy (SEM). The results indicated that strong interactions were present between the hydroxyl groups of starch and the amino groups of chitosan, resulting in good miscibility between starch and chitosan in the blend films. Physical and optical properties of the AM starch-based films were evaluated. The films incorporating chitosan and lauric acid showed an improvement in water vapour transmission rate (WVTR); increasing the starch content produced more transparent films, while the yellowness of the films was attributed to higher chitosan content. The improvement in water barrier properties was mainly attributed to the hydrophobicity of lauric acid and to an optimum chitosan or starch content. The AM starch-based films also showed excellent oxygen barrier properties. Films with low oxygen permeability indicate the potential of this antimicrobial packaging as a natural alternative to synthetic polymer packaging for protecting food from oxidation reactions.
Abstract: For many industrial applications, plate heat exchangers demonstrate a large superiority over other types of heat exchangers. The efficiency of such a device depends on numerous factors whose effects need to be analysed and accurately evaluated.
In this paper we present a theoretical analysis of a co-current plate heat exchanger and the results of its numerical simulation.
Knowing the hot and cold fluid stream inlet temperatures, the respective heat capacity rates mCp, and the value of the overall heat transfer coefficient, a 1-D mathematical model based on the steady flow energy balance over a differential length of the device is developed, resulting in a set of N first-order differential equations with boundary conditions, where N is the number of channels. For a specific heat exchanger geometry and set of operational parameters, the problem is numerically solved using the shooting method.
The simulation allows prediction of the temperature map in the heat exchanger and hence the evaluation of its performance. A parametric analysis is performed to evaluate the influence of the R-parameter on the ε-NTU values. For practical purposes, effectiveness-NTU graphs are elaborated for a specific heat exchanger geometry and different operating conditions.
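The paper solves N coupled channel equations with the shooting method; for the two-channel co-current case, all boundary conditions sit at the same inlet, so the same steady-flow energy balance reduces to an initial-value problem that can be marched directly. A minimal sketch (Th, Tc are hot/cold temperatures, UA the overall conductance, Ch, Cc the heat capacity rates; the numbers in the test are invented):

```python
import math

def cocurrent_profiles(Th_in, Tc_in, UA, Ch, Cc, n=1000):
    # Forward-Euler march of the 1-D steady energy balance over n
    # slices of a two-channel co-current exchanger:
    #   dQ = (UA/n) * (Th - Tc); hot stream cools, cold stream warms.
    Th, Tc = Th_in, Tc_in
    dUA = UA / n
    for _ in range(n):
        q = dUA * (Th - Tc)
        Th -= q / Ch
        Tc += q / Cc
    return Th, Tc
```

The outlet temperatures can be checked against the closed-form co-current effectiveness ε = (1 - exp(-NTU(1+R)))/(1+R), with NTU = UA/Cmin and R = Cmin/Cmax.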
Abstract: In this paper, we extend the compound binomial model to the case where the premium income process, based on a binomial process, is no longer a linear function. First, a recursive formula is derived for the non-ruin probability; second, we show that the expected discounted penalty function satisfies a defective renewal equation. Third, an asymptotic estimate for the expected discounted penalty function is given. Finally, we give two examples of ruin quantities to illustrate applications of the recursive formula and of the asymptotic estimate for the penalty function.
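The paper's recursive formula concerns a nonlinear premium process and is not reproduced here; as a baseline illustration of the quantities involved, the finite-horizon ruin probability of the classical compound binomial model (unit premium per period) can be estimated by Monte Carlo simulation (all parameter values below are invented):

```python
import random

def ruin_probability_mc(u, p, claim_sizes, horizon=200,
                        trials=20000, seed=42):
    # Classical compound binomial model: each period the surplus
    # gains premium 1 and, with probability p, pays a claim drawn
    # uniformly from claim_sizes. Ruin = surplus drops below 0
    # within the horizon.
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        surplus = u
        for _ in range(horizon):
            surplus += 1
            if rng.random() < p:
                surplus -= rng.choice(claim_sizes)
            if surplus < 0:
                ruined += 1
                break
    return ruined / trials
```

As expected, a larger initial surplus u lowers the estimated ruin probability, which is the monotonicity the recursive formula makes exact.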
Abstract: Among the various cooling processes in industrial applications such as electronic devices, heat exchangers and gas turbines, the cooling of gas turbine blades is the most challenging. One of the most common practices is the use of ribbed walls, because they disturb the boundary layer and thereby enhance cooling. Vortex formation between a rib and the channel wall results in complicated flow behaviour. On the other hand, selecting the most efficient method for obtaining results that best match experimental work is itself an interesting question. In this paper, four common turbulence modelling methods, namely standard k-ε, rationalized k-ε with enhanced wall boundary layer treatment, k-ω and RSM (Reynolds stress model), are applied to a square ribbed channel to investigate the separation and thermal behaviour of the flow in the channel. Finally, the results from the different methods used in this paper are compared with experimental data available in the literature to assess the accuracy of the numerical method.
Abstract: Interactive installations for public spaces are a particular kind of interactive system whose design has been the subject of several research studies. Sensor-based applications are becoming increasingly popular, but the human-computer interaction community is still far from achieving sound, effective large-scale interactive installations for public spaces. The 6DSpaces project is described in this paper as a research approach based on studying the role of multisensory interactivity and how it can be used effectively to bring people closer to digital scientific content. The design of an entire scientific exhibition is described, and the result was evaluated in the real-world context of a Science Centre. The conclusions offer insight into how human-computer interaction should be designed in order to maximize the overall experience.
Abstract: Signal processing applications that are iterative in nature are best represented by data flow graphs (DFGs). In these applications, the maximum sampling frequency depends on the topology of the DFG, and on its cyclic dependencies in particular. Determining the iteration bound, which is the reciprocal of the maximum sampling frequency, is critical in the hardware implementation of signal processing applications. In this paper, a novel technique to compute the iteration bound is proposed. This technique differs from all previously proposed techniques in that it is based on the natural flow of tokens through the DFG rather than on the topology of the graph. The proposed algorithm has lower run-time complexity than all known algorithms. Its performance is illustrated through an analytical time-complexity analysis as well as through simulation of some benchmark problems.
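The paper's token-flow algorithm is not reproduced here; for reference, the quantity it computes, the iteration bound T∞ = max over cycles of (cycle computation time / number of cycle delays), can be evaluated directly by cycle enumeration on small DFGs (exponential in general, shown only to define the target quantity; the example graph in the test is invented):

```python
def simple_cycles(adj):
    # Enumerate simple cycles once each: DFS rooted at every node,
    # visiting only nodes >= the root to avoid duplicate rotations.
    cycles = []
    def dfs(start, v, path, visited):
        for w in adj[v]:
            if w == start:
                cycles.append(path[:])
            elif w > start and w not in visited:
                visited.add(w)
                path.append(w)
                dfs(start, w, path, visited)
                path.pop()
                visited.remove(w)
    for s in sorted(adj):
        dfs(s, s, [s], {s})
    return cycles

def iteration_bound(comp_time, delays, adj):
    # T_inf = max over cycles of (sum of node times / sum of delays);
    # comp_time: {node: time}, delays: {(u, v): delay count}
    best = 0.0
    for cyc in simple_cycles(adj):
        t = sum(comp_time[v] for v in cyc)
        d = sum(delays[(cyc[i], cyc[(i + 1) % len(cyc)])]
                for i in range(len(cyc)))
        best = max(best, t / d)
    return best
```

The proposed algorithm's point is precisely to avoid this explicit enumeration of cycles.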
Abstract: Embedding and extraction of secret information, as well as restoration of the original un-watermarked image, are highly desirable in sensitive applications such as military, medical, and law enforcement imaging. This paper presents a novel reversible data-hiding method for digital images using an integer-to-integer wavelet transform and a companding technique, which can embed and recover the secret information as well as restore the image to its pristine state. The method takes advantage of block-based watermarking and iterative optimization of the companding threshold, which avoids histogram pre- and post-processing. Consequently, it reduces the overhead usually required in most reversible watermarking techniques and keeps the distortion between the marked and original images small. Experimental results show that the proposed method outperforms the existing reversible data-hiding schemes reported in the literature.
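The abstract does not state which integer-to-integer wavelet is used; the exact-reversibility property such transforms provide can be illustrated with the one-dimensional integer Haar (S-) transform, whose floor rounding is exactly invertible:

```python
def int_haar_forward(a, b):
    # Integer-to-integer Haar (S-transform) on a sample pair:
    # d is the difference, s the floor-average. Lossless by design.
    d = a - b
    s = b + (d >> 1)   # arithmetic shift = floor division by 2
    return s, d

def int_haar_inverse(s, d):
    # Undo the rounding exactly, recovering the original pair.
    b = s - (d >> 1)
    a = b + d
    return a, b
```

Reversible watermarking schemes embed payload bits into the high-pass coefficients d of such a transform; because the transform round-trips exactly, removing the payload restores the cover image bit for bit.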
Abstract: A numerical method for solving nonlinear Fredholm integral equations of the second kind is proposed. Fredholm-type equations, which have many applications in mathematical physics, are considered. The method is based on hybrid function approximations. The properties of hybrids of block-pulse functions and Chebyshev polynomials are presented and utilized to reduce the computation of nonlinear Fredholm integral equations to a system of nonlinear algebraic equations. Some numerical examples are selected to illustrate the effectiveness and simplicity of the method.
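The hybrid block-pulse/Chebyshev construction is not reproduced here; a minimal Nyström-style sketch for a nonlinear Fredholm equation of the second kind, u(x) = f(x) + ∫₀¹ K(x, t, u(t)) dt, using trapezoid quadrature and Picard iteration (a simpler scheme than the paper's, under the assumption that the integral operator is contractive), is:

```python
import numpy as np

def solve_fredholm(f, kernel, n=201, iters=100):
    # Nystrom discretization on [0, 1] with trapezoid weights,
    # then fixed-point (Picard) iteration:
    #   u <- f(x) + sum_j w_j * K(x_i, t_j, u_j)
    x = np.linspace(0.0, 1.0, n)
    w = np.full(n, 1.0 / (n - 1))
    w[0] *= 0.5
    w[-1] *= 0.5
    u = f(x)
    for _ in range(iters):
        u = f(x) + (kernel(x[:, None], x[None, :], u[None, :]) * w).sum(axis=1)
    return x, u
```

With the manufactured example K(x, t, u) = x t u² and f(x) = 3x/4, the exact solution is u(x) = x, which the iteration recovers to quadrature accuracy.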
Abstract: Due to new distributed database applications such as huge deductive database systems, search complexity is constantly increasing, and better algorithms are needed to speed up traditional relational database queries. An optimal dynamic programming method for such high-dimensional queries has the major disadvantage of exponential complexity, so we are interested in semi-optimal but faster approaches. In this work we present a multi-agent-based mechanism to meet this demand and compare the results with some commonly used query optimization algorithms.
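The exponential-order dynamic programming baseline mentioned above can be sketched as Selinger-style DP over relation subsets with a toy cost model (cardinalities and selectivities below are invented; the cost is the sum of intermediate result sizes for left-deep plans):

```python
from itertools import combinations

def dp_join_order(card, sel):
    # card: {relation: cardinality}; sel(joined, r): selectivity of
    # joining relation r onto the already-joined tuple `joined`.
    # best[S] = (cost, order, result_size) of the cheapest
    # left-deep plan joining subset S.
    rels = list(card)
    best = {frozenset([r]): (0.0, (r,), card[r]) for r in rels}
    for k in range(2, len(rels) + 1):
        for S in map(frozenset, combinations(rels, k)):
            for r in S:
                sub = S - {r}
                c, order, size = best[sub]
                out = size * card[r] * sel(order, r)
                cand = (c + out, order + (r,), out)
                if S not in best or cand[0] < best[S][0]:
                    best[S] = cand
    return best[frozenset(rels)]
```

The table `best` has 2^n entries, which is the exponential blow-up that motivates the semi-optimal multi-agent approach.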
Abstract: This paper is concerned with the production of an Arabic word semantic similarity benchmark dataset, the first of its kind for Arabic, developed specifically to assess the accuracy of word semantic similarity measures. Semantic similarity is an essential component of numerous applications in fields such as natural language processing, artificial intelligence, linguistics, and psychology. Most of the reported work has been done for English; to the best of our knowledge, there is no word similarity measure developed specifically for Arabic. In this paper, an Arabic benchmark dataset of 70 word pairs is presented. New methods and the best available techniques have been used in this study to produce the Arabic dataset, including selecting and creating materials, collecting human ratings from a representative sample of participants, and calculating the overall ratings. This dataset will make a substantial contribution to future work in the field of Arabic word semantic similarity (WSS) and will hopefully be considered a reference basis for evaluating and comparing different methodologies in the field.
Abstract: Image interpolation is a common problem in imaging applications. However, most existing interpolation algorithms suffer to some extent from visible blurred edges and jagged artifacts. This paper presents an adaptive, feature-preserving bidirectional flow process, in which an inverse diffusion is performed to enhance edges along the directions normal to the isophote lines (edges), while a normal diffusion is performed along the tangent directions to remove artifacts ("jaggies"). In order to preserve image features such as edges, angles and textures, the nonlinear diffusion coefficients are locally adjusted according to the first- and second-order directional derivatives of the image. Experimental results on synthetic and natural images demonstrate that our interpolation algorithm substantially improves the subjective quality of the interpolated images over conventional interpolation methods.
Abstract: In order to study the effect of phosphate-solubilizing microorganisms (PSM) and plant growth promoting rhizobacteria (PGPR) on yield and yield components of corn, Zea mays (L. cv. SC604), an experiment was conducted at the research farm of Sari Agricultural Sciences and Natural Resources University, Iran, during 2007. The experiment was laid out as a split plot based on a randomized complete block design with three replications. Three levels of manure (20 Mg ha-1 farmyard manure, 15 Mg ha-1 green manure, and a check without any manure) formed the main plots, and eight levels of biofertilizer (1: NPK, i.e. conventional fertilizer application; 2: NPK+PSM+PGPR; 3: NP50%K+PSM+PGPR; 4: N50%PK+PSM+PGPR; 5: N50%P50%K+PSM+PGPR; 6: PK+PGPR; 7: NK+PSM; and 8: PSM+PGPR) formed the subplots. Results showed that farmyard manure application increased row number, ear weight, grain number per ear, grain yield, biological yield and harvest index compared to the check. Furthermore, use of PSM and PGPR in addition to conventional fertilizer application (NPK) improved ear weight, row number and grain number per row, and ultimately increased grain yield in the green manure and check plots. According to the results, in all fertilizer treatments the application of PSM and PGPR together could reduce P application by 50% without any significant reduction in grain yield. However, this treatment could not compensate for a 50% reduction in N application.
Abstract: The remarkable increase in the number of applications available for download in Android markets is a good indication that they are being reused. However, little is known about their real reusability potential, and a considerable number of these applications are reported as having poor quality or being malicious. Hence, in this paper, an approach to measuring the reusability potential of classes in Android applications is proposed. The approach is not specific to this particular type of application; rather, it is intended for Object-Oriented (OO) software systems in general, and it also aims to provide means to prevent the classes of low-quality and defect-prone applications from being reused directly through inheritance and instantiation. An empirical investigation is conducted to measure and rank the reusability potential of the classes of randomly selected Android applications. The results obtained are thoroughly analyzed in order to understand the extent of this potential and the factors influencing it.
Abstract: In many applications there is a broad variety of information relevant to a focal “object” of interest, and the fusion of such heterogeneous data types is desirable for classification and categorization. While these various data types can sometimes be treated as orthogonal (such as the hull number, superstructure color, and speed of an oil tanker), there are instances where the inference and the correlation between quantities can provide improved fusion capabilities (such as the height, weight, and gender of a person). A service-oriented architecture has been designed and prototyped to support the fusion of information for such “object-centric” situations. It is modular, scalable, and flexible, and designed to support new data sources, fusion algorithms, and computational resources without affecting existing services. The architecture is designed to simplify the incorporation of legacy systems, support exact and probabilistic entity disambiguation, recognize and utilize multiple types of uncertainties, and minimize network bandwidth requirements.
Abstract: The growing interest in national heritage preservation has led to intensive efforts on digital documentation of cultural heritage knowledge. Encapsulated within this effort is a focus on ontology development to help facilitate the organization and retrieval of that knowledge. Ontologies in the cultural heritage domain relate to archive, museum and library information such as archaeology, artifacts, paintings, etc. The growth in the number and size of ontologies indicates the wide acceptance of semantic enrichment in many emerging applications. Nowadays, many heritage information systems are available for access; among them are community-based e-museums designed to support digital cultural heritage preservation. This work extends a previous effort to develop the Traditional Malay Textile (TMT) Knowledge Model, where the model was designed with the intention of supporting mapping to CIDOC CRM. Due to its internal constraints, the model needs to be transformed in advance. This paper addresses the issue by reviewing previous harmonization works with CIDOC CRM as exemplars in refining the facets of the model, particularly those involving the TMT-Artifact class. The result is an extensible model that could lead to a common view for automated mapping to CIDOC CRM. Hence, it promotes the integration and exchange of textile information, especially batik-related information, between communities in e-museum applications.