Abstract: The performance of a cobalt-doped, sol-gel-derived silica (Co/SiO2) catalyst for Fischer–Tropsch synthesis (FTS) in a slurry-phase reactor was studied using paraffin wax as the initial liquid medium. The reactant gas mixture, hydrogen (H2) and carbon monoxide (CO) in a 2:1 molar ratio, was fed at 50 ml/min. Brunauer–Emmett–Teller (BET) surface area analysis and X-ray diffraction (XRD) were employed to characterize the specific surface area and the crystallinity of the catalyst, respectively. The reduction behavior of the Co/SiO2 catalyst was investigated using temperature-programmed reduction (TPR). Operating temperatures were varied from 493 to 533 K to find the conditions that maximize the production of liquid fuels, gasoline and diesel.
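The gasoline and diesel fractions of an FTS product slate are commonly analyzed with the Anderson–Schulz–Flory (ASF) distribution. A minimal sketch follows; the chain-growth probability `alpha` and the carbon-number cuts are illustrative assumptions, not values reported in the abstract.

```python
# Sketch: Anderson-Schulz-Flory product distribution for FTS.
# alpha (chain-growth probability) is a hypothetical value for illustration.

def asf_mass_fraction(n, alpha):
    """Mass fraction of chains with n carbons: W_n = n*(1-alpha)^2*alpha^(n-1)."""
    return n * (1 - alpha) ** 2 * alpha ** (n - 1)

def cut_fraction(n_min, n_max, alpha):
    """Total mass fraction in the carbon-number range [n_min, n_max]."""
    return sum(asf_mass_fraction(n, alpha) for n in range(n_min, n_max + 1))

alpha = 0.85  # assumed chain-growth probability
gasoline = cut_fraction(5, 11, alpha)   # C5-C11 cut
diesel = cut_fraction(12, 20, alpha)    # C12-C20 cut
print(f"gasoline fraction ~ {gasoline:.3f}, diesel fraction ~ {diesel:.3f}")
```

Raising the operating temperature generally lowers the effective `alpha`, shifting mass from the diesel cut toward the gasoline cut, which is the trade-off a temperature scan like 493-533 K explores.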
Abstract: This paper describes an algorithm to estimate real-time vehicle velocity using image processing techniques and known camera calibration parameters. The algorithm involves several main steps. First, the moving object is extracted using a frame-differencing technique. Second, an object-tracking method is applied, and the speed is estimated from the displacement of the object's centroid. Several assumptions are listed to simplify the mapping between the 2D image plane and the 3D real world. The results obtained from the experiment were compared to the estimated ground truth. The experiment shows that the proposed algorithm achieves a velocity estimation accuracy of about ±1.7 km/h.
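The two core steps above, frame differencing and centroid-displacement speed estimation, can be sketched as below. The pixel-to-metre scale and frame rate are hypothetical calibration values, and the synthetic frames stand in for real video.

```python
import numpy as np

def moving_mask(prev_frame, frame, threshold=25):
    """Binary mask of pixels that changed between consecutive grey frames."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    return diff > threshold

def centroid(mask):
    """Centroid (row, col) of the foreground pixels, or None if empty."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return ys.mean(), xs.mean()

def speed_kmh(c0, c1, metres_per_pixel, fps):
    """Speed from centroid displacement between two consecutive frames."""
    dist_px = ((c1[0] - c0[0]) ** 2 + (c1[1] - c0[1]) ** 2) ** 0.5
    return dist_px * metres_per_pixel * fps * 3.6

# Two synthetic 8x8 frames: a bright 2x2 blob moves 2 pixels to the right.
f0 = np.zeros((8, 8), dtype=np.uint8); f0[3:5, 1:3] = 255
f1 = np.zeros((8, 8), dtype=np.uint8); f1[3:5, 3:5] = 255
c0, c1 = centroid(f0 > 0), centroid(f1 > 0)
print(speed_kmh(c0, c1, metres_per_pixel=0.05, fps=25))  # 2 px/frame -> 9.0 km/h
```

In a real system the metres-per-pixel factor comes from the camera calibration parameters and varies with the object's position in the scene, which is what the simplifying assumptions in the paper address.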
Abstract: Hexavalent chromium is highly toxic to most living organisms and a known human carcinogen by the inhalation route of exposure. Therefore, treatment of Cr(VI)-contaminated wastewater is essential before its discharge to natural water bodies. Cr(VI) reduction to Cr(III) can be beneficial because a more mobile and more toxic chromium species is converted to a less mobile and less toxic form. Zero-valence-state metals, such as scrap iron, can serve as electron donors for reducing Cr(VI) to Cr(III). The influence of pH on the capacity of scrap iron to reduce Cr(VI) was investigated in this study. The maximum reduction capacity of scrap iron was observed at the beginning of the column experiments; the lower the pH, the longer the experiment duration at maximum scrap iron reduction capacity. The experimental results showed that the highest reduction capacity of scrap iron was 12.5 mg Cr(VI)/g scrap iron, at pH 2.0, and that it decreased with increasing pH, down to 1.9 mg Cr(VI)/g scrap iron at pH 7.3.
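A capacity figure such as "mg Cr(VI)/g scrap iron" follows from a simple mass balance over the column run. The sketch below shows the arithmetic; the flow rate, influent concentration, run time and bed mass are hypothetical illustration values, not the study's conditions.

```python
# Sketch: reduction capacity from column breakthrough data (mass balance).
# All numeric inputs below are hypothetical, for illustration only.

def reduction_capacity(c_in_mg_l, flow_l_min, minutes, iron_g, c_out_mg_l=0.0):
    """Cr(VI) removed per gram of iron while the bed operates at capacity."""
    removed_mg = (c_in_mg_l - c_out_mg_l) * flow_l_min * minutes
    return removed_mg / iron_g

# e.g. 25 mg/l influent fully reduced for 500 min at 0.01 l/min over 10 g iron
print(reduction_capacity(25.0, 0.01, 500, 10.0))  # -> 12.5 mg Cr(VI)/g
```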
Abstract: Avalanche velocity (from the start to the track zone) is estimated in the present model for an avalanche triggered artificially by an explosive device. The model was initially developed from the concepts of micro-continuum theories [1], underwater explosions [2] and fracture mechanics [3], with appropriate changes for the present setting. The model has been computed for different values of slab depth R, slope angle θ, snow density ρ, viscosity μ, eddy viscosity η* and couple stress parameter η. The applicability of the present model to avalanche forecasting is highlighted.
Abstract: With the advent of emerging personal computing paradigms such as ubiquitous and mobile computing, Web contents are becoming accessible from a wide range of mobile devices. Since these devices do not have the same rendering capabilities, Web contents need to be adapted for transparent access from a variety of client agents. Such content adaptation is applied to either an individual element or a set of consecutive elements in a Web document, and it results in better rendering and faster delivery to the client device. Nevertheless, Web content adaptation sets new challenges for semantic markup. This paper presents an advanced component platform, called SMC, enabling the development of mobility applications and services according to a channel model based on the principles of Service-Oriented Architecture (SOA). It then describes the potential for integration with the Semantic Web through a novel framework of external semantic annotation that prescribes a scheme for representing semantic markup files and a way of associating Web documents with these external annotations. The role of semantic annotation in this framework is to describe the contents of the individual documents themselves, ensuring the preservation of their semantics during the process of adapting content rendering. Semantic Web content adaptation adds value to Web contents and facilitates their repurposing (enhanced browsing, Web Service location and access, etc.).
Abstract: In this paper, a characterization of spherical pseudo null curves in semi-Euclidean space is first given. Then, to investigate the position vector of a pseudo null curve, a system of differential equations whose solution gives the components of the position vector on the Frenet axis is established by means of the Frenet equations. Additionally, in view of some special solutions of the mentioned system, characterizations of some special pseudo null curves are presented.
Abstract: This paper presents the modeling and optimization of two NP-hard problems in flexible manufacturing systems (FMS): the part type selection problem and the loading problem. Due to the complexity and extent of the problems, the paper was split into two parts. The first part discussed the modeling of the problems and showed how real-coded genetic algorithms (RCGA) can be applied to solve them. This second part discusses the effectiveness of the RCGA, which uses an array of real numbers as its chromosome representation. The proposed novel chromosome representation produces only feasible solutions, which minimizes the computational time the GA would otherwise need to push its population toward the feasible search space or to repair infeasible chromosomes. The proposed RCGA improves FMS performance by considering two objectives: maximizing system throughput and maintaining the balance of the system (minimizing system unbalance). The resulting objective values are compared to the optimum values produced by the branch-and-bound method. The experiments show that the proposed RCGA can reach near-optimum solutions in a reasonable amount of time.
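One standard way a real-valued chromosome can guarantee feasibility is the "random key" decoding scheme: sort the real genes and admit parts in that order until a capacity limit is reached. The sketch below illustrates this general idea only; the part data, capacity and decoding rule are hypothetical, not the paper's exact representation.

```python
import random

def decode(chromosome, batch_sizes, capacity):
    """Decode real-valued keys into a feasible subset of part types:
    visit parts in ascending key order, keep each part that still fits."""
    order = sorted(range(len(chromosome)), key=lambda i: chromosome[i])
    selected, used = [], 0
    for i in order:
        if used + batch_sizes[i] <= capacity:
            selected.append(i)
            used += batch_sizes[i]
    return selected

random.seed(0)
batch_sizes = [30, 20, 50, 40]                    # hypothetical processing loads
chromosome = [random.random() for _ in batch_sizes]
print(decode(chromosome, batch_sizes, capacity=80))
```

Because every decoded individual respects the capacity constraint by construction, crossover and mutation can operate freely on the real keys without ever producing a chromosome that needs repair.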
Abstract: The ability of UML to handle the modeling process of complex industrial software applications has increased its popularity to the extent of becoming the de-facto language serving the design purpose. Although its rich graphical notation, naturally oriented towards the object-oriented concept, facilitates understandability, it hardly succeeds in reporting all domain-specific aspects in a satisfactory way. OCL, as the standard language for expressing additional constraints on UML models, has great potential to help improve expressiveness. Unfortunately, it suffers from a weak formalism due to its poor semantics, resulting in many obstacles to the building of tool support and thus to its application in industry. For this reason, much research has been undertaken to formalize OCL expressions using a more rigorous approach. Our contribution joins this work in a complementary way, since it focuses specifically on OCL predefined properties, which constitute an important part of the construction of OCL expressions. Using formal methods, we succeed in rigorously expressing OCL predefined functions.
Abstract: This paper is motivated by the aspect of uncertainty in financial decision making and by how artificial intelligence and soft computing, with their uncertainty-reducing aspects, can be used for high-frequency algorithmic trading applications. This paper presents an optimized high-frequency trading system that has been combined with various moving averages to produce a hybrid system that outperforms trading systems relying solely on moving averages. The paper optimizes an adaptive neuro-fuzzy inference system that takes both the price and its moving average as input, learns to predict price movements from training data consisting of intraday data, dynamically switches between the best-performing moving averages, and decides when to buy or sell a given currency at high frequency.
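The moving-average layer of such a hybrid system can be sketched as a plain crossover rule; the ANFIS predictor itself is not reproduced here, and the window lengths and price series are hypothetical.

```python
def sma(prices, window):
    """Simple moving average for each index where a full window exists."""
    return [sum(prices[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(prices))]

def crossover_signal(prices, fast=3, slow=5):
    """'buy' when the fast SMA crosses above the slow SMA, 'sell' on the reverse."""
    f, s = sma(prices, fast), sma(prices, slow)
    f = f[len(f) - len(s):]  # align both series to the last len(s) points
    signals = []
    for i in range(1, len(s)):
        if f[i - 1] <= s[i - 1] and f[i] > s[i]:
            signals.append((i, "buy"))
        elif f[i - 1] >= s[i - 1] and f[i] < s[i]:
            signals.append((i, "sell"))
    return signals

prices = [1.30, 1.31, 1.29, 1.28, 1.27, 1.29, 1.32, 1.34, 1.33, 1.30]
print(crossover_signal(prices))  # -> [(2, 'buy')]
```

In the hybrid design described above, the fuzzy inference system would sit on top of such signals, weighing the price and its moving average to decide whether a crossover is acted upon.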
Abstract: The decoding of Low-Density Parity-Check (LDPC) codes operates over a redundant structure known as the bipartite graph, meaning that the full set of bit nodes is not absolutely necessary for decoder convergence. In 2008, Soyjaudah and Catherine designed a recovery algorithm for LDPC codes based on this assumption and showed that the error-correcting performance of their codes outperformed that of conventional LDPC codes. In this work, the use of the recovery algorithm is further explored to test the performance of LDPC codes as the number of iterations is progressively increased. For experiments conducted with small block lengths of up to 800 bits and up to 2000 iterations, the results interestingly demonstrate that, contrary to conventional wisdom, the error-correcting performance keeps improving with an increasing number of iterations.
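To make the role of iterations concrete, here is a minimal Gallager-style bit-flipping decoder, a much simpler relative of the decoding discussed above; the 3x6 parity-check matrix is a toy example, not a code from the paper.

```python
import numpy as np

# Toy parity-check matrix: 3 checks over 6 bits (illustration only).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def bit_flip_decode(received, H, max_iters=50):
    """Iteratively flip the bit involved in the most unsatisfied checks."""
    word = received.copy()
    for _ in range(max_iters):
        syndrome = H.dot(word) % 2
        if not syndrome.any():
            return word              # all parity checks satisfied
        # count, per bit, how many failing checks it participates in
        votes = H[syndrome == 1].sum(axis=0)
        word[np.argmax(votes)] ^= 1  # flip the worst offender
    return word

codeword = np.zeros(6, dtype=int)    # the all-zero word is always a codeword
noisy = codeword.copy(); noisy[2] ^= 1
print(bit_flip_decode(noisy, H))     # recovers the all-zero codeword
```

Each pass over the loop is one decoder iteration; the experiments above study what happens to error-correcting performance as this iteration budget grows very large.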
Abstract: Interpretation of aerial images is an important task in various applications. Image segmentation can be viewed as the essential step for extracting information from aerial images. Among the many segmentation methods developed, clustering has been extensively investigated and used. However, determining the number of clusters in an image is inherently a difficult problem, especially when a priori information on the aerial image is unavailable. This study proposes a support vector machine approach for clustering aerial images. Three cluster validity indices, a distance-based index, the Davies-Bouldin index, and the Xie-Beni index, are utilized as quantitative measures of the quality of clustering results. Comparisons of the effectiveness of these indices and of various parameter settings on the proposed method are conducted. Experimental results illustrate the feasibility of the proposed approach.
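One of the three validity indices named above, the Davies-Bouldin index (lower is better), can be sketched directly from its definition; the 2-D toy points are illustrative, not aerial-image features.

```python
import numpy as np

def davies_bouldin(points, labels):
    """DB index: mean over clusters of the worst (S_i + S_j) / M_ij ratio,
    where S_i is within-cluster scatter and M_ij the centroid distance."""
    ids = sorted(set(labels))
    cents = [points[labels == k].mean(axis=0) for k in ids]
    # S_i: average distance of members of cluster i to its centroid
    scat = [np.linalg.norm(points[labels == k] - c, axis=1).mean()
            for k, c in zip(ids, cents)]
    total = 0.0
    for i in range(len(ids)):
        ratios = [(scat[i] + scat[j]) / np.linalg.norm(cents[i] - cents[j])
                  for j in range(len(ids)) if j != i]
        total += max(ratios)
    return total / len(ids)

pts = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
labels = np.array([0, 0, 1, 1])
print(davies_bouldin(pts, labels))  # well-separated clusters -> small value
```

Evaluating such an index across candidate cluster counts is how the number of clusters can be selected when no a priori information about the image is available.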
Abstract: In order to make surfing the Internet faster and to avoid redundant processing with each request for the same web page, many caching techniques have been developed to reduce the latency of retrieving data on the World Wide Web. In this paper we give a quick overview of existing web caching techniques for dynamic web pages, then introduce a design and implementation model that takes advantage of the "URL rewriting" feature of some popular web servers, e.g. Apache, to provide an effective approach to caching dynamic web pages.
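A minimal sketch of the URL-rewriting idea in Apache mod_rewrite is shown below; the `/cache` location and file-naming scheme are assumptions for illustration, not the paper's implementation.

```apache
RewriteEngine On
# If a pre-generated static copy of the requested dynamic page exists...
RewriteCond %{DOCUMENT_ROOT}/cache/$1.html -f
# ...serve it instead of invoking the dynamic script.
RewriteRule ^/?(.*)$ /cache/$1.html [L]
```

Requests fall through to the normal dynamic handler whenever no cached copy is present, so the rewrite layer is transparent to clients.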
Abstract: This study deals with the evaluation of the influence of salinity (NaCl) on the equilibrium of Cu and Ni removal from aqueous solutions by a natural sorbent, zeolite. Equilibrium data were obtained by batch experiments. The salinity of the aqueous solution was adjusted by dissolving NaCl in distilled water and was studied in the range of NaCl concentrations from 1 g/l to 100 g/l. For Cu sorption there is a significant influence of salinity: the maximum capacity of zeolite for Cu decreased with growing concentration of NaCl. For Ni sorption the influence of salinity is not as significant as for Cu: the maximum capacity of zeolite for Ni decreased only slightly with growing concentration of NaCl.
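Batch equilibrium data of this kind are commonly described with the Langmuir isotherm, q = q_max·b·C/(1 + b·C). The sketch below shows the model only; the parameter values are hypothetical, not the study's fitted constants.

```python
def langmuir(c_eq, q_max, b):
    """Sorbed amount (mg/g) at equilibrium concentration c_eq (mg/l),
    Langmuir isotherm: q = q_max * b * c / (1 + b * c)."""
    return q_max * b * c_eq / (1 + b * c_eq)

q_max, b = 8.0, 0.05   # assumed capacity (mg/g) and affinity (l/mg)
for c in (10, 50, 200):
    print(c, round(langmuir(c, q_max, b), 2))
```

A salinity effect like the one reported would show up as a smaller fitted `q_max` (and possibly `b`) at higher NaCl concentrations, since Na+ ions compete for the zeolite's exchange sites.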
Abstract: There are three approaches to complete Bayesian network (BN) model construction: total expert-centred, total data-centred, and semi data-centred. These three approaches constitute the basis of the empirical investigation undertaken and reported in this paper. The objective is to determine which of these three approaches is optimal for constructing a BN-based model for the performance assessment of students' laboratory work in a virtual electronic laboratory environment. BN models were constructed using all three approaches, with respect to the focus domain, and compared using a set of optimality criteria. In addition, the impact of the size and source of the training data on the performance of the total data-centred and semi data-centred models was investigated. The results of the investigation provide additional insight for BN model constructors and contribute to the literature by providing supportive evidence for the conceptual feasibility and efficiency of structure and parameter learning from data. In addition, the results highlight other interesting themes.
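The parameter-learning step in the data-centred approaches can be sketched as maximum-likelihood estimation of a conditional probability table, i.e. relative frequencies over the training records; the records and variable names below are hypothetical.

```python
from collections import Counter

def learn_cpt(records, parent, child):
    """P(child | parent) estimated as relative frequencies over the data."""
    pair_counts = Counter((r[parent], r[child]) for r in records)
    parent_counts = Counter(r[parent] for r in records)
    return {(p, c): n / parent_counts[p] for (p, c), n in pair_counts.items()}

data = [  # hypothetical lab-work assessment records
    {"prepared": "yes", "result": "pass"},
    {"prepared": "yes", "result": "pass"},
    {"prepared": "yes", "result": "fail"},
    {"prepared": "no",  "result": "fail"},
]
cpt = learn_cpt(data, "prepared", "result")
print(cpt)
```

The size-of-training-data effect studied above is visible even in this toy: with only four records, the estimate P(fail | no) = 1.0 rests on a single observation and would shift considerably as more data arrive.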
Abstract: This paper demonstrates a bus location system for route buses through an experiment in a real environment. A bus location system provides information such as bus delays and positions. The system uses the actual service and position data of buses, and this information must match the data in the database. The system has two potential problems. First, it could be expensive to prepare the devices that obtain bus positions. Second, it could be difficult to match the service data of the buses. To avoid these problems, we developed the system at low cost and in a short time by using smartphones with GPS together with the bus route system. The system realizes path planning that considers bus delays and displays the positions of buses on a map. The bus location system was demonstrated on route buses with smartphones for two months.
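One building block of such a system is matching a smartphone GPS fix to the nearest stop on the route, from which a delay against the timetable can be estimated. The sketch below uses the standard haversine great-circle distance; the stop names and coordinates are hypothetical.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    R = 6371000.0  # mean Earth radius
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def nearest_stop(bus_pos, stops):
    """Name of the stop closest to the reported bus position."""
    return min(stops, key=lambda s: haversine_m(*bus_pos, stops[s][0], stops[s][1]))

stops = {"station": (35.6812, 139.7671), "city hall": (35.6895, 139.6917)}
print(nearest_stop((35.6820, 139.7650), stops))  # -> 'station'
```

Comparing the time at which the bus reaches its nearest stop with the scheduled time yields the delay that the path planner then takes into account.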
Abstract: The purpose of this study is to examine the self-esteem and decision-making levels of students receiving education in schools of physical education and sports. The population of the study consisted of 258 students, 152 male and 106 female (mean age = 19.3713 ± 1.6968), who received education in the schools of physical education and sports of Selcuk University, Inonu University, Gazi University and Karamanoglu Mehmetbey University. To achieve the purpose of the study, the Melbourne Decision Making Questionnaire developed by Mann et al. (1998) [1] and adapted to Turkish by Deniz (2004) [2], and the Self-Esteem Scale developed by Aricak (1999) [3], were utilized. For analyzing and interpreting the data, the Kolmogorov-Smirnov test, the t-test and one-way ANOVA were used, while for determining the differences between groups the Tukey test and multiple linear regression were employed; significance was accepted at P
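The male-female comparisons above rely on the independent-samples t-test. A sketch of the pooled-variance t statistic follows; the two small score samples are hypothetical, not the study's data.

```python
from math import sqrt

def t_statistic(a, b):
    """Pooled-variance two-sample t statistic for independent samples."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / sqrt(sp2 * (1 / na + 1 / nb))

males = [3.1, 3.4, 2.9, 3.6, 3.2]       # hypothetical scale scores
females = [3.0, 2.8, 3.1, 2.7, 2.9]
print(round(t_statistic(males, females), 3))
```

The statistic is then compared against the t distribution with n_a + n_b − 2 degrees of freedom to decide significance at the chosen P threshold.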
Abstract: The modeling paradigm places models at the center of the development process. These models are represented by languages, like UML, the language standardized by the OMG, which has become indispensable for development. Likewise, the ontology engineering paradigm places ontologies at the center of the development process; in this paradigm we find OWL, the principal language for knowledge representation. Building ontologies from scratch is generally a difficult task. The bridging between UML and OWL has been explored in several respects, such as classes and associations. In this paper, we profit from the convergence between UML and OWL to propose an approach, based on meta-modelling and graph grammars and registered in the MDA architecture, for the automatic generation of OWL ontologies from UML class diagrams. The transformation is based on transformation rules; the level of abstraction in these rules is kept close to the application in order to obtain usable ontologies. We illustrate this approach with an example.
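Two of the correspondences mentioned above, UML class to owl:Class and UML association to owl:ObjectProperty, can be sketched outside any MDA tooling as a direct emission of Turtle triples; the input model is a hypothetical dictionary, not a real XMI parse, and the namespace is made up.

```python
def uml_to_owl(classes, associations, prefix="ex"):
    """Emit Turtle: each UML class becomes an owl:Class, each binary
    association an owl:ObjectProperty with domain and range."""
    lines = [f"@prefix {prefix}: <http://example.org/onto#> .",
             "@prefix owl: <http://www.w3.org/2002/07/owl#> .",
             "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> ."]
    for c in classes:
        lines.append(f"{prefix}:{c} a owl:Class .")
    for name, src, dst in associations:
        lines.append(f"{prefix}:{name} a owl:ObjectProperty ; "
                     f"rdfs:domain {prefix}:{src} ; rdfs:range {prefix}:{dst} .")
    return "\n".join(lines)

print(uml_to_owl(["Student", "Course"],
                 [("enrolledIn", "Student", "Course")]))
```

In the approach described above, such rules are expressed as graph-grammar productions over the meta-models rather than as string templates, but the class-to-class and association-to-property mapping is the same.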
Abstract: Smart Dust particles are small smart materials used for generating weather maps. We investigate the question of the optimal number of Smart Dust particles necessary for generating precise, computationally feasible and cost-effective 3-D weather maps. We also give an optimal matching algorithm for the generalized scenario in which there are N Smart Dust particles and M ground receivers.
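A simple baseline for the particle-to-receiver assignment is for each of the N particles to report to its nearest of the M ground receivers; the paper's optimal matching algorithm itself is not reproduced here, and the positions are hypothetical.

```python
def assign_to_receivers(particles, receivers):
    """Map each particle index to the index of its nearest receiver."""
    def d2(p, r):  # squared Euclidean distance, enough for comparison
        return (p[0] - r[0]) ** 2 + (p[1] - r[1]) ** 2
    return {i: min(range(len(receivers)), key=lambda j: d2(p, receivers[j]))
            for i, p in enumerate(particles)}

particles = [(0.0, 0.1), (5.2, 4.9), (0.3, 0.2)]   # hypothetical positions
receivers = [(0.0, 0.0), (5.0, 5.0)]
print(assign_to_receivers(particles, receivers))   # -> {0: 0, 1: 1, 2: 0}
```

An optimal matching additionally balances load across receivers rather than letting every particle pick greedily, which is where the generalized N-to-M formulation becomes non-trivial.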
Abstract: In this paper, an analytical model is presented to describe the channel noise in GME SGT/CGT MOSFETs, based on explicit functions of the MOSFET geometry and biasing conditions, valid for all channel lengths down to the deep submicron regime, and verified against experimental data. The results show the impact of various parameters such as gate bias, drain bias, channel length, device diameter and gate material work function difference on the drain current noise spectral density of the device, reflecting its applicability for circuit design applications.
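For reference, the long-channel baseline against which such channel-noise models are usually compared is the classical thermal-noise expression (this is the textbook result, not the paper's GME SGT/CGT model):

```latex
% Classical long-channel drain-current thermal noise (textbook baseline)
S_{I_d} = 4\,k_B T\,\gamma\,g_{d0},
\qquad \gamma \approx \tfrac{2}{3} \text{ in saturation (long channel)},
```

where $g_{d0}$ is the drain conductance at $V_{DS}=0$; short-channel and multi-gate-material effects of the kind studied above modify the effective $\gamma$.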