Abstract: The objective of this paper is to present a research study of the convectors used for heating or cooling living rooms and industrial halls. The key points are experimental measurement and comprehensive numerical simulation of the flow through the parts of the convector, such as the heat exchanger and the fan inlet. From the obtained results, the components of the convector are optimized to increase thermal power efficiency by improving heat convection or reducing air drag. Both optimizations lead to more effective service conditions and to energy savings. A significant part of the convector research is the design of a unique measurement laboratory and the adoption of suitable measurement techniques. The new laboratory makes it possible to measure thermal power efficiency and other relevant parameters under the specific service conditions of the convectors.
Abstract: Computing and maintaining network structures for efficient
data aggregation incurs high overhead for dynamic events
where the set of nodes sensing an event changes with time. Moreover,
structured approaches are sensitive to the waiting time that is used
by nodes to wait for packets from their children before forwarding
the packet to the sink. In this paper we propose an efficient routing and data aggregation scheme for wireless sensor networks: Tree on DAG (ToD), a semistructured approach that uses Dynamic Forwarding on an implicitly constructed structure composed of multiple shortest-path trees to support network scalability. The key principle behind ToD is that adjacent nodes in the graph have low stretch in at least one of these trees, thus resulting in early
aggregation of packets. Based on simulations on a 2,000-node Mica2-
based network, we conclude that efficient aggregation in large-scale
networks can be achieved by our semistructured approach.
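As a rough illustration of the early-aggregation principle (our own sketch, not the paper's ToD implementation), the following fragment builds two shortest-path trees over a small graph and shows that a pair of adjacent nodes that is far apart in one tree stays close in the other, so forwarding on the low-stretch tree lets their packets aggregate early:

```python
from collections import deque

def bfs_tree(adj, root):
    """Parent pointers of a shortest-path tree rooted at `root`."""
    parent = {root: None}
    queue = deque([root])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in parent:
                parent[v] = u
                queue.append(v)
    return parent

def tree_distance(parent, a, b):
    """Hop count between a and b when packets may travel only along tree edges."""
    def path_to_root(x):
        path = []
        while x is not None:
            path.append(x)
            x = parent[x]
        return path
    pa, pb = path_to_root(a), path_to_root(b)
    pb_set = set(pb)
    lca = next(x for x in pa if x in pb_set)  # lowest common ancestor
    return pa.index(lca) + pb.index(lca)

# A 6-node ring; two trees rooted at different nodes.
adj = {0: [1, 5], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4, 0]}
trees = [bfs_tree(adj, 0), bfs_tree(adj, 3)]

# Adjacent nodes 3 and 4 are 5 tree-hops apart in the tree rooted at 0,
# but only 1 hop apart in the tree rooted at 3.
per_tree = [tree_distance(t, 3, 4) for t in trees]
stretch = min(per_tree)
```

The graph, roots, and node pair are arbitrary choices made for the example.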
Abstract: This study first briefly presents the vast gap that exists between the current Chinese and Japanese seismic design specifications for bridge pile foundations in liquefiable ground and in ground subject to liquefaction-induced lateral spreading. The Chinese and Japanese seismic design methods and technical details for bridge pile foundations in liquefying and laterally spreading ground are then described and compared systematically and comprehensively; in particular, the methods of determining the coefficient of subgrade reaction and its reduction factor, as well as the computation of the force applied to the pile foundation by liquefaction-induced lateral spreading soil in the Japanese specification, are introduced. The comparison indicates that the corresponding content of the Chinese seismic design specification, which presents only some qualitative items, is too general and lacks systematic structure and operability. Finally, the defects of the Chinese specification are summarized, showing that its improvement and revision are imperative; the key problems of the current Chinese specification are generalized and corresponding improvement suggestions are proposed.
Abstract: In distributed resource allocation, a set of agents must assign their resources to a set of tasks. This problem arises in many real-world domains such as distributed sensor networks, disaster rescue, hospital scheduling and others. Despite the variety of approaches proposed for distributed resource allocation, a systematic formalization of the problem, explaining the different sources of difficulty, and a formal explanation of the strengths and limitations of key approaches are missing. We take a step towards this goal by using a formalization of distributed resource allocation that represents both the dynamic and the distributed aspects of the problem. In this paper we present a new approach to target tracking in sensor networks and compare it with previous approaches. The central contribution of the paper is a generalized mapping from distributed resource allocation to DDCSP. This mapping is proven to correctly solve resource allocation problems of specified difficulty. This theoretical result is verified in practice by a simulation on a real-world distributed sensor network.
Abstract: High-speed permanent magnet generators driven by micro-turbines are widely used in smart grid systems. This paper therefore presents a comparative study of six analytical design cases, classical, optimized and genetic, for High Speed Permanent Magnet Synchronous Generators (HSPMSGs) delivering 400 kW output power at a tip speed of 200 m/s. The six design trials are: classical sizing; unconstrained minimization of total losses; constrained optimization of total mass with bounded constraints; a genetic algorithm formulated to maximize efficiency and minimize machine size; a second genetic formulation seeking minimum mass, with the machine sizing constrained by a non-linear constraint function of machine losses; and, finally, a genetic sizing for optimum torque per ampere. All results are simulated with MATLAB and its Optimization Toolbox, including the Genetic Algorithm solver. Comparisons of the six analytical design examples are presented, with a study of the machine waveforms, THD and rotor losses.
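As a schematic illustration of the genetic sizing formulations (the loss model below is a hypothetical one-variable stand-in, not the HSPMSG model used in the paper), a minimal genetic algorithm might look like:

```python
import random

random.seed(0)

def toy_losses(x):
    """Hypothetical stand-in loss model: total losses versus one sizing variable x."""
    return (x - 2.0) ** 2 + 1.0  # minimum of 1.0 at x = 2.0

def genetic_minimize(f, lo, hi, pop_size=30, generations=60):
    """Truncation selection + arithmetic crossover + Gaussian mutation."""
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)                      # rank by fitness (lower loss is better)
        survivors = pop[: pop_size // 2]     # keep the better half
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            child = 0.5 * (a + b)            # arithmetic crossover
            child += random.gauss(0, 0.05)   # Gaussian mutation
            children.append(min(hi, max(lo, child)))
        pop = survivors + children
    return min(pop, key=f)

best = genetic_minimize(toy_losses, 0.0, 5.0)
```

The population size, mutation scale, and operator choices are illustrative defaults, not the settings of MATLAB's GA solver.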
Abstract: Corporate social responsibility (CSR) can be defined as the management of social, environmental, economic and ethical concepts and of firms' sensitivity to the expectations of social stakeholders. CSR is seen as an important competitive advantage in the textile sector, because this sector has a considerable impact on the environment and is labor-intensive. The textile sector in Turkey has a strong advantage over that of other countries due to its low labor costs and abundance of raw materials. Until the 1950s, Turkey was a producer and exporter of cotton and an importer of fiber and clothing. After the 1950s, Turkey began to export fiber and ready-made clothes, and it has recently become one of the most important textile producers in the world. This study presents the CSR practices of the textile firms quoted on the Istanbul Stock Exchange and these firms' sensitivity to their internal and external stakeholders and to the environment.
Abstract: The Support Vector Machine (SVM) is a recent class of statistical classification and regression techniques that plays an increasing role in detection problems across engineering, notably in statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, SVM is applied to an infrared (IR) binary communication system with different types of channel models, including Ricean multipath fading and a partially developed scattering channel, with additive white Gaussian noise (AWGN) at the receiver. The structure and performance of the SVM in terms of the bit error rate (BER) metric are derived and simulated for these stochastic channel models, and the computational complexity of the implementation, in terms of average computational time per bit, is also presented. The performance of the SVM is then compared to classical maximum likelihood detection of binary signals using a matched filter with On-Off Keying (OOK) modulation. We find that the performance of the SVM is superior to that of the traditional optimal detection schemes used in statistical communication, especially for very low signal-to-noise ratio (SNR) ranges. For large SNR, the performance of the SVM is similar to that of the classical detectors. The implication of these results is that SVM can prove very beneficial to IR communication systems, which notoriously suffer from low SNR, at the cost of increased computational complexity.
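The comparison can be sketched numerically. The fragment below is a simplified illustration (a pure AWGN channel rather than the paper's Ricean/scattering models, and arbitrary frame parameters): it trains an SVM detector on labelled noisy OOK frames and measures its BER alongside a classical matched-filter threshold detector:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# OOK over an AWGN channel: bit 1 -> pulse of amplitude 1 over 8 samples,
# bit 0 -> no pulse; the receiver observes the 8 noisy samples per bit.
def ook_frames(n_bits, snr_db):
    bits = rng.integers(0, 2, n_bits)
    pulse = np.ones(8)
    noise_std = 10 ** (-snr_db / 20)
    frames = bits[:, None] * pulse + rng.normal(0, noise_std, (n_bits, 8))
    return frames, bits

X_train, y_train = ook_frames(2000, snr_db=0)
X_test, y_test = ook_frames(2000, snr_db=0)

# SVM detector trained on labelled noisy frames.
svm = SVC(kernel="rbf").fit(X_train, y_train)
svm_ber = np.mean(svm.predict(X_test) != y_test)

# Classical matched-filter detector: correlate with the pulse, threshold at midpoint.
mf_stat = X_test @ np.ones(8)
mf_ber = np.mean((mf_stat > 4.0).astype(int) != y_test)
```

Under fading channels, where the matched filter's threshold assumption degrades, the learned decision boundary is where the paper reports the SVM's advantage.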
Abstract: In the context of sensor networks, where every few dB of saving counts, we review novel node cooperation schemes in which MIMO techniques play a leading role. These methods can be treated as a joint approach to designing the physical layer of such communication scenarios. We then analyze the BER performance of transmission diversity schemes under a general fading channel model and propose a power allocation strategy for the transmitting sensor nodes. This approach is compared to an equal-power assignment method, and its performance enhancement is verified by simulation. Another key point of the contribution lies in the combination of optimal power allocation and sensor-node cooperation in a transmission diversity regime (MISO). Numerical results are given through figures to demonstrate the optimality and efficiency of the proposed combined approach.
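For the coherent MISO setting, the gain over equal-power assignment follows from the Cauchy-Schwarz inequality: the receive SNR (Σ_i √p_i h_i)² / N₀ is maximized when p_i ∝ h_i². A minimal numpy sketch (our own illustration with arbitrary channel gains, not the paper's fading model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Channel gain magnitudes from 5 cooperating sensor nodes to the fusion centre.
h = np.abs(rng.normal(size=5) + 1j * rng.normal(size=5))
P_total = 1.0

# Optimal allocation for coherent combining: power proportional to |h_i|^2,
# versus the naive equal-power assignment.
p_opt = P_total * h**2 / np.sum(h**2)
p_eq = np.full(5, P_total / 5)

def receive_snr(p, h, noise=1.0):
    # Coherent combining: transmitted amplitudes add before squaring.
    return np.sum(np.sqrt(p) * h) ** 2 / noise

snr_opt = receive_snr(p_opt, h)
snr_eq = receive_snr(p_eq, h)
```

With p_i ∝ h_i², the receive SNR collapses to P_total · Σ h_i² / N₀, which upper-bounds any other split of the same total power.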
Abstract: Nowadays, people are becoming more and more mobile, both in terms of devices and of associated applications. Moreover, the services these devices offer are getting broader and much more complex. Even though current handheld devices have considerable computing power, their contexts of use differ: they are affected by the availability of connectivity, the high latency of wireless networks, battery life, screen size, on-screen or hardware keyboards, etc. Consequently, the development of mobile applications and their associated mobile Web services, if any, should follow a concise methodology so that they provide a high quality of service. The aim of this paper is to highlight and discuss the main issues to consider when developing mobile applications and mobile Web services, and then to propose a framework that leads developers through different steps and modules toward the development of efficient and secure mobile applications. First, the different challenges in developing such applications are elicited and discussed in depth. Second, a development framework is presented, with modules addressing each of these challenges. Third, the paper presents an example of a mobile application, Eivom Cinema Guide, which benefits from following our development framework.
Abstract: This paper introduces and studies new indexing techniques for content-based queries in image databases. Indexing is the key to providing sophisticated, accurate and fast searches for queries over image data. This research describes a new indexing approach that depends on linear modeling of signals, using bases for modeling. A basis is a set of chosen images, and modeling an image is a least-squares approximation of the image as a linear combination of the basis images. The coefficients of the basis images are taken together to serve as the index for that image. The paper describes the implementation of the indexing scheme and presents the findings of an extensive evaluation conducted to optimize (1) the choice of the basis matrix (B) and (2) the size of the index A (N). Furthermore, we compare the performance of our indexing scheme with other schemes. Our results show that our scheme has significantly higher performance.
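The core indexing step can be sketched in a few lines of numpy. Random vectors stand in for flattened images, and B and N follow the abstract's notation; the sizes are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Basis matrix B: N chosen "basis images", each flattened into a column.
pixels, N = 64, 4
B = rng.normal(size=(pixels, N))

def index_of(image, B):
    """Least-squares coefficients of `image` in the span of the basis images;
    the coefficient vector serves as the image's index."""
    coeffs, *_ = np.linalg.lstsq(B, image, rcond=None)
    return coeffs

# The same image always indexes identically; a different image does not.
img = rng.normal(size=pixels)
other = rng.normal(size=pixels)
a1, a2, a3 = index_of(img, B), index_of(img, B), index_of(other, B)
```

Similarity search then compares these short N-dimensional coefficient vectors instead of full images.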
Abstract: This study analyzed environmental health risks and people's perceptions of risks related to waste management in poor settlements of Abidjan, in order to develop integrated solutions for improving health and well-being. The trans-disciplinary approach relied
on remote sensing, a geographic information system (GIS),
qualitative and quantitative methods such as interviews and a
household survey (n=1800). Mitigating strategies were then
developed using an integrated participatory stakeholder workshop.
Waste management deficiencies resulting in lack of drainage and
uncontrolled solid and liquid waste disposal in the poor settlements
lead to severe environmental health risks. Health problems were
caused by direct handling of waste, as well as through broader
exposure of the population. People in poor settlements had little
awareness of health risks related to waste management in their
community and a general lack of knowledge pertaining to sanitation
systems. This unfortunate combination was the key determinant of health outcomes and vulnerability. For example, an increased
prevalence of malaria (47.1%) and diarrhoea (19.2%) was observed
in the rainy season when compared to the dry season (32.3% and
14.3%). Concerted and adapted solutions that suited all the
stakeholders concerned were developed in a participatory workshop
to allow for improvement of health and well-being.
Abstract: Condition monitoring of electrical power equipment
has attracted considerable attention for many years. The aim of this
paper is to use LabVIEW with a fuzzy logic controller to build a simulation system that diagnoses transformer faults and monitors their condition. The front panel of the system was designed in LabVIEW to enable the computer to act as a custom-designed instrument. Dissolved gas-in-oil analysis (DGA) was used as the diagnostic technique for oil-type transformers, while analysis of terminal voltages and currents was used for dry-type transformers. Fuzzy logic was used as an expert system that assesses all the information keyed in at the front panel to diagnose and predict the condition of the transformer. The outcome of the fuzzy logic interpretation is displayed on the LabVIEW front panel to show the user the condition of the transformer at any time.
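As a toy illustration of the fuzzy-logic assessment step, the sketch below fuzzifies a single DGA gas ratio with triangular memberships and picks the strongest rule. The ratio chosen, the membership shapes, and the thresholds are all invented for illustration and do not reproduce the paper's rule base:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def diagnose(c2h2_c2h4_ratio):
    """Toy fuzzy rules over one gas ratio (illustrative thresholds only)."""
    memberships = {
        "normal":        tri(c2h2_c2h4_ratio, -0.5, 0.0, 0.5),
        "thermal fault": tri(c2h2_c2h4_ratio, 0.2, 1.0, 2.0),
        "arcing":        tri(c2h2_c2h4_ratio, 1.5, 3.0, 6.0),
    }
    return max(memberships, key=memberships.get)

print(diagnose(0.1))  # low ratio -> "normal"
```

A real DGA expert system would fuzzify several gas ratios and combine many such rules before defuzzifying to a condition estimate.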
Abstract: Complexity, as a theoretical background has made it
easier to understand and explain the features and dynamic behavior
of various complex systems. As the common theoretical background
has confirmed, borrowing the terminology for design from the
natural sciences has helped to control and understand urban
complexity. Phenomena like self-organization, evolution and
adaptation are appropriate to describe the formerly inaccessible
characteristics of the complex environment in unpredictable bottom-up
systems. Increased computing capacity has been a key element in
capturing the chaotic nature of these systems.
A paradigm shift in urban planning and architectural design has
forced us to give up the illusion of total control in urban
environment, and consequently to seek for novel methods for
steering the development. New methods using dynamic modeling
have offered a real option for more thorough understanding of
complexity and urban processes. At best new approaches may renew
the design processes so that we get a better grip on the complex
world via more flexible processes, support urban environmental
diversity and respond to our needs beyond basic welfare by liberating
ourselves from the standardized minimalism.
A complex system and its features are as such beyond human
ethics. Self-organization or evolution is neither good nor bad. Their
mechanisms are by nature devoid of reason. They are common in
urban dynamics and in natural processes alike. They are features
of a complex system, and they cannot be prevented. Yet their
dynamics can be studied and supported.
The paradigm of complexity and new design approaches has been
criticized for a lack of humanity and morality, but the ethical
implications of scientific or computational design processes have not
been much discussed. It is important to distinguish the (unexciting)
ethics of the theory and tools from the ethics of computer aided
processes based on ethical decisions. Urban planning and architecture
cannot be based on the survival of the fittest; however, the natural
dynamics of the system cannot be impeded on grounds of being
“non-human”.
In this paper the ethical challenges of using the dynamic models
are contemplated in light of a few examples of new architecture and
dynamic urban models and literature. It is suggested that ethical
challenges in computational design processes could be reframed
under the concepts of responsibility and transparency.
Abstract: Use of the oncologic index ISTER allows more effective planning of the radiotherapy facilities in hospitals. Any change in a radiotherapy treatment due to unexpected stops can be accommodated by recalculating the doses for the new treatment duration while keeping the optimal prognosis. The results obtained with a simulation model on millions of patients allow the definition of optimal success-probability algorithms.
Abstract: Information volumes are increasing; companies are so overloaded with information that they may lose track of how to obtain the information they need. Scanning through each lengthy document is time-consuming, so a shorter version containing only the gist is preferable for most information seekers. Therefore, in this paper, we implement a text summarization system that produces summaries containing the gist of oil and gas news articles. The summaries are intended to provide important information that helps oil and gas companies monitor their competitors' behaviour and formulate business strategies. The system integrates a statistical approach with three underlying concepts: keyword occurrences, the title of the news article, and the location of each sentence. The generated summaries were compared with human-generated summaries from an oil and gas company, using precision and recall ratios to evaluate accuracy. Based on the experimental results, the system is able to produce an effective summary, with an average recall of 83% at a compression rate of 25%.
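The three scoring concepts can be combined into a short extractive summarizer. The sketch below is our own minimal rendering of that idea; the exact weighting used by the system is not specified in the abstract:

```python
import re
from collections import Counter

def summarize(title, text, ratio=0.25):
    """Score sentences by keyword frequency, title-word overlap and position,
    then keep the top `ratio` fraction in original order."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z]+", text.lower()))
    title_words = set(re.findall(r"[a-z]+", title.lower()))

    def score(i, sent):
        toks = re.findall(r"[a-z]+", sent.lower())
        kw = sum(freq[t] for t in toks) / max(len(toks), 1)  # keyword occurrences
        tw = len(title_words & set(toks))                    # title overlap
        pos = 1.0 / (i + 1)                                  # earlier sentences rank higher
        return kw + tw + pos

    ranked = sorted(range(len(sentences)),
                    key=lambda i: score(i, sentences[i]), reverse=True)
    keep = sorted(ranked[: max(1, round(len(sentences) * ratio))])
    return " ".join(sentences[i] for i in keep)

summary = summarize("Oil prices",
                    "Oil prices rose sharply. The weather was mild. "
                    "Analysts expect oil demand to grow.")
```

Equal additive weighting of the three scores is an assumption made for the sketch.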
Abstract: Fly ash is a significant waste released by thermal power plants, defined as very fine particles carried upward by the flue gases produced when coal is burned [1]. Fly ash is capable of removing organic contaminants owing to its high carbon content, its large surface area per unit volume, and the heavy metals it contains. Therefore, fly ash is used as an effective coagulant and adsorbent after pelletization [2, 3]. In this study, the possibility of using fly ash from Turkey as a low-cost adsorbent for the adsorption of zinc ions found in wastewater was investigated. The fly ash was pelletized with bentonite and molasses to evaluate its adsorption capacity. For this purpose, analyses such as sieve analysis, XRD, XRF, FTIR and SEM were performed. As a result, it was found that pellets prepared from fly ash, bentonite and molasses could be used for zinc adsorption.
Abstract: Developing countries are facing the problem of slums, and there appears to be no foolproof solution for eradicating them. Of the three approaches to slum development aimed at improving quality of life, in-situ upgradation is found to be the best, while the relocation approach has proved to be a failure. The basic aim of this paper is to assess the factors responsible for the failure of relocation projects. These factors are loss of livelihood, lack of security of tenure, and inefficiency of the government; they are traced and mapped using examples from Western and Indian cities. The National Habitat and Resettlement Policy emphasizes the relationship between shelter and workplace. The SRA has identified 55 slums for relocation owing to reservation of land uses, security of tenure, and the non-notified status of slums. Policy guidelines are suggested for successful relocation projects. Keywords: Livelihood, Relocation, Slums, Urban poor.
Abstract: Network layer multicast, i.e. IP multicast, even after
many years of research, development and standardization, is not
deployed in large scale due to both technical (e.g. upgrading of
routers) and political (e.g. policy making and negotiation) issues.
Researchers looked for alternatives and proposed application/overlay
multicast where multicast functions are handled by end hosts, not
network layer routers. Member hosts wishing to receive multicast
data form a multicast delivery tree. The intermediate hosts in the tree
act as routers also, i.e. they forward data to the lower hosts in the
tree. Unlike IP multicast, where a router cannot leave the tree until all
members below it leave, in overlay multicast any member can leave
the tree at any time thus disjoining the tree and disrupting the data
dissemination. All the disrupted hosts have to rejoin the tree. This
characteristic of overlay multicast makes the multicast tree unstable and causes data loss and rejoin overhead. In this paper, we propose that each node set its leaving time from the tree and send a join request to a number of nodes in the tree. A node in the tree rejects the request if its leaving time is earlier than that of the requesting node; otherwise it accepts. The requester then joins at one of the accepting nodes. This makes the tree more stable, as nodes join according to their leaving times, with the earliest-leaving nodes placed at the leaves of the tree. Some intermediate nodes may not respect their leaving time and leave earlier, thus disrupting the tree. For this case, we propose a proactive recovery mechanism so that disrupted nodes can rejoin the tree at predetermined nodes immediately. We show by simulation that joining the multicast tree incurs less overhead and that the recovery time of disrupted nodes is much lower than in previous works.
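The leave-time-ordered join rule can be sketched as follows. The data structures are hypothetical, and preferring the longest-staying acceptor is our own tie-break, not something the abstract specifies:

```python
def choose_parent(candidates, my_leave_time):
    """A candidate rejects a join request if it plans to leave the tree
    before the requester; the requester joins one of the acceptors."""
    accepting = [c for c in candidates if c["leave_time"] >= my_leave_time]
    if not accepting:
        return None  # requester must retry with other nodes
    # Tie-break (our choice): prefer the acceptor that stays longest.
    return max(accepting, key=lambda c: c["leave_time"])

nodes = [{"id": "A", "leave_time": 50},
         {"id": "B", "leave_time": 200},
         {"id": "C", "leave_time": 120}]

parent = choose_parent(nodes, my_leave_time=100)  # A rejects, B and C accept
```

This ordering pushes early leavers toward the leaves, so their departures disrupt no descendants.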
Abstract: A major requirement for Grid application developers is ensuring the performance and scalability of their applications. Predicting the performance of an application demands an understanding of its specific features. This paper discusses performance modeling and prediction of multi-agent based simulation (MABS) applications on the Grid. An experiment conducted using a synthetic MABS workload identifies the key features to be included in the performance model. The results obtained from the experiment show that the prediction model developed for the synthetic workload can be used as a guideline for estimating the performance characteristics of real-world simulation applications.
Abstract: Artifact rejection plays a key role in many signal processing applications. Artifacts are disturbances that can occur during signal acquisition and that can alter the analysis of the signals themselves. Our aim is to automatically remove artifacts, in particular from electroencephalographic (EEG) recordings. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for artifact extraction and on high-order statistics such as kurtosis and Shannon's entropy, was proposed in the literature some years ago. In this paper we enhance this technique by proposing a new method based on Rényi's entropy. The performance of our method was tested against that of the method in the literature, and the former proved to outperform the latter.
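A histogram-based estimate of the order-α Rényi entropy, H_α = log(Σ_i p_i^α) / (1 − α), already separates an artifact-laden component from a clean one in the toy sketch below. The estimator choice and the synthetic signals are our own illustration; the abstract does not give the paper's exact computation:

```python
import numpy as np

def renyi_entropy(x, alpha=2.0, bins=32):
    """Order-alpha Rényi entropy of a signal, estimated from a histogram."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

rng = np.random.default_rng(0)
clean = rng.normal(size=5000)        # EEG-like background component
spikes = clean.copy()
spikes[::100] += 12.0                # sparse, high-amplitude artifacts

# Artifact-laden components concentrate probability mass into few histogram
# bins, lowering the Rényi entropy -- the cue used to flag them for rejection.
h_clean = renyi_entropy(clean)
h_artifact = renyi_entropy(spikes)
```

In the full pipeline, such an entropy score would be computed per ICA component, and low-entropy (or high-kurtosis) components would be discarded before reconstructing the EEG.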