Abstract: This work presents an approach for the construction of a hybrid color-texture space by using mutual information. Feature extraction is performed with the Laws filter, with an SVM (Support Vector Machine) as the classifier. The classification is applied to the VisTex database and to a SPOT HRV (XS) image representing two forest areas in the region of Rabat, Morocco. The classification result obtained in the hybrid space is compared with the one obtained in the RGB color space.
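As an illustration of how mutual information can rank candidate color and texture components for a hybrid space, the sketch below estimates I(X;Y) from histograms and keeps the top-scoring components. The binning, function names, and selection rule are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of I(X;Y) in bits (illustrative helper)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

def select_components(components, labels, k=3, bins=16):
    """Rank candidate color/texture components by MI with class labels
    and keep the k most informative ones."""
    scores = {name: mutual_information(vals, labels, bins)
              for name, vals in components.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

A component that tracks the class labels scores much higher than an unrelated one, which is the basis for assembling the hybrid space from the most informative channels.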
Abstract: In a non-super-competitive environment, the concepts of the closed system and management control remain the dominant guiding concepts of management. The merits of the closed loop have been the source of much of the management literature and culture for many decades. It is a useful exercise to investigate the dynamics of the control-loop phenomenon and draw lessons for refining the practice of management. This paper examines the multitude of lessons abstracted from the behavior of the input/output/feedback control-loop model, which is the core of control theory. Numerous lessons can be learned from the insights this model provides and from how it parallels the management dynamics of the organization. It is assumed that an organization is basically a living system that interacts with internal and external variables. A viable control loop is one that reacts to variation in the environment and exerts a corrective action. In managing organizations, this is reflected in organizational structure and management control practices. This paper reports findings obtained by examining several abstract scenarios exhibited in the design, operation, and dynamics of the control loop, and how they are projected onto the functioning of the organization. Valuable lessons are drawn by seeking parallels and new paradigms, and by tracing how control theory is reflected in the design of organizational structure and management practices. The paper is structured in a logical, accessible format. Further research is needed to extend these findings.
Abstract: The post-cracking behavior and load-bearing capacity of steel fiber reinforced high-strength concrete (SFRHSC) depend on the number of fibers crossing the weakest crack (bridging the crack) and on their orientation to the crack surface. As the mould is filled with SFRHSC, the fibers move and rotate with the flow of the concrete matrix until the motion stops at each internal point of the concrete body. By filling the same mould from different ends, SFRHSC samples with different internal structures (and different strengths) can be obtained. Numerical flow simulations (using Newtonian and Bingham flow models) were carried out, and the planar motion and rotation of a single fiber in viscous flow were investigated both numerically and experimentally. X-ray pictures of prismatic samples were obtained, and the internal fiber positions and orientations were analyzed. Similarly, fiber positions and orientations in the cracked cross-section were identified and compared with the numerically simulated ones. A structural SFRHSC fracture model was created based on single-fiber pull-out laws, which were determined experimentally. Model predictions were validated by four-point bending tests on 15×15×60 cm prisms.
Abstract: Because customers in the new century express globally increasing demands, networks of interconnected businesses have been established, and the management of such networks appears to be a major key to gaining competitive advantage. Supply chain management encompasses such managerial activities. Within a supply chain, a critical role is played by quality. QFD is a widely utilized tool that serves the purpose not only of bringing quality to the ultimate provision of products or service packages required by the end customer or the retailer, but also of initiating a satisfactory relationship with our initial customer, that is, the wholesaler. However, the wholesalers' cooperation is largely based on capabilities that are heavily dependent on their locations and existing circumstances. It is therefore undeniable that, for any company, each wholesaler possesses a specific importance ratio that can heavily influence the figures calculated in the House of Quality (HOQ) in QFD. Moreover, owing to the competitiveness of today's marketplace, it is widely recognized that consumers' expressed demands are highly volatile across production periods. Such instability and proneness to change are very tangible, and taking them into account during the analysis of the HOQ is influential and doubtless required. To obtain a more reliable outcome in such matters, this article demonstrates the viability of applying the Analytic Network Process to account for the wholesalers' reputation, and simultaneously introduces a mortality coefficient for the reliability and stability of the consumers' expressed demands over time. The paper then elaborates on the relevant contributory factors and the approaches used in calculating such coefficients. In the end, the article concludes that an empirical application is needed to achieve broader validity.
Abstract: Biological data has several characteristics that strongly differentiate it from typical business data. It is much more complex, usually large in size, and continuously changing. Until recently, business data has been the main target for discovering trends, patterns, or future expectations. However, with the recent rise of biotechnology, the powerful technology that was used for analyzing business data is now being applied to biological data. With the advanced technology at hand, the main trend in biological research is rapidly shifting from structural DNA analysis to understanding the cellular functions of DNA sequences. DNA chips are now being used to perform experiments, and DNA analysis processes are being used by researchers. Clustering is one of the important processes used for grouping together similar entities. There are many clustering algorithms, such as hierarchical clustering, self-organizing maps, and K-means clustering. In this paper, we propose a clustering algorithm that imitates the ecosystem, taking into account the features of biological data. We implemented the system using an Ant-Colony clustering algorithm. The system decides the number of clusters automatically. The system processes the input biological data, runs the Ant-Colony algorithm, draws the Topic Map, assigns clusters to the genes, and displays the output. We tested the algorithm with test data of 100 to 1,000 genes and 24 samples and show promising results for applying this algorithm to the clustering of DNA chip data.
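A minimal Lumer-Faieta-style ant clustering sketch of the underlying idea: ants wander a grid, pick up items that sit among dissimilar neighbours, and drop them among similar ones, so the number of clusters emerges from the final layout rather than being fixed in advance. All parameters, the pick-up/drop probability shapes, and the cluster-labeling rule below are illustrative assumptions, not the system described in the abstract.

```python
import numpy as np

def ant_colony_cluster(data, grid=20, ants=10, steps=20000, s=2,
                       alpha=0.5, seed=0):
    """Ant-based clustering sketch on a toroidal grid (parameters are
    illustrative). Returns a mapping item index -> grid cell."""
    rng = np.random.default_rng(seed)
    n = len(data)
    pos = {i: tuple(rng.integers(0, grid, 2)) for i in range(n)}  # item -> cell
    carrying = [None] * ants
    ant_pos = [tuple(rng.integers(0, grid, 2)) for _ in range(ants)]

    def density(i, cell):
        # Average similarity of item i to items in the s-neighbourhood.
        sim = 0.0
        for j, c in pos.items():
            if j != i and max(abs(c[0] - cell[0]), abs(c[1] - cell[1])) <= s:
                sim += max(0.0, 1.0 - np.linalg.norm(data[i] - data[j]) / alpha)
        return sim / (2 * s + 1) ** 2

    for _ in range(steps):
        a = int(rng.integers(0, ants))
        ant_pos[a] = tuple((np.array(ant_pos[a]) + rng.integers(-1, 2, 2)) % grid)
        cell = ant_pos[a]
        if carrying[a] is None:
            here = [i for i, c in pos.items() if c == cell]
            # Pick up an item with high probability when it is isolated.
            if here and rng.random() < (0.1 / (0.1 + density(here[0], cell))) ** 2:
                carrying[a] = here[0]
                del pos[here[0]]
        else:
            # Drop the carried item with high probability near similar items.
            f = density(carrying[a], cell)
            if rng.random() < (f / (0.3 + f)) ** 2:
                pos[carrying[a]] = cell
                carrying[a] = None
    for a in range(ants):                 # force-drop anything still carried
        if carrying[a] is not None:
            pos[carrying[a]] = ant_pos[a]
    return pos

def label_clusters(pos, s=2):
    """Items whose cells lie within Chebyshev distance s share a cluster,
    so the cluster count is read off the layout automatically."""
    labels, cur = {}, 0
    for i in pos:
        if i in labels:
            continue
        labels[i], stack = cur, [i]
        while stack:
            u = stack.pop()
            for v in pos:
                if v not in labels and max(abs(pos[u][0] - pos[v][0]),
                                           abs(pos[u][1] - pos[v][1])) <= s:
                    labels[v] = cur
                    stack.append(v)
        cur += 1
    return labels
```

With gene expression rows as `data`, `label_clusters(ant_colony_cluster(data))` yields a cluster id per gene without specifying the number of clusters up front.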
Abstract: Five vegetables (spinach, lettuce, cabbage, tomato, and onion) were freshly harvested from the Alau Dam and Gongulong agricultural areas for the determination of some organochlorine pesticide residues (o,p'-DDE, p,p'-DDD, o,p'-DDD, p,p'-DDT, α-BHC, γ-BHC, methoxychlor, lindane, endosulfan, dieldrin, and aldrin). Soil samples were also collected at different depths for the determination of the above pesticides. Sample collection and preparation were conducted using standard procedures. The concentrations of all the pesticides in the soil and vegetable samples were determined using a SHIMADZU GC/MS (GC-17A) equipped with an electron capture detector (ECD). The highest concentration was that of p,p'-DDD (132.4±13.45 µg/g), observed in the leaf of cabbage, while the lowest concentration was that of p,p'-DDT (2.34 µg/g), observed in the root of spinach. Similar trends were observed at the Gongulong agricultural area, with p,p'-DDD having the highest concentration of 153.23 µg/g in the leaf of cabbage, while the lowest concentration was that of p,p'-DDT (12.45 µg/g), observed in the root of spinach. α-BHC, γ-BHC, methoxychlor, and lindane were detected in all the vegetable samples studied. The concentrations of all the pesticides in the soil samples were observed to be highest at a depth of 21-30 cm, while the lowest concentrations were observed at a depth of 0-10 cm. The concentrations of all the pesticides in the vegetable and soil samples from the two agricultural sites were observed to be at alarming levels, much higher than the maximum residue limits (MRLs) and acceptable daily intake values (ADIs). The levels of the pesticides observed in the vegetable and soil samples investigated are of a magnitude that calls for special attention and for laws to regulate the use and circulation of such chemicals.
Routine monitoring of pesticide residues in these study areas is necessary for the prevention, control and reduction of environmental pollution, so as to minimize health risks.
Abstract: The subject of this paper is a comparative analysis of the hotel guest's contractual liability for breaching the obligation of payment for hotel services under the hotel-keeper's contract. The paper is organized into six chapters (1. introduction, 2. comparative-law sources of the hotel-keeper's contract, 3. the guest's obligation to pay for hotel services, 4. the hotel guest's liability for non-payment, 5. the hotel-keeper's rights in case of non-payment, and 6. conclusion), and analyzes the guest's liability for non-payment of hotel services through international law, European law, continental national laws (France, Germany, Italy, Croatia) and Anglo-American national laws (UK, USA). The paper's results synthesize the answers to the stated hypotheses into a comparative review of the hotel guest's contractual liability for non-payment of the hotel services provided. In conclusion, it is necessary to adopt an international convention on the hotel-keeper's contract, which would unify the institute of the hotel guest's contractual liability for non-payment of hotel services at the international level.
Abstract: This paper proposes a novel game theoretical
technique to address the problem of data object replication in large-scale
distributed computing systems. The proposed technique draws
inspiration from computational economic theory and employs the
extended Vickrey auction. Specifically, players in a non-cooperative
environment compete for server-side scarce memory space to
replicate data objects so as to minimize the total network object
transfer cost, while maintaining object concurrency. Optimization of
such a cost in turn leads to load balancing, fault-tolerance and
reduced user access time. The method is experimentally evaluated
against four well-known techniques from the literature: branch and
bound, greedy, bin-packing and genetic algorithms. The experimental
results reveal that the proposed approach outperforms the four
techniques in both execution time and solution quality.
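The allocation step can be sketched as a sealed-bid second-price (Vickrey) auction, under which truthful bidding is a dominant strategy; the paper's extended, concurrency-aware mechanism is not reproduced here, and the data layout below is an assumption.

```python
def vickrey_allocate(bids):
    """Sealed-bid second-price auction: each server bids the benefit
    (network object-transfer cost saving) of hosting a replica; the
    highest bidder wins but pays only the second-highest bid, which
    makes truthful bidding a dominant strategy."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    payment = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, payment
```

Because the winner's payment does not depend on its own bid, servers competing for scarce memory space have no incentive to misreport their cost savings, which is what lets the mechanism drive the total transfer cost down.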
Abstract: Sandwich panels are widely used in the construction
industry for their ease of assembly, light weight and efficient thermal
performance. They are composed of two thin reinforced concrete (RC) outer layers separated by an insulating inner layer. In this research the inner insulating layer is made of lightweight Autoclaved Aerated Concrete (AAC) blocks, which have good thermal insulation properties and yet possess reasonable mechanical strength. The shear strength of the AAC infill is relied upon to replace the traditionally used insulating foam and to provide the shear capacity of the panel. A comprehensive experimental program was conducted on full-scale sandwich panels subjected to bending. In this paper, detailed numerical modeling of the tested sandwich panels is reported. Nonlinear 3-D finite element modeling of the composite action of the sandwich panel is developed using ANSYS. Solid elements with different crushing and cracking capabilities and different constitutive laws were selected for the concrete and the AAC. Contact interface elements are used in this research to adequately model the shear transfer at the interface between the different layers. The numerical results showed good correlation with the experimental ones, indicating the adequacy of the model in estimating the load capacity of the panels.
Abstract: A new conserving approach in the context of the Immersed Boundary Method (IBM) is presented to simulate one-dimensional, incompressible flow in a moving boundary problem. The method employs a control volume scheme to simulate the flow field. The concept of a ghost node is used at the boundaries to conserve the mass and momentum equations. The present method implements the conservation laws in all cells, including boundary control volumes. Application of the method is studied in a test case with a moving boundary. Comparison between the results of this new method and a sharp-interface (Image Point Method) IBM algorithm shows a well-distinguished improvement in both the pressure and velocity fields of the present method. Fluctuations in the pressure field are fully resolved by the proposed method. This approach extends the capability of the IBM to simulate flow fields for a variety of problems by implementing conservation laws on a fully Cartesian grid, in contrast to other conserving methods.
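A toy analogue of the ghost-node idea, applied to 1-D linear advection with a conservative upwind finite-volume update; the actual method targets incompressible flow with moving boundaries, so the equation, scheme, and parameters here are illustrative assumptions only.

```python
import numpy as np

def advect_fv(u, a, dx, dt, steps, inflow=0.0):
    """Conservative upwind finite-volume update of 1-D advection
    u_t + a u_x = 0 (a > 0), using ghost cells to impose the boundary
    conditions while every interior cell obeys the conservation law."""
    u = u.astype(float).copy()
    for _ in range(steps):
        g = np.empty(len(u) + 2)
        g[1:-1] = u
        g[0] = inflow          # upstream ghost cell enforces the inflow BC
        g[-1] = u[-1]          # downstream ghost (unused by upwind flux for a > 0)
        flux = a * g[:-1]      # upwind flux at each of the n+1 interfaces
        u = u - dt / dx * (flux[1:] - flux[:-1])  # cell update = flux balance
    return u
```

Because each cell's update is exactly the difference of its interface fluxes, total mass changes only through the boundary fluxes, which is the discrete conservation property the abstract emphasizes.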
Abstract: The prediction of financial time series is a very complicated process. If the efficient market hypothesis holds, then the predictability of most financial time series would be a rather controversial issue, because the current price already contains all available information in the market. This paper extends the Adaptive Neuro-Fuzzy Inference System for High-Frequency Trading, an expert system capable of combining fuzzy reasoning with the pattern recognition capability of neural networks for financial forecasting and high-frequency trading. To eliminate unnecessary input in the training phase, a new event-based volatility model is proposed. Taking volatility and the scaling laws of financial time series into consideration has brought about the development of the Intraday Seasonality Observation Model. This new model allows the observation of specific events and seasonalities in the data and subsequently removes any unnecessary data. The event-based volatility model provides the ANFIS system with more accurate input and has increased the overall performance of the system.
Abstract: This paper proposes a method for modeling the laws controlling manufacturing systems with temporal and non-temporal constraints. A methodology for constructing robust control that generates margins of passive and active robustness is elaborated. Two principal models are presented in this paper. The first uses P-time Petri Nets, which are employed to manage flow-type disturbances. The second, the quality model, exploits the Intervals Constrained Petri Nets (ICPN) tool, which allows the system to preserve its quality specificities. The redundancy between the passive and active robustness of the elementary parameters is also exploited. The final model allows the correlation of temporal and non-temporal criteria by putting the two principal models in interaction. To this end, a set of definitions and theorems is employed and confirmed through illustrative examples.
Abstract: This paper explores steady-state characteristics of
grid-connected doubly fed induction motor (DFIM) in case of unity
power factor operation. Based on the synchronized mathematical
model, analytic determination of the control laws is presented and
illustrated by various figures to understand the effect of the applied
rotor voltage on the speed and the active power. On the other hand, unlike previous works in which the stator resistance was neglected, in this work the stator resistance is included, so that the equations can be applied to small wind turbine generators, which are becoming more popular. Finally, the work concludes with the integration of the studied induction generator into a wind system, for which an open-loop control is proposed that confers a remarkable simplicity of implementation compared to known methods.
Abstract: This paper proposes a solution to the motion planning
and control problem for car-like mobile robots that are required to move safely to a designated target in an a priori known workspace cluttered with swarms of boids exhibiting collective emergent behaviors. A generalized algorithm for target convergence and swarm avoidance is proposed that works for any number of swarms. The control laws proposed in this paper also ensure practical stability of the system. The effectiveness of the proposed control laws is demonstrated via computer simulations of an emergent behavior.
Abstract: In an open real-time system environment, the coexistence of different kinds of real-time and non-real-time applications confronts the system scheduling mechanism with new requirements and challenges. A two-level scheduling scheme for open real-time systems is introduced, and it is pointed out that when hard and soft real-time applications are scheduled indistinguishably, as the same type of real-time application, Quality of Service (QoS) cannot be guaranteed. The scheme has two flaws. First, it cannot differentiate the scheduling priorities of hard and soft real-time applications; that is, it neglects the characteristic differences between hard real-time applications and soft ones, so it does not suit more complex real-time environments. Second, the worst-case execution time of soft real-time applications cannot be predicted exactly, so it is not worthwhile to incur heavy overhead to ensure that no soft real-time application misses its deadline, and doing so may waste resources. To solve this problem, a novel two-level real-time scheduling mechanism (comprising a scheduling profile and a scheduling algorithm) that adds a process for dealing with soft real-time applications is proposed. Finally, we verify the real-time scheduling mechanism both theoretically and experimentally. The results indicate that our scheduling mechanism achieves the following objectives: (1) it reflects the difference in priority when scheduling hard and soft real-time applications; (2) it ensures the schedulability of hard real-time applications, i.e., their deadline miss rate is 0; (3) the overall deadline miss rate of soft real-time applications is kept below 1; (4) although the deadline of a non-real-time application is not set, the scheduling algorithm used by server 0 S avoids the "starvation" of jobs and increases QoS. Our scheduling mechanism is thus more compatible with different types of applications and can be applied more widely.
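The priority ordering described above can be sketched as a one-shot dispatch rule: hard real-time tasks first, soft real-time tasks next by earliest deadline, and non-real-time tasks last. The class names, fields, and function are illustrative, not the paper's actual scheduling profile or algorithm.

```python
def schedule(tasks):
    """Order ready tasks by class (hard < soft < non-real-time), then by
    earliest deadline within a class; non-real-time tasks have no
    deadline and run in the background."""
    rank = {"hard": 0, "soft": 1, "non-rt": 2}
    key = lambda t: (rank[t["cls"]], t.get("deadline", float("inf")))
    return [t["name"] for t in sorted(tasks, key=key)]
```

Separating the classes this way is what lets the mechanism guarantee a zero miss rate for hard tasks while spending only bounded effort on soft ones.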
Abstract: In a wind power generator using doubly fed induction
generator (DFIG), the three-phase pulse width modulation (PWM)
voltage source converter (VSC) is used as grid side converter (GSC)
and rotor side converter (RSC). The standard linear control laws proposed for the GSC suffer not only from instability against comparatively large-signal disturbances, but also from stability problems due to load uncertainty and parameter variations. In this paper, a nonlinear controller is designed for the grid side converter (GSC) of a DFIG for wind power applications. The nonlinear controller is designed based on the input-output feedback linearization control method. The resulting closed-loop system ensures a sufficient stability region, is robust to variations in circuit parameters, and also exhibits good transient response. Computer simulations and
experimental results are presented to confirm the effectiveness of the
proposed control strategy.
Abstract: In this paper we present a new approach to detecting flaws in T.O.F.D. (Time-Of-Flight Diffraction) ultrasonic images based on texture features. Texture is one of the most important features used in recognizing patterns in an image. The paper describes texture features based on 2-D Gabor functions, i.e., Gaussian-shaped band-pass filters, with dyadic treatment of the radial spatial frequency range and multiple orientations, which represent an appropriate choice for tasks requiring simultaneous measurement in both the space and frequency domains. The most relevant features are used as input data for a fuzzy c-means clustering classifier. Only two classes exist: 'defect' and 'no defect'. The proposed approach is tested on T.O.F.D. images acquired in the laboratory and in the industrial field.
Abstract: An epidemiological cross sectional study was
undertaken in Yaoundé in 2002 and updated in 2005. Focused on
health within the city, the objectives were to measure the prevalence of diarrheal diseases and to identify the risk factors associated with them. Results of microbiological examinations revealed an urban average prevalence rate of 14.5%. Access to basic services in the living environment appears to be an important risk factor for diarrheas. The statistical and spatial analyses conducted revealed that the prevalence of diarrheal diseases varies between the two main types of settlement (informal and planned). More importantly, this study shows that diarrhea prevalence rates (notably of bacterial and parasitic diarrheas) vary according to the sub-category of settlement. The study draws a number of theoretical and policy implications for researchers and policy decision makers.
Abstract: Absorption and fluorescence spectra of quinine sulphate (QSD) have been recorded at room temperature in a wide range of solvents of different polarities. The ground-state dipole moment of QSD was obtained from quantum mechanical calculations, and the excited-state dipole moment of QSD was estimated from Bakhshiev's and Kawski-Chamma-Viallet's equations by means of the solvatochromic shift method. A higher dipole moment is observed for the excited state than for the corresponding ground state, which is attributed to the more polar excited state of QSD.
Abstract: The UK Government has emphasized the role of Local Authorities as key players in its flagship residential energy efficiency strategies, tasked with identifying and targeting areas for energy efficiency improvements. Residential energy consumption in England is characterized by significant geographical variation in energy demand, which makes centralized targeting of areas for energy efficiency intervention difficult. This paper draws on research that aims to understand how demographic, social, economic, urban form and climatic factors influence the geographical variation in English residential gas consumption. The paper reports the findings of a multiple regression model showing that 64% of the geographical variation in residential gas consumption is accounted for by variation in these factors. Results from this study, after further refinement and validation, can be used by Local Authorities to identify areas within their boundaries that have higher than expected gas consumption; these may be prime targets for energy efficiency initiatives.
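As a sketch of the kind of model behind a variance-explained figure such as the 64% reported above, the snippet below fits an ordinary least-squares regression and computes R²; the predictors, data, and function name are illustrative, not the study's actual variables or results.

```python
import numpy as np

def fit_r2(X, y):
    """OLS with intercept; returns the coefficient vector and R^2, the
    share of variance in y explained by the predictors."""
    A = np.column_stack([np.ones(len(y)), X])     # prepend intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares fit
    resid = y - A @ beta
    ss_res = resid @ resid
    ss_tot = (y - y.mean()) @ (y - y.mean())
    return beta, float(1 - ss_res / ss_tot)
```

Areas whose observed consumption exceeds the model's prediction (large positive residuals) are the "higher than expected" candidates the abstract suggests targeting.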