Abstract: This paper develops a multiple channel assignment
model that exploits spectrum opportunities in cognitive radio
networks as efficiently as possible. The scheme can assign several
available, frequency-adjacent channels to transmissions that require
a larger bandwidth, under a fairness constraint. The hybrid
assignment model consists of two algorithms: one that ranks and
selects the available frequency channels, and another that enforces
Max-Min Fairness so as not to restrict the spectrum opportunities of
the other secondary users who also wish to transmit. Average
bandwidth, average delay, and fairness were measured for several
channel assignments. The results were evaluated against
experimental spectrum occupancy data captured in the GSM
frequency band. The developed model shows improved use of
spectrum opportunities and a wider average transmission bandwidth
for each secondary user, while maintaining fairness in channel
assignment.
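The Max-Min Fairness criterion used by the second algorithm can be illustrated by the classic progressive-filling allocation, sketched below. The capacity units, user names, and demands are hypothetical, invented for the example; this is not the paper's implementation:

```python
def max_min_fair(capacity, demands):
    """Max-Min Fair allocation by progressive filling: repeatedly give every
    unsatisfied user an equal share of the remaining capacity, fully
    satisfying any user whose demand fits within that share."""
    alloc = {u: 0.0 for u in demands}
    remaining = dict(demands)
    cap = float(capacity)
    while remaining and cap > 0:
        share = cap / len(remaining)
        # users whose demand does not exceed the equal share are satisfied
        sat = {u: d for u, d in remaining.items() if d <= share}
        if not sat:
            # nobody can be fully satisfied: split the rest equally
            for u in remaining:
                alloc[u] += share
            cap = 0.0
        else:
            for u, d in sat.items():
                alloc[u] += d
                cap -= d
                del remaining[u]
    return alloc
```

For example, three secondary users with demands 2, 4, and 8 sharing a capacity of 10 receive 2, 4, and 4 respectively: no user can gain without reducing a smaller allocation.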
Abstract: Growing concerns for physical wellbeing and health
are reflected in the way we choose the food on our table. Nowadays
we are all better-informed consumers and choose healthier foods. At
the same time, stroke, cancer, and atherosclerosis may be mitigated
by the intake of certain bioactive compounds present in food, the
so-called nutraceuticals and functional foods. The aim of this work
was to review the published studies on the effects of some of these
bioactive compounds, namely lycopene, on human health and the
prevention of disease, i.e. in the role of a functional food. Free
radicals in the human body can induce cell damage and
consequently contribute to the development of some cancers and
chronic diseases. Lycopene, the predominant carotenoid in tomato,
is one of the most powerful antioxidants known. Its chemistry,
bioavailability, and functional role in the prevention of several
diseases are the object of this work. In addition, lycopene can be
incorporated into some foods by biotechnology, which represents a
way to recover waste from the tomato industry with positive
nutritional effects on health.
Abstract: Over the last five years the Portuguese footwear industry has shown remarkable performance in export values, the trade balance, and other economic indicators. After a long period of difficulty, with a strong reduction in the number of companies and employees from 1994 until 2009, the Portuguese footwear industry changed its strategy and is now a success story among the international footwear players. Only the Italian industry sells footwear at a higher value than the Portuguese, and the distance between them is decreasing year by year. This paper analyses how Portuguese footwear companies innovate, according to the classification proposed by the Oslo Manual. It also analyses the strategy followed in the innovation process and shows the linkage between the type of innovation and the innovation strategy. The research methodology was qualitative, and the strategy for data collection was the case study; the qualitative data were analyzed with the MAXQDA software. The economic results of the footwear companies studied differ, and these differences are related to the innovation strategy adopted. Companies focused on product and marketing innovation, oriented to their target market, have higher "turnover per worker" ratios than companies focused on process innovation. Nevertheless, all the footwear companies in this "low-tech" industry create value and contributed to a positive foreign trade balance of 1.310 million euros in 2013. The growth strategies implemented involve the participation of the sectoral organizations in several innovative projects, and cooperation among all of them is clearly a critical element in the performance achieved by the companies and the innovation observed.
In recent years the Portuguese footwear sector has shown excellent performance (economic results, export values, trade balance, brands, and international image), and this performance is strongly related to the innovation strategy followed, the type of innovation, and the networks in the cluster. A simplified model, called the "Ace of Diamonds", is proposed by the authors; it explains how this performance was reached by the seven companies that participated in the study (two of them are the leaders in the sector), and whether the model can be applied to other traditional, "low-tech" industries.
Abstract: The enormous amount of information stored on the
web grows from one day to the next, and the web now faces the
inevitable difficulty of finding the pertinent information that users
really want. The problem today is not the expanding size of the
information highways, but the design of an intelligent search
system. The vast majority of this information is stored in relational
databases, which in turn can serve as a backend for managing the
RDF data of the semantic web. This problem has motivated us to
write this paper in order to establish an effective approach,
supported by a semantic transformation algorithm, for translating
SPARQL queries into SQL queries, more precisely SPARQL
SELECT queries. By adopting this method, a relational database
can easily be queried with SPARQL while maintaining the same
performance.
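One common way to translate a SPARQL SELECT query over a relational store is to map each triple pattern of the basic graph pattern to a self-join on a generic triples(s, p, o) table: constants become selections and shared variables become join conditions. The sketch below illustrates only this core idea, not the paper's algorithm; it assumes bare string terms, with no IRIs, prefixes, or typed literals:

```python
def bgp_to_sql(patterns):
    """Translate a SPARQL basic graph pattern (a list of (s, p, o) triples,
    where terms starting with '?' are variables) into SQL over a single
    triples(s, p, o) table, using one table alias per pattern."""
    select, where, seen = [], [], {}
    for i, triple in enumerate(patterns):
        for col, term in zip("spo", triple):
            ref = f"t{i}.{col}"
            if term.startswith("?"):
                if term in seen:
                    where.append(f"{seen[term]} = {ref}")   # shared variable -> join
                else:
                    seen[term] = ref
                    select.append(f"{ref} AS {term[1:]}")   # first occurrence -> projection
            else:
                where.append(f"{ref} = '{term}'")           # constant -> selection
    frm = ", ".join(f"triples t{i}" for i in range(len(patterns)))
    sql = f"SELECT {', '.join(select)} FROM {frm}"
    if where:
        sql += " WHERE " + " AND ".join(where)
    return sql
```

For instance, the pattern { ?x type Person . ?x name ?n } becomes a two-way self-join on triples joined on the shared variable ?x.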
Abstract: Several parameters are established in order to measure
biodiesel quality. One of them is the iodine value, an important
parameter that measures the total unsaturation of a mixture of fatty
acids. Limiting the unsaturated fatty acids is necessary, since
heating a large quantity of them leads either to the formation of
deposits inside the engine or to degradation of the lubricant.
Determination of the iodine value by the official procedure tends to
be laborious, costly, and reliant on toxic reagents, so this study uses
an artificial neural network (ANN) to predict the iodine value as an
alternative. The network development methodology used 13 fatty
acid esters as inputs, and back-propagation convergence algorithms
were optimized in order to obtain an architecture for predicting the
iodine value. This study demonstrates the ability of neural networks
to learn the correlation between a biodiesel quality property, in this
case the iodine value, and the molecular structures that make it up.
The model developed in the study reached a correlation coefficient
(R) of 0.99 for both network validation and network simulation
with the Levenberg-Marquardt algorithm.
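As a toy illustration of the kind of supervised learning described above, the snippet below fits a single linear neuron by gradient descent on synthetic data. The data and learning rate are invented for the example; the paper's network is multilayer and trained with Levenberg-Marquardt, not this simplified stand-in:

```python
def train_neuron(X, y, lr=0.05, epochs=500):
    """Gradient-descent fit of a single linear neuron: a minimal stand-in
    for a back-propagation regression network."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = sum(wj * xj for wj, xj in zip(w, xi)) + b
            err = pred - yi                      # gradient of (squared error)/2
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Toy data with an exact linear relation y = 2x + 1 (invented, not ester data)
X = [[0.0], [1.0], [2.0], [3.0]]
y = [1.0, 3.0, 5.0, 7.0]
w, b = train_neuron(X, y)                        # converges toward w[0] ~ 2, b ~ 1
```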
Abstract: This paper presents an optimization method for
reducing the number of input channels and the complexity of the
feed-forward NARX neural network (NN) without compromising the
accuracy of the NN model. By utilizing the correlation analysis
method, the most significant regressors are selected to form the input
layer of the NN structure. An application to vehicle dynamic model
identification is also presented in this paper to demonstrate the
optimization technique, and the optimal input layer structure and
number of neurons for the neural network are investigated.
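The correlation-based regressor selection described above can be sketched as follows: rank the candidate lagged signals by the absolute Pearson correlation with the target and keep the k most significant as NN inputs. The signal names and data below are made up for illustration:

```python
def pearson(a, b):
    """Sample Pearson correlation coefficient between two sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def select_regressors(candidates, target, k):
    """Rank candidate lagged signals by |correlation| with the target and
    keep the k most significant as inputs for the NARX network."""
    ranked = sorted(candidates,
                    key=lambda name: -abs(pearson(candidates[name], target)))
    return ranked[:k]

# Hypothetical candidates: one informative lag, one uncorrelated signal
candidates = {"y(t-1)": [1.0, 2.0, 3.0, 4.0, 5.0],
              "u(t-3)": [3.0, 1.0, 4.0, 1.0, 5.0]}
```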
Abstract: Historical narration is an act that necessarily develops
and deforms history. This “translation” is examined within the
historical and political context of the 1930 Berlin film premiere of
“All Quiet on the Western Front,” a film based on Erich Maria
Remarque’s 1928 best-selling novel. Both the film and the novel
appeared during an era in which life was conceived of as innately
artistic. The emergence of this “aestheticization” of memory and
history enabled conservative propaganda of the period to denounce
all art that did not adhere conceptually to its political tenets, with “All
Quiet” becoming yet another of its “victims.”
Abstract: A large amount of data is typically stored in relational
databases (DB), which can efficiently handle user queries that elicit
the appropriate information from data sources. However, direct
access to and use of this data requires end users to have an adequate
technical background and to cope with the internal data structure
and values. Consequently, information retrieval is quite difficult
even for IT or DB experts, given the limited conceptual support
offered by relational databases. Ontologies enable users to formally
describe a domain of knowledge in terms of concepts and the
relations among them, and hence they can be used to unambiguously
specify the information captured by a relational database. However,
accessing information residing in a database through ontologies is
feasible only if users are familiar with semantic web technologies.
To enable users from different disciplines to retrieve the appropriate
data, a Graphical User Interface is necessary. In this work we
present an interactive, ontology-based, semantically enabled web
tool for information retrieval. The tool is based entirely on the
ontological representation of the underlying database schema, and it
provides a user-friendly environment in which users can graphically
form and execute their queries.
Abstract: The number of Ground Motion Prediction Equations
(GMPEs) used for predicting peak ground acceleration (PGA) and
the number of earthquake recordings that have been used for fitting
these equations has increased in the past decades. The current PF-L
database contains 3550 recordings. Since GMPEs frequently model
peak ground acceleration, the goal of the present study was to refit
a selection of 44 existing equation models for PGA in light of the
latest data. The Levenberg-Marquardt algorithm was used
for fitting the coefficients of the equations and the results are
evaluated both quantitatively by presenting the root mean squared
error (RMSE) and qualitatively by drawing graphs of the five best
fitted equations. The RMSE was found to be as low as 0.08 for the
best equation models. The newly estimated coefficients vary from the
values published in the original works.
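Refitting a GMPE's coefficients with Levenberg-Marquardt can be sketched with SciPy's curve_fit, whose default unbounded method is LM. The attenuation form and the synthetic magnitudes and distances below are illustrative assumptions, not one of the 44 models or the PF-L data from the paper:

```python
import numpy as np
from scipy.optimize import curve_fit  # default unbounded method is Levenberg-Marquardt

# Hypothetical simple attenuation form: ln(PGA) = c1 + c2*M - c3*ln(R)
def gmpe(X, c1, c2, c3):
    M, R = X
    return c1 + c2 * M - c3 * np.log(R)

# Synthetic "recordings" generated from known coefficients plus noise
rng = np.random.default_rng(0)
M = rng.uniform(4.0, 7.0, 200)           # magnitudes
R = rng.uniform(5.0, 100.0, 200)         # distances, km
y = gmpe((M, R), -1.0, 0.9, 1.1) + rng.normal(0.0, 0.05, 200)

# LM fit recovers the coefficients; RMSE measures the residual misfit
coef, _ = curve_fit(gmpe, (M, R), y, p0=[0.0, 0.0, 0.0], method="lm")
rmse = float(np.sqrt(np.mean((gmpe((M, R), *coef) - y) ** 2)))
```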
Abstract: In this paper we describe the Levenberg-Marquardt
(LM) algorithm for the identification and equalization of CDMA
signals received by an antenna array over communication channels.
The synthesis covers the digital separation and equalization of
signals after propagation through multipath channels that generate
intersymbol interference (ISI). Exploiting the discrete transmitted
data and three diversities induced at the reception, the problem can
be posed as the Block Component Decomposition (BCD) of a
third-order tensor, a new tensor decomposition generalizing the
PARAFAC decomposition. Optimizing the BCD by the
Levenberg-Marquardt method gives encouraging results compared
with the classical alternating least squares (ALS) algorithm. In the
equalization part, we use the Minimum Mean Square Error (MMSE)
criterion to evaluate the presented method. The simulation results
using the LM algorithm are significant.
Abstract: This paper aims at finding a suitable neural network
for monitoring congestion level in electrical power systems. In this
paper, the input data has been framed properly to meet the target
objective through supervised learning mechanism by defining normal
and abnormal operating conditions for the system under study. The
congestion level, expressed as line congestion index (LCI), is
evaluated for each operating condition and is presented to the NN
along with the bus voltages to represent the input and target data.
Once training succeeds, the NN learns to handle newly presented
data through the validation and testing mechanism. The crux of the
results presented in this paper rests on
performance comparison of a multi-layered feed forward neural
network with eleven types of back propagation techniques so as to
evolve the best training criteria. The proposed methodology has been
tested on the standard IEEE-14 bus test system with the support of
MATLAB based NN toolbox. The results presented in this paper
signify that the Levenberg-Marquardt backpropagation algorithm
gives the best training performance of the eleven cases considered,
thus validating the proposed methodology.
Abstract: From an organizational perspective, leaders are
variations of the same talent pool in that they all score above
average on the bell curve that maps leadership behaviors and
characteristics, namely competence, vision, communication,
confidence, cultural sensibility, stewardship, empowerment,
authenticity, reinforcement, and creativity. The question that remains
unanswered and essentially unresolved is how to explain the irony
that leaders are so much alike yet their organizations diverge so
noticeably in their ability to innovate. Leadership intersects with
innovation at the point where human interactions get exceedingly
complex and where certain paradoxical forces cohabit: conflict with
conciliation, sovereignty with interdependence, and imagination with
realism. Rather than accepting that leadership is without context, we
argue that leaders are specialists of their domain and that those
effective at leading for innovation are distinct within the broader pool
of leaders. Keeping in view the extensive literature on leadership and
innovation, we carried out a quantitative study with data collected
over a five-year period involving 240 participants from across five
dissimilar companies based in the United States. We found that while
innovation and leadership are, in general, strongly interrelated (r =
.89, p = 0.0), there are five qualities that set leaders apart on
innovation. These qualities include a large radius of trust, a restless
curiosity with a low need for acceptance, an honest sense of self and
other, a sense for knowledge and creativity as the yin and yang of
innovation, and an ability to use multiple senses in the engagement
with followers. When these particular behaviors and characteristics
are present in leaders, organizations out-innovate their rivals by a
margin of 29.3 per cent to gain an unassailable edge in a business
environment that is regularly disruptive. A strategic outcome of this
study is a psychometric scale named iLeadership, proposed with the
underlying evidence, limitations, and potential for leadership and
innovation in organizations.
Abstract: This work evaluates the ability of OBT to detect parametric faults in continuous-time filters. To this end, we adopt two filters with quite different topologies as case studies, together with a previously reported statistical fault model. In addition, we explore the behavior of the test schemes when a particular test condition is changed. The new data reported here, obtained from a fault simulation process, reveal a lower performance of OBT that was not observed in previous work using single-deviation faults, even under the changed test condition.
Abstract: The direct synthesis of dimethyl ether (DME) from
syngas in slurry reactors is considered promising because of its
advantages in heat transfer. In this paper, the influences of the
operating conditions (temperature, pressure, and weight hourly
space velocity) on the conversion of CO and the selectivity of DME
and methanol were studied in a stirred autoclave over a Cu-Zn-Al-Zr
slurry catalyst, which is far more suitable for the liquid-phase DME
synthesis process than commercial bifunctional catalysts. A
Langmuir-Hinshelwood type global kinetic model for liquid-phase
direct DME synthesis, based on methanol synthesis models and a
methanol dehydration model, was developed by fitting our
experimental data. The model parameters were estimated with a
MATLAB program based on genetic algorithms and the
Levenberg-Marquardt method; the model fits the experimental data
well, and its reliability was verified by statistical tests and residual
error analysis.
Abstract: Artificial-intelligence-based gaming is an interesting topic in state-of-the-art technology. This paper presents an automation of a traditional Omani game called Al-Hawalees, whose related issues are resolved and implemented using an artificial intelligence approach. The mini-max procedure is incorporated to generate diverse moves in the online game. As the number of moves increases, the time complexity grows proportionally. In order to tackle the time and space complexities, we employ a back-propagation neural network (BPNN), trained offline, to decide the resources required for the automation of the game. We utilize Levenberg-Marquardt training in order to obtain a rapid response during play. A set of optimal moves is determined by the online back-propagation training combined with alpha-beta pruning. The results and analyses reveal that the proposed scheme can easily be incorporated in an online scenario with one player against the system.
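The mini-max procedure with alpha-beta pruning referred to above can be sketched generically. Here the game tree is a nested list whose leaves are static evaluations; the tree itself is a toy example, not an Al-Hawalees position:

```python
def alphabeta(node, maximizing=True, alpha=float("-inf"), beta=float("inf")):
    """Minimax with alpha-beta pruning over a game tree given as nested
    lists; numeric leaves are static evaluations of terminal positions."""
    if isinstance(node, (int, float)):       # leaf: return its evaluation
        return node
    if maximizing:
        best = float("-inf")
        for child in node:
            best = max(best, alphabeta(child, False, alpha, beta))
            alpha = max(alpha, best)
            if beta <= alpha:
                break                        # prune remaining siblings
        return best
    best = float("inf")
    for child in node:
        best = min(best, alphabeta(child, True, alpha, beta))
        beta = min(beta, best)
        if beta <= alpha:
            break
    return best
```

On the tree [[3, 5], [2, 9]] the maximizing root obtains value 3, and the leaf 9 is never evaluated because its subtree is pruned once the minimizer's bound drops below alpha.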
Abstract: The purpose of this work is to present the potential of
solar energy in the Zarqa region. The solar radiation for the year
2009 was obtained from a pyranometer, which measures the global
radiation over horizontal surfaces. Solar data are presented in
several forms: 5-minute, hourly, daily, and monthly radiation. In
brief, the yearly global solar radiation in Zarqa is 7297.5 MJ/m²
(2027 kWh/m²), and the average daily solar radiation over the year
is 20 MJ/m² (5.5 kWh/m²). More specifically, the average daily
solar radiation is 12.9 MJ/m² (3.57 kWh/m²) in winter and 25
MJ/m² (7 kWh/m²) in summer.
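The paired MJ/m² and kWh/m² figures above are related by the fixed conversion 1 kWh = 3.6 MJ, which the reported values can be checked against:

```python
MJ_PER_KWH = 3.6  # 1 kWh = 3.6 MJ, by definition

def mj_to_kwh(mj):
    """Convert an irradiation figure from MJ/m^2 to kWh/m^2."""
    return mj / MJ_PER_KWH

annual = mj_to_kwh(7297.5)   # ~ 2027.1, matching the reported 2027 kWh/m^2
daily = mj_to_kwh(20.0)      # ~ 5.56, close to the reported 5.5 kWh/m^2
```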
Abstract: This paper presents a Neural Network (NN) identification of icing parameters in an A340 aircraft and a reconfiguration technique that keeps the A/C performance close to its performance prior to icing. Five aircraft parameters are assumed to be considerably affected by icing. The off-line training for identifying the clean and iced dynamics is based on the Levenberg-Marquardt backpropagation algorithm. The icing parameters are located in the system matrix, and the physical locations of the icing are assumed to be the right and left wings. The reconfiguration is based on the technique known as the control mixer approach, or pseudo-inverse technique, which generates a new control input vector such that the A/C dynamics are not much affected by the icing. In the simulations, the longitudinal and lateral dynamics of an Airbus A340 aircraft model are considered, and the stability derivatives affected by icing are identified. The simulation results show successful NN identification of the icing parameters and reconfigured flight dynamics with performance similar to that before icing; in other words, the destabilizing icing effect is compensated.
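The control-mixer / pseudo-inverse reconfiguration amounts to choosing a new input vector u_new with B_iced·u_new ≈ B_nom·u, i.e. u_new = pinv(B_iced)·B_nom·u. The effectiveness matrices below are invented for illustration, not A340 data:

```python
import numpy as np

# Hypothetical control-effectiveness matrices (not A340 data): columns map
# control inputs to accelerations; icing is modeled here as a uniform
# 30% loss of effectiveness.
B_nom = np.array([[1.0, 0.2],
                  [0.1, 1.0]])
B_iced = 0.7 * B_nom

def remix(u):
    """Control-mixer / pseudo-inverse reconfiguration: find u_new such that
    B_iced @ u_new reproduces the nominal control effect B_nom @ u."""
    return np.linalg.pinv(B_iced) @ (B_nom @ u)

u = np.array([0.5, -0.2])
u_new = remix(u)   # here simply u / 0.7, since icing is a uniform scale
```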
Abstract: Design for cost (DFC) is a method that reduces life
cycle cost (LCC) from the designer's perspective. A multiple
domain feature mapping (MDFM) methodology is used in DFC;
with MDFM, design features can be used to estimate the LCC.
From the DFC perspective, the design features of family cars were
obtained, such as overall dimensions, engine power, and emission
volume. At the conceptual design stage, the cars' LCC was
estimated using the back-propagation (BP) artificial neural network
(ANN) method and case-based reasoning (CBR). Hamming space
was used to measure the similarity among cases in the CBR
method, and the Levenberg-Marquardt (LM) algorithm and a
genetic algorithm (GA) were used in the ANN. The differences
between the CBR and ANN LCC estimation models are discussed;
each method has its own shortcomings, and combining them
improves the accuracy of the results. First, the ANN was used to
select the design features that affect the LCC. Then, the LCC
estimates from the ANN were used to raise the accuracy of the
LCC estimation in the CBR method. Third, the ANN was used to
estimate the LCC errors and correct the errors in the CBR
estimation results when the accuracy was insufficient. Finally,
economical family cars and a sport utility vehicle (SUV) were
presented as LCC estimation cases using this hybrid approach
combining ANN and CBR.
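Case retrieval in Hamming space, as used in the CBR step, can be sketched as follows: discretized design features are compared position-wise and the stored case with the highest match fraction is retrieved. The feature encodings, case names, and LCC figures below are invented for illustration:

```python
def hamming_similarity(a, b):
    """Fraction of matching positions between two equal-length feature tuples."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def nearest_case(query, cases):
    """Retrieve the stored case most similar to the query in Hamming space."""
    return max(cases, key=lambda c: hamming_similarity(query, c["features"]))

# Hypothetical case base: features are discretized design attributes
cases = [
    {"name": "sedan A", "features": (1, 0, 2, 1), "lcc": 21000},
    {"name": "SUV B",   "features": (2, 2, 1, 0), "lcc": 34000},
]
best = nearest_case((1, 0, 2, 0), cases)   # matches "sedan A" in 3 of 4 features
```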
Abstract: Variable channel conditions in underwater networks,
and variable distances between sensors due to water currents, lead
to a variable bit error rate (BER). This variability in BER strongly
affects the energy efficiency of the error correction techniques
used. In this paper an efficient adaptive hybrid error correction
technique (AHECT) is proposed. AHECT adaptively changes the
error correction technique from pure retransmission (ARQ) in
low-BER cases to a hybrid technique with variable encoding rates
(ARQ & FEC) in high-BER cases. An adaptation algorithm is
proposed that depends on a precalculated packet acceptance rate
(PAR) look-up table, the current BER, the packet size, and the
error correction technique in use. Based on this adaptation
algorithm, a periodic 3-bit feedback is added to the
acknowledgment packet to state which error correction technique is
suitable for the current channel conditions and distance.
Comparative studies against other techniques show that AHECT is
more energy efficient and has a higher probability of success than
all of those techniques.
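The adaptive switch between pure ARQ and hybrid ARQ/FEC can be sketched as a cost comparison: each scheme's expected energy per delivered packet is proportional to its transmitted bits (inflated by the code rate) times the expected number of transmissions, 1/PAR. The coding-gain factors and rates below are assumptions for illustration, not values from the paper's look-up table:

```python
def packet_acceptance(ber, bits):
    """Probability that a packet of `bits` bits arrives with no bit errors."""
    return (1.0 - ber) ** bits

def choose_scheme(ber, bits, schemes):
    """Pick the scheme with the lowest expected cost per delivered packet:
    (bits / code rate) transmitted bits, retransmitted 1/PAR times on
    average. Each scheme's `gain` models how much its FEC reduces the
    effective BER (assumed values, not the paper's table)."""
    def cost(s):
        par = packet_acceptance(ber / s["gain"], bits)
        return (bits / s["rate"]) / par
    return min(schemes, key=cost)

schemes = [
    {"name": "ARQ",         "rate": 1.0, "gain": 1.0},
    {"name": "ARQ+FEC 1/2", "rate": 0.5, "gain": 100.0},
]
```

At a low BER the FEC redundancy is wasted and plain ARQ wins; at a high BER the retransmission cost of ARQ dominates and the hybrid scheme wins.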
Abstract: The response of the King Abdulla Canal (KAC) water to the upgrade of the As Samra Wastewater Treatment Plant, which discharges its effluent into the Zarqa River, is investigated. Time-series quality data extending from October 2005 to December 2009, obtained by a state-of-the-art telemetric monitoring system, were analyzed for COD, EC, TP, and TN at two monitoring stations located upstream and downstream of the confluence of the Zarqa River with the KAC. The samples' means and the t-test showed a significant improvement in the quality of the KAC water for COD and TP. However, the improvement in TN was found to be statistically insignificant, whereas the EC of the KAC was unaffected by the upgrade. Comparing the selected parameters with the standards and guidelines for using treated wastewater in irrigation showed that the KAC water has improved towards meeting the required standards and guidelines for treated wastewater reuse in irrigation.
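The significance testing described above corresponds to a two-sample t-test on the two concentration series. The sketch below applies Welch's t-test to synthetic COD-like data; the numbers are invented, not the KAC measurements:

```python
import numpy as np
from scipy import stats

# Synthetic COD-like samples (mg/L), standing in for the two monitoring series
rng = np.random.default_rng(1)
series_a = rng.normal(60.0, 8.0, 50)   # e.g. pre-upgrade period
series_b = rng.normal(45.0, 8.0, 50)   # e.g. post-upgrade period

t, p = stats.ttest_ind(series_a, series_b, equal_var=False)  # Welch's t-test
significant = p < 0.05   # difference in means is statistically significant
```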