Abstract: In a wireless sensor network, sensor nodes periodically
transmit their sensed data to the sink node over multi-hop
communication. This heavy traffic induces congestion at the nodes
located one hop from the sink. The packet transmission and reception
rates of these nodes are very high compared to those of other sensor
nodes in the network, so their energy consumption is also very high;
this phenomenon is known as the "funneling effect". A tree-based
data aggregation technique (TBDA) is used to reduce the energy
consumption of these nodes. Overall performance results show a
considerable decrease in the number of packet transmissions to the
sink node. The proposed scheme, TBDA, avoids the funneling effect
and extends the lifetime of the wireless sensor network. The
average-case time complexity for inserting a node into the tree is
O(n log n), and the worst-case time complexity is O(n²).
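The stated insertion complexities match those of an unbalanced search tree. A minimal sketch illustrating the claim (a plain binary search tree over arbitrary node keys; this only illustrates the complexity figures and is not the authors' TBDA structure):

```python
# Sketch: inserting n nodes into an unbalanced binary search tree.
# Average case over random insertion orders: O(n log n) total work;
# worst case (sorted order, tree degenerates to a list): O(n^2).
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Insert `key` below `root`, returning the (possibly new) root."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def build(keys):
    """Build a tree by repeated insertion."""
    root = None
    for k in keys:
        root = insert(root, k)
    return root

def height(root):
    """Height of the tree; insertion cost is bounded by the height."""
    if root is None:
        return 0
    return 1 + max(height(root.left), height(root.right))
```

Inserting n keys in random order gives an expected height of O(log n), so building the tree costs O(n log n) on average; inserting keys in sorted order degenerates the tree into a list, yielding the O(n²) worst case.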
Abstract: In this paper, an attempt has been made to design a
robotic library using an intelligent system. The robot is built on an
ARM microprocessor with a motor driver circuit, 5 degrees of
freedom, and Wi-Fi- and GPS-based communication. The authenticity
of the library books is controlled by RFID. The proposed robotic
library system is realised with an embedded system and an ARM
processor. In this issuance system, the authentic review reports of
previous readers are taken into consideration when recommending
suitable books to new users, and the issuance of books or periodicals
is based on the users' decisions. We conjecture that the Wi-Fi-based
robotic library management system allows fast book-issuance
transactions and also fosters quality readership.
Abstract: This paper presents a method in which expert
knowledge is applied to a fuzzy inference model. Even less
experienced users, such as urban planners and officials, can benefit
from such a system. The analysis result is obtained in a very short
time, so a large number of proposed locations can also be verified
quickly. The proposed method is intended for testing locations of car
parks in a city. The paper shows selected examples of locations of
P&R facilities in cities planning to introduce P&R. Analyses of
existing facilities are also presented and confronted with the opinions
of system users, with particular emphasis on unpopular locations.
The results of these analyses are compared with an expert analysis of
P&R facility locations that was commissioned by the city and with
opinions about existing facilities that users expressed on social
networking sites. The obtained results are consistent with actual user
feedback. The proposed method performs well without requiring the
involvement of a large team of experts or large financial outlays for
complicated research. The method also makes it possible to propose
alternative locations for P&R facilities. Although the results of the
method are approximate, they are no worse than the results of
analyses by employed experts. The advantage of this method is its
ease of use, which simplifies professional expert analysis. The ability
to analyze a large number of alternative locations gives a broader
view of the problem. It is valuable that the arduous analysis by a
team of people can be replaced by the model's calculation. According
to the authors, the proposed method is also suitable for
implementation on a GIS platform.
Abstract: Information technology has been gaining more and
more space, whether in industry, commerce, or even personal use,
but its misuse brings harm to the environment and human health.
Contributing to the sustainability of the planet means compensating
the environment for all or part of what is withdrawn from it. Green
computing proposes practices for using IT in an environmentally
correct way, in support of strategic management and communication.
This work focuses on showing how a mobile application can help
businesses reduce costs and the environmental impacts caused by
their processes, through a case study of a public company in Brazil.
Abstract: The manufacturing process is considered one of the
most important activities in a business process. It correlates with the
productivity and quality of the product, enabling industries to fulfill
customer demand. With increasing demand from customers,
industries must improve their manufacturing capabilities, for
example by shortening lead times and reducing waste in their
processes. Lean manufacturing is considered one of the tools for
waste elimination in manufacturing and service industries. Workforce
development is one practice in lean manufacturing that can reduce
waste generated by operators, such as waste from unnecessary
motion. An anthropometric approach is proposed to determine the
recommended measurements of the operator's work area. The
method obtains dimensions of Indonesian people related to the piston
workstation. The result of this research is a new design for the work
area that takes ergonomic aspects into consideration.
Abstract: Background modeling and subtraction in video
analysis has been widely used as an effective method for moving
objects detection in many computer vision applications. Recently, a
large number of approaches have been developed to tackle different
types of challenges in this field. However, dynamic backgrounds
and illumination variations are the problems most frequently
encountered in practical situations. This paper presents a two-layer
model based on the codebook algorithm incorporated with a local
binary pattern (LBP) texture measure, targeted at handling
dynamic-background and illumination-variation problems. More
specifically, the first layer is a block-based codebook combined with
an LBP histogram and the mean value of each RGB color channel.
Because the LBP features are invariant with respect to monotonic
gray-scale changes, this layer can produce block-wise detection
results with considerable tolerance of illumination variations. A
pixel-based codebook is then employed to refine the output of the
first layer, further eliminating false positives. As a result, the
proposed approach greatly improves accuracy under dynamic
background and illumination changes.
Experimental results on several popular background subtraction
datasets demonstrate very competitive performance compared to
previous models.
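As an illustration of the first layer's block descriptor, the following sketch computes an 8-neighbour LBP code map and combines its histogram with per-channel means for one block. This is a minimal reading of the abstract, not the authors' implementation; the block size and the 8-neighbour LBP variant are assumptions:

```python
import numpy as np

def lbp_image(gray):
    """8-neighbour local binary pattern for each interior pixel.

    Each neighbour >= centre contributes one bit, so codes span 0..255.
    LBP codes are invariant to monotonic gray-scale changes, which is
    what gives the first layer its tolerance to illumination variation.
    """
    c = gray[1:-1, 1:-1]
    # Offsets of the 8 neighbours, clockwise from the top-left.
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offs):
        nb = gray[1 + dy:gray.shape[0] - 1 + dy,
                  1 + dx:gray.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.uint8) << bit
    return code

def block_descriptor(block):
    """Per-block feature: LBP histogram (texture) + channel means (colour)."""
    gray = block.mean(axis=2)
    hist = np.bincount(lbp_image(gray).ravel(), minlength=256)
    means = block.reshape(-1, 3).mean(axis=0)
    return hist, means
```

Matching a new block would then amount to comparing its LBP histogram and channel means against the stored codewords; the second, pixel-based layer refines only the blocks flagged as foreground.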
Abstract: The aim of this work is to study the numerical
implementation of the Hilbert Uniqueness Method for the exact
boundary controllability of the Euler–Bernoulli beam equation. This
study may be difficult, depending on the problem under
consideration (geometry, control, and dimension) and the numerical
method used. Knowledge of the asymptotic behaviour of the control
governing the system at time T may be useful for its calculation; this
idea is developed in this study. As a first step, we characterize the
solution by a minimization principle; secondly, we propose a method
for its resolution to approximate the control steering the considered
system to rest at time T.
Abstract: Companies face increasing challenges in research due
to higher costs and risks. Intensifying technological complexity and
interdisciplinarity require unique know-how. Therefore, companies
need to decide whether research shall be conducted internally or
externally with partners. Research institutes, on the other hand, face
increasing effort to secure sound financing and to maintain a high
research reputation. Therefore, relevant research topics need to be
identified and specialization of competencies is necessary. However,
additional competencies for solving interdisciplinary research
projects are also often required. Secured financing can be achieved
by bonding with industry partners as well as through public funding.
The drive for faster and better research leads companies and research
institutes to cooperate in organized research networks, which are
managed by an administrative organization. For effective and
efficient cooperation, the necessary processes, roles, tools, and a set
of rules need to be determined. The goal of this paper is to present
the state of the art and to propose a governance framework for
organized research networks.
Abstract: Contemporary metropolitan areas and large cities are
dynamic, rapidly growing and continuously changing. Thus, urban
transformations and mutations are not a new phenomenon, but rather
a continuous process. Basic factors of urban transformation are
related to the development of technologies, globalisation, lifestyle,
etc., which, in combination with local factors, have generated an
extremely great variety of urban development conditions. This article
discusses the main urbanisation processes in Lithuania during the
last 50 years and the social factors affecting urban functional
mutations.
Abstract: Web usage mining is the application of data mining
techniques to discover usage patterns from web log data, in order to
grasp required patterns and serve the requirements of Web-based
applications. The user's experience on the internet may be improved
by minimizing web access latency. This may be done by predicting a
future page request early so that the page can be prefetched and
cached. Therefore, to enhance the quality of web services, user web
navigation behavior is a needed research topic. Analysis of users'
web navigation behavior is achieved by modeling web navigation
history. We propose a technique that clusters user sessions based on
the K-medoids method.
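The clustering step described above can be sketched with a basic K-medoids (Voronoi-iteration) routine over a precomputed session dissimilarity matrix. The Jaccard-style session distance below is a hypothetical stand-in for whatever dissimilarity the authors use:

```python
import numpy as np

def session_distance(a, b):
    """Toy dissimilarity between two sessions (sets of visited pages):
    1 - Jaccard similarity. Purely illustrative."""
    a, b = set(a), set(b)
    return 1.0 - len(a & b) / len(a | b)

def k_medoids(D, k, iters=100, seed=0):
    """Basic K-medoids (Voronoi iteration) on a precomputed n x n
    distance matrix D. Returns the medoid indices and cluster labels."""
    rng = np.random.default_rng(seed)
    medoids = rng.choice(len(D), size=k, replace=False)
    for _ in range(iters):
        # Assign every session to its nearest medoid.
        labels = np.argmin(D[:, medoids], axis=1)
        new = medoids.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            if members.size:
                # New medoid: member minimising total distance to its cluster.
                within = D[np.ix_(members, members)].sum(axis=1)
                new[j] = members[np.argmin(within)]
        if np.array_equal(np.sort(new), np.sort(medoids)):
            break
        medoids = new
    return medoids, np.argmin(D[:, medoids], axis=1)
```

In practice D would be computed once over all recorded sessions, and each cluster's medoid session can serve as the representative navigation pattern used for prefetching.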
Abstract: The river Hindon is an important river catering to the
demands of the highly populated rural and industrial clusters of
western Uttar Pradesh, India. The water quality of the river Hindon
is deteriorating at an alarming rate due to various industrial,
municipal, and agricultural
activities. The present study aimed at identifying the pollution
sources and quantifying the degree to which these sources are
responsible for the deteriorating water quality of the river. Various
water quality parameters, like pH, temperature, electrical
conductivity, total dissolved solids, total hardness, calcium, chloride,
nitrate, sulphate, biological oxygen demand, chemical oxygen
demand, and total alkalinity were assessed. Water quality data
obtained from eight study sites over one year were subjected to two
multivariate techniques, namely principal component analysis and
cluster analysis. Principal component analysis was applied with the
aim of revealing spatial variability and identifying the sources
responsible for the water quality of the river. Three varifactors were
obtained after varimax rotation of the initial principal components.
Cluster analysis was carried out to group sampling stations of a
certain similarity, which placed the eight sites into two clusters. The
study reveals that anthropogenic influences (municipal and industrial
waste water and agricultural runoff) were the major source of river
water pollution.
Thus, this study illustrates the utility of multivariate statistical
techniques for analysis and elucidation of multifaceted data sets,
recognition of pollution sources/factors and understanding
temporal/spatial variations in water quality for effective river water
quality management.
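The statistical pipeline described above — standardise the samples-by-parameters matrix, extract principal components, varimax-rotate the loadings — can be sketched as follows. This is a generic PCA/varimax sketch, not the authors' analysis; the parameter layout of X is an assumption:

```python
import numpy as np

def principal_components(X, n_pc=3):
    """PCA via SVD of the standardised samples-by-parameters matrix X.

    Rows are site/month samples; columns are water-quality parameters
    (pH, EC, TDS, BOD, COD, ...). Returns explained-variance ratios,
    loadings (parameters x components) and scores (samples x components).
    """
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    var_ratio = s ** 2 / np.sum(s ** 2)
    loadings = Vt[:n_pc].T * s[:n_pc] / np.sqrt(len(X) - 1)
    scores = U[:, :n_pc] * s[:n_pc]
    return var_ratio[:n_pc], loadings, scores

def varimax(L, iters=50):
    """Varimax rotation of a loading matrix L (Kaiser's SVD algorithm)."""
    p, k = L.shape
    R = np.eye(k)
    for _ in range(iters):
        B = L @ R
        U, _, Vt = np.linalg.svd(
            L.T @ (B ** 3 - B @ np.diag((B ** 2).sum(axis=0)) / p))
        R = U @ Vt
    return L @ R
```

The rotated loadings play the role of the varifactors: varimax drives each parameter to load strongly on one component, which makes the underlying pollution sources easier to name.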
Abstract: English, like any other language, is rich in arbitrary, conventional symbols, which lends it to many inconsistencies in spelling, phonology, syntax, and morphology. This research examines the irregularities prevalent in the structure and meaning of some 'er' lexical items in English and their implications for vocabulary acquisition. It centers its investigation on the derivational suffix 'er', which changes the grammatical category of a word. The English language poses many challenges to second language learners because of its irregularities, exceptions, and rules. One of the meanings of the -er derivational suffix is 'someone who does something'. This rule often confuses learners when they meet exceptions in normal discourse. The need to investigate instances of such inconsistencies in the formation of -er words and the meanings given to such words by students motivated this study. For this purpose, some senior secondary two (SS2) students in six randomly selected schools in the metropolis were provided with a large number of alphabetically selected words ending in the 'er' suffix. The researcher opted for a test technique, which required them to provide the meaning of the selected -er words. The test was scored on a scale of 0–1, where a correct formation of an -er word and its meaning scored one, while a wrong formation and meaning scored zero. The numbers of wrong and correct formations of -er word meanings were calculated as percentages. The results of this research show that a large number of students made wrong generalizations about the meaning of the selected -er ending words. This shows how enormous the inconsistencies in the English language are and how they affect the learning of English. Findings from the study revealed that although students mastered the basic morphological rules, errors were generally committed on vocabulary items that are not frequently in use.
The study arrives at this conclusion from a survey of the students' textbooks and spoken activities. The researcher therefore recommends an effective reappraisal of language teaching: implementation of the designed curriculum to reflect modern strategies of language teaching; identification and incorporation of the exceptions into rigorous communicative activities in language teaching, language course books, and tutorials; and training and retraining of teachers in strategies that conform to the new pedagogy.
Abstract: A reconfigurable Wilkinson power divider is
proposed in this paper. In the existing system, only a limited
bandwidth is available at the output ports; in the proposed Wilkinson
power divider, different frequency bands are obtained by using PIN
diodes. By tuning the PIN diodes, different frequencies are achieved.
The size of the power divider is reduced for the operating frequency,
and the fractional bandwidth is increased.
Abstract: Small, low-power sensors with sensing, signal
processing, and wireless communication capabilities are suitable for
wireless sensor networks. Due to limited resources and battery
constraints, the complex routing algorithms used in ad-hoc networks
cannot be employed in sensor networks. In this paper, we propose a
node-disjoint multi-path hexagon-based routing algorithm for
wireless sensor networks. We present the details of the algorithm and
compare it with other works. Simulation results show that the
proposed scheme achieves better performance in terms of efficiency
and message delivery ratio.
Abstract: In this work, a framework to model the Supply Chain
(SC) Collaborative Planning (CP) process is proposed. The main
contributions of this framework concern (1) the presentation of the
decision view, the most important one given the characteristics of the
process, jointly with the physical, organisational, and information
views, and (2) the simultaneous consideration of spatial and
temporal integration among the different supply chain decision
centres. This framework provides the basis for a realistic and
integrated perspective of the supply chain collaborative planning
process and also the analytical modeling of each of its decisional
activities.
Abstract: The research investigates the causes of unemployment
in Namibia, Nigeria and South Africa and the role of Capital
Accumulation in reducing the unemployment profile of these
economies, as proposed by post-Keynesian economics. This is
conducted through an extensive review of the literature on NAIRU
models, focused on the post-Keynesian view of unemployment
within the NAIRU framework. The NAIRU (non-accelerating
inflation rate of unemployment) model has become a dominant
framework used in macroeconomic analysis of unemployment. The
study adopts the post-Keynesian argument that capital accumulation
is a major determinant of unemployment. Unemployment remains
the fundamental socio-economic challenge facing African economies
and has been a burden to their citizens. Namibia, Nigeria, and South
Africa are great African nations battling high unemployment rates.
The high unemployment rates in these countries have led citizens to
chase away foreigners, claiming that they have taken their jobs. The
study proposes that there is a strong relationship between capital
accumulation and unemployment in Namibia, Nigeria, and South
Africa, and that insufficient capital accumulation is responsible for
the high unemployment rates in these countries. For these economies
to achieve a steady-state level of employment and satisfactory
economic growth and development, capital accumulation needs to
take place. The countries in the study were selected after critical
research and investigation, based on the following criteria: African
economies with unemployment rates above 15% and with about
40% of their workforce unemployed. This
level of unemployment is the critical level of unemployment in
Africa, as expressed by the International Labour Organization (ILO).
Finally, the selected African countries experience slow growth in
their gross fixed capital formation. Adequate statistical measures
were employed using time-series analysis in the study, and the results
revealed that capital accumulation is the main driver of
unemployment performance in the chosen African countries. An
increase in the accumulation of capital causes unemployment to
reduce significantly. The results of this research will be useful and
relevant to the federal governments and the ministries, departments,
and agencies (MDAs) of Namibia, Nigeria, and South Africa in
resolving the issue of high and persistent unemployment rates in
their economies, which are a great burden that slows the growth and
development of developing economies. The results can also be useful
to the World Bank, the African Development Bank, and the
International Labour Organization (ILO) in their further research and
studies on how to tackle unemployment in developing and emerging
economies.
Abstract: The question of legal liability for injury arising from
the import and introduction of GM food emerges as a crucial issue
confronting efforts to promote GM food and its derivatives. There is
a strong possibility of commercialized GM food from an exporting
country entering an importing country where the status of approval
may not be the same. This necessitates a liability mechanism to
address any damage that occurs during transboundary movement or
at the market. There was widespread consensus to develop the Cartagena
Protocol on Biosafety and to provide a dedicated regime on liability
and redress in the form of the Nagoya–Kuala Lumpur Supplementary
Protocol on Liability and Redress ('N-KL Protocol') at the
international level. National legal frameworks based on this protocol
are not adequately established in the prevailing food legislation of
developing countries. A developing economy like India is willing to
import GM food and its derivatives after the successful
commercialization of Bt cotton in 2002. As a party to the N-KL
Protocol, it is indispensable for India to formulate a legal framework
and to address the safety, liability, and regulatory issues surrounding
GM foods in conformity with the provisions of the Protocol. The
liability mechanism is also important where risk assessment and risk
management are still at the implementation stage. Moreover, the
country is facing GM infiltration issues with its neighbor
Bangladesh. As a precautionary approach, there is a need to
formulate rules and procedures of legal liability to address any kind
of damage occurring in transboundary trade. In this context, the
proposed work attempts to analyze the liability regime in the existing
Food Safety and Standards Act, 2006 from the perspectives of
applicability and domestic compliance, and to suggest legal and
policy options for regulatory authorities.
Abstract: The increase in electric power demand, in the face of
environmental issues, has intensified the participation of renewable
energy sources, such as photovoltaics, in the energy matrices of
various countries. Due to their operational characteristics, these
sources can generate time-varying harmonic and inter-harmonic
distortions. For this reason, measurement methods based on
traditional Fourier analysis, as proposed by IEC 61000-4-7, can give
inaccurate results. With these aspects in mind, this work presents the
results of a comparative evaluation between a methodology
combining the Prony method with the Kalman filter and a method
based on the IEC 61000-4-30 and IEC 61000-4-7 standards. The
study employed synthetic signals and data acquired through
measurements in a 50 kWp photovoltaic installation.
Abstract: Social networking sites such as Twitter and Facebook
attract over 500 million users across the world; for these users,
social life, and even practical life, has become interrelated with
social networking, and this interaction has affected their lives
forever. Accordingly, social networking sites have become among
the main channels responsible for the vast dissemination of different
kinds of information during real-time events. This popularity has led
to various problems, including the possibility of exposing users to
incorrect information through fake accounts, which results in the
spread of malicious content during live events. This situation can
cause huge damage in the real world to society in general, including
citizens, business entities, and others.
In this paper, we present a classification method for detecting
fake accounts on Twitter. The study determines a minimized set of
the main factors that influence the detection of fake accounts on
Twitter; these factors are then applied using different classification
techniques. The results of these techniques are compared, and the
most accurate algorithm is selected according to the accuracy of its
results. The study has been compared with recent research in the
same area, and this comparison supports the accuracy of the
proposed approach. We argue that this study can be applied
continuously on Twitter to automatically detect fake accounts;
moreover, it can be applied to other social networking sites, such as
Facebook, with minor changes according to the nature of the social
network, as discussed in this paper.
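The selection step — apply several classifiers to the same feature set and keep the most accurate — can be sketched as follows. The two toy classifiers and the synthetic account features are illustrative assumptions, not the factors or algorithms identified in the study:

```python
import numpy as np

def _sigmoid(z):
    # Clipped for numerical stability.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def nearest_centroid(Xtr, ytr, Xte):
    """Assign each test account to the class of the nearer centroid."""
    c0 = Xtr[ytr == 0].mean(axis=0)
    c1 = Xtr[ytr == 1].mean(axis=0)
    d0 = ((Xte - c0) ** 2).sum(axis=1)
    d1 = ((Xte - c1) ** 2).sum(axis=1)
    return (d1 < d0).astype(int)

def logistic(Xtr, ytr, Xte, lr=0.1, epochs=500):
    """Logistic regression fitted by plain batch gradient descent."""
    A = np.hstack([Xtr, np.ones((len(Xtr), 1))])  # add a bias column
    B = np.hstack([Xte, np.ones((len(Xte), 1))])
    w = np.zeros(A.shape[1])
    for _ in range(epochs):
        w -= lr * A.T @ (_sigmoid(A @ w) - ytr) / len(ytr)
    return (_sigmoid(B @ w) > 0.5).astype(int)

def select_best(classifiers, Xtr, ytr, Xte, yte):
    """Run every classifier and return (name, accuracy) of the best one."""
    scores = {name: (clf(Xtr, ytr, Xte) == yte).mean()
              for name, clf in classifiers.items()}
    return max(scores.items(), key=lambda kv: kv[1])
```

In a real study the held-out split would be replaced by cross-validation, and the feature matrix would hold the minimized factor set extracted from the Twitter accounts.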
Abstract: A broadband wire monopole antenna loaded by an inhomogeneous stack of annular dielectric ring resonators (DRRs) is proposed. The proposed antenna exhibits a broad impedance bandwidth from 3 to 30 GHz. This is achieved by adding an external step matching network at the antenna feed point. The matching network comprises three annular DRRs possessing different permittivity values and sharing the same axis over a finite ground plane. The antenna performance is characterized using full-wave EM simulation. Compared to previously reported wire antennas whose bandwidth is improved by DRRs, the proposed topology provides a relatively compact realization and superior broadband performance.