Abstract: The research objective of the project and article "The impact of Structural Funds on the growth of competitiveness of Polish agriculture" is to assess the competitiveness of regions in Poland from the perspective of Polish agriculture by analysing the efficiency of the use of Structural Funds, the economic procedure of their distribution, and the regulatory and organisational framework under the Rural Development Programme (RDP). It must be stressed that defining the scope of research in this manner restricts the analysis to the portion of the Structural Funds allocated to supporting Polish agriculture.
Abstract: Rule discovery is an important technique for mining knowledge from large databases. The use of objective measures for discovering interesting rules leads to another data mining problem, although one of reduced complexity. Data mining researchers have studied subjective measures of interestingness to reduce the volume of discovered rules and ultimately improve the overall efficiency of the KDD process. In this paper, we study the novelty of discovered rules as a subjective measure of interestingness. We propose a hybrid approach that uses both objective and subjective measures to quantify the novelty of discovered rules in terms of their deviation from known rules. We analyze the types of deviation that can arise between two rules and categorize the discovered rules according to a user-specified threshold. We implement the proposed framework and experiment with several public datasets. The experimental results are quite promising.
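The deviation-and-threshold categorization described above can be sketched as follows. The abstract does not define the exact deviation measure, so the rule representation (antecedent/consequent item sets) and the Jaccard-distance deviation used here are purely illustrative assumptions, not the paper's method.

```python
# Illustrative sketch only: the paper's exact deviation measure is not given
# in the abstract; a Jaccard distance over rule items stands in for it here.

def deviation(rule_a, rule_b):
    """Deviation between two rules, each an (antecedent, consequent) pair of
    frozensets, averaged over both sides; 0 = identical, 1 = fully disjoint."""
    def side(x, y):
        # Jaccard distance between one side of each rule
        return 1.0 - len(x & y) / len(x | y) if x | y else 0.0
    return (side(rule_a[0], rule_b[0]) + side(rule_a[1], rule_b[1])) / 2

def categorize(discovered, known_rules, threshold=0.5):
    """A discovered rule counts as 'novel' if it deviates from every known
    rule by more than the user-specified threshold, else 'conforming'."""
    d = min(deviation(discovered, k) for k in known_rules)
    return "novel" if d > threshold else "conforming"
```

For example, a discovered rule {milk} → {bread, butter} deviates only slightly from a known rule {milk} → {bread} and would be labelled conforming, while {beer} → {chips} shares nothing with it and would be labelled novel.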
Abstract: In the 3D wavelet video coding framework, temporal filtering is performed along the motion trajectory using Motion-Compensated Temporal Filtering (MCTF). A computationally efficient motion estimation technique is therefore essential for MCTF. In this paper, a predictive technique is proposed to reduce the computational complexity of the MCTF framework by exploiting the high correlation among the frames in a Group of Pictures (GOP). The proposed technique applies the coarse and fine searches of any fast block-based motion estimation algorithm only to the first pair of frames in a GOP. The generated motion vectors are supplied to the subsequent frame pairs, even across temporal levels, and only a fine search is carried out around these predicted motion vectors. The coarse search is thus skipped for all motion estimations in a GOP except for the first pair of frames. The technique has been tested with different fast block-based motion estimation algorithms on standard test sequences using MC-EZBC, a state-of-the-art scalable video coder. The simulation results reveal a substantial reduction (20.75% to 38.24%) in the number of search points during motion estimation, without compromising the quality of the reconstructed video compared to non-predictive techniques. Since the motion vectors of every frame pair in a GOP except the first lie within ±1 of the motion vectors of the previous pair, the number of bits required for motion vectors is also reduced by 50%.
Abstract: Research on standard product models and the development of neutral manufacturing interfaces for numerically controlled machines has been a significant topic for the last 25 years. In this paper, a detailed description of a STEP implementation for turn-mill manufacturing is presented. It identifies the information requirements drawn from the ISO 14649 data model and describes the design of a STEP-NC framework applicable to turn-mill manufacturing. In the framework, the EXPRESS-G and UML modelling tools are used to depict the information content of the system, establish the basis of the information model requirements, and derive a product and manufacturing data model for STEP-compliant manufacturing. The requirements of next-generation turn-mill operations are represented in a UML diagram. Object-oriented classes of ISO 14649 have been developed on the Visual Basic .NET platform to bind the static information model represented by the UML diagram. An architecture for the proposed system implementation is given, based on the design and manufacturing modules of the established STEP-NC interface. Finally, a Part 21 process-plan file is generated to illustrate turn-mill components.
Abstract: Elliptic curve-based certificateless signatures are gaining attention because they retain the efficiency of identity-based signatures in eliminating the need for certificate management, while not suffering from the inherent private key escrow problem. Generally, elliptic curve cryptosystems offer equivalent security strength at smaller key sizes than conventional cryptosystems such as RSA, which results in faster computation and more efficient use of computing power, bandwidth, and storage. This paper proposes implementing a certificateless signature based on bilinear pairing to structure the framework of IKE authentication. We perform a comparative analysis of the certificateless signature scheme against the well-known RSA scheme and present experimental results on signing and verification execution times. Generalizing our observations, we discuss the trade-offs involved in implementing IKE authentication using certificateless signatures.
Abstract: The current study presents a modeling framework to determine the torsion strength of an induction-hardened splined shaft by considering geometry and material aspects, with the aim of optimizing the static torsion strength through the selection of spline geometry and hardness depth. Six different spline geometries and seven different hardness profiles, including non-hardened and through-hardened shafts, have been considered. The results reveal that the torque that causes initial yielding of the induction-hardened splined shaft is strongly dependent on the hardness depth and the geometry of the spline teeth. Guidelines for selection of the appropriate hardness depth and spline geometry are given such that an optimum static torsion strength of the component can be achieved.
Abstract: An original Direct Numerical Simulation (DNS) method to tackle the problem of particulate flows at moderate to high concentration and finite Reynolds number is presented. Our method builds on the framework established by Glowinski and his coworkers [1] in the sense that we use their Distributed Lagrange Multiplier/Fictitious Domain (DLM/FD) formulation and their operator-splitting idea, but differs in the treatment of particle collisions. The novelty of our contribution lies in replacing the simple artificial-repulsive-force collision model usually employed in the literature with an efficient Discrete Element Method (DEM) granular solver. The use of our DEM solver enables us to consider particles of arbitrary (at least convex) shape and to account for actual contacts, in the sense that particles actually touch each other, in contrast with the simple repulsive-force collision model. We recently upgraded our serial code, GRIFF [2], to full MPI capabilities. Our new code, PeliGRIFF, is developed under the framework of the fully MPI open-source platform PELICANS [3]. The new MPI capabilities of PeliGRIFF open new perspectives in the study of particulate flows and significantly increase the number of particles that can be considered in a full DNS approach: O(100000) in 2D and O(10000) in 3D. Results on the 2D/3D sedimentation/fluidization of isometric polygonal/polyhedral particles with collisions are presented.
Abstract: With the rapid popularization of internet services, it is apparent that next-generation terrestrial communication systems must be capable of supporting various applications such as voice, video, and data. This paper presents the performance evaluation of turbo-coded mobile terrestrial communication systems, which are capable of providing high-quality services for delay-sensitive (voice or video) and delay-tolerant (text transmission) multimedia applications in urban and suburban areas. Different types of multimedia information require different service qualities, which are generally expressed in terms of a maximum acceptable bit-error rate (BER) and a maximum tolerable latency. The breakthrough discovery of turbo codes allows us to significantly reduce the probability of bit errors at feasible latency. In a turbo-coded system, a trade-off between latency and BER results from the choice of convolutional component codes, interleaver type and size, decoding algorithm, and number of decoding iterations. This trade-off can be exploited for multimedia applications by using optimal and suboptimal combinations of performance parameters to achieve different service qualities. The results therefore suggest an adaptive framework for turbo-coded wireless multimedia communications that incorporates a set of performance parameters achieving an appropriate set of service qualities, depending on the application's requirements.
Abstract: Structural Integrity Management (SIM) is important for the protection of offshore crews, the environment, business assets, and company and industry reputation. API RP 2A contains guidelines for the assessment of existing platforms developed mostly for the Gulf of Mexico (GOM), and the ISO 19902 SIM framework likewise does not specifically cater for Malaysia. There are about 200 platforms in Malaysia, 90 of which have exceeded their design life. Petronas Carigali Sdn Bhd (PCSB) uses the Asset Integrity Management System and a highly subjective Risk-Based Inspection Program for these platforms. Petronas currently does not have a standalone Petronas Technical Standard for SIM (PTS-SIM). This study proposes a recommended practice for the SIM process for offshore structures in Malaysia, incorporating the API and ISO studies as well as local elements such as the number of platforms, types of facilities, age, and risk ranking. A case study of the SMG-A platform in Sabah reveals missing or scattered platform data and gaps in the inspection history; the platform is to undergo a Level 3 underwater inspection in 2015.
Abstract: This paper provides a key-driver-based conceptual framework that can be used to improve a firm's success in commercializing technology and in new product innovation resulting from collaboration with other organizations through strategic alliances. Based on a qualitative study using an interview approach, strategic alliances of entrepreneurs in the food processing industry in Thailand are explored. This paper describes factors affecting decisions to collaborate through alliances. It identifies four issues: maintaining the efficiency of the value chain for production capability, adapting to present and future competition, careful assessment of the value of outcomes, and management of innovation. We consider five driving factors: resource orientation, assessment of risk, business opportunity, sharing of benefits, and confidence in alliance partners. These factors will be of interest to entrepreneurs and policy makers seeking a further understanding of the direction of business strategies.
Abstract: In recent years, sustainable supply chain management (SSCM) has been widely researched in the academic domain. However, due to the traditional operational role and the complexity of supply chain management in the cement industry, relatively little research has been conducted on cement supply chain simulation integrated with sustainability criteria. This paper analyses cement supply chain operations using the push-pull supply chain framework and the Life Cycle Assessment (LCA) methodology, and proposes three supply chain scenarios based on Make-To-Stock (MTS), Pack-To-Order (PTO) and Grind-To-Order (GTO) strategies. A Discrete-Event Simulation (DES) model of the SSCM is constructed using Arena software to implement the three target scenarios. The simulation results show that GTO is the optimal supply chain strategy, demonstrating the best economic, ecological and social performance in the cement industry.
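The trade-off the three scenarios explore, namely how far downstream the customer order decoupling point sits, can be illustrated with a toy single-server simulation. All step names, durations and arrival rates below are invented for illustration; the paper's actual DES model is built in Arena with LCA-derived parameters that are not reproduced here.

```python
import random

# Toy sketch: steps performed AFTER an order arrives add to its lead time,
# while steps performed to stock add to the value of inventory held.
# Step names and durations are hypothetical.
STEPS = {
    "MTS": [],                 # make-to-stock: finished cement shipped at once
    "PTO": ["pack"],           # pack-to-order: ground cement held in silos
    "GTO": ["grind", "pack"],  # grind-to-order: grind and pack on demand
}
DURATION = {"grind": 4.0, "pack": 1.0}  # assumed processing hours per step

def simulate(strategy, n_orders=2000, mean_gap=10.0, seed=7):
    """Return (mean order lead time, stock-value proxy) for one strategy."""
    rng = random.Random(seed)
    service = sum(DURATION[s] for s in STEPS[strategy])
    t = free_at = 0.0
    total_lead = 0.0
    for _ in range(n_orders):
        t += rng.expovariate(1.0 / mean_gap)  # random order arrivals
        start = max(t, free_at)               # wait if the line is busy
        free_at = start + service
        total_lead += free_at - t
    # work already embedded in stocked product = steps NOT deferred to order
    stock = sum(DURATION.values()) - service
    return total_lead / n_orders, stock
```

Running all three strategies shows the expected tension: MTS minimizes lead time but maximizes held inventory, while GTO does the opposite; which point on that curve is "optimal" is exactly what the paper's economic, ecological and social criteria decide.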
Abstract: This paper presents an optimization of the hull separation, i.e. the transverse clearance. The main objective is to identify the feasible speed ranges and find the optimum transverse clearance that minimizes wave-making resistance. The dimensions and weight of the hardware systems installed in the catamaran-structured, fuel-cell-powered USV (Unmanned Surface Vehicle) were considered as constraints. FRIENDSHIP-Framework was used as the CAE (Computer-Aided Engineering) platform: hull surface modeling, DoE (Design of Experiments), tangent search optimization, tool integration and process automation were all performed within it. The hydrodynamic results were evaluated by XPAN, the potential-flow solver of SHIPFLOW.
Abstract: Knowledge is widely recognized as a significant component in sustaining competitive advantage and provides a leading edge in business. This study emphasizes the proper and effective utilization of internal and external knowledge (both explicit and tacit) obtained from stakeholders, which strongly supports organizations in meeting their challenges and enhancing productivity. Furthermore, it proposes a model, in the context of the IRSA framework, that facilitates the flow of knowledge and the sharing of experience among employees within the organization. The discussion section presents an innovative model that incorporates all the functionality identified in the analysis section.
Abstract: Supply chain coordination presents major challenges for the different actors involved, because each agent responds to individual interests. This paper presents a framework from the reviewed literature regarding the system's decision structure and the nature of demand. It then characterizes an agri-food supply chain in the Central Region of Colombia, which corresponds to a decentralized distribution system with stochastic demand. Finally, the paper recommends coordinating the chain through shared information and agent-specific mechanisms: a VMI (vendor-managed inventory) strategy for the farmer-buyer relationship, an information system for farmers, and contracts for transportation service providers.
Abstract: Nowadays, scientific data is inevitably digital and is stored in a wide variety of formats in heterogeneous systems. Scientists need access to an integrated view of remote or local heterogeneous data sources, with advanced tools for data access, analysis, and visualization. This research suggests the use of Service-Oriented Architecture (SOA) to integrate biological data from different data sources. This work shows how SOA can solve the problems facing the integration process and allow biologists to access biological data more easily. There are several ways to implement SOA, of which web services are the most popular. The Microsoft .NET Framework was used to implement the proposed architecture.
Abstract: In this paper, we present the region-based hidden Markov random field model (RBHMRF), which encodes the characteristics of different brain regions into a probabilistic framework for brain MR image segmentation. The recently proposed TV+L1 model is used for region extraction. By utilizing different spatial characteristics in different brain regions, the RBHMRF model outperforms the current state-of-the-art method, the hidden Markov random field model (HMRF), which uses identical spatial information throughout the whole brain. Experiments on both real and synthetic 3D MR images show that the segmentation results of the proposed method are more accurate than those of existing algorithms.
Abstract: The promises of component-based technology can only be fully realized when the system's design contains the necessary level of separation of concerns. The authors propose to focus on the concerns that emerge throughout the life cycle of the system and to use them as an architectural foundation for the design of a component-based framework. The proposed model comprises a set of superimposed views of the system describing its functional and non-functional concerns. The approach is illustrated by the design of a specific framework for data analysis and data acquisition, and is supplemented with experience from using systems developed with this framework at the Fermi National Accelerator Laboratory.
Abstract: This paper presents an approach, based on a distributed cognition framework and a non-parametric multicriteria evaluation methodology (DEA), designed specifically to compare e-commerce websites from the consumer/user viewpoint. In particular, the framework takes a website's relative efficiency as a measure of its quality and usability. A website is modelled as a black box capable of providing the consumer/user with a set of functionalities. When the consumer/user interacts with the website to perform a task, he/she is involved in a cognitive activity, sustaining a cognitive cost to search, interpret and process information, and experiencing a sense of satisfaction. The degree of ambiguity and uncertainty he/she perceives and the search time required determine the size of the effort, and hence the cognitive cost, he/she has to sustain to perform the task. Conversely, task completion and result achievement induce a sense of gratification, satisfaction and usefulness. In total, 9 variables are measured, classified into 3 website macro-dimensions (user experience, site navigability and structure). The framework is applied to compare 40 websites of businesses performing electronic commerce in the information technology market. A questionnaire to collect subjective judgements of the websites in the sample was purposely designed and administered to 85 university students enrolled in computer science and information systems engineering undergraduate courses.
Abstract: Traditional higher-education classrooms allow lecturers to observe students' behaviours and responses to a particular pedagogy during learning, in a way that can inform changes to the pedagogical approach. Within current e-learning systems it is difficult to perform continuous analysis of the cohort's behavioural tendencies, making real-time pedagogical decisions difficult. This paper presents a Virtual Learning Process Environment (VLPE) based on the Business Process Management (BPM) conceptual framework. Within the VLPE, course designers can model various educational pedagogies in the form of learning process workflows using an intuitive flow diagram interface. These diagrams are used to visually track the learning progress of a cohort of students. This helps assess the effectiveness of the chosen pedagogy, providing the information required to improve course design. A case scenario of a cohort of students is presented, and quantitative statistical analysis of their learning process performance is gathered and displayed in real time using dashboards.
Abstract: The rapid adoption of the Internet has transformed millennial teens' lives at lightning speed. Empirical evidence has illustrated that Pathological Internet Use (PIU) among them ensures long-term success for market players in the children's industry. However, it creates concern among their caretakers, as it causes mental disorders in some of these teens. The purpose of this paper is to examine the determinants of PIU and identify its outcomes among urban millennial teens. It aims to develop a theoretical framework based on a modified Media System Dependency (MSD) Theory that integrates the important systems and components that determine, and result from, PIU.