Abstract: Innovations not only contribute to the competitiveness of
the company but also have positive effects on revenues. On average,
product innovations account for 14 percent of companies’ sales.
Innovation management has substantially changed during the last
decade, because of growing reliance on external partners. As a
consequence, a new task for purchasing arises, as firms need to
understand which suppliers actually have high potential to
contribute to the innovativeness of the firm and which do not.
Proper organization of the purchasing function is important, since
the majority of manufacturing companies deal with substantial
material costs which pass through the purchasing function. In the past,
the purchasing function was largely seen as a transaction-oriented,
clerical function, but today purchasing is the intermediary with supply
chain partners contributing to innovations, be they product or process
innovations. Therefore, the purchasing function has to be organized
differently to enable the firm’s innovation potential.
However, innovations are inherently risky. There are behavioral
risks (that one partner will take advantage of the other party),
technological risks in terms of the complexity of products and of the
manufacturing processes and incoming materials, and finally market risks,
which ultimately determine the value of the innovation. These risks are
investigated in this work. Specifically, technological risks, which deal
with the complexity of products and processes, will be investigated
more thoroughly. Buying components embodying such high-end technologies
necessitates careful investigation of technical features and therefore is
usually conducted by a team of experts. It is therefore hypothesized
that the higher the technological risk, the higher the centralization of
the purchasing function as an interface with other supply chain
members.
The main contribution of this research lies in the fact that the analysis
was performed on a large data set of 1,493 companies from 25
countries, collected in the GMRG 4 survey. Most analyses of the
purchasing function are done through case studies of innovative
firms. This study therefore contributes empirical evaluations that
can be generalized.
Abstract: In many communication and signal processing
systems, it is highly desirable to implement an efficient narrow-band
filter that decimates or interpolates the incoming signals. This paper
presents a hardware-efficient compensated CIC filter over a narrow
frequency band that increases the speed of down-sampling by using
multiplierless decimation filters with a polyphase FIR filter structure.
The proposed work analyzes the performance of the compensated CIC
filter in terms of the improvement of the frequency response and the
reduced hardware complexity, measured by the number of adders and
multipliers, and produces the filtered results without any alteration.
The results demonstrate that using compensation with the CIC filter
improves the frequency response in the passband of interest by
26.57%, with a reduction in hardware complexity of 12.25% in
multiplications per input sample (MPIS) and 23.4% in additions per
input sample (APIS) with respect to an FIR filter.
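The passband droop that such a compensator corrects can be illustrated numerically. The sketch below uses assumed parameters R = 8, M = 1, and N = 3 (not taken from the paper) to evaluate the normalized CIC magnitude response and show that an ideal inverse-response compensator flattens the band edge:

```python
import numpy as np

def cic_mag(f, R=8, M=1, N=3):
    """Magnitude response of an N-stage CIC decimator (rate R, differential
    delay M), normalized to unity at DC. f is relative to the input rate."""
    f = np.asarray(f, dtype=float)
    num = np.sin(np.pi * f * R * M)
    den = R * M * np.sin(np.pi * f)
    with np.errstate(divide='ignore', invalid='ignore'):
        h = np.where(np.abs(den) < 1e-12, 1.0, num / den)
    return np.abs(h) ** N

# Droop at an assumed band edge f = 0.45/R: the CIC attenuates it noticeably
f_edge = 0.45 / 8
droop = cic_mag(f_edge)                 # well below 1.0

# Idealized compensator: invert the CIC response at the band edge
compensated = cic_mag(f_edge) * (1.0 / cic_mag(f_edge))   # flat (= 1.0)
```

A practical compensator approximates this inverse with a short multiplierless FIR stage rather than evaluating it exactly, which is where the hardware savings quoted above come from.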
Abstract: The high Peak-to-Average Power Ratio (PAPR) in Filter
Bank Multicarrier with Offset Quadrature Amplitude Modulation
(FBMC-OQAM) can significantly reduce power efficiency and
performance. In this paper, we address the problem of PAPR
reduction for FBMC-OQAM systems using the Tone Reservation (TR)
technique. Due to the overlapping structure of FBMC-OQAM signals,
directly applying the TR schemes of OFDM systems to FBMC-OQAM
systems is not effective. We improve the TR
technique by employing a sliding window with Active Constellation
Extension for the PAPR reduction of FBMC-OQAM signals, called
the sliding window tone reservation with Active Constellation
Extension (SW-TRACE) technique. The proposed SW-TRACE technique uses
the peak reduction tones (PRTs) of several consecutive data
blocks to cancel the peaks of the FBMC-OQAM signal inside a
window, while dynamically extending outer constellation points in
active (data-carrying) channels, within margin-preserving constraints,
in order to minimize the peak magnitude. Analysis and simulation
results are compared with those of the existing TR technique for
FBMC-OQAM systems. The proposed SW-TRACE method has better
PAPR performance and lower computational complexity.
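The PAPR metric itself can be sketched on a toy multicarrier symbol. The block below is illustrative only: it uses a plain oversampled OFDM-style symbol with 64 random QPSK subcarriers rather than a true FBMC-OQAM waveform, and all parameter values are assumptions:

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(0)
# Toy symbol: 64 random QPSK subcarriers, 4x oversampled by zero-padding
# the middle of the spectrum before the IFFT
qpsk = (rng.choice([-1, 1], 64) + 1j * rng.choice([-1, 1], 64)) / np.sqrt(2)
spectrum = np.concatenate([qpsk[:32], np.zeros(192), qpsk[32:]])
x = np.fft.ifft(spectrum)
p_db = papr_db(x)   # multicarrier superposition yields a large PAPR
```

Techniques such as TR and ACE work by perturbing reserved tones or extendable constellation points so that `p_db` of the resulting signal drops, without corrupting the data-carrying content.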
Abstract: This paper proposes a cooperative Alamouti space-time
transmission scheme with low relay complexity for cooperative
communication systems. In the proposed scheme, the source node
combines the data symbols to construct the Alamouti-coded form at
the destination node, whereas the conventional scheme performs the
corresponding operations at the relay nodes. Simulation results
show that the proposed scheme achieves second-order
cooperative diversity while maintaining the same bit error rate (BER)
performance as the conventional scheme.
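The Alamouti-coded form referred to above can be sketched in the noise-free flat-fading case: linear combining at the destination recovers both symbols with the diversity gain |h1|² + |h2|². The channel gains and symbols below are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)   # two QPSK symbols
h1 = rng.normal() + 1j * rng.normal()                    # branch 1 gain
h2 = rng.normal() + 1j * rng.normal()                    # branch 2 gain

# Alamouti transmission over two time slots (noise omitted for clarity)
r1 = h1 * s1 + h2 * s2
r2 = -h1 * np.conj(s2) + h2 * np.conj(s1)

# Linear combining at the destination: both symbols are restored,
# each scaled by the second-order diversity gain |h1|^2 + |h2|^2
g = abs(h1) ** 2 + abs(h2) ** 2
s1_hat = (np.conj(h1) * r1 + h2 * np.conj(r2)) / g
s2_hat = (np.conj(h2) * r1 - h1 * np.conj(r2)) / g
```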
Abstract: In this paper, we employ a directed hypergraph model
to investigate the extent to which environmental variability influences
the set of available biochemical reactions within a living cell.
Such an approach avoids the limitations of the usual complex
network formalism by allowing for the multilateral relationships (i.e.
connections involving more than two nodes) that naturally occur
within many biological processes. More specifically, we extend the
concept of network reciprocity to complex hyper-networks, thus
enabling us to characterise a network in terms of the existence
of mutual hyper-connections, which may be considered a proxy
for metabolic network complexity. To demonstrate these ideas, we
study 115 metabolic hyper-networks of bacteria, each of which
can be classified into one of 6 increasingly varied habitats.
In particular, we find that reciprocity increases significantly
with increased environmental variability, supporting the view that
organism adaptability leads to increased complexities in the resultant
biochemical networks.
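Pairwise network reciprocity, the quantity the paper generalizes to hyper-connections, can be sketched directly; the edge set below is a made-up example, not drawn from the metabolic data:

```python
# Reciprocity of a directed network: the fraction of directed links that
# have a link in the opposite direction. The hyper-network extension counts
# mutual hyper-connections analogously; this sketch shows the pairwise case.
edges = {(0, 1), (1, 0), (1, 2), (2, 3), (3, 2)}
mutual = sum(1 for (u, v) in edges if (v, u) in edges)
reciprocity = mutual / len(edges)   # 4 of the 5 links are reciprocated
```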
Abstract: Prior literature on innovation diffusion or acceptance has almost exclusively concentrated on consumers’ positive attitudes and behaviors toward new products/services. Consumers’ negative attitudes or behaviors toward innovations have received relatively little marketing attention, although they occur frequently in practice. This study discusses the psychological factors consumers experience when they try to learn or use new technologies. According to recent research, technological innovation acceptance has been considered a dynamic or mediated process. This research argues that consumers can experience inertia and emotions in the initial use of new technologies. Given such consumer psychology, however, the question arises as to whether the inclusion of consumer inertia (routine seeking and cognitive rigidity) and emotions increases the predictive power of the new technology acceptance model. The data from the empirical study show that technology complexity and consumer inertia can change consumer emotions (independently of performance benefits) and significantly affect innovative technology use. Finally, the study presents the superior predictability of the hypothesized model, which lets managers better predict and influence the successful diffusion of complex technological innovations.
Abstract: In this paper, we investigate the effect of a real-valued transformation of the spectral matrix of the received data on the angles-of-arrival estimation problem. Specifically, the Unitary transformation of the Partial Propagator (UPP) for narrowband sources is proposed and applied to a Uniform Linear Array (ULA).
Monte Carlo simulations demonstrate the performance of the UPP spectrum compared with the Forward-Backward Partial Propagator (FBPP) and the Unitary Propagator (UP). The results show that when some of the sources are fully correlated and closer than the Rayleigh angular resolution limit of the broadside array, the UPP method outperforms the FBPP in both spatial resolution and complexity.
Abstract: This paper presents feature-level image fusion using the Haar lifting wavelet transform. The fused features are edge and boundary information, obtained using the wavelet transform modulus maxima criterion. Simulation results show the superiority of the result, as entropy, gradient, and standard deviation are increased for the fused image compared to the input images. The proposed methods have the advantages of simplicity of implementation, a fast algorithm, perfect reconstruction, and reduced computational complexity. (The computational cost of the Haar wavelet is very small compared to other lifting wavelets.)
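One level of the lifting-scheme Haar transform, with its perfect-reconstruction property, can be sketched as follows (the sample signal is an arbitrary assumption):

```python
import numpy as np

def haar_lift(x):
    """One level of the lifting-scheme Haar transform."""
    even, odd = x[0::2].astype(float), x[1::2].astype(float)
    d = odd - even          # predict step: detail coefficients
    s = even + d / 2        # update step: approximation coefficients
    return s, d

def haar_unlift(s, d):
    """Exact inverse: undo the update, then the predict step."""
    even = s - d / 2
    odd = d + even
    out = np.empty(2 * len(s))
    out[0::2], out[1::2] = even, odd
    return out

x = np.array([9, 7, 3, 5, 6, 10, 2, 6])
s, d = haar_lift(x)          # approximation and detail bands
x_rec = haar_unlift(s, d)    # perfect reconstruction of x
```

In a fusion setting, the detail bands `d` of the two input images would be compared (e.g., by modulus maxima) and the stronger coefficients kept before inverting the transform.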
Abstract: To forecast the ecological effects of desertification, the purpose of this study is to develop a predictive model of the growth and adaptation of species under arid environmental and bioclimatic conditions. The impact of climate change and the desertification phenomenon is the result of the combined effects of the magnitude and frequency of these phenomena. Because the data involved in the phytopathogenic process and bacterial growth in arid soil occur in an uncertain environment, owing to their complexity, a suitable methodology for the analysis of these variables becomes necessary. The basic principles of fuzzy logic are perfectly suited to this process. As input variables, we consider the physical parameters, soil type, nature of the bacteria, and plant species concerned. The output variable is the adaptability of the species, expressed by the growth or extinction rate. In conclusion, we present possible strategies for adaptation, with or without shifting plantation areas, and the nature of adequate vegetation.
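The kind of fuzzy inference step such a methodology relies on can be sketched with a triangular membership function; the linguistic terms, breakpoints, and the rule below are illustrative assumptions, not the paper's actual rule base:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic terms and a single Mamdani-style rule:
# IF soil moisture is 'low' AND temperature is 'high' THEN adaptability is 'poor'
moisture_low = tri(12.0, 0, 10, 25)    # degree to which 12% moisture is 'low'
temp_high = tri(38.0, 30, 42, 50)      # degree to which 38 C is 'high'
rule_strength = min(moisture_low, temp_high)   # fuzzy AND = min
```

Aggregating the strengths of many such rules and defuzzifying the result would yield the crisp growth/extinction output described above.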
Abstract: This paper discusses two sides of the same coin
in the dynamics of crisis scenarios. On one side is the negative role
of subsidiary scenario branches, which weaken a scenario’s compactness
through unduly chaotic atomizing and many interactive
feedback cases, increasing its complexity.
This negative role reflects the complexity of use cases, weakening
leader compliancy, which yields something like a ‘readiness
for controlling capabilities provision’. The leader’s dissatisfaction has
zero compliancy, but in fact it is a ‘crossbar’ (effectively an interface)
between planning and executing use cases. On the other side of the
coin, the advantage of rich scenario branching can be seen
in its support of response awareness, readiness, preparedness,
adaptability, creativity, and flexibility. Here, rich scenario
branching contributes to the steadiness and resistance of scenario
mission actors. All of this will be presented in live PowerPoint
‘Blazons’, modelled via DYVELOP (Dynamic Vector Logistics
of Processes) at the conference.
Abstract: In this paper, it is proposed to improve DAISY descriptor-based face recognition using a novel One-Bit Transform (1BT) based pre-registration approach. The 1BT-based pre-registration procedure is fast and has low computational complexity. It is shown that face recognition accuracy is improved with the proposed approach. The proposed approach can facilitate highly accurate face recognition using the DAISY descriptor with simple matching, thereby enabling a low-complexity solution.
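A one-bit transform can be sketched as binarizing an image against a low-pass filtered version of itself; the local-mean kernel below is a simplified surrogate for the multi-bandpass kernel used in the 1BT literature, so treat it as an assumption:

```python
import numpy as np

def one_bit_transform(img, k=3):
    """Sketch of a 1BT: each pixel becomes 1 if it is at least the local
    mean over a k x k window, else 0. The local mean stands in for the
    multi-bandpass filter of the original 1BT formulation."""
    img = img.astype(float)
    pad = k // 2
    p = np.pad(img, pad, mode='edge')
    h, w = img.shape
    local_mean = np.zeros_like(img)
    for i in range(h):
        for j in range(w):
            local_mean[i, j] = p[i:i + k, j:j + k].mean()
    return (img >= local_mean).astype(np.uint8)
```

Because the output is a bit plane, aligning two such planes for pre-registration needs only XOR-and-count operations, which is why the procedure has low computational complexity.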
Abstract: Using the technology acceptance model (TAM), this
study examined the external variable of technological complexity
(TC) to acquire a better understanding of the factors that influence the
acceptance of computer application courses by learners at Active
Aging Universities. After the learners in this study had completed a
27-hour Facebook course, 44 learners responded to a modified TAM
survey. Data were collected to examine the path relationships among
the variables that influence the acceptance of Facebook-mediated
community learning. The partial least squares (PLS) method was used
to test the measurement and the structural model. The study results
demonstrated that attitudes toward Facebook use directly influence
behavioral intentions (BI) with respect to Facebook use, evincing a
high prediction rate of 58.3%. In addition to the perceived usefulness
(PU) and perceived ease of use (PEOU) measures that are proposed in
the TAM, other external variables, such as TC, also indirectly
influence BI. These four variables can explain 88% of the variance in
BI and demonstrate a high level of predictive ability. Finally,
limitations of this investigation and implications for further research
are discussed.
Abstract: The subject of this paper is the design analysis of a single-well power production unit for low-enthalpy geothermal resources. The complexity of the project is defined by the low-temperature heat source, which usually makes such projects economically disadvantageous under the conventional binary power plant approach. A proposed new compact design is numerically analyzed. This paper describes the thermodynamic analysis, the choice of working fluid, and the downhole heat exchanger (DHE) and turbine calculation results. The unit is able to produce 321 kW of electric power from a low-enthalpy underground heat source utilizing n-Pentane as a working fluid. A geo-pressured reservoir located in Vermilion Parish, Louisiana, USA is selected as a prototype for the field application. With a brine temperature of 126 °C, the optimal length of the DHE is determined to be 304.8 m (1,000 ft). All units (pipes, turbine, and pumps) are chosen from commercially available parts to bring this project closer to industry requirements. Numerical calculations are based on petroleum industry standards. The project is sponsored by the US Department of Energy.
Abstract: An accurate study of blood flow is associated with an accurate vascular pattern and the geometrical properties of the organ of interest. Due to the complexity of vascular networks and their poor accessibility in vivo, it is challenging to reconstruct the entire vasculature of any organ experimentally. The objective of this study is to introduce an innovative approach for the reconstruction of a full vascular tree from available morphometric data. Our method consists of applying morphometric data to those parts of the vascular tree that are smaller than the resolution of medical imaging methods. This technique reconstructs the entire arterial tree down to the capillaries. Vessels greater than 2 mm are obtained from direct volume and surface analysis using contrast-enhanced computed tomography (CT). Vessels smaller than 2 mm are reconstructed from available morphometric and distensibility data and rearranged by applying Murray’s law. Implementing morphometric data to reconstruct the branching pattern and applying Murray’s law to every vessel bifurcation simultaneously lead to an accurate vascular tree reconstruction. The reconstruction algorithm generates the full arterial tree topography down to the first capillary bifurcation. The geometry of each order of the vascular tree is generated separately to minimize construction and simulation time. The node-to-node connectivity, along with the diameter and length of every vessel segment, is established, and order numbers are assigned according to the diameter-defined Strahler system. During the simulation, we used the averaged flow rate for each order to predict the pressure drop, and once the pressure drop is predicted, the flow rate is corrected to match the computed pressure drop for each vessel. The final results for three cardiac cycles are presented and compared to clinical data.
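Murray's law, used above to rearrange the sub-resolution vessels, can be stated as a one-line computation; the daughter diameters below are hypothetical:

```python
# Murray's law: at a bifurcation the parent diameter cubed equals the sum of
# the daughter diameters cubed, d0^3 = d1^3 + d2^3. Here it assigns a parent
# vessel diameter from its two daughters while rebuilding the tree.
def murray_parent(d1, d2):
    return (d1 ** 3 + d2 ** 3) ** (1 / 3)

d_parent = murray_parent(1.2, 0.9)   # mm, hypothetical daughter diameters
```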
Abstract: With the current rise in the demand for electrical energy, present-day power systems, which are already large and complex, will continue to grow in both size and complexity. Flexible AC Transmission System (FACTS) controllers provide new facilities, both in steady-state power flow control and in dynamic stability control. The Thyristor Controlled Series Capacitor (TCSC) is one type of FACTS equipment, used for controlling the flow of active power in electric power systems and for increasing the capacity of transmission lines. In this paper, a Backstepping Power Flow Controller (BPFC) for a TCSC in a multimachine power system is developed and tested. The simulation results show that the proposed TCSC controller is capable of controlling the transmitted active power and improving transient stability when compared with a conventional PI Power Flow Controller (PIPFC).
Abstract: This paper proposes frequency offset (FO) estimation
schemes robust to non-Gaussian noise for orthogonal frequency
division multiplexing (OFDM) systems. A maximum-likelihood (ML)
scheme and a low-complexity estimation scheme are proposed by
applying the probability density function of the cyclic prefix of
OFDM symbols to the ML criterion. From simulation results, it is
confirmed that the proposed schemes offer a significant FO estimation
performance improvement over the conventional estimation scheme
in non-Gaussian noise environments.
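The cyclic-prefix redundancy the estimators exploit can be sketched in the noise-free case: each CP sample equals its copy one symbol later up to a phase of 2πε, so correlating the two recovers the offset. All parameter values below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
N, Ncp = 64, 16
eps_true = 0.08    # frequency offset, in units of the subcarrier spacing

# One OFDM symbol with cyclic prefix, rotated by the frequency offset
data = np.fft.ifft(rng.normal(size=N) + 1j * rng.normal(size=N))
sym = np.concatenate([data[-Ncp:], data])
n = np.arange(N + Ncp)
r = sym * np.exp(2j * np.pi * eps_true * n / N)

# Conventional CP-correlation estimate: a CP sample and its copy N samples
# later differ only by the phase 2*pi*eps (noise-free sketch)
corr = np.sum(r[:Ncp] * np.conj(r[N:N + Ncp]))
eps_hat = -np.angle(corr) / (2 * np.pi)
```

Under heavy-tailed (non-Gaussian) noise this plain correlation degrades, which motivates replacing the Gaussian likelihood with the noise's actual probability density function as the paper proposes.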
Abstract: The lignite-fired power plants in the Western Macedonia Lignite Center produce more than 8×10⁶ t of fly ash per year. Approximately 90% of this quantity is used for the restoration-reclamation of exhausted open-cast lignite mines and for slope stabilization of the overburden. The purpose of this work is to evaluate the environmental behavior of the mixture of waste rock and fly ash that is being used in the external deposition site of the South Field lignite mine. For this reason, a borehole was drilled within the site, and 86 samples were taken and subjected to chemical analyses and leaching tests. The results showed very limited leaching of trace elements and heavy metals from this mixture. Moreover, when compared to the limit values set for waste acceptable in inert waste landfills, only a few exceedances were observed, indicating only minor risk of groundwater pollution. However, due to the complexity of both the leaching process and the contaminant pathway, more boreholes and analyses should be made at nearby locations, and a systematic groundwater monitoring program should be implemented both downstream of and within the external deposition site.
Abstract: A dual-tiered network model is designed to overcome the problems of energy alert and fault tolerance. This model minimizes the delay time and overcomes link failures. The performance of the dual-tiered network model is analyzed in this paper, where the CA and LS schemes are compared with the DEO optimal. We then evaluate the Integrated Network Topological Control and Key Management (INTK) scheme, which was proposed to add security features to wireless sensor networks. Clustering efficiency, level of protection, and time complexity are some of the parameters of the INTK scheme that were analyzed. We then evaluate the Cluster-based Energy Competent n-coverage scheme (CEC n-coverage scheme) to ensure area coverage for wireless sensor networks.
Abstract: Shifted polynomial basis (SPB) is a variation of
polynomial basis representation. SPB has potential for efficient
bit-level and digit-level implementations of multiplication over
binary extension fields with subquadratic space complexity. For
efficient implementation of pairing computation with large finite
fields, this paper presents a new SPB multiplication algorithm based
on Karatsuba schemes, and uses it to derive a novel scalable
multiplier architecture. Analytical results show that the proposed
multiplier provides a trade-off between space and time complexities.
Our proposed multiplier is modular, regular, and suitable for very
large scale integration (VLSI) implementations. It involves less
area complexity compared to the multipliers based on traditional
decomposition methods. It is therefore more suitable for efficient
hardware implementation of pairing-based cryptography and elliptic
curve cryptography (ECC) in constraint-driven applications.
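The Karatsuba decomposition over GF(2)[x] that underlies such multipliers replaces four half-size products with three. A bit-packed software sketch of one split level (the split width `w` is an arbitrary assumption):

```python
def gf2_mul_schoolbook(a, b):
    """Carry-less (GF(2)[x]) multiplication of bit-packed polynomials."""
    r = 0
    while b:
        if b & 1:
            r ^= a       # addition in GF(2) is XOR
        a <<= 1
        b >>= 1
    return r

def gf2_mul_karatsuba(a, b, w=32):
    """One Karatsuba split: three half-size multiplications instead of four.
    (a1 x^w + a0)(b1 x^w + b0) = hi x^(2w) + mid x^w + lo, with XOR as +."""
    if a < (1 << w) or b < (1 << w):
        return gf2_mul_schoolbook(a, b)
    mask = (1 << w) - 1
    a0, a1 = a & mask, a >> w
    b0, b1 = b & mask, b >> w
    lo = gf2_mul_schoolbook(a0, b0)
    hi = gf2_mul_schoolbook(a1, b1)
    mid = gf2_mul_schoolbook(a0 ^ a1, b0 ^ b1) ^ lo ^ hi
    return (hi << (2 * w)) ^ (mid << w) ^ lo
```

Applying the split recursively yields the subquadratic space complexity mentioned above; a hardware multiplier realizes the same three-product structure with XOR trees instead of software loops.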
Abstract: To secure data from various attacks and to ensure data integrity, data must be encrypted before being transmitted or stored. This paper introduces a new, effective, and lossless image encryption algorithm using a natural logarithmic function. The new algorithm encrypts an image through a three-stage process. In the first stage, a reference natural logarithmic function is generated as the foundation of the encrypted image. The image numeral matrix is then decomposed into five integer numbers, and the numbers’ positions are transformed into matrices. The advantage of this method is that it efficiently encrypts a variety of digital images, such as binary, gray-scale, and RGB images, without any quality loss. The principles of the presented scheme could be applied to provide complexity, and thus security, for a variety of data systems, including images and others.