Abstract: Business and IT alignment has continued as a
top concern for business and IT executives for almost three
decades. Many researchers have conducted empirical studies on
the relationship between business-IT alignment and performance.
Yet, these approaches, lacking a social perspective, have had little
impact on sustaining performance and competitive advantage. In
addition to the limited alignment literature that explores
organisational learning that is represented in shared understanding,
communication, cognitive maps and experiences.
Hence, this paper proposes an integrated process that enables
social and intellectual dimensions through the concept of
organisational learning. In particular, the feedback and feedforward
processes provide value creation across dynamic,
multilevel learning. This mechanism enables on-going
effectiveness through the development of individuals, groups and
organisations, which improves the quality of business and IT
strategies and drives performance.
Abstract: The wavelet transform is a very powerful tool for image compression. One of its advantages is the provision of both spatial and frequency localization of image energy. However, wavelet transform coefficients are defined by both a magnitude and a sign. While algorithms exist for efficiently coding the magnitude of the transform coefficients, they are not efficient for coding their sign. It is generally assumed that there is no compression gain to be obtained from coding the sign. Only recently have some authors begun to investigate the sign of wavelet coefficients in image coding. Some authors have assumed that the sign information bit of wavelet coefficients may be encoded with an estimated probability of 0.5; the same assumption concerns the refinement information bit. In this paper, we propose a new method for Separate Sign Coding (SSC) of wavelet image coefficients. The sign and the magnitude of wavelet image coefficients are examined to obtain their online probabilities. We use scalar quantization, in which whether the wavelet coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also examined. We show that the sign information and the refinement information may be encoded with a probability of approximately 0.5 only after about five bit planes. Two maps are separately entropy encoded: the sign map and the magnitude map. The refinement information indicating whether the wavelet coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also entropy encoded. An algorithm is developed and simulations are performed on three standard grey-scale images: Lena, Barbara and Cameraman. Five scales are computed using the biorthogonal 9/7 wavelet filter bank. The obtained results are compared to the JPEG2000 standard in terms of peak signal-to-noise ratio (PSNR) for the three images and in terms of subjective (visual) quality.
It is shown that the proposed method outperforms JPEG2000. The proposed method is also compared to other codecs in the literature and is shown to be very successful in terms of PSNR.
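As a rough illustration of the separate sign coding idea, the sketch below splits coefficients into a sign map and a magnitude map and estimates the sign's online probability. The synthetic Gaussian coefficients with a mild bias are an invented stand-in for a real wavelet subband, not the paper's data; the point is that whenever the estimated sign probability deviates from 0.5, the binary entropy drops below one bit per sign, which is where a coding gain can come from.

```python
import math
import random

def separate_sign_magnitude(coeffs):
    """Split coefficients into a sign map (0 = non-negative, 1 = negative)
    and a magnitude map, as in separate sign coding."""
    signs = [0 if c >= 0 else 1 for c in coeffs]
    mags = [abs(c) for c in coeffs]
    return signs, mags

def binary_entropy(p):
    """Entropy in bits of a binary source with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

random.seed(0)
# Toy coefficients with a mild positive bias (a placeholder for the
# statistical structure real subbands exhibit).
coeffs = [random.gauss(0.3, 1.0) for _ in range(10000)]
signs, mags = separate_sign_magnitude(coeffs)
p_neg = sum(signs) / len(signs)
bits_per_sign = binary_entropy(p_neg)
# bits_per_sign < 1 implies a gain over the fixed-0.5 assumption.
```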
Abstract: With the rapid popularization of internet services, it is apparent that the next generation terrestrial communication systems must be capable of supporting various applications like voice, video, and data. This paper presents the performance evaluation of turbo-coded mobile terrestrial communication systems, which are capable of providing high quality services for delay sensitive (voice or video) and delay tolerant (text transmission) multimedia applications in urban and suburban areas. Different types of multimedia information require different service qualities, which are generally expressed in terms of a maximum acceptable bit-error-rate (BER) and maximum tolerable latency. The breakthrough discovery of turbo codes allows us to significantly reduce the probability of bit errors with feasible latency. In a turbo-coded system, a trade-off between latency and BER results from the choice of convolutional component codes, interleaver type and size, decoding algorithm, and the number of decoding iterations. This trade-off can be exploited for multimedia applications by using optimal and suboptimal performance parameter amalgamations to achieve different service qualities. The results therefore suggest an adaptive framework for turbo-coded wireless multimedia communications which incorporates a set of performance parameters that achieve an appropriate set of service qualities, depending on the application's requirements.
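The adaptive framework can be caricatured as a lookup over pre-characterized parameter sets. In the hedged sketch below, the profiles, BER figures and latencies are invented placeholders (a real system would obtain them from turbo-code simulations); the selection logic simply picks the lowest-latency profile meeting the application's QoS constraints.

```python
# Hypothetical turbo-code parameter profiles, ordered by increasing latency.
# The BER and latency numbers are illustrative assumptions only.
PROFILES = [
    {"iters": 2, "interleaver": 400,  "ber": 1e-3, "latency_ms": 20},
    {"iters": 4, "interleaver": 1600, "ber": 1e-5, "latency_ms": 60},
    {"iters": 8, "interleaver": 6400, "ber": 1e-7, "latency_ms": 200},
]

def select_profile(max_ber, max_latency_ms):
    """Return the first (lowest-latency) profile satisfying both the
    maximum acceptable BER and the maximum tolerable latency."""
    for p in PROFILES:
        if p["ber"] <= max_ber and p["latency_ms"] <= max_latency_ms:
            return p
    return None  # no profile meets the requirements

voice = select_profile(max_ber=1e-3, max_latency_ms=50)   # delay-sensitive
data = select_profile(max_ber=1e-6, max_latency_ms=500)   # delay-tolerant
```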
Abstract: Effective estimation of software development effort is an important issue during project planning. This study provides a model to predict development effort based on the software size estimated with function points. We generalize the average amount of effort spent on each phase of the development, and give the estimates for the effort used in software building, testing, and implementation. Finally, this paper finds a strong correlation between software defects and software size. As the size of software constantly increases, quality remains a matter of major concern.
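A minimal sketch of a function-point-based effort model of this kind, assuming an invented productivity rate and phase breakdown (not the paper's fitted values):

```python
# Assumed placeholders: productivity and phase shares are illustrative only.
HOURS_PER_FP = 8.0                 # assumed average effort per function point
PHASE_SHARE = {"building": 0.50, "testing": 0.30, "implementation": 0.20}

def estimate_effort(function_points):
    """Distribute total estimated effort (hours) across development phases."""
    total = function_points * HOURS_PER_FP
    return {phase: total * share for phase, share in PHASE_SHARE.items()}

effort = estimate_effort(300)      # e.g. a 300-function-point project
total_hours = sum(effort.values())
```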
Abstract: This article proposes a voltage-mode
multifunction filter using differential voltage current
controllable current conveyor transconductance amplifier
(DV-CCCCTA). The features of the circuit are that the
quality factor and pole frequency can be tuned independently
via the values of the capacitors, and that the circuit description is very
simple, consisting of merely one DV-CCCCTA and two
capacitors. Requiring no component matching conditions, the
proposed circuit is very appropriate for further development into
an integrated circuit. Additionally, each function response
can be selected by suitably choosing the input signals with a
digital method. PSpice simulation results are depicted.
The given results agree well with the theoretical anticipation.
Abstract: In this paper, we propose a block-wise watermarking scheme for color image authentication to resist malicious tampering of digital media. A thresholding technique is incorporated into the scheme such that the tampered region of the color image can be recovered with high quality while the proofing result is obtained. The watermark for each block consists of its dual authentication data and the corresponding feature information. The feature information for recovery is computed by the thresholding technique. In the proofing process, we propose a dual-option parity check method to verify the validity of image blocks. In the recovery process, the feature information of each block embedded into the color image is rebuilt for high-quality recovery. The simulation results show that the proposed watermarking scheme can effectively detect the tampered region with a high detection rate and can recover the tampered region with high quality.
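The dual-option parity check is not specified in detail in the abstract; the following is a simplified stand-in, not the paper's method. It computes two independent parities per block (row-wise and column-wise, over the pixel most-significant bits) and flags a block whose recomputed parities disagree with the stored ones.

```python
def block_parity_bits(block):
    """Two parity options over a square pixel block: row-wise and
    column-wise parities of the most significant bits.
    (A simplified stand-in for a dual-option parity check.)"""
    n = len(block)
    row_parity = [sum(px >> 7 for px in row) % 2 for row in block]
    col_parity = [sum(block[r][c] >> 7 for r in range(n)) % 2
                  for c in range(n)]
    return row_parity, col_parity

def authenticate(block, stored_row, stored_col):
    """A block is valid only if both recomputed parities match."""
    row, col = block_parity_bits(block)
    return row == stored_row and col == stored_col

block = [[200, 10], [90, 250]]
rp, cp = block_parity_bits(block)        # parities embedded at watermark time
ok = authenticate(block, rp, cp)         # untouched block passes

tampered = [row[:] for row in block]
tampered[0][0] = 5                       # a bright pixel turned dark
bad = authenticate(tampered, rp, cp)     # tampering detected
```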
Abstract: Large volumes of fingerprints are collected and stored
every day in a wide range of applications, including forensics, access
control, etc. This is evident from the database of the Federal Bureau of
Investigation (FBI), which contains more than 70 million
fingerprints. Compression of this database is very important because of its
high volume. The performance of existing image coding standards
generally degrades at low bit-rates because of the underlying block
based Discrete Cosine Transform (DCT) scheme. Over the past
decade, the success of wavelets in solving many different problems
has contributed to its unprecedented popularity. Due to
implementation constraints, scalar wavelets do not possess all the
properties needed for better performance in compression.
A new class of wavelets, called 'multiwavelets', which possess more
than one scaling filter, overcomes this problem. The objective of this
paper is to develop an efficient compression scheme and to obtain
better quality and a higher compression ratio through the multiwavelet
transform and embedded coding of multiwavelet coefficients with
the Set Partitioning In Hierarchical Trees (SPIHT) algorithm.
A comparison of the best known multiwavelets is made to the best
known scalar wavelets. Both quantitative and qualitative measures of
performance are examined for fingerprints.
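One of the standard quantitative measures of compression performance is PSNR; a minimal self-contained implementation for 8-bit greyscale images:

```python
import math

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio (dB) between two equal-size
    greyscale images given as 2-D lists of pixel values."""
    flat_o = [p for row in original for p in row]
    flat_r = [p for row in reconstructed for p in row]
    mse = sum((a - b) ** 2 for a, b in zip(flat_o, flat_r)) / len(flat_o)
    if mse == 0:
        return float("inf")      # identical images
    return 10.0 * math.log10(peak ** 2 / mse)

orig = [[100, 120], [130, 140]]
recon = [[101, 119], [131, 139]]   # every pixel off by 1, so MSE = 1
value = psnr(orig, recon)
```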
Abstract: The objectives of this research paper were to study the
influencing factors that contributed to the success of electronic
commerce (e-commerce) and to study the approach to enhance the
standard of e-commerce for small and medium enterprises (SME).
The research focused only on sole proprietorship
SMEs in Bangkok, Thailand. The factors contributing to the success
of SMEs included business management, learning in the organization,
business collaboration, and the quality of the website. A mixed quantitative
and qualitative research methodology was used. For the
quantitative method, a questionnaire was used to collect data from
251 sole proprietorships, and the Structural Equation Model (SEM) was
utilized as the tool for data analysis. For the qualitative method,
an in-depth interview, a dialogue with experts in the field of e-commerce
for SMEs, and content analysis were used.
By using the adjusted causal relationship structural model, the
factors affecting the success of e-commerce for
SMEs were found to be congruent with the empirical data. The
hypothesis testing indicated that business management influenced the
learning in the organization, the learning in the organization
influenced business collaboration and the quality of the website, and
these factors, in turn, influenced the success of SMEs. Moreover, the
approach to enhance the standard of SMEs revealed that the majority
of respondents wanted to enhance the standard of SMEs to a high
level in the category of safety of e-commerce system, basic structure
of e-commerce, development of staff potentials, assistance of budget
and tax reduction, and law improvement regarding the e-commerce
respectively.
Abstract: With respect to the dissipation of energy through
plastic deformation of joints of prefabricated wall units, the paper
points out the principal importance of efficient reinforcement of the
prefabricated system at its joints. The method, quality and amount of
reinforcement are essential for reaching the necessary degree of joint
ductility. The paper presents partial results of experimental research
on vertical joints of prefabricated units exposed to monotonically
rising loading and repeated shear force, and formulates the conclusion
that the limit state of the structure as a whole is preceded by the
disintegration of joints, or that the structure tends to pass from
linearly elastic through non-linearly elastic to plastic behaviour
by exceeding the proportional elastic limit in the joints. Experimental
verification on a model of a 7-storey prefabricated structure revealed
weak points in its load-bearing systems, mainly at places of critical
points around openings situated in close proximity to vertical joints
of mutually perpendicularly oriented walls.
Abstract: In today's globalized production and logistics
environment, manufacturers need to reduce product development intervals
and lead times, respond faster to orders, conform to quality
standards, ensure fair tracking, boost information-exchange
activities with customers and partners, and cope with changes in the
management environment; they are therefore in dire need of an
information management system in their manufacturing environments.
Many information systems have been designed to
manage the condition or operation of equipment in the field, but
existing systems have a decentralized architecture, which is not
unified. Also, these systems cannot effectively handle the status-data
extraction process when encountering a problem related to protocols or
changes in the equipment or its settings. In this regard, this paper will
introduce a system for processing and saving the status information of
production equipment, which uses standard representation formats, to
enable flexible responses to and support for variables in the field
equipment. This system can be used for a variety of manufacturing and
equipment settings and is capable of interacting with higher-tier
systems such as MES.
Abstract: For smaller mechatronic devices, especially micro-
electronic systems, micro machining is a must. However, most
investigations on the vibration of a mill have been limited to the
traditional type of mill. In this article, the vibration and dynamic
characteristics of a micro mill are investigated. The
trend towards higher precision manufacturing technology requires
producing miniaturized components. To improve micro-milled
product quality, obtain a higher production rate and avoid milling
breakage, the dynamic characteristics of micro milling must be
studied. A stepped pre-twisted mill is used to simulate the micro mill.
The finite element analysis is employed in this work. The flute length
and diameter effects of the micro mill are considered. The results
show that the effects of the mill's shape parameters on its
vibration are significant.
Abstract: In this paper, an automated algorithm to estimate and remove the continuous baseline from measured spectra containing both continuous and discontinuous bands is proposed. The algorithm uses prior information contained in a Continuous Database of Spectra (CDBS) to obtain a linear basis, with a minimum number of sampled vectors, capable of representing a continuous baseline. The proposed algorithm was tested by using a CDBS of flame spectra, where Principal Component Analysis and Non-negative Matrix Factorization were used to obtain the linear bases. Thus, the radical emissions of natural gas, oil and bio-oil flame spectra at different combustion conditions were obtained. In order to validate the performance of the baseline estimation process, the Goodness-of-fit Coefficient and the Root Mean-squared Error quality metrics were evaluated between the estimated and the real spectra in the absence of discontinuous emission. The achieved results make the proposed method a key element in the development of strategies for automatic process monitoring involving discontinuous spectral bands.
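The core step, representing the continuous baseline as a linear combination of basis vectors and fitting it by least squares, can be sketched as follows. For illustration, a two-vector basis (constant plus ramp) and a synthetic spectrum stand in for the PCA/NMF basis and real flame spectra used in the paper; the normal equations are solved in closed form for the 2x2 case.

```python
def fit_baseline(spectrum, basis):
    """Least-squares fit of the spectrum onto a two-vector basis via
    closed-form 2x2 normal equations; returns the estimated baseline."""
    b1, b2 = basis
    a11 = sum(x * x for x in b1)
    a12 = sum(x * y for x, y in zip(b1, b2))
    a22 = sum(y * y for y in b2)
    r1 = sum(x * s for x, s in zip(b1, spectrum))
    r2 = sum(y * s for y, s in zip(b2, spectrum))
    det = a11 * a22 - a12 * a12
    c1 = (r1 * a22 - r2 * a12) / det
    c2 = (r2 * a11 - r1 * a12) / det
    return [c1 * x + c2 * y for x, y in zip(b1, b2)]

n = 100
basis = ([1.0] * n, [i / n for i in range(n)])        # constant + ramp
true_baseline = [2.0 + 0.5 * i / n for i in range(n)]
spectrum = true_baseline[:]
spectrum[40] += 10.0                                  # one discontinuous band
est = fit_baseline(spectrum, basis)
residual = [s - e for s, e in zip(spectrum, est)]     # band survives removal
```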
Abstract: Perennial ryegrass (Lolium perenne L.) plants are cultivated for lawn constitution and as forage plants. A considerable number of perennial ryegrass genotypes are present in the flora of our country, and they present substantial variation. This study was performed within a project supported by TUBITAK (project number: 106O159), and perennial ryegrass genotypes from 8 provinces were collected during 2006. Seeds of perennial ryegrass were collected from 48 different locations. Seeds of each population were sown in flowerpots (20 seeds per pot, 1 cm deep) in the greenhouse in three replications on 07.07.2007. Seedlings that had grown sufficiently in the greenhouse pots were then separated and planted out for each population. The plants planted in the garden were evaluated for quality on a 1-9 observation scale, where 1 = the weakest/worst, 6 = acceptable and 9 = superior or ideal. Quality was assessed essentially by the colour of the grass, but colour, density, uniformity, texture, and reaction to disease or environmental stresses were evaluated in combination. Turfgrass quality was scored six times, on 15.11.2007, 19.03.2008, 27.05.2008, 27.11.2008, 07.03.2009 and 02.06.2009. According to the quality observations over the 3 years of seasonal environments, genotypes belonging to 14 different populations scored 7.5 and above and were reserved for future use in breeding work. The genotypes of population number 41 had the highest 3-year average seasonal turfgrass quality, 7.9. Population number 41 was collected between Argıthan and Doğanhisar (Konya), at latitude 38.09, longitude 31.40 and altitude 1158 m.
Abstract: This paper presents the results of the authors in designing, experimenting, assessing and transferring an innovative approach to energy education in secondary schools, aimed at enhancing the quality of learning in terms of didactic curricula and pedagogic methods. The training is delivered online to youngsters via e-Books and portals specially designed for this purpose, or by learning by doing via interactive games. An online educational methodology is available to teachers.
Abstract: This paper presents an approach based on the
adoption of a distributed cognition framework and a non-parametric
multicriteria evaluation methodology (DEA) designed specifically to
compare e-commerce websites from the consumer/user viewpoint. In
particular, the framework considers a website's relative efficiency as a
measure of its quality and usability. A website is modelled as a black
box capable of providing the consumer/user with a set of
functionalities. When the consumer/user interacts with the website to
perform a task, he/she is involved in a cognitive activity, sustaining a
cognitive cost to search, interpret and process information, and
experiencing a sense of satisfaction. The degree of ambiguity and
uncertainty he/she perceives and the needed search time determine
the effort, and hence the cognitive cost, he/she
has to sustain to perform his/her task. Conversely, performing the
task and achieving its result induce a sense of gratification,
satisfaction and usefulness. In total, 9 variables are measured,
classified in a set of 3 website macro-dimensions (user experience,
site navigability and structure). The framework is implemented to
compare 40 websites of businesses performing electronic commerce
in the information technology market. A questionnaire to collect
subjective judgements for the websites in the sample was purposely
designed and administered to 85 university students enrolled in
computer science and information systems engineering
undergraduate courses.
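In the special case of a single aggregate input (cognitive cost) and a single aggregate output (satisfaction), CCR-type DEA efficiency reduces to normalising each unit's output/input ratio by the best observed ratio; the general multi-variable case requires a linear program per site. The sketch below uses invented site names and numbers, not the study's data.

```python
# Hypothetical decision-making units: one input (cost), one output.
sites = {
    "site_a": {"cost": 4.0, "satisfaction": 8.0},
    "site_b": {"cost": 5.0, "satisfaction": 6.0},
    "site_c": {"cost": 2.0, "satisfaction": 5.0},
}

# Single-input/single-output DEA: ratio normalised by the best ratio.
ratios = {name: d["satisfaction"] / d["cost"] for name, d in sites.items()}
best = max(ratios.values())
efficiency = {name: r / best for name, r in ratios.items()}
# Sites with efficiency 1.0 lie on the efficient frontier.
```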
Abstract: Fifteen cultivars of strawberry (Queen Eliza, Sequia,
Paros, Mcdonance, Selva, Chandler, Mrak, Ten beauty, Aliso, Pajero,
Kordestan, Camarosa, Blackmore, Gaviota and Fresno) were
investigated in 2011 under hydroponic system conditions. Yield and
fruit firmness were determined. Chemical analyses of soluble solids
content (SSC), titratable acidity (TA), ascorbic acid (AA) and pH
were done. Four cultivars (Aliso, Selva, Paros and Gaviota) yielded more
than 250 g/plant, while cultivars Blackmore, Fresno and Kordestan
produced less than 100 g/plant. The fruit firmness measurements
indicated that 'Camarosa' fruit was firmer than the other cultivars.
Cultivar 'Fresno' had the highest pH (3.27). Titratable acidity varied
from 1.03 g/100 g for cultivars 'Sequia' and 'Gaviota' to 1.48 g/100 g for
cultivar 'Chandler'. Fresno, Kordestan, Aliso and Chandler showed
the highest soluble solid concentrations. Ascorbic acid averaged, for
most cultivars, between 30.26 and 79.73 mg/100 g f.w. The present results
showed that different cultivars of strawberry are highly variable
in fruit quality.
Abstract: The main purpose of this research is the calculation of implicit prices of the environmental level of air quality in the city of Moscow on the basis of housing property prices. The database used contains records of approximately 20 thousand apartments and has been provided by a leading real estate agency operating in Russia. The explanatory variables include physical characteristics of the houses, environmental data (industry emissions), neighbourhood sociodemographic data, and geographic data (GPS coordinates of each house). The hedonic regression results for the ecological variables show "negative" prices with increasing levels of air contamination from such substances as carbon monoxide, nitrogen dioxide, sulphur dioxide, and particles (CO, NO2, SO2, TSP). The marginal willingness to pay for higher environmental quality is presented for linear and log-log models.
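For the log-log specification, the implicit (marginal) price of a pollutant follows directly from the estimated elasticity: if ln P = a + β ln X + ..., then ∂P/∂X = βP/X. The sketch below uses an invented coefficient and prices, not the paper's estimates.

```python
def marginal_implicit_price(beta, price, pollutant_level):
    """In a log-log hedonic model ln(P) = a + beta*ln(X) + ...,
    the implicit price of pollutant X at (price, level) is beta * P / X."""
    return beta * price / pollutant_level

# Hypothetical values: an assumed -0.15 elasticity of apartment price
# with respect to CO concentration.
beta = -0.15
price = 200000.0            # apartment price (currency units)
co = 2.0                    # pollutant concentration (arbitrary units)
implicit = marginal_implicit_price(beta, price, co)
# Negative: buyers pay less as contamination rises, i.e. a positive
# willingness to pay for cleaner air.
```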
Abstract: The shortest path routing problem is a multiobjective
nonlinear optimization problem with constraints. This problem has
been addressed by considering the Quality of Service parameters delay
and cost as separate objectives or as a weighted sum of both
objectives. Multiobjective evolutionary algorithms can find multiple
Pareto-optimal solutions in a single run, and this ability makes them
attractive for solving problems with multiple and conflicting
objectives. This paper uses an elitist multiobjective evolutionary
algorithm based on the Non-dominated Sorting Genetic Algorithm
(NSGA), for solving the dynamic shortest path routing problem in
computer networks. A priority-based encoding scheme is proposed
for population initialization. Elitism ensures that the best solution
does not deteriorate in the next generations. Results for a sample test
network have been presented to demonstrate the capabilities of the
proposed approach to generate well-distributed Pareto-optimal
solutions of the dynamic routing problem in a single run. The results
obtained by NSGA are compared with the single-objective weighting-
factor method, for which a Genetic Algorithm (GA) was applied.
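The first ranking step of NSGA is sorting candidate solutions into Pareto fronts. A minimal non-dominated sort for (delay, cost) pairs, both minimised, might look like this (a quadratic reference version, not NSGA's optimised bookkeeping):

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (minimisation of both delay and cost)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated_sort(points):
    """Rank candidate paths into successive Pareto fronts."""
    fronts, remaining = [], list(points)
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# Hypothetical (delay, cost) values for candidate routing paths.
paths = [(2, 9), (3, 5), (5, 4), (4, 6), (6, 8)]
fronts = non_dominated_sort(paths)
# fronts[0] is the Pareto-optimal set among the candidates.
```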
Abstract: In image processing, image compression can improve
the performance of digital systems by reducing the cost and
time of image storage and transmission without significant reduction
of image quality. This paper describes a hardware architecture for a
low-complexity Discrete Cosine Transform (DCT) for
image compression [6]. In this DCT architecture, common computations
are identified and shared to remove redundant computations
in DCT matrix operation. Vector processing is a method used for
implementation of DCT. This reduction in computational complexity
of 2D DCT reduces power consumption. The 2D DCT is performed
on 8x8 matrix using two 1-Dimensional Discrete cosine transform
blocks and a transposition memory [7]. Inverse discrete cosine
transform (IDCT) is performed to obtain the image matrix and
reconstruct the original image. The proposed image compression
algorithm is validated using MATLAB code. The VLSI design
of the architecture is implemented using Verilog HDL. The proposed
hardware architecture for image compression employing DCT was
synthesized using RTL Compiler and mapped using 180 nm
standard cells. The simulation is done using ModelSim, and the
simulation results from MATLAB and Verilog HDL are compared.
Detailed analysis of power and area was done using RTL Compiler
from CADENCE. The power consumption of the DCT core is reduced to
1.027 mW with minimum area [1].
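The row-column decomposition (a 2-D DCT computed as two 1-D DCT passes separated by a transposition) can be sketched in plain Python. This is a floating-point reference computation of the orthonormal DCT-II, not the paper's fixed-point hardware datapath.

```python
import math

def dct_1d(x):
    """Orthonormal 1-D DCT-II of a sequence."""
    n = len(x)
    out = []
    for k in range(n):
        s = sum(x[i] * math.cos(math.pi * (i + 0.5) * k / n)
                for i in range(n))
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(scale * s)
    return out

def dct_2d(block):
    """2-D DCT as two 1-D passes: rows first, then (via transposition)
    columns, mirroring the two 1-D blocks plus transposition memory."""
    rows = [dct_1d(r) for r in block]
    cols = [dct_1d(col) for col in zip(*rows)]
    return [list(r) for r in zip(*cols)]

# A flat 8x8 block concentrates all energy in the DC coefficient.
flat = dct_2d([[1.0] * 8 for _ in range(8)])
```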
Abstract: The design of a pattern classifier includes an attempt
to select, among a set of possible features, a minimum subset of
weakly correlated features that better discriminate the pattern classes.
This is usually a difficult task in practice, normally requiring the
application of heuristic knowledge about the specific problem
domain. The selection and quality of the features representing each
pattern have a considerable bearing on the success of subsequent
pattern classification. Feature extraction is the process of deriving
new features from the original features in order to reduce the cost of
feature measurement, increase classifier efficiency, and allow higher
classification accuracy. Many current feature extraction techniques
involve linear transformations of the original pattern vectors to new
vectors of lower dimensionality. While this is useful for data
visualization and increasing classification efficiency, it does not
necessarily reduce the number of features that must be measured
since each new feature may be a linear combination of all of the
features in the original pattern vector. In this paper a new approach is
presented to feature extraction in which feature selection, feature
extraction, and classifier training are performed simultaneously using
a genetic algorithm. In this approach each feature value is first
normalized by a linear equation, then scaled by the associated weight
prior to training, testing, and classification. A knn classifier is used to
evaluate each set of feature weights. The genetic algorithm optimizes
a vector of feature weights, which are used to scale the individual
features in the original pattern vectors in either a linear or a nonlinear
fashion. By this approach, the number of features used in
classification can be significantly reduced.
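A compressed sketch of the weighted-knn fitness idea: the loop below is a (1+1)-style evolutionary search, a stand-in for the full GA with a population and crossover, that mutates a feature-weight vector and keeps mutations that do not decrease leave-one-out 1-NN accuracy. The two-feature toy data (one informative feature, one noise feature) are invented for illustration.

```python
import random

def knn_accuracy(weights, data, labels):
    """Leave-one-out 1-NN accuracy using weighted squared distance."""
    correct = 0
    for i, x in enumerate(data):
        best_d, best_j = None, None
        for j, y in enumerate(data):
            if i == j:
                continue
            d = sum(w * (a - b) ** 2 for w, a, b in zip(weights, x, y))
            if best_d is None or d < best_d:
                best_d, best_j = d, j
        correct += labels[best_j] == labels[i]
    return correct / len(data)

random.seed(1)
# Toy data: feature 0 separates the two classes, feature 1 is pure noise.
data = ([(random.gauss(0.0, 0.1), random.gauss(0, 1)) for _ in range(20)]
        + [(random.gauss(1.0, 0.1), random.gauss(0, 1)) for _ in range(20)])
labels = [0] * 20 + [1] * 20

weights = [1.0, 1.0]
fitness = knn_accuracy(weights, data, labels)
for _ in range(50):
    cand = [max(0.0, w + random.gauss(0, 0.3)) for w in weights]
    f = knn_accuracy(cand, data, labels)
    if f >= fitness:                 # keep non-worsening mutations
        weights, fitness = cand, f
# Near-zero final weights indicate features that could be dropped.
```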