Abstract: The literature offers metrics for assessing the quality of
reusable components, but a framework that uses these metrics to
precisely predict the reusability of software components still needs
to be worked out. If reusability is assessed in the design phase, or
even in the coding phase, rework can be reduced by improving the
quality of reuse of the software component, and productivity improved
through a probable increase in the reuse level. Since the CK metric
suite is the most widely used set of metrics for extracting structural
features of object-oriented (OO) software, this study uses a tuned CK
metric suite (WMC, DIT, NOC, CBO and LCOM) to obtain a structural
analysis of OO software components. An algorithm is proposed in which
the tuned metric values of an OO software component are given as
inputs to a K-Means clustering system, and a decision tree is built,
with 10-fold cross-validation of the data, to evaluate the component
in terms of a linguistic reusability value. The developed reusability
model produced results with the desired high precision.
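As an illustration of the clustering step, the sketch below groups components by their tuned CK metric vectors using a minimal k-means implementation. The metric values and the `kmeans` helper are hypothetical stand-ins for illustration, not the authors' tuned algorithm:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means over tuned CK metric vectors
    (WMC, DIT, NOC, CBO, LCOM), one tuple per component."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = []
    for _ in range(iters):
        # assign every component to its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[d.index(min(d))].append(p)
        # move each centroid to the mean of its cluster
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = tuple(sum(v) / len(cl) for v in zip(*cl))
    return centroids, clusters

# two clearly separated groups of hypothetical metric vectors
low = [(5, 1, 0, 2, 10), (6, 1, 1, 3, 12)]
high = [(40, 5, 8, 20, 80), (42, 6, 7, 22, 85)]
cents, cls = kmeans(low + high, k=2)
```

The resulting clusters could then be labeled with linguistic reusability values and fed to a decision tree learner.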
Abstract: Although the automotive industry has brought various benefits to human life, it is pointed to as one of the major causes of global air pollution, contributing to climate change, smog, greenhouse gases (GHGs), and human disease in many ways. Since the auto industry is one of the largest consumers of fossil fuels, realizing green innovations is becoming a crucial choice for meeting the challenges of sustainable development. Recently, many auto manufacturers have embarked on green technology initiatives to gain a competitive advantage in the global market; however, innovative manufacturing systems and technologies can enhance operational performance only if human resource management is in place to elicit the motivation of employees and develop their organizational expertise. No organization can perform at peak levels unless each employee is committed to the company's goals and works as an effective team member. Strategic human resource practices are the primary means by which firms can shape the skills, attitudes, and behavior of individuals to align with business strategic objectives. This study investigates the comprehensive approach to multiple advanced technology innovations and human resource management at Toyota Motor Corporation, the market leader in full hybrid technology in the automotive industry. The HRM framework of the company is then described, and three sets of human resource practices that support the innovation-oriented HR system are presented. Finally, a conceptual framework for innovativeness in green technology in the automotive industry is proposed, applying a deliberate strategic HR management system and knowledge management with the intervening factors of organizational culture, knowledge application and knowledge sharing.
Abstract: In this paper, the construction of a detailed spine
model is presented using the LifeMOD Biomechanics Modeler. The
detailed spine model is obtained by refining spine segments in
cervical, thoracic and lumbar regions into individual vertebra
segments, using bushing elements representing the intervertebral
discs, and building various ligamentous soft tissues between
vertebrae. In the sagittal plane of the spine, a constant force will
be applied from posterior to anterior during simulation to determine
the dynamic characteristics of the spine. The force magnitude is
gradually increased in subsequent simulations. Based on these
recorded dynamic properties, graphs of displacement-force
relationships will be established in terms of polynomial functions by
using the least-squares method and imported into a haptic integrated
graphic environment. A thoracolumbar spine model with complex
geometry of vertebrae, which is digitized from a resin spine
prototype, will be utilized in this environment. By using the haptic
technique, surgeons can touch as well as apply forces to the spine
model through haptic devices to observe the locomotion of the spine
which is computed from the displacement-force relationship graphs.
This current study provides a preliminary picture of our ongoing
work towards building and simulating bio-fidelity scoliotic spine
models in a haptic integrated graphic environment whose dynamic
properties are obtained from LifeMOD. These models can be helpful
for surgeons to examine kinematic behaviors of scoliotic spines and
to propose possible surgical plans before spine correction operations.
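The displacement-force fitting step described above can be illustrated with an ordinary least-squares polynomial fit. The force and displacement samples below are synthetic stand-ins, not LifeMOD output:

```python
import numpy as np

# hypothetical displacement (mm) vs. applied anterior force (N) samples;
# values are illustrative, not from the LifeMOD simulations
force = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
disp = 0.002 * force ** 2 + 0.15 * force        # synthetic ground truth

# least-squares polynomial fit, as used for the haptic lookup
coeffs = np.polyfit(force, disp, deg=2)         # highest power first

def displacement(f, c=coeffs):
    """Evaluate the fitted displacement-force relationship."""
    return np.polyval(c, f)
```

In the haptic environment, a function like `displacement` would map the force applied through the haptic device to the spine motion to be rendered.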
Abstract: The absorptive characteristics of polyaniline synthesized
in a 50/50 (v/v) water-acetonitrile mixture were studied. The
synthesized polyaniline powder was used as an adsorbent to remove
toxic hexavalent chromium from aqueous solutions. Experiments were
conducted in batch mode with different variables such as agitation
time, solution pH and initial concentration of hexavalent chromium.
The removal mechanism is a combination of surface adsorption and
reduction. The equilibrium time for removal of total chromium, Cr(T),
and Cr(VI) was about 2 and 10 minutes, respectively. The optimum pH
for total chromium removal was pH 7, and maximum hexavalent chromium
removal took place under acidic conditions at pH 3. Investigation of
the isothermal characteristics showed that the equilibrium adsorption
data fitted both the Freundlich and Langmuir isotherms. The maximum
adsorption of chromium on polyaniline was calculated to be 36.1 mg/g.
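The Langmuir fit mentioned above can be sketched with the usual linearized form Ce/qe = Ce/q_max + 1/(q_max*K_L). The equilibrium data below are synthetic, generated from an assumed capacity and affinity constant rather than the experimental measurements:

```python
import numpy as np

# hypothetical equilibrium data: Ce (mg/L), qe (mg/g); generated from
# an assumed Langmuir capacity, not the experimental measurements
q_max_true, K_L_true = 36.1, 0.05
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
qe = q_max_true * K_L_true * Ce / (1 + K_L_true * Ce)

# linearized Langmuir: Ce/qe = Ce/q_max + 1/(q_max * K_L)
slope, intercept = np.polyfit(Ce, Ce / qe, deg=1)
q_max = 1 / slope                 # recovered adsorption capacity (mg/g)
K_L = 1 / (q_max * intercept)     # recovered affinity constant (L/mg)
```

A Freundlich fit proceeds analogously by regressing log(qe) on log(Ce).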
Abstract: Maximal ratio combining (MRC) is considered the most complex combining technique, as it requires channel coefficient estimation, and it yields the lowest bit error rate (BER) of all combining techniques. However, the BER starts to deteriorate as errors are introduced into the channel coefficient estimates. A novel combining technique, termed generalized maximal ratio combining (GMRC), with a polynomial kernel, yields a BER identical to MRC with perfect channel estimation and a lower BER in the presence of channel estimation errors. We show that GMRC outperforms the optimal MRC scheme in general, and we hereinafter introduce it to the scientific community as a new "supraoptimal" algorithm. Since diversity combining is especially effective in small femto- and pico-cells, Internet-associated wireless peripheral systems stand to benefit most from GMRC. As a result, many spinoff applications can be made to IP-based 4th-generation networks.
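For reference, classic MRC with perfect channel estimation, the baseline the paper compares against, can be sketched as follows (GMRC itself is not reproduced here, and all channel and noise values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# one BPSK symbol sent over L = 4 independent diversity branches
L, symbol = 4, 1.0
h = rng.normal(size=L) + 1j * rng.normal(size=L)      # channel coefficients
noise = 0.1 * (rng.normal(size=L) + 1j * rng.normal(size=L))
r = h * symbol + noise                                 # received per branch

# MRC weights each branch by the conjugate channel coefficient,
# which maximizes the SNR of the combined output
combined = np.sum(np.conj(h) * r) / np.sum(np.abs(h) ** 2)
decision = 1.0 if combined.real >= 0 else -1.0
```

When the estimate of `h` is noisy, the combining weights degrade, which is the regime where GMRC is claimed to maintain a lower BER.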
Abstract: The study of proteomics has reached unexpected levels of
interest as a direct consequence of its discovered influence over
complex biological phenomena, such as problematic diseases like
cancer. This paper presents the authors' latest achievements in the
analysis of networks of proteins (interactome networks) by computing
the betweenness centrality measure more efficiently. The paper
introduces the concept of betweenness centrality and then describes
how betweenness computation can help interactome network analysis.
Current sequential implementations of betweenness computation do not
perform satisfactorily in terms of execution time. The paper's main
contribution is a speedup technique for betweenness computation based
on modified shortest-path algorithms for sparse graphs. Three
optimized generic algorithms for betweenness computation are described
and implemented, and their performance is tested against real
biological data from the IntAct dataset.
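Betweenness centrality on unweighted graphs is standardly computed with Brandes' algorithm, which the optimized variants build on. A minimal sketch, without the paper's sparse-graph speedups, is:

```python
from collections import deque

def betweenness(adj):
    """Brandes' algorithm for unweighted, undirected graphs.
    `adj` maps each node to a list of neighbours."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        # BFS from s, counting shortest paths (sigma) and predecessors
        sigma = {v: 0 for v in adj}; sigma[s] = 1
        dist = {v: -1 for v in adj}; dist[s] = 0
        preds = {v: [] for v in adj}
        order, queue = [], deque([s])
        while queue:
            v = queue.popleft()
            order.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        # back-propagate pair dependencies in reverse BFS order
        delta = {v: 0.0 for v in adj}
        for w in reversed(order):
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    # each unordered pair was counted twice in an undirected graph
    return {v: c / 2 for v, c in bc.items()}

# path graph 0-1-2-3: each inner node lies on two shortest paths
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
```

In an interactome, nodes are proteins and edges are interactions; high-betweenness proteins are candidates for bottleneck roles in the network.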
Abstract: Nowadays, without awareness and a correct understanding of
consumer behavior, organizations cannot take appropriate measures to
meet consumer needs and demands. The aim of this paper is the
identification and prioritization of the factors affecting consumer
behavior based on product value. The population of the study includes
all consumers of furniture-producing firms in East Azarbaijan
province, Iran. The research sample includes 93 people selected by
the sampling formula for an unlimited population. The data collection
instrument was a questionnaire whose validity was confirmed through
face validity and whose reliability was determined using Cronbach's
alpha coefficient. The Kolmogorov-Smirnov test was used to test data
normality, the t-test for identification of factors affecting product
value, and the Friedman test for prioritizing the factors. The
results show that quality, satisfaction, styling, price, finishing
operation, performance, safety, worth, shape, use, and excellence are
ranked in priorities 1 to 11, respectively.
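The Friedman test used for the prioritization reduces to ranking each respondent's scores and computing a chi-square statistic. A minimal sketch, with illustrative scores and assuming no tied scores within a row, is:

```python
def friedman_statistic(data):
    """Friedman chi-square statistic.  `data` holds one row of k
    scores per respondent; assumes no tied scores within a row."""
    n, k = len(data), len(data[0])
    rank_sums = [0.0] * k
    for row in data:
        # rank 1 = smallest score within the row
        order = sorted(range(k), key=lambda j: row[j])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    s = sum(r * r for r in rank_sums)
    return 12.0 * s / (n * k * (k + 1)) - 3.0 * n * (k + 1)

# three respondents scoring three factors (illustrative values)
stat = friedman_statistic([[3, 1, 2], [3, 2, 1], [3, 1, 2]])
```

The statistic is compared against a chi-square distribution with k-1 degrees of freedom, and the final priority order comes from sorting the factors by their rank sums.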
Abstract: Universities play an important role in social education in many respects. In creating awareness of and convincing the public about social issues, universities take a leading position. The best way to build public support for social education is to develop public communication campaigns. The aim of this study is to present a public communication model that can guide social education practices. The study, titled "Importance of Public Communication Campaigns and Art Activities in Social Education", is based on the following topics: the effects of public communication campaigns on social education, public relations techniques for education, communication strategies, the steps of public relations campaigns in social education, crafting persuasive messages for public communication campaigns, and developing artistic messages and organizing art activities in social education. In addition, media planning for social education, forming a team of campaign managers, dialogue with opinion leaders in education, and preparing creative communication models for social education are taken into consideration. This study also aims to critique social education case studies in Turkey. At the same time, some communicative methods and principles are given in the light of communication campaigns within the context of this paper.
Abstract: Optimization of cutting parameters is important in precision machining with regard to efficiency and the surface integrity of the machined part. Productivity and precision in machining are usually limited by the forces emanating from the cutting process. Due to the inherently varying nature of the workpiece in terms of geometry and material composition, the peak cutting forces vary from point to point during the machining process. In order to increase productivity without compromising machining accuracy, it is important to control these cutting forces. In this paper, a fuzzy logic control algorithm is developed that can be applied to the control of peak cutting forces in the milling of spherical surfaces using ball end mills. The controller adaptively varies the feedrate to maintain an allowable cutting force on the tool. The control algorithm is implemented on a computer numerical control (CNC) machine. It is demonstrated that the controller can provide stable machining and improve the performance of the CNC milling process by varying the feedrate.
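A minimal sketch of such a fuzzy feedrate rule base, with hypothetical membership functions and rule outputs rather than the paper's actual controller, might look like:

```python
def feedrate_change(force_ratio):
    """Fuzzy feedrate adjustment from the ratio of measured peak
    cutting force to the allowable force (hypothetical rule base).
    Returns a scaling adjustment: positive = speed up, negative = slow."""
    r = force_ratio
    # fuzzify the force ratio into three overlapping sets
    low = max(0.0, min(1.0, 1.0 - r))        # force well below the limit
    ok = max(0.0, 1.0 - abs(r - 1.0))        # force near the limit
    high = max(0.0, min(1.0, r - 1.0))       # force above the limit
    # rules: low -> increase feed, ok -> hold, high -> decrease feed;
    # defuzzify with a weighted average of the rule outputs
    return (0.5 * low + 0.0 * ok - 0.5 * high) / (low + ok + high)
```

In a closed loop, the CNC would multiply the commanded feedrate by `1 + feedrate_change(r)` each control cycle, slowing down when the peak force approaches the allowable limit.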
Abstract: UK breweries generate extensive by products in the
form of spent grain, slurry and yeast. Much of the spent grain is
produced by large breweries and processed in bulk for animal feed.
Spent brewery grains contain up to 20% protein dry weight and up to
60% fiber and are useful additions to animal feed. Bulk processing is
economic and allows spent grain to be sold so providing an income
to the brewery. A proportion of spent grain, however, is produced by
small local breweries and is more variably distributed to farms or
other users using intermittent collection methods. Such use is much
less economic and may incur losses if not carefully assessed for
transport costs. This study reports the economic returns of using wet
brewery spent grain (WBSG) in animal feed using the Co-product
Optimizer Decision Evaluator model (Cattle CODE) developed by
the University of Nebraska to predict performance and economic
returns when byproducts are fed to finishing cattle. The results
indicated that distance from brewery to farm had a significantly
greater effect on the economics of use of small brewery spent grain
and that alternative uses than cattle feed may be important to
develop.
Abstract: The Beshar, the stream of the city of Yasuj, supplies water
for different uses such as aquaculture farms, drinking, agriculture
and industry. Fish processing plants, agricultural farms, and the
wastewater of industrial zones and hospitals, all generated by human
activity, produce a considerable volume of effluent, and when this is
released into the stream it can affect the water quality and
downstream aquatic systems. This study was conducted to evaluate the
effects of effluent outflow from these human activities, and of point
and non-point pollution sources, on the water quality and health of
the Beshar river near Yasuj. Yasuj is the biggest and most important
city of Kohkiloye and Boyerahmad province. The Beshar river is one of
the most important aquatic ecosystems in the upstream Karun watershed
in the south of Iran and is affected by point and non-point pollutant
sources. The river is approximately 190 km in length and is situated
at the geographical positions 51° 20' to 51° 48' E and 30° 18' to
30° 52' N, in south-west Iran. In this research project, five study
stations were selected to examine water pollution in the Beshar river
system. Human activity is now one of the most important factors
affecting the hydrology and water quality of the river. Humans use
large amounts of resources to sustain various standards of living,
although measures of sustainability are highly variable depending on
how sustainability is defined. The Beshar river ecosystems are
particularly sensitive and vulnerable to human activities. The water
samples were analyzed and important water quality parameters such as
pH, dissolved oxygen (DO), biochemical oxygen demand (BOD5), chemical
oxygen demand (COD), total suspended solids (TSS), turbidity,
temperature, nitrates (NO3) and phosphates (PO4) were estimated at
the study stations. The results show a downward trend in water
quality downstream of the city. The amounts of BOD5, COD, TSS,
temperature, turbidity, NO3 and PO4 at the downstream stations were
considerably higher than at station 1, while the amounts of DO at the
downstream stations were lower. Thus, when the effluent discharge
resulting from human activities is released into the Beshar river
near the city, the quality of the river decreases, and the
environmental problems of the river are predicted to grow in the
coming years.
Abstract: Pressures for urban redevelopment are intensifying in
all large cities. A new logic for urban development is required –
green urbanism – that provides a spatial framework for directing
population and investment inwards to brownfields and greyfields
precincts, rather than outwards to the greenfields. This represents
both a major opportunity and a major challenge for city planners in
pluralist liberal democracies. However, plans for more compact
forms of urban redevelopment are stalling in the face of community
resistance. A new paradigm and spatial planning platform is required
that will support timely multi-level and multi-actor stakeholder
engagement, resulting in the emergence of consensus plans for
precinct-level urban regeneration capable of more rapid
implementation. Using Melbourne, Australia as a case study, this
paper addresses two of the urban intervention challenges – where and
how – via the application of a 21st-century planning tool, ENVISION,
created for this purpose.
Abstract: The optical properties of sputter-deposited ZnS thin films
were investigated as potential replacements for CBD (chemical bath
deposition) CdS buffer layers in CIGS solar cells. ZnS thin films
were fabricated on glass substrates at RT, 150 °C, 200 °C, and 250 °C
with 50 sccm Ar gas using an RF magnetron sputtering system. The
crystal structure of the thin films was found to be the zinc blende
(cubic) structure. The lattice parameter of ZnS is slightly larger
than that of CdS in the plane and is thus better matched with that of
CIGS. Within the 400-800 nm wavelength region, the average
transmittance was greater than 75%. As the deposition temperature of
the thin film increased, the blue-shift phenomenon was enhanced, and
the band gap energy of the ZnS thin film tended to increase. ZnS thin
film is a promising material system for the CIGS buffer layer in
terms of ease of processing, low cost, environmental friendliness,
higher transparency, and electrical properties.
Abstract: Decision feedback equalizers (DFEs) usually outperform linear equalizers for channels with intersymbol interference. However, DFE performance is highly dependent on the availability of reliable past decisions. Hence, in coded systems, where reliable decisions are only available after decoding the full block, the performance of the DFE is affected. A symbol-based DFE is a DFE that only uses the decisions after the block is decoded. In this paper, we derive the optimal settings of both the feedforward and feedback taps of the symbol-based equalizer. We present a novel symbol-based DFE filterbank and derive its optimal tap settings. We also show that it outperforms the classic DFE in terms of complexity and/or performance.
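For contrast with the symbol-based scheme, the classic per-symbol DFE on a known two-tap channel can be sketched as follows (the paper's symbol-based filterbank and optimal tap derivation are not reproduced here; the channel values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# two-tap ISI channel: y[n] = h0*x[n] + h1*x[n-1], noiseless for clarity
h0, h1 = 1.0, 0.5
x = rng.choice([-1.0, 1.0], size=200)            # BPSK symbols
y = h0 * x + h1 * np.concatenate(([0.0], x[:-1]))

# classic DFE: feedforward gain 1/h0, feedback tap h1 cancels the
# post-cursor ISI using the previous hard decision
decisions = np.empty_like(x)
prev = 0.0
for n in range(len(x)):
    z = (y[n] - h1 * prev) / h0                  # subtract decided ISI
    decisions[n] = 1.0 if z >= 0 else -1.0
    prev = decisions[n]
```

In a coded system the hard decision `prev` is unreliable until the block is decoded, which is precisely the limitation the symbol-based DFE addresses.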
Abstract: In this work, the primary compressive strength components
of human femur trabecular bone are qualitatively assessed using image
processing and wavelet analysis. The primary compressive (PC)
component in planar radiographic femur trabecular images (N=50) is
delineated by a semi-automatic image processing procedure. An
automatic threshold binarization algorithm is employed to recognize
the presence of mineralization in the digitized images. Qualitative
parameters such as apparent mineralization and the total area
associated with the PC region are derived for normal and abnormal
images. The two-dimensional discrete wavelet transform is utilized to
obtain appropriate features that quantify texture changes in the
medical images. The normal and abnormal samples of the human femur
are comprehensively analyzed using the Haar wavelet. Six statistical
parameters (mean, median, mode, standard deviation, mean absolute
deviation and median absolute deviation) are derived at level-4
decomposition for both the approximation and horizontal wavelet
coefficients. The correlation coefficients of the various
wavelet-derived parameters with the normal and abnormal classes are
estimated for both the approximation and horizontal coefficients. In
almost all cases the abnormal images show a higher degree of
correlation than the normal ones. Further, the parameters derived
from the approximation coefficients show more correlation than those
derived from the horizontal coefficients. The mean and median
computed at the output of the level-4 Haar wavelet channel were found
to be useful predictors for delineating the normal and abnormal
groups.
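The level-wise Haar decomposition and the derived statistics can be illustrated in one dimension (the paper uses the 2-D transform on images); the signal below is a synthetic stand-in for a row of pixel intensities:

```python
import math
import statistics

def haar_step(x):
    """One Haar analysis step: approximation and detail coefficients."""
    a = [(x[2*i] + x[2*i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def haar_level(x, level):
    """Approximation coefficients after `level` Haar steps."""
    for _ in range(level):
        x, _ = haar_step(x)
    return x

# 1-D stand-in for a row of pixel intensities (illustrative values)
signal = [float(v % 7) for v in range(64)]
approx = haar_level(signal, 4)

# statistics of the kind derived from the level-4 coefficients
med = statistics.median(approx)
features = {
    "mean": statistics.mean(approx),
    "median": med,
    "mad": statistics.median([abs(v - med) for v in approx]),
}
```

The 2-D transform applies the same averaging/differencing along rows and then columns, yielding the approximation and horizontal subbands analyzed in the study.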
Abstract: The clinical usefulness of heart rate variability is
limited by the range of Holter monitoring software available. These
software algorithms require a normal sinus rhythm to accurately
acquire heart rate variability (HRV) measures in the frequency
domain. Premature ventricular contractions (PVCs), commonly referred
to as ectopic beats and frequent in heart failure, hinder this
analysis and introduce ambiguity. This investigation demonstrates an
algorithm to automatically detect ectopic beats by analyzing discrete
wavelet transform coefficients. Two techniques for filtering and
replacing the ectopic beats in the RR signal are compared: one
applies wavelet hard thresholding and the other applies linear
interpolation to replace ectopic cycles. The results demonstrate,
through simulation and signals acquired from a 24-hour ambulatory
recorder, that these techniques can accurately detect PVCs and remove
the noise and leakage effects produced by ectopic cycles, retaining
smooth spectra with minimal error.
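The linear-interpolation replacement technique can be sketched as follows. The median-based detection rule and the deviation threshold are simplifying assumptions for illustration, standing in for the paper's wavelet-coefficient detector:

```python
import statistics

def replace_ectopic(rr, threshold=0.2):
    """Flag RR intervals deviating more than `threshold` (fractional)
    from the series median, then replace each by linear interpolation
    between the nearest normal neighbours on either side."""
    med = statistics.median(rr)
    normal = [abs(v - med) / med <= threshold for v in rr]
    out = list(rr)
    for i, ok in enumerate(normal):
        if not ok:
            # nearest normal neighbours to the left and right
            lo = next((j for j in range(i - 1, -1, -1) if normal[j]), None)
            hi = next((j for j in range(i + 1, len(rr)) if normal[j]), None)
            if lo is not None and hi is not None:
                t = (i - lo) / (hi - lo)
                out[i] = rr[lo] * (1 - t) + rr[hi] * t
    return out

# a run of 0.80 s RR intervals with one short ectopic cycle
cleaned = replace_ectopic([0.80, 0.80, 0.40, 0.80, 0.80])
```

Replacing rather than deleting the ectopic cycle preserves the uniform beat count, so the subsequent frequency-domain HRV analysis sees a smooth RR series.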
Abstract: Video-on-demand (VOD) systems are designed using content delivery networks (CDNs) to minimize the overall operational cost and to maximize scalability. Estimation of the viewing pattern (i.e., the relationship between the number of viewings and the ranking of VOD contents) plays an important role in minimizing the total operational cost and maximizing the performance of VOD systems. In this paper, we have analyzed a large body of commercial VOD viewing data and found that the viewing-rank distribution fits well with the parabolic fractal distribution. A weighted linear-model fitting function is used to estimate the parameters (coefficients) of the parabolic fractal distribution. This paper presents an analytical basis for designing an optimal hierarchical VOD content distribution system in terms of its cost and performance.
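Fitting the parabolic fractal distribution amounts to a weighted quadratic fit in log-log space, log N(r) = a + b log r + c (log r)^2. The viewing counts below are synthetic, generated from assumed coefficients rather than the commercial data, and the rank-based weighting is an illustrative choice:

```python
import numpy as np

# synthetic viewing counts per content rank, generated from assumed
# parabolic fractal coefficients (not the commercial dataset)
a, b, c = 10.0, -0.8, -0.1
rank = np.arange(1, 101)
views = np.exp(a + b * np.log(rank) + c * np.log(rank) ** 2)

# weighted linear-model fit in log-log space; popular (low-rank)
# titles are weighted more heavily (illustrative weighting)
x = np.log(rank)
coef = np.polyfit(x, np.log(views), deg=2, w=1.0 / rank)
c_hat, b_hat, a_hat = coef          # polyfit returns highest power first
```

The fitted curve then predicts the viewing share of each rank, which feeds the cost model for deciding which titles to cache at each tier of the hierarchy.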
Abstract: In mechanical and environmental engineering, mixed
convection is a frequently encountered thermal fluid phenomenon,
existing in the atmospheric environment, urban canopy flows, ocean
currents, gas turbines, heat exchangers, computer chip cooling
systems, etc. This paper deals with a numerical investigation of
mixed convection in a vertical heated channel. The flow results from
the mixing of the up-going fluid along the walls of the channel with
that issuing from a flat nozzle located in its entry section. The
fluid-dynamic and heat-transfer characteristics of vented vertical
channels are investigated for constant heat-flux boundary conditions,
a Rayleigh number equal to 2.57×10^10, two jet Reynolds numbers,
Re = 3×10^3 and 2×10^4, and aspect ratios in the 8-20 range. The
system of governing equations is solved with a finite volume method
and an implicit scheme. The results show that the turbulence and the
jet-wall interaction enhance the heat transfer, as does the
entrainment of ambient air by the jet. For the low Reynolds number,
Re = 3×10^3, increasing the aspect ratio enhances the heat transfer
by about 3%, whereas for Re = 2×10^4 the heat transfer enhancement is
about 12%. The numerical velocity, pressure and temperature fields
are post-processed to compute quantities of engineering interest,
such as the induced mass flow rate and the average Nusselt number,
which are presented in terms of the Rayleigh and Reynolds numbers and
dimensionless geometric parameters.
Abstract: In-place sorting algorithms play an important role in many fields, such as very large database systems, data warehouses, and data mining. Such algorithms maximize the size of data that can be processed in main memory without input/output operations. In this paper, a novel in-place sorting algorithm is presented. The algorithm comprises two phases: the first rearranges the input unsorted array in place, resulting in segments that are ordered relative to each other but whose elements are yet to be sorted. The first phase requires linear time, while in the second phase the elements of each segment are sorted in place in O(z log z) time, where z is the size of the segment, using O(1) auxiliary storage. In the worst case, for an array of size n, the algorithm performs O(n log z) element comparisons and O(n log z) element moves. Further, no auxiliary arithmetic operations with indices are required. Beyond these theoretical achievements, the algorithm is of practical interest because of its simplicity. Experimental results show that it outperforms other in-place sorting algorithms. Finally, the analysis of time and space complexity and the required number of moves are presented, along with the auxiliary storage requirements of the proposed algorithm.
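The two-phase idea can be illustrated with a deliberately simplified sketch, using a single pivot and insertion sort, which does not achieve the paper's bounds and is not the paper's algorithm: phase 1 partitions the array in place into segments that are ordered relative to each other, and phase 2 sorts each segment in place with O(1) auxiliary storage:

```python
def partition(a, lo, hi, pivot):
    """Phase 1 (simplified): rearrange a[lo:hi] in place so every
    element smaller than `pivot` precedes the rest; O(1) extra space."""
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    return i                      # boundary between the two segments

def insertion_sort(a, lo, hi):
    """Phase 2: sort one segment in place."""
    for j in range(lo + 1, hi):
        v, i = a[j], j - 1
        while i >= lo and a[i] > v:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = v

def two_phase_sort(a):
    if len(a) < 2:
        return a
    mid = partition(a, 0, len(a), a[len(a) // 2])
    insertion_sort(a, 0, mid)
    insertion_sort(a, mid, len(a))
    return a
```

Because every element of the first segment is smaller than every element of the second, sorting the segments independently yields a fully sorted array; the paper's algorithm produces many such mutually ordered segments in linear time before the per-segment O(z log z) sort.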
Abstract: Interoperability is the ability of information systems to operate in conjunction with each other, encompassing communication protocols, hardware, software, applications, and data-compatibility layers. There has been considerable work in industry on the development of component interoperability models, such as CORBA, (D)COM and JavaBeans. These models are intended to reduce the complexity of software development and to facilitate the reuse of off-the-shelf components. The focus of these models is syntactic interface specification, component packaging, inter-component communications, and bindings to a runtime environment. What these models lack is a consideration of architectural concerns – specifying systems of communicating components, explicitly representing loci of component interaction, and exploiting architectural styles that provide well-understood global design solutions. The development of complex business applications is now focused on assemblies of components available on a local area network or on the Internet. These components must be located and identified in terms of available services and communication protocols before any request is made. The first part of the article introduces the base concepts of components and middleware, the following sections describe the different up-to-date models of communication and interaction, and the last section shows how the different models can communicate among themselves.