Abstract: Value engineering is an effective tool that helps managers make decisions. Value studies offer managers a suitable instrument to reduce life-cycle costs, improve quality, improve structure, shorten the construction schedule, prolong service life, or achieve a combination of these goals. The pressures placed on planners, their accountability within their fields, and the inherent risks and ambiguities of alternative options place some managers in a dilemma. Given the complexity of implementing projects, risk management and value engineering in project management can be used as tools to identify and eliminate every item that causes unnecessary cost and wasted time without harming the project's essential functions. It should be noted that applying risk management and value engineering to improve efficiency and performance may lengthen the project implementation schedule; improving the schedule does not always mean shortening it. This article first reviews the concepts of risk management and value engineering. The effects of their implementation at Iran Khodro Corporation are then examined, together with their common features and their integration; finally, a framework is proposed for use in engineering and industrial projects, including those at Iran Khodro Corporation.
Abstract: A low-complexity, highly accurate frequency offset estimation scheme for multi-band orthogonal frequency division multiplexing (MB-OFDM) based ultra-wideband systems is presented, accounting for different carrier frequency offsets, different channel frequency responses, and different preamble patterns in different bands. Using a half-cycle Constant Amplitude Zero Auto-Correlation (CAZAC) sequence as the preamble, an estimator with a semi-cross contrast scheme between two successive OFDM symbols is proposed. The Cramér-Rao lower bound (CRLB) and the complexity of the proposed algorithm are derived. Compared with reference estimators, the proposed method reduces complexity significantly (by about 50%) for all preamble patterns of MB-OFDM systems, while its accuracy performs well with respect to the CRLB.
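CAZAC preambles such as the one described above are commonly drawn from the Zadoff-Chu family. As an illustration only (the sequence family, root, and length here are assumptions, not taken from the paper), the sketch below generates a Zadoff-Chu sequence and verifies the two defining CAZAC properties, constant amplitude and zero periodic autocorrelation:

```python
import numpy as np

def zadoff_chu(N, root=1):
    # Zadoff-Chu sequence: a classic CAZAC sequence (assumed family;
    # the paper's half-cycle preamble may differ in detail).
    n = np.arange(N)
    if N % 2 == 0:
        return np.exp(-1j * np.pi * root * n * n / N)
    return np.exp(-1j * np.pi * root * n * (n + 1) / N)

def circular_autocorr(x, lag):
    # Normalized periodic autocorrelation at the given lag.
    return np.vdot(x, np.roll(x, -lag)) / len(x)

seq = zadoff_chu(128, root=1)
assert np.allclose(np.abs(seq), 1.0)          # constant amplitude
assert abs(circular_autocorr(seq, 5)) < 1e-9  # zero autocorrelation off-peak
```

The zero off-peak autocorrelation is what makes such preambles attractive for timing and frequency-offset estimation: correlation against a shifted copy collapses to a single sharp peak.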
Abstract: We propose an information filtering system that selects index words from a document set based on the topics the set contains. The method narrows the vocabulary down to the words that are particularly characteristic of the document set; the topics are obtained by Sparse Non-negative Matrix Factorization. In information filtering, a document is often represented as a vector whose elements are the weights of the index words, and the dimension of this vector grows as the number of documents increases. Consequently, words that are useless as index words for filtering may be included, and the dimension needs to be reduced to address this problem. Our proposal reduces the dimension by selecting index words based on the topics in the document set, which we obtain by applying Sparse Non-negative Matrix Factorization to the set. Filtering is carried out using the centroid of the training document set, which is regarded as the user's interest and is represented as a document vector whose elements are the weights of the selected index words. Using the English test collection MEDLINE, we confirm the effectiveness of our proposal: when an appropriate number of index words is selected, the proposed selection improves recommendation accuracy over previous methods. We also examine the selected index words and find that the proposal is able to select index words covering minor topics in the document set.
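The topic-based index-word selection step can be sketched as follows. This is a minimal stand-in: it uses a plain multiplicative-update NMF (the paper's Sparse NMF adds a sparsity constraint that is omitted here for brevity), and the function names, toy term-document matrix, and parameter values are all illustrative assumptions:

```python
import numpy as np

def nmf(V, k, iters=200, seed=0):
    # Lee-Seung multiplicative-update NMF: V (terms x docs) ~ W @ H.
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + 1e-3
    H = rng.random((k, n)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

def select_index_words(V, k, top=3):
    # Keep, per topic, the words with the largest loadings in the
    # term-topic matrix W; the union forms the reduced index vocabulary.
    W, _ = nmf(V, k)
    selected = set()
    for t in range(k):
        selected.update(int(i) for i in np.argsort(W[:, t])[::-1][:top])
    return sorted(selected)

# Toy term-document matrix: 6 words x 4 docs with two rough topics.
V = np.array([[3, 2, 0, 0], [2, 3, 0, 0], [1, 2, 0, 1],
              [0, 0, 3, 2], [0, 0, 2, 3], [0, 1, 1, 2]], float)
print(select_index_words(V, k=2, top=2))
```

Documents (and the centroid representing the user's interest) would then be re-represented using only the selected word dimensions.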
Abstract: This paper presents a 2-D hydrodynamic model of the plasma ablated when a 50 μm Al solid target is irradiated with a single pulsed ion beam. The Lagrange method is used to solve the moving fluid for the ablated plasma production and formation mechanism. In the calculations, a single 10 ns ion beam pulse with a total energy density of 120 J/cm2 is used. The results show that the ablated plasma formed after 2 ns of ion beam irradiation and started to expand at 4-6 ns. In addition, the 2-D model gives a clearer understanding of the production and expansion of plasma ablated from a solid target by a pulsed ion beam.
Abstract: Text similarity measurement is a fundamental issue in
many textual applications such as document clustering, classification,
summarization, and question answering. However, prevailing approaches based on the Vector Space Model (VSM) suffer, to varying degrees, from the limitation of Bag of Words (BOW), which ignores the semantic
relationship among words. Enriching document representation
with background knowledge from Wikipedia is proven to be an effective
way to solve this problem, but most existing methods still
cannot avoid similar flaws of BOW in a new vector space. In this
paper, we propose a novel text similarity measurement which goes
beyond VSM and can find semantic affinity between documents.
Specifically, it is a unified graph model that exploits Wikipedia as
background knowledge and synthesizes both document representation
and similarity computation. The experimental results on two different
datasets show that our approach significantly improves VSM-based
methods in both text clustering and classification.
Abstract: A solar-powered air heating system using one-ended evacuated tubes is experimentally investigated. A solar air heater containing forty evacuated tubes is used for heating. The collector surface area is about 4.44 m2. The length and the outer diameters of the outer glass tube and absorber tube are 1500 mm, 47 mm, and 37 mm, respectively. The experimental setup has a square header (heat exchanger) of 190 mm x 190 mm cross-section and 1500 mm length. The header contains a central hollow pipe of 60 mm diameter through which the air is made to flow. The setup holds approximately 108 liters of water, which acts as the heat-collecting medium: it collects the solar heat falling on the tubes and delivers it, by natural convection and conduction, to the air flowing through the header pipe. The outlet air temperature depends on several factors, including air flow rate and solar radiation intensity. The study covers both up-flow and down-flow of air in the header under similar weather conditions at different flow rates, in order to determine the effect of solar radiation intensity and air flow rate on the outlet air temperature over time, and to establish which flow direction is more efficient. The results show that the system is highly effective for heating in this region, and that it is most efficient at a particular air flow rate. It was also observed that the down-flow configuration is more effective than the up-flow configuration at all flow rates, owing to lower losses in down-flow: the temperature differences between the upper and lower heads, both for the water and for the pipe surfaces at the respective ends, are lower in down-flow.
Abstract: Breast cancer is one of the most frequently occurring cancers in women throughout the world, including the U.K. The grading of this cancer plays a vital role in the prognosis of the disease. In this paper, we present an overview of the use of an advanced computational method, the fuzzy inference system, as a tool for automating breast cancer grading. A new spectral data set obtained by Fourier Transform Infrared Spectroscopy (FTIR) from cancer patients is used for this study. The future work outlines the potential areas of fuzzy systems that can be used to automate breast cancer grading.
Abstract: The use of high-strength Pulsed Electric Fields (PEF) in the food industry is a non-thermal process that can deactivate microorganisms and increase penetration into plant and animal tissues without seriously affecting food taste and quality. This paper presents the design and fabrication of a PEF generator. Pulse generation methods were surveyed and the most suitable one selected. Via its controller, the equipment can generate square pulses with adjustable parameters: amplitude 1-5 kV, frequency 0.1-10 Hz, pulse width 10-100 µs, and duty cycle 0-100%. Setting the number of pulses and displaying the output voltage and current waveforms on an oscilloscope screen are further advantages of this equipment. Finally, several food samples were tested, with satisfactory results. PEF application had considerable effects on potato, banana, and purple cabbage: it increased the Brix factor of the potato solution from 0.05 to 0.15, and it was also very effective in extracting color material from purple cabbage. In a final experiment, the effect of PEF voltage on color extraction from saffron scum was examined (yield increased by about 6%).
Abstract: This paper analyzes the effect of a single uniform accounting rule on reporting quality by investigating the influence of IFRS on earnings management. This paper examines whether earnings management is reduced after IFRS adoption through the use of “loss avoidance thresholds”, a method that has been verified in earlier studies. This paper concentrates on two European countries: one that represents the continental code law tradition with weak protection of investors (France) and one that represents the Anglo-American common law tradition, which typically implies a strong enforcement system (the United Kingdom).
The research investigates a sample of 526 companies (6,822 firm-year observations) during the years 2000-2013. The results differ between the two jurisdictions. This study demonstrates that a single set of accounting standards contributes to better reporting quality and reduces the pervasiveness of earnings management in France. In contrast, there is no evidence that a reduction in earnings management followed the implementation of IFRS in the United Kingdom. Because IFRS benefit France but not the United Kingdom, other political and economic factors, such as the legal system or capital market strength, must play a significant role in influencing the comparability and transparency of cross-border companies' financial statements. Overall, the results suggest that IFRS moderately contribute to the accounting quality of reported financial statements and bring benefits to stakeholders, though the role played by other economic factors cannot be discounted.
Abstract: The short-term morphological evolution of Ponta do Tubarão Island (PTI) was investigated through highly accurate surveys based on post-processed kinematic (PPK) relative positioning with Global Navigation Satellite Systems (GNSS). PTI is part of a barrier island system on a high-energy northeastern Brazilian coast and is also an area of high environmental sensitivity. Surveys were carried out quarterly over a two-year period from May 2010 to May 2012. This paper statistically assesses the performance of digital elevation models (DEMs) derived from different interpolation methods in representing morphologic features and quantifying volumetric changes; TIN models showed the best results for these purposes. The DEMs allowed surfaces and volumes to be quantified in detail, the segments of PTI most vulnerable to erosion and/or sediment accumulation to be identified, and these alterations to be related to climate conditions. The coastal setting and geometry of PTI protect a significant mangrove ecosystem, as well as oil and gas facilities installed in the vicinity, from the damaging effects of strong ocean waves and currents. The maintenance of PTI is therefore essential, but the prediction of its longevity is uncertain, because the results indicate an irregular sedimentary balance and a substantial decline in sediment supply to this coastal area.
Abstract: Cryo-electron microscopy (CEM) in combination with
single particle analysis (SPA) is a widely used technique for
elucidating structural details of macromolecular assemblies at close-to-atomic resolutions. However, the development of automated software for SPA processing remains vital, since thousands to millions of individual particle images need to be processed. Here, we present our workflow for automated particle picking. Our approach integrates peak-shape analysis into the classical correlation approach, together with an iterative classification step to separate macromolecules from background. This particle selection workflow furthermore provides a robust means for SPA with little user interaction. The performance of the presented tools is assessed by processing simulated and experimental data.
Abstract: Spam mails are unwanted mails sent to a large number of users. Spam mails not only consume network resources but also cause security threats. This paper proposes an efficient technique to detect and prevent spam mail on the sender side rather than the receiver side. The technique is based on a counter set on the sender server. When a mail is submitted to the server, the mail server checks the number of recipients against its counter policy, which is based on pre-defined criteria. When the number of recipients exceeds the counter policy, the mail server stops the rest of the process and sends a failure mail to the sender; otherwise the mail is transmitted through the network. This technique preserves network resources such as bandwidth and memory. Simulation results on a real network show that, when the counter is set on the sender side, spam mail detection is 100 times faster than when the counter is set on the receiver side, and network resources are largely preserved compared with other receiver-side anti-spam techniques.
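The sender-side counter policy described above can be sketched in a few lines. The threshold value, mail structure, and function name below are assumptions made for illustration, not the paper's implementation:

```python
# Pre-defined counter policy threshold (assumed value for illustration).
MAX_RECIPIENTS = 50

def handle_outgoing(mail):
    """Apply the sender-server counter policy to one outgoing mail.

    Returns "sent" if the recipient count is within policy; otherwise
    the process stops and a failure notice goes back to the sender.
    """
    if len(mail["recipients"]) > MAX_RECIPIENTS:
        return f"failure notice to {mail['sender']}"
    return "sent"

assert handle_outgoing({"sender": "a@x", "recipients": ["b@y"] * 10}) == "sent"
assert handle_outgoing({"sender": "a@x",
                        "recipients": ["b@y"] * 100}).startswith("failure")
```

Because the check runs before the mail ever enters the network, the bandwidth and memory that a bulk spam mail would have consumed are never spent, which is the source of the reported savings.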
Abstract: Clustering categorical data is more complicated than numerical clustering because of its special properties. Scalability and memory constraints are the challenging problems in clustering large data sets. This paper presents an incremental algorithm to cluster categorical data. The frequencies of attribute values contribute substantially to clustering similar categorical objects, and we propose new similarity measures based on the frequencies of attribute values and their cardinalities. The proposed measures and the algorithm were evaluated on data sets from the UCI data repository, and the results show that the proposed method generates better clusters than the existing one.
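One plausible form of a frequency-based categorical similarity of this kind (an illustrative measure, not necessarily the paper's exact formula) weights agreement on an attribute more heavily when the shared value is rare in the data set:

```python
from collections import Counter

def frequency_similarity(a, b, freqs, n):
    # Agreement on attribute d scores 1 - f/n, where f is how often the
    # shared value occurs among the n objects; disagreements score 0.
    # Rare shared values therefore count more than common ones.
    score = 0.0
    for d, (va, vb) in enumerate(zip(a, b)):
        if va == vb:
            score += 1.0 - freqs[d][va] / n
    return score / len(a)

# Toy categorical data set (invented): (color, size) per object.
data = [("red", "s"), ("red", "m"), ("blue", "s"), ("green", "l")]
freqs = [Counter(row[d] for row in data) for d in range(2)]
print(round(frequency_similarity(data[0], data[1], freqs, len(data)), 3))  # → 0.25
```

Because the per-attribute value counts can be updated in one pass as objects arrive, a measure of this shape fits naturally into an incremental clustering algorithm.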
Abstract: An important structuring mechanism for knowledge bases is building clusters based on the content of their knowledge objects. The objects are clustered on the principle of maximizing intra-class similarity and minimizing inter-class similarity. Clustering can also facilitate taxonomy formation, that is, the organization of observations into a hierarchy of classes that groups similar events together. Hierarchical representation allows us to easily manage the complexity of knowledge, to view the knowledge at different levels of detail, and to focus attention on the interesting aspects only. One such efficient and easy-to-understand system is the Hierarchical Production Rule (HPR) system. A HPR, a standard production rule augmented with generality and specificity information, is of the following form: Decision If <condition> Generality <generality-information> Specificity <specificity-information>. HPR systems are capable of handling the taxonomical structures inherent in knowledge about the real world. In this paper, a set of related HPRs is called a cluster and is represented by a HPR-tree. The paper discusses an algorithm, based on a cumulative learning scenario, for the dynamic structuring of clusters. The proposed scheme incrementally incorporates new knowledge into the set of clusters from previous episodes and also maintains a summary of the clusters, called a Synopsis, for use in future episodes. Examples are given to demonstrate the behaviour of the proposed scheme. The suggested incremental structuring of clusters would be useful in mining data streams.
Abstract: In recent years, response surface methodology (RSM) has attracted the attention of many quality engineers in different industries. Most of the published literature on robust design methodology is concerned with the optimization of a single response or quality characteristic, the one most critical to consumers. For most products, however, quality is multidimensional, so it is common to observe multiple responses in an experimental situation. This paper familiarizes the interested reader with the methodology through a survey of the most cited technical papers.
It is believed that the proposed procedure in this study can resolve
a complex parameter design problem with more than two responses.
It can be applied to those areas where there are large data sets and a
number of responses are to be optimized simultaneously. In addition,
the proposed procedure is relatively simple and can be implemented
easily by using ready-made standard statistical packages.
Abstract: By systematically applying different engineering
methods, difficult financial problems become approachable. Using a
combination of theory and techniques such as wavelet transform,
time series data mining, Markov chain based discrete stochastic
optimization, and evolutionary algorithms, this work formulated a
strategy to characterize and forecast non-linear time series. It
attempted to extract typical features from the volatility data sets of the S&P 100 and S&P 500 indices, which include abrupt drops, jumps, and other non-linearities. As a result, forecasting accuracy reached an average of over 75%, surpassing any other publicly available results on the forecasting of any financial index.
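As a small illustration of the wavelet-transform step used for feature extraction (the paper does not specify its wavelet; the Haar wavelet used here is an assumption), one decomposition level splits a series into a coarse approximation and detail coefficients, and abrupt drops or jumps concentrate in large detail coefficients:

```python
import numpy as np

def haar_step(x):
    # One level of the Haar wavelet transform: pairwise scaled averages
    # (approximation) and pairwise scaled differences (detail).
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

x = np.array([4., 6., 10., 12.])
a, d = haar_step(x)
# The transform is energy-preserving: ||x||^2 == ||a||^2 + ||d||^2.
assert np.isclose(np.sum(x**2), np.sum(a**2) + np.sum(d**2))
```

Applying such steps recursively yields a multi-resolution view of the volatility series from which characteristic features can be mined.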
Abstract: Hierarchical classification is a problem with applications in many areas, such as protein function prediction, where the data are hierarchically structured. The development of algorithms able to induce hierarchical classification models is therefore necessary. This paper presents experiments using the hierarchical classification algorithm called Multi-label Hierarchical Classification using a Competitive Neural Network (MHC-CNN). It was tested on ten datasets from the Gene Ontology (GO) Cellular Component domain, and the results are compared with those of Clus-HMC and Clus-HSC using the hF-Measure.
Abstract: Overloading is a technique to accommodate more users than the spreading factor N; it is a bandwidth-efficient scheme to increase the number of users within a fixed bandwidth.
One of the efficient schemes to overload a CDMA system is to use
two sets of orthogonal signal waveforms (O/O). The first set is
assigned to the N users and the second set is assigned to the
additional M users. An iterative interference cancellation technique is
used to cancel interference between the two sets of users. In this
paper, the performance of an overloading scheme in which the first N
users are assigned Walsh-Hadamard orthogonal codes and extra users
are assigned the same WH codes but overlaid by a fixed (quasi) bent
sequence [11] is evaluated. This particular scheme is called Quasi-
Orthogonal Sequence (QOS) O/O scheme, which is a part of
cdma2000 standard [12] to provide overloading in the downlink
using a single-user detector. The QOS scheme is a balanced O/O scheme, in which the correlations between any set-1 and set-2 users are equalized. The allowable overload of this scheme is investigated in the uplink on AWGN and Rayleigh fading channels, so that the
uncoded performance with iterative multistage interference
cancellation detector remains close to the single user bound. It is
shown that this scheme provides 19% and 11% overloading with the SDIC technique for N = 16 and 64, respectively, with an SNR degradation of less than 0.35 dB compared to the single-user bound at
a BER of 0.00001. But on a Rayleigh fading channel, the channel
overloading is 45% (29 extra users) at a BER of 0.0005, with an SNR
degradation of about 1 dB as compared to single user performance
for N=64. This is a significant amount of channel overloading on a
Rayleigh fading channel.
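The Walsh-Hadamard codes assigned to the first N users can be generated by the standard Sylvester construction; the sketch below builds such a code set and checks the mutual orthogonality that the O/O scheme relies on (the overlaying quasi-bent sequence for the set-2 users is scheme-specific and not reproduced here):

```python
import numpy as np

def walsh_hadamard(N):
    # Sylvester construction of an N x N Walsh-Hadamard matrix;
    # N must be a power of two. Each row is one user's spreading code.
    H = np.array([[1]])
    while H.shape[0] < N:
        H = np.block([[H, H], [H, -H]])
    return H

H = walsh_hadamard(16)
# Rows are mutually orthogonal: H @ H.T == N * I, so despreading with
# one code nulls every other set-1 user's signal.
assert np.array_equal(H @ H.T, 16 * np.eye(16, dtype=int))
```

In the QOS O/O scheme, the extra M users reuse these same rows multiplied element-wise by a fixed quasi-bent sequence, which balances the cross-correlation between the two sets.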
Abstract: A learning content management system (LCMS) is an environment that supports web-based learning content development. The primary function of the system is to manage the learning process and to generate content customized to the unique requirements of each learner. Beyond the supporting tools offered by several vendors, we propose to enhance LCMS functionality with an induction ability that individualizes the presented content. Our induction technique is based on rough set theory. The induced rules are intended as supportive knowledge for guiding content flow planning. They can also be used as decision rules to help content developers manage the content delivered to individual learners.
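A minimal sketch of the rough-set machinery behind such rule induction (the learner attributes and records below are invented for illustration; the paper's actual induction procedure is richer): objects that are indiscernible on the chosen attributes form classes, from which lower and upper approximations of a target concept are computed, and certain rules can be read off the lower approximation:

```python
from collections import defaultdict

def lower_upper(objects, attrs, target):
    # Group objects into indiscernibility classes by their values on
    # `attrs`; a class belongs to the lower approximation if it lies
    # wholly inside `target`, and to the upper one if it intersects it.
    classes = defaultdict(set)
    for name, row in objects.items():
        classes[tuple(row[a] for a in attrs)].add(name)
    lower, upper = set(), set()
    for members in classes.values():
        if members <= target:
            lower |= members
        if members & target:
            upper |= members
    return lower, upper

# Hypothetical learner records (assumed attribute names and values).
objs = {"l1": {"style": "visual", "level": "hi"},
        "l2": {"style": "visual", "level": "hi"},
        "l3": {"style": "verbal", "level": "lo"}}
target = {"l1", "l3"}  # e.g. learners who completed a content unit
low, up = lower_upper(objs, ["style", "level"], target)
assert low == {"l3"} and up == {"l1", "l2", "l3"}
```

Rules induced from the lower approximation hold with certainty on the observed data, which is what makes them usable as decision rules for planning the content flow of an individual learner.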
Abstract: Fruit drying is a well-known process, used mostly for the preservation of fruits. Osmotic dehydration of apricot slices was carried out at three different salt-sucrose concentrations and four different temperatures; in one set of experiments, three different weight ratios of solution to sample were also tested. The dehydration curves were constructed using Peleg's model. Increasing the solution volume increased the mass transfer rate, and hence the solid gain increased rapidly. Increasing the volume of the osmotic medium increased the overall mass transfer, but a solution-to-sample ratio of 5:1 gave the best product quality. The temperature and concentration that gave a high water loss to solid gain ratio and an acceptable taste were 40°C and 5%, respectively.
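Peleg's model describes moisture content during dehydration as M(t) = M0 - t/(k1 + k2·t), so the curve can be fitted by linear least squares on the rearranged form t/(M0 - M(t)) = k1 + k2·t. The sketch below (using invented synthetic data, not the paper's measurements) recovers the Peleg constants this way:

```python
import numpy as np

def peleg(t, M0, k1, k2):
    # Peleg's model for moisture content during osmotic dehydration:
    # M(t) = M0 - t / (k1 + k2*t); 1/k1 is the initial dehydration
    # rate and M0 - 1/k2 the equilibrium moisture content.
    return M0 - t / (k1 + k2 * t)

# Synthetic dehydration curve (illustrative parameter values).
t = np.array([10., 30., 60., 120., 240.])
M0 = 0.85
M = peleg(t, M0, k1=20.0, k2=1.5)

# Linearized fit: t / (M0 - M(t)) = k1 + k2 * t.
y = t / (M0 - M)
k2_hat, k1_hat = np.polyfit(t, y, 1)  # slope = k2, intercept = k1
assert abs(k1_hat - 20.0) < 1e-6 and abs(k2_hat - 1.5) < 1e-6
```

Comparing the fitted k1 and k2 across solution concentrations, temperatures, and solution-to-sample ratios is how dehydration curves like those in the study are typically summarized.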