Abstract: Flexible macroblock ordering (FMO), adopted in the H.264 standard, allows all macroblocks (MBs) in a frame to be partitioned into separate groups of MBs called Slice Groups (SGs). FMO not only supports error resilience but also controls the size of video packets for different network types. However, it is well known that adopting FMO increases the number of bits required to encode a frame. In this paper, we propose a novel algorithm that reduces the bitrate overhead caused by utilizing FMO. In the proposed algorithm, all MBs are grouped into SGs based on the similarity of their transform coefficients. Experimental results show that our algorithm reduces the bitrate as compared with conventional FMO.
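The abstract does not specify the grouping rule; as a toy illustration (our own construction, not the paper's algorithm), MBs can be assigned to slice groups by ranking a simple similarity feature such as transform-coefficient energy and splitting the ranking into equal-sized groups:

```python
def mb_energy(coeffs):
    # energy of one macroblock's transform coefficients
    return sum(c * c for c in coeffs)

def group_mbs(mb_coeffs, n_groups):
    # sort MB indices by coefficient energy, then split the ranking
    # into n_groups contiguous slice groups of (nearly) equal size,
    # so MBs with similar coefficient energy land in the same SG
    order = sorted(range(len(mb_coeffs)), key=lambda i: mb_energy(mb_coeffs[i]))
    groups = [[] for _ in range(n_groups)]
    for rank, i in enumerate(order):
        groups[rank * n_groups // len(order)].append(i)
    return groups

# six MBs (each a short coefficient list), two slice groups
print(group_mbs([[1], [5], [2], [4], [3], [6]], 2))  # → [[0, 2, 4], [3, 1, 5]]
```

Any clustering of a coefficient-similarity feature would fit this role; the energy feature and equal-size split above are assumptions for illustration only.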
Abstract: The present investigation aimed to develop a methodology for the standardization of Marichyadi Vati and its raw materials. Standardization was carried out using systematic pharmacognostical and physicochemical parameters as per WHO guidelines. From the detailed standardization of Marichyadi Vati, it is concluded that no major differences prevail between the quality of marketed products and laboratory samples of Marichyadi Vati. However, market samples showed slightly higher amounts of piperine than the laboratory sample by both methods. This is the first attempt to generate the complete set of standards required for Marichyadi Vati.
Abstract: The effect of oral administration of "Gadagi" tea on liver function was assessed in 50 healthy male albino rats, which were grouped and administered different doses (mg/kg): a low dose (380 mg/kg, 415 mg/kg, 365 mg/kg, 315 mg/kg for "sak", "sada" and "magani" respectively), a standard dose (760 mg/kg, 830 mg/kg, 730 mg/kg for "sak", "sada" and "magani" respectively) and a high dose (1500 mg/kg, 1700 mg/kg and 1460 mg/kg for "sak", "sada" and "magani" groups respectively) for a period of four weeks. Animals not administered the tea constituted the control group. At the end of the fourth week, the animals were sacrificed and their serum alanine aminotransferase (ALT), aspartate aminotransferase (AST), alkaline phosphatase (ALP), total protein (TP), albumin (ALB), and globulins (GLO) were determined. Mean serum ALT and ALP activities were significantly higher (P
Abstract: The procurement and cost management approach adopted for mechanical and electrical (M&E) services in the Malaysian construction industry has been criticized for its inefficiency. The study examined early cost estimating practices adopted for M&E services in Malaysia so as to understand the level of compliance of current techniques with best practice. The methodology adopted for the study is a review of bidding documents used on both completed and on-going building projects awarded between 2008 and 2010 under the 9th Malaysia Plan. The analysis revealed that M&E services cost cannot be reliably estimated at the pre-contract stage; that the bidding techniques adopted for M&E services fail to provide a uniform basis for contractors to submit tenders; and that detailed measurement of items was not made, which could complicate post-contract cost control and financial management. The paper concludes that there is a need to follow a structured approach in determining the pre-contract cost estimate for M&E services, which would serve as a viable tool for post-contract cost control.
Abstract: In this work, the primary compressive strength components of human femur trabecular bone are qualitatively assessed using image processing and wavelet analysis. The primary compressive (PC) component in planar radiographic femur trabecular images (N = 50) is delineated by a semi-automatic image processing procedure. An auto-threshold binarization algorithm is employed to recognize the presence of mineralization in the digitized images. Qualitative parameters such as apparent mineralization and the total area associated with the PC region are derived for normal and abnormal images. The two-dimensional discrete wavelet transform is utilized to obtain appropriate features that quantify texture changes in the medical images. The normal and abnormal samples of the human femur are comprehensively analyzed using the Haar wavelet. Six statistical parameters, namely mean, median, mode, standard deviation, mean absolute deviation and median absolute deviation, are derived at level 4 decomposition for both the approximation and horizontal wavelet coefficients. The correlation coefficients of the various wavelet-derived parameters with the normal and abnormal classes are estimated for both the approximation and horizontal coefficients. In almost all cases the abnormal samples show a higher degree of correlation than the normal ones. Further, the parameters derived from the approximation coefficients show more correlation than those derived from the horizontal coefficients. The mean and median computed at the output of the level 4 Haar wavelet channel were found to be useful predictors for delineating the normal and abnormal groups.
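The level-4 Haar pipeline described above can be sketched in plain Python. This is a minimal illustration of the decomposition and the six statistics, not the authors' code; the unnormalised average/difference Haar step and the detail-band convention below are assumptions (libraries differ on which band is called "horizontal"):

```python
import statistics as st

def haar_step_1d(v):
    # one level of the (unnormalised) Haar transform: pairwise averages and differences
    a = [(v[i] + v[i + 1]) / 2 for i in range(0, len(v) - 1, 2)]
    d = [(v[i] - v[i + 1]) / 2 for i in range(0, len(v) - 1, 2)]
    return a, d

def haar2d_step(img):
    # transform rows first, then the columns of the row-approximation
    row_a = [haar_step_1d(r)[0] for r in img]
    col_out = [haar_step_1d(list(c)) for c in zip(*row_a)]
    ll = [list(r) for r in zip(*[a for a, _ in col_out])]  # approximation band
    h = [list(r) for r in zip(*[d for _, d in col_out])]   # detail band (one convention)
    return ll, h

def six_stats(coeffs):
    # the six statistics named in the abstract, over one coefficient band
    flat = [x for row in coeffs for x in row]
    m, med = st.mean(flat), st.median(flat)
    return {
        "mean": m,
        "median": med,
        "mode": st.mode(flat),
        "std": st.stdev(flat),
        "mad_mean": st.mean(abs(x - m) for x in flat),
        "mad_median": st.median(abs(x - med) for x in flat),
    }

# a 32x32 constant "image": after 4 levels the approximation is 2x2
img = [[5.0] * 32 for _ in range(32)]
ll = img
for _ in range(4):
    ll, h = haar2d_step(ll)
print(six_stats(ll)["mean"])  # → 5.0 (averaging preserves a constant image)
```

Applied to a real radiograph, `six_stats` would be run on both the approximation and the detail band at level 4, matching the feature set described in the abstract.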
Abstract: Since the water resources of the desert city of Naein are very limited, an approach that saves water resources while meeting the water needs of the greenspace is to use the city's sewage wastewater. Proper treatment of Naein's sewage up to the standards required for green space use may solve some of the problems of the city's green space development. The present paper closely examines available statistics and information associated with the city's sewage system, and determines complementary stages for the city's sewage treatment facilities. The population, per capita water use, and required discharge for various greenspace pieces including different plants are calculated. Moreover, in order to facilitate the application of water resources, a crude water distribution network separate from the drinking water distribution network is designed, and a plan for mixing municipal well water with sewage wastewater in proposed mixing tanks is suggested. Following the greenspace irrigation reform and complementary plan, the per capita greenspace of the city will increase from the current 13.2 square meters to 32 square meters.
Abstract: Graph coloring is an important problem in computer science, and many algorithms are known for obtaining reasonably good solutions in polynomial time. One method of comparing different algorithms is to test them on a set of standard graphs for which the optimal solution is already known. This investigation analyzes a set of 50 well-known graph coloring instances according to a set of complexity measures. These instances come from a variety of sources, some representing actual applications of graph coloring (register allocation) and others (Mycielski and Leighton graphs) theoretically designed to be difficult to solve. The size of the graphs ranged from a low of 11 vertices to a high of 864 vertices. The method used to solve the coloring problem was the square of the adjacency (i.e., correlation) matrix. The results show that the most difficult graphs to solve were the Leighton and the queen graphs. Complexity measures such as density, mobility, deviation from uniform color class size, and number of block diagonal zeros are calculated for each graph. The results showed that the most difficult problems have low mobility (in the range 0.2-0.5) and relatively little deviation from uniform color class size.
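Two of the ingredients above are easy to illustrate on a toy graph: density is the edge count over the maximum possible, and entry (i, j) of the squared adjacency matrix counts length-two walks, i.e. common neighbours of i and j. The 4-cycle below is our own example; the paper's other measures (mobility, block diagonal zeros) are not reproduced here:

```python
def density(adj):
    # fraction of possible edges present in an undirected simple graph
    n = len(adj)
    edges = sum(adj[i][j] for i in range(n) for j in range(i + 1, n))
    return 2 * edges / (n * (n - 1))

def matrix_square(adj):
    # (A^2)[i][j] = number of length-2 walks (common neighbours) between i and j
    n = len(adj)
    return [[sum(adj[i][k] * adj[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# 4-cycle: vertices 0-1-2-3-0
adj = [[0, 1, 0, 1],
       [1, 0, 1, 0],
       [0, 1, 0, 1],
       [1, 0, 1, 0]]
print(density(adj))             # → 0.6666666666666666
print(matrix_square(adj)[0][2]) # → 2 (vertices 0 and 2 share neighbours 1 and 3)
```

Vertices that share many neighbours (large entries in A²) tend to be forced into different relationships with a colour class, which is one way such a matrix can inform coloring heuristics.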
Abstract: This research evaluated the technical feasibility of making single-layer experimental particleboard panels from the bamboo waste (Dendrocalamus asper Backer) generated when bamboo is converted into strips for laminated bamboo furniture. The variable factors were density (600, 700 and 800 kg/m3) and conditioning temperature (25, 40 and 55 °C). The experimental panels were tested for their physical and mechanical properties, including modulus of elasticity (MOE), modulus of rupture (MOR), internal bonding strength (IB), screw holding strength (SH) and thickness swelling, according to the procedures defined by the Japanese Industrial Standard (JIS). The mechanical test results showed that the MOR, MOE and IB values did not meet the set criteria, except for the MOR values at a density of 700 kg/m3 at 25 °C and at a density of 800 kg/m3 at 25 and 40 °C, and the IB values at a density of 600 kg/m3 at 40 °C and at a density of 800 kg/m3 at 55 °C. The SH values met the set standard, except at a density of 600 kg/m3 at 40 and 55 °C. Conclusively, bamboo waste, a valuable renewable biomass, could be used to manufacture boards.
Abstract: The paper presents air velocity values reproduced by a laser Doppler anemometer (LDA) and an ultrasonic anemometer (UA), and their relation to values calculated from flow rate measurements using a gas meter whose calibration uncertainty is ±(0.15-0.30) %. The investigation was performed in a channel installed in an aerodynamic facility used as part of the national standard of air velocity. The relations defined in the research allow us to confirm the LDA and UA to be the most advantageous instruments for air velocity reproduction. The results affirm the ultrasonic anemometer to be a reliable and favourable instrument for measurement of mean velocity, or for control of velocity stability, in the velocity range of 0.05 m/s to 10 (15) m/s when compared against the LDA. The main aim of this research is to investigate low-velocity regularities, starting from 0.05 m/s and covering the turbulent, laminar and transitional air flow regions. Theoretical and experimental results and a brief analysis of them are given in the paper. Maximum and mean velocity relations for transitional air flow, which has a unique distribution, are presented. Transitional flow, whose characteristics are distinct from those of laminar and turbulent flow, has not yet been analysed experimentally.
Abstract: In this study, the effects of machining parameters on specific energy during surface grinding of 6061Al-SiC35P composites are investigated. The vol.% of SiC, feed and depth of cut were chosen as process variables. The power needed for the calculation of the specific energy is measured using the two-wattmeter method. Experiments are conducted using a standard RSM design called central composite design (CCD). A second-order response surface model was developed for the specific energy. The results identify the factors with significant influence for minimizing the specific energy. The confirmation results demonstrate the practicability and effectiveness of the proposed approach.
Abstract: The purpose of Semantic Web research is to transform the Web from a linked document repository into a distributed knowledge base and application platform, thus allowing the vast range of available information and services to be exploited more efficiently. As a first step in this transformation, languages such as OWL have been developed. Although fully realizing the Semantic Web still seems some way off, OWL has already been very successful and has rapidly become a de facto standard for ontology development in fields as diverse as geography, geology, astronomy, agriculture, defence and the life sciences. The aim of this paper is to classify the key concepts of the Semantic Web as well as to introduce a new practical approach which uses these concepts to outperform the World Wide Web.
Abstract: There are two common types of operational research techniques: optimisation and metaheuristic methods. The latter may be defined as a sequential process that intelligently performs exploration and exploitation, inspired by natural intelligence, to form several iterative searches. The aim is to effectively determine near-optimal solutions in a solution space. In this work, a type of metaheuristic called Ant Colony Optimisation (ACO), inspired by the foraging behaviour of ants, was adapted to find optimal solutions of eight non-linear continuous mathematical models. Over a solution space in a specified region of each model, sub-solutions may contain the global optimum or multiple local optima. Moreover, the algorithm has several common parameters (number of ants, moves, and iterations) which act as the algorithm's drivers. A series of computational experiments for initialising the parameters was conducted using the Rigid Simplex (RS) and Modified Simplex (MSM) methods. Experimental results were analysed in terms of the best-so-far solutions, mean and standard deviation. Finally, a recommendation of proper level settings of the ACO parameters for all eight functions is stated. These parameter settings can be applied as a guideline for future uses of ACO, in order to promote ease of use of ACO in real industrial processes. It was found that the results obtained from MSM were very similar to those gained from RS. However, when results with noise standard deviations of 1 and 3 are compared, MSM reaches optimal solutions more efficiently than RS in terms of speed of convergence.
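The abstract does not spell out how ACO is carried over to continuous domains; a minimal sketch in that spirit (a ranked solution archive guides Gaussian sampling, so better solutions are exploited more while the archive's spread drives exploration; all parameter names and values here are our own, not the paper's) is:

```python
import random

def aco_minimize(f, lo, hi, n_ants=20, archive=10, iters=200, q=0.2, xi=0.85, seed=1):
    # continuous ACO-style search: keep an archive of the best solutions,
    # sample new candidates around archive members, keep the best again
    rng = random.Random(seed)
    sols = sorted((rng.uniform(lo, hi) for _ in range(archive)), key=f)
    for _ in range(iters):
        new = []
        for _ in range(n_ants):
            # exploitation: better-ranked archive members guide more ants
            i = min(int(abs(rng.gauss(0, q * archive))), archive - 1)
            guide = sols[i]
            # exploration: sampling spread follows the archive's current spread
            sigma = xi * sum(abs(s - guide) for s in sols) / (archive - 1)
            new.append(min(max(rng.gauss(guide, sigma), lo), hi))
        sols = sorted(sols + new, key=f)[:archive]
    return sols[0]

# toy model: minimise (x - 3)^2 on [-10, 10]; converges near x = 3
print(aco_minimize(lambda x: (x - 3) ** 2, -10, 10))
```

The eight non-linear models of the paper would take the place of the toy quadratic, and the `n_ants`/`iters` settings are exactly the kind of driver parameters the simplex-based initialisation experiments tune.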
Abstract: One purpose of robust methods of estimation is to reduce the influence of outliers in the data on the estimates. Outliers arise from gross errors or from contamination by distributions with long tails. The trimmed mean is a robust estimate; that is, it is not sensitive to violations of the distributional assumptions of the data. It is called an adaptive estimate when the trimming proportion is determined from the data rather than being fixed a priori.
The main objective of this study is to find out the robustness properties of adaptive trimmed means in terms of efficiency, high breakdown point and influence function. Specifically, it seeks to find the magnitude of the trimming proportion of the adaptive trimmed mean which yields efficient and robust estimates of the parameter for data following a modified Weibull distribution with parameter λ = 1/2, where the trimming proportion is determined by a ratio of two trimmed means defined as the tail length. Secondly, the asymptotic properties of the tail length and the trimmed means are investigated. Finally, a comparison is made of the efficiency of the adaptive trimmed means, in terms of the standard deviation, between data-determined trimming proportions and trimming proportions fixed a priori.
The asymptotic tail lengths, defined as the ratio of two trimmed means, and the asymptotic variances were computed using the formulas derived. The values of the standard deviations of the derived tail lengths, for data of size 40 simulated from a Weibull distribution, were computed over 100 iterations using a computer program written in Pascal.
The findings of the study revealed that the tail lengths of the Weibull distribution increase in magnitude as the trimming proportions increase; that the measure of the tail length and the adaptive trimmed mean are asymptotically independent as the number of observations n approaches infinity; that the tail length is asymptotically distributed as the ratio of two independent normal random variables; and that the asymptotic variances decrease as the trimming proportions increase. The simulation study revealed empirically that the standard error of the adaptive trimmed mean using the ratio of tail lengths is relatively smaller, across different values of the trimming proportion, than its counterpart when the trimming proportions were fixed a priori.
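A small sketch clarifies the two ingredients: a trimmed mean, and a tail-length ratio used to pick the trimming proportion from the data. The tail-length statistic below is Hogg's classical Q (a ratio built from means of extreme fractions of the sample) and the cutoffs are illustrative; the paper's own tail-length definition and its modified Weibull setting are not reproduced here:

```python
import statistics as st

def trimmed_mean(data, p):
    # symmetric trimmed mean: drop floor(p * n) observations from each tail
    xs = sorted(data)
    k = int(p * len(xs))
    return st.mean(xs[k:len(xs) - k] if k else xs)

def tail_length_q(data):
    # Hogg's Q: a ratio of two "trimmed" means taken over extreme fractions
    xs = sorted(data)
    n = len(xs)
    def frac_mean(frac, upper):
        m = max(1, int(round(frac * n)))
        return st.mean(xs[-m:] if upper else xs[:m])
    return (frac_mean(0.05, True) - frac_mean(0.05, False)) / \
           (frac_mean(0.5, True) - frac_mean(0.5, False))

def adaptive_trimmed_mean(data):
    # adaptive rule with illustrative cutoffs: trim more when tails look heavier
    q = tail_length_q(data)
    p = 0.05 if q < 2.0 else (0.15 if q < 2.6 else 0.25)
    return trimmed_mean(data, p), p

print(trimmed_mean(range(1, 11), 0.2))  # → 5.5 (drops {1, 2} and {9, 10})
```

For light-tailed data Q stays small and little is trimmed; heavy-tailed contamination inflates Q and triggers heavier trimming, which is the adaptivity the abstract studies.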
Abstract: Negation is useful in the majority of real-world applications. However, its introduction leads to semantic and canonical problems. We propose in this paper an approach based on stratification to deal with negation problems. This approach is based on an extension of predicate nets. It is characterized by two main contributions. The first concerns the management of the whole class of stratified programs. The second contribution is related to the optimization of usual operations on stratified programs (maximal stratification, incremental updates, etc.).
Abstract: This paper presents a new hardware interface that uses a microcontroller to process audio music signals into standard MIDI data. A technique for processing music signals by extracting note parameters from them is described. An algorithm to convert the voice samples for real-time processing without complex calculations is proposed. A high-frequency microcontroller is deployed as the main processor to execute the outlined algorithm. The MIDI data generated are transmitted using the EIA-232 protocol. The analyses of the generated data show the feasibility of using microcontrollers for a real-time MIDI generation hardware interface.
Abstract: We report the electronic structure and optical properties of the NdF3 compound. Our calculations are based on density functional theory (DFT) using the full potential linearized augmented plane wave (FPLAPW) method with the inclusion of spin-orbit coupling. We employed the local spin density approximation (LSDA) and the Coulomb-corrected local spin density approximation (LSDA + U). We find that the standard LSDA approach is incapable of correctly describing the electronic properties of such materials, since it positions the f-bands incorrectly, resulting in an incorrect metallic ground state. On the other hand, the LSDA + U approximation, known for treating the highly correlated 4f electrons properly, is able to reproduce the correct insulating ground state. Interestingly, however, we do not find any significant differences between the optical properties calculated using LSDA and LSDA + U, suggesting that the 4f electrons do not play a decisive role in the optical properties of these compounds. The reflectivity of the NdF3 compound stays low up to 7 eV, which is consistent with its large energy gap. The calculated energy gaps are in good agreement with experiment. Our calculated reflectivity compares well with the experimental data, and the results are analyzed in the light of band-to-band transitions.
Abstract: Topology optimization is defined as the method of determining the optimal distribution of material for an assumed design space given the functionality, loads and boundary conditions [1]. Topology optimization can be used to optimize shape for the purposes of weight reduction, minimizing material requirements or selecting cost-effective materials [2]. Topology optimization has been implemented through the use of finite element methods for the analysis, together with optimization techniques based on the method of moving asymptotes, genetic algorithms, the optimality criteria method, level sets and topological derivatives. A case study of a typical fuselage design is considered in this paper to explain the benefits of topology optimization in the design cycle. A cylindrical shell is assumed as the design space, and aerospace-standard payloads were applied on the fuselage with the wing attachments as constraints. Topological optimization is then performed using finite element (FE) based software. This optimization results in a structural concept design which satisfies all the design constraints using minimum material.
Abstract: This paper presents a novel combined cycle of air separation and natural gas liquefaction. The idea is that natural gas can be liquefied while gaseous or liquid nitrogen and oxygen are produced in one combined cryogenic system. Cycle simulation and exergy analysis were performed to evaluate the process and thereby reveal the influence of the crucial parameter, the flow rate ratio β through the two-stage expanders, on the heat transfer temperature difference, its distribution and the consequent exergy loss. Composite curves for the combined hot streams (feed natural gas and recycled nitrogen) and the cold stream showed the degree of optimization available in this process if an appropriate β is chosen. The results indicated that increasing β reduces the temperature difference and the exergy loss in the heat exchange process. However, the maximum value of β should be confined in terms of the minimum temperature difference proposed in heat exchanger design standards and the heat exchanger size. The optimal βopt under different operating conditions, corresponding to the required minimum temperature differences, was investigated.
Abstract: Many states are now committed to implementing international human rights standards domestically. In terms of practical governance, how might effectiveness be measured? A face-value answer can be found in domestic laws and institutions relating to human rights. However, this article provides two further tools to help states assess where they stand on the spectrum from robust to fragile human rights governance. The first recognises that each state has its own 'human rights history' and that the ideal end stage is robust human rights governance; the second is a set of criteria for assessing robustness. Although a New Zealand case study is used to illustrate these tools, the widespread adoption of human rights standards by many states inevitably means that the issues are relevant to other countries, even though there will always be varying degrees of similarity and difference in constitutional background and in developed or emerging human rights systems.
Abstract: This paper presents the design trade-offs and performance impacts of the number of pipeline phases of the control path signals in a wormhole-switched network-on-chip (NoC). The number of pipeline phases of the control path varies between two-cycle and one-cycle. The control paths consist of the routing request paths for output selection and the arbitration paths for input selection. Data communications between on-chip routers are implemented synchronously, and for quality of service, the inter-router data transports are controlled by a link-level congestion control to avoid loss of data due to overflow. The trade-off between the area (logic cell area) and the performance (bandwidth gain) of the two proposed NoC router microarchitectures is presented in this paper. The performance evaluation is made by using a traffic scenario with different numbers of workloads on a 2D mesh NoC topology using a static routing algorithm. Using a 130-nm CMOS standard-cell technology, our NoC routers can be clocked at 1 GHz, resulting in a high-speed network link and a high router bandwidth capacity of about 320 Gbit/s. Based on our experiments, the number of control path pipeline stages has a more significant impact on the NoC performance than on the logic area of the NoC router.
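The quoted capacity figure can be sanity-checked with a one-line calculation: at a 1 GHz clock, 320 Gbit/s of aggregate router bandwidth means 320 bits cross the router per cycle. One configuration consistent with that is a five-port 2D-mesh router (north, south, east, west, local) with 64-bit links; the port count and link width are our assumptions, not stated in the abstract:

```python
CLOCK_HZ = 1_000_000_000  # 1 GHz router clock (from the abstract)
PORTS = 5                 # assumed: N, S, E, W and local port of a 2D-mesh router
LINK_BITS = 64            # assumed link (flit) width

# aggregate capacity: every port can move one flit per cycle
capacity_bps = PORTS * LINK_BITS * CLOCK_HZ
print(capacity_bps / 1e9)  # → 320.0 (Gbit/s)
```

Any (ports × link width) product of 320 bits per cycle gives the same figure, so the calculation constrains the microarchitecture without identifying it uniquely.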