Abstract: This paper analyses the structural changes in the education sector since the introduction of the liberalization policy in India. It explains how so-called non-profit trusts and societies appropriated the liberalization policy and established themselves as a new capitalist class in the higher education sector. Over the decades, the policy relied on the private sector to maintain market equilibrium, yet the state also witnessed the private sector's incompatibility with inculcating the values of social justice. The most important consequence of the policy is the rise of a new capitalist class and of academic capitalism. When the state realized that it could no longer cope with market demands, it opened higher education to the private sector, providing concessions and tax exemptions to trusts and societies to establish higher education institutions. There is a basic difference between Western countries and India in the provision of higher education by trusts and societies. In Western countries, big business houses contributed their surplus revenues to promote higher education and research as a complementary service to society and nation. In India, by contrast, several entrepreneurs entered the education sector with a business motive and, over time, accumulated wealth at the cost of students and of concessions from the government. Four major results can now be identified: production of manpower geared to market demands; a reduction of standards in higher education; a bypassing of the values of social justice; and the rise of a new capitalist class from the business of education. The paper substantiates these issues with inputs from case studies.
Abstract: Defect prevention is the most vital but habitually neglected facet of software quality assurance in any project. If practised at all stages of software development, it can reduce the time, overhead and resources required to engineer a high-quality product. The key challenge for the IT industry is to engineer a software product with minimal post-deployment defects. This work is an analysis based on data obtained for five selected projects from leading software companies of varying software production competence. The main aim of this paper is to provide information on the various methods and practices supporting defect detection and prevention that lead to successful software production. The defect prevention techniques studied unearth 99% of defects. Inspection is found to be an essential technique for generating near-ideal software, through enhanced methodologies of aided and unaided inspection schedules. On average, 13%-15% of project effort spent on inspection and 25%-30% spent on testing are required to eliminate 99%-99.75% of defects. A comparison of the end results for the five selected projects across the companies is also presented, throwing light on the possibility for a particular company to position itself with an appropriate complementary ratio of inspection to testing.
Abstract: The objective of this work was to examine the
changes in the microstructure and macro physical properties caused
by the carbonation of normalised CEM II mortar. Samples were
prepared and subjected to accelerated carbonation at 20°C, 65%
relative humidity and 20% CO2 concentration. On the microstructure
scale, the evolution of the cumulative pore volume, pore size distribution, and specific surface area during carbonation was calculated from the nitrogen adsorption-desorption isotherms. We also examined the evolution of macro physical properties such as the porosity accessible to water, the gas permeability, and the thermal conductivity. The discrepancy between the nitrogen-porosity and water-porosity results indicated that the porous domains explored by these two techniques are different and complement each other in evaluating the effects of carbonation. This is a multi-scale study in which results on microstructural changes help to explain the evolution of the macro physical properties.
Abstract: Given bipartite graphs H1 and H2, the bipartite Ramsey number b(H1;H2) is the smallest integer b such that, for any subgraph G of the complete bipartite graph Kb,b, either G contains a copy of H1 or its complement relative to Kb,b contains a copy of H2. It is known that b(K2,2;K2,2) = 5, b(K2,3;K2,3) = 9, b(K2,4;K2,4) = 14 and b(K3,3;K3,3) = 17. In this paper we study the case in which both H1 and H2 are even cycles, proving that b(C2m;C2n) ≥ m + n - 1 for m = n, and that b(C2m;C6) = m + 2 for m ≥ 4.
Abstract: Soft set theory was initiated by Molodtsov in 1999. In the past years, this theory has been applied to many branches of mathematics, information science and computer science. In 2003, Maji et al. introduced some operations on soft sets and gave some operational rules. Recently, some of these operational rules have been pointed out to be untrue. Furthermore, Ali et al., in their paper, introduced and discussed some new operations on soft sets. In this paper, we further investigate the operational rules given by Maji et al. and Ali et al. We obtain necessary and sufficient conditions under which the corresponding operational rules hold and give correct forms for some of them. These results will help us to apply the operational rules of soft sets correctly in research and applications of soft set theory.
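The operations under discussion can be illustrated concretely. The following minimal Python sketch models a soft set over a universe U as a dict mapping parameters to subsets of U; the union follows Maji et al. and the restricted intersection follows Ali et al. The dict representation and the example sets are our own assumptions for demonstration, not taken from the papers.

```python
def soft_union(F, G):
    """Union of soft sets (Maji et al.): parameters are merged, and
    approximations are united on parameters the two sets share."""
    H = {}
    for e in set(F) | set(G):
        H[e] = F.get(e, set()) | G.get(e, set())
    return H

def restricted_intersection(F, G):
    """Restricted intersection (Ali et al.): defined only over the
    common parameters."""
    return {e: F[e] & G[e] for e in set(F) & set(G)}

# Illustrative soft sets over a universe of houses {h1, ..., h4}
F = {"cheap": {"h1", "h2"}, "modern": {"h2"}}
G = {"cheap": {"h2", "h3"}, "wooden": {"h4"}}

print(soft_union(F, G))               # merged parameter set
print(restricted_intersection(F, G))  # {'cheap': {'h2'}}
```

Note that the two operations return soft sets over different parameter sets (the union of the parameter sets versus their intersection), which is precisely the kind of distinction the operational rules must respect.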
Abstract: In this paper we introduce an ultra low power CMOS
LC oscillator and analyze a method to design a low power low phase
noise complementary CMOS LC oscillator. A 1.8GHz oscillator is
designed based on this analysis. The circuit operates from a 1.1 V supply and dissipates 0.17 mW. The oscillator is also optimized for low phase noise: the phase noise is -126.2 dBc/Hz and -144.4 dBc/Hz at 1 MHz and 8 MHz offsets, respectively.
Abstract: In neural networks, when new patterns are learned by a network, the new information radically interferes with previously stored patterns. This drawback is called catastrophic forgetting or catastrophic interference. In this paper, we propose a biologically inspired neural network model which overcomes this problem. The proposed model consists of two distinct networks: one is a Hopfield-type chaotic associative memory and the other is a multilayer neural network. We consider that these networks correspond to the hippocampus and the neocortex of the brain, respectively. Incoming information is first stored in the hippocampal network with a fast learning algorithm. The stored information is then recalled through the chaotic behavior of each neuron in the hippocampal network. Finally, it is consolidated into the neocortical network by means of pseudopatterns. Computer simulation results show that the proposed model is much better at avoiding catastrophic forgetting than conventional models.
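The pseudopattern idea can be sketched as follows. This is a minimal illustration only, not the authors' chaotic model: a Hebbian Hopfield-type memory is probed with random inputs, and the attractors it settles into serve as consolidation samples for a second ("neocortical") network. The network size, number of patterns and probe count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_hopfield(patterns):
    # Hebbian weight matrix with zero self-connections
    P = np.array(patterns)
    W = P.T @ P / len(P)
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=20):
    # synchronous sign updates toward an attractor of the memory
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1.0
    return x

n = 16
stored = [rng.choice([-1.0, 1.0], size=n) for _ in range(2)]
W = train_hopfield(stored)

# Pseudopatterns: states recalled from random probes. Interleaving
# them with new patterns when training the neocortical network
# transfers the old memories without re-presenting them explicitly.
pseudopatterns = [recall(W, rng.choice([-1.0, 1.0], size=n))
                  for _ in range(5)]
print(len(pseudopatterns))  # 5 consolidation samples
```

In the paper's model the probes are generated by chaotic neuron dynamics rather than drawn uniformly at random; the sketch above only shows the data flow from the associative memory to the consolidation stage.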
Abstract: The Taiwan government has promoted the “Plain Landscape Afforestation and Greening Program" since 2002. A key task of the program was a payment for environmental services (PES) scheme, the “Plain Landscape Afforestation Policy" (PLAP), which was approved by the Executive Yuan on August 31, 2001 and enacted on January 1, 2002. Under the policy, the total afforested area was projected to reach 25,100 hectares by December 31, 2007. By the end of 2007, after six years in force, the actual afforested area was 8,919.18 hectares. Of this, Taiwan Sugar Corporation (TSC) accounted for 7,960 hectares (including 2,450.83 hectares of public service area), or 86.22% of the total afforested area; private farmland promoted by local governments accounted for 869.18 hectares, or 9.75%. We therefore observe that most of the afforested area under this policy is managed by TSC, and that TSC's achievement ratio is better than that of the others, implying that the success of the PLAP is closely tied to TSC's execution. The objective of this study is to analyze the policy planning relevant to TSC's participation in the PLAP, suggest complementary measures, and draw up effective adjustment mechanisms, so as to improve the effectiveness of the policy's execution. Our main conclusions and suggestions are as follows. 1. TSC's participation in the PLAP mainly stems from passive cooperation with the central government or with company policy; prior to participation, its lands were mainly used for growing sugarcane. 2. TSC's selection of tree species is based mainly on the suitability of land and species. The largest proportion of tree species is allocated to economic forests, and the lack of technical instruction was the main problem during afforestation; moreover, how to improve TSC's future development in leisure agriculture and the landscape business becomes a key topic. 3. TSC has developed short- and long-term plans for future participation in the PLAP; however, there is little willingness or incentive to budget for such detailed planning. 4. Most TSC interviewees consider the PLAP requirements unreasonable, the requirement on the number of trees being cited most often; furthermore, most interviewees suggested that the government should continue to provide incentives even after 20 years. 5. Since the government shares the same goals as TSC, there should be sufficient cooperation and communication to support technical instruction and the reduction of afforestation costs, which would also help to improve the effectiveness of the policy.
Abstract: MiRNAs participate in the regulation of gene translation. Some studies have investigated the interactions between genes and intragenic miRNAs, and it is important to study the miRNA binding sites of genes involved in carcinogenesis. The RNAHybrid 2.1 and ERNAhybrid programs were used to compute the hybridization free energy of miRNA binding sites. Of these 54 mRNAs, 22.6%, 37.7%, and 39.7% of miRNA binding sites were present in the 5'UTRs, CDSs, and 3'UTRs, respectively. The density of miRNA binding sites in the 5'UTR was from 1.6 to 43.2 times and from 1.8 to 8.0 times greater than in the CDS and 3'UTR, respectively. Three types of miRNA interaction with mRNAs were revealed: 5'-dominant canonical, 3'-compensatory, and complementary binding sites. MiRNAs regulate gene expression, and information on the interactions between miRNAs and mRNAs could be useful in molecular medicine. We recommend that the newly described sites be validated by experimental investigation.
Abstract: Protein structure determination and prediction have been a focal research subject in the field of bioinformatics, due to the importance of protein structure in understanding the biological and chemical activities of organisms. The experimental methods used by biotechnologists to determine the structures of proteins demand sophisticated equipment and time. A host of computational methods have been developed to predict the location of secondary structure elements in proteins, complementing or creating insights into experimental results. However, the prediction accuracies of these methods rarely exceed 70%.
Abstract: This paper deals with the optimal design of two-channel recursive parallelogram quadrature mirror filter (PQMF) banks. The analysis and synthesis filters of the PQMF bank are composed of two-dimensional (2-D) recursive digital all-pass filters (DAFs) with nonsymmetric half-plane (NSHP) support regions. The design problem is facilitated by the 2-D doubly complementary half-band (DC-HB) property possessed by the analysis and synthesis filters. To find the coefficients of the 2-D recursive NSHP DAFs, we formulate the design task as an optimization problem that can be solved by a weighted least-squares (WLS) algorithm in the minimax (L∞) optimal sense. The designed 2-D recursive PQMF bank achieves perfect magnitude response and possesses a satisfactory phase response without requiring an extra phase equalizer. Simulation results are also provided for illustration and comparison.
Abstract: A hybrid learning automata-genetic algorithm (HLGA) is proposed to solve the QoS routing optimization problem of next generation networks, an NP-complete problem. The algorithm combines the advantages of the Learning Automata (LA) algorithm and the Genetic Algorithm (GA). It first uses the good global search capability of LA to generate the initial population needed by the GA; it then uses the GA to improve the Quality of Service (QoS) and obtain the optimized tree through new crossover and mutation operators. In the proposed algorithm, the connectivity matrix of edges is used for genotype representation, and some novel heuristics are proposed for mutation, crossover, and the creation of random individuals. We evaluate the performance and efficiency of the proposed HLGA-based algorithm against other existing heuristic and GA-based algorithms by simulation. Simulation results demonstrate that the proposed algorithm is not only fast and accurate but also improves the efficiency of QoS routing in next generation networks, outperforming the previous algorithms in the literature.
Abstract: Scarcity of resources for biodiversity conservation gives rise to the need for strategic investment, with priority given to the cost of conservation. While the literature provides abundant methodological options for biodiversity conservation, estimating the true cost of conservation remains abstract and simplistic, without recognizing the dynamic nature of cost. Some recent works demonstrate the power of economic theory to inform biodiversity decisions, particularly on the costs and benefits of biodiversity; however, integration of the concept of true cost into biodiversity actions and planning has been very slow to come, especially at the farm level. Conservation planning studies often use area as a proxy for cost, neglecting differing land values as well as protected areas. Such studies consider only heterogeneous benefits, while land costs are treated as homogeneous. Analysis under the assumption of cost homogeneity yields biased estimates: not only does it fail to capture the true total cost of biodiversity actions and plans, it also fails to screen out lands that are more (or less) expensive and/or difficult (or more suitable) for biodiversity conservation purposes, hindering the validity and comparability of the results. “Economies of scope" is another of the most neglected aspects in the conservation literature. The concept of economies of scope introduces the existence of cost complementarities within a multiple-output production system, suggesting a lower cost when a given farm produces multiple outputs concurrently. If there are indeed economies of scope, then a simplistic representation of costs will tend to overestimate the true cost of conservation, leading to suboptimal outcomes. The aim of this paper, therefore, is to provide a first broad review of the various theoretical ways in which economies of scope are likely to occur in conservation. Consequently, the paper addresses gaps that need to be filled in future analysis.
Abstract: The effect of chemical treatment in CdCl2 on the
compositional changes and defect structures of potentially useful ZnS
solar cell thin films, prepared by the vacuum deposition method, was studied using the complementary Rutherford backscattering (RBS) and thermoluminescence (TL) techniques. A series of electron and hole traps is found in the various as-deposited samples studied. After treatment, a perturbation of the intensity is noted; mobile defect states and charge conversion and/or transfer between defect states are found.
Abstract: Functionalities and control behavior are both primary requirements in the design of a complex system. Automata theory plays an important role in modeling the behavior of a system. Z is an ideal notation for describing the state space of a system and then defining operations over it. Consequently, an integration of automata and Z will be an effective tool for increasing the modeling power for complex systems. Further, a nondeterministic finite automaton (NFA) may have different implementations, so it is necessary to verify the transformation from diagrams to code. If we describe the formal specification of an NFA before implementing it, confidence in the transformation can be increased. In this paper, we give a procedure for integrating NFAs and Z. The complement of a special type of NFA is defined. Then the union of two NFAs is formalized after defining their complements. Finally, the formal construction of the intersection of NFAs is described. The specification of this relationship is analyzed and validated using the Z/EVES tool.
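The intersection construction formalized above can be illustrated operationally by the standard product automaton, which runs two NFAs in lockstep and accepts exactly the words both accept. The Python sketch below is our illustration, not the Z/EVES specification; the two example automata are assumptions for demonstration.

```python
# NFAs are tuples (states, start_states, accept_states, delta), with
# delta mapping (state, symbol) to a set of successor states.

def product_accepts(nfa1, nfa2, word):
    (_, s1, f1, d1) = nfa1
    (_, s2, f2, d2) = nfa2
    current = {(p, q) for p in s1 for q in s2}  # product start states
    for a in word:
        # advance both components simultaneously on symbol a
        current = {(p2, q2)
                   for (p, q) in current
                   for p2 in d1.get((p, a), set())
                   for q2 in d2.get((q, a), set())}
    # accept iff some reachable pair is accepting in both NFAs
    return any(p in f1 and q in f2 for (p, q) in current)

# N1 accepts words containing 'a'; N2 accepts words of even length.
N1 = ({0, 1}, {0}, {1},
      {(0, 'a'): {0, 1}, (0, 'b'): {0}, (1, 'a'): {1}, (1, 'b'): {1}})
N2 = ({0, 1}, {0}, {0},
      {(0, 'a'): {1}, (0, 'b'): {1}, (1, 'a'): {0}, (1, 'b'): {0}})

print(product_accepts(N1, N2, "ab"))  # True: contains 'a', even length
print(product_accepts(N1, N2, "b"))   # False: no 'a'
```

Because NFAs are closed under this product without determinization, the construction is a natural candidate for the kind of schema-level formalization in Z that the paper carries out.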
Abstract: In this paper, we study the knapsack sharing problem, a variant of the well-known NP-hard single knapsack problem. We investigate the use of a tree search for optimally solving the problem. The method combines two complementary phases: an interval reduction search phase and a branch-and-bound procedure. First, the reduction phase applies a polynomial reduction strategy that decomposes the problem into a series of knapsack problems. Second, the tree search procedure is applied in order to attain the set of optimal capacities characterizing the knapsack problems. Finally, the performance of the proposed optimal algorithm is evaluated on a set of instances from the literature, and its runtime is compared to that of the best exact algorithm in the literature.
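To make the branch-and-bound ingredient concrete, the sketch below solves a standard 0/1 knapsack (the kind of subproblem the reduction phase produces) by depth-first search pruned with the fractional (LP) upper bound. It is a generic textbook illustration, not the authors' two-phase algorithm, and the instance at the end is an assumption for demonstration.

```python
def knapsack_bb(values, weights, capacity):
    # sort items by value density so the fractional bound is tight
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]
    n = len(v)
    best = 0

    def bound(i, cap, acc):
        # greedy fractional relaxation over the remaining items
        b = acc
        for j in range(i, n):
            if w[j] <= cap:
                cap -= w[j]
                b += v[j]
            else:
                return b + v[j] * cap / w[j]
        return b

    def dfs(i, cap, acc):
        nonlocal best
        if i == n:
            best = max(best, acc)
            return
        if bound(i, cap, acc) <= best:
            return  # prune: the relaxation cannot beat the incumbent
        if w[i] <= cap:
            dfs(i + 1, cap - w[i], acc + v[i])  # branch: take item i
        dfs(i + 1, cap, acc)                    # branch: skip item i

    dfs(0, capacity, 0)
    return best

print(knapsack_bb([60, 100, 120], [10, 20, 30], 50))  # 220
```

In the knapsack sharing setting, a search of this kind is invoked on each candidate capacity split, which is why tightening the capacity intervals before branching pays off.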
Abstract: In this paper, an efficient technique is proposed to manage the cache memory. The proposed technique introduces some modifications to the well-known set associative mapping technique. The modification requires a small alteration in the structure of the cache memory and in the way it is referenced. The proposed alteration increases the set size virtually and consequently improves the performance and the utilization of the cache memory. Current mapping techniques have accomplished good results; however, there are still cases in which cache memory lines are left empty and unused while two or more processes overwrite each other's lines instead of using those empty lines. The proposed algorithm aims at finding an efficient way to deal with this problem.
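The conflict-miss problem described above is easy to reproduce with a small simulation of the baseline being modified: a conventional set-associative cache with LRU replacement. The geometry (4 sets, 2 ways) and the address trace are illustrative assumptions; the paper's virtual enlargement of the set size is not reproduced here.

```python
from collections import OrderedDict

class SetAssociativeCache:
    """Conventional set-associative cache with LRU replacement."""

    def __init__(self, num_sets=4, ways=2):
        self.num_sets = num_sets
        self.ways = ways
        self.sets = [OrderedDict() for _ in range(num_sets)]

    def access(self, block):
        s = self.sets[block % self.num_sets]  # set index from address
        if block in s:
            s.move_to_end(block)  # hit: refresh LRU order
            return True
        if len(s) >= self.ways:
            s.popitem(last=False)  # miss: evict least recently used
        s[block] = True
        return False

cache = SetAssociativeCache()
# Blocks 0, 4 and 8 all map to set 0 and contend for its 2 ways,
# while the other 3 sets stay completely empty.
trace = [0, 4, 0, 8, 0, 4]
hits = sum(cache.access(b) for b in trace)
print(hits)  # 2: only the two re-accesses of block 0 hit
```

The trace shows the pathology the paper targets: lines are evicted from a full set even though most of the cache is idle, which is exactly the capacity the proposed virtual set enlargement tries to reclaim.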
Abstract: Vinegar is a precious food additive and complement, as well as an effective preservative against food spoilage. Recently, traditional vinegar production has been improved using various natural substrates and fruits such as grape, palm, cherry, coconut, date, sugarcane, rice and balsam. These neoclassical fermentations have resulted in several vinegar types with different tastes, fragrances and nutritional values, because various acetic acid bacteria are applied as starters. Acetic acid bacteria include the genera Acetobacter, Gluconacetobacter and Gluconobacter according to the latest edition of Bergey's Manual of Systematic Bacteriology, which classifies genera on the basis of their 16S rRNA differences. Acetobacter spp., the main vinegar starters, belong to the family Acetobacteraceae; they are gram-negative, obligately aerobic, chemoorganotrophic bacilli that are oxidase-negative and oxidize ethanol to acetic acid. In this research we isolated and identified a native Acetobacter strain with high acetic acid productivity and tolerance of high ethanol concentrations from the Iranian peach, a delicious summer fruit that is very susceptible to spoilage and decay. We used selective and specific laboratory culture media such as standard GYC, Frateur and Carr media, as well as a new industrial culture medium and a miniature fermentor with a new aeration system developed by Pars Yeema Biotechnologists Co., Isfahan Science and Technology Town (ISTT), Isfahan, Iran. The isolated strain was successfully cultivated in modified Carr media with 2.5% and 5% ethanol at high temperatures (34-40 °C) after a 96-hour incubation period. We showed that increasing the ethanol concentration raised the strain's sensitivity to high temperature. In conclusion, we isolated and characterized a new Acetobacter strain from the Iranian peach that can be considered a potential strain for the production of a new vinegar type, peach vinegar, with a delicious taste and advantageous nutritional value in food biotechnology and industrial microbiology.
Abstract: This article first summarizes the reasons why current approaches supporting Open Learning and Distance Education need to be complemented by tools permitting lecturers, researchers and students to cooperatively organize the semantic content of learning-related materials (courses, discussions, etc.) into a fine-grained shared semantic network. This first part of the article also briefly describes the approach adopted to permit such collaborative work. Then, examples of such semantic networks are presented. Finally, an evaluation of the approach by students is provided and analyzed.
Abstract: Structural representation and technology mapping of
a Boolean function is an important problem in the design of nonregenerative
digital logic circuits (also called combinational logic
circuits). Library aware function manipulation offers a solution to
this problem. Compact multi-level representation of binary networks,
based on simple circuit structures, such as AND-Inverter Graphs
(AIG) [1] [5], NAND Graphs, OR-Inverter Graphs (OIG), AND-OR
Graphs (AOG), AND-OR-Inverter Graphs (AOIG), AND-XORInverter
Graphs, Reduced Boolean Circuits [8] does exist in
literature. In this work, we discuss a novel and efficient graph
realization for combinational logic circuits, represented using a
NAND-NOR-Inverter Graph (NNIG), which is composed of only
two-input NAND (NAND2), NOR (NOR2) and inverter (INV) cells.
The networks are constructed on the basis of irredundant disjunctive
and conjunctive normal forms, after factoring, comprising terms with
minimum support. Construction of an NNIG for a non-regenerative
function in normal form would be straightforward, whereas for the
complementary phase, it would be developed by considering a virtual
instance of the function. However, the choice of best NNIG for a
given function would be based upon literal count, cell count and
DAG node count of the implementation at the technology
independent stage. In case of a tie, the final decision would be made
after extracting the physical design parameters.
We have considered the AIG representation for the reduced disjunctive normal form and the best of OIG/AOG/AOIG for the minimized
conjunctive normal forms. This is necessitated by the nature of certain functions, such as Achilles'-heel functions. NNIGs are found to exhibit a 3.97% lower node count than AIGs and OIG/AOG/AOIGs, and consume 23.74% and 10.79% fewer library cells than AIGs and OIG/AOG/AOIGs, respectively, for the various samples considered.
We compare the power efficiency and delay improvement achieved
by optimal NNIGs over minimal AIGs and OIG/AOG/AOIGs for
various case studies. In comparison with functionally equivalent,
irredundant and compact AIGs, NNIGs report mean savings in power
and delay of 43.71% and 25.85% respectively, after technology
mapping with a 0.35 micron TSMC CMOS process. For a
comparison with OIG/AOG/AOIGs, NNIGs demonstrate average
savings in power and delay of 47.51% and 24.83%, respectively. With respect to the device count needed for implementation in static CMOS logic style, NNIGs utilize 37.85% and 33.95% fewer transistors than their AIG and OIG/AOG/AOIG counterparts.
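The cell restriction that defines an NNIG can be demonstrated with a toy netlist. The sketch below is a hypothetical representation of our own, not the paper's construction from irredundant normal forms: a DAG whose internal nodes are limited to two-input NAND (NAND2), two-input NOR (NOR2) and inverter (INV) cells, shown here realizing the example function f = a·b + c.

```python
class Node:
    """One cell of a toy NNIG-style netlist."""

    def __init__(self, kind, *ins):
        self.kind = kind  # 'in', 'INV', 'NAND2' or 'NOR2'
        self.ins = ins    # input name for 'in', child nodes otherwise

    def eval(self, env):
        if self.kind == 'in':
            return env[self.ins[0]]
        v = [n.eval(env) for n in self.ins]
        if self.kind == 'INV':
            return 1 - v[0]
        if self.kind == 'NAND2':
            return 1 - (v[0] & v[1])
        return 1 - (v[0] | v[1])  # NOR2

a, b, c = Node('in', 'a'), Node('in', 'b'), Node('in', 'c')
# f = a*b + c rewritten over the NNIG cell set:
#   a*b + c = NOT(NOT(a*b) AND NOT(c)) = NAND2(NAND2(a, b), INV(c))
f = Node('NAND2', Node('NAND2', a, b), Node('INV', c))

# exhaustive functional check against the original expression
for x in range(8):
    env = {'a': x & 1, 'b': (x >> 1) & 1, 'c': (x >> 2) & 1}
    assert f.eval(env) == ((env['a'] & env['b']) | env['c'])
print("NNIG netlist matches a*b + c on all 8 input combinations")
```

Metrics such as the literal, cell and DAG node counts used to select the best NNIG for a function would be computed by traversing exactly this kind of structure.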