Abstract: Mung bean starches were subjected to heat-moisture treatment (HMT) at different moisture contents (15%, 20%, 25%, 30% and 35%) at 120°C for 12 h. The impact on the yield of resistant starch (RS), microstructure, and physicochemical and functional properties was investigated. Compared to native starch, the RS content of heat-moisture treated starches increased significantly, and the RS level of HMT-20 was the highest of all the starches. Native starch displayed clear birefringence at the granule center. For HMT starches, pronounced birefringence was exhibited on the periphery of the granules, while it disappeared at the center of some granules. The shape of HMT starches was unchanged and granule integrity was preserved under all conditions, although concavities could be observed on HMT starches under scanning electron microscopy. After HMT, apparent amylose contents increased and the starch macromolecules were degraded in comparison with native starch. HMT starches showed reduced swelling power, but their solubility was higher than that of native starch. Both native and HMT starches showed an A-type X-ray diffraction pattern; furthermore, HMT starches showed higher intensities at the 15.0° and 22.9° (2θ) peaks than native starch.
Abstract: PAX6, a transcription factor, is essential for the morphogenesis of the eyes, brain, pituitary and pancreatic islets. In rodents, the loss of Pax6 function leads to central nervous system defects, anophthalmia, and nasal hypoplasia. Haploinsufficiency of Pax6 causes microphthalmia, aggression and other behavioral abnormalities. It is also required in brain patterning and neuronal plasticity. In humans, heterozygous mutation of PAX6 causes loss of the iris (aniridia), mental retardation and glucose intolerance. The 3′ deletion in PAX6 leads to autism and aniridia. The phenotypes are variable in penetrance and expressivity. However, the mechanism of PAX6 function and its interactions with other proteins during development and in associated disease are not clear. We intend to explore interactors of PAX6 to elucidate the biology of PAX6 function in the tissues where it is expressed and in the central regulatory pathway. This report describes in silico approaches to explore interacting proteins of PAX6. The models show several proteins possibly interacting with PAX6, such as MITF, SIX3, SOX2, SOX3, IPO13, TRIM and OGT. Since PAX6 is a critical transcriptional regulator and a master control gene of eye and brain development, it might interact with other proteins involved in morphogenesis (TGIF, TGF, Ras, etc.). It is also presumed that matricellular proteins (SPARC, thrombospondin-1, osteonectin, etc.) are likely to interact during transport and processing of PAX6 and lie somewhere in its cascade. Proteins involved in cell survival and cell proliferation can also not be ignored.
Abstract: An application framework provides a reusable design
and implementation for a family of software systems. If the
framework contains defects, the defects will be passed on to the
applications developed from the framework. Framework defects are
hard to discover at the time the framework is instantiated. Therefore,
it is important to remove all defects before instantiating the
framework. In this paper, two measures for the adequacy of an
object-oriented system-based testing technique are introduced. The
measures assess the usefulness and uniqueness of the testing
technique. The two measures are applied to experimentally compare
the adequacy of two testing techniques introduced to test
object-oriented frameworks at the system level. The two considered testing
techniques are the New Framework Test Approach and Testing
Frameworks Through Hooks (TFTH). The techniques are also
compared analytically in terms of their coverage power of
object-oriented aspects. The comparison study results show that the TFTH
technique is better than the New Framework Test Approach in terms
of usefulness degree, uniqueness degree, and coverage power.
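The abstract does not define the usefulness and uniqueness measures formally; one plausible set-based reading, treating each technique as the set of defects it exposes (all names and defect sets below are illustrative, not the paper's data), can be sketched as:

```python
def usefulness(detected, all_defects):
    """Share of all known defects that the technique exposes."""
    return len(detected & all_defects) / len(all_defects)

def uniqueness(detected, other_detected):
    """Share of the technique's detections found by no other technique."""
    return len(detected - other_detected) / len(detected)

# Hypothetical defect sets for the two compared techniques
all_defects = {1, 2, 3, 4, 5, 6}
tfth = {1, 2, 3, 4, 5}   # defects exposed by TFTH (illustrative)
nfta = {1, 2, 3}         # defects exposed by the New Framework Test Approach

print(usefulness(tfth, all_defects))  # 5 of 6 defects
print(uniqueness(tfth, nfta))         # defects 4 and 5 are unique to TFTH
```

Under this reading, a technique with higher usefulness finds more of the known defects, and higher uniqueness means more of its findings are missed by the competing technique.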
Abstract: Gas turbine air inlet cooling is a useful method for
increasing output in regions where significant power demand and
the highest electricity prices occur during the warm months. Inlet air
cooling increases power output by exploiting the gas turbine's
higher mass flow rate when the compressor inlet
temperature decreases. Different methods are available for reducing
gas turbine inlet temperature. There are two basic systems currently
available for inlet cooling. The first and most cost-effective system is
evaporative cooling. Evaporative coolers make use of the evaporation
of water to reduce the gas turbine's inlet air temperature. The second
system employs various ways to chill the inlet air. In this method, the
cooling medium flows through a heat exchanger located in the inlet
duct to remove heat from the inlet air. However, evaporative
cooling is limited by the wet-bulb temperature, while chilling can
cool the inlet air to temperatures below the wet-bulb
temperature. In the present work, a thermodynamic model of a gas
turbine is built to calculate heat rate, power output and thermal
efficiency at different inlet air temperature conditions. Computational
results are compared with ISO conditions herein called "base-case".
Then, the two cooling methods are implemented and solved for
different inlet conditions (inlet temperature and relative humidity).
Evaporative cooler and absorption chiller systems results show that
when the ambient temperature is extremely high with low relative
humidity (requiring a large temperature reduction) the chiller is the
more suitable cooling solution. The net increment in the power output
as a function of the temperature decrease for each cooling method is
also obtained.
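The first-order effect exploited above, that power tracks inlet air density (and hence mass flow) at a fixed volumetric flow, can be sketched as follows. Treating air as an ideal gas at sea-level pressure is an assumption for illustration; the paper's thermodynamic model is far more complete.

```python
def air_density(t_celsius, p=101325.0, r_gas=287.05):
    """Ideal-gas density of dry air (kg/m^3) at pressure p (Pa)."""
    return p / (r_gas * (t_celsius + 273.15))

def relative_power_gain(t_ambient, t_cooled):
    """First-order estimate: at fixed volumetric flow, mass flow
    (and hence power output) scales with inlet air density."""
    return air_density(t_cooled) / air_density(t_ambient) - 1.0

gain = relative_power_gain(35.0, 15.0)  # cool the inlet from 35 C to 15 C
print(f"{gain:.1%}")  # about 6.9% more mass flow, hence roughly that much more power
```

This simple density ratio already shows why hot, low-humidity sites (large achievable temperature drops) benefit most from chilling.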
Abstract: In recent years, tuned mass damper (TMD) control systems for civil engineering structures have attracted considerable attention. This paper focuses on the application of particle swarm optimization (PSO) to design and optimize the parameters of the TMD control scheme so as to achieve the best reduction of the building response under earthquake excitations. The Integral of the Time multiplied Absolute value of the Error (ITAE), based on the relative displacement of all floors in the building, is taken as the performance index of the optimization criterion. The robust TMD controller design problem is formulated as an optimization problem based on the ITAE performance index, to be solved using the PSO technique, which has a strong ability to find the most promising results. An 11-story realistic building located in the city of Rasht, Iran, is considered as a test system to demonstrate the effectiveness of the proposed method. Analysis of the results through time-domain simulation and several performance indices reveals that the designed PSO-based TMD controller has an excellent capability to reduce the response of the seismically excited example building.
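A minimal PSO loop of the kind applied above can be sketched as follows. The objective here is a stand-in quadratic: in the paper, the ITAE index would come from a time-domain simulation of the 11-story building under earthquake excitation, and the TMD parameter bounds and swarm settings below are purely illustrative.

```python
import random

random.seed(0)  # reproducible illustration

def pso(objective, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer minimizing `objective` over box bounds."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            f = objective(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Stand-in for the ITAE index; the hypothetical decision variables could be,
# e.g., a TMD tuning parameter and a damping-related parameter.
itae_like = lambda x: (x[0] - 0.02) ** 2 + (x[1] - 5.0) ** 2
best, best_f = pso(itae_like, bounds=[(0.0, 0.1), (0.0, 10.0)])
print(best_f)
```

With the real ITAE objective plugged in, each function evaluation runs one seismic simulation, so the swarm size and iteration count trade off cost against solution quality.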
Abstract: Software complexity metrics are used to predict
critical information about reliability and maintainability of software
systems. Object oriented software development requires a different
approach to software complexity metrics. Object Oriented Software
Metrics can be broadly classified into static and dynamic metrics.
Static metrics give information at the code level, whereas dynamic
metrics provide information on actual runtime behavior. In this
paper, we discuss the various complexity metrics and compare
static and dynamic complexity.
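The static/dynamic contrast above can be made concrete with a small sketch: a static metric is computed from the source text alone, while a dynamic metric requires the program to run. Both measures below are simplified illustrations, not any particular published metric.

```python
import ast
import functools

def static_branch_count(source):
    """Static metric: count decision points in the source
    (a rough ingredient of cyclomatic complexity)."""
    tree = ast.parse(source)
    return sum(isinstance(n, (ast.If, ast.For, ast.While)) for n in ast.walk(tree))

calls = {}

def dynamic_call_counter(fn):
    """Dynamic metric: number of times a function actually executes."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        calls[fn.__name__] = calls.get(fn.__name__, 0) + 1
        return fn(*args, **kwargs)
    return wrapper

src = "def f(x):\n    if x > 0:\n        return x\n    return -x\n"
print(static_branch_count(src))   # one decision point, known without running

@dynamic_call_counter
def f(x):
    return abs(x)

f(1); f(-2)
print(calls["f"])                 # two calls, known only at runtime
```

The same function can thus look simple statically yet dominate a runtime profile, which is why the two metric families complement each other.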
Abstract: Natural disasters, including earthquakes, kill many people around the world every year. Rescue actions, which start after an earthquake and are abbreviated LAST, include locating, access, stabilization and transportation. In the present article, we study the process of gaining local access to the injured and transporting them to health care centers. Given the heavy traffic load after an earthquake, the destruction of connecting roads and bridges and the heavy debris in alleys and streets, which put the lives of the injured and of people buried under the debris in danger, accelerating rescue actions and facilitating access are obviously of great importance. Tehran, the capital of Iran, is among the most crowded cities in the world and is the center of extensive economic, political, cultural and social activities. Tehran has a population of about 9.5 million, partly because of immigration from the surrounding cities. Furthermore, considering that Tehran is located on two important and large faults, a magnitude-6 earthquake in this city could lead to the greatest catastrophe in human history. The present study is a review, and a major part of the required information was obtained from libraries. Rescue vehicles around the world, including rescue helicopters, ambulances, fire fighting vehicles and rescue boats, along with their applied technology, robots specifically designed for rescue, and their advantages and disadvantages, have been investigated. The studies show a significant relationship between the rescue team's arrival time at the incident zone and the number of people saved: if the duration of burial under debris is 30 minutes, the probability of survival is 99.3%; after one day it is 81%, after two days 19% and after five days 7.4%. The existing transport systems all have some defects.
If these defects were removed, more people could be saved each hour and preparedness against natural disasters would increase. In this study, a transport system has been designed for the rescue team and the injured, which can carry the rescue team to the incident zone and the injured to health care centers. In addition, this system is able to fly in the air as well as move on the ground, so that destroyed roads and heavy traffic cannot prevent the rescue team from arriving early at the incident zone. The system also has the equipment required for debris removal, optimal transport of the injured, and first aid.
Abstract: In this study, a new root-finding method for solving nonlinear equations is proposed. This method requires two starting values that do not necessarily bracket a root. However, when the starting values are selected close to a root, the proposed method converges to the root more quickly than the secant method. Another advantage over other iterative methods is that the proposed method usually converges to two distinct roots when the given function has more than one root; that is, the odd iterations of the new technique converge to one root and the even iterations converge to another. Some numerical examples, including a sine-polynomial equation, are solved using the proposed method and compared with results obtained by the secant method; excellent agreement is found.
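The abstract does not give the new method's iteration formula, so as an illustration here is the secant baseline it is compared against, applied to a sine-polynomial equation of the kind mentioned (the particular equation sin(x) = x/2 is an assumption, not the paper's test case):

```python
import math

def secant(f, x0, x1, tol=1e-12, max_iter=100):
    """Classical secant iteration: the baseline method of the comparison."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:
            break  # flat secant line; stop rather than divide by zero
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# sin(x) = x / 2 has a positive root near x = 1.8955
root = secant(lambda x: math.sin(x) - 0.5 * x, 1.5, 2.5)
print(root)
```

Note that the two starting values 1.5 and 2.5 do bracket this root; the method proposed in the paper reportedly drops even that requirement.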
Abstract: Electronic government is one of the notable concepts
implemented successfully within recent decades. Electronic
government is a digital, wall-free government with a virtual
organization for presenting online governmental services and for
further cooperation in different political and social activities. In
order to implement an electronic government strategy successfully
and benefit from its full potential, and generally to establish and
apply electronic government, different infrastructures are required
as its basics, without which it is impossible to benefit from the
mentioned services. For this purpose, in this paper we identify the
obstacles to the establishment of electronic government in Iran. All
data required for recognizing the obstacles were collected through a
questionnaire from a statistical population of specialists at the
Ministry of Communications & Information Technology of Iran and
the Information Technology Organization of Tehran Municipality.
Then, using a five-point Likert scale with μ = 3 as the index for the
relevant factors of the proposed model, we specify the current
obstacles to electronic government in Iran, along with some
guidelines and proposals in this regard. According to the results, the
obstacles to applying electronic government in Iran are as follows:
technical and technological problems; legal, judicial and safety
problems; economic problems; and humanistic problems.
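The μ = 3 decision rule on a five-point Likert scale reduces to a simple mean comparison per factor. The sketch below uses hypothetical questionnaire data (the factor names echo the abstract; the response values are invented for illustration):

```python
def significant_obstacles(responses, mu=3.0):
    """Flag factors whose mean five-point Likert score exceeds the mu index."""
    means = {factor: sum(v) / len(v) for factor, v in responses.items()}
    return {factor: m for factor, m in means.items() if m > mu}

# Hypothetical data: 1 = strongly disagree ... 5 = strongly agree
responses = {
    "technical_technological": [4, 5, 4, 3, 4],
    "legal_judicial_safety":   [3, 4, 4, 4, 3],
    "economic":                [4, 4, 5, 3, 4],
    "humanistic":              [3, 4, 3, 4, 4],
    "not_an_obstacle":         [2, 3, 2, 2, 1],
}
print(significant_obstacles(responses))
```

Factors whose mean stays at or below the midpoint μ = 3 are dropped, leaving the obstacle categories reported in the results.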
Abstract: While the form of crises may change, their essence
remains the same (such as a cycle of abundant liquidity, rapid credit
growth, and a low-inflation environment followed by an asset-price
bubble). The current market turbulence began in mid-2000s when the
US economy shifted to imbalanced internal and external
macroeconomic positions. We see two key causes of these problems
– loose US monetary policy in early 2000s and US government
guarantees issued on the securities of government-sponsored
enterprises, which was further fueled by financial innovations such as
structured credit products. We have discovered both negative and
positive lessons deriving from this crisis and divided the negative
lessons into three groups: financial products and valuation, processes
and business models, and strategic issues. Moreover, we address key
risk management lessons and exit strategies derived from the current
crisis and recommend policies that should help diminish the negative
impact of future potential crises.
Abstract: In this paper we propose a method for finding video
frames representing one sign in the finger alphabet. The method is
based on determining hands location, segmentation and the use of
standard video quality evaluation metrics. Metric calculation is
performed only in regions of interest. Sliding mechanism for finding
local extrema and adaptive threshold based on local averaging is used
for key frames selection. The success rate is evaluated by recall,
precision and F1 measure. The method's effectiveness is compared
with metrics applied to all frames. The proposed method is fast,
effective and relatively easy to realize through simple input video
preprocessing and subsequent use of tools designed for video
quality measurement.
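The key-frame selection step, a sliding search for local extrema gated by an adaptive threshold based on local averaging, can be sketched on a per-frame metric trace. The window size and the metric values below are illustrative, not the paper's settings:

```python
def key_frames(metric, window=5):
    """Select key frames as local maxima of a per-frame quality metric
    that exceed an adaptive threshold (the local mean of the window)."""
    keys = []
    half = window // 2
    for i in range(half, len(metric) - half):
        local = metric[i - half : i + half + 1]
        threshold = sum(local) / len(local)      # adaptive: local averaging
        if metric[i] == max(local) and metric[i] > threshold:
            keys.append(i)
    return keys

# Hypothetical per-frame metric computed only in the hand region of interest
metric = [0.2, 0.3, 0.9, 0.3, 0.2, 0.1, 0.2, 0.8, 0.3, 0.2]
print(key_frames(metric))  # indices of candidate key frames: [2, 7]
```

Because the threshold follows the local average, the same rule adapts to slow drifts in the metric without any global tuning.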
Abstract: The literature reports a large number of approaches for
measuring the similarity between protein sequences. Most of these
approaches estimate this similarity using alignment-based techniques
that do not necessarily yield biologically plausible results, for two
reasons.
First, for the case of non-alignable (i.e., not yet definitively aligned
and biologically approved) sequences such as multi-domain, circular
permutation and tandem repeat protein sequences, alignment-based
approaches do not succeed in producing biologically plausible results.
This is due to the nature of the alignment, which is based on the
matching of subsequences in equivalent positions, while non-alignable
proteins often have similar and conserved domains in non-equivalent
positions.
Second, the alignment-based approaches lead to similarity measures
that depend heavily on the parameters set by the user for the alignment
(e.g., gap penalties and substitution matrices). For easily alignable
protein sequences, it is possible to supply a suitable combination of
input parameters that allows such an approach to yield biologically
plausible results. However, for difficult-to-align protein sequences,
supplying different combinations of input parameters yields different
results. Such variable results create ambiguities and complicate the
similarity measurement task.
To overcome these drawbacks, this paper describes a novel and
effective approach for measuring the similarity between protein
sequences, called SAF for Substitution and Alignment Free. Without
resorting either to the alignment of protein sequences or to substitution
relations between amino acids, SAF is able to efficiently detect the
significant subsequences that best represent the intrinsic properties of
protein sequences, those underlying the chronological dependencies of
structural features and biochemical activities of protein sequences.
Moreover, by using a new efficient subsequence matching scheme,
SAF more efficiently handles protein sequences that contain similar
structural features with significant meaning in chronologically
non-equivalent positions. To show the effectiveness of SAF, extensive
experiments were performed on protein datasets from different
databases, and the results were compared with those obtained by
several mainstream algorithms.
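SAF's actual subsequence detection and matching scheme is not reproduced here; as a minimal illustration of the alignment-free, substitution-free idea, cosine similarity over k-mer counts scores conserved domains highly even when they sit in non-equivalent positions, as in the domain-shuffled pair below (the sequences are invented):

```python
import math
from collections import Counter

def kmer_profile(seq, k=3):
    """Count overlapping k-mers: an alignment-free sequence representation."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def similarity(seq_a, seq_b, k=3):
    """Cosine similarity of k-mer profiles: no alignment, no substitution matrix."""
    a, b = kmer_profile(seq_a, k), kmer_profile(seq_b, k)
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Two "domains" in swapped order: hard for alignment, easy for k-mer counting
print(similarity("MKTAYIAKQR" + "GGSSGG", "GGSSGG" + "MKTAYIAKQR"))  # high
print(similarity("MKTAYIAKQR", "PLWQECNVFD"))                        # 0.0
```

An alignment-based score would penalize the swapped domain order heavily, while the position-free profile comparison does not, which is the behavior the abstract argues for.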
Abstract: With data centers, end-users can realize the pervasiveness of services that will one day be the cornerstone of our lives. However, data centers are often classified as the computing systems that consume the largest amounts of power. To circumvent this problem, we propose a self-adaptive weighted sum methodology that jointly optimizes the performance and power consumption of any given data center. Compared to traditional methodologies for multi-objective optimization problems, the proposed self-adaptive weighted sum technique does not rely on a systematic change of weights during the optimization procedure. The proposed technique is compared with the greedy and LR heuristics for large-scale problems, and with the optimal solution, implemented in LINDO, for small-scale problems. The experimental results reveal that the proposed self-adaptive weighted sum technique outperforms both heuristics and achieves performance competitive with the optimal solution.
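The underlying weighted-sum scalarization, collapsing the performance and power objectives into a single score, can be sketched as follows. This is a plain fixed-weight version for illustration; the paper's self-adaptive variant derives the weights during the search rather than sweeping them, and the operating points below are hypothetical:

```python
def weighted_sum_choice(configs, w_perf):
    """Pick the operating point maximizing a weighted sum of performance
    (to be maximized) and power (to be minimized)."""
    w_power = 1.0 - w_perf
    return max(configs, key=lambda c: w_perf * c["perf"] - w_power * c["power"])

configs = [  # hypothetical, normalized data-center operating points
    {"name": "all_on",    "perf": 1.00, "power": 1.00},
    {"name": "balanced",  "perf": 0.90, "power": 0.60},
    {"name": "low_power", "perf": 0.50, "power": 0.25},
]
print(weighted_sum_choice(configs, w_perf=0.5)["name"])  # "balanced"
print(weighted_sum_choice(configs, w_perf=0.9)["name"])  # "all_on"
```

The sensitivity of the chosen point to w_perf is exactly what motivates a self-adaptive weighting scheme over a manual sweep.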
Abstract: We study in this paper the effect of scene
changes on an image sequence coding system using the Embedded
Zerotree Wavelet (EZW). The scene change considered here is a
full-motion change. A special image sequence is generated
in which scene changes occur randomly. Two scenarios are
considered. In the first scenario, the system must provide the best
possible reconstruction quality by managing the bit rate (BR) while
a scene change occurs. In the second scenario,
the system must keep the bit rate as constant as possible by the
management of the reconstruction quality. The first scenario may be
motivated by the availability of a large-bandwidth transmission
channel, where an increase of the bit rate is possible to keep the
reconstruction quality above a given threshold. The second scenario
concerns a narrow-bandwidth transmission channel, where an
increase of the bit rate is not possible. In this last case,
applications for which the reconstruction quality is not a constraint
may be considered. The simulations are performed with five scales
wavelet decomposition using the 9/7-tap filter bank biorthogonal
wavelet. The entropy coding is performed using a specific defined
binary code book and EZW algorithm. Experimental results are
presented and compared to LEAD H263 EVAL. It is shown that if
the reconstruction quality is the constraint, the system increases the
bit rate to obtain the required quality. In the case where the bit rate
must be constant, the system is unable to provide the required quality
if a scene change occurs; however, the system is able to improve
the quality once the scene change has passed.
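The control logic of the two scenarios can be summarized in a small sketch (the numeric values, step size and quality units are illustrative, not the paper's EZW rate-control mechanism):

```python
def rate_control(quality, target_quality, bit_rate, max_bit_rate, step=0.1):
    """Scenario 1: when reconstruction quality drops below the threshold
    (e.g., at a scene change), raise the bit rate if the channel allows it.
    Scenario 2: if the rate is already capped, quality must be sacrificed."""
    if quality >= target_quality:
        return bit_rate                                  # nothing to do
    if bit_rate < max_bit_rate:
        return min(bit_rate * (1 + step), max_bit_rate)  # spend more bits
    return bit_rate                                      # rate is capped

print(rate_control(quality=28.0, target_quality=32.0,
                   bit_rate=1.0, max_bit_rate=2.0))
```

On a wide channel the loop converges back to the target quality within a few frames of the scene change; on a capped channel the same drop simply persists until the change passes.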
Abstract: The literature reveals that many investors rely on technical trading rules when making investment decisions. If stock markets are efficient, one cannot achieve superior results by using these trading rules. However, if market inefficiencies are present, profitable opportunities may arise. The aim of this study is to investigate the effectiveness of technical trading rules in 34 emerging stock markets. The performance of the rules is evaluated by utilizing White's Reality Check and the Superior Predictive Ability test of Hansen, along with an adjustment for transaction costs. These tests are able to evaluate whether the best model performs better than a buy-and-hold benchmark. Further, they provide an answer to data snooping problems, which is essential to obtain unbiased outcomes. Based on our results, we conclude that technical trading rules are not able to outperform a naïve buy-and-hold benchmark on a consistent basis. However, we do find significant trading rule profits in 4 of the 34 investigated markets. We also present evidence that technical analysis is more profitable in crisis situations, although this result is relatively weak.
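A typical technical trading rule of the class evaluated above, a moving-average crossover with a proportional transaction cost, compared against the buy-and-hold benchmark, can be sketched as follows (the rule parameters, cost level and price series are illustrative):

```python
def buy_and_hold_return(prices):
    """The naive benchmark: buy at the start, sell at the end."""
    return prices[-1] / prices[0] - 1.0

def ma_rule_return(prices, short=2, long=4, cost=0.001):
    """Hold the asset only while the short moving average exceeds the long
    one, paying a proportional cost on every position switch."""
    wealth, in_market = 1.0, False
    for t in range(long, len(prices)):
        s = sum(prices[t - short:t]) / short
        l = sum(prices[t - long:t]) / long
        want = s > l
        if want != in_market:
            wealth *= 1.0 - cost  # transaction cost on each switch
            in_market = want
        if in_market:
            wealth *= prices[t] / prices[t - 1]
    return wealth - 1.0

prices = [100, 102, 101, 105, 107, 106, 110, 108, 112, 115]
print(ma_rule_return(prices), buy_and_hold_return(prices))
```

Testing thousands of such parameterizations and keeping the best one is precisely the data-snooping risk that the Reality Check and Superior Predictive Ability tests are designed to correct for.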
Abstract: This paper reports the tensile fracture location
characterizations of dissimilar friction stir welds between 5754
aluminium alloy and C11000 copper. The welds were produced using
tools with three shoulder diameters (15, 18 and 25 mm) and varying
process parameters. The rotational speeds considered were 600,
950 and 1200 rpm while the feed rates employed were 50, 150 and
300 mm/min to represent the low, medium and high settings
respectively. The tensile fracture locations were identified using an
optical microscope and characterized. It was observed that 70% of
the tensile samples failed
in the Thermo Mechanically Affected Zone (TMAZ) of copper at the
weld joints. Further evaluation of the fracture surfaces of the pulled
tensile samples revealed that welds with low Ultimate Tensile
Strength either have defects or intermetallics present at their joint
interfaces.
Abstract: A cancelable palmprint authentication system
proposed in this paper is specifically designed to overcome the
limitations of the contemporary biometric authentication system. In
this proposed system, geometric and pseudo-Zernike moments are
employed as feature extractors to transform palmprint image into a
lower dimensional compact feature representation. Before moment
computation, wavelet transform is adopted to decompose palmprint
image into lower resolution and dimensional frequency subbands.
This reduces the computational load of moment calculation
drastically. The generated wavelet-moment based feature
representation is used to generate cancelable verification key with a
set of random data. This private binary key can be canceled and
replaced. Besides that, this key also possesses high data capture
offset tolerance, with highly correlated bit strings for intra-class
population. This property allows a clear separation of the genuine
and imposter populations, as well as achievement of a zero Equal
Error Rate, which is rarely attained in conventional biometric-based
authentication systems.
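The cancelable-key step, combining a fixed biometric feature vector with replaceable random data, can be sketched in the style of BioHashing. This is an assumed illustrative scheme, not the paper's exact construction, and the feature values below are invented:

```python
import random

def cancelable_key(features, seed, bits=16):
    """Project the moment-based feature vector onto user-specific random
    directions and threshold at zero, giving a binary key that can be
    canceled and re-issued simply by changing the seed."""
    rng = random.Random(seed)
    key = []
    for _ in range(bits):
        direction = [rng.gauss(0.0, 1.0) for _ in features]
        proj = sum(f * d for f, d in zip(features, direction))
        key.append(1 if proj >= 0 else 0)
    return key

features = [0.8, -0.1, 0.4, 0.3, -0.6]       # hypothetical wavelet-moment features
key = cancelable_key(features, seed=42)       # deterministic for a fixed seed
reissued = cancelable_key(features, seed=7)   # key canceled and replaced
print(key)
```

Because small intra-class feature variations rarely flip the sign of a projection, bit strings stay highly correlated within a class, which is the offset-tolerance property the abstract describes.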
Abstract: Reducing the risk of information leaks is one of
the most important functions of identity management systems. To
achieve this purpose, Dey et al. have already proposed an account
management method for a federated login system using a blind
signature scheme. In order to ensure account anonymity for the
authentication provider, referred to as an IDP (identity provider),
a blind signature scheme is utilized to generate an authentication
token on an authentication service and the token is sent to an IDP.
However, there is a problem with the proposed system. Malicious
users can establish multiple accounts on an IDP by requesting such
accounts. As a measure to solve this problem, in this paper, the
authors propose an account checking method that is performed before
account generation.
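The blind-signature token flow described above can be illustrated with a textbook RSA blind signature on toy parameters (this is a sketch of the primitive only, not the cited system's scheme, and the key is deliberately insecure):

```python
import math

# Toy RSA key: n = 61 * 53, phi = 3120, e * d = 1 (mod phi)
n, e, d = 3233, 17, 2753

m = 1234                             # authentication token (toy message)
r = 7                                # user's blinding factor, coprime to n
assert math.gcd(r, n) == 1

blinded = (m * pow(r, e, n)) % n     # user blinds the token
s_blind = pow(blinded, d, n)         # signer (authentication service) signs
                                     # without ever seeing m
s = (s_blind * pow(r, -1, n)) % n    # user removes the blinding factor
assert pow(s, e, n) == m             # the unblinded signature verifies on m
print(s == pow(m, d, n))             # identical to a direct signature on m
```

The IDP can thus verify the token with the signer's public key while the signer learns nothing linking the signature to the account, which is exactly the anonymity property being exploited, and also what allows the multiple-account abuse the proposed checking method addresses.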
Abstract: A methanolic extract from seeds of tamarind
(Tamarindus indica) was prepared by Soxhlet extraction
and evaluated for total phenolic content by the Folin-Ciocalteu
method. The extract was then screened in vitro for
anti-melanogenic activity by a tyrosinase inhibition test, for
anti-inflammatory activity by cyclooxygenase 1 (COX-1) and
cyclooxygenase 2 (COX-2) inhibition tests, and for cytotoxicity
against Vero cells. The results showed that the total phenolic
content of the extract was 27.72 mg of gallic acid equivalent per g
of dry weight. The tyrosinase inhibition exerted by the tamarind
seed extract (1 mg/ml) was 52.13 ± 0.42%. The extract showed no
inhibitory effect on the COX-1 and COX-2 enzymes and no
cytotoxic effect on Vero cells. We conclude that the tested seed
extract possessed anti-melanogenic activity with no toxic effects;
however, it exhibited no anti-inflammatory activity. Further studies
include the use of advanced biological models to confirm this
biological activity, as well as the isolation and characterization of
the purified compounds it contains.
Abstract: One of the major, difficult tasks in automated video
surveillance is the segmentation of relevant objects in the scene.
Current implementations often yield results that are inconsistent
from frame to frame when trying to differentiate partly occluding
objects. This paper presents an efficient block-based segmentation
algorithm which is capable of separating partly occluding objects and
detecting shadows. It has been proven to perform in real time with a
maximum duration of 47.48 ms per frame (for 8x8 blocks on a
720x576 image) with a true positive rate of 89.2%. The flexible
structure of the algorithm enables adaptations and improvements with
little effort. Most of the parameters correspond to relative differences
between quantities extracted from the image and should therefore not
depend on scene and lighting conditions. The result is a
performance-oriented segmentation algorithm that is applicable in
critical real-time scenarios.
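The block-based, relative-difference design described above can be sketched as follows. This toy version marks a block as foreground when its mean intensity deviates from the background model by a relative amount; it illustrates the structure only, not the paper's exact algorithm, occlusion handling, or shadow detection:

```python
def segment_blocks(frame, background, block=8, rel_thresh=0.15):
    """Per-block foreground mask: a block is foreground when its mean
    intensity differs from the background block's mean by more than a
    relative threshold (relative differences make it lighting-tolerant)."""
    h, w = len(frame), len(frame[0])
    mask = []
    for by in range(0, h, block):
        row = []
        for bx in range(0, w, block):
            cells = [(y, x) for y in range(by, min(by + block, h))
                            for x in range(bx, min(bx + block, w))]
            mf = sum(frame[y][x] for y, x in cells) / len(cells)
            mb = sum(background[y][x] for y, x in cells) / len(cells)
            row.append(abs(mf - mb) > rel_thresh * max(mb, 1.0))
        mask.append(row)
    return mask

# Toy 8x16 grayscale frame: the right half is brightened as a "moving object"
background = [[50] * 16 for _ in range(8)]
frame = [[50] * 8 + [120] * 8 for _ in range(8)]
print(segment_blocks(frame, background))  # [[False, True]]
```

Operating on 8x8 blocks rather than pixels is what keeps the per-frame cost low enough for the real-time figures quoted above.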