Abstract: The purpose of this study was to determine the significance of a family history of obesity for the development of childhood overweight and/or obesity. Accordingly, a systematic review of English-language studies published from 1980 to 2012 was conducted using the following databases: MEDLINE, PsycINFO, the Cochrane Database of Systematic Reviews, and Dissertation Abstracts International. The following terms were used in the search: pregnancy, overweight, obesity, family history, parents, childhood, risk factors. Eleven studies of family history and obesity conducted in Europe, Asia, North America, and South America met the inclusion criteria. A meta-analysis of these studies indicated that family history of obesity is a significant risk factor for overweight and/or obesity in offspring; that the risk associated with family history varies depending on the family members included in the analysis; and that when a family history of obesity is present, the offspring are at greater risk of developing overweight or obesity. In addition, the results of moderator analyses suggest that part of the heterogeneity found between the studies can be explained by the region of the world in which the study took place and the age of the child at the time of weight assessment.
Abstract: The purpose of this study was to determine the significance of maternal smoking for the development of childhood overweight and/or obesity. Accordingly, a systematic review of English-language studies published from 1980 to 2012 was conducted using the following databases: MEDLINE, PsycINFO, the Cochrane Database of Systematic Reviews, and Dissertation Abstracts International. The following terms were used in the search: pregnancy, overweight, obesity, smoking, parents, childhood, risk factors. Eighteen studies of maternal smoking during pregnancy and obesity conducted in Europe, Asia, North America, and South America met the inclusion criteria. A meta-analysis of these studies indicated that maternal smoking during pregnancy is a significant risk factor for offspring overweight and obesity; that the offspring of mothers who smoked during pregnancy are at greater risk of developing overweight or obesity; and that the quantity of cigarettes consumed by the mother during pregnancy influenced the odds of offspring overweight and/or obesity. In addition, the results of moderator analyses suggest that part of the heterogeneity found between the studies can be explained by the region of the world in which the study took place and the age of the child at the time of weight assessment.
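The kind of meta-analytic pooling described in the two abstracts above can be sketched with a fixed-effect inverse-variance model; the study-level odds ratios and confidence intervals below are hypothetical, and real analyses (including the moderator analyses mentioned) typically also consider random-effects models.

```python
import math

def pooled_odds_ratio(odds_ratios, ci_uppers, ci_lowers):
    """Fixed-effect inverse-variance pooling of study odds ratios.

    Standard errors of the log odds ratios are recovered from the
    95% confidence intervals: se = (ln(upper) - ln(lower)) / (2 * 1.96).
    Also returns the I^2 heterogeneity statistic derived from Cochran's Q.
    """
    log_ors = [math.log(o) for o in odds_ratios]
    ses = [(math.log(u) - math.log(l)) / (2 * 1.96)
           for u, l in zip(ci_uppers, ci_lowers)]
    weights = [1.0 / se ** 2 for se in ses]
    pooled_log = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
    # Cochran's Q measures between-study heterogeneity
    q = sum(w * (lo - pooled_log) ** 2 for w, lo in zip(weights, log_ors))
    df = len(odds_ratios) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return math.exp(pooled_log), i2

# Hypothetical study-level odds ratios with their 95% CI bounds
or_pooled, i2 = pooled_odds_ratio([1.8, 2.2, 1.5], [2.6, 3.1, 2.4], [1.2, 1.6, 0.9])
```

An I^2 near zero, as in this toy example, would indicate little between-study heterogeneity; the abstracts report heterogeneity that moderator analyses partly attribute to region and child age.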
Abstract: The calcarenitic carbonate rocks of the Quaternary ridges that extend along the northwestern Mediterranean coastal plain of Egypt represent an excellent model of the transformation of loose sediments into true sedimentary rocks through the successive stages of meteoric diagenesis. The depositional and diagenetic fabrics of the rocks, in addition to the strata orientation, strongly affect their ultimate compressive strength and other geotechnical properties.
There is a marked increase in uniaxial compressive strength (UCS) from the first to the fourth ridge rock samples. The lowest values are related to the loose packing, high porosity and weak cementation of the aragonitic ooid sediments, together with the irregular distribution of cement, which reduce the ability of these rocks to withstand crushing under direct pressure. The high UCS values are attributed to the low porosity, the presence of micritic cement, the reduction in grain size and the occurrence of micritization and calcretization processes.
The strata orientation has a notable effect on the measured UCS. The lowest values were recorded for samples cored in the inclined direction, whereas the highest values were observed for most samples cored vertically and parallel to the bedding plane. In the inclined direction, the bedding planes were oriented close to the plane of maximum shear stress. The lowest and highest anisotropy values were recorded for the first and third ridge rock samples, respectively. This may be attributed to the relative homogeneity and well-sorted grainstone of the first ridge samples, and to the relative heterogeneity in grain and pore size distribution and degree of cementation of the third ridge samples, together with the abundance of shell fragments with intraparticle pore spaces, which may produce lines of weakness within the rock.
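The anisotropy values mentioned above are commonly expressed as a strength anisotropy index, the ratio of the highest to the lowest UCS measured in different coring directions. A minimal sketch under that assumed definition, with hypothetical UCS values:

```python
def anisotropy_index(ucs_by_orientation):
    """Strength anisotropy index: ratio of the highest to the lowest
    uniaxial compressive strength (UCS) measured for samples cored in
    different directions relative to bedding. A value near 1 indicates
    a nearly isotropic rock."""
    values = list(ucs_by_orientation.values())
    return max(values) / min(values)

# Hypothetical UCS values (MPa) for one ridge's samples
ia = anisotropy_index({"vertical": 32.0, "parallel": 30.0, "inclined": 20.0})
```

Under this definition, the homogeneous, well-sorted first-ridge grainstones would yield an index close to 1, while the heterogeneous third-ridge samples would yield a larger one.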
Abstract: Many researchers have suggested the use of zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models for overdispersed medical count data with extra variation caused by excess zeros and unobserved heterogeneity. These studies indicate that ZIP and ZINB consistently provide a better fit than the standard Poisson and negative binomial models for such data. In this study, we propose the use of zero-inflated inverse trinomial (ZIIT), zero-inflated Poisson inverse Gaussian (ZIPIG) and zero-inflated strict arcsine (ZISA) models for overdispersed medical count data. These models are not widely used by researchers, especially in the medical field. The inverse trinomial, Poisson inverse Gaussian and strict arcsine distributions are discrete distributions whose variance is a cubic function of the mean; ZIIT, ZIPIG and ZISA are therefore able to accommodate data with excess zeros and very heavy tails. An application to a real-life medical data set shows that the three suggested models can serve as alternatives for modeling overdispersed medical count data, and they are recommended when ZIP and ZINB are inadequate.
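The zero-inflation mechanism shared by all of the models above mixes a point mass at zero with a count distribution. A minimal sketch for the ZIP case (the base distributions proposed in the abstract would replace the Poisson term):

```python
import math

def zip_pmf(k, lam, pi):
    """Probability mass function of the zero-inflated Poisson (ZIP):
    a structural-zero probability pi mixed with a Poisson(lam) count.
    P(0) = pi + (1 - pi) * e^{-lam}; P(k) = (1 - pi) * Poisson(k) for k > 0."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    if k == 0:
        return pi + (1 - pi) * poisson
    return (1 - pi) * poisson

# With 30% structural zeros, P(0) is inflated relative to a plain Poisson(2)
p0 = zip_pmf(0, lam=2.0, pi=0.3)
p0_poisson = math.exp(-2.0)
```

The same two-component construction gives ZIIT, ZIPIG and ZISA when the Poisson pmf is replaced by the inverse trinomial, Poisson inverse Gaussian or strict arcsine pmf.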
Abstract: Human society faces many uncertainties, such as forecasting economic growth rates during a financial crisis. Since Song and Chissom introduced the concept of fuzzy time series in 1993, many scholars have applied models of this kind to such problems. Previous studies, however, usually do not consider the selection of relevant variables, and they discretize the fuzzy semantics based solely on subjective opinion, so they cannot objectively reflect the characteristics of the data set; in addition, when forecasting, they often treat all fuzzy rules as equally important, failing to consider the importance of each rule. For these reasons, this study performs factor selection through a self-organizing map (SOM) and proposes a high-order weighted multivariate fuzzy time series model based on a fuzzy back-propagation neural network (Fuzzy-BPN), using the ordered weighted averaging (OWA) operator for weighted prediction. To verify the proposed method, the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) of the Taiwan Stock Exchange Corporation was used as the forecasting target, with the experiment first filtering for appropriate variables. Finally, in comparison with models from other recent studies, the results show that the predictive ability of the proposed approach is further improved.
Abstract: A high-performance Monte Carlo simulation, which simultaneously takes diffusion-controlled and chain-length-dependent bimolecular termination reactions into account, is developed to simulate the atom transfer radical copolymerization of styrene and n-butyl acrylate. As expected, increasing the initial feed fraction of styrene raises the fraction of styrene-styrene dyads (fAA) and reduces that of n-butyl acrylate dyads (fBB). The randomness parameter (fAB) also varies significantly during the copolymerization. In addition, there is a drift in copolymer heterogeneity, and the largest drift occurs for initial feeds containing low percentages of styrene, i.e. 20% and 5%.
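The dyad fractions discussed above can be illustrated with a much simpler terminal-model Monte Carlo sketch than the full simulation in the abstract: grow one chain at a fixed feed (so no composition drift) and count dyads. The reactivity ratios and feed fraction below are hypothetical, not the values from the study.

```python
import random

def dyad_fractions(fA, rA, rB, n_steps=200_000, seed=1):
    """Terminal-model Monte Carlo sketch: grow a single long chain at a
    fixed comonomer feed fraction fA and count AA, BB and AB/BA dyads.
    rA, rB are hypothetical reactivity ratios of monomers A and B."""
    rng = random.Random(seed)
    fB = 1.0 - fA
    # Terminal model: probability that a chain ending in A adds another A
    pAA = rA * fA / (rA * fA + fB)
    pBB = rB * fB / (rB * fB + fA)
    last, counts = "A", {"AA": 0, "BB": 0, "AB": 0}
    for _ in range(n_steps):
        if last == "A":
            nxt = "A" if rng.random() < pAA else "B"
        else:
            nxt = "B" if rng.random() < pBB else "A"
        pair = last + nxt
        counts[pair if pair in ("AA", "BB") else "AB"] += 1
        last = nxt
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

# A-rich feed (e.g. styrene-rich): fAA should dominate fBB
f = dyad_fractions(fA=0.8, rA=0.7, rB=0.2)
```

As in the abstract's trend, raising the feed fraction of A raises fAA at the expense of fBB; the full simulation additionally tracks drift as the feed composition changes with conversion.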
Abstract: The occurrence and removal of trace organic contaminants in the aquatic environment has become a focus of environmental concern. For the selective removal of carbamazepine from loaded waters, molecularly imprinted polymers (MIPs) were synthesized with carbamazepine as the template. The parameters varied were the type of monomer, crosslinker and porogen, the ratio of starting materials, and the synthesis temperature. The best results were obtained with a template-to-crosslinker ratio of 1:20, toluene as porogen, and methacrylic acid (MAA) as monomer. These MIPs were capable of recovering 93% of the carbamazepine from a 10^-5 M landfill leachate solution that also contained caffeine and salicylic acid. By comparison, carbamazepine recoveries of 75% were achieved using a non-imprinted polymer (NIP) synthesized under the same conditions but without template. In solutions containing landfill leachate, 93-96% of the carbamazepine was adsorbed, compared with an uptake of 73% by activated carbon. The best solvent for desorption was acetonitrile, for which the amount of solvent necessary and the effect of dilution with water were tested. Selected MIPs were tested for their reusability and showed good results for at least five cycles. Adsorption isotherms were prepared with carbamazepine solutions in the concentration range of 0.01 M to 5×10^-6 M. The heterogeneity index showed a comparatively homogeneous binding-site distribution.
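In MIP studies the heterogeneity index is often taken as the exponent m of a Freundlich isotherm q = a·C^m fitted to the adsorption data, with m approaching 1 for a homogeneous binding-site distribution. A sketch under that assumption, with synthetic data (the abstract does not report the fitted parameters):

```python
import math

def freundlich_fit(concentrations, bound):
    """Fit the Freundlich isotherm q = a * C**m by linear least squares
    in log-log space. The exponent m serves as a heterogeneity index:
    m close to 1 indicates a homogeneous binding-site distribution."""
    xs = [math.log(c) for c in concentrations]
    ys = [math.log(q) for q in bound]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - m * mx)
    return a, m

# Synthetic data generated from q = 2 * C**0.9 (hypothetical values)
cs = [5e-6, 1e-5, 1e-4, 1e-3, 1e-2]
qs = [2 * c ** 0.9 for c in cs]
a, m = freundlich_fit(cs, qs)
```

Because the synthetic data follow the model exactly, the fit recovers a = 2 and m = 0.9; real isotherm data would scatter around the regression line.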
Abstract: This paper investigates the influence of medium capacity on the dispersion of a linearly adsorbed solute in chemically heterogeneous fixed beds. A discrete chemical heterogeneity distribution is considered in the one-dimensional advective-dispersive equation. The partial differential equation is solved by a finite-volume method based on the Adams-Bashforth algorithm. The increase in dispersion is estimated by comparing the second-order moments of breakthrough curves while keeping the hydrodynamic properties identical. The results show that the dispersion increase due to chemical heterogeneity depends on the column size and, surprisingly, on the solid capacity: the higher the capacity, the greater the solute dispersion. The medium length, which is known to make this effect vanish for linear adsorption in fixed beds, here appears to produce a non-monotonic variation of dispersion because of the heterogeneity; this non-monotonic behaviour is further favoured by high capacities.
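The second-order moment comparison described above rests on the temporal moments of a breakthrough curve C(t); the second central moment quantifies the spreading of the solute front. A minimal sketch with trapezoidal integration over a hypothetical symmetric pulse:

```python
def temporal_moments(times, conc):
    """Zeroth moment, mean arrival time, and second central temporal
    moment of a breakthrough curve C(t), via trapezoidal integration.
    The second central moment measures the spreading (dispersion)
    of the front."""
    def trapz(ys):
        return sum((ys[i] + ys[i + 1]) * (times[i + 1] - times[i]) / 2
                   for i in range(len(times) - 1))
    m0 = trapz(conc)
    m1 = trapz([t * c for t, c in zip(times, conc)]) / m0
    mu2 = trapz([(t - m1) ** 2 * c for t, c in zip(times, conc)]) / m0
    return m0, m1, mu2

# Hypothetical symmetric breakthrough pulse centred at t = 3
ts = [0, 1, 2, 3, 4, 5, 6]
cs = [0, 0.1, 0.5, 1.0, 0.5, 0.1, 0]
m0, m1, mu2 = temporal_moments(ts, cs)
```

Comparing mu2 between homogeneous and heterogeneous simulations at identical hydrodynamic conditions gives the dispersion increase the paper quantifies.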
Abstract: In Fe-3%Si sheets of grade Hi-B, with AlN and MnS as inhibitors, the Goss grains that grow abnormally do not have a size greater than the average size of the primary matrix. In this heterogeneous microstructure, the size factor is therefore not a required condition for secondary recrystallization. The onset of abnormal growth of the small Goss grains appears to be related to a particular behavior of their grain boundaries, to the local texture and to the distribution of the inhibitors. The presence and evolution of oriented clusters provide the small Goss grains with a neighborhood favorable to growth. The modified Monte Carlo approach applied here considers the local environment of each grain: grain growth depends on the grain's actual spatial position, so the heterogeneity of the matrix is taken into account. The grain growth conditions are considered in the global matrix and in the different matrices corresponding to clusters of the A component. Grain growth behaviour is examined with the introduction of energy only; of energy and mobility; and of energy, mobility and precipitates.
Abstract: The amount and heterogeneity of data in biomedical research, notably in interdisciplinary research, require new methods for the collection, presentation and analysis of information. Important data from laboratory experiments as well as patient trials are available but originate from distributed resources. The Charité Medical School in Berlin, together with the German Research Foundation (DFG), has established a new information service center for kidney diseases and transplantation (Open European Nephrology Science Centre, OpEN.SC). The system is based on a service-oriented architecture (SOA) with main and auxiliary modules arranged in four layers. To improve the reuse and efficient arrangement of the services, the functionalities are described as business processes using the standardised Business Process Execution Language (BPEL).
Abstract: A multilayer self-organizing neural network (MLSONN) architecture for binary object extraction, guided by a beta activation function and characterized by backpropagation of errors estimated from the linear indices of fuzziness of the network output states, is discussed. Since the MLSONN architecture is designed to operate in a single-point, fixed/uniform thresholding scenario, it does not take into account the heterogeneity of image information in the extraction process. The performance of the MLSONN architecture with representative values of the threshold parameters of the beta activation function is also studied. This article proposes a three-layer bidirectional self-organizing neural network (BDSONN) architecture comprising fully connected neurons for the extraction of objects from a noisy background, capable of incorporating the underlying image context heterogeneity through variable and adaptive thresholding. The input layer of the network represents the fuzzy membership information of the image scene to be extracted. The second (intermediate) layer and the final (output) layer of the network handle the self-supervised object extraction task by bidirectional propagation of the network states. Each layer except the output layer is connected to the next layer following a neighborhood-based topology; the output layer neurons are in turn connected to the intermediate layer following a similar topology, thus forming a counter-propagating architecture with the intermediate layer. The novelty of the proposed architecture is that the assignment and updating of the inter-layer connection weights are done using the relative fuzzy membership values at the constituent neurons in the different network layers. Another interesting feature of the network is that the processing capabilities of the intermediate- and output-layer neurons are guided by a beta activation function that uses image-context-sensitive adaptive thresholding arising from the fuzzy cardinality estimates of the different network neighborhood fuzzy subsets, rather than resorting to fixed, single-point thresholding. An application of the proposed architecture to object extraction is demonstrated using a synthetic and a real-life image. The extraction efficiency of the proposed network architecture is evaluated by a proposed system transfer index characteristic of the network.
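The linear index of fuzziness used above to estimate the MLSONN errors has a standard closed form: for membership values mu_i it is (2/n)·Σ min(mu_i, 1 − mu_i), zero for a crisp set and maximal when every membership is 0.5. A minimal sketch:

```python
def linear_index_of_fuzziness(memberships):
    """Linear index of fuzziness of a fuzzy set:
    nu = (2 / n) * sum(min(mu, 1 - mu)). It equals 0 for a crisp set
    (all mu in {0, 1}) and reaches its maximum of 1 when every
    membership equals 0.5 (maximal ambiguity)."""
    n = len(memberships)
    return (2.0 / n) * sum(min(mu, 1.0 - mu) for mu in memberships)

crisp = linear_index_of_fuzziness([0.0, 1.0, 1.0, 0.0])
fuzzy = linear_index_of_fuzziness([0.5, 0.5, 0.5, 0.5])
```

Driving this index toward zero over the output states corresponds to the network converging on an unambiguous (binary) object/background decision.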
Abstract: It has been recognized that, due to the autonomy and heterogeneity of Web services and of the Web itself, new approaches should be developed to describe and advertise Web services. The most notable approaches rely on describing Web services using semantics. This new breed of Web services, termed semantic Web services, will enable the automatic annotation, advertisement, discovery, selection, composition, and execution of inter-organizational business logic, making the Internet a common global platform where organizations and individuals communicate with each other to carry out various commercial activities and to provide value-added services. This paper deals with two of the hottest R&D and technology areas currently associated with the Web: Web services and the semantic Web. It describes how semantic Web services extend Web services in the way the semantic Web improves the current Web, and presents three different conceptual approaches to deploying semantic Web services, namely WSDL-S, OWL-S, and WSMO.
Abstract: Given a parallel program to be executed on a heterogeneous computing system, the overall execution time of the program is determined by a schedule. In this paper, we analyze the worst-case performance of the list scheduling algorithm for scheduling the tasks of a parallel program in a mixed-machine heterogeneous computing system such that the total execution time of the program is minimized. We prove tight lower and upper bounds for the worst-case performance ratio of the list scheduling algorithm. We also examine its average-case performance. Our experimental data reveal that the average-case performance of the list scheduling algorithm is much better than the worst-case performance and is very close to optimal, except for large systems with large heterogeneity. The list scheduling algorithm is thus very useful in real applications.
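A common greedy variant of list scheduling for mixed-machine systems assigns each task, in list order, to the machine that minimizes its completion time; this sketch is illustrative and not necessarily the exact variant analyzed in the paper. The task times below are hypothetical.

```python
def list_schedule(task_times, n_machines):
    """Greedy list-scheduling sketch for a heterogeneous system.
    task_times[i][j] is the execution time of task i on machine j,
    modeling machines of different speeds. Each task, in list order,
    goes to the machine on which it would finish earliest. Returns
    the assignment and the makespan (total execution time)."""
    finish = [0.0] * n_machines
    assignment = []
    for times in task_times:
        # choose the machine minimising this task's completion time
        j = min(range(n_machines), key=lambda m: finish[m] + times[m])
        finish[j] += times[j]
        assignment.append(j)
    return assignment, max(finish)

# Three tasks on two machines with different speeds (hypothetical times)
tasks = [[2.0, 4.0], [3.0, 1.0], [2.0, 2.0]]
assignment, makespan = list_schedule(tasks, 2)
```

The worst-case performance ratio the paper bounds compares such a makespan against the optimal schedule's makespan over all inputs.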
Abstract: Simulation is a very powerful method for high-performance, high-quality design of distributed systems, and at present perhaps the only practical one, considering the heterogeneity, complexity and cost of such systems. In Grid environments, for example, it is hard and even impossible to evaluate scheduler performance in a repeatable and controllable manner, as resources and users are distributed across multiple organizations with their own policies. In addition, Grid test-beds are limited, and creating an adequately sized test-bed is expensive and time-consuming. Scalability, reliability and fault tolerance are important requirements for distributed systems that support distributed computation; a distributed system with these characteristics is called dependable. Large environments, like the Cloud, offer unique advantages, such as low cost and dependability, and can satisfy QoS for all users. Resource management in large environments requires performant scheduling algorithms guided by QoS constraints. This paper presents the performance evaluation of scheduling heuristics guided by different optimization criteria. The algorithms for distributed scheduling are analyzed with a view to satisfying user constraints while at the same time taking the independent capabilities of resources into account; this analysis acts as a profiling step for algorithm calibration. The performance evaluation is based on simulation, using MONARC, a powerful tool for simulating large-scale distributed systems. The novelty of this paper consists in synthetic analysis results that offer guidelines for scheduler service configuration and support empirically based decisions. The results can be used in decisions regarding optimizations to existing Grid DAG scheduling and in selecting the proper algorithm for DAG scheduling in various practical situations.
Abstract: To determine whether the murine insulinoma β-TC-6 is a suitable substitute for primary pancreatic β-cells in the study of β-cell functional heterogeneity, we used three distinct functional assays to ascertain the cell line's response to glucose or a glucose analog: (i) a 2-NBDG uptake assay; (ii) a calcium influx assay; and (iii) a quinacrine secretion assay. We show that a population of β-TC-6 cells endocytoses the glucose analog 2-NBDG at different rates, has non-uniform intracellular calcium ion concentrations, and releases quinacrine at different rates when challenged with glucose. We also measured the Km for β-TC-6 glucose uptake to be 46.9 mM and the Vm to be 8.36 × 10^-5 mmol/million cells/min. These data suggest that β-TC-6 might be used as an alternative to primary pancreatic β-cells for the study of glucose-dependent β-cell functional heterogeneity.
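The reported Km and Vm imply Michaelis-Menten uptake kinetics, v = Vm·S/(Km + S); at S = Km the rate is exactly Vm/2. A minimal sketch using the values quoted in the abstract:

```python
def mm_rate(s, vm=8.36e-5, km=46.9):
    """Michaelis-Menten uptake rate v = Vm * S / (Km + S), using the
    reported beta-TC-6 values Km = 46.9 mM and
    Vm = 8.36e-5 mmol/million cells/min. s is the glucose (analog)
    concentration in mM."""
    return vm * s / (km + s)

half = mm_rate(46.9)   # at S = Km the rate equals Vm / 2
low = mm_rate(5.0)     # well below Km: uptake is roughly first order in S
```

The high Km relative to physiological glucose suggests uptake is nearly first order over typical stimulation concentrations, consistent with the graded, heterogeneous 2-NBDG uptake rates observed.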
Abstract: This paper deals with dynamic load balancing using PVM. In a distributed environment, load balancing and heterogeneity are critical issues that need to be examined in depth in order to achieve optimal results and efficiency. Various techniques are used to distribute the load dynamically among different nodes and to deal with heterogeneity. These techniques take different approaches, with process migration as the basic concept in various optimized flavors. But process migration is not an easy job: it imposes a heavy burden and considerable processing effort to track each process on the nodes. We propose a dynamic load balancing technique in which the application itself intelligently balances the load among different nodes, resulting in efficient use of the system with none of the overhead of process migration. It also provides a simple solution to the problem of load balancing in a heterogeneous environment.
Abstract: We investigate an asymmetric connections model with a dynamic network formation process, using an agent-based simulation. We permit heterogeneity in the agents' value, since valuable persons appear to have many links in real social networks. We focus on this point of view and examine whether valuable agents change the structures of the terminal networks. The simulation reveals that valuable agents diversify the terminal networks. We cannot find evidence that valuable agents increase the probability that star networks survive the dynamic process. We find that valuable agents disperse the degrees of agents in each terminal network on average.
Abstract: Inter-organizational workflow (IOW) is commonly used to support the collaboration between the heterogeneous and distributed business processes of different autonomous organizations in order to achieve a common goal. E-government is considered an application field of IOW. The coordination of the different organizations is the fundamental problem in IOW and remains a major cause of failure in e-government projects. In this paper, we introduce a new coordination model for IOW that improves the collaboration between government administrations and respects the IOW requirements of e-government. For this purpose, we adopt a multi-agent approach, which deals more easily with the characteristics of inter-organizational digital government: distribution, heterogeneity and autonomy. Our model also integrates different technologies to address semantic and technological interoperability. Moreover, it preserves the existing systems of government administrations by offering a distributed coordination based on communicating interfaces. This is especially applicable in developing countries, where administrations are not necessarily equipped with workflow systems. The use of our coordination techniques allows an easier and lower-cost migration to an e-government solution. To illustrate the applicability of the proposed model, we present a case study of identity card creation in Tunisia.
Abstract: Traffic density provides an indication of the level of service being provided to road users; hence, there is a need to study traffic flow characteristics with specific reference to density. When the lengths and speeds of the vehicles in a traffic stream vary significantly, the concept of occupancy, rather than density, is more appropriate for describing traffic concentration. When the concept of occupancy is applied to heterogeneous traffic conditions, it is necessary to take the area of the road space and the areas of the vehicles as the bases. Hence, a new concept, named 'area-occupancy', is proposed here. It has been found that the estimated area-occupancy gives consistent values irrespective of changes in traffic composition.
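Area-occupancy, as described above, is the proportion of road-space-time used by vehicles: the sum over vehicles of (projected vehicle area × time over the detection zone), divided by (zone area × observation period). A minimal sketch with hypothetical mixed-traffic values:

```python
def area_occupancy(vehicle_areas, detection_times, obs_time, road_area):
    """Area-occupancy sketch: sum(a_i * t_i) / (A * T), where a_i is the
    horizontal projected area of vehicle i, t_i its time over the
    detection zone, A the area of the zone and T the observation
    period. Dimensionless, between 0 and 1."""
    used = sum(a * t for a, t in zip(vehicle_areas, detection_times))
    return used / (road_area * obs_time)

# Hypothetical mixed traffic: a car, a truck and a motorcycle
occ = area_occupancy(vehicle_areas=[8.0, 20.0, 1.5],    # m^2
                     detection_times=[2.0, 3.0, 1.5],   # s
                     obs_time=60.0, road_area=70.0)     # s, m^2
```

Because both vehicle area and dwell time enter the numerator, the measure stays consistent when the mix of small and large vehicles changes, unlike length-based occupancy.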
Abstract: In this paper we present a new approach to image segmentation. The fact that a single segmentation result does not generally allow a higher-level process to take into account all the elements in the image has motivated the treatment of image segmentation as a multiobjective optimization problem. The proposed algorithm adopts a split/merge strategy that uses the result of the k-means algorithm as input for a quantum evolutionary algorithm to establish a set of non-dominated solutions. The evaluation is made simultaneously according to two distinct features: intra-region homogeneity and inter-region heterogeneity. Experiments with the new approach on natural images have demonstrated its efficiency and usefulness.
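The two objectives evaluated simultaneously above can be made concrete with simple proxies; the specific measures below (within-region variance and distance between region means) are illustrative choices, not necessarily those used in the paper.

```python
def segmentation_objectives(pixels, labels):
    """Two competing segmentation objectives: intra-region homogeneity
    (here the mean within-region variance, to be minimised) and
    inter-region heterogeneity (here the mean absolute difference
    between region means, to be maximised). pixels and labels are
    flat, equal-length lists."""
    regions = {}
    for p, l in zip(pixels, labels):
        regions.setdefault(l, []).append(p)
    means = {l: sum(v) / len(v) for l, v in regions.items()}
    intra = sum(sum((p - means[l]) ** 2 for p in v) / len(v)
                for l, v in regions.items()) / len(regions)
    keys = list(means)
    pairs = [(a, b) for i, a in enumerate(keys) for b in keys[i + 1:]]
    inter = sum(abs(means[a] - means[b]) for a, b in pairs) / len(pairs)
    return intra, inter

# Two well-separated regions: low intra-variance, high inter-distance
intra, inter = segmentation_objectives([10, 12, 11, 90, 88, 92],
                                       [0, 0, 0, 1, 1, 1])
```

A multiobjective optimizer such as the quantum evolutionary algorithm in the abstract would keep the segmentations that are non-dominated under these two criteria rather than collapsing them into one score.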