Abstract: In an open real-time system environment, the coexistence of different kinds of real-time and non-real-time applications confronts the system's scheduling mechanism with new requirements and challenges. A two-level scheduling scheme for open real-time systems is introduced, and we point out that when hard and soft real-time applications are scheduled indistinguishably as the same type of real-time application, Quality of Service (QoS) cannot be guaranteed. This scheme has two flaws. First, it cannot differentiate the scheduling priorities of hard and soft real-time applications; that is, it neglects the characteristic differences between them, so it does not suit a more complex real-time environment. Second, the worst-case execution time of soft real-time applications cannot be predicted exactly, so it is not worthwhile to spend heavily to ensure that no soft real-time application misses its deadline, and doing so may waste resources. To solve this problem, a novel two-level real-time scheduling mechanism (comprising a scheduling profile and a scheduling algorithm) that adds a process for handling soft real-time applications is proposed. Finally, we verify the scheduling mechanism both theoretically and experimentally. The results indicate that our scheduling mechanism achieves the following objectives. (1) It reflects the difference in priority when scheduling hard and soft real-time applications. (2) It ensures the schedulability of hard real-time applications; that is, their deadline-miss rate is 0. (3) The overall deadline-miss rate of soft real-time applications can be kept below 1. (4) Non-real-time applications have no deadlines, yet the scheduling algorithm used by server S0 avoids job "starvation" and increases QoS. In this way, our scheduling mechanism is compatible with more types of applications and can be applied more widely.
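The claim that hard real-time applications achieve a zero deadline-miss rate rests on a schedulability guarantee. The paper's own test is not given in the abstract; as a minimal sketch, the classic uniprocessor EDF utilization bound (an assumption for illustration, not the authors' algorithm) shows how such a guarantee can be checked:

```python
def edf_schedulable(tasks):
    """Classic uniprocessor EDF utilization test for periodic tasks.

    `tasks` is a list of (worst_case_execution_time, period) pairs;
    the task set is schedulable iff total utilization <= 1.
    """
    return sum(c / t for c, t in tasks) <= 1.0

# A hard real-time set using 62.5% of the CPU passes the test, leaving
# slack that a second-level server could hand to soft real-time and
# non-real-time jobs.
print(edf_schedulable([(1, 4), (2, 8), (1, 8)]))  # True
print(edf_schedulable([(3, 4), (2, 4)]))          # False
```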
Abstract: Objective: To review recent publications on patient safety culture and investigate the relationship between leadership behavior, safety culture, and safety performance in the healthcare industry. Method: In this cross-sectional study, 350 questionnaires were mailed to hospital workers, and 195 valid responses were obtained, a 55.7% valid response rate. Confirmatory factor analysis (CFA) was carried out to test the factor structure and determine whether the composite reliability was significant, with factor loadings of >0.5, resulting in an acceptable model fit. Results: One-way ANOVA showed that physicians have significantly more negative patient safety culture perceptions and safety performance perceptions than non-physicians. Conclusions: The path analysis results show that leadership behavior affects safety culture and safety performance in the healthcare industry. Safety performance was affected and improved by contingency leadership and a positive patient safety organization culture. The study suggests improving safety performance by providing a well-managed system that includes consideration of leadership, hospital worker training courses, and a solid safety reporting system.
Abstract: Today, social marketing has become a tool of significant value in promoting changes in behaviors, attitudes, and practices. With the objective of analyzing the benefits that social marketing can bring to the organizations that use it, exploratory and descriptive research was carried out. The present study used the comparative method, through a qualitative approach, to analyze the activities developed by three institutions: the Rosa de Saron Recovery Center, the Teen Challenge recovery house for addicts, and the Children's Cancer Institute (ICIA), in order to point out the benefits of social marketing in non-profit organizations.
Abstract: This paper presents an improved variable ordering method to obtain the minimum number of nodes in Reduced Ordered Binary Decision Diagrams (ROBDD). The proposed method uses the graph topology to find the best variable ordering. To this end, the input Boolean function is converted into a unidirectional graph. Three levels of graph parameters are used to increase the probability of finding a good variable ordering. The initial level uses the total number of nodes (NN) in all the paths, the total number of paths (NP), and the maximum number of nodes among all paths (MNNAP). The second and third levels use two extra parameters: the shortest path between two variables (SP) and the sum of shortest paths from one variable to all the other variables (SSP). A permutation of the graph parameters is performed at each level for each variable order and the number of nodes is recorded. Experimental results are promising; the proposed method is found to be more effective in finding the variable ordering for the majority of benchmark circuits.
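The SP and SSP parameters reduce to shortest-path computations on the variable graph. A sketch, assuming an undirected adjacency-list representation (the paper's exact graph construction is not reproduced here):

```python
from collections import deque

def shortest_path_len(adj, src, dst):
    """BFS shortest-path length, in edges, between two variables (the SP parameter)."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == dst:
            return dist
        for nxt in adj.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return float("inf")  # unreachable

def sum_shortest_paths(adj, src):
    """Sum of shortest paths from one variable to all others (the SSP parameter)."""
    return sum(shortest_path_len(adj, src, v) for v in adj if v != src)

# Chain a - b - c: SP(a, c) = 2 and SSP(a) = 1 + 2 = 3.
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
print(shortest_path_len(adj, "a", "c"), sum_shortest_paths(adj, "a"))  # 2 3
```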
Abstract: In this research, a particle swarm optimization (PSO)
algorithm is proposed for the no-wait flowshop scheduling problem
with sequence-dependent setup times and weighted
earliness-tardiness penalties as the criterion (in three-field notation,
F | no-wait, s_jk | Σ (w'_j E_j + w_j T_j)). The
smallest position value (SPV) rule is applied to convert the continuous
position vector of particles in PSO to job permutations. A
timing algorithm is developed to find the optimal schedule and
calculate the objective function value of a given sequence in the PSO
algorithm. Two different neighborhood structures are applied to
improve the solution quality of the PSO algorithm. The first is based
on variable neighborhood search (VNS) and the second is a
simple one with an invariable structure. To compare the
performance of the two neighborhood structures, random test problems
are generated and solved by both neighborhood
approaches. Computational results show that the VNS algorithm
performs better than the other, especially for large-sized
problems.
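The SPV rule mentioned above has a very compact form: each job is ranked by the value of its dimension in the particle's continuous position vector. A minimal sketch (not the authors' implementation):

```python
def spv(position):
    """Smallest Position Value rule: convert a continuous position
    vector into a job permutation by sorting dimensions in
    ascending order of their values."""
    return [job for _, job in sorted((x, job) for job, x in enumerate(position))]

# Dimension 1 holds the smallest value, so job 1 is scheduled first.
print(spv([1.8, -0.3, 2.1, 0.4]))  # [1, 3, 0, 2]
```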
Abstract: Loop detectors report traffic characteristics in real
time and are at the core of the traffic control process. Intuitively,
one would expect that as density of detection increases, so would
the quality of estimates derived from detector data. However, as
detector deployment increases, the associated operating and
maintenance cost increases. Thus, traffic agencies often need to
decide where to add new detectors and which detectors should
continue receiving maintenance, given their resource constraints.
This paper evaluates the effect of detector spacing on freeway
travel time estimation. A freeway section (Interstate-15) in the
Salt Lake City metropolitan region is examined. The research reveals
that travel time accuracy does not necessarily deteriorate with
increased detector spacing. Rather, the actual location of detectors
has far greater influence on the quality of travel time estimates.
The study presents an innovative computational approach that
delivers optimal detector locations through a process that relies on
a Genetic Algorithm formulation.
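The Genetic Algorithm formulation itself is not reproduced in the abstract. The sketch below shows the general shape of such a search under a stand-in fitness, the longest unmonitored stretch of the corridor, which is only a hypothetical proxy for the travel-time estimation error the study actually minimizes:

```python
import random

def max_gap(locations, length):
    """Hypothetical fitness: the longest unmonitored stretch of the
    corridor (smaller is better); a stand-in for the travel-time
    estimation error the study actually minimizes."""
    pts = sorted(locations)
    gaps = [pts[0]] + [b - a for a, b in zip(pts, pts[1:])] + [length - pts[-1]]
    return max(gaps)

def ga_place_detectors(candidates, k, length, generations=200, pop_size=30, seed=1):
    """Minimal GA sketch: each chromosome is a k-subset of candidate milepoints."""
    rng = random.Random(seed)
    pop = [rng.sample(candidates, k) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda chrom: max_gap(chrom, length))
        elite = pop[: pop_size // 2]              # keep the better half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            pool = list(set(a) | set(b))          # crossover: union of parents, then trim
            rng.shuffle(pool)
            child = pool[:k]
            if rng.random() < 0.2:                # mutation: move one detector
                i = rng.randrange(k)
                child[i] = rng.choice([c for c in candidates if c not in child])
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda chrom: max_gap(chrom, length))
```

With candidate sites every mile on a 20-mile section and k = 4, the elitist search converges toward evenly spread locations under this fitness.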
Abstract: Emerging adulthood, the new period which is
especially prevalent in the developed or industrialized countries
during ages 18 to 29, is a new conceptualization proposed by Arnett.
Intimacy is a superordinate concept which includes intimate
interaction and intimate relationship. This study comprises two
processes: scale development and an examination of gender
differences in markers of starting romantic intimacy among
Turkish emerging adults. In the first process, the Markers of Starting
Romantic Intimacy Scale, with 17 items and 5 factors, was developed
with 220 participants. In the second step, the scale was
administered to 318 Turkish male and female emerging adults
between ages 22 and 25. Results show no significant
gender difference in the total score of the scale. There are, however,
significant gender differences in four subscales: self-perception,
affective and cognitive intimacy, self-knowledge, and romantic
verbalizations. Moreover, there is no significant gender difference
in the behavioral intimacy subscale.
Abstract: Online social networks such as Twitter, Facebook, and
MySpace have experienced extensive growth in recent years.
Social media offers individuals a tool for
communicating and interacting with one another. These social
networks enable people to stay in touch with other people and
express themselves. This process makes the users of online social
networks active creators of content rather than being only consumers
of traditional media. That’s why millions of people show strong
desire to learn the methods and tools of digital content production
and necessary communication skills. However, the booming interest
in communication and interaction through online social networks and
high level of eagerness to invent and implement the ways to
participate in content production raise some privacy and security
concerns.
This presentation aims to open the assumed revolutionary,
democratic and liberating nature of the online social media up for
discussion by reviewing some recent political developments in
Turkey. Firstly, the role of Internet and online social networks in
mobilizing collective movements through social interactions and
communications will be questioned. Secondly, cases from the Gezi
and Okmeydanı protests, as well as the December 17-25 period, will be
presented to illustrate misinformation and manipulation in
social media, and violations of individual privacy through online social
networks intended to damage social unity and stability, in
contradiction to the supposedly democratic nature of online social networking.
Abstract: This paper addresses modeling and optimization of process parameters in powder mixed electrical discharge machining (PMEDM). The process output characteristics include metal removal rate (MRR) and electrode wear rate (EWR). Grain size of aluminum powder (S), concentration of the powder (C), discharge current (I), and pulse on time (T) are chosen as control variables to study the process performance. The experimental results are used to develop regression models based on second-order polynomial equations for the different process characteristics. A genetic algorithm (GA) is then employed to determine optimal process parameters for any desired output values of the machining characteristics.
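The second-order polynomial regression step can be sketched as an ordinary least-squares fit via the normal equations. The single-variable version below is only an illustration; the paper fits models over four variables (S, C, I, T):

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = b0 + b1*x + b2*x^2 via the normal
    equations (a one-variable stand-in for a second-order regression model)."""
    # Accumulate X^T X and X^T y for the design matrix rows [1, x, x^2].
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for x, y in zip(xs, ys):
        row = [1.0, x, x * x]
        for i in range(3):
            b[i] += row[i] * y
            for j in range(3):
                A[i][j] += row[i] * row[j]
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    coef = [0.0] * 3
    for i in reversed(range(3)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, 3))) / A[i][i]
    return coef

# Data generated from y = 2 + 3x + 0.5x^2 is recovered exactly.
xs = [0, 1, 2, 3, 4]
ys = [2 + 3 * x + 0.5 * x * x for x in xs]
print([round(c, 6) for c in fit_quadratic(xs, ys)])  # [2.0, 3.0, 0.5]
```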
Abstract: The aim of this work is to study the elastic transfer
phenomenon which takes place in the elastic scattering of 16O on 12C
at energies near the Coulomb barrier. There, the angular distribution
decreases steadily with increasing scattering angle, and the cross
section then increases at backward angles due to the α-transfer process.
This reaction was also studied at different energies for tracking the
nuclear rainbow phenomenon. The experimental data of the angular
distribution at these energies were compared to the calculation
predictions. Optical-potential codes, namely SPIVAL and the
Distorted Wave Born Approximation code DWUCK5, were used in the
analysis.
Abstract: In this paper, we construct and implement a new
steganography algorithm based on a learning system to hide a large
amount of information in a color BMP image. We use adaptive
image filtering and adaptive non-uniform image segmentation with
bit replacement on the appropriate pixels. These pixels are selected
randomly rather than sequentially, using a new concept of
main cases with sub-cases for each byte in one pixel. Following
the design steps, we derived 16 main cases with their
sub-cases, covering all aspects of embedding the input information into a color
bitmap image. High security is provided through four
layers of security that make it difficult to break the encryption of the
input information and also confuse steganalysis. A learning system is
introduced at the fourth layer of security through a neural
network. This layer increases the difficulty of statistical
attacks. Our results against statistical and visual attacks are discussed
before and after using the learning system, and we compare them
with a previous steganography algorithm. We show that our
algorithm can efficiently embed a large amount of information, up
to 75% of the image size (replacing at most 18 bits per
pixel), with high quality of the output.
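The bit-replacement core of such a scheme can be sketched as plain LSB substitution. The fixed two-bit version below is only an illustration; the adaptive algorithm above chooses, per pixel, up to 6 bits per colour channel (the 18-bit maximum quoted):

```python
def embed_bits(pixel_bytes, bits, nbits=2):
    """Toy LSB-replacement sketch: hide the bit string `bits` in the
    low `nbits` of each byte. The adaptive scheme described in the
    abstract varies the bit count per pixel; here it is fixed."""
    out = bytearray(pixel_bytes)
    it = iter(bits)
    for i in range(len(out)):
        chunk = "".join(b for _, b in zip(range(nbits), it))
        if not chunk:           # message exhausted
            break
        mask = (1 << len(chunk)) - 1
        out[i] = (out[i] & ~mask) | int(chunk, 2)
    return bytes(out)

# 0b11111111 gets "01" in its low bits -> 0b11111101 = 253;
# 0b00000000 gets "10" -> 2.
print(list(embed_bits(bytes([255, 0]), "0110", nbits=2)))  # [253, 2]
```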
Abstract: In this study, Friction Stir Processing (FSP), a recent grain refinement technique, was employed to disperse micron-sized (2 μm) SiCp particles into aluminum alloy AA6063. The feasibility of fabricating bulk composites through FSP was analyzed, and experiments were conducted at different traverse speeds and over wider volumes of the specimens. Microstructural observations were carried out using optical microscopy on cross sections both parallel and perpendicular to the tool traverse direction. Mechanical properties, including microhardness, were evaluated in detail at various regions of the specimen. The composites exhibited excellent bonding with the aluminum alloy substrate, and a significant increase of 30% in the microhardness of the metal matrix composite (MMC) relative to the base metal was observed. The observations clearly indicate that the SiC particles were uniformly distributed within the aluminum matrix.
Abstract: Pentachlorophenol (PCP) is a polychlorinated
aromatic compound that is widespread in industrial effluents and is
considered to be a serious pollutant. Among the variety of industrial
effluents encountered, effluents from tanning industry are very
important and have a serious pollution potential. PCP is also formed
unintentionally in effluents of paper and pulp industries. It is highly
persistent in soils and is lethal to a wide variety of beneficial
microorganisms and insects, human beings and animals. The natural
processes that breakdown toxic chemicals in the environment have
become the focus of much attention in developing safe and
environment-friendly deactivation technologies. Microbes and plants are among
the most important biological agents that remove and degrade waste
materials to enable their recycling in the environment. The present
investigation was carried out with the aim of developing a microbial
system for bioremediation of PCP polluted soils. A number of plant
species were evaluated for their ability to tolerate different
concentrations of pentachlorophenol (PCP) in the soil. The
experiment was conducted for 30 days under pot culture conditions.
The toxic effect of PCP on plants was studied by monitoring seed
germination, plant growth and biomass. As the concentration of PCP
was increased to 50 ppm, the inhibition of seed germination, plant
growth and biomass was also increased. Although PCP had a
negative effect on all plant species tested, maize and groundnut
showed the maximum tolerance to PCP. Other tolerating crops
included wheat, safflower, sunflower, and soybean. From the
rhizosphere soil of the tolerant seedlings, as many as twenty seven
PCP-tolerant bacteria were isolated: 8 from soybean, 3 from sunflower,
8 from safflower, 2 from maize, and 3 each from groundnut and wheat.
They were screened for their PCP degradation potential.
HPLC analyses of PCP degradation revealed that the isolate MAZ-2
degraded PCP completely. The isolate MAZ-1 was the next best
isolate with 90 per cent PCP degradation. These strains hold promise
to be used in the bioremediation of PCP polluted soils.
Abstract: The article examines methods of protecting
citizens' personal data on the Internet using biometric identity
authentication technology. The potential danger of these methods,
arising from the threat of loss of the stored biometric templates, is noted.
To eliminate the threat of compromised biometric templates, it is
proposed to use large and extra-large neural networks, which, on the
one hand, authenticate a person by his biometrics highly reliably
and, on the other hand, make a person's biometrics unavailable for
observation and analysis. The article also describes in detail
the transformation of personal biometric data into an access code.
Requirements are formulated for the biometrics-to-code converter
with respect to images of the "Insider", a "Stranger", and all
"Strangers". The effect of the dimension of the neural networks on
the quality of converters that keep the biometrics secret in the access code is analyzed.
Abstract: The hydrodynamic and thermal lattice Boltzmann
methods are applied to investigate the turbulent convective heat
transfer in wavy channel flows. In this study, the turbulent
phenomena are modeled by large-eddy simulation with the
Smagorinsky model. As a benchmark, the laminar and turbulent
backward-facing step flows are simulated first. The results give good
agreement with other numerical and experimental data. For wavy
channel flows, the distributions of the Nusselt number and the skin-friction
coefficient are calculated to evaluate the heat transfer effect and the
drag force. It indicates that the vortices at the trough would affect the
magnitude of drag and weaken the heat convection effects on the wavy
surface. In turbulent cases, if the amplitude of the wavy boundary is
large enough, the secondary vortices would be generated at troughs
and contribute to the heat convection. Finally, the effects of different
Re on the turbulent transport phenomena are discussed.
Abstract: Chicken feathers were used as biosorbent for Pb
removal from aqueous solution. In this paper, the kinetics and
equilibrium studies at several pH, temperature, and metal
concentration values are reported. For tested conditions, the Pb
sorption capacity of this poultry waste ranged from 0.8 to 8.3 mg/g.
Optimal conditions for Pb removal by chicken feathers have been
identified. Pseudo-first order and pseudo-second order equations
were used to analyze the experimental data. In addition, the sorption
isotherms were fitted to classical Langmuir and Freundlich models.
Finally, thermodynamic parameters for the sorption process have
been determined. In summary, the results showed that chicken
feathers are an alternative and promising sorbent for the treatment of
effluents polluted by Pb ions.
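The two kinetic models named above have standard closed forms. As a sketch (the parameter values are illustrative, not the fitted constants from the study):

```python
import math

def pseudo_first_order(qe, k1, t):
    """Lagergren pseudo-first-order uptake: q_t = q_e * (1 - exp(-k1 * t))."""
    return qe * (1.0 - math.exp(-k1 * t))

def pseudo_second_order(qe, k2, t):
    """Pseudo-second-order uptake: q_t = k2 * qe^2 * t / (1 + k2 * qe * t)."""
    return (k2 * qe ** 2 * t) / (1.0 + k2 * qe * t)

# Both models rise from 0 toward the equilibrium capacity q_e
# (8.3 mg/g is the upper end of the range reported above).
print(round(pseudo_second_order(8.3, 0.05, 60.0), 2))  # 7.98
```

In practice the models are linearized (e.g. t/q_t against t for the pseudo-second-order form) and the fit quality decides which mechanism better describes the sorption.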
Abstract: In this paper, we propose a new image segmentation approach for colour textured images. The proposed method consists of two stages. In the first stage, textural features based on the gray level co-occurrence matrix (GLCM) are computed for regions of interest (ROI) considered for each class; the ROIs act as ground truth for the classes. The Ohta model (I1, I2, I3) is the colour model used for segmentation. The statistical mean feature of the I2 component at a certain inter-pixel distance (IPD) was found to be the optimal textural feature for further segmentation. In the second stage, the feature matrix obtained is treated as a degraded version of the image labels, and a Markov Random Field (MRF) model is used to model the unknown image labels. The labels are estimated under the maximum a posteriori (MAP) criterion using the ICM algorithm. The performance of the proposed approach is compared with that of existing schemes: JSEG and another scheme that uses GLCM and MRF in RGB colour space. The proposed method is found to outperform the existing ones in terms of segmentation accuracy with an acceptable rate of convergence. The results are validated on synthetic and real textured images.
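The first-stage GLCM computation can be sketched directly. The mean feature below follows the usual definition, the sum over i and j of i·p(i, j) on the normalised matrix; this is an assumption, since the abstract does not give its exact formula:

```python
def glcm(image, dx, dy, levels):
    """Grey level co-occurrence counts for a single offset (dx, dy),
    i.e. one inter-pixel distance and direction."""
    P = [[0] * levels for _ in range(levels)]
    height, width = len(image), len(image[0])
    for y in range(height):
        for x in range(width):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < height and 0 <= x2 < width:
                P[image[y][x]][image[y2][x2]] += 1
    return P

def glcm_mean(P):
    """Mean feature of the normalised GLCM: sum over i, j of i * p(i, j)."""
    total = sum(sum(row) for row in P)
    return sum(i * v for i, row in enumerate(P) for v in row) / total

# 2x2 two-level patch, horizontal offset (IPD = 1): pairs (0,1) and (1,1).
patch = [[0, 1],
         [1, 1]]
print(glcm_mean(glcm(patch, dx=1, dy=0, levels=2)))  # 0.5
```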
Abstract: Using entropy weight and TOPSIS method, a
comprehensive evaluation is done on the development level of
Chinese regional service industry in this paper. Firstly, based on
existing research results, an evaluation index system is constructed
from the scale of development, the industrial structure and the
economic benefits. An evaluation model is then built up based on
entropy weight and TOPSIS, and an empirical analysis is conducted on
the development level of service industries in 31 Chinese provinces
from 2006 to 2009 along the two dimensions of time series and
cross section, which provides a new idea for assessing regional service
industry. Furthermore, the 31 provinces are classified into four
categories based on the evaluation results, and deep analysis is carried
out on the evaluation results.
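The entropy-weight and TOPSIS steps follow textbook formulas. A minimal sketch, treating all indicators as benefit-type (the paper's index system and province data are not reproduced; the matrix below is illustrative):

```python
import math

def entropy_weights(matrix):
    """Entropy-weight method: rows are alternatives (provinces),
    columns are indicators. Indicators with more dispersion get
    larger weights."""
    m, n = len(matrix), len(matrix[0])
    weights = []
    for j in range(n):
        col = [row[j] for row in matrix]
        s = sum(col)
        p = [v / s for v in col]
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        weights.append(1 - e)          # divergence degree
    total = sum(weights)
    return [w / total for w in weights]

def topsis(matrix, weights):
    """Relative closeness of each alternative to the ideal solution
    (all indicators assumed benefit-type)."""
    m, n = len(matrix), len(matrix[0])
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    V = [[weights[j] * row[j] / norms[j] for j in range(n)] for row in matrix]
    best = [max(V[i][j] for i in range(m)) for j in range(n)]
    worst = [min(V[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for row in V:
        d_plus = math.sqrt(sum((row[j] - best[j]) ** 2 for j in range(n)))
        d_minus = math.sqrt(sum((row[j] - worst[j]) ** 2 for j in range(n)))
        scores.append(d_minus / (d_plus + d_minus))
    return scores

# The first row dominates on both indicators (score 1.0); the second
# is dominated on both (score 0.0).
data = [[5, 9], [1, 1], [3, 5]]
print(topsis(data, entropy_weights(data)))
```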
Abstract: An epidemiological cross sectional study was
undertaken in Yaoundé in 2002 and updated in 2005. Focused on
health within the city, the objectives were to measure diarrheal
prevalence and to identify the risk factors associated with them.
Results of microbiological examinations have revealed an urban
average prevalence rate of 14.5%. Access to basic services in the
living environment appears to be an important risk factor for
diarrheas. Statistical and spatial analyses conducted have revealed
that the prevalence of diarrheal diseases varies between the two main types
of settlement (informal and planned). More importantly, this study
shows that diarrhea prevalence rates (notably of bacterial and parasitic
diarrheas) vary according to the sub-category of settlement. The
study draws a number of theoretical and policy implications for
researchers and policy decision makers.
Abstract: Most nonlinear equation solvers either do not always converge or use derivatives of the function to approximate the
root of such equations. Here, we give a derivative-free algorithm that guarantees convergence. The proposed two-step method, which
is to some extent like the secant method, is accompanied by some
numerical examples. The illustrative instances show that the rate of convergence of the proposed algorithm exceeds that of quadratically
convergent iterative schemes.
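For comparison, the plain secant method the abstract alludes to (the proposed two-step variant itself is not specified here) can be written as:

```python
def secant(f, x0, x1, tol=1e-12, max_iter=100):
    """Classical secant iteration: derivative-free, a finite
    difference replaces f'(x) in Newton's update."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if abs(f1) < tol:
            return x1
        if f1 == f0:          # flat secant line; cannot proceed
            return x1
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
    return x1

# Root of x^2 - 2 from the bracket [1, 2].
root = secant(lambda x: x * x - 2.0, 1.0, 2.0)
print(round(root, 6))  # 1.414214
```

Unlike this classical iteration, whose convergence depends on the starting points, the paper's method is claimed to guarantee convergence while remaining derivative-free.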