Abstract: Scheduling algorithms are used in operating systems
to optimize processor usage. One of the most efficient
scheduling algorithms is the Multi-Level Feedback Queue (MLFQ)
algorithm, which uses several queues with different quanta. The
most important weakness of this method is the inability to
determine the optimal number of queues and the quantum of each
queue, both of which directly affect response time. This
weakness is addressed by the IMLFQ scheduling algorithm. In
this paper, we review the IMLFQ algorithm for solving these
problems and minimizing response time. The algorithm uses a
recurrent neural network to find both the number of queues and
the optimized quantum of each queue. In addition, to prevent
probable faults in the computation of process response times, a
new fault-tolerant approach based on combinational software
redundancy is presented. The experimental results show that the
IMLFQ algorithm yields better response times than other
scheduling algorithms, and that the fault-tolerant mechanism
further improves IMLFQ performance.
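The queue mechanics underlying MLFQ can be illustrated with a minimal sketch. The queue count and quanta below are arbitrary placeholders; IMLFQ's contribution is precisely to tune these values with a recurrent neural network, which this sketch does not attempt.

```python
from collections import deque

def mlfq(jobs, quanta):
    """Simulate a multi-level feedback queue.

    jobs: dict name -> remaining burst time (all arrive at t = 0)
    quanta: quantum for each queue level; the last level absorbs
    demoted jobs and runs them round-robin.
    Returns dict name -> completion time.
    """
    queues = [deque() for _ in quanta]
    for name in jobs:
        queues[0].append(name)          # every job starts in the top queue
    remaining = dict(jobs)
    time, finish = 0, {}
    while any(queues):
        level = next(i for i, q in enumerate(queues) if q)
        name = queues[level].popleft()
        run = min(quanta[level], remaining[name])
        time += run
        remaining[name] -= run
        if remaining[name] == 0:
            finish[name] = time
        else:                           # used its whole quantum: demote
            queues[min(level + 1, len(queues) - 1)].append(name)
    return finish
```

With quanta [2, 4], a 3-unit job and a 6-unit job both get demoted once before finishing in the second queue.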
Abstract: This paper presents a method to estimate load
profiles from multiple power flow solutions for every minute
over 24 hours of a day. A method to calculate multiple
solutions of the nonlinear power flow problem is introduced.
Power System Simulation/Engineering (PSS®E) and Python have
been used to solve the load power flow. The resulting power
flow solutions have been used to estimate the load profile of
each load bus using Independent Component Analysis (ICA),
without any knowledge of the parameters or network topology of
the system. The proposed algorithm is tested on the IEEE 69-bus
test system, which represents the distribution part, and the
ICA method has been programmed in MATLAB R2012b. Simulation
results and estimation errors are discussed in this paper.
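The multiplicity of power flow solutions can be illustrated on a lossless two-bus toy system, which is a hypothetical simplification and not the paper's PSS®E/Python workflow: for a P + jQ load fed through reactance X, the load-bus voltage magnitude satisfies a quadratic in V², yielding the well-known high- and low-voltage solution pair.

```python
import math

def two_bus_solutions(p, q, x, v1=1.0):
    """Both voltage-magnitude solutions of a lossless two-bus power flow.

    Load p + jq (per unit) fed from a slack bus of magnitude v1 through
    series reactance x. From S = V2 * conj((V1 - V2)/jX) one obtains a
    quadratic in V2**2:
        (V2**2)**2 - (v1**2 - 2*q*x)*V2**2 + x**2*(p**2 + q**2) = 0
    Returns (high_voltage, low_voltage), or None when the load exceeds
    the maximum power transfer limit (no real solution).
    """
    b = v1**2 / 2 - q * x
    disc = b**2 - x**2 * (p**2 + q**2)
    if b < 0 or disc < 0:       # both roots of V2**2 negative or complex
        return None
    r = math.sqrt(disc)
    return math.sqrt(b + r), math.sqrt(b - r)
```

For a light load the two magnitudes are far apart (one near 1.0 pu, one near collapse), which is what makes distinguishing and tracking multiple solutions nontrivial.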
Abstract: Stochastic models of biological networks are well established in systems biology, where the computational treatment of such models is often focused on the solution of the so-called chemical master equation via stochastic simulation algorithms. In contrast to this, the development of storage-efficient model representations that are directly suitable for computer implementation has received significantly less attention. Instead, a model is usually described in terms of a stochastic process or a "higher-level paradigm" with a graphical representation, such as a stochastic Petri net. A serious problem then arises due to the exponential growth of the model's state space, which is in fact a main reason for the popularity of stochastic simulation, since simulation suffers less from the state space explosion than non-simulative numerical solution techniques. In this paper we present transition class models for the representation of biological network models, a compact mathematical formalism that circumvents state space explosion. Transition class models can also serve as an interface between different higher-level modeling paradigms, stochastic processes, and the implementation coded in a programming language. Besides, the compact model representation provides the opportunity to apply non-simulative solution techniques, thereby preserving the possible use of stochastic simulation. Illustrative examples of transition class representations are given for an enzyme-catalyzed substrate conversion and a part of the bacteriophage λ lysis/lysogeny pathway.
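As a sketch of the idea, a transition class can be held as a pair of functions, a propensity α(x) and a successor u(x), which is already enough to drive a stochastic simulation. The example below encodes the enzyme-catalyzed substrate conversion E + S ⇌ ES → E + P as three transition classes and runs Gillespie's direct method over them; the rate constants are illustrative assumptions, not values from the paper.

```python
import random

def enzyme_classes(c1=1.0, c2=1.0, c3=0.1):
    """Transition classes for E + S <-> ES -> E + P.

    State x = (E, S, ES, P); each class is (propensity, successor).
    Rate constants c1..c3 are illustrative only.
    """
    return [
        (lambda x: c1 * x[0] * x[1],                      # binding
         lambda x: (x[0] - 1, x[1] - 1, x[2] + 1, x[3])),
        (lambda x: c2 * x[2],                             # unbinding
         lambda x: (x[0] + 1, x[1] + 1, x[2] - 1, x[3])),
        (lambda x: c3 * x[2],                             # conversion
         lambda x: (x[0] + 1, x[1], x[2] - 1, x[3] + 1)),
    ]

def ssa(classes, x0, t_end, seed=0):
    """Gillespie's direct method driven purely by transition classes."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    while True:
        rates = [a(x) for a, _ in classes]
        total = sum(rates)
        if total == 0:                 # no transition can fire
            return x
        t += rng.expovariate(total)    # time to next event
        if t > t_end:
            return x
        pick = rng.random() * total    # choose a class by its rate
        for (a, u), r in zip(classes, rates):
            pick -= r
            if pick < 0:
                x = u(x)
                break
```

Note that the simulator never enumerates the state space; it only ever evaluates the classes at the current state, which is the storage advantage the abstract describes.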
Abstract: The recent advances in computational fluid dynamics
(CFD) can be useful for observing the detailed hemodynamics in
cerebral aneurysms, not only for understanding their formation
and rupture but also for clinical evaluation and treatment.
However, important hemodynamic quantities are difficult to
measure in vivo. In the present study, an approximate model of
a normal middle cerebral artery (MCA), along with two cases
consisting of broad and narrow saccular aneurysms, is analyzed.
The models are generated in ANSYS WORKBENCH and transient
analysis is performed in ANSYS-CFX. The results obtained are
compared for the three cases and agree well with the available
literature.
Abstract: Controlled modification of appropriate sharpness for
nanotips is of paramount importance for developing novel
materials and functional devices at nanometer resolution.
Herein, we present a reliable and unique strategy of
laser-irradiation-enhanced physicochemical etching to
manufacture super-sharp tungsten tips with reproducible shape
and dimensions as well as high yields (~80%). The corresponding
morphological evolution of the tungsten tips and the laser-tip
interaction mechanisms were systematically investigated and
discussed using field emission scanning electron microscopy
(SEM) and a physical optics statistics method at different
fluences under 532 nm laser irradiation. This work paves the
way for exploring more accessible metallic tip applications
with tunable apex diameter and aspect ratio and, furthermore,
facilitates potential sharpening enhancement techniques for
other materials used in a variety of nanoscale devices.
Abstract: This research paper presents a framework for
building a malware dataset. Many researchers take a long time
to clean a dataset of noise or to transform it into a format
that can be used directly for testing. Therefore, this research
proposes a framework to help researchers speed up the malware
dataset cleaning processes, after which the dataset can be used
for testing. It is believed that an efficient malware dataset
cleaning process can improve the quality of the data and thus
help improve the accuracy and efficiency of the subsequent
analysis. Apart from that, an in-depth understanding of the
malware taxonomy is also important prior to and during the
dataset cleaning processes. A new Trojan classification has
been proposed to complement this framework. The experiment was
conducted in a controlled lab environment using the VxHeavens
dataset. The framework is built on the integration of static
and dynamic analyses, the incident response method, and
knowledge discovery in databases (KDD) processes. It can be
used as a baseline guideline for malware researchers building
malware datasets.
Abstract: Since the 1990s the American furniture industry has
faced a transition period. Manufacturers, one of its most
important actors, have made their entrance into the retail
industry. This shift has had deep consequences not only for the
American furniture industry as a whole, but also for other
international furniture industries, especially the Chinese one.
The present work aims to analyze this actor based on the
distinctions provided by Global Commodity Chain theory. It
stresses its characteristics, structure, mode of operation, and
importance for both the U.S. and the Chinese furniture
industries.
Abstract: There are many issues that affect the modeling and design of real-time databases. One of those issues is maintaining consistency between the actual state of a real-time object in the external environment and its images as reflected by all its replicas distributed over multiple nodes. The need to improve scalability is another important issue. In this paper, we present a general framework for designing a replicated real-time database for small- to medium-scale systems that maintains all timing constraints. To extend the idea to modeling a large-scale database, we present a general outline that improves scalability by applying an existing static segmentation algorithm to the whole database with the intent of lowering the degree of replication; enabling segments to have individual degrees of replication avoids excessive resource usage, and together these measures contribute to solving the scalability problem for DRTDBS.
Abstract: Emerging adulthood, between the ages of 18 and 25, is a new developmental stage extending from adolescence to young adulthood. According to Arnett [2004], identity experimentation in emerging adulthood takes place in three basic fields: love, work, and worldview. When the literature on identity is examined, it is seen that identity has been studied mostly with adolescents, and studies have concentrated on the relationship of identity with many demographic variables while neglecting important variables such as marital status, parental status, and SES. Thus, the main aim of this study is to determine whether identity statuses differ with marital status, parental status, and SES. A total of 700 emerging adults participated in this study; the mean age was 22.45 years [SD = 3.76]. The sample was made up of 347 females and 353 males, and all participants were college students. Student responses to the Extended Version of the Objective Measure of Ego Identity Status [EOM-EIS-2] were used to classify students into one of the four identity statuses. The SPSS 15.0 program was used to analyze the data; percentage, frequency, and chi-square (χ2) analyses were employed. When the findings of the study are viewed as a whole, the most frequently observed identity status in the group is moratorium. Also, identity statuses differ with marital status, parental status, and SES. The findings are discussed in the context of emerging adulthood.
Abstract: This work proposes an optical fiber system (OF) for
sensing various volatile organic compounds (VOCs) in human breath
for the diagnosis of some metabolic disorders as a non-invasive
methodology. The analyzed VOCs are alkanes (i.e., ethane, pentane,
heptane, octane, and decane), and aromatic compounds (i.e., benzene,
toluene, and styrene). The OF displays high analytical
performance: it provides near real-time responses, rapid
analysis, and low instrumentation costs, and it exhibits a
useful linear range and detection limits. The developed OF
sensor is also comparable to a reference methodology (gas
chromatography-mass spectrometry) for the eight tested VOCs.
Abstract: A method has been developed for preparing load
models for power flow and stability studies. The load modeling
(LOADMOD) computer software transforms data on load class mix,
composition, and characteristics into the form required for
commonly used power flow and transient stability simulation
programs. Typical default data have been developed for load
composition and characteristics. This paper describes the
LOADMOD software, the dynamic and static load modeling
techniques used in it, and the results of initial testing on
the BAKHTAR power system.
Abstract: In many sensor network applications, sensor nodes are deployed in open environments, and hence are vulnerable to physical attacks that can potentially compromise a node's cryptographic keys. False sensing reports can be injected through compromised nodes, which can lead not only to false alarms but also to the depletion of the limited energy resource in battery-powered networks. Ye et al. proposed a statistical en-route filtering scheme (SEF) to detect such false reports during the forwarding process. In this scheme, the choice of the security threshold value is important, since it trades off detection power against overhead. In this paper, we propose a fuzzy logic approach for determining the security threshold value in SEF-based sensor networks. The fuzzy logic determines the security threshold by considering the number of partitions in the global key pool, the number of compromised partitions, and the energy level of the nodes. The fuzzy-based threshold value conserves energy while providing sufficient detection power.
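The style of fuzzy inference involved can be sketched with a zero-order Sugeno controller. Everything here is an illustrative assumption rather than the paper's design: the two inputs fold the first two factors into a single compromised-partition ratio, and the membership functions and rule base are invented for the sketch.

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b and support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def security_threshold(compromised_ratio, energy_ratio, max_threshold=5):
    """Zero-order Sugeno sketch of SEF security-threshold selection.

    compromised_ratio: fraction of key-pool partitions compromised (0..1)
    energy_ratio: remaining node energy (0..1)
    Illustrative rule base: heavy compromise with plenty of energy pushes
    the threshold up for detection power; low energy pulls it down.
    """
    low = lambda x: tri(x, -0.5, 0.0, 0.5)
    med = lambda x: tri(x, 0.0, 0.5, 1.0)
    high = lambda x: tri(x, 0.5, 1.0, 1.5)
    rules = [  # (firing strength, consequent threshold)
        (min(high(compromised_ratio), high(energy_ratio)), max_threshold),
        (min(med(compromised_ratio), med(energy_ratio)), max_threshold / 2),
        (low(energy_ratio), 1.0),          # conserve energy
        (low(compromised_ratio), 1.0),     # little compromise: stay cheap
    ]
    num = sum(w * c for w, c in rules)
    den = sum(w for w, _ in rules)
    return round(num / den) if den else 1  # weighted-average defuzzification
```

The weighted-average defuzzification keeps the output between the extreme rule consequents, so the threshold degrades gracefully as the inputs move.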
Abstract: The recent development of Information and Communication Technology (ICT) enables new ways of "democratic" decision-making, such as a page-ranking system, which estimates the importance of a web page based on indirect trust in that page shared by a diverse group of unorganized individuals. These kinds of "democracy" have not yet been embraced in the world of real politics. On the other hand, a large amount of data about personal relations, including trust, norms of reciprocity, and networks of civic engagement, has been accumulated in computer-readable form by computer systems (e.g., social networking systems). We can use these relations as a new type of social capital to construct a new democratic decision-making system based on a delegation network. In this paper, we propose an effective decision-making support system based on empowering the vote of someone you trust. For this purpose, we propose two new techniques: the first estimates the entire vote distribution from a small number of votes, and the second estimates the choices of active voters to promote voting, using a delegation network. We show by agent-based simulations that these techniques can increase the voting ratio and the credibility of the whole decision.
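The basic delegation mechanism can be sketched as follows; this is a hypothetical minimal model in which each delegated vote follows the chain of trust until it reaches a direct voter, and it does not attempt the paper's estimation techniques for vote distribution or active-voter choice.

```python
def tally(choices, delegates):
    """Tally votes over a delegation network (illustrative sketch).

    choices: dict voter -> chosen option, for voters who voted directly
    delegates: dict voter -> trusted voter, for those who delegated
    A delegated vote follows the trust chain until it reaches a direct
    voter; chains that loop or dead-end are counted as abstentions.
    Returns dict option -> number of votes.
    """
    result = {}
    for voter in set(choices) | set(delegates):
        seen = set()
        v = voter
        while v in delegates and v not in seen:   # walk the trust chain
            seen.add(v)
            v = delegates[v]
        if v in choices:                          # reached a direct voter
            c = choices[v]
            result[c] = result.get(c, 0) + 1
    return result
```

In this model a single trusted direct voter carries the weight of everyone whose chain ends at them, which is the "empowering" effect the abstract describes.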
Abstract: The dramatic effect of information technology on
society is undeniable. In education, it is evident in the use of terms
like active learning, blended learning, electronic learning and mobile
learning (ubiquitous learning). This study explores the perceptions of
54 learners in a higher education institution regarding the use of
mobile devices in a third year module. Using semi-structured
interviews, it was found that mobile devices had a positive impact on
learner motivation, engagement and enjoyment. It also improved
the consistency of learning material, and the convenience and
flexibility (anywhere, anytime) of learning. User-interface
limitations, bandwidth and cognitive overload, however, were of
concern. The use of cloud-based resources like YouTube and
Google Docs through mobile devices positively influenced
learner perceptions, making them prosumers (both consumers and
producers) of educational content.
Abstract: Increasing concerns over climate change have limited
the liberal usage of available energy technology options. India
faces a formidable challenge to meet its energy needs and to
provide adequate energy of the desired quality, in various
forms, to users in a sustainable manner at reasonable cost. In
this paper, work was carried out with the objective of studying
the role of various energy technology options under different
scenarios, namely the baseline, high-nuclear, high-renewable,
and low- and high-growth-rate scenarios. The study was carried
out using the Model for Energy Supply Strategy Alternatives and
their General Environmental Impacts (MESSAGE), which evaluates
alternative energy supply strategies with user-defined
constraints on fuel availability, environmental regulations,
etc. The projected electricity demand at the end of the study
period (2035) is 500490 MWYr. The model predicted the share of
this demand as Thermal: 428170 MWYr, Hydro: 40320 MWYr,
Nuclear: 14000 MWYr, and Wind: 18000 MWYr in the baseline
scenario. Coal remains the dominant fuel for the production of
electricity during the study period; however, the import
dependency on coal increases over the period. In the baseline
scenario the cumulative carbon dioxide emissions up to 2035 are
about 11,000 million tonnes of CO2. In the high-nuclear-capacity
scenario, carbon dioxide emissions are reduced by 10% as the
nuclear energy share increases to 9%, compared with 3% in the
baseline scenario. Similarly, aggressive use of renewables
reduces carbon dioxide emissions by 4%.
Abstract: As firms grow and develop, especially high-tech
firms, the manager is often mainly involved in solving the
problems of the business and making decisions about the firm's
executive activities. Yet beyond executive measures, attention
to planning the firm's path to success and growth, and the
application of long experience and sagacity in designing the
business model, are vital and necessary. Success in a business
results from different factors, one of the most important of
which is designing and implementing an optimal business model
at the beginning of the firm's work. This model determines the
level of profitability achieved through innovation and the
value added gained. The business model is therefore the process
of connecting the innovation and technology environment with
the economic and business environment, and it is important for
the success of modern businesses given their traits.
Abstract: Analyses carried out on examples of detected defect
echoes showed clearly that these detected forms can be
described by a set of characteristic parameters in order to
discriminate between planar and volumetric defects. This work
addresses a problem in ultrasonic NDT: the identification of
defects. The objective of the work is divided into three parts:
extraction of wavelet parameters from the ultrasonic echo of
the detected defect; principal component analysis (PCA) to
optimize the attribute vector; and finally a classification
algorithm (SVM, Support Vector Machine) that discriminates
between a planar defect and a volumetric defect. We conclude
with a summary of the completed work and of the robustness of
the various algorithms proposed in this study.
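The PCA stage of the attribute-vector optimization can be sketched with power iteration. This is a pure-Python illustration; a real pipeline would retain several components, not just the first, and feed them to an SVM classifier via a library such as scikit-learn.

```python
def pca_first_component(data, iters=200):
    """Project feature vectors onto the dominant principal component.

    data: list of equal-length feature vectors (list of lists).
    Returns the score of each centered vector along the top eigenvector
    of the sample covariance matrix, found by power iteration.
    """
    n, d = len(data), len(data[0])
    mean = [sum(row[j] for row in data) / n for j in range(d)]
    x = [[row[j] - mean[j] for j in range(d)] for row in data]
    # sample covariance matrix (d x d)
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d                     # power iteration for top eigenvector
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return [sum(xi[j] * v[j] for j in range(d)) for xi in x]  # scores
```

Reducing each echo's attribute vector to a few such scores is what keeps the downstream SVM's input compact.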
Abstract: This study applied the Theory of Planned Behavior
model in predicting dietary behavior among Type 2 diabetics in a
Kenyan environment. The study was conducted for three months
within the diabetic clinic at Kisii Hospital in Nyanza Province in
Kenya, and adopted a sequential mixed-methods design combining
both qualitative and quantitative phases. Qualitative data were
analyzed using the grounded theory method, and structural
equation modeling with maximum likelihood estimation was used
to analyze the quantitative data. The results, based on common
fit indices, revealed that the Theory of Planned Behavior
fitted the data acceptably well among the Type 2 diabetics for
dietary behavior {χ2 = 223.3, df = 77, p = .02,
χ2/df = 2.9, n=237; TLI = .93; CFI =.91; RMSEA (90CI) = .090(.039,
.146)}. This implies that the Theory of Planned Behavior holds and
forms a framework for promoting dietary practice among Type 2
diabetics.
Abstract: Although nonlinear dynamic analysis using a
specialized hydro-code such as AUTODYN is an accurate and
useful tool for the progressive collapse assessment of a
multi-story building subjected to blast load, it takes too much
time to be applied to a practical simulation of the progressive
collapse of a tall building. In this paper, blast analysis of
an RC frame structure using a simplified model with the
Reinforcement Contact technique provided in ANSYS Workbench is
introduced and its accuracy investigated. Even though the
simplified model has only a fraction of the elements of the
detailed model, with this modeling technique it shows
structural behavior under blast load similar to that of the
detailed model. The proposed modeling method can be effectively
applied to the progressive collapse analysis of RC frame
structures under blast loading.
Abstract: We propose a new fiber lens structure for large distance
measurement in which a polymer layer is added to a conventional
fiber lens. The proposed fiber lens can adjust the working distance by
properly choosing the refractive index and thickness of the polymer
layer. In our numerical analysis, for a fiber lens radius of
120 μm the working distance of the proposed fiber lens is about
10 mm, which is about 30 times larger than that of a
conventional fiber lens.