Abstract: The accumulated radiation doses of people occupationally exposed to ionizing radiation were estimated using methods of biological (chromosomal aberration frequency in lymphocytes) and physical (radionuclide analysis in urine, whole-body counting, individual thermoluminescent dosimeters) dosimetry. A group of 84 category "A" employees was examined after their work in the territory of the former Semipalatinsk test site (Kazakhstan), where the dose rate in some craters exceeds 40 μSv/h. Radionuclide determination in urine by radiochemical and whole-body counter (WBC) methods showed that the total effective dose of internal exposure of the personnel did not exceed 0.2 mSv/year, against an acceptable staff dose limit of 20 mSv/year. External radiation doses measured with individual thermoluminescent dosimeters ranged from 0.3 to 1.406 μSv. Cytogenetic examination showed a chromosomal aberration frequency in the staff of 4.27±0.22%, significantly higher than in people from the uncontaminated settlement of Tausugur (0.87±0.1%) (p ≤ 0.01) and in citizens of Almaty (1.6±0.12%) (p ≤ 0.01). Chromosome-type aberrations accounted for 2.32±0.16%, of which 0.27±0.06% were dicentrics and centric rings. Cytogenetic analysis of group radiosensitivity among the professionals by age, sex, ethnic group, and epidemiological data revealed no significant differences between the compared values. Using various techniques based on the frequency of dicentrics and centric rings, the average cumulative radiation dose for the group was calculated as 0.084-0.143 Gy. For comparative individual dosimetry by physical and biological methods of dose assessment, calibration curves (including our own) and regression equations based on the overall frequency of chromosomal aberrations obtained after irradiation of blood samples with gamma radiation at a dose rate of 0.1 Gy/min were used.
Assuming an individual variation of chromosomal aberration frequency of 1-10%, the accumulated radiation dose varied from 0 to 0.3 Gy. The main problem in interpreting individual dosimetry results comes down to the differing reactions of individuals to irradiation, i.e. radiosensitivity, which necessitates quantifying this individual reaction and taking it into account when calculating the received radiation dose. The entire examined contingent was divided into groups based on the received dose and the detected cytogenetic aberrations. Radiosensitive individuals showed the highest frequency of chromosomal aberrations (5.72%) at the lowest annual received dose. In contrast, radioresistant individuals showed the lowest frequency of chromosomal aberrations (2.8%). By the criterion of radiosensitivity, the cohort in our research was distributed as follows: radiosensitive (26.2%), medium radiosensitivity (57.1%), and radioresistant (16.7%). The dispersion was 2.3 for radioresistant individuals, 3.3 for the group with medium radiosensitivity, and 9 for the radiosensitive group. These data indicate the highest variation of the characteristic (reaction to radiation exposure) in the group of radiosensitive individuals. People with medium radiosensitivity show a significant correlation (0.66; n = 48, β ≥ 0.999) between doses determined from the cytogenetic analysis and external radiation doses obtained with thermoluminescent dosimeters. Mathematical models relating the received radiation dose to the professionals' radiosensitivity level were proposed.
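The cytogenetic dose reconstruction described above rests on inverting a dose-response calibration curve. As a minimal sketch, assuming the standard linear-quadratic model Y = c + αD + βD² for the dicentric yield (the coefficients below are illustrative placeholders, not the calibration values of this study), the dose is recovered as the positive root of the quadratic:

```python
import math

def dose_from_dicentric_yield(y, c=0.001, alpha=0.03, beta=0.06):
    """Invert the linear-quadratic calibration Y = c + alpha*D + beta*D**2
    to estimate the absorbed dose D (Gy) from an observed dicentric
    yield y (dicentrics per cell).  The default coefficients are purely
    illustrative, not the calibration values used in the study."""
    # Positive root of beta*D**2 + alpha*D + (c - y) = 0
    disc = alpha**2 - 4.0 * beta * (c - y)
    if disc < 0:
        raise ValueError("observed yield is below the background level")
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)
```

With these placeholder coefficients, a group yield of 0.27% dicentrics and rings (y = 0.0027) maps to roughly 0.05 Gy; higher yields map monotonically to higher doses.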
Abstract: Detection of small ships is a crucial task in many automatic surveillance systems employed to secure the maritime boundaries of a country. To address this problem, image de-noising is used to identify a target ship among the many other ships at sea. The de-noising technique needs to extract the ship's image from the sea background for analysis, as the ship's image may be submerged in the background and flooding waves. In this paper, a noise filter based on the fuzzy linguistic 'most' quantifier is presented. An ordered weighted averaging (OWA) function is used to remove salt-and-pepper noise from the ship's image. The results obtained are in line with those of other well-known median filters, and the OWA-based approach shows better performance.
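The OWA aggregation step can be sketched as follows. This is a minimal illustration, assuming a piecewise-linear RIM quantifier for 'most' with bounds 0.3 and 0.8 and a 3x3 window; these parameters are assumptions, not the filter settings of the paper:

```python
def most_quantifier(r, a=0.3, b=0.8):
    """Piecewise-linear RIM quantifier for the fuzzy 'most' concept
    (the bounds a and b are illustrative assumptions)."""
    if r <= a:
        return 0.0
    if r >= b:
        return 1.0
    return (r - a) / (b - a)

def owa_weights(n, q=most_quantifier):
    """OWA weights w_i = Q(i/n) - Q((i-1)/n) derived from the quantifier."""
    return [q(i / n) - q((i - 1) / n) for i in range(1, n + 1)]

def owa_filter(img, w=None):
    """Replace each interior pixel by the OWA aggregation of its sorted
    3x3 neighbourhood; 'most'-based weights assign zero weight to the
    extreme order statistics that salt-and-pepper noise produces."""
    h, wd = len(img), len(img[0])
    w = w or owa_weights(9)
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, wd - 1):
            window = sorted((img[y + dy][x + dx]
                             for dy in (-1, 0, 1) for dx in (-1, 0, 1)),
                            reverse=True)
            out[y][x] = sum(wi * vi for wi, vi in zip(w, window))
    return out
```

Because the 'most' weights vanish at both ends of the ordered window, an isolated salt pixel (maximum value) or pepper pixel (minimum value) contributes nothing to the aggregate, much like a trimmed mean centred on the median.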
Abstract: In today's market, industries must strive hard to survive due to intense competition and globalization. In earlier days there were few sellers and a limited number of buyers, so customers had fewer options to buy a product. Today, however, the market is highly competitive and volatile. Industries are focusing on robotics; advanced manufacturing methods such as abrasive jet machining (AJM), electric discharge machining (EDM), and electrochemical machining (ECM); and CAD/CAM/CAE to make quality products and market them in the shortest possible time. The Leagile manufacturing system ensures the best available solution at minimum cost to meet market demand. This paper tries to assimilate the concept of the Leagile manufacturing system in today's scenario and evaluates key factors affecting Leagile manufacturing using the digraph technique.
Abstract: This paper presents a method for modelling and analysing end plate beam-to-column connections to obtain their quasi-static behaviour using non-linear dynamic explicit integration. In addition to its importance for studying the static behaviour of a structural member, the quasi-static behaviour is largely needed for comparison with the dynamic behaviour of such members in order to investigate the dynamic effect by proposing dynamic increase factors (DIFs). Beam-to-column bolted connections contain various contact surfaces at which the implicit procedure may have difficulty converging, resulting in a large number of iterations. In contrast, the explicit procedure can deal effectively with complex contacts without convergence problems. Hence, finite element modelling with ABAQUS/Explicit is used in this study to address the dynamic effect that may be produced by the explicit procedure. The effects of loading rate and mass scaling on the analysis time are also discussed. The results show that the explicit procedure is valuable for modelling end plate beam-to-column connections in terms of failure modes and load-displacement relationships. It is also concluded that the loading rate and mass scaling should be carefully selected to avoid dynamic effects in the solution.
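The trade-off between mass scaling and analysis time follows from the explicit stability limit. The sketch below is a simplified one-dimensional estimate (explicit solvers such as ABAQUS use element-specific characteristic lengths and effective moduli, so these numbers are only indicative):

```python
import math

def stable_time_increment(l_min, young_modulus, density):
    """Explicit-dynamics stability limit dt <= L_min / c_d, where
    c_d = sqrt(E / rho) is the 1D dilatational wave speed.  A crude
    estimate; real solvers refine this per element."""
    wave_speed = math.sqrt(young_modulus / density)
    return l_min / wave_speed

def mass_scaled_increment(dt, factor):
    """Scaling the density by `factor` slows the wave speed by
    sqrt(factor) and enlarges the stable increment accordingly."""
    return dt * math.sqrt(factor)
```

For a 5 mm steel element (E = 210 GPa, rho = 7850 kg/m3) the stable increment is on the order of a microsecond, which is why mass scaling is attractive for quasi-static runs, and why it must be limited so the added inertia does not introduce spurious dynamic effects.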
Abstract: Sustainability starts with conserving resources for future generations. Since humans first existed on this earth, they have consumed natural resources. The pace of resource consumption was very slow in the past, but industrialization in the 18th century changed the human lifestyle. New inventions and discoveries shifted work from the human workforce to machines. The mass manufacture of goods provided easy access to products. In the last few decades, globalization and changing technologies have brought a consumer-oriented market. The consumption of resources has increased on a very large scale. This overconsumption pattern brought an economic boom and provided multiple opportunities, but it also put stress on natural resources. This paper tries to put forth the facts and figures of population growth and resource consumption with examples, explained with the help of the mathematical expression of doubling known as exponential growth. It compares the carrying capacity of the earth with the resource consumption of humans, i.e., the ecological footprint and biocapacity. Further, it presents the need to conserve natural resources and to re-examine the sustainable resource use approach for sustainability.
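The doubling arithmetic behind exponential growth can be made concrete. Assuming continuous growth N(t) = N0 * exp(r t), the doubling time is ln 2 / r, the familiar "rule of 70":

```python
import math

def doubling_time(growth_rate):
    """Doubling time for continuous exponential growth
    N(t) = N0 * exp(r * t): t_d = ln(2) / r."""
    return math.log(2) / growth_rate

def grown(n0, growth_rate, years):
    """Quantity after `years` of continuous exponential growth."""
    return n0 * math.exp(growth_rate * years)
```

At a seemingly modest 1% annual growth rate the doubling time is about 69 years, which is why steady percentage growth in population or consumption compounds into dramatic absolute increases within a few generations.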
Abstract: This research explored ward nurses' views about the characteristics of effective nurse leaders in the context of Iraq as a developing country, where the delivery of health care continues to face disruption and change. It is well established that the provision of modern health care requires effective nurse leaders, but in countries such as Iraq the lack of effective nurse leaders is noted as a major challenge. In a descriptive quantitative study, a survey questionnaire was administered to 210 ward nurses working in two public hospitals in a major city in the north of Iraq. The participating nurses were of the opinion that the effectiveness of their nurse leaders was evident in their ability to demonstrate good clinical knowledge, effective communication, and managerial skills. They also viewed their leaders as needing to hold high-level nursing qualifications, though this was not necessarily the case in practice. Additionally, they viewed nurse leaders' personal qualities as important, including politeness, ethical behaviour, and trustworthiness. When considered against the issues raised in interviews with a smaller group (20) of senior nurse leaders, representative of the various occupational levels, the implications identify the need for professional development that focuses on how the underpinning competencies relate to leadership and how transformational leadership is evidenced in practice.
Abstract: Polymer-based membranes are one of the low-cost technologies available for gas separation. The three major attributes required of a commercial gas separation membrane are high permeability, high selectivity, and good mechanical strength. Poly(vinylidene fluoride) (PVDF) is a commercially available fluoropolymer and a widely used membrane material in gas separation devices, since it possesses remarkable thermal and chemical stability and excellent mechanical strength. The PVDF membrane was chemically modified by soaking in different ionic liquids and then dried. The thermal behavior of the modified membranes was investigated by differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA), and the results clearly show a good affinity between the ionic liquid and the polymer support. The porous structure of the PVDF membranes was clearly seen in the scanning electron microscopy (SEM) images. The CO₂ permeability of the blended membranes was explored in comparison with the unmodified matrix. The ionic liquid immobilized in the hydrophobic PVDF support exhibited good performance for CO₂/N₂ separation. The improved permeability of the modified membrane (PVDF-IL) is attributed to the high concentration of nitrogen-rich imidazolium moieties.
Abstract: Rainfall is a major climatic parameter affecting many sectors such as health, agriculture, and water resources. Its quantitative prediction remains a challenge to weather forecasters, although numerical weather prediction models are increasingly being used for rainfall prediction. The performance of six convective parameterization schemes of the Weather Research and Forecasting (WRF) model, namely the Kain-Fritsch scheme, the Betts-Miller-Janjic scheme, the Grell-Devenyi scheme, the Grell-3D scheme, the Grell-Freitas scheme, and the New Tiedtke scheme, in quantitative rainfall prediction over Uganda is investigated using the root mean square error for the March-May (MAM) 2013 season. The MAM 2013 seasonal rainfall amount ranged from 200 mm to 900 mm over Uganda, with the northern region receiving comparatively lower rainfall (200-500 mm); western Uganda received 270-550 mm, eastern Uganda 400-900 mm, and the Lake Victoria basin 400-650 mm. A spatial variation in the rainfall amounts simulated by the different convective parameterization schemes was noted: the Kain-Fritsch scheme overestimated the rainfall amount over northern Uganda (300-750 mm) but presented comparable rainfall amounts over eastern Uganda (400-900 mm). The Betts-Miller-Janjic, Grell-Devenyi, and Grell-3D schemes underestimated the rainfall amount over most parts of the country, especially the eastern region (300-600 mm). The Grell-Freitas scheme captured the rainfall amount over the northern region (250-450 mm) but underestimated rainfall over the Lake Victoria basin (150-300 mm), while the New Tiedtke scheme generally underestimated the rainfall amount over many areas of Uganda. For deterministic rainfall prediction, the Grell-Freitas scheme is recommended over northern Uganda, while the Kain-Fritsch scheme is recommended over the eastern region.
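The root mean square error used to rank the schemes can be computed as below; this sketch assumes paired simulated and observed seasonal totals at the same grid points or stations:

```python
import math

def rmse(simulated, observed):
    """Root mean square error between simulated and observed rainfall
    totals (e.g. mm per season), paired point by point."""
    if len(simulated) != len(observed):
        raise ValueError("series must have equal length")
    return math.sqrt(sum((s - o) ** 2
                         for s, o in zip(simulated, observed)) / len(observed))
```

A lower RMSE indicates a scheme whose simulated seasonal totals lie closer, on average, to the observations; because errors are squared, large local over- or underestimates dominate the score.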
Abstract: Several decades ago, food and drinks were disallowed in most Japanese libraries. However, as discussions of "Library as a Place" have increased in recent years, the number of public and university libraries that have relaxed their policies to allow food and drinks has been increasing. This study focused on the opinions of library users about allowing food and drinks in public libraries and conducted a questionnaire survey among users of nine Japanese libraries. The results indicated that many users favored allowing food and drinks in libraries. Furthermore, it was found that users tend to visit more frequently and stay longer in libraries where food and drinks are allowed.
Abstract: The global coverage of broadband multimedia and internet-based services in terrestrial-satellite networks is of particular interest to satellite providers aiming to deliver services with low latency and high signal quality to diverse users. In particular, the delay of on-board processing is an inherent source of latency in satellite communication that is sometimes neglected in the end-to-end delay of the satellite link. The framework for this paper includes modelling an on-orbit satellite payload using an agent model that can reproduce the properties of processing delays. In essence, a comparison of different spatial interpolation methods is carried out to evaluate physical data obtained by a GEO satellite in order to define a discretization function for determining that delay. Furthermore, the performance of the proposed agent and the developed delay discretization function are validated together by simulating a hybrid satellite and terrestrial network. Simulation results show high accuracy with respect to the characteristics of the initial processing-delay data points for Ku bands.
Abstract: With the acceleration of Chinese urbanization, the expansion, renovation, and demolition of old buildings take place alongside the design and construction of new buildings every day in the downtown areas of old cities. The coordinative symbiosis between new and old buildings is an important problem that needs to be solved in the process of urban development. By studying and analyzing the case of Shanghai Citic Plaza and its surroundings, this paper discusses the concept and value of coordinating new and old buildings and the problems to be solved, while striking a balance between new and old buildings in terms of architectural form, space, function, and local context. As a result, a strategy for the coordinative symbiosis between new and old buildings is summarized, which can offer guiding principles for future urban development.
Abstract: In this paper, we propose a method to model the relationship between failure time and degradation for a simple step-stress test where the underlying degradation path is linear and different causes of failure are possible. It is assumed that the intensity function depends only on the degradation value, and no assumptions are made about the distribution of the failure times. A simple step-stress test is used to shorten the failure times of products, and a tampered failure rate (TFR) model is proposed to describe the effect of the changing stress on the intensities. We assume that some of the products that fail during the test have a cause of failure that is known only to belong to a certain subset of all possible failures; this case is known as masking. In the presence of masking, the maximum likelihood estimates (MLEs) of the model parameters are obtained through an expectation-maximization (EM) algorithm by treating the causes of failure as missing values. The effect of incomplete information on the estimation of the parameters is studied through a Monte Carlo simulation. Finally, a real example is analyzed to illustrate the application of the proposed methods.
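The TFR idea can be sketched with a constant baseline hazard. This is an intentional simplification: in the paper the intensity depends on the linear degradation value, whereas here the hazard is simply multiplied by a tampering factor alpha once the stress changes at time tau:

```python
import math

def tfr_survival(t, lam, alpha, tau):
    """Survival function S(t) under a tampered failure rate (TFR)
    step-stress model with a constant baseline hazard `lam`: after the
    stress-change time `tau` the hazard becomes alpha * lam."""
    if t <= tau:
        return math.exp(-lam * t)
    # Accumulated hazard: lam*tau before the change, alpha*lam after.
    return math.exp(-lam * tau - alpha * lam * (t - tau))
```

With alpha = 1 the model reduces to ordinary exponential survival; alpha > 1 accelerates failure after the stress change, which is exactly how the step-stress test shortens test duration.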
Abstract: Oil-in-water (O/W) emulsions are utilized extensively for cooling and lubricating cutting tools during parts machining. A robust lattice Boltzmann (LBM) thermal-surfactant model, which provides a useful platform for exploring complex emulsion characteristics under a variety of flow conditions, is used here to study the fluid behavior during conventional tool cooling. The transient thermal capabilities of the model are employed to simulate the effects of the flow conditions of O/W emulsions on the cooling of cutting tools. The model results show that the temperature outcome is only slightly affected by reversing the direction of the upper plate (workpiece). On the other hand, an important increase in effective viscosity is seen, which supports better lubrication during machining.
Abstract: This paper reports on a pilot project to develop a collaborative partnership between a community college in rural northern Ontario, Canada, and an urban university in Oshawa, in the greater Toronto area. The partner institutions will collaborate to address the learning needs of university applicants whose goal is to attain an undergraduate BA in Educational Studies and Digital Technology, but who may not live in a geographical location that would facilitate this pathway. The UOIT BA degree is attained through a 2+2 program, in which students with a two-year college diploma or equivalent can attain a four-year undergraduate degree. The goals of the project are: 1. to expand the BA program to include an additional stream covering serious educational games, simulations, and virtual environments; 2. to develop fully online learning modules (using both synchronous and asynchronous technologies) for use by university applicants who are not geographically located close to a physical university site; and 3. to assess the digital competencies of all students, including members of local, distance, and Indigenous communities, using a validated tool developed and tested by UOIT across numerous populations. This tool, the General Technical Competency Use and Scale (GTCU), will provide the collaborating institutions with data for analyzing how well students are prepared to succeed in fully online learning communities. Philosophically, the UOIT BA program is based on a fully online learning communities (FOLC) model that can be accessed from anywhere in the world through digital learning environments via audio-video conferencing tools such as Adobe Connect. It also follows models of adult learning and mobile learning, and makes a university degree accessible to the growing demographic of adult learners who may use mobile devices to learn anywhere, anytime.
The program is based on key principles of problem-based learning, allowing students to build their own understandings through the co-design of the learning environment in collaboration with the instructors and their peers. In this way, the degree allows students to personalize and individualize their learning based on their own culture, background, and professional and personal experiences. Using modified flipped-classroom strategies, students are able to interrogate video modules on their own time in preparation for one-hour discussions held in video conferencing sessions. As a consequence of the program's flexibility, students may continue to work full or part time. All of the partner institutions will co-develop four new modules, administer the GTCU, and share data, while creating a new stream of the UOIT BA degree. This will increase accessibility for students to bridge from community colleges to university through a fully digital environment. We aim to work collaboratively with Indigenous elders, community members, and distance education instructors to increase opportunities for more students to attain a university education.
Abstract: In this paper, the ejector-absorption refrigeration cycle is presented. The article deals with the thermodynamic simulation and the first- and second-law analysis of an ammonia-water cycle. The effects of parameters such as the condenser, absorber, generator, and evaporator temperatures have been investigated, and the influence of the various operating parameters on the coefficient of performance and exergy efficiency of the cycle has been studied. The results show that as the temperature of the different components increases, the coefficient of performance and the exergy efficiency of the cycle decrease, except for the evaporator and generator, whose temperature increases raise the coefficient of performance (COP). According to the results, the absorber and ejector have the highest exergy losses under the studied conditions.
Abstract: A power and cooling cycle that combines the organic Rankine cycle and the ejector refrigeration cycle, supplied by waste heat energy sources, is discussed in this paper. Thirteen working fluids, including wet, dry, and isentropic fluids, are studied in order to evaluate their performance in the combined cycle. The effects of various operating conditions on the proposed cycle are examined with the power/refrigeration ratio fixed. According to the results, dry and isentropic fluids show better performance than wet fluids.
Abstract: Urban development requires deep excavations near buildings and other structures. Deep excavation has become more of a necessity for better utilization of space as the population of the world has dramatically increased. In Lebanon, some urban areas are very crowded and lack space for new buildings and underground projects, which makes the use of underground space indispensable. In this paper, numerical modeling is performed using the finite element method to study the soil-structure interaction of a deep excavation with a diaphragm wall in the case of nonlinear soil behavior. The study focuses on a comparison of the results obtained using different support systems. Furthermore, a parametric study is performed with respect to the remoteness of the structure.
Abstract: A lattice network is a special type of network in which all nodes have the same number of links and the boundary conditions are periodic. The most basic lattice network is the ring, a one-dimensional network with periodic boundary conditions. More generally, the Cartesian product of d rings forms a d-dimensional lattice network. An analytical expression currently exists for the clustering coefficient in this type of network, but the theoretical value is valid only up to a certain connectivity value; in other words, the analytical expression is incomplete. Here we obtain analytically the clustering coefficient expression in d-dimensional lattice networks for any link density. Our analytical results show that as the link density of a lattice network tends to 1, the clustering coefficient approaches the value for a fully connected network. We also developed a model in criminology in which the generalized clustering coefficient expression is applied. The model states that delinquents learn the know-how of the crime business by sharing knowledge, directly or indirectly, with their friends in the gang. This generalization sheds light on the network properties, which is important for developing new models in fields where the network structure plays an important role in the system dynamics, such as criminology, evolutionary game theory, and econophysics, among others.
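The known one-dimensional case can be checked numerically. The sketch below builds a ring lattice and compares its average clustering coefficient with the classical expression C = 3(k-2)/(4(k-1)) for a ring in which each node links to its k nearest neighbours; the d-dimensional generalization derived in the paper is not reproduced here:

```python
def ring_lattice(n, k):
    """Ring of n nodes, each linked to its k nearest neighbours
    (k/2 on each side; k must be even and k < n)."""
    return {v: {(v + d) % n for d in range(-k // 2, k // 2 + 1) if d != 0}
            for v in range(n)}

def clustering(adj, v):
    """Local clustering coefficient: fraction of neighbour pairs of v
    that are themselves linked."""
    nbrs = list(adj[v])
    deg = len(nbrs)
    if deg < 2:
        return 0.0
    links = sum(1 for i in range(deg) for j in range(i + 1, deg)
                if nbrs[j] in adj[nbrs[i]])
    return 2.0 * links / (deg * (deg - 1))

def average_clustering(adj):
    return sum(clustering(adj, v) for v in adj) / len(adj)
```

For n = 50 and k = 6 the numerical value matches the analytic 0.6; and when k is pushed to n - 1 the lattice becomes fully connected with clustering coefficient 1, consistent with the limiting behaviour the abstract describes.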
Abstract: In this work, we exploit two assumed properties of the abundances of the observed signatures (endmembers) in order to reconstruct the abundances from hyperspectral data. The first property is joint sparsity, which assumes that adjacent pixels can be expressed as different linear combinations of the same materials. The second property is rank deficiency: the number of endmembers present in the hyperspectral data is very small compared with the dimensionality of the spectral library, which means that the abundance matrix of the endmembers is a low-rank matrix. These assumptions lead to an optimization problem for the sparse unmixing model that requires minimizing a combined l2,p-norm and nuclear norm. We propose a variable splitting and augmented Lagrangian algorithm to solve the optimization problem. Experimental evaluation carried out on synthetic and real hyperspectral data shows that the proposed method outperforms state-of-the-art algorithms with better spectral unmixing accuracy.
Abstract: Control Flow Integrity (CFI) is one of the most promising techniques for defending against code-reuse attacks (CRAs). Traditional CFI systems and the recent context-sensitive CFI use coarse control flow graphs (CFGs) to analyze whether a control-flow hijack has occurred, leaving vast space for attackers at indirect call-sites. Coarse CFGs make it difficult to decide which target may execute at an indirect control-flow transfer, and they actually weaken existing CFI systems. Extracting CFGs precisely and completely from binaries remains an unsolved problem. In this paper, we present an algorithm to obtain a more precise CFG from binaries. First, parameters are analyzed at indirect call-sites and function entries. By comparing the number of parameters prepared before a call-site with the number consumed by each function, the set of targets for indirect calls is reduced, so the control flow is more tightly constrained at indirect call-sites at runtime. We implement our policy combined with CCFI. Experimental results on some popular programs show that our approach is efficient, and further analysis shows that it can mitigate COOP and other advanced attacks.
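The arity-matching idea can be illustrated with a toy model. This is not the paper's binary-analysis implementation; the matching direction used below (a call-site preparing n arguments may target functions consuming at most n) follows common conservative arity-based CFI policies and is an assumption here:

```python
def allowed_targets(prepared_args, functions):
    """Toy arity-matching policy for indirect calls.

    `prepared_args` is the number of argument slots the binary analysis
    found prepared before the call-site; `functions` maps function names
    to the number of parameters each consumes.  Targets consuming more
    parameters than were prepared are rejected, shrinking the target set
    relative to a coarse CFG that allows every address-taken function."""
    return {name for name, consumed in functions.items()
            if consumed <= prepared_args}
```

For example, a call-site that prepares two arguments can no longer be redirected to a four-parameter function, which is how the refined CFG removes attacker-useful edges at indirect call-sites.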