Abstract: Subdivision surfaces are applied to entire
meshes in order to produce smooth surface refinements from a coarse
mesh. Several schemes have been introduced in this area that provide a
set of rules converging to smooth surfaces. However, computing and
rendering all of the vertices is inconvenient in terms of memory
consumption and runtime during the subdivision process, and leads
to a heavy computational load, especially at higher levels of
subdivision. Adaptive subdivision is a method that subdivides only
certain areas of a mesh while the rest is maintained with fewer
polygons. Although adaptive subdivision is applied only to selected areas,
the smoothness of the produced surfaces can be preserved at a quality
similar to that of regular subdivision. Nevertheless, the
adaptive subdivision process is burdened by two costs: the calculations
needed to define the areas that require subdivision, and the removal
of cracks created by the difference in subdivision depth
between the selected and unselected areas. Unfortunately, at
higher levels of subdivision, adaptive subdivision still suffers
from high memory consumption.
This research introduces an iterative adaptive subdivision process,
applied to triangular meshes, that improves on the previous adaptive
method by reducing memory consumption. The iterative
process gave acceptably better results in memory and appearance,
producing fewer polygons while preserving smooth surfaces.
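One subdivision pass of the kind described above can be sketched as follows. The 1-to-4 midpoint split and the selection set are generic illustrations, not the paper's specific rules, and crack removal along the boundary between selected and unselected triangles is deliberately omitted here:

```python
import numpy as np

def midpoint(verts, cache, i, j):
    # Create (or reuse) the midpoint vertex of edge (i, j),
    # so shared edges are split only once.
    key = (min(i, j), max(i, j))
    if key not in cache:
        cache[key] = len(verts)
        verts.append((verts[i] + verts[j]) / 2.0)
    return cache[key]

def adaptive_pass(verts, tris, selected):
    """Split each selected triangle 1-to-4 at its edge midpoints;
    unselected triangles stay coarse (cracks handled separately)."""
    verts = [np.asarray(v, float) for v in verts]
    cache, out = {}, []
    for t, (a, b, c) in enumerate(tris):
        if t in selected:
            ab = midpoint(verts, cache, a, b)
            bc = midpoint(verts, cache, b, c)
            ca = midpoint(verts, cache, c, a)
            out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
        else:
            out.append((a, b, c))
    return verts, out

# Two triangles sharing an edge; only triangle 0 is selected.
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 1)]
tris = [(0, 1, 2), (1, 3, 2)]
verts, tris = adaptive_pass(verts, tris, selected={0})
```

After one pass the selected triangle becomes four triangles (three new midpoint vertices), while the unselected neighbor is untouched; the T-junctions this creates on the shared edge are what the crack-removal step must resolve.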
Abstract: In this article, the authors review and analyze survey materials on the similarities between the ornament of the proto-Turkic peoples and that of the northern Indians. The study examines materials from geneticists, archaeologists, and anthropologists. Numerous studies by scientists in different fields once again confirm the relevance of the topic. The authors approach the subject from an artistic perspective and draw the appropriate conclusions. This publication is based on the proceedings of the investigation.
Abstract: Cavitation, usually known as a destructive
phenomenon, involves turbulent, unsteady two-phase flow. With
such features, cavitating flows have become a challenging
topic in numerical studies, and much research is being done to
better understand bubbly flows and to propose solutions that
reduce their destructive effects. Aeration may be regarded
as an effective protection against cavitation erosion in many
hydraulic structures, such as gated tunnels. This paper concerns the
numerical simulation of flow in the discharge gated tunnel of a dam
using the RNG k-ε model coupled with the volume of fluid (VOF)
method, and the zone in the tunnel susceptible to cavitation
inception is predicted. In a second step, a vent is placed in the
identified zone for aeration and the numerical simulation is repeated
to study the effects of aeration. The results show that aeration
is an impressively useful method to suppress cavitation in such
tunnels.
Abstract: The objective of this study was to evaluate the optimal
treatment conditions of the Fenton oxidation process for removing
contaminants from soil slurry contaminated by petroleum hydrocarbons.
The research studied several factors that affect the removal efficiency
of petroleum hydrocarbons in soil slurry, including the molar ratio of
hydrogen peroxide (H2O2) to ferrous ion (Fe2+), pH, and
reaction time. The results demonstrated that the optimum conditions
were an H2O2:Fe2+ molar ratio of 200:1 and a pH of 4.0; the
reaction rate increased rapidly from the starting point to the 7th
hour, with a destruction kinetic rate (k) of 0.24 h-1. Approximately
96% removal of petroleum hydrocarbons was observed (initial total petroleum
hydrocarbon (TPH) concentration = 70±7 g kg-1).
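As a worked check of the reported kinetics, first-order destruction C(t) = C0·exp(-k·t) with k = 0.24 h-1 gives the removal fraction 1 - exp(-k·t) at any time t; the ~96% figure above is the observed removal, and this sketch only illustrates the kinetic model:

```python
import math

k = 0.24          # destruction rate constant, 1/h (from the study)
C0 = 70.0         # initial TPH concentration, g/kg (from the study)

def removal(t_hours):
    # Fraction of TPH destroyed after t hours under first-order kinetics.
    return 1.0 - math.exp(-k * t_hours)

frac_7h = removal(7.0)            # removal fraction at the 7th hour
tph_7h = C0 * (1.0 - frac_7h)     # remaining TPH concentration, g/kg
```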
Abstract: This work discusses an innovative methodology for the
deployment of service quality characteristics. Four groups of organizational features that may influence the quality of services are identified: human resources, technology, planning, and organizational
relationships. A House of Service Quality (HOSQ) matrix is built to
extract the desired improvements in the service quality characteristics
and to translate them into a hierarchy of important organizational
features. The Mean Square Error (MSE) criterion enables the
pinpointing of the few essential service quality characteristics to be
improved, as well as the selection of the vital organizational features. The
method was implemented in an engineering supply enterprise and
provided useful information on its vital service dimensions.
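The abstract does not define the MSE computation, so the following is only one plausible reading under stated assumptions: rank each service quality characteristic by the mean squared gap between desired and current scores across respondents, and keep the few largest. The characteristic names and data are hypothetical illustrations:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical survey: 50 respondents score 4 characteristics on a 0-10 scale.
characteristics = ["responsiveness", "delivery time", "accuracy", "courtesy"]
desired = rng.uniform(7, 10, size=(50, 4))   # desired performance scores
current = rng.uniform(4, 10, size=(50, 4))   # current performance scores

# Mean squared gap per characteristic; large values mark the few
# essential characteristics to improve first.
mse = np.mean((desired - current) ** 2, axis=0)
top2 = sorted(np.argsort(mse)[-2:])          # indices of the two largest gaps
```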
Abstract: Narratives are invaluable assets of human lives. Due to
their distinct features, narratives are useful for supporting human
reasoning processes. However, many useful narratives nowadays remain
as residuals in organizations or in human minds. Researchers
have contributed effort to investigating and improving narrative generation
processes. This paper attempts to contemplate the essential components of
narratives and to explore a computational approach for acquiring and extracting
knowledge to generate narratives. The methodology and its significant
benefits for decision support are presented.
Abstract: Ovshinsky initiated scientific research in the field of
amorphous and disordered materials that continues to this day. The
Ovshinsky Effect, in which the resistance of thin GST films is
significantly reduced upon the application of a low voltage, is of
fundamental importance in phase-change random access memory
(PC-RAM) devices. GST stands for GeSbTe chalcogenide-type
glasses. However, the Ovshinsky Effect is not without controversy.
Ovshinsky thought the resistance of GST films is reduced by the
redistribution of charge carriers, whereas others at that time, including
many PC-RAM researchers today, argue that the GST resistance
changes because the amorphous GST state is transformed to the
crystalline state by melting, the heat being supplied by external heaters. In
this controversy, quantum mechanics (QM) asserts that the heat capacity of
GST films vanishes, and therefore melting cannot occur, as the heat
supplied cannot be conserved by an increase in GST film
temperature. By precluding melting, QM re-opens the controversy
between the melting and charge-carrier mechanisms. Supporting
analysis is presented to show that, instead of increasing the GST film
temperature, conservation proceeds by the QED-induced creation of
photons within the GST film, the QED photons being confined by TIR. QED
stands for quantum electrodynamics and TIR for total internal
reflection. The TIR confinement of QED photons is enhanced by the
fact that the heat energy absorbed in the GST film is concentrated
in the TIR mode because of the film's high surface-to-volume ratio. The
QED photons, having Planck energy beyond the ultraviolet, produce
excitons by the photoelectric effect, the electrons and holes of which
reduce the GST film resistance.
Abstract: The adhesion strength of the exterior or interior coating of
steel pipes is very important. Increasing coating adhesion on
surfaces can increase the lifetime of the coating and the safety factor of
the transmission pipeline, and decrease the rate of corrosion and costs.
Steel pipe surfaces are prepared before the coating process
by shot and grit blasting, which is a mechanical method.
Effective parameters of that process are the particle size of the
abrasives, distance to the surface, abrasive flow rate, abrasive physical
properties, shape and selection of the abrasive, the kind of machine and its
power, the standard of surface cleanness degree, roughness, blasting
time, and weather humidity. This research intended to find
conditions that improve surface preparation, adhesion
strength, and corrosion resistance of the coating. This paper has
therefore studied the effects of varying the abrasive flow rate, the
abrasive particle size, the surface blasting time, and over-blasting
on steel surface roughness, using a centrifugal blasting
machine. After preparing a number of steel samples (according to
API 5L X52) and applying epoxy powder coating to them, the
adhesion strength of the coating was compared by the Pull-Off test. The results
have shown that increasing the abrasive particle size and flow rate
can increase the steel surface roughness and coating adhesion
strength, but increasing the blasting time can over-blast the surface,
increasing the surface temperature and hardness and, in turn,
decreasing the steel surface roughness and coating adhesion strength.
Abstract: This paper presents a VLSI design approach for high-speed,
real-time 2-D Discrete Wavelet Transform computation. The
proposed architecture, based on a new and fast convolution approach,
reduces the hardware complexity in addition to reducing the critical
path to the multiplier delay. Furthermore, an advanced two-dimensional
(2-D) discrete wavelet transform (DWT)
implementation, with an efficient memory area, is designed to
produce one output in every clock cycle. As a result, a very high speed
is attained. The system is verified, using JPEG2000
coefficient filters, on a Xilinx Virtex-II Field Programmable Gate
Array (FPGA) device without accessing any external memory. The
resulting computing rate is up to 270 Msamples/s, and the (9,7) 2-D
wavelet filter uses only 18 kb of memory (16 kb of first-in-first-out
memory) for a 256×256 image. In this way, the developed design
requires reduced memory and provides very high-speed processing as
well as high PSNR quality.
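The separable row-column 2-D DWT that the architecture computes can be sketched in software. For simplicity, this sketch uses the orthonormal Haar filter pair rather than the JPEG2000 (9,7) filters of the actual design, on a 256×256 image as in the experiments; the hardware contribution (one output per clock, reduced critical path) is of course not captured here:

```python
import numpy as np

def haar_dwt2_level(img):
    """One level of a separable 2-D DWT with orthonormal Haar filters:
    rows are filtered and downsampled first, then columns, yielding the
    LL, LH, HL, HH subbands."""
    x = img.astype(float)
    # Row transform: scaled sum (low-pass) and difference (high-pass) of pairs.
    lo = (x[:, ::2] + x[:, 1::2]) / np.sqrt(2)
    hi = (x[:, ::2] - x[:, 1::2]) / np.sqrt(2)
    # Column transform on each half.
    ll = (lo[::2, :] + lo[1::2, :]) / np.sqrt(2)
    lh = (lo[::2, :] - lo[1::2, :]) / np.sqrt(2)
    hl = (hi[::2, :] + hi[1::2, :]) / np.sqrt(2)
    hh = (hi[::2, :] - hi[1::2, :]) / np.sqrt(2)
    return ll, lh, hl, hh

img = np.arange(256 * 256, dtype=float).reshape(256, 256)
ll, lh, hl, hh = haar_dwt2_level(img)

# The transform is orthonormal, so signal energy is preserved
# across the four subbands.
energy_in = np.sum(img ** 2)
energy_out = sum(np.sum(s ** 2) for s in (ll, lh, hl, hh))
```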
Abstract: Nurses in an Armed Force Hospital (AFH) are exposed to stronger stress than those in a civil hospital, especially in an emergency department (ED). Ironically, the stress of these nurses has received little if any attention in past academic research. This study collected 227 samples from the emergency departments of four armed force hospitals in central and southern Taiwan. The research indicates that the top five stressors are a mass casualty event, delayed physician support, overload of routine work, overload of assignments, and annoying paperwork. Excessive workload was found to be the primary source of stress. Nurses who perceived greater stress levels were more inclined to deploy emotion-oriented approaches and more likely to seek job rotation. Professional stressors and problem-oriented approaches were positively correlated. Unlike other local studies, this study concludes that excessive workload is more stressful in an AFH.
Abstract: The in vivo effect of extracellular ubiquitin on regenerating liver cells and liver histoarchitecture has been studied. Experiments were performed on mature female white rats. Partial hepatectomy was performed using the modified method of Higgins and Anderson. Standard histopathological assessment of liver tissue was used. The proliferative activity of hepatocytes was analyzed by the colchicine mitotic index and by immunohistochemical staining for Ki67. We found that, regardless of the number of injections and the dose of extracellular ubiquitin, liver histology was not changed, so no effect was observed at the tissue level. A double in vivo injection of ubiquitin significantly decreased the mitotic activity at the 32-hour point after partial hepatectomy. Thus, we can conclude that in vivo injected extracellular ubiquitin inhibits the proliferative activity of hepatocytes in partially hepatectomized rats.
Abstract: Data mining is an extraordinarily demanding field concerned with the extraction of implicit knowledge and relationships that are not explicitly stored in databases. A wide variety of data mining methods have been introduced (classification, characterization, generalization, ...). Each of these methods includes more than one algorithm. A data mining system involves different user categories, which means that user behavior must be a component of the system. The problem at this level is to know which algorithm of which method to employ for an exploratory end, which for a decisional end, and how they can collaborate and communicate. The agent paradigm presents a new way of conceiving and realizing a data mining system. The purpose is to combine different data mining algorithms to prepare elements for decision-makers, benefiting from the possibilities offered by multi-agent systems. In this paper the agent framework for data mining is introduced, and its overall architecture and functionality are presented. The validation is made on spatial data. Principal results are presented.
Abstract: In an open real-time system environment, the coexistence of different kinds of real-time and non-real-time applications confronts the system scheduling mechanism with new requirements and challenges. A two-level scheduling scheme for open real-time systems is introduced, and it is pointed out that when hard and soft real-time applications are scheduled indistinguishably as the same type of real-time application, Quality of Service (QoS) cannot be guaranteed. The scheme has two flaws. First, it cannot differentiate the scheduling priorities of hard and soft real-time applications; that is, it neglects the characteristic differences between hard and soft real-time applications, so it does not suit a more complex real-time environment. Second, the worst-case execution time of soft real-time applications cannot be predicted exactly, so it is not worthwhile to spend much effort ensuring that no soft real-time application misses its deadline, and doing so may waste resources. To solve this problem, a novel two-level real-time scheduling mechanism (comprising a scheduling profile and a scheduling algorithm) which adds a process for dealing with soft real-time applications is proposed. Finally, we verify the real-time scheduling mechanism both theoretically and experimentally. The results indicate that our scheduling mechanism achieves the following objectives. (1) It reflects the difference in priority when scheduling hard and soft real-time applications. (2) It ensures the schedulability of hard real-time applications; that is, their deadline miss rate is 0. (3) The overall deadline miss rate of soft real-time applications is kept below 1. (4) Although no deadline is set for a non-real-time application, the scheduling algorithm the server uses can avoid the "starvation" of jobs and increase QoS.
Our scheduling mechanism is thus more compatible with different types of applications and can be applied more widely.
Abstract: In designing river intakes and diversion structures, it is paramount that the sediments entering the intake are minimized or, if possible, completely separated. Due to high water velocity, sediments can significantly damage hydraulic structures, especially when mechanical equipment like pumps and turbines are used. This subsequently results in wasted water and electricity and further costs. Therefore, it is prudent to investigate and analyze the performance of lateral intakes affected by sediment control structures. Laboratory experiments, despite their vast potential and benefits, can face certain limitations and challenges. Some of these include: limitations in equipment and facilities, space constraints, equipment errors including lack of adequate precision or mal-operation, and finally, human error. Research has shown that in order to achieve the ultimate goal of intake structure design, which is to produce long-lasting and proficient structures, the best combination of sediment control structures (such as sills and submerged vanes) along with parameters that increase their performance (such as diversion angle and location) should be determined. Cost, difficulty of execution, and environmental impacts should also be included in evaluating the optimal design. This solution can then be applied to similar problems in the future. Subsequently, the model used to arrive at the optimal design requires a high level of accuracy and precision in order to avoid improper design and execution of projects. The process of creating and executing the design should be as comprehensive and applicable as possible. Therefore, it is important that influential parameters and vital criteria are fully understood and applied at all stages of choosing the optimal design. In this article, the parameters influencing optimal performance of the intake, their advantages and disadvantages, and the efficiency of a given design are studied.
Then, a multi-criterion decision matrix is utilized to choose the optimal model that can be used to determine the proper parameters in constructing the intake.
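The multi-criterion decision matrix step can be sketched as a simple weighted scoring of alternatives. The criteria, weights, and scores below are hypothetical illustrations, not values from the study:

```python
import numpy as np

# Columns: decision criteria; benefit criteria are maximized,
# cost-type criteria (cost, difficulty, impact) are minimized.
criteria = ["sediment control", "cost", "ease of execution", "environmental impact"]
benefit = np.array([True, False, False, False])
weights = np.array([0.4, 0.3, 0.2, 0.1])     # must sum to 1

# Rows: design alternatives (e.g. sill only, vanes only, sill + vanes).
scores = np.array([
    [6.0, 4.0, 7.0, 5.0],
    [7.0, 5.0, 6.0, 4.0],
    [9.0, 7.0, 5.0, 6.0],
])

# Normalize each column so higher is always better:
# benefit criteria by column max, cost criteria by column min over score.
norm = scores / scores.max(axis=0)
norm[:, ~benefit] = scores[:, ~benefit].min(axis=0) / scores[:, ~benefit]

totals = norm @ weights          # weighted score per alternative
best = int(np.argmax(totals))    # index of the optimal alternative
```

With these illustrative numbers the third alternative wins: its strong sediment-control score outweighs its higher cost under this weighting.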
Abstract: This paper proposes a vibration analysis method for
on-line monitoring and predictive maintenance during the
milling process. Adapting the envelope method to the diagnostics and
analysis of milling tool materials is an important contribution to the
qualitative and quantitative characterization of milling capacity and a
step toward modeling the three-dimensional cutting process. An
experimental protocol was designed and developed for the
acquisition, processing, and analysis of the three-dimensional signal.
Vibration envelope analysis is proposed to detect the cutting capacity
of the tool together with the optimization of cutting parameters.
The research focuses on Hilbert transform optimization to evaluate
the dynamic behavior of the machine/tool/workpiece system.
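The envelope analysis at the heart of the method can be sketched with a frequency-domain Hilbert transform: form the analytic signal by zeroing the negative frequencies and doubling the positive ones, then take its magnitude. This is a generic sketch, not the paper's optimized implementation, and the amplitude-modulated tone below is synthetic, not measured milling data:

```python
import numpy as np

def envelope(x):
    """Vibration envelope via the analytic signal (Hilbert transform),
    computed in the frequency domain."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0      # double positive frequencies
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(X * h)   # negative frequencies are zeroed
    return np.abs(analytic)

# Synthetic signal: a 500 Hz carrier amplitude-modulated at 20 Hz,
# as might arise from a tooth-passing frequency in milling.
fs = 10_000.0
t = np.arange(0, 1.0, 1.0 / fs)
modulation = 1.0 + 0.5 * np.cos(2 * np.pi * 20 * t)
x = modulation * np.cos(2 * np.pi * 500 * t)
env = envelope(x)    # should recover the 20 Hz modulation
```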
Abstract: Today, social marketing is established as a tool
of significant value for promoting changes in
behaviors, attitudes, and practices. With the objective of analyzing the
benefits that social marketing can bring to the organizations that
use it, the research was exploratory and descriptive. The
present study used the comparative method, through a qualitative
approach, to analyze the activities developed by three institutions:
the Rosa de Saron Recovery Center, the Teen Challenge house of
recovery for addicts, and the Children's Cancer Institute of the
Agreste (ICIA), in order to point out the benefits of social
marketing in non-profit organizations.
Abstract: This research investigates risk factors for defective products in auto-parts factories. Under a Bayesian framework, a generalized linear mixed model (GLMM) in which the dependent variable, the number of defective products, has a Poisson distribution is adopted. Its performance is compared with that of a Poisson GLM under a Bayesian framework. The factors considered are production process, machine, and worker. Products coded RT50 are observed. The study found that the Poisson GLMM is more appropriate than the Poisson GLM. For the production process factor, the highest risk of producing defective products is Process 1; for the machine factor, it is Machine 5; and for the worker factor, it is Worker 6.
Abstract: Water pollution assessment problems arise frequently
in environmental science. In this research, a finite difference method
for solving the one-dimensional steady convection-diffusion equation
with variable coefficients is proposed; it is then used to optimize
water treatment costs.
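A minimal finite-difference sketch of the equation class named above follows: the 1-D steady convection-diffusion equation d(uC)/dx = d(D dC/dx)/dx on [0, 1] with Dirichlet boundaries, discretized with central differences. The coefficients u(x) and D(x) may vary with x; the constant profiles below are chosen only so the result can be checked against the exact solution, and they are not the study's data:

```python
import numpy as np

n = 201                      # grid points
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
u = np.full(n, 5.0)          # velocity u(x); constant here for the check
D = np.full(n, 1.0)          # diffusivity D(x); constant here for the check

A = np.zeros((n, n))
b = np.zeros(n)
A[0, 0] = 1.0;  b[0] = 0.0     # boundary condition C(0) = 0
A[-1, -1] = 1.0; b[-1] = 1.0   # boundary condition C(1) = 1
for i in range(1, n - 1):
    Dp = 0.5 * (D[i] + D[i + 1])   # D at the i+1/2 interface
    Dm = 0.5 * (D[i] + D[i - 1])   # D at the i-1/2 interface
    # Central differences for d(uC)/dx - d(D dC/dx)/dx = 0:
    A[i, i - 1] = -Dm / h**2 - u[i - 1] / (2 * h)
    A[i, i]     = (Dp + Dm) / h**2
    A[i, i + 1] = -Dp / h**2 + u[i + 1] / (2 * h)

C = np.linalg.solve(A, b)

# With constant u and D the exact solution is
# C(x) = (exp(Pe*x) - 1) / (exp(Pe) - 1), with Peclet number Pe = u/D.
Pe = 5.0
exact = (np.exp(Pe * x) - 1.0) / (np.exp(Pe) - 1.0)
err = np.max(np.abs(C - exact))
```

The central scheme is second-order accurate and stable here because the cell Peclet number u·h/D is well below 2; the same tridiagonal assembly accepts genuinely variable u(x) and D(x).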
Abstract: Today's economy is in permanent change, causing
mergers, acquisitions, and cooperation between enterprises. As a
consequence, process adaptations and realignments result in systems
integration and software development projects. The processes and
procedures for executing such projects still rely on the craftsmanship
of highly skilled workers. A generally accepted, industrialized
production, characterized by high efficiency and quality, seems
inevitable.
In spite of this, current concepts of software industrialization are
aimed at traditional software engineering and do not consider the
characteristics of systems integration. The present work points out
these particularities and discusses the applicability of existing
industrial concepts in the systems integration domain. Consequently,
it defines further areas of research necessary to bring the field of
systems integration closer to an industrialized production, allowing
higher efficiency, quality, and return on investment.
Abstract: Global competitiveness has recently become the
biggest concern of both manufacturing and service companies.
Electronic commerce, as a key technology, enables firms to reach
potential consumers all over the world. In this study, we
present the commonly used electronic payment systems and then
evaluate these systems with respect to different
criteria. The payment systems included in this research are
the credit card, the virtual credit card, electronic money,
mobile payment, credit transfer, and debit instruments. We
carry out a systematic comparison of these systems with respect to
three main criteria: technical, economic, and social. We
conduct a fuzzy multi-criteria decision-making procedure to deal
with the multi-attribute nature of the problem. The subjectiveness
and imprecision of the evaluation process are modeled using
triangular fuzzy numbers.
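The triangular-fuzzy-number modeling mentioned above can be sketched as follows: each linguistic judgment is a triangular fuzzy number (l, m, u), weighted ratings are aggregated with fuzzy arithmetic, and the result is defuzzified by its centroid. The weights and ratings below are hypothetical illustrations, not the study's evaluations:

```python
# Fuzzy criteria weights as triangular fuzzy numbers (l, m, u).
weights = {"technical": (0.3, 0.4, 0.5),
           "economic": (0.2, 0.3, 0.4),
           "social": (0.2, 0.3, 0.4)}

# Ratings of one payment system (e.g. the credit card) per criterion.
ratings = {"technical": (5.0, 7.0, 9.0),
           "economic": (3.0, 5.0, 7.0),
           "social": (7.0, 9.0, 10.0)}

def tfn_mul(a, b):
    # Approximate TFN multiplication (component-wise; valid for positive TFNs).
    return tuple(x * y for x, y in zip(a, b))

def tfn_add(a, b):
    # TFN addition is exact component-wise addition.
    return tuple(x + y for x, y in zip(a, b))

def centroid(t):
    # Defuzzify a TFN by its centroid, (l + m + u) / 3.
    return sum(t) / 3.0

total = (0.0, 0.0, 0.0)
for c in weights:
    total = tfn_add(total, tfn_mul(weights[c], ratings[c]))
crisp = centroid(total)   # crisp overall score for ranking the system
```

Repeating this for each payment system and ranking the crisp scores yields the comparison; richer variants keep the final TFNs and rank them with a fuzzy-number ordering instead of the centroid.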