Abstract: The universe of aesthetic perception entails impasses concerning the sensitive divergences to which each text or visual object may be subjected. If approached through an intertextuality that is not based on the misleading notion of a priori admissible kinships or similarities, the possibility of anachronistic, heterogeneous - and non-diachronic - assemblies can enhance the emergence of interval movements, intermediate and conflicting, conducive to a method of reading, interpreting, and assigning meaning that escapes the rigid antinomies of the mere being and non-being of things. In the negative, they operate within a relationship built upon the lack of an adjusted meaning fixed, without remainder, by their positive existences; the generated interval becomes the remnant of each of them: the opening that obscures the stable positions of each one. Without the negative of absence, of that which is always missing or must be missing in a text, concept, or image made positive by history, nothing is perceived beyond what has already been given. Pairings or binary oppositions need not lead to merely functional syntheses; on the contrary, the methodological disturbances accumulated by the approximation of signs and entities can initiate a process of becoming as an opening to an unforeseen other, a transformation lasting until the difficulties of [re]conciliation become the mainstay of a future of that sign/entity not envisioned a priori. A counter-history can emerge from these unprecedented, misadjusted approximations: beginnings of unassigned injunctions and disjunctions, in short, difficult alliances that open cracks in a supposedly cohesive history, chained in its apparent linearity without remainder and understood as a categorical historical imperative. Interstices are minority fields that, because of their opening, are capable of casting opacity upon that which apparently presents itself with irreducible clarity.
Resulting from an incomplete and maladjusted [at the least dual] marriage between the signs/entities that originate it, this interval may destabilize these entities and their very meanings and cause disorder in them. The interstitial offers a hyphenated relationship: a simultaneous union and separation, a spacing between the entity's identity and its otherness, or alterity. One and the other may no longer be seen without the crack or fissure that now separates them while uniting them across a space-time lapse. This fissure, an absence between one and the other, one with and against the other, causes ontological and semantic shifts. Based on an improbable approximation between certain conceptual and semantic shifts within the design production of the architect Rem Koolhaas and the textual production of the philosopher Jacques Derrida, this article questions the notions of unity, coherence, affinity, and complementarity in the construction of thought, starting from these ontological, epistemological, and semiological fissures that rattle signs/entities and their stable meanings. Fissures in a thought considered coherent, cohesive, and formatted are the negativity that constitutes the interstices, allowing us to move towards what still remains as non-identity, and thus to begin another story.
Abstract: The main objective of this research is to optimize the surface roughness of a milling operation on AISI 1018 steel using live tooling on a HAAS ST-20 lathe. In this study, Taguchi analysis is used to optimize the milling process by investigating the effect of different machining parameters on surface roughness. The L9 orthogonal array is designed with four controllable factors at three levels each and one uncontrollable factor, resulting in 18 experimental runs. The optimal parameters determined from the Taguchi analysis were a feed rate of 76.2 mm/min, a spindle speed of 1150 rpm, a depth of cut of 0.762 mm, and 2-flute TiN-coated high-speed steel as the tool material. The process capability Cp and process capability index Cpk improved from 0.62 and -0.44 to 1.39 and 1.24, respectively. The average surface roughness from the confirmation runs was 1.30 µm, decreasing the defect rate from 87.72% to 0.01%. The purpose of this study is to efficiently utilize the Taguchi design to optimize surface roughness in a milling operation using live tooling.
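The capability figures quoted above follow from the standard Cp/Cpk definitions; a minimal sketch of that calculation (the specification limits and roughness readings below are hypothetical placeholders, not the study's data):

```python
import statistics

def cp_cpk(samples, lsl, usl):
    """Process capability (Cp) and capability index (Cpk).

    Cp  = (USL - LSL) / (6*sigma)                   -- potential capability
    Cpk = min(USL - mean, mean - LSL) / (3*sigma)   -- accounts for centering
    """
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical confirmation-run Ra readings (µm) against illustrative limits.
ra = [1.28, 1.31, 1.30, 1.29, 1.32, 1.30]
cp, cpk = cp_cpk(ra, lsl=1.0, usl=1.6)
```

Since Cpk penalizes an off-center mean, Cpk ≤ Cp always holds, with equality only when the process is perfectly centered between the limits.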
Abstract: This research follows a systematic approach to optimizing the parameters for parts machined by turning and milling processes. The quality characteristic chosen is surface roughness, since surface finish plays an important role for parts that require surface contact. A tapered cylindrical surface is designed as the test specimen for the research. The material chosen for machining is aluminum alloy 6061, owing to its wide variety of industrial and engineering applications. A HAAS VF-2 TR computer numerical control (CNC) vertical machining center is used for milling and a HAAS ST-20 CNC machine is used for turning. Taguchi analysis is used to optimize the surface roughness of the machined parts. The L9 orthogonal array is designed for four controllable factors at three levels each, resulting in 18 experimental runs. The signal-to-noise (S/N) ratio is calculated for achieving the specific target value of 75 ± 15 µin. The controllable parameters chosen for the turning process are feed rate, depth of cut, coolant flow, and finish cut; for the milling process they are feed rate, spindle speed, step-over, and coolant flow. The uncontrollable factors are tool geometry for the turning process and tool material for the milling process. Hypothesis testing is conducted to study the significance of the different uncontrollable factors on surface roughness. The optimal parameter settings were identified from the Taguchi analysis, and the process capability Cp and process capability index Cpk improved from 1.76 and 0.02 to 3.70 and 2.10, respectively, for the turning process and from 0.87 and 0.19 to 3.85 and 2.70, respectively, for the milling process. The surface roughness improved from 60.17 µin to 68.50 µin, reducing the defect rate from 52.39% to 0%, for the turning process and from 93.18 µin to 79.49 µin, reducing the defect rate from 71.23% to 0%, for the milling process. The purpose of this study is to efficiently utilize the Taguchi design analysis to improve surface roughness.
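For a target-value characteristic such as 75 ± 15 µin, Taguchi analysis typically uses the nominal-the-best S/N ratio; a minimal sketch (the readings below are hypothetical, not the study's measurements):

```python
import math
import statistics

def sn_nominal_the_best(ys):
    """Taguchi nominal-the-best S/N ratio: 10*log10(mean^2 / variance).

    Higher is better: it rewards low variability around the achieved mean,
    which is then adjusted onto the target with a separate scaling factor.
    """
    mu = statistics.mean(ys)
    var = statistics.variance(ys)  # sample variance
    return 10 * math.log10(mu * mu / var)

# Hypothetical Ra readings (µin) for one L9 run aimed at the 75 µin target.
run = [72.0, 76.0, 74.5, 77.5]
sn = sn_nominal_the_best(run)
```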
Abstract: The objective of this research is to optimize the process of cutting cylindrical workpieces utilizing live tooling on a HAAS ST-20 lathe. Surface roughness (Ra) has been investigated as the indicator of quality characteristics for the machining process. Aluminum alloy was used for the experiments because of its wide range of uses in engineering structures and components where light weight or corrosion resistance is required. In this study, the Taguchi methodology is utilized to determine the effect that each of the parameters has on surface roughness (Ra). A total of 18 experiments for each process were designed according to Taguchi's L9 orthogonal array (OA) with four control factors at three levels each, and signal-to-noise (S/N) ratios were computed with the smaller-the-better equation to minimize the response. The optimal parameters identified for the surface roughness of the turning operation utilizing live tooling were a feed rate of 3 inches/min (A3), a spindle speed of 1300 rpm (B3), a 2-flute titanium nitride coated 3/8” endmill (C1), and a depth of cut of 0.025 inches (D2). The mean surface roughness of the confirmation runs in the turning operation was 8.22 µin. The final results demonstrate that the Taguchi methodology is an effective way of improving the surface roughness of the turning process.
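The smaller-the-better S/N equation mentioned above is -10·log10 of the mean squared response; a minimal sketch (hypothetical Ra readings, not the study's data):

```python
import math

def sn_smaller_the_better(ys):
    """Taguchi smaller-the-better S/N ratio: -10*log10(mean of y^2).

    Maximizing this S/N drives the response (here, roughness) toward zero.
    """
    return -10 * math.log10(sum(y * y for y in ys) / len(ys))

# Hypothetical Ra readings (µin) for one orthogonal-array run.
run = [8.1, 8.4, 8.2]
sn = sn_smaller_the_better(run)
```

In the Taguchi workflow, this S/N is averaged per factor level and the level with the highest mean S/N is selected for each factor.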
Abstract: People are always in search of Wi-Fi hotspots because Internet access is a major demand nowadays. But like all other technologies, there is still room for improvement in Wi-Fi technology with regard to the speed and quality of connectivity. In order to address these aspects, Harald Haas, a professor at the University of Edinburgh, proposed what we know as Li-Fi (Light Fidelity). Li-Fi is a new technology in the field of wireless communication for providing connectivity within a network environment. It is a two-way mode of wireless communication using light: the data is transmitted through light-emitting diodes (LEDs), which can vary the intensity of light very fast, even faster than the blink of an eye. From the research and experiments conducted so far, it can be said that Li-Fi can increase the speed and reliability of data transfer. This paper pays particular attention to the assessment of the performance of this technology, a 5G technology that uses LEDs as the medium of data transfer. For coverage within buildings, Wi-Fi is good, but Li-Fi can be considered favorable in situations where large amounts of data are to be transferred in areas with electromagnetic interference. It brings many data-related qualities, such as efficiency, security, and large throughput, to the table of wireless communication. All in all, it can be said that Li-Fi is going to be a future phenomenon in which the presence of light will mean access to the Internet as well as speedy data transfer.
Abstract: The influence of confined acoustic phonons on the Shubnikov-de Haas magnetoresistance oscillations in a doped semiconductor superlattice (DSSL), subjected to a magnetic field, a DC electric field, and laser radiation, has been studied theoretically based on the quantum kinetic equation method. An analytical expression for the magnetoresistance in a DSSL has been obtained as a function of the external fields, the DSSL parameters, and especially the quantum number m characterizing the effect of confined acoustic phonons. When m goes to zero, the results for bulk phonons in a DSSL are recovered. Numerical calculations are also performed for the GaAs:Si/GaAs:Be DSSL and compared with other studies. The results show that the amplitude of the Shubnikov-de Haas magnetoresistance oscillations decreases as the phonon confinement effect increases.
Abstract: The different tools of the supply chain should be managed very efficiently in mass customization. In the automobile industry, there are different strategies to manage these tools, and we need to investigate which of them are successful and which are not. There is a lack of such analysis in the literature. Keeping this in view, the purpose of this paper is to construct a framework and model which can help to analyze the supply chain of mass-customized automobiles quantitatively in future studies. Furthermore, we also consider which type of data can be used for the suggested model and where it can be obtained. Such a framework can help to bring insight for future analysis.
Abstract: In this paper, a robust fault detection and isolation (FDI) scheme is developed to monitor a multivariable nonlinear chemical process, the Chylla-Haase polymerization reactor, while it is under cascade PI control. The scheme employs a radial basis function neural network (RBFNN) in independent mode to model the process dynamics, using the weighted sum-squared prediction error as the residual. The Recursive Orthogonal Least Squares (ROLS) algorithm is employed to train the model, overcoming the training difficulty of the independent mode of the network. A second RBFNN is then used as a fault classifier to isolate faults from the different features contained in the residual vector. Several actuator and sensor faults are simulated in a nonlinear Simulink model of the reactor, and the scheme is used to detect and isolate the faults on-line. The simulation results demonstrate the effectiveness and robustness of the scheme even when the process is subjected to disturbances and uncertainties, including significant changes in the monomer feed rate, fouling factor, impurity factor, ambient temperature, and measurement noise.
Abstract: In this paper, the problem of stability and stabilization for neutral delay-differential systems with infinite delay is investigated. Using the Lyapunov method, a new delay-independent sufficient condition for the stability of neutral systems with infinite delay is obtained in terms of a linear matrix inequality (LMI). Memoryless state feedback controllers are then designed for the stabilization of the system using the feasible solution of the resulting LMI, which can be computed with standard optimization algorithms. Numerical examples are given to illustrate the results of the proposed methods.
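Lyapunov-based LMI conditions of this kind generalize the basic inequality AᵀP + PA ≺ 0 with P ≻ 0; a minimal NumPy sketch of that basic check for a delay-free system (the matrix A is illustrative, and this is the elementary Lyapunov case, not the paper's neutral-system LMI):

```python
import numpy as np

def lyapunov_P(A, Q=None):
    """Solve A^T P + P A = -Q for P, using the Kronecker identity
    vec(A^T P + P A) = (I (x) A^T + A^T (x) I) vec(P)."""
    n = A.shape[0]
    if Q is None:
        Q = np.eye(n)
    M = np.kron(np.eye(n), A.T) + np.kron(A.T, np.eye(n))
    vecP = np.linalg.solve(M, -Q.flatten())
    return vecP.reshape(n, n)  # P is symmetric for symmetric Q

# Illustrative Hurwitz-stable matrix (not from the paper).
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
P = lyapunov_P(A)
# Stability certificate: P must be symmetric positive definite.
eigs = np.linalg.eigvalsh((P + P.T) / 2)
```

Dedicated LMI solvers handle the matrix-inequality (rather than equality) form and the extra delay terms, but the feasibility logic is the same: find P ≻ 0 certifying the condition.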
Abstract: In order to address construction project requirements and specifications, scholars and practitioners need to establish a taxonomy according to a scheme that best fits their needs. While existing characterization methods are continuously being improved, new ones are devised to cover project properties that have not been previously addressed. One such method, the Project Definition Rating Index (PDRI), has received limited consideration strictly as a classification scheme. Developed by the Construction Industry Institute (CII) in 1996, the PDRI has been refined over the last two decades as a method for evaluating the completeness of a project's scope definition during front-end planning (FEP). The main contribution of this study is a review of practical project classification methods and a discussion of how the PDRI can be used to classify projects based on their readiness in the FEP phase. The proposed model has been applied to 59 construction projects in Ontario, and the results are discussed.
Abstract: With its long history, the dual-task paradigm has become one of the most intriguing research fields regarding human brain functioning and cognition. However, findings concerning the effects of task interrelations are limited, especially in combined motor and cognitive tasks. We therefore aimed at developing a measurement system for analysing the interrelation effects of cognitive and motor tasks. The present study demonstrates the applicability of the measurement system and presents first results regarding a systematisation of different task combinations. Future investigations should combine imaging technologies with the developed measurement system.
Abstract: South Africa has some regions that are susceptible to moderate seismic activity. A peak ground acceleration of between 0.1g and 0.15g can be expected in the southern parts of the Western Cape. Unreinforced masonry (URM) is commonly used as a construction material for 2- to 5-storey buildings in underprivileged areas in and around Cape Town. URM is typically regarded as the material most vulnerable to damage when subjected to earthquake excitation. In this study, a three-storey URM building was analysed with a finite element approach by applying seven earthquake time-histories that can be expected to occur in South Africa. Experimental data was used to calibrate the in-plane and out-of-plane stiffness of the URM. The results indicated that tensile cracking of the in-plane piers was the dominant failure mode. It is concluded that URM buildings of this type are at risk of failure, especially if sufficient ductility is not provided. The results also showed that connection failure must be investigated further.
Abstract: Columns have traditionally been constructed of reinforced concrete or structural steel. Much attention has been devoted to estimating the axial capacity of traditional column sections, to the detriment of other forms of construction. Other forms of column construction, such as concrete-filled double skin tubes, have received little research attention, and almost none when subjected to eccentric loading. This paper investigates the axial capacity of such columns under eccentric loading. The experimental axial capacities are compared with established theoretical formulae for concentric loading to determine a possible relationship. The study found a good correlation between the reduction in axial capacity and the different column lengths and hollow section ratios.
Abstract: Control of a semi-batch polymerization reactor using an adaptive radial basis function (RBF) neural network method is investigated in this paper. A neural network inverse model is used to estimate the valve position of the reactor; this method can identify the controlled system with the RBF neural network identifier. The weights of the adaptive PID controller are adjusted on-line based on the identification of the plant and the self-learning capability of the RBFNN. A PID controller is used in the feedback loop to regulate the actual temperature by compensating the output of the neural network inverse model. Simulation results show that the proposed control scheme has strong adaptability and robustness and achieves satisfactory control performance for the nonlinear system.
Abstract: This paper presents a neural network based model predictive control (MPC) strategy to control a strongly exothermic reaction with complicated nonlinear kinetics, the Chylla-Haase polymerization reactor, which requires very precise temperature control to maintain product uniformity. In the benchmark scenario, the operation of the reactor must be guaranteed under various disturbing influences, e.g., changing ambient temperatures or impurity of the monomer. Such a process is usually controlled by conventional cascade control, which provides robust operation but often lacks accuracy with respect to the required strict temperature tolerances. The predictive control strategy based on the RBF neural model is applied to achieve set-point tracking of the reactor temperature in the presence of disturbances. The results show that the RBF-based model predictive control gives reliable performance under disturbances and keeps the reactor temperature within a tight tolerance range around the desired reaction temperature.
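The receding-horizon idea behind MPC can be sketched in a few lines: at each step, choose the input whose predicted trajectory minimizes a tracking cost, apply it, and repeat. The first-order plant below is a toy stand-in for the identified RBF reactor model, and all numbers are illustrative:

```python
def mpc_step(x, setpoint, model, candidates, horizon=5):
    """Pick the control input whose predicted trajectory (input held
    constant over the horizon) minimizes the sum of squared errors."""
    def cost(u):
        xp, c = x, 0.0
        for _ in range(horizon):
            xp = model(xp, u)       # roll the model forward
            c += (setpoint - xp) ** 2
        return c
    return min(candidates, key=cost)

# Toy first-order plant standing in for the identified neural model.
model = lambda x, u: 0.9 * x + 0.1 * u

x, setpoint = 20.0, 50.0
u = mpc_step(x, setpoint, model, candidates=[i * 10.0 for i in range(21)])
```

A full MPC optimizes a sequence of moves with constraints rather than a gridded single input, but the structure (model rollout, cost, minimize, apply first move) is the same.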
Abstract: Natural organic matter (NOM) is a heterogeneous mixture of organic compounds that enter the water media from animal and plant remains and from domestic and industrial wastes. Research has shown that NOM is a likely precursor material for disinfection by-products (DBPs). Chlorine is very commonly used for disinfection purposes, and when NOM reacts with chlorine, trihalomethanes (THMs) and haloacetic acids (HAAs), which are carcinogenic to human health, are produced. The aim of the study is to investigate NOM removal by enhanced coagulation from the drinking water source of Eskisehir, which is supplied from the Porsuk Dam. Recently, the Porsuk Dam water has become highly polluted and the NOM concentration is therefore increasing. Enhanced coagulation studies were evaluated by measurement of dissolved organic carbon (DOC), UV absorbance at 254 nm (UV254), and different trihalomethane formation potential (THMFP) tests. Results of jar test experiments showed that NOM can be removed from water with an efficiency of about 40-50% by enhanced coagulation. The optimum coagulant type and coagulant dosages were determined using FeCl3 and alum.
Abstract: In the project FleGSens, a wireless sensor network (WSN) for the surveillance of critical areas and properties is currently being developed which incorporates mechanisms to ensure information security. The intended prototype consists of 200 sensor nodes for monitoring a 500 m long land strip. The system is focused on ensuring the integrity and authenticity of generated alarms and availability in the presence of an attacker who may even compromise a limited number of sensor nodes. In this paper, two of the main protocols developed in the project are presented: a tracking protocol to provide secure detection of trespasses within the monitored area and a protocol for secure detection of node failures. Simulation results for networks containing 200 and 2000 nodes as well as results from the first prototype, comprising a network of 16 nodes, are presented. The focus of the simulations and prototype is functional testing of the protocols and, in particular, demonstrating the impact and cost of several attacks.
Abstract: In recent years, various types of electric vehicles have again gained increasing attention as an environmentally benign technology in transport. Especially for urban areas with high local pollution, this zero-emission technology (at the point of use) is considered to provide proper solutions. Yet poor economics and limited driving ranges are still major barriers to a broader market penetration of battery electric vehicles (BEV) and fuel cell vehicles (FCV). The major result of our analyses is that the most important precondition for a further dissemination of BEV in urban areas is emission-free zones. This is an instrument which allows the promotion of BEV without providing excessive subsidies. In addition, it is important to note that the full benefits of EV can only be harvested if the electricity used is produced from renewable energy sources. That is to say, it has to be ensured that the use of BEV in urban areas is clearly linked to a green electricity purchase model. Moreover, the introduction of a CO2-emission-based tax system would support this requirement.
Abstract: Given the stricter drinking water regulations expected in the future, reducing humic acid and disinfection by-products in raw water, namely trihalomethanes (THMs) and haloacetic acids (HAAs), is worthy of research. To investigate the removal of waterborne organic material using a lab-scale bio-activated carbon filter under different empty bed contact times (EBCT), humic acid solutions were prepared at concentrations of 0.01, 0.03, 0.06, 0.12, 0.17, 0.23, and 0.29 mg/L. We then conducted experiments using an in-field pilot plant with serially connected bio-activated carbon filters and hollow-fiber membrane processes of the kind employed in traditional water purification plants. Results showed that, under low TOC conditions of humic acid in the influent (0.69 to 1.03 mg TOC/L) with EBCTs of 30 min, 40 min, and 50 min, the TOC removal rate increases with greater EBCT, attaining a removal rate of about 39%. The removal rates of THMs and HAAs by the bio-activated carbon filter were 54.8% and 89.0%, respectively.
Abstract: Regarding inpatient care, the present situation is characterized by the intensive introduction of medical technology into daily clinical routine and an ever stronger integration of special techniques into the clinical workflow. Medical technology is by now an integral part of health care according to generally accepted standards. Procurement and operation thereby represent an important economic factor, and both are the subject of everyday optimisation attempts. For this purpose a large number of tools now exist which, when comprehensively implemented, tend rather to add to the complexity of the problem. In this paper, the advantages of an integrative information workflow for life-cycle management in the field of medical technology are shown.