Electric Field Investigation in MV PILC Cables with Void Defect

Worldwide, most PILC MV underground cables in use are approaching the end of their design life; hence, failures are likely to increase. This paper studies the electric field and potential distributions within a PILC insulated cable containing a common void defect. A finite element model of the performance of the belted PILC MV underground cable is presented. The variation of the electric field stress within the cable is analysed using the Finite Element Method (FEM), and the effects of the void defect within the insulation are presented. The outcomes lead to a deeper understanding of the modeling of Paper Insulated Lead Covered (PILC) cables and of the electric field response of belted PILC insulated cables containing void defects.
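As a rough cross-check on such FEM results, the classical electrostatic estimate for the field inside a small spherical void in a dielectric, E_void = 3εr·E_applied/(2εr + 1), is often useful. A minimal sketch follows; the permittivity and applied-stress values are illustrative assumptions, not figures taken from this study.

```python
# Field enhancement inside a small spherical void in a dielectric
# (classical electrostatics estimate; the values below are illustrative).

def void_field_enhancement(eps_r: float) -> float:
    """Ratio E_void / E_applied for a spherical cavity in a dielectric."""
    return 3.0 * eps_r / (2.0 * eps_r + 1.0)

eps_paper = 3.5          # assumed relative permittivity of impregnated paper
E_applied = 4.0          # assumed local stress in kV/mm at the void location

factor = void_field_enhancement(eps_paper)
print(f"Enhancement factor: {factor:.2f}")
print(f"Estimated field in void: {factor * E_applied:.2f} kV/mm")
```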

Nano Composite of Clay and Modified Ketonic Resin as Fire Retardant Polyol for Polyurethane

In situ modified cyclohexanone-formaldehyde resins were prepared by the addition of alendronic acid during resin preparation. Clay nanocomposites in the ketonic resins were obtained by adding clay into the flask at the beginning of the resin preparation. The prepared resins were used for the synthesis of fire-resistant polyurethane foams. Both the phosphorus-containing modifier alendronic acid and the nanoclay increase the fire resistance of the cyclohexanone-formaldehyde resin and, consequently, of the polyurethane produced from these resins. The effect of the concentrations of alendronic acid and clay on the fire resistance and physical properties of the polyurethanes was studied.

Adhesion Problematic for Novel Non-Crimp Fabric and Surface Modification of Carbon-Fibres Using Oxy-Fluorination

In the field of technical textiles, Non-Crimp Fabrics (NCF) are increasingly used. In general, NCF exhibit excellent load-bearing properties, but the manufacturing process leaves some disadvantages that have to be reduced. In this regard, a novel technique of processing NCF was developed that substitutes the binding thread with an adhesive. This stitch-free method requires a new manufacturing concept as well as new basic methods to assess the adhesion of the glue to fibres and textiles. To improve the adhesion properties and the wettability of carbon fibres by the adhesive, oxy-fluorination was used. The modification of carbon fibres by oxy-fluorination was investigated via scanning electron microscopy, X-ray photoelectron spectroscopy and single-fibre tensiometry. Special tensile tests were developed to determine the maximum force required for detachment.

Crack Width Evaluation for Flexural RC Members with Axial Tension

Proof of crack width control is a basic condition for ensuring suitable performance in the serviceability limit state. Cracking in concrete can occur at any time, from casting to years after the concrete has been placed. Most codes struggle to offer a straightforward procedure for crack width calculation, and there is a lack of design charts that allow designers to compute crack width with ease. The focus of this study is to utilize design charts and parametric equations to calculate crack width with minimum error. The paper presents a simplified procedure to calculate crack width for reinforced concrete (RC) sections subjected to bending with axial tensile force, following the guidelines of the Eurocode [DS EN-1992-1-1 & DS EN-1992-1-2]. Numerical examples demonstrate the application of the suggested procedure. Comparison with parallel analytical tools supports the validity of the results and shows the percentage deviation of crack width between the two procedures. The technique is simple, user-friendly and ready to be extended to a greater spectrum of section sizes and materials.
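For reference, the basic crack-width expressions of EN 1992-1-1, w_k = s_r,max(ε_sm − ε_cm), can be coded in a few lines. The sketch below uses the code's recommended coefficients and invented section data; it is not the simplified procedure or the design charts proposed in the paper.

```python
# Minimal sketch of the EN 1992-1-1 crack-width check (Eurocode 2 notation).
# All numerical inputs below are illustrative assumptions.

def crack_width(sigma_s, f_ct_eff, rho_p_eff, alpha_e, E_s,
                c, phi, k1=0.8, k2=0.5, k3=3.4, k4=0.425, k_t=0.4):
    """w_k = s_r,max * (eps_sm - eps_cm).

    k2 = 0.5 corresponds to pure bending; EN 1992-1-1 interpolates k2
    for combined bending and axial tension from the strain distribution.
    """
    eps_diff = (sigma_s - k_t * f_ct_eff / rho_p_eff *
                (1.0 + alpha_e * rho_p_eff)) / E_s
    eps_diff = max(eps_diff, 0.6 * sigma_s / E_s)       # lower bound
    s_r_max = k3 * c + k1 * k2 * k4 * phi / rho_p_eff   # maximum crack spacing
    return s_r_max * eps_diff

# Example: steel stress 280 MPa, f_ct,eff = 2.9 MPa, cover 40 mm,
# 20 mm bars, effective reinforcement ratio 0.05, alpha_e = 6.7.
w_k = crack_width(sigma_s=280.0, f_ct_eff=2.9, rho_p_eff=0.05,
                  alpha_e=6.7, E_s=200_000.0, c=40.0, phi=20.0)
print(f"w_k ≈ {w_k:.3f} mm")
```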

Influence of Transportation Mode to the Deterioration Rate: Case Study of Food Transport by Ship

Food, as perishable goods, represents a specific and sensitive part of supply chain theory, since its changing physical or chemical characteristics considerably influence the approach to stock management. The most delicate phase of this process is transportation, where it becomes difficult to ensure the stable conditions that limit deterioration, since the value of the deterioration rate can easily be influenced by the mode of transportation. The fuzzy definition of variables allows one to take these variations into account. Furthermore, an appropriate choice of the defuzzification method permits one to adapt the results to real conditions as far as possible. In this article, methods that take into account the relationship between the deterioration rate of perishable goods and transportation by ship are applied with the aim of (a) minimizing the total cost function, defined as the sum of the ordering cost, holding cost, disposing cost and transportation cost, and (b) improving supply chain sustainability by reducing environmental impact and waste disposal costs.
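A minimal sketch of the kind of calculation involved is given below: a triangular fuzzy deterioration rate is defuzzified by the centroid method and inserted into a total cost made up of ordering, holding, disposing and transportation terms. All membership-function and cost figures are hypothetical placeholders, not the article's data.

```python
# Sketch: triangular fuzzy deterioration rate defuzzified by the centroid
# method and fed into the total-cost structure described above
# (ordering + holding + disposing + transportation). All figures are invented.

def centroid_triangular(a, b, c):
    """Centroid (centre of gravity) of a triangular fuzzy number (a, b, c)."""
    return (a + b + c) / 3.0

def total_cost(demand, order_qty, order_cost, holding_cost,
               disposal_cost, transport_cost, theta):
    """Cost over the planning horizon, summed per replenishment cycle."""
    cycles = demand / order_qty
    deteriorated = theta * order_qty             # units lost per cycle
    holding = holding_cost * order_qty / 2.0     # average inventory per cycle
    return cycles * (order_cost + holding
                     + disposal_cost * deteriorated + transport_cost)

# Deterioration rate for ship transport as a triangular fuzzy number:
theta_ship = centroid_triangular(0.02, 0.05, 0.10)

print(f"Defuzzified deterioration rate: {theta_ship:.3f}")
print(f"Total cost: {total_cost(10_000, 500, 120.0, 0.8, 2.5, 300.0, theta_ship):,.0f}")
```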

Stable Delta-Sigma Modulator with Signal Dependent Forward Path Gain for Industrial Applications

A higher-order ΔΣ modulator (DSM) is basically an unstable system. Approximate stability conditions cannot be relied upon when designing a DSM for industrial applications where risk is involved. The existing second-order, single-stage, single-bit, unity-feedback-gain, discrete DSM cannot be used for the normalized full range (-1 to +1) of an input signal, since the DSM becomes unstable when the input signal exceeds ±0.55; stability is not guaranteed even for input signals of amplitude less than ±0.55. In the present paper, the above-mentioned second-order DSM is modified with an input-signal-dependent forward path gain. The proposed DSM is suitable for industrial applications where a digital representation of the analog input signal is needed during each sampling period. The proposed DSM can operate over almost the full range of input signals (-0.95 to +0.95) without becoming unstable, assuming that the second integrator output does not exceed the circuit supply voltage of ±15 V.
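To illustrate the loop structure being discussed, the sketch below simulates a discrete second-order, single-bit modulator in which the forward-path gain depends on the input amplitude. The gain law g(u) is an illustrative placeholder, not the one proposed in the paper; if the loop remains stable, the bitstream mean tracks the DC input.

```python
# Sketch of a second-order, single-bit delta-sigma modulator loop with a
# signal-dependent forward-path gain.  The gain law is a placeholder.

def simulate_dsm(u_const, n=20000):
    """Run the loop for a DC input and return (bitstream mean, final x2)."""
    x1 = x2 = 0.0                        # integrator states
    acc = 0.0
    for _ in range(n):
        g = 1.0 / (1.0 + abs(u_const))   # assumed signal-dependent gain law
        v = 1.0 if x2 >= 0.0 else -1.0   # single-bit quantizer
        x1 += g * (u_const - v)          # first integrator (gained forward path)
        x2 += x1 - v                     # second integrator
        acc += v
    return acc / n, x2

mean_bits, x2_final = simulate_dsm(0.4)
print("bitstream mean:", round(mean_bits, 3), "(tracks the DC input if stable)")
print("final second-integrator state:", round(x2_final, 2))
```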

Blind Identification and Equalization of CDMA Signals Using the Levenberg-Marquardt Algorithm

In this paper we describe the Levenberg-Marquardt (LM) algorithm for the identification and equalization of CDMA signals received by an antenna array over communication channels. The approach covers the digital separation and equalization of signals after propagation through a multipath channel generating intersymbol interference (ISI). Exploiting the discrete nature of the transmitted data and three diversities induced at the receiver, the problem can be formulated as the Block Component Decomposition (BCD) of a third-order tensor, a new tensor decomposition generalizing the PARAFAC decomposition. Computing the BCD with the Levenberg-Marquardt method gives encouraging results compared to the classical alternating least squares (ALS) algorithm. In the equalization part, we use the Minimum Mean Square Error (MMSE) criterion to complete the presented method. The simulation results obtained with the LM algorithm are promising.
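The core Levenberg-Marquardt update, (JᵀJ + λI)δ = Jᵀr with adaptive damping, is sketched below on a toy curve-fitting problem. It shows only the generic solver skeleton; the tensor BCD residual and Jacobian of the paper are not reproduced.

```python
# Generic Levenberg-Marquardt skeleton: minimizes ||r(x)||^2 by iterating
# (J^T J + lam*I) delta = J^T r.  The tensor-specific residual is not shown.

import numpy as np

def levenberg_marquardt(residual, jacobian, x0, lam=1e-2, iters=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual(x)
        J = jacobian(x)
        A = J.T @ J + lam * np.eye(x.size)
        delta = np.linalg.solve(A, J.T @ r)
        x_new = x - delta
        if np.linalg.norm(residual(x_new)) < np.linalg.norm(r):
            x, lam = x_new, lam * 0.5     # accept step, trust the model more
        else:
            lam *= 2.0                    # reject step, increase damping
    return x

# Toy usage: fit y = a*exp(b*t) to noisy data.
t = np.linspace(0, 1, 40)
y = 2.0 * np.exp(1.3 * t) + 0.01 * np.random.randn(t.size)
res = lambda p: p[0] * np.exp(p[1] * t) - y
jac = lambda p: np.column_stack([np.exp(p[1] * t), p[0] * t * np.exp(p[1] * t)])
print(levenberg_marquardt(res, jac, [1.0, 1.0]))   # approx. [2.0, 1.3]
```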

Does Material Choice Drive Sustainability of 3D Printing?

Environmental impacts of six 3D printers using various materials were compared to determine whether material choice drives sustainability, or whether other factors such as machine type, machine size, or machine utilization dominate. Cradle-to-grave life-cycle assessments were performed, comparing a commercial-scale FDM machine printing in ABS plastic, a desktop FDM machine printing in ABS, a desktop FDM machine printing in PET and PLA plastics, a polyjet machine printing in its proprietary polymer, an SLA machine printing in its polymer, and an inkjet machine hacked to print in salt and dextrose. All scenarios were scored using the ReCiPe Endpoint H methodology to combine multiple impact categories, comparing environmental impacts per part made for several scenarios per machine. Results showed that most printers' ecological impacts were dominated by electricity use, not materials, and the changes in electricity use due to different plastics were not significant compared to the variation from one machine to another. Variation in machine idle time determined impacts per part most strongly. However, material impacts were quite important for the inkjet printer hacked to print in salt: in its optimal scenario, it had as little as 1/38th the impacts per part of the worst-performing machine in the same scenario. If salt parts were infused with epoxy to make them more physically robust, much of this advantage disappeared, and material impacts dominated or equaled electricity use. Future studies should also measure DMLS and SLS processes and materials.
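The per-part comparison described above boils down to simple arithmetic of the form impact = (electricity·factor + material·factor)/parts, as sketched below. All consumption figures and characterization factors are invented placeholders, not the study's ReCiPe Endpoint H scores; the two scenarios merely illustrate how idle-time electricity can dominate.

```python
# Back-of-the-envelope sketch of the per-part comparison; all numbers invented.

def impact_per_part(kwh, kg_material, parts, ef_elec=0.04, ef_mat=0.30):
    """Single-score impact (arbitrary points) per printed part."""
    return (kwh * ef_elec + kg_material * ef_mat) / parts

# A well-utilized machine vs. one left idling between jobs (same material use):
print("high utilization:", round(impact_per_part(kwh=8.0,  kg_material=0.5, parts=10), 3))
print("mostly idle     :", round(impact_per_part(kwh=30.0, kg_material=0.5, parts=10), 3))
```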

Fiber Bragg Grating Sensor Based Instrumentation to Evaluate Postural Balance and Stability on an Unstable Platform

This paper describes a novel application of Fiber Bragg Grating (FBG) sensors in the assessment of human postural stability and balance on an unstable platform. In this work, an FBG-sensor-based Stability Analyzing Device (FBGSAD) is developed to measure plantar strain and assess the postural stability of subjects on unstable platforms during different stances, under eyes-open and eyes-closed conditions, on a rocker board. The studies are validated by comparison with the Centre of Gravity (CG) variations measured on the lumbar vertebra of the subjects using a commercial accelerometer. The results obtained from the developed FBGSAD show qualitative similarities with the data recorded by the commercial accelerometer. The advantage of the FBGSAD is that it simultaneously measures the plantar strain distribution and the postural stability of the subject, along with its inherent benefits such as requiring no energizing voltage at the sensor, electromagnetic immunity and a simple design, which suit biomechanical applications. The developed FBGSAD can serve as a tool to mitigate space motion sickness, identify individuals who are susceptible to falls, and qualify subjects for balance and stability, which are important factors in the selection of certain unique professionals such as aircraft pilots, astronauts and cosmonauts.
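For context, the standard FBG strain relation Δλ_B/λ_B ≈ (1 − p_e)·ε converts a measured wavelength shift into strain, as sketched below. The wavelength values are illustrative and temperature compensation is ignored; this is not the FBGSAD calibration itself.

```python
# Sketch of the standard FBG strain relation; values are illustrative.

def strain_from_shift(lambda_b_nm: float, delta_lambda_nm: float,
                      p_e: float = 0.22) -> float:
    """Axial strain (dimensionless) from a Bragg wavelength shift.

    p_e is the effective photo-elastic coefficient of the fibre (~0.22 for silica).
    """
    return delta_lambda_nm / (lambda_b_nm * (1.0 - p_e))

# Example: 1550 nm grating shifted by 0.012 nm while the subject sways.
eps = strain_from_shift(1550.0, 0.012)
print(f"plantar strain ≈ {eps * 1e6:.0f} microstrain")
```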

Central Finite Volume Methods Applied in Relativistic Magnetohydrodynamics: Applications in Disks and Jets

We have developed a new computer program in Fortran 90 to obtain numerical solutions of a system of relativistic magnetohydrodynamics partial differential equations with predetermined gravitation (GRMHD), capable of simulating the formation of relativistic jets from the accretion disk of matter up to its ejection. Initially we carried out a study of one-dimensional finite volume methods, namely the Lax-Friedrichs, Lax-Wendroff and Nessyahu-Tadmor methods and Godunov-type methods based on Riemann problems, applied to the Euler equations in order to verify their main features and make comparisons among them. We then implemented the central finite volume method of Nessyahu-Tadmor, a numerical scheme whose formulation is free of Riemann problem solvers and of dimensional splitting, even in two or more spatial dimensions, and applied it to the GRMHD equations. Finally, with the Nessyahu-Tadmor method it was possible to obtain stable numerical solutions - without spurious oscillations or excessive dissipation - of a magnetized accretion disk rotating around a central Schwarzschild black hole (BH) and immersed in a magnetosphere, with the ejection of matter in the form of a jet over a distance of fourteen times the radius of the BH, a record for astrophysical simulations of this kind. In our simulations we were also able to capture jet substructures. A great advantage is that, with our code, we were able to simulate the GRMHD equations on a simple personal computer.
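The flavour of the Riemann-solver-free central schemes examined in the one-dimensional study stage can be conveyed by a first-order Lax-Friedrichs sketch for the inviscid Burgers equation, shown below. It is only a toy illustration of the finite volume approach, not the Fortran 90 GRMHD code.

```python
# First-order Lax-Friedrichs sketch for the inviscid Burgers equation,
# u_t + (u^2/2)_x = 0, with periodic boundaries.  Illustrative only.

import numpy as np

def lax_friedrichs_burgers(u0, dx, t_end, cfl=0.45):
    u = u0.copy()
    t = 0.0
    while t < t_end:
        dt = cfl * dx / max(np.max(np.abs(u)), 1e-12)   # CFL time step
        f = 0.5 * u**2                                   # physical flux
        up, um = np.roll(u, -1), np.roll(u, 1)           # u_{j+1}, u_{j-1}
        fp, fm = np.roll(f, -1), np.roll(f, 1)
        u = 0.5 * (up + um) - dt / (2.0 * dx) * (fp - fm)
        t += dt
    return u

x = np.linspace(0.0, 1.0, 200, endpoint=False)
u_final = lax_friedrichs_burgers(np.sin(2 * np.pi * x), x[1] - x[0], 0.3)
print("min/max after t = 0.3:", u_final.min(), u_final.max())
```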

Estimation of Thermal Conductivity of Nanofluids Using MD-Stochastic Simulation Based Approach

The thermal conductivity of a fluid can be significantly enhanced by dispersing nano-sized particles in it, and the resultant fluid is termed a "nanofluid". A theoretical model for estimating the thermal conductivity of a nanofluid is proposed here. It is based on the mechanism that evenly dispersed nanoparticles within a nanofluid undergo Brownian motion, in the course of which they repeatedly collide with the heat source. During each collision a rapid heat transfer occurs owing to the solid-solid contact. Molecular dynamics (MD) simulation of the collision of nanoparticles with the heat source has shown that there is a pulse-like pick-up of heat by the nanoparticles within 20-100 ps, the extent of which depends not only on the thermal conductivity of the nanoparticles, but also on their elastic and other physical properties. After the collision the nanoparticles undergo Brownian motion in the base fluid and release the excess heat to the surrounding base fluid within 2-10 ms. The Brownian motion and the associated temperature variation of the nanoparticles have been modeled by stochastic analysis. Repeated occurrence of these events by the suspended nanoparticles contributes significantly to the characteristic thermal conductivity of the nanofluid, which has been estimated by the present model for an ethylene glycol based nanofluid containing Cu nanoparticles of size ranging from 8 to 20 nm with a Gaussian size distribution. The predictions of the present model show reasonable agreement with the experimental data available in the literature.
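Two ingredients of such Brownian-motion-based models are easy to sketch: sampling a Gaussian particle-size distribution over 8-20 nm and evaluating the Stokes-Einstein diffusion coefficient in the base fluid. The viscosity and temperature below are textbook-style illustrative values; the MD collision model and the full stochastic analysis are not reproduced.

```python
# Sketch: Gaussian size distribution of Cu nanoparticles (8-20 nm) and the
# Stokes-Einstein Brownian diffusivity in ethylene glycol.  Values illustrative.

import numpy as np

k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # temperature, K
mu_eg = 1.6e-2          # approx. viscosity of ethylene glycol, Pa.s

rng = np.random.default_rng(0)
d = np.clip(rng.normal(loc=14e-9, scale=2e-9, size=10_000), 8e-9, 20e-9)

D = k_B * T / (3.0 * np.pi * mu_eg * d)   # Stokes-Einstein diffusivity, m^2/s
print(f"mean diameter: {d.mean() * 1e9:.1f} nm")
print(f"mean Brownian diffusivity: {D.mean():.2e} m^2/s")
```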

Experimental Investigations of a Modified Taylor-Couette Flow

In this study, the instability problem of a modified Taylor-Couette flow between two vertical coaxial cylinders of radii R1 and R2 is considered. The modification is based on the wavy shape of the inner cylinder surface, and inner cylinders with different surface amplitudes and wavelengths are used. The study aims to reveal the effect of the inner surface geometry on the instabilities that the Taylor-Couette flow undergoes. The study shows that the transition process depends strongly on the amplitude and wavelength of the inner cylinder surface, resulting in flow instabilities that differ markedly from those encountered in the classical Taylor-Couette flow.
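For the classical smooth-cylinder configuration, the onset of Taylor vortices is commonly characterized by a Taylor number of the form Ta = Ω²R1(R2 − R1)³/ν², as sketched below with illustrative geometry and rotation values; the wavy-wall modification studied here is not captured by this simple parameter.

```python
# Sketch: Taylor number for the classical Taylor-Couette configuration.
# Geometry and rotation rate are illustrative assumptions.

import math

def taylor_number(omega, r1, r2, nu):
    """Ta = omega^2 * r1 * (r2 - r1)^3 / nu^2 (one common definition)."""
    d = r2 - r1
    return omega**2 * r1 * d**3 / nu**2

Ta = taylor_number(omega=2.0 * math.pi * 0.5,   # 0.5 rev/s inner cylinder
                   r1=0.05, r2=0.06,            # radii in m
                   nu=1.0e-6)                   # water, m^2/s
print(f"Ta ≈ {Ta:.3e} (classical critical value ≈ 1.7e3)")
```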

An Output Oriented Super-Efficiency Model for Considering Time Lag Effect

There is some time lag between the consumption of inputs and the production of outputs. This time lag effect should be considered when calculating the efficiency of decision making units (DMUs). Recently, a couple of data envelopment analysis (DEA) models were developed to account for the time lag effect in the efficiency evaluation of research activities. However, these models cannot discriminate among efficient DMUs because, in the basic DEA model, efficiency scores are capped at 1. This problem can be resolved by a super-efficiency model; however, a super-efficiency model sometimes suffers from infeasibility problems. This paper suggests an output-oriented super-efficiency model for efficiency evaluation under consideration of the time lag effect. A case example using a long-term research project is given to compare the suggested model with the MpO model.
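A minimal sketch of an output-oriented super-efficiency model under constant returns to scale is given below: the DMU under evaluation is removed from the reference set and the output expansion factor is maximized by linear programming. The data are toy values and the time-lag treatment of inputs and outputs is not reproduced.

```python
# Sketch of an output-oriented (CRS) super-efficiency DEA model as an LP.
# Toy data; the paper's time-lag handling is not included.

import numpy as np
from scipy.optimize import linprog

def output_super_efficiency(X, Y, o):
    """X: inputs (m x n), Y: outputs (s x n), o: index of the evaluated DMU.

    Returns eta, the maximum output expansion factor relative to the other
    DMUs (eta < 1 indicates a super-efficient DMU in this convention).
    """
    m, n = X.shape
    s = Y.shape[0]
    others = [j for j in range(n) if j != o]
    # decision variables: [eta, lambda_j for j != o]
    c = np.concatenate(([-1.0], np.zeros(n - 1)))          # maximize eta
    A_in = np.hstack([np.zeros((m, 1)), X[:, others]])      # peer inputs
    b_in = X[:, o]                                          # <= inputs of DMU o
    A_out = np.hstack([Y[:, [o]], -Y[:, others]])           # eta*y_o - sum <= 0
    b_out = np.zeros(s)
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * n)
    assert res.success
    return res.x[0]

X = np.array([[2.0, 3.0, 4.0, 5.0]])   # one input, four DMUs
Y = np.array([[2.0, 5.0, 6.0, 8.0]])   # one output
for o in range(4):
    print(f"DMU {o}: eta = {output_super_efficiency(X, Y, o):.3f}")
```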

Nanoparticles-Protein Hybrid Based Magnetic Liposome

Liposomes play an important role in medical and pharmaceutical science, e.g. as nanoscale drug carriers. Liposomes are vesicles of varying size, generated in vitro, consisting of a spherical lipid bilayer and an aqueous inner compartment. Magnet-driven liposomes are used for the targeted delivery of drugs to organs and tissues; these liposome preparations contain encapsulated drug components and finely dispersed magnetic particles. Liposomes are attractive in terms of biocompatibility, biodegradability and low toxicity, and their biodistribution can be controlled by changing their size, lipid composition and physical characteristics. Furthermore, liposomes can entrap both hydrophobic and hydrophilic drugs and are able to continuously release the entrapped substrate, making them useful drug carriers. Magnetic liposomes (MLs) are phospholipid vesicles that encapsulate magnetic or paramagnetic nanoparticles; they are applied as contrast agents for magnetic resonance imaging (MRI). The biological synthesis of nanoparticles using plant extracts plays an important role in the field of nanotechnology. A green-synthesized magnetite nanoparticle-protein hybrid has been produced by treating iron(III)/iron(II) chloride with the leaf extract of Datura inoxia. The phytochemicals present in the leaf extract, which include flavonoids, phenolic compounds, cardiac glycosides, proteins and sugars, act as reducing as well as stabilizing agents and prevent agglomeration. The magnetite nanoparticle-protein hybrid has been trapped inside the aqueous core of liposomes prepared by the reversed-phase evaporation (REV) method using oleic and linoleic acid; the resulting vesicles were shown to be driven by a magnetic field, confirming the formation of magnetic liposomes (MLs). Chemical characterization of the stealth magnetic liposomes has been performed by breaking the liposomes and releasing the magnetic nanoparticles. The presence of iron has been confirmed by colour complex formation with KSCN and by a UV-Vis study using an Agilent Cary 60 spectrophotometer. These magnet-driven liposomes based on a nanoparticle-protein hybrid can serve as smart vesicles for targeted drug delivery.

Influence of the Compression Force and Powder Particle Size on Some Physical Properties of Date Fruit (Phoenix dactylifera) Tablets

In recent years, the compression of date (Phoenix dactylifera L.) fruit powders (DP) to obtain date tablets (DT) has been suggested as a promising form of valorization of non-commercial but valuable date fruit (DF) varieties. To further improve and characterize DT, the present study investigates the influence of the DP particle size and the compression force on some physical properties of DT. The results show that, independently of particle size, the hardness (y) of the tablets increases with the compression force (x) following a logarithmic law, y = a ln(bx), where a and b are model constants. Further, a full factorial design (FFD) at two levels, applied to investigate the erosion percentage, reveals that the effects of time and particle size are equal in absolute value and exceed the effect of the compression force. Regarding the disintegration time, the results, also obtained by means of an FFD, show that the effect of the compression force is more than four times that of the DP particle size. Finally, the color parameters of the DT in the CIELab system, measured immediately after production, are influenced differently by the particle size of the initial powder.
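Fitting the reported hardness law y = a ln(bx) to compression data is straightforward with a nonlinear least-squares routine, as sketched below. The data points are invented for illustration only, not measurements from this study.

```python
# Sketch: fitting the hardness law y = a*ln(b*x) with scipy; data invented.

import numpy as np
from scipy.optimize import curve_fit

def hardness(x, a, b):
    return a * np.log(b * x)

force = np.array([5.0, 10.0, 15.0, 20.0, 25.0])   # compression force, kN
hard = np.array([40.0, 68.0, 85.0, 96.0, 105.0])   # tablet hardness, N

(a, b), _ = curve_fit(hardness, force, hard, p0=(30.0, 1.0))
print(f"a = {a:.1f}, b = {b:.2f}")
print("predicted hardness at 18 kN:", round(hardness(18.0, a, b), 1))
```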

Websites for Hypothesis Testing

E-learning has become an efficient and widespread means of education at all levels of human activity, and statistics is no exception. Unfortunately, statistics teaching usually focuses on substituting values into formulas. Suitable websites can simplify and automate the calculations and leave more attention and time for the basic principles of statistics, the mathematization of real-life situations and the subsequent interpretation of results. We introduce our own website for hypothesis testing. Its didactic aspects, the technical possibilities of the individual tools, the experience of its use and its advantages and disadvantages are discussed in this paper. This website is not a substitute for common statistical software, but it should significantly improve the teaching of statistics at universities.
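As an example of the kind of computation such a website automates, the sketch below runs a two-sample Welch t-test so that teaching time can go to formulating the hypotheses and interpreting the p-value rather than substituting into formulas. The data are invented.

```python
# Sketch: a two-sample Welch t-test of the kind such a website automates.

from scipy import stats

group_a = [12.1, 11.8, 12.5, 12.0, 11.9, 12.3]
group_b = [11.2, 11.5, 11.0, 11.7, 11.4, 11.3]

t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)  # Welch
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("reject H0 at alpha = 0.05" if p_value < 0.05 else "fail to reject H0")
```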

High Level Synthesis of Canny Edge Detection Algorithm on Zynq Platform

Real-time image and video processing is required in many computer vision applications, e.g. video surveillance, traffic management and medical imaging. Such video applications demand high computational power, so an effective solution is the collaboration of a CPU with hardware accelerators. In this paper, a Canny edge detection hardware accelerator is proposed. Edge detection is one of the basic building blocks of video and image processing applications and a common block in the pre-processing phase of the image and video processing pipeline. Our approach offloads the Canny edge detection algorithm from the processing system (PS) to the programmable logic (PL), taking advantage of the High Level Synthesis (HLS) tool flow to accelerate the implementation on the Zynq platform. The resulting implementation enables up to a 100x performance improvement through hardware acceleration: CPU utilization drops and the frame rate reaches 60 fps for a 1080p full HD input video stream.
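A software (PS-side) reference for the Canny stage, of the sort an HLS-generated accelerator is typically validated against, fits in a few lines of OpenCV, as sketched below. The file name and hysteresis thresholds are illustrative assumptions.

```python
# Software reference for the Canny stage; file name and thresholds illustrative.

import cv2

frame = cv2.imread("test_frame_1080p.png", cv2.IMREAD_GRAYSCALE)
assert frame is not None, "test image not found"

blurred = cv2.GaussianBlur(frame, (5, 5), 1.4)              # noise reduction
edges = cv2.Canny(blurred, threshold1=50, threshold2=150)   # hysteresis thresholds
cv2.imwrite("edges.png", edges)
```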

A Four Method Framework for Fighting Software Architecture Erosion

Software architecture is the basic structure of software that shapes the development and evolution of a software system, and it is considered a significant tool for the construction of high-quality software systems. A clean design contributes to the control, value and beauty of software, resulting in a longer life, whereas a bad design is the cause of architectural erosion, in which the evolution of the software ultimately fails. This paper discusses how software architecture erosion occurs and presents a set of methods for the detection, declaration and prevention of architecture erosion. The causes and symptoms of architecture erosion are examined with examples of prescriptive and descriptive architectures, and the practices used to stop this erosion are discussed by considering different types of software erosion and their effects. Finally, the most suitable approach for fighting software architecture erosion, and thereby reducing its effect, is identified, evaluated and tested on different scenarios.

Training in Psychology in Brazil – Reflections on the Role of Early Supervised Internships in Undergraduate Courses

This paper presents observations on early supervised internships in Psychology, currently called basic internships in Brazil, and their importance in professional training. The work is an experience report focused on professional training, illustrated by the reality of a Brazilian institution used as a case study. It was developed from the authors' experience as academic supervisors of this kind of practice throughout the undergraduate course, combined with aspects investigated in the post-doctoral research of one of the authors. Theoretical references on the subject and the related national legislation are analyzed, as well as reports of students who experienced at least one semester of this type of practice, articulated with the observations of the authors. The results demonstrate the importance of early supervised internships as a way of giving students a first contact with the professional reality and the practice of psychologists in different fields, preparing them for later experiences that require greater involvement in training and practice activities in Psychology.

An Axiomatic Model for Development of the Allocated Architecture in Systems Engineering Process

The final step in completing the "Analytical Systems Engineering Process" is the "Allocated Architecture", in which all Functional Requirements (FRs) of an engineering system must be allocated to their corresponding Physical Components (PCs). At this step, any design for the system's allocated architecture in which no clear pattern can be found for assigning the exclusive "responsibility" of each PC for fulfilling the allocated FR(s) is considered a poor design, since it may cause difficulties in determining which specific PC(s) failed to satisfy a given FR. The present study uses the principles of the Axiomatic Design method to address this problem mathematically and establishes an "Axiomatic Model" as a solution for reaching good alternatives for developing the allocated architecture. The study also proposes a "loss function" as a quantitative criterion for monetarily comparing non-ideal designs for the allocated architecture and choosing the one that imposes a relatively lower cost on the system's stakeholders. For the case study, we use the existing design of the U.S. electricity marketing subsystem, based on data provided by the U.S. Energy Information Administration (EIA). The result for 2012 shows the symptoms of a poor design and ineffectiveness due to coupling among the FRs of this subsystem.
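In Axiomatic Design terms, the quality of an allocated architecture is often judged from the structure of the FR-PC design matrix (diagonal, triangular or full). The sketch below classifies a toy design matrix in that spirit; it is not the paper's loss function or the EIA case-study data.

```python
# Sketch: classify an FR-PC design matrix as uncoupled, decoupled or coupled,
# in the spirit of the Independence Axiom.  The matrix is a toy example.

import numpy as np

def classify_design(A, tol=1e-9):
    A = np.abs(np.asarray(A, dtype=float)) > tol   # nonzero pattern
    if np.array_equal(A, np.diag(np.diag(A))):
        return "uncoupled"      # each FR satisfied by exactly one PC
    if np.array_equal(A, np.tril(A)) or np.array_equal(A, np.triu(A)):
        return "decoupled"      # triangular: FRs can be satisfied in sequence
    return "coupled"            # nonzeros on both sides of the diagonal

A = [[1, 0, 0],
     [1, 1, 0],
     [0, 1, 1]]
print(classify_design(A))       # -> decoupled
```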