Identifying Knowledge Gaps in Incorporating Toxicity of Particulate Matter Constituents for Developing Regulatory Limits on Particulate Matter

Regulatory bodies have proposed limits on Particulate Matter (PM) concentrations in air; however, these limits do not explicitly incorporate the toxic effects of the constituents of PM. This study aimed to provide a structured approach for incorporating the toxic effects of components when developing regulatory limits on PM. A four-step human health risk assessment framework was used, consisting of: (1) hazard identification (parameters: PM and its constituents and their associated toxic effects on health), (2) exposure assessment (parameters: concentrations of PM and its constituents, information on the size and shape of PM, and the fate and transport of PM and its constituents in the respiratory system), (3) dose-response assessment (parameters: reference dose or target toxicity dose of PM and its constituents), and (4) risk estimation (metric: hazard quotient and/or lifetime incremental risk of cancer, as applicable). The parameters required at each step were then obtained from the literature. Using this information, an attempt was made to determine limits on PM using component-specific information. An example calculation was conducted for exposure to PM2.5 and its metal constituents in the Indian ambient environment to determine limits on PM. The identified data gaps were: (1) concentrations of PM and its constituents and their relationship with sampling regions, and (2) the relationship between the toxicity of PM and its components.
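
As an illustrative sketch of the risk estimation step, the hazard quotient can be computed as the ratio of the average daily dose to the reference dose, and constituent quotients can be summed into a hazard index. The Python sketch below uses a standard inhalation intake form; all constituent concentrations and reference doses are hypothetical placeholders, not values from the study.

    # Hypothetical hazard-quotient sketch for PM2.5 metal constituents.
    # Intake form: dose = C * IR / BW (mg/kg/day); all numbers are placeholders.

    def average_daily_dose(conc_mg_m3, inhalation_m3_day=20.0, body_weight_kg=70.0):
        return conc_mg_m3 * inhalation_m3_day / body_weight_kg

    # placeholder ambient concentrations (mg/m3) and reference doses (mg/kg/day)
    constituents = {"Pb": (1e-6, 3.5e-3), "Cd": (5e-7, 1e-3), "Ni": (2e-6, 2e-2)}

    for metal, (conc, rfd) in constituents.items():
        hq = average_daily_dose(conc) / rfd          # hazard quotient per metal
        print(metal, "HQ =", round(hq, 6))

    # a hazard index for the mixture is the sum of the constituent quotients
    hi = sum(average_daily_dose(c) / rfd for c, rfd in constituents.values())
    print("Hazard index =", round(hi, 6))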

Improved Multi-Objective Firefly Algorithms to Find Optimal Golomb Ruler Sequences for Optimal Golomb Ruler Channel Allocation

Recently, nature-inspired algorithms have found widespread use in tough, time-consuming multi-objective scientific and engineering design optimization problems. In this paper, we present extended forms of the firefly algorithm to find optimal Golomb ruler (OGR) sequences. One of the major applications of OGRs is as an unequally spaced channel-allocation algorithm in optical wavelength division multiplexing (WDM) systems, where they minimize the adverse four-wave mixing (FWM) crosstalk effect. The simulation results show that the proposed optimization algorithms outperform existing conventional computing and nature-inspired optimization algorithms in finding OGRs, in terms of ruler length, total optical channel bandwidth, and computation time.
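
A candidate ruler can be validated and scored cheaply, which is the core of any metaheuristic search for OGRs. The sketch below (an illustration, not the authors' firefly implementation) checks the Golomb property and runs a naive random search for a short 5-mark ruler; a firefly algorithm would instead move dimmer solutions toward brighter (shorter, valid) ones.

    from itertools import combinations
    import random

    def is_golomb(marks):
        # a ruler is Golomb if all pairwise differences between marks are distinct
        diffs = [b - a for a, b in combinations(sorted(marks), 2)]
        return len(diffs) == len(set(diffs))

    def random_search(n_marks=5, upper=20, iters=20000, seed=1):
        random.seed(seed)
        best = None
        for _ in range(iters):
            marks = sorted(random.sample(range(upper + 1), n_marks))
            marks = [m - marks[0] for m in marks]    # normalize first mark to 0
            if is_golomb(marks) and (best is None or marks[-1] < best[-1]):
                best = marks
        return best

    print(random_search())   # e.g. [0, 1, 4, 9, 11], a known optimal 5-mark OGR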

Scheduling Multiple Workflows Using the De-De Dodging Algorithm and the PBD Algorithm in the Cloud: A Detailed Study

Workflow scheduling is an important part of cloud computing; based on different criteria, it determines cost, execution time, and performance. A cloud workflow system is a platform service that facilitates the automation of distributed applications based on the new cloud infrastructure. One aspect that differentiates cloud workflow systems from others is their market-oriented business model, an innovation that challenges conventional workflow scheduling strategies. The Time and Cost optimization algorithm for scheduling Hybrid Clouds (TCHC), which decides which resources should be chartered from public providers, is combined with a new De-De algorithm so that every instance of single and multiple workflows runs without deadlocks. To this end, two new concepts, the De-De Dodging Algorithm and the Priority Based Decisive (PBD) Algorithm, address conventional deadlock avoidance issues in a single algorithm that maximizes the use of active (not just allocated) resources and reduces makespan.
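
As a generic illustration of priority-driven scheduling (not the authors' De-De or PBD algorithms, whose details are given in the paper), the sketch below greedily assigns prioritized tasks to the earliest-available resource; the largest resource finish time is the makespan being reduced.

    import heapq

    def list_schedule(tasks, n_resources):
        """tasks: list of (priority, duration); higher priority is placed first.
        Greedy earliest-available-resource assignment; returns the makespan."""
        ordered = sorted(tasks, key=lambda t: -t[0])   # high priority first
        free_at = [0.0] * n_resources                  # heap of resource free times
        heapq.heapify(free_at)
        for _, duration in ordered:
            start = heapq.heappop(free_at)             # earliest-free resource
            heapq.heappush(free_at, start + duration)
        return max(free_at)

    print(list_schedule([(3, 5.0), (2, 2.0), (5, 4.0), (1, 7.0)], n_resources=2))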

Poly(Lactic Acid) Based Flexible Films

Poly(lactic acid) (PLA) is a biodegradable polymer with good mechanical properties; however, its brittleness limits its use, especially in packaging materials. Therefore, in this work, PLA-based polyurethane films were prepared using two types of isocyanates: methylene diisocyanate (MDI) and hexamethylene diisocyanate (HDI). For this purpose, the PLA-based polyurethane must have good strength and flexibility; therefore, polycaprolactone, which has better flexibility, was combined with PLA. An effective way to endow poly(lactic acid) with toughness is a chain-extension reaction of the poly(lactic acid) pre-polymer with polycaprolactone as the chain extender. Polyurethane prepared from MDI showed brittle behaviour, while polyurethane prepared from HDI showed flexibility at the same concentrations.

Decision Tree Modeling in Emergency Logistics Planning

Despite the availability of natural disaster related time series data for the last 110 years, there is no forecasting tool available to humanitarian relief organizations for determining forecasts for emergency logistics planning. This study develops a forecasting tool based on identifying the probability of disaster for each country in the world using decision tree modeling. Further, the determination of aggregate forecasts leads to efficient pre-disaster planning. Based on the research findings, relief agencies can optimize the allocation of various resources in emergency logistics planning.
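
A minimal sketch of the modeling step, assuming scikit-learn and a toy two-feature design (region code, disasters in the prior decade); the rows and labels below are fabricated purely to show the shape of the approach, not actual findings.

    # hypothetical sketch: estimate the probability of a disaster year
    from sklearn.tree import DecisionTreeClassifier

    # toy rows: [region_code, disasters_in_prior_decade]; label 1 = disaster year
    X = [[0, 12], [0, 3], [1, 25], [1, 18], [2, 1], [2, 0]]
    y = [1, 0, 1, 1, 0, 0]

    tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

    # the predicted disaster probability feeds the aggregate forecast
    print(tree.predict_proba([[1, 20]]))   # [[P(no disaster), P(disaster)]]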

Video Quality Assessment Methods: A Bird’s-Eye View

The proliferation of multimedia technology and services in today's world provides ample research scope in the frontiers of visual signal processing. Widespread use of video-based applications in heterogeneous environments calls for viable methods of Video Quality Assessment (VQA). The evaluation of video quality not only depends on high QoS requirements but also emphasizes the need for the newer notion of Quality of Experience (QoE), which treats video quality as user-centric. This paper discusses the two principal video quality assessment approaches, namely subjective and objective assessment methods. The evolution of various video quality metrics, their classification models, and their applications are reviewed in this work. Mean Opinion Score (MOS) based subjective measurements and algorithm-based objective metrics are discussed, and their challenges are outlined. Further, this paper explores the recent progress of VQA in emerging technologies such as mobile video and 3D video.
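
As a concrete instance of an algorithm-based objective metric of the kind surveyed here, the sketch below computes PSNR between a reference and a distorted frame with NumPy; it is purely illustrative, since the paper covers many metrics and their classification.

    import numpy as np

    def psnr(reference, distorted, peak=255.0):
        """Peak signal-to-noise ratio in dB between two 8-bit frames."""
        err = reference.astype(np.float64) - distorted.astype(np.float64)
        mse = np.mean(err ** 2)
        return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

    ref = np.random.randint(0, 256, (144, 176), dtype=np.uint8)   # toy QCIF frame
    noisy = np.clip(ref + np.random.normal(0, 5, ref.shape), 0, 255).astype(np.uint8)
    print(round(psnr(ref, noisy), 2))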

Simulation of Static Frequency Converter for Synchronous Machine Operation and Investigation of Shaft Voltage

This study is carried out to understand the effects of a static frequency converter (SFC) on a large machine. An SFC features four-quadrant operation, by virtue of which it can run a synchronous machine either as a motor or as an alternator. This dual-mode operation allows a single machine to start and run as a motor and then be converted to an alternator whenever required. One such dual-purpose machine is taken here for study; it is installed at a laboratory that carries out short-circuit tests on high-power electrical equipment. The SFC connected to this machine is described in detail in this paper. The same SFC has been modeled in MATLAB/Simulink, with the actual parameters of the SFC and the synchronous machine applied to the virtual model. After running the model, the simulated machine voltage and current waveforms are validated against real measurements. These waveforms are processed with the Fast Fourier Transform (FFT), which reveals that they are not sinusoidal but contain a number of harmonics. These harmonics are the major cause of shaft voltage, and it is known that the bearings of an electrical machine are vulnerable to the current that shaft voltage drives through them. A general discussion of the causes of shaft voltage in the context of this machine is also presented.
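
The harmonic-analysis step can be reproduced in miniature with NumPy's FFT on a synthetic waveform; the fundamental frequency and harmonic mix below are placeholders, not the measured SFC waveforms.

    import numpy as np

    fs, f0 = 10_000.0, 50.0                  # sampling rate and fundamental (Hz)
    t = np.arange(0, 0.2, 1 / fs)
    # synthetic non-sinusoidal current: fundamental plus 5th and 7th harmonics
    i = (np.sin(2 * np.pi * f0 * t)
         + 0.20 * np.sin(2 * np.pi * 5 * f0 * t)
         + 0.14 * np.sin(2 * np.pi * 7 * f0 * t))

    amplitude = np.abs(np.fft.rfft(i)) / (len(i) / 2)   # single-sided spectrum
    freqs = np.fft.rfftfreq(len(i), 1 / fs)
    for k in (1, 5, 7):                                 # read off harmonic bins
        idx = np.argmin(np.abs(freqs - k * f0))
        print(f"harmonic {k}: {amplitude[idx]:.3f}")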

Disaster Preparedness and Management in Saudi Arabia: An Empirical Investigation

Disaster preparedness is a key success factor for any effective disaster management practice. This paper evaluates disaster preparedness and management in Saudi Arabia using an empirical investigation approach. It presents the results of a survey conducted by interviewing Saudi decision-makers and administrators responsible for disaster control in Jeddah before, during, and after the floods of 2009 and 2010. First, the demographics of the respondents are presented, followed by a quantitative analysis of their views and experiences regarding the Kingdom's readiness before and after each flood, expressed as a series of dependent and independent variables. This is followed by a list of the respondents' priorities for disaster preparation in the Kingdom.

Study of Shaft Voltage on Short Circuit Alternator with Static Frequency Converter

Electric machines are nowadays driven by static systems popularly known as soft starters. This paper describes a thyristor-based static frequency converter (SFC) used to run a large synchronous machine installed at a short-circuit test laboratory. Normally a synchronous machine requires a prime mover or some other driving mechanism; this machine does not, as it operates in dual mode. The SFC first starts the machine as a motor to reach full speed; thereafter, whenever required, it can be converted to generator mode. This paper begins with the various starting methodologies for synchronous machines. The SFC and its different operational modes are then analyzed in detail. Shaft voltage is a very common phenomenon in machines with static drives, and the various causes of shaft voltage in the context of this machine are the main focus of this paper.

Facility Location Problem in Emergency Logistics

Facility location is one of the important problems affecting relief operations. The location model in this paper is motivated by arranging the flow of relief materials from the main warehouse to continental warehouses, onward to regional warehouses, and from these to the disaster area. This flow keeps the relief organization ready to deal with a disaster situation in the shortest possible time. The main purpose of this paper is to merge the concepts of just-in-time and the campaign system in the emergency supply chain, so that when a disaster happens the affected country can request help from the nearest regional warehouse, which will supply the relief materials and the required supplies to support and assist the victims in the disaster area. Furthermore, the regional warehouse places an order with the continental warehouse to replenish the materials distributed to the disaster area. In this way, the warehouses are always ready to respond to any type of disaster.
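
A minimal sketch of the dispatch rule described above, assuming great-circle distance as the proximity measure; the warehouse locations are invented for illustration.

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(a, b):
        """Great-circle distance between two (lat, lon) points in km."""
        lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
        h = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371.0 * asin(sqrt(h))

    # hypothetical regional warehouses: name -> (lat, lon)
    warehouses = {"RW-A": (24.7, 46.7), "RW-B": (1.3, 103.8), "RW-C": (51.5, -0.1)}

    def nearest_warehouse(disaster_site):
        return min(warehouses, key=lambda w: haversine_km(warehouses[w], disaster_site))

    print(nearest_warehouse((35.7, 139.7)))   # a site near Tokyo -> nearest depot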

Application of Soft Systems Methodology in Solving Disaster Emergency Logistics Problems

In recent years, many high-intensity earthquakes have occurred around the world, such as the 2011 earthquake in Tohoku, Japan. These large-scale disasters caused huge casualties and losses. In addition, inefficient disaster response operations caused a second wave of casualties and losses and expanded the damage. Effective disaster management can be used to respond to the chaotic situation and reduce the damage; however, some inefficient disaster response operations are still in use. Therefore, this case study examines the 921 earthquake to analyze disaster emergency logistics problems and proposes Soft Systems Methodology (SSM) to solve them. Moreover, it analyses the effect of human factors on system operation and suggests a solution to improve the system.

Improving Order Quantity Model with Emergency Safety Stock (ESS)

This study considers the problem of calculating safety stocks in inventory systems that face demand uncertainty in disaster situations. Safety stocks are essential for enabling the supply chain, which is driven by forecasts of customer needs, to respond to demand uncertainty and reach predefined target service levels. To address the uncertainties that disaster situations impose on the industrial sector, the concept of Emergency Safety Stock (ESS) is proposed. While there exists a huge body of literature on determining safety stock levels, it does not address the problems arising from disasters and how to deal with such situations. In this paper, the Order Quantity Model is improved to deal with demand uncertainty due to disasters by incorporating ESS, which is based on the probability of disaster occurrence and uses a probability matrix calculated from historical data.
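
A hedged sketch of how ESS might augment a classical safety stock: the scaling by disaster probability below is one plausible reading of the abstract, not the paper's exact model, and every number is a placeholder.

    from math import sqrt

    def safety_stock(z, sigma_daily, lead_time_days):
        """Classical safety stock for normally distributed daily demand."""
        return z * sigma_daily * sqrt(lead_time_days)

    def emergency_safety_stock(ss, p_disaster, demand_surge_factor):
        """Assumed ESS form: extra stock proportional to the probability of a
        disaster (e.g. from a historical probability matrix) and the expected
        surge in demand it causes."""
        return ss * p_disaster * demand_surge_factor

    ss = safety_stock(z=1.65, sigma_daily=40.0, lead_time_days=9)   # ~95% service
    ess = emergency_safety_stock(ss, p_disaster=0.12, demand_surge_factor=3.0)
    print(round(ss, 1), round(ess, 1))        # hold base stock plus ESS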

In Silico Analysis of Quinoxaline Ligand Conformations on 1ZIP: Adenylate Kinase

Adenylate kinase (AK) catalyses a phosphotransferase reaction that plays an important role in cellular energy homeostasis. Inhibitors of bacterial AK are useful in the treatment of several bacterial infections. To identify novel inhibitors of AK, docking studies were performed using the 3D structure of Bacillus stearothermophilus adenylate kinase from the Protein Data Bank (1ZIP). Forty-six quinoxaline analogues were docked into 1ZIP, and the most strongly interacting compounds were selected for further study on the basis of their binding energies.

Removal of Pb (II) from Aqueous Solutions using Fuller's Earth

Fuller's earth is a fine-grained, naturally occurring substance with a substantial ability to adsorb impurities. In the present study, Fuller's earth was characterized and used for the removal of Pb(II) from aqueous solution. The effects of various physicochemical parameters such as pH, adsorbent dosage, and shaking time on adsorption were studied. The equilibrium studies showed that the solution pH was the key factor affecting adsorption, with an optimum pH of 5. Kinetic data for the adsorption of Pb(II) were best described by a pseudo-second-order model. The effective diffusion coefficient for Pb(II) adsorption was of the order of 10⁻⁸ m²/s. The adsorption data were well described by the Langmuir adsorption isotherm, with a maximum metal uptake of 103.3 mg/g of adsorbent. Mass transfer analysis was also carried out for the adsorption process; the mass transfer coefficients obtained indicate that transport of the adsorbate from the bulk to the solid phase was quite fast. The mean sorption energy calculated from the Dubinin-Radushkevich isotherm indicated that the metal adsorption process was chemical in nature.
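
For reference, the Dubinin-Radushkevich mean sorption energy used to classify the mechanism is E = 1/sqrt(2*beta), and the Langmuir uptake is q = q_max*K_L*Ce/(1 + K_L*Ce). The sketch below evaluates both; q_max = 103.3 mg/g comes from the abstract, while beta, K_L, and Ce are placeholder values.

    from math import sqrt

    def dr_mean_energy_kj_mol(beta_mol2_kj2):
        """Dubinin-Radushkevich mean sorption energy E = 1 / sqrt(2 * beta);
        roughly 8-16 kJ/mol is conventionally read as chemisorption."""
        return 1.0 / sqrt(2.0 * beta_mol2_kj2)

    def langmuir_uptake(q_max, k_l, c_eq):
        """Langmuir isotherm: q = q_max * K_L * Ce / (1 + K_L * Ce)."""
        return q_max * k_l * c_eq / (1.0 + k_l * c_eq)

    print(round(dr_mean_energy_kj_mol(0.004), 2), "kJ/mol")   # placeholder beta
    print(round(langmuir_uptake(q_max=103.3, k_l=0.05, c_eq=50.0), 1), "mg/g")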

Developing Forecasting Tool for Humanitarian Relief Organizations in Emergency Logistics Planning

Despite the availability of natural disaster related time series data for the last 110 years, there is no forecasting tool available to humanitarian relief organizations for determining forecasts for emergency logistics planning. This study develops a forecasting tool based on identifying suitable probability distributions; the estimated distribution parameters are used to calculate natural disaster forecasts. Further, the determination of aggregate forecasts leads to efficient pre-disaster planning. Based on the research findings, relief agencies can optimize the allocation of various resources in emergency logistics planning.
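
As a minimal sketch of the distribution-identification step, assume annual disaster counts follow a Poisson distribution, whose maximum-likelihood rate is simply the sample mean; the counts are fabricated, and in practice a goodness-of-fit test would select among candidate distributions.

    # toy annual disaster counts for one country (fabricated)
    counts = [3, 1, 4, 2, 2, 5, 3, 2, 4, 3]

    # maximum-likelihood estimate of the Poisson rate is the sample mean
    lam = sum(counts) / len(counts)

    print("expected disasters next year:", lam)
    print("aggregate forecast over 5 years:", 5 * lam)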

Optimization of Process Parameters of Pressure Die Casting using Taguchi Methodology

The present work analyses different parameters of pressure die casting to minimize casting defects. Pressure die casting is usually applied to the casting of aluminium alloys. Good surface finish with the required tolerances and dimensional accuracy can be achieved by optimizing controllable process parameters such as solidification time, molten metal temperature, filling time, injection pressure, and plunger velocity. Moreover, selecting optimum process parameters also minimizes pressure die casting defects such as porosity, insufficient spread of molten material, and flash. Therefore, a pressure die cast component, a carburetor housing of aluminium alloy (Al2Si2O5), has been considered. The effects of the selected process parameters on casting defects, and the subsequent setting of parameter levels, have been determined using Taguchi's parameter design approach. The experiments were performed for the combinations of levels of the different process parameters suggested by an L18 orthogonal array. Analyses of variance were performed on the mean and the signal-to-noise ratio to estimate the percent contribution of each process parameter. A confidence interval was also estimated at the 95% confidence level, and three confirmation experiments were performed to validate the optimum levels of the different parameters. Overall, a 2.352% reduction in defects was observed with the suggested optimum process parameters.
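
For a smaller-the-better response such as defect percentage, the Taguchi signal-to-noise ratio is S/N = -10*log10((1/n)*sum(y_i^2)); the sketch below computes it for placeholder replicate measurements of a single L18 trial.

    import math

    def sn_smaller_the_better(ys):
        """Taguchi S/N ratio for a smaller-the-better response:
        S/N = -10 * log10((1/n) * sum(y_i ** 2))."""
        return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))

    trial = [2.9, 3.1, 3.4]       # placeholder defect percentages (replicates)
    print(round(sn_smaller_the_better(trial), 3), "dB")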

Noise Analysis of Single-Ended Input Differential Amplifier using Stochastic Differential Equation

In this paper, we analyze the effect of noise in a single-ended input differential amplifier working at high frequencies. Both extrinsic and intrinsic noise are analyzed using a time-domain method employing techniques from stochastic calculus. Stochastic differential equations are used to obtain autocorrelation functions of the output noise voltage and other solution statistics such as mean and variance. The analysis leads to important design implications and suggests changes in the device parameters for improved noise characteristics of the differential amplifier.
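
To make the approach concrete, the sketch below integrates a first-order (RC-type) output node driven by white noise with the Euler-Maruyama scheme and compares the simulated output noise variance with the closed-form stationary value; the circuit values are placeholders and the model is far simpler than the full amplifier analysis.

    import numpy as np

    # dV = -(V / (R*C)) dt + (sigma / C) dW : white-noise current into an RC node
    R, C, sigma = 10e3, 1e-12, 1e-12      # placeholder ohms, farads, A/sqrt(Hz)
    dt, n_steps = 1e-11, 100_000
    rng = np.random.default_rng(0)

    v = np.empty(n_steps)
    v[0] = 0.0
    for k in range(1, n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))              # Wiener increment
        v[k] = v[k - 1] - (v[k - 1] / (R * C)) * dt + (sigma / C) * dw

    # stationary variance of this model is sigma^2 * R / (2 * C)
    print(np.var(v[n_steps // 2:]), sigma ** 2 * R / (2 * C))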

Performance Analysis of HSDPA Systems using Low-Density Parity-Check (LDPC) Coding as Compared to Turbo Coding

HSDPA is a feature introduced in the Release 5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. The HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay compared to the Release 99 DSCH. To date, HSDPA systems have used turbo coding, one of the best coding techniques for approaching the Shannon limit. However, the main drawbacks of turbo coding are high decoding complexity and high latency, which make it unsuitable for applications such as satellite communications, where the transmission distance itself introduces latency due to the finite speed of light. Hence, this paper proposes using LDPC coding in place of turbo coding for HSDPA systems, which decreases the latency and decoding complexity, although LDPC coding increases the encoding complexity. While transmitter complexity increases at the NodeB, the end user gains in terms of receiver complexity and bit error rate (BER). In this paper, the LDPC encoder is implemented using a sparse parity-check matrix H to generate codewords, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that with LDPC coding the BER drops sharply as the number of iterations increases, for a small increase in Eb/No, which is not possible in turbo coding. The same BER was also achieved using fewer iterations, so the latency and receiver complexity decreased for LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better-quality, more reliable, and more robust data services: even though realistic data rates are only a few Mbps, the quality actually achieved and the number of users supported improve significantly.
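
As a toy illustration of the encoder-side machinery, the sketch below uses a small sparse parity-check matrix H over GF(2) to verify codewords via the syndrome; this (7,4) matrix is invented for the example, and a real HSDPA-scale code plus the belief propagation decoder would be far larger.

    import numpy as np

    # toy (7,4) parity-check matrix H in [A | I] form (invented, not 3GPP's code)
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]], dtype=np.uint8)

    def syndrome(word):
        """All-zero syndrome (H @ c mod 2) means every parity check passes."""
        return H.dot(word) % 2

    c = np.array([1, 0, 1, 0, 1, 0, 1], dtype=np.uint8)   # a valid codeword
    print(syndrome(c))                     # [0 0 0] -> accepted

    corrupted = c.copy()
    corrupted[2] ^= 1                      # flip one bit in the channel
    print(syndrome(corrupted))             # nonzero syndrome flags the error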

Minimization of Non-Productive Time during 2.5D Milling

In modern manufacturing systems, thermal cutting techniques using oxyfuel, plasma, and laser have become indispensable for the shape forming of high-quality complex components; however, conventional chip-removal production techniques still hold their widespread place in the manufacturing industry. Both types of machining operation require positioning the end-effector tool at the edge where the cutting process commences. This repositioning of the cutting tool is repeated several times in every machining operation and is termed non-productive time or airtime motion. Minimizing this non-productive machining time plays an important role in mass production with high-speed machining. Since the tool moves from one region to another by rapid traverse and visits a particular region only once in the whole operation, the non-productive time can be minimized by sequencing the tool movements appropriately. In this work, the problem is formulated as a general travelling salesman problem (TSP), and a genetic algorithm (GA) approach is applied to solve it. To improve the efficiency of the algorithm, the GA has been hybridized with a novel special heuristic and with simulated annealing (SA); the novel heuristic combined with the GA sequences the toolpath movements during repositioning of the tool. A comparative analysis of the new metaheuristic techniques against a simple genetic algorithm has been performed. The proposed metaheuristic approach outperforms the simple genetic algorithm in minimizing non-productive toolpath length, and the results obtained with the hybrid simulated annealing genetic algorithm (HSAGA) are also better than those obtained using the simple genetic algorithm alone.
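
A compact sketch of the TSP formulation with a GA flavor: permutation individuals, closed-tour length as the fitness, truncation selection, and swap mutation. Crossover and the SA hybridization are omitted for brevity, and the airtime stop coordinates are invented.

    import random
    from math import hypot

    points = [(0, 0), (8, 1), (3, 7), (9, 6), (1, 4), (6, 3)]   # toy tool stops

    def tour_length(order):
        """Total rapid-traverse length of the closed tour visiting all stops."""
        return sum(hypot(points[a][0] - points[b][0], points[a][1] - points[b][1])
                   for a, b in zip(order, order[1:] + order[:1]))

    def evolve(pop_size=40, generations=300, seed=0):
        random.seed(seed)
        pop = [random.sample(range(len(points)), len(points)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=tour_length)
            survivors = pop[:pop_size // 2]                   # truncation selection
            children = []
            for parent in survivors:
                child = parent[:]
                i, j = random.sample(range(len(points)), 2)   # swap mutation
                child[i], child[j] = child[j], child[i]
                children.append(child)
            pop = survivors + children
        return min(pop, key=tour_length)

    best = evolve()
    print(best, round(tour_length(best), 2))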