Numerical Simulation of an Air-Curtain-Installed Subway Tunnel for Indoor Air Quality

Platform Screen Doors (PSDs) improve Indoor Air Quality (IAQ) in subway stations; however, air quality is degraded in the subway tunnel, where CO2 concentrations and particulate matter levels are high. The IAQ in the tunnel degrades further as train movements increase. An air curtain reduces dust, particles, and the spread of toxic smoke by generating a virtual wall, while still permitting traffic. The ventilation systems of subway tunnels therefore need improvement to achieve better air quality. Numerical analysis is an effective tool for analyzing the flow field inside an air-curtain-installed subway tunnel. The ANSYS CFX software is used for steady computations of the airflow inside the tunnel. The single-track subway tunnel has a natural shaft, a mechanical shaft, and stations with PSDs installed. The height and width of the tunnel are 6.0 m and 4.0 m, respectively. The tunnel is 400 m long, and the air curtain is installed at the top of the tunnel. The thickness and width of the air curtain are 0.08 m and 4 m, respectively, and its velocity varies between 20 and 30 m/s. Three cases are analyzed, depending on the installation location of the air curtain. When the air curtain is installed between the mechanical and natural shafts, the air discharged through the natural shaft increases as the air-curtain velocity increases; the polluted air is exhausted through the mechanical and natural shafts, and the remaining air is pushed toward the tunnel end. The air discharged through the natural shaft is low when the air curtain is installed upstream of the natural shaft. The mass flow rate in the tunnel downstream of the mechanical shaft decreases as the air-curtain velocity increases. The computational results for the air-curtain-installed tunnel form the basis of an optimum design study. The air-curtain installation location is chosen between the mechanical and natural shafts, and the air-curtain velocity is fixed at 25 m/s. The thickness and blowing angle of the air curtain are the design variables, and the objective function of the design optimization is to maximize the air discharged through the natural shaft.

Towards the Use of Renewable Energy Sources in the Home

The paper presents the results of the European EIE project "Realising the potential for small scale renewable energy sources in the home – Kyotointhehome". The project's overall aim is to inform and educate teachers, students, and their families so that they can recognise the need for, and assess the potential of, energy efficiency (EE) measures and renewable energy sources (RES) in their homes. The project resources were translated and trialled by 16 partners in 10 European countries. A web-based methodology enabling families to assess how RES can be incorporated into energy-efficient homes was developed. The web application "KYOTOINHOME" helps citizens identify what they can do to help their community meet the Kyoto target for greenhouse gas reductions and prevent global warming. This application provides useful information on how citizens can use renewable energy sources in their homes to provide space heating and cooling, hot water, and electricity. A methodology for assessing heat loss in a dwelling and the application of a heat pump system was elaborated and will be implemented this year. For schools, we developed a set of practical activities concerned with preventing climate change through the use of renewable energy sources. Complementary resources will also be developed in the Romanian research project "Romania Contribution to the European Targets Regarding the Development of Renewable Energy Sources" (PROMES).
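
As a rough illustration of the kind of dwelling heat-loss assessment such a methodology involves, the sketch below applies the standard steady-state fabric loss formula Q = U·A·ΔT; the U-values and areas are assumed placeholders, not the project's figures.

```python
# A minimal sketch of steady-state fabric heat loss, Q = U * A * deltaT.
# The U-values and areas below are illustrative assumptions.
U_VALUES = {"wall": 1.6, "window": 2.8, "roof": 1.0}    # W/m^2.K (assumed)
AREAS    = {"wall": 85.0, "window": 12.0, "roof": 50.0}  # m^2 (assumed)

def fabric_heat_loss(delta_t):
    """Total fabric heat loss in watts for a given indoor-outdoor gap."""
    return sum(U_VALUES[k] * AREAS[k] * delta_t for k in U_VALUES)

print(fabric_heat_loss(delta_t=20.0))  # -> (136 + 33.6 + 50) * 20 = 4392 W
```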

A Survey on Metrics of Software Cognitive Complexity for OO Design

In the modern era, the biggest challenge facing the software industry is the arrival of new technologies. Software engineers are therefore gearing up to meet and manage change in large software systems, and they find it difficult to deal with software cognitive complexity. In the last few years, many metrics have been proposed to measure the cognitive complexity of software. This paper aims at a comprehensive survey of metrics of software cognitive complexity. Some classic and efficient software cognitive complexity metrics, such as Class Complexity (CC), Weighted Class Complexity (WCC), Extended Weighted Class Complexity (EWCC), Class Complexity due to Inheritance (CCI), and Average Complexity of a program due to Inheritance (ACI), are discussed and analyzed. The comparison and the relationships of these software complexity metrics are also presented.
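
To make the flavour of such metrics concrete, here is a minimal Python sketch of a weighted-class-complexity style calculation. The weights and the attribute-plus-method form below are assumptions in the spirit of cognitive-weight metrics, not the surveyed authors' exact definitions.

```python
# Illustrative sketch (assumed form, not the surveyed papers' exact formulas):
# WCC is taken here as the attribute count plus the sum of cognitive weights
# of the control structures in each method.

# Assumed cognitive weights in the spirit of basic control structures.
COGNITIVE_WEIGHTS = {"sequence": 1, "branch": 2, "iteration": 3, "call": 2}

def method_complexity(structures):
    """Cognitive complexity of one method from its control-structure list."""
    return sum(COGNITIVE_WEIGHTS[s] for s in structures)

def weighted_class_complexity(num_attributes, methods):
    """WCC = attributes + total cognitive weight of all methods (assumed)."""
    return num_attributes + sum(method_complexity(m) for m in methods)

# Example: a class with 3 attributes and two methods.
methods = [
    ["sequence", "branch", "iteration"],  # e.g. validate-and-loop logic
    ["sequence", "call"],                 # e.g. a simple delegating method
]
print(weighted_class_complexity(3, methods))  # -> 3 + 6 + 3 = 12
```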

Study of the Effect of Dataset Size on the Precision of Estimated Saturated Hydraulic Conductivity

Saturated hydraulic conductivity of soil is an important property in processes involving water and solute flow in soils. It is difficult to measure and can be highly variable, requiring a large number of replicate samples. In this study, 60 sets of soil samples were collected in the Saqhez region of Kurdistan province, Iran. Statistics such as the correlation coefficient (R), root mean square error (RMSE), mean bias error (MBE), and mean absolute error (MAE) were used to evaluate how multiple linear regression models varied with the number of samples in the dataset. The multiple linear regression models were evaluated when only the percentages of sand, silt, and clay content (SSC) were used as inputs, and when SSC and bulk density, Bd, (SSC+Bd) were used as inputs. For the 50-sample dataset, the R, RMSE, MBE, and MAE values of the relationships obtained from multiple linear regression were 0.925, 15.29, -1.03, and 12.51 for the SSC method and 0.927, 15.28, -1.11, and 12.92 for the SSC+Bd method, respectively. For the 10-sample dataset, they were 0.725, 19.62, -9.87, and 18.91 for the SSC method and 0.618, 24.69, -17.37, and 22.16 for the SSC+Bd method, respectively, which shows that as the number of samples in the dataset increases, the precision of the estimated saturated hydraulic conductivity increases.
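
The four evaluation statistics have standard definitions, which the following short sketch computes for any pair of observed and model-predicted conductivity arrays (the formulas are the usual ones; the abstract does not restate them).

```python
import numpy as np

def evaluation_stats(observed, predicted):
    """Compute R, RMSE, MBE and MAE with their standard definitions."""
    obs, pred = np.asarray(observed, float), np.asarray(predicted, float)
    err = pred - obs
    r = np.corrcoef(obs, pred)[0, 1]      # correlation coefficient R
    rmse = np.sqrt(np.mean(err ** 2))     # root mean square error
    mbe = np.mean(err)                    # mean bias error
    mae = np.mean(np.abs(err))            # mean absolute error
    return r, rmse, mbe, mae
```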

Computational Identification of MicroRNAs and Their Targets in Two Species of Evergreen Spruce Tree (Picea)

MicroRNAs (miRNAs) are small, non-coding, regulatory RNAs about 20 to 24 nucleotides long. Their conserved nature across organisms makes them a good source for the discovery of new miRNAs by a comparative genomics approach. The study identified 21 miRNAs from 20 pre-miRNAs belonging to 16 families (miR156, 157, 158, 164, 165, 168, 169, 172, 319, 390, 393, 394, 395, 400, 472 and 861) in the evergreen spruce tree (Picea). The miRNA families miR157, 158, 164, 165, 168, 169, 319, 390, 393, 394, 400, 472 and 861 are reported for the first time in Picea. All 20 miRNA precursors form stable minimum-free-energy stem-loop structures, as their orthologues do in Arabidopsis, and the mature miRNAs reside in the stem portion of the stem-loop structure. Sixteen (16) miRNAs are from Picea glauca and five (5) belong to Picea sitchensis. Their targets consist of transcription factors and growth-related, stress-related, and hypothetical proteins.
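
A minimal sketch of the stem-loop stability check is shown below, assuming the ViennaRNA package's Python bindings are available; the abstract does not name the folding tool used, and the candidate sequence and energy threshold here are illustrative only.

```python
# Sketch of a pre-miRNA stability check, assuming the ViennaRNA Python
# bindings (RNA.fold). The sequence and threshold are illustrative.
import RNA

def is_stable_stem_loop(precursor, mfe_threshold=-20.0):
    """Fold a candidate pre-miRNA and flag it if its minimum free
    energy (kcal/mol) falls below an assumed stability threshold."""
    structure, mfe = RNA.fold(precursor)
    return mfe <= mfe_threshold, structure, mfe

# Placeholder hairpin-like candidate (second half complements the first).
candidate = "UGACAGAAGAGAGUGAGCACACAAAGGCAAUUGUGCUCACUCUCUUCUGUCA"
stable, structure, mfe = is_stable_stem_loop(candidate)
print(stable, mfe, structure)
```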

Spent Caustic Bioregeneration Using Thiobacillus denitrificans Bacteria

Spent sulfidic caustic was biologically treated and regenerated for reuse using Thiobacillus denitrificans bacteria; the sulfide content was oxidized and the RSNa content was reduced dramatically. The pH in this test was 11.8 and no neutralization was performed on the spent caustic. Thus spent caustic, one of the most difficult industrial wastes to dispose of, could be regenerated and reused instead of being discharged to the sea or to deep wells.

Principal Component Analysis for the Characterization in the Application of Some Soil Properties

The objective of this research is to study principal component analysis (PCA) for the classification of 67 soil samples collected from different agricultural areas in the western part of Thailand. Six soil properties were measured on the soil samples and used as the original variables. Principal component analysis is applied to reduce the number of original variables. A model based on the first two principal components accounts for 72.24% of the total variance. Score plots of the first two principal components were mapped against agricultural areas classified as horticulture, field crops, and wetland. The results showed some relationships between soil properties and agricultural areas. PCA was shown to be a useful tool for classifying agricultural areas based on soil properties.
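
A minimal sketch of this analysis with scikit-learn is shown below; the 67x6 feature matrix is a random placeholder for the measured soil properties, and standardization before PCA is an assumed preprocessing step.

```python
# Sketch of PCA on soil-property data; the data matrix is a placeholder.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = np.random.rand(67, 6)          # 67 soil samples x 6 measured properties

X_std = StandardScaler().fit_transform(X)   # standardize before PCA (assumed)
pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)           # PC1/PC2 scores for score plots

# The paper reports the first two components explaining 72.24% of variance.
print(pca.explained_variance_ratio_.sum())
```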

Repair of Concrete Structures with SCC

The objective of this work is to study the influence of the properties of the substrate on the retrofit (thin repair) of damaged concrete elements with self-compacting concrete (SCC). Fluidity, the principal characteristic of SCC, should enable it to cover and adhere to the concrete being repaired. Two aspects of repair are considered: the bond (adhesion), and the tensile strength and cracking. The investigation is experimental; it was conducted on test specimens made of ordinary concrete prepared and hardened in advance (the material to be repaired), over which a self-compacting concrete layer was cast. Three SCC mixes and one ordinary concrete (for comparison) were tested. It appears that self-compacting concrete constitutes a good repair material: it perfectly follows the surface forms to be repaired and allows a perfect bond. Fracture tests made on specimens of self-compacting concrete show brittle behaviour; however, when a small percentage of fibres is added, the resistance to cracking is much improved.

Influence of Deep Cold Rolling and Low Plasticity Burnishing on Surface Hardness and Surface Roughness of AISI 4140 Steel

Deep cold rolling (DCR) and low plasticity burnishing (LPB) are cold-working processes which readily produce a smooth, work-hardened surface by plastic deformation of surface irregularities. The present study focuses on the surface roughness and surface hardness of AISI 4140 work material, using a fractional factorial design of experiments. The surface integrity aspects of the work material were assessed in order to identify the predominant factors among the selected parameters. These were then ranked in order of significance, and the factor levels were set to minimize surface roughness and/or maximize surface hardness. In the present work, the influence of the main process parameters (force, feed rate, number of tool passes/overruns, initial roughness of the workpiece, ball material, ball diameter, and lubricant used) on the surface roughness and hardness of AISI 4140 steel was studied for both the LPB and DCR processes, and the results are compared. It was observed that the LPB process improved surface hardness by 167%, while the DCR process improved it by 442%. It was also found that the force, ball diameter, number of tool passes, and initial roughness of the workpiece are the most pronounced parameters, having a significant effect on the workpiece surface during the deep cold rolling and low plasticity burnishing processes.
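
The following sketch illustrates how main effects are ranked in a two-level fractional factorial experiment of the kind used here; the half-fraction design and the hardness responses are invented placeholders, not the study's measurements.

```python
# Ranking main effects from a two-level fractional factorial (illustrative).
import numpy as np
import pandas as pd

# Coded factor levels (-1/+1) for a 2^(3-1) half fraction of three factors.
design = pd.DataFrame({
    "force":     [-1,  1, -1,  1],
    "feed_rate": [-1, -1,  1,  1],
    "ball_diam": [ 1, -1, -1,  1],   # generator: ball_diam = force*feed_rate
})
hardness = np.array([310.0, 450.0, 325.0, 520.0])  # hypothetical responses

# Main effect = mean response at the +1 level minus mean at the -1 level.
effects = {f: hardness[design[f] == 1].mean() - hardness[design[f] == -1].mean()
           for f in design}
for f, e in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
    print(f"{f}: {e:+.1f}")
```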

In Search of Robustness and Efficiency via l1- and l2-Regularized Optimization for Physiological Motion Compensation

Compensating physiological motion in the context of minimally invasive cardiac surgery has become an attractive issue, since such surgery outperforms traditional cardiac procedures and offers remarkable benefits. Owing to space restrictions, computer vision techniques have proven to be the most practical and suitable solution. However, the lack of robustness and efficiency of existing methods makes physiological motion compensation an open and challenging problem. This work focuses on increasing robustness and efficiency by exploring the classes of l1- and l2-regularized optimization, emphasizing the use of explicit regularization. Both approaches are based on natural features of the heart and use intensity information. The results point to the l1-regularized optimization class as the best, since it offered the lowest computational cost and the smallest average error, and it proved to work even under complex deformations.
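
As a generic illustration of the contrast between the two regularization classes, the sketch below fits l1- (Lasso) and l2- (Ridge) penalized least squares with scikit-learn on synthetic data; the paper's own visual-tracking formulation differs, so this only shows the sparsity-versus-shrinkage behaviour of the two penalties.

```python
# l1 vs l2 regularized least squares on synthetic data (illustrative only).
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))               # stand-in feature measurements
w_true = np.array([3.0, -2.0] + [0.0] * 8)   # sparse ground-truth model
y = X @ w_true + 0.1 * rng.normal(size=200)

l1 = Lasso(alpha=0.1).fit(X, y)   # l1 penalty: promotes sparse coefficients
l2 = Ridge(alpha=0.1).fit(X, y)   # l2 penalty: shrinks all coefficients

print("l1 coefficients:", np.round(l1.coef_, 2))
print("l2 coefficients:", np.round(l2.coef_, 2))
```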

Semantic Web Technologies in e-Government

e-Government is already in its second decade. A prerequisite for its further development and adaptation to new realities is the optimal management of administrative information and of the knowledge produced by those involved, i.e. the public sector, citizens, and businesses. Nowadays, the amount of information displayed or distributed on the Internet has reached enormous dimensions, resulting in serious difficulties when extracting and managing knowledge. The semantic web, and the technologies that support it, are expected to play an important role in solving this problem. In this article, we address some relevant issues.
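
As a toy example of the machine-readable metadata that semantic web technologies bring to e-government, the sketch below builds and queries a tiny RDF graph with rdflib; the vocabulary URI and the example service are invented for demonstration.

```python
# Tiny RDF graph for a hypothetical public service, built with rdflib.
from rdflib import Graph, Literal, Namespace, RDF

EGOV = Namespace("http://example.org/egov#")   # hypothetical vocabulary
g = Graph()
g.add((EGOV.PassportRenewal, RDF.type, EGOV.PublicService))
g.add((EGOV.PassportRenewal, EGOV.providedBy, EGOV.MinistryOfInterior))
g.add((EGOV.PassportRenewal, EGOV.requiresDocument, Literal("national ID")))

# SPARQL lets agents discover services without screen-scraping portal pages.
for row in g.query(
        "SELECT ?s WHERE { ?s a <http://example.org/egov#PublicService> }"):
    print(row.s)
```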

Investigation of Some Methodologies for Providing Maps of Surface, Rill, and Gully Erosion and Erosion Features

Several methodologies for providing maps of surface, rill, and gully erosion and erosion features were compared in research that took place in the Varamin sub-basin, north-east of Tehran, Iran. A photomorphic unit map was produced from processed satellite images, and four other maps were prepared by integrating different data layers, including slope, plant cover, geology, land use, rock erodibility, and land units. Comparison of the ground-truth maps of erosion types with the working unit maps indicated that integrating the land use, land unit, and rock erodibility layers with the satellite-image photomorphic unit map provides the best method for producing erosion-type maps.
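
A schematic sketch of the layer-integration step is given below: coded raster layers are combined so that each unique combination of codes forms one working unit. The tiny arrays are placeholders for the actual GIS layers (land use, land units, rock erodibility, etc.).

```python
# Schematic integration of coded raster layers into working units.
import numpy as np

land_use    = np.array([[1, 1, 2], [2, 3, 3]])   # coded land-use classes
land_unit   = np.array([[1, 2, 2], [1, 1, 2]])   # coded land units
erodibility = np.array([[2, 2, 1], [3, 3, 1]])   # coded rock erodibility

# Combine the layers so each unique code triple becomes one working unit,
# which can then be compared against the ground-truth erosion-type map.
working_units = land_use * 100 + land_unit * 10 + erodibility
print(np.unique(working_units))
```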

Health Hazards Related to Computer Use: Experience of the National Institute for Medical Research in Tanzania

This paper is based on a study conducted in 2006 to assess the impact of computer usage on the health of National Institute for Medical Research (NIMR) staff. NIMR being a research institute, most of its staff spend a substantial part of their working time on computers. There was a notion among NIMR staff that prolonged computer usage posed possible health hazards; hence, a study was conducted to establish the facts and identify possible mitigation measures. A total of 144 NIMR staff, of whom 63.2% were male and 36.8% female, aged between 20 and 59 years, were involved in the study. All staff cadres were included in the sample. The functions performed by Institute staff using computers include data management, proposal development and report writing, research activities, secretarial duties, accounting and administrative duties, online information retrieval, and online communication through e-mail services. The interviewed staff had been using computers for 1-8 hours a day and for a period ranging from 1 to 20 years. The study indicated ergonomic hazards of various kinds, ranging from backache to eyesight-related problems, for a significant proportion of interviewees (63%). The authors highlight the major measures applicable to preventing computer-related problems and urge NIMR management and/or the government of Tanzania to adopt them as far as practicable.

Soft-Sensor for Estimation of Gasoline Octane Number in Platforming Processes with Adaptive Neuro-Fuzzy Inference Systems (ANFIS)

Gasoline octane number is the standard measure of the anti-knock properties of a motor fuel in platforming processes, one of the important unit operations in oil refineries; it can be determined by online measurement or by using CFR (Cooperative Fuel Research) engines. Online measurement of the octane number can be done using direct octane number analyzers, but these are too expensive, so a feasible alternative such as an ANFIS estimator is needed. ANFIS is a system in which neural networks are incorporated into fuzzy systems, allowing the model to learn from data automatically using the learning algorithms of neural networks. ANFIS constructs an input-output mapping based both on human knowledge and on generated input-output data pairs. In this research, 31 industrial data sets are used (21 for training and the rest for generalization). Results show that, according to this simulation, the hybrid training algorithm in ANFIS yields good agreement between the industrial data and the simulated results.
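
To make the estimator concrete, the sketch below implements the Sugeno-type inference at the core of ANFIS, with fixed Gaussian membership functions and the rule consequents fitted by least squares (the linear half of the hybrid learning rule). The two inputs, the data, and the membership parameters are assumed placeholders, not the refinery's variables.

```python
# Minimal Sugeno-type ANFIS sketch: Gaussian memberships, grid partition,
# and least-squares estimation of the linear rule consequents.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(31, 2))          # 31 samples, 2 process inputs
y = 90 + 5 * X[:, 0] - 3 * X[:, 1] ** 2      # stand-in octane response

def gauss(x, c, s):
    return np.exp(-((x - c) ** 2) / (2 * s ** 2))

# Two membership functions per input -> 4 rules (grid partition).
centers, sigma = [0.25, 0.75], 0.2

def firing_strengths(X):
    mf = [[gauss(X[:, i], c, sigma) for c in centers] for i in range(2)]
    w = np.stack([mf[0][a] * mf[1][b] for a in range(2) for b in range(2)], 1)
    return w / w.sum(axis=1, keepdims=True)  # normalized firing strengths

W = firing_strengths(X)                                # (31, 4)
# Each rule's consequent is linear: f_r = p_r*x1 + q_r*x2 + r_r.
Phi = np.hstack([W * X[:, [0]], W * X[:, [1]], W])     # (31, 12) design matrix
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)        # least-squares fit
print("train RMSE:", np.sqrt(np.mean((Phi @ theta - y) ** 2)))
```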

Evolution of the Developing Flushing Cone during Pressurized Flushing in Reservoir Storage

Sedimentation in reservoirs and the corresponding loss of storage capacity is one of the most serious problems in dam engineering. Pressurized flushing, a way to remove sediments from the reservoir, is flushing under a pressurized flow condition and a nearly constant water level. Pressurized flushing has only local effects around the outlet: sediment in the vicinity of the outlet openings is scoured and a funnel-shaped crater is created. In this study, the temporal development of the flushing cone under various hydraulic conditions was studied experimentally. Time variations of parameters such as the maximum length and width of flushing and the depth of the scouring cone were measured. Results indicated that an increase in flow velocity (and consequently in Froude number) established new hydraulic conditions for the flushing mechanism, so that sudden growth was observed in the amount of sediment released and in the scour dimensions. In addition, a set of nondimensional relationships was identified for the temporal variation of the flushing scour dimensions, which can eventually be used to estimate the development of the flushing cone.
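
For reference, the Froude number mentioned above is assumed here in its standard open-channel form, Fr = U / sqrt(g·h), since the abstract does not state which definition was used.

```python
# Standard flow Froude number (assumed form; definitions vary by study).
import math

def froude_number(velocity_m_s, depth_m, g=9.81):
    """Fr = U / sqrt(g*h); flow is supercritical when Fr > 1."""
    return velocity_m_s / math.sqrt(g * depth_m)

print(froude_number(1.2, 0.5))  # e.g. U = 1.2 m/s at h = 0.5 m -> Fr ~ 0.54
```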

Production and Remanufacturing of Returned Products in a Supply Chain Using a Modified Genetic Algorithm

In recent years, environmental regulation has been forcing manufacturers to consider recovery of end-of-life and/or returned products for refurbishing, recycling, remanufacturing/repair, and disposal in supply chain management. In this paper, a mathematical model is formulated for a single-product production-inventory system considering remanufacturing/reuse of returned products, where the rate of returned products follows a demand-like function dependent on purchasing price and acceptance quality level. The model is useful in deciding whether to remanufacture or dispose of returned products, alongside newly produced products, to satisfy a stationary demand. In addition, a modified genetic algorithm approach is proposed, inspired by the particle swarm optimization method. Numerical analysis of a case study is carried out to validate the model.
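
A schematic sketch of such a PSO-inspired genetic algorithm is given below: after selection, crossover, and mutation, each child is additionally pulled toward the current global-best individual, as a particle would be in PSO. The objective function is a stand-in, not the paper's production-inventory model.

```python
# GA with a PSO-inspired pull toward the global best (illustrative).
import numpy as np

rng = np.random.default_rng(42)

def objective(x):                        # placeholder cost to minimize
    return np.sum((x - 3.0) ** 2)

POP, DIM = 30, 4
pop = rng.uniform(0.0, 10.0, size=(POP, DIM))

for gen in range(200):
    fit = np.apply_along_axis(objective, 1, pop)
    best = pop[fit.argmin()].copy()      # global-best individual, as in PSO

    # Binary tournament selection.
    i, j = rng.integers(0, POP, POP), rng.integers(0, POP, POP)
    parents = np.where((fit[i] < fit[j])[:, None], pop[i], pop[j])

    # Arithmetic crossover between consecutive parents, then mutation.
    alpha = rng.random((POP, 1))
    children = alpha * parents + (1 - alpha) * np.roll(parents, 1, axis=0)
    children += rng.normal(0.0, 0.1, children.shape) * (rng.random((POP, 1)) < 0.2)

    # PSO-inspired modification: pull each child toward the global best.
    children += rng.random((POP, 1)) * (best - children)

    children[0] = best                   # elitism
    pop = children

print("best cost:", objective(best), "at", np.round(best, 2))
```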

Optimization of Methods for the Development of a Fermented-Distillate Passion Fruit Beverage

Fermented beverages have a strong presence in the beverage market in general and are increasingly valued when the characteristic aroma and flavor of the raw material that gave rise to them are retained after processing. This study aimed to develop a distilled beverage from passion fruit and to assess, through sensory tests and chromatographic profiles, the influence of different treatments (FM1, spirit with pulp added, and FM2, spirit with a higher ratio of pulp in the must) on the retention of volatiles in the fruit drink, as well as to perform a chemical characterization against the main quality parameters established by legislation. The chromatograms and the initial sensory tests indicated that sample FM1 had better aroma characteristics from both the quantitative and the qualitative points of view. However, the final sensory analysis (preference test) revealed the panel's strongest preference for sample FM2-2 (score 7.93), with color and flavor being the decisive attributes in this response, consistent with the lowest observed values of fixed and total acidity in the FM2 treatment samples.

Toxicity of Copper and Cadmium to Freshwater Fishes

Two freshwater fishes, Rasbora sumatrana (Cyprinidae) and Poecilia reticulata (guppy) (Poeciliidae), were exposed for a four-day period under laboratory conditions to a range of copper (Cu) and cadmium (Cd) concentrations. Mortality was assessed and median lethal concentrations (LC50) were calculated. LC50 increased with decreasing mean exposure time for both metals. For R. sumatrana, the LC50s at 24, 48, 72, and 96 hours were 54.2, 30.3, 18.9, and 5.6 μg/L for Cu and 1440.2, 459.3, 392.3, and 101.6 μg/L for Cd, respectively. For P. reticulata, the LC50s at 24, 48, 72, and 96 hours were 348.9, 145.4, 61.3, and 37.9 μg/L for Cu and 8205.6, 2827.1, 405.8, and 168.1 μg/L for Cd, respectively. The results indicated that Cu was more toxic than Cd to both fishes (Cu > Cd) and that R. sumatrana was more sensitive than P. reticulata to both metals.
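
A generic sketch of LC50 estimation by probit analysis, a common procedure for such data, is shown below; the abstract does not state which method was used, and the concentrations and mortality counts here are invented for illustration.

```python
# LC50 via probit regression on log concentration (illustrative data).
import numpy as np
from scipy import stats

conc = np.array([10.0, 20.0, 40.0, 80.0, 160.0])   # exposure, ug/L
dead = np.array([2, 5, 11, 16, 19])                 # of n=20 fish per tank
n = 20

# Probit-transform observed mortality and regress on log10(concentration).
probit = stats.norm.ppf(dead / n)
slope, intercept, *_ = stats.linregress(np.log10(conc), probit)

# LC50 is the concentration at which the probit of mortality is zero (50%).
lc50 = 10 ** (-intercept / slope)
print(f"LC50 ~ {lc50:.1f} ug/L")
```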

Modeling Reaction Time in Car-Following Behaviour Based on Human Factors

This paper develops driver reaction-time models for car-following analysis based on human factors. Reaction time was classified as brake-reaction time (BRT) and acceleration/deceleration reaction time (ADRT). The BRT occurs when the lead vehicle is braking and its brake light is on, while the ADRT occurs when the driver reacts to adjust his/her speed using the gas pedal only. The study evaluates the effect of driver characteristics and traffic kinematic conditions on driver reaction time in a car-following environment. The kinematic conditions introduced urgency and expectancy based on the braking behaviour of the lead vehicle at different speeds and spacings. The kinematic conditions were used for evaluating the BRT and are classified as normal, surprised, and stationary. Data were collected on a driving simulator integrated into a real car and included the BRT and ADRT (as dependent variables) and the driver's age, gender, driving experience, driving intensity (driving hours per week), vehicle speed, and spacing (as independent variables). The results showed a significant difference in the BRT between the normal, surprised, and stationary scenarios and supported the hypothesis that both urgency and expectancy have significant effects on BRT. The driver's age and gender, speed, and spacing were found to be significant variables for the BRT in all scenarios. The results also showed that the driver's age and gender were significant variables for the ADRT. The research presented in this paper is part of a larger project to develop a driver-sensitive in-vehicle rear-end collision warning system.
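
The kind of linear model such an analysis might fit can be sketched with the statsmodels formula API; the simulator data below are synthetic, and the actual model specification is not given in the abstract.

```python
# Sketch of a BRT regression on driver and kinematic variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 120
df = pd.DataFrame({
    "brt":     rng.normal(1.2, 0.3, n),   # brake-reaction time, s (synthetic)
    "age":     rng.integers(18, 70, n),
    "gender":  rng.choice(["M", "F"], n),
    "speed":   rng.uniform(40, 100, n),   # km/h
    "spacing": rng.uniform(10, 60, n),    # m
})

model = smf.ols("brt ~ age + C(gender) + speed + spacing", data=df).fit()
print(model.summary())
```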

Spectral Analysis of Speech: A New Technique

ICA, which is generally used for the blind source separation problem, has been tested for feature extraction in a speech recognition system to replace the phoneme-based MFCC approach. Applying the generated cepstral coefficients to ICA as a preprocessing step yields a new signal processing approach. This gives much better results than MFCC or ICA alone, for both word and speaker recognition. The mixing matrix A is different before and after MFCC, as expected, since the Mel scale is nonlinear. However, cepstral coefficients generated from linear predictive coefficients, being independent, prove to be the right candidates for ICA. Matlab was the tool used for all comparisons. The database used consists of samples from ISOLET.
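
A Python analogue of the described pipeline (the paper itself used Matlab) can be sketched with librosa and scikit-learn: extract cepstral features, then pass them through ICA as preprocessing. The audio file path is a placeholder.

```python
# Cepstral features followed by ICA preprocessing (Python analogue).
import librosa
import numpy as np
from sklearn.decomposition import FastICA

y, sr = librosa.load("isolet_sample.wav", sr=16000)   # hypothetical file
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)    # (13, n_frames)

# ICA over the frame sequence: each frame's MFCC vector is one observation.
ica = FastICA(n_components=13, random_state=0)
features = ica.fit_transform(mfcc.T)                  # independent components
print(features.shape, ica.mixing_.shape)              # mixing matrix A
```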