The Solar Wall in the Italian Climates

Passive solar systems were originally developed to maximize the exploitation of solar energy in cold climates and at high altitudes. Until the 1980s they spread all over the world without any attention to the specific climate or to summer behavior; this led to the deactivation of many systems due to a series of problems connected with summer overheating, complex management and dust accumulation. To date, European regulation limits only winter consumption without any attention to summer behavior, but the recent European standard EN 15251 underlines the relevance of indoor comfort and the need to validate analytical studies through monitored case studies. In the present paper we demonstrate that the solar wall is an efficient system from both the thermal comfort and the energy saving points of view, and that it is the most suitable for our temperate climates because it can also be used as a passive cooling system. In particular, the paper presents an experimental and numerical analysis carried out on a case study with nine different solar passive systems in Ancona, Italy. We carried out a detailed study of the dwelling served by the solar wall through monitoring and evaluation of the indoor conditions. Analysis of the monitored data on the basis of recognized comfort models (ISO, ASHRAE, Givoni's BBCC) showed that the solar wall behaves optimally in the intermediate seasons. In winter this passive system gives greater advantages in terms of energy consumption than the other systems, because it provides larger heat gains and therefore lower consumption. In summer, when the outdoor air temperature returns to the seasonal mean, indoor comfort is optimal thanks to efficient cross ventilation activated by the wall itself.

Mining Frequent Patterns with Functional Programming

Frequent patterns are patterns, such as sets of features or items, that appear in data frequently. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a dataset. Most of the proposed frequent pattern mining algorithms have been implemented with imperative programming languages such as C, C++, and Java. The imperative paradigm is significantly inefficient when the itemset is large and the frequent patterns are long. We suggest a high-level declarative style of programming using a functional language. Our supposition is that the problem of frequent pattern discovery can be efficiently and concisely implemented via a functional paradigm, since pattern matching is a fundamental feature supported by most functional languages. Our frequent pattern mining implementation using the Haskell language confirms our hypothesis about the conciseness of the program. Performance studies on speed and memory usage support our intuition about the efficiency of functional languages.
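
The abstract describes the mining task rather than any code. As a neutral illustration of what frequent pattern discovery means here, counting itemsets whose support meets a threshold, the following brute-force Python sketch may help; it is not the authors' Haskell implementation, and the transactions and threshold are hypothetical.

```python
from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support, max_size=3):
    """Count every itemset of size <= max_size and keep those whose support
    (number of transactions containing it) meets the threshold."""
    counts = Counter()
    for t in transactions:
        for k in range(1, max_size + 1):
            for itemset in combinations(sorted(set(t)), k):
                counts[itemset] += 1
    return {s: c for s, c in counts.items() if c >= min_support}

transactions = [{"bread", "milk"}, {"bread", "butter", "milk"}, {"butter", "milk"}]
print(frequent_itemsets(transactions, min_support=2))
```

This is plain enumeration for clarity; practical miners such as the ones the paper benchmarks prune the search space instead of counting every candidate.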

Preliminary Analysis of Energy Efficiency in a Data Center: A Case Study

As the data-driven economy grows faster than ever and spurs the demand for energy, we are facing unprecedented challenges in improving energy efficiency in data centers. Effectively maximizing energy efficiency and minimizing the cooling energy demand are becoming pervasive concerns for data centers. This paper investigates the overall energy consumption and the energy efficiency of the cooling system of a data center in Finland as a case study. The power, cooling and energy consumption characteristics and the operating conditions of the facilities are examined and analyzed. Potential energy and cooling saving opportunities are identified, and further suggestions for improving the performance of the cooling system are put forward. The results are presented as a comprehensive evaluation of both the energy performance and good practices of energy-efficient cooling operations for the data center. Utilization of an energy recovery concept for the cooling system is proposed. The conclusion we can draw is that even though the analyzed data center demonstrated relatively high energy efficiency, based on its power usage effectiveness value, there is still significant potential for energy saving in its cooling systems.
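
For reference, the power usage effectiveness (PUE) metric mentioned above is conventionally defined as the ratio of total facility energy to the energy delivered to the IT equipment (this standard definition is not spelled out in the abstract):

\[ \mathrm{PUE} = \frac{E_{\text{total facility}}}{E_{\text{IT equipment}}}, \qquad \mathrm{PUE} \ge 1 \]

A value close to 1 indicates that nearly all of the facility's energy reaches the computing load, while the excess above 1 is dominated by cooling and power-distribution overhead.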

Dynamic Response of a Water Tower Composed of Interlocked Panels

Earthquakes produce some of the most violent loading situations that a structure can be subjected to, and if a structure fails under these loads then inevitably human life is put at risk. One of the most common ways in which a structure fails under seismic loading is at the connections of structural elements. The research presented in this paper investigates interlock systems as a novel method for building structures. The main objective of this experimental study was to determine the dynamic characteristics and the seismic behaviour of the proposed structures compared to conventional structural systems during seismic motions. The results of this study indicate that the interlock mechanism of the panels influences the behaviour of the lateral load-resisting systems of the structures during earthquakes, contributing to better structural flexibility and easier maintenance.

A Survey of Job Scheduling and Resource Management in Grid Computing

Grid computing is a form of distributed computing that involves coordinating and sharing computational power, data storage and network resources across dynamic and geographically dispersed organizations. Scheduling onto the Grid is NP-complete, so there is no single best scheduling algorithm for all grid computing systems. An alternative is to select an appropriate scheduling algorithm for a given grid environment based on the characteristics of the tasks, machines and network connectivity. Job and resource scheduling is one of the key research areas in grid computing. The goal of scheduling is to achieve the highest possible system throughput and to match the application's needs with the available computing resources. The motivation of this survey is to encourage newcomers to the field of grid computing, so that they can easily understand the concepts of scheduling and contribute to developing more efficient scheduling algorithms. This will benefit interested researchers in carrying out further work in this thrust area of research.

Primer Design with Specific PCR Product using Particle Swarm Optimization

Before performing polymerase chain reactions (PCR), a feasible primer set is required. Many primer design methods have been proposed for designing feasible primer sets. However, the majority of these methods require a relatively long time to obtain an optimal solution, since large quantities of template DNA need to be analyzed. Furthermore, the designed primer sets usually do not provide a specific PCR product. In recent years, evolutionary computation has been applied to PCR primer design and has yielded promising results. In this paper, a particle swarm optimization (PSO) algorithm is proposed to solve primer design problems associated with providing a specific product for PCR experiments. A test set of the gene CYP1A1, associated with a heightened lung cancer risk, was analyzed, and the accuracy and running time were compared with those of the genetic algorithm (GA) and the memetic algorithm (MA). The comparison of results indicated that the proposed PSO method for primer design finds optimal or near-optimal primer sets and effective PCR products in a relatively short time.
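
The abstract does not detail the update equations, so the following is only a generic PSO skeleton in Python with the standard inertia/cognitive/social velocity update; the fitness function here is a placeholder, whereas the paper's actual objective would score primer properties (length, GC content, melting temperature, product-size constraints).

```python
import random

def pso(fitness, dim, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization: minimizes `fitness` over a box."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                       # personal best positions
    pbest_val = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]      # global best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = fitness(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Placeholder objective; a primer-design fitness would replace this lambda.
print(pso(lambda x: sum(v * v for v in x), dim=2, bounds=(-5.0, 5.0)))
```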

Drum-Buffer-Rope: The Technique to Plan and Control the Production Using Theory of Constraints

The Theory of Constraints (TOC) has been emerging as an important tool for the optimization of manufacturing and service systems. In his first book, "The Goal", Goldratt introduced the Theory of Constraints and its applications in a factory scenario. A large number of production managers around the globe read the book, but only a few could implement it in their plants because the book did not explain the steps needed to implement TOC in the factory. To overcome this limitation, Goldratt wrote this book to explain TOC, Drum-Buffer-Rope (DBR) and the method to implement them. In this paper, an attempt has been made to summarize the salient features of TOC and DBR described in the book and the correct approach to implementing TOC in a factory setting. The simulator available along with the book was actually used by the authors, and Goldratt's claim that DBR and buffer management ease the work of production managers was tested and found to be correct.

A Review on Application of Chitosan as a Natural Antimicrobial

In recent years, the application of natural antimicrobials instead of conventional ones, owing to the hazardous effects of the latter on health, has received serious attention. On the basis of the results of different studies, chitosan, a natural biodegradable and non-toxic biopolysaccharide derived from chitin, has the potential to be used as a natural antimicrobial. Chitosan has exhibited high antimicrobial activity against a wide variety of pathogenic and spoilage microorganisms, including fungi and Gram-positive and Gram-negative bacteria. The antimicrobial action is influenced by intrinsic factors, such as the type of chitosan and the degree of chitosan polymerization, and by extrinsic factors, such as the microbial organism, the environmental conditions and the presence of other components. The use of chitosan in food systems should be based on sufficient knowledge of the complex mechanisms of its antimicrobial mode of action. In this article we review a number of studies investigating the antimicrobial properties of chitosan and its application in culture and food media.

An Optimization of Orbital Transfer for Spacecraft with Finite Thrust Based on the Legendre Pseudospectral Method

This paper presents the use of the Legendre pseudospectral method for the optimization of finite-thrust orbital transfers for spacecraft. In order to obtain an accurate solution, the system's dynamic equations were normalized through a dimensionless method. The Legendre pseudospectral method is based on interpolating functions at the Legendre-Gauss-Lobatto (LGL) quadrature nodes, and it is used to transform the optimal control problem into a constrained parameter optimization problem. The developed novel optimization algorithm can be used to solve similar optimization problems of spacecraft finite-thrust orbital transfer. The results of a numerical simulation verified the validity of the proposed optimization method. The simulation results reveal that the pseudospectral optimization method is a promising method for real-time trajectory optimization, providing good accuracy and fast convergence.
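
For context, the standard form of this discretization (the notation below is assumed, not quoted from the paper) approximates the state by Lagrange interpolation on the LGL nodes \( \tau_k \in [-1, 1] \) and collocates the dynamics there:

\[ x(\tau) \approx \sum_{i=0}^{N} X_i \, L_i(\tau), \qquad \sum_{i=0}^{N} D_{ki} X_i = \frac{t_f - t_0}{2}\, f(X_k, U_k, \tau_k), \quad k = 0, \dots, N \]

where the \( L_i \) are the Lagrange polynomials on the LGL nodes and \( D_{ki} = \dot{L}_i(\tau_k) \) is the differentiation matrix, so the differential constraints become algebraic constraints on the unknown parameters \( (X_k, U_k) \).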

“Blood Family” Activity with Respect to the Comprehensive Guidance School Program

Children and adolescents developing in today's world face a growing array of new and old challenges. School counselling is improving rapidly in contemporary education systems around the world, and the counselling system in Turkey can be said to be newly emerging. In this study, the "Family of the Blood" activity was developed with respect to the comprehensive school guidance program. The sample included 22 adolescents who were high school students. The activity was carried out in 4 sessions, each of which lasted 45 minutes. In the first session, the students' personal-social needs were determined. In the second session, in order to warm up, the students were asked three questions with a constructional aspect. In the third session, the counselor and the teacher shared the results of the students' responses obtained in the previous session. In the fourth session, the tables formed by the students were presented in the classroom. In order to evaluate the activity, three questions were asked of the teacher and the counselor. According to the results, the lesson aims and the counselling aims of the curriculum were attained. The results were discussed in the light of the literature and some suggestions were made. Considering that the activity was beneficial in many respects, similar studies should be carried out in the near future.

Distributed Frequency Synchronization for Global Synchronization in Wireless Mesh Networks

In this paper, our focus is to assure global frequency synchronization in OFDMA-based wireless mesh networks using only local information. To acquire global synchronization in a distributed manner, we propose a novel distributed frequency synchronization (DFS) method. DFS is a method in which the carrier frequencies of distributed nodes converge to a common value through a repeated estimation-and-averaging step and a sharing step. Experimental results show that DFS achieves a noticeably better synchronization success probability than existing schemes in OFDMA-based mesh networks where estimation error is present.
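
The abstract does not give the exact update rule, so the sketch below is only a generic Python illustration of how repeated local averaging over a mesh topology drives per-node carrier frequency estimates toward a common value; the topology, frequency offsets and round count are hypothetical.

```python
import random

def dfs_round(freqs, neighbors):
    """One estimation-and-averaging round: each node replaces its carrier
    frequency estimate with the mean of its own and its neighbors' values."""
    return [
        sum([freqs[i]] + [freqs[j] for j in neighbors[i]]) / (1 + len(neighbors[i]))
        for i in range(len(freqs))
    ]

# Hypothetical 5-node chain topology and initial carrier frequency offsets (Hz).
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
freqs = [random.uniform(-500.0, 500.0) for _ in range(5)]

for _ in range(50):                     # repeated sharing/averaging steps
    freqs = dfs_round(freqs, neighbors)

print(max(freqs) - min(freqs))          # spread shrinks toward zero
```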

Testing Object-Oriented Framework Applications Using FIST2 Tool: A Case Study

An application framework provides a reusable design and implementation for a family of software systems. Frameworks are introduced to reduce the cost of a product line (i.e., a family of products that share common features). Software testing is a time-consuming and costly ongoing activity during the application software development process. Generating reusable test cases for framework applications during the framework development stage, and then providing and using these test cases to test part of the framework application whenever the framework is used, reduces the application development time and cost considerably. This paper introduces the Framework Interface State Transition Tester (FIST2), a tool for the automated unit testing of Java framework applications. During the framework development stage, given the formal descriptions of the framework hooks, the specifications of the methods of the framework's extensible classes, and the illegal behavior description of the Framework Interface Classes (FICs), FIST2 generates unit-level test cases for the classes. At the framework application development stage, given the customized method specifications of the implemented FICs, FIST2 automates the use, execution, and evaluation of the already generated test cases to test the implemented FICs. The paper illustrates the use of the FIST2 tool for testing several applications that use the SalesPoint framework.

Visualization and Indexing of Spectral Databases

On-line (near infrared) spectroscopy is widely used to support the operation of complex process systems. Information extracted from a spectral database can be used to estimate unmeasured product properties and monitor the operation of the process. These techniques are based on looking for similar spectra with nearest neighbor algorithms and distance-based searching methods. Searching for nearest neighbors in the spectral space is computationally expensive, as the complexity increases with the number of points in the discrete spectrum and the number of samples in the database. To reduce the calculation time, some kind of indexing can be used. The main idea presented in this paper is to combine indexing and visualization techniques to reduce the computational requirements of estimation algorithms by providing a two-dimensional indexing that can also be used to visualize the structure of the spectral database. This 2D visualization of the spectral database not only supports the application of distance- and similarity-based techniques but also enables the use of advanced clustering and prediction algorithms based on the Delaunay tessellation of the mapped spectral space. This means the prediction does not have to operate in the high-dimensional space but can be based on the mapped space as well. The results illustrate that the proposed method is able to segment (cluster) spectral databases and detect outliers that are not suitable for instance-based learning algorithms.
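
The paper's specific mapping technique is not named in the abstract, so the following Python sketch only illustrates the general idea: map spectra to 2D (PCA stands in here for whatever mapping is actually used) and index the mapped points with a Delaunay tessellation; the data are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.spatial import Delaunay

# Synthetic spectral database: 200 spectra, 500 wavelength points each.
spectra = np.random.rand(200, 500)

# 2D mapping (PCA is only a stand-in for the paper's mapping method).
coords = PCA(n_components=2).fit_transform(spectra)

# Delaunay tessellation of the mapped space; the triangle containing a new
# mapped spectrum identifies its local neighborhood for prediction.
tri = Delaunay(coords)
query = coords[0] + 0.01
print(tri.find_simplex(query))  # index of the enclosing triangle (-1 if outside)
```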

Absorption of CO2 in EAF Reducing Slag from Stainless Steel Making Process by Wet Grinding

In the current study, we conducted an experimental investigation of the utilization of electric arc furnace (EAF) reducing slag for the absorption of CO2 via a wet grinding method, carried out under various grinding conditions. The slag was ground in a vibrating ball mill in the presence of CO2 and pure water at ambient temperature. The reaction behavior was monitored with a constant-pressure method, and the change in the volume of the experimental system as a function of grinding time was measured. It was found that CO2 absorption began as soon as the grinding started and stopped when the grinding was stopped. The CO2 absorption was significantly greater in the case of wet grinding compared to dry grinding. In general, the amount of CO2 absorbed increased as the amount of water, the weight of slag and the initial pressure increased. However, it decreased when the amount of water exceeded 200 ml and when smaller balls were used. According to this research, the CO2 reacted with the CaO inside the slag, forming CaCO3.
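
The carbonation reaction implied by this conclusion is the standard one (the abstract states it in words but not as an equation):

\[ \mathrm{CaO} + \mathrm{CO_2} \rightarrow \mathrm{CaCO_3} \]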

Secondary Materials Management in Latvia: Challenges and Possibilities

This research paper is dedicated to a topical issue in Latvia and in the whole European Union: the development of secondary materials management. The goal of this paper is to examine the development of secondary materials management in Latvia and, as a result, to point out its main positive aspects and problems. In this research paper the author addresses the following issues: the significance of secondary materials management; the current situation of waste generation and utilization in Latvia compared with other EU Member States; and the main problems and positive aspects of secondary materials management in Latvia. The author concludes that in the last ten years a great deal of work has been done to develop the secondary materials market. Nevertheless, the following improvements are still necessary: implementation of a packaging deposit system, development of separate waste collection, and an increase in recycling capacity.

An Architecture for High Performance File System I/O

This paper presents the architecture of current filesystem implementations as well as our new filesystem, SpadFS, and the operating system Spad with a rewritten VFS layer, targeted at high-performance I/O applications. The paper presents microbenchmarks and real-world benchmarks of different filesystems on the same kernel, as well as benchmarks of the same filesystem on different kernels, enabling the reader to conclude how much the performance of various tasks is affected by the operating system and how much by the physical layout of data on disk. The paper describes our novel features, most notably continuous allocation of directories and cross-file readahead, and shows their impact on performance.

Universal Kinetic Modeling of RAFT Polymerization using Moment Equations

In the following text, we show that by introducing a universal kinetic scheme, the origin of the rate retardation and inhibition period observed in dithiobenzoate-mediated RAFT polymerization can be described properly. We develop our model using the method of moments and then apply it to different monomer/RAFT agent systems, covering both homo- and copolymerization. The modeling results are in excellent agreement with experiments and imply the validity of the universal kinetic scheme, not only for dithiobenzoate-mediated systems but also for other types of monomer/RAFT agent systems.

Measuring the Comprehensibility of a UML-B Model and a B Model

Software maintenance, which involves making enhancements, modifications and corrections to existing software systems, consumes more than half of developer time. Specification comprehensibility plays an important role in software maintenance, as it permits the properties of the system to be understood more easily and quickly. The use of a formal notation such as B increases a specification's precision and consistency. However, the notation is regarded as being difficult to comprehend. A semi-formal notation such as the Unified Modelling Language (UML) is perceived as more accessible, but it lacks formality. Perhaps combining both notations could produce a specification that is not only accurate and consistent but also accessible to users. This paper presents an experiment conducted on a model that integrates the use of both UML and B notations, namely UML-B, versus a B model alone. The objective of the experiment was to evaluate the comprehensibility of a UML-B model compared to a traditional B model. The measurement used in the experiment focused on the efficiency of performing the comprehension tasks. The experiment employed a cross-over design and was conducted on forty-one subjects, including undergraduate and master's students. The results show that the notation used in the UML-B model is more comprehensible than that of the B model.

A Large-Eddy Simulation of Vortex Cell Flow with an Incoming Turbulent Boundary Layer

We present a large-eddy simulation of a vortex cell with a circular shape. The results show that the flow field can be subdivided into four important zones: the shear layer above the cavity, the stagnation zone, the vortex core in the cavity and the boundary layer along the wall of the cavity. It is shown that the vortex core consists of solid body rotation without much turbulence activity. The vortex is mainly driven by high-energy packets that are driven into the cavity from the stagnation point region and by entrainment of fluid from the cavity into the shear layer. The physics in the boundary layer along the cavity's wall appears to be far from that of a canonical boundary layer, which might be a crucial point for modelling this flow.

2D-Modeling with Lego Mindstorms

This work is based on the possibility of using Lego Mindstorms robotics systems to reduce costs. Lego Mindstorms consists of a wide variety of hardware components necessary to simulate, program and test robotics systems in practice. The development environment supplied with the kit was used to program the algorithm that maps the space using the ultrasonic sensor. Matlab was then used to render the values measured by the ultrasonic sensor. The algorithm created for this paper uses theoretical knowledge from the area of signal processing. The data processed by the algorithm are collected by the ultrasonic sensor, which scans the 2D space in front of it. The ultrasonic sensor is placed on the moving arm of the robot, which provides the horizontal movement of the sensor, while the vertical movement of the sensor is provided by the wheel drive. The robot follows a map in order to position the measured data correctly. Based on these findings, Lego Mindstorms can be considered a low-cost and capable kit for real-time modelling.
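
The exact scan geometry is not spelled out in the abstract (and the paper itself uses Matlab for rendering), so the following is only a hypothetical Python sketch of the basic post-processing step: combining each sensor position on the arm/drive grid with the measured distance to place an obstacle point in the 2D map.

```python
# Hypothetical readings: (arm_position_cm, drive_position_cm, distance_cm),
# assuming the sensor measures straight ahead along the +y axis from each
# grid position reached by the arm (x) and the wheel drive (y).
readings = [
    (0.0, 0.0, 55.2), (5.0, 0.0, 54.8), (10.0, 0.0, 61.0),
    (0.0, 5.0, 50.1), (5.0, 5.0, 49.7), (10.0, 5.0, 57.3),
]

def to_obstacle_points(samples):
    """Map each (x, y, distance) sample to the 2D point where the ultrasonic
    pulse was reflected."""
    return [(x, y + d) for x, y, d in samples]

for px, py in to_obstacle_points(readings):
    print("obstacle at (%.1f, %.1f) cm" % (px, py))
```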