Changes of Power-Velocity Relationship in Female Volleyball Players during an Annual Training Cycle

The aim of the study was to follow changes in the power–velocity relationship of female volleyball players during an annual training cycle. The study was conducted on eleven female volleyball players: age 21.6±1.7 years, body height 177.9±4.7 cm, body mass 71.3±6.6 kg and training experience 8.6±3.3 years. The power–velocity relationship was determined from five maximal 10-second cycloergometer efforts with external loads equal to 2.5, 5.0, 7.5, 10.0 and 12.5% of body weight (BW), measured before (I) and after (II) the preparatory period, and after the first (III) and second (IV) competitive seasons. The maximal power output increased from 9.30±0.85 W·kg⁻¹ (I) to 9.50±0.96 W·kg⁻¹ (II), 9.77±0.96 W·kg⁻¹ (III) and 9.95±1.13 W·kg⁻¹ (IV, p
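As a sketch of how such a relationship can be characterized, the power–velocity curve from loaded sprints is commonly modelled as a parabola through the origin, P(v) = a·v − b·v², whose apex gives the maximal power and optimal velocity. The data points below are made up for illustration, not the study's measurements.

```python
import numpy as np

# Hypothetical (velocity, power) points from five loaded sprint efforts.
v = np.array([10.0, 9.0, 8.0, 7.0, 6.0])   # pedalling velocity proxy
p = np.array([8.1, 9.2, 9.6, 9.3, 8.4])    # power output, W/kg (made up)

# Least-squares fit of P(v) = a*v - b*v**2 (no intercept):
A = np.column_stack([v, -v**2])
(a, b), *_ = np.linalg.lstsq(A, p, rcond=None)

# Apex of the parabola: dP/dv = a - 2*b*v = 0.
v_opt = a / (2.0 * b)        # velocity at maximal power
p_max = a**2 / (4.0 * b)     # maximal power output
```

The fitted apex (v_opt, p_max) is what a power–velocity profile summarizes across training phases.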

Density, Strength, Thermal Conductivity and Leachate Characteristics of Light-Weight Fired Clay Bricks Incorporating Cigarette Butts

Several trillion cigarettes produced worldwide annually lead to many thousands of kilograms of toxic waste. Cigarette butts (CBs) accumulate in the environment due to the poor biodegradability of their cellulose acetate filters. This paper presents some of the results from a continuing study on recycling CBs into fired clay bricks. The physico-mechanical properties of fired clay bricks manufactured with different percentages of CBs are reported and discussed. The results show that the density of the fired bricks was reduced by up to 30%, depending on the percentage of CBs incorporated into the raw materials. Similarly, the compressive strength of the tested bricks decreased with the percentage of CBs included in the mix. The thermal conductivity performance of the bricks improved by 51% and 58% for 5% and 10% CB content, respectively. Leaching tests were carried out to investigate the possible leaching of heavy metals from the manufactured clay-CB bricks. The results revealed only trace amounts of heavy metals.

Development of Fragility Curves for a Two-Span Simply Supported Concrete Bridge in a Near-Fault Area

Bridges are among the main components of transportation networks and should remain functional before and after an earthquake for emergency services. Therefore, the seismic performance of bridges must be assessed under different seismic loadings. Fragility curves are a popular tool in seismic evaluation. They are conditional probability statements, giving the probability that a bridge reaches or exceeds a particular damage level at a given intensity level. In this study, the seismic performance of a two-span simply supported concrete bridge is assessed. Due to the usual lack of empirical data, an analytical fragility curve was developed from the results of dynamic analyses of the bridge subjected to different time histories in a near-fault area.
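A conditional probability statement of this kind is commonly modelled as a lognormal fragility function; a minimal sketch follows, where the median intensity and dispersion are illustrative assumptions, not values from this study.

```python
import math

def fragility(im, median, beta):
    """P(damage state reached or exceeded | intensity measure = im),
    modelled as a lognormal CDF with the given median and dispersion."""
    z = math.log(im / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative parameters: median PGA 0.4 g, dispersion 0.6.
p_low  = fragility(0.2, median=0.4, beta=0.6)
p_med  = fragility(0.4, median=0.4, beta=0.6)
p_high = fragility(0.8, median=0.4, beta=0.6)
```

By construction, the exceedance probability at the median intensity is 0.5, and the curve increases monotonically with the intensity measure.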

Efficient and Extensible Data Processing Framework in Ubiquitous Sensor Networks

This paper presents the design and a prototype implementation of an intelligent data processing framework for ubiquitous sensor networks. Much of the focus is on how to handle the sensor data stream and on the interoperability between low-level sensor data and application clients. Our framework first provides a systematic middleware that mediates between the application layer and the low-level sensors, analyzing large volumes of sensor data by filtering and integrating them to create value-added context information. Then, an agent-based architecture is proposed for real-time data distribution, efficiently forwarding a specific event to the appropriate application registered in the directory service via an open interface. The prototype implementation demonstrates that our framework can host sophisticated applications on a ubiquitous sensor network and can autonomously evolve into new middleware, taking advantage of promising technologies such as software agents, XML, cloud computing, and the like.

Energy Efficient Reliable Cooperative Multipath Routing in Wireless Sensor Networks

In this paper, a reliable cooperative multipath routing algorithm is proposed for data forwarding in wireless sensor networks (WSNs). In this algorithm, data packets are forwarded towards the base station (BS) through a number of paths, using a set of relay nodes. The Rayleigh fading model is used to calculate the link evaluation metric. Reliability is guaranteed by selecting an optimal relay set for which the probability of correct packet reception at the BS exceeds a predefined threshold; the proposed scheme thus ensures reliable packet transmission to the BS. At the same time, energy efficiency is achieved through energy balancing, i.e. minimizing the energy consumption of the bottleneck node of the routing path. This work also demonstrates that the proposed algorithm outperforms existing algorithms in extending the longevity of the network for a given level of reliability. The obtained results thus make reliable path selection with minimum energy consumption possible in real time.
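Under Rayleigh fading, the instantaneous SNR on a link is exponentially distributed, so the probability of staying above an SNR threshold (used here as a stand-in for correct packet reception) has a closed form. The sketch below uses illustrative numbers, not the paper's evaluation metric.

```python
import math

def link_success(mean_snr, snr_threshold):
    """P(instantaneous SNR >= threshold) on a Rayleigh-fading link:
    the SNR is exponentially distributed with the given mean."""
    return math.exp(-snr_threshold / mean_snr)

def path_success(mean_snrs, snr_threshold):
    """End-to-end success over independent hops: the product of the
    per-link success probabilities."""
    p = 1.0
    for g in mean_snrs:
        p *= link_success(g, snr_threshold)
    return p
```

A relay-selection step of the kind described above would keep adjusting the relay set until the resulting reception probability at the BS exceeds the predefined threshold.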

Analytical Model Based Evaluation of Human Machine Interfaces Using Cognitive Modeling

Cognitive models allow predicting some aspects of the utility and usability of human machine interfaces (HMI), and simulating the interaction with these interfaces. Prediction is based on a task analysis, which investigates what a user is required to do, in terms of actions and cognitive processes, to achieve a task. Task analysis facilitates the understanding of the system's functionalities. Cognitive models belong to the analytical approaches, which do not involve users during the development process of the interface. This article presents a study on the evaluation of human machine interaction with a contextual assistant's interface using the ACT-R and GOMS cognitive models. The present work shows how these techniques may be applied in HMI evaluation, design and research by emphasizing, first, the task analysis and, second, the task execution time. In order to validate and support our results, an experimental study of user performance was conducted at the DOMUS laboratory during interaction with the contextual assistant's interface. The results show that the GOMS and ACT-R models give good and excellent predictions, respectively, of user performance at both the task level and the object level; the simulated results are very close to those obtained in the experimental study.
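To illustrate how a GOMS-style prediction of execution time works, the Keystroke-Level Model (a simplified member of the GOMS family) sums standard operator times over the action sequence of a task. The sketch below uses the classic published operator estimates, not values calibrated in this study.

```python
# Standard KLM operator times in seconds (Card, Moran & Newell):
# K = keystroke/click, P = point with mouse, H = home hands between
# devices, M = mental preparation.
KLM_TIMES = {"K": 0.20, "P": 1.10, "H": 0.40, "M": 1.35}

def klm_time(operators):
    """Predicted execution time for a sequence of KLM operators."""
    return sum(KLM_TIMES[op] for op in operators)

# e.g. mentally prepare, point at a button, click it:
t = klm_time("MPK")
```

Such per-task predictions are what get compared against measured user performance in an experimental validation like the one described above.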

Sovereign Credit Risk Measures

This paper focuses on sovereign credit risk, a topical issue related to the current Eurozone crisis. In the light of the recent financial crisis, market perception of the creditworthiness of individual sovereigns has changed significantly. Before the outbreak of the financial crisis, market participants did not differentiate between the credit risk borne by individual states, despite different levels of public indebtedness. As the financial crisis unfolded, market participants became aware of the worsening fiscal situation in European countries and started to discriminate among government issuers. Concerns about increasing sovereign risk were reflected in surging sovereign risk premiums. The main aim of this paper is to shed light on the characteristics of sovereign risk, with special attention paid to the mutual relation between the credit spread and the CDS premium as the main measures of the sovereign risk premium.
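The relation between the two measures is often summarized by the CDS-bond basis, the CDS premium minus the bond's credit spread; a minimal sketch with hypothetical numbers:

```python
def cds_bond_basis(cds_premium_bp, bond_yield_pct, risk_free_pct):
    """CDS-bond basis in basis points: the CDS premium minus the credit
    spread (bond yield over the risk-free rate)."""
    credit_spread_bp = (bond_yield_pct - risk_free_pct) * 100.0
    return cds_premium_bp - credit_spread_bp

# Hypothetical sovereign: 5-year CDS at 250 bp, bond yield 5.0%,
# risk-free rate 3.0% -> credit spread 200 bp, basis +50 bp.
basis = cds_bond_basis(250.0, 5.0, 3.0)
```

In frictionless markets the basis would be near zero; persistent deviations are one way the divergence between the two sovereign risk measures shows up.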

Development of NOx Emission Model for a Tangentially Fired Acid Incinerator

This paper develops a NOx emission model of an acid gas incinerator using Nelder-Mead least squares support vector regression (LS-SVR). The Malaysian Department of Environment (DOE) is actively enforcing the Clean Air Regulation, which mandates the installation of analytical instrumentation known as a Continuous Emission Monitoring System (CEMS) to report emission levels online to the DOE. As a hardware-based analyzer, a CEMS is expensive, maintenance-intensive and often unreliable. Therefore, a software-based predictive technique is often preferred and considered a feasible alternative to a CEMS for regulatory compliance. The LS-SVR model is built on emissions data from an acid gas incinerator operating in an LNG complex. Simulated annealing (SA) is first used to determine the initial hyperparameters, which are then further optimized, based on the performance of the model, using the Nelder-Mead simplex algorithm. The LS-SVR model is shown to outperform a benchmark model based on backpropagation neural networks (BPNN) on both training and testing data.
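The hyperparameter search can be illustrated with a Nelder-Mead loop around a small kernel ridge model (a close relative of LS-SVR). Everything below — the synthetic data, the kernel, and the starting point — is an assumption for illustration, not the paper's setup, and the SA initialization stage is omitted.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic stand-in data (the paper uses incinerator emissions).
rng = np.random.default_rng(0)
x_tr = np.linspace(0.0, 4.0, 40)[:, None]
y_tr = np.sin(x_tr).ravel() + 0.1 * rng.standard_normal(40)
x_va = np.linspace(0.1, 3.9, 20)[:, None]
y_va = np.sin(x_va).ravel()

def rbf(a, b, width):
    """Gaussian (RBF) kernel matrix between column vectors a and b."""
    return np.exp(-((a - b.T) ** 2) / (2.0 * width ** 2))

def val_error(log_params):
    """Validation MSE of a kernel ridge fit (LS-SVR-like, no bias term)
    for hyperparameters (gamma, width), searched in log space so that
    both stay positive."""
    gamma, width = np.exp(log_params)
    k = rbf(x_tr, x_tr, width) + np.eye(len(x_tr)) / gamma
    alpha = np.linalg.solve(k, y_tr)
    return float(np.mean((rbf(x_va, x_tr, width) @ alpha - y_va) ** 2))

# Derivative-free simplex search over the two hyperparameters.
res = minimize(val_error, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
```

In the paper's scheme, the `x0` starting point would come from the simulated annealing stage rather than being fixed by hand.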

A Forward Automatic Censored Cell-Averaging Detector for Multiple Target Situations in Log-Normal Clutter

A challenging problem in radar signal processing is to achieve reliable target detection in the presence of interference. In this paper, we propose a novel algorithm for the automatic censoring of radar interfering targets in log-normal clutter. The proposed algorithm, termed the forward automatic censored cell averaging detector (F-ACCAD), consists of two steps: removing the corrupted reference cells (censoring) and the actual detection. Both steps are performed dynamically, using a suitable set of ranked cells to estimate the unknown background level and set the adaptive thresholds accordingly. The F-ACCAD algorithm requires neither prior information about the clutter parameters nor knowledge of the number of interfering targets. The effectiveness of the F-ACCAD algorithm is assessed by computing, using Monte Carlo simulations, the probability of censoring and the probability of detection in different background environments.
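As a baseline for the censoring variant described above, a plain cell-averaging CFAR detector (without the censoring step, and with an assumed scale factor rather than one derived from a design false-alarm rate) can be sketched as:

```python
import numpy as np

def ca_cfar(x, num_ref=16, num_guard=2, scale=3.0):
    """Basic cell-averaging CFAR: for each cell under test, estimate the
    background from reference cells on both sides (skipping guard cells)
    and declare a detection when the cell exceeds scale * background.
    The scale factor here is an assumed value, not a derived one."""
    half = num_ref // 2
    det = np.zeros(len(x), dtype=bool)
    for i in range(half + num_guard, len(x) - half - num_guard):
        lead = x[i - num_guard - half : i - num_guard]
        lag = x[i + num_guard + 1 : i + num_guard + 1 + half]
        background = np.mean(np.concatenate([lead, lag]))
        det[i] = x[i] > scale * background
    return det
```

The F-ACCAD extends this idea by first ranking the reference cells and censoring the corrupted ones, so that interfering targets in the reference window do not inflate the background estimate.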

Approaches to Determining Optimal Asset Structure for a Commercial Bank

Every commercial bank optimises its asset portfolio depending on the profitability of assets and on chosen or imposed constraints. This paper proposes and applies a stylized model for optimising a bank's asset and liability structure, reflecting the profitability of different asset categories and their risks, as well as the costs associated with different liability categories and reserve requirements. The level of detail for asset and liability categories is chosen to keep the model suitably parsimonious while including the most important categories. It is shown that the most appropriate optimisation criterion for the model is the maximisation of the ratio of net interest income to assets. The maximisation of this ratio is subject to several constraints: some are accounting identities or dictated by legislative requirements, while others vary depending on the market objectives of a particular bank. The model predicts a variable amount of assets allocated to loan provision.
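A stylized version of such an optimisation can be written as a linear program; the asset categories, yields, funding cost and limits below are illustrative assumptions, not the paper's calibration.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative balance sheet normalised to total assets = 1:
# x = [loans, bonds, reserves], with assumed yields and funding cost.
yields = np.array([0.07, 0.03, 0.00])
cost_of_funds = 0.02   # constant, so it does not affect the argmax

res = linprog(
    c=-yields,                     # linprog minimises, so negate
    A_ub=[[1.0, 0.0, 0.0]],        # risk limit: loans <= 60% of assets
    b_ub=[0.60],
    A_eq=[[1.0, 1.0, 1.0]],        # accounting identity: weights sum to 1
    b_eq=[1.0],
    bounds=[(0, None), (0, None), (0.10, None)],  # reserves >= 10%
)
allocation = res.x
net_interest_margin = yields @ allocation - cost_of_funds
```

With assets normalised to 1, maximising net interest income over assets reduces to maximising the yield on the asset mix; here the optimum fills the loan limit, meets the reserve floor, and puts the remainder in bonds.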

Two Spatial Experiments based on Computational Geometry

The paper outlines the relevance of computational geometry within the design and production process of architecture. Based on two case studies, the digital chain, from initial form-finding to the final realization of spatial concepts, is discussed in relation to geometric principles. The association with the fascinating complexity found in nature and its underlying geometry was the starting point for both projects presented in the paper. The translation of abstract geometric principles into a three-dimensional digital design model, realized in Rhinoceros, was followed by a process of transformation and optimization of the initial shape that integrated aesthetic, spatial and structural qualities as well as material properties and production conditions.

Data Acquisition from Cell Phones Using a Logical Approach

Cell phone forensics, the acquisition and analysis of data stored in a cellular phone, is nowadays used by national investigation organizations and private companies. There are two methods for collecting cell phone flash memory data. The first is a logical method, which acquires files and directories from the file system of the cell phone flash memory. The second obtains all data from a bit-by-bit copy of the entire physical memory using a low-level access method. In this paper, we describe a forensic tool for acquiring cell phone flash memory data using a logical-level approach. With our tool, we can acquire the EFS file system and inspect memory data in an arbitrary region of a Korean CDMA cell phone.

A Comparative Study on Eastern and Western Wedding Ceremonies in Korean Films and Hollywood Films

As an adult man and woman love each other and come to have faith in each other as spouses, they marry. Recently, people's economic lives have become individualized, and women enjoy a higher education level and increased participation in social activities; these changes are creating an environment favorable to single life. Thus, an increasing number of people are choosing celibacy, and many prefer cohabitation to marriage. Nevertheless, marriage is still widely regarded as an obligatory part of life. Most people throughout the world accept marriage as a natural process of life and an important rite of passage experienced everywhere, despite the diversity of lifestyles. With regard to the wedding ceremony, however, each country and culture has its own unique tradition and festival style. A wedding is not just a congratulatory ceremony but carries multiple meanings representing its age, country or culture. Moreover, the form and contents of wedding ceremonies change over time, and such features are well represented in films. This study takes note of the fact that films reflect and reproduce each country's history and culture, and analyzes four films believed to show the differences between Eastern and Western wedding ceremonies: A Perfect Match (2002), Marriage Is a Crazy Thing (2001), Bride Wars (2009) and 27 Dresses (2008). The author examines the wedding ceremonies described in the four films, the differences between the East and the West that they suggest, and the changes in their societies.

Energy-Efficient Electrical Power Distribution with Multi-Agent Control at Parallel DC/DC Converters

Consumer electronics are pervasive. It is impossible to imagine a household or office without DVD players, digital cameras, printers, mobile phones, shavers, electric toothbrushes, etc. In the absence of universal standards, these devices operate at different voltage levels, ranging from 1.8 to 20 VDC, while the available supply is usually 120/230 VAC at 50/60 Hz. This situation makes an individual electrical energy conversion system necessary for each device. Such converters usually involve several conversion stages and often operate with excessive losses and poor reliability. The aim of the project presented in this paper is to design and implement a multi-channel DC/DC converter system that customizes the output voltage and current ratings according to the requirements of the load. Distributed, multi-agent techniques will be applied for the control of the DC/DC converters.

Region-Based Image Fusion with Artificial Neural Network

Most image fusion algorithms treat pixels in the image more or less independently, and their parameters have to be adjusted for different times of day or weather conditions. In this paper, we propose a region-based image fusion method that combines aspects of feature-level and pixel-level fusion, instead of operating on pixels alone. The basic idea is to segment only the far-infrared image and to add the information of each region from the segmented image to the visible image. Different fusion parameters are then determined for each region. Finally, an artificial neural network is adopted to handle varying time and weather conditions, because the relationship between the fusion parameters and the image features is nonlinear. This allows the fusion parameters to be produced automatically for different states. The experimental results show that the proposed method has good adaptive capacity, with automatically determined fusion parameters, and that the architecture can be used for many applications.
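The region-wise blending idea can be sketched with a crude threshold segmentation standing in for the paper's segmentation step; the images, the threshold and the per-region weights below are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
visible = rng.uniform(0.0, 1.0, (8, 8))    # stand-in visible image
infrared = rng.uniform(0.0, 1.0, (8, 8))   # stand-in far-infrared image

# Segment only the infrared image (here: a crude intensity threshold),
# then give each region its own weight for the IR contribution.
hot = infrared > 0.5
weights = np.where(hot, 0.7, 0.2)          # assumed per-region weights
fused = weights * infrared + (1.0 - weights) * visible
```

In the proposed scheme, these weights would come from an artificial neural network driven by image features rather than being fixed constants.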

Mineral and Some Physico-Chemical Composition of 'Karayemis' (Prunus laurocerasus L.) Fruits Grown in Northeast Turkey

Some physico-chemical characteristics and the mineral composition of 'Karayemis' (Prunus laurocerasus L.) fruits, which grow naturally in Northeast Turkey, were studied. 28 minerals (Al, Mg, B, Mn, Co, Na, Ca, Ni, Cd, P, Cr, Pb, Cu, S, Fe, Zn, K, Sr, Li, As, V, Ag, Ba, Br, Ga, In, Se, Ti) were analyzed, and 19 were present at ascertainable levels. The Karayemis fruit was richest in potassium (7938.711 ppm), magnesium (1242.186 ppm) and calcium (1158.853 ppm). Some physico-chemical characteristics of the fruit were also investigated: fruit length, fruit width, fruit thickness, fruit weight, total soluble solids, colour, protein, crude ash, crude fiber and crude oil values were determined as 2.334 cm, 1.884 cm, 2.112 cm, 5.35 g, 20.1%, S99M99Y99, 0.29%, 0.22%, 6.63% and 0.001%, respectively. The mean seed weight, length, width and thickness were found to be 0.41 g, 1.303 cm, 0.921 cm and 0.803 cm, respectively.

From Individual Memory to Organizational Memory (Intelligence of Organizations)

Intensive environmental change and strong market competition have raised the management of information and knowledge to the strategic level of companies. In a knowledge-based economy, only those organizations that have up-to-date, specialized knowledge and are able to exploit and develop it are capable of surviving. Companies have to know what knowledge they possess, by taking a survey of organizational knowledge, and they have to fix current and additional knowledge in organizational memory. The question is how to identify, acquire, fix and use knowledge effectively. The paper shows that, over and above the information technology tools supporting the acquisition, storage and use of information, and organizational learning as well as the knowledge arising from it, the fixing and storage of knowledge in a company's memory play an important role in the intelligence of organizations and in a company's competitiveness.

Peakwise Smoothing of Data Models using Wavelets

Smoothing or filtering of data is the first preprocessing step for noise suppression in many applications involving data analysis. The moving average is the most popular method of smoothing data; its generalization led to the development of the Savitzky-Golay filter. Many window smoothing methods have been developed by convolving the data with different window functions for different applications; the most widely used window functions are the Gaussian and Kaiser windows. Function approximation of the data by polynomial regression, Fourier expansion or wavelet expansion also yields smoothed data. Wavelets, in particular, smooth the data to a great extent by thresholding the wavelet coefficients. Almost all smoothing methods destroy peaks and flatten them as the support of the window is increased. In certain applications it is desirable to retain peaks while smoothing the data as much as possible. In this paper we present a methodology, called peakwise smoothing, that smooths the data to any desired level without losing the major peak features.
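The peak-flattening behaviour described above is easy to demonstrate: widening a moving-average window suppresses noise but also attenuates a narrow peak roughly in proportion to the window width. A minimal sketch on synthetic data:

```python
import numpy as np

def moving_average(x, w):
    """Smooth x with a centred moving-average window of width w."""
    return np.convolve(x, np.ones(w) / w, mode="same")

# A unit-height narrow peak on a noisy baseline (synthetic data).
rng = np.random.default_rng(1)
x = 0.05 * rng.standard_normal(201)
x[100] += 1.0

# The wider the window, the more the peak is flattened (~1/w).
peak_w5 = moving_average(x, 5)[100]
peak_w21 = moving_average(x, 21)[100]
```

A peakwise scheme like the one proposed here would instead adapt the smoothing near detected peaks, so that this attenuation is avoided while the baseline is still smoothed.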

The e-Delphi Method to Test the Importance of Competences and Skills: The Case of Spanish Lifelong Learning Trainers

Lifelong learning is a crucial element in the modernization of European education and training systems. The most important actors in the development of lifelong learning are the trainers, whose professional profile requires new competences and skills in the current labour market. The main objective of this paper is to establish an importance ranking of the new competences, capabilities and skills that Spanish lifelong learning trainers must possess nowadays. A wide study of secondary sources allowed the design of a questionnaire that organizes the trainers' skills and competences. The e-Delphi method is used to carry out a creative, individual and anonymous expert evaluation of the importance ranking of the criteria, sub-criteria and indicators in the e-Delphi questionnaire. Twenty Spanish experts in lifelong learning participated in two rounds of the e-Delphi method. In the first round, the analysis of the experts' evaluations made it possible to establish a ranking of the most important criteria, sub-criteria and indicators and to eliminate the least valued. The minimum level necessary to reach consensus among the experts was achieved in the second round.

Trends, Problems and Needs of Urban Housing in Malaysia

The right to housing is a basic need, while good-quality, affordable housing is a reflection of a high quality of life. However, housing remains a major problem for most people, especially the bottom billions. Satisfaction with housing and neighbourhood conditions is one of the important indicators reflecting quality of life. These indicators are also important in evaluating housing policy aimed at increasing the quality of housing and neighbourhoods. The research method is purely quantitative, using a survey. The findings show that housing purchasing trends in urban Malaysia are determined by demographic profiles, mainly education level, age, gender and income. The period of housing ownership also influences the socio-cultural interactions and the satisfaction of house owners with their neighbourhoods. The findings also show that the main concerns for house buyers in urban areas are the price and location of the house. Respondents feel that houses in urban Malaysia are too expensive and beyond their affordability; the location of houses and the distance from the workplace are also regarded as main concerns. However, respondents are fairly satisfied with the religious and socio-cultural facilities in their housing areas and, most importantly, few regard ethnicity as an issue in their decision-making when buying a house.