Biomechanical Analysis of the Basic Classical Dance Jump – The Grand Jeté

The aim of this study was to analyse the most important parameters determining the quality of the motion structure of the basic classical dance jump, the grand jeté. The research sample consisted of 8 students of the Dance Conservatory in Brno. Using the Simi Motion system, we performed a 3D kinematic analysis of the jump. By comparing the structural quality and the measured data of the grand jeté, we defined the optimal values of the relevant parameters determining the quality of the performance. The take-off speed should reach about 2.4 m·s⁻¹, and the optimal take-off angle is 28–30°. The take-off leg should swing backward at the beginning of the flight phase with a minimum speed of 3.3 m·s⁻¹. If the motor abilities of dancers reach the level necessary for the optimal performance of a classical dance jump, there is room for a certain variability in the structure of the dance jump.
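
As a rough illustration of how these take-off parameters shape the flight phase, the dancer's centre of mass can be treated as a simple projectile. The following sketch applies the standard ballistic formulas to the reported values; it is only an approximation that ignores the swing-leg contribution and in-flight posture changes, and the variable names are illustrative:

    import math

    g = 9.81                     # gravitational acceleration (m/s^2)
    v0 = 2.4                     # reported take-off speed (m/s)
    theta = math.radians(29.0)   # mid-point of the reported 28-30 degree range

    v_up = v0 * math.sin(theta)      # vertical component of take-off velocity
    t_flight = 2.0 * v_up / g        # airborne time for a symmetric flight path
    h_peak = v_up ** 2 / (2.0 * g)   # peak rise of the centre of mass

    print(f"flight time: {t_flight:.2f} s, peak CoM rise: {h_peak:.2f} m")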

Memory and Higher Cognition

Working memory (WM) can be defined as the system that actively holds information in the mind in order to perform tasks despite distraction. In contrast, short-term memory (STM) is a system that represents the capacity for the active storage of information in the absence of distraction. Accumulating evidence indicates that these types of memory are related to higher cognition (HC). The aim of this study was to examine the relationship between HC and memory (visual STM and WM, auditory STM and WM). 59 primary school children were tested with an intelligence test, mathematical tasks (HC), and memory subtests. We show that visual, but not auditory, memory is a significant predictor of higher cognition. The relevance of these results is discussed.

Using Dempster-Shafer Theory in XML Information Retrieval

XML is a markup language that is becoming the standard format for information representation and data exchange. A major purpose of XML is the explicit representation of the logical structure of a document. Much research has been devoted to exploiting the logical structure of documents in information retrieval, in order to satisfy user information needs more precisely from large collections of XML documents. In this paper, we describe an XML information retrieval weighting scheme that tries to find the most relevant elements in XML documents in response to a user query. We present this weighting model for information retrieval systems that use plausible inferences to infer the relevance of elements in XML documents. We also extend this model with the Dempster-Shafer theory of evidence, to express the uncertainty in plausible inferences, and the Dempster-Shafer rule of combination, to combine the evidence derived from different inferences.
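
To make the combination step concrete, the sketch below implements Dempster's rule for two mass functions over a small frame of discernment. The element names ('sec1', 'sec2') and the mass values are illustrative, not taken from the paper:

    from itertools import product

    def combine(m1, m2):
        """Combine two mass functions (dicts mapping frozensets to masses)."""
        combined, conflict = {}, 0.0
        for (a, x), (b, y) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + x * y
            else:
                conflict += x * y  # mass falling on the empty set
        if conflict >= 1.0:
            raise ValueError("total conflict: sources cannot be combined")
        # Dempster's rule renormalizes by the non-conflicting mass.
        return {s: v / (1.0 - conflict) for s, v in combined.items()}

    # Hypothetical evidence from two inferences about which element is relevant.
    m_structure = {frozenset({"sec1"}): 0.6, frozenset({"sec1", "sec2"}): 0.4}
    m_content = {frozenset({"sec1"}): 0.5, frozenset({"sec2"}): 0.3,
                 frozenset({"sec1", "sec2"}): 0.2}
    print(combine(m_structure, m_content))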

The Impact of Upgrades on ERP System Reliability

Constant upgrading of Enterprise Resource Planning (ERP) systems is necessary, but it can introduce new defects. This paper attempts to model the likelihood of defects after completed upgrades with a Weibull defect probability density function (PDF). A case study is presented analyzing the defect records of one ERP subsystem. Trends in the parameters of the fitted Weibull distribution are observed over a one-year period. As a result, the ability to predict the appearance of defects after the next upgrade is described.
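
As a minimal sketch of the fitting step, the snippet below fits a two-parameter Weibull PDF to hypothetical times between an upgrade and each recorded defect, using SciPy's maximum-likelihood fit with the location fixed at zero; the data values are made up for illustration:

    import numpy as np
    from scipy import stats

    # Days from an upgrade to each recorded defect (hypothetical values).
    days_to_defect = np.array([2, 3, 5, 5, 8, 11, 14, 20, 27, 33, 41, 60])

    # Fit the shape and scale parameters with the location fixed at zero.
    shape, loc, scale = stats.weibull_min.fit(days_to_defect, floc=0)
    print(f"shape = {shape:.2f}, scale = {scale:.1f} days")

    # Probability that a defect surfaces within the first week after an upgrade.
    print("P(defect within 7 days) =",
          stats.weibull_min.cdf(7, shape, loc=0, scale=scale).round(3))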

Social Anthropology of Convergence and Nomadic Computing

The paper attempts to contribute, on the one hand, to the largely neglected social and anthropological discussion of technology development and, on the other, to redirecting the emphasis in anthropology from primitive and exotic societies to problems of high relevance in the contemporary era and to how technology is used in everyday life. It draws upon multidimensional models of intelligence and ideal-type formation. It is argued that the predominance of computational and cognitive cosmovisions has led to technology alienation. The injection of communicative competence into artificially intelligent systems and identity technologies in the coming information society are analyzed.

Professional Burnout of Teachers: Reasons and Regularities

In recent years in Kazakhstan, as in other countries, attention has turned not only to professional stress but also to the professional Burnout Syndrome of employees. Burnout is essentially a response to chronic emotional stress; it manifests itself in the form of chronic fatigue, despondency, unmotivated aggression, anger, and other symptoms. Among teachers, this condition arises from mental fatigue as a sort of payment for overstrain, since their professional commitments require them to give the "heat of their soul", an emotional investment. The emergence of professional Burnout among teachers is due to a system of interrelated and mutually reinforcing factors at various levels of the personality: the individual-psychological level comprises the psychodynamic characteristics of the subject, the features of the value-motivational sphere, and the formation of skills and habits of self-regulation; the socio-psychological level includes the organization of a teacher's work and interpersonal interaction. Signs of Burnout were observed in 15 of the subjects tested, and at least one symptom could be observed in virtually every teacher. The diagnosis showed that 48% of teachers had signs of stress (the phase syndrome), resulting in anxiety, low mood, and heightened emotional susceptibility. The following results were also obtained: a fall in general energy potential, 14 persons; psychosomatic and psycho-vegetative syndromes, 26 persons; emotional deficit, 34 persons; emotional Burnout Syndrome, 6 persons. The problem of the professional Burnout of teachers in current conditions should become not only meaningful but particularly relevant. The quality of education of the younger generation depends on teachers' professional development, their training level, and how "healthy" they are. That is why the systematic support of teachers' pedagogical-professional development (including the disclosure of the factors behind professional Burnout Syndrome) takes on a special meaning.

How Can Celebrities Be Used in Advertising to the Best Advantage?

The ever-increasing product diversity and competition on the market of goods and services has dictated the pace of growth in the number of advertisements. Despite their admittedly diminished effectiveness in recent years, advertisements remain the favored method of sales promotion. Consequently, the challenge for an advertiser is to explore every possible avenue of making an advertisement more noticeable, attractive, and compelling for consumers. One way to achieve this is through celebrity endorsements. On the one hand, the use of a celebrity to endorse a product involves substantial costs; on the other hand, it does not in itself guarantee the success of an advertisement. The question of how celebrities can be used in advertising to the best advantage is therefore of utmost importance. Celebrity endorsements have become commonplace: empirical evidence indicates that approximately 20 to 25 per cent of advertisements feature a famous person as a product endorser. The popularity of celebrity endorsements demonstrates the relevance of the topic, especially in the context of the current global economic downturn, when companies are forced to save in order to survive, yet simultaneously to invest heavily in advertising and sales promotion. The issue of the effective use of celebrity endorsements also figures prominently in the academic discourse. The study presented below is thus aimed at exploring which qualities (characteristics) of a celebrity endorser have an impact on the effectiveness of the advertisement in which he/she appears, and how.

Correlation-based Feature Selection using Ant Colony Optimization

Feature selection has recently been the subject of intensive research in data mining, especially for datasets with a large number of attributes. Recent work has shown that feature selection can have a positive effect on the performance of machine learning algorithms. The success of many learning algorithms, in their attempts to construct models of data, hinges on the reliable identification of a small set of highly predictive attributes. The inclusion of irrelevant, redundant, and noisy attributes in the model-building phase can result in poor predictive performance and increased computation. In this paper, a novel feature search procedure that utilizes Ant Colony Optimization (ACO) is presented. ACO is a metaheuristic inspired by the behavior of real ants in their search for the shortest paths to food sources. It looks for optimal solutions by considering both local heuristics and previous knowledge. When applied to two different classification problems, the proposed algorithm achieved very promising results.
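
The sketch below illustrates the general shape of such an ACO feature search in a wrapper setting, with k-NN cross-validation accuracy as the fitness and absolute class correlation as the local heuristic. It is a generic sketch under those assumptions, not the authors' exact procedure, and all parameter values are illustrative:

    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    X, y = load_breast_cancer(return_X_y=True)
    n_features = X.shape[1]

    # Local heuristic: absolute correlation of each feature with the class.
    heuristic = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_features)])
    pheromone = np.ones(n_features)  # previous knowledge, updated each iteration

    def fitness(mask):
        clf = KNeighborsClassifier(n_neighbors=5)
        return cross_val_score(clf, X[:, mask], y, cv=3).mean()

    best_mask, best_fit = None, 0.0
    for iteration in range(10):            # colony iterations
        for ant in range(8):               # ants per iteration
            # Selection probability mixes pheromone and the local heuristic.
            p = (pheromone * heuristic) / (pheromone * heuristic).sum()
            chosen = rng.choice(n_features, size=10, replace=False, p=p)
            mask = np.zeros(n_features, dtype=bool)
            mask[chosen] = True
            f = fitness(mask)
            if f > best_fit:
                best_mask, best_fit = mask, f
        pheromone *= 0.9                   # evaporation
        pheromone[best_mask] += best_fit   # reinforce the best subset found

    print(f"best accuracy {best_fit:.3f} with {best_mask.sum()} features")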

Automatic Map Simplification for Visualization on Mobile Devices

The visualization of geographic information on mobile devices has become popular with the widespread use of the mobile Internet. The mobility of these devices brings much convenience to people's lives. Through the add-on location-based services of the devices, people can access timely information relevant to their tasks. However, visual analysis of geographic data on mobile devices presents several challenges due to the small display and restricted computing resources. These limitations on screen size and resources may impair the usability of visualization applications. In this paper, a variable-scale visualization method is proposed to handle the challenge of the small mobile display. By merging multiple scales of information into a single image, the viewer is able to focus on the interesting region while having a good grasp of the surrounding context. This is essentially visualizing the map through a fisheye lens. However, the fisheye lens induces undesirable geometric distortion in the periphery, which renders the information there meaningless. The proposed solution is to apply map generalization, which removes excessive information around the periphery, and an automatic smoothing process that corrects the distortion while keeping the local topology consistent. The proposed method is applied to both artificial and real geographical data for evaluation.
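
A minimal sketch of the underlying fisheye transform, in the style of Sarkar and Brown's graphical fisheye, is given below; the distortion factor and focus point are illustrative, and the generalization and smoothing steps of the proposed method are not shown:

    import numpy as np

    def fisheye(points, focus, d=3.0, r_max=1.0):
        """Magnify points near `focus`, compressing the periphery."""
        vec = points - focus
        r = np.linalg.norm(vec, axis=1, keepdims=True)
        x = np.clip(r / r_max, 0.0, 1.0)       # normalized distance in [0, 1]
        g = (d + 1.0) * x / (d * x + 1.0)      # fisheye distortion function
        # Rescale each point along its ray from the focus (focus stays fixed).
        scale = np.divide(g * r_max, r, out=np.ones_like(r), where=r > 0)
        return focus + vec * scale

    vertices = np.array([[0.1, 0.1], [0.3, 0.4], [0.8, 0.9]])
    print(fisheye(vertices, focus=np.array([0.5, 0.5])))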

Hourly Electricity Load Forecasting: An Empirical Application to the Italian Railways

Due to the liberalization of numerous electricity markets, load forecasting has become crucial to all public utilities for which electricity is a strategic variable. With the goal of contributing to the forecasting process inside public utilities, this paper addresses the issue of applying the Holt-Winters exponential smoothing technique and time series analysis to forecasting the hourly electricity load curve of the Italian railways. The results of the analysis confirm the accuracy of the two models and therefore the relevance of forecasting inside public utilities.
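
A minimal sketch of the Holt-Winters step, assuming an additive trend and a 24-hour seasonal cycle and using the statsmodels implementation, is shown below; the synthetic series merely stands in for the railway load data:

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    # Hypothetical hourly load with a daily (24-hour) seasonal pattern.
    hours = pd.date_range("2024-01-01", periods=24 * 28, freq="h")
    load = pd.Series(
        500
        + 80 * np.sin(2 * np.pi * np.arange(len(hours)) / 24)
        + np.random.default_rng(0).normal(0, 10, len(hours)),
        index=hours,
    )

    # Additive trend and additive 24-hour seasonality.
    model = ExponentialSmoothing(
        load, trend="add", seasonal="add", seasonal_periods=24
    ).fit()
    print(model.forecast(24))  # the next day's hourly load forecast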

Data Preprocessing for Supervised Learning

Many factors affect the success of Machine Learning (ML) on a given task. The representation and quality of the instance data are first and foremost. If there is much irrelevant and redundant information present, or the data are noisy and unreliable, then knowledge discovery during the training phase is more difficult. It is well known that data preparation and filtering steps take a considerable amount of processing time in ML problems. Data pre-processing includes data cleaning, normalization, transformation, feature extraction and selection, and so on. The product of data pre-processing is the final training set. It would be convenient if a single sequence of data pre-processing algorithms had the best performance for every data set, but this is not the case. We therefore present the best-known algorithms for each step of data pre-processing, so that practitioners can achieve the best performance for their data sets.
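
As a minimal sketch of such a sequence, the snippet below chains cleaning, normalization, and feature selection into a single scikit-learn pipeline; the choice of steps and parameters is illustrative rather than prescriptive:

    from sklearn.datasets import load_wine
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.impute import SimpleImputer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = load_wine(return_X_y=True)

    pipeline = Pipeline([
        ("clean", SimpleImputer(strategy="median")),    # data cleaning
        ("normalize", StandardScaler()),                # normalization
        ("select", SelectKBest(f_classif, k=8)),        # feature selection
        ("model", LogisticRegression(max_iter=1000)),   # the final learner
    ])

    # The pre-processing is fitted inside each fold, avoiding data leakage.
    print(cross_val_score(pipeline, X, y, cv=5).mean())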

Study of Natural Convection in a Triangular Cavity Filled with Water: Application of the Lattice Boltzmann Method

The Lattice Boltzmann Method (LBM) with double populations is applied to solve the steady-state laminar natural convective heat transfer in a triangular cavity filled with water. The bottom wall is heated, the vertical wall is cooled, and the inclined wall is kept adiabatic. The buoyancy effect is modeled by applying the Boussinesq approximation to the momentum equation. The fluid velocity is determined by a D2Q9 LBM, and the energy equation is discretized by a D2Q4 LBM to compute the temperature field. Comparisons with previously published work are performed and found to be in excellent agreement. Numerical results are obtained for a wide range of parameters: the Rayleigh number from  to  and the inclination angle from 0° to 360°. Flow and thermal fields are exhibited by means of streamlines and isotherms. It is observed that the inclination angle can be used as a relevant parameter to control heat transfer in right-angled triangular enclosures.
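
For orientation, the sketch below shows the core D2Q9 collision-and-streaming update for the flow population only; the thermal D2Q4 population, the Boussinesq forcing term, and the boundary conditions of the triangular cavity are omitted, and all parameter values are illustrative:

    import numpy as np

    nx, ny, tau = 64, 64, 0.8                            # grid and relaxation time
    w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)         # D2Q9 weights
    e = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                  [1, 1], [-1, 1], [-1, -1], [1, -1]])   # lattice velocities
    f = np.ones((9, nx, ny)) * w[:, None, None]          # start from rest (rho = 1)

    def equilibrium(rho, u):
        eu = np.einsum("qd,dxy->qxy", e, u)
        usq = (u ** 2).sum(axis=0)
        return w[:, None, None] * rho * (1 + 3 * eu + 4.5 * eu ** 2 - 1.5 * usq)

    for step in range(100):
        rho = f.sum(axis=0)                              # macroscopic density
        u = np.einsum("qd,qxy->dxy", e, f) / rho         # macroscopic velocity
        f += (equilibrium(rho, u) - f) / tau             # BGK collision
        for q in range(9):                               # streaming step
            f[q] = np.roll(f[q], shift=e[q], axis=(0, 1))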

Trust and Security in Electronic Payments: What Do We Have and Need to Know?

The growth of open networks created commercial interest in exploiting them. The establishment of an electronic business mechanism must be accompanied by a digital electronic payment system to transfer the value of transactions. Financial organizations are asked to offer a secure e-payment synthesis with levels of trust and security equivalent to those of conventional paper-based payment transactions. The paper addresses the challenge of the first-trade problem in e-commerce, provides a brief literature review on electronic payment, and attempts to explain the underlying concept and method of trust in relation to electronic payment.

The Influence of User Involvement and Personal Innovativeness on User Behavior

The search for factors that influence user behavior has remained an important theme for both the academic and practitioner Information Systems communities. In this paper we examine relevant user behaviors in the phase after adoption and investigate two factors that are expected to influence such behaviors, namely User Involvement (UI) and Personal Innovativeness in IT (PIIT). We conduct a field study to examine how these factors influence post-adoption behavior and how they are interrelated. Building on theoretical premises and prior empirical findings, we propose and test two alternative models of the relationship between these factors. Our results reveal that the best explanation of post-adoption behavior is provided by the model in which UI and PIIT independently influence post-adoption behavior. Our findings have important implications for research and practice. To that end, we offer directions for future research.

Dimension Reduction of Microarray Data Based on Local Principal Component

The analysis and visualization of microarray data greatly assist biologists and clinicians in the diagnosis and treatment of patients. They allow clinicians to better understand the structure of microarrays and facilitate the understanding of gene expression in cells. However, a microarray dataset is complex, with thousands of features and a very small number of observations. Such very high dimensional data often contain noise, non-useful information, and only a small number of features relevant to a disease or genotype. This paper proposes a non-linear dimensionality reduction algorithm, Local Principal Component (LPC), which aims to map high dimensional data to a lower dimensional space. The reduced data represent the most important variables underlying the original data. Experimental results and comparisons are presented to show the quality of the proposed algorithm. Moreover, experiments also show how this algorithm reduces high dimensional data whilst preserving the neighbourhoods of the points in the low dimensional space as in the high dimensional space.
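
The paper's exact LPC algorithm is not reproduced here, but the sketch below illustrates one common reading of local PCA: partition the samples into local clusters and project each cluster onto its own principal components. The synthetic data and all parameters are illustrative:

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 2000))    # 60 samples, 2000 genes (synthetic)

    n_clusters, n_components = 3, 2
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=0).fit_predict(X)

    embeddings = np.zeros((X.shape[0], n_components))
    for c in range(n_clusters):
        members = labels == c
        # Each local neighbourhood gets its own low-dimensional projection.
        embeddings[members] = PCA(n_components=n_components).fit_transform(X[members])

    print(embeddings.shape)            # (60, 2)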

Evaluation of the Microbiological, Chemical and Sensory Quality of Carp Processed by the Sous Vide Method

This study evaluated the microbiological quality and the sensory characteristics of carp fillets processed by the sous vide method and stored at 2 and 10 °C. Four different sauce and storage combinations were studied, and the batches stored at 2 or 10 °C were periodically evaluated for sensory, microbiological, and chemical quality. Batches stored at 2 °C had lower growth rates of mesophiles and psychrotrophs. Moreover, these counts decreased with increasing heating temperature and time. Staphylococcus aureus, Bacillus cereus, Clostridium perfringens, and Listeria monocytogenes were not found in any of the samples. The heat treatment of 90 °C for 15 min with sauce was the most effective at ensuring the safety and extending the shelf-life of sous vide carp while preserving its sensory characteristics. This study establishes the microbiological quality of sous vide carp and emphasizes the relevance of the raw materials, heat treatment, and storage temperature to ensuring the safety of the product.

Constructing a Suitable Model of Distance Training for Community Leaders in the Upper Northeastern Region

This research aims to construct a suitable model of distance training for community leaders in the upper northeastern region of Thailand. The research process is divided into four steps: the first step analyzes the relevant documents; the second involves in-depth interviews with experts; the third constructs the model; and the fourth validates the model through expert assessment. The findings reveal two important components for constructing an appropriate model of distance training for community leaders in the upper northeastern region. The first component is the context of technology management, e.g., principles, policy, and goals. The second component can be viewed in two ways: on the one hand, it contains the elements of input, process, output, and feedback; on the other, its sub-components comprise the steps and processes of training. The expert assessments indicate that the researcher's constructed model is consistent, suitable, and, overall, the most appropriate.

Learning Styles of University Students in Bangkok: The Characteristics and the Relevant Instructional Context

The purposes of this study are 1) to identify the learning styles of university students in Bangkok, and 2) to study the frequency of the instructional context relevant to the identified learning styles. The learning styles employed in this study are those of Honey and Mumford: 1) Reflectors, 2) Theorists, 3) Pragmatists, and 4) Activists. The population comprises 1383 students and 5 lecturers. The research tools are two questionnaires: one used for identifying the students' learning styles, and the other for identifying the frequency of the instructional context relevant to the identified learning styles. The findings reveal that 32.30 percent of the students are Activists, 28.10 percent Theorists, 20.10 percent Reflectors, and 19.50 percent Pragmatists. In terms of the instructional context relevant to the four identified learning styles, the overall frequency of the instructional context is found to be at a high level. Moreover, the two activities conducted most frequently are a 'lead-in activity to review background knowledge' and an 'information retrieval report', and these two activities serve the learning styles of Theorists and Activists. It is therefore suggested that more instructional contexts supporting the Activists, the majority of the population, who learn best by doing, as well as emotional learning situations, should be added.

Unsupervised Outlier Detection in Streaming Data Using Weighted Clustering

Outlier detection in streaming data is very challenging because streaming data cannot be scanned multiple times and new concepts may keep evolving. Irrelevant attributes can be termed noisy attributes, and such attributes further magnify the challenge of working with data streams. In this paper, we propose an unsupervised outlier detection scheme for streaming data. The scheme is based on clustering, since clustering is an unsupervised data mining task that does not require labeled data; both density-based and partitioning clustering are combined for outlier detection. In this scheme, partitioning clustering is also used to assign adaptive weights to attributes according to their respective relevance. The weighted attributes help to reduce or remove the effect of noisy attributes. Keeping in view the challenges of streaming data, the proposed scheme is incremental and adaptive to concept evolution. Experimental results on synthetic and real world data sets show that our proposed approach outperforms an existing approach (CORM) in terms of outlier detection rate, false alarm rate, and increasing percentages of outliers.
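
The sketch below illustrates the attribute-weighting idea in a static, simplified form: weights are derived from how well each attribute separates the partitioning clusters, and points far from every weighted centroid are flagged. It is an illustration of the principle on synthetic data, not the incremental streaming scheme itself:

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    # Two inlier clusters separated only in the first two attributes;
    # attributes 2-4 carry no structure (they are noisy attributes).
    inliers = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(6, 1, (100, 5))])
    inliers[:, 2:] = rng.normal(0, 1, (200, 3))
    outliers = np.full((5, 5), 3.0) + rng.normal(0, 0.2, (5, 5))
    X = np.vstack([inliers, outliers])

    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(inliers)

    # Weight each attribute by how well it separates the clusters.
    within = np.array([inliers[km.labels_ == c].var(axis=0)
                       for c in range(2)]).mean(axis=0)
    between = km.cluster_centers_.var(axis=0)
    weights = between / (within + 1e-9)
    weights /= weights.sum()

    # Outlier score: weighted distance to the nearest cluster centroid.
    dists = np.array([np.sqrt(((X - c) ** 2 * weights).sum(axis=1))
                      for c in km.cluster_centers_])
    scores = dists.min(axis=0)
    print(np.where(scores > np.percentile(scores, 97))[0])  # flagged indices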

An Automatic Gridding and Contour Based Segmentation Approach Applied to DNA Microarray Image Analysis

DNA microarray technology is widely used by geneticists to diagnose and treat diseases through gene expression. The technology is based on the hybridization of a tissue's DNA sequence onto a substrate and the subsequent analysis of the image formed by the thousands of genes in the DNA as green, red, or yellow spots. The process of DNA microarray image analysis involves finding the locations of the spots and quantifying their expression levels. In this paper, a tool to perform DNA microarray image analysis is presented, including a spot addressing method based on the image projections, spot segmentation through contour-based segmentation, and the extraction of information relevant to gene expression.
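
A minimal sketch of projection-based spot addressing is given below: the image is summed along each axis, and the valleys between the peaks of the resulting profiles give the grid lines separating rows and columns of spots. The synthetic image and all parameters are illustrative:

    import numpy as np
    from scipy.signal import find_peaks

    # Synthetic 4x4 grid of bright spots on a dark background.
    img = np.zeros((200, 200))
    yy, xx = np.ogrid[:200, :200]
    for cy in range(25, 200, 50):
        for cx in range(25, 200, 50):
            img += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 40.0)

    row_profile = img.sum(axis=1)   # horizontal projection
    col_profile = img.sum(axis=0)   # vertical projection

    # Valleys in each profile separate adjacent rows/columns of spots.
    row_cuts, _ = find_peaks(-row_profile, distance=30)
    col_cuts, _ = find_peaks(-col_profile, distance=30)
    print("row grid lines:", row_cuts)
    print("col grid lines:", col_cuts)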