Learning Theories within the Coaching Process

These days we encounter many advertisements in magazines claiming that coaching is a pragmatic specialty that helps people make changes in their lives. To date, specialty coaches are not necessarily therapists, consultants, or psychologists, and thus they may not know psychological theories. The International Coach Federation identifies "facilitating learning and results" as one of its four core coach competencies; without an understanding of learning theories, coaching practice hangs in a theoretical abyss. Thus, the aim of this article is to investigate learning theories within the coaching process. To that end, I reviewed several cognitive and behavioral learning theories and analyzed their contribution to the coaching process as presented in the papers and books of mentor coaches and ICF-certified coaches. The results demonstrate that the coaching profession is strongly grounded in learning theories, and that it will be strengthened by the validation of theories and evidence-based research as we move forward. More research is therefore needed in order to apply effective theoretical frameworks.

Chase Trainer Exercise Program in Athletes with Unilateral Patellofemoral Pain Syndrome (PFPS)

We investigated the effects of a modified preprogrammed training mode, Chase Trainer, from the Balance Trainer system (BT3, HurLab, Tampere, Finland) on athletes who experienced unilateral Patellofemoral Pain Syndrome (PFPS). Twenty-seven athletes (mean age = 14.23 ± 1.31 years, height = 164.89 ± 7.85 cm, weight = 56.94 ± 9.28 kg) were randomly assigned to two groups: experimental (EG; n = 14) and injured (IG; n = 13). EG performed a series of Chase Trainer programs, which required them to shift their body weight in different directions, at different speeds and at different angles of leaning, twice a week for a duration of 8 weeks. Static postural control and perceived pain level measures were taken at baseline, after 6 weeks and after 8 weeks of training. There was no significant difference in any of the tested variables between EG and IG before and after the 6-week intervention period. However, after 8 weeks of training, the postural control (eyes open) and perceived pain level of EG improved compared to IG (p

Towards a Systematic, Cost-Effective Approach for ERP Selection

Existing experience indicates that one of the most prominent reasons some ERP implementations fail is the selection of an improper ERP package. Among the important factors resulting in inappropriate ERP selections, one is ignoring the preliminary activities that should be carried out before the evaluation of ERP packages. Another factor yielding these unsuitable selections is that organizations usually employ prolonged and costly selection processes, to such an extent that sometimes the process is never finalized, or the evaluation team performs many key final activities in an incomplete or inaccurate way due to exhaustion, lack of interest, or out-of-date data. In this paper, a systematic approach for choosing an ERP package is introduced that recommends activities to be performed before and after the main selection phase. In addition, the proposed approach incorporates ideas that accelerate the selection process while reducing the probability of an erroneous final selection.

Highly Scalable, Reversible and Embedded Image Compression System

A new method for low-complexity image coding is presented that permits different settings and great scalability in the generation of the final bit stream. The coder is a continuous-tone still-image compression system that combines lossy and lossless compression by making use of finite-arithmetic reversible transforms. Both the color-space transformation and the wavelet transformation are reversible. The transformed coefficients are coded by means of a coding system based on a subdivision into smaller components (CFDS), similar to bit-plane coding by importance. The subcomponents so obtained are reordered by means of a highly configurable alignment system that, depending on the application, makes it possible to reconfigure the elements of the image and to obtain different levels of importance from which the bit stream will be generated. The subcomponents of each importance level are coded using a variable-length entropy coding system (VBLm) that permits the generation of an embedded bit stream. This bit stream is itself a bit stream that codes a compressed still image. However, the use of a packing system on the bit stream after the VBLm stage allows a final, highly scalable bit stream to be produced from a basic image level and one or several enhancement levels.
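
The abstract does not state which reversible transforms are used; as a point of reference, the following is a minimal sketch of an integer-to-integer reversible color transform in the style of the JPEG 2000 RCT (an assumption, not necessarily the transform used in this coder), illustrating why a lossless path through the color transform is possible:

```python
import numpy as np

def rct_forward(rgb):
    """Forward reversible color transform (integer-to-integer, JPEG 2000-style RCT)."""
    r, g, b = (rgb[..., i].astype(np.int64) for i in range(3))
    y  = (r + 2 * g + b) >> 2   # luma approximation (floor division keeps integers)
    cb = b - g                  # chroma differences
    cr = r - g
    return np.stack([y, cb, cr], axis=-1)

def rct_inverse(ycc):
    """Inverse transform; recovers the original integer samples exactly."""
    y, cb, cr = (ycc[..., i] for i in range(3))
    g = y - ((cb + cr) >> 2)
    b = cb + g
    r = cr + g
    return np.stack([r, g, b], axis=-1)

if __name__ == "__main__":
    img = np.random.randint(0, 256, size=(4, 4, 3))
    assert np.array_equal(rct_inverse(rct_forward(img)), img)  # perfectly reversible
```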

Method for Concept Labeling Based on Mapping between Ontology and Thesaurus

When designing information systems that deal with a large amount of domain knowledge, system designers need to consider the ambiguities of labeling terms in the domain vocabulary when navigating users through the information space. The goal of this study is to develop a methodology that helps system designers label navigation items while taking into account ambiguities stemming from synonyms or polysemes of labeling terms. In this paper, we propose a method for concept labeling based on mappings between a domain ontology and a thesaurus, and we report the results of an empirical evaluation.
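
As an illustration of the kind of ambiguity the mapping has to resolve, here is a minimal sketch that flags polysemous and synonymous labeling terms from a term-to-concept mapping; the data structures and example terms are assumptions made for illustration, not the paper's actual ontology or thesaurus:

```python
# Minimal sketch, assuming a thesaurus given as term -> set of ontology concept IDs.
from collections import defaultdict

thesaurus = {
    "bank":   {"ont:FinancialInstitution", "ont:RiverBank"},  # polyseme: one label, two concepts
    "lender": {"ont:FinancialInstitution"},                   # synonym of "bank" (finance sense)
    "stream": {"ont:WaterCourse"},
}

def classify_labels(thesaurus):
    """Flag polysemous terms (one label, many concepts) and synonymous terms
    (many labels, one concept) so a designer can choose unambiguous navigation labels."""
    concept_to_terms = defaultdict(set)
    for term, concepts in thesaurus.items():
        for c in concepts:
            concept_to_terms[c].add(term)
    polysemes = {t for t, cs in thesaurus.items() if len(cs) > 1}
    synonyms = {c: ts for c, ts in concept_to_terms.items() if len(ts) > 1}
    return polysemes, synonyms

if __name__ == "__main__":
    print(classify_labels(thesaurus))
```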

A High Quality Speech Coder at 600 bps

This paper presents a vocoder that obtains high-quality synthetic speech at 600 bps. To reduce the bit rate, the algorithm is based on a sinusoidally excited linear prediction model that extracts few coding parameters; three consecutive frames are grouped into a superframe and jointly vector quantized to obtain high coding efficiency. The inter-frame redundancy is exploited with distinct quantization schemes for the different unvoiced/voiced frame combinations in the superframe. Experimental results show that the quality of the proposed coder is better than that of the 2.4 kbps LPC10e coder, is approximately the same as that of the 2.4 kbps MELP coder, and offers high robustness.
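
To make the superframe idea concrete, here is a minimal sketch of grouping three frames of parameters into a superframe vector and quantizing it jointly against a codebook; the parameter dimension and the random codebook are illustrative assumptions, whereas the actual coder uses trained codebooks tailored to each unvoiced/voiced combination:

```python
# Minimal sketch of superframe grouping and joint vector quantization.
import numpy as np

FRAMES_PER_SUPERFRAME = 3
PARAMS_PER_FRAME = 4          # assumed number of parameters per frame

def make_superframes(frame_params):
    """Stack consecutive frames into superframe vectors for joint quantization."""
    n = len(frame_params) // FRAMES_PER_SUPERFRAME * FRAMES_PER_SUPERFRAME
    return frame_params[:n].reshape(-1, FRAMES_PER_SUPERFRAME * PARAMS_PER_FRAME)

def vq_encode(superframes, codebook):
    """Return the index of the nearest codeword for each superframe vector."""
    dists = np.linalg.norm(superframes[:, None, :] - codebook[None, :, :], axis=-1)
    return dists.argmin(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = rng.normal(size=(9, PARAMS_PER_FRAME))      # 9 frames -> 3 superframes
    codebook = rng.normal(size=(64, FRAMES_PER_SUPERFRAME * PARAMS_PER_FRAME))
    print(vq_encode(make_superframes(frames), codebook))  # one 6-bit index per superframe
```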

Effective Keyword and Similarity Thresholds for the Discovery of Themes from the User Web Access Patterns

Clustering techniques have been used by many intelligent software agents to group similar access patterns of Web users into high-level themes that express users' intentions and interests. However, such techniques have mostly focused on one salient feature of the Web documents visited by the user, namely the extracted keywords. The major aim of these techniques is to come up with an optimal threshold for the number of keywords needed to produce more focused themes. In this paper we consider both keyword and similarity thresholds to generate more concentrated themes, and hence build a sounder model of user behavior. The purpose of this paper is twofold: to use distance-based clustering methods to recognize overall themes from the proxy log file, and to suggest efficient cut-off levels for the keyword and similarity thresholds that tend to produce more optimal clusters with better focus and more efficient size.
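
As a rough illustration of how the two thresholds interact, the following sketch clusters keyword vectors of user sessions with a greedy, distance-based pass: the keyword threshold fixes the vector length and the similarity threshold decides whether a session joins an existing theme. The vectors, threshold values, and the greedy scheme itself are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def cluster_sessions(vectors, sim_threshold=0.6):
    """Greedy single-pass clustering: a session joins the first cluster whose
    centroid is within the similarity threshold, otherwise it starts a new theme."""
    centroids, clusters = [], []
    for v in vectors:
        best = max(range(len(centroids)),
                   key=lambda i: cosine(v, centroids[i]), default=None)
        if best is not None and cosine(v, centroids[best]) >= sim_threshold:
            clusters[best].append(v)
            centroids[best] = np.mean(clusters[best], axis=0)
        else:
            clusters.append([v])
            centroids.append(v.astype(float))
    return clusters

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    top_k_keywords = 10                      # keyword threshold: keep top-k terms per session
    sessions = rng.random((20, top_k_keywords))
    print(len(cluster_sessions(sessions)))   # number of discovered themes
```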

Flow around Two Cam Shaped Cylinders in Tandem Arrangement

In this paper the flow around two cam-shaped cylinders in a tandem arrangement has been studied numerically. The equivalent diameter of the cylinders is 27.6 mm. The center-to-center distance between the two cam-shaped cylinders is defined as the longitudinal pitch ratio, which varies over a range starting at 2, while the Reynolds number varies over a range starting at 50.

MATLAB/SIMULINK Based Model of Single-Machine Infinite-Bus with TCSC for Stability Studies and Tuning Employing GA

Given constraints on data availability, for the study of power system stability it is adequate to model the synchronous generator with the field circuit and one equivalent damper winding on the q-axis, known as model 1.1. This paper presents a systematic procedure for the modelling and simulation of a single-machine infinite-bus power system installed with a thyristor controlled series compensator (TCSC), where the synchronous generator is represented by model 1.1, so that the impact of the TCSC on power system stability can be more reasonably evaluated. The model of the example power system is developed in MATLAB/SIMULINK and can be used for teaching power system stability phenomena, as well as for research work, especially the development of generator controllers using advanced technologies. Further, the parameters of the TCSC controller are optimized using a genetic algorithm. Non-linear simulation results are presented to validate the effectiveness of the proposed approach.
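
To indicate how a genetic algorithm can tune controller parameters of this kind, here is a minimal sketch in which each candidate is a pair of controller parameters and the fitness is a placeholder quadratic; in the actual study the fitness would be obtained by running the MATLAB/SIMULINK simulation of the SMIB system with the TCSC, and the bounds and operators below are illustrative assumptions:

```python
# Minimal sketch of genetic-algorithm tuning of controller gains.
import numpy as np

rng = np.random.default_rng(42)
BOUNDS = np.array([[0.0, 100.0], [0.0, 1.0]])   # assumed bounds for [gain, time constant]

def fitness(params):
    """Placeholder objective: smaller is better (e.g. a time-weighted error index
    of the rotor-angle deviation computed from the simulation)."""
    k, t = params
    return (k - 40.0) ** 2 + 50.0 * (t - 0.3) ** 2

def ga(pop_size=30, generations=50, mutation=0.1):
    pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(pop_size, 2))
    for _ in range(generations):
        scores = np.apply_along_axis(fitness, 1, pop)
        parents = pop[np.argsort(scores)[: pop_size // 2]]                 # selection
        children = parents[rng.integers(0, len(parents), pop_size - len(parents))]
        children = children + rng.normal(0, mutation, children.shape) * (BOUNDS[:, 1] - BOUNDS[:, 0])
        pop = np.clip(np.vstack([parents, children]), BOUNDS[:, 0], BOUNDS[:, 1])
    return pop[np.argmin(np.apply_along_axis(fitness, 1, pop))]

if __name__ == "__main__":
    print("tuned parameters:", ga())
```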

eLearning Tools Evaluation based on Quality Concept Distance Computing. A Case Study

Despite the extensive use of eLearning systems, there is no consensus on a standard framework for evaluating the quality of this kind of system. Hence, there is only a minimal set of tools that can support this judgment and give information about the value of the course content. This paper presents two kinds of quality evaluation indicators for eLearning courses based on the computation of three well-known metrics: the Euclidean, Hamming and Levenshtein distances. The "distance" calculation is applied to standard evaluation templates (i.e. the European Commission Programme procedures vs. the AFNOR Z 76-001 standard), determining a reference point in the evaluation of e-learning course quality with respect to the optimal concept(s). The case study, based on the results of projects developed in the framework of the European Programme "Leonardo da Vinci" with Romanian contractors, tries to demonstrate the benefits of such a method.
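
For reference, a minimal sketch of the three distances is given below; the score vectors, checklist answers and labels in the example are illustrative, not data from the evaluated templates:

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def hamming(a, b):
    """Number of positions at which two equal-length sequences differ."""
    return sum(x != y for x, y in zip(a, b))

def levenshtein(a, b):
    """Edit distance computed by dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (x != y)))
        prev = cur
    return prev[-1]

if __name__ == "__main__":
    print(euclidean([3, 4, 5], [4, 4, 3]))      # numeric quality scores
    print(hamming("YNYNY", "YNNNY"))            # yes/no checklist answers
    print(levenshtein("quality", "qualities"))  # textual concept labels
```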

The Patterns of Unemployment and the Geography of Social Housing

During the last few decades, the academic debate on the effects of social geography on the opportunities for socioeconomic integration has intensified. On one hand, it has been discussed how the urban structure and social geography affect not only the way people interact, but also their chances of social and economic integration. On the other hand, it has also been discussed how the urban structure is itself constrained and transformed by the action of social actors. Without questioning the powerful influence of structural factors related to the logic of the production system, labor markets, education and training, research has shown the role played by place of residence in shaping individual outcomes such as unemployment. In the context of this debate, the importance of the territory of residence with respect to the problem of unemployment has been highlighted. Although unemployment statistics have already demonstrated the unequal incidence of the phenomenon across social groups, the issue of its uneven territorial impact at the intra-urban level remains relatively unexplored. The purpose of this article is to show and to interpret the spatial patterns of unemployment in the city of Porto using Geographic Information System (GIS) technology. In this analysis, overlaying the spatial patterns of unemployment with the spatial distribution of social housing allows a discussion of the relationship between these patterns and of the reasons that might explain the relative immutability of socioeconomic problems in some neighborhoods.

An Enhancement of the Energy Effectiveness of Convectors Used for Heating or Cooling

The objective of this paper is to present a research study of convectors that are used for heating or cooling living rooms or industrial halls. The key points are experimental measurement and comprehensive numerical simulation of the flow passing through parts of the convector such as the heat exchanger, the input from the fan, etc. Based on the obtained results, the components of the convector are optimized in order to increase thermal power efficiency through improved heat convection or reduced air drag friction. Both optimized aspects lead to more effective service conditions and to energy savings. A significant part of the convector research is the design of a unique measurement laboratory and the adoption of suitable measurement techniques. The new laboratory provides the possibility to measure thermal power efficiency and other relevant parameters under the specific service conditions of the convectors.

Adaptive Kernel Principal Component Analysis for Online Feature Extraction

Their batch nature limits standard kernel principal component analysis (KPCA) methods in numerous applications, especially for dynamic or large-scale data. In this paper, an efficient adaptive approach is presented for the online extraction of kernel principal components (KPC). The contribution of this paper may be divided into two parts. First, the kernel covariance matrix is correctly updated to adapt to the changing characteristics of the data. Second, the KPC are recursively formulated to overcome the batch nature of standard KPCA. This formulation is derived from the recursive eigen-decomposition of the kernel covariance matrix and indicates the KPC variation caused by the new data. The proposed method not only alleviates the sub-optimality of the KPCA method for non-stationary data, but also maintains constant update speed and memory usage as the data size increases. Experiments on simulated data and real applications demonstrate that our approach yields improvements in terms of both computational speed and approximation accuracy.
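
For context, the sketch below shows the batch KPCA computation (kernel matrix centering followed by a full eigen-decomposition) whose repeated re-evaluation on growing data the adaptive method is designed to avoid; the RBF kernel, its width and the data are illustrative assumptions, and the paper's recursive update itself is not reproduced here:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    sq = np.sum(X ** 2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

def batch_kpca(X, n_components=2, gamma=1.0):
    """Center the kernel matrix in feature space and return the projections of the
    training data onto the leading kernel principal components."""
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one          # double centering
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))  # normalized eigenvectors
    return Kc @ alphas

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    print(batch_kpca(X).shape)   # (100, 2)
```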

WLAN Positioning Based on Joint TOA and RSS Characteristics

WLAN positioning has been addressed by many approaches in the literature using the characteristics of Received Signal Strength (RSS), Time of Arrival (TOA) or Time Difference of Arrival (TDOA), Angle of Arrival (AOA), and cell ID. Among these, the RSS approach is the simplest to implement because no modification of either access points or client devices is needed, but its accuracy is poor because of the physical environment. For the TOA or TDOA approach, the accuracy is quite acceptable, but most studies have had to modify either software or hardware in the existing WLAN infrastructure; the scale of these modifications ranges from the access card alone up to changes in the WLAN protocol. Hence, TOA or TDOA alone is an unattractive approach for a positioning system. In this paper, a new concept of merging both RSS and TOA positioning techniques is proposed. In addition, a method is presented for obtaining the TOA characteristic for positioning a WLAN user without any extra modification to the existing system. The measurement results confirm that the proposed technique, using both RSS and TOA characteristics, provides better accuracy than using either the RSS or the TOA approach alone.
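
One straightforward way to combine the two characteristics is to convert both RSS and TOA measurements into range estimates and solve a common least-squares position fit; the sketch below illustrates this idea, with the path-loss parameters, access-point coordinates and measurements all being illustrative assumptions rather than the calibration or fusion method actually used in the paper:

```python
import numpy as np
from scipy.optimize import least_squares

C = 3e8                         # speed of light, m/s
P0, N_EXP = -40.0, 3.0          # assumed RSS at 1 m (dBm) and path-loss exponent

def rss_to_range(rss_dbm):
    """Log-distance path-loss model inverted to a distance estimate."""
    return 10 ** ((P0 - rss_dbm) / (10 * N_EXP))

def toa_to_range(toa_s):
    return C * toa_s

def position(aps, ranges):
    """Solve for (x, y) minimizing the residuals against all range estimates."""
    def residuals(p):
        return np.linalg.norm(aps - p, axis=1) - ranges
    return least_squares(residuals, x0=aps.mean(axis=0)).x

if __name__ == "__main__":
    aps = np.array([[0.0, 0.0], [20.0, 0.0], [0.0, 20.0]])
    ranges = np.array([rss_to_range(-63.0),      # AP1 range from RSS
                       toa_to_range(40e-9),      # AP2 range from TOA (~12 m)
                       rss_to_range(-58.0)])     # AP3 range from RSS
    print(position(aps, ranges))
```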

Cash Flow Optimization on Synthetic CDOs

Collateralized Debt Obligations (CDOs) are not as widely used today as they were before the 2007 subprime crisis. Nonetheless, there remains an enthralling challenge: optimizing the cash flows associated with synthetic CDOs. A Gaussian-based model is used here, in which the default correlation and the unconditional probabilities of default are highlighted. Numerous simulations are then performed on the basis of this model for different scenarios, in order to evaluate the associated cash flows given a specific number of defaults at different periods of time. Cash flows are not calculated on a single bought or sold tranche alone, but rather on a combination of bought and sold tranches. Under some assumptions, the simplex algorithm gives a way to find the maximum cash flow according to the correlation of defaults and the maturities. The Gaussian model used is not realistic in crisis situations. Moreover, the present system does not handle buying or selling a portion of a tranche, only the whole tranche. Nevertheless, the work provides the investor with relevant elements on what to buy and sell, and when.
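
To show the kind of Gaussian-based default simulation the cash-flow evaluation rests on, here is a minimal sketch of a one-factor Gaussian copula generating correlated defaults from unconditional default probabilities; the portfolio size, correlation and probabilities are illustrative assumptions, not the paper's calibration:

```python
import numpy as np
from scipy.stats import norm

def simulate_defaults(n_names=100, p_default=0.02, rho=0.3, n_sims=10000, seed=0):
    """Return the number of defaults per simulated scenario."""
    rng = np.random.default_rng(seed)
    threshold = norm.ppf(p_default)                    # default if latent variable < threshold
    m = rng.standard_normal((n_sims, 1))               # common (systemic) factor
    e = rng.standard_normal((n_sims, n_names))         # idiosyncratic factors
    latent = np.sqrt(rho) * m + np.sqrt(1 - rho) * e
    return (latent < threshold).sum(axis=1)

if __name__ == "__main__":
    defaults = simulate_defaults()
    # The distribution of defaults drives the loss on each tranche and hence its cash flows.
    print("mean defaults:", defaults.mean(), " 99th percentile:", np.percentile(defaults, 99))
```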

On the Use of Semiconductor Detector Arrays on the COMPASS Tokamak

Semiconductor detector arrays are widely used in high-temperature plasma diagnostics. They have a fast response, which allows the observation of many processes and instabilities in tokamaks. In this paper, several diagnostics based on semiconductor arrays are reviewed: cameras, AXUV photodiodes (often referred to as fast "bolometers"), and detectors of both soft X-rays and visible light, recently installed on the COMPASS tokamak. Fresh results from the spring and summer campaigns of 2012 are introduced. Examples of the utilization of the detectors are shown for plasma shape determination, fast calculation of the radiation center, two-dimensional plasma radiation tomography in different spectral ranges, observation of impurity inflow, and the investigation of MHD activity in COMPASS tokamak discharges.

Tele-Diagnosis System for Rural Thailand

Thailand's health system is challenged by the rising number of patients and the decreasing ratio of medical practitioners to patients, especially in rural areas. This may tempt inexperienced GPs to rush through the anamnesis, with the risk of incorrect diagnosis. Patients have to travel far to the hospital and wait a long time to present their case. Many patients try to cure themselves with traditional Thai medicine. Many countries are making use of the Internet for gathering, distributing and storing medical information. Telemedicine applications are a relatively new field of study in Thailand; the state of the ICT infrastructure had hampered widespread use of the Internet for medical information. With the recent improvements, health and technology professionals can work out novel applications and systems to help advance telemedicine for the benefit of the people. Here we explore the use of telemedicine for people with health problems in rural areas of Thailand and present a Telemedicine Diagnosis System for Rural Thailand (TEDIST) for diagnosing certain conditions, which people with Internet access can use to establish contact with community health centers, e.g. by mobile phone. The system uses a Web-based input method for individual patients' symptoms, which are passed to an expert system for the analysis of conditions and likely diseases. The analysis harnesses a knowledge base and a backward-chaining component to find out which health professionals should be presented with the case. Doctors have the opportunity to exchange emails or chat with the patients they are responsible for, or with other specialists. Patients' data are then stored in a Personal Health Record.
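
To illustrate the backward-chaining component, here is a minimal sketch of backward chaining over a tiny rule base that maps reported symptoms to a referral decision; the rules and symptoms are illustrative placeholders and not TEDIST's knowledge base:

```python
RULES = {
    "refer_to_dermatologist": [["persistent_rash", "itching"]],
    "refer_to_gp":            [["fever", "cough"], ["headache", "fatigue"]],
}

def backward_chain(goal, facts, rules=RULES):
    """Return True if the goal can be proven from the reported symptoms (facts)."""
    if goal in facts:
        return True
    for conditions in rules.get(goal, []):
        if all(backward_chain(c, facts, rules) for c in conditions):
            return True
    return False

if __name__ == "__main__":
    reported = {"fever", "cough"}
    for goal in RULES:
        if backward_chain(goal, reported):
            print("suggest:", goal)     # -> suggest: refer_to_gp
```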

Performance of Soft Handover Algorithm in Varied Propagation Environments

CDMA cellular networks support soft handover, which guarantees the continuity of wireless services and enhanced communication quality. Cellular networks support multimedia services under varied propagation conditions. In this paper, we show the effect of the characteristic parameters of the cellular environment on soft handover performance. We consider the path-loss exponent, the standard deviation of shadow fading, and the correlation coefficient of shadow fading as the characteristic parameters of the radio propagation environment. A very useful statistical measure for characterizing the performance of a mobile radio system is the probability of outage. Numerical results show that the above parameters have a decisive effect on the probability of outage and hence on the overall performance of the soft handover algorithm.
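
As a reference point for how these parameters enter the outage probability, the sketch below computes P(received power < threshold) under a log-distance path-loss model with log-normal shadowing; the transmit power, threshold, path-loss exponent and shadowing spread are illustrative assumptions, and the shadowing correlation between links considered in the paper is not modeled here:

```python
import math

def mean_rx_power_dbm(p_tx_dbm, d_m, n_exp, d0_m=1.0, pl0_db=40.0):
    """Log-distance path-loss model: mean received power at distance d."""
    return p_tx_dbm - pl0_db - 10.0 * n_exp * math.log10(d_m / d0_m)

def outage_probability(p_tx_dbm, d_m, threshold_dbm, n_exp=3.5, sigma_db=8.0):
    """P(received power < threshold) with zero-mean Gaussian shadowing of sigma dB:
    Q((mean - threshold) / sigma), expressed via the complementary error function."""
    mean = mean_rx_power_dbm(p_tx_dbm, d_m, n_exp)
    return 0.5 * math.erfc((mean - threshold_dbm) / (sigma_db * math.sqrt(2.0)))

if __name__ == "__main__":
    # Outage grows with the path-loss exponent (and with the shadowing spread).
    for n in (3.0, 3.5, 4.0):
        print(n, outage_probability(p_tx_dbm=30.0, d_m=500.0, threshold_dbm=-100.0, n_exp=n))
```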

Investigation of VMAT Algorithms and Dosimetry

Purpose: The planning and dosimetry of different VMAT algorithms (SmartArc, Ergo++, Autobeam) are compared with IMRT for head and neck cancer patients. Modelling was performed to rule out the causes of discrepancies between planned and delivered dose. Methods: Five HNC patients previously treated with IMRT were re-planned with SmartArc (SA), Ergo++ and Autobeam. Plans were compared with each other and against IMRT, and evaluated using DVHs for PTVs and OARs, delivery time, monitor units (MU) and dosimetric accuracy. Modelling of control point (CP) spacing, leaf-end separation and MLC/aperture shape was performed to rule out causes of discrepancies between planned and delivered doses. Additionally, estimated arc delivery times, overall plan generation times, and the effect of CP spacing and number of arcs on plan generation times were recorded. Results: Single-arc SmartArc plans (SA4d) were generally better than IMRT and double-arc plans (SA2Arcs) in terms of homogeneity and target coverage. Double-arc plans seemed to play a positive role in achieving an improved Conformity Index (CI) and better sparing of some Organs at Risk (OARs) compared to step-and-shoot IMRT (ss-IMRT) and SA4d. Overall, Ergo++ plans achieved the best CI for both PTVs. Dosimetric validation of all VMAT plans without modelling was found to be lower than that of ss-IMRT. Total MUs required for delivery were on average 19%, 30%, 10.6% and 6.5% lower than ss-IMRT for the SA4d, SA2d (single arc with 2° gantry spacing), SA2Arcs and Autobeam plans respectively. Autobeam was the most efficient in terms of actual treatment delivery times, whereas Ergo++ plans took the longest to deliver. Conclusion: Overall, SA single-arc plans on average achieved the best target coverage and homogeneity for both PTVs. SA2Arc plans showed improved CI and sparing of some OARs. Very good dosimetric results were achieved with modelling. Ergo++ plans achieved the best CI. Autobeam resulted in the fastest treatment delivery times.

An Interactive Tool for Teaching and Learning English at Upper Primary Level for Mauritius

E-learning refers to the specific kind of learning experienced within the domain of educational technology, which can be used in or out of the classroom. In this paper, we give an overview of an e-learning platform, 'An Innovative Interactive and Online English Platform for Upper Primary Students', an interactive web-based application that will serve as an aid to primary school students in Mauritius. The objectives of this platform are to offer quality learning resources for the English subject at the primary level of education, to encourage self-learning, and hence to promote e-learning. The platform developed consists of several interesting features, for example the English Verb Conjugation tool, the Negative Form tool, the Interrogative Form tool and the Cloze Test Generator. This learning platform will thus be useful at a time when our country is looking for an alternative to private tuition and also looking forward to increasing the pass rate.