An Integrated Natural Language Processing Approach for Conversation System

The main aim of this research is to investigate a novel technique for implementing a more natural and intelligent conversation system. Conversation systems are designed to converse like a human, as far as their intelligence allows; they can be seen as the embodiment of Turing's vision. They usually return predetermined answers in a predetermined order, yet real conversations abound with uncertainties of various kinds. This research focuses on an integrated natural language processing approach comprising an integrated knowledge-base construction module, a conversation understanding and generation module, and a state manager module. We discuss the effectiveness of this approach based on an experiment.

A Hybrid Ontology-Based Approach for Ranking Documents

The growing volume of information on the Internet creates an increasing need for new (semi-)automatic methods to retrieve documents and rank them according to their relevance to the user query. In this paper, after a brief review of ranking models, a new ontology-based approach for ranking HTML documents is proposed and evaluated in various circumstances. Our approach combines conceptual, statistical and linguistic methods, preserving the precision of ranking without losing speed. It exploits natural language processing techniques to extract phrases from documents and the query and to stem words. An ontology-based conceptual method is then used to annotate documents and expand the query. For query expansion, the spread activation algorithm is improved so that the expansion can be done flexibly and along various aspects. The annotated documents and the expanded query are processed with statistical methods to compute the relevance degree. The outstanding features of our approach are (1) combining conceptual, statistical and linguistic features of documents, (2) expanding the query with its related concepts before comparing it to documents, (3) extracting and using both words and phrases to compute the relevance degree, (4) improving the spread activation algorithm to perform the expansion based on a weighted combination of different conceptual relationships, and (5) allowing variable document vector dimensions. A ranking system called ORank was developed to implement and test the proposed model. The test results are included at the end of the paper.
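As a hedged illustration of the query-expansion step, the sketch below implements a generic weighted spread activation over a small concept graph. The concept names, relation-type weights, decay and threshold are illustrative assumptions, not the actual ORank ontology or parameters.

```python
# Hypothetical sketch of weighted spread activation for query expansion.
# Graph, relation weights, decay and threshold are illustrative only.

def spread_activation(graph, seeds, relation_weights, decay=0.5, threshold=0.1):
    """graph: {concept: [(neighbor, relation), ...]}.
    Seed query concepts start with activation 1.0 and activation spreads
    outward, scaled per relation type, until it falls below the threshold."""
    activation = {c: 1.0 for c in seeds}
    frontier = list(seeds)
    while frontier:
        current = frontier.pop()
        for neighbor, relation in graph.get(current, []):
            # Propagate activation scaled by the relation-type weight.
            spread = activation[current] * decay * relation_weights[relation]
            if spread > activation.get(neighbor, 0.0) and spread >= threshold:
                activation[neighbor] = spread
                frontier.append(neighbor)
    return activation

graph = {
    "car": [("vehicle", "is-a"), ("wheel", "part-of")],
    "vehicle": [("transport", "is-a")],
}
weights = {"is-a": 0.9, "part-of": 0.6}
expanded = spread_activation(graph, ["car"], weights)
print(sorted(expanded))
```

Weighting relations differently mirrors feature (4) above: an is-a link can contribute more activation than, say, a part-of link during expansion.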

A New Measure of Herding Behavior: Derivation and Implications

If price and quantity are the fundamental building blocks of any theory of market interactions, the importance of trading volume in understanding the behavior of financial markets is clear. However, while many economic models of financial markets have been developed to explain the behavior of prices (predictability, variability, and information content), far less attention has been devoted to explaining the behavior of trading volume. In this article, we hope to expand our understanding of trading volume by developing a new measure of herding behavior based on the cross-sectional dispersion of volume betas. We apply our measure to the Toronto Stock Exchange using monthly data from January 2000 to December 2002. Our findings show that the herding phenomenon consists of three essential components: stationary herding, intentional herding and feedback herding.
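A minimal sketch of the core statistic, under stated assumptions: each stock's volume beta is estimated by an OLS regression of its volume on aggregate market volume, and the herding proxy is the cross-sectional dispersion of those betas. The synthetic data and the exact estimator are illustrative, not the paper's specification.

```python
# Illustrative sketch: cross-sectional dispersion of volume betas as a
# herding proxy. Synthetic data; not the paper's exact estimator.
import numpy as np

def volume_betas(stock_volumes, market_volume):
    """OLS slope of each stock's (demeaned) volume on market volume."""
    x = market_volume - market_volume.mean()
    betas = []
    for v in stock_volumes:
        y = v - v.mean()
        betas.append((x @ y) / (x @ x))
    return np.array(betas)

def herding_measure(stock_volumes, market_volume):
    # Lower cross-sectional dispersion of betas means individual volumes
    # co-move more tightly with the market, suggesting stronger herding.
    return np.std(volume_betas(stock_volumes, market_volume))

rng = np.random.default_rng(0)
market = rng.lognormal(size=36)  # 36 monthly observations
stocks = [2.0 * market + rng.normal(scale=0.1, size=36) for _ in range(5)]
print(herding_measure(stocks, market))
```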

Performance Prediction of Multi-Agent Based Simulation Applications on the Grid

A major requirement for Grid application developers is ensuring the performance and scalability of their applications. Predicting the performance of an application demands understanding its specific features. This paper discusses performance modeling and prediction of multi-agent based simulation (MABS) applications on the Grid. An experiment conducted using a synthetic MABS workload identifies the key features to be included in the performance model. The results show that the prediction model developed for the synthetic workload can serve as a guideline for estimating the performance characteristics of real-world simulation applications.

Performance Analysis of HSDPA Systems Using Low-Density Parity-Check (LDPC) Coding as Compared to Turbo Coding

HSDPA is a new feature introduced in the Release-5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. The HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay compared to the Release-99 DSCH. To date, HSDPA systems have used turbo coding, one of the best coding techniques for approaching the Shannon limit. However, the main drawbacks of turbo coding are high decoding complexity and high latency, which make it unsuitable for some applications such as satellite communications, where the transmission distance itself introduces latency due to the finite speed of light. Hence, this paper proposes using LDPC coding in place of turbo coding for HSDPA, which decreases both the latency and the decoding complexity, at the cost of increased encoding complexity. Although the transmitter complexity at the NodeB increases, the end user benefits in terms of receiver complexity and bit error rate. In this paper, the LDPC encoder is implemented using a sparse parity-check matrix H to generate codewords, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that with LDPC coding the BER drops sharply as the number of iterations increases, given a small increase in Eb/No, which is not possible with turbo coding. The same BER was also achieved with fewer iterations, so the latency and receiver complexity are lower with LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better-quality, more reliable and more robust data services. In other words, while realistic data rates are only a few Mbps, the actual quality and number of users served will improve significantly.
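The paper decodes with belief propagation; as a minimal, hedged illustration of iterative parity-check decoding, the sketch below uses hard-decision bit flipping (a simpler relative of belief propagation) over a toy H matrix. The matrix and the single-error channel are assumptions for illustration, not the sparse H of the HSDPA simulation.

```python
# Toy illustration of LDPC-style iterative decoding. H is a small example
# parity-check matrix, and bit flipping stands in for belief propagation.
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def bit_flip_decode(received, H, max_iters=10):
    word = received.copy()
    for _ in range(max_iters):
        syndrome = H @ word % 2
        if not syndrome.any():
            return word  # all parity checks satisfied
        # Flip the bit that participates in the most failed checks.
        failures = H[syndrome == 1].sum(axis=0)
        word[np.argmax(failures)] ^= 1
    return word

codeword = np.zeros(6, dtype=int)   # the all-zero codeword is always valid
received = codeword.copy()
received[2] ^= 1                    # inject a single bit error
print(bit_flip_decode(received, H))
```

Each iteration either satisfies all checks or flips one suspect bit, which is why the BER improves as the iteration count grows, as the simulations above report for belief propagation.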

The Cinema in Turkey During the 1940s

The cinema in Turkey during the 1940s was shaped under the conditions of the Second World War. Amateur filmmakers from different socioeconomic roots experienced movie production in those years. Despite having similar socioeconomic characteristics and autobiographies, each of them had a different understanding of cinema. Nevertheless, they joined in making movies that address native culture and audiences, narrating indigenous stories with native music, amateur players and simple settings. Despite martial law, censorship and economic deficiencies, they started producing films during the Second World War. The cinematographers of the 1940s are usually called the 'transition period' cinematographers in Turkey, producing in the passage between the period of the 'theatre players' and the period of the 'national cinema'. However, the 1940-1950 period of Turkish cinema should be defined not as a transition but as a period in which professional consciousness in cinema was formed.

Observation of the Correlations between Pairwise Interaction and Functional Organization of the Proteins in the Protein Interaction Network of Saccharomyces cerevisiae

Understanding the cell's large-scale organization is an interesting task in computational biology, and protein-protein interactions can reveal important aspects of the organization and function of the cell. Here, we investigated the correspondence between protein interactions and function in yeast. We obtained the correlations among a set of proteins and then clustered them using both hierarchical and biclustering methods. A detailed analysis of the proteins in each cluster was carried out using their functional annotations. As a result, we found that some functional classes appear together in almost all biclusters, whereas in hierarchical clustering the dominance of one functional class is observed. In brief, going from interaction data to function, correlated results are noted about the relationship between interaction and function, which might give clues about the organization of the proteins.
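A hedged sketch of the hierarchical clustering step: build a correlation matrix over protein profiles, convert it to a distance matrix, and cut the dendrogram into groups. The synthetic profiles below stand in for the yeast interaction data, and the linkage method and cluster count are illustrative choices.

```python
# Illustrative sketch: hierarchical clustering of a protein correlation
# matrix. Synthetic profiles stand in for the actual yeast dataset.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
group_a = rng.normal(size=20)
group_b = rng.normal(size=20)
# Six "proteins": three noisy copies of each underlying profile.
profiles = np.array([group_a + rng.normal(scale=0.1, size=20) for _ in range(3)]
                    + [group_b + rng.normal(scale=0.1, size=20) for _ in range(3)])

corr = np.corrcoef(profiles)              # pairwise protein correlations
dist = 1.0 - corr                         # convert similarity to distance
np.fill_diagonal(dist, 0.0)
condensed = squareform(dist, checks=False)
labels = fcluster(linkage(condensed, method="average"), t=2, criterion="maxclust")
print(labels)  # proteins derived from the same profile share a label
```

Each resulting cluster would then be inspected against functional annotations, as described above.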

Landscape Visual Classification Using Land Use and Contour Data for Tourism and Planning Decision Making in Cameron Highlands District

Cameron Highlands (CH) is known as an upland tourism area with vast natural wealth: a mountainous landscape endowed with rich, diverse species as well as people's traditions and cultures. With these various resources, CH offers interesting views and panoramas to tourists. However, this benefit cannot be utilized without an understanding of the existing landscape structure and visual character. Given limited data, this paper attempts to classify the landscape visual character of Cameron Highlands using land use and contour data. Viewpoints were determined from the tourist attraction points given in the CH Local Plan 2003-2015. The result shows the landscape visual and structure categories offered in the study area, and it can be used for further analysis to determine the best alternative tourist trails for tourism planning and decision making using readily available data.

Roles and Responsibilities for the Success of an IT Project in an Organization

Many IT projects fail because of an overly technical approach, a focus on the final product, and a lack of proper attention to strategic alignment. Project management models quite often take a technical management view [4], [8], [13], [14]. These models focus greatly on finalizing the project product and delivering it to the customer. However, many project problems are due to a lack of attention to the needs and capabilities of the organization, or to disregarding how the product will be deployed and used in the organization. In this regard, the current research presents a solution aimed at raising the value of the project in an organization, so that the project outputs are properly deployed. Therefore, a comprehensive model is presented that covers all processes from the initial step of project definition to the deployment of the final outputs in the organization, together with the definition of all roles and responsibilities needed to put the model into practice. Drawing on the opinions of experts and project managers to validate the model's performance, project problems were identified and then categorized and analyzed based on the model. Finally, it is shown that neglecting the proper definition of the project, lacking a proper understanding of the expected value, and failing to supervise the value that emerges during production and installation are among the most important factors that bring a project to failure.

Hospital Administration for Humanized Healthcare in Thailand

Since the emergence of "Humanized Healthcare", introduced by Professor Dr. Prawase Wasi in 2003 [1], this paradigm has tended to be widely implemented. Organizations including the Healthcare Accreditation Institute (a public organization), the National Health Foundation, Mahidol University in cooperation with the Thai Health Promotion Foundation, and the National Health Security Office (Thailand) selected hospitals and infirmaries qualified for humanized healthcare from 2008 to 2010, and 35 of them were chosen as outstanding navigating organizations for the development of humanized healthcare, receiving the humanized healthcare award [2]. This research aims to study the current issues, characteristics and patterns of hospital administration contributing to a humanized healthcare system in Thailand. The selected case studies come from four hospitals: Dansai Crown Prince Hospital, Loei; Ubolrattana Hospital, Khon Kaen; Kapho Hospital, Pattani; and Prathai Hospital, Nakhon Ratchasima. The methodology is in-depth interviews with 10 staff members per hospital, comprising hospital executive directors and representatives from leader groups including directors, multidisciplinary hospital committees, personnel development committees, physicians and nurses (total = 40). In addition, focus group discussions between hospital staff and the general public (including patients and their relatives, community leaders, and others) were held in four groups of eight people per hospital (total = 128). Observation of the work in each hospital was also carried out.
The findings of the study reveal five important aspects found in each hospital: (1) quality improvement under a mental and spiritual development policy from the chief executives and lead teams, with leaders acting as role models and showing visionary leadership; (2) a participatory hospital administration system focusing on learning processes and stakeholder needs, with spiritual human resource management and development; (3) relationships among people, especially staff, including teamwork skills, mutual understanding, effective communication and personal inner development; (4) an organizational culture attuned to the awareness of patients' rights, together with a participation policy encompassing spiritual growth toward shared goals, shared vision, a developed public mind, and caring; and (5) healing structures and environments providing warmth and convenience for hospital staff, patients and their relatives and visitors.

Power Quality Improvement Using PI and Fuzzy Logic Controllers Based Shunt Active Filter

In recent years, the large-scale use of power electronic equipment has led to an increase of harmonics in the power system. Harmonics result in poor power quality and have a great adverse economic impact on utilities and customers. Current harmonics are one of the most common power quality problems and are usually mitigated with a shunt active filter (SHAF). The main objective of this work is to develop PI and fuzzy logic controllers (FLC) and to analyze the performance of the shunt active filter in mitigating current harmonics under balanced and unbalanced sinusoidal source voltage conditions, for both normal and increased load. When the supply voltages are ideal (balanced), the PI controller and the FLC converge to the same compensation characteristics. However, when the supply voltages are non-ideal (unbalanced), the FLC offers outstanding results. Simulation results validate the superiority of the FLC with a triangular membership function over the PI controller.
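As a minimal, hedged sketch of the PI side of the comparison, the loop below regulates a DC-link-like voltage with a discrete PI controller against a toy first-order plant. The gains, setpoint and plant model are illustrative assumptions, not the paper's simulated SHAF.

```python
# Minimal sketch of a discrete PI control loop of the kind used to
# regulate a shunt active filter's DC-link voltage. Gains, setpoint
# and the first-order plant are illustrative assumptions.

class PIController:
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, error):
        # Accumulate the integral term and combine with the proportional term.
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

pi = PIController(kp=0.5, ki=5.0, dt=1e-3)
setpoint, voltage = 700.0, 650.0          # target and measured DC-link volts
for _ in range(5000):
    control = pi.update(setpoint - voltage)
    voltage += 0.01 * control             # toy first-order plant response
print(round(voltage, 1))
```

An FLC would replace the `update` rule with rules over fuzzified error and error-change, which is what gives it the edge under unbalanced supply conditions reported above.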

Visualization of Code Clone Detection Results and the Implementation with Structured Data

This paper describes a code clone visualization method, called the FC graph, and its implementation issues. Code clone detection tools usually present their results in a textual representation; when the results are large, software maintainers have difficulty understanding them. One approach to overcoming this is to visualize the code clone detection results. A scatter plot is a popular visualization, but it represents only one-to-one correspondence, and it is difficult to find correspondences of code clones across multiple files. The FC graph represents correspondences among files, code clones and packages in Java. All nodes in the FC graph are positioned using a force-directed graph layout, which dynamically adjusts the distances between nodes until they stabilize. We applied the FC graph to some open source programs and visualized the results. In the authors' experience, the FC graph is helpful for grasping correspondences of code clones across multiple files, as well as code clones within a file.
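A hedged sketch of the layout idea: a simple Fruchterman-Reingold-style force-directed loop in which all node pairs repel, edges attract, and a small capped step moves nodes until the layout settles. The toy graph (two files sharing one clone node) and all constants are illustrative assumptions, not the paper's tool.

```python
# Minimal force-directed layout sketch (Fruchterman-Reingold style).
# Graph and constants are illustrative, not the FC graph implementation.
import math
import random

def layout(nodes, edges, iters=200, k=1.0):
    random.seed(0)
    pos = {n: [random.random(), random.random()] for n in nodes}
    for _ in range(iters):
        disp = {n: [0.0, 0.0] for n in nodes}
        for a in nodes:                     # repulsion between all pairs
            for b in nodes:
                if a == b:
                    continue
                dx = pos[a][0] - pos[b][0]
                dy = pos[a][1] - pos[b][1]
                d = math.hypot(dx, dy) or 1e-9
                f = k * k / d               # repulsive force
                disp[a][0] += f * dx / d
                disp[a][1] += f * dy / d
        for a, b in edges:                  # attraction along edges
            dx = pos[a][0] - pos[b][0]
            dy = pos[a][1] - pos[b][1]
            d = math.hypot(dx, dy) or 1e-9
            f = d * d / k                   # attractive force
            for n, s in ((a, -1), (b, 1)):
                disp[n][0] += s * f * dx / d
                disp[n][1] += s * f * dy / d
        for n in nodes:                     # damped, capped step
            dx, dy = disp[n]
            d = math.hypot(dx, dy)
            if d > 0:
                step = min(0.05, 0.01 * d)
                pos[n][0] += dx / d * step
                pos[n][1] += dy / d * step
    return pos

pos = layout(["f1", "f2", "clone"], [("f1", "clone"), ("f2", "clone")])
```

In an FC-graph-like rendering, a clone node pulled toward several file nodes visually exposes a clone shared across those files.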

A Novel Approach for Coin Identification using Eigenvalues of Covariance Matrix, Hough Transform and Raster Scan Algorithms

In this paper we present a new method for coin identification. The proposed method adopts a hybrid scheme using the eigenvalues of the covariance matrix, the Circular Hough Transform (CHT) and Bresenham's circle algorithm. The statistical and geometrical properties of the small and large eigenvalues of the covariance matrix of a set of edge pixels over a connected region of support are explored for the purpose of circular object detection. A sparse matrix technique is used to perform the CHT; since sparse matrices squeeze out zero elements and contain only a small number of non-zero elements, they save matrix storage space and computational time. A neighborhood suppression scheme is used to find the valid Hough peaks. The accurate positions of the circumference pixels are identified using a raster scan algorithm that exploits geometrical symmetry. After finding circular objects, the proposed method uses textons, the fundamental micro-structures of generic natural images, which characterize the texture on the surface of the coins and are unique to them. The method has been tested on several real-world images, including coin and non-coin images, and its performance is also evaluated in terms of noise-withstanding capability.
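The circumference enumeration can be sketched with the classic midpoint (Bresenham) circle algorithm, which computes one octant and mirrors each point into the other seven by symmetry; the toy center and radius below are illustrative.

```python
# Sketch of the midpoint (Bresenham) circle algorithm: compute one
# octant and mirror each point into all eight octants by symmetry.

def bresenham_circle(cx, cy, r):
    pixels = set()
    x, y, d = 0, r, 3 - 2 * r
    while x <= y:
        # Mirror the computed octant point into all eight octants.
        for px, py in ((x, y), (y, x), (-x, y), (-y, x),
                       (x, -y), (y, -x), (-x, -y), (-y, -x)):
            pixels.add((cx + px, cy + py))
        if d < 0:
            d += 4 * x + 6
        else:
            d += 4 * (x - y) + 10
            y -= 1
        x += 1
    return pixels

circle = bresenham_circle(0, 0, 5)
print((5, 0) in circle, (0, 5) in circle)
```

Because the algorithm uses only integer additions and shifts of the decision variable, tracing a candidate circumference over the raster is cheap, which suits the raster scan stage described above.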

Dataset Analysis Using Membership-Deviation Graph

Classification is one of the primary themes in computational biology. The accuracy of classification strongly depends on the quality of a dataset, so a method is needed to evaluate this quality. In this paper, we propose a new graphical analysis method, the Membership-Deviation Graph (MDG), for analyzing the quality of a dataset. The MDG represents the degree of membership and the deviations of the instances of a class in the dataset. The result of an MDG analysis is used to understand specific features and to select the best features for classification.
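As a heavily hedged sketch of one plausible reading of the idea (the paper's exact MDG definition may differ), the snippet below scores each instance of a class by its deviation from the class mean, with a membership degree that decays as the deviation grows; all formulas here are assumptions for illustration.

```python
# Hedged sketch of one plausible per-instance membership/deviation
# computation; the paper's actual MDG definition may differ.
import statistics

def membership_deviation(values):
    """For each value in a class, return (membership, deviation): the
    deviation is the z-score from the class mean, and the membership
    decays with the squared deviation."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values) or 1.0
    result = []
    for v in values:
        deviation = (v - mean) / stdev
        membership = 1.0 / (1.0 + deviation * deviation)
        result.append((membership, deviation))
    return result

# The outlier 14.0 should get low membership and high deviation.
scores = membership_deviation([9.8, 10.1, 10.0, 14.0])
print(scores)
```

Plotting such membership/deviation pairs per class is the kind of picture a feature-quality graph could present.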

A Study of Lurking Behavior: The Desire Perspective

Lurking behavior is common in information-seeking oriented communities. Turning users who lurk into contributors can help virtual communities gain competitive advantages. Based on the ecological cognition framework, this study proposes a model to examine the antecedents of lurking behavior in information-seeking oriented virtual communities. This study argues that desire for emotional support, desire for information support, desire for performance-approach, desire for performance-avoidance, desire for mastery-approach, desire for mastery-avoidance, desire for ability trust, desire for benevolence trust, and desire for integrity trust affect lurking behavior. This study offers an approach to understanding the determinants of lurking behavior in online contexts.

The Predictability and Abstractness of Language: A Study in Understanding and Usage of the English Language through Probabilistic Modeling and Frequency

Accounts of language acquisition differ significantly in their treatment of the role of prediction in language learning. In particular, nativist accounts posit that probabilistic learning about words and word sequences has little to do with how children come to use language. The accuracy of this claim was examined by testing whether distributional probabilities and frequency contribute to how well 3-4 year olds repeat simple word chunks. Corresponding chunks were the same length, expressed similar content, and were all grammatically acceptable, yet the results showed marked differences in performance when overall distributional frequency varied. A distributional model of language predicted the empirical findings better than a number of other models, replicating earlier findings and showing that children attend to distributional probabilities in an adult corpus. This suggests that language use is prediction-and-error based rather than built on the abstract rules that nativist accounts posit.
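A toy sketch of the kind of distributional model at issue: score a word chunk by the product of its conditional bigram probabilities, so chunks whose transitions are more frequent in the corpus score higher. The corpus and chunks are illustrative assumptions, not the study's materials or its actual model.

```python
# Toy distributional (bigram) model: chunks with higher corpus
# probability should be easier to repeat. Illustrative data only.
from collections import Counter

corpus = "the dog ran home and the dog ran away and the dog sat down".split()

bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def chunk_probability(chunk):
    """Product of conditional bigram probabilities P(w_i | w_{i-1})."""
    words = chunk.split()
    p = 1.0
    for prev, word in zip(words, words[1:]):
        p *= bigrams[(prev, word)] / unigrams[prev]
    return p

# "dog ran" occurs twice, "dog sat" once, so the first chunk scores higher
# even though both are the same length and grammatical.
print(chunk_probability("the dog ran") > chunk_probability("the dog sat"))
```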

Space-Time Variation in Rainfall and Runoff: Upper Betwa Catchment

Among all geo-hydrological relationships, the rainfall-runoff relationship is of utmost importance in any hydrological investigation and in water resource planning. Spatial variation and the lag time involved in obtaining areal estimates for the basin as a whole can affect parameterization at both the design and planning stages. In conventional hydrological data processing, the spatial aspect is either ignored or interpolated at the sub-basin level, while temporal variation, when analysed for different stages, can provide clues to its spatial effectiveness. The interplay of space-time variation at the pixel level can provide a better understanding of basin parameters. The sustenance of design structures for different return periods, and their spatial autocorrelations, should be studied at different geographical scales for better management and planning of water resources. To understand the relative effect of spatio-temporal variation in a hydrological data network, a detailed geo-hydrological analysis of the Betwa river catchment, falling in the Lower Yamuna Basin, is presented in this paper. Moreover, exact estimates of the water available in the Betwa catchment, especially in the wake of the recent Betwa-Ken linkage project, require thorough scientific investigation for better planning. An attempt is therefore made here to analyse the existing hydrological and meteorological data with the help of SPSS, GIS and MS-Excel software. A comparison of spatial and temporal correlations at the sub-catchment level in the upper Betwa reaches is made to demonstrate the representativeness of the rain gauges. First, flows at different locations are used to derive correlation and regression coefficients. Then, long-term normal water yield estimates based on pixel-wise regression coefficients of the rainfall-runoff relationship are mapped. The areal values obtained from these maps can definitely improve upon estimates based on point-based extrapolations or areal interpolations.
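The pixel-wise regression step can be sketched as follows, under stated assumptions: for each pixel, fit a least-squares line of runoff on rainfall, then evaluate it at the long-term normal rainfall to get a normal water-yield estimate. The synthetic series below stand in for the actual gauge records.

```python
# Illustrative sketch of a per-pixel rainfall-runoff regression used to
# map long-term normal water-yield estimates. Synthetic data only.
import numpy as np

rng = np.random.default_rng(2)

def fit_rainfall_runoff(rain, runoff):
    """Least-squares slope and intercept of runoff on rainfall."""
    slope, intercept = np.polyfit(rain, runoff, deg=1)
    return slope, intercept

# One synthetic "pixel": 30 seasons of rainfall (mm) and observed runoff.
rain = rng.uniform(600, 1200, size=30)
runoff = 0.45 * rain - 50 + rng.normal(scale=20, size=30)

slope, intercept = fit_rainfall_runoff(rain, runoff)
normal_rain = rain.mean()                         # long-term normal rainfall
normal_yield = slope * normal_rain + intercept    # mapped yield estimate
print(round(slope, 2))
```

Repeating the fit for every pixel, rather than interpolating a single basin-wide relationship, is what gives the mapped estimates their spatial detail.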

The Many Faces of Your Employees: Insights into the Emerging Markets Workforce

The higher compounded growth rates coupled with favourable demographics in emerging markets portend abundant opportunities for multinational organizations. With many organizations competing for talent in these growing markets, their ability to succeed will depend on their understanding of local workforce needs and aspirations. Using data from the Towers Watson 2010 Global Workforce Study, this paper highlights differences in employee engagement, turnover risks, and attraction and retention drivers between emerging and developed markets. Apart from the traditional drivers of employee engagement, the study also explores the value employees place on elements such as strong senior leadership, managerial capability and career advancement opportunities. The results reveal that emerging-market employees are more engaged and value the non-traditional elements more highly than developed-market employees.

Cold Hardiness in Near-Isogenic Lines of Bread Wheat (Triticum aestivum L. em. Thell.)

Low temperature (LT) is one of the most important abiotic stresses causing yield loss in wheat (T. aestivum). Four major genes in wheat (Triticum aestivum L.), with dominant alleles designated Vrn-A1, Vrn-B1, Vrn-D1 and Vrn4, are known to have large effects on the vernalization response, but their effects on cold hardiness are ambiguous. Poor cold tolerance has restricted winter wheat production in regions of high winter stress [9]. It is known that nearly all wheat chromosomes [5], or at least 10 of the 21 chromosome pairs, are important in winter hardiness [15]. The objective of the present study was to clarify the role of each chromosome in cold tolerance. For this purpose we used 20 near-isogenic lines of wheat, in each of which a single chromosome from the winter-habit cultivar 'Bezostaya' was substituted into the 'Cappelle Desprez' background. The plant material was grown under controlled conditions at 20 °C with a 16 h day length in a moderately cold area of Iran, at the Karaj Agricultural Research Station, in 2006-07, and the acclimation period was completed over about 4 weeks in a cold room at 4 °C. The cold hardiness of these isogenic lines was measured by LT50 (the temperature at which 50% of the plants are killed by freezing stress). The experimental design was a randomized complete block design (RCBD) with three replicates. The results showed that chromosome 5A had a major effect on freezing tolerance, followed by chromosomes 1A and 4A with smaller effects on this trait. Further studies are essential to understand the importance of each chromosome in controlling cold hardiness in wheat.

Improvement of Learning Motivation and Negotiation of Learning Disorders of Students Using Integrative Teaching Methodology

Integrative teaching methodology is based on connecting and summarizing knowledge from different subjects in order to create a better understanding of different disciplines and to improve competences in general. The integrative teaching methodology was implemented over one academic year in 17 Latvian schools, according to a programme specially developed by specialists in different fields, for the adaptation to the social environment of children and young people with learning, cognitive-function and motor disorders. The implemented methodology consisted of three subsections specialised for adaptation to the social environment, improvement of cognitive functions, and improvement and harmonization of the personality. The results of the investigation showed that integrative teaching methodology is an effective way to improve learning motivation and to address learning disorders in schoolchildren of different ages.