A New Model for Question Answering Systems

Most question answering (QA) systems are composed of three main modules: question processing, document processing, and answer processing. The question processing module plays an important role in a QA system; if it does not work properly, it creates problems for the other modules. Answer processing, moreover, is an emerging topic in question answering, where systems are often required to rank and validate candidate answers. These techniques, which aim at finding short and precise answers, are often based on semantic classification. This paper presents a new model for question answering that improves the two main modules of question processing and answer processing. Question processing rests on two important components. The first is question classification, which specifies the types of the question and of the expected answer. The second is reformulation, which converts the user's question into a form the QA system can understand within a specific domain. The answer processing module consists of candidate answer filtering and candidate answer ordering components, together with a validation section for interacting with the user; this makes the module better suited to finding the exact answer. We describe the question and answer processing modules by modeling, implementing, and evaluating the system, which was implemented in two versions. Results show that Version 1 gave correct answers to 70% of the questions (30 correct answers to 50 questions asked) and Version 2 to 94% (47 correct answers to 50 questions asked).
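The abstract does not give the actual classification rules, but a minimal sketch of the two question-processing components it names, rule-based question classification and reformulation, might look as follows; all patterns, type labels, and the stop-word list are illustrative assumptions, not the authors' method:

```python
import re

# Illustrative answer-type patterns; the paper's actual taxonomy is not given.
QUESTION_TYPES = [
    (re.compile(r"^(who|whom)\b"), "PERSON"),
    (re.compile(r"^where\b"), "LOCATION"),
    (re.compile(r"^when\b"), "DATE"),
    (re.compile(r"^how (many|much)\b"), "QUANTITY"),
]

def classify_question(question: str) -> str:
    """Return the expected answer type based on surface patterns."""
    q = question.strip().lower()
    for pattern, answer_type in QUESTION_TYPES:
        if pattern.search(q):
            return answer_type
    return "OTHER"

def reformulate(question: str) -> str:
    """Rewrite the question as a keyword query for document retrieval."""
    stop = {"who", "whom", "what", "where", "when", "how", "is", "are",
            "the", "a", "an", "of", "did", "does", "do", "many", "much"}
    tokens = re.findall(r"\w+", question.lower())
    return " ".join(t for t in tokens if t not in stop)

print(classify_question("Who invented the telephone?"))  # -> PERSON
print(reformulate("Who invented the telephone?"))        # -> "invented telephone"
```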

A TRIZ-based Approach to Generation of Service-supporting Product Concepts

Recently, the business environment and customer needs have been changing rapidly, so it is very difficult to fulfill sophisticated customer needs through product or service innovation alone. In practice, to cope with this problem, various manufacturing companies have developed services to combine with their products. Along with this, many academic studies on the PSS (Product-Service System), the integrated system of products and services, have been conducted from the viewpoint of manufacturers. On the other hand, service providers are also attempting to develop service-supporting products to increase their service competitiveness and provide differentiated value. However, there is a lack of research based on the service-centric point of view. Accordingly, this paper proposes a concept generation method for service-supporting product development from the service-centric point of view. The method is designed to be executed in five consecutive steps: situation analysis, problem definition, problem resolution, solution evaluation, and concept generation. In the proposed approach, tools of TRIZ (the Theory of Inventive Problem Solving) such as the ISQ (Innovative Situation Questionnaire) and the 40 inventive principles are employed to define problems of the current services and solve them by generating service-supporting product concepts. This research contributes to the development of service-supporting products and service-centric PSSs.

Optimization Using Simulation of the Vehicle Routing Problem

A key element of many distribution systems is the routing and scheduling of vehicles servicing a set of customers. A wide variety of exact and approximate algorithms have been proposed for solving the vehicle routing problem (VRP). Exact algorithms can only solve relatively small instances of the VRP, which is classified as NP-hard, while several approximate algorithms have proven successful in finding feasible, though not necessarily optimal, solutions. Although different parts of the problem are stochastic in nature, little work has applied discrete-event simulation to it. Presented here is optimization using simulation of the VRP: a simplified problem has been developed in the ExtendSim™ simulation environment, and the ExtendSim™ evolutionary optimizer is used to minimize the total transportation cost of the problem. Results obtained from the model are very satisfactory. Further complexities of the problem are proposed for consideration in the future.
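As a rough illustration of optimization via simulation on a toy VRP instance (the paper's ExtendSim™ model is not available), the sketch below evaluates route cost by Monte Carlo sampling of stochastic travel times and improves the route with a simple random-swap search standing in for the evolutionary optimizer; the instance data and the noise model are invented:

```python
import math
import random

random.seed(1)
depot = (0.0, 0.0)
customers = [(2, 3), (5, 1), (1, 6), (4, 4), (6, 5)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def simulate_cost(route, n_reps=200):
    """Average route cost with lognormal noise on each leg (assumed model)."""
    total = 0.0
    for _ in range(n_reps):
        path = [depot] + [customers[i] for i in route] + [depot]
        total += sum(dist(path[k], path[k + 1]) * random.lognormvariate(0, 0.1)
                     for k in range(len(path) - 1))
    return total / n_reps

best = list(range(len(customers)))
best_cost = simulate_cost(best)
for _ in range(500):                      # random-swap improvement search
    cand = best[:]
    i, j = random.sample(range(len(cand)), 2)
    cand[i], cand[j] = cand[j], cand[i]
    cost = simulate_cost(cand)
    if cost < best_cost:
        best, best_cost = cand, cost
print(best, round(best_cost, 2))
```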

An Algorithm for Detecting Seam Cracks in Steel Plates

In this study, we developed an algorithm for detecting seam cracks in steel plates. Seam cracks are generated in the edge region of a steel plate. We used a Gabor filter and an adaptive double-threshold method to detect them, and features based on the shape of seam cracks to reduce the number of pseudo-defects. To evaluate the performance of the proposed algorithm, we tested 989 images with seam cracks and 9470 defect-free images. Experimental results show that the proposed algorithm is suitable for detecting seam cracks, although it should be improved to increase the true positive rate.
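A hedged sketch of the detection pipeline named above, a Gabor filter followed by a double (hysteresis-style) threshold, is given below using OpenCV; the kernel parameters and threshold values are illustrative guesses rather than the paper's tuned settings, and the shape-based pseudo-defect filtering step is only indicated:

```python
import cv2
import numpy as np

def detect_seam_cracks(gray: np.ndarray) -> np.ndarray:
    # Gabor kernel oriented along the rolling direction (assumed horizontal).
    kernel = cv2.getGaborKernel(ksize=(21, 21), sigma=4.0, theta=0.0,
                                lambd=10.0, gamma=0.5)
    response = cv2.filter2D(gray.astype(np.float32), -1, kernel)
    response = cv2.normalize(response, None, 0, 255, cv2.NORM_MINMAX)

    # Double threshold: strong seed pixels plus weak pixels connected to them.
    strong = response > 200
    weak = (response > 120).astype(np.uint8)
    n_labels, labels = cv2.connectedComponents(weak)
    keep = set(np.unique(labels[strong])) - {0}
    mask = np.isin(labels, list(keep)).astype(np.uint8) * 255

    # Shape-based filtering of pseudo-defects (e.g., by elongation of each
    # connected component) would follow here.
    return mask
```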

Comparative Study of Transformed and Concealed Data in Experimental Designs and Analyses

This paper presents a comparative study of coded-data methods for assessing the benefit of concealing natural data that constitute a commercial secret. The influence of the number of replicates (rep), the treatment effects (τ), and the standard deviation (σ) on the efficiency of each transformation method is investigated. The experimental data are generated via computer simulation under specified process conditions with a completely randomized design (CRD). Three data transformations are considered: the Box-Cox, arcsine, and logit methods. The differences in the F statistic between coded and natural data (Fc − Fn) and the hypothesis testing results were determined. The experimental results indicate that the Box-Cox results differ significantly from the natural data when the number of replicates is small, and the method appears improper when a negative lambda is assigned. The arcsine and logit transformations, on the other hand, are more robust and clearly provide more precise numerical results. In addition, alternative ways to select the lambda in the power transformation are offered to achieve more appropriate outcomes.
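The three coding transformations compared in the study can be sketched as follows on simulated CRD-style data; the sample sizes, beta-distributed responses, and treatment split are illustrative only and do not reproduce the paper's simulation conditions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Proportion responses from two treatments under a completely randomized design.
y1 = rng.beta(6, 4, size=10)
y2 = rng.beta(5, 5, size=10)
y = np.concatenate([y1, y2])

boxcox_y, lam = stats.boxcox(y)          # Box-Cox, lambda fitted by MLE
arcsine_y = np.arcsin(np.sqrt(y))        # arcsine (angular) transform
logit_y = np.log(y / (1 - y))            # logit transform

# F statistic on natural vs. coded data, giving the difference Fc - Fn.
f_nat, _ = stats.f_oneway(y1, y2)
f_cod, _ = stats.f_oneway(logit_y[:10], logit_y[10:])
print(f"lambda = {lam:.3f}, Fc - Fn = {f_cod - f_nat:.3f}")
```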

Assembly and Alignment of Ship Power Plants in Modern Shipbuilding

Fine alignment of the main mechanisms and shaft lines of a ship power plant provides long-term, failure-free performance of the propulsion system, while fast, high-quality installation of mechanisms and shaft lines decreases overall labor intensity. Checking the allowable stresses of a shaft line and setting its alignment require calculations that consider various stages of the life cycle. In 2012, JSC SSTC developed the special software complex “Shaftline” for shaft line alignment calculations, with its own I/O interface and a 3D model display of the shaft line. Aligning a shaft line according to bearing loads is a rather labor-intensive procedure; to decrease its duration, JSC SSTC developed an automated alignment system for ship power plant mechanisms. The system's operating principle is based on automatic simulation of the design loads on the bearings. Initial data for shaft line alignment can be exported to the automated alignment system from PC “Shaftline”.

Material Handling Equipment Selection using Hybrid Monte Carlo Simulation and Analytic Hierarchy Process

The many feasible alternatives and conflicting objectives make equipment selection in materials handling a complicated task. This paper presents the use of Monte Carlo (MC) simulation combined with the Analytic Hierarchy Process (AHP) to evaluate and select the most appropriate material handling equipment (MHE). The proposed hybrid model was built on the basis of the material handling equation to identify the main criteria and sub-criteria critical to MHE selection. The criteria capture the properties of the material to be moved, the characteristics of the move, and the means by which the material will be moved. Using MC simulation alongside AHP is very powerful in that it allows the decision maker to represent his or her possible preference judgments as random variables. This reduces the uncertainty of the single-point judgments of conventional AHP and provides more confidence in the results of the decision problem. A small pharmaceutical business is used as an example to illustrate the development and application of the proposed model.
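A minimal sketch of the hybrid idea, drawing each pairwise AHP judgment from a distribution and recomputing the priority weights per draw, is shown below; the 3x3 criteria hierarchy, the triangular spread around each judgment, and the sample count are invented and do not reproduce the pharmaceutical case:

```python
import numpy as np

rng = np.random.default_rng(42)
# Point estimates of pairwise comparisons for 3 criteria (Saaty scale).
point = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

def priorities(A):
    """Priority vector = normalized principal eigenvector."""
    vals, vecs = np.linalg.eig(A)
    w = np.abs(vecs[:, np.argmax(vals.real)].real)
    return w / w.sum()

samples = []
for _ in range(1000):
    A = np.ones_like(point)
    for i in range(3):
        for j in range(i + 1, 3):
            # Judgment as a random variable around the point estimate.
            a = rng.triangular(point[i, j] * 0.5, point[i, j], point[i, j] * 1.5)
            A[i, j], A[j, i] = a, 1.0 / a
    samples.append(priorities(A))

print("mean criterion weights:", np.round(np.mean(samples, axis=0), 3))
```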

Effects of Human Factors on Workforce Scheduling

In today's competitive market, most companies develop manufacturing systems that can help reduce costs and maximize quality. Human issues are an important part of manufacturing systems, yet most companies ignore their effects on production performance. This paper aims to develop an integrated workforce planning system that incorporates the human being; therefore, a multi-objective mixed-integer nonlinear programming model is developed to determine the amount of hiring, firing, training, and overtime for each worker type. The workforce planning model includes human aspects such as skills, training, workers' personalities, capacity, motivation, and learning rates. The model helps to minimize hiring, firing, training, and overtime costs, and to maximize the workers' performance. The results indicate that differences among workers should be considered in workforce scheduling to generate realistic plans with minimum costs. The paper also investigates the effects of human learning rates on the performance of production systems.
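For illustration, a deliberately simplified, linear, single-objective sketch covering the hiring, firing, and overtime decisions can be written with PuLP as below; the paper's model is multi-objective and nonlinear and also covers training and the human factors, which are omitted here, and every coefficient is an assumption:

```python
import pulp

periods = [1, 2, 3]
demand = {1: 100, 2: 140, 3: 120}        # required labor hours per period
prod_rate = 40                            # regular hours per worker per period

m = pulp.LpProblem("workforce", pulp.LpMinimize)
w = pulp.LpVariable.dicts("workers", periods, lowBound=0, cat="Integer")
h = pulp.LpVariable.dicts("hire", periods, lowBound=0, cat="Integer")
f = pulp.LpVariable.dicts("fire", periods, lowBound=0, cat="Integer")
ot = pulp.LpVariable.dicts("overtime", periods, lowBound=0)

# Invented unit costs: hiring, firing, overtime hours, and salary.
m += pulp.lpSum(300 * h[t] + 500 * f[t] + 25 * ot[t] + 1000 * w[t]
                for t in periods)

prev = 2                                  # assumed initial workforce size
for t in periods:
    m += w[t] == prev + h[t] - f[t]              # workforce balance
    m += prod_rate * w[t] + ot[t] >= demand[t]   # meet labor demand
    m += ot[t] <= 10 * w[t]                      # overtime cap per worker
    prev = w[t]

m.solve(pulp.PULP_CBC_CMD(msg=False))
print({t: (w[t].value(), ot[t].value()) for t in periods})
```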

A Fuzzy MCDM Approach for Health-Care Waste Management

The management of health-care waste is one of the most important problems in Istanbul, a city with more than 12 million inhabitants, as it is in most developing countries. Negligence in the appropriate treatment and final disposal of health-care waste can lead to adverse impacts on public health and the environment. This paper employs a fuzzy multi-criteria group decision making approach, based on the principles of fusion of fuzzy information, the 2-tuple linguistic representation model, and the technique for order preference by similarity to ideal solution (TOPSIS), to evaluate health-care waste (HCW) treatment alternatives for Istanbul. The evaluation criteria are determined using the nominal group technique (NGT), a method of systematically developing a consensus of group opinion. The employed method can manage information assessed with multi-granular linguistic information in decision making problems with multiple information sources. The decision making framework employs the ordered weighted averaging (OWA) operator, which encompasses several operators, as the aggregation operator, since it can implement different aggregation rules by changing the order weights. The aggregation process is based on the unification of information by means of fuzzy sets on a basic linguistic term set (BLTS); the unified information is then transformed into linguistic 2-tuples in a way that rectifies the information loss problem of other fuzzy linguistic approaches.
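The OWA aggregation step can be illustrated in a few lines: the arguments are sorted in descending order and combined with position-based order weights, so different weight vectors yield different aggregation rules. The ratings and weights below are invented for the example:

```python
import numpy as np

def owa(values, weights):
    """OWA: weights apply to the ranks of the sorted values, not the sources."""
    v = np.sort(np.asarray(values, dtype=float))[::-1]
    w = np.asarray(weights, dtype=float)
    assert len(w) == len(v) and np.isclose(w.sum(), 1.0)
    return float(v @ w)

scores = [0.7, 0.4, 0.9, 0.5]              # criterion ratings on [0, 1]
print(owa(scores, [0.4, 0.3, 0.2, 0.1]))   # optimism-leaning order weights
print(owa(scores, [0.0, 0.0, 0.0, 1.0]))   # pure "min" (pessimistic) rule
```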

The Classification Model for Hard Disk Drive Functional Tests under Sparse Data Conditions

This paper proposes classification models to be used as a proxy for the hard disk drive (HDD) functional test equivalent, which requires more than two weeks to classify HDD status as either "Pass" or "Fail". These models were constructed using a committee network consisting of a number of single neural networks. The paper also includes a method, called the "enforced learning method", to solve the problem of data sparseness for failed parts. Our results reveal that the classification models constructed with the proposed method perform well under sparse data conditions; since the models take only a few seconds for HDD classification, they can be used to substitute for the HDD functional tests.
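A hedged sketch of a committee of small neural networks with majority voting appears below; since the exact form of the "enforced learning method" is not given in the abstract, minority-class oversampling is used here as a stand-in, and all data are synthetic:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X_pass = rng.normal(0.0, 1.0, size=(500, 8))      # many passing drives
X_fail = rng.normal(1.5, 1.0, size=(20, 8))       # sparse failed drives
X = np.vstack([X_pass, X_fail])
y = np.array([0] * 500 + [1] * 20)

committee = []
for seed in range(5):                              # five committee members
    # Oversample the sparse "Fail" class for each member's training set.
    idx_fail = rng.choice(np.where(y == 1)[0], size=200, replace=True)
    idx = np.concatenate([np.where(y == 0)[0], idx_fail])
    net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                        random_state=seed).fit(X[idx], y[idx])
    committee.append(net)

def committee_predict(x):
    votes = [net.predict(x.reshape(1, -1))[0] for net in committee]
    return int(sum(votes) > len(votes) / 2)        # majority vote

print(committee_predict(np.full(8, 1.4)))          # likely "Fail" (1)
```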

Effect of Coolant on Cutting Forces and Surface Roughness in Grinding of CSM GFRP

This paper presents a comparative study of dry and wet grinding through experimental investigation of the grinding of CSM glass fibre reinforced polymer (GFRP) laminates using a pink aluminium oxide wheel. Different sets of experiments were performed to study the effects of the independent grinding parameters, such as wheel speed, feed, and depth of cut, on dependent performance criteria, such as cutting forces and surface finish. Experimental conditions were laid out using a central composite design. An effective coolant was sought in this study to minimise cutting forces and surface roughness in the grinding of GFRP laminates. Test results showed that the use of coolant reduces surface roughness, although not necessarily the cutting forces. These research findings provide a useful, economical machining solution in terms of optimized grinding conditions for grinding CSM GFRP.

Unrelated Parallel Machines Scheduling Problem Using an Ant Colony Optimization Approach

Total weighted tardiness is a measure of customer satisfaction; minimizing it represents the general requirement of on-time delivery. In this research, we consider an ant colony optimization (ACO) algorithm for the problem of scheduling unrelated parallel machines to minimize total weighted tardiness, a problem that is NP-hard in the strong sense. Computational results show that the proposed ACO algorithm gives promising results compared with existing algorithms.
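A compact, illustrative ACO for this problem might keep pheromone on (job, machine) pairs and guide construction with a processing-time heuristic, as sketched below; the instance, the EDD job ordering, and all ACO parameters are invented rather than taken from the paper:

```python
import random

random.seed(7)
n_jobs, n_machines, n_ants, n_iters = 6, 2, 10, 50
# Unrelated machines: processing time of each job depends on the machine.
p = [[random.randint(2, 9) for _ in range(n_machines)] for _ in range(n_jobs)]
due = [random.randint(5, 20) for _ in range(n_jobs)]
wgt = [random.randint(1, 5) for _ in range(n_jobs)]
tau = [[1.0] * n_machines for _ in range(n_jobs)]   # pheromone trails

def twt(assign):
    """Total weighted tardiness; jobs on a machine run in assignment order."""
    t = [0] * n_machines
    total = 0
    for j, m in assign:
        t[m] += p[j][m]
        total += wgt[j] * max(0, t[m] - due[j])
    return total

best, best_cost = None, float("inf")
for _ in range(n_iters):
    for _ in range(n_ants):
        order = sorted(range(n_jobs), key=lambda j: due[j])   # EDD job order
        assign = []
        for j in order:
            # Machine choice: pheromone times a 1/p heuristic bias.
            probs = [tau[j][m] * (1.0 / p[j][m]) for m in range(n_machines)]
            m = random.choices(range(n_machines), weights=probs)[0]
            assign.append((j, m))
        cost = twt(assign)
        if cost < best_cost:
            best, best_cost = assign, cost
    for j in range(n_jobs):                 # evaporation on all trails
        for m in range(n_machines):
            tau[j][m] *= 0.9
    for j, m in best:                       # deposit on the best solution
        tau[j][m] += 1.0 / (1.0 + best_cost)

print(best_cost, best)
```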

Calculating the Efficiency of Steam Boilers Based on the Most Influential Factors: A Case Study

This paper is concerned with calculating boiler efficiency, one of the most important performance measurements in any steam power plant and a key factor in determining the overall effectiveness of the whole system within the power station. For this calculation, a Visual Basic program was developed, and a steam power plant known as the El-Khmus power plant, Libya, was selected as a case study. The boiler efficiency was calculated using the heat balance method. The findings showed how the heat energy produced by the boiler, and hence its efficiency, increases with a higher feed water temperature and with a lower exhaust temperature and lower moisture levels in the fuel used within the boiler.
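The heat balance (indirect) method mentioned above computes efficiency as 100% minus the percentage losses; a minimal sketch with assumed loss terms and fuel properties is given below (the paper's Visual Basic program and plant data are not reproduced). The two calls show efficiency falling as exhaust temperature and fuel moisture rise, matching the reported trend:

```python
def boiler_efficiency_heat_balance(
    flue_gas_temp_c: float,      # exhaust (flue gas) temperature
    ambient_temp_c: float,
    fuel_moisture_frac: float,   # kg water per kg fuel
    fuel_hydrogen_frac: float,   # kg H2 per kg fuel
    gcv_kj_per_kg: float,        # gross calorific value of the fuel
    flue_gas_mass: float = 18.0, # kg flue gas per kg fuel (assumed)
    cp_gas: float = 1.005,       # kJ/(kg K), specific heat of flue gas
) -> float:
    dt = flue_gas_temp_c - ambient_temp_c
    # L1: sensible heat lost in the dry flue gas
    l_dry = 100 * flue_gas_mass * cp_gas * dt / gcv_kj_per_kg
    # L2: loss from evaporating fuel moisture (latent heat ~2442 kJ/kg)
    l_moist = 100 * fuel_moisture_frac * (2442 + 4.187 * dt) / gcv_kj_per_kg
    # L3: loss from water formed by hydrogen combustion (9 kg H2O per kg H2)
    l_h2 = 100 * 9 * fuel_hydrogen_frac * (2442 + 4.187 * dt) / gcv_kj_per_kg
    return 100.0 - (l_dry + l_moist + l_h2)

print(boiler_efficiency_heat_balance(180, 25, 0.02, 0.04, 42000))
print(boiler_efficiency_heat_balance(220, 25, 0.05, 0.04, 42000))  # lower
```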

Analyzing CPFR Supporting Factors with Fuzzy Cognitive Map Approach

Collaborative planning, forecasting and replenishment (CPFR) coordinates various supply chain management activities, including production and purchase planning, demand forecasting, and inventory replenishment, between supply chain trading partners. This study proposes a systematic way of analyzing CPFR supporting factors using the fuzzy cognitive map (FCM) approach. FCMs have proven particularly useful for solving problems in which a number of decision variables and uncontrollable variables are causally interrelated. Hence, FCMs of CPFR are created to show the relationships between the factors that influence the effective implementation of CPFR in the supply chain.
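FCM inference itself is compact: concept activations are repeatedly pushed through a signed weight matrix and squashed with a sigmoid until they settle. The sketch below uses three invented concepts and edge weights, not the study's actual CPFR factors:

```python
import numpy as np

concepts = ["trust between partners", "information sharing", "CPFR success"]
W = np.array([[0.0, 0.7, 0.3],    # trust promotes sharing and success
              [0.0, 0.0, 0.8],    # sharing strongly promotes success
              [0.2, 0.0, 0.0]])   # success feeds back into trust

def fcm_run(a0, W, steps=20, lam=1.0):
    """Iterate the FCM update a <- sigmoid(a + a W) to a steady state."""
    a = np.asarray(a0, dtype=float)
    for _ in range(steps):
        a = 1.0 / (1.0 + np.exp(-lam * (a + a @ W)))
    return a

final = fcm_run([0.8, 0.1, 0.1], W)
for name, val in zip(concepts, final):
    print(f"{name}: {val:.2f}")
```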

Linking OpenCourseWares and Open Education Resources: Creating an Effective Search and Recommendation System

With a growing number of digital libraries and other open education repositories being made available throughout the world, effective search and retrieval tools are necessary to access the desired materials, tools that surpass the effectiveness of traditional, all-inclusive search engines. This paper discusses the design and use of Folksemantic, a platform that integrates OpenCourseWare search, Open Educational Resource recommendations, and social network functionality into a single open source project. The paper describes how the system was originally envisioned, its goals for users, and data that provide insight into how it is actually being used. Data sources include website click-through data, query logs, web server log files, and user account data. Based on a descriptive analysis of its current use, modifications to the platform's design are recommended to better address the goals of the system, along with recommendations for additional phases of research.

Effective Collaboration in Product Development via a Common Sharable Ontology

To achieve competitive advantage nowadays, most industrial companies consider that success is sustained by great product development, that is, by managing the product throughout its entire lifetime, from design through manufacture and operation to destruction. Achieving this goal requires tight collaboration between partners from a wide variety of domains, resulting in various product data types and formats as well as different software tools. So far, the lack of a meaningful unified representation for product data semantics has slowed down efficient product development. This paper proposes an ontology-based approach to enable such semantic interoperability. A generic and extensible product ontology is described, gathering the main concepts pertaining to the mechanical field and the relations that hold among them. The ontology is not exhaustive; nevertheless, it shows that such a unified representation is possible and easily exploitable. This is illustrated through a case study with an example product and some semantic requests to which the ontology responds quite easily. The study proves the efficiency of ontologies as a support for product data exchange and information sharing, especially in product development environments where collaboration is not just a choice but a mandatory prerequisite.
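As a toy illustration of a sharable product ontology and a semantic request against it, the sketch below uses rdflib; the namespace, classes, and example product are invented and far simpler than the ontology described in the paper:

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/product-ontology#")
g = Graph()
g.bind("ex", EX)

# Concepts of the mechanical field and a relation holding among them.
g.add((EX.Part, RDF.type, RDFS.Class))
g.add((EX.Assembly, RDF.type, RDFS.Class))
g.add((EX.Assembly, RDFS.subClassOf, EX.Part))
g.add((EX.hasComponent, RDF.type, RDF.Property))

# An example product instance shared between design and manufacturing tools.
g.add((EX.Gearbox, RDF.type, EX.Assembly))
g.add((EX.Shaft, RDF.type, EX.Part))
g.add((EX.Gearbox, EX.hasComponent, EX.Shaft))
g.add((EX.Shaft, RDFS.label, Literal("output shaft")))

# Semantic request: which parts make up the gearbox?
q = "SELECT ?c WHERE { ex:Gearbox ex:hasComponent ?c }"
for row in g.query(q, initNs={"ex": EX}):
    print(row.c)
```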

Design for Manufacturability and Concurrent Engineering for Product Development

In the 1980s, companies began to feel the effect of three major influences on their product development: newer and more innovative technologies, increasing product complexity, and larger organizations, and they were therefore forced to look for new product development methods. This paper focuses on two of these methods, Design for Manufacturability (DFM) and Concurrent Engineering (CE), and analyzes them as product development approaches. By applying them, companies can benefit through shorter product development cycles, lower costs, and met delivery schedules. The paper also presents simplified models that can be modified and used by different companies based on their objectives and requirements. The research methodology is based on case studies: two companies were analyzed with respect to their product development processes, drawing on historical data and interviews, in addition to a survey of the literature and previous research on similar topics. The paper also presents a cost-benefit analysis of implementation and estimates the implementation time. The research found that the two companies did not meet delivery times to their customers: for some of their most frequently produced products, 50% to 80% are not delivered on time. The companies follow the traditional product development approach of sequential design and production, which strongly affects time to market. The case study shows that by implementing these new methods and forming multidisciplinary teams for design and quality inspection, a company can reduce its workflow from 40 steps to 30.

Effective Defect Prevention Approach in Software Process for Achieving Better Quality Levels

Defect prevention is the most vital but habitually neglected facet of software quality assurance in any project. If applied at all stages of software development, it can reduce the time, overhead, and resources required to engineer a high-quality product. The key challenge for an IT company is to engineer a software product with minimum post-deployment defects. This work is an analysis based on data obtained for five selected projects from leading software companies of varying software production competence. The main aim of this paper is to provide information on the various methods and practices supporting defect detection and prevention that lead to successful software. The defect prevention techniques unearth 99% of defects. Inspection is found to be an essential technique for generating near-ideal software in software factories through enhanced methodologies of aided and unaided inspection schedules. On average, spending 13% to 15% of total project effort on inspection and 25% to 30% on testing is required to eliminate 99% to 99.75% of defects. A comparison of the end results of the five selected projects across the companies is also presented, throwing light on the possibility of a particular company positioning itself with an appropriate complementary ratio of inspection to testing.

Development of Knowledge Portal using Open Source Tools: A Case Study of FIIT, UNISEL

A knowledge-sharing culture contributes to a positive working environment. Currently, there is no platform for the academic staff of the Faculty of Industrial Information Technology (FIIT), UNISEL to share knowledge among themselves. The sharing process is manual, through common meetings or offline discussions, and there is no repository for future retrieval. With an open source solution, however, the cost of developing a knowledge-based application can be reduced tremendously. In this paper we discuss the domain for which this knowledge portal was developed and the deployment of open source tools such as Joomla, the PHP programming language, and MySQL. This knowledge portal is evidence that open source tools are also reliable for developing knowledge-based portals. These recommendations will be useful to the open source community for producing more open source products in the future.

Scheduling for a Reconfigurable Manufacturing System with Multiple Process Plans and Limited Pallets/Fixtures

A reconfigurable manufacturing system (RMS) is an advanced system designed at the outset for rapid changes in its hardware and software components in order to quickly adjust its production capacity and functionality. Among the various operational decisions, this study considers the scheduling problem of determining the input sequence and the schedule simultaneously for a given set of parts. In particular, we consider the practical constraint that the numbers of pallets/fixtures are limited, and hence a part can be released into the system only when the fixture required for that part is available. To solve the integrated input sequencing and scheduling problem, we suggest a priority-rule-based approach in which the two sub-problems are solved using a combination of priority rules. To show the effectiveness of various rule combinations, a simulation experiment was conducted on data for a real RMS, and the test results are reported.
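A small sketch of priority-rule-based dispatching under limited fixtures is shown below: a part is released only when a fixture of its type is free, and releasable parts are ordered by the shortest-processing-time rule. The jobs, fixture counts, and the single rule combination are invented for illustration and do not reproduce the study's experiment:

```python
import heapq

jobs = [  # (job id, fixture type, processing time)
    (1, "A", 5), (2, "A", 3), (3, "B", 4), (4, "B", 2), (5, "A", 6),
]
fixtures = {"A": 1, "B": 1}       # limited pallets/fixtures per type
clock, schedule = 0, []
waiting = sorted(jobs, key=lambda j: j[2])   # SPT input sequencing rule
running = []                      # heap of (finish time, job id, fixture type)

while waiting or running:
    released = [j for j in waiting if fixtures[j[1]] > 0]
    if released:
        job = released[0]                    # SPT among releasable parts
        waiting.remove(job)
        fixtures[job[1]] -= 1                # fixture occupied on release
        heapq.heappush(running, (clock + job[2], job[0], job[1]))
        schedule.append((job[0], clock))
    else:
        finish, jid, ftype = heapq.heappop(running)   # advance to next event
        clock = max(clock, finish)
        fixtures[ftype] += 1                 # fixture returned to the pool

print(schedule)   # (job id, start time) pairs
```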