A GPU-Based Texture Mapping Technique for 3D Models Using Multi-View Images

Previous algorithms for generating and mapping 3D model textures from multi-view images suffer from problems in texture chart generation, namely self-intersection and concentration of the texture in texture space. They may also suffer from problems caused by occluded areas, such as the inner parts of the thighs. In this paper we propose a texture mapping technique for 3D models using multi-view images on the GPU. Texture mapping is performed per pixel directly in the GPU fragment shader, without generating a texture map, and occluded areas are resolved using the depth information of the 3D model. Our method requires more GPU computation than previous works, but it achieves real-time performance and the problems mentioned above do not occur.
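As a hedged illustration of the per-pixel idea (the paper works in a GPU fragment shader; the NumPy sketch below, with hypothetical camera structures, only mirrors the logic on the CPU): each surface point is projected into every view, a depth comparison discards the views in which the point is occluded, and the remaining view colors are blended.

```python
# Minimal NumPy sketch of per-pixel multi-view texturing with a depth-based
# occlusion test (illustrative only; names and data layout are assumptions).
import numpy as np

def shade_pixel(p_world, views, eps=1e-2):
    """Color one surface point from the views that actually see it.

    p_world : (3,) world-space position of the surface point.
    views   : list of dicts with 'P' (3x4 projection), 'image' (HxWx3),
              'depth' (HxW depth map rendered from that view).
    """
    colors = []
    for v in views:
        q = v['P'] @ np.append(p_world, 1.0)      # project into the view
        z = q[2]
        if z <= 0:                                 # behind the camera
            continue
        x, y = int(q[0] / z), int(q[1] / z)
        h, w = v['depth'].shape
        if not (0 <= x < w and 0 <= y < h):
            continue
        # Occlusion test: the point is visible only if its depth matches the
        # depth map rendered from this view (inner-thigh pixels fail here).
        if abs(v['depth'][y, x] - z) < eps:
            colors.append(v['image'][y, x].astype(float))
    if not colors:
        return np.zeros(3)                         # occluded in every view
    return np.mean(colors, axis=0)                 # simple blend
```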

Speech Recognition Using Scaly Neural Networks

This work is aimed at speech recognition using scaly neural networks. A small vocabulary of 11 words was established first: “word, file, open, print, exit, edit, cut, copy, paste, doc1, doc2”. The chosen words are involved in executing computer functions such as opening a file, printing a text document, cutting, copying, pasting, editing and exiting. The words are introduced to the computer and then subjected to a feature extraction process using LPC (linear prediction coefficients). These features are used as the input to an artificial neural network in speaker-dependent mode. Half of the word samples are used for training the artificial neural network and the other half for testing the system; the latter are used for information retrieval. The system consists of three parts: speech processing and feature extraction, training and testing using neural networks, and information retrieval. The retrieval process proved to be 79.5-88% successful, which is quite acceptable considering variations in the surroundings, the state of the speaker, and the microphone type.
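A minimal sketch of the LPC step named in the abstract, using the standard autocorrelation method with the Levinson-Durbin recursion (the window and the order of 12 are illustrative assumptions, not values from the paper):

```python
# LPC feature extraction via the autocorrelation method and the
# Levinson-Durbin recursion; assumes a non-silent speech frame.
import numpy as np

def lpc(frame, order=12):
    """Return `order` linear prediction coefficients for one speech frame."""
    frame = frame * np.hamming(len(frame))          # taper the frame
    r = np.correlate(frame, frame, mode='full')[len(frame) - 1:]
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):                   # Levinson-Durbin
        k = -(r[i] + np.dot(a[1:i], r[i - 1:0:-1])) / err
        a[1:i + 1] += k * a[i - 1::-1][:i]          # update coefficients
        err *= (1.0 - k * k)                        # prediction error shrinks
    return a[1:]                                    # drop the leading 1
```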

Simulated Annealing Algorithm for Data Aggregation Trees in Wireless Sensor Networks and Comparison with Genetic Algorithm

In ad hoc networks, the main issue in protocol design is quality of service, whereas in wireless sensor networks the main constraint is the limited energy of the sensors. Protocols that minimize sensor power consumption therefore receive the most attention in wireless sensor networks. One approach to reducing energy consumption is to reduce the number of packets transmitted in the network. Data aggregation, which combines related data and prevents the transmission of redundant packets, can be effective in reducing the number of transmitted packets. Since processing information consumes less power than transmitting it, data aggregation is of great importance and is used in many protocols [5]. One data aggregation technique is the use of a data aggregation tree; however, finding an optimal data aggregation tree for collecting data in a network with one sink is an NP-hard problem. In this technique, related information packets are combined at intermediate nodes into a single packet, so the number of packets transmitted in the network is reduced, less energy is consumed, and the network lifetime improves. Heuristic methods are used to address this NP-hard problem, one of which is Simulated Annealing. In this article, we propose a new method for building the data collection tree in wireless sensor networks using the Simulated Annealing algorithm and evaluate its efficiency against the Genetic Algorithm.
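A hedged skeleton of such an approach (the cost model, neighborhood move and cooling schedule below are common textbook choices, not necessarily the article's exact formulation):

```python
# Simulated-annealing skeleton for a data-aggregation tree rooted at the
# sink (node 0); squared link length stands in for transmission energy.
import math, random

def cost(parent, pos):
    """Total energy proxy: sum of squared link lengths in the tree."""
    return sum((pos[u][0] - pos[parent[u]][0]) ** 2 +
               (pos[u][1] - pos[parent[u]][1]) ** 2
               for u in range(1, len(parent)))

def creates_cycle(parent, u, p):
    """True if attaching u under p would put u on its own ancestor path."""
    while p != 0:
        if p == u:
            return True
        p = parent[p]
    return False

def anneal(pos, t0=10.0, alpha=0.995, iters=20000):
    n = len(pos)
    parent = [0] * n                      # start: a star around the sink
    best, best_cost = parent[:], cost(parent, pos)
    cur_cost, t = best_cost, t0
    for _ in range(iters):
        u = random.randrange(1, n)        # reattach a random sensor node
        p = random.randrange(0, n)
        if p == u or creates_cycle(parent, u, p):
            continue
        old = parent[u]
        parent[u] = p
        new_cost = cost(parent, pos)
        # Accept downhill moves always, uphill moves with Boltzmann prob.
        if new_cost < cur_cost or random.random() < math.exp((cur_cost - new_cost) / t):
            cur_cost = new_cost
            if cur_cost < best_cost:
                best, best_cost = parent[:], cur_cost
        else:
            parent[u] = old               # undo the rejected move
        t *= alpha                        # geometric cooling
    return best, best_cost
```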

On the Performance of Information Criteria in Latent Segment Models

Despite the widespread application of finite mixture models in segmentation, finite mixture model selection is still an important issue. In fact, the selection of an adequate number of segments is a key issue in deriving latent segment structures, and it is desirable that the selection criteria used for this end are effective. We conduct a simulation study in order to select, among several information criteria, those which best support the selection of the correct number of segments. In particular, this study is intended to determine which information criteria are more appropriate for mixture model selection when considering data sets with only categorical segmentation base variables. The proposed analysis is supported by the generation of mixtures of multinomial data. As a result, we establish a relationship between the level of measurement of the segmentation variables and the performance of eleven information criteria. The criterion AIC3 shows the best performance (it indicates the correct number of segments in the simulated structure most often) for mixtures of multinomial segmentation base variables.
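For reference, a short sketch of how such criteria differ only in the penalty attached to each free parameter (the formulas are standard; CAIC stands in for the remaining criteria the study compares):

```python
# Information criteria for mixture model selection; AIC3 replaces AIC's
# penalty factor of 2 per free parameter with 3.
import numpy as np

def information_criteria(log_lik, n_params, n_obs):
    return {
        'AIC':  -2 * log_lik + 2 * n_params,
        'AIC3': -2 * log_lik + 3 * n_params,
        'BIC':  -2 * log_lik + n_params * np.log(n_obs),
        'CAIC': -2 * log_lik + n_params * (np.log(n_obs) + 1),
    }

# Usage: fit mixtures with S = 1, 2, 3, ... segments, then keep the S that
# minimizes the chosen criterion.
```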

A Training Model for Successful Implementation of Enterprise Resource Planning

It is well recognized that one feature of a successful company is its ability to align its business goals with its information and communication technology platform. Enterprise Resource Planning (ERP) systems contribute to better performance by integrating various business functions and supporting information flows. However, the complexity of these technological systems is known to prevent business users from exploiting ERP systems efficiently. This paper investigates the role of training in improving the usage of ERP systems. To this end, we designed a survey instrument administered to employees of a Norwegian multinational global provider of technology solutions. Based on the analysis of the collected data, we delineate a training model that could be of high relevance for both researchers and practitioners as a step towards a better understanding of ERP system implementation.

Design and Implementation of Secure Electronic Payment System (Client)

A secure electronic payment system is presented in this paper. The electronic payment system is designed to be secure for clients such as customers and shop owners. The security architecture of the system is based on the RC5 encryption/decryption algorithm, which eliminates the fraud that occurs today with stolen credit card numbers. The symmetric-key cryptosystem RC5 can protect conventional transaction data such as account numbers, amounts and other information. This is done electronically using an RC5 encryption/decryption program written in Microsoft Visual Basic 6.0, so there is no danger of data sent within the system being intercepted and replaced. The alternative is to use the existing network and encrypt all data transmissions. The system with encryption is acceptably secure, but the level of encryption has to be stepped up as computing power increases. To secure the system, the communication between modules is encrypted using the symmetric-key cryptosystem RC5. When the user first enters the system, it uses a simple user name, password, user ID, user type and cipher authentication mechanism for identification, which is the most common method of authentication in most computer systems.
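For illustration, a textbook RC5-32/12/16 sketch (32-bit words, 12 rounds, 16-byte key); this is not the authors' Visual Basic 6.0 program, only the standard algorithm it implements:

```python
# RC5-32/12/16: key expansion plus one-block encryption.
W, R, MASK = 32, 12, 0xFFFFFFFF
P32, Q32 = 0xB7E15163, 0x9E3779B9      # RC5 magic constants for w = 32

def rotl(x, s):
    s %= W
    return ((x << s) | (x >> (W - s))) & MASK

def expand_key(key):
    """Key schedule: mix a 16-byte secret key into 2R + 2 round words."""
    c = len(key) // 4
    L = [int.from_bytes(key[4 * i:4 * i + 4], 'little') for i in range(c)]
    S = [(P32 + i * Q32) & MASK for i in range(2 * R + 2)]
    A = B = i = j = 0
    for _ in range(3 * max(len(S), c)):
        A = S[i] = rotl((S[i] + A + B) & MASK, 3)
        B = L[j] = rotl((L[j] + A + B) & MASK, A + B)
        i, j = (i + 1) % len(S), (j + 1) % c
    return S

def encrypt_block(pt, S):
    """Encrypt one 64-bit block given as a pair of 32-bit words."""
    A = (pt[0] + S[0]) & MASK
    B = (pt[1] + S[1]) & MASK
    for r in range(1, R + 1):
        A = (rotl(A ^ B, B) + S[2 * r]) & MASK
        B = (rotl(B ^ A, A) + S[2 * r + 1]) & MASK
    return A, B
```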

Measuring the Relative Efficiency of Korean Construction Companies Using DEA/Window

The sub-prime mortgage crisis that began in the US is regarded as the worst economic crisis since the Great Depression in the early 20th century. Hidden problems in the efficient operation of businesses were disclosed all at once, and many financial institutions went bankrupt or filed for court receivership. The collapse of the physical market led to the bankruptcy of manufacturing and construction businesses. This study analyzes the dynamic efficiency of construction businesses during the five years around the global financial crisis. By discovering the trend and stability of the efficiency of a construction business, this study aims to improve the management efficiency of construction businesses in the ever-changing construction market. Variables were selected by analyzing corporate information on the top 20 construction businesses in Korea and analyzed for static efficiency in 2008 and dynamic efficiency between 2006 and 2010. Unlike other studies, this study deduces the efficiency trend and stability of a construction business over five years by using the DEA/Window model. Using the analysis results, efficient and inefficient companies could be identified. In addition, the relative efficiency among DMUs was measured by comparing the relationship between the input and output variables of the construction businesses. This study can be used as a reference to improve the management efficiency of companies with low efficiency, based on the efficiency analysis of construction businesses.
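As a hedged sketch of the efficiency score underlying DEA (window analysis then repeats it over sliding multi-year windows), the basic input-oriented CCR model can be posed as a linear program; the use of scipy here is illustrative, not the study's tooling:

```python
# Input-oriented CCR efficiency of one DMU, solved as an LP.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """X: (n_dmu, n_in) inputs, Y: (n_dmu, n_out) outputs, k: DMU index.

    Multiplier form: max u.y_k  s.t.  v.x_k = 1,  u.y_j - v.x_j <= 0 (all j).
    Decision vector z = [u, v]; linprog minimizes, so the objective is negated.
    """
    n, m = Y.shape[1], X.shape[1]
    c = np.concatenate([-Y[k], np.zeros(m)])          # maximize u.y_k
    A_ub = np.hstack([Y, -X])                         # u.y_j - v.x_j <= 0
    b_ub = np.zeros(X.shape[0])
    A_eq = np.concatenate([np.zeros(n), X[k]])[None]  # v.x_k = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + m))
    return -res.fun                                   # efficiency in (0, 1]
```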

Modeling “Web of Trust” with Web 2.0

“Web of Trust" is one of the recognized goals for Web 2.0. It aims to make it possible for the people to take responsibility for what they publish on the web, including organizations, businesses and individual users. These objectives, among others, drive most of the technologies and protocols recently standardized by the governing bodies. One of the great advantages of Web infrastructure is decentralization of publication. The primary motivation behind Web 2.0 is to assist the people to add contents for Collective Intelligence (CI) while providing mechanisms to link content with people for evaluations and accountability of information. Such structure of contents will interconnect users and contents so that users can use contents to find participants and vice versa. This paper proposes conceptual information storage and linking model, based on decentralized information structure, that links contents and people together. The model uses FOAF, Atom, RDF and RDFS and can be used as a blueprint to develop Web 2.0 applications for any e-domain. However, primary target for this paper is online trust evaluation domain. The proposed model targets to assist the individuals to establish “Web of Trust" in online trust domain.

Injuries Related to Kitesurfing

Participation in sporting activities can lead to injury. Sport injuries have been widely studied in many sports, including the more extreme categories of aquatic board sports. Kitesurfing is a relatively new water-surface action sport and has not yet been widely studied in terms of injuries and stress on the body. The aim of this study was to find out which injuries are most common among kitesurfing participants, where they occur, and what causes them. Injuries were studied using an international open web questionnaire (n=206). Many respondents reported injuries: in total 251 injuries to the knee (24%), ankle (17%), trunk (16%) and shoulders (10%), often sustained while doing jumps and tricks (40%). The reported injuries included joint injuries (n=101), muscle/tendon damage (n=47), wounds and cuts (n=36) and bone fractures (n=28). Environmental factors and equipment can also influence the risk of injury, or the extent of injury in a hazardous situation. In conclusion, the information from this retrospective study supports earlier studies in terms of the prevalence and site of injuries. This information should be used to build a foundation of knowledge about the sport, for the development of physical training programs and for product development.

Region Segmentation Based on Gaussian Dirichlet Process Mixture Model and Its Application to 3D Geometric Structure Detection

Image-based 3D scenes can now be found in many popular vision systems, computer games and virtual reality tours, so it is important to segment the ROI (region of interest) from input scenes as a preprocessing step for geometric structure detection in a 3D scene. In this paper, we propose a method for segmenting the ROI based on tensor voting and a Dirichlet process mixture model. In particular, to estimate geometric structure information for a 3D scene from a single outdoor image, we apply tensor voting and the Dirichlet process mixture model to image segmentation. Tensor voting is used based on the fact that homogeneous regions in an image are usually close together on a smooth region, and therefore the tokens corresponding to the centers of these regions have high saliency values. The proposed approach is a novel nonparametric Bayesian segmentation method that uses a Gaussian Dirichlet process mixture model to automatically segment various natural scenes. Finally, our method labels the regions of the input image into coarse categories, “ground”, “sky” and “vertical”, for 3D applications. The experimental results show that our method successfully segments coarse regions in many complex natural scene images for 3D.
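As a hedged stand-in for the segmentation stage (tensor voting omitted), scikit-learn's BayesianGaussianMixture with a Dirichlet process prior can cluster simple per-pixel features; the feature choice and truncation level below are assumptions:

```python
# Dirichlet process Gaussian mixture segmentation of per-pixel features.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

def segment(image):
    """image: (H, W, 3) float RGB array. Returns an (H, W) label map."""
    h, w, _ = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Feature per pixel: color plus normalized position, so segments stay
    # spatially coherent (subsample pixels for speed on large images).
    feats = np.column_stack([image.reshape(-1, 3),
                             ys.ravel() / h, xs.ravel() / w])
    dpmm = BayesianGaussianMixture(
        n_components=10,                  # truncation level of the DP
        weight_concentration_prior_type='dirichlet_process',
        covariance_type='full', max_iter=200, random_state=0)
    labels = dpmm.fit_predict(feats)
    return labels.reshape(h, w)
```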

Performance and Availability Analyses of PV Generation Systems in Taiwan

This article applies the monthly final energy yield and failure data of 202 PV systems installed in Taiwan to analyze PV operational performance and system availability. The data are collected by the Industrial Technology Research Institute through manual records. Bad-data detection and failure-data estimation approaches are proposed to guarantee the quality of the received information. The performance ratio and system availability are then calculated and compared with those of other countries. The results indicate that the average performance ratio of Taiwan's PV systems is 0.74 and the availability is 95.7%. These results are similar to those of Germany, Switzerland, Italy and Japan.
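For reference, the two reported metrics reduce to simple ratios; the inputs below are illustrative numbers chosen to reproduce the reported averages, not the paper's data:

```python
# Performance ratio: final yield (kWh/kWp) over reference yield (hours of
# STC-equivalent sun); availability: uptime over total time.
def performance_ratio(e_ac_kwh, p_rated_kwp, h_poa_kwh_m2, g_stc_kw_m2=1.0):
    final_yield = e_ac_kwh / p_rated_kwp
    reference_yield = h_poa_kwh_m2 / g_stc_kw_m2
    return final_yield / reference_yield

def availability(uptime_hours, total_hours):
    return uptime_hours / total_hours

# e.g. 111 kWh from a 1 kWp array under 150 kWh/m2 of in-plane irradiation
print(round(performance_ratio(111, 1.0, 150), 2))   # -> 0.74
print(round(availability(700.4, 732), 3))           # -> 0.957
```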

Product-Based Industrial Information Systems (Application to the Steel Industry)

This paper presents a simple and effective approach to the design and implementation of Industrial Information Systems (IIS) oriented towards controlling the characteristics of each individual product manufactured in a production line, as well as its manufacturing conditions. The particular products considered in this work are large steel strips that are coiled just after manufacturing; however, the approach is directly applicable to coiled strips in other industries, such as paper, textile and aluminum. These IIS provide very detailed information on each manufactured product, complementing the general information managed by the ERP system of the production line. Despite the high importance of this type of IIS for guaranteeing and improving the quality of the products manufactured in many industries, there are very few works about them in the technical literature. For this reason, this paper represents an important contribution to the development of this type of IIS, providing guidelines for their design, implementation and exploitation.

Expression of Security Policy in Medical Systems for Electronic Healthcare Records

This paper introduces a tool being developed for the expression of the information security policy controls that govern electronic healthcare records. With reference to published findings, the paper introduces the theory behind the use of knowledge management for automatic and consistent security policy assertion, using a formalism called the Secutype; the development of the tool and its functionality are discussed; some examples of Secutypes generated by the tool are provided; and the proposed integration with existing medical record systems is described. The paper concludes with a section on further work and a critique of the work achieved to date.

Shoplifting in Riyadh, Saudi Arabia

This research was conducted using self-reports of shoplifters apprehended in supermarkets while stealing. 943 shoplifters were interviewed over three years, right after the act of stealing and before the police were called. The aim of the study is to characterize shoplifting in Saudi Arabia, including the traits of the shoplifters and the situation of the supermarkets where the stealing takes place. The analysis is based on the written information about each thief, following the documentary research method. Descriptive statistics as well as some inferential statistics were employed. The results show differences between genders, age groups, occupations, times of day, days of the week, months, ways of stealing, individual versus group thieves and other supermarket situations with respect to the type of items stolen, their total price and the number of items. The results and recommendations will serve as a guide for retailers on where, when and whom to watch in order to prevent shoplifting.

Organizational De-Evolution: The Small Group or Single Actor Terrorist

Traditionally, terror groups have been formed by ideologically aligned actors who perceive a lack of options for achieving political or social change. However, terrorist attacks have been increasingly carried out by small groups of actors or lone individuals who may be only ideologically affiliated with larger, formal terrorist organizations. The formation of these groups represents the inverse of traditional organizational growth, whereby structural de-evolution within issue-based organizations leads to the formation of small, independent terror cells. Ideological franchising, the bypassing of formal affiliation with the “parent” organization, represents the de-evolution of traditional concepts of organizational structure in favor of an organic, independent, and focused unit. Traditional definitions of issue-based dark networks include a focus on an identified goal, commitment to achieving this goal through unrestrained actions, and the selection of symbolic targets. The next step in the de-evolution of small dark networks is the mini-organization, consisting of only a handful of actors working toward a common, violent goal. Information-sharing through social media platforms, coupled with the civil liberties of democratic nations, provides the communication systems, access to information, and freedom of movement necessary for small dark networks to flourish without the aid of a parent organization. As attacks such as the 7/7 bombings demonstrate the effectiveness of small dark networks, terrorist actors will feel increasingly comfortable aligning with an ideology only, without formally organizing. The natural result of this de-evolving organization is the single-actor event, where an individual seems to subscribe to a larger organization's violent ideology with little or no formal ties.

Software Reliability Prediction Model Analysis

Software reliability prediction provides a great opportunity to measure the software failure rate at any point during system test, and a software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software, and various approaches can be used to improve it. In this article we focus on a software reliability model, assuming that there is time redundancy, the value of which (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the mathematical model under the assumption that not only irreversible failures may occur in the system, but also failures that can be treated as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the distribution function (DF) of the transmission time of an instruction sequence that consists of a random number of basic blocks. We consider the system software unreliable, with the time between adjacent failures following an exponential distribution.
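A hedged Monte Carlo sketch of the described model: a block is retransmitted until an exponential inter-failure time outlasts it, and the distribution of the sequence transmission time is estimated empirically by sampling (all parameter values are assumptions):

```python
# Transmission of an instruction sequence of a random number of basic
# blocks over a channel with exponential inter-failure times; a failed
# block is simply retransmitted (the "time redundancy").
import random

def sequence_time(lam=0.02, tau=1.0, mean_blocks=20):
    """Total time to push one instruction sequence through the channel."""
    n_blocks = 1 + int(random.expovariate(1.0 / mean_blocks))
    total = 0.0
    for _ in range(n_blocks):
        while True:
            total += tau                       # one transmission attempt
            if random.expovariate(lam) > tau:  # no failure within the block
                break                          # block delivered; next one
    return total

# Empirical distribution function of the sequence transmission time:
samples = sorted(sequence_time() for _ in range(10000))
median = samples[len(samples) // 2]
print(f"median sequence time: {median:.1f}")
```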

Reconstituting Information about Discontinued Water Quality Variables in the Nile Delta Monitoring Network Using Two Record Extension Techniques

The world economic crises and budget constraints have caused authorities, especially those in developing countries, to rationalize water quality monitoring activities. Rationalization consists of reducing the number of monitoring sites, the number of samples, and/or the number of water quality variables measured. The reduction in water quality variables is usually based on correlation: if two variables exhibit high correlation, it is an indication that some of the information produced may be redundant, so one variable can be discontinued while the other continues to be measured. Later, the ordinary least squares (OLS) regression technique is employed to reconstitute information about the discontinued variable by using the continuously measured one as an explanatory variable. In this paper, two record extension techniques are employed to reconstitute information about discontinued water quality variables: OLS and the Line of Organic Correlation (LOC). An empirical experiment is conducted using water quality records from the Nile Delta water quality monitoring network in Egypt. The record extension techniques are compared for their ability to predict different statistical parameters of the discontinued variables. Results show that OLS is better at estimating individual water quality records; however, it underestimates the variance in the extended records. The LOC technique is superior in preserving the characteristics of the entire distribution and avoids this underestimation of the variance. It is concluded that OLS can be used for the substitution of missing values, while LOC is preferable for inferring statements about the probability distribution.
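For concreteness, a short sketch of both estimators on synthetic data; it reproduces the central observation that OLS shrinks the extended-record variance by r² while LOC preserves it:

```python
# OLS vs. Line of Organic Correlation (LOC): the LOC slope is the ratio of
# standard deviations, signed by the correlation coefficient.
import numpy as np

def ols_fit(x, y):
    slope = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
    return slope, y.mean() - slope * x.mean()

def loc_fit(x, y):
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
    return slope, y.mean() - slope * x.mean()

# Extend the discontinued record y from the continued record x:
rng = np.random.default_rng(0)
x = rng.normal(10, 2, 200)
y = 0.8 * x + rng.normal(0, 1, 200)        # synthetic calibration period
for name, (m, b) in [('OLS', ols_fit(x, y)), ('LOC', loc_fit(x, y))]:
    y_hat = m * x + b
    print(name, 'variance ratio:', round(y_hat.var(ddof=1) / y.var(ddof=1), 2))
# OLS shrinks the variance by r**2; LOC reproduces it (ratio = 1.0).
```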

CFD Simulation of the Thermal-Hydraulic Characteristics within a Fuel Rod Bundle near Grid Spacers

This paper presents a detailed investigation of the thermal-hydraulic characteristics of the flow field in a fuel rod model, especially near the spacer. The investigated area provides information on the velocity flow field, vortices, and the amount of heat transferred into the coolant, all of which are critical for the design and improvement of fuel rods in nuclear power plants. The flow field investigation uses three-dimensional Computational Fluid Dynamics (CFD) with the Reynolds stress turbulence model (RSM). The fuel rod model incorporates a vertical annular channel in which three different shapes of spacers are used; each spacer shape is addressed individually. The spacers are compared with respect to the heat transfer between the coolant and the fuel rod model. The results are complemented by the heat transfer coefficient calculated at the location of the spacer and along the stainless-steel pipe.
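Not from the paper, but for orientation: a common single-phase reference correlation (Dittus-Boelter) of the kind often used to sanity-check CFD-computed heat transfer coefficients in channel flow:

```python
# Dittus-Boelter: h = Nu * k / D_h with Nu = 0.023 Re^0.8 Pr^0.4 (heating).
def dittus_boelter_htc(re, pr, k_fluid, d_hydraulic):
    nu = 0.023 * re ** 0.8 * pr ** 0.4
    return nu * k_fluid / d_hydraulic    # W/(m^2 K)

# e.g. a water-like coolant in a 10 mm hydraulic-diameter annulus
print(round(dittus_boelter_htc(5e4, 1.0, 0.6, 0.01)))
```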

A Study on the Differential Diagnostic Model for Newborn Hearing Loss Screening

According to statistics, the prevalence of congenital hearing loss in Taiwan is approximately six per thousand; furthermore, one per thousand of infants have severe hearing impairment. Hearing ability during infancy has a significant impact on the development of children's oral expression, language maturity, cognitive performance, educational ability and future social behavior. Although most children born with hearing impairment have sensorineural hearing loss, almost every child retains at least some residual hearing. If provided with a hearing aid or cochlear implant (a bionic ear) in time, in addition to hearing and speech training, even severely hearing-impaired children can still learn to talk. On the other hand, those who are not diagnosed, and are thus unable to begin hearing and speech rehabilitation in a timely manner, may lose an important opportunity to live a complete and healthy life: the lack of hearing and speaking ability will eventually affect the development of mental and physical functions, intelligence, and social adaptability. Not only does this result in irreparable regret for the hearing-impaired child for a lifetime, but it also creates a heavy burden for the family and society. Therefore, it is necessary to establish a computer-assisted predictive model that can accurately detect and help diagnose newborn hearing loss, so that early interventions can be provided in time and waste of medical resources eliminated. This study uses information from the neonatal database of the case hospital as its subjects and adopts two analysis approaches: using a support vector machine (SVM) directly for model predictions, and using logistic regression to conduct factor screening prior to the SVM predictions. The results indicate that the prediction accuracy is as high as 96.43% when the factors are screened and selected through logistic regression. Hence, the model constructed in this study offers real help to physicians in clinical diagnosis and genuinely benefits early intervention for newborn hearing impairment.
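A hedged sketch of the two-stage scheme (the screening rule is simplified here to coefficient magnitude; the study's statistical screening and the hospital's variables may differ):

```python
# Stage 1: logistic regression screens factors; stage 2: an SVM is trained
# on the retained factors only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

def screen_and_classify(X, y, keep=5):
    X = StandardScaler().fit_transform(X)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    # Stage 1: rank factors by the magnitude of their logistic coefficients.
    lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    top = np.argsort(-np.abs(lr.coef_[0]))[:keep]
    # Stage 2: SVM on the screened factors only.
    svm = SVC(kernel='rbf').fit(X_tr[:, top], y_tr)
    return svm.score(X_te[:, top], y_te), top
```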

Device for 3D Analysis of Basic Movements of the Lower Extremity

This document details the process of developing a wireless device that captures the basic movements of the foot (plantar flexion, dorsal flexion, abduction, adduction) and the knee (flexion). It implements a motion capture system using hardware based on optical fiber sensors, chosen for their advantages in terms of range, noise immunity and speed of data transmission and reception. The operating principle of the system is the detection and transmission of joint movement by mechanical elements and their measurement by optical ones (in this case infrared). Visual Basic software is used for the reception, analysis and processing of the signals acquired by the device, generating a real-time 3D graphical representation of each movement. The result is a boot that captures the movement, a transmission module (implementing XBee technology) and a receiver module that receives the information and sends it to the PC for processing. The main goal of this device is to support fields such as bioengineering and medicine by helping to improve quality of life and movement analysis.
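As a hedged illustration of the receiving side, here in Python with pyserial rather than the authors' Visual Basic (the port name, baud rate and frame format are assumptions):

```python
# Read comma-separated joint angles arriving through an XBee module on a
# serial port and hand them to the 3D visualization stage.
import serial  # pyserial

with serial.Serial('/dev/ttyUSB0', 9600, timeout=1) as port:
    while True:
        line = port.readline().decode('ascii', errors='ignore').strip()
        if not line:
            continue
        # Assumed frame: plantar/dorsal flexion, abduction/adduction, knee.
        try:
            flexion, abduction, knee = map(float, line.split(','))
        except ValueError:
            continue                       # skip malformed frames
        print(flexion, abduction, knee)    # hand off to the 3D renderer here
```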