Determinants of the U.S. Current Account

This article provides empirical evidence on the effect of domestic and international factors on the U.S. current account deficit. Linear dynamic regression and vector autoregression models are employed to estimate the relationships over the period 1986 to 2011. The findings suggest that the current and lagged private saving rate and the foreign current account of East Asian economies have played a vital role in shaping the U.S. current account. Additionally, Granger causality tests and variance decompositions show that changes in productivity growth and foreign domestic demand significantly influence changes in the U.S. current account. In summary, the empirical relationship between the U.S. current account deficit and its determinants is sensitive to the choice of regression model and specification.
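A minimal sketch of how such a VAR analysis with Granger causality tests and variance decompositions could be set up in Python with statsmodels; the file name, column names, and lag order below are illustrative assumptions, not the paper's actual specification.

```python
# Sketch of a VAR with Granger causality and variance decomposition (statsmodels).
# The CSV file, column names, and lag settings are illustrative assumptions.
import pandas as pd
from statsmodels.tsa.api import VAR

data = pd.read_csv("us_current_account.csv", index_col="year")
# e.g. columns: current_account, private_saving, productivity_growth, foreign_demand
model = VAR(data.diff().dropna())          # work with changes, as in the abstract
results = model.fit(maxlags=4, ic="aic")   # choose lag length by AIC

# Does productivity growth Granger-cause the current account?
gc = results.test_causality("current_account", ["productivity_growth"], kind="f")
print(gc.summary())

# Forecast error variance decomposition over a 10-period horizon
fevd = results.fevd(10)
print(fevd.summary())
```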

A GPU-Based Texture Mapping Technique for 3D Models Using Multi-View Images

Previous algorithms for generating and mapping 3D model textures from multi-view images suffer from problems in texture chart generation, namely self-intersection and concentration of the texture in texture space. They may also suffer from problems caused by occluded areas, such as the inner parts of the thighs. In this paper, we propose a texture mapping technique for 3D models using multi-view images on the GPU. We perform texture mapping per pixel directly in the GPU fragment shader, without generating a texture map, and we resolve occluded areas using the depth information of the 3D model. Our method requires more GPU computation than previous works, but it achieves real-time performance and avoids the problems mentioned above.
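A CPU/NumPy illustration of the core idea of per-pixel multi-view texturing with a depth-based occlusion test; the authors perform this in a GPU fragment shader, so the data layout, the helper names, and the simple "most front-facing visible view" weighting here are all assumptions.

```python
# Per-pixel view selection with a depth-based occlusion test (CPU sketch).
# All names and the view-scoring rule are assumptions for illustration only.
import numpy as np

def shade_pixel(world_pos, normal, views, depth_eps=1e-2):
    """views: list of dicts with 'P' (3x4 projection), 'depth' (HxW depth map),
    'image' (HxWx3 color image), and 'cam_dir' (unit viewing direction)."""
    best_color, best_score = np.zeros(3), -np.inf
    p_h = np.append(world_pos, 1.0)
    for v in views:
        x, y, w = v["P"] @ p_h
        if w <= 0:
            continue                                   # behind the camera
        u, vv = int(x / w), int(y / w)
        H, W = v["depth"].shape
        if not (0 <= u < W and 0 <= vv < H):
            continue                                   # outside this view
        # occlusion test: the pixel must be the closest surface in this view
        if abs(v["depth"][vv, u] - w) > depth_eps:
            continue
        score = -np.dot(normal, v["cam_dir"])          # prefer front-facing views
        if score > best_score:
            best_score, best_color = score, v["image"][vv, u]
    return best_color
```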

Automated Checking Focusing on CRUD for Requirements Analysis Models in UML

A key to successful development of high-quality software is defining a valid and feasible requirements specification. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface mock-up from the UML requirements analysis model, so that the validity of the input/output data for each page and of the page transitions can be confirmed by directly operating the mock-up. This paper proposes a support method to check the validity of the data life cycle using the model checking tool UPPAAL, focusing on CRUD (Create, Read, Update and Delete). Exhaustive checking improves the quality of the requirements analysis model, which is validated by the customers through the automatically generated mock-up. The effectiveness of our method is discussed through a case study of requirements modeling for two small projects: a library management system and a textbook sales support system for a university.
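For intuition only, the snippet below checks the kind of CRUD life-cycle property the abstract refers to on a single operation trace: data must be Created before it is Read or Updated, and never used after Delete. This is a plain Python illustration of the property, not the paper's UPPAAL encoding, and the trace format is an assumption.

```python
# Illustrative CRUD life-cycle check (not the paper's UPPAAL model).
def crud_life_cycle_ok(trace):
    """trace: sequence of (operation, entity) pairs, operation in {'C','R','U','D'}."""
    created, deleted = set(), set()
    for op, entity in trace:
        if op == "C":
            if entity in created and entity not in deleted:
                return False              # duplicate Create
            created.add(entity); deleted.discard(entity)
        elif op in ("R", "U"):
            if entity not in created or entity in deleted:
                return False              # use before Create or after Delete
        elif op == "D":
            if entity not in created or entity in deleted:
                return False
            deleted.add(entity)
    return True

# Example: a Read before Create is flagged as invalid
print(crud_life_cycle_ok([("R", "book"), ("C", "book")]))  # False
```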

On the Performance of Information Criteria in Latent Segment Models

Despite the widespread application of finite mixture models in segmentation, finite mixture model selection remains an important issue. In fact, selecting an adequate number of segments is a key step in deriving latent segment structures, and the selection criteria used for this purpose should be effective. We conduct a simulation study to compare several information criteria that may support the selection of the correct number of segments. In particular, the study is intended to determine which information criteria are more appropriate for mixture model selection when the data sets contain only categorical segmentation base variables. The analysis is based on generated mixtures of multinomial data. As a result, we establish a relationship between the level of measurement of the segmentation variables and the performance of eleven information criteria. The AIC3 criterion shows the best performance (it indicates the correct number of simulated segments most often) for mixtures of multinomial segmentation base variables.
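As a reference point, the sketch below shows the standard definitions of a few of the criteria such studies compare (AIC, AIC3, BIC, CAIC); the parameter count assumes a mixture of independent multinomials with K segments, which is one common setup but not necessarily the exact model used in the paper.

```python
# Information criteria for mixture model selection; the free-parameter count
# assumes K segments and independent multinomial variables within each segment.
import numpy as np

def n_parameters(K, n_categories):
    # K-1 mixing weights plus, per segment, (c-1) free probabilities per variable
    return (K - 1) + K * sum(c - 1 for c in n_categories)

def information_criteria(log_lik, K, n_categories, n_obs):
    p = n_parameters(K, n_categories)
    return {
        "AIC":  -2 * log_lik + 2 * p,
        "AIC3": -2 * log_lik + 3 * p,          # the criterion favoured in the abstract
        "BIC":  -2 * log_lik + p * np.log(n_obs),
        "CAIC": -2 * log_lik + p * (np.log(n_obs) + 1),
    }

# Usage: fit mixtures with K = 1..6, then keep the K that minimizes the chosen criterion.
```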

A Training Model for Successful Implementation of Enterprise Resource Planning

It is well recognized that one feature of a successful company is its ability to align its business goals with its information and communication technology platform. Enterprise Resource Planning (ERP) systems contribute to better performance by integrating various business functions and supporting information flows. However, the complexity of these technological systems is known to prevent business users from exploiting ERP systems efficiently. This paper investigates the role of training in improving the usage of ERP systems. To this end, we designed a survey instrument and administered it to employees of a Norwegian multinational provider of technology solutions. Based on the analysis of the collected data, we delineate a training model that could be of high relevance for both researchers and practitioners as a step towards a better understanding of ERP system implementation.

Modeling of Knowledge-Intensive Business Processes

Knowledge development in companies relies on knowledge-intensive business processes, which are characterized by high complexity in their execution, weak structuring, communication-oriented tasks, high decision autonomy, and often the need for creativity and innovation. A foundation of knowledge development is provided, based on a new conception of knowledge and knowledge dynamics. This conception consists of a three-dimensional model of knowledge with types, kinds and qualities. Built on this knowledge conception, knowledge dynamics is modeled with the help of general knowledge conversions between knowledge assets. Knowledge dynamics is understood here to cover the acquisition, conversion, transfer, development and usage of knowledge. Through this conception we gain a sound basis for knowledge management and development in an enterprise. In particular, the type dimension of knowledge, which categorizes it according to its internality and externality with respect to the human being, is crucial for enterprise knowledge management and development, because knowledge should be made available by converting it to more external types. Built on this conception, a modeling approach for knowledge-intensive business processes is introduced, covering human-driven, e-driven and task-driven processes. As an example of this approach, a model of the creative activity for the renewal planning of a product is given.

Conjugate Heat Transfer in an Enclosure Containing a Polygon Object

Conjugate natural convection in a differentially heated square enclosure containing a polygon-shaped object is studied numerically in this article. The effect of various polygon types on the fluid flow and thermal performance of the enclosure is addressed for different thermal conductivities. The governing equations are modeled and solved numerically using the built-in finite element method of the COMSOL software. It is found that the heat transfer rate remains stable as the polygon type is varied.

Analysis of the Structural Fluctuation of the Permitted Building Areas and Housing Distribution Ratios - Focused on 5 Cities Including Bucheon

The purpose of this study was to analyze the correlation between permitted building areas and housing distribution ratios and their fluctuation, and to test a distribution model across three successive governments in five cities, including Bucheon, using time series administrative data. The results are interpreted in relation to the policies pursued by the successive governments in order to examine the structural fluctuation of permitted building areas and housing distribution ratios. Spectral analysis was performed to analyze the fluctuation of permitted building areas and housing distribution ratios across the three governments and to examine the cycles in the time series data; tabulation was used to describe the correlation between permitted building areas and housing distribution ratios statistically; and a goodness-of-fit test was conducted to explain the differences in the fluctuation distributions of permitted building areas and housing distribution ratios among the three governments.
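A rough sketch of the two quantitative steps named above, spectral (periodogram) analysis of a yearly series and a chi-square goodness-of-fit test across government periods; the file name, column names, and the uniform benchmark are assumptions for illustration, not the study's actual data or test design.

```python
# Sketch of periodogram-based cycle detection and a chi-square goodness-of-fit test.
# The CSV file and column names are hypothetical.
import pandas as pd
from scipy.signal import periodogram
from scipy.stats import chisquare

df = pd.read_csv("bucheon_permits.csv", index_col="year")

# 1) Cycle detection in permitted building areas
freqs, power = periodogram(df["permitted_area"], detrend="linear")
dominant_period = 1.0 / freqs[power[1:].argmax() + 1]   # skip the zero frequency
print(f"dominant cycle is roughly {dominant_period:.1f} years")

# 2) Goodness of fit of housing distribution ratios across government periods
observed = df.groupby("government")["housing_ratio"].sum()
expected = [observed.sum() / len(observed)] * len(observed)  # uniform benchmark
print(chisquare(observed, f_exp=expected))
```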

Simulation of 3D Flow at Open-Channel Confluences Using a Numerical Model

This paper numerically investigates the 3D flow pattern at the confluence of two rectangular channels meeting at a 90° angle, using the Navier-Stokes equations with the Reynolds Stress turbulence Model (RSM). The equations are solved by the Finite Volume Method (FVM) and the flow is analyzed under steady-state, single-phase conditions. The experimental findings of Shumate were used to validate the model, and comparison of the simulated and experimental results indicated close agreement between the flow patterns. The effect of the discharge ratio on the dimensions of the separation zone created in the main channel downstream of the confluence showed an inverse relation: a decrease in the discharge ratio entails an increase in the length and width of the separation zone. The study also found the model to be a powerful analytical tool for the feasibility study of hydraulic engineering projects.

Confirming the Identity of the Individual Using Remote Assessment in E-learning

One major issue regularly cited as a barrier to the widespread use of online assessments in eLearning is the authentication of the student and the level of confidence an assessor can have that the assessment was actually completed by that student. Currently, this issue is either ignored, in which case confidence in the assessment and any ensuing qualification is damaged, or assessments are conducted at central, controlled locations at specified times, losing the benefits of the distributed nature of the learning programme. Particularly as we move towards constructivist models of learning, with intentions towards achieving heutagogic learning environments, the benefits of a properly managed online assessment system are clear. Here we discuss some of the approaches that could be adopted to address these issues, looking at the use of existing security and biometric techniques combined with some novel behavioural elements. These approaches offer the opportunity to validate the student on accessing an assessment, on submission, and also during the actual production of the assessment. These techniques are currently under development in the DECADE project, and future work will evaluate and report on their use.

Construction of Recombinant E. coli Expressing a Fusion Protein to Produce 1,3-Propanediol

In this study, a synthetic pathway was created by assembling genes from Clostridium butyricum and Escherichia coli in different combinations. Among the genes were dhaB1 and dhaB2 from C. butyricum VPI1718, coding for glycerol dehydratase (GDHt) and its activator (GDHtAc), respectively, which are involved in the conversion of glycerol to 3-hydroxypropionaldehyde (3-HPA). The yqhD gene from E. coli BL21, which codes for an NADPH-dependent 1,3-propanediol oxidoreductase isoenzyme (PDORI) reducing 3-HPA to 1,3-propanediol (1,3-PD), was also included. Molecular modeling analysis indicated that the conformation of the fusion protein of YQHD and DHAB1 was favorable for direct molecular channeling of the intermediate 3-HPA. According to the simulation results, the yqhD and dhaB1 genes were assembled upstream of dhaB2 to express a fusion protein, yielding the recombinant strain E. coli BL21 (DE3)//pET22b+::yqhD-dhaB1_dhaB2 (strain BP41Y3). Strain BP41Y3 gave a 10-fold higher 1,3-PD concentration than E. coli BL21 (DE3)//pET22b+::yqhD-dhaB1_dhaB2 (strain BP31Y2), which expresses the recombinant enzymes simultaneously but in a non-fusion mode. This is the first report using a gene fusion approach to enhance the biological conversion of glycerol to the value-added compound 1,3-PD.

An Agent Based Dynamic Resource Scheduling Model with FCFS-Job Grouping Strategy in Grid Computing

Grid computing is a group of clusters connected over high-speed networks that involves coordinating and sharing computational power, data storage and network resources across dynamic and geographically dispersed locations. Resource management and job scheduling are critical tasks in grid computing. Resource selection becomes challenging due to the heterogeneity and dynamic availability of resources. Job scheduling is an NP-complete problem, and different heuristics may be used to reach an optimal or near-optimal solution. This paper proposes a model for resource and job scheduling in a dynamic grid environment. The main focus is to maximize resource utilization and minimize the processing time of jobs. The grid resource selection strategy is based on a Max Heap Tree (MHT), which is well suited to large-scale applications; the root node of the MHT is selected for job submission. A job grouping concept is used to maximize resource utilization when scheduling jobs in grid computing. The proposed resource selection model and job grouping concept are used to enhance the scalability, robustness, efficiency and load balancing ability of the grid.
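A minimal sketch of the two ideas combined in the abstract: resources kept in a max-heap keyed by available capacity (so the root is the best candidate resource), and jobs grouped in FCFS order until they fill that capacity. The units (MIPS, MI), data, and stopping rule are assumptions, not the paper's exact scheme.

```python
# Max-heap resource selection plus FCFS job grouping (illustrative sketch).
import heapq

resources = [("R1", 500), ("R2", 1200), ("R3", 800)]        # (name, capacity in MIPS)
heap = [(-cap, name) for name, cap in resources]             # negate values for a max-heap
heapq.heapify(heap)

jobs = [("J1", 200), ("J2", 150), ("J3", 700), ("J4", 300)]  # (name, length in MI), FCFS order

neg_cap, resource = heapq.heappop(heap)                      # root of the MHT
capacity = -neg_cap

group, used = [], 0
for job, length in jobs:                                     # group jobs in arrival order
    if used + length <= capacity:
        group.append(job)
        used += length
    else:
        break                                                # FCFS: stop at the first job that does not fit

print(f"submit {group} (total {used} MI) to {resource} ({capacity} MIPS)")
heapq.heappush(heap, (-(capacity - used), resource))         # return remaining capacity to the heap
```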

GPU-Based Volume Rendering for Medical Imagery

We present a method for fast volume rendering using graphics hardware (GPU). To our knowledge, this is the first GPU implementation of the Shear-Warp algorithm. Our GPU-based method provides real-time frame rates and outperforms the CPU-based implementation. When the number of slices is not sufficient, we add in-between slices computed by interpolation, which improves the quality of the rendered images. We have also implemented the ray marching algorithm on the GPU. The results generated by the three algorithms (CPU-based and GPU-based Shear-Warp, GPU-based ray marching) for two test models show that the ray marching algorithm outperforms the Shear-Warp methods in terms of speed-up and image quality.
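To make the compositing idea behind ray marching concrete, here is a toy CPU sketch with orthographic rays along the z-axis and front-to-back alpha compositing; the transfer function, step size, and early-termination threshold are arbitrary assumptions, and a real implementation runs per fragment on the GPU.

```python
# Toy front-to-back ray marching of a scalar volume (orthographic rays along z).
import numpy as np

def ray_march(volume, step=1, threshold=0.1):
    """volume: (Z, Y, X) array of densities in [0, 1]; returns a (Y, X) image."""
    Z, Y, X = volume.shape
    color = np.zeros((Y, X))
    alpha = np.zeros((Y, X))
    for z in range(0, Z, step):                         # march front to back
        sample = volume[z]
        a = np.where(sample > threshold, sample, 0.0)   # simple transfer function
        color += (1.0 - alpha) * a * sample             # accumulate shaded value
        alpha += (1.0 - alpha) * a                      # accumulate opacity
        if np.all(alpha > 0.99):                        # early ray termination
            break
    return color

image = ray_march(np.random.rand(64, 128, 128))
```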

Measuring Relative Efficiency of Korean Construction Companies using DEA/Window

The sub-prime mortgage crisis that began in the US is regarded as the most severe economic crisis since the Great Depression of the early 20th century. Hidden problems in the efficient operation of businesses were disclosed all at once, and many financial institutions went bankrupt or filed for court receivership. The collapse of real markets led to the bankruptcy of manufacturing and construction businesses. This study analyzes the dynamic efficiency of construction businesses during the five years surrounding the global financial crisis. By uncovering the trend and stability of the efficiency of construction businesses, the study aims to improve management efficiency in the ever-changing construction market. Variables were selected by analyzing corporate information on the top 20 construction businesses in Korea, and static efficiency in 2008 and dynamic efficiency between 2006 and 2010 were analyzed. Unlike other studies, this study deduces the efficiency trend and stability of construction businesses over five years using the DEA/Window model. Using the analysis results, efficient and inefficient companies can be identified. In addition, relative efficiency among DMUs was measured by comparing the relationship between the input and output variables of the construction businesses. This study can be used as a reference for improving the management efficiency of companies with low efficiency, based on the efficiency analysis of construction businesses.
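For readers unfamiliar with DEA, the sketch below solves the standard input-oriented CCR efficiency LP for one DMU; a window analysis simply repeats this over sliding sub-periods of the panel. The input/output data, the single-input single-output setup, and the function names are illustrative assumptions, not the study's variables.

```python
# Input-oriented CCR DEA efficiency for one DMU via linear programming.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, j0):
    """X: (m, n) inputs, Y: (s, n) outputs for n DMUs; returns theta for DMU j0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                   # minimize theta; variables = [theta, lambda_1..n]
    A_in = np.c_[-X[:, [j0]], X]                  # sum(lambda * x) - theta * x0 <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y]           # -sum(lambda * y) <= -y0
    A_ub = np.r_[A_in, A_out]
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    bounds = [(None, None)] + [(0, None)] * n     # theta free, lambdas nonnegative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

X = np.array([[100.0, 80.0, 120.0]])              # hypothetical single input for 3 DMUs
Y = np.array([[300.0, 290.0, 310.0]])             # hypothetical single output
print([round(dea_efficiency(X, Y, j), 3) for j in range(3)])
```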

Modeling “Web of Trust” with Web 2.0

“Web of Trust” is one of the recognized goals of Web 2.0. It aims to make it possible for people, including organizations, businesses and individual users, to take responsibility for what they publish on the web. These objectives, among others, drive most of the technologies and protocols recently standardized by the governing bodies. One of the great advantages of the Web infrastructure is the decentralization of publication. The primary motivation behind Web 2.0 is to help people add content for Collective Intelligence (CI) while providing mechanisms to link content with people for the evaluation and accountability of information. Such a structure interconnects users and content, so that users can use content to find participants and vice versa. This paper proposes a conceptual information storage and linking model, based on a decentralized information structure, that links content and people together. The model uses FOAF, Atom, RDF and RDFS and can be used as a blueprint to develop Web 2.0 applications for any e-domain; however, the primary target of this paper is the online trust evaluation domain. The proposed model aims to assist individuals in establishing a “Web of Trust” online.
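A small sketch of the kind of decentralized linking such a model describes: a FOAF person linked to a published entry, with a trust statement attached to that content, serialized as RDF. The graph contents and the trust vocabulary namespace below are made-up placeholders, not the paper's actual schema.

```python
# Linking a person (FOAF) to published content and a trust statement with rdflib.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import FOAF, RDF

TRUST = Namespace("http://example.org/trust#")   # hypothetical trust vocabulary

g = Graph()
alice = URIRef("http://example.org/people/alice#me")
entry = URIRef("http://example.org/blog/entry-42")

g.add((alice, RDF.type, FOAF.Person))
g.add((alice, FOAF.name, Literal("Alice")))
g.add((alice, FOAF.made, entry))                 # link person -> published content
g.add((entry, TRUST.rating, Literal(0.9)))       # trust evaluation attached to the content

print(g.serialize(format="turtle"))
```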

Bi-Criteria Latency Optimization of Intra- and Inter-Autonomous System Traffic Engineering

Traffic Engineering (TE) is the process of controlling how traffic flows through a network in order to facilitate efficient and reliable network operation while simultaneously optimizing network resource utilization and traffic performance. TE improves the management of data traffic within a network and provides better utilization of network resources. Many research works consider intra-AS and inter-AS Traffic Engineering separately, but in reality each influences the other, so the network performance of both intra- and inter-Autonomous System (AS) traffic is not optimized properly. To achieve a better joint optimization of intra- and inter-AS TE, we propose a joint optimization technique that considers intra-AS features during inter-AS TE and vice versa. This work focuses on an important criterion, latency, both within an AS and between ASes, and proposes a bi-criteria latency optimization model. Overall network performance, in terms of latency, can thus be improved by applying this joint optimization technique.
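One common way to handle a bi-criteria objective is to scalarize the two criteria into a single edge weight and compute shortest paths on the combined graph; the sketch below does this for intra-AS and inter-AS latencies. The topology, latency values, and the weighting factor alpha are illustrative assumptions, not the paper's model.

```python
# Weighted-sum scalarization of intra-AS and inter-AS latency with networkx.
import networkx as nx

alpha = 0.6                                        # relative weight of intra-AS latency
G = nx.Graph()
edges = [                                          # (u, v, intra_ms, inter_ms)
    ("A1", "A2", 5, 0), ("A2", "B1", 0, 30),
    ("A1", "A3", 2, 0), ("A3", "B1", 0, 45), ("B1", "B2", 4, 0),
]
for u, v, intra, inter in edges:
    G.add_edge(u, v, cost=alpha * intra + (1 - alpha) * inter)

path = nx.shortest_path(G, "A1", "B2", weight="cost")
latency = nx.shortest_path_length(G, "A1", "B2", weight="cost")
print(path, latency)
```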

Model Checking Consistency of UML Diagrams Using Alloy

In this paper, we propose a method for detecting consistency violations between UML state machine diagrams and communication diagrams using Alloy. Using the Alloy input language, the proposed method expresses the system behaviors described by state machine diagrams, the message sequences described by communication diagrams, and a consistency property. Applying the method to an example system, we confirmed that consistency violations could be detected correctly using Alloy.

Region Segmentation based on Gaussian Dirichlet Process Mixture Model and its Application to 3D Geometric Structure Detection

Image-based 3D scenes can now be found in many popular vision systems, computer games and virtual reality tours, so it is important to segment the ROI (region of interest) from input scenes as a preprocessing step for geometric structure detection in a 3D scene. In this paper, we propose a method for segmenting the ROI based on tensor voting and a Dirichlet process mixture model. In particular, to estimate geometric structure information for a 3D scene from a single outdoor image, we apply tensor voting and the Dirichlet process mixture model to image segmentation. Tensor voting is used based on the fact that pixels of a homogeneous region usually lie close together on a smooth region, so the tokens corresponding to the centers of these regions have high saliency values. The proposed approach is a novel nonparametric Bayesian segmentation method that uses a Gaussian Dirichlet process mixture model to automatically segment various natural scenes. Finally, our method labels regions of the input image into coarse categories, “ground”, “sky” and “vertical”, for 3D applications. The experimental results show that our method successfully segments coarse regions in many complex natural scene images for 3D applications.
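A minimal sketch of the nonparametric clustering step with scikit-learn's Dirichlet process Gaussian mixture; the pixel features, the truncation level, and the omission of the tensor-voting saliency stage are all assumptions made to keep the example short, so this is not the authors' full pipeline.

```python
# Dirichlet process Gaussian mixture clustering of pixel features (scikit-learn).
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

features = np.random.rand(5000, 5)        # e.g. (x, y, R, G, B) per sampled pixel

dpgmm = BayesianGaussianMixture(
    n_components=20,                      # truncation level; unused components shrink away
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    max_iter=500,
    random_state=0,
)
labels = dpgmm.fit_predict(features)
print("effective segments:", len(np.unique(labels)))
```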

The Fundamental Reliance of Iterative Learning Control on Stability Robustness

Iterative learning control (ILC) aims to achieve zero tracking error of a specific command. This is accomplished by iteratively adjusting the command given to a feedback control system, based on the tracking error observed in the previous iteration. One would like the iterations to converge to zero tracking error in spite of any error present in the model used to design the learning law. First, this need for stability robustness is discussed, followed by the need for robustness of the property that the transients are well behaved. Methods of producing the needed robustness to parameter variations and to singular perturbations are presented. Then a method involving reverse-time runs is given that lets the real-world behavior produce the ILC gains in such a way as to eliminate the need for a mathematical model. Since the real world is producing the gains, there is no issue of model error. Provided the world behaves linearly, the approach gives an ILC law with both stability robustness and good transient robustness, without the need to generate a model.
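To illustrate the basic mechanism the abstract builds on, here is a toy simulation of the standard first-order ILC update u_{k+1}(t) = u_k(t) + L e_k(t+1) on a simple discrete plant; the plant, learning gain, and command are arbitrary assumptions and do not represent the robust designs discussed in the paper.

```python
# Toy iterative learning control: repeat the task, update the command from the error.
import numpy as np

N = 50
t = np.arange(N)
y_desired = np.sin(2 * np.pi * t / N)              # command to be tracked

def plant(u):
    # simple first-order lag as a stand-in for the closed-loop system
    y = np.zeros_like(u)
    for k in range(1, len(u)):
        y[k] = 0.8 * y[k - 1] + 0.2 * u[k - 1]
    return y

u = np.zeros(N)
L = 0.5                                            # learning gain
for iteration in range(100):
    e = y_desired - plant(u)
    u[:-1] += L * e[1:]                            # anticipate the one-step plant delay

print("final RMS tracking error:", np.sqrt(np.mean((y_desired - plant(u)) ** 2)))
```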

Research on Hybrid Neural Network in Intrusion Detection System

This paper presents an intrusion detection system based on a hybrid neural network model combining RBF and Elman networks, used for both anomaly detection and misuse detection. The model has a memory function and can detect discrete and temporally related attack behaviors effectively. The RBF network acts as a real-time pattern classifier, while the Elman network provides memory of previous events. The intrusion detection system based on this hybrid model is evaluated on the DARPA data set, and ROC curves are used to display the test results intuitively. The experiments show that the hybrid-model intrusion detection system can effectively improve the detection rate and reduce the false alarm and failure rates.
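A compact sketch of the two building blocks named above: an RBF layer that scores individual connection records, and an Elman-style recurrent layer whose fed-back hidden state provides memory of earlier events. The feature sizes, prototype centers, and weights are random stand-ins; a real IDS would train them on labelled DARPA-style features, and the simple additive fusion is an assumption.

```python
# Hybrid RBF + Elman scoring of a stream of connection records (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)

# RBF layer: distances to prototype centers -> Gaussian activations -> linear read-out
centers = rng.normal(size=(10, 41))        # 10 prototypes over 41 connection features
rbf_out_w = rng.normal(size=(10, 2))       # 2 classes: normal / attack

def rbf_layer(x, gamma=0.5):
    phi = np.exp(-gamma * np.sum((centers - x) ** 2, axis=1))
    return phi @ rbf_out_w

# Elman layer: hidden state fed back as context, giving memory of previous events
W_in, W_ctx, W_out = (rng.normal(size=s) for s in [(41, 16), (16, 16), (16, 2)])
context = np.zeros(16)

def elman_step(x):
    global context
    context = np.tanh(x @ W_in + context @ W_ctx)
    return context @ W_out

def hybrid_score(x):
    return rbf_layer(x) + elman_step(x)     # simple fusion of the two detectors

for record in rng.normal(size=(5, 41)):     # stream of connection records
    print(hybrid_score(record))
```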