Shape Restoration of the Left Ventricle

This paper describes an automatic algorithm to restore the shape of three-dimensional (3D) left ventricle (LV) models created from magnetic resonance imaging (MRI) data using a geometry-driven optimization approach. Our basic premise is to restore the LV shape such that the LV epicardial surface is smooth after the restoration. A geometric measure known as the minimum principal curvature (κ2) is used to assess the smoothness of the LV. This measure is used to construct the objective function of a two-step optimization process. The objective of the optimization is to achieve a smooth epicardial shape by iterative in-plane translation of the MRI slices. Quantitatively, this corresponds to minimizing the sum of the magnitudes of κ2 over the regions where κ2 is negative. A limited-memory quasi-Newton algorithm, L-BFGS-B, is used to solve the optimization problem. We tested our algorithm on an in vitro theoretical LV model and 10 in vivo patient-specific models which contain significant motion artifacts. The results show that our method is able to automatically restore the LV models to a smooth shape without altering the general shape of the model. The magnitudes of the in-plane translations are also consistent with existing registration techniques and experimental findings.
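
As a rough illustration of the optimization described above, the following Python sketch minimizes the summed magnitude of negative curvature over per-slice translations with scipy's L-BFGS-B. It is a one-dimensional toy analog with made-up data (a noisy profile standing in for the epicardial surface, signed profile curvature standing in for κ2), not the authors' 3D mesh implementation.

    # Toy 1-D analog of the described method: shift each "slice" in-plane so
    # that the summed magnitude of negative curvature of the profile is minimal.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    z = np.linspace(0.0, 1.0, 12)                        # slice positions
    true_x = (z - 0.5) ** 2                              # smooth profile (curvature >= 0)
    observed_x = true_x + rng.normal(0.0, 0.02, z.size)  # simulated motion artifacts

    def roughness(shifts):
        x = observed_x + shifts                          # apply per-slice translations
        x1 = np.gradient(x, z)
        x2 = np.gradient(x1, z)
        k = x2 / (1.0 + x1 ** 2) ** 1.5                  # signed curvature of the profile
        return np.sum(np.abs(k[k < 0]))                  # penalize only concave (k < 0) parts

    res = minimize(roughness, np.zeros(z.size), method="L-BFGS-B",
                   bounds=[(-0.1, 0.1)] * z.size)        # bounded in-plane shifts
    restored_x = observed_x + res.x                      # "restored" smooth profile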

Revisiting the Concept of Risk Analysis within the Context of Geospatial Database Design: A Collaborative Framework

The aim of this research is to design a collaborative framework that integrates risk analysis activities into the geospatial database design (GDD) process. Risk analysis is rarely undertaken iteratively as part of present GDD methods, in conformance with requirements engineering (RE) guidelines and risk standards. Accordingly, when risk analysis is performed during GDD, some foreseeable risks may be overlooked and may not reach the output specifications, especially when user intentions are not systematically collected. This may lead to ill-defined requirements and ultimately to higher risks of geospatial data misuse. The adopted approach consists of 1) reviewing the risk analysis process within the scope of RE and GDD, 2) analyzing the challenges of risk analysis within the context of GDD, and 3) presenting the components of a risk-based collaborative framework that improves the collection of the intended/forbidden usages of the data and helps geo-IT experts discover implicit requirements and risks.

Satellite Data Classification Accuracy Assessment Based on a Reference Dataset

In order to develop forest management strategies for tropical forests in Malaysia, surveying the forest resources and monitoring the forest area affected by logging activities is essential. Tremendous effort has been made in the classification of land cover related to forest resource management in this country, as it is a priority in all aspects of forest mapping using remote sensing and related technology such as GIS. In fact, classification is a compulsory step in any remote sensing research. Therefore, the main objective of this paper is to assess the classification accuracy of a classified forest map derived from Landsat TM data using different numbers of reference points (200 and 388). The comparison was made through an observation approach (200 reference points) and a combined interpretation and observation approach (388 reference points). Five land cover classes, namely primary forest, logged-over forest, water bodies, bare land and agricultural crop/mixed horticulture, can be identified by differences in spectral wavelength. Results showed that the overall accuracy from 200 reference points was 83.5% (kappa value 0.7502459; kappa variance 0.002871), which is considered acceptable or good for optical data. However, when the number of reference points in the confusion matrix was increased from 200 to 388, the accuracy improved slightly from 83.5% to 89.17%, and the kappa statistic increased from 0.7502459 to 0.8026135. The accuracy achieved suggests that the strategy for selecting training areas, the interpretation approach and the number of reference points used are important for obtaining better classification results.
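
For reference, both the overall accuracy and the kappa statistic reported above are derived directly from the confusion matrix; the sketch below shows the standard calculation on a made-up five-class matrix (illustrative only, not the paper's data).

    # Overall accuracy and Cohen's kappa from a confusion matrix (example data).
    import numpy as np

    def accuracy_and_kappa(cm):
        cm = np.asarray(cm, dtype=float)
        n = cm.sum()
        p_o = np.trace(cm) / n                                   # observed (overall) accuracy
        p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2   # chance agreement
        kappa = (p_o - p_e) / (1.0 - p_e)
        return p_o, kappa

    # rows = reference, columns = classified; classes: primary forest,
    # logged-over forest, water bodies, bare land, agricultural crop
    cm = [[50, 3, 0, 1, 2],
          [4, 40, 0, 2, 3],
          [0, 0, 20, 1, 0],
          [1, 2, 0, 30, 2],
          [2, 3, 0, 2, 32]]
    overall, kappa = accuracy_and_kappa(cm)
    print(f"overall accuracy = {overall:.4f}, kappa = {kappa:.4f}")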

Utilization of Advanced Data Storage Technology to Conduct the Construction Industry in a Clean Environment

Construction projects generally take place in uncontrolled and dynamic environments, and construction waste is a serious environmental problem in many large cities. The total amount of waste and of carbon dioxide emissions from transportation vehicles is still out of control due to the increasing number of construction projects, massive urban development projects and the lack of effective tools for minimizing adverse environmental impacts in construction. This research concerns the integrated application of automated advanced tracking and data storage technologies in the area of environmental management to monitor and control adverse environmental impacts such as construction waste and carbon dioxide emissions. Radio Frequency Identification (RFID) integrated with the Global Positioning System (GPS) provides an opportunity to uniquely identify materials, components and equipment and to locate and track them using minimal or no worker input. The transmission of data to the central database is carried out with the help of the Global System for Mobile Communications (GSM).

A Microcontroller Implementation of Model Predictive Control

Model Predictive Control (MPC) is increasingly being proposed for real-time applications and embedded systems. However, compared to the PID controller, the implementation of MPC in miniaturized devices such as Field Programmable Gate Arrays (FPGAs) and microcontrollers has historically been very limited due to its implementation complexity and its computation time requirements. At the same time, such embedded technologies have become an enabler for future manufacturing enterprises as well as a transformer of organizations and markets. Recently, advances in microelectronics and software have allowed such techniques to be implemented in embedded systems. In this work, we take advantage of these recent advances to deploy one of the most studied and applied control techniques in industrial engineering. Specifically, we propose an efficient framework for the implementation of Generalized Predictive Control (GPC) on the STM32 microcontroller. The STM32 Keil starter kit, based on a JTAG interface and the STM32 board, was used to implement the proposed GPC firmware. Besides the GPC, the anti-windup PID algorithm was also implemented using the Keil development tools designed for ARM processor-based microcontroller devices, working with the C/Cµ language. A performance comparison study between the two firmwares shows good execution speed and low computational burden. These results encourage the development of simple predictive algorithms to be programmed on industrial standard hardware. The main features of the proposed framework are illustrated through two examples and compared with the anti-windup PID controller.
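
The anti-windup PID used as the baseline can be summarized by a short sketch; the version below uses conditional-integration (clamping) anti-windup and is written in Python for brevity, whereas the paper's firmware is written in C for the STM32, so the structure and gains here are only assumptions.

    # Discrete PID with conditional-integration (clamping) anti-windup.
    class AntiWindupPID:
        def __init__(self, kp, ki, kd, dt, u_min, u_max):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.u_min, self.u_max = u_min, u_max
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measurement):
            e = setpoint - measurement
            d = (e - self.prev_error) / self.dt
            self.prev_error = e
            u = self.kp * e + self.ki * self.integral + self.kd * d
            u_sat = min(max(u, self.u_min), self.u_max)          # actuator limits
            # stop integrating while saturated and the error would push the
            # output further into saturation (anti-windup)
            if u_sat == u or (u > self.u_max and e < 0) or (u < self.u_min and e > 0):
                self.integral += e * self.dt
            return u_sat

A caller would instantiate it with plant-specific gains, for example AntiWindupPID(2.0, 1.0, 0.05, dt=0.01, u_min=0.0, u_max=100.0), and call update() once per sampling period.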

Fuzzy Logic PID Control of Automatic Voltage Regulator System

The application of a simple microcontroller to implement a three-input, single-output fuzzy logic controller with built-in Proportional-Integral-Derivative (PID) response control has been tested for an automatic voltage regulator. The fuzzifiers are based on a fixed range of the output voltage variables. The control output is used to drive the wiper motor of the autotransformer to adjust the voltage, using fuzzy logic principles, so that the voltage is stabilized. In this report, the author demonstrates how fuzzy logic might provide elegant and efficient solutions in the design of multivariable control based on experimental results rather than on mathematical models.
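
To make the controller structure concrete, the sketch below shows a minimal three-input, single-output fuzzy controller of the zero-order Sugeno type with triangular memberships; the input spans, rule singletons and gains are invented for illustration and are not the tuned controller reported here.

    # Minimal three-input (error, change of error, accumulated error),
    # single-output fuzzy controller with triangular memberships.
    def tri(x, a, b, c):
        """Triangular membership with feet a, c and peak b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuzzify(x, span):
        # degrees of membership in NEGATIVE, ZERO, POSITIVE over [-span, span]
        return {"N": tri(x, -2 * span, -span, 0.0),
                "Z": tri(x, -span, 0.0, span),
                "P": tri(x, 0.0, span, 2 * span)}

    def fuzzy_pid(e, de, se, spans=(10.0, 5.0, 50.0), gains=(1.0, 0.5, 0.1)):
        crisp, weight = 0.0, 0.0
        for x, span, gain in zip((e, de, se), spans, gains):
            for label, mu in fuzzify(x, span).items():
                out = {"N": -gain * span, "Z": 0.0, "P": gain * span}[label]
                crisp += mu * out                  # weighted singleton consequents
                weight += mu
        return crisp / weight if weight else 0.0   # crisp control output

    print(fuzzy_pid(e=4.0, de=-1.0, se=12.0))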

Supercompression for Full-HD and 4k-3D (8k) Digital TV Systems

In this work, we develop the concept of supercompression, i.e., compression above the compression standard used. In this context, both compression rates are multiplied. In fact, supercompression is based on super-resolution. That is to say, supercompression is a data compression technique that superposes spatial image compression on top of bit-per-pixel compression to achieve very high compression ratios. If the compression ratio is very high, then we use a convolutive mask inside the decoder that restores the edges, eliminating the blur. Finally, both the encoder and the complete decoder are implemented on General-Purpose computation on Graphics Processing Units (GPGPU) cards. Specifically, the mentioned mask is coded inside the texture memory of a GPGPU.
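
A possible shape of the decoder-side edge restoration is sketched below: the small image is upscaled and a convolutive (Laplacian-type) mask is added back to sharpen edges. The kernel values and scaling factor are assumptions; the paper stores its mask in GPGPU texture memory, which is not reproduced here.

    # Illustrative decoder step: upscale, then apply a convolutive sharpening mask.
    import numpy as np
    from scipy.ndimage import zoom, convolve

    def decode_and_sharpen(small_img, scale=2, amount=1.0):
        upscaled = zoom(small_img.astype(float), scale, order=1)     # bilinear upscaling
        laplacian = convolve(upscaled, np.array([[0, -1, 0],
                                                 [-1, 4, -1],
                                                 [0, -1, 0]], dtype=float))
        return np.clip(upscaled + amount * laplacian, 0, 255)        # edge restoration

    tiny = np.tile(np.array([[40.0, 200.0]]), (8, 4))                # toy high-contrast image
    print(decode_and_sharpen(tiny, scale=2).shape)                   # (16, 16)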

Heat Exchanger Design

This paper is intended to assist anyone with some general technical experience but perhaps limited specific knowledge of heat transfer equipment. A characteristic of heat exchanger design is the procedure of specifying a design, with its heat transfer area and pressure drops, and checking whether the assumed design satisfies all requirements. The purpose of this paper is to show how to design an oil cooler (heat exchanger), especially a shell-and-tube heat exchanger, which is the most common type of liquid-to-liquid heat exchanger. General design considerations and the design procedure are also illustrated in this paper, and a flow diagram is provided as an aid to the design procedure. The MATLAB and AutoCAD software are used in the design calculations. Fundamental heat transfer concepts and the complex relationships involved in such exchangers are also presented. The primary aim of this design is to obtain a high heat transfer rate without exceeding the allowable pressure drop. The resulting computer program is highly useful for designing shell-and-tube heat exchangers and for modifying existing designs.
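
The core sizing relations behind such a design procedure are the stream energy balance and the LMTD rate equation, Q = U * A * F * LMTD; the sketch below evaluates them for an arbitrary counter-flow example (all numbers assumed, not the paper's oil-cooler case).

    # Basic LMTD sizing relations for a shell-and-tube exchanger (example values).
    from math import log

    def lmtd(t_hot_in, t_hot_out, t_cold_in, t_cold_out):
        dT1 = t_hot_in - t_cold_out            # counter-flow terminal differences
        dT2 = t_hot_out - t_cold_in
        return (dT1 - dT2) / log(dT1 / dT2) if dT1 != dT2 else dT1

    # hot oil: 2.0 kg/s, cp = 2100 J/(kg K), cooled from 90 C to 60 C
    q = 2.0 * 2100.0 * (90.0 - 60.0)            # heat duty, W
    dT_lm = lmtd(90.0, 60.0, 25.0, 40.0)        # log-mean temperature difference, K
    U, F = 350.0, 0.95                          # assumed W/(m2 K) and correction factor
    area = q / (U * F * dT_lm)                  # required heat transfer surface, m2
    print(f"duty = {q/1e3:.1f} kW, LMTD = {dT_lm:.1f} K, area = {area:.1f} m2")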

A Proposal for Federation Technology for Authenticated Information between Terminals

Recently, various services such as television and the Internet have come to be received through a variety of terminals. For example, we could gain greater convenience by receiving these services through a cellular phone terminal when we go out and then continuing to receive the same services through a large-screen digital television after coming home. Currently, however, it is necessary to go through the same authentication processing again when switching to the TV at home. In this study, we have developed an authentication method that enables users to switch terminals in environments in which the user receives a service from a server through a terminal. Specifically, the method simplifies server-side authentication when switching from one terminal to another by using previously authenticated information.

Experimental Study of the Metal Foam Flow Conditioner for Orifice Plate Flowmeters

The sensitivity of orifice plate metering to disturbed flow (either asymmetric or swirling) is a subject of great concern to flow meter users and manufacturers. The distortions caused by pipe fittings and pipe installations upstream of the orifice plate are major sources of this type of non-standard flow. These distortions can alter the accuracy of metering to an unacceptable degree. In this work, a multi-scale object known as metal foam has been used to generate a predetermined turbulent flow upstream of the orifice plate. The experimental results showed that the combination of an orifice plate and a metal foam flow conditioner is broadly insensitive to upstream disturbances. The metal foam demonstrated good performance in removing swirl and producing a repeatable flow profile within a short distance downstream of the device. The results for non-standard flow conditions, including swirling and asymmetric flow, show that this package can preserve the accuracy of metering to the level required by the standards.
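
For context, the quantity being protected is the standard orifice-plate mass flow relation, in which the discharge coefficient assumes a fully developed upstream profile; the sketch below evaluates it for an arbitrary example (all values assumed).

    # Standard orifice-plate mass flow relation (ISO 5167 style), example values.
    from math import pi, sqrt

    def orifice_mass_flow(dp, rho, d, D, C=0.61, eps=1.0):
        """dp in Pa, rho in kg/m3, orifice bore d and pipe bore D in m."""
        beta = d / D                                       # diameter ratio
        return (C / sqrt(1.0 - beta ** 4)) * eps * (pi / 4.0) * d ** 2 * sqrt(2.0 * dp * rho)

    # water, 25 kPa differential, 50 mm bore orifice in a 100 mm pipe
    print(f"{orifice_mass_flow(25e3, 998.0, 0.05, 0.10):.3f} kg/s")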

Towards Growing Self-Organizing Neural Networks with Fixed Dimensionality

Competitive learning is an adaptive process in which the neurons in a neural network gradually become sensitive to different input pattern clusters. The basic idea behind Kohonen's Self-Organizing Feature Maps (SOFM) is competitive learning. SOFM can generate mappings from high-dimensional signal spaces to lower-dimensional topological structures. The main features of this kind of mapping are topology preservation, feature mapping and approximation of the probability distribution of the input patterns. To overcome some limitations of SOFM, e.g., a fixed number of neural units and a topology of fixed dimensionality, Growing Self-Organizing Neural Networks (GSONN) can be used. A GSONN can change its topological structure during learning: it grows by learning and shrinks by forgetting. To speed up training and convergence, a new variant of GSONN, twin growing cell structures (TGCS), is presented here. This paper first gives an introduction to competitive learning, SOFM and its variants. Then we discuss some GSONN with fixed dimensionality, including growing cell structures, its variants and the author's model, TGCS. The paper ends with a comparison of test results and conclusions.
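
The competitive learning step shared by SOFM and its growing variants can be sketched as follows: find the best-matching unit and pull it and its topological neighbours toward the input. The map size, learning rate and neighbourhood width below are arbitrary choices for illustration.

    # One online SOFM learning step: competition, then neighbourhood cooperation.
    import numpy as np

    rng = np.random.default_rng(1)
    grid = np.array([(i, j) for i in range(8) for j in range(8)])   # 8x8 map topology
    weights = rng.random((64, 2))                                   # 2-D input space

    def sofm_step(x, weights, lr=0.2, sigma=1.5):
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))        # best-matching unit
        dist2 = np.sum((grid - grid[bmu]) ** 2, axis=1)             # distance on the map
        h = np.exp(-dist2 / (2.0 * sigma ** 2))                     # neighbourhood function
        weights += lr * h[:, None] * (x - weights)                  # move toward the input
        return weights

    for x in rng.random((500, 2)):                                  # toy training run
        weights = sofm_step(x, weights)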

Image Enhancement using α-Trimmed Mean ε-Filters

Image enhancement is one of the most important and challenging preprocessing steps for almost all applications of image processing. To date, various methods such as the median filter and the α-trimmed mean filter have been suggested. It has been shown that the α-trimmed mean filter is a modification of the median and mean filters. On the other hand, ε-filters have shown excellent performance in suppressing noise: in spite of their simplicity, they achieve good results. However, the conventional ε-filter is based on a moving average. In this paper, we suggest a new ε-filter which utilizes the α-trimmed mean. We argue that this new method gives better outcomes than previous ones, and the experimental results confirm this claim.
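
One plausible way to combine the two ideas, shown only as an illustration and not necessarily the authors' exact formulation, is to apply the ε rule first (keep neighbours within ε of the centre pixel) and then take the α-trimmed mean of what remains:

    # Illustrative alpha-trimmed mean epsilon-filter for grayscale images.
    import numpy as np

    def alpha_trimmed_epsilon_filter(img, radius=1, eps=20.0, alpha=0.2):
        img = img.astype(float)
        out = img.copy()
        h, w = img.shape
        for i in range(radius, h - radius):
            for j in range(radius, w - radius):
                win = img[i - radius:i + radius + 1, j - radius:j + radius + 1].ravel()
                near = np.sort(win[np.abs(win - img[i, j]) <= eps])  # epsilon rule
                k = int(alpha * near.size)                            # number trimmed per side
                kept = near[k:near.size - k] if near.size > 2 * k else near
                out[i, j] = kept.mean()                               # kept always holds the centre
        return out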

Entrepreneurial Promotion among Farmers: The Early Impacts

The development of entrepreneurial competences among farmers has been pointed out as a necessary condition for the modernization of rural land in the face of globalization. However, the educational processes involved in such development have been studied little, especially in emerging economies. This research aims to shed light on some of the critical issues behind the early stages of the transformation of farmers into entrepreneurs, through in-depth interviews with farmers, entrepreneurial promoters and public officials participating in a public pilot project in Mexico. Although major impacts were expected only in the long run, important positive changes in the mindset of farmers and other participants were found in the early stages of the intervention. Apparently, the farmers started a process of becoming more conscious of the importance of preserving aquifer resources, as well as more market and entrepreneurially oriented.

Problems and Possible Solutions with the Development of a Computer Model of Quantum Theory

A computer model of Quantum Theory (QT) has been developed by the author. The major goal of the computer model was to support and demonstrate as large a scope of QT as possible. This includes simulations of the major QT (Gedanken-) experiments, such as the famous double-slit experiment. Besides the anticipated difficulties with (1) transforming exacting mathematics into a computer program, two further types of problems showed up, namely (2) areas where QT provides a complete mathematical formalism but, when it comes to concrete applications, the equations are not solvable at all, or only with extremely high effort, and (3) QT rules which are formulated in natural language and which do not seem to be translatable into precise mathematical expressions, nor into a computer program. The paper lists problems in all three categories and also describes the possible solutions or workarounds developed for the computer model.

Infrastructure Means for Adaptive Camouflage

The paper deals with the perspectives and possibilities of "smart solutions" for critical infrastructure protection, i.e., common computer-aided technologies used with a view to new, better protection of selected infrastructure objects. The paper focuses on a co-product of the Czech defence research project ADAPTIV, which is being carried out by the University of Defence, Faculty of Economics and Management, at the Department of Civil Protection. The project creates a system and technology for adaptive cybernetic camouflage of armed forces objects, armaments, vehicles and troops, and of the mobilization infrastructure. This adaptive camouflage system and technology will be useful for protecting army tactical activities and also for generating decoys. The fourth chapter of the paper concerns the possibilities of applying the introduced technology to the protection of selected civil (economically important) critical infrastructure objects. The aim of this section is to introduce the scientific capabilities and potential of the University of Defence research results and solutions for practice.

Quantitative Analysis of PCA, ICA, LDA and SVM in Face Recognition

Face recognition is a technique for automatically identifying or verifying individuals. It receives great attention in identification, authentication, security and many other applications. Diverse methods have been proposed for this purpose and many comparative studies have been performed; however, researchers have not reached a unified conclusion. In this paper, we report an extensive quantitative accuracy analysis of four of the most widely used face recognition algorithms: Principal Component Analysis (PCA), Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA) and Support Vector Machine (SVM), using the AT&T, Sheffield and Bangladeshi people face databases under diverse conditions such as illumination, alignment and pose variations.
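
A comparison of this kind can be prototyped with scikit-learn, as sketched below on the AT&T database (distributed in scikit-learn as the Olivetti faces); the split sizes, subspace dimensions and the classifiers attached to PCA and LDA are assumptions, and ICA (e.g. FastICA) could be added in the same way.

    # Prototype accuracy comparison of PCA, LDA and SVM pipelines on face data.
    from sklearn.datasets import fetch_olivetti_faces
    from sklearn.model_selection import train_test_split
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline

    faces = fetch_olivetti_faces()                      # AT&T (ORL) face images
    X_tr, X_te, y_tr, y_te = train_test_split(
        faces.data, faces.target, test_size=0.25,
        stratify=faces.target, random_state=0)

    models = {
        "PCA + 1-NN": make_pipeline(PCA(n_components=50), KNeighborsClassifier(1)),
        "LDA + 1-NN": make_pipeline(PCA(n_components=100),
                                    LinearDiscriminantAnalysis(),
                                    KNeighborsClassifier(1)),
        "PCA + SVM": make_pipeline(PCA(n_components=50), SVC(kernel="linear")),
    }
    for name, model in models.items():
        print(name, model.fit(X_tr, y_tr).score(X_te, y_te))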

Optimizing Usage of ICTs and Outsourcing Strategies in Business Models and Customer Satisfaction

Nowadays, less developed countries, in order to progress in science and technology and to reduce the technological gap with developed countries, are increasing their capacities and the transfer of technology from developed countries. To remain competitive, industry is continually searching for new methods to evolve its products. The business model is one of the latest buzzwords in the Internet and electronic business world. To be successful, organizations must look into the needs and wants of their customers. This research attempts to identify specific features of a company with a strong competitive advantage by analyzing the causes of customer satisfaction. Due to the rapid development of knowledge and information technology, business environments have become much more complicated. Information technology can help a firm aiming to gain a competitive advantage. This study explores the role and effect of Information and Communication Technology (ICT) in business models and customer satisfaction, as well as the relationships between ICTs and outsourcing strategies.

Analysis of Food Security Situation among Nigerian Rural Farmers

This paper analysed the food security situation among Nigerian rural farmers. Data collected from 202 rural farmers in Benue State were analysed using descriptive and inferential statistics. The study revealed that the majority of the respondents (60.83%) had medium dietary diversity. Furthermore, for the food secure households, the household daily calorie requirement was 10,723 and the household daily calorie consumption was 12,598, with a surplus index of 0.04; the food security index was 1.16 and the household daily per capita calorie consumption was 3,221.2. For the food insecure households, the household daily calorie requirement was 20,213 and the household daily calorie consumption was 17,393, with a shortfall index of 0.14; the food security index was 0.88 and the household daily per capita calorie consumption was 2,432.8. The most commonly used coping strategies during food stress included intercropping (99.2%), reliance on less preferred food (98.1%), limiting portion size at meal times (85.8%) and crop diversification (70.8%).
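
The indices reported above follow the usual household calorie-based definitions (food security index Z = calorie consumption / calorie requirement, with surplus and shortfall indices averaged over the secure and insecure groups). The sketch below computes them for made-up households; the recommended per-capita figure of 2,260 kcal is an assumption, not taken from the paper.

    # Illustrative household food security indices from made-up data.
    def food_security_indices(households, recommended_per_capita=2260.0):
        secure_gaps, insecure_gaps = [], []
        for consumption, size in households:
            requirement = recommended_per_capita * size
            z = consumption / requirement                    # food security index
            gap = (consumption - requirement) / requirement  # relative surplus/deficit
            (secure_gaps if z >= 1.0 else insecure_gaps).append(gap)
        surplus = sum(secure_gaps) / len(secure_gaps) if secure_gaps else 0.0
        shortfall = -sum(insecure_gaps) / len(insecure_gaps) if insecure_gaps else 0.0
        return surplus, shortfall

    # (daily calorie consumption, household size) for a few hypothetical households
    sample = [(12500, 5), (9800, 4), (7600, 4), (15300, 6)]
    print(food_security_indices(sample))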

Reversible, Embedded and Highly Scalable Image Compression System

In this work, a new method for low-complexity image coding is presented that permits different settings and great scalability in the generation of the final bit stream. The coding scheme is a continuous-tone still image compression system that combines lossy and lossless compression, making use of finite-arithmetic reversible transforms. Both the color-space transformation and the wavelet transformation are reversible. The transformed coefficients are coded by means of a coding system based on a subdivision into smaller components (CFDS), similar to bit-importance codification. The subcomponents so obtained are reordered by means of a highly configurable alignment system that, depending on the application, makes it possible to rearrange the elements of the image and to obtain different importance levels from which the bit stream will be generated. The subcomponents of each importance level are coded using a variable-length entropy coding system (VBLm) that permits the generation of an embedded bit stream. This bit stream by itself codes a compressed still image. However, applying a packing system to the bit stream after the VBLm stage yields a final, highly scalable bit stream composed of a basic image level and one or several improvement levels.
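
As an example of a finite-arithmetic reversible transform of the kind the system relies on, the sketch below shows the JPEG 2000 style reversible colour transform and its exact integer inverse; the paper does not state that it uses this particular transform.

    # Reversible colour transform (RCT): integer arithmetic, exactly invertible.
    def rct_forward(r, g, b):
        y = (r + 2 * g + b) // 4          # integer luma
        cb, cr = b - g, r - g             # integer chroma differences
        return y, cb, cr

    def rct_inverse(y, cb, cr):
        g = y - (cb + cr) // 4            # exact recovery of green
        return cr + g, g, cb + g          # r, g, b recovered exactly

    assert rct_inverse(*rct_forward(200, 100, 50)) == (200, 100, 50)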

COTT – A Testability Framework for Object-Oriented Software Testing

Testable software has two inherent properties: observability and controllability. Observability facilitates observation of the internal behavior of software to the required degree of detail. Controllability allows the creation of difficult-to-achieve states prior to the execution of various tests. In this paper, we describe COTT, a Controllability and Observability Testing Tool, used to create testable object-oriented software. COTT provides a framework that helps the user instrument object-oriented software to build in the required controllability and observability. During testing, the tool facilitates the creation of difficult-to-achieve states required for testing difficult-to-test conditions and the observation of internal details of execution at the unit, integration and system levels. The execution observations are logged in a test log file, which is used for post-analysis and to generate test coverage reports.
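
A generic illustration of the two properties, not the COTT API itself, is sketched below: a test-only hook forces a difficult-to-reach internal state (controllability) and internal decisions are written to a test log file for post-analysis (observability). The class, method names and log format are invented for the example.

    # Generic controllability/observability instrumentation example.
    import logging

    logging.basicConfig(filename="test.log", level=logging.DEBUG,
                        format="%(asctime)s %(message)s")

    class Account:
        def __init__(self, balance=0.0):
            self._balance = balance
            self._frozen = False

        # controllability: test-only hook to create a hard-to-reach state
        def _force_state(self, balance=None, frozen=None):
            if balance is not None:
                self._balance = balance
            if frozen is not None:
                self._frozen = frozen
            logging.debug("state forced: balance=%s frozen=%s",
                          self._balance, self._frozen)

        # observability: internal decisions are logged for post-analysis
        def withdraw(self, amount):
            logging.debug("withdraw(%s) balance=%s frozen=%s",
                          amount, self._balance, self._frozen)
            if self._frozen or amount > self._balance:
                logging.debug("withdraw rejected")
                return False
            self._balance -= amount
            return True

    # a test can now reach the "frozen account" branch directly
    acct = Account()
    acct._force_state(balance=100.0, frozen=True)
    assert acct.withdraw(10.0) is False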