Modeling of Normal and Atherosclerotic Blood Vessels using Finite Element Methods and Artificial Neural Networks

Analysis of blood vessel mechanics in normal and diseased conditions is essential for disease research, medical device design and treatment planning. In this work, 3D finite element models of a normal vessel and an atherosclerotic vessel with 50% plaque deposition were developed. The models were meshed with a finite number of tetrahedral elements and simulated using actual blood pressure signals. From the transient analysis performed on the models, parameters such as total displacement, strain energy density and entropy per unit volume were obtained. These parameters were then used to develop artificial neural network models for analyzing normal and atherosclerotic blood vessels. In this paper, the objectives of the study, the methodology and the significant observations are presented.
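
As a hedged illustration of the second stage of this pipeline (not the authors' actual model), the sketch below trains a small neural network on the three FEM-derived features named above; the feature values, class labels and network size are invented placeholders.

```python
# Hypothetical sketch: an MLP classifier trained on FEM-derived features
# (total displacement, strain energy density, entropy per unit volume)
# to separate the normal from the atherosclerotic vessel state.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholder feature matrix: rows = time steps of the transient analysis,
# columns = [total_displacement, strain_energy_density, entropy_per_volume].
X_normal = rng.normal(loc=[0.10, 5.0, 0.02], scale=0.01, size=(200, 3))
X_plaque = rng.normal(loc=[0.16, 8.0, 0.03], scale=0.01, size=(200, 3))
X = np.vstack([X_normal, X_plaque])
y = np.array([0] * 200 + [1] * 200)   # 0 = normal, 1 = 50% plaque deposition

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
scaler = StandardScaler().fit(X_train)

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(scaler.transform(X_train), y_train)
print("test accuracy:", clf.score(scaler.transform(X_test), y_test))
```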

Semi-Automatic Analyzer to Detect Authorial Intentions in Scientific Documents

Information Retrieval studies models and systems that allow a user to find the documents relevant to his information need. Information search remains a difficult problem because of the difficulty of representing and processing natural language, with phenomena such as polysemy. Intentional structures promise to be a new paradigm that extends existing document structures and enhances the different phases of document processing, such as creation, editing, search and retrieval. Recognizing the intentions of the authors of texts can reduce the scale of this problem. In this article, we present an intention recognition system based on a semi-automatic method for extracting intentional information from a corpus of texts. The system is also able to update the ontology of intentions, enriching the knowledge base containing all possible intentions of a domain. The approach relies on the construction of a semi-formal ontology, regarded as the conceptualization of the intentional information contained in a text. Experiments on scientific publications in the field of computer science were conducted to validate this approach.
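
For orientation only, the toy extractor below shows one way the semi-automatic extraction step could be seeded with surface patterns for authorial intention markers; the patterns and intention labels are hypothetical and would, in the described system, come from and feed back into the ontology of intentions.

```python
# Illustrative sketch: a simple pattern-based extractor for authorial
# intention markers in scientific abstracts (not the paper's own method).
import re

INTENTION_PATTERNS = {
    "propose":  r"\bwe (propose|present|introduce)\b",
    "evaluate": r"\bwe (evaluate|compare|assess)\b",
    "validate": r"\b(experiments?|case stud(y|ies)) .*?(validate|confirm)\b",
}

def extract_intentions(text: str) -> list[tuple[str, str]]:
    """Return (intention_label, matched_sentence) pairs found in the text."""
    hits = []
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        for label, pattern in INTENTION_PATTERNS.items():
            if re.search(pattern, sentence, flags=re.IGNORECASE):
                hits.append((label, sentence.strip()))
    return hits

sample = ("In this article, we present an intention recognition system. "
          "Experiments on scientific publications validate this approach.")
print(extract_intentions(sample))
```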

Comparison of Parametric and Nonparametric Techniques for Non-peak Traffic Forecasting

Accurately predicting non-peak traffic is crucial for any daily traffic forecasting model. In this paper, least squares support vector machines (LS-SVMs) are investigated for this practical problem. To the best of our knowledge, this is the first time the approach has been applied and its forecast performance analyzed in this domain. For comparison purposes, two parametric and two non-parametric techniques are selected because their effectiveness has been proven in past research. Having good generalization ability and guaranteeing global minima, LS-SVMs perform better than the other techniques. The clear improvement in stability and robustness further indicates that the approach is practically promising.
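
As a minimal sketch of the LS-SVM idea (not the paper's implementation), the code below solves the LS-SVM regression dual as a single linear system with an RBF kernel; the kernel width, regularisation value and the toy traffic series are arbitrary placeholders.

```python
# Least squares support vector regression (LS-SVM): the dual problem
# reduces to one linear system, so no quadratic programming is needed.
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    # Block system:  [0   1^T     ] [b]   [0]
    #                [1   K + I/g ] [a] = [y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]            # bias b, dual coefficients a

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Toy usage with the previous observation of a synthetic flow as the input.
t = np.arange(200, dtype=float)
flow = 50 + 10 * np.sin(2 * np.pi * t / 24) + np.random.default_rng(0).normal(0, 1, 200)
X, y = flow[:-1, None], flow[1:]
b, alpha = lssvm_fit(X[:150], y[:150])
pred = lssvm_predict(X[:150], b, alpha, X[150:])
print("test RMSE:", np.sqrt(np.mean((pred - y[150:]) ** 2)))
```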

On the Need to Have an Additional Methodology for Psychological Product Measurement and Evaluation

Cognitive science appeared about 40 years ago, following the challenge posed by artificial intelligence, as common territory for several scientific disciplines: IT, mathematics, psychology, neurology, philosophy, sociology, and linguistics. The new-born science was justified, on the one hand, by the complexity of the problems related to human knowledge and, on the other, by the fact that none of the above-mentioned sciences could explain mental phenomena on its own. Based on the data supplied by experimental sciences such as psychology or neurology, cognitive science builds models of the operation of the human mind. These models are implemented in computer programs and/or electronic circuits (specific to artificial intelligence) – cognitive systems – whose competences and performances are compared to human ones, leading to the reinterpretation of psychological and neurological data and to the construction of new models. In these processes, while psychology provides the experimental basis, philosophy and mathematics provide the level of abstraction necessary for the interaction of the sciences involved. The general problematic of the cognitive approach comprises two important types of approach: the computational one, starting from the idea that mental phenomena can be reduced to calculation operations on 1s and 0s, and the connectionist one, which considers the products of thinking to be the result of the interaction between all the component systems. In psychology, measurements in the computational register use classical questionnaires and psychometric tests, generally based on calculation methods. Considering both sides of cognitive science, we can notice a gap in the possibilities for measuring psychological products from the connectionist perspective, which requires a unitary understanding of the quality-quantity whole. In such an approach, measurement by calculation proves inefficient. Our research, carried out for more than 20 years, leads to the conclusion that measurement by forms properly fits the laws and principles of connectionism.

A New Dimension in Software Risk Management

A dynamic risk management framework for software projects is presented. Currently available software risk management frameworks and risk assessment models are static in nature and lack feedback capability. Such risk management frameworks cannot assess future changes in risk events. A dynamic risk management framework for software projects is therefore needed that provides a forward-looking assessment of risk events.

Grid-Based and Random-Based Ant Colony Algorithms for Automatic Hose Routing in 3D Space

Ant colony algorithms have been applied to difficult combinatorial optimization problems such as the travelling salesman problem and the quadratic assignment problem. In this paper, grid-based and random-based ant colony algorithms are proposed for automatic 3D hose routing, and their pros and cons are discussed. The algorithms use a tessellated format for the obstacles and the generated hoses in order to detect collisions. Representing obstacles and hoses in the tessellated format greatly helps the algorithms handle free-form objects and speeds up computation. The performance of the algorithms has been tested on a number of 3D models.
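
The simplified sketch below illustrates the grid-based variant of the idea, not the paper's algorithm: ants search a coarse 3D grid for a collision-free route, with pheromone reinforcement of short routes. Obstacle handling is a plain occupancy set rather than the tessellated collision test described above, and all ACO parameters are illustrative.

```python
# Grid-based ant colony search for a collision-free route in a 3D grid.
import random

SIZE = 10
OBSTACLES = {(x, 5, z) for x in range(3, 8) for z in range(7)}   # a wall with a gap
START, GOAL = (0, 0, 0), (9, 9, 9)
MOVES = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
pheromone = {}                                   # edge -> pheromone level

def manhattan(a, b):
    return sum(abs(p - q) for p, q in zip(a, b))

def neighbours(cell):
    for dx, dy, dz in MOVES:
        nxt = (cell[0] + dx, cell[1] + dy, cell[2] + dz)
        if all(0 <= c < SIZE for c in nxt) and nxt not in OBSTACLES:
            yield nxt

def walk(alpha=1.0, beta=2.0, max_steps=400):
    path, cell, visited = [START], START, {START}
    for _ in range(max_steps):
        if cell == GOAL:
            return path
        options = [n for n in neighbours(cell) if n not in visited]
        if not options:
            return None
        weights = [pheromone.get((cell, n), 1.0) ** alpha
                   * (1.0 / (1 + manhattan(n, GOAL))) ** beta for n in options]
        cell = random.choices(options, weights)[0]
        visited.add(cell)
        path.append(cell)
    return None

best = None
for _ in range(200):                                           # iterations
    tours = [p for p in (walk() for _ in range(20)) if p]      # 20 ants
    for key in list(pheromone):
        pheromone[key] *= 0.9                                  # evaporation
    for p in tours:
        deposit = 1.0 / len(p)                                 # shorter routes deposit more
        for a, b in zip(p, p[1:]):
            pheromone[(a, b)] = pheromone.get((a, b), 1.0) + deposit
        if best is None or len(p) < len(best):
            best = p
print("best route length:", len(best) if best else "none found")
```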

Bank Business Models and the Changes in CEE Countries

The aim of this article is to assess the business models used by the banks operating in the CEE countries from 2006 to 2011. To obtain the research results, the authors performed a qualitative analysis of the scientific literature on bank business models, which have been grouped into clusters consisting of the following components: 1) capital and reserves; 2) assets; 3) deposits, and 4) loans. In turn, the bank business models have been developed based on the types of core activities of the banks and divided into four groups: Wholesale, Investment, Retail and Universal Banks. Descriptive statistics have been used to analyse the models, determining the mean, minimum and maximum values of the constituent cluster components, as well as the standard deviation. The analysis of the data is based on such bank variable indices as Return on Assets (ROA) and Return on Equity (ROE).
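
The snippet below only illustrates the descriptive-statistics step (mean, minimum, maximum and standard deviation of the cluster components per business model group); the figures are made-up placeholders, not CEE bank data.

```python
# Descriptive statistics of cluster components grouped by business model.
import pandas as pd

banks = pd.DataFrame({
    "model":    ["Retail", "Retail", "Wholesale", "Universal", "Investment"],
    "capital":  [1.2, 1.5, 3.1, 2.4, 2.0],      # capital and reserves
    "assets":   [10.5, 12.0, 40.2, 25.3, 18.7],
    "deposits": [8.0, 9.1, 5.2, 15.4, 3.3],
    "loans":    [7.2, 8.3, 30.1, 17.8, 6.4],
    "ROA":      [0.9, 1.1, 0.4, 0.7, 0.2],
    "ROE":      [9.5, 11.2, 6.3, 8.1, 2.7],
})

summary = banks.groupby("model").agg(["mean", "min", "max", "std"])
print(summary)
```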

Data Mining Classification Methods Applied in Drug Design

Data mining incorporates a group of statistical methods used to analyze a set of information, or a data set. It operates with models and algorithms, which are powerful tools with great potential. They can help people understand the patterns in a certain chunk of information, so data mining tools clearly have a wide area of application. For example, in theoretical chemistry, data mining tools can be used to predict molecule properties or to improve computer-assisted drug design. Classification analysis is one of the major data mining methodologies. The aim of this contribution is to create a classification model able to deal with a huge data set with high accuracy. For this purpose, logistic regression, Bayesian logistic regression and random forest models were built using the R software. A Bayesian logistic regression model was created in the Latent GOLD software as well. These classification methods belong to the supervised learning methods. It was necessary to reduce the dimension of the data matrix before constructing the models, and thus factor analysis (FA) was used. The models were applied to predict the biological activity of molecules, potential new drug candidates.
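
A hedged sketch of this modelling pipeline is given below: factor analysis for dimension reduction followed by logistic regression and a random forest. scikit-learn is used here in place of the R and Latent GOLD tools mentioned above, and the descriptor data are random placeholders rather than molecular data.

```python
# Factor analysis + supervised classifiers, evaluated by cross-validation.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 60))            # molecular descriptors (placeholder)
y = (X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=500) > 0).astype(int)  # active / inactive

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("random forest", RandomForestClassifier(n_estimators=200, random_state=0))]:
    pipe = make_pipeline(FactorAnalysis(n_components=10, random_state=0), model)
    acc = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name}: cross-validated accuracy = {acc:.3f}")
```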

Prototype for Enhancing Information Security Awareness in Industry

Human-related information security breaches within organizations are primarily caused by employees who have not been made aware of the importance of protecting the information they work with. Information security awareness is accordingly attracting more attention from industry, because stakeholders are held accountable for the information with which they work. To address shortcomings in existing information security awareness models, the authors developed an Information Security Retrieval and Awareness model, entitled "ISRA", tailored specifically towards enhancing information security awareness in industry amongst all users of information. This paper is principally aimed at expounding a prototype for the ISRA model to highlight the advantages of utilizing the model. The prototype focuses on the non-technical, human-related information security issues in industry. It ensures that all stakeholders in an organization are part of an information security awareness process, and that these stakeholders are able to retrieve specific information related to the information security issues relevant to their job category, preventing them from being overburdened with redundant information.

Optimization Approaches for a Complex Dairy Farm Simulation Model

This paper describes the optimization of a complex dairy farm simulation model using two quite different methods of optimization, the genetic algorithm (GA) and the Lipschitz Branch-and-Bound (LBB) algorithm. These techniques have been used to improve an agricultural system model developed by Dexcel Limited, New Zealand, which provides a detailed representation of pastoral dairying scenarios and contains an 8-dimensional parameter space. The model incorporates sub-models of pasture growth and animal metabolism, which are themselves complex in many cases. Each evaluation of the objective function, a composite 'Farm Performance Index (FPI)', requires simulation of at least a one-year period of farm operation with a daily time step and is therefore computationally expensive. The problem of visualizing the objective function (response surface) in high-dimensional spaces is also considered in the context of the farm optimization problem. Adaptations of the Sammon mapping and parallel coordinates visualization are described which help visualize some important properties of the model's output topography. From this study, it is found that the GA requires fewer function evaluations than the LBB algorithm to optimize the model.
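
For illustration of the GA side of the comparison only, the sketch below runs a small real-coded genetic algorithm over an 8-dimensional parameter space with an expensive black-box objective. The objective function here is a cheap stand-in for the Farm Performance Index, not the Dexcel simulation, and the bounds and GA settings are invented.

```python
# Real-coded GA: tournament selection, uniform crossover, Gaussian mutation.
import numpy as np

rng = np.random.default_rng(0)
DIM, POP, GENS = 8, 30, 40
LOWER, UPPER = np.zeros(DIM), np.ones(DIM)        # placeholder parameter bounds

def farm_performance_index(params):
    """Stand-in for one full-year, daily time-step farm simulation."""
    return -np.sum((params - 0.3) ** 2)            # higher is better

def tournament(pop, fitness, k=3):
    idx = rng.choice(len(pop), size=k, replace=False)
    return pop[idx[np.argmax(fitness[idx])]]

pop = rng.uniform(LOWER, UPPER, size=(POP, DIM))
for _ in range(GENS):
    fitness = np.array([farm_performance_index(p) for p in pop])
    children = []
    for _ in range(POP):
        a, b = tournament(pop, fitness), tournament(pop, fitness)
        child = np.where(rng.random(DIM) < 0.5, a, b)                 # uniform crossover
        child += rng.normal(0, 0.05, DIM) * (rng.random(DIM) < 0.2)   # sparse mutation
        children.append(np.clip(child, LOWER, UPPER))
    pop = np.array(children)

best = max(pop, key=farm_performance_index)
print("best parameters:", np.round(best, 3))
```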

Chaos Theory and Application in Foreign Exchange Rates vs. IRR (Iranian Rial)

The daily production of information, and the importance of the sequence of produced data for forecasting future market performance, turn the analysis of data behaviour into a time series analysis problem. Very complicated time series are usually assumed to be random, and their changes are therefore considered unpredictable. Yet such series might be the product of a deterministic, nonlinear dynamical (chaotic) process and hence be predictable. From the point of view of chaos theory, complicated systems only appear chaotic; they seem unregulated and random, but may in fact obey a specific mathematical rule. In this article, using the strange attractor test and the largest Lyapunov exponent, the possibility of chaos in several foreign exchange rates versus the IRR (Iranian Rial) has been investigated. The results show that the data in this market have complex chaotic behavior with a high degree of freedom.
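
The sketch below gives a rough idea of how a largest Lyapunov exponent can be estimated from a scalar series using delay embedding and nearest-neighbour divergence in the spirit of Rosenstein's method; it is not the paper's procedure, and the embedding dimension, delay and the chaotic test series (a logistic map standing in for a daily rate series) are placeholders.

```python
# Largest Lyapunov exponent via delay embedding and average log-divergence.
import numpy as np

def largest_lyapunov(x, dim=5, tau=1, follow=12, min_sep=10):
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])  # delay embedding
    usable = n - follow
    div, counts = np.zeros(follow), np.zeros(follow)
    for i in range(usable):
        d = np.linalg.norm(emb[:usable] - emb[i], axis=1)
        d[max(0, i - min_sep): i + min_sep] = np.inf      # exclude temporal neighbours
        j = np.argmin(d)                                  # nearest neighbour of point i
        for k in range(follow):
            sep = np.linalg.norm(emb[i + k] - emb[j + k])
            if sep > 0:
                div[k] += np.log(sep)
                counts[k] += 1
    curve = div / np.maximum(counts, 1)
    # Slope of the average log-divergence curve approximates the exponent.
    return np.polyfit(np.arange(follow), curve, 1)[0]

# Logistic map in its chaotic regime as a stand-in for a daily rate series.
x = np.empty(2000); x[0] = 0.4
for t in range(1999):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])
print("estimated largest Lyapunov exponent:", largest_lyapunov(x))
```

A positive estimate is the usual indication of sensitive dependence on initial conditions, i.e. of chaotic rather than purely random behaviour.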

Task Modeling for User Interface Design: A Layered Approach

The model-based approach to user interface design relies on developing separate models that capture various aspects of users, tasks, the application domain, and presentation and dialog representations. This paper presents a task modeling approach for user interface design and aims at exploring the mappings between task, domain and presentation models. The basic idea of our approach is to identify typical configurations in task and domain models and to investigate how they relate to each other. Special emphasis is put on application-specific functions and on mappings between domain objects and operational task structures. In this respect, we distinguish three layers in the task decomposition: a functional layer, a planning layer, and an operational layer.

Complex Energy Signal Model for Digital Human Fingerprint Matching

This paper describes a complex energy signal model that is isomorphic with digital human fingerprint images. By using signal models, the problem of fingerprint matching is transformed into the signal processing problem of finding the correlation between two complex signals that differ by a phase rotation and a time scaling. A technique for minutiae matching that is independent of image translation, rotation and linear scaling, and is resistant to missing minutiae, is proposed. The method was tested using random data points. The results show that for matching prints the scaling and rotation angles are closely estimated, and a stronger match yields a higher correlation.
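
The toy sketch below conveys the complex-signal view only: minutiae coordinates become complex samples, translation and scale are removed by normalisation, and a rotation appears as a constant phase factor that correlation recovers. Point correspondences are assumed known here, which the full method does not require; all data are random placeholders.

```python
# Minutiae as complex numbers; rotation recovered from the correlation phase.
import numpy as np

rng = np.random.default_rng(1)

def normalise(z):
    z = z - z.mean()                       # remove translation
    return z / np.abs(z).mean()            # remove linear scaling

template = rng.uniform(0, 300, 20) + 1j * rng.uniform(0, 300, 20)    # random minutiae
theta, scale = np.deg2rad(25), 1.4
query = scale * np.exp(1j * theta) * template + (40 + 15j)           # rotated, scaled, shifted copy

a, b = normalise(template), normalise(query)
corr = np.vdot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

print("match strength:", abs(corr))                     # close to 1 for matching prints
print("estimated rotation (deg):", np.degrees(np.angle(corr)))
```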

A Dual Model for Efficiency Evaluation Considering Time Lag Effect

A DEA model generally evaluates performance using multiple inputs and outputs for the same period. However, the production lead-time phenomenon, as in long-term projects or marketing activities, is sometimes hard to avoid. A couple of models have been suggested to capture this time lag issue in the context of DEA. This paper develops a dual-MPO model to deal with the time lag effect in evaluating efficiency. A numerical example is also given to show that the proposed model can be used to obtain the efficiency and reference set of inefficient DMUs and to derive the projected target values of input attributes that would make inefficient DMUs efficient.
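
For background, the sketch below solves a plain single-period input-oriented CCR envelopment model with a linear programming solver; it does not reproduce the dual-MPO time-lag extension, and the inputs and outputs are toy figures.

```python
# Input-oriented CCR DEA: minimise theta over [theta, lambda_1..lambda_n].
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 2.0], [6.0, 3.0], [8.0, 5.0], [5.0, 4.0]])   # inputs, one row per DMU
Y = np.array([[10.0], [12.0], [11.0], [9.0]])                    # outputs, one row per DMU
n = len(X)

def ccr_efficiency(k):
    c = np.concatenate(([1.0], np.zeros(n)))
    A_in = np.hstack([-X[k][:, None], X.T])                 # sum_j lam_j x_ij <= theta * x_ik
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])    # sum_j lam_j y_rj >= y_rk
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(X.shape[1]), -Y[k]]),
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for k in range(n):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```

Inefficient DMUs (efficiency below 1) can be projected onto the frontier by scaling their inputs with the obtained efficiency score, which is the crisp analogue of the target-setting step mentioned above.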

Memory Estimation of Internet Server Using Queuing Theory: Comparative Study between M/G/1, G/M/1 and G/G/1 Queuing Model

How to effectively allocate system resources to process client requests at gateway servers is a challenging problem. In this paper, we propose an improved scheme for the autonomous performance of gateway servers under highly dynamic traffic loads. We devise a methodology to calculate queue length and waiting time from gateway server information in order to reduce the response time variance in the presence of bursty traffic. The most widespread consideration is performance, because gateway servers must offer cost-effective and high-availability services over extended periods and therefore have to be scaled to meet the expected load. Performance measurements can be the basis for performance modeling and prediction. With the help of performance models, performance metrics (such as buffer estimation and waiting time) can be determined during the development process. This paper describes the possible queue models that can be applied to estimate the queue length and, from it, the final value of the memory size. Both simulation and experimental studies using synthesized workloads, and an analysis of real-world gateway servers, demonstrate the effectiveness of the proposed system.
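
As a reference point for the M/G/1 member of the comparison, the sketch below applies the Pollaczek-Khinchine formula to obtain mean waiting time and queue length from the arrival rate and the first two moments of the service time; the traffic values are illustrative, not measured gateway figures.

```python
# M/G/1 mean-value metrics via the Pollaczek-Khinchine formula.
def mg1_metrics(lam, es, es2):
    """lam: arrival rate, es: E[S], es2: E[S^2] of the service time."""
    rho = lam * es
    if rho >= 1:
        raise ValueError("system is unstable (utilisation >= 1)")
    wq = lam * es2 / (2 * (1 - rho))      # mean waiting time in queue
    lq = lam * wq                         # mean queue length (Little's law)
    return {"utilisation": rho,
            "mean_wait": wq,
            "mean_queue_length": lq,
            "mean_in_system": lq + rho}

# Example: 80 requests/s, mean service 10 ms, exponential service (E[S^2] = 2*E[S]^2).
print(mg1_metrics(lam=80.0, es=0.010, es2=2 * 0.010 ** 2))
```

The mean queue length multiplied by an assumed average request size then gives a rough buffer (memory) estimate of the kind discussed above.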

Analytic and Finite Element Solutions for Temperature Profiles in Welding using Varied Heat Source Models

Solutions for the temperature profile around a moving heat source are obtained using both analytic and finite element (FEM) methods, and both are applied to study the temperature profile in welding. The moving heat source is represented using both a point heat source model and a uniformly distributed disc heat source model. The analytic solutions are obtained by solving the partial differential equation for energy conservation in a solid, and the FEM results are obtained by simulating welding in the ANSYS software. The comparison is made for quasi-steady-state conditions. The results provided by the analytic solutions are in good agreement with the results obtained by FEM.
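
For reference, the quasi-steady-state analytic point-source (Rosenthal thick-plate) solution has the form T = T0 + Q/(2*pi*k*R) * exp(-v*(xi + R)/(2*alpha)), with xi the coordinate along the travel direction in the moving frame and R the distance from the source. The sketch below evaluates it with generic mild-steel property values, which are placeholders rather than the exact ones used in the study.

```python
# Rosenthal point heat source solution in moving coordinates.
import numpy as np

def rosenthal_point(xi, y, z, Q=2000.0, v=0.005, T0=300.0,
                    k=41.0, alpha=1.2e-5):
    """Temperature (K); xi = x - v*t is the distance ahead (+) or behind (-)
    the source along the travel direction, k is conductivity (W/m K),
    alpha is thermal diffusivity (m^2/s), Q is net heat input (W)."""
    R = np.sqrt(xi ** 2 + y ** 2 + z ** 2)
    return T0 + Q / (2 * np.pi * k * R) * np.exp(-v * (xi + R) / (2 * alpha))

# Temperature profile along the weld centreline, 2 mm below the surface.
xi = np.linspace(-0.03, 0.01, 9)
print(np.round(rosenthal_point(xi, 0.0, 0.002), 1))
```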

Meteorological Data Study and Forecasting Using Particle Swarm Optimization Algorithm

Weather systems use enormously complex combinations of numerical tools for study and forecasting. Unfortunately, due to phenomena in the world climate, such as the greenhouse effect, classical models may become insufficient, mostly because they lack adaptation. The weather forecasting problem is therefore well suited to heuristic approaches, such as evolutionary algorithms. Experimentation with heuristic methods like the particle swarm optimization (PSO) algorithm can lead to the development of new insights or promising models that can be fine-tuned with more focused techniques. This paper describes a PSO approach for the analysis and prediction of data and provides experimental results of the aforementioned method on real-world meteorological time series.
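
A bare-bones PSO sketch of the kind described above is given below, minimising the one-step-ahead squared forecast error of a linear autoregressive predictor on a toy meteorological series; the series, model form and PSO settings are all placeholders rather than the paper's setup.

```python
# Particle swarm optimisation of autoregressive forecast weights.
import numpy as np

rng = np.random.default_rng(0)
series = 20 + 5 * np.sin(np.arange(365) * 2 * np.pi / 365) + rng.normal(0, 0.5, 365)
LAGS = 3

def forecast_error(w):
    preds = [w @ series[t - LAGS:t] for t in range(LAGS, len(series))]
    return np.mean((np.array(preds) - series[LAGS:]) ** 2)

def pso(objective, dim, particles=20, iters=100, inertia=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(-1, 1, (particles, dim))
    vel = np.zeros((particles, dim))
    pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((particles, dim)), rng.random((particles, dim))
        vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

weights, mse = pso(forecast_error, LAGS)
print("AR weights:", np.round(weights, 3), "MSE:", round(mse, 3))
```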

Packing Theory for Natural and Crushed Aggregate to Obtain the Best Mix of Aggregate: Research and Development

Concrete performance is strongly affected by the degree of particle packing, since it determines the distribution of the cementitious component and the interaction of mineral particles. By using packing theory, designers are able to select optimal aggregate materials for preparing concrete with a low cement content, which is beneficial in terms of cost. Optimum particle packing implies minimizing porosity and thereby reducing the amount of cement paste needed to fill the voids between the aggregate particles, while also taking the rheology of the concrete into consideration. Superplasticizers are required to reach good fluidity. The results from pilot tests at Luleå University of Technology (LTU) show various forms of the proposed theoretical models, and the empirical approach taken in the study seems to provide a safer basis for developing new, improved packing models.
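
For orientation only, one widely used theoretical target grading curve is the modified Andreassen (Dinger-Funk) model, against which an actual aggregate blend can be compared; it is not necessarily the model proposed here, and the exponent and size limits below are illustrative rather than values from the LTU pilot tests.

```python
# Modified Andreassen target grading curve: cumulative percent passing.
import numpy as np

def modified_andreassen(d, d_min=0.063, d_max=16.0, q=0.37):
    """Ideal cumulative percent passing at particle size d (mm)."""
    return 100.0 * (d ** q - d_min ** q) / (d_max ** q - d_min ** q)

sieves = np.array([0.125, 0.25, 0.5, 1, 2, 4, 8, 16])   # sieve sizes in mm
print(dict(zip(sieves.tolist(), np.round(modified_andreassen(sieves), 1))))
```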

Solution of Interval-valued Manufacturing Inventory Models With Shortages

A manufacturing inventory model with shortages is considered here, in which the carrying cost, shortage cost, setup cost and demand quantity are imprecise numbers, namely interval numbers, instead of real numbers. First, a brief survey of the existing work on comparing and ranking any two interval numbers on the real line is presented. A common algorithm for the optimum production quantity (economic lot size) per cycle of a single product, minimizing the total average cost, is then developed, which works well for the interval number optimization under consideration. Finally, the designed algorithm is illustrated with a numerical example.
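
As a very rough sketch of the interval-valued idea (a simplification, not the paper's ranking-based algorithm), the code below evaluates the crisp finite-production-rate lot-size formula with backorders at the endpoints of each interval-valued parameter to bound the optimum; endpoint enumeration is adequate here because the formula is monotone in each parameter, and all interval values are invented.

```python
# Interval bounds on the EPQ-with-shortages optimal lot size.
from itertools import product
from math import sqrt

P = 500.0                              # production rate (crisp, units per period)
D  = (180.0, 220.0)                    # demand rate            [lower, upper]
C1 = (0.8, 1.2)                        # holding cost per unit
C2 = (4.0, 6.0)                        # shortage cost per unit
C3 = (90.0, 110.0)                     # setup cost per cycle

def epq_with_shortages(d, c1, c2, c3):
    """Crisp optimal lot size for the finite-rate model with backorders."""
    return sqrt(2 * d * c3 * (c1 + c2) / (c1 * c2 * (1 - d / P)))

candidates = [epq_with_shortages(d, c1, c2, c3)
              for d, c1, c2, c3 in product(D, C1, C2, C3)]
print("optimal lot size lies in the interval",
      (round(min(candidates), 1), round(max(candidates), 1)))
```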

Assessing Semantic Consistency of Business Process Models

Business process modeling has become an accepted means for designing and describing business operations. Consistency of business process models, i.e., the absence of modeling faults, is therefore of utmost importance to organizations. This paper presents a concept and a subsequent implementation for detecting faults in business process models and for computing a measure of their consistency. It covers not only syntactic consistency but also semantic consistency, i.e., consistency regarding the meaning of model elements from a business perspective.