The Regional Concept, Public Policy and Policy Spaces: The ARC and TVA

This paper examines two policy spaces, the ARC and TVA, and their spatialized politics. The research observes that the regional concept informs public policy and can contribute to the formation of stable policy initiatives. Using the subsystem framework to understand the political viability of policy regimes, the authors conclude that policy geographies that appeal to traditional definitions of regions are more stable over time. In contrast, geographies that fail to reflect pre-existing representations of space are engaged in more competitive subsystem politics. The paper demonstrates that the spatial practices of policy regions and their directional politics influence the political viability of programs. The paper concludes that policy spaces should institutionalize pre-existing geographies rather than manufacture new ones.

A Novel Method for Behavior Modeling in Uncertain Information Systems

None of the existing software development process models explicitly addresses the performance evaluation and modeling of software systems. Moreover, information systems are inherently subject to uncertainty arising from the nature of their requirements, which introduces further challenges during software development. By defining an extended version of UML (Fuzzy-UML, or F-UML), functional requirements that are specified under uncertainty can be supported. This study focuses on describing the behavior of uncertain information systems with the aid of fuzzy state diagrams, and it investigates the role of behavioral diagrams in F-UML within the software performance modeling process. To this end, a fuzzy sub-profile is used.
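
To make the notion of a fuzzy state diagram more concrete, the sketch below shows a toy fuzzy state machine in which the system occupies several states simultaneously with membership degrees and transitions carry truth degrees. It only illustrates the general idea under a min/max composition assumption; it is not the F-UML sub-profile defined in this study, and all names and values are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class FuzzyStateMachine:
    # membership[state] = degree to which the system is currently in that state
    membership: dict[str, float] = field(default_factory=dict)
    # transitions[(src, event)] = list of (dst, truth_degree) pairs
    transitions: dict[tuple[str, str], list[tuple[str, float]]] = field(default_factory=dict)

    def fire(self, event: str) -> None:
        new_membership: dict[str, float] = {}
        for (src, ev), targets in self.transitions.items():
            if ev != event:
                continue
            for dst, degree in targets:
                # min-composition: a target is reached to the degree the source holds
                # AND the transition applies; max aggregates alternative paths.
                value = min(self.membership.get(src, 0.0), degree)
                new_membership[dst] = max(new_membership.get(dst, 0.0), value)
        self.membership = new_membership

# Usage: a server that is 0.8 "responsive" and 0.3 "overloaded" receives a load spike.
fsm = FuzzyStateMachine(
    membership={"responsive": 0.8, "overloaded": 0.3},
    transitions={
        ("responsive", "load_spike"): [("responsive", 0.4), ("overloaded", 0.9)],
        ("overloaded", "load_spike"): [("failed", 0.6), ("overloaded", 1.0)],
    },
)
fsm.fire("load_spike")
print(fsm.membership)  # {'responsive': 0.4, 'overloaded': 0.8, 'failed': 0.3}
```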

Adaptive Fourier Decomposition Based Signal Instantaneous Frequency Computation Approach

There have been different approaches to computing the analytic instantaneous frequency, with a variety of background reasoning, practical applicability and restrictions. This paper presents an instantaneous frequency computation approach based on the adaptive Fourier decomposition and α-counting. The adaptive Fourier decomposition is a recently proposed signal decomposition approach; the instantaneous frequency can be computed through the so-called mono-components it produces. Due to its fast energy convergence, the highest-frequency content of the signal, which in most situations represents noise, is discarded by the adaptive Fourier decomposition. A new instantaneous frequency definition for a large class of so-called simple waves is also proposed in this paper. The class of simple waves contains a wide range of signals for which the concept of instantaneous frequency has a clear physical sense. The α-counting instantaneous frequency can be used to compute the highest frequency of a signal. By combining these two approaches, one can obtain the instantaneous frequency of the whole signal. An experiment demonstrates the computation procedure, with promising results.
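
For reference, the classical notion against which such approaches are usually framed defines the instantaneous frequency as the phase derivative of the analytic representation of the signal (a standard textbook definition, not the new definition proposed in the paper):

\[
s_a(t) = \rho(t)\, e^{i\theta(t)}, \qquad \mathrm{IF}(t) = \theta'(t),
\]

where s_a is the analytic signal obtained from the real signal via the Hilbert transform; mono-components are commonly understood as components for which θ'(t) ≥ 0, so that this definition is physically meaningful.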

The Panpositionable Hamiltonicity of k-ary n-cubes

The hypercube Q_n is one of the most well-known and popular interconnection networks, and the k-ary n-cube Q^k_n is an enlarged family derived from Q_n that preserves many of the pleasing properties of hypercubes. In this article, we study the panpositionable hamiltonicity of Q^k_n for k ≥ 3 and n ≥ 2. Let x and y be two arbitrary vertices of V(Q^k_n) and let C be a hamiltonian cycle of Q^k_n. We use d_C(x, y) to denote the distance between x and y on the hamiltonian cycle C. Let l be an integer satisfying d(x, y) ≤ l ≤ |V(Q^k_n)|/2. We prove the following:
• When k = 3 and n ≥ 2, there exists a hamiltonian cycle C of Q^k_n such that d_C(x, y) = l.
• When k ≥ 5 is odd and n ≥ 2, we require that l ∉ S, where S is a set of specific integers. Then there exists a hamiltonian cycle C of Q^k_n such that d_C(x, y) = l.
• When k ≥ 4 is even and n ≥ 2, we require that l − d(x, y) be even. Then there exists a hamiltonian cycle C of Q^k_n such that d_C(x, y) = l.
The result is optimal since the restrictions on l are due to the structure of Q^k_n by definition.
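
For concreteness, the lower bound d(x, y) above is the ordinary graph distance in Q^k_n; under the standard definition of the k-ary n-cube (vertices are n-tuples over {0, ..., k−1}, adjacent when exactly one coordinate differs by ±1 modulo k) it reduces to a coordinate-wise cyclic (Lee) distance. A minimal sketch under that standard definition:

```python
def qkn_distance(x, y, k):
    """Graph distance between two vertices of the k-ary n-cube Q^k_n.

    Vertices are n-tuples over {0, ..., k-1}; adjacent vertices differ in exactly
    one coordinate by +/-1 modulo k, so the shortest path length is the sum of the
    cyclic (Lee) distances taken coordinate by coordinate.
    """
    return sum(min((a - b) % k, (b - a) % k) for a, b in zip(x, y))

# Example in Q^5_3 (k = 5, n = 3): coordinates 0 and 4 are one step apart (wrap-around).
print(qkn_distance((0, 2, 4), (4, 2, 1), k=5))  # -> 1 + 0 + 2 = 3
```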

The Multimedia Interactive Theatre by Virtual Means Regarding Computational Intelligence in Space Design as HCI and Samples from Turkey

The aim of this study is to emphasize the opportunities in space design, treated as performance areas, under the aspect of HCI. HCI is a multidisciplinary approach that can be identified in many different areas. The aesthetic reflections of HCI through virtual reality in space design are high-tech solutions in which new computational facilities are combined with artistic features. The method of this paper is to treat the subject in three main parts: the first part gives a general approach to and definition of interactivity on the basis of space design; the second part examines the concept of the multimedia interactive theatre through selected samples from around the world and their interactive design aspects; the third part identifies samples from Turkey in terms of stage design principles. As a result, it can be stated that the multimedia database is the virtual approach to theatre stage design with respect to interactive means and computational facilities according to aesthetic aspects. HCI is mostly identified in theatre stages as computational intelligence under the effect of interactivity.

What Do Iranian Free-Style Wrestlers Know and Think about Doping? – A Knowledge and Attitude Study

Nowadays, doping is an intricate dilemma. Wrestling is a nationally popular sport in Iran, and the prevalence of doping may be high due to the sport's power-demanding characteristics. We therefore aimed to assess knowledge of and attitudes toward doping among club wrestlers. In a cross-sectional study, 426 wrestlers were studied using a researcher-made questionnaire. The researchers selected the clubs by randomized cluster sampling and distributed the questionnaire among the wrestlers. Wrestlers' knowledge in the three categories of doping definitions, recognition of prohibited drugs, and side effects was poor or moderate in 70.8%, 95.8% and 99.5% of cases, respectively. Wrestlers have poor knowledge of doping and, furthermore, hold some unfavorable misconceptions. It seems necessary to design a comprehensive educational program for all athletes and coaches.

A Formal Suite of Object Relational Database Metrics

Object Relational Databases (ORDB) are more complex in nature than traditional relational databases because they combine the characteristics of object-oriented concepts with the relational features of conventional databases. The design of an ORDB demands an efficient, high-quality schema that takes the structural, functional and componential traits into consideration. This internal quality of the schema is assured by metrics that measure the relevant attributes. This is extended to substantiate the understandability, usability and reliability of the schema, thus assuring its external quality. This work institutes a formalization of ORDB metrics: metric definition, evaluation methodology and calibration of the metrics. Three ORDB schemas were used to conduct the evaluation and the formalization of the metrics. The metrics are calibrated using content- and criteria-related validity based on their measurability, consistency and reliability. Nominal and summative scales are derived from the evaluated metric values and are standardized. Future work pertaining to ORDB metrics forms the concluding note.

Definition of Foot Size Model using Kohonen Network

In order to define a new model of Tunisian foot sizes and to build more comfortable shoes, Tunisian industrialists must be able to offer their customers products that fit and adjust to the majority of the target population concerned. Moreover, the use of shoe models originating mainly from other countries causes a mismatch between the foot and the comfort of Tunisian shoes; since every foot is unique, these models become uncomfortable for the Tunisian foot. We have a set of measurements produced from 3D scans of the feet of a diverse population (women, men, ...), and we analyze these data to define a foot model specific to Tunisian footwear design. In this paper we propose two new approaches to modeling a new foot size model. We use neural networks, specifically the Kohonen network. We then combine the neural networks with the concept of the half foot size to improve the models already found. Finally, we compare the results obtained with each approach and determine which one yields the foot model best suited to designing more comfortable shoes.
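
Below is a minimal sketch of the kind of Kohonen (self-organizing map) training loop such an approach relies on, assuming foot measurements are given as feature vectors; the grid size, learning-rate schedule and synthetic data are placeholders, not the authors' settings:

```python
import numpy as np

def train_som(data, grid_shape=(10, 10), epochs=100, lr0=0.5, sigma0=3.0, seed=0):
    """Train a minimal Kohonen self-organizing map on foot-measurement vectors.

    data: (n_samples, n_features) array, e.g. foot length, width, girth, ...
    Returns the trained weight grid of shape (rows, cols, n_features).
    """
    rng = np.random.default_rng(seed)
    rows, cols = grid_shape
    n_features = data.shape[1]
    weights = rng.random((rows, cols, n_features))

    # Grid coordinates used to compute the neighbourhood function.
    coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

    for epoch in range(epochs):
        lr = lr0 * np.exp(-epoch / epochs)        # decaying learning rate
        sigma = sigma0 * np.exp(-epoch / epochs)  # shrinking neighbourhood radius
        for x in rng.permutation(data):
            # Best-matching unit: node whose weight vector is closest to x.
            dist = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dist), dist.shape)
            # Neighbourhood factor decays with grid distance from the BMU.
            grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
            h = np.exp(-grid_dist2 / (2 * sigma ** 2))
            weights += lr * h[..., None] * (x - weights)
    return weights

# Example: cluster synthetic foot measurements (length, width, girth in mm) into prototypes.
feet = np.random.default_rng(1).normal([250.0, 95.0, 240.0], [15.0, 8.0, 12.0], size=(500, 3))
prototypes = train_som(feet)
```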

Towards Compliance Reporting Using a Balanced Scorecard

Compliance requires effective communication within an enterprise as well as towards a company's external environment. This requirement begins with the implementation of compliance within large-scale compliance projects and persists in compliance reporting within standard operations. On the one hand, the understanding of compliance necessities within the organization is promoted; on the other hand, a reduction of asymmetric information with compliance stakeholders is achieved. To reach this goal, central reporting must provide a consolidated view of the statuses of the different compliance efforts. A concept which could be adapted for this purpose is the balanced scorecard by Kaplan and Norton. This concept has not been analyzed in detail concerning its adequacy for holistic compliance reporting, from its start in compliance projects through its later use in regular compliance operations. This paper first evaluates whether holistic compliance reporting can be designed using the balanced scorecard concept. The current status of compliance reporting clearly shows that scorecards are generally accepted as a compliance reporting tool and are already used for corporate governance reporting; in addition, specialized compliance IT solutions exist in the market. After the scorecard's adequacy is thoroughly examined and substantiated, an example strategy map is defined as the basis from which to derive a compliance balanced scorecard. This definition answers the question of how to proceed in designing a compliance reporting tool.

A Review of Survey Methodology Employed in IT Outsourcing

The purpose of this paper is to provide an overview of methodological aspects of information technology outsourcing (ITO) surveys, in an attempt to improve data quality and reporting in survey research. It is based on a review of thirty articles on ITO surveys and focuses on two commonly explored dimensions of ITO, namely what is outsourced and why ITO is undertaken. This study highlights weaknesses in ITO surveys, including the lack of a clear definition of the population, lack of information regarding the sampling method used, failure to cite the response rate, no information pertaining to pilot testing of the survey instrument, and absence of information on internal validity in the use or reporting of surveys. This study represents an attempt, with a limited scope, to point to shortfalls in the use of survey methodology in ITO, and thus to raise awareness among researchers in order to enhance the reliability of survey findings.

Novel Use of a Quality Assurance Tool for Integrating Technology to HSE

The product development process (PDP) in the Technology group plays a very important role in the launch of any product. While a manufacturing process encourages the use of certain measures to reduce health, safety and environmental (HSE) risks on the shop floor, the PDP concentrates on the use of Geometric Dimensioning and Tolerancing (GD&T) to develop a flawless design. Furthermore, the PDP distributes and coordinates activities between different departments such as marketing, purchasing, and manufacturing. However, it is seldom realized that the PDP makes a significant contribution to developing a product that reduces HSE risks by encouraging the Technology group to use effective GD&T. GD&T is a precise communication tool that uses a set of symbols, rules, and definitions to mathematically define parts to be manufactured. It is a quality assurance method widely used in the oil and gas sector. Traditionally it is used to ensure the interchangeability of a part without affecting its form, fit, and function. Parts that do not meet these requirements are rejected during quality audits. This paper discusses how the Technology group integrates this quality assurance tool into the PDP and how the tool plays a major role in helping the HSE department in its goal of eliminating HSE incidents. The PDP involves a thorough risk assessment and establishes a method to address those risks during the design stage. An illustration shows how GD&T helped reduce safety risks by ergonomically improving assembly operations. A brief discussion explains how tolerances provided on a part help prevent finger injury. This tool has equipped the Technology group to produce fixtures which are used daily in operations as well as manufacturing. By applying GD&T to create good fits, HSE risks are mitigated for operating personnel. Both customers and service providers benefit from reduced safety risks.

Coupled Electromagnetic and Thermal Field Modeling of a Laboratory Busbar System

The paper presents a coupled electromagnetic and thermal field analysis of a busbar system (of rectangular cross-section geometry) subjected to short-circuit conditions. The laboratory model was validated against both an analytical solution and experimental observations. The considered problem required the computation of the detailed distribution of the power losses and the heat transfer modes. In this electromagnetic and thermal analysis, different definitions of electric busbar heating were considered and compared. The busbar system is a three-phase one and consists of aluminum, painted aluminum and copper busbars. The solution to the coupled field problem is obtained using the finite element method and the QuickField™ program. Experiments have been carried out using two different approaches and compared with the computed results.
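
A typical statement of such a coupled problem, given here only for orientation (the paper's own definitions of busbar heating may differ), links the magnetoquasistatic field equation to heat conduction through the Joule loss density:

\[
\nabla \times \left( \frac{1}{\mu}\, \nabla \times \mathbf{A} \right) + \sigma \frac{\partial \mathbf{A}}{\partial t} = \mathbf{J}_s,
\qquad
q = \frac{J_{\mathrm{rms}}^{2}}{\sigma},
\qquad
\rho c_p \frac{\partial T}{\partial t} = \nabla \cdot \left( \lambda \nabla T \right) + q,
\]

where A is the magnetic vector potential, J_s the source current density, q the volumetric Joule loss that feeds the thermal problem, and λ, ρ, c_p the thermal conductivity, mass density and specific heat of the busbar material.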

The Riemann Barycenter Computation and Means of Several Matrices

An iterative definition of an arbitrary n-variable mean function is given in this article, which repeatedly uses the corresponding two-variable mean function. This extension method avoids recursion, which is an important improvement compared with certain recursive formulas given previously by Ando-Li-Mathias and Petz-Temesi. Furthermore, it is conjectured here that this iterative algorithm coincides with the solution of the Riemann centroid minimization problem. Simulations are presented to compare the convergence rates of the different algorithms given in the literature, namely the gradient and the Newton method for the Riemann centroid computation.
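
For context, the Riemann centroid (Karcher mean) minimization problem mentioned above and the two-variable geometric mean on which such iterations build are commonly stated as follows (standard definitions, given for reference only):

\[
G(A_1,\dots,A_n) = \arg\min_{X \succ 0} \sum_{i=1}^{n} \delta^{2}(X, A_i),
\qquad
\delta(X,Y) = \bigl\lVert \log\bigl(X^{-1/2} Y X^{-1/2}\bigr) \bigr\rVert_F,
\]
\[
A \,\#\, B = A^{1/2}\bigl(A^{-1/2} B A^{-1/2}\bigr)^{1/2} A^{1/2},
\]

where the A_i are positive definite matrices and δ is the Riemannian (trace metric) distance on the cone of positive definite matrices.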

Library Aware Power Conscious Realization of Complementary Boolean Functions

In this paper, we consider the problem of logic simplification for a special class of logic functions, namely complementary Boolean functions (CBF), targeting low-power implementation using the static CMOS logic style. The functions are uniquely characterized by the presence of terms where, for a canonical binary 2-tuple, D(mj) ∪ D(mk) = { } and therefore | D(mj) ∪ D(mk) | = 0 [19]. Similarly, D(Mj) ∪ D(Mk) = { } and hence | D(Mj) ∪ D(Mk) | = 0. Here, 'mk' and 'Mk' represent a minterm and a maxterm, respectively. We compare the circuits minimized with our proposed method with those corresponding to the factored Reed-Muller (f-RM) form, the factored Pseudo Kronecker Reed-Muller (f-PKRM) form, and the factored Generalized Reed-Muller (f-GRM) form. We have opted for algebraic factorization of the Reed-Muller (RM) form and its different variants, using the factorization rules of [1], as it is simple and requires much less CPU execution time than Boolean factorization operations. This technique has enabled us to greatly reduce the literal count as well as the gate count needed for such RM realizations, which are generally prone to consuming more cells and hence more power. However, this leads to a drawback in terms of the design-for-test attribute associated with the various RM forms. Though we still preserve the definition of those forms, viz. realizing such functionality with only select types of logic gates (AND and XOR gates), the structural integrity of the logic levels is not preserved. This consequently alters the testability properties of such circuits, i.e. it may increase, decrease or maintain the number of test input vectors needed for their exhaustive testing, subsequently affecting their generalized test vector computation. We do not consider the issue of design-for-testability here but instead focus on the power consumption of the final logic implementation, after realization with a conventional CMOS process technology (0.35 micron TSMC process). The quality of the resulting circuits, evaluated on the basis of an established cost metric, viz. power consumption, demonstrates average savings of 26.79% for the samples considered in this work, besides reductions in the number of gates and input literals of 39.66% and 12.98%, respectively, in comparison with the other factored RM forms.
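
As background on the Reed-Muller (RM) forms referenced above, the positive-polarity RM (XOR-of-ANDs) expansion of a Boolean function can be obtained from its truth table with the binary Möbius transform; the sketch below is purely illustrative and is unrelated to the authors' factorization flow or to the CBF characterization itself:

```python
def anf_coefficients(truth_table):
    """Positive-polarity Reed-Muller (ANF) coefficients via the binary Moebius transform.

    truth_table: list of 0/1 values of length 2**n, indexed by the input assignment.
    Returns a list c where c[m] = 1 iff the product of the variables selected by mask m
    appears in the XOR-of-ANDs expansion of the function.
    """
    c = list(truth_table)
    n = len(c).bit_length() - 1
    for i in range(n):
        step = 1 << i
        for m in range(len(c)):
            if m & step:
                c[m] ^= c[m ^ step]   # butterfly: fold in the value of the subcube below
    return c

# Example: f(x1, x0) = x1 OR x0 has the expansion x0 XOR x1 XOR x0.x1.
print(anf_coefficients([0, 1, 1, 1]))   # -> [0, 1, 1, 1]
```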

Earth Potential Rise (EPR) Computation for a Fault on a Transmission Mains Pole

The introduction of new High Voltage (HV) transmission mains into the community necessitates an earthing design that ensures the safety compliance of the system. Conductive structures such as steel or concrete poles are widely used in HV transmission mains. The earth potential rise (EPR) generated by a fault on these structures could result in an unsafe condition. This paper discusses the input impedance of the overhead earth wire (OHEW) system for finite and infinite transmission mains. The definitions of finite and infinite systems are discussed, as well as the maximum EPR due to a pole fault. Simplified equations for EPR assessment are introduced and discussed for the finite and infinite conditions. A case study is also presented.
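
For orientation, the basic relationship underlying such assessments (a standard expression, not the authors' specific simplified equations) is:

\[
\mathrm{EPR} = I_e \, Z_e,
\]

where I_e is the portion of the fault current discharged into the soil through the faulted pole's earthing system (the remainder returning via the OHEW and the source) and Z_e is the earthing impedance seen at that pole.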

3D Definition for Human Smiles

The study explored various types of human smiles and extracted most of the key factors affecting them. These key factors were then converted into a set of control points which can serve the needs of 3D animators in creating facial expressions and be further applied to face simulation for robots in the future. First, hundreds of human smile pictures were collected and analyzed to identify the key factors of the facial expression. Then, the factors were converted into a set of control points, and the sizing parameters were calculated proportionally. Finally, two different faces were constructed to validate the parameters by simulating smiles of the same type as the original ones.

A Study on the Effects of Thermodynamic Nonideality and Mass Transfer on Multi-phase Hydrodynamics Using CFD Methods

Considering the non-ideal behavior of fluids and its effects on hydrodynamics and mass transfer in multiphase flow is essential. Simulations taking into account the effects of mass transfer and mixture non-ideality on hydrodynamics were reported by Irani et al. In this paper, by assuming the density of the phases to be constant and using Raoult's law instead of an equation of state (EOS) and the fugacity coefficient definition, respectively, for both the liquid and gas phases, the importance of non-ideality effects on mass transfer and hydrodynamic behavior was studied. The results for an octane/propane system (T = 323 K, P = 445 kPa) also indicated that the constant-density assumption played a major role in the divergence of the simulation from the experimental data. Furthermore, comparison of the obtained results with the previous report indicated significant differences between the experimental data and the simulation results obtained under the more idealized assumptions.
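
For reference, the ideal and non-ideal vapor-liquid equilibrium descriptions contrasted in the study are commonly written as (standard relations, stated only for orientation):

\[
y_i P = x_i P_i^{\mathrm{sat}} \quad \text{(Raoult's law, ideal)},
\qquad
y_i\, \hat{\phi}_i^{V} P = x_i\, \hat{\phi}_i^{L} P \quad \text{(fugacity-coefficient form, non-ideal)},
\]

where x_i and y_i are the liquid and vapor mole fractions, P_i^sat is the pure-component vapor pressure, and \hat{\phi}_i^{L}, \hat{\phi}_i^{V} are the fugacity coefficients of component i in the liquid and vapor mixtures obtained from an equation of state.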