A Comparative Study of Electrical Transport Phenomena in Ultrathin vs. Nanoscale SOI MOSFET Devices

Ultrathin (UTD) and Nanoscale (NSD) SOI-MOSFET devices, sharing a similar W/L ratio but with channel thicknesses of 46 nm and 1.6 nm respectively, were fabricated using a selective “gate recessed” process on the same silicon wafer. Electrical transport characterization at room temperature revealed a large difference between the two kinds of devices, which is interpreted in terms of an unexpectedly large series resistance. The electrical characteristics of the nanoscale device, taken in the linear region, can be derived analytically from those of the ultrathin device. A comparison of the structure and composition of the layers, using advanced techniques such as Focused Ion Beam (FIB) and High Resolution TEM (HRTEM) coupled with Energy Dispersive X-ray Spectroscopy (EDS), helps explain the difference in transport between the devices.
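
For context, a minimal sketch of how an extrinsic series resistance R_S enters the linear-region characteristics (standard long-channel expressions, not the authors' extraction procedure; mu_eff, C_ox and V_T denote the effective mobility, gate-oxide capacitance per unit area and threshold voltage):

```latex
% Linear-region drain current with an extrinsic series resistance R_S in the current path:
I_D \;\approx\; \frac{V_{DS}}{R_{ch} + R_S},
\qquad
R_{ch} \;=\; \frac{L}{\mu_{\mathrm{eff}}\, C_{ox}\, W \left(V_{GS} - V_T\right)} .
```

When R_S greatly exceeds the intrinsic channel resistance R_ch, as suggested above for the nanoscale devices, the measured characteristics are dominated by R_S, which is consistent with deriving them analytically from those of the thicker device.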

Heuristic Set-Covering-Based Postprocessing for Improving the Quine-McCluskey Method

Finding minimal forms of logical functions has important applications in the design of logic circuits. This task is solved by many different methods, but they are frequently unsuitable for computer implementation. We briefly summarise the well-known Quine-McCluskey method, which follows a well-defined computational procedure and can therefore be implemented easily, but which does not guarantee an optimal solution even for simple examples. Since the Petrick extension of the Quine-McCluskey method does not provide a generally usable way of finding an optimum for logical functions with a large number of values, we focus on the interpretation of the result of the Quine-McCluskey method and show that it represents a set covering problem, which is unfortunately an NP-hard combinatorial problem. It must therefore be solved by heuristic or approximation methods. We propose an approach based on genetic algorithms and show suitable parameter settings.
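
A minimal sketch of a genetic-algorithm heuristic for the resulting set covering problem (not the authors' implementation; the population size, operators, penalty weight and the toy instance below are placeholder choices):

```python
# Genetic-algorithm heuristic for set covering: choose a cheap subset of prime
# implicants whose union covers all minterms of the function.
import random

def ga_set_cover(minterms, implicants, pop_size=60, generations=200,
                 mutation_rate=0.02, seed=0):
    """minterms: set of minterm ids; implicants: list of sets of minterm ids."""
    rng = random.Random(seed)
    n = len(implicants)

    def fitness(chromosome):
        covered = set()
        for gene, imp in zip(chromosome, implicants):
            if gene:
                covered |= imp
        uncovered = len(minterms - covered)
        # Penalise uncovered minterms much more heavily than extra implicants.
        return sum(chromosome) + 10 * n * uncovered

    population = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)
        next_gen = population[:2]                                 # elitism
        while len(next_gen) < pop_size:
            p1, p2 = rng.sample(population[:pop_size // 2], 2)    # truncation selection
            cut = rng.randrange(1, n)                             # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if rng.random() < mutation_rate else g for g in child]
            next_gen.append(child)
        population = next_gen
    best = min(population, key=fitness)
    return [i for i, gene in enumerate(best) if gene]

# Tiny example: 4 minterms covered by 4 candidate prime implicants.
print(ga_set_cover({0, 1, 2, 3}, [{0, 1}, {1, 2}, {2, 3}, {0, 3}]))
```

On this toy instance the heuristic settles on two implicants (e.g. indices 0 and 2), which is an optimal cover.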

Business Rules for Data Warehouse

Business rules and data warehouses are concepts and technologies that impact a wide variety of organizational tasks. In general, each area has evolved independently, impacting application development and decision-making. Generating knowledge from a data warehouse is a complex process. This paper outlines an approach to ease the import of information and knowledge from a data warehouse star schema through an inference class of business rules. The paper utilizes the Oracle database to illustrate how the concepts work. The star schema structure and the business rules are stored within a relational database. The approach is explained through a prototype in Oracle's PL/SQL Server Pages.

Proposed Developments of Elliptic Curve Digital Signature Algorithm

The Elliptic Curve Digital Signature Algorithm (ECDSA) is the elliptic curve analogue of DSA: a digital signature scheme designed to produce a signature based on a secret number known only to the signer and on the actual message being signed. Such digital signatures are considered the digital counterparts of handwritten signatures and are the basis for validating the authenticity of a connection. The security of these schemes rests on the infeasibility of computing the signature without the private key. In this paper we propose a development of the original ECDSA with increased complexity.
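
For reference, a sketch of the standard ECDSA signing and verification equations that any such development starts from (textbook ECDSA, not the proposed modification; G is a base point of prime order n, d the private key, Q = dG the public key, z the message hash, and k a fresh per-signature random value):

```latex
% Signature generation: the signature is the pair (r, s).
r = \bigl(k\,G\bigr)_x \bmod n, \qquad
s = k^{-1}\,\bigl(z + r\,d\bigr) \bmod n .

% Verification with the public key Q = dG:
u_1 = z\,s^{-1} \bmod n, \qquad u_2 = r\,s^{-1} \bmod n, \qquad
\text{accept iff } \bigl(u_1 G + u_2 Q\bigr)_x \bmod n = r .
```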

Aquatic Modeling: An Interplay between Scales

This paper presents an integrated knowledge-based approach to multi-scale modeling of aquatic systems, with a view to enhancing predictive power and aiding environmental management and policy-making. The basic phases of this approach are exemplified in the case of a bay in the Saronicos Gulf (Attiki, Greece). The results showed a significant problem with rising phytoplankton blooms linked to excessive microbial growth, arising mostly from increased nitrogen inflows; therefore, the nitrification/denitrification processes of the benthic and water-column sub-systems provided the quality variables to be monitored for assessing environmental status. It is thereby demonstrated that the proposed approach facilitates modeling choices and implementation decisions, while providing substantial support for the capitalization of knowledge and experience in long-term water management.

A Quantum Algorithm for Constructing an Image Histogram

The histogram plays an important statistical role in digital image processing. However, existing quantum image models are ill-suited to this kind of statistical image processing because different gray levels cannot be distinguished. In this paper, a novel quantum image representation model is first proposed in which pixels with different gray levels can be distinguished and operated on simultaneously. Based on the new model, a fast quantum algorithm for constructing the histogram of a quantum image is designed. A performance comparison reveals that the new quantum algorithm achieves an approximately quadratic speedup over its classical counterpart. The proposed quantum model and algorithm are significant for future research on quantum image processing.

A Utilitarian Approach to Modeling Information Flows in Social Networks

We propose a multi-agent based utilitarian approach to model and understand information flows in social networks that lead to Pareto optimal informational exchanges. We model the individual expected utility function of the agents to reflect the net value of information received. We show how this model, adapted from a theorem by Karl Borch dealing with the actuarial risk exchange concept in the insurance industry, can be used for social network analysis. We develop a utilitarian framework that allows us to interpret Pareto optimal exchanges of value as potential information flows, while achieving a maximization of the sum of expected utilities of information of the group of agents. We examine some interesting conditions on the utility function under which the flows are optimal. We illustrate with a synthetic example the promise of this new approach for attaching economic value to information in networks.
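
For context, a sketch of the Pareto-optimality condition from Borch's risk-exchange theorem that the utilitarian framework adapts (stated in its standard actuarial form under the usual assumptions of strictly concave, differentiable utilities u_i; the notation is ours, not the paper's): an allocation (x_1, ..., x_m) of the aggregate endowment across the m agents is Pareto optimal exactly when there exist positive weights k_i such that

```latex
k_i\, u_i'\!\bigl(x_i(\omega)\bigr) \;=\; k_j\, u_j'\!\bigl(x_j(\omega)\bigr)
\qquad \text{for all agents } i, j \text{ and almost every state } \omega ,
```

i.e. weighted marginal utilities are equalized across agents state by state; this is also the first-order condition for maximizing the weighted sum of expected utilities \sum_i k_i E[u_i(x_i)], the kind of group-level maximization on which the information-flow interpretation rests.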

The Link between Financial and Overall Corporate Strategies

Company strategy expresses a basic idea of how to reach company objectives. A whole range of models of strategic management is used in practice. The concept of strategic management should fulfill some basic requirements to make it applicable both to typical and to more specific company environments. The financial strategy plays an important role in corporate strategy. The paper develops a methodology for implementing a strategic model in micro, small and medium-sized enterprises (SMEs). Furthermore, the methodology recommends procedures for the current worldwide task of defining a company strategy and its financial strategy.

Clinical and Methodological Issues in the Research on the Rape Myth

The purpose of this study is to revisit the concept of rape as represented by professionals in the literature, as well as its perception (beliefs and attitudes) in the population at large, and to propose methodological improvements to its measurement tool. Rape is a serious crime threatening its victim's physical and mental health and integrity, and as such is legally prosecuted in all modern societies. The problem is not in accepting or rejecting rape as a criminal act, but rather in the vagueness of its interpretations and “justifications” maintained in the mentality of modern societies, known in the literature as the phenomenon of the “rape myth”. The rape myth can be studied from different perspectives: criminology, sociology, ethics, medicine and psychology. Its investigation requires rigorous scientific objectivity, free of passion (victims of rape are at risk of emotional bias), free of activism (social activists, even if well-intentioned, are also biased), and free of any pre-emptive assumptions or prejudices. To apply a rigorous scientific procedure, we need a solid, valid and reliable measurement. Rape is a form of heterosexual or homosexual aggression, violently forcing the victim to give in to the sexual activity of the aggressor against her/his will. Human beings always try to “understand” or find a reason justifying their acts. The psychological literature provides multiple clinical and experimental examples of this; just to mention the famous studies by Milgram on the level of electroshock delivered by the “teacher” to the “learner” if “scientifically justifiable”, or the studies on the behavior of “prisoners” and “guards”, and many other experiments and field observations. Sigmund Freud described the phenomenon of unconscious justification and called it rationalization. The multiple justifications, rationalizations and repeated opinions about sexual behavior contribute to a myth maintained in society. What kind of “rationale” do our societies apply to “understand” non-consensual sexual behavior? There are many; just to mention a few:
• Sex is a ludic activity for both participants; therefore, even if not consented to, it should bring pleasure to both.
• Everybody wants sex, but only men are allowed to manifest it openly while women have to pretend the opposite; thus men have to initiate sexual behavior and women follow.
• A person who strongly needs sex is free to manifest it and struggle to get it; the person who does not want it must not reveal her/his sexual attraction and must avoid risky situations; otherwise she/he is perceived as a promiscuous seducer.
• A person who does not fight against the sexual initiator unconsciously accepts the rape (does this explain why homosexual rapes are reported less frequently than rapes against women?).
• Women who are raped deserve it because their wardrobe is very revealing and seductive and they “willingly” go to highly risky places (alleys, dark roads, etc.).
• Men need to ventilate their sexual energy, and if they are deprived of a partner, their urge to have sex is difficult to control.
• Men are supposed to initiate and insist, even by force, to have sex (their testosterone makes them both sexual and aggressive).
The paper overviews numerous cultural beliefs about masculine versus feminine behavior and their impact on the “rape myth”.

Effect of Sensory Manipulations on Human Joint Stiffness Strategy and Its Adaptation for Human Dynamic Stability

Sensory input plays an important role in the human posture control system, which initiates strategies to counteract any unbalanced condition and thus prevent falls. In a previous study, joint stiffness was observed to describe certain aspects of movement performance, but the correlation between balance ability and joint stiffness remains unknown. In this study, joint stiffening strategies at the ankle and hip were observed under different sensory manipulations, and their correlation with a conventional clinical test of balance ability (the Functional Reach Test) was investigated. In order to create unstable conditions, two different surface perturbations (tilt up-tilt down (TT) and forward-backward (FB)) at four different frequencies (0.2, 0.4, 0.6 and 0.8 Hz) were introduced. Furthermore, four different sensory manipulation conditions (involving the visual and vestibular systems) were applied, and the subjects were asked to maintain their position as well as possible. The results suggest that joint stiffness is high during difficult balance situations. Subjects with poorer balance generated higher average joint stiffness than well-balanced subjects. In addition, the adaptation of the posture control system under repetitive external perturbation appears to be reduced under sensory-limited conditions. Overall, analysis of the joint stiffening response may make it possible to predict unbalanced situations faced by humans.

The Grey Relational Analysis of the Influence Factors of Profit in Cartoon Character Merchandising Rights

Based on literature research, case studies and investigation, this paper constructs a four-factor theoretical model for Chinese small and medium enterprises comprising cartoon characters' reputation, enterprise marketing and management capabilities, protection of the cartoon image, and the institutional environment. Using real-time grey relational analysis, the empirical study shows that the greatest impact on current merchandising rights income comes from the friendliness of the institutional environment, followed by marketing and management capabilities, the input into character image protection, and the cartoon characters' reputation. Using time-delay grey relational analysis, the greatest impact on post-merchandising rights profit comes from the cartoon characters' reputation, followed by the friendliness of the institutional environment, then marketing and management ability and the input into character image protection.
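
A minimal sketch of the grey relational grade computation underlying such an analysis (standard Deng-style grey relational analysis with distinguishing coefficient rho = 0.5; the profit series, factor series and the factor labels in the comments are placeholders, not the paper's data):

```python
# Standard (Deng) grey relational analysis: relate comparison sequences
# (influence factors) to a reference sequence (profit) over several periods.
import numpy as np

def grey_relational_grades(reference, factors, rho=0.5):
    """reference: shape (T,); factors: shape (k, T). Returns one grade per factor."""
    def normalise(x):
        # Rescale each sequence to [0, 1] so scales are comparable.
        return (x - x.min()) / (x.max() - x.min())

    ref = normalise(reference)
    cmp_ = np.array([normalise(f) for f in factors])
    delta = np.abs(cmp_ - ref)                           # absolute deviations
    d_min, d_max = delta.min(), delta.max()
    xi = (d_min + rho * d_max) / (delta + rho * d_max)   # grey relational coefficients
    return xi.mean(axis=1)                               # grade = mean coefficient per factor

# Placeholder data: profit over 6 periods and 4 candidate influence factors.
profit = np.array([3.1, 3.8, 4.0, 4.9, 5.6, 6.2])
factors = np.array([
    [0.8, 1.1, 1.3, 1.8, 2.2, 2.5],   # e.g. institutional environment index
    [1.2, 1.5, 1.4, 2.0, 2.4, 2.9],   # e.g. marketing/management capability
    [0.5, 0.6, 0.9, 1.0, 1.4, 1.5],   # e.g. image-protection input
    [2.0, 2.1, 2.6, 2.7, 3.3, 3.4],   # e.g. character reputation
])
print(grey_relational_grades(profit, factors))
```

The factor with the largest grade is the one whose trajectory tracks profit most closely, which is how the ranking of influence factors above is obtained.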

Requirements Engineering for Enterprise Applications Development: Seven Challenges in Higher Education Environment

This paper describes the challenges of requirements engineering for developing enterprise applications in a higher education environment. The development activities include software implementation, maintenance, enhancement, and support for both online transaction processing and overnight batch processing. Generally, an enterprise application for a higher education environment may include a Student Information System (SIS), an HR/payroll system, financial systems, etc. In this context, there are many challenges in the requirements engineering phase when providing two distinct services: production processing support and systems development.

A Business Intelligence System Design Based on ASP Platform

The information infrastructures of small and medium-sized manufacturing enterprises are relatively poor; there are serious shortages of capital that can be invested in informatization, of computer hardware and software resources, and of human resources. To address the informatization issue in small and medium-sized manufacturing enterprises, and to enable them to apply advanced management thinking and enhance their competitiveness, this paper establishes a manufacturing-oriented informatization platform for small and medium-sized enterprises based on ASP business intelligence technology, which effectively improves the scientific basis of enterprise decisions and management informatization.

Space-Time Variation in Rainfall and Runoff: Upper Betwa Catchment

Among all geo-hydrological relationships, the rainfall-runoff relationship is of utmost importance in any hydrological investigation and in water resource planning. Spatial variation and the lag time involved in obtaining areal estimates for the basin as a whole can affect parameterization at both the design and the planning stages. In conventional hydrological processing of data, the spatial aspect is either ignored or interpolated at the sub-basin level. Temporal variation, when analysed for different stages, can provide clues to its spatial effectiveness. The interplay of space-time variation at the pixel level can provide a better understanding of basin parameters. The sustenance of design structures for different return periods and their spatial auto-correlations should be studied at different geographical scales for better management and planning of water resources. In order to understand the relative effect of spatio-temporal variation in a hydrological data network, a detailed geo-hydrological analysis of the Betwa river catchment, falling in the Lower Yamuna Basin, is presented in this paper. Moreover, exact estimates of the availability of water in the Betwa river catchment, especially in the wake of the recent Betwa-Ken linkage project, need thorough scientific investigation for better planning. Therefore, an attempt in this direction is made here to analyse the existing hydrological and meteorological data with the help of SPSS, GIS and MS-EXCEL software. A comparison of spatial and temporal correlations at the sub-catchment level in the case of the upper Betwa reaches has been made to demonstrate the representativeness of rain gauges. First, flows at different locations are used to derive correlation and regression coefficients. Then, long-term normal water yield estimates based on pixel-wise regression coefficients of the rainfall-runoff relationship are mapped. The areal values obtained from these maps can definitely improve upon estimates based on point-based extrapolations or areal interpolations.
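
A minimal sketch of the kind of pixel-wise rainfall-runoff regression that such mapping relies on (plain per-pixel ordinary least squares; the array shapes, units and synthetic values are placeholders, not the catchment data):

```python
# Fit runoff = a + b * rainfall independently at every pixel of a gridded record.
import numpy as np

def pixelwise_regression(rainfall, runoff):
    """rainfall, runoff: arrays of shape (years, rows, cols).
    Returns intercept a and slope b, each of shape (rows, cols)."""
    x_mean = rainfall.mean(axis=0)
    y_mean = runoff.mean(axis=0)
    cov = ((rainfall - x_mean) * (runoff - y_mean)).sum(axis=0)
    var = ((rainfall - x_mean) ** 2).sum(axis=0)
    slope = cov / var
    intercept = y_mean - slope * x_mean
    return intercept, slope

# Placeholder: 30 years of annual rainfall/runoff on a 4 x 5 pixel grid (mm/year).
rng = np.random.default_rng(1)
rain = rng.uniform(600.0, 1400.0, size=(30, 4, 5))
flow = 0.4 * rain - 150.0 + rng.normal(0.0, 40.0, (30, 4, 5))
a, b = pixelwise_regression(rain, flow)
print(b.round(2))   # recovered slope map, close to the true 0.4 everywhere
```

The per-pixel coefficient maps (a, b) are what a long-term normal rainfall field is then multiplied through to obtain a water-yield map.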

An Agent-based Model for Analyzing Interaction of Two Stable Social Networks

In this research, the authors analyze network stability using agent-based simulation. First, the authors focus on analyzing large networks (eight agents) formed by connecting two different stable small social networks (each small stable network consists of four agents). Second, the authors analyze the shape of the network (eight agents) obtained by adding one agent to a stable network of seven agents. Third, the authors analyze interpersonal comparison of utility. A “star network” was not found as a result of the interaction between the two stable small networks; on the other hand, “decentralized networks” were formed from several combinations. In the case of adding one agent to a stable seven-agent network, the larger the value of “c” (the maintenance cost per link), the larger the number of patterns of stable networks. In this case, the authors identify the characteristics of a large stable network. The authors also discovered cases in which personal utility decreased while total utility increased.
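
As an illustration of the kind of utility calculation such simulations rest on, here is a minimal sketch of a Jackson-Wolinsky-style connections model, in which an agent gains a decayed benefit delta**d from every agent at network distance d and pays a maintenance cost c per direct link; the exact utility function of the paper may differ, and delta, c and the example network below are placeholder assumptions:

```python
# Connections-model utility on an undirected network given as an adjacency dict.
from collections import deque

def distances_from(adj, source):
    """BFS shortest-path lengths from source; adj maps agent -> set of neighbours."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def utility(adj, agent, delta=0.7, c=0.3):
    """Payoff: sum of delta**d over reachable agents, minus c per direct link."""
    dist = distances_from(adj, agent)
    benefit = sum(delta ** d for other, d in dist.items() if other != agent)
    return benefit - c * len(adj[agent])

# Placeholder 8-agent network: two 4-agent rings joined by a single bridge link.
adj = {i: set() for i in range(8)}
def link(a, b):
    adj[a].add(b); adj[b].add(a)
for a, b in [(0, 1), (1, 2), (2, 3), (3, 0),      # first ring
             (4, 5), (5, 6), (6, 7), (7, 4),      # second ring
             (0, 4)]:                             # bridge between the rings
    link(a, b)

total = sum(utility(adj, i) for i in adj)
print([round(utility(adj, i), 3) for i in adj], round(total, 3))
```

Comparing individual utilities with the group total in this way is exactly what makes it possible to observe cases where one agent's utility falls even as total utility rises.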

Performance Comparison of Particle Swarm Optimization with Traditional Clustering Algorithms used in Self-Organizing Map

The self-organizing map (SOM) is a well-known data reduction technique used in data mining. It can reveal structure in data sets through data visualization that is otherwise hard to detect from raw data alone. However, interpretation through visual inspection is prone to errors and can be very tedious. There are several techniques for the automatic detection of clusters of code vectors found by SOM, but they generally do not take into account the distribution of code vectors; this may lead to unsatisfactory clustering and poor definition of cluster boundaries, particularly where the density of data points is low. In this paper, we propose the use of an adaptive heuristic particle swarm optimization (PSO) algorithm for finding cluster boundaries directly from the code vectors obtained from SOM. The application of our method to several standard data sets demonstrates its feasibility. The PSO algorithm utilizes the so-called U-matrix of the SOM to determine cluster boundaries; the results of this novel automatic method compare very favorably with boundary detection through traditional algorithms, namely k-means and a hierarchical approach, which are normally used to interpret the output of SOM.
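
A minimal sketch of the generic global-best particle swarm update that such an approach builds on (standard inertia-weight PSO; the sphere fitness function, bounds and parameter values below are placeholders, and the paper's adaptive heuristic and U-matrix-based fitness are not shown):

```python
# Standard global-best PSO minimising a fitness function over a box in R^dim.
import random

def pso(fitness, dim, bounds=(-5.0, 5.0), n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal best positions
    pbest_val = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # global best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = fitness(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Placeholder fitness: sphere function; a U-matrix-based fitness would go here.
print(pso(lambda x: sum(v * v for v in x), dim=2))
```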

Solving One-dimensional Hyperbolic Telegraph Equation Using Cubic B-spline Quasi-interpolation

In this paper, the telegraph equation is solved numerically by cubic B-spline quasi-interpolation. We obtain the numerical scheme by using the derivative of the quasi-interpolant to approximate the spatial derivative of the dependent variable and a low-order forward difference to approximate its temporal derivative. The advantage of the resulting scheme is that the algorithm is very simple, so it is very easy to implement. The results of numerical experiments are presented and compared with analytical solutions by calculating the L2 and L∞ error norms to confirm the good accuracy of the presented scheme.
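
For reference, the one-dimensional second-order hyperbolic telegraph equation in the form commonly used in such studies (constants alpha, beta >= 0 and a source term f; this is a standard statement of the problem, and the specific initial and boundary conditions of the experiments are not reproduced here):

```latex
\frac{\partial^2 u}{\partial t^2}
+ 2\alpha\,\frac{\partial u}{\partial t}
+ \beta^2 u
= \frac{\partial^2 u}{\partial x^2} + f(x,t),
\qquad a \le x \le b,\; t \ge 0 .
```

In the scheme described above, the cubic B-spline quasi-interpolant supplies the spatial derivative term, while the time derivatives are advanced with the low-order forward difference.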

Matrix-Based Synthesis of EXOR-Dominated Combinational Logic for Low Power

This paper discusses a new, systematic approach to the synthesis of an NP-hard class of non-regenerative Boolean networks, described by FON[FOFF]={mi}[{Mi}], where for every mj[Mj]∈{mi}[{Mi}] there exists another mk[Mk]∈{mi}[{Mi}] such that their Hamming distance HD(mj, mk)=HD(Mj, Mk)=O(n) (where 'n' represents the number of distinct primary inputs). The method automatically ensures exact minimization for certain important self-dual functions with 2^(n-1) points in their one-set. The elements meant for grouping are determined from a newly proposed weighted incidence matrix. Then the binary value corresponding to the candidate pair is correlated with the proposed binary value matrix to enable direct synthesis. We recommend algebraic factorization operations as a post-processing step to enable a reduction in literal count. The algorithm can be implemented in any high-level language and achieves the best cost optimization for the problem dealt with, irrespective of the number of inputs. For other cases, the method is iterated to subsequently reduce the problem to one of O(n-1), O(n-2), ... and then solved. In addition, it leads to optimal results for problems exhibiting a higher degree of adjacency, with a different interpretation of the heuristic, and the results are comparable with other methods. In terms of literal cost, at the technology-independent stage, the circuits synthesized using our algorithm enabled net savings over AOI (AND-OR-Invert) logic, AND-EXOR logic (EXOR Sum-of-Products or ESOP forms) and AND-OR-EXOR logic by 45.57%, 41.78% and 41.78%, respectively, for the various problems. Circuit-level simulations were performed for a wide variety of case studies at 3.3 V and 2.5 V supply to validate the performance of the proposed method and the quality of the resulting synthesized circuits at two different voltage corners. Power estimation was carried out for a 0.35-micron TSMC CMOS process technology. In comparison with AOI logic, the proposed method enabled mean savings in power of 42.46%. With respect to AND-EXOR logic, the proposed method yielded power savings of 31.88%, while in comparison with AND-OR-EXOR networks, average power savings of 33.23% were obtained.
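
To make the pairing criterion concrete, a small sketch (illustrative only, not the proposed weighted-incidence-matrix procedure) that, for each minterm in a function's one-set, finds its maximally Hamming-distant partner within the one-set; the example one-set {1, 2, 4, 7} is the 3-input odd-parity (EXOR) function, which is self-dual:

```python
# For each minterm in the one-set, find the other one-set minterm at maximal
# Hamming distance (the large-distance pairing criterion used for grouping).
def hamming(a, b):
    return bin(a ^ b).count("1")

def farthest_partners(one_set):
    one_set = sorted(one_set)
    pairs = {}
    for m in one_set:
        partner = max((x for x in one_set if x != m), key=lambda x: hamming(m, x))
        pairs[m] = (partner, hamming(m, partner))
    return pairs

# Example one-set {1, 2, 4, 7} = {001, 010, 100, 111}: 3-input odd parity.
print(farthest_partners({1, 2, 4, 7}))
```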

A Nonlinear ODE System for the Unsteady Hydrodynamic Force – A New Approach

We propose a reduced-order model for the instantaneous hydrodynamic force on a cylinder. The model consists of a system of two ordinary differential equations (ODEs), which can be integrated in time to yield very accurate histories of the resultant force and its direction. In contrast to several existing models, the proposed model considers the actual (total) hydrodynamic force rather than its perpendicular or parallel projections (the lift and drag), and captures the complete force rather than the oscillatory part only. We study and describe the relationship between the model parameters, evaluated using results from numerical simulations, and the Reynolds number, so that the model can be used at any Reynolds number within the considered range of 100 to 500 to provide an accurate representation of the force without the need to perform time-consuming simulations and to solve the partial differential equations (PDEs) governing the flow field.

Fuzzy Fingerprint Vault using Multiple Polynomials

The fuzzy fingerprint vault is a recently developed cryptographic construct based on the polynomial reconstruction problem, designed to secure critical data with fingerprint data. However, previous approaches are not applicable to fingerprints having few minutiae, since they use a fixed polynomial degree without considering the number of fingerprint minutiae. To solve this problem, we use an adaptive polynomial degree that takes into account the number of minutiae extracted from each user. We also apply multiple polynomials to avoid the possible degradation of security of the simple solution (i.e., using a low-degree polynomial). Based on the experimental results, our method can make a possible attack 2^192 times more difficult than using a low-degree polynomial, while still verifying users having few minutiae.
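
A minimal sketch of the classical single-polynomial fuzzy vault construction (Juels-Sudan style) that such schemes build on; the field size, polynomial degree and minutiae values below are toy placeholders, and the multi-polynomial, adaptive-degree extension proposed in the paper is not shown:

```python
# Toy fuzzy fingerprint vault over a small prime field: the secret is encoded as
# polynomial coefficients, genuine minutiae are bound as points on the polynomial,
# and random chaff points off the polynomial hide them.
import random

P = 251  # small prime field (toy size; real vaults use much larger fields)

def poly_eval(coeffs, x):
    """Evaluate a polynomial (coefficients lowest degree first) at x, mod P."""
    return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P

def poly_mul_linear(poly, r):
    """Multiply a polynomial (lowest degree first) by (x - r), mod P."""
    out = [0] * (len(poly) + 1)
    for i, c in enumerate(poly):
        out[i] = (out[i] - r * c) % P
        out[i + 1] = (out[i + 1] + c) % P
    return out

def lock(secret_coeffs, minutiae, n_chaff=40, seed=0):
    """Bind the secret polynomial to genuine minutiae and hide them among chaff."""
    rng = random.Random(seed)
    vault = [(x, poly_eval(secret_coeffs, x)) for x in minutiae]
    used = set(minutiae)
    while len(vault) < len(minutiae) + n_chaff:
        x, y = rng.randrange(P), rng.randrange(P)
        if x not in used and y != poly_eval(secret_coeffs, x):
            vault.append((x, y))       # chaff point lying off the polynomial
            used.add(x)
    rng.shuffle(vault)
    return vault

def unlock(vault, query_minutiae, degree):
    """Recover the polynomial by Lagrange interpolation from degree+1 matching points."""
    q = set(query_minutiae)
    match = [(x, y) for (x, y) in vault if x in q]
    if len(match) < degree + 1:
        return None
    points = match[: degree + 1]
    coeffs = [0] * (degree + 1)
    for i, (xi, yi) in enumerate(points):
        basis, denom = [1], 1
        for j, (xj, _) in enumerate(points):
            if j != i:
                basis = poly_mul_linear(basis, xj)
                denom = (denom * (xi - xj)) % P
        scale = yi * pow(denom % P, -1, P) % P
        coeffs = [(c + scale * b) % P for c, b in zip(coeffs, basis)]
    return coeffs

# Toy demo: degree-3 secret polynomial, 8 genuine minutiae (encoded x-coordinates).
secret = [42, 7, 19, 3]
genuine = [12, 33, 57, 90, 101, 144, 180, 201]
vault = lock(secret, genuine)
print(unlock(vault, genuine[:5], degree=3))   # enough matches -> [42, 7, 19, 3]
print(unlock(vault, [5, 9, 200], degree=3))   # too few matches -> None
```

The degree parameter is the security/usability knob the abstract refers to: a higher degree needs more matching minutiae to unlock, which is why an adaptive degree and multiple polynomials are proposed for fingerprints with few minutiae.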