How Does Psychoanalysis Help in Reconstructing Political Thought? An Exercise in Interpretation

The significance of psychology in the study of politics is embedded in philosophical issues as well as behavioural pursuits. The former is often associated with Sigmund Freud and his followers, while the latter is inspired by the writings of Harold Lasswell. Political psychology, or psychopolitics, has left its own impression on political thought ever since it began deciphering the concepts of human nature and political propaganda. More importantly, psychoanalysis views political thought as textual content whose latent meaning must be explored from its manifest content. In other words, it reads the text symptomatically and interprets the hidden truth. This paper explains the paradigm of dream interpretation applied by Freud. The dream work is a process with four successive activities: condensation, displacement, representation and secondary revision. Texts dealing with political thought can also be interpreted on these principles. Freud's method of dream interpretation draws on the hermeneutic model of philological research. It provides a theoretical perspective and technical rules for the interpretation of symbolic structures. The task of interpretation remains the discovery of equivalences of symbols and actions through perpetual analogies. Psychoanalysis can help in studying political thought in two ways: to study textual distortion, where Freud's dream interpretation is used as a paradigm for exploring the latent text behind its manifest text; and to apply Freud's psychoanalytic concepts and theories, which range from the individual mind to civilization, religion, war and politics.

Managing User Expectations in Information Systems Development

This paper provides new ways of exploring the old problem of information systems development failure in organisations. Based on the theory of cognitive dissonance, information systems (IS) failure is defined as a gap between what users expect from an information system and how well these expectations are met by the perceived performance of the delivered system. Bridging the expectation-perception gap requires IS professionals to make a radical change from being proprietors of information systems and products to being service providers. In order to deliver systems and services that IS users perceive as valuable, IS people must become experts at determining and assessing users' expectations and perceptions. It is also suggested that the IS community, in general, has given relatively little attention to the front-end process of requirements specification for IS development. There is a simplistic belief that requirements can be obtained from users and then translated into a formal specification. The process of information needs analysis is problematic and worthy of investigation.

A Performance Evaluation of Cellular Network Suitability for VANET

Recently, vehicular ad-hoc networks (VANETs) for Intelligent Transport Systems (ITS) have become able to deliver safety and convenience services that go beyond simple services such as electronic toll collection. Providing these services requires a nationwide roadside infrastructure, which demands a huge investment of resources. For this reason, several studies have examined the use of existing cellular networks instead of deploying new protocols and infrastructure. This study assesses the suitability of cellular networks for VANET through a performance evaluation. The experimental results show that, among the cellular networks considered, LTE (Long Term Evolution) is the most suitable for VANET.

A Weighted-Profiling Using an Ontology Base for Semantic-Based Search

The amount of information on the Web is increasing tremendously. A number of search engines have been developed for searching Web information and retrieving relevant documents that satisfy inquirers' needs. Search engines nonetheless return many irrelevant documents among their results, since the search is text-based rather than semantic-based. The information retrieval research area has presented a number of approaches and methodologies, such as profiling, feedback, query modification, human-computer interaction, etc., for improving search results. Moreover, information retrieval has employed artificial intelligence techniques and strategies, such as machine learning, heuristics, tuning mechanisms, user and system vocabularies, logical theory, etc., for capturing users' preferences and using them to guide the search based on semantic rather than syntactic analysis. Although valuable improvements in search results have been recorded, surveys show that search engine users are still not really satisfied with their results. Using ontologies for semantic-based searching is likely to be the key solution. Adopting a profiling approach and using the characteristics of an ontology base, this work proposes a strategy for finding the exact meaning of query terms in order to retrieve information relevant to user needs. The evaluation of the conducted experiments shows the effectiveness of the suggested methodology, and a conclusion is presented.
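
The abstract does not spell out how the weighted profile interacts with the ontology base, so the following is a minimal sketch of one plausible realisation, not the authors' implementation: each query term is mapped to its candidate senses in a toy ontology, and each sense is scored against a weighted profile of concepts the user prefers. All names (ONTOLOGY, profile_weights, disambiguate) and weights are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' method): disambiguating query
# terms against a toy ontology base using a weighted user profile.

# Toy ontology base: term -> candidate senses, each linked to related concepts.
ONTOLOGY = {
    "java": {
        "java_language": {"programming", "software", "jvm"},
        "java_island":   {"indonesia", "travel", "geography"},
    },
    "python": {
        "python_language": {"programming", "software", "scripting"},
        "python_snake":    {"reptile", "zoo", "biology"},
    },
}

# Weighted profile derived from the user's past behaviour (weights assumed).
profile_weights = {"programming": 0.9, "software": 0.7, "travel": 0.1}

def sense_score(related_concepts, profile):
    """Sum of profile weights over the concepts a sense is related to."""
    return sum(profile.get(c, 0.0) for c in related_concepts)

def disambiguate(term, profile):
    """Pick the ontology sense of a term that best matches the profile."""
    senses = ONTOLOGY.get(term, {})
    if not senses:
        return term  # fall back to plain text search
    return max(senses, key=lambda s: sense_score(senses[s], profile))

query = ["java", "python"]
print([disambiguate(t, profile_weights) for t in query])
# -> ['java_language', 'python_language'] for this profile
```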

Neutronic Study of Two Reactor Cores Cooled with Light and Heavy Water Using a Computational Method

Most HWRs currently use natural uranium fuel. Using enriched uranium fuel results in a significant improvement in fuel cycle costs and uranium utilization. On the other hand, reactivity changes of HWRs over the full range of operating conditions, from cold shutdown to full power, are small. This reduces the required reactivity worth of control devices and minimizes local flux distribution perturbations, minimizing potential problems due to transient local overheating of fuel. Analyzing the effectiveness of heavy water on neutronic parameters such as enrichment requirements, peaking factor and reactivity is therefore important and should be treated as a primary concern in HWR core design. Two reactor cores, a CANDU-type core of 33 fuel assemblies and a hexagonal-type core of 19 assemblies with a pitch-to-diameter ratio of 1.04, have been simulated using the MCNP-4C code. Heavy water and light water as moderators have been compared with the aim of achieving lower reactivity insertion and enrichment requirements. Two fuel matrices, (232Th/235U)O2 and (238U/235U)O2, have been compared to achieve a more economical and safe design. Heavy water not only decreased enrichment needs, but also resulted in negative reactivity insertions during moderator density variations. Thorium oxide fuel assemblies of 2.3% enrichment loaded into the heavy-water-moderated core resulted in a fission-to-absorption ratio of 0.751 and a peaking factor of 1.7. Heavy water not only provides negative reactivity insertion during temperature rises, which change the moderator density, but also yielded a 2 to 10 kg reduction in enrichment requirements, depending on the geometry type.
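
For reference, the peaking factor quoted above is conventionally defined as the ratio of the maximum local power density to the core-average power density; the expression below is this standard definition rather than a formula taken from the paper.

```latex
F_q \;=\; \frac{P_{\max}}{\bar{P}} \;=\; \frac{\text{maximum local power density}}{\text{core-average power density}} \;\approx\; 1.7
```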

External Effects on Dynamic Competitive Model of Domestic Airline and High Speed Rail

Socio-economic variables strongly influence transportation demand. Discrete choice model analyses use socio-economic variables to study travelers' mode choice and demand. However, calibrating a discrete choice model requires extensive questionnaire surveys. Therefore, an aggregate model is proposed instead. The historical data of passenger volumes for high speed rail and domestic civil aviation are employed to calibrate and validate the model. In this study, models with different socio-economic variables, namely oil price, GDP per capita, CPI and economic growth rate, are compared. The results show that the model with oil price performs better than the models with the other socio-economic variables.
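
The abstract does not give the functional form of the aggregate model, so the sketch below only illustrates the general calibration idea: fitting a simple mode-share curve for high speed rail against oil price by least squares and checking the fit. The logistic form, the synthetic data and the parameter names are assumptions for demonstration, not the authors' model.

```python
# Illustrative calibration sketch (assumed model form, synthetic data):
# fit a logistic mode-share curve for high speed rail against oil price.
import numpy as np
from scipy.optimize import curve_fit

def hsr_share(oil_price, a, b):
    """Assumed logistic relation between oil price and HSR mode share."""
    return 1.0 / (1.0 + np.exp(-(a + b * oil_price)))

# Synthetic observations: (oil price index, observed HSR mode share).
oil = np.array([60, 70, 80, 90, 100, 110, 120], dtype=float)
share = np.array([0.35, 0.38, 0.43, 0.47, 0.52, 0.55, 0.60])

params, _ = curve_fit(hsr_share, oil, share, p0=[0.0, 0.01])
rmse = np.sqrt(np.mean((hsr_share(oil, *params) - share) ** 2))
print("fitted a, b:", params, "RMSE:", round(float(rmse), 4))
```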

MinRoot and CMesh: Interconnection Architectures for Network-on-Chip Systems

The success of an electronic system in a System-on-Chip is highly dependent on the efficiency of its interconnection network, which is constructed from routers and channels (the routers move data across the channels between nodes). Since neither classical bus-based nor point-to-point architectures can provide scalable solutions and satisfy the tight power and performance requirements of future applications, the Network-on-Chip (NoC) approach has recently been proposed as a promising solution. Indeed, in contrast to the traditional solutions, the NoC approach can provide large bandwidth with moderate area overhead. The selected topology of the component interconnects plays a prime role in the performance of a NoC architecture, as do the routing and switching techniques that can be used. In this paper, we present two generic NoC architectures that can be customized to the specific communication needs of an application in order to reduce the area with minimal degradation of the latency of the system. An experimental study is performed to compare these structures with basic NoC topologies represented by the 2D mesh, the Butterfly Fat Tree (BFT) and SPIN. It is shown that the Cluster Mesh (CMesh) and MinRoot schemes achieve significant improvements in network latency and energy consumption with only negligible area overhead and complexity compared with existing architectures. In fact, compared with the basic NoC topologies, the CMesh and MinRoot schemes provide substantial savings in area as well, because they require fewer routers. The simulation results show that CMesh and MinRoot networks outperform the mesh, BFT and SPIN in the main performance metrics.
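
As a rough numerical illustration of why clustering cores onto shared routers can help, the short script below compares the average router-to-router hop count of a flat 8x8 mesh (one core per router) with a 4x4 clustered mesh in the CMesh style (four cores per router). The sizes and the uniform-traffic assumption are illustrative only and do not reproduce the paper's simulation setup.

```python
# Back-of-the-envelope comparison under uniform traffic: average hop count
# between distinct routers in a flat mesh versus a clustered (CMesh-style)
# mesh serving the same 64 cores with a quarter of the routers.
from itertools import product

def avg_mesh_hops(k):
    """Average Manhattan distance between distinct routers of a k x k mesh."""
    nodes = list(product(range(k), repeat=2))
    total = sum(abs(ax - bx) + abs(ay - by)
                for (ax, ay) in nodes for (bx, by) in nodes)
    pairs = len(nodes) ** 2 - len(nodes)
    return total / pairs

print(f"flat 8x8 mesh:      {avg_mesh_hops(8):.2f} hops, 64 routers")
print(f"clustered 4x4 mesh: {avg_mesh_hops(4):.2f} hops, 16 routers")
```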

Detection of Moving Images Using Neural Network

Motion detection is a basic operation in the selection of significant segments of video signals. For effective Human Computer Intelligent Interaction, the computer needs to recognize motion and track the moving object. Here an efficient neural network system is proposed for motion detection against a static background. The method consists of four parts: frame separation, rough motion detection, network formation and training, and object tracking. The approach can perform real-time detection, so it can be used in defense applications, biomedical applications and robotics. It can also be used to obtain detection information related to the size, location and direction of motion of moving objects for assessment purposes. The time taken for video tracking by this neural network is only a few seconds.
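
The abstract names the four stages but not their internals, so the sketch below shows one possible arrangement of the first three stages: rough motion detection by frame differencing, block-level features, and a tiny neural network that labels blocks as moving or static. The network size, block size and the weak labels used for the demo are assumptions, not the authors' design.

```python
# Illustrative sketch (not the authors' implementation): frame differencing
# plus a tiny neural network that classifies image blocks as moving/static.
import numpy as np

def frame_difference(prev_frame, curr_frame):
    """Absolute grayscale difference between two consecutive frames."""
    return np.abs(curr_frame.astype(float) - prev_frame.astype(float))

def block_features(diff, block=8):
    """Mean and std of the difference image over non-overlapping blocks."""
    feats = []
    for y in range(0, diff.shape[0] - block + 1, block):
        for x in range(0, diff.shape[1] - block + 1, block):
            patch = diff[y:y + block, x:x + block]
            feats.append([patch.mean(), patch.std()])
    return np.array(feats) / 255.0          # normalise to [0, 1]

class TinyMLP:
    """One-hidden-layer network trained by plain gradient descent."""
    def __init__(self, n_in=2, n_hidden=8, lr=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.w2 = rng.normal(0, 0.5, (n_hidden, 1))
        self.lr = lr

    def forward(self, x):
        self.h = np.tanh(x @ self.w1)
        return 1.0 / (1.0 + np.exp(-self.h @ self.w2))   # motion probability

    def train_step(self, x, y):
        p = self.forward(x)
        grad_out = (p - y) / len(x)                      # dLoss/dlogit (BCE)
        grad_h = grad_out @ self.w2.T * (1.0 - self.h ** 2)
        self.w2 -= self.lr * self.h.T @ grad_out
        self.w1 -= self.lr * x.T @ grad_h

# Toy usage: a static frame and a frame with one bright moving patch.
prev = np.zeros((64, 64)); curr = prev.copy(); curr[20:28, 30:38] = 255
X = block_features(frame_difference(prev, curr))
y = (X[:, 0] > 0.01).astype(float).reshape(-1, 1)        # weak demo labels
net = TinyMLP()
for _ in range(500):
    net.train_step(X, y)
print("blocks flagged as moving:", int((net.forward(X) > 0.5).sum()))
```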

Integration Process of Industrial Design and Engineering Design

Lately, management strategies that put Industrial Design (ID) at their core have been recognized as increasingly important, as technology and price alone cannot differentiate a product. The need to shorten product development time also shortens the ID development period, which necessitates ID process management. This research analyzes the status of the integration process of ID and Engineering Design (ED) for office equipment, which requires collaboration between ID and ED, in order to clarify the issues affecting development efficiency and to propose solutions.

Investigating the Effectiveness of Self-Shading Strategy on Overall Thermal Transfer Value and Window Size in High Rise Buildings

A great deal of energy is used in high-rise buildings to fulfill the basic needs of users, such as lighting and thermal comfort. Malaysia has a hot and humid climate, and buildings, especially high-rise buildings, receive unnecessary solar radiation that causes additional solar heat gain. Energy use, especially electricity consumption, in high-rise buildings has increased, and there have been growing concerns about energy consumption and its effect on the environment. Buildings, energy and the environment are thus important issues that designers should consider. A self-protected form is one possible way of countering the impact of solar radiation on high-rise buildings. The energy performance of building envelopes was investigated in terms of the Overall Thermal Transfer Value (OTTV). In this paper, the amount of OTTV reduction was calculated through OTTV equations to clarify the effectiveness of the self-shading strategy in minimizing the energy consumed for cooling interior spaces in high-rise buildings, which have considerable envelope areas exposed to solar radiation. The possibility of increasing the optimum window area when using the self-shading strategy in the design of high-rise buildings was also investigated. As a result, a significant reduction in OTTV was shown for a given window-to-wall ratio (WWR). In addition, a slight increase in WWR was demonstrated, which can improve visual comfort in interior spaces.
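
The paper applies OTTV equations without reproducing them here; as background, a commonly cited generic form of the OTTV combines wall conduction, window conduction and window solar gain. The symbols below follow that generic form and are not necessarily identical to the coefficients used in this study; self-shading mainly lowers the effective shading coefficient, and hence the solar term, for a given WWR.

```latex
\mathrm{OTTV} \;=\; U_w\,(1-\mathrm{WWR})\,TD_{eq} \;+\; U_f\,\mathrm{WWR}\,\Delta T \;+\; \mathrm{WWR}\cdot SC \cdot SF
```

Here U_w and U_f are the wall and fenestration U-values, TD_eq the equivalent temperature difference, Delta T the indoor-outdoor temperature difference, SC the shading coefficient and SF the solar factor.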

Developing a Sustainable Educational Portal for the D-Grid Community

In the last few years, several technologies have been developed to help build e-learning portals. Most of them follow approaches that deliver a vast amount of functionality suitable for class-like learning. The SuGI project, as part of the D-Grid (funded by the BMBF), aims to deliver a highly scalable and sustainable learning solution that provides materials (e.g. learning modules, training systems, webcasts, tutorials, etc.) containing knowledge about Grid computing to the D-Grid community. In this article, the process of developing an e-learning portal focused on the requirements of this special user group is described. Furthermore, the article deals with the conceptual and technical design of an e-learning portal addressing the special needs of heterogeneous target groups. The main focus lies on the quality management of the software development process, Web templates for uploading new content, and the rich search and filter functionalities, which are described from a conceptual as well as a technical point of view. Specifically, it points out best practices as well as concepts for providing a sustainable solution to a relatively unknown and highly heterogeneous community.

Ethanol Production from Sugarcane Bagasse by Means of Enzymes Produced by Solid State Fermentation Method

Nowadays there is growing interest in biofuel production in most countries because of increasing concerns about hydrocarbon fuel shortages and global climate change, as well as the desire to strengthen the agricultural economy and meet local needs for transportation fuel. Ethanol can be produced from biomass by hydrolysis and sugar fermentation processes. In this study, ethanol was produced from sugarcane bagasse without using expensive commercial enzymes. Alkali pretreatment was used to prepare the biomass before enzymatic hydrolysis, and a comparison of NaOH, KOH and Ca(OH)2 showed that NaOH is the most effective on bagasse. The enzymes required for biomass hydrolysis were produced by solid-state fermentation of sugarcane bagasse with two fungi, Trichoderma longibrachiatum and Aspergillus niger. The results show that the enzyme solution produced by A. niger performed better than that produced by T. longibrachiatum. Ethanol was produced by simultaneous saccharification and fermentation (SSF) with the crude enzyme solution from T. longibrachiatum and Saccharomyces cerevisiae yeast. To evaluate this procedure, SSF of pretreated bagasse was also carried out using Celluclast 1.5L from Novozymes. The ethanol yields obtained with the commercial enzyme and with the enzyme solution produced by T. longibrachiatum were 81% and 50%, respectively.
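
The abstract does not state the basis of the 81% and 50% yields; if, as is common, they are expressed relative to the theoretical maximum, the standard stoichiometric basis is the fermentation of glucose to ethanol at no more than 0.511 g ethanol per g glucose (a textbook relation, not a figure from the paper):

```latex
\mathrm{C_6H_{12}O_6} \rightarrow 2\,\mathrm{C_2H_5OH} + 2\,\mathrm{CO_2},
\qquad
Y_{\max} = \frac{2 \times 46}{180} \approx 0.511,
\qquad
\text{yield (\%)} = \frac{m_{\mathrm{ethanol}}}{0.511\, m_{\mathrm{glucose}}}\times 100
```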

Development of Reliable Web-Based Laboratories for Developing Countries

In an online context, the design and implementation of an effective remote laboratory environment is highly challenging on account of its hardware and software needs. This paper presents a remote laboratory software framework modified from the iLab Shared Architecture (ISA). The ISA is a framework which enables students to remotely access and control experimental hardware using internet infrastructure. The need for remote laboratories arose from the problems imposed by traditional laboratories: the high cost of laboratory equipment, scarcity of space, scarcity of technical personnel and restricted university budgets create a significant bottleneck in building the required laboratory experiments. The solution to these problems is to build web-accessible laboratories. Remote laboratories allow students and educators to interact with real laboratory equipment located anywhere in the world at any time. Recently, many universities and other educational institutions, especially in developing countries, have relied on simulations because they cannot afford the experimental equipment their students require. Remote laboratories enable users to obtain real data from real-time, hands-on experiments. To implement many remote laboratories, the system architecture should be flexible, understandable and easy to implement, so that different laboratories with different hardware can be deployed easily. The modifications were made to enable developers to add more equipment to the ISA framework and to attract new developers to build more online laboratories.

The Impact of ERP Systems on Accounting Processes

Advances in information technology, recent changes in the business environment, globalization, deregulation and privatization have made running a successful business more difficult than ever before. The pressure to remain successful and competitive has forced companies to react to these changes in order to survive and succeed. The implementation of an Enterprise Resource Planning (ERP) system improves information flow, reduces costs, establishes linkages with suppliers and reduces response time to customer needs. This paper focuses on a sample of Greek companies and investigates the ERP market in Greece, the reasons why Greek companies invest in ERP systems, the benefits users have achieved and the influence of ERP systems on the use of new accounting practices. The results indicate a greater level of information integration, more flexible information access and greater functionality provided by ERP systems, but little influence on the use of new accounting practices.

Performance Analysis of a Series of Adaptive Filters in Non-Stationary Environment for Noise Cancelling Setup

Noise cancellation is an essential component of many DSP applications. Changes in real-time signals are rapid and abrupt. In noise cancellation, a reference signal, which is an approximation of the noise signal that corrupts the original information signal, is obtained and then subtracted from the noise-bearing signal to obtain a noise-free signal. This approximation of the noise signal is obtained through adaptive filters, which are self-adjusting. As the changes in real-time signals are abrupt, an adaptive algorithm that converges fast and is stable is needed. Least mean square (LMS) and normalized LMS (NLMS) are two widely used algorithms because of their simplicity of calculation and implementation, but their convergence rates are low. Adaptive averaging filters (AFA) are also used because they converge quickly, but they are less stable. This paper provides a comparative study of the LMS, NLMS, AFA and the new enhanced averaging adaptive (Average NLMS, ANLMS) filters for a noise-cancelling application using speech signals.
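
For readers unfamiliar with the update rules being compared, the sketch below implements the textbook LMS and NLMS weight updates in a noise-cancelling arrangement (reference noise input, noisy primary signal). The step sizes, filter length and synthetic signals are illustrative choices, not the paper's experimental settings.

```python
# Textbook LMS / NLMS adaptive noise cancellation (illustrative settings).
import numpy as np

def adaptive_cancel(primary, reference, n_taps=16, mu=0.01, normalized=False, eps=1e-8):
    """Return the error signal e = primary - y, i.e. the cleaned-up output."""
    w = np.zeros(n_taps)
    e = np.zeros(len(primary))
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps + 1:n + 1][::-1]   # most recent samples first
        y = w @ x                                    # filter output (noise estimate)
        e[n] = primary[n] - y                        # error = recovered signal
        step = mu / (eps + x @ x) if normalized else mu
        w += step * e[n] * x                         # LMS / NLMS weight update
    return e

# Toy usage: a slow tone ("speech") corrupted by filtered white noise.
rng = np.random.default_rng(0)
n = 4000
speech = np.sin(2 * np.pi * 0.01 * np.arange(n))
noise = rng.standard_normal(n)
primary = speech + np.convolve(noise, [0.6, 0.3, 0.1])[:n]

for name, out in [("LMS ", adaptive_cancel(primary, noise, mu=0.01)),
                  ("NLMS", adaptive_cancel(primary, noise, mu=0.5, normalized=True))]:
    resid = out[1000:] - speech[1000:]
    print(name, "residual noise power:", round(float(np.mean(resid ** 2)), 5))
```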

A New Method for Extracting Ocean Wave Energy Utilizing the Wave Shoaling Phenomenon

Fossil fuels are the major source for meeting world energy requirements, but their rapidly diminishing reserves and adverse effects on our ecological system are of major concern. Renewable energy utilization is the need of the hour to meet future challenges, and ocean energy is one of these promising energy resources. Three-fourths of the earth's surface is covered by the oceans. This enormous energy resource is contained in the oceans' waters, the air above the oceans, and the land beneath them. The renewable energy of the ocean is mainly contained in waves, ocean currents and offshore solar energy. Relatively few efforts have been made to harness this reliable and predictable resource. Harnessing ocean energy requires detailed knowledge of the underlying mathematical governing equations and their analysis. With the advent of extraordinary computational resources, it is now possible to predict wave climatology in laboratory simulations. Several techniques have been developed, mostly stemming from numerical analysis of the Navier-Stokes equations. This paper presents a brief overview of such mathematical models and tools for understanding and analyzing wave climatology. Models of the 1st, 2nd and 3rd generations have been developed to estimate wave characteristics and assess the power potential. A brief overview of available wave energy technologies is also given. A novel concept for an on-shore wave energy extraction method is presented at the end. The concept is based upon total energy conservation, where the energy of the wave is transferred to a flexible converter, increasing its kinetic energy. The squeezing action of the external pressure on the converter body results in increased velocities at the discharge section. The high velocity head can then be used for energy storage or directly for power generation. This converter utilizes both the potential and kinetic energy of the waves and is designed for on-shore or near-shore application. The increased wave height at the shore due to shoaling effects raises the potential energy of the waves, which is converted into renewable energy. This approach will result in an economical wave energy converter, owing to near-shore installation and denser waves produced by shoaling, and the method will be more efficient because it taps both the potential and kinetic energy of the waves.
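
Two standard relations from linear wave theory underpin the shoaling argument made here (they are textbook results, not formulas taken from the paper): conservation of energy flux yields the shoaling coefficient that amplifies wave height towards the shore, and the deep-water energy flux per unit crest width quantifies the power carried by a regular wave of height H and period T.

```latex
E\,C_g = \text{const} \;\Rightarrow\; \frac{H_1}{H_0} = K_s = \sqrt{\frac{C_{g,0}}{C_{g,1}}},
\qquad
P \;=\; \frac{\rho\, g^{2}\, H^{2}\, T}{32\pi} \quad \text{(deep water, per unit crest width)}
```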

Rapid Frequency Response Measurement of Power Conversion Products with Coherence-Based Confidence Analysis

Switched-mode converters now play a significant role in modern society. Their operation is often crucial in various electrical applications affecting everyday life; therefore, the quality of the converters needs to be reliably verified. Recent studies have shown that converters can be fully characterized by a set of frequency responses, which can be used efficiently to validate their proper operation. Consequently, several methods have been proposed to measure the frequency responses quickly and accurately, most often based on correlation techniques. These measurement methods are, however, highly sensitive to external errors and system nonlinearities. This fact has often been overlooked, and the necessary uncertainty analysis of the measured responses has been neglected. This paper presents a simple approach to analyzing the noise and nonlinearities in frequency-response measurements of switched-mode converters. Coherence analysis is applied to form a confidence interval characterizing the noise and nonlinearities involved in the measurements. The presented method is verified by practical measurements from a high-frequency switched-mode converter.
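
A minimal numerical illustration of the coherence-based confidence idea is given below: the frequency response of a synthetic system is estimated with Welch/CSD averaging, the magnitude-squared coherence is computed, and a commonly used random-error bound for the frequency-response magnitude is evaluated from it. The excitation, the plant and the segment settings are illustrative, not the measurement setup of the paper.

```python
# Illustrative coherence-based confidence analysis of a frequency-response
# measurement (synthetic data; not the paper's converter or setup).
import numpy as np
from scipy import signal

fs, n = 10_000, 200_000
rng = np.random.default_rng(1)
x = rng.standard_normal(n)                       # broadband excitation
b, a = signal.butter(2, 800, fs=fs)              # stand-in "converter" dynamics
y = signal.lfilter(b, a, x) + 0.05 * rng.standard_normal(n)   # plus output noise

nperseg = 2048
f, pxx = signal.welch(x, fs=fs, nperseg=nperseg)
_, pxy = signal.csd(x, y, fs=fs, nperseg=nperseg)
_, coh = signal.coherence(x, y, fs=fs, nperseg=nperseg)

frf = pxy / pxx                                  # H1 frequency-response estimate
n_avg = n // nperseg                             # rough number of averages
rel_err = np.sqrt(1 - coh) / (np.sqrt(coh) * np.sqrt(2 * n_avg))

for target in (100, 800, 3000):                  # Hz
    i = int(np.argmin(np.abs(f - target)))
    print(f"{f[i]:7.1f} Hz  |H| = {abs(frf[i]):.3f}  "
          f"coherence = {coh[i]:.3f}  rel. error ~ {rel_err[i] * 100:.1f}%")
```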

Detecting Abnormal ECG Signals Utilising Wavelet Transform and Standard Deviation

The ECG contains very important clinical information about the cardiac activity of the heart. Often the ECG signal needs to be captured over a long period of time in order to identify abnormalities in certain situations. Such a signal, apart from its large volume, is often characterised by low quality due to noise and other influences. In order to extract features, the ECG signal, with its time-varying characteristics, first needs to be preprocessed with the best parameters. It is also useful to identify specific parts of the long-lasting signal which contain certain abnormalities and to direct the practitioner to those parts of the signal. In this work we present a method based on the wavelet transform, the standard deviation and a variable threshold, which achieves 100% accuracy in identifying the ECG signal peaks and heartbeats as well as the standard deviation, providing a quick reference to abnormalities.
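
As an illustration of how the wavelet transform and a standard-deviation-based variable threshold can be combined, the sketch below denoises a synthetic ECG-like trace with a wavelet decomposition and then flags peaks that exceed a sliding-window threshold of the mean plus a multiple of the local standard deviation. The wavelet choice, window length and threshold factor are assumptions for demonstration, not the authors' exact pipeline or parameters.

```python
# Illustrative sketch (not the authors' exact pipeline): wavelet denoising
# followed by peak detection with a variable, std-based threshold.
import numpy as np
import pywt

def wavelet_denoise(ecg, wavelet="db4", level=4):
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(ecg)))              # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(ecg)]

def detect_peaks(sig, fs, win_s=2.0, k=2.5):
    """Flag local maxima exceeding mean + k*std inside a sliding window."""
    win, peaks = int(win_s * fs), []
    for start in range(0, len(sig), win):
        seg = sig[start:start + win]
        if len(seg) < 3:
            break
        thr = seg.mean() + k * seg.std()
        idx = np.flatnonzero((seg[1:-1] > thr)
                             & (seg[1:-1] > seg[:-2])
                             & (seg[1:-1] > seg[2:])) + 1
        peaks.extend(start + idx)
    return np.array(peaks)

# Toy usage with a synthetic, noisy ECG-like signal (sharp beats at 1.2 Hz).
fs = 250
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 63 + 0.05 * rng.standard_normal(len(t))
beats = detect_peaks(wavelet_denoise(ecg), fs)
print("detected beats:", len(beats), "-> approx. heart rate:", len(beats) * 6, "bpm")
```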

The Turkish Version of the Inventory of the Dimensions of Emerging Adulthood (The IDEA)

Emerging adulthood, the period between ages 18 and 25, is a new conceptualization proposed by Arnett which is especially prevalent in industrialized countries. Turkey is basically a developing country with a young population structure. Investigating the presence of such a life period in such a culture might be helpful in understanding the educational and psychological needs of people in their twenties. With the aim of investigating emerging adulthood in Turkey, a well-known instrument (the IDEA, 2003) was adapted to the Turkish language and culture. The scale was administered to 296 participants aged between 15 and 34, and validity and reliability analyses were conducted. Exploratory factor analysis revealed three subscales. The reliability coefficient of the scale (Cronbach's α) was found to be .69, and the test-retest reliability coefficient was .81. Finally, a 20-item version of "The IDEA" was obtained for use in the Turkish population. The instrument is ready to be administered to Turkish young people to investigate the transition to adulthood and whether such an emerging adulthood period really exists.
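
For completeness, the internal-consistency coefficient reported above is the standard Cronbach's α; the formula below is the textbook definition (not derived in the paper), for a k-item scale with item variances σ_i² and total-score variance σ_X²:

```latex
\alpha \;=\; \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^{2}}{\sigma_X^{2}}\right),
\qquad \text{here } k = 20,\ \alpha = .69 .
```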

Supportability Analysis in LCI Environment

Starting from the basic pillars of supportability analysis, this paper examines its characteristics in an LCI (Life Cycle Integration) environment. The research methodology consists of a review of the modern logistics engineering literature, with the objective of collecting and synthesizing the knowledge relating to standards of supportability design in an e-logistics environment. The results show that the LCI framework has properties which are fully compatible with the requirements of simultaneous logistics support and product-service bundle design. The proposed approach is a contribution to a more comprehensive and efficient supportability design process. Further contributions are reflected in a greater consistency of collected data, the automated creation of reports suitable for different analyses, and the possibility of customizing them in accordance with customer needs. In addition, a convenience of this approach is its practical use in real time. In a broader sense, LCI allows the integration of enterprises on a worldwide basis, facilitating electronic business.