A Study on the Effect of Valve Timing on the Combustion and Emission Characteristics for a 4-cylinder PCCI Diesel Engine

PCCI engines can reduce NOx and PM emissions simultaneously without sacrificing thermal efficiency, but the low combustion temperature resulting from early fuel injection, and ignition occurring prior to TDC, can cause higher THC and CO emissions and higher fuel consumption. It was found that PCCI combustion achieved by a 2-stage injection strategy with optimized calibration factors (e.g. EGR rate, injection pressure, swirl ratio, intake pressure, injection timing) can reduce NOx and PM emissions simultaneously. This research is expected to provide valuable information conducive to the development of an innovative combustion engine that can fulfill upcoming stringent emission standards.

Video Data Mining based on Information Fusion for Tamper Detection

In this paper, we propose novel algorithmic models based on information fusion and feature transformation in a cross-modal subspace for detecting digital video tampering or forgery. Different types of residue features are extracted from several intra-frame and inter-frame pixel sub-blocks in video sequences. An evaluation of the proposed residue features (the noise residue features and the quantization features), their transformation in the cross-modal subspace, and their multimodal fusion for an emulated copy-move tamper scenario shows a significant improvement in tamper detection accuracy compared to single-mode features without transformation in the cross-modal subspace.

Properties of Carrageenan Extracted from Eucheuma cottonii, Indonesia

The effect of extraction solvent on the properties of carrageenan from Eucheuma cottonii was studied. Distilled water and KOH solutions (0.1-0.5 N) were used as the solvents. The extraction was carried out in a water bath equipped with a stirrer at a constant speed of 275 rpm, with a constant ratio of seaweed weight to solvent volume (1:50 g/mL), at 86 °C for 45 minutes. The extract was then precipitated in 3 volumes of 90% ethanol and oven dried at 60 °C. Based on the experimental data, alkali significantly influenced the yield and properties of the extracted carrageenan. The extracted carrageenan was found to have essentially identical FTIR spectra to reference samples of kappa-carrageenan. Increasing the KOH concentration led to carrageenan with lower sulfate content and lower intrinsic viscosity. The gel strength increased with increasing KOH concentration. The decrease in intrinsic viscosity indicates that polymer degradation occurs during alkali extraction.

Extracting Single Trial Visual Evoked Potentials using Selective Eigen-Rate Principal Components

In single-trial analysis, when using Principal Component Analysis (PCA) to extract Visual Evoked Potential (VEP) signals, the selection of principal components (PCs) is an important issue. We propose a new method here that selects only the appropriate PCs. We denote the method as selective eigen-rate (SER). In the method, the VEP is reconstructed based on the rate of the eigenvalues of the PCs. When this technique is applied to emulated VEP signals with added background electroencephalogram (EEG), with a focus on extracting the evoked P3 parameter, it is found to be feasible. The improvement in signal-to-noise ratio (SNR) is superior to that of two other existing methods of PC selection: Kaiser (KSR) and Residual Power (RP). Although another PC selection method, Spectral Power Ratio (SPR), gives a comparable SNR, with high noise factors (i.e. strong background EEG) SER gives more impressive results. Next, we applied the SER method to real VEP signals to analyse the P3 responses for matched and non-matched stimuli. The P3 parameters extracted through our proposed SER method showed a higher P3 response for the matched stimulus, which conforms to existing neuroscience knowledge. Single-trial PCA using the KSR and RP methods failed to indicate any difference between the stimuli.
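As a minimal sketch of the eigen-rate idea (the exact SER selection rule and threshold are not given in the abstract, so the ratio test below is an assumption for illustration), a PCA-based single-trial reconstruction in Python might look like:

```python
import numpy as np

def ser_reconstruct(trials, rate_threshold=0.05):
    """Reconstruct single-trial VEPs keeping only PCs whose eigen-rate
    (eigenvalue divided by total variance) exceeds a threshold.

    trials: array of shape (n_trials, n_samples).
    rate_threshold is a hypothetical calibration value, not the authors'.
    """
    mean = trials.mean(axis=0)
    X = trials - mean                           # remove the ensemble mean
    cov = np.cov(X, rowvar=False)               # covariance over time samples
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
    rates = eigvals / eigvals.sum()             # eigen-rate of each PC
    keep = rates >= rate_threshold              # selective eigen-rate criterion (assumed form)
    scores = X @ eigvecs[:, keep]               # project onto retained PCs
    return scores @ eigvecs[:, keep].T + mean   # back-project to the signal space
```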

A New Edit Distance Method for Finding Similarity in DNA Sequences

The P-Bigram method is a string comparison method based on an internal two-character (bigram) similarity measure. The edit distance between two strings is the minimal number of elementary editing operations required to transform one string into the other; the elementary editing operations are deletion, insertion, and substitution of characters. In this paper, we apply the P-Bigram method to solve the similarity problem for DNA sequences. The method provides an efficient algorithm that locates the minimum number of editing operations between two strings. We implemented the algorithm and verified that our program computes the minimal distance between two strings. We develop the P-Bigram edit distance, show how it measures similarity, and implement it using dynamic programming. The performance of the proposed approach is evaluated using the number of edits and percentage similarity as measures.
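Since the abstract does not spell out the bigram-based cost model, the sketch below shows only the classical character-level edit distance computed by dynamic programming, on which the P-Bigram method builds; extending the comparison unit to two characters is the paper's contribution and is not reproduced here.

```python
def edit_distance(a: str, b: str) -> int:
    """Classical Levenshtein distance between two DNA strings,
    computed by dynamic programming (insertion, deletion, substitution)."""
    m, n = len(a), len(b)
    # dp[i][j] = minimal number of edits turning a[:i] into b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i                                  # delete all of a[:i]
    for j in range(n + 1):
        dp[0][j] = j                                  # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # match or substitution
    return dp[m][n]

def percent_similarity(a: str, b: str) -> float:
    """Percentage similarity derived from the edit distance
    (one common normalization, assumed here)."""
    return 100.0 * (1.0 - edit_distance(a, b) / max(len(a), len(b)))

print(edit_distance("ACGTACGT", "ACGAACGT"))   # -> 1 (single substitution)
```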

Health Care Ethics in Vulnerable Populations: Clinical Research through the Patient's Eyes

Chronic conditions carry with them strong emotions and often lead to charged relationships between patients and their health providers and, by extension, patients and health researchers. Persons are both autonomous and relational, and a purely cognitive model of autonomy neglects the social and relational basis of chronic illness. Ensuring genuine informed consent in research requires a thorough understanding of how participants perceive a study and their reasons for participation. Surveys may not capture the complexities of reasoning that underlie study participation. Contradictory reasons for participation, for instance an initial claim of altruism as rationale and a subsequent claim of personal benefit (therapeutic misconception), affect the quality of informed consent. Individuals apply principles through the filter of personal values and lived experience. Authentic autonomy, and hence authentic consent to research, occurs within the context of patients' unique life narratives and illness experiences.

Control Technology for a Daily Load-following Operation in a Nuclear Power Plant

In Korea, the technology for a load-following operation of a nuclear power plant is being developed. An automatic controller able to control the reactor average temperature and the axial power distribution was developed; it consists of an identification algorithm and a model predictive controller. The former transforms the nuclear reactor status into numerical model parameters, and the latter uses them to generate manipulated values such as the positions of two kinds of control rods. With this automatic controller, the performance of a load-following operation was evaluated. As a result, the automatic controller generated the model parameters of the nuclear reactor and controlled the nuclear reactor average temperature and the axial power distribution to the desired targets during a daily load-following operation.

Vulnerabilities of IEEE 802.11i Wireless LAN CCMP Protocol

IEEE has recently incorporated the CCMP protocol to provide robust security to IEEE 802.11 wireless LANs. It is found that CCMP has been designed with a weak nonce construction and transmission mechanism, which leads to the exposure of the initial counter value. This weak construction of the nonce renders the protocol vulnerable to attacks by intruders. This paper presents how the initial counter can be pre-computed by the intruder. This vulnerability of the counter block value leads to a pre-computation attack on the counter mode encryption of CCMP. The failure of the counter mode will result in the collapse of the whole security mechanism of the 802.11 WLAN.
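A rough sketch of the counter-block assembly that makes this pre-computation possible is shown below (field layout follows the CCM mode used by 802.11i, but flag handling is simplified and the example values are hypothetical): the nonce is built from the priority field, the transmitter address A2, and the packet number PN, all of which are transmitted in the clear, so an eavesdropper can rebuild the initial counter block.

```python
def ccmp_counter_block(priority: int, a2: bytes, pn: bytes, counter: int) -> bytes:
    """Assemble a CCMP CTR-mode counter block from fields readable in the
    frame header (simplified sketch; flag octet handling is abbreviated).

    a2: 6-byte transmitter MAC address, pn: 6-byte packet number.
    """
    flags = 0x01                                   # CCM counter flags for a 2-octet length field
    nonce = bytes([priority & 0x0F]) + a2 + pn     # 13-octet nonce, all values visible on air
    return bytes([flags]) + nonce + counter.to_bytes(2, "big")

# First counter value an attacker could pre-compute for a sniffed frame (example values):
block = ccmp_counter_block(priority=0,
                           a2=b"\x00\x11\x22\x33\x44\x55",
                           pn=b"\x00\x00\x00\x00\x00\x01",
                           counter=1)
```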

Optimization Based Obstacle Avoidance

Based on a non-linear single-track model describing the dynamics of the vehicle, an optimal path planning strategy is developed. Real-time optimization is used to generate reference control values that lead the vehicle along a calculated lane which is optimal with respect to different objectives such as energy consumption, run time, safety, or comfort. A strict mathematical formulation of autonomous driving allows decisions to be taken in undefined situations such as a lane change or obstacle avoidance. Based on the position of the vehicle, the lane situation, and the obstacle position, the optimization problem is reformulated in real time to avoid the obstacle and any collision.
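Schematically, the real-time optimization described above can be written as the following optimal control problem (generic notation; the paper's exact cost terms and constraint sets are not reproduced here):

\[
\begin{aligned}
\min_{u(\cdot)} \quad & \int_{t_0}^{t_0+T} \bigl( w_1\,\ell_{\mathrm{energy}}(x,u) + w_2\,\ell_{\mathrm{time}}(x,u) + w_3\,\ell_{\mathrm{comfort}}(x,u) \bigr)\,dt \\
\text{s.t.} \quad & \dot{x} = f(x,u) \qquad \text{(non-linear single-track vehicle model)}, \\
& x(t) \in \mathcal{X}_{\mathrm{lane}}(t), \qquad \bigl\| p(t) - p_{\mathrm{obs}}(t) \bigr\| \ge d_{\mathrm{safe}}, \\
& u(t) \in \mathcal{U} \qquad \text{(steering and acceleration limits)},
\end{aligned}
\]

where the weights $w_i$ select the objective (energy, run time, safety, or comfort) and the obstacle constraint is re-instantiated whenever a new obstacle position is detected.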

Combining Ant Colony Optimization and Dynamic Programming for Solving a Dynamic Facility Layout Problem

This paper presents an algorithm combining ant colony optimization with dynamic programming for solving a dynamic facility layout problem. The problem is separated into two phases, a static phase and a dynamic phase. In the static phase, ant colony optimization is used to find the best-ranked layouts for each period. Then a dynamic programming (DP) procedure is performed in the dynamic phase to evaluate the layout sets over the multi-period planning horizon. The proposed algorithm is tested on problems with sizes ranging from 9 to 49 departments and with 2 and 4 periods. The experimental results show that the proposed method is an alternative way for the plant layout designer to determine the layouts over a multi-period planning horizon.
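A minimal sketch of the dynamic (DP) phase is given below; the candidate layouts per period would come from the ACO phase, while `flow_cost` and `rearrange_cost` are hypothetical placeholders for the problem-specific material-handling and rearrangement cost functions.

```python
from functools import lru_cache

def dp_layout_plan(candidates, flow_cost, rearrange_cost):
    """Choose one layout per period so that material-handling cost plus
    rearrangement cost is minimal over the planning horizon.

    candidates: per-period lists of best-ranked layouts from the ACO phase,
    each layout represented as a hashable tuple of department assignments.
    """
    T = len(candidates)

    @lru_cache(maxsize=None)
    def best(t, prev):
        if t == T:
            return 0.0, ()
        options = []
        for layout in candidates[t]:
            move = rearrange_cost(prev, layout) if prev is not None else 0.0
            tail_cost, tail_plan = best(t + 1, layout)
            options.append((flow_cost(t, layout) + move + tail_cost,
                            (layout,) + tail_plan))
        return min(options)                      # lowest total cost and its layout plan

    return best(0, None)
```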

Dependence of the Reflection of Virtual Subjects on the Features of Students' Coping Behavior

In the process of globalization, when a struggle for the minds and values of people is taking place, the impact of virtual space can cause unexpected effects and consequences in the process of young people's adjustment to this world. The special significance of virtual subjects lies in their unconscious influence on the underlying process of meaning-making; the values they preach are therefore much more effective and affect both personal characteristics and the peculiarities of the adjustment process. The challenge, then, is to identify the factors influencing the reflection characteristics of virtual subjects and to measure their impact on the personal characteristics of students.

Mathematical Determination of Tall Square Building Height under Peak Wind Loads

The present study concentrates on solving the along-wind oscillation problem of a tall square building from first principles and the across-wind oscillation problem of the same building from empirical relations obtained by experiments. The criterion for human comfort under the worst condition at the top floor of the building is considered, and a limiting value of the height of a building for a given cross-section is predicted. Numerical integrations are carried out as and when required. The results show the severity of across-wind oscillations in comparison to along-wind oscillations. The comfort criterion is combined with the across-wind oscillation results to determine the maximum allowable height of a building for a given square cross-section.

Dynamic Slope Scaling Procedure for Stochastic Integer Programming Problem

Mathematical programming has been applied to various problems. For many actual problems, the assumption that the parameters involved are deterministic known data is often unjustified. In such cases, these data contain uncertainty and are thus represented as random variables, since they represent information about the future. Decision-making under uncertainty involves potential risk. Stochastic programming is a commonly used method for optimization under uncertainty. A stochastic programming problem with recourse is referred to as a two-stage stochastic problem. In this study, we consider a stochastic programming problem with simple integer recourse in which the value of the recourse variable is restricted to a multiple of a nonnegative integer. The algorithm of a dynamic slope scaling procedure for solving this problem is developed by using a property of the expected recourse function. Numerical experiments demonstrate that the proposed algorithm is quite efficient. The stochastic programming model defined in this paper is quite useful for a variety of design and operational problems.
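For reference, a standard form of the two-stage model with simple integer recourse reads as follows (textbook notation; the paper's exact formulation may differ in details such as the scaling of the recourse variable):

\[
\begin{aligned}
\min_{x \in X} \quad & c^{\top}x + \mathbb{E}_{\xi}\bigl[ Q(x,\xi) \bigr], \\
Q(x,\xi) \;=\; & q^{+} \bigl\lceil (\xi - Tx)^{+} \bigr\rceil \;+\; q^{-} \bigl\lceil (Tx - \xi)^{+} \bigr\rceil ,
\end{aligned}
\]

where the second-stage (recourse) variables are the integer round-ups of the shortage and surplus, $q^{+}, q^{-} \ge 0$ are the recourse costs, and $\xi$ is the random right-hand side. A dynamic slope scaling procedure, in general terms, approximates such a cost function by linear functions whose slopes are updated iteratively (a brief characterization of the general DSSP idea, not the paper's specific construction).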

Bail-in Capital: The New Box

In this paper, we discuss the paradigm shift in bank capital from the “gone concern” to the “going concern” mindset. We then propose a methodology for pricing a product of this shift called Contingent Capital Notes (“CoCos”). The Merton Model can determine a price for credit risk by using the firm's equity value as a call option on its assets. Our pricing methodology for CoCos also uses the credit spread implied by the Merton Model in a subsequent derivative form created by John Hull et al. Here, a market-implied asset volatility is calculated by using observed market CDS spreads. This implied asset volatility is then used to estimate the probability of triggering a predetermined “contingency event” given the distance-to-trigger (DTT). The paper then investigates the effect of varying DTTs and recovery assumptions on the CoCo yield. We conclude with an investment rationale.
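The Merton relations underlying this approach can be sketched as follows (standard textbook form; the Hull-style calibration from CDS spreads and the paper's exact trigger definition are not reproduced here):

\[
\begin{aligned}
E &= A\,N(d_1) - D\,e^{-rT} N(d_2), \qquad
d_{1,2} = \frac{\ln(A/D) + \bigl(r \pm \tfrac{1}{2}\sigma_A^{2}\bigr)T}{\sigma_A \sqrt{T}}, \\
\mathrm{DTT} &= \frac{\ln(A/A_{\mathrm{trigger}}) + \bigl(\mu - \tfrac{1}{2}\sigma_A^{2}\bigr)T}{\sigma_A \sqrt{T}}, \qquad
\Pr[\text{trigger}] \approx N(-\mathrm{DTT}),
\end{aligned}
\]

where the equity $E$ is treated as a call option on the assets $A$ with the debt face value $D$ as strike, and the asset volatility $\sigma_A$ is the market-implied value backed out from observed CDS spreads.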

Publishing Curriculum Vitae using Weblog: An Investigation on its Usefulness, Ease of Use, and Behavioral Intention to Use

In this cyber age, the job market has been rapidly transforming and becoming digitalized. Submitting a paper-based curriculum vitae (CV) nowadays does not grant a job seeker a high employability rate. This paper calls for attention to the creation of the mobile Curriculum Vitae, or m-CV (http://mcurriculumvitae.blogspot.com), a sample of an individual CV developed using a weblog, which can enhance the marketability of a job hunter, especially a fresh graduate. This study is designed to identify the perceptions held by Malaysian university students regarding m-CV, grounded on a modified Technology Acceptance Model (TAM). It measures the strength and the direction of relationships among three major variables: Perceived Ease of Use (PEOU), Perceived Usefulness (PU), and Behavioral Intention (BI) to use. The findings show that university students generally accepted adopting m-CV since they perceived m-CV to be more useful rather than easy to use. Additionally, this study has confirmed TAM to be a useful theoretical model for understanding and explaining the behavioral intention to use a Web 2.0 application (weblog) for publishing a CV. The result of the study underlines another significant positive value of using a weblog to create a personal CV. Further research on m-CV is highlighted in this paper.

Fast Codevector Search Algorithm for 3-D Vector Quantized Codebook

This paper presents a very simple and efficient algorithm for codebook search, which reduces a great deal of computation as compared to the full codebook search. The algorithm is based on a sorting and centroid technique for the search. The tabulated results show the effectiveness of the proposed algorithm in terms of computational complexity. In this paper we also introduce a new performance parameter, the average fractional change in pixel value, as we feel that it gives a better understanding of the closeness of the reconstructed image to the original, since it is related to perception. This new performance parameter takes into consideration the average fractional change in each pixel value.
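A minimal sketch of the two ideas is given below; the exact normalization of the metric and the window size for the sort-based candidate pruning are assumptions for illustration, not the paper's definitions.

```python
import numpy as np

def avg_fractional_change(original, reconstructed, eps=1e-9):
    """Average fractional change in pixel value between the original image
    and its VQ-reconstructed version (one plausible normalization)."""
    original = original.astype(np.float64)
    reconstructed = reconstructed.astype(np.float64)
    return float(np.mean(np.abs(original - reconstructed) / (np.abs(original) + eps)))

def fast_codevector_search(codebook_sorted, sums, x, window=8):
    """Sort-based candidate pruning: codevectors are pre-sorted by component
    sum, and only a window around the input vector's sum is examined
    (window is a hypothetical tuning parameter)."""
    i = int(np.searchsorted(sums, x.sum()))
    lo, hi = max(0, i - window), min(len(codebook_sorted), i + window)
    cand = codebook_sorted[lo:hi]
    d = np.sum((cand - x) ** 2, axis=1)          # squared Euclidean distances
    return lo + int(np.argmin(d))                # index of the nearest candidate
```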

Light Confinement in Low Index Nanometer Areas

In this work we numerically examine structures which could confine light in nanometer-scale areas. A system consisting of two silicon disks with an in-plane separation of a few tens of nanometers has been studied first. The normalized, unitless effective mode volume, Veff, has been calculated for the two lowest whispering gallery mode resonances. The effective mode volume is reduced significantly as the gap between the disks decreases. In addition, the effect of the substrate is also studied. In that case, a Veff of approximately the same value as in the non-substrate case for a similar two-disk system can be obtained by using disks almost twice as thick. We also numerically examine a structure consisting of a circular slot waveguide formed into a silicon disk resonator. We show that the proposed structure can have high-Q resonances, suggesting that it is a very promising candidate for optical interconnect applications. The study includes several numerical calculations for all the geometric parameters of the structure. It also includes numerical simulations of the coupling between a waveguide and the proposed disk resonator, leading to a very promising conclusion about its applicability.
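The effective mode volume referred to above is commonly defined and normalized as shown below (standard definition; the normalization by $(\lambda/2n)^3$ is an assumption about the unitless form used):

\[
V_{\mathrm{eff}} = \frac{\int \varepsilon(\mathbf{r})\,\lvert \mathbf{E}(\mathbf{r}) \rvert^{2}\, dV}{\max\bigl[\varepsilon(\mathbf{r})\,\lvert \mathbf{E}(\mathbf{r}) \rvert^{2}\bigr]},
\qquad
\tilde{V}_{\mathrm{eff}} = \frac{V_{\mathrm{eff}}}{(\lambda / 2n)^{3}},
\]

where $\varepsilon$ is the permittivity, $\mathbf{E}$ the electric field of the resonant mode, $\lambda$ the resonance wavelength, and $n$ the refractive index at the field maximum.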

Comparative Study of Decision Trees and Rough Sets Theory as Knowledge Extraction Tools for Design and Control of Industrial Processes

General requirements for knowledge representation in the form of logic rules, applicable to the design and control of industrial processes, are formulated. The characteristic behavior of decision trees (DTs) and rough sets theory (RST) in rule extraction from recorded data is discussed and illustrated with simple examples. The significance of the models' drawbacks was evaluated using simulated and industrial data sets. It is concluded that the performance of DTs may be considerably poorer than that of RST in several important aspects, particularly when not only a characterization of a problem is required but also detailed and precise rules are needed for the actual, specific problems to be solved.

Anthropomorphism in Robotics Engineering for Disabled People

In its attempt to offer new ways into autonomy for a large population of disabled people, assistive technology has largely been inspired by robotics engineering. Recent human-like robots carry new hopes that it seems necessary to analyze by means of a specific theory of anthropomorphism. We propose to distinguish a functional anthropomorphism, which is that of current wheelchairs, from a structural anthropomorphism based on a mimicking of human physiological systems. While functional anthropomorphism offers the main advantage of eliminating the issue of physiological systems interdependence, the strong link between robots for disabled people and their human-built environment would lead, in the future, to favoring the structural anthropomorphic way. In this future framework, we highlight a general interdependence principle: any partial or local structural anthropomorphism generates new anthropomorphic needs due to the interdependency of physiological systems, whose effects can be evaluated by means of specific anthropomorphic criteria derived from a set-theory-based approach to physiological systems.

An Experimental Method for Measuring Clamping Force in Bolted Connections and Effect of Bolt Threads Lubrication on Its Value

In this paper, the details of an experimental method to measure the clamping force at bolted connections due to the application of wrenching torque to tighten the nut are presented. A simplified bolted joint consisting of a holed plate with a single bolt was considered to carry out the experiments. The method was designed based on Hooke's law, by measuring the compressive axial strain of a steel bush placed between the nut and the plate. In the experimental procedure, the values of the clamping force were calculated for seven different levels of applied torque, and this process was repeated three times for each level of torque. Moreover, the effect of lubrication of the threads on the clamping force was studied using the same method. For both conditions (dry and lubricated threads), the relation between the torque and the clamping force is displayed in graphs.
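The conversion from measured strain to clamping force follows directly from Hooke's law applied to the bush cross-section (notation introduced here for illustration):

\[
F \;=\; \sigma A \;=\; E\,\varepsilon\,A \;=\; E\,\varepsilon\;\frac{\pi\bigl(D_o^{2} - D_i^{2}\bigr)}{4},
\]

where $\varepsilon$ is the measured compressive axial strain, $E$ the elastic modulus of the steel bush, and $D_o$, $D_i$ its outer and inner diameters.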