Structured Phospholipids from Commercial Soybean Lecithin Containing Omega-3 Fatty Acids Reduce Atherosclerosis Risk in Male Sprague-Dawley Rats Fed an Atherogenic Diet

Structuring phospholipids from commercial soybean lecithin with omega-3-rich oil obtained from a by-product of tuna canning is an alternative procedure for stabilizing the omega-3 fatty acid structure and increasing its bioactive function in metabolism. The best treatment condition was obtained with an 18-hour acidolysis reaction at 30% enzyme concentration, giving an EPA-DHA incorporation level of 127.47 mg/g and an EPA-DHA incorporation percentage of 51.04% in the phospholipid structure. These structured phospholipids could reduce atherosclerosis risk in male Sprague-Dawley rats. Provision of structured phospholipids had a significant effect (α = 0.05) on changes in the lipid profile and the intima-media thickness of the aorta in male Sprague-Dawley rats fed an atherogenic diet. Structured phospholipid intake lowered total cholesterol by 78.36 mg/dL, total triglycerides by 94.57 mg/dL and LDL levels by 87.08 mg/dL, and increased HDL levels by 12.64 mg/dL over 10 weeks of treatment. Structured phospholipid intake also prevented thickening of the intima-media layer of the aorta.

The Determinants of Voluntary Disclosure in Croatia

This study investigates the level and extent of voluntary disclosure practice in Croatia. The research was conducted on a sample of 130 medium-sized and large companies. Findings indicate that two thirds of the companies analyzed disclose a below-average number of additional information items. The explanatory analysis has shown that firm size, listing status and industrial sector significantly and positively affect the level and extent of voluntary disclosure in the annual reports of Croatian companies. On the other hand, profitability and ownership structure were found to be statistically insignificant. Unlike previous studies, this paper deals with the level of voluntary disclosure of medium-sized and large companies, as well as companies whose shares are not listed on an organized capital market, which constitutes our contribution. The research also contributes by providing insights into voluntary disclosure practices in Croatia, as a case of a macro-oriented accounting system economy, i.e. a bank-oriented economy with an emerging capital market.
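
A minimal sketch of the kind of explanatory model the study describes, written in Python (the variable names and dataset file are hypothetical, not the paper's exact specification): a voluntary disclosure index is regressed on firm size, listing status, industry, profitability and ownership concentration.

```python
# Hypothetical illustration of the disclosure regression; column names and the
# dataset file are assumptions, not taken from the paper.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("croatia_disclosure.csv")  # 130 medium-sized and large companies
model = smf.ols(
    "disclosure_index ~ np.log(total_assets) + listed + C(industry)"
    " + roa + ownership_concentration",
    data=df,
).fit()
# Per the abstract: size, listing status and industry should test significant,
# profitability (roa) and ownership concentration insignificant.
print(model.summary())
```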

EEG-Based Fractal Analysis of Different Motor Imagery Tasks using Critical Exponent Method

The objective of this paper is to characterize the spontaneous electroencephalogram (EEG) signals of four different motor imagery tasks and thereby to show a possible way to move beyond the present binary communication between the brain and a machine, or Brain-Computer Interface (BCI). The processing technique used in this paper was fractal analysis evaluated by the Critical Exponent Method (CEM). The EEG signal was recorded in 5 healthy subjects, sampling 15 measuring channels at 1024 Hz. Each channel was preprocessed by Laplacian spatial filtering so as to reduce spatial blur and therefore increase spatial resolution. The EEG of each channel was segmented and its fractal dimension (FD) calculated. The FD was evaluated in the time interval corresponding to the motor imagery and averaged over all subjects for each channel. In order to characterize the FD distribution, linear regression curves of FD over the electrode positions were applied. The differences in FD between the proposed mental tasks were quantified and evaluated for each experimental subject. The results show a substantial fractal dimension in the EEG signals of motor imagery tasks, which can be utilized for multiple-state BCI applications.
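
A minimal Python sketch of one common spectral route to the fractal dimension (not necessarily the authors' exact CEM implementation): for 1/f^β, fractional-Brownian-motion-like signals, the FD relates to the spectral exponent by FD ≈ (5 − β)/2, so a log-log slope fit on each segment's power spectrum yields an FD estimate.

```python
# Hedged sketch: spectral estimate of fractal dimension for one EEG segment.
import numpy as np
from scipy.signal import welch

def spectral_fractal_dimension(x, fs=1024, fmin=1.0, fmax=40.0):
    """Estimate FD from the power-law exponent beta of the PSD."""
    f, pxx = welch(x, fs=fs, nperseg=min(len(x), 1024))
    band = (f >= fmin) & (f <= fmax)          # fit over a plausible EEG band
    beta = -np.polyfit(np.log(f[band]), np.log(pxx[band]), 1)[0]
    return (5.0 - beta) / 2.0                 # fBm relation: FD = (5 - beta) / 2

# One 2-second segment of one channel at 1024 Hz; a Brownian-like test signal
# (beta ~ 2) should give FD close to 1.5.
rng = np.random.default_rng(0)
segment = np.cumsum(rng.standard_normal(2048))
print(spectral_fractal_dimension(segment))
```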

Morphology and Risk Factors for Blunt Aortic Trauma in Car Accidents - An Autopsy Study

Background: Blunt aortic trauma (BAT) includes various morphological changes that occur during deceleration, acceleration and/or body compression in traffic accidents. The various forms of BAT, from limited laceration of the intima to complete transection of the aorta, depend on the force acting on the vessel wall and the tolerance of the aorta to injury. The force depends on the change in velocity, the dynamics of the accident and the seating position in the car. Tolerance to aortic injury depends on the anatomy, histological structure and pathomorphological alterations due to aging or disease of the aortic wall. An overview of the literature and medical documentation reveals that different terms are used to describe certain forms of BAT, which can lead to misinterpretation of findings or diagnoses. We therefore propose a classification that would enable uniform systematic screening of all forms of BAT. We have classified BAT into three morphological types: TYPE I (intramural), TYPE II (transmural) and TYPE III (multiple) aortic ruptures, with appropriate subtypes. Methods: All car accident casualties examined at the Institute of Forensic Medicine from 2001 to 2009 were included in this retrospective study. Autopsy reports were used to determine the occurrence of each morphological type of BAT in deceased drivers, front seat passengers and other passengers in cars and to define the morphology of BAT in relation to the accident dynamics and the age of the fatalities. Results: A total of 391 fatalities in car accidents were included in the study. TYPE I, TYPE II and TYPE III BAT were observed in 10.9%, 55.6% and 33.5%, respectively. The incidence of BAT in drivers, front seat passengers and other passengers was 36.7%, 43.1% and 28.6%, respectively. In frontal collisions, the incidence of BAT was 32.7%, in lateral collisions 54.2%, and in other traffic accidents 29.3%. The average age of fatalities with BAT was 42.8 years and of those without BAT 39.1 years. Conclusion: Identification and early recognition of the risk factors for BAT following a traffic accident is crucial for the successful treatment of patients with BAT. Front seat passengers over 50 years of age who have been injured in a lateral collision are at the greatest risk of BAT.

The Effects of Peristalsis on Dispersion of a Micropolar Fluid in the Presence of Magnetic Field

The paper presents an analytical solution for the dispersion of a solute in the peristaltic motion of a micropolar fluid in the presence of a magnetic field and both homogeneous and heterogeneous chemical reactions. The average effective dispersion coefficient has been found using Taylor's limiting condition under the long wavelength approximation. The effects of various relevant parameters on the average coefficient of dispersion have been studied. The average effective dispersion coefficient increases with the amplitude ratio, the cross viscosity coefficient and the heterogeneous chemical reaction rate parameter, but decreases with the magnetic field parameter and the homogeneous chemical reaction rate parameter. It can be noted that the presence of peristalsis enhances the dispersion of a solute.
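
For orientation, the classical Taylor limiting result for a passive, non-reacting solute carried by steady flow through a tube of radius $a$ with mean velocity $\bar{u}$ and molecular diffusivity $D_m$ reads (notation assumed here; the paper generalizes this baseline to peristaltic micropolar flow with a magnetic field and chemical reactions):

$$D_{\mathrm{eff}} = D_m + \frac{a^{2}\,\bar{u}^{2}}{48\,D_m}.$$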

Exergy Analysis of Combined Cycle of Air Separation and Natural Gas Liquefaction

This paper presents a novel combined cycle of air separation and natural gas liquefaction. The idea is that natural gas can be liquefied while gaseous or liquid nitrogen and oxygen are produced in one combined cryogenic system. Cycle simulation and exergy analysis were performed to evaluate the process and thereby reveal the influence of the crucial parameter, the flow-rate ratio β through the two-stage expanders, on the heat transfer temperature difference, its distribution and the consequent exergy loss. Composite curves for the combined hot streams (feed natural gas and recycled nitrogen) and the cold stream showed the degree of optimization available in this process if an appropriate β is chosen. The results indicated that increasing β reduces the temperature difference and the exergy loss in the heat exchange process. However, the maximum value of β should be confined in terms of the minimum temperature difference prescribed in heat exchanger design standards and the heat exchanger size. The optimal value of β under different operating conditions, corresponding to the required minimum temperature differences, was investigated.
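
A minimal Python sketch, not the paper's cycle model, of why a smaller heat-exchange temperature difference cuts exergy loss: under a constant-cp idealization with no pressure drop, the exergy destroyed equals T0 times the entropy generated by the two streams. The flow rates, heat capacities and temperatures below are illustrative only.

```python
# Hedged illustration: exergy destruction in an idealized counterflow exchanger.
import math

T0 = 298.15  # dead-state (ambient) temperature, K

def exergy_destruction(m_h, cp_h, Th_in, Th_out, m_c, cp_c, Tc_in):
    """T0 * total entropy generation of hot and cold streams (ideal model)."""
    q = m_h * cp_h * (Th_in - Th_out)        # duty from hot-side energy balance
    Tc_out = Tc_in + q / (m_c * cp_c)        # cold outlet from the same duty
    s_gen = (m_h * cp_h * math.log(Th_out / Th_in) +
             m_c * cp_c * math.log(Tc_out / Tc_in))
    return T0 * s_gen

# A better matched cold stream (smaller mean temperature difference) destroys
# less exergy, the trend the abstract attributes to increasing beta:
print(exergy_destruction(1.0, 2.2, 230.0, 160.0, 1.1, 2.0, 150.0))  # close match
print(exergy_destruction(1.0, 2.2, 230.0, 160.0, 2.0, 2.0, 150.0))  # wide match
```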

A Robust Method for Hand Tracking Using Mean-shift Algorithm and Kalman Filter in Stereo Color Image Sequences

Real-time hand tracking is a challenging task in many computer vision applications such as gesture recognition. This paper proposes a robust method for hand tracking in a complex environment using mean-shift analysis and a Kalman filter in conjunction with a 3D depth map. The depth information, obtained by passive stereo measurement based on cross-correlation and the known calibration data of the cameras, solves the problem of overlap between the hands and the face. Mean-shift analysis uses the gradient of the Bhattacharyya coefficient as a similarity function to derive the candidate region that is most similar to a given hand target model. A Kalman filter is then used to estimate the position of the hand target. The results of hand tracking, tested on various video sequences, are robust to changes in shape as well as partial occlusion.
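
A minimal single-camera sketch in Python with OpenCV (the paper's stereo depth map and Bhattacharyya-gradient formulation are not reproduced; OpenCV's meanShift on a hue back-projection is the standard approximation): mean-shift locates the hand window in each frame, and a constant-velocity Kalman filter smooths and predicts its centre.

```python
# Hedged sketch: mean-shift + Kalman hand tracking; the initial window is assumed.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                       # any video source
ok, frame = cap.read()
x, y, w, h = 200, 150, 80, 80                   # assumed initial hand window
roi = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([roi], [0], None, [180], [0, 180])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

kf = cv2.KalmanFilter(4, 2)                     # state: x, y, vx, vy; measured: x, y
kf.transitionMatrix = np.array([[1, 0, 1, 0], [0, 1, 0, 1],
                                [0, 0, 1, 0], [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.eye(2, 4, dtype=np.float32)
kf.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)

term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
track = (x, y, w, h)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    prob = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    kf.predict()                                # predicted centre between frames
    _, track = cv2.meanShift(prob, track, term) # mean-shift window refinement
    cx, cy = track[0] + w // 2, track[1] + h // 2
    kf.correct(np.array([[cx], [cy]], np.float32))
```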

Revised PLWAP Tree with Non-frequent Items for Mining Sequential Patterns

Sequential pattern mining is a challenging task in the data mining area, with numerous applications. One of those applications is mining patterns from weblogs. Weblogs are highly dynamic, and some entries may become obsolete over time. In addition, users may frequently change the threshold value during the data mining process until the required output or interesting rules are obtained. Some recently proposed algorithms for mining weblogs build the tree with two scans and consume considerable time and space. In this paper, we build a Revised PLWAP tree with Non-frequent Items (RePLNI-tree) with a single scan for all items. While mining sequential patterns, links related to non-frequent items are not considered, so it is not necessary to delete or maintain node information while revising the tree to mine updated transactions. The algorithm supports both incremental and interactive mining: the patterns need not be re-computed each time the weblog is updated or the minimum support is changed. The performance of the proposed tree remains good even when the size of the incremental database is more than 50% of the existing one. For evaluation, we used benchmark weblog datasets and found that the performance of the proposed tree is encouraging compared to some recently proposed approaches.
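
A much-simplified Python sketch of the single-scan idea (not the authors' exact RePLNI-tree or its header links): every event, frequent or not, is inserted once; at mining time, nodes whose item falls below the current minimum support are simply skipped, so changing the support or appending transactions never forces a rebuild.

```python
# Hedged sketch: single-scan prefix tree that keeps non-frequent items.
from collections import defaultdict

class Node:
    def __init__(self, item):
        self.item, self.count, self.children = item, 0, {}

def build(sequences):
    """One scan: insert every item, frequent or not, and tally frequencies."""
    root, freq = Node(None), defaultdict(int)
    for seq in sequences:
        node = root
        for item in seq:
            freq[item] += 1
            node = node.children.setdefault(item, Node(item))
            node.count += 1
    return root, freq

def mine(node, freq, min_sup, prefix=()):
    """Yield frequent patterns; links through rare items are skipped, not deleted."""
    for child in node.children.values():
        if freq[child.item] >= min_sup and child.count >= min_sup:
            pattern = prefix + (child.item,)
            yield pattern, child.count
            yield from mine(child, freq, min_sup, pattern)

root, freq = build([["a", "b", "c"], ["a", "c"], ["a", "b", "d"]])
print(list(mine(root, freq, min_sup=2)))  # re-mine with a new min_sup, no rebuild
```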

Proposal of Additional Fuzzy Membership Functions in Smoothing Transition Autoregressive Models

In this paper we propose and examine additional membership functions for Smoothing Transition Autoregressive (STAR) models; more specifically, we present the hyperbolic tangent, Gaussian and generalized bell functions. Because STAR models follow a fuzzy logic approach, more fuzzy membership functions should be tested. Furthermore, fuzzy rules can be incorporated, or other training or computational methods, such as error backpropagation or genetic algorithms, can be applied instead of nonlinear least squares. We examine two macroeconomic variables of the US economy, the inflation rate and the 6-month treasury bill interest rate.
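
As a hedged reminder (standard textbook forms with assumed notation, not the paper's exact parameterizations), with transition variable $s_t$, location $c$ and shape parameters $\gamma$, $a$, $b$, the candidate transition functions in $y_t = \phi' x_t + \theta' x_t\, G(s_t) + \varepsilon_t$ are:

$$G_{\tanh}(s_t) = \tfrac{1}{2}\bigl(1 + \tanh(\gamma (s_t - c))\bigr), \qquad G_{\mathrm{Gauss}}(s_t) = \exp\bigl(-\gamma (s_t - c)^{2}\bigr), \qquad G_{\mathrm{bell}}(s_t) = \frac{1}{1 + \lvert (s_t - c)/a \rvert^{2b}}.$$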

Investigation of the Effect of Cavitator Angle and Dimensions for a Supercavitating Vehicle

At very high speeds, bubbles form around underwater vehicles at sharp trailing edges or at places where the local pressure is lower than the vapor pressure. These bubbles are called cavities, and their size grows as the velocity increases. A properly designed cavitator can induce the formation of a single large cavity over the entire vehicle. Such a vehicle, travelling in the vaporous cavity, is called a supercavitating vehicle, and the present research work mainly focuses on the dynamic modeling of such vehicles. Cavitation of the fins is also accounted for, and its effect on the trajectory is explained. The entire dynamics has been developed using the state-space approach, and emphasis is given to the effect of the size and angle of attack of the cavitator. A control law has been established for the motion of the vehicle using Nonlinear Dynamic Inversion (NDI) with the cavitator as the control surface.
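
The abstract does not spell out the control law; as a hedged reminder of generic NDI (notation assumed here, not the paper's), for control-affine dynamics

$$\dot{x} = f(x) + g(x)\,u, \qquad u = g(x)^{-1}\bigl(\nu - f(x)\bigr), \qquad \nu = \dot{x}_{\mathrm{des}} + K\,(x_{\mathrm{des}} - x),$$

the feedback cancels the nonlinearities exactly and leaves the linear error dynamics $\dot{e} + K e = 0$; in this setting $u$ would be realized through the cavitator deflection.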

Is Management Science Doing Enough to Improve Healthcare?

Healthcare issues continue to pose huge problems and incur massive costs. As a result, many challenging problems remain unresolved. In this paper, we carry out an extensive survey of different areas of management and planning in an attempt to identify where management science methods have already made a substantial contribution to healthcare problems and where there is clear potential for more work. The focus is on the read-across to the healthcare domain from such approaches as applied generally to management and planning, and on how the methods can be used to improve patient care. We conclude that, since the healthcare domain differs significantly from traditional areas of management and planning, in some cases the approaches need to be modified to incorporate the complexities of healthcare and to fully exploit the potential for improvement.

Robust Human Rights Governance: Developing International Criteria

Many states are now committed to implementing international human rights standards domestically. In terms of practical governance, how might effectiveness be measured? A face-value answer can be found in domestic laws and institutions relating to human rights. However, this article provides two further tools to help states assess their status on the spectrum from robust to fragile human rights governance. The first recognises that each state has its own 'human rights history' whose ideal end stage is robust human rights governance; the second is a set of criteria for assessing robustness. Although a New Zealand case study is used to illustrate these tools, the widespread adoption of human rights standards by many states inevitably means that the issues are relevant to other countries, even though there will always be varying degrees of similarity and difference in constitutional background and in developed or emerging human rights systems.

Optimal Embedded Generation Allocation in Distribution System Employing Real Coded Genetic Algorithm Method

This paper proposes a new methodology for the optimal allocation and sizing of Embedded Generation (EG) employing a Real Coded Genetic Algorithm (RCGA) to minimize total power losses and to improve voltage profiles in radial distribution networks. The RCGA uses continuous floating-point numbers as its representation, unlike conventional binary encodings. The RCGA is used as the solution tool, determining the optimal location and size of EG in a radial system simultaneously. The method is implemented in MATLAB. The effect of EG units' installation and sizing on the distribution networks is demonstrated using a 24-bus system.
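
A minimal real-coded GA sketch, in Python rather than the authors' MATLAB: the fitness below is a hypothetical stand-in, where the paper would instead run a radial load flow on the 24-bus system and return the total power loss for an EG of the candidate size at the candidate bus. Genes stay continuous floats (the defining feature of RCGA), with BLX-α blend crossover and Gaussian mutation.

```python
# Hedged sketch: RCGA over (bus location, EG size); power_loss() is hypothetical.
import random

N_BUS, MAX_KW = 24, 5000.0

def power_loss(bus, size_kw):                 # stand-in for a radial load flow
    return (bus - 13.0) ** 2 + ((size_kw - 2200.0) / 1000.0) ** 2

def blend(a, b, alpha=0.5):                   # BLX-alpha crossover on real genes
    lo, hi = min(a, b), max(a, b)
    span = hi - lo
    return random.uniform(lo - alpha * span, hi + alpha * span)

pop = [[random.uniform(2, N_BUS), random.uniform(0, MAX_KW)] for _ in range(40)]
for gen in range(100):
    pop.sort(key=lambda c: power_loss(round(c[0]), c[1]))
    elite = pop[:10]                          # elitist selection
    children = []
    while len(children) < 30:
        p, q = random.sample(elite, 2)
        child = [blend(p[i], q[i]) for i in range(2)]
        if random.random() < 0.1:             # Gaussian mutation on the size gene
            child[1] += random.gauss(0, MAX_KW * 0.05)
        child[0] = min(max(child[0], 2), N_BUS)
        child[1] = min(max(child[1], 0.0), MAX_KW)
        children.append(child)
    pop = elite + children
best = min(pop, key=lambda c: power_loss(round(c[0]), c[1]))
print(f"bus {round(best[0])}, size {best[1]:.0f} kW")
```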

Using a Trust-Based Environment Key for Mobile Agent Code Protection

Human activities are increasingly based on the use of remote resources and services, and on interaction between remotely located parties that may know little about each other. Mobile agents must be prepared to execute on different hosts with varying environmental security conditions. The aim of this paper is to propose a trust-based mechanism to improve the security of mobile agents and allow their execution in various environments. An adaptive trust mechanism is therefore proposed, based on the dynamic interaction between the agent and the environment. Information collected during the interaction enables the generation of an environment key. This key reflects the host's trust degree and permits the mobile agent to adapt its execution. Trust estimation is based on concrete parameter values; thus, in case of distrust, the source of the problem can be located and an appropriate mobile agent behavior can be selected.
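
A minimal Python sketch of the idea (the parameters, weights and thresholds are hypothetical, not the paper's scheme): the agent scores concrete host parameters in [0, 1], and the weighted aggregate, standing in for the environment key, selects the agent's behavior; because the scores are per parameter, a distrusted host's failing parameter is immediately identifiable.

```python
# Hedged sketch: weighted trust aggregation driving agent behavior selection.
checks = {                      # parameter -> (observed score, weight); assumed
    "code_integrity":    (1.0, 0.4),
    "platform_patched":  (0.8, 0.2),
    "tls_enabled":       (1.0, 0.2),
    "past_interactions": (0.5, 0.2),
}
trust = sum(s * w for s, w in checks.values()) / sum(w for _, w in checks.values())
if trust >= 0.8:
    mode = "full execution"
elif trust >= 0.5:
    mode = "restricted execution"   # e.g. withhold sensitive code sections
else:
    mode = "abort and report"       # distrust: failing parameters pinpoint the cause
print(trust, mode)
```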

Changes to Oxidative Stress Levels Following Exposure to Formaldehyde in Lymphocytes

Formaldehyde is an illegal chemical substance used for food preservation in fish and vegetables, and it can promote carcinogenesis. Superoxide dismutases are important antioxidative enzymes that catalyze the dismutation of the superoxide anion into oxygen and hydrogen peroxide. The resulting level of oxidative stress in formaldehyde-treated lymphocytes was investigated. Human lymphocytes were treated with formaldehyde concentrations of 0, 20, 40, 60, 80 and 120 μmol/L for 12 hours. After the 12-hour treatment, the change in superoxide dismutase activity was measured in the treated lymphocytes. The results showed that formaldehyde concentrations of 60, 80 and 120 μmol/L significantly decreased superoxide dismutase activities in lymphocytes (P < 0.05). The change in superoxide dismutase activity in formaldehyde-treated lymphocytes may serve as a biomarker for detecting cellular injury, such as damage to DNA, due to formaldehyde exposure.

Climate Change and Environmental Education: The Application of Concept Map for Representing the Knowledge Complexity of Climate Change

Climate change, a topic of high knowledge complexity, has become an essential issue because of its significant impact on human existence, and specific national policies, some of which address educational aspects, have been published to confront this imperative problem. Accordingly, this study aims to analyze and integrate the relationship between climate change and environmental education, and to apply the concept map perspective to represent the knowledge contents and structures of climate change; by doing so, the knowledge contents of climate change can be represented more comprehensively and used as a tool for environmental education. The method adopted for this study is a knowledge conversion model, a platform on which the experts and teachers who participated in the study cooperated to combine each participant's standpoint into a complete knowledge framework, which is the foundation for structuring the concept map. The result of this research contains the important concepts, the precise propositions and the entire concept map representing the robust concepts of climate change.

Pipelined Control-Path Effects on Area and Performance of a Wormhole-Switched Network-on-Chip

This paper presents the design trade-offs and performance impacts of the number of pipeline stages in the control-path signals of a wormhole-switched network-on-chip (NoC). The number of pipeline stages in the control path varies between one and two cycles. The control paths consist of the routing request paths for output selection and the arbitration paths for input selection. Data communications between on-chip routers are implemented synchronously and, for quality of service, the inter-router data transports are controlled by link-level congestion control to avoid loss of data due to overflow. The trade-off between area (logic cell area) and performance (bandwidth gain) of two proposed NoC router microarchitectures is presented. The performance evaluation uses a traffic scenario with different numbers of workloads on a 2D mesh NoC topology with a static routing algorithm. Using a 130-nm CMOS standard-cell technology, our NoC routers can be clocked at 1 GHz, resulting in a high-speed network link and a high router bandwidth capacity of about 320 Gbit/s. Based on our experiments, the number of control-path pipeline stages has a more significant impact on NoC performance than on the logic area of the NoC router.
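
The quoted aggregate bandwidth is consistent with a five-port 2D-mesh router (north, south, east, west, local) with 64-bit links clocked at 1 GHz; note that the 64-bit link width is an assumption for illustration, not stated in the abstract:

$$5\ \text{ports} \times 64\ \mathrm{bit} \times 1\ \mathrm{GHz} = 320\ \mathrm{Gbit/s}.$$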

Digital Learning Environments for Joint Master of Science Programmes in Building and Construction in Europe: Experimenting with Tools and Technologies

Recent developments in information and communication technologies (ICT) have created excellent conditions for profoundly enhancing traditional learning and teaching practices. New modes of teaching in higher education can greatly enhance one's ability to proactively construct his or her personal learning universe. These developments have made digital learning environments widely available and accessible. In addition, there is a trend towards enlargement and specialization in higher education in Europe, with the result that existing Master of Science (MSc) programmes are being merged or new programmes established that are offered to students as joint MSc programmes. In these joint MSc programmes, the need for (common) digital learning environments capable of surmounting the barriers of time and location has become evident. This paper discusses past and ongoing efforts to establish such common digital learning environments in two joint MSc programmes in Europe and discusses how technology-based learning environments affect the traditional way of learning.

The Nonlinear Dynamic Elasto-Plastic Analysis for Evaluating the Controlling Effectiveness and Failure Mechanism of the MSCSS

This paper focuses on the performance and characteristics of Mega-Sub Controlled Structure Systems (MSCSS) with respect to the new control principle embodied in MSCSS subjected to strong earthquake excitations. The adopted control scheme consists of modulated sub-structures in which the control action is achieved by viscous dampers and the sub-structures' own configuration. The elasto-plastic time history analysis under severe earthquake excitation is performed based on the Finite Element Analysis Method (FEAM), and comparison results are also given in this paper. The results show that MSCSS can reduce vibration effects considerably more than the mega-sub structure (MSS). The study illustrates that the improved MSCSS exhibits good seismic resistance even at 1.2 g and can absorb seismic energy in the structure, implying that structural member cross-sections can be reduced, yielding good economy. Furthermore, the elasto-plastic analysis demonstrates that the MSCSS performs adequately with regard to international building evaluation and design codes. This paper also shows that the elasto-plastic dynamic analysis method is a reasonable and reliable analysis method for structures subjected to strong earthquake excitations and that the computed results are precise.

Formulation Development and Moisturising Effects of a Topical Cream of Aloe vera Extract

This study was designed to formulate and pharmaceutically evaluate a topical skin-care cream (w/o emulsion) of Aloe vera versus its vehicle (base) as control, and to determine their effects on stratum corneum (SC) water content and transepidermal water loss (TEWL). A base containing no extract and a formulation containing 3% concentrated Aloe vera extract were developed, the extract being entrapped in the inner aqueous phase of the w/o emulsion (cream). Lemon oil was incorporated to improve the odor. Both the base and the formulation were stored at 8°C ± 0.1°C (in a refrigerator), 25°C ± 0.1°C, 40°C ± 0.1°C, and 40°C ± 0.1°C with 75% RH (in an incubator) for a period of 4 weeks to predict their stability. The evaluation parameters consisted of color, smell, type of emulsion, phase separation, electrical conductivity, centrifugation, liquefaction and pH. Both the base and the formulation were then applied to the cheeks of 21 healthy human volunteers for a period of 8 weeks, and SC water content and TEWL were monitored every week to measure any effect produced by these topical creams. The expected organoleptic stability of the creams was confirmed over the 4-week in-vitro study period, although the odor faded with time owing to volatilization of the lemon oil. Both the base and the formulation produced significant (p ≤ 0.05) changes in TEWL over time. SC water content was significantly (p ≤ 0.05) increased by the formulation, while the base had an insignificant (p > 0.05) effect on SC water content. The newly formulated Aloe vera cream is thus suitable for improving the skin hydration level (SC water content/moisturizing effect) and reducing TEWL in people with dry skin, and these effects can be quantitatively monitored through the same measures.