On Hyperbolic Gompertz Growth Model

We propose a Hyperbolic Gompertz Growth Model (HGGM), developed by introducing an allometric shape parameter. This was achieved by applying the hyperbolic sine function to the intrinsic growth rate in the classical Gompertz growth equation. The deterministic integral solution was recast as a statistical model and used to model the height and diameter of pines (Pinus caribaea). Its predictive ability was compared with that of the classical Gompertz growth model using goodness-of-fit tests and model selection criteria; the hyperbolic approach mimics the natural variability of height/diameter increment with respect to age and therefore provides more realistic height/diameter predictions. The Kolmogorov-Smirnov and Shapiro-Wilk tests were used to check that the error term complies with the normality assumption, while the independence of the error term was confirmed using the runs test. The mean function of top height/Dbh over age predicted the observed values of top height/Dbh more closely under the HGGM than under its source model (the classical Gompertz growth model), and the R2, adjusted R2, MSE, and AIC results confirmed the superior predictive power of the HGGM over its source model.
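
To illustrate this comparison workflow, a minimal Python sketch is given below; the hyperbolic variant `hgompertz` is only an assumed stand-in (the paper defines the actual HGGM form), and the height-age data are synthetic.

```python
# Sketch: fit classical Gompertz vs. an assumed hyperbolic-sine-modified
# variant, then compare MSE and AIC (mirrors the model-selection criteria used).
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, a, b, c):
    # classical Gompertz: H(t) = a * exp(-b * exp(-c t))
    return a * np.exp(-b * np.exp(-c * t))

def hgompertz(t, a, b, c, d):
    # hypothetical hyperbolic variant: arcsinh term added to the rate exponent
    return a * np.exp(-b * np.exp(-(c * t + d * np.arcsinh(t))))

rng = np.random.default_rng(0)
age = np.arange(1.0, 26.0)                    # stand age in years (synthetic)
height = 24 * np.exp(-3.2 * np.exp(-0.15 * age)) + rng.normal(0, 0.4, age.size)

def aic(y, yhat, k):
    n, rss = y.size, np.sum((y - yhat) ** 2)
    return n * np.log(rss / n) + 2 * k        # AIC up to an additive constant

for name, f, p0 in [("Gompertz", gompertz, (25, 3, 0.1)),
                    ("hyperbolic variant", hgompertz, (25, 3, 0.1, 0.01))]:
    popt, _ = curve_fit(f, age, height, p0=p0, maxfev=20000)
    yhat = f(age, *popt)
    print(f"{name}: MSE={np.mean((height - yhat)**2):.4f}, "
          f"AIC={aic(height, yhat, len(popt)):.2f}")
```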

Malaysian Multi-Ethnic Discrimination Scale: Preliminary Factor and Psychometric Analysis

The aims of this study were to determine the factor structure and psychometric properties (i.e., reliability and convergent validity) of the Malaysian Multi-Ethnic Discrimination Scale (MMEDS), a 71-item instrument measuring the experience, coping strategies, and consequences of ethnic discrimination. A sample of 649 university students from a higher education institution in Malaysia was asked to complete the MMEDS as well as the Perceived Ethnic and Racial Discrimination by Peers Scale. The exploratory factor analysis of ethnic discrimination experience extracted two factors, labeled 'unfair treatment' (15 items) and 'denial of the ethnic right' (12 items), which accounted for 60.92% of the total variance. The two subscales demonstrated good reliability, with internal consistency above .70. The convergent validity of the scale was supported by the expected pattern of positive and significant correlations between the unfair treatment and denial of the ethnic right scores and the score on the Perceived Ethnic and Racial Discrimination by Peers Scale. The results suggest that the MMEDS is a reliable and valid measure. However, further studies with other samples are needed to validate the scale.
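
A minimal sketch of this analysis pipeline in Python follows, assuming placeholder item data; the `factor_analyzer` package is one common choice for EFA, not necessarily the authors' tool.

```python
# Sketch: two-factor EFA on the experience items plus Cronbach's alpha.
# Requires `pip install factor_analyzer`; data below are random placeholders.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(1)
# placeholder for the 27 retained experience items, 649 respondents
items = pd.DataFrame(rng.integers(1, 6, size=(649, 27)),
                     columns=[f"item{i}" for i in range(1, 28)])

fa = FactorAnalyzer(n_factors=2, rotation="promax")
fa.fit(items)
print(pd.DataFrame(fa.loadings_, index=items.columns).round(2))
print("cumulative variance explained:", fa.get_factor_variance()[2][-1])

def cronbach_alpha(df):
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = df.shape[1]
    return k / (k - 1) * (1 - df.var(ddof=1).sum() / df.sum(axis=1).var(ddof=1))

print("alpha (first 15 items):", round(cronbach_alpha(items.iloc[:, :15]), 3))
```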

Geochemistry of Natural Radionuclides Associated with Acid Mine Drainage (AMD) in a Coal Mining Area in Southern Brazil

Coal is an important non-renewable energy source and can be associated with radioactive elements. In Figueira city, Paraná state, Brazil, high uranium activity has been recorded near the coal mine that supplies a local thermoelectric power plant. In this context, radon activity (Rn-222, produced by Ra-226 decay in the natural U-238 series) was evaluated in groundwater, river water, and effluents produced by acid mine drainage in the coal reject dumps. The samples were collected in August 2013 and February 2014 and analyzed at LABIDRO (Laboratory of Isotope and Hydrochemistry), UNESP, Rio Claro city, Brazil, using an alpha spectrometer (AlphaGuard) adjusted to evaluate the mean radon activity concentration over five cycles of 10 minutes. No radon activity concentration exceeded 100 Bq.L-1, the critical value previously established by the World Health Organization. The average radon activity concentration in groundwater was higher than in surface water and in effluent samples, possibly due to the accumulation of uranium and radium in the aquifer layers, which favors radon trapping. The lower value in the river waters may indicate dilution, and the intermediate value in the effluents may indicate radon absorption in the coal particles of the reject dumps. The results also indicate that radon activity in the effluents increases with sample acidification, possibly due to greater radium leaching and the subsequent radon transport to the drainage flow. The water samples from the Laranjinha River and the Ribeirão das Pedras stream, which respectively supply Figueira city and receive the mining effluent, exhibited higher pH values upstream of the mine, reflecting the acid mine drainage discharge downstream. The transport of these radionuclides underscores the importance of monitoring their activity concentrations in natural waters due to the risks that radioactivity can pose to human health.

The Impact of Host Country Effects on Transferring HRM Practices from Western Headquarters to Ukrainian Subsidiaries

The emerging markets of post-USSR countries have attracted Western multinational companies; however, weak institutions and unstable host country environments have hindered the implementation of successful management practices. The Ukrainian market, in light of recent events, is particularly interesting to study for its compatibility with Western businesses. This paper focuses on factors that can facilitate or inhibit the transfer of human resource management practices from Western headquarters to Ukrainian subsidiaries. To better explain the effects of the national context, a business systems approach has been applied to a qualitative study of 16 wholly owned Western subsidiaries, dissecting the reasons for the weak integration of Western practices in Ukraine. Results show that underdeveloped institutions have forced companies to develop additional practices that compensate for national weaknesses, as well as to adjust to a constantly changing environment. Flexibility and local responsiveness were observed to be vital for success in Ukraine.

A Budget and Deadline Constrained Fault Tolerant Load Balanced Scheduling Algorithm for Computational Grids

A grid is an environment with millions of resources that are dynamic and heterogeneous in nature. A computational grid is one in which the resources are computing nodes, meant for applications that involve large computations. A scheduling algorithm is efficient only if it allocates resources well even in the case of resource failure. Resource allocation is a challenging problem, since it has to consider several requirements such as system load, processing cost and time, the user's deadline, and resource failure. This work designs a resource allocation algorithm that is cost-effective and also targets load balancing, fault tolerance, and user satisfaction by considering the above requirements. The proposed Budget Constrained Load Balancing Fault Tolerant algorithm with user satisfaction (BLBFT) reduces the schedule makespan, schedule cost, and task failure rate, and improves resource utilization. The proposed BLBFT algorithm is evaluated using the GridSim toolkit, and the results are compared with algorithms that concentrate on each of these factors separately. The comparison shows that the proposed algorithm outperforms its counterparts.
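
The sketch below illustrates the flavor of such a scheduler; it is a simple greedy mapping under budget and deadline constraints, not the paper's exact BLBFT algorithm, and all node and task figures are made up.

```python
# Greedy sketch: per task, pick the earliest-finishing node (balances load),
# cheapest as tie-break, subject to the user's budget and deadline.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    mips: float           # processing speed
    cost_per_sec: float   # processing cost
    ready: float = 0.0    # time at which the node becomes free (current load)

def schedule(tasks, nodes, budget, deadline):
    """tasks: list of (task_id, length_in_MI); returns (mapping, total cost)."""
    plan, spent = {}, 0.0
    for tid, length in sorted(tasks, key=lambda t: -t[1]):   # largest first
        best = None
        for n in nodes:
            finish = n.ready + length / n.mips
            cost = (length / n.mips) * n.cost_per_sec
            if finish <= deadline and spent + cost <= budget:
                if best is None or (finish, cost) < best[0]:
                    best = ((finish, cost), n, cost, finish)
        if best is None:   # a real scheduler would reschedule or reject here
            raise RuntimeError(f"task {tid}: no feasible node within constraints")
        _, n, cost, finish = best
        n.ready, spent, plan[tid] = finish, spent + cost, n.name
    return plan, spent

nodes = [Node("n1", 500, 0.02), Node("n2", 800, 0.05), Node("n3", 300, 0.01)]
tasks = [("t1", 4000), ("t2", 2500), ("t3", 6000)]
print(schedule(tasks, nodes, budget=2.0, deadline=60.0))
```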

Physicians’ Knowledge and Perception of Gene Profiling in Malaysia

The availability of various genetic tests since the completion of the Human Genome Project increases physicians' responsibility to keep themselves updated on the potential implementation of these tests in their daily practice. However, owing to a number of barriers, many physicians are still either unaware of these tests or unwilling to offer or refer their patients for genetic testing. This study conducted an anonymous, cross-sectional, mail-based survey to develop primary data on Malaysian physicians' level of knowledge and perception of gene profiling. The questionnaire had 29 questions. Total scores on selected questions were used to assess the level of knowledge; the highest possible score was 11. Descriptive statistics, one-way ANOVA, and chi-squared tests were used for statistical analysis. Sixty-three completed questionnaires were returned, by 27 general practitioners (GPs) and 36 medical specialists. Respondents' ages ranged from 24 to 55 years (mean 30.2 ± 6.4). About 40% of the participants rated themselves as having a poor level of knowledge of genetics in general, while 60% believed they had a fair level of knowledge; however, almost half (46%) of the respondents felt that they were not knowledgeable about available genetic tests. A majority (94%) of the respondents were not aware of any lab or company offering gene profiling services in Malaysia. Only 4% of participants were aware that gene profiling can be used to determine the dosage of some drugs. Respondents perceived greater utility of gene profiling for breast cancer (38%) than for familial colorectal cancer (3%). The knowledge score ranged from 2 to 8 (mean 4.38 ± 1.67). No significant difference was observed between the knowledge scores of GPs and specialists (4.19 and 4.58, respectively), and there was no significant association between any demographic factor and the level of knowledge, although those who graduated between 2001 and 2005 had a higher level of knowledge. Overall, 83% of participants showed a relatively high level of perception of the value of gene profiling for detecting a patient's risk of disease. However, low perception was observed both for using gene profiling in the general population to guide lifestyle changes (25%) and for obtaining the full sequence of a patient's genome to determine the patient's best match for treatment (18%). The lack of clinical guidelines, limited provider knowledge and awareness, lack of time and resources to educate patients, lack of evidence-based clinical information, and the cost of tests were the barriers to ordering gene profiling most often mentioned by physicians. In conclusion, the Malaysian physicians who participated in this study had a mediocre level of knowledge and awareness of gene profiling. Low exposure to genetic questions and problems may be a key predictor of this lack of awareness and knowledge of available genetic tests. Educational and training workshops may help Malaysian physicians incorporate gene profiling into practice for eligible patients.
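
For readers who want to reproduce the style of analysis, a small Python sketch follows; the scores are simulated around the reported group means and the contingency table is hypothetical, not the study's data.

```python
# Sketch: one-way ANOVA on knowledge scores (GPs vs. specialists) and a
# chi-squared test on a hypothetical knowledge-level contingency table.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
gp = np.clip(rng.normal(4.19, 1.67, 27).round(), 2, 8)   # 27 GPs (simulated)
sp = np.clip(rng.normal(4.58, 1.67, 36).round(), 2, 8)   # 36 specialists

F, p = stats.f_oneway(gp, sp)
print(f"ANOVA: F={F:.2f}, p={p:.3f}")

table = np.array([[15, 12],    # rows: knowledge low/high (hypothetical counts)
                  [16, 20]])   # cols: GPs / specialists
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"chi-squared: chi2={chi2:.2f}, p={p:.3f}, dof={dof}")
```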

An Approach to Flatten the Gain of Fiber Raman Amplifiers with Multi-Pumping

The effects of the pump wavelengths and powers on the gain flattening of a fiber Raman amplifier (FRA) are investigated. A multi-wavelength pumping scheme is utilized to achieve gain flatness in the FRA, and we find that gain flatness improves as the number of pump wavelengths increases. We achieved flat gain with 0.27 dB fluctuation over the 1475-1600 nm spectral range for a Raman fiber length of 10 km by using six pumps with wavelengths within the 1385-1495 nm interval. The effect of the multi-wavelength pumping scheme on gain saturation in the FRA is also studied: gain saturation improves under this scheme, which is all the more useful for longer spans of Raman fiber.
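
A toy Python sketch of the idea follows; it replaces the coupled Raman propagation equations with simple Gaussian gain lobes per pump (an assumption made purely for illustration) and optimizes the pump powers to minimize ripple.

```python
# Toy sketch of multi-pump gain flattening: each pump contributes a Gaussian
# gain lobe roughly 100 nm above its wavelength; optimize powers for flatness.
import numpy as np
from scipy.optimize import minimize

signal = np.linspace(1475, 1600, 200)                     # signal band, nm
pumps = np.array([1385, 1405, 1425, 1450, 1470, 1495])    # six pumps, nm

def gain(powers):
    g = np.zeros_like(signal)
    for lam, p in zip(pumps, powers):
        g += p * np.exp(-((signal - (lam + 100.0)) / 35.0) ** 2)
    return g

def ripple(powers):
    g = gain(powers)
    return g.max() - g.min()      # peak-to-peak gain fluctuation

res = minimize(ripple, x0=np.full(6, 1.0),
               bounds=[(0.1, 5.0)] * 6, method="Powell")
print("optimized ripple (arbitrary units):", round(ripple(res.x), 4))
```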

The Preparation of Silicon and Aluminum Extracts from Tuncbilek and Orhaneli Fly Ashes by Alkali Fusion

Coal fly ash is a solid waste product formed from the combustion of coal in coal-fired power stations. Huge amounts of fly ash are produced globally every year, and these amounts are predicted to increase. Nowadays, less than half of the fly ash is used as a raw material for cement manufacturing and construction; the rest is disposed of as waste, causing yet another environmental concern. For this reason, recycling this kind of waste into useful materials is quite important in both economic and environmental terms. The purpose of this study is to evaluate the Orhaneli and Tuncbilek coal fly ashes for utilization in some industrial applications. The mineralogical and chemical compositions of these fly ashes were therefore analyzed by X-ray fluorescence spectroscopy, Fourier-transform infrared spectrometry, and X-ray diffraction. The silicon (Si) and aluminum (Al) in the fly ashes were activated by an alkali fusion technique with sodium hydroxide. The obtained extracts were analyzed for Si and Al content by inductively coupled plasma optical emission spectrometry.

Performance of the Aptima® HIV-1 Quant Dx Assay on the Panther System

The Aptima® HIV-1 Quant Dx Assay is a fully automated assay on the Panther system, based on transcription-mediated amplification and real-time detection technologies. The assay is intended for monitoring HIV-1 viral load in plasma specimens and for detecting HIV-1 in plasma and serum specimens. Nine hundred and seventy-nine specimens selected at random from routine testing at St Thomas' Hospital, London, were anonymised and used to compare the performance of the Aptima HIV-1 Quant Dx assay with that of the Roche COBAS® AmpliPrep/COBAS® TaqMan® HIV-1 Test, v2.0. Two hundred and thirty-four specimens gave quantitative HIV-1 viral load results in both assays. The quantitative results reported by the Aptima assay were comparable to those reported by the Roche COBAS AmpliPrep/COBAS TaqMan HIV-1 Test, v2.0, with a linear regression slope of 1.04 and an intercept of -0.097. The Aptima assay detected HIV-1 in more samples than the COBAS assay. This was not due to a lack of specificity of the Aptima assay, which showed 99.83% specificity when testing plasma specimens from 600 HIV-1-negative individuals. To understand the reason for this higher detection rate, low-level panels made from the HIV-1 3rd International Standard (NIBSC 10/152) and clinical samples of various subtypes were tested side by side in both assays. The Aptima assay was more sensitive than the COBAS assay. Its good sensitivity, specificity, and agreement with other commercial assays make the Aptima HIV-1 Quant Dx Assay appropriate for both viral load monitoring and detection of HIV-1 infection.
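
The core of such a method comparison is a regression of paired log10 viral loads; the sketch below simulates values around the reported slope and intercept, since the study data are not reproduced here.

```python
# Sketch: OLS regression of paired log10 viral loads, Aptima vs. COBAS,
# plus a Bland-Altman style mean bias. Values are simulated, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
cobas = rng.uniform(1.6, 7.0, 234)                        # log10 copies/mL
aptima = 1.04 * cobas - 0.097 + rng.normal(0, 0.15, 234)  # mimics reported fit

slope, intercept, r, p, se = stats.linregress(cobas, aptima)
bias = np.mean(aptima - cobas)
print(f"slope={slope:.3f}, intercept={intercept:.3f}, r={r:.3f}, bias={bias:.3f}")
```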

Economic Neoliberalism: Property Right and Redistribution Policy

In this paper, we analyze the relationship between the neoliberal concept of property rights and redistribution policy. This issue is back in the focus of interest due to the 2008 crisis, which reaffirmed the influence of the state on free-market processes. The interference of the state with property relations reopened a classical question: is it legitimate to redistribute one person's resources to another through taxation? The dominant view is that the neoliberal philosophy of natural rights is incompatible with redistributive measures. In principle, this view can be accepted. However, when we look into the details of the theories of natural rights proposed by some coryphaei of neoliberal philosophy, such as Hayek, Nozick, Buchanan, and Rothbard, we can see that the view is not so unequivocal.

Treatment of Chrome Tannery Wastewater by Biological Process - A Mini Review

Chrome tannery wastewater poses a serious environmental hazard due to its high pollution potential, and rigorous treatment is therefore necessary to abate pollution from this type of wastewater. Many studies have examined physical, chemical, and biological methods for treating chrome tannery wastewater. In general, biological treatment is found to be ineffective for direct application because of the adverse effects of toxic chromium, sulphide, chloride, etc.; biological methods have therefore been employed mainly for a few sub-processes that generate significant amounts of organic matter and are free of chromium, chlorides, etc. In this context, the present paper reviews the characteristic features and pollution potential of wastewater generated from chrome tannery units and its treatment. The different biological processes used to date and their chronological development for the treatment of chrome tannery wastewater are thoroughly reviewed. The scope of the hybrid bioreactor, an advanced technology option, is also explored, as this kind of treatment is well suited to wastewater containing inhibitory substances.

Numerical Analysis of Laminar Reflux Condensation from Gas-Vapour Mixtures in Vertical Parallel Plate Channels

Reflux condensation occurs in vertical channels and tubes when there is an upward core flow of vapour (or gas-vapour mixture) and a downward flow of the liquid film. Understanding this condensation configuration is crucial in the design of reflux condensers and distillation columns, and in loss-of-coolant safety analyses of nuclear power plant steam generators. The unique feature of this flow is that the upward flow of the vapour-gas mixture (or pure vapour) retards the liquid flow via shear at the liquid-mixture interface. The present model solves the full elliptic governing equations in both the film and the gas-vapour core flow. The computational mesh is non-orthogonal and adapts dynamically to the phase interface, thus producing a sharp and accurate interface. Shear forces and heat and mass transfer at the interface are accounted for fundamentally. This model is a significant step beyond current capabilities, removing the limitations of previous reflux condensation models, which inherently cannot account for the detailed local balances of shear, mass, and heat transfer at the interface. Discretisation is based on the finite volume method with a co-located variable storage scheme, and an in-house computer code was developed to implement the numerical solution scheme. Detailed results are presented for laminar reflux condensation from steam-air mixtures flowing in vertical parallel plate channels, including velocity and gas mass fraction profiles as well as axial variations of film thickness.
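
For orientation only, the one-dimensional Nusselt-type sketch below is far simpler than the full elliptic model described above; it shows how an opposing interfacial shear (assumed constant, with assumed water-film properties) enters a marching film-thickness calculation.

```python
# 1-D sketch: laminar condensate film with opposing interfacial shear tau_i,
# marched down the wall; brentq inverts the film-flow relation for thickness.
from scipy.optimize import brentq

rho, mu, k, hfg, g = 960.0, 2.8e-4, 0.68, 2.26e6, 9.81  # water film (assumed)
dT, tau_i = 10.0, 2.0          # wall subcooling (K), interfacial shear (Pa)

def film_flow(delta):
    # mass flow per unit width for a laminar profile with interfacial shear
    return rho * (rho * g * delta**3 / (3 * mu) - tau_i * delta**2 / (2 * mu))

G, delta, dx = 0.0, 1e-5, 1e-3
for _ in range(500):                          # march 0.5 m down the plate
    G += dx * k * dT / (delta * hfg)          # condensed mass joins the film
    delta = brentq(lambda d: film_flow(d) - G, 1e-7, 5e-3)
print(f"film thickness after 0.5 m: {delta * 1e3:.3f} mm")
```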

New Approach for Minimizing Wavelength Fragmentation in Wavelength-Routed WDM Networks

Wavelength Division Multiplexing (WDM) is the dominant transport technology in numerous high-capacity backbone networks based on optical infrastructures. Given the importance of the costs (CapEx and OpEx) associated with these networks, resource management is becoming increasingly important, especially how the optical circuits, called "lightpaths", are routed throughout the network. This requires efficient algorithms that provide routing strategies at the lowest cost. We focus on the lightpath routing and wavelength assignment problem, known as the RWA problem, while optimizing wavelength fragmentation over the network. Wavelength fragmentation poses a serious challenge for network operators, since it leads to misuse of the wavelength spectrum and consequently to the rejection of new lightpath requests. In this paper, we first establish a new Integer Linear Program (ILP) for the problem based on a node-link formulation. This formulation follows a multilayer approach in which the original network is decomposed into several network layers, each corresponding to a wavelength. Furthermore, we propose an efficient heuristic for the problem based on a greedy algorithm followed by a post-treatment procedure. The results obtained show that the optimal solution is often reached. We also compare our results with those of other RWA heuristic methods.
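
A compact PuLP sketch of a node-link, multilayer RWA ILP is given below on a toy four-node ring; the objective shown simply minimizes the total number of wavelength-links used, a stand-in for the paper's fragmentation-aware objective.

```python
# Multilayer node-link RWA ILP sketch (PuLP + CBC): one layer per wavelength.
import pulp

nodes = ["A", "B", "C", "D"]
links = [("A", "B"), ("B", "C"), ("C", "D"), ("A", "D")]   # fibres (ring)
arcs = links + [(v, u) for (u, v) in links]                # directed arcs
W = [0, 1]                                                 # wavelength layers
demands = [("A", "C"), ("B", "D"), ("A", "D")]
D, A = range(len(demands)), range(len(arcs))

prob = pulp.LpProblem("rwa_multilayer", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (D, W, A), cat="Binary")    # arc use per demand/layer
y = pulp.LpVariable.dicts("y", (D, W), cat="Binary")       # layer chosen per demand

for d, (s, t) in enumerate(demands):
    prob += pulp.lpSum(y[d][w] for w in W) == 1            # one wavelength each
    for w in W:
        for n in nodes:                                    # flow conservation
            out_ = pulp.lpSum(x[d][w][a] for a in A if arcs[a][0] == n)
            in_ = pulp.lpSum(x[d][w][a] for a in A if arcs[a][1] == n)
            prob += out_ - in_ == (y[d][w] if n == s else
                                   -y[d][w] if n == t else 0)

for w in W:                                                # wavelength clash:
    for i in range(len(links)):                            # one lightpath per
        prob += pulp.lpSum(x[d][w][i] + x[d][w][i + len(links)]
                           for d in D) <= 1                # fibre per wavelength

prob += pulp.lpSum(x[d][w][a] for d in D for w in W for a in A)
prob.solve(pulp.PULP_CBC_CMD(msg=False))
for d, (s, t) in enumerate(demands):
    print(f"{s}->{t}: wavelength", next(w for w in W if y[d][w].value() == 1))
```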

Disturbance Observer for Lateral Trajectory Tracking Control for Autonomous and Cooperative Driving

In this contribution, a structure for high-level lateral vehicle tracking control based on a disturbance observer is presented. The structure compensates side-force disturbances in steady state while guaranteeing cooperative behavior at the same time: driver inputs are not compensated by the disturbance observer. Moreover, the structure is especially useful as it robustly stabilizes the vehicle; the parameters are therefore selected using the Parameter Space Approach. The implemented algorithms are tested in real-world scenarios.
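
A minimal discrete-time sketch of the disturbance-observer idea is shown below on a first-order lateral-dynamics stand-in (all parameters assumed): the residual between the plant and a nominal model is attributed to a side-force disturbance, low-pass filtered, and cancelled, while any driver input added to u would pass through uncompensated.

```python
# Disturbance observer sketch on v' = a*v + b*(u + d): estimate d from the
# model-inversion residual, low-pass filter it (Q filter), and cancel it.
dt, a, b, tau = 0.01, -2.0, 1.0, 0.05   # nominal model and filter constants
v, d_hat = 0.0, 0.0
for k in range(1000):
    t = k * dt
    d = 0.5 if t > 2.0 else 0.0         # step side-force disturbance at t = 2 s
    u = -d_hat                          # cancel only the estimated disturbance
    v_prev = v
    v += dt * (a * v + b * (u + d))     # "true" plant (Euler step)
    # invert the nominal model for a raw estimate, then low-pass (Q) filter
    d_raw = ((v - v_prev) / dt - a * v_prev) / b - u
    d_hat += (dt / tau) * (d_raw - d_hat)
print(f"estimated disturbance after 10 s: {d_hat:.3f} (true value 0.5)")
```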

Static Priority Approach to Under-Frequency Based Load Shedding Scheme in Islanded Industrial Networks: Using the Case Study of Fatima Fertilizer Company Ltd - FFL

In this paper, a static under-frequency load shedding scheme is considered for chemical and petrochemical industries with islanded distribution networks that rely heavily on the primary commodity, so as to ensure minimum production loss, plant downtime, or critical equipment shutdown. A simple methodology is proposed for in-house implementation of this scheme using under-frequency relays, and a step-by-step guide is provided, including techniques to calculate maximum percentage overloads, frequency decay rates, the time-based frequency response, and the frequency-based time response of the system. A case study of the FFL electrical system is used, presenting the actual system parameters and the load shedding settings employed, following the same series of steps. The settings are then verified for the worst overload condition (loss of a generation source in this case), and the comprehensive system response is investigated.
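
As a worked example of the kind of calculation the guide covers, the sketch below applies the standard swing-equation approximation df/dt = -ΔP·f_n/(2H); the inertia constant and loading figures are assumed, not FFL's actual parameters.

```python
# Back-of-envelope: overload after loss of a generator, initial frequency
# decay rate, and time to reach an under-frequency relay pickup.
import math

f_n = 50.0               # nominal frequency, Hz
H = 4.0                  # aggregate inertia constant, s (assumed)
load, gen = 60.0, 48.0   # MW demand vs. remaining generation (assumed)

dP = (load - gen) / gen              # per-unit overload on remaining units
dfdt0 = -dP * f_n / (2 * H)          # initial df/dt from the swing equation
print(f"overload = {dP:.1%}, initial df/dt = {dfdt0:.2f} Hz/s")

# holding dP constant gives f(t) = f_n * exp(-dP*t / (2H)); time to reach
# a 48.5 Hz pickup frequency:
t_pickup = -2 * H / dP * math.log(48.5 / f_n)
print(f"time to reach 48.5 Hz: {t_pickup:.2f} s")
```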

State of Freelancing in IT and Future Trends

Freelancing in IT has grown in popularity in recent years, mainly because of rapid Internet adoption in countries with emerging economies, together with the continuous search for reduced development costs and the rise of online platforms that address planning, coordination, and various development tasks. This paper gives an overview of the most relevant freelance marketplaces available and studies the market structure, the distribution of the workforce, and trends in IT freelancing.

Hydrothermal Treatment for Production of Aqueous Co-Product and Efficient Oil Extraction from Microalgae

Hydrothermal liquefaction (HTL) is a technique for obtaining clean biofuel from biomass under heat and pressure in an aqueous medium, which decomposes the biomass into various products. Operating conditions play an essential role in the yield and quality of bio-oil and the other products, and the effects of these parameters on product composition and yield were investigated here. Chlorellaceae microalgae were tested under different HTL conditions to identify suitable conditions for extracting bio-oil together with value-added co-products. First, different microalgae loading rates (5-30%) were tested; this parameter was found to have little significant effect on product yield, so a 10% loading rate was selected as an economical choice, with conditions set at 250°C and a 30 min reaction time. Next, a range of temperatures (210-290°C) was applied, keeping the reaction time constant at 30 min, to verify the effect of this parameter. The results showed no simple trend with increasing reaction temperature, and some reactions occurred that led to different product yields. Moreover, some nutrients found in the aqueous product could potentially be utilized for nutrient recovery.

Analysis of a Lignocellulose Degrading Microbial Consortium to Enhance the Anaerobic Digestion of Rice Straws

Rice straw is a lignocellulosic biomass that can be utilized as a substrate for biogas production. However, because of its structure and composition, rice straw is difficult to degrade with hydrolytic enzymes. One pretreatment method that modifies these properties of lignocellulosic biomass is the application of lignocellulose-degrading microbial consortia. The aim of this study is to investigate the effect of such consortia on enhancing biogas production. To select the most efficient consortium, cellulase enzymes were extracted and their activities analyzed; the results suggested that the microbial consortium obtained from cattle manure was the best candidate, compared with those from decomposed wood and horse manure. The consortium isolated from cattle manure was then mixed with anaerobic sludge and used as the inoculum for biogas production. The optimal conditions for biogas production were investigated using response surface methodology (RSM). The tested parameters were the ratio of isolated microbial consortium to anaerobic sludge (MI:AS), the substrate-to-inoculum ratio (S:I), and temperature. The fitted model had a coefficient of determination of R2 = 0.7661, high enough to support the model's significance. The highest cumulative biogas yield was 104.6 ml/g rice straw at the optimum MI:AS ratio, S:I ratio, and temperature of 2.5:1, 15:1, and 44°C, respectively.
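
The RSM step can be sketched as fitting a second-order response surface and locating its maximum; the Python sketch below uses placeholder design points, not the study's measurements.

```python
# Sketch: quadratic response surface for biogas yield over (MI:AS, S:I, T),
# then numerical search for the predicted optimum within the design bounds.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from scipy.optimize import minimize

# columns: MI:AS ratio, S:I ratio, temperature (°C); y: yield (ml/g straw)
X = np.array([[1.0, 10, 35], [2.5, 15, 44], [4.0, 20, 55], [1.0, 20, 55],
              [4.0, 10, 35], [2.5, 15, 35], [2.5, 10, 55], [4.0, 15, 44],
              [1.0, 15, 44], [2.5, 20, 44]])
y = np.array([70, 104, 80, 68, 75, 88, 83, 90, 85, 95])   # placeholder data

poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)
print("R^2 =", round(model.score(poly.fit_transform(X), y), 4))

res = minimize(lambda z: -model.predict(poly.transform(z.reshape(1, -1)))[0],
               x0=[2.5, 15, 44], bounds=[(1, 4), (10, 20), (35, 55)])
print("predicted optimum (MI:AS, S:I, T):", res.x.round(2))
```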

A High Level Implementation of a High Performance Data Transfer Interface for NoC

The distribution of a single global clock across a chip has become the major design bottleneck for high-performance VLSI systems, owing to power dissipation, process variability, and multicycle cross-chip signaling. A Network-on-Chip (NoC) architecture partitioned into several synchronous blocks has become a promising approach for attaining fine-grain power management at the system level. In a NoC architecture, the communication between blocks is handled asynchronously. To interface blocks operating at different frequencies on a chip, an asynchronous FIFO interface is essential, although such FIFOs are not required if adjacent blocks belong to the same clock domain. In this paper, we have designed and analyzed a 16-bit asynchronous micropipelined FIFO of depth four, with awareness of place and route on an FPGA device. Using a commercially available Spartan 3 device, we designed a high-speed implementation of the asynchronous 4-phase micropipeline. The asynchronous FIFO implemented on the FPGA device shows 76 Mb/s throughput, with a handshake cycle of 109 ns for write and 101.3 ns for read, in simulation under worst-case operating conditions (voltage = 0.95 V) on a working chip at room temperature.
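
These figures are mutually consistent if one 16-bit word moves through the FIFO per write-plus-read handshake pair, as the quick check below shows (the pairing assumption is ours):

```python
# Quick consistency check: 16 bits per (write + read) handshake cycle.
write_ns, read_ns, width = 109.0, 101.3, 16
throughput_mbps = width / (write_ns + read_ns) * 1e3   # bits/ns -> Mb/s
print(f"{throughput_mbps:.1f} Mb/s")                   # ~76 Mb/s, as reported
```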

Data-driven Multiscale Tsallis Complexity: Application to EEG Analysis

This work proposes data-driven, multiscale quantitative measures to reveal the underlying complexity of the electroencephalogram (EEG), applied to a rodent model of hypoxic-ischemic brain injury and recovery. Because real EEG recordings are nonlinear and non-stationary over different frequencies or scales, an approach more suitable than conventional single-scale tools is needed for analyzing EEG data. Here, we present a new framework of complexity measures that considers changing dynamics over multiple oscillatory scales. The proposed multiscale complexity is obtained by calculating entropies of the probability distributions of the intrinsic mode functions extracted by empirical mode decomposition (EMD) of the EEG. To quantify EEG recordings from a rat model of hypoxic-ischemic brain injury following cardiac arrest, the multiscale version of Tsallis entropy is examined. To validate the proposed complexity measure, actual EEG recordings from rats (n=9) experiencing 7 min of cardiac arrest followed by resuscitation were analyzed. Experimental results demonstrate that the multiscale Tsallis entropy leads to better discrimination of injury levels and improved correlations with the neurological deficit evaluation 72 hours after cardiac arrest, suggesting an effective metric for use as a prognostic tool.
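
A minimal sketch of the proposed pipeline is given below: EMD splits the signal into intrinsic mode functions, and a Tsallis entropy is computed per IMF from an amplitude histogram. The test signal, q value, and binning are illustrative choices, not the paper's exact settings.

```python
# Sketch: EMD-based multiscale Tsallis entropy (requires `pip install EMD-signal`).
import numpy as np
from PyEMD import EMD

def tsallis_entropy(x, q=2.0, bins=64):
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return (1.0 - np.sum(p ** q)) / (q - 1.0)   # -> Shannon entropy as q -> 1

fs = 250.0                                      # sampling rate, Hz
t = np.arange(0, 8, 1 / fs)
rng = np.random.default_rng(0)
sig = (np.sin(2 * np.pi * 4 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
       + 0.3 * rng.standard_normal(t.size))     # toy two-rhythm "EEG"

imfs = EMD().emd(sig)                           # one IMF per oscillatory scale
for k, imf in enumerate(imfs):
    print(f"IMF {k + 1}: Tsallis entropy (q=2) = {tsallis_entropy(imf):.3f}")
```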