Development of an Immunoassay Platform for Diagnosis of Acute Kidney Injury

Acute kidney injury (AKI) is a growing worldwide public health problem. Diagnosing this condition using creatinine remains problematic in clinical practice. Therefore, the measurement of biomarkers associated with AKI has received much attention in the past few years. The cytokine interleukin-18 (IL-18) has been reported as one of the early biomarkers for AKI. The most commonly used method to detect this biomarker is an immunoassay. This study used a planar platform to perform an immunoassay with fluorescence detection. Anti-IL-18 antibody was immobilized onto a microscope slide using a covalent binding method. Make-up samples were diluted to concentrations between 10 and 1000 pg/ml to create a calibration curve. The precision of the system was determined using the coefficient of variation (CV), which was found to be less than 10%. The performance of this immunoassay system was compared with measurements from ELISA.
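
As an illustration of the precision check described above, the following is a minimal sketch (not the study's actual processing pipeline) that computes the coefficient of variation for replicate fluorescence readings and fits a simple log-log calibration curve over the 10-1000 pg/ml range; all signal values are hypothetical.

```python
import numpy as np

# Hypothetical replicate fluorescence readings (arbitrary units) for
# make-up IL-18 standards between 10 and 1000 pg/ml.
standards_pg_ml = np.array([10, 50, 100, 500, 1000], dtype=float)
fluorescence = np.array([
    [120, 125, 118],
    [410, 395, 420],
    [800, 815, 790],
    [3300, 3250, 3400],
    [6100, 6250, 6050],
], dtype=float)

# Intra-assay precision: coefficient of variation per standard (target < 10%).
cv_percent = fluorescence.std(axis=1, ddof=1) / fluorescence.mean(axis=1) * 100
print("CV (%):", np.round(cv_percent, 2))

# Simple log-log linear calibration curve (one common choice of fit).
slope, intercept = np.polyfit(np.log10(standards_pg_ml),
                              np.log10(fluorescence.mean(axis=1)), 1)

def estimate_concentration(signal):
    """Invert the calibration curve to estimate IL-18 concentration (pg/ml)."""
    return 10 ** ((np.log10(signal) - intercept) / slope)

print("Estimated concentration for a signal of 1000:",
      round(estimate_concentration(1000.0), 1), "pg/ml")
```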

Application of Adaptive Neuro-Fuzzy Inference System in the Prediction of Economic Crisis Periods in the USA

In this paper, the discrete choice models Logit and Probit are examined in order to predict recession and expansion periods in the USA. Additionally, we propose an adaptive neuro-fuzzy inference system with triangular membership functions. We examine the in-sample period 1947-2005 and test the models in the out-of-sample period 2006-2009. The forecasting results indicate that the Adaptive Neuro-Fuzzy Inference System (ANFIS) model significantly outperforms the Logit and Probit models in the out-of-sample period. This indicates that the neuro-fuzzy model provides a better and more reliable signal of whether or not a financial crisis will take place.
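
For the discrete-choice baseline, a minimal sketch of the Logit and Probit recession models is given below using statsmodels; the recession indicator and the single predictor (a term-spread-like series) are simulated placeholders rather than the 1947-2009 data used in the paper, and the ANFIS component is not shown.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical quarterly data: a binary recession indicator and one macro
# predictor; real inputs would be the 1947-2005 in-sample and 2006-2009
# out-of-sample series.
rng = np.random.default_rng(0)
spread = rng.normal(1.0, 1.5, 236)
recession = (spread + rng.normal(0, 1.0, 236) < -0.5).astype(int)

X = sm.add_constant(spread)
logit_fit = sm.Logit(recession, X).fit(disp=False)
probit_fit = sm.Probit(recession, X).fit(disp=False)

# Predicted probability of recession, thresholded at 0.5 for a hit rate.
p_logit = logit_fit.predict(X)
p_probit = probit_fit.predict(X)
print("Logit hit rate:", ((p_logit > 0.5) == recession).mean())
print("Probit hit rate:", ((p_probit > 0.5) == recession).mean())
```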

Enhancements in Blended e-Learning Management System

A learning management system (commonly abbreviated as LMS) is a software application for the administration, documentation, tracking, and reporting of training programs, classroom and online events, e-learning programs, and training content (Ellis 2009). Hall (2003) defines an LMS as "software that automates the administration of training events. All Learning Management Systems manage the log-in of registered users, manage course catalogs, record data from learners, and provide reports to management". Evidence of the worldwide spread of e-learning in recent years is easy to obtain. In April 2003, no fewer than 66,000 fully online courses and 1,200 complete online programs were listed on the TeleCampus portal from TeleEducation (Paulsen 2003). According to the report "The US Market for Self-paced eLearning Products and Services: 2010-2015 Forecast and Analysis", the number of students taking classes exclusively online will be nearly equal (1% less) to the number taking classes exclusively on physical campuses, and the number of students taking online courses in the USA will increase from 1.37 million in 2010 to 3.86 million in 2015. In another report, by the Sloan Consortium, three-quarters of institutions report that the economic downturn has increased demand for online courses and programs.

Modeling “Web of Trust” with Web 2.0

“Web of Trust” is one of the recognized goals for Web 2.0. It aims to make it possible for people, including organizations, businesses and individual users, to take responsibility for what they publish on the web. These objectives, among others, drive most of the technologies and protocols recently standardized by the governing bodies. One of the great advantages of the Web infrastructure is decentralization of publication. The primary motivation behind Web 2.0 is to help people add content for Collective Intelligence (CI) while providing mechanisms to link content with people for evaluation and accountability of information. Such a content structure will interconnect users and content so that users can use content to find participants and vice versa. This paper proposes a conceptual information storage and linking model, based on a decentralized information structure, that links content and people together. The model uses FOAF, Atom, RDF and RDFS and can be used as a blueprint to develop Web 2.0 applications for any e-domain; however, the primary target of this paper is the online trust evaluation domain. The proposed model aims to help individuals establish a “Web of Trust” online.
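
A minimal sketch of the linking idea, assuming rdflib is available: a FOAF person is tied to a piece of content so the content can be traced back to an accountable author. The ex: namespace and the publishedBy property are hypothetical illustrations, not part of the proposed model's actual vocabulary.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import FOAF, RDF

# Hypothetical namespace and resource URIs, used only for illustration.
EX = Namespace("http://example.org/")
g = Graph()
g.bind("foaf", FOAF)
g.bind("ex", EX)

alice = URIRef("http://example.org/people/alice")
post = URIRef("http://example.org/content/post-1")

# Link a person to the content she published, so the content can be traced
# back to an accountable author (the core of the linking model).
g.add((alice, RDF.type, FOAF.Person))
g.add((alice, FOAF.name, Literal("Alice")))
g.add((post, EX.publishedBy, alice))   # hypothetical property
g.add((alice, FOAF.made, post))        # FOAF's own authorship link

print(g.serialize(format="turtle"))
```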

Bi-Criteria Latency Optimization of Intra- and Inter-Autonomous System Traffic Engineering

Traffic Engineering (TE) is the process of controlling how traffic flows through a network in order to facilitate efficient and reliable network operations while simultaneously optimizing network resource utilization and traffic performance. TE improves the management of data traffic within a network and provides better utilization of network resources. Many research works consider intra-AS and inter-AS traffic engineering separately, but in reality each influences the other; hence the network performance of both intra- and inter-Autonomous System (AS) traffic is not optimized properly. To achieve better joint optimization of intra- and inter-AS TE, we propose a joint optimization technique that considers intra-AS features during inter-AS TE and vice versa. This work considers an important criterion, namely latency, both within an AS and between ASes, and proposes a bi-criteria latency optimization model. Overall network performance can therefore be improved, in terms of latency, by applying this joint optimization technique.
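
One simple way to read the bi-criteria idea is as a scalarization of two latency metrics into a single path cost. The sketch below is an illustration rather than the paper's model: it uses networkx with a made-up two-AS topology, where each edge carries an intra-AS and an inter-AS latency value.

```python
import networkx as nx

# Hypothetical topology: nodes a-d sit inside one AS, e-f in another.
# Each edge carries two latency criteria (intra-AS and inter-AS delay, in ms).
G = nx.Graph()
edges = [
    ("a", "b", 2, 0), ("b", "c", 3, 0), ("a", "c", 7, 0),
    ("c", "e", 0, 10), ("b", "f", 0, 18), ("e", "f", 0, 4),
]
for u, v, intra, inter in edges:
    G.add_edge(u, v, intra=intra, inter=inter)

def bicriteria_cost(alpha):
    """Weighted-sum scalarization of the two latency criteria."""
    return lambda u, v, d: alpha * d["intra"] + (1 - alpha) * d["inter"]

# A joint view (alpha = 0.5) can pick a different path than either
# criterion optimized in isolation.
for alpha in (1.0, 0.0, 0.5):
    path = nx.shortest_path(G, "a", "f", weight=bicriteria_cost(alpha))
    print(f"alpha={alpha}: {path}")
```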

Fully Parameterizable FPGA-based Crypto-Accelerator

In this paper, the RSA encryption algorithm and its hardware implementation in Xilinx's Virtex Field Programmable Gate Arrays (FPGA) are analyzed. The issues of scalability, flexible performance, and silicon efficiency for the hardware acceleration of public key cryptosystems are explored in the present work. Using techniques based on interleaved arithmetic for exponentiation, the proposed RSA calculation architecture is compared to existing FPGA-based solutions for speed, FPGA utilization, and scalability. The paper covers the RSA encryption algorithm, interleaved multiplication, the Miller-Rabin algorithm for primality testing, the extended Euclidean algorithm, basic FPGA technology, and the implementation details of the proposed RSA calculation architecture. The performance of several alternative hardware architectures is discussed and compared. Finally, conclusions are drawn, highlighting the advantages of a fully flexible and parameterized design.
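
A minimal software model of the interleaved (bit-serial) modular multiplication that such architectures typically map onto FPGA logic, used inside square-and-multiply exponentiation, is sketched below with toy RSA parameters; it illustrates the arithmetic only, not the proposed hardware design.

```python
def interleaved_mod_mul(a, b, m, width):
    """Bit-serial interleaved modular multiplication: returns a*b mod m.

    Processes b from the most significant bit down, doubling and
    conditionally adding a, reducing after every step (at most two
    subtractions of m are ever needed)."""
    p = 0
    for i in reversed(range(width)):
        p = (p << 1) + (a if (b >> i) & 1 else 0)
        if p >= m:
            p -= m
        if p >= m:
            p -= m
    return p

def mod_exp(base, exp, m, width):
    """Left-to-right square-and-multiply built on the interleaved multiplier."""
    result = 1
    for i in reversed(range(width)):
        result = interleaved_mod_mul(result, result, m, width)
        if (exp >> i) & 1:
            result = interleaved_mod_mul(result, base, m, width)
    return result

# Toy RSA check with small primes (real keys would be 1024+ bits).
p, q, e = 61, 53, 17
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)                      # modular inverse (extended Euclid)
msg = 42
cipher = mod_exp(msg, e, n, n.bit_length())
assert mod_exp(cipher, d, n, n.bit_length()) == msg
print("ciphertext:", cipher)
```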

A Hybrid Approach Using Particle Swarm Optimization and Simulated Annealing for N-queen Problem

This paper presents a hybrid approach for solving the n-queen problem by combining PSO and SA. PSO is a population-based heuristic method that sometimes becomes trapped in local optima. To address this problem we can use SA. Although SA requires many iterations and long convergence times for some problems, with good adjustment of the initial parameters, such as the temperature and the length of the temperature stages, SA guarantees convergence. In this article we use discrete PSO (due to the nature of the n-queen problem) to reach a good local optimum, and then use SA to escape from it. The experimental results show that our hybrid method converges to a result faster than the SA method, especially for high-dimensional n-queen problems.
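
A minimal sketch of the SA refinement stage on a permutation encoding of the board (index = column, value = row) is shown below; the discrete PSO stage is omitted, and the cooling parameters are illustrative defaults rather than the tuned values used in the paper.

```python
import math
import random

def conflicts(board):
    """Number of attacking queen pairs; the permutation encoding rules out
    row/column clashes, so only diagonals need checking."""
    n = len(board)
    return sum(
        1
        for i in range(n)
        for j in range(i + 1, n)
        if abs(board[i] - board[j]) == j - i
    )

def simulated_annealing(n, t0=10.0, cooling=0.95, stage_len=200):
    board = list(range(n))
    random.shuffle(board)
    cost = conflicts(board)
    t = t0
    while t > 1e-3 and cost > 0:
        for _ in range(stage_len):            # length of a temperature stage
            i, j = random.sample(range(n), 2)
            board[i], board[j] = board[j], board[i]
            new_cost = conflicts(board)
            if new_cost <= cost or random.random() < math.exp((cost - new_cost) / t):
                cost = new_cost                               # accept the swap
            else:
                board[i], board[j] = board[j], board[i]       # undo the swap
        t *= cooling                           # geometric cooling schedule
    return board, cost

solution, remaining_conflicts = simulated_annealing(20)
print(remaining_conflicts, solution)
```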

Ethics in the Technology Driven Enterprise

Innovations in technology have created new ethical challenges. Essential use of electronic communication in the workplace has escalated at an astronomical rate over the past decade. As such, legal and ethical dilemmas confronted by both the employer and the employee concerning managerial control and ownership of e-information have increased dramatically in the USA. From the employer's perspective, ownership and control of all information created for the workplace is an undeniable source of economic advantage and must be monitored zealously. From the perspective of the employee, individual rights, such as privacy, freedom of speech, and freedom from unreasonable search and seizure, continue to be stalwart legal guarantees that employers are not legally or ethically entitled to abridge in the workplace. These issues have been the source of great debate and the catalyst for legal reform. The fine line between the ethical and the legal has been complicated by emerging technologies. This manuscript identifies and discusses a number of specific legal and ethical issues raised by the dynamic electronic workplace and concludes with suggestions that employers should follow to respect the delicate balance between employees' legal right to privacy and the employer's right to protect its knowledge systems and infrastructure.

Migration and Unemployment Duration: The Case of the OECD Countries

This paper examines, from a macroeconomic perspective, whether or not immigration has a positive influence on the duration of unemployment. We also analyse whether the degree of labour market integration can influence migration. The integration of immigrants into the labour market is a recurrent theme in work on the economic consequences of immigration. However, to our knowledge, no researchers have studied the impact of immigration on unemployment duration, and vice versa. Using two research methodologies (panel estimations (OLS and 2SLS) and panel cointegration techniques), we show that migration seems to influence short-term unemployment positively and long-term unemployment negatively for 14 OECD destination countries. In addition, immigration seems to be conditioned by the structural and institutional characteristics of the labour market.

Emission of Volatile Organic Compounds from the Residential Combustion of Pyrenean Oak and Black Poplar

Smoke from domestic wood burning has been identified as a major contributor to air pollution, motivating detailed emission measurements under controlled conditions. A series of experiments was performed to characterise the emissions from the combustion, in a fireplace and in a woodstove, of two common species of trees grown in Spain: Pyrenean oak (Quercus pyrenaica) and black poplar (Populus nigra). Volatile organic compounds (VOCs) in the exhaust emissions were collected in Tedlar bags, re-sampled in sorbent tubes and analysed by thermal desorption-gas chromatography-flame ionisation detection. Pyrenean oak presented substantially higher emissions in the woodstove than in the fireplace for the majority of compounds; the opposite was observed for poplar. Among the 45 identified species, benzene and benzene-related compounds represent the most abundant group, followed by oxygenated VOCs and aliphatics. The emission factors obtained in this study are generally of the same order as those reported for residential experiments in the USA.

Effect of Scanning Speed on Material Efficiency of Laser Metal Deposited Ti6Al4V

Studying the effect of laser scanning speed on material efficiency in Ti6Al4V applications is very important because unspent powder is not reusable owing to high-temperature oxygen pick-up and contamination. An extensive study of the effect of scanning speed on material efficiency was carried out by varying the speed between 0.01 and 0.1 m/s. The samples were wire-brushed and cleaned with acetone after each deposition to remove un-melted particles from the surface of the deposit, and the substrate was weighed before and after deposition. A formula was developed to calculate the material efficiency, and the scanning speed was compared with the powder efficiency obtained. The results are presented and discussed. The study revealed that an optimum scanning speed exists for this study at 0.01 m/s, above and below which the powder efficiency drops.
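
A plausible form of such a calculation, assuming material (powder) efficiency is defined as deposited mass divided by the powder delivered during the deposition time, is sketched below; this is an illustrative assumption rather than the exact formula developed in the study, and all numbers are hypothetical.

```python
def material_efficiency(mass_before_g, mass_after_g,
                        powder_flow_g_per_min, deposition_time_min):
    """Deposited mass as a percentage of the powder delivered during deposition.

    One plausible definition (assumption), not necessarily the study's formula."""
    deposited = mass_after_g - mass_before_g
    delivered = powder_flow_g_per_min * deposition_time_min
    return 100.0 * deposited / delivered

# Hypothetical numbers: substrate weighed before/after a 2-minute deposition.
print(round(material_efficiency(250.00, 254.10, 2.88, 2.0), 1), "%")
```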

Evaluation of Natural Drainage Flow Pattern, Necessary for Flood Control, Using Digitized Topographic Information: A Case Study of Bayelsa State, Nigeria

The need to evaluate and understand the natural drainage pattern in a flood-prone, fast-developing environment is of paramount importance. This information will go a long way towards helping town planners determine the drainage pattern, road networks and areas where prominent structures should be located. This research work was carried out with the aim of studying the landscape topography of Bayelsa using digitized topographic information, and of modeling the natural drainage flow pattern to aid the understanding and construction of workable drainages. To achieve this, digitized elevation and coordinate points were extracted from a global imagery map, and the extracted information was modeled into 3D surfaces. The results revealed that the average elevation for Bayelsa State is 12 m above sea level; the highest elevation is 28 m and the lowest 0 m, along the coastline. In Yenagoa, the capital city of Bayelsa, where a detailed survey was carried out, the average elevation is 15 m, the highest 25 m and the lowest 3 m above mean sea level. The regional elevation in Bayelsa shows a gradual decrease from the north-eastern zone to the south-western zone. Yenagoa shows an elevation lineament in which a low depression is flanked by high elevations running from the north-east to the south-west. Hence, future drainages in Yenagoa should be directed from the high elevations, from the south-east toward the north-west and from the north-west toward the south-east, to the point of convergence at the center, which flows from the south-east toward the north-west. When Bayelsa is considered on a regional scale, the flow pattern is from the north-east to the south-west, and also from north to south. It is therefore recommended that any large drainage constructed at the municipal scale be directed from north-east to south-west or from north to south, and secondly that a detailed survey be carried out to ascertain the local topography and drainage pattern before the design and construction of any drainage system in any part of Bayelsa.
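
As a rough illustration of how flow direction can be derived from digitized elevations, the sketch below applies a simple D8-style steepest-descent rule to a small hypothetical grid; it is not the modeling workflow used in the study, and the elevation values are made up.

```python
import numpy as np

# Hypothetical 4x4 elevation grid (m above sea level) digitized from imagery.
dem = np.array([
    [25, 24, 22, 20],
    [23, 21, 18, 15],
    [20, 17, 12,  9],
    [15, 11,  6,  3],
], dtype=float)

# D8-style flow direction: each cell drains toward its lowest of 8 neighbours.
offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
rows, cols = dem.shape
flow = {}
for r in range(rows):
    for c in range(cols):
        candidates = [
            (dem[r + dr, c + dc], (r + dr, c + dc))
            for dr, dc in offsets
            if 0 <= r + dr < rows and 0 <= c + dc < cols
        ]
        lowest, target = min(candidates)
        flow[(r, c)] = target if lowest < dem[r, c] else None   # None = sink/outlet

print(flow[(0, 0)])   # the north-western cell drains toward the south-east
```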

Solar Radiation Studies for Dubai and Sharjah, UAE

Global solar radiation (H) for Dubai and Sharjah (latitude 25.25°N, longitude 55°E and latitude 25.29°N, longitude 55°E, respectively) has been studied using sunshine-hour data (n) for the areas with various methods. The calculated global solar radiation values are then compared to the measured values presented by NASA. Furthermore, the extraterrestrial (H0), diffuse (Hd) and beam (Hb) radiation are also calculated. The diffuse radiation is calculated using the methods proposed by Page and by Liu and Jordan (L-J); diffuse radiation from the Page method is higher than from the L-J method. Moreover, the clearness index (KT) signifies a clear sky almost all year round. Rainy days are few in a year and limited to the months December to March. The temperature remains between 25°C in winter and 44°C in summer, which is desirable for thermal applications of solar energy. From the estimated results, it appears that solar radiation can be utilized very efficiently throughout the year for photovoltaic and thermal applications.
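
A minimal sketch of a sunshine-hour estimate of global radiation and the clearness index, using the generic Angstrom-Prescott form H = H0(a + b·n/N), is given below; the coefficients a and b and the monthly inputs are illustrative assumptions, not the values fitted for Dubai and Sharjah.

```python
def global_radiation(h0, n_hours, day_length, a=0.25, b=0.50):
    """Angstrom-Prescott estimate: H = H0 * (a + b * n/N).

    a and b are generic coefficients (assumed), normally fitted per site."""
    return h0 * (a + b * n_hours / day_length)

# Hypothetical monthly means for a Gulf-coast site (MJ/m^2/day and hours).
H0, n, N = 37.0, 10.5, 12.8
H = global_radiation(H0, n, N)
KT = H / H0                       # clearness index
print(f"H = {H:.1f} MJ/m2/day, KT = {KT:.2f}")
```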

Proposed Developments of Elliptic Curve Digital Signature Algorithm

The Elliptic Curve Digital Signature Algorithm (ECDSA) is the elliptic curve analogue of DSA; it is a digital signature scheme designed to provide a digital signature based on a secret number known only to the signer and on the actual message being signed. These digital signatures are considered the digital counterparts of handwritten signatures and are the basis for validating the authenticity of a connection. The security of these schemes results from the infeasibility of computing the signature without the private key. In this paper we introduce a proposed development of the original ECDSA with greater complexity.
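
For reference, a textbook ECDSA sign/verify over a toy curve is sketched below; it shows only the baseline scheme (with insecure toy parameters), not the proposed development.

```python
import hashlib
import secrets

# Toy curve y^2 = x^3 + 2x + 2 over F_17 with base point G of prime order 19
# (textbook parameters; real deployments use curves such as P-256).
P_MOD, A, B = 17, 2, 2
G = (5, 1)
N_ORDER = 19

def point_add(p1, p2):
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                               # point at infinity
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return x3, (lam * (x1 - x3) - y1) % P_MOD

def scalar_mult(k, point):
    result = None
    while k:
        if k & 1:
            result = point_add(result, point)
        point = point_add(point, point)
        k >>= 1
    return result

def hash_msg(msg):
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % N_ORDER

def sign(msg, priv):
    while True:
        k = secrets.randbelow(N_ORDER - 1) + 1    # per-signature secret nonce
        x, _ = scalar_mult(k, G)
        r = x % N_ORDER
        s = pow(k, -1, N_ORDER) * (hash_msg(msg) + r * priv) % N_ORDER
        if r and s:
            return r, s

def verify(msg, sig, pub):
    r, s = sig
    w = pow(s, -1, N_ORDER)
    u1, u2 = hash_msg(msg) * w % N_ORDER, r * w % N_ORDER
    point = point_add(scalar_mult(u1, G), scalar_mult(u2, pub))
    return point is not None and point[0] % N_ORDER == r

priv = secrets.randbelow(N_ORDER - 1) + 1
pub = scalar_mult(priv, G)
sig = sign(b"hello", priv)
print(verify(b"hello", sig, pub))   # True
```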

Effect of Laser Power and Powder Flow Rate on Properties of Laser Metal Deposited Ti6Al4V

Laser Metal Deposition (LMD) is an additive manufacturing process whose capabilities include producing new parts directly from a 3-dimensional Computer Aided Design (3D CAD) model, building new parts on existing old components, and repairing existing high-value component parts that would have been discarded in the past. Despite these capabilities and its advantages over other additive manufacturing techniques, the underlying physics of the LMD process is yet to be fully understood, probably because of the strong interaction between the processing parameters; studying many parameters at the same time makes it even more complex to understand. In this study, the effects of laser power and powder flow rate on the physical properties (deposition height and deposition width), metallurgical property (microstructure) and mechanical property (microhardness) of laser-deposited Ti6Al4V, the most widely used aerospace alloy, are studied. Also, because Ti6Al4V is very expensive and LMD is capable of reducing the buy-to-fly ratio of aerospace parts, the material utilization efficiency is also studied. Four sets of experiments were performed and repeated to establish repeatability, using laser powers of 1.8 kW and 3.0 kW and powder flow rates of 2.88 g/min and 5.67 g/min, while keeping the gas flow rate and scanning speed constant at 2 l/min and 0.005 m/s respectively. The deposition height and width are found to increase with increasing laser power and powder flow rate. Material utilization is favoured by higher power, while a higher powder flow rate reduces material utilization. The results are presented and fully discussed.

Explorative Data Mining of Constructivist Learning Experiences and Activities with Multiple Dimensions

This paper discusses the use of explorative data mining tools that allow the educator to explore new relationships between reported learning experiences and actual activities, even when there are multiple dimensions with a large number of measured items. The underlying technology is based on the so-called Compendium Platform for Reproducible Computing (http://www.freestatistics.org), which was built on top of the computational R framework (http://www.wessa.net).

Therapeutic Product Preparation Bioprocess Modeling

An immunomodulator bioproduct is prepared in a batch bioprocess with a modified Pseudomonas aeruginosa bacterium. The bioprocess is performed in a 100 L Bioengineering bioreactor with 42 L of cultivation medium made of peptone, meat extract and sodium chloride. The optimal bioprocess parameters were determined: temperature 37 °C, agitation speed 300 rpm, aeration rate 40 L/min, pressure 0.5 bar, Dow Corning Antifoam M at a maximum of 4% of the medium volume, and duration 6 hours. These kinds of bioprocesses are regarded as difficult to control because their dynamic behavior is highly nonlinear and time-varying. The aim of the paper is to present and compare different models based on experimental data. The analysis criteria were modeling error and convergence rate. The estimated values and the modeling analysis were obtained using Table Curve 2D. The preliminary conclusions indicate the Andrews model, with a maximum specific growth rate of the bacterium in the range of 0.8 h⁻¹.
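
A minimal sketch of the Andrews (substrate-inhibition) growth-rate expression is given below; only the maximum specific growth rate of about 0.8 h⁻¹ comes from the abstract, while Ks, Ki and the substrate range are illustrative assumptions.

```python
import numpy as np

def andrews_mu(s, mu_max=0.8, ks=0.5, ki=50.0):
    """Andrews (Haldane) specific growth rate with substrate inhibition.

    mu = mu_max * S / (Ks + S + S^2/Ki); mu_max of 0.8 1/h follows the
    abstract, Ks and Ki are illustrative assumptions."""
    return mu_max * s / (ks + s + s ** 2 / ki)

substrate = np.linspace(0.1, 100, 5)      # g/L, hypothetical range
for s, mu in zip(substrate, andrews_mu(substrate)):
    print(f"S = {s:6.1f} g/L  ->  mu = {mu:.3f} 1/h")
```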

Multi-Case Multi-Objective Simulated Annealing (MC-MOSA): New Approach to Adapt Simulated Annealing to Multi-objective Optimization

In this paper a new approach is proposed for adapting the simulated annealing search to the field of Multi-Objective Optimization (MOO). This new approach is called Multi-Case Multi-Objective Simulated Annealing (MC-MOSA). It uses some of the basics of a well-known recent multi-objective simulated annealing algorithm proposed by Ulungu et al., referred to in the literature as U-MOSA. However, some drawbacks of this algorithm have been found and are replaced with alternatives, especially in the acceptance decision criterion. MC-MOSA has shown better performance than U-MOSA in the numerical experiments. This performance is further improved by other sub-variants of MC-MOSA, such as fast-annealing MC-MOSA, re-annealing MC-MOSA and two-stage annealing MC-MOSA.
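
As a point of reference for how a multi-objective simulated annealing acceptance step and a non-dominated archive can be organized, a generic weighted-sum sketch is given below; it is not the MC-MOSA acceptance criterion itself, and the bi-objective test problem is made up.

```python
import math
import random

def dominates(a, b):
    """Pareto dominance for minimization."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def mosa(objectives, neighbour, x0, weights, t0=1.0, cooling=0.95, steps=2000):
    """Weighted-sum multi-objective SA keeping an archive of non-dominated points."""
    x = x0
    fx = objectives(x)
    archive = [(x, fx)]
    t = t0
    for _ in range(steps):
        y = neighbour(x)
        fy = objectives(y)
        delta = sum(w * (b - a) for w, a, b in zip(weights, fx, fy))
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x, fx = y, fy                                 # accept the move
            if not any(dominates(g, fy) for _, g in archive):
                archive = [(z, g) for z, g in archive if not dominates(fy, g)]
                archive.append((y, fy))
        t *= cooling                                      # geometric cooling
    return archive

# Toy bi-objective problem: minimize (x^2, (x-2)^2) over the reals.
objs = lambda x: (x * x, (x - 2) ** 2)
step = lambda x: x + random.uniform(-0.2, 0.2)
front = mosa(objs, step, x0=5.0, weights=(0.5, 0.5))
print(sorted(f for _, f in front)[:5])
```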

A Hybrid Approach for Loss Reduction in Distribution Systems Using Harmony Search Algorithm

Individually, network reconfiguration and capacitor control each perform well in minimizing power loss and improving the voltage profile of a distribution system. However, for heavy reactive power loads network reconfiguration alone, and for heavy active power loads capacitor placement alone, cannot effectively reduce power loss and enhance voltage profiles. In this paper, a hybrid approach that combines network reconfiguration and capacitor placement using the Harmony Search Algorithm (HSA) is proposed to minimize power loss and improve the voltage profile. The proposed approach is tested on the standard IEEE 33- and 16-bus systems. Computational results show that the proposed hybrid approach can minimize losses more efficiently than network reconfiguration or capacitor control alone. The results of the proposed method are also compared with results obtained by Simulated Annealing (SA); the proposed method outperforms SA in terms of the quality of the solution.
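
A generic Harmony Search skeleton over a discrete decision vector (such as tie-switch states or capacitor sizes) is sketched below; the cost function is a stand-in, since the power-flow loss evaluation of the 33- and 16-bus systems is not modeled here.

```python
import random

def harmony_search(cost, candidates, hms=10, hmcr=0.9, par=0.3, iters=500):
    """Generic Harmony Search over a discrete decision vector.

    candidates[i] lists the allowed values of decision variable i (e.g.
    tie-switch states or capacitor sizes); `cost` stands in for the
    power-flow loss evaluation, which is not modeled here."""
    memory = [[random.choice(vals) for vals in candidates] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for i, vals in enumerate(candidates):
            if random.random() < hmcr:                  # memory consideration
                value = random.choice(memory)[i]
                if random.random() < par:               # pitch adjustment
                    value = random.choice(vals)
            else:                                       # random selection
                value = random.choice(vals)
            new.append(value)
        worst = max(memory, key=cost)
        if cost(new) < cost(worst):
            memory[memory.index(worst)] = new
    return min(memory, key=cost)

# Toy stand-in objective: choose one value per variable so the sum approaches 12.
candidates = [[0, 1, 2, 3]] * 6
best = harmony_search(lambda x: abs(sum(x) - 12), candidates)
print(best, abs(sum(best) - 12))
```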

Construct Pairwise Test Suites Based on the Bak-Sneppen Model of Biological Evolution

Pairwise testing, which requires that every combination of valid values of each pair of system factors be covered by at least one test case, plays an important role in software testing since many faults are caused by unexpected 2-way interactions among system factors. Although meta-heuristic strategies like simulated annealing can generally discover smaller pairwise test suites, they may take more time to perform the search compared with greedy algorithms. We propose a new method, an improved Extremal Optimization (EO) based on the Bak-Sneppen (BS) model of biological evolution, for constructing pairwise test suites, and define a fitness function according to the requirements of the improved EO. Experimental results show that the improved EO gives pairwise test suites of similar size to SA and yields an 85% reduction in solution time.
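
A minimal sketch of pairwise coverage bookkeeping with a Bak-Sneppen-flavoured update, in which the least-contributing factor of a candidate test case is repeatedly reselected, is shown below; the fitness definition and parameters are illustrative and not the paper's exact formulation.

```python
import itertools
import random

def missing_pairs(suite, levels):
    """All (factor i = value a, factor j = value b) pairs not yet covered."""
    needed = {
        (i, a, j, b)
        for i, j in itertools.combinations(range(len(levels)), 2)
        for a in range(levels[i])
        for b in range(levels[j])
    }
    for case in suite:
        for i, j in itertools.combinations(range(len(case)), 2):
            needed.discard((i, case[i], j, case[j]))
    return needed

def covered_by(case, missing):
    """How many still-missing pairs a candidate test case would cover."""
    return sum(
        1
        for i, j in itertools.combinations(range(len(case)), 2)
        if (i, case[i], j, case[j]) in missing
    )

def build_suite(levels, sweeps=50):
    """Grow the suite one case at a time; within a case, repeatedly reselect the
    factor whose value contributes least to new coverage (Bak-Sneppen style)."""
    suite, missing = [], missing_pairs([], levels)
    while missing:
        case = [random.randrange(k) for k in levels]
        best, best_cov = case[:], covered_by(case, missing)
        for _ in range(sweeps):
            fitness = [
                sum(1 for (i, a, j, b) in missing
                    if case[i] == a and case[j] == b and pos in (i, j))
                for pos in range(len(levels))
            ]
            worst = fitness.index(min(fitness))
            case[worst] = random.randrange(levels[worst])    # mutate weakest component
            cov = covered_by(case, missing)
            if cov > best_cov:
                best, best_cov = case[:], cov
        if best_cov == 0:                    # guarantee progress on tiny remainders
            i, a, j, b = next(iter(missing))
            best[i], best[j] = a, b
        suite.append(best)
        missing = missing_pairs(suite, levels)
    return suite

# Toy system: 4 factors with 3 levels each (3^4 = 81 exhaustive cases).
suite = build_suite([3, 3, 3, 3])
print(len(suite), "pairwise test cases")
```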