Measurement and Prediction of Speed of Sound in Petroleum Fluids

Seismic methods play an important role in the exploration for hydrocarbon reservoirs. However, the success of the method depends strongly on the reliability of the measured or predicted information regarding the velocity of sound in the medium. The speed of sound has also been used to study the thermodynamic properties of fluids. In this study, experimental data on the speed of sound in a toluene and octane binary mixture are reported and analyzed. A three-factor, three-level Box-Behnken design is used to determine the significance of each factor, the interaction effects between factors, and the factors with the greatest influence on the speed of sound. The resulting mathematical model and statistical analysis capture the simultaneous interactive effects of the independent variables and indicate that the fitted quadratic models are highly accurate and predictive.
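
For readers unfamiliar with the design, the sketch below shows what a three-factor Box-Behnken layout and a quadratic response-surface fit look like in practice; the coded design points are standard, but the response values and fitted coefficients are synthetic placeholders, not the measured toluene/octane data.

```python
# Illustrative sketch only: a three-factor Box-Behnken design and a quadratic
# response-surface fit by least squares. The synthetic "speed of sound"
# responses below are hypothetical, not the measured toluene/octane data.
import numpy as np

# Box-Behnken design for 3 factors (coded levels -1, 0, +1) plus 3 center runs.
bbd = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0],
], dtype=float)

def quad_terms(X):
    """Full quadratic model matrix: 1, x_i, x_i^2, x_i*x_j."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1*x2, x1*x3, x2*x3])

rng = np.random.default_rng(0)
# Fake responses in m/s, standing in for the measured speed of sound.
y = 1300 - 40*bbd[:, 0] + 15*bbd[:, 1] - 5*bbd[:, 2] + rng.normal(0, 1, len(bbd))

coef, *_ = np.linalg.lstsq(quad_terms(bbd), y, rcond=None)
r2 = 1 - np.sum((y - quad_terms(bbd) @ coef)**2) / np.sum((y - y.mean())**2)
print("fitted coefficients:", np.round(coef, 2))
print("R^2 =", round(r2, 4))
```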

Partner Selection in International Strategic Alliances: The Case of the Information Industry

This study analyzes international strategic alliances in the information industry. Its purpose is, first, to clarify the strategic intentions behind an international alliance and, second, to investigate the influence of differences in the target markets of partner companies on the alliance. Using an international strategy theory approach to analyze the strategies of global companies, the study compares a database business and an electronic publishing business. In particular, these cases emphasized factors attributable to "people" and "learning", reliability and communication between organizations, and the evolution of the IT infrastructure. The theory developed in this study validates the effectiveness of these strategies.

An Exploratory Study of Reliability of Ranking vs. Rating in Peer Assessment

Fifty years of research has found great potential for peer assessment as a pedagogical approach. With peer assessment, not only do students receive more copious feedback; they also learn to become assessors. In recent decades, more educational peer assessments have been facilitated by online systems. These online systems are designed differently to suit different class settings and student groups, but they basically fall into two categories: rating-based and ranking-based. Rating-based systems ask assessors to rate artifacts one by one following review rubrics, whereas ranking-based systems ask assessors to review a set of artifacts and rank them relative to one another. Although both categories have many systems and a large number of users, there has been no comprehensive comparison of which design leads to higher reliability. In this paper, we designed algorithms to evaluate assessors' reliability based on their ratings or rankings against the global ranks of the artifacts they have reviewed. These algorithms are suitable for data from both rating-based and ranking-based peer assessment systems. The experiments were conducted on more than 15,000 peer assessments from multiple peer assessment systems. We found that assessors in ranking-based peer assessments are at least 10% more reliable than assessors in rating-based peer assessments. Further analysis also demonstrated that assessors in ranking-based assessments tend to assess the more differentiable artifacts correctly, whereas no such pattern appears for rating-based assessors.
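
As an illustration of the general idea (not the authors' exact algorithms), the sketch below scores one hypothetical assessor of each type by the agreement, measured with Kendall's tau, between their ratings or ranks and the global ranks of the artifacts they reviewed.

```python
# A minimal sketch of one way to score assessor reliability: agreement between
# an assessor's ratings (or ranks) and the global ranks of the artifacts they
# reviewed, measured with Kendall's tau. The data below are hypothetical.
from scipy.stats import kendalltau

# Global ranks of five artifacts (1 = best), e.g. aggregated over all reviews.
global_rank = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}

# One rating-based assessor (higher score = better) and one ranking-based
# assessor (1 = best) who each reviewed the same five artifacts.
ratings = {"A": 9, "B": 7, "C": 8, "D": 5, "E": 3}
ranks   = {"A": 1, "B": 3, "C": 2, "D": 4, "E": 5}

def reliability(assessment, higher_is_better):
    artifacts = sorted(assessment)
    scores = [assessment[a] for a in artifacts]
    # Convert to a "quality" orientation so both inputs agree in direction.
    quality = scores if higher_is_better else [-s for s in scores]
    target = [-global_rank[a] for a in artifacts]   # rank 1 = highest quality
    tau, _ = kendalltau(quality, target)
    return tau   # 1.0 = perfect agreement with the global ordering

print("rating-based assessor:", round(reliability(ratings, True), 2))
print("ranking-based assessor:", round(reliability(ranks, False), 2))
```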

The Application of Line Balancing Technique and Simulation Program to Increase Productivity in Hard Disk Drive Components

This study aims to balance the number of operators (line balancing technique) in the production line of hard disk drive components in order to increase efficiency. At present, demand for hard disk drives has continuously declined, limiting the company's revenue potential. It is therefore important to improve and develop the production process in order to maintain market share and to compete on value and quality, and an effective tool is needed to support this. In this research, the Arena program was applied to analyze the process both before and after the improvement, so that the proposed changes could be validated before being applied to the real process. There were 14 work stations with 35 operators altogether in the RA production process where this study was conducted. In the actual process, the average observed time was 84.03 seconds per piece (based on 30 time measurements at each work station), and a performance rating of 123% was assessed using the Westinghouse principles, with an allowance of 5%. Consequently, the standard time was 108.53 seconds per piece. The Takt time, calculated as the available working time in one day divided by customer demand, was 3.66 seconds per piece. From these values, the proper number of operators was 30, meaning that five operators could be removed to improve the efficiency of the production process. A model of the actual process was then created in the Arena program and its reliability was confirmed by comparing the simulated output with that of the actual process; the outputs matched, indicating that the model was reliable. The revised worker numbers and job assignments were then modelled in the Arena program. Lastly, the efficiency of the production process increased from 70.82% to 82.63%, meeting the target.
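
The time-study arithmetic quoted above can be reproduced with the usual work-measurement formulas; the short sketch below uses only the figures given in the abstract (observed time, rating, allowance, and Takt time).

```python
# Back-of-the-envelope reproduction of the time-study arithmetic in the
# abstract, using the standard work-measurement formulas.
import math

observed_time = 84.03   # s per piece, average of the time study
rating = 1.23           # Westinghouse performance rating (123%)
allowance = 0.05        # 5% allowance

standard_time = observed_time * rating * (1 + allowance)
print(f"standard time = {standard_time:.2f} s/piece")   # ~108.5 (108.53 in the abstract after intermediate rounding)

takt_time = 3.66        # s per piece, from available working time / daily demand
operators = math.ceil(standard_time / takt_time)
print(f"operators     = {operators}")                   # 30, down from 35
```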

Stochastic Risk Analysis Framework for Building Construction Projects

The study was carried out to establish the probability density functions of selected building construction projects of similar complexity delivered using the Bill of Quantities (BQ) and Lump Sum (LS) forms of contract, and to draw a reliability scenario for each form of contract. Thirty such completed projects are analyzed for each contract form using Weibull analysis, and their Weibull parameters (α and β) are determined from their completion times. For projects delivered under the BQ form of contract, α is calculated as 1.6737E20 and β as +0.0115; for the LS form, α is found to be 5.6556E03 and β as +0.4535. Using these values, the respective probability density functions are calculated and plotted as a handy tool for risk analysis of future projects with similar characteristics. By inputting variables from other projects, decisions can be made for a whole project or its components using EVM analysis within project evaluation and review techniques. Although this framework, as a quantitative approach, depends on the assumption of normality in project completion times, it can greatly help in determining the completion-time probability of comparable projects delivered under either of the contract forms under consideration. Project aspects that are not amenable to measurement, on the other hand, can be analyzed using fuzzy sets and fuzzy logic. Similar scenarios can be drawn for different types of building construction projects and for other suitable forms of contract used in project delivery.
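
Assuming the conventional two-parameter Weibull form with α as the scale and β as the shape parameter (verify against the paper's notation), the reported values can be turned into density and reliability curves as sketched below; the evaluation times are hypothetical.

```python
# Sketch: turning the reported Weibull parameters into a density and a
# reliability (survival) curve for completion time. Assumes alpha = scale,
# beta = shape of the two-parameter Weibull distribution.
import numpy as np

def weibull_pdf(t, alpha, beta):
    return (beta / alpha) * (t / alpha) ** (beta - 1) * np.exp(-((t / alpha) ** beta))

def weibull_reliability(t, alpha, beta):
    """Probability that the completion time exceeds t."""
    return np.exp(-((t / alpha) ** beta))

# Parameters reported in the abstract for the two contract forms.
bq = {"alpha": 1.6737e20, "beta": 0.0115}
ls = {"alpha": 5.6556e03, "beta": 0.4535}

t = np.array([30.0, 90.0, 180.0, 365.0])   # hypothetical completion times (days)
for name, p in (("BQ", bq), ("LS", ls)):
    print(name, np.round(weibull_reliability(t, p["alpha"], p["beta"]), 4))
```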

Li-Fi Technology: Data Transmission through Visible Light

People are always in search of Wi-Fi hotspots because Internet access is a major demand nowadays. But like all other technologies, there is still room for improvement in Wi-Fi technology with regard to the speed and quality of connectivity. In order to address these aspects, Harald Haas, a professor at the University of Edinburgh, proposed what is now known as Li-Fi (Light Fidelity). Li-Fi is a new technology in the field of wireless communication that provides connectivity within a network environment. It is a two-way mode of wireless communication using light. Basically, data are transmitted through Light Emitting Diodes, whose light intensity can be varied very fast, even faster than the blink of an eye. From the research and experiments conducted so far, it can be said that Li-Fi can increase the speed and reliability of data transfer. This paper pays particular attention to assessing the performance of this technology, which can be regarded as a 5G technology that uses LEDs as the medium of data transfer. For coverage within buildings, Wi-Fi works well, but Li-Fi is favorable in situations where large amounts of data are to be transferred in areas with electromagnetic interference. It brings many desirable qualities, such as efficiency, security, and high throughput, to wireless communication. All in all, Li-Fi is likely to become a future phenomenon in which the presence of light will mean access to the Internet as well as speedy data transfer.

Dynamic Modelling and Virtual Simulation of Digital Duty-Cycle Modulation Control Drivers

This paper presents a dynamic architecture for digital duty-cycle modulation control drivers. Compared to most oversampling digital modulation schemes encountered in industrial electronics, its novelty is founded on a number of relevant merits, including embedded positive and negative feedback loops, an internal modulation clock, structural simplicity, elementary building operators, no explicit need for samples of the nonlinear duty-cycle function when computing the switching modulated signal, and a minimum number of design parameters. A prototype digital control driver is synthesized and thoroughly tested within the MATLAB/Simulink workspace. Then, the virtual simulation results and the performance obtained on a sample of relevant instrumentation and control systems are presented, in order to show the feasibility, reliability, and versatility, across target applications, of the proposed class of low-cost, high-quality digital control drivers for industrial electronics.

Application of Powder Metallurgy Technologies for Gas Turbine Engine Wheel Production

A detailed analysis has been performed for several schemes of gas turbine wheel production based on additive and powder technologies, including metal, ceramic, and stereolithography 3-D printing. During the development and debugging of gas turbine engine components, different versions of these components must be manufactured and tested. Cooled turbine blades are among these components. They are usually produced by traditional casting methods, which require the long and costly design and manufacture of casting molds. Moreover, traditional manufacturing methods limit the design possibilities for complex critical parts of the engine, so the capabilities of Powder Metallurgy Techniques (PMT) were analyzed for manufacturing the turbine wheel with air-cooled blades. PMT dramatically reduce the time needed for such production and allow new, complex design solutions aimed at improving the technical characteristics of the engine: improving fuel efficiency and environmental performance, increasing reliability, and reducing weight. To accelerate and simplify the blade manufacturing process, several options based on additive technologies were used and implemented in the form of various casting equipment for blade manufacturing. Powder metallurgy methods were applied to join the blades with the disc. The optimal production scheme and set of technologies for manufacturing the blades, the turbine wheel, and other parts of the engine can be selected on the basis of the options considered.

Reliability Analysis for Cyclic Fatigue Life Prediction in Railroad Bolt Hole

The bolted rail joint is one of the most vulnerable areas in a railway track. A comprehensive approach was developed for studying the reliability of fatigue crack initiation at the railroad bolt hole under random axle loads and random material properties. Operating conditions were also treated as stochastic variables. In order to obtain a comprehensive probabilistic model for predicting fatigue crack initiation life at the railroad bolt hole, we used the finite element method (FEM), the response surface method (RSM), and reliability analysis. A combined energy-density-based and critical-plane-based fatigue concept is used for the crack initiation prediction. The dynamic loads were calculated from the axle load, speed, and track properties. The results show that the fatigue crack initiation life is far more sensitive to the axle load than to Poisson's ratio. Also, the reliability index decreases slowly because this area lies in the high-cycle fatigue regime.
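
A heavily simplified sketch of the response-surface-plus-Monte-Carlo idea is given below: a hypothetical quadratic surrogate for crack-initiation life, random axle load and Poisson's ratio, and a reliability index derived from the estimated failure probability. All coefficients, distributions, and the design life are placeholders, not the paper's fitted surface.

```python
# Simplified response-surface + Monte Carlo reliability sketch. The quadratic
# surrogate and all numbers are hypothetical placeholders standing in for a
# surface fitted to FEM results.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 200_000

axle_load = rng.normal(150.0, 15.0, n)     # kN, hypothetical mean and spread
poisson = rng.normal(0.30, 0.006, n)       # nearly deterministic by comparison

def life_surrogate(P, nu):
    """Hypothetical quadratic response surface for cycles to crack initiation."""
    return 5.0e6 - 2.5e4 * (P - 150.0) - 1.0e5 * (nu - 0.30) + 60.0 * (P - 150.0) ** 2

design_life = 4.0e6                         # cycles, hypothetical target
pf = np.mean(life_surrogate(axle_load, poisson) < design_life)
beta = -norm.ppf(pf) if pf > 0 else np.inf  # reliability index
print(f"Pf = {pf:.4e}, beta = {beta:.2f}")
```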

A SWOT Analysis on Institutional Environments of University of the Punjab

The major purpose of the study was to identify the strengths, weaknesses, opportunities, and threats of the institutional environment of the University of the Punjab, Lahore. The target population of the study was the teachers of the University of the Punjab, Lahore. A sample of 235 teachers (155 males, 80 females) was selected through a multistage stratified sampling technique. A questionnaire on the institutional environment of the university, structured around a SWOT analysis (Strengths, Weaknesses, Opportunities, and Threats), was used to collect the required data. The questionnaire consisted of two parts: the first part comprised demographic information (faculty, department, gender, teacher rank), while the second part included statements regarding the SWOT analysis (strengths, weaknesses, opportunities and threats). The reliability index (Cronbach's alpha) of the questionnaire was 0.87, which is statistically acceptable. Analysis of the data indicated a significant difference in the opinions of respondents. Teachers of Islamic Studies and Law differed in their opinions regarding the strengths and opportunities of the institutional environment, and this was supported by the findings of the study. There was a significant difference between male and female teachers' opinions regarding the strengths and opportunities of the university, but no significant difference regarding its weaknesses and threats.
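
For reference, the reliability statistic quoted above can be computed as sketched below; the item responses are randomly generated placeholders, not the survey data.

```python
# Minimal sketch of the Cronbach's alpha computation used to report the
# questionnaire reliability. The 5-point responses are random placeholders.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(3, 1, size=(235, 1))                    # common attitude per respondent
responses = np.clip(np.round(latent + rng.normal(0, 0.7, size=(235, 20))), 1, 5)
print("Cronbach's alpha =", round(cronbach_alpha(responses), 2))
```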

The Communication Library DIALOG for iFDAQ of the COMPASS Experiment

Modern experiments in high energy physics impose great demands on the reliability, the efficiency, and the data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments; it provides a network-transparent inter-process communication layer. Built on the high-performance, modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes in the iFDAQ during the 2016 run. From the software point of view, it can be considered a significant improvement of the iFDAQ in comparison with the previous run. To extend the debugging possibilities, online monitoring of the communication among processes via the DIALOG GUI is a desirable feature. In this paper, we present the DIALOG library from several perspectives and discuss it in detail. Moreover, efficiency measurements and a comparison with the DIM library with respect to the iFDAQ requirements are provided.
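
As a language-neutral illustration of what a message-based, network-transparent IPC layer does (DIALOG itself is a C++ library built on Qt's networking classes, so this is not its API), the following sketch shows a minimal request/acknowledge exchange between two processes; the endpoint, key, and message names are hypothetical.

```python
# Minimal sketch of message-passing inter-process communication, in the spirit
# of a network-transparent IPC layer. NOT the DIALOG API; names, port, and
# message format are hypothetical.
import time
from multiprocessing import Process
from multiprocessing.connection import Client, Listener

ADDRESS = ("localhost", 6000)   # hypothetical endpoint
AUTHKEY = b"ifdaq-demo"

def receiver():
    """Accept one connection and acknowledge every message until 'quit'."""
    with Listener(ADDRESS, authkey=AUTHKEY) as listener:
        conn = listener.accept()
        while True:
            msg = conn.recv()              # blocking, message-oriented receive
            if msg == "quit":
                break
            conn.send({"ack": msg})        # reply with an acknowledgement
        conn.close()

def sender():
    """Connect (retrying while the receiver starts up) and exchange messages."""
    for _ in range(50):
        try:
            conn = Client(ADDRESS, authkey=AUTHKEY)
            break
        except ConnectionRefusedError:
            time.sleep(0.1)
    else:
        raise ConnectionRefusedError("receiver did not start")
    conn.send("event-block-42")            # hypothetical message payload
    print(conn.recv())                     # -> {'ack': 'event-block-42'}
    conn.send("quit")
    conn.close()

if __name__ == "__main__":
    p = Process(target=receiver)
    p.start()
    sender()
    p.join()
```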

Numerical Investigation of Multiphase Flow in Pipelines

We present and analyze reliable numerical techniques for simulating complex flow and transport phenomena related to natural gas transportation in pipelines. Such problems are of high interest in the fields of petroleum and environmental engineering. Modeling and understanding natural gas flow and transformation processes during transportation is important for physical realism and for the design and operation of pipeline systems. In our approach, a two-fluid flow model based on a system of coupled hyperbolic conservation laws is considered for describing natural gas flow undergoing hydrate formation. The accurate numerical approximation of two-phase gas flow remains a subject of strong interest in the scientific community. Such hyperbolic problems are characterized by solutions with steep gradients or discontinuities, and their approximation by standard finite element techniques typically gives rise to spurious oscillations and numerical artefacts. Recently, stabilized and discontinuous Galerkin finite element techniques, which are well adapted to the hyperbolic nature of our two-phase flow model, have attracted researchers' interest. Here, a streamline-upwind Petrov-Galerkin approach and a discontinuous Galerkin finite element method are presented for the numerical approximation of our flow model, which consists of two coupled systems of Euler equations. The efficiency and reliability of the stabilized continuous and discontinuous finite element methods are then carefully analyzed, and the potential of both classes of numerical schemes is investigated. In particular, standard two-phase flow benchmark problems, such as the shock tube problem, are used for the comparative numerical study.
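
To make the stabilization issue concrete, the sketch below applies a first-order upwind finite-volume scheme to the inviscid Burgers equation with shock-tube-style initial data; it is a scalar stand-in for the coupled Euler systems, not the paper's SUPG or DG discretization.

```python
# A deliberately simple 1D illustration: first-order upwind finite volumes for
# the inviscid Burgers equation with Riemann (shock-tube style) initial data.
# First-order upwinding is monotone, so the shock is captured without the
# spurious oscillations that plague unstabilized schemes.
import numpy as np

nx, L, T = 400, 1.0, 0.3
dx = L / nx
x = (np.arange(nx) + 0.5) * dx

u = np.where(x < 0.5, 1.0, 0.2)            # Riemann initial data, u > 0 everywhere

def flux(u):
    return 0.5 * u ** 2                     # Burgers flux f(u) = u^2/2

t = 0.0
while t < T:
    dt = min(0.8 * dx / np.max(np.abs(u)), T - t)   # CFL condition
    # With u > 0 the characteristic speed is positive, so the upwind interface
    # flux at i+1/2 is evaluated from the left cell, F[i] = f(u_i).
    F = flux(u)
    u[1:] = u[1:] - dt / dx * (F[1:] - F[:-1])      # left (inflow) cell kept fixed
    t += dt

# Exact shock speed s = (f(uL)-f(uR))/(uL-uR) = 0.6, so the shock sits near 0.68.
print("shock position ~", x[np.argmax(u < 0.6)])
```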

Redundancy Component Matrix and Structural Robustness

We introduce the redundancy matrix, which clearly expresses the geometrical and topological configuration of a structure. With this matrix, the redundancy of the structure is resolved into redundant components and assigned to each member or rigid joint. The diagonal elements of the matrix indicate the importance of the corresponding members or rigid joints, and the geometric correlations between them are captured by the off-diagonal elements. If a member or rigid joint fails, the reassignment of the redundant components can be calculated with the recursive method given in the paper. By combining reliability indexes with the redundant components, we define an index of structural robustness. To further explain the properties of the redundancy matrix, several examples of statically indeterminate structures, including two trusses and a rigid frame, are presented, and some simple results and properties of the matrix are discussed. The examples also illustrate that the redundancy matrix and the related concepts are valuable in structural safety analysis.
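
One common construction from the literature on statically indeterminate structures, which may differ in detail from the matrix defined in the paper, distributes the degree of static indeterminacy over the members through the diagonal of a projector onto the self-stress space; a toy example (assuming uniform member stiffness) is sketched below.

```python
# Hedged sketch: projector onto the null space (self-stress space) of the
# equilibrium matrix of a toy truss, assuming uniform member stiffness. Its
# trace equals the degree of static indeterminacy and its diagonal distributes
# that redundancy over the members. The paper's matrix may differ in detail.
import numpy as np

# Toy truss: one free node at the origin (2 DOFs) tied to three fixed supports.
# Columns of A are the unit direction vectors of the three members.
s2 = 1.0 / np.sqrt(2.0)
A = np.array([[-s2, 0.0, s2],     # x-equilibrium of the free node
              [ s2, 1.0, s2]])    # y-equilibrium of the free node

m = A.shape[1]
R = np.eye(m) - np.linalg.pinv(A) @ A      # projector onto null(A)

print("degree of static indeterminacy =", round(np.trace(R), 6))   # m - rank(A) = 1
print("redundant component per member =", np.round(np.diag(R), 3)) # [0.25, 0.5, 0.25]
```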

Reliability Based Performance Evaluation of Stone Column Improved Soft Ground

The present study considers the effect of the variation of different geotechnical random variables in the design of stone column-foundation systems for assessing the bearing capacity and consolidation settlement of highly compressible soil. The soil and stone column properties, as well as the spacing, diameter, and arrangement of the stone columns, are considered as random variables. The probability of failure (Pf) is computed for a target degree of consolidation and a target safe load by Monte Carlo Simulation (MCS). The study shows that the variations in the coefficient of radial consolidation (cr) and the cohesion of the soil (cs) are the two most important factors influencing Pf. If the coefficient of variation (COV) of cr exceeds 20%, Pf exceeds 0.001, which is unsafe according to the guidelines of the US Army Corps of Engineers. The safe value is also exceeded for the bearing capacity criterion when the COV of cs is greater than 30%. It is also observed that as the spacing between the stone columns increases, the probability of reaching a target degree of consolidation decreases. Accordingly, design guidelines considering both the consolidation and the bearing capacity of the improved ground are proposed for different spacings and diameters of stone columns and different geotechnical random variables.
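
The Monte Carlo estimation of Pf for a serviceability criterion can be sketched as below; the limit state, distributions, and target values are hypothetical placeholders rather than the paper's design model.

```python
# Minimal Monte Carlo sketch of estimating a probability of failure for a
# serviceability criterion. The limit state and all numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Lognormal coefficient of radial consolidation, mean 2.0 m^2/yr, COV 20%.
mean_cr, cov_cr = 2.0, 0.20
sigma = np.sqrt(np.log(1 + cov_cr**2))
mu = np.log(mean_cr) - 0.5 * sigma**2
cr = rng.lognormal(mu, sigma, n)

# Hypothetical degree of consolidation reached at the design time, increasing
# with cr; "failure" = not reaching the target degree of consolidation.
U = 1.0 - np.exp(-2.6 * cr)
target_U = 0.90

pf = np.mean(U < target_U)
print(f"estimated Pf = {pf:.2e}")
```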

A Study on the Relationship between Transaction Fairness, Social Capital, Supply Chain Integration and Sustainability: Focusing on Manufacturing Companies of South Korea

The purpose of this study is to analyze the relationship between transaction fairness, social capital, supply chain integration, and sustainability. Based on previous studies, measurement items were determined; exploratory factor analysis was performed using SPSS 22, and confirmatory factor analysis and path analysis were then performed using AMOS 21 on the items that satisfied the reliability, validity, and appropriateness of the measurement model. The results show that transaction fairness has a significant positive effect on social capital, social capital on supply chain integration, and supply chain integration on economic and social sustainability, while supply chain integration has a positive but non-significant effect on environmental sustainability. Supply chain integration was also shown to act as a mediating variable between social capital and economic and social sustainability, but not between social capital and environmental sustainability. This study suggests that clearly examining the relationship between transaction fairness, social capital, supply chain integration, and sustainability, and maintaining fairness in transactions, promotes the formation of social capital, furthers supply chain integration, and helps achieve the sustainability of the entire supply chain.

High-Value Health System for All: Technologies for Promoting Health Education and Awareness

Health for all is considered a sign of well-being and inclusive growth. New healthcare technologies contribute to the quality of human lives by promoting health education and awareness, leading to the prevention, early diagnosis, and treatment of the symptoms of diseases. Healthcare technologies have now migrated from medical and institutionalized settings to the home and everyday life. This paper explores these new technologies and investigates how they contribute to health education and awareness, promoting the objective of a high-value health system for all. The methodology used for the research is a literature review. The paper also discusses the opportunities and challenges of futuristic healthcare technologies. The combined advances in genomic medicine, wearables, and the IoT, together with enhanced data collection in electronic health record (EHR) systems, environmental sensors, and mobile device applications, can contribute substantially to a high-value health system for all. The promise of these technologies includes a reduced total cost of healthcare, a reduced incidence of medical diagnosis errors, and reduced treatment variability. The major barriers to adoption include concerns about the security, privacy, and integrity of healthcare data; regulation and compliance issues; service reliability; interoperability and portability of data; and the user-friendliness and convenience of these technologies.

Laboratory Investigations on the Utilization of Recycled Construction Aggregates in Asphalt Mixtures

Road networks are expanding all over the world, and the construction and maintenance of road pavements require large amounts of aggregates. The considerable use of various natural aggregates for constructing roads, together with the increasing rate at which solid waste is generated, has led many researchers in the pavement industry to investigate the feasibility of applying some of these waste materials as alternative materials in pavement construction. Among the various waste materials, construction and demolition wastes, including Recycled Construction Aggregate (RCA), constitute a major part of the municipal solid waste in Australia. Creating opportunities for the application of RCA in civil and geotechnical engineering is an efficient way to increase its market value. However, in spite of this promising potential, insufficient and inconclusive data on the engineering properties of RCA have, to date, limited its reliability and the development of design specifications. In light of this, this paper, as the first step of a comprehensive research program, aims to investigate the feasibility of using RCA obtained from construction and demolition waste to replace part of the coarse aggregate in asphalt mixtures. As the suitability of aggregates for use in asphalt mixtures is determined by their characteristics, including their physical and mechanical properties, an experimental program was set up to evaluate the physical and mechanical properties of RCA. This laboratory investigation included the measurement of the compressive strength and workability of RCA, particle shape, water absorption, flakiness index, crushing value, deleterious materials and weak particles, wet/dry strength variation, and particle density. In addition, the properties of RCA are compared with those of virgin aggregates, and this paper presents the results of these investigations on RCA, basalt, and RCA/basalt blends.

Analysis of the Current and Ideal Situation of Iran’s Football Talent Management Process from the Perspective of the Elites

The aim of this study was to investigate the current and ideal situations of the talent identification process in Iranian football from the point of view of Iranian instructors of the Asian Football Confederation (AFC). This research was a descriptive-analytical study; in the data collection phase, a questionnaire was used whose face validity was confirmed by experts in Physical Education and Sports Science. The reliability of the questionnaire was estimated using Cronbach's alpha (0.91). The study involved 122 Iranian AFC instructors selected through a stratified random sampling method. Descriptive statistics were used to describe the variables, and inferential statistics (Chi-square) were used to test the hypotheses of the study at a significance level of p ≤ 0.05. The results of the Chi-square tests on the views of the Iranian AFC instructors showed that the grass-roots scientific method was the best way to identify football players (0.001), under 10 years was the best age range for talent identification (0.001), the Football Federation was the most important organization for talent identification (0.002), clubs were the most important institutions for developing talent (0.001), trained scouts of the Football Federation were the best and most appropriate group for talent identification (0.001), and referral by football academy coaches was the best way to attract talented football players in Iran (0.001). It was also found that there was a large difference between the current and ideal situations of the talent identification process in Iranian football from the point of view of the Iranian AFC instructors. Hence, it is recommended that the policy makers of talent identification for Iranian football provide a comprehensive, clear, and systematic model of talent identification and development for clubs and football teams, so that the talent identification process helps to nurture football talent more efficiently.

Field-Programmable Gate Array Based Tester for Protective Relay

The reliability of the power grid depends on the successful operation of thousands of protective relays. The failure of one relay to operate as intended may lead to a blackout of the entire power grid. In fact, major power system failures during transient disturbances may be caused by unnecessary protective relay tripping rather than by the failure of a relay to operate. Adequate relay testing provides a first defense against false trips of the relay and hence improves power grid stability and prevents catastrophic bulk power system failures. The goal of this research project is to design and enhance a relay tester using Field Programmable Gate Array (FPGA) technology, specifically the NI 7851 card. A PC-based tester framework has been developed using a Simulink power system model to generate signals under different conditions (faults or transient disturbances) and LabVIEW to develop the graphical user interface and configure the FPGA. In addition, an interface system has been developed to output and amplify the signals without distortion. These signals should resemble those generated by the real power system and be large enough to test the relay's functionality. The generated signals, as displayed on the oscilloscope, are satisfactory. Furthermore, the proposed testing system can be used to improve the performance of protective relays.
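
As an illustration of the kind of waveform such a tester must reproduce (the real signals come from the Simulink power system model), the sketch below generates a three-phase voltage set with a hypothetical single-phase voltage sag applied at a chosen fault instant; sampling rate, amplitudes, and sag depth are assumptions.

```python
# Sketch of a test waveform: a 60 Hz three-phase voltage set with a
# single-phase voltage sag at a chosen fault instant. All values hypothetical.
import numpy as np

fs = 10_000                      # samples per second (hypothetical)
f0 = 60.0                        # nominal system frequency, Hz
t = np.arange(0, 0.2, 1 / fs)    # 200 ms record

V = 1.0                          # pre-fault amplitude, per unit
t_fault = 0.1                    # fault applied at 100 ms
sag = np.where(t >= t_fault, 0.3, 1.0)   # phase A drops to 0.3 pu

phases = {
    "Va": V * sag * np.sin(2 * np.pi * f0 * t),
    "Vb": V * np.sin(2 * np.pi * f0 * t - 2 * np.pi / 3),
    "Vc": V * np.sin(2 * np.pi * f0 * t + 2 * np.pi / 3),
}
for name, v in phases.items():
    print(name, "RMS =", round(np.sqrt(np.mean(v ** 2)), 3))
```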

A Partially Accelerated Life Test Planning with Competing Risks and Linear Degradation Path under Tampered Failure Rate Model

In this paper, we propose a method to model the relationship between failure time and degradation for a simple step-stress test where the underlying degradation path is linear and different causes of failure are possible. It is assumed that the intensity function depends only on the degradation value, and no assumptions are made about the distribution of the failure times. A simple step-stress test is used to shorten the failure time of products, and a tampered failure rate (TFR) model is proposed to describe the effect of the changing stress on the intensities. We assume that some of the products that fail during the test have a cause of failure that is known only to belong to a certain subset of all possible causes; this case is known as masking. In the presence of masking, the maximum likelihood estimates (MLEs) of the model parameters are obtained through an expectation-maximization (EM) algorithm by treating the causes of failure as missing values. The effect of incomplete information on the estimation of parameters is studied through a Monte Carlo simulation. Finally, a real example is analyzed to illustrate the application of the proposed methods.
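
A deliberately simplified sketch of the EM idea for masked competing risks is given below: two exponential failure causes at a single constant stress level, with a fraction of the causes masked. The TFR step-stress structure and the degradation dependence of the paper are omitted; the rates and masking probability are hypothetical.

```python
# Simplified EM for masked competing risks: two exponential causes, constant
# stress, 30% of causes masked. Not the paper's TFR/degradation model.
import numpy as np

rng = np.random.default_rng(7)
true_rates = np.array([0.02, 0.05])          # cause-specific hazard rates
n = 2000

t_causes = rng.exponential(1.0 / true_rates, size=(n, 2))
t = t_causes.min(axis=1)                     # observed failure time
cause = t_causes.argmin(axis=1)              # true cause (0 or 1)
masked = rng.random(n) < 0.3                 # 30% of causes are masked

lam = np.array([0.01, 0.01])                 # initial guess
for _ in range(200):
    # E-step: posterior cause probabilities. For exponential competing risks
    # the posterior for a masked unit is lam_j / sum(lam), independent of t.
    w = np.zeros((n, 2))
    w[~masked, cause[~masked]] = 1.0
    w[masked] = lam / lam.sum()
    # M-step: expected cause-specific failure counts / total time on test.
    lam = w.sum(axis=0) / t.sum()

print("estimated rates:", np.round(lam, 4), " true rates:", true_rates)
```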