Motion Estimator Architecture with Optimized Number of Processing Elements for High Efficiency Video Coding

Motion estimation accounts for the heaviest computation in HEVC (high efficiency video coding). Many fast algorithms, such as TZS (test zone search), have been proposed to reduce this computation, yet the computational load of motion estimation remains a critical issue in the implementation of an HEVC video codec. In this paper, a motion estimator architecture with an optimized number of PEs (processing elements) is presented that exploits early termination; it also reduces hardware size by exploiting parallel processing. The presented motion estimator has 8 PEs and can efficiently perform TZS with very high PE utilization.
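Since the abstract names the techniques but not their realization, the following minimal Python sketch illustrates how early termination prunes SAD computation inside a heavily reduced TZS-style search. The function names, block handling, and search pattern are assumptions for illustration only; the paper's 8-PE architecture would evaluate rows in parallel rather than in this sequential loop.

```python
import numpy as np

def sad_with_early_termination(cur, ref, best_so_far):
    """Accumulate SAD row by row and abort as soon as the partial sum
    exceeds the best SAD found so far (early termination)."""
    partial = 0
    for row in range(cur.shape[0]):
        partial += int(np.abs(cur[row].astype(int) - ref[row].astype(int)).sum())
        if partial >= best_so_far:      # this candidate can no longer win
            return None
    return partial

def tzs_like_search(cur_block, ref_frame, cx, cy, max_step=16):
    """Heavily reduced TZS-style search around the predictor (cx, cy)."""
    h, w = cur_block.shape
    best_sad, best_mv = float("inf"), (0, 0)
    step = max_step
    while step >= 1:
        for dx, dy in [(0, 0), (step, 0), (-step, 0), (0, step), (0, -step)]:
            x, y = cx + dx, cy + dy
            if 0 <= x and 0 <= y and y + h <= ref_frame.shape[0] and x + w <= ref_frame.shape[1]:
                sad = sad_with_early_termination(
                    cur_block, ref_frame[y:y + h, x:x + w], best_sad)
                if sad is not None and sad < best_sad:
                    best_sad, best_mv = sad, (dx, dy)
        step //= 2
    return best_mv, best_sad
```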

Analysis of Thermoelectric Coolers as Energy Harvesters for Low Power Embedded Applications

The growing popularity of solid-state thermoelectric devices in cooling applications has sparked an increasing diversity of thermoelectric coolers (TECs) on the market, commonly known as “Peltier modules”. They can also be used as generators, converting a temperature difference into electric power, and opportunities are plentiful for using these devices as thermoelectric generators (TEGs) to supply energy to low-power, autonomous embedded electronic applications. Their adoption as energy harvesters in this new domain of usage is hindered by the complex thermoelectric models commonly associated with TEGs. Low-cost TECs for the consumer market lack the parameters these models require because the devices are not intended for this mode of operation, calling for an alternative method to obtain electric power estimates under specific operating conditions. The test setup implemented in this paper is specifically targeted at benchmarking commercial, off-the-shelf TECs for use as energy harvesters in domestic environments: applications with limited temperature differences and limited available space. Its usefulness is demonstrated by testing and comparing single- and multi-stage TECs of different sizes. The effect of a boost converter stage on the thermoelectric end-to-end efficiency is also discussed.
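As a hedged illustration of the kind of power estimate the paper targets, the sketch below evaluates the basic Seebeck generator model (open-circuit voltage V_oc = S·ΔT, power into a resistive load P = V_oc²·R_L/(R_int + R_L)²). The parameter values are assumptions, not data from any tested module.

```python
def teg_output_power(seebeck_v_per_k, r_internal_ohm, delta_t_k, r_load_ohm):
    """Estimate TEG power delivered to a resistive load using the basic
    Seebeck model: V_oc = S * dT, P = V_oc^2 * R_L / (R_int + R_L)^2."""
    v_oc = seebeck_v_per_k * delta_t_k
    return v_oc ** 2 * r_load_ohm / (r_internal_ohm + r_load_ohm) ** 2

# Matched load (R_L = R_int) gives the maximum power point.
# Example values are assumptions, not measured module parameters.
s, r_int, dt = 0.05, 3.0, 10.0        # 50 mV/K, 3 ohm, 10 K
p_max = teg_output_power(s, r_int, dt, r_int)
print(f"P_max ~ {p_max * 1e3:.1f} mW")
```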

Relation between Organizational Climate and Personnel Performance Assessment in a Tourist Service Company

This investigation aims to analyze and determine the relation between two very important variables in human resource management: organizational climate and performance assessment. The study seeks to contribute knowledge on the relation between these variables, since the literature still does not provide solid evidence in this respect and the cases reviewed are too incipient to support conclusions enabling a typology of this relation. To this end, a correlational, cross-sectional design was adopted, combining quantitative and qualitative techniques and covering all the workers of the tourist service company PTS Peru. To measure organizational climate, the OCQ (Organizational Climate Questionnaire) was used; it has 50 items and measures 9 dimensions of organizational climate. To assess performance, a questionnaire with 21 items covering 6 dimensions was designed. In addition, a focus group was conducted with a worker from every area of the company, and interviews with human resources experts were carried out. The results of the investigation show a clear relation between organizational climate and personnel performance assessment, as well as relations between the nine dimensions of organizational climate and work performance in general and some of its dimensions.

Analysis of Joint Source Channel LDPC Coding for Correlated Sources Transmission over Noisy Channels

In this paper, a joint source-channel (JSC) coding scheme based on LDPC codes is investigated. We consider two concatenated LDPC codes: one compresses a correlated source, and the other protects it against channel degradation. The original information is reconstructed at the receiver by a joint decoder, in which the source decoder and the channel decoder run in parallel and exchange extrinsic information. We investigate the performance of the JSC LDPC code in terms of Bit-Error Rate (BER) for transmission over an Additive White Gaussian Noise (AWGN) channel and for different source and channel rate parameters. We emphasize how JSC LDPC coding presents a performance tradeoff depending on the channel state and the source correlation, and we show that it is an efficient solution for relatively low Signal-to-Noise Ratio (SNR) channels, especially with highly correlated sources. Finally, a source-channel rate optimization has to be applied to guarantee the best JSC LDPC system performance for a given channel.
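The sketch below is a Monte Carlo BER harness over BPSK/AWGN of the kind used to produce such curves. The hard-decision slicer is only a stand-in for the joint decoder, which would instead iterate extrinsic LLR exchange between the source and channel LDPC decoders; all parameters are illustrative.

```python
import numpy as np

def awgn_ber_trial(n_bits, ebn0_db, rate=0.5, seed=0):
    """Monte Carlo BER over an AWGN channel with BPSK signalling.
    The hard decision on the channel LLRs stands in for the joint
    source-channel decoder described in the paper."""
    rng = np.random.default_rng(seed)
    bits = rng.integers(0, 2, n_bits)
    symbols = 1 - 2 * bits                      # BPSK: 0 -> +1, 1 -> -1
    ebn0 = 10 ** (ebn0_db / 10)
    sigma = np.sqrt(1 / (2 * rate * ebn0))      # noise std for code rate
    received = symbols + sigma * rng.standard_normal(n_bits)
    llr = 2 * received / sigma ** 2             # channel LLRs fed to the decoder
    decided = (llr < 0).astype(int)
    return np.mean(decided != bits)

for snr_db in [0, 2, 4, 6]:
    print(snr_db, "dB ->", awgn_ber_trial(100_000, snr_db))
```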

Auteur 3D Filmmaking: From Hitchcock’s Protrusion Technique to Godard’s Immersion Aesthetic

Throughout film history, the periodic return of 3D cinema has been discussed in connection with crises caused by the advent of television or competition from the Internet. In addition, the three waves of stereoscopic 3D (from 1952 up to 1983) and its current digital version have been blamed for adding a challenging technical distraction to the viewing experience. By discussing the films Dial M for Murder (1954) and Goodbye to Language (2014), the paper analyzes the response of recognized auteurs to the use of 3D techniques in filmmaking. For Alfred Hitchcock, the solution to attaining perceptual immersion paradoxically resided in restraining the signature effect of 3D, namely protrusion. In Jean-Luc Godard’s vision, 3D techniques allowed him to explore perceptual absorption by means of depth of field, which he had long advocated as central to cinema. Thus, both directors contribute to the foundation of an auteur aesthetic in 3D filmmaking.

Comparison of the Distillation Curve Obtained Experimentally with the Curve Extrapolated by a Commercial Simulator

True Boiling Point (TBP) distillation is one of the most common experimental techniques for determining petroleum properties. The TBP curve provides information about the performance of the petroleum in terms of its cuts, but the experiment takes several days to perform. Faster techniques therefore use software to calculate the distillation curve when only limited information about the crude oil is known. In order to evaluate the accuracy of distillation curve prediction, eight points of the TBP curve and the specific gravity curve (348 K and 523 K) were inserted into the HYSYS Oil Manager, and the extended curve was evaluated up to 748 K. The methods were able to predict the curve with errors of 0.6%-9.2% (Software X ASTM) and 0.2%-5.1% (Software X Spaltrohr).
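As an illustration of extending a curve from a few measured points, the sketch below fits a simple monotone parametric form, loosely inspired by Riazi-type distribution models, to assumed TBP data and evaluates it beyond the last point. This is not the method implemented in the HYSYS Oil Manager; the data and the functional form are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

# Eight assumed TBP points: volume fraction distilled vs. temperature (K)
vol_frac = np.array([0.05, 0.10, 0.20, 0.30, 0.40, 0.50, 0.60, 0.70])
temp_k   = np.array([340., 365., 400., 430., 460., 490., 525., 560.])

def riazi_like(v, t0, a, b):
    """Simple monotone parametric form for a TBP curve,
    loosely following Riazi-type distribution models."""
    return t0 + a * (np.log(1.0 / (1.0 - v))) ** b

params, _ = curve_fit(riazi_like, vol_frac, temp_k, p0=[300.0, 100.0, 0.8])
print("T at 85 vol%% ~ %.0f K" % riazi_like(0.85, *params))
```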

High-Speed Particle Image Velocimetry of the Flow around a Moving Train Model with Boundary Layer Control Elements

Trackside induced airflow velocities, also known as slipstream velocities, are an important criterion for the design of high-speed trains. The maximum permitted values are given by the Technical Specifications for Interoperability (TSI) and have to be checked in the approval process. For train manufacturers it is of great interest to know in advance how new train geometries will perform in TSI tests. The Reynolds number in moving-model experiments is lower than at full scale, and the limited model length in particular leads to a thinner boundary layer at the rear end. The hypothesis is that the boundary layer rolls up into characteristic flow structures in the train wake, in which the maximum flow velocities are observed. The idea is to thicken the boundary layer using roughness elements at the train model head so that the ratio between boundary layer thickness and car width at the rear end is comparable to that of a full-scale train. This may lead to similar flow structures in the wake and better prediction accuracy for TSI tests. In this case, the design of the roughness elements is limited by the moving model rig: small rectangular roughness shapes are used to achieve a sufficient effect on the boundary layer while remaining robust enough to withstand the high accelerating and decelerating forces during the test runs. For this investigation, High-Speed Particle Image Velocimetry (HS-PIV) measurements on an ICE3 train model were carried out in the moving model rig of the DLR in Göttingen, the so-called tunnel simulation facility Göttingen (TSG). The flow velocities within the boundary layer are analysed in a plane parallel to the ground, at a height corresponding to a test position in the EN standard (TSI). Three different shapes of roughness elements are tested. The boundary layer thickness and displacement thickness, as well as the momentum thickness and the form factor, are calculated along the train model. Conditional sampling is used to analyse the size and dynamics of the flow structures at the time of maximum velocity in the wake behind the train. As expected, larger roughness elements increase the boundary layer thickness and lead to larger flow velocities in the boundary layer and in the wake flow structures. The boundary layer thickness, displacement thickness and momentum thickness are increased by larger roughness elements, especially when these are applied at heights close to the measuring plane. The roughness elements also cause strong fluctuations in the form factor of the boundary layer. Behind the roughness elements, the form factor rapidly approaches a constant value, indicating that the boundary layer, while growing slowly along the second half of the train model, has reached a state of equilibrium.
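The integral quantities named above follow directly from the measured velocity profile: displacement thickness δ* = ∫(1 − u/U)dy, momentum thickness θ = ∫(u/U)(1 − u/U)dy, and form factor H = δ*/θ. A minimal sketch, assuming a 1/7-power-law profile in place of actual PIV data:

```python
import numpy as np
from scipy.integrate import trapezoid

def boundary_layer_integrals(y, u, u_edge):
    """Integral boundary layer quantities from a velocity profile u(y):
    displacement thickness, momentum thickness, and form factor."""
    ratio = u / u_edge
    delta_star = trapezoid(1.0 - ratio, y)           # displacement thickness
    theta = trapezoid(ratio * (1.0 - ratio), y)      # momentum thickness
    return delta_star, theta, delta_star / theta     # H = delta* / theta

# Assumed 1/7-power-law profile standing in for measured PIV data
y = np.linspace(1e-4, 0.05, 200)                     # wall-normal position [m]
u = 20.0 * (y / 0.05) ** (1.0 / 7.0)                 # streamwise velocity [m/s]
d_star, theta, H = boundary_layer_integrals(y, u, 20.0)
print(f"delta* = {d_star*1e3:.2f} mm, theta = {theta*1e3:.2f} mm, H = {H:.2f}")
```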

Probability-Based Damage Detection of Structures Using Kriging Surrogates and Enhanced Ideal Gas Molecular Movement Algorithm

Surrogate models have received increasing attention for use in detecting structural damage based on vibration modal parameters. However, uncertainties in the measured vibration data may lead to false or unreliable output from such models. In this study, an efficient approach based on Monte Carlo simulation is proposed to take the effect of uncertainties into account when developing a surrogate model. The probability of damage existence (PDE) is calculated from the probability density functions of the undamaged and damaged states. The kriging technique allows one to genuinely quantify the surrogate error, and it is therefore chosen as the metamodeling technique. An enhanced version of the ideal gas molecular movement (EIGMM) algorithm is used as the main algorithm for model updating. The developed approach is applied to detect simulated damage in numerical models of a 72-bar space truss and a 120-bar dome truss. The simulation results show that the proposed method performs well in probability-based damage detection of structures, with less computational effort than the direct finite element model.
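The paper's exact PDE formulation is not reproduced in the abstract; as a hedged sketch, the fraction of Monte Carlo model-updating samples indicating stiffness loss beyond a small threshold gives one simple estimate of the probability of damage existence. The sample values below are synthetic.

```python
import numpy as np

def probability_of_damage(theta_samples, threshold=0.0):
    """Monte Carlo estimate of the probability of damage existence:
    the fraction of samples of a damage parameter that indicate
    stiffness loss beyond a small threshold."""
    return np.mean(np.asarray(theta_samples) > threshold)

rng = np.random.default_rng(1)
# Assumed: model updating repeated over noisy modal data yields a sample
# of the damage ratio for one element (drawn synthetically here).
damage_ratio = rng.normal(loc=0.12, scale=0.05, size=5000)
print("PDE ~ %.2f" % probability_of_damage(damage_ratio, threshold=0.05))
```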

Experimental Investigation on the Effects of Electroless Nickel Phosphorus Deposition, pH and Temperature with the Varying Coating Bath Parameters on Impact Energy by Taguchi Method

This paper discusses the effects of sodium hypophosphite concentration, pH, and temperature on deposition rate. It also evaluates coating strength, surface, and subsurface properties by varying the bath parameters: the percentage of phosphorus, the plating temperature, and the pH of the plating solution. The Taguchi technique has been used for the analysis. In the experiment, nickel chloride was used as the nickel source, sodium hypophosphite as the reducing agent and source of phosphorus, and sodium hydroxide to vary the pH of the coating bath. The coated samples were tested for impact energy by conducting an impact test. The effects of the coating bath parameters on the absorbed impact energy were then plotted and analysed, and the percentage contribution of the bath parameters was quantified using the Design of Experiments (DoE) approach. It is concluded that the bath parameters of the Ni-P coating clearly influence the strength of the specimen.
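For a larger-is-better response such as absorbed impact energy, a Taguchi analysis rests on the signal-to-noise ratio S/N = −10·log10(mean(1/y²)). A minimal sketch with assumed replicate values (the bath settings and energies below are illustrative, not the paper's data):

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi signal-to-noise ratio for a larger-is-better response,
    e.g. absorbed impact energy: S/N = -10 * log10(mean(1 / y^2))."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Assumed impact-energy replicates (J) for three hypothetical bath settings
trials = {
    "low pH / 85 C":  [4.1, 4.3, 3.9],
    "mid pH / 88 C":  [5.2, 5.0, 5.4],
    "high pH / 91 C": [4.8, 4.6, 4.9],
}
for name, ys in trials.items():
    print(f"{name}: S/N = {sn_larger_is_better(ys):.2f} dB")
```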

Design and Application of NFC-Based Identity and Access Management in Cloud Services

In response to a changing world and the rapid growth of the Internet, more and more enterprises are replacing web-based services with cloud-based ones. Multi-tenancy technology is becoming increasingly important, especially for Software as a Service (SaaS), which in turn leads to a greater focus on the application of Identity and Access Management (IAM). Conventional Near-Field Communication (NFC) based verification relies on a computer browser and a card reader to access an NFC tag; this type of verification supports neither mobile device login nor user-based access management functions. This study designs an NFC-based third-party cloud identity and access management scheme (NFC-IAM) that addresses this shortcoming. Data from simulation tests analyzed with Key Performance Indicators (KPIs) suggest that the NFC-IAM not only takes less time for identity verification but also cuts two-factor authentication time by 80% and improves verification accuracy to 99.9% or better. In functional performance analyses, the NFC-IAM performed better in scalability and portability. The NFC-IAM app and back-end system, developed and deployed on mobile devices, support IAM features while offering users a more user-friendly experience and stronger security protection. In the future, the NFC-IAM can be employed in different environments, including identification for mobile payment systems and permission management for remote equipment monitoring, among other applications.
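The abstract does not give the protocol details; the sketch below shows one generic way to combine an NFC tag identifier with a time-based one-time password (RFC 6238 TOTP) as a second factor. The registry, UID format, and pairing are hypothetical and not the paper's scheme.

```python
import hmac, hashlib, struct, time

def totp(secret: bytes, period: int = 30, digits: int = 6, t=None) -> str:
    """Minimal RFC 6238 TOTP, used here as the second factor."""
    counter = int((t if t is not None else time.time()) // period)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return f"{code:0{digits}d}"

# Hypothetical registry mapping NFC tag UIDs to user shared secrets
REGISTRY = {"04:A2:3B:11:90:5D:80": b"user-shared-secret"}

def verify(tag_uid: str, submitted_code: str) -> bool:
    """Two-factor check: known NFC tag UID (factor 1) plus TOTP (factor 2)."""
    secret = REGISTRY.get(tag_uid)
    return secret is not None and hmac.compare_digest(totp(secret), submitted_code)
```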

Role of ICT and Wage Inequality in Organization

This study deals with wage inequality in organizations and shows the relationship between ICT and wages in an organization. To do so, we incorporate ICT factors in the organization into our model: the efficiencies of Enterprise Resource Planning (ERP), Computer Assisted Design/Computer Assisted Manufacturing (CAD/CAM), and NETWORK. Improvements in these ICT factors decrease the learning cost of solving problems within the organizational hierarchy. The improvement of NETWORK increases wage inequality among workers and decreases it among managers and entrepreneurs. The improvements of CAD/CAM and ERP increase wage inequality within all agent types, and partially increase it between agents at different levels of the hierarchy.

Ontology-Driven Generation of Radiation Protection Procedures

In this article, we present the principles and a suitable methodology for the design of a medical ontology that captures the radiological and dosimetric knowledge applied in diagnostic radiology and radiation therapy. Our ontology, named «Onto.Rap», addresses radiation protection in medical and radiology centers by providing standardized regulatory oversight. Thanks to the added value of knowledge sharing, reuse, and ease of maintenance, this ontology helps solve several problems, among them the confusion between radiological procedures that a practitioner may face while performing a patient's radiological exam, and the difficulty of interpreting the applicable patient radioprotection standards. Thanks to its concept simplification and expressiveness, the ontology can ensure an efficient classification of radiological procedures and provides an explicit representation of the relations between the different components of the studied concepts. An ontology-based radioprotection expert system used in a radiological center could thus implement systematic radioprotection best practices during patient exams and a regulatory compliance auditing service afterwards.
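A small hypothetical fragment, written with the owlready2 library, suggests how such concepts and relations might be declared; the class and property names are illustrative and not taken from Onto.Rap itself.

```python
from owlready2 import get_ontology, Thing, ObjectProperty

# Hypothetical fragment inspired by the described concepts; the IRI,
# classes, and properties are assumptions for illustration.
onto = get_ontology("http://example.org/onto-rap.owl")

with onto:
    class RadiologicalProcedure(Thing): pass
    class AnatomicalRegion(Thing): pass
    class RadioprotectionRule(Thing): pass

    class examines(ObjectProperty):
        domain = [RadiologicalProcedure]
        range = [AnatomicalRegion]

    class constrainedBy(ObjectProperty):
        domain = [RadiologicalProcedure]
        range = [RadioprotectionRule]

chest_ct = RadiologicalProcedure("ChestCT")
chest_ct.examines = [AnatomicalRegion("Thorax")]
```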

Building a Scalable Telemetry Based Multiclass Predictive Maintenance Model in R

Many organizations face the challenge of how to analyze and build machine learning models using their sensitive telemetry data. In this paper, we discuss how users can leverage the power of R without having to move their big data around, as well as a cloud-based solution for organizations willing to host their data in the cloud. By using ScaleR technology to benefit from parallelization and remote computing, or R Services on premises or in the cloud, users can apply R at scale while keeping their data in place.

Material and Parameter Analysis of the PolyJet Process for Mold Making Using Design of Experiments

Since additive manufacturing technologies are constantly advancing, the use of this technology in mold making seems reasonable. Many manufacturers of additive manufacturing machines, however, do not offer any suggestions on how to parameterize the machine to achieve optimal results for mold making. The purpose of this research is to determine the interdependencies of different materials and parameters within the PolyJet process by using design of experiments (DoE), in order to additively manufacture molds, e.g. for thermoforming and injection molding applications. To this end, the general requirements of thermoforming molds, such as heat resistance, surface quality and hardness, were identified. Then, different materials and parameters of the PolyJet process, such as the orientation of the printed part, the layer thickness, the printing mode (matte or glossy), the distance between printed parts and the scaling of parts, were examined. The multifactorial analysis covers the following properties of the printed samples: tensile strength, tensile modulus, bending strength, elongation at break, surface quality, heat deflection temperature and surface hardness. The key objective of this research is that, by joining the results from the DoE with the requirements of mold making, optimal and tailored molds can be additively manufactured with the PolyJet process. These molds can then be used in prototyping processes, in process testing and in small to medium batch production.
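A full-factorial design over such factors can be enumerated directly; a minimal sketch with assumed factor levels follows (the levels are illustrative, and a fractional or Taguchi design would reduce the run count):

```python
from itertools import product

# Assumed PolyJet factors and levels, following those named in the abstract
factors = {
    "orientation": ["x", "y", "z"],
    "layer_um":    [16, 28],
    "finish":      ["matte", "glossy"],
    "spacing_mm":  [1, 5],
}

# Every combination of levels: 3 * 2 * 2 * 2 = 24 full-factorial runs
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs), "runs")
for run in runs[:3]:
    print(run)
```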

Simulating Human Behavior in (Un)Built Environments: Using an Actor Profiling Method

This paper addresses the shortcomings of architectural computation tools in representing human behavior in built environments, prior to construction and occupancy of those environments. Evaluating whether a design fits the needs of its future users is currently done solely post-construction, or is based on the knowledge and intuition of the designer. This issue is of high importance when designing complex buildings such as hospitals, where the quality of treatment as well as patient and staff satisfaction are of major concern. Existing computational pre-occupancy human behavior evaluation methods are geared mainly to testing ergonomic issues, such as wheelchair accessibility, emergency egress, etc. As such, they rely on Agent Based Modeling (ABM) techniques, which emphasize the individual user. Yet we know that most human activities are social and involve a number of actors working together, which ABM methods cannot handle. Therefore, we present an event-based model that manages the interaction between multiple Actors, Spaces, and Activities, to describe dynamically how people use spaces. This approach requires expanding the computational representation of Actors beyond their physical description, to include psychological, social, cultural, and other parameters. The model presented in this paper includes cognitive abilities and rules that describe the response of actors to their physical and social surroundings, based on the actors' internal status. The model has been applied in a simulation of hospital wards, and showed adaptability to a wide variety of situated behaviors and interactions.
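A minimal sketch of the event-based idea, with assumed class and field names: events bind several actors to a space and an activity and are dispatched in time order, in contrast to single-agent ABM stepping. This is an illustration, not the paper's model.

```python
import heapq
from dataclasses import dataclass, field

@dataclass
class Actor:
    name: str
    role: str
    stress: float = 0.0            # example internal-status parameter

@dataclass(order=True)
class Event:
    time: float
    activity: str = field(compare=False)
    actors: list = field(compare=False)
    space: str = field(compare=False)

def run(events):
    """Pop events in time order; each activity binds several actors
    to a space, rather than stepping one agent at a time."""
    heapq.heapify(events)
    while events:
        e = heapq.heappop(events)
        names = ", ".join(a.name for a in e.actors)
        print(f"t={e.time:4.1f}  {e.activity} in {e.space}: {names}")

nurse, doctor = Actor("N1", "nurse"), Actor("D1", "doctor")
run([Event(2.0, "ward round", [doctor, nurse], "Room 12"),
     Event(0.5, "handover", [nurse], "Nurse station")])
```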

Towards a Proof Acceptance by Overcoming Challenges in Collecting Digital Evidence

Cybercrime investigation demands an appropriate evidence collection mechanism. If the investigator does not acquire digital proofs in a forensically sound manner, important information can be lost, and judges can discard case evidence because the acquisition was inadequate. Correct digital forensic seizure involves preparing professionals from the fields of law, police work, and computer science. This paper presents important challenges faced during evidence collection across different types of crime scenes, which can be virtual or real, and where technical obstacles and privacy concerns must be considered. The challenges highlighted here point to the precautions to be taken in digital evidence collection, and the suggested procedures contribute to best practices in the digital forensics field.
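One widely used precaution behind "forensically sound" acquisition, shown below as a minimal sketch, is to hash the acquired image at seizure time and re-verify the hash before analysis. This illustrates general practice rather than the paper's specific procedures; the file path is hypothetical.

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash an evidence image in chunks so large files fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# The hash recorded at acquisition time is re-checked before analysis;
# a mismatch means the copy can no longer be presented as faithful.
acquisition_hash = sha256_of("evidence.dd")
assert acquisition_hash == sha256_of("evidence.dd"), "image altered"
```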

Application of Data Mining Techniques for Tourism Knowledge Discovery

Five implementations of three data mining classification techniques were applied experimentally to extract important insights from tourism data. The aim was to find the best-performing algorithm among those compared for tourism knowledge discovery. The knowledge discovery in data process was used as the process model, and 10-fold cross-validation was used for testing. Various data preprocessing activities were performed to obtain the final dataset for model building. Classification models for the selected algorithms were built under different scenarios on the preprocessed dataset. The best-performing algorithm on the tourism dataset was Random Forest (76%) before applying information-gain-based attribute selection, and J48 (C4.5) (75%) after selecting the attributes most relevant to the class (target) attribute. In terms of model-building time, attribute selection improved the efficiency of all algorithms, with the Artificial Neural Network (multilayer perceptron) showing the highest improvement (90%). The rules extracted from the decision tree model are presented; they reveal intricate, non-trivial knowledge that would otherwise not be discovered by simple statistical analysis.
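A minimal sketch of such a 10-fold cross-validated comparison using scikit-learn: the iris dataset stands in for the tourism data, and scikit-learn's CART decision tree substitutes for Weka's J48 (C4.5).

```python
from sklearn.datasets import load_iris           # stand-in for the tourism data
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
models = {
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "CART (J48-like)": DecisionTreeClassifier(random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10)  # 10-fold CV as in the paper
    print(f"{name}: {scores.mean():.2%} +/- {scores.std():.2%}")
```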

Mentoring in the Professional Development of University Teachers

Mentoring is provided by professionals with a higher level of experience and competence as part of the professional development of university faculty. This paper explores the characteristics of the mentoring provided by teachers participating in an active methodology program run at the University of the Basque Country, examining and analyzing mentors' performance with the aim of providing empirical evidence on its value as a lifelong learning strategy for teaching staff. A total of 183 teachers were trained during the first three programs. The analysis method uses a coding technique and is based on flexible, systematic guidelines for gathering and analyzing qualitative data. The results confirm the conception of mentoring as a methodological innovation in higher education. In short, university teachers in general assessed the mentoring they received positively, considering it a valid, useful strategy in their professional development. They highlighted the methodological expertise of their mentors and underscored how the mentors monitored the learning process of the active method and provided guidance and advice when necessary. They also drew attention to traits such as availability, personal commitment, and flexibility. However, a minority criticized some aspects of the performance of some mentors.

Vulnerability of Indian Agriculture to Climate Change: A Study of the Himalayan Region State

Climate variability and change are emerging challenges for Indian agriculture, which must ensure national food security for a growing population. A study was conducted to assess climate change effects in medium- to low-altitude areas of the Himalayan region, relating changes in land use and cereal crop productivity to various climatic parameters. Rainfall and temperature changes from 1951 to 2013 were studied at four locations of varying altitude, namely Hardwar, Rudra Prayag, Uttar Kashi and Tehri Garwal. A noticeable increase in temperature was observed at all four locations. Surprisingly, the mean rainfall intensity for 30-minute durations has increased at a rate of 0.1 mm/hour since 2000. The study shows that the combined effect of increasing temperature, rainfall, runoff and urbanization in the mid-Himalayan region is causing an increase in climatic disasters and changes in agricultural patterns. Noticeable changes in cropping patterns, crop productivity and land use were observed. Appropriate adaptation and mitigation strategies are necessary to ensure sustainable and climate-resilient agriculture, and appropriate information is needed by farmers, as well as planners and decision makers, for developing, disseminating and adopting climate-smart technologies.
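Trends of this kind are commonly quantified by a least-squares fit to the annual series; a minimal sketch on synthetic data standing in for the station records (the values below are assumptions, not the study's observations):

```python
import numpy as np
from scipy.stats import linregress

# Synthetic annual mean temperatures (deg C) for one station, 1951-2013,
# standing in for the observed series analysed in the study.
rng = np.random.default_rng(42)
years = np.arange(1951, 2014)
temps = 14.0 + 0.015 * (years - 1951) + rng.normal(0, 0.3, years.size)

fit = linregress(years, temps)   # slope gives the warming trend per year
print(f"trend = {fit.slope * 10:.2f} degC/decade, p = {fit.pvalue:.3f}")
```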

Experimental and Graphical Investigation on Oil Recovery by Buckley-Leverett Theory

Recently increasing oil production from petroleum reservoirs is one of the most important issues in the global energy sector. So, in this paper, the recovery of oil by the waterflooding technique from petroleum reservoir are considered. To investigate the aforementioned phenomena, the relative permeability of two immiscible fluids in sand is measured in the laboratory based on the steady-state method. Two sorts of oils, kerosene and heavy oil, and water are pumped simultaneously into a vertical sand column with different pumping ratio. From the change in fractional discharge measured at the outlet, a method for determining the relative permeability is developed focusing on the displacement mechanism in sand. Then, displacement mechanism of two immiscible fluids in the sand is investigated under the Buckley-Leveret frontal displacement theory and laboratory experiment. Two sorts of experiments, one is the displacement of pore water by oil, the other is the displacement of pore oil by water, are carried out. It is revealed that the relative permeability curves display tolerably different shape owing to the properties of oils, and produce different amount of residual oils and irreducible water saturation.