Very Large Scale Integration Architecture of Finite Impulse Response Filter Implementation Using Retiming Technique

Recursive combination of an algorithm based on Karatsuba multiplication is exploited to design a generalized transpose and parallel Finite Impulse Response (FIR) filter. Mid-range Karatsuba multiplication and carry-save-adder-based Karatsuba multiplication reduce the time complexity of higher-order multiplication, implemented up to n bits. As a result, we design a modified N-tap transpose and parallel symmetric FIR filter structure using the Karatsuba algorithm. The mathematical formulation of the FFA filter is derived. The proposed architecture involves a significantly smaller area-delay product (ADP) than the existing block implementation. By adopting the retiming technique, the hardware cost is reduced further. The filter architecture is designed using a 90 nm technology library and implemented using the Cadence EDA tool. The synthesis results show better performance for different word lengths and block sizes. The design achieves switching activity reduction and low power consumption when applied with and without retiming to different combinations of the circuit. The proposed structure achieves more than 50% power reduction, both with and without retiming, compared to the earlier design structure. As a proof of concept, for block size 16 and filter length 64, the CKA method achieves 51% and 70% less power by applying the retiming technique, and the CSA method achieves 57% and 77% less power by applying the retiming technique, compared to the previously proposed design.
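
To illustrate the recursion that the architecture exploits, a minimal software sketch of Karatsuba multiplication is given below, assuming a simple high/low operand split; the mapping of this recursion onto mid-range multipliers, carry-save adders, and the retimed FIR datapath is not reproduced here.

```python
# Minimal sketch of recursive Karatsuba multiplication (software model only);
# the hardware mapping used in the paper is not shown here.

def karatsuba(x: int, y: int, n: int) -> int:
    """Multiply two non-negative integers of nominal width n bits using three half-size products."""
    if n <= 8:                                 # base case: ordinary multiplication
        return x * y
    m = n // 2
    xh, xl = x >> m, x & ((1 << m) - 1)        # split operands into high/low halves
    yh, yl = y >> m, y & ((1 << m) - 1)
    p_hh = karatsuba(xh, yh, n - m)            # high * high
    p_ll = karatsuba(xl, yl, m)                # low * low
    p_mid = karatsuba(xh + xl, yh + yl, m + 1) # (xh + xl) * (yh + yl)
    # Combine: x*y = p_hh*2^(2m) + (p_mid - p_hh - p_ll)*2^m + p_ll
    return (p_hh << (2 * m)) + ((p_mid - p_hh - p_ll) << m) + p_ll

assert karatsuba(0xBEEF, 0xCAFE, 16) == 0xBEEF * 0xCAFE
```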

Patient Support Program in Pharmacovigilance: Foster Patient Confidence and Compliance

Pharmaceutical companies are increasingly inclined towards patient support programs (PSPs), which assist patients and/or healthcare professionals (HCPs) in better disease management and cost-effective treatment. The ultimate objective of these programs is patient care. PSPs may include financial assistance to patients, medicine compliance programs, access to HCPs via phone or online chat centers, etc. PSPs play a crucial role in customer acquisition and retention strategies. During the conduct of these programs, the Marketing Authorisation Holder (MAH) may receive information related to the concerned medicinal products, usually reported by patients or involved HCPs. This information may include suspected adverse reaction(s) during/after administration of the medicinal products. Hence, the MAH should design PSPs to comply with regulatory reporting requirements and avoid non-compliance during pharmacovigilance (PV) inspections. The emergence of wireless health devices is lowering the burden on patients to enter safety data manually and is creating a meaningful way for patients to observe major swings in drug safety. Therefore, to enhance the adoption of these programs, the MAH not only needs to make patients aware of the advantages of the program, but also to recognize the value of patients' time and to honor commitments in a constructive manner. It is indispensable that strengthening public health is considered the topmost priority in such programs, and that the MAH is compliant with PV requirements along with regulatory obligations.

Conceptualizing the Knowledge to Manage and Utilize Data Assets in the Context of Digitization: Case Studies of Multinational Industrial Enterprises

The trend of digitization significantly changes the role of data for enterprises. Data turn from an enabler into an intangible organizational asset that requires management and qualifies as a tradeable good. The idea of a networked economy has gained momentum in the data domain as collaborative approaches to data management emerge. Traditional organizational knowledge consequently needs to be extended by comprehensive knowledge about data. This knowledge about data is vital for organizations to ensure that data quality requirements are met and that data can be effectively utilized and sovereignly governed. As academics have so far paid little attention to this specific knowledge, the aim of the research presented in this paper is to conceptualize it by proposing a “data knowledge model”. Relevant model entities have been identified based on a design science research (DSR) approach that iteratively integrates insights from various industry case studies and from the literature.

Synchronization of Traveling Waves within a Hollow-Core Vortex

The present paper expands on the details of, and confirms, the transition mechanism between two subsequent polygonal patterns of the hollow-core vortex. Using power spectral analysis, we confirm in this work that the transition from any N-gon to (N+1)-gon pattern observed within a hollow-core vortex of shallow rotating flows occurs in two steps. The regime is quasi-periodic before the frequencies lock (synchronization). The ratios of the locking frequencies were found to be equal to (N-1)/N.
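
As an illustration of how a locking ratio can be read off a power spectrum, the hedged sketch below picks the two dominant peaks of a synthetic two-frequency signal and reports their ratio; the sampling rate, signal, and peak selection are illustrative assumptions, not the paper's actual measurement pipeline.

```python
# Hedged sketch: checking a frequency ratio between two dominant spectral peaks.
import numpy as np

fs = 200.0                                   # assumed sampling rate [Hz]
t = np.arange(0, 60, 1 / fs)
# synthetic two-frequency signal standing in for a probe record
f1, f2 = 3.0, 4.0                            # e.g. (N-1)/N = 3/4 for N = 4
signal = np.sin(2 * np.pi * f1 * t) + 0.5 * np.sin(2 * np.pi * f2 * t)

spectrum = np.abs(np.fft.rfft(signal)) ** 2  # periodogram (power spectrum)
freqs = np.fft.rfftfreq(signal.size, 1 / fs)

# pick the two strongest peaks (ignoring the DC bin) and report their ratio
order = np.argsort(spectrum[1:])[::-1] + 1
fa, fb = sorted(freqs[order[:2]])
print(f"dominant frequencies: {fa:.2f} Hz, {fb:.2f} Hz, ratio = {fa / fb:.3f}")
```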

Preparation of n-type Bi2Te3 Films by Electrophoretic Deposition

A high-quality, crack-free film of Bi2Te3 has been deposited for the first time using electrophoretic deposition (EPD), and the microstructures of various films have been investigated. One of the most important applications of thermoelectric (TE) Bi2Te3 is the manufacture of TE generators (TEGs), which can convert waste heat into electricity, addressing the global warming issue. However, the high cost of the manufacturing process keeps TEGs expensive and out of reach for commercialization. Therefore, utilizing EPD as a simple and cost-effective method will open new opportunities for TEG commercialization. This method has recently been used for advanced applications such as microelectronics and has attracted a lot of attention from both scientists and industry. In this study, the effect of the suspension medium on the quality of the deposited films, as well as on their microstructure, has been investigated. In summary, finding an appropriate suspension is a critical step for a successful EPD process and has an important effect on both the film's quality and its future properties.

Electrophoretic Deposition of p-Type Bi2Te3 for Thermoelectric Applications

Electrophoretic deposition (EPD) of p-type Bi2Te3 material has been accomplished, and a high-quality, crack-free thick film has been achieved for thermoelectric (TE) applications. TE generators (TEGs) can convert waste heat into electricity, which can potentially help mitigate global warming. However, TEGs are expensive due to the high cost of materials as well as the complex and expensive manufacturing process. EPD is a simple and cost-effective method which has recently been used for advanced applications. In EPD, when a DC electric field is applied to charged powder particles suspended in a suspension, they are attracted to and deposited on the oppositely charged substrate. In this study, it has been shown that it is possible to prepare a TE film using the EPD method and potentially achieve high TE properties at low cost. The relationship between the deposition weight and the EPD process parameters, such as applied voltage and time, has been investigated, and a linear dependence has been observed, which is in good agreement with the theoretical principles of EPD. A stable EPD suspension of p-type Bi2Te3 was prepared in an acetone-ethanol mixture with triethanolamine as a stabilizer. To achieve a high-quality homogeneous film on a copper substrate, the optimum voltage and time of the EPD process were investigated. The morphology and microstructure of the green deposited films have been investigated using a scanning electron microscope (SEM). The green Bi2Te3 films have shown good adhesion to the substrate. In summary, this study has shown that not only is EPD of p-type Bi2Te3 possible, but the resulting thick film is also of high quality for TE applications.
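
The linear dependence of deposit weight on voltage and time noted above is commonly modeled by the Hamaker relation; the sketch below is a minimal illustration of that relation, with all numerical values chosen as placeholders rather than taken from this study.

```python
# Hedged sketch: the Hamaker relation m = f * mu * E * A * c * t, often used to
# describe the linear deposit-mass trend in EPD. Values below are illustrative.

def hamaker_deposit_mass(mobility_m2_Vs, field_V_m, area_m2,
                         conc_kg_m3, time_s, efficiency=1.0):
    """Deposited mass [kg] predicted by the Hamaker relation."""
    return efficiency * mobility_m2_Vs * field_V_m * area_m2 * conc_kg_m3 * time_s

# example: 30 V across a 1 cm gap, 1 cm^2 electrode, 10 g/L suspension, 5 min
m = hamaker_deposit_mass(mobility_m2_Vs=1e-8,   # placeholder electrophoretic mobility
                         field_V_m=30 / 0.01,
                         area_m2=1e-4,
                         conc_kg_m3=10.0,
                         time_s=300)
print(f"predicted deposit: {m * 1e6:.1f} mg")
```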

Rheological and Computational Analysis of Crude Oil Transportation

Transportation of unrefined crude oil from the production unit to a refinery or large storage area by pipeline is difficult due to the different properties of crude in various areas. Thus, the design of a crude oil pipeline is a very complex and time-consuming process when all the various parameters are considered. Three very important parameters play a significant role in transportation and processing pipeline design: the viscosity profile, the temperature profile, and the velocity profile of waxy crude oil through the pipeline. Knowledge of rheological computational techniques is required to better understand the flow behavior and predict the flow profile in a crude oil pipeline. From these profiles, the material and the emulsion best suited for crude oil transportation can be predicted. The rheological computational fluid dynamics (CFD) technique is a fast method for designing the flow profile in a crude oil pipeline with the help of CFD and rheological modeling. With this technique, the effect of fluid properties, including the shear rate range with temperature variation, degree of viscosity, elastic modulus, and viscous modulus, was evaluated under different conditions in a transport pipeline. In this paper, two crude oil samples were used, as well as emulsions prepared with natural and synthetic additives at concentrations ranging from 1,000 ppm to 3,000 ppm. The rheological properties were then evaluated over a temperature range of 25 to 60 °C, and the additive best suited for the transportation of crude oil was determined. Commercial CFD software was used to generate the flow, velocity, and viscosity profiles of the emulsions for flow behavior analysis in a crude oil transportation pipeline. This rheological CFD design can be further applied in developing pipeline designs in the future.
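
As a hedged illustration of the kind of rheological modeling step that precedes the CFD analysis, the sketch below fits a power-law (Ostwald-de Waele) viscosity model to synthetic shear-rate data; the actual samples, additives, and model choice of the paper are not reproduced.

```python
# Hedged sketch: fitting eta = K * gamma_dot**(n-1) to viscosity vs. shear-rate data
# before feeding a viscosity model into CFD. The data below are synthetic placeholders.
import numpy as np

shear_rate = np.array([1, 5, 10, 50, 100, 300])           # [1/s]
viscosity = np.array([2.0, 1.1, 0.85, 0.45, 0.35, 0.22])  # [Pa.s], illustrative

# log-linear least squares: log(eta) = log(K) + (n - 1) * log(gamma_dot)
slope, intercept = np.polyfit(np.log(shear_rate), np.log(viscosity), 1)
n = slope + 1.0          # flow-behavior index (< 1 means shear thinning)
K = np.exp(intercept)    # consistency index [Pa.s^n]
print(f"power-law fit: K = {K:.2f} Pa.s^n, n = {n:.2f}")
```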

The Evolving Customer Experience Management Landscape: A Case Study on the Paper Machine Companies

Customer experience is increasingly the differentiator between successful companies and those that struggle. Customer experiences are becoming more dynamic, and they advance with each interaction between the company and a customer. Every customer conversation, and any effort to evolve these conversations, is beneficial and should ultimately result in a positive customer experience. The aim of this paper is to analyze the evolving customer experience management landscape and the relevant challenges and opportunities. A case study on “paper machine” companies is chosen. Hence, this paper analyzes the challenges and opportunities in the customer experience management of paper machine companies for the case of the “road to steel”. The road to steel shows the journey of steel from raw material to end product (i.e., the paper machine in this paper). ALPHA (a steel company) and BETA (a paper machine company) are chosen, and their efforts to evolve customer experiences are investigated. Semi-structured interviews are conducted with experts in those companies to identify the challenges and opportunities of the evolving customer experience management from their point of view. The findings of this paper contribute to theory and business practice in the realm of the evolving customer experience management landscape.

Spatial Analysis of Park and Ride Users’ Dynamic Accessibility to Train Station: A Case Study in Perth

Accessibility analysis, examining people’s ability to access facilities and destinations, is a fundamental assessment for transport planning, policy making, and social exclusion research. Dynamic accessibility, which measures accessibility in a real-time traffic environment, has become an advanced accessibility indicator in transport research. It is also a useful indicator that helps travelers understand daily travel time variability, assists traffic engineers in monitoring traffic congestion, and ultimately supports the development of effective congestion mitigation strategies. This research incorporated real-time traffic information by collecting travel time data at 15-minute intervals via the TomTom® API. A framework for measuring dynamic accessibility was then developed based on gravity theory and accessibility dichotomy theory through space and time interpolation. Finally, dynamic accessibility can be derived at any given time and location under the proposed spatial analysis framework.
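
The sketch below illustrates, under assumed names and values, how a gravity-type accessibility score can be computed from time-dependent travel times such as those collected at 15-minute intervals; it is not the paper's exact formulation.

```python
# Hedged sketch of a gravity-type accessibility score A(tau) = sum_j O_j * exp(-beta * t_j(tau)).
# Station names, opportunity counts, travel times, and beta are illustrative assumptions.
import math

opportunities = {"StationA": 1200, "StationB": 800}   # e.g. parking bays or services
travel_time_min = {                                   # travel times at two times of day
    "08:00": {"StationA": 22.0, "StationB": 35.0},
    "12:00": {"StationA": 14.0, "StationB": 21.0},
}
beta = 0.08                                           # impedance decay parameter

def accessibility(times_to_stations, beta):
    """Gravity-based accessibility for one origin at one time of day."""
    return sum(opportunities[s] * math.exp(-beta * t)
               for s, t in times_to_stations.items())

for tau, times in travel_time_min.items():
    print(tau, round(accessibility(times, beta), 1))
```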

A Study on Vulnerability of Alahsa Governorate to Generate Urban Heat Islands

The purpose of this study is to investigate the status of Alahsa Governorate and its vulnerability to generating urban heat islands. Alahsa Governorate is a famous oasis in the Arabian Peninsula and includes several oil centers. An extensive literature review was conducted to collect previous relevant data on the urban heat island of Alahsa Governorate. The data used for this research were collected from the authorized bodies that control the weather station networks over Alahsa Governorate, Eastern Province, Saudi Arabia. Although the number of weather stations within the region is very limited, and analysis using GIS software and its techniques is therefore difficult and limited, the data analyzed confirm an increase in temperature of more than 2 °C from 2004 to 2014. Such an increase is considerable whenever human health and comfort are of concern. This increase in temperature within one decade confirms the presence of urban heat islands. The study concludes that Alahsa Governorate is vulnerable to creating urban heat islands, and more attention should be paid to the strategic planning of the governorate, which is developing at a high pace with considerably increasing levels of urbanization.

VISMA: A Method for System Analysis in Early Lifecycle Phases

The choice of applicable analysis methods in safety or systems engineering depends on the depth of knowledge about a system, and on the respective lifecycle phase. However, the analysis method chain still shows gaps, as it should support system analysis during the lifecycle of a system from a rough concept in the pre-project phase until end-of-life. This paper’s goal is to discuss an analysis method, the VISSE Shell Model Analysis (VISMA) method, which aims at closing the gap in the early system lifecycle phases, like the conceptual or pre-project phase, or the project start phase. It was originally developed to aid in the definition of the system boundary of electronic system parts, such as a control unit for a pump motor. Furthermore, it can also be applied to non-electronic system parts. The VISMA method is a graphical, sketch-like method that stratifies a system and its parts into inner and outer shells, like the layers of an onion. It analyses a system in a two-step approach, from the innermost to the outermost components, followed by the reverse direction. To ensure a complete view of a system and its environment, the VISMA should be performed by (multifunctional) development teams. To introduce the method, a set of rules and guidelines has been defined in order to enable a proper shell build-up. In the first step, the innermost system, named the system under consideration (SUC), is selected; it is the focus of the subsequent analysis. Then, its directly adjacent components, responsible for providing input to and receiving output from the SUC, are identified. These components form the content of the first shell around the SUC. Next, the input and output components of the components in the first shell are identified and form the second shell around the first one. Continuing this way, shell after shell is added with its respective parts until the border of the complete system (external border) is reached. Last, two external shells are added to complete the system view: the environment shell and the use case shell. This system view is also stored for future use. In the second step, the shells are examined in the reverse direction (outside to inside) in order to remove superfluous components or subsystems. Input chains to the SUC, as well as output chains from the SUC, are described graphically via arrows to highlight functional chains through the system. As a result, this method offers a clear, graphical description and overview of a system, its main parts, and its environment; however, the focus still remains on a specific SUC. It helps to identify the interfaces and interfacing components of the SUC, as well as important external interfaces of the overall system. It supports the identification of the first internal and external hazard causes and causal chains. Additionally, the method promotes a holistic picture and cross-functional understanding of a system, its contributing parts, internal relationships, and possible dangers within a multidisciplinary development team.
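
As a hedged illustration only, the sketch below shows one possible minimal data structure for the shell build-up described above (step 1, growing shells outwards from the SUC); the component names are invented, and this is not tooling from the VISSE project.

```python
# Hedged sketch: a minimal shell-model container for the outward build-up (step 1).
from dataclasses import dataclass, field

@dataclass
class ShellModel:
    suc: str                                     # system under consideration
    shells: list = field(default_factory=list)   # shells[0] = first shell around the SUC

    def add_shell(self, components):
        """Append the next outer shell (inputs/outputs of the previous shell)."""
        self.shells.append(list(components))

model = ShellModel(suc="pump motor control unit")
model.add_shell(["motor driver", "speed sensor", "CAN transceiver"])  # 1st shell
model.add_shell(["pump motor", "vehicle CAN bus", "power supply"])    # 2nd shell
model.add_shell(["environment", "use cases"])                         # external shells

# step 2 (outside-in pruning) would walk model.shells in reverse and drop
# components without an input/output chain to the SUC.
for i, shell in enumerate(model.shells, start=1):
    print(f"shell {i}: {', '.join(shell)}")
```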

Effect of Oxytocin on Cytosolic Calcium Concentration of Alpha and Beta Cells in Pancreas

Oxytocin is a nine-amino-acid peptide synthesized in the paraventricular nucleus (PVN) and supraoptic nucleus (SON) of the hypothalamus. Oxytocin promotes contraction of the uterus during birth and milk ejection during breast feeding. Although oxytocin receptors are found predominantly in the breasts and uterus of females, many tissues and organs express oxytocin receptors, including the pituitary, heart, kidney, thymus, vascular endothelium, adipocytes, osteoblasts, adrenal gland, pancreatic islets, and many cell lines. In pancreatic islets, oxytocin receptors are expressed in both α-cells and β-cells, with stronger expression in α-cells. However, to our knowledge there are no reports yet on the effect of oxytocin on the cytosolic calcium response of α-cells and β-cells. This study aims to investigate the effect of oxytocin on α-cells and β-cells and its oscillation pattern. Islets of Langerhans from wild-type mice were isolated by collagenase digestion. Isolated and dissociated single cells, either α-cells or β-cells, on coverslips were mounted in an open chamber and superfused with HKRB. The cytosolic calcium concentration ([Ca2+]i) in single cells was measured by fura-2 microfluorimetry. After measurement of [Ca2+]i, α-cells were identified by subsequent immunocytochemical staining using an anti-glucagon antiserum. In β-cells, the [Ca2+]i increase in response to oxytocin was observed only under the 8.3 mM glucose condition, whereas in α-cells, an oxytocin-induced [Ca2+]i increase was observed under both 2.8 mM and 8.3 mM glucose. Oscillations were induced more frequently in β-cells than in α-cells. In conclusion, the present study demonstrates that oxytocin directly interacts with both α-cells and β-cells and induces an increase in [Ca2+]i with specific patterns.

Manufacturing of Twist-Free Surfaces by Magnetism Aided Machining Technologies

As a well-known conventional finishing process, grinding is commonly used to manufacture seal mating surfaces and bearing surfaces, but it also creates twisted surfaces. Surfaces machined by turning or grinding usually have a twist structure, which can convey lubricants like a conveyor screw. To avoid this phenomenon, special techniques or machines have to be used, for example start-stop turning, tangential turning, ultrasonic protection, or special tool geometries. All of these solutions are costly and difficult to use. In this paper, we describe a system and summarize the results of experimental research carried out mainly in the fields of Magnetic Abrasive Polishing (MAP) and Magnetic Roller Burnishing (MRB). These technologies are simple and also green, while being able to produce twist-free surfaces. During the tests, normalized C45 steel was used as the workpiece material, which was machined with conventional and Wiper-geometry turning inserts on a CNC lathe. After turning, the MAP and MRB technologies can be applied directly to reduce the twist of the surfaces. The evaluation was completed with advanced measuring and IT equipment.

Blood Glucose Measurement and Analysis: Methodology

Numerous non-invasive blood glucose measurement techniques have been developed by researchers, and near infrared (NIR) is currently a promising technique. However, there is some disagreement on the optimal wavelength range suitable to be used as the reference for the glucose substance in the blood. This paper focuses on the experimental data collection technique and the analysis methods used for the data obtained from the experiment. The selection of a suitable linear or non-linear model structure is essential in a prediction system, as the developed system needs to be sufficiently accurate.
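
The sketch below illustrates, on synthetic placeholder data, the comparison of a linear and a simple non-linear model structure for a glucose prediction system; the paper's actual wavelengths, features, and model structures are not specified here.

```python
# Hedged sketch: comparing a linear and a quadratic (non-linear) model structure
# for predicting glucose from an NIR-derived feature. The data are synthetic placeholders.
import numpy as np

absorbance = np.array([0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40])  # NIR feature (assumed)
glucose    = np.array([4.1, 4.9, 5.8, 6.9, 8.3, 9.9, 11.8])         # mmol/L (synthetic)

def rmse(y, y_hat):
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

for degree in (1, 2):                        # 1 = linear, 2 = non-linear structure
    coeffs = np.polyfit(absorbance, glucose, degree)
    prediction = np.polyval(coeffs, absorbance)
    print(f"degree {degree}: RMSE = {rmse(glucose, prediction):.3f} mmol/L")
```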

The Use of Facebook as a Social Media by Political Parties in the June 7 Election in Konya

Social media is among the most important means of communication. Social media offers individuals and groups an opportunity for participatory socialization over the internet, free of any time and place restrictions. Social media is a kind of interactive communication and bilateral social network. Various communication contents can be shared and put into mass circulation easily and quickly through social media. These sharings are not limited to individuals but also extend to groups, institutions, and other organizations. Their contents consist of any type of written message, audio, and video files. We are living in the social media era now. It is not surprising that social media, which has extensive communication facilities and massive prevalence, is used in politics. Therefore, the use of social media (Facebook) by political parties during the Turkish general elections held on June 7, 2015, has been chosen as our research subject. Four parties, namely AKP, CHP, MHP, and HDP, which have the majority of votes in Turkey and participated in the elections in Konya, have been selected for our study. Their provincial centers’ and parliamentary candidates’ use of social media (Facebook) during the last three days prior to the election has been examined and subjected to a qualitative analysis by means of content analysis.

A Method for Measurement and Evaluation of Drape of Textiles

Drape is one of the important visual characteristics of a fabric. This paper introduces an innovative method for measuring and evaluating the drape shape of a fabric. The measuring principle is based on the possibility of repeated vertical straining of the fabric. This method more accurately simulates the real behavior of the fabric in the process of draping. The method is fully automated, so the sample can be measured using any number of cycles over any time horizon. Using the present method of measurement, we are able to describe the viscoelastic behavior of the fabric.

Diagnosis of Diabetes Using Computer Methods: Soft Computing Methods for Diabetes Detection Using Iris

Complementary and Alternative Medicine (CAM) techniques are quite popular and effective for chronic diseases. Iridology is a more than 150-year-old CAM technique that analyzes the patterns, tissue weakness, color, shape, structure, etc. of the iris for disease diagnosis. The objective of this paper is to validate the use of iridology for the diagnosis of diabetes. The suggested model was applied to a systemic disease with ocular effects. Data from 200 subjects, 100 diabetic and 100 non-diabetic, were evaluated. The complete procedure was kept very simple and free from the involvement of any iridologist. From the normalized iris, the region of interest was cropped. In total, 63 features were extracted using statistical measures, texture analysis, and the two-dimensional discrete wavelet transform. A comparison of the accuracies of six different classifiers is presented. The results show 89.66% accuracy with the random forest classifier.
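
A minimal sketch of the classification stage is given below: a random forest evaluated by cross-validation over 63-dimensional feature vectors. The features here are random placeholders, so the resulting accuracy is meaningless; only the pipeline shape mirrors the description above.

```python
# Hedged sketch of the classification stage only; feature extraction from the iris
# region of interest (statistics, texture, 2-D DWT) is not reproduced here.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 63))          # 200 subjects x 63 features (placeholder)
y = np.repeat([0, 1], 100)              # 100 non-diabetic (0) and 100 diabetic (1)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"5-fold CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```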

Detection of New Attacks on Ubiquitous Services in Cloud Computing and Countermeasures

Cloud computing provides infrastructure to the enterprise through the Internet, allowing access to cloud services at any time and anywhere. This pervasive aspect of the services, the distributed nature of data, and the wide use of information make cloud computing vulnerable to intrusions that violate the security of the cloud. This requires the use of security mechanisms, such as intrusion detection systems (IDS), to detect malicious behavior in network communications and hosts. In this article, we focus on the detection of intrusions into the cloud using IDSs. Our approach is based on client authentication in the computing cloud. This technique allows the detection of abnormal use of ubiquitous services and prevents intrusions into cloud computing. It is an approach based on client authentication data. Our IDS provides intrusion detection both inside and outside the cloud computing network. It is a double protection approach: security of the user node and global security of the cloud computing network.

A Simple and Empirical Refraction Correction Method for UAV-Based Shallow-Water Photogrammetry

The aerial photogrammetry of shallow water bottoms has the potential to be an efficient high-resolution survey technique for shallow water topography, thanks to the advent of convenient UAVs and automatic image processing techniques (Structure-from-Motion (SfM) and Multi-View Stereo (MVS)). However, it suffers from a systematic overestimation of the bottom elevation due to light refraction at the air-water interface. In this study, we present an empirical method to correct for the effect of refraction after the usual SfM-MVS processing, using common software. The presented method utilizes the empirical relation between the measured true depth and the estimated apparent depth to generate an empirical correction factor. This correction factor is then used to convert the apparent water depth into a refraction-corrected (real-scale) water depth. To examine its effectiveness, we applied the method to two river sites and compared the RMS errors of the corrected bottom elevations with those obtained by three existing methods. The results show that the presented method is more effective than two of the existing methods: the method that applies no correction factor and the method that uses the refractive index of water (1.34) as the correction factor. In comparison with the remaining existing method, which adds an offset term after calculating the correction factor, the presented method performs better at Site 2 and worse at Site 1. However, we found this linear regression method to be unstable when the training data used for calibration are limited. It also suffers from a large negative bias in the correction factor when the estimated apparent water depth is affected by noise, according to our numerical experiment. Overall, the accuracy of a refraction correction method depends on various factors such as the location, image acquisition, and GPS measurement conditions. The most effective method can be selected by using statistical selection (e.g., leave-one-out cross validation).
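
The sketch below illustrates the core of such an empirical correction under assumed depth values: a single correction factor is estimated from paired apparent/true depths by least squares through the origin and then applied to the apparent depths. It is a simplified stand-in, not the paper's full workflow.

```python
# Hedged sketch: estimate a correction factor k from paired (apparent, true) depths,
# then convert apparent depths into refraction-corrected depths. Values are illustrative.
import numpy as np

apparent_depth = np.array([0.21, 0.35, 0.48, 0.62, 0.80])   # from SfM-MVS [m]
true_depth     = np.array([0.30, 0.49, 0.66, 0.85, 1.10])   # field-measured [m]

# least-squares factor through the origin: true ~= k * apparent
k = float(apparent_depth @ true_depth / (apparent_depth @ apparent_depth))
corrected = k * apparent_depth

print(f"correction factor k = {k:.3f} (cf. refractive index 1.34)")
print("RMSE after correction:",
      round(float(np.sqrt(np.mean((corrected - true_depth) ** 2))), 4), "m")
```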

A Genetic Algorithm Based Permutation and Non-Permutation Scheduling Heuristics for Finite Capacity Material Requirement Planning Problem

This paper presents genetic algorithm based permutation and non-permutation scheduling heuristics (GAPNP) to solve a multi-stage finite capacity material requirement planning (FCMRP) problem in an automotive assembly flow shop with unrelated parallel machines. In the algorithm, the sequences of orders are iteratively improved by the GA operators, whereas the required operations are scheduled based on the presented permutation and non-permutation heuristics. Finally, linear programming is applied to minimize the total cost. The presented GAPNP algorithm is evaluated using real datasets from automotive companies. The required parameters for GAPNP are carefully tuned to obtain a common parameter setting for all case studies. The results show that GAPNP significantly outperforms the benchmark algorithm, by about 30% on average.
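
As a hedged illustration of the permutation-GA component only, the sketch below evolves an order sequence to minimize total tardiness on a single machine; the multi-stage FCMRP scheduling heuristics and the final linear programming cost step are not reproduced.

```python
# Hedged sketch: a minimal permutation GA of the kind GAPNP builds on.
import random

processing = [4, 3, 7, 2, 5, 6]          # processing time per order (illustrative)
due        = [6, 5, 20, 4, 14, 18]       # due date per order (illustrative)

def total_tardiness(sequence):
    t, tardiness = 0, 0
    for job in sequence:
        t += processing[job]
        tardiness += max(0, t - due[job])
    return tardiness

def order_crossover(p1, p2):
    """OX crossover: keep a slice of p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    fill = [j for j in p2 if j not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

random.seed(1)
population = [random.sample(range(len(processing)), len(processing)) for _ in range(30)]
for _ in range(100):                                  # generations
    population.sort(key=total_tardiness)
    parents = population[:10]                         # elitist selection
    children = []
    while len(children) < 20:
        c = order_crossover(*random.sample(parents, 2))
        if random.random() < 0.2:                     # swap mutation
            i, j = random.sample(range(len(c)), 2)
            c[i], c[j] = c[j], c[i]
        children.append(c)
    population = parents + children

best = min(population, key=total_tardiness)
print("best sequence:", best, "total tardiness:", total_tardiness(best))
```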