Screening of Antagonistic/Synergistic Effect between Lactic Acid Bacteria (LAB) and Yeast Strains Isolated from Kefir

Kefir is a traditional fermented refreshing beverage known for its valuable and beneficial properties for human health. Yeast species and lactic acid bacteria (LAB) strains, together with a smaller number of acetic acid bacteria strains, live together in a natural matrix named the “kefir grain”, which is formed from various proteins and polysaccharides. Because different microbial species coexist in the slimy kefir grain, it has been thought that synergistic effects could take place between microorganisms belonging to different genera and species. In this research, yeast and LAB were isolated from kefir samples obtained from the Uludag University Food Engineering Department. The cell morphology of the isolates was screened by microscopic examination. Gram reactions of the bacterial isolates were determined by the Gram staining method, and catalase activity was also examined. After the microscopic, morphological, physical and enzymatic properties of all isolates had been observed, the isolates were grouped as LAB or yeast according to their responses to the applied examinations. As part of this research, the antagonistic/synergistic effects of the five identified LAB strains and five identified yeast strains on one another were determined individually by the disk diffusion method. The antagonistic or synergistic effect is one of the most important properties of a co-culture system in which different microorganisms live together. The synergistic effect should be promoted, whereas the antagonistic effect should be prevented, in order to provide an effective culture for kefir fermentation. The aim of this study was to determine the microbial interactions between the identified yeast and LAB strains and whether their effect on each other is antagonistic or synergistic. If a strain inhibits or retards the growth of other strains found in the kefir microflora, this indicates the presence of an antagonistic effect in the medium. Such a negative influence should be prevented, whereas microorganisms that have a synergistic effect on each other should be promoted by combining them in the kefir grain. Standardisation is the most desired property for industrial production. Each microorganism found in the microbial flora of a kefir grain should be identified individually. The members of the microbial community found in the glue-like kefir grain may then be redesigned as a starter culture, taking into account the effect of each microorganism on the others in kefir processing. The main aim of this research was to shed light on more effective production of kefir grain and to contribute to the standardisation of kefir processing in the food industry.

Patient Support Program in Pharmacovigilance: Foster Patient Confidence and Compliance

Pharmaceutical companies are increasingly inclined towards patient support programs (PSPs), which assist patients and/or healthcare professionals (HCPs) in better disease management and cost-effective treatment. The ultimate objective of these programs is patient care. PSPs may include financial assistance to patients, medicine compliance programs, access to HCPs via phone or online chat centers, etc. PSPs also play a crucial role in customer acquisition and retention strategies. During the conduct of these programs, the Marketing Authorisation Holder (MAH) may receive information related to the concerned medicinal products, usually reported by patients or the involved HCPs. This information may include suspected adverse reaction(s) during or after administration of the medicinal products. Hence, the MAH should design PSPs so as to comply with regulatory reporting requirements and avoid non-compliance during pharmacovigilance (PV) inspections. The emergence of wireless health devices is lowering the burden on patients of manually entering safety data and is providing a meaningful way for patients to observe major changes relevant to drug safety. Therefore, to enhance the adoption of these programs, the MAH not only needs to make patients aware of the advantages of the program, but also to recognize the value of patients' time and to honour the commitments made in a constructive manner. It is indispensable that strengthening public health be considered the topmost priority in such programs, and that the MAH remain compliant with PV requirements along with regulatory obligations.

Synchronization of Traveling Waves within a Hollow-Core Vortex

The present paper expands on the details of, and confirms, the transition mechanism between two subsequent polygonal patterns of the hollow-core vortex. Using power spectral analysis, we confirm in this work that the transition from any N-gon to an (N+1)-gon pattern observed within the hollow-core vortex of shallow rotating flows occurs in two steps: the regime is quasi-periodic before the frequencies lock (synchronization). The ratios of the locking frequencies were found to be equal to (N-1)/N.
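The two-step scenario can be checked numerically with a standard spectral workflow. The following minimal sketch (assuming a single probe time series and sampling rate, neither of which is specified in the abstract) estimates the two dominant peaks of a Welch power spectrum and tests whether their ratio is close to (N-1)/N:

```python
# Minimal sketch: detect frequency locking between two dominant spectral peaks.
# Assumes a 1-D time series `signal` (e.g. surface elevation at a probe) sampled
# at `fs` Hz; both names are illustrative, not from the paper.
import numpy as np
from scipy.signal import welch, find_peaks

def dominant_frequencies(signal, fs, n_peaks=2):
    f, pxx = welch(signal, fs=fs, nperseg=4096)
    peaks, _ = find_peaks(pxx, height=np.max(pxx) * 1e-3)
    strongest = peaks[np.argsort(pxx[peaks])[-n_peaks:]]   # keep strongest peaks
    return np.sort(f[strongest])

def is_locked(f_low, f_high, N, tol=0.02):
    # Locking criterion used here: the peak ratio is close to (N-1)/N.
    return abs(f_low / f_high - (N - 1) / N) < tol

# Example with a synthetic locked signal for N = 3 (ratio 2/3):
fs = 200.0
t = np.arange(0, 60, 1 / fs)
signal = np.sin(2 * np.pi * 4.0 * t) + 0.5 * np.sin(2 * np.pi * 6.0 * t)
f_low, f_high = dominant_frequencies(signal, fs)
print(is_locked(f_low, f_high, N=3))
```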

Preparation of n-type Bi2Te3 Films by Electrophoretic Deposition

A high-quality, crack-free film of Bi2Te3 has been deposited for the first time using electrophoretic deposition (EPD), and the microstructures of various films have been investigated. One of the most important applications of the thermoelectric (TE) material Bi2Te3 is the manufacture of TE generators (TEGs), which can convert waste heat into electricity and thereby target the global warming issue. However, the high cost of the manufacturing process keeps TEGs expensive and out of reach for commercialization. Therefore, utilizing EPD as a simple and cost-effective method will open new opportunities for TEG commercialization. EPD has recently been used for advanced materials such as microelectronics and has attracted a lot of attention from both scientists and industry. In this study, the effect of the suspension medium on the quality of the deposited films, as well as on their microstructure, has been investigated. In summary, finding an appropriate suspension is a critical step for a successful EPD process and has an important effect on both the film's quality and its future properties.

VISMA: A Method for System Analysis in Early Lifecycle Phases

The choice of applicable analysis methods in safety or systems engineering depends on the depth of knowledge about a system and on the respective lifecycle phase. However, the chain of analysis methods still shows gaps, as it should support system analysis throughout the lifecycle of a system, from a rough concept in the pre-project phase until end-of-life. This paper's goal is to discuss an analysis method, the VISSE Shell Model Analysis (VISMA) method, which aims at closing the gap in the early system lifecycle phases, such as the conceptual or pre-project phase and the project start phase. It was originally developed to aid in the definition of the system boundary of electronic system parts, such as a control unit for a pump motor; however, it can also be applied to non-electronic system parts. The VISMA method is a graphical, sketch-like method that stratifies a system and its parts into inner and outer shells, like the layers of an onion. It analyses a system in a two-step approach, from the innermost to the outermost components, followed by the reverse direction. To ensure a complete view of a system and its environment, the VISMA should be performed by (multifunctional) development teams. To introduce the method, a set of rules and guidelines has been defined in order to enable a proper shell build-up. In the first step, the innermost system, named the system under consideration (SUC), is selected; it is the focus of the subsequent analysis. Then, its directly adjacent components, responsible for providing input to and receiving output from the SUC, are identified. These components form the content of the first shell around the SUC. Next, the input and output components of the components in the first shell are identified and form the second shell around the first one. Continuing in this way, shell after shell is added with its respective parts until the border of the complete system (the external border) is reached. Finally, two external shells are added to complete the system view: the environment shell and the use case shell. This system view is also stored for future use. In the second step, the shells are examined in the reverse direction (outside to inside) in order to remove superfluous components or subsystems. Input chains to the SUC, as well as output chains from the SUC, are described graphically via arrows to highlight functional chains through the system. As a result, this method offers a clear graphical description and overview of a system, its main parts and its environment; however, the focus remains on a specific SUC. It helps to identify the interfaces and interfacing components of the SUC, as well as important external interfaces of the overall system. It supports the identification of the first internal and external hazard causes and causal chains. Additionally, the method promotes a holistic picture and a cross-functional understanding of a system, its contributing parts, internal relationships and possible dangers within a multidisciplinary development team.
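The shell build-up and the subsequent pruning pass can be pictured with a small data-structure sketch. The following Python fragment is illustrative only; the component names, the link table and the relevance test stand in for the graphical method and the development team's judgement, and are not part of the VISMA tooling itself:

```python
# Minimal sketch of the VISMA two-step idea (illustrative assumptions throughout).
def build_shells(links, suc):
    """links: dict mapping each component to the set of components it exchanges
    input/output with. Returns a list of shells; shell 0 contains only the SUC."""
    shells, visited, frontier = [{suc}], {suc}, {suc}
    while frontier:
        nxt = {n for c in frontier for n in links.get(c, set())} - visited
        if not nxt:
            break
        shells.append(nxt)
        visited |= nxt
        frontier = nxt
    return shells

def prune(shells, is_relevant):
    """Step 2: walk the shells from the outermost inwards and drop superfluous
    parts; `is_relevant` stands in for the development team's judgement."""
    pruned = [{c for c in shell if is_relevant(c)} for shell in reversed(shells[1:])]
    return [shells[0]] + pruned[::-1]

# Example: a control unit for a pump motor as the SUC (hypothetical system parts).
links = {
    "control unit": {"pump motor", "speed sensor"},
    "pump motor": {"control unit", "power supply"},
    "speed sensor": {"control unit"},
    "power supply": {"pump motor", "status display"},
    "status display": {"power supply"},
}
shells = build_shells(links, "control unit")
# The team judges the status display superfluous for this SUC:
print(prune(shells, lambda c: c != "status display"))
```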

Effect of Oxytocin on Cytosolic Calcium Concentration of Alpha and Beta Cells in Pancreas

Oxytocin is a nine-amino-acid peptide synthesized in the paraventricular nucleus (PVN) and supraoptic nucleus (SON) of the hypothalamus. Oxytocin promotes contraction of the uterus during birth and milk ejection during breastfeeding. Although oxytocin receptors are found predominantly in the breasts and uterus of females, many tissues and organs express oxytocin receptors, including the pituitary, heart, kidney, thymus, vascular endothelium, adipocytes, osteoblasts, adrenal gland, pancreatic islets, and many cell lines. In pancreatic islets, oxytocin receptors are expressed in both α-cells and β-cells, with stronger expression in α-cells. However, to our knowledge there are as yet no reports on the effect of oxytocin on the cytosolic calcium response of α-cells and β-cells. This study aims to investigate the effect of oxytocin on α-cells and β-cells and its oscillation pattern. Islets of Langerhans from wild-type mice were isolated by collagenase digestion. Isolated and dissociated single cells, either α-cells or β-cells, on coverslips were mounted in an open chamber and superfused in HKRB. The cytosolic calcium concentration ([Ca2+]i) in single cells was measured by fura-2 microfluorimetry. After measurement of [Ca2+]i, α-cells were identified by subsequent immunocytochemical staining using an anti-glucagon antiserum. In β-cells, the [Ca2+]i increase in response to oxytocin was observed only under the 8.3 mM glucose condition, whereas in α-cells, a [Ca2+]i increase induced by oxytocin was observed under both 2.8 mM and 8.3 mM glucose. Oscillations were induced more frequently in β-cells than in α-cells. In conclusion, the present study demonstrates that oxytocin directly interacts with both α-cells and β-cells and induces increases in [Ca2+]i with cell-type-specific patterns.

A Mathematical Investigation of the Turkevich Organizer Theory in the Citrate Method for the Synthesis of Gold Nanoparticles

Gold nanoparticles are commonly synthesized by reducing chloroauric acid with sodium citrate. This method, referred to as the citrate method, can produce spherical gold nanoparticles (NPs) in the size range 10-150 nm. Gold NPs of this size are useful in many applications; however, they are usually polydisperse and their synthesis is poorly reproducible. A better understanding of the synthesis mechanisms is thus required. This work thoroughly investigated the only model that describes the synthesis. This model combines mass and population balance equations, describing the NP synthesis through a sequence of chemical reactions: chloroauric acid reacts with sodium citrate to form aurous chloride and dicarboxyacetone; the latter organizes aurous chloride in a nucleation step and concurrently degrades into acetone; the unconsumed precursor then grows the formed nuclei. However, depending on the pH, both the precursor and the reducing agent react differently, thus affecting the synthesis. In this work, we investigated the model under different conditions of pH, temperature and initial reactant concentrations. To solve the model, we used Parsival, a commercial numerical code, while to test it, we considered various conditions studied experimentally by different researchers, for which results are available in the literature. The model poorly predicted the experimental data. We believe that this is because the model does not account for the acid-base properties of both chloroauric acid and sodium citrate.
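For illustration, the reaction sequence can be lumped into a handful of ordinary differential equations. The sketch below is a strong simplification of the actual model (which couples mass balances to a full population balance solved in Parsival); all rate constants, the lumping itself and the initial concentrations are hypothetical placeholders:

```python
# Highly simplified sketch of the Turkevich-type sequence as lumped ODEs
# (reduction -> DCA-mediated nucleation -> growth). All values are illustrative.
from scipy.integrate import solve_ivp

k_red, k_nuc, k_deg, k_gro = 1e-2, 5e-3, 1e-3, 1e-1   # hypothetical rate constants

def rhs(t, y):
    au3, cit, au1, dca, nuclei, gold0 = y   # Au(III), citrate, Au(I), dicarboxyacetone,
                                            # number of nuclei, reduced gold in particles
    r_red = k_red * au3 * cit               # Au(III) + citrate -> Au(I) + DCA
    r_nuc = k_nuc * dca * au1               # DCA "organizes" Au(I) into nuclei
    r_deg = k_deg * dca                     # DCA degrades into acetone
    r_gro = k_gro * au1 * nuclei            # remaining Au(I) grows existing nuclei
    return [-r_red, -r_red, r_red - r_nuc - r_gro, r_red - r_nuc - r_deg,
            r_nuc, r_nuc + r_gro]

y0 = [0.25e-3, 2.5e-3, 0.0, 0.0, 0.0, 0.0]  # mol/L, illustrative initial concentrations
sol = solve_ivp(rhs, (0.0, 3600.0), y0, method="LSODA")
print(sol.y[:, -1])                          # lumped state after one hour
```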

The Use of Facebook as a Social Media by Political Parties in the June 7 Election in Konya

Social media is among the most important means of communication. It offers individuals and groups an opportunity for participatory socialization over the internet, free of any time and place restrictions. Social media is a kind of interactive communication and a bilateral social network. Various communication contents can be shared and put into mass circulation easily and quickly through social media. Such sharing is not limited to individuals; it also takes place among groups, institutions and other organizations. The shared content may consist of any type of written message, audio or video file. We are living in the social media era now. It is not surprising that social media, with its extensive communication facilities and massive prevalence, is used in politics. Therefore, the use of social media (Facebook) by political parties during the Turkish general election held on June 7, 2015 was chosen as our research subject. Four parties, namely the AKP, CHP, MHP and HDP, which receive the majority of votes in Turkey and contested the election in Konya, were selected for our study. The use of social media (Facebook) by their provincial centers and parliamentary candidates during the last three days prior to the election was examined and subjected to qualitative analysis by means of content analysis.

A Simple and Empirical Refraction Correction Method for UAV-Based Shallow-Water Photogrammetry

The aerial photogrammetry of shallow-water bottoms has the potential to be an efficient, high-resolution survey technique for shallow-water topography, thanks to the advent of convenient UAVs and automatic image processing techniques (Structure-from-Motion (SfM) and Multi-View Stereo (MVS)). However, it suffers from systematic overestimation of the bottom elevation due to light refraction at the air-water interface. In this study, we present an empirical method to correct for the effect of refraction after the usual SfM-MVS processing, using common software. The presented method utilizes the empirical relation between the measured true depth and the estimated apparent depth to generate an empirical correction factor, which is then used to convert the apparent water depth into a refraction-corrected (real-scale) water depth. To examine its effectiveness, we applied the method to two river sites and compared the RMS errors in the corrected bottom elevations with those obtained by three existing methods. The results show that the presented method is more effective than two of the existing methods: the method that applies no correction factor and the method that uses the refractive index of water (1.34) as the correction factor. In comparison with the remaining existing method, which adds an offset term after calculating the correction factor, the presented method performs better at Site 2 and worse at Site 1. However, we found this linear regression method to be unstable when the training data used for calibration are limited. According to our numerical experiment, it also suffers from a large negative bias in the correction factor when the estimated apparent water depth is affected by noise. Overall, the accuracy of a refraction correction method depends on various factors, such as the location, image acquisition and GPS measurement conditions. The most effective method can be selected by statistical model selection (e.g., leave-one-out cross-validation).
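The core of the correction can be written in a few lines. The sketch below (with assumed variable names) fits the correction factor as a no-intercept least-squares slope between apparent and true depths at calibration points, then rescales the apparent depths:

```python
# Minimal sketch of the empirical refraction correction (variable names assumed).
# apparent_depth: depths from the SfM-MVS surface at points with field measurements;
# true_depth: the corresponding measured water depths.
import numpy as np

def correction_factor(apparent_depth, true_depth):
    a = np.asarray(apparent_depth, dtype=float)
    t = np.asarray(true_depth, dtype=float)
    return float(np.sum(a * t) / np.sum(a * a))   # least-squares slope through the origin

def correct_depths(apparent_depth, factor):
    return np.asarray(apparent_depth, dtype=float) * factor

# Example with made-up calibration points (apparent depths systematically too shallow):
apparent = [0.30, 0.55, 0.80, 1.10]
true = [0.41, 0.72, 1.07, 1.49]
f = correction_factor(apparent, true)       # comes out near 1.34 for pure refraction
print(round(f, 2), correct_depths(apparent, f))
```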

Non-Population Search Algorithms for Capacitated Material Requirement Planning in Multi-Stage Assembly Flow Shop with Alternative Machines

This paper presents non-population search algorithms, namely tabu search (TS), simulated annealing (SA) and variable neighborhood search (VNS), to minimize the total cost of the capacitated material requirement planning (MRP) problem in a multi-stage assembly flow shop with two alternative machines. The algorithm consists of three main steps. First, an initial sequence of orders is constructed by a simple due-date-based dispatching rule. Second, the sequence of orders is repeatedly improved to reduce the total cost by applying TS, SA and VNS separately. Finally, the total cost is further reduced by optimizing the start time of each operation using a linear programming (LP) model. The parameters of the algorithm are tuned using real data from automotive companies. The results show that VNS significantly outperforms TS, SA and the existing algorithm.
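As an illustration of the second step, the sketch below applies simulated annealing to an order sequence initialized by an earliest-due-date rule. The cost function here is a toy placeholder; in the actual algorithm it would include capacity-constrained scheduling on the alternative machines and the LP-based start-time optimization:

```python
# Minimal sketch of the sequence-improvement step using simulated annealing.
# `total_cost` stands in for the full evaluation described in the paper.
import math
import random

def swap_neighbor(seq):
    i, j = random.sample(range(len(seq)), 2)
    nbr = list(seq)
    nbr[i], nbr[j] = nbr[j], nbr[i]
    return nbr

def simulated_annealing(seq, total_cost, t0=100.0, cooling=0.95,
                        iters_per_temp=50, t_min=1e-3):
    best = cur = list(seq)
    best_cost = cur_cost = total_cost(cur)
    t = t0
    while t > t_min:
        for _ in range(iters_per_temp):
            cand = swap_neighbor(cur)
            delta = total_cost(cand) - cur_cost
            if delta < 0 or random.random() < math.exp(-delta / t):
                cur, cur_cost = cand, cur_cost + delta
                if cur_cost < best_cost:
                    best, best_cost = cur, cur_cost
        t *= cooling
    return best, best_cost

# Example: initial sequence from an earliest-due-date rule, with a toy cost.
orders = sorted([("O1", 5), ("O2", 3), ("O3", 8), ("O4", 2)], key=lambda o: o[1])
toy_cost = lambda seq: sum(i * due for i, (_, due) in enumerate(seq))
print(simulated_annealing(orders, toy_cost))
```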

Development of a Software System for Management and Genetic Analysis of Biological Samples for Forensic Laboratories

Due to the high reliability reached by DNA tests, since the 1980s this kind of test has been applied to a growing number of criminal cases, including old cases that were unsolved and now have a chance of being solved with this technology. Currently, the use of genetic profile databases is a typical way to increase the scope of genetic comparison. Forensic laboratories must process, analyze and generate genetic profiles for a growing number of samples, which requires time and large storage capacity. Therefore, it is essential to develop methodologies, based on software tools, capable of organizing the work and minimizing the time spent on both biological sample processing and the analysis of genetic profiles. Thus, the present work aims at the development of a software system for forensic genetics laboratories that allows the management of samples, criminal cases and a local database, minimizes the time spent in the workflow and helps to compare genetic profiles. For the development of this software system, all data related to the storage and processing of samples, as well as the workflows and requirements to be incorporated into the system, were considered. The system uses the following languages: HTML, CSS and JavaScript as Web technologies, with the NodeJS platform as the server, which offers great efficiency in data input and output. In addition, the data are stored in a relational database (MySQL), which is free, favouring user acceptance. The software system developed here brings more agility to the workflow and to the analysis of samples, contributing to the rapid insertion of genetic profiles into the national database and to increased crime resolution. The next step of this research is the validation of the system, so that it operates in accordance with current Brazilian national legislation.

Simulation of a Control System for an Adaptive Suspension System for Passenger Vehicles

In coping with the challenges faced by the automobile industry in providing ride comfort, electronics and control systems play a vital role. The control systems in an automobile monitor various parameters and control the performance of its systems, thereby providing better handling characteristics. The automobile suspension system is one of the main systems that ensure the safety, stability and comfort of the passengers, and it is solely responsible for isolating the entire automobile from harmful road vibrations. Thus, integrating control systems into the automobile suspension system enhances its performance. The diverse road conditions of India demand an efficient suspension system that can provide optimum ride comfort under all road conditions. For any passenger vehicle, the design of the suspension system plays a very important role in assuring ride comfort and handling characteristics. In recent years, the air suspension system has been preferred over conventional suspension systems to ensure ride comfort. In this article, the ride comfort of an adaptive suspension system is compared with that of a passive suspension system. The schema is created in the MATLAB/Simulink environment. The system is controlled by a proportional-integral-derivative (PID) controller. Tuning of the controller was done with the Particle Swarm Optimization (PSO) algorithm, since it suited the problem best; the Ziegler-Nichols and modified Ziegler-Nichols tuning methods were also tried and compared. Both the static and dynamic responses of the systems were calculated. Various random road profiles conforming to the ISO 8608 standard are modelled in the MATLAB environment and their responses plotted. Open-loop and closed-loop responses to the random roads and to various bumps and potholes are also plotted. The simulation results of the proposed design are compared with those of the available passive suspension system. The obtained results show that the proposed adaptive suspension system is efficient in controlling the maximum overshoot, and the settling time of the system is reduced enormously.
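The PSO-based tuning can be sketched on a simplified stand-in for the suspension plant. The following Python example (the plant parameters, cost index and PSO settings are illustrative assumptions, not the paper's MATLAB/Simulink model) minimizes an ITAE-type index over the PID gains:

```python
# Minimal sketch of PSO tuning of PID gains on a mass-spring-damper stand-in
# for the suspension plant (all numbers are illustrative).
import numpy as np

M, K, C = 250.0, 16000.0, 1000.0       # sprung mass, spring, damper (illustrative)
DT, T_END = 0.001, 2.0

def itae_cost(gains):
    kp, ki, kd = gains
    x = v = integ = prev_err = 0.0
    ref = 0.05                          # 5 cm step reference (stand-in disturbance test)
    cost = 0.0
    for step in range(int(T_END / DT)):
        err = ref - x
        integ += err * DT
        deriv = (err - prev_err) / DT
        u = kp * err + ki * integ + kd * deriv      # PID actuator force
        prev_err = err
        a = (u - K * x - C * v) / M                 # plant acceleration
        v += a * DT
        x += v * DT
        cost += (step * DT) * abs(err) * DT         # ITAE index
    return cost

def pso(cost_fn, bounds, n_particles=20, n_iter=40, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    lo, hi = np.array([b[0] for b in bounds]), np.array([b[1] for b in bounds])
    pos = rng.uniform(lo, hi, size=(n_particles, len(bounds)))
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.array([cost_fn(p) for p in pos])
    g = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)
        costs = np.array([cost_fn(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        g = pbest[np.argmin(pbest_cost)].copy()
    return g, float(np.min(pbest_cost))

gains, best = pso(itae_cost, bounds=[(0, 20000), (0, 5000), (0, 2000)])
print("Kp, Ki, Kd =", gains, "ITAE =", round(best, 4))
```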

Condition Monitoring for Twin-Fluid Nozzles with Internal Mixing

Liquid sprays of water are frequently used in air pollution control for gas cooling and gas cleaning purposes. Twin-fluid nozzles with internal mixing are often used for these purposes because of the small size of the drops produced. In these nozzles, the liquid is dispersed by compressed air or another pressurized gas. In high-efficiency scrubbers for particle separation, several nozzles are operated in parallel because of the size of the cross section. In such scrubbers, the scrubbing water has to be recirculated. Precipitation of solid material, caused by chemical reactions, can occur in the liquid circuit. When such precipitates are detached from the place of formation, they can partly or totally block the liquid flow to a nozzle. Due to the resulting unbalanced supply of the nozzles with water and gas, the efficiency of separation decreases. Thus, the nozzles have to be cleaned when a certain fraction of blockages is reached. The aim of this study was to provide a tool for continuously monitoring the status of the nozzles of a scrubber based on the available operating data (water flow, air flow, water pressure and air pressure). The difference between the air pressure and the water pressure is not well suited for this purpose, because the difference is quite small and therefore a very exact calibration of the pressure measurement would be required. Therefore, an equation for the reference air flow of a nozzle at the actual water flow and operating pressure was derived. This reference flow can be compared with the actual air flow to assess the status of the nozzles.
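A monitoring check along these lines can be sketched as follows; the reference-air-flow equation derived in the paper is not reproduced here, so the function below is only a placeholder (e.g., a fit obtained during commissioning with all nozzles known to be clean), and the tolerance is an assumed value:

```python
# Illustrative monitoring check only; the reference-flow relation and tolerance
# below are placeholders, not the equation derived in the paper.
def reference_air_flow(water_flow, air_pressure, water_pressure, coeffs):
    a, b, c = coeffs                          # hypothetical fitted coefficients
    return a + b * water_flow + c * (air_pressure - water_pressure)

def nozzle_status(air_flow_measured, air_flow_reference, tolerance=0.05):
    """Relative deviation of the measured from the reference air flow; a deviation
    beyond the tolerance indicates that the nozzles should be inspected or cleaned."""
    deviation = (air_flow_measured - air_flow_reference) / air_flow_reference
    return deviation, abs(deviation) > tolerance
```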

Exploring Influence Range of Tainan City Using Electronic Toll Collection Big Data

Big Data has attracted a lot of attention in many fields for analyzing research issues based on large volumes of source data. Electronic Toll Collection (ETC) is one of the Intelligent Transportation System (ITS) applications in Taiwan, used to record the starting point, end point, distance and travel time of vehicles on the national freeway. This study, taking advantage of ETC big data combined with urban planning theory, attempts to explore various phenomena of inter-city transportation activities. ETC data, part of the government's open data, are voluminous, complete and frequently updated. One may recall that living areas have traditionally been delimited by location, population, area and subjective perception. However, these factors cannot appropriately reflect people's movement paths in daily life. In this study, the concept of a "Living Area" is replaced by an "Influence Range" to capture the dynamics and variation with the time and purpose of activities. This study uses data mining with Python and Excel, and visualizes the number of trips with GIS, to explore the influence range of Tainan City and the purposes of trips, and to discuss the currently delimited living areas. It sets up a dialogue between the concepts of "Central Place Theory" and "Living Area", presents a new point of view, and integrates the application of big data, urban planning and transportation. The findings will be valuable for resource allocation and land apportionment in spatial planning.
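A typical first step of such data mining is aggregating the ETC records into origin-destination trip counts, which can then be joined to gantry coordinates and mapped in GIS. The sketch below assumes a simple record schema (the column names and gantry IDs are placeholders, not the actual open-data field names):

```python
# Minimal sketch of the trip-aggregation step (schema assumptions noted above).
import pandas as pd

def od_trip_counts(etc_csv, origin_col="start_gantry", dest_col="end_gantry",
                   time_col="start_time"):
    df = pd.read_csv(etc_csv, parse_dates=[time_col])
    df["hour"] = df[time_col].dt.hour
    # Count trips per origin-destination pair and hour of day.
    return (df.groupby([origin_col, dest_col, "hour"])
              .size()
              .reset_index(name="trips")
              .sort_values("trips", ascending=False))

# Example: trips ending at gantries around Tainan (file and gantry IDs are placeholders).
# counts = od_trip_counts("etc_records.csv")
# tainan = counts[counts["end_gantry"].isin(["G-Tainan-N", "G-Tainan-S"])]
```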

Electrocardiogram Signal Denoising Using a Hybrid Technique

This paper presents an efficient method of electrocardiogram signal denoising based on a hybrid approach. Two techniques are brought together to create an efficient denoising process: the first is an Adaptive Dual Threshold Filter (ADTF) and the second is the Discrete Wavelet Transform (DWT). The presented approach consists of three denoising steps: the DWT decomposition, the ADTF step and the highest-peaks correction step. The paper presents applications of the approach to electrocardiogram signals from the MIT-BIH database. The results of these applications are promising compared to other recently published techniques.
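The wavelet stage of such a chain can be sketched with PyWavelets as below; standard soft thresholding is used here as a stand-in for the ADTF, and the highest-peaks correction step is only indicated, since both are specific to the proposed method:

```python
# Minimal sketch of the DWT stage of the denoising chain (soft thresholding as a
# stand-in for the paper's ADTF; the peak-correction step is not reproduced).
import numpy as np
import pywt

def wavelet_denoise(ecg, wavelet="db6", level=4):
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    # Universal threshold from the finest detail coefficients (standard estimate).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(ecg)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(ecg)]

# The ADTF pass and the highest-peaks correction of the paper would follow here,
# operating on the reconstructed signal.
```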

An E-Government Implementation Model for Peruvian State Companies Based on COBIT 5.0: Definition and Goals of the Model

As part of the regulatory compliance process and the streamlining of public administration, the Peruvian government has implemented the National E-Government Plan in all state institutions with the aim of providing citizens with solid services based on the use of Information and Communications Technologies (ICT). As part of the regulations, the requirements to be met by public institutions have been set out. However, the lack of an implementation model was detected, one that could serve as a guide for such institutions in materializing the organizational and technological structures needed to provide the required digital services. This paper develops an implementation model of electronic government (e-government) for Peru's state institutions, in compliance with current regulations and based on the COBIT 5.0 framework. Furthermore, the paper introduces phase 1 of this model: business and IT goals, the goals cascade and the future process model.

Hybrid Temporal Correlation Based on Gaussian Mixture Model Framework for View Synthesis

As 3D video has been explored as a hot research topic in the last few decades, free-viewpoint TV (FTV) is without doubt a promising field, owing to its better visual experience and incomparable interactivity. View synthesis is obviously a crucial technology for FTV; it enables images to be rendered at an unlimited number of virtual viewpoints using information from a limited number of reference views. In this paper, a novel hybrid synthesis framework is proposed and the blending priority is explored. In contrast to the commonly used View Synthesis Reference Software (VSRS), the presented synthesis process takes the temporal correlation of image sequences into consideration. The temporal correlations are exploited to produce fine synthesis results even near foreground boundaries. As for the blending priority, the proposed scheme selects one of the two reference views as the main reference view based on the distance between the reference views and the virtual view; the other view is chosen as the auxiliary viewpoint and merely assists in filling hole pixels with the help of background information. A significant improvement of the proposed approach over the state-of-the-art pixel-based virtual view synthesis method is presented: the experimental results show that subjective gains can be observed, while objective average PSNR gains range from 0.5 to 1.3 dB and average SSIM gains range from 0.01 to 0.05.
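The blending-priority rule can be sketched as follows, assuming the two reference views have already been warped to the virtual viewpoint and hole masks are available (a generic formulation, not the exact pipeline of the paper):

```python
# Minimal sketch of the blending-priority idea: the reference view closer to the
# virtual viewpoint is the main view, the other only fills remaining holes.
import numpy as np

def blend_with_priority(warp_left, holes_left, warp_right, holes_right,
                        d_left, d_right):
    """warp_*: HxWx3 warped reference views; holes_*: HxW boolean hole masks;
    d_*: distances from each reference camera to the virtual camera."""
    if d_left <= d_right:
        main, main_holes, aux, aux_holes = warp_left, holes_left, warp_right, holes_right
    else:
        main, main_holes, aux, aux_holes = warp_right, holes_right, warp_left, holes_left
    out = main.copy()
    fill = main_holes & ~aux_holes          # main-view holes covered by the auxiliary view
    out[fill] = aux[fill]
    remaining = main_holes & aux_holes      # still-empty pixels, left for inpainting
    return out, remaining
```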

Robot Navigation and Localization Based on the Rat’s Brain Signals

A mobile robot's ability to navigate autonomously in its environment is very important. Despite advances in technology, robot self-localization and goal-directed navigation in complex environments are still challenging tasks. In this article, we propose a novel method for robot navigation based on rat brain signals (local field potentials). It is well known that rats navigate accurately and rapidly in a complex space by localizing themselves with reference to the surrounding environmental cues. As a first step towards incorporating the rat's navigation strategy into robot control, we analyzed the rats' strategies while they navigated a multiple Y-maze and simultaneously recorded local field potentials (LFPs) from three brain regions. Next, we processed the LFPs, and the extracted features were used as input to an artificial neural network to predict the rat's next location, particularly at the decision-making moment at Y-junctions. We then developed an algorithm by which the robot learned to imitate the rat's decision-making by mapping the rat's brain signals onto its own actions. Finally, the robot learned to integrate its internal states as well as its external sensors in order to localize and navigate in the complex environment.
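The decoding step can be sketched with standard tools: band-power features extracted from LFP windows around the decision point feed a small feed-forward network that predicts the chosen arm. The frequency bands, window handling and network size below are assumptions, not the settings used in the study:

```python
# Minimal sketch of the LFP decoding step (all parameters are illustrative).
import numpy as np
from scipy.signal import welch
from sklearn.neural_network import MLPClassifier

BANDS = {"theta": (4, 12), "beta": (12, 30), "gamma": (30, 80)}   # assumed bands

def band_power_features(lfp_window, fs):
    """lfp_window: (n_channels, n_samples) LFP snippet around the decision point."""
    f, pxx = welch(lfp_window, fs=fs, nperseg=min(256, lfp_window.shape[1]), axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (f >= lo) & (f < hi)
        feats.append(pxx[:, mask].mean(axis=-1))    # mean power per channel and band
    return np.concatenate(feats)

def train_decoder(windows, choices, fs):
    """windows: list of (n_channels, n_samples) arrays; choices: left/right labels."""
    X = np.array([band_power_features(w, fs) for w in windows])
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    return clf.fit(X, choices)
```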

A Biomimetic Structural Form: Developing a Paradigm to Attain Vital Sustainability in Tall Architecture

This paper argues for sustainability as a necessity in the evolution of tall architecture. It provides a different mode for dealing with sustainability in tall architecture, taking into consideration the particularity of its typology. To this end, the article develops a Biomimetic Structural Form as a paradigm to attain Vital Sustainability. A Biomimetic Structural Form, which is derived from the amalgamation of biomimicry, an approach to sustainability that defines nature as a source of knowledge and inspiration for solving human problems, and the Structural Form, a catalyst for evolving tall architecture, is a dynamic paradigm emerging from a conceptualizing and morphological process. A Biomimetic Structural Form is a flow system whose different forces and functions tend to become “better”, more “fit”, to “survive”, and to be efficient. Through geometry and function, the two aspects of knowledge extracted from nature, the attributes of the Biomimetic Structural Form are formulated. Vital Sustainability is the survival level of sustainability in natural systems, through which a system enhances the performance of its internal workings and its interaction with the external environment. A Biomimetic Structural Form, in this context, is a medium for evolving tall architecture to emulate natural models in their ways of coexisting with the environment. As an integral part of this article, the sustainable supertall building 3Ts is discussed as a case study of applying the Biomimetic Structural Form.

Comparative Quantitative Study on Learning Outcomes of Major Study Groups of an Information and Communication Technology Bachelor Educational Program

Higher education system reforms, especially the 2014 reform of the Finnish Universities of Applied Sciences system, are discussed. The new steering model is based on major legislative changes, output-oriented funding and open information. The governmental steering reform, especially the funding model, and the resulting institution-level responses, such as curriculum reforms, are discussed, with a particular focus on engineering programs. The paper is motivated by the management need to establish objective steering-related performance indicators and to apply them consistently across all educational programs. The close relationship to the governmental steering and funding model implies that internally derived indicators can be applied directly. Metropolia University of Applied Sciences (MUAS), as the case institution, is briefly introduced, focusing on engineering education in Information and Communications Technology (ICT) and its related programs. The reform forced the consolidation of previously separate smaller programs into fewer units of student application. Under the new curriculum, ICT students have a common first year before they apply for a Major. A framework of parallel and longitudinal comparisons is introduced and used across Majors on two campuses. The new externally introduced performance criteria are applied internally to the ICT Majors, using data from before and after the program merger. A performance comparison of the Majors after completion of the joint first year is established, focusing on previously omitted Majors for completeness of the analysis. Some new research questions resulting from the transfer of Majors between campuses and from quota setting are discussed. The practical orientation identifies best practices to share and targets needing the most attention for improvement. This level of analysis is directly applicable at the student group and teaching team level, where corrective actions are possible once issues are identified. The analysis is quantitative, and the nature of the corrective actions is not discussed. Causal relationships and factor analysis are omitted, because the campuses, their staff and various pedagogical implementation details still contain too many undetermined factors for our limited data. Such qualitative analysis is left for further research; further study must, however, be guided by the relevance of the observations.