Abstract: Telemedicine services require correct computing resource management to guarantee productivity and efficiency for medical and non-medical staff. The aim of this study was to examine web management strategies that ensure the availability of resources and services in telemedicine, so as to provide medical information management with an accessible strategy. In addition, to evaluate the quality-of-service parameters, the following were measured: delay, throughput, jitter, latency, available bandwidth, and the percentages of access and denial of service, based on a web management performance map with profile permissions and database management. Across 24 different test scenarios, the results show 100% availability of medical information, in relation to the access of medical staff to web services, and quality of service (QoS) of 99%, owing to network delay and computer network performance. The findings of this study suggest that the proposed web management strategy is an ideal solution to guarantee the availability, reliability, and accessibility of medical information. Finally, this strategy offers seven user profiles used at the telemedicine center of Bogotá, Colombia, while keeping QoS parameters suitable for telemedicine services.
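The QoS parameters named in this abstract can be computed from per-packet measurements. A minimal sketch, assuming hypothetical send/receive timestamps and packet sizes (the abstract does not describe the authors' measurement setup):

```python
# Illustrative sketch (not the study's code): computing delay, jitter,
# and throughput from hypothetical per-packet timestamps and sizes.

def qos_metrics(sent, received, sizes):
    """sent/received: per-packet timestamps in seconds; sizes: bytes."""
    delays = [r - s for s, r in zip(sent, received)]
    mean_delay = sum(delays) / len(delays)
    # Jitter as the mean absolute difference between consecutive delays
    # (a common simplification of the RFC 3550 estimator).
    jitter = (sum(abs(b - a) for a, b in zip(delays, delays[1:]))
              / max(len(delays) - 1, 1))
    span = received[-1] - sent[0]
    throughput = sum(sizes) / span  # bytes per second
    return mean_delay, jitter, throughput

# Example: three packets of 1 kB each.
sent = [0.00, 0.10, 0.20]
received = [0.05, 0.16, 0.27]
m, j, t = qos_metrics(sent, received, [1000, 1000, 1000])
```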
Abstract: Telemedicine services use a large amount of data, most of which are diagnostic images in Digital Imaging and Communications in Medicine (DICOM) and Health Level Seven (HL7) formats. Metadata is generated from each related image to support its identification. This study presents the use of decision trees to optimize information search processes for diagnostic images hosted on a cloud server. To analyze the performance of the server, the following quality of service (QoS) metrics are evaluated: delay, bandwidth, jitter, latency, and throughput, in five test scenarios for a total of 26 experiments during the loading and downloading of DICOM images, hosted by the telemedicine group server of the Universidad Militar Nueva Granada, Bogotá, Colombia. By applying decision trees as a data mining technique and comparing them with sequential search, it was possible to evaluate the search times of diagnostic images on the server. The results show that by using the metadata in decision trees, the search times are substantially improved, the computational resources are optimized, and the request management of the telemedicine image service is improved. Based on the experiments carried out, search efficiency increased by 45% relative to sequential search, since false positives are avoided in the management and acquisition of the information when downloading a diagnostic image. It is concluded that, for diagnostic image services in telemedicine, the decision tree technique guarantees accessibility and robustness in the acquisition and manipulation of medical images, improving diagnoses and medical procedures for patients.
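The idea of indexing image metadata in a decision-tree-like structure, versus scanning every record, can be sketched as follows (the field names and records here are invented for illustration, not taken from the authors' system; a real implementation would branch on DICOM tags):

```python
# Toy sketch: a lookup in the tree index follows one branch per
# metadata attribute instead of touching every record.

records = [
    {"id": 1, "modality": "CT", "region": "head"},
    {"id": 2, "modality": "MR", "region": "knee"},
    {"id": 3, "modality": "CT", "region": "chest"},
]

def sequential_search(recs, modality, region):
    for r in recs:                      # O(n): touches every record
        if r["modality"] == modality and r["region"] == region:
            return r
    return None

def build_index(recs):
    tree = {}
    for r in recs:                      # branch on modality, then region
        tree.setdefault(r["modality"], {})[r["region"]] = r
    return tree

index = build_index(records)
hit = index.get("CT", {}).get("chest")  # lookup cost grows with depth, not n
```

The tree also avoids false positives naturally: a query whose attribute path does not exist returns nothing instead of a near-match.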
Abstract: In telemedicine, the image repository service is important for increasing the accuracy of diagnostic support for medical personnel. This study compares two routing algorithms with respect to quality of service (QoS), in order to analyze optimal performance when loading and/or downloading medical images. The study focused on comparing the performance of Tabu Search with other heuristic and metaheuristic algorithms that improve QoS in telemedicine services in Colombia. For this, the Tabu Search and Simulated Annealing algorithms were chosen for their wide use in this type of application; QoS is measured using the following metrics: delay, throughput, jitter, and latency. In addition, routing tests were carried out on ten images in Digital Imaging and Communications in Medicine (DICOM) format of 40 MB each. These tests were carried out for ten minutes under different traffic conditions, reaching a total of 25 tests, from a server at Universidad Militar Nueva Granada (UMNG) in Bogotá, Colombia to a remote user at Universidad de Santiago de Chile (USACH), Chile. The results show that Tabu Search presents better QoS performance than Simulated Annealing, managing to optimize the routing of medical images, a basic requirement for offering diagnostic image services in telemedicine.
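The Tabu Search procedure the abstract refers to can be sketched on a toy routing problem (the delay matrix and move structure here are invented for illustration; the paper's actual network model is not described in the abstract):

```python
# Minimal tabu-search sketch: choose a route (an ordering of relay
# nodes) minimizing a synthetic per-hop delay. Neighborhood moves swap
# two positions; recently used swaps are tabu for a few iterations.

def delay(route):
    # Hypothetical per-hop delay matrix (ms) between 4 nodes.
    d = [[0, 5, 9, 4], [5, 0, 3, 8], [9, 3, 0, 2], [4, 8, 2, 0]]
    return sum(d[a][b] for a, b in zip(route, route[1:]))

def tabu_search(start, iters=50, tenure=3):
    best = cur = start
    tabu = {}                  # move -> iteration until which it is tabu
    for it in range(iters):
        candidates = []
        for i in range(len(cur)):
            for j in range(i + 1, len(cur)):
                nxt = list(cur)
                nxt[i], nxt[j] = nxt[j], nxt[i]
                nxt = tuple(nxt)
                # Skip tabu moves unless they beat the best (aspiration).
                if tabu.get((i, j), -1) >= it and delay(nxt) >= delay(best):
                    continue
                candidates.append((delay(nxt), (i, j), nxt))
        cost, move, cur = min(candidates)
        tabu[move] = it + tenure       # forbid reversing this move for a while
        if cost < delay(best):
            best = cur
    return best

route = tabu_search((0, 1, 2, 3))
```

The tabu list is what distinguishes this from plain local search: it forces the walk away from recently visited solutions, while the aspiration test still admits a tabu move that improves on the best route found.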
Abstract: Digital technologies offer many opportunities in the design and implementation of brand communication and advertising. Augmented reality (AR) is an innovative technology in marketing communication that builds on the fact that virtual interaction with a product ad offers additional value to consumers. AR enables consumers to obtain (almost) real product experiences by way of virtual information even before the purchase of a certain product. The aim of AR applications in advertising is the in-depth examination of product characteristics to enhance product knowledge as well as brand knowledge. Interactive design of advertising provides observers with an intense examination of a specific advertising message and therefore leads to better brand knowledge. The elaboration likelihood model and the central route to persuasion strongly support this argumentation. Nevertheless, AR in brand communication is still at an initial stage, and scientific findings about the impact of AR on information processing and brand attitude are therefore rare. The aim of this paper is to empirically investigate the potential of AR applications in combination with traditional print advertising. To that effect, an experimental design with different levels of interactivity is built to measure the impact of the interactivity of an ad on different variables of advertising effectiveness.
Abstract: Synchrophasor technology is being deployed rapidly in electric power grids all over the world and is quickly changing the way the grids are managed. This trend will continue until entire power grids are fully connected so that they can be monitored and controlled in real time. Much has been achieved in synchrophasor technology development and deployment, yet much more remains to be achieved. For instance, the real-time power grid control and protection potential of synchrophasors is yet to be explored. It is essential that researchers keep in view the various challenges that still need to be overcome in expanding the frontiers of synchrophasor technology. This paper outlines the major challenges that should be dealt with in order to achieve the goal of total power grid visualization, monitoring, and control using synchrophasor technology.
Abstract: Economic Dispatch is one of the most important power system management tools. It is used to allocate an amount of power generation to the generating units to meet the load demand. The Economic Dispatch problem is a large-scale nonlinear constrained optimization problem. In general, heuristic optimization techniques are used to solve the non-convex Economic Dispatch problem. In this paper, ideas from Reinforcement Learning are proposed to solve the non-convex Economic Dispatch problem. Q-Learning is a reinforcement learning technique in which each generating unit learns the optimal schedule of generated power that minimizes the generation cost function. Eligibility traces are used to speed up the Q-Learning process. Q-Learning with eligibility traces is used to solve Economic Dispatch problems with valve-point loading effects, multiple fuel options, and power transmission losses.
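The core idea can be illustrated on a one-step toy dispatch (the cost curves and discretization here are invented, not the paper's formulation, and eligibility traces, which the paper uses to speed learning over multi-step episodes, are omitted in this single-step sketch):

```python
import random

# Toy sketch: a Q-learning agent splits a 100 MW demand between two
# units with quadratic cost curves and learns the cheapest allocation.

DEMAND = 100
ACTIONS = [(p, DEMAND - p) for p in range(0, DEMAND + 1, 10)]

def cost(p1, p2):
    # Hypothetical cost curves: a + b*p + c*p^2 ($/h) per unit.
    return (10 + 2.0 * p1 + 0.02 * p1 ** 2) + (5 + 3.0 * p2 + 0.01 * p2 ** 2)

random.seed(1)
q = {a: 0.0 for a in ACTIONS}
alpha, epsilon = 0.5, 0.3
for _ in range(2000):
    if random.random() < epsilon:
        a = random.choice(ACTIONS)           # explore
    else:
        a = max(q, key=q.get)                # exploit
    reward = -cost(*a)                       # minimize cost = maximize -cost
    q[a] += alpha * (reward - q[a])          # one-step Q-update
best = max(q, key=q.get)
```

Here the Q-values simply converge to the (negative) cost of each allocation; the valve-point, multi-fuel, and loss terms the paper handles would enter through a more elaborate cost function and a multi-step episode structure.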
Abstract: Iodine radionuclides released under severe accident conditions at NPPs with VVER reactors are the most radiologically important with respect to population dose formation at the beginning of an accident. To decrease the radiation consequences of severe accidents, technical solutions for severe accident management have been proposed in the MIR.1200 project, including measures for suppressing the generation of volatile iodine forms in the containment. The behavior dynamics of different iodine forms in the containment under severe accident conditions have been analyzed to justify these technical solutions.
Abstract: The main features of the NPP-2006/MIR-1200 design are described. An estimation of individual doses for the population under normal operation and accident conditions is performed for Leningradskaya NPP-2 as an example. The radiation effect on the population and the environment does not exceed the established normative limit and is as low as reasonably achievable. The NPP-2006/MIR-1200 design meets all Russian and international requirements for power units under construction.
Abstract: Most scientific programs have large input and output data sets that require out-of-core programming or use of virtual memory management (VMM). Out-of-core programming is very error-prone and tedious; as a result, it is generally avoided. However, in many instances, VMM is not an effective approach because it often results in a substantial performance reduction. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can be used to replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system does not require any special services from the operating system and does not require modification of the operating system kernel.
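The tiling idea behind such compiler-managed I/O can be sketched as follows (this is our illustration of blocked access, not the Comanche runtime itself): a data set too large to hold in memory is streamed in fixed-size blocks, so only one tile is resident at a time.

```python
import array, os, tempfile

def write_dataset(path, n):
    # Materialize n doubles on disk, standing in for a large data set.
    with open(path, "wb") as f:
        array.array("d", (float(i) for i in range(n))).tofile(f)

def blocked_sum(path, block_elems=1024):
    total, itemsize = 0.0, array.array("d").itemsize
    with open(path, "rb") as f:
        while True:
            chunk = f.read(block_elems * itemsize)   # fetch one tile
            if not chunk:
                break
            tile = array.array("d")
            tile.frombytes(chunk)
            total += sum(tile)                       # compute on the tile
    return total

path = os.path.join(tempfile.mkdtemp(), "data.bin")
write_dataset(path, 10_000)        # 10k doubles, ~80 kB on disk
result = blocked_sum(path)
```

The point of compiler-driven management is that block fetches like the `f.read` above are inserted and scheduled automatically from the access pattern, instead of faulting pages in on demand as VMM does.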
Abstract: Protein 3D structure prediction has always been an important research area in bioinformatics. In particular, the prediction of secondary structure has been a well-studied research topic. Despite the recent breakthrough of combining multiple sequence alignment information and artificial intelligence algorithms to predict protein secondary structure, the Q3 accuracy of various computational prediction algorithms has rarely exceeded 75%. In a previous paper [1], this research team presented a rule-based method called RT-RICO (Relaxed Threshold Rule Induction from Coverings) to predict protein secondary structure. The average Q3 accuracy on the sample datasets using RT-RICO was 80.3%, an improvement over comparable computational methods. Although this demonstrated that RT-RICO might be a promising approach for predicting secondary structure, the algorithm's computational complexity and program running time limited its use. Herein, a parallelized implementation of a slightly modified RT-RICO approach is presented. This new version of the algorithm facilitated the testing of a much larger dataset of 396 protein domains [2]. Parallelized RT-RICO achieved a Q3 score of 74.6%, which is higher than the consensus prediction accuracy of 72.9% that was achieved for the same test dataset by a combination of four secondary structure prediction methods [2].
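The flavor of rule-based secondary structure prediction can be illustrated as follows (the rules and residues below are invented for illustration and are not RT-RICO's induced rules): each rule maps residues at given window positions to a structure class, and a window is classified by the matching rules' majority vote.

```python
from collections import Counter

# (position -> residue) conditions, predicted class:
# H = helix, E = strand, C = coil. All rules here are hypothetical.
RULES = [
    ({0: "A", 1: "L"}, "H"),
    ({1: "L", 2: "E"}, "H"),
    ({0: "V", 2: "V"}, "E"),
    ({1: "G"}, "C"),
]

def predict(window, default="C"):
    votes = Counter(
        label for conds, label in RULES
        if all(window[i] == res for i, res in conds.items())
    )
    # Majority vote over matching rules; coil if nothing matches.
    return votes.most_common(1)[0][0] if votes else default

pred = predict("ALE")   # matches both helix rules
```

RT-RICO's contribution lies in how such rules are induced from coverings under a relaxed threshold; the parallelization the abstract describes distributes that induction step, not the simple rule matching shown here.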
Abstract: A new technique to quantify the differential mode delay (DMD) in multimode fiber (MMF) has been presented. The technique measures DMD based on angular launch and measurements of the difference in modal delay using variable apertures at the fiber face. The angular spatial filtering revealed less excitation of higher-order modes when the laser beam is filtered at higher angles. This result indicates that DMD profiles may exhibit a data-pattern dependency.