Abstract: Accurate determination of wind turbine performance is necessary for the economic operation of a wind farm. At present, the procedure for carrying out the power performance verification of wind turbines is based on a standard of the International Electrotechnical Commission (IEC). In this paper, nonparametric statistical inference is applied to design a simple, inexpensive method of verifying the power performance of a wind turbine. A statistical test is explained, examined, and its adequacy is tested on real data. The method uses the information collected by the SCADA system (Supervisory Control and Data Acquisition) from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. The study used data on the monthly output of a wind farm in the Republic of Macedonia, measured over the interval from January 1, 2016, to December 31, 2016. Finally, it is concluded whether the power performance of a wind turbine differed significantly from what would be expected. The results of the implementation of the proposed methods showed that the power performance of the specific wind farm under assessment was acceptable.
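The comparison of observed monthly output against the power-curve expectation can be illustrated with a simple nonparametric sign test. This is a minimal sketch with hypothetical monthly figures, not the authors' actual test statistic:

```python
from math import comb

def sign_test_p_value(observed, expected):
    """Two-sided sign test: is output systematically above/below expectation?"""
    diffs = [o - e for o, e in zip(observed, expected) if o != e]
    n = len(diffs)
    k = sum(1 for d in diffs if d > 0)          # months above expectation
    tail = min(k, n - k)
    # two-sided binomial tail probability under H0: P(above) = 0.5
    p = 2 * sum(comb(n, i) for i in range(tail + 1)) / 2 ** n
    return min(1.0, p)

# hypothetical monthly outputs (MWh) vs. a flat power-curve expectation
observed = [410, 395, 430, 388, 402, 415, 398, 407, 421, 390, 405, 412]
expected = [400] * 12
p = sign_test_p_value(observed, expected)
print(f"p-value = {p:.3f}")  # large p: no significant deviation from expectation
```

A large p-value means the observed performance does not differ significantly from expectation, which mirrors the acceptance conclusion reached in the paper.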
Abstract: Unmanned Aircraft Systems (UAS) usually navigate through the Global Navigation Satellite System (GNSS) associated with an Inertial Navigation System (INS). However, GNSS accuracy can be degraded at any time, or the GNSS signal can be lost entirely. In addition, there is the possibility of malicious interference, known as jamming. An image-based navigation system can solve this autonomy problem, because if the GNSS is disabled or degraded, the image navigation system would continue to provide coordinate information to the INS, preserving the autonomy of the system. This work aims to evaluate the accuracy of positioning through photogrammetry concepts. The methodology uses orthophotos and Digital Surface Models (DSM) as a reference to represent the object space, and photographs obtained during the flight to represent the image space. To calculate the coordinates of the perspective center and the camera attitudes, it is necessary to know the coordinates of homologous points in the object space (orthophoto coordinates and DSM altitude) and in the image space (column and line of the photograph). Thus, if the homologous points can be identified automatically in real time, the coordinates and attitudes can be calculated with their respective accuracies. With the methodology applied in this work, it is possible to verify maximum errors on the order of 0.5 m in the positioning and 0.6º in the attitude of the camera, so image-based navigation can reach an accuracy equal to or better than that of GNSS receivers without differential correction. Therefore, navigating through the image is a good alternative for enabling autonomous navigation.
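The mapping from image space to object space via homologous points can be illustrated in a simplified planar form: a least-squares 2-D similarity (Helmert) transform from image coordinates to orthophoto coordinates. This sketch uses hypothetical coordinates and omits the DSM altitude and attitude angles of the full method:

```python
def helmert_2d(img_pts, map_pts):
    """Least-squares 4-parameter similarity transform, image -> orthophoto."""
    n = len(img_pts)
    xm = sum(p[0] for p in img_pts) / n
    ym = sum(p[1] for p in img_pts) / n
    Xm = sum(p[0] for p in map_pts) / n
    Ym = sum(p[1] for p in map_pts) / n
    num_a = num_b = den = 0.0
    for (x, y), (X, Y) in zip(img_pts, map_pts):
        dx, dy, dX, dY = x - xm, y - ym, X - Xm, Y - Ym
        num_a += dx * dX + dy * dY
        num_b += dx * dY - dy * dX
        den += dx * dx + dy * dy
    a, b = num_a / den, num_b / den              # scale*cos, scale*sin
    tx = Xm - a * xm + b * ym
    ty = Ym - b * xm - a * ym
    return a, b, tx, ty

def to_map(params, p):
    """Apply the fitted transform to an image point."""
    a, b, tx, ty = params
    return a * p[0] - b * p[1] + tx, b * p[0] + a * p[1] + ty

# three hypothetical homologous points (image pixels -> orthophoto metres)
img_pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
map_pts = [(10.0, 20.0), (10.0, 22.0), (8.0, 20.0)]
params = helmert_2d(img_pts, map_pts)
X, Y = to_map(params, (1.0, 1.0))
print(f"predicted map position: ({X:.2f}, {Y:.2f})")
```

With redundant homologous points, the least-squares residuals of such a fit give the "respective accuracies" mentioned above.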
Abstract: The forced vibration problem of a delaminated beam made of fiber metal laminates is studied in this paper. First, a delamination is considered to divide the beam into four sections. Classic beam theory is assumed to govern each section. The layers on the two sides of the delamination are constrained to have the same deflection; this hypothesis also satisfies the compatibility conditions. Consequently, the dynamic response of the beam is obtained by means of the differential transform method (DTM). In order to verify the correctness of the results, a model is constructed using the commercial software ABAQUS 6.14. A linear spring with constant stiffness takes the effect of contact between the delaminated layers into account. The obtained semi-analytical results are in good agreement with the finite element analysis.
Abstract: Delays in the construction industry are a global phenomenon. Many construction projects experience extensive delays exceeding the initially estimated completion time. The main purpose of this study is to identify construction projects' typical behaviors in order to develop a prognosis and management tool. Knowing a construction project's schedule tendency will enable evidence-based decision-making, allowing corrective actions to be taken before delays occur. This study presents an innovative approach that uses the Cluster Analysis Method to support predictions during Earned Value Analyses. A clustering analysis was used to predict the future behavior of scheduling and of the principal Earned Value Management (EVM) and Earned Schedule (ES) indexes in construction projects. The analysis was made using a database of 90 different construction projects. It was validated with additional data extracted from the literature and with another 15 contrasting projects. For all projects, planned and executed schedules were collected and the principal EVM and ES indexes were calculated. A complete linkage classification method was used: this cluster analysis considers that the distance (or similarity) between two clusters must be measured by their most disparate elements, i.e. that the distance is given by the maximum span among their components. Finally, through the use of EVM and ES indexes and Tukey and Fisher pairwise comparisons, the statistical dissimilarity was verified and four clusters were obtained. It can be said that construction projects show an average delay of 35% of their planned completion time. Furthermore, four typical behaviors were found and, for each of the obtained clusters, the interim milestones and the necessary rhythms of construction were identified.
In general, the detected typical behaviors are: (1) Projects that achieve 5% of work progress in the first two tenths of their schedule and maintain a constant rhythm until completion (greater than 10% for each remaining tenth), being able to finish within the initially estimated time. (2) Projects that start with an adequate construction rate but suffer minor delays, culminating in a total delay of almost 27% of the planned time. (3) Projects that start with a performance below the planned rate and end up with an average delay of 64%, and (4) projects that begin with poor performance, suffer great delays, and end up with an average delay of 120% of the planned completion time. The obtained clusters compose a tool to identify the behavior of new construction projects by comparing their current work performance to the validated database, thus allowing the correction of initial estimations towards more accurate completion schedules.
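The complete-linkage rule described above — inter-cluster distance measured by the most disparate elements — can be sketched directly. The delay percentages below are hypothetical stand-ins for the project database:

```python
def complete_linkage(points, k):
    """Agglomerative clustering; inter-cluster distance = max pairwise span."""
    clusters = [[p] for p in points]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # complete linkage: distance between most disparate members
                d = max(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters[j]   # merge closest pair
        del clusters[j]
    return clusters

# hypothetical project delays (% of planned completion time)
delays = [0, 2, 5, 25, 27, 30, 60, 64, 66, 115, 120, 125]
groups = complete_linkage(delays, 4)
print(sorted(sorted(g) for g in groups))
```

On these toy delays the four resulting groups echo the four typical behaviors (on-time, ~27%, ~64% and ~120% delay).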
Abstract: Crankshaft length, connecting rod length, crank angle, engine rpm, cylinder bore, piston mass and compression ratio are the inputs that control the performance of the slider-crank mechanism and hence its efficiency. Several combinations of these seven inputs are used and compared. The output engine torque predicted by the simulation is analyzed through two different regression models, with and without interaction terms, developed by multi-linear regression using LU decomposition to solve the system of algebraic equations. These models are validated. A regression model in the seven inputs including their interaction terms lowered the polynomial degree from 3rd to 1st and yielded valid predictions and stable explanations.
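A minimal sketch of the fitting step: multi-linear regression solved through the normal equations with a Doolittle LU decomposition (no pivoting). The inputs and torque values are toy numbers, not the engine data:

```python
def lu_solve(A, b):
    """Solve A x = b via Doolittle LU decomposition (no pivoting)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        L[i][i] = 1.0
        for j in range(i + 1, n):
            L[j][i] = (A[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    # forward substitution L y = b, then back substitution U x = y
    y = [0.0] * n
    for i in range(n):
        y[i] = b[i] - sum(L[i][k] * y[k] for k in range(i))
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - sum(U[i][k] * x[k] for k in range(i + 1, n))) / U[i][i]
    return x

def fit_linear(X, y):
    """Least-squares coefficients via the normal equations X'X b = X'y."""
    cols = list(zip(*X))
    XtX = [[sum(u * v for u, v in zip(c1, c2)) for c2 in cols] for c1 in cols]
    Xty = [sum(u * t for u, t in zip(c, y)) for c in cols]
    return lu_solve(XtX, Xty)

# toy torque model: intercept + two scaled inputs, generated from 1 + 2*x1 + 3*x2
X = [[1, 1, 2], [1, 2, 1], [1, 3, 4], [1, 4, 3]]
y = [9, 8, 19, 18]
print([round(v, 6) for v in fit_linear(X, y)])  # → [1.0, 2.0, 3.0]
```

Interaction terms are handled the same way: each product of two inputs simply becomes an extra column of X before the normal equations are assembled.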
Abstract: The advancement in various concrete ingredients like plasticizers, additives and fibers, etc. has enabled concrete technologists to develop many viable varieties of special concretes in recent decades. These varieties of concrete offer significant enhancements in the green as well as hardened properties of concrete. A prudent selection of the appropriate type of concrete can resolve many design and application issues in construction projects. This paper focuses on the usage of self-compacting concrete, high early strength concrete, structural lightweight concrete, fiber reinforced concrete, high performance concrete and ultra-high strength concrete in structures. The modified properties of strength at various ages, flowability, porosity, equilibrium density, flexural strength, elasticity, permeability etc. need to be carefully studied and incorporated into the design of the structures. The paper demonstrates various mixture combinations and the concrete properties that can be leveraged. The selection of such products based on the end use of structures has been proposed in order to efficiently utilize the modified characteristics of these concrete varieties. The study involves mapping the characteristics with benefits and savings for the structure from a design perspective. Self-compacting concrete in the structure is characterized by high shuttering loads, better finish, and the feasibility of closer reinforcement spacing. The structural design procedures can be modified to specify higher formwork strength, height of vertical members, cover reduction and increased ductility. The transverse reinforcement can be spaced at closer intervals compared to regular structural concrete. Structural lightweight concrete allows structures to be designed for reduced dead load and increased insulation properties. Member dimensions and steel requirements can be reduced in proportion to the roughly 25 to 35 percent reduction in dead load due to the self-weight of concrete.
Steel fiber reinforced concrete can be used to design grade slabs without primary reinforcement because of its 70 to 100 percent higher tensile strength. The design procedures incorporate reductions in thickness and joint spacing. High performance concrete extends the life of structures by improving paste characteristics and durability through the incorporation of supplementary cementitious materials. Often, these are also designed for slower heat generation in the initial phase of hydration. The structural designer can incorporate the slow development of strength in the design and specify 56- or 90-day strength requirements. For designing high-rise building structures, the creep and elasticity properties of such concrete also need to be considered. Lastly, certain structures require performance under loading conditions much earlier than the final maturity of concrete. High early strength concrete has been designed to cater to a variety of usages at ages as early as 8 to 12 hours. Therefore, an understanding of concrete performance specifications for special concrete is a definite step towards a superior structural design approach.
Abstract: Due to the numerous advantages of steel corrugated
web girders, its application field is growing for bridges as well as for
buildings. The global stability resistance of such girders is significantly higher than that of conventional I-girders with flat webs; thus, the amount of structural steel required can be significantly reduced. Design codes and specifications do not provide
clear and complete rules or recommendations for the determination of
the lateral torsional buckling (LTB) resistance of corrugated web
girders. Therefore, the authors made a thorough investigation
regarding the LTB resistance of the corrugated web girders. Finite
element (FE) simulations have been performed to develop new
design formulas for the determination of the LTB resistance of
trapezoidally corrugated web girders. The FE model is developed with geometrically and materially nonlinear analysis using equivalent geometric imperfections (GMNI analysis). The equivalent
geometric imperfections involve the initial geometric imperfections
and residual stresses coming from rolling, welding and flame cutting.
An imperfection sensitivity analysis was performed to determine the necessary imperfection magnitudes, considering only first-eigenmode shape imperfections. With the help of the validated FE model, an extended
parametric study is carried out to investigate the LTB resistance for
different trapezoidal corrugation profiles. First, the critical moment of
a specific girder was calculated by the FE model. The critical moments
from the FE calculations are compared to the previous analytical
calculation proposals. Then, nonlinear analysis was carried out to
determine the ultimate resistance. Based on the numerical investigations, new proposals are developed for the determination of
the LTB resistance of trapezoidally corrugated web girders through a
modification factor on the design method related to the conventional
flat web girders.
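For context, the flat-web design method that the proposed modification factor would scale is the usual buckling-curve reduction: a relative slenderness computed from the critical moment, then a reduction factor applied to the cross-section resistance. A sketch with illustrative section values, assuming an EC3-style general case with imperfection factor alpha = 0.34 (not the paper's calibrated values):

```python
from math import sqrt

def chi_LT(M_cr, W_y, f_y, alpha=0.34):
    """EC3-style LTB reduction factor (general case, flat-web girders)."""
    lam = sqrt(W_y * f_y / M_cr)                    # relative slenderness
    phi = 0.5 * (1 + alpha * (lam - 0.2) + lam ** 2)
    return min(1.0, 1 / (phi + sqrt(phi ** 2 - lam ** 2)))

# illustrative girder: W_y = 2.0e6 mm^3, f_y = 355 MPa, M_cr = 1.2e9 Nmm
chi = chi_LT(1.2e9, 2.0e6, 355.0)
M_Rd = chi * 2.0e6 * 355.0      # buckling resistance in Nmm (no partial factor)
print(f"chi_LT = {chi:.3f}")
```

The proposals of the paper then enter as a modification factor on this flat-web result to account for the trapezoidally corrugated web.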
Abstract: As part of a ‘Morphing-Wing’ concept, this study measures the efficiency of a winglet that is able to change its shape during flight. Conventionally, winglets are fixed vertical surfaces at the wingtips, optimized for a cruise condition that the
airplane should use most of the time. However, during a cruise, an
airplane flies through many cruise conditions corresponding to altitude variations from 30,000 to 45,000 ft. The fixed winglets are
not optimized for these variations, and consequently, they are
expected to generate additional drag, and thus to deteriorate aircraft fuel consumption. This research assumes that there exists a winglet position
that reduces the fuel consumption for each cruise condition. In this
way, the methodology aims to find these optimal winglet positions,
and to further simulate, and thus estimate the fuel consumption of an
aircraft equipped with this type of adaptive winglet during several cruise
conditions. The adaptive winglet is assumed to have degrees of
freedom given by variations of the following parameters: the tip chord, and the sweep and dihedral angles. Finally, results obtained
during cruise simulations are presented in this paper. These results
show that an adaptive winglet can reduce the fuel consumption of an aircraft by up to 2.12% during a cruise.
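Finding the optimal winglet position per cruise condition amounts to a search over the three degrees of freedom. The sketch below uses a brute-force grid search with a purely hypothetical surrogate fuel-burn model standing in for the aerodynamic simulation; the grids, coefficients and optima are all illustrative assumptions:

```python
from itertools import product

def fuel_burn(tip_chord, sweep, dihedral, altitude_kft):
    """Hypothetical surrogate model: a quadratic bowl whose optimum drifts
    with altitude (stands in for the CFD/flight-dynamics simulation)."""
    opt_sweep = 30 + 0.6 * (altitude_kft - 30)      # deg, assumed drift
    opt_dihedral = 60 + 0.8 * (altitude_kft - 30)   # deg, assumed drift
    return (1000 + (tip_chord - 0.5) ** 2 * 40
            + (sweep - opt_sweep) ** 2 * 0.02
            + (dihedral - opt_dihedral) ** 2 * 0.01)

def best_position(altitude_kft):
    """Exhaustive search over the three winglet degrees of freedom."""
    grid = product([0.3, 0.5, 0.7],     # tip chord (m)
                   range(20, 41, 5),    # sweep angle (deg)
                   range(40, 91, 10))   # dihedral angle (deg)
    return min(grid, key=lambda g: fuel_burn(*g, altitude_kft))

for alt in (30, 37, 45):
    print(alt, best_position(alt))
```

The optimum shifting with altitude is exactly what motivates an adaptive winglet over a fixed one.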
Abstract: The purpose of the present research is to equate two
test forms as part of a study to evaluate the educational effectiveness
of the ARTé: Mecenas art history learning game. The researcher
applied Item Response Theory (IRT) procedures to calculate item,
test, and mean-sigma equating parameters. With the sample size
n=134, test parameters indicated “good” model fit but low Test
Information Functions and more acute than expected equating
parameters. Therefore, the researcher applied equipercentile equating
and linear equating to raw scores and compared the equated form
parameters and effect sizes from each method. Item scaling in IRT
enables the researcher to select a subset of well-discriminating items.
The mean-sigma step produces a mean-slope adjustment from the
anchor items, which was used to scale the score on the new form
(Form R) to the reference form (Form Q) scale. In equipercentile
equating, scores are adjusted to align the proportion of scores in each
quintile segment. Linear equating produces a mean-slope adjustment,
which was applied to all core items on the new form. The study
followed a quasi-experimental design with purposeful sampling of
students enrolled in a college level art history course (n=134) and
counterbalancing design to distribute both forms on the pre- and post-tests.
The Experimental Group (n=82) was asked to play ARTé:
Mecenas online and complete Level 4 of the game within a two-week
period; 37 participants completed Level 4. Over the same period, the
Control Group (n=52) did not play the game. The researcher
examined between group differences from post-test scores on test
Form Q and Form R by full-factorial Two-Way ANOVA. The raw
score analysis indicated a 1.29% direct effect of form, which was
statistically non-significant but may be practically significant. The
researcher repeated the between group differences analysis with all
three equating methods. For the IRT mean-sigma adjusted scores,
form had a direct effect of 8.39%. Mean-sigma equating with a small
sample may have resulted in inaccurate equating parameters.
Equipercentile equating aligned test means and standard deviations,
but resultant skewness and kurtosis worsened compared to raw score
parameters. Form had a 3.18% direct effect. Linear equating
produced the lowest Form effect, approaching 0%. Using linearly
equated scores, the researcher conducted an ANCOVA to examine
the effect size in terms of prior knowledge. The between group effect
size for the Control Group versus Experimental Group participants
who completed the game was 14.39% with a 4.77% effect size
attributed to pre-test score. Playing and completing the game
increased art history knowledge, and individuals with low prior
knowledge tended to gain more from pre-test to post-test. Ultimately,
researchers should approach test equating based on their theoretical
stance on Classical Test Theory and IRT and the respective assumptions. Regardless of the approach or method, test equating
requires a representative sample of sufficient size. With small sample
sizes, the application of a range of equating approaches can expose
item and test features for review, inform interpretation, and identify
paths for improving instruments for future study.
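The linear-equating step — a mean-slope adjustment mapping new-form scores onto the reference-form scale — can be sketched as follows; the score vectors are hypothetical, not the study data:

```python
from statistics import mean, pstdev

def linear_equate(new_scores, ref_scores):
    """Map Form R raw scores onto the Form Q scale: eq(x) = A*x + B,
    where A matches the standard deviations and B matches the means."""
    A = pstdev(ref_scores) / pstdev(new_scores)   # slope
    B = mean(ref_scores) - A * mean(new_scores)   # intercept
    return [A * x + B for x in new_scores], (A, B)

# hypothetical raw scores on the two forms
form_r = [10, 12, 14, 16, 18]   # new form
form_q = [20, 23, 26, 29, 32]   # reference form
equated, (A, B) = linear_equate(form_r, form_q)
print(f"slope A = {A:.2f}, intercept B = {B:.2f}")
```

After this adjustment the equated Form R scores have the same mean and standard deviation as the Form Q scores, which is why the form effect approaches 0% under linear equating.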
Abstract: The world-wide population of people over 60 years
of age is growing rapidly. The explosion is placing increasingly
onerous demands on individual families, multiple industries and
entire countries. Current, human-intensive approaches to eldercare
are not sustainable, but IoT and AI technologies can help. The
Knowledge Reactor (KR) is a contextual, data fusion engine built to
address this and other similar problems. It fuses and centralizes IoT
and System of Record/Engagement data into a reactive knowledge
graph. Cognitive applications and services are constructed with its
multiagent architecture. The KR can scale up and scale down because
it exploits container-based, horizontally scalable services for graph
store (JanusGraph) and pub-sub (Kafka) technologies. While the KR
can be applied to many domains that require IoT and AI technologies,
this paper describes how the KR specifically supports the challenging
domain of cognitive eldercare. Rule- and machine learning-based
analytics infer activities of daily living from IoT sensor readings. KR
scalability, adaptability, flexibility and usability are demonstrated.
Abstract: Tire noise has a significant impact on ride quality
and vehicle interior comfort, even at low frequency. Reduction of
tire noise is especially important due to strict state and federal
environmental regulations. The primary sources of tire noise are the
low frequency structure-borne noise and the noise that originates from
the release of trapped air between the tire tread and road surface
during each revolution of the tire. The frequency response of the tire
changes at low and high frequency. At low frequency, the tension
and bending moment become dominant, while the internal structure
and local deformation become dominant at higher frequencies. Here,
we analyze tire response in terms of deformation and rolling velocity
at low revolution frequency. An Abaqus finite element model
is used to calculate the static and dynamic response of a rolling tire
under different rolling conditions. The natural frequencies and mode
shapes of a deformed tire are calculated with the FEA package where
the subspace-based steady state dynamic analysis calculates dynamic
response of tire subjected to harmonic excitation. The analysis was
conducted on the dynamic response at the road (contact point of tire
and road surface) and side nodes of a static and rolling tire when
the tire was excited with a 200 N vertical load at frequencies ranging
from 20 to 200 Hz. The results show that frequency has little effect on
tire deformation up to 80 Hz, but between 80 and 200 Hz, the radial
and lateral components of displacement of the road and side nodes
exhibited significant oscillation. For the static analysis, the fluctuation
was sharp and frequent and decreased with frequency. In contrast, the
fluctuation was periodic in nature for the dynamic response of the
rolling tire. In addition to the dynamic analysis, a steady state rolling
analysis was also performed on the tire traveling at ground velocity
with a constant angular motion. The purpose of the computation
was to demonstrate the effect of rotating motion on deformation and
rolling velocity with respect to a fixed Newtonian reference point.
The analysis showed a significant variation in deformation and rolling
velocity due to centrifugal and Coriolis acceleration with respect to
a fixed Newtonian point on ground.
Abstract: Deteriorating quality of the pedestrian environment
and the increasing risk of pedestrian crashes are major concerns for
most of the cities in India. The recent shift in priority to motorized transport and the deteriorating condition of existing pedestrian facilities can be considered the prime reasons for the increasing
pedestrian-related crashes in India. Bengaluru City – the IT capital of the nation – is not much different; the increasing number of pedestrian crashes in Bengaluru reflects this. To
resolve this issue and to ensure safe, sustainable and pedestrian
friendly sidewalks, the Government of Karnataka, India has implemented a new pedestrian sidewalk programme named Tender S.U.R.E. (Specifications for Urban Road Execution).
Tender SURE adopts unique urban street design guidelines where the
pedestrians are given prime preference. The present study presents an
assessment of the quality and performance of the pedestrian sidewalks
and the walkability index of the newly built pedestrian friendly
sidewalks. Various physical and environmental factors affecting
pedestrian safety are identified and studied in detail. The pedestrian
mobility is quantified through Pedestrian Level of Service (PLoS)
and the pedestrian walking comfort is measured by calculating the
Walkability Index (WI). It is observed that the new initiatives taken to improve pedestrian safety have succeeded in Bengaluru, attaining a Level of Service of ‘A’ and a good WI score.
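A Walkability Index of this kind is typically a weighted composite of factor ratings. The sketch below assumes hypothetical factors and weights purely for illustration; the study's actual factor set and weighting scheme may differ:

```python
def walkability_index(scores, weights):
    """Weighted composite of sidewalk factor ratings (illustrative weights)."""
    assert set(scores) == set(weights)   # every rated factor needs a weight
    return sum(scores[f] * weights[f] for f in scores) / sum(weights.values())

# hypothetical factor ratings on a 1-5 scale for a Tender SURE stretch
weights = {"surface": 3, "obstructions": 3, "continuity": 2,
           "security": 2, "comfort": 1}
scores = {"surface": 5, "obstructions": 4, "continuity": 5,
          "security": 4, "comfort": 4}
wi = walkability_index(scores, weights)
print(f"WI = {wi:.2f}")  # → WI = 4.45
```

On this illustrative 1-5 scale, a composite near the top of the range corresponds to the "good WI score" reported for the new sidewalks.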
Abstract: Butterfly valves are widely used industrial piping components as on-off and flow controlling devices. The main challenge in the design process of this type of valves is the correct dimensioning to ensure proper mechanical performance as well as to minimise flow losses that affect the efficiency of the system. Butterfly valves are typically dimensioned in a closed position based on mechanical approaches considering uniform hydrostatic pressure, whereas the flow losses are analysed by means of CFD simulations. The main limitation of these approaches is that they do not consider either the influence of the dynamics of the manoeuvring stage or coupled phenomena. Recent works have included the influence of the flow on the mechanical behaviour for different opening angles by means of one-way FSI approach. However, these works consider steady-state flow for the selected angles, not capturing the effect of the transient flow evolution during the manoeuvring stage. Two-way FSI modelling approach could allow overcoming such limitations providing more accurate results. Nevertheless, the use of this technique is limited due to the increase in the computational cost. In the present work, the applicability of FSI one-way and two-way approaches is evaluated for the analysis of butterfly valves, showing that not considering fluid-structure coupling involves not capturing the most critical situation for the valve disc.
Abstract: Minimizing the weight in flexible structures means
reducing material and costs as well. However, these structures could
become prone to vibrations. Attenuating these vibrations has become
a pivotal engineering problem that shifted the focus of many research
endeavors. One technique to do that is to design and implement
an active control system. This system is mainly composed of a
vibrating structure, a sensor to perceive the vibrations, an actuator
to counteract the influence of disturbances, and finally a controller to
generate the appropriate control signals. In this work, two different
techniques are explored to create two different mathematical models
of an active control system. The first model is a finite element model
with a reduced number of nodes and it is called a super-element.
The second model is a state-space representation, i.e. a set of first-order ordinary differential equations. The damping coefficients are
calculated and incorporated into both models. The effectiveness of
these models is demonstrated when the system is excited by its first
natural frequency and an active control strategy is developed and
implemented to attenuate the resulting vibrations. Results from both
modeling techniques are presented and compared.
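The closed-loop idea — a sensor feeding a controller whose actuator counteracts resonance — can be sketched with a single-mode state-space model excited at its natural frequency and a simple velocity-feedback law. The parameters and the control strategy are assumptions for illustration, not the paper's actual implementation:

```python
from math import sin, pi

def simulate(active, m=1.0, k=(2 * pi) ** 2, c=0.02, gain=2.0,
             dt=1e-3, steps=20000):
    """Semi-implicit Euler on m*x'' + c*x' + k*x = sin(wn*t) - u,
    with velocity feedback u = gain * x' when the controller is active."""
    wn = (k / m) ** 0.5                  # excite at the natural frequency
    x = v = 0.0
    peak = 0.0
    for n in range(steps):
        t = n * dt
        u = gain * v if active else 0.0  # actuator force from the controller
        a = (sin(wn * t) - c * v - k * x - u) / m
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

p_open = simulate(False)
p_closed = simulate(True)
print(f"peak |x| open-loop {p_open:.3f}, closed-loop {p_closed:.3f}")
```

Velocity feedback acts as added damping, so the resonant response collapses once the loop is closed — the same qualitative effect the two models are used to demonstrate.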
Abstract: An increasing degree of automation in air traffic will also change the role of the air traffic controller (ATCO). ATCOs will fulfill significantly more monitoring tasks compared to today. However, this rather passive role may lead to Out-Of-The-Loop (OOTL) effects comprising vigilance decrement and less situation awareness. The project MINIMA (Mitigating Negative Impacts of Monitoring high levels of Automation) has conceived a system to control and mitigate such OOTL phenomena. In order to demonstrate the MINIMA concept, an experimental simulation set-up has been designed. This set-up consists of two parts: 1) a Task Environment (TE) comprising a Terminal Maneuvering Area (TMA) simulator as well as 2) a Vigilance and Attention Controller (VAC) based on neurophysiological data recording such as electroencephalography (EEG) and eye-tracking devices. The current vigilance level and the attention focus of the controller are measured during the ATCO’s active work in front of the human machine interface (HMI). The derived vigilance level and attention trigger adaptive automation functionalities in the TE to avoid OOTL effects. This paper describes the full-scale experimental set-up and the component development work towards it. Hence, it encompasses a pre-test whose results influenced the development of the VAC as well as the functionalities of the final TE and the two VAC’s sub-components.
Abstract: With recent trends in Big Data and advancements
in Information and Communication Technologies, the healthcare
industry is in transition from clinician-oriented to technology-oriented. Many people around the world die of cancer
because the disease was not diagnosed at an early stage.
Nowadays, the computational methods in the form of Machine
Learning (ML) are used to develop automated decision support
systems that can diagnose cancer with high confidence in a timely
manner. This paper aims to carry out the comparative evaluation
of a selected set of ML classifiers on two existing datasets: breast
cancer and cervical cancer. The ML classifiers compared in this study
are Decision Tree (DT), Support Vector Machine (SVM), k-Nearest
Neighbor (k-NN), Logistic Regression, Ensemble (Bagged Tree) and
Artificial Neural Networks (ANN). The evaluation is carried out based
on standard evaluation metrics Precision (P), Recall (R), F1-score and
Accuracy. The experimental results based on the evaluation metrics
show that the ANN achieved the highest accuracy (99.4%) when tested with the breast cancer dataset. On the other hand, when the ML classifiers were tested with the cervical cancer dataset, the Ensemble (Bagged Tree) technique gave the best accuracy (93.1%) in comparison
to other classifiers.
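The evaluation metrics used above follow directly from a binary confusion matrix. A sketch with a hypothetical test split (the counts are illustrative, not the study's):

```python
def metrics(tp, fp, fn, tn):
    """Precision, Recall, F1-score and Accuracy from a confusion matrix."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, f1, accuracy

# hypothetical split: 29 true positives cases, 71 negatives, 100 samples total
p, r, f1, acc = metrics(tp=25, fp=3, fn=4, tn=68)
print(f"P={p:.3f} R={r:.3f} F1={f1:.3f} Acc={acc:.3f}")
```

On imbalanced medical datasets, precision, recall and F1 complement accuracy, which is why all four metrics are reported.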
Abstract: The rapid progress of technology in today's competitive conditions has also accelerated companies' technology development activities. As a result, companies are paying more attention to R&D studies and are beginning to allocate a larger share to R&D projects. A more systematic, comprehensive, target-oriented implementation of R&D studies is crucial for a company to achieve successful results. As a consequence, the Technology Roadmap (TRM) is gaining importance as a management tool. It is critical for achieving medium- and long-term success, as it captures decisions about past business, future plans, and technological infrastructure. When studies on TRM are examined, it is seen that projects to be placed on the roadmap are selected by many different methods, generally based on multi-criteria decision making. The management of the selected projects becomes an important point after the selection phase, and at this stage the TRM is used. A TRM can be created in many different ways, so that each institution can prepare its own Technology Roadmap according to its strategic plan. Depending on the intended use, there can be TRMs with different layers and different sizes. In the evaluation phase of its R&D projects and in the creation of its TRM, HAVELSAN, Turkey's largest defense company in the software field, carries out this process with great care and diligence. At the beginning, suggested R&D projects are evaluated by the Technology Management Board (TMB) of HAVELSAN in accordance with the company's resources, objectives, and targets. These projects are presented to the TMB periodically for evaluation within the framework of certain criteria by board members. After the necessary steps have been passed, the approved projects are added to the time-based TRM, which is composed of four layers: market, product, project and technology.
The use of a four-layered roadmap provides a clearer understanding and visualization of company strategy and objectives. This study demonstrates the benefits of using TRM, four-layered Technology Roadmapping and the possibilities for the institutions in the defense industry.
Abstract: The objective of this research is to optimize the process of cutting cylindrical workpieces utilizing live tooling on a HAAS ST-20 lathe. Surface roughness (Ra) has been investigated as the indicator of quality characteristics for the machining process. Aluminum alloy was used to conduct the experiments due to its wide range of usages in engineering structures and components where light weight or corrosion resistance is required. In this study, the Taguchi methodology is utilized to determine the effect that each of the parameters has on surface roughness (Ra). A total of 18 experiments of each process were designed according to Taguchi’s L9 orthogonal array (OA) with four control factors at three levels each, and signal-to-noise ratios (S/N) were computed with the smaller-the-better equation for minimizing the system. The optimal parameters identified for the surface roughness of the turning operation utilizing live tooling were a feed rate of 3 inches/min (A3); a spindle speed of 1300 rpm (B3); a 2-flute titanium nitride coated 3/8” endmill (C1); and a depth of cut of 0.025 inches (D2). The mean surface roughness of the confirmation runs in the turning operation was 8.22 micro-inches. The final results demonstrate that the Taguchi methodology is an effective way of improving surface roughness in the turning process.
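The smaller-the-better S/N ratio used in Taguchi analysis is S/N = -10·log10(mean(y²)). A sketch with hypothetical Ra replicates:

```python
from math import log10

def sn_smaller_better(responses):
    """Taguchi smaller-the-better signal-to-noise: -10 log10(mean(y^2))."""
    return -10 * log10(sum(y * y for y in responses) / len(responses))

# hypothetical Ra replicates (micro-inches) for two trial settings
print(round(sn_smaller_better([8.0, 8.4]), 2))
print(round(sn_smaller_better([12.1, 11.5]), 2))  # rougher surface -> lower S/N
```

The factor levels that maximize this S/N ratio across the orthogonal-array trials are the ones selected as optimal.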
Abstract: Comparisons of mycobacterial genomes have identified several Mycobacterium tuberculosis-specific genomic regions that are absent in other mycobacteria and are known as regions of difference. Due to their M. tuberculosis specificity, the peptides encoded by these regions could be useful in the specific diagnosis of tuberculosis. To explore this possibility, overlapping synthetic peptides corresponding to 39 proteins predicted to be encoded by genes present in regions of difference were tested for antibody reactivity with sera from tuberculosis patients and healthy subjects. The results identified four immunodominant peptides corresponding to four different proteins, with three of the peptides showing significantly stronger antibody reactivity and rate of positivity with sera from tuberculosis patients than from healthy subjects. The fourth peptide was recognized equally well by the sera of tuberculosis patients and healthy subjects. Prediction of antibody epitopes by bioinformatics analyses using the ABCpred server identified multiple linear epitopes in each peptide. Furthermore, peptide sequence analysis for sequence identity using BLAST suggested M. tuberculosis specificity for the three peptides that had preferential reactivity with sera from tuberculosis patients, whereas the peptide with equal reactivity with sera of TB patients and healthy subjects showed significant identity with sequences present in non-tuberculous mycobacteria. The three identified M. tuberculosis-specific immunodominant peptides may be useful in the serological diagnosis of tuberculosis.
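The overlapping-peptide design can be sketched as simple tiling of a protein sequence; the sequence, peptide length and overlap below are hypothetical, not the panel used in the study:

```python
def overlapping_peptides(seq, length=25, overlap=10):
    """Tile a protein sequence into overlapping synthetic peptides."""
    step = length - overlap
    peptides = []
    i = 0
    while i < len(seq):
        peptides.append(seq[i:i + length])   # final peptide may be shorter
        if i + length >= len(seq):
            break
        i += step
    return peptides

protein = "MAKLSTDELLDAFKEMTLLELSDFVKKFEETFEVTAAAPVAVAGGA"  # hypothetical sequence
peptides = overlapping_peptides(protein)
for p in peptides:
    print(p)
```

Overlapping the tiles ensures that no linear epitope is split between two peptides, which matters when the peptides are screened for antibody reactivity.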
Abstract: Emotion dysregulation has been linked to psychopathology in general and, in particular, to substance abuse and other addiction-related disorders, such as eating disorders, impulsive disorder, and gambling. It has been proposed that a lessening of the difficulties in emotion regulation can have a significant positive impact on the treatment of these disorders. The present study explores the association between the progress in the Change & Grow® therapeutic model (5 stages of treatment), and the decrease in the difficulties related to emotion regulation. The Change & Grow® model has five stages of treatment according to the model’s five principles (Truth, Acceptance, Gratitude, Love and Responsibility) and incorporates different therapeutic approaches such as positive psychology, cognitive and behavioral therapy and third generation therapies. The main objective is to understand the impact of the presented therapeutic model on difficulties in emotion regulation in patients with addiction-related disorders. The exploratory study has a cross-sectional design. Participants were 44 (15 women and 29 men) Portuguese patients in the residential Villa Ramadas International Treatment Centre. The instrument used was the Portuguese version of the Difficulties in Emotion Regulation Scale (DERS), which measures six dimensions of emotion regulation (Strategies, Non-acceptance, Awareness, Impulse, Goals, and Clarity). The mean rank scores for both the DERS total score and the Impulse subscale showed statistically significant differences according to Stage of Treatment/Principles. Furthermore, Stage of Treatment/Principles held a negative correlation with the scores of the Non-acceptance and Impulse subscales, as well as the DERS total score. The results indicate that the Change & Grow® model seems to have an impact in lessening the patient’s difficulties in emotion regulation. 
The Impulse dimension showed the greatest impact, which supports the well-known relevance of impulse control, or related difficulties, in addiction-related disorders.
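The reported negative correlation between treatment stage and DERS scores is the kind of relationship a rank correlation captures; a sketch of Spearman's rho (no-ties formula) on hypothetical data:

```python
def spearman_rho(x, y):
    """Spearman rank correlation, simple no-ties formula."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# hypothetical data: treatment stage (1-5) vs. DERS total score
stage = [1, 2, 3, 4, 5]
ders = [120, 104, 95, 88, 70]
print(spearman_rho(stage, ders))  # → -1.0
```

A rho near -1 indicates that difficulties in emotion regulation decrease monotonically as patients advance through the stages, the pattern the study reports for the DERS total and Impulse scores.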