


DOI: https://doi.org/10.1515/auto-2019-0081
Open Access. Published by De Gruyter Oldenbourg, November 5, 2019, under a CC BY 4.0 license.

A new distributed co-simulation architecture for multi-physics based energy systems integration

Analysis of multimodal energy systems

Hüseyin Çakmak, Anselm Erdmann, Michael Kyesswa, Uwe Kühnapfel and Veit Hagenmeyer

Abstract

Simulating energy systems integration scenarios enables a comprehensive consideration of the interdependencies between multimodal energy grids. It is an important part of the planning for the redesign of the current energy system infrastructure, which is essential for the foreseen drastic reduction of carbon emissions. In contrast to the complex implementation of monolithic simulation architectures, emerging distributed co-simulation technologies enable the combination of several existing single-domain simulations into one large energy systems integration simulation. The accompanying disadvantages of coupling simulators have to be minimized by an appropriate co-simulation architecture. Hence, in the present paper, a new simulation architecture for energy systems integration co-simulation is introduced, which enables easy and fast handling of the required simulation setup. The performance of the new distributed co-simulation architecture for energy systems integration is shown by a campus grid scenario with a focus on the effects of power-to-gas and the reversal process on the electricity grid. The implemented control strategy enables a successful co-simulation of electrolysis coupled with photovoltaics, a hydrogen storage with a combined heat and power plant, and a variable power consumption.

Zusammenfassung

Simulating scenarios of sector-coupled energy systems enables a comprehensive investigation of interdependencies in multimodal energy grids. It represents an important part of the planning for the redesign of the energy system infrastructure, which is essential for the intended reduction of CO2 emissions. In contrast to complex, monolithic simulation architectures, novel distributed co-simulation technologies enable the combination of several existing single-domain simulations into one comprehensive, integrated simulation of the entire energy system. The accompanying disadvantages of coupled simulators have to be minimized by a suitable co-simulation architecture. In this article, a new architecture for the co-simulation of multi-physics based energy systems integration is presented, which enables easy and fast handling of the required simulation setup. The performance of the new distributed co-simulation architecture for energy systems integration is illustrated by a campus grid scenario with a focus on the effects of power-to-gas and gas-to-power on the electricity grid. With an implemented control strategy, the successful operation of the electricity grid together with photovoltaics, electrolysis, a hydrogen storage and a combined heat and power plant as well as changing loads is simulated, and the corresponding results are presented.

1 Introduction

Figure 1: German gas transmission system (nominal pressure: 40–120 bar, brown) and German electricity grid according to NEP2030 (450 kV HVDC: magenta, 380 kV AC: red, 220 kV AC: green, 110 kV AC: blue).

The energy transition is a long-term process aimed at achieving carbon neutrality within the constraint of the estimated remaining carbon budget according to [1]. In addition, the transition from a system driven mainly by easily controllable fossil-fired and nuclear power plants to an increasingly volatile production of renewable energy requires new components in the grid to ensure the stability of the electricity supply. In particular, the known disadvantages of direct storage of electrical energy, such as limited efficiency, dimensioning, demand for raw materials and suboptimal sustainability [2], require the consideration of alternative storage options. A possible shift of the storage challenge to other domains such as gas [3], [4] or heat [5] using energy systems integration strategies appears to be a promising approach to ensure the energy supply while taking resilience and economic aspects into account.

The design and implementation of a multimodal energy system requires possibilities for the efficient simulation with multi-physics models. This in turn requires the availability of complete electricity and gas networks with a high level of detail, as shown in figure 1 for Germany, in order to simulate various scenarios of energy systems integration at transportation and distribution grid levels.

Currently, energy systems integration simulations are mainly carried out by experts of a single domain, while other domains are simplified as in [6]. Ideally, specialized simulation tools and the associated single-domain models need to be coupled in a so-called co-simulation for the analysis of interdependencies between different domains. For this reason, there is a need for a new architecture that enables efficient collaboration between experts from different domains, aimed at accelerating and simplifying the coupling of specialized simulators and single-domain models. The contribution of the present paper is a new architecture for distributed co-simulation, which enables easy and fast handling of the setup required for successful multi-physics based energy systems integration simulation.

The present paper is structured as follows: In section 2, the state of the art of energy systems integration simulators is introduced. Section 3 describes the structure of the new co-simulation architecture. Section 4 discusses selected topics of the implementation, such as the efficient setup procedure, simulator communication and synchronization. The performance of the new architecture is shown by a campus grid scenario in section 5. The paper is concluded by a discussion and outlook in section 6.

2 Related work

For clarification of terms, the number of models (m) and solvers (s) is decisive for the type of simulation. In the literature [7], [8], an ordinary simulation provides one solver for one model (m=s=1). For an increased number of solvers (m=1, s>1), the term parallel simulation is used [8]. Simulations can be executed on a single computer as a monolithic simulation or on multiple hardware platforms as a distributed simulation. Hybrid simulation [9] is the term used to describe that multiple models developed in different modeling environments form a monolithic unit that can be solved with one solver (m>1, s=1). The coupling of independent simulators (m>1, s>1), which run their own models in usually small local time steps and exchange data at fixed global time steps, is defined as co-simulation.

A survey on the state of the art of co-simulation is given in [10], [11]. In particular, topics such as the various types of co-simulation in general (discrete event, continuous time and hybrid co-simulation) as well as the handling of algebraic loops and stability issues are discussed. Further selected topics on the development of a co-simulation environment, such as synchronization methods and communication paths for discrete and continuous simulators, are described in [8]. In the literature, co-simulation issues are mainly addressed for systems described by differential algebraic equations (DAE), which are split into coupled subsystems. For strong coupling, all subsystem equations are exported and embedded into one global solver for numerical solution. In contrast, with weak coupling, each subsystem calculates a numerical solution for a time step either in parallel (Jacobi) or sequentially (Gauss-Seidel), whereby the output of one simulator is used as input for the next one [12]. In this case, the current values from other simulators are estimated via extrapolation. Issues with numerical inaccuracies are discussed in [13], and an example is given in [14] for voltage control, where a Simulink control block is co-simulated with a power system model for time-domain simulation in DIgSILENT PowerFactory. Further issues in the context of co-simulation are stability [15], algebraic loops [16] and convergence [17].
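To make the difference between the two weak-coupling schemes concrete, the following minimal Python sketch contrasts one global step of Jacobi and Gauss-Seidel coupling for two subsystems; the step methods and scalar coupling variables are illustrative assumptions, not part of any cited tool.

```python
# Minimal sketch of weak-coupling schemes for two subsystems; the
# step functions and coupling variables are illustrative assumptions.

def jacobi_step(sim_a, sim_b, u_ab, u_ba, h):
    """Parallel (Jacobi) coupling: both subsystems advance one global
    step h using the coupling values from the previous step."""
    y_a = sim_a.step(u_ba, h)   # A uses B's old output
    y_b = sim_b.step(u_ab, h)   # B uses A's old output
    return y_a, y_b

def gauss_seidel_step(sim_a, sim_b, u_ba, h):
    """Sequential (Gauss-Seidel) coupling: B already uses A's fresh
    output within the same global step."""
    y_a = sim_a.step(u_ba, h)
    y_b = sim_b.step(y_a, h)    # B uses A's new output
    return y_a, y_b
```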

A broad and extensive introduction to the topic is given in [8], [18] in the context of smart grids, comprising power grid simulation together with information and communication technology (ICT). The SimPy [19] based framework Mosaik [20] enables the coupling of power grid simulators such as PYPOWER [21] by providing an API for agent-based models and event-based simulation execution, together with the simulation of communication network protocols and services for smart grid scenarios using OMNeT++ [22]. Within the HELICS project, the focus is on providing a general-purpose, modular, highly scalable co-simulation framework for power grid simulators [23]. A co-simulation with two real-time digital simulators RTDS [24] for studying a synchronization mechanism for two subsystems connected by a simple Bergeron transmission line is presented in [25], where each subsystem is simulated with electromagnetic transients (EMT) based on the Dommel algorithm [26]. The establishment of a European framework for real-time simulation concentrating on power systems is the goal of the VILLAS project [27], [28], [29], [30].

In addition to the described co-simulations in the electrical domain, new developments also consider the coupling of power grids with other domains. An example of a co-simulation framework for smart grids with ICT is given in [31]. Experiments on electrical and thermal co-simulation with geographically distributed real-time simulators are reported in [32], with a focus on accuracy, latency and stability. In [33], a modular co-simulation platform for coupling models of power systems and buildings for studying demand-response strategies is introduced. MESCOS [34] is a multimodal co-simulator for district energy systems. A similar work with a hardware-in-the-loop (HIL) approach is presented in [35].

Furthermore, multi-physics models are developed using modeling languages such as Modelica® [36], implemented in open-source or commercial software with support for the Functional Mock-up Interface (FMI), which has become a standard for data exchange in co-simulation [37], [38]. FMI can be used for model exchange without solvers – models are described by differential, algebraic and discrete equations with time, state and step events – but also for co-simulation, where models are delivered with their solvers and exchange data during their execution at discrete communication points [8]. In the case of co-simulation, a master algorithm controls the data exchange and the synchronization of the slave solvers. The exported simulation file implementing the FMI is called a Functional Mockup Unit (FMU). Efforts to standardize distributed co-simulation are reported in [39]: the Distributed Co-simulation Protocol (DCP) provides a data model, a finite state machine, and a communication protocol.
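For illustration, the following hedged sketch shows what such an FMI master loop can look like using the open-source fmpy package; the file name plant.fmu, the variable names P_set and P_el, and the step sizes are hypothetical, and the sketch is not the master algorithm of the architecture presented below.

```python
# Hedged sketch of an FMI 2.0 co-simulation master loop using the
# open-source fmpy package. 'plant.fmu', 'P_set' and 'P_el' are
# hypothetical names chosen for this example.
import shutil

from fmpy import read_model_description, extract
from fmpy.fmi2 import FMU2Slave

fmu_path = 'plant.fmu'                       # hypothetical FMU-CS file
md = read_model_description(fmu_path)
vrs = {v.name: v.valueReference for v in md.modelVariables}
unzipdir = extract(fmu_path)

slave = FMU2Slave(guid=md.guid,
                  unzipDirectory=unzipdir,
                  modelIdentifier=md.coSimulation.modelIdentifier,
                  instanceName='slave1')
slave.instantiate()
slave.setupExperiment(startTime=0.0)
slave.enterInitializationMode()
slave.exitInitializationMode()

t, h = 0.0, 0.5                              # global communication step [s]
while t < 10.0:
    slave.setReal([vrs['P_set']], [1.0e6])   # master writes the inputs
    slave.doStep(currentCommunicationPoint=t, communicationStepSize=h)
    p_el = slave.getReal([vrs['P_el']])[0]   # master reads the outputs
    t += h

slave.terminate()
slave.freeInstance()
shutil.rmtree(unzipdir, ignore_errors=True)
```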

The presented survey of current developments makes it clear that there is a definite need for standards that provide protocols and APIs for the creation of co-simulations. However, setting up a distributed co-simulation environment requires programming skills as well as knowledge of hardware and specialized protocols. Thus, this time-consuming process distracts from the actual energy systems integration problem to be studied. In the present paper, a new user-centric architecture for the fast setup of co-simulations integrating real-time simulators for the analysis of multimodal energy systems is introduced. In this context, energy systems integration addresses a holistic view of energy systems, comprising also the physical and IT-based interconnections through the flexible integration and combination of individual technologies, energy carriers and infrastructures [40], [41], [42], [43], [44].

3 New distributed co-simulation architecture

The new architecture comprises three parts. The first one is the physical simulation of the considered energy system. The second part is an analysis component, which is fed with data from the physical simulation and performs analyses with regard to sustainability and economy. The third part is a control component, where the necessary dispatching for the physical simulation is carried out. Its decisions can be influenced by the results of the analysis component, if appropriate guidelines are given. Figure 2 gives an overview of the three simulation components and the data streams required for their interaction. All simulation components are described in the following subsections.

Figure 2: The new co-simulation architecture for energy systems integration.

Figure 3: A configuration example of the Physical Simulation Component.

3.1 Physical simulation component

The physical simulation component is responsible for simulating the energy system's behavior. A master control algorithm manages the coordination of the different simulators, which implement a specified API. For the simulation of energy systems integration, where the interaction of different energy carrier domains is an essential part, this concept has to be extended to the exchange of arbitrary physical quantities between simulations, similar to the multi-domain modeling language Modelica® [36]. A configuration example of the physical simulation component is illustrated in figure 3.

Simulators for various domains, the so-called simulation modules, consist of a stand-alone simulator and a specific model with a specified interface for data exchange with other simulation modules. The architecture supports distributed execution of the simulation modules and communication via TCP. The simulation master, as the central instance, orchestrates the ensemble of simulation modules by controlling the global simulation procedure and coordinating the data exchange between the simulation modules. Each simulation module performs a simulation with its own step size, e. g., the EMT simulation on the real-time power system simulator RTDS [24] in the microsecond range and, in contrast, e. g., the simulation of the dynamic behavior of heating in buildings with comparably large simulation time steps [35]. The implementation of the communication and synchronization in the new architecture is presented in section 4.
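As an illustration of how such a specified API might look, the following Python sketch defines a minimal simulation module interface; all method names and signatures are assumptions made for this example, not the actual API of the presented architecture.

```python
# Illustrative sketch of the kind of API a simulation module might
# implement; the method names and signatures are assumptions for this
# example, not the interface of the presented architecture.
from abc import ABC, abstractmethod

class SimulationModule(ABC):
    """A stand-alone simulator plus model, coupled via the master."""

    @abstractmethod
    def interfaces(self) -> dict:
        """Describe inputs/outputs with names and SI-based units."""

    @abstractmethod
    def set_inputs(self, values: dict) -> None:
        """Receive coupling quantities for the next global step."""

    @abstractmethod
    def do_step(self, t: float, global_step: float) -> None:
        """Advance the local model from t by one global step,
        internally sub-cycling with local steps <= global_step."""

    @abstractmethod
    def get_outputs(self) -> dict:
        """Return coupling quantities after the global step."""
```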

3.2 Analysis component

The task of this component is to record and analyze data from the running simulation modules in the physical simulation component. Data on emissions and the consumption of raw materials from the physical simulation can be evaluated in order to check whether defined targets are met. Furthermore, aspects of sustainability, equity and economy can be evaluated. In particular, the availability of restricted resources like biomass can have a huge impact on these aspects. A similar component exists in the TransiEntEE project [45], [46], where data on emissions and costs are collected during the simulation and evaluated under consideration of economic aspects. The results of the evaluation can be made available to the control component, which can take them into account in the decision-making process.

3.3 Control component

The control component is the decision-maker in the new architecture, in which control algorithms implement strategies to ensure a balance between demand and supply of energy at all times. It influences the physical simulation in the form of switching or control operations. For decision-making, the current simulation states and the evaluation results from the analysis component can be incorporated into the control algorithm. Various communication strategies for the transfer of control directives are conceivable, e. g., a direct connection to the simulation modules with a specified protocol or communication via the simulation master.

4 Implementation

The implementation of a distributed co-simulation environment requires the consideration of some special challenges, such as the occurrence of temporal delays caused by the discrete synchronization mechanism between different simulation modules. Other challenges are the communication between the simulation modules and the setup procedure for arbitrary physical dimensions. This section focuses on the implementation of the physical simulation environment as an essential part of the new architecture.

4.1 Efficient co-simulation setup

One advantage of the new architecture is the ease of interactive co-simulation setup without any need for code-based programming as in other co-simulation environments. The simulation master provides a graphical user interface (GUI) in which the global simulation time step is set and the simulation modules – simulators with their respective models – can be coupled to a co-simulation. Figure 4 shows the GUI with an example of a distributed co-simulation setup for the simulation of a multimodal energy system for the campus scenario introduced in section 5.

Figure 4: Interactive simulator configuration for co-simulation setup with the implemented GUI.

From the technical point of view, the simulation master opens a TCP server and waits for simulation modules connecting as TCP clients. After communication negotiations and information exchange concerning the data interfaces, the simulation modules become visible as icons in the graphical user interface of the simulation master. The coupling of the simulators is a two-stage procedure: first, the dependencies between the simulators are modeled as a network. In the second step, the input interfaces for individual physical quantities are connected to the corresponding output interfaces. The number of selectable interfaces is reduced by an automated pre-selection mechanism that compares the units of the eligible output interfaces with the unit of the input interface. The derived units of both interfaces are traced back to their SI base unit expressions and then checked for equality. The procedure of connecting interfaces is accelerated by using similar name patterns for the inputs and their corresponding outputs together with an automatic connection function. To ensure synchronized further processing of simulated physical quantities in complex scenarios, the execution order needs to be determined (see section 4.3). After the configuration of all interfaces and delays, the simulation master initiates the simulation execution. Information regarding the real-time delay during the execution of the co-simulation is provided in the GUI.
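The unit-based pre-selection can be illustrated by a small Python sketch that reduces derived units to SI base-unit exponents and checks them for equality; the tiny unit table is an illustrative assumption, and a real implementation would cover all derived units and prefixes.

```python
# Minimal sketch of the unit pre-selection idea: derived units are
# reduced to SI base-unit exponents and compared for equality.
# The unit table below is a small illustrative assumption.

# exponents over SI base units (m, kg, s, A, K, mol, cd)
BASE = {
    'W':   {'m': 2, 'kg': 1, 's': -3},           # watt = m^2 kg s^-3
    'V':   {'m': 2, 'kg': 1, 's': -3, 'A': -1},  # volt
    'A':   {'A': 1},
    'bar': {'m': -1, 'kg': 1, 's': -2},          # 10^5 Pa; prefactor ignored
}

def si_exponents(unit: str) -> dict:
    """Trace a derived unit back to its SI base-unit exponents."""
    return dict(BASE.get(unit, {}))

def compatible(out_unit: str, in_unit: str) -> bool:
    """An output interface is eligible if its SI base expression
    equals that of the input interface."""
    return si_exponents(out_unit) == si_exponents(in_unit)

assert compatible('W', 'W')
assert not compatible('V', 'A')
```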

4.2 Communication of simulators

Communication within a co-simulation environment can be handled easily by a shared-memory solution if all simulations run on a single hardware platform or have access to the same memory area. For a distributed co-simulation, the use of network communication protocols is the more realistic option. In the proposed new architecture, simulation modules exchange data with a protocol embedded in TCP, which is preferred over UDP in order to avoid the handling of data re-transmission. The implemented protocol consists of command messages sent by the simulation master and replies from the simulation modules. Especially for co-simulations with a high data throughput, it is advisable to keep the simulation modules spatially close to each other in order to keep the round-trip time (RTT) of the data packets small and to avoid packet losses caused by high network traffic in the public network.

The bidirectional and asynchronous communication between the real-time power system simulator RTDS and the simulation master is performed via TCP utilizing the GTNETx2-SKT [47] protocol, which can hold up to 300 data points per packet with a total size of 1200 bytes containing either integer or floating-point (IEEE 754, 32 bit) values. The communication is thus not a bottleneck, as long as the EMT-based power grid simulation, which runs in its own fast loop on RTDS, receives external input data – such as solar radiation data for the simulation of power generation, or power consumption data – via the implemented REST interface at a substantially slower rate. An alternative to the use of EMT for the power grid simulation is transient dynamics simulation, provided either by commercial tools such as DIgSILENT PowerFactory via OPC and a Python interface [48] or by academic software [49]. For slow grid dynamics, a considerable alternative is the use of steady-state power flow within a co-simulation [50].
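The following hedged sketch illustrates packing such a payload of up to 300 values as 32-bit words; the use of network byte order is an assumption here, and the framing and handshake of the actual GTNETx2-SKT protocol are not reproduced.

```python
# Hedged sketch of packing a GTNETx2-SKT style payload: up to 300
# 32-bit values (max 1200 bytes). Big-endian (network) byte order is
# assumed; the protocol's header and handshake are omitted.
import struct

def pack_skt_payload(values):
    """Pack a list of Python ints/floats as 32-bit big-endian words."""
    if len(values) > 300:
        raise ValueError('payload limited to 300 data points')
    words = b''.join(
        struct.pack('!i', v) if isinstance(v, int)
        else struct.pack('!f', v)            # IEEE 754 single precision
        for v in values)
    assert len(words) <= 1200
    return words

payload = pack_skt_payload([1, 0, 230.1, 49.98])
```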

In addition to the aforementioned TCP protocol for the communication of individual simulation modules in a co-simulation, the present implementation provides an API for the Java programming language and a FMI co-simulation adaptation. Access to data services is enabled via REST interfaces.

4.3 Synchronization of simulators

The execution of a co-simulation requires synchronization mechanisms for the data exchange between the simulations, which takes place between global time steps. In the proposed architecture, we utilize a master-slave synchronization method with a dedicated simulation master that dictates a fixed global time step, after which an exchange of the results of the subsystems occurs. Within a global time step, each simulator runs its own model with a local time step that is equal to or smaller than the global time step, according to its model dynamics. As an example, electrolysis and gas turbine simulations have rather slow dynamics; in contrast, a power grid simulation executed on a real-time digital simulator has local time steps in the microsecond range. For each simulator, the control of its model dynamics is performed within its individual local time steps.
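A minimal sketch of this master-slave stepping scheme is given below, reusing the illustrative SimulationModule interface sketched in section 3.1; the routing helper is a placeholder assumption, not the implemented master algorithm.

```python
# Minimal sketch of the master-slave stepping scheme: the master
# dictates a fixed global step; each module sub-cycles with its own
# smaller local step inside do_step().

def run_master(modules, t_end, global_step):
    t = 0.0
    while t < t_end:
        for m in modules:                 # Jacobi-style parallel stepping
            m.do_step(t, global_step)     # local sub-steps happen inside
        outputs = {m: m.get_outputs() for m in modules}
        for m in modules:                 # exchange at the global step
            m.set_inputs(route_to(m, outputs))
        t += global_step

def route_to(module, outputs):
    """Map connected output interfaces to this module's inputs;
    the connection table is omitted in this sketch."""
    return {}
```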

For managing the co-simulation pathways in order to ensure the correct assignment of simulator results for analysis or visualization purposes, the occurrence of temporal misalignments – caused by the discrete synchronization mechanism between different simulation modules – has to be considered. In general, two different cases need to be distinguished: simulations with high sampling rates in which the physical quantities change slowly in relation to the sample duration, and simulations in which they change rapidly. In the first case, an inaccuracy is accepted for an evolving but not yet transferred quantity. In the second case, the simulation behavior can lead to a temporal misalignment of the further processed simulation results. Handling both cases in the same architecture requires embedding them in a logical structure. The physical quantities exchanged after a global time step are assumed to relate to the simulated duration and not to the instantaneous value at the end of the time step period. Knowing the corresponding simulated time allows a pipeline-like delayed calculation. This implies that the results retain their time reference, but become available in a delayed manner. Buffering the results ensures the same simulated time of all input variables even in complex dependency constructs without loops. For cases with loops involved, as in the first case mentioned above, the only interest is the fast processing of the present values without any further delays. The present implementation therefore provides a logical delay block that allows connecting a simulator output to the input of another simulator that has a lower or the same calculation delay. It waives the delayed calculation and accepts the inaccuracies caused by the data exchange intervals, which should be small in the first case.
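The behavior of such a logical delay block can be illustrated with a small FIFO sketch; the class name and the two-step depth (as used in the case study of section 5) are chosen for illustration.

```python
# Illustrative sketch of a logical delay block that breaks the temporal
# misalignment in feedback loops: a FIFO returning values n global
# steps late (n = 2 is used in the case study of section 5).
from collections import deque

class DelayBlock:
    def __init__(self, steps: int, initial=0.0):
        self._buf = deque([initial] * steps, maxlen=steps)

    def shift(self, value):
        """Push the newest output and pop the value from n steps ago."""
        delayed = self._buf[0]
        self._buf.append(value)
        return delayed

delay = DelayBlock(steps=2)
for k in range(5):
    print(k, delay.shift(float(k)))   # prints 0.0, 0.0, 0.0, 1.0, 2.0
```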

5 Co-simulation case study: A campus scenario

The applicability of the proposed and implemented new architecture for the co-simulation of energy systems integration, with a focus on power-to-gas (PtG) and the reversal process [51], is shown by a campus scenario. The electrical network is represented by a complete model of the 20 kV power grid at KIT Campus North (KIT CN) as shown in figure 5, which is an aggregated version of a verified and complete DIgSILENT PowerFactory simulation model at the 0.4/20 kV level. The power grid model – after a model conversion via PSSE and revision in RSCAD – is simulated with the RTDS system. Within the power grid model, changes in load or generation lead to a change of the grid frequency, which in turn is controlled by the respective primary controllers of the generators. In the case of large grid models, multiple RTDS simulators can be directly connected with a high-performance fibre-optic connection for distributed power grid simulation (e. g., at the KIT EnergyLab 2.0 [52]).

Figure 5: KIT Campus North power grid with 20 kV lines and stations.

For a new planning scenario, the model of the actual KIT CN power grid is extended with an electrolysis unit (see e. g., [53]), a hydrogen storage and a hydrogen-fired power generation plant (see e. g., [54], [55]). The aim of this scenario is to run the campus grid near autarchy: electrical power is delivered by a 1.5 MW peak photovoltaic facility and three 2 MW combined heat and power (CHP) plants, whereby only the electrical power generation is considered in a first step. The connection to the external grid enables the compensation of missing or surplus power. Figure 6 depicts the schematic representation of the test scenario for the analysis of the reaction of the electrolyzer and the gas turbine of the CHP in response to the change of a variable load, with the goal of keeping power production and consumption in balance. The variable load, given as artificial data for the case study, represents, e. g., the start-up of a large-scale experiment with high power consumption in a test facility.

Figure 6: Schematic representation of the simulated test scenario.

5.1 Simulation setup

For the co-simulation, the solar radiation data for the simulation of the photovoltaic facility and the load data at the 20 kV stations are made available by the eASiMOV web services [49] in CSV format. The power consumption needs to be scaled appropriately, since the power produced within the microgrid-like electricity grid is not sufficient for autarchy. Two of the CHP plants operate in full-load mode, and the third CHP acts as a hydrogen-fired gas turbine fed from the stored hydrogen. All gas transportation processes are neglected. For the simulation of the electrolyzer, the storage and the gas turbine, the TransiEnt library for Modelica® [46] is used. The simulation is exported as an FMU for co-simulation (FMU-CS), with the input parameters being the intended operating power of the electrolyzer and the gas turbine. The output parameters are the consumed power of the electrolyzer and the torque of the turbine rotating with an angular velocity of 100π rad s−1.
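The angular velocity of 100π rad s−1 corresponds to 3000 rpm, the synchronous speed of a two-pole generator at 50 Hz. As a brief worked example (an illustration, not a value reported in the paper), the torque corresponding to, e. g., 2 MW of mechanical power follows from P = τω:

```latex
\tau = \frac{P}{\omega}
     = \frac{2\times 10^{6}\,\mathrm{W}}{100\pi\,\mathrm{rad\,s^{-1}}}
     \approx 6.37\,\mathrm{kN\,m}
```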

The control strategy for the electrolyzer and the gas turbine is as follows: if the consumed power in the grid is lower than the generated power, the surplus electrical power is consumed by the electrolyzer. If the former is significantly higher than the latter, the gas turbine is activated and its generator produces additional electrical power. The control of the electrolyzer and the turbine of the CHP is implemented provisionally on RTDS, which transmits the desired power values. The co-simulation setup is depicted schematically in figure 7. The principle of the artificial delay is implemented in the dependency chain between the FMU-CS and the corresponding graph plotter. To enable the feedback loop between RTDS and FMU-CS, a logical delay of two steps is inserted, which handles the temporal misalignment between the simulation modules. The related configuration of the co-simulation in the implementation of the new co-simulation environment is shown in figure 4.
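A minimal Python sketch of this dispatch logic is given below; the function signature is an illustrative assumption, the 250 kW turbine threshold is taken from the results in section 5.2, and the 1 bar limit from the storage description.

```python
# Hedged sketch of the described control strategy; the signature is an
# illustrative assumption, the 250 kW threshold comes from section 5.2
# and the 1 bar storage limit from the text.

def dispatch(p_load_w, p_gen_w, p_storage_bar,
             turbine_threshold_w=250e3, min_pressure_bar=1.0):
    """Return set-points (electrolyzer power, turbine power) in W."""
    surplus = p_gen_w - p_load_w
    if surplus > 0:
        return surplus, 0.0               # store the surplus as hydrogen
    deficit = -surplus
    if deficit >= turbine_threshold_w and p_storage_bar > min_pressure_bar:
        return 0.0, deficit               # turbine balances the deficit
    return 0.0, 0.0                       # deficit covered by external grid
```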

Figure 7: Schematic representation of the simulator configuration.

Figure 8: Simulated power (left) and pressure in the hydrogen storage tank (right).

5.2 Simulation results

Figure 8 (left) shows the surplus power in the beginning, which triggers the start of the electrolyzer. As soon as the variable load changes, the power of the electrolyzer has to be adapted to maintain the balance within the campus grid. As soon as the demand is higher than the production, power is taken from the external grid until the additional demand reaches 250 kW. At this point, the turbine starts to balance load and production at the same level. The turbine runs as long as there is enough hydrogen in the storage tank, which is assumed for a pressure greater than 1 bar. Figure 8 (right) shows the evolution of the pressure in the hydrogen storage tank. The pressure increases as long as the electrolyzer is active and decreases while the gas turbine is running. As soon as the pressure in the hydrogen storage tank falls below the threshold of 1 bar, the gas turbine is stopped.

6 Conclusion

The present paper introduces a new architecture, which is suitable for the requirements of co-simulations for energy systems integration. The proposed architecture enables the integration of already existing single-domain simulations developed in different tools. The interactive co-simulation setup procedure allows easy assembly of large co-simulations with low effort using a graphical user interface. A campus scenario with a detailed electrical grid is used to illustrate the applicability of the new architecture to multimodal energy system simulation.

First experiments with the coupled simulation of multi-physics models for energy systems integration are successful. The interactive setup of the simulation scenario is straightforward and enables a fast coupling of the different simulators for the presented case study, with the power grid simulator RTDS and the Modelica®-based FMU co-simulation for electrolysis, gas storage and the hydrogen-fired gas turbine together with external data sources such as weather and power consumption data via a REST interface. Although modelling simplifications such as the neglect of a gas network and an adaptation of the power loads are assumed, the simulation results show the coherence of the scenario. The control strategy for the energy conversion and storage processes delivers plausible and comprehensible results.

Next steps are the integration of further detailed simulation models for energy conversion technologies in order to assess their impact on the multimodal energy grids, the extension of the co-simulation with a gas network model, a heat network model and the implementation of control and analysis components. Furthermore, in light of the ongoing substation automation at KIT CN with SCADA, data acquisition utilizing the IEC 61850 protocol will enable the direct and fast feed-in of real grid data into the real-time multimodal energy system simulation.

About the authors

Hüseyin Çakmak

Dr.-Ing. Hüseyin K. Çakmak is research associate and project manager at the Institute for Automation and Applied Informatics, Karlsruhe Institute of Technology. Main fields of work: Energy system analysis (modelling, simulation, visualization), 3D, virtual and augmented reality, data analysis, parallel and high-performance computing.

Anselm Erdmann

Anselm Erdmann is a research associate at the Institute for Automation and Applied Informatics, Karlsruhe Institute of Technology. Main area of research is the simulation of energy systems integration.

Michael Kyesswa

Michael Kyesswa is a Ph.D. student at the Institute for Automation and Applied Informatics, Karlsruhe Institute of Technology. Main areas of research: Simulation and analysis of large power systems, parallel and real-time simulations, and computational methods for power system analysis.

Uwe Kühnapfel

Dr.-Ing. Uwe G. Kühnapfel is head of the working group “Simulation and Visualization” at the Institute for Automation and Applied Informatics, Karlsruhe Institute of Technology. Main research areas: Energy systems (modelling, simulation, monitoring, analysis), mechatronics, virtual and augmented reality.

Veit Hagenmeyer

Prof. Dr. Veit Hagenmeyer is director of the Institute for Automation and Applied Informatics, Karlsruhe Institute of Technology. Main fields of work: Automation technology, control engineering, energy informatics.

References

1. J. Rogelj, D. Shindell, K. Jiang, S. Fifita, P. Forster, V. Ginzburg, C. Handa, H. Kheshgi, S. Kobayashi, E. Kriegler, L. Mundaca, R. Séférian, M.V. Vilariño. Mitigation Pathways Compatible with 1.5°C in the Context of Sustainable Development. In: Global Warming of 1.5°C: An IPCC Special Report on the impacts of global warming of 1.5°C above pre-industrial levels and related global greenhouse gas emission pathways, in the context of strengthening the global response to the threat of climate change, sustainable development, and efforts to eradicate poverty; 2018.

2. D. Larcher, J.-M. Tarascon. Towards Greener and More Sustainable Batteries for Electrical Energy Storage. Nature Chemistry 2015; 7:19–29. doi:10.1038/nchem.2085.

3. M. Jentsch, T. Trost, M. Sterner. Optimal Use of Power-to-Gas Energy Storage Systems in an 85 % Renewable Energy Scenario. Energy Procedia 2014; 46:254–61. doi:10.1016/j.egypro.2014.01.180.

4. A. Maroufmashat, M. Fowler. Transition of Future Energy System Infrastructure; through Power-to-Gas Pathways. Energies 2017; 10(8):1089. doi:10.3390/en10081089.

5. P. Kohlhepp, V. Hagenmeyer. Technical Potential of Buildings in Germany as Flexible Power-to-Heat Storage for Smart-Grid Operation. Energy Technology 2017; 5(7):1084–104. doi:10.1002/ente.201600655.

6. L. Andresen, G. Schmitz. Bewertung von Power-to-Gas-Anlagen mittels dynamischer Systemsimulation. gwf-Gas+Energie 2016; 682–9.

7. M. Geimer, T. Krüger, P. Linsel. Co-Simulation, gekoppelte Simulation oder Simulatorkopplung?: Ein Versuch der Begriffsvereinheitlichung. O+P Ölhydraulik und Pneumatik 2006; 11:572–6.

8. P. Palensky, A.A. van der Meer, C.D. Lopez, A. Joseph, K. Pan. Cosimulation of Intelligent Power Systems: Fundamentals, Software Architecture, Numerics, and Coupling. IEEE Ind. Electron. Mag. 2017; 11(1):34–50. doi:10.1109/MIE.2016.2639825.

9. T. Eldabi, M. Balaban, S. Brailsford, N. Mustafee, R.E. Nance, B.S. Onggo et al. Hybrid Simulation: Historical Lessons, Present Challenges and Futures. In: Proceedings of the 2016 Winter Simulation Conference. Piscataway, NJ, USA: IEEE Press; 2016, p. 1388–403. doi:10.1109/WSC.2016.7822192.

10. C. Gomes, C. Thule, D. Broman, P.G. Larsen, H. Vangheluwe. Co-simulation: State of the art. Available from: http://arxiv.org/pdf/1702.00686v1.

11. C. Gomes, C. Thule, D. Broman, P.G. Larsen, H. Vangheluwe. Co-Simulation: A Survey. ACM Computing Surveys (CSUR) 2018; 51(3):49. doi:10.1145/3179993.

12. M. Busch. Zur effizienten Kopplung von Simulationsprogrammen. Zugl.: Kassel, Univ., Diss., 2012. Kassel: Kassel University Press; 2012.

13. M. Busch. Performance Improvement of Explicit Co-simulation Methods Through Continuous Extrapolation. In: Schweizer B., editor. IUTAM Symposium on Solver-Coupling and Co-Simulation. Cham: Springer International Publishing; 2019, p. 57–80. doi:10.1007/978-3-030-14883-6_4.

14. K. Johnstone, S.M. Blair, M.H. Syed, A. Emhemed, G.M. Burt, T.I. Strasser. Co-simulation approach using PowerFactory and MATLAB/Simulink to enable validation of distributed control concepts within future power systems. CIRED – Open Access Proceedings Journal 2017; 2017(1):2192–6. doi:10.1049/oap-cired.2017.1175.

15. M. Arnold, M. Günther. Preconditioned Dynamic Iteration for Coupled Differential-Algebraic Systems. BIT Numerical Mathematics 2001; 41(1):1–25. doi:10.1023/A:1021909032551.

16. R. Kübler, W. Schiehlen. Two Methods of Simulator Coupling. Mathematical and Computer Modelling of Dynamical Systems 2000; 6(2):93–113. doi:10.1076/1387-3954(200006)6:2;1-M;FT093.

17. R. Venkatraman, S.K. Khaitan, V. Ajjarapu. Dynamic Co-Simulation Methods for Combined Transmission-Distribution System and Integration Time Step Impact on Convergence. CoRR 2018; abs/1801.01185. doi:10.1109/PESGM40551.2019.8973942.

18. P. Palensky, A.A. van der Meer, C.D. Lopez, A. Joseph, K. Pan. Applied Cosimulation of Intelligent Power Systems: Implementing Hybrid Simulators for Complex Power Systems. IEEE Ind. Electron. Mag. 2017; 11(2):6–21. doi:10.1109/MIE.2017.2671198.

19. K.G. Müller. SimPy, a discrete event simulation package in Python. In: EuroSciPy 2008, Leipzig, Germany; 2008.

20. S. Rohjans, E. Widl, W. Müller, S. Schütte, S. Lehnhoff. Gekoppelte Simulation komplexer Energiesysteme mittels MOSAIK und FMI. at – Automatisierungstechnik 2014; 62(5):325–36. doi:10.1515/auto-2014-1087.

21. PYPOWER: Solves power flow and optimal power flow problems. [July 11, 2019]; Available from: https://pypi.org/project/PYPOWER/.

22. OMNeT++ Discrete Event Simulator. [July 11, 2019]; Available from: https://omnetpp.org/.

23. B. Palmintier, D. Krishnamurthy, P. Top, S. Smith, J. Daily, J. Fuller. Design of the HELICS high-performance transmission-distribution-communication-market co-simulation framework. In: 2017 Workshop on Modeling and Simulation of Cyber-Physical Energy Systems (MSCPES), April 21, 2017, Pittsburgh, PA, USA, held as part of CPS Week. Piscataway, NJ: IEEE; 2017, p. 1–6. doi:10.1109/MSCPES.2017.8064542.

24. REAL TIME POWER SYSTEM SIMULATION • RTDS Technologies Inc. [July 09, 2019]; Available from: https://www.rtds.com/real-time-power-system-simulation/.

25. I.K. Park, P. Forsyth, H. Kim, K. Hur. A Study on synchronizing two separate RTDS simulation instances. In: 2016 IEEE Power and Energy Society General Meeting (PESGM). IEEE; 2016, p. 1–5. doi:10.1109/PESGM.2016.7741257.

26. H. Dommel. Digital Computer Solution of Electromagnetic Transients in Single- and Multiphase Networks. IEEE Trans. on Power Apparatus and Syst. 1969; PAS-88(4):388–99. doi:10.1109/TPAS.1969.292459.

27. VILLAS Framework. [July 08, 2019]; Available from: https://www.fein-aachen.org/projects/villas-framework/.

28. A. Monti, M. Stevic, S. Vogel, R.W. de Doncker, E. Bompard, A. Estebsari et al. A Global Real-Time Superlab: Enabling High Penetration of Power Electronics in the Electric Grid. IEEE Power Electron. Mag. 2018; 5(3):35–44. doi:10.1109/MPEL.2018.2850698.

29. M. Stevic, A. Estebsari, S. Vogel, E. Pons, E. Bompard, M. Masera et al. Multi-site European framework for real-time co-simulation of power systems. IET Generation, Transmission & Distribution 2017; 11(17):4126–35. doi:10.1049/iet-gtd.2016.1576.

30. M. Stevic, A. Monti, A. Benigni. Development of a simulator-to-simulator interface for geographically distributed simulation of power systems in real time. In: IECON 2015 – 41st Annual Conference of the IEEE Industrial Electronics Society. IEEE; 2015, p. 5020–5. doi:10.1109/IECON.2015.7392888.

31. H. Lin, S. Sambamoorthy, S. Shukla, J. Thorp, L. Mili. Power system and communication network co-simulation for smart grid applications. In: ISGT 2011; 2011, p. 1–6. doi:10.1109/ISGT.2011.5759166.

32. M.O. Faruque, M. Sloderbeck, M. Steurer, V. Dinavahi. Thermo-electric co-simulation on geographically distributed real-time simulators. In: 2009 IEEE Power & Energy Society General Meeting. IEEE; 2009, p. 1–7. doi:10.1109/PES.2009.5275631.

33. S. Chatzivasileiadis, M. Bonvini, J. Matanza, R. Yin, T.S. Nouidui, E.C. Kara et al. Cyber-Physical Modeling of Distributed Resources for Distribution System Operations. Proc. IEEE 2016; 104(4):789–806. doi:10.1109/JPROC.2016.2520738.

34. C. Molitor, S. Gross, J. Zeitz, A. Monti. MESCOS – A Multienergy System Cosimulator for City District Energy Systems. IEEE Trans. Ind. Inf. 2014; 10(4):2247–56. doi:10.1109/TII.2014.2334058.

35. S. Kochanneck, I. Mauser, K. Phipps, H. Schmeck. Hardware-in-the-Loop Co-simulation of a Smart Building in a Low-voltage Distribution Grid. In: 2018 IEEE PES Innovative Smart Grid Technologies Conference Europe (ISGT-Europe), Sarajevo, Bosnia and Herzegovina, October 21–25, 2018. Piscataway, NJ: IEEE; 2018, p. 1–6. doi:10.1109/ISGTEurope.2018.8571746.

36. Modelica Association. Modelica – A Unified Object-Oriented Language for Systems Modeling, Version 3.3 Revision 1. [July 11, 2019].

37. T. Blochwitz, M. Otter, J. Akesson, M. Arnold, C. Clauss, H. Elmqvist et al. Functional Mockup Interface 2.0: The Standard for Tool independent Exchange of Simulation Models. In: Proceedings of the 9th International MODELICA Conference, September 3–5, 2012, Munich, Germany. Linköping University Electronic Press; 2012, p. 173–84. doi:10.3384/ecp12076173.

38. Modelica Association Project “FMI”. Functional Mockup Interface for Model Exchange and Co-Simulation, Version 2.0. Available from: https://fmi-standard.org/.

39. M. Krammer, M. Benedikt, T. Blochwitz, K. Alekeish, N. Amringer, C. Kater et al. The Distributed Co-simulation Protocol for the integration of real-time systems and simulation environments. In: Proceedings of the 50th Computer Simulation Conference. Bordeaux, France: Society for Computer Simulation International; 2018, p. 1–14.

40. K. Karlsson, K. Skytte, P.E. Morthorst, P. Bacher, H. Madsen. Integrated energy systems modelling. In: DTU International Energy Report. Energy systems integration for the transition to non-fossil energy systems; 2015, p. 23–33.

41. S. Mittal, M. Ruth, A. Pratt, M. Lunacek, D. Krishnamurthy, W. Jones. A System-of-systems Approach for Integrated Energy Systems Modeling and Simulation. In: Proceedings of the Conference on Summer Computer Simulation. San Diego, CA, USA: Society for Computer Simulation International; 2015, p. 1–10.

42. M. Robinius, A. Otto, P. Heuser, L. Welder, K. Syranidis, D. Ryberg et al. Linking the Power and Transport Sectors – Part 1: The Principle of Sector Coupling. Energies 2017; 10(7):956. doi:10.3390/en10070956.

43. M. Robinius, A. Otto, K. Syranidis, D.S. Ryberg, P. Heuser, L. Welder et al. Linking the Power and Transport Sectors – Part 2: Modelling a Sector Coupling Scenario for Germany. Energies 2017; 10(7):957. doi:10.3390/en10070957.

44. M.F. Ruth, B. Kroposki. Energy Systems Integration: An Evolving Energy Paradigm. The Electricity Journal 2014; 27(6):36–47. doi:10.1016/j.tej.2014.06.001.

45. L. Andresen, P. Dubucq, R.P. Garcia, G. Ackermann, A. Kather, G. Schmitz. Transientes Verhalten gekoppelter Energienetze mit hohem Anteil Erneuerbarer Energien: Abschlussbericht des Verbundvorhabens; 2017.

46. TransiEnt Library: Simulation von gekoppelten Energienetzen mit hohem Anteil Erneuerbarer Energien in Modelica. [July 11, 2019]; Available from: https://www.tuhh.de/transient-ee/.

47. GTNETx2 CARD • RTDS Technologies Inc. [July 09, 2019]; Available from: https://www.rtds.com/the-simulator/our-hardware/gtnet-card/.

48. A. Latif, M. Shahzad, P. Palensky, W. Gawlik. An alternate PowerFactory Matlab coupling approach. In: 2015 International Symposium on Smart Electric Distribution Systems and Technologies (EDST). IEEE; 2015, p. 486–91. doi:10.1109/SEDST.2015.7315257.

49. M. Kyesswa, H.K. Çakmak, U. Kühnapfel, V. Hagenmeyer. A Matlab-Based Dynamic Simulation Module for Power System Transients Analysis in the eASiMOV Framework. In: 2017 European Modelling Symposium (EMS). IEEE; 2017, p. 157–62. doi:10.1109/EMS.2017.36.

50. M. Stifter, R. Schwalbe, F. Andren, T. Strasser. Steady-state co-simulation with PowerFactory. In: 2013 Workshop on Modeling and Simulation of Cyber-Physical Energy Systems (MSCPES). IEEE; 2013, p. 1–6. doi:10.1109/MSCPES.2013.6623317.

51. R.R. Dickinson, N. Lymperopoulos, A. Le Duigou, P. Lucchese, C. Mansilla, O. Tlili et al. Power-to-hydrogen and hydrogen-to-X pathways: Opportunities for next generation energy systems. In: 2017 14th International Conference on the European Energy Market (EEM), p. 1–6. doi:10.1109/EEM.2017.7981882.

52. V. Hagenmeyer, H.K. Çakmak, C. Düpmeier, T. Faulwasser, J. Isele, H.B. Keller et al. Information and Communication Technology in Energy Lab 2.0: Smart Energies System Simulation and Control Center with an Open-Street-Map-Based Power Flow Simulation Example. Energy Technology 2016; 4(1):145–162. doi:10.1002/ente.201500304.

53. A. Ursua, L.M. Gandia, P. Sanchis. Hydrogen Production From Water Electrolysis: Current Status and Future Trends. Proc. IEEE 2012; 100(2):410–26. doi:10.1109/JPROC.2011.2156750.

54. M. Balestri, G. Benelli, F. Donatini, F. Arlati, G. Conti. Enel’s Fusina hydrogen-fed power generation plant. In: 2007 International Conference on Clean Electrical Power. Piscataway: IEEE; 2007, p. 456–63. doi:10.1109/ICCEP.2007.384254.

55. M. Nose, T. Kawakami, H. Araki, N. Senba, S. Tanimura. Hydrogen-fired Gas Turbine Targeting Realization of CO2-free Society; 2018.

Received: 2019-07-18
Accepted: 2019-09-13
Published Online: 2019-11-05
Published in Print: 2019-11-26

© 2019 Çakmak et al., published by De Gruyter

This work is licensed under the Creative Commons Attribution 4.0 International License.
