
Issue 518 - Myopic Implementation #554


Merged
merged 23 commits on Apr 14, 2025
Changes from 10 commits
568d664
update dag (#517)
trevorb1 Jan 28, 2025
a5d0a72
added for loop around the solve_network. New myopic and perfect fores…
lfreese Jan 31, 2025
514d497
fixed errors with pulling the myopic foresight from config file, mode…
lfreese Feb 3, 2025
5551b59
moved for loop inside of the solve_network, removed the add_myopic co…
lfreese Feb 4, 2025
f251f81
first working version, no bugs, runs fully. needs checks to see if ou…
lfreese Feb 6, 2025
2f96245
fixed the limited c.df to be for build years below the investment hor…
lfreese Feb 6, 2025
1ec5f04
made it so that the links are properly removed, since their build yea…
lfreese Feb 20, 2025
99d1924
first successful version to have results for all decades, needs first…
lfreese Feb 20, 2025
054de3f
fixed the loop to be inclusive for c_lim rather than less than the pl…
lfreese Feb 21, 2025
782419c
updated the configtables documentation
lfreese Feb 21, 2025
6b0b5dd
Fix clim, other fix for perf foresight.
ktehranchi Feb 27, 2025
105b07a
RPS updates
mchjones Feb 27, 2025
5d3fd7a
Merge pull request #1 from lfreese/pr-issue-518
lfreese Feb 27, 2025
c4ee546
fixed the retirement issue for one time horizon
lfreese Feb 28, 2025
e1da7cf
Merge branch 'issue-518' into issue-518
lfreese Feb 28, 2025
e523584
Merge pull request #2 from mchjones/issue-518
lfreese Feb 28, 2025
34e6f88
updated to m.add to carry over the custom fields
lfreese Feb 28, 2025
43b6853
added the storage state_of_charge rollover, but this is leading to in…
lfreese Mar 25, 2025
1c28c33
changed state of charge to just be the final snapshot, now feasible
lfreese Mar 25, 2025
2160a92
updated the config files in repo_data, and the docs to show that myop…
lfreese Mar 25, 2025
5ea8414
n.madd for generators so that the attributes carry over. this is a fi…
lfreese Mar 25, 2025
d3a0b43
Merge branch 'develop' into issue-518
ktehranchi Apr 14, 2025
e35f4d7
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Apr 14, 2025
Binary file added docs/source/_static/dag_sector.jpg
2 changes: 1 addition & 1 deletion docs/source/configtables/scenario.csv
@@ -1,3 +1,3 @@
,Unit,Values,Description
planning_horizons,int,"(2018-2023, 2030, 2040, 2050)","Specifies the year of demand data to use. Historical values use EIA930 data; future years use NREL EFS data. Specify multiple planning horizons to build a multi-horizon model."
foresight,str,perfect,Specifies foresight option for multi-horizon optimization. Currently only perfect foresight is supported. Myopic foresight will be added in the future.
foresight,str,"One of {'perfect','myopic’}",Specifies foresight option for multi-horizon optimization.
19 changes: 16 additions & 3 deletions docs/source/data-sectors.md
@@ -5,7 +5,7 @@ Full energy system models can be run through PyPSA-USA. This page gives an overv

## Overview

Sector coupled models are built on top of the power sector representation. This means all assumptions[^1], including spatial and temporal options, used to build the electricity sector are shared in sector coupled studies.
Sector coupled models are built on top of the power sector representation. This means all assumptions, including spatial and temporal options, used to build the electricity sector are shared in sector coupled studies[^1].

When running sector coupled studies, four end use sectors are added to the system. These include residential, commercial, industrial, and transportation. Each of these sectors will always have electrical loads. Then, based on user configuration options, heating, cooling, or fossil fuel loads may be added to the system. Moreover, the natural gas network is added to track imports/exports, storage levels, methane leaks, and endogenously solve for natural gas cost.

@@ -65,10 +65,12 @@ Capacity constraints limit how much energy can be delievered through demand resp
&\ \hspace{1cm} s_{n,t} = \text{Allowable shiftable load per unit of } d_{n,t} \\
&\ \hspace{1cm} dr_{n,t} = \text{Discharge of demand response at time } t \text{ and bus } n \\
&\ s.t. \\
&\ \hspace{1cm} d_{n,t} \times s_{n,t} \leq dr_{n,t} \hspace{0.5cm} \forall_{\text{n,t}}\\
&\ \hspace{1cm} d_{n,t} \times s_{n,t} \geq dr_{n,t} \hspace{0.5cm} \forall_{\text{n,t}}\\
\end{align*}
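
As a point of reference, a capacity cap of this form can be written as an `extra_functionality`-style hook against the linopy model that PyPSA builds. The sketch below is illustrative only: the `demand_response` carrier, the assumption that each load is named after its bus, the single investment period, and the `shift_share` value are not taken from the PyPSA-USA implementation.

```python
def add_dr_capacity_constraint(n, shift_share=0.05):
    """Sketch: demand-response discharge may not exceed a share of bus load."""
    dr_units = n.storage_units[n.storage_units.carrier == "demand_response"]
    if dr_units.empty:
        return

    m = n.model  # linopy model created by n.optimize.create_model()
    loads = n.get_switchable_as_dense("Load", "p_set")  # d_{n,t}, one column per load

    for unit in dr_units.index:
        bus = dr_units.at[unit, "bus"]
        # allowable shiftable load s_{n,t} * d_{n,t}; assumes the load is named after its bus
        limit = shift_share * loads[bus]
        discharge = m.variables["StorageUnit-p_dispatch"].sel(StorageUnit=unit)
        m.add_constraints(discharge <= limit, name=f"dr_capacity-{unit}")
```
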

This is not applied directly to the `Link` object via the `p_nom` parameter, to be consistent with the transport sector. Within the transport sector, demand response is applied to the aggregation bus due to endogenous investment options. Therefore, knowing how much electrical load can be shifted at each timestep is not possible before solving. For the transport sector, the following constraint is added. See the [transportation section](./data-transportation.md) schematics for details on where demand response is applied.
#### Transport Demand Response Capacity Constraint

Within the transport sector, demand response is applied to the aggregation bus due to endogenous investment options. This is because it is not possible to know how much electrical load can be shifted before solving. For the transport sector, the following constraint is added. See the [transportation section](./data-transportation.md) schematics for details on where demand response is applied.

\begin{align*}
&\ \text{let:} \\
@@ -82,6 +84,17 @@ This is not applied directly to the `Link` object via the `p_nom` paprameter to
&\ \hspace{1cm} \sum_{v}(p_{n,t,v}) \times s_{n,t} - dr_{n,t} \geq 0 \hspace{0.5cm} \forall_{\text{n,t}}\\
\end{align*}
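
The transport-sector version compares demand response against the endogenous vehicle charging decision rather than a fixed load profile, so both sides of the inequality are optimization variables. A compressed sketch under the same caveats (the carrier names are hypothetical and a single aggregation bus is assumed):

```python
def add_transport_dr_constraint(n, shift_share=0.05):
    """Sketch: transport DR may not exceed a share of total vehicle charging."""
    m = n.model
    ev_links = n.links.index[n.links.carrier == "ev_charging"]  # hypothetical carrier
    dr_links = n.links.index[n.links.carrier == "transport_dr"]  # hypothetical carrier

    p = m.variables["Link-p"]
    charging = p.sel(Link=ev_links).sum("Link")  # sum over vehicle links v, per snapshot
    response = p.sel(Link=dr_links).sum("Link")

    m.add_constraints(shift_share * charging - response >= 0, name="transport_dr_capacity")
```
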

(workflow-sector)=
## Workflow

The diagram below illustrates the workflow of PyPSA-USA Sector. Many rules overlap with the electricity sector workflow; however, several additional rules are also present.

:::{figure-md} workflow
<img src="./_static/dag_sector.jpg" width="700px">

Snakemake DAG for sector coupled studies
:::

## Further Information

Each sector has a dedicated page giving information on data sources, implementation details, and assumptions. See the corresponding page linked below.
Binary file modified workflow/repo_data/dag.jpg
116 changes: 89 additions & 27 deletions workflow/scripts/solve_network.py
@@ -23,6 +23,7 @@
based on the rule :mod:`solve_network`.
"""

import copy
import logging
import re
from typing import Optional
@@ -165,7 +166,7 @@ def prepare_network(
p_nom=1e9, # kW
)

if solve_opts.get("noisy_costs"):
if solve_opts.get("noisy_costs"):  # add small random noise to component costs
for t in n.iterate_components():
if "marginal_cost" in t.df:
t.df["marginal_cost"] += 1e-2 + 2e-3 * (np.random.random(len(t.df)) - 0.5)
@@ -341,7 +342,7 @@ def add_RPS_constraints(n, config):

Returns
-------
None

"""

def process_reeds_data(filepath, carriers, value_col):
@@ -615,8 +616,11 @@ def add_regional_co2limit(n, sns, config):
config["electricity"]["regional_Co2_limits"],
index_col=[0],
)

logger.info("Adding regional Co2 Limits.")
regional_co2_lims = regional_co2_lims[regional_co2_lims.planning_horizon.isin(snakemake.params.planning_horizons)]

# Filter the regional_co2_lims DataFrame based on the planning horizons present in the snapshots
regional_co2_lims = regional_co2_lims[regional_co2_lims.planning_horizon.isin(sns.get_level_values(0))]
weightings = n.snapshot_weightings.loc[n.snapshots]

for idx, emmission_lim in regional_co2_lims.iterrows():
@@ -1456,9 +1460,20 @@ def extra_functionality(n, snapshots):
def solve_network(n, config, solving, opts="", **kwargs):
set_of_options = solving["solver"]["options"]
cf_solving = solving["options"]

if len(n.investment_periods) > 1:
foresight = (
snakemake.params.foresight
)  # foresight option from the scenario config, formatted like solve_opts
logger.info(
f"Pulling foresight option {foresight} from the scenario config file",
)
if (
len(n.investment_periods) > 1 and foresight == "perfect"
):  # choose perfect vs. myopic handling of multiple investment periods
kwargs["multi_investment_periods"] = config["foresight"] == "perfect"
logger.info(f"Using perfect foresight") # logger statements to ensure expected outcomes
elif len(n.investment_periods) > 1 and foresight == "myopic":
kwargs["multi_investment_periods"] = config["foresight"] == "myopic"
logger.info(f"Using myopic foresight") # logger statements to ensure expected outcomes

kwargs["solver_options"] = solving["solver_options"][set_of_options] if set_of_options else {}
kwargs["solver_name"] = solving["solver"]["name"]
@@ -1480,28 +1495,76 @@ def solve_network(n, config, solving, opts="", **kwargs):
n.config = config
n.opts = opts

if rolling_horizon:
kwargs["horizon"] = cf_solving.get("horizon", 365)
kwargs["overlap"] = cf_solving.get("overlap", 0)
n.optimize.optimize_with_rolling_horizon(**kwargs)
status, condition = "", ""
elif skip_iterations:
status, condition = n.optimize(**kwargs)
else:
kwargs["track_iterations"] = (cf_solving.get("track_iterations", False),)
kwargs["min_iterations"] = (cf_solving.get("min_iterations", 4),)
kwargs["max_iterations"] = (cf_solving.get("max_iterations", 6),)
status, condition = n.optimize.optimize_transmission_expansion_iteratively(
**kwargs,
)
for i, planning_horizon in enumerate(n.investment_periods):
# planning_horizons = snakemake.params.planning_horizons
sns_horizon = n.snapshots[n.snapshots.get_level_values(0) == planning_horizon]
# add sns_horizon to kwargs
kwargs["snapshots"] = sns_horizon

if rolling_horizon:
kwargs["horizon"] = cf_solving.get("horizon", 365)
kwargs["overlap"] = cf_solving.get("overlap", 0)
n.optimize.optimize_with_rolling_horizon(**kwargs)
status, condition = "", ""
elif skip_iterations:
status, condition = n.optimize(**kwargs)
else:
kwargs["track_iterations"] = (cf_solving.get("track_iterations", False),)
kwargs["min_iterations"] = (cf_solving.get("min_iterations", 4),)
kwargs["max_iterations"] = (cf_solving.get("max_iterations", 6),)
status, condition = n.optimize.optimize_transmission_expansion_iteratively(
**kwargs,
)

if status != "ok" and not rolling_horizon:
logger.warning(
f"Solving status '{status}' with termination condition '{condition}'",
)
if "infeasible" in condition:
# n.model.print_infeasibilities()
raise RuntimeError("Solving status 'infeasible'")
if status != "ok" and not rolling_horizon:
logger.warning(
f"Solving status '{status}' with termination condition '{condition}'",
)
if "infeasible" in condition:
# n.model.print_infeasibilities()
raise RuntimeError("Solving status 'infeasible'")

if foresight == "myopic":
if i == len(n.investment_periods) - 1:
logger.info(f"Final time horizon {planning_horizon}")
continue
logger.info(f"Preparing brownfield from {planning_horizon}")

# electric transmission grid set optimised capacities of previous as minimum
n.lines.s_nom_min = n.lines.s_nom_opt # for lines
dc_i = n.links[n.links.carrier == "DC"].index
n.links.loc[dc_i, "p_nom_min"] = n.links.loc[dc_i, "p_nom_opt"] # for links

for c in n.iterate_components(["Generator", "Link", "StorageUnit"]):
nm = c.name

# limit our components that we remove/modify to those prior to this time horizon
c_lim = c.df.loc[(c.df["build_year"] >= 0) & (c.df["build_year"] <= planning_horizon)]
logger.info(f"Preparing brownfield for the component {nm}")
# nominal capacity attribute prefix ("p" for Generator, Link, and StorageUnit)
attr = "p"
# copy over asset sizing from previous period
c_lim[f"{attr}_nom"] = c_lim[f"{attr}_nom_opt"]
c_lim[f"{attr}_nom_extendable"] = False
df = copy.deepcopy(c_lim)
time_df = copy.deepcopy(c.pnl)

for c_idx in c_lim.index:
n.remove(nm, c_idx)

n.add(nm, df.index, **df)
logger.info(n.consistency_check())

# copy time-dependent
selection = n.component_attrs[nm].type.str.contains(
"series",
)
# ) & n.component_attrs[
# nm
# ].status.str.contains("Input")

for tattr in n.component_attrs[nm].index[selection]:
n.import_series_from_dataframe(time_df[tattr], nm, tattr)

return n

@@ -1563,7 +1626,6 @@ def solve_network(n, config, solving, opts="", **kwargs):
)
n.meta = dict(snakemake.config, **dict(wildcards=dict(snakemake.wildcards)))
n.export_to_netcdf(snakemake.output[0])

with open(snakemake.output.config, "w") as file:
yaml.dump(
n.meta,
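
Taken together, the myopic additions to `solve_network` amount to: solve one planning horizon, freeze the optimized capacities as existing (non-extendable) brownfield assets, and move on to the next horizon. The PR does this by removing and re-adding components so that custom fields and time series carry over; the condensed sketch below captures the same carry-over idea by toggling attributes in place on a toy multi-period network. It is an illustration of the pattern, not the project code.

```python
import pypsa


def solve_myopically(n: pypsa.Network) -> pypsa.Network:
    """Sketch: solve investment periods sequentially, carrying capacities forward."""
    for i, period in enumerate(n.investment_periods):
        sns = n.snapshots[n.snapshots.get_level_values(0) == period]
        n.optimize(snapshots=sns, multi_investment_periods=True)

        if i == len(n.investment_periods) - 1:
            break  # nothing to carry forward after the final horizon

        # transmission: optimized capacities become the floor for the next period
        n.lines.s_nom_min = n.lines.s_nom_opt
        dc = n.links.index[n.links.carrier == "DC"]
        n.links.loc[dc, "p_nom_min"] = n.links.loc[dc, "p_nom_opt"]

        # generators, links, and storage built up to this period: fix size, stop extending
        for component in ("Generator", "Link", "StorageUnit"):
            df = n.df(component)
            built = df.index[df.build_year.le(period)]
            df.loc[built, "p_nom"] = df.loc[built, "p_nom_opt"]
            df.loc[built, "p_nom_extendable"] = False
    return n
```

This in-place variant would lose per-asset custom columns and time-dependent attributes, which is why the implementation in this PR instead removes the assets and re-adds them (with the stored time series re-imported via `import_series_from_dataframe`).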