
Enphase Envoy-S Metered Integration Bug - Failed Setup #142015


Closed
njwretnm opened this issue Apr 1, 2025 · 11 comments · Fixed by #142107

Comments

@njwretnm

njwretnm commented Apr 1, 2025

The problem

I recently purchased a used Envoy-S Metered to work with my M-Series Enphase microinverters. It has detected the micros, and it has a CT for PV production (nothing for consumption yet, though I suspect the previous owner had used the consumption CT at some point; perhaps the Envoy thinks it's missing and is doing something with the data).

The Envoy looks like it’s running firmware version M4.2.33 (pretty old!), and I’m able to access quite a bit of data right off the bat with no auth, as well as the installer/admin side of things using ‘installer’ and the installer password pulled using that python script :).

All that said, I’m not able to get the Home Assistant integration working. The integration looks like it’s connecting, but I’m seeing an immediate error: “Failed setup, will retry: ‘measurementType’”. I’ve tried every combination of username and password to see if it makes any difference, and it doesn’t seem to change anything.

Looking at the diagnostic logs, it looks like it’s looking for a JSON key that doesn’t exist:

Unexpected error fetching Envoy 123456789 data
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/helpers/update_coordinator.py", line 380, in _async_refresh
    self.data = await self._async_update_data()
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/enphase_envoy/coordinator.py", line 196, in _async_update_data
    envoy_data = await envoy.update()
                 ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/pyenphase/envoy.py", line 621, in update
    await self.probe()
  File "/usr/local/lib/python3.13/site-packages/pyenphase/envoy.py", line 560, in probe
    if updater_features := await klass.probe(supported_features):
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/pyenphase/updaters/production.py", line 105, in probe
    meter_type = meter["measurementType"]
                 ~~~~~^^^^^^^^^^^^^^^^^^^
KeyError: 'measurementType'

Additionally, this is an example production.json from my Envoy (envoy.local/production.json):

{"production":[{"type":"inverters","wNow":1406,"whLifetime":11420402.10388889,"readingTime":1743529602,"activeCount":19},{"type":"eim","activeCount":1,"whLifetime":8593158.949,"whLastSevenDays":5734.949,"whToday":5116.949,"wNow":1377.422,"rmsCurrent":11.486,"rmsVoltage":243.419,"reactPwr":241.26,"apprntPwr":1397.953,"pwrFactor":0.99,"readingTime":1743529602}],"consumption":[{"type":"eim","activeCount":0,"whLifetime":0,"whLastSevenDays":0,"whToday":0,"wNow":0,"varhLeadToday":0,"varhLagToday":0,"vahToday":0,"varhLeadLifetime":0,"varhLagLifetime":0,"vahLifetime":0,"rmsCurrent":0,"rmsVoltage":0,"reactPwr":0,"apprntPwr":0,"pwrFactor":0}]}

What version of Home Assistant Core has the issue?

core-2025.3.4

What was the last working version of Home Assistant Core?

No response

What type of installation are you running?

Home Assistant OS

Integration causing the issue

Enphase Envoy

Link to integration documentation on our website

https://www.home-assistant.io/integrations/enphase_envoy

Diagnostics information

No response

Example YAML snippet

Anything in the logs that might be useful for us?

Additional information

No response

@home-assistant

home-assistant bot commented Apr 1, 2025

Hey there @bdraco, @cgarwood, @joostlek, @catsmanac, mind taking a look at this issue as it has been labeled with an integration (enphase_envoy) you are listed as a code owner for? Thanks!

Code owner commands

Code owners of enphase_envoy can trigger bot actions by commenting:

  • @home-assistant close Closes the issue.
  • @home-assistant rename Awesome new title Renames the issue.
  • @home-assistant reopen Reopen the issue.
  • @home-assistant unassign enphase_envoy Removes the current integration label and assignees on the issue, add the integration domain after the command.
  • @home-assistant add-label needs-more-information Add a label (needs-more-information, problem in dependency, problem in custom component) to the issue.
  • @home-assistant remove-label needs-more-information Remove a label (needs-more-information, problem in dependency, problem in custom component) on the issue.

(message by CodeOwnersMention)


enphase_envoy documentation
enphase_envoy source
(message by IssueLinks)

@njwretnm njwretnm changed the title Enphase Envoy-S Metered Integration Bug - Missing JSON key Enphase Envoy-S Metered Integration Bug - Failed Setup Apr 1, 2025
@catsmanac
Contributor

The formatted JSON looks like below. The current logic tries to read meter["measurementType"] before testing for "activeCount" > 0.

Needs a change in pyenphase library.

{
    "production": [{
            "type": "inverters",
            "wNow": 1406,
            "whLifetime": 11420402.10388889,
            "readingTime": 1743529602,
            "activeCount": 19
        }, {
            "type": "eim",
            "activeCount": 1,
            "whLifetime": 8593158.949,
            "whLastSevenDays": 5734.949,
            "whToday": 5116.949,
            "wNow": 1377.422,
            "rmsCurrent": 11.486,
            "rmsVoltage": 243.419,
            "reactPwr": 241.26,
            "apprntPwr": 1397.953,
            "pwrFactor": 0.99,
            "readingTime": 1743529602
        }
    ],
    "consumption": [{
            "type": "eim",
            "activeCount": 0,
            "whLifetime": 0,
            "whLastSevenDays": 0,
            "whToday": 0,
            "wNow": 0,
            "varhLeadToday": 0,
            "varhLagToday": 0,
            "vahToday": 0,
            "varhLeadLifetime": 0,
            "varhLagLifetime": 0,
            "vahLifetime": 0,
            "rmsCurrent": 0,
            "rmsVoltage": 0,
            "reactPwr": 0,
            "apprntPwr": 0,
            "pwrFactor": 0
        }
    ]
}
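A tolerant probe could check those conditions first. The sketch below is illustrative only (not the actual pyenphase code), assuming the production.json shape above, where the old firmware omits "measurementType" entirely:

```python
# Illustrative sketch only -- not the actual pyenphase probe. Older firmware
# (here M4.2.33) omits "measurementType" from production.json meter sections,
# so check "activeCount" and key presence before indexing.

def active_meter_types(production_json: dict) -> list[str]:
    """Return measurement types only for CT sections that are really metering."""
    found = []
    for section in ("production", "consumption"):
        for meter in production_json.get(section, []):
            if meter.get("type") != "eim":
                continue  # inverter aggregate, not a CT meter
            if meter.get("activeCount", 0) == 0:
                continue  # CT disabled or absent; extra keys may be missing
            if "measurementType" not in meter:
                continue  # old firmware: key not reported at all
            found.append(meter["measurementType"])
    return found
```

With the payload from this issue, the function returns an empty list instead of raising KeyError, which would let the probe fall back gracefully.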

@njwretnm
Author

njwretnm commented Apr 2, 2025

@catsmanac I have zero prior experience with this, so I'm not 100% sure I did it right, but I managed to override the core enphase_envoy integration by duplicating the enphase_envoy files in the custom_components folder and then pointing the manifest at your pyenphase pull request branch. Good news and bad news. The good news is I'm seeing authenticated connections now, 200 responses, and production data being pulled (!!!), but I see another JSON key error down the line:

KeyError: 'statusFlags'
2025-04-01 22:16:22.520 DEBUG (MainThread) [pyenphase.firmware] Requesting https://192.168.0.11/info with timeout Timeout(connect=10.0, read=45.0, write=10.0, pool=10.0)
2025-04-01 22:16:22.522 DEBUG (MainThread) [pyenphase.firmware] Retrying to http://192.168.0.11/info with timeout Timeout(connect=10.0, read=45.0, write=10.0, pool=10.0)
2025-04-01 22:16:22.569 DEBUG (MainThread) [pyenphase.firmware] Request reply in 0.0 sec from http://192.168.0.11/info status 200: text/xml b"<?xml version='1.0' encoding='UTF-8'?>\n<envoy_info>\n  <time>1743563782</time>\n  <device>\n    <sn>1234567890</sn>\n    <pn>800-00547-r05</pn>\n    <software>M4.2.33</software>\n    <euaid>4c8675</euaid>\n    <seqnum>0</seqnum>\n    <apiver>1</apiver>\n  </device>\n  <package name='rootfs'>\n    <pn>500-00001-r01</pn>\n    <version>02.00.00</version>\n    <build>937</build>\n  </package>\n  <package name='full'>\n    <pn>500-00001-r01</pn>\n    <version>02.00.00</version>\n    <build>937</build>\n  </package>\n  <package name='kernel'>\n    <pn>500-00011-r01</pn>\n    <version>04.00.00</version>\n    <build>c90fc2</build>\n  </package>\n  <package name='boot'>\n    <pn>590-00018-r01</pn>\n    <version>02.00.01</version>\n    <build>79181c</build>\n  </package>\n  <package name='app'>\n    <pn>500-00002-r01</pn>\n    <version>04.02.33</version>\n    <build>b68db1</build>\n  </package>\n  <package name='devimg'>\n    <pn>500-00004-r01</pn>\n    <version>01.01.49</version>\n    <build>98cbde</build>\n  </package>\n  <package name='geo'>\n    <pn>500-00008-r01</pn>\n    <version>01.06.05</version>\n    <build>dfe5e2</build>\n  </package>\n  <package name='backbone'>\n    <pn>500-00010-r01</pn>\n    <version>04.02.45</version>\n    <build>afc643</build>\n  </package>\n  <package name='meter'>\n    <pn>500-00013-r01</pn>\n    <version>02.01.04</version>\n    <build>299acc</build>\n  </package>\n  <package name='agf'>\n    <pn>500-00012-r01</pn>\n    <version>01.00.00</version>\n    <build>b13066</build>\n  </package>\n  <package name='security'>\n    <pn>500-00016-r01</pn>\n    <version>02.00.00</version>\n    <build>54a6dc</build>\n  </package>\n</envoy_info>\n"
2025-04-01 22:16:22.570 DEBUG (MainThread) [pyenphase.envoy] FW: 4.2.33, Authenticating to Envoy using envoy/installer authentication
2025-04-01 22:16:22.571 DEBUG (MainThread) [pyenphase.envoy] Requesting http://192.168.0.11/ivp/meters with timeout Timeout(connect=10.0, read=45.0, write=10.0, pool=10.0)
2025-04-01 22:16:22.578 DEBUG (MainThread) [pyenphase.envoy] Request reply in 0.0 sec from http://192.168.0.11/ivp/meters status 200: None b'[\n    {\n        "eid": 704643328,\n        "state": "enabled",\n        "measurementType": "production",\n        "phaseMode": "split",\n        "phaseCount": 2,\n        "meteringStatus": "normal"\n    },\n    {\n        "eid": 704643584,\n        "state": "disabled",\n        "measurementType": "net-consumption",\n        "phaseMode": "split",\n        "phaseCount": 2,\n        "meteringStatus": "not-metering"\n    }\n]'
2025-04-01 22:16:22.578 DEBUG (MainThread) [pyenphase.envoy] Requesting http://192.168.0.11/production.json?details=1 with timeout Timeout(connect=10.0, read=45.0, write=10.0, pool=10.0)
2025-04-01 22:16:22.769 DEBUG (MainThread) [pyenphase.envoy] Request reply in 0.2 sec from http://192.168.0.11/production.json?details=1 status 200: application/json b'{"production":[{"type":"inverters","wNow":0,"whLifetime":11425504.1275,"readingTime":1743563782,"activeCount":19},{"type":"eim","activeCount":1,"whLifetime":8598256.633,"whLastSevenDays":10832.633,"whToday":10214.633,"wNow":-8.796,"rmsCurrent":1.767,"rmsVoltage":241.934,"reactPwr":206.72,"apprntPwr":213.736,"pwrFactor":-0.04,"readingTime":1743563782}],"consumption":[{"type":"eim","activeCount":0,"whLifetime":0,"whLastSevenDays":0,"whToday":0,"wNow":0,"varhLeadToday":0,"varhLagToday":0,"vahToday":0,"varhLeadLifetime":0,"varhLagLifetime":0,"vahLifetime":0,"rmsCurrent":0,"rmsVoltage":0,"reactPwr":0,"apprntPwr":0,"pwrFactor":0}]}'
2025-04-01 22:16:22.770 DEBUG (MainThread) [pyenphase.updaters.production] Expected Production report Phase values not available, 0 of 2
2025-04-01 22:16:22.770 DEBUG (MainThread) [pyenphase.envoy] Requesting http://192.168.0.11/production with timeout Timeout(connect=10.0, read=45.0, write=10.0, pool=10.0)
2025-04-01 22:16:22.834 DEBUG (MainThread) [pyenphase.updaters.production] Skipping production endpoint as user does not have access to /production: Authentication failed for http://192.168.0.11/production with status 401, please check your username/password or token.
2025-04-01 22:16:22.834 DEBUG (MainThread) [pyenphase.updaters.production] Expected Production report Phase values not available, 0 of 2
2025-04-01 22:16:22.834 DEBUG (MainThread) [pyenphase.envoy] Requesting http://192.168.0.11/api/v1/production/inverters with timeout Timeout(connect=10.0, read=45.0, write=10.0, pool=10.0)
2025-04-01 22:16:22.879 DEBUG (MainThread) [pyenphase.envoy] Request reply in 0.0 sec from http://192.168.0.11/api/v1/production/inverters status 200: application/json b'[\n  {\n    "serialNumber": "1234567890",\n    "lastReportDate": 1743551631,\n    "lastReportWatts": 3,\n    "maxReportWatts": 131\n  },\n  {\n    "serialNumber": "121436049348",\n    "lastReportDate": 1743551620,\n    "lastReportWatts": 3,\n    "maxReportWatts": 116\n  },\n  {\n    "serialNumber": "121436049328",\n    "lastReportDate": 1743551627,\n    "lastReportWatts": 3,\n    "maxReportWatts": 115\n  },\n  {\n    "serialNumber": "1234",\n    "lastReportDate": 1743551629,\n    "lastReportWatts": 2,\n    "maxReportWatts": 112\n  },\n  {\n    "serialNumber": "121436048845",\n    "lastReportDate": 1743551627,\n    "lastReportWatts": 2,\n    "maxReportWatts": 112\n  },\n  {\n    "serialNumber": "121436049598",\n    "lastReportDate": 1743551623,\n    "lastReportWatts": 2,\n    "maxReportWatts": 114\n  },\n  {\n    "serialNumber": "121436048659",\n    "lastReportDate": 1743551624,\n    "lastReportWatts": 3,\n    "maxReportWatts": 115\n  },\n  {\n    "serialNumber": "121622033394",\n    "lastReportDate": 1536668599,\n    "lastReportWatts": 13,\n    "maxReportWatts": 13\n  },\n  {\n    "serialNumber": "121622022742",\n    "lastReportDate": 1536668610,\n    "lastReportWatts": 14,\n    "maxReportWatts": 14\n  },\n  {\n    "serialNumber": "121622032547",\n    "lastReportDate": 1536665884,\n    "lastReportWatts": 3,\n    "maxReportWatts": 3\n  },\n  {\n    "serialNumber": "121622025673",\n    "lastReportDate": 1536619007,\n    "lastReportWatts": 2,\n    "maxReportWatts": 2\n  },\n  {\n    "serialNumber": "121622022737",\n    "lastReportDate": 1536667724,\n    "lastReportWatts": 10,\n    "maxReportWatts": 10\n  },\n  {\n    "serialNumber": "121622025752",\n    "lastReportDate": 1536668619,\n    "lastReportWatts": 14,\n    "maxReportWatts": 14\n  },\n  {\n    "serialNumber": "121622032412",\n    
"lastReportDate": 1536667737,\n    "lastReportWatts": 5,\n    "maxReportWatts": 5\n  },\n  {\n    "serialNumber": "121622022759",\n    "lastReportDate": 1536668642,\n    "lastReportWatts": 13,\n    "maxReportWatts": 13\n  },\n  {\n    "serialNumber": "121622025680",\n    "lastReportDate": 1536667756,\n    "lastReportWatts": 11,\n    "maxReportWatts": 11\n  },\n  {\n    "serialNumber": "121622025731",\n    "lastReportDate": 1536668656,\n    "lastReportWatts": 17,\n    "maxReportWatts": 17\n  },\n  {\n    "serialNumber": "121622032575",\n    "lastReportDate": 1536668671,\n    "lastReportWatts": 18,\n    "maxReportWatts": 18\n  },\n  {\n    "serialNumber": "121436053372",\n    "lastReportDate": 1743551610,\n    "lastReportWatts": 3,\n    "maxReportWatts": 117\n  },\n  {\n    "serialNumber": "121436053331",\n    "lastReportDate": 1743551616,\n    "lastReportWatts": 2,\n    "maxReportWatts": 59\n  },\n  {\n    "serialNumber": "121436049391",\n    "lastReportDate": 1743551620,\n    "lastReportWatts": 2,\n    "maxReportWatts": 113\n  },\n  {\n    "serialNumber": "121436048644",\n    "lastReportDate": 1743551625,\n    "lastReportWatts": 2,\n    "maxReportWatts": 116\n  },\n  {\n    "serialNumber": "121436049347",\n    "lastReportDate": 1743551630,\n    "lastReportWatts": 2,\n    "maxReportWatts": 93\n  },\n  {\n    "serialNumber": "121436049374",\n    "lastReportDate": 1743551623,\n    "lastReportWatts": 2,\n    "maxReportWatts": 111\n  },\n  {\n    "serialNumber": "121436049274",\n    "lastReportDate": 1743551618,\n    "lastReportWatts": 3,\n    "maxReportWatts": 118\n  },\n  {\n    "serialNumber": "121436048765",\n    "lastReportDate": 1743551617,\n    "lastReportWatts": 2,\n    "maxReportWatts": 114\n  },\n  {\n    "serialNumber": "121436049230",\n    "lastReportDate": 1743551613,\n    "lastReportWatts": 3,\n    "maxReportWatts": 113\n  },\n  {\n    "serialNumber": "121436049236",\n    "lastReportDate": 1743551612,\n    "lastReportWatts": 2,\n    "maxReportWatts": 94\n  
},\n  {\n    "serialNumber": "121436048857",\n    "lastReportDate": 1743551614,\n    "lastReportWatts": 2,\n    "maxReportWatts": 107\n  },\n  {\n    "serialNumber": "121436053315",\n    "lastReportDate": 1743551609,\n    "lastReportWatts": 3,\n    "maxReportWatts": 119\n  },\n  {\n    "serialNumber": "121622031692",\n    "lastReportDate": 1536668648,\n    "lastReportWatts": 16,\n    "maxReportWatts": 16\n  },\n  {\n    "serialNumber": "121622033387",\n    "lastReportDate": 1536667749,\n    "lastReportWatts": 10,\n    "maxReportWatts": 10\n  },\n  {\n    "serialNumber": "121622032550",\n    "lastReportDate": 1536618998,\n    "lastReportWatts": 2,\n    "maxReportWatts": 2\n  },\n  {\n    "serialNumber": "121622030849",\n    "lastReportDate": 1536618997,\n    "lastReportWatts": 2,\n    "maxReportWatts": 2\n  },\n  {\n    "serialNumber": "121622032421",\n    "lastReportDate": 1536618992,\n    "lastReportWatts": 2,\n    "maxReportWatts": 2\n  },\n  {\n    "serialNumber": "121622032296",\n    "lastReportDate": 1536666858,\n    "lastReportWatts": 8,\n    "maxReportWatts": 8\n  },\n  {\n    "serialNumber": "121622030861",\n    "lastReportDate": 1536668663,\n    "lastReportWatts": 15,\n    "maxReportWatts": 15\n  },\n  {\n    "serialNumber": "121622030852",\n    "lastReportDate": 1536665972,\n    "lastReportWatts": 6,\n    "maxReportWatts": 6\n  },\n  {\n    "serialNumber": "121622033390",\n    "lastReportDate": 1536668609,\n    "lastReportWatts": 16,\n    "maxReportWatts": 16\n  },\n  {\n    "serialNumber": "121622032346",\n    "lastReportDate": 1536668618,\n    "lastReportWatts": 16,\n    "maxReportWatts": 16\n  },\n  {\n    "serialNumber": "121622029606",\n    "lastReportDate": 1536668634,\n    "lastReportWatts": 17,\n    "maxReportWatts": 17\n  },\n  {\n    "serialNumber": "121622032551",\n    "lastReportDate": 1536666817,\n    "lastReportWatts": 6,\n    "maxReportWatts": 6\n  },\n  {\n    "serialNumber": "121622025749",\n    "lastReportDate": 1536668635,\n    
"lastReportWatts": 16,\n    "maxReportWatts": 16\n  },\n  {\n    "serialNumber": "121622030863",\n    "lastReportDate": 1536668647,\n    "lastReportWatts": 16,\n    "maxReportWatts": 16\n  },\n  {\n    "serialNumber": "121622033381",\n    "lastReportDate": 1536668642,\n    "lastReportWatts": 9,\n    "maxReportWatts": 9\n  },\n  {\n    "serialNumber": "121622025745",\n    "lastReportDate": 1536619020,\n    "lastReportWatts": 2,\n    "maxReportWatts": 2\n  },\n  {\n    "serialNumber": "121622022741",\n    "lastReportDate": 1536619013,\n    "lastReportWatts": 2,\n    "maxReportWatts": 2\n  },\n  {\n    "serialNumber": "121622033386",\n    "lastReportDate": 1536668598,\n    "lastReportWatts": 11,\n    "maxReportWatts": 11\n  },\n  {\n    "serialNumber": "121622032673",\n    "lastReportDate": 1536668609,\n    "lastReportWatts": 17,\n    "maxReportWatts": 17\n  },\n  {\n    "serialNumber": "121622032667",\n    "lastReportDate": 1536668621,\n    "lastReportWatts": 18,\n    "maxReportWatts": 18\n  },\n  {\n    "serialNumber": "121622033019",\n    "lastReportDate": 1536668634,\n    "lastReportWatts": 17,\n    "maxReportWatts": 17\n  }\n]\n'
2025-04-01 22:16:22.879 DEBUG (MainThread) [pyenphase.updaters.ensemble] Firmware too old for Ensemble support
2025-04-01 22:16:22.880 DEBUG (MainThread) [pyenphase.envoy] Requesting http://192.168.0.11/admin/lib/tariff with timeout Timeout(connect=10.0, read=45.0, write=10.0, pool=10.0)
2025-04-01 22:16:22.935 DEBUG (MainThread) [pyenphase.updaters.tariff] Skipping tariff endpoint as user does not have access to /admin/lib/tariff: Authentication failed for http://192.168.0.11/admin/lib/tariff with status 401, please check your username/password or token.
2025-04-01 22:16:22.935 DEBUG (MainThread) [pyenphase.updaters.generator] Firmware too old for Ensemble support
2025-04-01 22:16:22.935 DEBUG (MainThread) [pyenphase.envoy] Requesting http://192.168.0.11/ivp/meters/readings with timeout Timeout(connect=10.0, read=45.0, write=10.0, pool=10.0)
2025-04-01 22:16:22.944 DEBUG (MainThread) [pyenphase.envoy] Request reply in 0.0 sec from http://192.168.0.11/ivp/meters/readings status 200: None b'[\n    {\n        "eid": 704643328,\n        "timestamp": 1743563780,\n        "actEnergyDlvd": 8598256.633,\n        "actEnergyRcvd": 15438.970,\n        "apparentEnergy": 12039068.737,\n        "reactEnergyLagg": 4797532.454,\n        "reactEnergyLead": 2.593,\n        "instantaneousDemand": -8.80,\n        "activePower": -8.80,\n        "apparentPower": 213.74,\n        "reactivePower": 206.72,\n        "pwrFactor": -0.04,\n        "voltage": 241.93,\n        "current": 1.77,\n        "freq": 60.00,\n        "channels": [\n            {\n                "eid": 1778385169,\n                "timestamp": 1743563780,\n                "actEnergyDlvd": 4289410.853,\n                "actEnergyRcvd": 10353.255,\n                "apparentEnergy": 6009955.301,\n                "reactEnergyLagg": 2407341.556,\n                "reactEnergyLead": 1.556,\n                "instantaneousDemand": -4.35,\n                "activePower": -4.35,\n                "apparentPower": 106.03,\n                "reactivePower": 101.68,\n                "pwrFactor": -0.04,\n                "voltage": 121.21,\n                "current": 0.87,\n                "freq": 60.00\n            },\n            {\n                "eid": 1778385170,\n                "timestamp": 1743563780,\n                "actEnergyDlvd": 4308845.780,\n                "actEnergyRcvd": 5085.715,\n                "apparentEnergy": 6029113.436,\n                "reactEnergyLagg": 2390190.898,\n                "reactEnergyLead": 1.038,\n                "instantaneousDemand": -4.45,\n                "activePower": -4.45,\n                "apparentPower": 107.71,\n                "reactivePower": 105.04,\n                "pwrFactor": -0.04,\n                "voltage": 120.72,\n                "current": 0.89,\n                "freq": 60.00\n            },\n            {\n      
          "eid": 1778385171,\n                "timestamp": 1743563780,\n                "actEnergyDlvd": 0.000,\n                "actEnergyRcvd": 0.000,\n                "apparentEnergy": 0.000,\n                "reactEnergyLagg": 0.000,\n                "reactEnergyLead": 0.000,\n                "instantaneousDemand": 0.00,\n                "activePower": 0.00,\n                "apparentPower": 0.00,\n                "reactivePower": 0.00,\n                "pwrFactor": 0.00,\n                "voltage": 0.00,\n                "current": 0.00,\n                "freq": 60.00\n            }\n        ]\n    }\n]\n'
2025-04-01 22:16:22.944 ERROR (MainThread) [custom_components.enphase_envoy.coordinator] Unexpected error fetching Envoy 1234567890 data
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/helpers/update_coordinator.py", line 380, in _async_refresh
    self.data = await self._async_update_data()
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/enphase_envoy/coordinator.py", line 196, in _async_update_data
    envoy_data = await envoy.update()
                 ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/pyenphase/envoy.py", line 640, in update
    await updater.update(data)
  File "/usr/local/lib/python3.13/site-packages/pyenphase/updaters/meters.py", line 190, in update
    envoy_data.ctmeter_production = EnvoyMeterData.from_api(meter, ct_data)
                                    ~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/pyenphase/models/meters.py", line 87, in from_api
    status_flags=meter_status["statusFlags"],
                 ~~~~~~~~~~~~^^^^^^^^^^^^^^^
KeyError: 'statusFlags'
2025-04-01 22:16:22.945 DEBUG (MainThread) [custom_components.enphase_envoy.coordinator] Finished fetching Envoy 1234567890 data in 0.425 seconds (success: False)

My /ivp/meters json response:

[
    {
        "eid": 704643328,
        "state": "enabled",
        "measurementType": "production",
        "phaseMode": "split",
        "phaseCount": 2,
        "meteringStatus": "normal"
    },
    {
        "eid": 704643584,
        "state": "disabled",
        "measurementType": "net-consumption",
        "phaseMode": "split",
        "phaseCount": 2,
        "meteringStatus": "not-metering"
    }
]

and my /ivp/meters/readings json:

[
    {
        "eid": 704643328,
        "timestamp": 1743564539,
        "actEnergyDlvd": 8598256.633,
        "actEnergyRcvd": 15440.826,
        "apparentEnergy": 12039113.426,
        "reactEnergyLagg": 4797575.856,
        "reactEnergyLead": 2.593,
        "instantaneousDemand": -8.51,
        "activePower": -8.51,
        "apparentPower": 211.47,
        "reactivePower": 204.68,
        "pwrFactor": -0.04,
        "voltage": 242.10,
        "current": 1.75,
        "freq": 60.00,
        "channels": [
            {
                "eid": 1778385169,
                "timestamp": 1743564539,
                "actEnergyDlvd": 4289410.853,
                "actEnergyRcvd": 10354.238,
                "apparentEnergy": 6009977.620,
                "reactEnergyLagg": 2407363.257,
                "reactEnergyLead": 1.556,
                "instantaneousDemand": -5.15,
                "activePower": -5.15,
                "apparentPower": 107.28,
                "reactivePower": 103.84,
                "pwrFactor": -0.05,
                "voltage": 121.42,
                "current": 0.88,
                "freq": 60.00
            },
            {
                "eid": 1778385170,
                "timestamp": 1743564539,
                "actEnergyDlvd": 4308845.780,
                "actEnergyRcvd": 5086.589,
                "apparentEnergy": 6029135.806,
                "reactEnergyLagg": 2390212.598,
                "reactEnergyLead": 1.038,
                "instantaneousDemand": -3.36,
                "activePower": -3.36,
                "apparentPower": 104.19,
                "reactivePower": 100.83,
                "pwrFactor": -0.03,
                "voltage": 120.68,
                "current": 0.86,
                "freq": 60.00
            },
            {
                "eid": 1778385171,
                "timestamp": 1743564539,
                "actEnergyDlvd": 0.000,
                "actEnergyRcvd": 0.000,
                "apparentEnergy": 0.000,
                "reactEnergyLagg": 0.000,
                "reactEnergyLead": 0.000,
                "instantaneousDemand": 0.00,
                "activePower": 0.00,
                "apparentPower": 0.00,
                "reactivePower": 0.00,
                "pwrFactor": 0.00,
                "voltage": 0.00,
                "current": 0.00,
                "freq": 60.00
            }
        ]
    }
]

Looks like maybe this line is the issue here?

https://github.com/pyenphase/pyenphase/blob/153787d76070f30504dd41847b2863ca2471535d/src/pyenphase/models/meters.py#L87

Maybe update it to something like this:

  status_flags=meter_status.get("statusFlags", None),
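As a standalone illustration of that pattern, a hypothetical parser (not the real EnvoyMeterData.from_api) that defaults the missing key might look like this:

```python
# Hypothetical parser illustrating the suggested .get() fix -- not the real
# pyenphase EnvoyMeterData.from_api. Firmware M4.2.33 omits "statusFlags"
# from /ivp/meters/readings, so default it instead of raising KeyError.
from dataclasses import dataclass, field


@dataclass
class MeterReading:
    eid: int
    active_power: float
    status_flags: list[str] = field(default_factory=list)

    @classmethod
    def from_api(cls, meter_status: dict) -> "MeterReading":
        return cls(
            eid=meter_status["eid"],
            active_power=meter_status.get("activePower", 0.0),
            # missing on old firmware -- fall back to an empty list
            status_flags=meter_status.get("statusFlags") or [],
        )
```

Using `or []` (rather than `None`) keeps downstream code that iterates the flags working unchanged.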

@catsmanac
Contributor

Smart move on that testing :-), helps a lot.

Yes, something like that. I've updated the PR with such a change for a next try.

I've added the data from your log file to the test fixture for this as well. We may need some more iterations on this.

Looking at the data, it seems the Envoy setup is continuing from its old configuration. The current lifetime production value shows 8.5 MWh. The list of inverters reports 51 inverters, of which only the first 7 report with a current timestamp. The other 44 report with a 2018 timestamp. This will result in a total of 51 inverters in HA, of which 44 will be stuck on that old last value.

@njwretnm
Author

njwretnm commented Apr 2, 2025

Hey @catsmanac -

Just want to confirm that everything is working great with your latest updates. I'm seeing the integration succeed, a bunch of entities getting added, and wattage values updating! So you definitely have my seal of approval!

My next question is: should these have nicer names when the integration is added, or is this possibly another issue? I see on my dashboard that the sensor values associated with the gateway all have generic names, 'Energy' and 'Power', like Envoy 123456780 Energy, and the inverter entity names are Inverter 1122334455, etc.

With regard to your other statement- I definitely have a long list of 'deleted' inverters in my Envoy gateway page as these were from the previous owner, and they did unfortunately show up as entities with essentially non-changing wattages in HA. I don't see a way to truly delete them from my Envoy (perhaps there is some hard reboot method somewhere?), but does the integration for more modern versions of Envoys prevent the retired inverters from being added as entities on init of the integration? If so, I can help provide more API responses (I see some from the logged in installer page that would probably be helpful in this). Otherwise, this would just be a new feature.

@njwretnm
Author

njwretnm commented Apr 2, 2025

[Two screenshots showing the generically named 'Energy'/'Power' and 'Inverter <serial>' entities]

@catsmanac
Contributor

Just want to confirm that everything is working great with your latest updates. I'm seeing the integration succeeding and a bunch of entities getting added and it looks like Wattage values are updating! So you definitely have my seal of approval!

Ah great, I'll move it to production for an upcoming release.

Can you get me a diagnostics file with test fixtures enabled so I can make sure our test fixtures fully represent this case? On the HA Enphase Envoy integration page, use the Configure option and enable the option to include test fixture data in the diagnostics report.

Then use the Download diagnostics option in the overflow menu. It will download the diagnostics report with test data included, which can be uploaded in a comment as a file.

My next question is should these have nice names when the integration is added, or is this possibly another issue? I see on my dashboard that the sensor values associated with the gateway all have generic names 'Energy' and 'Power'... Like Envoy 123456780 Energy , and the inverter entity names are: Inverter 1122334455 etc.

The inverter names are as expected; the energy and power names should be longer. What shows in the entity properties?

With regard to your other statement- I definitely have a long list of 'deleted' inverters in my Envoy gateway page as these were from the previous owner, and they did unfortunately show up as entities with essentially non-changing wattages in HA. I don't see a way to truly delete them from my Envoy (perhaps there is some hard reboot method somewhere?), but does the integration for more modern versions of Envoys prevent the retired inverters from being added as entities on init of the integration? If so, I can help provide more API responses (I see some from the logged in installer page that would probably be helpful in this). Otherwise, this would just be a new feature.

The integration adds all inverters reported by the Envoy. I've seen this happen with modern firmwares as well after a change in inverters, when old inverters were not removed from the configuration. There is an endpoint /inventory.json on the Envoy that shows all components it knows about. Check if the old ones are in there as well; if so, maybe there are some fields that can assist in detecting the status of an inverter. The last reported timestamp of the inverter itself could be used as an indicator as well. Another easy method is disabling the entities for these inverters in HA, and they will not be used anymore.

@njwretnm
Author

njwretnm commented Apr 2, 2025

Diagnostics: config_entry-enphase_envoy-01JQVSJRBVK75A6VTBX9EYR4T8.json

I don't really see any additional metadata anywhere for the energy/power names:

[Screenshot of the entity properties]

As far as active/inactive, it looks like there are envoy.local/inventory.json and envoy.local/inventory.json?deleted=1. The latter endpoint gives all of the inverters, including deleted ones (in the Envoy installer page, at least on my firmware, I can set inverters to 'deleted', though there is no way to fully remove them), and the first, /inventory.json, just provides the active ones.

Including those json's here:

inventory.json

inventory-with-deleted.json

Maybe 'admin_state' is the answer to this one.
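If admin_state does distinguish deleted inverters, a filter could look roughly like this. Both the inventory shape and the deleted-state value below are assumptions to verify against the attached inventory files, not documented Enphase behavior:

```python
# Assumed inventory.json shape: a list of sections like
#   {"type": "PCU", "devices": [{"serial_num": ..., "admin_state": ...}, ...]}
# The field names and DELETED_STATE value are assumptions from this thread,
# to be checked against the attached inventory files.
DELETED_STATE = 1  # hypothetical value marking a 'deleted' inverter


def active_inverter_serials(inventory: list[dict]) -> set[str]:
    """Collect serial numbers of inverters not marked as deleted."""
    serials = set()
    for section in inventory:
        if section.get("type") != "PCU":  # assume PCU sections hold microinverters
            continue
        for device in section.get("devices", []):
            if device.get("admin_state") == DELETED_STATE:
                continue  # skip inverters the installer marked as deleted
            serials.add(device["serial_num"])
    return serials
```

The returned set could then be intersected with the inverters from /api/v1/production/inverters before entities are created.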

@catsmanac
Contributor

As for the names, I think that may be caused by the custom integration having no translations. Copy strings.json to en.json in a translations subfolder (custom_components/enphase_envoy/translations/en.json) and restart HA.

@njwretnm
Author

njwretnm commented Apr 2, 2025

Ah, that translation folder/file fixed it! The names show up now.

@catsmanac
Contributor

Including those json's here:

Can you read these using /inventory or do you need to use /inventory.json?

So the inventory has the active ones and could serve as a reference. We need to assess if or how to use it. As said, you can disable the inverter device or inverter entities in HA.

@github-actions github-actions bot locked and limited conversation to collaborators May 4, 2025