WeatherFlow PiConsole - Archive

@peter
Sorry if this has been asked before, but I've noticed that there is a RED "!" in my date/time panel on the clients that I run on my Windows 11 PCs.
Do you know what it is/means?

It does not appear on my RPi clients…
[screenshot: red exclamation mark in the date/time panel]

NEVERMIND… I see that I'm NOT the only one who has noticed it…

…said the company that wants to charge you an upgrade fee plus monthly rental-fee increases due to switching to newer gear…

I wouldn’t put too much trust in anything a cable provider tells you.

(Now, if you can catch a service guy working nearby, walk over and ask the techie a couple of questions; 'then' you can likely believe what they tell you, if you get a guy who wants to take a quick break and chat.)


Hey all, was a fix for this ever figured out? Mine just started doing this a couple of days ago. I tried reading through the thread and didn't find a fix, if there was one.

Hi all, I loaded WeatherFlow PiConsole on my RPi 4 and noticed that the CPU usage is pretty high while it is running. Most of the time the 4 cores are hovering at 60-70%. Is this normal for the PiConsole?
Anyone else seeing this?

As mentioned, I'm using an RPi 4 with 4 GB RAM and a 7" 1024x600 HDMI touch display.

```
top - 18:04:33 up 1:28, 3 users, load average: 3.21, 3.08, 3.09
Tasks: 156 total, 2 running, 153 sleeping, 0 stopped, 1 zombie
%Cpu(s): 72.1/2.6 75[|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||  ]
MiB Mem :  3839.4 total,  3096.6 free,   286.8 used,   456.0 buff/cache
MiB Swap:   100.0 total,   100.0 free,     0.0 used.  3414.7 avail Mem

  PID USER  PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND
 1462 pi    20   0  426832 169908  66648 S 296.5  4.3 259:30.06 python3
```

No, this doesn’t seem normal. What version of Raspberry Pi OS are you running? Buster or Bullseye?

Is the CPU usage high all the time?

Hi Peter, yes, the CPU is running between 60-90% all the time. wfpiconsole is the only thing running on this RPi 4.

It is main.py that is showing between 300 and 370% CPU in top.

I manually changed the 1-second polling to 60-second polling in the code, and now the CPU is around 50% load all the time. I was thinking about loading Buster to see if that makes any difference.
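For illustration, this is roughly the kind of change I mean. It is not the actual wfpiconsole code; it just assumes the display refresh is driven by Kivy's Clock.schedule_interval, and the names and values are made up:

```python
# Illustrative sketch only -- not taken from the wfpiconsole source.
# Assumes the display refresh is scheduled with Kivy's Clock.schedule_interval.
from kivy.clock import Clock

POLL_INTERVAL = 60  # seconds; the hypothetical original value was 1

def update_display(dt):
    """Placeholder for the real routine that refreshes on-screen observations."""
    pass

# Scheduling the callback less often means the Python process does less work
# per minute, which is why the CPU load dropped.
Clock.schedule_interval(update_display, POLL_INTERVAL)
```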

This is the readout of top with the polling set to 60 seconds:

```
load average: 2.47, 2.62, 2.60
 %CPU %MEM     TIME+ COMMAND
270.3  4.2 215:49.30 python3
```

I'm running:

```
cat /etc/os-release
PRETTY_NAME="Raspbian GNU/Linux 11 (bullseye)"
NAME="Raspbian GNU/Linux"
VERSION_ID="11"
VERSION="11 (bullseye)"
```

I reinstalled using Bullseye and everything is working now. CPU is at most like 10%. Crazy, must have been a bad install previously.

I also wanted to mention that I tried installing Buster via the Raspberry Pi Imager, and the bash install script failed because Python OpenSSL errored: pip was trying to use Python 2.7, I think. I kept seeing messages that Python 2.7 is EOL and that Python 3 needed to be installed.

I have the red circle with the exclamation point as well. I cannot get this to stop. The phone app is fine. I want to try a new API. Can someone direct me on how to change the API on my current setup without starting from scratch and doing the whole setup over?

Sorry you're having trouble too. I am working with the WF team to track down the exact cause of this issue. I am not sure what you mean by trying a new API; there is only one API provided by WeatherFlow. Do you mean a new Personal Access Token? If so, you will need to stop the console and open the wfpiconsole.ini file in a text editor. Then you can delete the old token in the WeatherFlow key and copy in a new one. I don't think this will fix the issue, though.
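For reference, the relevant part of the file looks roughly like this (a sketch only; check your own wfpiconsole.ini for the exact section and key names before editing):

```ini
; Illustrative excerpt of wfpiconsole.ini -- section and key names may differ
; slightly in your copy, so check your own file before editing.
[Keys]
WeatherFlow = paste-your-new-personal-access-token-here
```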


I was referring to CheckWX. The setup instructions tell you to go to CheckWX and create an account, and then you use that API key during the PiConsole setup. The CheckWX site says the API key can be compromised and that sometimes you may need to use a new one.

Ah, I see! The same applies as above then: generate a new CheckWX API key on their website, open the wfpiconsole.ini file in a text editor and replace the CheckWX key. That all being said, the red exclamation mark is caused by a lack of data flowing from the WF servers. Changing the CheckWX API key (which gets data from a completely different company) is not going to fix that issue. Sorry! I will keep working on it until it is fixed.
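If it helps, the whole workflow is roughly this (it assumes a standard install, which provides the wfpiconsole helper command; the path to the .ini file may differ on your system):

```bash
# Rough workflow -- the helper command and file path assume a standard install.
wfpiconsole stop                      # stop the running console
nano ~/wfpiconsole/wfpiconsole.ini    # replace the CheckWX (or WeatherFlow) key
wfpiconsole start                     # start the console again
```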

Thank you. Can you email me directly at dbarnett@alexandercountync.gov ?

We had a ton of rain today (in Southern California) and I noticed the reported daily total is very different from what I see in the Tempest app. WFPi shows 1.17" and the Tempest app shows 1.82". I shut down and restarted the console, but the reported amount is the same. Is there a delay in the reported total, or did I do something wrong in setup?

Do you have a [NC icon, shown in the screenshot] by the rain value? If so, the Tempest app is showing the NearCast Rain value whereas WFPi is showing the raw output from the Tempest.


Yes, I do see that now. Is there a way to show the value from my station in the app?

Found the toggle. Thanks for the reply; I didn't recall the NearCast feature/function.


Glad you found it. FWIW, more NC info here: https://help.weatherflow.com/hc/en-us/articles/360024436634-NC-Rain-NearCast-Rain-

Peter, All,
Over the past few weeks, the red ! mark has been, and still is, showing up on the WFPiConsole. Like others, I have rebooted the Raspberry Pi, powered the unit off and back on, and stopped and restarted the WFPiConsole app, but the red ! still returns.
Has this been fixed, and if so, what do I need to do?

Hi @rnix.1, unfortunately this behaviour is still ongoing and is not due to the PiConsole. Current thinking is that there is a bug in the WF API that results in the Websocket not sending messages, triggering the PiConsole to show the red exclamation mark. Others have seen the same behaviour in completely different integrations: Websocket issue I could use some help with

Unfortunately, we still haven't had any official confirmation from WF that something is wrong or that they are working on fixing it.
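For anyone curious what "not sending messages" looks like from the client side, here is a minimal stand-alone sketch (not wfpiconsole code) of how a client can detect that the Websocket has gone quiet. The token and timeout are placeholders, and a real client would also send a listen_start request after connecting:

```python
# Stand-alone sketch (not wfpiconsole code) of detecting a silent Websocket.
# The token and timeout are placeholders; a real client would also send a
# "listen_start" request after connecting.
import asyncio
import websockets

WS_URL = "wss://ws.weatherflow.com/swd/data?token=YOUR_TOKEN"  # placeholder token
STALE_AFTER = 300  # seconds of silence before flagging the data as stale

async def watch():
    async with websockets.connect(WS_URL) as ws:
        while True:
            try:
                message = await asyncio.wait_for(ws.recv(), timeout=STALE_AFTER)
                print("message received:", message[:80])
            except asyncio.TimeoutError:
                # No messages within the window: this is the condition that
                # makes the console fall back to the red exclamation mark.
                print(f"No Websocket messages for {STALE_AFTER} seconds")

asyncio.run(watch())
```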

Peter,
Thank you for the update.
Cheers