Saturday, 25 March 2023

Decoding and replicating an IR signal (part 1)

Some home automation, although it's my home away from home we're talking about here. At our cottage we have a Panasonic heat pump, now about 10 years old, so at some point it will be up for replacement (I asked about servicing it, and the technician pretty much told us there's no point: when it stops working, you just replace it).

For now I have used some remote-controlled sockets (with good results) to control the heating there. Direct electric heating is somewhat expensive, though, and using said heat pump instead would be more cost-effective. The problem is that it is controlled by a proprietary IR remote, so I can't just tell it to switch to "+20C heat"...

Or can I?

I wanted to replicate the IR signal with hardware I already have - effectively the same I used for the RF stuff. My idea had basically two steps:

1) Receive and decode the code sent by the IR remote
2) Replicate said code.

Seems so simple. It turned out to be much less so.

My original idea was to make a simple IR receiver module (easy; there are plenty of schematics on the net) and record the sequence. An IR signal usually has a carrier (the IR being turned on and off, generally in the 40-80 kHz range) with the data modulated on top. In this case (as I found out after building the receiver) it was a 75 kHz-ish carrier with PWM on top of it, starting with a longer preamble.
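To make that structure concrete, below is a minimal decoding sketch - not my code and not Panasonic's actual protocol; the timings are made-up placeholders, and it assumes you already have clean mark/space durations to work with (e.g. from a demodulating receiver module, which is exactly what I didn't have at this point):

    #include <stdio.h>
    #include <stdint.h>

    /* Placeholder timings (microseconds) -- NOT the real Panasonic
     * protocol, just illustrative values for a PWM-style IR frame. */
    #define PREAMBLE_MIN_US  3000  /* marks this long are preamble */
    #define BIT_THRESHOLD_US 1000  /* spaces longer than this are '1' */

    /* Decode demodulated mark/space durations into bits.
     * durations[] alternates mark, space, mark, space, ... */
    static int decode_pwm(const uint32_t *durations, int n, uint8_t *bits)
    {
        int count = 0;
        int i = 0;
        /* Skip leading preamble pairs (long mark plus its space) */
        while (i + 1 < n && durations[i] >= PREAMBLE_MIN_US)
            i += 2;
        /* Data is carried in the space lengths (pulse-distance coding) */
        for (; i + 1 < n; i += 2)
            bits[count++] = (durations[i + 1] > BIT_THRESHOLD_US) ? 1 : 0;
        return count;
    }

    int main(void)
    {
        /* Fabricated example capture: preamble, then bits 1,0,1,1 */
        const uint32_t capture[] = { 3500, 1700,
                                     450, 1300, 450, 400, 450, 1300, 450, 1300 };
        uint8_t bits[16];
        int n = decode_pwm(capture, sizeof capture / sizeof capture[0], bits);
        for (int i = 0; i < n; i++)
            printf("%d", bits[i]);
        printf("\n");
        return 0;
    }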

The plan was to modify the same code I had used for the RF receiver to serve as an IR receiver. Well, long story short, that did not work. The RF receiver module did the AM demodulation for me, so I was receiving (relatively) clean 1-or-0 data and only had to decode from there. The IR receiver didn't; I was receiving the carrier signal directly. And as it turns out, the software was not fast enough for clean reception. It lost pulses too often to be reliable, so that idea was out.
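Some back-of-the-envelope math shows why. Assuming a typical 16 MHz 8-bit MCU (an assumption; adjust for your chip), carrier edges arrive every

\[ T_{\text{edge}} = \frac{1}{2 \times 75\,\text{kHz}} \approx 6.7\,\mu\text{s} \]

which leaves roughly \( 6.7\,\mu\text{s} \times 16\,\text{MHz} \approx 107 \) CPU cycles per edge, and interrupt entry/exit overhead alone eats a good chunk of that before any actual work gets done.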

That was last autumn. I ran out of time; winter was coming, so I had to install the system as-is to have at least the current level of control (and oh boy, has it been great even in its current state: it dropped my power usage by a full two thirds, since the system can keep direct electric heating off whenever it's warm enough outside for the heat pump to manage alone).

I couldn't drop the idea of heat pump control, however... (which is to say: to be continued)

 

Friday, 3 February 2023

Boredom is essential

Prove me wrong: boredom, or idleness, is essential for creativity.

Over the years I've noticed that the periods of boredom, of not having things to do, of being idle, are the ones that bring out the creativity in me.

Lately I've been extremely busy just keeping up with work (quite successfully). Several independent things happened at the same time (the virus that messed with the world being just one of them) and caused a massive workload for us. During this time I've been able to do just the essentials. Yes, I kept things rolling and the lights on, but that's just doing what's required. There is no creativity, no new things involved.

On the other hand, when the virus first hit, our business was second in line to get hit (the first being our customers). While this was not a financial problem for us, it meant that we got almost no phone calls, no messages, nothing. Suddenly, lots of time previously taken up by other things was free. I had (almost) nothing urgent to do!

It took me a week or so before I installed a full development environment on my home computer, and a few weeks more to get things rolling on things I almost never do. I had fun just playing around with game ideas, and while nothing came out of them, they were still a relaxing distraction.

Whenever I have had extended downtime, something similar has happened. Sometimes it's work-related (some low-priority thing I might not normally touch), but I also get started on projects I haven't had the energy to start before, like game development. I never get very far there (downtime is almost always limited), but at least I get something done.

Years ago I read a blog post from someone - I don't remember the details - but the gist was that as a business owner, the goal is to make yourself unnecessary. Make your employees do the work: you teach them how, they keep the business rolling, and you make sure they have the tools to do that. Maybe then I could start thinking about other things to get done.

At this moment, I feel that I am en route to that goal. There's just this small issue of needing good software people. These days they're not exactly easy to find...


 

Friday, 6 January 2023

Server overload

I run a server for IoT-like devices. They send data to the server, which verifies it (catching transmission errors with checksums and so on), checks that it isn't a duplicate entry, and then stores it and acknowledges it to the device, after which the device marks it as handled on its end.

This system has been running for almost ten years now, gradually growing in number of users and, by extension, amount of data.

Some time ago this failed hard. The server overloaded and requests started timing out. Restarts helped for a while, but since the devices were still trying to send data, it very quickly fell over again. After a while even more devices had data to send but couldn't, and they really didn't have any throttling implemented that the server could use to tell them to cool down.

Things were looking quite bad.

I had no great tools for profiling the system, so I effectively had to resort to top to see what was going on. And what was going on was that the SQL server (not the MS product with a similar name) was using the vast majority of the CPU power.

So I started to think back over what the code does:

1) Parse the incoming message, doing validations and checksums
2) Find the client device from the database; discard the data if the device doesn't exist
3) Look through the device's data for a duplicate of this entry; ignore the data if a duplicate is found
4) Insert the new data into the database
5) Send the device an acknowledgement of the data (in case of a duplicate, the data is acknowledged without action)
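For the curious, step 3 boils down to a query along these lines. This is a sketch using SQLite and a guessed schema (device_id, recorded_at, payload) just to keep it self-contained - the real server runs a different SQL database and a different schema. The point is that this lookup runs once for every incoming message:

    #include <sqlite3.h>

    /* Sketch of the step-3 duplicate check. Returns 1 for duplicate,
     * 0 for new data, -1 on error. */
    static int is_duplicate(sqlite3 *db, int device_id,
                            long recorded_at, const char *payload)
    {
        sqlite3_stmt *stmt;
        int dup = 0;
        const char *sql =
            "SELECT 1 FROM measurements"
            " WHERE device_id = ? AND recorded_at = ? AND payload = ?"
            " LIMIT 1";
        if (sqlite3_prepare_v2(db, sql, -1, &stmt, NULL) != SQLITE_OK)
            return -1;  /* treat prepare failure as an error, not a dup */
        sqlite3_bind_int(stmt, 1, device_id);
        sqlite3_bind_int64(stmt, 2, recorded_at);
        sqlite3_bind_text(stmt, 3, payload, -1, SQLITE_STATIC);
        if (sqlite3_step(stmt) == SQLITE_ROW)
            dup = 1;
        sqlite3_finalize(stmt);
        return dup;
    }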

The process is like this for a reason. Basically, I want all data produced by the devices to be stored exactly once, and the process assumes that there will be hiccups: data getting corrupted during transmission, connections dropping mid-transmit for some reason, acknowledgements not being received, and so on.

At this point I had strong suspicions about step 3. After a few experiments (running and timing those duplicate-search queries) I confirmed that this step was indeed taking way too long, even with fully indexed device measurement tables and query optimizations. This was Not Good.

I already knew what I would have to do. I just didn't want to do it, since it would make a small but not insignificant dent in reliability. I had to separate data reception from data storage. But there was no other option.

So I created a new table that contains just raw measurement data. The process above was split into two separate parts, running independently.

A1) Parse the incoming data, validate it, and so on
A2) Store the data in an intermediate table
A3) Send the device an acknowledgement of the data

Then there's B, another script running as a cron job. Architecturally this could be a continuously running service that (A) notifies via a signal or similar when there is data available, but a cron job was a quick and good-enough solution here. It causes a latency of a few minutes before received data appears in the main database - basically a non-issue here. (There's a sketch of both halves in code after the B steps below.)

B1) Load a batch of data from the intermediate table
B2) Find the client device from the database; discard the data if not found
B3) Look for duplicates; discard the data if one is found
B4) Insert the data
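Put together, the split looks roughly like this - again a sketch on SQLite with an invented schema, not the production code. The essential point is that the A phase does one cheap INSERT and acknowledges immediately, while the expensive device and duplicate lookups happen in the B phase, decoupled from the devices:

    #include <stdio.h>
    #include <sqlite3.h>

    /* A phase: called once per incoming message. One cheap INSERT,
     * after which the caller acknowledges the device. */
    static int phase_a_store(sqlite3 *db, const char *raw_message)
    {
        sqlite3_stmt *stmt;
        int rc = sqlite3_prepare_v2(db,
            "INSERT INTO incoming_raw (received_at, payload)"
            " VALUES (strftime('%s','now'), ?)", -1, &stmt, NULL);
        if (rc != SQLITE_OK)
            return rc;
        sqlite3_bind_text(stmt, 1, raw_message, -1, SQLITE_STATIC);
        rc = sqlite3_step(stmt);
        sqlite3_finalize(stmt);
        return rc == SQLITE_DONE ? 0 : rc;  /* 0 -> safe to acknowledge */
    }

    /* B phase: cron-driven worker that processes a batch at a time. */
    static void phase_b_process(sqlite3 *db)
    {
        sqlite3_stmt *stmt;
        if (sqlite3_prepare_v2(db,
            "SELECT id, payload FROM incoming_raw"
            " ORDER BY id LIMIT 500", -1, &stmt, NULL) != SQLITE_OK)
            return;
        while (sqlite3_step(stmt) == SQLITE_ROW) {
            long id = (long)sqlite3_column_int64(stmt, 0);
            const unsigned char *payload = sqlite3_column_text(stmt, 1);
            /* B2-B4 go here: look up the device, check for duplicates,
             * insert into the main measurement table, then delete this
             * row from incoming_raw. */
            printf("processing row %ld: %s\n", id, payload);
        }
        sqlite3_finalize(stmt);
    }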

And just like that, the entire server issue was solved, at the cost of a very slight reduction in data reliability (adding a very unlikely but non-zero possibility that corrupted-in-transit data which somehow passes A1 gets acknowledged but then fails at B2/B3, with no possibility of re-transmit).

Since then the site has been chugging along nicely. I have also added some throttling to the devices (via their remote update capability), which should ease the worst-case scenario in the future. Switching to a beefier server is another future-proofing option to keep an eye on in case things start to get iffy again. And of course splitting the database and front end onto separate servers.



Monday, 12 December 2022

Cost of gaming (continued, again)

It was bound to happen some day. 

I have had laptops for some 20 years now (maybe 10 of them in total), and I had managed to avoid seriously damaging any of them. Until now. And it was just about the worst possible time for it.

So, previously I ran some numbers on my laptop and desktop power usage. The laptop was running at some 20w idle and 160w-ish when running a game. The desktop, on the other hand, was using 160w idle and 250w+ when running a game. (New numbers after some tweaking: 20w for the laptop and 90w for the desktop, achieved by, among other things, bringing display brightness down a bit.)

Considering that electricity costs some 30c/kWh right now (tomorrow's high is up to 80c/kWh!), the difference really matters. And of course I managed to drop a damn portable speaker on the laptop, damaging its screen and making a repair necessary at the worst possible time of the year.

It's winter, meaning it's cold (power is needed for heating everywhere), it's dark (no power from solar panels), and it's not windy either (so very little power from wind). The proverbial perfect storm. Tomorrow's lowest daytime figure is 53c/kWh, so I'll be using that here; later at night it drops to 40c or so, but that's too late to make any real difference.

I recently bought Assassin's Creed: Valhalla and got into it. The bad news is that running it on my desktop used a whopping 500 watts. That's about 1€ per two hours of playing at these prices. And this is a long game - some reports say 100 hours or so - so 50€ of electricity on top of the purchase price.
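For reference, the arithmetic behind these figures is simply cost = power × time × price:

\[ 0.5\,\text{kW} \times 100\,\text{h} = 50\,\text{kWh}, \qquad 50\,\text{kWh} \times 0.53\,€/\text{kWh} \approx 26\,€ \]

at tomorrow's lowest daytime price, or about 40€ at the 80c peak - so the 50€ above is a pessimistic round figure, allowing for transfer fees and taxes on top of the spot price.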

I have no idea how much the laptop would use when playing this (I shipped it off for repairs before I got to try), but let's say 200w. That alone would drop the power cost to 20€ or so, everything else being equal.

Fortunately, I managed to bring AC:V's power usage down. I have a 144 Hz display, so the game was running at the full rate, with no direct in-game option for limiting it. After some tweaks, including dropping details to "medium" and manually editing config files to force an FPS limit of 60, it "only" uses 250-ish watts on my desktop. Not great, but tolerable. I'm also considering dropping details further to bring that figure down.

And we're not even affected too badly, since our house uses district heating (still fairly cheap) instead of electricity (or natural gas, which pretty much no one uses here anyway) for heating.



Sunday, 23 October 2022

Mysterious crash

This is kind of a follow-up to earlier posts about the 433 MHz transmitter and receiver.

I've been using this setup to receive temperature data and enable/disable heaters at our cottage over the last winter, and in general it has worked very nicely. DIY home automation, kinda-sorta; previously it was mostly used to enable extra heaters before heading there, so the place would be closer to 20C on arrival instead of 5C or so.

With the energy mess going on in Europe, I was thinking about optimizing the setup to lower energy usage even further by having it automatically disable the heaters when it's warmer outside (above -10 or -5 C or so) and the heat pump alone is sufficient for heating.

I made changes to the software, but during testing I found out that the system occasionally crashes hard, once or twice a week or so. The thing is, I have a watchdog timer running on the MCU, and if the software crashes, the watchdog should reset the MCU automatically. Except here it doesn't.

Obviously, the first thing I checked was that the watchdog actually is running and (by simulating an imagined failure mode in code) that it can actually reset the MCU. And it works. Except in these mysterious crashes.

I've been trying different things for weeks now, so far with no results. At the moment my best guess is that the MCU gets flooded with interrupts (particularly from the RF receiver, as it is noisy) and has no time for anything else.

To counter this, I made a few changes (sketched in code after the list):

- The timer interrupt has a counter that is incremented on every call. If this counter ever exceeds a certain value (3 seconds' worth or so), the interrupt resets the MCU.
- The RF receiver interrupt has a similar counter. I did some measurements, and in normal operation it was generating some 3000 interrupts per second (which is nothing, really). Likewise, if the counter exceeds a certain value, the MCU is reset.
- The main loop, which normally runs at 50 Hz or so, resets these counters to prevent said MCU resets.
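In code the idea looks roughly like this - a host-runnable sketch with made-up thresholds and a placeholder reset routine, not the actual firmware:

    #include <stdio.h>
    #include <stdint.h>
    #include <stdlib.h>

    /* Made-up thresholds: how many interrupts may pass between main-loop
     * check-ins before we assume the loop is starved and force a reset. */
    #define TIMER_IRQ_LIMIT  3000u   /* ~3 s of a 1 kHz timer tick */
    #define RF_IRQ_LIMIT     60000u  /* ~20x the normal ~3000/s rate */

    static volatile uint32_t timer_irq_count;
    static volatile uint32_t rf_irq_count;

    static void force_mcu_reset(void)
    {
        /* On the real MCU this would let the hardware watchdog expire
         * or jump to the reset vector; here we just report it. */
        printf("interrupt flood detected -> MCU reset\n");
        exit(0);
    }

    /* Called from the timer ISR. */
    void on_timer_interrupt(void)
    {
        if (++timer_irq_count > TIMER_IRQ_LIMIT)
            force_mcu_reset();
    }

    /* Called from the RF receiver's pin-change ISR. */
    void on_rf_interrupt(void)
    {
        if (++rf_irq_count > RF_IRQ_LIMIT)
            force_mcu_reset();
    }

    /* Main loop (~50 Hz): reaching this point proves the ISRs
     * aren't starving the rest of the system. */
    void main_loop_iteration(void)
    {
        timer_irq_count = 0;
        rf_irq_count = 0;
    }

    int main(void)
    {
        /* Simulate a noisy RF receiver flooding the MCU while the
         * main loop never gets a chance to run. */
        for (;;)
            on_rf_interrupt();
    }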

Winter (well, autumn) got here, so I had to install the system as it is before I had time to do more testing. So far so good: it's been a few weeks and it's still up.

 

 

Saturday, 27 August 2022

Unbearable cost of electricity (laptop edition)

Continuing again from here (musings) and here (desktop computer usage).

Of course I couldn't not test the laptop too. It's a relatively beefy one, with a Ryzen 5 5600H processor and an RTX 3070 laptop GPU. I used the laptop's own display for the tests, not an external monitor, so using one would add to these figures.

Spoiler: If it weren't a real PITA to connect the laptop to my desktop display/keyboard/mouse, I'd seriously consider using it more often even at home.

Idle power usage varies a bit, from as low as 10 (!!!) to 25 watts, but seems to average around 20 watts (this with the display at full brightness).

Again I tried playing Control (which on the desktop drew 520 watts total) and was pleasantly surprised to see the laptop use about 160 watts while remaining perfectly playable (albeit at somewhat lower settings than the desktop). My desktop computer uses that much power just idling!

Factorio (the same base I used in the desktop test - that was 190-ish watts, IIRC) had the laptop consuming just 40 watts.

Since I had it installed, I also tried Dyson Sphere Program, with a relatively late-game base (a sphere under construction and multiple star systems producing stuff). Like Control, it used 160 watts when running.

So, what's my conclusion? None, really. This was mostly to feed my curiosity a bit. And I hope someone finds these figures useful.


Sunday, 21 August 2022

Unbearable cost of electricity (now with numbers!)

Continuing from the previous post.

I was wondering about the cost of gaming when electricity can be up to 80c/kWh on some days. I didn't have any power meters handy back then, but I've since found one I have had for years. It's a cheap, low-grade consumer unit and absolutely not calibrated, so I take its readings as ballpark figures.

Now, to start with: I have a desktop computer with an AMD Ryzen 7 5800X CPU and an AMD RX 6900 XT GPU, a 27" 75 Hz LCD display, and an old Sony stereo system (with speakers) for sound. All of these are on the same power meter here.

Booting up the desktop (no stereo or display yet) gives a reading of 90 watts.

Add the display and we're up to 150 watts (so the display is 60 watts).

Turn on the stereo set and we're at 165 watts (15 watts for the stereo alone, with no sound; note that the set is around 25 years old, so I'd expect newer units to be more efficient). So at 80c/kWh that's 13 cents per hour for just idling or light work (or, at a more tolerable 30c/kWh, around 5c/hour).
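The per-hour figures here and below are just the measured power in kilowatts times the electricity price:

\[ 0.165\,\text{kW} \times 0.80\,€/\text{kWh} \approx 0.13\,€/\text{h}, \qquad 0.165\,\text{kW} \times 0.30\,€/\text{kWh} \approx 0.05\,€/\text{h} \]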


The game of the day is Assassin's Creed: Odyssey, which I recently started again. At Ultra settings (which my system can run, not fully stable at the full 75 Hz but still quite well), total power consumption is up to 400 watts (32c/h @ 80c/kWh, or 12c/h @ 30c/kWh).

Dropping graphics details a notch, to approximately "high" overall settings, cuts consumption to 300 watts (24c/h / 9c/h). So this reduces power usage by a full quarter, with effectively zero change in visual detail.

I don't have any of the newest AAA titles available, but the nearest equivalent I had installed is Control, which I think I also had on "high" settings (not ultra). It's a bit more demanding game, upping power usage to 530 watts (42c/h / 16c/h).

I also tested Factorio, with an end-game base. Power consumption (of the computer, not the base!) was expectedly lower, just 190 watts, but still higher than idling (15c/h / 6c/h).


So, to recap: as expected, AAA titles cost more to play than "lower end" games, but the difference wasn't as great as I had guessed before doing the measurements.

If I were to finish AC:O (let's say 100 hours - it's a damn long game), its cost would be 12 euros in electricity alone -- and that is at the expected average of 30c/kWh! Even having the computer idling 8 hours a day would be 40 cents. I guess for now it's best to put the computer in standby when I don't need it and avoid gaming during the most expensive hours. While the figures shown here may seem low individually, they still add up over time.