Thursday, December 19, 2024

Decoding and replicating IR signal (part 2)

I seem to have completely forgotten this post; it's been a draft for more than a year now. Sorry about that.

Part 1 here.

First, why didn't I use the dedicated IrDA peripheral on the chip? Because I had absolutely no idea of the parameters involved, had never used it - and most importantly, it would have been really difficult to wire on my board since the existing design didn't expose the correct IO. And I really just needed a few select messages to send, so copying the signal timing was good enough.

Over time I pondered in my spare time how I could make the receiver better, less sensitive to signal timing. The previous attempt used IO interrupts, which caused the system to occasionally miss one due to timing, so that was a no-go. Eventually I got a rough draft of an idea working:

- Set up a timer peripheral to run at twice the clock rate of the IR signal (this I could measure with a scope; 76 kHz in this case, as I wanted both "up" and "down" parts - the actual modulation was running at 38 kHz). The timer does nothing but increase a counter on every tick. This gives a period of about 13us per tick.

- All other interrupts are disabled

- The main loop does nothing but poll the IO input line; on a change, it first reads the counter incremented by the timer above and stores its value (Tcurr).

- Now we take the timer difference from the last change (Td = Tcurr - Tlast). If the value is less than 3 ticks, this is carrier modulation and we can ignore it; the change is stored as the "last activity" count (Tlast = Tcurr). The very first change triggers storing TmodLast.

- If the difference from the previous "last activity" count (Tlast) is larger (more than 200us or so), we know that this was a "modulation off" period, which actually carries the bits; the next modulation period has now started.

- Now compare the difference between the previous Tlast and TmodLast to get the previous "modulation on" length. The first one is the preamble (3600us or so), which resets the receiver. All the shorter (400us) on-periods were just there for timing; a more sophisticated system might use them for data too.

For this sequence, I treated a 400us "modulation off" period as "0" and a 1200us period as "1".

- After a longer off period, the system automatically ends receiving.
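The decoding logic above can be sketched in Python as a function over a recorded list of edge timestamps (the names, thresholds, and the synthetic-signal helper are my assumptions for illustration; the real thing runs as a tight polling loop on the microcontroller, and the preamble/reset handling is left out here):

```python
TICK_US = 13                 # timer tick at 76 kHz, twice the 38 kHz carrier
OFF_MIN = 200 // TICK_US     # gaps longer than ~200us are real "off" periods

def decode(edges):
    """Decode a list of edge timestamps (in timer ticks) into bits.

    Gaps at or below OFF_MIN ticks are carrier modulation and are skipped;
    longer gaps are "modulation off" periods: ~400us -> 0, ~1200us -> 1.
    """
    bits = []
    t_last = edges[0]
    for t_curr in edges[1:]:
        td = t_curr - t_last
        if td > OFF_MIN:                  # "modulation off" carried a bit
            bits.append(0 if td * TICK_US < 800 else 1)
        t_last = t_curr                   # every change updates "last activity"
    return bits

def make_edges(bits, burst_ticks=30):
    """Build a synthetic edge list: a carrier burst, then an off gap per bit."""
    edges, t = [], 0
    for b in bits:
        for _ in range(burst_ticks):      # carrier: an edge on every tick
            edges.append(t)
            t += 1
        t += 30 if b == 0 else 92         # ~400us gap for 0, ~1200us for 1
    edges.append(t)                       # first edge of the next burst
    return edges
```

Feeding a synthetic signal through it, `decode(make_edges([0, 1, 1, 0]))` returns `[0, 1, 1, 0]`.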

The above is the end result; the original idea had some flaws I found out while writing the code. (Debugging it was a bit difficult due to the very strict timing requirements! A lot of the debugging in the end was done by toggling a few IO lines and monitoring the IR data and said lines on an oscilloscope to verify that things happened when expected.) In the end I had the receiver working.


Appendix in 2024: I later had to re-record the signals as the device I was controlling changed, and this time I got lazy. I just demodulate the signal (using a commercial demodulating receiver) and store the on-off periods, without bothering to actually decode them.

Replay is done with a timer interrupt that toggles the IR LED state while the signal is "on"; the main replay function merely keeps track of state and timing, and sets the "IR modulating" flag for the interrupt. Much simpler, but at the same time far less satisfying...
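A rough sketch of that replay scheme, in Python for illustration (the real thing is a timer ISR on a microcontroller; all names and the tick period are my assumptions). The "interrupt" here is just one loop iteration per timer tick:

```python
def replay_states(periods_us, tick_us=13):
    """Simulate replaying recorded on/off periods.

    periods_us alternates on, off, on, ... The "timer interrupt" runs once
    per tick: while the "IR modulating" flag is set it toggles the LED,
    otherwise it holds the LED off.
    """
    led = 0
    states = []                       # LED level sampled at every tick
    for i, duration in enumerate(periods_us):
        modulating = (i % 2 == 0)     # even entries are "modulation on"
        for _ in range(duration // tick_us):
            led = 1 - led if modulating else 0
            states.append(led)
    return states
```

For example, `replay_states([26, 26])` toggles the LED for two ticks and then holds it low for two, giving `[1, 0, 0, 0]`.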


Cheapest IR controller ever

As usual, I haven't really had anything interesting to write about, so once again, apologies for my extended breaks from posting. Unfortunately, I don't see this changing anytime soon.

That being said, today I found an IR remote controller that redefines what "cheap IR controller" actually means.

It came with an (also extremely cheap) LED toy and is used to toggle it on or off. And of course it broke immediately after being thrown. There was no real damage - the battery had just fallen out - but nevertheless.

There is nothing on the other side of the board. The IR LED pins are under the board; those are the pins at the bottom middle. Underneath the "cross" (the battery holder; the battery is not there currently) there is a simple press-button dome, giving this controller a surprisingly good tactile feel (compared to the usual rubber membranes); the user presses down on the cross part, pressing the battery against the holder and the button dome. No modulation, no signalling, just LED on or off.

The only component on the board is the single diode seen on the bottom left. There isn't even a current-limiting resistor anywhere! I guess they rely on the internal resistance of the battery.

And yes, the toy can be turned on and off with just about any remote controller one might have lying around. I tried a few immediately when I saw what's inside this one.




Thursday, August 29, 2024

TIL: SQL LIMIT is slow

It's been a while since I wrote anything. The main reason is that I have been very busy with work and other things, and consequently have had very little time to do any fun experimentation that would (or could) be noteworthy here.

Until now, when I ran into something. I have a PostgreSQL database with several tens of millions of rows. At one point performance got very bad, so I updated the indexes; problem solved.

Now I was testing a new feature that does data analysis, and made a simple query for testing (x.col being an indexed column):

select a,b,c from table x where x.col=something and x.id=12345;

A trivial query, and it completed in milliseconds. In my initial testing I wanted to use just one specific row to get started.

When things were working okay for that row, I wanted to update this so I could process rows one at a time - again, to verify things worked as expected:

select a,b,c from table x where x.col=something limit 1;

This query took a ridiculously long time to complete, close to a minute or so. And returned zero rows.

So I did what anyone would do and asked Postgres to EXPLAIN what it did. And it showed a sequential scan. What?

Okay, when things fail, fall back to something trivial and work back up from there, so let's try something simpler...

select a,b,c from table x where x.col=something;
(0 rows returned)

This again ran in milliseconds. So LIMIT did something not fun.

Time for googling, and yes, LIMIT can apparently make Postgres fall back to a sequential scan in some cases: the planner assumes matching rows are spread evenly through the table and that it will find the first one quickly, so with LIMIT 1 a sequential scan can look cheaper than an index scan. When no rows actually match, it ends up scanning the whole table.

So there. Drop the LIMIT from the query and process just one row (for now). And then check in how many places exactly I am using LIMIT needlessly...

Maybe this is something many already knew, but I didn't. Ah well.


Saturday, March 25, 2023

Decoding and replicating IR signal (part 1)

Some home automation, although it's my home away from home we're talking about here. At our cottage we have a Panasonic heat pump, now about 10 years old, so at some point it is up for replacement (I asked about servicing it and the technician pretty much told us that there's no point; when it stops working, you just replace it).

For now I have used some remote-controlled sockets (with good results) to control the heating there. Direct electric heat is somewhat expensive though, and using said heat pump for it would be more cost-effective. The problem is that it is controlled by a proprietary IR remote, so I can't just tell it to switch to "+20C heat"...

Or can I?

I wanted to replicate the IR signal with hardware I already have - effectively the same I used for the RF stuff. My idea had basically two steps:

1) Receive and decode the code sent by the IR remote
2) Replicate said code

Seems so simple. It turned out to be much less so.

My original idea was to make a simple IR receiver module (easy, there are plenty of schematics on the net) and record the sequence. An IR signal usually has a carrier (IR being turned on and off, generally in the 40-80 kHz range) with the data modulated on top. In this case (I found out after building the signal receiver) it was a 75 kHz-ish modulation, with PWM on top of it, starting with a longer preamble.

My plan was to modify the same code I had used as an RF receiver to serve as an IR receiver. Well, long story short, that did not work. The RF receiver module did AM decoding, so I was receiving (relatively) clean 1-or-0 data and only had to decode from there. The IR one didn't; I was receiving the carrier signal directly. And as it turns out, the software was not fast enough for clean reception. It lost pulses too often to be reliable, so that idea was out.

That was last autumn. I ran out of time - winter was coming - so I had to install the system as-is to have at least the current level of control over it (and oh boy, has it been great even in its current state; it dropped my power usage by a full two thirds, since the system can keep the direct electric heating off when it's warm enough outside for the heat pump to manage alone).

I couldn't drop the idea of heat pump control, however... (which is to say: to be continued)

 

Friday, February 3, 2023

Boredom is essential

Prove me wrong: boredom, or idleness, is essential for creativity.

Over the years I've noticed that it is the periods of boredom - not having things to do, in other words, being idle - that bring out the creativity in me.

Lately I've been extremely busy just keeping up with work (quite successfully). Several independent things happened at the same time (the virus that messed with the world being just one of them) and caused a massive workload for us. During this time I've been able to do just the essentials. Yes, I kept things rolling and the lights on, but it's just doing what is required. There is no creativity, no new things involved.

On the other hand, when this virus first hit, our business was second in line to get hit (first being our customers). While this was not a financial problem for us, it meant that we got almost no phone calls, no messages, nothing. Suddenly lots of time previously taken by other things was free. I suddenly had (almost) nothing urgent to do!

It took me a week or so before I installed a full development environment on my home computer, and in a few more weeks I had things rolling on things I almost never do. I had fun literally playing around with game ideas, and while nothing came out of them, they still were a relaxing distraction.

Whenever I have had extended downtime, something similar has happened every time. Sometimes it's work-related (some low-priority thing I might not normally touch), but I also get started on projects I haven't had the energy to start before, like game development. I never get very far there (downtime is almost always limited) but at least I get something done.

Years ago I read a blog post from someone - I don't remember the details, but the gist was that as a business owner your goal is to make yourself unnecessary. Make your employees do the work: you teach them how, they keep the business rolling, and you make sure they have the tools to do that. Maybe then I could start thinking about other things to get done.

At this moment, I feel that I am en route to that goal. There's just this small issue of needing good software people. These days they're not exactly easy to find...


 

Friday, January 6, 2023

Server overload

I run a server for IoT-like devices. They send data to the server, which verifies it (catching transmission errors with checksums and so on), checks that it isn't a duplicate entry, and then stores it and acknowledges it to the device, after which the device marks it handled on its end.

This system has been going for almost ten years now, gradually growing in number of users and, by extension, data volume.

Some time ago this failed hard. The server overloaded and requests started timing out. Restarts helped for a while, but since devices were still trying to send data, it very quickly fell over again. After a while, even more devices had data to send but couldn't, and they really didn't have any throttling implemented that the server could have used to tell them to cool down.

Things were looking quite bad.

I had no really great tools to profile the system, so I effectively had to resort to using top to see what was going on. And what was going on was that the SQL server (not the MS product with a similar name) was using the vast majority of the CPU power.

So, I started to think back to what the code does:

1) Parse the incoming message, doing validations and checksums
2) Find the client device from the database; discard the data if it doesn't exist
3) Look through the device's data for a duplicate of this data; ignore the data if a duplicate is found
4) Insert the new data into the database
5) Send the device an acknowledgement of the data (in case of a duplicate, the data is acknowledged without action)

The process is like this for a reason. Basically, I want all data produced by the devices to be stored exactly once, and the process assumes that there will be hiccups: data getting corrupted during transmission, connections dropping for some reason mid-transmit, acknowledgements not being received and so on.

At this point I had strong suspicions about step 3. After a few experiments (trying and timing those duplicate-search queries) I confirmed that this step was indeed taking way too long, even with fully indexed device measurement data tables and query optimizations. This was Not Good.

I already knew what I would have to do. I just didn't want to do it, since it would make a small but not insignificant dent in reliability. I had to separate data reception from data storage. But there was no other option.

So I created a new table that contains just the raw measurement data. The process above was split into two separate parts, running independently.

A1) Parse the incoming data, validating etc.
A2) Store the data in the intermediate table
A3) Send the device an acknowledgement of the data

Then part B, which is another script on a cronjob. Architecturally this could be a continuously running service that is notified by (A) via a signal or such when there is data available. A cronjob was, however, a quick and good-enough solution here, causing a latency of a few minutes before received data appears in the main database. Basically a non-issue here.

B1) Load a bunch of data from the intermediate table
B2) Find the client device from the database; discard the data if not found
B3) Look for duplicates; discard if found
B4) Insert the data
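The split can be sketched with SQLite standing in for the real database (the schema, the device-id-in-payload framing, and the use of a uniqueness constraint as a stand-in for the duplicate-search query are all my inventions for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE raw (payload TEXT);                      -- intermediate table
    CREATE TABLE device (id INTEGER PRIMARY KEY);
    CREATE TABLE measurement (device_id INTEGER, data TEXT,
                              UNIQUE (device_id, data));  -- duplicate guard
""")
con.execute("INSERT INTO device VALUES (1)")

def stage_a(payload):
    """A1-A3: validate, dump into the intermediate table, acknowledge.

    No device lookup and no duplicate search happen here, so the device
    gets its acknowledgement quickly even when the main tables are busy.
    """
    if not payload:                   # stand-in for checksum validation
        return False
    con.execute("INSERT INTO raw (payload) VALUES (?)", (payload,))
    return True                       # acknowledge to the device

def stage_b():
    """B1-B4: drain the intermediate table into the main one (cronjob)."""
    for (payload,) in con.execute("SELECT payload FROM raw").fetchall():
        dev, _, data = payload.partition(":")
        found = con.execute("SELECT 1 FROM device WHERE id = ?",
                            (int(dev),)).fetchone()
        if found:                     # B2: unknown devices are discarded
            con.execute("INSERT OR IGNORE INTO measurement VALUES (?, ?)",
                        (int(dev), data))   # B3+B4: duplicates are ignored
    con.execute("DELETE FROM raw")
```

Sending the same payload twice plus one from an unknown device, then running `stage_b()`, leaves exactly one row in `measurement`.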

And just like that, the entire server issue was solved, at the cost of a very slight reduction in data reliability (adding a very unlikely but non-zero possibility that corrupt-in-transport data that somehow passes A1 gets through but then fails at B2/B3, without the possibility of a re-transmit).

Since then the site has been chugging along nicely. I have also added some throttling to the devices (using the remote update capability), which should ease the worst-case scenario in the future. Switching to a beefier server is another future-proofing option, which I need to keep an eye on in case things start to get iffy again. And of course there's the option of switching to separate database and front-end servers.



Monday, December 12, 2022

Cost of gaming (continued, again)

It was bound to happen some day. 

I have had laptops for some 20 years now (maybe 10 of them in total), and I have managed to avoid damaging any of them seriously. Until now. And it was just about the worst possible time for it.

So, previously I ran some numbers on my laptop and desktop power usage. The laptop was running at some 20 W idle, and 160 W-ish when running a game. The desktop, on the other hand, was using 160 W idle and 250 W+ when running a game. (New numbers after some tweaking: 20 W for the laptop and 90 W for the desktop, including but not limited to bringing display brightness down a lot.)

Considering that electricity costs some 30c/kWh right now (tomorrow's high is up to 80c/kWh!), the difference really matters. And of course I managed to drop a damn portable speaker on the laptop, damaging its screen and making a repair necessary at the worst possible time of the year.

It's winter, meaning it's cold (power is needed for heating everywhere), it's dark (no power from solar panels) and it's also not windy (so very little power from wind). The proverbial perfect storm. Tomorrow's lowest figure is 53c/kWh, so I'll be using that here; later at night it drops to 40c or so, but that's too late to make any real difference.

I recently bought Assassin's Creed: Valhalla and got into it. The bad news is that running it on my desktop computer used a whopping 500 watts. That's 1€ per 2 hours of playing at that rate. And this is a long game - some reports say 100 hours or so - so 50€ of electricity on top of the purchase price.

I have no idea how much the laptop would use when playing this (I shipped it off for repairs before I got to try), but let's say 200 W. That alone would drop the power cost to 20€ or so, assuming everything else stays the same.

Fortunately, I managed to bring AC:V's power usage down. I have a 144 Hz display, so the game was running at full rate and had no direct option for limiting it. After some tweaks, including dropping details to "medium" and manually editing config files to force an FPS limit of 60, it "only" uses 250-ish watts on my desktop. Not great, but tolerable. I'm also considering dropping details further to bring that figure down.
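The cost arithmetic here is just watts times hours times price; note that the post's 50€ for 100 hours at 500 W works out to an effective price of about 1€/kWh, roughly the spot price with transfer fees and taxes on top (a quick sketch, with the function name made up):

```python
def play_cost_eur(watts, hours, eur_per_kwh):
    """Electricity cost of running a load for a given time."""
    return watts / 1000 * hours * eur_per_kwh

# the post's figure: 500 W for 100 hours at an effective 1 EUR/kWh
full = play_cost_eur(500, 100, 1.0)       # 50.0 EUR
# after capping to 60 FPS, ~250 W at the same effective price
capped = play_cost_eur(250, 100, 1.0)     # 25.0 EUR
```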

And we're not even really affected by this too badly, since our house uses district heating (still fairly cheap) instead of electricity (or natural gas, which pretty much no one uses here anyway) for heating.