Monday, December 12, 2022

Cost of gaming (continued, again)

It was bound to happen someday.

I have used laptops for some 20 years now (maybe 10 machines or so in total), and I had managed to avoid seriously damaging any of them. Until now. And it happened at just about the worst possible time.

So, previously I ran some numbers on my laptop and desktop power usage. The laptop idled at some 20 W and drew around 160 W when running a game. The desktop, on the other hand, used 160 W at idle and 250 W+ when running a game. (New numbers after some tweaking: 20 W for the laptop and 90 W for the desktop, achieved by, among other things, turning display brightness down.)

Considering that electricity costs some 30c/kWh right now (tomorrow's high is up to 80c/kWh!), the difference really matters. And of course I managed to drop a damn portable speaker on the laptop, damaging its screen and making a repair necessary at the worst possible time of the year.
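To put that idle difference in numbers at the 30c price:

    160 W - 20 W = 140 W = 0.14 kW
    0.14 kW × 30 c/kWh ≈ 4 c per idle hour, or about 1 € per day left running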

It's winter, meaning it's cold (power is needed for heating everywhere), it's dark (no power from solar panels), and it's not windy either (so very little power from wind). The proverbial perfect storm. Tomorrow's lowest figure is 53c/kWh, so I'll be using that here; later at night it drops to 40c or so, but that's too late to make any real difference.

I recently bought Assassin's Creed: Valhalla and got into it. The bad news is that running it on my desktop computer used a whopping 500 watts. That's about 27 cents per hour of playing at that rate. And this is a long game, some reports say 100 hours or so -- so over 25€ of electricity on top of the purchase price.
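For the record, the math at that 53c/kWh price:

    0.5 kW × 53 c/kWh ≈ 27 c/h
    0.5 kW × 100 h = 50 kWh, and 50 kWh × 0.53 €/kWh ≈ 26.5 €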

I have no idea how much the laptop would use when playing this (I shipped it off for repairs before I got to try), but let's say 200 W. That alone would drop the power cost to around 11€, assuming everything else stays the same.

Fortunately, however, I managed to bring AC:V's power usage down. I have a 144 Hz display, so the game was running at the full rate and had no direct option for limiting it. After some tweaks, including dropping details to "medium" and manually editing the config files to force an FPS limit of 60, it "only" uses 250-ish watts on my desktop. Not great, but tolerable. I'm also considering dropping details further to bring that figure down.
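Halving the draw also roughly halves the running cost:

    0.25 kW × 53 c/kWh ≈ 13 c/h, versus the 27 c/h uncapped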

And we're not even affected by all this too badly, since our house uses district heating (still fairly cheap) instead of electricity (or natural gas, which pretty much no one uses here anyway) for heating.



Sunday, October 23, 2022

Mysterious crash

This is kind of a follow-up to earlier posts about the 433 MHz transmitter and receiver.

I've been using this setup to receive temperature data and enable/disable heaters at our cottage over the last winter, and in general it has worked very nicely. DIY home automation, kinda-sorta; previously it was used mostly to enable extra heaters before heading there, so the cottage would be closer to 20°C on arrival instead of 5°C or so.

With the energy mess going on in Europe, I was thinking about optimizing the setup to lower energy usage even further by having it automatically disable the heaters when it's warmer outside (above -10 or -5°C or so), since the heat pump alone is sufficient for heating then.
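Something like this is what I have in mind -- a minimal sketch, with the two thresholds acting as hysteresis so the heaters don't toggle rapidly when the temperature hovers near a single limit (thresholds per the guess above; the function itself is hypothetical):

    #include <stdbool.h>

    #define HEATERS_OFF_ABOVE_C  (-5)   /* heat pump copes alone above this */
    #define HEATERS_ON_BELOW_C   (-10)  /* re-enable extra heaters below this */

    /* Called from the main loop with the latest outdoor reading. */
    static bool heaters_wanted(int outdoor_c, bool currently_on)
    {
        if (outdoor_c > HEATERS_OFF_ABOVE_C)
            return false;
        if (outdoor_c < HEATERS_ON_BELOW_C)
            return true;
        return currently_on;  /* between thresholds: keep previous state */
    }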

I made the changes to the software, but during testing I found out that the system occasionally crashes hard, once or twice a week or so. The thing is, I have a watchdog timer running on the MCU, and if the software crashes, the watchdog should reset the MCU automatically. Except here it doesn't.

Obviously, the first thing I checked was that the watchdog actually is running and (by simulating an imagined failure mode in code) that it can actually reset the MCU. And it works. Except in these mysterious crashes.

I've been trying different things for weeks now, so far with no results. At the moment my best guess is that the MCU gets flooded with interrupts (particularly from the RF receiver, as it is noisy) and has no time for anything else.

To counter this, I made some changes (a sketch of the idea follows the list):

- The timer interrupt has a counter that is incremented on every call. If this counter ever exceeds a certain value (corresponding to 3 seconds or so), the interrupt handler resets the MCU.
- The RF receiver interrupt has a similar counter. I did some measurements, and in normal operation it was generating some 3000 interrupts per second (which is nothing, really). Likewise, if the counter exceeds a certain value, the MCU is reset.
- The main loop, which normally runs at 50 Hz or so, clears these counters. As long as the main loop keeps running, no reset happens; if it ever stalls, one of the counters will exceed its limit and force one.
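A minimal sketch of this guard, assuming an AVR-class MCU with avr-libc; the tick rate, interrupt vectors, and limits here are illustrative, not the actual project code:

    #include <avr/interrupt.h>
    #include <avr/io.h>
    #include <avr/wdt.h>
    #include <stdint.h>

    /* Incremented from interrupt context, cleared by the main loop. */
    static volatile uint16_t timer_ticks = 0;
    static volatile uint16_t rf_edges = 0;

    #define TIMER_TICK_LIMIT 3000u   /* ~3 s at an assumed 1 kHz tick */
    #define RF_EDGE_LIMIT 20000u     /* well above the normal ~3000/s */

    /* Stop kicking the watchdog and let its shortest timeout expire. */
    static void force_reset(void)
    {
        wdt_enable(WDTO_15MS);
        for (;;) { }
    }

    ISR(TIMER0_COMPA_vect)           /* periodic timer tick */
    {
        if (++timer_ticks > TIMER_TICK_LIMIT)
            force_reset();
    }

    ISR(INT0_vect)                   /* edge from the 433 MHz receiver */
    {
        if (++rf_edges > RF_EDGE_LIMIT)
            force_reset();
        /* ... pulse timing / decoding goes here ... */
    }

    int main(void)
    {
        wdt_enable(WDTO_2S);         /* normal watchdog for plain crashes */
        /* ... timer and external interrupt setup elided ... */
        sei();
        for (;;) {                   /* main loop, ~50 Hz */
            wdt_reset();
            cli();                   /* clear the guards atomically */
            timer_ticks = 0;
            rf_edges = 0;
            sei();
            /* ... temperature handling, heater control ... */
        }
    }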

Winter (well, autumn) arrived, so I had to install the system as-is before I had time to do more testing. So far so good: it's been a few weeks and it's still up.


Saturday, August 27, 2022

Unbearable cost of electricity (laptop edition)

Continuing again from here (musings) and here (desktop computer usage).

Of course I couldn't not test the laptop too. It's a relatively beefy one, with a Ryzen 5 5600H processor and an RTX 3070 laptop GPU. I used the laptop's own display for the tests, not an external monitor, so using one would add to these figures.

Spoiler: if it weren't a real pain to connect the laptop to my desktop display/keyboard/mouse at home, I'd seriously consider using it more often even at home.

Idle power usage seems to vary a bit, from a low of 10 (!!!) to 25 watts, but appears to average around 20 watts (this with the display at full brightness).

Again I tried playing Control (which on the desktop had a total power usage of 530 watts), and was pleasantly surprised to see the laptop use about 160 watts while remaining perfectly playable (albeit with somewhat lower settings than the desktop). My desktop computer uses that much power just idling!

Factorio (same base I used in the desktop test -- that was drawing 190-ish watts, IIRC) had the laptop consuming just 40 watts.

Since I had it installed, I also tried Dyson Sphere Program, with a relatively late-game base (a sphere being constructed and multiple star systems producing stuff). Like Control, it used 160 watts when running.

So, what's my conclusion? None, really. This was mostly to feed my curiosity a bit. And I hope someone finds these figures useful.


Sunday, August 21, 2022

Unbearable cost of electricity (now with numbers!)

Continuing from the previous post.

I was wondering about the cost of gaming when electricity can be up to 80c/kWh on some days. I didn't have any power meters handy back then, but now I found one I have had for years. It's a cheap, low-grade consumer unit, and absolutely not calibrated, so I take its readings as ballpark figures.

Now, to start with, I have a desktop computer with an AMD Ryzen 7 5800X CPU, an AMD RX 6900 XT GPU, and a 27" 75 Hz LCD display, and I use an old Sony stereo system (with speakers) for sound. All of these are on the same power meter here.

Booting up the desktop (no stereo or display yet) gives a reading of 90 watts.

Add the display and we're up to 150 watts (so the display is 60 watts).

Turn on the stereo set and we're at 165 watts (15 watts for the stereo set alone, with no sound; note that this set is around 25 years old, so I'd expect newer units to be more efficient). So, at 80c/kWh, that's 13 cents per hour for just idling or light work (or, at a more tolerable 30c/kWh, around 5c/hour).
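The conversion used throughout this post is simply watts divided by a thousand, times the price:

    165 W / 1000 × 80 c/kWh ≈ 13 c/h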


The game of the day is Assassin's Creed: Odyssey, which I recently started again. At Ultra settings (which my system can run, not fully stable at the full 75 Hz but still quite well), total power consumption goes up to 400 watts (so 32c/h @ 80c/kWh, or 12c/h @ 30c/kWh).

Dropping graphics detail by a notch, to approximately "high" overall settings, cuts consumption to 300 watts (24c/h / 9c/h). So this drops power usage by a full quarter, with effectively zero change in visual detail.

I don't have any of the newest AAA titles available, but the nearest equivalent I had installed is Control, which I think was also on "high" settings (not ultra). It's a bit more demanding game, upping power usage to 530 watts (42c/h / 16c/h).

I also tested Factorio, with an end-game base. Power consumption (of the computer, not the base!) was expectedly lower, just 190 watts, but still higher than plain idling (15c/h / 6c/h).
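For completeness, a small self-contained helper that reproduces the per-hour figures above from this post's measurements (the code is just illustration, not anything the meter produces):

    #include <stdio.h>

    /* Cost in cents per hour for a given load and electricity price. */
    static double cents_per_hour(double watts, double cents_per_kwh)
    {
        return watts / 1000.0 * cents_per_kwh;
    }

    int main(void)
    {
        const struct { const char *name; double watts; } loads[] = {
            { "Idle (PC + display + stereo)", 165 },
            { "AC: Odyssey, ultra",           400 },
            { "AC: Odyssey, high",            300 },
            { "Control, high",                530 },
            { "Factorio, end-game base",      190 },
        };
        for (int i = 0; i < (int)(sizeof loads / sizeof loads[0]); i++)
            printf("%-30s %5.1f c/h @ 80c, %4.1f c/h @ 30c\n",
                   loads[i].name,
                   cents_per_hour(loads[i].watts, 80),
                   cents_per_hour(loads[i].watts, 30));
        return 0;
    }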


So, to recap. As expected, AAA titles cost more to play than "lower-end" games, but the difference wasn't as great as I had guessed before doing the measurements.

If I were to finish AC:O (let's say 100 hours -- it's a damn long game), it would cost 12 euros in electricity alone -- and that's at an expected average price of 30c/kWh! And even having the computer idle for 8 hours a day would cost 40 cents. I guess for now it's best to put the computer in standby when I'm not using it, and to avoid gaming during the most expensive hours. While the figures shown here may seem low individually, they add up over time.



Sunday, August 14, 2022

Unbearable cost of electricity

European energy prices are absolutely insane right now, not least because far too many people have been far too willing to rely on Russian energy, ignoring the non-monetary costs involved.

A month ago I had a nice fixed-price deal where I paid something like 4c/kWh for energy, plus around 5c/kWh for grid/energy transfer costs. That deal expired, and now I'm on spot (market) pricing, which today varies from 40 to 60 c/kWh (plus the same 5c/kWh for grid).

Not the greatest time to have an EV. Funnily enough, most fast chargers still charge 20 or 25 cents/kWh, so it is often actually cheaper for me to use a fast charger than my home charger. Insane. Yet home charging would still be cheaper than driving on gasoline, just by a much smaller margin now.

I do have solar panels on the roof, but they give their best output in the afternoon, between 13 and 17. So I try to schedule the dishwasher and washing machine for that window, and always hang-dry clothes instead of using the dryer, as it is a massive energy hog.

Some other things are harder to avoid. Computers, for example. Many games, especially AAA titles, push both CPU and GPU to the maximum, which of course uses more energy. It's kind of weird to turn down graphics settings to save electricity, yet here we are.

It doesn't seem like things will get easier any time soon, either. And really, I shouldn't be complaining about the price of damn electricity when others are literally fighting for their homes at the same time, so I'll just stop here and salute those who are sacrificing the most.

Slava Ukraini!



Wednesday, July 13, 2022

Disastrously failed amplifier repair

We've had a Denon AVR-1911 receiver/amplifier for a fairly long time now; I can't remember exactly, but I'd guess some 10-15 years. Lately its use has been occasional at best, as the sound from the TV has been "good enough" almost all of the time.

During the pandemic I bought a "ticket" to a Nightwish internet performance, and obviously would have liked to listen to it on the stereo set. Alas, no sound. The system starts up but is completely mute. All audio sources seem to be there (HDMI is detected, for example), but no matter what, there is no sound coming through the device.

At the time I had no time to troubleshoot it, so I just shoved it in a cabinet at home, and it sat there for a year or so. Now I finally had some free time to work on it, so I took it to work, where I have much better tools for this kind of job. (I had found a service manual for it, so I could try to figure out what's wrong -- these damn things are seriously complicated to work on, at least without any help.)

Before I managed to get properly started, I 

dropped it

while trying to get all the dust out. At first it seemed that it had fallen on a rear corner, bending the metal chassis and nicking a small piece out of one PCB, but now it goes into error mode almost immediately when started up. Crap.

On closer inspection I noticed that a PCB at the very opposite corner from where it fell was also cracked (in the picture I'm applying pressure to it to make the crack more visible). No idea how, but that certainly qualifies as a huge issue. Huge enough that I immediately gave up. I might be able to bodge the traces back together, but that goes way past the amount of work I was willing to put in -- and the drop might have caused other damage I've yet to notice. The PCBs in it are soldered together, so it's a major effort to even get the boards out!

Too bad, this might've been interesting to diagnose otherwise. And if I were using it, say, even weekly, I might be willing to put in more effort. But like I mentioned, when it's been a year and I've barely missed it, it obviously isn't that necessary a gadget in our home right now.



Sunday, May 8, 2022

LED lamp

I've been upgrading my home lighting to LEDs over the years; at this point I don't think I have a single incandescent light in use anywhere. Earlier I replaced quite a few incandescents with CFLs, but now it's exclusively LEDs.

Some people claim that both CFLs and LEDs have very short lives, but my anecdotal experience says otherwise. I don't remember how many CFLs have broken, but I think the number is countable on the fingers of one hand (that is, without using binary, for you nitpickers out there), and the LEDs have been equally long-lived.

So, when one dimmable bulb broke after nearly 10 years, I wanted to take a closer look.

The LED in question was an Osram (a reputable brand) bulb with a 4-year warranty. I have made a habit of writing the purchase date on lamps, so it's easy to check whether they are still under warranty when they break (spoiler alert: exactly one of all these CFLs and LEDs has broken during its warranty period, and it was refunded without questions).

It was mostly the broken white base that made me curious, but unfortunately this turned out to be a false alarm. After years of use the plastic had become brittle and broke very easily. Underneath the "glass" (actually plastic too) is a small PCB with the LEDs on it, glued with some kind of thermal compound to a metal base, with press-fit connectors coming out from inside to act as the LED supply prongs. That is quite a nice setup.

I managed to pry the top off, but at that point I cut my examination short. The inside is completely potted. I really didn't feel like digging into that mess, so I just gave up. I expect the board to be a fairly straightforward LED driver design anyway.

Someone elsewhere mentioned that these bulbs are pretty much designed to run as hot as they can. This one wasn't even a high-wattage model, but that claim makes a kind of twisted sense, considering what happened to this one and how hot these are when handled right after being turned off. I think that if this had been designed to run even a few tens of degrees cooler, it would have lasted much, much longer. Too bad; planned obsolescence at work.


Monday, May 2, 2022

How does rainfall sensor work?

Years ago I got a weather station from somewhere; I don't even remember where exactly anymore. It had all kinds of goodies like the usual temperature and humidity displays with some history (a few days), but also wind and rainfall sensors. I never installed the wind meter (due to lack of a suitable location) or the rain meter (due to snow), so they just sat on my shelf.

While cleaning up I found the rainfall sensor again. I was just about to throw it away, but then reconsidered and took another look at it. How does a rainfall sensor work, after all?

The sensor itself is a cylinder with a conical opening on top and some small holes at the bottom. Hmm.



Opening the device is enlightening. Inside there is a small seesaw-like device -- the classic tipping-bucket design. When enough water drips onto the upper half of the seesaw, the weight eventually causes it to tip over to the other side, spilling the accumulated water out through the openings in the bottom. Then the other half starts collecting water. The small metal-colored cylinder is a magnet; inside the box is a small board with a reed (magnetic) switch that is used to count the tips.

So, let's go back a bit. The opening on top is approximately 11.2 cm in diameter (a quick'n'dirty tape measurement, so accuracy is so-so), making its area approximately 100 square centimetres. I don't feel like experimenting to find the water volume needed to trip the seesaw, but I'd guess it is also some nice round number -- call it X; eyeballing the size of things suggests a few millilitres.

I am not exactly current on the details of rainfall math, but all this means that every time the magnetic switch triggers, X millilitres of water have rained onto an area of 100 square centimetres. Since 1 mm of rain over 100 cm² amounts to 10 ml of water, each trigger corresponds to X/10 mm of rainfall, and from there it's fairly trivial to work out more detailed figures.
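As a minimal sketch of that conversion in code (the ~100 cm² opening is the measurement above; the 2 ml tip volume is purely a guess and would need calibration):

    #include <stdio.h>

    #define OPENING_CM2   100.0   /* measured: ~11.2 cm diameter opening */
    #define TIP_ML        2.0     /* guessed tip volume, needs calibration */

    /* 1 mm of rain over 1 cm² is 0.1 ml, so over 100 cm² it is 10 ml.
       Each tip of TIP_ML millilitres is therefore TIP_ML/10 mm of rain. */
    static double tips_to_mm(unsigned tips)
    {
        return tips * (TIP_ML / (OPENING_CM2 * 0.1));
    }

    int main(void)
    {
        printf("25 tips in an hour = %.1f mm/h of rain\n", tips_to_mm(25));
        return 0;
    }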

Also in the box is some kind of RF transmitter. I don't feel like experimenting with it too much either to figure out exactly how it transmits rain amounts to the base station; it very likely works one-way (i.e. no confirmation from the base that data was received), and unfortunately I don't have time for that. My guess is that it transmits either the raw number of trigger pulses, leaving the detailed math to the main unit, or an accumulated rainfall figure over the last hour or so. For a cheap unit I'd pretty much expect the former.

And now I can happily toss this thing instead of having it take up space on my shelf.



Wednesday, April 20, 2022

Inside SSD


I've been ridiculously busy lately, so there hasn't been any time for, say, extracurricular activities like writing. Hopefully things get a bit easier soon; being too busy wears me out too quickly.

I've been preparing for an office move (slowly), and in the process got rid of a few older laptops. I did take out the SSDs first, though. This one was a 240 GB (or GiB?) drive. There was a time when I would have put such a drive to use immediately, but now... no. Even if we ignore the "it's an old SSD and thus prone to failure" factor, it's tiny. Relatively speaking, of course. My current laptop has a 500 GB SSD, and my desktop has a 1 TB SSD + 4 TB HDD, so this would be a serious downgrade any way I can imagine.


So what then? Well, I haven't taken one apart yet ...

This was easy. Just a few screws and I had the PCB in my hands. And it's pretty much what I expected: loads of flash chips (marked OCZ M2502128T048AX22, very likely a custom marking; I could not find anything about it), a SandForce SF-2281VB1 controller (an application-specific chip; again, not much more to find about it), and some other chips.

Looks like there might be an RS-232 port at the top left (curious, but not enough to check whether anything comes out of it -- for now at least) and JTAG at the top (the chip next to it is a Lattice POWR607 power-management chip, so the JTAG is presumably for programming it, and very likely the SandForce controller too). The chips at the bottom are most likely power regulators (buck converters?) as well.

The other side has some more flash, nothing much else.


All in all, pretty much what I expected.

Now, doing some basic math: 240 GB divided by 16 chips makes 15 GB per chip. A bit of a weird number, so I assume the chips are really 16 GB each, with 1 GB per chip (16 GB total, or around 6.25% of the raw capacity) reserved for dead blocks and such.
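Assuming 16 GB chips, the arithmetic works out neatly:

    16 chips × 16 GB = 256 GB raw
    256 GB - 240 GB = 16 GB reserved, i.e. 16/256 = 6.25%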

So, back to wondering what I could do with this now, aside from trashing it...