AI is ushering in a new golden age of The Text File
Reminds me of statmodeling.stat.columbia.edu/2025/11/23/w...
such is life
"we know the risk of the outcome is very low" sounds like prior information to me - are Bayesian approaches on the table?
If not, I imagine some penalty could give similar results? But in either case, if getting more data is a reasonable ask, that'd likely be better regardless?
bleak house does, in fact, have the juice, i fear
if there are no chameleon plant haters in the world then I am dead
The TLDR is that I set up a process on a server I have (that has a Bluetooth adapter) to listen for my controller scanning for a connection and waking the gaming PC via Wake-on-LAN. Pretty happy with it.
As a long-time Linux gamer, I'm super jazzed that I finally have a working solution for turning on my "SteamOS" couch gaming PC simply by turning on my controller!
Details in this blog post: www.amas.sh/wake-steamos...
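For reference, the Wake-on-LAN half of a setup like this is just a 102-byte UDP broadcast packet. A minimal sketch (the MAC address below is a placeholder, and the Bluetooth-listening side is omitted; see the post for the full setup):

```python
import socket

def build_magic_packet(mac: str) -> bytes:
    """A WoL magic packet: six 0xFF bytes followed by the target MAC repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    return b"\xff" * 6 + mac_bytes * 16

def send_wol(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Broadcast the magic packet on the local network (port 9 is conventional)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(build_magic_packet(mac), (broadcast, port))
```

The target machine's firmware and NIC must have Wake-on-LAN enabled for the packet to do anything.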
affiliate marketing is the will of god
I'm a couple years late to the party, but if you like statistics (and could use a distraction from the dumpster fire that is the US right now) this is a cool paper from @paulbuerkner.com, @avehtari.bsky.social and others. #statsky
The results for my ThinkPad are that each percent increase in screen brightness increases power by about 22 mW. Some ballpark calculations show a decrease in total battery life by about 10 minutes for every 4% increase in brightness.
Fun little experiment to quantify something I've wondered about!
Graph of power readings versus brightness showing a strong linear trend.
I fit a very simple Bayesian model in Stan to infer the direct effect of screen brightness on power draw
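In its simplest form, a Bayesian linear model of power draw versus brightness might be written as (a sketch; the actual model and priors are in the post, not reproduced here):

```latex
\text{power}_i \sim \mathcal{N}\!\left(\alpha + \beta \,\text{brightness}_i,\ \sigma\right)
```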
New post where I describe a simple experiment I ran to measure precisely how much power my laptop screen uses at varying brightness levels.
www.amas.sh/backlight-po...
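The ballpark above can be reproduced with assumed figures (the battery capacity and baseline draw below are illustrative guesses, not values from the post; only the 22 mW/% slope is measured):

```python
SLOPE_W_PER_PCT = 0.022  # measured: ~22 mW of extra draw per 1% brightness
BATTERY_WH = 57.0        # assumption: typical ThinkPad battery capacity
BASELINE_W = 5.0         # assumption: total system draw at the reference brightness

def battery_minutes(extra_brightness_pct: float) -> float:
    """Estimated runtime in minutes at baseline + extra brightness."""
    power = BASELINE_W + SLOPE_W_PER_PCT * extra_brightness_pct
    return BATTERY_WH / power * 60

# Runtime lost by raising brightness 4%: about 12 minutes with these assumed numbers,
# in the same ballpark as the post's ~10 minutes.
delta = battery_minutes(0) - battery_minutes(4)
```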
π
Computing avg power draw via change in energy / change in time still works! But importantly, on my machine at least, the energy readings have much lower fidelity over short time scales than power!
It does mean that any experiment using power observations from the battery as estimates of real-time power draw would be thrown off unless run over long time scales.
I'm sure this behavior is well-known to some, but it was new to me (and I haven't been able to find any references to this smoothing process; I assume it's hardware-specific?).
In retrospect, it's sensible that smoothing is applied given the common use case of estimating remaining battery time.
Thanks to the simple form of the exponential smoothing applied here, I can invert the smoothing to recover the latent true power readings and get a much more realistic looking profile:
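A sketch of that inversion, assuming simple exponential smoothing of the form s[t] = a·x[t] + (1−a)·s[t−1] (the actual smoothing parameter and details are in the post):

```python
def smooth(x: list[float], a: float) -> list[float]:
    """Apply exponential smoothing: s[t] = a*x[t] + (1-a)*s[t-1]."""
    s = [x[0]]
    for v in x[1:]:
        s.append(a * v + (1 - a) * s[-1])
    return s

def unsmooth(s: list[float], a: float) -> list[float]:
    """Invert the smoothing: x[t] = (s[t] - (1-a)*s[t-1]) / a."""
    x = [s[0]]
    for prev, cur in zip(s, s[1:]):
        x.append((cur - (1 - a) * prev) / a)
    return x
```

One caveat: the inversion divides by a, so quantization noise in the reported readings gets amplified when a is small.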
I've finally put in the effort to start blogging and the first post is a dive into my efforts to get reliable real-time power readings on my laptop running Linux.
The main thing I show is that my laptop seems to report _smoothed_ power readings instead of true values.
www.amas.sh/power-readin...
In the post, I show that the observed readings are highly consistent with an exponential smoothing process being applied to latent "true" readings. By assuming that the latent power profile is a step function, I find a smoothing parameter that maps the idealized latent power to the observations.
By taking readings from the battery while applying a step-function artificial load, I observed that the reported power decayed toward near-constant values in a manner that appeared exponential
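One way to fit the smoothing parameter from a step-load experiment like this is a simple grid search over the exponential-decay response (a sketch under assumed notation, not the post's actual fitting code):

```python
def smooth_step(p0: float, p1: float, a: float, n: int) -> list[float]:
    """Smoothed readings after true power steps from p0 to p1 at t=0."""
    s, out = p0, []
    for _ in range(n):
        s = a * p1 + (1 - a) * s
        out.append(s)
    return out

def fit_alpha(observed: list[float], p0: float, p1: float) -> float:
    """Grid-search the smoothing parameter minimizing squared error vs observed decay."""
    best_a, best_err = 0.5, float("inf")
    for i in range(1, 100):
        a = i / 100
        pred = smooth_step(p0, p1, a, len(observed))
        err = sum((o - p) ** 2 for o, p in zip(observed, pred))
        if err < best_err:
            best_a, best_err = a, err
    return best_a
```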
Screenshot from Parable of the Sower showing a portion that takes place in March 2025.
Highlighted text from Parable of the Sower that reads: "In New York and New Jersey, a measles epidemic is killing people. Measles!"
you win this round, Octavia Butler
A red kite, perhaps?
being a longtime python user, my perception of julia is that I can see myself really liking it, and I appreciate many of its ideas, but python has never really given me a strong enough reason to switch. especially when it feels like python is making progress at improving itself
ah yes, the 'descent into madness' phase of becoming a Bayesian
I think I've been to that show. kicked ass
me: "ok but what if the reader isn't very good"
A gentle reminder that if you find it concerning to see the people in charge of macOS and Windows cozying up to politicians you disagree with while existing guardrails are dismantled, 2025 is truly an excellent year to be a Linux user
dc has never been known as a fashionable city, but good lord, the average dropped off a cliff this weekend
Like the only way to read this is that a guy who is not the President said "you may break the law for a while and I won't punish you when I become President" and everyone said, sure, good enough