Had a PR merged without further changes, after 5 years
github.com/queryverse/Q...
was thinking back about my first vaguely significant open source PR today. Almost a decade ago.
Feels much longer.
github.com/JuliaLang/ju...
My employer did a spotlight on me for their social media.
😊😊😊
www.linkedin.com/posts/meet-o...
It's a good point.
And that feels like less of a nasty system to work with
In general it feels like it should be useful for things to know stuff like "this collection is unordered".
Which implies maps over it should be allowed to be parallelized.
(And so implies properties about the function being mapped also).
I feel like this isn't particularly novel.
I just haven't seen it exposed to the user.
But to be fair I haven't looked closely at any compiler tech focused on automatic parallelism.
I think we could probably hack something together using non-native type lattices in Julia to do all the inference around it
But the tooling for that isn't stable or nice if you don't have access to Keno.
Which we did have when I was at JuliaHub working with him on DAECompiler.
Compilers do do a fair bit of reasoning around this kinda thing.
E.g. using purity to perform common subexpression elimination and loop-invariant code motion.
But I haven't seen it very exposed to the end user as properties they could write code that dispatches on.
And if an operation preserves zero (f(0) == 0), then when mapping it over a sparse array you don't have to map the structural zeros.
And if it's pure you can cache it.
Etc etc.
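A minimal sketch of those two optimisations (hypothetical helper names; assumes the property that licenses skipping structural zeros is zero-preservation, f(0) == 0):

```python
from functools import lru_cache

def map_sparse(f, nonzeros, length):
    """Apply f elementwise to a sparse vector given as {index: value}.

    Assumes f(0) == 0, so the structural zeros never need visiting.
    """
    assert f(0) == 0, "f must preserve zero to skip structural zeros"
    return {i: f(v) for i, v in nonzeros.items()}

# If f is pure, its results can be cached safely.
@lru_cache(maxsize=None)
def square(x):
    return x * x

vec = {2: 3.0, 7: -1.5}              # sparse vector of length 10
result = map_sparse(square, vec, 10)  # only the 2 stored entries computed
```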
For example: if you have a scan (accumulate, fold) over an associative function (which you can infer), then you can use various prefix-sum operations to parallelize it.
See
dspace.mit.edu/bitstream/ha...
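A toy sketch of why associativity is what buys you this (Python for illustration; the function name is made up). The scan is split into per-block scans that are independent of each other, so pass 1 could run in parallel; a cheap second pass applies the block offsets:

```python
from itertools import accumulate
import operator

def blocked_scan(xs, op, nblocks=4):
    """Inclusive scan decomposed into independent per-block scans.

    Each block can be scanned in parallel because op is associative;
    a second pass offsets each block by the totals of the blocks
    before it. A sketch of the classic prefix-sum trick.
    """
    n = len(xs)
    size = -(-n // nblocks)  # ceiling division
    blocks = [xs[i:i + size] for i in range(0, n, size)]
    # Pass 1: scan each block independently (the parallelisable part).
    local = [list(accumulate(b, op)) for b in blocks]
    # Pass 2: offset each block by the running total of earlier blocks.
    out, carry = [], None
    for scanned in local:
        if carry is None:
            out.extend(scanned)
        else:
            out.extend(op(carry, v) for v in scanned)
        carry = out[-1]
    return out

blocked_scan(list(range(1, 9)), operator.add)
```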
Something I have been thinking about on and off is annotating function primitives with properties, like nilpotent, associative, commutative, etc., then using (type) inference to infer these properties for functions built on them.
Then using that for automatic parallelism and other optimisations.
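A toy sketch of the annotate-then-propagate idea (hypothetical, not an existing library): primitives declare their properties, and composition keeps only the properties that are actually stable under composition. Purity and zero-preservation both are; associativity of a binary op is not, so it would stay on the primitive:

```python
# Registry of declared/inferred properties per function.
PROPS = {}
COMPOSABLE = {"pure", "zero_preserving"}  # stable under composition

def primitive(*props):
    """Decorator recording the declared properties of a primitive."""
    def mark(f):
        PROPS[f] = frozenset(props)
        return f
    return mark

def compose(f, g):
    """Build x -> f(g(x)), inferring its composition-stable properties."""
    def h(x):
        return f(g(x))
    PROPS[h] = PROPS.get(f, frozenset()) & PROPS.get(g, frozenset()) & COMPOSABLE
    return h

@primitive("pure", "zero_preserving")
def double(x):
    return 2 * x

@primitive("pure")
def inc(x):
    return x + 1

h = compose(inc, double)
# h is inferred pure, but not zero-preserving (inc(0) != 0).
```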
I accidentally turned on AI completion in VS-Code yesterday.
And I thought "_fine_, I will finally try this thing"
and I am really not liking having my tab completion lie to me?
I just have to check it when it doesn't work, then look up the right way.
When I could have just immediately looked it up
the amount of hatred i have for good tests that were marked "skip" because they are too slow that have bit rotted,
is only exceeded by my hatred for good tests that were marked "skip" that are still good and would have caught bugs had they not been set to skip
I've just created a #JuliaLang package BlueskyClient.jl github.com/AtelierArith... .
You can post text, images with ALT text, GIFs, and MP4 videos from the package. Give it a try!
I did yes.
www.oxinabox.net/2024/12/08/b...
It's mostly a nice tutorial on how to work with typical JSON-over-REST APIs with pagination.
bsky is shockingly easy API-wise since there's no auth
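The cursor-pagination loop such APIs use can be sketched generically (hypothetical names; a fake in-memory fetch stands in for the real HTTP call you'd make with `requests`): keep requesting with the last cursor until the server stops returning one.

```python
def paginate(fetch_page, limit=100):
    """Yield all items from a cursor-paginated endpoint."""
    cursor = None
    while True:
        page = fetch_page(cursor=cursor, limit=limit)
        yield from page["items"]
        cursor = page.get("cursor")
        if cursor is None:  # server omits the cursor on the last page
            break

# Fake server standing in for an HTTP call, for demonstration only.
DATA = list(range(10))

def fake_fetch(cursor, limit):
    start = int(cursor or 0)
    chunk = DATA[start:start + limit]
    nxt = start + limit
    return {"items": chunk, "cursor": str(nxt) if nxt < len(DATA) else None}

list(paginate(fake_fetch, limit=4))  # all 10 items, fetched in 3 pages
```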
this is a really cool page on all the tricks you can use to express different ideas in terms of (Mixed Integer) linear programming.
Cos the whole thing is very tricky.
Like the trick of splitting variables so you can have L1 regularisation, which is nonlinear in a naive implementation.
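A small sketch of the variable-splitting trick (illustrative, no solver involved): |x| is nonlinear, but writing x = x⁺ − x⁻ with x⁺, x⁻ ≥ 0 makes |x| expressible as the linear term x⁺ + x⁻, so an L1 penalty can go straight into an LP objective. At an optimum the solver never keeps both halves positive, since shrinking both by min(x⁺, x⁻) lowers the objective without changing x:

```python
def split(x):
    """Return the optimal (x_plus, x_minus) split the solver would pick."""
    xp, xn = max(x, 0.0), max(-x, 0.0)
    return xp, xn

for x in (-3.5, 0.0, 2.0):
    xp, xn = split(x)
    assert xp >= 0 and xn >= 0   # the LP's nonnegativity constraints
    assert xp - xn == x          # recovers the original variable
    assert xp + xn == abs(x)     # linear expression equal to |x|
```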
I have spent far too much time dealing with flaky tests and nondeterministic failures today
devo
Yes, I find that in Python I often have modules that are really just one function (at least in their public API).
Because it makes sense for all of that to be in one file.
Combined with explicit imports, it makes quite a complexity of import headers.
Also, if you really care about performance for languages not covered by Latin-1, there are considerations about storing code points rather than code units.
And you might want a string to be memory mapped etc etc.
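The code-point vs code-unit distinction in one small example: Python's `str` counts code points, but a UTF-8-backed string would index by bytes (code units), and the two diverge as soon as you leave Latin-1:

```python
s = "Zürich"                           # ü is 1 code point, 2 UTF-8 bytes
assert len(s) == 6                     # code points
assert len(s.encode("utf-8")) == 7     # UTF-8 code units (bytes)
assert len(s.encode("utf-16-le")) // 2 == 6  # UTF-16 units (all BMP here)
```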
does python 3 have an abstract base class for strings?
I feel like it should.
because for data structures like a rope you want the string API, but you definitely do not want to store it like a string, so inheriting from str is no good.
en.wikipedia.org/wiki/Rope_(d...
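As far as I know there is no str-specific ABC; `collections.abc.Sequence` is the closest thing to register against, and it covers the container part of the string API. A minimal sketch of a two-chunk rope (toy code, real ropes are balanced trees):

```python
from collections.abc import Sequence

class Rope(Sequence):
    """Two string chunks exposing sequence behaviour without one buffer."""

    def __init__(self, left, right):
        self.left, self.right = left, right

    def __len__(self):
        return len(self.left) + len(self.right)

    def __getitem__(self, i):
        if isinstance(i, slice):
            return "".join(self[j] for j in range(*i.indices(len(self))))
        if i < 0:
            i += len(self)
        if i < len(self.left):
            return self.left[i]
        return self.right[i - len(self.left)]

r = Rope("hello, ", "world")
# Sequence supplies __iter__, __contains__, index, count for free.
len(r), r[7], r[0:5]
```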
I am not sure if my copyright is registered anywhere.
I am Australian; the imprint of Springer that was my publisher is based in Singapore.
Australian copyright applies without need for registration.
You need to register to take someone to court over it, but you can do that long after.
I wanted to see if I was part of the Anthropic settlement class:
✅ Book on LibGen before 2021
✅ Book has ISBN
❎ Book was registered with the United States Copyright Office
Alas, I am not.
While my rights have obviously been violated, it is not in a way that is covered by this court case.
It means you end up with lots of
```
from package.component.foo_do import foo
from package.component.bar_do import bar
```
I have seen it worked around by reexporting
foo and bar from `component`
I have successfully convinced folk to call it a SalamanderModel
So we had a model incorporating thermodynamic constraints called a ThermoModel.
And another model with enzyme constraints called a GeckoModel.
(Gecko = Genome-scale model Enzyme Constraints, using Kinetics and Omics.)
And we were making a model to combine them.
Some kind of ThermoGecko.
The problem is not bounds check or no bounds check
The problem is the two (or maybe three) compilation caches
I need to set julia to always run with bounds checks enabled.
I have done this before.
The big virtue is no more redoing precompilation between running normally and running tests.
Since Julia makes separate compilation caches for the three different settings for bounds checks.
Plus tests always have them on.
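A sketch of the invocation, assuming Julia's `--check-bounds` flag (which takes yes/no/auto): forcing bounds checks on for normal runs means sharing the cache with `Pkg.test`, which runs with them on.

```shell
# Force bounds checks on, matching what Pkg.test uses, so the same
# compilation cache is reused between normal runs and test runs.
julia --check-bounds=yes --project=. script.jl
```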