This gives me a hint, something to look into. Thank you!
Since restrictive behaviour is the default for permission tests, I would assume that all tests should fail if permissions are not changed and the TestPermissions property is not explicitly set. Do you mean that this is how it works now in 27?
Short post format left no space for details. :-) It was BC 26 upgraded to 27.
And some more nuances.
TestPermissions = NonRestrictive doesn't help, only Disabled does. And not every test fails, only around one out of three. Probably those accessing some specific tables, I'm not really sure.
Why could BC tests start failing all of a sudden with "Sorry, the current permissions prevented the action"? I've been running my tests for a while, but after upgrading the dev instance got this error. Setting TestPermissions = Disabled resolves it, but I'm curious what could have caused it.
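For context, this is the property being discussed — a minimal sketch of a test codeunit with the permission checks switched off (codeunit name and number are illustrative):

```al
codeunit 50100 "My Permission Tests"
{
    Subtype = Test;
    // Disabled turns the permission checks off entirely; per this thread,
    // NonRestrictive was not enough to make the failing tests pass.
    TestPermissions = Disabled;

    [Test]
    procedure ActionSucceedsWithoutPermissionError()
    begin
        // Arrange / Act / Assert ...
    end;
}
```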
SQL Server allows table rows to exceed the 8KB data page limit. But... some curious side effects can happen.
www.keytogoodcode.com/post/another...
SetLoadFields function helps developers to avoid unnecessary table joins and reduce the amount of data sent over the network. But how significant is the benefit of a smaller dataset? When does it really make a difference? Some measurements in a blog post.
www.keytogoodcode.com/post/perform...
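A quick illustration of the pattern the post measures (procedure and field choices are just examples):

```al
procedure ShowCustomerNames()
var
    Customer: Record Customer;
begin
    // Load only the columns we actually read: the generated SQL selects
    // the primary key plus these fields instead of every table column,
    // and skips joins to table-extension companion tables it doesn't need.
    Customer.SetLoadFields(Name, "Country/Region Code");
    if Customer.FindSet() then
        repeat
            Message('%1 (%2)', Customer.Name, Customer."Country/Region Code");
        until Customer.Next() = 0;
end;
```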
It looks like I need to practice my skills in using the Ctrl+F function. I once asked how I could generate a hash with the Cryptography Management codeunit based on a binary key. Easy, it turns out. There is a function for this that accepts a base64 key. I just did not find it.
How can just in time loads be faster than a full record load? After all, this means two queries reading the same table. And two are slower than one, right? Right, but only if we allow these two queries to run until completion.
www.keytogoodcode.com/post/jit-loa...
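The mechanism in question, roughly (an illustrative sketch, not code from the post): with partial records, touching a field that was not loaded triggers a second, just-in-time query for that row.

```al
procedure JitLoadExample()
var
    Item: Record Item;
begin
    Item.SetLoadFields(Description);
    Item.FindFirst();                    // query 1: primary key + Description
    Message(Item.Description);           // served from the loaded record
    Message(Format(Item."Unit Cost"));   // query 2: JIT load re-fetches the row
end;
```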
Which side are you on in the question of whether SetCurrentKey is beneficial for data access performance in AL? Should we use it or drop it?
I give a definite and unequivocal answer: it depends.
www.keytogoodcode.com/post/to-sort...
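For readers new to the debate, the statement under discussion looks like this (illustrative sketch; the comment summarises the usual argument, the details are in the post):

```al
procedure SortedScan()
var
    CustLedgerEntry: Record "Cust. Ledger Entry";
begin
    // SetCurrentKey only adds an ORDER BY to the generated query; SQL Server
    // still picks the index. So the sort can either ride an existing index
    // for free or force an expensive sort - hence "it depends".
    CustLedgerEntry.SetCurrentKey("Customer No.", "Posting Date");
    CustLedgerEntry.SetRange("Customer No.", '10000');
    if CustLedgerEntry.FindSet() then;
end;
```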
If the other side were using a text key, this would make things easier for us. The key is actually a hex number shared as text, but they parse it into the actual bytes before running the hash function.
This could be another option for a contribution to the system app. The function that generates the hash must decode the Base64 string in this case. But as it is, the function would take the encoded string as the key.
Yes, that's what I was considering too. Just thinking that the developer experience would be a bit different from classic arrays, as we cannot allocate a specific amount of memory, but have to add one element at a time.
It could be an overload. Although AL is generally not too good at manipulating byte arrays. Would it even be a good idea to have a function accepting a list of bytes, considering that AL arrays can only have a fixed size?
I can't see any way around it, but maybe someone has an idea how to generate a hash in AL with a byte sequence as a key instead of a text string?
I have a good question for #bcalhelp. I need to generate a hash with a function from Cryptography Management CU, which requires a text argument as the key. But I need to send a byte array. Text in AL is UTF-8, so bytes above 7F turn into 2-byte sequences, and my key never matches the counterpart.
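To make the workaround discussed in this thread concrete, a hedged sketch: it assumes the Base64-keyed function found later is `GenerateBase64KeyedHash` in the Cryptography Management codeunit (check the exact name and signature in your version), and `HexToBase64` is a made-up helper, not a system function.

```al
procedure SignWithBinaryKey(Payload: Text; HexKey: Text): Text
var
    CryptographyMgt: Codeunit "Cryptography Management";
    KeyedHashAlgorithm: Option HMACMD5,HMACSHA1,HMACSHA256,HMACSHA384,HMACSHA512;
    Base64Key: Text;
begin
    // Re-encode the raw key bytes as Base64 so they survive the Text
    // parameter untouched; the Base64-keyed overload decodes them back
    // to the exact byte sequence before computing the HMAC, so no
    // UTF-8 round trip ever mangles bytes above 7F.
    Base64Key := HexToBase64(HexKey); // hypothetical helper
    exit(CryptographyMgt.GenerateBase64KeyedHash(Payload, Base64Key,
        KeyedHashAlgorithm::HMACSHA256));
end;
```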
@nataliekarolak.bsky.social has been concerned about the lack of activity on BSky. I have a suggestion. How about garbage collection? Seems to be a good topic to talk about.
www.keytogoodcode.com/post/garbage...
I worked a little more on the BC cost tracer and added a couple of new features: node grouping and a new layout with the fCoSE algorithm, which is very efficient at clustering compound nodes. Thanks to this algorithm, item ledger entries can be grouped by the document no.
github.com/adrogin/BCVi...
Looking forward to being on stage at @bctechdays.com with @stefanmaron.com
And the blank message in this log entry does not help either. It would be worth adding something meaningful.
Search in the repository points to Master Data Management in ALAppExtensions, just a feature usage log.
No other method I'm aware of. And this way, it looks good. It's a clear intention: "I'm only going to use these fields". And then the runtime will adjust the list anyway, if any other fields must be included.
InStream/OutStream are essentially memory buffers with read/write pointers. In this snippet, CreateInStream instantiates the stream but does not read the data, whilst ReadAs actually fills the buffer. I don't see it as a hack, but it surely can be confusing.
I almost never use the return value of the Insert function in AL, but recently I found it useful in avoiding a specific deadlock situation. And the deadlock itself was a very curious one.
www.keytogoodcode.com/post/buffere...
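One common shape of the pattern (a generic sketch with a made-up buffer table; whether it matches the post's exact deadlock scenario, see the link):

```al
procedure UpsertBuffer(EntryNo: Integer; NewAmount: Decimal)
var
    Buffer: Record "My Buffer"; // illustrative table
begin
    Buffer."Entry No." := EntryNo;
    Buffer.Amount := NewAmount;
    // Insert returns false instead of raising an error when the record
    // already exists, so we can fall back to Modify without a prior Get -
    // skipping the extra read that can participate in a deadlock.
    if not Buffer.Insert() then
        Buffer.Modify();
end;
```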
The holidays are almost over; next week I am back to work and other Business Central activities. But these two weeks made for a great getaway.
Are you afraid of filters on FlowFields in AL? Well... There is definitely a reason to be cautious. But it may not be as bad as it seems, and certainly not as bad as some other methods which are used to avoid these filters.
www.keytogoodcode.com/post/filteri...
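For reference, the kind of filter the post is about (standard fields, illustrative threshold): filtering directly on a FlowField makes the server evaluate the aggregate per row.

```al
procedure FindCustomersWithBalance()
var
    Customer: Record Customer;
begin
    // "Balance (LCY)" is a FlowField (a sum over detailed ledger entries),
    // so this filter is evaluated via a subquery rather than a plain
    // WHERE clause on a stored column.
    Customer.SetFilter("Balance (LCY)", '>%1', 1000);
    if Customer.FindSet() then
        repeat
            // process customers with an outstanding balance
        until Customer.Next() = 0;
end;
```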
Following my series of posts addressing various aspects of table indexing, this post looks at the same topic from a different point of view. Now I want to ask: how does indexing impact performance on the insert/update side?
www.keytogoodcode.com/post/table-i...
Did you ever get confused with Business Central's temporary tables and the way temp tables are passed to procedure parameters? I did too, so I wrote a post as a memo.
www.keytogoodcode.com/post/tempora...