
Setting up runners for the job queue in MediaWiki is ... poorly documented. May need to dive in and shovel some words here and there. :D

Ah... diving into the original MS article I think the idea is as a defense-in-depth mitigation for *trusted* code that processes *untrusted data* (and thus might be tricked into doing something that accesses trusted data elsewhere in the process, on behalf of some untrusted code).

Am I really confused or does this description of MS's compiler 'fix' for Spectre variant 1 seem really confused about which code is vulnerable and which is changed? IIUC you could just use a pointer of your choice and not even bother with the Spectre stuff if you can write arbitrary C code injected into a process...? Compiler changes for managed VM code would prevent use of "safe code" loaded in-process from exploiting the attack, though, right? Or am I backwards?

hack: on IE 11, the Math.imul polyfill is a bottleneck for VP8/VP9 decoding in . Replacing it with direct multiplication results in a noticeable speedup, but assumes no overflows will happen and breaks asm.js validation. ;)

Of course users will get far bigger performance gains from using Edge or Firefox or Chrome or Safari or anything other than IE 11. ;)

Still not sure if beautiful or gross that a billionaire had his rocket company launch a commercial for his car company as a "test payload".

I always read emails from GitHub thinking they're from GrubHub and vice versa

ugh, need C accessors for the JS anyway so ... stick with C it is! \o/

Object-oriented-style C is still kind of a pain. Nice that it doesn't hide the complexity, but annoying that I have to type a lot. Considering C++ for this polymorphic wrapper code. :P
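The pain is all the hand-rolled vtable boilerplate. A rough sketch of the pattern, with hypothetical names (not from any real codebase):

```c
#include <assert.h>
#include <stdlib.h>

/* Forward declaration so the vtable can reference the base type. */
typedef struct Decoder Decoder;

/* The "virtual method table": one function pointer per method. */
typedef struct DecoderVTable {
    int (*decode)(Decoder *self, int frame);
    void (*destroy)(Decoder *self);
} DecoderVTable;

/* The "base class": just a pointer to its vtable. */
struct Decoder {
    const DecoderVTable *vt;
};

/* One concrete "subclass": a do-nothing decoder that counts frames. */
typedef struct NullDecoder {
    Decoder base; /* must be the first member so the casts below are valid */
    int frames;
} NullDecoder;

static int null_decode(Decoder *self, int frame) {
    NullDecoder *d = (NullDecoder *)self;
    d->frames++;
    return frame; /* pretend we decoded it */
}

static void null_destroy(Decoder *self) {
    free(self);
}

static const DecoderVTable null_vtable = { null_decode, null_destroy };

/* The "constructor": allocate, wire up the vtable, return as base type. */
Decoder *null_decoder_new(void) {
    NullDecoder *d = calloc(1, sizeof *d);
    assert(d != NULL);
    d->base.vt = &null_vtable;
    return &d->base;
}
```

Every virtual call ends up as `d->vt->decode(d, n)` spelled out by hand; C++ generates all of this for you, which is exactly the trade-off being weighed here.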

Size overhead in emscripten output is minimal if I build with -fno-exceptions (don't need 'em in my code). But C++ grates too -- pure-virtual destructors must have a function body, for instance (wait, what?)

Any recommendations on good practices for modern C11-era C programming? (or at least a good overview of what's changed and useful vs 'whatever old dialect of C you learned in the 90s and kept using in Linux stuff')
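Not an answer, but a quick sampler of what C99/C11 added over that 90s dialect -- illustrative code, not a style guide:

```c
#include <assert.h>
#include <stdbool.h>  /* C99: a real bool type */
#include <stdint.h>   /* C99: fixed-width integer types */

/* C11: compile-time checks, no runtime assert needed */
_Static_assert(sizeof(int32_t) == 4, "int32_t must be 4 bytes");

typedef struct {
    int x;
    int y;
} Point;

/* C99: designated initializers; unlisted members are zeroed */
static const Point origin = { .x = 0, .y = 0 };

static int manhattan(Point p) {
    /* C99: declare variables where they're used, not at block top */
    int dx = p.x < 0 ? -p.x : p.x;
    int dy = p.y < 0 ? -p.y : p.y;
    return dx + dy;
}

/* C99: compound literals let you pass a struct inline, e.g.
 *   manhattan((Point){ .x = 3, .y = -4 })
 * instead of declaring a named temporary first. */
```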

Still pondering how much to directly share code between OGVKit (iOS, C/Obj-C) and ogv.js (web, JS + C/emscripten). Pretty solidly planning to share the demuxer and codec wrapper & support code more since I've been cribbing functions back and forth. But would it be worth doing the player logic in one C implementation with callbacks versus two Obj-C and JS implementations? Decisions, decisions.
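The "one C implementation with callbacks" option would presumably look something like this sketch -- the core owns the timing logic, and each host (Obj-C or JS shim) supplies the platform hooks. All names here are hypothetical:

```c
#include <assert.h>
#include <stddef.h>

/* Host-provided hooks; `host` is the Obj-C object or JS-side shim. */
typedef struct PlayerCallbacks {
    double (*get_time)(void *host);               /* current clock, seconds */
    void (*draw_frame)(void *host, int frame_no); /* present a decoded frame */
} PlayerCallbacks;

typedef struct Player {
    PlayerCallbacks cb;
    void *host;
    double frame_interval; /* e.g. 1.0 / 30 for 30fps */
    double next_due;
    int next_frame;
} Player;

/* Called from the host's run loop / requestAnimationFrame: draws at
 * most one frame if its presentation time has arrived. Returns 1 if
 * a frame was drawn. */
int player_tick(Player *p) {
    double now = p->cb.get_time(p->host);
    if (now >= p->next_due) {
        p->cb.draw_frame(p->host, p->next_frame++);
        p->next_due = now + p->frame_interval;
        return 1;
    }
    return 0;
}

/* Tiny fake host, just to show the wiring. */
static double demo_clock;
static int demo_frames_drawn;
static double demo_get_time(void *host) { (void)host; return demo_clock; }
static void demo_draw_frame(void *host, int frame_no) {
    (void)host; (void)frame_no;
    demo_frames_drawn++;
}

/* Simulate 100ms of a 30fps clip in 10ms ticks; returns frames drawn. */
int demo_run(void) {
    demo_clock = 0.0;
    demo_frames_drawn = 0;
    Player p = { { demo_get_time, demo_draw_frame }, NULL, 1.0 / 30.0, 0.0, 0 };
    for (int step = 0; step < 10; step++) {
        demo_clock = step * 0.01;
        player_tick(&p);
    }
    return demo_frames_drawn;
}
```

The upside is one copy of the fiddly timing/buffering logic; the downside is that both hosts now have to marshal their platform objects through `void *`.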

Dug out my iPod Touch 5th-gen to double-check that OGVKit still works on iOS 9. Works, but I have to buffer more frames to smooth out VP9 decode-time spikes at scene boundaries...

Wikimedia all-hands meetings were a good time, except the part where I got sick and missed half the stuff. Now I'm catching up on soooo many emails before getting back to my projects...

Think of all the stuff that still uses gtk2.

Gtk2 is not unmaintained, but it's in life-support mode, kept going by people who actually get paid to do it.

Apps that haven't (been able to) switch to gtk3 are maintained by people who *don't* get paid to do it.

Meanwhile, end users absolutely depend on those apps. We don't have that many "big" applications in free software, and we don't have an infrastructure for people to pay for them.

This is fucked up.

C warnings are like ... "optional" errors.

Where the project whose library you're using might have different "options" from the project you're using the code in.

Refactored my test case for wasm fail on iOS 11.2.2+ to a single JS file that runs synchronously:

Feel free to nick it to use in projects that need to fall back to asm.js versions on iOS!

The wifi-off option can be disabled, and for now I'm just going to explicitly lock the screen when I want. :P

Seems to crash the display server when the screen dims after idle. Whoops! Also it turned off wifi when turning the screen off, which is ... not good for an unattended vagrant setup that needs the network, either of them. :(

The default open-source nvidia graphics driver is a bit sluggish at 2160p, and more sluggish at 2x 2160p, but it'll do for now.

Also sometimes I have to reset the 200% scaling on my monitors, probably because my monitor has crappy firmware. But I can now do that easily through the GNOME Settings app, instead of manually running a command like I used to.
