
Monthly Archives: August 2023

I have a notebook with a backlog of projects on which to work. For August, I’d like something on the easier side since I’ll be moving for a good chunk of it and probably exhausted for the rest.

Possibility 1: Workout Calculator

Three to four times a week I find myself needing to throw together a workout plan. That means assembling a group of sets that covers the main muscle groups, rests the ones that still need to recover, and gets the heart rate up. There are some existing solutions online, but honestly they’re all apps covered in ads and they annoy the hell out of me. This would probably be a mobile-first app; I’m thinking it would be written in Godot.
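
At its core it’s just a selection problem. As a rough sketch of what I mean (hypothetical exercise catalog and field names, greedy selection):

```python
import random

# Hypothetical catalog: exercise name -> primary muscle groups it hits.
CATALOG = {
    "squat": {"quads", "glutes"},
    "deadlift": {"hamstrings", "back"},
    "bench press": {"chest", "triceps"},
    "overhead press": {"shoulders", "triceps"},
    "pull-up": {"back", "biceps"},
    "burpees": {"cardio"},
    "rowing": {"cardio", "back"},
}

MAIN_GROUPS = {"quads", "glutes", "hamstrings", "back", "chest", "shoulders", "cardio"}

def plan_workout(recovering: set[str]) -> list[str]:
    """Greedily pick exercises until the main groups are covered,
    skipping anything that hits a muscle that's still recovering."""
    remaining = MAIN_GROUPS - recovering
    plan = []
    exercises = list(CATALOG.items())
    random.shuffle(exercises)  # vary the plan from day to day
    for name, muscles in exercises:
        if muscles & recovering:
            continue  # let those muscles rest
        if muscles & remaining:
            plan.append(name)
            remaining -= muscles
    return plan

print(plan_workout(recovering={"chest", "triceps"}))
```

A real app would obviously need set and rep counts, equipment, and a proper catalog, but that’s roughly the shape of the logic.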

Possibility 2: Sketch to Animation

I would very much like to set up a model which takes a character sheet and a pose sketch and produces a finished posed frame for an animation. This would be similar to the “three-quarters view from front and side” project, because I hate drawing 3/4ths views. It’s mostly targeted at people who like animation but aren’t really keen on detailed character work: let one person do the detailed work, let another make the animations, then combine their efforts automatically and remove the tedium of redrawing so much of the same content. Sure, there are already animation tools that cover a lot of this, but the idea still captures my attention.
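
I haven’t thought about the architecture at all yet; the only firm part is the interface, which is two image inputs conditioning one image output. A throwaway PyTorch sketch of just that interface, with made-up layer sizes (a real version would almost certainly be a diffusion- or ControlNet-style model):

```python
import torch
import torch.nn as nn

class PoseConditionedGenerator(nn.Module):
    """Toy sketch: encode the character sheet and the pose sketch separately,
    concatenate the features, and decode a finished frame."""
    def __init__(self, channels=32):
        super().__init__()
        def encoder():
            return nn.Sequential(
                nn.Conv2d(3, channels, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(channels, channels, 3, stride=2, padding=1), nn.ReLU(),
            )
        self.sheet_enc = encoder()  # character sheet: what the character looks like
        self.pose_enc = encoder()   # pose sketch: where everything goes this frame
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(2 * channels, channels, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(channels, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, character_sheet, pose_sketch):
        feats = torch.cat([self.sheet_enc(character_sheet), self.pose_enc(pose_sketch)], dim=1)
        return self.decoder(feats)

frame = PoseConditionedGenerator()(torch.rand(1, 3, 256, 256), torch.rand(1, 3, 256, 256))
print(frame.shape)  # torch.Size([1, 3, 256, 256])
```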

Possibility 3: OpenFX Davinci Resolve Plugin

I’ve been meaning to put together a plugin for Resolve so I can do something like implement thin-plate splines (i.e., port puppet warp from After Effects). I don’t think this would be a finished product in itself; I’d only want to get the .ofx file built, running in Resolve, and showing some kind of IO. It could be as simple as ‘invert color’, as long as it runs.
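
For reference (this is the eventual payoff rather than anything in the August scope), the usual 2D thin-plate spline warp is an affine part plus a weighted sum of radial basis terms fit to the control points, with one such function per output coordinate:

$$
f(\mathbf{x}) = a_0 + \mathbf{a}^\top \mathbf{x} + \sum_{i=1}^{n} w_i \, U(\lVert \mathbf{x} - \mathbf{x}_i \rVert), \qquad U(r) = r^2 \log r,
$$

with the side conditions $\sum_i w_i = 0$ and $\sum_i w_i \mathbf{x}_i = \mathbf{0}$ so the warp stays affine far from the control points. (The $r^2 \log r^2$ convention also shows up; the factor of two just gets absorbed into the weights.)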

Possibility 4: Easily Addressable Short Term Memory Networks (EAST MN?)

It’s been a while since I did anything with language models, and I’ve been thinking about options for making smaller networks more powerful. Transformers don’t carry any historical state (positional encoding passed as input doesn’t count), which is a blessing and a curse. I’m curious whether it would be possible to add a memory component to transformers that’s addressable in the same way attention is.
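
To make that concrete, here’s one possible shape of the idea (made-up sizes, read-only memory): keep a persistent bank of memory slots and let tokens read from it with the same scaled dot-product machinery attention already uses, so the queries address memory slots instead of other tokens.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class MemoryRead(nn.Module):
    """Sketch: a learned bank of memory slots that tokens read from using the
    same scaled dot-product attention used for token-to-token attention.
    (A real version would also need a write/update rule, which is the hard part.)"""
    def __init__(self, d_model=256, n_slots=64):
        super().__init__()
        self.memory = nn.Parameter(torch.randn(n_slots, d_model) * 0.02)  # persistent slots
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)

    def forward(self, x):                    # x: (batch, seq, d_model)
        q = self.q_proj(x)                   # queries come from the token stream
        k = self.k_proj(self.memory)         # keys/values come from the memory bank
        v = self.v_proj(self.memory)
        scores = q @ k.t() / math.sqrt(q.shape[-1])  # (batch, seq, n_slots)
        read = F.softmax(scores, dim=-1) @ v          # soft addressing, exactly like attention
        return x + read                               # residual, so it drops into a block

out = MemoryRead()(torch.randn(2, 10, 256))
print(out.shape)  # torch.Size([2, 10, 256])
```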

I have 36 hours left to decide. I’m leaning towards #3 or #4 — #3 because I like writing Rust and #4 because it would be useful and would sate my curiosity.