Recently decided to try to discipline myself a bit more. Too many projects on the go at once, which honestly has been doing nothing good for my mental health. I've decided to abandon some projects I'd adopted on the side and merge Compton, my 2D engine, into Yin.

This means that going forward Yin will support both 2D and 3D games. It also means that Compton is now properly retired. But going forward I'll hopefully only have Hosae and Yin in focus (and whatever game I build on top of Yin.)

Over the last day I've been gradually working on incorporating functionality from Compton into Yin, particularly starting off with the GUI system.

While migrating the GUI system, and rewriting it in the process, I've opted to actually incorporate it as a separate library instead of integrating it directly into the engine, which is an approach I'd like to use for other major additions going forward.

Currently trying out wxWidgets for the editor frontend, though my motivation has been all over the place for that; I don't like wxWidgets but at least it's a little more actively maintained and supported than the Fox Toolkit I was using before (though I'd say Fox is still significantly nicer in a lot of respects.)

Yin's scripting language is now called Dickens.


Did a tiny tiny bit of work on the 3D engine yesterday after a long pause, which basically involved admitting defeat and starting an overhaul of my Blender plugin. I guess I'll be making levels via Blender. For now.

Plan is to make another game this year (or next) to progress the engine tech a bit more. Basic design is in the works.

2D engine now has a functioning animation system; one script pulls groups of sprites from an image, and another outlines the individual animations: their speed, which frames they use, whether to loop or flip, and more. Good stuff!

Dan suggested an editor. The thought makes me cringe; I've been getting fed up with working with various UI frameworks. But then it hit me while I was in the middle of something else today that my 2D engine has a pretty simplistic but functional UI system that I've been looking for an excuse to flesh out a bit more, and that would mean the editor could be integrated within the engine. That means less work... Hopefully!

Didn't make much progress towards that today, however. Still considering how to do it. Probably just a pre-processor flag that denotes whether to compile as the editor? Handling this all at runtime always ends up a bit of a mess, at least with how I've handled it before. Do you toggle it via a keyboard shortcut? Should I integrate a console to allow the user to trigger a command (and flesh that out going forward)? Or should it just be a stand-alone application?

Thinking more and more about silly complex things, and really need to avoid that, just need to prioritise actually making progress!

Oh, and came up with a possible name for the game the other day. Dulce Somnii. It roughly translates to 'sweet dreams' in Latin. I've had it as my status on Discord for a while, but it might make sense for the game in mind, certainly more interesting than the other name that was floating around my brain.


So my Anachronox re-implementation now compiles on both x86 and x64 architectures - and on Linux, finally. The SDL2 implementation is done, though it introduced a bug with movement speed... The rest of the simulation seems fine, just movement is slow? Stupid shitty lockstep crap. Will look into it later given we'll be rewriting the movement logic anyway.

Apparently someone got it running on an ARM device as well? I do have a Raspberry Pi here (actually 3/4 of them), but I hadn't really considered doing this yet as I'd like to rewrite parts of the renderer first.

Things on Yin have been busy busy busy. There's been more work on the client/server architecture, entity components, input, audio and the material system. I've been bouncing around a lot. 

Entity components are mostly done, but I've not yet had a reason to migrate over from the Actor system, and there are some questions about how things will work from a user perspective (currently investigating entity templates outlined via node files). I've not yet considered how networking is going to work in relation to the components though...

New input stuff is mostly done, but again, I've not had a reason to totally migrate to it just yet. You can outline a collection of actions with associated keys, and the rest of the logic will automatically look out for any key activating said action before using a callback to do whatever is required.

For audio, there's now the AudioDriverInterface, which functions as an abstract interface towards whatever audio system is available on the given platform - in the case of Windows I've been working on supporting XAudio2, and on Unix platforms I'll probably use OpenAL Soft instead. This isn't high up my list, but I felt it was a good idea to get the groundwork in place for doing something more substantial than what we had before, i.e. using SDL2 Mixer.

This was something that was originally envisioned for Hei, but I'm on the fence about that for now; if we have an abstraction layer for graphics in there, it would probably make sense to consider one for audio too, particularly as there's been a large explosion of audio APIs lately... Perhaps I'll migrate it down the line.

Nothing that substantial changed for the material system itself, but post-processing is now finally handled through it, which was very much influenced by a bug I was trying to tackle. This did require changing how we handle the render target built-in variable, as previously this just threw over the render target used for the menu - a bodge introduced for the game jam. The way we hand over the viewport size to the material system has changed a little as well, but there's nothing too interesting to say about it.

Generally though, it's more flexible now. It also gave me an opportunity to experiment a little more with other effects, and I'll likely look at adding effects such as bloom and depth of field relatively soon.

Otherwise, I've been working on a tool to convert maps produced by Hammer/Jackhammer/Trench into geometry that Yin can use, and I've been making pretty good progress. I've been lacking motivation, but I really hope to have it finished very soon; the only thing left to do is serialization, and then we should be back to having proper environments again!

Added support for QOI to Hei before I went to bed today too. It apparently features compression close to PNG, but with a faster decoder, so that's pretty neat. It actually reminded me of arguments I'd had on another project about the speed of decoding a PNG though - dumbest argument I've ever had given it was a retro-style game with pretty small textures, so the impact was absolutely negligible, and if anything contributed to the load time it was our crappy material system we had at the time (which I take full responsibility for.) Anyway, that's all a story for another time.


Been a little while.

The language spec for Yang is pretty close to completion and the lexer is pretty much finished, though there are some things to figure out in terms of the preprocessor - so much for not being a priority.

Made further improvements to the material system in Yin. It's now possible to declare which defines should be used when compiling a shader stage, allowing features to be swapped without having to split things into multiple files. Pretty damn happy with the material pipeline now; so much is possible without having to depend on hard-coding.
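I haven't shown the actual material syntax anywhere, but conceptually the declaration reads something like this purely hypothetical fragment (the keywords and layout here are invented for illustration, not Yin's real format):

```
material "water" {
    shader "generic" {
        // defines handed to the compiler for this stage,
        // toggling features without splitting the shader file
        defines [ "ENABLE_REFLECTION" "ENABLE_WAVES" ]
    }
}
```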

There was a tech demo shown and described yesterday, produced on the N64 and demonstrating portals. The implementation seemed close to what I was working on, but my plan was to utilise the stencil buffer - rather than using the stencil buffer, they max out the z values for a polygon covering the area they rendered to? Not sure I entirely understand the technique, but keen to experiment.

On the side, I have been reverse-engineering something rather special that I've been granted early access to, with the intent of publishing an article about it on TalonBrave.info the day that it gets released (which is early December, if you're wondering). Ended up recycling an old 2D engine of mine (now under GPLv3) to demonstrate some bits and in turn ended up rewriting quite a bit of the rendering side (typical). Changes are on a branch named 'bacon', for no particular reason.

Slowly shifting my Anachronox reimplementation over to use SDL2.

That's about it for now.


Wasn't feeling too awesome today so didn't get much done. Been making progress on the entity component system for Yin - probably almost done, but I need to prototype functionality to make sure I've got everything covered. The design feels dirty, and there's probably a better way to do it: components are registered into a template list which holds a list of instances, which we then iterate through (tick, draw, serialise and deserialise). Each component has a pointer to its associated entity, which is used as a reference for communication between different components (i.e. does this entity have a component called x?)

Settled on calling the scripting language for Yin 'Yang'. Very original. But I'm happy with it, so whatever. Still settling on the initial set of opcodes and sorting out the assembler, but it's a very low priority right now. Unlikely to be used for the current game still, but we'll see.

In other news, I'm starting to prototype AI for the game using an older foundation as a starting point. I've settled on a collection of sensors and am gradually working out the logic from there: every tick they will search for whatever is relevant based on their type and pass input to the "brain". The brain won't do anything with that yet, and I need to decide on a storage type when passing input - the brain needs to be able to make associations and determinations based on input, and some input might be more pressing than others (i.e. pain).