I'd say abstract, but only abstract as needed. It's easy to get deep in the weeds future-proofing, but spending all your time on interfaces is a surefire way to burn out.
Don't extensively hardcode strings of dialogue into your game! I'm having to think about localisation for the first time and it is really, really arduous to replace those thousands of strings with IDs to call from the localisation database.
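If it helps anyone starting fresh, the ID-lookup side can be as simple as Java's built-in ResourceBundle. This is only a generic sketch, not necessarily the setup being migrated to; the bundle name and key are invented:

```java
import java.util.Locale;
import java.util.ResourceBundle;

public class DialogueLookup {
    public static void main(String[] args) {
        // Loads dialogue_de.properties (falling back to dialogue.properties)
        // from the classpath; translators only ever touch these files.
        ResourceBundle lines = ResourceBundle.getBundle("dialogue", Locale.GERMAN);

        // Game code references the ID, never the literal sentence.
        String greeting = lines.getString("tavern_keeper_greeting");
        System.out.println(greeting);
    }
}
```

Swapping the Locale is then the only code change needed to ship another language.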
I feel this too much! I'm fortunate that most of mine isn't hard-coded, but still...
Test if your build runs before you upload it, even if you think you didn't make any changes that could break anything -_-
Profilers for diagnosing performance issues.
I had an experience where my basic general rendering knowledge (lots of draw calls / polys = bad) made me complacent about solving performance problems. I saw low FPS, so I started simplifying meshes. But that's not always the cause: there can be runaway code, memory issues, specific render passes, etc.
In my case, I was trying to get a Unity game to run on a PS4 devkit, but it kept crashing on a certain level. I wasted a lot of time simplifying the meshes used in that scene before jumping on a call with our tester (who had the devkit and was also inexperienced) and remotely profiling the game to find the root cause.
This turned out to be a memory overload. The amount of functional RAM/VRAM you have on a PS4 is actually pretty limited compared to a desktop PC. In our case, there were several things ramping it up and over the limit:
- Unity's static batching creating new combined meshes, which added to the memory cost
- Mipmaps likewise generating smaller copies of each texture, which take up extra memory
- Excessively high-resolution textures for simple patterns (we hadn't considered texel density at all on that project)
- High-memory textures used where they weren't needed, e.g. a visual effect driven by a 4K pure-white texture instead of a plain colour value (a rough memory estimate follows below)
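To put rough numbers on that last point, here's a back-of-the-envelope estimate of uncompressed texture memory. It's a sketch that assumes 4 bytes per RGBA pixel; real GPU formats differ, but the ratio is the point:

```java
public class TextureMemoryEstimate {
    // Uncompressed RGBA size of a texture, optionally with a full mip chain
    // (a complete chain adds roughly one third on top of the base level).
    static long bytesFor(int width, int height, boolean mipmaps) {
        long base = (long) width * height * 4; // 4 bytes per RGBA pixel
        return mipmaps ? base + base / 3 : base;
    }

    public static void main(String[] args) {
        // The 4K pure-white texture from the list above:
        System.out.printf("4096x4096 + mips: %d MiB%n",
                bytesFor(4096, 4096, true) / (1024 * 1024)); // ~85 MiB
        // Replacing it with a plain colour (or even a 4x4 texture) is negligible:
        System.out.printf("4x4 + mips: %d bytes%n", bytesFor(4, 4, true)); // 85 bytes
    }
}
```

A few dozen textures like that first one eat a meaningful share of the memory a PS4 actually leaves to the game.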
So now, while my knowledge has significantly improved with experience, I make use of profiling wherever possible to confirm what the problem actually is. As the saying goes: don't just mindlessly create solutions, identify and solve problems.
All libgdx-related:
- not working with the asset manager
- not working with an atlas for assets
- not referencing textures, but initializing every time
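For anyone unfamiliar, a minimal AssetManager setup looks roughly like this; the file path is made up, and drawing and error handling are omitted:

```java
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.assets.AssetManager;
import com.badlogic.gdx.graphics.Texture;

public class AssetExample extends ApplicationAdapter {
    private AssetManager assets;
    private Texture player;

    @Override
    public void create() {
        assets = new AssetManager();
        assets.load("sprites/player.png", Texture.class); // hypothetical path
        assets.finishLoading(); // blocking; call assets.update() per frame for async loading

        // Every get() for the same path returns the same Texture instance,
        // so nothing is decoded or uploaded to the GPU twice.
        player = assets.get("sprites/player.png", Texture.class);
    }

    @Override
    public void dispose() {
        assets.dispose(); // disposes everything the manager loaded, including 'player'
    }
}
```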
As a libgdx user myself:
> not referencing textures, but initializing every time
I'm fairly certain that if you use the asset manager, textures and other assets are passed by reference by default. I think this kind of kicked my butt at first with particles, since I then needed to do something with particle effect pools instead of just loading them as a plain ParticleEffect in the asset manager to begin with.
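Roughly what that ended up looking like, as a simplified sketch rather than my exact code (the effect path is invented; the manager hands out one shared ParticleEffect and the pool stamps out independent copies of it):

```java
import com.badlogic.gdx.assets.AssetManager;
import com.badlogic.gdx.graphics.g2d.ParticleEffect;
import com.badlogic.gdx.graphics.g2d.ParticleEffectPool;
import com.badlogic.gdx.graphics.g2d.ParticleEffectPool.PooledEffect;

public class ExplosionSpawner {
    private final ParticleEffectPool pool;

    public ExplosionSpawner(AssetManager assets) {
        // The shared prototype loaded through the manager is never played directly.
        ParticleEffect prototype = assets.get("effects/explosion.p", ParticleEffect.class);
        pool = new ParticleEffectPool(prototype, 8, 64); // initial capacity 8, max 64 pooled
    }

    public PooledEffect spawnAt(float x, float y) {
        PooledEffect effect = pool.obtain(); // an independent copy, safe to position and play
        effect.setPosition(x, y);
        effect.start();
        return effect; // call effect.free() once effect.isComplete() to recycle it
    }
}
```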
> not working with an atlas for assets
While I understand what an atlas is, I'm not sure how large an atlas I should use or try to get away with. At the moment I usually give every different 'object' its own sprite sheet.
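For reference, the TexturePacker in gdx-tools can make that call automatically: cap the page size and it spills extra sprites onto additional pages. A sketch of the offline packing step, with made-up directory and file names:

```java
import com.badlogic.gdx.tools.texturepacker.TexturePacker;
import com.badlogic.gdx.tools.texturepacker.TexturePacker.Settings;

public class PackSprites {
    public static void main(String[] args) {
        Settings settings = new Settings();
        settings.maxWidth = 2048;  // cap each atlas page at 2048x2048;
        settings.maxHeight = 2048; // anything that doesn't fit goes onto another page
        // Packs every image under raw-sprites/ into assets/game.atlas plus its page textures.
        TexturePacker.process(settings, "raw-sprites", "assets", "game");
    }
}
```

At runtime the single game.atlas can be loaded through the AssetManager as a TextureAtlas and individual sprites looked up with findRegion(...), so one or two pages replace the many per-object sheets.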