Hm. Finding the right level that’ll hide the problem but not cause other distortion sounds… less than fun.
The usual way to fix that is not to touch uvs, but to print out a new map from Max with enough edge padding.
That assumes that the maps are being generated by Max, and that the UVs aren’t sized to take up the entirety of the UV space…
You can output any map from max or, I suppose, blender. You just load the model and render to texture.
As others have said, the issue is that ship textures are by default set up to wrap/repeat. There's currently no control to change that - though it's not a bad idea; maybe I can add one when the HOD format transition is complete. The new compressed texture packaging can have an area for wrap/sampling flags.
Pulling back your UVs would work, and it isn't actually too tough - for each MIP level lost you're going to lose 2 pixels (then 4, then 8, etc.). The max MIP drop on backgrounds is actually 3 (very worst case) - so if you scale your backgrounds down 8 pixels on a side (16 total), centered in the existing area, and clone the 8 pixels that are left valid on each edge to the opposite edge (to counteract sampling bleed), you'll be golden. The UV math is pretty easy too:
8/4096 = 0.001953
So instead of 0 -> 1 you’d want 0.001953 -> 0.998047 - I’d even recommend just editing the DAE by hand at that point.
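That inset math is easy to script if you want to apply it to several backgrounds. A minimal sketch (assuming a 4096-pixel texture and the 8-pixel border described above):

```python
# Compute the UV range that keeps sampling inside the valid
# (non-cloned) region of the texture, per the inset described above.
def inset_uv_range(tex_size, pad):
    """Return (min, max) UV values for a `tex_size`-pixel texture
    with `pad` pixels sacrificed on each edge."""
    inset = pad / tex_size
    return inset, 1.0 - inset

lo, hi = inset_uv_range(4096, 8)
print(round(lo, 6), round(hi, 6))  # 0.001953 0.998047
```

With those values in hand, remapping the UVs in the DAE is just replacing 0/1 with lo/hi.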
Still, I very much recommend taking your existing Sky-Cube and Rendering/Baking it to our Example Sky-Sphere. It’s easy to do in Max (@scole - maybe ask E.G. for a blurb, if you don’t know?) - and would mean being able to use our Stars authoring tools once we’ve documented them - as well as much better Texture usage, uniform pixels on the primary axis, etc.
I know I didn’t have any wrapping/seams issues with the skybox I converted here.
@matththegeek I imagine that’s up to HerraTohtori’s wizardry coming in from the source files, à la that tutorial you linked me the other day with all the broken images?
@BitVenom I’ll definitely take all that under advisement. Right now my chief concern is a short pipeline between SpaceEngine and game to let as-fast-as-possible iterative tuning of lighting and bloom and similar values on both ends happen, so anything that needs baking or image editing will be back burner. I can knock together a script that shuffles and renames files pretty easily, but not one that does image edits in the process.
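For the curious, the shuffle/rename step is the kind of thing a few lines of Python handles. This is a hypothetical sketch only - the source names in `FACE_MAP` are placeholders, since the actual SpaceEngine export names and in-game texture names will differ:

```python
# Hypothetical rename/shuffle step for the SpaceEngine -> game pipeline.
# FACE_MAP entries are placeholder names, not real export filenames.
import shutil
from pathlib import Path

FACE_MAP = {
    "sky_pos_x.png": "background_RGT.png",
    "sky_neg_x.png": "background_LFT.png",
    # ... remaining cube faces would be mapped here ...
}

def shuffle_exports(src_dir, dst_dir):
    """Copy each exported face into dst_dir under its in-game name."""
    src, dst = Path(src_dir), Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    for src_name, dst_name in FACE_MAP.items():
        f = src / src_name
        if f.exists():
            shutil.copy2(f, dst / dst_name)
```

No image editing happens here - it only copies and renames, which keeps the iteration loop fast.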
I have some Dumb Ideas on mesh based work-arounds to hide the seams too. I’ll give them a spin sometime today probably.
You mentioned a -timerReload flag or something of the sort that periodically reloads the current background? Am I remembering that right? It seems like it’d be handy.
TODAY (aka very soon an update will come along that changes this)
you can do background reloads/cycles like this…
-BACK_SwapTimer=XXX – XXX is a number of frames
This will swap the backgrounds from M01->M16 over and over…
-BACK_Fixed=Background_Name - This changes the swap logic from cycling the HW2 backgrounds to just reloading one over and over… Great for tweaking lights and other tuning without a debugger
For more fun try:
This cycles the background between solid colors - so that a paused scene can be screen-capped - and using something like Photoshop’s ‘Difference’ blend you can generate a mask (with alpha/etc.) of scene vs. background (for composite art, etc.)…
I’ll post a proper thread once the update is out with the new (better!) command setup…
Alright. All my mesh hacks, themed generally around extending the faces past each other in some way or another, seem to have still left the seams, so I think I’m left with the box approach for prototyping and something fancier and bakey to get finished backgrounds. Alas.
Also, a random picture.
Some in-progress shots as I test a script I’m putting together to automate as much of this pipeline as possible. Those CLI args are very useful, BitVenom - thank you.
Now that I’ve more or less solved my sidetrack into automating this, I notice another problem with the import itself. I’m not sure if this is a HODOR or blender issue. @DKesserich, @BitVenom, I’ve uploaded a fresh version of the source files
In blender it is oriented how I expect it to be:
As far as I can tell nothing has transforms it shouldn’t, no rotations are in place anywhere, and I’m using the Homeworld Toolset Better Collada export. In game it ends up looking like this:
The galaxy appears below the POV, instead of above. It looks like it’s rotated 180 degrees along the long axis. Oddly, at least some of the lights seem to be placed where I want them - such as the lens flare, which is visible here precisely because it’s misaligned from its proper place in the background texture.
The lensflare should be placed roughly on the bright white star, but as seen here, isn’t.
Checking in Max gives results similar to Blender’s, but I don’t see any export options in the Blender exporter, or any stray rotations, that would account for this. Oddly, the flare joint looks like it’s lined up correctly in Max.
You’ve oriented your Blender viewport upside down. Blender is Z-Up, and your manipulator is clearly showing +Z pointing downwards. Realign your view with one of the numpad keys (numpad 1 for front view) and your galaxy will be on the bottom of the cube.
The light misalignment is probably because you also have to rotate everything -90 on the X axis and apply that rotation before exporting, since Homeworld is Y-Up. It’s weird that your background appears to be displaying with the correct Blender space orientation, though. I literally haven’t touched background generation at all, since ships still aren’t 100% there (dock paths coming soon!), so I don’t know what’s going on there.
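The -90° X rotation described above is equivalent to a simple axis remap from Blender's Z-up space to Homeworld's Y-up space. A sketch of just the math (not a Blender script):

```python
# Rotating a point -90 degrees about X maps Blender's Z-up
# coordinates into a Y-up convention: (x, y, z) -> (x, z, -y).
def z_up_to_y_up(x, y, z):
    return (x, z, -y)

# Blender's 'up' axis (+Z) lands on the Y-up 'up' axis (+Y):
print(z_up_to_y_up(0, 0, 1))  # (0, 1, 0)
```

Applying the rotation in Blender before export bakes this remap into the vertex data so the exporter doesn't have to.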
Well, I feel silly now. With that revelation, fixing it has been relatively trivial. Thanks.
The textured background mesh and the lights seem to be in the right places. It’s hard to say for absolute certain for lights, but after careful examination of the shadows I think they are there. The lens flare joint, on the other hand, is still off in the middle of nowhere.
The only other thing I can think of is that in the example_light.dae the Flare key is a child of the PRE_Space joint, and in yours it’s a child of ROOT_LOD, so the local coordinate systems might not be matching up right due to that.
My thoughts exactly… make it a child of your Skymesh so that the transforms are nested and all will be well.
I just located the issue, actually. Turns out I’d named the joint Flare_key and it needed to be Flare_Key. Whoops.
And with that licked and a few other issues ironed out, the most time consuming part of the following backgrounds was picking the camera position:
Tinkering on this some more today. I’d like to request a rundown of light attributes and what they do. Attenuation is apparently a possible value, but not one that seems like it’d make much sense in a homeworld lighting setup. Perhaps I’m misunderstanding what is being attenuated? In particular I’m curious if there’s any way to make a light less ‘sharp’ especially in terms of shadows without just using ambient lighting.
It’d also be useful to have an update on those command line flags, they don’t seem to be working the same after this last update, as @BitVenom warned.
Ummm, yeah, I can tackle this really fast
Command Line Flags:
-BACK_SwapTimer=XX - XX is seconds
-BACK_List=M01,EZ01,MyAmazingBackground,WetPonyGrotto
So, now instead of hard-coding the list of things to swap, you can define it - using comma-separated background names (no spaces!). The swap time is now seconds, NOT frames.
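Putting the two flags together, a launch line might look something like this (the executable name and background names are placeholders, not real assets):

```
HomeworldRM.exe -BACK_SwapTimer=10 -BACK_List=M01,EZ01,MyAmazingBackground
```

That would cycle through the three listed backgrounds every 10 seconds.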
- As for lighting…
The 2 directional lights you must have (along with an ambient) are just that, directional - they have zero attenuation. So, no, you can’t soften them with distance-based falloff - there’s no reasonable math for that. Shadows are where there’s no light, not where there’s ‘less’ light - so your only option is indeed to push up the Ambient (and consider your Environmental Mapping brightness / specular scale!).
Just to be clear - normally your Env-Map has a ‘sun’ baked into it (so that you can see it in reflections, etc). Your less-sharp version would have a more diffused version of the same object, as well. But when you have shadows, nothing looks dumber than your ‘sun’ reflecting in a shadow of that same sun. So the engine does shadow math, yes - but it also attenuates the Env-map based on how much towards the light being shadowed you’re seeing. Effectively that prevents ‘ghost’ reflections in shadows. It also has the side effect of making specular look much more complex near shadows than it really is. Unreal has a similar system (WAY more complex) called Specular Occlusion ( https://docs.unrealengine.com/latest/INT/Engine/Rendering/LightingAndShadows/DistanceFieldAmbientOcclusion/index.html )
One ‘less than ideal’ way to soften your shadows but still have some directional light would be to place your key and fill directionals very close - and split the difference between them a bit. Let your ambient just compensate for the off-axis lighting.
Overall (away from the 2 lights) you’d get less dimensionality (because the lighting would be uniformly dark before ambient) - but facing the directionals you’d have tons of control… Want softer shadows? Kick up the fill and subtract the same from key.
On the other hand, keeping your fill perpendicular (approx) to your key really helps hold the shapes of things. So it will be a tough thing to balance.
I did some basic experiments with extra static scene lights (which would give you key, fill, and something like fill2) - but it just wasn’t worth it, and the engine can be a real asshole at times… even after I’ve re-written nearly every line related to lighting/rendering.