LDraw.org Discussion Forums

Full Version: Really Big Colour Code: 117313504
I am working on an import feature for LDraw models, and when I tested it on the Front End Loader model that comes with the LDraw download (model 8464.mpd), I encountered a line with a very large colour code:

0 FILE 8464.ldr
0 Front End Loader
0 Version: 1.0a (20071123)
1 117313504 -125 -208 -400 0 0 1 0 1 0 -1 0 0 3941.DAT

In LDraw, the part looks like this:
[Image: eo0J5.png]

I thought it might be a blended colour, like the filespec would suggest, but the colour code is still too large for that. What is this colour, and how do I interpret it?
That color probably shouldn't really be in use in that model (which is an LDView sample model, created by Peter Bartfai, LDView's Qt version maintainer). However, it's a transparent direct dither color. If you can read C++ code, you can take a look at LDView's color number processing code in this file, starting at line 462. The code for 0x6xxxxxx color codes starts at line 525.

Here are the "special" colors that LDView handles, all of which have been used at some point in the past:

0x2RRGGBB: opaque RGB direct colors
0x3RRGGBB: transparent RGB direct colors
0x4RGBRGB: opaque RGB dither color (RGBRGB are two 12-bit colors)
0x5RGBxxx: transparent RGB dither (xxx is ignored and treated as fully transparent)
0x6xxxRGB: transparent RGB dither (xxx is ignored and treated as fully transparent)
0x7xxxxxx: invisible

Only 0x2RRGGBB opaque direct colors are officially supported in the spec, and they're only there because the LSC at the time wanted to allow them to be used in patterned parts. Personally, I think that the others should at least be mentioned, even if they continue to be disallowed in official parts. Note that for the 0x4, 0x5, 0x6, and 0x7 numbers above, the bottom two bits of the leading digit (the 4, 5, 6, or 7) can be thought of as ignore bits for the two components of the RGB dither. When a bit is set to ignore one of the components in the RGB dither, that component is treated as fully transparent.
One note, when you convert from 12-bit up to 24-bit in order to do the dithering, make sure to multiply each 4-bit component by 17, and not 16, because the maximum 4-bit value is 15, and 15 * 17 = 255.
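Putting the table and the 4-bit scaling note together, here is a minimal sketch (in C) of a decoder for these codes. The alpha value used for the transparent variants (128, i.e. roughly 50%) and the exact nibble positions read in the 0x5/0x6 cases are my assumptions based on the descriptions in this thread, not an official spec. For what it's worth, the 117313504 from the original post is 0x6FE0FE0 in hex, which this decoder reads as a transparent yellow.

```c
#include <stdint.h>

typedef struct { uint8_t r, g, b, a; int invisible; } Color;

/* Expand a 4-bit component to 8 bits: multiply by 17, not 16, so 0xF -> 0xFF. */
static uint8_t expand4(uint32_t c) { return (uint8_t)(c * 17); }

/* Decode the "special" 0x2xxxxxx..0x7xxxxxx codes listed above.
   Returns 1 if the code was one of them, 0 otherwise. */
int decodeSpecial(uint32_t code, Color *out)
{
    uint32_t top = (code >> 24) & 0xF;
    out->a = 255;
    out->invisible = 0;
    switch (top) {
    case 2: /* 0x2RRGGBB: opaque direct color */
        out->r = (code >> 16) & 0xFF;
        out->g = (code >> 8) & 0xFF;
        out->b = code & 0xFF;
        return 1;
    case 3: /* 0x3RRGGBB: transparent direct color (assumed ~50% alpha) */
        out->r = (code >> 16) & 0xFF;
        out->g = (code >> 8) & 0xFF;
        out->b = code & 0xFF;
        out->a = 128;
        return 1;
    case 4: { /* 0x4RGBRGB: average of two 12-bit colors */
        uint32_t hi = (code >> 12) & 0xFFF, lo = code & 0xFFF;
        out->r = (uint8_t)((expand4(hi >> 8) + expand4(lo >> 8)) / 2);
        out->g = (uint8_t)((expand4((hi >> 4) & 0xF) + expand4((lo >> 4) & 0xF)) / 2);
        out->b = (uint8_t)((expand4(hi & 0xF) + expand4(lo & 0xF)) / 2);
        return 1;
    }
    case 5: /* 0x5RGBxxx: first 12-bit color; second ignored (fully transparent) */
        out->r = expand4((code >> 20) & 0xF);
        out->g = expand4((code >> 16) & 0xF);
        out->b = expand4((code >> 12) & 0xF);
        out->a = 128;
        return 1;
    case 6: /* 0x6xxxRGB: second 12-bit color; first ignored (fully transparent) */
        out->r = expand4((code >> 8) & 0xF);
        out->g = expand4((code >> 4) & 0xF);
        out->b = expand4(code & 0xF);
        out->a = 128;
        return 1;
    case 7: /* 0x7xxxxxx: invisible */
        out->invisible = 1;
        out->a = 0;
        return 1;
    }
    return 0; /* not a special code */
}
```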
Thanks, I managed to get that working. I implemented everything that was in that section of the source file that you linked me, except for the first branch that checks for custom colours (getCustomColorInfo). Is that for when the "0 !COLOUR" metacommand is used somewhere other than in the config file?

If I have implemented all of the "special" colours that you listed, do I additionally have to worry about the Blended Colours, or is that subsumed by what I have already added?
The "custom colors" in question are ones defined by "0 !COLOUR" (generally in ldconfig.ldr). LDView doesn't do a good job with these, treating them all as global, no matter where they appear.

You still need to deal with "blended colors" (colors with numbers between 256 and 511). LDView does that in LDLPalette::initDitherColors(). This is done using hard-coded values for the first 16 colors, so redefining color 12 won't affect the blended colors based on color 12.
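If I understand the scheme correctly, a blended color n in 256..511 averages two of the first 16 palette colors, selected by the two nibbles of n - 256. Here is a hedged sketch; the index pairing is my reading of the usual description of these codes, and the actual hard-coded palette lives in LDView's LDLPalette::initDitherColors(), so the base table is passed in rather than invented here.

```c
#include <stdint.h>

/* Blend color code 256..511: (code - 256) >> 4 and (code - 256) & 0xF pick
   two of the first 16 colors; each channel is averaged. base[] holds the
   16 base colors as 0xRRGGBB values (e.g. from ldconfig.ldr). */
uint32_t blendColor(int code, const uint32_t base[16])
{
    int idx = code - 256;
    uint32_t c1 = base[(idx >> 4) & 0xF];
    uint32_t c2 = base[idx & 0xF];
    uint32_t r = (((c1 >> 16) & 0xFF) + ((c2 >> 16) & 0xFF)) / 2;
    uint32_t g = (((c1 >> 8) & 0xFF) + ((c2 >> 8) & 0xFF)) / 2;
    uint32_t b = ((c1 & 0xFF) + (c2 & 0xFF)) / 2;
    return (r << 16) | (g << 8) | b;
}
```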
This has historically proven contentious, but in my opinion (and as the spec is currently worded), you don't need to worry about either dithered colors or any direct colors other than ones beginning with 0x2. Color dithering was an extremely messy solution to a technical limitation which no longer exists. Nowadays, the dithered color range conflicts with LDConfig !COLOUR definitions, so if you support dithers, make sure to check for an LDConfig code first, as they have precedence.

Also, direct colors based on dither algorithms were never standard. They were passed around as obscure tribal knowledge. I seem to have missed 0x6 and 0x7 myself, as my code doesn't even mention them.

If you really must support dithers, my C code to calculate RGB for dithered colorcodes is here, on Line 273. You even get bonus color commentary with the code.

I would suggest at least supporting the 0x3 ones, which are just 50% transparent versions of the 0x2 ones. IIRC, the only reason that the 0x2 ones made it into the spec was because people were clamoring for RGB colors for use in official patterns. I think I pushed for inclusion of the 0x3 ones also but was overruled (although I could be mistaken about all that).
I agree - I'd like to see the 0x3..0x7 encodings documented somewhere - perhaps as an 'informative' section in the official specs? We co-opted LDLite's syntax, so we should acknowledge this.

Having attempted to implement support for these codes myself, I also had to do a lot of digging to find a useable spec - and it doesn't help that of the two main places on the net which do document it, one of them is completely and utterly wrong!
Which one is "completely and utterly wrong"?

Please note it here, so we can keep an eye on that document.
Ah-ha! It's been fixed since I first found it :-) As you were...
Ok, I forgot about that, because it was too weird ;-)
Okay, I think I've put in all the support for colours that needs to be there now. Thanks for the help. :-)
Travis Cobbs Wrote:Here are the "special" colors that LDView handles, all of which have been used at some point in the past:

0x2RRGGBB: opaque RGB direct colors
0x3RRGGBB: transparent RGB direct colors
0x4RGBRGB: opaque RGB dither color (RGBRGB are two 12-bit colors)
0x5RGBxxx: transparent RGB dither (xxx is ignored and treated as fully transparent)
0x6xxxRGB: transparent RGB dither (xxx is ignored and treated as fully transparent)
0x7xxxxxx: invisible

Does anyone know the official rules (as I can find none) for the edge colors of these codes? I've been using the color's negative in LDCad, but for some (transparent) colors that's very ugly.

I'm thinking of changing it to 90 or 110% of the original or something, but was wondering if there are rules for this hidden somewhere. I don't really want to use static black, though (unless the 'use static edge colors' option is enabled).
Maybe I'm not the best person to answer, but I also can't remember such a question having been answered before today. Maybe this is a question best answered by Scott Wardlaw.
LDView uses LDraw color 0 as the edge color number for all custom colors. This obviously isn't ideal, and I'm not sure if it's standardized anywhere or not.
I derive a color algorithmically. I actually prefer this method over trying to define complement colors by hand.

//========== complimentColor() =================================================
// Purpose:     Changes the given RGBA color into a "complimentary" color, which
//              stands out from the original color, but maintains the same hue.
void complimentColor(const GLfloat *originalColor, GLfloat *complimentColor)
{
    // Isolate the color's grayscale intensity http://en.wikipedia.org/wiki/Grayscale
    float brightness =  originalColor[0] * 0.30
                    +   originalColor[1] * 0.59
                    +   originalColor[2] * 0.11;

    // Complement dark colors with light ones and light colors with dark ones.
    if (brightness > 0.5)
    {
        // Darken
        complimentColor[0] = MAX(originalColor[0] - 0.40,   0.0);
        complimentColor[1] = MAX(originalColor[1] - 0.40,   0.0);
        complimentColor[2] = MAX(originalColor[2] - 0.40,   0.0);
    }
    else
    {
        // Lighten
        complimentColor[0] = MIN(originalColor[0] + 0.40,   1.0);
        complimentColor[1] = MIN(originalColor[1] + 0.40,   1.0);
        complimentColor[2] = MIN(originalColor[2] + 0.40,   1.0);
    }
    complimentColor[3] = originalColor[3];
}//end complimentColor
I agree that something like this is probably much better than what I'm doing in LDView. Having said that, I don't think adding (or subtracting) a constant value to the RGB components of the color and then clamping is the best way to do it. It seems that would change the hue noticeably.

For example, suppose that the input color was #FF8040 (kind of tangerine orange). Your algorithm would produce #991A00, which is more of a dark red. Multiplying each component by 0.6 would produce #994D26, which is brown. Generally, "dark orange" means "brown", so I think this is better.

Going the other way (creating a light complement for a dark color) is a little more work, since you have to pull each component towards 1.0 by 0.6 rather than simply scaling it, but it's still fairly simple. In my opinion, the result is a better match for the original color.
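A sketch of that multiplicative variant, as I read the suggestion (this is not LDView's actual code): darken light colors by scaling toward 0.0, and lighten dark ones by pulling each component toward 1.0 by the same factor. Components are floats in 0.0..1.0.

```c
/* Multiplicative complement: for light colors, scale components toward 0.0;
   for dark colors, pull each component toward 1.0 by the same factor.
   This preserves the hue better than adding/subtracting a constant. */
void scaledComplement(const float in[4], float out[4])
{
    /* Grayscale intensity, same weights as the additive version above. */
    float brightness = in[0] * 0.30f + in[1] * 0.59f + in[2] * 0.11f;
    for (int i = 0; i < 3; i++) {
        if (brightness > 0.5f)
            out[i] = in[i] * 0.6f;                   /* darken toward 0.0 */
        else
            out[i] = in[i] + (1.0f - in[i]) * 0.6f;  /* lighten toward 1.0 */
    }
    out[3] = in[3]; /* keep alpha */
}
```

With the tangerine #FF8040 from the example above, this produces the brown #994D26 rather than the dark red the additive version gives.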
Given that the default ldconfig.ldr has mostly black edge lines, I'm tempted to add this algorithm to LDView, and then offer it as an option for all colors (in addition to automatically using it for direct colors). Of course, it would be mutually exclusive with my existing "black edge" option.
Yeah, I didn't think black edge lines everywhere looked very good. That's why I wrote it to begin with.

I have no idea why that code makes an additive change. I noticed that was weird when I posted it. However, I didn't have time to test the alternatives this afternoon, and I know it isn't producing egregiously garish output. I'm really not sure why it isn't converting to HSV, changing the brightness, then converting back to RGB. That would accurately preserve the hue.
Some time ago I developed my own algorithm to generate edge colors, and applied it to all the colors in LDConfig. I like the results, but that's just a personal preference.

Rather than basing a calculation on the RGB values, I convert them to HSL (Hue, Saturation, Luminance) and transform that instead. (Conveniently, Windows provides two functions ColorRGBToHLS and ColorHLSToRGB, even if they did swap two of the parameters around!) This allows me to keep the hue unchanged, which I think was one of the observations made earlier in this thread.

To calculate the edge color:
  • Use the luminance value to categorize the color as "dark", "dark-ish", "light-ish" or "light" (respectively <85, <112, <160, and >=160).
  • Adjust the luminance according to which band it falls in:
    "dark" + 90
    "dark-ish" + 45
    "light-ish" -45
    "light" - 90
  • Halve the saturation
I have this built into a (not very pretty!) Excel spreadsheet where I can just paste the contents of LDConfig into the front sheet, and it spits out the recalculated file at the back. If anybody wants to have a play with it -- and even improve it! -- I'll try and tidy it up a bit and post a copy on here.

Martin James
Thanks everybody, based on the above I'm going with this for the time being:

edgeCol.inc(mainCol.getBrightness()>0.5 ? -0.6 : 0.6);

I'll be using this only for decoded colors and to fill in missing edge info from COLOUR tags. At some point later on I'll probably add an option for globally calculated edge colors using an HSL transformation like Allen and Martin suggest / use.


Playing some more made me change the above to:

void calcEdgeFromMain() { edgeCol.a=1.0; edgeCol.setIncRGB(mainCol, mainCol.getBrightness()>0.5 ? -0.5 : 0.5); }

I changed it because I like that a bit more. I've also tried Travis' suggestion (scaling by 40% up or down):

void calcEdgeFromMain() { edgeCol.a=1.0; edgeCol.setScaledRGB(mainCol, mainCol.getBrightness()>0.5 ? 0.6 : 1.4); }

But I found the contrast isn't high enough, resulting in disappearing edges at certain angles (caused by shading). The other methods seem less affected by that effect; maybe that's the reason Allen chose 0.4 (although I agree with Travis on it being too red in his example, hence my using 0.5).

Using "rgb2hsl -> adjust -> hsl2rgb" will probably be the best solution, but I haven't tried that yet.

(Note: the called functions clamp RGB values to 0.0 .. 1.0.)