Bugz

Jun. 21st, 2013 02:53 pm
There is a strange bug I have been investigating for several months already. The customer's original code was huge, but today I finally condensed it to just a few lines, trivial enough to publish in my personal blog without upsetting the customer.

#include <stdio.h>
#include <unistd.h>

static float floatvar; // can be double, same s#$t
static int intvar;

// Read the CPU's time stamp counter (x86, GCC/Clang inline asm).
static inline unsigned long rdtsc(void)
{
    unsigned int lo, hi;
    __asm__ __volatile__ ("rdtsc" : "=a"(lo), "=d"(hi));
    return ((unsigned long)hi << 32) | lo;
}

void test_func()
{
  unsigned long before, timing, i, min, max;

  intvar = 2;
  floatvar = 3000000000.0f;
  min = 10000; max = 0;

  for (i = 0; i < 600000000; i++)
  {
      before = rdtsc();
      intvar = 2 + floatvar;   // the floating point add under test
      timing = rdtsc() - before;

      if (i > 100) {           // skip the first iterations (warm-up)
         if (min > timing) min = timing;
         if (max < timing) max = timing;
      }

      usleep (100); // Whole run is ~1 minute.
   }
   printf ("Min = %lu, Max = %lu\n", min, max);
}

What do you think it will print? (The test runs in an RTOS, and all sources of jitter (SMIs, power management, etc.) are contained.)
It prints Min = 388, Max = 3400! From 388 to 3400 CPU cycles for a single floating point add!!! (Regardless of what kind of code the C compiler generated: scalar SSE or x87.)

It is reproducible on all kinds of x86 boxes (I have plenty in the lab). If I plot a timing histogram instead of just collecting min and max values, the resulting graph is totally random...

Luckily for most s/w developers, the system has to be in a very specific state for this issue to happen. Unfortunately I am not able to disclose that state here, but it occurs extremely rarely. Curiously enough, there is another way to fix it besides not entering that specific state: just change the line

floatvar = 3000000000.0f;

to

floatvar = 2000000000.0f;

:-) IEEE754, go figure!
