
Re: [tlug] issues with format of double (or IEEE754)



Hi giggling high school girl, 

> Hello :)

> I apologize if this is not quite an "only Linux-related question".
> Still, I believe there is some relation to Linux, since I do all
> the described actions using Debian testing

Your question is most definitely Linux related. 

> on an i386 machine...

ISA bus? 

> I have a problem with getting a double precision number from my digital
> nanovoltmeter (keithley 2182A). 
> 
> ... GPIB interface ... 10-byte data in the format:
> (ascii header)#0 (just 64 bits) 1110010.....
> 
> So after two characters, 64 zeros and ones come. 
> The latter are organised such that the first 11 bits are "exponent bits"
> followed by 52 "fraction bits", conforming to the IEEE754 format.

Maybe your meter follows IEEE754. Maybe it does not. 

Regardless of documentation, 
I would not assume that the IEEE754 format is followed. 
I would not be surprised if there were small differences. 

> So I use a function ibrd() [2] to read this data in a buffer defined in
> this way:
> char result_normal_plus[17]="00000000000000000";

If that is global, keep in mind that result_normal_plus will be 
initialized only _once_. 
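
A minimal illustration of that difference (nothing to do with your 
meter, just C initialization rules): 

    void f(void)
    {
        static char once[8]  = "0000000"; /* initialized once, at program start */
        char        every[8] = "0000000"; /* re-initialized on each call to f() */

        once[0]  = 'x';   /* the 'x' is still there on the next call */
        every[0] = 'x';   /* gone again the next time f() runs       */
    }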

> ibrd(K2182Ud,result_normal_plus,10);

What do you do with the other seven bytes of result_normal_plus[]? 
(That is a side issue, probably unrelated. It just looks weird.)

> result_normal_plus[ibcnt]='\0';

I am confused about which data is binary and which data is ASCII. 
I get the hazy impression that ibrd() does binary, 
so seeing result_normal_plus[ibcnt]='\0' looks odd. 

> presult= result_normal_plus+2;
> 
> where presult is a pointer to double, 
> e.g. defined as: double *presult;

Maybe double* follows IEEE754. Maybe it does not. 

It most likely follows the internal representation of the '387, 
even if you don't have a '387 math chip. I don't expect Intel 
to guarantee IEEE754 compliance, especially back then. 

You are braver than I am; I don't expect standards to be followed. 
Do you have the 80387 coprocessor? 
Regardless of whether you have a '387 or not, 
the double* trick is where I would put my greatest suspicion. 
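
If you stay with the binary path, here is a sketch of the same 
reinterpretation done with memcpy() instead of dereferencing 
(double *)(result_normal_plus + 2). It avoids alignment trouble, 
but it still assumes the wire bytes are an IEEE754 double in the 
host's byte order, which is exactly the assumption to verify: 

    #include <string.h>

    /* Sketch only: copy the 8 data bytes that follow the 2-byte
       header into a double, instead of aliasing through a double*. */
    double reply_to_double(const char *buf)
    {
        double d;
        memcpy(&d, buf + 2, sizeof d);
        return d;
    }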

> The problem is that if I print the number:
> 
> printf("*presult: %.12f \n",*presult);
> 
> I always get a number which is 4 orders of magnitude lower. For example:
> 0.00002 instead of 0.2 

My gut feeling is that Intel did not use IEEE754 format. 

> Can anyone help me to understand why it is so? 

Simple: different things are using different formats. 

Where is the problem? Divide and conquer. 
Does the meter output in the format you expect? How do you know? 
Does double* use the format you expect? How do you know? 

Print out, in hexadecimal (or even binary!), 
the data returned by ibrd(). 
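
Something along these lines (a sketch; result_normal_plus and ibcnt 
come from your program, the rest is illustration): 

    #include <stdio.h>

    /* Dump every byte that came back, so you can compare it against
       the meter's manual, bit by bit. */
    void dump_reply(const unsigned char *buf, int n)
    {
        int i;
        for (i = 0; i < n; i++)
            printf("%02x ", buf[i]);
        printf("\n");
    }

Call it right after ibrd(), as in 
dump_reply((unsigned char *)result_normal_plus, ibcnt); 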

Parse it by hand. Calculate what you believe the numbers to be. 
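
If the bytes really are IEEE754, the hand parse looks roughly like 
this (a sketch for normal numbers only, most significant byte first; 
reverse the loop if the meter sends the other byte order; link with 
-lm): 

    #include <stdint.h>
    #include <math.h>

    double decode_be(const unsigned char *b)
    {
        uint64_t bits = 0, frac;
        int i, sign, exp;
        double m;

        for (i = 0; i < 8; i++)
            bits = (bits << 8) | b[i];

        sign = (int)(bits >> 63);            /* 1 sign bit       */
        exp  = (int)((bits >> 52) & 0x7ff);  /* 11 exponent bits */
        frac = bits & 0x000fffffffffffffULL; /* 52 fraction bits */

        m = 1.0 + (double)frac / 4503599627370496.0;  /* 2^52 */
        return (sign ? -m : m) * ldexp(1.0, exp - 1023);
    }

Compare its output, for both byte orders, with what the meter's 
display shows. 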

> Or maybe suggest a more clever [3] way of getting the result from
> nanovoltmeter? 

Not clever, just tedious. 

> The compilation is done with gcc (GCC) 4.0.4 20060507 (prerelease)
> (Debian 4.0.3-3), 
> using -pg -Wall flags. 
> (-pg is for profiler info)

This should not matter. There might be some gcc flag about 
IEEE compatibility, but I would look elsewhere first. 
> 
> For clarity, I once more rewrite the structure of my program:

Your program is not clear. Too much magic. 
> 
> ...
> double *presult;
> char result_normal_plus[17]="00000000000000000";

One-time initialization? 
Where does 17 come from? Note that the 17-character literal exactly 
fills char[17], leaving no room for a terminating '\0'. 

> ...
> ibrd(K2182Ud,result_normal_plus,10);
> result_normal_plus[ibcnt]='\0';

Mixing ASCII and binary is always suspect. 
(BTW, where does ibcnt come from?)

> presult= result_normal_plus+2;
> printf("*presult: %.12f \n",*presult);

> [3] I have implemented reading in ASCII format and then converting with
> the atof() function, but this I prefer to avoid, since 

First make it work correctly, then make it fast. 

Did reading in ASCII, then converting with atof() work correctly? 

> atof() takes more
> time than reading straight in double format (or at least I think so)

How much time does it take to do it _correctly_? 
Is ASCII format plus atof() the fastest way that works _correctly_?
Avoid premature optimization. 

Why do you care about speed? How much speed is enough? 
What are your speed goals? Until you have actual need for speed, 
just worry about correctness and making your program easy to understand. 
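
If speed ever does become a real question, measure before changing 
anything. A rough sketch (the sample string is made up): 

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Time a million atof() calls, then compare the total with how
       fast the meter can deliver readings in the first place. */
    int main(void)
    {
        const char *sample = "+2.000000E-01";
        double sum = 0.0;
        long i;
        clock_t t0, t1;

        t0 = clock();
        for (i = 0; i < 1000000L; i++)
            sum += atof(sample);
        t1 = clock();

        printf("%.3f s for 1e6 calls (sum=%f)\n",
               (double)(t1 - t0) / CLOCKS_PER_SEC, sum);
        return 0;
    }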


