labview >> How do I convert 16 bit Binary to ASCII text

by jdam » Wed, 21 Jun 2006 02:10:10 GMT

I have a 16-bit binary data string (e.g. 0110000110111000) that I need to convert to ASCII text using the following format.
 
Note: The name is represented by the character code set defined in ANSI X3.4 (standard 7-bit ASCII). The first bit of each character field is set to logic zero, and the 7-bit ASCII code occupies the remaining seven bits of the field.
 
Field name     Bit No             Description
Reserved       -00- 0             *Note
Character 1    -01-A              MSB
               -02-A ... -06-A
               -07-A              LSB
Reserved       -08-A              *Note
Character 2    -09-A              MSB
               -10-A ... -14-A
               -15-A              LSB
 
Is there a simple method I'm missing?
 
 
 

labview >> How do I convert 16 bit Binary to ASCII text

by altenbach » Wed, 21 Jun 2006 05:10:09 GMT



I noticed that the first bit of the second character field is not zero, as your definition says it should be.


The way I interpreted the problem is that the code should zero out the first bit. You can do that by ANDing it with an appropriate integer. The code in the attached image shows one possibility. There are many ways to do this.


binaryscan.png:
http://forums.ni.com/attachments/ni/170/191007/1/binaryscan.png
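
For readers who cannot open the attachment, here is a rough Python sketch of the same masking idea (an illustration only, not the contents of the attached VI; the function name is made up):

    # Illustrative sketch -- not the attached LabVIEW code.
    def bits16_to_ascii(bit_string):
        """Turn a 16-character '0'/'1' string into two 7-bit ASCII characters."""
        word = int(bit_string, 2)      # "0110000110111000" -> 0x61B8
        word &= 0x7F7F                 # ANDing forces both reserved (high) bits to zero
        return chr(word >> 8) + chr(word & 0xFF)

    print(bits16_to_ascii("0110000110111000"))   # -> "a8"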

labview >> How do I convert 16 bit Binary to ASCII text

by altenbach » Wed, 21 Jun 2006 05:40:07 GMT


I have a 16-bit binary data string (e.g. 0110000110111000) that I need to convert to ASCII text using the following format.


Of course, we need to know exactly what your "16 bit binary" data string actually looks like. Your question is a bit ambiguous.

- Is it a formatted ASCII string, 16 bytes long, containing the characters 1 and 0?

- Is it a raw string, 2 bytes long, directly representing the two bytes?

- ...?

So far, we have assumed case (1). Case (2) would be trivial. ;)
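
Just to illustrate the difference, a minimal Python sketch of case (2), assuming the two raw bytes only need their high (reserved) bits cleared:

    def raw_to_ascii(raw_bytes):
        """Case (2): the bytes are already the characters; just clear the high bit of each."""
        return "".join(chr(b & 0x7F) for b in raw_bytes)

    print(raw_to_ascii(b"\x61\xb8"))   # -> "a8"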

labview >> How do I convert 16 bit Binary to ASCII text

by jdam » Wed, 21 Jun 2006 05:40:09 GMT

Thank you both for your (quick!) assistance with providing a solution to the problem.
The data is presented as a hex string of 16-bit words. Eight words (16 ASCII characters) comprise the target name in the data. Sometimes the data transmission gets corrupted, producing an invalid word, which was the case with my original post (might have just been a fat finger on the keyboard).
Here are the 8 words that comprise the following message:   KNID/T192023
19278, 18756, 12116, 12601, 12848, 12851, 8224, 8224
 
 
 
 
 

labview >> How do I convert 16 bit Binary to ASCII text

by tbob » Wed, 21 Jun 2006 06:40:06 GMT


Here are 8 words that comprise the following message   KNID/T192023
19278, 18756, 12116, 12601, 12848, 12851, 8224, 8224



Your 8 words should give 16 characters.  The last four must be spaces, because there are only 12 characters in KNID/T192023.  Also, you stated that the data is in hex, but your 8 words are obviously in decimal.  Anyway, here is a VI to decipher the words.
Once again Altenbach has to come up with a simpler way to do things.  Maybe I should send him all my work for review. :smileywink:


DecimalStringsToAsciiChars.vi:
http://forums.ni.com/attachments/ni/170/191024/1/DecimalStringsToAsciiChars.vi
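
The attached VI cannot be reproduced inline, but the logic it implements amounts to splitting each 16-bit word into its high and low bytes. A rough Python equivalent (an illustration, not the attached code):

    words = [19278, 18756, 12116, 12601, 12848, 12851, 8224, 8224]

    def words_to_ascii(words):
        """Each 16-bit word holds two ASCII characters: the high byte first, then the low byte."""
        chars = []
        for w in words:
            chars.append(chr((w >> 8) & 0x7F))   # character 1, reserved bit masked off
            chars.append(chr(w & 0x7F))          # character 2
        return "".join(chars)

    print(repr(words_to_ascii(words)))   # 'KNID/T192023    '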

labview >> How do I convert 16 bit Binary to ASCII text

by jdam » Thu, 22 Jun 2006 00:10:07 GMT

OK,
tbob and altenbach - You guys make this stuff seem too easy.
Let me pose a new problem. I've already devised a solution, but I'd like to see how either of you would approach it.
The simplest solution wins a prize! I'll send two product patches for the solution I like the best. Good luck!
This is a signed 2's complement problem describing position (north is positive, south is negative). The input is a 2-word array, received as hex strings.
The bits are laid out with MSB at the left.
MSB = 2^-1, LSB = 2^-31, Max/Min Value is +/- 5.0000E-01, Resolution 4.6566E-10
2-word message:
Field name   Bit No           Description
MSW          -00- S           Sign
             -01- N           MSB
             -02- N ... -15- N
LSW          -00- N ... -14- N
             -15- N           LSB

labview >> How do I convert 16 bit Binary to ASCII text

by altenbach » Thu, 22 Jun 2006 00:10:12 GMT

Hmm, so what needs to be done???
So far we know how the data is represented. What kind of input do you have and what kind of output do you want?
(Better yet, just give us an example VI with typical data set as default in a control of the desired type, plus a description of how the output should look.)

labview >> How do I convert 16 bit Binary to ASCII text

by jdam » Thu, 22 Jun 2006 01:40:07 GMT

I don't know?
What would your solution be for
word 1 = 624
word 2 = 2448
Max/Min Value +/- 1.0240E+03
resolution 3.8147E-06
MSB = 2^12
LSB = 2^-18
Full scale of the data 8.1920E+03

labview >> How do I convert 16 bit Binary to ASCII text

by altenbach » Thu, 22 Jun 2006 05:10:08 GMT

Your problem is way overdetermined; the only thing you need is log2(MSB) (-18 in this case), which gives the resolution.
 
(I don't know why you need the "min/max" and "full scale" and all that. Do you need some additional scaling?)
 
Anyway, the attached VI (LabVIEW 7.1) is one possibility. You should really check it with some inputs that yield a negative number, though.
 
 
My result for your sample inputs (624, 2448) is about 156.


16bitWordsToScaledDBL.png:
http://forums.ni.com/attachments/ni/170/191298/3/16bitWordsToScaledDBL.png


16bitWordsToScaledDBL.vi:
http://forums.ni.com/attachments/ni/170/191298/4/16bitWordsToScaledDBL.vi
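
The attached image and VI are not reproducible here, but the arithmetic described above boils down to joining the two words into one signed 32-bit integer and multiplying by the LSB weight. A rough Python sketch of that idea (an illustration under that assumption, not the attached code):

    import struct

    def words_to_scaled(msw, lsw, lsb_exponent):
        """Join MSW and LSW into a signed 32-bit integer, then scale by 2**lsb_exponent."""
        raw = ((msw & 0xFFFF) << 16) | (lsw & 0xFFFF)
        signed = struct.unpack(">i", struct.pack(">I", raw))[0]   # reinterpret as 2's complement
        return signed * 2.0 ** lsb_exponent

    print(words_to_scaled(624, 2448, -18))   # ~156.01, matching the result quoted above
    print(words_to_scaled(624, 2448, -31))   # the earlier position format would presumably use LSB = 2**-31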

labview >> How do I convert 16 bit Binary to ASCII text

by altenbach » Thu, 22 Jun 2006 05:10:11 GMT


.. log2(MSB) (-18 in this case)...

Log2(LSB) of course. Sorry for the typo. ;)

labview >> How do I convert 16 bit Binary to ASCII text

by jdam » Fri, 23 Jun 2006 05:10:07 GMT

Altenbach - You are five stars in my book.
I came across a new problem that I was wondering if you could help me simplify.
Same type of data, a 16-bit word:
Field name   Bit No           Description
Integer      -00- S           Sign bit
             -01- N           MSB
             -02- N ... -10- N
             -11- N           LSB
Exponent     -12- E           MSB
             -13- E ... -14- E
             -15- E           LSB

Integer:  Resolution 1, MSB 2^10, LSB 2^0
Exponent: Resolution 1, MSB 2^3, LSB 2^0
Is there a simple way of handling this problem?  
 

labview >> How do I convert 16 bit Binary to ASCII text

by altenbach » Fri, 23 Jun 2006 07:10:15 GMT

Since the LSB of the mantissa is 1 and the exponent is always positive, the result is always an integer and will fit into an I32.
 
I also assume that the exponent is meant to be base 2, i.e. the desired result is (mantissa x 2^exponent). If the definition is different, modify accordingly.
 
Here is a simple possibility for converting to I32 according to the given rules; see if it gives the right answer for you. Sorry, I cannot test this, so no guarantee that it's right. ;)
 


16bitWordToI32.png:
http://forums.ni.com/attachments/ni/170/191628/1/16bitWordToI32.png


16bitWordToI32.vi:
http://forums.ni.com/attachments/ni/170/191628/2/16bitWordToI32.vi
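
The attachment itself cannot be shown here, but a text sketch of the same logic in Python, using the base-2 reading assumed above (my reading of the field table: the integer is the top 12 bits, the exponent the low nibble):

    def word_to_i32(word):
        """Bits 00-11 (top 12 bits) are a 2's-complement integer; bits 12-15 (low nibble) are the exponent."""
        integer = (word >> 4) & 0x0FFF
        if integer & 0x800:            # sign bit of the integer field set -> negative value
            integer -= 0x1000
        exponent = word & 0x000F
        return integer * 2 ** exponent

    print(word_to_i32(0x0023))   # integer 2, exponent 3 -> 16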

labview >> How do I convert 16 bit Binary to ASCII text

by jdam » Sat, 24 Jun 2006 01:10:07 GMT

Here are the details I have for this problem:
The value of the data word represented by this format is given by: (Integer) x (16^Exponent)
The Exponent is 2's complement with max and min values of 15 and 0, respectively.
The full scale of this data word is 2.36E+21, with max and min values for the Integer of 2047 and -2048, respectively.
2047*16^15 = 2.36E+21
Thank you again for showing me the direct route rather than working on the bit level.
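
Under that definition, the only change to the sketch above would be the base. A minimal, untested adjustment (treating the exponent as the low nibble with values 0 to 15, as stated):

    def word_to_value(word):
        """Same field split as before, but with the base-16 definition: integer * 16**exponent."""
        integer = (word >> 4) & 0x0FFF
        if integer & 0x800:
            integer -= 0x1000
        exponent = word & 0x000F
        return integer * 16 ** exponent

    print(word_to_value(0x7FFF))   # integer 2047, exponent 15 -> about 2.36E+21, the stated full scale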
 
