SQL Anywhere 11.0.1 » SQL Anywhere Server - SQL Reference » Using SQL » SQL data types » Data type conversions

### Converting bit arrays

###### Converting integers to bit arrays

When converting an integer to a bit array, the length of the bit array is the number of bits in the integer type, and the bit array's value is the integer's binary representation. The most significant bit of the integer becomes the first bit of the array.

###### Examples

`SELECT CAST( CAST( 1 AS BIT ) AS VARBIT )` returns a VARBIT(1) containing 1.

`SELECT CAST( CAST( 8 AS TINYINT ) AS VARBIT )` returns a VARBIT(8) containing 00001000.

`SELECT CAST( CAST( 194 AS INTEGER ) AS VARBIT )` returns a VARBIT(32) containing 00000000000000000000000011000010.
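The rule above can be modeled outside the database. This is a minimal Python sketch of the documented conversion (not SQL Anywhere code): the result has one bit per bit of the integer type, most significant bit first.

```python
def int_to_varbit(value, bits):
    """Model the documented INTEGER -> VARBIT rule: the bit array is the
    binary representation of the integer, most significant bit first,
    with one bit per bit of the integer type."""
    return format(value, "0{}b".format(bits))

# Mirrors the SQL examples above.
print(int_to_varbit(1, 1))     # '1'        (BIT -> VARBIT(1))
print(int_to_varbit(8, 8))     # '00001000' (TINYINT -> VARBIT(8))
print(int_to_varbit(194, 32))  # '00000000000000000000000011000010'
```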

###### Converting binary to bit arrays

When converting a binary type of length n to a bit array, the length of the array is n * 8 bits. The first byte of the binary value becomes the first 8 bits of the bit array, with the most significant bit of the binary value becoming the first bit of the array. The second byte of the binary value becomes the next 8 bits of the bit array, and so on.

###### Examples

`SELECT CAST( 0x8181 AS VARBIT )` returns a VARBIT(16) containing 1000000110000001.
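The byte-by-byte rule can be sketched in Python (this models the documented semantics, not SQL Anywhere itself): each byte contributes 8 bits, most significant bit first, in byte order.

```python
def binary_to_varbit(data):
    """Model the documented BINARY -> VARBIT rule: each byte of the
    binary value yields 8 bits, most significant bit first."""
    return "".join(format(b, "08b") for b in data)

# Mirrors the SQL example above: 0x8181 -> VARBIT(16).
print(binary_to_varbit(bytes([0x81, 0x81])))  # '1000000110000001'
```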

###### Converting characters to bit arrays

When converting a character data type of length n to a bit array, the length of the array is n bits. Each character must be either '0' or '1', and the corresponding bit of the array is assigned the value 0 or 1.

###### Example

`SELECT CAST( '001100' AS VARBIT )` returns a VARBIT(6) containing 001100.
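A Python sketch of the documented rule (the raised exception is an assumption used to model the "must be '0' or '1'" requirement; SQL Anywhere reports its own conversion error):

```python
def char_to_varbit(s):
    """Model the documented CHAR -> VARBIT rule: each character must be
    '0' or '1' and maps to the corresponding bit of the array."""
    if any(c not in "01" for c in s):
        # Assumption: model the invalid-character case as an exception.
        raise ValueError("string must contain only '0' and '1'")
    return s

print(char_to_varbit("001100"))  # '001100'
```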

###### Converting bit arrays to integers

When converting a bit array to an integer data type, the bit array's binary value is interpreted according to the storage format of the integer type, with the first bit of the array treated as the most significant bit.

###### Example

`SELECT CAST( CAST( '11000010' AS VARBIT ) AS INTEGER )` returns 194 (binary 11000010 = 0xC2 = 194).
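The inverse conversion is a straightforward base-2 read; a minimal Python sketch of the documented rule (not SQL Anywhere code):

```python
def varbit_to_int(bits):
    """Model the documented VARBIT -> INTEGER rule: the bit string is
    read as a binary number, first bit as the most significant bit."""
    return int(bits, 2)

# Mirrors the SQL example above.
print(varbit_to_int("11000010"))  # 194 (0xC2)
```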

###### Converting bit arrays to binary

When converting a bit array to a binary type, the first 8 bits of the array become the first byte of the binary value. The first bit of the array becomes the most significant bit of the binary value. The next 8 bits are used as the second byte, and so on. If the length of the bit array is not a multiple of 8, extra zero bits are used to fill the least significant bits of the last byte of the binary value.

###### Examples

`SELECT CAST( CAST( '1111' AS VARBIT ) AS BINARY )` returns 0xF0 (1111 is padded to 11110000, which is 0xF0).

`SELECT CAST( CAST( '0011000000110001' AS VARBIT ) AS BINARY )` returns 0x3031 (binary 0011000000110001 = 0x3031).
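The padding rule is the part worth getting right; this Python sketch models the documented behavior (it is not SQL Anywhere code): 8 bits per byte, first bit most significant, with zero bits appended to fill the last byte.

```python
def varbit_to_binary(bits):
    """Model the documented VARBIT -> BINARY rule: group the bits 8 at a
    time, first bit as the most significant; if the length is not a
    multiple of 8, pad the last byte with zero bits on the right."""
    padded = bits + "0" * (-len(bits) % 8)
    return bytes(int(padded[i:i + 8], 2) for i in range(0, len(padded), 8))

# Mirrors the SQL examples above.
print(varbit_to_binary("1111").hex())              # 'f0'
print(varbit_to_binary("0011000000110001").hex())  # '3031'
```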

###### Converting bit arrays to characters

When converting a bit array of length n bits to a character data type, the length of the result is n characters. Each character in the result is either '0' or '1', according to the value of the corresponding bit in the array.

###### Example

`SELECT CAST( CAST( '01110' AS VARBIT ) AS VARCHAR )` returns the character string '01110'.