SQL Server UniqueIdentifier / GUID internal representation

According to the Wikipedia article on Globally unique identifier (in the "Binary encoding" section), Microsoft's implementation of GUIDs uses "Native" endianness for the first three fields (the first 8 bytes) and Big Endian encoding for the last 8 bytes. Microsoft Windows and SQL Server are "Little Endian" because the Intel architecture is Little Endian, so here "Native" means "Little Endian". The following chart is copied from that Wikipedia article:

Bits    Bytes   Name    Endianness
32      4       Data1   Native
16      2       Data2   Native
16      2       Data3   Native
64      8       Data4   Big
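
To make the chart concrete, here is a small Python sketch (the sample value is arbitrary). The uuid module exposes both layouts: bytes gives the all-big-endian RFC form, while bytes_le gives the Microsoft mixed-endian form described by the chart (and produced by .NET's Guid.ToByteArray()).

    import uuid

    # Arbitrary sample value; field names follow the chart above.
    u = uuid.UUID('00112233-4455-6677-8899-aabbccddeeff')

    # RFC 4122 ("UUID") byte order: every field big-endian.
    print(u.bytes.hex())     # 00112233445566778899aabbccddeeff

    # Microsoft ("GUID") byte order: Data1, Data2, Data3 little-endian,
    # Data4 left exactly as it appears in the text.
    print(u.bytes_le.hex())  # 33221100554477668899aabbccddeeff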

Regarding Little Endianness of Microsoft SQL Server, the Collation and Unicode Support MSDN page states:

Because the Intel platform is a little endian architecture, Unicode code characters are always stored byte-swapped.
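
As a small illustration of what "byte-swapped" means in practice (plain UTF-16 encoding in Python, not SQL Server itself): the little-endian form stores the low-order byte of each 16-bit code unit first.

    # U+0041 'A' and U+0042 'B' as UTF-16 code units.
    print('AB'.encode('utf-16-be').hex())  # 00410042
    print('AB'.encode('utf-16-le').hex())  # 41004200  <- "byte-swapped"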


Here are my conclusions from long usage and a recent review of numerous standards and reference materials.

Given an ASCII format of: {UInt32a-UInt16b-UInt16c-UInt16d-UInt48e}

ASCII: The ASCII character representation is the same for both UUID and GUID data, and is thus portable.
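
A quick Python check of that portability claim (same arbitrary sample value as above): decoding either binary layout with its matching rule yields the identical text form.

    import uuid

    u = uuid.UUID('00112233-4455-6677-8899-aabbccddeeff')

    # Whether the 16 bytes came from the UUID (RFC) order or the GUID
    # (Microsoft) order, the text representation is the same.
    assert str(uuid.UUID(bytes=u.bytes)) == str(u)
    assert str(uuid.UUID(bytes_le=u.bytes_le)) == str(u)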

MEMORY|LOCAL: The binary representation for in-memory or local use is defined to use native endianness as the most efficient layout, going back to the original Apollo papers from the 1980s: the 32-bit, 16-bit, and 16-bit fields are held in native byte order, and the final 8 bytes are kept as a plain byte array (big-endian, matching the text form).
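
A minimal sketch of that in-memory layout, assuming a little-endian host and using Python's struct module as a stand-in for the C structure:

    import struct, sys, uuid

    u = uuid.UUID('00112233-4455-6677-8899-aabbccddeeff')
    data1, data2, data3 = 0x00112233, 0x4455, 0x6677
    data4 = bytes.fromhex('8899aabbccddeeff')

    # '=' means native byte order with standard sizes: one 32-bit and two
    # 16-bit integer fields, then the trailing 8 bytes kept verbatim.
    native_image = struct.pack('=IHH8s', data1, data2, data3, data4)

    if sys.byteorder == 'little':
        # On a little-endian host the memory image is byte-for-byte the
        # Microsoft GUID form -- no swapping is needed to store it.
        assert native_image == u.bytes_le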

PORTABLE: The portable binary storage format is where ALL of the confusion between UUID and GUID exists.

A GUID always stores the concatenated text hex fields (dashes removed) with the first three fields (32-bit, 16-bit, 16-bit) in little-endian byte order and the final 8 bytes exactly as they appear in the text. On a little-endian platform this portable form is byte-for-byte identical to the in-memory representation, which makes it cheap to convert: loading or storing it needs no byte swapping at all.
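
A decode sketch for that portable GUID form (same sample bytes as above, purely illustrative): the first three fields are read as little-endian integers and the last 8 bytes are kept verbatim, so on a little-endian machine the decode is nothing more than ordinary native loads.

    import struct, uuid

    guid_bytes = bytes.fromhex('33221100554477668899aabbccddeeff')

    # First three fields little-endian, trailing 8 bytes verbatim.
    d1, d2, d3, tail = struct.unpack('<IHH8s', guid_bytes)
    text = f'{d1:08x}-{d2:04x}-{d3:04x}-{tail[:2].hex()}-{tail[2:].hex()}'

    assert text == str(uuid.UUID(bytes_le=guid_bytes))
    # -> 00112233-4455-6677-8899-aabbccddeeff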

The UUID portable binary conversion rules split on variant 2 versus all other variants. For variant 2 the GUID layout is used; for all other variants the first three text fields are stored as big-endian 32-bit, 16-bit, and 16-bit values respectively, and the remaining two fields (2 bytes + 6 bytes) exactly as they appear in the text, which is equivalent to a 64-bit big-endian value (a.k.a. network order). This makes it less efficient to use as a binary representation, since on a little-endian machine the conversion requires byte-swapping each of the first three fields, even for the non-variant-2 forms.
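
A conversion sketch for the non-variant-2 rule (Python again, ignoring the variant-2 special case just described): going from RFC byte order to GUID byte order means swapping the 32-bit, 16-bit, and 16-bit fields individually, while the trailing 8 bytes pass through unchanged.

    import struct, uuid

    def rfc_to_guid_bytes(rfc: bytes) -> bytes:
        # Swap the first three fields from big-endian to little-endian;
        # copy the last 8 bytes through untouched.
        d1, d2, d3 = struct.unpack('>IHH', rfc[:8])
        return struct.pack('<IHH', d1, d2, d3) + rfc[8:]

    u = uuid.uuid4()
    assert rfc_to_guid_bytes(u.bytes) == u.bytes_le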

Note: The many documents and standards references are inconsistent, or in outright conflict, on how to represent a variant-2 value as a portable binary UUID.

This makes the GUID binary form preferable to the UUID binary form for conversion, as it is less problematic to encode and decode: the UUID rules require first checking the 3 most-significant variant flag bits at byte[8].
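
A minimal sketch of that variant check (the bit patterns follow RFC 4122; Python's uuid module reports the same classification through its variant property):

    import uuid

    def variant_from_byte8(b8: int) -> str:
        # Classify by the most-significant bits of byte[8].
        if b8 >> 7 == 0b0:
            return 'NCS (variant 0)'
        if b8 >> 6 == 0b10:
            return 'RFC 4122 (variant 1)'
        if b8 >> 5 == 0b110:
            return 'Microsoft (variant 2)'
        return 'Reserved/future'

    u = uuid.uuid4()
    assert variant_from_byte8(u.bytes[8]) == 'RFC 4122 (variant 1)'
    assert u.variant == uuid.RFC_4122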

Further, noting that the majority of modern machines are natively little-endian (the exceptions being some game machines and some networking equipment), it is generally much more efficient to use the portable binary GUID format, since it maps exactly onto the native binary form of a UUID or GUID on those machines.