View Issue Details

ID: 0008207
Project: .NET API
Category: Implementation Bug
View Status: public
Last Update: 2022-08-18 14:27
Reporter: Bernd Edlinger
Assigned To: (none)
Priority: normal
Severity: minor
Reproducibility: always
Status: new
Resolution: open
Summary: 0008207: Binary Encoding of Decimal Datatype is wrong
Description

The generated UA-Nodeset/DotNet/Opc.Ua.DataTypes.cs encodes the DecimalDataType
as if it were a struct { Int16 Scale; ByteString Value; }.
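A sketch of what the generated Encode presumably looks like (illustrative, not
the verbatim generated source; WriteInt16/WriteByteString are the stack's
IEncoder methods, and ReadByteString in the call stack below matches the
decode side):

    // Presumed shape of the generated code (a sketch, not verbatim):
    public void Encode(IEncoder encoder)
    {
        encoder.WriteInt16("Scale", m_scale);
        // WriteByteString emits its own Int32 length prefix, so the wire
        // image becomes: Int16 scale | Int32 valueLength | value bytes
        encoder.WriteByteString("Value", m_value);
    }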

But the spec defines a different layout, see
https://reference.opcfoundation.org/Core/Part6/5.1.8/:

Table 3 – Layout of Decimal

Field     Type         Description
TypeId    NodeId       The identifier for the Decimal DataType.
Encoding  Byte         This value is always 1.
Length    Int32        The length of the Scale and Value fields in bytes.
                       If the length is less than or equal to 2, the Decimal
                       is an invalid value that cannot be used.
Scale     Int16        A signed integer representing the scale, which is the
                       inverse power of ten applied to the unscaled value;
                       i.e., the decimal value is the unscaled value
                       multiplied by 10^-scale.
                       The integer is encoded starting with the least
                       significant bit.
Value     OctetString  A two's-complement signed integer representing the
                       unscaled value. The number of bytes is the value of
                       the Length field minus the size of the Scale field.
                       The integer is encoded with the least significant
                       byte first.
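To make our reading of the table concrete, here is a minimal sketch of the
body that follows the TypeId and Encoding byte (plain BinaryWriter, not the
stack's API; EncodeDecimalBody, scale and unscaledLE are our own names):

    using System.IO;

    // Sketch only: writes Length, Scale and Value as we read Table 3.
    // unscaledLE is the unscaled value as little-endian two's-complement bytes.
    static byte[] EncodeDecimalBody(short scale, byte[] unscaledLE)
    {
        using var ms = new MemoryStream();
        using var w = new BinaryWriter(ms);  // BinaryWriter is little-endian
        w.Write(2 + unscaledLE.Length);      // Length: Int32, Scale + Value
        w.Write(scale);                      // Scale:  Int16
        w.Write(unscaledLE);                 // Value:  raw bytes, no own length prefix
        return ms.ToArray();
    }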

So unless we have completely misunderstood the spec, the Value shall consist
of exactly Length - 2 bytes following the Scale field in the extension
object; in other words, the Value shall not carry a length field of its own.
Even when the Decimal is a member of a structure, it shall always be encoded
as an extension object, as implied by the following sentence:

"If a Decimal is embedded in another Structure then the DataTypeDefinition for the
field shall specify the NodeId of the Decimal Node as the DataType.
If a Server publishes an OPC Binary type description for the Structure then the type
description shall set the DataType for the field to ExtensionObject."
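Under that reading, the decoding side would look roughly like this (again a
sketch with our own names, not the stack's API):

    using System.IO;

    static (short Scale, byte[] UnscaledLE) DecodeDecimalBody(BinaryReader r)
    {
        int length = r.ReadInt32();             // Length of Scale + Value
        if (length <= 2)
            throw new InvalidDataException("invalid Decimal: Length <= 2");
        short scale = r.ReadInt16();            // Scale
        byte[] value = r.ReadBytes(length - 2); // Value: exactly Length - 2
                                                // bytes, no Int32 prefix
        return (scale, value);
    }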

Tags: No tags attached.

Activities

Bernd Edlinger

2022-08-18 14:15

reporter   ~0017329

BTW, this is the call stack that we observed (presumably the decoder reads
the raw Value bytes as a ByteString length prefix, which then exceeds
MaxByteStringLength):

[ERR] Unexpected error processing request.
Opc.Ua.ServiceResultException: MaxByteStringLength 1048576 < 2147483647
at Opc.Ua.BinaryDecoder.ReadByteString(String fieldName, Int32 maxByteStringLength) in ..\Stack\Opc.Ua.Core\Types\Encoders\BinaryDecoder.cs:line 489
at Opc.Ua.BinaryDecoder.ReadByteString(String fieldName) in ..\Stack\Opc.Ua.Core\Types\Encoders\BinaryDecoder.cs:line 465
at Opc.Ua.DecimalDataType.Decode(IDecoder decoder) in ..\Stack\Opc.Ua.Core\Stack\Generated\Opc.Ua.DataTypes.cs:line 4482
at Opc.Ua.BinaryDecoder.ReadExtensionObject() in ..\Stack\Opc.Ua.Core\Types\Encoders\BinaryDecoder.cs:line 2127
at Opc.Ua.BinaryDecoder.ReadVariantValue(String fieldName) in ..\Stack\Opc.Ua.Core\Types\Encoders\BinaryDecoder.cs:line 2368
at Opc.Ua.BinaryDecoder.ReadVariant(String fieldName) in ..\Stack\Opc.Ua.Core\Types\Encoders\BinaryDecoder.cs:line 714
at Opc.Ua.BinaryDecoder.ReadDataValue(String fieldName) in ..\Stack\Opc.Ua.Core\Types\Encoders\BinaryDecoder.cs:line 734
at Opc.Ua.WriteValue.Decode(IDecoder decoder) in ..\Stack\Opc.Ua.Core\Stack\Generated\Opc.Ua.DataTypes.cs:line 50605
at Opc.Ua.BinaryDecoder.ReadEncodeable(String fieldName, Type systemType, ExpandedNodeId encodeableTypeId) in ..\Stack\Opc.Ua.Core\Types\Encoders\BinaryDecoder.cs:line 815
at Opc.Ua.BinaryDecoder.ReadEncodeableArray(String fieldName, Type systemType, ExpandedNodeId encodeableTypeId) in ..\Stack\Opc.Ua.Core\Types\Encoders\BinaryDecoder.cs:line 1400
at Opc.Ua.WriteRequest.Decode(IDecoder decoder) in ..\Stack\Opc.Ua.Core\Stack\Generated\Opc.Ua.DataTypes.cs:line 50870
at Opc.Ua.BinaryDecoder.ReadEncodeable(String fieldName, Type systemType, ExpandedNodeId encodeableTypeId) in ..\Stack\Opc.Ua.Core\Types\Encoders\BinaryDecoder.cs:line 815
at Opc.Ua.BinaryDecoder.DecodeMessage(Type expectedType) in ..\Stack\Opc.Ua.Core\Types\Encoders\BinaryDecoder.cs:line 231
at Opc.Ua.BinaryDecoder.DecodeMessage(Stream stream, Type expectedType, IServiceMessageContext context) in ..\Stack\Opc.Ua.Core\Types\Encoders\BinaryDecoder.cs:line 142
at Opc.Ua.Bindings.TcpServerChannel.ProcessRequestMessage(UInt32 messageType, ArraySegment`1 messageChunk) in ..\Stack\Opc.Ua.Core\Stack\Tcp\TcpServerChannel.cs:line 932

Bernd Edlinger

2022-08-18 14:27

reporter   ~0017330

Well, maybe this is also related:

When we encode the Variant with the Decimal value, the spec says:
"TypeId (NodeId): The identifier for the Decimal DataType."
So we have the following in opcua_identifiers.h:
AnsiC/opcua_identifiers.h:#define OpcUaId_Decimal 50
AnsiC/opcua_identifiers.h:#define OpcUaId_DecimalDataType 17861
AnsiC/opcua_identifiers.h:#define OpcUaId_DecimalString 12878
AnsiC/opcua_identifiers.h:#define OpcUaId_DecimalDataType_Encoding_DefaultBinary 17863
AnsiC/opcua_identifiers.h:#define OpcUaId_DecimalDataType_Encoding_DefaultXml 17862
AnsiC/opcua_identifiers.h:#define OpcUaId_DecimalDataType_Encoding_DefaultJson 15045

We took TypeId = "ns=0;i=17861" (OpcUaId_DecimalDataType), because that is
the closest match to the wording in the spec.

So we did not consider using i=50 (Decimal), although that is the NodeId of
the Decimal type. Nor did we consider using i=17863
(OpcUaId_DecimalDataType_Encoding_DefaultBinary), probably because this
encoding does not look like a "Default" encoding at all.
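For illustration, this is roughly the full ExtensionObject image we emit (a
sketch with our own helper name; the typeId parameter is exactly the open
question: 17861 is what we currently pass, and 50 or 17863 would be written
the same way):

    using System.IO;

    static byte[] EncodeDecimalExtensionObject(ushort typeId, short scale,
                                               byte[] unscaledLE)
    {
        using var ms = new MemoryStream();
        using var w = new BinaryWriter(ms);
        w.Write((byte)0x01);            // NodeId: FourByte encoding
        w.Write((byte)0);               // namespace index 0
        w.Write(typeId);                // e.g. 17861 (OpcUaId_DecimalDataType)
        w.Write((byte)0x01);            // Encoding byte: always 1
        w.Write(2 + unscaledLE.Length); // Length of Scale + Value
        w.Write(scale);                 // Scale
        w.Write(unscaledLE);            // Value, raw bytes
        return ms.ToArray();
    }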

Issue History

Date Modified Username Field Change
2022-08-18 11:02 Bernd Edlinger New Issue
2022-08-18 14:15 Bernd Edlinger Note Added: 0017329
2022-08-18 14:27 Bernd Edlinger Note Added: 0017330