
Signature::Decode bug?



This is quite probably a misunderstanding on my part, but the logic
for DSA and EC signature decoding in cml/cmapi/src/CM_Signature.cpp
looks wrong.

The signature structures are both

	SEQUENCE {
	        r INTEGER,
	        s INTEGER }

The code decodes the signature and then packs the two integers
into a buffer, with both integers at the same length (zero-padding
if necessary).  (Possibly DSA is more constrained than ECDSA.)
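For concreteness, a minimal sketch of that decode-and-pack step (this is
not the CML code; all names here are made up) might look like:

```python
# Hypothetical sketch: decode a DER-encoded
# SEQUENCE { r INTEGER, s INTEGER } signature and pack r and s as
# two equal-length, left-zero-padded integers.

def der_read_length(buf, i):
    """Return (length, next_index) for the DER length field at buf[i]."""
    first = buf[i]
    if first < 0x80:                       # short form
        return first, i + 1
    n = first & 0x7F                       # long form: n length octets follow
    return int.from_bytes(buf[i + 1:i + 1 + n], "big"), i + 1 + n

def decode_rs(sig):
    """Extract (r, s) as ints from a DER SEQUENCE of two INTEGERs."""
    assert sig[0] == 0x30                  # SEQUENCE tag
    _, i = der_read_length(sig, 1)
    out = []
    for _ in range(2):
        assert sig[i] == 0x02              # INTEGER tag
        ln, i = der_read_length(sig, i + 1)
        out.append(int.from_bytes(sig[i:i + ln], "big"))
        i += ln
    return tuple(out)

def pack_rs(r, s, int_len):
    """Concatenate r and s, each left-padded to int_len octets."""
    return r.to_bytes(int_len, "big") + s.to_bytes(int_len, "big")
```

The question in this post is what `int_len` should be: the hash length,
or a length derived from the key.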

It uses the length of the hash to decide how long the integers are,
but that's wrong, I think: the length is determined by a property of
the asymmetric key.  Specifically, for ECDSA the signature length is
at most 2 * nLen, where nLen "is the length in octets of the base
point order n".
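To put assumed numbers on the mismatch (these pairings are my own
illustration, not anything from the code): P-256 has nLen = 32 but can
legitimately be used with SHA-384, and P-521 has nLen = 66 but would
typically be used with SHA-512.

```python
# Illustrative numbers (assumed curve/hash pairings, not from the code).
# nLen is the octet length of the base point order n for the curve.

# P-256 signed with SHA-384: hash-based sizing over-allocates.
nlen_p256, sha384_len = 32, 48
assert 2 * nlen_p256 == 64           # correct r||s buffer size
assert 2 * sha384_len == 96          # what hash-based sizing would give

# P-521 signed with SHA-512: hash-based sizing under-allocates,
# since r and s may each need up to 66 octets but only 64 are assumed.
nlen_p521, sha512_len = 66, 64
assert 2 * nlen_p521 > 2 * sha512_len
```

The second case is the worrying one, since padding to the hash length
would leave no room for a full-size r or s.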

Am I misunderstanding something?

(For usual-size keys, and especially for DSA, I guess this'll work
fine: one wouldn't ordinarily pair a DSA or ECDSA key with a hash
shorter than nLen, which is the only case where the integers wouldn't
fit in a hash-length buffer.)