@Ted_Hudek while I’ve got your attention, I wanted to mention a couple of things that are at best non-optimal about the documentation and annotation of SecLookupAccountSid() (and likely the similar ones). Some of these I can do as proposed changes once it is ported to docs, but if there is a way to put it on the ticket, that would keep me from having to check periodically to see if it has moved. Some of them I cannot accomplish via the docs, and others might not be accepted without explanation.
P.S. This got long. Feel free to ignore it. #TLDR
The function prototype is:
```c
NTSTATUS SecLookupAccountSid(
    _In_      PSID            Sid,
    _Out_     PULONG          NameSize,
    _Inout_   PUNICODE_STRING NameBuffer,
    _Out_     PULONG          DomainSize,
    _Out_opt_ PUNICODE_STRING DomainBuffer,
    _Out_     PSID_NAME_USE   NameUse
    );
```
The basic usage of the function is:
- Call the function with &nameSize and &domainSize, but with zeroed-out UNICODE_STRINGs for nameBuffer and domainBuffer (or null for domainBuffer if you don't want the domain). This gets the needed sizes.
- Allocate nameBuffer, and optionally domainBuffer, to those sizes and call again.
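Concretely, the sequence I ended up with looks roughly like this (a kernel-mode sketch; the pool tag and error handling are placeholders, I'm assuming NameSize is a byte count matching MaximumLength, and STATUS_BUFFER_TOO_SMALL on the first call is my assumption based on the common pattern described below):

```c
UNICODE_STRING name = { 0 };   // zeroed UNICODE_STRING, not a null pointer
ULONG nameSize = 0;
ULONG domainSize = 0;
SID_NAME_USE nameUse;

// First call: zeroed NameBuffer, null DomainBuffer (domain not wanted).
NTSTATUS status = SecLookupAccountSid(Sid, &nameSize, &name,
                                      &domainSize, NULL, &nameUse);
if (status == STATUS_BUFFER_TOO_SMALL) {
    name.Buffer = ExAllocatePool2(POOL_FLAG_PAGED, nameSize, 'kooL');
    if (name.Buffer != NULL) {
        name.Length = 0;
        name.MaximumLength = (USHORT)nameSize;  // this, not *NameSize, sizes the buffer

        // Second call: the size pointers must still be passed because of the
        // _Out_ annotations, even though the function ignores them here.
        status = SecLookupAccountSid(Sid, &nameSize, &name,
                                     &domainSize, NULL, &nameUse);
    }
}
```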
The problems arise from the fact that this is kind of a wonky function. It attempts to use the common Windows API pattern: given a &Size/Buffer parameter pair, call once with Size = 0 and Buffer = null; that call fails with STATUS_BUFFER_TOO_SMALL and sets Size to the needed size. You then allocate Buffer to Size bytes and call again with &Size and Buffer. Buffer is an optional parameter, which is what allows the first call. But this function has two such parameter pairs, and Size doesn't refer to the size of the corresponding buffer, but rather to the MaximumLength of the UNICODE_STRING it points to. And rather than calling it with null for the buffers, you call it with zeroed-out UNICODE_STRINGs (unless you don't want Domain, in which case you actually do pass null).
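For contrast, here is that common pattern sketched with ZwQueryInformationToken (an example of my choosing; pool tag and error handling are placeholders):

```c
ULONG size = 0;

// First call: null buffer, zero length. Fails with STATUS_BUFFER_TOO_SMALL
// and sets size to the needed byte count.
NTSTATUS status = ZwQueryInformationToken(TokenHandle, TokenUser,
                                          NULL, 0, &size);

PTOKEN_USER user = ExAllocatePool2(POOL_FLAG_PAGED, size, 'kooL');

// Second call: the buffer size is taken from the size parameter itself --
// the part of the pattern SecLookupAccountSid() does not follow.
status = ZwQueryInformationToken(TokenHandle, TokenUser,
                                 user, size, &size);
```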
One stumble I took here was that I assumed it worked like the other similar calling patterns. But if that were true, I would pass null for the buffers in the first call, the sizes returned would be the full buffer sizes for the second call, and on the second call the buffer sizes would be taken from the size parameters. That would make the size parameters in/out, which is what the current documentation says: "On input, this value specifies the size of the domain buffer". But the annotations show them as _Out_ only, and in testing, that is exactly what they are: the UNICODE_STRING's MaximumLength, not the value of the size variables, is used to determine the buffer size.
On the second call, the NameSize and DomainSize values are not used by the function. In testing, I found that they can be null on the second call, but because of the _Out_ annotation, static analysis complains when nullptr is passed. It would be nice to have them both annotated _Out_opt_ so we don't have to pass unused parameters, especially DomainSize when the domain is not being queried.
I have a utility that allocates the UNICODE_STRING and the string buffer in a single allocation for easier memory management and utilization. So I don't have a zeroed-out UNICODE_STRING unless I just declare a dummy for making the call. To that end, I tried passing null for NameBuffer on the first call, and it worked fine: NameSize was filled with the needed string size. I allocated the string buffer and called a second time with null for the size and the PUNICODE_STRING I had allocated, and it works like a champ. It would be kind of silly to have all of the parameters optional, but that would keep me from having to use a dummy UNICODE_STRING on the first call or disabling the analyzer warning.
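Sketched out, that variant looks like this (this is the behavior I observed in testing, not the documented contract; AllocateUnicodeString is my hypothetical single-allocation utility):

```c
ULONG nameSize = 0;
ULONG domainSize = 0;
SID_NAME_USE nameUse;
PUNICODE_STRING name;  // UNICODE_STRING header + string buffer in one allocation

// First call: null for NameBuffer (instead of a zeroed dummy) worked in testing;
// nameSize comes back with the needed size. Domain is not queried here.
NTSTATUS status = SecLookupAccountSid(Sid, &nameSize, NULL,
                                      &domainSize, NULL, &nameUse);

// Hypothetical helper: one allocation holding the UNICODE_STRING and its
// buffer, with MaximumLength set to nameSize.
name = AllocateUnicodeString(nameSize);

// Second call: null size pointers also worked in testing, though the _Out_
// annotations make static analysis complain (warning suppressed here).
status = SecLookupAccountSid(Sid, NULL, name, NULL, NULL, &nameUse);
```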
Additionally, the documentation refers to DomainBuffer incorrectly as ReferencedDomain.
P.S. PSID vs. PISID - This function uses a PSID parameter (PVOID). The token gives the owner as PISID, which points to, oddly, an SID structure. Any idea why this oddness? Is it that PSID is intended to be opaque, and PISID was used to make it an Internal pointer to SID?
_Ron