Accessing bitmaps in terminal server

We are conducting a study of the RDP protocol. We found that when applications with a high screen-update rate (such as movies) were run, the amount of data sent from the terminal server to the client was almost the same whether the rdesktop client was run with or without the -b option. (-b forces the server to send screen updates as bitmaps rather than graphics primitives.)

Also, the T.128 protocol specification, upon which RDP is built, states the following

“Where a local application is very actively drawing into a hosted window and the server is experiencing flow control back pressure, the server may prefer to accumulate bounds information for the drawing activity, rather than sending the orders, and then subsequently send bitmap data for the accumulated bounds”.
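The accumulation strategy the specification describes can be sketched roughly as follows. This is a hypothetical Python illustration, not actual server code: under back pressure, the server unions the bounds of many drawing orders into one dirty rectangle, then sends a single bitmap covering the combined bounds.

```python
# Hypothetical sketch of the T.128 "accumulate bounds" strategy: rather than
# forwarding every drawing order, union their dirty rectangles and later
# send one bitmap for the accumulated bounds.

def union(a, b):
    """Union of two (left, top, right, bottom) rectangles."""
    return (min(a[0], b[0]), min(a[1], b[1]),
            max(a[2], b[2]), max(a[3], b[3]))

def accumulate_bounds(dirty_rects):
    """Collapse the bounds of many drawing orders into one rectangle."""
    bounds = dirty_rects[0]
    for r in dirty_rects[1:]:
        bounds = union(bounds, r)
    return bounds

# Three small updates collapse into a single 100x100 region to snapshot.
rects = [(0, 0, 10, 10), (90, 90, 100, 100), (40, 20, 60, 30)]
print(accumulate_bounds(rects))  # (0, 0, 100, 100)
```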

We also found that the data sent was almost double with the -b option when applications such as a text editor were used, or when we simply scrolled a window.

Does that mean that when the screen-update rate is high, RDP sends bitmaps rather than graphics primitives? If so, how can we access these bitmaps before they are sent to the client?

Thanks,
PKD

xxxxx@yahoo.com wrote:

> We are conducting a study of the RDP protocol. We found that when applications with a high screen-update rate (such as movies) were run, the amount of data sent from the terminal server to the client was almost the same whether the rdesktop client was run with or without the -b option. (-b forces the server to send screen updates as bitmaps rather than graphics primitives.)

Of course. This should be completely intuitive.

When someone draws a rectangle, there is a huge difference in volume
between sending the drawing order:

left, top, right, bottom

which is about 20 bytes, and sending the resulting pixels, which will be
width x height x depth bytes.

But for a movie, the situation is very different. In that case, the
graphics primitive is “copy this huge bitmap”. The command will either be:

…send whole bitmap…

which is width x height x depth bytes as input to the drawing order, or a
copy of the same region taken from the screen afterwards, which is also
width x height x depth bytes.

The amount of data is exactly the same.
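The comparison can be made concrete with some back-of-the-envelope arithmetic. The numbers below are illustrative only; the exact encoding of an RDP drawing order varies.

```python
# Rough arithmetic behind the volume comparison (illustrative numbers).
# A rectangle order carries roughly four coordinates plus a small header,
# while raw pixels cost width x height x bytes-per-pixel.

def order_bytes(num_coords=4, bytes_per_field=4, header=4):
    """Approximate size of a rectangle drawing order."""
    return num_coords * bytes_per_field + header  # ~20 bytes

def bitmap_bytes(width, height, bytes_per_pixel=3):
    """Raw pixel cost -- the same whether the bitmap is sent as bitblt
    input or copied from the screen after the bitblt."""
    return width * height * bytes_per_pixel

print(order_bytes())           # 20
print(bitmap_bytes(640, 480))  # 921600
```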

> Does that mean that when the screen-update rate is high, RDP sends bitmaps rather than graphics primitives? If so, how can we access these bitmaps before they are sent to the client?
>

No. It means that when you are sending a movie, you have to send each
frame, and frames are bitmaps. Whether you send a frame as a bitmap that is
input to a bitblt command, or as a bitmap copied from the screen after the
bitblt command, doesn’t change the volume of data.


Tim Roberts, xxxxx@probo.com
Providenza & Boekelheide, Inc.

Thanks Tim
PKD

That means that if I can somehow compress the bitmaps/frames and send the compressed data to the client, I could reduce the amount of data transmitted, couldn’t I?
Can I use a mirror driver to access these bitmaps? Would it cause a problem that the driver runs at DISPATCH_LEVEL while the compression algorithms (if used) must run at PASSIVE_LEVEL?
I have gone through many resources and am writing based on what I have understood; I may be entirely wrong.

That is correct: if you compress the bitmaps before sending them to the
client, you will reduce the amount of data being sent and be able to deliver
more frames per second. There are two other factors, though: you may need to
either drop frames or buffer at the client before playing the stream. Also,
if you want to synchronize sound, you have more work to do, because if you
deal only with the video portion, the sound will stay in sync only ‘by
chance’, since it is sent over a separate virtual channel.
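The compress-then-maybe-drop idea can be sketched as follows. This is a hypothetical Python illustration: zlib stands in for a real video codec, and the per-frame bandwidth budget is an invented number.

```python
import os
import zlib

# Hypothetical sketch: compress each raw frame before transmission and drop
# any frame whose compressed size exceeds a per-frame bandwidth budget.

def compress_frame(raw: bytes) -> bytes:
    return zlib.compress(raw, level=6)

def send_frames(frames, budget_per_frame):
    sent, dropped = [], 0
    for f in frames:
        c = compress_frame(f)
        if len(c) <= budget_per_frame:
            sent.append(c)
        else:
            dropped += 1   # too expensive for the link: drop this frame
    return sent, dropped

# A flat-colour frame compresses extremely well; random noise does not.
flat = bytes(640 * 480 * 3)          # highly compressible
noise = os.urandom(640 * 480 * 3)    # essentially incompressible
sent, dropped = send_frames([flat, noise], budget_per_frame=100_000)
print(len(sent), dropped)            # 1 1
```

A real implementation would of course use a lossy video codec rather than lossless zlib, which is exactly why the recompression approaches discussed below exist.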

There are several methods of dealing with video. The first is the method
used by “Video Frame”, in which you recompress the video into a proprietary
low-bandwidth format, then stream that format from the server to the client
over your own virtual channel. The client then becomes a player, but it can
be a thin player, since you can ship your codec along with the virtual
channel.

Another method is what Citrix did with Multimedia Acceleration, which
basically requires a thick client. You intercept the media stream and,
rather than decoding it at the server, send it to the client. This requires
that the client have the codec to decompress whatever video is being
streamed. The size of the media also determines the bandwidth you need to
play it back. Of course, you can also buffer the data, and if you stream the
full encoded stream it contains both audio and video, so you do not have to
deal with synchronization issues.

The next part of your question is about a mirror driver in the context of a
session. The problem is that since you are only mirroring the display, you
are not taking over the remote display driver, so that driver will still
stream the image down to the client anyway, wasting bandwidth. Secondly,
drivers do not run everything at DISPATCH_LEVEL; they can run at
PASSIVE_LEVEL, so that is not the issue. The issues you do have are making
sure your codec runs properly in the kernel (no bugs) and, if it uses
floating-point operations, properly preserving thread state (by calling the
save/restore APIs, KeSaveFloatingPointState and KeRestoreFloatingPointState).

Of course, you can relay the crap back to a user-mode service and compress it
there; you just want to be efficient in whatever you do. You still have the
sound synchronization to deal with, and you have the problem that you need to
stop the remote display driver from sending these images and wasting
bandwidth.
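The relay-to-user-mode idea can be sketched as follows. This is a hypothetical Python model: a thread-safe queue stands in for the driver-to-service hand-off, and zlib stands in for the codec; a real implementation would use a shared section or similar mechanism between the kernel component and the service.

```python
import queue
import threading
import zlib

# Hypothetical model of relaying captured frames to a user-mode service:
# the capture path only enqueues frames; a worker thread compresses them
# off the capture path.

frames_in = queue.Queue()
compressed_out = []

def compression_worker():
    while True:
        frame = frames_in.get()
        if frame is None:            # sentinel: shut down the worker
            break
        compressed_out.append(zlib.compress(frame))

t = threading.Thread(target=compression_worker)
t.start()
for _ in range(3):
    frames_in.put(bytes(64 * 64 * 3))  # dummy raw frames
frames_in.put(None)
t.join()
print(len(compressed_out))           # 3
```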

Also, depending on what the application is doing (movie, OpenGL, DirectX,
etc.), efficient remoting really requires a case study of each of these
application types to determine the optimal approach. Remember, the default
graphics display remoting is tuned for GDI and works well there, right? But
it breaks down elsewhere, since it is not a general one-size-fits-all
solution. In fact, OpenGL and DirectX are also implemented differently at
the driver level for hardware access, to give an example, and the same
applies in the remote context.

To give a different example, say OpenGL: perhaps you want to render the
image on the server hardware, then send the image, compressed, to the
client by intercepting OpenGL APIs. Here is an example of this approach
(http://www.thinanywhere.com/).

-----Original Message-----
From: xxxxx@yahoo.com [mailto:xxxxx@yahoo.com]
Sent: Tuesday, November 20, 2007 8:11 PM
To: Windows System Software Devs Interest List
Subject: RE:[ntdev] Accessing bitmaps in terminal server



NTDEV is sponsored by OSR

For our schedule of WDF, WDM, debugging and other seminars visit:
http://www.osr.com/seminars

To unsubscribe, visit the List Server section of OSR Online at
http://www.osronline.com/page.cfm?name=ListServer

We had tested 3D games that use hardware acceleration; they fail to run over RDP. So if we can direct these calls through the normal driver that supports hardware acceleration on the server, and access the framebuffer to do compression, won’t that be a global solution for all types of moving images? And will we have different instances of the frame buffer for different sessions?

There is no display hardware associated with a remote session, so anything
that REQUIRES special hardware-acceleration features is not going to run.
So of course it’s not running.

“if you can direct these calls … global solution”

It will be a solution for whichever APIs you redirect; each interface uses
different APIs. So if you redirect the DirectX interfaces, you only get
DirectX applications, so it is not a global solution. However, in general,
interfaces that work well with hardware acceleration would likely be better
to redirect in the manner you mention, but that requires you to implement
the redirection for all of those interfaces. You want to be selective.

Also, you may not want simply to redirect everything; you may want to
understand each interface and determine what is appropriate. For example,
with movies, sound synchronization is an issue, so you may want to work out
a solution that includes sound.

Another example is GDI. GDI will stream better without hardware
acceleration, since it is mostly simple line-draw commands and the like,
which again means that simply redirecting everything is not the best
approach.

Remember that this is going to be streamed remotely: you don’t have
limitless bandwidth and zero latency, so you need to determine the most
efficient method for whatever class of applications you are targeting, or
you will not get a good remote experience. There may be optimizations you
can make, given some knowledge of what’s going on, that you wouldn’t know
to make otherwise.

I would get more familiar with the display interfaces and the classes of
applications you want to start with. I wouldn’t take on the world at first;
it is too much of an undertaking.

And NO, you shouldn’t attempt simply to load the legacy hardware display
driver in the remote session directly, because, at the very least, on a
multi-user system you can’t then share it with other sessions.

-----Original Message-----
From: xxxxx@yahoo.com [mailto:xxxxx@yahoo.com]
Sent: Tuesday, November 20, 2007 10:40 PM
To: Windows System Software Devs Interest List
Subject: RE:[ntdev] Accessing bitmaps in terminal server




Thanks a lot, Mr. Toby. I think we will start with the concept of intercepting the media stream, because that seems to be the most straightforward approach.

xxxxx@yahoo.com wrote:

> That means that if I can somehow compress the bitmaps/frames and send the compressed data to the client, I could reduce the amount of data transmitted, couldn’t I?

In addition to Toby’s excellent response, let me point out that
Microsoft’s and Citrix’s Terminal Services display drivers do exactly
that. Remember that the Citrix product that became Terminal Services
was designed to operate adequately over a dial-up telephone line, and it
does remarkably well in that environment.

There is no low-hanging fruit left in this orchard. Any bandwidth
improvements you make will be incremental and difficult. The easy stuff
was implemented long ago.


Tim Roberts, xxxxx@probo.com
Providenza & Boekelheide, Inc.