[winswitch] NVENC and xpra

Antoine Martin antoine at nagafix.co.uk
Thu Dec 12 06:43:13 GMT 2013


On 12/12/13 11:05, ... wrote:
> Hello,
>
> I answered my own question faster than expected.  I have an optimus enabled
> laptop and run bumblebee to access my nvidia GPU.  I successfully ran two
> accelerated instances of glxspheres through xpra on my local machine.  I'll
> try this over a network with a thin client tomorrow.
>
> As far as fixing the data bouncing back and forth between GPU and CPU
> for rendering and NVENC goes, I think I know someone who would be able to provide
> sufficient demand for this.  Would a donation to the project cause this
> issue to get fixed?
I am not certain that I have both the skills and the time required to
succeed..
>   Do you have a rough idea of what it would take?
I have added some information to the existing ticket:
http://xpra.org/trac/ticket/365

Antoine

>
> Thanks for the great work,
> Elliot
>
> ===================
>
> Exactly what I did is this:
>
> on server:
>> xpra start :100
>> xpra start :110
>> DISPLAY=:100 optirun glxspheres
>> DISPLAY=:110 optirun glxgears
> On client (another terminal on same laptop):
>> xpra attach :100
> on another terminal:
>> xpra attach :110
> glxspheres ran at 80-90 fps instead of 120 with two instances going, but a
> single instance running through xpra was just as fast as without xpra.
>
>
>
>
>
> On Wed, Dec 11, 2013 at 3:29 PM, ... <offonoffoffonoff at gmail.com> wrote:
>
>> Antoine,
>>
>> Great information.  Thank you!
>>
>> So, VirtualGL would theoretically work for multiple users per card
>> (multiple applications on multiple displays)?
>>
>> My intention is to serve modern video games, like Minecraft, League of
>> Legends, etc.  I guess I would have to look into whether OpenGL acceleration
>> is all that is needed.
>>
>> I look forward to trying out VirtualGL with my simple setup, and
>> working towards having a need for NVENC.
>>
>> -Elliot
>>
>>
>>
>>
>> On Tue, Dec 10, 2013 at 10:33 PM, Antoine Martin <antoine at nagafix.co.uk>wrote:
>>
>>> On 11/12/13 06:18, ... wrote:
>>>> Hello,
>>>>
>>>> I had some questions about NVENC.  Sorry if this information is
>>> somewhere
>>>> already and I didn't find it.
>>> I assume you've already read:
>>> http://xpra.org/trac/wiki/Encodings/nvenc
>>>> Can xpra use a kepler enabled nvidia card to both render graphics
>>> (hardware
>>>> accelerated) and h264 encode them before shipping them off to another
>>>> display across the network?
>>> According to Nvidia:
>>>
>>> https://developer.nvidia.com/sites/default/files/akamai/cuda/files/CUDADownloads/NVENC_AppNote.pdf
>>> "NVIDIA's latest generation of GPUs based on the Kepler architecture,
>>> contain a
>>> hardware-based H.264 video encoder (henceforth referred to as NVENC). "
>>> So, assuming that this is a pro card or that you found a license key
>>> (...), yes you can use NVENC with such cards.
>>> This answers the second half of your question.
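>>>
>>> As a rough sketch (from memory, so double-check against "xpra --help" on
>>> your version): the encoding is chosen on the client side, e.g.
>>>
>>>     xpra attach :100 --encoding=h264
>>>
>>> On older versions the encoding may still be listed as "x264", and whether
>>> NVENC or the software x264 encoder actually does the work depends on what
>>> is installed and licensed on the server.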
>>>
>>>
>>> As for the "render graphics hardware accelerated", it is a little bit
>>> more complicated. Based on your description, I assume that the card is
>>> not connected to a monitor or that this monitor will not be used for
>>> viewing. If that's not the case, the answers below are going to be
>>> inadequate.
>>>
>>> First, you need to define "accelerated":
>>> * if you mean OpenGL acceleration (which is often enough), then
>>> VirtualGL will do what you want and is supported:
>>> http://www.virtualgl.org/
>>> * if you want to use the regular "nvidia" X11 driver for acceleration
>>> directly, there are ways to use a regular X11 server (usually running as
>>> root) to replace xpra's Xvfb; you may need to use the
>>> "ConnectedMonitor" option if no monitor is attached to the card. This
>>> will only work for a single user per card and your mileage may vary: it
>>> "should" work. (There is a rough sketch of both approaches below.)
>>> Using "xpra shadow" to copy an existing display is not a good solution
>>> at present as it uses polling and will use far too much CPU time -
>>> though that could be fixed.
>>>
>>> The main downside of the current xpra 0.11 code is that it is not really
>>> tailored for this Nvidia-specific use case: during screen updates the
>>> pixel data will be downloaded from the GPU to the CPU and then uploaded
>>> again to the GPU for compression... which is a complete waste of
>>> valuable memory bandwidth. It shouldn't be too hard to bypass this
>>> unnecessary copying, and if there is enough demand for it then we can
>>> certainly look at it.
>>>> If not, is this something that some other VNC like program can do?
>>> Not as far as I know: xpra is the first, and at present the only
>>> open-source software to have NVENC support.
>>>>   If so,
>>>> are there any other hardware requirements or issues that I should know
>>>> about?  Would any kepler/NVENC enabled nvidia card be able to do this?
>>> As per above, with consumer cards (GeForce) you will need to find a
>>> license key...
>>>
>>> I do wonder if some consumer protection law could force Nvidia to
>>> provide the keys required to take advantage of the features they
>>> advertised when the cards were sold (and earlier SDKs did not require
>>> license keys either).
>>> As can be seen here in the GTX680 whitepaper:
>>>
>>> http://www.geforce.com/Active/en_US/en_US/pdf/GeForce-GTX-680-Whitepaper-FINAL.pdf
>>> "All Kepler GPUs also incorporate a new hardware-based H.264 video
>>> encoder, NVENC.  "
>>>> Does anyone have any experience with this?
>>> Cheers
>>> Antoine
>>> _______________________________________________
>>> shifter-users mailing list
>>> shifter-users at lists.devloop.org.uk
>>> http://lists.devloop.org.uk/mailman/listinfo/shifter-users
>>>
>>



