Vision Five 2 Performance Issues

Hi, I’m making a 3D MMO engine. It runs exceptionally well on Windows 10 (2000 non-instanced players), the Jetson Nano (600), and the Raspberry Pi 4 (100), but on the VisionFive 2 performance is abysmal.

I made a release so that your engineers can look at the issue with a real-world example:

The colors are also off.

But everything else seems to work as intended, which is a great improvement over the release distributions.

Here’s more info on the engine: BinaryTask


Welcome to the PowerVR Developer Forum and thank you for reporting the issue.

We see you have reported the same issue on our internal developer portal. We will follow up with it there.

Best Regards,

I forgot to mention: you might need to run this before trying the test:

sudo apt-get install libx11-dev libgles2-mesa-dev libopenal-dev libtbb-dev

So I tried replacing the 1,335,560-byte file with the new 1,338,968-byte one.

It did not change anything. Did you guys download and try to run the zip I made?

I think the program is using software rendering.

How do I:

  1. Compile the software so that it is forced to use the GPU, since that does not seem to be the default?

  2. Verify what the software is using to render?
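For the second question, the checks that are common on Linux can be sketched as follows. This is only a sketch: it assumes the `mesa-utils` package is available for `glxinfo`, and the pattern of software-rasterizer names is an assumption based on Mesa's usual renderer strings.

```shell
# Renderer names used by Mesa's software rasterizers (assumption: any of
# these in the glxinfo output means no GPU acceleration is in use).
SW_PATTERN='llvmpipe|softpipe|swrast'

# On the target board you would run (needs the mesa-utils package and X):
#   glxinfo | grep "OpenGL renderer string"
# and check for a render node, which suggests a kernel GPU driver is loaded:
#   ls /dev/dri/

# Demonstration on a canned renderer line:
line='OpenGL renderer string: softpipe'
if echo "$line" | grep -qE "$SW_PATTERN"; then
    echo "software rendering"
else
    echo "hardware rendering"
fi
```

With the canned line above this prints `software rendering`; on a machine with a working GPU driver, the real `glxinfo` line would not match the pattern.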

Well, I wasn’t sure how to check on Linux whether 3D applications are running with hardware acceleration or software rendering (or, for that matter, whether software rendering is even possible on Linux). While searching for this information, I came across the following in a blog:

However, there are libraries like Mesa that implement the OpenGL functions entirely in software, so it is possible to render graphics with OpenGL without actually having an OpenGL-compatible GPU. By checking which library is doing the rendering, we can find out whether hardware acceleration is present.
Check the glxinfo command output for OpenGL details:
$ glxinfo | grep OpenGL
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) 965G
OpenGL version string: 2.1 Mesa 10.1.0
OpenGL shading language version string: 1.20
OpenGL extensions:

The “OpenGL renderer string” tells you which driver is handling rendering. Note that Mesa ships both hardware drivers and pure software rasterizers: it is renderer strings naming a software rasterizer, such as “llvmpipe”, “softpipe”, or “swrast”, that mean 3D rendering is being handled entirely in software. That is going to be slow, and games will not work well.

The output on a machine with a dedicated NVIDIA GeForce 200-series graphics card looks like this:
$ glxinfo | grep OpenGL
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce 210/PCIe/SSE2
OpenGL core profile version string: 3.3.0 NVIDIA 331.20
OpenGL core profile shading language version string: 3.30 NVIDIA via Cg compiler
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.3.0 NVIDIA 331.20
OpenGL shading language version string: 3.30 NVIDIA via Cg compiler

And if you find that the renderer string points to IMG BXE-4-32, then you could try to capture a PVRTuneComplete recording for us, so we can investigate further what is actually causing the performance bottlenecks.

Thank you

OpenGL renderer string: softpipe