I guess it doesn't matter what codec it's using. The reality is that when connecting to a remote server, CPU usage stays low and hardware acceleration is clearly being used, but when the server is local, CPU usage runs high as if hardware acceleration isn't being used, and the picture is jerky. The local server has also been confirmed good by connecting to it from another remote client, so the real question is: why is performance so bad when both client and server are on the same machine? Why would the client behave any differently?
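
One way to check whether the client really is falling back to software decoding in the local case would be to decode the same file outside the client, once with and once without hardware acceleration, and watch CPU load for each run. Here's a minimal sketch of that idea, assuming ffmpeg built with VAAPI support on Linux; the sample file name and the /dev/dri/renderD128 device path are just placeholders for whatever your setup uses.

    # Minimal sketch: compare software vs. VAAPI-accelerated decode of the same clip.
    # Assumes ffmpeg with VAAPI support on Linux; file name and device path are placeholders.
    import subprocess
    import time

    SAMPLE = "sample.mkv"  # hypothetical test clip

    def timed_decode(extra_args):
        """Decode SAMPLE to a null sink and return wall-clock seconds."""
        cmd = ["ffmpeg", "-v", "error", *extra_args, "-i", SAMPLE, "-f", "null", "-"]
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        return time.perf_counter() - start

    # Software decode: the CPU does all the work.
    sw = timed_decode([])

    # Hardware decode via VAAPI: should take similar time but with much lower
    # CPU usage (watch top/htop while it runs).
    hw = timed_decode(["-hwaccel", "vaapi", "-hwaccel_device", "/dev/dri/renderD128"])

    print(f"software decode: {sw:.1f}s, vaapi decode: {hw:.1f}s")

If the VAAPI run keeps the CPU quiet on the same machine, the GPU path itself is fine and the high CPU during local playback points at the client choosing software decode (or transcoding) only when it talks to the local server.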