[confirm] Playing remote Blu-ray ISOs doesn't work

MJGraf

Retired Team Member
  • Premium Supporter
  • January 13, 2006
    2,478
    1,385
    Hi everybody,

    I just did some tests on playing Blu-ray ISOs remotely, which has always been a problem for me, but I never found enough time to dig deeper (using the winter release WIP branch).

    The situation is as follows:
    MP2-Server runs on my HTPC. On that HTPC there is a Blu-ray ISO file on a directly attached hard disk.
    MP2-Client runs on my laptop. There I want to play the Blu-ray, which resides on the HTPC as an ISO.
    The network between them is a WLAN, which at least in theory should be sufficient from a bandwidth perspective...

    The result is: either the Blu-ray does not play at all, or it stutters like hell. However, when I use VCD on my laptop to mount the remote ISO, I can use VLC to watch the Blu-ray (nearly) without stuttering.

    My first idea was that Dokan might be the problem, but while investigating this, I found the CachedMultiSegmentHttpStream, which is used to access remote files in MP2. I added a little logging to the Read method of that class; the result can be found in the attached logs (I just started MP2-Client and opened the respective "video", i.e. the remote Blu-ray ISO).

    First of all, the BDHandler seems to look for the ISO file itself on R: (which is the Dokan drive). I would therefore assume that the chaining of the ISO resource provider happens on the client side. But when you look at the added logging for the CachedMultiSegmentHttpStream, there is a separate URL for every file inside the ISO. Furthermore, for Blu-rays it seems that it first accesses hundreds of .mpls files inside the ISO before it finally starts reading the .m2ts file.

    Now my suspicion is that we have a separate CachedMultiSegmentHttpStream for every file inside the ISO (in particular for the .mpls files), and every instance of the CachedMultiSegmentHttpStream reads part of the ISO file to cache it just for the one file we want to access inside the ISO. Since we use a chunk size of 512 KB, this would mean that for every single 200-byte .mpls file, we transfer (probably the same) 512 KB of the ISO file over the network, which would be a huge overhead.
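
    Just to put a rough number on that overhead, here is a small back-of-envelope sketch (the file count and the average file size are assumptions, not values measured from the logs):

    Code:
      // Rough estimate: if every small .mpls file inside the ISO gets its own remote
      // stream and each stream fetches at least one full cache chunk, the transferred
      // volume is roughly chunkSize * numberOfSmallFiles, while the useful payload is tiny.
      using System;

      class ChunkOverheadEstimate
      {
        static void Main()
        {
          const int ChunkSizeBytes = 512 * 1024;   // current cache chunk size (512 KB)
          const int SmallFileCount = 300;          // assumption: a few hundred .mpls files
          const int AvgSmallFileBytes = 200;       // assumption: ~200 bytes per .mpls file

          long usefulBytes = (long)SmallFileCount * AvgSmallFileBytes;
          long transferredBytes = (long)SmallFileCount * ChunkSizeBytes;

          Console.WriteLine($"Useful payload:  {usefulBytes / 1024.0:F1} KB");                    // ~58.6 KB
          Console.WriteLine($"Transferred:     {transferredBytes / (1024.0 * 1024.0):F1} MB");    // ~150 MB
          Console.WriteLine($"Overhead factor: {(double)transferredBytes / usefulBytes:F0}x");    // ~2600x
        }
      }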

    Maybe my suspicion above is completely wrong; I still don't understand the whole mechanism in detail. But perhaps one of our pros can shed some light on it...

    Thanks,
    Michael
     

    Attachments

    • Log_Bluray.zip
      22.3 KB

    MJGraf

    Retired Team Member
  • Premium Supporter
  • January 13, 2006
    2,478
    1,385
    I haven't tried with the current build yet. About three months ago (dev branch) I could play local Blu-rays without problems, so I suspect that the network in between is the culprit. And playing physical Blu-rays from a client, when the Blu-ray is in a server's BD drive, should be a rare use case.
    Nevertheless, I'll try a local Blu-ray over the next few days. Thanks!
    Michael
     

    tourettes

    Retired Team Member
  • Premium Supporter
  • January 7, 2005
    17,301
    4,800
    And playing physical Blu-rays from a client, when the Blu-ray is in a server's BD drive, should be a rare use case.

    It is possible to store the Blu-rays on the server without using ISOs as well. That was the use case I meant. Just to rule out the ISO as the source having anything to do with the issue.
     

    MJGraf

    Retired Team Member
  • Premium Supporter
  • January 13, 2006
    2,478
    1,385
    Ah, thanks tourettes, now I get it...
    I'll try that - and if my suspicion above is correct, it should help.
     

    morpheus_xx

    Retired Team Member
  • Team MediaPortal
  • March 24, 2007
    12,073
    7,459
    Home Country
    Germany Germany
    @MJGraf you are right about your observations! The current BDHandler uses a library (BDInfoLib or a similar name). It does a lot of reading at the beginning, scanning all available information from the BD folder (no matter whether it is a local folder or inside an ISO). Some time ago I removed some of those calls, which led to incomplete media information.

    Without the scanning, playback could start much faster.

    The second thing is the network transfer during playback: CachedMultiSegmentHttpStream provides "chunks" of the file, including read-ahead and caching algorithms. Here we could tweak the chunk size and the number of read-ahead streams, but the settings will differ per media type (DVD/BD) based on their bitrate.
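
    Just to make that idea a bit more concrete, a sketch of what such per-media-type settings could look like (the class name and the numbers are placeholders for discussion, not the actual MP2 settings):

    Code:
      // Sketch only: hypothetical per-media-type tuning values for the HTTP stream cache.
      // None of these names exist in MP2 today; the numbers are guesses that would need testing.
      class StreamCacheSettings
      {
        public int ChunkSizeBytes;    // size of one cached segment
        public int ReadAheadChunks;   // how many chunks to prefetch ahead of the reader
        public int MaxCachedChunks;   // upper bound for cached chunks per stream

        public static StreamCacheSettings For(string mediaType)
        {
          switch (mediaType)
          {
            case "DVD":  // lower bitrate, smaller reads
              return new StreamCacheSettings { ChunkSizeBytes = 256 * 1024, ReadAheadChunks = 2, MaxCachedChunks = 16 };
            case "BD":   // high bitrate, needs more read-ahead
              return new StreamCacheSettings { ChunkSizeBytes = 1024 * 1024, ReadAheadChunks = 4, MaxCachedChunks = 32 };
            default:     // plain video files: keep the current defaults
              return new StreamCacheSettings { ChunkSizeBytes = 512 * 1024, ReadAheadChunks = 2, MaxCachedChunks = 16 };
          }
        }
      }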

    I'm not sure how much the situation can be improved by changing the playback to BDReader.ax (I haven't worked on this part for months).
     

    MJGraf

    Retired Team Member
  • Premium Supporter
  • January 13, 2006
    2,478
    1,385
    Thanks Morph,
    I will try to investigate this further - but this is definitely not something for the next release, because I fear it will require a lot of testing, in particular with many different file types and sizes.
    The first thing that comes to my mind is that, in the case of chained resource providers, we have to make sure we use the same stream for the same base resource (i.e. the ISO file), even when we access different files inside the ISO. I would bet that all the small files to be read by the BD library are already contained within the first 512 KB of the ISO. That would mean we only have to transfer these 512 KB once, and the BD library could get all these files from the local cache.
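
    To make that a bit more concrete, here is a rough sketch of a shared stream pool keyed by the underlying resource URL (the pool and its helpers are made up; the real CachedMultiSegmentHttpStream construction and lifetime handling look different, and concurrent readers would additionally need to synchronize their read positions):

    Code:
      // Sketch only: keep one remote stream per underlying resource (e.g. the ISO file),
      // keyed by its URL, so that all files read from inside the ISO share the same cache
      // and the first chunks of the ISO are transferred over the network only once.
      using System;
      using System.Collections.Concurrent;
      using System.IO;

      static class SharedRemoteStreamPool
      {
        static readonly ConcurrentDictionary<string, Lazy<Stream>> _streams =
            new ConcurrentDictionary<string, Lazy<Stream>>(StringComparer.OrdinalIgnoreCase);

        // All readers asking for the same base resource URL get the same stream instance.
        public static Stream GetOrOpen(string baseResourceUrl, Func<string, Stream> openRemoteStream)
        {
          return _streams.GetOrAdd(baseResourceUrl,
              url => new Lazy<Stream>(() => openRemoteStream(url))).Value;
        }

        // Must be called when playback of the item ends, otherwise the stream leaks.
        public static void Release(string baseResourceUrl)
        {
          if (_streams.TryRemove(baseResourceUrl, out Lazy<Stream> entry) && entry.IsValueCreated)
            entry.Value.Dispose();
        }
      }
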
    The second thing that comes to my mind is changing the chunk size, the maximum number of cached chunks and the number of chunks to be read ahead depending on the file size. For ISOs that are several GB in size, it may make sense to increase the chunk size from 512 KB to 1 or even 5 MB. On the other hand, as you can see from my logs, the BD filter seems to always read chunks of 128 KB (for DVDs it is 64 KB per stream read). So an improvement could also be to align the cache chunk size with the actual filter read size.
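
    As a rough sketch of that second idea (the size thresholds are guesses that would need testing, and the policy class does not exist in MP2):

    Code:
      // Sketch only: choose the cache chunk size from the total file size, but keep it a
      // multiple of the read size the filter actually uses (128 KB for BD, 64 KB for DVD
      // according to the logs above), so chunk boundaries fall on the filter's read granularity.
      using System;

      static class ChunkSizePolicy
      {
        const long OneMegabyte = 1024 * 1024;

        public static int ChooseChunkSize(long fileSizeBytes, int filterReadSizeBytes)
        {
          long desired;
          if (fileSizeBytes >= 20L * 1024 * OneMegabyte)   // huge ISOs (>= ~20 GB)
            desired = 5 * OneMegabyte;
          else if (fileSizeBytes >= 1024 * OneMegabyte)    // >= ~1 GB
            desired = 1 * OneMegabyte;
          else
            desired = 512 * 1024;                          // keep the current default

          // Round down to a multiple of the filter's read size.
          long aligned = (desired / filterReadSizeBytes) * filterReadSizeBytes;
          return (int)Math.Max(aligned, filterReadSizeBytes);
        }
      }
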
    I'll do some further tests and let you know. But this may take a while...
    Thanks for your help!
    Michael
     

    tourettes

    Retired Team Member
  • Premium Supporter
  • January 7, 2005
    17,301
    4,800
    I'm not sure how much the situation can be improved by changing the playback to BDReader.ax (I haven't worked on this part for months).

    Probably not much, if at all - libbluray accesses a lot of files during the startup phase.
     

    MJGraf

    Retired Team Member
  • Premium Supporter
  • January 13, 2006
    2,478
    1,385
    OK, a first test shows that extracting the Blu-ray from the ISO into a folder actually makes things worse. This time it just takes forever and then MP2-Client crashes (logs attached).
    Unfortunately, this speaks against my first suspicion. Now we definitely have a separate CachedMultiSegmentHttpStream for every single small .mpls file, which in turn means that using the ISO actually seems to improve the situation. Maybe we already have just one CachedMultiSegmentHttpStream for the ISO, which is then used for accessing all the files in it?!? I need to dig deeper...
    Michael
     

    MJGraf

    Retired Team Member
  • Premium Supporter
  • January 13, 2006
    2,478
    1,385
    Just a thought, which I haven't checked yet:

    Let's assume that, currently, the access to the file on the MP2-Client side looks like this:
    RemoteResourceProvider->IsoResourceProvider->StreamedResourceToLocalFSAccessBridge->FilterGraph
    This would explain my observations above.

    I'm wondering if it would be better to put another StreamedResourceToLocalFSAccessBridge right after the RemoteResourceProvider, like this:
    RemoteResourceProvider->StreamedResourceToLocalFSAccessBridge->IsoResourceProvider->StreamedResourceToLocalFSAccessBridge->FilterGraph
    If my thoughts are correct, this would mean that first only the ISO is mirrored locally via remote file transfer - and only once. All the rest, in particular accessing files inside the ISO, would happen locally.
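
    Expressed as a (purely hypothetical) sketch, the order of the steps would be something like the following; the helper names are made up and do not correspond to the real resource provider API:

    Code:
      // Sketch only: illustrate the proposed double-bridge chain with made-up helpers.
      // The real RemoteResourceProvider / IsoResourceProvider /
      // StreamedResourceToLocalFSAccessBridge APIs look different.
      using System.IO;

      static class ProposedAccessChain
      {
        public static string OpenRemoteIsoLocally(string remoteIsoUrl)
        {
          // Step 1: RemoteResourceProvider + first StreamedResourceToLocalFSAccessBridge:
          // mirror the ISO to a local path, transferring it over the network exactly once.
          string localIsoPath = MirrorToLocalFile(remoteIsoUrl);

          // Step 2: IsoResourceProvider + second bridge: everything inside the ISO
          // (index.bdmv, *.mpls, *.m2ts) is now resolved against the local copy only.
          return MountIsoLocally(localIsoPath);
        }

        // Hypothetical placeholders for the existing bridge/provider classes.
        static string MirrorToLocalFile(string remoteUrl) { return Path.GetTempFileName(); }
        static string MountIsoLocally(string localIsoPath) { return localIsoPath; }
      }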

    As mentioned, just a thought...
     
