Feature: streaming_transcode_queue

Created on 12 October 2023, about 1 year ago
Updated 7 June 2024, 7 months ago

In other issues, the need for a module to manage queues has emerged. This issue collects and organizes the work on that feature.
The reference branch for proposing merge requests is 2.x. The 2.x branch is stable and working on D10, but one of its dependencies is not, so to get version 2 of this module running correctly on Drupal 10 you need to read https://www.drupal.org/project/php_ffmpeg/issues/3365929 (Port to D10? Closed: duplicate).

Thanks @sidgrafix <3 <3 <3

Feature request
Status

Active

Version

2.0

Component

Code

Created by

🇮🇹Italy arturopanetta Grotteria (RC)

Merge Requests

Comments & Activities

  • Issue created by @arturopanetta
  • First commit to issue fork.
  • Pipeline finished with Success
    about 1 year ago
    Total: 44s
    #30353
  • 🇺🇸United States sidgrafix

    @arturopanetta

    Apologies for the delay; I ended up sick at the end of last week and had only managed to set up the fork and pull it. I did some cleanup today and discovered a problem with my implementation. If I trigger the queue worker cron manually (which is how I was testing) using the simple_cron module, everything works; however, if cron is triggered via crontab (externally triggered), it does not, and it also doesn't produce any errors.

    It turns out to be user/session related, as cron by default always runs as the anonymous user. So if I manually trigger cron via the simple_cron interface, my session is attached as the (admin) user and it works.

    I tried a few things in the queue worker class itself using Drupal's account switcher with Drupal\Core\Session\AccountProxyInterface, but even though I can make it run as admin that way, it still won't fire when cron is triggered externally via crontab (a minimal sketch of the account-switching pattern is appended at the end of this comment). So now I'm attempting to add a service to handle triggering the content controller for streaming_media_views using a similar method. I even tried (temporarily) setting the controller to the 'access content' permission in the routing.yml, figuring that anonymous users can view content, just to test; that wouldn't be ideal in any respect, but it didn't help either.

    If I can't get it worked out tomorrow, I'll push the manual queue form (mostly working) to the fork and post what I'm attempting here in a comment for some feedback. I think my logic (thought process) for getting this to work right may be flawed somewhere, and I may be going about it all wrong.
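
    For reference, here is a minimal sketch of the account-switching pattern described above, as it might look inside a queue worker's processItem(). It is illustrative only, not the code in the fork; core's account_switcher service is real, but the uid and the surrounding context are placeholders:

    // Sketch only: inside the queue worker's processItem($data).
    /** @var \Drupal\Core\Session\AccountSwitcherInterface $switcher */
    $switcher = \Drupal::service('account_switcher');
    // Cron runs as the anonymous user, so temporarily switch to a
    // privileged account (uid 1 is purely illustrative; a dedicated
    // service account would be safer).
    $switcher->switchTo(\Drupal\user\Entity\User::load(1));
    try {
      // ... do the transcode work that needs the elevated session ...
    }
    finally {
      // Always restore the original (anonymous) cron session.
      $switcher->switchBack();
    }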

  • 🇺🇸United States sidgrafix

    Apologies for the delay; I had to focus on some other areas for a bit. However, I think I've finally got everything sorted. I just need to run some more tests and clean a couple of things up. I will push the files sometime next week at the latest, and I'll include some screenshots with this.

    To get cron and the QueueWorker processing correctly, I ended up turning the controller you had in the streaming_media_views module into a service for streaming_transcode_queue. I didn't modify the original, so it shouldn't impact anything you've previously done. There is a lot to what I've done, so I just want to double-check things before pushing to git.
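
    As a rough illustration of that approach (not the actual code that was pushed), a queue worker plugin can receive such an extracted service via dependency injection. The service id streaming_transcode_queue.transcoder, the class name, and the plugin id below are placeholders:

    <?php

    namespace Drupal\streaming_transcode_queue\Plugin\QueueWorker;

    use Drupal\Core\Plugin\ContainerFactoryPluginInterface;
    use Drupal\Core\Queue\QueueWorkerBase;
    use Symfony\Component\DependencyInjection\ContainerInterface;

    /**
     * Placeholder sketch: a queue worker that delegates to a transcode service.
     *
     * @QueueWorker(
     *   id = "streaming_transcode_queue",
     *   title = @Translation("Streaming transcode queue"),
     *   cron = {"time" = 60}
     * )
     */
    class TranscodeQueueWorker extends QueueWorkerBase implements ContainerFactoryPluginInterface {

      /**
       * The transcode service extracted from the streaming_media_views controller.
       */
      protected $transcoder;

      public function __construct(array $configuration, $plugin_id, $plugin_definition, $transcoder) {
        parent::__construct($configuration, $plugin_id, $plugin_definition);
        $this->transcoder = $transcoder;
      }

      public static function create(ContainerInterface $container, array $configuration, $plugin_id, $plugin_definition) {
        // 'streaming_transcode_queue.transcoder' is a made-up service id that
        // would be registered in the module's services.yml.
        return new static($configuration, $plugin_id, $plugin_definition, $container->get('streaming_transcode_queue.transcoder'));
      }

      public function processItem($data) {
        // Hand the queued item (e.g. a media entity id) to the service.
        $this->transcoder->transcode($data);
      }

    }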

  • Pipeline finished with Success
    11 months ago
    Total: 73s
    #80139
  • 🇺🇸United States sidgrafix

    This is a patch I'm including for "aminyazdanpanah/php-ffmpeg-video-streaming" (required by the streaming module) that fixes some PHP 8.x issues:

    Deprecated function: Return type of Streaming\RepsCollection::getIterator() should either be compatible with IteratorAggregate::getIterator(): Traversable, or the #[\ReturnTypeWillChange] attribute should be used to temporarily suppress the notice in include()

    and

    Deprecated function: Implicit conversion from float 0.5 to int loses precision in Streaming\Stream->__destruct()

    The patch above will only be valid for PHP 8.x. Once PHP 9.x is mainstream, I believe the RepsCollection.php file will need to be rewritten to be compatible with IteratorAggregate::getIterator(): Traversable (currently beyond my capabilities); should anyone know exactly what that means and what needs to be done, please submit a patch.
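
    For context, here is a sketch of the general shape of such a fix (not the patch itself; the class body and the $reps property name are assumptions):

    <?php

    // Illustrative only: the pattern used to silence the first notice on PHP 8.x.
    class RepsCollection implements \IteratorAggregate
    {
        private array $reps = [];

        // The attribute suppresses the deprecation notice without changing the
        // declared return type; once older PHP support is dropped, declaring
        // ": \Traversable" (and removing the attribute) is the long-term fix.
        #[\ReturnTypeWillChange]
        public function getIterator()
        {
            return new \ArrayIterator($this->reps);
        }
    }

    // The second notice is the standard PHP 8.1 implicit float-to-int
    // deprecation; the usual fix is an explicit cast, e.g. (int) round($value),
    // wherever Stream::__destruct() passes a float where an int is expected.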

    Additional notes and topics for discussion:

    Remote assets that are included in libraries.yml for streaming_media_views:

    css:
      theme:
        //vjs.zencdn.net/7.10.2/video-js.min.css: { type: external, minified: true }
    
    js:
      //vjs.zencdn.net/7.10.2/video.min.js: { type: external, minified: true }
      //unpkg.com/videojs-contrib-quality-levels@2.0.9/dist/videojs-contrib-quality-levels.min.js: { type: external, minified: true }
      //www.unpkg.com/videojs-http-source-selector@1.1.6/dist/videojs-http-source-selector.min.js: { type: external, minified: true }

    I personally believe these should be downloaded and added locally (either by the end user or packaged with streaming_media_views) rather than loaded remotely via URL on every page load. Reason: overall security.
    - Having them loaded by URL causes issues when using advanced security modules like CSP (Content Security Policy).

    On a personal note, I downloaded the needed files, added them to the assets directory in the streaming_media_views module, and modified streaming_media_views.libraries.yml as follows:

    videojs:
      css:
        theme:
          assets/video-js.min.css: { minified: true }
          assets/backend.css: {}
    
      js:
        assets/video.min.js: { minified: true }
        assets/videojs-contrib-quality-levels.min.js: { minified: true }
        assets/videojs-http-source-selector.min.js: { minified: true }
        assets/video.js: {}

    Regarding the CSS and JS: if someone wants a patch for that, let me know and I'll add one as well.

    Note on video formats: initially I was under the impression that the streaming module required the uploaded video (the one to be transcoded) to be in MP4 format. This is not the case; pretty much any format can be used (depending on the version of ffmpeg and which libraries it was compiled with).

    If your Apache-based web server happens to be built on Debian Linux 9 or 10 and ffmpeg was installed via the Debian package manager, you should be able to upload videos in mp4, avi, mov, flv, ogg, ogv, ogm, webm, mpeg, mpg, or m4v format and have them successfully transcoded to a streaming format.

    Lastly, I would like to discuss the handling of transcode-on-save. I included an option in the "Miscellaneous" settings of streaming_transcode_queue called "Transcode on node entity save". It works, but because transcoding runs attached to the user's session, it can make your site seem like it is crawling for the user who uploaded the video, similar to how transcoding works from the admin media page (the original/default method for streaming_media_views). It also suppresses any Dblog logging produced during transcode processing until transcoding has completed. In comparison, transcoding that runs via the QueueWorker using the streaming transcode queue happens in the background, logs continually to Dblog as each step occurs, and does not appear to have any performance impact on users while transcoding is happening.

    - Initially I was exploring options to use a callback function to trigger transcoding on node save in the background. However, I'm not convinced that would actually work. Plus, I couldn't figure out a method to check whether any of the fields on a node are media entity references (either by bundle type or by the video file field name field_media_video_file) without knowing the actual field name that was created/added to the content type for uploading a video, which in theory could be named anything, e.g. a generic name like field_video_upload.
    - So I'm looking into alternative methods (ideally without the need to include any additional libraries) where you could send something off to be processed without waiting for it to complete (basically trigger a function without waiting for a response).
    - My current thought is, instead of trying to make a callback work, to use another queue that gets checked once a minute. For example, a queue named "transcode_now": if the option to transcode on node entity save is on, the newly uploaded video's details get added to that queue. The next minute, the transcode_now queue is checked by its cron queue worker and, if there are any items in the queue, transcoding starts and completes in the background. Maybe that's overkill, or maybe there is a better way (I'm not sure myself), so if anyone has a thought on that please chime in (the idea is sketched below).
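
    To make the transcode_now idea concrete, here is a hypothetical sketch (the hook implementation, queue name, and item structure are all placeholders, not code from the fork); it also shows one possible way to detect media entity reference fields without knowing their machine names:

    <?php

    use Drupal\node\NodeInterface;

    /**
     * Implements hook_ENTITY_TYPE_insert() for node entities (sketch only).
     */
    function streaming_transcode_queue_node_insert(NodeInterface $node) {
      // Find media entity reference fields without knowing their machine names.
      foreach ($node->getFieldDefinitions() as $field_name => $definition) {
        if ($definition->getType() === 'entity_reference'
          && $definition->getSetting('target_type') === 'media'
          && !$node->get($field_name)->isEmpty()) {
          // Hand the work to the hypothetical "transcode_now" queue; its queue
          // worker would pick the item up on the next cron run and transcode
          // in the background, outside the user's session.
          \Drupal::queue('transcode_now')->createItem([
            'nid' => $node->id(),
            'field' => $field_name,
          ]);
        }
      }
    }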

  • Pipeline finished with Success
    11 months ago
    #88116
  • 🇺🇸United States sidgrafix

    Not exactly sure why, but applying the merge request as a patch fails and causes all kinds of wacky issues. I've looked at the repo and it all looks correct, and this isn't the first time I've used a merge request as a patch file. Regardless, when trying to patch via composer it ends up adding the feature three times and leaves a bunch of .rej files. Initially I assumed that, even though composer said it couldn't apply the patch, it would have reverted any changes it attempted; it did not, and it had in fact applied the patch (but, as I said, far from cleanly), which caused me some serious headaches for a couple of days until I tracked down what had actually happened.

    So I'm including a regular patch file here that works no problem. I'm also including a patch that makes the externally sourced CSS and JS for the streaming_media_views module local (mentioned above in #7).

    Patches apply to streaming 2.x-dev

    - I did a comparison of the feature patch here against the merge diff used as a patch and noticed two chunks of extra data at the top and bottom marked GITLAB, which is probably why applying the merge request as a patch borks.

  • 🇺🇸United States sidgrafix

    Adding an updated patch to add the streaming_transcode_queue module to streaming (made against 2.x-dev; it should also apply to 2.x-alpha1). I think my initial git push of the module is why patching fails when trying to patch from the merge request URL; at the time my version of git was outdated and I think it caused issues. When I can, I'll redo all of that and push everything fresh.

    - Just fixed a couple of small issues with how the module behaves when a fatal error occurs, or when an error thought to be fatal turns out not to be.

    When I can, I plan to rework a lot of this module in the future by adding a second queue to handle transcoding verification and implementing some events and event subscribers to handle things more efficiently. I've learned a considerable amount in other areas of Drupal since my initial version of this, which should drastically improve things.
