Dataclay — Automating Digital Production

    Data Source size limit?

    Local JSON · 4 posts · 3 posters
    • pbretz

      Hi,

      We’re testing out using local JSON as a data source for our projects. In Google Sheets, our data is composed of about 150 columns and can range anywhere from 1 to 750 records. I ran a test with about 30 records: the JSON took a little while to load (the file sits on the same local hard drive as the .aep), but it rendered much more quickly than a Google Sheet due to minimal delay between renders. I then ran a test with a file of about 500 records, which took so long that it never actually loaded. Would you expect this to be the case? Is there a maximum number of key:value pairs and jobs that we should consider? Thanks.

      Pete

    • Jeff @pbretz

        @pbretz

        The main issue with this scenario is the amount of memory After Effects allocates to ExtendScripts like Templater. Adobe doesn’t want third-party apps to have free rein over all of a system’s memory, which means that loading a large Data Source, as you’ve described here, can be a bit unwieldy. The problem is that Templater has to open and read the entire dataset with these limited resources, which causes a significant slowdown.

        To get around this, we’d need to feed Templater the data in smaller chunks. The easiest way to achieve this would probably be to use a JSON file with only one entry and then use an Event Script on the “After Output” Templater Event to update the JSON with the next record. That would probably be the most efficient method and should keep the time required to parse the JSON to a minimum.
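As a rough sketch of that approach, an “After Output” event script could overwrite the one-record JSON file with the next record from the full dataset. The file names, the cursor file, and the dataset layout below are all assumptions for illustration, not Templater specifics:

```python
import json
from pathlib import Path

# Hypothetical file names -- adjust to match your project layout.
DATASET = Path("full_dataset.json")   # the complete dataset (e.g. 500 records)
FEED = Path("templater_feed.json")    # the one-record file Templater reads
CURSOR = Path("cursor.txt")           # remembers which record comes next

def advance_feed():
    """Overwrite the feed file with the next record; return False when done."""
    records = json.loads(DATASET.read_text())
    i = int(CURSOR.read_text()) if CURSOR.exists() else 0
    if i >= len(records):
        return False  # every record has been rendered
    FEED.write_text(json.dumps([records[i]], indent=2))
    CURSOR.write_text(str(i + 1))
    return True

# Templater would invoke this script on its "After Output" event, so each
# render leaves the next record in place for the following job.
if DATASET.exists():
    advance_feed()
```

Because Templater only ever parses a single record, the load time stays flat no matter how large the full dataset grows.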

        Another option would be to use a custom JSON URL feed to pass one record at a time to Templater for processing. This method also requires an Event Script to update the JSON record after each render, since Templater can’t write back to the URL feed to update the render-status. However, this can also be a very efficient way to pass data to Templater, as you can customize the URL feed internally.
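A minimal sketch of such a feed, assuming Templater polls a local URL and an event script advances the feed between renders. The port, the POST-to-advance convention, and the field names are all hypothetical:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Sample records standing in for the full dataset (hypothetical field names).
RECORDS = [
    {"id": 1, "headline": "First spot"},
    {"id": 2, "headline": "Second spot"},
]
cursor = {"i": 0}  # index of the record currently being served

class FeedHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Templater polls this URL and always sees a one-record array.
        body = json.dumps([RECORDS[cursor["i"]]]).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        # An "After Output" event script POSTs here to advance the feed.
        cursor["i"] = min(cursor["i"] + 1, len(RECORDS) - 1)
        self.send_response(204)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the console quiet

def serve(port=8080):
    HTTPServer(("127.0.0.1", port), FeedHandler).serve_forever()
```

The advantage of this route is that the feed logic lives entirely on your side, so you can pull records from any internal system rather than a flat file.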

        Hopefully, that all makes sense, but if you have any other questions, let us know.

        Thanks,

        Jeff

        • ariestav @pbretz

          @pbretz Adding onto what @Jeff wrote, if you are going to have a single entry in the local JSON file, you’d want to use a render-status property key in the JSON object so that Bot would know when the overwritten file was ready to be processed.
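For illustration, the overwritten single-record file might look something like this. The property names other than render-status are hypothetical, and the status value is whatever convention your event script and Bot agree on:

```json
[
  {
    "render-status": "ready",
    "headline": "Example headline",
    "cta-text": "Shop Now"
  }
]
```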

          • pbretz @ariestav

            @ariestav @Jeff Thanks for the info. Sounds like we’d be better off just using Que?!
