Large data set 'not responding' before replicating
-
I have a JSON file with 3,500 videos. Templater will replicate, but it takes 30 minutes of ‘not responding’ before it begins. I don’t attempt to preview here. However, if I split the JSON or CSV/TSV down to 1,200 videos, it proceeds to replicate after only a couple of minutes of not responding. Is this expected, and is there a data limit on a single JSON or TSV file?
-
Thanks for getting in touch with us. The answer depends on how you’re doing the replication. If you’re replicating using the UI, Templater will attempt to read the whole data set in before starting the replication process. This is why we generally don’t recommend using the Templater UI controls for large data sets. Instead, we’d suggest using Templater Bot since it will process the data set one row at a time and should be much less resource-intensive for data sets of this size. Hopefully, that helps, but if you have any other questions, just let us know.
-
@dhannigan To add to what Jeff said, when you set the render range in the Templater panel (rows x to z) and click either the “Render” or “Replicate” button, those processes work differently from the Bot, which can process large datasets in smaller chunks (1–20 rows can be queued at once; once those finish, the next set starts processing). If you set rows 1 through 3500 and click “Replicate,” Templater has to load all 3,500 rows into its memory, which is computationally taxing. Using Render/Replicate on large datasets is possible (even if it’s slow or unresponsive), but we don’t recommend using Templater that way, for exactly the symptoms you describe (hanging for half an hour). That’s why we added the Bot functionality. In the meantime, we’d generally recommend setting a smaller range of data manually, then once that range is done, setting the next range.
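As a stopgap until the Bot is set up, splitting the source file into smaller files can be scripted rather than done by hand. Here’s a minimal Python sketch, assuming the data file is a top-level JSON array of row objects (the `split_json_rows` helper and file names are illustrative, not part of Templater):

```python
import json
from pathlib import Path

def split_json_rows(src_path, out_dir, chunk_size=1200):
    """Split a JSON file containing a top-level array of rows into
    numbered files of at most chunk_size rows each."""
    rows = json.loads(Path(src_path).read_text())
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    out_files = []
    for i in range(0, len(rows), chunk_size):
        chunk = rows[i:i + chunk_size]
        # e.g. chunk_001.json, chunk_002.json, ...
        out_file = out_dir / f"chunk_{i // chunk_size + 1:03d}.json"
        out_file.write_text(json.dumps(chunk, indent=2))
        out_files.append(out_file)
    return out_files
```

For a 3,500-row file with the default chunk size of 1,200, this would produce three files (1,200 + 1,200 + 1,100 rows), each small enough to point the panel at in turn.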
-
Thank you for the detailed response. I got word we’re licensed for the Bot, so once we get that going it should streamline the process!