
Hi, I've been trying to import my invoice items for all of 2024 using the API, but so far it's quite a slow process. The API doesn't seem to allow me to import chunks of data (an updated_before parameter would be sufficient), so I've resorted to taking all invoice IDs from the invoice table and requesting the invoice items per invoice.
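For reference, the per-invoice loop I'm using looks roughly like this (the endpoint path and the `fetch` helper are placeholders for my actual HTTP client, not the API's real names):

```python
def fetch_all_invoice_items(fetch, invoice_ids):
    """Collect invoice items one invoice at a time: one API call per invoice."""
    items = []
    for invoice_id in invoice_ids:
        # One call per invoice -- this is the expensive part.
        items.extend(fetch(f"/invoices/{invoice_id}/items"))
    return items

# Example with a stubbed fetch function standing in for the real client:
fake_api = {
    "/invoices/1/items": [{"id": 10, "invoice_id": 1}],
    "/invoices/2/items": [{"id": 11, "invoice_id": 2}],
}
all_items = fetch_all_invoice_items(fake_api.get, [1, 2])
```

With thousands of invoices in a year, that loop is where all the time goes.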

The problem is that this costs one API call per invoice, so importing all of 2024 is going to take forever.

Is there a better way to get the invoice item data?

For context, what I'm trying to achieve is to link treatment IDs to the payment allocations table. So far my route has been to link payment allocations via the invoice item ID to a treatment plan item, which in turn has a treatment ID linked to it. If there's a smarter way of approaching this, I'd be happy to hear about it.
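The linking chain I described can be sketched as a couple of dictionary joins (the field names here are my assumptions about how the tables relate, not the API's exact schema):

```python
def link_allocations_to_treatments(allocations, item_to_plan_item, plan_item_to_treatment):
    """Follow allocation -> invoice item -> treatment plan item -> treatment ID.

    `item_to_plan_item` maps invoice_item_id -> treatment_plan_item_id;
    `plan_item_to_treatment` maps treatment_plan_item_id -> treatment_id.
    Field names are placeholders for the real schema.
    """
    linked = []
    for alloc in allocations:
        plan_item_id = item_to_plan_item.get(alloc["invoice_item_id"])
        treatment_id = plan_item_to_treatment.get(plan_item_id)
        linked.append({**alloc, "treatment_id": treatment_id})
    return linked

allocations = [{"id": 1, "invoice_item_id": 100, "amount": 50.0}]
item_to_plan_item = {100: 200}       # invoice item -> treatment plan item
plan_item_to_treatment = {200: 300}  # treatment plan item -> treatment ID
result = link_allocations_to_treatments(allocations, item_to_plan_item, plan_item_to_treatment)
```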

Hi Enzolima,

Thanks for your question. Unfortunately, there doesn't currently seem to be a parameter in our API (such as updated_before) to bulk-import invoice items. Right now you do need to call the invoice items endpoint for each invoice individually, which I realise is not ideal!

Here are a few suggestions you could try:

1. Minimise the number of API calls by first filtering invoices by date (using our Invoices endpoint) so that you only process invoices issued in 2024, and then retrieve their items. This should reduce the volume of calls.
2. Since your goal is to link treatment IDs to the payment allocations table, it may be worth exploring whether the Payment Allocations endpoint can be queried more directly, reducing the dependency on invoice item calls.
3. Our team is working hard on some great improvements to our API and we can't wait to show them to everybody! If you have any additional feedback on this or other features you'd like to see, feel free to share it here.
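For suggestion 1, here's a minimal sketch of the date-filtering step. The `issued_at` field name and ISO date format are assumptions; please check the Invoices endpoint's response for the actual field:

```python
from datetime import date

def invoices_in_year(invoices, year):
    """Keep only invoices issued in the given year.

    Assumes each invoice dict carries an ISO-format 'issued_at' date string;
    adjust the field name to match the real response.
    """
    keep = []
    for inv in invoices:
        issued = date.fromisoformat(inv["issued_at"][:10])
        if issued.year == year:
            keep.append(inv)
    return keep

invoices = [
    {"id": 1, "issued_at": "2023-12-30"},
    {"id": 2, "issued_at": "2024-01-05"},
    {"id": 3, "issued_at": "2024-11-20"},
]
targets = invoices_in_year(invoices, 2024)  # only invoices 2 and 3 remain
```

You'd then fetch invoice items only for the invoices in `targets`, which avoids spending calls on other years.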

Hope this helps clarify the current setup. Let me know if you need further details or have any other ideas! I'm also waiting to hear back from our engineering team in case there are any other workarounds.
