Viva Insights query integration with enterprise systems

Glen_Hing
Valued Contributor II

Currently we access our Viva Insights queries manually through a Microsoft API, downloading the query result sets to our enterprise backbone. One of our loads takes roughly three days to execute and often times out. We are looking for a solution and are open to suggestions.

- Is there a configurable API, filterable by date range, that we could use instead of the current Microsoft API, which only allows us to access 10,000 records at a time?

- Could a query automatically generate a file that is then sent to us via SFTP?
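For context on the 10K-record limit mentioned above, the usual workaround is client-side paging: request one page at a time and keep going until a short page signals the end. A minimal sketch, assuming an OData-style `$skip`/`$top` endpoint; `fetch_page` here is a stub standing in for the real HTTP call, and the endpoint shape is an assumption rather than the documented Viva Insights API:

```python
# Client-side paging over a hypothetical OData-style query endpoint.
# PAGE_SIZE mirrors the per-request record cap described in the post.

PAGE_SIZE = 10_000

def fetch_page(skip, top, _data=list(range(25_000))):
    """Stub for GET <endpoint>?$skip={skip}&$top={top} -- returns one page.
    In a real integration this would be an authenticated HTTP request."""
    return _data[skip:skip + top]

def fetch_all(page_size=PAGE_SIZE):
    """Accumulate pages until a short (or empty) page signals the end."""
    records, skip = [], 0
    while True:
        page = fetch_page(skip, page_size)
        records.extend(page)
        if len(page) < page_size:
            return records
        skip += page_size

rows = fetch_all()
print(len(rows))  # 25000 -- all stub records retrieved across three requests
```

Paging keeps each request small, but it does not shorten the total load; that is why the date-filter and incremental-load asks below still matter.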

Looking forward to hearing from you; this request is urgent.

Thanks, Glen MH


7 Replies

Dazza
Valued Contributor II

Hi @Glen_Hing

I'm not sure whether this post might help: Solved: Re: Using Data Factory via OData connector - Microsoft Viva Insights Community

Unless you're already doing something similar?

Glen_Hing
Valued Contributor II

Hi Dazza,

Thank you very much for the prompt reply, but we are already doing this, accessing queries that we pre-built in our WpA instance. The limitation is that we would need to build a separate query for each month of data we want to download.


What we are looking for is a flexible API where we can pass one or more parameters, for example the month of data we are trying to download. The query would remain the same, but the data returned would change based on the month we request.
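One way to express that "same query, different month" idea without pre-building a query per month is to generate a date-range filter clause from a month parameter. A minimal sketch; the field name `MetricDate` and the OData-style `$filter` syntax are assumptions, not the confirmed Viva Insights schema:

```python
from datetime import date

def month_filter(year, month):
    """Build an OData-style $filter clause covering one calendar month.
    'MetricDate' is a hypothetical field name -- adjust to the real schema."""
    start = date(year, month, 1)
    # First day of the following month (roll over the year after December).
    end = date(year + 1, 1, 1) if month == 12 else date(year, month + 1, 1)
    return f"MetricDate ge {start.isoformat()} and MetricDate lt {end.isoformat()}"

print(month_filter(2022, 3))
# MetricDate ge 2022-03-01 and MetricDate lt 2022-04-01
```

Using a half-open range (`ge` start, `lt` next month's start) avoids off-by-one issues with month lengths and leap years.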

Looking for feedback on this.

Thanks,

Glen


Glen_Hing
Valued Contributor II

Hi Dazza,

  1. In WpA/Viva Insights, can we have an API with date filters?
  2. Is it possible to extract data via the API for incremental loads based on a last-updated timestamp (monthly, daily, or by date range)?


We are currently loading data from March 2021 to March 2022, about 60 million records, and this takes around four days. The volume is expected to grow every month, so reloading everything each month is not feasible.

If we had an API that could fetch only the data changed since our last extract, the number of records would be manageable to extract and merge into the existing historical load. Please suggest an approach for handling incremental loads via the API.
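The incremental pattern described above is usually implemented as a watermark plus an upsert: remember the timestamp of the last extract, fetch only rows modified after it, and merge them into the historical set by record id. A minimal sketch with in-memory data; the field names (`id`, `updated`) stand in for whatever the real export schema provides, and `extract_changed` would be a timestamp-filtered API call in practice:

```python
def extract_changed(source, watermark):
    """Rows modified after the watermark (stands in for a filtered API call).
    ISO-8601 date strings compare correctly as plain strings."""
    return [r for r in source if r["updated"] > watermark]

def merge(history, changes):
    """Upsert changed rows into the historical set, keyed on 'id'."""
    by_id = {r["id"]: r for r in history}
    for r in changes:
        by_id[r["id"]] = r
    return list(by_id.values())

history = [{"id": 1, "v": "old", "updated": "2022-02-01"},
           {"id": 2, "v": "ok",  "updated": "2022-02-15"}]
source  = [{"id": 1, "v": "new",   "updated": "2022-03-05"},
           {"id": 2, "v": "ok",    "updated": "2022-02-15"},
           {"id": 3, "v": "added", "updated": "2022-03-10"}]

changed = extract_changed(source, "2022-03-01")  # only ids 1 and 3
merged = merge(history, changed)
print(sorted(r["id"] for r in merged))  # [1, 2, 3]
```

With this shape, each monthly run touches only the changed rows instead of the full 60-million-record history, which is what makes the load time manageable.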

Thanks,

Glen


Glen_Hing
Valued Contributor II

No, we did not accept the solution, since it is what we are already doing. Please see my post from yesterday.

I second this request. We need a more efficient way to pull data automatically each month. The current process is fine for ad hoc analysis but difficult to scale. A timeline on a more flexible API would be great. In fact, if there were an API that made it easier to build the different types of queries through a more advanced interface, it would make running analyses far more powerful.

Dazza
Valued Contributor II

Totally in agreement. I'm currently researching this (Automate query data exports | Microsoft Docs), so I don't have answers at this point.

@Jake_Caddes, do you think you could get the product team to provide some insights on this?

Jake_Caddes
Community Manager

Hi @Glen_Hing @Dazza, I apologize for taking a while to get back to you both. I finally heard back from the Product Team about this one. This is what they said:
"Hi to all - thanks for providing this excellent and specific feedback about a use case you would like to see in Insights. We have heard this ask for more flexible data egress solutions from other customers and partners as well, and we currently have solutions in development. We aren't yet in a position to provide a timeline, but we will update our public roadmap when that changes. And if we need more context to understand your scenarios, we will reach back out."
