Getting Project Server data using REST APIs in ADF
Hello,

I have created a pipeline that copies Project Server data from REST APIs to a JSON Datalake and then to my SQL Server database (SSMS). I was inspired by the following blog: https://techcommunity.microsoft.com/blog/projectsupport/reading-project-online-odata-with-azure-data-factory/3616300

I built the pipeline like in the blog and it looks like this:

The first 5 activities retrieve the login information necessary to access Project Server. All that information is brought together with a concat in the next step. I also have an access token (bearer token).

Next, I have three Copy activities, one for each table of Project Server data that I want to copy (Projects, Resources & Assignments). Each table has its own REST API. The Projects REST API looks like this:

The other two are the same, only with 'Assignments' and 'Resources' at the end. The Copy activity looks like this:

The Sink is a JSON Datalake. For the mapping part, I only put the ProjectId column for testing purposes.

My main issue is the number of rows that the APIs copy. First, the Projects API copies exactly 300 rows, while it should be 626. The Resources API copies more rows: 838, which is the right amount. The Assignments API copies exactly 1,000 rows, even though Assignments is a large fact table of more than 40,000 rows.

Another issue is the rest of the output I receive from the Copy activity: it does not give any info about 'offset' or 'pages', so I also cannot use pagination. I tried to work around this by adding "?top=1000" to the API URL, by using relative URLs, and with an Until loop, but nothing really seems to work. I have put simplified sketches of the Copy source and the Until loop at the end of this post.

Does someone have experience with this niche issue? I haven't found much documentation (the blog also doesn't mention anything about the row count) and I am new to ADF, so any help is more than welcome!

Thank you in advance!
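For reference, this is roughly the shape of the REST source on my Copy activity, including the pagination rule I expected to be able to use. The dataset names (ProjectServerRest, DatalakeJson), the BearerToken variable and the nextLink JSONPath are placeholders/guesses on my side, not something the blog or the Copy output confirms:

```json
{
    "name": "Copy Projects",
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "RestSource",
            "requestMethod": "GET",
            "additionalHeaders": {
                "Authorization": "Bearer @{variables('BearerToken')}",
                "Accept": "application/json"
            },
            "paginationRules": {
                "AbsoluteUrl": "$['odata.nextLink']"
            }
        },
        "sink": {
            "type": "JsonSink",
            "storeSettings": { "type": "AzureBlobFSWriteSettings" },
            "formatSettings": { "type": "JsonWriteSettings" }
        }
    },
    "inputs": [ { "referenceName": "ProjectServerRest", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "DatalakeJson", "type": "DatasetReference" } ]
}
```

Part of my uncertainty is that I don't know whether the ProjectData feed exposes a next-page link as odata.nextLink, @odata.nextLink or d.__next (it seems to depend on the Accept header / OData version), and the Copy output itself shows no offset or paging info at all.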
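And this is a simplified version of the Until loop I tried for the Assignments table, paging with the OData $top/$skip query options (in my first attempt I had written ?top=1000 without the $). The dataset and variable names (ProjectServerRest, DatalakeJson, Skip, SkipTemp) and the hard-coded 41000 limit are placeholders, not my exact pipeline:

```json
{
    "name": "Until all Assignments copied",
    "type": "Until",
    "typeProperties": {
        "expression": {
            "value": "@greaterOrEquals(int(variables('Skip')), 41000)",
            "type": "Expression"
        },
        "activities": [
            {
                "name": "Copy Assignments page",
                "type": "Copy",
                "typeProperties": {
                    "source": { "type": "RestSource", "requestMethod": "GET" },
                    "sink": { "type": "JsonSink" }
                },
                "inputs": [
                    {
                        "referenceName": "ProjectServerRest",
                        "type": "DatasetReference",
                        "parameters": {
                            "RelativeUrl": {
                                "value": "@concat('Assignments?$top=1000&$skip=', variables('Skip'))",
                                "type": "Expression"
                            }
                        }
                    }
                ],
                "outputs": [ { "referenceName": "DatalakeJson", "type": "DatasetReference" } ]
            },
            {
                "name": "Increment skip temp",
                "type": "SetVariable",
                "dependsOn": [ { "activity": "Copy Assignments page", "dependencyConditions": [ "Succeeded" ] } ],
                "typeProperties": {
                    "variableName": "SkipTemp",
                    "value": {
                        "value": "@string(add(int(variables('Skip')), 1000))",
                        "type": "Expression"
                    }
                }
            },
            {
                "name": "Increment skip",
                "type": "SetVariable",
                "dependsOn": [ { "activity": "Increment skip temp", "dependencyConditions": [ "Succeeded" ] } ],
                "typeProperties": {
                    "variableName": "Skip",
                    "value": {
                        "value": "@variables('SkipTemp')",
                        "type": "Expression"
                    }
                }
            }
        ]
    }
}
```

The two Set Variable steps are there because ADF doesn't let a Set Variable expression reference the variable it is setting; Skip and SkipTemp are String variables here, hence the int()/string() conversions. I'm also aware the sink file name would need to be parameterised per page so each iteration doesn't overwrite the previous one, but even as a sketch this didn't get me past the row-count limits described above.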