Thank you for your response.
While attempting to write the model to ADX, I encountered the following error:
Py4JJavaError: An error occurred while calling o1118.save.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 7 in stage 43.0 failed 4 times, most recent failure: Lost task 7.3 in stage 43.0 (TID 186) (10.139.64.4 executor 0): java.lang.NoSuchMethodError: com.azure.core.util.CoreUtils.getDefaultTimeoutFromEnvironment(Lcom/azure/core/util/Configuration;Ljava/lang/String;Ljava/time/Duration;Lcom/azure/core/util/logging/ClientLogger;)Ljava/time/Duration;
at com.azure.core.http.netty.NettyAsyncHttpClientBuilder.<clinit>(NettyAsyncHttpClientBuilder.java:79)
Upon further investigation, I found that this issue might be caused by a version mismatch among the packages.
Currently, I am using the following packages:
1. com.microsoft.azure.kusto:kusto-data:5.0.0
2. com.microsoft.azure.kusto:kusto-ingest:5.0.0
3. com.microsoft.azure.kusto:kusto-spark_3.0_2.12:5.0.0
With the same set of packages and versions, I can successfully read from ADX, but the write fails.
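To check the version-mismatch hypothesis, one crude test (a sketch, not from the docs — the jar path on the cluster is an assumption you would need to adjust) is to search the azure-core jar's `CoreUtils.class` for the missing method name, since compiled class files keep method names as plain UTF-8 strings in their constant pool:

```python
import io
import zipfile

def jar_has_method(jar_path, class_entry, method_name):
    """Return True if the class file inside the jar mentions method_name.

    Compiled Java class files store method names as UTF-8 strings in the
    constant pool, so a simple byte search is a quick presence check.
    """
    with zipfile.ZipFile(jar_path) as jar:
        data = jar.read(class_entry)
    return method_name.encode("utf-8") in data

# Demonstration on a synthetic in-memory jar (a real class file behaves the
# same way for this check). On the cluster, point jar_path at the actual
# azure-core jar instead, e.g. somewhere under /databricks/jars/ (assumed path).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as jar:
    jar.writestr("com/azure/core/util/CoreUtils.class",
                 b"...getDefaultTimeoutFromEnvironment...")

print(jar_has_method(buf, "com/azure/core/util/CoreUtils.class",
                     "getDefaultTimeoutFromEnvironment"))  # True
```

If the check returns False for the azure-core jar actually on the classpath, that jar predates `getDefaultTimeoutFromEnvironment` and conflicts with what the Kusto connector expects.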
I have Database Admin permissions.
Note: I am executing my code on Databricks only, as per the documentation.