
Evaluation Remote Not Working #160

Open
konabuta opened this issue Dec 9, 2024 · 1 comment
Labels
bug Something isn't working

Comments

@konabuta commented Dec 9, 2024

Operating System

macOS

Version Information

Python Version: 3.12.7
Package versions:
azure-ai-evaluation 1.0.1
azure-ai-ml 1.22.4
azure-ai-projects 1.0.0b2
azure-common 1.1.28
azure-core 1.32.0
azure-identity 1.19.0
azure-mgmt-core 1.5.0
azure-monitor-opentelemetry-exporter 1.0.0b32
azure-storage-blob 12.24.0
azure-storage-file-datalake 12.18.0
azure-storage-file-share 12.20.0

Steps to reproduce

  1. Run Evaluate_On_Cloud.ipynb
     path: scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/Evaluate_On_Cloud.ipynb

  2. Get an error from the project_client.evaluations.create call.

Expected behavior

The evaluation is created from code without errors.

Actual behavior

Got the error: DecodeError: Cannot deserialize content-type: application/octet-stream

The full stack trace is below.

---------------------------------------------------------------------------
DecodeError                               Traceback (most recent call last)
Cell In[13], line 9
      1 evaluation = Evaluation(
      2     display_name="Cloud Evaluation",
      3     description="Cloud Evaluation of dataset",
      4     data=Dataset(id=data_id),
      5     #evaluators=evaluators,
      6 )
      8 # Create evaluation
----> 9 evaluation_response = project_client.evaluations.create(
     10     evaluation=evaluation,
     11 )

File ~/code/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/tracing/decorator.py:105, in distributed_trace.<locals>.decorator.<locals>.wrapper_use_tracer(*args, **kwargs)
    103 span_impl_type = settings.tracing_implementation()
    104 if span_impl_type is None:
--> 105     return func(*args, **kwargs)
    107 # Merge span is parameter is set, but only if no explicit parent are passed
    108 if merge_span and not passed_in_parent:

File ~/code/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/ai/projects/operations/_operations.py:6969, in EvaluationsOperations.create(self, evaluation, **kwargs)
   6966 _request.url = self._client.format_url(_request.url, **path_format_arguments)
   6968 _stream = kwargs.pop("stream", False)
-> 6969 pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
   6970     _request, stream=_stream, **kwargs
   6971 )
   6973 response = pipeline_response.http_response
   6975 if response.status_code not in [201]:

File ~/code/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_base.py:240, in Pipeline.run(self, request, **kwargs)
    238 pipeline_request: PipelineRequest[HTTPRequestType] = PipelineRequest(request, context)
    239 first_node = self._impl_policies[0] if self._impl_policies else _TransportRunner(self._transport)
--> 240 return first_node.send(pipeline_request)

File ~/code/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_base.py:96, in _SansIOHTTPPolicyRunner.send(self, request)
     94 _await_result(self._policy.on_request, request)
     95 try:
---> 96     response = self.next.send(request)
     97 except Exception:
     98     _await_result(self._policy.on_exception, request)

File ~/code/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_base.py:96, in _SansIOHTTPPolicyRunner.send(self, request)
     94 _await_result(self._policy.on_request, request)
     95 try:
---> 96     response = self.next.send(request)
     97 except Exception:
     98     _await_result(self._policy.on_exception, request)

    [... skipping similar frames: _SansIOHTTPPolicyRunner.send at line 96 (1 times)]

File ~/code/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_base.py:96, in _SansIOHTTPPolicyRunner.send(self, request)
     94 _await_result(self._policy.on_request, request)
     95 try:
---> 96     response = self.next.send(request)
     97 except Exception:
     98     _await_result(self._policy.on_exception, request)

File ~/code/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_base.py:100, in _SansIOHTTPPolicyRunner.send(self, request)
     98     _await_result(self._policy.on_exception, request)
     99     raise
--> 100 _await_result(self._policy.on_response, request, response)
    101 return response

File ~/code/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_tools.py:49, in await_result(func, *args, **kwargs)
     38 def await_result(func: Callable[P, T], *args: P.args, **kwargs: P.kwargs) -> T:
     39     """If func returns an awaitable, raise that this runner can't handle it.
     40 
     41     :param func: The function to run.
   (...)
     47     :raises: TypeError
     48     """
---> 49     result = func(*args, **kwargs)
     50     if hasattr(result, "__await__"):
     51         raise TypeError("Policy {} returned awaitable object in non-async pipeline.".format(func))

File ~/code/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:714, in ContentDecodePolicy.on_response(self, request, response)
    710     return
    712 response_encoding = request.context.get("response_encoding")
--> 714 response.context[self.CONTEXT_NAME] = self.deserialize_from_http_generics(
    715     response.http_response, response_encoding
    716 )

File ~/code/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:681, in ContentDecodePolicy.deserialize_from_http_generics(cls, response, encoding)
    679     if not inspect.iscoroutinefunction(response.read):  # type: ignore
    680         response.read()  # type: ignore
--> 681 return cls.deserialize_from_text(response.text(encoding), mime_type, response=response)

File ~/code/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:641, in ContentDecodePolicy.deserialize_from_text(cls, data, mime_type, response)
    639 elif mime_type.startswith("text/"):
    640     return data_as_str
--> 641 raise DecodeError("Cannot deserialize content-type: {}".format(mime_type))

DecodeError: Cannot deserialize content-type: application/octet-stream

Additional information

No response

konabuta added the bug label on Dec 9, 2024
@eitansela (Contributor) commented

Reproduced the bug:

---------------------------------------------------------------------------
DecodeError                               Traceback (most recent call last)
Cell In[7], line 9
      1 evaluation = Evaluation(
      2     display_name="Cloud Evaluation",
      3     description="Cloud Evaluation of dataset",
      4     data=Dataset(id=data_id),
      5     evaluators=evaluators,
      6 )
      8 # Create evaluation
----> 9 evaluation_response = project_client.evaluations.create(
     10     evaluation=evaluation,
     11 )

File ~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/tracing/decorator.py:105, in distributed_trace.<locals>.decorator.<locals>.wrapper_use_tracer(*args, **kwargs)
    103 span_impl_type = settings.tracing_implementation()
    104 if span_impl_type is None:
--> 105     return func(*args, **kwargs)
    107 # Merge span is parameter is set, but only if no explicit parent are passed
    108 if merge_span and not passed_in_parent:

File ~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/ai/projects/operations/_operations.py:7067, in EvaluationsOperations.create(self, evaluation, **kwargs)
   7064 _request.url = self._client.format_url(_request.url, **path_format_arguments)
   7066 _stream = kwargs.pop("stream", False)
-> 7067 pipeline_response: PipelineResponse = self._client._pipeline.run(  # pylint: disable=protected-access
   7068     _request, stream=_stream, **kwargs
   7069 )
   7071 response = pipeline_response.http_response
   7073 if response.status_code not in [201]:

File ~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_base.py:240, in Pipeline.run(self, request, **kwargs)
    238 pipeline_request: PipelineRequest[HTTPRequestType] = PipelineRequest(request, context)
    239 first_node = self._impl_policies[0] if self._impl_policies else _TransportRunner(self._transport)
--> 240 return first_node.send(pipeline_request)

File ~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_base.py:96, in _SansIOHTTPPolicyRunner.send(self, request)
     94 _await_result(self._policy.on_request, request)
     95 try:
---> 96     response = self.next.send(request)
     97 except Exception:
     98     _await_result(self._policy.on_exception, request)

File ~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_base.py:96, in _SansIOHTTPPolicyRunner.send(self, request)
     94 _await_result(self._policy.on_request, request)
     95 try:
---> 96     response = self.next.send(request)
     97 except Exception:
     98     _await_result(self._policy.on_exception, request)

    [... skipping similar frames: _SansIOHTTPPolicyRunner.send at line 96 (1 times)]

File ~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_base.py:96, in _SansIOHTTPPolicyRunner.send(self, request)
     94 _await_result(self._policy.on_request, request)
     95 try:
---> 96     response = self.next.send(request)
     97 except Exception:
     98     _await_result(self._policy.on_exception, request)

File ~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_base.py:100, in _SansIOHTTPPolicyRunner.send(self, request)
     98     _await_result(self._policy.on_exception, request)
     [99](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_base.py:99)     raise
--> [100](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_base.py:100) _await_result(self._policy.on_response, request, response)
    [101](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_base.py:101) return response

File ~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_tools.py:49, in await_result(func, *args, **kwargs)
     [38](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_tools.py:38) def await_result(func: Callable[P, T], *args: P.args, **kwargs: P.kwargs) -> T:
     [39](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_tools.py:39)     """If func returns an awaitable, raise that this runner can't handle it.
     [40](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_tools.py:40) 
     [41](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_tools.py:41)     :param func: The function to run.
   (...)
     [47](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_tools.py:47)     :raises: TypeError
     [48](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_tools.py:48)     """
---> [49](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_tools.py:49)     result = func(*args, **kwargs)
     [50](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_tools.py:50)     if hasattr(result, "__await__"):
     [51](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/_tools.py:51)         raise TypeError("Policy {} returned awaitable object in non-async pipeline.".format(func))

File ~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:714, in ContentDecodePolicy.on_response(self, request, response)
    [710](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:710)     return
    [712](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:712) response_encoding = request.context.get("response_encoding")
--> [714](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:714) response.context[self.CONTEXT_NAME] = self.deserialize_from_http_generics(
    [715](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:715)     response.http_response, response_encoding
    [716](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:716) )

File ~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:681, in ContentDecodePolicy.deserialize_from_http_generics(cls, response, encoding)
    [679](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:679)     if not inspect.iscoroutinefunction(response.read):  # type: ignore
    [680](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:680)         response.read()  # type: ignore
--> [681](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:681) return cls.deserialize_from_text(response.text(encoding), mime_type, response=response)

File ~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:641, in ContentDecodePolicy.deserialize_from_text(cls, data, mime_type, response)
    [639](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:639) elif mime_type.startswith("text/"):
    [640](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:640)     return data_as_str
--> [641](https://file+.vscode-resource.vscode-cdn.net/Users/eitansela/Dev/github/azureai-samples/scenarios/evaluate/Supported_Evaluation_Targets/Evaluate_On_Cloud/~/Dev/github/azureai-samples/scenarios/evaluate/.venv/lib/python3.12/site-packages/azure/core/pipeline/policies/_universal.py:641) raise DecodeError("Cannot deserialize content-type: {}".format(mime_type))

DecodeError: Cannot deserialize content-type: application/octet-stream
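For context on where this error comes from: the final frames show azure-core's `ContentDecodePolicy` dispatching on the response's `Content-Type`. It only knows how to deserialize JSON, XML, and `text/*` bodies, so a service reply tagged `application/octet-stream` falls through to `DecodeError`. A minimal sketch of that dispatch (simplified stand-in, not the real `azure.core` class):

```python
import json


class DecodeError(ValueError):
    """Stand-in for azure.core.exceptions.DecodeError."""


def deserialize_from_text(data: str, mime_type: str):
    # Simplified version of ContentDecodePolicy.deserialize_from_text:
    # known mime types are decoded, anything else raises DecodeError.
    if "json" in mime_type:
        return json.loads(data)
    if mime_type.startswith("text/"):
        return data
    raise DecodeError("Cannot deserialize content-type: {}".format(mime_type))
```

So the client-side decode policy is behaving as designed; the surprise is that the evaluations endpoint returned a body labeled `application/octet-stream` (likely an error payload) instead of JSON.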
