Commit 4f8dc00

[Monitor Query] Add batch metrics query integration (#31049)
This adds the `MetricsBatchQueryClient` for querying resource URIs in batches.

Signed-off-by: Paul Van Eck <paulvaneck@microsoft.com>

Parent: 08d0201

48 files changed: 4558 additions & 128 deletions


sdk/monitor/azure-monitor-query/CHANGELOG.md

Lines changed: 3 additions & 1 deletion

@@ -1,9 +1,11 @@
 # Release History

-## 1.2.1 (Unreleased)
+## 1.3.0b1 (Unreleased)

 ### Features Added

+- Added `MetricsBatchQueryClient` to support batch querying metrics from Azure resources. ([#31049](https://github.com/Azure/azure-sdk-for-python/pull/31049))
+
 ### Breaking Changes

 ### Bugs Fixed
sdk/monitor/azure-monitor-query/README.md

Lines changed: 39 additions & 1 deletion

@@ -37,7 +37,7 @@ pip install azure-monitor-query

 ### Create the client

-An authenticated client is required to query Logs or Metrics. The library includes both synchronous and asynchronous forms of the clients. To authenticate, create an instance of a token credential. Use that instance when creating a `LogsQueryClient` or `MetricsQueryClient`. The following examples use `DefaultAzureCredential` from the [azure-identity](https://pypi.org/project/azure-identity/) package.
+An authenticated client is required to query Logs or Metrics. The library includes both synchronous and asynchronous forms of the clients. To authenticate, create an instance of a token credential. Use that instance when creating a `LogsQueryClient`, `MetricsQueryClient`, or `MetricsBatchQueryClient`. The following examples use `DefaultAzureCredential` from the [azure-identity](https://pypi.org/project/azure-identity/) package.

 #### Synchronous clients

@@ -524,6 +524,43 @@ for metric in response.metrics:
         )
 ```

+### Metrics batch query
+
+A user can also query metrics from multiple resources at once using the `query_batch` method of `MetricsBatchQueryClient`. This uses a different API than the `MetricsQueryClient` and requires that a user pass in a regional endpoint when instantiating the client (for example, "https://westus3.metrics.monitor.azure.com").
+
+Note that each resource must be in the same region as the endpoint passed in when instantiating the client, and each resource must be in the same Azure subscription. Furthermore, the metric namespace that contains the metrics to be queried must also be passed. A list of metric namespaces can be found [here][metric_namespaces].
+
+```python
+from datetime import timedelta
+
+from azure.identity import DefaultAzureCredential
+from azure.monitor.query import MetricsBatchQueryClient, MetricAggregationType
+
+endpoint = "https://westus3.metrics.monitor.azure.com"
+credential = DefaultAzureCredential()
+client = MetricsBatchQueryClient(endpoint, credential)
+
+resource_uris = [
+    "/subscriptions/<id>/resourceGroups/<rg-name>/providers/<source>/storageAccounts/<resource-name-1>",
+    "/subscriptions/<id>/resourceGroups/<rg-name>/providers/<source>/storageAccounts/<resource-name-2>"
+]
+
+response = client.query_batch(
+    resource_uris,
+    metric_namespace="Microsoft.Storage/storageAccounts",
+    metric_names=["Ingress"],
+    timespan=timedelta(hours=2),
+    granularity=timedelta(minutes=5),
+    aggregations=[MetricAggregationType.AVERAGE],
+)
+
+for metrics_query_result in response:
+    print(metrics_query_result.timespan)
+```
+
 ## Troubleshooting

 See our [troubleshooting guide][troubleshooting_guide] for details on how to diagnose various failure scenarios.

@@ -568,6 +605,7 @@ This project has adopted the [Microsoft Open Source Code of Conduct][code_of_con
 [azure_subscription]: https://azure.microsoft.com/free/python/
 [changelog]: https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/monitor/azure-monitor-query/CHANGELOG.md
 [kusto_query_language]: https://learn.microsoft.com/azure/data-explorer/kusto/query/
+[metric_namespaces]: https://learn.microsoft.com/azure/azure-monitor/reference/supported-metrics/metrics-index#metrics-by-resource-provider
 [package]: https://aka.ms/azsdk-python-monitor-query-pypi
 [pip]: https://pypi.org/project/pip/
 [python_logging]: https://docs.python.org/3/library/logging.html

sdk/monitor/azure-monitor-query/assets.json

Lines changed: 1 addition & 1 deletion

@@ -2,5 +2,5 @@
   "AssetsRepo": "Azure/azure-sdk-assets",
   "AssetsRepoPrefixPath": "python",
   "TagPrefix": "python/monitor/azure-monitor-query",
-  "Tag": "python/monitor/azure-monitor-query_8b3197a327"
+  "Tag": "python/monitor/azure-monitor-query_70a241fcfc"
 }

sdk/monitor/azure-monitor-query/azure/monitor/query/__init__.py

Lines changed: 2 additions & 0 deletions

@@ -6,6 +6,7 @@

 from ._logs_query_client import LogsQueryClient
 from ._metrics_query_client import MetricsQueryClient
+from ._metrics_batch_query_client import MetricsBatchQueryClient

 from ._enums import (
     LogsQueryStatus,
@@ -45,6 +46,7 @@
     "LogsTableRow",
     "LogsBatchQuery",
     "MetricsQueryClient",
+    "MetricsBatchQueryClient",
     "MetricNamespace",
     "MetricNamespaceClassification",
     "MetricDefinition",
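The `__all__` addition matters as much as the import: only names listed in `__all__` are re-exported on a star-import of the package. A minimal stand-alone sketch of that mechanism, using a synthetic module rather than the real package:

```python
import sys
import types

# Build a throwaway module that mimics the package's __init__: it defines a
# client class, a private helper, and lists only the client in __all__.
mod = types.ModuleType("fake_query_pkg")
exec(
    "class MetricsBatchQueryClient:\n"
    "    pass\n"
    "_internal = 'not exported'\n"
    "__all__ = ['MetricsBatchQueryClient']\n",
    mod.__dict__,
)
sys.modules["fake_query_pkg"] = mod

# A star-import only picks up the names present in __all__.
ns = {}
exec("from fake_query_pkg import *", ns)
print("MetricsBatchQueryClient" in ns)  # True
print("_internal" in ns)                # False
```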

sdk/monitor/azure-monitor-query/azure/monitor/query/_generated/_serialization.py

Lines changed: 18 additions & 8 deletions

@@ -662,8 +662,9 @@ def _serialize(self, target_obj, data_type=None, **kwargs):
                         _serialized.update(_new_attr)  # type: ignore
                         _new_attr = _new_attr[k]  # type: ignore
                         _serialized = _serialized[k]
-            except ValueError:
-                continue
+            except ValueError as err:
+                if isinstance(err, SerializationError):
+                    raise

             except (AttributeError, KeyError, TypeError) as err:
                 msg = "Attribute {} in object {} cannot be serialized.\n{}".format(attr_name, class_name, str(target_obj))
@@ -741,6 +742,8 @@ def query(self, name, data, data_type, **kwargs):

         :param data: The data to be serialized.
         :param str data_type: The type to be serialized from.
+        :keyword bool skip_quote: Whether to skip quote the serialized result.
+            Defaults to False.
         :rtype: str
         :raises: TypeError if serialization fails.
         :raises: ValueError if data is None
@@ -749,10 +752,8 @@ def query(self, name, data, data_type, **kwargs):
         # Treat the list aside, since we don't want to encode the div separator
         if data_type.startswith("["):
             internal_data_type = data_type[1:-1]
-            data = [self.serialize_data(d, internal_data_type, **kwargs) if d is not None else "" for d in data]
-            if not kwargs.get("skip_quote", False):
-                data = [quote(str(d), safe="") for d in data]
-            return str(self.serialize_iter(data, internal_data_type, **kwargs))
+            do_quote = not kwargs.get("skip_quote", False)
+            return str(self.serialize_iter(data, internal_data_type, do_quote=do_quote, **kwargs))

         # Not a list, regular serialization
         output = self.serialize_data(data, data_type, **kwargs)
@@ -891,6 +892,8 @@ def serialize_iter(self, data, iter_type, div=None, **kwargs):
             not be None or empty.
         :param str div: If set, this str will be used to combine the elements
             in the iterable into a combined string. Default is 'None'.
+        :keyword bool do_quote: Whether to quote the serialized result of each iterable element.
+            Defaults to False.
         :rtype: list, str
         """
         if isinstance(data, str):
@@ -903,9 +906,14 @@ def serialize_iter(self, data, iter_type, div=None, **kwargs):
         for d in data:
             try:
                 serialized.append(self.serialize_data(d, iter_type, **kwargs))
-            except ValueError:
+            except ValueError as err:
+                if isinstance(err, SerializationError):
+                    raise
                 serialized.append(None)

+        if kwargs.get("do_quote", False):
+            serialized = ["" if s is None else quote(str(s), safe="") for s in serialized]
+
         if div:
             serialized = ["" if s is None else str(s) for s in serialized]
             serialized = div.join(serialized)
@@ -950,7 +958,9 @@ def serialize_dict(self, attr, dict_type, **kwargs):
         for key, value in attr.items():
             try:
                 serialized[self.serialize_unicode(key)] = self.serialize_data(value, dict_type, **kwargs)
-            except ValueError:
+            except ValueError as err:
+                if isinstance(err, SerializationError):
+                    raise
                 serialized[self.serialize_unicode(key)] = None

         if "xml" in serialization_ctxt:
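The `query`/`serialize_iter` change moves percent-encoding out of the caller and into `serialize_iter` itself, via a `do_quote` keyword: each element is quoted individually before the divider joins them, because the divider itself must stay unencoded. A stand-alone sketch of that ordering using only the standard library (`serialize_query_list` is a hypothetical stand-in, not the SDK's API):

```python
from urllib.parse import quote

def serialize_query_list(values, div=",", skip_quote=False):
    # Quote each element first, then join: the divider must never be
    # percent-encoded, so quoting cannot happen after the join.
    serialized = ["" if v is None else str(v) for v in values]
    if not skip_quote:
        serialized = [quote(s, safe="") for s in serialized]
    return div.join(serialized)

print(serialize_query_list(["a b", "c/d,e"]))          # a%20b,c%2Fd%2Ce
print(serialize_query_list(["a b"], skip_quote=True))  # a b
```

Note how an element containing the divider character ("c/d,e") is encoded to `c%2Fd%2Ce`, so the receiver can still split unambiguously on the unencoded divider.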

sdk/monitor/azure-monitor-query/azure/monitor/query/_generated/_vendor.py

Lines changed: 0 additions & 20 deletions
This file was deleted.

sdk/monitor/azure-monitor-query/azure/monitor/query/_generated/aio/operations/_operations.py

Lines changed: 24 additions & 6 deletions

@@ -7,6 +7,7 @@
 # Changes may cause incorrect behavior and will be lost if the code is regenerated.
 # --------------------------------------------------------------------------
 import datetime
+from io import IOBase
 import sys
 from typing import Any, Callable, Dict, IO, Optional, TypeVar, Union, cast, overload

@@ -19,8 +20,7 @@
     map_error,
 )
 from azure.core.pipeline import PipelineResponse
-from azure.core.pipeline.transport import AsyncHttpResponse
-from azure.core.rest import HttpRequest
+from azure.core.rest import AsyncHttpResponse, HttpRequest
 from azure.core.tracing.decorator_async import distributed_trace_async
 from azure.core.utils import case_insensitive_dict

@@ -165,6 +165,8 @@ async def get(
         response = pipeline_response.http_response

         if response.status_code not in [200]:
+            if _stream:
+                await response.read()  # Load the body in memory and close the socket
             map_error(status_code=response.status_code, response=response, error_map=error_map)
             raise HttpResponseError(response=response)

@@ -464,7 +466,7 @@ async def execute(
         content_type = content_type or "application/json"
         _json = None
         _content = None
-        if isinstance(body, (IO, bytes)):
+        if isinstance(body, (IOBase, bytes)):
             _content = body
         else:
             _json = body
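The `IO` → `IOBase` switch matters because the branch above is a runtime `isinstance` dispatch, and `io.IOBase` is the concrete base class that real file-like objects (including `io.BytesIO`) actually inherit from. A minimal sketch of the same dispatch (`prepare_body` is a hypothetical stand-in for the generated code, not the SDK's API):

```python
import io

def prepare_body(body):
    # Mirrors the generated code: file-like objects and raw bytes are sent
    # on the wire as-is (_content); anything else is serialized as JSON (_json).
    if isinstance(body, (io.IOBase, bytes)):
        return "content", body
    return "json", body

print(prepare_body(b"raw bytes")[0])            # content
print(prepare_body(io.BytesIO(b"payload"))[0])  # content
print(prepare_body({"interval": "PT5M"})[0])    # json
```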
@@ -488,6 +490,8 @@ async def execute(
         response = pipeline_response.http_response

         if response.status_code not in [200]:
+            if _stream:
+                await response.read()  # Load the body in memory and close the socket
             map_error(status_code=response.status_code, response=response, error_map=error_map)
             raise HttpResponseError(response=response)

@@ -605,6 +609,8 @@ async def resource_get(
         response = pipeline_response.http_response

         if response.status_code not in [200]:
+            if _stream:
+                await response.read()  # Load the body in memory and close the socket
             map_error(status_code=response.status_code, response=response, error_map=error_map)
             raise HttpResponseError(response=response)

@@ -901,7 +907,7 @@ async def resource_execute(
         content_type = content_type or "application/json"
         _json = None
         _content = None
-        if isinstance(body, (IO, bytes)):
+        if isinstance(body, (IOBase, bytes)):
             _content = body
         else:
             _json = body

@@ -925,6 +931,8 @@ async def resource_execute(
         response = pipeline_response.http_response

         if response.status_code not in [200]:
+            if _stream:
+                await response.read()  # Load the body in memory and close the socket
             map_error(status_code=response.status_code, response=response, error_map=error_map)
             raise HttpResponseError(response=response)

@@ -1287,7 +1295,7 @@ async def batch(self, body: Union[JSON, IO], **kwargs: Any) -> JSON:
         content_type = content_type or "application/json"
         _json = None
         _content = None
-        if isinstance(body, (IO, bytes)):
+        if isinstance(body, (IOBase, bytes)):
             _content = body
         else:
             _json = body

@@ -1309,6 +1317,8 @@ async def batch(self, body: Union[JSON, IO], **kwargs: Any) -> JSON:
         response = pipeline_response.http_response

         if response.status_code not in [200]:
+            if _stream:
+                await response.read()  # Load the body in memory and close the socket
             map_error(status_code=response.status_code, response=response, error_map=error_map)
             raise HttpResponseError(response=response)

@@ -1426,6 +1436,8 @@ async def resource_get_xms(
         response = pipeline_response.http_response

         if response.status_code not in [200]:
+            if _stream:
+                await response.read()  # Load the body in memory and close the socket
             map_error(status_code=response.status_code, response=response, error_map=error_map)
             raise HttpResponseError(response=response)

@@ -1722,7 +1734,7 @@ async def resource_execute_xms(
         content_type = content_type or "application/json"
         _json = None
         _content = None
-        if isinstance(body, (IO, bytes)):
+        if isinstance(body, (IOBase, bytes)):
             _content = body
         else:
             _json = body

@@ -1746,6 +1758,8 @@ async def resource_execute_xms(
         response = pipeline_response.http_response

         if response.status_code not in [200]:
+            if _stream:
+                await response.read()  # Load the body in memory and close the socket
             map_error(status_code=response.status_code, response=response, error_map=error_map)
             raise HttpResponseError(response=response)

@@ -2163,6 +2177,8 @@ async def get(self, workspace_id: str, **kwargs: Any) -> JSON:
         response = pipeline_response.http_response

         if response.status_code not in [200]:
+            if _stream:
+                await response.read()  # Load the body in memory and close the socket
             map_error(status_code=response.status_code, response=response, error_map=error_map)
             raise HttpResponseError(response=response)

@@ -2562,6 +2578,8 @@ async def post(self, workspace_id: str, **kwargs: Any) -> JSON:
         response = pipeline_response.http_response

         if response.status_code not in [200]:
+            if _stream:
+                await response.read()  # Load the body in memory and close the socket
             map_error(status_code=response.status_code, response=response, error_map=error_map)
             raise HttpResponseError(response=response)
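The repeated `if _stream: await response.read()` guard ensures that when a response was requested in streaming mode, its body is pulled into memory (releasing the underlying socket) before the error is raised, so the caller can still inspect the error payload. A minimal asyncio sketch of the pattern, with `FakeAsyncResponse` as a hypothetical stand-in for `azure.core.rest.AsyncHttpResponse`:

```python
import asyncio

class FakeAsyncResponse:
    # Hypothetical stand-in: a streamed response whose body must be read
    # explicitly before the connection can be closed.
    def __init__(self, status_code, body=b""):
        self.status_code = status_code
        self._body = body
        self.body_loaded = False

    async def read(self):
        self.body_loaded = True
        return self._body

async def check_response(response, _stream):
    # Mirrors the generated error path: load a streamed body before raising,
    # so the error payload is available and the socket is released.
    if response.status_code not in [200]:
        if _stream:
            await response.read()  # Load the body in memory and close the socket
        raise RuntimeError(f"HTTP {response.status_code}")
    return response

resp = FakeAsyncResponse(500, b"server error detail")
try:
    asyncio.run(check_response(resp, _stream=True))
except RuntimeError as exc:
    print(exc)              # HTTP 500
print(resp.body_loaded)     # True
```

On the success path the body is left unread, preserving the caller's ability to stream it.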

sdk/monitor/azure-monitor-query/azure/monitor/query/_generated/metrics/_serialization.py

Lines changed: 18 additions & 8 deletions

The changes to this file are identical to those shown above for `_generated/_serialization.py`: the `except ValueError` handlers in `_serialize`, `serialize_iter`, and `serialize_dict` now re-raise when the error is a `SerializationError`; `query` delegates quoting to `serialize_iter` via a new `do_quote` keyword instead of quoting inline; and the `skip_quote` and `do_quote` keywords are documented in the respective docstrings.
