Merge pull request #211 from tiqi-group/feat/add_logging_config_helper

Feat: add logging config helper
Mose Müller 2025-03-27 11:47:08 +01:00 committed by GitHub
commit d1feff1a6a
5 changed files with 140 additions and 64 deletions

View File

@ -226,45 +226,15 @@ For details, please see [here](https://pydase.readthedocs.io/en/stable/user-guid
## Logging in pydase
The `pydase` library provides structured, per-module logging with support for log level configuration, rich formatting, and optional client identification in logs.
To configure logging in your own service, you can use:
```python
from pydase.utils.logging import configure_logging_with_pydase_formatter
```
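For example, a minimal setup in your service code might look like the following sketch (the `my_service` logger name is a placeholder, not something `pydase` defines):
```python
import logging

from pydase.utils.logging import configure_logging_with_pydase_formatter

# Format this service's logs the same way pydase formats its own
configure_logging_with_pydase_formatter(name="my_service", level=logging.INFO)

logger = logging.getLogger("my_service")
logger.info("Service starting up")
```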
**Note**: It is recommended to avoid calling the `pydase.utils.logging.setup_logging` function directly, as this may result in duplicated logging messages.
For more information, see the [full guide](https://pydase.readthedocs.io/en/stable/user-guide/Logging/).
## Documentation

View File

@ -47,6 +47,9 @@
options:
filters: ["!render_in_frontend"]
::: pydase.utils.logging
    handler: python
::: pydase.units
    handler: python

View File

@ -1,44 +1,46 @@
# Logging in pydase
The `pydase` library organizes its loggers per module, mirroring the Python package hierarchy. This structured approach allows for granular control over logging levels and behaviour across different parts of the library. Logs can also include details about client identification based on headers sent by the client or proxy, providing additional context for debugging or auditing.
## Changing the pydase Log Level
You have two primary ways to adjust the log levels in `pydase`:
1. **Directly targeting `pydase` loggers**
    You can set the log level for any `pydase` logger directly in your code. This method is useful for fine-tuning logging levels for specific modules within `pydase`. For instance, if you want to change the log level of the main `pydase` logger or target a submodule like `pydase.data_service`, you can do so as follows:

    ```python
    # <your_script.py>
    import logging

    # Set the log level for the main pydase logger
    logging.getLogger("pydase").setLevel(logging.INFO)

    # Optionally, target a specific submodule logger
    # logging.getLogger("pydase.data_service").setLevel(logging.DEBUG)

    # Your logger for the current script
    from pydase.utils.logging import configure_logging_with_pydase_formatter

    configure_logging_with_pydase_formatter(level=logging.DEBUG)
    logger = logging.getLogger(__name__)
    logger.debug("My debug message.")
    ```

    This approach allows for specific control over different parts of the `pydase` library, depending on your logging needs.
2. **Using the `ENVIRONMENT` environment variable**
    For a more global setting that affects the entire `pydase` library, you can utilize the `ENVIRONMENT` environment variable. Setting this variable to `"production"` will configure all `pydase` loggers to only log messages of level `"INFO"` and above, filtering out more verbose logging. This is particularly useful for production environments where excessive logging can be overwhelming or unnecessary.

    ```bash
    ENVIRONMENT="production" python -m <module_using_pydase>
    ```

    In the absence of this setting, the default behavior is to log everything of level `"DEBUG"` and above, suitable for development environments where more detailed logs are beneficial.
## Client Identification in pydase Logs
The logging system in `pydase` includes information about clients based on headers sent by the client or a proxy. The priority for identifying the client is fixed and as follows:
@ -53,3 +55,37 @@ For example, a log entry might include the following details based on the avai
2025-01-20 06:48:13.710 | INFO | pydase.server.web_server.api.v1.application:_get_value:36 - Client [user=Max Muster] is getting the value of 'property_attr'
```
## Configuring Logging in Services
To configure logging in services built with `pydase`, use the helper function [`configure_logging_with_pydase_formatter`][pydase.utils.logging.configure_logging_with_pydase_formatter]. This function sets up a logger with the same formatting used internally by `pydase`, so your service logs match the style and structure of `pydase` logs.
### Example
If your service follows a typical layout like:
```text
└── src
└── my_service
├── __init__.py
└── ...
```
you should call `configure_logging_with_pydase_formatter` inside `src/my_service/__init__.py`. This ensures the logger is configured as soon as your service is imported, and before any log messages are emitted.
```python title="src/my_service/__init__.py"
import logging
import sys

from pydase.utils.logging import configure_logging_with_pydase_formatter

configure_logging_with_pydase_formatter(
    name="my_service",  # Use the package/module name or None for the root logger
    level=logging.DEBUG,  # Set the desired logging level (defaults to INFO)
    stream=sys.stderr,  # Optional: set the output stream (stderr by default)
)
```
### Notes
- If you pass `name=None`, the root logger will be configured. This affects **all logs** that propagate to the root logger.
- Passing a specific `name` like `"my_service"` allows you to scope the configuration to your service only, which is safer in multi-library environments (see the sketch after this list).
- You can use `sys.stdout` instead of `sys.stderr` if your logs are being captured or processed differently (e.g., in containers or logging systems).
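As a rough illustration of the scoping behaviour described in the notes above, here is a sketch (the `my_service` and `some_other_library` names are placeholders):
```python
import logging

from pydase.utils.logging import configure_logging_with_pydase_formatter

# Configure only the "my_service" logger hierarchy with the pydase formatter
configure_logging_with_pydase_formatter(name="my_service", level=logging.INFO)

# Child loggers such as "my_service.motors" propagate to the configured logger
# and therefore use the pydase-style output ...
logging.getLogger("my_service.motors").info("formatted like pydase logs")

# ... while loggers outside that hierarchy are untouched and fall back to the
# root logger's (unconfigured) handling.
logging.getLogger("some_other_library").warning("not affected by the helper")
```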

View File

@ -4,7 +4,7 @@ import logging.config
import sys
from collections.abc import Callable
from copy import copy
from typing import ClassVar, Literal
from typing import ClassVar, Literal, TextIO
import click
import socketio # type: ignore[import-untyped]
@ -189,3 +189,51 @@ def setup_logging() -> None:
logger.debug("Configuring pydase logging.")
logging.config.dictConfig(LOGGING_CONFIG)
def configure_logging_with_pydase_formatter(
    name: str | None = None, level: int = logging.INFO, stream: TextIO | None = None
) -> None:
    """Configure a logger with the pydase `DefaultFormatter`.

    This sets up a `StreamHandler` with the custom `DefaultFormatter`, which includes
    timestamp, log level with color (if supported), logger name, function, and line
    number. It can be used to configure the root logger or any named logger.

    Args:
        name: The name of the logger to configure. If None, the root logger is used.
        level: The logging level to set on the logger (e.g., logging.DEBUG,
            logging.INFO). Defaults to logging.INFO.
        stream: The output stream for the log messages (e.g., sys.stdout or sys.stderr).
            If None, defaults to sys.stderr.

    Example:
        Configure logging in your service:

        ```python
        import logging
        import sys

        from pydase.utils.logging import configure_logging_with_pydase_formatter

        configure_logging_with_pydase_formatter(
            name="my_service",  # Use the package/module name or None for the root logger
            level=logging.DEBUG,  # Set the desired logging level (defaults to INFO)
            stream=sys.stdout,  # Set the output stream (stderr by default)
        )
        ```

    Notes:
        - This function adds a new handler each time it's called.
          Use carefully to avoid duplicate logs.
        - Colors are enabled if the stream supports TTY (e.g., in terminal).
    """  # noqa: E501
    logger = logging.getLogger(name=name)
    handler = logging.StreamHandler(stream=stream)
    formatter = DefaultFormatter(
        fmt="%(asctime)s.%(msecs)03d | %(levelprefix)s | "
        "%(name)s:%(funcName)s:%(lineno)d - %(message)s",
        datefmt="%Y-%m-%d %H:%M:%S",
    )
    handler.setFormatter(formatter)
    logger.addHandler(handler)
    logger.setLevel(level)
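Because the docstring notes that every call adds another handler, a call site that guards against duplicate handlers might look like this sketch (the `my_service` name is a placeholder):
```python
import logging

from pydase.utils.logging import configure_logging_with_pydase_formatter

logger = logging.getLogger("my_service")
# Only attach the pydase-formatted StreamHandler once, even if this module
# happens to be imported or executed multiple times.
if not logger.handlers:
    configure_logging_with_pydase_formatter(name="my_service", level=logging.INFO)
```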

View File

@ -1,9 +1,10 @@
import logging
from pytest import LogCaptureFixture
import pytest
from pydase.utils.logging import configure_logging_with_pydase_formatter
def test_log_error(caplog: LogCaptureFixture):
def test_log_error(caplog: pytest.LogCaptureFixture) -> None:
logger = logging.getLogger("pydase")
logger.setLevel(logging.ERROR)
@ -20,7 +21,7 @@ def test_log_error(caplog: LogCaptureFixture):
assert any(record.levelname == "ERROR" for record in caplog.records)
def test_log_warning(caplog: LogCaptureFixture):
def test_log_warning(caplog: pytest.LogCaptureFixture) -> None:
logger = logging.getLogger("pydase")
logger.setLevel(logging.WARNING)
@ -37,7 +38,7 @@ def test_log_warning(caplog: LogCaptureFixture):
assert any(record.levelname == "ERROR" for record in caplog.records)
def test_log_debug(caplog: LogCaptureFixture):
def test_log_debug(caplog: pytest.LogCaptureFixture) -> None:
logger = logging.getLogger("pydase")
logger.setLevel(logging.DEBUG)
@ -53,7 +54,7 @@ def test_log_debug(caplog: LogCaptureFixture):
assert "This is an error message" in caplog.text
def test_log_info(caplog: LogCaptureFixture):
def test_log_info(caplog: pytest.LogCaptureFixture) -> None:
logger = logging.getLogger("pydase")
logger.setLevel(logging.INFO)
@ -67,3 +68,21 @@ def test_log_info(caplog: LogCaptureFixture):
assert "This is an info message" in caplog.text
assert "This is a warning message" in caplog.text
assert "This is an error message" in caplog.text
def test_before_configuring_root_logger(caplog: pytest.LogCaptureFixture) -> None:
    logger = logging.getLogger(__name__)
    logger.info("Hello world")

    assert "Hello world" not in caplog.text


def test_configure_root_logger(caplog: pytest.LogCaptureFixture) -> None:
    configure_logging_with_pydase_formatter()
    logger = logging.getLogger(__name__)
    logger.info("Hello world")

    assert (
        "INFO tests.utils.test_logging:test_logging.py:83 Hello world"
        in caplog.text
    )