Releases: getsentry/sentry-python

1.22.1

05 May 13:55

Various fixes & improvements

  • Fix: Handle a list of keys (not just a single key) in Django cache spans (#2082) by @antonpirker

1.22.0

05 May 12:03
917ef8f

Various fixes & improvements

  • Add cache.hit and cache.item_size to Django (#2057) by @antonpirker

    Note: This will add spans for all requests to the caches configured in Django. This will probably add some overhead to your server and also add multiple spans to your performance waterfall diagrams. If you do not want this, you can disable this feature in the DjangoIntegration:

    sentry_sdk.init(
        dsn="...",
        integrations=[
            DjangoIntegration(cache_spans=False),
        ]
    )
  • Use http.method instead of method (#2054) by @AbhiPrasad

  • Handle non-int exc.status_code in Starlette (#2075) by @sentrivana

  • Handle SQLAlchemy engine.name being bytes (#2074) by @sentrivana

  • Fix KeyError in capture_checkin if SDK is not initialized (#2073) by @antonpirker

  • Use functools.wraps for ThreadingIntegration patches to fix attributes (#2080) by @EpicWink

  • Pin urllib3 to <2.0.0 for now (#2069) by @sl0thentr0py

1.21.1

28 Apr 19:29

Various fixes & improvements

1.21.0

25 Apr 13:50
1aa5788

Various fixes & improvements

  • Better handling of redis span/breadcrumb data (#2033) by @antonpirker

    Note: With this release we will limit the description of redis db spans and the data in breadcrumbs representing redis db operations to 1024 characters.

    This can lead to truncated data. If you do not want this, there is a new parameter max_data_size in RedisIntegration. You can set it to None to disable trimming.

    Example for disabling trimming of redis commands in spans or breadcrumbs:

    sentry_sdk.init(
      integrations=[
        RedisIntegration(max_data_size=None),
      ]
    )

    Example for custom trim size of redis commands in spans or breadcrumbs:

    sentry_sdk.init(
      integrations=[
        RedisIntegration(max_data_size=50),
      ]
    )
  • Add db.system to redis and SQLAlchemy db spans (#2037, #2038, #2039) by @AbhiPrasad

  • Upgraded linting tooling (#2026) by @antonpirker

  • Made code more resilient. (#2031) by @antonpirker

1.20.0

19 Apr 11:14
f3a5b8d

Various fixes & improvements

  • Send all events to /envelope endpoint when tracing is enabled (#2009) by @antonpirker

    Note: If you're self-hosting Sentry 9, you need to stay on the previous version of the SDK or update your self-hosted Sentry to at least version 20.6.0.

  • Profiling: Remove profile context from SDK (#2013) by @Zylphrex

  • Profiling: Additional performance improvements to the profiler (#1991) by @Zylphrex

  • Fix: Celery Beat monitoring without restarting the Beat process (#2001) by @antonpirker

  • Fix: Using the Codecov uploader instead of deprecated python package (#2011) by @antonpirker

  • Fix: Support for Quart (#2003) by @antonpirker

1.19.1

05 Apr 15:43
eb37f64

Various fixes & improvements

1.19.0

04 Apr 11:48
fe941eb

Various fixes & improvements

  • New: Celery Beat auto monitoring (#1967) by @antonpirker

    The CeleryIntegration can now also monitor your Celery Beat scheduled tasks automatically using the new Crons feature of Sentry.

    To learn more see our Celery Beat Auto Discovery documentation.

    Usage:

    from celery import Celery, signals
    from celery.schedules import crontab
    
    import sentry_sdk
    from sentry_sdk.integrations.celery import CeleryIntegration
    
    
    app = Celery('tasks', broker='...')
    app.conf.beat_schedule = {
        'set-in-beat-schedule': {
            'task': 'tasks.some_important_task',
            'schedule': crontab(...),
        },
    }
    
    
    @signals.celeryd_init.connect
    def init_sentry(**kwargs):
        sentry_sdk.init(
            dsn='...',
            integrations=[CeleryIntegration(monitor_beat_tasks=True)],  # πŸ‘ˆ here
            environment="local.dev.grace",
            release="v1.0",
        )

    This will auto-detect all scheduled tasks in your beat_schedule and monitor them with Sentry Crons.

  • New: gRPC integration (#1911) by @hossein-raeisi

    The gRPC integration instruments all incoming requests and all outgoing unary-unary and unary-stream gRPC requests made using grpcio channels.

    To learn more see our gRPC Integration documentation.

    On the server:

    import grpc
    from sentry_sdk.integrations.grpc.server import ServerInterceptor
    
    
    server = grpc.server(
        thread_pool=...,
        interceptors=[ServerInterceptor()],
    )

    On the client:

    import grpc
    from sentry_sdk.integrations.grpc.client import ClientInterceptor
    
    
    with grpc.insecure_channel("example.com:12345") as channel:
        channel = grpc.intercept_channel(channel, *[ClientInterceptor()])
  • New: socket integration (#1911) by @hossein-raeisi

    Use this integration to create spans for DNS resolution (socket.getaddrinfo()) and connection creation (socket.create_connection()).

    To learn more see our Socket Integration documentation.

    Usage:

    import sentry_sdk
    from sentry_sdk.integrations.socket import SocketIntegration
    sentry_sdk.init(
        dsn="___PUBLIC_DSN___",
        integrations=[
            SocketIntegration(),
        ],
    )
  • Fix: Do not trim span descriptions. (#1983) by @antonpirker

1.18.0

28 Mar 12:18
fefb454

Various fixes & improvements

  • New: Implement EventScrubber (#1943) by @sl0thentr0py

    To learn more see our Scrubbing Sensitive Data documentation.

    Add a new EventScrubber class that scrubs certain potentially sensitive interfaces with a DEFAULT_DENYLIST. The default scrubber is automatically run if send_default_pii = False:

    import sentry_sdk
    from sentry_sdk.scrubber import EventScrubber
    sentry_sdk.init(
        # ...
        send_default_pii=False,
        event_scrubber=EventScrubber(),  # this is set by default
    )

    You can also pass in a custom denylist to the EventScrubber class and filter additional fields that you want.

    from sentry_sdk.scrubber import EventScrubber, DEFAULT_DENYLIST
    # custom denylist
    denylist = DEFAULT_DENYLIST + ["my_sensitive_var"]
    sentry_sdk.init(
        # ...
        send_default_pii=False,
        event_scrubber=EventScrubber(denylist=denylist),
    )
  • New: Added new functions_to_trace option for central way of performance instrumentation (#1960) by @antonpirker

    To learn more see our Tracing Options documentation.

    An optional list of functions that should be set up for performance monitoring. For each function in the list, a span will be created when the function is executed.

    functions_to_trace = [
        {"qualified_name": "tests.test_basics._hello_world_counter"},
        {"qualified_name": "time.sleep"},
        {"qualified_name": "collections.Counter.most_common"},
    ]
    
    sentry_sdk.init(
        # ...
        traces_sample_rate=1.0,
        functions_to_trace=functions_to_trace,
    )
  • Updated denylist to include other widely used cookies/headers (#1972) by @antonpirker

  • Forward all sentry- baggage items (#1970) by @cleptric

  • Update OSS licensing (#1973) by @antonpirker

  • Profiling: Handle non frame types in profiler (#1965) by @Zylphrex

  • Tests: Bad arq dependency in tests (#1966) by @Zylphrex

  • Better naming (#1962) by @antonpirker

1.17.0

16 Mar 15:53
d65cc68

Various fixes & improvements

  • New: Monitor Celery Beat tasks with Sentry Cron Monitoring.

    With this feature you can make sure that your Celery Beat tasks run at the right time and see whether they were successful or not.

    Warning
    Cron Monitoring is currently in beta. Beta features are still in-progress and may have bugs. We recognize the irony.
    If you have any questions or feedback, please email us at crons-feedback@sentry.io, reach out via Discord (#cronjobs), or open an issue.

    Usage:

    # File: tasks.py
    
    from celery import Celery, signals
    from celery.schedules import crontab
    
    import sentry_sdk
    from sentry_sdk.crons import monitor
    from sentry_sdk.integrations.celery import CeleryIntegration
    
    
    # 1. Setup your Celery beat configuration
    
    app = Celery('mytasks', broker='redis://localhost:6379/0')
    app.conf.beat_schedule = {
        'set-in-beat-schedule': {
            'task': 'tasks.tell_the_world',
            'schedule': crontab(hour='10', minute='15'),
            'args': ("in beat_schedule set", ),
        },
    }
    
    
    # 2. Initialize Sentry either in `celeryd_init` or `beat_init` signal.
    
    #@signals.celeryd_init.connect
    @signals.beat_init.connect
    def init_sentry(**kwargs):
        sentry_sdk.init(
            dsn='...',
            integrations=[CeleryIntegration()],
            environment="local.dev.grace",
            release="v1.0.7-a1",
        )
    
    
    # 3. Link your Celery task to a Sentry Cron Monitor
    
    @app.task
    @monitor(monitor_slug='3b861d62-ff82-4aa0-9cd6-b2b6403bd0cf')
    def tell_the_world(msg):
        print(msg)
  • New: Add decorator for Sentry tracing (#1089) by @ynouri

    This allows you to use a decorator to set up custom performance instrumentation.

    To learn more see Custom Instrumentation.

    Usage: Just add the new decorator to your function, and a span will be created for it:

    import sentry_sdk
    
    @sentry_sdk.trace
    def my_complex_function():
      # do stuff
      ...
  • Make Django signals tracing optional (#1929) by @antonpirker

    See the Django Guide to learn more.
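
    For example, a minimal sketch of turning the signal spans off; the signals_spans parameter name is an assumption here and is not confirmed by these notes:

    import sentry_sdk
    from sentry_sdk.integrations.django import DjangoIntegration

    sentry_sdk.init(
        dsn="...",
        integrations=[
            # signals_spans is an assumed parameter name for disabling signal spans
            DjangoIntegration(signals_spans=False),
        ],
    )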

  • Deprecated with_locals in favor of include_local_variables (#1924) by @antonpirker
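
    For example, a minimal sketch of opting out of local variable capture using the new option name:

    import sentry_sdk

    sentry_sdk.init(
        dsn="...",
        include_local_variables=False,  # replaces the deprecated with_locals=False
    )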

  • Added top level API to get current span (#1954) by @antonpirker
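
    A minimal sketch of reading the current span, assuming the new helper is exposed as sentry_sdk.get_current_span():

    import sentry_sdk

    span = sentry_sdk.get_current_span()  # assumed top-level helper; may return None outside of a span
    if span is not None:
        span.set_data("items_processed", 42)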

  • Profiling: Add profiler options to init (#1947) by @Zylphrex

  • Profiling: Set active thread id for quart (#1830) by @Zylphrex

  • Fix: Update get_json function call for werkzeug 2.1.0+ (#1939) by @michielderoos

  • Fix: Returning the task's result. (#1931) by @antonpirker

  • Fix: Rename MYPY to TYPE_CHECKING (#1934) by @untitaker

  • Fix: Fix type annotation for ignore_errors in sentry_sdk.init() (#1928) by @tiangolo

  • Tests: Start a real http server instead of mocking libs (#1938) by @antonpirker

1.16.0

27 Feb 10:59
c3ce15d

Various fixes & improvements

  • New: Add arq Integration (#1872) by @Zhenay

    This integration creates performance spans when arq jobs are enqueued and when they are run.
    It also captures errors in jobs and links them to the performance spans.

    Usage:

    import asyncio
    
    from httpx import AsyncClient
    from arq import create_pool
    from arq.connections import RedisSettings
    
    import sentry_sdk
    from sentry_sdk.integrations.arq import ArqIntegration
    from sentry_sdk.tracing import TRANSACTION_SOURCE_COMPONENT
    
    sentry_sdk.init(
        dsn="...",
        integrations=[ArqIntegration()],
    )
    
    async def download_content(ctx, url):
        session: AsyncClient = ctx['session']
        response = await session.get(url)
        print(f'{url}: {response.text:.80}...')
        return len(response.text)
    
    async def startup(ctx):
        ctx['session'] = AsyncClient()
    
    async def shutdown(ctx):
        await ctx['session'].aclose()
    
    async def main():
        with sentry_sdk.start_transaction(name="testing_arq_tasks", source=TRANSACTION_SOURCE_COMPONENT):
            redis = await create_pool(RedisSettings())
            for url in ('https://facebook.com', 'https://microsoft.com', 'https://github.com', "asdf"):
                await redis.enqueue_job('download_content', url)
    
    class WorkerSettings:
        functions = [download_content]
        on_startup = startup
        on_shutdown = shutdown
    
    if __name__ == '__main__':
        asyncio.run(main())
  • Update of Falcon Integration (#1733) by @bartolootrit

  • Adding Cloud Resource Context integration (#1882) by @antonpirker

  • Profiling: Use the transaction timestamps to anchor the profile (#1898) by @Zylphrex

  • Profiling: Add debug logs to profiling (#1883) by @Zylphrex

  • Profiling: Start profiler thread lazily (#1903) by @Zylphrex

  • Fixed checks for structured http data (#1905) by @antonpirker

  • Make set_measurement public API and remove experimental status (#1909) by @sl0thentr0py
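
    For example, a sketch of recording a custom measurement on the current transaction, assuming the public signature is set_measurement(name, value, unit):

    import sentry_sdk

    with sentry_sdk.start_transaction(name="checkout"):
        # "memory_used" and the "byte" unit are illustrative values
        sentry_sdk.set_measurement("memory_used", 123, "byte")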

  • Add trace_propagation_targets option (#1916) by @antonpirker
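
    A minimal sketch of the new option, which controls which outgoing requests get tracing headers attached; the target list below is purely illustrative:

    import sentry_sdk

    sentry_sdk.init(
        dsn="...",
        traces_sample_rate=1.0,
        trace_propagation_targets=["https://api.example.com"],  # illustrative target
    )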

  • Add enable_tracing option to default traces_sample_rate to 1.0 (#1900) by @sl0thentr0py
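
    For example, a sketch of turning tracing on with the new flag instead of setting a sample rate explicitly:

    import sentry_sdk

    sentry_sdk.init(
        dsn="...",
        enable_tracing=True,  # defaults traces_sample_rate to 1.0
    )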

  • Remove deprecated tracestate (#1907) by @sl0thentr0py

  • Sanitize URLs in Span description and breadcrumbs (#1876) by @antonpirker

  • Mechanism should default to true unless set explicitly (#1889) by @sl0thentr0py

  • Better setting of in-app in stack frames (#1894) by @antonpirker

  • Add workflow to test gevent (#1870) by @Zylphrex

  • Updated outdated HTTPX test matrix (#1917) by @antonpirker

  • Switch to MIT license (#1908) by @cleptric