Add verbose logging option to ProviderDataIngester #4068

Draft: wants to merge 12 commits into main

Conversation

Contributor

@nicoepp nicoepp commented Apr 8, 2024

To enable verbose logging for a certain DAG, set SHOULD_VERBOSE_LOG to a list containing that DAG id.

For example, set the SHOULD_VERBOSE_LOG Airflow variable to: ["smk_workflow"]
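
For context, a minimal sketch of how an ingester could read this variable and gate its logging (the helper below is illustrative; everything other than the SHOULD_VERBOSE_LOG variable name is an assumption, not the code in this PR):

from airflow.models import Variable


def verbose_logging_enabled(dag_id: str) -> bool:
    # SHOULD_VERBOSE_LOG is expected to hold a JSON list of DAG ids,
    # e.g. ["smk_workflow"]; if the variable is unset, verbose logging is off.
    enabled_dag_ids = Variable.get(
        "SHOULD_VERBOSE_LOG", default_var=[], deserialize_json=True
    )
    return dag_id in enabled_dag_ids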

Fixes

Fixes #1420 by @AetherUnbound

Description

Testing Instructions

Checklist

  • My pull request has a descriptive title (not a vague title like Update index.md).
  • My pull request targets the default branch of the repository (main) or a parent feature branch.
  • My commit messages follow best practices.
  • My code follows the established code style of the repository.
  • I added or updated tests for the changes I made (if applicable).
  • I added or updated documentation (if applicable).
  • I tried running the project locally and verified that there are no visible errors.
  • I ran the DAG documentation generator (if applicable).

Developer Certificate of Origin

Developer Certificate of Origin
Version 1.1

Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
1 Letterman Drive
Suite D4700
San Francisco, CA, 94129

Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.


Developer's Certificate of Origin 1.1

By making a contribution to this project, I certify that:

(a) The contribution was created in whole or in part by me and I
    have the right to submit it under the open source license
    indicated in the file; or

(b) The contribution is based upon previous work that, to the best
    of my knowledge, is covered under an appropriate open source
    license and I have the right under that license to submit that
    work with modifications, whether created in whole or in part
    by me, under the same open source license (unless I am
    permitted to submit under a different license), as indicated
    in the file; or

(c) The contribution was provided directly to me by some other
    person who certified (a), (b) or (c) and I have not modified
    it.

(d) I understand and agree that this project and the contribution
    are public and that a record of the contribution (including all
    personal information I submit with it, including my sign-off) is
    maintained indefinitely and may be redistributed consistent with
    this project or the open source license(s) involved.

To log for a certain DAG set SHOULD_VERBOSE_LOG to a list containing that dag id
e.g. ["smk_workflow"]
@nicoepp nicoepp requested a review from a team as a code owner April 8, 2024 21:54
@nicoepp nicoepp requested review from krysal and stacimc April 8, 2024 21:54
@openverse-bot openverse-bot added 🚦 status: awaiting triage Has not been triaged & therefore, not ready for work 🟩 priority: low Low priority and doesn't need to be rushed ✨ goal: improvement Improvement to an existing user-facing feature 🤖 aspect: dx Concerns developers' experience with the codebase 🏷 status: label work required Needs proper labelling before it can be worked on labels Apr 8, 2024
@AetherUnbound AetherUnbound added 🧱 stack: catalog Related to the catalog and Airflow DAGs and removed 🏷 status: label work required Needs proper labelling before it can be worked on 🚦 status: awaiting triage Has not been triaged & therefore, not ready for work labels Apr 8, 2024
Contributor

@sarayourfriend sarayourfriend left a comment

Thanks for the PR @nicoepp! I've left two requested changes to align with #4053, which would have conflicts with this branch.

Also: can you run linting locally? The CI linting check failed. You can run it with the just lint recipe from the root of the repository.

Thanks again 🙂

nicoepp and others added 3 commits April 9, 2024 07:58
Co-authored-by: sarayourfriend <24264157+sarayourfriend@users.noreply.github.com>
Co-authored-by: sarayourfriend <24264157+sarayourfriend@users.noreply.github.com>
@nicoepp
Contributor Author

nicoepp commented Apr 9, 2024

Thanks for your feedback @sarayourfriend! I pushed some commits and I'm running the linter locally, but I have to go now. I'll hopefully push the linting changes tonight.

Contributor

@stacimc stacimc left a comment

Thanks very much @nicoepp! I've left a few suggestions to clarify the (admittedly confusing 😄) distinction between terms like "record"/"batch item" in the logs. In addition, I think we should add at least one more verbose log to log the full result of get_batch_data.

Since that is quite the verbose log, I've also asked that we not enable this logging by default locally, treating this more as an optional development tool. (CC @AetherUnbound to give you a chance to disagree with me if that's not what you had in mind! 😄)

@nicoepp
Contributor Author

nicoepp commented Apr 10, 2024

I'm playing around with the idea of logging the full result of get_batch_data, as suggested by @stacimc, but I'm running into problems logging the full list of dictionaries in the returned batch. The Chrome tab where I start the DAG in Airflow and try to view the log becomes unresponsive and then crashes, since the log gets very, very big!

I modified my code now to just log the first 5 and the last 5 dictionaries in each list returned by get_batch_data. See attached screenshot. Would that work for you guys?

[screenshot: verbose log output showing the truncated batch]
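
For illustration, the truncation described above could look roughly like this (the helper name is hypothetical, not the code in this branch):

def truncate_batch_for_logging(batch: list[dict], n: int = 5) -> list[dict]:
    # Keep only the first and last n records so the Airflow log stays small
    # enough for the browser to render.
    if len(batch) <= 2 * n:
        return batch
    return batch[:n] + batch[-n:]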

@stacimc
Contributor

stacimc commented Apr 10, 2024

I modified my code now to just log the first 5 and the last 5 dictionaries in each list returned by get_batch_data. See attached screenshot. Would that work for you guys?

Absolutely perfect, even just the first five is probably fine for our needs!

@nicoepp
Contributor Author

nicoepp commented Apr 10, 2024

Here is the full log file (from the screenshot) for better reference:
dag_id=smk_workflow_run_id=manual__2024-04-10T20_03_28.log

@nicoepp
Contributor Author

nicoepp commented Apr 10, 2024

Ok. Thanks for the feedback. I'm only logging the first 5 dictionaries of the batch for now.

I also pushed a commit that makes the verbose logging controlled solely by the Airflow variable; the "ENVIRONMENT" variable is no longer checked when deciding whether to log.

The variable value has to look like this:
[screenshot: the SHOULD_VERBOSE_LOG Airflow variable set to ["smk_workflow"]]
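
A rough sketch of what this gating might look like on the ingester (the class and attribute names below are stand-ins for illustration, not the exact code in this branch):

import logging

from airflow.models import Variable

logger = logging.getLogger(__name__)


class IngesterLoggingSketch:
    # Stand-in for ProviderDataIngester, showing only the verbose-log gate.
    def __init__(self, dag_id: str):
        self.dag_id = dag_id

    def _verbose_log(self, msg: str, data: list | None = None) -> None:
        # Logging is controlled solely by the SHOULD_VERBOSE_LOG Airflow
        # variable (a JSON list of DAG ids); ENVIRONMENT is not consulted.
        enabled_dag_ids = Variable.get(
            "SHOULD_VERBOSE_LOG", default_var=[], deserialize_json=True
        )
        if self.dag_id not in enabled_dag_ids:
            return
        logger.info(msg)
        if data:
            for item in data[:5]:
                logger.info(f"\t{item}")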

Contributor

@stacimc stacimc left a comment

This looks great! Just a few very small comments and then we should ship this 💯

@@ -401,6 +407,7 @@ def get_batch(self, query_params: dict) -> tuple[list | None, bool]:

        # Build a list of records from the response
        batch = self.get_batch_data(response_json)
        self._verbose_log("Batch data:", batch)
Contributor

Suggested change
-        self._verbose_log("Batch data:", batch)
+        self._verbose_log(
+            f"Got batch with {len(batch)} items. The first items are:",
+            batch[:5] if batch else None
+        )

Can we update this message with the actual batch len, to make sure that it's not misleading that only 5 are actually logged?

We could also do the list slicing here, and then have _verbose_log log all the items in data to make it more flexible for future use cases.

Contributor Author

I'm running into several problems changing the code as suggested:

It seems like batch is not always a list; it can sometimes be a dict, which makes the tests fail since a dict cannot be sliced. This wasn't a problem when the slicing happened inside the _verbose_log method, because we first check whether the Variable is set and the tests never activate verbose logging. See the following screenshot showing the test that mocks a dict batch instead of a list batch.

[screenshot: test mocking get_batch_data to return a dict rather than a list]

Also, len(batch) fails since batch can be None. So code that could handle all this would look something like this:

    # ...

    # Build a list of records from the response
    batch = self.get_batch_data(response_json)

    batch_to_log = batch if batch and not isinstance(batch, dict) else []
    self._verbose_log(
        f"Got batch with {len(batch_to_log)} items. The first items are:",
        batch_to_log[:5]
    )

    # Optionally, apply some logic to the response to determine whether
    # ingestion should continue or if should be short-circuited. By default
    # this will return True and ingestion continues.
    should_continue = self.get_should_continue(response_json)

    return batch, should_continue

This code isn't the nicest looking IMO. Is this what we want? Ideas on how to improve it?

Contributor Author

@stacimc I pushed a commit now that tries to address this with code that is a bit nicer looking. E.g., for the exceptional case where batch is a dict in the tests, I thought it was better to handle that and do the batch slicing inside the _verbose_log method. So I added an optional param to indicate whether you want to slice the batch, and by how much, before it is printed.

Can you take a look and tell me what you think?
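
The dict/None handling could look roughly like this (the helper and parameter names here are hypothetical, for illustration only):

def slice_for_logging(data, max_items=None):
    # Only genuine lists are sliced; dict batches (seen in some tests) and
    # None pass through unchanged, so callers need no special-casing.
    if max_items is not None and isinstance(data, list):
        return data[:max_items]
    return data


# slice_for_logging([{"id": 1}, {"id": 2}, {"id": 3}], max_items=2) -> first two records
# slice_for_logging({"objects": []}, max_items=2) -> the dict, unchanged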

logger.info(msg)
if data:
    for item in data[:5]:
        logger.info(f"\t{item}")
Contributor

I love the added formatting, much visually cleaner! However instead of logging each line separately, I think we should do this all in one log statement by instead appending the items from data to the msg before logging. Something like:

if data:
    msg += "\n".join([f"\t{item}" for item in data])

Contributor Author

I pushed code that does all this in one log statement now.
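
For reference, a sketch of folding the items into a single log message (the helper name and the default of 5 items are assumptions, not necessarily the exact code pushed here):

import logging

logger = logging.getLogger(__name__)


def log_verbose(msg: str, data=None, max_items: int = 5) -> None:
    # Append the (possibly sliced) items to the message and emit one log call.
    if data:
        items = data[:max_items] if isinstance(data, list) else [data]
        msg += "\n" + "\n".join(f"\t{item}" for item in items)
    logger.info(msg)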

@stacimc
Contributor

stacimc commented Apr 12, 2024

I've also just noticed the catalog test failures -- that's because we are mocking all the results of Variable.get in the provider data ingester tests and are now getting an unexpected new call. @nicoepp you'll need to update test_provider_data_ingester.py, everywhere you see a line like this:

        MockVariable.get.side_effect = [
            20,  # ingestion_limit
            {},  # skipped_ingestion_errors
+           [],  # should_verbose_log 
        ]
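
For anyone unfamiliar with the pattern: when side_effect is set to a list, the mocked Variable.get ignores its arguments and simply returns the next element on each call, so the new Variable.get call added by this PR needs its own entry. A small self-contained illustration (not the actual test file):

from unittest import mock

MockVariable = mock.MagicMock()
MockVariable.get.side_effect = [
    20,  # first call  -> ingestion_limit
    {},  # second call -> skipped_ingestion_errors
    [],  # third call  -> should_verbose_log
]

# The argument passed to .get() is irrelevant here; only the call order matters.
assert MockVariable.get("INGESTION_LIMIT") == 20
assert MockVariable.get("SKIPPED_INGESTION_ERRORS") == {}
assert MockVariable.get("SHOULD_VERBOSE_LOG") == []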

@sarayourfriend sarayourfriend dismissed their stale review April 16, 2024 05:00

Changes addressed

@openverse-bot
Collaborator

Based on the low urgency of this PR, the following reviewers are being gently reminded to review this PR:

@krysal
This reminder is being automatically generated due to the urgency configuration.

Excluding weekend¹ days, this PR was ready for review 6 day(s) ago. PRs labelled with low urgency are expected to be reviewed within 5 weekday(s)².

@nicoepp, if this PR is not ready for a review, please draft it to prevent reviewers from getting further unnecessary pings.

Footnotes

  1. Specifically, Saturday and Sunday.

  2. For the purpose of these reminders we treat Monday - Friday as weekdays. Please note that the operation that generates these reminders runs at midnight UTC on Monday - Friday. This means that depending on your timezone, you may be pinged outside of the expected range.

Member

@krysal krysal left a comment

Nico, I'm converting this to a draft. Please let us know when this is ready for review. Thanks!

@krysal krysal marked this pull request as draft April 17, 2024 21:31
nicoepp and others added 2 commits April 17, 2024 14:46
Co-authored-by: Krystle Salazar <github@krysal.co>
and make amount of lines logged configurable