Conversation

@shiftsayan shiftsayan commented Sep 12, 2022

The model metrics are not being used anywhere in the test.

This call is causing errored jobs on the backend because, by the time the async job starts, the test has completed and the dataset yielded by the test fixture has been deleted from the database.


@gatli gatli left a comment


This is being used as an integration test for the model_run_commit job, so it lets us know when the pipeline is broken. I'd rather we wait for the job to finish.


@gatli gatli left a comment


Since apparently we can't currently track the state of the evaluation job, we can merge this. @shiftsayan is going to add a task to add AsyncJob tracking to calculate_evaluation_metrics so that we can properly test that it runs to completion.
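The AsyncJob tracking discussed above could be tested by polling the job's status until it reaches a terminal state, so the dataset fixture stays alive until the backend job is done. A minimal sketch of that pattern follows; all names here (`AsyncJob`, the `"Completed"`/`"Errored"` states, the poll helper) are hypothetical illustrations, not the project's actual API:

```python
import time


class AsyncJob:
    """Minimal stand-in for a backend job handle with a pollable status.

    This simulates a job that reports "Running" for a couple of polls and
    then "Completed"; a real handle would query the backend instead.
    """

    def __init__(self):
        self._ticks = 0

    def status(self):
        # Pretend the job finishes after three status checks.
        self._ticks += 1
        return "Completed" if self._ticks >= 3 else "Running"


def wait_until_complete(job, timeout_s=30.0, poll_s=0.01):
    """Poll job.status() until it is terminal or the timeout elapses.

    Calling this inside the test (before the fixture tears down and the
    dataset is deleted) avoids the race described in this PR.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        state = job.status()
        if state in ("Completed", "Errored"):
            return state
        time.sleep(poll_s)
    raise TimeoutError("job did not finish before fixture teardown")


# In a test, asserting on the terminal state both verifies the pipeline
# and keeps the fixture-managed dataset alive until the job is done:
job = AsyncJob()
assert wait_until_complete(job) == "Completed"
```

With a helper like this, the integration-test value gatli describes is preserved (a broken pipeline surfaces as an `"Errored"` state or a timeout) without leaving an orphaned job racing the deleted dataset.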

@shiftsayan shiftsayan merged commit 309daed into master Sep 13, 2022
@shiftsayan shiftsayan deleted the sayan/fix-model-metrics-test branch September 13, 2022 09:17