
Average grades in export #2366

Open
fidoriel wants to merge 7 commits into main from average-grades-in-export

Conversation

fidoriel (Collaborator)

closes #2001

@fidoriel force-pushed the average-grades-in-export branch 3 times, most recently from e5117a8 to d51385d on January 13, 2025 19:12
@fidoriel force-pushed the average-grades-in-export branch from d51385d to af9697b on January 13, 2025 20:04
@fidoriel force-pushed the average-grades-in-export branch from af9697b to 382680b on January 13, 2025 20:13
@@ -198,6 +203,8 @@ def write_headings_and_evaluation_info(
else:
self.write_cell(export_name, "headline")

self.write_cell("Average for this Question", "evaluation")
Member

I think "Average for this Question" would need to be translated here.

The styles used here and in the lines changed below confuse me. Why is this using "evaluation"? Why is the empty cell in the course type column using the "program" style below?

Please double-check that these all make sense, and if we really need some unintuitive value, add some explanation as to why.
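
A minimal sketch of how the label could be marked for translation, assuming the project uses Django's translation utilities (an assumption; only the write_cell call and the "evaluation" style name are taken from the diff above):

from django.utils.translation import gettext as _

# Wrap the label so it is picked up for translation instead of being hard-coded English.
self.write_cell(_("Average for this Question"), "evaluation")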

Member

This is still not marked for translation, is it? I would also prefer to have something like "Average for this question over all evaluations in all published semesters" and please make it italic to distinguish it from the course names.
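
A possible way to get the italic distinction, assuming the exporter builds its cell styles with xlwt.easyxf (an assumption; the actual style setup and style names in the exporter may differ):

import xlwt

# Hypothetical style for the average row, italic to set it apart from the course names.
average_style = xlwt.easyxf("font: italic on")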

@fidoriel force-pushed the average-grades-in-export branch from dced350 to 6163186 on January 27, 2025 18:58
@niklasmohrin (Member) left a comment

Seems mostly correct, but I think we should make an effort to keep the structure of these methods clear.

@richardebeling self-requested a review on February 17, 2025 19:11

self.assertEqual(
    float(workbook.sheets()[0].row_values(5)[1]),
    (float(workbook.sheets()[0].row_values(5)[2]) + float(workbook.sheets()[0].row_values(5)[3])) / 2,
)
Member

From a quick glance, it looks like the setup above creates the exact same average result for both evaluations and the same count of answers inside them. This doesn't test whether we compute the average-of-averages with correct weighting. Can we change them to have different answer counts?

Coincidentally, the float equality here made me wonder about correctness. It should work if the two values here are identical: the sum is then 2 * the original value, which is divisible by 2 to perfectly recreate the original value. However, I'd like the test to cover a somewhat more sophisticated case. We'll likely need to use assertAlmostEqual then.
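
A hypothetical illustration of both points with made-up numbers (none of these values come from the test setup above): two evaluations with different answer counts give different results for a weighted and an unweighted mean of means, and assertAlmostEqual avoids exact float comparison:

import unittest

class WeightedAverageExample(unittest.TestCase):
    def test_weighting_matters(self):
        # Evaluation A: 3 answers with mean 2.0, evaluation B: 1 answer with mean 4.0.
        counts = [3, 1]
        means = [2.0, 4.0]
        weighted = sum(c * m for c, m in zip(counts, means)) / sum(counts)  # 2.5
        unweighted = sum(means) / len(means)  # 3.0
        self.assertAlmostEqual(weighted, 2.5)
        self.assertNotAlmostEqual(weighted, unweighted)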

Comment on lines +706 to +707
self.assertEqual(float(workbook.sheets()[0].row_values(8)[1]), float(workbook.sheets()[0].row_values(8)[3]))
self.assertEqual("", workbook.sheets()[0].row_values(8)[2])
Member

I may be tired, but I don't understand what we're asserting here. What is the intention here? Can we maybe improve the test name or add a comment?
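
One hypothetical way to make the intent explicit, reusing the row and column indices from the snippet above (the comment text is only a guess at the intent and would need to match what the exporter actually does):

row = workbook.sheets()[0].row_values(8)
# Guessed intent: a course with a single evaluation should carry that evaluation's
# average over into the course average column, while the cell in between stays empty.
self.assertEqual(float(row[1]), float(row[3]))
self.assertEqual("", row[2])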

Development
Successfully merging this pull request may close these issues: Add averages to results exports
4 participants