Skip mandatory validation with `object_template` #895
When an object specifies an `object_template`, the template provides default values for mandatory fields (including resource pool allocations). The client-side validation was rejecting these objects because mandatory fields were not present in the YAML data, even though the server would resolve them from the template. Skip the mandatory field check in `validate_object()` when `object_template` is present in the data, allowing the server to validate template completeness instead.
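A minimal sketch of the guard described above. This is not the SDK's actual implementation: the `ValidationError` class, the flat `schema` dict, and the field loop are simplified stand-ins; only the `validate_object` name and the `object_template` key come from this PR.

```python
from dataclasses import dataclass


@dataclass
class ValidationError:
    position: list[int]
    message: str


def validate_object(schema: dict, data: dict, position: list[int]) -> list[ValidationError]:
    """Simplified stand-in: skip the mandatory-field check when a template is set."""
    errors: list[ValidationError] = []
    has_template = "object_template" in data
    for field in schema["mandatory_fields"]:
        if field not in data and not has_template:
            # Without a template, a missing mandatory field is a client-side error.
            # With a template, the server resolves defaults and validates completeness.
            errors.append(ValidationError(position, f"{field} is mandatory"))
    return errors
```

The key design point is that the skip is scoped to the mandatory-field check only; other validations (unknown attributes, type checks) would still run unconditionally.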
**Walkthrough:** The changes implement conditional mandatory field validation during object loading. When an `object_template` is present, the mandatory field check is skipped so the server can validate template completeness instead.

🚥 Pre-merge checks: ✅ 2 passed, ❌ 1 failed (1 warning)
Deploying infrahub-sdk-python with Cloudflare Pages

| | |
| --- | --- |
| Latest commit: | fd0f0c0 |
| Status: | ✅ Deploy successful! |
| Preview URL: | https://ba50df54.infrahub-sdk-python.pages.dev |
| Branch Preview URL: | https://gma-20260326-ifc2404.infrahub-sdk-python.pages.dev |
**Codecov Report:** ✅ All modified and coverable lines are covered by tests.

```
@@           Coverage Diff           @@
##           stable    #895   +/-   ##
=======================================
  Coverage   80.68%   80.68%
=======================================
  Files         119      119
  Lines       10335    10336    +1
  Branches     1550     1551    +1
=======================================
+ Hits         8339     8340    +1
  Misses       1473     1473
  Partials      523      523
```
Force-pushed from e2fcbee to 3569b63, then from 3569b63 to fd0f0c0.
🧹 Nitpick comments (4)
tests/integration/test_spec_object.py (3)
**171-184: Consider resource cleanup for integration tests.**

Per coding guidelines, integration tests should clean up resources. The tests create `TestingPerson` and template objects but don't explicitly clean them up. If test isolation is provided by class-scoped containers or fresh databases, this may be acceptable. Otherwise, consider adding cleanup fixtures.

🧹 Example cleanup pattern

```python
@pytest.fixture(scope="class")
async def cleanup_test_data(self, client: InfrahubClient) -> None:
    yield
    # Cleanup after all tests in the class
    dogs = await client.all(kind="TestingDog")
    for dog in dogs:
        await dog.delete()
    persons = await client.all(kind="TestingPerson")
    for person in persons:
        await person.delete()
```

As per coding guidelines: "Clean up resources in integration tests"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/integration/test_spec_object.py` around lines 171 - 184, Tests test_create_owner and test_create_templates create resources (TestingPerson via client.create/owner.save and TemplateTestingDog via make_object_file/complete.process/incomplete.process) but do not clean them up; add an async teardown/fixture (e.g., a class-scoped pytest fixture or finalizer) that queries the client for created kinds (TestingPerson and TemplateTestingDog or TemplateTestingDog-derived templates) and deletes them (using client.all(...) and item.delete()) after the tests run to ensure isolation and resource cleanup.
**162-169: Consider extracting shared fixture.**

The `initial_schema` fixture is nearly identical to the one in `TestSpecObject` (lines 50-56). Consider extracting it to a shared mixin or conftest to avoid duplication.

♻️ Optional: Share fixture via mixin

```python
# Could be added to SchemaAnimal or a dedicated mixin
class SchemaAnimalFixtures:
    @pytest.fixture(scope="class")
    async def initial_schema(
        self, default_branch: str, client: InfrahubClient, schema_base: SchemaRoot
    ) -> None:
        await client.schema.wait_until_converged(branch=default_branch)
        resp = await client.schema.load(
            schemas=[schema_base.to_schema_dict()], branch=default_branch, wait_until_converged=True
        )
        assert resp.errors == {}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/integration/test_spec_object.py` around lines 162 - 169, The fixture initial_schema is duplicated between TestSpecObjectWithTemplate and TestSpecObject; extract it into a shared location (e.g., add a SchemaAnimalFixtures mixin or move to conftest) and have both test classes inherit or import it; specifically create a shared fixture named initial_schema (same async signature using client.schema.wait_until_converged and client.schema.load with schema_base.to_schema_dict()) and remove the duplicate definitions from TestSpecObjectWithTemplate and TestSpecObject so both classes (or SchemaAnimal) reuse the single implementation.
**198-210: Verify error message stability.**

The assertion on line 210 checks an exact error message string:

```python
assert exc_info.value.errors[0]["message"] == "breed is mandatory for TestingDog at breed"
```

This could become brittle if the server-side error message format changes. Consider a more flexible assertion:

💡 More resilient assertion

```diff
- assert exc_info.value.errors[0]["message"] == "breed is mandatory for TestingDog at breed"
+ error_msg = exc_info.value.errors[0]["message"]
+ assert "breed" in error_msg and "mandatory" in error_msg
```

Otherwise, the test correctly validates that the server rejects objects with incomplete templates.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/integration/test_spec_object.py` around lines 198 - 210, Replace the brittle exact-string check in test_dog_with_incomplete_template_fails by asserting the error message contains the key phrase instead of matching the whole string; locate the assertion referencing exc_info (in test_dog_with_incomplete_template_fails) and change the equality check on exc_info.value.errors[0]["message"] to a contains-style assertion (e.g., assert "breed is mandatory" in exc_info.value.errors[0]["message"] or use a regex match) so the test remains resilient to minor server-side wording changes.

tests/unit/sdk/spec/test_object.py (1)
**462-474: LGTM!**

The test correctly validates that mandatory field errors are skipped when `object_template` is present. The approach of filtering for the specific mandatory error message is appropriate.

Consider adding a complementary assertion to verify that other validation (e.g., invalid attribute detection) still runs even with `object_template` present, to ensure the skip is targeted only at mandatory checks:

💡 Optional: Add assertion for continued validation

```diff
 async def test_validate_object_skips_mandatory_check_with_object_template(
     client_with_schema_01: InfrahubClient,
 ) -> None:
     """When object_template is present, mandatory field validation should be skipped."""
     schema = await client_with_schema_01.schema.get(kind="BuiltinLocation")
     data = {"name": "Site1", "object_template": "Standard Site"}
     errors = await InfrahubObjectFileData.validate_object(
         client=client_with_schema_01, position=[1], schema=schema, data=data
     )
     mandatory_errors = [e for e in errors if e.message == "type is mandatory"]
     assert mandatory_errors == []
+
+    # Verify other validation still runs
+    data_with_invalid = {"name": "Site1", "object_template": "Standard Site", "invalid_attr": "x"}
+    errors_invalid = await InfrahubObjectFileData.validate_object(
+        client=client_with_schema_01, position=[1], schema=schema, data=data_with_invalid
+    )
+    assert any("invalid_attr" in e.message for e in errors_invalid)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/unit/sdk/spec/test_object.py` around lines 462 - 474, Add a complementary assertion in the test_validate_object_skips_mandatory_check_with_object_template test to ensure other validations still run when object_template is present: call InfrahubObjectFileData.validate_object (already used) with data that includes a deliberately invalid attribute (or reuse returned errors) and assert that an expected non-mandatory error (e.g., invalid attribute/type validation error) is present in errors; this verifies only mandatory checks are skipped while other schema validations remain active.
ℹ️ Review info

⚙️ Run configuration
- Configuration used: Organization UI
- Review profile: CHILL
- Plan: Pro
- Run ID: 44cb0c5d-285a-47c4-96ca-e17760f2bca6
📒 Files selected for processing (5)
- changelog/+ifc2404.fixed.md
- infrahub_sdk/spec/object.py
- infrahub_sdk/testing/schemas/animal.py
- tests/integration/test_spec_object.py
- tests/unit/sdk/spec/test_object.py
Checklist
- Changelog entry (`uv run towncrier create ...`)

Summary by CodeRabbit
New Features
Tests