Conversation
@claude review
Claude:

## Summary

This PR fixes memory leaks in VL (vision-language) data loading by:
The resource cleanup changes (context managers) are correct and good practice. A few concerns:

### Issues

**Warning**

**Nit**

### Verdict

COMMENT — the core fix (context managers) is solid. The
`_TRIM_MEMORY_WARNED = False`
`def trim_memory(logger: logging.Logger | None = None):`
Taking `logger` as a function argument seems odd — it reads as if we want to trim memory induced by the logger itself. Under what circumstances would we need a logger other than the global loguru logger returned by `xtuner.v1.utils.get_logger`?
`data_name: str | None = None,`
`llm_pack_weight: float = 1.0,`
`visual_pack_weight: float = 0.0,`
`trim_memory_step: int = 1,`
It might be better to stick to the trainer naming convention: use `..._step` for a counter, and `..._interval` for specifying how often something is done.