
Nested dicts do not get updated but overwritten #387

Open
florianblume opened this issue Sep 19, 2023 · 2 comments
Labels
duplicate (This issue or pull request already exists) · enhancement (New feature or request)

Comments

@florianblume

Again, thanks for all your great work on this project. Despite opening a couple of issues, I really want to point out how much I dig your way of defining configs and calling code.

🐛 Bug report

I do have one issue though: my use case is PyTorch Lightning, where I have a config file and generate a second config file on the fly for a hyperparameter search using Ray Tune.

The problem is that I cannot tune the learning rate because for that I have to adjust the init_args of the optimizer_init, which is a nested dict in the model key.

To reproduce

Executing the following code shows that weight_decay gets discarded, although it should in fact be kept:

from jsonargparse import ArgumentParser, ActionConfigFile
import yaml
import tempfile

class ModelBaseClass():
    def __init__(self, batch_size: int=0):
        self.batch_size = batch_size

class ImageModel(ModelBaseClass):
    def __init__(self, optimizer_init: dict, **kwargs):
        super().__init__(**kwargs)
        self.optimizer_init = optimizer_init  # store the nested dict of optimizer settings

parser = ArgumentParser(
    prog='app',
    description='Description for my app.'
)

parser.add_subclass_arguments(ModelBaseClass, 'model')
parser.add_argument("--config", action=ActionConfigFile)

model_config_file = tempfile.NamedTemporaryFile(suffix='.yaml')

with open(model_config_file.name, 'w') as _config_file:
    yaml.safe_dump({
        'class_path': '__main__.ImageModel',
        'init_args': {
            'batch_size': 10,
            'optimizer_init': {
                'init_args': {
                    'lr': 0.001,
                    'weight_decay': 0.00001
                }
            }
        }
    }, _config_file)

config_file = tempfile.NamedTemporaryFile(suffix='.yaml')

with open(config_file.name, 'w') as _config_file:
    yaml.safe_dump({
        'model': {
            'init_args': {
                'optimizer_init': {
                    'init_args': {
                        'lr': 0.1
                    }
                }
            }
        }
    }, _config_file)

print(parser.parse_args(['--model=' + model_config_file.name, '--config=' + config_file.name]))

The output is:

Namespace(config=[Path_fr(/shared/tmp/tmpamv5l6pv.yaml)], model=Namespace(class_path='__main__.ImageModel', init_args=Namespace(optimizer_init={'init_args': {'lr': 0.1}}, batch_size=10), __path__=Path_fr(/shared/tmp/tmp4is99glf.yaml)))

The reason (I checked this manually in jsonargparse) is that optimizer_init gets overwritten as a whole rather than updated.

Expected behavior

The expected output would be:

Namespace(config=[Path_fr(/shared/tmp/tmpamv5l6pv.yaml)], model=Namespace(class_path='__main__.ImageModel', init_args=Namespace(optimizer_init={'init_args': {'lr': 0.1, 'weight_decay': 0.00001}}, batch_size=10), __path__=Path_fr(/shared/tmp/tmp4is99glf.yaml)))
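
For illustration, here is a minimal plain-Python sketch (not jsonargparse code) of the difference between the current replacement behavior and the recursive update I would expect; the dicts are the optimizer_init values from the two config files above:

base = {'init_args': {'lr': 0.001, 'weight_decay': 0.00001}}   # from the model config file
override = {'init_args': {'lr': 0.1}}                           # from the --config file

# Current behavior: the value under 'init_args' is replaced wholesale, so weight_decay is lost.
print({**base, **override})
# {'init_args': {'lr': 0.1}}

# Expected behavior: the nested dicts are merged, so only lr changes and weight_decay is kept.
print({'init_args': {**base['init_args'], **override['init_args']}})
# {'init_args': {'lr': 0.1, 'weight_decay': 1e-05}}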

Environment

jsonargparse version 4.24.1

@mauvilsa
Member

This is not a bug. Dict replacement is the expected behavior, as explained in dict-items. This is actually a duplicate of #236. For partial updates of dicts that also work in config files, an append option (+) could be added, similar to list-append.
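
Until such an option exists, one possible workaround (only a sketch, not a jsonargparse feature; the deep_update helper is an assumption) is to deep-merge the raw YAML dicts yourself and pass a single pre-merged config to the parser, continuing from the reproduction snippet above:

import tempfile
import yaml

def deep_update(base: dict, override: dict) -> dict:
    # Recursively merge override into base, keeping keys that are not overridden.
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_update(merged[key], value)
        else:
            merged[key] = value
    return merged

# Load both YAML files from the reproduction snippet and merge them in Python.
with open(model_config_file.name) as f:
    model_cfg = yaml.safe_load(f)
with open(config_file.name) as f:
    override_cfg = yaml.safe_load(f)

merged_cfg = deep_update({'model': model_cfg}, override_cfg)

# Write a single combined config file and let the parser see only that one,
# so nested keys such as weight_decay are preserved.
merged_file = tempfile.NamedTemporaryFile(suffix='.yaml')
with open(merged_file.name, 'w') as f:
    yaml.safe_dump(merged_cfg, f)

print(parser.parse_args(['--config=' + merged_file.name]))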

@mauvilsa added the duplicate and enhancement labels and removed the bug label on Sep 20, 2023
@florianblume
Author

I see. Thanks for the quick reply, I'll try to implement what you suggested in the other issue. Then the mistake was on my side for not reading the documentation carefully enough, sorry for that! ;)

Should I close this issue, or edit it to represent your enhancement suggestion? Might make more sense if you do it since I lack a lot of understanding of the details of the project.

By the way, thanks for also pointing out the list-append option! I wasn't aware of it and this is making one pretty ugly part of my code obsolete :D
