OS: Windows 11
Python 3.7.9

We run:

```
python .\FeTS_Challenge.py
```
We get this error. I understand that we are loading data with the data_loader and it gets a dict with headers, built in experiment.py with the task runner:
```
File ".\FeTS_Challenge.py", line 584, in <module>
    restore_from_checkpoint_folder = restore_from_checkpoint_folder)
File "C:\CodeRepos\MoaniSandbox\FETS-AI\Challenge\Task_1\fets_challenge\experiment.py", line 292, in run_challenge_experiment
    task_runner = copy(plan).get_task_runner(collaborator_data_loaders[col])
File "C:\CodeRepos\MoaniSandbox\FETS-AI\Challenge\Task_1\venv\lib\site-packages\openfl\federated\plan\plan.py", line 389, in get_task_runner
    self.runner_ = Plan.build(**defaults)
File "C:\CodeRepos\MoaniSandbox\FETS-AI\Challenge\Task_1\venv\lib\site-packages\openfl\federated\plan\plan.py", line 182, in build
    instance = getattr(module, class_name)(**settings)
File "C:\CodeRepos\MoaniSandbox\FETS-AI\Challenge\Task_1\venv\lib\site-packages\openfl\federated\task\runner_fets_challenge.py", line 43, in __init__
    model, optimizer, train_loader, val_loader, scheduler, params = create_pytorch_objects(fets_config_dict, train_csv=train_csv, val_csv=val_csv, device=device)
File "C:\CodeRepos\MoaniSandbox\FETS-AI\Challenge\Task_1\venv\lib\site-packages\GANDLF\compute\generic.py", line 48, in create_pytorch_objects
    train_loader = get_train_loader(parameters)
File "C:\CodeRepos\MoaniSandbox\FETS-AI\Challenge\Task_1\venv\lib\site-packages\GANDLF\data\__init__.py", line 24, in get_train_loader
    loader_type="train",
File "C:\CodeRepos\MoaniSandbox\FETS-AI\Challenge\Task_1\venv\lib\site-packages\GANDLF\data\ImagesFromDataFrame.py", line 65, in ImagesFromDataFrame
    preprocessing = parameters["data_preprocessing"]
KeyError: 'data_preprocessing'
```
When we print() out the parameters, we can see that "data_preprocessing" is actually missing. Is this a breaking change with GANDLF?
```
{'batch_size': 1,
 'data_augmentation': {},
 'data_postprocessing': {},
 'enable_padding': False,
 'in_memory': True,
 'inference_mechanism': {'grid_aggregator_overlap': 'crop', 'patch_overlap': 0},
 'learning_rate': 0.001,
 'loss_function': 'dc',
 'medcam_enabled': False,
 'metrics': ['dice', 'dice_per_label', 'hd95_per_label'],
 'model': {'amp': True, 'architecture': 'resunet', 'base_filters': 32,
           'class_list': [0, 1, 2, 4], 'dimension': 3, 'final_layer': 'softmax',
           'norm_type': 'instance', 'type': 'torch', 'num_channels': 4,
           'num_classes': 4},
 'nested_training': {'testing': 1, 'validation': -5},
 'num_epochs': 1,
 'optimizer': {'type': 'sgd'},
 'output_dir': '.',
 'parallel_compute_command': '',
 'patch_sampler': 'label',
 'patch_size': [64, 64, 64],
 'patience': 100,
 'pin_memory_dataloader': False,
 'print_rgb_label_warning': True,
 'q_max_length': 100,
 'q_num_workers': 0,
 ...}
```

(dump truncated here)
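A quick way to confirm exactly which expected keys are absent from the parsed parameters dict. This is a hypothetical helper, not part of GaNDLF; the key names are taken from the dump and traceback above:

```python
def find_missing(params, expected):
    """Return, sorted, the expected keys that are absent from params."""
    return sorted(set(expected) - params.keys())

# Keys the data loader appears to expect (from the error above).
expected_keys = ["data_preprocessing", "data_augmentation", "data_postprocessing"]

# Abbreviated stand-in for the printed parameters dict.
params = {"batch_size": 1, "data_augmentation": {}, "data_postprocessing": {}}

print(find_missing(params, expected_keys))  # -> ['data_preprocessing']
```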
Solved by overriding data_preprocessing, because data_preprocessing in the plan YAML file was not set accordingly.
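The override can be as simple as guaranteeing the key exists before the loader reads it. A minimal sketch of that idea (the function name and the exact point in the fets_challenge/GaNDLF pipeline where the parameters dict is available are assumptions; an empty dict here means "no preprocessing"):

```python
def ensure_preprocessing(parameters):
    """Insert an empty 'data_preprocessing' entry if the plan did not set one,
    so that parameters["data_preprocessing"] no longer raises KeyError."""
    parameters.setdefault("data_preprocessing", {})
    return parameters

params = {"batch_size": 1}          # plan parsed without the key
ensure_preprocessing(params)
print(params["data_preprocessing"])  # -> {}
```

The cleaner long-term fix is to set data_preprocessing explicitly in the plan YAML so the parsed parameters carry it from the start.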