
[OptApp / SIApp] Updating the Expressions to TensorAdaptors #14214

Merged
sunethwarna merged 124 commits into master from optapp_siapp_tensor_adaptor_update
Mar 12, 2026
Merged

[OptApp / SIApp] Updating the Expressions to TensorAdaptors#14214
sunethwarna merged 124 commits intomasterfrom
optapp_siapp_tensor_adaptor_update

Conversation

@sunethwarna
Member

📝 Description
This PR removes all the use cases of Expression in OptimizationApplication and SystemIdentificationApplication, and replaces them with new TensorAdaptors.

This requires the following PRs to be merged first.

🆕 Changelog

  • Replaces the uses of Expression with TensorAdaptor

# model part
return self.model_part.GetRootModelPart()

def __ExtractTensorData(self, variable, tensor_adaptor: Kratos.TensorAdaptors.DoubleTensorAdaptor, nodes_container: Kratos.NodesArray) -> Kratos.TensorAdaptors.DoubleTensorAdaptor:
Member

Why do you need this function? And what does it do? I can't follow. Please add some documentation.

Member Author

Added comments. Could you please check if it is descriptive enough?

Member

__ExtractShapeUpdateFromDesignSurface() ???

Maybe we can add a check that it is used only in combination with SHAPE_VAR? I guess it is not required for other variables?

Member

Can we avoid general variable names like tensor_adaptor and use more specific ones like shape_update?

@sunethwarna
Member Author

@Igarizza I addressed all of your comments. Could you check?

@sunethwarna sunethwarna requested a review from Igarizza March 9, 2026 21:07

Comment thread: applications/OptimizationApplication/python_scripts/controls/master_control.py (Outdated)
  2, 2.6811e+03, 1.5885e-03
  3, 2.6677e+03, 1.5436e-03
- 4, 2.6191e+03, 1.5046e-03
+ 4, 2.6841e+03, 1.5475e-03
Member

Can we keep the test passing with the original settings / values?

condition_ta_p = Kratos.TensorAdaptors.VariableTensorAdaptor(self.model_part.Conditions, Kratos.PRESSURE)
condition_ta_p.CollectData()

nodal_ta_v = KratosOA.OptimizationUtils.MapContainerDataToNodalData(condition_ta_v, self.model_part.Nodes)
Member

So this map is not a copy but rather a linking? So if we later change the original variable, the nodal data will change as well? Is "map" the correct name in this case, since you do not only map but also sync? _Map&SyncConditionDataToNodalData ?? Please comment, as I am lost here =(

Member Author

Nope, it will not change the model part values, so there is no automatic syncing. But if you call nodal_ta_v.StoreData, then it will write the values back to the nodes [manual syncing at the request of the user]. So I would keep the name, since there is no automatic syncing.
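The collect/store contract described above can be illustrated with a minimal pure-Python mock. This is NOT the Kratos implementation; the class and its method names (CollectData / StoreData) only mirror the TensorAdaptor calls discussed in this thread, and the list of floats stands in for a nodal container:

```python
class MockTensorAdaptor:
    """Illustrative sketch of the contract: CollectData() copies values out
    of the container, StoreData() writes them back on explicit request.
    There is no automatic syncing in between."""

    def __init__(self, container):
        self._container = container  # stand-in for a nodes container
        self._data = []

    def CollectData(self):
        # snapshot: copy values out of the container into the adaptor
        self._data = list(self._container)

    def StoreData(self):
        # manual sync: push adaptor values back into the container
        self._container[:] = self._data

    @property
    def data(self):
        return self._data


nodes = [1.0, 2.0, 3.0]
ta = MockTensorAdaptor(nodes)
ta.CollectData()

nodes[0] = 99.0          # changing the container does NOT change the adaptor
assert ta.data[0] == 1.0

ta.StoreData()           # explicit write-back overwrites the container
assert nodes[0] == 1.0
```

Under this reading, "map" stays accurate: the mapped adaptor holds an independent copy until the user requests a write-back.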

Member

@Igarizza Igarizza left a comment

@suneth Extremely good work! Wow! I have put a few minor comments, but in general excellent work!

@sunethwarna
Member Author

@Igarizza I addressed your comments and reverted the changes to the summary_ref.csv in this PR. This test is failing on master as well. Unfortunately, it is skipped in the CI due to the unavailability of scipy there. We need to fix it in a separate PR.

Member

@Igarizza Igarizza left a comment

one more small thing and it is done=)


  def test_CombinedResponseCalculateValue3(self):
-     eval_resp = EvaluateResponseExpression("r1 + r2 + 4.0", self.optimization_problem)
+     eval_resp = EvaluateResponseExpression("(r1 + r2) * 1e+6 + 4.0", self.optimization_problem)
Member

Why do you change this?

Member Author

I was checking that this test fails before updating to TensorAdaptors. The problem was that it was not failing, because (r1 + r2) was super low and the 4.0 was dominating the whole response... That's why I made it a more sensible test.
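The masking effect described here can be shown numerically. This is a generic sketch, not the actual test: the magnitudes of r1 + r2 and the tolerance are hypothetical, chosen only to show why a dominant additive constant hides errors from a relative comparison until the response part is scaled up:

```python
import math

# Hypothetical magnitudes: the combined response is tiny next to the constant.
r_sum_correct = 1.0e-7
r_sum_wrong = 2.0e-7   # 100% relative error in the response part

# Original form: r1 + r2 + 4.0. The constant dominates, so a relative
# tolerance check cannot see the error in the response sum.
old_ok = math.isclose(r_sum_wrong + 4.0, r_sum_correct + 4.0, rel_tol=1e-6)
assert old_ok  # the broken response still "passes"

# Revised form: (r1 + r2) * 1e+6 + 4.0. Scaling first keeps the response
# part comparable to the constant, so the same check now catches the error.
new_ok = math.isclose(r_sum_wrong * 1e6 + 4.0, r_sum_correct * 1e6 + 4.0, rel_tol=1e-6)
assert not new_ok  # the broken response now fails, as a test should
```

With the scaling applied, an error in the response sum changes the total by a comparable amount instead of being buried under the constant term.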

@sunethwarna sunethwarna enabled auto-merge March 12, 2026 09:13
@sunethwarna sunethwarna requested a review from Igarizza March 12, 2026 09:13
Member

@Igarizza Igarizza left a comment

I think it is ready to go!!!

@sunethwarna sunethwarna merged commit f0cbb9e into master Mar 12, 2026
10 checks passed
@sunethwarna sunethwarna deleted the optapp_siapp_tensor_adaptor_update branch March 12, 2026 09:15

Labels

Applications · C++ · Python · Refactor (when code is moved or rewritten keeping the same behavior)


3 participants