
Conversation

Contributor

@luss luss commented Nov 9, 2023

Changes Title: EquinixMetal small fix to handle large instances RAM measured in TB

Description

Equinix Metal plan specs normally report memory in GB (for example 256GB), but
some workload-optimized and reserved hardware instances report it in TB. This
small fix handles the TB form when listing nodes so RAM is computed correctly
for those large instances.

For more information on contributing, please see the Contributing
section of our documentation.

Status

Done, ready for review.

Checklist (tick everything that applies)

  • Code linting (required, can be done after the PR checks)
  • Documentation
  • Tests
  • ICLA (required for bigger changes)

Contributor Author

luss commented Nov 9, 2023

I've discussed this small fix to EquinixMetal with my friends at Equinix, and they are supportive.

@luss luss changed the title from "fix to handle listing nodes with mem in TB's" to "EquinixMetal small fix to handle large instances RAM measured in TB" on Nov 9, 2023

@displague displague left a comment


lgtm.

The specs in the general plan list only use GB today:

$ metal plans list -o json | jq -r '.[]|.specs.memory.total' | sort -n -u
null
16GB
32GB
64GB
128GB
192GB
256GB
384GB
512GB
768GB
1024GB

There are workload-optimized and custom reserved hardware instances whose memory spec may be written in the TB form this PR addresses. This is a reasonable defensive patch.

I verified that this raw python block handles the input.
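
As a minimal sketch of that kind of parse, assuming a helper named parse_memory_mb and a megabyte return unit (neither is taken from the PR's actual diff):

def parse_memory_mb(total):
    """Convert a plan memory spec such as '256GB' or '2TB' to megabytes."""
    if total is None:
        return None
    value = total.strip().upper()
    # try the largest unit first so 'TB' is not misread as ending in 'B'
    for suffix, factor in (("TB", 1024 * 1024), ("GB", 1024), ("MB", 1)):
        if value.endswith(suffix):
            return int(float(value[: -len(suffix)]) * factor)
    raise ValueError("unrecognized memory spec: %r" % total)

# '2TB' and '2048GB' should agree
assert parse_memory_mb("2TB") == parse_memory_mb("2048GB") == 2097152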

@luss do you have instances that return the memory in this format today?

Signed-off-by: Ayush Rangwala <ayush.rangwala@gmail.com>
@displague

@luss It seems we now have 3 Equinix Metal PRs up:

The inclusion of 50c187e is complicating this PR and tying it to the features present in the other two PRs. Can we remove that commit from this PR?

For testing purposes, if you want to run with all of the PRs, I'd suggest using a different branch for that, one that is not opened as a PR.
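
For what it's worth, one way to do both (a sketch only; the branch names in angle brackets are placeholders, and this assumes the PR branch is checked out locally):

$ # drop 50c187e by replaying its descendants onto its parent
$ git rebase --onto 50c187e^ 50c187e
$ git push --force-with-lease origin <pr-branch>

$ # keep a local integration branch for testing all PRs together,
$ # without ever opening it as a PR
$ git checkout -b equinix-integration
$ git merge <other-pr-branch>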

Contributor Author

luss commented Nov 16, 2023

I'm not the world's biggest git expert.... I will create a new PR from a clean repo that has only my suggested fix. Then I will withdraw this PR.

@pgEdge pgEdge closed this by deleting the head repository Nov 16, 2023
