
Conversation

@seifertdan
Contributor

@seifertdan seifertdan commented Jul 17, 2024

Add AWS region me-central-1

Description

We need support for the opt-in AWS region me-central-1.

Resources:

Status

  • Done, ready for review.

We tried to follow the documentation (see [1]) to update these files:

  • contrib/scrape-ec2-prices.py
  • contrib/scrape-ec2-sizes.py

Unfortunately, when running tox -e scrape-ec2-sizes,scrape-ec2-prices, the scrape-ec2-sizes step failed:

root@a2eea4b5cfec:/work/libcloud# tox -e scrape-ec2-sizes,scrape-ec2-prices
scrape-ec2-sizes: commands[0]> bash -c 'echo "Scrapping EC2 sizes, this may take up to 10 minutes or more since the actual JSON data we download and scrape is very large"'
Scrapping EC2 sizes, this may take up to 10 minutes or more since the actual JSON data we download and scrape is very large
scrape-ec2-sizes: commands[1]> bash -c 'python contrib/scrape-ec2-sizes.py'
Scraping size data, this may take up to 10-15 minutes...
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 6.00G/6.00G [02:12<00:00, 45.4MiB/s]
scrape-ec2-sizes: exit -9 (652.54 seconds) /work/libcloud> bash -c 'python contrib/scrape-ec2-sizes.py' pid=21654
scrape-ec2-sizes: FAIL ✖ in 10 minutes 52.65 seconds
scrape-ec2-prices: commands[0]> python contrib/scrape-ec2-prices.py
Scraping EC2 pricing data (if this runs for the first time it has to download a 3GB file, depending on your bandwith it might take a while)....
Using data from existing cached file /tmp/ec.json (mtime=2024-07-22 21:09:14 UTC)
Starting to parse pricing data, this could take up to 15 minutes...
297107179it [09:43, 508892.55it/s] 
Using data from existing cached file /tmp/ec.json (mtime=2024-07-22 21:09:14 UTC)
Starting to parse pricing data, this could take up to 15 minutes...
101166581it [05:49, 289104.63it/s]
Unexpected OS Ubuntu Pro
Unexpected OS Ubuntu Pro
...
  scrape-ec2-sizes: FAIL code -9 (652.65=setup[0.09]+cmd[0.01,652.54] seconds)
  scrape-ec2-prices: OK (942.42=setup[0.12]+cmd[942.30] seconds)
  evaluation failed :( (1595.18 seconds)

Afterwards we retried on trunk, and it also fails:

root@a2eea4b5cfec:/work/libcloud# tox -e scrape-ec2-sizes
scrape-ec2-sizes: commands[0]> bash -c 'echo "Scrapping EC2 sizes, this may take up to 10 minutes or more since the actual JSON data we download and scrape is very large"'
Scrapping EC2 sizes, this may take up to 10 minutes or more since the actual JSON data we download and scrape is very large
scrape-ec2-sizes: commands[1]> bash -c 'python contrib/scrape-ec2-sizes.py'
Scraping size data, this may take up to 10-15 minutes...
Using data from existing cached file /tmp/ec.json
scrape-ec2-sizes: exit -9 (576.53 seconds) /work/libcloud> bash -c 'python contrib/scrape-ec2-sizes.py' pid=1832
  scrape-ec2-sizes: FAIL code -9 (576.78=setup[0.19]+cmd[0.06,576.53] seconds)
  evaluation failed :( (577.21 seconds)
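For what it's worth, the "exit -9" in the tox output follows the Python subprocess convention of reporting death-by-signal as a negative signal number: signal 9 is SIGKILL, which on Linux most likely means the kernel OOM killer terminated the script while it held the ~6 GB JSON in memory (dmesg should show an "Out of memory: Killed process ..." entry if so). A small stdlib demonstration of the convention:

```python
import os
import signal
import subprocess
import sys

# Start a child process that just sleeps, then deliver SIGKILL to it -- the
# same signal the OOM killer uses. subprocess reports this as returncode -9,
# matching the "exit -9" tox prints above.
proc = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(60)"])
os.kill(proc.pid, signal.SIGKILL)
proc.wait()
print(proc.returncode)  # prints -9 on POSIX systems
```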

The example referenced in the documentation (see [2]) is also outdated: the EC2_REGIONS list no longer exists in contrib/scrape-ec2-prices.py.

The change we made seems to be sufficient to delete a VM; are there other tests we should run?

[1] https://libcloud.readthedocs.io/en/latest/development.html#updating-ec2-sizing-and-supported-regions-data
[2] 762f0e5

Checklist (tick everything that applies)

  • Code linting (required, can be done after the PR checks)
  • Documentation
  • Tests
  • ICLA (required for bigger changes)

@seifertdan seifertdan marked this pull request as draft July 17, 2024 12:01
@seifertdan seifertdan closed this Jul 17, 2024
@seifertdan seifertdan deleted the patch-1 branch July 17, 2024 12:09
@seifertdan seifertdan restored the patch-1 branch July 30, 2024 08:13
@seifertdan seifertdan reopened this Jul 30, 2024
@seifertdan seifertdan marked this pull request as ready for review July 30, 2024 08:24
@elakito

elakito commented Jul 31, 2024

Is this PR ready for review, or should someone investigate the problem first? I also tried running tox; scrape-ec2-sizes.py downloads a 6 GB ec.json file, which looks okay, but the script then fails with this error:

Using data from existing cached file /tmp/ec.json
scrape-ec2-sizes: exit -9 (837.75 seconds) /opt/libcloud> bash -c 'python contrib/scrape-ec2-sizes.py' pid=28
scrape-ec2-sizes: FAIL ✖ in 13 minutes 58.98 seconds

Thanks.

@Kami Kami added this to the v3.9.0 milestone Mar 2, 2025
@codecov-commenter

codecov-commenter commented Mar 2, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 83.40%. Comparing base (1117987) to head (0a5bc0d).
Report is 45 commits behind head on trunk.

Additional details and impacted files
@@           Coverage Diff           @@
##            trunk    #2030   +/-   ##
=======================================
  Coverage   83.40%   83.40%           
=======================================
  Files         353      353           
  Lines       81685    81685           
  Branches     8632     8632           
=======================================
  Hits        68124    68124           
  Misses      10738    10738           
  Partials     2823     2823           
Files with missing lines Coverage Δ
libcloud/compute/constants/ec2_instance_types.py 100.00% <ø> (ø)
...d/compute/constants/ec2_region_details_complete.py 100.00% <ø> (ø)
libcloud/storage/drivers/s3.py 89.61% <ø> (ø)

@Kami Kami merged commit 973f891 into apache:trunk Mar 2, 2025
17 checks passed
@Kami
Member

Kami commented Mar 2, 2025

Sorry for the delay. The PR (with somewhat related updates to AWS EC2 sizes and prices) has been merged into trunk.

Thanks for the contribution.

