
issue with ec2_info.py and aws maintenance metadata #198

Open
kdecoteauCS opened this issue Sep 8, 2018 · 2 comments


@kdecoteauCS

We encountered an issue where ec2_info.py generates critical errors as it steps through the metadata URLs.

The code is able to extract all other metadata, but dies on maintenance events when the listing is blank.
URL: /latest/meta-data/events/maintenance/
I'm assuming it is due to the response time. I added two lines to _call_aws(url) to print the contents of the URL variable before establishing a connection.

```
.....
PRINTING URL ----- it should be below
/latest/meta-data/events/maintenance/

.....
[CRITICAL] Failed to load grains defined in grain file ec2_info.ec2_info in function <function ec2_info at 0x7f4fff6326e0>, error:
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/salt/loader.py", line 726, in grains
    ret = fun()
  File "/var/cache/salt/minion/extmods/grains/ec2_info.py", line 157, in ec2_info
    grains.update(_get_ec2_hostinfo())
  File "/var/cache/salt/minion/extmods/grains/ec2_info.py", line 73, in _get_ec2_hostinfo
    d[_dash_to_snake_case(line[:-1])] = _get_ec2_hostinfo(path + line)
  File "/var/cache/salt/minion/extmods/grains/ec2_info.py", line 73, in _get_ec2_hostinfo
    d[_dash_to_snake_case(line[:-1])] = _get_ec2_hostinfo(path + line)
  File "/var/cache/salt/minion/extmods/grains/ec2_info.py", line 50, in _get_ec2_hostinfo
    if line[-1] != "/":
IndexError: string index out of range
```

Curls to these URLs work and confirm that maintenance is just blank:

```
root@schmoe02:/var/cache/salt/minion/extmods/grains# curl http://169.254.169.254/latest/meta-data/events/
maintenance/

root@schmoe02:/var/cache/salt/minion/extmods/grains# curl http://169.254.169.254/latest/meta-data/events/maintenance/

root@schmoe02:/var/cache/salt/minion/extmods/grains# curl -Iv http://169.254.169.254/latest/meta-data/events/maintenance/
* Hostname was NOT found in DNS cache
*   Trying 169.254.169.254...
* Connected to 169.254.169.254 (169.254.169.254) port 80 (#0)
> HEAD /latest/meta-data/events/maintenance/ HTTP/1.1
> User-Agent: curl/7.35.0
> Host: 169.254.169.254
> Accept: */*
>
* HTTP 1.0, assume close after body
< HTTP/1.0 200 OK
HTTP/1.0 200 OK
< Content-Type: text/plain
Content-Type: text/plain
< Accept-Ranges: bytes
Accept-Ranges: bytes
< ETag: "3389151415"
ETag: "3389151415"
< Last-Modified: Thu, 06 Sep 2018 00:01:50 GMT
Last-Modified: Thu, 06 Sep 2018 00:01:50 GMT
< Connection: close
Connection: close
< Date: Sat, 08 Sep 2018 12:45:12 GMT
Date: Sat, 08 Sep 2018 12:45:12 GMT
< Server: EC2ws
Server: EC2ws

<
* Closing connection 0
```
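That blank body is exactly what trips the check at line 50 of ec2_info.py: splitting an empty response still yields one empty string, so `line[-1]` has nothing to index. A minimal standalone reproduction (assuming the module splits the response on newlines, which the traceback suggests):

```python
# Sketch only: reproduce the failure mode with a blank metadata listing.
body = ""                      # what /latest/meta-data/events/maintenance/ returns
for line in body.split("\n"):  # "".split("\n") -> [""], a single empty element
    if line[-1] != "/":        # IndexError: string index out of range
        print("leaf entry:", line)
```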
@kdecoteauCS
Author

I've continued trying to track down this issue with no luck. I suspect it may be a version mismatch between httplib2/urllib3 deep in Python.

Two virtually identical boxes, set up the same way in the same week on the same Salt, functioned fine for almost a year. Now we have not one but several instances that are simply "missing" the ec2 grain. These boxes all had it last week.

The only changes were some Python components being upgraded automatically. The upgrades worked fine for around a week and are now failing intermittently.

Example of a system which is failing:
```
Salt Version:
    Salt: 2018.3.2

Dependency Versions:
    cffi: 1.11.5
    cherrypy: Not Installed
    dateutil: 2.7.3
    docker-py: Not Installed
    gitdb: Not Installed
    gitpython: Not Installed
    ioflo: Not Installed
    Jinja2: 2.10
    libgit2: Not Installed
    libnacl: Not Installed
    M2Crypto: 0.30.1
    Mako: 1.0.7
    msgpack-pure: Not Installed
    msgpack-python: 0.5.6
    mysql-python: 1.2.5
    pycparser: 2.18
    pycrypto: 2.6.1
    pycryptodome: Not Installed
    pygit2: Not Installed
    Python: 2.7.12 (default, Jul 18 2016, 15:02:52)
    python-gnupg: Not Installed
    PyYAML: 3.13
    PyZMQ: 14.0.1
    RAET: Not Installed
    smmap: Not Installed
    timelib: Not Installed
    Tornado: 4.2.1
    ZMQ: 4.0.5

System Versions:
    dist: Ubuntu 14.04 trusty
    locale: UTF-8
    machine: x86_64
    release: 4.4.0-1028-aws
    system: Linux
    version: Ubuntu 14.04 trusty
```

I will attempt a clean Python install for kicks, and reinstall salt-minion afterwards ... again.
Via: `sh bootstrap-salt.sh -P -y -x python2.7 git v2016.3.3`

Of 280 salt minions, almost 40 have lost their ec2 grain in the last two weeks.

For sanity, I have checked two different versions of ec2_info.py: the current git clone and git checkout 848cdaf.

@kdecoteauCS
Author

This is solved by:

```python
if path == "events/maintenance/":
    continue
```

Or, more elegantly, in PR #195.
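For illustration only (this is neither the patch above nor necessarily what PR #195 does), a more general approach would be to skip blank lines in any listing rather than special-case one path; a minimal self-contained sketch:

```python
# Hypothetical defensive variant: ignore blank lines in a metadata listing
# instead of special-casing events/maintenance/.
def walk_listing(response):
    entries = []
    for line in response.split("\n"):
        if not line:                        # blank listing -> nothing to descend into
            continue
        if line[-1] != "/":                 # plain entry (leaf value)
            entries.append(("leaf", line))
        else:                               # trailing slash -> sub-directory
            entries.append(("dir", line[:-1]))
    return entries

print(walk_listing(""))                     # [] instead of IndexError
print(walk_listing("ami-id\nevents/"))      # [('leaf', 'ami-id'), ('dir', 'events')]
```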
