[Bug 1412545] Re: proxy isn't used after a dropped connection
Stefano Rivera
1412545 at bugs.launchpad.net
Thu Jan 21 00:20:22 UTC 2021
I assume this is all done. Closing the bug.
** Changed in: python-virtualenv (Ubuntu)
Status: New => Invalid
** Changed in: python-virtualenv (Ubuntu Trusty)
Status: Triaged => Invalid
--
You received this bug notification because you are a member of Ubuntu
OpenStack, which is subscribed to python-urllib3 in Ubuntu.
https://bugs.launchpad.net/bugs/1412545
Title:
proxy isn't used after a dropped connection
Status in python-urllib3 package in Ubuntu:
Fix Released
Status in python-virtualenv package in Ubuntu:
Invalid
Status in python-urllib3 source package in Trusty:
Fix Released
Status in python-virtualenv source package in Trusty:
Invalid
Bug description:
Ubuntu version: Ubuntu 14.04.1 LTS
python-urllib3 version: 1.7.1-1build1
Steps to reproduce:
1. set up an http proxy
2. configure a ProxyManager to use said http proxy
3. make successive GET requests to https://pypi.python.org/
example script:
https://gist.github.com/stratoukos/7545c5c909fa9b5d1cfb
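In case the gist link rots, a minimal reproduction along the same lines might look like the sketch below (the proxy address is a placeholder; the linked gist is the authoritative script):

```python
import urllib3

# Placeholder proxy address; point this at your own HTTP proxy
# (the authoritative reproduction script is the gist linked above).
PROXY_URL = "http://127.0.0.1:3128"

pm = urllib3.ProxyManager(PROXY_URL)

def hammer(url="https://pypi.python.org/", n=50):
    # Issue successive GETs; with urllib3 1.7.1, once the remote end
    # drops the idle connection, subsequent requests stop going
    # through the proxy and time out behind a firewall.
    for i in range(n):
        r = pm.request("GET", url)
        print(i, r.status)

# hammer()  # uncomment to run against a live proxy
```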
What happens:
urllib3 stops using the http proxy after the connection is dropped
(after 24 requests in my testing with PyPI; other people have seen
different numbers).
What I expected to happen:
urllib3 should always use the proxy
Other Info:
This has been fixed in commit 1c30a1f3 of urllib3 and included in its
1.8.3 release. This bug also affects pip and requests as reported
here: https://github.com/pypa/pip/issues/1805
I really hope the bugfix will be backported, since pip is currently
unusable behind an outgoing firewall on 14.04.
[Impact]
urllib3 stops using the proxy after a connection is dropped, leaving users of
python-urllib3 (such as pip) that are behind a firewall unable to connect to external sites.
[Test Case]
1. Start a trusty VM
2. Get the test script ($ wget https://gist.githubusercontent.com/stratoukos/7545c5c909fa9b5d1cfb/raw/456381dff95d503818d35c393e71ec0272ab08d3/gistfile1.py -O test.py)
3. Install Squid ($ apt-get install squid)
4. Install python-urllib3 if it's not installed ($ apt-get install python-urllib3)
5. Block outgoing connections (
$ sudo iptables -A OUTPUT -m owner --uid-owner root -j ACCEPT
$ sudo iptables -A OUTPUT -m owner --uid-owner proxy -j ACCEPT
$ sudo iptables -A OUTPUT -p tcp --dport 80 -j DROP
$ sudo iptables -A OUTPUT -p tcp --dport 443 -j DROP
)
6. Run the test script ($ python test.py)
7. In another terminal, tail the squid log ($ sudo tailf /var/log/squid3/access.log)
With python-urllib3 1.7.1-1build1, one sees a connection timeout after 24
or so requests, while with the backported package one sees hits in the proxy
log, meaning requests keep going through the proxy after a reconnect.
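For a quick local check without a VM, Squid, or iptables, the reconnect behavior can also be exercised with a tiny in-process fake proxy. This is a sketch, not part of the original test case: the fake proxy below answers every request with HTTP/1.0, so the connection is closed after each response, forcing urllib3 to reconnect before the next request.

```python
import threading
import urllib3
from http.server import BaseHTTPRequestHandler, HTTPServer

hits = []

class FakeProxy(BaseHTTPRequestHandler):
    # Record every request target the "proxy" receives. Proxied plain-HTTP
    # requests carry the absolute URI in the request line.
    def do_GET(self):
        hits.append(self.path)
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
        # protocol_version defaults to HTTP/1.0, so the server closes
        # the socket after each response, simulating a dropped connection.

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), FakeProxy)
threading.Thread(target=server.serve_forever, daemon=True).start()

pm = urllib3.ProxyManager("http://127.0.0.1:%d" % server.server_port,
                          retries=False)

# Two requests; the second runs on a fresh connection because the fake
# proxy closed the first. A fixed urllib3 still routes it via the proxy.
for _ in range(2):
    r = pm.request("GET", "http://example.invalid/")
    assert r.status == 200

server.shutdown()
print(len(hits), hits[0])
```

With the fix in place, both requests show up at the proxy; with the buggy 1.7.1 behavior, the second request would bypass it.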
[Regression Potential]
The fix was released in 1.8.3 and the current upstream code is at 1.9.1, so the
potential for the upstream fix itself to be wrong is minimal. That said, the
current upstream code moved some modules around, so the backport is an attempt
to recreate the intent of the patch in the old codebase available on Trusty,
making the minimal amount of changes necessary to get things working.
To manage notifications about this bug go to:
https://bugs.launchpad.net/ubuntu/+source/python-urllib3/+bug/1412545/+subscriptions
More information about the Ubuntu-openstack-bugs
mailing list