
request.get_data() is always empty because of flask/werkzeug bug with chunked transfer encoding #5

Open
telackey opened this issue Jul 25, 2018 · 10 comments

Comments

@telackey
Contributor

The python3-flask template (and likely the python2 one) is affected by this Flask issue: pallets/flask#2229

Flask and Werkzeug have a bug handling chunked transfer encoding (and perhaps any request where Content-Length is not set) when combined with a WSGI server.

When the request is forwarded through the of-watchdog process, http.NewRequest is given a reference to the body io.ReadCloser. Since that is a stream, the outbound request will always use chunked transfer encoding and no Content-Length.

This could be avoided by changing of-watchdog to read the whole body into a buffer and then send it unchunked. It would make more sense, I think, either to upgrade to a fixed version of Flask/Werkzeug (assuming one exists) or to revert to Flask's development server (app.run(...)) rather than gevent's WSGIServer, since the built-in server does not seem to be affected.
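To illustrate the buffering work-around described above, here is a minimal Python sketch (the helper name is illustrative only; of-watchdog itself is written in Go and would do the equivalent before calling http.NewRequest):

```python
import io


def buffer_body(stream):
    """Drain the request body stream into memory so the outbound
    request can carry an explicit Content-Length header instead of
    being sent with chunked transfer encoding."""
    body = stream.read()
    headers = {"Content-Length": str(len(body))}
    return body, headers


# A buffered body always gets an explicit Content-Length,
# matching the 28-byte curl example below:
body, headers = buffer_body(io.BytesIO(b"rawdsddddddddddddddddddddddd"))
print(headers["Content-Length"])  # 28
```

The trade-off is that the whole body must fit in memory, which is why streaming the io.ReadCloser through is otherwise attractive.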

The following examples are using the same of-watchdog and function code. The only difference is that the first one is using Flask with WSGIServer and the second is using the Flask built-in server.

WSGI server

$ curl -i -XPOST -d "rawdsddddddddddddddddddddddd" 'http://127.0.0.1:9999/'
HTTP/1.1 200 OK
Content-Length: 0
Content-Type: text/html; charset=utf-8
Date: Wed, 25 Jul 2018 23:33:36 GMT

Flask's server

$ curl -i -XPOST -d "rawdsddddddddddddddddddddddd" 'http://127.0.0.1:9999/'
HTTP/1.1 200 OK
Content-Length: 28
Content-Type: text/html; charset=utf-8
Date: Wed, 25 Jul 2018 23:31:29 GMT
Server: Werkzeug/0.14.1 Python/3.6.5

rawdsddddddddddddddddddddddd
@ssullivan
Contributor

I recently started building some new functions with this template, and I've run into the same problem.

@ssullivan
Contributor

ssullivan commented Oct 18, 2018

I was able to get this working by applying a suggestion from benoitc/gunicorn#1733 (comment)

from flask import request  # needed by the hook; "app" is the Flask app defined in the handler

@app.before_request
def fix_transfer_encoding():
    """
    Sets the "wsgi.input_terminated" environment flag, thus enabling
    Werkzeug to pass chunked requests as streams.  The gunicorn server
    should set this, but it's not yet been implemented.
    """
    transfer_encoding = request.headers.get("Transfer-Encoding", None)
    if transfer_encoding == u"chunked":
        request.environ["wsgi.input_terminated"] = True
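The flag-setting logic can be exercised without a running Flask app. Here is a standalone sketch of the same idea, using plain dicts in place of request.headers and request.environ (the function name is illustrative):

```python
def mark_input_terminated(headers, environ):
    """Standalone version of the before_request hook: when the inbound
    request used chunked transfer encoding, set "wsgi.input_terminated"
    so Werkzeug reads the stream to its end instead of returning an
    empty body."""
    if headers.get("Transfer-Encoding") == "chunked":
        environ["wsgi.input_terminated"] = True
    return environ


# Chunked requests get the flag; others are left untouched:
env = mark_input_terminated({"Transfer-Encoding": "chunked"}, {})
print(env)  # {'wsgi.input_terminated': True}
```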

ssullivan added a commit to ssullivan/python-flask-template that referenced this issue Oct 18, 2018
Added new method to index.py to fix an issue with chunked
encoding and missing request bodies

Signed-off-by: Stephen Sullivan <[email protected]>
@alexellis
Member

Plain Flask was both simpler to use and quicker to build, but its built-in server is not considered production-quality.

I'll take a look at this proposal.

@alexellis
Member

@ssullivan I'm talking with @LucasRoesler about this. Do we want to revert to "plain Flask", which is much faster at build time and doesn't need the workaround?

alexellis pushed a commit that referenced this issue Nov 3, 2018
Added new method to index.py to fix an issue with chunked
encoding and missing request bodies

Signed-off-by: Stephen Sullivan <[email protected]>
@ssullivan
Contributor

At the moment I’m not familiar with the performance differences between plain flask and the current implementation. If it is more performant and doesn’t have the problem with chunked encoding it seems to make sense to switch to that.

@LucasRoesler
Member

I think for the short term, a working template is pretty important, but in the long term it would be good to use WSGI.

Have we considered using conda instead of pip? Using conda and the conda-forge channel could greatly speed up installing packages like wsgi.

@daMichaelB

I was able to get this working by applying a suggestion from benoitc/gunicorn#1733 (comment)

You saved me hours! Thank you 👍

@alexellis
Member

Michael, is that not in the OpenFaaS templates yet, or do you want the workaround for something else?

@alexellis
Member

We also have RAW_BODY available now as an environment variable, which may be useful here.
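For anyone wanting to try that route, a stack.yml entry along these lines should work (the function name and handler path are placeholders; check the of-watchdog documentation for the exact semantics of RAW_BODY):

```yaml
functions:
  my-flask-fn:              # placeholder function name
    lang: python3-flask
    handler: ./my-flask-fn  # placeholder handler path
    environment:
      RAW_BODY: "true"      # expose the raw request body to the handler
```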

@daMichaelB

Hey @alexellis, thanks for reaching out! I came here from using the Docker Hub image tiangolo/uwsgi-nginx-flask. I had the exact same problem there and struggled a lot to find a solution. There may be a better way to solve this issue, but this one works perfectly.
