
I'm using curl to upload large files (from 5 to 20 GB) to HOOP, which is backed by HDFS (a Hadoop cluster), as follows:

curl -f --data-binary "@$file" "$HOOP_HOST$UPLOAD_PATH?user.name=$HOOP_USER&op=create"

But when curl uploads a large file, it tries to cache the whole thing in RAM, which produces high memory load. I tried the -N flag from man curl, which is supposed to disable buffering, but nothing changed. So my question is: is there any way to force curl to write directly to the socket, or can you advise me of another utility that covers my need for simple HTTP POST/PUT requests?

I don't want to write one myself, since that feels like reinventing the wheel. Thank you.

  • I don't have 50 reputation to comment, but at least -T works for me. Commented Jun 7, 2016 at 5:23

2 Answers


Sadly, according to Daniel Stenberg, it's not currently possible.

See https://github.com/bagder/curl/issues/290

cat myfile.csv | curl -X PUT -T - -H 'content-type:text/csv' http://localhost 

actually works.

  • Indeed, though the indirection via stdin is unnecessary: as long as the URL does not end in / (read the manpage, it's weird), curl -X PUT -T myfile.csv -H ... http://localhost works fine. Commented Oct 1, 2022 at 9:08
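The reason -T helps is that --data-binary builds the entire request body in memory before sending, while -T (--upload-file) streams the file from disk as it uploads. A minimal local sketch of the -T mechanics, using curl's file:// protocol so it runs without an HTTP server (the /tmp paths are illustrative, not from the question):

```shell
# Create a sample file to stand in for the large upload (path is illustrative)
printf 'col1,col2\n1,2\n' > /tmp/myfile.csv

# -T streams the source file instead of buffering it in RAM.
# Against an http:// URL this becomes a PUT; here we target a
# file:// URL so the example is self-contained.
curl -sS -T /tmp/myfile.csv file:///tmp/uploaded.csv

# Verify the "uploaded" copy matches the source
diff -q /tmp/myfile.csv /tmp/uploaded.csv && echo "upload ok"
```

Against the HOOP endpoint from the question, the same idea would look like `curl -T "$file" "$HOOP_HOST$UPLOAD_PATH?user.name=$HOOP_USER&op=create"` — note that unlike --data-binary, -T takes the filename directly rather than an @-prefixed argument.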
