curl page source from text file containing URLs
I have a text file containing up to 100 URLs. I am able to curl the page source from them using:

cat /path/to/url.txt | xargs curl -o /path/to/output.txt

This downloads the page source for all of the URLs (I can see this happening on the command line), but it only saves, in output.txt, the page source for the URL at the top of the list.

How would I go about saving the page source for each URL, whether in the same text file or, if necessary, in individual text files?

curl cat xargs url
asked Nov 6 '15 at 16:49 by Lewandajo; edited Nov 6 '15 at 20:35 by lese
4 Answers
With GNU Parallel you can fetch multiple URLs in parallel without worrying about the outputs getting mixed:

cat /path/to/url.txt | parallel curl > /path/to/output.txt

answered Nov 12 '15 at 12:14 by Ole Tange
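If you would rather keep each page in its own file, a minimal sketch using GNU Parallel's {} input placeholder and {#} job-number placeholder (the output-N.txt naming and /path/to/ directory are just illustrations):

# save each URL's page source to its own numbered file
parallel 'curl -s {} > /path/to/output-{#}.txt' < /path/to/url.txt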
for i in $(cat urls.txt); do curl "$i" >> output.txt; done

answered Nov 12 '15 at 12:52 by blissini
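Note that $(cat urls.txt) relies on word splitting, which is fine as long as each line holds a single whitespace-free URL. If everything goes into one file, a separator between pages keeps the combined output readable; a sketch, with the ===== header format being just an illustration:

# append each page with a header line naming its URL
for i in $(cat urls.txt); do
  printf '\n===== %s =====\n' "$i" >> output.txt
  curl -s "$i" >> output.txt
done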
With a simple list of URLs, one per line, this should do the job and output everything to a single file:

while read -r in; do curl "$in" >> /path/to/output.txt; done < /path/to/url.txt

edited Nov 16 '15 at 15:43; answered Nov 6 '15 at 20:26 by lese
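The same loop can also write one file per URL; a sketch using a counter (the output-N.txt naming is an assumption):

# fetch each URL into its own numbered file
n=0
while read -r in; do
  n=$((n + 1))
  curl -s "$in" > "/path/to/output-$n.txt"
done < /path/to/url.txt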
With a simple list of URLs on each line this should do the job

answered 1 hour ago by ramzan siddiqui (new contributor)

It appears that you have written the preface to an answer, but forgotten to post the answer itself. – G-Man, 1 hour ago