Passing inline arguments to shell script being executed on HDFS
I am running a shell script stored on HDFS (so that it can be recognized by my Oozie workflow). To run this script I am using
hadoop fs -cat script.sh | exec sh
However, I need to pass inline arguments to the script. On the CLI I would simply do this with
./script.sh arg1
and then echo the variable with $1. I am trying to figure out how I would do the same with a script stored in HDFS.
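For reference, the local behavior being described is plain positional-parameter expansion; the script contents below are a made-up illustration, not the actual script.sh:

```shell
# Hypothetical script.sh, shown only to illustrate "$1":
cat > script.sh <<'EOF'
#!/bin/sh
echo "first argument: $1"
EOF
sh script.sh arg1   # prints: first argument: arg1
```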
shell-script command-line hadoop
edited Jul 6 '17 at 17:04
Romeo Ninov
asked Jul 6 '17 at 16:47
user2211504
1 Answer
You might try something like the following; it uses a separate invocation of hadoop fs -cat (in a process substitution) to retrieve each file and present it to script.sh as a file name to open for reading.
# Adjust the hdfs: URLs as necessary
hadoop fs -cat hdfs://path_to_script/script.sh | exec bash -s \
    <(hadoop fs -cat hdfs://param1) \
    <(hadoop fs -cat hdfs://param2) \
    <(hadoop fs -cat hdfs://param3) \
    <(hadoop fs -cat hdfs://param4)
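As a quick local illustration of the same mechanism (no HDFS involved; the value alpha is made up), a process substitution hands the inner script a readable file name in $1:

```shell
# Wrapped in bash -c so it also runs when pasted into a non-bash
# shell; <(echo alpha) expands to a file name such as /dev/fd/63,
# which the script received on stdin opens and reads via "$1".
bash -c 'bash -s <(echo alpha) <<"EOF"
read val < "$1"
echo "param file contained: $val"
EOF'
# prints: param file contained: alpha
```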
If script.sh already knows how to read from HDFS, then
hadoop fs -cat hdfs://path_to_script/script.sh | exec bash -s param1 param2 param3 param4
may be sufficient. The -s option tells bash to read the script from standard input, so that it doesn't mistake param1 for the name of the script to run.
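A minimal local sketch of what -s does (the script text and the hello argument are made up for illustration):

```shell
# The script arrives on stdin via the pipe; with -s, "hello"
# becomes $1 instead of being taken as a script file name.
echo 'echo "first argument: $1"' | bash -s hello
# prints: first argument: hello
```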
edited Jul 6 '17 at 17:32
answered Jul 6 '17 at 17:25
Bhavya Jain