"create"
********

* Description

* Usage

* Required Parameters

* Optional Parameters

* Global Parameters

* Example using required parameter


Description
===========

Creates an application.


Usage
=====

   oci data-flow application create [OPTIONS]


Required Parameters
===================

--compartment-id, -c [text]

The OCID of a compartment.

--display-name [text]

A user-friendly name. It does not have to be unique. Avoid entering
confidential information.

--driver-shape [text]

The VM shape for the driver. Sets the driver cores and memory.

--executor-shape [text]

The VM shape for the executors. Sets the executor cores and memory.

--language [text]

The Spark language.

Accepted values are:

   JAVA, PYTHON, SCALA, SQL

--num-executors [integer]

The number of executor VMs requested.

--spark-version [text]

The Spark version utilized to run the application.


Optional Parameters
===================

--application-log-config [complex type]

This is a complex type whose value must be valid JSON. The value can
be provided as a string on the command line or passed in as a file
using the file://path/to/file syntax.

The "--generate-param-json-input" option can be used to generate an
example of the JSON which must be provided. We recommend storing this
example in a file, modifying it as needed and then passing it back in
via the file:// syntax.

--archive-uri [text]

A comma-separated list of one or more archive files as Oracle Cloud
Infrastructure URIs. For example,
"oci://path/to/a.zip,oci://path/to/b.zip". Each archive may contain
custom dependencies used to support the execution of a Python, Java,
or Scala application. See
https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

--arguments [text]

The arguments passed to the running application as command line
arguments. Arguments may contain zero or more placeholders that are
replaced using values from the parameters map. Each placeholder
specified must be represented in the parameters map, or the request
will fail with an HTTP 400 status code. Placeholders are specified as
*${name}*, where *name* is the name of the parameter. Example:
'--input ${input_file} --name "John Doe"'. Alternatively, the
arguments can be specified as a JSON array of strings, where each
string represents an argument. Example: [ "--input", "${input_file}",
"--name", "John Doe" ]. If "input_file" has a value of "mydata.xml",
then either form above is translated to *--input mydata.xml --name
"John Doe"*.
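
The substitution described above is performed server-side by the Data
Flow service when the Run executes; the following is only a local
simulation of its effect, using the hypothetical parameter
input_file=mydata.xml from the example above:

```shell
# Local simulation of the ${name} placeholder substitution (illustration only;
# the real substitution is done by the service using the parameters map).
input_file="mydata.xml"
template='--input ${input_file} --name "John Doe"'
expanded=$(printf '%s' "$template" | sed "s/\${input_file}/$input_file/")
echo "$expanded"   # --input mydata.xml --name "John Doe"
```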

--class-name [text]

The class for the application.

--configuration [text]

The Spark configuration passed to the running process. See
https://spark.apache.org/docs/latest/configuration.html#available-properties
Example: 'spark.app.name="My App Name"
spark.shuffle.io.maxRetries=4'. Alternatively, the configuration can
be specified as a JSON object. Example: { "spark.app.name" : "My App
Name", "spark.shuffle.io.maxRetries" : "4" }  Note: Not all Spark
properties are permitted to be set. Attempting to set a property that
cannot be overwritten causes a 400 status to be returned.
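
Since a malformed JSON value is rejected, it can help to validate the
JSON form locally before submitting. A minimal pre-flight sketch,
assuming python3 is available on the PATH (any JSON validator works):

```shell
# Hypothetical pre-flight check: confirm the JSON form of --configuration
# parses before passing it to the CLI (assumes python3 is installed).
configuration='{ "spark.app.name" : "My App Name", "spark.shuffle.io.maxRetries" : "4" }'
printf '%s' "$configuration" | python3 -m json.tool > /dev/null && echo "configuration is valid JSON"
```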

--defined-tags [complex type]

Defined tags for this resource. Each key is predefined and scoped to a
namespace. For more information, see Resource Tags. Example:
*{"Operations": {"CostCenter": "42"}}* This is a complex type whose
value must be valid JSON. The value can be provided as a string on the
command line or passed in as a file using the file://path/to/file
syntax.

The "--generate-param-json-input" option can be used to generate an
example of the JSON which must be provided. We recommend storing this
example in a file, modifying it as needed and then passing it back in
via the file:// syntax.

--description [text]

A user-friendly description. Avoid entering confidential information.

--driver-shape-config [complex type]

This is a complex type whose value must be valid JSON. The value can
be provided as a string on the command line or passed in as a file
using the file://path/to/file syntax.

The "--generate-param-json-input" option can be used to generate an
example of the JSON which must be provided. We recommend storing this
example in a file, modifying it as needed and then passing it back in
via the file:// syntax.

--executor-shape-config [complex type]

This is a complex type whose value must be valid JSON. The value can
be provided as a string on the command line or passed in as a file
using the file://path/to/file syntax.

The "--generate-param-json-input" option can be used to generate an
example of the JSON which must be provided. We recommend storing this
example in a file, modifying it as needed and then passing it back in
via the file:// syntax.

--file-uri [text]

An Oracle Cloud Infrastructure URI of the file containing the
application to execute. See
https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

--freeform-tags [complex type]

Free-form tags for this resource. Each tag is a simple key-value pair
with no predefined name, type, or namespace. For more information, see
Resource Tags. Example: *{"Department": "Finance"}* This is a complex
type whose value must be valid JSON. The value can be provided as a
string on the command line or passed in as a file using the
file://path/to/file syntax.

The "--generate-param-json-input" option can be used to generate an
example of the JSON which must be provided. We recommend storing this
example in a file, modifying it as needed and then passing it back in
via the file:// syntax.

--from-json [text]

Provide input to this command as a JSON document from a file using the
file://path-to/file syntax.

The "--generate-full-command-json-input" option can be used to
generate a sample JSON file to be used with this command option. The
key names are pre-populated and match the command option names
(converted to camelCase format, e.g. compartment-id --> compartmentId),
while the values of the keys need to be populated by the user before
using the sample file as an input to this command. For any command
option that accepts multiple values, the value of the key can be a
JSON array.

Options can still be provided on the command line. If an option exists
in both the JSON document and the command line then the command line
specified value will be used.

For examples on usage of this option, please see our "using CLI with
advanced JSON options" link:
https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/cliusing.htm#AdvancedJSONOptions
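
As a hypothetical illustration of the key-name convention (the actual
mapping is produced for you by "--generate-full-command-json-input"),
the hyphenated option names convert to camelCase keys like this.
Assumption: GNU sed, for the \U uppercase escape in the replacement:

```shell
# Illustration of the option-name -> JSON-key mapping, e.g.
# --compartment-id -> compartmentId (assumes GNU sed for \U).
to_camel() { printf '%s' "$1" | sed -E 's/^--//; s/-(.)/\U\1/g'; }
to_camel "--compartment-id"; echo   # compartmentId
to_camel "--num-executors"; echo    # numExecutors
```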

--idle-timeout-in-minutes [integer]

The timeout value in minutes used to manage Runs. A Run is stopped
after it has been inactive for this period. Note: This parameter is
currently only applicable to Runs of type *SESSION*. The default value
is 2880 minutes (2 days).

--logs-bucket-uri [text]

An Oracle Cloud Infrastructure URI of the bucket where the Spark job
logs are to be uploaded. See
https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

--max-duration-in-minutes [integer]

The maximum duration in minutes for which an Application should run.
A Data Flow Run is terminated once this duration has elapsed from the
time the Run transitions to the *IN_PROGRESS* state.

--max-wait-seconds [integer]

The maximum time to wait for the resource to reach the lifecycle state
defined by "--wait-for-state". Defaults to 1200 seconds.

--metastore-id [text]

The OCID of the OCI Hive Metastore.

--parameters [text]

A string of name=value pairs used to supply SQL parameters or fill
placeholders found in the arguments parameter. The name must be a
string of one or more word characters (a-z, A-Z, 0-9, _). The value
can be a string of zero or more characters of any kind. Example:
'iterations=10 input_file=mydata.xml variable_x=${x}'. Alternatively,
the parameters can be specified as a JSON array of objects. Example:
[ { "name" : "iterations", "value" : "10" }, { "name" : "input_file",
"value" : "mydata.xml" }, { "name" : "variable_x", "value" : "${x}" } ]
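
Because a parameter name must consist of word characters only, a quick
local check can catch a bad name before the request fails server-side.
A minimal POSIX-shell sketch (the names shown are illustrative):

```shell
# Hypothetical pre-flight check: a parameter name may contain only word
# characters (a-z, A-Z, 0-9, _), per the constraint described above.
is_valid_name() { case $1 in *[!A-Za-z0-9_]*|"") return 1 ;; *) return 0 ;; esac; }
is_valid_name "input_file" && echo "input_file is a valid parameter name"
is_valid_name "bad-name"   || echo "bad-name is not a valid parameter name"
```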

--pool-id [text]

The OCID of a pool. A unique identifier for a Data Flow pool resource.

--private-endpoint-id [text]

The OCID of a private endpoint.

--type [text]

The Spark application processing type.

Accepted values are:

   BATCH, SESSION, STREAMING

--wait-for-state [text]

This operation creates, modifies or deletes a resource that has a
defined lifecycle state. Specify this option to perform the action and
then wait until the resource reaches a given lifecycle state. Multiple
states can be specified, returning on the first state. For example,
"--wait-for-state SUCCEEDED --wait-for-state FAILED" would return on
whichever lifecycle state is reached first. If the timeout is reached,
a return code of 2 is returned. For any other error, a return code of
1 is returned.

Accepted values are:

   ACTIVE, DELETED, INACTIVE

--wait-interval-seconds [integer]

Check every "--wait-interval-seconds" to see whether the resource has
reached the lifecycle state defined by "--wait-for-state". Defaults to
30 seconds.

--warehouse-bucket-uri [text]

An Oracle Cloud Infrastructure URI of the bucket to be used as the
default warehouse directory for BATCH SQL runs. See
https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.


Global Parameters
=================

Use "oci --help" for help on global parameters.

"--auth-purpose", "--auth", "--cert-bundle", "--cli-auto-prompt",
"--cli-rc-file", "--config-file", "--connection-timeout", "--debug",
"--defaults-file", "--endpoint", "--generate-full-command-json-input",
"--generate-param-json-input", "--help", "--latest-version",
"--max-retries", "--no-retry", "--opc-client-request-id",
"--opc-request-id", "--output", "--profile", "--proxy", "--query",
"--raw-output", "--read-timeout", "--realm-specific-endpoint",
"--region", "--release-info", "--request-id", "--version", "-?",
"-d", "-h", "-i", "-v"


Example using required parameter
================================

Copy the following CLI commands into a file named example.sh. Run the
command by typing "bash example.sh" and replacing the example
parameters with your own.

Please note this sample will only work in a POSIX-compliant, bash-like
shell. You need to set up the OCI configuration and appropriate
security policies before trying the examples.

       export compartment_id=<substitute-value-of-compartment_id> # https://docs.cloud.oracle.com/en-us/iaas/tools/oci-cli/latest/oci_cli_docs/cmdref/data-flow/application/create.html#cmdoption-compartment-id
       export display_name=<substitute-value-of-display_name> # https://docs.cloud.oracle.com/en-us/iaas/tools/oci-cli/latest/oci_cli_docs/cmdref/data-flow/application/create.html#cmdoption-display-name
       export driver_shape=<substitute-value-of-driver_shape> # https://docs.cloud.oracle.com/en-us/iaas/tools/oci-cli/latest/oci_cli_docs/cmdref/data-flow/application/create.html#cmdoption-driver-shape
       export executor_shape=<substitute-value-of-executor_shape> # https://docs.cloud.oracle.com/en-us/iaas/tools/oci-cli/latest/oci_cli_docs/cmdref/data-flow/application/create.html#cmdoption-executor-shape
       export language=<substitute-value-of-language> # https://docs.cloud.oracle.com/en-us/iaas/tools/oci-cli/latest/oci_cli_docs/cmdref/data-flow/application/create.html#cmdoption-language
       export num_executors=<substitute-value-of-num_executors> # https://docs.cloud.oracle.com/en-us/iaas/tools/oci-cli/latest/oci_cli_docs/cmdref/data-flow/application/create.html#cmdoption-num-executors
       export spark_version=<substitute-value-of-spark_version> # https://docs.cloud.oracle.com/en-us/iaas/tools/oci-cli/latest/oci_cli_docs/cmdref/data-flow/application/create.html#cmdoption-spark-version

       oci data-flow application create --compartment-id $compartment_id --display-name $display_name --driver-shape $driver_shape --executor-shape $executor_shape --language $language --num-executors $num_executors --spark-version $spark_version
