class Google::Apis::BigqueryV2::ExternalDataConfiguration

Attributes

autodetect[RW]
Experimental

Try to detect schema and format options automatically. Any option specified explicitly will be honored.

Corresponds to the JSON property `autodetect`.
@return [Boolean]

autodetect?[RW]
Experimental

Try to detect schema and format options automatically. Any option specified explicitly will be honored.

Corresponds to the JSON property `autodetect`.
@return [Boolean]
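
As a minimal sketch of typical use (the bucket and object path below are hypothetical placeholders; autodetect?/autodetect are the attribute aliases listed above):

  # Sketch: let BigQuery infer the schema of an external CSV source.
  # The bucket and object path are hypothetical placeholders.
  config = Google::Apis::BigqueryV2::ExternalDataConfiguration.new(
    autodetect: true,
    source_format: 'CSV',
    source_uris: ['gs://my-bucket/data/*.csv']
  )
  config.autodetect? # => true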

bigtable_options[RW]
Optional

Additional options if sourceFormat is set to BIGTABLE.

Corresponds to the JSON property `bigtableOptions`.
@return [Google::Apis::BigqueryV2::BigtableOptions]

compression[RW]
Optional

The compression type of the data source. Possible values include GZIP and NONE. The default value is NONE. This setting is ignored for Google Cloud Bigtable, Google Cloud Datastore backups, and Avro formats.

Corresponds to the JSON property `compression`.
@return [String]
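
For example (a one-line sketch, continuing the config object from above; GZIP is one of the documented values):

  # Sketch: mark the external CSV objects as gzip-compressed.
  config.compression = 'GZIP'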

csv_options[RW]

Additional properties to set if sourceFormat is set to CSV.

Corresponds to the JSON property `csvOptions`.
@return [Google::Apis::BigqueryV2::CsvOptions]
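
A hedged sketch of tuning CSV parsing (the CsvOptions attribute names skip_leading_rows and field_delimiter are assumed from the corresponding skipLeadingRows and fieldDelimiter JSON properties):

  # Sketch: skip a header row and read tab-separated input.
  # Attribute names are assumed from the CsvOptions JSON properties.
  config.csv_options = Google::Apis::BigqueryV2::CsvOptions.new(
    skip_leading_rows: 1,
    field_delimiter: "\t"
  )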

ignore_unknown_values[RW]
Optional

Indicates if BigQuery should allow extra values that are not represented in the table schema. If true, the extra values are ignored. If false, records with extra columns are treated as bad records, and if there are too many bad records, an invalid error is returned in the job result. The default value is false. The sourceFormat property determines what BigQuery treats as an extra value:

CSV: Trailing columns
JSON: Named values that don't match any column names
Google Cloud Bigtable: This setting is ignored.
Google Cloud Datastore backups: This setting is ignored.
Avro: This setting is ignored.

Corresponds to the JSON property `ignoreUnknownValues`.
@return [Boolean]

ignore_unknown_values?[RW]
Optional

Indicates if BigQuery should allow extra values that are not represented in the table schema. If true, the extra values are ignored. If false, records with extra columns are treated as bad records, and if there are too many bad records, an invalid error is returned in the job result. The default value is false. The sourceFormat property determines what BigQuery treats as an extra value:

CSV: Trailing columns
JSON: Named values that don't match any column names
Google Cloud Bigtable: This setting is ignored.
Google Cloud Datastore backups: This setting is ignored.
Avro: This setting is ignored.

Corresponds to the JSON property `ignoreUnknownValues`.
@return [Boolean]

max_bad_records[RW]
Optional

The maximum number of bad records that BigQuery can ignore when reading data. If the number of bad records exceeds this value, an invalid error is returned in the job result. The default value is 0, which requires that all records are valid. This setting is ignored for Google Cloud Bigtable, Google Cloud Datastore backups, and Avro formats.

Corresponds to the JSON property `maxBadRecords`.
@return [Fixnum]
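
A short sketch of how this combines with ignore_unknown_values (the values are illustrative):

  # Sketch: tolerate up to 100 malformed records and silently drop
  # extra values that are not in the table schema.
  config.max_bad_records = 100
  config.ignore_unknown_values = true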

schema[RW]
Optional

The schema for the data. Schema is required for CSV and JSON formats. Schema is disallowed for Google Cloud Bigtable, Cloud Datastore backups, and Avro formats.

Corresponds to the JSON property `schema`.
@return [Google::Apis::BigqueryV2::TableSchema]
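
A sketch of supplying an explicit schema (the TableFieldSchema attribute names name, type, and mode are assumed from the usual generated BigqueryV2 classes):

  # Sketch: an explicit two-column schema for a CSV or JSON source.
  config.schema = Google::Apis::BigqueryV2::TableSchema.new(
    fields: [
      Google::Apis::BigqueryV2::TableFieldSchema.new(name: 'id', type: 'INTEGER', mode: 'REQUIRED'),
      Google::Apis::BigqueryV2::TableFieldSchema.new(name: 'name', type: 'STRING', mode: 'NULLABLE')
    ]
  )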

source_format[RW]
Required

The data format. For CSV files, specify "CSV". For newline-delimited JSON, specify "NEWLINE_DELIMITED_JSON". For Avro files, specify "AVRO". For Google Cloud Datastore backups, specify "DATASTORE_BACKUP". [Experimental] For Google Cloud Bigtable, specify "BIGTABLE". Please note that reading from Google Cloud Bigtable is experimental and has to be enabled for your project; contact Google Cloud Support to enable it.

Corresponds to the JSON property `sourceFormat`.
@return [String]

source_uris[RW]
Required

The fully-qualified URIs that point to your data in Google Cloud.

For Google Cloud Storage URIs: each URI can contain one '*' wildcard character, and it must come after the 'bucket' name. Size limits related to load jobs apply to external data sources, plus an additional limit of 10 GB maximum size across all URIs.

For Google Cloud Bigtable URIs: exactly one URI can be specified, and it has to be a fully specified and valid HTTPS URL for a Google Cloud Bigtable table.

For Google Cloud Datastore backups: exactly one URI can be specified, and it must end with '.backup_info'. Also, the '*' wildcard character is not allowed.

Corresponds to the JSON property `sourceUris`.
@return [Array<String>]
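
To make the URI rules concrete (all bucket, project, instance, and table names below are hypothetical placeholders, and the Bigtable URL shape is an assumption):

  # Sketch: valid source_uris shapes per source format; names are placeholders.
  config.source_uris = ['gs://my-bucket/logs/*.csv']   # Cloud Storage: one '*' after the bucket name
  # ['gs://my-bucket/snapshot.backup_info']            # Datastore backup: exactly one URI, no '*'
  # ['https://googleapis.com/bigtable/projects/p/instances/i/tables/t']  # Bigtable: one HTTPS table URL (assumed shape)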

Public Class Methods

new(**args)
# File generated/google/apis/bigquery_v2/classes.rb, line 771
def initialize(**args)
  update!(**args)
end

Public Instance Methods

update!(**args)

Update properties of this object

# File generated/google/apis/bigquery_v2/classes.rb, line 776
def update!(**args)
  @autodetect = args[:autodetect] if args.key?(:autodetect)
  @bigtable_options = args[:bigtable_options] if args.key?(:bigtable_options)
  @compression = args[:compression] if args.key?(:compression)
  @csv_options = args[:csv_options] if args.key?(:csv_options)
  @ignore_unknown_values = args[:ignore_unknown_values] if args.key?(:ignore_unknown_values)
  @max_bad_records = args[:max_bad_records] if args.key?(:max_bad_records)
  @schema = args[:schema] if args.key?(:schema)
  @source_format = args[:source_format] if args.key?(:source_format)
  @source_uris = args[:source_uris] if args.key?(:source_uris)
end
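
Putting it together, a hedged end-to-end sketch (project, dataset, and bucket names are placeholders; BigqueryService#insert_table and Table#external_data_configuration are assumed from the generated BigqueryV2 client):

  require 'google/apis/bigquery_v2'

  # Sketch: define an external CSV table backed by Cloud Storage and
  # create it through the generated service. All names are placeholders,
  # and authentication setup is omitted.
  bigquery = Google::Apis::BigqueryV2::BigqueryService.new

  config = Google::Apis::BigqueryV2::ExternalDataConfiguration.new(
    source_format: 'CSV',
    source_uris: ['gs://my-bucket/data/*.csv'],
    autodetect: true,
    max_bad_records: 10
  )

  table = Google::Apis::BigqueryV2::Table.new(
    table_reference: Google::Apis::BigqueryV2::TableReference.new(
      project_id: 'my-project', dataset_id: 'my_dataset', table_id: 'external_csv'
    ),
    external_data_configuration: config
  )

  bigquery.insert_table('my-project', 'my_dataset', table)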