Package com.google.cloud.retail.v2
Interface BigQuerySourceOrBuilder
-
- All Superinterfaces:
com.google.protobuf.MessageLiteOrBuilder, com.google.protobuf.MessageOrBuilder
- All Known Implementing Classes:
BigQuerySource, BigQuerySource.Builder
public interface BigQuerySourceOrBuilder extends com.google.protobuf.MessageOrBuilder
Method Summary
All Methods | Instance Methods | Abstract Methods

String getDataSchema()
    The schema to use when parsing the data from the source.
com.google.protobuf.ByteString getDataSchemaBytes()
    The schema to use when parsing the data from the source.
String getDatasetId()
    Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.
com.google.protobuf.ByteString getDatasetIdBytes()
    Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.
String getGcsStagingDir()
    Intermediate Cloud Storage directory used for the import with a length limit of 2,000 characters.
com.google.protobuf.ByteString getGcsStagingDirBytes()
    Intermediate Cloud Storage directory used for the import with a length limit of 2,000 characters.
BigQuerySource.PartitionCase getPartitionCase()
com.google.type.Date getPartitionDate()
    BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format.
com.google.type.DateOrBuilder getPartitionDateOrBuilder()
    BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format.
String getProjectId()
    The project ID (can be project # or ID) that the BigQuery source is in with a length limit of 128 characters.
com.google.protobuf.ByteString getProjectIdBytes()
    The project ID (can be project # or ID) that the BigQuery source is in with a length limit of 128 characters.
String getTableId()
    Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.
com.google.protobuf.ByteString getTableIdBytes()
    Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.
boolean hasPartitionDate()
    BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format.
Methods inherited from interface com.google.protobuf.MessageOrBuilder
findInitializationErrors, getAllFields, getDefaultInstanceForType, getDescriptorForType, getField, getInitializationErrorString, getOneofFieldDescriptor, getRepeatedField, getRepeatedFieldCount, getUnknownFields, hasField, hasOneof
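The getters above follow the standard protobuf-generated pattern: `BigQuerySource.Builder` implements this interface, so they can be called on a built message or mid-build. A minimal sketch, assuming the generated `BigQuerySource` class from this package (all field values are hypothetical examples):

```java
import com.google.cloud.retail.v2.BigQuerySource;

public class BigQuerySourceSketch {
    public static void main(String[] args) {
        // dataset_id and table_id are required; data_schema defaults to "product".
        // The identifiers below are hypothetical.
        BigQuerySource source = BigQuerySource.newBuilder()
            .setDatasetId("retail_dataset")
            .setTableId("products")
            .setDataSchema("product")
            .build();

        // The OrBuilder getters read back the same values.
        System.out.println(source.getDatasetId()); // retail_dataset
        System.out.println(source.getTableId());   // products
    }
}
```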
Method Detail
-
hasPartitionDate
boolean hasPartitionDate()
BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format. Only supported in [ImportProductsRequest][google.cloud.retail.v2.ImportProductsRequest].
.google.type.Date partition_date = 6;
- Returns:
- Whether the partitionDate field is set.
-
getPartitionDate
com.google.type.Date getPartitionDate()
BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format. Only supported in [ImportProductsRequest][google.cloud.retail.v2.ImportProductsRequest].
.google.type.Date partition_date = 6;
- Returns:
- The partitionDate.
-
getPartitionDateOrBuilder
com.google.type.DateOrBuilder getPartitionDateOrBuilder()
BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format. Only supported in [ImportProductsRequest][google.cloud.retail.v2.ImportProductsRequest].
.google.type.Date partition_date = 6;
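Because `partition_date` lives inside the `partition` oneof, `hasPartitionDate()` distinguishes an unset field from a default `Date`. A sketch under the assumption of the standard generated builder API (the dataset, table, and date values are hypothetical):

```java
import com.google.cloud.retail.v2.BigQuerySource;
import com.google.type.Date;

public class PartitionDateSketch {
    public static void main(String[] args) {
        // _PARTITIONDATE is modeled as google.type.Date (year/month/day).
        Date partition = Date.newBuilder()
            .setYear(2023).setMonth(9).setDay(1)
            .build();

        BigQuerySource source = BigQuerySource.newBuilder()
            .setDatasetId("retail_dataset")   // hypothetical
            .setTableId("user_events")        // hypothetical
            .setPartitionDate(partition)
            .build();

        // hasPartitionDate() is true only because the oneof field was set.
        if (source.hasPartitionDate()) {
            Date d = source.getPartitionDate();
            System.out.printf("%04d-%02d-%02d%n",
                d.getYear(), d.getMonth(), d.getDay()); // 2023-09-01
        }
    }
}
```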
-
getProjectId
String getProjectId()
The project ID (can be project # or ID) that the BigQuery source is in with a length limit of 128 characters. If not specified, inherits the project ID from the parent request.
string project_id = 5;
- Returns:
- The projectId.
-
getProjectIdBytes
com.google.protobuf.ByteString getProjectIdBytes()
The project ID (can be project # or ID) that the BigQuery source is in with a length limit of 128 characters. If not specified, inherits the project ID from the parent request.
string project_id = 5;
- Returns:
- The bytes for projectId.
-
getDatasetId
String getDatasetId()
Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.
string dataset_id = 1 [(.google.api.field_behavior) = REQUIRED];
- Returns:
- The datasetId.
-
getDatasetIdBytes
com.google.protobuf.ByteString getDatasetIdBytes()
Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.
string dataset_id = 1 [(.google.api.field_behavior) = REQUIRED];
- Returns:
- The bytes for datasetId.
-
getTableId
String getTableId()
Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.
string table_id = 2 [(.google.api.field_behavior) = REQUIRED];
- Returns:
- The tableId.
-
getTableIdBytes
com.google.protobuf.ByteString getTableIdBytes()
Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.
string table_id = 2 [(.google.api.field_behavior) = REQUIRED];
- Returns:
- The bytes for tableId.
-
getGcsStagingDir
String getGcsStagingDir()
Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Can be specified if you want the BigQuery export written to a specific Cloud Storage directory.
string gcs_staging_dir = 3;
- Returns:
- The gcsStagingDir.
-
getGcsStagingDirBytes
com.google.protobuf.ByteString getGcsStagingDirBytes()
Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Can be specified if you want the BigQuery export written to a specific Cloud Storage directory.
string gcs_staging_dir = 3;
- Returns:
- The bytes for gcsStagingDir.
-
getDataSchema
String getDataSchema()
The schema to use when parsing the data from the source.
Supported values for product imports:
- `product` (default): One JSON [Product][google.cloud.retail.v2.Product] per line. Each product must have a valid [Product.id][google.cloud.retail.v2.Product.id].
- `product_merchant_center`: See [Importing catalog data from Merchant Center](https://cloud.google.com/retail/recommendations-ai/docs/upload-catalog#mc).
Supported values for user events imports:
- `user_event` (default): One JSON [UserEvent][google.cloud.retail.v2.UserEvent] per line.
- `user_event_ga360`: The schema is available here: https://support.google.com/analytics/answer/3437719.
- `user_event_ga4`: The schema is available here: https://support.google.com/analytics/answer/7029846.
Supported values for autocomplete imports:
- `suggestions` (default): One JSON completion suggestion per line.
- `denylist`: One JSON deny suggestion per line.
- `allowlist`: One JSON allow suggestion per line.
string data_schema = 4;
- Returns:
- The dataSchema.
-
getDataSchemaBytes
com.google.protobuf.ByteString getDataSchemaBytes()
The schema to use when parsing the data from the source.
Supported values for product imports:
- `product` (default): One JSON [Product][google.cloud.retail.v2.Product] per line. Each product must have a valid [Product.id][google.cloud.retail.v2.Product.id].
- `product_merchant_center`: See [Importing catalog data from Merchant Center](https://cloud.google.com/retail/recommendations-ai/docs/upload-catalog#mc).
Supported values for user events imports:
- `user_event` (default): One JSON [UserEvent][google.cloud.retail.v2.UserEvent] per line.
- `user_event_ga360`: The schema is available here: https://support.google.com/analytics/answer/3437719.
- `user_event_ga4`: The schema is available here: https://support.google.com/analytics/answer/7029846.
Supported values for autocomplete imports:
- `suggestions` (default): One JSON completion suggestion per line.
- `denylist`: One JSON deny suggestion per line.
- `allowlist`: One JSON allow suggestion per line.
string data_schema = 4;
- Returns:
- The bytes for dataSchema.
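As with every string field in a generated protobuf message, the `String` and `ByteString` getters are two views of the same value: `getDataSchemaBytes()` returns the UTF-8 encoding of what `getDataSchema()` returns. A sketch, assuming the standard generated accessors:

```java
import com.google.cloud.retail.v2.BigQuerySource;
import com.google.protobuf.ByteString;

public class DataSchemaSketch {
    public static void main(String[] args) {
        BigQuerySource source = BigQuerySource.newBuilder()
            .setDataSchema("user_event_ga4")
            .build();

        String schema = source.getDataSchema();          // "user_event_ga4"
        ByteString bytes = source.getDataSchemaBytes();  // UTF-8 bytes of the same value

        System.out.println(schema.equals(bytes.toStringUtf8())); // true
    }
}
```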
-
getPartitionCase
BigQuerySource.PartitionCase getPartitionCase()