| <html><body> |
| <style> |
| |
| body, h1, h2, h3, div, span, p, pre, a { |
| margin: 0; |
| padding: 0; |
| border: 0; |
| font-weight: inherit; |
| font-style: inherit; |
| font-size: 100%; |
| font-family: inherit; |
| vertical-align: baseline; |
| } |
| |
| body { |
| font-size: 13px; |
| padding: 1em; |
| } |
| |
| h1 { |
| font-size: 26px; |
| margin-bottom: 1em; |
| } |
| |
| h2 { |
| font-size: 24px; |
| margin-bottom: 1em; |
| } |
| |
| h3 { |
| font-size: 20px; |
| margin-bottom: 1em; |
| margin-top: 1em; |
| } |
| |
| pre, code { |
| line-height: 1.5; |
| font-family: Monaco, 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', 'Lucida Console', monospace; |
| } |
| |
| pre { |
| margin-top: 0.5em; |
| } |
| |
| h1, h2, h3, p { |
font-family: Arial, sans-serif;
| } |
| |
| h1, h2, h3 { |
| border-bottom: solid #CCC 1px; |
| } |
| |
| .toc_element { |
| margin-top: 0.5em; |
| } |
| |
| .firstline { |
margin-left: 2em;
| } |
| |
| .method { |
| margin-top: 1em; |
| border: solid 1px #CCC; |
| padding: 1em; |
| background: #EEE; |
| } |
| |
| .details { |
| font-weight: bold; |
| font-size: 14px; |
| } |
| |
| </style> |
| |
| <h1><a href="bigquerydatatransfer_v1.html">BigQuery Data Transfer API</a> . <a href="bigquerydatatransfer_v1.projects.html">projects</a> . <a href="bigquerydatatransfer_v1.projects.locations.html">locations</a> . <a href="bigquerydatatransfer_v1.projects.locations.transferConfigs.html">transferConfigs</a></h1> |
| <h2>Instance Methods</h2> |
| <p class="toc_element"> |
| <code><a href="bigquerydatatransfer_v1.projects.locations.transferConfigs.runs.html">runs()</a></code> |
| </p> |
| <p class="firstline">Returns the runs Resource.</p> |
| |
| <p class="toc_element"> |
| <code><a href="#create">create(parent, body, authorizationCode=None, versionInfo=None, x__xgafv=None)</a></code></p> |
| <p class="firstline">Creates a new data transfer configuration.</p> |
| <p class="toc_element"> |
| <code><a href="#delete">delete(name, x__xgafv=None)</a></code></p> |
<p class="firstline">Deletes a data transfer configuration, including any associated transfer runs and logs.</p>
| <p class="toc_element"> |
| <code><a href="#get">get(name, x__xgafv=None)</a></code></p> |
| <p class="firstline">Returns information about a data transfer config.</p> |
| <p class="toc_element"> |
| <code><a href="#list">list(parent, pageSize=None, dataSourceIds=None, pageToken=None, x__xgafv=None)</a></code></p> |
| <p class="firstline">Returns information about all data transfers in the project.</p> |
| <p class="toc_element"> |
| <code><a href="#list_next">list_next(previous_request, previous_response)</a></code></p> |
| <p class="firstline">Retrieves the next page of results.</p> |
| <p class="toc_element"> |
| <code><a href="#patch">patch(name, body, authorizationCode=None, updateMask=None, versionInfo=None, x__xgafv=None)</a></code></p> |
| <p class="firstline">Updates a data transfer configuration.</p> |
| <p class="toc_element"> |
| <code><a href="#scheduleRuns">scheduleRuns(parent, body, x__xgafv=None)</a></code></p> |
| <p class="firstline">Creates transfer runs for a time range [start_time, end_time].</p> |
| <p class="toc_element"> |
| <code><a href="#startManualRuns">startManualRuns(parent, body, x__xgafv=None)</a></code></p> |
<p class="firstline">Start manual transfer runs to be executed now with schedule_time equal to current time.</p>
| <h3>Method Details</h3> |
| <div class="method"> |
| <code class="details" id="create">create(parent, body, authorizationCode=None, versionInfo=None, x__xgafv=None)</code> |
| <pre>Creates a new data transfer configuration. |
| |
| Args: |
| parent: string, The BigQuery project id where the transfer configuration should be created. |
| Must be in the format projects/{project_id}/locations/{location_id} |
If the specified location and the location of the destination BigQuery
dataset do not match, the request will fail. (required)
| body: object, The request body. (required) |
| The object takes the form of: |
| |
| { # Represents a data transfer configuration. A transfer configuration |
| # contains all metadata needed to perform a data transfer. For example, |
| # `destination_dataset_id` specifies where data should be stored. |
| # When a new transfer configuration is created, the specified |
| # `destination_dataset_id` is created when needed and shared with the |
| # appropriate data source service account. |
| "dataRefreshWindowDays": 42, # The number of days to look back to automatically refresh the data. |
| # For example, if `data_refresh_window_days = 10`, then every day |
| # BigQuery reingests data for [today-10, today-1], rather than ingesting data |
| # for just [today-1]. |
| # Only valid if the data source supports the feature. Set the value to 0 |
| # to use the default value. |
| "updateTime": "A String", # Output only. Data transfer modification time. Ignored by server on input. |
| "destinationDatasetId": "A String", # The BigQuery target dataset id. |
| "displayName": "A String", # User specified display name for the data transfer. |
| "name": "A String", # The resource name of the transfer config. |
| # Transfer config names have the form of |
| # `projects/{project_id}/locations/{region}/transferConfigs/{config_id}`. |
| # The name is automatically generated based on the config_id specified in |
| # CreateTransferConfigRequest along with project_id and region. If config_id |
# is not provided, one will be generated for it automatically (usually a
# uuid, though this is not guaranteed or required).
| "schedule": "A String", # Data transfer schedule. |
| # If the data source does not support a custom schedule, this should be |
| # empty. If it is empty, the default value for the data source will be |
| # used. |
| # The specified times are in UTC. |
| # Examples of valid format: |
| # `1st,3rd monday of month 15:30`, |
| # `every wed,fri of jan,jun 13:15`, and |
| # `first sunday of quarter 00:00`. |
| # See more explanation about the format here: |
| # https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format |
| # NOTE: the granularity should be at least 8 hours, or less frequent. |
| "datasetRegion": "A String", # Output only. Region in which BigQuery dataset is located. |
| "disabled": True or False, # Is this config disabled. When set to true, no runs are scheduled |
| # for a given transfer. |
| "userId": "A String", # Deprecated. Unique ID of the user on whose behalf transfer is done. |
| "scheduleOptions": { # Options customizing the data transfer schedule. # Options customizing the data transfer schedule. |
| "disableAutoScheduling": True or False, # If true, automatic scheduling of data transfer runs for this configuration |
# will be disabled. The runs can be started on an ad-hoc basis using
| # StartManualTransferRuns API. When automatic scheduling is disabled, the |
| # TransferConfig.schedule field will be ignored. |
| "endTime": "A String", # Defines time to stop scheduling transfer runs. A transfer run cannot be |
| # scheduled at or after the end time. The end time can be changed at any |
# moment. The time when a data transfer can be triggered manually is not
| # limited by this option. |
| "startTime": "A String", # Specifies time to start scheduling transfer runs. The first run will be |
| # scheduled at or after the start time according to a recurrence pattern |
| # defined in the schedule string. The start time can be changed at any |
# moment. The time when a data transfer can be triggered manually is not
| # limited by this option. |
| }, |
| "state": "A String", # Output only. State of the most recently updated transfer run. |
| "dataSourceId": "A String", # Data source id. Cannot be changed once data transfer is created. |
| "nextRunTime": "A String", # Output only. Next time when data transfer will run. |
| "params": { # Data transfer specific parameters. |
| "a_key": "", # Properties of the object. |
| }, |
| } |
| |
| authorizationCode: string, Optional OAuth2 authorization code to use with this transfer configuration. |
| This is required if new credentials are needed, as indicated by |
| `CheckValidCreds`. |
| In order to obtain authorization_code, please make a |
| request to |
| https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?client_id=<datatransferapiclientid>&scope=<data_source_scopes>&redirect_uri=<redirect_uri> |
| |
* client_id should be the OAuth client_id of the BigQuery DTS API for the
given data source returned by the ListDataSources method.
* data_source_scopes are the scopes returned by the ListDataSources method.
* redirect_uri is an optional parameter. If not specified, then the
authorization code is posted to the opener of the authorization flow window.
| Otherwise it will be sent to the redirect uri. A special value of |
| urn:ietf:wg:oauth:2.0:oob means that authorization code should be |
| returned in the title bar of the browser, with the page text prompting |
| the user to copy the code and paste it in the application. |
| versionInfo: string, Optional version info. If users want to find a very recent access token, |
| that is, immediately after approving access, users have to set the |
| version_info claim in the token request. To obtain the version_info, users |
must use the “none+gsession” response type, which returns a
version_info in the authorization response. This version_info should then
be put in a JWT claim in the token request.
| x__xgafv: string, V1 error format. |
| Allowed values |
| 1 - v1 error format |
| 2 - v2 error format |
| |
| Returns: |
| An object of the form: |
| |
| { # Represents a data transfer configuration. A transfer configuration |
| # contains all metadata needed to perform a data transfer. For example, |
| # `destination_dataset_id` specifies where data should be stored. |
| # When a new transfer configuration is created, the specified |
| # `destination_dataset_id` is created when needed and shared with the |
| # appropriate data source service account. |
| "dataRefreshWindowDays": 42, # The number of days to look back to automatically refresh the data. |
| # For example, if `data_refresh_window_days = 10`, then every day |
| # BigQuery reingests data for [today-10, today-1], rather than ingesting data |
| # for just [today-1]. |
| # Only valid if the data source supports the feature. Set the value to 0 |
| # to use the default value. |
| "updateTime": "A String", # Output only. Data transfer modification time. Ignored by server on input. |
| "destinationDatasetId": "A String", # The BigQuery target dataset id. |
| "displayName": "A String", # User specified display name for the data transfer. |
| "name": "A String", # The resource name of the transfer config. |
| # Transfer config names have the form of |
| # `projects/{project_id}/locations/{region}/transferConfigs/{config_id}`. |
| # The name is automatically generated based on the config_id specified in |
| # CreateTransferConfigRequest along with project_id and region. If config_id |
# is not provided, one will be generated for it automatically (usually a
# uuid, though this is not guaranteed or required).
| "schedule": "A String", # Data transfer schedule. |
| # If the data source does not support a custom schedule, this should be |
| # empty. If it is empty, the default value for the data source will be |
| # used. |
| # The specified times are in UTC. |
| # Examples of valid format: |
| # `1st,3rd monday of month 15:30`, |
| # `every wed,fri of jan,jun 13:15`, and |
| # `first sunday of quarter 00:00`. |
| # See more explanation about the format here: |
| # https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format |
| # NOTE: the granularity should be at least 8 hours, or less frequent. |
| "datasetRegion": "A String", # Output only. Region in which BigQuery dataset is located. |
| "disabled": True or False, # Is this config disabled. When set to true, no runs are scheduled |
| # for a given transfer. |
| "userId": "A String", # Deprecated. Unique ID of the user on whose behalf transfer is done. |
| "scheduleOptions": { # Options customizing the data transfer schedule. # Options customizing the data transfer schedule. |
| "disableAutoScheduling": True or False, # If true, automatic scheduling of data transfer runs for this configuration |
# will be disabled. The runs can be started on an ad-hoc basis using
| # StartManualTransferRuns API. When automatic scheduling is disabled, the |
| # TransferConfig.schedule field will be ignored. |
| "endTime": "A String", # Defines time to stop scheduling transfer runs. A transfer run cannot be |
| # scheduled at or after the end time. The end time can be changed at any |
# moment. The time when a data transfer can be triggered manually is not
| # limited by this option. |
| "startTime": "A String", # Specifies time to start scheduling transfer runs. The first run will be |
| # scheduled at or after the start time according to a recurrence pattern |
| # defined in the schedule string. The start time can be changed at any |
# moment. The time when a data transfer can be triggered manually is not
| # limited by this option. |
| }, |
| "state": "A String", # Output only. State of the most recently updated transfer run. |
| "dataSourceId": "A String", # Data source id. Cannot be changed once data transfer is created. |
| "nextRunTime": "A String", # Output only. Next time when data transfer will run. |
| "params": { # Data transfer specific parameters. |
| "a_key": "", # Properties of the object. |
| }, |
| }</pre> |
| </div> |
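As a rough sketch, a `create()` call assembles a `parent` path and a request-body dict in the form shown above. The project, location, dataset, and schedule values below are hypothetical, and the `service` object (commented out) would come from `googleapiclient.discovery.build('bigquerydatatransfer', 'v1')` with appropriate credentials.

```python
# Sketch: build the (parent, body) pair for transferConfigs().create().
# All concrete values here are illustrative placeholders.

def make_create_request(project_id, location_id, dataset_id):
    """Build the (parent, body) arguments for a create() call."""
    parent = 'projects/{}/locations/{}'.format(project_id, location_id)
    body = {
        'displayName': 'Nightly load',            # user-specified display name
        'dataSourceId': 'google_cloud_storage',   # cannot change once created
        'destinationDatasetId': dataset_id,
        'schedule': 'every day 02:00',            # UTC; at least 8h granularity
        'dataRefreshWindowDays': 0,               # 0 = use data source default
        'params': {},                             # data-source-specific params
    }
    return parent, body

parent, body = make_create_request('my-project', 'us', 'my_dataset')
# config = service.projects().locations().transferConfigs().create(
#     parent=parent, body=body).execute()
```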
| |
| <div class="method"> |
| <code class="details" id="delete">delete(name, x__xgafv=None)</code> |
| <pre>Deletes a data transfer configuration, |
| including any associated transfer runs and logs. |
| |
| Args: |
name: string, The field will contain the name of the resource requested, for example:
| `projects/{project_id}/transferConfigs/{config_id}` (required) |
| x__xgafv: string, V1 error format. |
| Allowed values |
| 1 - v1 error format |
| 2 - v2 error format |
| |
| Returns: |
| An object of the form: |
| |
| { # A generic empty message that you can re-use to avoid defining duplicated |
| # empty messages in your APIs. A typical example is to use it as the request |
| # or the response type of an API method. For instance: |
| # |
| # service Foo { |
| # rpc Bar(google.protobuf.Empty) returns (google.protobuf.Empty); |
| # } |
| # |
| # The JSON representation for `Empty` is empty JSON object `{}`. |
| }</pre> |
| </div> |
| |
| <div class="method"> |
| <code class="details" id="get">get(name, x__xgafv=None)</code> |
| <pre>Returns information about a data transfer config. |
| |
| Args: |
name: string, The field will contain the name of the resource requested, for example:
| `projects/{project_id}/transferConfigs/{config_id}` (required) |
| x__xgafv: string, V1 error format. |
| Allowed values |
| 1 - v1 error format |
| 2 - v2 error format |
| |
| Returns: |
| An object of the form: |
| |
| { # Represents a data transfer configuration. A transfer configuration |
| # contains all metadata needed to perform a data transfer. For example, |
| # `destination_dataset_id` specifies where data should be stored. |
| # When a new transfer configuration is created, the specified |
| # `destination_dataset_id` is created when needed and shared with the |
| # appropriate data source service account. |
| "dataRefreshWindowDays": 42, # The number of days to look back to automatically refresh the data. |
| # For example, if `data_refresh_window_days = 10`, then every day |
| # BigQuery reingests data for [today-10, today-1], rather than ingesting data |
| # for just [today-1]. |
| # Only valid if the data source supports the feature. Set the value to 0 |
| # to use the default value. |
| "updateTime": "A String", # Output only. Data transfer modification time. Ignored by server on input. |
| "destinationDatasetId": "A String", # The BigQuery target dataset id. |
| "displayName": "A String", # User specified display name for the data transfer. |
| "name": "A String", # The resource name of the transfer config. |
| # Transfer config names have the form of |
| # `projects/{project_id}/locations/{region}/transferConfigs/{config_id}`. |
| # The name is automatically generated based on the config_id specified in |
| # CreateTransferConfigRequest along with project_id and region. If config_id |
# is not provided, one will be generated for it automatically (usually a
# uuid, though this is not guaranteed or required).
| "schedule": "A String", # Data transfer schedule. |
| # If the data source does not support a custom schedule, this should be |
| # empty. If it is empty, the default value for the data source will be |
| # used. |
| # The specified times are in UTC. |
| # Examples of valid format: |
| # `1st,3rd monday of month 15:30`, |
| # `every wed,fri of jan,jun 13:15`, and |
| # `first sunday of quarter 00:00`. |
| # See more explanation about the format here: |
| # https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format |
| # NOTE: the granularity should be at least 8 hours, or less frequent. |
| "datasetRegion": "A String", # Output only. Region in which BigQuery dataset is located. |
| "disabled": True or False, # Is this config disabled. When set to true, no runs are scheduled |
| # for a given transfer. |
| "userId": "A String", # Deprecated. Unique ID of the user on whose behalf transfer is done. |
| "scheduleOptions": { # Options customizing the data transfer schedule. # Options customizing the data transfer schedule. |
| "disableAutoScheduling": True or False, # If true, automatic scheduling of data transfer runs for this configuration |
# will be disabled. The runs can be started on an ad-hoc basis using
| # StartManualTransferRuns API. When automatic scheduling is disabled, the |
| # TransferConfig.schedule field will be ignored. |
| "endTime": "A String", # Defines time to stop scheduling transfer runs. A transfer run cannot be |
| # scheduled at or after the end time. The end time can be changed at any |
# moment. The time when a data transfer can be triggered manually is not
| # limited by this option. |
| "startTime": "A String", # Specifies time to start scheduling transfer runs. The first run will be |
| # scheduled at or after the start time according to a recurrence pattern |
| # defined in the schedule string. The start time can be changed at any |
# moment. The time when a data transfer can be triggered manually is not
| # limited by this option. |
| }, |
| "state": "A String", # Output only. State of the most recently updated transfer run. |
| "dataSourceId": "A String", # Data source id. Cannot be changed once data transfer is created. |
| "nextRunTime": "A String", # Output only. Next time when data transfer will run. |
| "params": { # Data transfer specific parameters. |
| "a_key": "", # Properties of the object. |
| }, |
| }</pre> |
| </div> |
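Both `get()` and `delete()` address a config by its full resource name. A small helper for formatting that name, covering both the location-less and location-qualified forms shown in this page (the ids used are hypothetical):

```python
# Sketch: format a transfer config resource name for get()/delete().

def config_name(project_id, config_id, location_id=None):
    """Return the resource name, with or without a location segment."""
    if location_id is None:
        return 'projects/{}/transferConfigs/{}'.format(project_id, config_id)
    return 'projects/{}/locations/{}/transferConfigs/{}'.format(
        project_id, location_id, config_id)

name = config_name('my-project', 'abc123', location_id='us')
# config = service.projects().locations().transferConfigs().get(name=name).execute()
# service.projects().locations().transferConfigs().delete(name=name).execute()
```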
| |
| <div class="method"> |
| <code class="details" id="list">list(parent, pageSize=None, dataSourceIds=None, pageToken=None, x__xgafv=None)</code> |
| <pre>Returns information about all data transfers in the project. |
| |
| Args: |
| parent: string, The BigQuery project id for which data sources |
| should be returned: `projects/{project_id}`. (required) |
| pageSize: integer, Page size. The default page size is the maximum value of 1000 results. |
| dataSourceIds: string, When specified, only configurations of requested data sources are returned. (repeated) |
| pageToken: string, Pagination token, which can be used to request a specific page |
| of `ListTransfersRequest` list results. For multiple-page |
| results, `ListTransfersResponse` outputs |
| a `next_page` token, which can be used as the |
| `page_token` value to request the next page of list results. |
| x__xgafv: string, V1 error format. |
| Allowed values |
| 1 - v1 error format |
| 2 - v2 error format |
| |
| Returns: |
| An object of the form: |
| |
| { # The returned list of pipelines in the project. |
| "nextPageToken": "A String", # Output only. The next-pagination token. For multiple-page list results, |
| # this token can be used as the |
| # `ListTransferConfigsRequest.page_token` |
| # to request the next page of list results. |
| "transferConfigs": [ # Output only. The stored pipeline transfer configurations. |
| { # Represents a data transfer configuration. A transfer configuration |
| # contains all metadata needed to perform a data transfer. For example, |
| # `destination_dataset_id` specifies where data should be stored. |
| # When a new transfer configuration is created, the specified |
| # `destination_dataset_id` is created when needed and shared with the |
| # appropriate data source service account. |
| "dataRefreshWindowDays": 42, # The number of days to look back to automatically refresh the data. |
| # For example, if `data_refresh_window_days = 10`, then every day |
| # BigQuery reingests data for [today-10, today-1], rather than ingesting data |
| # for just [today-1]. |
| # Only valid if the data source supports the feature. Set the value to 0 |
| # to use the default value. |
| "updateTime": "A String", # Output only. Data transfer modification time. Ignored by server on input. |
| "destinationDatasetId": "A String", # The BigQuery target dataset id. |
| "displayName": "A String", # User specified display name for the data transfer. |
| "name": "A String", # The resource name of the transfer config. |
| # Transfer config names have the form of |
| # `projects/{project_id}/locations/{region}/transferConfigs/{config_id}`. |
| # The name is automatically generated based on the config_id specified in |
| # CreateTransferConfigRequest along with project_id and region. If config_id |
# is not provided, one will be generated for it automatically (usually a
# uuid, though this is not guaranteed or required).
| "schedule": "A String", # Data transfer schedule. |
| # If the data source does not support a custom schedule, this should be |
| # empty. If it is empty, the default value for the data source will be |
| # used. |
| # The specified times are in UTC. |
| # Examples of valid format: |
| # `1st,3rd monday of month 15:30`, |
| # `every wed,fri of jan,jun 13:15`, and |
| # `first sunday of quarter 00:00`. |
| # See more explanation about the format here: |
| # https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format |
| # NOTE: the granularity should be at least 8 hours, or less frequent. |
| "datasetRegion": "A String", # Output only. Region in which BigQuery dataset is located. |
| "disabled": True or False, # Is this config disabled. When set to true, no runs are scheduled |
| # for a given transfer. |
| "userId": "A String", # Deprecated. Unique ID of the user on whose behalf transfer is done. |
| "scheduleOptions": { # Options customizing the data transfer schedule. # Options customizing the data transfer schedule. |
| "disableAutoScheduling": True or False, # If true, automatic scheduling of data transfer runs for this configuration |
# will be disabled. The runs can be started on an ad-hoc basis using
| # StartManualTransferRuns API. When automatic scheduling is disabled, the |
| # TransferConfig.schedule field will be ignored. |
| "endTime": "A String", # Defines time to stop scheduling transfer runs. A transfer run cannot be |
| # scheduled at or after the end time. The end time can be changed at any |
# moment. The time when a data transfer can be triggered manually is not
| # limited by this option. |
| "startTime": "A String", # Specifies time to start scheduling transfer runs. The first run will be |
| # scheduled at or after the start time according to a recurrence pattern |
| # defined in the schedule string. The start time can be changed at any |
# moment. The time when a data transfer can be triggered manually is not
| # limited by this option. |
| }, |
| "state": "A String", # Output only. State of the most recently updated transfer run. |
| "dataSourceId": "A String", # Data source id. Cannot be changed once data transfer is created. |
| "nextRunTime": "A String", # Output only. Next time when data transfer will run. |
| "params": { # Data transfer specific parameters. |
| "a_key": "", # Properties of the object. |
| }, |
| }, |
| ], |
| }</pre> |
| </div> |
| |
| <div class="method"> |
| <code class="details" id="list_next">list_next(previous_request, previous_response)</code> |
| <pre>Retrieves the next page of results. |
| |
| Args: |
| previous_request: The request for the previous page. (required) |
| previous_response: The response from the request for the previous page. (required) |
| |
| Returns: |
| A request object that you can call 'execute()' on to request the next |
| page. Returns None if there are no more items in the collection. |
| </pre> |
| </div> |
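The `list()`/`list_next()` pair above follows the standard pagination idiom of this client library: keep issuing requests until `list_next()` returns `None`. A minimal sketch, where `collection` stands for a `service.projects().locations().transferConfigs()` resource object:

```python
# Sketch of the list()/list_next() pagination loop. `collection` is any
# object exposing list(parent=...) and list_next(request, response).

def iter_transfer_configs(collection, parent):
    """Yield every transfer config under `parent`, across all pages."""
    request = collection.list(parent=parent)
    while request is not None:
        response = request.execute()
        for config in response.get('transferConfigs', []):
            yield config
        request = collection.list_next(request, response)
```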
| |
| <div class="method"> |
| <code class="details" id="patch">patch(name, body, authorizationCode=None, updateMask=None, versionInfo=None, x__xgafv=None)</code> |
| <pre>Updates a data transfer configuration. |
| All fields must be set, even if they are not updated. |
| |
| Args: |
| name: string, The resource name of the transfer config. |
| Transfer config names have the form of |
| `projects/{project_id}/locations/{region}/transferConfigs/{config_id}`. |
| The name is automatically generated based on the config_id specified in |
| CreateTransferConfigRequest along with project_id and region. If config_id |
is not provided, one will be generated for it automatically (usually a
uuid, though this is not guaranteed or required). (required)
| body: object, The request body. (required) |
| The object takes the form of: |
| |
| { # Represents a data transfer configuration. A transfer configuration |
| # contains all metadata needed to perform a data transfer. For example, |
| # `destination_dataset_id` specifies where data should be stored. |
| # When a new transfer configuration is created, the specified |
| # `destination_dataset_id` is created when needed and shared with the |
| # appropriate data source service account. |
| "dataRefreshWindowDays": 42, # The number of days to look back to automatically refresh the data. |
| # For example, if `data_refresh_window_days = 10`, then every day |
| # BigQuery reingests data for [today-10, today-1], rather than ingesting data |
| # for just [today-1]. |
| # Only valid if the data source supports the feature. Set the value to 0 |
| # to use the default value. |
| "updateTime": "A String", # Output only. Data transfer modification time. Ignored by server on input. |
| "destinationDatasetId": "A String", # The BigQuery target dataset id. |
| "displayName": "A String", # User specified display name for the data transfer. |
| "name": "A String", # The resource name of the transfer config. |
| # Transfer config names have the form of |
| # `projects/{project_id}/locations/{region}/transferConfigs/{config_id}`. |
| # The name is automatically generated based on the config_id specified in |
| # CreateTransferConfigRequest along with project_id and region. If config_id |
# is not provided, one will be generated for it automatically (usually a
# uuid, though this is not guaranteed or required).
| "schedule": "A String", # Data transfer schedule. |
| # If the data source does not support a custom schedule, this should be |
| # empty. If it is empty, the default value for the data source will be |
| # used. |
| # The specified times are in UTC. |
| # Examples of valid format: |
| # `1st,3rd monday of month 15:30`, |
| # `every wed,fri of jan,jun 13:15`, and |
| # `first sunday of quarter 00:00`. |
| # See more explanation about the format here: |
| # https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format |
| # NOTE: the granularity should be at least 8 hours, or less frequent. |
| "datasetRegion": "A String", # Output only. Region in which BigQuery dataset is located. |
| "disabled": True or False, # Is this config disabled. When set to true, no runs are scheduled |
| # for a given transfer. |
| "userId": "A String", # Deprecated. Unique ID of the user on whose behalf transfer is done. |
| "scheduleOptions": { # Options customizing the data transfer schedule. # Options customizing the data transfer schedule. |
| "disableAutoScheduling": True or False, # If true, automatic scheduling of data transfer runs for this configuration |
# will be disabled. The runs can be started on an ad-hoc basis using
| # StartManualTransferRuns API. When automatic scheduling is disabled, the |
| # TransferConfig.schedule field will be ignored. |
| "endTime": "A String", # Defines time to stop scheduling transfer runs. A transfer run cannot be |
| # scheduled at or after the end time. The end time can be changed at any |
# moment. The time when a data transfer can be triggered manually is not
| # limited by this option. |
| "startTime": "A String", # Specifies time to start scheduling transfer runs. The first run will be |
| # scheduled at or after the start time according to a recurrence pattern |
| # defined in the schedule string. The start time can be changed at any |
# moment. The time when a data transfer can be triggered manually is not
| # limited by this option. |
| }, |
| "state": "A String", # Output only. State of the most recently updated transfer run. |
| "dataSourceId": "A String", # Data source id. Cannot be changed once data transfer is created. |
| "nextRunTime": "A String", # Output only. Next time when data transfer will run. |
| "params": { # Data transfer specific parameters. |
| "a_key": "", # Properties of the object. |
| }, |
| } |
| |
| authorizationCode: string, Optional OAuth2 authorization code to use with this transfer configuration. |
| If it is provided, the transfer configuration will be associated with the |
| authorizing user. |
| In order to obtain authorization_code, please make a |
| request to |
| https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?client_id=<datatransferapiclientid>&scope=<data_source_scopes>&redirect_uri=<redirect_uri> |
| |
* client_id should be the OAuth client_id of the BigQuery DTS API for the
given data source returned by the ListDataSources method.
* data_source_scopes are the scopes returned by the ListDataSources method.
* redirect_uri is an optional parameter. If not specified, then the
authorization code is posted to the opener of the authorization flow window.
| Otherwise it will be sent to the redirect uri. A special value of |
| urn:ietf:wg:oauth:2.0:oob means that authorization code should be |
| returned in the title bar of the browser, with the page text prompting |
| the user to copy the code and paste it in the application. |
| updateMask: string, Required list of fields to be updated in this request. |
| versionInfo: string, Optional version info. If users want to find a very recent access token, |
| that is, immediately after approving access, users have to set the |
| version_info claim in the token request. To obtain the version_info, users |
must use the “none+gsession” response type, which returns a
version_info in the authorization response. This version_info should then
be put in a JWT claim in the token request.
| x__xgafv: string, V1 error format. |
| Allowed values |
| 1 - v1 error format |
| 2 - v2 error format |
| |
| Returns: |
| An object of the form: |
| |
| { # Represents a data transfer configuration. A transfer configuration |
| # contains all metadata needed to perform a data transfer. For example, |
| # `destination_dataset_id` specifies where data should be stored. |
| # When a new transfer configuration is created, the specified |
| # `destination_dataset_id` is created when needed and shared with the |
| # appropriate data source service account. |
| "dataRefreshWindowDays": 42, # The number of days to look back to automatically refresh the data. |
| # For example, if `data_refresh_window_days = 10`, then every day |
| # BigQuery reingests data for [today-10, today-1], rather than ingesting data |
| # for just [today-1]. |
| # Only valid if the data source supports the feature. Set the value to 0 |
| # to use the default value. |
| "updateTime": "A String", # Output only. Data transfer modification time. Ignored by server on input. |
| "destinationDatasetId": "A String", # The BigQuery target dataset id. |
| "displayName": "A String", # User specified display name for the data transfer. |
| "name": "A String", # The resource name of the transfer config. |
| # Transfer config names have the form of |
| # `projects/{project_id}/locations/{region}/transferConfigs/{config_id}`. |
| # The name is automatically generated based on the config_id specified in |
| # CreateTransferConfigRequest along with project_id and region. If config_id |
| # is not provided, a value (usually a UUID, though this is neither |
| # guaranteed nor required) is generated for config_id. |
| "schedule": "A String", # Data transfer schedule. |
| # If the data source does not support a custom schedule, this should be |
| # empty. If it is empty, the default value for the data source will be |
| # used. |
| # The specified times are in UTC. |
| # Examples of valid format: |
| # `1st,3rd monday of month 15:30`, |
| # `every wed,fri of jan,jun 13:15`, and |
| # `first sunday of quarter 00:00`. |
| # See more explanation about the format here: |
| # https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format |
| # NOTE: runs can be scheduled no more frequently than every 8 hours. |
| "datasetRegion": "A String", # Output only. Region in which BigQuery dataset is located. |
| "disabled": True or False, # Whether this config is disabled. When set to true, no runs are |
| # scheduled for a given transfer. |
| "userId": "A String", # Deprecated. Unique ID of the user on whose behalf transfer is done. |
| "scheduleOptions": { # Options customizing the data transfer schedule. # Options customizing the data transfer schedule. |
| "disableAutoScheduling": True or False, # If true, automatic scheduling of data transfer runs for this configuration |
| # will be disabled. The runs can be started on ad-hoc basis using |
| # StartManualTransferRuns API. When automatic scheduling is disabled, the |
| # TransferConfig.schedule field will be ignored. |
| "endTime": "A String", # Defines time to stop scheduling transfer runs. A transfer run cannot be |
| # scheduled at or after the end time. The end time can be changed at any |
| # moment. The time when a data transfer can be triggered manually is not |
| # limited by this option. |
| "startTime": "A String", # Specifies time to start scheduling transfer runs. The first run will be |
| # scheduled at or after the start time according to a recurrence pattern |
| # defined in the schedule string. The start time can be changed at any |
| # moment. The time when a data transfer can be triggered manually is not |
| # limited by this option. |
| }, |
| "state": "A String", # Output only. State of the most recently updated transfer run. |
| "dataSourceId": "A String", # Data source id. Cannot be changed once data transfer is created. |
| "nextRunTime": "A String", # Output only. Next time when data transfer will run. |
| "params": { # Data transfer specific parameters. |
| "a_key": "", # Properties of the object. |
| }, |
| }</pre> |
| </div> |
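| <p>The transfer config object above can be assembled as a plain dictionary before being passed as the request body. A minimal sketch; all project, dataset, and config IDs below are hypothetical placeholders, and the helper name is not part of the API:</p> |

```python
# Sketch: composing a transfer config resource name and request body.
# The name format follows the documented pattern
# projects/{project_id}/locations/{region}/transferConfigs/{config_id}.
def transfer_config_name(project_id, region, config_id):
    return "projects/%s/locations/%s/transferConfigs/%s" % (
        project_id, region, config_id)

# Request-body fields mirror the TransferConfig message documented above.
body = {
    "displayName": "nightly load",          # user-specified display name
    "destinationDatasetId": "my_dataset",   # BigQuery target dataset id
    "dataSourceId": "scheduled_query",      # cannot change once created
    "schedule": "every 24 hours",           # no more frequent than 8 hours
    "dataRefreshWindowDays": 10,            # reingest [today-10, today-1]
}

name = transfer_config_name("my-project", "us", "1234abcd")
```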
| |
| <div class="method"> |
| <code class="details" id="scheduleRuns">scheduleRuns(parent, body, x__xgafv=None)</code> |
| <pre>Creates transfer runs for a time range [start_time, end_time]. |
| For each date - or whatever granularity the data source supports - in the |
| range, one transfer run is created. |
| Note that runs are created per UTC time in the time range. |
| DEPRECATED: use StartManualTransferRuns instead. |
| |
| Args: |
| parent: string, Transfer configuration name in the form: |
| `projects/{project_id}/transferConfigs/{config_id}`. (required) |
| body: object, The request body. (required) |
| The object takes the form of: |
| |
| { # A request to schedule transfer runs for a time range. |
| "endTime": "A String", # End time of the range of transfer runs. For example, |
| # `"2017-05-30T00:00:00+00:00"`. |
| "startTime": "A String", # Start time of the range of transfer runs. For example, |
| # `"2017-05-25T00:00:00+00:00"`. |
| } |
| |
| x__xgafv: string, V1 error format. |
| Allowed values |
| 1 - v1 error format |
| 2 - v2 error format |
| |
| Returns: |
| An object of the form: |
| |
| { # A response to schedule transfer runs for a time range. |
| "runs": [ # The transfer runs that were scheduled. |
| { # Represents a data transfer run. |
| "updateTime": "A String", # Output only. Last time the data transfer run state was updated. |
| "destinationDatasetId": "A String", # Output only. The BigQuery target dataset id. |
| "name": "A String", # The resource name of the transfer run. |
| # Transfer run names have the form |
| # `projects/{project_id}/locations/{location}/transferConfigs/{config_id}/runs/{run_id}`. |
| # The name is ignored when creating a transfer run. |
| "schedule": "A String", # Output only. Describes the schedule of this transfer run if it was |
| # created as part of a regular schedule. For batch transfer runs that are |
| # scheduled manually, this is empty. |
| # NOTE: the system might choose to delay the schedule depending on the |
| # current load, so `schedule_time` doesn't always match this. |
| "scheduleTime": "A String", # Minimum time after which a transfer run can be started. |
| "userId": "A String", # Deprecated. Unique ID of the user on whose behalf transfer is done. |
| "state": "A String", # Data transfer run state. Ignored for input requests. |
| "errorStatus": { # The `Status` type defines a logical error model that is suitable for # Status of the transfer run. |
| # different programming environments, including REST APIs and RPC APIs. It is |
| # used by [gRPC](https://github.com/grpc). Each `Status` message contains |
| # three pieces of data: error code, error message, and error details. |
| # |
| # You can find out more about this error model and how to work with it in the |
| # [API Design Guide](https://cloud.google.com/apis/design/errors). |
| "message": "A String", # A developer-facing error message, which should be in English. Any |
| # user-facing error message should be localized and sent in the |
| # google.rpc.Status.details field, or localized by the client. |
| "code": 42, # The status code, which should be an enum value of google.rpc.Code. |
| "details": [ # A list of messages that carry the error details. There is a common set of |
| # message types for APIs to use. |
| { |
| "a_key": "", # Properties of the object. Contains field @type with type URL. |
| }, |
| ], |
| }, |
| "params": { # Output only. Data transfer specific parameters. |
| "a_key": "", # Properties of the object. |
| }, |
| "startTime": "A String", # Output only. Time when transfer run was started. |
| # Parameter ignored by server for input requests. |
| "dataSourceId": "A String", # Output only. Data source id. |
| "runTime": "A String", # For batch transfer runs, specifies the date and time that |
| # data should be ingested. |
| "endTime": "A String", # Output only. Time when transfer run ended. |
| # Parameter ignored by server for input requests. |
| }, |
| ], |
| }</pre> |
| </div> |
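| <p>The `startTime` and `endTime` strings in the request body above are RFC 3339 timestamps in UTC. A sketch of building the body from `datetime` values; the helper name is hypothetical:</p> |

```python
from datetime import datetime, timezone

def schedule_runs_body(start, end):
    # Format UTC datetimes in the documented "+00:00" style.
    fmt = "%Y-%m-%dT%H:%M:%S+00:00"
    return {
        "startTime": start.astimezone(timezone.utc).strftime(fmt),
        "endTime": end.astimezone(timezone.utc).strftime(fmt),
    }

body = schedule_runs_body(
    datetime(2017, 5, 25, tzinfo=timezone.utc),
    datetime(2017, 5, 30, tzinfo=timezone.utc),
)
```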
| |
| <div class="method"> |
| <code class="details" id="startManualRuns">startManualRuns(parent, body, x__xgafv=None)</code> |
| <pre>Start manual transfer runs to be executed now with schedule_time equal to |
| current time. The transfer runs can be created for a time range where the |
| run_time is between start_time (inclusive) and end_time (exclusive), or for |
| a specific run_time. |
| |
| Args: |
| parent: string, Transfer configuration name in the form: |
| `projects/{project_id}/transferConfigs/{config_id}`. (required) |
| body: object, The request body. (required) |
| The object takes the form of: |
| |
| { # A request to start manual transfer runs. |
| "requestedTimeRange": { # A specification for a time range; this requests transfer runs with # Time range for the transfer runs that should be started. |
| # run_time between start_time (inclusive) and end_time (exclusive). |
| "endTime": "A String", # End time of the range of transfer runs. For example, |
| # `"2017-05-30T00:00:00+00:00"`. The end_time must not be in the future. |
| # Creates transfer runs where run_time is in the range between start_time |
| # (inclusive) and end_time (exclusive). |
| "startTime": "A String", # Start time of the range of transfer runs. For example, |
| # `"2017-05-25T00:00:00+00:00"`. The start_time must be strictly less than |
| # the end_time. Creates transfer runs where run_time is in the range between |
| # start_time (inclusive) and end_time (exclusive). |
| }, |
| "requestedRunTime": "A String", # Specific run_time for a transfer run to be started. The |
| # requested_run_time must not be in the future. |
| } |
| |
| x__xgafv: string, V1 error format. |
| Allowed values |
| 1 - v1 error format |
| 2 - v2 error format |
| |
| Returns: |
| An object of the form: |
| |
| { # A response to start manual transfer runs. |
| "runs": [ # The transfer runs that were created. |
| { # Represents a data transfer run. |
| "updateTime": "A String", # Output only. Last time the data transfer run state was updated. |
| "destinationDatasetId": "A String", # Output only. The BigQuery target dataset id. |
| "name": "A String", # The resource name of the transfer run. |
| # Transfer run names have the form |
| # `projects/{project_id}/locations/{location}/transferConfigs/{config_id}/runs/{run_id}`. |
| # The name is ignored when creating a transfer run. |
| "schedule": "A String", # Output only. Describes the schedule of this transfer run if it was |
| # created as part of a regular schedule. For batch transfer runs that are |
| # scheduled manually, this is empty. |
| # NOTE: the system might choose to delay the schedule depending on the |
| # current load, so `schedule_time` doesn't always match this. |
| "scheduleTime": "A String", # Minimum time after which a transfer run can be started. |
| "userId": "A String", # Deprecated. Unique ID of the user on whose behalf transfer is done. |
| "state": "A String", # Data transfer run state. Ignored for input requests. |
| "errorStatus": { # The `Status` type defines a logical error model that is suitable for # Status of the transfer run. |
| # different programming environments, including REST APIs and RPC APIs. It is |
| # used by [gRPC](https://github.com/grpc). Each `Status` message contains |
| # three pieces of data: error code, error message, and error details. |
| # |
| # You can find out more about this error model and how to work with it in the |
| # [API Design Guide](https://cloud.google.com/apis/design/errors). |
| "message": "A String", # A developer-facing error message, which should be in English. Any |
| # user-facing error message should be localized and sent in the |
| # google.rpc.Status.details field, or localized by the client. |
| "code": 42, # The status code, which should be an enum value of google.rpc.Code. |
| "details": [ # A list of messages that carry the error details. There is a common set of |
| # message types for APIs to use. |
| { |
| "a_key": "", # Properties of the object. Contains field @type with type URL. |
| }, |
| ], |
| }, |
| "params": { # Output only. Data transfer specific parameters. |
| "a_key": "", # Properties of the object. |
| }, |
| "startTime": "A String", # Output only. Time when transfer run was started. |
| # Parameter ignored by server for input requests. |
| "dataSourceId": "A String", # Output only. Data source id. |
| "runTime": "A String", # For batch transfer runs, specifies the date and time that |
| # data should be ingested. |
| "endTime": "A String", # Output only. Time when transfer run ended. |
| # Parameter ignored by server for input requests. |
| }, |
| ], |
| }</pre> |
| </div> |
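| <p>The constraints stated above (start_time strictly before end_time, and end_time not in the future) can be checked client-side before sending the request. A sketch with a hypothetical helper; the request-body shape follows the documented message:</p> |

```python
from datetime import datetime, timezone

def manual_runs_body(start, end, now=None):
    # Enforce the documented requestedTimeRange constraints.
    now = now or datetime.now(timezone.utc)
    if not start < end:
        raise ValueError("start_time must be strictly less than end_time")
    if end > now:
        raise ValueError("end_time must not be in the future")
    fmt = "%Y-%m-%dT%H:%M:%S+00:00"
    return {
        "requestedTimeRange": {
            "startTime": start.strftime(fmt),
            "endTime": end.strftime(fmt),
        }
    }

body = manual_runs_body(
    datetime(2017, 5, 25, tzinfo=timezone.utc),
    datetime(2017, 5, 30, tzinfo=timezone.utc),
)
```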
| |
| </body></html> |