docs: update docs/dyn (#1096)
This PR was generated using Autosynth. :rainbow:
Synth log will be available here:
https://source.cloud.google.com/results/invocations/6f0f288a-a1e8-4b2d-a85f-00b1c6150185/targets
- [ ] To automatically regenerate this PR, check this box.
Source-Link: https://github.com/googleapis/synthtool/commit/39b7149da4026765385403632db3c6f63db96b2c
Source-Link: https://github.com/googleapis/synthtool/commit/9a7d9fbb7045c34c9d3d22c1ff766eeae51f04c9
Source-Link: https://github.com/googleapis/synthtool/commit/dc9903a8c30c3662b6098f0e4a97f221d67268b2
Source-Link: https://github.com/googleapis/synthtool/commit/7fcc405a579d5d53a726ff3da1b7c8c08f0f2d58
Source-Link: https://github.com/googleapis/synthtool/commit/d5fc0bcf9ea9789c5b0e3154a9e3b29e5cea6116
Source-Link: https://github.com/googleapis/synthtool/commit/e89175cf074dccc4babb4eca66ae913696e47a71
Source-Link: https://github.com/googleapis/synthtool/commit/7d652819519dfa24da9e14548232e4aaba71a11c
Source-Link: https://github.com/googleapis/synthtool/commit/7db8a6c5ffb12a6e4c2f799c18f00f7f3d60e279
Source-Link: https://github.com/googleapis/synthtool/commit/1f1148d3c7a7a52f0c98077f976bd9b3c948ee2b
Source-Link: https://github.com/googleapis/synthtool/commit/2c8aecedd55b0480fb4e123b6e07fa5b12953862
Source-Link: https://github.com/googleapis/synthtool/commit/3d3e94c4e02370f307a9a200b0c743c3d8d19f29
Source-Link: https://github.com/googleapis/synthtool/commit/c7824ea48ff6d4d42dfae0849aec8a85acd90bd9
Source-Link: https://github.com/googleapis/synthtool/commit/ba9918cd22874245b55734f57470c719b577e591
Source-Link: https://github.com/googleapis/synthtool/commit/b19b401571e77192f8dd38eab5fb2300a0de9324
Source-Link: https://github.com/googleapis/synthtool/commit/6542bd723403513626f61642fc02ddca528409aa
diff --git a/docs/dyn/bigquery_v2.tabledata.html b/docs/dyn/bigquery_v2.tabledata.html
index e70c8fe..ae92f3c 100644
--- a/docs/dyn/bigquery_v2.tabledata.html
+++ b/docs/dyn/bigquery_v2.tabledata.html
@@ -81,7 +81,7 @@
<code><a href="#insertAll">insertAll(projectId, datasetId, tableId, body=None)</a></code></p>
<p class="firstline">Streams data into BigQuery one record at a time without needing to run a load job. Requires the WRITER dataset role.</p>
<p class="toc_element">
- <code><a href="#list">list(projectId, datasetId, tableId, selectedFields=None, maxResults=None, pageToken=None, startIndex=None)</a></code></p>
+ <code><a href="#list">list(projectId, datasetId, tableId, startIndex=None, pageToken=None, maxResults=None, selectedFields=None)</a></code></p>
<p class="firstline">Retrieves table data from a specified set of rows. Requires the READER dataset role.</p>
<p class="toc_element">
<code><a href="#list_next">list_next(previous_request, previous_response)</a></code></p>
@@ -104,6 +104,7 @@
The object takes the form of:
{
+ "templateSuffix": "A String", # If specified, treats the destination table as a base template, and inserts the rows into an instance table named "{destination}{templateSuffix}". BigQuery will manage creation of the instance table, using the schema of the base template table. See https://cloud.google.com/bigquery/streaming-data-into-bigquery#template-tables for considerations when working with templates tables.
"rows": [ # The rows to insert.
{
"json": { # Represents a single JSON object. # [Required] A JSON object that contains a row of data. The object's properties and values must match the destination table's schema.
@@ -112,10 +113,9 @@
"insertId": "A String", # [Optional] A unique ID for each row. BigQuery uses this property to detect duplicate insertion requests on a best-effort basis.
},
],
- "templateSuffix": "A String", # If specified, treats the destination table as a base template, and inserts the rows into an instance table named "{destination}{templateSuffix}". BigQuery will manage creation of the instance table, using the schema of the base template table. See https://cloud.google.com/bigquery/streaming-data-into-bigquery#template-tables for considerations when working with templates tables.
+ "ignoreUnknownValues": True or False, # [Optional] Accept rows that contain values that do not match the schema. The unknown values are ignored. Default is false, which treats unknown values as errors.
"skipInvalidRows": True or False, # [Optional] Insert all valid rows of a request, even if invalid rows exist. The default value is false, which causes the entire request to fail if any invalid rows exist.
"kind": "bigquery#tableDataInsertAllRequest", # The resource type of the response.
- "ignoreUnknownValues": True or False, # [Optional] Accept rows that contain values that do not match the schema. The unknown values are ignored. Default is false, which treats unknown values as errors.
}
@@ -123,42 +123,43 @@
An object of the form:
{
- "kind": "bigquery#tableDataInsertAllResponse", # The resource type of the response.
"insertErrors": [ # An array of errors for rows that were not inserted.
{
- "index": 42, # The index of the row that error applies to.
"errors": [ # Error information for the row indicated by the index property.
{
- "reason": "A String", # A short error code that summarizes the error.
- "location": "A String", # Specifies where the error occurred, if present.
"message": "A String", # A human-readable description of the error.
+ "location": "A String", # Specifies where the error occurred, if present.
"debugInfo": "A String", # Debugging information. This property is internal to Google and should not be used.
+ "reason": "A String", # A short error code that summarizes the error.
},
],
+ "index": 42, # The index of the row that error applies to.
},
],
+ "kind": "bigquery#tableDataInsertAllResponse", # The resource type of the response.
}</pre>
</div>
<div class="method">
- <code class="details" id="list">list(projectId, datasetId, tableId, selectedFields=None, maxResults=None, pageToken=None, startIndex=None)</code>
+ <code class="details" id="list">list(projectId, datasetId, tableId, startIndex=None, pageToken=None, maxResults=None, selectedFields=None)</code>
<pre>Retrieves table data from a specified set of rows. Requires the READER dataset role.
Args:
projectId: string, Project ID of the table to read (required)
datasetId: string, Dataset ID of the table to read (required)
tableId: string, Table ID of the table to read (required)
- selectedFields: string, List of fields to return (comma-separated). If unspecified, all fields are returned
- maxResults: integer, Maximum number of results to return
- pageToken: string, Page token, returned by a previous call, identifying the result set
startIndex: string, Zero-based index of the starting row to read
+ pageToken: string, Page token, returned by a previous call, identifying the result set
+ maxResults: integer, Maximum number of results to return
+ selectedFields: string, List of fields to return (comma-separated). If unspecified, all fields are returned
Returns:
An object of the form:
{
- "totalRows": "A String", # The total number of rows in the complete table.
"pageToken": "A String", # A token used for paging results. Providing this token instead of the startIndex parameter can help you retrieve stable results when an underlying table is changing.
+ "kind": "bigquery#tableDataList", # The resource type of the response.
+ "totalRows": "A String", # The total number of rows in the complete table.
"rows": [ # Rows of results.
{
"f": [ # Represents a single row in the result set, consisting of one or more fields.
@@ -169,7 +170,6 @@
},
],
"etag": "A String", # A hash of this page of results.
- "kind": "bigquery#tableDataList", # The resource type of the response.
}</pre>
</div>
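
For readers of the regenerated `insertAll` documentation above, here is a minimal usage sketch with the dynamic Python client. The project, dataset, and table IDs are placeholders, the snippet assumes Application Default Credentials are configured, and the request body simply mirrors the form shown in the diff (templateSuffix is omitted).

```python
from googleapiclient.discovery import build

# Build the dynamic BigQuery v2 client (uses Application Default Credentials).
bigquery = build("bigquery", "v2")

# Stream two rows using the request body documented above.
# skipInvalidRows and ignoreUnknownValues are optional.
response = bigquery.tabledata().insertAll(
    projectId="my-project",   # placeholder
    datasetId="my_dataset",   # placeholder
    tableId="my_table",       # placeholder
    body={
        "kind": "bigquery#tableDataInsertAllRequest",
        "skipInvalidRows": True,
        "rows": [
            {"insertId": "row-1", "json": {"name": "alice", "score": 10}},
            {"insertId": "row-2", "json": {"name": "bob", "score": 7}},
        ],
    },
).execute()

# Per-row failures come back in insertErrors, keyed by row index.
for error in response.get("insertErrors", []):
    print(error["index"], error["errors"])
```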
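
Likewise, a sketch of the regenerated `list` signature. Because the generated methods take keyword arguments, the parameter reordering in this diff does not affect call sites. Names are placeholders and credentials are assumed as above.

```python
from googleapiclient.discovery import build

bigquery = build("bigquery", "v2")

# Page through table rows; list_next follows the pageToken documented above.
request = bigquery.tabledata().list(
    projectId="my-project",   # placeholder
    datasetId="my_dataset",   # placeholder
    tableId="my_table",       # placeholder
    maxResults=100,
    selectedFields="name,score",
)
while request is not None:
    page = request.execute()
    for row in page.get("rows", []):
        # Each row has the form {"f": [{"v": ...}, ...]} shown in the response above.
        print([cell.get("v") for cell in row["f"]])
    request = bigquery.tabledata().list_next(request, page)
```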